Integration of rocket turbine design and analysis through computer graphics
NASA Technical Reports Server (NTRS)
Hsu, Wayne; Boynton, Jim
1988-01-01
An interactive approach with engineering computer graphics is used to integrate the design and analysis processes of a rocket engine turbine into a progressive and iterative design procedure. The processes are interconnected through pre- and postprocessors. The graphics are used to generate the blade profiles, their stacking, finite element generation, and analysis presentation through color graphics. Steps of the design process discussed include pitch-line design, axisymmetric hub-to-tip meridional design, and quasi-three-dimensional analysis. The viscous two- and three-dimensional analysis codes are executed after acceptable designs are achieved and estimates of initial losses are confirmed.
NASA Technical Reports Server (NTRS)
Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.
1974-01-01
A review is given of future power processing systems planned for the next 20 years, and the state-of-the-art of power processing design modeling and analysis techniques used to optimize power processing systems. A methodology of modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented so that meaningful results can be obtained each year to aid the power processing system engineer and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.
Optimization, an Important Stage of Engineering Design
ERIC Educational Resources Information Center
Kelley, Todd R.
2010-01-01
A number of leaders in technology education have indicated that a major difference between the technological design process and the engineering design process is analysis and optimization. The analysis stage of the engineering design process is when mathematical models and scientific principles are employed to help the designer predict design…
Designing a ticket to ride with the Cognitive Work Analysis Design Toolkit.
Read, Gemma J M; Salmon, Paul M; Lenné, Michael G; Jenkins, Daniel P
2015-01-01
Cognitive work analysis has been applied in the design of numerous sociotechnical systems. The process used to translate analysis outputs into design concepts, however, is not always clear. Moreover, structured processes for translating the outputs of ergonomics methods into concrete designs are lacking. This paper introduces the Cognitive Work Analysis Design Toolkit (CWA-DT), a design approach which has been developed specifically to provide a structured means of incorporating cognitive work analysis outputs in design using design principles and values derived from sociotechnical systems theory. This paper outlines the CWA-DT and describes its application in a public transport ticketing design case study. Qualitative and quantitative evaluations of the process provide promising early evidence that the toolkit fulfils the evaluation criteria identified for its success, with opportunities for improvement also highlighted. The Cognitive Work Analysis Design Toolkit has been developed to provide ergonomics practitioners with a structured approach for translating the outputs of cognitive work analysis into design solutions. This paper demonstrates an application of the toolkit and provides evaluation findings.
29 CFR 1910.119 - Process safety management of highly hazardous chemicals.
Code of Federal Regulations, 2011 CFR
2011-07-01
... complexity of the process will influence the decision as to the appropriate PHA methodology to use. All PHA... process hazard analysis in sufficient detail to support the analysis. (3) Information pertaining to the...) Relief system design and design basis; (E) Ventilation system design; (F) Design codes and standards...
29 CFR 1910.119 - Process safety management of highly hazardous chemicals.
Code of Federal Regulations, 2010 CFR
2010-07-01
... complexity of the process will influence the decision as to the appropriate PHA methodology to use. All PHA... process hazard analysis in sufficient detail to support the analysis. (3) Information pertaining to the...) Relief system design and design basis; (E) Ventilation system design; (F) Design codes and standards...
Using task analysis to improve the requirements elicitation in health information system.
Teixeira, Leonor; Ferreira, Carlos; Santos, Beatriz Sousa
2007-01-01
This paper describes the application of task analysis within the design process of a Web-based information system for managing clinical information in hemophilia care, in order to improve the requirements elicitation and, consequently, to validate the domain model obtained in a previous phase of the design process (system analysis). The use of task analysis in this case proved to be a practical and efficient way to improve the requirements engineering process by involving users in the design process.
Structural Optimization in automotive design
NASA Technical Reports Server (NTRS)
Bennett, J. A.; Botkin, M. E.
1984-01-01
Although mathematical structural optimization has been an active research area for twenty years, there has been relatively little penetration into the design process. Experience indicates that often this is due to the traditional layout-analysis design process. In many cases, optimization efforts have been outgrowths of analysis groups which are themselves appendages to the traditional design process. As a result, optimization is often introduced into the design process too late to have a significant effect because many potential design variables have already been fixed. A series of examples are given to indicate how structural optimization has been effectively integrated into the design process.
NASA Technical Reports Server (NTRS)
Consoli, Robert David; Sobieszczanski-Sobieski, Jaroslaw
1990-01-01
Advanced multidisciplinary analysis and optimization methods, namely system sensitivity analysis and non-hierarchical system decomposition, are applied to reduce the cost and improve the visibility of an automated vehicle design synthesis process. This process is inherently complex due to the large number of functional disciplines and associated interdisciplinary couplings. Recent developments in system sensitivity analysis as applied to complex non-hierarchic multidisciplinary design optimization problems enable the decomposition of these complex interactions into sub-processes that can be evaluated in parallel. The application of these techniques results in significant cost, accuracy, and visibility benefits for the entire design synthesis process.
Automated Simulation For Analysis And Design
NASA Technical Reports Server (NTRS)
Cantwell, E.; Shenk, Tim; Robinson, Peter; Upadhye, R.
1992-01-01
Design Assistant Workstation (DAWN) software being developed to facilitate simulation of qualitative and quantitative aspects of behavior of life-support system in spacecraft, chemical-processing plant, heating and cooling system of large building, or any of variety of systems including interacting process streams and processes. Used to analyze alternative design scenarios or specific designs of such systems. Expert system will automate part of design analysis: reason independently by simulating design scenarios and return to designer with overall evaluations and recommendations.
ERIC Educational Resources Information Center
Tutlys, Vidmantas; Spöttl, Georg
2017-01-01
Purpose: This paper aims to explore methodological and institutional challenges on application of the work-process analysis approach in the design and development of competence-based occupational standards for Lithuania. Design/methodology/approach: The theoretical analysis is based on the review of scientific literature and the analysis of…
Criteria for Comparing Domain Analysis Approaches Version 01.00.00
1991-12-01
Excerpts: figures cover the Down-Bottom-Up Domain Analysis Process (1990 version) and FODA's Domain Analysis Process... FODA, which uses the Design Approach for Real-Time Systems (DARTS) design method (Gomaa 1984)? Introduction: domain analysis is still immature... An Overview of Some Domain Analysis Approaches... The FODA report illustrates the process by using the window management
Planar Inlet Design and Analysis Process (PINDAP)
NASA Technical Reports Server (NTRS)
Slater, John W.; Gruber, Christopher R.
2005-01-01
The Planar Inlet Design and Analysis Process (PINDAP) is a collection of software tools that allow the efficient aerodynamic design and analysis of planar (two-dimensional and axisymmetric) inlets. The aerodynamic analysis is performed using the Wind-US computational fluid dynamics (CFD) program. A major element in PINDAP is a Fortran 90 code named PINDAP that can establish the parametric design of the inlet and efficiently model the geometry and generate the grid for CFD analysis with design changes to those parameters. The use of PINDAP is demonstrated for subsonic, supersonic, and hypersonic inlets.
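As a rough illustration of the parametric-geometry idea behind PINDAP (the actual tool is a Fortran 90 code driving Wind-US; every name and number below is a hypothetical stand-in), the following Python sketch regenerates a two-dimensional ramp-inlet profile and a simple algebraic grid from a handful of design parameters:

```python
# Illustrative sketch only: regenerate inlet geometry and a structured grid
# from a small set of design parameters, in the spirit of PINDAP.
import numpy as np

def ramp_inlet_profile(ramp_angle_deg, ramp_length, cowl_lip_x, n=50):
    """Two-segment external-compression surface: a straight ramp followed
    by a horizontal shoulder running to the cowl lip station."""
    theta = np.radians(ramp_angle_deg)
    x_ramp = np.linspace(0.0, ramp_length, n)
    y_ramp = x_ramp * np.tan(theta)
    x_shoulder = np.linspace(ramp_length, cowl_lip_x, n)[1:]
    y_shoulder = np.full_like(x_shoulder, ramp_length * np.tan(theta))
    return np.concatenate([x_ramp, x_shoulder]), np.concatenate([y_ramp, y_shoulder])

def algebraic_grid(x, y_wall, duct_height, nj=31):
    """Stack grid lines between the wall and a parallel outer boundary."""
    eta = np.linspace(0.0, 1.0, nj)[:, None]   # normalized wall distance
    return np.broadcast_to(x, (nj, x.size)), y_wall + eta * duct_height

x, y = ramp_inlet_profile(ramp_angle_deg=8.0, ramp_length=1.0, cowl_lip_x=2.5)
X, Y = algebraic_grid(x, y, duct_height=0.8)
print(X.shape, Y.shape)   # grid ready for export to a CFD solver
```

Changing a design parameter and rerunning the script regenerates geometry and grid together, which is the workflow the tool automates for CFD analysis.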
NASA Technical Reports Server (NTRS)
Miller, R. E., Jr.; Hansen, S. D.; Redhed, D. D.; Southall, J. W.; Kawaguchi, A. S.
1974-01-01
Evaluation of the cost-effectiveness of integrated analysis/design systems with particular attention to Integrated Program for Aerospace-Vehicle Design (IPAD) project. An analysis of all the ingredients of IPAD indicates the feasibility of a significant cost and flowtime reduction in the product design process involved. It is also concluded that an IPAD-supported design process will provide a framework for configuration control, whereby the engineering costs for design, analysis and testing can be controlled during the air vehicle development cycle.
Huang, Jun; Kaul, Goldi; Cai, Chunsheng; Chatlapalli, Ramarao; Hernandez-Abad, Pedro; Ghosh, Krishnendu; Nagi, Arwinder
2009-12-01
To facilitate an in-depth process understanding, and offer opportunities for developing control strategies to ensure product quality, a combination of experimental design, optimization and multivariate techniques was integrated into the process development of a drug product. A process DOE was used to evaluate effects of the design factors on manufacturability and final product CQAs, and establish design space to ensure desired CQAs. Two types of analyses were performed to extract maximal information, DOE effect & response surface analysis and multivariate analysis (PCA and PLS). The DOE effect analysis was used to evaluate the interactions and effects of three design factors (water amount, wet massing time and lubrication time), on response variables (blend flow, compressibility and tablet dissolution). The design space was established by the combined use of DOE, optimization and multivariate analysis to ensure desired CQAs. Multivariate analysis of all variables from the DOE batches was conducted to study relationships between the variables and to evaluate the impact of material attributes/process parameters on manufacturability and final product CQAs. The integrated multivariate approach exemplifies application of QbD principles and tools to drug product and process development.
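A minimal sketch of the two analyses described above, run on synthetic data: main-effect estimates from a 2^3 full factorial in the three design factors, then a PLS model relating the factors to the three responses. Factor and response names follow the abstract; the numbers and the underlying model are invented for illustration.

```python
import numpy as np
from itertools import product
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
# Coded factor levels (-1/+1): water amount, wet massing time, lubrication time
X = np.array(list(product([-1, 1], repeat=3)), dtype=float)
# Synthetic responses: blend flow, compressibility, tablet dissolution
Y = np.column_stack([
    5.0 + 1.2 * X[:, 0] - 0.8 * X[:, 1] + rng.normal(0, 0.1, 8),
    20.0 - 2.0 * X[:, 0] + 0.5 * X[:, 2] + rng.normal(0, 0.2, 8),
    80.0 - 4.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(0, 0.5, 8),
])

# DOE effect analysis: main effect = mean(high) - mean(low) for each factor
for j, name in enumerate(["water", "massing time", "lubrication time"]):
    effects = Y[X[:, j] == 1].mean(axis=0) - Y[X[:, j] == -1].mean(axis=0)
    print(f"{name:>16s} effects (flow, compressibility, dissolution): {np.round(effects, 2)}")

# Multivariate analysis: PLS regression linking factors to all responses at once
pls = PLSRegression(n_components=2).fit(X, Y)
print("PLS X-loadings:\n", np.round(pls.x_loadings_, 2))
```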
Designing Image Analysis Pipelines in Light Microscopy: A Rational Approach.
Arganda-Carreras, Ignacio; Andrey, Philippe
2017-01-01
With the progress of microscopy techniques and the rapidly growing amounts of acquired imaging data, there is an increased need for automated image processing and analysis solutions in biological studies. Each new application requires the design of a specific image analysis pipeline, by assembling a series of image processing operations. Many commercial or free bioimage analysis software packages are now available, and several textbooks and reviews have presented the mathematical and computational fundamentals of image processing and analysis. Tens, if not hundreds, of algorithms and methods have been developed and integrated into image analysis software, resulting in a combinatorial explosion of possible image processing sequences. This paper presents a general guideline methodology to rationally address the design of image processing and analysis pipelines. The originality of the proposed approach is to follow an iterative, backwards procedure from the target objectives of analysis. The proposed goal-oriented strategy should help biologists to better apprehend image analysis in the context of their research and should allow them to efficiently interact with image processing specialists.
Natural Resource Information System, design analysis
NASA Technical Reports Server (NTRS)
1972-01-01
The computer-based system stores, processes, and displays map data relating to natural resources. The system was designed on the basis of requirements established in a user survey and an analysis of decision flow. The design analysis effort is described, and the rationale behind major design decisions, including map processing, cell vs. polygon, choice of classification systems, mapping accuracy, system hardware, and software language is summarized.
A Meta-Analysis and Review of Holistic Face Processing
Richler, Jennifer J.; Gauthier, Isabel
2014-01-01
The concept of holistic processing is a cornerstone of face recognition research, yet central questions related to holistic processing remain unanswered, and debates have thus far failed to reach a resolution despite accumulating empirical evidence. We argue that a considerable source of confusion in this literature stems from a methodological problem. Specifically, two different measures of holistic processing based on the composite paradigm (complete design and partial design) are used in the literature, but they often lead to qualitatively different results. First, we present a comprehensive review of the work that directly compares the two designs, and which clearly favors the complete design over the partial design. Second, we report a meta-analysis of holistic face processing according to both designs, and use this as further evidence for one design over the other. The meta-analysis effect size of holistic processing in the complete design is nearly three times that of the partial design. Effect sizes were not correlated between measures, consistent with the suggestion that they do not measure the same thing. Our meta-analysis also examines the correlation between conditions in the complete design of the composite task, and suggests that in an individual differences context, little is gained by including a misaligned baseline. Finally, we offer a comprehensive review of the state of knowledge about holistic processing based on evidence gathered from the measure we favor based on the first sections of our review—the complete design—and outline outstanding research questions in that new context. PMID:24956123
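For readers unfamiliar with the mechanics, here is a minimal sketch of how such pooled effect sizes are computed (fixed-effect, inverse-variance weighting). The per-study values are placeholders, not data from this meta-analysis.

```python
import numpy as np

def pooled_effect(d, se):
    """Fixed-effect meta-analytic mean of standardized effects d with
    standard errors se, weighted by inverse variance."""
    d, se = np.asarray(d), np.asarray(se)
    w = 1.0 / se**2
    mean = np.sum(w * d) / np.sum(w)
    return mean, np.sqrt(1.0 / np.sum(w))

# Fabricated per-study effect sizes for the two composite-task designs
complete_d, complete_se = [1.1, 0.9, 1.3, 1.0], [0.20, 0.25, 0.30, 0.22]
partial_d,  partial_se  = [0.4, 0.3, 0.5, 0.2], [0.20, 0.25, 0.30, 0.22]

for label, d, se in [("complete", complete_d, complete_se),
                     ("partial", partial_d, partial_se)]:
    m, s = pooled_effect(d, se)
    print(f"{label:>8s} design: pooled d = {m:.2f} (SE {s:.2f})")
```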
Modeling and Analysis of Power Processing Systems (MAPPS). Volume 1: Technical report
NASA Technical Reports Server (NTRS)
Lee, F. C.; Rahman, S.; Carter, R. A.; Wu, C. H.; Yu, Y.; Chang, R.
1980-01-01
Computer aided design and analysis techniques were applied to power processing equipment. Topics covered include: (1) discrete time domain analysis of switching regulators for performance analysis; (2) design optimization of power converters using augmented Lagrangian penalty function technique; (3) investigation of current-injected multiloop controlled switching regulators; and (4) application of optimization for Navy VSTOL energy power system. The generation of the mathematical models and the development and application of computer aided design techniques to solve the different mathematical models are discussed. Recommendations are made for future work that would enhance the application of the computer aided design techniques for power processing systems.
NASA Technical Reports Server (NTRS)
Fegley, K. A.; Hayden, J. H.; Rehmann, D. W.
1974-01-01
The feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems is investigated. It is shown that a digital computer may be used in an interactive mode for the design, modeling, analysis, and comparison of power processing systems.
Inauen, A; Jenny, G J; Bauer, G F
2012-06-01
This article focuses on organizational analysis in workplace health promotion (WHP) projects. It shows how this analysis can be designed such that it provides rational data relevant to the further context-specific and goal-oriented planning of WHP and equally supports individual and organizational change processes implied by WHP. Design principles for organizational analysis were developed on the basis of a narrative review of the guiding principles of WHP interventions and organizational change as well as the scientific principles of data collection. Further, the practical experience of WHP consultants who routinely conduct organizational analysis was considered. This resulted in a framework with data-oriented and change-oriented design principles, addressing the following elements of organizational analysis in WHP: planning the overall procedure, data content, data-collection methods and information processing. Overall, the data-oriented design principles aim to produce valid, reliable and representative data, whereas the change-oriented design principles aim to promote motivation, coherence and a capacity for self-analysis. We expect that the simultaneous consideration of data- and change-oriented design principles for organizational analysis will strongly support the WHP process. We finally illustrate the applicability of the design principles to health promotion within a WHP case study.
Process Improvement Through Tool Integration in Aero-Mechanical Design
NASA Technical Reports Server (NTRS)
Briggs, Clark
2010-01-01
Emerging capabilities in commercial design tools promise to significantly improve the multi-disciplinary and inter-disciplinary design and analysis coverage for aerospace mechanical engineers. This paper explores the analysis process for two example problems of a wing and flap mechanical drive system and an aircraft landing gear door panel. The examples begin with the design solid models and include various analysis disciplines such as structural stress and aerodynamic loads. Analytical methods include CFD, multi-body dynamics with flexible bodies and structural analysis. Elements of analysis data management, data visualization and collaboration are also included.
A meta-analysis and review of holistic face processing.
Richler, Jennifer J; Gauthier, Isabel
2014-09-01
The concept of holistic processing is a cornerstone of face recognition research, yet central questions related to holistic processing remain unanswered, and debates have thus far failed to reach a resolution despite accumulating empirical evidence. We argue that a considerable source of confusion in this literature stems from a methodological problem. Specifically, 2 measures of holistic processing based on the composite paradigm (complete design and partial design) are used in the literature, but they often lead to qualitatively different results. First, we present a comprehensive review of the work that directly compares the 2 designs, and which clearly favors the complete design over the partial design. Second, we report a meta-analysis of holistic face processing according to both designs and use this as further evidence for one design over the other. The meta-analysis effect size of holistic processing in the complete design is nearly 3 times that of the partial design. Effect sizes were not correlated between measures, consistent with the suggestion that they do not measure the same thing. Our meta-analysis also examines the correlation between conditions in the complete design of the composite task, and suggests that in an individual differences context, little is gained by including a misaligned baseline. Finally, we offer a comprehensive review of the state of knowledge about holistic processing based on evidence gathered from the measure we favor based on the 1st sections of our review-the complete design-and outline outstanding research questions in that new context.
2011-12-01
...systems engineering technical and technical management processes. Technical Planning, Stakeholder Requirements Development, and Architecture Design were... Stakeholder Requirements Definition, Architecture Design, and Technical Planning. A purposive sampling of AFRL rapid development program managers and engineers... emphasize one process over another; however, Architecture Design and Implementation scored higher among Technical Processes. Decision Analysis, Technical
Self-conscious robotic system design process--from analysis to implementation.
Chella, Antonio; Cossentino, Massimo; Seidita, Valeria
2011-01-01
Developing robotic systems endowed with self-conscious capabilities means realizing complex sub-systems that need ad-hoc software engineering techniques for their modelling, analysis and implementation. In this chapter the whole process (from analysis to implementation) used to model the development of self-conscious robotic systems is presented, and the newly created design process, PASSIC, supporting each part of it, is fully illustrated.
Modernization of the Transonic Axial Compressor Test Rig
2017-12-01
This work presents the design and simulation process of modernizing the Naval Postgraduate School's transonic compressor test rig (TCR)... fabricate the materials. Stiffness tests and modal analysis were conducted via Finite Element Analysis (FEA) software. This analysis was used to design... The TCR, which
NASA Astrophysics Data System (ADS)
Yu, Long; Druckenbrod, Markus; Greve, Martin; Wang, Ke-qi; Abdel-Maksoud, Moustafa
2015-10-01
A fully automated optimization process is provided for the design of ducted propellers under open-water conditions, including 3D geometry modeling, meshing, optimization algorithms, and CFD analysis techniques. The developed process allows the direct integration of a RANSE solver in the design stage. A practical ducted propeller design case study is carried out for validation. Numerical simulations and open-water tests were carried out and proved that the optimum ducted propeller improves hydrodynamic performance as predicted.
The initial design of LAPAN's IR micro bolometer using mission analysis process
NASA Astrophysics Data System (ADS)
Bustanul, A.; Irwan, P.; M. T., Andi; Firman, B.
2016-11-01
As a new player in the infrared (IR) sector, an uncooled, small, and lightweight IR micro bolometer has been chosen as one of the payloads for LAPAN's next micro satellite project. Driven by the desire to create our own IR micro bolometer, a mission analysis design procedure has been applied. After tracing all possible missions, Planck's and Wien's laws for a black body, the temperature responsivity (TR), and the sub-pixel response were utilized to determine the appropriate spectral radiance. The 3.8-4 μm wavelength band was suitable for detecting wild fires (forest fires) and active volcanoes, two major problems faced by Indonesia. To strengthen and broaden the result, an iteration process was used throughout. The analysis then continued by calculating the ground pixel size, pixel IFOV, swath width, and focal length; the resolution is at least 400 m. The further procedure covered the integration of the optical design, wherein we combined optical design software (Zemax) with mechanical analysis software for structural and thermal analysis, such as Nastran and Thermal Desktop/Sinda Fluint. The integration process was intended to produce a high-performance optical system for our IR micro bolometer that can be used under extreme environments. The results of all those analyses, both in graphs and in measurements, show that the initial design of LAPAN's IR micro bolometer meets the determined requirements, although it needs further evaluation (iteration). This paper describes the initial design of LAPAN's IR micro bolometer using a mission analysis process.
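A back-of-envelope check of the quantities the abstract names (Wien peak wavelength, IFOV, ground pixel size, swath width), under assumed orbit and optics values (650 km altitude, 35 μm pixel pitch, 100 mm focal length, 640 pixels) that are illustrative only, not LAPAN's actual design:

```python
WIEN_B = 2898.0               # um*K, Wien displacement constant

def wien_peak_um(T_kelvin):
    """Wavelength of peak black-body emission, lambda_max = b / T."""
    return WIEN_B / T_kelvin

altitude_m = 650e3            # assumed orbit altitude
pixel_pitch_m = 35e-6         # assumed detector pixel pitch
focal_length_m = 0.100        # assumed optics focal length
n_pixels = 640                # assumed detector width

ifov_rad = pixel_pitch_m / focal_length_m     # per-pixel field of view
gsd_m = ifov_rad * altitude_m                 # ground pixel size at nadir
swath_m = gsd_m * n_pixels

print(f"peak emission for an 800 K fire: {wien_peak_um(800):.2f} um")  # ~3.6 um
print(f"IFOV = {ifov_rad*1e6:.0f} urad, ground pixel = {gsd_m:.0f} m, "
      f"swath = {swath_m/1e3:.1f} km")
```

The ~3.6 μm peak for hot targets is consistent with the 3.8-4 μm band the abstract selects for fire and volcano detection.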
Guidebook for solar process-heat applications
NASA Astrophysics Data System (ADS)
Fazzolare, R.; Mignon, G.; Campoy, L.; Luttmann, F.
1981-01-01
The potential for solar process heat in Arizona and some of the general technical aspects of solar energy, such as insolation, siting, and process analysis, are explored. Major aspects of a solar plant design are presented. Collectors, storage, and heat exchange are discussed. Relating hardware costs to annual dollar benefits is also discussed. Rate of return, cash flow, and payback are discussed as they relate to solar systems. Design analysis procedures are presented. Design cost optimization using a yearly computer simulation of solar process operation is demonstrated.
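A minimal sketch of the economic screening described above (simple payback and discounted cash flow), with hypothetical plant numbers:

```python
# Hypothetical inputs for a solar process-heat plant; all values invented.
capital_cost = 500_000.0        # installed collectors + storage, $
annual_fuel_savings = 60_000.0  # displaced fuel cost, $/yr
annual_o_and_m = 5_000.0        # operating & maintenance, $/yr
discount_rate = 0.08
lifetime_years = 20

net_annual = annual_fuel_savings - annual_o_and_m
simple_payback = capital_cost / net_annual

# Net present value of the discounted annual cash flows
npv = -capital_cost + sum(
    net_annual / (1.0 + discount_rate) ** t for t in range(1, lifetime_years + 1)
)
print(f"simple payback: {simple_payback:.1f} yr, NPV: ${npv:,.0f}")
```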
Design Process Improvement for Electric CAR Harness
NASA Astrophysics Data System (ADS)
Sawatdee, Thiwarat; Chutima, Parames
2017-06-01
In an automobile parts design company, customer satisfaction is one of the most important factors in product design. Therefore, the company employs all means to focus its product design process on the various requirements of customers, resulting in a high number of design changes. The objective of this research is to improve the design process of the electric car harness that affects production scheduling, by using Fault Tree Analysis (FTA) and Failure Mode and Effect Analysis (FMEA) as the main tools. FTA is employed for root cause analysis, and FMEA is used to rank factors by Risk Priority Number (RPN); a high RPN shows which factors in the electric car harness have a high impact on its design. After the implementation, the improvements are significant: the rate of design changes is reduced from 0.26% to 0.08%.
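For concreteness, a minimal FMEA sketch: each failure mode's Risk Priority Number is the product of its severity, occurrence, and detection ratings, and modes are ranked by RPN. The entries are invented examples, not the study's data.

```python
failure_modes = [
    # (description, severity, occurrence, detection), each rated 1-10
    ("late customer design change", 7, 8, 4),
    ("wrong wire gauge on drawing", 9, 3, 5),
    ("connector pinout mismatch",   8, 4, 3),
]

# RPN = severity x occurrence x detection; highest RPN gets attention first
ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for desc, s, o, d in ranked:
    print(f"RPN {s*o*d:4d}  {desc}")
```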
The MSFC Collaborative Engineering Process for Preliminary Design and Concept Definition Studies
NASA Technical Reports Server (NTRS)
Mulqueen, Jack; Jones, David; Hopkins, Randy
2011-01-01
This paper describes a collaborative engineering process developed by the Marshall Space Flight Center's Advanced Concepts Office for performing rapid preliminary design and mission concept definition studies for potential future NASA missions. The process has been developed and demonstrated for a broad range of mission studies including human space exploration missions, space transportation system studies, and in-space science missions. The paper will describe the design team structure and specialized analytical tools that have been developed to enable a unique rapid design process. The collaborative engineering process consists of an integrated analysis approach for mission definition, vehicle definition, and system engineering. The relevance of the collaborative process elements to the standard NASA NPR 7120.1 system engineering process will be demonstrated. The study definition process flow for each study discipline will be outlined, beginning with the study planning process, followed by definition of ground rules and assumptions, definition of study trades, mission analysis, and subsystem analyses leading to a standardized set of mission concept study products. The flexibility of the collaborative engineering design process to accommodate a wide range of study objectives, from technology definition and requirements definition to preliminary design studies, will be addressed. The paper will also describe the applicability of the collaborative engineering process to an integrated systems analysis approach for evaluating the functional requirements of evolving system technologies and capabilities needed to meet the needs of future NASA programs.
ERIC Educational Resources Information Center
Morozov, Andrew; Kilgore, Deborah; Atman, Cynthia
2007-01-01
In this study, the authors used two methods for analyzing expert data: verbal protocol analysis (VPA) and narrative analysis. VPA has been effectively used to describe the design processes employed by engineering students, expert designers, and expert-novice comparative research. VPA involves asking participants to "think aloud" while…
Design sensitivity analysis and optimization tool (DSO) for sizing design applications
NASA Technical Reports Server (NTRS)
Chang, Kuang-Hua; Choi, Kyung K.; Perng, Jyh-Hwa
1992-01-01
The DSO tool, a structural design software system that provides the designer with a graphics-based menu-driven design environment to perform easy design optimization for general applications, is presented. Three design stages, preprocessing, design sensitivity analysis, and postprocessing, are implemented in the DSO to allow the designer to carry out the design process systematically. A framework, including data base, user interface, foundation class, and remote module, has been designed and implemented to facilitate software development for the DSO. A number of dedicated commercial software/packages have been integrated in the DSO to support the design procedures. Instead of parameterizing an FEM, design parameters are defined on a geometric model associated with physical quantities, and the continuum design sensitivity analysis theory is implemented to compute design sensitivity coefficients using postprocessing data from the analysis codes. A tracked vehicle road wheel is given as a sizing design application to demonstrate the DSO's easy and convenient design optimization process.
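The notion of a design sensitivity coefficient can be shown in miniature. DSO itself implements continuum sensitivity analysis using postprocessing data from the analysis codes; the sketch below only illustrates the concept, via a central finite difference on a closed-form sizing example.

```python
def tip_deflection(t, L=1.0, b=0.05, P=100.0, E=70e9):
    """Rectangular-section cantilever: delta = P L^3 / (3 E I), I = b t^3 / 12."""
    I = b * t**3 / 12.0
    return P * L**3 / (3.0 * E * I)

t0, h = 0.01, 1e-6   # sizing variable (wall thickness) and FD step
fd = (tip_deflection(t0 + h) - tip_deflection(t0 - h)) / (2 * h)
analytic = -3.0 * tip_deflection(t0) / t0   # since delta ~ t^-3
print(f"d(deflection)/d(thickness): FD {fd:.4e}, analytic {analytic:.4e}")
```

The close agreement between the two values is the kind of check a sensitivity-analysis module performs before the coefficients are handed to an optimizer.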
NASA Technical Reports Server (NTRS)
Amundsen, R. M.; Feldhaus, W. S.; Little, A. D.; Mitchum, M. V.
1995-01-01
Electronic integration of design and analysis processes was achieved and refined at Langley Research Center (LaRC) during the development of an optical bench for a laser-based aerospace experiment. Mechanical design has been integrated with thermal, structural, and optical analyses. Electronic import of the model geometry eliminates the repetitive steps of geometry input to develop each analysis model, leading to faster and more accurate analyses. Guidelines for integrated model development are given. This integrated analysis process has been built around software that was already in use by designers and analysts at LaRC. The process as currently implemented uses Pro/Engineer for design, Pro/Manufacturing for fabrication, PATRAN for solid modeling, NASTRAN for structural analysis, SINDA-85 and P/Thermal for thermal analysis, and Code V for optical analysis. Currently, the only analysis model to be built manually is the Code V model; all others can be imported from the Pro/E geometry. The translator from PATRAN results to Code V optical analysis (PATCOD) was developed and tested at LaRC. Directions for use of the translator and the models are given.
NASA Astrophysics Data System (ADS)
Li, Leihong
A modular structural design methodology for composite blades is developed. This design method can be used to design composite rotor blades with sophisticated geometric cross-sections. The method hierarchically decomposes the highly coupled interdisciplinary rotor analysis into global and local levels. At the global level, aeroelastic response analysis and rotor trim are conducted based on multi-body dynamic models. At the local level, variational asymptotic beam sectional analysis methods are used to obtain the equivalent one-dimensional beam properties. Compared with traditional design methodology, the proposed method is more efficient and accurate. The proposed method is then used to study three design problems that have not been investigated before. The first is adding manufacturing constraints to design optimization. The introduction of manufacturing constraints complicates the optimization process; however, a design with manufacturing constraints benefits the manufacturing process and reduces the risk of violating major performance constraints. Next, a new design procedure for structural design against fatigue failure is proposed. This procedure combines fatigue analysis with the optimization process. The durability or fatigue analysis employs a strength-based model, and the design is subject to stiffness, frequency, and durability constraints. Finally, the impacts of manufacturing uncertainty on rotor blade aeroelastic behavior are investigated, and a probabilistic design method is proposed to control the impacts of uncertainty on blade structural performance. The uncertainty factors include dimensions, shapes, material properties, and service loads.
Developing a Methodology for Designing Systems of Instruction.
ERIC Educational Resources Information Center
Carpenter, Polly
This report presents a description of a process for instructional system design, identification of the steps in the design process, and determination of their sequence and interrelationships. As currently envisioned, several interrelated steps must be taken, five of which provide the inputs to the final design process. There are analysis of…
NASA Technical Reports Server (NTRS)
Hou, Jean W.
1985-01-01
The thermal analysis and the calculation of thermal sensitivity of a cure cycle in autoclave processing of thick composite laminates were studied. A finite element program for the thermal analysis and design derivatives calculation for temperature distribution and the degree of cure was developed and verified. It was found that the direct differentiation was the best approach for the thermal design sensitivity analysis. In addition, the approach of the direct differentiation provided time histories of design derivatives which are of great value to the cure cycle designers. The approach of direct differentiation is to be used for further study, i.e., the optimal cycle design.
POLLUTION PREVENTION IN THE DESIGN OF CHEMICAL PROCESSES USING HIERARCHICAL DESIGN AND SIMULATION
The design of chemical processes is normally an interactive process of synthesis and analysis. When one also desires or needs to limit the amount of pollution generated by the process, the difficulty of the task can increase substantially. In this work, we show how combining hier...
SLS Navigation Model-Based Design Approach
NASA Technical Reports Server (NTRS)
Oliver, T. Emerson; Anzalone, Evan; Geohagan, Kevin; Bernard, Bill; Park, Thomas
2018-01-01
The SLS Program chose to implement a Model-based Design and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center, where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team has been responsible for the Program-controlled Design Math Models (DMMs), which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for the navigation algorithms, which are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1-B design, the additional GPS Receiver hardware is managed as a DMM at the vehicle design level. This paper provides a discussion of the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the Navigation components. These include composing system requirements, requirements verification, model development, model verification and validation, and modeling and analysis approaches. The Model-based Design and Requirements approach does not reduce the effort associated with the design process versus previous processes used at Marshall Space Flight Center. Instead, the approach takes advantage of overlap between the requirements development and management process and the design and analysis process by efficiently combining the control (i.e., the requirement) and the design mechanisms. The design mechanism is the representation of the component behavior and performance in design and analysis tools. The focus in the early design process shifts from the development and management of design requirements to the development of usable models, model requirements, and model verification and validation efforts. The models themselves are represented in C/C++ code and accompanying data files. Under the idealized process, potential ambiguity in specification is reduced because the model must be implementable, whereas a requirement is not necessarily subject to this constraint. Further, the models are shown to emulate the hardware during validation. For models developed by the Navigation Team, a common interface/standalone environment was developed. The common environment allows for easy implementation in design and analysis tools. Mechanisms such as unit test cases ensure implementation as the developer intended. The model verification and validation process provides a very high level of component design insight. The origin and implementation of the SLS variant of Model-based Design are described from the perspective of the SLS Navigation Team. The format of the models and the requirements are described. The Model-based Design approach has many benefits but is not without potential complications. Key lessons learned associated with the implementation of the Model-based Design approach and process from infancy to verification and certification are discussed.
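A hedged illustration of the "model plus data file plus unit test" pattern described above. The real DMMs are C/C++ code with accompanying data files, so the Python class, parameter values, and spec limits here are hypothetical stand-ins for how a unit test can pin a model to its specification.

```python
import unittest

class RateGyroModel:
    """Toy rate-gyro assembly model: scale factor and fixed bias, standing in
    for parameters that would be loaded from the accompanying data file."""
    def __init__(self, scale_factor=1.001, bias_dps=0.02):
        self.scale_factor = scale_factor
        self.bias_dps = bias_dps

    def measure(self, true_rate_dps):
        # Measured rate = scale factor * true rate + bias (deg/s)
        return self.scale_factor * true_rate_dps + self.bias_dps

class TestRateGyroModel(unittest.TestCase):
    def test_bias_within_spec(self):
        # Hypothetical spec: |bias| < 0.05 deg/s at zero input rate
        self.assertLess(abs(RateGyroModel().measure(0.0)), 0.05)

    def test_scale_factor_error_within_spec(self):
        # Hypothetical spec: relative error < 0.2% at 100 deg/s
        out = RateGyroModel().measure(100.0)
        self.assertLess(abs(out - 100.0) / 100.0, 0.002)

if __name__ == "__main__":
    unittest.main()
```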
Development of Integrated Programs for Aerospace-vehicle design (IPAD): Reference design process
NASA Technical Reports Server (NTRS)
Meyer, D. D.
1979-01-01
The airplane design process and its interfaces with manufacturing and customer operations are documented to be used as criteria for the development of integrated programs for the analysis, design, and testing of aerospace vehicles. Topics cover: design process management, general purpose support requirements, design networks, and technical program elements. Design activity sequences are given for both supersonic and subsonic commercial transports, naval hydrofoils, and military aircraft.
Information Design: A New Approach to Teaching Technical Writing Service Courses
ERIC Educational Resources Information Center
McKee, Candie DeLane
2012-01-01
This study used a needs assessment, process analysis, process design, and textbook design to develop a new process and new textbook, based on Cargile-Cook's layered literacies, Quesenbery's five qualities of usability, and Carliner's information design theories, for use in technical writing service learning courses. The needs assessment was based…
Silicon production process evaluations
NASA Technical Reports Server (NTRS)
1981-01-01
Chemical engineering analysis of the HSC process (Hemlock Semiconductor Corporation) for producing silicon from dichlorosilane in a 1,000 MT/yr plant was continued. Progress and status for the chemical engineering analysis of the HSC process are reported for the primary process design engineering activities: base case conditions (85%), reaction chemistry (85%), process flow diagram (60%), material balance (60%), energy balance (30%), property data (30%), equipment design (20%) and major equipment list (10%). Engineering design of the initial distillation column (D-01, stripper column) in the process was initiated. The function of the distillation column is to remove volatile gases (such as hydrogen and nitrogen) which are dissolved in liquid chlorosilanes. Initial specifications and results for the distillation column design are reported including the variation of tray requirements (equilibrium stages) with reflux ratio for the distillation.
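As a sketch of how tray requirements vary with reflux ratio, the Fenske equation gives the minimum equilibrium stages and the Gilliland correlation (Eduljee fit) interpolates between minimum reflux and minimum stages. The compositions, relative volatility, and assumed minimum reflux below are placeholders, not the HSC stripper design values.

```python
import numpy as np

def fenske_nmin(xD, xB, alpha):
    """Fenske minimum stages for a binary split at total reflux."""
    return np.log((xD / (1 - xD)) * ((1 - xB) / xB)) / np.log(alpha)

def gilliland_stages(R, Rmin, Nmin):
    """Stages at finite reflux via the Eduljee fit of the Gilliland correlation."""
    X = (R - Rmin) / (R + 1.0)
    Y = 0.75 * (1.0 - X**0.5668)
    return (Nmin + Y) / (1.0 - Y)     # from Y = (N - Nmin) / (N + 1)

Nmin = fenske_nmin(xD=0.99, xB=0.01, alpha=2.5)
for R in (1.2, 1.5, 2.0, 3.0):        # reflux ratios above an assumed Rmin = 1.0
    print(f"R = {R:.1f}: N = {gilliland_stages(R, Rmin=1.0, Nmin=Nmin):.1f} stages")
```

The output reproduces the familiar trade-off the report tabulates: raising the reflux ratio cuts the required number of equilibrium stages, at the cost of higher energy duty.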
Dynamic Systems Analysis for Turbine Based Aero Propulsion Systems
NASA Technical Reports Server (NTRS)
Csank, Jeffrey T.
2016-01-01
The aircraft engine design process seeks to optimize the overall system-level performance, weight, and cost for a given concept. Steady-state simulations and data are used to identify trade-offs that should be balanced to optimize the system in a process known as systems analysis. These systems analysis simulations and data may not adequately capture the true performance trade-offs that exist during transient operation. Dynamic systems analysis provides the capability for assessing the dynamic tradeoffs at an earlier stage of the engine design process. The dynamic systems analysis concept, developed tools, and potential benefit are presented in this paper. To provide this capability, the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) was developed to provide the user with an estimate of the closed-loop performance (response time) and operability (high pressure compressor surge margin) for a given engine design and set of control design requirements. TTECTrA, along with engine deterioration information, can be used to develop a more generic relationship between performance and operability that can impact the engine design constraints and potentially lead to a more efficient engine.
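A conceptual sketch, not TTECTrA itself or its interface: estimating a closed-loop response-time metric for a first-order engine lag under proportional control, the kind of quantity dynamic systems analysis trades against surge margin. All numbers are invented.

```python
import numpy as np

tau, Kp, dt = 1.2, 4.0, 0.001           # plant lag [s], proportional gain, step [s]
t = np.arange(0.0, 6.0, dt)
target, y, history = 1.0, 0.0, []
for _ in t:
    u = Kp * (target - y)               # proportional fuel-flow command
    y += dt * (u - y) / tau             # first-order engine thrust response
    history.append(y)

history = np.asarray(history)
y_ss = target * Kp / (1.0 + Kp)         # closed-loop steady-state value
t95 = t[history >= 0.95 * y_ss][0]      # response-time metric: time to 95% of final
print(f"steady state {y_ss:.2f}, 95% response time {t95:.2f} s")
```

Raising the gain shortens the response time but drives larger transient fuel-flow commands, which in a real engine would eat into the surge-margin budget; that tension is what such a tool quantifies early in design.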
Launch vehicle systems design analysis
NASA Technical Reports Server (NTRS)
Ryan, Robert; Verderaime, V.
1993-01-01
Current launch vehicle design emphasis is on low life-cycle cost. This paper applies total quality management (TQM) principles to a conventional systems design analysis process to provide low-cost, high-reliability designs. Suggested TQM techniques include Steward's systems information flow matrix method, quality leverage principle, quality through robustness and function deployment, Pareto's principle, Pugh's selection and enhancement criteria, and other design process procedures. TQM quality performance at least-cost can be realized through competent concurrent engineering teams and brilliance of their technical leadership.
Design as Knowledge Construction: Constructing Knowledge of Design
ERIC Educational Resources Information Center
Cennamo, Katherine C.
2004-01-01
In this article, I present a model of instructional design that has evolved from analysis and reflection on the process of designing materials for constructivist learning environments. I observed how we addressed the critical questions for instructional design, comparing the process to traditional instructional design models and to my emerging…
Rapid Modeling and Analysis Tools: Evolution, Status, Needs and Directions
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Stone, Thomas J.; Ransom, Jonathan B. (Technical Monitor)
2002-01-01
Advanced aerospace systems are becoming increasingly more complex, and customers are demanding lower cost, higher performance, and high reliability. Increased demands are placed on the design engineers to collaborate and integrate design needs and objectives early in the design process to minimize risks that may occur later in the design development stage. High performance systems require better understanding of system sensitivities much earlier in the design process to meet these goals. The knowledge, skills, intuition, and experience of an individual design engineer will need to be extended significantly for the next generation of aerospace system designs. Then a collaborative effort involving the designer, rapid and reliable analysis tools and virtual experts will result in advanced aerospace systems that are safe, reliable, and efficient. This paper discusses the evolution, status, needs and directions for rapid modeling and analysis tools for structural analysis. First, the evolution of computerized design and analysis tools is briefly described. Next, the status of representative design and analysis tools is described along with a brief statement on their functionality. Then technology advancements to achieve rapid modeling and analysis are identified. Finally, potential future directions including possible prototype configurations are proposed.
ERIC Educational Resources Information Center
Vanfretti, Luigi; Farrokhabadi, Mostafa
2015-01-01
This article presents the implementation of the constructive alignment theory (CAT) in a power system analysis course through a consensus-based course design process. The consensus-based design process involves both the instructor and graduate-level students and it aims to develop the CAT framework in a holistic manner with the goal of including…
ERIC Educational Resources Information Center
Potgieter, Calvyn
2013-01-01
In this article an analysis is made of the responses of 95 technology education teachers, 14 technology education lecturers and 25 design practitioners to questionnaires regarding the teaching and the application of the design process. The main purpose of the questionnaires is to determine whether there are any trends regarding the strategies and…
An Intelligent Automation Platform for Rapid Bioprocess Design.
Wu, Tianyi; Zhou, Yuhong
2014-08-01
Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user's inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity.
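A toy sketch of the three-stage loop the platform automates: a design agent proposes conditions, a liquid-handler stand-in "runs" them, and an analysis agent selects the next round. The function names, yield model, and precipitation response are all invented, not the platform's API.

```python
import random

def design_agent(prior_best, n=4):
    """Propose precipitant concentrations near the current best estimate."""
    return [max(0.0, prior_best + random.uniform(-0.5, 0.5)) for _ in range(n)]

def wet_lab_agent(conditions):
    """Stand-in for the liquid-handling robot: fake a yield response
    peaking at 2.0 M precipitant, with measurement noise."""
    return [(c, 1.0 - 0.1 * (c - 2.0) ** 2 + random.gauss(0, 0.02))
            for c in conditions]

def analysis_agent(results):
    """Pick the condition with the best measured yield."""
    return max(results, key=lambda r: r[1])[0]

best = 1.0
for round_no in range(3):
    results = wet_lab_agent(design_agent(best))
    best = analysis_agent(results)
    print(f"round {round_no + 1}: best precipitant ~ {best:.2f} M")
```

In the real platform the middle step emits robot scripts rather than calling a stub, but the message flow between the three agents is the same shape.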
An Intelligent Automation Platform for Rapid Bioprocess Design
Wu, Tianyi
2014-01-01
Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user’s inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. PMID:24088579
Framework for Multidisciplinary Analysis, Design, and Optimization with High-Fidelity Analysis Tools
NASA Technical Reports Server (NTRS)
Orr, Stanley A.; Narducci, Robert P.
2009-01-01
A plan is presented for the development of a high fidelity multidisciplinary optimization process for rotorcraft. The plan formulates individual disciplinary design problems, identifies practical high-fidelity tools and processes that can be incorporated in an automated optimization environment, and establishes statements of the multidisciplinary design problem including objectives, constraints, design variables, and cross-disciplinary dependencies. Five key disciplinary areas are selected in the development plan. These are rotor aerodynamics, rotor structures and dynamics, fuselage aerodynamics, fuselage structures, and propulsion / drive system. Flying qualities and noise are included as ancillary areas. Consistency across engineering disciplines is maintained with a central geometry engine that supports all multidisciplinary analysis. The multidisciplinary optimization process targets the preliminary design cycle where gross elements of the helicopter have been defined. These might include number of rotors and rotor configuration (tandem, coaxial, etc.). It is at this stage that sufficient configuration information is defined to perform high-fidelity analysis. At the same time there is enough design freedom to influence a design. The rotorcraft multidisciplinary optimization tool is built and substantiated throughout its development cycle in a staged approach by incorporating disciplines sequentially.
On Intelligent Design and Planning Method of Process Route Based on Gun Breech Machining Process
NASA Astrophysics Data System (ADS)
Hongzhi, Zhao; Jian, Zhang
2018-03-01
The paper presents an approach to the intelligent design and planning of process routes based on the gun breech machining process, addressing several problems such as the complex machining process of the gun breech and the tedious design and long cycle of its traditional, unmanageable process route. Based on the gun breech machining process, an intelligent design and planning system for process routes is developed by virtue of DEST and VC++. The system includes two functional modules: intelligent process route design and process route planning. The intelligent process route design module, through analysis of the gun breech machining process, summarizes breech process knowledge to complete the design of the knowledge base and inference engine; the gun breech process route is then output intelligently. On the basis of the intelligent route design module, the final process route is made, edited, and managed in the process route planning module.
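A schematic stand-in for the knowledge base and inference engine pair (the actual system is built with DEST and VC++): a few forward-chaining rules map part features to machining operations and order them into a route. The rules are invented examples, not the paper's process knowledge.

```python
RULES = [
    # (required feature, operation, priority: lower runs earlier)
    ("forged blank",     "rough turning",  10),
    ("bore",             "drilling",       20),
    ("bore",             "boring",         30),
    ("locking recess",   "milling",        40),
    ("hardened surface", "heat treatment", 50),
    ("hardened surface", "grinding",       60),
]

def plan_route(features):
    """Fire every rule whose feature is present, then order by priority."""
    ops = {(prio, op) for feat, op, prio in RULES if feat in features}
    return [op for _, op in sorted(ops)]

breech_features = {"forged blank", "bore", "locking recess", "hardened surface"}
print(" -> ".join(plan_route(breech_features)))
```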
Managing Analysis Models in the Design Process
NASA Technical Reports Server (NTRS)
Briggs, Clark
2006-01-01
Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.
Improvements in surface singularity analysis and design methods. [applicable to airfoils
NASA Technical Reports Server (NTRS)
Bristow, D. R.
1979-01-01
The coupling of the combined source vortex distribution of Green's potential flow function with contemporary numerical techniques is shown to provide accurate, efficient, and stable solutions to subsonic inviscid analysis and design problems for multi-element airfoils. The analysis problem is solved by direct calculation of the surface singularity distribution required to satisfy the flow tangency boundary condition. The design or inverse problem is solved by an iteration process. In this process, the geometry and the associated pressure distribution are iterated until the pressure distribution most nearly corresponding to the prescribed design distribution is obtained. Typically, five iteration cycles are required for convergence. A description of the analysis and design method is presented, along with supporting examples.
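A cartoon of that iteration: analyze, compare the surface pressure to the prescribed target, and apply a relaxed geometry correction until the two agree. The "flow solver" below is a stub with an assumed monotone pressure-thickness relation, not a source-vortex panel method.

```python
import numpy as np

def analyze(thickness):
    """Stub flow solver: pretend Cp depends monotonically on local thickness."""
    return -1.0 - 2.0 * thickness

s = np.linspace(0.0, 1.0, 21)                  # surface arc-length stations
thickness = 0.10 * np.ones_like(s)             # initial geometry parameter
cp_target = -1.0 - 2.0 * (0.12 - 0.08 * s)     # prescribed design distribution

for cycle in range(5):                         # ~5 cycles, as in the abstract
    cp = analyze(thickness)
    err = np.abs(cp - cp_target).max()
    print(f"cycle {cycle + 1}: max |Cp error| = {err:.4f}")
    thickness += 0.3 * (cp - cp_target)        # relaxed geometry correction
```

With the under-relaxation factor of 0.3 the pressure error contracts by a fixed ratio each cycle, mirroring the roughly five-cycle convergence the abstract reports.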
Coal gasification systems engineering and analysis. Appendix A: Coal gasification catalog
NASA Technical Reports Server (NTRS)
1980-01-01
The scope of work in preparing the Coal Gasification Data Catalog included the following subtasks: (1) candidate system/subsystem definition, (2) raw materials analysis, (3) market analysis for by-products, (4) alternate products analysis, and (5) preliminary integrated facility requirements. Definition of candidate systems/subsystems includes the identity of and alternates for each process unit, raw material requirements, and the cost and design drivers for each process design.
2013-04-01
...project was to provide the Royal Canadian Navy (RCN) with a set of guidelines on analysis, design, and verification processes for effective room... design, and verification processes that should be used in the development of effective room layouts for Royal Canadian Navy (RCN) ships. The primary... designed CSC; however, the guidelines could be applied to the design of any multiple-operator space in any RCN vessel. Results: The development of
Djuris, Jelena; Medarevic, Djordje; Krstic, Marko; Djuric, Zorica; Ibric, Svetlana
2013-06-01
This study illustrates the application of experimental design and multivariate data analysis in defining the design space for granulation and tableting processes. According to quality by design concepts, critical quality attributes (CQAs) of granules and tablets, as well as critical parameters of the granulation and tableting processes, were identified and evaluated. Acetaminophen was used as the model drug, and one of the study aims was to investigate the possibility of developing immediate- or extended-release acetaminophen tablets. Granulation experiments were performed in a fluid bed processor using polyethylene oxide polymer as a binder in the direct granulation method. Tablets were compressed in a laboratory eccentric tablet press. The first set of experiments was organized according to a Plackett-Burman design, followed by a full factorial experimental design. Principal component analysis (PCA) and partial least squares (PLS) regression were applied as the multivariate analysis techniques. By using these different methods, CQAs and process parameters were identified and quantified. Furthermore, an in-line method was developed to monitor the temperature during the fluidized bed granulation process, to foresee possible defects in granule CQAs. Various control strategies that are based on process understanding and assure the desired quality attributes of the product are proposed.
Teaching Tip: Using Activity Diagrams to Model Systems Analysis Techniques: Teaching What We Preach
ERIC Educational Resources Information Center
Lending, Diane; May, Jeffrey
2013-01-01
Activity diagrams are used in Systems Analysis and Design classes as a visual tool to model the business processes of "as-is" and "to-be" systems. This paper presents the idea of using these same activity diagrams in the classroom to model the actual processes (practices and techniques) of Systems Analysis and Design. This tip…
An expert system for integrated structural analysis and design optimization for aerospace structures
NASA Technical Reports Server (NTRS)
1992-01-01
The results of a research study on the development of an expert system for integrated structural analysis and design optimization are presented. An Object Representation Language (ORL) was developed first in conjunction with a rule-based system. This ORL/AI shell was then used to develop expert systems to provide assistance with a variety of structural analysis and design optimization tasks, in conjunction with procedural modules for finite element structural analysis and design optimization. The main goal of the research study was to provide expertise, judgment, and reasoning capabilities in the aerospace structural design process. This will allow engineers performing structural analysis and design, even without extensive experience in the field, to develop error-free, efficient, and reliable structural designs very rapidly and cost-effectively. This would not only improve the productivity of design engineers and analysts, but also significantly reduce the time to completion of structural design. An extensive literature survey in the fields of structural analysis, design optimization, artificial intelligence, and database management systems and their application to the structural design process was first performed. A feasibility study was then performed, and the architecture and conceptual design for the integrated 'intelligent' structural analysis and design optimization software were developed. An Object Representation Language (ORL), in conjunction with a rule-based system, was then developed using C++. Such an approach would improve the expressiveness of knowledge representation (especially for structural analysis and design applications), provide the ability to build very large and practical expert systems, and provide an efficient way of storing knowledge. Functional specifications for the expert systems were then developed. The ORL/AI shell was then used to develop a variety of expert system modules for modeling, finite element analysis, and design optimization tasks in the integrated aerospace structural design process. These expert systems were developed to work in conjunction with procedural finite element structural analysis and design optimization modules (developed in-house at SAT, Inc.). The complete software, AutoDesign, so developed, can be used for integrated 'intelligent' structural analysis and design optimization. The software was beta-tested at a variety of companies and used by a range of engineers with different levels of background and expertise. Based on the feedback obtained from such users, conclusions were developed and are provided.
Development of Innovative Design Processor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, Y.S.; Park, C.O.
2004-07-01
The nuclear design analysis requires time-consuming and error-prone model-input preparation, code runs, output analysis, and quality assurance. To reduce human effort and improve design quality and productivity, the Innovative Design Processor (IDP) is being developed. The two basic principles of IDP are document-oriented design and web-based design. In document-oriented design, the designer writes a design document called an active document and feeds it to a special program, which automatically produces the final document with the complete analysis, tables, and plots. The active documents can be written with ordinary HTML editors or created automatically on the web, which is the other framework of IDP. Using a proper mix of server-side and client-side programming under the LAMP (Linux/Apache/MySQL/PHP) environment, the design process on the web is modeled in a design-wizard style so that even a novice designer can produce the design document easily. This automation using the IDP is now being implemented for all reload designs of Korea Standard Nuclear Power Plant (KSNP) type PWRs. The introduction of this process will allow a large reduction in all KSNP reload design efforts and provide a platform for design and R&D tasks of KNFC. (authors)
Cognitive Task Analysis, Interface Design, and Technical Troubleshooting.
ERIC Educational Resources Information Center
Steinberg, Linda S.; Gitomer, Drew H.
A model of the interface design process is proposed that makes use of two interdependent levels of cognitive analysis: the study of the criterion task through an analysis of expert/novice differences and the evaluation of the working user interface design through the application of a practical interface analysis methodology (GOMS model). This dual…
Methodology for CFD Design Analysis of National Launch System Nozzle Manifold
NASA Technical Reports Server (NTRS)
Haire, Scot L.
1993-01-01
The current design environment dictates that high-technology CFD (Computational Fluid Dynamics) analysis produce quality results in a timely manner if it is to be integrated into the design process. The design methodology outlined describes the CFD analysis of an NLS (National Launch System) nozzle film cooling manifold. The objective of the analysis was to obtain a qualitative estimate of the flow distribution within the manifold. A complex, 3D, multiple-zone, structured grid was generated from a 3D CAD file of the geometry. An Euler solution was computed with a fully implicit compressible flow solver. Postprocessing consisted of full 3D color graphics and mass-averaged performance. The result was a qualitative CFD solution that provided the design team with relevant information concerning the flow distribution in, and performance characteristics of, the film cooling manifold within an effective time frame. This design methodology also served as the foundation for a quick-turnaround CFD analysis of the next iteration of the manifold design.
Multi-Disciplinary, Multi-Fidelity Discrete Data Transfer Using Degenerate Geometry Forms
NASA Technical Reports Server (NTRS)
Olson, Erik D.
2016-01-01
In a typical multi-fidelity design process, different levels of geometric abstraction are used for different analysis methods, and transitioning from one phase of design to the next often requires a complete re-creation of the geometry. To maintain consistency between lower-order and higher-order analysis results, Vehicle Sketch Pad (OpenVSP) recently introduced the ability to generate and export several degenerate forms of the geometry, representing the type of abstraction required to perform low- to medium-order analysis for a range of aeronautical disciplines. In this research, the functionality of these degenerate models was extended, so that in addition to serving as repositories for the geometric information that is required as input to an analysis, the degenerate models can also store the results of that analysis mapped back onto the geometric nodes. At the same time, the results are also mapped indirectly onto the nodes of lower-order degenerate models using a process called aggregation, and onto higher-order models using a process called disaggregation. The mapped analysis results are available for use by any subsequent analysis in an integrated design and analysis process. A simple multi-fidelity analysis process for a single-aisle subsonic transport aircraft is used as an example case to demonstrate the value of the approach.
An Overview of the Role of Systems Analysis in NASA's Hypersonics Project
NASA Technical Reports Server (NTRS)
Robinson, Jeffrey S.; Martin, John G.; Bowles, Jeffrey V.
2006-01-01
NASA's Aeronautics Research Mission Directorate recently restructured its Vehicle Systems Program, refocusing it towards understanding the fundamental physics that govern flight in all speed regimes. Now called the Fundamental Aeronautics Program, it comprises four new projects: Subsonic Fixed Wing, Subsonic Rotary Wing, Supersonics, and Hypersonics. The Aeronautics Research Mission Directorate has charged the Hypersonics Project with developing a basic understanding of all systems that travel at hypersonic speeds within the atmospheres of the Earth and other planets. This includes both powered and unpowered systems, such as re-entry vehicles and vehicles powered by rocket or airbreathing propulsion that cruise in and accelerate through the atmosphere. The primary objective of the Hypersonics Project is to develop physics-based predictive tools that enable the design, analysis, and optimization of such systems. The Hypersonics Project charges the systems analysis discipline team with providing the decision-making information it needs to properly guide research and technology development. Credible, rapid, and robust multi-disciplinary system analysis processes and design tools are required to generate this information. To this end, the principal challenges for the systems analysis team are the introduction of high-fidelity physics into the analysis process and its integration into a design environment, quantification of design uncertainty through the use of probabilistic methods, reduction in design cycle time, and the development and implementation of robust processes and tools enabling a wide design space and an associated technology assessment capability. This paper discusses the roles and responsibilities of the systems analysis discipline team within the Hypersonics Project, as well as the tools, methods, processes, and approach the team will use to perform its project-designated functions.
NASA Technical Reports Server (NTRS)
Zeiler, Thomas A.; Pototzky, Anthony S.
1989-01-01
A theoretical basis and example calculations are given that demonstrate the relationship between the Matched Filter Theory approach to the calculation of time-correlated gust loads and the Phased Design Loads Analysis in common use in the aerospace industry. The relationship depends upon the duality between Matched Filter Theory and Random Process Theory, and upon the fact that Random Process Theory is used in Phased Design Loads Analysis to determine an equiprobable loads design ellipse. Extensive background information is given describing the relevant points of Phased Design Loads Analysis, the calculation of time-correlated gust loads with Matched Filter Theory, and the duality between Matched Filter Theory and Random Process Theory. It is then shown that the time histories of two time-correlated gust load responses, determined using the Matched Filter Theory approach, can be plotted as parametric functions of time, and that the resulting plot, when superposed upon the design ellipse corresponding to the two loads, is tangent to the ellipse. The question is raised of whether a parametric load plot can extend outside the associated design ellipse. If it can, then the use of the equiprobable loads design ellipse will not be a conservative design practice in some circumstances.
Tribology symposium 1995. PD-Volume 72
DOE Office of Scientific and Technical Information (OSTI.GOV)
Masudi, H.
After the keynote presentation by Professor Aaron Cohen of Texas A&M University, entitled Processes Used in Design, the program is divided into five major sessions: Research and Development -- recent research and development of tribological components; Tribology in Manufacturing -- the impact of tribology on modern manufacturing; Design/Design Representation -- aspects of design related to tribological systems; Tribo-Chemistry/Tribo-Physics -- discussion of chemical and physical behavior of substances as related to tribology; and Failure Analysis -- an analysis of failure, failure detection, and failure monitoring as related to manufacturing processes. Papers have been processed separately for inclusion in the database.
…fermentation and refining process. One of his favorite topics is the design and commissioning of custom research equipment. His areas of expertise include: project management; process design, equipment design, and fabrication; instrumentation and controls design and programming; data analysis and presentation.
An Instructional Design Framework for Fostering Student Engagement in Online Learning Environments
ERIC Educational Resources Information Center
Czerkawski, Betul C.; Lyman, Eugene W.
2016-01-01
Many approaches, models and frameworks exist when designing quality online learning environments. These approaches assist and guide instructional designers through the process of analysis, design, development, implementation and evaluation of instructional processes. Some of these frameworks are concerned with student participation, some with…
Optimisation of warpage on plastic injection moulding part using response surface methodology (RSM)
NASA Astrophysics Data System (ADS)
Miza, A. T. N. A.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Rashidi, M. M.
2017-09-01
Warpage is often encountered during the injection moulding of thin-shell parts, depending on the process conditions. A statistical design-of-experiments approach integrating finite element (FE) analysis, Moldflow analysis, and response surface methodology (RSM) was investigated as a way to minimize the warpage values in x, y, and z on thin-shell plastic parts. A battery cover for a remote controller, a typical thin-shell plastic part produced by injection moulding, was studied. The optimum process parameters were determined so as to minimize warpage. Four parameters were considered in this study: packing pressure, cooling time, melt temperature, and mould temperature. A two-level full factorial experimental design was conducted in Design-Expert for the RSM analysis combining these parameters. Analysis of variance (ANOVA) of the FE results identified the process parameters that most influenced warpage. Using RSM, a predictive response surface model for the warpage data is presented.
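The fit-a-quadratic-surface-then-minimize pattern described above can be sketched in a few lines. The factor ranges, coefficients, and "measured" warpage values below are hypothetical stand-ins for the FE/Moldflow runs; only the 2^4 factorial plus RSM structure follows the abstract.

```python
# Minimal response-surface sketch for warpage minimization over four
# coded factors. All "measurements" are synthetic.
import numpy as np
from itertools import product
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Coded levels (-1/+1) for packing pressure, cooling time,
# melt temperature, mould temperature: a 2^4 full factorial design.
X = np.array(list(product([-1, 1], repeat=4)), dtype=float)

# Hypothetical warpage responses (mm), standing in for simulation runs.
rng = np.random.default_rng(1)
w = (0.60 - 0.12 * X[:, 0] + 0.08 * X[:, 1] - 0.05 * X[:, 2]
     + 0.04 * X[:, 0] * X[:, 2] + rng.normal(0, 0.01, len(X)))

# Fit a second-order response surface, then search it on a fine grid.
poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(poly.fit_transform(X), w)

grid = np.array(list(product(np.linspace(-1, 1, 9), repeat=4)))
pred = model.predict(poly.transform(grid))
best = grid[np.argmin(pred)]
print("predicted minimum warpage %.3f mm at coded settings %s"
      % (pred.min(), best))
```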
Model reduction in integrated controls-structures design
NASA Technical Reports Server (NTRS)
Maghami, Peiman G.
1993-01-01
It is the objective of this paper to present a model reduction technique developed for the integrated controls-structures design of flexible structures. Integrated controls-structures design problems are typically posed as nonlinear mathematical programming problems, where the design variables consist of both structural and control parameters. In the solution process, both structural and control design variables are constantly changing; therefore, the dynamic characteristics of the structure are also changing. This presents a problem in obtaining a reduced-order model for active control design and analysis which will be valid for all design points within the design space. In other words, the frequency and number of the significant modes of the structure (modes that should be included) may vary considerably throughout the design process. This is also true as the locations and/or masses of the sensors and actuators change. Moreover, since the number of design evaluations in the integrated design process could easily run into thousands, any feasible order-reduction method should not require model reduction analysis at every design iteration. In this paper a novel and efficient technique for model reduction in the integrated controls-structures design process, which addresses these issues, is presented.
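For readers unfamiliar with the baseline the paper improves on, the snippet below shows plain modal truncation: solve the structural eigenproblem once and project the mass and stiffness matrices onto the lowest modes. The spring-mass chain is invented for illustration, and this fixed-basis reduction is exactly what becomes invalid as the integrated design iterates, which is the problem the paper's technique addresses.

```python
# Generic modal truncation: project (M, K) onto the lowest few modes.
# The chain model is illustrative background, not the paper's method.
import numpy as np
from scipy.linalg import eigh

n = 20                                       # degrees of freedom
ks, m = 1.0e4, 2.0                           # spring stiffness, lumped mass
K = ks * (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1))
M = m * np.eye(n)

# Generalized eigenproblem K phi = w^2 M phi; eigh sorts ascending.
w2, phi = eigh(K, M)
r = 4                                        # modes retained
Phi = phi[:, :r]

Mr = Phi.T @ M @ Phi                         # reduced mass (≈ identity)
Kr = Phi.T @ K @ Phi                         # reduced stiffness (≈ diag(w^2))
print("kept frequencies (Hz):", np.sqrt(w2[:r]) / (2 * np.pi))
```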
Techno-economic analysis: process model development for existing and conceptual processes; detailed heat integration; economic analysis of integrated processes; integration of process simulation learnings into control. Conceptual Process Design and Techno-Economic Assessment of Ex Situ Catalytic Fast Pyrolysis of Biomass: A…
Typical uses of NASTRAN in a petrochemical industry
NASA Technical Reports Server (NTRS)
Winter, J. R.
1978-01-01
NASTRAN was principally used to perform failure analysis and to redesign process equipment. It was also employed in the evaluation of vendor designs and of proposed design modifications to existing process equipment. Stress analyses of forced draft fans, distillation trays, metal stacks, jacketed pipes, heat exchangers, large centrifugal fans, and agitator support structures are described.
Analysis of digester design concepts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ashare, E.; Wilson, E. H.
1979-01-29
Engineering economic analyses were performed on various digester design concepts to determine the relative performance for various biomass feedstocks. A comprehensive literature survey describing the state of the art of the various digestion designs is included. The digester designs included in the analyses are CSTR, plug flow, batch, CSTRs in series, multi-stage digestion, and biomethanation. Other process options investigated included pretreatment processes, such as shredding, degritting, and chemical pretreatment, and post-digestion processes, such as dewatering and gas purification. The biomass sources considered include feedlot manure, rice straw, and bagasse. The results of the analysis indicate that the most economical (on a unit gas cost basis) digester design concept is the plug flow reactor. This conclusion results from this system providing a high gas production rate combined with a low-capital hole-in-the-ground digester design. The costs determined in this analysis do not include any credits or penalties for feedstock or by-products, but present the costs only for conversion of biomass to methane. The batch landfill-type digester design was shown to have a unit gas cost comparable to that of a conventional stirred tank digester, with the potential of reducing the cost if a landfill site were available at a lower cost per unit volume. The use of chemical pretreatment resulted in a higher unit gas cost, primarily due to the cost of the pretreatment chemical. A sensitivity analysis indicated that chemical pretreatment could improve the economics provided a process could be developed that used either less pretreatment chemical or a less costly chemical. The use of other process options resulted in higher unit gas costs. These options should only be used when necessary for proper process performance, or to produce a valuable by-product.
Finding a Method for the Madness: A Comparative Analysis of Strategic Design Methodologies
Donnelly, Amanda
2017-06-01
This thesis develops a comparative model for strategic design methodologies, focusing on the primary elements of vision, time, process, communication and collaboration, and risk assessment. The analysis dissects and compares three potential design methodologies: net assessment, scenarios, and…
Sketching in Design Journals: An Analysis of Visual Representations in the Product Design Process
ERIC Educational Resources Information Center
Lau, Kimberly; Oehlberg, Lora; Agogino, Alice
2009-01-01
This paper explores the sketching behavior of designers and the role of sketching in the design process. Observations from a descriptive study of sketches provided in design journals, characterized by a protocol measuring sketching activities, are presented. A distinction is made between journals that are entirely tangible and those that contain…
A Conceptual Aerospace Vehicle Structural System Modeling, Analysis and Design Process
NASA Technical Reports Server (NTRS)
Mukhopadhyay, Vivek
2007-01-01
A process for aerospace structural concept analysis and design is presented, with examples of a blended-wing-body fuselage, a multi-bubble fuselage concept, a notional crew exploration vehicle, and a high-altitude long-endurance aircraft. Aerospace vehicle structures must withstand all anticipated mission loads, yet must be designed for optimal structural weight with the required safety margins. For a viable systems study of advanced concepts, these conflicting requirements must be imposed and analyzed early in the conceptual design cycle, preferably with a high degree of fidelity. In this design process, integrated multidisciplinary analysis tools are used in a collaborative engineering environment. First, parametric solid and surface models, including the internal structural layout, are developed for detailed finite element analyses. Multiple design scenarios are generated for analyzing several structural configurations and material alternatives. The structural stress, deflection, strain, and margin-of-safety distributions are visualized and the design is improved. Over several design cycles, refined vehicle part and assembly models are generated. The accumulated design data are used for structural mass comparison and concept ranking. The present application focuses on the blended-wing-body vehicle structure; advanced composite materials are also discussed.
Articulating the Resources for Business Process Analysis and Design
ERIC Educational Resources Information Center
Jin, Yulong
2012-01-01
Effective process analysis and modeling are important phases of the business process management lifecycle. When many activities and multiple resources are involved, it is very difficult to build a correct business process specification. This dissertation provides a resource perspective of business processes. It aims at a better process analysis…
Innovating Method of Existing Mechanical Product Based on TRIZ Theory
NASA Astrophysics Data System (ADS)
Zhao, Cunyou; Shi, Dongyan; Wu, Han
The main ways of developing products are adaptive design and variant design based on an existing product. In this paper, a conceptual design framework and its flow model for innovating products are put forward by combining the methods of conceptual design with TRIZ theory. A process system model of innovative design is constructed, comprising requirement analysis, total function analysis and decomposition, engineering problem analysis, engineering problem solving, and preliminary design; this establishes the basis for the innovative design of existing products.
NASA Technical Reports Server (NTRS)
Deckert, George
2010-01-01
This viewgraph presentation reviews The NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts
Process-based organization design and hospital efficiency.
Vera, Antonio; Kuntz, Ludwig
2007-01-01
The central idea of process-based organization design is that organizing a firm around core business processes leads to cost reductions and quality improvements. We investigated theoretically and empirically whether the implementation of a process-based organization design is advisable in hospitals. The data came from a database compiled by the Statistical Office of the German federal state of Rheinland-Pfalz and from a written questionnaire, which was sent to the chief executive officers (CEOs) of all 92 hospitals in this federal state. We used data envelopment analysis (DEA) to measure hospital efficiency, and factor analysis and regression analysis to test our hypothesis. Our principal finding is that a high degree of process-based organization has a moderate but significant positive effect on the efficiency of hospitals. The main implication is that hospitals should implement a process-based organization to improve their efficiency. However, to actually achieve positive effects on efficiency, it is of paramount importance to observe some implementation rules, in particular to mobilize physician participation and to create an adequate organizational culture.
Multidisciplinary Design, Analysis, and Optimization Tool Development Using a Genetic Algorithm
NASA Technical Reports Server (NTRS)
Pak, Chan-gi; Li, Wesley
2009-01-01
Multidisciplinary design, analysis, and optimization using a genetic algorithm is being developed at the National Aeronautics and Space Administration Dryden Flight Research Center (Edwards, California) to automate the analysis and design process by leveraging existing tools, enabling true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic, and hypersonic aircraft. This is a promising technology, but it faces many challenges in large-scale, real-world applications. This report describes current approaches, recent results, and challenges for multidisciplinary design, analysis, and optimization as demonstrated by experience with the Ikhana fire pod design.
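As a flavor of the underlying machinery, here is a minimal genetic algorithm for a constrained design minimization. The two-variable "weight" objective and "stress" constraint are toy stand-ins for the discipline analyses such a tool would actually call; none of this is the Dryden implementation.

```python
# Minimal genetic algorithm for constrained design minimization.
# Objective and constraint are toy stand-ins for discipline analyses.
import numpy as np

rng = np.random.default_rng(42)
LO, HI = np.array([0.1, 0.1]), np.array([5.0, 5.0])    # design-variable bounds

def fitness(x):
    weight = x[0] ** 2 + 2.0 * x[1]                    # pretend structural weight
    stress = 10.0 / (x[0] * x[1])                      # pretend stress response
    penalty = max(0.0, stress - 8.0) * 100.0           # constraint violation
    return weight + penalty

pop = rng.uniform(LO, HI, size=(40, 2))
for gen in range(100):
    f = np.array([fitness(x) for x in pop])
    parents = pop[np.argsort(f)[:20]]                  # selection: keep best half
    mates = parents[rng.permutation(20)]
    alpha = rng.random((20, 1))
    children = alpha * parents + (1 - alpha) * mates   # blend crossover
    children += rng.normal(0, 0.05, children.shape)    # Gaussian mutation
    pop = np.clip(np.vstack([parents, children]), LO, HI)

best = pop[np.argmin([fitness(x) for x in pop])]
print("best design:", best, "objective:", fitness(best))
```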
Interactive Image Analysis System Design,
1982-12-01
This report describes a design for an interactive image analysis system (IIAS), which implements terrain data extraction techniques. The design employs commercially available, state-of-the-art minicomputers and image display devices with proven software to achieve a cost-effective, reliable image analysis system. Additionally, the system is fully capable of supporting many generic types of image analysis and data processing, and is modularly…
Analysis of Design-Build Processes, Best Practices, and Applications to the Department of Defense
2006-06-01
NAVFAC design-build processes published in trade journals, books, magazines, internet articles, and DoD policy. In their book, Contract Management…, the literature review concentrates on recent articles published in books, trade magazines, and on the internet to determine design-build processes and… (Keith Molenaar). Design-build projects under the State of California's Public Contract Code (Legaltips.org, 2006) require the owner, for example the…
ERIC Educational Resources Information Center
Vann, Linda S.
2017-01-01
Instructional designers are tasked with making instructional strategy decisions to facilitate achievement of learning outcomes as part of their professional responsibilities. While the instructional design process includes learner analysis, that analysis alone does not embody opportunities to assist instructional designers with demonstrations of…
Reasserting the Fundamentals of Systems Analysis and Design through the Rudiments of Artifacts
ERIC Educational Resources Information Center
Jafar, Musa; Babb, Jeffry
2012-01-01
In this paper we present an artifacts-based approach to teaching a senior level Object-Oriented Analysis and Design course. Regardless of the systems development methodology and process model, and in order to facilitate communication across the business modeling, analysis, design, construction and deployment disciplines, we focus on (1) the…
Aviation System Analysis Capability Executive Assistant Design
NASA Technical Reports Server (NTRS)
Roberts, Eileen; Villani, James A.; Osman, Mohammed; Godso, David; King, Brent; Ricciardi, Michael
1998-01-01
In this technical document, we describe the design developed for the Aviation System Analysis Capability (ASAC) Executive Assistant (EA) Proof of Concept (POC). We describe the genesis and role of the ASAC system, discuss the objectives of the ASAC system and provide an overview of components and models within the ASAC system, and describe the design process and the results of the ASAC EA POC system design. We also describe the evaluation process and results for applicable COTS software. The document has six chapters, a bibliography, three appendices and one attachment.
Tribology symposium -- 1994. PD-Volume 61
DOE Office of Scientific and Technical Information (OSTI.GOV)
Masudi, H.
This year marks the first Tribology Symposium within the Energy-Sources Technology Conference, sponsored by the ASME Petroleum Division. The program was divided into five sessions: Tribology in High Technology -- a historical discussion of some watershed events in tribology; Research/Development -- design, research, and development for modern manufacturing; Tribology in Manufacturing -- the impact of tribology on modern manufacturing; Design/Design Representation -- aspects of design related to tribological systems; and Failure Analysis -- an analysis of failure, failure detection, and failure monitoring as relating to manufacturing processes. Eleven papers have been processed separately for inclusion in the database.
Human Factors Analysis to Improve the Processing of Ares-1 Launch Vehicle
NASA Technical Reports Server (NTRS)
Stambolian, Damon B.; Dippolito, Gregory M.; Nguyen, Bao; Dischinger, Charles; Tran, Donald; Henderson, Gena; Barth, Tim
2011-01-01
This slide presentation reviews the use of human factors analysis in improving the ground processing procedures for the Ares-1 launch vehicle. The flight vehicle engineering designers for the Ares-1 launch vehicle had to design the vehicle for effective, efficient, and safe ground operations within the cramped dimensions of a rocket design. The use of a mockup of the area where the technician would be required to work proved to be a very effective method of promoting collaboration between the Ares-1 designers and the ground operations personnel.
Physical explosion analysis in heat exchanger network design
NASA Astrophysics Data System (ADS)
Pasha, M.; Zaini, D.; Shariff, A. M.
2016-06-01
The failure of shell and tube heat exchangers is extensively experienced by the chemical process industries. This failure can create a loss of production for a long duration. Moreover, loss of containment from a heat exchanger could potentially lead to a credible event such as fire, explosion, or toxic release. There is a need to analyse the possible worst-case effects originating from loss of containment of the heat exchanger at the early design stage. Physical explosion analysis during heat exchanger network design is presented in this work. The Baker and Prugh explosion models are deployed for assessing the explosion effect. Microsoft Excel was integrated with a process design simulator through object linking and embedding (OLE) automation for this analysis, with Aspen HYSYS V8.0 used as the simulation platform. A typical heat exchanger network for a steam reforming and shift conversion process is presented as a case study. The analysis shows that the overpressure generated by the physical explosion of each heat exchanger can be estimated more precisely using the Prugh model. The present work could potentially assist the design engineer in identifying the critical heat exchanger in the network at the preliminary design stage.
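A Prugh-style screening calculation can be sketched as follows: estimate the stored energy of the compressed gas with Brode's equation, convert it to a TNT-equivalent mass, and form the scaled distance used with standard blast charts. The vessel conditions below are hypothetical, and the 4.68 MJ/kg TNT energy is one commonly used convention; treat this as an assumption-laden sketch, not the paper's exact procedure.

```python
# Vessel-burst screening: Brode energy -> TNT equivalent -> scaled distance.
# Hypothetical exchanger shell conditions; the overpressure at a target
# distance would then be read from standard TNT scaled-distance charts.
p1 = 2.0e6        # burst (shell-side) pressure, Pa  -- assumed
p0 = 1.013e5      # ambient pressure, Pa
V = 3.0           # gas-filled shell volume, m^3     -- assumed
gamma = 1.4       # ratio of specific heats of the contained gas

E = (p1 - p0) * V / (gamma - 1.0)     # Brode estimate of stored energy, J
E_TNT = 4.68e6                        # energy per kg TNT, J/kg (convention)
W = E / E_TNT                         # TNT-equivalent mass, kg

R = 20.0                              # distance to target, m -- assumed
Z = R / W ** (1.0 / 3.0)              # scaled distance, m/kg^(1/3)
print(f"E = {E/1e6:.1f} MJ, W_TNT = {W:.1f} kg, Z = {Z:.1f} m/kg^(1/3)")
```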
Silicon production process evaluations
NASA Technical Reports Server (NTRS)
1982-01-01
Chemical engineering analyses involving the preliminary process design of a plant (1,000 metric tons/year capacity) to produce silicon via the technology under consideration were accomplished. Major activities in the chemical engineering analyses included base case conditions, reaction chemistry, process flowsheet, material balance, energy balance, property data, equipment design, major equipment list, and production labor, which were carried forward for the economic analysis. The process design package provided detailed data for the raw materials, utilities, major process equipment, and production labor requirements necessary for polysilicon production in each process.
Anthropometric Accommodation in Space Suit Design
NASA Technical Reports Server (NTRS)
Rajulu, Sudhakar; Thaxton, Sherry
2007-01-01
Design requirements for next-generation hardware are in development at NASA. Anthropometry requirements are given in terms of minimum and maximum sizes for the critical dimensions that hardware must accommodate. These dimensions drive vehicle and suit design, and implicitly affect crew selection and participation. At this stage in the process, stakeholders such as cockpit and suit designers were asked to provide lists of dimensions that will be critical for their designs, along with technically feasible minimum and maximum ranges for these dimensions. Using an adjusted 1988 Anthropometric Survey of U.S. Army Personnel (ANSUR) database to represent a future astronaut population, the accommodation ranges provided by the suit-critical dimensions were calculated. This project involved participation from the Anthropometry and Biomechanics Facility (ABF) as well as suit designers, with the suit designers providing expertise about feasible hardware dimensions and the ABF providing the accommodation analysis. The initial analysis provided the suit design team with the accommodation levels associated with the critical dimensions identified early in the study. Additional outcomes will include a comparison of principal components analysis as an alternate method for anthropometric analysis.
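The accommodation calculation described, i.e., the share of an anthropometric population that falls inside the feasible min/max ranges on every critical dimension at once, can be sketched as below. The dimensions, limits, and population statistics are invented; a real analysis would use the ANSUR records directly.

```python
# Multivariate accommodation: fraction of a population within min/max
# limits on all critical dimensions simultaneously. Data are synthetic.
import numpy as np

rng = np.random.default_rng(7)
# Stand-in population (mm): stature, sitting height, arm span; correlated.
mean = np.array([1755.0, 914.0, 1790.0])
cov = np.array([[5776, 2400, 5200],
                [2400, 1369, 2100],
                [5200, 2100, 6400]], dtype=float)
pop = rng.multivariate_normal(mean, cov, size=100_000)

# Hypothetical hardware limits per dimension (min, max), e.g. suit sizing.
limits = np.array([[1580.0, 1905.0],
                   [ 840.0,  990.0],
                   [1600.0, 1980.0]])

inside = np.all((pop >= limits[:, 0]) & (pop <= limits[:, 1]), axis=1)
print(f"accommodated: {inside.mean():.1%} of the population")
# Note: per-dimension pass rates overstate joint accommodation, which is
# why the multivariate (all-dimensions-at-once) check matters.
```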
Reference Models for Structural Technology Assessment and Weight Estimation
NASA Technical Reports Server (NTRS)
Cerro, Jeff; Martinovic, Zoran; Eldred, Lloyd
2005-01-01
Previously, the Exploration Concepts Branch of NASA Langley Research Center developed techniques for automating the preliminary-design level of launch vehicle airframe structural analysis in order to enhance historical regression-based mass estimating relationships. This past work was useful and greatly reduced design time; however, its application area was very narrow in terms of the variety of structural and vehicle general-arrangement alternatives it could handle. Implementation of the analysis approach presented herein also incorporates some newly developed computer programs. Loft is a program developed to create analysis meshes and simultaneously define structural-element design regions; a simple component-defining ASCII file is read by Loft to begin the design process. HSLoad is a Visual Basic implementation of the HyperSizer Application Programming Interface, which automates the structural-element design process. Details of these two programs and their use are explained in this paper. A feature which falls naturally out of the above analysis paradigm is the concept of "reference models". The flexibility of the FEA-based JAVA processing procedures and associated process-control classes, coupled with the general utility of Loft and HSLoad, makes it possible to create generic program template files for the analysis of components ranging from something as simple as a stiffened flat panel, to curved panels, fuselage and cryogenic tank components, flight control surfaces, and wings, through full air and space vehicle general arrangements.
Considerations for the Systematic Analysis and Use of Single-Case Research
ERIC Educational Resources Information Center
Horner, Robert H.; Swaminathan, Hariharan; Sugai, George; Smolkowski, Keith
2012-01-01
Single-case research designs provide a rigorous research methodology for documenting experimental control. If single-case methods are to gain wider application, however, a need exists to define more clearly (a) the logic of single-case designs, (b) the process and decision rules for visual analysis, and (c) an accepted process for integrating…
Automated array assembly task, phase 1
NASA Technical Reports Server (NTRS)
Carbajal, B. G.
1977-01-01
An assessment of state-of-the-art technologies applicable to silicon solar cell and solar cell module fabrication is provided. The assessment consists of a technical feasibility evaluation and a cost projection for high-volume production of silicon solar cell modules. The cost projection was approached from two directions: a design-to-cost analysis assigned cost goals to each major process element in the fabrication scheme, and a cost analysis built up projected costs for alternate technologies for each process element. A technical evaluation was used in combination with the cost analysis to identify a baseline low-cost process. A novel approach to metal pattern design based on minimum power loss was developed. These design equations were used as a tool in the evaluation of metallization technologies.
Prediction of Cutting Force in Turning Process-an Experimental Approach
NASA Astrophysics Data System (ADS)
Thangarasu, S. K.; Shankar, S.; Thomas, A. Tony; Sridhar, G.
2018-02-01
This paper deals with the prediction of cutting forces in a turning process. Turning with advanced cutting tools has several advantages over grinding, such as short cycle times, process flexibility, comparable surface roughness, high material removal rates, and fewer environmental problems through eliminating cutting fluid. A full-bridge dynamometer was used to measure the cutting forces on a mild steel workpiece with a cemented carbide insert tool for different combinations of cutting speed, feed rate, and depth of cut. The experiments were planned based on a Taguchi design, and the measured cutting forces were compared with the predicted forces in order to validate the feasibility of the proposed design. The percentage contribution of each process parameter was analyzed using analysis of variance (ANOVA). The experimental results from both the lathe tool dynamometer and the designed full-bridge dynamometer were analyzed using the Taguchi design of experiments and ANOVA.
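The Taguchi-plus-ANOVA bookkeeping in the abstract reduces to a few array operations. The L9 layout below is the standard three-level orthogonal array; the force values are invented, so the printed contributions are illustrative only.

```python
# Taguchi-style percent-contribution ANOVA on an L9 array. The force
# values are synthetic stand-ins for dynamometer measurements.
import numpy as np

# L9 orthogonal array: 3 factors (speed, feed, depth) at 3 levels (0,1,2).
L9 = np.array([[0,0,0],[0,1,1],[0,2,2],
               [1,0,1],[1,1,2],[1,2,0],
               [2,0,2],[2,1,0],[2,2,1]])
force = np.array([210., 265., 330., 225., 300., 255., 250., 235., 310.])  # N

grand = force.mean()
SS_total = ((force - grand) ** 2).sum()

for col, name in enumerate(["cutting speed", "feed rate", "depth of cut"]):
    # Factor sum of squares: 3 runs per level in the L9 array.
    level_means = [force[L9[:, col] == lv].mean() for lv in range(3)]
    SS = 3 * sum((m - grand) ** 2 for m in level_means)
    print(f"{name:14s} SS = {SS:8.1f}  contribution = {SS/SS_total:6.1%}")
```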
NASA Technical Reports Server (NTRS)
Krist, Steven E.; Bauer, Steven X. S.
1999-01-01
The design process for developing the natural flow wing design on the HSR arrow wing configuration utilized several design tools and analysis methods. Initial fuselage/wing designs were generated with inviscid analysis and optimization methods in conjunction with the natural flow wing design philosophy. A number of designs were generated, satisfying different system constraints. Of the three natural flow wing designs developed, the NFWAc2 configuration is the design which satisfies the constraints utilized by McDonnell Douglas Aerospace (MDA) in developing a series of optimized configurations; a wind tunnel model of the MDA-designed OPT5 configuration was constructed and tested. The present paper is concerned with the viscous analysis and inverse design of the arrow wing configurations, including the effects of the installed diverters/nacelles. Analyses were conducted with OVERFLOW, a Navier-Stokes flow solver for overset grids. Inverse designs were conducted with OVERDISC, which couples OVERFLOW with the CDISC inverse design method. An initial system of overset grids was generated for the OPT5 configuration with installed diverters/nacelles. An automated regridding process was then developed to use the OPT5 component grids to create grids for the natural flow wing designs. The inverse design process was initiated using the NFWAc2 configuration as a starting point, eventually culminating in the NFWAc4 design, for which a wind tunnel model was constructed. Due to the time constraints on the design effort, initial analyses and designs were conducted with a fairly coarse grid; subsequent analyses have been conducted on a refined system of grids. Comparisons of the computational results to experiment are provided at the end of this paper.
Silicon production process evaluations
NASA Technical Reports Server (NTRS)
1981-01-01
The chemical engineering analysis of the preliminary process design of a process for producing solar cell grade silicon from dichlorosilane is presented. A plant to produce 1,000 MT/yr of silicon is analyzed. Progress and status for the plant design are reported for the primary activities of base case conditions (60 percent), reaction chemistry (50 percent), process flow diagram (35 percent), energy balance (10 percent), property data (10 percent) and equipment design (5 percent).
Guo, Wei; Zheng, Qing; An, Weijin; Peng, Wei
2017-09-01
Collaborative innovation (co-innovation) communities are emerging as a new product design platform where companies involve users in the new product development (NPD) process, with large numbers of users participating and contributing voluntarily. This exploratory study investigates the heterogeneous roles of users based on a global co-innovation project in an online community. Content analysis, social network analysis, and cluster methods are employed to measure user behaviors, distinguish user roles, and analyze user contributions. The study identifies six user roles that emerge during the NPD process in the co-innovation community: project leader, active designer, generalist, communicator, passive designer, and observer. The six user roles differ in their contribution forms and quality. This paper contributes to research on co-innovation in online communities, including design team structure, user roles and their contributions to design tasks and solutions, and user value along the process. In addition, the study provides practical guidance on implementing projects, attracting users, and designing platforms for co-innovation community practitioners. Copyright © 2017 Elsevier Ltd. All rights reserved.
A 3D Human-Machine Integrated Design and Analysis Framework for Squat Exercises with a Smith Machine
Lee, Haerin; Jung, Moonki; Lee, Ki-Kwang; Lee, Sang Hun
2017-01-01
In this paper, we propose a three-dimensional design and evaluation framework and process based on a probabilistic-based motion synthesis algorithm and biomechanical analysis system for the design of the Smith machine and squat training programs. Moreover, we implemented a prototype system to validate the proposed framework. The framework consists of an integrated human–machine–environment model as well as a squat motion synthesis system and biomechanical analysis system. In the design and evaluation process, we created an integrated model in which interactions between a human body and machine or the ground are modeled as joints with constraints at contact points. Next, we generated Smith squat motion using the motion synthesis program based on a Gaussian process regression algorithm with a set of given values for independent variables. Then, using the biomechanical analysis system, we simulated joint moments and muscle activities from the input of the integrated model and squat motion. We validated the model and algorithm through physical experiments measuring the electromyography (EMG) signals, ground forces, and squat motions as well as through a biomechanical simulation of muscle forces. The proposed approach enables the incorporation of biomechanics in the design process and reduces the need for physical experiments and prototypes in the development of training programs and new Smith machines. PMID:28178184
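The motion-synthesis step above rests on Gaussian process regression: given training squats recorded under different conditions, a GP predicts motion features for new values of the independent variables. The sketch below regresses a single joint-angle feature on body height and barbell load with synthetic data; the real system predicts full motion trajectories, and every number here is an invented placeholder.

```python
# GP regression sketch: predict a squat-motion feature (peak knee flexion)
# from independent variables (stature, barbell load). Data are synthetic.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(3)
X = np.column_stack([rng.uniform(1.55, 1.95, 30),    # stature, m
                     rng.uniform(20.0, 120.0, 30)])  # load, kg
y = (95.0 + 20.0 * (X[:, 0] - 1.75) + 0.05 * X[:, 1]
     + rng.normal(0, 1.0, 30))                       # peak knee flexion, deg

kernel = RBF(length_scale=[0.2, 30.0]) + WhiteKernel(noise_level=1.0)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

x_new = np.array([[1.80, 80.0]])                     # unseen subject/load
mu, sd = gp.predict(x_new, return_std=True)
print(f"predicted flexion: {mu[0]:.1f} deg ± {sd[0]:.1f}")
```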
Samad, Noor Asma Fazli Abdul; Sin, Gürkan; Gernaey, Krist V; Gani, Rafiqul
2013-11-01
This paper presents the application of uncertainty and sensitivity analysis as part of a systematic model-based process monitoring and control (PAT) system design framework for crystallization processes. For the uncertainty analysis, the Monte Carlo procedure is used to propagate input uncertainty, while for sensitivity analysis, global methods including the standardized regression coefficients (SRC) and Morris screening are used to identify the most significant parameters. The potassium dihydrogen phosphate (KDP) crystallization process is used as a case study, both in open-loop and closed-loop operation. In the uncertainty analysis, the impact on the predicted output of uncertain parameters related to the nucleation and the crystal growth model has been investigated for both a one- and two-dimensional crystal size distribution (CSD). The open-loop results show that the input uncertainties lead to significant uncertainties on the CSD, with appearance of a secondary peak due to secondary nucleation for both cases. The sensitivity analysis indicated that the most important parameters affecting the CSDs are nucleation order and growth order constants. In the proposed PAT system design (closed-loop), the target CSD variability was successfully reduced compared to the open-loop case, also when considering uncertainty in nucleation and crystal growth model parameters. The latter forms a strong indication of the robustness of the proposed PAT system design in achieving the target CSD and encourages its transfer to full-scale implementation. Copyright © 2013 Elsevier B.V. All rights reserved.
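In outline, the uncertainty/sensitivity workflow reads: sample the uncertain kinetic parameters, propagate each sample through the process model, and regress standardized outputs on standardized inputs to obtain SRCs. The toy power-law growth/nucleation response and parameter ranges below are invented; only the Monte Carlo + SRC pattern follows the abstract.

```python
# Monte Carlo uncertainty propagation + standardized regression coefficients
# (SRC) for a toy crystallization response. Model and ranges are invented.
import numpy as np

rng = np.random.default_rng(11)
N = 2000
# Uncertain parameters: growth order g, nucleation order b, rate constants.
g  = rng.uniform(1.0, 2.0, N)
b  = rng.uniform(1.5, 3.0, N)
kg = rng.uniform(0.8, 1.2, N)
kb = rng.uniform(0.8, 1.2, N)

S = 1.3                                   # fixed supersaturation ratio
d50 = kg * S ** g / (kb * S ** b) ** 0.5  # toy "mean crystal size" response

# SRC: regress standardized output on standardized inputs.
X = np.column_stack([g, b, kg, kb])
Xs = (X - X.mean(0)) / X.std(0)
ys = (d50 - d50.mean()) / d50.std()
beta, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
for name, coef in zip(["growth order", "nucleation order",
                       "k_growth", "k_nucleation"], beta):
    print(f"SRC[{name}] = {coef:+.2f}")
print(f"output spread: d50 = {d50.mean():.3f} ± {d50.std():.3f} (arb. units)")
```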
A Data Envelopment Analysis Model for Selecting Material Handling System Designs
NASA Astrophysics Data System (ADS)
Liu, Fuh-Hwa Franklin; Kuo, Wan-Ting
The material handling system under design is an unmanned job shop with an automated guided vehicle that transports loads among the processing machines. The engineering task is to select among design alternatives that are combinations of four design factors: the ratio of production time to transportation time, the mean job arrival rate to the system, the input/output buffer capacities at each processing machine, and the vehicle control strategies. Each design alternative is simulated to collect the upper and lower bounds of five performance indices. We develop a Data Envelopment Analysis (DEA) model to assess the 180 designs with the imprecise data of the five indices. A three-way factorial analysis of the assessment results indicates that buffer capacity, and the interaction of job arrival rate and buffer capacity, affect performance significantly.
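An input-oriented CCR efficiency score, the standard DEA building block, can be computed per design with a small linear program: maximize the weighted outputs of the design under assessment, subject to its weighted inputs equaling one and no design scoring above one. The paper's 180 designs and interval data are not available, so the sketch uses a few invented designs with one input and two outputs (point data rather than the paper's imprecise data).

```python
# CCR (input-oriented, multiplier form) DEA via linear programming.
# Toy data: 5 designs, 1 input, 2 outputs -- stand-ins for the paper's
# 180 designs and five interval-valued performance indices.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0], [3.0], [3.0], [4.0], [5.0]])        # inputs
Y = np.array([[1.0, 2.0], [3.0, 1.0], [2.0, 3.0],
              [3.0, 4.0], [4.0, 3.0]])                   # outputs
n, m, s = X.shape[0], X.shape[1], Y.shape[1]

for o in range(n):
    # Variables z = [u (output weights), v (input weights)].
    c = np.concatenate([-Y[o], np.zeros(m)])             # maximize u'y_o
    A_eq = np.concatenate([np.zeros(s), X[o]])[None, :]  # v'x_o = 1
    A_ub = np.hstack([Y, -X])                            # u'y_j - v'x_j <= 0
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                  A_eq=A_eq, b_eq=[1.0], bounds=[(1e-6, None)] * (s + m))
    print(f"design {o}: efficiency = {-res.fun:.3f}")
```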
Multidisciplinary Analysis and Optimal Design: As Easy as it Sounds?
NASA Technical Reports Server (NTRS)
Moore, Greg; Chainyk, Mike; Schiermeier, John
2004-01-01
The viewgraph presentation examines optimal design for precision, large-aperture structures. Discussion focuses on aspects of design optimization, code architecture and current capabilities, and planned activities and collaborative area suggestions. The discussion of design optimization examines design sensitivity analysis; practical considerations; and new analytical environments, including finite-element-based capability for high-fidelity multidisciplinary analysis, design sensitivity, and optimization. The discussion of code architecture and current capabilities includes basic thermal and structural elements, nonlinear heat transfer solution processes, and optical mode generation.
Towards a systems approach to risk considerations for concurrent design
NASA Technical Reports Server (NTRS)
Meshkat, Leila; Oberto, Robert E.
2004-01-01
This paper describes the new process used by the Project Design Center at NASA's Jet Propulsion Laboratory for the identification, assessment, and communication of risk elements throughout the lifecycle of a mission design. This process includes a software tool, 'RAP', that collects and communicates risk information between the various designers and a 'risk expert' who mediates the process. The establishment of this process is a step towards the systematic consideration of risk in the design decision-making process. Using this process, we are better able to keep track of the risks associated with design decisions, and it helps us develop better risk profiles for the studies under consideration. We aim to refine and expand the current process to enable more thorough risk analysis capabilities in the future.
Integration of Design, Thermal, Structural, and Optical Analysis, Including Thermal Animation
NASA Technical Reports Server (NTRS)
Amundsen, Ruth M.
1993-01-01
In many industries there has recently been a concerted movement toward 'quality management' and the issue of how to accomplish work more efficiently. Part of this effort is focused on concurrent engineering; the idea of integrating the design and analysis processes so that they are not separate, sequential processes (often involving design rework due to analytical findings) but instead form an integrated system with smooth transfers of information. Presented herein are several specific examples of concurrent engineering methods being carried out at Langley Research Center (LaRC): integration of thermal, structural and optical analyses to predict changes in optical performance based on thermal and structural effects; integration of the CAD design process with thermal and structural analyses; and integration of analysis and presentation by animating the thermal response of a system as an active color map -- a highly effective visual indication of heat flow.
NASA Supportability Engineering Implementation Utilizing DoD Practices and Processes
NASA Technical Reports Server (NTRS)
Smith, David A.; Smith, John V.
2010-01-01
The Ares I design and development program determined early in the System Design Review phase to use the DoD ILS and LSA approach for supportability engineering as an integral part of the system engineering process. This paper reviews the overall approach to designing Ares I with an emphasis on a more affordable, supportable, and sustainable launch vehicle. Discussions include requirements development, design influence, support concept alternatives, ILS and LSA planning, the logistics support analyses/trades performed, LSA tailoring for the NASA Ares program, support system infrastructure identification, ILS design review documentation, working group coordination, and overall ILS implementation. At the outset, the Ares I Project initiated the development of the Integrated Logistics Support Plan (ILSP) and a Logistics Support Analysis process to provide a path forward for the management of the Ares I ILS program and supportability analysis activities. The ILSP provided the initial planning and coordination between the Ares I Project elements and the Ground Operations Project. The LSA process provided a system engineering approach to the development of the Ares I supportability requirements, influencing the design for supportability and the development of alternative support concepts that satisfy the program operability requirements. The LSA planning and analysis results are documented in the Logistics Support Analysis Report. This document was required during the Ares I System Design Review (SDR) and Preliminary Design Review (PDR) cycles. To help coordinate the LSA process across the Ares I Project and between programs, the LSA Report is updated and released quarterly. A system requirements analysis was performed to determine the supportability requirements and technical performance measurements (TPMs). Two working groups were established to support the management and implementation of the Ares I ILS program: the Integrated Logistics Support Working Group (ILSWG) and the Logistics Support Analysis Record Working Group (LSARWG). The Ares I ILSWG was established to assess requirements, conduct and evaluate analyses and trade studies associated with acquisition logistics and supportability processes, and resolve Ares I integrated logistics and supportability issues. It established a strategic collaborative alliance for the coordination of logistics support analysis activities in support of the integrated Ares I vehicle design and the development of the logistics support infrastructure. A joint Ares I - Orion LSAR Working Group was established to: 1) guide the development of Ares I and Orion LSAR data and serve as a model for future Constellation programs; 2) develop rules and assumptions that will apply across the Constellation program with regard to the program's LSAR development; and 3) maintain the Constellation LSAR Style Guide.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-13
... Vehicle Access Element of the CDCA Plan for the WEMO area; and (2) Alternative processes for designating.... Identification of the process and decision criteria that should be used to designate routes in the sub-regional... analysis, and guide the entire process from plan decision-making to route designation review in order to...
Thermochemical Conversion Techno-Economic Analysis | Bioenergy | NREL
NREL's Thermochemical Conversion Analysis team focuses on conceptual process design and techno-economic analysis. Detailed process models and TEA developed under this project provide insights into the potential economic…
Paolantonacci, Philippe; Appourchaux, Philippe; Claudel, Béatrice; Ollivier, Monique; Dennett, Richard; Siret, Laurent
2018-01-01
Polyvalent human normal immunoglobulins for intravenous use (IVIg), indicated for rare and often severe diseases, are complex plasma-derived protein preparations. A quality by design approach has been used to develop the Laboratoire Français du Fractionnement et des Biotechnologies new-generation IVIg, targeting a high level of purity to generate an enhanced safety profile while maintaining a high level of efficacy. A modular quality by design approach was implemented, consisting of five consecutive steps that cover all stages from product design to the final product control strategy. A well-defined target product profile was translated into 27 product quality attributes that formed the basis of the process design. In parallel, a product risk analysis identified 19 critical quality attributes among the product quality attributes. Process risk analysis was carried out to establish the links between process parameters and critical quality attributes. Twelve critical steps were identified, and for each of these steps a risk mitigation plan was established. Among the different process risk mitigation exercises, five process robustness studies were conducted at qualified small scale with a design of experiments approach. For each process step, critical process parameters were identified and, for each critical process parameter, proven acceptable ranges were established. The quality risk management and risk mitigation outputs, including verification of proven acceptable ranges, were used to design the process verification exercise at industrial scale. Finally, the control strategy was established using a mix, or hybrid, of the traditional approach and elements of the enhanced quality by design approach, as illustrated, to more robustly assign material and process controls and to reliably meet product specifications. The advantages of this quality by design approach were improved process knowledge for industrial design and process validation, and a clear justification of the process and product specifications as a basis for the control strategy and future comparability exercises. © PDA, Inc. 2018.
An application of computer aided requirements analysis to a real time deep space system
NASA Technical Reports Server (NTRS)
Farny, A. M.; Morris, R. V.; Hartsough, C.; Callender, E. D.; Teichroew, D.; Chikofsky, E.
1981-01-01
The entire procedure of incorporating the requirements and goals of a space flight project into integrated, time-ordered sequences of spacecraft commands is called the uplink process. The Uplink Process Control Task (UPCT) was created to examine the uplink process and determine ways to improve it. The Problem Statement Language/Problem Statement Analyzer (PSL/PSA), designed to assist the designer/analyst/engineer in preparing specifications of an information system, is used as a supporting tool in the analysis. Attention is given to a definition of the uplink process, the definition of PSL/PSA, the construction of a PSA database, the value of analysis to the study of the uplink process, and the PSL/PSA lessons learned.
New reflective symmetry design capability in the JPL-IDEAS Structure Optimization Program
NASA Technical Reports Server (NTRS)
Strain, D.; Levy, R.
1986-01-01
The JPL-IDEAS antenna structure analysis and design optimization computer program was modified to process half structure models of symmetric structures subjected to arbitrary external static loads, synthesize the performance, and optimize the design of the full structure. Significant savings in computation time and cost (more than 50%) were achieved compared to the cost of full model computer runs. The addition of the new reflective symmetry analysis design capabilities to the IDEAS program allows processing of structure models whose size would otherwise prevent automated design optimization. The new program produced synthesized full model iterative design results identical to those of actual full model program executions at substantially reduced cost, time, and computer storage.
Conceptual design of ACB-CP for ITER cryogenic system
NASA Astrophysics Data System (ADS)
Jiang, Yongcheng; Xiong, Lianyou; Peng, Nan; Tang, Jiancheng; Liu, Liqiang; Zhang, Liang
2012-06-01
The ACB-CP (Auxiliary Cold Box for Cryopumps) supplies the cryopump system with the necessary cryogens in the ITER (International Thermonuclear Experimental Reactor) cryogenic distribution system. The conceptual design of the ACB-CP comprises thermo-hydraulic analysis, 3D structure design, and strength checking. The thermo-hydraulic analysis determines the main specifications of the process valves, pressure safety valves, pipes, and heat exchangers. During the 3D structure design, the vacuum, adiabatic, assembly, and maintenance requirements were considered in arranging the pipes, valves, and other components. Strength checking was performed to verify that the 3D design meets the strength requirements for the ACB-CP.
Application of a neural network to simulate analysis in an optimization process
NASA Technical Reports Server (NTRS)
Rogers, James L.; Lamarsh, William J., II
1992-01-01
A new experimental software package, NETS/PROSSS, aimed at reducing the computing time required to solve complex design problems is described. The software combines a neural network for simulating the analysis program with an optimization program. The neural network approximates the results of a finite element analysis program to quickly obtain a near-optimal solution. The results of the NETS/PROSSS optimization process can also be used as an initial design in a normal optimization process, making it possible to converge to an optimum solution in significantly fewer iterations.
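To illustrate the kind of workflow this abstract describes, the sketch below trains a small neural network on samples of an expensive analysis and then optimizes on the cheap surrogate. The objective function, sample counts, and network size are invented placeholders, not the actual NETS/PROSSS implementation.

```python
# Hedged sketch of surrogate-assisted optimization in the spirit of
# NETS/PROSSS: a neural network stands in for the expensive analysis.
import numpy as np
from sklearn.neural_network import MLPRegressor
from scipy.optimize import minimize

def expensive_fea(x):
    # Placeholder for a finite element analysis result (e.g., peak stress).
    return np.sum((x - 0.3) ** 2) + 0.1 * np.sin(5 * x).sum()

rng = np.random.default_rng(0)
X_train = rng.uniform(0.0, 1.0, size=(200, 4))        # sampled designs
y_train = np.array([expensive_fea(x) for x in X_train])

surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                         random_state=0).fit(X_train, y_train)

# Optimize on the cheap surrogate to obtain a near-optimal starting design.
res = minimize(lambda x: surrogate.predict(x.reshape(1, -1))[0],
               x0=np.full(4, 0.5), bounds=[(0.0, 1.0)] * 4)
print("near-optimal design from surrogate:", res.x)
# res.x would then seed a conventional optimization with the real analysis.
```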
The impact of distributed computing on education
NASA Technical Reports Server (NTRS)
Utku, S.; Lestingi, J.; Salama, M.
1982-01-01
In this paper, developments in digital computer technology since the early Fifties are reviewed briefly, and the parallelism which exists between these developments and developments in analysis and design procedures of structural engineering is identified. The recent trends in digital computer technology are examined in order to establish the fact that distributed processing is now an accepted philosophy for further developments. The impact of this on the analysis and design practices of structural engineering is assessed by first examining these practices from a data processing standpoint to identify the key operations and data bases, and then fitting them to the characteristics of distributed processing. The merits and drawbacks of the present philosophy in educating structural engineers are discussed and projections are made for the industry-academia relations in the distributed processing environment of structural analysis and design. An ongoing experiment of distributed computing in a university environment is described.
A framework of knowledge creation processes in participatory simulation of hospital work systems.
Andersen, Simone Nyholm; Broberg, Ole
2017-04-01
Participatory simulation (PS) is a method to involve workers in simulating and designing their own future work system. Existing PS studies have focused on analysing the outcome, and minimal attention has been devoted to the process of creating this outcome. In order to study this process, we suggest applying a knowledge creation perspective. The aim of this study was to develop a framework describing the process of how ergonomics knowledge is created in PS. Video recordings from three projects applying PS of hospital work systems constituted the foundation of process mining analysis. The analysis resulted in a framework revealing the sources of ergonomics knowledge creation as sequential relationships between the activities of simulation participants sharing work experiences; experimenting with scenarios; and reflecting on ergonomics consequences. We argue that this framework reveals the hidden steps of PS that are essential when planning and facilitating PS that aims at designing work systems. Practitioner Summary: When facilitating participatory simulation (PS) in work system design, achieving an understanding of the PS process is essential. By applying a knowledge creation perspective and process mining, we investigated the knowledge-creating activities constituting the PS process. The analysis resulted in a framework of the knowledge-creating process in PS.
NASA Astrophysics Data System (ADS)
Gunduz, Mustafa Emre
Many government agencies and corporations around the world have found the unique capabilities of rotorcraft indispensable. Incorporating such capabilities into rotorcraft design poses extra challenges because it is a complicated multidisciplinary process. The concept of applying several disciplines to the design and optimization process may not be new, but it does not currently seem to be widely accepted in industry. The reason for this might be the lack of well-known tools for realizing a complete multidisciplinary design and analysis of a product. This study proposes a method that enables engineers in several design disciplines to perform a fairly detailed analysis and optimization of a design using commercially available software as well as codes developed at Georgia Tech. The ultimate goal is that, when the system is set up properly, the CAD model of the design, including all subsystems, is automatically updated as soon as a new part or assembly is added to the design, or whenever an analysis or optimization requires the geometry to be modified. Designers and engineers are then involved only in checking the latest design for errors or in adding and removing features. Such a design process takes dramatically less time to complete and should therefore reduce development time and costs. The optimization method is demonstrated on an existing helicopter rotor originally designed in the 1960s. The rotor is already an effective design with novel features; however, application of the optimization principles together with high-speed computing resulted in an even better design. The objective function to be minimized is related to the vibrations of the rotor system under gusty wind conditions, and the design parameters are all continuous variables. Optimization is performed in a number of steps. First, the design variables most crucial to the objective function are identified. With these variables, the Latin Hypercube Sampling method is used to probe the design space for several local minima and maxima. After analysis of numerous samples, a design configuration that is more stable than the initial design is reached. The above process requires several software tools: CATIA as the CAD tool, ANSYS as the FEA tool, VABS for obtaining the cross-sectional structural properties, and DYMORE for the frequency and dynamic analysis of the rotor. MATLAB codes are also employed to generate input files and read output files of DYMORE. All these tools are connected using ModelCenter.
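As a concrete illustration of the sampling step described above, the following minimal Python sketch uses Latin Hypercube Sampling to probe a design space. The dimension, bounds, and the vibration-metric stand-in are hypothetical, not the study's actual rotor variables.

```python
# Hedged sketch of design-space probing with Latin Hypercube Sampling.
# The three "design variables", their bounds, and the vibration metric
# below are invented placeholders for illustration only.
import numpy as np
from scipy.stats import qmc

def vibration_metric(x):
    # Stand-in for the expensive aeroelastic analysis (e.g., DYMORE runs).
    return np.sin(4 * x[0]) + (x[1] - 0.5) ** 2 + 0.3 * np.cos(6 * x[2])

sampler = qmc.LatinHypercube(d=3, seed=1)
unit_samples = sampler.random(n=60)                 # points in [0, 1)^3
lo, hi = [0.0, 0.0, 0.0], [1.0, 2.0, 5.0]           # hypothetical bounds
designs = qmc.scale(unit_samples, lo, hi)

scores = np.array([vibration_metric(x) for x in designs])
print("best sampled design: ", designs[scores.argmin()])
print("worst sampled design:", designs[scores.argmax()])
```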
Real-Time Optical Image Processing Techniques
1988-10-31
Pulse frequency modulation has been pursued through the analysis, design, and fabrication of pulse-frequency-modulated halftone screens and the modification of micro-channel spatial light modulators (MSLM) required for nonlinear operation. Real-time nonlinear processing was performed using the halftone screen and MSLM, and the experiments showed the effectiveness…
Systems design analysis applied to launch vehicle configuration
NASA Technical Reports Server (NTRS)
Ryan, R.; Verderaime, V.
1993-01-01
As emphasis shifts from optimum-performance aerospace systems to least life-cycle costs, systems designs must seek, adapt, and innovate cost-improvement techniques from design through operations. The systems design process of concept, definition, and design was assessed for the types and flow of total quality management techniques that may be applicable in a launch vehicle systems design analysis. Techniques discussed are task ordering, quality leverage, concurrent engineering, Pareto's principle, robustness, quality function deployment, criteria, and others. These cost-oriented techniques are as applicable to aerospace systems design analysis as to any large commercial system.
Cui, Xiang-Long; Xu, Bing; Sun, Fei; Dai, Sheng-Yun; Shi, Xin-Yuan; Qiao, Yan-Jiang
2017-03-01
In this paper, under the guidance of the quality by design (QbD) concept, a control strategy for the high-shear wet granulation process of the ginkgo leaf tablet, based on the design space, was established to improve process controllability and product quality consistency. The median granule size (D50) and bulk density (Da) of the granules were identified as critical quality attributes (CQAs), and potential critical process parameters (pCPPs) were determined by failure modes and effects analysis (FMEA). A Plackett-Burman experimental design was used to screen the pCPPs, and the results demonstrated that the binder amount, the wet massing time, and the wet mixing impeller speed were critical process parameters (CPPs). The design space of the high-shear wet granulation process was developed within the pCPP ranges based on a Box-Behnken design and quadratic polynomial regression models. ANOVA showed that the P-values of the models were less than 0.05 and the P-values of the lack-of-fit tests were more than 0.1, indicating that the relationship between the CQAs and CPPs could be well described by the mathematical models. D50 could be controlled within 170 to 500 μm and the bulk density within 0.30 to 0.44 g•cm⁻³ by using any CPP combination within the design space. In addition, granules produced with process parameters within the design space also met the tensile strength requirement of the ginkgo leaf tablet. Copyright © by the Chinese Pharmaceutical Association.
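The response-surface portion of such a study can be sketched in a few lines: build the 3-factor Box-Behnken runs, fit a quadratic polynomial by least squares, and screen a candidate setting against the D50 window. Everything below (the coded factors and the synthetic response) is illustrative, not the paper's data.

```python
# Hedged sketch: 3-factor Box-Behnken design + quadratic response surface.
# Factors are coded -1..+1 (binder amount, wet massing time, impeller speed);
# the D50 "measurements" are synthetic stand-ins for the paper's data.
import itertools
import numpy as np

runs = []
for i, j in itertools.combinations(range(3), 2):    # edge midpoints
    for a, b in itertools.product((-1.0, 1.0), repeat=2):
        r = [0.0, 0.0, 0.0]
        r[i], r[j] = a, b
        runs.append(r)
runs += [[0.0, 0.0, 0.0]] * 3                       # center replicates
X = np.array(runs)

def features(X):
    # Full quadratic model: 1, x1..x3, squares, and two-way interactions.
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1**2, x2**2, x3**2, x1*x2, x1*x3, x2*x3])

y = 330 + 80*X[:, 0] - 40*X[:, 1] + 25*X[:, 2] - 30*X[:, 0]**2  # synthetic D50
coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)

candidate = np.array([[0.5, -0.5, 0.0]])            # a coded operating point
d50 = features(candidate) @ coef
print(f"predicted D50 = {d50[0]:.0f} um, in window: {170 <= d50[0] <= 500}")
```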
Conceptual Chemical Process Design for Sustainability
This chapter examines the sustainable design of chemical processes, with a focus on conceptual design, hierarchical and short-cut methods, and analyses of process sustainability for alternatives. The chapter describes a methodology for incorporating process sustainability analyses throughout the conceptual design. Hierarchical and short-cut decision-making methods are used to approach sustainability. An example showing a sustainability-based evaluation of chlor-alkali production processes is presented, with economic analysis and five pollutants described as emissions. These emissions are analyzed according to their human toxicity potential by ingestion using the Waste Reduction Algorithm and a method based on US Environmental Protection Agency reference doses, with the addition of biodegradation for suitable components. Among the emissions, mercury as an element will not biodegrade, and the results show the importance of this pollutant to the potential toxicity results and therefore to the sustainability of the process design. The dominance of mercury in determining the long-term toxicity results when energy use is included suggests that all process system evaluations should (re)consider the role of mercury and other non- or slow-degrading pollutants in sustainability analyses. The cycling of nondegrading pollutants through the biosphere suggests the need for a complete analysis based on the economic, environmental, and social aspects of sustainability.
Nagashima, Hiroaki; Watari, Akiko; Shinoda, Yasuharu; Okamoto, Hiroshi; Takuma, Shinya
2013-12-01
This case study describes the application of Quality by Design elements to the process of culturing Chinese hamster ovary cells in the production of a monoclonal antibody. All steps in the cell culture process and all process parameters in each step were identified by using a cause-and-effect diagram. Prospective risk assessment using failure mode and effects analysis identified the following four potential critical process parameters in the production culture step: initial viable cell density, culture duration, pH, and temperature. These parameters and lot-to-lot variability in raw material were then evaluated by process characterization utilizing a design of experiments approach consisting of a face-centered central composite design integrated with a full factorial design. Process characterization was conducted using a scaled down model that had been qualified by comparison with large-scale production data. Multivariate regression analysis was used to establish statistical prediction models for performance indicators and quality attributes; with these, we constructed contour plots and conducted Monte Carlo simulation to clarify the design space. The statistical analyses, especially for raw materials, identified set point values, which were most robust with respect to the lot-to-lot variability of raw materials while keeping the product quality within the acceptance criteria. © 2013 Wiley Periodicals, Inc. and the American Pharmacists Association.
A multi-fidelity framework for physics based rotor blade simulation and optimization
NASA Astrophysics Data System (ADS)
Collins, Kyle Brian
New helicopter rotor designs are desired that offer increased efficiency, reduced vibration, and reduced noise. Rotor Designers in industry need methods that allow them to use the most accurate simulation tools available to search for these optimal designs. Computer based rotor analysis and optimization have been advanced by the development of industry standard codes known as "comprehensive" rotorcraft analysis tools. These tools typically use table look-up aerodynamics, simplified inflow models and perform aeroelastic analysis using Computational Structural Dynamics (CSD). Due to the simplified aerodynamics, most design studies are performed varying structural related design variables like sectional mass and stiffness. The optimization of shape related variables in forward flight using these tools is complicated and results are viewed with skepticism because rotor blade loads are not accurately predicted. The most accurate methods of rotor simulation utilize Computational Fluid Dynamics (CFD) but have historically been considered too computationally intensive to be used in computer based optimization, where numerous simulations are required. An approach is needed where high fidelity CFD rotor analysis can be utilized in a shape variable optimization problem with multiple objectives. Any approach should be capable of working in forward flight in addition to hover. An alternative is proposed and founded on the idea that efficient hybrid CFD methods of rotor analysis are ready to be used in preliminary design. In addition, the proposed approach recognizes the usefulness of lower fidelity physics based analysis and surrogate modeling. Together, they are used with high fidelity analysis in an intelligent process of surrogate model building of parameters in the high fidelity domain. Closing the loop between high and low fidelity analysis is a key aspect of the proposed approach. This is done by using information from higher fidelity analysis to improve predictions made with lower fidelity models. This thesis documents the development of automated low and high fidelity physics based rotor simulation frameworks. The low fidelity framework uses a comprehensive code with simplified aerodynamics. The high fidelity model uses a parallel processor capable CFD/CSD methodology. Both low and high fidelity frameworks include an aeroacoustic simulation for prediction of noise. A synergistic process is developed that uses both the low and high fidelity frameworks together to build approximate models of important high fidelity metrics as functions of certain design variables. To test the process, a 4-bladed hingeless rotor model is used as a baseline. The design variables investigated include tip geometry and spanwise twist distribution. Approximation models are built for metrics related to rotor efficiency and vibration using the results from 60+ high fidelity (CFD/CSD) experiments and 400+ low fidelity experiments. Optimization using the approximation models found the Pareto Frontier anchor points, or the design having maximum rotor efficiency and the design having minimum vibration. Various Pareto generation methods are used to find designs on the frontier between these two anchor designs. When tested in the high fidelity framework, the Pareto anchor designs are shown to be very good designs when compared with other designs from the high fidelity database. This provides evidence that the process proposed has merit. 
Ultimately, this process can be utilized by industry rotor designers with their existing tools to bring high fidelity analysis into the preliminary design stage of rotors. In conclusion, the methods developed and documented in this thesis have made several novel contributions. First, an automated high fidelity CFD based forward flight simulation framework has been built for use in preliminary design optimization. The framework was built around an integrated, parallel processor capable CFD/CSD/AA process. Second, a novel method of building approximate models of high fidelity parameters has been developed. The method uses a combination of low and high fidelity results and combines Design of Experiments, statistical effects analysis, and aspects of approximation model management. And third, the determination of rotor blade shape variables through optimization using CFD based analysis in forward flight has been performed. This was done using the high fidelity CFD/CSD/AA framework and method mentioned above. While the low and high fidelity predictions methods used in the work still have inaccuracies that can affect the absolute levels of the results, a framework has been successfully developed and demonstrated that allows for an efficient process to improve rotor blade designs in terms of a selected choice of objective function(s). Using engineering judgment, this methodology could be applied today to investigate opportunities to improve existing designs. With improvements in the low and high fidelity prediction components that will certainly occur, this framework could become a powerful tool for future rotorcraft design work. (Abstract shortened by UMI.)
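A minimal sketch of the "closing the loop" idea described above, under invented models: a cheap low-fidelity analysis is corrected by a Gaussian-process surrogate of the high-minus-low discrepancy, trained on a handful of expensive high-fidelity runs. This is one common multi-fidelity pattern, not the thesis's exact method.

```python
# Hedged sketch of additive multi-fidelity correction; both "analyses"
# below are invented one-dimensional stand-ins for illustration only.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def lo_fi(x):                       # comprehensive-code stand-in: cheap, biased
    return np.sin(3 * x) + 0.5 * x

def hi_fi(x):                       # CFD/CSD stand-in: expensive, accurate
    return np.sin(3 * x) + 0.5 * x + 0.3 * np.cos(7 * x)

x_hi = np.linspace(0.0, 1.0, 8).reshape(-1, 1)       # few hi-fi experiments
delta = (hi_fi(x_hi) - lo_fi(x_hi)).ravel()          # observed discrepancy
gp = GaussianProcessRegressor(normalize_y=True).fit(x_hi, delta)

x = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
corrected = lo_fi(x).ravel() + gp.predict(x)         # multi-fidelity estimate
err_lo = np.abs(hi_fi(x).ravel() - lo_fi(x).ravel()).max()
err_mf = np.abs(hi_fi(x).ravel() - corrected).max()
print(f"max error: low-fi alone {err_lo:.3f} -> corrected {err_mf:.3f}")
```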
NASA Technical Reports Server (NTRS)
Csank, Jeffrey T.; Zinnecker, Alicia M.
2014-01-01
The aircraft engine design process seeks to achieve the best overall system-level performance, weight, and cost for a given engine design. This is achieved by a complex process known as systems analysis, where steady-state simulations are used to identify trade-offs that should be balanced to optimize the system. The steady-state simulations and data on which systems analysis relies may not adequately capture the true performance trade-offs that exist during transient operation. Dynamic Systems Analysis provides the capability for assessing these trade-offs at an earlier stage of the engine design process. The concept of dynamic systems analysis and the type of information available from this analysis are presented in this paper. To provide this capability, the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) was developed. This tool aids a user in the design of a power management controller to regulate thrust, and a transient limiter to protect the engine model from surge at a single flight condition (defined by an altitude and Mach number). Results from simulation of the closed-loop system may be used to estimate the dynamic performance of the model. This enables evaluation of the trade-off between performance and operability, or safety, in the engine, which could not be done with steady-state data alone. A design study is presented to compare the dynamic performance of two different engine models integrated with the TTECTrA software.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1994-11-30
Universal Oil Products, Inc. (UOP) of Des Plaines, Illinois has contracted A.E. Roberts & Associates, Inc. (AERA) of Atlanta, Georgia to prepare a sensitivity analysis for the development of the Fluidized-bed Copper Oxide (FBCO) process. As proposed by AERA in September 1991, development of the FBCO process design for a 500-megawatt (MW) unit was divided into three tasks: (1) establishment of a design basis, (2) conceptual design, and (3) cost analysis. Task 1 determined the basis for a conceptual design for the 500 MW FBCO process; it was completed by AERA in September 1992, and a report, "Establishment of the Design Basis for Application to a 500 MW Coal-fired Facility," was submitted at that time. Task 2 gathered all pertinent data available to date and reviewed its applicability to the 500 MW FBCO process. Work on this task was carried out jointly by the AERA team members: Roberts & Schaefers worked on the dense-phase transport aspect of the design; Cornell and Carnegie Mellon Universities worked on the design kinetics and modeling; and AERA contributed commercial power and combustion experience. Task 3 provides budgetary cost estimates for the FBCO process and competing alternative technologies for sulfur dioxide and nitrogen oxide removal.
NASA Technical Reports Server (NTRS)
Dec, John A.; Braun, Robert D.
2011-01-01
A finite element ablation and thermal response program is presented for simulation of three-dimensional transient thermostructural analysis. The three-dimensional governing differential equations and finite element formulation are summarized. A novel probabilistic design methodology for thermal protection systems is presented. The design methodology is an eight-step process beginning with a parameter sensitivity study, followed by a deterministic analysis whereby an optimum design can be determined. The design process concludes with a Monte Carlo simulation in which the probabilities of exceeding design specifications are estimated. The methodology is demonstrated by applying it to the carbon phenolic compression pads of the Crew Exploration Vehicle. The maximum allowed values of bondline temperature and tensile stress are used as the design specifications in this study.
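A minimal sketch of the final, Monte Carlo step in this kind of methodology follows: sample the uncertain inputs and count how often the responses exceed their allowables. The response models, distributions, and limits below are notional, not those of the cited study.

```python
# Hedged sketch of Monte Carlo estimation of the probability of exceeding
# bondline-temperature and tensile-stress allowables. All numbers notional.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
thickness = rng.normal(0.05, 0.002, n)                  # pad thickness, m
heat_load = rng.lognormal(mean=13.0, sigma=0.1, size=n) # heat load, W/m^2

# Notional response surfaces standing in for the deterministic FE analyses.
bondline_T = 450.0 + 5e-5 * heat_load / np.sqrt(thickness)  # K
stress = 30e6 + 50.0 * heat_load                            # Pa

p_temp = np.mean(bondline_T > 560.0)    # allowable bondline temperature
p_stress = np.mean(stress > 55e6)       # allowable tensile stress
print(f"P(T > limit) = {p_temp:.3f}, P(stress > limit) = {p_stress:.3f}")
```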
Aircraft Structural Mass Property Prediction Using Conceptual-Level Structural Analysis
NASA Technical Reports Server (NTRS)
Sexstone, Matthew G.
1998-01-01
This paper describes a methodology that extends the use of the Equivalent LAminated Plate Solution (ELAPS) structural analysis code from conceptual-level aircraft structural analysis to conceptual-level aircraft mass property analysis. Mass property analysis in aircraft structures has historically depended upon parametric weight equations at the conceptual design level and Finite Element Analysis (FEA) at the detailed design level. ELAPS allows for the modeling of detailed geometry, metallic and composite materials, and non-structural mass coupled with analytical structural sizing to produce high-fidelity mass property analyses representing fully configured vehicles early in the design process. This capability is especially valuable for unusual configuration and advanced concept development where existing parametric weight equations are inapplicable and FEA is too time consuming for conceptual design. This paper contrasts the use of ELAPS relative to empirical weight equations and FEA. ELAPS modeling techniques are described and the ELAPS-based mass property analysis process is detailed. Examples of mass property stochastic calculations produced during a recent systems study are provided. This study involved the analysis of three remotely piloted aircraft required to carry scientific payloads to very high altitudes at subsonic speeds. Due to the extreme nature of this high-altitude flight regime, few existing vehicle designs are available for use in performance and weight prediction. ELAPS was employed within a concurrent engineering analysis process that simultaneously produces aerodynamic, structural, and static aeroelastic results for input to aircraft performance analyses. The ELAPS models produced for each concept were also used to provide stochastic analyses of wing structural mass properties. The results of this effort indicate that ELAPS is an efficient means to conduct multidisciplinary trade studies at the conceptual design level.
NASA Technical Reports Server (NTRS)
Bao, Han P.; Samareh, J. A.
2000-01-01
The primary objective of this paper is to demonstrate the use of process-based manufacturing and assembly cost models in a traditional performance-focused multidisciplinary design and optimization process. The use of automated cost-performance analysis is an enabling technology that could bring realistic process-based manufacturing and assembly cost into multidisciplinary design and optimization. In this paper, we present a new methodology for incorporating process costing into a standard multidisciplinary design optimization process. Material, manufacturing process, and assembly process costs could then be used as the objective function for the optimization method. A case study involving forty-six different configurations of a simple wing is presented, indicating that a design based on performance criteria alone may not necessarily be the most affordable as far as manufacturing and assembly cost is concerned.
Toxic release consequence analysis tool (TORCAT) for inherently safer design plant.
Shariff, Azmi Mohd; Zaini, Dzulkarnain
2010-10-15
Many major toxic release accidents in the past have caused numerous fatalities, such as the tragedy of the MIC release in Bhopal, India (1984). One approach is the inherently safer design technique, which utilizes inherent safety principles to eliminate or minimize accidents rather than to control the hazard. This technique is best implemented in the preliminary design stage, where the consequences of a toxic release can be evaluated and the necessary design improvements implemented to eliminate or minimize accidents to as low as reasonably practicable (ALARP) without resorting to costly protective systems. However, no commercial tool with such a capability is currently available. This paper reports preliminary findings on the development of a prototype tool for consequence analysis and design improvement via inherent safety principles, which integrates a process design simulator with a toxic release consequence analysis model. Consequence analyses based on worst-case scenarios during the process flowsheeting stage were conducted as case studies. The preliminary findings show that the toxic release consequence analysis tool (TORCAT) has the capability to eliminate or minimize potential toxic release accidents by adopting the inherent safety principle early in the preliminary design stage. 2010 Elsevier B.V. All rights reserved.
Comprehensive assessment of the L-lysine production process from fermentation of sugarcane molasses.
Anaya-Reza, Omar; Lopez-Arenas, Teresa
2017-07-01
L-Lysine is an essential amino acid that can be produced by chemical processes from fossil raw materials, as well as by microbial fermentation, the latter being a more efficient and environmentally friendly procedure. In this work, the production process of L-lysine-HCl is studied using a systematic approach based on modeling and simulation, which supports decision making in the early stage of process design. The study considers two analysis stages: first, the dynamic analysis of the fermentation reactor, where the conversion of sugars from sugarcane molasses to L-lysine with a strain of Corynebacterium glutamicum is carried out. In this stage, the operation mode (either batch or fed-batch) and the operating conditions of the fermentation reactor are defined to maximize the technical criteria. Afterwards, the second analysis stage addresses the industrial production process of L-lysine-HCl, including the fermentation reactor, upstream processing, and downstream processing. In this stage, the influence of key parameters on overall process performance is scrutinized through the evaluation of several technical, economic, and environmental criteria to determine a profitable and sustainable design of the L-lysine production process. The main results show how the operating conditions, process design, and selection of evaluation criteria can influence the conceptual design. The best plant design shows maximum product yield (0.31 g L-lysine/g glucose) and productivity (1.99 g/L/h), achieving a 26.5% return on investment (ROI) with a payback period (PBP) of 3.8 years, decreased water and energy consumption, and a low potential environmental impact (PEI) index.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boggs, Paul T.; Althsuler, Alan; Larzelere, Alex R.
2005-08-01
The Design-through-Analysis Realization Team (DART) is chartered with reducing the time Sandia analysts require to complete the engineering analysis process. The DART system analysis team studied the engineering analysis processes employed by analysts in Centers 9100 and 8700 at Sandia to identify opportunities for reducing overall design-through-analysis process time. The team created and implemented a rigorous analysis methodology based on a generic process flow model parameterized by information obtained from analysts. They also collected data from analysis department managers to quantify the problem type and complexity distribution throughout Sandia's analyst community. They then used this information to develop a community model, which enables a simple characterization of processes that span the analyst community. The results indicate that equal opportunity for reducing analysis process time is available both by reducing the "once-through" time required to complete a process step and by reducing the probability of backward iteration. In addition, reducing the rework fraction (i.e., improving the engineering efficiency of subsequent iterations) offers approximately 40% to 80% of the benefit of reducing the "once-through" time or iteration probability, depending upon the process step being considered. Further, the results indicate that geometry manipulation and meshing is the largest portion of an analyst's effort, especially for structural problems, and offers significant opportunity for overall time reduction. Iteration loops initiated late in the process are more costly than others because they increase "inner loop" iterations. Identifying and correcting problems as early as possible in the process offers significant opportunity for time savings.
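As a rough illustration of how the three levers quantified above interact, consider a toy model (not the DART team's actual community model) in which a step's once-through time is t_1, each completion triggers a backward iteration with probability p, and each rework pass costs a fraction r of t_1. The expected step time is then

```latex
% Toy model: geometric number of rework passes, each costing r * t_1.
\[
  \mathbb{E}[T] \;=\; t_1 + \frac{p}{1-p}\, r\, t_1
  \;=\; t_1\!\left(1 + \frac{p\,r}{1-p}\right).
\]
```

In this model, reducing t_1 or p scales the whole expression, while reducing r attacks only the rework term, which qualitatively matches the report's finding that rework-fraction improvements capture roughly 40% to 80% of the benefit of the other two levers.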
PICASSO: an end-to-end image simulation tool for space and airborne imaging systems
NASA Astrophysics Data System (ADS)
Cota, Steve A.; Bell, Jabin T.; Boucher, Richard H.; Dutton, Tracy E.; Florio, Chris J.; Franz, Geoffrey A.; Grycewicz, Thomas J.; Kalman, Linda S.; Keller, Robert A.; Lomheim, Terrence S.; Paulson, Diane B.; Wilkinson, Timothy S.
2008-08-01
The design of any modern imaging system is the end result of many trade studies, each seeking to optimize image quality within real world constraints such as cost, schedule and overall risk. Image chain analysis - the prediction of image quality from fundamental design parameters - is an important part of this design process. At The Aerospace Corporation we have been using a variety of image chain analysis tools for many years, the Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) among them. In this paper we describe our PICASSO tool, showing how, starting with a high quality input image and hypothetical design descriptions representative of the current state of the art in commercial imaging satellites, PICASSO can generate standard metrics of image quality in support of the decision processes of designers and program managers alike.
PICASSO: an end-to-end image simulation tool for space and airborne imaging systems
NASA Astrophysics Data System (ADS)
Cota, Stephen A.; Bell, Jabin T.; Boucher, Richard H.; Dutton, Tracy E.; Florio, Christopher J.; Franz, Geoffrey A.; Grycewicz, Thomas J.; Kalman, Linda S.; Keller, Robert A.; Lomheim, Terrence S.; Paulson, Diane B.; Wilkinson, Timothy S.
2010-06-01
The design of any modern imaging system is the end result of many trade studies, each seeking to optimize image quality within real world constraints such as cost, schedule and overall risk. Image chain analysis - the prediction of image quality from fundamental design parameters - is an important part of this design process. At The Aerospace Corporation we have been using a variety of image chain analysis tools for many years, the Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) among them. In this paper we describe our PICASSO tool, showing how, starting with a high quality input image and hypothetical design descriptions representative of the current state of the art in commercial imaging satellites, PICASSO can generate standard metrics of image quality in support of the decision processes of designers and program managers alike.
Nancy Diaz; Dean Apostol
1992-01-01
This publication presents a Landscape Design and Analysis Process, along with some simple methods and tools for describing landscapes and their function. The information is qualitative in nature and highlights basic concepts, but does not address landscape ecology in great depth. Readers are encouraged to consult the list of selected references in Chapter 2 if they...
A Multidisciplinary Approach to Mixer-Ejector Analysis and Design
NASA Technical Reports Server (NTRS)
Hendricks, Eric, S.; Seidel, Jonathan, A.
2012-01-01
The design of an engine for a civil supersonic aircraft presents a difficult multidisciplinary problem to propulsion system engineers. There are numerous competing requirements for the engine, such as being efficient during cruise while remaining quiet enough at takeoff to meet airport noise regulations. The use of mixer-ejector nozzles presents one possible solution to this challenge; however, designing a mixer-ejector that successfully addresses both concerns is a difficult proposition. Presented in this paper is an integrated multidisciplinary approach to the analysis and design of these systems. A process that uses several low-fidelity tools to evaluate both the performance and the acoustics of mixer-ejector nozzles is described. This process is further expanded to include system-level modeling of engines and aircraft to determine the effects on mission performance and noise near airports. The overall process is developed in the OpenMDAO framework currently being developed by NASA. Sample results are given for a notional mixer-ejector design, demonstrating the capabilities of the method.
Alternative Instructional Strategies in an IS Curriculum
ERIC Educational Resources Information Center
Parker, Kevin R.; LeRouge, Cynthia; Trimmer, Ken
2005-01-01
Systems Analysis and Design is a core component of an education in information systems. To appeal to a wider range of constituents and facilitate the learning process, the content of a traditional Systems Analysis and Design course has been supplemented with an alternative modeling approach. This paper presents an instructional design that…
Automated Tetrahedral Mesh Generation for CFD Analysis of Aircraft in Conceptual Design
NASA Technical Reports Server (NTRS)
Ordaz, Irian; Li, Wu; Campbell, Richard L.
2014-01-01
The paper introduces an automation process of generating a tetrahedral mesh for computational fluid dynamics (CFD) analysis of aircraft configurations in early conceptual design. The method was developed for CFD-based sonic boom analysis of supersonic configurations, but can be applied to aerodynamic analysis of aircraft configurations in any flight regime.
Numerical continuation and bifurcation analysis in aircraft design: an industrial perspective.
Sharma, Sanjiv; Coetzee, Etienne B; Lowenberg, Mark H; Neild, Simon A; Krauskopf, Bernd
2015-09-28
Bifurcation analysis is a powerful method for studying the steady-state nonlinear dynamics of systems. Software tools exist for the numerical continuation of steady-state solutions as parameters of the system are varied. These tools make it possible to generate 'maps of solutions' in an efficient way that provide valuable insight into the overall dynamic behaviour of a system and can potentially influence the design process. While this approach has been employed in the military aircraft control community to understand the effectiveness of controllers, the use of bifurcation analysis in the wider aircraft industry is as yet limited. This paper reports progress on how bifurcation analysis can play a role in the design process for passenger aircraft. © 2015 The Author(s).
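The numerical continuation idea referenced here can be illustrated in a few lines: solve f(x, λ) = 0 with Newton's method while sweeping the parameter λ, seeding each solve with the previous solution. The cubic test system below is invented for illustration; production continuation tools switch to pseudo-arclength continuation to get around fold points.

```python
# Hedged sketch of natural-parameter continuation on a toy steady-state
# equation f(x, lam) = 0; not the software discussed in the paper.
import numpy as np

def f(x, lam):           # steady-state residual of an invented system
    return x**3 - x + lam

def df_dx(x, lam):
    return 3 * x**2 - 1

def newton(x0, lam, tol=1e-10, max_iter=50):
    x = x0
    for _ in range(max_iter):
        step = f(x, lam) / df_dx(x, lam)
        x -= step
        if abs(step) < tol:
            break
    return x

branch = []
x = -1.0                                    # start on a known equilibrium
for lam in np.linspace(0.0, 0.5, 101):      # continuation parameter sweep
    x = newton(x, lam)                      # previous solution seeds the next
    branch.append((lam, x))
# Near fold (saddle-node) points df_dx -> 0 and this simple scheme degrades,
# which is where pseudo-arclength continuation takes over in practice.
```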
ProteMiner-SSM: a web server for efficient analysis of similar protein tertiary substructures.
Chang, Darby Tien-Hau; Chen, Chien-Yu; Chung, Wen-Chin; Oyang, Yen-Jen; Juan, Hsueh-Fen; Huang, Hsuan-Cheng
2004-07-01
Analysis of protein-ligand interactions is a fundamental issue in drug design. As the detailed and accurate analysis of protein-ligand interactions involves calculation of binding free energy based on thermodynamics and even quantum mechanics, which is highly expensive in terms of computing time, conformational and structural analysis of proteins and ligands has been widely employed as a screening process in computer-aided drug design. In this paper, a web server called ProteMiner-SSM, designed for efficient analysis of similar protein tertiary substructures, is presented. In one experiment reported in this paper, the web server has been exploited to obtain some clues about a biochemical hypothesis. The main distinction in the software design of the web server is the filtering process incorporated to expedite the analysis. The filtering process extracts the residues located in the caves of the protein tertiary structure for analysis and operates with O(n log n) time complexity, where n is the number of residues in the protein. In comparison, the alpha-hull algorithm, which is widely used in computer graphics for identifying the instances that lie on the contour of a three-dimensional object, features O(n²) time complexity. Experimental results show that the filtering process presented in this paper is able to speed up the analysis by a factor ranging from 3.15 to 9.37. The ProteMiner-SSM web server can be found at http://proteminer.csie.ntu.edu.tw/. There is a mirror site at http://p4.sbl.bc.sinica.edu.tw/proteminer/.
Software For Design Of Life-Support Systems
NASA Technical Reports Server (NTRS)
Rudokas, Mary R.; Cantwell, Elizabeth R.; Robinson, Peter I.; Shenk, Timothy W.
1991-01-01
Design Assistant Workstation (DAWN) computer program is a prototype of an expert software system for analysis and design of regenerative, physical/chemical life-support systems that revitalize air, reclaim water, produce food, and treat waste. It incorporates both conventional software for quantitative mathematical modeling of physical, chemical, and biological processes and an expert system offering the user stored knowledge about materials and processes. The program constructs a task tree as it leads the user through a simulated process, offers alternatives, and indicates where an alternative is not feasible. It also enables the user to jump from one design level to another.
Trajectory Dispersed Vehicle Process for Space Launch System
NASA Technical Reports Server (NTRS)
Statham, Tamara; Thompson, Seth
2017-01-01
The Space Launch System (SLS) vehicle is part of NASA's deep space exploration plans, which include manned missions to Mars. Manufacturing uncertainties in design parameters are key considerations throughout SLS development, as they have significant effects on focus parameters such as lift-off thrust-to-weight, vehicle payload, maximum dynamic pressure, and compression loads. This presentation discusses how the SLS program captures these uncertainties using a 3-degree-of-freedom (DOF) process called Trajectory Dispersed (TD) analysis. This analysis biases nominal trajectories to identify extremes in the design parameters for various potential SLS configurations and missions. The process utilizes design of experiments (DOE) and response surface methodologies (RSM) to statistically sample the uncertainties, and develops the resulting vehicles using a maximum likelihood estimate (MLE) process to target the uncertainty biases. These vehicles represent various missions and configurations and are used as key inputs to a variety of analyses in the SLS design process, including 6-DOF dispersions, separation clearances, and engine-out failure studies.
Integration of Off-Track Sonic Boom Analysis in Conceptual Design of Supersonic Aircraft
NASA Technical Reports Server (NTRS)
Ordaz, Irian; Li, Wu
2011-01-01
A highly desired capability for the conceptual design of aircraft is the ability to rapidly and accurately evaluate new concepts to avoid adverse trade decisions that may hinder the development process in the later stages of design. Evaluating the robustness of new low-boom concepts is important for the conceptual design of supersonic aircraft. Here, robustness means that the aircraft configuration has a low-boom ground signature at both under- and off-track locations. An integrated process for off-track boom analysis is developed to facilitate the design of robust low-boom supersonic aircraft. The integrated off-track analysis can also be used to study the sonic boom impact and to plan future flight trajectories where flight conditions and ground elevation might have a significant effect on ground signatures. The key enabler for off-track sonic boom analysis is accurate computational fluid dynamics (CFD) solutions for off-body pressure distributions. To ensure the numerical accuracy of the off-body pressure distributions, a mesh study is performed with Cart3D to determine the mesh requirements for off- body CFD analysis and comparisons are made between the Cart3D and USM3D results. The variations in ground signatures that result from changes in the initial location of the near-field waveform are also examined. Finally, a complete under- and off-track sonic boom analysis is presented for two distinct supersonic concepts to demonstrate the capability of the integrated analysis process.
NASA Astrophysics Data System (ADS)
Halbe, Johannes; Pahl-Wostl, Claudia; Adamowski, Jan
2018-01-01
Multiple barriers constrain the widespread application of participatory methods in water management, including the more technical focus of most water agencies, additional cost and time requirements for stakeholder involvement, as well as institutional structures that impede collaborative management. This paper presents a stepwise methodological framework that addresses the challenges of context-sensitive initiation, design and institutionalization of participatory modeling processes. The methodological framework consists of five successive stages: (1) problem framing and stakeholder analysis, (2) process design, (3) individual modeling, (4) group model building, and (5) institutionalized participatory modeling. The Management and Transition Framework is used for problem diagnosis (Stage One), context-sensitive process design (Stage Two) and analysis of requirements for the institutionalization of participatory water management (Stage Five). Conceptual modeling is used to initiate participatory modeling processes (Stage Three) and ensure a high compatibility with quantitative modeling approaches (Stage Four). This paper describes the proposed participatory model building (PMB) framework and provides a case study of its application in Québec, Canada. The results of the Québec study demonstrate the applicability of the PMB framework for initiating and designing participatory model building processes and analyzing barriers towards institutionalization.
Vocational Education Operations Analysis Process.
ERIC Educational Resources Information Center
California State Dept. of Education, Sacramento. Vocational Education Services.
This manual on the vocational education operations analysis process is designed to provide vocational administrators/coordinators with an internal device to collect, analyze, and display vocational education performance data. The first section describes the system and includes the following: analysis worksheet, data sources, utilization, system…
Gaining the Competitive Edge: Design for Manufacturing
NASA Technical Reports Server (NTRS)
Batill, Stephen M.; Pinkelman, Jim; Sellar, Richard
1993-01-01
The successful design of a commercial aircraft which is intended to be in direct competition with existing aircraft requires a market analysis to establish design requirements, the development of a concept to achieve those goals, and the ability to economically manufacture the aircraft. It is often the case that an engineer designs system components with only the perspective of a particular discipline; the relationship of that component to the entire system is often a minor consideration. In an effort to highlight the interaction that is necessary during the design process, the students were organized into design/build teams and required to integrate aspects of market analysis, engineering design, production, and economics into their concepts. In order to facilitate this process, a hypothetical "Aeroworld" was established. Having been furnished relevant demographic and economic data for "Aeroworld", students were given the task of designing and building an aircraft for a specific market while achieving an economically competitive design. Involvement of the team in the evolution of the design from market definition to technical development to manufacturing allowed the students to identify critical issues in the design process and to encounter many of the conflicting requirements which arise in aerospace systems design.
Jiang, Canping; Flansburg, Lisa; Ghose, Sanchayita; Jorjorian, Paul; Shukla, Abhinav A
2010-12-15
The concept of design space has been taking root under the quality by design paradigm as a foundation of in-process control strategies for biopharmaceutical manufacturing processes. This paper outlines the development of a design space for a hydrophobic interaction chromatography (HIC) process step. The design space included the impact of raw material lot-to-lot variability and variations in the feed stream from cell culture. A failure modes and effects analysis was employed as the basis for the process characterization exercise. During mapping of the process design space, the multi-dimensional combination of operational variables were studied to quantify the impact on process performance in terms of yield and product quality. Variability in resin hydrophobicity was found to have a significant influence on step yield and high-molecular weight aggregate clearance through the HIC step. A robust operating window was identified for this process step that enabled a higher step yield while ensuring acceptable product quality. © 2010 Wiley Periodicals, Inc.
NASA Technical Reports Server (NTRS)
Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.
1995-01-01
Computing architectures are being assembled that extend concurrent engineering practices by providing more efficient execution and collaboration on distributed, heterogeneous computing networks. Built on the successes of initial architectures, requirements for a next-generation design computing infrastructure can be developed. These requirements concentrate on those needed by a designer in decision-making processes from product conception to recycling and can be categorized in two areas: design process and design information management. A designer both designs and executes design processes throughout design time to achieve better product and process capabilities while expending fewer resources. To accomplish this, information, or more appropriately design knowledge, needs to be adequately managed during product and process decomposition as well as recomposition. A foundation has been laid that captures these requirements in a design architecture called DREAMS (Developing Robust Engineering Analysis Models and Specifications). In addition, a computing infrastructure, called IMAGE (Intelligent Multidisciplinary Aircraft Generation Environment), is being developed that satisfies the design requirements defined in DREAMS and incorporates enabling computational technologies.
Integrating end-to-end threads of control into object-oriented analysis and design
NASA Technical Reports Server (NTRS)
Mccandlish, Janet E.; Macdonald, James R.; Graves, Sara J.
1993-01-01
Current object-oriented analysis and design methodologies fall short in their use of mechanisms for identifying threads of control for the system being developed. The scenarios which typically describe a system are more global than looking at the individual objects and representing their behavior. Unlike conventional methodologies that use data flow and process-dependency diagrams, object-oriented methodologies do not provide a model for representing these global threads end-to-end. Tracing through threads of control is key to ensuring that a system is complete and timing constraints are addressed. The existence of multiple threads of control in a system necessitates a partitioning of the system into processes. This paper describes the application and representation of end-to-end threads of control to the object-oriented analysis and design process using object-oriented constructs. The issue of representation is viewed as a grouping problem, that is, how to group classes/objects at a higher level of abstraction so that the system may be viewed as a whole with both classes/objects and their associated dynamic behavior. Existing object-oriented development methodology techniques are extended by adding design-level constructs termed logical composite classes and process composite classes. Logical composite classes are design-level classes which group classes/objects both logically and by thread of control information. Process composite classes further refine the logical composite class groupings by using process partitioning criteria to produce optimum concurrent execution results. The goal of these design-level constructs is to ultimately provide the basis for a mechanism that can support the creation of process composite classes in an automated way. Using an automated mechanism makes it easier to partition a system into concurrently executing elements that can be run in parallel on multiple processors.
10 CFR 712.36 - Medical assessment process.
Code of Federal Regulations, 2010 CFR
2010-01-01
... assigned duties. (b) Employers must provide a job task analysis for those individuals involved in HRP... performed if a job task analysis has not been provided. (c) The medical process by the Designated Physician...
NASA Technical Reports Server (NTRS)
1981-01-01
The engineering design, fabrication, assembly, operation, economic analysis, and process support research and development for an Experimental Process System Development Unit for producing semiconductor-grade silicon using the silane-to-silicon process are reported. The design activity was completed. About 95% of the purchased equipment was received. The draft of the operations manual was about 50% complete, and the design of the free-space system continued. The system using silicon powder transfer, melting, and shotting on a pseudocontinuous basis was demonstrated.
A CAD approach to magnetic bearing design
NASA Technical Reports Server (NTRS)
Jeyaseelan, M.; Anand, D. K.; Kirk, J. A.
1988-01-01
A design methodology has been developed at the Magnetic Bearing Research Laboratory for designing magnetic bearings using a CAD approach. This is used in the algorithm of an interactive design software package. The package is a design tool developed to enable the designer to simulate the entire process of design and analysis of the system. Its capabilities include interactive input/modification of geometry, finding any possible saturation at critical sections of the system, and the design and analysis of a control system that stabilizes and maintains magnetic suspension.
Importance of joint efforts for balanced process of designing and education
NASA Astrophysics Data System (ADS)
Mayorova, V. I.; Bannova, O. K.; Kristiansen, T.-H.; Igritsky, V. A.
2015-06-01
This paper discusses the importance of strategic planning and design processes in developing long-term space exploration missions, both robotic and manned. The discussion begins by reviewing current and traditional international perspectives on space development at the American, Russian, and European space agencies. Analogies and comparisons are drawn from the analysis of several international student collaborative programs: the summer international workshops at the Bauman Moscow State Technical University, the International European Summer Space School "Future Space Technologies and Experiments in Space", and the summer school at Stuttgart University in Germany. The paper focuses on the optimization of design and planning processes for successful space exploration missions and highlights the importance of the following: understanding the connectivity between the different levels of humans and machinery; a simultaneous mission planning approach; reflections and correlations between the disciplines involved in planning and executing space exploration missions; and knowledge gained from different disciplines through cross-applying and re-applying design approaches between various space-related fields of study and research. The conclusions summarize the benefits and complications of applying a balanced design approach at all levels of the design process. Analysis of the successes and failures of organizational efforts in space endeavors is used as a methodological approach to identify key questions to be researched, as these often cause many planning and design processing problems.
NASA Astrophysics Data System (ADS)
Chen, Daniel; Chen, Damian; Yen, Ray; Cheng, Mingjen; Lan, Andy; Ghaskadvi, Rajesh
2008-11-01
Identifying hotspots--structures that limit the lithography process window--becomes increasingly important as the industry relies heavily on RET to print sub-wavelength designs. KLA-Tencor's patented Process Window Qualification (PWQ) methodology has been used for this purpose in various fabs. The PWQ methodology has three key advantages: (a) the PWQ layout, for the best sensitivity; (b) design-based binning, for pattern-repeater analysis; and (c) intelligent sampling, for the best DOI sampling rate. This paper evaluates two different analysis strategies for SEM review sampling successfully deployed at Inotera Memories, Inc. We propose a new approach combining location repeaters and pattern repeaters. In a recent case study, the new sampling flow reduced data analysis and sampling time from 6 hours to 1.5 hours while maintaining the maximum DOI sample rate.
NASA Technical Reports Server (NTRS)
Hawke, Veronica; Gage, Peter; Manning, Ted
2007-01-01
ComGeom2, a tool developed to generate Common Geometry representation for multidisciplinary analysis, has been used to create a large set of geometries for use in a design study requiring analysis by two computational codes. This paper describes the process used to generate the large number of configurations and suggests ways to further automate the process and make it more efficient for future studies. The design geometry for this study is the launch abort system of the NASA Crew Launch Vehicle.
Ultrasound-enhanced bioscouring of greige cotton: regression analysis of process factors
USDA-ARS?s Scientific Manuscript database
Process factors of enzyme concentration, time, power and frequency were investigated for ultrasound-enhanced bioscouring of greige cotton. A fractional factorial experimental design and subsequent regression analysis of the process factors were employed to determine the significance of each factor a...
Research on design connotation of hair drier system
NASA Astrophysics Data System (ADS)
Li, Yongchuan; Wu, Qiong
2018-04-01
Following an analysis and summary of research on the design of hair dryer systems, this paper focuses on system design. Product system design studies not only the product as an entity but also its parts, elements, and components as a system, in order to analyze in depth how product system design can be innovated. On this basis, an association analysis of the component elements of hair dryers and an overall analysis of the system design process for hair dryers are carried out. Taking the product life cycle as the main goal, system analysis, system synthesis, and system optimization are used to solve product design problems, which is of great practical significance.
NASA Astrophysics Data System (ADS)
Sokolov, M. A.
This handbook treats the design and analysis of pulsed radar receivers, with emphasis on elements (especially IC elements) that implement optimal and suboptimal algorithms. The design methodology is developed from the viewpoint of statistical communications theory. Particular consideration is given to the synthesis of single-channel and multichannel detectors, the design of analog and digital signal-processing devices, and the analysis of IF amplifiers.
Cognitive Task Analysis of Experts in Designing Multimedia Learning Object Guideline (M-LOG)
ERIC Educational Resources Information Center
Razak, Rafiza Abdul; Palanisamy, Punithavathy
2013-01-01
The purpose of this study was to design and develop a set of guidelines for multimedia learning objects to inform instructional designers (IDs) about the procedures involved in the process of content analysis. This study was motivated by the absence of standardized procedures in the beginning phase of the multimedia learning object design which is…
NASA Technical Reports Server (NTRS)
Chen, J. C.; Garba, J. A.; Wada, B. K.
1978-01-01
In the design/analysis process of a payload structural system, the accelerations at the payload/launch vehicle interface obtained from a system analysis using a rigid payload are often used as the input forcing function to the elastic payload to obtain structural design loads. Such an analysis is at best an approximation since the elastic coupling effects are neglected. This paper develops a method wherein the launch vehicle/rigid payload interface accelerations are modified to account for the payload elasticity. The advantage of the proposed method, which is exact to the extent that the physical system can be described by a truncated set of generalized coordinates, is that the complete design/analysis process can be performed within the organization responsible for the payload design. The method requires the updating of the system normal modes to account for payload changes, but does not require a complete transient solution using the composite system model. An application to a real complex structure, the Viking Spacecraft System, is given.
Prediction and Estimation of Scaffold Strength with different pore size
NASA Astrophysics Data System (ADS)
Muthu, P.; Mishra, Shubhanvit; Sri Sai Shilpa, R.; Veerendranath, B.; Latha, S.
2018-04-01
This paper emphasizes the significance of predicting and estimating the mechanical strength of 3D functional scaffolds before the manufacturing process. Prior evaluation of the mechanical strength and structural properties of the scaffold reduces fabrication cost and simplifies the design process. Detailed analysis and investigation of various mechanical properties, including shear stress equivalence, have helped to estimate the effect of porosity and pore size on the functionality of the scaffold. The influence of variation in porosity was examined computationally via finite element analysis (FEA) using the ANSYS application software. The results indicate the adequacy of the evolutionary method for the regulation and optimization of the intricate engineering design process.
NASA Astrophysics Data System (ADS)
Tuzkaya, Umut R.; Eser, Arzum; Argon, Goner
2004-02-01
Today, growing amounts of waste, due to the fast consumption rate of products, have caused irreversible environmental pollution and damage. A considerable part of this waste is packaging material. With the realization of this fact, various waste policies have taken important steps. Here we consider a firm for which waste aluminum constitutes the majority of its raw materials. To achieve a profitable recycling process, the plant layout should be well designed. In this study, we propose a two-step approach involving the Analytic Hierarchy Process (AHP) and Data Envelopment Analysis (DEA) to solve facility layout design problems. A case example is considered to demonstrate the results achieved.
NASA Astrophysics Data System (ADS)
Gradziński, Piotr
2017-10-01
As the world's climate changes (in part under the influence of architectural activity), the author attempts to reorient design practice toward using and adapting to climatic conditions. Applying Life Cycle Analysis (LCA) and digital BIM (Building Information Modelling) analytical tools in the early stages of the architectural design process defines the overriding requirements that the designer/architect should meet. The first part of the text characterizes the influences of architectural activity (consumption, pollution, waste, etc.) and of building materials (embodied energy, embodied carbon, Global Warming Potential, etc.) as direct negative environmental impacts. The second part reviews the methods and analytical techniques that counter these negative influences: first, studying the building through Life Cycle Analysis of its structure (e.g. materials) and functioning (e.g. energy consumption) across the stages before use, in use, and after use; second, using digital analytical tools to determine the benefits of running multi-faceted simulations of environmental factors (exposure to light, shade, wind) that directly affect the shaping of the building form. In conclusion, the author's results highlight that designing buildings with the above-mentioned elements (LCA, BIM) allows early design decisions in the process of shaping architectural form to be corrected, minimizing the impact on nature and the environment. The work refers directly to the architectural-environmental dimensions, orienting the building design process with respect to broadly understood climatic change.
Systematic procedure for designing processes with multiple environmental objectives.
Kim, Ki-Joo; Smith, Raymond L
2005-04-01
Evaluation of multiple objectives is very important in designing environmentally benign processes. It requires a systematic procedure for solving multiobjective decision-making problems due to the complex nature of the problems, the need for complex assessments, and the complicated analysis of multidimensional results. In this paper, a novel systematic procedure is presented for designing processes with multiple environmental objectives. This procedure has four steps: initialization, screening, evaluation, and visualization. The first two steps are used for systematic problem formulation based on mass and energy estimation and order of magnitude analysis. In the third step, an efficient parallel multiobjective steady-state genetic algorithm is applied to design environmentally benign and economically viable processes and to provide more accurate and uniform Pareto optimal solutions. In the last step a new visualization technique for illustrating multiple objectives and their design parameters on the same diagram is developed. Through these integrated steps the decision-maker can easily determine design alternatives with respect to his or her preferences. Most importantly, this technique is independent of the number of objectives and design parameters. As a case study, acetic acid recovery from aqueous waste mixtures is investigated by minimizing eight potential environmental impacts and maximizing total profit. After applying the systematic procedure, the most preferred design alternatives and their design parameters are easily identified.
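The evaluation step rests on the notion of Pareto optimality. As a rough illustration, the sketch below (Python, with hypothetical objective values) implements the basic non-domination filter that any multiobjective genetic algorithm, including the parallel steady-state variant described above, must apply to its population.

```python
# A minimal sketch of Pareto filtering, assuming all objectives are
# minimized; the paper's parallel multiobjective steady-state GA is far
# more elaborate.
def dominates(a, b):
    """True if design a is at least as good as b in every objective
    and strictly better in at least one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(designs):
    """Keep only non-dominated objective vectors."""
    return [d for d in designs
            if not any(dominates(other, d) for other in designs if other != d)]

# Hypothetical (environmental impact, -profit) pairs for candidate processes.
candidates = [(8.2, -1.0), (5.1, -0.7), (9.0, -1.2), (5.1, -1.1)]
print(pareto_front(candidates))   # -> [(9.0, -1.2), (5.1, -1.1)]
```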
System architectures for telerobotic research
NASA Technical Reports Server (NTRS)
Harrison, F. Wallace
1989-01-01
Several activities are performed related to the definition and creation of telerobotic systems. The effort and investment required to create architectures for these complex systems can be enormous; however, the magnitude of the process can be reduced if structured design techniques are applied. A number of informal methodologies supporting certain aspects of the design process are available. More recently, prototypes of integrated tools supporting all phases of system design, from requirements analysis to code generation and hardware layout, have begun to appear. Activities related to the system architecture of telerobots are described, including current activities designed to provide a methodology for the comparison and quantitative analysis of alternative system architectures.
Advanced Usage of Vehicle Sketch Pad for CFD-Based Conceptual Design
NASA Technical Reports Server (NTRS)
Ordaz, Irian; Li, Wu
2013-01-01
Conceptual design is the most fluid phase of aircraft design. It is important to be able to perform large scale design space exploration of candidate concepts that can achieve the design intent to avoid more costly configuration changes in later stages of design. This also means that conceptual design is highly dependent on the disciplinary analysis tools to capture the underlying physics accurately. The required level of analysis fidelity can vary greatly depending on the application. Vehicle Sketch Pad (VSP) allows the designer to easily construct aircraft concepts and make changes as the design matures. More recent development efforts have enabled VSP to bridge the gap to high-fidelity analysis disciplines such as computational fluid dynamics and structural modeling for finite element analysis. This paper focuses on the current state-of-the-art geometry modeling for the automated process of analysis and design of low-boom supersonic concepts using VSP and several capability-enhancing design tools.
NASA Technical Reports Server (NTRS)
Follett, William W.; Rajagopal, Raj
2001-01-01
The focus of the AA MDO team is to reduce product development cost through the capture and automation of best design and analysis practices and through increasing the availability of low-cost, high-fidelity analysis. Implementation of robust designs reduces costs associated with the Test-Fail-Fix cycle. RD is currently focusing on several technologies to improve the design process, including optimization and robust design, expert and rule-based systems, and collaborative technologies.
Total-System Approach To Design And Analysis Of Structures
NASA Technical Reports Server (NTRS)
Verderaime, V.
1995-01-01
Paper presents overview and study of, and comprehensive approach to, multidisciplinary engineering design and analysis of structures. Emphasizes issues related to design of semistatic structures in environments in which spacecraft launched, underlying concepts applicable to other structures within unique terrestrial, marine, or flight environments. Purpose of study to understand interactions among traditionally separate engineering design disciplines with view toward optimizing not only structure but also overall design process.
NASA Technical Reports Server (NTRS)
Phillips, Dave; Haas, William; Barth, Tim; Benjamin, Perakath; Graul, Michael; Bagatourova, Olga
2005-01-01
Range Process Simulation Tool (RPST) is a computer program that assists managers in rapidly predicting and quantitatively assessing the operational effects of proposed technological additions to, and/or upgrades of, complex facilities and engineering systems such as the Eastern Test Range. Originally designed for application to space transportation systems, RPST is also suitable for assessing effects of proposed changes in industrial facilities and large organizations. RPST follows a model-based approach that includes finite-capacity schedule analysis and discrete-event process simulation. A component-based, scalable, open architecture makes RPST easily and rapidly tailorable for diverse applications. Specific RPST functions include: (1) definition of analysis objectives and performance metrics; (2) selection of process templates from a process-template library; (3) configuration of process models for detailed simulation and schedule analysis; (4) design of operations-analysis experiments; (5) schedule and simulation-based process analysis; and (6) optimization of performance by use of genetic algorithms and simulated annealing. The main benefits afforded by RPST are provision of information that can be used to reduce costs of operation and maintenance, and the capability for affordable, accurate, and reliable prediction and exploration of the consequences of many alternative proposed decisions.
NASA Technical Reports Server (NTRS)
Wanthal, Steven; Schaefer, Joseph; Justusson, Brian; Hyder, Imran; Engelstad, Stephen; Rose, Cheryl
2017-01-01
The Advanced Composites Consortium is a US Government/Industry partnership supporting technologies to enable timeline and cost reduction in the development of certified composite aerospace structures. A key component of the consortium's approach is the development and validation of improved progressive damage and failure analysis methods for composite structures. These methods will enable increased use of simulations in design trade studies and detailed design development, and thereby enable more targeted physical test programs to validate designs. To accomplish this goal with confidence, a rigorous verification and validation process was developed. The process was used to evaluate analysis methods and associated implementation requirements to ensure calculation accuracy and to gage predictability for composite failure modes of interest. This paper introduces the verification and validation process developed by the consortium during the Phase I effort of the Advanced Composites Project. Specific structural failure modes of interest are first identified, and a subset of standard composite test articles are proposed to interrogate a progressive damage analysis method's ability to predict each failure mode of interest. Test articles are designed to capture the underlying composite material constitutive response as well as the interaction of failure modes representing typical failure patterns observed in aerospace structures.
Main Engine Prototype Development for 2nd Generation RLV RS-83
NASA Technical Reports Server (NTRS)
Vilja, John; Fisher, Mark; Lyles, Garry M. (Technical Monitor)
2002-01-01
This presentation reports on the NASA project to develop a prototype for RS-83 engine designed for use on reusable launch vehicles (RLV). Topics covered include: program objectives, overview schedule, organizational chart, integrated systems engineering processes, requirement analysis, catastrophic engine loss, maintainability analysis tools, and prototype design analysis.
Space simulation facilities providing a stable thermal vacuum facility
NASA Technical Reports Server (NTRS)
Tellalian, Martin L.
1990-01-01
CBI has recently constructed the Intermediate Thermal Vacuum Facility. Built as a corporate facility, the installation will first be used on the Boost Surveillance and Tracking System (BSTS) program. It will also be used to develop and test other sensor systems. The horizontal chamber has a horseshoe-shaped cross section and is supported on pneumatic isolators for vibration isolation. The chamber structure was designed to meet stability and stiffness requirements. The design process included measurement of the ambient ground vibrations, analysis of various foundation test article support configurations, design and analysis of the chamber shell, and modal testing of the chamber shell. A detailed 3-D finite element analysis was made in the design stage to predict the lowest three natural frequencies and mode shapes and to identify local vibrating components. The design process is described, and the results of the finite element analysis are compared to the results of the field modal testing and analysis for the three lowest natural frequencies and mode shapes. Concepts are also presented for stiffening large steel structures, along with methods to improve test article stability in large space simulation facilities.
Application of structured analysis to a telerobotic system
NASA Technical Reports Server (NTRS)
Dashman, Eric; Mclin, David; Harrison, F. W.; Soloway, Donald; Young, Steven
1990-01-01
The analysis and evaluation of a multiple arm telerobotic research and demonstration system developed by the NASA Intelligent Systems Research Laboratory (ISRL) is described. Structured analysis techniques were used to develop a detailed requirements model of an existing telerobotic testbed. Performance models generated during this process were used to further evaluate the total system. A commercial CASE tool called Teamwork was used to carry out the structured analysis and development of the functional requirements model. A structured analysis and design process using the ISRL telerobotic system as a model is described. Evaluation of this system focused on the identification of bottlenecks in this implementation. The results demonstrate that the use of structured methods and analysis tools can give useful performance information early in a design cycle. This information can be used to ensure that the proposed system meets its design requirements before it is built.
May Stakeholders be Involved in Design Without Informed Consent? The Case of Hidden Design.
Pols, A J K
2017-06-01
Stakeholder involvement in design is desirable from both a practical and an ethical point of view. It is difficult to do well, however, and some problems recur again and again, both of a practical nature, e.g. stakeholders acting strategically rather than openly, and of an ethical nature, e.g. power imbalances unduly affecting the outcome of the process. Hidden Design has been proposed as a method to deal with the practical problems of stakeholder involvement. It aims to do so by taking the observation of stakeholder actions, rather than the outcomes of a deliberative process, as its input. Furthermore, it hides from stakeholders the fact that a design process is taking place so that they will not behave differently than they otherwise would. Both aspects of Hidden Design have raised ethical worries. In this paper I make an ethical analysis of what it means for a design process to leave participants uninformed or deceived rather than acquiring their informed consent beforehand, and to use observation of actions rather than deliberation as input for design, using Hidden Design as a case study. This analysis is based on two sets of normative guidelines: the ethical guidelines for psychological research involving deception or uninformed participants from two professional psychological organisations, and Habermasian norms for a fair and just (deliberative) process. It supports the conclusion that stakeholder involvement in design organised in this way can be ethically acceptable, though under a number of conditions and constraints.
ERIC Educational Resources Information Center
Ho, Hsuan-Fu; Hung, Chia-Chi
2008-01-01
Purpose: The purpose of this paper is to examine how a graduate institute at National Chiayi University (NCYU), by using a model that integrates analytic hierarchy process, cluster analysis and correspondence analysis, can develop effective marketing strategies. Design/methodology/approach: This is primarily a quantitative study aimed at…
Inverse Design of Low-Boom Supersonic Concepts Using Reversed Equivalent-Area Targets
NASA Technical Reports Server (NTRS)
Li, Wu; Rallabandi, Sriram
2011-01-01
A promising path for developing a low-boom configuration is a multifidelity approach that (1) starts from a low-fidelity low-boom design, (2) refines the low-fidelity design with computational fluid dynamics (CFD) equivalent-area (Ae) analysis, and (3) improves the design with sonic-boom analysis by using CFD off-body pressure distributions. The focus of this paper is on the third step of this approach, in which the design is improved with sonic-boom analysis through the use of CFD calculations. A new inverse design process for off-body pressure tailoring is formulated and demonstrated with a low-boom supersonic configuration that was developed by using the mixed-fidelity design method with CFD Ae analysis. The new inverse design process uses the reverse propagation of the pressure distribution (dp/p) from a mid-field location to a near-field location, converts the near-field dp/p into an equivalent-area distribution, generates a low-boom target for the reversed equivalent area (Ae,r) of the configuration, and modifies the configuration to minimize the differences between the configuration's Ae,r and the low-boom target. The new inverse design process is used to modify a supersonic demonstrator concept for a cruise Mach number of 1.6 and a cruise weight of 30,000 lb. The modified configuration has a fully shaped ground signature with a perceived loudness (PLdB) value of 78.5, while the original configuration has a partially shaped aft signature with a PLdB of 82.3.
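The last step of the process is essentially a curve-matching problem. The sketch below is a loose Python illustration: a hypothetical two-parameter shape model is adjusted by least squares so that its reversed equivalent-area distribution approaches a target; in the actual method Ae,r comes from CFD off-body pressures propagated in reverse.

```python
# A minimal sketch of the final matching step; the model function and
# target are stand-ins, not the paper's data.
import numpy as np
from scipy.optimize import least_squares

x = np.linspace(0.0, 1.0, 50)            # normalized axial stations
target = 0.6 * np.sqrt(x)                # hypothetical low-boom Ae target

def ae_r(params, x):
    a, b = params                        # hypothetical shape parameters
    return a * np.sqrt(x) + b * x**2

# Minimize the mismatch between the configuration's Ae,r and the target.
res = least_squares(lambda p: ae_r(p, x) - target, x0=[1.0, 0.1])
print("fitted shape parameters:", res.x)
```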
ACT Payload Shroud Structural Concept Analysis and Optimization
NASA Technical Reports Server (NTRS)
Zalewski, Bart B.; Bednarcyk, Brett A.
2010-01-01
Aerospace structural applications demand a weight-efficient design to perform in a cost-effective manner. This is particularly true for launch vehicle structures, where weight is the dominant design driver. The design process typically requires many iterations to ensure that a satisfactory minimum weight has been obtained. Although metallic structures can be weight efficient, composite structures can provide additional weight savings due to their lower density and additional design flexibility. This work presents structural analysis and weight optimization of a composite payload shroud for NASA's Ares V heavy lift vehicle. Two concepts, which were previously determined to be efficient for such a structure, are evaluated: a hat-stiffened/corrugated panel and a fiber-reinforced foam sandwich panel. A composite structural optimization code, HyperSizer, is used to optimize the panel geometry, composite material ply orientations, and sandwich core material. HyperSizer enables an efficient evaluation of thousands of potential designs against multiple strength- and stability-based failure criteria across multiple load cases. The HyperSizer sizing process uses a global finite element model to obtain element forces, which are statistically processed to arrive at panel-level design-to loads. These loads are then used to analyze each candidate panel design. A near-optimum design is selected as the one with the lowest weight that also provides all positive margins of safety. The stiffness of each newly sized panel or beam component is taken into account in the subsequent finite element analysis. Iteration of analysis/optimization is performed to ensure a converged design. Sizing results for the hat-stiffened panel concept and the fiber-reinforced foam sandwich concept are presented.
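The selection logic applied to each panel can be paraphrased compactly. The following Python sketch, with invented candidate data, shows the core rule: discard any design with a negative margin of safety in any criterion or load case, then take the lightest survivor.

```python
# A minimal sketch of the panel-selection rule described above; the
# candidates and their numbers are hypothetical.
candidates = [
    # (name, weight, margins of safety across failure criteria/load cases)
    ("hat_t1.2", 41.0, [0.15, 0.32, 0.08]),
    ("hat_t1.0", 36.5, [0.02, 0.21, -0.04]),   # negative margin -> rejected
    ("foam_c25", 38.2, [0.10, 0.05, 0.12]),
]

feasible = [c for c in candidates if all(m > 0 for m in c[2])]
best = min(feasible, key=lambda c: c[1])
print("selected panel:", best[0], "weight:", best[1])
# In the real process the selected stiffnesses feed back into the global
# FEM and the sizing loop repeats until the design converges.
```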
NASA Astrophysics Data System (ADS)
Donders, S.; Pluymers, B.; Ragnarsson, P.; Hadjit, R.; Desmet, W.
2010-04-01
In the vehicle design process, design decisions are more and more based on virtual prototypes. Due to competitive and regulatory pressure, vehicle manufacturers are forced to improve product quality, to reduce time-to-market and to launch an increasing number of design variants on the global market. To speed up the design iteration process, substructuring and component mode synthesis (CMS) methods are commonly used, involving the analysis of substructure models and the synthesis of the substructure analysis results. Substructuring and CMS enable efficient decentralized collaboration across departments and allow to benefit from the availability of parallel computing environments. However, traditional CMS methods become prohibitively inefficient when substructures are coupled along large interfaces, i.e. with a large number of degrees of freedom (DOFs) at the interface between substructures. The reason is that the analysis of substructures involves the calculation of a number of enrichment vectors, one for each interface degree of freedom (DOF). Since large interfaces are common in vehicles (e.g. the continuous line connections to connect the body with the windshield, roof or floor), this interface bottleneck poses a clear limitation in the vehicle noise, vibration and harshness (NVH) design process. Therefore there is a need to describe the interface dynamics more efficiently. This paper presents a wave-based substructuring (WBS) approach, which allows reducing the interface representation between substructures in an assembly by expressing the interface DOFs in terms of a limited set of basis functions ("waves"). As the number of basis functions can be much lower than the number of interface DOFs, this greatly facilitates the substructure analysis procedure and results in faster design predictions. The waves are calculated once from a full nominal assembly analysis, but these nominal waves can be re-used for the assembly of modified components. The WBS approach thus enables efficient structural modification predictions of the global modes, so that efficient vibro-acoustic design modification, optimization and robust design become possible. The results show that wave-based substructuring offers a clear benefit for vehicle design modifications, by improving both the speed of component reduction processes and the efficiency and accuracy of design iteration predictions, as compared to conventional substructuring approaches.
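The heart of the method is a projection of the interface DOFs onto a small wave basis. The Python sketch below illustrates the reduction with random stand-in matrices: an interface block of 200 DOFs collapses to a 12-by-12 block once expressed in 12 basis waves.

```python
# A minimal sketch of the core reduction idea: interface DOFs u_i are
# approximated as u_i ~ W q, where the columns of W are a small set of
# basis "waves" computed once from the nominal assembly. The matrices
# here are random stand-ins for a real stiffness partition.
import numpy as np

n_i, n_w = 200, 12                      # interface DOFs, retained waves
rng = np.random.default_rng(0)
K_ii = rng.standard_normal((n_i, n_i))
K_ii = K_ii @ K_ii.T                    # symmetric positive definite stand-in
W = np.linalg.qr(rng.standard_normal((n_i, n_w)))[0]  # orthonormal wave basis

# Galerkin projection: the interface block shrinks from 200x200 to 12x12,
# which is what makes repeated design iterations cheap.
K_red = W.T @ K_ii @ W
print(K_ii.shape, "->", K_red.shape)
```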
Designed tools for analysis of lithography patterns and nanostructures
NASA Astrophysics Data System (ADS)
Dervillé, Alexandre; Baderot, Julien; Bernard, Guilhem; Foucher, Johann; Grönqvist, Hanna; Labrosse, Aurélien; Martinez, Sergio; Zimmermann, Yann
2017-03-01
We introduce a set of designed tools for the analysis of lithography patterns and nanostructures. The classical metrological analysis of these objects has the drawbacks of being time consuming, requiring manual tuning, and lacking robustness and user friendliness. With the goal of improving the current situation, we propose new image processing tools at different levels: semi-automatic, automatic, and machine-learning-enhanced tools. The complete set of tools has been integrated into a software platform designed to transform the lab into a virtual fab. The underlying idea is to master nano processes at the research and development level by accelerating access to knowledge and hence speed up implementation in product lines.
Current development in selected stress and thermal analysis software interfaces with PRO-ENGINEER
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schulze, J.
1993-06-01
Ever since PRO-ENGINEER became a dominant CAD package available to the public, some of us have been saying, "Gee, if only I could export my geometry to a stress analysis program without having to recreate any of the details already created, wouldn't that be spectacular?" Well, much to the credit of the major stress and thermal analysis software vendors, some of them have been listening to design engineers like me badger them to furnish a seamless interface between PRO and their stress analysis programs. The down side is that a lot of problems still exist with most of the vendors and their interfaces. I want to discuss the interfaces that I feel are currently "state of the art", how they are developing, and the prospects for finally arriving at a transparent procedure that an engineer at a workstation can utilize in his or her design process. In years past, engineers would develop a design, and changes would evolve based on intuition or somebody else's critical evaluation. Then the design would be forwarded to the production group or the stress analysis group for further evaluation and analysis. Maybe data from a preliminary prototype would be collected and an evaluation report made. All of this took time and increased the cost of the item to be manufactured. Today, the engineer must assume responsibility for design and functional capability early in the design process, if for no other reason than the costs associated with diverse channels of critiquing. For that reason, one place to enhance the design process is the ability to do preliminary stress and thermal analysis during the initial design phase. This is both cost and time effective. But, as I am sure you are aware, this has been easier said than done.
Parametric Design and Mechanical Analysis of Beams based on SINOVATION
NASA Astrophysics Data System (ADS)
Xu, Z. G.; Shen, W. D.; Yang, D. Y.; Liu, W. M.
2017-07-01
In engineering practice, engineers must carry out complicated calculations when the loads on a beam are complex. These analyses and calculations take a lot of time, and the results are unreliable. Therefore, VS2005 and ADK were used to develop software for beam design based on the 3D CAD software SINOVATION, using the C++ programming language. The software can perform mechanical analysis and parameterized design of various types of beams and output the design report in HTML format. The efficiency and reliability of beam design are improved.
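As a flavor of the calculations such a tool automates, here is a short Python sketch (illustrative values only) of the classic maximum-bending-stress formula sigma = Mc/I for a simply supported rectangular beam under a central point load.

```python
# A minimal sketch of a beam strength check; values are illustrative,
# not from the paper, which implements such checks in C++ inside SINOVATION.
def max_bending_stress(P, L, b, h):
    """P: load [N], L: span [m], b: width [m], h: height [m]."""
    M = P * L / 4.0            # max bending moment at midspan
    I = b * h**3 / 12.0        # second moment of area, rectangle
    c = h / 2.0                # distance from neutral axis to extreme fiber
    return M * c / I           # sigma = M c / I [Pa]

print(f"{max_bending_stress(10e3, 2.0, 0.1, 0.2):.3e} Pa")  # 7.500e+06 Pa
```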
NASA Technical Reports Server (NTRS)
Afjeh, Abdollah A.; Reed, John A.
2003-01-01
This research is aimed at developing a new and advanced simulation framework that will significantly improve the overall efficiency of aerospace systems design and development. This objective will be accomplished through an innovative integration of object-oriented and Web-based technologies with both new and proven simulation methodologies. The basic approach involves three major areas of research: (1) aerospace system and component representation using a hierarchical object-oriented component model, which enables the use of multimodels and enforces component interoperability; (2) a collaborative software environment that streamlines the process of developing, sharing, and integrating aerospace design and analysis models; and (3) development of a distributed infrastructure that enables Web-based exchange of models to simplify the collaborative design process and to support computationally intensive aerospace design and analysis processes. Research for the first year dealt with the design of the basic architecture and supporting infrastructure, an initial implementation of that design, and a demonstration of its application to an example aircraft engine system simulation.
Analysis of Work Design in Rubber Processing Plant
NASA Astrophysics Data System (ADS)
Wahyuni, Dini; Nasution, Harmein; Budiman, Irwan; Wijaya, Khairini
2018-02-01
The work design illustrates how structured jobs, tasks, and roles are defined and modified, and their impact on individuals, groups, and organizations. If work is not designed well, the company must pay greater costs for workers' health, longer production processes, or even penalties for failing to meet the delivery schedule. This is visible in the conditions at a rubber processing factory in North Sumatra. Work design aspects such as layout, machinery and equipment, the workers' physical working environment, work methods, and organizational policies have not been well organized. Coagulum grinding machines for producing sheets were often damaged, resulting in four delayed product deliveries in 2016; workers submitted complaints of heat exposure; and workstations had not been properly arranged, all of which indicate the need for work design. The research data were collected through field observation and the distribution of questionnaires on aspects of work design. The analysis is based on the respondents' answers to the distributed questionnaire regarding the six aspects studied.
ERIC Educational Resources Information Center
d'Ham, Cedric; de Vries, Erica; Girault, Isabelle; Marzin, Patricia
2004-01-01
This paper deals with the design process of a remote laboratory for labwork in chemistry. In particular, it focuses on the mutual dependency of theoretical conjectures about learning in the experimental sciences and technological opportunities in creating learning environments. The design process involves a detailed analysis of the expert task and…
A Government/Industry Summary of the Design Analysis Methods for Vibrations (DAMVIBS) Program
NASA Technical Reports Server (NTRS)
Kvaternik, Raymond G. (Compiler)
1993-01-01
The NASA Langley Research Center in 1984 initiated a rotorcraft structural dynamics program, designated DAMVIBS (Design Analysis Methods for VIBrationS), with the objective of establishing the technology base needed by the rotorcraft industry for developing an advanced finite-element-based dynamics design analysis capability for vibrations. An assessment of the program showed that the DAMVIBS Program has resulted in notable technical achievements and major changes in industrial design practice, all of which have significantly advanced the industry's capability to use and rely on finite-element-based dynamics analyses during the design process.
The NASA Monographs on Shell Stability Design Recommendations: A Review and Suggested Improvements
NASA Technical Reports Server (NTRS)
Nemeth, Michael P.; Starnes, James H., Jr.
1998-01-01
A summary of existing NASA design criteria monographs for the design of buckling-resistant thin-shell structures is presented. Subsequent improvements in the analysis for nonlinear shell response are reviewed, and current issues in shell stability analysis are discussed. Examples of nonlinear shell responses that are not included in the existing shell design monographs are presented, and an approach for including reliability based analysis procedures in the shell design process is discussed. Suggestions for conducting future shell experiments are presented, and proposed improvements to the NASA shell design criteria monographs are discussed.
1988-10-01
Structured Analysis involves building a logical (non-physical) model of a system, using graphic techniques which enable users, analysts, and designers to... Design uses tools, especially graphic ones, to render systems readily understandable. Structured Design offers a set of strategies for...in the overall systems design process, and an overview of the assessment procedures, as well as a guide to the overall assessment.
Process and assembly plans for low cost commercial fuselage structure
NASA Technical Reports Server (NTRS)
Willden, Kurtis; Metschan, Stephen; Starkey, Val
1991-01-01
Cost and weight reduction for a composite structure results from selecting design concepts that can be built using efficient, low-cost manufacturing and assembly processes. Since design and manufacturing are inherently cost dependent, concurrent engineering in the form of a Design-Build Team (DBT) is essential for low-cost designs. Detailed cost analysis of DBT designs and hardware verification must be performed to identify the cost drivers and the relationships between design and manufacturing processes. Results from the global evaluation are used to quantitatively rank designs, identify cost centers for higher-ranking design concepts, define and prioritize a list of technical/economic issues and barriers, and identify parameters that control concept response. These results are then used for final design optimization.
Baumgart, André; Denz, Christof; Bender, Hans-Joachim; Schleppers, Alexander
2009-01-01
The complexity of the operating room (OR) requires that both structural (eg, department layout) and behavioral (eg, staff interactions) patterns of work be considered when developing quality improvement strategies. In our study, we investigated how these contextual factors influence outpatient OR processes and the quality of care delivered. The study setting was a German university-affiliated hospital performing approximately 6000 outpatient surgeries annually. During the 3-year-study period, the hospital significantly changed its outpatient OR facility layout from a decentralized (ie, ORs in adjacent areas of the building) to a centralized (ie, ORs in immediate vicinity of each other) design. To study the impact of the facility change on OR processes, we used a mixed methods approach, including process analysis, process modeling, and social network analysis of staff interactions. The change in facility layout was seen to influence OR processes in ways that could substantially affect patient outcomes. For example, we found a potential for more errors during handovers in the new centralized design due to greater interdependency between tasks and staff. Utilization of the mixed methods approach in our analysis, as compared with that of a single assessment method, enabled a deeper understanding of the OR work context and its influence on outpatient OR processes.
Design techniques for low-voltage analog integrated circuits
NASA Astrophysics Data System (ADS)
Rakús, Matej; Stopjaková, Viera; Arbet, Daniel
2017-08-01
In this paper, a review and analysis of different design techniques for (ultra) low-voltage integrated circuits (IC) are performed. This analysis shows that the most suitable design methods for low-voltage analog IC design in a standard CMOS process include techniques using bulk-driven MOS transistors, dynamic threshold MOS transistors and MOS transistors operating in weak or moderate inversion regions. The main advantage of such techniques is that there is no need for any modification of standard CMOS structure or process. Basic circuit building blocks like differential amplifiers or current mirrors designed using these approaches are able to operate with the power supply voltage of 600 mV (or even lower), which is the key feature towards integrated systems for modern portable applications.
Design Process of Flight Vehicle Structures for a Common Bulkhead and an MPCV Spacecraft Adapter
NASA Technical Reports Server (NTRS)
Aggarwal, Pravin; Hull, Patrick V.
2015-01-01
Designing and manufacturing space flight vehicle structures is a skill set that has grown considerably at NASA over the last several years. Beginning with the Ares program and followed by the Space Launch System (SLS), in-house designs were produced for both the Upper Stage and the SLS Multipurpose Crew Vehicle (MPCV) spacecraft adapter. Specifically, critical design review (CDR) level analysis and flight production drawings were produced for the above-mentioned hardware. The experience of this in-house design work led to increased manufacturing infrastructure for both Marshall Space Flight Center (MSFC) and the Michoud Assembly Facility (MAF), improved skill sets in both analysis and design, and hands-on experience in building and testing full-scale (MSA) hardware. The hardware design and development processes, from initiation to CDR and finally flight, resulted in many challenges and experiences that produced valuable lessons. This paper builds on NASA's recent experiences in designing and fabricating flight hardware and examines the design/development processes used, as well as the challenges and lessons learned, from the initial design, loads estimation, and mass constraints, to structural optimization and affordability, to the release of production drawings and hardware manufacturing. While there are many documented design processes that a design engineer can follow, these unique experiences offer insight into designing hardware in current program environments and present solutions to many of the challenges experienced by the engineering team.
How system designers think: a study of design thinking in human factors engineering.
Papantonopoulos, Sotiris
2004-11-01
The paper presents a descriptive study of design thinking in human factors engineering. The objective of the study is to analyse the role of interpretation in design thinking and the role of design practice in guiding interpretation. The study involved 10 system designers undertaking the allocation of cognitive functions in three production planning and control task scenarios. Allocation decisions were recorded and verbal protocols of the design process were collected to elicit the subjects' thought processes. Verbal protocol analysis showed that subjects carried out the design of cognitive task allocation as a problem of applying a selected automation technology from their initial design deliberations. This design strategy stands in contrast to the predominant view of system design that stipulates that user requirements should be thoroughly analysed prior to making any decisions about technology. Theoretical frameworks from design research and ontological design showed that the system design process may be better understood by recognizing the role of design hypotheses in system design, as well as the diverse interactions between interpretation and practice, means and ends, and design practice and the designer's pre-understanding which shape the design process. Ways to balance the bias exerted on the design process were discussed.
A SYSTEMATIC PROCEDURE FOR DESIGNING PROCESSES WITH MULTIPLE ENVIRONMENTAL OBJECTIVES
Evaluation and analysis of multiple objectives are very important in designing environmentally benign processes. They require a systematic procedure for solving multi-objective decision-making problems due to the complex nature of the problems and the need for complex assessment....
Double jeopardy in inferring cognitive processes
Fific, Mario
2014-01-01
Inferences we make about underlying cognitive processes can be jeopardized in two ways due to problematic forms of aggregation. First, averaging across individuals is typically considered a very useful tool for removing random variability. The threat is that averaging across subjects leads to averaging across different cognitive strategies, thus harming our inferences. The second threat comes from the construction of inadequate research designs possessing a low diagnostic accuracy for cognitive processes. For that reason we introduced the systems factorial technology (SFT), which has primarily been designed to make inferences about underlying processing order (serial, parallel, coactive), stopping rule (terminating, exhaustive), and process dependency. SFT proposes that the minimal research design complexity to learn about n cognitive processes should be equal to 2^n. In addition, SFT proposes that (a) each cognitive process should be controlled by a separate experimental factor, and (b) the saliency levels of all factors should be combined in a full factorial design. In the current study, the author cross-combined the levels of the jeopardies in a 2 × 2 analysis, leading to four different analysis conditions. The results indicate a decline in the diagnostic accuracy of inferences made about cognitive processes due to the presence of each jeopardy in isolation and when combined. The results warrant the development of more individual-subject analyses and the utilization of full-factorial (SFT) experimental designs. PMID:25374545
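The 2^n design rule is easy to make concrete. The Python sketch below, with hypothetical process names, builds the full factorial combination of two saliency levels per factor and confirms that n processes yield 2^n experimental cells.

```python
# A minimal sketch of the SFT design rule: one two-level saliency factor
# per hypothesized process gives a full factorial design with 2**n cells.
from itertools import product

processes = ["visual_search", "memory_retrieval", "decision"]  # hypothetical
levels = ["low", "high"]

design = list(product(levels, repeat=len(processes)))
assert len(design) == 2 ** len(processes)   # 2^n conditions for n processes
for cell in design:
    print(dict(zip(processes, cell)))
```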
Incorporating Handling Qualities Analysis into Rotorcraft Conceptual Design
NASA Technical Reports Server (NTRS)
Lawrence, Ben
2014-01-01
This paper describes the initial development of a framework to incorporate handling qualities analyses into a rotorcraft conceptual design process. In particular, the paper describes how rotorcraft conceptual design level data can be used to generate flight dynamics models for handling qualities analyses. Also, methods are described that couple a basic stability augmentation system to the rotorcraft flight dynamics model to extend the analysis beyond that of the bare airframe. A methodology for calculating the handling qualities characteristics of the flight dynamics models and for comparing the results to ADS-33E criteria is described. Preliminary results from the application of the handling qualities analysis for variations in key rotorcraft design parameters of main rotor radius, blade chord, hub stiffness and flap moment of inertia are shown. Varying relationships, with counteracting trends for different handling qualities criteria and different flight speeds, are exhibited, with the action of the control system playing a complex part in the outcomes. Overall, the paper demonstrates how a broad array of technical issues across flight dynamics stability and control, simulation and modeling, control law design, and handling qualities testing and evaluation had to be confronted to implement even a moderately comprehensive handling qualities analysis of relatively low-fidelity models. A key outstanding issue is how to 'close the loop' with an overall design process, and options for exploring how to feed handling qualities results back to a conceptual design process are proposed for future work.
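One piece of such a framework can be suggested in miniature. The Python sketch below closes a simple feedback loop around a two-state stand-in airframe model and screens the closed-loop damping ratio against a notional boundary; the model, gains, and the 0.35 threshold are illustrative and not taken from ADS-33E or the paper.

```python
# A minimal sketch of the "bare airframe + basic stability augmentation"
# step; all numbers are stand-ins.
import numpy as np

A = np.array([[0.0, 1.0],
              [-4.0, 0.2]])          # lightly damped pitch-like mode (stand-in)
B = np.array([[0.0], [1.0]])
K = np.array([[2.0, 1.5]])           # assumed rate/attitude feedback gains

eig = np.linalg.eigvals(A - B @ K)   # closed-loop eigenvalues
zeta = -eig.real / np.abs(eig)       # damping ratio of each mode
print("closed-loop damping ratios:", zeta.round(2))
print("meets notional Level 1 boundary:", bool((zeta > 0.35).all()))
```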
Closed Loop Requirements and Analysis Management
NASA Technical Reports Server (NTRS)
Lamoreaux, Michael; Verhoef, Brett
2015-01-01
Effective systems engineering involves the use of analysis in the derivation of requirements and in the verification of designs against those requirements. The initial development of requirements often depends on analysis for the technical definition of specific aspects of a product. Following the allocation of system-level requirements to a product's components, the closure of those requirements often involves analytical approaches to verify that the requirement criteria have been satisfied. Meanwhile, changes that occur between these two processes need to be managed in order to achieve a closed-loop requirement derivation/verification process. Herein are presented concepts for employing emerging Teamcenter capabilities to jointly manage requirements and analysis data such that analytical techniques are utilized to effectively derive and allocate requirements, analyses are consulted and updated during the change evaluation process, and analyses are leveraged during the design verification process. Recommendations on concept validation case studies are also discussed.
Interactive computer graphics system for structural sizing and analysis of aircraft structures
NASA Technical Reports Server (NTRS)
Bendavid, D.; Pipano, A.; Raibstein, A.; Somekh, E.
1975-01-01
A computerized system for preliminary sizing and analysis of aircraft wing and fuselage structures was described. The system is based upon repeated application of analytical program modules, which are interactively interfaced and sequence-controlled during the iterative design process with the aid of design-oriented graphics software modules. The entire process is initiated and controlled via low-cost interactive graphics terminals driven by a remote computer in a time-sharing mode.
Master of Puppets: Cooperative Multitasking for In Situ Processing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morozov, Dmitriy; Lukic, Zarija
2016-01-01
Modern scientific and engineering simulations track the time evolution of billions of elements. For such large runs, storing most time steps for later analysis is not a viable strategy. It is far more efficient to analyze the simulation data while it is still in memory. Here, we present a novel design for running multiple codes in situ: using coroutines and position-independent executables we enable cooperative multitasking between simulation and analysis, allowing the same executables to post-process simulation output, as well as to process it on the fly, both in situ and in transit. We present Henson, an implementation of our design, and illustrate its versatility by tackling analysis tasks with different computational requirements. This design differs significantly from the existing frameworks and offers an efficient and robust approach to integrating multiple codes on modern supercomputers. The techniques we present can also be integrated into other in situ frameworks.
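The cooperative-multitasking pattern can be mimicked with Python generators, though Henson itself operates on compiled, position-independent executables. In the sketch below the simulation yields control after each time step and the analysis consumes the state while it is still in memory.

```python
# A minimal sketch of cooperative multitasking between a simulation and an
# in situ analysis; the time-stepping and statistic are stand-ins.
def simulation(n_steps):
    state = 0.0
    for step in range(n_steps):
        state += 1.5                      # stand-in for a time step
        yield step, state                 # hand control to the analysis

def analysis(stream):
    running_max = float("-inf")
    for step, state in stream:            # data is consumed while in memory,
        running_max = max(running_max, state)   # never written to disk
        print(f"step {step}: max so far = {running_max}")

analysis(simulation(n_steps=3))
```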
NASA Technical Reports Server (NTRS)
Franck, Bruno M.
1990-01-01
The research is focused on automating the evaluation of complex structural systems, whether for the design of a new system or the analysis of an existing one, by developing new structural analysis techniques based on qualitative reasoning. The problem is to identify and better understand: (1) the requirements for the automation of design, and (2) the qualitative reasoning associated with the conceptual development of a complex system. The long-term objective is to develop an integrated design-risk assessment environment for the evaluation of complex structural systems. The scope of this short presentation is to describe the design and cognition components of the research. Design has received special attention in cognitive science because it is now identified as a problem solving activity that is different from other information processing tasks (1). Before an attempt can be made to automate design, a thorough understanding of the underlying design theory and methodology is needed, since the design process is, in many cases, multi-disciplinary, complex in size and motivation, and uses various reasoning processes involving different kinds of knowledge in ways which vary from one context to another. The objective is to unify all the various types of knowledge under one framework of cognition. This presentation focuses on the cognitive science framework that we are using to represent the knowledge aspects associated with the human mind's abstraction abilities and how we apply it to the engineering knowledge and engineering reasoning in design.
Empirical OPC rule inference for rapid RET application
NASA Astrophysics Data System (ADS)
Kulkarni, Anand P.
2006-10-01
A given technological node (45 nm, 65 nm) can be expected to process thousands of individual designs. Iterative methods applied at the node consume valuable days in determining the proper placement of OPC features and in manufacturing and testing mask correspondence to wafer patterns in a trial-and-error fashion for each design. Repeating this fabrication process for each individual design is time-consuming and expensive. We present a novel technique which sidesteps the requirement to iterate through the model-based OPC analysis and pattern verification cycle on subsequent designs at the same node. Our approach relies on the inference of rules from a correct pattern at the wafer surface as it relates to the OPC and pre-OPC pattern layout files. We begin with an offline phase where we obtain a "gold standard" design file that has been fab-tested at the node, with a prepared post-OPC layout file that corresponds to the intended on-wafer pattern. We then run an offline analysis to infer the rules to be used in this method. During the analysis, our method implicitly identifies contextual OPC strategies for optimal placement of RET features on any design at that node. Using these strategies, we can apply OPC to subsequent designs at the same node with accuracy comparable to the original design file but with significantly smaller expected runtimes. The technique promises to offer a rapid and accurate complement to existing RET application strategies.
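The offline rule-inference phase might be caricatured as building a lookup table from observed corrections. The Python sketch below, with hypothetical pattern contexts and bias values, averages the edge bias seen in the gold-standard pre-/post-OPC pair per context and applies the resulting rules to a new design.

```python
# A minimal sketch of rule inference from a gold-standard design;
# contexts and bias values are hypothetical.
from collections import defaultdict

# (local pattern context, observed OPC edge bias in nm) from the gold design
gold_pairs = [("line_end", 12), ("dense_line", 3), ("iso_line", 7),
              ("line_end", 14), ("dense_line", 3)]

rules = defaultdict(list)
for context, bias in gold_pairs:
    rules[context].append(bias)
# Collapse observations into one rule per context (mean bias).
rules = {ctx: sum(v) / len(v) for ctx, v in rules.items()}

# Apply inferred rules to features of a new design at the same node.
new_design = ["iso_line", "line_end"]
print([(f, rules[f]) for f in new_design])   # [('iso_line', 7.0), ('line_end', 13.0)]
```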
A State Space Modeling Approach to Mediation Analysis
ERIC Educational Resources Information Center
Gu, Fei; Preacher, Kristopher J.; Ferrer, Emilio
2014-01-01
Mediation is a causal process that evolves over time. Thus, a study of mediation requires data collected throughout the process. However, most applications of mediation analysis use cross-sectional rather than longitudinal data. Another implicit assumption commonly made in longitudinal designs for mediation analysis is that the same mediation…
Cook, Richard J; Wei, Wei
2003-07-01
The design of clinical trials is typically based on marginal comparisons of a primary response under two or more treatments. The considerable gains in efficiency afforded by models conditional on one or more baseline responses has been extensively studied for Gaussian models. The purpose of this article is to present methods for the design and analysis of clinical trials in which the response is a count or a point process, and a corresponding baseline count is available prior to randomization. The methods are based on a conditional negative binomial model for the response given the baseline count and can be used to examine the effect of introducing selection criteria on power and sample size requirements. We show that designs based on this approach are more efficient than those proposed by McMahon et al. (1994).
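The underlying data-generating model is straightforward to simulate. The Python sketch below, with illustrative coefficients not taken from the paper, draws a baseline count, forms a conditional mean that depends on baseline and treatment, and generates the negative binomial response as a gamma-Poisson mixture.

```python
# A minimal sketch of a baseline-conditional negative binomial model;
# all coefficients are illustrative, not from Cook & Wei (2003).
import numpy as np

rng = np.random.default_rng(1)
n = 500
baseline = rng.poisson(4.0, size=n)          # pre-randomization counts
trt = rng.integers(0, 2, size=n)             # randomized treatment arm

beta0, beta_base, beta_trt, k = 0.2, 0.15, -0.4, 2.0
mu = np.exp(beta0 + beta_base * baseline + beta_trt * trt)

# Negative binomial response via a gamma-Poisson mixture with shape k.
y = rng.poisson(mu * rng.gamma(k, 1.0 / k, size=n))
print("mean count, control:", y[trt == 0].mean().round(2),
      "treated:", y[trt == 1].mean().round(2))
```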
Finite element analysis as a design tool for thermoplastic vulcanizate glazing seals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gase, K.M.; Hudacek, L.L.; Pesevski, G.T.
1998-12-31
There are three materials that are commonly used in commercial glazing seals: EPDM, silicone and thermoplastic vulcanizates (TPVs). TPVs are a high performance class of thermoplastic elastomers (TPEs), where TPEs have elastomeric properties with thermoplastic processability. TPVs have emerged as materials well suited for use in glazing seals due to ease of processing, economics and part design flexibility. The part design and development process is critical to ensure that the chosen TPV provides economics, quality and function in demanding environments. In the design and development process, there is great value in utilizing dual durometer systems to capitalize on the benefits of soft and rigid materials. Computer-aided design tools, such as Finite Element Analysis (FEA), are effective in minimizing development time and predicting system performance. Examples of TPV glazing seals will illustrate the benefits of utilizing FEA to take full advantage of the material characteristics, which results in functional performance and quality while reducing development iterations. FEA will be performed on two glazing seal profiles to confirm optimum geometry.
NASA Technical Reports Server (NTRS)
Singh, S. P.
1979-01-01
The computer software developed to set up a method for Wiener spectrum analysis of photographic films is presented. This method is used for the quantitative analysis of the autoradiographic enhancement process. The software requirements and design for the autoradiographic enhancement process are given along with the program listings and the user's manual. A software description and program listing modifications of the data analysis software are included.
Sustainable Design Approach: A case study of BIM use
NASA Astrophysics Data System (ADS)
Abdelhameed, Wael
2017-11-01
Achieving sustainable design in areas such as energy-efficient design depends largely on the accuracy of the analysis performed after the design is completed with all its components and material details. There are different analysis approaches and methods that predict relevant values and metrics such as U-value, energy use and energy savings. Although certain differences in the accuracy of these approaches and methods have been recorded, this research paper does not focus on that matter, since determining the reason for discrepancies between those approaches and methods is difficult because all error sources act simultaneously. The paper instead introduces an approach through which BIM (building information modelling) can be utilised during the initial phases of the design process, by analysing the values and metrics of sustainable design before going into the design details of a building. Managing all of the project drawings in a single file, BIM is well known as a digital platform that offers a multidisciplinary detailed design, the AEC model (Barison and Santos, 2010; Welle et al., 2011). The paper presents BIM use in the early phases of the design process in general, in order to achieve certain required areas of sustainable design, and proceeds to introduce BIM use in specific areas such as site selection, wind velocity and building orientation, in terms of reaching the most sustainable solution possible. In the initial phases of designing, material details and building components are not yet fully specified or selected; the designer usually focuses on zoning, topology, circulation, and other design requirements. The proposed approach employs the strategies and analysis of BIM use during those initial design phases in order to obtain the analysis and results for each solution or alternative design. Stakeholders and designers would then have a more effective decision-making process, with full clarity about each alternative's consequences, and the architect could settle on and proceed with the alternative showing the best sustainability analysis. In later design stages, using sustainable types of materials, such as insulation and cladding, and applying sustainable building components, such as doors and windows, would add further improvements in reaching better values and metrics. The paper describes the methodology of this design approach through the BIM strategies adopted in design creation. Case studies of architectural designs are used to highlight the details and benefits of the proposed approach.
Analysis and Test Correlation of Proof of Concept Box for Blended Wing Body-Low Speed Vehicle
NASA Technical Reports Server (NTRS)
Spellman, Regina L.
2003-01-01
The Low Speed Vehicle (LSV) is a 14.2% scale remotely piloted vehicle of the revolutionary Blended Wing Body concept. The design of the LSV includes an all-composite airframe. Due to internal manufacturing capability restrictions, room temperature layups were necessary. An extensive materials testing and manufacturing process development effort was undertaken to establish a process that would achieve the high modulus/low weight properties required to meet the design requirements. The analysis process involved a loads development effort that incorporated aero loads to determine internal forces that could be applied to a traditional FEM of the vehicle and to conduct detailed component analyses. A new tool, Hypersizer, was added to the design process to address various composite failure modes and to optimize the skin panel thickness of the upper and lower skins for the vehicle. The analysis required an iterative approach as material properties were continually changing. As a part of the material characterization effort, test articles, including a proof of concept wing box and a full-scale wing, were fabricated. The proof of concept box was fabricated based on very preliminary material studies and tested in bending, torsion, and shear. The box was then tested to failure under shear. The proof of concept box was also analyzed using Nastran and Hypersizer, and the results of both analyses were scaled to determine the predicted failure load. The test results were compared to both the Nastran and Hypersizer analytical predictions. The actual failure occurred at 899 lbs. The failure was predicted at 1167 lbs based on the Nastran analysis; the Hypersizer analysis predicted a lower failure load of 960 lbs. The Nastran analysis alone was not sufficient to predict the failure load because it does not identify local composite failure modes. This analysis has traditionally been done using closed-form solutions. Although Hypersizer is typically used as an optimizer for the design process, the failure prediction was used to help gain acceptance and confidence in this new tool. The correlated models and process were to be used to analyze the full BWB-LSV airframe design. The analysis and correlation with test results of the proof of concept box is presented here, including the comparison of the Nastran and Hypersizer results.
Lefkoff, L.J.; Gorelick, S.M.
1986-01-01
Detailed two-dimensional flow simulation of a complex ground-water system is combined with quadratic and linear programming to evaluate design alternatives for rapid aquifer restoration. Results show how treatment and pumping costs depend dynamically on the type of treatment process, the capacity of pumping and injection wells, and the number of wells. The design for an inexpensive treatment process minimizes pumping costs, while an expensive process results in the minimization of treatment costs. Substantial reductions in pumping costs occur with increases in injection capacity or in the number of wells. Treatment costs are reduced by expansions in pumping capacity or injection capacity. The analysis identifies maximum pumping and injection capacities. -from Authors
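The optimization component can be illustrated with a toy linear program. The sketch below uses made-up cost coefficients and a single aggregate cleanup constraint; the actual study couples the optimization to a detailed two-dimensional flow simulation, which is not reproduced here.

```python
# Minimal LP sketch of the pumping-vs-treatment cost tradeoff (illustrative
# coefficients only; the study couples the program to a 2-D flow model).
from scipy.optimize import linprog

# Decision variables: pumping rates at three extraction wells (m^3/day).
pump_cost = [1.0, 1.2, 0.9]          # $ per m^3 pumped
treat_cost_per_unit = 2.5            # $ per m^3 treated (all pumped water)
c = [p + treat_cost_per_unit for p in pump_cost]

# Constraint: total extraction must meet a cleanup target rate.
A_ub = [[-1.0, -1.0, -1.0]]
b_ub = [-300.0]                      # at least 300 m^3/day overall
bounds = [(0, 150)] * 3              # per-well capacity limits

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res.x, res.fun)                # optimal rates and total daily cost
```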
Power processing methodology. [computerized design of spacecraft electric power systems
NASA Technical Reports Server (NTRS)
Fegley, K. A.; Hansen, I. G.; Hayden, J. H.
1974-01-01
Discussion of the interim results of a program to investigate the feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems. The object of the total program is to develop a flexible engineering tool which will allow the power processor designer to effectively and rapidly assess and analyze the tradeoffs available by providing, in one comprehensive program, a mathematical model, an analysis of expected performance, simulation, and a comparative evaluation with alternative designs. This requires an understanding of electrical power source characteristics and the effects of load control, protection, and total system interaction.
Tang, Qi-Yi; Zhang, Chuan-Xi
2013-04-01
A comprehensive but simple-to-use software package called DPS (Data Processing System) has been developed to execute a range of standard numerical analyses and operations used in experimental design, statistics and data mining. This program runs on standard Windows computers. Many of the functions are specific to entomological and other biological research and are not found in standard statistical software. This paper presents applications of DPS to experimental design, statistical analysis and data mining in entomology. © 2012 The Authors Insect Science © 2012 Institute of Zoology, Chinese Academy of Sciences.
Merlyn J. Paulson
1979-01-01
This paper outlines a project level process (V.I.S.) which utilizes very accurate and flexible computer algorithms in combination with contemporary site analysis and design techniques for visual evaluation, design and management. The process provides logical direction and connecting bridges through problem identification, information collection and verification, visual...
Tao, Ling; Aden, Andy; Elander, Richard T; Pallapolu, Venkata Ramesh; Lee, Y Y; Garlock, Rebecca J; Balan, Venkatesh; Dale, Bruce E; Kim, Youngmi; Mosier, Nathan S; Ladisch, Michael R; Falls, Matthew; Holtzapple, Mark T; Sierra, Rocio; Shi, Jian; Ebrik, Mirvat A; Redmond, Tim; Yang, Bin; Wyman, Charles E; Hames, Bonnie; Thomas, Steve; Warner, Ryan E
2011-12-01
Six biomass pretreatment processes to convert switchgrass to fermentable sugars and ultimately to cellulosic ethanol are compared on a consistent basis in this technoeconomic analysis. The six pretreatment processes are ammonia fiber expansion (AFEX), dilute acid (DA), lime, liquid hot water (LHW), soaking in aqueous ammonia (SAA), and sulfur dioxide-impregnated steam explosion (SO2). Each pretreatment process is modeled in the framework of an existing biochemical design model so that systematic variations of process-related changes are consistently captured. The pretreatment area process design and simulation are based on the research data generated within the Biomass Refining Consortium for Applied Fundamentals and Innovation (CAFI) 3 project. Overall ethanol production, total capital investment, and minimum ethanol selling price (MESP) are reported along with selected sensitivity analysis. The results show limited differentiation between the projected economic performances of the pretreatment options, except for processes that exhibit significantly lower monomer sugar and resulting ethanol yields. Copyright © 2011 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Yildirim, Nilay
2013-01-01
This cross-case study examines the relationships between game design attributes and collaborative problem solving process in the context of multi-player video games. The following game design attributes: sensory stimuli elements, level of challenge, and presentation of game goals and rules were examined to determine their influence on game…
NASA Astrophysics Data System (ADS)
Frits, Andrew P.
In the current Navy environment of undersea weapons development, the engineering aspect of design is decoupled from the development of the tactics with which the weapon is employed. Tactics are developed by intelligence experts, warfighters, and wargamers, while torpedo design is handled by engineers and contractors. This dissertation examines methods by which the conceptual design process of undersea weapon systems, including both torpedo systems and mine counter-measure systems, can be improved. It is shown that by simultaneously designing the torpedo and the tactics with which undersea weapons are used, a more effective overall weapon system can be created. In addition to integrating torpedo tactics with design, the thesis also looks at design methods to account for uncertainty. The uncertainty is attributable to multiple sources, including: lack of detailed analysis tools early in the design process, incomplete knowledge of the operational environments, and uncertainty in the performance of potential technologies. A robust design process is introduced to account for this uncertainty in the analysis and optimization of torpedo systems through the combination of Monte Carlo simulation with response surface methodology and metamodeling techniques. Additionally, various other methods that are appropriate to uncertainty analysis are discussed and analyzed. The thesis also advances a new approach towards examining robustness and risk: the treatment of probability of success (POS) as an independent variable. Examining the cost and performance tradeoffs between high and low probability of success designs, the decision-maker can make better informed decisions as to what designs are most promising and determine the optimal balance of risk, cost, and performance. Finally, the thesis examines the use of non-dimensionalization of parameters for torpedo design. The thesis shows that the use of non-dimensional torpedo parameters leads to increased knowledge about the scaleability of torpedo systems and increased performance of Designs of Experiments.
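The robust design loop described, Monte Carlo simulation over a response surface metamodel, can be sketched compactly. The stand-in performance function, sample sizes, and the probability-of-success threshold below are all illustrative, not taken from the dissertation.

```python
# Hedged sketch of the robust-design loop: fit a cheap quadratic response
# surface to a few expensive evaluations, then Monte Carlo the metamodel.
import numpy as np

def expensive_model(x):              # stand-in for a torpedo performance code
    return 10 - (x[0] - 1) ** 2 - 0.5 * (x[1] + 0.5) ** 2

rng = np.random.default_rng(1)
X = rng.uniform(-2, 2, size=(30, 2))                 # design of experiments
y = np.array([expensive_model(x) for x in X])

# Quadratic response surface terms: 1, x1, x2, x1^2, x2^2, x1*x2
def features(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

beta, *_ = np.linalg.lstsq(features(X), y, rcond=None)

# Monte Carlo over uncertain inputs using the cheap metamodel.
samples = rng.normal([1.0, -0.5], [0.3, 0.2], size=(10000, 2))
perf = features(samples) @ beta
print(perf.mean(), perf.std(), np.mean(perf > 9.0))  # incl. probability of success
```

Treating the final quantity (the fraction of sampled futures exceeding a performance floor) as an independent design variable mirrors the probability-of-success tradeoff the dissertation advances.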
Automating expert role to determine design concept in Kansei Engineering
NASA Astrophysics Data System (ADS)
Lokman, Anitawati Mohd; Haron, Mohammad Bakri Che; Abidin, Siti Zaleha Zainal; Khalid, Noor Elaiza Abd
2016-02-01
Affect has become imperative to product quality. In the affective design field, Kansei Engineering (KE) has been recognized as a technology that enables the discovery of consumers' emotions and the formulation of guidelines for designing products that win consumers in the competitive market. Albeit a powerful technology, KE has no rule of thumb for its analysis and interpretation process: KE expertise is required to determine sets of related Kansei and the significant concept of emotion, and many research endeavors are handicapped by the limited number of available and accessible KE experts. This work simulates the role of experts with the use of the Natphoric algorithm, thus providing a sound solution to the complexity and flexibility of KE. The algorithm is designed to learn the process by implementing training datasets taken from previous KE research works. A framework for automated KE is then designed to realize the development of an automated KE system. A comparative analysis is performed to determine the feasibility of the developed prototype to automate the process. The results show that the significant Kansei determined by manual KE implementation and by the automated process are highly similar. KE research advocates will benefit from this system to automatically determine significant design concepts.
Best Manufacturing Practices Survey Conducted at Litton Data Systems Division, Van Nuys, California
1988-10-01
[Fragmentary excerpts from the survey report: a Design Release section covering Engineering Change Order Processing and Analysis, and a note that the network is structured using bridges to isolate local traffic, with long-term plans calling for a wide-band network.]
2017-06-01
A designed experiment to model and explore a ship-to-shore logistics process supporting dispersed units via three types of ULSs, which vary primarily in... Keywords: systems, simulation, discrete event simulation, design of experiments, data analysis, simplekit, nearly orthogonal and balanced designs.
The opto-mechanical design process: from vision to reality
NASA Astrophysics Data System (ADS)
Kvamme, E. Todd; Stubbs, David M.; Jacoby, Michael S.
2017-08-01
The design process for an opto-mechanical sub-system is discussed from requirements development through test. The process begins with a proper mission understanding and the development of requirements for the system. Preliminary design activities are then discussed with iterative analysis and design work being shared between the design, thermal, and structural engineering personnel. Readiness for preliminary review and the path to a final design review are considered. The value of prototyping and risk mitigation testing is examined with a focus on when it makes sense to execute a prototype test program. System level margin is discussed in general terms, and the practice of trading margin in one area of performance to meet another area is reviewed. Requirements verification and validation is briefly considered. Testing and its relationship to requirements verification concludes the design process.
Discrete event simulation tool for analysis of qualitative models of continuous processing systems
NASA Technical Reports Server (NTRS)
Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)
1990-01-01
An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.
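The core simulation loop described, executing time-delayed events from a queue until it empties, reduces to a few lines. The sketch below is a minimal illustration of that event-queue schema, not the patented tool's implementation; component names and delays are invented.

```python
# Minimal discrete event loop in the spirit of the tool: events are popped
# in time order until the queue empties (names and timings illustrative).
import heapq

events = []                                   # the event queue
def schedule(time, action):
    heapq.heappush(events, (time, action.__name__, action))

def valve_opens():
    print("t=2.0  valve opens")
    schedule(5.0, tank_fills)                 # effect statement with time delay

def tank_fills():
    print("t=5.0  tank reaches level, mode transition")

schedule(2.0, valve_opens)
while events:                                 # run until the queue is emptied
    t, _, action = heapq.heappop(events)
    action()
```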
The design of an m-Health monitoring system based on a cloud computing platform
NASA Astrophysics Data System (ADS)
Xu, Boyi; Xu, Lida; Cai, Hongming; Jiang, Lihong; Luo, Yang; Gu, Yizhi
2017-01-01
Compared to traditional medical services provided within hospitals, m-Health monitoring systems (MHMSs) face more challenges in personalised health data processing. To achieve personalised and high-quality health monitoring by means of new technologies, such as mobile network and cloud computing, in this paper, a framework of an m-Health monitoring system based on a cloud computing platform (Cloud-MHMS) is designed to implement pervasive health monitoring. Furthermore, the modules of the framework, which are Cloud Storage and Multiple Tenants Access Control Layer, Healthcare Data Annotation Layer, and Healthcare Data Analysis Layer, are discussed. In the data storage layer, a multiple tenant access method is designed to protect patient privacy. In the data annotation layer, linked open data are adopted to augment health data interoperability semantically. In the data analysis layer, the process mining algorithm and similarity calculating method are implemented to support personalised treatment plan selection. These three modules cooperate to implement the core functions in the process of health monitoring, which are data storage, data processing, and data analysis. Finally, we study the application of our architecture in the monitoring of antimicrobial drug usage to demonstrate the usability of our method in personal healthcare analysis.
NASA Technical Reports Server (NTRS)
Ables, Brett
2014-01-01
Multi-stage launch vehicles with solid rocket motors (SRMs) face design optimization challenges, especially when the mission scope changes frequently. Significant performance benefits can be realized if the solid rocket motors are optimized to the changing requirements. While SRMs represent a fixed performance at launch, rapid design iterations enable flexibility at design time, yielding significant performance gains. The streamlining and integration of SRM design and analysis can be achieved with improved analysis tools. While powerful and versatile, the Solid Performance Program (SPP) is not conducive to rapid design iteration. Performing a design iteration with SPP and a trajectory solver is a labor intensive process. To enable a better workflow, SPP, the Program to Optimize Simulated Trajectories (POST), and the interfaces between them have been improved and automated, and a graphical user interface (GUI) has been developed. The GUI enables real-time visual feedback of grain and nozzle design inputs, enforces parameter dependencies, removes redundancies, and simplifies manipulation of SPP and POST's numerous options. Automating the analysis also simplifies batch analyses and trade studies. Finally, the GUI provides post-processing, visualization, and comparison of results. Wrapping legacy high-fidelity analysis codes with modern software provides the improved interface necessary to enable rapid coupled SRM ballistics and vehicle trajectory analysis. Low cost trade studies demonstrate the sensitivities of flight performance metrics to propulsion characteristics. Incorporating high fidelity analysis from SPP into vehicle design reduces performance margins and improves reliability. By flying an SRM designed with the same assumptions as the rest of the vehicle, accurate comparisons can be made between competing architectures. In summary, this flexible workflow is a critical component to designing a versatile launch vehicle model that can accommodate a volatile mission scope.
Read, Gemma J M; Salmon, Paul M; Lenné, Michael G
2016-09-01
The Cognitive Work Analysis Design Toolkit (CWA-DT) is a recently developed approach that provides guidance and tools to assist in applying the outputs of CWA to design processes to incorporate the values and principles of sociotechnical systems theory. In this paper, the CWA-DT is evaluated based on an application to improve safety at rail level crossings. The evaluation considered the extent to which the CWA-DT met pre-defined methodological criteria and aligned with sociotechnical values and principles. Both process and outcome measures were taken based on the ratings of workshop participants and human factors experts. Overall, workshop participants were positive about the process and indicated that it met the methodological criteria and sociotechnical values. However, expert ratings suggested that the CWA-DT achieved only limited success in producing RLX designs that fully aligned with the sociotechnical approach. Discussion about the appropriateness of the sociotechnical approach in a public safety context is provided. Practitioner Summary: Human factors and ergonomics practitioners need evidence of the effectiveness of methods. A design toolkit for cognitive work analysis, incorporating values and principles from sociotechnical systems theory, was applied to create innovative designs for rail level crossings. Evaluation results based on the application are provided and discussed.
Structural Analysis in a Conceptual Design Framework
NASA Technical Reports Server (NTRS)
Padula, Sharon L.; Robinson, Jay H.; Eldred, Lloyd B.
2012-01-01
Supersonic aircraft designers must shape the outer mold line of the aircraft to improve multiple objectives, such as mission performance, cruise efficiency, and sonic-boom signatures. Conceptual designers have demonstrated an ability to assess these objectives for a large number of candidate designs. Other critical objectives and constraints, such as weight, fuel volume, aeroelastic effects, and structural soundness, are more difficult to address during the conceptual design process. The present research adds both static structural analysis and sizing to an existing conceptual design framework. The ultimate goal is to include structural analysis in the multidisciplinary optimization of a supersonic aircraft. Progress towards that goal is discussed and demonstrated.
Improving the Aircraft Design Process Using Web-Based Modeling and Simulation
NASA Technical Reports Server (NTRS)
Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.; Follen, Gregory J. (Technical Monitor)
2000-01-01
Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.
Improving the Aircraft Design Process Using Web-based Modeling and Simulation
NASA Technical Reports Server (NTRS)
Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.
2003-01-01
Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.
Conceptual design and structural analysis for an 8.4-m telescope
NASA Astrophysics Data System (ADS)
Mendoza, Manuel; Farah, Alejandro; Ruiz Schneider, Elfego
2004-09-01
This paper describes the conceptual design of the optics support structures of a telescope with an 8.4 m primary mirror, the same size as a Large Binocular Telescope (LBT) primary mirror. The design goal is to achieve a structure that supports the primary and secondary mirrors and keeps them joined as rigidly as possible. For this purpose an optimization with several models was done. The iterative design process includes specifications development, concept generation and evaluation, Finite Element Analysis (FEA), and other analytical calculations. A Quality Function Deployment (QFD) matrix was used to obtain telescope tube and spider specifications. Eight spider and eleven tube geometric concepts were proposed and compared in decision matrices using performance indicators and parameters. Tubes and spiders went through an iterative optimization process; the best tube and spider concepts were assembled together, and all assemblies were compared and ranked according to their performance.
Harbison, K; Kelly, J; Burnell, L; Silva, J
1995-01-01
The Scenario-based Engineering Process (SEP) is a user-focused methodology for large and complex system design. This process supports new application development from requirements analysis with domain models to component selection, design and modification, implementation, integration, and archival placement. It is built upon object-oriented methodologies, domain modeling strategies, and scenario-based techniques to provide an analysis process for mapping application requirements to available components. We are using SEP in the health care applications that we are developing. The process has already achieved success in the manufacturing and military domains and is being adopted by many organizations. SEP should prove viable in any domain containing scenarios that can be decomposed into tasks.
DOT National Transportation Integrated Search
1981-02-01
This volume documents the results of an analysis of the impact that various truck size and weight limits have on the carrier equipment selection process as a result of changes, in the design payload and design density of individual trucks. An analysi...
Organizational Analysis of the United States Army Evaluation Center
2014-12-01
[Fragmentary excerpts from the report: analysis of qualitative or quantitative data obtained from design reviews, hardware inspections, M&S, hardware and software testing, and metrics review...; the Research, Development, Test & Evaluation (RDT&E) appropriation account, which the Defense Acquisition Portal ACQuipedia website describes as "one of the..."; research, design, development, test and evaluation, production, installation, operation, and maintenance; data collection; processing and analysis.]
Structural Loads Analysis for Wave Energy Converters
DOE Office of Scientific and Technical Information (OSTI.GOV)
van Rij, Jennifer A; Yu, Yi-Hsiang; Guo, Yi
2017-06-03
This study explores and verifies the generalized body-modes method for evaluating the structural loads on a wave energy converter (WEC). Historically, WEC design methodologies have focused primarily on accurately evaluating hydrodynamic loads, while methodologies for evaluating structural loads have yet to be fully considered and incorporated into the WEC design process. As wave energy technologies continue to advance, however, it has become increasingly evident that an accurate evaluation of the structural loads will enable an optimized structural design, as well as the potential utilization of composites and flexible materials, and hence reduce WEC costs. Although there are many computational fluid dynamics, structural analyses and fluid-structure-interaction (FSI) codes available, the application of these codes is typically too computationally intensive to be practical in the early stages of the WEC design process. The generalized body-modes method, however, is a reduced order, linearized, frequency-domain FSI approach, performed in conjunction with the linear hydrodynamic analysis, with computation times that could realistically be incorporated into the WEC design process.
Internship Progress Summary: Fall 2016
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wiser, Ralph S.; Valencia, Matthew John
2016-12-13
This fall I had the opportunity to work at Los Alamos National Laboratory for the Technology Applications engineering group. I assisted two main projects during my appointment, both related to the Lab’s mission statement: “To solve national security challenges through scientific excellence.” My first project, a thermal source transfer unit, involved skills such as mechanical design, heat transfer simulation, and design analysis. The goal was to create a container that could protect a heat source and regulate its temperature during transit. I generated several designs, performed heat transfer simulations, and chose a design for prototyping. The second project was a soil drying unit for use in post blast sample analysis. To ensure fast and accurate sample processing, agents in the field wanted a system that could process wet dirt and turn it into dry powder. We designed a system of commercially available parts, and we tested the systems to determine the best methods and processes.
Innovation and design approaches within prospective ergonomics.
Liem, André; Brangier, Eric
2012-01-01
In this conceptual article the topic of "Prospective Ergonomics" is discussed within the context of innovation, design thinking, and design processes and methods. Design thinking is essentially a human-centred innovation process that emphasises observation, collaboration, interpretation, visualisation of ideas, rapid concept prototyping and concurrent business analysis, which ultimately influences innovation and business strategy. The objective of this project is to develop a roadmap for innovation, involving consumers, designers and business people in an integrative process, which can be applied to product, service and business design. A theoretical structure comprising innovation perspectives (1), worldviews supported by rationalist-historicist and empirical-idealistic dimensions (2), and models of "design" reasoning (3) precedes the development and classification of existing methods as well as the introduction of new ones.
Experimental design methods for bioengineering applications.
Keskin Gündoğdu, Tuğba; Deniz, İrem; Çalışkan, Gülizar; Şahin, Erdem Sefa; Azbar, Nuri
2016-01-01
Experimental design is a form of process analysis in which certain factors are selected to obtain the desired responses of interest. It may also be used for the determination of the effects of various independent factors on a dependent factor. The bioengineering discipline includes many different areas of scientific interest, and each study area is affected and governed by many different factors. Briefly analyzing the important factors and selecting an experimental design for optimization are very effective tools for the design of any bioprocess under question. This review summarizes experimental design methods that can be used to investigate various factors relating to bioengineering processes. The experimental methods generally used in bioengineering are as follows: full factorial design, fractional factorial design, Plackett-Burman design, Taguchi design, Box-Behnken design and central composite design. These design methods are briefly introduced, and then the application of these design methods to study different bioengineering processes is analyzed.
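Of the methods listed, the full factorial design is the simplest to illustrate: it enumerates every combination of factor levels. A minimal sketch follows, with hypothetical bioprocess factors and levels; a fractional factorial design would select a structured subset of these runs.

```python
# Sketch: enumerate a full factorial design for three two-level factors
# (factors and levels are illustrative bioprocess examples).
from itertools import product

factors = {
    "temperature_C": [30, 37],
    "pH": [6.0, 7.5],
    "substrate_g_per_L": [10, 20],
}
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
for i, run in enumerate(runs, 1):
    print(i, run)                  # 2^3 = 8 experimental runs
```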
Post-test navigation data analysis techniques for the shuttle ALT
NASA Technical Reports Server (NTRS)
1975-01-01
Postflight test analysis data processing techniques for shuttle approach and landing tests (ALT) navigation data are defined. Postflight test processor requirements are described along with operational and design requirements, data input requirements, and software test requirements. The postflight test data processing is described based on the natural test sequence: quick-look analysis, postflight navigation processing, and error isolation processing. Emphasis is placed on the tradeoffs that must remain open and subject to analysis until final definition is achieved in the shuttle data processing system and the overall ALT plan. A development plan for the implementation of the ALT postflight test navigation data processing system is presented. Conclusions are presented.
ERIC Educational Resources Information Center
Yusop, Farrah Dina
2013-01-01
This paper presents curriculum and design analyses of an Emmy-award-winning children's educational television series, Cyberchase. Using Posner's (2004) four-process curriculum analysis framework, this paper addresses each of the components and relates them to the design principles undertaken by the Cyberchase production team. Media and document…
ERIC Educational Resources Information Center
Ross, Sarah Gwen
2012-01-01
Response to intervention (RTI) is increasingly being used in educational settings to make high-stakes, special education decisions. Because of this, the accurate use and analysis of single-case designs to monitor intervention effectiveness has become important to the RTI process. Effect size methods for single-case designs provide a useful way to…
Automated Sensitivity Analysis of Interplanetary Trajectories for Optimal Mission Design
NASA Technical Reports Server (NTRS)
Knittel, Jeremy; Hughes, Kyle; Englander, Jacob; Sarli, Bruno
2017-01-01
This work describes a suite of Python tools known as the Python EMTG Automated Trade Study Application (PEATSA). PEATSA was written to automate the operation of trajectory optimization software, simplify the process of performing sensitivity analysis, and was ultimately found to out-perform a human trajectory designer in unexpected ways. These benefits will be discussed and demonstrated on sample mission designs.
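PEATSA's source is not shown here, but the general shape of an automated trade study, re-running an optimizer over a grid of mission parameters and collecting the results, can be sketched as follows. The optimizer stand-in, parameter names, and mass model are all hypothetical.

```python
# Generic sketch of a PEATSA-style automated trade study: sweep mission
# parameters, re-run a (hypothetical) optimizer, and collect the results.
import csv

def run_trajectory_optimizer(launch_year, c3_limit):
    """Hypothetical stand-in for driving a trajectory optimizer on one case."""
    return {"launch_year": launch_year, "c3_limit": c3_limit,
            "delivered_mass_kg": 1000 - 15 * abs(launch_year - 2031) - 4 * c3_limit}

cases = [(year, c3) for year in range(2029, 2034) for c3 in (10, 20, 30)]
results = [run_trajectory_optimizer(*case) for case in cases]

with open("trade_study.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=results[0].keys())
    writer.writeheader()
    writer.writerows(results)          # sensitivity data for post-processing
print("best case:", max(results, key=lambda r: r["delivered_mass_kg"]))
```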
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, Jaroslaw
1988-01-01
Optimization by decomposition, complex system sensitivity analysis, and a rapid growth of disciplinary sensitivity analysis are some of the recent developments that hold promise of a quantum jump in the support engineers receive from computers in the quantitative aspects of design. A review of the salient points of these techniques is given and illustrated by examples from aircraft design as a process that combines the best of human intellect and computer power to manipulate data.
Practicing universal design to actual hand tool design process.
Lin, Kai-Chieh; Wu, Chih-Fu
2015-09-01
UD evaluation principles are difficult to implement in product design. This study proposes a methodology for implementing UD in the design process through user participation. The original UD principles and user experience are used to develop the evaluation items, with differences between product types taken into account. Factor analysis and Quantification Theory Type I were used to eliminate evaluation items considered inappropriate and to examine the relationship between evaluation items and product design factors. Product design specifications were established for verification. The results showed that converting user evaluations into crucial design verification factors through a generalized evaluation scale based on product attributes, as well as applying these design factors in product design, can improve users' UD evaluation. The design process of this study is expected to contribute to user-centered UD application. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
DOT National Transportation Integrated Search
2007-11-13
This document presents the findings from the baseline phase of the evaluation of the process being used by eight sites to develop a design for a Travel Management Coordination Center (TMCC) for improved coordination of human service transportation wi...
Mechanoluminescence assisting agile optimization of processing design on surgical epiphysis plates
NASA Astrophysics Data System (ADS)
Terasaki, Nao; Toyomasu, Takashi; Sonohata, Motoki
2018-04-01
We propose a novel method for agile optimization of processing design by visualization of mechanoluminescence. To demonstrate the effect of the new method, epiphysis plates were processed to form dots (diameters: 1 and 1.5 mm) and the mechanical information was evaluated. As a result, the appearance of new strain concentrations was successfully visualized on the basis of mechanoluminescence, and complex mechanical information was intuitively understood by surgeons as the designers. In addition, mechanoluminescence analysis clarified that small dots do not have serious mechanical effects such as strength reduction. Such detailed mechanical information, evaluated on the basis of mechanoluminescence, was successfully applied to judging the validity of the processing design. This clearly proves the effectiveness of the new methodology using mechanoluminescence for assisting agile optimization of processing design.
Improving designer productivity
NASA Technical Reports Server (NTRS)
Hill, Gary C.
1992-01-01
Designer and design team productivity improves with skill, experience, and the tools available. The design process involves numerous trials and errors, analyses, refinements, and addition of details. Computerized tools have greatly speeded the analysis, and now new theories and methods, emerging under the label Artificial Intelligence (AI), are being used to automate skill and experience. These tools improve designer productivity by capturing experience, emulating recognized skillful designers, and making the essence of complex programs easier to grasp. This paper outlines the aircraft design process in today's technology and business climate, presenting some of the challenges ahead and some of the promising AI methods for meeting those challenges.
NASA Technical Reports Server (NTRS)
Steele, Gynelle C.
1999-01-01
The NASA Lewis Research Center and Flow Parametrics will enter into an agreement to commercialize the National Combustion Code (NCC). This multidisciplinary combustor design system utilizes computer-aided design (CAD) tools for geometry creation, advanced mesh generators for creating solid model representations, a common framework for fluid flow and structural analyses, modern postprocessing tools, and parallel processing. This integrated system can facilitate and enhance various phases of the design and analysis process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mata, Pedro; Fuente, Rafael de la; Iglesias, Javier
Iberdrola (Spanish utility) and Iberdrola Ingenieria (engineering branch) have been developing during the last two years the 110% Extended Power Up-rate Project (EPU 110%) for Cofrentes BWR-6. Iberdrola has an in-house design and licensing reload methodology that has been approved by the Spanish Nuclear Regulatory Authority. This methodology has already been used to perform the nuclear design and the reload licensing analysis for Cofrentes cycles 12 to 14. The methodology has also been applied to develop a significant number of safety analyses of the Cofrentes Extended Power Up-rate, including: reactor heat balance, core and fuel performance, thermal hydraulic stability, ECCS LOCA evaluation, transient analysis, Anticipated Transient Without Scram (ATWS), and Station Blackout (SBO). Since the scope of the licensing process of the Cofrentes Extended Power Up-rate exceeds the range of analysis included in the Cofrentes generic reload licensing process, it has been necessary to extend the applicability of the Cofrentes licensing methodology to the analysis of new transients. This is the case for the Total Loss of Feedwater (TLFW) transient. This paper shows the benefits of having an in-house design and licensing methodology and describes the process of extending the applicability of the methodology to the analysis of new transients. The analysis of Total Loss of Feedwater with the Cofrentes Retran model is included as an example of this process. (authors)
A method for scenario-based risk assessment for robust aerospace systems
NASA Astrophysics Data System (ADS)
Thomas, Victoria Katherine
In years past, aircraft conceptual design centered around creating a feasible aircraft that could be built and could fly the required missions. More recently, aircraft viability entered into conceptual design, allowing that the product's potential to be profitable should also be examined early in the design process. While examining an aerospace system's feasibility and viability early in the design process is extremely important, it is also important to examine system risk. In traditional aerospace systems risk analysis, risk is examined from the perspective of performance, schedule, and cost. Recently, safety and reliability analysis have been brought forward in the design process to also be examined during late conceptual and early preliminary design. While these analyses work as designed, existing risk analysis methods and techniques are not designed to examine an aerospace system's external operating environment and the risks present there. A new method has been developed here to examine, during the early part of concept design, the risk associated with not meeting assumptions about the system's external operating environment. The risks are examined in five categories: employment, culture, government and politics, economics, and technology. The risks are examined over a long time-period, up to the system's entire life cycle. The method consists of eight steps over three focus areas. The first focus area is Problem Setup. During problem setup, the problem is defined and understood to the best of the decision maker's ability. There are four steps in this area, in the following order: Establish the Need, Scenario Development, Identify Solution Alternatives, and Uncertainty and Risk Identification. There is significant iteration between steps two through four. Focus area two is Modeling and Simulation. In this area the solution alternatives and risks are modeled, and a numerical value for risk is calculated. A risk mitigation model is also created. The four steps involved in completing the modeling and simulation are: Alternative Solution Modeling, Uncertainty Quantification, Risk Assessment, and Risk Mitigation. Focus area three consists of Decision Support. In this area a decision support interface is created that allows for game playing between solution alternatives and risk mitigation. A multi-attribute decision making process is also implemented to aid in decision making. A demonstration problem inspired by Airbus' mid 1980s decision to break into the widebody long-range market was developed to illustrate the use of this method. The results showed that the method is able to capture additional types of risk than previous analysis methods, particularly at the early stages of aircraft design. It was also shown that the method can be used to help create a system that is robust to external environmental factors. The addition of an external environment risk analysis in the early stages of conceptual design can add another dimension to the analysis of feasibility and viability. The ability to take risk into account during the early stages of the design process can allow for the elimination of potentially feasible and viable but too-risky alternatives. The addition of a scenario-based analysis instead of a traditional probabilistic analysis enabled uncertainty to be effectively bound and examined over a variety of potential futures instead of only a single future. 
There is also potential for a product to be groomed for a specific future that one believes is likely to happen, or for a product to be steered during design as the future unfolds.
Vision-sensing image analysis for GTAW process control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Long, D.D.
1994-11-01
Image analysis of a gas tungsten arc welding (GTAW) process was completed using video images from a charge coupled device (CCD) camera inside a specially designed coaxial GTAW electrode holder. Video data were obtained from filtered and unfiltered images, with and without the GTAW arc present, showing weld joint features and locations. Data Translation image processing boards, installed in an IBM PC AT 386 compatible computer, and Media Cybernetics image processing software were used to investigate edge flange weld joint geometry for image analysis.
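A modern analogue of this edge-based joint location can be sketched with OpenCV; the original work used Data Translation boards and Media Cybernetics software, so everything below (the synthetic frame, thresholds, and seam estimate) is illustrative rather than the reported pipeline.

```python
# Hedged modern analogue of the weld-joint edge analysis, self-contained
# via a synthetic frame standing in for a filtered CCD video image.
import cv2
import numpy as np

frame = np.zeros((120, 160), dtype=np.uint8)
cv2.line(frame, (80, 0), (85, 119), 255, 2)          # synthetic joint edge

blurred = cv2.GaussianBlur(frame, (5, 5), 0)         # suppress arc-light noise
edges = cv2.Canny(blurred, 50, 150)                  # joint edge candidates

ys, xs = np.nonzero(edges)
if xs.size:
    print(f"estimated joint column: {np.median(xs):.0f}")  # crude seam location
```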
Structural design/margin assessment
NASA Technical Reports Server (NTRS)
Ryan, R. S.
1993-01-01
Determining structural design inputs and the structural margins following design completion is one of the major activities in space exploration. The end result is a statement of these margins as stability, safety factors on ultimate and yield stresses, fracture limits (fracture control), fatigue lifetime, reuse criteria, operational criteria and procedures, stability factors, deflections, clearance, handling criteria, etc. The process is normally called a load cycle and is time consuming, very complex, and involves much more than structures. The key to successful structural design is the proper implementation of the process. It depends on many factors: leadership and management of the process, adequate analysis and testing tools, data basing, communications, people skills, and training. This process and the various factors involved are discussed.
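The margin statements described (safety factors on ultimate and yield stresses, and the resulting margins) follow a standard bookkeeping form, sketched below with invented numbers; MS = allowable / (factor * applied) - 1 is the conventional definition assumed here.

```python
# Illustrative margin-of-safety bookkeeping of the kind a load cycle reports
# (stresses and factors are made up for the example).
def margin_of_safety(allowable, applied, factor):
    return allowable / (factor * applied) - 1.0

cases = [
    ("ultimate stress", 60.0, 38.0, 1.4),   # allowable ksi, applied ksi, factor
    ("yield stress",    50.0, 38.0, 1.1),
]
for name, allow, applied, fs in cases:
    ms = margin_of_safety(allow, applied, fs)
    print(f"{name}: MS = {ms:+.2f} ({'OK' if ms >= 0 else 'NEGATIVE'})")
```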
Galí, A; García-Montoya, E; Ascaso, M; Pérez-Lozano, P; Ticó, J R; Miñarro, M; Suñé-Negre, J M
2016-09-01
Although tablet coating processes are widely used in the pharmaceutical industry, they often lack adequate robustness, and up-scaling can be challenging as minor changes in parameters can lead to varying quality results. The objective was to select critical process parameters (CPPs) using retrospective data of a commercial product and to establish a design of experiments (DoE) that would improve the robustness of the coating process. The method was a retrospective analysis of data from 36 commercial batches, selected based on the quality results generated during batch release, some of which revealed quality deviations concerning the appearance of the coated tablets. The product is already marketed and belongs to the portfolio of a multinational pharmaceutical company. The Statgraphics 5.1 software was used for data processing to determine critical process parameters and to propose new working ranges. This study confirms that it is possible to determine the critical process parameters and create design spaces based on retrospective data of commercial batches; this type of analysis thus becomes a tool for optimizing the robustness of existing processes. Our results show that a design space can be established with minimum investment in experiments, since current commercial batch data are processed statistically.
Johnson, B C
1990-01-01
As health care competition increases, and as the penalties for making poor decisions become potentially more devastating, market research continues to play an increasingly important role in the decision-making process for hospitals. Concern over the appropriate use of market research and the costs related to it remains high. As such, efficiency in research design and clarity in research outcome are clearly the goals. This paper examines the focus group process and its adjunctive role in enhancing the overall design of health care market research. Specifically, the function and placement of focus groups within the research plan as well as several methods of creative focus group analysis are considered within the context of an effective research design.
NASA Technical Reports Server (NTRS)
Hall, Laverne
1995-01-01
Modeling of the Multi-mission Image Processing System (MIPS) will be described as an example of the use of a modeling tool to design a distributed system that supports multiple application scenarios. This paper examines: (a) modeling tool selection, capabilities, and operation (namely NETWORK 2.5 by CACI), (b) pointers for building or constructing a model and how the MIPS model was developed, (c) the importance of benchmarking or testing the performance of equipment/subsystems being considered for incorporation into the design/architecture, (d) the essential step of model validation and/or calibration using the benchmark results, (e) sample simulation results from the MIPS model, and (f) how modeling and simulation analysis affected the MIPS design process by having a supportive and informative impact.
SLUDGE TREATMENT PROJECT KOP CONCEPTUAL DESIGN CONTROL DECISION REPORT
DOE Office of Scientific and Technical Information (OSTI.GOV)
CARRO CA
2010-03-09
This control decision addresses the Knock-Out Pot (KOP) Disposition KOP Processing System (KPS) conceptual design. The KPS functions to (1) retrieve KOP material from canisters, (2) remove particles less than 600 µm in size and low density materials from the KOP material, (3) load the KOP material into Multi-Canister Overpack (MCO) baskets, and (4) stage the MCO baskets for subsequent loading into MCOs. Hazard and accident analyses of the KPS conceptual design have been performed to incorporate safety into the design process. The hazard analysis is documented in PRC-STP-00098, Knock-Out Pot Disposition Project Conceptual Design Hazard Analysis. The accident analysis is documented in PRC-STP-CN-N-00167, Knock-Out Pot Disposition Sub-Project Canister Over Lift Accident Analysis. Based on the results of these analyses, and analyses performed in support of MCO transportation and MCO processing and storage activities at the Cold Vacuum Drying Facility (CVDF) and Canister Storage Building (CSB), control decision meetings were held to determine the controls required to protect onsite and offsite receptors and facility workers. At the conceptual design stage, these controls are primarily defined by their safety functions. Safety significant structures, systems, and components (SSCs) that could provide the identified safety functions have been selected for the conceptual design. It is anticipated that some safety SSCs identified herein will be reclassified based on hazard and accident analyses performed in support of preliminary and detailed design.
Team Design Communication Patterns in e-Learning Design and Development
ERIC Educational Resources Information Center
Rapanta, Chrysi; Maina, Marcelo; Lotz, Nicole; Bacchelli, Alberto
2013-01-01
Prescriptive stage models have been found insufficient to describe the dynamic aspects of designing, especially in interdisciplinary e-learning design teams. There is a growing need for a systematic empirical analysis of team design processes that offer deeper and more detailed insights into instructional design (ID) than general models can offer.…
Using pattern enumeration to accelerate process development and ramp yield
NASA Astrophysics Data System (ADS)
Zhuang, Linda; Pang, Jenny; Xu, Jessy; Tsai, Mengfeng; Wang, Amy; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh; Ding, Hua
2016-03-01
During a new technology node's process setup phase, foundries do not initially have enough product chip designs to conduct exhaustive process development, so different operational teams use manually designed, simple test keys to set up their process flows and recipes. When the very first version of the design rule manual (DRM) is ready, foundries enter the process development phase, where new experiment design data is manually created based on these design rules. However, these IP/test keys contain very uniform or simple design structures. This kind of design normally does not contain critical design structures or process-unfriendly design patterns that pass design rule checks but are found to be less manufacturable. A method is therefore desired that generates, at the development stage, exhaustive test patterns allowed by the design rules in order to verify the gap between design rules and process capability. This paper presents a novel method for generating test key patterns that contain known problematic patterns as well as any constructs that designers could possibly draw based on current design rules. The enumerated test key patterns will contain the most critical design structures allowed by any particular design rule. A layout profiling method is used to perform design chip analysis, finding potential weak points on new incoming products so the fab can take preemptive action to avoid yield loss. This is achieved by comparing different products and leveraging the knowledge learned from previously manufactured chips to find possible yield detractors.
Enabling Rapid and Robust Structural Analysis During Conceptual Design
NASA Technical Reports Server (NTRS)
Eldred, Lloyd B.; Padula, Sharon L.; Li, Wu
2015-01-01
This paper describes a multi-year effort to add a structural analysis subprocess to a supersonic aircraft conceptual design process. The desired capabilities include parametric geometry, automatic finite element mesh generation, static and aeroelastic analysis, and structural sizing. The paper discusses implementation details of the new subprocess, captures lessons learned, and suggests future improvements. The subprocess quickly compares concepts and robustly handles large changes in wing or fuselage geometry. The subprocess can rank concepts with regard to their structural feasibility and can identify promising regions of the design space. The automated structural analysis subprocess is deemed robust and rapid enough to be included in multidisciplinary conceptual design and optimization studies.
Failure modes and effects analysis automation
NASA Technical Reports Server (NTRS)
Kamhieh, Cynthia H.; Cutts, Dannie E.; Purves, R. Byron
1988-01-01
A failure modes and effects analysis (FMEA) assistant was implemented as a knowledge-based system and will be used during design of the Space Station to aid engineers in performing the complex task of tracking failures throughout the entire design effort. The three major directions in which automation was pursued were the clerical components of the FMEA process, the knowledge acquisition aspects of FMEA, and the failure propagation/analysis portions of the FMEA task. The system is accessible to design, safety, and reliability engineers at single-user workstations and, although not designed to replace conventional FMEA, it is expected to decrease by many man-years the time required to perform the analysis.
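The failure propagation portion of the task can be illustrated as a graph traversal. The sketch below is a toy, not the Space Station assistant: it walks a hypothetical component dependency graph breadth-first to enumerate the downstream effects an FMEA row would record.

```python
# Sketch of the failure-propagation idea: follow a component dependency
# graph to list downstream effects of one failure (hypothetical names).
from collections import deque

feeds = {                       # component -> components it feeds
    "power_bus": ["pump_A", "controller"],
    "pump_A": ["coolant_loop"],
    "controller": ["pump_A"],
    "coolant_loop": ["avionics_cooling"],
}

def propagate(failed):
    """Breadth-first walk of everything affected by a single failure."""
    affected, queue = set(), deque([failed])
    while queue:
        comp = queue.popleft()
        for downstream in feeds.get(comp, []):
            if downstream not in affected:
                affected.add(downstream)
                queue.append(downstream)
    return affected

print(propagate("power_bus"))   # effects an FMEA row would need to record
```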
NASA Astrophysics Data System (ADS)
Hou, Zhenlong; Huang, Danian
2017-09-01
In this paper, we first study the inversion of probability tomography (IPT) with gravity gradiometry data. The spatial resolution of the results is improved by multi-tensor joint inversion, a depth weighting matrix, and other methods. To address the problems posed by big data in exploration, we present a parallel algorithm and its performance analysis, combining Compute Unified Device Architecture (CUDA) with Open Multi-Processing (OpenMP) based on Graphics Processing Unit (GPU) acceleration. In tests on a synthetic model and real data from Vinton Dome, we obtain improved results, demonstrating that the improved inversion algorithm is effective and feasible. The designed parallel algorithm also performs better than the other CUDA implementations, with a maximum speedup of more than 200. In the performance analysis, multi-GPU speedup and multi-GPU efficiency are applied to analyze the scalability of the multi-GPU programs. The designed parallel algorithm is demonstrated to be able to process larger-scale data, and the new analysis method is practical.
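The scalability metrics mentioned have standard definitions: speedup S_n = T_1/T_n and efficiency E_n = S_n/n for n GPUs. A tiny sketch with illustrative timings (not the authors' measurements) follows.

```python
# Standard scalability metrics used in such performance analyses
# (timings here are invented for illustration).
def speedup(t_serial, t_parallel):
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_gpus):
    return speedup(t_serial, t_parallel) / n_gpus

t1 = 3600.0                               # single-CPU reference time (s)
for n, tn in [(1, 18.0), (2, 9.5), (4, 5.2)]:
    print(n, speedup(t1, tn), efficiency(t1, tn, n))
```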
Design and Manufacturing of Composite Tower Structure for Wind Turbine Equipment
NASA Astrophysics Data System (ADS)
Park, Hyunbum
2018-02-01
This study proposes a composite tower design process for large wind turbine equipment. Structural design of the tower and analysis using the finite element method were performed. After the structural design, manufacture and testing of a prototype were performed. The material used is a glass fiber and epoxy resin composite, with sand in the middle section. An optimized structural design and analysis was carried out, with weight reduction and structural safety as the optimization parameters. Finally, the tower structure is to be confirmed by structural testing.
NASA Astrophysics Data System (ADS)
Yu, Yang; Zeng, Zheng
2009-10-01
By discussing the causes behind the high ratio of amendments in the implementation of urban regulatory detailed plans in China, despite their law-ensured status, this study aims to reconcile the conflict between the legal authority of regulatory detailed planning and the insufficient scientific support in its decision-making and compilation. It does so by introducing into the process spatial analysis based on GIS technology and 3D modeling, thus presenting a more scientific and flexible approach to regulatory detailed planning in China. The study first points out that the current compilation process for urban regulatory detailed plans in China employs a mainly empirical approach, which leaves plans constantly subject to amendment. The study then discusses the need for, and current utilization of, GIS in the Chinese system and proposes the framework of a GIS-assisted 3D spatial analysis process from the designer's perspective, which can be regarded as an alternation between descriptive codes and physical design in the compilation of regulatory detailed planning. With a case study of the processes and results from applying the framework, the paper concludes that the proposed framework can be an effective instrument, bringing more rationality, flexibility, and thus more efficiency to the compilation and decision-making processes of urban regulatory detailed plans in China.
Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS
NASA Astrophysics Data System (ADS)
Joshi, D. M.; Patel, H. K.
2015-10-01
Cryogenic engineering deals with the development and improvement of low-temperature techniques, processes, and equipment. A process simulator such as Aspen HYSYS, used for the design, analysis, and optimization of process plants, has features that accommodate these special requirements and can therefore simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention to the integration of components such as heat exchangers, the Joule-Thomson valve, the turboexpander, and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for maximizing the liquefaction yield of the plant subject to constraints on other parameters. The analysis results give a clear idea of the parameter values to choose before implementation of the actual plant in the field, as well as of the productivity and profitability of the given plant configuration, leading to the design of an efficient, productive plant.
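As a rough companion to the liquefaction analysis, the ideal Linde-Hampson liquid yield follows from an energy balance on the cold box; the sketch below uses placeholder enthalpy values rather than HYSYS state points.

```python
# Hedged sketch of the ideal Linde-Hampson liquid-yield balance; enthalpies
# below are illustrative placeholders, not data from the paper (a real
# study would pull state points from Aspen HYSYS).

def linde_liquid_yield(h_lp_out: float, h_hp_in: float, h_liquid: float) -> float:
    """Fraction liquefied per unit mass compressed:
    y = (h1 - h2) / (h1 - hf), from an energy balance on the cold box,
    where h1 = low-pressure gas leaving the heat exchanger,
          h2 = high-pressure gas entering, hf = saturated liquid."""
    return (h_lp_out - h_hp_in) / (h_lp_out - h_liquid)

# Example with made-up enthalpies in kJ/kg for air-like conditions:
y = linde_liquid_yield(h_lp_out=300.0, h_hp_in=270.0, h_liquid=-125.0)
print(f"liquid yield: {y:.3f} kg liquid per kg compressed")
```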
DEP : a computer program for evaluating lumber drying costs and investments
Stewart Holmes; George B. Harpole; Edward Bilek
1983-01-01
The DEP computer program is a modified discounted cash flow program designed for the economic analysis of wood drying processes. Wood drying processes differ from other processes because of the large amounts of working capital required to finance inventories, and because of relatively large shares of costs charged to inventory...
ERIC Educational Resources Information Center
Duffy, Melissa C.; Azevedo, Roger; Sun, Ning-Zi; Griscom, Sophia E.; Stead, Victoria; Crelinsten, Linda; Wiseman, Jeffrey; Maniatis, Thomas; Lachapelle, Kevin
2015-01-01
This study examined the nature of cognitive, metacognitive, and affective processes among a medical team experiencing difficulty managing a challenging simulated medical emergency case by conducting in-depth analysis of process data. Medical residents participated in a simulation exercise designed to help trainees to develop medical expertise,…
NASA Technical Reports Server (NTRS)
Chwalowski, Pawel; Samareh, Jamshid A.; Horta, Lucas G.; Piatak, David J.; McGowan, Anna-Maria R.
2009-01-01
The conceptual and preliminary design processes for aircraft with large shape changes are generally difficult and time-consuming, and the processes are often customized for a specific shape change concept to streamline the vehicle design effort. Accordingly, several existing reports show excellent results of assessing a particular shape change concept or perturbations of a concept. The goal of the current effort was to develop a multidisciplinary analysis tool and process that would enable an aircraft designer to assess several very different morphing concepts early in the design phase and yet obtain second-order performance results so that design decisions can be made with better confidence. The approach uses an efficient parametric model formulation that allows automatic model generation for systems undergoing radical shape changes as a function of aerodynamic parameters, geometry parameters, and shape change parameters. In contrast to other more self-contained approaches, the approach utilizes off-the-shelf analysis modules to reduce development time and to make it accessible to many users. Because the analysis is loosely coupled, discipline modules like a multibody code can be easily swapped for other modules with similar capabilities. One of the advantages of this loosely coupled system is the ability to use the medium- to high-fidelity tools early in the design stages when the information can significantly influence and improve overall vehicle design. Data transfer among the analysis modules is based on an accurate and automated general-purpose data transfer tool. In general, setup time for the integrated system presented in this paper is 2-4 days for simple shape change concepts and 1-2 weeks for more mechanically complicated concepts. Some of the key elements briefly described in the paper include parametric model development, aerodynamic database generation, multibody analysis, and the required software modules, as well as examples for a telescoping wing, a folding wing, and a bat-like wing. The paper also includes the verification of a medium-fidelity aerodynamic tool used for the aerodynamic database generation with a steady and unsteady high-fidelity CFD analysis tool for a folding wing example.
Sarkar, Sumona; Lund, Steven P; Vyzasatya, Ravi; Vanguri, Padmavathy; Elliott, John T; Plant, Anne L; Lin-Gibson, Sheng
2017-12-01
Cell counting measurements are critical in the research, development and manufacturing of cell-based products, yet determining cell quantity with accuracy and precision remains a challenge. Validating and evaluating a cell counting measurement process can be difficult because of the lack of appropriate reference material. Here we describe an experimental design and statistical analysis approach to evaluate the quality of a cell counting measurement process in the absence of appropriate reference materials or reference methods. The experimental design is based on a dilution series study with replicate samples and observations as well as measurement process controls. The statistical analysis evaluates the precision and proportionality of the cell counting measurement process and can be used to compare the quality of two or more counting methods. As an illustration of this approach, cell counting measurement processes (automated and manual methods) were compared for a human mesenchymal stromal cell (hMSC) preparation. For the hMSC preparation investigated, results indicated that the automated method performed better than the manual counting methods in terms of precision and proportionality. By conducting well controlled dilution series experimental designs coupled with appropriate statistical analysis, quantitative indicators of repeatability and proportionality can be calculated to provide an assessment of cell counting measurement quality. This approach does not rely on the use of a reference material or comparison to "gold standard" methods known to have limited assurance of accuracy and precision. The approach presented here may help the selection, optimization, and/or validation of a cell counting measurement process. Published by Elsevier Inc.
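A minimal sketch of the dilution-series idea is shown below, with fabricated counts: fit a proportional model through the origin and report per-dilution coefficients of variation as a precision indicator. This illustrates the general approach, not the paper's statistical procedure.

```python
# Sketch of a dilution-series evaluation: proportional fit through the
# origin plus per-dilution coefficients of variation. Counts are fabricated.
import numpy as np

dilution = np.repeat([1.0, 0.5, 0.25, 0.125], 3)            # target fractions
counts = np.array([98, 102, 100, 52, 49, 51, 26, 24, 25, 13, 12, 14], float)

b = (dilution @ counts) / (dilution @ dilution)             # slope, y = b*x
residuals = counts - b * dilution
r2 = 1.0 - residuals.var() / counts.var()

print(f"estimated count at stock concentration: {b:.1f}")
print(f"R^2 of proportional model: {r2:.4f}")
for d in np.unique(dilution):
    grp = counts[dilution == d]
    print(f"dilution {d:5.3f}: CV = {grp.std(ddof=1) / grp.mean():.3%}")
```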
NASA Technical Reports Server (NTRS)
Poeschel, R. L.; Hawthorne, E. I.; Weisman, Y. C.; Frisman, M.; Benson, G. C.; Mcgrath, R. J.; Martinelli, R. M.; Linsenbardt, T. L.; Beattie, J. R.
1977-01-01
Several thrust system design concepts were evaluated and compared using the specifications of the most advanced 30 cm engineering model thruster as the technology base. Emphasis was placed on relatively high power missions (60 to 100 kW) such as a Halley's comet rendezvous. The extensions in thruster performance required for the Halley's comet mission were defined and alternative thrust system concepts were designed in sufficient detail for comparing mass, efficiency, reliability, structure, and thermal characteristics. Confirmation testing and analysis of thruster and power processing components were performed, and the feasibility of satisfying extended performance requirements was verified. A baseline design was selected from the alternatives considered, and the design analysis and documentation were refined. The baseline thrust system design features modular construction, conventional power processing, and a concentrator solar array concept and is designed to interface with the Space Shuttle.
Extended performance solar electric propulsion thrust system study. Volume 2: Baseline thrust system
NASA Technical Reports Server (NTRS)
Poeschel, R. L.; Hawthorne, E. I.
1977-01-01
Several thrust system design concepts were evaluated and compared using the specifications of the most advanced 30-cm engineering model thruster as the technology base. Emphasis was placed on relatively high-power missions (60 to 100 kW) such as a Halley's comet rendezvous. The extensions in thruster performance required for the Halley's comet mission were defined, and alternative thrust system concepts were designed in sufficient detail for comparing mass, efficiency, reliability, structure, and thermal characteristics. Confirmation testing and analysis of thruster and power-processing components were performed, and the feasibility of satisfying extended performance requirements was verified. A baseline design was selected from the alternatives considered, and the design analysis and documentation were refined. The baseline thrust system design features modular construction, conventional power processing, and a concentrator solar array concept, and is designed to interface with the Space Shuttle.
Modeling and Design Analysis Methodology for Tailoring of Aircraft Structures with Composites
NASA Technical Reports Server (NTRS)
Rehfield, Lawrence W.
2004-01-01
Composite materials provide design flexibility in that fiber placement and orientation can be specified and a variety of material forms and manufacturing processes are available. It is possible, therefore, to 'tailor' the structure to a high degree in order to meet specific design requirements in an optimum manner. Common industrial practices, however, have limited the choices designers make. One of the reasons for this is that there is a dearth of conceptual/preliminary design analysis tools specifically devoted to identifying structural concepts for composite airframe structures. Large scale finite element simulations are not suitable for such purposes. The present project has been devoted to creating modeling and design analysis methodology for use in the tailoring process of aircraft structures. Emphasis has been given to creating bend-twist elastic coupling in high aspect ratio wings or other lifting surfaces. The direction of our work was in concert with the overall NASA effort Twenty-First Century Aircraft Technology (TCAT). A multi-disciplinary team was assembled by Dr. Damodar Ambur to work on wing technology, which included our project.
Coupled Loads Analysis of the Modified NASA Barge Pegasus and Space Launch System Hardware
NASA Technical Reports Server (NTRS)
Knight, J. Brent
2015-01-01
A Coupled Loads Analysis (CLA) has been performed for barge transport of Space Launch System hardware on the recently modified NASA barge Pegasus. The barge redesign was supported by detailed finite element analyses from the Army Corps of Engineers Marine Design Center. The Finite Element Model (FEM) used in the design was also used in the subject CLA. The Pegasus FEM and CLA results are presented, along with a comparison of the analysis process to that of a payload transported to space on the Space Shuttle. The dynamic forcing functions are discussed as well. Performing a dynamic CLA of NASA hardware during marine transport is thought to be a first and can likely support the minimization of undue conservatism.
Nontraditional Intersections/Interchanges: Informational Report
DOT National Transportation Integrated Search
2007-06-18
Comprehensive coverage: geometric design considerations; traffic analysis and comparison with similar conventional designs; signal settings; signing and marking; material and cost comparisons; selection process in a spreadsheet.
Integration of Engine, Plume, and CFD Analyses in Conceptual Design of Low-Boom Supersonic Aircraft
NASA Technical Reports Server (NTRS)
Li, Wu; Campbell, Richard; Geiselhart, Karl; Shields, Elwood; Nayani, Sudheer; Shenoy, Rajiv
2009-01-01
This paper documents an integration of engine, plume, and computational fluid dynamics (CFD) analyses in the conceptual design of low-boom supersonic aircraft, using a variable fidelity approach. In particular, the Numerical Propulsion Simulation System (NPSS) is used for propulsion system cycle analysis and nacelle outer mold line definition, and a low-fidelity plume model is developed for plume shape prediction based on NPSS engine data and nacelle geometry. This model provides a capability for the conceptual design of low-boom supersonic aircraft that accounts for plume effects. Then a newly developed process for automated CFD analysis is presented for CFD-based plume and boom analyses of the conceptual geometry. Five test cases are used to demonstrate the integrated engine, plume, and CFD analysis process based on a variable fidelity approach, as well as the feasibility of the automated CFD plume and boom analysis capability.
NASA Astrophysics Data System (ADS)
Zhu, Ren; Wu, Lan; Wang, Shiming; Ye, Linhua; Ding, Zhihua
2008-03-01
As a fast, non-destructive analysis method, Fourier transform (FT) near-infrared (NIR) spectroscopy is well suited to online quality analysis of traditional Chinese medicine (TCM) manufacturing processes. In this thesis, the theory of FT-NIRS is analyzed, and an FT-NIR spectrometer with 4 cm⁻¹ resolution over the 12,500-5000 cm⁻¹ frequency range is designed. The spectrometer is based on a Michelson interferometer, with a bromine-tungsten lamp as the NIR light source and an InGaAs detector to collect the interference signal. Each element was designed and chosen to provide maximum sensitivity in the NIR spectral region. A fiber-optic flow cell system was used to realize online analysis of traditional Chinese medicine. The performance of the spectrometer was evaluated, and the feasibility of using the FT-NIR spectrometer to acquire absorption spectra of traditional Chinese medicine was demonstrated.
Development of Optimal Stressor Scenarios for New Operational Energy Systems
2017-12-01
Analyzing the previous model using a design of experiments (DOE) and regression analysis provides critical information about the associated operational stressor scenarios for acceptance testing. The resulting system requirements, derived from experimentation, can be used to revisit the design requirements and develop a more robust system.
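As an illustration of the DOE-plus-regression workflow described above, the sketch below builds a two-level full-factorial design and fits a main-effects regression; the factor names and the simulated response are invented.

```python
# Two-level full-factorial DOE with a main-effects regression fit.
# Factor names and the simulated response are invented for illustration.
import itertools
import numpy as np

factors = ["temperature", "vibration", "duty_cycle"]
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))), float)

rng = np.random.default_rng(0)
# Hypothetical response at each design point (e.g., hours to failure)
response = (100 + 8 * design[:, 0] - 15 * design[:, 1] + 3 * design[:, 2]
            + rng.normal(0, 1, len(design)))

X = np.column_stack([np.ones(len(design)), design])         # intercept + mains
coef, *_ = np.linalg.lstsq(X, response, rcond=None)
for name, c in zip(["intercept"] + factors, coef):
    print(f"{name:12s} estimated effect: {c:+.2f}")
```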
Alternatives for Developing User Documentation for Applications Software
1991-09-01
The preparation of software documentation is an iterative process that involves research, analysis, design, and testing. The writer must have a solid understanding of the technical aspects of the document being prepared, along with a good command of techniques such as writing in a style that matches adult reading behaviors, using reader-based writing techniques, developing effective graphics, and creating reference aids.
Multidisciplinary Design, Analysis, and Optimization Tool Development using a Genetic Algorithm
NASA Technical Reports Server (NTRS)
Pak, Chan-gi; Li, Wesley
2008-01-01
Multidisciplinary design, analysis, and optimization using a genetic algorithm is being developed at the National Aeronautics and Space Administration Dryden Flight Research Center to automate the analysis and design process by leveraging existing tools such as NASTRAN, ZAERO, and CFD codes to enable true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic, and hypersonic aircraft. This is a promising technology, but it faces many challenges in large-scale, real-world application. This paper describes current approaches, recent results, and challenges for MDAO as demonstrated by our experience with the Ikhana fire pod design.
STARS Conceptual Framework for Reuse Processes (CFRP). Volume 2: application Version 1.0
1993-09-30
[Table fragment: comparison of domain analysis and design processes, including the DISA/CIM process [DIS93], SEI Feature-Oriented Domain Analysis (FODA) [KCH+90], and the JIAWG object-oriented domain analysis process; cites the FODA Feasibility Study, Technical Report CMU/SEI-90-TR-21, Software Engineering Institute, Carnegie Mellon University, Pittsburgh. Prepared by The Boeing Company, IBM, and Unisys Corporation for the Electronic Systems Center, Air Force Materiel Command, USAF, Hanscom AFB, MA 01731-5000.]
Experience with CASE tools in the design of process-oriented software
NASA Astrophysics Data System (ADS)
Novakov, Ognian; Sicard, Claude-Henri
1994-12-01
In accelerator systems such as the CERN PS complex, process equipment has a lifetime which may exceed the typical life cycle of its related software. Taking into account the variety of such equipment, it is important to keep the analysis and design of the software in a system-independent form. This paper discusses the experience gathered in using commercial CASE tools for analysis, design, and reverse engineering of different process-oriented software modules, with a principal emphasis on maintaining the initial analysis in a standardized form. Such tools have been in existence for several years, but this paper shows that they are not fully adapted to our needs. In particular, the paper stresses the problems of integrating such a tool into an existing database-dependent development chain, the lack of real-time simulation tools, and the absence of object-oriented concepts in existing commercial packages. Finally, the paper gives a broader view of software engineering needs in our particular context.
Resource Analysis of Cognitive Process Flow Used to Achieve Autonomy
2016-03-01
to be used as a decision-making aid to guide system designers and program managers not necessarily familiar with cognitive processing, or resource... implementing end-to-end cognitive processing flows multiplies and the impact of these design decisions on efficiency and effectiveness increases [1]. The... end-to-end cognitive systems and alternative computing technologies, then system design and acquisition personnel could make systematic analyses and...
Guetterman, Timothy C.; Fetters, Michael D.; Legocki, Laurie J.; Mawocha, Samkeliso; Barsan, William G.; Lewis, Roger J.; Berry, Donald A.; Meurer, William J.
2015-01-01
Context: The context for this study was the Adaptive Designs Advancing Promising Treatments Into Trials (ADAPT-IT) project, which aimed to incorporate flexible adaptive designs into pivotal clinical trials and to conduct an assessment of the trial development process. Little research provides guidance to academic institutions in planning adaptive trials. Objectives: The purpose of this qualitative study was to explore the perspectives and experiences of stakeholders as they reflected on the interactive ADAPT-IT adaptive design development process, and to understand their perspectives regarding lessons learned about the design of the trials and trial development. Materials and methods: We conducted semi-structured interviews with ten key stakeholders and observations of the process. We employed qualitative thematic text data analysis to reduce the data into themes about the ADAPT-IT project and adaptive clinical trials. Results: The qualitative analysis revealed four themes: education of the project participants, how the process evolved with participant feedback, procedures that could enhance the development of other trials, and education of the broader research community. Discussion and conclusions: While participants became more likely to consider flexible adaptive designs, additional education is needed to both understand the adaptive methodology and articulate it when planning trials. PMID:26622163
NASA Astrophysics Data System (ADS)
Wang, Qiang
2017-09-01
As an important part of software engineering, the software process determines the success or failure of a software product. The design and development features of a security software process are discussed, as are the necessity and present significance of using such a process. The process for security software and its testing, in coordination with the functional software, are discussed in depth. The process includes requirements analysis, design, coding, debugging and testing, submission, and maintenance. For each stage, the paper proposes subprocesses to support software security. As an example, the paper applies the above process to a power information platform.
2012-03-22
world's first powered and controlled flying machine. Numerous flight designs and tests were done by scientists, engineers, and flight enthusiasts... conceptual flight and preliminary designs before they could control the craft with three-axis control and the correct airfoil design. These pioneers... analysis support. Although wind tunnel testing can provide data to predict and develop control surface designs, few SUAV operators opt to utilize wind tunnels.
Multidisciplinary Optimization Methods for Aircraft Preliminary Design
NASA Technical Reports Server (NTRS)
Kroo, Ilan; Altus, Steve; Braun, Robert; Gage, Peter; Sobieski, Ian
1994-01-01
This paper describes a research program aimed at improved methods for multidisciplinary design and optimization of large-scale aeronautical systems. The research involves new approaches to system decomposition, interdisciplinary communication, and methods of exploiting coarse-grained parallelism for analysis and optimization. A new architecture, that involves a tight coupling between optimization and analysis, is intended to improve efficiency while simplifying the structure of multidisciplinary, computation-intensive design problems involving many analysis disciplines and perhaps hundreds of design variables. Work in two areas is described here: system decomposition using compatibility constraints to simplify the analysis structure and take advantage of coarse-grained parallelism; and collaborative optimization, a decomposition of the optimization process to permit parallel design and to simplify interdisciplinary communication requirements.
Ferreira, Ana P; Tobyn, Mike
2015-01-01
In the pharmaceutical industry, chemometrics is rapidly establishing itself as a tool that can be used at every step of product development and beyond: from early development to commercialization. This set of multivariate analysis methods allows the extraction of information contained in large, complex data sets, thus helping to increase the product and process understanding that is at the core of the Food and Drug Administration's Process Analytical Technology (PAT) Guidance for Industry and the International Conference on Harmonisation's Pharmaceutical Development guideline (Q8). This review aims to provide pharmaceutical industry professionals an introduction to multivariate analysis and how it is being adopted and implemented by companies in the transition from "quality-by-testing" to "quality-by-design". It starts with an introduction to multivariate analysis and the two methods most commonly used, principal component analysis and partial least squares regression: their advantages, common pitfalls, and requirements for their effective use. That is followed by an overview of the diverse areas of application of multivariate analysis in the pharmaceutical industry: from the development of real-time analytical methods to definition of the design space and control strategy, and from formulation optimization during development to the application of quality-by-design principles to improve the manufacture of existing commercial products.
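A minimal sketch of the two workhorse methods named in the review, PCA and PLS regression, applied to synthetic spectra with scikit-learn (an assumed dependency; nothing here reproduces the review's own examples):

```python
# PCA and PLS regression on synthetic single-peak spectra whose intensity
# scales with a known concentration. Data are simulated, not experimental.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
n_samples, n_wavelengths = 40, 200
concentration = rng.uniform(0.1, 1.0, n_samples)
peak = np.exp(-0.5 * ((np.arange(n_wavelengths) - 90) / 8.0) ** 2)
spectra = np.outer(concentration, peak) + rng.normal(0, 0.01, (n_samples, n_wavelengths))

pca = PCA(n_components=3).fit(spectra)
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))

pls = PLSRegression(n_components=2).fit(spectra, concentration)
print("PLS R^2 on training data:", round(pls.score(spectra, concentration), 3))
```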
Functional Fault Model Development Process to Support Design Analysis and Operational Assessment
NASA Technical Reports Server (NTRS)
Melcher, Kevin J.; Maul, William A.; Hemminger, Joseph A.
2016-01-01
A functional fault model (FFM) is an abstract representation of the failure space of a given system. As such, it simulates the propagation of failure effects along paths between the origin of the system failure modes and points within the system capable of observing the failure effects. As a result, FFMs may be used to diagnose the presence of failures in the modeled system. FFMs necessarily contain a significant amount of information about the design, operations, and failure modes and effects. One of the important benefits of FFMs is that they may be qualitative, rather than quantitative and, as a result, may be implemented early in the design process when there is more potential to positively impact the system design. FFMs may therefore be developed and matured throughout the monitored system's design process and may subsequently be used to provide real-time diagnostic assessments that support system operations. This paper provides an overview of a generalized NASA process that is being used to develop and apply FFMs. FFM technology has been evolving for more than 25 years. The FFM development process presented in this paper was refined during NASA's Ares I, Space Launch System, and Ground Systems Development and Operations programs (i.e., from about 2007 to the present). Process refinement took place as new modeling, analysis, and verification tools were created to enhance FFM capabilities. In this paper, standard elements of a model development process (i.e., knowledge acquisition, conceptual design, implementation & verification, and application) are described within the context of FFMs. Further, newer tools and analytical capabilities that may benefit the broader systems engineering process are identified and briefly described. The discussion is intended as a high-level guide for future FFM modelers.
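The propagation idea at the heart of an FFM can be illustrated with a toy directed graph: effects flow from a failure mode toward nodes capable of observing them. The graph below is invented for illustration and is not a NASA model.

```python
# Toy sketch of FFM-style failure propagation: breadth-first traversal
# from a failure mode to the observation points that would see its effects.
from collections import deque

# edges: effect -> downstream effects it can produce (invented example)
graph = {
    "valve_stuck_closed": ["low_fuel_flow"],
    "low_fuel_flow": ["low_chamber_pressure"],
    "low_chamber_pressure": ["sensor_Pc_low", "thrust_shortfall"],
}
observers = {"sensor_Pc_low"}   # points capable of observing effects

def observable_effects(failure_mode: str) -> set:
    """Propagate a failure mode's effects; return which observers see it."""
    seen, queue, hits = set(), deque([failure_mode]), set()
    while queue:
        node = queue.popleft()
        if node in seen:
            continue
        seen.add(node)
        if node in observers:
            hits.add(node)
        queue.extend(graph.get(node, []))
    return hits

print(observable_effects("valve_stuck_closed"))   # {'sensor_Pc_low'}
```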
Design and implementation of highly parallel pipelined VLSI systems
NASA Astrophysics Data System (ADS)
Delange, Alphonsus Anthonius Jozef
A methodology and its realization as a prototype CAD (Computer-Aided Design) system for the design and analysis of complex multiprocessor systems are presented. The design is an iterative process in which the behavioral specifications of the system components are refined into structural descriptions consisting of interconnections, lower-level components, etc. A model for the representation and analysis of multiprocessor systems at several levels of abstraction, and an implementation of a CAD system based on this model, are described. A high-level design language, an object-oriented development kit for tool design, a design data management system, and design and analysis tools, such as a high-level simulator and a graphics design interface, are integrated into the prototype system. Procedures are described for the synthesis of semiregular processor arrays; for computing the switching of input/output signals, memory management, and control of the processor array; and for the sequencing and segmentation of input/output data streams due to partitioning and clustering of the processor array during the subsequent synthesis steps. The architecture and control of a parallel system are designed and each component mapped to a module or module generator in a symbolic layout library, compacted for the design rules of VLSI (Very Large Scale Integration) technology. An example is given of the design of a processor that is a useful building block for highly parallel pipelined systems in the signal/image processing domains.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brow, R.K.; Kovacic, L.; Chambers, R.S.
1996-04-01
Hermetic glass sealing technologies developed for weapons component applications can be utilized for the design and manufacture of fuel cells. Design and processing of a seal are optimized through an integrated approach based on glass composition research, finite element analysis, and sealing process definition. Glass sealing procedures are selected to accommodate the limits imposed by the glass composition and the predictive calculations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edwin A. Harvego; James E. O'Brien; Michael G. McKellar
2012-11-01
Results of a system evaluation and lifecycle cost analysis are presented for a commercial-scale high-temperature electrolysis (HTE) central hydrogen production plant. The plant design relies on grid electricity to power the electrolysis process and system components, and industrial natural gas to provide process heat. The HYSYS process analysis software was used to evaluate the reference central plant design capable of producing 50,000 kg/day of hydrogen. The HYSYS software performs mass and energy balances across all components to allow optimization of the design using a detailed process flow sheet and realistic operating conditions specified by the analyst. The lifecycle cost analysis was performed using the H2A analysis methodology developed by the Department of Energy (DOE) Hydrogen Program. This methodology utilizes Microsoft Excel spreadsheet analysis tools that require detailed plant performance information (obtained from HYSYS), along with financial and cost information, to calculate lifecycle costs. The results of the lifecycle analyses indicate that, for a 10% internal rate of return, a large central commercial-scale hydrogen production plant can produce 50,000 kg/day of hydrogen at an average cost of $2.68/kg. When the cost of carbon sequestration is taken into account, the average cost of hydrogen production increases by $0.40/kg to $3.08/kg.
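The levelized-cost logic behind such an H2A-style result can be sketched in a few lines: find the hydrogen price whose discounted revenues cover capital plus discounted operating costs at the target rate of return. All inputs below are placeholders, not the study's values.

```python
# Levelized hydrogen cost sketch: price whose discounted revenues cover
# capex plus discounted opex at the target IRR. Inputs are placeholders;
# the real H2A spreadsheets are far richer.

def levelized_cost(capex: float, annual_opex: float, annual_kg: float,
                   irr: float, years: int) -> float:
    pv = sum(1.0 / (1.0 + irr) ** t for t in range(1, years + 1))
    return (capex + annual_opex * pv) / (annual_kg * pv)

annual_kg = 50_000 * 365 * 0.90        # 50,000 kg/day at 90% capacity factor
price = levelized_cost(capex=250e6, annual_opex=15e6,
                       annual_kg=annual_kg, irr=0.10, years=20)
print(f"levelized hydrogen cost: ${price:.2f}/kg")
```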
Function-based design process for an intelligent ground vehicle vision system
NASA Astrophysics Data System (ADS)
Nagel, Robert L.; Perry, Kenneth L.; Stone, Robert B.; McAdams, Daniel A.
2010-10-01
An engineering design framework for an autonomous ground vehicle vision system is discussed. We present both the conceptual and physical design by following the design process, development and testing of an intelligent ground vehicle vision system constructed for the 2008 Intelligent Ground Vehicle Competition. During conceptual design, the requirements for the vision system are explored via functional and process analysis considering the flows into the vehicle and the transformations of those flows. The conceptual design phase concludes with a vision system design that is modular in both hardware and software and is based on a laser range finder and camera for visual perception. During physical design, prototypes are developed and tested independently, following the modular interfaces identified during conceptual design. Prototype models, once functional, are implemented into the final design. The final vision system design uses a ray-casting algorithm to process camera and laser range finder data and identify potential paths. The ray-casting algorithm is a single thread of the robot's multithreaded application. Other threads control motion, provide feedback, and process sensory data. Once integrated, both hardware and software testing are performed on the robot. We discuss the robot's performance and the lessons learned.
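A simplified sketch of a ray-casting pass over an occupancy grid follows, in the spirit of the path-identification step described above; the robot's actual algorithm and data formats are not reproduced.

```python
# Ray casting over a boolean occupancy grid: step along each heading and
# record the distance to the first obstacle. Grid and poses are invented.
import math

def cast_ray(grid, x, y, angle_deg, max_range):
    """Distance (in cells) to the first occupied cell, else max_range."""
    dx = math.cos(math.radians(angle_deg))
    dy = math.sin(math.radians(angle_deg))
    for step in range(1, max_range + 1):
        cx = int(round(x + dx * step))
        cy = int(round(y + dy * step))
        if not (0 <= cy < len(grid) and 0 <= cx < len(grid[0])):
            break                        # ray left the mapped area
        if grid[cy][cx]:
            return step                  # obstacle from camera/laser fusion
    return max_range

grid = [[0] * 20 for _ in range(20)]
for x in range(20):
    grid[3][x] = 1                       # a wall of obstacles along y = 3
clearances = {a: cast_ray(grid, 10.0, 10.0, a, 9) for a in range(0, 360, 15)}
best = max(clearances, key=clearances.get)
print(f"most open heading: {best} deg (clearance {clearances[best]} cells)")
```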
Mortier, Séverine Thérèse F C; Van Bockstal, Pieter-Jan; Corver, Jos; Nopens, Ingmar; Gernaey, Krist V; De Beer, Thomas
2016-06-01
Large molecules, such as biopharmaceuticals, are considered the key driver of growth for the pharmaceutical industry. Freeze-drying is the preferred way to stabilise these products when needed. However, it is an expensive, inefficient, time- and energy-consuming process. During freeze-drying, there are only two main process variables to be set, i.e. the shelf temperature and the chamber pressure, preferably in a dynamic way. This manuscript focuses on the essential use of uncertainty analysis for the determination and experimental verification of the dynamic primary drying Design Space for pharmaceutical freeze-drying. Traditionally, the chamber pressure and shelf temperature are kept constant during primary drying, leading to less optimal process conditions. In this paper it is demonstrated how a mechanistic model of the primary drying step gives the opportunity to determine the optimal dynamic values for both process variables during processing, resulting in a dynamic Design Space with a well-known risk of failure. This allows running the primary drying process step as time-efficiently as possible, thereby guaranteeing that the temperature at the sublimation front does not exceed the collapse temperature. The Design Space is the multidimensional combination and interaction of input variables and process parameters leading to the expected product specifications with a controlled (i.e., high) probability. Therefore, inclusion of parameter uncertainty is an essential part of the definition of the Design Space, although it is often neglected. To quantitatively assess the inherent uncertainty on the parameters of the mechanistic model, an uncertainty analysis was performed to establish the borders of the dynamic Design Space, i.e. a time-varying shelf temperature and chamber pressure, associated with a specific risk of failure. A risk of failure acceptance level of 0.01%, i.e. a 'zero-failure' situation, results in an increased primary drying process time compared to the deterministic dynamic Design Space; however, the risk of failure is under control. Experimental verification revealed that only a risk of failure acceptance level of 0.01% yielded a guaranteed zero-defect quality end-product. The computed process settings with a risk of failure acceptance level of 0.01% resulted in a reduction of more than half of the primary drying time in comparison with a regular, conservative cycle with fixed settings. Copyright © 2016. Published by Elsevier B.V.
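The uncertainty-analysis step can be illustrated with a Monte Carlo sketch: propagate an uncertain parameter through a product-temperature model and estimate the risk that the sublimation front exceeds the collapse temperature at candidate shelf temperatures. The linear response below is a stand-in for the paper's mechanistic model, and all numbers are invented.

```python
# Monte Carlo risk estimate for candidate shelf temperatures: sample an
# uncertain product resistance, push it through a stand-in temperature
# response, and estimate P(front temperature > collapse temperature).
import numpy as np

rng = np.random.default_rng(42)
T_collapse = -32.0                                  # degC, illustrative

def front_temperature(T_shelf, Rp):
    # Stand-in response: warmer shelves and higher product resistance
    # both raise the sublimation-front temperature (not the real model).
    return -34.0 + 0.5 * T_shelf + 2.0 * Rp

Rp = rng.normal(3.0, 0.4, 1_000_000)                # uncertain resistance
for T_shelf in [-10.0, -12.0, -14.0, -16.0]:
    risk = np.mean(front_temperature(T_shelf, Rp) > T_collapse)
    print(f"T_shelf {T_shelf:6.1f} degC -> estimated risk {risk:.4%}")
```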
[Extraction fragment: topic keywords including life cycle inventories; economic and environmentally extended input-output analysis; and models for sustainable design and optimization of processes, supply chains, and life cycles, with interactions between engineering design and assessment. Source: Hanes, doctoral dissertation, The Ohio State University, 2015.]
Energy-Efficient Design for Florida Educational Facilities.
ERIC Educational Resources Information Center
Florida Solar Energy Center, Cape Canaveral.
This manual provides a detailed simulation analysis of a variety of energy conservation measures (ECMs) with the intent of giving educational facility design teams in Florida a basis for decision making. The manual's three sections cover energy efficiency design considerations that appear throughout the following design processes: schematic…
Development of a Prototype Model-Form Uncertainty Knowledge Base
NASA Technical Reports Server (NTRS)
Green, Lawrence L.
2016-01-01
Uncertainties are generally classified as either aleatory or epistemic. Aleatory uncertainties are those attributed to random variation, either naturally or through manufacturing processes. Epistemic uncertainties are generally attributed to a lack of knowledge. One type of epistemic uncertainty is called model-form uncertainty. The term model-form means that, among the choices to be made within an analysis during a design process, there are different forms of the analysis process, each of which gives different results for the same configuration at the same flight conditions. Examples of model-form uncertainties include the grid density, grid type, and solver type used within a computational fluid dynamics code, or the choice of the number and type of model elements within a structural analysis. The objectives of this work are to identify and quantify a representative set of model-form uncertainties and to make this information available to designers through an interactive knowledge base (KB). The KB can then be used during probabilistic design sessions, so as to enable the possible reduction of uncertainties in the design process through resource investment. An extensive literature search has been conducted to identify and quantify typical model-form uncertainties present within aerospace design. An initial attempt has been made to assemble the results of this literature search into a searchable KB, usable in real time during probabilistic design sessions. A concept of operations and the basic structure of a model-form uncertainty KB are described. Key operations within the KB are illustrated. Current limitations of the KB and possible workarounds are explained.
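A toy sketch of the KB concept follows: model-form uncertainty entries keyed by discipline and quantity of interest, queried during a design session. The entries and magnitudes are invented placeholders, not data from the NASA KB.

```python
# Dict-based sketch of a searchable model-form uncertainty knowledge base.
# All records and uncertainty magnitudes are invented placeholders.

KB = [
    {"discipline": "CFD", "choice": "grid density (coarse vs fine)",
     "quantity": "drag coefficient", "uncertainty_pct": 4.0},
    {"discipline": "CFD", "choice": "turbulence model (SA vs SST)",
     "quantity": "drag coefficient", "uncertainty_pct": 2.5},
    {"discipline": "structures", "choice": "shell vs solid elements",
     "quantity": "wing tip deflection", "uncertainty_pct": 6.0},
]

def query(discipline=None, quantity=None):
    """Return matching model-form uncertainty records."""
    return [e for e in KB
            if (discipline is None or e["discipline"] == discipline)
            and (quantity is None or e["quantity"] == quantity)]

for e in query(discipline="CFD"):
    print(f'{e["choice"]}: +/-{e["uncertainty_pct"]}% on {e["quantity"]}')
```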
TU-AB-BRD-03: Fault Tree Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dunscombe, P.
2015-06-15
Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are often caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100 that has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how these tools can be used in a given radiotherapy clinic to develop a risk-based quality management program. Learning Objectives: Learn how to design a process map for a radiotherapy process. Learn how to perform failure modes and effects analysis for a given process. Learn what fault trees are all about. Learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Dunscombe: Director, TreatSafely, LLC, and the Center for the Assessment of Radiological Sciences; Consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.
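A minimal fault-tree evaluation sketch follows, assuming independent basic events combined through AND/OR gates; the tree and probabilities are illustrative only.

```python
# Fault-tree sketch: top-event probability from AND/OR gates over
# independent basic-event probabilities. Tree and numbers are invented.

def gate_and(*p):   # all inputs must fail
    out = 1.0
    for q in p:
        out *= q
    return out

def gate_or(*p):    # any input failing suffices (independence assumed)
    out = 1.0
    for q in p:
        out *= (1.0 - q)
    return 1.0 - out

# Top event: wrong dose delivered = (planning error AND check fails)
#                                   OR (machine fault AND interlock fails)
p_top = gate_or(gate_and(1e-3, 5e-2),      # human error path
                gate_and(2e-4, 1e-2))      # equipment path
print(f"top-event probability per treatment: {p_top:.2e}")
```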
TU-AB-BRD-02: Failure Modes and Effects Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huq, M.
2015-06-15
Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are often caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100 that has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how these tools can be used in a given radiotherapy clinic to develop a risk-based quality management program. Learning Objectives: Learn how to design a process map for a radiotherapy process. Learn how to perform failure modes and effects analysis for a given process. Learn what fault trees are all about. Learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Dunscombe: Director, TreatSafely, LLC, and the Center for the Assessment of Radiological Sciences; Consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.
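A companion sketch for the FMEA step: ranking invented failure modes by risk priority number (RPN), the product of severity, occurrence, and detectability scores on 1-10 scales, in the TG-100 style of prioritization.

```python
# RPN ranking sketch for FMEA: RPN = severity * occurrence * detectability.
# Failure modes and scores are invented examples, not clinical data.

failure_modes = [
    ("wrong patient plan loaded",  9, 3, 4),   # (name, S, O, D)
    ("MLC leaf position error",    7, 4, 3),
    ("incorrect CT density table", 8, 2, 6),
]

ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
for name, sev, occ, det in ranked:
    print(f"RPN {sev * occ * det:4d}  (S={sev}, O={occ}, D={det})  {name}")
```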
Riser Feeding Evaluation Method for Metal Castings Using Numerical Analysis
NASA Astrophysics Data System (ADS)
Ahmad, Nadiah
One of the design aspects that continues to create a challenge for casting designers is the optimum design of casting feeders (risers). As liquid metal solidifies, the metal shrinks and forms cavities inside the casting. In order to avoid shrinkage cavities, risers are added to the casting shape to supply additional molten metal when shrinkage occurs during solidification. The shrinkage cavities in the casting are compensated for by controlling the cooling rate to promote directional solidification. This control is achieved by designing the casting such that cooling begins at the sections farthest from the risers and ends at the risers. The risers therefore solidify last and feed the casting with molten metal. As a result, the shrinkage cavities formed during solidification are in the risers, which are later removed from the casting. Since casting designers usually have to go through iterative processes of validating casting designs, which are very costly due to expensive simulation processes or manual trial and error on actual casting processes, this study investigates more efficient methods that help casting designers utilize their casting experience systematically to develop good initial casting designs. The objective is to reduce the number of casting design iterations, thereby reducing the cost involved in the design process. This research aims at finding a method that can help casting designers design effective risers for the sand casting of aluminum-silicon alloys by utilizing the analysis of solidification simulation. The analysis focuses on the significance of the pressure distribution of the liquid metal at the early stage of casting solidification, when heat transfer and convective fluid flow are taken into account in the solidification simulation. The mathematical model of casting solidification was solved using the finite volume method (FVM). This study improves our understanding of feeding behavior in aluminum-silicon alloys and of effective feeding by considering the pressure gradient distribution of the molten metal at the casting's dendrite coherency point. We identify the relationship between feeding efficiency and shrinkage behavior, and how changes in riser size affect the pressure gradient in the casting. This understanding will help in the design of effective risers.
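For a concrete feel of the "riser solidifies last" criterion, the classical Chvorinov modulus heuristic (solidification time scaling with (V/A)^2) can be used to size a riser. Note that this textbook rule is only illustrative here; it is not the pressure-gradient method the study investigates.

```python
# Chvorinov-style riser sizing sketch: require the riser's volume-to-area
# modulus to exceed the casting's by a safety factor so it freezes last.
# A standard textbook heuristic, not the dissertation's method.
import math

def modulus_box(l, w, h):
    """Volume-to-cooling-area modulus of a rectangular casting section."""
    v, a = l * w * h, 2 * (l * w + l * h + w * h)
    return v / a

def riser_diameter_for(casting_modulus, safety=1.2):
    """Cylindrical riser with height = diameter has modulus D/6;
    require riser modulus >= safety * casting modulus."""
    return 6.0 * safety * casting_modulus

m_cast = modulus_box(0.20, 0.10, 0.05)          # metres, invented casting
d = riser_diameter_for(m_cast)
print(f"casting modulus {m_cast * 1000:.1f} mm -> riser diameter {d * 1000:.0f} mm")
```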
Tracking Hazard Analysis Data in a Jungle of Changing Design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sullivan, Robin S.; Young, Jonathan
2006-05-14
The biggest fear of the hazard analyst is the loss of data in the middle of the design jungle. When project schedules are demanding and the design is changing rapidly, it is essential that the hazard analysis data be tracked and kept current in order to provide the required project design, development, and regulatory support. As the design progresses, it is imperative to be able to identify the current information as well as the past archived information, and to show how the project is designing in safety through modifications based on hazard analysis results. At the DOE Hanford site in Washington State, Fluor Hanford Inc. is in the process of removing and dispositioning sludge from the 100 Area K Basins. The K Basins were used to store spent fuel from the operating reactors at the Hanford Site. The sludge is a by-product of the corrosion of the fuel and fuel storage canisters. The sludge removal project has been very dynamic, involving the design, procurement and, more recently, the operation of processes at two basins, K East and K West. The project has an ambitious schedule with a large number of changes to design concepts. To support the complex K Basins project, a technique to track the status of the hazard analysis data was developed. This paper identifies the most important elements of the tracking system and describes how it was used to assist the project in ensuring that current design data was reflected in a specific version of the hazard analysis, and to show how the project was keeping up with the design and ensuring compliance with the requirements to design in safety. While the specifics of the data tracking strategy for the K Basins sludge removal project are described in the paper, the general concepts of the strategy are applicable to similar projects requiring iteration of hazard analysis and design.
NASA Technical Reports Server (NTRS)
Kriegler, F. J.; Gordon, M. F.; Mclaughlin, R. H.; Marshall, R. E.
1975-01-01
The MIDAS (Multivariate Interactive Digital Analysis System) processor is a high-speed processor designed to process multispectral scanner data (from Landsat, EOS, aircraft, etc.) quickly and cost-effectively to meet the requirements of users of remote sensor data, especially from very large areas. MIDAS consists of a fast multipipeline preprocessor and classifier, an interactive color display and color printer, and a medium scale computer system for analysis and control. The system is designed to process data having as many as 16 spectral bands per picture element at rates of 200,000 picture elements per second into as many as 17 classes using a maximum likelihood decision rule.
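The maximum likelihood decision rule that MIDAS implements in hardware can be sketched in software as a per-pixel Gaussian log-likelihood comparison; the class statistics below are toy values, not trained signatures.

```python
# Gaussian maximum-likelihood classification sketch: assign each
# multispectral pixel to the class with the highest log-likelihood.
import numpy as np

def ml_classify(pixels, means, covs):
    """pixels: (n, bands); means[k]: (bands,); covs[k]: (bands, bands)."""
    scores = []
    for mu, cov in zip(means, covs):
        inv, (_, logdet) = np.linalg.inv(cov), np.linalg.slogdet(cov)
        d = pixels - mu
        # log-likelihood up to a constant: -0.5 * (log|C| + Mahalanobis^2)
        scores.append(-0.5 * (logdet + np.einsum("ij,jk,ik->i", d, inv, d)))
    return np.argmax(np.stack(scores), axis=0)

rng = np.random.default_rng(7)
bands = 4
means = [np.full(bands, 50.0), np.full(bands, 120.0)]   # toy class means
covs = [np.eye(bands) * 25.0, np.eye(bands) * 100.0]    # toy covariances
pixels = rng.normal(118, 10, (5, bands))
print(ml_classify(pixels, means, covs))                  # expect mostly class 1
```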
Mirel, Barbara
2009-02-13
Current usability studies of bioinformatics tools suggest that tools for exploratory analysis support some tasks related to finding relationships of interest, but not the deep causal insights necessary for formulating plausible and credible hypotheses. To better understand design requirements for gaining these causal insights in systems biology analyses, a longitudinal field study of 15 biomedical researchers was conducted. Researchers interacted with the same protein-protein interaction tools to discover possible disease mechanisms for further experimentation. Findings reveal patterns in scientists' exploratory and explanatory analysis and show that the tools positively supported a number of well-structured query and analysis tasks. But for several of scientists' more complex, higher-order ways of knowing and reasoning, the tools did not offer adequate support. Results show that, for a better fit with scientists' cognition for exploratory analysis, systems biology tools need to better match scientists' processes for validating, for making the transition from classification to model-based reasoning, and for engaging in causal mental modelling. As the next great frontier in bioinformatics usability, tool designs for exploratory systems biology analysis need to move beyond the successes already achieved in supporting formulaic query and analysis tasks and reduce the current mismatches with several of scientists' higher-order analytical practices. The implications of the results for tool designs are discussed.
NASA Technical Reports Server (NTRS)
Dominick, Wayne D. (Editor); Bassari, Jinous; Triantafyllopoulos, Spiros
1984-01-01
The University of Southwestern Louisiana (USL) NASA PC R and D statistical analysis support package is designed to be a three-level package to allow statistical analysis for a variety of applications within the USL Data Base Management System (DBMS) contract work. The design addresses usage of the statistical facilities as a library package, as an interactive statistical analysis system, and as a batch processing package.
Improving designer productivity. [artificial intelligence
NASA Technical Reports Server (NTRS)
Hill, Gary C.
1992-01-01
Designer and design team productivity improves with skill, experience, and the tools available. The design process involves numerous trials and errors, analyses, refinements, and additions of detail. Computerized tools have greatly sped up analysis, and now new theories and methods, emerging under the label Artificial Intelligence (AI), are being used to automate skill and experience. These tools improve designer productivity by capturing experience, emulating recognized skillful designers, and making the essence of complex programs easier to grasp. This paper outlines the aircraft design process in today's technology and business climate, presenting some of the challenges ahead and some of the promising AI methods for meeting them.
Optimisation study of a vehicle bumper subsystem with fuzzy parameters
NASA Astrophysics Data System (ADS)
Farkas, L.; Moens, D.; Donders, S.; Vandepitte, D.
2012-10-01
This paper deals with the design and optimisation for crashworthiness of a vehicle bumper subsystem, a key scenario for vehicle component design. Automotive manufacturers and suppliers have to find optimal design solutions for such subsystems that comply with the conflicting requirements of the regulatory bodies regarding functional performance (safety and repairability) and environmental impact (mass). For the bumper design challenge, an integrated methodology for multi-attribute design engineering of mechanical structures is set up. The integrated process captures the various tasks that are usually performed manually, thereby facilitating automated design iterations for optimisation. Subsequently, an optimisation process is applied that takes the effect of parametric uncertainties into account, such that the system-level possibility of failure is acceptable. This optimisation process is referred to as possibility-based design optimisation and integrates the fuzzy FE analysis applied for uncertainty treatment in crash simulations. It is the counterpart of the reliability-based design optimisation used in a probabilistic context with statistically defined parameters (variabilities).
NASA Technical Reports Server (NTRS)
Blackburn, C. L.; Dovi, A. R.; Kurtze, W. L.; Storaasli, O. O.
1981-01-01
A computer software system for the processing and integration of engineering data and programs, called IPAD (Integrated Programs for Aerospace-Vehicle Design), is described. The ability of the system to relieve the engineer of the mundane task of input data preparation is demonstrated by the application of a prototype system to the design, analysis, and/or machining of three simple structures. Future work to further enhance the system's automated data handling and ability to handle larger and more varied design problems are also presented.
Integrating automated structured analysis and design with Ada programming support environments
NASA Technical Reports Server (NTRS)
Hecht, Alan; Simmons, Andy
1986-01-01
Ada Programming Support Environments (APSE) include many powerful tools that address the implementation of Ada code. These tools do not address the entire software development process. Structured analysis is a methodology that addresses the creation of complete and accurate system specifications. Structured design takes a specification, derives a plan to decompose the system into subcomponents, and provides heuristics to optimize the software design to minimize errors and maintenance. It also supports the creation of reusable modules. Studies have shown that most software errors result from poor system specifications, and that these errors become more expensive to fix as development proceeds. Structured analysis and design help to uncover errors in the early stages of development. The APSE tools help to ensure that the code produced is correct and aid in finding obscure coding errors. However, they cannot detect errors in specifications or poor designs. TEAMWORK, an automated system for structured analysis and design that can be integrated with an APSE to support software systems development from specification through implementation, is described. These tools complement each other to help developers improve quality and productivity, as well as to reduce development and maintenance costs. Complete system documentation and reusable code also result from the use of these tools. Integrating an APSE with automated tools for structured analysis and design provides capabilities and advantages beyond those realized with any of these systems used by themselves.
2012-09-30
recognition; algorithm design, statistical analysis, and feature analysis. Post-Doctoral Associate, Cornell University, Bioacoustics Research... short. The HPC-ADA was designed based on fielded systems [1-4, 6] that offer a variety of desirable attributes, specifically dynamic resource... The software package was designed to utilize parallel and distributed processing for running recognition and other advanced algorithms. DeLMA...
Integrated Optical Design Analysis (IODA): New Test Data and Modeling Features
NASA Technical Reports Server (NTRS)
Moore, Jim; Troy, Ed; Patrick, Brian
2003-01-01
A general overview of the capabilities of the IODA ("Integrated Optical Design Analysis") software is presented. IODA promotes efficient exchange of data and modeling results between the thermal, structures, optical design, and testing engineering disciplines. This presentation focuses on new features added to the software that allow measured test data to be imported into the IODA environment for post-processing or for comparison with pretest model predictions.
NASA Technical Reports Server (NTRS)
Welstead, Jason
2014-01-01
This research focused on incorporating stability and control into a multidisciplinary design optimization of a Boeing 737-class advanced concept called the D8.2b. A new method of evaluating aircraft handling performance using quantitative evaluation of the system response to disturbances, including perturbations, continuous turbulence, and discrete gusts, is presented. A multidisciplinary design optimization was performed using the D8.2b transport aircraft concept. The configuration was optimized for minimum fuel burn using a design range of 3,000 nautical miles. Optimization cases were run using fixed tail volume coefficients, static trim constraints, and static trim and dynamic response constraints. A Cessna 182T model was used to test the various dynamic analysis components, ensuring the analysis was behaving as expected. Results of the optimizations show that including stability and control in the design process drastically alters the optimal design, indicating that stability and control should be included in conceptual design to avoid system-level penalties later in the design process.
Regression analysis as a design optimization tool
NASA Technical Reports Server (NTRS)
Perley, R.
1984-01-01
The optimization concepts are described in relation to an overall design process, as opposed to a detailed part-design process where the requirements are firmly stated, the optimization criteria are well established, and a design is known to be feasible. The overall design process starts with the stated requirements. Some of the design criteria are derived directly from the requirements, but others are affected by the design concept. It is these design criteria that define the performance index, or objective function, that is to be minimized within some constraints. In general, there will be multiple objectives, some mutually exclusive, with no clear statement of their relative importance. The optimization loop that is given adjusts the design variables and analyzes the resulting design, in an iterative fashion, until the objective function is minimized within the constraints. This provides a solution, but it is only the beginning. In effect, the problem definition evolves as information is derived from the results. It becomes a learning process as we determine what the physics of the system can deliver in relation to the desirable system characteristics. As with any learning process, an interactive capability is a real attribute for investigating the many alternatives that will be suggested as learning progresses.
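The inner loop described above (adjust the design variables, analyse the resulting design, repeat until the objective is minimised within the constraints) maps directly onto a numerical optimiser. A minimal sketch, with a hypothetical cantilever mass objective and stress constraint standing in for the analysis step:

```python
from scipy.optimize import minimize

def analyze(x):
    # Stand-in for the design analysis: mass and max bending stress of a
    # 500 mm steel cantilever of width x[0] and height x[1] (hypothetical model).
    width, height = x
    mass = 7.85e-6 * width * height * 500.0            # kg
    stress = 6.0 * 1000.0 * 500.0 / (width * height ** 2)  # MPa, 1 kN tip load
    return mass, stress

objective = lambda x: analyze(x)[0]                    # minimise mass
constraints = [{"type": "ineq",                        # require stress <= 250 MPa
                "fun": lambda x: 250.0 - analyze(x)[1]}]

result = minimize(objective, x0=[20.0, 40.0], bounds=[(5, 100), (5, 100)],
                  constraints=constraints, method="SLSQP")
print(result.x, result.fun)  # optimal design variables and minimum mass
```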
Stingray Failure Mode, Effects and Criticality Analysis: WEC Risk Registers
Ken Rhinefrank
2016-07-25
An analysis method to systematically identify all potential failure modes and their effects on the Stingray WEC system. This analysis is incorporated early in the development cycle so that the identified failure modes can be mitigated cost-effectively and efficiently. The FMECA can begin once there is enough detail to define the functions and failure modes of a given system and its interfaces with other systems. The FMECA occurs concurrently with the design process and is an iterative process that allows for design changes to overcome deficiencies identified in the analysis. Risk registers for the major subsystems were completed according to the methodology described in the "Failure Mode Effects and Criticality Analysis Risk Reduction Program Plan.pdf" document below, in compliance with the DOE Risk Management Framework developed by NREL.
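The ranking step behind a risk register of this kind is simple to automate. A minimal sketch using the conventional risk priority number (severity times occurrence times detection); the failure modes and 1-10 scores are invented, not taken from the Stingray registers:

```python
# Rank failure modes by risk priority number: RPN = severity x occurrence x detection.
failure_modes = [
    # (description, severity, occurrence, detection) -- hypothetical 1-10 scores
    ("mooring line fatigue",     8, 4, 6),
    ("power takeoff seal leak",  6, 5, 3),
    ("controller firmware hang", 7, 2, 8),
]

register = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
for desc, s, o, d in register:
    print(f"RPN {s*o*d:4d}  {desc}")  # highest-RPN modes get mitigation priority
```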
NASA Technical Reports Server (NTRS)
1979-01-01
Information to identify viable coal gasification and utilization technologies is presented. Analysis capabilities required to support design and implementation of coal based synthetic fuels complexes are identified. The potential market in the Southeast United States for coal based synthetic fuels is investigated. A requirements analysis to identify the types of modeling and analysis capabilities required to conduct and monitor coal gasification project designs is discussed. Models and methodologies to satisfy these requirements are identified and evaluated, and recommendations are developed. Requirements for development of technology and data needed to improve gasification feasibility and economies are examined.
Sociotechnical attributes of safe and unsafe work systems.
Kleiner, Brian M; Hettinger, Lawrence J; DeJoy, David M; Huang, Yuang-Hsiang; Love, Peter E D
2015-01-01
Theoretical and practical approaches to safety based on sociotechnical systems principles place heavy emphasis on the intersections between social-organisational and technical-work process factors. Within this perspective, work system design emphasises factors such as the joint optimisation of social and technical processes, a focus on reliable human-system performance and safety metrics as design and analysis criteria, the maintenance of a realistic and consistent set of safety objectives and policies, and regular access to the expertise and input of workers. We discuss three current approaches to the analysis and design of complex sociotechnical systems: human-systems integration, macroergonomics and safety climate. Each approach emphasises key sociotechnical systems themes, and each prescribes a more holistic perspective on work systems than do traditional theories and methods. We contrast these perspectives with historical precedents such as system safety and traditional human factors and ergonomics, and describe potential future directions for their application in research and practice. The identification of factors that can reliably distinguish between safe and unsafe work systems is an important concern for ergonomists and other safety professionals. This paper presents a variety of sociotechnical systems perspectives on intersections between social-organisational and technical-work process factors as they impact work system analysis, design and operation.
European Workshop on Industrial Computer Systems approach to design for safety
NASA Technical Reports Server (NTRS)
Zalewski, Janusz
1992-01-01
This paper presents guidelines on designing systems for safety, developed by the Technical Committee 7 on Reliability and Safety of the European Workshop on Industrial Computer Systems. The focus is on complementing the traditional development process by adding the following four steps: (1) overall safety analysis; (2) analysis of the functional specifications; (3) designing for safety; (4) validation of design. Quantitative assessment of safety is possible by means of a modular questionnaire covering various aspects of the major stages of system development.
Engineering design: A cognitive process approach
NASA Astrophysics Data System (ADS)
Strimel, Greg Joseph
The intent of this dissertation was to identify the cognitive processes used by advanced pre-engineering students to solve complex engineering design problems. Students in technology and engineering education classrooms are often taught to use an ideal engineering design process that has been generated mostly by educators and curriculum developers. However, the review of literature showed that it is unclear as to how advanced pre-engineering students cognitively navigate solving a complex and multifaceted problem from beginning to end. Additionally, it was unclear how a student thinks and acts throughout their design process and how this affects the viability of their solution. Therefore, Research Objective 1 was to identify the fundamental cognitive processes students use to design, construct, and evaluate operational solutions to engineering design problems. Research Objective 2 was to determine identifiers within student cognitive processes for monitoring aptitude to successfully design, construct, and evaluate technological solutions. Lastly, Research Objective 3 was to create a conceptual technological and engineering problem-solving model integrating student cognitive processes for the improved development of problem-solving abilities. The methodology of this study included multiple forms of data collection. The participants were first given a survey to determine their prior experience with engineering and to provide a description of the subjects being studied. The participants were then presented an engineering design challenge to solve individually. While they completed the challenge, the participants verbalized their thoughts using an established "think aloud" method. These verbalizations were captured along with participant observational recordings using point-of-view camera technology. Additionally, the participant design journals, design artifacts, solution effectiveness data, and teacher evaluations were collected for analysis to help achieve the research objectives of this study. Two independent coders then coded the video/audio recordings and the additional design data using Halfin's (1973) 17 mental processes for technological problem-solving. The results of this study indicated that the participants employed a wide array of mental processes when solving engineering design challenges. However, the findings provide a general analysis of the number of times participants employed each mental process, as well as the amount of time consumed employing the various mental processes through the different stages of the engineering design process. The results indicated many similarities between the students solving the problem, which may highlight voids in current technology and engineering education curricula. Additionally, the findings showed differences between the processes employed by participants that created the most successful solutions and the participants who developed the least effective solutions. Upon comparing and contrasting these processes, recommendations for instructional strategies to enhance a student's capability for solving engineering design problems were developed. The results also indicated that students, when left without teacher intervention, use a simplified and more natural process to solve design challenges than the 12-step engineering design process reported in much of the literature. Lastly, these data indicated that students followed two different approaches to solving the design problem. 
Some students employed a sequential and logical approach, while others employed a nebulous, solution centered trial-and-error approach to solving the problem. In this study the participants who were more sequential had better performing solutions. Examining these two approaches and the student cognition data enabled the researcher to generate a conceptual engineering design model for the improved teaching and development of engineering design problem solving.
NASA Technical Reports Server (NTRS)
DeBonis, J. R.; Trefny, C. J.; Steffen, C. J., Jr.
1999-01-01
Design and analysis of the inlet for a rocket-based combined cycle engine is discussed. Computational fluid dynamics was used in both the design and the subsequent analysis. Reynolds-averaged Navier-Stokes simulations were performed using both perfect gas and real gas assumptions. An inlet design that operates over the required Mach number range from 0 to 12 was produced. Performance data for cycle analysis were post-processed using a stream thrust averaging technique. A detailed performance database for cycle analysis is presented. The effect of vehicle forebody compression on air capture is also examined.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abel, L.W.
1996-05-01
This article discusses the methodology, design philosophy, and guidelines for planning a dynamic-kill operation for a wild well. The topics covered are two methods of computer analysis for designing dynamic-kill requirements, the design process, determining the pumping spread, and the pitfalls that a designer faces in planning a dynamic kill.
2016-06-01
design will help assess each individual’s perceptions on the five primary research questions. D. PILOT TESTING After creating the survey, it’s...distributed to individuals that have submitted requirements packages through the ASSP process. The survey field test was designed to determine the...Will be designated for each of the service portfolio groups and collaborates to define common processes across DOD Component Level Lead (CLL
Automation of the aircraft design process
NASA Technical Reports Server (NTRS)
Heldenfels, R. R.
1974-01-01
The increasing use of the computer to automate the aerospace product development and engineering process is examined with emphasis on structural analysis and design. Examples of systems of computer programs in aerospace and other industries are reviewed and related to the characteristics of aircraft design in its conceptual, preliminary, and detailed phases. Problems with current procedures are identified, and potential improvements from optimum utilization of integrated disciplinary computer programs by a man/computer team are indicated.
The development of a collaborative virtual environment for finite element simulation
NASA Astrophysics Data System (ADS)
Abdul-Jalil, Mohamad Kasim
Communication between geographically distributed designers has been a major hurdle in traditional engineering design. Conventional methods of communication, such as video conferencing, telephone, and email, are less efficient, especially when dealing with complex design models. Complex shapes, intricate features, and hidden parts are often difficult to describe verbally or even using traditional 2-D or 3-D visual representations. Virtual Reality (VR) and Internet technologies have substantial potential to bridge this communication barrier. VR technology allows designers to immerse themselves in a virtual environment to view and manipulate a design model just as in real life. Fast Internet connectivity has enabled fast data transfer between remote locations. Although various collaborative virtual environment (CVE) systems have been developed in the past decade, they are limited to high-end technology that is not accessible to typical designers. The objective of this dissertation is to discover and develop a new approach to increase the efficiency of the design process, particularly for large-scale applications wherein participants are geographically distributed. A multi-platform and easily accessible collaborative virtual environment (CVRoom) is developed to accomplish the stated research objective. Geographically dispersed designers can meet in a single shared virtual environment to discuss issues pertaining to the engineering design process and to make trade-off decisions more quickly than before, thereby speeding the entire process. This faster design process is achieved through the development of capabilities that better enable the multidisciplinary modeling and trade-off decisions that are so critical before launching into a formal detailed design. The features of the environment developed as a result of this research include the ability to view design models, voice interaction, and links to engineering analysis modules (such as the Finite Element Analysis module demonstrated in this work). One of the major issues in developing a CVE system for engineering design purposes is obtaining pertinent simulation results in real time, which is critical so that designers can make decisions based on these results quickly. For example, in a finite element analysis, if a design model is changed or perturbed, the analysis results must be obtained in real time or near real time to make the virtual meeting environment realistic. In this research, a finite difference-based Design Sensitivity Analysis (DSA) approach is employed to approximate structural responses (e.g., stress and displacement) and thereby demonstrate the applicability of CVRoom for engineering design trade-offs. This DSA approach provides fast approximation and is well suited to the virtual meeting environment, where fast response time is required. The DSA-based approach is tested on several example problems to show its applicability and limitations. This dissertation demonstrates that an increase in efficiency and a reduction in the time required for a complex design process can be accomplished using the approach developed in this dissertation research. Several implementations of CVRoom by students working on common design tasks were investigated. All participants preferred the collaborative virtual environment developed in this dissertation work (CVRoom) over other modes of interaction.
It is proposed here that CVRoom is representative of the type of collaborative virtual environment that will be used by most designers in the future to reduce the time required in a design cycle and thereby reduce the associated cost.
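The role finite-difference DSA plays here (precompute gradients once with a handful of expensive solves, then answer perturbation queries instantly with a first-order Taylor expansion) can be sketched compactly. The `fe_stress` function below is a hypothetical stand-in for a full finite element solve:

```python
import numpy as np

def fe_stress(x):
    # Hypothetical stand-in for a full finite element solve.
    thickness, radius = x
    return 120.0 / thickness + 35.0 / radius  # MPa

def fd_gradient(f, x, h=1e-4):
    # Forward finite differences: one extra solve per design variable.
    x = np.asarray(x, dtype=float)
    f0 = f(x)
    grad = np.empty_like(x)
    for i in range(x.size):
        xp = x.copy()
        xp[i] += h
        grad[i] = (f(xp) - f0) / h
    return f0, grad

# Precompute once (slow), then approximate perturbed designs instantly.
x0 = np.array([4.0, 10.0])
s0, grad = fd_gradient(fe_stress, x0)

dx = np.array([0.2, -0.5])         # designer tweaks the model in the CVE
approx = s0 + grad @ dx            # first-order Taylor estimate
print(approx, fe_stress(x0 + dx))  # approximation vs. full re-solve
```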
Defining process design space for monoclonal antibody cell culture.
Abu-Absi, Susan Fugett; Yang, LiYing; Thompson, Patrick; Jiang, Canping; Kandula, Sunitha; Schilling, Bernhard; Shukla, Abhinav A
2010-08-15
The concept of design space has been taking root as a foundation of in-process control strategies for biopharmaceutical manufacturing processes. During mapping of the process design space, the multidimensional combination of operational variables is studied to quantify the impact on process performance in terms of productivity and product quality. An efficient methodology to map the design space for a monoclonal antibody cell culture process is described. A failure modes and effects analysis (FMEA) was used as the basis for the process characterization exercise. This was followed by an integrated study of the inoculum stage of the process which includes progressive shake flask and seed bioreactor steps. The operating conditions for the seed bioreactor were studied in an integrated fashion with the production bioreactor using a two stage design of experiments (DOE) methodology to enable optimization of operating conditions. A two level Resolution IV design was followed by a central composite design (CCD). These experiments enabled identification of the edge of failure and classification of the operational parameters as non-key, key or critical. In addition, the models generated from the data provide further insight into balancing productivity of the cell culture process with product quality considerations. Finally, process and product-related impurity clearance was evaluated by studies linking the upstream process with downstream purification. Production bioreactor parameters that directly influence antibody charge variants and glycosylation in CHO systems were identified.
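The central composite design used in the second stage is easy to construct by hand. A hedged sketch that builds a face-centred CCD in coded units and decodes it to two hypothetical bioreactor setpoints (the parameter names and ranges are invented):

```python
import numpy as np
from itertools import product

def ccd(n_factors, alpha=1.0, n_center=3):
    # Central composite design in coded units: factorial corners,
    # axial (star) points at +/-alpha, and replicated centre points.
    corners = np.array(list(product([-1.0, 1.0], repeat=n_factors)))
    axial = np.zeros((2 * n_factors, n_factors))
    for i in range(n_factors):
        axial[2 * i, i], axial[2 * i + 1, i] = -alpha, alpha
    center = np.zeros((n_center, n_factors))
    return np.vstack([corners, axial, center])

# Decode to hypothetical engineering units: pH and temperature setpoints.
lo = np.array([6.8, 34.0])
hi = np.array([7.2, 37.0])
design = ccd(2, alpha=1.0)  # alpha = 1 gives a face-centred CCD
runs = lo + (design + 1.0) / 2.0 * (hi - lo)
print(runs)                 # one bioreactor run per row (11 runs for 2 factors)
```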
NASA Technical Reports Server (NTRS)
1980-01-01
The design, fabrication, and installation of an experimental process system development unit (EPSDU) were analyzed. Supporting research and development were performed to provide an information database usable for the EPSDU and for the technological design and economic analysis of potential scale-up of the process. Iterative economic analyses were conducted of the estimated product cost for the production of semiconductor-grade silicon in a facility capable of producing 1000 MT/yr.
Language Analysis Package (L.A.P.) Version I System Design.
ERIC Educational Resources Information Center
Porch, Ann
To permit researchers to use the speed and versatility of the computer to process natural language text as well as numerical data without undergoing special training in programing or computer operations, a language analysis package has been developed partially based on several existing programs. An overview of the design is provided and system…
78 FR 49337 - Direct Grant Programs and Definitions That Apply to Department Regulations
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-13
... Department's grant process with the Secretary's policy objectives and allow Department programs to design....210(c) (Quality of the Project Design) (Amended Sec. Sec. 75.209 and 75.210); 6. Authorize program... regulations. We group major issues according to subject. Analysis of Comments and Changes: An analysis of the...
Evolutionary computing for the design search and optimization of space vehicle power subsystems
NASA Technical Reports Server (NTRS)
Kordon, M.; Klimeck, G.; Hanks, D.
2004-01-01
Evolutionary computing has proven to be a straightforward and robust approach for optimizing a wide range of difficult analysis and design problems. This paper discusses the application of these techniques to an existing space vehicle power subsystem resource and performance analysis simulation in a parallel processing environment.
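As a rough illustration of the evolutionary approach, the sketch below runs a small real-coded genetic algorithm (truncation selection, blend crossover, Gaussian mutation) against a toy figure of merit; the fitness function and parameter ranges are invented stand-ins for the power subsystem resource and performance simulation:

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(x):
    # Hypothetical figure of merit for a power subsystem design.
    panel_area, battery_kwh = x
    return battery_kwh * 0.8 + panel_area * 0.5 - 0.02 * panel_area * battery_kwh

lo, hi = np.array([1.0, 2.0]), np.array([20.0, 40.0])
pop = rng.uniform(lo, hi, size=(40, 2))

for generation in range(100):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-20:]]             # truncation selection
    mates = parents[rng.integers(0, 20, size=(40, 2))]  # random parent pairs
    w = rng.random((40, 1))
    children = w * mates[:, 0] + (1 - w) * mates[:, 1]  # blend crossover
    children += rng.normal(0.0, 0.3, children.shape)    # Gaussian mutation
    pop = np.clip(children, lo, hi)

best = max(pop, key=fitness)
print(best, fitness(best))
```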
21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2010 CFR
2010-04-01
...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard Analysis and Critical Control Point (HACCP...
21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2012 CFR
2012-04-01
...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Hazard Analysis and Critical Control Point (HACCP...
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2013 CFR
2013-04-01
... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2011 CFR
2011-04-01
... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...
21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2013 CFR
2013-04-01
...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Hazard Analysis and Critical Control Point (HACCP...
21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2014 CFR
2014-04-01
...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Hazard Analysis and Critical Control Point (HACCP...
21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2011 CFR
2011-04-01
...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Hazard Analysis and Critical Control Point (HACCP...
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2014 CFR
2014-04-01
... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...
Elemental Learning as a Framework for E-Learning
ERIC Educational Resources Information Center
Dempsey, John V.; Litchfield, Brenda C.
2013-01-01
Analysis of learning outcomes can be a complex and esoteric instructional design process that is often ignored by educators and e-learning designers. This paper describes a model of analysis that fosters the real-life application of learning outcomes and explains why the model may be needed. The Elemental Learning taxonomy is a hierarchical model…
Cervera-Padrell, Albert E; Skovby, Tommy; Kiil, Søren; Gani, Rafiqul; Gernaey, Krist V
2012-10-01
A systematic framework is proposed for the design of continuous pharmaceutical manufacturing processes. Specifically, the design framework focuses on organic chemistry based, active pharmaceutical ingredient (API) synthetic processes, but could potentially be extended to biocatalytic and fermentation-based products. The method exploits the synergistic combination of continuous flow technologies (e.g., microfluidic techniques) and process systems engineering (PSE) methods and tools for faster process design and increased process understanding throughout the whole drug product and process development cycle. The design framework structures the many different and challenging design problems (e.g., solvent selection, reactor design, and design of separation and purification operations), driving the user from the initial drug discovery steps, where process knowledge is very limited, toward the detailed design and analysis. Examples from the literature of PSE methods and tools applied to pharmaceutical process design and novel pharmaceutical production technologies are provided along the text, assisting in the accumulation and interpretation of process knowledge. Different criteria are suggested for the selection of batch and continuous processes so that the whole design results in low capital and operational costs as well as a low environmental footprint. The design framework has been applied to the retrofit of an existing batch-wise process used by H. Lundbeck A/S to produce an API: zuclopenthixol. Some of its batch operations were successfully converted into continuous mode, obtaining higher yields that allowed a significant simplification of the whole process. The material and environmental footprint of the process, evaluated through the process mass intensity index (kg of material used per kg of product), was reduced to half of its initial value, with potential for further reduction. The case study includes reaction steps typically used by the pharmaceutical industry featuring different characteristic reaction times, as well as L-L separation and distillation-based solvent exchange steps, and thus constitutes a good example of how the design framework can be used to efficiently design novel or existing API manufacturing processes that take advantage of continuous processing. Copyright © 2012 Elsevier B.V. All rights reserved.
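The process mass intensity metric used in that assessment is a simple ratio and is worth making concrete; the stream masses below are invented for illustration:

```python
# Process mass intensity: kg of all input materials per kg of API produced.
# Stream masses are invented for illustration.
inputs_kg = {"reagents": 120.0, "solvents": 640.0, "water": 410.0}
product_kg = 25.0

pmi = sum(inputs_kg.values()) / product_kg
print(f"PMI = {pmi:.1f} kg input per kg product")  # halving PMI halves this ratio
```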
Unstructured Grids for Sonic Boom Analysis and Design
NASA Technical Reports Server (NTRS)
Campbell, Richard L.; Nayani, Sudheer N.
2015-01-01
An evaluation of two methods for improving the process for generating unstructured CFD grids for sonic boom analysis and design has been conducted. The process involves two steps: the generation of an inner core grid using a conventional unstructured grid generator such as VGRID, followed by the extrusion of a sheared and stretched collar grid through the outer boundary of the core grid. The first method evaluated, known as COB, automatically creates a cylindrical outer boundary definition for use in VGRID that makes the extrusion process more robust. The second method, BG, generates the collar grid by extrusion in a very efficient manner. Parametric studies have been carried out and new options evaluated for each of these codes with the goal of establishing guidelines for best practices for maintaining boom signature accuracy with as small a grid as possible. In addition, a preliminary investigation examining the use of the CDISC design method for reducing sonic boom utilizing these grids was conducted, with initial results confirming the feasibility of a new remote design approach.
Object-Oriented MDAO Tool with Aeroservoelastic Model Tuning Capability
NASA Technical Reports Server (NTRS)
Pak, Chan-gi; Li, Wesley; Lung, Shun-fat
2008-01-01
An object-oriented multidisciplinary analysis and optimization (MDAO) tool has been developed at the NASA Dryden Flight Research Center to automate the design and analysis process and leverage existing commercial as well as in-house codes to enable true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic, and hypersonic aircraft. Once the structural analysis discipline is finalized and integrated completely into the MDAO process, other disciplines such as aerodynamics and flight controls will be integrated as well. Simple and efficient model tuning capabilities, formulated as an optimization problem, have been successfully integrated with the MDAO tool. Better synchronization of all phases of experimental testing (ground and flight), analytical model updating, high-fidelity simulation for model validation, and integrated design may reduce uncertainties in the aeroservoelastic model and increase flight safety.
Computational analysis of liquid hypergolic propellant rocket engines
NASA Technical Reports Server (NTRS)
Krishnan, A.; Przekwas, A. J.; Gross, K. W.
1992-01-01
The combustion process in liquid rocket engines depends on a number of complex phenomena such as atomization, vaporization, spray dynamics, mixing, and reaction mechanisms. A computational tool to study their mutual interactions is developed to help analyze these processes with a view of improving existing designs and optimizing future designs of the thrust chamber. The focus of the article is on the analysis of the Variable Thrust Engine for the Orbit Maneuvering Vehicle. This engine uses a hypergolic liquid bipropellant combination of monomethyl hydrazine as fuel and nitrogen tetroxide as oxidizer.
Cost studies for commercial fuselage crown designs
NASA Technical Reports Server (NTRS)
Walker, T. H.; Smith, P. J.; Truslove, G.; Willden, K. S.; Metschan, S. L.; Pfahl, C. L.
1991-01-01
Studies were conducted to evaluate the cost and weight potential of advanced composite design concepts in the crown region of a commercial transport. Two designs from each of three design families were developed using an integrated design-build team. A range of design concepts and manufacturing processes were included to allow isolation and comparison of cost centers. Detailed manufacturing/assembly plans were developed as the basis for cost estimates. Each of the six designs was found to have advantages over the 1995 aluminum benchmark in cost and weight trade studies. Large quadrant panels and cobonded frames were found to save significant assembly labor costs. Comparisons of high- and intermediate-performance fiber systems were made for skin and stringer applications. Advanced tow placement was found to be an efficient process for skin lay up. Further analysis revealed attractive processes for stringers and frames. Optimized designs were informally developed for each design family, combining the most attractive concepts and processes within that family. A single optimized design was selected as the most promising, and the potential for further optimization was estimated. Technical issues and barriers were identified.
Parallel ICA and its hardware implementation in hyperspectral image analysis
NASA Astrophysics Data System (ADS)
Du, Hongtao; Qi, Hairong; Peterson, Gregory D.
2004-04-01
Advances in hyperspectral imaging have dramatically boosted remote sensing applications by providing abundant information from hundreds of contiguous spectral bands. However, the high volume of information also results in an excessive computation burden. Since most materials have specific characteristics only at certain bands, much of this information is redundant. This property of hyperspectral images has motivated many researchers to study various dimensionality reduction algorithms, including Projection Pursuit (PP), Principal Component Analysis (PCA), wavelet transform, and Independent Component Analysis (ICA), of which ICA is one of the most popular techniques. It searches for a linear or nonlinear transformation that minimizes the statistical dependence between spectral bands. Through this process, ICA can eliminate superfluous information while retaining practical information, given only the observations of hyperspectral images. One hurdle in applying ICA to hyperspectral image (HSI) analysis, however, is its long computation time, especially for high-volume hyperspectral data sets. Even the most efficient method, FastICA, is very time-consuming. In this paper, we present a parallel ICA (pICA) algorithm derived from FastICA. During the unmixing process, pICA divides the estimation of the weight matrix into sub-processes that can be conducted in parallel on multiple processors. The decorrelation process is decomposed into internal decorrelation and external decorrelation, which perform weight vector decorrelations within individual processors and between cooperating processors, respectively. In order to further improve the performance of pICA, we seek hardware solutions for the implementation of pICA. To date, there have been very few hardware designs for ICA-related processes, owing to the complicated and iterative computation involved. This paper discusses the capacity limitations of FPGA implementations of pICA in HSI analysis. An Application-Specific Integrated Circuit (ASIC) synthesis is designed for pICA-based dimensionality reduction in HSI analysis. The pICA design is implemented using standard-height cells and targets the TSMC 0.18 micron process. During the synthesis procedure, three ICA-related reconfigurable components are developed for reuse and retargeting purposes. Preliminary results show that the standard-height-cell-based ASIC synthesis provides an effective solution for pICA and ICA-related processes in HSI analysis.
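For comparison with the parallel algorithm, the serial FastICA baseline that pICA decomposes is available off the shelf. A minimal sketch applying scikit-learn's FastICA to a synthetic cube (the cube dimensions and component count are invented; this shows the serial baseline, not the pICA decomposition itself):

```python
import numpy as np
from sklearn.decomposition import FastICA

# Synthetic stand-in for a hyperspectral cube: 64 x 64 pixels, 200 bands.
rows, cols, bands = 64, 64, 200
cube = np.random.default_rng(0).random((rows, cols, bands))

# Flatten to (pixels, bands): ICA treats each band as a mixed signal.
X = cube.reshape(-1, bands)

# Reduce 200 correlated bands to 10 statistically independent components.
ica = FastICA(n_components=10, max_iter=500, random_state=0)
components = ica.fit_transform(X)             # shape: (pixels, 10)
reduced = components.reshape(rows, cols, 10)  # back to image layout
print(reduced.shape)
```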
Process Design and Techno-economic Analysis for Materials to Treat Produced Waters.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heimer, Brandon Walter; Paap, Scott M; Sasan, Koroush
Significant quantities of water are produced during enhanced oil recovery, making these “produced water” streams attractive candidates for treatment and reuse. However, high concentrations of dissolved silica raise the propensity for fouling. In this paper, we report the design and economic analysis for a new ion exchange process using calcined hydrotalcite (HTC) to remove silica from water. This process improves upon known technologies by minimizing sludge product, reducing process fouling, and lowering energy use. Process modeling outputs included raw material requirements, energy use, and the minimum water treatment price (MWTP). Monte Carlo simulations quantified the impact of uncertainty and variability in process inputs on MWTP. These analyses showed that cost can be significantly reduced if the HTC materials are optimized. Specifically, R&D improving HTC reusability, silica binding capacity, and raw material price can reduce MWTP by 40%, 13%, and 20%, respectively. Optimizing geographic deployment further improves cost competitiveness.
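The Monte Carlo step, propagating uncertain process inputs through to a distribution of MWTP, can be sketched in a few lines. The cost model, distributions, and numbers below are invented stand-ins for the paper's full process model:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

# Invented uncertain inputs: HTC price [$/kg], reuse cycles, binding capacity [g/kg].
htc_price = rng.triangular(2.0, 3.5, 6.0, N)
reuse_cycles = rng.integers(5, 26, N)
capacity = rng.normal(45.0, 5.0, N)

# Toy cost model: sorbent cost per m3 treated, plus invented fixed costs.
silica_load = 150.0  # g of silica per m3 of produced water
sorbent_cost = htc_price * silica_load / (capacity * reuse_cycles)
mwtp = 0.35 + sorbent_cost  # $/m3

print(f"median MWTP ${np.median(mwtp):.2f}/m3, "
      f"P90 ${np.percentile(mwtp, 90):.2f}/m3")
```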
Design and Analysis of a Two-Stage Adsorption Air Chiller
NASA Astrophysics Data System (ADS)
Benrajesh, P.; Rajan, A. John
2017-05-01
The objective of this article is to design and build a bio-friendly air conditioner using the adsorption method in the presence of 15% calcium carbide in water. Aluminum sheet metal is used to form three identical tunnels that pass the air for processing. Exhaust heat generated by the dairy sterilizing unit process is reutilized for cooling the environment through this equipment. The equipment is designed, and an analysis is carried out to quantify the COP, SCP, and cooling power. Heat exchangers are designed; their performance parameters are quantified and correlated with conventional designs. It is observed that the new adsorption chiller can produce a chiller coefficient of performance (COP) of 1.068, a specific cooling power (SCP) of 10.66 W/kg, and a cooling power of 4.2 kW. The equipment needs up to 15 minutes to bring the air from the existing room temperature (29 °C) down to the desired cool breeze (24 °C).
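The quoted figures can be cross-checked against each other; a small sketch, treating the reported COP, SCP, and cooling power as given, backs out the implied driving heat input and adsorbent mass:

```python
# Quantities quoted in the abstract.
cop = 1.068             # cooling power / driving heat input
scp = 10.66             # specific cooling power, W per kg of adsorbent
cooling_power = 4200.0  # W

heat_input = cooling_power / cop      # ~3.9 kW of recovered exhaust heat
adsorbent_mass = cooling_power / scp  # ~394 kg of adsorbent

print(f"heat input ~ {heat_input/1000:.2f} kW, adsorbent ~ {adsorbent_mass:.0f} kg")
```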
NASA Technical Reports Server (NTRS)
Dorney, Suzanne; Dorney, Daniel J.; Huber, Frank; Sheffler, David A.; Turner, James E. (Technical Monitor)
2001-01-01
The advent of advanced computer architectures and parallel computing have led to a revolutionary change in the design process for turbomachinery components. Two- and three-dimensional steady-state computational flow procedures are now routinely used in the early stages of design. Unsteady flow analyses, however, are just beginning to be incorporated into design systems. This paper outlines the transition of a three-dimensional unsteady viscous flow analysis from the research environment into the design environment. The test case used to demonstrate the analysis is the full turbine system (high-pressure turbine, inter-turbine duct and low-pressure turbine) from an advanced turboprop engine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palta, J.
2015-06-15
Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are caused by flaws in clinical processes rather than by device failures. This suggests the need for a quality management program based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, that they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100, which has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how they can be used in a given radiotherapy clinic to develop a risk-based quality management program. Learning Objectives: (1) learn how to design a process map for a radiotherapy process; (2) learn how to perform failure modes and effects analysis for a given process; (3) learn what fault trees are; (4) learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Dunscombe: Director, TreatSafely, LLC and Center for the Assessment of Radiological Sciences; consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.
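Of the tools named, fault tree analysis is the most compact to illustrate. A minimal sketch computing a top-event probability from AND/OR gates over independent basic events; the event names and probabilities are invented, not drawn from the Task Group 100 report:

```python
from functools import reduce

# Independent basic-event probabilities (invented for illustration).
p = {"wrong_plan_selected": 1e-3, "plan_check_missed": 5e-2,
     "mlc_fault": 2e-4, "interlock_fails": 1e-2}

def gate_and(*probs):  # all inputs must occur
    return reduce(lambda a, b: a * b, probs)

def gate_or(*probs):   # any input occurring suffices
    return 1.0 - reduce(lambda a, b: a * (1.0 - b), probs, 1.0)

# Top event: mistreatment = (wrong plan AND check missed)
#                        OR (MLC fault AND interlock fails)
top = gate_or(gate_and(p["wrong_plan_selected"], p["plan_check_missed"]),
              gate_and(p["mlc_fault"], p["interlock_fails"]))
print(f"P(top event) ~ {top:.2e}")
```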
Portfolio Assessment on Chemical Reactor Analysis and Process Design Courses
ERIC Educational Resources Information Center
Alha, Katariina
2004-01-01
Assessment determines what students regard as important: if a teacher wants to change students' learning, he/she should change the methods of assessment. This article describes the use of portfolio assessment on five courses dealing with chemical reactor and process design during the years 1999-2001. Although the use of portfolio was a new…
Building Bridges to Connect the Disconnects: An Analysis of Business Program Design Processes
ERIC Educational Resources Information Center
Fleming, Debra L.
2008-01-01
The purpose of this study is to analyze current trends of design processes and redesign efforts for business programs. A review of the literature suggests business schools are not preparing graduates of their programs with the necessary knowledge, skills and dispositions as deemed appropriate to succeed in the world of work. Some research studies…
ERIC Educational Resources Information Center
Onaral, Banu; And Others
This report describes the development of a Drexel University electrical and computer engineering course on digital filter design that used interactive computing and graphics, and was one of three courses in a senior-level sequence on digital signal processing (DSP). Interactive and digital analysis/design routines and the interconnection of these…
Process-based Cost Estimation for Ramjet/Scramjet Engines
NASA Technical Reports Server (NTRS)
Singh, Brijendra; Torres, Felix; Nesman, Miles; Reynolds, John
2003-01-01
Process-based cost estimation plays a key role in effecting cultural change that integrates distributed science, technology, and engineering teams to rapidly create innovative and affordable products. Working together, NASA Glenn Research Center and Boeing Canoga Park have developed a methodology of process-based cost estimation bridging the methodologies of high-level parametric models and detailed bottoms-up estimation. The NASA GRC/Boeing CP process-based cost model provides a probabilistic structure of layered cost drivers. High-level inputs characterize mission requirements, system performance, and relevant economic factors. Design alternatives are extracted from a standard, product-specific work breakdown structure to pre-load lower-level cost driver inputs and generate the cost-risk analysis. As the product design progresses and matures, the lower-level, more detailed cost drivers can be revisited and the projected variation of input values narrowed, thereby generating a progressively more accurate estimate of cost-risk. Incorporated into the process-based cost model are techniques for decision analysis, specifically the analytic hierarchy process (AHP) and functional utility analysis. Design alternatives may then be evaluated not just on cost-risk, but also on user-defined performance and schedule criteria. This implementation of full trade-study support contributes significantly to the realization of the integrated development environment. The process-based cost estimation model generates development and manufacturing cost estimates. The development team plans to expand the manufacturing process base from approximately 80 manufacturing processes to over 250 processes. Operation and support cost modeling is also envisioned. Process-based estimation considers the materials, resources, and processes in establishing cost-risk and, rather than depending on weight as an input, actually estimates weight along with cost and schedule.
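The AHP step mentioned above reduces to extracting the principal eigenvector of a pairwise comparison matrix. A minimal sketch with an invented comparison of three evaluation criteria (cost-risk, performance, schedule):

```python
import numpy as np

# Pairwise comparisons on Saaty's 1-9 scale (invented judgments):
# cost-risk vs performance = 3, cost-risk vs schedule = 5, performance vs schedule = 2.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()  # criterion priorities

ci = (eigvals.real[k] - len(A)) / (len(A) - 1)  # consistency index
print(weights, f"CI = {ci:.3f}")  # CI/0.58 < 0.1 => acceptably consistent for n=3
```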
NASA Technical Reports Server (NTRS)
O'Connor, Brian; Hernandez, Deborah; Hornsby, Linda; Brown, Maria; Horton-Mullins, Kathryn
2017-01-01
Outline: Background of ISS (International Space Station) Material Science Research Rack; NASA SCA (Sample Cartridge Assembly) Design; GEDS (Gravitational Effects in Distortion in Sintering) Experiment Ampoule Design; Development Testing Summary; Thermal Modeling and Analysis. Summary: GEDS design development challenging (GEDS Ampoule design developed through MUGS (Microgravity) testing; Short duration transient sample processing; Unable to measure sample temperatures); MUGS Development testing used to gather data (Actual LGF (Low Gradient Furnace)-like furnace response; Provided sample for sintering evaluation); Transient thermal model integral to successful GEDS experiment (Development testing provided furnace response; PI (Performance Indicator) evaluation of sintering anchored model evaluation of processing durations; Thermal transient model used to determine flight SCA sample processing profiles).
Integrated Aerodynamic/Structural/Dynamic Analyses of Aircraft with Large Shape Changes
NASA Technical Reports Server (NTRS)
Samareh, Jamshid A.; Chwalowski, Pawel; Horta, Lucas G.; Piatak, David J.; McGowan, Anna-Maria R.
2007-01-01
The conceptual and preliminary design processes for aircraft with large shape changes are generally difficult and time-consuming, and the processes are often customized for a specific shape change concept to streamline the vehicle design effort. Accordingly, several existing reports show excellent results of assessing a particular shape change concept or perturbations of a concept. The goal of the current effort was to develop a multidisciplinary analysis tool and process that would enable an aircraft designer to assess several very different morphing concepts early in the design phase and yet obtain second-order performance results so that design decisions can be made with better confidence. The approach uses an efficient parametric model formulation that allows automatic model generation for systems undergoing radical shape changes as a function of aerodynamic parameters, geometry parameters, and shape change parameters. In contrast to other more self-contained approaches, this approach utilizes off-the-shelf analysis modules to reduce development time and to make it accessible to many users. Because the analysis is loosely coupled, discipline modules like a multibody code can be easily swapped for other modules with similar capabilities. One of the advantages of this loosely coupled system is the ability to use medium- to high-fidelity tools early in the design stages, when the information can significantly influence and improve the overall vehicle design. Data transfer among the analysis modules is based on an accurate and automated general-purpose data transfer tool. In general, setup time for the integrated system presented in this paper is 2-4 days for simple shape change concepts and 1-2 weeks for more mechanically complicated concepts. Some of the key elements briefly described in the paper include parametric model development, aerodynamic database generation, multibody analysis, and the required software modules, as well as examples for a telescoping wing, a folding wing, and a bat-like wing.
Surface-specific additive manufacturing test artefacts
NASA Astrophysics Data System (ADS)
Townsend, Andrew; Racasan, Radu; Blunt, Liam
2018-06-01
Many test artefact designs have been proposed for use with additive manufacturing (AM) systems. These test artefacts have primarily been designed for the evaluation of AM form and dimensional performance. A series of surface-specific measurement test artefacts designed for use in the verification of AM manufacturing processes are proposed here. Surface-specific test artefacts can be made more compact because they do not require the large dimensions needed for accurate dimensional and form measurements. The series of three test artefacts are designed to provide comprehensive information pertaining to the manufactured surface. Measurement possibilities include deviation analysis, surface texture parameter data generation, sub-surface analysis, layer step analysis and build resolution comparison. The test artefacts are designed to provide easy access for measurement using conventional surface measurement techniques, for example, focus variation microscopy, stylus profilometry, confocal microscopy and scanning electron microscopy. Additionally, the test artefacts may be simply visually inspected as a comparative tool, giving a fast indication of process variation between builds. The three test artefacts are small enough to be included in every build and include built-in manufacturing traceability information, making them a convenient physical record of the build.
Design Criteria For Networked Image Analysis System
NASA Astrophysics Data System (ADS)
Reader, Cliff; Nitteberg, Alan
1982-01-01
Image systems design is currently undergoing a metamorphosis from the conventional computing systems of the past into a new generation of special-purpose designs. This change is motivated by several factors, notable among which is the increased opportunity for high performance at low cost offered by advances in semiconductor technology. Another key issue is a maturing understanding of problems and the applicability of digital processing techniques. These factors allow the design of cost-effective systems that are functionally dedicated to specific applications and used in a utilitarian fashion. Following an overview of the above issues, the paper presents a top-down approach to the design of networked image analysis systems. The requirements for such a system are presented, with orientation toward the hospital environment. The three main areas are image database management, viewing of image data, and image data processing. This is followed by a survey of the current state of the art, covering image display systems, database techniques, communications networks, and software systems control. The paper concludes with a description of the functional subsystems and architectural framework for networked image analysis in a production environment.
Designing Systems for Environmental Sustainability
Dr. Smith will describe his U.S. EPA research which involves elements of design, from systems as diverse as biofuel supply chains to recycling systems and chemical processes. Design uses models that rate performance as part of a synthesis approach, where steps of analysis and sy...
[Quality by design approaches for pharmaceutical development and manufacturing of Chinese medicine].
Xu, Bing; Shi, Xin-Yuan; Wu, Zhi-Sheng; Zhang, Yan-Ling; Wang, Yun; Qiao, Yan-Jiang
2017-03-01
Pharmaceutical quality is built by design, formed in the manufacturing process, and improved during the product's lifecycle. Based on a comprehensive literature review of pharmaceutical quality by design (QbD), the essential ideas and implementation strategies of pharmaceutical QbD are interpreted. Considering the complex nature of Chinese medicine, the "4H" model is proposed for implementing QbD in the pharmaceutical development and industrial manufacture of Chinese medicine products. "4H" is an acronym for holistic design, holistic information analysis, holistic quality control, and holistic process optimization, consistent with the holistic concept of Chinese medicine theory. Holistic design aims at constructing both the quality problem space from patient requirements and the quality solution space from multidisciplinary knowledge. Holistic information analysis emphasizes understanding the quality pattern of Chinese medicine by integrating and mining multisource data and information at a relatively high level. Batch-to-batch quality consistency and manufacturing system reliability can be realized by the comprehensive application of inspective, statistical, predictive, and intelligent quality control strategies. Holistic process optimization improves product quality and process capability during product lifecycle management. The implementation of QbD is useful for eliminating the contradictions inherent in the pharmaceutical development and manufacturing process of Chinese medicine products, and helps guarantee cost effectiveness. Copyright© by the Chinese Pharmaceutical Association.
NASA Technical Reports Server (NTRS)
Neam, Douglas C.; Gerber, John D.
1992-01-01
The stringent stability requirements of the Corrective Optics Space Telescope Axial Replacement (COSTAR) necessitate a Deployable Optical Bench (DOB) with both a low CTE and a high resonant frequency. The DOB design consists of a monocoque thin-shell structure that marries machined metallic parts with formed graphite-epoxy structure. Structural analysis of the DOB has been integrated into the laminate design and optimization process. In addition, the structural analytical results are compared with vibration and thermal test data to assess the reliability of the analysis.
Geometry and Function Definition for Discrete Analysis and Its Relationship to the Design Data Base.
1977-08-01
clarify its dependence on the design process as a whole. The model generation capabilities of a state-of-the-art structural analysis system (GIFTS 4), heavily oriented towards pre- and post-processing ... independently at a later stage. GEOMETRIC HIERARCHY OF DEFINITION IN GIFTS 4: A three-dimensional object, to be designed or analyzed ...
Probabilistic Structural Analysis of the SRB Aft Skirt External Fitting Modification
NASA Technical Reports Server (NTRS)
Townsend, John S.; Peck, J.; Ayala, S.
1999-01-01
NASA has funded several major programs (the PSAM Project is an example) to develop Probabilistic Structural Analysis Methods and tools for engineers to apply in the design and assessment of aerospace hardware. A probabilistic finite element design tool, known as NESSUS, is used to determine the reliability of the Space Shuttle Solid Rocket Booster (SRB) aft skirt critical weld. An external bracket modification to the aft skirt provides a comparison basis for examining the details of the probabilistic analysis and its contributions to the design process.
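The question NESSUS answers, the probability that load effects exceed capacity at the critical weld, can be illustrated with a crude Monte Carlo stand-in. The stress and strength distributions below are invented, and NESSUS itself uses far more efficient probabilistic finite element methods than brute-force sampling:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 1_000_000

# Invented distributions: weld stress from flight loads vs. weld strength.
stress = rng.lognormal(mean=np.log(180.0), sigma=0.12, size=N)  # MPa
strength = rng.normal(loc=310.0, scale=20.0, size=N)            # MPa

# Failure occurs whenever the sampled stress exceeds the sampled strength.
failures = np.count_nonzero(stress > strength)
print(f"P(failure) ~ {failures / N:.2e}")
```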
NASA Technical Reports Server (NTRS)
Bhat, Biliyar N.
2008-01-01
The Ares I Crew Launch Vehicle Upper Stage is designed and developed based on sound systems engineering principles. Systems engineering starts with the concept of operations and mission requirements, which in turn determine the launch system architecture and its performance requirements. The Ares I Upper Stage is designed and developed to meet these requirements. Designers depend on support from materials, processes, and manufacturing during the design, development, and verification of subsystems and components. The requirements relative to reliability, safety, operability, and availability also depend on materials availability, characterization, process maturation, and vendor support. This paper discusses the roles and responsibilities of materials and manufacturing engineering during the various phases of Ares I Upper Stage development, including design and analysis, hardware development, and test and verification. Emphasis is placed on how materials, processes, and manufacturing support is integrated across the Upper Stage Project, both horizontally and vertically. In addition, the paper describes the approach used to ensure compliance with materials, processes, and manufacturing requirements during the project cycle, with a focus on hardware systems design and development.
MAGMA: analysis of two-channel microarrays made easy.
Rehrauer, Hubert; Zoller, Stefan; Schlapbach, Ralph
2007-07-01
The web application MAGMA provides a simple and intuitive interface for identifying differentially expressed genes from two-channel microarray data. While the underlying algorithms are not superior to those of similar web applications, MAGMA is particularly user-friendly and can be used without prior training. The user interface guides the novice user through the most typical microarray analysis workflow, consisting of data upload, annotation, normalization, and statistical analysis. It automatically generates R scripts that document MAGMA's entire data processing steps, thereby allowing the user to regenerate all results in a local R installation. The implementation of MAGMA follows the model-view-controller design pattern, which strictly separates the R-based statistical data processing, the web representation, and the application logic. This modular design makes the application flexible and easily extendible by experts in any one of the fields: statistical microarray analysis, web design, or software development. State-of-the-art Java Server Faces technology was used to generate the web interface and to perform user input processing. MAGMA's object-oriented modular framework makes it easily extendible and applicable to other fields, and demonstrates that modern Java technology is also suitable for rather small and concise academic projects. MAGMA is freely available at www.magma-fgcz.uzh.ch.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fuente, Rafael de la; Iglesias, Javier; Sedano, Pablo G.
IBERDROLA (a Spanish utility) and IBERDROLA INGENIERIA (its engineering branch) have been developing, over the last two years, the 110% Extended Power Uprate Project for Cofrentes BWR-6. IBERDROLA has an in-house design and licensing reload methodology that has been approved in advance by the Spanish Nuclear Regulatory Authority. This methodology has been applied to perform the nuclear design and the reload licensing analysis for Cofrentes cycles 12 and 13 and to develop a significant number of safety analyses for the Cofrentes Extended Power Uprate. Because the scope of the licensing process of the Cofrentes Extended Power Uprate exceeds the range of analysis included in the Cofrentes generic reload licensing process, it has been necessary to extend the applicability of the Cofrentes RETRAN model to the analysis of new transients. This is the case of the total loss of feedwater (TLFW) transient. This paper shows the benefits of having an in-house design and licensing methodology and describes the process of extending the applicability of the Cofrentes RETRAN model to the analysis of new transients, in particular the TLFW transient.
Pressure Mapping and Efficiency Analysis of an EPPLER 857 Hydrokinetic Turbine
NASA Astrophysics Data System (ADS)
Clark, Tristan
A conceptual energy ship is presented to provide renewable energy. The ship, driven by the wind, drags a hydrokinetic turbine (HKT) through the water. The power generated is used to run electrolysis on board, and the resultant hydrogen is taken back to shore to be used as an energy source. The basin efficiency (power divided by thrust times velocity) of the HKT plays a vital role in this process. In order to extract the maximum allowable power from the flow, the blades need to be optimized, and structural analysis of the blade is important because the blade will undergo high pressure loads from the water. A procedure for analysis of a preliminary HKT blade design is developed. The blade was designed by a non-optimized Blade Element Momentum Theory (BEMT) code. Six simulations were run, with varying mesh resolution, turbulence models, and flow region size. The procedure provides a detailed explanation of the entire process, from geometry and mesh generation to post-processing analysis tools. The efficiency results from the simulations are used to study the mesh resolution, flow region size, and turbulence models, and are compared to the BEMT model design targets. Static pressure maps are created that can be used for structural analysis of the blades.
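To make the BEMT design step concrete, here is a toy single-annulus blade element momentum iteration. The geometry, tip-speed ratio, and airfoil polars (thin-airfoil lift slope, constant drag) are invented for the example and are not the paper's blade.

```python
# Fixed-point iteration for axial (a) and tangential (ap) induction factors
# at one blade annulus -- the core loop of a simple BEMT code.
import math

def bemt_annulus(r=1.5, R=3.0, B=3, chord=0.25, twist_deg=8.0,
                 tsr=5.0, cl_slope=2 * math.pi, cd=0.01, tol=1e-8):
    lam_r = tsr * r / R                       # local speed ratio
    sigma = B * chord / (2 * math.pi * r)     # local solidity
    a, ap = 0.3, 0.0
    for _ in range(200):
        phi = math.atan2(1 - a, (1 + ap) * lam_r)   # inflow angle
        alpha = phi - math.radians(twist_deg)       # angle of attack
        cl = cl_slope * alpha                       # thin-airfoil lift (assumed)
        cn = cl * math.cos(phi) + cd * math.sin(phi)
        ct = cl * math.sin(phi) - cd * math.cos(phi)
        a_new = 1.0 / (4 * math.sin(phi) ** 2 / (sigma * cn) + 1)
        ap_new = 1.0 / (4 * math.sin(phi) * math.cos(phi) / (sigma * ct) - 1)
        if abs(a_new - a) < tol and abs(ap_new - ap) < tol:
            break
        a, ap = a_new, ap_new
    return a, ap, math.degrees(phi)

print(bemt_annulus())   # converged induction factors and inflow angle
```

A real code repeats this per annulus and integrates thrust and torque over the span; tip-loss and high-induction corrections are omitted here for brevity.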
Accident analysis and control options in support of the sludge water system safety analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
HEY, B.E.
A hazards analysis was initiated for the SWS in July 2001 (SNF-8626, K Basin Sludge and Water System Preliminary Hazard Analysis) and updated in December 2001 (SNF-10020 Rev. 0, Hazard Evaluation for KE Sludge and Water System - Project A16) based on conceptual design information for the Sludge Retrieval System (SRS) and 60% design information for the cask and container. SNF-10020 was again revised in September 2002 to incorporate new hazards identified from final design information and from a What-if/Checklist evaluation of operational steps. The process hazards, controls, and qualitative consequence and frequency estimates taken from these efforts have been incorporated into Revision 5 of HNF-3960, K Basins Hazards Analysis. The hazards identification process documented in the above referenced reports utilized standard industrial safety techniques (AIChE 1992, Guidelines for Hazard Evaluation Procedures) to systematically guide several interdisciplinary teams through the system using a pre-established set of process parameters (e.g., flow, temperature, pressure) and guide words (e.g., high, low, more, less). The teams generally included representation from the U.S. Department of Energy (DOE), K Basins Nuclear Safety, T Plant Nuclear Safety, K Basin Industrial Safety, fire protection, project engineering, operations, and facility engineering.
Colossal Tooling Design: 3D Simulation for Ergonomic Analysis
NASA Technical Reports Server (NTRS)
Hunter, Steve L.; Dischinger, Charles; Thomas, Robert E.; Babai, Majid
2003-01-01
High-level 3D simulation software was applied during the design phase of colossal mandrel tooling for composite aerospace fuel tanks to discover and resolve safety and human engineering problems. The analyses were conducted to determine safety, ergonomic, and human engineering aspects of the disassembly process of the fuel tank composite shell mandrel. High-level three-dimensional graphics software, incorporating various ergonomic analysis algorithms, was utilized to determine whether the process was within safety and health boundaries for the workers carrying out these tasks. In addition, the graphical software was extremely helpful in identifying material handling equipment and devices for the mandrel tooling assembly/disassembly process.
Formal Analysis of BPMN Models Using Event-B
NASA Astrophysics Data System (ADS)
Bryans, Jeremy W.; Wei, Wei
The use of business process models has gone far beyond documentation purposes. In the development of business applications, they can play the role of an artifact on which high-level properties can be verified and design errors can be revealed, in an effort to reduce overhead at later software development and diagnosis stages. This paper demonstrates how formal verification may add value to the specification, design and development of business process models in an industrial setting. The analysis of these models is achieved via an algorithmic translation from the de facto standard business process modeling language BPMN to Event-B, a widely used formal language supported by the Rodin platform, which offers a range of simulation and verification technologies.
Bounding the Spacecraft Atmosphere Design Space for Future Exploration Missions
NASA Technical Reports Server (NTRS)
Lange, Kevin E.; Perka, Alan T.; Duffield, Bruce E.; Jeng, Frank F.
2005-01-01
The selection of spacecraft and space suit atmospheres for future human space exploration missions will play an important, if not critical, role in the ultimate safety, productivity, and cost of such missions. Internal atmosphere pressure and composition (particularly oxygen concentration) influence many aspects of spacecraft and space suit design, operation, and technology development. Optimal atmosphere solutions must be determined by an iterative process involving research, design, development, testing, and systems analysis. A necessary first step in this process is the establishment of working bounds on the atmosphere design space.
Three-Dimensional Computational Fluid Dynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haworth, D.C.; O'Rourke, P.J.; Ranganathan, R.
1998-09-01
Computational fluid dynamics (CFD) is one discipline falling under the broad heading of computer-aided engineering (CAE). CAE, together with computer-aided design (CAD) and computer-aided manufacturing (CAM), comprises a mathematics-based approach to engineering product and process design, analysis and fabrication. In this overview of CFD for the design engineer, our purposes are three-fold: (1) to define the scope of CFD and motivate its utility for engineering, (2) to provide a basic technical foundation for CFD, and (3) to convey how CFD is incorporated into engineering product and process design.
Study of photon correlation techniques for processing of laser velocimeter signals
NASA Technical Reports Server (NTRS)
Mayo, W. T., Jr.
1977-01-01
The objective was to provide the theory and a system design for a new type of photon counting processor for low-level dual-scatter laser velocimeter (LV) signals, capable of both first-order measurements of mean flow and turbulence intensity and second-order time statistics: cross-correlation, autocorrelation, and related spectra. A general Poisson process model for low-level LV signals and noise, valid from the photon-resolved regime all the way to the limiting case of nonstationary Gaussian noise, was used. Computer simulation algorithms and higher-order statistical moment analysis of Poisson processes were derived and applied to the analysis of photon correlation techniques. A system design using a unique dual correlate-and-subtract frequency discriminator technique is postulated and analyzed. Expectation analysis indicates that the objective measurements are feasible.
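A minimal sketch of the Poisson photon-count model described above: photon arrivals whose rate is modulated by a signal, followed by a photon autocorrelation estimate. The count rate, modulation depth, and modulation frequency are assumed values, not the report's.

```python
# Simulate rate-modulated Poisson photon counts, then estimate the photon
# autocorrelation; the modulation reappears as oscillation in the ACF.
import numpy as np

rng = np.random.default_rng(0)
dt, n_bins = 1e-7, 200_000                  # 100 ns bins (assumed)
t = np.arange(n_bins) * dt
rate = 5e5 * (1.0 + 0.8 * np.cos(2 * np.pi * 50e3 * t))  # counts/s, modulated
counts = rng.poisson(rate * dt)             # photon counts per bin

def photon_autocorrelation(x, max_lag):
    """Biased estimate of <n(t) n(t+k)> over integer bin lags k."""
    x = x.astype(float)
    return np.array([np.mean(x[: len(x) - k] * x[k:]) if k else np.mean(x * x)
                     for k in range(max_lag)])

acf = photon_autocorrelation(counts, max_lag=400)
# The 50 kHz modulation should show up with period 1/(50e3*dt) = 200 bins.
print(acf[:5], acf[195:205].max())
```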
Advanced signal processing analysis of laser-induced breakdown spectroscopy data for the discrimination of obsidian sources
2012-02-09
…different sources [12,13], but the analytical techniques needed for such analysis (XRD, INAA, and ICP-MS) are time consuming and require expensive… The classification used partial least-squares discriminant analysis (PLSDA) with the SIMPLS solving method [33]. In the experiment design, a leave-one-sample-out (LOSO) para…
System Level Uncertainty Assessment for Collaborative RLV Design
NASA Technical Reports Server (NTRS)
Charania, A. C.; Bradford, John E.; Olds, John R.; Graham, Matthew
2002-01-01
A collaborative design process utilizing Probabilistic Data Assessment (PDA) is showcased. Given the limitation of financial resources in both government and industry, strategic decision makers need more than just traditional point designs; they need to be aware of the likelihood that these future designs will meet their objectives. This uncertainty, an ever-present character in the design process, can be embraced through a probabilistic design environment. A conceptual design process is presented that encapsulates the major engineering disciplines for a Third Generation Reusable Launch Vehicle (RLV). Toolsets consist of aerospace industry standard tools in disciplines such as trajectory, propulsion, mass properties, cost, operations, safety, and economics. Variations of the design process are presented that use different fidelities of tools. The disciplinary engineering models are used in a collaborative engineering framework utilizing Phoenix Integration's ModelCenter and AnalysisServer environment. These tools allow the designer to join disparate models and simulations together in a unified environment wherein each discipline can interact with any other discipline. The design process also uses probabilistic methods to generate the system-level output metrics of interest for an RLV conceptual design. The specific system examined is the Advanced Concept Rocket Engine 92 (ACRE-92) RLV. Previous experience and knowledge (in terms of input uncertainty distributions from experts and from modeling and simulation codes) can be coupled with Monte Carlo processes to best predict the chances of program success.
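A hedged illustration of the probabilistic idea above: expert-style triangular input uncertainties are propagated through a stand-in system model by Monte Carlo, and the chance of meeting a target is reported. The surrogate model and all numbers are placeholders, not the ACRE-92 toolchain.

```python
# Propagate triangular input distributions through a toy surrogate model
# and estimate the probability of meeting a design requirement.
import random

rng = random.Random(42)

def dry_mass_model(isp, structure_factor):
    # Stand-in for the coupled disciplinary analyses (trajectory, mass, ...)
    return 180.0 * structure_factor * (450.0 / isp)

samples = []
for _ in range(50_000):
    isp = rng.triangular(430.0, 465.0, 450.0)   # (low, high, mode), seconds
    sf = rng.triangular(0.95, 1.15, 1.0)        # structure growth factor
    samples.append(dry_mass_model(isp, sf))

target = 195.0  # dry-mass ceiling, metric tons (assumed requirement)
p_success = sum(m <= target for m in samples) / len(samples)
print(f"P(dry mass <= {target} t) = {p_success:.2%}")
```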
Methods for Improving Information from 'Undesigned' Human Factors Experiments.
Keywords: Human factors engineering, Information processing, Regression analysis, Experimental design, Least squares method, Analysis of variance, Correlation techniques, Matrices (Mathematics), Multiple disciplines, Mathematical prediction
NASA Technical Reports Server (NTRS)
Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron
1994-01-01
This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
Creative user-centered visualization design for energy analysts and modelers.
Goodwin, Sarah; Dykes, Jason; Jones, Sara; Dillingham, Iain; Dove, Graham; Duffy, Alison; Kachkaev, Alexander; Slingsby, Aidan; Wood, Jo
2013-12-01
We enhance a user-centered design process with techniques that deliberately promote creativity to identify opportunities for the visualization of data generated by a major energy supplier. Visualization prototypes developed in this way prove effective in a situation whereby data sets are largely unknown and requirements open - enabling successful exploration of possibilities for visualization in Smart Home data analysis. The process gives rise to novel designs and design metaphors including data sculpting. It suggests: that the deliberate use of creativity techniques with data stakeholders is likely to contribute to successful, novel and effective solutions; that being explicit about creativity may contribute to designers developing creative solutions; that using creativity techniques early in the design process may result in a creative approach persisting throughout the process. The work constitutes the first systematic visualization design for a data rich source that will be increasingly important to energy suppliers and consumers as Smart Meter technology is widely deployed. It is novel in explicitly employing creativity techniques at the requirements stage of visualization design and development, paving the way for further use and study of creativity methods in visualization design.
Characteristics study of the gears by the CAD/CAE
NASA Astrophysics Data System (ADS)
Wang, P. Y.; Chang, S. L.; Lee, B. Y.; Nguyen, D. H.; Cao, C. W.
2017-09-01
Gears are the most important transmission components in machines. The rapid development of machines in industry requires a shorter analysis process. Traditionally, gears are analyzed by first setting up the complete mathematical model, considering the cutter profile and the coordinate-system relationship between the machine and the cutter; this is a complex and time-consuming process. Recently, CAD/CAE software has become well developed and useful in mechanical design. In this paper, the Autodesk Inventor® software is first used to model the spherical gears, and the models are then transferred into ANSYS Workbench for finite element analysis. The process proposed in this paper helps engineers speed up the analysis of gears at the design stage.
TU-AB-BRD-04: Development of Quality Management Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomadsen, B.
2015-06-15
Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are often caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis (FMEA), and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, that they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100, which has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how they can be used in a given radiotherapy clinic to develop a risk-based quality management program. Learning objectives: (1) learn how to design a process map for a radiotherapy process; (2) learn how to perform failure modes and effects analysis for a given process; (3) learn what fault trees are about; (4) learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Disclosures: Dunscombe is Director, TreatSafely, LLC and the Center for the Assessment of Radiological Sciences, and a consultant to the IAEA and Varian; Thomadsen is President, Center for the Assessment of Radiological Sciences; Palta is Vice President of the Center for the Assessment of Radiological Sciences.
NASA Astrophysics Data System (ADS)
Tai, Wei; Abbasi, Mortez; Ricketts, David S.
2018-01-01
We present the analysis and design of high-power millimetre-wave power amplifier (PA) systems using zero-degree combiners (ZDCs). The methodology presented optimises the PA device sizing and the number of combined unit PAs based on device load pull simulations, driver power consumption analysis and loss analysis of the ZDC. Our analysis shows that an optimal number of N-way combined unit PAs leads to the highest power-added efficiency (PAE) for a given output power. To illustrate our design methodology, we designed a 1-W PA system at 45 GHz using a 45 nm silicon-on-insulator process and showed that an 8-way combined PA has the highest PAE that yields simulated output power of 30.6 dBm and 31% peak PAE.
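A back-of-envelope version of the trade described above: combiner loss grows with the number of combined unit PAs, while smaller unit PAs tend to be more efficient, so system PAE peaks at some N. Every loss and efficiency figure below is an assumption for illustration, not the paper's 45 nm SOI data.

```python
# Sweep the number of combined unit PAs for a fixed 1 W target and report
# the resulting system PAE under a simple assumed loss/efficiency model.
import math

def system_pae(n, p_target_w=1.0, loss_per_stage_db=0.4):
    stages = int(math.log2(n)) if n > 1 else 0        # binary combining tree
    comb_gain = 10 ** (-stages * loss_per_stage_db / 10)
    p_unit = p_target_w / (n * comb_gain)             # output needed per unit
    # Assumed trend: larger unit PAs are less efficient (gain/matching limits)
    unit_pae = max(0.05, 0.40 - 0.10 * math.log10(p_unit / 0.01))
    p_dc = n * p_unit / unit_pae
    return p_target_w / p_dc

for n in (1, 2, 4, 8, 16, 32):
    print(f"N={n:2d}  system PAE = {system_pae(n):.1%}")
```

With these assumed coefficients the sweep peaks near N = 8-16, mirroring the paper's conclusion that an optimal combining order exists.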
NASA Technical Reports Server (NTRS)
Lahoti, G. D.; Akgerman, N.; Altan, T.
1978-01-01
Mild steel (AISI 1018) was selected as the model cold rolling material, and Ti-6Al-4V and Inconel 718 were selected as typical hot rolling and cold rolling alloys, respectively. The flow stress and workability of these alloys were characterized, and the friction factor at the roll/workpiece interface was determined at their respective working conditions by conducting ring tests. Computer-aided mathematical models for predicting metal flow and stresses, and for simulating the shape rolling process, were developed. These models utilized the upper bound and the slab methods of analysis, and were capable of predicting the lateral spread, roll separating force, roll torque, and local stresses, strains and strain rates. This computer-aided design system was also capable of simulating the actual rolling process, and thereby designing the roll pass schedule in rolling of an airfoil or a similar shape.
Fuel ethanol production: process design trends and integration opportunities.
Cardona, Carlos A; Sánchez, Oscar J
2007-09-01
Current fuel ethanol research and development deals with process engineering trends for improving biotechnological production of ethanol. In this work, the key role that process design plays during the development of cost-effective technologies is recognized through the analysis of major trends in process synthesis, modeling, simulation and optimization related to ethanol production. Main directions in techno-economical evaluation of fuel ethanol processes are described as well as some prospecting configurations. The most promising alternatives for compensating ethanol production costs by the generation of valuable co-products are analyzed. Opportunities for integration of fuel ethanol production processes and their implications are underlined. Main ways of process intensification through reaction-reaction, reaction-separation and separation-separation processes are analyzed in the case of bioethanol production. Some examples of energy integration during ethanol production are also highlighted. Finally, some concluding considerations on current and future research tendencies in fuel ethanol production regarding process design and integration are presented.
What are you trying to learn? Study designs and the appropriate analysis for your research question
USDA-ARS?s Scientific Manuscript database
One fundamental necessity in the entire process of a well-performed study is the experimental design. A well-designed study can help researchers understand and have confidence in their results and analyses, and additionally the agreement or disagreement with the stated hypothesis. This well-designed...
An open-loop system design for deep space signal processing applications
NASA Astrophysics Data System (ADS)
Tang, Jifei; Xia, Lanhua; Mahapatra, Rabi
2018-06-01
A novel open-loop system design with high performance is proposed for space positioning and navigation signal processing. Divided by function, the system has four modules: a bandwidth-selectable data recorder, a narrowband signal analyzer, a time-difference-of-arrival (TDOA) estimator, and an ANFIS supplement processor. A hardware-software co-design approach is taken to accelerate computing capability and improve system efficiency. Embedded with the proposed signal processing algorithms, the designed system is capable of handling tasks with high accuracy over long periods of continuous measurement. The experimental results show that the Doppler frequency tracking root-mean-square error during a 3 h observation is 0.0128 Hz, while the TDOA residual in the correlation power spectrum is 0.1166 rad.
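A minimal sketch of time-difference-of-arrival estimation by cross-correlation, the textbook core of the third module named above; the broadband test signal, noise level, and sample rate are invented for the example.

```python
# Estimate the delay between two noisy copies of a signal by locating the
# peak of their FFT-based cross-correlation.
import numpy as np

rng = np.random.default_rng(7)
fs = 1e6                                   # sample rate, Hz (assumed)
n = 4096
true_delay = 37                            # samples
s = rng.standard_normal(n)                 # broadband source signal
x1 = s + 0.1 * rng.standard_normal(n)
x2 = np.roll(s, true_delay) + 0.1 * rng.standard_normal(n)

# Zero-padded cross-correlation via the FFT, then peak-lag search
X = np.fft.rfft(x1, 2 * n) * np.conj(np.fft.rfft(x2, 2 * n))
cc = np.fft.irfft(X)
lags = np.concatenate((np.arange(0, n), np.arange(-n, 0)))
est = lags[np.argmax(cc)]                  # peak at lag = -true_delay here
print(f"estimated delay: {-est} samples ({-est / fs * 1e6:.1f} us)")
```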
An Interactive Preliminary Design System of High Speed Forebody and Inlet Flows
NASA Technical Reports Server (NTRS)
Liou, May-Fun; Benson, Thomas J.; Trefny, Charles J.
2010-01-01
This paper demonstrates a simulation-based aerodynamic design process for a high-speed inlet. A genetic algorithm is integrated into the design process to facilitate single-objective optimization. The objective function is the total pressure recovery and is obtained using a PNS solver for its computing efficiency. The system uses existing software for geometry definition, mesh generation, and CFD analysis. The process, which produces increasingly desirable designs over many generations of genetic evolution, is carried out automatically. A generic two-dimensional inlet is created as a showcase to demonstrate the capabilities of this tool. A parameterized study of the geometric shape and size of the showcase is also presented.
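A toy genetic-algorithm loop in the spirit of the single-objective optimization above. The "total pressure recovery" function is a cheap stand-in for the PNS solver, peaking at a ramp-angle pair the GA does not know in advance; all parameter names and bounds are invented.

```python
# Simple real-coded GA: truncation selection, blend crossover, Gaussian
# mutation, maximizing a surrogate recovery objective.
import random

rng = random.Random(3)

def recovery(ramp1, ramp2):                # surrogate, not a PNS code
    return 1.0 - 0.002 * (ramp1 - 8.0) ** 2 - 0.004 * (ramp2 - 12.0) ** 2

def evolve(pop_size=30, generations=40, bounds=((2, 15), (5, 20))):
    pop = [[rng.uniform(*b) for b in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda g: recovery(*g), reverse=True)
        parents = pop[: pop_size // 2]     # keep the fittest half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = [(x + y) / 2 + rng.gauss(0, 0.3) for x, y in zip(a, b)]
            child = [min(max(v, lo), hi) for v, (lo, hi) in zip(child, bounds)]
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda g: recovery(*g))

best = evolve()
print(best, recovery(*best))               # should approach (8, 12), 1.0
```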
NASA Technical Reports Server (NTRS)
1981-01-01
Several major modifications were made to the design presented at the PDR. The frame was deleted in favor of a "frameless" design which will provide a substantially improved cell packing factor. Potential shaded cell damage resulting from operation into a short circuit can be eliminated by a change in the cell series/parallel electrical interconnect configuration. The baseline process sequence defined for the MEPSON was refined and equipment design and specification work was completed. SAMICS cost analysis work accelerated, format A's were prepared and computer simulations completed. Design work on the automated cell interconnect station was focused on bond technique selection experiments.
Analysis of 100 Years of Curriculum Designs
ERIC Educational Resources Information Center
Kelting-Gibson, Lynn
2013-01-01
Fifteen historical and contemporary curriculum designs were analyzed for elements of assessment that support student learning and inform instructional decisions. Educational researchers are purposely paying attention to the role assessment plays in a well-designed planning and teaching process. Assessment is a vital component to educational…
Instructional Design: System Strategies.
ERIC Educational Resources Information Center
Ledford, Bruce R.; Sleeman, Phillip J.
This book is intended as a source for those who desire to apply a coherent system of instructional design, thereby insuring accountability. Chapter 1 covers the instructional design process, including: instructional technology; the role of evaluation; goal setting; the psychology of teaching and learning; task analysis; operational objectives;…
Advanced microgrid design and analysis for forward operating bases
NASA Astrophysics Data System (ADS)
Reasoner, Jonathan
This thesis takes a holistic approach to creating an improved electric power generation system for a forward operating base (FOB) of the future through the design of an isolated microgrid. After an extensive literature search, this thesis found a need for drastic improvement of the FOB power system. A thorough design process analyzed FOB demand, researched demand-side management improvements, evaluated various generation sources and energy storage options, and performed a HOMER™ discrete optimization to determine the best microgrid design. Further sensitivity analysis was performed to see how changing parameters would affect the outcome. Lastly, this research looks at some of the challenges associated with a design that relies heavily on inverter-based generation sources, and gives possible solutions to help make a renewable-energy-powered microgrid a reality. While this thesis uses a FOB as the case study, the process and discussion can be adapted to aid in the design of an off-grid small-scale power grid that utilizes high penetration levels of renewable energy.
A User-centered Model for Web Site Design
Kinzie, Mable B.; Cohn, Wendy F.; Julian, Marti F.; Knaus, William A.
2002-01-01
As the Internet continues to grow as a delivery medium for health information, the design of effective Web sites becomes increasingly important. In this paper, the authors provide an overview of one effective model for Web site design, a user-centered process that includes techniques for needs assessment, goal/task analysis, user interface design, and rapid prototyping. They detail how this approach was employed to design a family health history Web site, Health Heritage.
Russell, Rachel; Ormerod, Marcus; Newton, Rita
2018-01-01
Modifying the home environments of older people as they age in place is a well-established health and social care intervention. Using design and construction methods to redress any imbalance caused by the ageing process or disability within the home environment, occupational therapists are seen as the experts in this field of practice. However, the process used by occupational therapists when modifying home environments has been criticised for being disorganised and not founded on theoretical principles and concepts underpinning the profession. To address this issue, research was conducted to develop a design and construction process protocol specifically for home modifications. A three-stage approach was taken for the analysis of qualitative data generated from an online survey, completed by 135 occupational therapists in the UK. Using both the existing occupational therapy intervention process model and the design and construction process protocol as the theoretical frameworks, a 4-phase, 9-subphase design and construction process protocol for home modifications was developed. Overall, the study is innovative in developing the first process protocol for home modifications, potentially providing occupational therapists with a systematic and effective approach to the design and delivery of home modification services for older and disabled people.
Automated Sensitivity Analysis of Interplanetary Trajectories
NASA Technical Reports Server (NTRS)
Knittel, Jeremy; Hughes, Kyle; Englander, Jacob; Sarli, Bruno
2017-01-01
This work describes a suite of Python tools known as the Python EMTG Automated Trade Study Application (PEATSA). PEATSA was written to automate the operation of trajectory optimization software and to simplify the process of performing sensitivity analysis, and it was ultimately found to out-perform a human trajectory designer in unexpected ways. These benefits are discussed and demonstrated on sample mission designs.
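A hypothetical sketch of the automation pattern a tool like PEATSA embodies: perturb one mission parameter at a time, re-run an optimizer, and collect the results. The function `run_trajectory_optimizer` and all parameter names are stand-ins; EMTG's real interface differs.

```python
# One-at-a-time parameter sweep driving a (mock) trajectory optimizer and
# collecting results for later sensitivity plots.
import json

def run_trajectory_optimizer(case):
    # Placeholder: pretend delta-v grows quadratically away from a nominal C3
    return {"delta_v_kms": 3.2 + 0.01 * (case["launch_c3"] - 12.0) ** 2}

baseline = {"launch_c3": 12.0, "flyby_altitude_km": 500.0}
sweeps = {"launch_c3": [10, 11, 12, 13, 14],
          "flyby_altitude_km": [300, 500, 1000]}

results = []
for name, values in sweeps.items():
    for v in values:
        case = dict(baseline, **{name: v})        # perturb one parameter
        results.append({"varied": name, "value": v,
                        **run_trajectory_optimizer(case)})
print(json.dumps(results, indent=1))
```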
Approximate simulation model for analysis and optimization in engineering system design
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, Jaroslaw
1989-01-01
Computational support of the engineering design process routinely requires mathematical models of behavior to inform designers of the system response to external stimuli. However, designers also need to know the effect of the changes in design variable values on the system behavior. For large engineering systems, the conventional way of evaluating these effects by repetitive simulation of behavior for perturbed variables is impractical because of excessive cost and inadequate accuracy. An alternative is described based on recently developed system sensitivity analysis that is combined with extrapolation to form a model of design. This design model is complementary to the model of behavior and capable of direct simulation of the effects of design variable changes.
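A worked sketch of the design-model idea above: compute system sensitivities once (by finite differences here), then extrapolate responses to perturbed designs without re-running the full simulation. The "behavior" function stands in for an expensive analysis; its form and the numbers are invented.

```python
# First-order design model: one-time sensitivity analysis plus Taylor
# extrapolation replaces repeated full simulations for small perturbations.
def behavior(thickness, span):             # stand-in expensive simulation
    return 2.1 * span ** 3 / thickness     # e.g., a tip deflection

x0 = {"thickness": 0.02, "span": 1.5}
f0 = behavior(**x0)

sens = {}                                  # central-difference dB/dx_i
for k, v in x0.items():
    h = 1e-6 * v
    hi = behavior(**{**x0, k: v + h})
    lo = behavior(**{**x0, k: v - h})
    sens[k] = (hi - lo) / (2 * h)

def extrapolate(**dx):                     # the cheap "model of design"
    return f0 + sum(sens[k] * d for k, d in dx.items())

print("exact:       ", behavior(thickness=0.021, span=1.55))
print("extrapolated:", extrapolate(thickness=0.001, span=0.05))
```

For the small perturbation shown, the extrapolated response agrees with the exact re-analysis to well under one percent, which is the economy the abstract describes.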
Stochastic Multiscale Analysis and Design of Engine Disks
2010-07-28
[Slide excerpt] …shown recently to fail when used with data-driven non-linear stochastic input models (KPCA, IsoMap, etc.); there is a need for scalable exascale computing algorithms. (Materials Process Design and Control Laboratory, Cornell University)
An Analysis of Creative Process Learning in Computer Game Activities through Player Experiences
ERIC Educational Resources Information Center
Inchamnan, Wilawan
2016-01-01
This research investigates the extent to which creative processes can be fostered through computer gaming. It focuses on creative components in games that have been specifically designed for educational purposes: Digital Game Based Learning (DGBL). A behavior analysis for measuring the creative potential of computer game activities and learning…
Teaching Special Education Teachers How to Conduct Functional Analysis in Natural Settings
ERIC Educational Resources Information Center
Erbas, Dilek; Tekin-Iftar, Elif; Yucesoy, Serife
2006-01-01
Effects of a training program utilized to teach how to conduct functional analysis process to teachers of children with developmental disabilities was examined. Furthermore, teachers' opinions regarding this process were investigated. A multiple probe design across subjects with probe conditions was used. Teacher training was in two phases. In the…
Probabilistic Design and Analysis Framework
NASA Technical Reports Server (NTRS)
Strack, William C.; Nagpal, Vinod K.
2010-01-01
PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometric dimensions and loading conditions are analyzed to determine their effects on the stress state within each component. Geometric variations include the cord length and height for the blade, and the inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using developing software packages such as System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program NESTEM to perturb loads and geometries and provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.
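A minimal response-surface sketch of the component-level step described above: sample a stand-in analysis at perturbed geometry/load points, fit a quadratic surface by least squares, and reuse it as a cheap surrogate. The stress function and all ranges are invented, not PRODAF's models.

```python
# Fit a quadratic response surface to samples of a mock FEA stress function,
# then query the surrogate instead of re-running the "expensive" analysis.
import numpy as np

rng = np.random.default_rng(5)

def fea_stress(cord, load):                # placeholder for the FEA run
    return 120.0 + 30 * cord + 0.8 * load + 5 * cord ** 2 + 0.05 * cord * load

# Design points around the nominal (cord = 2.0, load = 100.0)
pts = np.column_stack((2.0 + rng.uniform(-0.5, 0.5, 40),
                       100.0 + rng.uniform(-20, 20, 40)))
y = np.array([fea_stress(c, l) for c, l in pts])

# Quadratic basis: 1, c, l, c^2, l^2, c*l
c, l = pts[:, 0], pts[:, 1]
A = np.column_stack((np.ones_like(c), c, l, c * c, l * l, c * l))
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

c_new, l_new = 2.3, 115.0
surrogate = coef @ np.array([1, c_new, l_new, c_new**2, l_new**2, c_new * l_new])
print(surrogate, fea_stress(c_new, l_new))  # surrogate vs. direct evaluation
```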
Conversion of paper sludge to ethanol, II: process design and economic analysis.
Fan, Zhiliang; Lynd, Lee R
2007-01-01
Process design and economics are considered for conversion of paper sludge to ethanol. A particular site, a bleached kraft mill operated in Gorham, NH by Fraser Papers (15 dry tons of sludge processed per day), is considered. In addition, profitability is examined for a larger plant (50 dry tons per day), and sensitivity analysis is carried out with respect to capacity, tipping fee, and ethanol price. Conversion based on simultaneous saccharification and fermentation with intermittent feeding is examined, with ethanol recovery provided by distillation and molecular sieve adsorption. It was found that the Fraser plant achieves positive cash flow with or without xylose conversion and mineral recovery. Sensitivity analysis indicates that the economics are very sensitive to ethanol selling price and scale; significantly but less sensitive to the tipping fee; and rather insensitive to the prices of cellulase and power. Internal rates of return exceeding 15% are projected for larger plants at most combinations of scale, tipping fee, and ethanol price. Our analysis lends support to the proposition that paper sludge is a leading point of entry and proving ground for emergent industrial processes featuring enzymatic hydrolysis of cellulosic biomass.
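A rough cash-flow sketch of the sensitivity logic above: vary ethanol price and tipping fee and bisect for the internal rate of return. All capital costs, yields, and scales are invented placeholders, not the paper's figures.

```python
# IRR by bisection over a simple fixed-life cash-flow model, swept over
# ethanol price and tipping fee.
def irr(cashflows, lo=-0.9, hi=1.0, tol=1e-6):
    npv = lambda r: sum(cf / (1 + r) ** t for t, cf in enumerate(cashflows))
    while hi - lo > tol:                       # npv is decreasing in r here
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if npv(mid) > 0 else (lo, mid)
    return (lo + hi) / 2

def plant_cashflows(price_per_gal, tipping_fee, capital=12e6, years=15):
    revenue = 1.4e6 * price_per_gal + 18_000 * tipping_fee  # gal/yr, t/yr (assumed)
    return [-capital] + [revenue - 1.1e6] * years           # opex assumed

for price in (1.2, 1.5, 1.8):
    for fee in (0, 30, 60):
        r = irr(plant_cashflows(price, fee))
        print(f"price=${price}/gal fee=${fee}/t -> IRR {r:.1%}")
```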
NASA Technical Reports Server (NTRS)
Watson, H. K.
1971-01-01
A digital computer program determines tolerance values for an end-to-end signal chain or flow path, given a preselected probability value. The technique is useful in the synthesis and analysis phases of subsystem design processes.
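One common way to do this, shown here as a sketch: combine stage tolerances by root-sum-square and scale to the chosen probability via a normal z-score. The stage values are arbitrary examples, and the original program's exact method is not specified in the abstract.

```python
# Statistical end-to-end tolerancing: RSS of per-stage 1-sigma errors,
# scaled by a z-score for the desired probability coverage.
import math

def chain_tolerance(stage_sigmas, probability_z=3.0):
    """z = 3 corresponds to ~99.73% coverage for a normal end-to-end error."""
    return probability_z * math.sqrt(sum(s * s for s in stage_sigmas))

stages = [0.10, 0.05, 0.20, 0.08]   # 1-sigma error of each element, dB (assumed)
print(f"end-to-end tolerance: +/-{chain_tolerance(stages):.3f} dB")
```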
Akinade, Olugbenga O; Oyedele, Lukumon O; Ajayi, Saheed O; Bilal, Muhammad; Alaka, Hafiz A; Owolabi, Hakeem A; Bello, Sururah A; Jaiyeoba, Babatunde E; Kadiri, Kabir O
2017-02-01
The aim of this paper is to identify the Critical Success Factors (CSFs) needed for effective material recovery through Design for Deconstruction (DfD). The research approach employed is based on a sequential exploratory mixed-method strategy. After a thorough review of the literature and four Focus Group Discussions (FGDs), 43 DfD factors were identified and put together in a questionnaire survey. Data analyses included Cronbach's alpha reliability analysis, mean testing using a significance index, and exploratory factor analysis. The factor analysis reveals an underlying structure of five DfD factor groups: 'stringent legislation and policy', 'deconstruction design process and competencies', 'design for material recovery', 'design for material reuse', and 'design for building flexibility'. These groups show that the requirements for DfD go beyond technical competencies, and that non-technical factors such as stringent legislation and policy, and design process and competency for deconstruction, are key to designing deconstructable buildings. Paying attention to the factors identified in all of these categories will help tackle impediments that could hinder the effectiveness of DfD. The results of this study will help design and project managers understand areas of possible improvement in employing DfD as a strategy for diverting waste from landfills. Copyright © 2016 Elsevier Ltd. All rights reserved.
The UARS and open data system concept and analysis study. Executive summary
NASA Technical Reports Server (NTRS)
Mittal, M.; Nebb, J.; Woodward, H.
1983-01-01
Alternative concepts for a common design for the UARS and OPEN Central Data Handling Facility (CDHF) are offered. The designs are consistent with requirements shared by UARS and OPEN and the data storage and data processing demands of these missions. Because more detailed information is available for UARS, the design approach was to size the system and to select components for a UARS CDHF, but in a manner that does not optimize the CDHF at the expense of OPEN. Costs for alternative implementations of the UARS designs are presented showing that the system design does not restrict the implementation to a single manufacturer. Processing demands on the alternative UARS CDHF implementations are discussed. With this information at hand together with estimates for OPEN processing demands, it is shown that any shortfall in system capability for OPEN support can be remedied by either component upgrades or array processing attachments rather than a system redesign.
A quality by design study applied to an industrial pharmaceutical fluid bed granulation.
Lourenço, Vera; Lochmann, Dirk; Reich, Gabriele; Menezes, José C; Herdling, Thorsten; Schewitz, Jens
2012-06-01
The pharmaceutical industry is encouraged within Quality by Design (QbD) to apply science-based manufacturing principles to assure quality not only of new but also of existing processes. This paper presents how QbD principles can be applied to an existing industrial pharmaceutical fluid bed granulation (FBG) process. A three-step approach is presented as follows: (1) implementation of Process Analytical Technology (PAT) monitoring tools at the industrial-scale process, combined with multivariate data analysis (MVDA) of process and PAT data, to increase process knowledge; (2) execution of scaled-down designed experiments at a pilot scale, with adequate PAT monitoring tools, to investigate the process response to intended changes in Critical Process Parameters (CPPs); and finally (3) the definition of a process Design Space (DS) linking CPPs to Critical Quality Attributes (CQAs), within which product quality is ensured by design, and which after scale-up can be used at the industrial process scale. The proposed approach was developed for an existing industrial process. Through the enhanced process knowledge established, a significant reduction in the variability of product CQAs, already within quality specification ranges, was achieved by a better choice of CPP values. The results of such step-wise development and implementation are described. Copyright © 2012 Elsevier B.V. All rights reserved.
Lourenço, Vera; Herdling, Thorsten; Reich, Gabriele; Menezes, José C; Lochmann, Dirk
2011-08-01
A set of 192 fluid bed granulation batches at industrial scale were monitored in-line using microwave resonance technology (MRT) to determine the moisture, temperature and density of the granules. Multivariate data analysis techniques such as multiway partial least squares (PLS), multiway principal component analysis (PCA) and multivariate batch control charts were applied to the collected batch data sets. The combination of all these techniques, along with off-line particle size measurements, led to significantly increased process understanding. A seasonality effect was demonstrated that impacted further processing through its influence on the final granule size. Moreover, it was demonstrated by means of a PLS model that a quantitative relation between the particle size and the MRT measurements can be defined, highlighting the potential of the MRT sensor to predict information about the final granule size. This study has contributed to improving a fluid bed granulation process, and the process knowledge obtained shows that product quality can be built in during process design, following Quality by Design (QbD) and Process Analytical Technology (PAT) principles. Copyright © 2011. Published by Elsevier B.V.
Kothari, Bhaveshkumar H; Fahmy, Raafat; Claycamp, H Gregg; Moore, Christine M V; Chatterjee, Sharmista; Hoag, Stephen W
2017-05-01
The goal of this study was to utilize risk assessment techniques and statistical design of experiments (DoE) to gain process understanding and to identify critical process parameters for the manufacture of controlled-release multiparticulate beads using a novel disk-jet fluid bed technology. The material attributes and process parameters were systematically assessed using the Ishikawa fishbone diagram and failure mode and effect analysis (FMEA) risk assessment methods. The high-risk attributes identified by the FMEA were then explored with a resolution V fractional factorial design to gain an understanding of the processing parameters. Using knowledge gained from the resolution V study, a resolution IV fractional factorial study was conducted to identify the critical process parameters (CPPs) that impact the critical quality attributes and to understand the influence of these parameters on film formation. For both studies, the microclimate, atomization pressure, inlet air volume, product temperature (during spraying and curing), curing time, and percent solids in the coating solutions were studied. The responses evaluated were percent agglomeration, percent fines, percent yield, bead aspect ratio, median particle size diameter (d50), assay, and drug release rate. Pyrobuttons® were used to record real-time temperature and humidity changes in the fluid bed. The risk assessment methods and process analytical tools helped in understanding the novel disk-jet technology and in systematically developing models of coating process parameters such as process efficiency and the extent of curing during the coating process.
Case analysis online: a strategic management case model for the health industry.
Walsh, Anne; Bearden, Eithne
2004-01-01
Despite the plethora of methods and tools available to support strategic management, the challenge for health executives in the next century will relate to their ability to access and interpret data from multiple and intricate communication networks. Integrated digital networks and satellite systems will expand the scope and ease of sharing information between business divisions, and networked systems will facilitate the use of virtual case discussions across universities. While the internet is frequently used to support clinical decisions in the healthcare industry, few executives rely upon the internet for strategic analysis. Although electronic technologies can easily synthesize data from multiple information channels, research as well as technical issues may deter their application in strategic analysis. As digital models transform access to information, online models may become increasingly relevant in designing strategic solutions. While there are various pedagogical models available to support the strategic management process, this framework was designed to enhance strategic analysis through the application of technology and electronic research. A strategic analysis framework, which incorporated internet research and case analysis in a strategic management course, is described along with design and application issues that emerged during the case analysis process.
Rotorcraft Conceptual Design Environment
NASA Technical Reports Server (NTRS)
Johnson, Wayne; Sinsay, Jeffrey
2009-01-01
Requirements for a rotorcraft conceptual design environment are discussed, from the perspective of a government laboratory. Rotorcraft design work in a government laboratory must support research, by producing technology impact assessments and defining the context for research and development; and must support the acquisition process, including capability assessments and quantitative evaluation of designs, concepts, and alternatives. An information manager that will enable increased fidelity of analysis early in the design effort is described. This manager will be a framework to organize information that describes the aircraft, and enable movement of that information to and from analyses. Finally, a recently developed rotorcraft system analysis tool is described.
Design Validation Methodology Development for an Aircraft Sensor Deployment System
NASA Astrophysics Data System (ADS)
Wowczuk, Zenovy S.
The OCULUS 1.0 sensor deployment concept design, developed in 2004 at West Virginia University (WVU), outlined the general concept of a deployment system to be used on a C-130 aircraft. As a sequel, a new system, OCULUS 1.1, has been designed and developed. The new system transforms the concept design into a safety-of-flight design, enhanced to a pre-production system to be used as the test bed for gaining full military certification approval. The OCULUS 1.1 system implements a standard deployment system and procedure, with a design suited for military certification and implementation. The design process included analysis of the system's critical components and the generation of a critical-component holistic model to be used as an analysis tool for future payload modifications made to the system. Following the completion of the OCULUS 1.1 design, preparations and procedures for obtaining military airworthiness certification are described. The airworthiness process includes working with the agency overseeing all modifications made to military C-130 aircraft and preparing the system for an experimental flight test. The critical steps in this process include developing a complete documentation package that details the analysis performed on the OCULUS 1.1 system and the design-of-experiment flight test plan used to analyze the system. Following approval of the documentation and the design of experiment, an experimental flight test of the OCULUS 1.1 system was performed to verify the safety and airworthiness of the system. This test successfully demonstrated that the OCULUS 1.1 system design is airworthy and approved for military use. The OCULUS 1.1 deployment system offers an open-architecture design that is ideal for use as a testing platform for developmental airborne sensors. The system's patented deployment methodology presents a simple approach to reaching the system's final operating position, offering the most robust field-of-view area among rear-ramp deployment systems.
Field methods and data processing techniques associated with mapped inventory plots
William A. Bechtold; Stanley J. Zarnoch
1999-01-01
The U.S. Forest Inventory and Analysis (FIA) and Forest Health Monitoring (FHM) programs utilize a fixed-area mapped-plot design as the national standard for extensive forest inventories. The mapped-plot design is explained, as well as the rationale for its selection as the national standard. Ratio-of-means estimators are presented as a method to process data from...
Design Challenges in Converting a Paper Checklist to Digital Format for Dynamic Medical Settings
Sarcevic, Aleksandra; Rosen, Brett J.; Kulp, Leah J.; Marsic, Ivan; Burd, Randall S.
2016-01-01
We describe a mobile digital checklist that we designed and developed for trauma resuscitation—a dynamic, fast-paced medical process of treating severely injured patients. The checklist design was informed by our analysis of user interactions with a paper checklist that was introduced to improve team performance during resuscitations. The design process followed an iterative approach and involved several medical experts. We discuss design challenges in converting a paper checklist to its digital counterpart, as well as our approaches for addressing those challenges. While we show that using a digital checklist during a fast-paced medical event is feasible, we also recognize several design constraints, including limited display size, difficulties in entering notes about the medical process and patient, and difficulties in replicating user experience with paper checklists.
Nozzle Numerical Analysis Of The Scimitar Engine
NASA Astrophysics Data System (ADS)
Battista, F.; Marini, M.; Cutrone, L.
2011-05-01
This work describes part of the activities on the LAPCAT-II A2 vehicle: starting from the available conceptual vehicle design and the related pre-cooled turbo-ramjet engine called SCIMITAR, the assumptions made for the performance figures of different components during the iteration process within LAPCAT-I are assessed in more detail. This paper presents a numerical analysis aimed at the design optimization of the nozzle contour of the LAPCAT A2 SCIMITAR engine designed by Reaction Engines Ltd. (REL). In particular, the nozzle shape optimization process is presented for cruise conditions. All computations have been carried out using the CIRA C3NS code under non-equilibrium conditions. The effect of considering detailed or reduced chemical kinetic schemes has been analyzed, with a particular focus on the production of pollutants. An analysis of engine performance parameters, such as thrust and combustion efficiency, has been carried out.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morozov, Dmitriy; Lukić, Zarija
2016-04-01
Modern scientific and engineering simulations track the time evolution of billions of elements. For such large runs, storing most time steps for later analysis is not a viable strategy. It is far more efficient to analyze the simulation data while it is still in memory. The developers present a novel design for running multiple codes in situ: using coroutines and position-independent executables, they enable cooperative multitasking between simulation and analysis, allowing the same executables to post-process simulation output, as well as to process it on the fly, both in situ and in transit. They present Henson, an implementation of this design, and illustrate its versatility by tackling analysis tasks with different computational requirements. The design differs significantly from existing frameworks and offers an efficient and robust approach to integrating multiple codes on modern supercomputers. The presented techniques can also be integrated into other in situ frameworks.
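A Python-generator sketch of the cooperative-multitasking idea behind Henson: the simulation yields control at each time step, and the analysis resumes with the data still in memory. Henson itself does this with coroutines and position-independent executables in compiled codes, not Python; the step logic below is invented.

```python
# Cooperative multitasking between a "simulation" and an "analysis"
# using Python generators: data is handed over in memory, no file I/O.
def simulation(steps):
    state = 0.0
    for t in range(steps):
        state += 1.5           # stand-in for advancing the physics
        yield t, state         # hand control (and data) to the scheduler

def analysis():
    total = 0.0
    while True:
        t, state = yield       # wait to be resumed with in-memory data
        total += state
        print(f"step {t}: running mean = {total / (t + 1):.2f}")

ana = analysis()
next(ana)                      # prime the analysis coroutine
for step_data in simulation(5):
    ana.send(step_data)        # in situ: analysis runs between time steps
```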
NASA Technical Reports Server (NTRS)
Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.
1995-01-01
Integrated Product and Process Development (IPPD) embodies the simultaneous application of both system and quality engineering methods throughout an iterative design process. The use of IPPD results in the time-conscious, cost-saving development of engineering systems. Georgia Tech has proposed the development of an Integrated Design Engineering Simulator that will merge Integrated Product and Process Development with interdisciplinary analysis techniques and state-of-the-art computational technologies. To implement IPPD, a Decision-Based Design perspective is encapsulated in an approach that focuses on the role of the human designer in product development. The approach has two parts and is outlined in this paper. First, an architecture, called DREAMS, is being developed that facilitates design from a decision-based perspective. Second, a supporting computing infrastructure, called IMAGE, is being designed. The current status of development is given and future directions are outlined.
Requirements Analysis for Large Ada Programs: Lessons Learned on CCPDS-R
1989-12-01
[Extraction-damaged excerpt; recoverable fragments follow] …when the design had matured and the SRS role was to be the tester's contract… this approach was not optimal from the formal testing… …on the software development process is the necessity to include sufficient testing… CPU processing load: these constraints primarily affect algorithm… allocations, and timing requirements are by-products of the software design process when multiple CSCIs are executed within…
Pusic, Martin V.; LeBlanc, Vicki; Patel, Vimla L.
2001-01-01
Traditional task analysis for instructional design has emphasized the importance of precisely defining behavioral educational objectives and working back to select objective-appropriate instructional strategies. However, this approach may miss effective strategies. Cognitive task analysis, on the other hand, breaks a process down into its component knowledge representations. Selection of instructional strategies based on all such representations in a domain is likely to lead to optimal instructional design. In this demonstration, using the interpretation of cervical spine x-rays as an educational example, we show how a detailed cognitive task analysis can guide the development of computer-aided instruction.
Surface laser marking optimization using an experimental design approach
NASA Astrophysics Data System (ADS)
Brihmat-Hamadi, F.; Amara, E. H.; Lavisse, L.; Jouvard, J. M.; Cicala, E.; Kellou, H.
2017-04-01
Laser surface marking is performed on a titanium substrate using a pulsed frequency-doubled Nd:YAG laser (λ = 532 nm, τpulse = 5 ns) to process the substrate surface under normal atmospheric conditions. The aim of the work is to investigate, following experimental and statistical approaches, the correlation between the process parameters and the response variables (output), using Design of Experiment (DOE) methods: Taguchi methodology and response surface methodology (RSM). A design is first created using the MINITAB program, and the laser marking process is then performed according to the planned design. The response variables, surface roughness and surface reflectance, were measured for each sample and incorporated into the design matrix. The results are then analyzed, and the RSM model is developed and verified for predicting the process output for the given set of process parameter values. The analysis shows that the laser beam scanning speed is the most influential operating factor, followed by the laser pumping intensity during marking, while the other factors show complex influences on the objective functions.
Design and Analysis of a Preconcentrator for the ChemLab
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wong, Chungnin C.; Flemming, Jeb H.; Manginell, Ronald P.
2000-07-17
Preconcentration is a critical analytical procedure when designing a microsystem for trace chemical detection, because it can purify a sample mixture and boost the small analyte concentration to a much higher level, allowing a better analysis. This paper describes the development of a micro-fabricated planar preconcentrator for the μChemLab™ at Sandia. To guide the design, an analytical model has been developed to predict the analyte transport, adsorption, and desorption processes in the preconcentrator. Experiments have also been conducted to analyze the adsorption and desorption processes and to validate the model. This combined effort of modeling, simulation, and testing has led us to build a reliable, efficient preconcentrator with good performance.
1987-09-01
a useful average for population studies, does not delay data processing, and is relatively inexpensive. Using HVEM and observing recipe preparation procedures improve the...extensive review of the procedures and problems in design, collection, analysis, processing and interpretation of dietary survey data for individuals
Exergy analysis of helium liquefaction systems based on modified Claude cycle with two-expanders
NASA Astrophysics Data System (ADS)
Thomas, Rijo Jacob; Ghosh, Parthasarathi; Chowdhury, Kanchan
2011-06-01
Large-scale helium liquefaction systems, being energy-intensive, demand judicious selection of process parameters. An effective tool for the design and analysis of thermodynamic cycles for these systems is exergy analysis, which is used here to study the behavior of a helium liquefaction system based on a modified Claude cycle. Parametric evaluation using the process simulator Aspen HYSYS® helps to identify the effects of cycle pressure ratio and expander flow fraction on the exergetic efficiency of the liquefaction cycle. The study computes the distribution of losses at the different refrigeration stages of the cycle and helps in selecting optimum cycle pressures, operating temperature levels of the expanders, and mass flow rates through them. Results from the analysis may help evolve guidelines for designing appropriate thermodynamic cycles for practical helium liquefaction systems.
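For orientation, the exergetic efficiency evaluated in such studies is conventionally built from the flow exergy relative to the ambient state; the abstract does not restate the paper's expressions, so the following are only the standard textbook definitions:

```latex
e = (h - h_0) - T_0\,(s - s_0)
\qquad
\eta_{ex} = \frac{\dot{m}_{liq}\,(e_{liq} - e_{feed})}{\dot{W}_{net,in}}
```

Here $h$ and $s$ are specific enthalpy and entropy, the subscript $0$ denotes the ambient dead state, and $\dot{W}_{net,in}$ is the net work input to the cycle.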
Image analysis of multiple moving wood pieces in real time
NASA Astrophysics Data System (ADS)
Wang, Weixing
2006-02-01
This paper presents algorithms for image processing and image analysis of wood piece materials. The algorithms were designed for auto-detection of wood piece materials on a moving conveyor belt or a truck. When wood objects are moving, the hard task is to trace the contours of the objects in an optimal way. To make the algorithms work efficiently in the plant, a flexible online system was designed and developed, which mainly consists of image acquisition, image processing, object delineation and analysis. A number of newly developed algorithms can delineate wood objects with high accuracy and high speed, and in the wood piece analysis part, each wood piece can be characterized by a number of visual parameters which can also be used for constructing experimental models directly in the system.
NASA Astrophysics Data System (ADS)
Raju, B. S.; Sekhar, U. Chandra; Drakshayani, D. N.
2017-08-01
The paper investigates optimization of the stereolithography process for SL5530 epoxy resin material to enhance part quality. The major performance characteristics selected to evaluate the process are tensile strength, flexural strength, impact strength and density, and the corresponding process parameters are layer thickness, orientation and hatch spacing. Because the process intrinsically involves tuning multiple parameters, grey relational analysis, which uses the grey relational grade as a performance index, is adopted to determine the optimal combination of process parameters. Moreover, principal component analysis is applied to evaluate the weighting values corresponding to the various performance characteristics so that their relative importance can be properly and objectively described. The results of confirmation experiments reveal that grey relational analysis coupled with principal component analysis can effectively acquire the optimal combination of process parameters. Hence, the proposed approach can be a useful tool for improving the process parameters in stereolithography, which is valuable information for machine designers as well as RP machine users.
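The grey relational computation outlined here can be sketched compactly. In the snippet below the response matrix is invented (it is not the paper's data), larger-the-better normalization is assumed for all four characteristics, the customary distinguishing coefficient ζ = 0.5 is used, and the weights come from the squared loadings of the first principal component.

```python
import numpy as np

# Invented response matrix: rows = runs, columns = (tensile strength,
# flexural strength, impact strength, density score), all larger-the-better.
y = np.array([[52.1, 88.0, 3.1, 1.18],
              [55.4, 91.2, 3.4, 1.20],
              [49.8, 85.5, 2.9, 1.17],
              [57.0, 93.1, 3.6, 1.21]])

# 1. Normalize each response to [0, 1] (larger-the-better).
x = (y - y.min(axis=0)) / (y.max(axis=0) - y.min(axis=0))

# 2. Grey relational coefficients against the ideal sequence (all ones),
#    with the customary distinguishing coefficient zeta = 0.5.
delta = np.abs(1.0 - x)
zeta = 0.5
grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

# 3. Weight the characteristics by the squared loadings of the first
#    principal component of the normalized responses.
eigvals, eigvecs = np.linalg.eigh(np.corrcoef(x, rowvar=False))
weights = eigvecs[:, -1] ** 2          # np.linalg.eigh sorts ascending

# 4. Grey relational grade: the run with the highest grade is "optimal".
grade = grc @ weights
print("grades:", np.round(grade, 3), "-> best run:", int(np.argmax(grade)))
```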
Lunar Landing Trajectory Design for Onboard Hazard Detection and Avoidance
NASA Technical Reports Server (NTRS)
Paschall, Steve; Brady, Tye; Sostaric, Ron
2009-01-01
The Autonomous Landing and Hazard Avoidance Technology (ALHAT) Project is developing the software and hardware technology needed to support a safe and precise landing for the next generation of lunar missions. ALHAT provides this capability through terrain-relative navigation measurements to enhance global-scale precision, an onboard hazard detection system to select safe landing locations, and an Autonomous Guidance, Navigation, and Control (AGNC) capability to process these measurements and safely direct the vehicle to a landing location. This paper focuses on the key trajectory design issues relevant to providing an onboard Hazard Detection and Avoidance (HDA) capability for the lander. Hazard detection can be accomplished by the crew visually scanning the terrain through a window, a sensor system imaging the terrain, or some combination of both. For ALHAT, this hazard detection activity is provided by a sensor system, which either augments the crew's perception or entirely replaces the crew in the case of a robotic landing. Detecting hazards influences the trajectory design by requiring the proper perspective, range to the landing site, and sufficient time to view the terrain. Following this, the trajectory design must provide additional time to process this information and make a decision about where to safely land. During the final part of the HDA process, the trajectory design must provide sufficient margin to enable a hazard avoidance maneuver. In order to demonstrate the effects of these constraints on the landing trajectory, a tradespace of trajectory designs was created for the initial ALHAT Design Analysis Cycle (ALDAC-1) and each case was evaluated with these HDA constraints active. The ALHAT analysis process, described in this paper, narrows down this tradespace and subsequently better defines the trajectory design needed to support onboard HDA. Future ALDACs will enhance this trajectory design by balancing these issues and others in an overall system design process.
Finite element modelling of chain-die forming for ultra-high strength steel
NASA Astrophysics Data System (ADS)
Majji, Raju; Xiang, Yang; Ding, Scott; Yang, Chunhui
2017-10-01
There has been a high demand for weight reduction in automotive vehicles while maintaining passenger safety. A potential steel material to achieve this is Ultra High Strength Steel (UHSS). As a high-strength material, it is difficult to form into desired profiles using traditional sheet metal forming processes such as cold roll forming. A potential alternative solution is the recently developed Chain-die Forming (CDF) process. The basic principle of the CDF is to fully combine the roll forming and bending processes. The main advantage of this process is the elongated deformation length, which significantly increases the effective roll radius. This study focuses on identifying issues with the CDF by using CAD modelling, motion analysis and finite element analysis (FEA) to devise solutions and construct a more reliable process in an optimal design sense. Previous attempts at finite element modelling and simulation of the CDF in the literature used relatively simple models, and that research was not sufficient for the optimal design of a typical CDF for UHSS. Therefore, two numerical models of the Chain-die Forming process are developed in this study: a) one having a set of rolls similar to roll forming but with a large radius, i.e., 20 meters; and b) one with die and punch segments similar to a typical CDF machine. As a case study, the forming of a 60° channel in a single pass was simulated using these two models for comparison. The obtained numerical results clearly show that the CDF can generate less residual stress, lower strain and smaller springback in a single pass for the 60° UHSS channel. The design analysis procedure proposed in this study could greatly help mechanical designers to devise a cost-effective and reliable CDF process for forming UHSS.
Computer aided analysis, simulation and optimisation of thermal sterilisation processes.
Narayanan, C M; Banerjee, Arindam
2013-04-01
Although thermal sterilisation is a widely employed industrial process, little work is reported in the available literature, including patents, on the mathematical analysis and simulation of these processes. In the present work, software packages have been developed for the computer aided optimum design of thermal sterilisation processes. Systems involving steam sparging, jacketed heating/cooling, helical coils submerged in agitated vessels, and systems that employ external heat exchangers (double pipe, shell and tube, and plate exchangers) have been considered. Both batch and continuous operations have been analysed and simulated. The dependence of the del factor on system/operating parameters such as the mass or volume of substrate to be sterilised per batch, speed of agitation, helix diameter, substrate-to-steam ratio, rate of substrate circulation through the heat exchanger and that through the holding tube has been analysed separately for each mode of sterilisation. Axial dispersion in the holding tube has also been adequately accounted for through an appropriately defined axial dispersion coefficient. The effect of exchanger characteristics/specifications on the system performance has also been analysed. The multiparameter computer aided design (CAD) software packages prepared are thus highly versatile in nature, and they permit the optimum choice of operating variables for the processes selected. The computed results have been compared with extensive data collected from a number of industries (distilleries, food processing and pharmaceutical industries) and pilot plants, and satisfactory agreement has been observed between the two, thereby ascertaining the accuracy of the CAD software developed. No simplifying assumptions were made during the analysis, and the design of the associated heating/cooling equipment has been performed utilising the most up-to-date design correlations and computer software.
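For readers unfamiliar with the "del factor" used throughout this analysis: in standard sterilisation design it is the logarithm of the reduction in viable organism count over the process, accumulated through the heating, holding and cooling periods. The abstract does not restate the formula, so the usual definition is given here for orientation only:

```latex
\nabla = \ln\frac{N_0}{N_t} = \int_0^t k\,\mathrm{d}t,
\qquad k = A\, e^{-E_d / RT}
```

where $N_0$ and $N_t$ are the initial and final viable counts, $k$ is the specific death rate, and $E_d$ is the activation energy for thermal death.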
Modeling of laser transmission contour welding process using FEA and DoE
NASA Astrophysics Data System (ADS)
Acherjee, Bappa; Kuar, Arunanshu S.; Mitra, Souren; Misra, Dipten
2012-07-01
In this research, a systematic investigation of the laser transmission contour welding process is carried out using finite element analysis (FEA) and design of experiments (DoE) techniques. First, a three-dimensional thermal model is developed to simulate the laser transmission contour welding process with a moving heat source. The commercial finite element code ANSYS® Multiphysics is used to obtain the numerical results by implementing a volumetric Gaussian heat source and combined convection-radiation boundary conditions. Design of experiments together with regression analysis is then employed to plan the experiments and to develop mathematical models based on the simulation results. Four key process parameters, namely power, welding speed, beam diameter, and carbon black content in the absorbing polymer, are considered as independent variables, while the maximum temperature at the weld interface, weld width, and weld depths in the transparent and absorbing polymers are considered as dependent variables. Sensitivity analysis is performed to determine how different values of an independent variable affect a particular dependent variable.
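The DoE-plus-regression step described here amounts to fitting a response-surface model to the FEA outputs. The sketch below fits a first-order model by ordinary least squares over the four stated parameters; all numbers are placeholders rather than values from the study, and a full RSM model would add quadratic and interaction terms.

```python
import numpy as np

# Placeholder design matrix: power (W), welding speed (mm/s),
# beam diameter (mm), carbon black content (%). One row per FEA run.
X = np.array([[10.0, 20.0, 1.0, 0.05],
              [15.0, 20.0, 1.5, 0.10],
              [10.0, 40.0, 1.5, 0.10],
              [15.0, 40.0, 1.0, 0.05],
              [12.5, 30.0, 1.2, 0.08]])
y = np.array([512.0, 548.0, 471.0, 536.0, 515.0])  # placeholder: max weld temp (K)

A = np.hstack([np.ones((X.shape[0], 1)), X])        # first-order basis: 1, x1..x4
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(x):
    """Predict the response at a new parameter setting."""
    return np.concatenate(([1.0], x)) @ coef

print("prediction at (12 W, 25 mm/s, 1.1 mm, 0.07 %):",
      round(predict(np.array([12.0, 25.0, 1.1, 0.07])), 1))
```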
Research on animation design of growing plant based on 3D MAX technology
NASA Astrophysics Data System (ADS)
Chen, Yineng; Fang, Kui; Bu, Weiqiong; Zhang, Xiaoling; Lei, Menglong
Given the practical demands on the quality, imagery and degree of realism of animations of the plant growing process in virtual plant applications, this paper designs such animations based on the mechanisms and regularities of plant growth and proposes a design method based on 3D MAX technology. Repeated analysis and testing show that modeling, rendering, animation fabrication and other key technologies are involved in the animation design process. On this basis, designers can subdivide the animation into five stages: seed germination, early plant growth, catagen, later growth and blossom. This paper composites the animation of these five stages through the VP window to realize the complete 3D animation. Experimental results show that the animation achieves rapid, visual and realistic simulation of the plant growth process.
Miller, Matthew James; McGuire, Kerry M.; Feigh, Karen M.
2016-01-01
The design and adoption of decision support systems within complex work domains is a challenge for cognitive systems engineering (CSE) practitioners, particularly at the onset of project development. This article presents an example of applying CSE techniques to derive design requirements compatible with traditional systems engineering to guide decision support system development. Specifically, it demonstrates the requirements derivation process based on cognitive work analysis for a subset of human spaceflight operations known as extravehicular activity. The results are presented in two phases. First, a work domain analysis revealed a comprehensive set of work functions and constraints that exist in the extravehicular activity work domain. Second, a control task analysis was performed on a subset of the work functions identified by the work domain analysis to articulate the translation of subject matter states of knowledge to high-level decision support system requirements. This work emphasizes an incremental requirements specification process as a critical component of CSE analyses to better situate CSE perspectives within the early phases of traditional systems engineering design. PMID:28491008
System analysis through bond graph modeling
NASA Astrophysics Data System (ADS)
McBride, Robert Thomas
2005-07-01
Modeling and simulation play an integral role in the engineering design process. An accurate mathematical description of a system provides the design engineer the flexibility to perform trade studies quickly and accurately to expedite the design process. Most often, the mathematical model of the system contains components of different engineering disciplines. A modeling methodology that can handle these types of systems might be used in an indirect fashion to extract added information from the model. This research examines the ability of a modeling methodology to provide added insight into system analysis and design. The modeling methodology used is bond graph modeling. An investigation into the creation of a bond graph model using the Lagrangian of the system is provided. Upon creation of the bond graph, system analysis is performed. To aid in the system analysis, an object-oriented approach to bond graph modeling is introduced. A framework is provided to simulate the bond graph directly. Through object-oriented simulation of a bond graph, the information contained within the bond graph can be exploited to create a measurement of system efficiency. A definition of system efficiency is given. This measurement of efficiency is used in the design of different controllers of varying architectures. Optimal control of a missile autopilot is discussed within the framework of the calculated system efficiency.
Ku-band signal design study. [space shuttle orbiter data processing network]
NASA Technical Reports Server (NTRS)
Rubin, I.
1978-01-01
Analytical tools, methods and techniques for assessing the design and performance of the space shuttle orbiter data processing system (DPS) are provided. The computer data processing network is evaluated in the key areas of queueing behavior, synchronization, and network reliability. The structure of the data processing network is described, as well as the system operation principles and the network configuration. The characteristics of the computer systems are indicated. System reliability measures are defined and studied. System and network invulnerability measures are computed. Communication path and network failure analysis techniques are included.
NASA Technical Reports Server (NTRS)
Wolf, M.
1981-01-01
The effect of solar cell metallization pattern design on solar cell performance and the costs and performance effects of different metallization processes are discussed. Definitive design rules for the front metallization pattern for large area solar cells are presented. Chemical and physical deposition processes for metallization are described and compared. An economic evaluation of the 6 principal metallization options is presented. Instructions for preparing Format A cost data for solar cell manufacturing processes from UPPC forms for input into the SAMIC computer program are presented.
CFD Process Pre- and Post-processing Automation in Support of Space Propulsion
NASA Technical Reports Server (NTRS)
Dorney, Suzanne M.
2003-01-01
The use of Computational Fluid Dynamics, or CFD, has become standard practice in the design and analysis of the major components used for space propulsion. In an attempt to standardize and improve the CFD process, a series of automated tools has been developed. Through the use of these automated tools, the application of CFD to the design cycle has been improved and streamlined. This paper presents a series of applications in which deficiencies were identified in the CFD process and corrected through the development of automated tools.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hilaly, A.K.; Sikdar, S.K.
In this study, the authors introduced several modifications to the WAR (waste reduction) algorithm developed earlier. These modifications were made for systematically handling sensitivity analysis and various tasks of waste minimization. A design hierarchy was formulated to promote appropriate waste reduction tasks at designated levels of the hierarchy. A sensitivity coefficient was used to measure the relative impacts of process variables on the pollution index of a process. The use of the WAR algorithm was demonstrated by a fermentation process for making penicillin.
Lee, Inkyu; Park, Jinwoo; Moon, Il
2017-12-01
This paper describes data for an integrated process: a cryogenic energy storage system combined with a liquefied natural gas (LNG) regasification process. The data in this paper are associated with the article entitled "Conceptual Design and Exergy Analysis of Combined Cryogenic Energy Storage and LNG Regasification Processes: Cold and Power Integration" (Lee et al., 2017) [1]. The data include the sensitivity case-study dataset for the air flow rate and the heat-exchange feasibility data given by composite curves. The data are expected to be helpful for cryogenic energy process development.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Imrich, K. J.
2015-03-27
Corrosion is an extremely complex process that is affected by numerous factors. Addition of a flowing multi-phase solution further complicates the analysis. The synergistic effects of the multiple corrosive species as well as the flow-induced synergistic effects from erosion and corrosion must be thoroughly evaluated in order to predict material degradation responses. Public domain data can help guide the analysis, but cannot reliably provide the design basis especially when the process is one-of-a-kind, designed for 40 plus years of service, and has no viable means for repair or replacement. Testing in representative simulants and environmental conditions with prototypic components will provide a stronger technical basis for design. This philosophy was exemplified by the Defense Waste Processing Facility (DWPF) at the Savannah River Site and only after 15 plus years of successful operation has it been validated. There have been “hiccups”, some identified during the cold commissioning phase and some during radioactive operations, but they were minor and overcome. In addition, the system is robust enough to tolerate most flowsheet changes and the DWPF design allows minor modifications and replacements – approaches not available with the Hanford Waste Treatment Plant (WTP) “Black Cell” design methodology. Based on the available data, the synergistic effect between erosion and corrosion is a credible – virtually certain – degradation mechanism and must be considered for the design of the WTP process systems. Testing is recommended due to the number of variables (e.g., material properties, process parameters, and component design) that can affect synergy between erosion and corrosion and because the available literature is of limited applicability for the complex process chemistries anticipated in the WTP. Applicable testing will provide a reasonable and defensible path forward for design of the WTP Black Cell and Hard-to-Reach process equipment. These conclusions are consistent with findings from the various Bechtel National Inc., Independent Review Teams, and Department of Energy (DOE) reviews. A test methodology is outlined, which should provide a clear, logical road map for the testing that is necessary to provide applicable and defensible data essential to support design calculations.
NASA Astrophysics Data System (ADS)
Brereton, Margot Felicity
A series of short engineering exercises and design projects was created to help students learn to apply abstract knowledge to physical experiences with hardware. The exercises involved designing machines from kits of materials and dissecting and analyzing familiar household products. Students worked in teams. During the activities students brought their knowledge of engineering fundamentals to bear. Videotape analysis was used to identify and characterize the ways in which hardware contributed to learning fundamental concepts. Structural and qualitative analyses of videotaped activities were undertaken. Structural analysis involved counting the references to theory and hardware and the extent of interleaving of references in activity. The analysis found that there was much more discussion linking fundamental concepts to hardware in some activities than in others. The analysis showed that the interleaving of references to theory and hardware in activity is observable and quantifiable. Qualitative analysis was used to investigate the dialog linking concepts and hardware. Students were found to advance their designs and their understanding of engineering fundamentals through a negotiation process in which they pitted abstract concepts against hardware behavior. Through this process students sorted out theoretical assumptions and causal relations. In addition they discovered design assumptions, functional connections and physical embodiments of abstract concepts in hardware, developing a repertoire of familiar hardware components and machines. Hardware was found to be integral to learning, affecting the course of inquiry and the dynamics of group interaction. Several case studies are presented to illustrate the processes at work. The research illustrates the importance of working across the boundary between abstractions and experiences with hardware in order to learn engineering and physical sciences. The research findings are: (a) the negotiation process by which students discover fundamental concepts in hardware (and three central causes of negotiation breakdown); (b) a characterization of the ways that material systems contribute to learning activities (the seven roles of hardware in learning); (c) the characteristics of activities that support discovering fundamental concepts in hardware (plus several engineering exercises); (d) a research methodology to examine how students learn in practice.
Cold Vacuum Drying facility civil structural system design description (SYS 06)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pitkoff, C. C.
This document describes the Cold Vacuum Drying (CVD) Facility civil-structural system. This system consists of the facility structure, including the administrative and process areas. The system's primary purpose is to provide a facility to house the CVD process and personnel and to provide a tertiary level of containment. The document provides a description of the facility and demonstrates how the design meets the various requirements imposed by the safety analysis report and the design requirements document.
Combined process automation for large-scale EEG analysis.
Sfondouris, John L; Quebedeaux, Tabitha M; Holdgraf, Chris; Musto, Alberto E
2012-01-01
Epileptogenesis is a dynamic process producing increased seizure susceptibility. Electroencephalography (EEG) data provides information critical in understanding the evolution of epileptiform changes throughout epileptic foci. We designed an algorithm to facilitate efficient large-scale EEG analysis via linked automation of multiple data processing steps. Using EEG recordings obtained from electrical stimulation studies, the following steps of EEG analysis were automated: (1) alignment and isolation of pre- and post-stimulation intervals, (2) generation of user-defined band frequency waveforms, (3) spike-sorting, (4) quantification of spike and burst data and (5) power spectral density analysis. This algorithm allows for quicker, more efficient EEG analysis. Copyright © 2011 Elsevier Ltd. All rights reserved.
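The five chained steps can be pictured with a compact SciPy sketch. The signal, stimulus time, band, and thresholds below are arbitrary stand-ins; the published algorithm's actual parameters and spike-sorting method are not reproduced here.

```python
import numpy as np
from scipy import signal

fs = 1000.0                                   # sampling rate (Hz), assumed
rng = np.random.default_rng(0)
eeg = rng.standard_normal(int(10 * fs))       # synthetic 10 s trace

# (1) align/isolate: keep the post-stimulation interval (stimulus at t = 2 s)
post = eeg[int(2 * fs):]

# (2) user-defined band frequency waveform, e.g. a 4-12 Hz band-pass
sos = signal.butter(4, [4.0, 12.0], btype="bandpass", fs=fs, output="sos")
band = signal.sosfiltfilt(sos, post)

# (3)-(4) crude amplitude-threshold spike detection and quantification
threshold = 4.0 * band.std()
spike_idx = np.flatnonzero(np.abs(band) > threshold)
print("samples above threshold:", spike_idx.size)

# (5) power spectral density analysis
freqs, psd = signal.welch(band, fs=fs, nperseg=2048)
```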
Experimental Design For Photoresist Characterization
NASA Astrophysics Data System (ADS)
Luckock, Larry
1987-04-01
In processing a semiconductor product (from discrete devices up to the most complex products produced), we find more photolithographic steps in wafer fabrication than any other kind of process step. Thus, the success of a semiconductor manufacturer hinges on the optimization of its photolithographic processes. Yet we find few companies that have taken the time to properly characterize this critical operation; they are sitting in the "passenger's seat", waiting to see what will come out, hoping that the yields will improve someday. There is no "black magic" involved in setting up a process at its optimum conditions (i.e., minimum sensitivity to all variables at the same time). This paper gives an example of a real-world situation in which a photolithographic process is optimized by the use of a properly designed experiment, followed by adequate multidimensional analysis of the data. Basic SPC practices like plotting control charts will not, by themselves, improve yields; the control charts are, however, among the necessary tools used in the determination of the process capability and in the formulation of the problems to be addressed. The example we shall consider is the twofold objective of shifting the process average, while tightening the variance, of polysilicon line widths. This goal was identified from a Pareto analysis of yield-limiting mechanisms, plus inspection of the control charts. A key issue in a characterization of this type of process is the number of interactions between variables; this example rules out two-level full factorial and three-level fractional factorial designs (which cannot detect all of the interactions). We arrive at an experiment with five factors at five levels each. A full factorial design for five factors at five levels would require 3125 wafers. Instead, we will use a design that allows us to run this experiment with only 25 wafers, for a significant reduction in time, materials and manufacturing interruption to complete the experiment. An optimum solution is then determined via response surface analysis, and a series of 3-D and contour plots are shown. The offset between the mask dimensions and the poly CD at the optimum operating conditions is discussed with respect to yield, profits and return on investment. The expert system used for process optimization covers all types of process steps, producing the best custom-designed experiment based on the actual equipment used. The knowledge base contains parameter lists, by machine make and model, ranked by sensitivity and controllability. One option allows 3-D spatial characterization of equipment. For the purpose of this presentation, we will assume that we want to optimize a photolithographic process used for polysilicon pattern definition and that we have determined minimum and maximum line widths based on electrical yield requirements of the product. For this MOS process, the minimum critical dimension (CD) for the poly gate was determined by punchthrough voltage, threshold voltage, etc., while the maximum CD was determined from other performance factors like access time. We will start with the product engineer's analysis.
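The 25-wafer experiment mentioned above, five factors at five levels in 25 runs, corresponds to a standard L25 orthogonal array. Because 5 is prime, such an array can be built from mutually orthogonal Latin squares, as in this sketch of the construction (not the paper's actual layout):

```python
# Construct an OA(25, 5, 5, 2) orthogonal array: 25 runs, 5 factors at
# 5 levels each, strength 2 (every pair of columns contains each of the
# 25 level combinations exactly once). The construction works because
# 5 is prime, so the maps b -> m*b (mod 5) give orthogonal Latin squares.
p = 5
runs = []
for a in range(p):          # level of factor 1
    for b in range(p):      # level of factor 2
        runs.append([a, b] + [(a + m * b) % p for m in range(1, p - 1)])

for run in runs:
    print(run)
```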
Study on the Preliminary Design of ARGO-M Operation System
NASA Astrophysics Data System (ADS)
Seo, Yoon-Kyung; Lim, Hyung-Chul; Rew, Dong-Young; Jo, Jung Hyun; Park, Jong-Uk; Park, Eun-Seo; Park, Jang-Hyun
2010-12-01
Korea Astronomy and Space Science Institute has been developing a mobile satellite laser ranging system named the accurate ranging system for geodetic observation-mobile (ARGO-M). The preliminary design of the ARGO-M operation system (AOS), one of the ARGO-M subsystems, was completed in 2009. The preliminary design results are applied in the following development phase by performing detailed design based on analysis of the pre-defined requirements and of the derived specifications. This paper addresses the preliminary design of the whole AOS. The design results for the operation and control part, which is a key part of the operation system, are described in detail. Analysis results for the interface between the operation-supporting hardware and the control computer, which are necessary for defining the requirements of the operation-supporting hardware, are summarized. The results of this study are expected to be used in the critical design phase to finalize the design process.
An inverse method for the aerodynamic design of three-dimensional aircraft engine nacelles
NASA Technical Reports Server (NTRS)
Bell, R. A.; Cedar, R. D.
1991-01-01
A fast, efficient and user-friendly inverse design system for 3-D nacelles was developed. The system is a product of a 2-D inverse design method originally developed at NASA-Langley and the CFL3D analysis code, which was also developed at NASA-Langley and modified for nacelle analysis. The design system uses a predictor/corrector design approach in which an analysis code is used to calculate the flow field for an initial geometry; the geometry is then modified based on the difference between the calculated and target pressures. A detailed discussion of the design method, the process of linking it to the modified CFL3D solver, and its extension to 3-D is presented. This is followed by a number of examples of the use of the design system for the design of both axisymmetric and 3-D nacelles.
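The predictor/corrector loop, analyze, compare with the target pressures, correct the geometry, can be caricatured in one dimension. The "solver" below is an invented algebraic stand-in for CFL3D, and the update law simply relaxes the shape toward the target; it assumes pressure falls as the shape parameter grows.

```python
import numpy as np

def analyze(g):
    """Invented stand-in for the flow solver: shape parameters -> pressures."""
    return 1.0 - 0.8 * g + 0.1 * g**2            # pressure falls as g grows

cp_target = np.full(50, 0.35)                    # assumed target distribution
g = np.zeros(50)                                 # initial contour parameters
relax = 0.5                                      # under-relaxation factor

for it in range(200):
    residual = cp_target - analyze(g)            # corrector drives this to zero
    if np.abs(residual).max() < 1e-8:
        break
    g -= relax * residual                        # sign choice: dp/dg < 0 here
print(f"converged after {it} iterations; g = {g[0]:.4f}")
```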
Code of Federal Regulations, 2014 CFR
2014-07-01
... permit limit applicable to the process vent. (iv) Design analysis based on accepted chemical engineering... rates, halogenated process vent determinations, process vent TRE index values, and engineering... corrected to 2.3 percent moisture; or (2) The engineering assessment procedures in paragraph (k) of this...
Code of Federal Regulations, 2013 CFR
2013-07-01
... permit limit applicable to the process vent. (iv) Design analysis based on accepted chemical engineering... rates, halogenated process vent determinations, process vent TRE index values, and engineering... corrected to 2.3 percent moisture; or (2) The engineering assessment procedures in paragraph (k) of this...
Code of Federal Regulations, 2011 CFR
2011-07-01
... permit limit applicable to the process vent. (iv) Design analysis based on accepted chemical engineering... rates, halogenated process vent determinations, process vent TRE index values, and engineering... corrected to 2.3 percent moisture; or (2) The engineering assessment procedures in paragraph (k) of this...
Code of Federal Regulations, 2012 CFR
2012-07-01
... permit limit applicable to the process vent. (iv) Design analysis based on accepted chemical engineering... rates, halogenated process vent determinations, process vent TRE index values, and engineering... corrected to 2.3 percent moisture; or (2) The engineering assessment procedures in paragraph (k) of this...
Practical, transparent prospective risk analysis for the clinical laboratory.
Janssens, Pim Mw
2014-11-01
Prospective risk analysis (PRA) is an essential element in quality assurance for clinical laboratories. Practical approaches to conducting PRA in laboratories, however, are scarce. On the basis of the classical Failure Mode and Effect Analysis method, an approach to PRA was developed for application to key laboratory processes. First, the separate, major steps of the process under investigation are identified. Scores are then given for the Probability (P) and Consequence (C) of predefined types of failures and for the chance of Detecting (D) these failures. Based on the P and C scores (on a 10-point scale), an overall Risk score (R) is calculated. The scores for each process were recorded in a matrix table. Based on predetermined criteria for R and D, it was determined whether a more detailed analysis was required for potential failures and, ultimately, where risk-reducing measures were necessary, if any. As an illustration, this paper presents the results of the application of PRA to our pre-analytical and analytical activities. The highest R scores were obtained in the stat processes, the most common failure type in the collective process steps was 'delayed processing or analysis', the failure type with the highest mean R score was 'inappropriate analysis', and the failure type most frequently rated as suboptimal was 'identification error'. The PRA as designed is a useful semi-objective tool to identify process steps with potential failures rated as risky. Its systematic design and convenient output in matrix tables make it easy to perform, practical and transparent. © The Author(s) 2014.
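The scoring arithmetic described, P and C on 10-point scales combined into R, with D deciding whether detection is adequate, fits in a few lines. The step names and action thresholds below are invented for illustration, and the sign convention for D (here, higher = poorer chance of detection) is an assumption.

```python
# Minimal PRA scoring sketch: each process step carries Probability (P),
# Consequence (C) and Detection (D) scores; the overall risk is R = P * C.
steps = {
    "sample collection": {"P": 3, "C": 7, "D": 2},
    "stat analysis":     {"P": 5, "C": 8, "D": 4},
    "result reporting":  {"P": 2, "C": 9, "D": 3},
}
R_LIMIT, D_LIMIT = 30, 3     # invented action thresholds

for name, s in steps.items():
    r = s["P"] * s["C"]
    flag = "detailed analysis" if r >= R_LIMIT or s["D"] >= D_LIMIT else "acceptable"
    print(f"{name}: R={r}, D={s['D']} -> {flag}")
```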
NASA Technical Reports Server (NTRS)
Deckman, G.; Rousseau, J. (Editor)
1973-01-01
The Wash Water Recovery System (WWRS) is intended for use in processing shower bath water onboard a spacecraft. The WWRS utilizes flash evaporation, vapor compression, and pyrolytic reaction to process the wash water to allow recovery of potable water. Wash water flashing and foaming characteristics are evaluated, physical properties of concentrated wash water are determined, and a long-term feasibility study on the system is performed. In addition, a computer analysis of the system and a detailed design of a 10 lb/hr vortex-type water vapor compressor were completed. The computer analysis also sized the remaining system components on the basis of the new vortex compressor design.
Evolutionary computing for the design search and optimization of space vehicle power subsystems
NASA Technical Reports Server (NTRS)
Kordon, Mark; Klimeck, Gerhard; Hanks, David; Hua, Hook
2004-01-01
Evolutionary computing has proven to be a straightforward and robust approach for optimizing a wide range of difficult analysis and design problems. This paper discusses the application of these techniques to an existing space vehicle power subsystem resource and performance analysis simulation in a parallel processing environment. Our preliminary results demonstrate that this approach has the potential to improve the space system trade study process by allowing engineers to statistically weight subsystem goals of mass, cost and performance, then automatically size power elements based on anticipated performance of the subsystem rather than on worst-case estimates.
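The statistically weighted sizing idea can be sketched with a toy genetic algorithm. The two-element power model, the weights, and all coefficients below are invented; they only show the shape of an evolutionary search over element sizes.

```python
import random

# Toy GA: size two power elements (solar array area, battery capacity)
# to minimize a weighted sum of mass, cost and a performance shortfall.
W_MASS, W_COST, W_PERF = 0.3, 0.3, 0.4          # invented stakeholder weights

def fitness(x):
    area, battery = x
    mass = 4.0 * area + 0.5 * battery            # invented sizing relations
    cost = 10.0 * area + 1.2 * battery
    shortfall = max(0.0, 100.0 - (25.0 * area + 2.0 * battery))
    return -(W_MASS * mass + W_COST * cost + W_PERF * 50.0 * shortfall)

pop = [[random.uniform(1, 10), random.uniform(5, 50)] for _ in range(40)]
for _ in range(100):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                           # elitist selection
    pop = parents + [
        [p[i] * random.gauss(1.0, 0.1) for i in range(2)]   # mutate offspring
        for p in random.choices(parents, k=30)
    ]
best = max(pop, key=fitness)
print("best sizing (area m^2, battery Ah):", [round(v, 2) for v in best])
```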
Process Feasibility Study in Support of Silicon Material, Task 1
NASA Technical Reports Server (NTRS)
Li, K. Y.; Hansen, K. C.; Yaws, C. L.
1979-01-01
During this reporting period, major activities were devoted to process system properties, chemical engineering, and economic analyses. Analysis of process system properties was continued for materials involved in the alternate processes under consideration for solar cell grade silicon. The following property data are reported for silicon tetrafluoride: critical constants, vapor pressure, heat of vaporization, heat capacity, density, surface tension, viscosity, thermal conductivity, heat of formation and Gibbs free energy of formation. Chemical engineering analysis of the BCL process was continued, with primary efforts being devoted to the preliminary process design. Status and progress are reported for base case conditions; process flow diagram; reaction chemistry; material and energy balances; and major process equipment design.
Waste processing building with incineration technology
NASA Astrophysics Data System (ADS)
Wasilah, Wasilah; Zaldi Suradin, Muh.
2017-12-01
In Indonesia, waste is one of the major problems of urban society. Based on the Regional Medium Term Development Plan of South Sulawesi Province for 2013-2018, the total volume of waste produced by Makassar City and the Maros, Gowa, and Takalar Regencies is estimated at 9,076.949 m3/person/day. The aim of this design is to present a recommendation for a waste processing facility that accommodates waste processing by incineration technology, supported by ancillary functions such as a place for education and research on waste and the administration of the facility. Implementation of incineration technology would reduce waste volume by up to 90%, with a relatively low possibility of negative impact. The resulting plan takes the form of a landscape layout inspired by analysis of the line patterns in satellite imagery of the planning site, which were translated into the building site pattern. Building orientation was determined through analysis of wind and the sun path using Autodesk Project Vasari software. The footprint is designed with separate circulation systems for the waste management operations and the social visiting activity, in order to minimize crossing and thus bring convenience to the building users. The building masses are designed as an inseparably connected series, from the main building located to the north, connected lengthways to a central visitor area, and onward to the waste processing area and the residue area to the south.
Application of State Analysis and Goal-based Operations to a MER Mission Scenario
NASA Technical Reports Server (NTRS)
Morris, John Richard; Ingham, Michel D.; Mishkin, Andrew H.; Rasmussen, Robert D.; Starbird, Thomas W.
2006-01-01
State Analysis is a model-based systems engineering methodology employing a rigorous discovery process which articulates operations concepts and operability needs as an integrated part of system design. The process produces requirements on system and software design in the form of explicit models which describe the system behavior in terms of state variables and the relationships among them. By applying State Analysis to an actual MER flight mission scenario, this study addresses the specific real world challenges of complex space operations and explores technologies that can be brought to bear on future missions. The paper first describes the tools currently used on a daily basis for MER operations planning and provides an in-depth description of the planning process, in the context of a Martian day's worth of rover engineering activities, resource modeling, flight rules, science observations, and more. It then describes how State Analysis allows for the specification of a corresponding goal-based sequence that accomplishes the same objectives, with several important additional benefits.
Quantitative Analysis of the Efficiency of OLEDs.
Sim, Bomi; Moon, Chang-Ki; Kim, Kwon-Hyeon; Kim, Jang-Joo
2016-12-07
We present a comprehensive model for the quantitative analysis of factors influencing the efficiency of organic light-emitting diodes (OLEDs) as a function of the current density. The model takes into account the contribution made by the charge carrier imbalance, quenching processes, and optical design loss of the device arising from various optical effects including the cavity structure, location and profile of the excitons, effective radiative quantum efficiency, and out-coupling efficiency. Quantitative analysis of the efficiency can be performed with an optical simulation using material parameters and experimental measurements of the exciton profile in the emission layer and the lifetime of the exciton as a function of the current density. This method was applied to three phosphorescent OLEDs based on a single host, mixed host, and exciplex-forming cohost. The three factors (charge carrier imbalance, quenching processes, and optical design loss) were influential in different ways, depending on the device. The proposed model can potentially be used to optimize OLED configurations on the basis of an analysis of the underlying physical processes.
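The factorization underlying such a model is commonly written as a product of the factors the abstract names; this is the textbook decomposition rather than necessarily the paper's exact notation:

```latex
\eta_{EQE}(J) = \gamma(J)\;\eta_{S/T}\;q_{\mathrm{eff}}(J)\;\eta_{\mathrm{out}}
```

where $\gamma$ is the charge-carrier balance, $\eta_{S/T}$ the fraction of emissive excitons, $q_{\mathrm{eff}}$ the effective radiative quantum efficiency (which absorbs the quenching processes), and $\eta_{\mathrm{out}}$ the out-coupling efficiency.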
Ayachit, Utkarsh; Bauer, Andrew; Duque, Earl P. N.; ...
2016-11-01
A key trend facing extreme-scale computational science is the widening gap between computational and I/O rates, and the challenge that follows is how to best gain insight from simulation data when it is increasingly impractical to save it to persistent storage for subsequent visual exploration and analysis. One approach to this challenge is centered around the idea of in situ processing, where visualization and analysis processing is performed while data is still resident in memory. Our paper examines several key design and performance issues related to the idea of in situ processing at extreme scale on modern platforms: scalability, overhead, performance measurement and analysis, comparison and contrast with a traditional post hoc approach, and interfacing with simulation codes. We illustrate these principles in practice with studies, conducted on large-scale HPC platforms, that include a miniapplication and multiple science application codes, one of which demonstrates in situ methods in use at greater than 1M-way concurrency.
Sociotechnical attributes of safe and unsafe work systems
Kleiner, Brian M.; Hettinger, Lawrence J.; DeJoy, David M.; Huang, Yuang-Hsiang; Love, Peter E.D.
2015-01-01
Theoretical and practical approaches to safety based on sociotechnical systems principles place heavy emphasis on the intersections between social–organisational and technical–work process factors. Within this perspective, work system design emphasises factors such as the joint optimisation of social and technical processes, a focus on reliable human–system performance and safety metrics as design and analysis criteria, the maintenance of a realistic and consistent set of safety objectives and policies, and regular access to the expertise and input of workers. We discuss three current approaches to the analysis and design of complex sociotechnical systems: human–systems integration, macroergonomics and safety climate. Each approach emphasises key sociotechnical systems themes, and each prescribes a more holistic perspective on work systems than do traditional theories and methods. We contrast these perspectives with historical precedents such as system safety and traditional human factors and ergonomics, and describe potential future directions for their application in research and practice. Practitioner Summary: The identification of factors that can reliably distinguish between safe and unsafe work systems is an important concern for ergonomists and other safety professionals. This paper presents a variety of sociotechnical systems perspectives on intersections between social–organisational and technology–work process factors as they impact work system analysis, design and operation. PMID:25909756
Efficient sensitivity analysis and optimization of a helicopter rotor
NASA Technical Reports Server (NTRS)
Lim, Joon W.; Chopra, Inderjit
1989-01-01
Aeroelastic optimization of a system essentially consists of the determination of the optimum values of design variables which minimize the objective function and satisfy certain aeroelastic and geometric constraints. The process of aeroelastic optimization analysis is illustrated. To carry out aeroelastic optimization effectively, one needs a reliable analysis procedure to determine the steady response and stability of a rotor system in forward flight. The rotor dynamic analysis used in the present study, developed in-house at the University of Maryland, is based on finite elements in space and time. The analysis consists of two major phases: vehicle trim and rotor steady response (coupled trim analysis), and aeroelastic stability of the blade. For a reduction of helicopter vibration, the optimization process requires the sensitivity derivatives of the objective function and the aeroelastic stability constraints. For this, the derivatives of the steady response, hub loads and blade stability roots are calculated using a direct analytical approach. An automated optimization procedure is developed by coupling the rotor dynamic analysis, design sensitivity analysis and the constrained optimization code CONMIN.
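The loop described, rotor analysis plus analytical sensitivities feeding a constrained optimizer, can be shown generically. Below, SciPy's SLSQP stands in for CONMIN, and the quadratic objective, gradient, and constraint are placeholders rather than the rotor model.

```python
import numpy as np
from scipy.optimize import minimize

# Placeholder objective: a vibration index over two design variables, with
# an analytically supplied gradient (standing in for the direct analytical
# sensitivities of hub loads and blade stability roots).
def objective(x):
    return x[0]**2 + 2.0 * x[1]**2 + 0.5 * x[0] * x[1]

def gradient(x):
    return np.array([2.0 * x[0] + 0.5 * x[1], 4.0 * x[1] + 0.5 * x[0]])

# Surrogate aeroelastic stability constraint, expressed as g(x) >= 0.
cons = {"type": "ineq", "fun": lambda x: 1.0 - (x[0] + x[1])}

res = minimize(objective, x0=np.array([1.0, 1.0]), jac=gradient,
               method="SLSQP", constraints=[cons])
print("optimum design variables:", res.x)
```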
Geometry Modeling and Grid Generation for Design and Optimization
NASA Technical Reports Server (NTRS)
Samareh, Jamshid A.
1998-01-01
Geometry modeling and grid generation (GMGG) have played and will continue to play an important role in computational aerosciences. During the past two decades, tremendous progress has occurred in GMGG; however, GMGG is still the biggest bottleneck to routine applications for complicated Computational Fluid Dynamics (CFD) and Computational Structures Mechanics (CSM) models for analysis, design, and optimization. We are still far from incorporating GMGG tools in a design and optimization environment for complicated configurations. It is still a challenging task to parameterize an existing model in today's Computer-Aided Design (CAD) systems, and the models created are not always good enough for automatic grid generation tools. Designers may believe their models are complete and accurate, but unseen imperfections (e.g., gaps, unwanted wiggles, free edges, slivers, and transition cracks) often cause problems in gridding for CSM and CFD. Despite many advances in grid generation, the process is still the most labor-intensive and time-consuming part of the computational aerosciences for analysis, design, and optimization. In an ideal design environment, a design engineer would use a parametric model to evaluate alternative designs effortlessly and optimize an existing design for a new set of design objectives and constraints. For this ideal environment to be realized, the GMGG tools must have the following characteristics: (1) be automated, (2) provide consistent geometry across all disciplines, (3) be parametric, and (4) provide sensitivity derivatives. This paper will review the status of GMGG for analysis, design, and optimization processes, and it will focus on some emerging ideas that will advance the GMGG toward the ideal design environment.
Pattern database applications from design to manufacturing
NASA Astrophysics Data System (ADS)
Zhuang, Linda; Zhu, Annie; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh
2017-03-01
Pattern-based approaches are becoming more common and popular as the industry moves to advanced technology nodes. At the beginning of a new technology node, a library of process weak-point patterns for physical and electrical verification starts to build up and is used to prevent known hotspots from re-occurring on new designs. The pattern set is then expanded to create test keys for process development, in order to verify manufacturing capability and to pre-check new tape-out designs for any potential yield detractors. As the database grows, the adoption of pattern-based approaches has expanded from design flows to technology development and on to mass-production purposes. This paper will present the complete downstream working flows of a design pattern database (PDB). This pattern-based data analysis flow covers different applications across different functional teams, from generating enhancement kits to improve design manufacturability, to populating new test design data based on previous learning, to generating analysis data to improve mass-production efficiency, and to manufacturing equipment in-line control to check machine status consistency across different fab sites.
Information Flow in the Launch Vehicle Design/Analysis Process
NASA Technical Reports Server (NTRS)
Humphries, W. R., Sr.; Holland, W.; Bishop, R.
1999-01-01
This paper describes the results of a team effort aimed at defining the information flow between disciplines at the Marshall Space Flight Center (MSFC) engaged in the design of space launch vehicles. The information flow is modeled at a first level and is described using three types of templates: an N x N diagram, discipline flow diagrams, and discipline task descriptions. It is intended to provide engineers with an understanding of the connections between what they do and where it fits in the overall design process of the project. It is also intended to provide design managers with a better understanding of information flow in the launch vehicle design cycle.
Designing for Quality: An Analysis of Design and Pedagogical Issues in Online Course Development
ERIC Educational Resources Information Center
Sanga, Mapopa William
2017-01-01
This study investigated the process through which 100 online courses were developed in compliance with a purpose-made rubric designed to bring the courses to a level that would meet requirements of membership in a state authorization reciprocity agreement. The study identified and analyzed common design and pedagogical issues instructors…
2005-06-01
cognitive task analysis, organizational information dissemination and interaction, systems engineering, collaboration and communications processes, decision-making processes, and data collection and organization. By blending these diverse disciplines, command centers can be designed to support decision-making, cognitive analysis, information technology, and the human factors engineering aspects of Command and Control (C2). This model can then be used as a baseline when dealing with work in areas of business processes, workflow engineering, information management,
Tool for Rapid Analysis of Monte Carlo Simulations
NASA Technical Reports Server (NTRS)
Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.
2011-01-01
Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time-consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The Tool for Rapid Analysis of Monte Carlo simulations (TRAM) has been used in recent design and analysis work for the Orion vehicle, greatly decreasing the time it takes to evaluate performance requirements. A previous version of this tool was developed to automatically identify driving design variables in Monte Carlo data sets. This paper describes a new, parallel version of TRAM implemented on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.
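The core sorting idea, pointing the analyst at the input regions associated with each failure type, can be illustrated with pandas. The variable names, dispersions, and failure rule below are hypothetical and are not TRAM's actual interface or Orion data.

```python
import numpy as np
import pandas as pd

# Hypothetical Monte Carlo table: dispersed inputs plus a failure flag.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "wind_speed": rng.normal(10.0, 3.0, 10_000),
    "cg_offset":  rng.normal(0.0, 0.02, 10_000),
    "engine_isp": rng.normal(450.0, 5.0, 10_000),
})
df["landing_failure"] = (df.wind_speed > 14) & (df.cg_offset.abs() > 0.02)

# Compare input statistics of failed vs. successful runs to surface
# the variables that drive this failure type.
print(df.groupby("landing_failure").mean())
```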
Lee, Heewon; Contento, Isobel R.; Koch, Pamela
2012-01-01
Objective To use and review a conceptual model of process evaluation and to examine the implementation of a nutrition education curriculum, Choice, Control & Change, designed to promote dietary and physical activity behaviors that reduce obesity risk. Design A process evaluation study based on a systematic conceptual model. Setting Five middle schools in New York City. Participants 562 students in 20 classes and their science teachers (n = 8). Main Outcome Measures Based on the model, teacher professional development, teacher implementation, and student reception were evaluated. Also measured were teacher characteristics, teachers' curriculum evaluation, and satisfaction with teaching the curriculum. Analysis Descriptive statistics and Spearman's rho correlation for quantitative analysis and content analysis for qualitative data were used. Results Mean score of the teacher professional development evaluation was 4.75 on a 5-point scale. Average teacher implementation rate was 73%, and student reception rate was 69%. Ongoing teacher support was highly valued by teachers. Teachers' satisfaction with teaching the curriculum was highly correlated with students' satisfaction (p < .05). Teachers' perception of the amount of student work was negatively correlated with implementation and with student satisfaction (p < .05). Conclusions and Implications Use of a systematic conceptual model and comprehensive process measures improves understanding of the implementation process and helps educators to better implement interventions as designed. PMID:23321021
A Study of Technical Engineering Peer Reviews at NASA
NASA Technical Reports Server (NTRS)
Chao, Lawrence P.; Tumer, Irem Y.; Bell, David G.
2003-01-01
This report describes the state of practice of design reviews at NASA and research into what can be done to improve peer review practices. There are many types of reviews at NASA: required and not, formalized and informal, programmatic and technical. Standing project formal reviews such as the Preliminary Design Review and Critical Design Review are a required part of every project and mission development. However, the technical engineering peer reviews that support teams' work on such projects are informal, sometimes ad hoc, and inconsistent across the organization. The goal of this work is to identify best practices and lessons learned from NASA's experience, supported by academic research and methodologies, to ultimately improve the process. This research has determined that the organization, composition, scope, and approach of the reviews impact their success. Failure Modes and Effects Analysis (FMEA) can identify key areas of concern before or in the reviews. Product definition tools like the Project Priority Matrix, engineering-focused Customer Value Chain Analysis (CVCA), and project- or system-based Quality Function Deployment (QFD) help prioritize resources in reviews. The use of information technology and structured design methodologies can strengthen the engineering peer review process to help NASA work toward error-proofing the design process.
An Alternative View of Some FIA Sample Design and Analysis Issues
Paul C. Van Deusen
2005-01-01
Sample design and analysis decisions are the result of compromises and inputs from many sources. The end result would likely change if different individuals or groups were involved in the planning process. Discussed here are some alternatives to the procedures that are currently being used for the annual inventory. The purpose is to indicate that alternatives exist and...
Inverse Analysis to Formability Design in a Deep Drawing Process
NASA Astrophysics Data System (ADS)
Buranathiti, Thaweepat; Cao, Jian
Deep drawing is an important process that adds value to flat sheet metal in many industries. An important concern in the design of a deep drawing process is generally formability. This paper aims to present the connection between formability and inverse analysis (IA), a systematic means for determining an optimal blank configuration for a deep drawing process. IA is presented and explored using a commercial finite element software package. A number of numerical studies on the effect of blank configuration on the quality of a part produced by deep drawing were conducted and analyzed. The quality of the drawing processes is numerically analyzed using an explicit incremental nonlinear finite element code. The minimum distance between elemental principal strains and the strain-based forming limit curve (FLC) is defined as the tearing margin and serves as the key performance index (KPI) indicating part quality. The initial blank configuration is shown to play a highly important role in the quality of the product of the deep drawing process. In addition, it is observed that if a blank configuration does not deviate greatly from the one obtained from IA, the blank can still yield a good product. The strain history around the bottom fillet of the part is also examined. The paper concludes that IA is an important part of the design methodology for deep drawing processes.
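As a worked illustration of the tearing-margin KPI described above (not the authors' code), the following sketch computes the minimum distance from elemental principal-strain states to a sampled forming limit curve; the strain values and FLC points are invented.

```python
# Tearing margin: minimum distance from element strain states to the
# FLC, both expressed in (minor strain, major strain) space.
import numpy as np

def tearing_margin(element_strains, flc_points):
    """element_strains: (N, 2) array of (minor, major) principal strains.
    flc_points: (M, 2) polyline sampling of the forming limit curve.
    Returns the smallest element-to-curve-sample distance."""
    d = np.linalg.norm(element_strains[:, None, :] - flc_points[None, :, :],
                       axis=2)          # (N, M) pairwise distances
    return d.min()                      # small margin -> close to tearing

# Illustrative FLC polyline and a few element strain states (made up).
flc = np.array([[-0.2, 0.35], [0.0, 0.25], [0.2, 0.40]])
strains = np.array([[-0.05, 0.10], [0.10, 0.22]])
print(tearing_margin(strains, flc))
```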
Machine Learning: A Crucial Tool for Sensor Design
Zhao, Weixiang; Bhushan, Abhinav; Santamaria, Anthony D.; Simon, Melinda G.; Davis, Cristina E.
2009-01-01
Sensors have been widely used for disease diagnosis, environmental quality monitoring, food quality control, industrial process analysis and control, and other related fields. As a key tool for sensor data analysis, machine learning is becoming a core part of novel sensor design. Dividing a complete machine learning process into three steps: data pre-treatment, feature extraction and dimension reduction, and system modeling, this paper provides a review of the methods that are widely used for each step. For each method, the principles and the key issues that affect modeling results are discussed. After reviewing the potential problems in machine learning processes, this paper gives a summary of current algorithms in this field and provides some feasible directions for future studies. PMID:20191110
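A minimal sketch of the three-step structure the review describes, expressed as a scikit-learn pipeline on synthetic sensor-like data; the specific estimators (standardization, PCA, an SVM) are illustrative choices, not the paper's recommendations.

```python
# Pre-treatment -> dimension reduction -> system modeling, as one pipeline.
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=300, n_features=40, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

model = Pipeline([
    ("pretreat", StandardScaler()),      # step 1: data pre-treatment
    ("reduce", PCA(n_components=10)),    # step 2: dimension reduction
    ("classify", SVC()),                 # step 3: system modeling
])
model.fit(X_tr, y_tr)
print("held-out accuracy:", model.score(X_te, y_te))
```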
Constellation Coverage Analysis
NASA Technical Reports Server (NTRS)
Lo, Martin W. (Compiler)
1997-01-01
The design of satellite constellations requires an understanding of the dynamic global coverage provided by the constellations. Even for a small constellation with a simple circular orbit propagator, the combinatorial nature of the analysis frequently renders the problem intractable. Particularly for the initial design phase where the orbital parameters are still fluid and undetermined, the coverage information is crucial to evaluate the performance of the constellation design. We have developed a fast and simple algorithm for determining the global constellation coverage dynamically using image processing techniques. This approach provides a fast, powerful and simple method for the analysis of global constellation coverage.
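The underlying image-processing idea can be illustrated by rasterizing each satellite's footprint onto a latitude/longitude grid and counting covered cells, rather than intersecting footprints combinatorially. The sketch below is a simplified stand-in for the authors' algorithm, with footprints idealized as spherical caps of a given central half-angle.

```python
# Grid-based (raster) estimate of instantaneous constellation coverage.
import numpy as np

def coverage_fraction(subpoints_deg, half_angle_deg, grid_step=1.0):
    """subpoints_deg: list of (lat, lon) sub-satellite points in degrees.
    half_angle_deg: footprint radius as a central angle (simplified)."""
    lats = np.arange(-90 + grid_step / 2, 90, grid_step)
    lons = np.arange(-180 + grid_step / 2, 180, grid_step)
    lon_g, lat_g = np.meshgrid(np.radians(lons), np.radians(lats))
    covered = np.zeros(lat_g.shape, dtype=bool)
    for lat0, lon0 in subpoints_deg:
        lat0, lon0 = np.radians(lat0), np.radians(lon0)
        # central angle between each grid cell and the sub-satellite point
        cosc = (np.sin(lat0) * np.sin(lat_g)
                + np.cos(lat0) * np.cos(lat_g) * np.cos(lon_g - lon0))
        covered |= cosc >= np.cos(np.radians(half_angle_deg))
    # weight cells by cos(latitude) to approximate true area fraction
    w = np.cos(lat_g)
    return (covered * w).sum() / w.sum()

print(coverage_fraction([(0, 0), (0, 90), (45, -120)], half_angle_deg=25))
```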
40 CFR 240.203-2 - Recommended procedures: Design.
Code of Federal Regulations, 2013 CFR
2013-07-01
§ 240.203-2 Recommended procedures: Design. (a) The types, amounts (by weight and volume), and characteristics of all solid wastes expected to be processed should be determined by survey and analysis. The...
Evolution of Ada technology in the flight dynamics area: Design phase analysis
NASA Technical Reports Server (NTRS)
Quimby, Kelvin L.; Esker, Linda
1988-01-01
The software engineering issues related to the use of the Ada programming language during the design phase of an Ada project are analyzed. Discussion shows how an evolving understanding of these issues is reflected in the design processes of three generations of Ada projects.
Study of process variables associated with manufacturing hermetically-sealed nickel-cadmium cells
NASA Technical Reports Server (NTRS)
Miller, L.; Doan, D. J.; Carr, E. S.
1971-01-01
A program to determine and study the critical process variables associated with the manufacture of aerospace, hermetically-sealed, nickel-cadmium cells is described. The determination and study of the process variables associated with the positive and negative plaque impregnation/polarization process are emphasized. The experimental data resulting from the implementation of fractional factorial design experiments are analyzed by means of a linear multiple regression analysis technique. This analysis permits the selection of preferred levels for certain process variables to achieve desirable impregnated plaque characteristics.
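As a hedged illustration of this analysis style, the sketch below fits a linear model to coded factor levels from a small fractional factorial design; the factors, levels, and response values are invented for the example and are not the study's data.

```python
# Regression analysis of a 2^(3-1) fractional factorial experiment.
import numpy as np

# Coded levels (-1/+1) for three process variables over 4 runs,
# using the generator C = AB (illustrative values only).
X = np.array([[-1, -1,  1],
              [ 1, -1, -1],
              [-1,  1, -1],
              [ 1,  1,  1]], dtype=float)
y = np.array([1.52, 1.71, 1.48, 1.90])   # e.g., plaque loading (made up)

# Least-squares fit with an intercept column; the coefficients
# estimate each variable's effect on the response.
A = np.hstack([np.ones((4, 1)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("intercept and factor effects:", coef)
```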
Design of A Cyclone Separator Using Approximation Method
NASA Astrophysics Data System (ADS)
Sin, Bong-Su; Choi, Ji-Won; Lee, Kwon-Hee
2017-12-01
A separator is a device installed in industrial applications to separate mixed substances. The separator of interest in this research is a cyclone type used to separate a steam-brine mixture in a geothermal plant. The most important performance measure of the cyclone separator is the collection efficiency, which in this study is predicted by CFD (computational fluid dynamics) analysis. This research defines six shape design variables and seeks to maximize the collection efficiency, which is set up as the objective function in the optimization process. Since each CFD analysis requires a great deal of computation time, it is impractical to obtain the optimal solution by coupling a gradient-based optimization algorithm directly to the analysis. Thus, two approximation methods are introduced to obtain an optimum design. In this process, an L18 orthogonal array is adopted as the DOE method, and kriging interpolation is adopted to generate the metamodel for the collection efficiency. Based on the 18 analysis results, the relative importance of each variable to the collection efficiency is obtained through ANOVA (analysis of variance). The final design is suggested considering the results obtained from the two optimization methods. The fluid flow analysis of the cyclone separator is conducted using the commercial CFD software ANSYS-CFX.
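A minimal sketch of the surrogate step under stated assumptions: fit a kriging (Gaussian process) metamodel to a small DOE sample of collection efficiency, then query the metamodel cheaply in place of CFD runs. The sample values below are synthetic stand-ins for CFD results, and only two of the six shape variables are shown.

```python
# Kriging metamodel over a DOE sample, queried in place of CFD runs.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Illustrative DOE: 18 samples of two shape variables, with a
# synthetic "efficiency" surface standing in for CFD results.
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(18, 2))
y = 0.9 - (X[:, 0] - 0.6) ** 2 - 0.5 * (X[:, 1] - 0.4) ** 2

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(),
                              normalize_y=True)
gp.fit(X, y)

# Query the metamodel on a dense grid and report the best candidate.
grid = np.array([[a, b] for a in np.linspace(0, 1, 50)
                        for b in np.linspace(0, 1, 50)])
pred = gp.predict(grid)
print("best predicted design:", grid[pred.argmax()], pred.max())
```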
An Integrated Approach for Conducting a Behavioral Systems Analysis
ERIC Educational Resources Information Center
Diener, Lori H.; McGee, Heather M.; Miguel, Caio F.
2009-01-01
The aim of this paper is to illustrate how to conduct a Behavioral Systems Analysis (BSA) to aid in the design of targeted performance improvement interventions. BSA is a continuous process of analyzing the right variables to the right extent to aid in planning and managing performance at the organization, process, and job levels. BSA helps to…
NASA Astrophysics Data System (ADS)
Sirirojvisuth, Apinut
In complex aerospace system design, making an effective design decision requires multidisciplinary knowledge from both product and process perspectives. Integrating manufacturing considerations into the design process is most valuable during the early design stages, since designers have more freedom to integrate new ideas when changes are relatively inexpensive in terms of time and effort. Metrics related to manufacturability include cost, time, and manufacturing readiness level (MRL); yet there is a lack of structured methodology that quantifies how changes in design decisions impact these metrics. As a result, a new set of integrated cost analysis tools is proposed in this study to quantify those impacts. Equally important is the capability to integrate this new cost tool into existing design methodologies without sacrificing the agility and flexibility required during the early design phases. To demonstrate the applicability of this concept, a ModelCenter environment is used to develop a software architecture that represents the Integrated Product and Process Development (IPPD) methodology used in several aerospace system designs. The environment seamlessly integrates product and process analysis tools and makes an effective transition from one design phase to the next while retaining knowledge gained a priori. Then an advanced cost estimating tool called the Hybrid Lifecycle Cost Estimating Tool (HLCET), a hybrid combination of weight-, process-, and activity-based estimating techniques, is integrated with the design framework. A new weight-based lifecycle cost model is created based on Tailored Cost Model (TCM) equations [3]. This lifecycle cost tool estimates program cost from vehicle component weights and programmatic assumptions. Higher-fidelity cost tools, such as process-based and activity-based cost analysis methods, can then be used to modify the baseline TCM result as more knowledge is accumulated over design iterations. With this concept, the additional manufacturing knowledge can be used to identify a more accurate lifecycle cost and to facilitate higher-fidelity tradeoffs during conceptual and preliminary design. The Advanced Composite Cost Estimating Model (ACCEM) is employed as the process-based cost component, replacing the original TCM result for composite part production cost. The replacement is needed because TCM estimates production costs from part weights on the assumption of subtractive manufacturing of metallic parts, such as casting, forging, and machining; a complexity factor can sometimes be adjusted to reflect different metals and machine settings, but the TCM assumption gives erroneous results when applied to additive processes like those of composite manufacturing. Another innovative aspect of this research is the introduction of a work measurement technique called the Maynard Operation Sequence Technique (MOST), used similarly to the Activity-Based Costing (ABC) approach, to estimate the manufacturing time of a part by breaking down the operations that occur during its production. ABC allows a realistic determination of the cost incurred in each activity, as opposed to the traditional methods of time estimation by analogy or by response surface equations fit to historical process data. The MOST concept provides a tailored study of an individual process, as is typically required for a new, innovative design. Nevertheless, MOST has some challenges, one of which is its requirement to build a new process model from the ground up.
The process development requires subject matter expertise in the manufacturing method of the particular design. The subject matter expert (SME) must also have a comprehensive understanding of the MOST system so that the correct parameters are chosen. In practice, these knowledge requirements may demand people from outside the design discipline and prior training in MOST. To relieve this constraint, the study includes an entirely new sub-system architecture that comprises 1) a knowledge-based system to provide the required knowledge during process selection; and 2) a new user interface to guide parameter selection when building the process using MOST. Also included in this study is a demonstration of how HLCET and its constituents can be integrated with Georgia Tech's Integrated Product and Process Development (IPPD) methodology. The applicability of this work is shown through a complex aerospace design example, to gain insight into how manufacturing knowledge helps make better design decisions during the early stages. The setup process is explained, with an example of its utility demonstrated in a hypothetical fighter aircraft wing redesign. An evaluation of the system's effectiveness against existing methodologies concludes the thesis.
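A rough sketch of the hybrid-estimating idea behind HLCET, with every name and number invented for illustration: start from a weight-based baseline cost and replace composite-part entries with process-based estimates as higher-fidelity manufacturing knowledge becomes available.

```python
# Hybrid cost roll-up: weight-based baseline with process-based overrides.
def weight_based_cost(weight_lb, cer_coeff=1200.0, cer_exp=0.92):
    """Placeholder cost-estimating relationship: cost = a * W^b."""
    return cer_coeff * weight_lb ** cer_exp

parts = {
    "wing_skin": {"weight": 850.0, "composite": True},
    "fuselage":  {"weight": 2300.0, "composite": False},
}
process_based = {"wing_skin": 1.31e6}   # e.g., an ACCEM-style estimate

total = 0.0
for name, p in parts.items():
    if p["composite"] and name in process_based:
        total += process_based[name]    # higher-fidelity replacement
    else:
        total += weight_based_cost(p["weight"])
print(f"hybrid estimate: ${total:,.0f}")
```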
Mechanical Design of a Performance Test Rig for the Turbine Air-Flow Task (TAFT)
NASA Technical Reports Server (NTRS)
Xenofos, George; Forbes, John; Farrow, John; Williams, Robert; Tyler, Tom; Sargent, Scott; Moharos, Jozsef
2003-01-01
To support development of the Boeing-Rocketdyne RS84 rocket engine, a full-flow, reaction turbine geometry was integrated into the NASA-MSFC turbine air-flow test facility. A mechanical design was generated which minimized the amount of new hardware while incorporating all test and instrumentation requirements. This paper provides details of the mechanical design for this Turbine Air-Flow Task (TAFT) test rig. The mechanical design process utilized for this task included the following basic stages: conceptual design, preliminary design, detailed design, design baseline (including configuration control and drawing revision), fabrication, and assembly. During the design process, many lessons were learned that should benefit future test rig design projects. Of primary importance are well-defined requirements early in the design process, a thorough detailed design package, and effective communication with both the customer and the fabrication contractors. The test rig provided steady and unsteady pressure data necessary to validate the computational fluid dynamics (CFD) code. The rig also helped characterize the turbine blade loading conditions. Test and CFD analysis results are to be presented in another JANNAF paper.
From the past to the future: Integrating work experience into the design process.
Bittencourt, João Marcos; Duarte, Francisco; Béguin, Pascal
2017-01-01
Integrating work activity issues into design process is a broadly discussed theme in ergonomics. Participation is presented as the main means for such integration. However, a late participation can limit the development of both project solutions and future work activity. This article presents the concept of construction of experience aiming at the articulated development of future activities and project solutions. It is a non-teleological approach where the initial concepts will be transformed by the experience built up throughout the design process. The method applied was a case study of an ergonomic participation during the design of a new laboratory complex for biotechnology research. Data was obtained through analysis of records in a simulation process using a Lego scale model and interviews with project participants. The simulation process allowed for developing new ways of working and generating changes in the initial design solutions, which enable workers to adopt their own developed strategies for conducting work more safely and efficiently in the future work system. Each project decision either opens or closes a window of opportunities for developing a future activity. Construction of experience in a non-teleological design process allows for understanding the consequences of project solutions for future work.
Learning About Ares I from Monte Carlo Simulation
NASA Technical Reports Server (NTRS)
Hanson, John M.; Hall, Charlie E.
2008-01-01
This paper addresses Monte Carlo simulation analyses that are being conducted to understand the behavior of the Ares I launch vehicle, and to assist with its design. After describing the simulation and modeling of Ares I, the paper addresses the process used to determine what simulations are necessary, and the parameters that are varied in order to understand how the Ares I vehicle will behave in flight. Outputs of these simulations furnish a significant group of design customers with data needed for the development of Ares I and of the Orion spacecraft that will ride atop Ares I. After listing the customers, examples of many of the outputs are described. Products discussed in this paper include those that support structural loads analysis, aerothermal analysis, flight control design, failure/abort analysis, determination of flight performance reserve, examination of orbit insertion accuracy, determination of the Upper Stage impact footprint, analysis of stage separation, analysis of launch probability, analysis of first stage recovery, thrust vector control and reaction control system design, liftoff drift analysis, communications analysis, umbilical release, acoustics, and design of jettison systems.
FRANOPP: Framework for analysis and optimization problems user's guide
NASA Technical Reports Server (NTRS)
Riley, K. M.
1981-01-01
Framework for Analysis and Optimization Problems (FRANOPP) is a software aid for the study and solution of design (optimization) problems which provides the driving program and plotting capability for a user-generated programming system. In addition to FRANOPP, the programming system contains the optimization code CONMIN and two user-supplied codes, one for analysis and one for output. With FRANOPP the user is provided with five options for studying a design problem. Three of the options utilize the plot capability and present an in-depth study of the design problem. The study can be focused on a history of the optimization process or on the interaction of variables within the design problem.
Mechanical System Analysis/Design Tool (MSAT) Quick Guide
NASA Technical Reports Server (NTRS)
Lee, HauHua; Kolb, Mark; Madelone, Jack
1998-01-01
MSAT is a unique multi-component, multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and the data transfer links between them. This modular architecture enables rapid generation of input streams for trade-off studies of various engine configurations. The data transfer links automatically transport output from one application as relevant input to the next once the sequence is set up by the user. The computations are managed via constraint propagation, with the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during the detailed design stage of the product development process.
CONFIG: Integrated engineering of systems and their operation
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Ryan, Dan; Fleming, Land
1994-01-01
This article discusses CONFIG 3, a prototype software tool that supports integrated conceptual design evaluation from early in the product life cycle, by supporting isolated or integrated modeling, simulation, and analysis of the function, structure, behavior, failures and operations of system designs. Integration and reuse of models is supported in an object-oriented environment providing capabilities for graph analysis and discrete event simulation. CONFIG supports integration among diverse modeling approaches (component view, configuration or flow path view, and procedure view) and diverse simulation and analysis approaches. CONFIG is designed to support integrated engineering in diverse design domains, including mechanical and electro-mechanical systems, distributed computer systems, and chemical processing and transport systems.
Preliminary Design and Analysis of the ARES Atmospheric Flight Vehicle Thermal Control System
NASA Technical Reports Server (NTRS)
Gasbarre, J. F.; Dillman, R. A.
2003-01-01
The Aerial Regional-scale Environmental Survey (ARES) is a proposed 2007 Mars Scout Mission that will be the first mission to deploy an atmospheric flight vehicle (AFV) on another planet. This paper will describe the preliminary design and analysis of the AFV thermal control system for its flight through the Martian atmosphere and also present other analyses broadening the scope of that design to include other phases of the ARES mission. Initial analyses are discussed and results of trade studies are presented which detail the design process for AFV thermal control. Finally, results of the most recent AFV thermal analysis are shown and the plans for future work are discussed.
A New Capability for Nuclear Thermal Propulsion Design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amiri, Benjamin W.; Nuclear and Radiological Engineering Department, University of Florida, Gainesville, FL 32611; Kapernick, Richard J.
2007-01-30
This paper describes a new capability for Nuclear Thermal Propulsion (NTP) design that has been developed, and presents the results of some analyses performed with this design tool. The purpose of the tool is to design to specified mission and material limits while maximizing system thrust-to-weight. The head end of the design tool utilizes the ROCket Engine Transient Simulation (ROCETS) code to generate a system design and system design requirements as inputs to the core analysis. ROCETS is a modular system-level code which has been used extensively in the liquid rocket engine industry for many years. The core design tool performs high-fidelity reactor core nuclear and thermal-hydraulic design analysis. At the heart of this process are two codes, TMSS-NTP and NTPgen, which together greatly automate the analysis, providing the capability to rapidly produce designs that meet all specified requirements while minimizing mass. A PERL-based command script, called CORE DESIGNER, controls the execution of these two codes and checks for convergence throughout the process. TMSS-NTP is executed first, to produce a suite of core designs that meet the specified reactor core mechanical, thermal-hydraulic, and structural requirements. The suite of designs consists of a set of core layouts and, for each core layout, specific designs that span a range of core fuel volumes. NTPgen generates MCNPX models for each of the core designs from TMSS-NTP. Iterative analyses are performed in NTPgen until a reactor design (fuel volume) is identified for each core layout that meets cold and hot operation reactivity requirements and that is zoned to meet a radial core power distribution requirement.
Feasibility of Supersonic Aircraft Concepts for Low-Boom and Flight Trim Constraints
NASA Technical Reports Server (NTRS)
Li, Wu
2015-01-01
This paper documents a process for analyzing whether a particular supersonic aircraft configuration layout and a given cruise condition are feasible to achieve a trimmed low-boom design. This process was motivated by the need to know whether a particular configuration at a given cruise condition could be reshaped to satisfy both low-boom and flight trim constraints. Without such a process, much effort could be wasted on shaping a configuration layout at a cruise condition that could never satisfy both low-boom and flight trim constraints simultaneously. The process helps to exclude infeasible configuration layouts with minimum effort and allows a designer to develop trimmed low-boom concepts more effectively. A notional low-boom supersonic demonstrator concept is used to illustrate the analysis/design process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schaffner, Michael
2014-06-01
The current downward trend in funding for U.S. defense systems seems to be on a collision course with the state of the practice in systems engineering, which typically results in an increased pace and scale of capabilities and, consequently, increased cost of complex national defense systems. Recent advances in the state of the art in systems engineering methodology can be leveraged to address this growing challenge. The present work leverages advanced constructs and methods for early-phase conceptual design of complex systems, when committed costs are still low and management influence is still high. First, a literature review is presented of the topics relevant to this work, including approaches to the design of affordable systems, assumptions and methods of exploratory modeling, and enabling techniques to help mitigate the computational challenges involved. The types, purposes, and limits of early-phase, exploratory models are then elucidated. The RSC-based Method for Affordable Concept Selection (RMACS) is described, which comprises nine processes in the three main thrusts of information gathering, evaluation, and analysis. The method is then applied to a naval ship case example, described as the Next-Generation Combat Ship, with representational information outputs and discussions of affordability with respect to each process. The ninth process, Multi-Era Analysis (MERA), is introduced and explicated, including required and optional informational components, temporal and change-related considerations, required and optional activities involved, and the potential types of outputs from the process. The MERA process is then applied to a naval ship case example similar to that of the RMACS application, but with discrete change options added to enable a tradespace network. The seven activities of the MERA process are demonstrated, with the salient outputs of each given and discussed. Additional thoughts are presented on MERA and RMACS, and eight distinct areas are identified for further research in the MERA process, along with a brief description of the directions that such research might take. It is concluded that the affordability of complex systems can be better enabled through a conceptual design method that incorporates MERA as well as metrics such as Multi-Attribute Expense, Max Expense, and Expense Stability. It is also found that the affordability of changeable systems can be better enabled through the use of existing path-planning algorithms for efficient evaluation and analysis of long-term strategies. Finally, it is found that MERA enables the identification and analysis of path-dependent considerations related to designs, epochs, strategies, and change options across many possible futures.
NASA Technical Reports Server (NTRS)
Scola, Salvatore; Stavely, Rebecca; Jackson, Trevor; Boyer, Charlie; Osmundsen, Jim; Turczynski, Craig; Stimson, Chad
2016-01-01
Performance-related effects of system level temperature changes can be a key consideration in the design of many types of optical instruments. This is especially true for space-based imagers, which may require complex thermal control systems to maintain alignment of the optical components. Structural-Thermal-Optical-Performance (STOP) analysis is a multi-disciplinary process that can be used to assess the performance of these optical systems when subjected to the expected design environment. This type of analysis can be very time consuming, which makes it difficult to use as a trade study tool early in the project life cycle. In many cases, only one or two iterations can be performed over the course of a project. This limits the design space to best practices, since it may be too difficult, or take too long, to test new concepts analytically. To overcome this challenge, automation and a standard procedure for performing these studies are essential. A methodology was developed within the framework of the Comet software tool that captures the basic inputs, outputs, and processes used in most STOP analyses. This resulted in a generic, reusable analysis template that can be used for design trades for a variety of optical systems. The template captures much of the upfront setup such as meshing, boundary conditions, data transfer, naming conventions, and post-processing, and therefore saves time for each subsequent project. A description of the methodology and the analysis template is presented, and results are described for a simple telescope optical system.
Co-Simulation for Advanced Process Design and Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stephen E. Zitney
2009-01-01
Meeting the increasing demand for clean, affordable, and secure energy is arguably the most important challenge facing the world today. Fossil fuels can play a central role in a portfolio of carbon-neutral energy options provided CO2 emissions can be dramatically reduced by capturing CO2 and storing it safely and effectively. The fossil energy industry faces the challenge of meeting aggressive design goals for next-generation power plants with CCS. Process designs will involve large, highly integrated, and multipurpose systems with advanced equipment items having complex geometries and multiphysics. APECS is enabling software that facilitates effective integration, solution, and analysis of high-fidelity process/equipment (CFD) co-simulations. APECS helps to optimize fluid flow and related phenomena that impact overall power plant performance. APECS offers many advanced capabilities including ROMs, design optimization, parallel execution, stochastic analysis, and virtual plant co-simulations. NETL and its collaborative R&D partners are using APECS to reduce the time, cost, and technical risk of developing high-efficiency, zero-emission power plants with CCS.
NASA Technical Reports Server (NTRS)
Agnone, A. M.
1972-01-01
The factors affecting a tangential fuel injector design for scramjet operation are reviewed and their effect on the efficiency of the supersonic combustion process is evaluated using both experimental data and theoretical predictions. A description of the physical problem of supersonic combustion and method of analysis is followed by a presentation and evaluation of some standard and exotic types of fuel injectors. Engineering fuel injector design criteria and hydrogen ignition schemes are presented along with a cursory review of available experimental data. A two-dimensional tangential fuel injector design is developed using analyses as a guide in evaluating the effects on the combustion process of various initial and boundary conditions including splitter plate thickness, injector wall temperature, pressure gradients, etc. The fuel injector wall geometry is shaped so as to maintain approximately constant pressure at the flame as required by a cycle analysis. A viscous characteristics program which accounts for lateral as well as axial pressure variations due to the mixing and combustion process is used in determining the wall geometry.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pope, G.A.; Sepehrnoori, K.
1995-12-31
The objective of this research is to develop cost-effective surfactant flooding technology by using simulation studies to evaluate and optimize alternative design strategies, taking into account reservoir characteristics, process chemistry, and process design options such as horizontal wells. Task 1 is the development of an improved numerical method for our simulator that will enable us to solve a wider class of these difficult simulation problems accurately and affordably. Task 2 is the application of this simulator to the optimization of surfactant flooding to reduce its risk and cost. In this quarter, we have continued working on Task 2 to optimize surfactant flooding design and have added economic analysis to the optimization process. An economic model was developed using a spreadsheet and the discounted cash flow (DCF) method of economic analysis. The model was designed specifically for a domestic onshore surfactant flood and has been used to economically evaluate previous work that used a technical approach to optimization. The DCF model outputs common economic decision-making criteria, such as net present value (NPV), internal rate of return (IRR), and payback period.
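The three DCF criteria named in the abstract are straightforward to compute; the sketch below shows NPV, a bisection-based IRR, and payback period for an invented cash flow stream, not the project's actual economics.

```python
# Discounted cash flow criteria: NPV, IRR (bisection), payback period.
def npv(rate, cash_flows):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=10.0, tol=1e-6):
    while hi - lo > tol:                 # bisect on NPV(rate) = 0
        mid = (lo + hi) / 2
        if npv(mid, cash_flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def payback_period(cash_flows):
    total = 0.0
    for t, cf in enumerate(cash_flows):
        total += cf
        if total >= 0:
            return t
    return None

# Hypothetical flood: year-0 investment followed by incremental revenue.
flows = [-5_000_000, 900_000, 1_400_000, 1_800_000, 1_800_000, 1_500_000]
print(f"NPV @10%: {npv(0.10, flows):,.0f}")
print(f"IRR: {irr(flows):.1%}, payback: year {payback_period(flows)}")
```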
Designing Scenarios for Controller-in-the-Loop Air Traffic Simulations
NASA Technical Reports Server (NTRS)
Kupfer, Michael; Mercer, Joey S.; Cabrall, Christopher; Callantine, Todd
2013-01-01
Well prepared traffic scenarios contribute greatly to the success of controller-in-the-loop simulations. This paper describes each stage in the design process of realistic scenarios based on real-world traffic, to be used in the Airspace Operations Laboratory for simulations within the Air Traffic Management Technology Demonstration 1 effort. The steps from the initial analysis of real-world traffic, to the editing of individual aircraft records in the scenario file, until the final testing of the scenarios before the simulation conduct, are all described. The iterative nature of the design process and the various efforts necessary to reach the required fidelity, as well as the applied design strategies, challenges, and tools used during this process are also discussed.
Computer Design Technology of the Small Thrust Rocket Engines Using CAE / CAD Systems
NASA Astrophysics Data System (ADS)
Ryzhkov, V.; Lapshin, E.
2018-01-01
The paper presents an algorithm for designing a liquid small-thrust rocket engine; the process consists of five aggregated stages with feedback. Three stages of the algorithm provide engineering support for the design, and two stages cover the actual engine design. A distinctive feature of the proposed approach is a thorough study of the main technical solutions at the engineering-analysis stage, with interaction with the accumulated knowledge (data) base, which accelerates the process and improves design quality. Using the multifunctional graphics package Siemens NX makes it possible to obtain the final product, the rocket engine, and a complete set of design documentation in a fairly short time; the engine design does not require lengthy experimental development.
Analysis of launch site processing effectiveness for the Space Shuttle 26R payload
NASA Technical Reports Server (NTRS)
Flores, Carlos A.; Heuser, Robert E.; Pepper, Richard E., Jr.; Smith, Anthony M.
1991-01-01
A trend analysis study has been performed on problem reports recorded during the Space Shuttle 26R payload's processing cycle at NASA-Kennedy, using the defect-flow analysis (DFA) methodology; DFA gives attention to the characteristics of the problem-report 'population' as a whole. It is established that the problem reports contain data which distract from pressing problems, and that fully 60 percent of such reports were caused during processing at NASA-Kennedy. The second major cause of problem reports was design defects.
COBRApy: COnstraints-Based Reconstruction and Analysis for Python.
Ebrahim, Ali; Lerman, Joshua A; Palsson, Bernhard O; Hyduke, Daniel R
2013-08-08
COnstraint-Based Reconstruction and Analysis (COBRA) methods are widely used for genome-scale modeling of metabolic networks in both prokaryotes and eukaryotes. Due to the successes with metabolism, there is an increasing effort to apply COBRA methods to reconstruct and analyze integrated models of cellular processes. The COBRA Toolbox for MATLAB is a leading software package for genome-scale analysis of metabolism; however, it was not designed to elegantly capture the complexity inherent in integrated biological networks and lacks an integration framework for the multiomics data used in systems biology. The openCOBRA Project is a community effort to promote constraints-based research through the distribution of freely available software. Here, we describe COBRA for Python (COBRApy), a Python package that provides support for basic COBRA methods. COBRApy is designed in an object-oriented fashion that facilitates the representation of the complex biological processes of metabolism and gene expression. COBRApy does not require MATLAB to function; however, it includes an interface to the COBRA Toolbox for MATLAB to facilitate use of legacy codes. For improved performance, COBRApy includes parallel processing support for computationally intensive processes. COBRApy is an object-oriented framework designed to meet the computational challenges associated with the next generation of stoichiometric constraint-based models and high-density omics data sets. http://opencobra.sourceforge.net/
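A minimal usage sketch of COBRApy's core workflow; the SBML file name below is a placeholder and assumes a genome-scale model is available locally.

```python
# Flux balance analysis with COBRApy (model file path is hypothetical).
import cobra

model = cobra.io.read_sbml_model("e_coli_core.xml")  # placeholder path
solution = model.optimize()            # maximize the model's objective
print(solution.objective_value)        # e.g., predicted growth rate
print(solution.fluxes.head())          # pandas Series of reaction fluxes
```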
Methodology for object-oriented real-time systems analysis and design: Software engineering
NASA Technical Reports Server (NTRS)
Schoeffler, James D.
1991-01-01
Successful application of software engineering methodologies requires an integrated analysis and design life cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structurings of the system, so the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation but the original specification, and perhaps the high-level design, is not object oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects, where the operations or methods of the objects correspond to processes in the data flow diagrams, and then designing in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time-behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly connected models that progress from the logical models of real-time systems analysis through the physical architectural models to the high-level design stages. The methodology is appropriate to the overall specification, including hardware and software modules. In software modules, the systems-analysis objects are transformed into software objects.
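A minimal sketch of the 'real-time systems-analysis object' concept, assuming (as the text describes) that an entity's time-behavior is captured as states plus state-transition rules that carry over from analysis into design; the valve example and all names are invented.

```python
# An analysis-phase entity whose behavior is states + transition rules.
class AnalysisObject:
    def __init__(self, name, initial, transitions):
        """transitions: {(state, event): next_state}"""
        self.name = name
        self.state = initial
        self.transitions = transitions

    def on_event(self, event):
        """Apply a state-transition rule; ignore events with no rule."""
        key = (self.state, event)
        if key in self.transitions:
            self.state = self.transitions[key]
        return self.state

# Example: a valve modeled during analysis, reused directly in design.
valve = AnalysisObject("inlet_valve", "closed",
                       {("closed", "open_cmd"): "opening",
                        ("opening", "limit_switch"): "open",
                        ("open", "close_cmd"): "closed"})
print(valve.on_event("open_cmd"), valve.on_event("limit_switch"))
```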
Safety Guided Design Based on Stamp/STPA for Manned Vehicle in Concept Design Phase
NASA Astrophysics Data System (ADS)
Ujiie, Ryo; Katahira, Masafumi; Miyamoto, Yuko; Umeda, Hiroki; Leveson, Nancy; Hoshino, Nobuyuki
2013-09-01
In manned vehicles such as the Soyuz and the Space Shuttle, the crew and the computer system cooperate to return safely to Earth. While computers increase the functionality of the system, they also increase the complexity of the interaction between the controllers (human and computer) and the target dynamics; in some cases, this complexity can produce a serious accident. To prevent such losses, traditional hazard analyses such as FTA have been applied to system development, but they can be used only after a detailed system has been created because they focus on detailed component failures. As a result, it is more difficult to eliminate hazard causes early in the process, when it is most feasible. STAMP/STPA is a new hazard analysis technique that can be applied from the early development phase, with the analysis being refined as more detailed decisions are made; in essence, the analysis and design decisions are intertwined and go hand in hand. We have applied STAMP/STPA to a concept design of a new JAXA manned vehicle and performed safety-guided design of the vehicle. This trial has shown that STAMP/STPA can be accepted easily by system engineers and that the design was made more sophisticated from a safety viewpoint. The results also show that the consequences of human errors for system safety can be analysed in the early development phase and the system designed to prevent them. Finally, the paper discusses an effective way to harmonize this safety-guided design approach with the systems engineering process, based on the experience gained in this project.
Programming and machining of complex parts based on CATIA solid modeling
NASA Astrophysics Data System (ADS)
Zhu, Xiurong
2017-09-01
Complex parts were designed and machined using CATIA solid modeling, programming, and simulated processing, illustrating the importance of programming and process technology in the field of CNC machining. In the part design process, the working principle is first analyzed in depth; the dimensions and dimension chains are then designed so that they connect consistently with one another, and back-calculation and a variety of other methods are used to compute the final dimensions of the parts. The part material was selected after careful study and repeated testing, with 6061 aluminum alloy the final choice. According to the actual conditions of the machining site, various factors in the machining process must be considered comprehensively. The simulated process should be based on the actual machining, not only on shape, and the results can be used as a reference for machining.
Managing fear in public health campaigns: a theory-based formative evaluation process.
Cho, Hyunyi; Witte, Kim
2005-10-01
The HIV/AIDS infection rate of Ethiopia is one of the world's highest. Prevention campaigns should systematically incorporate and respond to the at-risk population's existing beliefs, emotions, and perceived barriers in the message design process to effectively promote behavior change. However, guidelines for conducting formative evaluation that are grounded in proven risk communication theory and empirical data analysis techniques are hard to find. This article provides a five-step formative evaluation process that translates theory and research into effective messages for behavior change. Guided by the extended parallel process model, the five-step process helps message designers manage the public's fear surrounding issues such as HIV/AIDS. An entertainment-education project that used the process to design HIV/AIDS prevention messages for Ethiopian urban youth is reported. Data were collected in five urban regions of Ethiopia and analyzed according to the process to develop key messages for a 26-week radio soap opera.
Execution Of Systems Integration Principles During Systems Engineering Design
2016-09-01
This thesis discusses integration failures observed in DOD and non-DOD systems, such as inadequate stakeholder analysis, incomplete problem space and design ... design, development, test and deployment of a system. A lifecycle structure consists of phases within a methodology or process model. There are many... investigate design decisions without the need to commit to physical forms; "experimental investigation using a model yields design or operational
2012-12-14
Each pair of rollers is designed to capture the shafts mounted to both ends of the tool lid. Additionally, a safety pin can be put in place to... ITRB for the AH-64D. The scope of the program included structural design, materials selection, manufacturing producibility analysis, tooling design... responsible for tooling design and fabrication, fabrication process development, and fabrication of spars and test samples; G3 who designed the RTM
Artan, N; Wilderer, P; Orhon, D; Morgenroth, E; Ozgür, N
2001-01-01
The Sequencing Batch Reactor (SBR) process for carbon and nutrient removal is subject to extensive research, and it is finding a wider application in full-scale installations. Despite the growing popularity, however, a widely accepted approach to process analysis and modeling, a unified design basis, and even a common terminology are still lacking; this situation is now regarded as the major obstacle hindering broader practical application of the SBR. In this paper a rational dimensioning approach is proposed for nutrient removal SBRs based on scientific information on process stoichiometry and modelling, also emphasizing practical constraints in design and operation.
Design tool for multiprocessor scheduling and evaluation of iterative dataflow algorithms
NASA Technical Reports Server (NTRS)
Jones, Robert L., III
1995-01-01
A graph-theoretic design process and software tool is defined for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described with a dataflow graph and are intended to be executed repetitively on a set of identical processors. Typical applications include signal processing and control law problems. Graph-search algorithms and analysis techniques are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool applies the design process to a given problem and includes performance optimization through the inclusion of additional precedence constraints among the schedulable tasks.
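As one concrete example of the performance bounds such a tool can determine, the sketch below computes the longest (critical) path through a dataflow DAG, which lower-bounds schedule length regardless of how many identical processors are available; the task graph and times are invented.

```python
# Critical-path lower bound on schedule length for a dataflow DAG.
from collections import defaultdict

def critical_path(tasks, edges):
    """tasks: {name: execution_time}; edges: (predecessor, successor) pairs."""
    succs, indeg = defaultdict(list), defaultdict(int)
    for u, v in edges:
        succs[u].append(v)
        indeg[v] += 1
    # earliest finish times, propagated in topological (Kahn) order
    finish = {t: tasks[t] for t in tasks if indeg[t] == 0}
    ready = list(finish)
    while ready:
        u = ready.pop()
        for v in succs[u]:
            finish[v] = max(finish.get(v, 0), finish[u] + tasks[v])
            indeg[v] -= 1
            if indeg[v] == 0:
                ready.append(v)
    return max(finish.values())

# Example: a small control-law graph with per-task times in microseconds.
tasks = {"sense": 20, "filter": 35, "gain": 10, "actuate": 15}
edges = [("sense", "filter"), ("filter", "gain"), ("gain", "actuate")]
print(critical_path(tasks, edges))   # 20 + 35 + 10 + 15 = 80
```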
The Flight Optimization System Weights Estimation Method
NASA Technical Reports Server (NTRS)
Wells, Douglas P.; Horvath, Bryce L.; McCullers, Linwood A.
2017-01-01
FLOPS has been the primary aircraft synthesis software used by the Aeronautics Systems Analysis Branch at NASA Langley Research Center. It was created for rapid conceptual aircraft design and advanced technology impact assessments. FLOPS is a single computer program that includes weights estimation, aerodynamics estimation, engine cycle analysis, propulsion data scaling and interpolation, detailed mission performance analysis, takeoff and landing performance analysis, noise footprint estimation, and cost analysis. It is well known as a baseline and common denominator for aircraft design studies. FLOPS is capable of calibrating a model to known aircraft data, making it useful for new aircraft and for modifications to existing aircraft. The weight estimation method in FLOPS is known to be of high fidelity for conventional tube-with-wing aircraft, and a substantial amount of effort went into its development. This report serves as comprehensive documentation of the FLOPS weight estimation method, presenting its development process along with the weight estimation process itself.
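The report itself documents the actual FLOPS equations; as a hedged sketch of the calibration idea mentioned above, the code below scales a placeholder component-weight relationship so it reproduces a known aircraft's actual weight, then applies the calibrated model to a modified design. All names and numbers are illustrative, not FLOPS equations.

```python
# Calibrating a placeholder weight equation to known aircraft data.
def wing_weight_estimate(span_ft, area_ft2, coeff=2.5):
    """Placeholder wing-weight relationship, illustrative only."""
    return coeff * area_ft2 * (span_ft / 10.0) ** 0.5

actual_wing_weight = 5400.0          # known value for a reference aircraft
raw = wing_weight_estimate(span_ft=112.0, area_ft2=1340.0)
calibration = actual_wing_weight / raw

# The calibrated model can then be applied to a modified design.
new = calibration * wing_weight_estimate(span_ft=118.0, area_ft2=1400.0)
print(f"calibration factor {calibration:.3f}, new estimate {new:,.0f} lb")
```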
Interference fits and stress-corrosion failure. [aircraft parts fatigue life analysis
NASA Technical Reports Server (NTRS)
Hanagud, S.; Carter, A. E.
1976-01-01
It is pointed out that any proper design of interference-fit fasteners, interference-fit bushings, or stress-coining processes should consider stress-corrosion susceptibility and fatigue-life improvement together. Investigations leading to such a methodology are discussed. A service failure analysis of actual aircraft parts is considered, along with the stress-corrosion susceptibility of cold-worked interference-fit bushings. The optimum design of the amount of interference is considered, giving attention to stress formulas and aspects of design methodology.