Sample records for factors design methodology

  1. Evaluation and optimization of hepatocyte culture media factors by design of experiments (DoE) methodology

    PubMed Central

    Dong, Jia; Lübberstedt, Marc; Urbaniak, Thomas; Nüssler, Andreas K.N.; Knobeloch, Daniel; Gerlach, Jörg C.; Zeilinger, Katrin

    2008-01-01

    Optimization of cell culture media based on statistical experimental design methodology is a widely used approach for improving cultivation conditions. We applied this methodology to refine the composition of an established culture medium for growth of a human hepatoma cell line, C3A. A selection of growth factors and nutrient supplements were systematically screened according to standard design of experiments (DoE) procedures. The results of the screening indicated that the medium additives hepatocyte growth factor, oncostatin M, and fibroblast growth factor 4 significantly influenced the metabolic activities of the C3A cell line. Surface response methodology revealed that the optimum levels for these factors were 30 ng/ml for hepatocyte growth factor and 35 ng/ml for oncostatin M. Additional experiments on primary human hepatocyte cultures showed high variance in metabolic activities between cells from different individuals, making determination of optimal levels of factors more difficult. Still, it was possible to conclude that hepatocyte growth factor, epidermal growth factor, and oncostatin M had decisive effects on the metabolic functions of primary human hepatocytes. PMID:19003182
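
    A minimal sketch of the screening step described above, in Python: the factor names follow the abstract (hepatocyte growth factor, oncostatin M, FGF-4), but the two-level design and the response values are invented, so only the structure of a full-factorial screen and its main-effect ranking is illustrated.

```python
# Hedged sketch of a two-level full-factorial screening step (hypothetical data).
# Factor names follow the abstract; the responses are invented for illustration only.
import itertools
import numpy as np

factors = ["HGF", "OSM", "FGF4"]                     # medium additives being screened
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))))   # 2^3 coded runs

# Hypothetical metabolic-activity responses, one value per run
response = np.array([0.8, 1.1, 0.9, 1.4, 1.0, 1.6, 1.2, 2.0])

# Main effect of a factor = mean response at its high level minus mean at its low level
for j, name in enumerate(factors):
    effect = response[design[:, j] == 1].mean() - response[design[:, j] == -1].mean()
    print(f"{name:5s} main effect: {effect:+.2f}")
```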

  2. Evaluation and optimization of hepatocyte culture media factors by design of experiments (DoE) methodology.

    PubMed

    Dong, Jia; Mandenius, Carl-Fredrik; Lübberstedt, Marc; Urbaniak, Thomas; Nüssler, Andreas K N; Knobeloch, Daniel; Gerlach, Jörg C; Zeilinger, Katrin

    2008-07-01

    Optimization of cell culture media based on statistical experimental design methodology is a widely used approach for improving cultivation conditions. We applied this methodology to refine the composition of an established culture medium for growth of a human hepatoma cell line, C3A. A selection of growth factors and nutrient supplements were systematically screened according to standard design of experiments (DoE) procedures. The results of the screening indicated that the medium additives hepatocyte growth factor, oncostatin M, and fibroblast growth factor 4 significantly influenced the metabolic activities of the C3A cell line. Surface response methodology revealed that the optimum levels for these factors were 30 ng/ml for hepatocyte growth factor and 35 ng/ml for oncostatin M. Additional experiments on primary human hepatocyte cultures showed high variance in metabolic activities between cells from different individuals, making determination of optimal levels of factors more difficult. Still, it was possible to conclude that hepatocyte growth factor, epidermal growth factor, and oncostatin M had decisive effects on the metabolic functions of primary human hepatocytes.
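
    This record duplicates the one above but also names the response-surface step; the sketch below, with synthetic data, shows how a quadratic model in two coded factors (standing in for hepatocyte growth factor and oncostatin M) is fitted by least squares and its stationary point located. None of the numbers come from the study.

```python
# Hedged sketch of the response-surface step: fit a quadratic model to synthetic data
# and locate its stationary point. Factors are in coded units, not ng/ml from the paper.
import numpy as np

rng = np.random.default_rng(0)
x1, x2 = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
x1, x2 = x1.ravel(), x2.ravel()
# Synthetic response with a maximum at (0.2, 0.5) in coded units, plus noise
y = 2.0 + 0.4 * x1 + 1.0 * x2 - x1**2 - x2**2 + rng.normal(0, 0.05, x1.size)

# Model: y = b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
b = np.linalg.lstsq(X, y, rcond=None)[0]

# Stationary point: solve the gradient equations of the fitted quadratic
A = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
optimum = np.linalg.solve(A, -b[1:3])
print("coded optimum (x1, x2):", np.round(optimum, 2))
```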

  3. A Human-Centered Design Methodology to Enhance the Usability, Human Factors, and User Experience of Connected Health Systems: A Three-Phase Methodology.

    PubMed

    Harte, Richard; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul Ma; Scharf, Thomas; Quinlan, Leo R; ÓLaighin, Gearóid

    2017-03-16

    Design processes such as human-centered design, which involve the end user throughout the product development and testing process, can be crucial in ensuring that the product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of human-centered design can often present a challenge when design teams are faced with the necessary, rapid, product development life cycles associated with the competitive connected health industry. We wanted to derive a structured methodology that followed the principles of human-centered design that would allow designers and developers to ensure that the needs of the user are taken into account throughout the design process, while maintaining a rapid pace of development. In this paper, we present the methodology and its rationale before outlining how it was applied to assess and enhance the usability, human factors, and user experience of a connected health system known as the Wireless Insole for Independent and Safe Elderly Living (WIISEL) system, a system designed to continuously assess fall risk by measuring gait and balance parameters associated with fall risk. We derived a three-phase methodology. In Phase 1 we emphasized the construction of a use case document. This document can be used to detail the context of use of the system by utilizing storyboarding, paper prototypes, and mock-ups in conjunction with user interviews to gather insightful user feedback on different proposed concepts. In Phase 2 we emphasized the use of expert usability inspections such as heuristic evaluations and cognitive walkthroughs with small multidisciplinary groups to review the prototypes born out of the Phase 1 feedback. Finally, in Phase 3 we emphasized classical user testing with target end users, using various metrics to measure the user experience and improve the final prototypes. We report a successful implementation of the methodology for the design and development

  4. A Human-Centered Design Methodology to Enhance the Usability, Human Factors, and User Experience of Connected Health Systems: A Three-Phase Methodology

    PubMed Central

    Harte, Richard; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul MA; Scharf, Thomas; ÓLaighin, Gearóid

    2017-01-01

    Background Design processes such as human-centered design, which involve the end user throughout the product development and testing process, can be crucial in ensuring that the product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of human-centered design can often present a challenge when design teams are faced with the necessary, rapid, product development life cycles associated with the competitive connected health industry. Objective We wanted to derive a structured methodology that followed the principles of human-centered design that would allow designers and developers to ensure that the needs of the user are taken into account throughout the design process, while maintaining a rapid pace of development. In this paper, we present the methodology and its rationale before outlining how it was applied to assess and enhance the usability, human factors, and user experience of a connected health system known as the Wireless Insole for Independent and Safe Elderly Living (WIISEL) system, a system designed to continuously assess fall risk by measuring gait and balance parameters associated with fall risk. Methods We derived a three-phase methodology. In Phase 1 we emphasized the construction of a use case document. This document can be used to detail the context of use of the system by utilizing storyboarding, paper prototypes, and mock-ups in conjunction with user interviews to gather insightful user feedback on different proposed concepts. In Phase 2 we emphasized the use of expert usability inspections such as heuristic evaluations and cognitive walkthroughs with small multidisciplinary groups to review the prototypes born out of the Phase 1 feedback. Finally, in Phase 3 we emphasized classical user testing with target end users, using various metrics to measure the user experience and improve the final prototypes. Results We report a successful implementation of the

  5. Rational Design Methodology.

    DTIC Science & Technology

    1978-09-01

    This report describes an effort to specify a software design methodology applicable to the Air Force software environment. Available methodologies...of techniques for proof of correctness, design specification, and performance assessment of static designs. The rational methodology selected is a

  6. The Study of the Relationship between Probabilistic Design and Axiomatic Design Methodology. Volume 2

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chin-Yere; Onyebueke, Landon

    1996-01-01

    The structural design, or the design of machine elements, has been traditionally based on deterministic design methodology. The deterministic method considers all design parameters to be known with certainty. This methodology is, therefore, inadequate to design complex structures that are subjected to a variety of complex, severe loading conditions. A nonlinear behavior that is dependent on stress, stress rate, temperature, number of load cycles, and time is observed on all components subjected to complex conditions. These complex conditions introduce uncertainties; hence, the actual factor of safety margin remains unknown. In the deterministic methodology, the contingency of failure is discounted; hence, a high factor of safety is used. The deterministic method may be most useful in situations where the design structures are simple. The probabilistic method is concerned with the probability of non-failure performance of structures or machine elements. It is much more useful in situations where the design is characterized by complex geometry, possibility of catastrophic failure, and sensitive loads and material properties. Also included: Comparative Study of the use of AGMA Geometry Factors and Probabilistic Design Methodology in the Design of Compact Spur Gear Set.
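
    The contrast the abstract draws between a fixed factor of safety and a probability of non-failure can be made concrete with a short Monte Carlo sketch; the load and strength distributions below are arbitrary placeholders, not values from the report.

```python
# Hedged sketch: deterministic factor of safety vs. probabilistic reliability (placeholder numbers).
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

strength = rng.normal(400.0, 40.0, n)   # MPa, assumed scatter in material strength
stress = rng.normal(250.0, 35.0, n)     # MPa, assumed scatter in applied stress

deterministic_fos = 400.0 / 250.0             # mean strength / mean stress
p_failure = np.mean(stress >= strength)       # probability that stress exceeds strength

print(f"deterministic factor of safety: {deterministic_fos:.2f}")
print(f"estimated probability of failure: {p_failure:.4f}")
print(f"estimated reliability (non-failure): {1.0 - p_failure:.4f}")
```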

  7. CONCEPTUAL DESIGNS FOR A NEW HIGHWAY VEHICLE EMISSIONS ESTIMATION METHODOLOGY

    EPA Science Inventory

    The report discusses six conceptual designs for a new highway vehicle emissions estimation methodology and summarizes the recommendations of each design for improving the emissions and activity factors in the emissions estimation process. The complete design reports are included a...

  8. Methodology for designing psychological habitability for the space station.

    PubMed

    Komastubara, A

    2000-09-01

    Psychological habitability is a critical quality issue for the International Space Station because poor habitability degrades performance shaping factors (PSFs) and increases human errors. However, habitability often receives rather limited design attention, based on someone's superficial tastes, because systematic design procedures for habitability quality are lacking. To improve the design treatment of psychological habitability, this paper proposes and discusses a design methodology for designing psychological habitability for the International Space Station.

  9. Methodological Issues in Questionnaire Design.

    PubMed

    Song, Youngshin; Son, Youn Jung; Oh, Doonam

    2015-06-01

    The process of designing a questionnaire is complicated. Many questionnaires on nursing phenomena have been developed and used by nursing researchers. The purpose of this paper was to discuss questionnaire design and factors that should be considered when using existing scales. Methodological issues were discussed, such as factors in the design of questions, steps in developing questionnaires, wording and formatting methods for items, and administration methods. How to use existing scales, how to facilitate cultural adaptation, and how to prevent socially desirable responding were discussed. Moreover, the triangulation method in questionnaire development was introduced. Steps were recommended for designing questions, such as appropriately operationalizing key concepts for the target population, clearly formatting response options, generating items and confirming final items through face or content validity, sufficiently piloting the questionnaire using item analysis, demonstrating reliability and validity, finalizing the scale, and training the administrator. Psychometric properties and cultural equivalence should be evaluated prior to administration when using an existing questionnaire and performing cultural adaptation. In the context of well-defined nursing phenomena, logical and systematic methods will contribute to the development of simple and precise questionnaires.
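
    One of the piloting steps listed above, item analysis with a reliability estimate, can be sketched in a few lines; the response matrix is invented and Cronbach's alpha is used as the reliability coefficient.

```python
# Sketch of an item-analysis step when piloting a questionnaire (invented data).
# Cronbach's alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)
import numpy as np

# Invented pilot data: 6 respondents x 4 Likert items scored 1-5
responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 4, 5],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [3, 2, 3, 3],
], dtype=float)

k = responses.shape[1]
item_var = responses.var(axis=0, ddof=1)        # variance of each item
total_var = responses.sum(axis=1).var(ddof=1)   # variance of the summed scale score
alpha = k / (k - 1) * (1 - item_var.sum() / total_var)
print(f"Cronbach's alpha: {alpha:.2f}")
```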

  10. Reliability based design optimization: Formulations and methodologies

    NASA Astrophysics Data System (ADS)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. A framework for performing reliability based design optimization under epistemic uncertainty is also developed
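
    The nested formulation that the unilevel and decoupled methods are compared against can be outlined with a toy double-loop sketch: an outer optimizer minimizes a cost surrogate while an inner reliability analysis enforces a target reliability. The problem, distributions, and target below are invented, the inner analysis is done in closed form for smoothness (in practice FORM/SORM or sampling is used), and scipy's SLSQP stands in for the outer optimizer.

```python
# Hedged sketch of the traditional nested (double-loop) RBDO formulation on a toy problem.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

TARGET_RELIABILITY = 0.99
NOISE_STD = 0.1                       # assumed uncertainty in the limit state

def cost(d):
    # Invented cost surrogate: the second design variable is twice as "expensive"
    return d[0] + 2.0 * d[1]

def reliability(d):
    # Invented limit state g = d0 + d1 + eps - 1 >= 0 with eps ~ N(0, NOISE_STD)
    return norm.cdf((d[0] + d[1] - 1.0) / NOISE_STD)

result = minimize(
    cost,
    x0=[0.8, 0.8],
    method="SLSQP",
    bounds=[(0.0, 2.0), (0.0, 2.0)],
    constraints=[{"type": "ineq", "fun": lambda d: reliability(d) - TARGET_RELIABILITY}],
)
print("design:", np.round(result.x, 3),
      "cost:", round(float(result.fun), 3),
      "reliability:", round(float(reliability(result.x)), 4))
```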

  11. Assuring data transparency through design methodologies

    NASA Technical Reports Server (NTRS)

    Williams, Allen

    1990-01-01

    This paper addresses the role of design methodologies and practices in the assurance of technology transparency. The development of several subsystems on large, long life cycle government programs was analyzed to glean those characteristics in the design, development, test, and evaluation that precluded or enabled the insertion of new technology. The programs examined were Minuteman, DSP, B-1B, and space shuttle. All these were long life cycle, technology-intensive programs. The design methodologies (or lack thereof) and design practices for each were analyzed in terms of the success or failure in incorporating evolving technology. Common elements contributing to the success or failure were extracted and compared to current methodologies being proposed by the Department of Defense and NASA. The relevance of these practices to the design and deployment of Space Station Freedom was evaluated. In particular, appropriate methodologies now being used on the core development contract were examined.

  12. Total System Design (TSD) Methodology Assessment.

    DTIC Science & Technology

    1983-01-01

    hardware implementation. Author: Martin Marietta Aerospace. Title: Total System Design Methodology. Source: Martin Marietta Technical Report MCR-79-646...systematic, rational approach to computer systems design is needed. Martin Marietta has produced a Total System Design Methodology to support such design...gathering and ordering. The purpose of the paper is to document the existing TSD methodology at Martin Marietta, describe the supporting tools, and

  13. Design methodology of Dutch banknotes

    NASA Astrophysics Data System (ADS)

    de Heij, Hans A. M.

    2000-04-01

    Since the introduction of a design methodology for Dutch banknotes, the quality of Dutch paper currency has improved in more than one way. The methodology in question provides for (i) a design policy, which helps fix clear objectives; (ii) design management, to ensure smooth cooperation between the graphic designer, printer, papermaker and central bank; and (iii) a program of requirements, a banknote development guideline for all parties involved. This systematic approach enables an objective selection of design proposals, including security features. Furthermore, the project manager obtains regular feedback from the public by conducting market surveys. Each new design of a Netherlands Guilder banknote issued by the Nederlandsche Bank over the past 50 years has been an improvement on its predecessor in terms of value recognition, security and durability.

  14. Sketching Designs Using the Five Design-Sheet Methodology.

    PubMed

    Roberts, Jonathan C; Headleand, Chris; Ritsos, Panagiotis D

    2016-01-01

    Sketching designs has been shown to be a useful way of planning and considering alternative solutions. The use of lo-fidelity prototyping, especially paper-based sketching, can save time and money and converge to better solutions more quickly. However, this design process is often viewed as too informal. Consequently, users do not know how to manage their thoughts and ideas (to first think divergently, and then finally converge on a suitable solution). We present the Five Design-Sheet (FdS) methodology. The methodology enables users to create information visualization interfaces through lo-fidelity methods. Users sketch and plan their ideas, helping them express different possibilities and think through their potential effectiveness as solutions to the task (sheet 1); they create three principal designs (sheets 2, 3, and 4); and they converge on a final realization design that can then be implemented (sheet 5). In this article, we present (i) a review of the use of sketching as a planning method for visualization and the benefits of sketching, (ii) a detailed description of the Five Design-Sheet (FdS) methodology, and (iii) an evaluation of the FdS using the System Usability Scale, along with a case study of its use in industry and experience of its use in teaching.
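
    The System Usability Scale used in the evaluation above follows a fixed scoring rule: each odd-numbered item contributes (score - 1), each even-numbered item contributes (5 - score), and the sum is multiplied by 2.5 to give a 0-100 score. A small sketch with invented responses:

```python
# Standard System Usability Scale (SUS) scoring; the example responses are invented.
def sus_score(responses):
    """responses: list of 10 answers on a 1-5 scale, in questionnaire order."""
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)   # odd items positive-worded, even items negative-worded
    return total * 2.5                                # scale the 0-40 sum to 0-100

print(sus_score([4, 2, 5, 1, 4, 2, 4, 1, 5, 2]))      # -> 85.0
```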

  15. [Evaluative designs in public health: methodological considerations].

    PubMed

    López, Ma José; Marí-Dell'Olmo, Marc; Pérez-Giménez, Anna; Nebot, Manel

    2011-06-01

    Evaluation of public health interventions poses numerous methodological challenges. Randomization of individuals is not always feasible and interventions are usually composed of multiple factors. To face these challenges, certain elements, such as the selection of the most appropriate design and the use of a statistical analysis that includes potential confounders, are essential. The objective of this article was to describe the most frequently used designs in the evaluation of public health interventions (policies, programs or campaigns). The characteristics, strengths and weaknesses of each of these evaluative designs are described. Additionally, a brief explanation of the most commonly used statistical analysis in each of these designs is provided. Copyright © 2011 Sociedad Española de Salud Pública y Administración Sanitaria. Published by Elsevier Espana. All rights reserved.

  16. Space Engineering Projects in Design Methodology

    NASA Technical Reports Server (NTRS)

    Crawford, R.; Wood, K.; Nichols, S.; Hearn, C.; Corrier, S.; DeKunder, G.; George, S.; Hysinger, C.; Johnson, C.; Kubasta, K.

    1993-01-01

    NASA/USRA is an ongoing sponsor of space design projects in the senior design courses of the Mechanical Engineering Department at The University of Texas at Austin. This paper describes the UT senior design sequence, focusing on the first-semester design methodology course. The philosophical basis and pedagogical structure of this course are summarized. A history of the Department's activities in the Advanced Design Program is then presented. The paper includes a summary of the projects completed during the 1992-93 Academic Year in the methodology course, and concludes with an example of two projects completed by student design teams.

  17. General Methodology for Designing Spacecraft Trajectories

    NASA Technical Reports Server (NTRS)

    Condon, Gerald; Ocampo, Cesar; Mathur, Ravishankar; Morcos, Fady; Senent, Juan; Williams, Jacob; Davis, Elizabeth C.

    2012-01-01

    A methodology for designing spacecraft trajectories in any gravitational environment within the solar system has been developed. The methodology facilitates modeling and optimization for problems ranging from that of a single spacecraft orbiting a single celestial body to that of a mission involving multiple spacecraft and multiple propulsion systems operating in gravitational fields of multiple celestial bodies. The methodology consolidates almost all spacecraft trajectory design and optimization problems into a single conceptual framework requiring solution of either a system of nonlinear equations or a parameter-optimization problem with equality and/or inequality constraints.
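
    The abstract reduces trajectory design to either a nonlinear-equation solve or a constrained parameter optimization; the shape of the latter can be shown with a deliberately generic scipy skeleton. The objective and constraints below are placeholders, not an actual trajectory model.

```python
# Hedged skeleton of a parameter-optimization problem with equality and inequality
# constraints, the form to which the abstract says trajectory problems reduce.
import numpy as np
from scipy.optimize import minimize

def objective(p):
    # Placeholder for, e.g., total delta-v as a function of the design parameters p
    return float(np.sum(p**2))

def equality_constraints(p):
    # Placeholder for, e.g., final-state targeting conditions that must equal zero
    return np.array([p[0] + p[1] - 1.0])

def inequality_constraints(p):
    # Placeholder for, e.g., path or hardware limits that must stay non-negative
    return np.array([0.8 - p[2]])

result = minimize(
    objective,
    x0=np.array([0.5, 0.5, 0.5]),
    method="SLSQP",
    constraints=[{"type": "eq", "fun": equality_constraints},
                 {"type": "ineq", "fun": inequality_constraints}],
)
print("parameters:", np.round(result.x, 3), "objective:", round(float(result.fun), 4))
```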

  18. Methodological convergence of program evaluation designs.

    PubMed

    Chacón-Moscoso, Salvador; Anguera, M Teresa; Sanduvete-Chaves, Susana; Sánchez-Martín, Milagrosa

    2014-01-01

    Nowadays, the confronting dichotomous view between experimental/quasi-experimental and non-experimental/ethnographic studies still exists but, despite the extensive use of non-experimental/ethnographic studies, the most systematic work on methodological quality has been developed based on experimental and quasi-experimental studies. This hinders evaluators and planners' practice of empirical program evaluation, a sphere in which the distinction between types of study is changing continually and is less clear. Based on the classical validity framework of experimental/quasi-experimental studies, we carry out a review of the literature in order to analyze the convergence of design elements in methodological quality in primary studies in systematic reviews and ethnographic research. We specify the relevant design elements that should be taken into account in order to improve validity and generalization in program evaluation practice in different methodologies from a practical methodological and complementary view. We recommend ways to improve design elements so as to enhance validity and generalization in program evaluation practice.

  19. The Study of the Relationship between Probabilistic Design and Axiomatic Design Methodology. Volume 1

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chinyere; Onyebueke, Landon

    1996-01-01

    This program report is the final report covering all the work done on this project. The goal of this project is technology transfer of methodologies to improve design process. The specific objectives are: 1. To learn and understand the Probabilistic design analysis using NESSUS. 2. To assign Design Projects to either undergraduate or graduate students on the application of NESSUS. 3. To integrate the application of NESSUS into some selected senior level courses in Civil and Mechanical Engineering curricula. 4. To develop courseware in Probabilistic Design methodology to be included in a graduate level Design Methodology course. 5. To study the relationship between the Probabilistic design methodology and Axiomatic design methodology.

  20. Waste Package Component Design Methodology Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D.C. Mecham

    2004-07-12

    This Executive Summary provides an overview of the methodology being used by the Yucca Mountain Project (YMP) to design waste packages and ancillary components. This summary information is intended for readers with general interest, but also provides technical readers a general framework surrounding a variety of technical details provided in the main body of the report. The purpose of this report is to document and ensure appropriate design methods are used in the design of waste packages and ancillary components (the drip shields and emplacement pallets). The methodology includes identification of necessary design inputs, justification of design assumptions, and use of appropriate analysis methods and computational tools. This design work is subject to "Quality Assurance Requirements and Description". The document is primarily intended for internal use and technical guidance for a variety of design activities. It is recognized that a wide audience, including project management, the U.S. Department of Energy (DOE), the U.S. Nuclear Regulatory Commission, and others, is interested to various levels of detail in the design methods, and the report therefore covers a wide range of topics at varying levels of detail. Due to the preliminary nature of the design, readers can expect to encounter varied levels of detail in the body of the report. It is expected that technical information used as input to design documents will be verified and taken from the latest versions of reference sources given herein. This revision of the methodology report has evolved with changes in the waste package, drip shield, and emplacement pallet designs over many years and may be further revised as the design is finalized. Different components and analyses are at different stages of development. Some parts of the report are detailed, while other less detailed parts are likely to undergo further refinement. The design methodology is intended to provide designs that satisfy the safety and

  1. Identifying Items to Assess Methodological Quality in Physical Therapy Trials: A Factor Analysis

    PubMed Central

    Cummings, Greta G.; Fuentes, Jorge; Saltaji, Humam; Ha, Christine; Chisholm, Annabritt; Pasichnyk, Dion; Rogers, Todd

    2014-01-01

    Background Numerous tools and individual items have been proposed to assess the methodological quality of randomized controlled trials (RCTs). The frequency of use of these items varies according to health area, which suggests a lack of agreement regarding their relevance to trial quality or risk of bias. Objective The objectives of this study were: (1) to identify the underlying component structure of items and (2) to determine relevant items to evaluate the quality and risk of bias of trials in physical therapy by using an exploratory factor analysis (EFA). Design A methodological research design was used, and an EFA was performed. Methods Randomized controlled trials used for this study were randomly selected from searches of the Cochrane Database of Systematic Reviews. Two reviewers used 45 items gathered from 7 different quality tools to assess the methodological quality of the RCTs. An exploratory factor analysis was conducted using the principal axis factoring (PAF) method followed by varimax rotation. Results Principal axis factoring identified 34 items loaded on 9 common factors: (1) selection bias; (2) performance and detection bias; (3) eligibility, intervention details, and description of outcome measures; (4) psychometric properties of the main outcome; (5) contamination and adherence to treatment; (6) attrition bias; (7) data analysis; (8) sample size; and (9) control and placebo adequacy. Limitation Because of the exploratory nature of the results, a confirmatory factor analysis is needed to validate this model. Conclusions To the authors' knowledge, this is the first factor analysis to explore the underlying component items used to evaluate the methodological quality or risk of bias of RCTs in physical therapy. The items and factors represent a starting point for evaluating the methodological quality and risk of bias in physical therapy trials. Empirical evidence of the association among these items with treatment effects and a confirmatory factor

  2. Aerodynamic configuration design using response surface methodology analysis

    NASA Technical Reports Server (NTRS)

    Engelund, Walter C.; Stanley, Douglas O.; Lepsch, Roger A.; Mcmillin, Mark M.; Unal, Resit

    1993-01-01

    An investigation has been conducted to determine a set of optimal design parameters for a single-stage-to-orbit reentry vehicle. Several configuration geometry parameters which had a large impact on the entry vehicle flying characteristics were selected as design variables: the fuselage fineness ratio, the nose to body length ratio, the nose camber value, the wing planform area scale factor, and the wing location. The optimal geometry parameter values were chosen using a response surface methodology (RSM) technique which allowed for a minimum dry weight configuration design that met a set of aerodynamic performance constraints on the landing speed, and on the subsonic, supersonic, and hypersonic trim and stability levels. The RSM technique utilized, specifically the central composite design method, is presented, along with the general vehicle conceptual design process. Results are presented for an optimized configuration along with several design trade cases.
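
    A minimal sketch of the central composite design step named above: the coded design points are built by hand for the five geometry variables from the abstract; in the study these runs would then be evaluated and fitted with a quadratic response surface, but here only the point generation is shown.

```python
# Hedged sketch: coded central composite design points for k factors
# (full-factorial corners, axial "star" points at +/- alpha, and center points).
import itertools
import numpy as np

def central_composite(k, alpha=None, n_center=4):
    if alpha is None:
        alpha = (2 ** k) ** 0.25          # rotatable choice: (number of corner runs)^(1/4)
    corners = np.array(list(itertools.product([-1.0, 1.0], repeat=k)))
    axial = np.zeros((2 * k, k))
    for j in range(k):
        axial[2 * j, j] = -alpha
        axial[2 * j + 1, j] = alpha
    center = np.zeros((n_center, k))
    return np.vstack([corners, axial, center])

# Five coded variables standing in for fineness ratio, nose/body length ratio,
# nose camber, wing planform area scale factor, and wing location
design = central_composite(k=5)
print(design.shape)    # (2**5 + 2*5 + 4, 5) = (46, 5) runs in coded units
```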

  3. Control/structure interaction design methodology

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.; Layman, William E.

    1989-01-01

    The Control Structure Interaction Program is a technology development program for spacecraft that exhibit interactions between the control system and structural dynamics. The program objectives include development and verification of new design concepts (such as active structure) and new tools (such as a combined structure and control optimization algorithm) and their verification in ground and possibly flight test. The new CSI design methodology is centered around interdisciplinary engineers using new tools that closely integrate structures and controls. Verification is an important CSI theme and analysts will be closely integrated to the CSI Test Bed laboratory. Components, concepts, tools and algorithms will be developed and tested in the lab and in future Shuttle-based flight experiments. The design methodology is summarized in block diagrams depicting the evolution of a spacecraft design and descriptions of analytical capabilities used in the process. The multiyear JPL CSI implementation plan is described along with the essentials of several new tools. A distributed network of computation servers and workstations was designed that will provide a state-of-the-art development base for the CSI technologies.

  4. Design and analysis of sustainable computer mouse using design for disassembly methodology

    NASA Astrophysics Data System (ADS)

    Roni Sahroni, Taufik; Fitri Sukarman, Ahmad; Agung Mahardini, Karunia

    2017-12-01

    This paper presents the design and analysis of a computer mouse using the Design for Disassembly methodology. The existing computer mouse model consists of a number of unnecessary parts that add to the assembly and disassembly time in production. The objective of this project is to design a new computer mouse based on the Design for Disassembly (DFD) methodology. The methodology proceeds from sketch generation through concept selection and concept scoring. Based on the design screening, design concept B was selected for further analysis. A new computer mouse design using a fastening system is proposed. Furthermore, three materials, ABS, polycarbonate, and high-density PE, were evaluated to determine the environmental impact category. Sustainability analysis was conducted using SolidWorks software. As a result, high-density PE gives the lowest amount in the environmental impact category while having a high maximum stress value.
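
    The concept screening and scoring step mentioned above can be illustrated with a small weighted scoring matrix; the criteria, weights, and ratings below are invented, so the numbers (and the winning concept) are only illustrative.

```python
# Hedged sketch of a weighted concept-scoring matrix (criteria, weights, ratings invented).
import numpy as np

criteria = ["disassembly time", "part count", "material impact", "cost"]
weights = np.array([0.35, 0.25, 0.25, 0.15])      # importance weights, summing to 1
concepts = {                                      # ratings on a 1-5 scale per criterion
    "concept A": np.array([3, 3, 4, 4]),
    "concept B": np.array([4, 5, 4, 3]),
    "concept C": np.array([2, 4, 3, 5]),
}

scores = {name: float(weights @ rating) for name, rating in concepts.items()}
for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.2f}")
```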

  5. Methodological factors conducting research with incarcerated persons with diabetes.

    PubMed

    Reagan, Louise; Shelton, Deborah

    2016-02-01

    The aim of this study was to describe methodological issues specific to conducting research with incarcerated vulnerable populations who have diabetes. Much has been written about the ethical and logistical challenges of conducting research with vulnerable incarcerated populations. However, conducting research with incarcerated persons with diabetes is associated with additional issues related to research design, measurement, sampling and recruitment, and data collection procedures. A cross-sectional study examining the relationships of diabetes knowledge, illness representation and self-care behaviors with glycemic control in 124 incarcerated persons was conducted and serves as the basis for describing methodological factors for the conduct of research with an incarcerated population with diabetes. Within this incarcerated population with diabetes, sampling bias due to gender inequity, recruitment of participants not using insulin, self-reported vision impairment, and a lack of standardized instruments especially for measuring diabetes self-care were methodological challenges. Clinical factors that serve as potential barriers for study conduct were identified as risk for hypoglycemia due to insulin timing and other activities. Conducting research with incarcerated persons diagnosed with diabetes requires attention to a set of methodological concerns above and beyond that of the ethical and legal regulations for protecting the rights of this vulnerable population. To increase opportunities for conducting rigorous as well as facility- and patient-friendly research, researchers need to blend their knowledge of diabetes with an understanding of prison rules and routines. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. Methodology for the Design of Streamline-Traced External-Compression Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Slater, John W.

    2014-01-01

    A design methodology based on streamline-tracing is discussed for the design of external-compression, supersonic inlets for flight below Mach 2.0. The methodology establishes a supersonic compression surface and capture cross-section by tracing streamlines through an axisymmetric Busemann flowfield. The compression system of shock and Mach waves is altered through modifications to the leading edge and shoulder of the compression surface. An external terminal shock is established to create subsonic flow which is diffused in the subsonic diffuser. The design methodology was implemented into the SUPIN inlet design tool. SUPIN uses specified design factors to design the inlets and computes the inlet performance, which includes the flow rates, total pressure recovery, and wave drag. A design study was conducted using SUPIN and the Wind-US computational fluid dynamics code to design and analyze the properties of two streamline-traced, external-compression (STEX) supersonic inlets for Mach 1.6 freestream conditions. The STEX inlets were compared to axisymmetric pitot, two-dimensional, and axisymmetric spike inlets. The STEX inlets had slightly lower total pressure recovery and higher levels of total pressure distortion than the axisymmetric spike inlet. The cowl wave drag coefficients of the STEX inlets were 20% of those for the axisymmetric spike inlet. The STEX inlets had external sound pressures that were 37% of those of the axisymmetric spike inlet, which may result in lower adverse sonic boom characteristics. The flexibility of the shape of the capture cross-section may result in benefits for the integration of STEX inlets with aircraft.
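
    The total pressure recovery figure of merit used in that comparison can be illustrated with the standard normal-shock relation; the snippet computes the ideal recovery across a single normal shock at the Mach 1.6 freestream condition, an upper bound for a simple pitot inlet rather than a result from the SUPIN study.

```python
# Sketch: total pressure ratio across a normal shock (ideal recovery bound for a pitot inlet).
def normal_shock_pt_ratio(m1, gamma=1.4):
    a = ((gamma + 1.0) * m1**2 / ((gamma - 1.0) * m1**2 + 2.0)) ** (gamma / (gamma - 1.0))
    b = ((gamma + 1.0) / (2.0 * gamma * m1**2 - (gamma - 1.0))) ** (1.0 / (gamma - 1.0))
    return a * b

print(f"pt2/pt1 at Mach 1.6: {normal_shock_pt_ratio(1.6):.3f}")   # approximately 0.895
```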

  7. Identifying items to assess methodological quality in physical therapy trials: a factor analysis.

    PubMed

    Armijo-Olivo, Susan; Cummings, Greta G; Fuentes, Jorge; Saltaji, Humam; Ha, Christine; Chisholm, Annabritt; Pasichnyk, Dion; Rogers, Todd

    2014-09-01

    Numerous tools and individual items have been proposed to assess the methodological quality of randomized controlled trials (RCTs). The frequency of use of these items varies according to health area, which suggests a lack of agreement regarding their relevance to trial quality or risk of bias. The objectives of this study were: (1) to identify the underlying component structure of items and (2) to determine relevant items to evaluate the quality and risk of bias of trials in physical therapy by using an exploratory factor analysis (EFA). A methodological research design was used, and an EFA was performed. Randomized controlled trials used for this study were randomly selected from searches of the Cochrane Database of Systematic Reviews. Two reviewers used 45 items gathered from 7 different quality tools to assess the methodological quality of the RCTs. An exploratory factor analysis was conducted using the principal axis factoring (PAF) method followed by varimax rotation. Principal axis factoring identified 34 items loaded on 9 common factors: (1) selection bias; (2) performance and detection bias; (3) eligibility, intervention details, and description of outcome measures; (4) psychometric properties of the main outcome; (5) contamination and adherence to treatment; (6) attrition bias; (7) data analysis; (8) sample size; and (9) control and placebo adequacy. Because of the exploratory nature of the results, a confirmatory factor analysis is needed to validate this model. To the authors' knowledge, this is the first factor analysis to explore the underlying component items used to evaluate the methodological quality or risk of bias of RCTs in physical therapy. The items and factors represent a starting point for evaluating the methodological quality and risk of bias in physical therapy trials. Empirical evidence of the association among these items with treatment effects and a confirmatory factor analysis of these results are needed to validate these items.
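
    A hedged sketch of the extraction step described above, assuming the third-party factor_analyzer package is available; the item-score matrix is random placeholder data, so the loadings are meaningless and only the call pattern (principal axis factoring followed by varimax rotation, nine factors) mirrors the abstract.

```python
# Hedged sketch of EFA with principal-axis factoring and varimax rotation,
# assuming the third-party `factor_analyzer` package; the data are random placeholders.
import numpy as np
from factor_analyzer import FactorAnalyzer

rng = np.random.default_rng(3)
items = rng.integers(0, 3, size=(200, 45)).astype(float)   # 200 trials x 45 quality items (placeholder)

fa = FactorAnalyzer(n_factors=9, method="principal", rotation="varimax")
fa.fit(items)

loadings = fa.loadings_                    # 45 items x 9 rotated factors
print(loadings.shape)
print(fa.get_factor_variance()[1])         # proportion of variance explained per factor
```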

  8. Critical Race Design: An Emerging Methodological Approach to Anti-Racist Design and Implementation Research

    ERIC Educational Resources Information Center

    Khalil, Deena; Kier, Meredith

    2017-01-01

    This article is about introducing Critical Race Design (CRD), a research methodology that centers race and equity at the nucleus of educational opportunities by design. First, the authors define design-based implementation research (DBIR; Penuel, Fishman, Cheng, & Sabelli, 2011) as an equity-oriented education research methodology where…

  9. Octopus: A Design Methodology for Motion Capture Wearables

    PubMed Central

    2017-01-01

    Human motion capture (MoCap) is widely recognised for its usefulness and application in different fields, such as health, sports, and leisure; therefore, its inclusion in current wearables (MoCap-wearables) is increasing, and it may be very useful in a context of intelligent objects interconnected with each other and to the cloud in the Internet of Things (IoT). However, capturing human movement adequately requires addressing difficult-to-satisfy requirements, which means that the applications that are possible with this technology are held back by a series of accessibility barriers, some technological and some regarding usability. To overcome these barriers and generate products with greater wearability that are more efficient and accessible, factors are compiled through a review of publications and market research. The result of this analysis is a design methodology called Octopus, which ranks these factors and schematises them. Octopus provides a tool that can help define design requirements for multidisciplinary teams, generating a common framework and offering a new method of communication between them. PMID:28809786

  10. Octopus: A Design Methodology for Motion Capture Wearables.

    PubMed

    Marin, Javier; Blanco, Teresa; Marin, Jose J

    2017-08-15

    Human motion capture (MoCap) is widely recognised for its usefulness and application in different fields, such as health, sports, and leisure; therefore, its inclusion in current wearables (MoCap-wearables) is increasing, and it may be very useful in a context of intelligent objects interconnected with each other and to the cloud in the Internet of Things (IoT). However, capturing human movement adequately requires addressing difficult-to-satisfy requirements, which means that the applications that are possible with this technology are held back by a series of accessibility barriers, some technological and some regarding usability. To overcome these barriers and generate products with greater wearability that are more efficient and accessible, factors are compiled through a review of publications and market research. The result of this analysis is a design methodology called Octopus, which ranks these factors and schematises them. Octopus provides a tool that can help define design requirements for multidisciplinary teams, generating a common framework and offering a new method of communication between them.

  11. Memristor-Based Computing Architecture: Design Methodologies and Circuit Techniques

    DTIC Science & Technology

    2013-03-01

    Polytechnic Institute of New York University; technical report, October 2010 – October 2012. ...schemes for a memristor-based reconfigurable architecture design have not been fully explored yet. Therefore, in this project, we investigated

  12. Towards a Methodology for the Design of Multimedia Public Access Interfaces.

    ERIC Educational Resources Information Center

    Rowley, Jennifer

    1998-01-01

    Discussion of information systems methodologies that can contribute to interface design for public access systems covers: the systems life cycle; advantages of adopting information systems methodologies; soft systems methodologies; task-oriented approaches to user interface design; holistic design, the Star model, and prototyping; the…

  13. Co-design of RAD and ETHICS methodologies: a combination of information system development methods

    NASA Astrophysics Data System (ADS)

    Nasehi, Arezo; Shahriyari, Salman

    2011-12-01

    Co-design is a new trend in the social world that tries to capture different ideas in order to use the most appropriate features for a system. In this paper, the co-design of two information system methodologies is considered: rapid application development (RAD) and the effective technical and human implementation of computer-based systems (ETHICS). We consider the characteristics of these methodologies to see the possibility of co-designing or combining them for developing an information system. To this end, four different aspects of them are analyzed: social or technical approach, user participation and user involvement, job satisfaction, and overcoming resistance to change. Finally, a case study using the quantitative method is analyzed in order to examine the possibility of co-design using these factors. The paper concludes that RAD and ETHICS are appropriate to be co-designed and offers some suggestions for the co-design.

  14. A design and implementation methodology for diagnostic systems

    NASA Technical Reports Server (NTRS)

    Williams, Linda J. F.

    1988-01-01

    A methodology for design and implementation of diagnostic systems is presented. Also discussed are the advantages of embedding a diagnostic system in a host system environment. The methodology utilizes an architecture for diagnostic system development that is hierarchical and makes use of object-oriented representation techniques. Additionally, qualitative models are used to describe the host system components and their behavior. The methodology architecture includes a diagnostic engine that utilizes a combination of heuristic knowledge to control the sequence of diagnostic reasoning. The methodology provides an integrated approach to development of diagnostic system requirements that is more rigorous than standard systems engineering techniques. The advantages of using this methodology during various life cycle phases of the host systems (e.g., National Aerospace Plane (NASP)) include: the capability to analyze diagnostic instrumentation requirements during the host system design phase, a ready software architecture for implementation of diagnostics in the host system, and the opportunity to analyze instrumentation for failure coverage in safety critical host system operations.

  15. A design methodology for nonlinear systems containing parameter uncertainty: Application to nonlinear controller design

    NASA Technical Reports Server (NTRS)

    Young, G.

    1982-01-01

    A design methodology capable of dealing with nonlinear systems, such as a controlled ecological life support system (CELSS), containing parameter uncertainty is discussed. The methodology was applied to the design of discrete time nonlinear controllers. The nonlinear controllers can be used to control either linear or nonlinear systems. Several controller strategies are presented to illustrate the design procedure.

  16. Experimental Validation of an Integrated Controls-Structures Design Methodology

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Elliot, Kenny B.; Walz, Joseph E.

    1996-01-01

    The first experimental validation of an integrated controls-structures design methodology for a class of large order, flexible space structures is described. Integrated redesign of the controls-structures-interaction evolutionary model, a laboratory testbed at NASA Langley, was described earlier. The redesigned structure was fabricated, assembled in the laboratory, and experimentally tested against the original structure. Experimental results indicate that the structure redesigned using the integrated design methodology requires significantly less average control power than the nominal structure with control-optimized designs, while maintaining the required line-of-sight pointing performance. Thus, the superiority of the integrated design methodology over the conventional design approach is experimentally demonstrated. Furthermore, amenability of the integrated design structure to other control strategies is evaluated, both analytically and experimentally. Using Linear-Quadratic-Gaussian optimal dissipative controllers, it is observed that the redesigned structure leads to significantly improved performance with alternate controllers as well.

  17. Integrating uniform design and response surface methodology to optimize thiacloprid suspension

    PubMed Central

    Li, Bei-xing; Wang, Wei-chang; Zhang, Xian-peng; Zhang, Da-xia; Mu, Wei; Liu, Feng

    2017-01-01

    A model 25% suspension concentrate (SC) of thiacloprid was adopted to evaluate an integrative approach of uniform design and response surface methodology. Tersperse2700, PE1601, xanthan gum and veegum were the four experimental factors, and the aqueous separation ratio and viscosity were the two dependent variables. Linear and quadratic polynomial models of stepwise regression and partial least squares were adopted to test the fit of the experimental data. Verification tests revealed satisfactory agreement between the experimental and predicted data. The measured values for the aqueous separation ratio and viscosity were 3.45% and 278.8 mPa·s, respectively, and the relative errors of the predicted values were 9.57% and 2.65%, respectively (prepared under the proposed conditions). Comprehensive benefits could also be obtained by appropriately adjusting the amount of certain adjuvants based on practical requirements. Integrating uniform design and response surface methodology is an effective strategy for optimizing SC formulas. PMID:28383036
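
    The partial least squares modelling step named above can be sketched with scikit-learn; the four adjuvant factors and the two responses follow the abstract, but the design matrix and measurements below are synthetic placeholders.

```python
# Hedged sketch of the partial-least-squares modelling step (synthetic placeholder data).
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(4)
# 12 formulation runs x 4 factors: Tersperse2700, PE1601, xanthan gum, veegum (coded levels)
X = rng.uniform(-1, 1, size=(12, 4))
# 2 responses: aqueous separation ratio (%) and viscosity (mPa*s), generated synthetically
Y = np.column_stack([
    5.0 - 1.5 * X[:, 0] + 0.8 * X[:, 2] + rng.normal(0, 0.2, 12),
    250 + 40 * X[:, 2] + 25 * X[:, 3] + rng.normal(0, 5, 12),
])

pls = PLSRegression(n_components=2)
pls.fit(X, Y)
print("R^2 on the training runs:", round(float(pls.score(X, Y)), 3))
print("predicted responses for a candidate formula:",
      np.round(pls.predict([[0.2, -0.1, 0.5, 0.0]])[0], 2))
```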

  18. Enhancing the Front-End Phase of Design Methodology

    ERIC Educational Resources Information Center

    Elias, Erasto

    2006-01-01

    Design methodology (DM) is defined by the procedural path, expressed in design models, and techniques or methods used to untangle the various activities within a design model. Design education in universities is mainly based on descriptive design models. Much knowledge and organization have been built into DM to facilitate design teaching.…

  19. Integrated design of the CSI evolutionary structure: A verification of the design methodology

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Joshi, S. M.; Elliott, Kenny B.; Walz, J. E.

    1993-01-01

    One of the main objectives of the Controls-Structures Interaction (CSI) program is to develop and evaluate integrated controls-structures design methodology for flexible space structures. Thus far, integrated design methodologies for a class of flexible spacecraft, which require fine attitude pointing and vibration suppression with no payload articulation, have been extensively investigated. Various integrated design optimization approaches, such as single-objective optimization, and multi-objective optimization, have been implemented with an array of different objectives and constraints involving performance and cost measures such as total mass, actuator mass, steady-state pointing performance, transient performance, control power, and many more. These studies have been performed using an integrated design software tool (CSI-DESIGN CODE) which is under development by the CSI-ADM team at the NASA Langley Research Center. To date, all of these studies, irrespective of the type of integrated optimization posed or objectives and constraints used, have indicated that integrated controls-structures design results in an overall spacecraft design which is considerably superior to designs obtained through a conventional sequential approach. Consequently, it is believed that validation of some of these results through fabrication and testing of a structure which is designed through an integrated design approach is warranted. The objective of this paper is to present and discuss the efforts that have been taken thus far for the validation of the integrated design methodology.

  20. Integrated Design Methodology for Highly Reliable Liquid Rocket Engine

    NASA Astrophysics Data System (ADS)

    Kuratani, Naoshi; Aoki, Hiroshi; Yasui, Masaaki; Kure, Hirotaka; Masuya, Goro

    Integrated design methodology is strongly required at the conceptual design phase to achieve highly reliable space transportation systems, especially propulsion systems, not only in Japan but worldwide. In the past, catastrophic failures caused losses of mission and vehicle (LOM/LOV) during the operational phase and severely affected schedules and costs in the later development phases. A design methodology for a highly reliable liquid rocket engine is preliminarily established and investigated in this study. Sensitivity analysis is performed systematically to demonstrate the effectiveness of the methodology and to clarify, and especially to focus on, the correlations between the combustion chamber, turbopump, and main valve as the main components. This study describes the essential issues for understanding these correlations, the need to apply the methodology to the remaining critical failure modes in the whole engine system, and the perspective on future engine development.

  1. The methodology of database design in organization management systems

    NASA Astrophysics Data System (ADS)

    Chudinov, I. L.; Osipova, V. V.; Bobrova, Y. V.

    2017-01-01

    The paper describes the unified methodology of database design for management information systems. Designing the conceptual information model for the domain area is the most important and labor-intensive stage in database design. Based on the proposed integrated approach to designing the conceptual information model, the main principles of developing relational databases are provided and users' information needs are considered. According to the methodology, the process of designing the conceptual information model includes three basic stages, which are defined in detail. Finally, the article describes how the results of analyzing users' information needs are applied, along with the rationale for the use of classifiers.

  2. A novel methodology for building robust design rules by using design based metrology (DBM)

    NASA Astrophysics Data System (ADS)

    Lee, Myeongdong; Choi, Seiryung; Choi, Jinwoo; Kim, Jeahyun; Sung, Hyunju; Yeo, Hyunyoung; Shim, Myoungseob; Jin, Gyoyoung; Chung, Eunseung; Roh, Yonghan

    2013-03-01

    This paper addresses a methodology for building robust design rules by using design based metrology (DBM). The conventional method for building design rules has been to use a simulation tool and a simple pattern spider mask. At the early stage of the device, the estimation from the simulation tool is poor, and the evaluation of the simple pattern spider mask is rather subjective because it depends on the experiential judgment of an engineer. In this work, we designed a huge number of pattern situations, including various 1D and 2D design structures. To overcome the difficulties of inspecting many types of patterns, we introduced the Design Based Metrology (DBM) of Nano Geometry Research, Inc., with which these mass patterns could be inspected at high speed. We also carried out quantitative analysis on PWQ silicon data to estimate process variability. Our methodology demonstrates high speed and accuracy for building design rules: all of the test patterns were inspected within a few hours, and mass silicon data were handled not by personal judgment but by statistical processing. From the results, robust design rules are successfully verified and extracted. Finally, we found that our methodology is appropriate for building robust design rules.

  3. Design Science Methodology Applied to a Chemical Surveillance Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Zhuanyi; Han, Kyungsik; Charles-Smith, Lauren E.

    Public health surveillance systems gain significant benefits from integrating existing early incident detection systems, supported by closed data sources, with open source data. However, identifying potential alerting incidents relies on finding accurate, reliable sources and presenting the high volume of data in a way that increases analysts' work efficiency; a challenge for any system that leverages open source data. In this paper, we present the design concept and the applied design science research methodology of ChemVeillance, a chemical analyst surveillance system. Our work portrays a system design and approach that translates theoretical methodology into practice, creating a powerful surveillance system built for specific use cases. Researchers, designers, developers, and related professionals in the health surveillance community can build upon the principles and methodology described here to enhance and broaden current surveillance systems, leading to improved situational awareness based on a robust integrated early warning system.

  4. Multidisciplinary design and optimization (MDO) methodology for the aircraft conceptual design

    NASA Astrophysics Data System (ADS)

    Iqbal, Liaquat Ullah

    An integrated design and optimization methodology has been developed for the conceptual design of an aircraft. The methodology brings higher fidelity Computer Aided Design, Engineering and Manufacturing (CAD, CAE and CAM) Tools such as CATIA, FLUENT, ANSYS and SURFCAM into the conceptual design by utilizing Excel as the integrator and controller. The approach is demonstrated to integrate with many of the existing low to medium fidelity codes such as the aerodynamic panel code called CMARC and sizing and constraint analysis codes, thus providing the multi-fidelity capabilities to the aircraft designer. The higher fidelity design information from the CAD and CAE tools for the geometry, aerodynamics, structural and environmental performance is provided for the application of the structured design methods such as the Quality Function Deployment (QFD) and the Pugh's Method. The higher fidelity tools bring the quantitative aspects of a design such as precise measurements of weight, volume, surface areas, center of gravity (CG) location, lift over drag ratio, and structural weight, as well as the qualitative aspects such as external geometry definition, internal layout, and coloring scheme early in the design process. The performance and safety risks involved with the new technologies can be reduced by modeling and assessing their impact more accurately on the performance of the aircraft. The methodology also enables the design and evaluation of the novel concepts such as the blended (BWB) and the hybrid wing body (HWB) concepts. Higher fidelity computational fluid dynamics (CFD) and finite element analysis (FEA) allow verification of the claims for the performance gains in aerodynamics and ascertain risks of structural failure due to different pressure distribution in the fuselage as compared with the tube and wing design. The higher fidelity aerodynamics and structural models can lead to better cost estimates that help reduce the financial risks as well. This helps in

  5. Helicopter-V/STOL dynamic wind and turbulence design methodology

    NASA Technical Reports Server (NTRS)

    Bailey, J. Earl

    1987-01-01

    Aircraft and helicopter accidents due to severe dynamic wind and turbulence continue to present challenging design problems. The development of the current set of design analysis tools for aircraft wind and turbulence design began in the 1940s and 1950s. The areas of helicopter dynamic wind and turbulence modeling and vehicle response to severe dynamic wind inputs (microburst-type phenomena) during takeoff and landing remain major unsolved design problems, owing to a lack of both environmental data and computational methodology. The development of helicopter and V/STOL dynamic wind and turbulence response computation methodology is reviewed, the current state of the design art in industry is outlined, and comments on design methodology are made which may serve to improve future flight vehicle design.

  6. De/signing Research in Education: Patchwork(ing) Methodologies with Theory

    ERIC Educational Resources Information Center

    Higgins, Marc; Madden, Brooke; Berard, Marie-France; Lenz Kothe, Elsa; Nordstrom, Susan

    2017-01-01

    Four education scholars extend the methodological space inspired by Jackson and Mazzei's "Thinking with Theory" through focusing on research design. The notion of de/sign is presented and employed to counter prescriptive method/ology that often sutures over pedagogical possibilities in research and educational settings. Key…

  7. Establishing Equivalence: Methodological Progress in Group-Matching Design and Analysis

    ERIC Educational Resources Information Center

    Kover, Sara T.; Atwood, Amy K.

    2013-01-01

    This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs used in behavioral research on cognition and…

  8. Design methodology and projects for space engineering

    NASA Technical Reports Server (NTRS)

    Nichols, S.; Kleespies, H.; Wood, K.; Crawford, R.

    1993-01-01

    NASA/USRA is an ongoing sponsor of space design projects in the senior design course of the Mechanical Engineering Department at The University of Texas at Austin. This paper describes the UT senior design sequence, consisting of a design methodology course and a capstone design course. The philosophical basis of this sequence is briefly summarized. A history of the Department's activities in the Advanced Design Program is then presented. The paper concludes with a description of the projects completed during the 1991-92 academic year and the ongoing projects for the Fall 1992 semester.

  9. Soft robot design methodology for `push-button' manufacturing

    NASA Astrophysics Data System (ADS)

    Paik, Jamie

    2018-06-01

    `Push-button' or fully automated manufacturing would enable the production of robots with zero intervention from human hands. Realizing this utopia requires a fundamental shift from a sequential (design-materials-manufacturing) to a concurrent design methodology.

  10. Methodology to design a municipal solid waste pre-collection system. A case study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallardo, A., E-mail: gallardo@uji.es; Carlos, M., E-mail: mcarlos@uji.es; Peris, M., E-mail: perism@uji.es

    Highlights: • MSW recovery starts at homes; therefore it is important to facilitate it for people. • Additionally, to optimize MSW collection a previous pre-collection must be planned. • A methodology to organize pre-collection considering several factors is presented. • The methodology has been verified by applying it to a Spanish middle-sized town. - Abstract: Municipal solid waste (MSW) management is an important task that local governments as well as private companies must take into account to protect human health and the environment and to preserve natural resources. To design an adequate MSW management plan the first step consists in defining the waste generation and composition patterns of the town. As these patterns depend on several socio-economic factors it is advisable to organize them beforehand. Moreover, the waste generation and composition patterns may vary around the town and over time. Generally, the data are not homogeneous around the city as the number of inhabitants is not constant, nor is the economic activity. Therefore, if all the information is shown in thematic maps, the final waste management decisions can be made more efficiently. The main aim of this paper is to present a structured methodology that allows local authorities or private companies who deal with MSW to design their own MSW management plan depending on the available data. According to these data, this paper proposes two ways of action: a direct way when detailed data are available and an indirect way when there is a lack of data and it is necessary to rely on bibliographic data. In any case, the amount of information needed is considerable. This paper combines the planning methodology with Geographic Information Systems to present the final results in thematic maps that make them easier to interpret. The proposed methodology is a useful preliminary tool to organize the MSW collection routes including the selective collection. To verify the methodology

  11. Calibration of Resistance Factors Needed in the LRFD Design of Drilled Shafts

    DOT National Transportation Integrated Search

    2010-09-01

    The first report on Load and Resistance Factor Design (LRFD) calibration of driven piles in Louisiana (LTRC Final Report 449) was completed in May 2009. As a continuing effort to implement the LRFD design methodology for deep foundations in Louisia...

  12. Calibration of resistance factors needed in the LRFD design of drilled shafts.

    DOT National Transportation Integrated Search

    2010-09-01

    The first report on Load and Resistance Factor Design (LRFD) calibration of driven piles in Louisiana (LTRC Final Report 449) was completed in May 2009. As a continuing effort to implement the LRFD design methodology for deep foundations in Louisiana...

  13. Calibration of resistance factors needed in the LRFD design of drilled shafts.

    DOT National Transportation Integrated Search

    2010-09-01

    The first report on Load and Resistance Factor Design (LRFD) calibration of driven piles in Louisiana (LTRC Final Report 449) was completed in May 2009. As a continuing effort to implement the LRFD design methodology for deep foundations in Louisia...

  14. [Optimization of Polysaccharide Extraction from Spirodela polyrrhiza by Plackett-Burman Design Combined with Box-Behnken Response Surface Methodology].

    PubMed

    Jiang, Zheng; Wang, Hong; Wu, Qi-nan

    2015-06-01

    The aim was to optimize the process of polysaccharide extraction from Spirodela polyrrhiza. Five factors related to the extraction rate of the polysaccharide were evaluated using a Plackett-Burman design. Based on this screening, three factors, namely alcohol volume fraction, extraction temperature and ratio of material to liquid, were selected as investigation factors for Box-Behnken response surface methodology. The order of the effects of the three factors on the extraction rate of polysaccharide from Spirodela polyrrhiza was: extraction temperature, alcohol volume fraction, ratio of material to liquid. According to the Box-Behnken response surface, the best extraction conditions were: alcohol volume fraction of 81%, ratio of material to liquid of 1:42, extraction temperature of 100 degrees C, and an extraction time of 60 min repeated four times. The combination of Plackett-Burman design and Box-Behnken response surface methodology used to optimize the extraction process for the polysaccharide in this study is effective and stable.
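
    As an illustration of the Box-Behnken analysis step, the sketch below fits a full second-order response-surface model to a coded three-factor Box-Behnken design and searches the fitted surface for the best setting. The design generation is standard, but the response values and the grid search are hypothetical stand-ins, not the study's data or software.

      import itertools
      import numpy as np

      # Coded Box-Behnken design for three factors (alcohol fraction, temperature,
      # material-to-liquid ratio): the 12 edge midpoints plus 3 centre runs.
      def box_behnken_3():
          runs = []
          for i, j in itertools.combinations(range(3), 2):
              for a, b in itertools.product((-1, 1), repeat=2):
                  x = [0, 0, 0]
                  x[i], x[j] = a, b
                  runs.append(x)
          runs += [[0, 0, 0]] * 3
          return np.array(runs, dtype=float)

      X = box_behnken_3()

      # Hypothetical extraction yields (%) for each run -- illustrative numbers only.
      y = np.array([6.1, 6.4, 5.9, 6.8, 5.7, 6.6, 6.0, 7.0,
                    5.8, 6.2, 6.3, 6.9, 7.3, 7.2, 7.4])

      # Full second-order model: intercept, linear, interaction, and quadratic terms.
      def quad_terms(x):
          x1, x2, x3 = x
          return [1, x1, x2, x3, x1*x2, x1*x3, x2*x3, x1*x1, x2*x2, x3*x3]

      A = np.array([quad_terms(x) for x in X])
      coef, *_ = np.linalg.lstsq(A, y, rcond=None)

      # Locate the best setting on a coarse grid of coded levels.
      grid = np.linspace(-1, 1, 21)
      best = max(itertools.product(grid, repeat=3),
                 key=lambda x: np.dot(quad_terms(x), coef))
      print("fitted coefficients:", np.round(coef, 3))
      print("predicted optimum (coded levels):", best)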

  15. Methodological Innovation in Practice-Based Design Doctorates

    ERIC Educational Resources Information Center

    Yee, Joyce S. R.

    2010-01-01

    This article presents a selective review of recent design PhDs that identify and analyse the methodological innovation that is occurring in the field, in order to inform future provision of research training. Six recently completed design PhDs are used to highlight possible philosophical and practical models that can be adopted by future PhD…

  16. A prototype computerized synthesis methodology for generic space access vehicle (SAV) conceptual design

    NASA Astrophysics Data System (ADS)

    Huang, Xiao

    2006-04-01

    Today's and especially tomorrow's competitive launch vehicle design environment requires the development of a dedicated generic Space Access Vehicle (SAV) design methodology. A total of 115 industrial, research, and academic aircraft, helicopter, missile, and launch vehicle design synthesis methodologies have been evaluated. As the survey indicates, each synthesis methodology tends to focus on a specific flight vehicle configuration, thus precluding the key capability to systematically compare flight vehicle design alternatives. The aim of the research investigation is to provide decision-making bodies and the practicing engineer a design process and tool box for robust modeling and simulation of flight vehicles where the ultimate performance characteristics may hinge on numerical subtleties. This will enable the designer of a SAV for the first time to consistently compare different classes of SAV configurations on an impartial basis. This dissertation presents the development steps required towards a generic (configuration independent) hands-on flight vehicle conceptual design synthesis methodology. This process is developed such that it can be applied to any flight vehicle class if desired. In the present context, the methodology has been put into operation for the conceptual design of a tourist Space Access Vehicle. The case study illustrates elements of the design methodology & algorithm for the class of Horizontal Takeoff and Horizontal Landing (HTHL) SAVs. The HTHL SAV design application clearly outlines how the conceptual design process can be centrally organized, executed and documented with focus on design transparency, physical understanding and the capability to reproduce results. This approach offers the project lead and creative design team a management process and tool which iteratively refines the individual design logic chosen, leading to mature design methods and algorithms. As illustrated, the HTHL SAV hands-on design methodology offers growth

  17. Design for human factors (DfHF): a grounded theory for integrating human factors into production design processes.

    PubMed

    Village, Judy; Searcy, Cory; Salustri, Filipo; Patrick Neumann, W

    2015-01-01

    The 'design for human factors' grounded theory explains 'how' human factors (HF) went from a reactive, after-injury programme in safety, to being proactively integrated into each step of the production design process. In this longitudinal case study collaboration with engineers and HF Specialists in a large electronics manufacturer, qualitative data (e.g. meetings, interviews, observations and reflections) were analysed using a grounded theory methodology. The central tenet in the theory is that when HF Specialists acclimated to the engineering process, language and tools, and strategically aligned HF to the design and business goals of the organisation, HF became a means to improve business performance. This led to engineers 'pulling' HF Specialists onto their team. HF targets were adopted into engineering tools to communicate HF concerns quantitatively, drive continuous improvement, visibly demonstrate change and lead to benchmarking. Senior management held engineers accountable for HF as a key performance indicator, thus integrating HF into the production design process. Practitioner Summary: Research and practice lack explanations about how HF can be integrated early in design of production systems. This three-year case study and the theory derived demonstrate how ergonomists changed their focus to align with design and business goals to integrate HF into the design process.

  18. Methodology for Designing Fault-Protection Software

    NASA Technical Reports Server (NTRS)

    Barltrop, Kevin; Levison, Jeffrey; Kan, Edwin

    2006-01-01

    A document describes a methodology for designing fault-protection (FP) software for autonomous spacecraft. The methodology embodies and extends established engineering practices in the technical discipline of Fault Detection, Diagnosis, Mitigation, and Recovery, and has been successfully implemented in the Deep Impact spacecraft, a NASA Discovery mission. Based on established concepts of Fault Monitors and Responses, this FP methodology extends the notions of Opinion, Symptom, Alarm (aka Fault), and Response with numerous new notions, sub-notions, software constructs, and logic and timing gates. For example, a Monitor generates a RawOpinion, which graduates into an Opinion, categorized as no-opinion, acceptable, or unacceptable. RaiseSymptom, ForceSymptom, and ClearSymptom govern the establishment of a Symptom and its mapping to an Alarm (aka Fault). Local Response is distinguished from FP System Response. A 1-to-n and n-to-1 mapping is established among Monitors, Symptoms, and Responses. Responses are categorized by device versus by function. Responses operate in tiers, where the early tiers attempt to resolve the Fault in a localized, step-by-step fashion, relegating more system-level response to later tier(s). Recovery actions are gated by epoch recovery timing, enabling strategy, urgency, a MaxRetry gate, hardware availability, hazardous versus ordinary fault, and many other priority gates. This methodology is systematic and logical, and uses multiple linked tables, parameter files, and recovery command sequences. The credibility of the FP design is proven via a fault-tree analysis "top-down" approach, and a functional fault-mode-effects analysis via a "bottom-up" approach. Via this process, the mitigation and recovery strategy(s) per Fault Containment Region scope (width versus depth) the FP architecture.
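
    A highly simplified sketch of the monitor-symptom-response structure described above is given below. The class names follow the abstract's vocabulary, but the data model, thresholds, and selection logic are assumptions made for illustration; they are not the flight software.

      from dataclasses import dataclass

      # Simplified monitor -> symptom -> tiered-response structure (illustrative only).
      @dataclass
      class Monitor:
          name: str
          threshold: float
          def raw_opinion(self, measurement: float) -> str:
              return "unacceptable" if measurement > self.threshold else "acceptable"

      @dataclass
      class Symptom:
          name: str
          monitors: list            # n-to-1: several monitors can raise one symptom
          def raised(self, readings: dict) -> bool:
              return any(m.raw_opinion(readings[m.name]) == "unacceptable"
                         for m in self.monitors)

      @dataclass
      class Response:
          name: str
          tier: int                 # early tiers act locally, later tiers system-wide

      def select_responses(symptom, responses, readings):
          if not symptom.raised(readings):
              return []
          # Try localized (low-tier) actions before escalating to system-level ones.
          return sorted(responses, key=lambda r: r.tier)

      temp_mon = Monitor("battery_temp", threshold=45.0)
      curr_mon = Monitor("battery_current", threshold=8.0)
      overheat = Symptom("battery_overheat", monitors=[temp_mon, curr_mon])
      plan = select_responses(
          overheat,
          [Response("switch_to_backup_heater_control", tier=1),
           Response("safe_mode_entry", tier=3)],
          readings={"battery_temp": 47.2, "battery_current": 5.1},
      )
      print([r.name for r in plan])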

  19. Methodology to design a municipal solid waste generation and composition map: A case study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallardo, A., E-mail: gallardo@uji.es; Carlos, M., E-mail: mcarlos@uji.es; Peris, M., E-mail: perism@uji.es

    Highlights: • To draw a waste generation and composition map of a town, many factors must be taken into account. • The methodology proposed offers two different approaches depending on the available data, combined with geographical information systems. • The methodology has been applied to a Spanish city with success. • The methodology will be a useful tool to organize municipal solid waste management. - Abstract: Municipal solid waste (MSW) management is an important task that local governments as well as private companies must take into account to protect human health and the environment and to preserve natural resources. To design an adequate MSW management plan the first step consists in defining the waste generation and composition patterns of the town. As these patterns depend on several socio-economic factors it is advisable to organize them beforehand. Moreover, the waste generation and composition patterns may vary around the town and over time. Generally, the data are not homogeneous around the city as the number of inhabitants is not constant, nor is the economic activity. Therefore, if all the information is shown in thematic maps, the final waste management decisions can be made more efficiently. The main aim of this paper is to present a structured methodology that allows local authorities or private companies who deal with MSW to design their own MSW management plan depending on the available data. According to these data, this paper proposes two ways of action: a direct way when detailed data are available and an indirect way when there is a lack of data and it is necessary to rely on bibliographic data. In any case, the amount of information needed is considerable. This paper combines the planning methodology with Geographic Information Systems to present the final results in thematic maps that make them easier to interpret. The proposed methodology is a useful preliminary tool to organize the MSW collection routes

  20. DESIGN METHODOLOGIES AND TOOLS FOR SINGLE-FLUX QUANTUM LOGIC CIRCUITS

    DTIC Science & Technology

    2017-10-01

    Final report, University of Southern California, October 2017; contract FA8750-15-C-0203. … of this project was to investigate the state-of-the-art in design and optimization of single-flux quantum (SFQ) logic circuits, e.g., RSFQ and ERSFQ.

  1. Adapt Design: A Methodology for Enabling Modular Design for Mission Specific SUAS

    DTIC Science & Technology

    2016-08-24

    … vehicle's small scale. This paper considers a different approach to SUAS design aimed at addressing this issue. In this approach, a hybrid modular and … Two types of platforms have been identified: scalable platforms, where variants are produced by varying scalable design variables, and modular

  2. Hydrogel design of experiments methodology to optimize hydrogel for iPSC-NPC culture.

    PubMed

    Lam, Jonathan; Carmichael, S Thomas; Lowry, William E; Segura, Tatiana

    2015-03-11

    Bioactive signals can be incorporated in hydrogels to direct encapsulated cell behavior. Design of experiments methodology varies the signals systematically to determine the individual and combinatorial effects of each factor on cell activity. Using this approach enables the optimization of three ligand concentrations (RGD, YIGSR, IKVAV) for the survival and differentiation of neural progenitor cells. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
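
    The screening logic behind such a design of experiments can be sketched in a few lines. The example below enumerates a hypothetical two-level full factorial over the three ligand concentrations and computes main effects; the levels and the placeholder response function are assumptions, not the study's measurements.

      import itertools

      # Hypothetical two-level screen of three ligand concentrations (ug/mL).
      levels = {"RGD": (10, 100), "YIGSR": (10, 100), "IKVAV": (10, 100)}

      def run_assay(rgd, yigsr, ikvav):
          # Placeholder response model standing in for a measured survival score.
          return 0.4 + 0.003 * rgd + 0.001 * yigsr + 0.0005 * ikvav - 0.000002 * rgd * yigsr

      # Full 2^3 factorial: every combination of low/high levels.
      design = list(itertools.product(*levels.values()))
      responses = {combo: run_assay(*combo) for combo in design}

      # Main effect of a factor = mean response at its high level minus at its low level.
      for idx, (name, (low, high)) in enumerate(levels.items()):
          hi = [r for combo, r in responses.items() if combo[idx] == high]
          lo = [r for combo, r in responses.items() if combo[idx] == low]
          effect = sum(hi) / len(hi) - sum(lo) / len(lo)
          print(f"main effect of {name}: {effect:+.3f}")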

  3. A design methodology for nonlinear systems containing parameter uncertainty

    NASA Technical Reports Server (NTRS)

    Young, G. E.; Auslander, D. M.

    1983-01-01

    In the present design methodology for nonlinear systems containing parameter uncertainty, a generalized sensitivity analysis is incorporated which employs parameter space sampling and statistical inference. For the case of a system with j adjustable and k nonadjustable parameters, this methodology (which includes an adaptive random search strategy) is used to determine the combination of j adjustable parameter values which maximize the probability of those performance indices which simultaneously satisfy design criteria in spite of the uncertainty due to k nonadjustable parameters.
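
    The core idea, estimating the probability of meeting design criteria by sampling the nonadjustable parameters while randomly searching over the adjustable ones, can be sketched as follows. The toy performance model, parameter ranges, and search schedule are illustrative assumptions, not the paper's system.

      import random

      # Toy model: one adjustable gain k, two uncertain (nonadjustable) parameters a, b.
      # The "design criterion" is a performance index staying below a limit.
      def performance_index(k, a, b):
          return abs(1.0 - k * a) + 0.5 * b   # smaller is better

      LIMIT = 0.6
      N_SAMPLES = 500

      def probability_of_success(k):
          hits = 0
          for _ in range(N_SAMPLES):
              a = random.uniform(0.8, 1.2)     # nonadjustable parameter uncertainty
              b = random.gauss(0.3, 0.1)
              if performance_index(k, a, b) <= LIMIT:
                  hits += 1
          return hits / N_SAMPLES

      # Adaptive random search over the adjustable parameter: shrink the search
      # radius slowly while keeping the best point found so far.
      random.seed(1)
      best_k, best_p, radius = 1.0, probability_of_success(1.0), 0.5
      for _ in range(40):
          k = best_k + random.uniform(-radius, radius)
          p = probability_of_success(k)
          if p > best_p:
              best_k, best_p = k, p
          else:
              radius *= 0.95
      print(f"best k = {best_k:.3f}, estimated success probability = {best_p:.2f}")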

  4. A performance-oriented power transformer design methodology using multi-objective evolutionary optimization

    PubMed Central

    Adly, Amr A.; Abd-El-Hafiz, Salwa K.

    2014-01-01

    Transformers are regarded as crucial components in power systems. Due to market globalization, power transformer manufacturers are facing an increasingly competitive environment that mandates the adoption of design strategies yielding better performance at lower costs. In this paper, a power transformer design methodology using multi-objective evolutionary optimization is proposed. Using this methodology, which is tailored to be target performance design-oriented, quick rough estimation of transformer design specifics may be inferred. Testing of the suggested approach revealed significant qualitative and quantitative match with measured design and performance values. Details of the proposed methodology as well as sample design results are reported in the paper. PMID:26257939

  5. A performance-oriented power transformer design methodology using multi-objective evolutionary optimization.

    PubMed

    Adly, Amr A; Abd-El-Hafiz, Salwa K

    2015-05-01

    Transformers are regarded as crucial components in power systems. Due to market globalization, power transformer manufacturers are facing an increasingly competitive environment that mandates the adoption of design strategies yielding better performance at lower costs. In this paper, a power transformer design methodology using multi-objective evolutionary optimization is proposed. Using this methodology, which is tailored to be target performance design-oriented, quick rough estimation of transformer design specifics may be inferred. Testing of the suggested approach revealed significant qualitative and quantitative match with measured design and performance values. Details of the proposed methodology as well as sample design results are reported in the paper.

  6. PEM Fuel Cells Redesign Using Biomimetic and TRIZ Design Methodologies

    NASA Astrophysics Data System (ADS)

    Fung, Keith Kin Kei

    Two formal design methodologies, biomimetic design and the Theory of Inventive Problem Solving (TRIZ), were applied to the redesign of a Proton Exchange Membrane (PEM) fuel cell. Proof-of-concept prototyping was performed on two of the concepts for water management. The liquid water collection with strategically placed wicks concept demonstrated potential benefits for a fuel cell. Conversely, the periodic flow direction reversal concept might cause a reduction in water removal from a fuel cell; the causes of this reduction remain unclear. In addition, three of the concepts generated with biomimetic design were further studied and demonstrated to stimulate more creative ideas in the thermal and water management of fuel cells. The biomimetic design and TRIZ methodologies were successfully applied to fuel cells and provided different perspectives on the redesign of fuel cells. The methodologies should continue to be used to improve fuel cells.

  7. Association between component costs, study methodologies, and foodborne illness-related factors with the cost of nontyphoidal Salmonella illness.

    PubMed

    McLinden, Taylor; Sargeant, Jan M; Thomas, M Kate; Papadopoulos, Andrew; Fazil, Aamir

    2014-09-01

    Nontyphoidal Salmonella spp. are one of the most common causes of bacterial foodborne illness. Variability in cost inventories and study methodologies limits the possibility of meaningfully interpreting and comparing cost-of-illness (COI) estimates, reducing their usefulness. However, little is known about the relative effect these factors have on a cost-of-illness estimate. This is important for comparing existing estimates and when designing new cost-of-illness studies. Cost-of-illness estimates, identified through a scoping review, were used to investigate the association between descriptive, component cost, methodological, and foodborne illness-related factors such as chronic sequelae and under-reporting with the cost of nontyphoidal Salmonella spp. illness. The standardized cost of nontyphoidal Salmonella spp. illness from 30 estimates reported in 29 studies ranged from $0.01568 to $41.22 United States dollars (USD)/person/year (2012). The mean cost of nontyphoidal Salmonella spp. illness was $10.37 USD/person/year (2012). The following factors were found to be significant in multiple linear regression (p≤0.05): the number of direct component cost categories included in an estimate (0-4, particularly long-term care costs) and chronic sequelae costs (inclusion/exclusion), which had positive associations with the cost of nontyphoidal Salmonella spp. illness. Factors related to study methodology were not significant. Our findings indicated that study methodology may not be as influential as other factors, such as the number of direct component cost categories included in an estimate and costs incurred due to chronic sequelae. Therefore, these may be the most important factors to consider when designing, interpreting, and comparing cost of foodborne illness studies.
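
    The kind of multiple linear regression described, relating a standardized cost-of-illness estimate to the number of direct component cost categories and the inclusion of chronic sequelae costs, can be sketched with ordinary least squares. The data below are invented for illustration only and do not reproduce the review's estimates.

      import numpy as np

      # Hypothetical standardized COI estimates (USD/person/year) with two predictors:
      # number of direct component cost categories (0-4) and chronic sequelae
      # cost inclusion (0/1). Values are illustrative, not the review's data.
      cost     = np.array([0.3, 1.1, 2.5, 4.8, 9.7, 12.0, 18.5, 25.3, 33.0, 41.2])
      n_direct = np.array([0,   1,   1,   2,   2,   3,    3,    4,    4,    4  ])
      sequelae = np.array([0,   0,   1,   0,   1,   0,    1,    0,    1,    1  ])

      # Ordinary least squares: cost = b0 + b1*n_direct + b2*sequelae
      X = np.column_stack([np.ones_like(cost), n_direct, sequelae])
      beta, residuals, *_ = np.linalg.lstsq(X, cost, rcond=None)
      print("intercept, effect per extra cost category, effect of sequelae inclusion:")
      print(np.round(beta, 2))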

  8. A top-down design methodology and its implementation for VCSEL-based optical links design

    NASA Astrophysics Data System (ADS)

    Li, Jiguang; Cao, Mingcui; Cai, Zilong

    2005-01-01

    In order to find the optimal design for a given specification of an optical communication link, an integrated simulation of the electronic, optoelectronic, and optical components of a complete system is required. It is very important to be able to simulate at both the system level and the detailed model level. This kind of model is feasible due to the high potential of the Verilog-AMS language. In this paper, we propose an effective top-down design methodology and employ it in the development of a complete VCSEL-based optical link simulation. The principle of the top-down methodology is that development proceeds from the system level to the device level. To design a hierarchical model for VCSEL-based optical links, the design framework is organized in three levels of hierarchy. The models are developed and implemented in Verilog-AMS, and the model parameters are fitted to measured data. A sample transient simulation demonstrates the functioning of our implementation. Suggestions for future directions in top-down methodology used for optoelectronic systems technology are also presented.

  9. Integrated Controls-Structures Design Methodology: Redesign of an Evolutionary Test Structure

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Elliot, Kenny B.; Joshi, Suresh M.

    1997-01-01

    An optimization-based integrated controls-structures design methodology for a class of flexible space structures is described, and the phase-0 Controls-Structures-Integration evolutionary model, a laboratory testbed at NASA Langley, is redesigned using this integrated design methodology. The integrated controls-structures design is posed as a nonlinear programming problem to minimize the control effort required to maintain a specified line-of-sight pointing performance, under persistent white noise disturbance. Static and dynamic dissipative control strategies are employed for feedback control, and parameters of these controllers are considered as the control design variables. Sizes of strut elements in various sections of the CEM are used as the structural design variables. Design guides for the struts are developed and employed in the integrated design process, to ensure that the redesigned structure can be effectively fabricated. The superiority of the integrated design methodology over the conventional design approach is demonstrated analytically by observing a significant reduction in the average control power needed to maintain specified pointing performance with the integrated design approach.

  10. FINDING A METHOD FOR THE MADNESS: A COMPARATIVE ANALYSIS OF STRATEGIC DESIGN METHODOLOGIES

    DTIC Science & Technology

    2017-06-01

    This work develops a comparative model for strategic design methodologies, focusing on the primary elements of vision, time, process, communication and collaboration, and risk assessment. My analysis dissects and compares three potential design methodologies, including net assessment, scenarios and

  11. Methodologies for Root Locus and Loop Shaping Control Design with Comparisons

    NASA Technical Reports Server (NTRS)

    Kopasakis, George

    2017-01-01

    This paper describes some basics for the root locus controls design method as well as for loop shaping, and establishes approaches to expedite the application of these two design methodologies to easily obtain control designs that meet requirements with superior performance. The two design approaches are compared for their ability to meet control design specifications and for ease of application using control design examples. These approaches are also compared with traditional Proportional Integral Derivative (PID) control in order to demonstrate the limitations of PID control. Robustness of these designs is covered as it pertains to these control methodologies and for the example problems.
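
    The root-locus idea, sweeping a scalar loop gain and tracking the closed-loop poles, can be reduced to a few lines for a simple assumed plant. The plant below is an illustrative example chosen here, not one of the paper's design problems.

      import numpy as np

      # Assumed example plant G(s) = 1 / (s (s + 2)) under unity feedback with gain K.
      # Closed-loop characteristic polynomial: s^2 + 2 s + K. Sweeping K and printing
      # the closed-loop poles reproduces the root-locus idea in its simplest form.
      for K in (0.5, 1.0, 2.0, 5.0, 10.0):
          poles = np.roots([1.0, 2.0, K])
          damping = -poles.real / np.abs(poles)   # damping ratio per pole
          print(f"K={K:5.1f}  poles={np.round(poles, 3)}  zeta={np.round(damping, 2)}")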

  12. Research design: the methodology for interdisciplinary research framework.

    PubMed

    Tobi, Hilde; Kampen, Jarl K

    2018-01-01

    Many of today's global scientific challenges require the joint involvement of researchers from different disciplinary backgrounds (social sciences, environmental sciences, climatology, medicine, etc.). Such interdisciplinary research teams face many challenges resulting from differences in training and scientific culture. Interdisciplinary education programs are required to train truly interdisciplinary scientists with respect to the critical factors of skills and competences. For that purpose this paper presents the Methodology for Interdisciplinary Research (MIR) framework. The MIR framework was developed to help cross disciplinary borders, especially those between the natural sciences and the social sciences. The framework has been specifically constructed to facilitate the design of interdisciplinary scientific research, and can be applied in an educational program, as a reference for monitoring the phases of interdisciplinary research, and as a tool to design such research in a process approach. It is suitable for research projects of different sizes and levels of complexity, and it allows for a range of method combinations (case study, mixed methods, etc.). The different phases of designing interdisciplinary research in the MIR framework are described and illustrated by real-life applications in teaching and research. We further discuss the framework's utility for research design in landscape architecture and mixed methods research, and provide an outlook on the framework's potential in inclusive interdisciplinary research and, last but not least, research integrity.

  13. Methodology for worker neutron exposure evaluation in the PDCF facility design.

    PubMed

    Scherpelz, R I; Traub, R J; Pryor, K H

    2004-01-01

    A project headed by Washington Group International is meant to design the Pit Disassembly and Conversion Facility (PDCF) to convert the plutonium pits from excessed nuclear weapons into plutonium oxide for ultimate disposition. Battelle staff are performing the shielding calculations that will determine appropriate shielding so that the facility workers will not exceed target exposure levels. The target exposure levels for workers in the facility are 5 mSv y(-1) for the whole body and 100 mSv y(-1) for the extremity, which presents a significant challenge to the designers of a facility that will process tons of radioactive material. The design effort depended on shielding calculations to determine appropriate thickness and composition for glove box walls, and concrete wall thicknesses for storage vaults. Pacific Northwest National Laboratory (PNNL) staff used ORIGEN-S and SOURCES to generate gamma and neutron source terms, and the Monte Carlo neutron-photon transport code MCNP-4C to calculate the radiation transport in the facility. The shielding calculations were performed by a team of four scientists, so it was necessary to develop a consistent methodology. There was also a requirement for the study to be cost-effective, so efficient methods of evaluation were required. The calculations were subject to rigorous scrutiny by internal and external reviewers, so acceptability was a major feature of the methodology. Some of the issues addressed in the development of the methodology included selecting appropriate dose factors, developing a method for handling extremity doses, adopting an efficient method for evaluating effective dose equivalent in a non-uniform radiation field, modelling the reinforcing steel in concrete, and modularising the geometry descriptions for efficiency. The relative importance of the neutron dose equivalent compared with the gamma dose equivalent varied substantially depending on the specific shielding conditions and lessons

  14. Railroad classification yard design methodology study Elkhart Yard Rehabilitation : a case study

    DOT National Transportation Integrated Search

    1980-02-01

    This interim report documents the application of a railroad classification yard design methodology to CONRAIL's Elkhart Yard Rehabilitation. This case study effort represents Phase 2 of a larger effort to develop a yard design methodology, and ...

  15. Propulsion integration of hypersonic air-breathing vehicles utilizing a top-down design methodology

    NASA Astrophysics Data System (ADS)

    Kirkpatrick, Brad Kenneth

    In recent years, a focus of aerospace engineering design has been the development of advanced design methodologies and frameworks to account for increasingly complex and integrated vehicles. Techniques such as parametric modeling, global vehicle analyses, and interdisciplinary data sharing have been employed in an attempt to improve the design process. The purpose of this study is to introduce a new approach to integrated vehicle design known as the top-down design methodology. In the top-down design methodology, the main idea is to relate design changes on the vehicle system and sub-system level to a set of over-arching performance and customer requirements. Rather than focusing on the performance of an individual system, the system is analyzed in terms of the net effect it has on the overall vehicle and other vehicle systems. This detailed level of analysis can only be accomplished through the use of high fidelity computational tools such as Computational Fluid Dynamics (CFD) or Finite Element Analysis (FEA). The utility of the top-down design methodology is investigated through its application to the conceptual and preliminary design of a long-range hypersonic air-breathing vehicle for a hypothetical next generation hypersonic vehicle (NHRV) program. System-level design is demonstrated through the development of the nozzle section of the propulsion system. From this demonstration of the methodology, conclusions are made about the benefits, drawbacks, and cost of using the methodology.

  16. A Novel Methodology for Measurements of an LED's Heat Dissipation Factor

    NASA Astrophysics Data System (ADS)

    Jou, R.-Y.; Haung, J.-H.

    2015-12-01

    Heat generation is an inevitable byproduct of high-power light-emitting diode (LED) lighting. The increase in junction temperature that accompanies the heat generation sharply degrades the optical output of the LED and has a significant negative influence on the reliability and durability of the LED. For these reasons, the heat dissipation factor, Kh, is an important factor in the modeling and thermal design of LED installations. In this study, a methodology is proposed and experiments are conducted to determine LED heat dissipation factors. Experiments are conducted for two different brands of LED. The average heat dissipation factor of the Edixeon LED is 0.69, and that of the OSRAM LED is 0.60. By using the developed test method and comparing the results to the luminous fluxes calculated from theoretical equations, the interdependence of optical, electrical, and thermal powers can be predicted with reasonable accuracy. The difference between the theoretical and experimental values is less than 9 %.
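
    The abstract does not state the defining relation for Kh. Under the common assumption that Kh is the fraction of electrical input power converted to heat, so P_heat = Kh * P_electrical and P_optical = (1 - Kh) * P_electrical, a short worked example using the reported average values looks as follows; the 10 W drive power is hypothetical.

      # Assumed relation (not given explicitly in the abstract):
      #   P_heat = Kh * P_electrical,   P_optical = (1 - Kh) * P_electrical
      # The drive power below is an illustrative value, not a measured one.
      P_electrical = 10.0  # W, hypothetical

      for brand, kh in (("Edixeon", 0.69), ("OSRAM", 0.60)):
          p_heat = kh * P_electrical
          p_optical = (1.0 - kh) * P_electrical
          print(f"{brand}: Kh={kh:.2f} -> heat {p_heat:.1f} W, optical {p_optical:.1f} W")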

  17. A combined stochastic feedforward and feedback control design methodology with application to autoland design

    NASA Technical Reports Server (NTRS)

    Halyo, Nesim

    1987-01-01

    A combined stochastic feedforward and feedback control design methodology was developed. The objective of the feedforward control law is to track the commanded trajectory, whereas the feedback control law tries to maintain the plant state near the desired trajectory in the presence of disturbances and uncertainties about the plant. The feedforward control law design is formulated as a stochastic optimization problem and is embedded into the stochastic output feedback problem, where the plant contains unstable and uncontrollable modes. An algorithm to compute the optimal feedforward is developed. In this approach, the use of error integral feedback, dynamic compensation, and control rate command structures is an integral part of the methodology. An incremental implementation is recommended. Results on the eigenvalues of the implemented versus designed control laws are presented. The stochastic feedforward/feedback control methodology is used to design a digital automatic landing system for the ATOPS Research Vehicle, a Boeing 737-100 aircraft. The system control modes include localizer and glideslope capture and track, and flare to touchdown. Results of a detailed nonlinear simulation of the digital control laws, actuator systems, and aircraft aerodynamics are presented.

  18. An investigation into creative design methodologies for textiles and fashion

    NASA Astrophysics Data System (ADS)

    Gault, Alison

    2017-10-01

    Understanding market intelligence, trends, influences and personal approaches is essential for design students developing their ideas in textiles and fashion. Identifying different personal approaches, whether visual, process-led or concept-led, by employing creative methodologies is key to developing a brief. A series of ideas or themes starts to emerge and, through the design process, serves to underpin and inform an entire collection. These investigations ensure that the design collections are able to produce a diverse range of outcomes. Following key structures and coherent stages in the design process creates authentic collections in textiles and fashion. A range of undergraduate students presented their design portfolios (180) and the methodologies employed were mapped against success at module level, industry response and graduate employment.

  19. One Controller at a Time (1-CAT): A mimo design methodology

    NASA Technical Reports Server (NTRS)

    Mitchell, J. R.; Lucas, J. C.

    1987-01-01

    The One Controller at a Time (1-CAT) methodology for designing digital controllers for Large Space Structures (LSS's) is introduced and illustrated. The flexible mode problem is first discussed. Next, desirable features of a LSS control system design methodology are delineated. The 1-CAT approach is presented, along with an analytical technique for carrying out the 1-CAT process. Next, 1-CAT is used to design digital controllers for the proposed Space Based Laser (SBL). Finally, the SBL design is evaluated for dynamical performance, noise rejection, and robustness.

  20. Advanced piloted aircraft flight control system design methodology. Volume 2: The FCX flight control design expert system

    NASA Technical Reports Server (NTRS)

    Myers, Thomas T.; Mcruer, Duane T.

    1988-01-01

    The development of a comprehensive and eclectic methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. The FCX expert system as presently developed is only a limited prototype capable of supporting basic lateral-directional FCS design activities related to the design example used. FCX presently supports design of only one FCS architecture (yaw damper plus roll damper) and the rules are largely focused on Class IV (highly maneuverable) aircraft. Despite this limited scope, the major elements which appear necessary for application of knowledge-based software concepts to flight control design were assembled, and thus FCX represents a prototype which can be tested, critiqued and evolved in an ongoing process of development.

  1. Advanced piloted aircraft flight control system design methodology. Volume 1: Knowledge base

    NASA Technical Reports Server (NTRS)

    Mcruer, Duane T.; Myers, Thomas T.

    1988-01-01

    The development of a comprehensive and eclectic methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. While theory and associated computational means are an important aspect of the design methodology, the lore, knowledge and experience elements which guide and govern applications are critical features. This material is presented as summary tables, outlines, recipes, empirical data, lists, etc., which encapsulate a great deal of expert knowledge. Much of this is presented in topical knowledge summaries which are attached as Supplements. The composite of the supplements and the main body elements constitutes a first cut at a Mark 1 Knowledge Base for manned-aircraft flight control.

  2. Statistical core design methodology using the VIPRE thermal-hydraulics code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lloyd, M.W.; Feltus, M.A.

    1994-12-31

    This Penn State Statistical Core Design Methodology (PSSCDM) is unique because it not only includes the EPRI correlation/test data standard deviation but also the computational uncertainty for the VIPRE code model and the new composite box design correlation. The resultant PSSCDM equation mimics the EPRI DNBR correlation results well, with an uncertainty of 0.0389. The combined uncertainty yields a new DNBR limit of 1.18 that will provide more plant operational flexibility. This methodology and its associated correlation and unique coefficients are for a very particular VIPRE model; thus, the correlation will be specifically linked with the lumped channel and subchannel layout. The results of this research and methodology, however, can be applied to plant-specific VIPRE models.

  3. Methodological standards in single-case experimental design: Raising the bar.

    PubMed

    Ganz, Jennifer B; Ayres, Kevin M

    2018-04-12

    Single-case experimental designs (SCEDs), or small-n experimental research, are frequently implemented to assess approaches to improving outcomes for people with disabilities, particularly those with low-incidence disabilities, such as some developmental disabilities. SCED has become increasingly accepted as a research design. As this literature base is needed to determine what interventions are evidence-based practices, the acceptance of SCED has resulted in increased critiques with regard to methodological quality. Recent trends include recommendations from a number of expert scholars and institutions. The purpose of this article is to summarize the recent history of methodological quality considerations, synthesize the recommendations found in the SCED literature, and provide recommendations to researchers designing SCEDs with regard to essential and aspirational standards for methodological quality. Conclusions include imploring SCED researchers to increase the quality of their experiments, with particular consideration of the applied nature of SCED research to be published in Research in Developmental Disabilities and beyond. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. A Synergy between the Technological Process and a Methodology for Web Design: Implications for Technological Problem Solving and Design

    ERIC Educational Resources Information Center

    Jakovljevic, Maria; Ankiewicz, Piet; De swardt, Estelle; Gross, Elna

    2004-01-01

    Traditional instructional methodology in the Information System Design (ISD) environment lacks explicit strategies for promoting the cognitive skills of prospective system designers. This contributes to the fragmented knowledge and low motivational and creative involvement of learners in system design tasks. In addition, present ISD methodologies,…

  5. Analysis and Design of Fuselage Structures Including Residual Strength Prediction Methodology

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.

    1998-01-01

    The goal of this research project is to develop and assess methodologies for the design and analysis of fuselage structures accounting for residual strength. Two primary objectives are included in this research activity: development of structural analysis methodology for predicting the residual strength of fuselage shell-type structures; and development of accurate, efficient analysis, design and optimization tools for fuselage shell structures. Assessment of these tools for robustness, efficiency, and usability in a fuselage shell design environment will be integrated with these two primary research objectives.

  6. Efficient preliminary floating offshore wind turbine design and testing methodologies and application to a concrete spar design.

    PubMed

    Matha, Denis; Sandner, Frank; Molins, Climent; Campos, Alexis; Cheng, Po Wen

    2015-02-28

    The current key challenge in the floating offshore wind turbine industry and research is on designing economic floating systems that can compete with fixed-bottom offshore turbines in terms of levelized cost of energy. The preliminary platform design, as well as early experimental design assessments, are critical elements in the overall design process. In this contribution, a brief review of current floating offshore wind turbine platform pre-design and scaled testing methodologies is provided, with a focus on their ability to accommodate the coupled dynamic behaviour of floating offshore wind systems. The exemplary design and testing methodology for a monolithic concrete spar platform as performed within the European KIC AFOSP project is presented. Results from the experimental tests compared to numerical simulations are presented and analysed and show very good agreement for relevant basic dynamic platform properties. Extreme and fatigue loads and cost analysis of the AFOSP system confirm the viability of the presented design process. In summary, the exemplary application of the reduced design and testing methodology for AFOSP confirms that it represents a viable procedure during pre-design of floating offshore wind turbine platforms. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  7. Efficient preliminary floating offshore wind turbine design and testing methodologies and application to a concrete spar design

    PubMed Central

    Matha, Denis; Sandner, Frank; Molins, Climent; Campos, Alexis; Cheng, Po Wen

    2015-01-01

    The current key challenge in the floating offshore wind turbine industry and research is on designing economic floating systems that can compete with fixed-bottom offshore turbines in terms of levelized cost of energy. The preliminary platform design, as well as early experimental design assessments, are critical elements in the overall design process. In this contribution, a brief review of current floating offshore wind turbine platform pre-design and scaled testing methodologies is provided, with a focus on their ability to accommodate the coupled dynamic behaviour of floating offshore wind systems. The exemplary design and testing methodology for a monolithic concrete spar platform as performed within the European KIC AFOSP project is presented. Results from the experimental tests compared to numerical simulations are presented and analysed and show very good agreement for relevant basic dynamic platform properties. Extreme and fatigue loads and cost analysis of the AFOSP system confirm the viability of the presented design process. In summary, the exemplary application of the reduced design and testing methodology for AFOSP confirms that it represents a viable procedure during pre-design of floating offshore wind turbine platforms. PMID:25583870

  8. Philosophical and Methodological Beliefs of Instructional Design Faculty and Professionals

    ERIC Educational Resources Information Center

    Sheehan, Michael D.; Johnson, R. Burke

    2012-01-01

    The purpose of this research was to probe the philosophical beliefs of instructional designers using sound philosophical constructs and quantitative data collection and analysis. We investigated the philosophical and methodological beliefs of instructional designers, including 152 instructional design faculty members and 118 non-faculty…

  9. CAGE IIIA Distributed Simulation Design Methodology

    DTIC Science & Technology

    2014-05-01

    … Implementing Defence Experimentation (GUIDEx). The key challenges for this methodology are with understanding how to: design it; define the … operation and to be available in the other nation's simulations. The challenge for the CAGE campaign of experiments is to continue to build upon this

  10. An NAFP Project: Use of Object Oriented Methodologies and Design Patterns to Refactor Software Design

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali; Baggs, Rhoda

    2007-01-01

    In the early problem-solution era of software programming, functional decompositions were mainly used to design and implement software solutions. In functional decompositions, functions and data are introduced as two separate entities during the design phase, and are followed as such in the implementation phase. Functional decompositions make use of refactoring through optimizing the algorithms, grouping similar functionalities into common reusable functions, and using abstract representations of data where possible; all these are done during the implementation phase. This paper advocates the usage of object-oriented methodologies and design patterns as the centerpieces of refactoring software solutions. Refactoring software is a method of changing software design while explicitly preserving its external functionalities. The combined usage of object-oriented methodologies and design patterns to refactor should also benefit the overall software life cycle cost with improved software.

  11. Probabilistic Design Methodology and its Application to the Design of an Umbilical Retract Mechanism

    NASA Technical Reports Server (NTRS)

    Onyebueke, Landon; Ameye, Olusesan

    2002-01-01

    A lot has been learned from past experience with structural and machine element failures. The understanding of failure modes and the application of an appropriate design analysis method can lead to improved structural and machine element safety as well as serviceability. To apply Probabilistic Design Methodology (PDM), all uncertainties are modeled as random variables with selected distribution types, means, and standard deviations. It is quite difficult to achieve a robust design without considering the randomness of the design parameters, which is the case in the Deterministic Design Approach. The US Navy has a fleet of submarine-launched ballistic missiles. An umbilical plug joins the missile to the submarine in order to provide electrical and cooling water connections. As the missile leaves the submarine, an umbilical retract mechanism retracts the umbilical plug clear of the advancing missile after disengagement during launch and restrains the plug in the retracted position. The design of the current retract mechanism in use was based on the deterministic approach, which puts emphasis on the factor of safety. A new umbilical retract mechanism that is simpler in design, lighter in weight, more reliable, easier to adjust, and more cost effective has become desirable, since this will increase the performance and efficiency of the system. This paper reports on a recent project performed at Tennessee State University for the US Navy that involved the application of PDM to the design of an umbilical retract mechanism, and demonstrates how the use of PDM led to the minimization of weight and cost, and the maximization of reliability and performance.
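
    The contrast between a deterministic factor of safety and a PDM-style reliability estimate can be shown with a toy stress-strength interference calculation. The distributions below are assumed for illustration and are unrelated to the retract-mechanism data.

      import random

      # Toy stress-strength example contrasting the deterministic factor of safety
      # with a Monte Carlo failure probability. All numbers are illustrative.
      random.seed(0)
      N = 100_000

      MEAN_STRENGTH, SD_STRENGTH = 500.0, 40.0   # e.g. MPa
      MEAN_STRESS,   SD_STRESS   = 300.0, 60.0

      # Deterministic view: nominal factor of safety based on mean values only.
      print("nominal factor of safety:", round(MEAN_STRENGTH / MEAN_STRESS, 2))

      # Probabilistic view: model both quantities as random variables, count failures.
      failures = sum(
          random.gauss(MEAN_STRENGTH, SD_STRENGTH) < random.gauss(MEAN_STRESS, SD_STRESS)
          for _ in range(N)
      )
      print(f"estimated probability of failure: {failures / N:.4f}")
      print(f"estimated reliability: {1 - failures / N:.4f}")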

  12. Utility of Army Design Methodology in U.S. Coast Guard Counter Narcotic Interdiction Strategy

    DTIC Science & Technology

    2017-06-09

    Thesis, AUG 2016 – JUN 2017. Distribution is Unlimited. This study investigates the utility of using Army Design Methodology (ADM) to

  13. Methodological Considerations in Designing and Evaluating Animal-Assisted Interventions.

    PubMed

    Stern, Cindy; Chur-Hansen, Anna

    2013-02-27

    This paper presents a discussion of the literature on animal-assisted interventions and describes limitations surrounding current methodological quality. Benefits to human physical, psychological and social health cannot be empirically confirmed due to the methodological limitations of the existing body of research, and comparisons cannot validly be made across different studies. Without a solid research base animal-assisted interventions will not receive recognition and acceptance as a credible alternative health care treatment. The paper draws on the work of four systematic reviews conducted over April-May 2009, with no date restrictions, focusing exclusively on the use of canine-assisted interventions for older people residing in long-term care. The reviews revealed a lack of good quality studies. Although the literature base has grown in volume since its inception, it predominantly consists of anecdotal accounts and reports. Experimental studies undertaken are often flawed in aspects of design, conduct and reporting. There are few qualitative studies available leading to the inability to draw definitive conclusions. It is clear that due to the complexities associated with these interventions not all weaknesses can be eliminated. However, there are basic methodological weaknesses that can be addressed in future studies in the area. Checklists for quantitative and qualitative research designs to guide future research are offered to help address methodological rigour.

  14. Bioremediation of chlorpyrifos contaminated soil by two phase bioslurry reactor: Processes evaluation and optimization by Taguchi's design of experimental (DOE) methodology.

    PubMed

    Pant, Apourv; Rai, J P N

    2018-04-15

    A two-phase bioreactor was constructed, designed and developed to evaluate chlorpyrifos remediation. Six biotic and abiotic factors (substrate loading rate, slurry-phase pH, slurry-phase dissolved oxygen (DO), soil-to-water ratio, temperature and soil microflora load) were evaluated by design of experiments (DOE) methodology employing Taguchi's orthogonal array (OA). The six selected factors were considered at two levels in an L-8 array (2^7, 15 experiments) in the experimental design. The optimum operating conditions obtained from the methodology enhanced chlorpyrifos degradation from 283.86 µg/g to 955.364 µg/g, an overall enhancement of 70.34%. In the present study, with the help of a few well-defined experimental parameters, a mathematical model was constructed to understand the complex bioremediation process and optimize the parameters with good accuracy. Copyright © 2017 Elsevier Inc. All rights reserved.
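
    For readers unfamiliar with Taguchi's approach, the sketch below builds the standard L8 (2^7) orthogonal array, assigns the six factors to its first six columns, and computes larger-is-better signal-to-noise main effects. The degradation responses are invented placeholders, not the study's measurements.

      import numpy as np

      # Standard Taguchi L8 (2^7) orthogonal array, levels coded 1/2.
      L8 = np.array([
          [1, 1, 1, 1, 1, 1, 1],
          [1, 1, 1, 2, 2, 2, 2],
          [1, 2, 2, 1, 1, 2, 2],
          [1, 2, 2, 2, 2, 1, 1],
          [2, 1, 2, 1, 2, 1, 2],
          [2, 1, 2, 2, 1, 2, 1],
          [2, 2, 1, 1, 2, 2, 1],
          [2, 2, 1, 2, 1, 1, 2],
      ])

      factors = ["substrate load", "slurry pH", "dissolved O2",
                 "soil:water", "temperature", "microflora load"]  # first 6 columns

      # Hypothetical chlorpyrifos degradation (ug/g) for the eight runs.
      y = np.array([310.0, 405.0, 520.0, 610.0, 455.0, 700.0, 820.0, 940.0])

      # "Larger is better" signal-to-noise ratio per run (single replicate here).
      sn = -10.0 * np.log10(1.0 / y**2)

      # Main effect of a factor = mean S/N at level 2 minus mean S/N at level 1.
      for col, name in enumerate(factors):
          effect = sn[L8[:, col] == 2].mean() - sn[L8[:, col] == 1].mean()
          print(f"{name:16s} S/N effect: {effect:+.2f} dB")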

  15. Designing, Evaluating, and Deploying Automated Scoring Systems with Validity in Mind: Methodological Design Decisions

    ERIC Educational Resources Information Center

    Rupp, André A.

    2018-01-01

    This article discusses critical methodological design decisions for collecting, interpreting, and synthesizing empirical evidence during the design, deployment, and operational quality-control phases for automated scoring systems. The discussion is inspired by work on operational large-scale systems for automated essay scoring but many of the…

  16. [Strengthening the methodology of study designs in scientific researches].

    PubMed

    Ren, Ze-qin

    2010-06-01

    Many problems in study design have seriously affected the validity of scientific research. We must understand research methodology, especially clinical epidemiology and biostatistics, and recognize the urgency of selecting and implementing the right study design. Only then can we strengthen research capability and improve the overall quality of scientific research.

  17. A Methodology for the Design of Application-Specific Cyber-Physical Social Sensing Co-Simulators.

    PubMed

    Sánchez, Borja Bordel; Alcarria, Ramón; Sánchez-Picot, Álvaro; Sánchez-de-Rivera, Diego

    2017-09-22

    Cyber-Physical Social Sensing (CPSS) is a new trend in the context of pervasive sensing. In these new systems, various domains coexist in time, evolve together and influence each other. Thus, application-specific tools are necessary for specifying and validating designs and simulating systems. However, nowadays, different tools are employed to simulate each domain independently. The main reason for the lack of co-simulation instruments able to simulate all domains together is the extreme difficulty of combining and synchronizing the various tools. In order to reduce that difficulty, an adequate architecture for the final co-simulator must be selected. Therefore, in this paper the authors investigate and propose a methodology for the design of CPSS co-simulation tools. The paper describes the four steps that software architects should follow in order to design the most adequate co-simulator for a certain application, considering the final users' needs and requirements and various additional factors such as the development team's experience. Moreover, the first practical use case of the proposed methodology is provided. An experimental validation is also included in order to evaluate the performance of the proposed co-simulator and to determine the correctness of the proposal.

  18. A Methodology for the Design of Application-Specific Cyber-Physical Social Sensing Co-Simulators

    PubMed Central

    Sánchez-Picot, Álvaro

    2017-01-01

    Cyber-Physical Social Sensing (CPSS) is a new trend in the context of pervasive sensing. In these new systems, various domains coexist in time, evolve together and influence each other. Thus, application-specific tools are necessary for specifying and validating designs and simulating systems. However, nowadays, different tools are employed to simulate each domain independently. The main reason for the lack of co-simulation instruments able to simulate all domains together is the extreme difficulty of combining and synchronizing the various tools. In order to reduce that difficulty, an adequate architecture for the final co-simulator must be selected. Therefore, in this paper the authors investigate and propose a methodology for the design of CPSS co-simulation tools. The paper describes the four steps that software architects should follow in order to design the most adequate co-simulator for a certain application, considering the final users’ needs and requirements and various additional factors such as the development team’s experience. Moreover, the first practical use case of the proposed methodology is provided. An experimental validation is also included in order to evaluate the performance of the proposed co-simulator and to determine the correctness of the proposal. PMID:28937610

  19. Methodology for cloud-based design of robots

    NASA Astrophysics Data System (ADS)

    Ogorodnikova, O. M.; Vaganov, K. A.; Putimtsev, I. D.

    2017-09-01

    This paper presents results of the cloud-based design of a robot arm by a group of students. A methodology for cloud-based design was developed and used to initiate an interdisciplinary project on the research and development of a specific manipulator. All project data files were hosted by the Ural Federal University data center. The 3D (three-dimensional) model of the robot arm was created using Siemens PLM software (Product Lifecycle Management) and structured as a complex mechatronics product by means of the Siemens Teamcenter thin client; all processes were performed in the cloud. The robot arm was designed to load blanks of up to 1 kg into the workspace of a milling machine for student research.

  20. Methodology of Computer-Aided Design of Variable Guide Vanes of Aircraft Engines

    ERIC Educational Resources Information Center

    Falaleev, Sergei V.; Melentjev, Vladimir S.; Gvozdev, Alexander S.

    2016-01-01

    The paper presents a methodology which helps to avoid a great amount of costly experimental research. This methodology includes thermo-gas dynamic design of an engine and its mounts, the profiling of compressor flow path and cascade design of guide vanes. Employing a method elaborated by Howell, we provide a theoretical solution to the task of…

  1. Methodological considerations for designing a community water fluoridation cessation study.

    PubMed

    Singhal, Sonica; Farmer, Julie; McLaren, Lindsay

    2017-06-01

    High-quality, up-to-date research on community water fluoridation (CWF), and especially on the implications of CWF cessation for dental health, is limited. Although CWF cessation studies have been conducted, they are few in number; one of the major reasons is the methodological complexity of conducting such a study. This article draws on a systematic review of existing cessation studies (n=15) to explore methodological considerations of conducting CWF cessation studies in future. We review nine important methodological aspects (study design, comparison community, target population, time frame, sampling strategy, clinical indicators, assessment criteria, covariates and biomarkers) and provide recommendations for planning future CWF cessation studies that examine effects on dental caries. There is no one ideal study design to answer a research question. However, recommendations proposed regarding methodological aspects to conduct an epidemiological study to observe the effects of CWF cessation on dental caries, coupled with our identification of important methodological gaps, will be useful for researchers who are looking to optimize resources to conduct such a study with standards of rigour. © 2017 Her Majesty the Queen in Right of Canada. Community Dentistry and Oral Epidemiology © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  2. When Playing Meets Learning: Methodological Framework for Designing Educational Games

    NASA Astrophysics Data System (ADS)

    Linek, Stephanie B.; Schwarz, Daniel; Bopp, Matthias; Albert, Dietrich

    Game-based learning builds upon the idea of using the motivational potential of video games in the educational context. Thus, the design of educational games has to address optimizing enjoyment as well as optimizing learning. Within the EC-project ELEKTRA a methodological framework for the conceptual design of educational games was developed. In this framework, state-of-the-art psycho-pedagogical approaches were combined with insights from media psychology as well as with best-practice game design. This science-based interdisciplinary approach was complemented by empirical research to answer open questions on educational game design. Additionally, several evaluation cycles were implemented to achieve further improvements. The psycho-pedagogical core of the methodology can be summarized by ELEKTRA's 4Ms: Macroadaptivity, Microadaptivity, Metacognition, and Motivation. The conceptual framework is structured in eight phases which have several interconnections and feedback cycles that enable a close interdisciplinary collaboration between game design, pedagogy, cognitive science and media psychology.

  3. An overview of research on waverider design methodology

    NASA Astrophysics Data System (ADS)

    Ding, Feng; Liu, Jun; Shen, Chi-bing; Liu, Zhen; Chen, Shao-hua; Fu, Xiang

    2017-11-01

    A waverider is any supersonic or hypersonic lifting body that is characterized by an attached, or nearly attached, bow shock wave along its leading edge. As a waverider can possess a high lift-to-drag ratio as well as an ideal precompression surface of the inlet system, it has become one of the most promising designs for air-breathing hypersonic vehicles. This paper reviews and classifies waverider design methodologies developed by local and foreign scholars up until 2016. The design concept of a waverider can be summarized as follows: modeling of the basic flow field is used to design the waverider in the streamwise direction and the osculating theory is used to design the waverider in the spanwise direction.

  4. Novel thermal management system design methodology for power lithium-ion battery

    NASA Astrophysics Data System (ADS)

    Nieto, Nerea; Díaz, Luis; Gastelurrutia, Jon; Blanco, Francisco; Ramos, Juan Carlos; Rivas, Alejandro

    2014-12-01

    Battery packs conformed by large format lithium-ion cells are increasingly being adopted in hybrid and pure electric vehicles in order to use the energy more efficiently and for a better environmental performance. Safety and cycle life are two of the main concerns regarding this technology, which are closely related to the cell's operating behavior and temperature asymmetries in the system. Therefore, the temperature of the cells in battery packs needs to be controlled by thermal management systems (TMSs). In the present paper an improved design methodology for developing TMSs is proposed. This methodology involves the development of different mathematical models for heat generation, transmission, and dissipation and their coupling and integration in the battery pack product design methodology in order to improve the overall safety and performance. The methodology is validated by comparing simulation results with laboratory measurements on a single module of the battery pack designed at IK4-IKERLAN for a traction application. The maximum difference between model predictions and experimental temperature data is 2 °C. The models developed have shown potential for use in battery thermal management studies for EV/HEV applications since they allow for scalability with accuracy and reasonable simulation time.
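    The kind of coupled heat generation/dissipation model referred to above can be pictured with a very small lumped-parameter sketch. The snippet below integrates a single energy balance for one cell using forward Euler; all parameter values (heat generation, mass, heat capacity, convective coefficient, area) are hypothetical placeholders, not those of the IK4-IKERLAN module.

        import numpy as np

        # Hypothetical lumped cell parameters
        m, cp = 1.0, 900.0        # kg, J/(kg K)
        h, area = 15.0, 0.05      # W/(m2 K), m2
        q_gen = 6.0               # W, constant heat generation during discharge
        T_amb = 25.0              # degC

        dt, t_end = 1.0, 3600.0   # s
        T = 25.0                  # initial cell temperature, degC
        history = []
        for _ in np.arange(0.0, t_end, dt):
            # Energy balance: m*cp*dT/dt = Q_gen - h*A*(T - T_amb)
            dT = (q_gen - h * area * (T - T_amb)) / (m * cp) * dt
            T += dT
            history.append(T)

        print(f"cell temperature after 1 h: {history[-1]:.1f} degC "
              f"(steady state would be {T_amb + q_gen / (h * area):.1f} degC)")

    A pack-level model of the type validated in the paper would couple many such balances through cell-to-cell conduction and coolant flow, but the structure of each node is the same.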

  5. Methodology for designing accelerated aging tests for predicting life of photovoltaic arrays

    NASA Technical Reports Server (NTRS)

    Gaines, G. B.; Thomas, R. E.; Derringer, G. C.; Kistler, C. W.; Bigg, D. M.; Carmichael, D. C.

    1977-01-01

    A methodology for designing aging tests in which life prediction was paramount was developed. The methodology builds upon experience with regard to aging behavior in those material classes which are expected to be utilized as encapsulant elements, viz., glasses and polymers, and upon experience with the design of aging tests. The experiences were reviewed, and results are discussed in detail.

  6. Review of the probabilistic failure analysis methodology and other probabilistic approaches for application in aerospace structural design

    NASA Technical Reports Server (NTRS)

    Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.

    1993-01-01

    Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus, costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.

  7. Methodology for object-oriented real-time systems analysis and design: Software engineering

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structuring of the system, so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation and the original specification, and perhaps the high-level design, is not object oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects where the operations or methods of the objects correspond to processes in the data flow diagrams, and then design in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly connected models which progress from the object-oriented real-time systems analysis logical models through the physical architectural models and the high-level design stages. The methodology is appropriate to the overall specification including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.
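    A "real-time systems-analysis object" of the kind described above can be sketched as a class that owns its states and state-transition rules. The example below is a generic, hypothetical illustration of that modeling idea (a valve controller with a few states), not code from the referenced methodology.

        from dataclasses import dataclass, field

        @dataclass
        class AnalysisObject:
            """A concurrent entity whose time behavior is a set of states and transition rules."""
            name: str
            state: str = "CLOSED"
            # Transition rules: (current state, event) -> next state
            rules: dict = field(default_factory=lambda: {
                ("CLOSED", "open_cmd"):   "OPENING",
                ("OPENING", "limit_hit"): "OPEN",
                ("OPEN", "close_cmd"):    "CLOSING",
                ("CLOSING", "limit_hit"): "CLOSED",
            })

            def handle(self, event: str) -> str:
                """Apply a transition rule; ignore events with no rule in the current state."""
                self.state = self.rules.get((self.state, event), self.state)
                return self.state

        valve = AnalysisObject("coolant_valve")
        for ev in ["open_cmd", "limit_hit", "close_cmd", "limit_hit"]:
            print(ev, "->", valve.handle(ev))

    In the transition to design, an object like this would map directly onto a software object whose methods correspond to the events it handles.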

  8. Methodology for Designing Operational Banking Risks Monitoring System

    NASA Astrophysics Data System (ADS)

    Kostjunina, T. N.

    2018-05-01

    The research looks at principles of designing an information system for monitoring operational banking risks. A proposed design methodology enables one to automate processes of collecting data on information security incidents in the banking network, serving as the basis for an integrated approach to the creation of an operational risk management system. The system can operate remotely ensuring tracking and forecasting of various operational events in the bank network. A structure of a content management system is described.

  9. [Factors conditioning primary care services utilization. Empirical evidence and methodological inconsistencies].

    PubMed

    Sáez, M

    2003-01-01

    In Spain, the degree and characteristics of primary care services utilization have been the subject of analysis since at least the 1980s. One of the main reasons for this interest is to assess the extent to which utilization matches primary care needs. In fact, the provision of an adequate health service for those who most need it is a generally accepted priority. The evidence shows that individual characteristics, mainly health status, are the factors most closely related to primary care utilization. Other personal characteristics, such as gender and age, could act as modulators of health care need. Some family and/or cultural variables, as well as factors related to the health care professional and institutions, could explain some of the observed variability in primary care services utilization. Socioeconomic variables, such as income, reveal a paradox. From an aggregate perspective, income is the main determinant of utilization as well as of health care expenditure. When data are analyzed for individuals, however, income is not related to primary health utilization. The situation is controversial, with methodological implications and, above all, consequences for the assessment of efficiency in primary care utilization. Review of the literature reveals certain methodological inconsistencies that could at least partly explain the disparity of the empirical results. Among others, the following flaws can be highlighted: design problems, measurement errors, misspecification, and misleading statistical methods. Some solutions, among others, are quasi-experiments, the use of large administrative databases and of primary data sources (design problems); differentiation between types of utilization and between units of analysis other than consultations, and correction of measurement errors in the explanatory variables (measurement errors); consideration of relevant explanatory variables (misspecification); and the use of multilevel models (statistical methods).
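    As an illustration of the last recommendation (multilevel models), the sketch below fits a random-intercept model to simulated data in which individuals are nested within health areas; the variable names and simulated effects are invented for the example and are not estimates from the literature reviewed.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(1)
        n_areas, n_per_area = 30, 50
        area = np.repeat(np.arange(n_areas), n_per_area)
        area_effect = rng.normal(0.0, 0.5, n_areas)[area]        # between-area variability
        health = rng.normal(0.0, 1.0, n_areas * n_per_area)      # worse health -> more visits
        income = rng.normal(0.0, 1.0, n_areas * n_per_area)

        # Individual-level need (health) drives utilization; income does not, as in the review
        visits = 3.0 + 1.2 * health + 0.0 * income + area_effect + rng.normal(0, 1, len(area))
        data = pd.DataFrame({"visits": visits, "health": health, "income": income, "area": area})

        # Random-intercept (multilevel) model: individuals nested within areas
        model = smf.mixedlm("visits ~ health + income", data, groups=data["area"]).fit()
        print(model.summary())

    A multilevel specification of this kind separates within-area from between-area variation, which is one way to reconcile the aggregate-versus-individual income paradox noted above.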

  10. Development of a design methodology for asphalt treated mixtures.

    DOT National Transportation Integrated Search

    2013-12-01

    This report summarizes the results of a study that was conducted to develop a simplified design methodology for asphalt : treated mixtures that are durable, stable, constructible, and cost effective through the examination of the performance of : mix...

  11. Aero-Mechanical Design Methodology for Subsonic Civil Transport High-Lift Systems

    NASA Technical Reports Server (NTRS)

    vanDam, C. P.; Shaw, S. G.; VanderKam, J. C.; Brodeur, R. R.; Rudolph, P. K. C.; Kinney, D.

    2000-01-01

    In today's highly competitive and economically driven commercial aviation market, the trend is to make aircraft systems simpler and to shorten their design cycle which reduces recurring, non-recurring and operating costs. One such system is the high-lift system. A methodology has been developed which merges aerodynamic data with kinematic analysis of the trailing-edge flap mechanism with minimum mechanism definition required. This methodology provides quick and accurate aerodynamic performance prediction for a given flap deployment mechanism early on in the high-lift system preliminary design stage. Sample analysis results for four different deployment mechanisms are presented as well as descriptions of the aerodynamic and mechanism data required for evaluation. Extensions to interactive design capabilities are also discussed.

  12. Methodology for CFD Design Analysis of National Launch System Nozzle Manifold

    NASA Technical Reports Server (NTRS)

    Haire, Scot L.

    1993-01-01

    The current design environment dictates that high technology CFD (Computational Fluid Dynamics) analysis produce quality results in a timely manner if it is to be integrated into the design process. The design methodology outlined describes the CFD analysis of an NLS (National Launch System) nozzle film cooling manifold. The objective of the analysis was to obtain a qualitative estimate for the flow distribution within the manifold. A complex, 3D, multiple zone, structured grid was generated from a 3D CAD file of the geometry. A Euler solution was computed with a fully implicit compressible flow solver. Post processing consisted of full 3D color graphics and mass averaged performance. The result was a qualitative CFD solution that provided the design team with relevant information concerning the flow distribution in and performance characteristics of the film cooling manifold within an effective time frame. Also, this design methodology was the foundation for a quick turnaround CFD analysis of the next iteration in the manifold design.

  13. Passive and semi-active heave compensator: Project design methodology and control strategies.

    PubMed

    Cuellar Sanchez, William Humberto; Linhares, Tássio Melo; Neto, André Benine; Fortaleza, Eugênio Libório Feitosa

    2017-01-01

    A heave compensator is a system that mitigates the transmission of heave movement from a vessel to equipment on the vessel. In the drilling industry, a heave compensator enables drilling in offshore environments. The heave compensator attenuates movement transmitted from the vessel to the drill string and drill bit, ensuring the security and efficiency of the offshore drilling process. Common types of heave compensators are passive, active and semi-active compensators. This article presents four main points. First, a bulk modulus analysis yields a simple condition to determine whether the bulk modulus can be neglected in the design of a hydropneumatic passive heave compensator. Second, a methodology to design passive heave compensators with the desired frequency response is presented. Third, four control methodologies for a semi-active heave compensator are tested and compared numerically. Lastly, we show experimental results obtained from a prototype built with the methodology developed to design the passive heave compensator.
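    The frequency-response design step mentioned above can be illustrated with the transmissibility of a single degree-of-freedom passive compensator (a suspended mass supported through a gas-spring stiffness k and damping c). The numbers below are hypothetical; they only show how the choice of stiffness moves the natural frequency relative to the dominant wave frequency, from near-rigid transmission, through resonance, to effective isolation.

        import numpy as np

        def transmissibility(freq_hz, m, k, c):
            """|X_out/X_in| of a base-excited mass-spring-damper system."""
            w = 2.0 * np.pi * freq_hz
            num = np.sqrt(k**2 + (c * w) ** 2)
            den = np.sqrt((k - m * w**2) ** 2 + (c * w) ** 2)
            return num / den

        m = 50_000.0                      # kg, suspended drill-string load (hypothetical)
        wave_freq = 0.1                   # Hz, dominant vessel heave frequency
        for k in (2.0e5, 2.0e4, 2.0e3):   # N/m, progressively softer gas spring
            T = float(transmissibility(wave_freq, m, k, c=5.0e3))
            fn = np.sqrt(k / m) / (2.0 * np.pi)
            print(f"k = {k:8.0f} N/m, natural freq = {fn:.3f} Hz, "
                  f"transmissibility at {wave_freq} Hz = {T:.2f}")

    Choosing the gas-spring parameters so that the natural frequency falls well below the wave frequency is the essence of the passive design step; the semi-active strategies then adjust damping around that baseline.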

  14. A robust optimization methodology for preliminary aircraft design

    NASA Astrophysics Data System (ADS)

    Prigent, S.; Maréchal, P.; Rondepierre, A.; Druot, T.; Belleville, M.

    2016-05-01

    This article focuses on a robust optimization of an aircraft preliminary design under operational constraints. According to engineers' know-how, the aircraft preliminary design problem can be modelled as an uncertain optimization problem whose objective (the cost or the fuel consumption) is almost affine, and whose constraints are convex. It is shown that this uncertain optimization problem can be approximated in a conservative manner by an uncertain linear optimization program, which enables the use of the techniques of robust linear programming of Ben-Tal, El Ghaoui, and Nemirovski [Robust Optimization, Princeton University Press, 2009]. This methodology is then applied to two real cases of aircraft design and numerical results are presented.
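    The robust-counterpart idea cited above (Ben-Tal et al.) can be shown on a toy linear program. In the sketch below, the coefficients of a performance constraint are uncertain within a box; because the design variables are nonnegative and the constraint is of ">=" type, the worst case is simply the lower end of the box, and the robust problem remains an ordinary LP. All numbers are invented for illustration and have nothing to do with the aircraft model in the paper.

        import numpy as np
        from scipy.optimize import linprog

        # Minimize fuel-like cost c^T x subject to a performance constraint a^T x >= b,
        # with a uncertain in the box [a_nom - a_dev, a_nom + a_dev] and x >= 0.
        c = np.array([2.0, 3.0])
        a_nom = np.array([1.0, 2.0])
        a_dev = np.array([0.2, 0.5])
        b = 10.0

        def solve(a):
            # linprog uses "<=" constraints, so a^T x >= b becomes -a^T x <= -b
            res = linprog(c, A_ub=[-a], b_ub=[-b], bounds=[(0, None), (0, None)])
            return res.x, res.fun

        x_nom, f_nom = solve(a_nom)            # nominal design
        x_rob, f_rob = solve(a_nom - a_dev)    # robust design: worst-case (smallest) coefficients
        print("nominal:", x_nom, f_nom)
        print("robust :", x_rob, f_rob)

    The gap between f_nom and f_rob is the price of robustness; the paper's contribution is showing that the (almost affine, convex) aircraft sizing problem admits such a tractable robust counterpart.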

  15. An automated methodology development. [software design for combat simulation

    NASA Technical Reports Server (NTRS)

    Hawley, L. R.

    1985-01-01

    The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life cycle costs and ease the program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real-time, and one-to-one correlation between the object states and real world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance the code efficiency by, e.g., eliminating non-used subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.

  16. Failure detection system design methodology. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Chow, E. Y.

    1980-01-01

    The design of a failure detection and identification system consists of designing a robust residual generation process and a high-performance decision making process. The design of each of these two processes is examined separately. Residual generation is based on analytical redundancy. Redundancy relations that are insensitive to modelling errors and noise effects are important for designing robust residual generation processes. The characterization of the concept of analytical redundancy in terms of a generalized parity space provides a framework in which a systematic approach to the determination of robust redundancy relations is developed. The Bayesian approach is adopted for the design of high-performance decision processes. The FDI decision problem is formulated as a Bayes sequential decision problem. Since the optimal decision rule is incomputable, a methodology for designing suboptimal rules is proposed. A numerical algorithm is developed to facilitate the design and performance evaluation of suboptimal rules.
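    The parity-space idea mentioned above can be sketched for the simple case of static analytical redundancy among sensors. Below, three hypothetical sensors measure two physical states; any vector in the left null space of the measurement matrix yields a residual that stays near zero for consistent measurements and moves away from zero when a sensor fails. This is an illustrative toy, not the thesis's dynamic formulation.

        import numpy as np

        # Three sensors measuring two states: y = C x (+ noise)
        C = np.array([[1.0, 0.0],
                      [0.0, 1.0],
                      [1.0, 1.0]])

        # Parity (left null space) vector v such that v @ C = 0
        _, _, vt = np.linalg.svd(C.T)
        v = vt[-1]                     # last right-singular vector of C.T spans its null space
        assert np.allclose(v @ C, 0.0, atol=1e-12)

        x_true = np.array([2.0, 3.0])
        y_ok = C @ x_true
        y_fault = y_ok.copy()
        y_fault[2] += 1.5              # bias failure on sensor 3

        print("residual, healthy:", float(v @ y_ok))     # ~0
        print("residual, faulty :", float(v @ y_fault))  # clearly nonzero

    The robustness question addressed in the thesis is how to pick such parity relations so that modelling errors and noise perturb the residual as little as possible while failures still show up clearly.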

  17. Reassessing SERS enhancement factors: using thermodynamics to drive substrate design.

    PubMed

    Guicheteau, J A; Tripathi, A; Emmons, E D; Christesen, S D; Fountain, Augustus W

    2017-12-04

    Over the past 40 years fundamental and application research into Surface-Enhanced Raman Scattering (SERS) has been explored by academia, industry, and government laboratories. To date however, SERS has achieved little commercial success as an analytical technique. Researchers are tackling a variety of paths to help break through the commercial barrier by addressing the reproducibility in both the SERS substrates and SERS signals as well as continuing to explore the underlying mechanisms. To this end, investigators use a variety of methodologies, typically studying strongly binding analytes such as aromatic thiols and azarenes, and report SERS enhancement factor calculations. However a drawback of the traditional SERS enhancement factor calculation is that it does not yield enough information to understand substrate reproducibility, application potential with another analyte, or the driving factors behind the molecule-metal interaction. Our work at the US Army Edgewood Chemical Biological Center has focused on these questions and we have shown that thermodynamic principles play a key role in the SERS response and are an essential factor in future designs of substrates and applications. This work will discuss the advantages and disadvantages of various experimental techniques used to report SERS enhancement with planar SERS substrates and present our alternative SERS enhancement value. We will report on three types of analysis scenarios that all yield different information concerning the effectiveness of the SERS substrate, practical application of the substrate, and finally the thermodynamic properties of the substrate. We believe that through this work a greater understanding for designing substrates will be achieved, one that is based on both thermodynamic and plasmonic properties as opposed to just plasmonic properties. This new understanding and potential change in substrate design will enable more applications for SERS based methodologies including targeting
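    For reference, the "traditional" SERS enhancement factor that the authors critique is usually written as the SERS signal per adsorbed molecule divided by the normal Raman signal per molecule; this is the standard textbook form, quoted here for orientation only and not as a definition taken from this paper.

        \mathrm{EF} \;=\; \frac{I_{\mathrm{SERS}}/N_{\mathrm{SERS}}}{I_{\mathrm{NR}}/N_{\mathrm{NR}}}

    Here the I terms are measured intensities and the N terms the numbers of molecules contributing to the SERS and normal Raman (NR) signals; the difficulty of determining N_SERS reliably, which depends on the molecule-metal adsorption thermodynamics, is one source of the reproducibility problems discussed above.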

  18. Improving spacecraft design using a multidisciplinary design optimization methodology

    NASA Astrophysics Data System (ADS)

    Mosher, Todd Jon

    2000-10-01

    Spacecraft design has gone from maximizing performance under technology constraints to minimizing cost under performance constraints. This is characteristic of the "faster, better, cheaper" movement that has emerged within NASA. Currently spacecraft are "optimized" manually through a tool-assisted evaluation of a limited set of design alternatives. With this approach there is no guarantee that a systems-level focus will be taken, and "feasibility" rather than "optimality" is commonly all that is achieved. To improve spacecraft design in the "faster, better, cheaper" era, a new approach using multidisciplinary design optimization (MDO) is proposed. Using MDO methods brings structure to conceptual spacecraft design by casting a spacecraft design problem into an optimization framework. Then, through the construction of a model that captures design and cost, this approach facilitates a quicker and more straightforward option synthesis. The final step is to automatically search the design space. As computer processor speed continues to increase, enumeration of all combinations, while not elegant, is one method that is straightforward to perform. As an alternative to enumeration, genetic algorithms are used; they find solutions by evaluating fewer candidate designs, with some limitations. Both methods increase the likelihood of finding an optimal design, or at least the most promising area of the design space. This spacecraft design methodology using MDO is demonstrated on three examples. A retrospective test for validation is performed using the Near Earth Asteroid Rendezvous (NEAR) spacecraft design. For the second example, the premise that aerobraking was needed to minimize mission cost and was mission enabling for the Mars Global Surveyor (MGS) mission is challenged. While one might expect no feasible design space for an MGS mission without aerobraking, a counterintuitive result is discovered. Several design options that don't use aerobraking are feasible and cost
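    The brute-force enumeration step described above can be pictured with a tiny synthetic example: every combination of a few discrete design options is evaluated against a made-up cost model and a simple feasibility rule. The option lists, masses, and cost numbers are placeholders, not data from the NEAR or MGS studies.

        from itertools import product

        # Hypothetical discrete design options and their mass contributions (kg)
        propulsion = {"chemical": 900, "electric": 400}
        power      = {"small_array": 50, "large_array": 120}
        capture    = {"aerobraking": 30, "propulsive": 250}

        best = None
        for p, w, c in product(propulsion, power, capture):
            mass = propulsion[p] + power[w] + capture[c]
            cost = 0.12 * mass + (80 if c == "aerobraking" else 0)   # made-up risk penalty, $M
            feasible = not (p == "electric" and c == "aerobraking")  # toy compatibility rule
            if feasible and (best is None or cost < best[0]):
                best = (cost, p, w, c, mass)

        print("best feasible design:", best)

    A genetic algorithm would replace the exhaustive product() loop with selection, crossover, and mutation over the same encoded options, trading completeness for fewer evaluations.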

  19. A first principles based methodology for design of axial compressor configurations

    NASA Astrophysics Data System (ADS)

    Iyengar, Vishwas

    Axial compressors are widely used in many aerodynamic applications. The design of an axial compressor configuration presents many challenges. Until recently, compressor design was done using 2-D viscous flow analyses that solve the flow field around cascades or in meridional planes, or using 3-D inviscid analyses. With the advent of modern computational methods it is now possible to analyze the 3-D viscous flow and accurately predict the performance of 3-D multistage compressors. It is necessary to retool the design methodologies to take advantage of the improved accuracy and physical fidelity of these advanced methods. In this study, a first-principles based multi-objective technique for designing single stage compressors is described. The study accounts for stage aerodynamic characteristics, rotor-stator interactions and blade elastic deformations. A parametric representation of compressor blades that includes leading and trailing edge camber line angles, thickness and camber distributions was used in this study. A design of experiments approach is used to reduce the large combinations of design variables into a smaller subset. A response surface method is used to approximately map the output variables as a function of the design variables. An optimized configuration is determined as the extremum of all extrema. This method has been applied to a rotor-stator stage similar to NASA Stage 35. The study has two parts: a preliminary study where a limited number of design variables were used to give an understanding of the important design variables for subsequent use, and a comprehensive application of the methodology where a larger, more complete set of design variables is used. The extended methodology also attempts to minimize the acoustic fluctuations at the rotor-stator interface by considering a rotor-wake influence coefficient (RWIC). Results presented include performance map calculations at design and off-design speed along with a detailed visualization of the flow field
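    A minimal version of the DOE-plus-response-surface step described above is sketched below: a full quadratic surface in two normalized design variables (say, a camber angle and a thickness parameter) is fit by least squares to sampled objective values, and the fitted surface is then searched on a grid for its optimum. The sample function and variable names are invented for illustration and stand in for expensive CFD evaluations.

        import numpy as np

        def true_objective(x1, x2):
            # Stand-in for an expensive CFD evaluation (hypothetical efficiency surface)
            return 0.90 - 0.05 * (x1 - 0.3) ** 2 - 0.08 * (x2 + 0.2) ** 2

        # Design of experiments: a small factorial sample in normalized variables [-1, 1]
        pts = np.array([(a, b) for a in (-1, 0, 1) for b in (-1, 0, 1)], dtype=float)
        y = true_objective(pts[:, 0], pts[:, 1])

        # Full quadratic response surface: 1, x1, x2, x1^2, x2^2, x1*x2
        X = np.column_stack([np.ones(len(pts)), pts[:, 0], pts[:, 1],
                             pts[:, 0] ** 2, pts[:, 1] ** 2, pts[:, 0] * pts[:, 1]])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)

        # Cheap surrogate search over a fine grid
        g = np.linspace(-1, 1, 201)
        G1, G2 = np.meshgrid(g, g)
        surf = (coef[0] + coef[1] * G1 + coef[2] * G2 +
                coef[3] * G1**2 + coef[4] * G2**2 + coef[5] * G1 * G2)
        i = np.unravel_index(np.argmax(surf), surf.shape)
        print(f"surrogate optimum near x1={G1[i]:.2f}, x2={G2[i]:.2f}, value={surf[i]:.4f}")

    In the compressor study the same idea is applied to a much larger set of blade parameters, with the "extremum of all extrema" taken across the fitted surfaces.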

  20. Methodological issues with adaptation of clinical trial design.

    PubMed

    Hung, H M James; Wang, Sue-Jane; O'Neill, Robert T

    2006-01-01

    Adaptation of clinical trial design generates many issues that have not been resolved for practical applications, though statistical methodology has advanced greatly. This paper focuses on some methodological issues. In one type of adaptation such as sample size re-estimation, only the postulated value of a parameter for planning the trial size may be altered. In another type, the originally intended hypothesis for testing may be modified using the internal data accumulated at an interim time of the trial, such as changing the primary endpoint and dropping a treatment arm. For sample size re-estimation, we make a contrast between an adaptive test weighting the two-stage test statistics with the statistical information given by the original design and the original sample mean test with a properly corrected critical value. We point out the difficulty in planning a confirmatory trial based on the crude information generated by exploratory trials. In regards to selecting a primary endpoint, we argue that the selection process that allows switching from one endpoint to the other with the internal data of the trial is not very likely to gain a power advantage over the simple process of selecting one from the two endpoints by testing them with an equal split of alpha (Bonferroni adjustment). For dropping a treatment arm, distributing the remaining sample size of the discontinued arm to other treatment arms can substantially improve the statistical power of identifying a superior treatment arm in the design. A common difficult methodological issue is that of how to select an adaptation rule in the trial planning stage. Pre-specification of the adaptation rule is important for the practicality consideration. Changing the originally intended hypothesis for testing with the internal data generates great concerns to clinical trial researchers.
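    The simple alpha-splitting alternative mentioned above can be written in a few lines: each of the two candidate primary endpoints is tested at alpha/2, and the trial is declared positive if either test succeeds. The p-values below are placeholders chosen only to show the mechanics.

        # Bonferroni split of alpha across two candidate primary endpoints
        alpha = 0.05
        p_values = {"endpoint_A": 0.030, "endpoint_B": 0.012}   # hypothetical trial results

        significant = {name: p <= alpha / len(p_values) for name, p in p_values.items()}
        trial_positive = any(significant.values())

        print(significant)              # {'endpoint_A': False, 'endpoint_B': True}
        print("trial positive:", trial_positive)

    The paper's argument is that an adaptive rule that switches endpoints using interim data is unlikely to beat this simple, pre-specified split by much, while being considerably harder to justify.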

  1. Integral Design Methodology of Photocatalytic Reactors for Air Pollution Remediation.

    PubMed

    Passalía, Claudio; Alfano, Orlando M; Brandi, Rodolfo J

    2017-06-07

    An integral reactor design methodology was developed to address the optimal design of photocatalytic wall reactors to be used in air pollution control. For a target pollutant to be eliminated from an air stream, the proposed methodology is initiated with a mechanistic derived reaction rate. The determination of intrinsic kinetic parameters is associated with the use of a simple geometry laboratory scale reactor, operation under kinetic control and a uniform incident radiation flux, which allows computing the local superficial rate of photon absorption. Thus, a simple model can describe the mass balance and a solution may be obtained. The kinetic parameters may be estimated by the combination of the mathematical model and the experimental results. The validated intrinsic kinetics obtained may be directly used in the scaling-up of any reactor configuration and size. The bench scale reactor may require the use of complex computational software to obtain the fields of velocity, radiation absorption and species concentration. The complete methodology was successfully applied to the elimination of airborne formaldehyde. The kinetic parameters were determined in a flat plate reactor, whilst a bench scale corrugated wall reactor was used to illustrate the scaling-up methodology. In addition, an optimal folding angle of the corrugated reactor was found using computational fluid dynamics tools.

  2. Bayesian cross-entropy methodology for optimal design of validation experiments

    NASA Astrophysics Data System (ADS)

    Jiang, X.; Mahadevan, S.

    2006-07-01

    An important concern in the design of validation experiments is how to incorporate the mathematical model in the design in order to allow conclusive comparisons of model prediction with experimental output in model assessment. The classical experimental design methods are more suitable for phenomena discovery and may result in a subjective, expensive, time-consuming and ineffective design that may adversely impact these comparisons. In this paper, an integrated Bayesian cross-entropy methodology is proposed to perform the optimal design of validation experiments incorporating the computational model. The expected cross entropy, an information-theoretic distance between the distributions of model prediction and experimental observation, is defined as a utility function to measure the similarity of two distributions. A simulated annealing algorithm is used to find optimal values of input variables through minimizing or maximizing the expected cross entropy. The measured data after testing with the optimum input values are used to update the distribution of the experimental output using Bayes theorem. The procedure is repeated to adaptively design the required number of experiments for model assessment, each time ensuring that the experiment provides effective comparison for validation. The methodology is illustrated for the optimal design of validation experiments for a three-leg bolted joint structure and a composite helicopter rotor hub component.
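    To make the utility-function idea above concrete, the sketch below assumes that both the model prediction and the anticipated experimental output at a candidate input setting are Gaussian, uses the closed-form Kullback-Leibler divergence between them as the information-theoretic distance (a quantity closely related to the expected cross entropy used in the paper), and lets a simulated-annealing-style global optimizer pick the input. The response functions and noise levels are invented; the actual methodology applies this to a bolted-joint structure and a rotor hub component.

        import numpy as np
        from scipy.optimize import dual_annealing

        def kl_gauss(mu0, s0, mu1, s1):
            """KL divergence between N(mu0, s0^2) and N(mu1, s1^2)."""
            return np.log(s1 / s0) + (s0**2 + (mu0 - mu1) ** 2) / (2.0 * s1**2) - 0.5

        # Hypothetical model prediction and anticipated experimental response vs. input x
        def model_mean(x):      return 2.0 * x + 0.5 * x**2
        def experiment_mean(x): return 2.2 * x + 0.4 * x**2      # slightly different physics
        model_std, exp_std = 0.3, 0.5

        # Choose the input that maximizes the expected discrepancy (most informative test)
        objective = lambda x: -kl_gauss(model_mean(x[0]), model_std,
                                        experiment_mean(x[0]), exp_std)
        result = dual_annealing(objective, bounds=[(0.0, 5.0)], seed=3)
        print(f"most informative input x* = {result.x[0]:.2f}, KL = {-result.fun:.3f}")

    In the adaptive procedure described above, the measured data at the selected input would then update the output distribution via Bayes' theorem before the next experiment is designed.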

  3. Integrated structure/control design - Present methodology and future opportunities

    NASA Technical Reports Server (NTRS)

    Weisshaar, T. A.; Newsom, J. R.; Zeiler, T. A.; Gilbert, M. G.

    1986-01-01

    Attention is given to current methodology applied to the integration of the optimal design process for structures and controls. Multilevel linear decomposition techniques proved to be most effective in organizing the computational efforts necessary for ISCD (integrated structures and control design) tasks. With the development of large orbiting space structures and actively controlled, high performance aircraft, there will be more situations in which this concept can be applied.

  4. A broader consideration of human factor to enhance sustainable building design.

    PubMed

    Attaianese, Erminia

    2012-01-01

    The link between ergonomics/human factors and sustainability seems to be clearly evidenced mainly in relation to the social dimension of sustainability, in order to contribute to assuring corporate social responsibility and global value creation. But the will to establish an equilibrated connection among the resources used in human activities, supported by the sustainability perspective, shows that the contribution of ergonomics/human factors can be effectively extended to other aspects, especially in relation to building design. In fact, a sustainable building is meant to be a building that contributes, through its characteristics and attributes, to sustainable development by assuring, at the same time, a decrease in resource use and environmental impact and an increase in the health, safety and comfort of the occupants. The purpose of this paper is to analyze in a broader sense the contribution of ergonomics/human factors to the design of sustainable buildings, focusing on how ergonomics principles, methodology and techniques can improve building design, enhancing its sustainability performance during all phases of the building lifecycle.

  5. Thin Film Heat Flux Sensors: Design and Methodology

    NASA Technical Reports Server (NTRS)

    Fralick, Gustave C.; Wrbanek, John D.

    2013-01-01

    Thin Film Heat Flux Sensors: Design and Methodology: (1) Heat flux is one of a number of parameters, together with pressure, temperature, flow, etc., of interest to engine designers and fluid dynamicists; (2) the measurement of heat flux is of interest in directly determining the cooling requirements of hot section blades and vanes; and (3) in addition, if the surface and gas temperatures are known, the measurement of heat flux provides a value for the convective heat transfer coefficient that can be compared with the value provided by CFD codes.
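    Point (3) can be expressed in one line: with the measured heat flux and the gas and surface temperatures, the convective coefficient follows from Newton's law of cooling. The numbers below are arbitrary placeholders.

        # Newton's law of cooling: q = h * (T_gas - T_surface)  =>  h = q / (T_gas - T_surface)
        q = 250_000.0                     # measured heat flux, W/m^2 (hypothetical)
        t_gas, t_surf = 1600.0, 1100.0    # K
        h = q / (t_gas - t_surf)
        print(f"convective heat transfer coefficient h = {h:.0f} W/(m^2 K)")   # 500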

  6. ICS-II USA research design and methodology.

    PubMed

    Rana, H; Andersen, R M; Nakazono, T T; Davidson, P L

    1997-05-01

    The purpose of the WHO-sponsored International Collaborative Study of Oral Health Outcomes (ICS-II) was to provide policy-makers and researchers with detailed, reliable, and valid data on the oral health situation in their countries or regions, together with comparative data from other dental care delivery systems. ICS-II used a cross-sectional design with no explicit control groups or experimental interventions. A standardized methodology was developed and tested for collecting and analyzing epidemiological, sociocultural, economic, and delivery system data. Respondent information was obtained by household interviews, and clinical examinations were conducted by calibrated oral epidemiologists. Discussed are the sampling design characteristics for the USA research locations, response rates, sample sizes for interview and oral examination data, weighting procedures, and statistical methods. SUDAAN was used to adjust variance calculations, since complex sampling designs were used.

  7. Passive and semi-active heave compensator: Project design methodology and control strategies

    PubMed Central

    Cuellar Sanchez, William Humberto; Neto, André Benine; Fortaleza, Eugênio Libório Feitosa

    2017-01-01

    A heave compensator is a system that mitigates the transmission of heave movement from a vessel to equipment on the vessel. In the drilling industry, a heave compensator enables drilling in offshore environments. The heave compensator attenuates movement transmitted from the vessel to the drill string and drill bit, ensuring the security and efficiency of the offshore drilling process. Common types of heave compensators are passive, active and semi-active compensators. This article presents four main points. First, a bulk modulus analysis yields a simple condition to determine whether the bulk modulus can be neglected in the design of a hydropneumatic passive heave compensator. Second, a methodology to design passive heave compensators with the desired frequency response is presented. Third, four control methodologies for a semi-active heave compensator are tested and compared numerically. Lastly, we show experimental results obtained from a prototype built with the methodology developed to design the passive heave compensator. PMID:28813494

  8. Methodology to design a municipal solid waste generation and composition map: a case study.

    PubMed

    Gallardo, A; Carlos, M; Peris, M; Colomer, F J

    2014-11-01

    Municipal solid waste (MSW) management is an important task that local governments as well as private companies must take into account to protect human health and the environment and to preserve natural resources. To design an adequate MSW management plan, the first step consists in defining the waste generation and composition patterns of the town. As these patterns depend on several socio-economic factors, it is advisable to organize them beforehand. Moreover, the waste generation and composition patterns may vary around the town and over time. Generally, the data are not homogeneous around the city, as neither the number of inhabitants nor the economic activity is constant. Therefore, if all the information is shown in thematic maps, the final waste management decisions can be made more efficiently. The main aim of this paper is to present a structured methodology that allows local authorities or private companies who deal with MSW to design their own MSW management plan depending on the available data. According to these data, this paper proposes two ways of action: a direct way when detailed data are available and an indirect way when there is a lack of data and it is necessary to rely on bibliographic data. In any case, the amount of information needed is considerable. This paper combines the planning methodology with Geographic Information Systems to present the final results in thematic maps that make them easier to interpret. The proposed methodology is a useful preliminary tool for organizing the MSW collection routes, including the selective collection. To verify the methodology, it has been successfully applied to a Spanish town. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. Methodology to design a municipal solid waste generation and composition map: a case study.

    PubMed

    Gallardo, A; Carlos, M; Peris, M; Colomer, F J

    2015-02-01

    Municipal solid waste (MSW) management is an important task that local governments as well as private companies must take into account to protect human health and the environment and to preserve natural resources. To design an adequate MSW management plan, the first step consists in defining the waste generation and composition patterns of the town. As these patterns depend on several socio-economic factors, it is advisable to organize them beforehand. Moreover, the waste generation and composition patterns may vary around the town and over time. Generally, the data are not homogeneous around the city, as neither the number of inhabitants nor the economic activity is constant. Therefore, if all the information is shown in thematic maps, the final waste management decisions can be made more efficiently. The main aim of this paper is to present a structured methodology that allows local authorities or private companies who deal with MSW to design their own MSW management plan depending on the available data. According to these data, this paper proposes two ways of action: a direct way when detailed data are available and an indirect way when there is a lack of data and it is necessary to rely on bibliographic data. In any case, the amount of information needed is considerable. This paper combines the planning methodology with Geographic Information Systems to present the final results in thematic maps that make them easier to interpret. The proposed methodology is a useful preliminary tool for organizing the MSW collection routes, including the selective collection. To verify the methodology, it has been successfully applied to a Spanish town. Copyright © 2014 Elsevier Ltd. All rights reserved.
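    The "indirect way" described above, in which generation is estimated from bibliographic per-capita coefficients when detailed measurements are missing, can be sketched as a small table calculation; the districts, populations, and coefficients below are invented, and the result would normally be joined to district geometries in a GIS to produce the thematic map.

        import pandas as pd

        districts = pd.DataFrame({
            "district": ["Centre", "North", "Industrial"],
            "population": [12000, 8500, 3000],
            # Hypothetical per-capita generation from bibliographic sources, kg/person/day
            "kg_per_capita_day": [1.25, 1.10, 1.60],
            # Hypothetical organic fraction from literature for similar areas
            "organic_fraction": [0.42, 0.45, 0.30],
        })

        districts["msw_t_per_year"] = (districts["population"]
                                       * districts["kg_per_capita_day"] * 365 / 1000)
        districts["organic_t_per_year"] = (districts["msw_t_per_year"]
                                           * districts["organic_fraction"])
        print(districts[["district", "msw_t_per_year", "organic_t_per_year"]].round(1))

    District-level estimates of this kind are what the methodology then maps and uses to lay out collection routes, including the selective collection.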

  10. Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques

    DTIC Science & Technology

    2017-11-01

    ARL-TR-8225, NOV 2017, US Army Research Laboratory. Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques.

  11. Establishing equivalence: methodological progress in group-matching design and analysis.

    PubMed

    Kover, Sara T; Atwood, Amy K

    2013-01-01

    This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs used in behavioral research on cognition and language in neurodevelopmental disorders, including autism spectrum disorder, Fragile X syndrome, Down syndrome, and Williams syndrome. The limitations of relying on p values to establish group equivalence are discussed in the context of other existing methods: equivalence tests, propensity scores, and regression-based analyses. Our primary recommendation for advancing research on intellectual and developmental disabilities is the use of descriptive indices of adequate group matching: effect sizes (i.e., standardized mean differences) and variance ratios.
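    The two descriptive indices recommended above are easy to compute directly; the sketch below does so for two simulated groups (for example, a clinical group and a matched comparison group on a language measure). The data are simulated for illustration, and the informal reading of the output (standardized mean difference near zero and variance ratio near one indicating adequate matching) is only one possible interpretation.

        import numpy as np

        rng = np.random.default_rng(7)
        group_a = rng.normal(100.0, 15.0, 40)   # e.g., clinical group, simulated scores
        group_b = rng.normal(101.5, 13.0, 40)   # e.g., matched comparison group

        # Standardized mean difference (Cohen's d with pooled SD)
        n_a, n_b = len(group_a), len(group_b)
        pooled_sd = np.sqrt(((n_a - 1) * group_a.var(ddof=1) +
                             (n_b - 1) * group_b.var(ddof=1)) / (n_a + n_b - 2))
        d = (group_a.mean() - group_b.mean()) / pooled_sd

        # Variance ratio
        vr = group_a.var(ddof=1) / group_b.var(ddof=1)

        print(f"standardized mean difference d = {d:.2f}, variance ratio = {vr:.2f}")

    Reporting these indices alongside (or instead of) a nonsignificant p value is the core of the recommendation above.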

  12. Establishing Equivalence: Methodological Progress in Group-Matching Design and Analysis

    PubMed Central

    Kover, Sara T.; Atwood, Amy K.

    2017-01-01

    This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs utilized in behavioral research on cognition and language in neurodevelopmental disorders, including autism spectrum disorder, fragile X syndrome, Down syndrome, and Williams syndrome. The limitations of relying on p-values to establish group equivalence are discussed in the context of other existing methods: equivalence tests, propensity scores, and regression-based analyses. Our primary recommendation for advancing research on intellectual and developmental disabilities is the use of descriptive indices of adequate group matching: effect sizes (i.e., standardized mean differences) and variance ratios. PMID:23301899

  13. Development of a Design Methodology for Reconfigurable Flight Control Systems

    NASA Technical Reports Server (NTRS)

    Hess, Ronald A.; McLean, C.

    2000-01-01

    A methodology is presented for the design of flight control systems that exhibit stability and performance-robustness in the presence of actuator failures. The design is based upon two elements. The first element consists of a control law that will ensure at least stability in the presence of a class of actuator failures. This law is created by inner-loop, reduced-order, linear dynamic inversion, and outer-loop compensation based upon Quantitative Feedback Theory. The second element consists of adaptive compensators obtained from simple and approximate time-domain identification of the dynamics of the 'effective vehicle' with failed actuator(s). An example involving the lateral-directional control of a fighter aircraft is employed both to introduce the proposed methodology and to demonstrate its effectiveness and limitations.

  14. Acceptance testing for PACS: from methodology to design to implementation

    NASA Astrophysics Data System (ADS)

    Liu, Brent J.; Huang, H. K.

    2004-04-01

    Acceptance Testing (AT) is a crucial step in the implementation process of a PACS within a clinical environment. AT determines whether the PACS is ready for clinical use and marks the official sign off of the PACS product. Most PACS vendors have Acceptance Testing (AT) plans, however, these plans do not provide a complete and robust evaluation of the full system. In addition, different sites will have different special requirements that vendor AT plans do not cover. The purpose of this paper is to introduce a protocol for AT design and present case studies of AT performed on clinical PACS. A methodology is presented that includes identifying testing components within PACS, quality assurance for both functionality and performance, and technical testing focusing on key single points-of-failure within the PACS product. Tools and resources that provide assistance in performing AT are discussed. In addition, implementation of the AT within the clinical environment and the overall implementation timeline of the PACS process are presented. Finally, case studies of actual AT of clinical PACS performed in the healthcare environment will be reviewed. The methodology for designing and implementing a robust AT plan for PACS was documented and has been used in PACS acceptance tests in several sites. This methodology can be applied to any PACS and can be used as a validation for the PACS product being acquired by radiology departments and hospitals. A methodology for AT design and implementation was presented that can be applied to future PACS installations. A robust AT plan for a PACS installation can increase both the utilization and satisfaction of a successful implementation of a PACS product that benefits both vendor and customer.

  15. A Formal Methodology to Design and Deploy Dependable Wireless Sensor Networks

    PubMed Central

    Testa, Alessandro; Cinque, Marcello; Coronato, Antonio; Augusto, Juan Carlos

    2016-01-01

    Wireless Sensor Networks (WSNs) are being increasingly adopted in critical applications, where verifying the correct operation of sensor nodes is a major concern. Undesired events may undermine the mission of the WSNs. Hence, their effects need to be properly assessed before deployment, to obtain a good level of expected performance; and during the operation, in order to avoid dangerous unexpected results. In this paper, we propose a methodology that aims at assessing and improving the dependability level of WSNs by means of an event-based formal verification technique. The methodology includes a process to guide designers towards the realization of a dependable WSN and a tool (“ADVISES”) to simplify its adoption. The tool is applicable to homogeneous WSNs with static routing topologies. It allows the automatic generation of formal specifications used to check correctness properties and evaluate dependability metrics at design time and at runtime for WSNs where an acceptable percentage of faults can be defined. At runtime, we can check the behavior of the WSN according to the results obtained at design time and we can detect sudden and unexpected failures, in order to trigger recovery procedures. The effectiveness of the methodology is shown in the context of two case studies, as proof-of-concept, aiming to illustrate how the tool is helpful to drive design choices and to check the correctness properties of the WSN at runtime. Although the method scales up to very large WSNs, the applicability of the methodology may be compromised by the state space explosion of the reasoning model, which must be faced by partitioning large topologies into sub-topologies. PMID:28025568

  16. Application of an Integrated Methodology for Propulsion and Airframe Control Design to a STOVL Aircraft

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay; Mattern, Duane

    1994-01-01

    An advanced methodology for integrated flight propulsion control (IFPC) design for future aircraft, which will use propulsion system generated forces and moments for enhanced maneuver capabilities, is briefly described. This methodology has the potential to address in a systematic manner the coupling between the airframe and the propulsion subsystems typical of such enhanced maneuverability aircraft. Application of the methodology to a short take-off vertical landing (STOVL) aircraft in the landing approach to hover transition flight phase is presented with brief description of the various steps in the IFPC design methodology. The details of the individual steps have been described in previous publications and the objective of this paper is to focus on how the components of the control system designed at each step integrate into the overall IFPC system. The full nonlinear IFPC system was evaluated extensively in nonreal-time simulations as well as piloted simulations. Results from the nonreal-time evaluations are presented in this paper. Lessons learned from this application study are summarized in terms of areas of potential improvements in the STOVL IFPC design as well as identification of technology development areas to enhance the applicability of the proposed design methodology.

  17. Designing Trend-Monitoring Sounds for Helicopters: Methodological Issues and an Application

    ERIC Educational Resources Information Center

    Edworthy, Judy; Hellier, Elizabeth; Aldrich, Kirsteen; Loxley, Sarah

    2004-01-01

    This article explores methodological issues in sonification and sound design arising from the design of helicopter monitoring sounds. Six monitoring sounds (each with 5 levels) were tested for similarity and meaning with 3 different techniques: hierarchical cluster analysis, linkage analysis, and multidimensional scaling. In Experiment 1,…

  18. The telerobot workstation testbed for the shuttle aft flight deck: A project plan for integrating human factors into system design

    NASA Technical Reports Server (NTRS)

    Sauerwein, Timothy

    1989-01-01

    The human factors design process used in developing a shuttle orbiter aft flight deck workstation testbed is described. In developing an operator workstation to control various laboratory telerobots, strong elements of human factors engineering and ergonomics are integrated into the design process. The integration of human factors is performed by incorporating user feedback at key stages in the project life-cycle. An operator-centered design approach helps ensure that the system users work with the system designer in the design and operation of the system. The design methodology is presented along with the results of the design and the solutions regarding human factors design principles.

  19. Diversity of nursing student views about simulation design: a q-methodological study.

    PubMed

    Paige, Jane B; Morin, Karen H

    2015-05-01

    Education of future nurses benefits from well-designed simulation activities. Skillful teaching with simulation requires educators to be constantly aware of how students experience learning and perceive educators' actions. Because revision of simulation activities considers feedback elicited from students, it is crucial to understand the perspective from which students base their response. In a Q-methodological approach, 45 nursing students rank-ordered 60 opinion statements about simulation design into a distribution grid. Factor analysis revealed that nursing students hold five distinct and uniquely personal perspectives-Let Me Show You, Stand By Me, The Agony of Defeat, Let Me Think It Through, and I'm Engaging and So Should You. Results suggest that nurse educators need to reaffirm that students clearly understand the purpose of each simulation activity. Nurse educators should incorporate presimulation assignments to optimize learning and help allay anxiety. The five perspectives discovered in this study can serve as a tool to discern individual students' learning needs. Copyright 2015, SLACK Incorporated.

  20. A game-based decision support methodology for competitive systems design

    NASA Astrophysics Data System (ADS)

    Briceno, Simon Ignacio

    This dissertation describes the development of a game-based methodology that facilitates the exploration and selection of research and development (R&D) projects under uncertain competitive scenarios. The proposed method provides an approach that analyzes competitor positioning and formulates response strategies to forecast the impact of technical design choices on a project's market performance. A critical decision in the conceptual design phase of propulsion systems is the selection of the best architecture, centerline, core size, and technology portfolio. This selection can be challenging when considering evolving requirements from both the airframe manufacturing company and the airlines in the market. Furthermore, the exceedingly high cost of core architecture development and its associated risk makes this strategic architecture decision the most important one for an engine company. Traditional conceptual design processes emphasize performance and affordability as their main objectives. These areas alone however, do not provide decision-makers with enough information as to how successful their engine will be in a competitive market. A key objective of this research is to examine how firm characteristics such as their relative differences in completing R&D projects, differences in the degree of substitutability between different project types, and first/second-mover advantages affect their product development strategies. Several quantitative methods are investigated that analyze business and engineering strategies concurrently. In particular, formulations based on the well-established mathematical field of game theory are introduced to obtain insights into the project selection problem. The use of game theory is explored in this research as a method to assist the selection process of R&D projects in the presence of imperfect market information. The proposed methodology focuses on two influential factors: the schedule uncertainty of project completion times and
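
    As a minimal, generic illustration of the game-theoretic flavor of such a project-selection problem (not the dissertation's actual formulation), the sketch below enumerates pure-strategy Nash equilibria of a small two-firm R&D payoff bimatrix; the payoff numbers and the two-project choice set are invented for illustration.

        import numpy as np

        # Hypothetical payoff bimatrix for two firms each choosing one R&D project
        # (rows: firm A's projects, columns: firm B's projects).  Values are illustrative only.
        A = np.array([[4.0, 1.0],
                      [3.0, 2.0]])   # firm A's payoffs
        B = np.array([[2.0, 3.0],
                      [1.0, 4.0]])   # firm B's payoffs

        # A pure-strategy profile (i, j) is a Nash equilibrium if neither firm can
        # gain by unilaterally switching to another project.
        equilibria = [(i, j)
                      for i in range(A.shape[0])
                      for j in range(A.shape[1])
                      if A[i, j] >= A[:, j].max() and B[i, j] >= B[i, :].max()]

        print("Pure-strategy Nash equilibria:", equilibria)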

  1. Turbofan engine control system design using the LQG/LTR methodology

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay

    1989-01-01

    Application of the linear-quadratic-Gaussian with loop-transfer-recovery methodology to design of a control system for a simplified turbofan engine model is considered. The importance of properly scaling the plant to achieve the desired target feedback loop is emphasized. The steps involved in the application of the methodology are discussed via an example, and evaluation results are presented for a reduced-order compensator. The effect of scaling the plant on the stability robustness evaluation of the closed-loop system is studied in detail.

  2. Turbofan engine control system design using the LQG/LTR methodology

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay

    1989-01-01

    Application of the Linear-Quadratic-Gaussian with Loop-Transfer-Recovery methodology to design of a control system for a simplified turbofan engine model is considered. The importance of properly scaling the plant to achieve the desired Target-Feedback-Loop is emphasized. The steps involved in the application of the methodology are discussed via an example, and evaluation results are presented for a reduced-order compensator. The effect of scaling the plant on the stability robustness evaluation of the closed-loop system is studied in detail.
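
    The abstract stresses scaling the plant before shaping the target feedback loop. The short sketch below, a generic illustration rather than the engine model used in the paper, applies diagonal input/output scaling to a made-up two-by-two state-space plant and compares open-loop singular values before and after scaling.

        import numpy as np

        # Hypothetical 2-input/2-output plant x' = Ax + Bu, y = Cx (illustrative only).
        A = np.array([[-1.0, 0.5],
                      [ 0.0, -2.0]])
        B = np.array([[1.0, 0.0],
                      [0.0, 5.0]])
        C = np.array([[1.0, 0.0],
                      [0.0, 0.1]])

        # Diagonal scaling of inputs and outputs so all channels have comparable
        # magnitude (scalings here are assumed actuator/sensor ranges).
        Su = np.diag([1.0, 0.2])    # input scaling
        Sy = np.diag([1.0, 10.0])   # output scaling
        Bs, Cs = B @ Su, Sy @ C

        def sigma(A, B, C, w):
            """Singular values of the frequency response C (jwI - A)^-1 B."""
            G = C @ np.linalg.inv(1j * w * np.eye(A.shape[0]) - A) @ B
            return np.linalg.svd(G, compute_uv=False)

        for w in (0.1, 1.0, 10.0):
            print(f"w={w:5.1f}  unscaled {sigma(A, B, C, w)}  scaled {sigma(A, Bs, Cs, w)}")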

  3. 77 FR 50514 - Post-Approval Studies 2012 Workshop: Design, Methodology, and Role in Evidence Appraisal...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-21

    ...] Post-Approval Studies 2012 Workshop: Design, Methodology, and Role in Evidence Appraisal Throughout the... Administration (FDA) is announcing the following public workshop entitled ``Post-Approval Studies 2012 Workshop: Design, Methodology, and Role in Evidence Appraisal Throughout the Total Product Life Cycle.'' The topics...

  4. An Analysis of Factors that Inhibit Business Use of User-Centered Design Principles: A Delphi Study

    ERIC Educational Resources Information Center

    Hilton, Tod M.

    2010-01-01

    The use of user-centered design (UCD) principles has a positive impact on the use of web-based interactive systems in customer-centric organizations. User-centered design methodologies are not widely adopted in organizations due to intraorganizational factors. A qualitative study using a modified Delphi technique was used to identify the factors…

  5. Design, methodological issues and participation in a multiple sclerosis case-control study.

    PubMed

    Williamson, D M; Marrie, R A; Ashley-Koch, A; Schiffer, R; Trottier, J; Wagner, L

    2012-09-01

    This study was conducted to determine whether the risk of developing multiple sclerosis (MS) was associated with certain environmental exposures or genetic factors previously reported to influence MS risk. This paper describes the methodological issues, study design and characteristics of the study population. Individuals with definite MS were identified from a prevalence study conducted in three geographic areas. The target number of cases was not reached, so an additional study area was added. Identifying clinic controls was inefficient, so controls were recruited using random digit dialing. All study participants completed a detailed questionnaire regarding environmental exposures using computer-assisted telephone interviewing, and blood was collected for genetic analysis. In total, 276 cases and 590 controls participated, but participation rates were low, ranging from 28.4% to 38.9%. Only one-third (33.6%) of individuals identified in the prevalence study agreed to participate in the case-control study. Cases were more likely to be non-Hispanic white and older than their source populations as identified in the preceding prevalence study (P < 0.05). Most participants provided a blood sample for genotyping (91%; n = 789). Epidemiological studies play a key role in identifying genetic and environmental factors that are associated with complex diseases like MS. Methodological issues arise in every study, and investigators need to be able to detect, respond to and correct problems in a timely and scientifically valid manner. © 2011 John Wiley & Sons A/S.

  6. Information System Design Methodology Based on PERT/CPM Networking and Optimization Techniques.

    ERIC Educational Resources Information Center

    Bose, Anindya

    The dissertation attempts to demonstrate that the program evaluation and review technique (PERT)/Critical Path Method (CPM) or some modified version thereof can be developed into an information system design methodology. The methodology utilizes PERT/CPM which isolates the basic functional units of a system and sets them in a dynamic time/cost…

  7. Measuring Sense of Community: A Methodological Interpretation of the Factor Structure Debate

    ERIC Educational Resources Information Center

    Peterson, N. Andrew; Speer, Paul W.; Hughey, Joseph

    2006-01-01

    Instability in the factor structure of the Sense of Community Index (SCI) was tested as a methodological artifact. Confirmatory factor analyses, tested with two data sets, supported neither the proposed one-factor nor the four-factor (needs fulfillment, group membership, influence, and emotional connection) SCI. Results demonstrated that the SCI…

  8. Factors in Human-Computer Interface Design (A Pilot Study).

    DTIC Science & Technology

    1994-12-01

    This study used a pretest-posttest control group experimental design to test the effect of consistency on speed, retention, and user satisfaction. The overall methodology used different prototypes to test for features of the human-computer interface.

  9. Viability, Advantages and Design Methodologies of M-Learning Delivery

    ERIC Educational Resources Information Center

    Zabel, Todd W.

    2010-01-01

    The purpose of this study was to examine the viability and principle design methodologies of Mobile Learning models in developing regions. Demographic and market studies were utilized to determine the viability of M-Learning delivery as well as best uses for such technologies and methods given socioeconomic and political conditions within the…

  10. A multi-criteria decision aid methodology to design electric vehicles public charging networks

    NASA Astrophysics Data System (ADS)

    Raposo, João; Rodrigues, Ana; Silva, Carlos; Dentinho, Tomaz

    2015-05-01

    This article presents a new multi-criteria decision aid methodology, dynamic-PROMETHEE, here used to design electric vehicle charging networks. In applying this methodology to a Portuguese city, results suggest that it is effective in designing electric vehicle charging networks, generating time and policy based scenarios, considering supply and demand and the city's urban structure. Dynamic-PROMETHEE adds to PROMETHEE's already known characteristics other useful features, such as decision memory over time, versatility, and adaptability. The case study, used here to present the dynamic-PROMETHEE, served as inspiration and basis for creating this new methodology. It can be used to model different problems and scenarios that may present similar requirement characteristics.
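
    The dynamic-PROMETHEE variant itself is not spelled out here, but the PROMETHEE II ranking step it extends can be sketched in a few lines. The alternatives, criteria weights, and linear preference threshold below are illustrative assumptions, not values from the case study.

        import numpy as np

        # Hypothetical evaluation table: rows = candidate charging-network layouts,
        # columns = criteria (e.g. cost, coverage, grid impact), all to be maximized.
        X = np.array([[0.7, 0.9, 0.4],
                      [0.5, 0.6, 0.8],
                      [0.9, 0.3, 0.6]])
        w = np.array([0.5, 0.3, 0.2])   # criteria weights (assumed)
        p = 0.5                         # linear preference threshold (assumed)

        n = X.shape[0]
        # Linear preference function: P(d) = min(d / p, 1) for a positive difference d.
        pi = np.zeros((n, n))
        for a in range(n):
            for b in range(n):
                d = X[a] - X[b]
                pi[a, b] = np.sum(w * np.clip(d / p, 0.0, 1.0) * (d > 0))

        phi_plus = pi.sum(axis=1) / (n - 1)    # positive outranking flow
        phi_minus = pi.sum(axis=0) / (n - 1)   # negative outranking flow
        phi = phi_plus - phi_minus             # PROMETHEE II net flow
        print("Net flows:", phi, " ranking:", np.argsort(-phi))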

  11. A methodology for designing robust multivariable nonlinear control systems. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Grunberg, D. B.

    1986-01-01

    A new methodology is described for the design of nonlinear dynamic controllers for nonlinear multivariable systems providing guarantees of closed-loop stability, performance, and robustness. The methodology is an extension of the Linear-Quadratic-Gaussian with Loop-Transfer-Recovery (LQG/LTR) methodology for linear systems, thus hinging upon the idea of constructing an approximate inverse operator for the plant. A major feature of the methodology is a unification of both the state-space and input-output formulations. In addition, new results on stability theory, nonlinear state estimation, and optimal nonlinear regulator theory are presented, including the guaranteed global properties of the extended Kalman filter and optimal nonlinear regulators.

  12. Railroad classification yard design methodology study : East Deerfield Yard, a case study

    DOT National Transportation Integrated Search

    1980-02-01

    This interim report documents the application of a railroad classification yard design methodology to Boston and Maine's East Deerfield Yard Rehabilitation. This case study effort represents Phase 2 of a larger effort to develop a yard design methodol...

  13. A Methodology for Quantifying Certain Design Requirements During the Design Phase

    NASA Technical Reports Server (NTRS)

    Adams, Timothy; Rhodes, Russel

    2005-01-01

    A methodology for developing and balancing quantitative design requirements for safety, reliability, and maintainability has been proposed. Conceived as the basis of a more rational approach to the design of spacecraft, the methodology would also be applicable to the design of automobiles, washing machines, television receivers, or almost any other commercial product. Heretofore, it has been common practice to start by determining the requirements for reliability of elements of a spacecraft or other system to ensure a given design life for the system. Next, safety requirements are determined by assessing the total reliability of the system and adding redundant components and subsystems necessary to attain safety goals. As thus described, common practice leaves the maintainability burden to fall to chance; therefore, there is no control of recurring costs or of the responsiveness of the system. The means that have been used in assessing maintainability have been oriented toward determining the logistical sparing of components so that the components are available when needed. The process established for developing and balancing quantitative requirements for safety (S), reliability (R), and maintainability (M) derives and integrates NASA's top-level safety requirements and the controls needed to obtain program key objectives for safety and recurring cost. Being quantitative, the process conveniently uses common mathematical models. Even though the process is shown as being worked from the top down, it can also be worked from the bottom up. This process uses three math models: (1) the binomial distribution (greater-than-or-equal-to case), (2) reliability for a series system, and (3) the Poisson distribution (less-than-or-equal-to case). The zero-fail case for the binomial distribution approximates the commonly known exponential distribution or "constant failure rate" distribution. Either model can be used. The binomial distribution was selected for
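
    The three models named above (binomial for the greater-than-or-equal case, series-system reliability, and Poisson for the less-than-or-equal case) can be exercised with a minimal numerical sketch; the reliability figures and demand rate used here are invented.

        import numpy as np
        from scipy.stats import binom, poisson

        R = 0.99          # assumed reliability of one unit over the design life
        n_units = 4       # assumed number of identical units

        # (1) Binomial, greater-than-or-equal case: probability that at least 3 of
        #     the 4 units survive (the zero-fail case k = n reduces to R**n).
        p_at_least_3 = binom.sf(2, n_units, R)        # P(X >= 3), X ~ Bin(n, R)

        # (2) Series-system reliability: every element must work.
        R_series = np.prod([0.99, 0.995, 0.999])      # assumed element reliabilities

        # (3) Poisson, less-than-or-equal case: probability of at most 2 maintenance
        #     demands when the expected number over the mission is 1.5.
        p_at_most_2 = poisson.cdf(2, mu=1.5)

        print(p_at_least_3, R_series, p_at_most_2)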

  14. [Optimization of one-step pelletization technology of Biqiu granules by Plackett-Burman design and Box-Behnken response surface methodology].

    PubMed

    Zhang, Yan-jun; Liu, Li-li; Hu, Jun-hua; Wu, Yun; Chao, En-xiang; Xiao, Wei

    2015-11-01

    First, with the qualified rate of granules as the evaluation index, significant influencing factors were screened by Plackett-Burman design. Then, with the qualified rate and moisture content as the evaluation indexes, significant factors that affect the one-step pelletization technology were further optimized by Box-Behnken design; experimental data were fitted by multiple regression with a second-order polynomial equation; and response surface methodology was used for predictive analysis of the optimal technology. The best conditions were as follows: inlet air temperature of 85 degrees C, sample introduction speed of 33 r x min(-1), and density of the concentrate of 1.10. The one-step pelletization technology of Biqiu granules optimized by Plackett-Burman design and Box-Behnken response surface methodology was stable and feasible with good predictability, which provided a reliable basis for the industrialized production of Biqiu granules.
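
    As a generic illustration of the response-surface step described above (not the authors' data), the sketch below fits a full second-order model to a three-factor Box-Behnken layout by ordinary least squares; the coded factor settings and qualified-rate responses are invented.

        import numpy as np

        # Invented coded settings (-1, 0, +1) for three factors (e.g. inlet temperature,
        # feed rate, concentrate density): the 12 edge-midpoint runs of a 3-factor
        # Box-Behnken design plus 3 center points, with invented responses.
        X = np.array([[-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
                      [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
                      [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
                      [0, 0, 0], [0, 0, 0], [0, 0, 0]], dtype=float)
        y = np.array([82, 88, 84, 91, 80, 87, 83, 90, 81, 85, 84, 89, 93, 92, 94], float)

        def quad_terms(X):
            """Columns for a full second-order model: 1, xi, xi*xj, xi^2."""
            x1, x2, x3 = X.T
            return np.column_stack([np.ones(len(X)), x1, x2, x3,
                                    x1 * x2, x1 * x3, x2 * x3,
                                    x1 ** 2, x2 ** 2, x3 ** 2])

        beta, *_ = np.linalg.lstsq(quad_terms(X), y, rcond=None)
        print("fitted coefficients:", np.round(beta, 2))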

  15. Design Methodology for Multi-Element High-Lift Systems on Subsonic Civil Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Pepper, R. S.; vanDam, C. P.

    1996-01-01

    The choice of a high-lift system is crucial in the preliminary design process of a subsonic civil transport aircraft. Its purpose is to increase the allowable aircraft weight or decrease the aircraft's wing area for a given takeoff and landing performance. However, the implementation of a high-lift system into a design must be done carefully, for it can improve the aerodynamic performance of an aircraft but may also drastically increase the aircraft empty weight. If designed properly, a high-lift system can improve the cost effectiveness of an aircraft by increasing the payload weight for a given takeoff and landing performance. This is why the design methodology for a high-lift system should incorporate aerodynamic performance, weight, and cost. The airframe industry has experienced rapid technological growth in recent years which has led to significant advances in high-lift systems. For this reason many existing design methodologies have become obsolete since they are based on outdated low Reynolds number wind-tunnel data and can no longer accurately predict the aerodynamic characteristics or weight of current multi-element wings. Therefore, a new design methodology has been created that reflects current aerodynamic, weight, and cost data and provides enough flexibility to allow incorporation of new data when it becomes available.

  16. Application of Design Methodologies for Feedback Compensation Associated with Linear Systems

    NASA Technical Reports Server (NTRS)

    Smith, Monty J.

    1996-01-01

    The work that follows is concerned with the application of design methodologies for feedback compensation associated with linear systems. In general, the intent is to provide a well behaved closed loop system in terms of stability and robustness (internal signals remain bounded with a certain amount of uncertainty) and simultaneously achieve an acceptable level of performance. The approach here has been to convert the closed loop system and control synthesis problem into the interpolation setting. The interpolation formulation then serves as our mathematical representation of the design process. Lifting techniques have been used to solve the corresponding interpolation and control synthesis problems. Several applications using this multiobjective design methodology have been included to show the effectiveness of these techniques. In particular, the mixed H2/H∞ performance criterion and algorithm have been used on several examples including an F-18 HARV (High Angle of Attack Research Vehicle) for sensitivity performance.

  17. Development of Probabilistic Rigid Pavement Design Methodologies for Military Airfields.

    DTIC Science & Technology

    1983-12-01

    4A161102AT22, Task AO, Work Unit 009, "Methodology for Considering Material Variability in Pavement Design." OCE Project Monitor was Mr. S. S. Gillespie. Contents: Volume I, State of the Art - Variability of Airfield Pavement Materials; Volume II, Mathematical Formulation of...; Volume IV, Probabilistic Analysis of Rigid Airfield Design by Elastic Layered Theory.

  18. Design-Based Research: Is This a Suitable Methodology for Short-Term Projects?

    ERIC Educational Resources Information Center

    Pool, Jessica; Laubscher, Dorothy

    2016-01-01

    This article reports on a design-based methodology of a thesis in which a fully face-to-face contact module was converted into a blended learning course. The purpose of the article is to report on how design-based phases, in the form of micro-, meso- and macro-cycles were applied to improve practice and to generate design principles. Design-based…

  19. The design and methodology of premature ejaculation interventional studies

    PubMed Central

    2016-01-01

    Large well-designed clinical efficacy and safety randomized clinical trials (RCTs) are required to achieve regulatory approval of new drug treatments. The objective of this article is to make recommendations for the criteria for defining and selecting the clinical trial study population, design and efficacy outcomes measures which comprise ideal premature ejaculation (PE) interventional trial methodology. Data on clinical trial design, epidemiology, definitions, dimensions and psychological impact of PE was reviewed, critiqued and incorporated into a series of recommendations for standardisation of PE clinical trial design, outcome measures and reporting using the principles of evidence based medicine. Data from PE interventional studies are only reliable, interpretable and capable of being generalised to patients with PE, when study populations are defined by the International Society for Sexual Medicine (ISSM) multivariate definition of PE. PE intervention trials should employ a double-blind RCT methodology and include placebo control, active standard drug control, and/or dose comparison trials. Ejaculatory latency time (ELT) and subject/partner outcome measures of control, personal/partner/relationship distress and other study-specific outcome measures should be used as outcome measures. There is currently no published literature which identifies a clinically significant threshold response to intervention. The ISSM definition of PE reflects the contemporary understanding of PE and represents the state-of-the-art multi-dimensional definition of PE and is recommended as the basis of diagnosis of PE for all PE clinical trials. PMID:27652224

  20. Realizing improved patient care through human-centered operating room design: a human factors methodology for observing flow disruptions in the cardiothoracic operating room.

    PubMed

    Palmer, Gary; Abernathy, James H; Swinton, Greg; Allison, David; Greenstein, Joel; Shappell, Scott; Juang, Kevin; Reeves, Scott T

    2013-11-01

    Human factors engineering has allowed a systematic approach to the evaluation of adverse events in a multitude of high-stake industries. This study sought to develop an initial methodology for identifying and classifying flow disruptions in the cardiac operating room (OR). Two industrial engineers with expertise in human factors workflow disruptions observed 10 cardiac operations from the moment the patient entered the OR to the time they left for the intensive care unit. Each disruption was fully documented on an architectural layout of the OR suite and time-stamped during each phase of surgery (preoperative [before incision], operative [incision to skin closure], and postoperative [skin closure until the patient leaves the OR]) to synchronize flow disruptions between the two observers. These disruptions were then categorized. The two observers made a total of 1,158 observations. After the elimination of duplicate observations, a total of 1,080 observations remained to be analyzed. These disruptions were distributed into six categories: communication, usability, physical layout, environmental hazards, general interruptions, and equipment failures. They were further organized into 33 subcategories. The most common disruptions were related to OR layout and design (33%). By using the detailed architectural diagrams, the authors were able to clearly demonstrate for the first time the unique role that OR design and equipment layout has on the generation of physical layout flow disruptions. Most importantly, the authors have developed a robust taxonomy to describe the flow disruptions encountered in a cardiac OR, which can be used for future research and patient safety improvements.

  1. Analog design optimization methodology for ultralow-power circuits using intuitive inversion-level and saturation-level parameters

    NASA Astrophysics Data System (ADS)

    Eimori, Takahisa; Anami, Kenji; Yoshimatsu, Norifumi; Hasebe, Tetsuya; Murakami, Kazuaki

    2014-01-01

    A comprehensive design optimization methodology using intuitive nondimensional parameters of inversion-level and saturation-level is proposed, especially for ultralow-power, low-voltage, and high-performance analog circuits with mixed strong, moderate, and weak inversion metal-oxide-semiconductor transistor (MOST) operations. This methodology is based on the synthesized charge-based MOST model composed of Enz-Krummenacher-Vittoz (EKV) basic concepts and advanced-compact-model (ACM) physics-based equations. The key concept of this methodology is that all circuit and system characteristics are described as some multivariate functions of inversion-level parameters, where the inversion level is used as an independent variable representative of each MOST. The analog circuit design starts from the first step of inversion-level design using universal characteristics expressed by circuit currents and inversion-level parameters without process-dependent parameters, followed by the second step of foundry-process-dependent design and the last step of verification using saturation-level criteria. This methodology also paves the way to an intuitive and comprehensive design approach for many kinds of analog circuit specifications by optimization using inversion-level log-scale diagrams and saturation-level criteria. In this paper, we introduce an example of our design methodology for a two-stage Miller amplifier.
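
    As a rough numerical companion to the inversion-level idea (not the authors' synthesized charge-based model), the sketch below computes an inversion coefficient from a specific current and applies one commonly quoted EKV-style gm/ID interpolation; the technology constants and the interpolation formula itself should be treated as assumptions.

        import numpy as np

        # Placeholder technology constants (illustrative only).
        n_slope = 1.3            # subthreshold slope factor
        UT = 0.0259              # thermal voltage at ~300 K [V]
        mu_Cox = 250e-6          # mobility * Cox [A/V^2] (assumed)
        W_over_L = 20.0          # transistor aspect ratio (assumed)

        I_spec = 2 * n_slope * mu_Cox * W_over_L * UT ** 2   # specific current
        for ID in (50e-9, 1e-6, 50e-6):                      # drain currents [A]
            IC = ID / I_spec                                  # inversion coefficient
            # One common EKV-style interpolation for gm/ID across weak, moderate,
            # and strong inversion (an assumption here, not taken from the paper):
            gm_over_ID = 1.0 / (n_slope * UT * (0.5 + np.sqrt(0.25 + IC)))
            print(f"ID={ID:.2e} A  IC={IC:6.3f}  gm/ID={gm_over_ID:5.1f} 1/V")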

  2. [Optimization of the process of icariin hydrolyzed to Baohuoside I by cellulase based on Plackett-Burman design combined with CCD response surface methodology].

    PubMed

    Song, Chuan-xia; Chen, Hong-mei; Dai, Yu; Kang, Min; Hu, Jia; Deng, Yun

    2014-11-01

    To optimize the process of icariin hydrolyzed to Baohuoside I by cellulase using Plackett-Burman design combined with central composite design (CCD) response surface methodology. The main influencing factors were selected by Plackett-Burman design, and CCD response surface methodology was then used to optimize the hydrolysis process. Taking substrate concentration, buffer pH, and reaction time as independent variables, with the conversion rate of icariin as the dependent variable, a complete quadratic response surface was fitted by regression between the independent and dependent variables; the optimum process was intuitively analyzed by 3D surface charts, and verification tests and predictive analysis were carried out. The best enzymatic hydrolysis conditions were as follows: substrate concentration 8.23 mg/mL, buffer pH 5.12, reaction time 35.34 h. The optimum process of icariin hydrolyzed to Baohuoside I by cellulase was determined by Plackett-Burman design combined with CCD response surface methodology. The optimized enzymatic hydrolysis process is simple, convenient, accurate, reproducible, and predictable.
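
    The final prediction step, locating the stationary point of a fitted second-order response surface, can be sketched as a small linear-algebra calculation; the quadratic coefficients below are invented for illustration and are not the paper's fit.

        import numpy as np

        # Invented second-order fit in coded variables x = (substrate conc., pH, time):
        # y = b0 + b.x + x' B x, with B symmetric (illustrative coefficients only).
        b0 = 90.0
        b = np.array([1.2, -0.8, 0.5])
        B = np.array([[-2.0,  0.3,  0.1],
                      [ 0.3, -1.5,  0.2],
                      [ 0.1,  0.2, -1.0]])

        # Stationary point of the quadratic: grad y = b + 2 B x = 0  =>  x* = -0.5 B^-1 b
        x_star = -0.5 * np.linalg.solve(B, b)
        y_star = b0 + b @ x_star + x_star @ B @ x_star
        print("coded optimum:", np.round(x_star, 3), " predicted response:", round(y_star, 2))

        # A negative-definite B (all eigenvalues < 0) confirms x* is a maximum.
        print("eigenvalues of B:", np.round(np.linalg.eigvalsh(B), 3))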

  3. Advanced Design Methodology for Robust Aircraft Sizing and Synthesis

    NASA Technical Reports Server (NTRS)

    Mavris, Dimitri N.

    1997-01-01

    Contract efforts are focused on refining the Robust Design Methodology for Conceptual Aircraft Design. Robust Design Simulation (RDS) was developed earlier as a potential solution to the need to perform rapid trade-offs while accounting for risk, conflict, and uncertainty. The core of the simulation revolved around Response Surface Equations as approximations of bounded design spaces. An ongoing investigation is concerned with the advantages of using Neural Networks in conceptual design. Thought was also given to the development of a systematic way to choose or create a baseline configuration based on specific mission requirements. An expert system was developed that selects aerodynamics, performance, and weights models from several configurations based on the user's mission requirements for a subsonic civil transport. The research has also resulted in a step-by-step illustration of how to use the AMV method for distribution generation and the search for robust design solutions to multivariate constrained problems.

  4. Design Methodology of a Dual-Halbach Array Linear Actuator with Thermal-Electromagnetic Coupling

    PubMed Central

    Eckert, Paulo Roberto; Flores Filho, Aly Ferreira; Perondi, Eduardo; Ferri, Jeferson; Goltz, Evandro

    2016-01-01

    This paper proposes a design methodology for linear actuators, considering thermal and electromagnetic coupling with geometrical and temperature constraints, that maximizes force density and minimizes force ripple. The method allows defining an actuator for given specifications in a step-by-step way so that requirements are met and the temperature within the device is maintained under or equal to its maximum allowed for continuous operation. According to the proposed method, the electromagnetic and thermal models are built with quasi-static parametric finite element models. The methodology was successfully applied to the design of a linear cylindrical actuator with a dual quasi-Halbach array of permanent magnets and a moving-coil. The actuator can produce an axial force of 120 N and a stroke of 80 mm. The paper also presents a comparative analysis between results obtained considering only an electromagnetic model and the thermal-electromagnetic coupled model. This comparison shows that the final designs for both cases differ significantly, especially regarding its active volume and its electrical and magnetic loading. Although in this paper the methodology was employed to design a specific actuator, its structure can be used to design a wide range of linear devices if the parametric models are adjusted for each particular actuator. PMID:26978370

  5. Design Methodology of a Dual-Halbach Array Linear Actuator with Thermal-Electromagnetic Coupling.

    PubMed

    Eckert, Paulo Roberto; Flores Filho, Aly Ferreira; Perondi, Eduardo; Ferri, Jeferson; Goltz, Evandro

    2016-03-11

    This paper proposes a design methodology for linear actuators, considering thermal and electromagnetic coupling with geometrical and temperature constraints, that maximizes force density and minimizes force ripple. The method allows defining an actuator for given specifications in a step-by-step way so that requirements are met and the temperature within the device is maintained under or equal to its maximum allowed for continuous operation. According to the proposed method, the electromagnetic and thermal models are built with quasi-static parametric finite element models. The methodology was successfully applied to the design of a linear cylindrical actuator with a dual quasi-Halbach array of permanent magnets and a moving-coil. The actuator can produce an axial force of 120 N and a stroke of 80 mm. The paper also presents a comparative analysis between results obtained considering only an electromagnetic model and the thermal-electromagnetic coupled model. This comparison shows that the final designs for both cases differ significantly, especially regarding its active volume and its electrical and magnetic loading. Although in this paper the methodology was employed to design a specific actuator, its structure can be used to design a wide range of linear devices if the parametric models are adjusted for each particular actuator.

  6. A computer simulator for development of engineering system design methodologies

    NASA Technical Reports Server (NTRS)

    Padula, S. L.; Sobieszczanski-Sobieski, J.

    1987-01-01

    A computer program designed to simulate and improve engineering system design methodology is described. The simulator mimics the qualitative behavior and data couplings occurring among the subsystems of a complex engineering system. It eliminates the engineering analyses in the subsystems by replacing them with judiciously chosen analytical functions. With the cost of analysis eliminated, the simulator is used for experimentation with a large variety of candidate algorithms for multilevel design optimization to choose the best ones for the actual application. Thus, the simulator serves as a development tool for multilevel design optimization strategy. The simulator concept, implementation, and status are described and illustrated with examples.

  7. A Robust Design Methodology for Optimal Microscale Secondary Flow Control in Compact Inlet Diffusers

    NASA Technical Reports Server (NTRS)

    Anderson, Bernhard H.; Keller, Dennis J.

    2001-01-01

    It is the purpose of this study to develop an economical Robust design methodology for microscale secondary flow control in compact inlet diffusers. To illustrate the potential of economical Robust Design methodology, two different mission strategies were considered for the subject inlet, namely Maximum Performance and Maximum HCF Life Expectancy. The Maximum Performance mission maximized total pressure recovery while the Maximum HCF Life Expectancy mission minimized the mean of the first five Fourier harmonic amplitudes, i.e., 'collectively' reduced all the harmonic 1/2 amplitudes of engine face distortion. Each of the mission strategies was subject to a low engine face distortion constraint, i.e., DC60<0.10, which is a level acceptable for commercial engines. For each of these mission strategies, an 'Optimal Robust' (open loop control) and an 'Optimal Adaptive' (closed loop control) installation was designed over a twenty degree angle-of-incidence range. The Optimal Robust installation used economical Robust Design methodology to arrive at a single design which operated over the entire angle-of-incidence range (open loop control). The Optimal Adaptive installation optimized all the design parameters at each angle-of-incidence. Thus, the Optimal Adaptive installation would require a closed loop control system to sense a proper signal for each effector and modify that effector device, whether mechanical or fluidic, for optimal inlet performance. In general, the performance differences between the Optimal Adaptive and Optimal Robust installation designs were found to be marginal. This suggests, however, that Optimal Robust open loop installation designs can be very competitive with Optimal Adaptive closed loop designs. Secondary flow control in inlets is inherently robust, provided it is optimally designed. Therefore, the new methodology presented in this paper, a combined-array 'Lower Order' approach to Robust DOE, offers the aerodynamicist a very viable and

  8. Rating the methodological quality of single-subject designs and n-of-1 trials: introducing the Single-Case Experimental Design (SCED) Scale.

    PubMed

    Tate, Robyn L; McDonald, Skye; Perdices, Michael; Togher, Leanne; Schultz, Regina; Savage, Sharon

    2008-08-01

    Rating scales that assess methodological quality of clinical trials provide a means to critically appraise the literature. Scales are currently available to rate randomised and non-randomised controlled trials, but there are none that assess single-subject designs. The Single-Case Experimental Design (SCED) Scale was developed for this purpose and evaluated for reliability. Six clinical researchers who were trained and experienced in rating methodological quality of clinical trials developed the scale and participated in reliability studies. The SCED Scale is an 11-item rating scale for single-subject designs, of which 10 items are used to assess methodological quality and use of statistical analysis. The scale was developed and refined over a 3-year period. Content validity was addressed by identifying items to reduce the main sources of bias in single-case methodology as stipulated by authorities in the field, which were empirically tested against 85 published reports. Inter-rater reliability was assessed using a random sample of 20/312 single-subject reports archived in the Psychological Database of Brain Impairment Treatment Efficacy (PsycBITE). Inter-rater reliability for the total score was excellent, both for individual raters (overall ICC = 0.84; 95% confidence interval 0.73-0.92) and for consensus ratings between pairs of raters (overall ICC = 0.88; 95% confidence interval 0.78-0.95). Item reliability was fair to excellent for consensus ratings between pairs of raters (range k = 0.48 to 1.00). The results were replicated with two independent novice raters who were trained in the use of the scale (ICC = 0.88, 95% confidence interval 0.73-0.95). The SCED Scale thus provides a brief and valid evaluation of methodological quality of single-subject designs, with the total score demonstrating excellent inter-rater reliability using both individual and consensus ratings. Items from the scale can also be used as a checklist in the design, reporting and critical

  9. Analysis and Design of Power Factor Pre-Regulator Based on a Symmetrical Charge Pump Circuit Applied to Electronic Ballast

    NASA Astrophysics Data System (ADS)

    Lazcano Olea, Miguel; Ramos Astudillo, Reynaldo; Sanhueza Robles, René; Rodriguez Rubke, Leopoldo; Ruiz-Caballero, Domingo Antonio

    This paper presents the analysis and design of a power factor pre-regulator based on a symmetrical charge pump circuit applied to electronic ballast. The operation stages of the circuit are analyzed and its main design equations are obtained. Simulation and experimental results are presented in order to show the design methodology feasibility.

  10. Cyber-Informed Engineering: The Need for a New Risk Informed and Design Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Price, Joseph Daniel; Anderson, Robert Stephen

    Current engineering and risk management methodologies do not contain the foundational assumptions required to address the intelligent adversary’s capabilities in malevolent cyber attacks. Current methodologies focus on equipment failures or human error as initiating events for a hazard, while cyber attacks use the functionality of a trusted system to perform operations outside of the intended design and without the operator’s knowledge. These threats can by-pass or manipulate traditionally engineered safety barriers and present false information, invalidating the fundamental basis of a safety analysis. Cyber threats must be fundamentally analyzed from a completely new perspective where neither equipment nor human operation can be fully trusted. A new risk analysis and design methodology needs to be developed to address this rapidly evolving threatscape.

  11. Experimental validation of an integrated controls-structures design methodology for a class of flexible space structures

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Elliott, Kenny B.; Joshi, Suresh M.; Walz, Joseph E.

    1994-01-01

    This paper describes the first experimental validation of an optimization-based integrated controls-structures design methodology for a class of flexible space structures. The Controls-Structures-Interaction (CSI) Evolutionary Model, a laboratory test bed at Langley, is redesigned based on the integrated design methodology with two different dissipative control strategies. The redesigned structure is fabricated, assembled in the laboratory, and experimentally compared with the original test structure. Design guides are proposed and used in the integrated design process to ensure that the resulting structure can be fabricated. Experimental results indicate that the integrated design requires greater than 60 percent less average control power (by thruster actuators) than the conventional control-optimized design while maintaining the required line-of-sight performance, thereby confirming the analytical findings about the superiority of the integrated design methodology. Amenability of the integrated design structure to other control strategies is considered and evaluated analytically and experimentally. This work also demonstrates the capabilities of the Langley-developed design tool CSI DESIGN which provides a unified environment for structural and control design.

  12. Methodology for Sensitivity Analysis, Approximate Analysis, and Design Optimization in CFD for Multidisciplinary Applications

    NASA Technical Reports Server (NTRS)

    Taylor, Arthur C., III; Hou, Gene W.

    1996-01-01

    An incremental iterative formulation, together with the well-known spatially split approximate-factorization algorithm, is presented for solving the large, sparse systems of linear equations that are associated with aerodynamic sensitivity analysis. This formulation is also known as the 'delta' or 'correction' form. For the smaller two-dimensional problems, a direct method can be applied to solve these linear equations in either the standard or the incremental form, in which case the two are equivalent. However, iterative methods are needed for larger two-dimensional and three-dimensional applications because direct methods require more computer memory than is currently available. Iterative methods for solving these equations in the standard form are generally unsatisfactory due to an ill-conditioned coefficient matrix; this problem is overcome when these equations are cast in the incremental form. The methodology is successfully implemented and tested using an upwind cell-centered finite-volume formulation applied in two dimensions to the thin-layer Navier-Stokes equations for external flow over an airfoil. In three dimensions this methodology is demonstrated with a marching-solution algorithm for the Euler equations to calculate supersonic flow over the High-Speed Civil Transport configuration (HSCT 24E). The sensitivity derivatives obtained with the incremental iterative method from a marching Euler code are used in a design-improvement study of the HSCT configuration that involves thickness, camber, and planform design variables.
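
    For reference, the standard and incremental ("delta" or "correction") forms of the linear sensitivity system described above can be written generically (the notation here is generic, not the paper's exact symbols): the standard form solves A x = b directly, while the incremental form iterates

        \tilde{A}\,\Delta x^{(k)} = b - A\,x^{(k)}, \qquad x^{(k+1)} = x^{(k)} + \Delta x^{(k)},

    where \tilde{A} \approx A is the approximately factored operator; at convergence the residual b - A x^{(k)} vanishes, so the converged solution satisfies the exact equations regardless of the approximation used on the left-hand side.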

  13. New Methods in Design Education: The Systemic Methodology and the Use of Sketch in the Conceptual Design Stage

    ERIC Educational Resources Information Center

    Westermeyer, Juan Carlos Briede; Ortuno, Bernabe Hernandis

    2011-01-01

    This study describes the application of a new concurrent product design methodology in the context of industrial design education. The sketch has long been utilized as a tool of creative expression, especially in the conceptual design stage, in an intuitive way and somewhat removed from the context of the real needs that the…

  14. Architecture, Design, and System; Performance Assessment and Development Methodology for Computer-Based Systems. Volume 1. Methodology Description, Discussion, and Assessment,

    DTIC Science & Technology

    1983-12-30


  15. A human factors methodology for real-time support applications

    NASA Technical Reports Server (NTRS)

    Murphy, E. D.; Vanbalen, P. M.; Mitchell, C. M.

    1983-01-01

    A general approach to the human factors (HF) analysis of new or existing projects at NASA/Goddard is delineated. Because the methodology evolved from HF evaluations of the Mission Planning Terminal (MPT) and the Earth Radiation Budget Satellite Mission Operations Room (ERBS MOR), it is directed specifically to the HF analysis of real-time support applications. Major topics included for discussion are the process of establishing a working relationship between the Human Factors Group (HFG) and the project, orientation of HF analysts to the project, human factors analysis and review, and coordination with major cycles of system development. Sub-topics include specific areas for analysis and appropriate HF tools. Management support functions are outlined. References provide a guide to sources of further information.

  16. Hybrid PV/diesel solar power system design using multi-level factor analysis optimization

    NASA Astrophysics Data System (ADS)

    Drake, Joshua P.

    Solar power systems represent a large area of interest across a spectrum of organizations at a global level. It was determined that a clear understanding of current state of the art software and design methods, as well as optimization methods, could be used to improve the design methodology. Solar power design literature was researched for an in depth understanding of solar power system design methods and algorithms. Multiple software packages for the design and optimization of solar power systems were analyzed for a critical understanding of their design workflow. In addition, several methods of optimization were studied, including brute force, Pareto analysis, Monte Carlo, linear and nonlinear programming, and multi-way factor analysis. Factor analysis was selected as the most efficient optimization method for engineering design as it applied to solar power system design. The solar power design algorithms, software work flow analysis, and factor analysis optimization were combined to develop a solar power system design optimization software package called FireDrake. This software was used for the design of multiple solar power systems in conjunction with an energy audit case study performed in seven Tibetan refugee camps located in Mainpat, India. A report of solar system designs for the camps, as well as a proposed schedule for future installations was generated. It was determined that there were several improvements that could be made to the state of the art in modern solar power system design, though the complexity of current applications is significant.

  17. A methodology for hypersonic transport technology planning

    NASA Technical Reports Server (NTRS)

    Repic, E. M.; Olson, G. A.; Milliken, R. J.

    1973-01-01

    A systematic procedure by which the relative economic value of technology factors affecting design, configuration, and operation of a hypersonic cruise transport can be evaluated is discussed. Use of the methodology results in identification of first-order economic gains potentially achievable by projected advances in each of the definable, hypersonic technologies. Starting with a baseline vehicle, the formulas, procedures and forms which are integral parts of this methodology are developed. A demonstration of the methodology is presented for one specific hypersonic vehicle system.

  18. Autism genetics: Methodological issues and experimental design.

    PubMed

    Sacco, Roberto; Lintas, Carla; Persico, Antonio M

    2015-10-01

    Autism is a complex neuropsychiatric disorder of developmental origin, where multiple genetic and environmental factors likely interact resulting in a clinical continuum between "affected" and "unaffected" individuals in the general population. During the last two decades, relevant progress has been made in identifying chromosomal regions and genes in linkage or association with autism, but no single gene has emerged as a major cause of disease in a large number of patients. The purpose of this paper is to discuss specific methodological issues and experimental strategies in autism genetic research, based on fourteen years of experience in patient recruitment and association studies of autism spectrum disorder in Italy.

  19. The multi-copy simultaneous search methodology: a fundamental tool for structure-based drug design.

    PubMed

    Schubert, Christian R; Stultz, Collin M

    2009-08-01

    Fragment-based ligand design approaches, such as the multi-copy simultaneous search (MCSS) methodology, have proven to be useful tools in the search for novel therapeutic compounds that bind pre-specified targets of known structure. MCSS offers a variety of advantages over more traditional high-throughput screening methods, and has been applied successfully to challenging targets. The methodology is quite general and can be used to construct functionality maps for proteins, DNA, and RNA. In this review, we describe the main aspects of the MCSS method and outline the general use of the methodology as a fundamental tool to guide the design of de novo lead compounds. We focus our discussion on the evaluation of MCSS results and the incorporation of protein flexibility into the methodology. In addition, we demonstrate on several specific examples how the information arising from the MCSS functionality maps has been successfully used to predict ligand binding to protein targets and RNA.

  20. A design methodology of magnetorheological fluid damper using Herschel-Bulkley model

    NASA Astrophysics Data System (ADS)

    Liao, Linqing; Liao, Changrong; Cao, Jianguo; Fu, L. J.

    2003-09-01

    Magnetorheological fluid (MR fluid) is a highly concentrated suspension of very small magnetic particles in organic oil. The essential behavior of MR fluid is its ability to reversibly change from a free-flowing, linear viscous liquid to a semi-solid having controllable yield strength in milliseconds when exposed to a magnetic field. This feature provides simple, quiet, rapid-response interfaces between electronic controls and mechanical systems. In this paper, a mini-bus MR fluid damper based on the plate Poiseuille flow mode is analyzed using the Herschel-Bulkley model, which can account for post-yield shear thinning or thickening under the quasi-steady flow condition. For various values of the flow behavior index, the influences of post-yield shear thinning or thickening on the flow velocity profiles of MR fluid in the annular damping orifice are examined numerically. Analytical damping coefficient predictions are also compared between the nonlinear Bingham plastic model and the Herschel-Bulkley constitutive model. An MR fluid damper, which was designed and fabricated according to the design method presented in this paper, was tested with an electro-hydraulic servo vibrator and its control system at the National Center for Test and Supervision of Coach Quality. The experimental results reveal that the analysis methodology and design theory are reasonable and that MR fluid dampers can be designed according to the proposed design methodology.
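
    For context, the Herschel-Bulkley constitutive law referred to in this abstract is commonly written (in a generic one-dimensional form, not the paper's notation) as

        \tau = \tau_y + K\,\dot{\gamma}^{\,m} \quad \text{for } |\tau| > \tau_y, \qquad \dot{\gamma} = 0 \quad \text{otherwise},

    where \tau_y is the field-dependent yield stress, K the consistency, and m the flow behavior index; m < 1 gives post-yield shear thinning, m = 1 recovers the Bingham plastic model, and m > 1 gives shear thickening.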

  1. Unshrouded Centrifugal Turbopump Impeller Design Methodology

    NASA Technical Reports Server (NTRS)

    Prueger, George H.; Williams, Morgan; Chen, Wei-Chung; Paris, John; Williams, Robert; Stewart, Eric

    2001-01-01

    Turbopump weight continues to be a dominant parameter in the trade space for reduction of engine weight. The Space Shuttle Main Engine weight distribution indicates that the turbomachinery makes up approximately 30% of the total engine weight. Weight reduction can be achieved through reduction of the envelope of the turbopump. Reduction in envelope relates to an increase in turbopump speed and an increase in impeller head coefficient. Speed can be increased until suction performance limits are reached on the pump, or until other constraints such as the turbine or bearings limit speed. Once the speed of the turbopump is set, the impeller tip speed sets the minimum head coefficient of the machine. To reduce impeller diameter, the head coefficient must be increased. A significant limitation with increasing head coefficient is that the slope of the head-flow characteristic is affected and this can limit engine throttling range. Unshrouded impellers offer a design option for increased turbopump speed without increasing the impeller head coefficient. However, there are several issues with regard to using an unshrouded impeller: there is a pump performance penalty due to the front open face recirculation flow, there is a potential pump axial thrust problem from the unbalanced front open face and the back shroud face, and since test data are very limited for this configuration, there is uncertainty in the magnitude and phase of the rotordynamic forces due to the front impeller passage. The purpose of the paper is to discuss the design of an unshrouded impeller and to examine the hydrodynamic performance, axial thrust, and rotordynamic performance. The design methodology will also be discussed. This work will help provide some guidelines for unshrouded impeller design.

  2. A cost-effective methodology for the design of massively-parallel VLSI functional units

    NASA Technical Reports Server (NTRS)

    Venkateswaran, N.; Sriram, G.; Desouza, J.

    1993-01-01

    In this paper we propose a generalized methodology for the design of cost-effective massively-parallel VLSI Functional Units. This methodology is based on a technique of generating and reducing a massive bit-array on the mask-programmable PAcube VLSI array. This methodology unifies (maintains identical data flow and control) the execution of complex arithmetic functions on PAcube arrays. It is highly regular, expandable and uniform with respect to problem-size and wordlength, thereby reducing the communication complexity. The memory-functional unit interface is regular and expandable. Using this technique, functional units of dedicated processors can be mask-programmed on the naked PAcube arrays, reducing the turn-around time. The production cost of such dedicated processors can be drastically reduced since the naked PAcube arrays can be mass-produced. Analysis of the performance of functional units designed by our method yields promising results.

  3. Three-dimensional viscous design methodology for advanced technology aircraft supersonic inlet systems

    NASA Technical Reports Server (NTRS)

    Anderson, B. H.

    1983-01-01

    A broad program to develop advanced, reliable, and user oriented three-dimensional viscous design techniques for supersonic inlet systems, and encourage their transfer into the general user community is discussed. Features of the program include: (1) develop effective methods of computing three-dimensional flows within a zonal modeling methodology; (2) ensure reasonable agreement between said analysis and selective sets of benchmark validation data; (3) develop user orientation into said analysis; and (4) explore and develop advanced numerical methodology.

  4. Design Methodology for Magnetic Field-Based Soft Tri-Axis Tactile Sensors.

    PubMed

    Wang, Hongbo; de Boer, Greg; Kow, Junwai; Alazmani, Ali; Ghajari, Mazdak; Hewson, Robert; Culmer, Peter

    2016-08-24

    Tactile sensors are essential if robots are to safely interact with the external world and to dexterously manipulate objects. Current tactile sensors have limitations restricting their use, notably being too fragile or having limited performance. Magnetic field-based soft tactile sensors offer a potential improvement, being durable, low cost, accurate and high bandwidth, but they are relatively undeveloped because of the complexities involved in design and calibration. This paper presents a general design methodology for magnetic field-based three-axis soft tactile sensors, enabling researchers to easily develop specific tactile sensors for a variety of applications. All aspects (design, fabrication, calibration and evaluation) of the development of tri-axis soft tactile sensors are presented and discussed. A moving least square approach is used to decouple and convert the magnetic field signal to force output to eliminate non-linearity and cross-talk effects. A case study of a tactile sensor prototype, MagOne, was developed. This achieved a resolution of 1.42 mN in normal force measurement (0.71 mN in shear force), good output repeatability and has a maximum hysteresis error of 3.4%. These results outperform comparable sensors reported previously, highlighting the efficacy of our methodology for sensor design.

  5. Design Methodology for Magnetic Field-Based Soft Tri-Axis Tactile Sensors

    PubMed Central

    Wang, Hongbo; de Boer, Greg; Kow, Junwai; Alazmani, Ali; Ghajari, Mazdak; Hewson, Robert; Culmer, Peter

    2016-01-01

    Tactile sensors are essential if robots are to safely interact with the external world and to dexterously manipulate objects. Current tactile sensors have limitations restricting their use, notably being too fragile or having limited performance. Magnetic field-based soft tactile sensors offer a potential improvement, being durable, low cost, accurate and high bandwidth, but they are relatively undeveloped because of the complexities involved in design and calibration. This paper presents a general design methodology for magnetic field-based three-axis soft tactile sensors, enabling researchers to easily develop specific tactile sensors for a variety of applications. All aspects (design, fabrication, calibration and evaluation) of the development of tri-axis soft tactile sensors are presented and discussed. A moving least square approach is used to decouple and convert the magnetic field signal to force output to eliminate non-linearity and cross-talk effects. A case study of a tactile sensor prototype, MagOne, was developed. This achieved a resolution of 1.42 mN in normal force measurement (0.71 mN in shear force), good output repeatability and has a maximum hysteresis error of 3.4%. These results outperform comparable sensors reported previously, highlighting the efficacy of our methodology for sensor design. PMID:27563908
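
    The moving least squares decoupling step can be illustrated with a tiny sketch: each magnetometer reading is mapped to force by a locally weighted linear fit over calibration samples. The synthetic calibration data, Gaussian kernel width, and linear basis below are illustrative assumptions rather than the authors' calibration procedure.

        import numpy as np

        rng = np.random.default_rng(0)

        # Invented calibration set: 3-axis magnetic readings s -> 3-axis forces f.
        S = rng.normal(size=(200, 3))                       # sensor signals (arbitrary units)
        true_map = np.array([[2.0, 0.3, 0.0],
                             [0.1, 1.5, 0.2],
                             [0.0, 0.2, 3.0]])
        F = S @ true_map.T + 0.01 * rng.normal(size=(200, 3))  # forces [N], with noise

        def mls_predict(s_query, S, F, h=0.5):
            """Moving least squares: weighted linear fit centered on the query signal."""
            w = np.exp(-np.sum((S - s_query) ** 2, axis=1) / (2 * h ** 2))  # Gaussian weights
            Phi = np.column_stack([np.ones(len(S)), S])                     # linear basis [1, s]
            W = np.diag(w)
            beta = np.linalg.solve(Phi.T @ W @ Phi, Phi.T @ W @ F)          # local coefficients
            return np.concatenate([[1.0], s_query]) @ beta

        s_query = np.array([0.2, -0.1, 0.4])
        print("predicted force [N]:", np.round(mls_predict(s_query, S, F), 3))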

  6. A methodology to derive Synthetic Design Hydrographs for river flood management

    NASA Astrophysics Data System (ADS)

    Tomirotti, Massimo; Mignosa, Paolo

    2017-12-01

    The design of flood protection measures requires in many cases not only the estimation of the peak discharges, but also of the volume of the floods and its time distribution. A typical solution to this kind of problems is the formulation of Synthetic Design Hydrographs (SDHs). In this paper a methodology to derive SDHs is proposed on the basis of the estimation of the Flow Duration Frequency (FDF) reduction curve and of a Peak-Duration (PD) relationship furnishing respectively the quantiles of the maximum average discharge and the average peak position in each duration. The methodology is intended to synthesize the main features of the historical floods in a unique SDH for each return period. The shape of the SDH is not selected a priori but is a result of the behaviour of FDF and PD curves, allowing to account in a very convenient way for the variability of the shapes of the observed hydrographs at local time scale. The validation of the methodology is performed with reference to flood routing problems in reservoirs, lakes and rivers. The results obtained demonstrate the capability of the SDHs to describe the effects of different hydraulic systems on the statistical regime of floods, even in presence of strong modifications induced on the probability distribution of peak flows.

  7. Application of new methodologies based on design of experiments, independent component analysis and design space for robust optimization in liquid chromatography.

    PubMed

    Debrus, Benjamin; Lebrun, Pierre; Ceccato, Attilio; Caliaro, Gabriel; Rozet, Eric; Nistor, Iolanda; Oprean, Radu; Rupérez, Francisco J; Barbas, Coral; Boulanger, Bruno; Hubert, Philippe

    2011-04-08

    HPLC separations of an unknown sample mixture and a pharmaceutical formulation have been optimized using a recently developed chemometric methodology proposed by W. Dewé et al. in 2004 and improved by P. Lebrun et al. in 2008. This methodology is based on experimental designs which are used to model retention times of compounds of interest. Then, the prediction accuracy and the optimal separation robustness, including the uncertainty study, were evaluated. Finally, the design space (ICH Q8(R1) guideline) was computed as the probability for a criterion to lie in a selected range of acceptance. Furthermore, the chromatograms were automatically read. Peak detection and peak matching were carried out with a previously developed methodology using independent component analysis published by B. Debrus et al. in 2009. The present successful applications strengthen the high potential of these methodologies for the automated development of chromatographic methods. Copyright © 2011 Elsevier B.V. All rights reserved.
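
    The design space notion used above (the probability that a separation criterion stays within its acceptance range) lends itself to a simple Monte Carlo reading. The Python sketch below conveys that reading only; the two factors (gradient time and pH), the linear criterion model and all coefficient values are invented for illustration and are not the models of the cited works.

      import numpy as np

      rng = np.random.default_rng(42)

      def p_in_spec(operating_points, coef_mean, coef_cov, n_mc=2000, min_sep=1.0):
          # For each candidate operating point, estimate the probability that the
          # predicted separation criterion meets the acceptance limit, propagating
          # model-coefficient uncertainty by Monte Carlo.
          probs = []
          for gradient_time, ph in operating_points:
              x = np.array([1.0, gradient_time, ph])          # intercept + factors
              draws = rng.multivariate_normal(coef_mean, coef_cov, size=n_mc)
              criterion = draws @ x
              probs.append(np.mean(criterion >= min_sep))
          return np.array(probs)

      # Hypothetical fitted coefficients for the criterion and their covariance
      coef_mean = np.array([0.2, 0.05, 0.3])
      coef_cov = np.diag([0.02, 0.001, 0.01])
      grid = [(t, ph) for t in (10, 20, 30) for ph in (2.5, 3.0, 3.5)]
      for (t, ph), p in zip(grid, p_in_spec(grid, coef_mean, coef_cov)):
          print(f"gradient time {t} min, pH {ph}: P(criterion ok) = {p:.2f}")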

  8. Design Thinking: A Methodology towards Sustainable Problem Solving in Higher Education in South Africa

    ERIC Educational Resources Information Center

    Munyai, Keneilwe

    2016-01-01

    This short paper explores the potential contribution of design thinking methodology to the education and training system in South Africa. Design thinking is slowly gaining traction in South Africa, where it is offered by the Hasso Plattner Institute of Design Thinking at the University of Cape Town…

  9. C-Based Design Methodology and Topological Change for an Indian Agricultural Tractor Component

    NASA Astrophysics Data System (ADS)

    Matta, Anil Kumar; Raju, D. Ranga; Suman, K. N. S.; Kranthi, A. S.

    2018-06-01

    The failure of tractor components and their replacement has now become very common in India because of re-cycling, re-sale, and duplication. To overcome the problem of failure we propose a design methodology for topological change, co-simulated with software. In the proposed design methodology, the designer checks Paxial, Pcr, Pfailure, and τ by hand calculations, from which refined topological changes of the R.S. Arm are derived. We explain several techniques employed in the component for the reduction and removal of rib material to change the center of gravity and centroid point, using SystemC for mixed-level simulation and faster topological changes. The design process in SystemC can be compiled and executed with the TURBO C7 software. The modified component is developed in Pro/E and analyzed in ANSYS. The topologically changed component, with a slot of 120 × 4.75 × 32.5 mm at the center, showed greater effectiveness than the original component.
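
    The hand checks of Paxial, Pcr and τ mentioned above are conventionally the applied axial load, the Euler critical buckling load and the section shear stress. A minimal Python sketch of such checks is given below; the section dimensions, loads and material properties are placeholder values, not those of the rocker arm studied.

      import math

      # Placeholder section, load and material values (illustrative only)
      E = 200e9              # Young's modulus, Pa
      L = 0.30               # effective column length, m
      b, h = 0.032, 0.00475  # rectangular section width and thickness, m
      P_axial = 4.0e3        # applied axial load, N
      V = 1.5e3              # applied shear load, N

      I = b * h ** 3 / 12.0                  # second moment of area, m^4
      A = b * h                              # section area, m^2
      P_cr = math.pi ** 2 * E * I / L ** 2   # Euler critical load, pinned-pinned ends
      tau = 1.5 * V / A                      # peak shear stress, rectangular section

      print(f"P_cr = {P_cr / 1e3:.1f} kN against P_axial = {P_axial / 1e3:.1f} kN")
      print(f"tau  = {tau / 1e6:.1f} MPa")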

  11. Application of an integrated flight/propulsion control design methodology to a STOVL aircraft

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay; Mattern, Duane L.

    1991-01-01

    Results are presented from the application of an emerging Integrated Flight/Propulsion Control (IFPC) design methodology to a Short Take Off and Vertical Landing (STOVL) aircraft in transition flight. The steps in the methodology consist of designing command shaping prefilters to provide the overall desired response to pilot command inputs. A previously designed centralized controller is first validated for the integrated airframe/engine plant used. This integrated plant is derived from a different model of the engine subsystem than the one used for the centralized controller design. The centralized controller is then partitioned into a decentralized, hierarchical structure comprising airframe lateral and longitudinal subcontrollers and an engine subcontroller. Command shaping prefilters from the pilot control effector inputs are then designed, and time histories of the closed-loop IFPC system response to simulated pilot commands are compared to desired responses based on handling qualities requirements. Finally, the propulsion system safety and nonlinear limit protection logic is wrapped around the engine subcontroller and the response of the closed-loop integrated system is evaluated for transients that encounter the propulsion surge margin limit.

  12. A methodological approach for designing a usable ontology-based GUI in healthcare.

    PubMed

    Lasierra, N; Kushniruk, A; Alesanco, A; Borycki, E; García, J

    2013-01-01

    This paper presents a methodological approach to the design and evaluation of an interface for an ontology-based system used for designing care plans for monitoring patients at home. In order to define the care plans, physicians need a tool for creating instances of the ontology and configuring some rules. Our purpose is to develop an interface that allows clinicians to interact with the ontology. Although ontology-driven applications do not necessarily present the ontology in the user interface, it is our hypothesis that showing selected parts of the ontology in a "usable" way could enhance clinicians' understanding and make the definition of care plans easier. Based on prototyping and iterative testing, this methodology combines visualization techniques and usability methods. Preliminary results obtained after a formative evaluation indicate the effectiveness of the suggested combination.

  13. Using Delphi Methodology to Design Assessments of Teachers' Pedagogical Content Knowledge

    ERIC Educational Resources Information Center

    Manizade, Agida Gabil; Mason, Marguerite M.

    2011-01-01

    Descriptions of methodologies that can be used to create items for assessing teachers' "professionally situated" knowledge are lacking in mathematics education research literature. In this study, researchers described and used the Delphi method to design an instrument to measure teachers' pedagogical content knowledge. The instrument focused on a…

  14. Power processing methodology. [computerized design of spacecraft electric power systems

    NASA Technical Reports Server (NTRS)

    Fegley, K. A.; Hansen, I. G.; Hayden, J. H.

    1974-01-01

    Discussion of the interim results of a program to investigate the feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems. The object of the total program is to develop a flexible engineering tool which will allow the power processor designer to effectively and rapidly assess and analyze the tradeoffs available by providing, in one comprehensive program, a mathematical model, an analysis of expected performance, simulation, and a comparative evaluation with alternative designs. This requires an understanding of electrical power source characteristics and the effects of load control, protection, and total system interaction.

  15. A methodology for boost-glide transport technology planning

    NASA Technical Reports Server (NTRS)

    Repic, E. M.; Olson, G. A.; Milliken, R. J.

    1974-01-01

    A systematic procedure is presented by which the relative economic value of technology factors affecting design, configuration, and operation of boost-glide transport can be evaluated. Use of the methodology results in identification of first-order economic gains potentially achievable by projected advances in each of the definable, hypersonic technologies. Starting with a baseline vehicle, the formulas, procedures and forms which are integral parts of this methodology are developed. A demonstration of the methodology is presented for one specific boost-glide system.

  16. Aircraft conceptual design - an adaptable parametric sizing methodology

    NASA Astrophysics Data System (ADS)

    Coleman, Gary John, Jr.

    Aerospace is a maturing industry with successful and refined baselines which work well for traditional baseline missions, markets and technologies. However, when new markets (space tourism), new constraints (environmental) or new technologies (composites, natural laminar flow) emerge, the conventional solution is not necessarily best for the new situation. This begs the question: how does a design team quickly screen and compare novel solutions to conventional solutions for new aerospace challenges? The answer is rapid and flexible conceptual design parametric sizing. In the product design life-cycle, parametric sizing is the first step in screening the total vehicle in terms of mission, configuration and technology to quickly assess first-order design and mission sensitivities. During this phase, various missions and technologies are assessed, and the designer identifies design solutions of concepts and configurations that meet combinations of mission and technology. This research contributes to the state of the art in aircraft parametric sizing through (1) development of a dedicated conceptual design process and disciplinary methods library, (2) development of a novel and robust parametric sizing process based on 'best-practice' approaches found in the process and disciplinary methods library, and (3) application of the parametric sizing process to a variety of design missions (transonic, supersonic and hypersonic transports), different configurations (tail-aft, blended wing body, strut-braced wing, hypersonic blended bodies, etc.), and different technologies (composite, natural laminar flow, thrust vectored control, etc.), in order to demonstrate the robustness of the methodology and unearth first-order design sensitivities to current and future aerospace design problems. This research undertaking demonstrates the importance of this early design step in selecting the correct combination of mission, technologies and configuration to

  17. A variable-gain output feedback control design methodology

    NASA Technical Reports Server (NTRS)

    Halyo, Nesim; Moerder, Daniel D.; Broussard, John R.; Taylor, Deborah B.

    1989-01-01

    A digital control system design technique is developed in which the control system gain matrix varies with the plant operating point parameters. The design technique is obtained by formulating the problem as an optimal stochastic output feedback control law with variable gains. This approach provides a control theory framework within which the operating range of a control law can be significantly extended. Furthermore, the approach avoids the major shortcomings of the conventional gain-scheduling techniques. The optimal variable gain output feedback control problem is solved by embedding the Multi-Configuration Control (MCC) problem, previously solved at ICS. An algorithm to compute the optimal variable gain output feedback control gain matrices is developed. The algorithm is a modified version of the MCC algorithm improved so as to handle the large dimensionality which arises particularly in variable-gain control problems. The design methodology developed is applied to a reconfigurable aircraft control problem. A variable-gain output feedback control problem was formulated to design a flight control law for an AFTI F-16 aircraft which can automatically reconfigure its control strategy to accommodate failures in the horizontal tail control surface. Simulations of the closed-loop reconfigurable system show that the approach produces a control design which can accommodate such failures with relative ease. The technique can be applied to many other problems including sensor failure accommodation, mode switching control laws and super agility.
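
    The essence of a variable-gain output feedback law is that the feedback matrix is a function of measurable operating-point parameters rather than a single fixed design. The Python fragment below sketches only that structural idea, using plain linear interpolation between two hypothetical gain matrices; it does not reproduce the optimal stochastic formulation or the MCC-based algorithm described above.

      import numpy as np

      # Hypothetical output-feedback gains designed at two operating points
      rho_points = np.array([0.0, 1.0])                 # e.g. normalized dynamic pressure
      K_points = np.array([[[1.2, 0.4],
                            [0.1, 0.9]],
                           [[2.0, 0.6],
                            [0.3, 1.5]]])               # shape (n_points, n_u, n_y)

      def gain(rho):
          # Interpolate the gain matrix with the operating-point parameter rho.
          rho = float(np.clip(rho, rho_points[0], rho_points[-1]))
          w = (rho - rho_points[0]) / (rho_points[-1] - rho_points[0])
          return (1.0 - w) * K_points[0] + w * K_points[1]

      def control(rho, y):
          return -gain(rho) @ y                         # u = -K(rho) y

      print(control(0.3, np.array([0.05, -0.02])))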

  18. Design Evolution and Methodology for Pumpkin Super-Pressure Balloons

    NASA Astrophysics Data System (ADS)

    Farley, Rodger

    The NASA Ultra Long Duration Balloon (ULDB) program has had many technical development issues discovered and solved along its road to success as a new vehicle. It has the promise of being a sub-satellite, a means to launch up to 2700 kg to 33.5 km altitude for 100 days from a comfortable mid-latitude launch point. Current high-lift long duration ballooning is accomplished out of Antarctica with zero-pressure balloons, which cannot cope with the rigors of diurnal cycles. The ULDB design is still evolving, the product of intense analytical effort, scaled testing, improved manufacturing, and engineering intuition. The past technical problems, in particular the s-cleft deformation, their solutions, future challenges, and the methodology of pumpkin balloon design will generally be described.

  19. Object-oriented analysis and design: a methodology for modeling the computer-based patient record.

    PubMed

    Egyhazy, C J; Eyestone, S M; Martino, J; Hodgson, C L

    1998-08-01

    The article highlights the importance of an object-oriented analysis and design (OOAD) methodology for the computer-based patient record (CPR) in the military environment. Many OOAD methodologies do not adequately scale up, allow for efficient reuse of their products, or accommodate legacy systems. A methodology that addresses these issues is formulated and used to demonstrate its applicability in a large-scale health care service system. During a period of 6 months, a team of object modelers and domain experts formulated an OOAD methodology tailored to the Department of Defense Military Health System and used it to produce components of an object model for simple order processing. This methodology and the lessons learned during its implementation are described. This approach is necessary to achieve broad interoperability among heterogeneous automated information systems.

  20. Thermal Hydraulics Design and Analysis Methodology for a Solid-Core Nuclear Thermal Rocket Engine Thrust Chamber

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Canabal, Francisco; Chen, Yen-Sen; Cheng, Gary; Ito, Yasushi

    2013-01-01

    Nuclear thermal propulsion is a leading candidate for in-space propulsion for human Mars missions. This chapter describes a thermal hydraulics design and analysis methodology developed at the NASA Marshall Space Flight Center in support of the nuclear thermal propulsion development effort. The objective of this campaign is to bridge the design methods of the Rover/NERVA era with a modern computational fluid dynamics and heat transfer methodology, to predict the thermal, fluid, and hydrogen environments of a hypothetical solid-core nuclear thermal engine, the Small Engine, designed in the 1960s. The computational methodology is based on an unstructured-grid, pressure-based, all-speeds, chemically reacting, computational fluid dynamics and heat transfer platform, while formulations of flow and heat transfer through porous and solid media were implemented to describe the hydrogen flow channels inside the solid core. Design analyses of a single flow element and of the entire solid-core thrust chamber of the Small Engine were performed and the results are presented herein.

  1. Cost-effectiveness methodology for computer systems selection

    NASA Technical Reports Server (NTRS)

    Vallone, A.; Bajaj, K. S.

    1980-01-01

    A new approach to the problem of selecting a computer system design has been developed. The purpose of this methodology is to identify a system design that is capable of fulfilling system objectives in the most economical way. The methodology characterizes each system design by the cost of the system life cycle and by the system's effectiveness in reaching objectives. Cost is measured by a 'system cost index' derived from an analysis of all expenditures and possible revenues over the system life cycle. Effectiveness is measured by a 'system utility index' obtained by combining the impact that each selection factor has on the system objectives, with each impact assessed through a 'utility curve'. A preestablished algorithm combines cost and utility and provides a ranking of the alternative system designs from which the 'best' design is selected.
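
    Although the abstract does not spell out the combining algorithm, the cost/utility idea can be shown with a toy ranking in Python. Every name, weight and score below is hypothetical, and the utility-to-cost ratio used for ranking is merely one plausible stand-in for the preestablished algorithm mentioned above.

      # Toy illustration: each candidate design carries a life-cycle cost index and
      # factor scores that a utility curve has already mapped onto [0, 1].
      designs = {
          "design_A": {"cost_index": 1.00, "scores": {"throughput": 0.8, "reliability": 0.9}},
          "design_B": {"cost_index": 0.75, "scores": {"throughput": 0.6, "reliability": 0.7}},
          "design_C": {"cost_index": 0.90, "scores": {"throughput": 0.9, "reliability": 0.6}},
      }
      weights = {"throughput": 0.6, "reliability": 0.4}   # assumed objective weights

      def utility_index(scores):
          return sum(weights[k] * v for k, v in scores.items())

      ranked = sorted(designs.items(),
                      key=lambda kv: utility_index(kv[1]["scores"]) / kv[1]["cost_index"],
                      reverse=True)
      for name, d in ranked:
          print(name, round(utility_index(d["scores"]) / d["cost_index"], 3))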

  2. Space station definitions, design, and development. Task 5: Multiple arm telerobot coordination and control: Manipulator design methodology

    NASA Technical Reports Server (NTRS)

    Stoughton, R. M.

    1990-01-01

    A proposed methodology applicable to the design of manipulator systems is described. The current design process is especially weak in the preliminary design phase, since there is no accepted measure to be used in trading off different options available for the various subsystems. The design process described uses Cartesian End-Effector Impedance as a measure of performance for the system. Having this measure of performance, it is shown how it may be used to determine the trade-offs necessary to the preliminary design phase. The design process involves three main parts: (1) determination of desired system performance in terms of End-Effector Impedance; (2) trade-off of design options to achieve this desired performance; and (3) verification of system performance through laboratory testing. The design process is developed using numerous examples and experiments to demonstrate the feasibility of this approach to manipulator design.

  3. Factors That Shape Design Thinking

    ERIC Educational Resources Information Center

    Gray, Colin M.

    2013-01-01

    A wide range of design literature discusses the role of the studio and its related pedagogy in the development of design thinking. Scholars in a variety of design disciplines pose a number of factors that potentially affect this development process, but a full understanding of these factors as experienced from a critical pedagogy or student…

  4. Behavioral headache research: methodologic considerations and research design alternatives.

    PubMed

    Hursey, Karl G; Rains, Jeanetta C; Penzien, Donald B; Nash, Justin M; Nicholson, Robert A

    2005-05-01

    Behavioral headache treatments have garnered solid empirical support in recent years, but there is substantial opportunity to strengthen the next generation of studies with improved methods and consistency across studies. Recently, Guidelines for Trials of Behavioral Treatments for Recurrent Headache were published to facilitate the production of high-quality research. The present article complements the guidelines with a discussion of methodologic and research design considerations. Since there is no research design that is applicable in every situation, selecting an appropriate research design is fundamental to producing meaningful results. Investigators in behavioral headache and other areas of research consider the developmental phase of the research, the principal objectives of the project, and the sources of error or alternative interpretations in selecting a design. Phases of clinical trials typically include pilot studies, efficacy studies, and effectiveness studies. These trials may be categorized as primarily pragmatic or explanatory. The most appropriate research designs for these different phases and different objectives vary on such characteristics as sample size and assignment to condition, types of control conditions, periods or frequency of measurement, and the dimensions along which comparisons are made. A research design also must fit within constraints on available resources. There are a large number of potential research designs that can be used, and considering these characteristics allows selection of appropriate research designs.

  5. Anaerobic treatment of complex chemical wastewater in a sequencing batch biofilm reactor: process optimization and evaluation of factor interactions using the Taguchi dynamic DOE methodology.

    PubMed

    Venkata Mohan, S; Chandrasekhara Rao, N; Krishna Prasad, K; Murali Krishna, P; Sreenivas Rao, R; Sarma, P N

    2005-06-20

    The Taguchi robust experimental design (DOE) methodology has been applied to a dynamic anaerobic process treating complex wastewater in an anaerobic sequencing batch biofilm reactor (AnSBBR). To optimize the process and to evaluate the influence of different factors on it, the uncontrollable (noise) factors have been considered. The Taguchi methodology adopting a dynamic approach is the first of its kind to be used for anaerobic process evaluation and optimization. The designed experimental methodology consisted of four phases--planning, conducting, analysis, and validation--connected sequence-wise to achieve the overall optimization. In the experimental design, five controllable factors, i.e., organic loading rate (OLR), inlet pH, biodegradability (BOD/COD ratio), temperature, and sulfate concentration, along with two uncontrollable (noise) factors, volatile fatty acids (VFA) and alkalinity, were considered at two levels for optimization of the anaerobic system. Thirty-two anaerobic experiments were conducted with different combinations of factors, and the results obtained in terms of substrate degradation rates were processed in Qualitek-4 software to study the main effect of individual factors, the interactions between factors, and the signal-to-noise (S/N) ratio. Attempts were also made to achieve optimum conditions. Studies on the influence of individual factors on process performance revealed the intensive effect of OLR. In multiple factor interaction studies, biodegradability together with other factors, such as temperature, pH, and sulfate, showed the maximum influence on process performance. The optimum conditions for the efficient performance of the anaerobic system in treating complex wastewater by considering dynamic (noise) factors obtained are a higher organic loading rate of 3.5 Kg COD/m3 day, neutral pH with high biodegradability (BOD/COD ratio of 0.5), along with mesophilic temperature range (40 degrees C), and
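
    The signal-to-noise analysis referred to above is, for a 'larger-the-better' response such as substrate degradation rate, SN = -10 log10(mean(1/y^2)). The Python sketch below shows that calculation and a crude main-effect summary on invented data with only two of the five controllable factors; it is not a reconstruction of the thirty-two-run study.

      import numpy as np
      import pandas as pd

      # Invented fragment of an orthogonal-array layout: two factor columns (coded
      # levels) and two noise-condition replicates of the response.
      runs = pd.DataFrame({
          "OLR":  [1, 1, 2, 2, 1, 1, 2, 2],
          "pH":   [1, 2, 1, 2, 1, 2, 1, 2],
          "rep1": [0.42, 0.55, 0.61, 0.48, 0.44, 0.57, 0.63, 0.50],
          "rep2": [0.40, 0.53, 0.65, 0.46, 0.43, 0.58, 0.60, 0.52],
      })

      def sn_larger_is_better(y):
          # Taguchi S/N ratio for a response that should be maximized.
          y = np.asarray(y, dtype=float)
          return -10.0 * np.log10(np.mean(1.0 / y ** 2))

      runs["SN"] = runs[["rep1", "rep2"]].apply(sn_larger_is_better, axis=1)
      for factor in ("OLR", "pH"):
          print(factor, runs.groupby(factor)["SN"].mean().round(2).to_dict())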

  6. IDR: A Participatory Methodology for Interdisciplinary Design in Technology Enhanced Learning

    ERIC Educational Resources Information Center

    Winters, Niall; Mor, Yishay

    2008-01-01

    One of the important themes that emerged from the CAL'07 conference was the failure of technology to bring about the expected disruptive effect to learning and teaching. We identify one of the causes as an inherent weakness in prevalent development methodologies. While the problem of designing technology for learning is irreducibly…

  7. Optimal color design of psychological counseling room by design of experiments and response surface methodology.

    PubMed

    Liu, Wenjuan; Ji, Jianlin; Chen, Hua; Ye, Chenyu

    2014-01-01

    Color is one of the most powerful aspects of a psychological counseling environment. Little scientific research has been conducted on color design and much of the existing literature is based on observational studies. Using design of experiments and response surface methodology, this paper proposes an optimal color design approach for transforming patients' perception into color elements. Six indices, pleasant-unpleasant, interesting-uninteresting, exciting-boring, relaxing-distressing, safe-fearful, and active-inactive, were used to assess patients' impression. A total of 75 patients participated, including 42 for Experiment 1 and 33 for Experiment 2. 27 representative color samples were designed in Experiment 1, and the color sample (L = 75, a = 0, b = -60) was the most preferred one. In Experiment 2, this color sample was set as the 'central point', and three color attributes were optimized to maximize the patients' satisfaction. The experimental results show that the proposed method can get the optimal solution for color design of a counseling room.
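
    Fitting a full second-order response surface and searching it for the most-preferred colour can be sketched in a few lines of Python. The CIELAB design points and satisfaction scores below are fabricated stand-ins used only to show the mechanics, not the data of this study.

      import numpy as np

      # Fabricated Box-Behnken-style points around (L*, a*, b*) = (75, 0, -60)
      X = np.array([[65, -10, -60], [65, 10, -60], [85, -10, -60], [85, 10, -60],
                    [65, 0, -70],  [65, 0, -50],  [85, 0, -70],  [85, 0, -50],
                    [75, -10, -70], [75, -10, -50], [75, 10, -70], [75, 10, -50],
                    [75, 0, -60],  [75, 0, -60]], dtype=float)
      y = np.array([6.6, 6.4, 7.0, 6.7, 6.9, 7.1, 7.0, 7.2,
                    7.3, 7.4, 6.8, 6.9, 7.9, 7.8])

      def quad_terms(X):
          # Full second-order model: intercept, linear, squared and cross terms.
          n = X.shape[1]
          cols = [np.ones(len(X))]
          cols += [X[:, i] for i in range(n)]
          cols += [X[:, i] ** 2 for i in range(n)]
          cols += [X[:, i] * X[:, j] for i in range(n) for j in range(i + 1, n)]
          return np.column_stack(cols)

      beta, *_ = np.linalg.lstsq(quad_terms(X), y, rcond=None)

      # Grid search for the attribute combination with the highest predicted score
      grid = np.array([[l, a, b] for l in range(65, 86, 5)
                                 for a in range(-10, 11, 5)
                                 for b in range(-70, -49, 5)], dtype=float)
      best = grid[np.argmax(quad_terms(grid) @ beta)]
      print("predicted most-preferred colour (L*, a*, b*):", best)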

  8. Optimal Color Design of Psychological Counseling Room by Design of Experiments and Response Surface Methodology

    PubMed Central

    Chen, Hua; Ye, Chenyu

    2014-01-01

    Color is one of the most powerful aspects of a psychological counseling environment. Little scientific research has been conducted on color design and much of the existing literature is based on observational studies. Using design of experiments and response surface methodology, this paper proposes an optimal color design approach for transforming patients’ perception into color elements. Six indices, pleasant-unpleasant, interesting-uninteresting, exciting-boring, relaxing-distressing, safe-fearful, and active-inactive, were used to assess patients’ impression. A total of 75 patients participated, including 42 for Experiment 1 and 33 for Experiment 2. 27 representative color samples were designed in Experiment 1, and the color sample (L = 75, a = 0, b = -60) was the most preferred one. In Experiment 2, this color sample was set as the ‘central point’, and three color attributes were optimized to maximize the patients’ satisfaction. The experimental results show that the proposed method can get the optimal solution for color design of a counseling room. PMID:24594683

  9. A methodology for the validated design space exploration of fuel cell powered unmanned aerial vehicles

    NASA Astrophysics Data System (ADS)

    Moffitt, Blake Almy

    Unmanned Aerial Vehicles (UAVs) are the most dynamic growth sector of the aerospace industry today. The need to provide persistent intelligence, surveillance, and reconnaissance for military operations is driving the planned acquisition of over 5,000 UAVs over the next five years. The most pressing need is for quiet, small UAVs with endurance beyond what is capable with advanced batteries or small internal combustion propulsion systems. Fuel cell systems demonstrate high efficiency, high specific energy, low noise, low temperature operation, modularity, and rapid refuelability making them a promising enabler of the small, quiet, and persistent UAVs that military planners are seeking. Despite the perceived benefits, the actual near-term performance of fuel cell powered UAVs is unknown. Until the auto industry began spending billions of dollars in research, fuel cell systems were too heavy for useful flight applications. However, the last decade has seen rapid development with fuel cell gravimetric and volumetric power density nearly doubling every 2--3 years. As a result, a few design studies and demonstrator aircraft have appeared, but overall the design methodology and vehicles are still in their infancy. The design of fuel cell aircraft poses many challenges. Fuel cells differ fundamentally from combustion based propulsion in how they generate power and interact with other aircraft subsystems. As a result, traditional multidisciplinary analysis (MDA) codes are inappropriate. Building new MDAs is difficult since fuel cells are rapidly changing in design, and various competitive architectures exist for balance of plant, hydrogen storage, and all electric aircraft subsystems. In addition, fuel cell design and performance data is closely protected which makes validation difficult and uncertainty significant. Finally, low specific power and high volumes compared to traditional combustion based propulsion result in more highly constrained design spaces that are

  10. Human Factors Engineering Program Review Model (NUREG-0711)Revision 3: Update Methodology and Key Revisions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Hara, J. M.; Higgins, J.; Fleger, S.

    The U.S. Nuclear Regulatory Commission (NRC) reviews the human factors engineering (HFE) programs of applicants for nuclear power plant construction permits, operating licenses, standard design certifications, and combined operating licenses. The purpose of these safety reviews is to help ensure that personnel performance and reliability are appropriately supported. Detailed design review procedures and guidance for the evaluations are provided in three key documents: the Standard Review Plan (NUREG-0800), the HFE Program Review Model (NUREG-0711), and the Human-System Interface Design Review Guidelines (NUREG-0700). These documents were last revised in 2007, 2004 and 2002, respectively. The NRC is committed to the periodic update and improvement of the guidance to ensure that it remains a state-of-the-art design evaluation tool. To this end, the NRC is updating its guidance to stay current with recent research on human performance, advances in HFE methods and tools, and new technology being employed in plant and control room design. NUREG-0711 is the first document to be addressed. We present the methodology used to update NUREG-0711 and summarize the main changes made. Finally, we discuss the current status of the update program and the future plans.

  11. [Methodological design for the National Survey Violence Against Women in Mexico].

    PubMed

    Olaiz, Gustavo; Franco, Aurora; Palma, Oswaldo; Echarri, Carlos; Valdez, Rosario; Herrera, Cristina

    2006-01-01

    To describe the methodology, the research designs used, the estimation and sample selection, variable definitions, collection instruments, and operative design and analytical procedures for the National Survey Violence Against Women in Mexico. A complex (two-step) cross-sectional study was designed, and the qualitative component was carried out using in-depth interviews and participant observation in health care units. For the quantitative study we obtained a total of 26 240 interviews with women users of health services and 2 636 questionnaires from health workers; the survey is representative of the 32 Mexican states. For the qualitative study, 26 in-depth interviews were conducted with female users and 60 interviews with health workers in the states of Quintana Roo, Coahuila and the Federal District.

  12. A design methodology for portable software on parallel computers

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Miller, Keith W.; Chrisman, Dan A.

    1993-01-01

    This final report for research that was supported by grant number NAG-1-995 documents our progress in addressing two difficulties in parallel programming. The first difficulty is developing software that will execute quickly on a parallel computer. The second difficulty is transporting software between dissimilar parallel computers. In general, we expect that more hardware-specific information will be included in software designs for parallel computers than in designs for sequential computers. This inclusion is an instance of portability being sacrificed for high performance. New parallel computers are being introduced frequently. Trying to keep one's software on the current high performance hardware, a software developer almost continually faces yet another expensive software transportation. The problem of the proposed research is to create a design methodology that helps designers to more precisely control both portability and hardware-specific programming details. The proposed research emphasizes programming for scientific applications. We completed our study of the parallelizability of a subsystem of the NASA Earth Radiation Budget Experiment (ERBE) data processing system. This work is summarized in section two. A more detailed description is provided in Appendix A ('Programming Practices to Support Eventual Parallelism'). Mr. Chrisman, a graduate student, wrote and successfully defended a Ph.D. dissertation proposal which describes our research associated with the issues of software portability and high performance. The list of research tasks are specified in the proposal. The proposal 'A Design Methodology for Portable Software on Parallel Computers' is summarized in section three and is provided in its entirety in Appendix B. We are currently studying a proposed subsystem of the NASA Clouds and the Earth's Radiant Energy System (CERES) data processing system. This software is the proof-of-concept for the Ph.D. dissertation. We have implemented and measured

  13. Prognostics and health management design for rotary machinery systems—Reviews, methodology and applications

    NASA Astrophysics Data System (ADS)

    Lee, Jay; Wu, Fangji; Zhao, Wenyu; Ghaffari, Masoud; Liao, Linxia; Siegel, David

    2014-01-01

    Much research has been conducted in prognostics and health management (PHM), an emerging field in mechanical engineering that is gaining interest from both academia and industry. Most of these efforts have been in the area of machinery PHM, resulting in the development of many algorithms for this particular application. The majority of these algorithms concentrate on applications involving common rotary machinery components, such as bearings and gears. Knowledge of this prior work is a necessity for any future research efforts to be conducted; however, there has not been a comprehensive overview that details previous and on-going efforts in PHM. In addition, a systematic method for developing and deploying a PHM system has yet to be established. Such a method would enable rapid customization and integration of PHM systems for diverse applications. To address these gaps, this paper provides a comprehensive review of the PHM field, followed by an introduction of a systematic PHM design methodology, 5S methodology, for converting data to prognostics information. This methodology includes procedures for identifying critical components, as well as tools for selecting the most appropriate algorithms for specific applications. Visualization tools are presented for displaying prognostics information in an appropriate fashion for quick and accurate decision making. Industrial case studies are included in this paper to show how this methodology can help in the design of an effective PHM system.

  14. A methodology for double patterning compliant split and design

    NASA Astrophysics Data System (ADS)

    Wiaux, Vincent; Verhaegen, Staf; Iwamoto, Fumio; Maenhoudt, Mireille; Matsuda, Takashi; Postnikov, Sergei; Vandenberghe, Geert

    2008-11-01

    Double Patterning allows the use of water immersion lithography at its maximum numerical aperture (NA=1.35) to be extended further. Splitting design layers so that they recombine through Double Patterning (DP) enables an effective resolution enhancement. Single polygons may need to be split up (cut) depending on the pattern density and its 2D content. The split polygons recombine at so-called 'stitching points'. These stitching points may affect yield due to their sensitivity to process variations. We describe a methodology to ensure robust double patterning by identifying proper split and design guidelines. Using simulations and experimental data, we discuss in particular metal1 first interconnect layers of random LOGIC and DRAM applications at 45 nm half-pitch (hp) and 32 nm hp, where DP may become the only timely patterning solution.
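
    A common way to reason about the split step described above is as a two-colouring of a conflict graph: features closer than the minimum same-mask spacing must land on different exposures, and any odd cycle forces a polygon cut and hence a stitching point. The Python sketch below shows that graph view on a made-up five-feature example; it is a conceptual illustration, not the authors' decomposition flow.

      from collections import deque

      # Made-up conflict graph: an edge means two features are closer than the
      # minimum same-mask spacing, so they must go on different exposures.
      conflicts = {
          "A": ["B"],
          "B": ["A", "C"],
          "C": ["B", "D", "E"],
          "D": ["C", "E"],
          "E": ["C", "D"],       # C-D-E is an odd cycle: a cut/stitch is needed
      }

      def split_two_masks(graph):
          # Breadth-first two-colouring; conflicting edges flag where a feature
          # would have to be cut to allow a legal double-patterning split.
          colours, bad_edges = {}, []
          for start in graph:
              if start in colours:
                  continue
              colours[start] = 0
              queue = deque([start])
              while queue:
                  u = queue.popleft()
                  for v in graph[u]:
                      if v not in colours:
                          colours[v] = 1 - colours[u]
                          queue.append(v)
                      elif colours[v] == colours[u] and u < v:
                          bad_edges.append((u, v))
          return colours, bad_edges

      masks, stitch_candidates = split_two_masks(conflicts)
      print("mask assignment:", masks)
      print("edges needing a cut/stitch:", stitch_candidates)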

  15. Application of Taguchi Design and Response Surface Methodology for Improving Conversion of Isoeugenol into Vanillin by Resting Cells of Psychrobacter sp. CSW4.

    PubMed

    Ashengroph, Morahem; Nahvi, Iraj; Amini, Jahanshir

    2013-01-01

    For all industrial processes, modelling, optimisation and control are the keys to enhancing productivity and ensuring product quality. In the current study, the optimization of process parameters for improving the conversion of isoeugenol to vanillin by Psychrobacter sp. CSW4 was investigated by means of the Taguchi approach and Box-Behnken statistical design under resting cell conditions. The Taguchi design was employed for screening the significant variables in the bioconversion medium. Sequentially, Box-Behnken design experiments under Response Surface Methodology (RSM) were used for further optimization. Four factors (initial concentrations of isoeugenol, NaCl, biomass and Tween 80), which have significant effects on vanillin yield, were selected from ten variables by the Taguchi experimental design. With the regression coefficient analysis in the Box-Behnken design, a relationship between vanillin production and the four significant variables was obtained, and the optimum levels of the four variables were as follows: initial isoeugenol concentration 6.5 g/L, initial Tween 80 concentration 0.89 g/L, initial NaCl concentration 113.2 g/L and initial biomass concentration 6.27 g/L. Under these optimized conditions, the maximum predicted concentration of vanillin was 2.25 g/L. These optimized values of the factors were validated in a triplicate shaking flask study, and an average vanillin concentration of 2.19 g/L, corresponding to a molar yield of 36.3%, was obtained after a 24 h bioconversion. The present work is the first to report the application of Taguchi design and Response Surface Methodology for optimizing the bioconversion of isoeugenol into vanillin under resting cell conditions.

  16. Multirate Flutter Suppression System Design for the Benchmark Active Controls Technology Wing. Part 2; Methodology Application Software Toolbox

    NASA Technical Reports Server (NTRS)

    Mason, Gregory S.; Berg, Martin C.; Mukhopadhyay, Vivek

    2002-01-01

    To study the effectiveness of various control system design methodologies, the NASA Langley Research Center initiated the Benchmark Active Controls Project. In this project, the various methodologies were applied to design a flutter suppression system for the Benchmark Active Controls Technology (BACT) Wing. This report describes the user's manual and software toolbox developed at the University of Washington to design a multirate flutter suppression control law for the BACT wing.

  17. Analytic Couple Modeling Introducing Device Design Factor, Fin Factor, Thermal Diffusivity Factor, and Inductance Factor

    NASA Technical Reports Server (NTRS)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    A set of convenient thermoelectric device solutions have been derived in order to capture a number of factors which are previously only resolved with numerical techniques. The concise conversion efficiency equations derived from governing equations provide intuitive and straight-forward design guidelines. These guidelines allow for better device design without requiring detailed numerical modeling. The analytical modeling accounts for factors such as i) variable temperature boundary conditions, ii) lateral heat transfer, iii) temperature variable material properties, and iv) transient operation. New dimensionless parameters, similar to the figure of merit, are introduced including the device design factor, fin factor, thermal diffusivity factor, and inductance factor. These new device factors allow for the straight-forward description of phenomenon generally only captured with numerical work otherwise. As an example a device design factor of 0.38, which accounts for thermal resistance of the hot and cold shoes, can be used to calculate a conversion efficiency of 2.28 while the ideal conversion efficiency based on figure of merit alone would be 6.15. Likewise an ideal couple with efficiency of 6.15 will be reduced to 5.33 when lateral heat is accounted for with a fin factor of 1.0.
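
    For reference, the 'ideal conversion efficiency based on figure of merit alone' that the new device factors are compared against is conventionally the standard thermoelectric couple expression below (with the mean of the hot- and cold-side temperatures written as \bar{T}); the specific device, fin, diffusivity and inductance factor formulas of the cited work are not reproduced here.

      \eta_{\mathrm{ideal}} = \frac{T_h - T_c}{T_h}\,
        \frac{\sqrt{1 + Z\bar{T}} - 1}{\sqrt{1 + Z\bar{T}} + T_c / T_h},
      \qquad \bar{T} = \frac{T_h + T_c}{2}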

  18. [Optimization of vacuum belt drying process of Gardeniae Fructus in Reduning injection by Box-Behnken design-response surface methodology].

    PubMed

    Huang, Dao-sheng; Shi, Wei; Han, Lei; Sun, Ke; Chen, Guang-bo; Wu Jian-xiong; Xu, Gui-hong; Bi, Yu-an; Wang, Zhen-zhong; Xiao, Wei

    2015-06-01

    To optimize the vacuum belt drying process conditions for Gardeniae Fructus extract from Reduning injection by Box-Behnken design-response surface methodology, a three-factor, three-level Box-Behnken experimental design was employed on the basis of single-factor experiments. With drying temperature, drying time and feeding speed as independent variables and the content of geniposide as the dependent variable, the experimental data were fitted to a second-order polynomial equation, establishing the mathematical relationship between the content of geniposide and the respective variables. With the experimental data analyzed in Design-Expert 8.0.6, the optimal drying parameters were as follows: drying temperature 98.5 degrees C, drying time 89 min, and feeding speed 99.8 r x min(-1). Three verification experiments were performed under these conditions, and the measured average content of geniposide was 564.108 mg x g(-1), close to the model prediction of 563.307 mg x g(-1). According to the verification tests, the Gardeniae Fructus belt drying process is stable and feasible. Thus, single-factor experiments combined with response surface methodology (RSM) can be used to optimize the drying technology of the Gardeniae Fructus extract from Reduning injection.
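
    A three-factor, three-level Box-Behnken design of the kind used above has a simple coded construction: every pair of factors takes the four (+/-1, +/-1) combinations with the remaining factor at its centre level, plus centre runs. The Python sketch below generates such a matrix; the factor ranges mapped onto the coded levels are assumed for illustration and are not the ranges used in the cited optimization.

      import itertools
      import numpy as np

      def box_behnken(n_factors=3, n_center=3):
          # Coded Box-Behnken design: (+/-1, +/-1) over each factor pair, others at 0.
          rows = []
          for i, j in itertools.combinations(range(n_factors), 2):
              for a, b in itertools.product((-1, 1), repeat=2):
                  row = [0] * n_factors
                  row[i], row[j] = a, b
                  rows.append(row)
          rows += [[0] * n_factors] * n_center
          return np.array(rows, dtype=float)

      coded = box_behnken()
      # Assumed ranges: drying temperature (deg C), drying time (min), feeding speed (r/min)
      low = np.array([90.0, 60.0, 80.0])
      high = np.array([105.0, 100.0, 110.0])
      actual = (low + high) / 2.0 + coded * (high - low) / 2.0
      print(actual)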

  19. [Methodological design of the National Health and Nutrition Survey 2016].

    PubMed

    Romero-Martínez, Martín; Shamah-Levy, Teresa; Cuevas-Nasu, Lucía; Gómez-Humarán, Ignacio Méndez; Gaona-Pineda, Elsa Berenice; Gómez-Acosta, Luz María; Rivera-Dommarco, Juan Ángel; Hernández-Ávila, Mauricio

    2017-01-01

    To describe the design methodology of the halfway National Health and Nutrition Survey (Ensanut-MC) 2016. The Ensanut-MC is a national probabilistic survey whose target population is the inhabitants of private households in Mexico. The sample size was determined to make inferences on urban and rural areas in four regions. We describe the main design elements: target population, topics of study, sampling procedure, measurement procedure and logistics organization. A final sample of 9 479 completed household interviews and 16 591 individual interviews was obtained. The response rate for households was 77.9%, and the response rate for individuals was 91.9%. The Ensanut-MC probabilistic design allows valid statistical inferences about parameters of interest for Mexico's public health and nutrition, specifically on overweight, obesity and diabetes mellitus. The updated information also supports the monitoring, updating and formulation of new policies and priority programs.

  20. The Contribution of Human Factors in Military System Development: Methodological Considerations

    DTIC Science & Technology

    1980-07-01

    Only fragments of the abstract are recoverable: they list methodological topics (risk/uncertainty analysis, project scoring, utility scales, relevance tree techniques/reverse factor analysis, and computer simulation), cite work by Souder, W.E. on the effectiveness of mathematical models and scoring methodologies for R&D project selection (Management Science, April 1973), and include index terms covering proficiency test scores, radiation effects on aircrew performance, and reaction time.

  1. Applying the methodology of Design of Experiments to stability studies: a Partial Least Squares approach for evaluation of drug stability.

    PubMed

    Jordan, Nika; Zakrajšek, Jure; Bohanec, Simona; Roškar, Robert; Grabnar, Iztok

    2018-05-01

    The aim of the present research is to show that the methodology of Design of Experiments can be applied to stability data evaluation, as they can be seen as multi-factor and multi-level experimental designs. Linear regression analysis is usually an approach for analyzing stability data, but multivariate statistical methods could also be used to assess drug stability during the development phase. Data from a stability study for a pharmaceutical product with hydrochlorothiazide (HCTZ) as an unstable drug substance was used as a case example in this paper. The design space of the stability study was modeled using Umetrics MODDE 10.1 software. We showed that a Partial Least Squares model could be used for a multi-dimensional presentation of all data generated in a stability study and for determination of the relationship among factors that influence drug stability. It might also be used for stability predictions and potentially for the optimization of the extent of stability testing needed to determine shelf life and storage conditions, which would be time and cost-effective for the pharmaceutical industry.
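
    Treating the stability protocol as a designed experiment and fitting a Partial Least Squares model to it can be sketched with scikit-learn, as below. The factor columns (time, temperature, relative humidity), the responses and every numeric value are fabricated for illustration; they are not the HCTZ data set analysed with MODDE in the study.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      # Fabricated stability layout: factors are time (months), temperature (deg C)
      # and relative humidity (%); responses are assay (%) and a degradant (%).
      X = np.array([[0, 25, 60], [3, 25, 60], [6, 25, 60], [12, 25, 60],
                    [0, 40, 75], [3, 40, 75], [6, 40, 75],
                    [0, 30, 65], [6, 30, 65], [12, 30, 65]], dtype=float)
      Y = np.array([[100.1, 0.05], [99.6, 0.12], [99.0, 0.25], [98.1, 0.48],
                    [100.0, 0.06], [98.4, 0.55], [96.9, 1.10],
                    [100.2, 0.04], [98.8, 0.33], [97.6, 0.70]])

      pls = PLSRegression(n_components=2)
      pls.fit(X, Y)
      print("X loadings:\n", np.round(pls.x_loadings_, 3))
      print("prediction at 9 months, 30 deg C, 65 %RH:",
            np.round(pls.predict([[9.0, 30.0, 65.0]])[0], 2))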

  2. The Role of DNA Methylation in Cardiovascular Risk and Disease: Methodological Aspects, Study Design, and Data Analysis for Epidemiological Studies.

    PubMed

    Zhong, Jia; Agha, Golareh; Baccarelli, Andrea A

    2016-01-08

    Epidemiological studies have demonstrated that genetic, environmental, behavioral, and clinical factors contribute to cardiovascular disease development. How these risk factors interact at the cellular level to cause cardiovascular disease is not well known. Epigenetic epidemiology enables researchers to explore critical links between genomic coding, modifiable exposures, and manifestation of disease phenotype. One epigenetic link, DNA methylation, is potentially an important mechanism underlying these associations. In the past decade, there has been a significant increase in the number of epidemiological studies investigating cardiovascular risk factors and outcomes in relation to DNA methylation, but many gaps remain in our understanding of the underlying cause and biological implications. In this review, we provide a brief overview of the biology and mechanisms of DNA methylation and its role in cardiovascular disease. In addition, we summarize the current evidence base in epigenetic epidemiology studies relevant to cardiovascular health and disease and discuss the limitations, challenges, and future directions of the field. Finally, we provide guidelines for well-designed epigenetic epidemiology studies, with particular focus on methodological aspects, study design, and analytical challenges. © 2016 American Heart Association, Inc.

  3. 24 CFR 598.305 - Designation factors.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 24 Housing and Urban Development 3 2014-04-01 2013-04-01 true Designation factors. 598.305 Section 598.305 Housing and Urban Development Regulations Relating to Housing and Urban Development (Continued... § 598.305 Designation factors. In choosing among nominated urban areas eligible for designation, the...

  4. 24 CFR 598.305 - Designation factors.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 24 Housing and Urban Development 3 2013-04-01 2013-04-01 false Designation factors. 598.305 Section 598.305 Housing and Urban Development Regulations Relating to Housing and Urban Development... Designation Process § 598.305 Designation factors. In choosing among nominated urban areas eligible for...

  5. 24 CFR 598.305 - Designation factors.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 24 Housing and Urban Development 3 2012-04-01 2012-04-01 false Designation factors. 598.305 Section 598.305 Housing and Urban Development Regulations Relating to Housing and Urban Development... Designation Process § 598.305 Designation factors. In choosing among nominated urban areas eligible for...

  6. The engagement of children with disabilities in health-related technology design processes: identifying methodology.

    PubMed

    Allsop, Matthew J; Holt, Raymond J; Levesley, Martin C; Bhakta, Bipinchandra

    2010-01-01

    This review aims to identify research methodology that is suitable for involving children with disabilities in the design of healthcare technology, such as assistive technology and rehabilitation equipment. A review of the literature included the identification of methodology that is available from domains outside of healthcare and suggested a selection of available methods. The need to involve end users within the design of healthcare technology was highlighted, with particular attention to the need for greater levels of participation from children with disabilities within all healthcare research. Issues that may arise when trying to increase such involvement included the need to consider communication via feedback and tailored information, the need to measure levels of participation occurring in current research, and caution regarding the use of proxy information. Additionally, five suitable methods were highlighted that are available for use with children with disabilities in the design of healthcare technology. The methods identified in the review need to be put into practice to establish effective and, if necessary, novel ways of designing healthcare technology when end users are children with disabilities.

  7. Beyond Needs Analysis: Soft Systems Methodology for Meaningful Collaboration in EAP Course Design

    ERIC Educational Resources Information Center

    Tajino, Akira; James, Robert; Kijima, Kyoichi

    2005-01-01

    Designing an EAP course requires collaboration among various concerned stakeholders, including students, subject teachers, institutional administrators and EAP teachers themselves. While needs analysis is often considered fundamental to EAP, alternative research methodologies may be required to facilitate meaningful collaboration between these…

  8. Development of tf coil support concepts by design methodology in the case of a Bitter-type magnet. [Bitter-type magnets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brossmann, U.B.

    1981-01-01

    The application of the methodological design approach is demonstrated for the development of support concepts in the case of a Bitter-type magnet designed for a compact tokamak experiment aiming at ignition of a DT plasma. With this methodology, all boundary conditions and design criteria are more easily satisfied in a technically and economically sound way.

  9. The Atomic Intrinsic Integration Approach: A Structured Methodology for the Design of Games for the Conceptual Understanding of Physics

    ERIC Educational Resources Information Center

    Echeverria, Alejandro; Barrios, Enrique; Nussbaum, Miguel; Amestica, Matias; Leclerc, Sandra

    2012-01-01

    Computer simulations combined with games have been successfully used to teach conceptual physics. However, there is no clear methodology for guiding the design of these types of games. To remedy this, we propose a structured methodology for the design of conceptual physics games that explicitly integrates the principles of the intrinsic…

  10. A system-of-systems modeling methodology for strategic general aviation design decision-making

    NASA Astrophysics Data System (ADS)

    Won, Henry Thome

    General aviation has long been studied as a means of providing an on-demand "personal air vehicle" that bypasses the traffic at major commercial hubs. This thesis continues this research through development of a system of systems modeling methodology applicable to the selection of synergistic product concepts, market segments, and business models. From the perspective of the conceptual design engineer, the design and selection of future general aviation aircraft is complicated by the definition of constraints and requirements, and the tradeoffs among performance and cost aspects. Qualitative problem definition methods have been utilized, although their accuracy in determining specific requirement and metric values is uncertain. In industry, customers are surveyed, and business plans are created through a lengthy, iterative process. In recent years, techniques have developed for predicting the characteristics of US travel demand based on travel mode attributes, such as door-to-door time and ticket price. As of yet, these models treat the contributing systems---aircraft manufacturers and service providers---as independently variable assumptions. In this research, a methodology is developed which seeks to build a strategic design decision making environment through the construction of a system of systems model. The demonstrated implementation brings together models of the aircraft and manufacturer, the service provider, and most importantly the travel demand. Thus represented is the behavior of the consumers and the reactive behavior of the suppliers---the manufacturers and transportation service providers---in a common modeling framework. The results indicate an ability to guide the design process---specifically the selection of design requirements---through the optimization of "capability" metrics. Additionally, results indicate the ability to find synergetic solutions, that is solutions in which two systems might collaborate to achieve a better result than acting

  11. A Step-by-Step Design Methodology for a Base Case Vanadium Redox-Flow Battery

    ERIC Educational Resources Information Center

    Moore, Mark; Counce, Robert M.; Watson, Jack S.; Zawodzinski, Thomas A.; Kamath, Haresh

    2012-01-01

    The purpose of this work is to develop an evolutionary procedure to be used by Chemical Engineering students for the base-case design of a Vanadium Redox-Flow Battery. The design methodology is based on the work of Douglas (1985) and provides a profitability analysis at each decision level so that more profitable alternatives and directions can be…

  12. Analysis of Combined Data from Heterogeneous Study Designs: A Methodological Proposal from the Patient Navigation Research program

    PubMed Central

    Roetzheim, Richard G.; Freund, Karen M.; Corle, Don K.; Murray, David M.; Snyder, Frederick R.; Kronman, Andrea C.; Jean-Pierre, Pascal; Raich, Peter C.; Holden, Alan E. C.; Darnell, Julie S.; Warren-Mears, Victoria; Patierno, Steven; PNRP Design and Analysis Committee

    2013-01-01

    Background: The Patient Navigation Research Program (PNRP) is a cooperative effort of nine research projects, each employing its own unique study design. To evaluate projects such as PNRP, it is desirable to perform a pooled analysis to increase power relative to the individual projects. There is no agreed-upon prospective methodology, however, for analyzing combined data arising from different study designs. Expert opinions were thus solicited from members of the PNRP Design and Analysis Committee. Purpose: To review possible methodologies for analyzing combined data arising from heterogeneous study designs. Methods: The Design and Analysis Committee critically reviewed the pros and cons of five potential methods for analyzing combined PNRP project data. Conclusions were based on simple consensus. The five approaches reviewed included: 1) analyzing and reporting each project separately; 2) combining data from all projects and performing an individual-level analysis; 3) pooling data from projects having similar study designs; 4) analyzing pooled data using a prospective meta-analytic technique; 5) analyzing pooled data utilizing a novel simulated group-randomized design. Results: Methodologies varied in their ability to incorporate data from all PNRP projects, to appropriately account for differing study designs, and in their impact from differing project sample sizes. Limitations: The conclusions reached were based on expert opinion and not derived from actual analyses performed. Conclusions: The ability to analyze pooled data arising from differing study designs may provide pertinent information to inform programmatic, budgetary, and policy perspectives. Multi-site community-based research may not lend itself well to the more stringent explanatory and pragmatic standards of a randomized controlled trial design. Given our growing interest in community-based population research, the challenges inherent in the analysis of heterogeneous study design are likely to become

  13. Review of Recent Methodological Developments in Group-Randomized Trials: Part 1-Design.

    PubMed

    Turner, Elizabeth L; Li, Fan; Gallis, John A; Prague, Melanie; Murray, David M

    2017-06-01

    In 2004, Murray et al. reviewed methodological developments in the design and analysis of group-randomized trials (GRTs). We have highlighted the developments of the past 13 years in design with a companion article to focus on developments in analysis. As a pair, these articles update the 2004 review. We have discussed developments in the topics of the earlier review (e.g., clustering, matching, and individually randomized group-treatment trials) and in new topics, including constrained randomization and a range of randomized designs that are alternatives to the standard parallel-arm GRT. These include the stepped-wedge GRT, the pseudocluster randomized trial, and the network-randomized GRT, which, like the parallel-arm GRT, require clustering to be accounted for in both their design and analysis.

  14. Creating Innovative Research Designs: The 10-Year Methodological Think Tank Case Study

    PubMed Central

    Katerndahl, David; Crabtree, Benjamin

    2006-01-01

    PURPOSE Addressing important but complex research questions often necessitates the creation of innovative mixed methods designs. This report describes an approach to developing research designs for studying important but methodologically challenging research questions. METHODS The Methodological Think Tank has been held annually in conjunction with the Primary Care Research Methods and Statistics Conference in San Antonio since 1994. A group of 3 to 4 methodologists with expertise balanced between quantitative and qualitative backgrounds is invited by the think tank coordinators to serve on a 2-day think tank to discuss a research question selected from those submitted in response to a call for proposals. During the first half-day, these experts explore the content area with the investigator, often challenging beliefs and assumptions. During the second half-day, the think tank participants systematically prune potential approaches until a desirable research method is identified. RESULTS To date, the most recent 7 think tanks have produced fundable research designs, with 1 being funded by a K award and 4 by R01 grants. All participating investigators attributed much of their success to think tank participation. Lessons learned include (1) the importance of careful selection of participating methodologists, (2) all think tank communities of inquiry must go through 4 stages of development from pseudocommunity to community, and (3) the critical importance of listening by the investigator. CONCLUSION Researchers and academic departments could use this process locally to develop innovative research designs. PMID:17003146

  15. Human Factors Considerations in System Design

    NASA Technical Reports Server (NTRS)

    Mitchell, C. M. (Editor); Vanbalen, P. M. (Editor); Moe, K. L. (Editor)

    1983-01-01

    Human factors considerations in system design were examined. Human factors in automated command and control, in the efficiency of the human-computer interface, and in system effectiveness are outlined. The following topics are discussed: human factors aspects of control room design; design of interactive systems; human-computer dialogue, interaction tasks and techniques; guidelines on ergonomic aspects of control rooms and highly automated environments; system engineering for control by humans; conceptual models of information processing; information display and interaction in real-time environments.

  16. SSME Investment in Turbomachinery Inducer Impeller Design Tools and Methodology

    NASA Technical Reports Server (NTRS)

    Zoladz, Thomas; Mitchell, William; Lunde, Kevin

    2010-01-01

    Within the rocket engine industry, SSME turbomachines are the de facto standards of success with regard to meeting aggressive performance requirements under challenging operational environments. Over the Shuttle era, SSME invested heavily in the national inducer-impeller design infrastructure. While both low- and high-pressure turbopump failure/anomaly resolution efforts spurred some of these investments, the SSME program was also a major beneficiary of key areas of turbomachinery inducer-impeller research conducted outside of flight manifest pressures. Over the past several decades, key turbopump internal environments have been interrogated via highly instrumented hot-fire and cold-flow testing. Likewise, SSME has sponsored the advancement of time-accurate and cavitating inducer-impeller computational fluid dynamics (CFD) tools. Together, these investments have led to a better understanding of the complex internal flow fields within aggressive, high-performing inducers and impellers. New design tools and methodologies have evolved that are intended to provide confident blade designs striking an appropriate balance between performance and self-induced load management.

  17. Automated Methodologies for the Design of Flow Diagrams for Development and Maintenance Activities

    NASA Astrophysics Data System (ADS)

    Shivanand M., Handigund; Shweta, Bhat

    The Software Requirements Specification (SRS) of an organization is a text document prepared by strategic management incorporating the requirements of the organization. These requirements of the ongoing business/project development process involve the software tools, the hardware devices, the manual procedures, the application programs, and the communication commands. These components are appropriately ordered, for achieving the mission of the concerned process in both project development and ongoing business processes, in different flow diagrams, viz. the activity chart, workflow diagram, activity diagram, component diagram, and deployment diagram. This paper proposes two generic, automatic methodologies for the design of the various flow diagrams of (i) project development activities and (ii) the ongoing business process. The methodologies also resolve the ensuing deadlocks in the flow diagrams and determine the critical paths for the activity chart. Though the two methodologies are independent, each complements the other in authenticating its correctness and completeness.

  18. Application of optimal design methodologies in clinical pharmacology experiments.

    PubMed

    Ogungbenro, Kayode; Dokoumetzidis, Aristides; Aarons, Leon

    2009-01-01

    Pharmacokinetic and pharmacodynamic data are often analysed by mixed-effects modelling techniques (also known as population analysis), which have become a standard tool in the pharmaceutical industry for drug development. The last 10 years have witnessed considerable interest in the application of experimental design theory to population pharmacokinetic and pharmacodynamic experiments. Design of population pharmacokinetic experiments involves selection and a careful balance of a number of design factors. Optimal design theory uses prior information about the model and parameter estimates to optimize a function of the Fisher information matrix to obtain the best combination of the design factors. This paper provides a review of the different approaches that have been described in the literature for optimal design of population pharmacokinetic and pharmacodynamic experiments. It describes the options that are available and highlights some of the issues that could be of concern in practical application. It also discusses areas of application of optimal design theory in clinical pharmacology experiments. It is expected that as awareness of the benefits of this approach increases, more people will embrace it, ultimately leading to more efficient population pharmacokinetic and pharmacodynamic experiments and helping to reduce both cost and time during drug development. Copyright (c) 2008 John Wiley & Sons, Ltd.
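
    As an illustration of the core computation behind these approaches, the sketch below (Python) evaluates a local D-optimality criterion, the determinant of the Fisher information matrix, over candidate sampling schedules for an assumed one-compartment oral-absorption model; the model, parameter values, error structure, and candidate times are illustrative assumptions rather than anything taken from the review.

        # Local D-optimal selection of sampling times for an assumed PK model.
        import itertools
        import numpy as np

        ka, ke, V = 1.0, 0.2, 10.0      # assumed prior parameter estimates
        dose, sigma = 100.0, 0.5        # assumed dose and additive residual SD

        def conc(t, ka, ke, V):
            # One-compartment model with first-order absorption (illustrative only)
            return dose * ka / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

        def fim(times):
            # Sensitivities by central finite differences; FIM = J^T J / sigma^2 for additive error
            theta = np.array([ka, ke, V])
            t = np.asarray(times, dtype=float)
            J = np.zeros((len(t), 3))
            for j in range(3):
                d = np.zeros(3)
                d[j] = 1e-6 * theta[j]
                J[:, j] = (conc(t, *(theta + d)) - conc(t, *(theta - d))) / (2 * d[j])
            return J.T @ J / sigma**2

        candidates = [0.5, 1, 2, 4, 8, 12, 24]          # candidate sampling times (h)
        best = max(itertools.combinations(candidates, 3),
                   key=lambda ts: np.linalg.det(fim(ts)))
        print("D-optimal 3-point schedule (h):", best)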

  19. Development and Application of a Clinical Microsystem Simulation Methodology for Human Factors-Based Research of Alarm Fatigue.

    PubMed

    Kobayashi, Leo; Gosbee, John W; Merck, Derek L

    2017-07-01

    (1) To develop a clinical microsystem simulation methodology for alarm fatigue research with a human factors engineering (HFE) assessment framework and (2) to explore its application to the comparative examination of different approaches to patient monitoring and provider notification. Problems with the design, implementation, and real-world use of patient monitoring systems result in alarm fatigue. A multidisciplinary team is developing an open-source tool kit to promote bedside informatics research and mitigate alarm fatigue. Simulation, HFE, and computer science experts created a novel simulation methodology to study alarm fatigue. Featuring multiple interconnected simulated patient scenarios with scripted timeline, "distractor" patient care tasks, and triggered true and false alarms, the methodology incorporated objective metrics to assess provider and system performance. Developed materials were implemented during institutional review board-approved study sessions that assessed and compared an experimental multiparametric alerting system with a standard monitor telemetry system for subject response, use characteristics, and end-user feedback. A four-patient simulation setup featuring objective metrics for participant task-related performance and response to alarms was developed along with accompanying structured HFE assessment (questionnaire and interview) for monitor systems use testing. Two pilot and four study sessions with individual nurse subjects elicited true alarm and false alarm responses (including diversion from assigned tasks) as well as nonresponses to true alarms. In-simulation observation and subject questionnaires were used to test the experimental system's approach to suppressing false alarms and alerting providers. A novel investigative methodology applied simulation and HFE techniques to replicate and study alarm fatigue in controlled settings for systems assessment and experimental research purposes.

  20. Practicing universal design to actual hand tool design process.

    PubMed

    Lin, Kai-Chieh; Wu, Chih-Fu

    2015-09-01

    UD evaluation principles are difficult to implement in product design. This study proposes a methodology for implementing UD in the design process through user participation. The original UD principles and user experience are used to develop the evaluation items, and differences between product types are taken into account. Factor analysis and Quantification Theory Type I were used to eliminate evaluation items considered inappropriate and to examine the relationship between evaluation items and product design factors. Product design specifications were established for verification. The results showed that converting user evaluations into crucial design verification factors, through a generalized evaluation scale based on product attributes, and applying the design factors in product design can improve users' UD evaluations. The design process of this study is expected to contribute to user-centered UD application. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  1. Factors Contributing to Institutions Achieving Environmental Sustainability

    ERIC Educational Resources Information Center

    James, Matthew; Card, Karen

    2012-01-01

    Purpose: The purpose of this paper is to determine what factors contributed to three universities achieving environmental sustainability. Design/methodology/approach: A case study methodology was used to determine how each factor contributed to the institutions' sustainability. Site visits, fieldwork, document reviews, and interviews with…

  2. A Design Methodology for Medical Processes.

    PubMed

    Ferrante, Simona; Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara

    2016-01-01

    Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient's needs, the uncertainty of the patient's response, and the indeterminacy of the patient's compliance with treatment. Also, the multiple actors involved in the patient's care need clear and transparent communication to ensure care coordination. In this paper, we propose a methodology to model healthcare processes in order to break down complexity and provide transparency. The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests that was also implemented. Despite being only an example, our case study showed the ability of process modeling to answer the actual needs in healthcare practices. Independently of the medical domain in which the modeling effort is done, the proposed methodology is useful to create high-quality models, and to detect and take into account relevant and tricky situations that can occur during process execution.

  3. A Design Methodology for Medical Processes

    PubMed Central

    Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara

    2016-01-01

    Summary. Background: Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient’s needs, the uncertainty of the patient’s response, and the indeterminacy of the patient’s compliance with treatment. Also, the multiple actors involved in the patient’s care need clear and transparent communication to ensure care coordination. Objectives: In this paper, we propose a methodology to model healthcare processes in order to break down complexity and provide transparency. Methods: The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. Results: The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests that was also implemented. Conclusions: Despite being only an example, our case study showed the ability of process modeling to answer the actual needs in healthcare practices. Independently of the medical domain in which the modeling effort is done, the proposed methodology is useful to create high-quality models, and to detect and take into account relevant and tricky situations that can occur during process execution. PMID:27081415

  4. Nonlinear flight control design using backstepping methodology

    NASA Astrophysics Data System (ADS)

    Tran, Thanh Trung

    The subject of nonlinear flight control design using backstepping control methodology is investigated in the dissertation research presented here. Control design methods based on nonlinear models of the dynamic system provide higher utility and versatility because the design model more closely matches the physical system behavior. Obtaining requisite model fidelity is only half of the overall design process, however. Design of the nonlinear control loops can lessen the effects of nonlinearity, or even exploit nonlinearity, to achieve higher levels of closed-loop stability, performance, and robustness. The goal of the research is to improve control quality for a general class of strict-feedback dynamic systems and provide flight control architectures to augment the aircraft motion. The research is divided into two parts: theoretical control development for the strict-feedback form of nonlinear dynamic systems and application of the proposed theory to nonlinear flight dynamics. In the first part, the research is built on two components: transforming the nonlinear dynamic model to a canonical strict-feedback form and then applying backstepping control theory to the canonical model. The research considers a process to determine when this transformation is possible and, when it is, a systematic process to transform the model where practical. When this is not the case, certain modeling assumptions are explored to facilitate the transformation. After achieving the canonical form, a systematic design procedure for formulating a backstepping control law is explored in the research. Starting with the simplest subsystem and ending with the full system, pseudo-control concepts based on Lyapunov control functions are used to control each successive subsystem. Typically each pseudo control must be solved from a nonlinear algebraic equation. At the end of this process, the physical control input must be re-expressed in terms of the physical states by
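
    As a hedged illustration of the recursive pseudo-control construction described above (a textbook two-state example, not one taken from the dissertation), the backstepping law for a strict-feedback pair can be written as:

        \dot{x}_1 = f_1(x_1) + g_1(x_1)\,x_2, \qquad \dot{x}_2 = f_2(x_1, x_2) + g_2(x_1, x_2)\,u
        \alpha_1 = \frac{1}{g_1}\bigl(-k_1 x_1 - f_1\bigr), \qquad z_2 = x_2 - \alpha_1
        u = \frac{1}{g_2}\bigl(-k_2 z_2 - f_2 + \dot{\alpha}_1 - g_1 x_1\bigr)
        V = \tfrac{1}{2}x_1^2 + \tfrac{1}{2}z_2^2 \;\Rightarrow\; \dot{V} = -k_1 x_1^2 - k_2 z_2^2 \le 0

    Here x_2 acts as the pseudo control for the x_1 subsystem, and the Lyapunov function V certifies closed-loop stability for any gains k_1, k_2 > 0, assuming g_1 and g_2 remain nonzero over the operating envelope.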

  5. 21st Century Cognitive Behavioural Therapy for Anger: A Systematic Review of Research Design, Methodology and Outcome.

    PubMed

    Fernandez, Ephrem; Malvaso, Catia; Day, Andrew; Guharajan, Deepan

    2018-07-01

    Past reviews of cognitive behavioural therapy (CBT) for anger have focused on outcome in specific subpopulations, with few questions posed about research design and methodology. Since the turn of the century, there has been a surge of methodologically varied studies awaiting systematic review. The basic aim was to review this recent literature in terms of trends and patterns in research design, operationalization of anger, and covariates such as social desirability bias (SDB). Also of interest was clinical outcome. After successive culling, 42 relevant studies were retained. These were subjected to a rapid evidence assessment (REA) with special attention to design (ranked on the Scientific Methods Scale) measurement methodology (self-monitored behaviour, anger questionnaires, and others' ratings), SDB assessment, and statistical versus clinical significance. The randomized controlled trial characterized 60% of the studies, and the State Trait Anger Expression Inventory was the dominant measure of anger. All but one of the studies reported statistically significant outcome, and all but one of the 21 studies evaluating clinical significance laid claim to it. The one study with neither statistical nor clinical significance was the only one that had assessed and corrected for SDB. Measures remain relatively narrow in scope, but study designs have improved, and the outcomes suggest efficacy and clinical effectiveness. In conjunction with previous findings of an inverse relationship between anger and SDB, the results raise the possibility that the favourable picture of CBT for anger may need closer scrutiny with SDB and other methodological details in mind.

  6. Unified methodology for airport pavement analysis and design. Vol. 1, state of the art

    DOT National Transportation Integrated Search

    1991-06-01

    This report presents an assessment of the state of the art of airport pavement analysis : and design. The objective is to identify those areas in current airport pavement : analysis methodology that need to be substantially improved from the perspect...

  7. LOX/hydrocarbon rocket engine analytical design methodology development and validation. Volume 1: Executive summary and technical narrative

    NASA Technical Reports Server (NTRS)

    Pieper, Jerry L.; Walker, Richard E.

    1993-01-01

    During the past three decades, an enormous amount of resources were expended in the design and development of Liquid Oxygen/Hydrocarbon and Hydrogen (LOX/HC and LOX/H2) rocket engines. A significant portion of these resources were used to develop and demonstrate the performance and combustion stability for each new engine. During these efforts, many analytical and empirical models were developed that characterize design parameters and combustion processes that influence performance and stability. Many of these models are suitable as design tools, but they have not been assembled into an industry-wide usable analytical design methodology. The objective of this program was to assemble existing performance and combustion stability models into a usable methodology capable of producing high performing and stable LOX/hydrocarbon and LOX/hydrogen propellant booster engines.

  8. Enhanced CAX Architecture, Design and Methodology - SPHINX (Architecture, definition et methodologie ameliorees des exercices assistes par ordinateur (CAX) - SPHINX)

    DTIC Science & Technology

    2016-08-01

    STO Technical Report TR-MSG-106: Enhanced CAX Architecture, Design and Methodology – SPHINX (Architecture, définition et méthodologie améliorées des exercices assistés par ordinateur (CAX) – SPHINX). …transition, application and field-testing, experimentation and a range of related scientific activities that include systems engineering, operational

  9. Methodology for designing and manufacturing complex biologically inspired soft robotic fluidic actuators: prosthetic hand case study.

    PubMed

    Thompson-Bean, E; Das, R; McDaid, A

    2016-10-31

    We present a novel methodology for the design and manufacture of complex biologically inspired soft robotic fluidic actuators. The methodology is applied to the design and manufacture of a prosthetic for the hand. Real human hands are scanned to produce a 3D model of a finger, and pneumatic networks are implemented within it to produce a biomimetic bending motion. The finger is then partitioned into material sections, and a genetic algorithm based optimization, using finite element analysis, is employed to discover the optimal material for each section. This is based on two biomimetic performance criteria. Two sets of optimizations using two material sets are performed. Promising optimized material arrangements are fabricated using two techniques to validate the optimization routine, and the fabricated and simulated results are compared. We find that the optimization is successful in producing biomimetic soft robotic fingers and that fabrication of the fingers is possible. Limitations and paths for development are discussed. This methodology can be applied for other fluidic soft robotic devices.
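
    A minimal sketch of the optimization loop follows (Python); the binary genome assigns one of two candidate materials to each finger section, and the toy fitness function merely stands in for the finite-element evaluation of the paper's biomimetic performance criteria, which are not reproduced here.

        # Toy genetic-algorithm loop assigning one of two materials to each finger section.
        import random

        N_SECTIONS, POP, GENS = 8, 20, 40
        target = [0, 1, 1, 0, 1, 0, 0, 1]       # assumed "ideal" arrangement for the toy fitness

        def fitness(genome):
            # Placeholder for the FEA-based biomimetic score used in the paper
            return sum(g == t for g, t in zip(genome, target))

        def evolve():
            pop = [[random.randint(0, 1) for _ in range(N_SECTIONS)] for _ in range(POP)]
            for _ in range(GENS):
                pop.sort(key=fitness, reverse=True)
                parents = pop[:POP // 2]
                children = []
                while len(children) < POP - len(parents):
                    a, b = random.sample(parents, 2)
                    cut = random.randrange(1, N_SECTIONS)
                    child = a[:cut] + b[cut:]           # one-point crossover
                    if random.random() < 0.1:           # mutation
                        i = random.randrange(N_SECTIONS)
                        child[i] ^= 1
                    children.append(child)
                pop = parents + children
            return max(pop, key=fitness)

        print("best material arrangement:", evolve())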

  10. A Methodological Framework for Instructional Design Model Development: Critical Dimensions and Synthesized Procedures

    ERIC Educational Resources Information Center

    Lee, Jihyun; Jang, Seonyoung

    2014-01-01

    Instructional design (ID) models have been developed to promote understandings of ID reality and guide ID performance. As the number and diversity of ID practices grows, implicit doubts regarding the reliability, validity, and usefulness of ID models suggest the need for methodological guidance that would help to generate ID models that are…

  11. Methodology for application of field rainfall simulator to revise c-factor database for conditions of the Czech Republic

    NASA Astrophysics Data System (ADS)

    Neumann, Martin; Dostál, Tomáš; Krása, Josef; Kavka, Petr; Davidová, Tereza; Brant, Václav; Kroulík, Milan; Mistr, Martin; Novotný, Ivan

    2016-04-01

    The presentation will introduce a methodology for determination of the crop and cover management factor (C-factor) for the Universal Soil Loss Equation (USLE) using a field rainfall simulator. The aim of the project is to determine the C-factor value for the different phenophases of the main crops of the Central European region, while also taking into account the different agrotechnical methods. By using the field rainfall simulator, it is possible to perform the measurements in specific phenophases, which is otherwise difficult to execute due to the variability and fortuity of natural rainfall. Due to the number of measurements needed, two identical simulators will be used, operated by two independent teams, with a coordinated methodology. The methodology will mainly specify the length of simulation, the rainfall intensity, and the sampling technique. The presentation includes a more detailed account of the methods selected. Due to the wide range of variable crops and soils, it is not possible to execute the measurements for all possible combinations. We therefore decided to perform the measurements for previously selected combinations of soils, crops, and agrotechnologies that are the most common in the Czech Republic. During the experiments, the volume of the surface runoff and the amount of sediment will be measured in their temporal distribution, as well as several other important parameters. The key values of the 3D matrix of combinations of crop, agrotechnique, and soil will be determined experimentally. The remaining values will be determined by interpolation or by a model analogy. There are several methods used for C-factor calculation from measured experimental data. Some of these are not suitable to be used considering the type of data gathered. The presentation will discuss the benefits and drawbacks of these methods, as well as the final design of the method used. The problems concerning the selection of a relevant measurement method as well as the final
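
    For context, the C-factor enters the USLE multiplicatively and is, in essence, a soil-loss ratio against a bare-fallow reference plot measured under the same rainfall, slope, and soil conditions (standard USLE notation, not the project's own symbols):

        A = R \, K \, L \, S \, C \, P, \qquad C \approx \frac{A_{\text{crop}}}{A_{\text{fallow}}}

    where A is the computed soil loss, R the rainfall erosivity factor, K the soil erodibility factor, L and S the slope length and steepness factors, and P the support practice factor.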

  12. HPCC Methodologies for Structural Design and Analysis on Parallel and Distributed Computing Platforms

    NASA Technical Reports Server (NTRS)

    Farhat, Charbel

    1998-01-01

    In this grant, we have proposed a three-year research effort focused on developing High Performance Computation and Communication (HPCC) methodologies for structural analysis on parallel processors and clusters of workstations, with emphasis on reducing the structural design cycle time. Besides consolidating and further improving the FETI solver technology to address plate and shell structures, we have proposed to tackle the following design related issues: (a) parallel coupling and assembly of independently designed and analyzed three-dimensional substructures with non-matching interfaces, (b) fast and smart parallel re-analysis of a given structure after it has undergone design modifications, (c) parallel evaluation of sensitivity operators (derivatives) for design optimization, and (d) fast parallel analysis of mildly nonlinear structures. While our proposal was accepted, support was provided only for one year.

  13. Design methodology: edgeless 3D ASICs with complex in-pixel processing for pixel detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fahim Farah, Fahim Farah; Deptuch, Grzegorz W.; Hoff, James R.

    The design methodology for the development of 3D integrated edgeless pixel detectors with in-pixel processing using Electronic Design Automation (EDA) tools is presented. A large area 3 tier 3D detector with one sensor layer and two ASIC layers containing one analog and one digital tier, is built for x-ray photon time of arrival measurement and imaging. A full custom analog pixel is 65μm x 65μm. It is connected to a sensor pixel of the same size on one side, and on the other side it has approximately 40 connections to the digital pixel. A 32 x 32 edgeless array without any peripheral functional blocks constitutes a sub-chip. The sub-chip is an indivisible unit, which is further arranged in a 6 x 6 array to create the entire 1.248cm x 1.248cm ASIC. Each chip has 720 bump-bond I/O connections, on the back of the digital tier to the ceramic PCB. All the analog tier power and biasing is conveyed through the digital tier from the PCB. The assembly has no peripheral functional blocks, and hence the active area extends to the edge of the detector. This was achieved by using a few flavors of almost identical analog pixels (minimal variation in layout) to allow for peripheral biasing blocks to be placed within pixels. The 1024 pixels within a digital sub-chip array have a variety of full custom, semi-custom and automated timing driven functional blocks placed together. The methodology uses a modified mixed-mode on-top digital implementation flow to not only harness the tool efficiency for timing and floor-planning but also to maintain designer control over compact parasitically aware layout. The methodology uses the Cadence design platform, however it is not limited to this tool.

  14. Design methodology: edgeless 3D ASICs with complex in-pixel processing for pixel detectors

    NASA Astrophysics Data System (ADS)

    Fahim, Farah; Deptuch, Grzegorz W.; Hoff, James R.; Mohseni, Hooman

    2015-08-01

    The design methodology for the development of 3D integrated edgeless pixel detectors with in-pixel processing using Electronic Design Automation (EDA) tools is presented. A large area 3 tier 3D detector with one sensor layer and two ASIC layers containing one analog and one digital tier, is built for x-ray photon time of arrival measurement and imaging. A full custom analog pixel is 65μm x 65μm. It is connected to a sensor pixel of the same size on one side, and on the other side it has approximately 40 connections to the digital pixel. A 32 x 32 edgeless array without any peripheral functional blocks constitutes a sub-chip. The sub-chip is an indivisible unit, which is further arranged in a 6 x 6 array to create the entire 1.248cm x 1.248cm ASIC. Each chip has 720 bump-bond I/O connections, on the back of the digital tier to the ceramic PCB. All the analog tier power and biasing is conveyed through the digital tier from the PCB. The assembly has no peripheral functional blocks, and hence the active area extends to the edge of the detector. This was achieved by using a few flavors of almost identical analog pixels (minimal variation in layout) to allow for peripheral biasing blocks to be placed within pixels. The 1024 pixels within a digital sub-chip array have a variety of full custom, semi-custom and automated timing driven functional blocks placed together. The methodology uses a modified mixed-mode on-top digital implementation flow to not only harness the tool efficiency for timing and floor-planning but also to maintain designer control over compact parasitically aware layout. The methodology uses the Cadence design platform, however it is not limited to this tool.

  15. Application of Adjoint Methodology in Various Aspects of Sonic Boom Design

    NASA Technical Reports Server (NTRS)

    Rallabhandi, Sriram K.

    2014-01-01

    One of the advances in computational design has been the development of adjoint methods allowing efficient calculation of sensitivities in gradient-based shape optimization. This paper discusses two new applications of adjoint methodology that have been developed to aid in sonic boom mitigation exercises. In the first, equivalent area targets are generated using adjoint sensitivities of selected boom metrics. These targets may then be used to drive the vehicle shape during optimization. The second application is the computation of adjoint sensitivities of boom metrics on the ground with respect to parameters such as flight conditions, propagation sampling rate, and selected inputs to the propagation algorithms. These sensitivities enable the designer to make more informed selections of flight conditions at which the chosen cost functionals are less sensitive.

  16. LOX/hydrocarbon rocket engine analytical design methodology development and validation. Volume 2: Appendices

    NASA Technical Reports Server (NTRS)

    Niiya, Karen E.; Walker, Richard E.; Pieper, Jerry L.; Nguyen, Thong V.

    1993-01-01

    This final report includes a discussion of the work accomplished during the period from Dec. 1988 through Nov. 1991. The objective of the program was to assemble existing performance and combustion stability models into a usable design methodology capable of designing and analyzing high-performance and stable LOX/hydrocarbon booster engines. The methodology was then used to design a validation engine. The capabilities and validity of the methodology were demonstrated using this engine in an extensive hot fire test program. The engine used LOX/RP-1 propellants and was tested over a range of mixture ratios, chamber pressures, and acoustic damping device configurations. This volume contains time domain and frequency domain stability plots which indicate the pressure perturbation amplitudes and frequencies from approximately 30 tests of a 50K thrust rocket engine using LOX/RP-1 propellants over a range of chamber pressures from 240 to 1750 psia with mixture ratios of from 1.2 to 7.5. The data is from test configurations which used both bitune and monotune acoustic cavities and from tests with no acoustic cavities. The engine had a length of 14 inches and a contraction ratio of 2.0 using a 7.68 inch diameter injector. The data was taken from both stable and unstable tests. All combustion instabilities were spontaneous in the first tangential mode. Although stability bombs were used and generated overpressures of approximately 20 percent, no tests were driven unstable by the bombs. The stability instrumentation included six high-frequency Kistler transducers in the combustion chamber, a high-frequency Kistler transducer in each propellant manifold, and tri-axial accelerometers. Performance data is presented, both characteristic velocity efficiencies and energy release efficiencies, for those tests of sufficient duration to record steady state values.
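
    For reference, the characteristic-velocity efficiency reported with the performance data is conventionally defined as follows (standard definitions assumed here, not quoted from the report):

        c^{*} = \frac{p_c \, A_t}{\dot{m}}, \qquad \eta_{c^{*}} = \frac{c^{*}_{\text{delivered}}}{c^{*}_{\text{theoretical}}}

    where p_c is the chamber stagnation pressure, A_t the throat area, and \dot{m} the total propellant mass flow rate.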

  17. Proposed Methodology for Design of Carbon Fiber Reinforced Polymer Spike Anchors into Reinforced Concrete

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacFarlane, Eric Robert

    The included methodology, calculations, and drawings support design of Carbon Fiber Reinforced Polymer (CFRP) spike anchors for securing U-wrap CFRP onto reinforced concrete T-beams. This content pertains to an installation in one of Los Alamos National Laboratory’s facilities. The anchors are part of a seismic rehabilitation of the subject facility. The information contained here is for information purposes only. The reader is encouraged to verify all equations, details, and methodology prior to usage in future projects. However, development of the content contained here complied with Los Alamos National Laboratory’s NQA-1 quality assurance program for nuclear structures. Furthermore, the formulations and details came from the referenced published literature. This literature represents the current state of the art for FRP anchor design. Construction personnel tested the subject anchor design to the required demand level demonstrated in the calculation. The testing demonstrated the ability of the anchors noted to carry loads in excess of 15 kips in direct tension. The anchors were not tested to failure in part because of the hazards associated with testing large-capacity tensile systems to failure. The calculation, methodology, and drawing originator was Eric MacFarlane of Los Alamos National Laboratory’s (LANL) Office of Seismic Hazards and Risk Mitigation (OSHRM). The checker for all components was Mike Salmon of the LANL OSHRM. The independent reviewers of all components were Insung Kim and Loring Wyllie of Degenkolb Engineers. Note that Insung Kim contributed to the initial formulations in the calculations that pertained directly to his Doctoral research.

  18. Review of Recent Methodological Developments in Group-Randomized Trials: Part 1—Design

    PubMed Central

    Li, Fan; Gallis, John A.; Prague, Melanie; Murray, David M.

    2017-01-01

    In 2004, Murray et al. reviewed methodological developments in the design and analysis of group-randomized trials (GRTs). We have highlighted the developments of the past 13 years in design with a companion article to focus on developments in analysis. As a pair, these articles update the 2004 review. We have discussed developments in the topics of the earlier review (e.g., clustering, matching, and individually randomized group-treatment trials) and in new topics, including constrained randomization and a range of randomized designs that are alternatives to the standard parallel-arm GRT. These include the stepped-wedge GRT, the pseudocluster randomized trial, and the network-randomized GRT, which, like the parallel-arm GRT, require clustering to be accounted for in both their design and analysis. PMID:28426295

  19. Improved FTA methodology and application to subsea pipeline reliability design.

    PubMed

    Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan

    2014-01-01

    An innovative logic tree, Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different thinking approach for risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, the rather subjective nature of FTA node discovery is significantly reduced and the resulting mathematical calculations for quantitative analysis are greatly simplified. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. Resulting improvements are summarized in comparison table form.

  20. Application of Adjoint Methodology to Supersonic Aircraft Design Using Reversed Equivalent Areas

    NASA Technical Reports Server (NTRS)

    Rallabhandi, Sriram K.

    2013-01-01

    This paper presents an approach to shape an aircraft to equivalent area based objectives using the discrete adjoint approach. Equivalent areas can be obtained either using reversed augmented Burgers equation or direct conversion of off-body pressures into equivalent area. Formal coupling with CFD allows computation of sensitivities of equivalent area objectives with respect to aircraft shape parameters. The exactness of the adjoint sensitivities is verified against derivatives obtained using the complex step approach. This methodology has the benefit of using designer-friendly equivalent areas in the shape design of low-boom aircraft. Shape optimization results with equivalent area cost functionals are discussed and further refined using ground loudness based objectives.
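
    The complex-step check mentioned above can be illustrated with a minimal sketch (Python); the cost functional here is a simple stand-in, since the paper's equivalent-area objectives require the coupled CFD and propagation codes.

        # Complex-step derivative check against an adjoint-computed gradient.
        import numpy as np

        def J(x):
            # Stand-in cost functional; the real objective is an equivalent-area metric
            return np.sin(x[0]) * x[1]**2

        def complex_step_grad(J, x, h=1e-30):
            # d/dx_i J ~= Im[J(x + i*h*e_i)] / h, exact to machine precision (no subtraction error)
            g = np.zeros(len(x))
            for i in range(len(x)):
                xc = np.array(x, dtype=complex)
                xc[i] += 1j * h
                g[i] = J(xc).imag / h
            return g

        x0 = np.array([0.3, 2.0])
        print(complex_step_grad(J, x0))   # compare these values against the adjoint gradient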

  1. A Methodology for Instructional Design in Mathematics--With the Generic and Epistemic Student at the Centre

    ERIC Educational Resources Information Center

    Strømskag, Heidi

    2017-01-01

    This theoretical paper presents a methodology for instructional design in mathematics. It is a theoretical analysis of a proposed model for instructional design, where tasks are embedded in situations that preserve meaning with respect to particular pieces of mathematical knowledge. The model is applicable when there is an intention of teaching…

  2. Methodologic considerations in the design and analysis of nested case-control studies: association between cytokines and postoperative delirium.

    PubMed

    Ngo, Long H; Inouye, Sharon K; Jones, Richard N; Travison, Thomas G; Libermann, Towia A; Dillon, Simon T; Kuchel, George A; Vasunilashorn, Sarinnapha M; Alsop, David C; Marcantonio, Edward R

    2017-06-06

    similar. We found minimal evidence for overmatching. Using a matched NCC approach introduces methodological challenges into the study design and data analysis. Nonetheless, with careful selection of the match algorithm, match factors, and analysis methods, this design is cost effective and, for our study, yields estimates that are similar to those from a prospective cohort study design.

  3. Journal impact factor and methodological quality of surgical randomized controlled trials: an empirical study.

    PubMed

    Ahmed Ali, Usama; Reiber, Beata M M; Ten Hove, Joren R; van der Sluis, Pieter C; Gooszen, Hein G; Boermeester, Marja A; Besselink, Marc G

    2017-11-01

    The journal impact factor (IF) is often used as a surrogate marker for methodological quality. The objective of this study is to evaluate the relation between the journal IF and methodological quality of surgical randomized controlled trials (RCTs). Surgical RCTs published in PubMed in 1999 and 2009 were identified. According to IF, RCTs were divided into groups of low (<2), median (2-3) and high IF (>3), as well as into top-10 vs all other journals. Methodological quality characteristics and factors concerning funding, ethical approval and statistical significance of outcomes were extracted and compared between the IF groups. Additionally, a multivariate regression was performed. The median IF was 2.2 (IQR 2.37). The percentage of 'low-risk of bias' RCTs was 13% for top-10 journals vs 4% for other journals in 1999 (P < 0.02), and 30 vs 12% in 2009 (P < 0.02). Similar results were observed for high vs low IF groups. The presence of sample-size calculation, adequate generation of allocation and intention-to-treat analysis were independently associated with publication in higher IF journals; as were multicentre trials and multiple authors. Publication of RCTs in high IF journals is associated with moderate improvement in methodological quality compared to RCTs published in lower IF journals. RCTs with adequate sample-size calculation, generation of allocation or intention-to-treat analysis were associated with publication in a high IF journal. On the other hand, reporting a statistically significant outcome and being industry funded were not independently associated with publication in a higher IF journal.

  4. A Proposed Theory Seeded Methodology for Design Based Research into Effective Use of MUVES in Vocational Education Contexts

    ERIC Educational Resources Information Center

    Cochrane, Todd; Davis, Niki; Morrow, Donna

    2013-01-01

    A methodology for design based research (DBR) into effective development and use of Multi-User Virtual Environments (MUVE) in vocational education is proposed. It blends software development with DBR with two theories selected to inform the methodology. Legitimate peripheral participation LPP (Lave & Wenger, 1991) provides a filter when…

  5. Development of an Expert Judgement Elicitation and Calibration Methodology for Risk Analysis in Conceptual Vehicle Design

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina

    2004-01-01

    A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to obtain expert judgment opinion for quantifying uncertainties as a probability distribution so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates, and provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results is presented. A discussion of possible future steps in this research area is given.
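
    As a simplified sketch of the aggregation step (Python), a weighted linear opinion pool combines expert-supplied distributions into a single pooled distribution; the normal distributions and calibration weights below are illustrative assumptions and do not reproduce the report's ten-phase procedure.

        # Weighted linear opinion pool over expert-supplied distributions (Monte Carlo).
        import numpy as np

        rng = np.random.default_rng(0)
        # (mean, standard deviation, calibration weight) per expert -- illustrative values
        experts = [(1200.0, 150.0, 0.5), (1350.0, 200.0, 0.3), (1100.0, 100.0, 0.2)]

        means = np.array([m for m, _, _ in experts])
        sds = np.array([s for _, s, _ in experts])
        w = np.array([c for _, _, c in experts])
        w = w / w.sum()

        n = 100_000
        pick = rng.choice(len(experts), size=n, p=w)       # sample an expert per draw
        samples = rng.normal(means[pick], sds[pick])       # mixture = linear opinion pool
        print("pooled mean:", samples.mean(),
              "5th-95th percentiles:", np.percentile(samples, [5, 95]))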

  6. Factors impeding flexible inpatient unit design.

    PubMed

    Pati, Debajyoti; Evans, Jennie; Harvey, Thomas E; Bazuin, Doug

    2012-01-01

    To identify and examine factors extraneous to the design decision-making process that could impede the optimization of flexibility on inpatient units. A 2006 empirical study to identify domains of design decisions that affect flexibility on inpatient units found some indication in the context of the acuity-adaptable operational model that factors extraneous to the design process could have negatively influenced the successful implementation of the model. This raised questions regarding extraneous factors that might influence the successful optimization of flexibility. An exploratory, qualitative method was adopted to examine the question. Stakeholders from five recently built acute care inpatient units participated in the study, which involved three types of data collection: (1) verbal protocol data from a gaming session; (2) in-depth semi-structured interviews; and (3) shadowing frontline personnel. Data collection was conducted between June 2009 and November 2010. The study revealed at least nine factors extraneous to the design process that have the potential to hinder the optimization of flexibility in four domains: (1) systemic; (2) cultural; (3) human; and (4) financial. Flexibility is critical to hospital operations in the new healthcare climate, where cost reduction constitutes a vital target. From this perspective, flexibility and efficiency strategies can be influenced by (1) return on investment, (2) communication, (3) culture change, and (4) problem definition. Extraneous factors identified in this study could also affect flexibility in other care settings; therefore, these findings may be viewed from the overall context of hospital design.

  7. A methodology towards virtualisation-based high performance simulation platform supporting multidisciplinary design of complex products

    NASA Astrophysics Data System (ADS)

    Ren, Lei; Zhang, Lin; Tao, Fei; (Luke) Zhang, Xiaolong; Luo, Yongliang; Zhang, Yabin

    2012-08-01

    Multidisciplinary design of complex products leads to an increasing demand for high performance simulation (HPS) platforms. One great challenge is how to achieve highly efficient utilisation of large-scale simulation resources in distributed and heterogeneous environments. This article reports a virtualisation-based methodology to realise an HPS platform. This research is driven by issues concerning large-scale simulation resource deployment and complex simulation environment construction, efficient and transparent utilisation of fine-grained simulation resources, and highly reliable simulation with fault tolerance. A framework of a virtualisation-based simulation platform (VSIM) is first proposed. The article then investigates and discusses key approaches in VSIM, including simulation resource modelling, a method for automatically deploying simulation resources for dynamic construction of the system environment, and a live migration mechanism in case of faults in run-time simulation. Furthermore, the proposed methodology is applied to a multidisciplinary design system for aircraft virtual prototyping and some experiments are conducted. The experimental results show that the proposed methodology can (1) significantly improve the utilisation of fine-grained simulation resources, (2) result in a great reduction in deployment time and increased flexibility for simulation environment construction, and (3) achieve fault-tolerant simulation.

  8. Human factors in spacecraft design

    NASA Technical Reports Server (NTRS)

    Harrison, Albert A.; Connors, Mary M.

    1990-01-01

    This paper describes some of the salient implications of evolving mission parameters for spacecraft design. Among the requirements for future spacecraft are new, higher standards of living, increased support of human productivity, and greater accommodation of physical and cultural variability. Design issues include volumetric allowances, architecture and layouts, closed life support systems, health maintenance systems, recreational facilities, automation, privacy, and decor. An understanding of behavioral responses to design elements is a precondition for critical design decisions. Human factors research results must be taken into account early in the course of the design process.

  9. Low-Radiation Cellular Inductive Powering of Rodent Wireless Brain Interfaces: Methodology and Design Guide.

    PubMed

    Soltani, Nima; Aliroteh, Miaad S; Salam, M Tariqus; Perez Velazquez, Jose Luis; Genov, Roman

    2016-08-01

    This paper presents a general methodology of inductive power delivery in wireless chronic rodent electrophysiology applications. The focus is on such systems design considerations under the following key constraints: maximum power delivery under the allowable specific absorption rate (SAR), low cost and spatial scalability. The methodology includes inductive coil design considerations within a low-frequency ferrite-core-free power transfer link which includes a scalable coil-array power transmitter floor and a single-coil implanted or worn power receiver. A specific design example is presented that includes the concept of low-SAR cellular single-transmitter-coil powering through dynamic tracking of a magnet-less receiver spatial location. The transmitter coil instantaneous supply current is monitored using a small number of low-cost electronic components. A drop in its value indicates the proximity of the receiver due to the reflected impedance of the latter. Only the transmitter coil nearest to the receiver is activated. Operating at the low frequency of 1.5 MHz, the inductive powering floor delivers a maximum of 15.9 W below the IEEE C95 SAR limit, which is over three times greater than that in other recently reported designs. The power transfer efficiency of 39% and 13% at the nominal and maximum distances of 8 cm and 11 cm, respectively, is maintained.
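
    A toy sketch of the coil-selection logic follows (Python): the transmitter coil whose supply current drops the most relative to its no-load baseline is taken to be nearest the receiver and is the only one activated; the readings and threshold are invented for illustration and are not the paper's measured values.

        # Select the transmitter coil nearest the receiver from reflected-impedance current drops.
        baseline_mA = {"coil_1": 120.0, "coil_2": 118.0, "coil_3": 121.0, "coil_4": 119.5}
        measured_mA = {"coil_1": 119.8, "coil_2": 111.2, "coil_3": 120.7, "coil_4": 118.9}

        DROP_THRESHOLD_mA = 2.0   # assumed detection threshold

        drops = {c: baseline_mA[c] - measured_mA[c] for c in baseline_mA}
        nearest = max(drops, key=drops.get)
        active = nearest if drops[nearest] > DROP_THRESHOLD_mA else None
        print("activate:", active, "current drops (mA):", drops)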

  10. Fast underdetermined BSS architecture design methodology for real time applications.

    PubMed

    Mopuri, Suresh; Reddy, P Sreenivasa; Acharyya, Amit; Naik, Ganesh R

    2015-01-01

    In this paper, we propose a high-speed architecture design methodology for the Under-determined Blind Source Separation (UBSS) algorithm using our recently proposed high-speed Discrete Hilbert Transform (DHT), targeting real-time applications. In the UBSS algorithm, unlike typical BSS, the number of sensors is less than the number of sources, which is of more interest in real-time applications. The DHT architecture has been implemented based on a sub-matrix multiplication method to compute the M-point DHT, which uses the N-point architecture recursively, where M is an integer multiple of N. The DHT architecture and the state-of-the-art architecture are coded in VHDL for a 16-bit word length, and ASIC implementation is carried out using UMC 90-nm technology at VDD = 1 V and a 1 MHz clock frequency. The implementation and experimental comparison results show that the proposed DHT design is two times faster than the state-of-the-art architecture.
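
    As a software reference for the transform being accelerated (Python), the discrete Hilbert transform can be computed with the standard analytic-signal FFT method; the sub-matrix multiplication partitioning used to build the M-point hardware from N-point blocks is not shown here.

        # FFT-based reference model of the discrete Hilbert transform (analytic-signal method).
        import numpy as np

        def analytic_signal(x):
            N = len(x)
            X = np.fft.fft(x)
            h = np.zeros(N)
            h[0] = 1.0
            if N % 2 == 0:
                h[N // 2] = 1.0
                h[1:N // 2] = 2.0
            else:
                h[1:(N + 1) // 2] = 2.0
            # Real part reproduces x; imaginary part is the Hilbert transform of x
            return np.fft.ifft(X * h)

        x = np.cos(2 * np.pi * 0.05 * np.arange(64))
        print(np.round(analytic_signal(x).imag[:8], 3))   # approximately sin at the same frequency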

  11. Software Requirements Engineering Methodology (Development)

    DTIC Science & Technology

    1979-06-01

    Higher Order Software [20]; and the Michael Jackson Design Methodology [21]. Although structured programming constructs have proven to be more useful...reviewed here. Similarly, the manual techniques for software design (e.g., HIPO Diagrams, Nassi-Schneidermann charts, Top-Down Design, the Michael ... Jackson Design Methodology, Yourdon’s Structured Design) are not addressed. 6.1.3 Research Programs There are a number of research programs underway

  12. Electronic automation of LRFD design programs.

    DOT National Transportation Integrated Search

    2010-03-01

    The study provided electronic programs to WisDOT for designing prestressed girders and piers using the Load and Resistance Factor Design (LRFD) methodology. The software provided is intended to ease the transition to LRFD for WisDOT design engineers...
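
    For readers unfamiliar with LRFD, the generic checking format behind such programs is (standard AASHTO-style notation, assumed here rather than taken from the WisDOT software):

        \sum_{i} \eta_i \, \gamma_i \, Q_i \;\le\; \phi \, R_n

    where Q_i are the load effects, \gamma_i the load factors, \eta_i the load modifiers, R_n the nominal resistance, and \phi the resistance factor.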

  13. Modeling and Design Analysis Methodology for Tailoring of Aircraft Structures with Composites

    NASA Technical Reports Server (NTRS)

    Rehfield, Lawrence W.

    2004-01-01

    Composite materials provide design flexibility in that fiber placement and orientation can be specified and a variety of material forms and manufacturing processes are available. It is possible, therefore, to 'tailor' the structure to a high degree in order to meet specific design requirements in an optimum manner. Common industrial practices, however, have limited the choices designers make. One of the reasons for this is that there is a dearth of conceptual/preliminary design analysis tools specifically devoted to identifying structural concepts for composite airframe structures. Large-scale finite element simulations are not suitable for such purposes. The present project has been devoted to creating modeling and design analysis methodology for use in the tailoring process of aircraft structures. Emphasis has been given to creating bend-twist elastic coupling in high-aspect-ratio wings or other lifting surfaces. The direction of our work was in concert with the overall NASA effort Twenty-First Century Aircraft Technology (TCAT). A multi-disciplinary team was assembled by Dr. Damodar Ambur to work on wing technology, which included our project.

  14. A user-centred methodology for designing an online social network to motivate health behaviour change.

    PubMed

    Kamal, Noreen; Fels, Sidney

    2013-01-01

    Positive health behaviour is critical to preventing illness and managing chronic conditions. A user-centred methodology was employed to design an online social network to motivate health behaviour change. The methodology was augmented by utilizing the Appeal, Belonging, Commitment (ABC) Framework, which is based on theoretical models for health behaviour change and use of online social networks. The user-centred methodology included four phases: 1) an initial user inquiry on health behaviour and use of online social networks; 2) interview feedback on paper prototypes; 3) a laboratory study on a medium-fidelity prototype; and 4) a field study on the high-fidelity prototype. The points of inquiry throughout these phases were based on the ABC Framework. This yielded an online social network system that linked to external third-party databases and was deployed to users via an interactive website.

  15. Improved FTA Methodology and Application to Subsea Pipeline Reliability Design

    PubMed Central

    Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan

    2014-01-01

    An innovative logic tree, Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different thinking approach for risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, the rather subjective nature of FTA node discovery is significantly reduced and the resulting mathematical calculations for quantitative analysis are greatly simplified. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. Resulting improvements are summarized in comparison table form. PMID:24667681

  16. Evaluation of relative response factor methodology for demonstrating attainment of ozone in Houston, Texas.

    PubMed

    Vizuete, William; Biton, Leiran; Jeffries, Harvey E; Couzo, Evan

    2010-07-01

    In 2007, the U.S. Environmental Protection Agency (EPA) released guidance on demonstrating attainment of the federal ozone (O3) standard. This guidance recommended that air quality model (AQM) predictions be used in a relative rather than an absolute manner. This is accomplished by using the ratio, rather than the absolute difference, of AQM O3 predictions between a historical year and an attainment year. This ratio of O3 concentrations, labeled the relative response factor (RRF), is multiplied by an average of observed concentrations at every monitor. This analysis investigated whether the methodology used to calculate RRFs severs the source-receptor relationship for a given monitor. Model predictions were generated with a regulatory AQM system used to support the 2004 Houston-Galveston-Brazoria State Implementation Plan. Following the procedures in the EPA guidance, an attainment demonstration was completed using regulatory AQM predictions and measurements from the Houston ground-monitoring network. Results show that the model predictions used for the RRF calculation were often based on model conditions that were geographically remote from observations and counter to wind flow. Many of the monitors used the same model predictions for an RRF, even if that O3 plume did not impact them. The RRF methodology thus severed the true source-receptor relationship for a monitor. This analysis also showed that model performance could influence RRF values, and values at monitoring sites appear to be sensitive to model bias. Results indicate an inverse linear correlation of RRFs with model bias at each monitor (R2 = 0.47), resulting in a change in future O3 design values of up to 5 parts per billion (ppb). These results suggest that the application of the RRF methodology in Houston, TX, should be changed from using all model predictions above 85 ppb to a method that removes any predictions that are not relevant to the observed source-receptor relationship.
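
    A minimal sketch of the RRF arithmetic under discussion (Python); the daily maxima, the monitor design value, and the 85 ppb screening level are illustrative stand-ins for the guidance procedure, not data from the Houston study.

        # Future design value = RRF x base design value, with RRF as the ratio of mean
        # modeled future to mean modeled base ozone near a monitor on screened days.
        import numpy as np

        base_model = np.array([92.0, 88.0, 101.0, 79.0, 95.0])    # modeled base-year daily maxima (ppb)
        future_model = np.array([85.0, 83.0, 93.0, 75.0, 88.0])   # modeled future-year daily maxima (ppb)
        observed_dv = 96.0                                         # monitor's base design value (ppb)

        mask = base_model >= 85.0                                  # screening level from the abstract
        rrf = future_model[mask].mean() / base_model[mask].mean()
        print("RRF =", round(rrf, 3), "future DV =", round(rrf * observed_dv, 1), "ppb")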

  17. Design Validation Methodology Development for an Aircraft Sensor Deployment System

    NASA Astrophysics Data System (ADS)

    Wowczuk, Zenovy S.

    The OCULUS 1.0 sensor deployment concept design, developed in 2004 at West Virginia University (WVU), outlined the general concept of a deployment system to be used on a C-130 aircraft. As a sequel, a new system, OCULUS 1.1, has been developed and designed. The new system transfers the concept design to a safety-of-flight design and has been enhanced to a pre-production system to be used as the test bed for gaining full military certification approval. The OCULUS 1.1 system implements a standard deployment procedure to go along with a design suited for military certification and implementation. The design process included analysis of the system's critical components and the generation of a critical-component holistic model to be used as an analysis tool for future payload modifications made to the system. Following the completion of the OCULUS 1.1 design, preparations and procedures for obtaining military airworthiness certification are described. The airworthiness process includes working with the agency overseeing all modifications to the normal operating procedures of military C-130 aircraft and preparing the system for an experimental flight test. The critical steps in this process include developing a complete documentation package that details the analysis performed on the OCULUS 1.1 system and the design-of-experiment flight test plan used to analyze the system. Following approval of the documentation and design of experiment, an experimental flight test of the OCULUS 1.1 system was performed to verify the safety and airworthiness of the system. This test successfully demonstrated that the OCULUS 1.1 system design was airworthy and approved for military use. The OCULUS 1.1 deployment system offers an open-architecture design that is ideal for use as a sensor testing platform for developmental airborne sensors. The system's patented deployment methodology presents a simple approach to reaching the system's final operating position, which

  18. 24 CFR 598.305 - Designation factors.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... (Continued) OFFICE OF ASSISTANT SECRETARY FOR COMMUNITY PLANNING AND DEVELOPMENT, DEPARTMENT OF HOUSING AND... Designation Process § 598.305 Designation factors. In choosing among nominated urban areas eligible for designation, the Secretary will consider: (a) Quality of strategic plan. The quality of the strategic plan...

  19. Applying Central Composite Design and Response Surface Methodology to Optimize Growth and Biomass Production of Haemophilus influenzae Type b.

    PubMed

    Momen, Seyed Bahman; Siadat, Seyed Davar; Akbari, Neda; Ranjbar, Bijan; Khajeh, Khosro

    2016-06-01

    Haemophilus influenzae type b (Hib) is the leading cause of bacterial meningitis, otitis media, pneumonia, cellulitis, bacteremia, and septic arthritis in infants and young children. The Hib capsule contains the major virulence factor and is composed of polyribosyl ribitol phosphate (PRP), which can induce an immune response. Vaccines consisting of Hib capsular polysaccharide (PRP) conjugated to a carrier protein are effective in preventing these infections. However, due to the costly processes involved in PRP production, these vaccines are too expensive. To enhance biomass, in this research we focused on optimizing Hib growth with respect to physical factors such as pH, temperature, and agitation by using a response surface methodology (RSM). We employed a central composite design (CCD) and a response surface methodology to determine the optimum cultivation conditions for growth and biomass production of H. influenzae type b. The treatment factors investigated were initial pH, agitation, and temperature, using shaking flasks. After Hib cultivation and determination of dry biomass, analysis of experimental data was performed by the RSM-CCD. The model showed that temperature and pH had an interactive effect on Hib biomass production. The dry biomass produced in shaking flasks was about 5470 mg/L, which was obtained at an initial pH of 8.5, 250 rpm, and 35 °C. We found CCD and RSM very effective in optimizing Hib culture conditions, and Hib biomass production was greatly influenced by pH and incubation temperature. Therefore, optimization of the growth factors to maximize Hib production can lead to 1) an increase in bacterial biomass and PRP production, 2) lower vaccine prices, 3) vaccination of more susceptible populations, and 4) lower risk of Hib infections.
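
    A minimal sketch of the CCD-plus-quadratic-RSM workflow described above follows; the design points are a face-centred CCD in coded units, and the biomass response is synthetic (an assumed surface plus noise), not the study's data.

      import numpy as np
      from itertools import product

      # Face-centred central composite design in coded units for pH, agitation, temperature.
      corners = np.array(list(product([-1, 1], repeat=3)), dtype=float)
      axial = np.vstack([v * np.eye(3)[i] for i in range(3) for v in (-1.0, 1.0)])
      center = np.zeros((3, 3))
      X = np.vstack([corners, axial, center])

      # Hypothetical dry-biomass response (mg/L) with curvature and a pH-temperature interaction.
      rng = np.random.default_rng(1)
      ph, agit, temp = X.T
      y = (5000 + 300*ph + 150*agit - 200*temp - 250*ph**2 - 180*temp**2
           + 120*ph*temp + rng.normal(0, 50, len(X)))

      # Full quadratic model matrix: intercept, linear, interaction and squared terms.
      def quad_terms(x):
          a, b, c = x
          return [1, a, b, c, a*b, a*c, b*c, a*a, b*b, c*c]

      M = np.array([quad_terms(x) for x in X])
      coef, *_ = np.linalg.lstsq(M, y, rcond=None)

      # Locate the predicted optimum on a coarse grid of coded levels.
      grid = np.array(list(product(np.linspace(-1, 1, 21), repeat=3)))
      pred = np.array([quad_terms(g) for g in grid]) @ coef
      best = grid[pred.argmax()]
      print("Predicted optimum (coded pH, agitation, temperature):", best.round(2))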

  20. Methodology for the Preliminary Design of High Performance Schools in Hot and Humid Climates

    ERIC Educational Resources Information Center

    Im, Piljae

    2009-01-01

    A methodology to develop an easy-to-use toolkit for the preliminary design of high performance schools in hot and humid climates was presented. The toolkit proposed in this research will allow decision makers without simulation knowledge easily to evaluate accurately energy efficient measures for K-5 schools, which would contribute to the…

  1. Pushover Analysis Methodologies: A Tool For Limited Damage Based Design Of Structure For Seismic Vibration

    NASA Astrophysics Data System (ADS)

    Dutta, Sekhar Chandra; Chakroborty, Suvonkar; Raychaudhuri, Anusrita

    Vibration transmitted to a structure during an earthquake may vary in magnitude over a wide range. Design methodology should, therefore, enumerate steps so that structures are able to survive even severe ground motion. However, for economic reasons, strength can be provided to structures in such a way that the structure remains in the elastic range in low-to-moderate earthquakes and is allowed to undergo inelastic deformation, without collapse, in a severe earthquake. To implement this design philosophy, a rigorous nonlinear dynamic analysis needs to be performed to estimate the inelastic demands. However, such analysis is time consuming and requires expertise to judge the results it produces. In this context, the present paper discusses and demonstrates an alternative simple method, known as the pushover method, which can be easily used by practicing engineers, bypasses intricate nonlinear dynamic analysis, and can be thought of as a substitute for the latter. This method is still being developed and is becoming increasingly popular for its simplicity. The objective of this paper is to emphasize and demonstrate the basic concept, strength, and ease of this state-of-the-art methodology for regular use in design offices in performance-based seismic design of structures.

  2. Motivation for orthodontic treatment investigated with Q-methodology: patients' and parents' perspectives.

    PubMed

    Prabakaran, Rema; Seymour, Shiri; Moles, David R; Cunningham, Susan J

    2012-08-01

    Motivation and cooperation are vital components of orthodontic treatment if a good outcome is to be achieved. In this study, we used Q-methodology to investigate motivating factors among adolescents seeking orthodontic treatment and parents wanting their children to undergo orthodontic treatment. This technique asks participants to rank a series of statements, and the analysis of this ranking then provides insight into the participants' opinions. Each of these complementary studies was divided into 2 phases: interviews to generate a list of reasons for seeking orthodontic treatment and the use of Q-methodology to assess and categorize the relative importance of these reasons for the groups of participants. In the patient study, 32 items were generated from the interviews and placed in order of importance on a Q-methodology grid by 60 patients who were about to commence orthodontic treatment. The rankings were subjected to factor analysis, which categorized the patients' views into groups of shared opinions. The same methodology was used with the parent group, and a Q-methodology grid was designed to accommodate 35 items that were then ranked by the 60 parents. The rankings were subjected to factor analysis as for the patient group. For the patients, factor analysis identified 3 factors, all of which included esthetics as important. The remaining respondents had more individual viewpoints and did not map to any of the 3 factors. For the parents, factor analysis identified 4 factors, all of which included treatment in adolescence to prevent future problems as important. This study showed that Q-methodology is a novel and efficient tool that can be used in dental research with few difficulties. It might prove useful for aspects of care in which subjective views or opinions play an important role. Copyright © 2012 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.

  3. Enhancing the Benefit of the Chemical Mixture Methodology: A Report on Methodology Testing and Potential Approaches for Improving Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Xiao-Ying; Yao, Juan; He, Hua

    2012-01-01

    Extensive testing shows that the current version of the Chemical Mixture Methodology (CMM) is meeting its intended mission to provide conservative estimates of the health effects from exposure to airborne chemical mixtures. However, the current version of the CMM could benefit from several enhancements that are designed to improve its application of Health Code Numbers (HCNs) and employ weighting factors to reduce over-conservatism.

  4. A low-power photovoltaic system with energy storage for radio communications: Description and design methodology

    NASA Technical Reports Server (NTRS)

    Chapman, C. P.; Chapman, P. D.; Lewison, A. H.

    1982-01-01

    A low power photovoltaic system was constructed with approximately 500 amp hours of battery energy storage to provide power to an emergency amateur radio communications center. The system can power the communications center for about 72 hours of continuous no-sun operation. Complete construction details and a design methodology algorithm are given with abundant engineering data and adequate theory to allow similar systems to be constructed, scaled up or down, with minimum design effort.
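
    A minimal sizing sketch in the spirit of the design algorithm described above; the load, insolation, and derating numbers are assumptions for illustration, not values from the report.

      # Minimal PV-plus-battery sizing sketch with illustrative (assumed) numbers.

      load_power_w      = 60.0        # average load of the radio station (W), assumed
      autonomy_hours    = 72.0        # required no-sun operating time (h)
      system_voltage_v  = 12.0        # nominal battery bus voltage (V)
      max_dod           = 0.8         # allowed depth of discharge of the battery bank
      peak_sun_hours    = 5.0         # site insolation (h/day), assumed
      derate            = 0.7         # combined wiring/charge-controller/temperature derating

      # Battery bank: energy to ride through the no-sun period, divided by the usable fraction.
      required_ah = load_power_w * autonomy_hours / (system_voltage_v * max_dod)

      # PV array: recharge one day of load in one day of sun, after derating.
      daily_load_wh = load_power_w * 24.0
      array_watts_peak = daily_load_wh / (peak_sun_hours * derate)

      print(f"Battery bank: {required_ah:.0f} Ah at {system_voltage_v:.0f} V")
      print(f"PV array:     {array_watts_peak:.0f} Wp")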

  5. FACTORS INFLUENCING THE DESIGN OF BIOACCUMULATION FACTOR AND BIOTA-SEDIMENT ACCUMULATION FACTOR FIELD STUDIES

    EPA Science Inventory

    A series of modeling simulations were performed to develop an understanding of the underlying factors and principles involved in developing field sampling designs for measuring bioaccumulation factors (BAFs) and biota-sediment accumulation factors (BSAFs). These simulations reveal...

  6. Development of design and analysis methodology for composite bolted joints

    NASA Astrophysics Data System (ADS)

    Grant, Peter; Sawicki, Adam

    1991-05-01

    This paper summarizes work performed to develop composite joint design methodology for use on rotorcraft primary structure, determine joint characteristics which affect joint bearing and bypass strength, and develop analytical methods for predicting the effects of such characteristics in structural joints. Experimental results have shown that bearing-bypass interaction allowables cannot be defined using a single continuous function due to variance of failure modes for different bearing-bypass ratios. Hole wear effects can be significant at moderate stress levels and should be considered in the development of bearing allowables. A computer program has been developed and has successfully predicted bearing-bypass interaction effects for the (0/+/-45/90) family of laminates using filled hole and unnotched test data.

  7. A methodology for the design of experiments in computational intelligence with multiple regression models.

    PubMed

    Fernandez-Lozano, Carlos; Gestal, Marcos; Munteanu, Cristian R; Dorado, Julian; Pazos, Alejandro

    2016-01-01

    The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence, and especially on a correct comparison between the results provided by different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational Intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results are different for three out of five state-of-the-art simple datasets, and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is important to use a statistical approach to indicate whether the differences are statistically significant when using this kind of algorithm. Furthermore, our results with three real complex datasets report different best models than the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as in other fields, such as bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable.
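
    A short Python sketch in the spirit of the comparison methodology described above (it is not the RRegrs package itself): several regressors are scored on the same cross-validation folds and the fold-wise scores are compared with a non-parametric test; the dataset is synthetic and the model choices are illustrative.

      import numpy as np
      from sklearn.datasets import make_regression
      from sklearn.model_selection import KFold, cross_val_score
      from sklearn.linear_model import LinearRegression
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.svm import SVR
      from scipy.stats import friedmanchisquare

      # Hypothetical dataset standing in for the benchmark sets discussed above.
      X, y = make_regression(n_samples=300, n_features=20, noise=10.0, random_state=0)

      models = {
          "linear": LinearRegression(),
          "rf": RandomForestRegressor(n_estimators=200, random_state=0),
          "svr": SVR(C=10.0),
      }

      cv = KFold(n_splits=10, shuffle=True, random_state=0)
      scores = {name: cross_val_score(m, X, y, cv=cv, scoring="neg_root_mean_squared_error")
                for name, m in models.items()}

      for name, s in scores.items():
          print(f"{name:>6}: RMSE = {-s.mean():.2f} +/- {s.std():.2f}")

      # Non-parametric test across the common CV folds, in the spirit of the
      # statistical comparison step advocated above.
      stat, p = friedmanchisquare(*scores.values())
      print(f"Friedman test across folds: p = {p:.4f}")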

  8. A methodology for the design of experiments in computational intelligence with multiple regression models

    PubMed Central

    Gestal, Marcos; Munteanu, Cristian R.; Dorado, Julian; Pazos, Alejandro

    2016-01-01

    The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence, and especially on a correct comparison between the results provided by different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational Intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results are different for three out of five state-of-the-art simple datasets, and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is important to use a statistical approach to indicate whether the differences are statistically significant when using this kind of algorithm. Furthermore, our results with three real complex datasets report different best models than the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as in other fields, such as bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable. PMID:27920952

  9. Design optimization for cost and quality: The robust design approach

    NASA Technical Reports Server (NTRS)

    Unal, Resit

    1990-01-01

    Designing reliable, low cost, and operable space systems has become the key to future space operations. Designing high quality space systems at low cost is an economic and technological challenge to the designer. A systematic and efficient way to meet this challenge is a new method of design optimization for performance, quality, and cost, called Robust Design. Robust Design is an approach for design optimization. It consists of: making system performance insensitive to material and subsystem variation, thus allowing the use of less costly materials and components; making designs less sensitive to the variations in the operating environment, thus improving reliability and reducing operating costs; and using a new structured development process so that engineering time is used most productively. The objective in Robust Design is to select the best combination of controllable design parameters so that the system is most robust to uncontrollable noise factors. The robust design methodology uses a mathematical tool called an orthogonal array, from design of experiments theory, to study a large number of decision variables with a significantly smaller number of experiments. Robust design also uses a statistical measure of performance, called a signal-to-noise ratio, from electrical control theory, to evaluate the level of performance and the effect of noise factors. The purpose is to investigate the Robust Design methodology for improving quality and cost, demonstrate its application by the use of an example, and suggest its use as an integral part of the space system design process.
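
    For reference, the three classical Taguchi signal-to-noise ratios mentioned above can be computed as follows; the replicated measurements are assumed values for illustration.

      import numpy as np

      # Replicated performance measurements for one design-parameter setting (assumed values).
      y = np.array([41.2, 39.8, 40.5, 42.1])

      # Taguchi signal-to-noise ratios (in dB) for the three classical cases.
      sn_larger_is_better  = -10 * np.log10(np.mean(1.0 / y**2))
      sn_smaller_is_better = -10 * np.log10(np.mean(y**2))
      sn_nominal_is_best   =  10 * np.log10(y.mean()**2 / y.var(ddof=1))

      print(sn_larger_is_better, sn_smaller_is_better, sn_nominal_is_best)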

  10. 76 FR 11196 - Antidumping Methodologies in Proceedings Involving Non-Market Economies: Valuing the Factor of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-01

    ... DEPARTMENT OF COMMERCE International Trade Administration Antidumping Methodologies in Proceedings Involving Non-Market Economies: Valuing the Factor of Production: Labor; Correction to Request for Comment AGENCY: Import Administration, International Trade Administration, Department of Commerce DATES: Effective Date: March 1, 2011. FOR FURTHER...

  11. FACTORS INFLUENCING THE DESIGN OF BIOACCUMULATION FACTOR AND BIOTA-SEDIMENT ACCUMULATION FACTOR FIELD STUDIES

    EPA Science Inventory

    General guidance for designing field studies to measure bioaccumulation factors (BAFs) and biota-sediment accumulation factors (BSAFs) is not available. To develop such guidance, a series of modeling simulations were performed to evaluate the underlying factors and principles th...

  12. Optimization of laccase production by Pleurotus ostreatus IMI 395545 using the Taguchi DOE methodology.

    PubMed

    Periasamy, Rathinasamy; Palvannan, Thayumanavan

    2010-12-01

    Production of laccase using a submerged culture of Pleurotus ostreatus IMI 395545 was optimized by the Taguchi orthogonal array (OA) design of experiments (DOE) methodology. This approach facilitates the study of the interactions of a large number of variables spanned by factors and their settings, with a small number of experiments, leading to considerable savings in time and cost for process optimization. This methodology optimizes the number of impact factors and enables calculation of their interactions in the production of industrial enzymes. Eight factors, viz. glucose, yeast extract, malt extract, inoculum, mineral solution, inducer (1 mM CuSO₄) and amino acid (l-asparagine) at three levels and pH at two levels, with an OA layout of L18 (2¹ × 3⁷) were selected for the proposed experimental design. The laccase yield obtained from the 18 sets of fermentation experiments performed with the selected factors and levels was further processed with Qualitek-4 software. The optimized conditions showed an enhanced laccase expression of 86.8% (from 485.0 to 906.3 U). The combination of factors was further validated for laccase production and reactive blue 221 decolorization. The results revealed an enhanced laccase yield of 32.6% and dye decolorization up to 84.6%. This methodology allows the complete evaluation of main and interaction factors. © 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim
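
    The main-effects step of an orthogonal-array analysis like the one above amounts to averaging the response at each level of each factor; the sketch below shows that step for two of the factors with entirely hypothetical levels and yields (it does not reproduce the L18 layout or the Qualitek-4 analysis).

      import numpy as np
      import pandas as pd

      # Hypothetical subset of an OA-style layout: factor levels and laccase yield (U).
      df = pd.DataFrame({
          "glucose": [1, 1, 1, 2, 2, 2, 3, 3, 3],
          "CuSO4":   [1, 2, 3, 1, 2, 3, 1, 2, 3],
          "yield_U": [512, 603, 655, 588, 702, 741, 630, 795, 812],
      })

      # Main effect of each factor = mean response at each of its levels.
      for factor in ["glucose", "CuSO4"]:
          print(factor, df.groupby(factor)["yield_U"].mean().round(1).to_dict())

      # The level combination with the highest main-effect means is taken as the
      # provisional optimum, to be confirmed by a validation experiment.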

  13. Sonic Boom Mitigation Through Aircraft Design and Adjoint Methodology

    NASA Technical Reports Server (NTRS)

    Rallabhandi, Siriam K.; Diskin, Boris; Nielsen, Eric J.

    2012-01-01

    This paper presents a novel approach to the design of the supersonic aircraft outer mold line (OML) by optimizing the A-weighted loudness of the sonic boom signature predicted on the ground. The optimization process uses the sensitivity information obtained by coupling the discrete adjoint formulations for the augmented Burgers equation and the Computational Fluid Dynamics (CFD) equations. This coupled formulation links the loudness of the ground boom signature to the aircraft geometry, thus allowing efficient shape optimization for the purpose of minimizing boom loudness. The accuracy of the adjoint-based sensitivities is verified against sensitivities obtained using an independent complex-variable approach. The adjoint-based optimization methodology is applied to a configuration previously optimized using alternative state-of-the-art optimization methods and produces additional loudness reduction. The results of the optimizations are reported and discussed.
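
    The complex-variable check mentioned above is the complex-step derivative, which avoids the subtractive cancellation of finite differences. A minimal sketch follows; the scalar function f is only a stand-in for the loudness functional, not the paper's model.

      import numpy as np

      # Verify a gradient the way adjoint sensitivities are often checked:
      # the complex-step derivative is accurate to near machine precision.
      def f(x):
          return np.sin(x) * np.exp(-0.5 * x**2)

      def complex_step(f, x, h=1e-30):
          return np.imag(f(x + 1j * h)) / h

      x0 = 0.7
      analytic = (np.cos(x0) - x0 * np.sin(x0)) * np.exp(-0.5 * x0**2)
      print(complex_step(f, x0), analytic)   # the two values agree to ~15 digits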

  14. 40 CFR Table Nn-1 to Subpart Hh of... - Default Factors for Calculation Methodology 1 of This Subpart

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Calculation Methodology 1 of This Subpart
    Fuel            Default high heating value factor    Default CO2 emission factor (kg CO2/MMBtu)
    Natural Gas     1.028 MMBtu/Mscf                     53.02
    Propane         3.822 MMBtu/bbl                      61.46
    Normal butane   4.242 ...

  15. 40 CFR Table Nn-1 to Subpart Hh of... - Default Factors for Calculation Methodology 1 of This Subpart

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Calculation Methodology 1 of This Subpart
    Fuel            Default high heating value factor    Default CO2 emission factor (kg CO2/MMBtu)
    Natural Gas     1.028 MMBtu/Mscf                     53.02
    Propane         3.822 MMBtu/bbl                      61.46
    Normal butane   4.242 ...

  16. 40 CFR Table Nn-1 to Subpart Hh of... - Default Factors for Calculation Methodology 1 of This Subpart

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Calculation Methodology 1 of This Subpart
    Fuel            Default high heating value factor    Default CO2 emission factor (kg CO2/MMBtu)
    Natural Gas     1.028 MMBtu/Mscf                     53.02
    Propane         3.822 MMBtu/bbl                      61.46
    Normal butane   4.242 ...

  17. Supporting Space Systems Design via Systems Dependency Analysis Methodology

    NASA Astrophysics Data System (ADS)

    Guariniello, Cesare

    assess the behavior of each system based on its internal status and on the topology of its dependencies on systems connected to it. Designers and decision makers can therefore quickly analyze and explore the behavior of complex systems and evaluate different architectures under various working conditions. The methods support educated decision making both in the design and in the update process of systems architecture, reducing the need to execute extensive simulations. In particular, in the phase of concept generation and selection, the information given by the methods can be used to identify promising architectures to be further tested and improved, while discarding architectures that do not show the required level of global features. The methods, when used in conjunction with appropriate metrics, also allow for improved reliability and risk analysis, as well as for automatic scheduling and re-scheduling based on the features of the dependencies and on the accepted level of risk. This dissertation illustrates the use of the two methods in sample aerospace applications, both in the operational and in the developmental domain. The applications show how to use the developed methodology to evaluate the impact of failures, assess the criticality of systems, quantify metrics of interest, quantify the impact of delays, support informed decision making when scheduling the development of systems and evaluate the achievement of partial capabilities. A larger, well-framed case study illustrates how the Systems Operational Dependency Analysis method and the Systems Developmental Dependency Analysis method can support analysis and decision making, at the mid and high level, in the design process of architectures for the exploration of Mars. The case study also shows how the methods do not replace the classical systems engineering methodologies, but support and improve them.

  18. phMRI: methodological considerations for mitigating potential confounding factors

    PubMed Central

    Bourke, Julius H.; Wall, Matthew B.

    2015-01-01

    Pharmacological Magnetic Resonance Imaging (phMRI) is a variant of conventional MRI that adds pharmacological manipulations in order to study the effects of drugs, or uses pharmacological probes to investigate basic or applied (e.g., clinical) neuroscience questions. Issues that may confound the interpretation of results from various types of phMRI studies are briefly discussed, and a set of methodological strategies that can mitigate these problems is described. These include strategies that can be employed at every stage of investigation, from study design to interpretation of resulting data, and additional techniques suited for use with clinical populations are also featured. Pharmacological MRI is a challenging area of research that has both significant advantages and formidable difficulties; however, with due consideration and use of these strategies, many of the key obstacles can be overcome. PMID:25999812

  19. Human factors of intelligent computer aided display design

    NASA Technical Reports Server (NTRS)

    Hunt, R. M.

    1985-01-01

    Design concepts for a decision support system being studied at NASA Langley as an aid to visual display unit (VDU) designers are described. Ideally, human factors should be taken into account by VDU designers. In reality, although the human factors database on VDUs is small, such systems must be constantly developed. Human factors are therefore a secondary consideration. An expert system will thus serve mainly in an advisory capacity. Functions can include facilitating the design process by shortening the time to generate and alter drawings, enhancing the capability of breaking design requirements down into simpler functions, and providing visual displays equivalent to the final product. The VDU system could also discriminate, and display the difference, between designer decisions and machine inferences. The system could also aid in analyzing the effects of designer choices on future options and in enunciating when there are data available on a design selection.

  20. New methodology for shaft design based on life expectancy

    NASA Technical Reports Server (NTRS)

    Loewenthal, S. H.

    1986-01-01

    The design of power transmission shafting for reliability has not historically received a great deal of attention. However, weight-sensitive aerospace and vehicle applications, and those where the penalties of shaft failure are great, require greater confidence in shaft design than earlier methods provided. This report summarizes a fatigue-strength-based design method for sizing shafts under variable-amplitude loading histories for limited or nonlimited service life. Moreover, application factors such as press-fitted collars, shaft size, residual stresses from shot peening or plating, and corrosive environments can be readily accommodated within the framework of the analysis. Examples are given which illustrate the use of the method, pointing out the large life penalties due to occasional cyclic overloads.

  1. A methodology for the design and evaluation of user interfaces for interactive information systems. Ph.D. Thesis Final Report, 1 Jul. 1985 - 31 Dec. 1987

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Farooq, Mohammad U.

    1986-01-01

    The definition of proposed research addressing the development and validation of a methodology for the design and evaluation of user interfaces for interactive information systems is given. The major objectives of this research are: the development of a comprehensive, objective, and generalizable methodology for the design and evaluation of user interfaces for information systems; the development of equations and/or analytical models to characterize user behavior and the performance of a designed interface; the design of a prototype system for the development and administration of user interfaces; and the design and use of controlled experiments to support the research and test/validate the proposed methodology. The proposed design methodology views the user interface as a virtual machine composed of three layers: an interactive layer, a dialogue manager layer, and an application interface layer. A command language model of user system interactions is presented because of its inherent simplicity and structured approach based on interaction events. All interaction events have a common structure based on common generic elements necessary for a successful dialogue. It is shown that, using this model, various types of interfaces could be designed and implemented to accommodate various categories of users. The implementation methodology is discussed in terms of how to store and organize the information.

  2. Ultra-Structure database design methodology for managing systems biology data and analyses

    PubMed Central

    Maier, Christopher W; Long, Jeffrey G; Hemminger, Bradley M; Giddings, Morgan C

    2009-01-01

    Background Modern, high-throughput biological experiments generate copious, heterogeneous, interconnected data sets. Research is dynamic, with frequently changing protocols, techniques, instruments, and file formats. Because of these factors, systems designed to manage and integrate modern biological data sets often end up as large, unwieldy databases that become difficult to maintain or evolve. The novel rule-based approach of the Ultra-Structure design methodology presents a potential solution to this problem. By representing both data and processes as formal rules within a database, an Ultra-Structure system constitutes a flexible framework that enables users to explicitly store domain knowledge in both a machine- and human-readable form. End users themselves can change the system's capabilities without programmer intervention, simply by altering database contents; no computer code or schemas need be modified. This provides flexibility in adapting to change, and allows integration of disparate, heterogeneous data sets within a small core set of database tables, facilitating joint analysis and visualization without becoming unwieldy. Here, we examine the application of Ultra-Structure to our ongoing research program for the integration of large proteomic and genomic data sets (proteogenomic mapping). Results We transitioned our proteogenomic mapping information system from a traditional entity-relationship design to one based on Ultra-Structure. Our system integrates tandem mass spectrum data, genomic annotation sets, and spectrum/peptide mappings, all within a small, general framework implemented within a standard relational database system. General software procedures driven by user-modifiable rules can perform tasks such as logical deduction and location-based computations. The system is not tied specifically to proteogenomic research, but is rather designed to accommodate virtually any kind of biological research. Conclusion We find Ultra-Structure offers

  3. Methodology for Determining Limit Torques for Threaded Fasteners

    NASA Technical Reports Server (NTRS)

    Hissam, Andy

    2011-01-01

    In aerospace design, where minimizing weight is always a priority, achieving the full capacity from fasteners is essential. To do so, the initial bolt preload must be maximized. The benefits of high preload are well documented and include improved fatigue resistance, a stiffer joint, and resistance to loosening. But many factors like elastic interactions and embedment tend to lower the initial preload placed on the bolt. These factors provide additional motivation to maximize the initial preload. But, to maximize bolt preload, you must determine what torque to apply. Determining this torque is greatly complicated by the large preload scatter generally seen with torque control. This paper presents a detailed methodology for generating limit torques for threaded fasteners. This methodology accounts for the large scatter in preload found with torque control, and therefore, addresses the statistical nature of the problem. It also addresses prevailing torque, a feature common in aerospace fasteners. Although prevailing torque provides a desired locking feature, it can also increase preload scatter. In addition, it can limit the amount of preload that can be generated due to the torsion it creates in the bolt. This paper discusses the complications of prevailing torque and how best to handle it. A wide range of torque-tension bolt testing was conducted in support of this research. The results from this research will benefit the design engineer as well as the analyst involved in the design of bolted joints, leading to better, more optimized structural designs.
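
    To illustrate why torque control produces preload scatter, the widely used short-form torque-tension relation T = K * D * F (plus any prevailing torque) can be evaluated over a range of nut factors K; the numbers below are illustrative assumptions, not values or limits from the paper.

      # Short-form torque-tension relation with prevailing torque:
      #     T_applied = K * D * F_preload + T_prevailing
      # All numbers below are illustrative assumptions.

      d_nominal_m   = 0.00635    # 1/4-inch fastener diameter (m)
      torque_nm     = 9.0        # applied torque (N*m)
      prevailing_nm = 1.2        # prevailing (locking) torque (N*m)
      k_min, k_max  = 0.15, 0.25 # nut-factor range representing preload scatter

      # Only the torque above the prevailing component generates preload.
      effective_torque = torque_nm - prevailing_nm

      preload_min = effective_torque / (k_max * d_nominal_m)
      preload_max = effective_torque / (k_min * d_nominal_m)

      print(f"Preload range: {preload_min/1e3:.1f} kN to {preload_max/1e3:.1f} kN")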

  4. Integrated active and passive control design methodology for the LaRC CSI evolutionary model

    NASA Technical Reports Server (NTRS)

    Voth, Christopher T.; Richards, Kenneth E., Jr.; Schmitz, Eric; Gehling, Russel N.; Morgenthaler, Daniel R.

    1994-01-01

    A general design methodology to integrate active control with passive damping was demonstrated on the NASA LaRC CSI Evolutionary Model (CEM), a ground testbed for future large, flexible spacecraft. Vibration suppression controllers designed for Line-of-Sight (LOS) minimization were successfully implemented on the CEM. A frequency-shaped H2 methodology was developed, allowing the designer to specify the roll-off of the MIMO compensator. A closed loop bandwidth of 4 Hz, including the six rigid body modes and the first three dominant elastic modes of the CEM was achieved. Good agreement was demonstrated between experimental data and analytical predictions for the closed loop frequency response and random tests. Using the Modal Strain Energy (MSE) method, a passive damping treatment consisting of 60 viscoelastically damped struts was designed, fabricated and implemented on the CEM. Damping levels for the targeted modes were more than an order of magnitude larger than for the undamped structure. Using measured loss and stiffness data for the individual damped struts, analytical predictions of the damping levels were very close to the experimental values in the 1-10 Hz frequency range where the open loop model matched the experimental data. An integrated active/passive controller was successfully implemented on the CEM and was evaluated against an active-only controller. A two-fold increase in the effective control bandwidth and further reductions of 30 percent to 50 percent in the LOS RMS outputs were achieved compared to an active-only controller. Superior performance was also obtained compared to a High-Authority/Low-Authority (HAC/LAC) controller.

  5. Introduction to human factors considerations in system design

    NASA Technical Reports Server (NTRS)

    Chapanis, A.

    1983-01-01

    A definition for human factors or ergonomics and its industrial and domestic application is presented. Human factors engineering, which discovers and applies information about human abilities, limitations, and other characteristics to the design of tools, machines, systems, tasks, jobs, and environments for safe, comfortable, and effective human use, is outlined. The origins of human factors and ergonomics, the philosophy of human factors, goals and objectives, systems development and design, are reviewed.

  6. A Multi-Objective Advanced Design Methodology of Composite Beam-to-Column Joints Subjected to Seismic and Fire Loads

    NASA Astrophysics Data System (ADS)

    Pucinotti, Raffaele; Ferrario, Fabio; Bursi, Oreste S.

    2008-07-01

    A multi-objective advanced design methodology dealing with seismic actions followed by fire on steel-concrete composite full strength joints with concrete filled tubes is proposed in this paper. The specimens were designed in detail in order to exhibit a suitable fire behaviour after a severe earthquake. The major aspects of the cyclic behaviour of composite joints are presented and commented upon. The data obtained from monotonic and cyclic experimental tests have been used to calibrate a model of the joint in order to perform seismic simulations on several moment resisting frames. A hysteretic law was used to take into account the seismic degradation of the joints. Finally, fire tests were conducted with the objective to evaluate fire resistance of the connection already damaged by an earthquake. The experimental activity together with FE simulation demonstrated the adequacy of the advanced design methodology.

  7. Systematic design methodology for robust genetic transistors based on I/O specifications via promoter-RBS libraries.

    PubMed

    Lee, Yi-Ying; Hsu, Chih-Yuan; Lin, Ling-Jiun; Chang, Chih-Chun; Cheng, Hsiao-Chun; Yeh, Tsung-Hsien; Hu, Rei-Hsing; Lin, Che; Xie, Zhen; Chen, Bor-Sen

    2013-10-27

    Synthetic genetic transistors are vital for signal amplification and switching in genetic circuits. However, it is still problematic to efficiently select the adequate promoters, Ribosome Binding Sites (RBSs) and inducer concentrations to construct a genetic transistor with the desired linear amplification or switching in the Input/Output (I/O) characteristics for practical applications. Three kinds of promoter-RBS libraries, i.e., a constitutive promoter-RBS library, a repressor-regulated promoter-RBS library and an activator-regulated promoter-RBS library, are constructed for systematic genetic circuit design using the identified kinetic strengths of their promoter-RBS components. According to the dynamic model of genetic transistors, a design methodology for genetic transistors via a Genetic Algorithm (GA)-based searching algorithm is developed to search for a set of promoter-RBS components and adequate concentrations of inducers to achieve the prescribed I/O characteristics of a genetic transistor. Furthermore, according to design specifications for different types of genetic transistors, a look-up table is built for genetic transistor design, from which we could easily select an adequate set of promoter-RBS components and adequate concentrations of external inducers for a specific genetic transistor. This systematic design method will reduce the time spent using trial-and-error methods in the experimental procedure for a genetic transistor with a desired I/O characteristic. We demonstrate the applicability of our design methodology to genetic transistors that have desirable linear amplification or switching by employing promoter-RBS library searching.
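
    A heavily simplified sketch of a GA-style library search of the kind described above: the promoter-RBS library, the Hill-type I/O model, and the target characteristic are all hypothetical stand-ins, and the "genome" is reduced to a single library index, so this illustrates only the search idea, not the paper's dynamic model or algorithm.

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical promoter-RBS library: each entry is a (promoter strength, RBS strength) pair.
      library = rng.uniform(0.1, 10.0, size=(30, 2))

      inducer = np.logspace(-2, 2, 25)                   # external inducer concentrations
      target  = 50.0 * inducer / (inducer + 1.0)         # prescribed I/O characteristic (assumed)

      def io_response(idx, inducer):
          """Hypothetical Hill-type transfer function of one library member."""
          p, r = library[idx]
          return p * r * inducer / (inducer + 1.0)

      def fitness(idx):
          return -np.mean((io_response(idx, inducer) - target) ** 2)

      # Minimal genetic-algorithm-style search over library indices.
      pop = rng.integers(0, len(library), size=20)
      for _ in range(50):
          scores = np.array([fitness(i) for i in pop])
          parents = pop[np.argsort(scores)[-10:]]                         # selection
          children = rng.choice(parents, size=10)                         # recombination by resampling
          mutate = rng.random(10) < 0.2
          children[mutate] = rng.integers(0, len(library), mutate.sum())  # mutation
          pop = np.concatenate([parents, children])

      best = pop[np.argmax([fitness(i) for i in pop])]
      print("Best library member:", best, "strengths:", library[best].round(2))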

  8. Systematic design methodology for robust genetic transistors based on I/O specifications via promoter-RBS libraries

    PubMed Central

    2013-01-01

    Background Synthetic genetic transistors are vital for signal amplification and switching in genetic circuits. However, it is still problematic to efficiently select the adequate promoters, Ribosome Binding Sites (RBSs) and inducer concentrations to construct a genetic transistor with the desired linear amplification or switching in the Input/Output (I/O) characteristics for practical applications. Results Three kinds of promoter-RBS libraries, i.e., a constitutive promoter-RBS library, a repressor-regulated promoter-RBS library and an activator-regulated promoter-RBS library, are constructed for systematic genetic circuit design using the identified kinetic strengths of their promoter-RBS components. According to the dynamic model of genetic transistors, a design methodology for genetic transistors via a Genetic Algorithm (GA)-based searching algorithm is developed to search for a set of promoter-RBS components and adequate concentrations of inducers to achieve the prescribed I/O characteristics of a genetic transistor. Furthermore, according to design specifications for different types of genetic transistors, a look-up table is built for genetic transistor design, from which we could easily select an adequate set of promoter-RBS components and adequate concentrations of external inducers for a specific genetic transistor. Conclusion This systematic design method will reduce the time spent using trial-and-error methods in the experimental procedure for a genetic transistor with a desired I/O characteristic. We demonstrate the applicability of our design methodology to genetic transistors that have desirable linear amplification or switching by employing promoter-RBS library searching. PMID:24160305

  9. Estimation of design space for an extrusion-spheronization process using response surface methodology and artificial neural network modelling.

    PubMed

    Sovány, Tamás; Tislér, Zsófia; Kristó, Katalin; Kelemen, András; Regdon, Géza

    2016-09-01

    The application of Quality by Design principles is one of the key issues in recent pharmaceutical development. In the past decade a lot of knowledge has been collected about the practical realization of the concept, but there are still many unanswered questions. The key requirement of the concept is the mathematical description of the effect of the critical factors and their interactions on the critical quality attributes (CQAs) of the product. The process design space (PDS) is usually determined by the use of design of experiment (DoE) based response surface methodologies (RSM), but inaccuracies in the applied polynomial models often result in over- or underestimation of the real trends and changes, making the calculations uncertain, especially in the edge regions of the PDS. Complementing RSM with artificial neural network (ANN) based models is therefore a commonly used approach to reduce the uncertainties. Nevertheless, since individual studies tend to focus on a single DoE, there is a lack of comparative studies on different experimental layouts. Therefore, the aim of the present study was to investigate the effect of different DoE layouts (2-level full factorial, central composite, Box-Behnken, 3-level fractional and 3-level full factorial design) on model predictability and to compare model sensitivities according to the organization of the experimental data set. It was revealed that the size of the design space could differ by more than 40% when calculated with different polynomial models, which was associated with a considerable shift in its position when higher-level layouts were applied. The shift was more considerable when the calculation was based on RSM. The model predictability was also better with ANN-based models. Nevertheless, both modelling methods exhibit considerable sensitivity to the organization of the experimental data set, and the use of design layouts in which the extreme values of the factors are more represented is recommended
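
    A compact sketch of the RSM-versus-ANN comparison idea discussed above, using synthetic DoE-style data and off-the-shelf scikit-learn models; the factor surface, sample size, and network size are assumptions, so the numbers are only illustrative of the workflow.

      import numpy as np
      from sklearn.preprocessing import PolynomialFeatures, StandardScaler
      from sklearn.linear_model import LinearRegression
      from sklearn.neural_network import MLPRegressor
      from sklearn.pipeline import make_pipeline
      from sklearn.model_selection import cross_val_score, KFold

      # Hypothetical DoE data: three coded process factors and a measured CQA.
      rng = np.random.default_rng(42)
      X = rng.uniform(-1, 1, size=(27, 3))
      y = 1.0 + 2*X[:, 0] - 1.5*X[:, 1]**2 + 0.8*X[:, 0]*X[:, 2] + rng.normal(0, 0.1, 27)

      rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
      ann = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(8, 8), max_iter=5000, random_state=0))

      cv = KFold(n_splits=5, shuffle=True, random_state=0)
      for name, model in [("RSM (quadratic)", rsm), ("ANN", ann)]:
          r2 = cross_val_score(model, X, y, cv=cv, scoring="r2")
          print(f"{name:16s} cross-validated R2 = {r2.mean():.3f}")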

  10. BIM Methodology Approach to Infrastructure Design: Case Study of Paniga Tunnel

    NASA Astrophysics Data System (ADS)

    Osello, Anna; Rapetti, Niccolò; Semeraro, Francesco

    2017-10-01

    Nowadays, the implementation of Building Information Modelling (BIM) in civil design represents a new challenge for the AECO (Architecture, Engineering, Construction, Owner and Operator) world, one that will attract the interest of many researchers in the coming years. This is driven by the incentives of public administrations and European directives that aim to improve efficiency and better manage the complexity of infrastructure projects. For these reasons, the goal of this research is to propose a methodology for the use of BIM in a tunnel project, analysing the definition of a correct level of detail (LOD) and the possibility of sharing information via interoperability for FEM analysis.

  11. Optimising reversed-phase liquid chromatographic separation of an acidic mixture on a monolithic stationary phase with the aid of response surface methodology and experimental design.

    PubMed

    Wang, Y; Harrison, M; Clark, B J

    2006-02-10

    An optimization strategy for the separation of an acidic mixture employing a monolithic stationary phase is presented, with the aid of experimental design and response surface methodology (RSM). An orthogonal array design (OAD) OA16 (2¹⁵) was used to choose the significant parameters for the optimization. The significant factors were optimized using a central composite design (CCD), and quadratic models between the dependent and independent parameters were built. The mathematical models were tested on a number of simulated data sets and had a coefficient of determination R² > 0.97 (n = 16). On applying the optimization strategy, the factor effects were visualized as three-dimensional (3D) response surfaces and contour plots. The optimal condition was achieved in less than 40 min using the monolithic packing with a mobile phase of methanol/20 mM phosphate buffer pH 2.7 (25.5/74.5, v/v). The method showed good agreement between the experimental data and the predicted values throughout the studied parameter space and was suitable for optimization studies on the monolithic stationary phase for acidic compounds.

  12. Observations on computational methodologies for use in large-scale, gradient-based, multidisciplinary design incorporating advanced CFD codes

    NASA Technical Reports Server (NTRS)

    Newman, P. A.; Hou, G. J.-W.; Jones, H. E.; Taylor, A. C., III; Korivi, V. M.

    1992-01-01

    How a combination of various computational methodologies could reduce the enormous computational costs envisioned in using advanced CFD codes in gradient-based, optimized multidisciplinary design (MdD) procedures is briefly outlined. Implications of these MdD requirements upon advanced CFD codes are somewhat different from those imposed by single-discipline design. A means for satisfying these MdD requirements for gradient information is presented which appears to permit: (1) some leeway in the CFD solution algorithms which can be used; (2) an extension to 3-D problems; and (3) straightforward use of other computational methodologies. Many of these observations have previously been discussed as possibilities for doing parts of the problem more efficiently; the contribution here is observing how they fit together in a mutually beneficial way.

  13. Factors Affecting the Design of Slow Release Formulations of Herbicides Based on Clay-Surfactant Systems. A Methodological Approach

    PubMed Central

    Galán-Jiménez, María del Carmen; Mishael, Yael-Golda; Nir, Shlomo; Morillo, Esmeralda; Undabeytia, Tomás

    2013-01-01

    A search for clay-surfactant-based formulations with a high percentage of active ingredient that can yield slow release of the active molecules is described. The active ingredients were the herbicides metribuzin (MZ), mesotrione (MS) and flurtamone (FL), whose solubilities were examined in the presence of four commercial surfactants: (i) neutral: two berols (B048, B266) and an alkylpolyglucoside (AG6202); (ii) cationic: an ethoxylated amine (ET/15). A significant percentage of active ingredient (a.i.) in the clay/surfactant/herbicide formulations could be achieved only when most of the surfactant was added as micelles. MZ and FL were well solubilized by the berols, whereas MS was solubilized by ET/15. Sorption of surfactants on the clay mineral sepiolite occurred mostly by sorption of micelles, and the loadings exceeded the CEC. Higher loadings were determined for B266 and ET/15. The sorption of surfactants was modeled using the Langmuir-Scatchard equation, which permitted the determination of binding coefficients that could be used for further predictions of the sorbed amounts of surfactants over a wide range of clay/surfactant ratios. The possibility of designing clay-surfactant-based formulations of certain herbicides was tested by assuming the same ratio between herbicides and surfactants in the formulations as for herbicides incorporated in micelles in solution. Calculations indicated that satisfactory FL formulations could not be synthesized. The experimental fractions of herbicides in the formulations were in agreement with the predicted ones for MS and MZ. The validity of this approach was confirmed in in vitro release tests that showed a slowing down of the release of a.i. from the designed formulations relative to the technical products. Soil dissipation studies with MS formulations also showed improved bioactivity of the clay-surfactant formulation relative to the commercial one. This methodological approach can be extended to other clay-surfactant systems for encapsulation and
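
    The binding-coefficient step above can be illustrated by fitting an isotherm to sorption data; the sketch below fits an ordinary Langmuir isotherm as a stand-in for the Langmuir-Scatchard form used in the study, and the concentration and loading values are hypothetical.

      import numpy as np
      from scipy.optimize import curve_fit

      # Hypothetical equilibrium data: surfactant solution concentration (mmol/L)
      # and amount sorbed on sepiolite (mmol/kg).
      c_eq   = np.array([0.05, 0.1, 0.25, 0.5, 1.0, 2.0, 4.0])
      sorbed = np.array([ 90., 160., 300., 430., 560., 650., 700.])

      def langmuir(c, q_max, k):
          """Ordinary Langmuir isotherm; stands in for the Langmuir-Scatchard form."""
          return q_max * k * c / (1.0 + k * c)

      (q_max, k), _ = curve_fit(langmuir, c_eq, sorbed, p0=(700.0, 1.0))
      print(f"q_max = {q_max:.0f} mmol/kg, binding coefficient K = {k:.2f} L/mmol")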

  14. Development of a design methodology for pipelines in ice scoured seabeds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clark, J.I.; Paulin, M.J.; Lach, P.R.

    1994-12-31

    Large areas of the continental shelf of northern oceans are frequently scoured or gouged by moving bodies of ice such as icebergs and sea ice keels associated with pressure ridges. This phenomenon presents a formidable challenge when the route of a submarine pipeline is intersected by the scouring ice. It is generally acknowledged that if a pipeline, laid on the seabed, were hit by an iceberg or a pressure ridge keel, the forces imposed on the pipeline would be much greater than it could practically withstand. The pipeline must therefore be buried to avoid direct contact with ice, but it is very important to determine with some assurance the minimum depth required for safety for both economic and environmental reasons. The safe burial depth of a pipeline, however, cannot be determined directly from the relatively straightforward measurement of maximum scour depth. The major design consideration is the determination of the potential sub-scour deformation of the ice scoured soil. Forces transmitted through the soil and soil displacement around the pipeline could load the pipeline to failure if not taken into account in the design. If the designer can predict the forces transmitted through the soil, the pipeline can be designed to withstand these external forces using conventional design practice. In this paper, the authors outline a design methodology that is based on phenomenological studies of ice scoured terrain, both modern and relict, laboratory tests, centrifuge modeling, and numerical analysis. The implications of these studies, which could assist in the safe and economical design of pipelines in ice scoured terrain, will also be discussed.

  15. Exploring factors that influence work analysis data: A meta-analysis of design choices, purposes, and organizational context.

    PubMed

    DuVernet, Amy M; Dierdorff, Erich C; Wilson, Mark A

    2015-09-01

    Work analysis is fundamental to designing effective human resource systems. The current investigation extends previous research by identifying the differential effects of common design decisions, purposes, and organizational contexts on the data generated by work analyses. The effects of 19 distinct factors that span choices of descriptor, collection method, rating scale, and data source, as well as project purpose and organizational features, are explored. Meta-analytic results cumulated from 205 articles indicate that many of these variables hold significant consequences for work analysis data. Factors pertaining to descriptor choice, collection method, rating scale, and the purpose for conducting the work analysis each showed strong associations with work analysis data. The source of the work analysis information and organizational context in which it was conducted displayed fewer relationships. Findings can be used to inform choices work analysts make about methodology and postcollection evaluations of work analysis information. (c) 2015 APA, all rights reserved).

  16. Factors and competitiveness analysis in rare earth mining, new methodology: case study from Brazil.

    PubMed

    Silva, Gustavo A; Petter, Carlos O; Albuquerque, Nelson R

    2018-03-01

    Rare earths are increasingly being applied in high-tech industries, such as green energy (e.g. wind power), hybrid cars, electric cars, permanent high-performance magnets, superconductors, luminophores and many other industrial sectors involved in modern technologies. Given that China dominates this market and imposes restrictions on production and exports whenever opportunities arise, it is becoming more and more challenging to develop business ventures in this sector. Several initiatives have been taken to prospect new resources and develop the production chain, including the mining of these mineral assets around the world, but several sources of uncertainty, including current low prices, have increased the challenge of transforming the current resources into deposits or productive mines. Thus, analyzing the competitiveness of advanced projects becomes indispensable. This work introduces a new methodology for competitiveness analysis, in which selected variables are treated as main factors that can contribute strongly to making a rare earth element (REE) mining enterprise unfeasible. With this methodology, which is quite practical and reproducible, it was possible to verify some real facts, such as the fact that the Lynas Mount Weld CLD (AUS) Project is resilient to the uncertainties of the RE sector, while at the same time the Molycorp Project is facing major financial difficulties (under judicial reorganization). It was also possible to verify that the Araxá Project of CBMM in Brazil is one of the most competitive in this country. Thus, we contribute to the existing literature by providing a new methodology for competitiveness analysis in rare earth mining.

  17. Bayes factor design analysis: Planning for compelling evidence.

    PubMed

    Schönbrodt, Felix D; Wagenmakers, Eric-Jan

    2018-02-01

    A sizeable literature exists on the use of frequentist power analysis in the null-hypothesis significance testing (NHST) paradigm to facilitate the design of informative experiments. In contrast, there is almost no literature that discusses the design of experiments when Bayes factors (BFs) are used as a measure of evidence. Here we explore Bayes Factor Design Analysis (BFDA) as a useful tool to design studies for maximum efficiency and informativeness. We elaborate on three possible BF designs: (a) a fixed-n design; (b) an open-ended Sequential Bayes Factor (SBF) design, where researchers can test after each participant and can stop data collection whenever there is strong evidence for either the alternative or the null hypothesis; and (c) a modified SBF design that defines a maximal sample size at which data collection is stopped regardless of the current state of evidence. We demonstrate how the properties of each design (i.e., expected strength of evidence, expected sample size, expected probability of misleading evidence, expected probability of weak evidence) can be evaluated using Monte Carlo simulations and equip researchers with the necessary information to compute their own Bayesian design analyses.
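
    A minimal Monte Carlo sketch of an open-ended SBF design follows; it uses the BIC approximation to the Bayes factor for a two-group mean comparison as a convenient stand-in for a fully Bayesian BF, and the effect size, stopping threshold, and sample limits are assumptions for illustration.

      import numpy as np

      rng = np.random.default_rng(7)

      def bf10_two_groups(x, y):
          """Approximate BF10 for a two-group mean difference via the BIC approximation:
          BF10 ~ exp((BIC_null - BIC_alt) / 2)."""
          data = np.concatenate([x, y])
          n = data.size
          rss_null = np.sum((data - data.mean()) ** 2)
          rss_alt  = np.sum((x - x.mean()) ** 2) + np.sum((y - y.mean()) ** 2)
          bic_null = n * np.log(rss_null / n) + 1 * np.log(n)
          bic_alt  = n * np.log(rss_alt / n) + 2 * np.log(n)
          return np.exp((bic_null - bic_alt) / 2)

      def sbf_trial(effect=0.5, n_min=10, n_max=200, threshold=10.0):
          """One simulated open-ended SBF study: add one participant per group at a time
          and stop when BF10 > threshold, BF10 < 1/threshold, or n_max is reached."""
          x, y = list(rng.normal(0, 1, n_min)), list(rng.normal(effect, 1, n_min))
          while len(x) < n_max:
              bf = bf10_two_groups(np.array(x), np.array(y))
              if bf > threshold or bf < 1 / threshold:
                  return len(x), bf
              x.append(rng.normal(0, 1)); y.append(rng.normal(effect, 1))
          return len(x), bf10_two_groups(np.array(x), np.array(y))

      runs = [sbf_trial() for _ in range(500)]
      sizes = np.array([n for n, _ in runs])
      bfs = np.array([bf for _, bf in runs])
      print(f"median n per group = {np.median(sizes):.0f}, "
            f"P(misleading evidence for H0) = {np.mean(bfs < 1/10):.3f}")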

  18. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    NASA Technical Reports Server (NTRS)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  19. Payload training methodology study

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The results of the Payload Training Methodology Study (PTMS) are documented. Methods and procedures are defined for the development of payload training programs to be conducted at the Marshall Space Flight Center Payload Training Complex (PCT) for the Space Station Freedom program. The study outlines the overall training program concept as well as the six methodologies associated with the program implementation. The program concept outlines the entire payload training program from initial identification of training requirements to the development of detailed design specifications for simulators and instructional material. The following six methodologies are defined: (1) The Training and Simulation Needs Assessment Methodology; (2) The Simulation Approach Methodology; (3) The Simulation Definition Analysis Methodology; (4) The Simulator Requirements Standardization Methodology; (5) The Simulator Development Verification Methodology; and (6) The Simulator Validation Methodology.

  20. New Methodology of Designing Inexpensive Hybrid Control-Acquisition Systems for Mechatronic Constructions

    PubMed Central

    Augustyn, Jacek

    2013-01-01

    This article presents a new methodology for designing a hybrid control and acquisition system consisting of a 32-bit SoC microsystem connected via a direct Universal Serial Bus (USB) with a standard commercial off-the-shelf (COTS) component running the Android operating system. It is proposed to utilize this direct connection, avoiding the use of an additional converter. An Android-based component was chosen to explore the potential for a mobile, compact and energy efficient solution with easy to build user interfaces and easy wireless integration with other computer systems. This paper presents results of practical implementation and analysis of experimental real-time performance. It covers closed control loop time between the sensor/actuator module and the Android operating system as well as the real-time sensor data stream within such a system. Some optimisations are proposed and their influence on real-time performance is investigated. The proposed methodology is intended for acquisition and control of mechatronic systems, especially mobile robots. It can be used in a wide range of control applications as well as embedded acquisition-recording devices, including energy quality measurements, smart-grids and medicine. It is demonstrated that the proposed methodology can be employed without developing specific device drivers. The latency achieved was less than 0.5 ms and the sensor data stream throughput was on the order of 750 KB/s (compared to 3 ms latency and 300 KB/s in traditional solutions). PMID:24351633

  1. New methodology of designing inexpensive hybrid control-acquisition systems for mechatronic constructions.

    PubMed

    Augustyn, Jacek

    2013-12-13

    This article presents a new methodology for designing a hybrid control and acquisition system consisting of a 32-bit SoC microsystem connected via a direct Universal Serial Bus (USB) with a standard commercial off-the-shelf (COTS) component running the Android operating system. It is proposed to utilize this direct connection, avoiding the use of an additional converter. An Android-based component was chosen to explore the potential for a mobile, compact and energy efficient solution with easy to build user interfaces and easy wireless integration with other computer systems. This paper presents results of practical implementation and analysis of experimental real-time performance. It covers closed control loop time between the sensor/actuator module and the Android operating system as well as the real-time sensor data stream within such a system. Some optimisations are proposed and their influence on real-time performance is investigated. The proposed methodology is intended for acquisition and control of mechatronic systems, especially mobile robots. It can be used in a wide range of control applications as well as embedded acquisition-recording devices, including energy quality measurements, smart-grids and medicine. It is demonstrated that the proposed methodology can be employed without developing specific device drivers. The latency achieved was less than 0.5 ms and the sensor data stream throughput was on the order of 750 KB/s (compared to 3 ms latency and 300 KB/s in traditional solutions).

  2. Design Process-System and Methodology of Design Research

    NASA Astrophysics Data System (ADS)

    Bashier, Fathi

    2017-10-01

    Studies have recognized the failure of the traditional design approach both in practice and in the studio. They showed that design problems today are too complex for the traditional approach to cope with, and reflected a new interest in better-quality design services in order to meet the challenges of our time. In the mid-1970s and early 1980s, there was a significant shift in focus within the field of design research towards the aim of creating a ‘design discipline’. The problem, as will be discussed, is the lack of an integrated theory of design knowledge that can explicitly describe the design process in a coherent way. As a consequence, the traditional approach fails to operate systematically, in a disciplinary manner. Addressing this problem is the primary goal of the research study in the design process currently being conducted in the research-based master studio at Wollega University, Ethiopia. The research study seeks to make a contribution towards a disciplinary approach through a proper understanding of the mechanism of knowledge development within design process systems. This is the task of the ‘theory of design knowledge’. In this article the research project is introduced, and a model of the design process-system is developed in the studio as a research plan and a tool of design research at the same time. Based on data drawn from students’ research projects, the theory of design knowledge is developed and empirically verified through the research project.

  3. Piloted Evaluation of an Integrated Methodology for Propulsion and Airframe Control Design

    NASA Technical Reports Server (NTRS)

    Bright, Michelle M.; Simon, Donald L.; Garg, Sanjay; Mattern, Duane L.; Ranaudo, Richard J.; Odonoghue, Dennis P.

    1994-01-01

    An integrated methodology for propulsion and airframe control has been developed and evaluated for a Short Take-Off Vertical Landing (STOVL) aircraft using a fixed base flight simulator at NASA Lewis Research Center. For this evaluation the flight simulator is configured for transition flight using a STOVL aircraft model, a full nonlinear turbofan engine model, simulated cockpit and displays, and pilot effectors. The paper provides a brief description of the simulation models, the flight simulation environment, the displays and symbology, the integrated control design, and the piloted tasks used for control design evaluation. In the simulation, the pilots successfully completed typical transition phase tasks such as combined constant deceleration with flight path tracking, and constant acceleration wave-off maneuvers. The pilots' comments on the integrated system performance and the display symbology are discussed and analyzed to identify potential areas of improvement.

  4. Design of experiment (DOE) based screening of factors affecting municipal solid waste (MSW) composting.

    PubMed

    Kazemi, Khoshrooz; Zhang, Baiyu; Lye, Leonard M; Cai, Qinghong; Cao, Tong

    2016-12-01

    A design of experiment (DOE) based methodology was adopted in this study to investigate the effects of multiple factors and their interactions on the performance of a municipal solid waste (MSW) composting process. The impact of four factors, carbon/nitrogen ratio (C/N), moisture content (MC), type of bulking agent (BA) and aeration rate (AR), on the maturity, stability and toxicity of the compost product was investigated. The statistically significant factors were identified using final C/N, germination index (GI) and especially the enzyme activities as responses. Experimental results validated the use of enzyme activities as proper indices during the course of composting. Maximum enzyme activities occurred during the active phase of decomposition. MC has a significant effect on dehydrogenase activity (DGH), β-glucosidase activity (BGH), phosphodiesterase activity (PDE) and the final moisture content of the compost. C/N is statistically significant for final C/N, DGH, BGH, and GI. The results provided guidance to optimize an MSW composting system, leading to an increased decomposition rate and the production of more stable and mature compost. Copyright © 2016 Elsevier Ltd. All rights reserved.
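    As a rough illustration of the factorial screening step this record describes, the sketch below builds a two-level full factorial design for the four named factors and estimates their main effects from a response column. It is a minimal sketch only: the factor names follow the abstract (C/N, MC, BA, AR), but the response values are invented placeholders, not data from the study.

```python
# Minimal two-level factorial screening sketch (illustrative data only).
import itertools
import numpy as np

factors = ["C/N", "MC", "BA", "AR"]

# Full 2^4 design matrix in coded units (-1 = low level, +1 = high level).
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))))

# Hypothetical response, e.g. dehydrogenase activity measured for each run.
response = np.random.default_rng(1).normal(loc=10, scale=2, size=len(design))

# Main effect of each factor = mean(response at high) - mean(response at low).
for j, name in enumerate(factors):
    high = response[design[:, j] == 1].mean()
    low = response[design[:, j] == -1].mean()
    print(f"{name:>4}: main effect = {high - low:+.2f}")
```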

  5. Implementation Science and Employer Disability Practices: Embedding Implementation Factors in Research Designs.

    PubMed

    Main, Chris J; Nicholas, Michael K; Shaw, William S; Tetrick, Lois E; Ehrhart, Mark G; Pransky, Glenn

    2016-12-01

    Purpose For work disability research to have an impact on employer policies and practices it is important for such research to acknowledge and incorporate relevant aspects of the workplace. The goal of this article is to summarize recent theoretical and methodological advances in the field of Implementation Science, relate these to research of employer disability management practices, and recommend future research priorities. Methods The authors participated in a year-long collaboration culminating in an invited 3-day conference, "Improving Research of Employer Practices to Prevent Disability", held October 14-16, 2015, in Hopkinton, MA, USA. The collaboration included a topical review of the literature, group conference calls to identify key areas and challenges, drafting of initial documents, review of industry publications, and a conference presentation that included feedback from peer researchers and a question/answer session with a special panel of knowledge experts with direct employer experience. Results A 4-phase implementation model including both outer and inner contexts was adopted as the most appropriate conceptual framework, and aligned well with the set of process evaluation factors described in both the work disability prevention literature and the grey literature. Innovative interventions involving disability risk screening and psychologically-based interventions have been slow to gain traction among employers and insurers. Research recommendations to address this are: (1) to assess organizational culture and readiness for change in addition to individual factors; (2) to conduct process evaluations alongside controlled trials; (3) to analyze decision-making factors among stakeholders; and (4) to solicit input from employers and insurers during early phases of study design. Conclusions Future research interventions involving workplace support and involvement to prevent disability may be more feasible for implementation if organizational decision

  6. A Sizing Methodology for the Conceptual Design of Blended-Wing-Body Transports. Degree awarded by George Washington Univ.

    NASA Technical Reports Server (NTRS)

    Kimmel, William M. (Technical Monitor); Bradley, Kevin R.

    2004-01-01

    This paper describes the development of a methodology for sizing Blended-Wing-Body (BWB) transports and how the capabilities of the Flight Optimization System (FLOPS) have been expanded using that methodology. In this approach, BWB transports are sized based on the number of passengers in each class that must fit inside the centerbody or pressurized vessel. Weight estimation equations for this centerbody structure were developed using Finite Element Analysis (FEA). This paper shows how the sizing methodology has been incorporated into FLOPS to enable the design and analysis of BWB transports. Previous versions of FLOPS did not have the ability to accurately represent or analyze BWB configurations in any reliable, logical way. The expanded capabilities allow the design and analysis of a 200 to 450-passenger BWB transport or the analysis of a BWB transport for which the geometry is already known. The modifications to FLOPS resulted in differences of less than 4 percent for the ramp weight of a BWB transport in this range when compared to previous studies performed by NASA and Boeing.

  7. A robust rotorcraft flight control system design methodology utilizing quantitative feedback theory

    NASA Technical Reports Server (NTRS)

    Gorder, Peter James

    1993-01-01

    Rotorcraft flight control systems present design challenges which often exceed those associated with fixed-wing aircraft. First, large variations in the response characteristics of the rotorcraft result from the wide range of airspeeds of typical operation (hover to over 100 kts). Second, the assumption of vehicle rigidity often employed in the design of fixed-wing flight control systems is rarely justified in rotorcraft, where rotor degrees of freedom can have a significant impact on the system performance and stability. This research was intended to develop a methodology for the design of robust rotorcraft flight control systems. Quantitative Feedback Theory (QFT) was chosen as the basis for the investigation. Quantitative Feedback Theory is a technique which accounts for variability in the dynamic response of the controlled element in the design of robust control systems. It was developed to address a Multiple-Input Single-Output (MISO) design problem, and utilizes two degrees of freedom to satisfy the design criteria. Two techniques were examined for extending the QFT MISO technique to the design of a Multiple-Input-Multiple-Output (MIMO) flight control system (FCS) for a UH-60 Black Hawk Helicopter. In the first, a set of MISO systems, mathematically equivalent to the MIMO system, was determined. QFT was applied to each member of the set simultaneously. In the second, the same set of equivalent MISO systems was analyzed sequentially, with closed loop response information from each loop utilized in subsequent MISO designs. The results of each technique were compared, and the advantages of the second, termed Sequential Loop Closure, were clearly evident.

  8. Reporting and methodological quality of meta-analyses in urological literature.

    PubMed

    Xia, Leilei; Xu, Jing; Guzzo, Thomas J

    2017-01-01

    To assess the overall quality of published urological meta-analyses and identify predictive factors for high quality. We systematically searched PubMed to identify meta-analyses published from January 1st, 2011 to December 31st, 2015 in 10 predetermined major paper-based urology journals. The characteristics of the included meta-analyses were collected, and their reporting and methodological qualities were assessed by the PRISMA checklist (27 items) and AMSTAR tool (11 items), respectively. Descriptive statistics were used for individual items as a measure of overall compliance, and PRISMA and AMSTAR scores were calculated as the sum of adequately reported domains. Logistic regression was used to identify predictive factors for high quality. A total of 183 meta-analyses were included. The mean PRISMA and AMSTAR scores were 22.74 ± 2.04 and 7.57 ± 1.41, respectively. PRISMA item 5, protocol and registration, items 15 and 22, risk of bias across studies, items 16 and 23, additional analysis had less than 50% adherence. AMSTAR item 1, "a priori" design, item 5, list of studies and item 10, publication bias had less than 50% adherence. Logistic regression analyses showed that funding support and "a priori" design were associated with superior reporting quality, while following the PRISMA guideline and "a priori" design were associated with superior methodological quality. Reporting and methodological qualities of recently published meta-analyses in major paper-based urology journals are generally good. Further improvement could potentially be achieved by strictly adhering to the PRISMA guideline and having an "a priori" protocol.
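    The sketch below illustrates the kind of logistic regression used here to identify predictors of high quality. It is a hedged example: the binary outcome, the two predictors (funding support, "a priori" design) and every data value are fabricated for illustration, not taken from the study.

```python
# Illustrative logistic regression of a binary "high quality" indicator
# on two candidate predictors. All data are fabricated placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 183
df = pd.DataFrame({
    "funding": rng.integers(0, 2, n),   # 1 = funding support reported
    "apriori": rng.integers(0, 2, n),   # 1 = "a priori" design present
})
# Fabricated outcome: probability of high quality rises with both predictors.
logit_p = -1.0 + 0.8 * df["funding"] + 1.2 * df["apriori"]
df["high_quality"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(df[["funding", "apriori"]])
model = sm.Logit(df["high_quality"], X).fit(disp=False)
print(model.summary2().tables[1])   # odds ratios would be np.exp(model.params)
```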

  9. Optimization of Tape Winding Process Parameters to Enhance the Performance of Solid Rocket Nozzle Throat Back Up Liners using Taguchi's Robust Design Methodology

    NASA Astrophysics Data System (ADS)

    Nath, Nayani Kishore

    2017-08-01

    The throat back-up liners are used to protect the nozzle structural members from the severe thermal environment in solid rocket nozzles. The throat back-up liners are made with E-glass phenolic prepregs by a tape winding process. The objective of this work is to demonstrate the optimization of the tape winding process parameters to achieve better insulative resistance using Taguchi's robust design methodology. In this method, four control factors (machine speed, roller pressure, tape tension and tape temperature) were investigated for the tape winding process. The presented work studies the cogency and acceptability of Taguchi's methodology in the manufacturing of throat back-up liners. The quality characteristic identified was back wall temperature. Experiments were carried out using an L9 (3^4) orthogonal array with three levels of four different control factors. The test results were analyzed using the smaller-the-better criterion for the signal-to-noise ratio in order to optimize the process. The experimental results were analyzed, confirmed and successfully used to achieve the minimum back wall temperature of the throat back-up liners. The enhancement in performance of the throat back-up liners was observed by carrying out oxy-acetylene tests. The influence of back wall temperature on the performance of the throat back-up liners was verified by a ground firing test.
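    To make the L9 / smaller-the-better step concrete, here is a minimal sketch of how the signal-to-noise ratios and factor-level means would be tabulated. The L9(3^4) array is the standard one; the back-wall temperatures are hypothetical placeholders, not the study's measurements.

```python
# Taguchi "smaller-the-better" bookkeeping for an L9(3^4) array (toy data).
import numpy as np

# Standard L9 orthogonal array: 9 runs x 4 three-level factors (levels 0,1,2).
L9 = np.array([
    [0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
    [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
    [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0],
])
factors = ["speed", "pressure", "tension", "temperature"]
temps = np.array([182., 175., 169., 178., 171., 166., 173., 168., 161.])  # degC, hypothetical

# Smaller-the-better S/N ratio per run (single replicate): -10*log10(y^2).
sn = -10 * np.log10(temps ** 2)

# Mean S/N at each level of each factor; the level with the highest S/N is preferred.
for j, name in enumerate(factors):
    means = [sn[L9[:, j] == lvl].mean() for lvl in range(3)]
    print(name, [f"{m:.2f}" for m in means])
```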

  10. Robust modular product family design

    NASA Astrophysics Data System (ADS)

    Jiang, Lan; Allada, Venkat

    2001-10-01

    This paper presents a modified Taguchi methodology to improve the robustness of modular product families against changes in customer requirements. The general research questions posed in this paper are: (1) How to effectively design a product family (PF) that is robust enough to accommodate future customer requirements. (2) How far into the future should designers look to design a robust product family? An example of a simplified vacuum product family is used to illustrate our methodology. In the example, customer requirements are selected as signal factors; future changes of customer requirements are selected as noise factors; an index called the quality characteristic (QC) is set to evaluate the vacuum product family; and the module instance matrix (M) is selected as the control factor. Initially a relation between the objective function (QC) and the control factor (M) is established, and then the feasible M space is systematically explored using a simplex method to determine the optimum M and the corresponding QC values. Next, various noise levels at different time points are introduced into the system. For each noise level, the optimal values of M and QC are computed and plotted on a QC-chart. The tunable time period of the control factor (the module matrix, M) is computed using the QC-chart. The tunable time period represents the maximum time for which a given control factor can be used to satisfy current and future customer needs. Finally, a robustness index is used to break up the tunable time period into suitable time periods that designers should consider while designing product families.

  11. A Test Methodology for Determining Space-Readiness of Xilinx SRAM-Based FPGA Designs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quinn, Heather M; Graham, Paul S; Morgan, Keith S

    2008-01-01

    Using reconfigurable, static random-access memory (SRAM) based field-programmable gate arrays (FPGAs) for space-based computation has been an exciting area of research for the past decade. Since both the circuit and the circuit's state are stored in SRAM, both could be altered by the harsh space radiation environment. Both the circuit and the circuit's state can be protected by triple-modular redundancy (TMR), but applying TMR to FPGA user designs is often an error-prone process. Faulty application of TMR could cause the FPGA user circuit to output incorrect data. This paper will describe a three-tiered methodology for testing FPGA user designs for space-readiness. We will describe the standard approach to testing FPGA user designs using a particle accelerator, as well as two methods using fault injection and a modeling tool. While accelerator testing is the current 'gold standard' for pre-launch testing, we believe the use of fault injection and modeling tools allows for easy, cheap and uniform access for discovering errors early in the design process.
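    As a toy illustration of why TMR masks single upsets (and why fault-injection testing is a useful complement to accelerator testing), the sketch below triplicates a word, flips one bit in one copy, and checks the voted output. This is purely illustrative and is not the paper's fault-injection tooling.

```python
# Toy fault-injection sketch: a 2-of-3 majority voter masks any single upset.
import random

def majority(a: int, b: int, c: int) -> int:
    """Bitwise 2-of-3 majority vote."""
    return (a & b) | (a & c) | (b & c)

def inject(word: int, bit: int) -> int:
    """Flip one bit of a 16-bit word (a simulated single-event upset)."""
    return word ^ (1 << bit)

random.seed(0)
errors = 0
for _ in range(10_000):
    golden = random.getrandbits(16)
    copies = [golden, golden, golden]
    idx = random.randrange(3)                       # upset exactly one copy
    copies[idx] = inject(copies[idx], random.randrange(16))
    if majority(*copies) != golden:
        errors += 1
print("outputs corrupted despite TMR:", errors)     # expect 0 for single upsets
```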

  12. Managing Academic Tasks in Junior High School: Background, Design, and Methodology. (R & D Rep. No. 6185).

    ERIC Educational Resources Information Center

    Doyle, Walter; And Others

    This report describes the conceptual background, design, and methodology for a study of management of academic tasks in junior high school. Previous research suggests that tasks students accomplish in classrooms determine what they actually learn, and acquisition of higher cognitive skills related to interpretation and planning is essential for…

  13. Centroid and Theoretical Rotation: Justification for Their Use in Q Methodology Research

    ERIC Educational Resources Information Center

    Ramlo, Sue

    2016-01-01

    This manuscript's purpose is to introduce Q as a methodology before providing clarification about the preferred factor analytical choices of centroid and theoretical (hand) rotation. Stephenson, the creator of Q, designated that only these choices allowed for scientific exploration of subjectivity while not violating assumptions associated with…

  14. Performance evaluation in full-mission simulation - Methodological advances and research challenges. [in air transport operations

    NASA Technical Reports Server (NTRS)

    Chidester, Thomas R.; Kanki, Barbara G.; Helmreich, Robert L.

    1989-01-01

    The crew-factors research program at NASA Ames has developed a methodology for studying the impact of a variety of variables on the effectiveness of crews flying realistic but high workload simulated trips. The validity of investigations using the methodology is enhanced by careful design of full-mission scenarios, performance assessment using converging sources of data, and recruitment of representative subjects. Recently, portions of this methodology have been adapted for use in assessing the effectiveness of crew coordination among participants in line-oriented flight training.

  15. [Radiotherapy phase I trials' methodology: Features].

    PubMed

    Rivoirard, R; Vallard, A; Langrand-Escure, J; Guy, J-B; Ben Mrad, M; Yaoxiong, X; Diao, P; Méry, B; Pigne, G; Rancoule, C; Magné, N

    2016-12-01

    In clinical research, biostatistical methods allow the rigorous analysis of data collection and should be defined from the trial design to obtain the appropriate experimental approach. Thus, if the main purpose of phase I is to determine the dose to use during phase II, the methodology should be finely adjusted to the experimental treatment(s). Today, the methodology for chemotherapy and targeted therapy is well known. For radiotherapy and chemoradiotherapy phase I trials, the primary endpoint must reflect both effectiveness and potential treatment toxicities. Methodology should probably be complex to limit failures in the following phases. However, there are very few data about methodology design in the literature. The present study focuses on these particular trials and their characteristics. It should help to highlight the shortcomings of existing methodological patterns in order to propose new and better-suited designs. Copyright © 2016 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.

  16. A methodology for design of a linear referencing system for surface transportation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vonderohe, A.; Hepworth, T.

    1997-06-01

    The transportation community has recently placed significant emphasis on development of data models, procedural standards, and policies for management of linearly-referenced data. There is an Intelligent Transportation Systems initiative underway to create a spatial datum for location referencing in one, two, and three dimensions. Most recently, a call was made for development of a unified linear reference system to support public, private, and military surface transportation needs. A methodology for design of the linear referencing system was developed from geodetic engineering principles and techniques used for designing geodetic control networks. The method is founded upon the law of propagation of random error and the statistical analysis of systems of redundant measurements, used to produce best estimates for unknown parameters. A complete mathematical development is provided. Example adjustments of linear distance measurement systems are included. The classical orders of design are discussed with regard to the linear referencing system. A simple design example is provided. A linear referencing system designed and analyzed with this method will not only be assured of meeting the accuracy requirements of users, it will have the potential for supporting delivery of error estimates along with the results of spatial analytical queries. Modeling considerations, alternative measurement methods, implementation strategies, maintenance issues, and further research needs are discussed. Recommendations are made for further advancement of the unified linear referencing system concept.
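    The core of the method described here, least-squares adjustment of redundant distance measurements with propagation of random error, can be sketched in a few lines. The tiny network below (two unknown reference-point positions, three redundant distance observations) and all numbers are invented for illustration, not taken from the report.

```python
# Least-squares adjustment of redundant linear distance measurements
# with propagated error (illustrative network and numbers only).
import numpy as np

# Unknowns: positions x1, x2 of two reference points (origin fixed at 0 km).
# Observations: distances 0->1, 1->2, 0->2 (deliberately redundant).
A = np.array([[1.0, 0.0],
              [-1.0, 1.0],
              [0.0, 1.0]])
l = np.array([10.02, 15.01, 24.95])       # measured distances, km
sigma = 0.03                               # assumed std. dev. of one measurement, km

x_hat, *_ = np.linalg.lstsq(A, l, rcond=None)
residuals = l - A @ x_hat
cov = sigma**2 * np.linalg.inv(A.T @ A)   # propagated covariance of the estimates

print("adjusted positions (km):", x_hat)
print("position std devs (km):", np.sqrt(np.diag(cov)))
print("residuals (km):", residuals)
```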

  17. Deformable Surface Accommodating Intraocular Lens: Second Generation Prototype Design Methodology and Testing.

    PubMed

    McCafferty, Sean J; Schwiegerling, Jim T

    2015-04-01

    Present an analysis methodology for developing and evaluating accommodating intraocular lenses incorporating a deformable interface. The next-generation design of an extruded gel interface intraocular lens is presented. A prototype based upon a similar, previously in vivo proven design was tested with measurements of actuation force, lens power, interface contour, optical transfer function, and visual Strehl ratio. Prototype-verified mathematical models were used to optimize optical and mechanical design parameters so as to maximize the image quality and minimize the force required to accommodate. The prototype lens produced adequate image quality with the available physiologic accommodating force. The iterative mathematical modeling based upon the prototype yielded maximized optical and mechanical performance through the maximum allowable gel thickness to extrusion diameter ratio, the maximum feasible refractive index change at the interface, and minimum gel material properties in Poisson's ratio and Young's modulus. The design prototype performed well. It operated within the physiologic constraints of the human eye, including the force available for full accommodative amplitude using the eye's natural focusing feedback, while maintaining image quality in the space available. The parameters that optimized optical and mechanical performance were delineated as those which minimize both asphericity and actuation pressure. The design parameters outlined herein can be used as a template to maximize the performance of a deformable interface intraocular lens. The article combines a multidisciplinary basic science approach from biomechanics, optical science, and ophthalmology to optimize an intraocular lens design suitable for preliminary animal trials.

  18. Methodologic ramifications of paying attention to sex and gender differences in clinical research.

    PubMed

    Prins, Martin H; Smits, Kim M; Smits, Luc J

    2007-01-01

    Methodologic standards for studies on sex and gender differences should be developed to improve reporting of studies and facilitate their inclusion in systematic reviews. The essence of these studies lies within the concept of effect modification. This article reviews important methodologic issues in the design and reporting of pharmacogenetic studies. Differences in effect based on sex or gender should preferably be expressed in absolute terms (risk differences) to facilitate clinical decisions on treatment. Information on the distribution of potential effect modifiers or prognostic factors should be available to prevent a biased comparison of differences in effect between genotypes. Other considerations included the possibility of selective nonavailability of biomaterial and the choice of a statistical model to study effect modification. To ensure high study quality, additional methodologic issues should be taken into account when designing and reporting studies on sex and gender differences.

  19. Assessment of Registration Information on Methodological Design of Acupuncture RCTs: A Review of 453 Registration Records Retrieved from WHO International Clinical Trials Registry Platform

    PubMed Central

    Gu, Jing; Wang, Qi; Wang, Xiaogang; Li, Hailong; Gu, Mei; Ming, Haixia; Dong, Xiaoli; Yang, Kehu; Wu, Hongyan

    2014-01-01

    Background. This review provides the first methodological information assessment of protocols of acupuncture RCTs registered in the WHO International Clinical Trials Registry Platform (ICTRP). Methods. All records of acupuncture RCTs registered in the ICTRP have been collected. The methodological design assessment involved whether the randomization methods, allocation concealment, and blinding were adequate or not based on the information of the registration records (protocols of acupuncture RCTs). Results. A total of 453 records, found in 11 registries, were examined. Methodological details were insufficient in registration records; there were 76.4%, 89.0%, and 21.4% of records that did not provide information on randomization methods, allocation concealment, and blinding, respectively. The proportions of adequate randomization methods, allocation concealment, and blinding were only 107 (23.6%), 48 (10.6%), and 210 (46.4%), respectively. The methodological design improved year by year, especially after 2007. Additionally, the methodology of RCTs with ethics approval was clearly superior to those without ethics approval and differed among registries. Conclusions. The overall methodological design based on registration records of acupuncture RCTs is not very good but has improved year by year. The insufficient information on randomization methods, allocation concealment, and blinding may be due to the relevant descriptions not being taken seriously in the registration of acupuncture RCTs. PMID:24688591

  20. Assessment of Registration Information on Methodological Design of Acupuncture RCTs: A Review of 453 Registration Records Retrieved from WHO International Clinical Trials Registry Platform.

    PubMed

    Gu, Jing; Wang, Qi; Wang, Xiaogang; Li, Hailong; Gu, Mei; Ming, Haixia; Dong, Xiaoli; Yang, Kehu; Wu, Hongyan

    2014-01-01

    Background. This review provides the first methodological information assessment of protocols of acupuncture RCTs registered in the WHO International Clinical Trials Registry Platform (ICTRP). Methods. All records of acupuncture RCTs registered in the ICTRP have been collected. The methodological design assessment involved whether the randomization methods, allocation concealment, and blinding were adequate or not based on the information of the registration records (protocols of acupuncture RCTs). Results. A total of 453 records, found in 11 registries, were examined. Methodological details were insufficient in registration records; there were 76.4%, 89.0%, and 21.4% of records that did not provide information on randomization methods, allocation concealment, and blinding, respectively. The proportions of adequate randomization methods, allocation concealment, and blinding were only 107 (23.6%), 48 (10.6%), and 210 (46.4%), respectively. The methodological design improved year by year, especially after 2007. Additionally, the methodology of RCTs with ethics approval was clearly superior to those without ethics approval and differed among registries. Conclusions. The overall methodological design based on registration records of acupuncture RCTs is not very good but has improved year by year. The insufficient information on randomization methods, allocation concealment, and blinding may be due to the relevant descriptions not being taken seriously in the registration of acupuncture RCTs.

  1. Calibration of the live load factor in LRFD design guidelines.

    DOT National Transportation Integrated Search

    2010-09-01

    The Load and Resistance Factor Design (LRFD) approach is based on the concept of structural reliability. The approach is more rational than former design approaches such as Load Factor Design or Allowable Stress Design. The LRFD Specification ...
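    As a brief illustration of the structural-reliability concept that underlies live load factor calibration, the sketch below computes a first-order reliability index for normally distributed resistance and load effect. The means and coefficients of variation are invented round numbers, not calibration values from the study.

```python
# First-order reliability index for normal resistance R and load effect Q:
# beta = (mu_R - mu_Q) / sqrt(sigma_R^2 + sigma_Q^2). Values are illustrative.
from math import sqrt
from statistics import NormalDist

mu_R, cov_R = 1200.0, 0.10     # mean resistance and its coefficient of variation
mu_Q, cov_Q = 700.0, 0.18      # mean total load effect and its coefficient of variation

sigma_R = mu_R * cov_R
sigma_Q = mu_Q * cov_Q
beta = (mu_R - mu_Q) / sqrt(sigma_R**2 + sigma_Q**2)
pf = NormalDist().cdf(-beta)   # corresponding notional probability of failure

print(f"reliability index beta = {beta:.2f}, P_f = {pf:.2e}")
```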

  2. 33 CFR 156.230 - Factors considered in designating lightering zones.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 2 2011-07-01 2011-07-01 false Factors considered in designating... Lightering of Oil and Hazardous Material Cargoes § 156.230 Factors considered in designating lightering zones. The following factors are considered in designating a lightering zone: (a) The findings of the...

  3. 33 CFR 156.230 - Factors considered in designating lightering zones.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 2 2014-07-01 2014-07-01 false Factors considered in designating... Lightering of Oil and Hazardous Material Cargoes § 156.230 Factors considered in designating lightering zones. The following factors are considered in designating a lightering zone: (a) The findings of the...

  4. 33 CFR 156.230 - Factors considered in designating lightering zones.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 2 2012-07-01 2012-07-01 false Factors considered in designating... Lightering of Oil and Hazardous Material Cargoes § 156.230 Factors considered in designating lightering zones. The following factors are considered in designating a lightering zone: (a) The findings of the...

  5. 33 CFR 156.230 - Factors considered in designating lightering zones.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 2 2013-07-01 2013-07-01 false Factors considered in designating... Lightering of Oil and Hazardous Material Cargoes § 156.230 Factors considered in designating lightering zones. The following factors are considered in designating a lightering zone: (a) The findings of the...

  6. 33 CFR 156.230 - Factors considered in designating lightering zones.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false Factors considered in designating... Lightering of Oil and Hazardous Material Cargoes § 156.230 Factors considered in designating lightering zones. The following factors are considered in designating a lightering zone: (a) The findings of the...

  7. Combining qualitative and quantitative research within mixed method research designs: a methodological review.

    PubMed

    Östlund, Ulrika; Kidd, Lisa; Wengström, Yvonne; Rowa-Dewar, Neneh

    2011-03-01

    It has been argued that mixed methods research can be useful in nursing and health science because of the complexity of the phenomena studied. However, the integration of qualitative and quantitative approaches continues to be a topic of much debate, and there is a need for a rigorous framework for designing and interpreting mixed methods research. This paper explores the analytical approaches (i.e. parallel, concurrent or sequential) used in mixed methods studies within healthcare and exemplifies the use of triangulation as a methodological metaphor for drawing inferences from qualitative and quantitative findings originating from such analyses. This review of the literature used systematic principles in searching CINAHL, Medline and PsycINFO for healthcare research studies which employed a mixed methods approach and were published in the English language between January 1999 and September 2009. In total, 168 studies were included in the results. Most studies originated in the United States of America (USA), the United Kingdom (UK) and Canada. The analytic approach most widely used was parallel data analysis. A number of studies used sequential data analysis; far fewer studies employed concurrent data analysis. Very few of these studies clearly articulated the purpose for using a mixed methods design. The use of the methodological metaphor of triangulation on convergent, complementary, and divergent results from mixed methods studies is exemplified and an example of developing theory from such data is provided. A trend for conducting parallel data analysis on quantitative and qualitative data in mixed methods healthcare research has been identified in the studies included in this review. Using triangulation as a methodological metaphor can facilitate the integration of qualitative and quantitative findings and help researchers to clarify their theoretical propositions and the basis of their results. This can offer a better understanding of the links between theory and

  8. Permanent magnet design methodology

    NASA Technical Reports Server (NTRS)

    Leupold, Herbert A.

    1991-01-01

    Design techniques developed for the exploitation of high energy magnetically rigid materials such as Sm-Co and Nd-Fe-B have resulted in a revolution in kind rather than in degree in the design of a variety of electron guidance structures for ballistic and aerospace applications. Salient examples are listed. Several prototype models were developed. These structures are discussed in some detail: permanent magnet solenoids, transverse field sources, periodic structures, and very high field structures.

  9. Methodological survey of designed uneven randomization trials (DU-RANDOM): a protocol.

    PubMed

    Wu, Darong; Akl, Elie A; Guyatt, Gordon H; Devereaux, Philip J; Brignardello-Petersen, Romina; Prediger, Barbara; Patel, Krupesh; Patel, Namrata; Lu, Taoying; Zhang, Yuan; Falavigna, Maicon; Santesso, Nancy; Mustafa, Reem A; Zhou, Qi; Briel, Matthias; Schünemann, Holger J

    2014-01-23

    Although even randomization (that is, approximately 1:1 randomization ratio in study arms) provides the greatest statistical power, designed uneven randomization (DUR) (for example, 1:2 or 1:3) is used to increase participation rates. Until now, no convincing data exists addressing the impact of DUR on participation rates in trials. The objective of this study is to evaluate the epidemiology and to explore factors associated with DUR. We will search for reports of RCTs published within two years in 25 general medical journals with the highest impact factor according to the Journal Citation Report (JCR)-2010. Teams of two reviewers will determine eligibility and extract relevant information from eligible RCTs in duplicate and using standardized forms. We will report the prevalence of DUR trials, the reported reasons for using DUR, and perform a linear regression analysis to estimate the association between the randomization ratio and the associated factors, including participation rate, type of informed consent, clinical area, and so on. A clearer understanding of RCTs with DUR and its association with factors in trials, for example, participation rate, can optimize trial design and may have important implications for both researchers and users of the medical literature.
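    The power trade-off the protocol opens with (1:1 allocation maximizes power for a fixed total sample size) can be shown with a standard two-sample power calculation. The effect size, total sample size and significance level below are assumptions chosen only to illustrate the trend, not values from the protocol.

```python
# Power of a two-sample comparison at fixed total n for 1:1, 1:2 and 1:3 allocation.
from statsmodels.stats.power import NormalIndPower

analysis = NormalIndPower()
effect_size, total_n, alpha = 0.4, 300, 0.05   # assumed illustrative values

for ratio in (1.0, 2.0, 3.0):                  # n2 = ratio * n1
    n1 = total_n / (1 + ratio)
    power = analysis.power(effect_size=effect_size, nobs1=n1,
                           alpha=alpha, ratio=ratio)
    print(f"allocation 1:{ratio:.0f} -> power = {power:.3f}")
```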

  10. Solid Waste Management Planning--A Methodology

    ERIC Educational Resources Information Center

    Theisen, Hilary M.; And Others

    1975-01-01

    This article presents a twofold solid waste management plan consisting of a basic design methodology and a decision-making methodology. The former provides a framework for the developing plan while the latter builds flexibility into the design so that there is a model for use during the planning process. (MA)

  11. Prioritizing critical success factors for reverse logistics implementation using fuzzy-TOPSIS methodology

    NASA Astrophysics Data System (ADS)

    Agrawal, Saurabh; Singh, Rajesh K.; Murtaza, Qasim

    2016-03-01

    The electronics industry is one of the fastest growing industries in the world. In India too, there is high turnover and a growing demand for electronic products, especially after liberalization in the early nineties. These products generate e-waste, which has become a big environmental issue. Industries can handle this e-waste and product returns efficiently by developing a reverse logistics (RL) system. A thorough study of critical success factors (CSFs) and their ordered implementation is essential for successful RL implementation. The aim of the study is to review the CSFs and to prioritize them for RL implementation in the Indian electronics industry. Twelve CSFs were identified through a literature review and discussion with experts from the Indian electronics industry. A fuzzy Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) approach is proposed for prioritizing these CSFs. Perusal of the literature indicates that fuzzy-TOPSIS has not been applied earlier for prioritization of CSFs in the Indian electronics industry. Five Indian electronics companies were selected for evaluation of this methodology. Results indicate that most of the identified factors are crucial for RL implementation. Top management awareness, resource management, economic factors, and contract terms and conditions are the top four prioritized factors, and process capabilities and skilled workers is the least prioritized factor. The findings will be useful for successful RL implementation in the Indian electronics industry.
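    The sketch below shows the closeness-coefficient logic at the heart of TOPSIS in a simplified crisp form; the paper uses a fuzzy variant, and the decision matrix, criteria and weights here are invented placeholders, not the study's expert ratings.

```python
# Simplified crisp TOPSIS ranking of candidate critical success factors.
import numpy as np

# Rows = candidate CSFs, columns = evaluation criteria (all benefit-type here).
X = np.array([
    [7.0, 8.0, 6.0],   # e.g. top management awareness
    [6.0, 7.0, 7.0],   # e.g. resource management
    [5.0, 6.0, 8.0],   # e.g. economic factors
    [4.0, 5.0, 5.0],   # e.g. skilled workers
])
w = np.array([0.5, 0.3, 0.2])               # criteria weights (sum to 1)

R = X / np.linalg.norm(X, axis=0)           # vector-normalize each criterion
V = R * w                                   # weighted normalized matrix
ideal, anti = V.max(axis=0), V.min(axis=0)  # positive / negative ideal solutions

d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)         # rank CSFs by descending closeness
print(np.argsort(-closeness), closeness.round(3))
```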

  12. Rapid development of xylanase assay conditions using Taguchi methodology.

    PubMed

    Prasad Uday, Uma Shankar; Bandyopadhyay, Tarun Kanti; Bhunia, Biswanath

    2016-11-01

    The present investigation is mainly concerned with the rapid development of extracellular xylanase assay conditions by using Taguchi methodology. The extracellular xylanase was produced from Aspergillus niger (KP874102.1), a new strain isolated from a soil sample of the Baramura forest, Tripura West, India. Four physical parameters, including temperature, pH, buffer concentration and incubation time, were considered as key factors for xylanase activity and were optimized using Taguchi robust design methodology for enhanced xylanase activity. The main effects, interaction effects and optimal levels of the process factors were determined using the signal-to-noise (S/N) ratio. The Taguchi method recommends the use of the S/N ratio to measure quality characteristics. Based on analysis of the S/N ratio, optimal levels of the process factors were determined. Analysis of variance (ANOVA) was performed to evaluate statistically significant process factors. ANOVA results showed that temperature had the maximum impact (62.58%) on xylanase activity, followed by pH (22.69%), buffer concentration (9.55%) and incubation time (5.16%). Predicted results showed that enhanced xylanase activity (81.47%) can be achieved with pH 2, temperature 50°C, buffer concentration 50 mM and incubation time 10 min.
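    Percent contributions like the 62.58% reported for temperature are each factor's share of the total sum of squares. The sketch below shows that bookkeeping on a small balanced two-factor layout; the design and activity values are placeholders, not the study's L-array or data.

```python
# Percent contribution of each factor = its between-level sum of squares
# divided by the total sum of squares (toy balanced 3x3 layout, invented data).
import numpy as np

levels = {
    "temperature": np.repeat([0, 1, 2], 3),   # 0,0,0,1,1,1,2,2,2
    "pH":          np.tile([0, 1, 2], 3),     # 0,1,2,0,1,2,0,1,2
}
y = np.array([40., 45., 43., 60., 66., 62., 80., 85., 83.])  # hypothetical activities

grand = y.mean()
ss_total = ((y - grand) ** 2).sum()
for name, lv in levels.items():
    ss = sum(len(y[lv == l]) * (y[lv == l].mean() - grand) ** 2
             for l in np.unique(lv))
    print(f"{name:>12}: {100 * ss / ss_total:5.1f}% of total SS")
```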

  13. A consistent methodology for optimal shape design of graphene sheets to maximize their fundamental frequencies considering topological defects

    NASA Astrophysics Data System (ADS)

    Shi, Jin-Xing; Ohmura, Keiichiro; Shimoda, Masatoshi; Lei, Xiao-Wen

    2018-07-01

    In recent years, shape design of graphene sheets (GSs) by introducing topological defects for enhancing their mechanical behaviors has attracted the attention of scholars. In the present work, we propose a consistent methodology for optimal shape design of GSs using a combination of the molecular mechanics (MM) method, the non-parametric shape optimization method, the phase field crystal (PFC) method, Voronoi tessellation, and molecular dynamics (MD) simulation to maximize their fundamental frequencies. At first, we model GSs as continuum frame models using a link between the MM method and continuum mechanics. Then, we carry out optimal shape design of GSs in a fundamental frequency maximization problem based on a developed shape optimization method for frames. However, the obtained optimal shapes of GSs consisting only of hexagonal carbon rings are unstable and do not satisfy the principle of least action, so we relocate carbon atoms on the optimal shapes by introducing topological defects using the PFC method and Voronoi tessellation. Finally, we perform structural relaxation through MD simulation to determine the final optimal shapes of GSs. We design two examples of GSs and the optimal results show that the fundamental frequencies of GSs can be significantly enhanced according to the optimal shape design methodology.

  14. Games and Diabetes: A Review Investigating Theoretical Frameworks, Evaluation Methodologies, and Opportunities for Design Grounded in Learning Theories.

    PubMed

    Lazem, Shaimaa; Webster, Mary; Holmes, Wayne; Wolf, Motje

    2015-09-02

    Here we review 18 articles that describe the design and evaluation of 1 or more games for diabetes from technical, methodological, and theoretical perspectives. We undertook searches covering the period 2010 to May 2015 in the ACM, IEEE, Journal of Medical Internet Research, Studies in Health Technology and Informatics, and Google Scholar online databases using the keywords "children," "computer games," "diabetes," "games," "type 1," and "type 2" in various Boolean combinations. The review sets out to establish, for future research, an understanding of the current landscape of digital games designed for children with diabetes. We briefly explored the use and impact of well-established learning theories in such games. The most frequently mentioned theoretical frameworks were social cognitive theory and social constructivism. Due to the limitations of the reported evaluation methodologies, little evidence was found to support the strong promise of games for diabetes. Furthermore, we could not establish a relation between design features and the game outcomes. We argue that an in-depth discussion about the extent to which learning theories could and should be manifested in the design decisions is required. © 2015 Diabetes Technology Society.

  15. A methodological framework to support the initiation, design and institutionalization of participatory modeling processes in water resources management

    NASA Astrophysics Data System (ADS)

    Halbe, Johannes; Pahl-Wostl, Claudia; Adamowski, Jan

    2018-01-01

    Multiple barriers constrain the widespread application of participatory methods in water management, including the more technical focus of most water agencies, additional cost and time requirements for stakeholder involvement, as well as institutional structures that impede collaborative management. This paper presents a stepwise methodological framework that addresses the challenges of context-sensitive initiation, design and institutionalization of participatory modeling processes. The methodological framework consists of five successive stages: (1) problem framing and stakeholder analysis, (2) process design, (3) individual modeling, (4) group model building, and (5) institutionalized participatory modeling. The Management and Transition Framework is used for problem diagnosis (Stage One), context-sensitive process design (Stage Two) and analysis of requirements for the institutionalization of participatory water management (Stage Five). Conceptual modeling is used to initiate participatory modeling processes (Stage Three) and ensure a high compatibility with quantitative modeling approaches (Stage Four). This paper describes the proposed participatory model building (PMB) framework and provides a case study of its application in Québec, Canada. The results of the Québec study demonstrate the applicability of the PMB framework for initiating and designing participatory model building processes and analyzing barriers towards institutionalization.

  16. FOREWORD: Computational methodologies for designing materials Computational methodologies for designing materials

    NASA Astrophysics Data System (ADS)

    Rahman, Talat S.

    2009-02-01

    It would be fair to say that in the past few decades, theory and computer modeling have played a major role in elucidating the microscopic factors that dictate the properties of functional novel materials. Together with advances in experimental techniques, theoretical methods are becoming increasingly capable of predicting properties of materials at different length scales, thereby bringing within sight the long-sought goal of designing material properties according to need. Advances in computer technology and their availability at a reasonable cost around the world have made it all the more urgent to disseminate what is now known about these modern computational techniques. In this special issue on computational methodologies for materials by design we have tried to solicit articles from authors whose works collectively represent the microcosm of developments in the area. This turned out to be a difficult task for a variety of reasons, not the least of which is space limitation in this special issue. Nevertheless, we gathered twenty articles that represent some of the important directions in which theory and modeling are proceeding in the general effort to capture the ability to produce materials by design. The majority of papers presented here focus on technique developments that are expected to uncover further the fundamental processes responsible for material properties, and for their growth modes and morphological evolutions. As for material properties, some of the articles here address the challenges that continue to emerge from attempts at accurate descriptions of magnetic properties, of electronically excited states, and of sparse matter, all of which demand new looks at density functional theory (DFT). I should hasten to add that much of the success in accurate computational modeling of materials emanates from the remarkable predictive power of DFT, without which we would not be able to place the subject on firm theoretical grounds. As we know and will also

  17. Methodology for the nuclear design validation of an Alternate Emergency Management Centre (CAGE)

    NASA Astrophysics Data System (ADS)

    Hueso, César; Fabbri, Marco; de la Fuente, Cristina; Janés, Albert; Massuet, Joan; Zamora, Imanol; Gasca, Cristina; Hernández, Héctor; Vega, J. Ángel

    2017-09-01

    The methodology is devised by coupling different codes. The study of weather conditions, as part of the site data, determines the relative concentrations of radionuclides in the air using ARCON96. The activity in the air is characterized, depending on the source and release sequence specified in NUREG-1465, by the RADTRAD code, which provides the results for the inner-cloud source term contribution. With the activities known, energy spectra are inferred using ORIGEN-S and used as input for the models of the outer cloud, filters and containment generated with MCNP5. The sum of the different contributions must meet the habitability conditions specified by the CSN (Spanish Nuclear Regulatory Body) (TEDE < 50 mSv and equivalent dose to the thyroid < 500 mSv within the 30 days following the accident), so the dose is optimized by varying parameters such as the CAGE location, the filtering flow, the need for recirculation, thicknesses and compositions of the walls, etc. The results for the most penalizing area meet the established criteria, and therefore the CAGE building design based on the presented methodology is radiologically validated.

  18. Reporting and methodological quality of meta-analyses in urological literature

    PubMed Central

    Xu, Jing

    2017-01-01

    Purpose To assess the overall quality of published urological meta-analyses and identify predictive factors for high quality. Materials and Methods We systematically searched PubMed to identify meta-analyses published from January 1st, 2011 to December 31st, 2015 in 10 predetermined major paper-based urology journals. The characteristics of the included meta-analyses were collected, and their reporting and methodological qualities were assessed by the PRISMA checklist (27 items) and AMSTAR tool (11 items), respectively. Descriptive statistics were used for individual items as a measure of overall compliance, and PRISMA and AMSTAR scores were calculated as the sum of adequately reported domains. Logistic regression was used to identify predictive factors for high quality. Results A total of 183 meta-analyses were included. The mean PRISMA and AMSTAR scores were 22.74 ± 2.04 and 7.57 ± 1.41, respectively. PRISMA item 5, protocol and registration, items 15 and 22, risk of bias across studies, items 16 and 23, additional analysis had less than 50% adherence. AMSTAR item 1, “a priori” design, item 5, list of studies and item 10, publication bias had less than 50% adherence. Logistic regression analyses showed that funding support and “a priori” design were associated with superior reporting quality, while following the PRISMA guideline and “a priori” design were associated with superior methodological quality. Conclusions Reporting and methodological qualities of recently published meta-analyses in major paper-based urology journals are generally good. Further improvement could potentially be achieved by strictly adhering to the PRISMA guideline and having an “a priori” protocol. PMID:28439452

  19. Contentious issues in research on trafficked women working in the sex industry: study design, ethics, and methodology.

    PubMed

    Cwikel, Julie; Hoban, Elizabeth

    2005-11-01

    The trafficking of women and children for work in the globalized sex industry is a global social problem. Quality data is needed to provide a basis for legislation, policy, and programs, but first, numerous research design, ethical, and methodological problems must be addressed. Research design issues in studying women trafficked for sex work (WTSW) include how to (a) develop coalitions to fund and support research, (b) maintain a critical stance on prostitution, and therefore WTSW, (c) use multiple paradigms and methods to accurately reflect WTSW's reality, (d) present the purpose of the study, and (e) protect respondents' identities. Ethical issues include (a) complications with informed consent procedures, (b) problematic access to WTSW, (c) loss of WTSW to follow-up, (d) inability to intervene in illegal acts or human rights violations, and (e) the need to maintain trustworthiness as researchers. Methodological issues include (a) constructing representative samples, (b) managing media interest, and (c) handling incriminating materials about law enforcement and immigration.

  20. Optimization of EGFR high positive cell isolation procedure by design of experiments methodology.

    PubMed

    Levi, Ofer; Tal, Baruch; Hileli, Sagi; Shapira, Assaf; Benhar, Itai; Grabov, Pavel; Eliaz, Noam

    2015-01-01

    Circulating tumor cells (CTCs) in the blood circulation may play a role in monitoring and even in the early detection of metastasis in patients. Due to the limited presence of CTCs in the blood circulation, a viable CTC isolation technology must provide a very high recovery rate. Here, we implement design of experiments (DOE) methodology in order to optimize the Bio-Ferrography (BF) immunomagnetic isolation (IMI) procedure for the EGFR high positive CTC application. All consequent DOE phases, such as screening design, optimization experiments and validation experiments, were used. A significant recovery rate of more than 95% was achieved while isolating 100 EGFR high positive CTCs from 1 mL human whole blood. The recovery achievement in this research positions BF technology as one of the most efficient IMI technologies, which is ready to be challenged with patients' blood samples. © 2015 International Clinical Cytometry Society.

  1. Applications of mixed-methods methodology in clinical pharmacy research.

    PubMed

    Hadi, Muhammad Abdul; Closs, S José

    2016-06-01

    Introduction: Mixed-methods methodology, as the name suggests, refers to the mixing of elements of both qualitative and quantitative methodologies in a single study. In the past decade, mixed-methods methodology has gained popularity among healthcare researchers as it promises to bring together the strengths of both qualitative and quantitative approaches. Methodology: A number of mixed-methods designs are available in the literature and the four most commonly used designs in healthcare research are: the convergent parallel design, the embedded design, the exploratory design, and the explanatory design. Each has its own unique advantages, challenges and procedures, and selection of a particular design should be guided by the research question. Guidance on designing, conducting and reporting mixed-methods research is available in the literature, so it is advisable to adhere to this to ensure methodological rigour. When to use: It is best suited when the research questions require: triangulating findings from different methodologies to explain a single phenomenon; clarifying the results of one method using another method; informing the design of one method based on the findings of another method; development of a scale/questionnaire; and answering different research questions within a single study. Two case studies have been presented to illustrate possible applications of mixed-methods methodology. Limitations: Possessing the necessary knowledge and skills to undertake qualitative and quantitative data collection, analysis, interpretation and integration remains the biggest challenge for researchers conducting mixed-methods studies. Sequential study designs are often time consuming, being in two (or more) phases, whereas concurrent study designs may require more than one data collector to collect both qualitative and quantitative data at the same time.

  2. Compact sieve-tray distillation column for ammonia-water absorption heat pump: Part 1 -- Design methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anand, G.; Erickson, D.C.

    1999-07-01

    The distillation column is a key component of ammonia-water absorption units including advanced generator-absorber heat exchange (GAX) cycle heat pumps. The design of the distillation column is critical to unit performance, size, and cost. The distillation column can be designed with random packing, structured packing, or various tray configurations. A sieve-tray distillation column is the least complicated tray design and is less costly than high-efficiency packing. Substantial literature is available on sieve tray design and performance. However, most of the correlations and design recommendations were developed for large industrial hydrocarbon systems and are generally not directly applicable to the compact ammonia-water column discussed here. The correlations were reviewed and modified as appropriate for this application, and a sieve-tray design model was developed. This paper presents the sieve-tray design methodology for highly compact ammonia-water columns. A conceptual design of the distillation column for an 8 ton vapor exchange (VX) GAX heat pump is presented, illustrating relevant design parameters and trends. The design process revealed several issues that have to be investigated experimentally to design the final optimized rectifier. Validation of flooding and weeping limits and tray/point efficiencies are of primary importance.
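    One representative step behind the flooding limits mentioned above is estimating the flooding vapor velocity from the Souders-Brown relation and sizing the column area for a fraction of flood. The sketch below is not the paper's design model; the capacity factor, property values and vapor flow are assumed round numbers purely to show the calculation.

```python
# Souders-Brown flooding estimate: U_f = C_sb * sqrt((rho_L - rho_V) / rho_V).
# All inputs are assumed illustrative values, not design data from the paper.
from math import sqrt, pi

rho_L, rho_V = 600.0, 8.0      # liquid / vapor densities, kg/m^3 (assumed)
C_sb = 0.06                    # capacity factor, m/s (assumed, chart-derived in practice)
V_dot = 0.012                  # vapor volumetric flow, m^3/s (assumed)
flood_fraction = 0.75          # operate at 75% of flooding

U_flood = C_sb * sqrt((rho_L - rho_V) / rho_V)
U_design = flood_fraction * U_flood
area = V_dot / U_design
diameter = sqrt(4 * area / pi)

print(f"flooding velocity {U_flood:.2f} m/s, column diameter = {diameter*100:.1f} cm")
```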

  3. Energy saving by using asymmetric aftbodies for merchant ships-design methodology, numerical simulation and validation

    NASA Astrophysics Data System (ADS)

    Dang, Jie; Chen, Hao

    2016-12-01

    The methodology and procedures for designing merchant ships with asymmetric aftbodies to achieve fully integrated and optimized hull-propulsion systems are discussed. Computational fluid dynamics (CFD) has been used to evaluate the powering performance through massive calculations with automatic deformation algorithms for the hull forms and the propeller blades. Comparative model tests of the designs against the optimized symmetric hull forms have been carried out to verify the efficiency gain. More than a 6% improvement in the propulsive efficiency of an oil tanker was measured during the model tests. Dedicated sea trials show good agreement with the performance predicted from the test results.

  4. Designing timber highway bridge superstructures using AASHTO?LRFD specifications

    Treesearch

    James P. Wacker; James S. Groenier

    2007-01-01

    The allowable-stress design methodology that has been used for decades to design timber bridge superstructures is being replaced in the near future. Beginning in October 2007, bridge designers will be required by the Federal Highway Administration (FHWA) to utilize the Load and Resistance Factor Design (LRFD) design specifications published by the American Association...

  5. Use of Taguchi methodology to enhance the yield of caffeine removal with growing cultures of Pseudomonas pseudoalcaligenes.

    PubMed

    Ashengroph, Morahem; Ababaf, Sajad

    2014-12-01

    Microbial caffeine removal is a green solution for the treatment of caffeinated products and agro-industrial effluents. We directed this investigation at optimizing a bio-decaffeination process with growing cultures of Pseudomonas pseudoalcaligenes through Taguchi methodology, a structured statistical approach that can lower variation in a process through Design of Experiments (DOE). Five parameters, i.e. initial fructose, tryptone, Zn²⁺ ion and caffeine concentrations and also incubation time, were selected, and an L16 orthogonal array was applied to design experiments with four 4-level factors and one 3-level factor (4⁴ × 3¹). Data analysis was performed using the statistical analysis of variance (ANOVA) method. Furthermore, the optimal conditions were determined by combining the optimal levels of the significant factors and verified by a confirming experiment. Measurement of residual caffeine concentration in the reaction mixture was performed using high-performance liquid chromatography (HPLC). Use of Taguchi methodology for optimization of design parameters resulted in about 86.14% reduction of caffeine in 48 h of incubation when 5 g/l fructose, 3 mM Zn²⁺ ion and 4.5 g/l of caffeine are present in the designed media. Under the optimized conditions, the yield of degradation of caffeine (4.5 g/l) by the native strain of Pseudomonas pseudoalcaligenes TPS8 increased from 15.8% to 86.14%, which is 5.4-fold higher than the normal yield. According to the experimental results, Taguchi methodology provides a powerful approach for identifying the favorable parameters for caffeine removal using strain TPS8, which suggests that the approach also has potential application with similar strains to improve the yield of caffeine removal from caffeine-containing solutions.
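
    To make the orthogonal-array analysis concrete, the sketch below computes larger-is-better signal-to-noise ratios and per-level main effects in Python; the design fragment and response values are placeholders, not the paper's L16 data.

```python
# Hedged sketch: average main effects per factor level from an orthogonal-array
# style experiment, in the spirit of the Taguchi analysis described above.
# The design fragment and responses below are placeholders, not the paper's data.
import numpy as np
import pandas as pd

# Each row is one run; columns are coded factor levels plus the measured response.
runs = pd.DataFrame({
    "fructose":  [1, 1, 2, 2, 3, 3, 4, 4],
    "tryptone":  [1, 2, 1, 2, 3, 4, 3, 4],
    "zinc":      [1, 2, 3, 4, 1, 2, 3, 4],
    "caffeine_removal_pct": [40.0, 52.0, 61.0, 70.0, 55.0, 66.0, 74.0, 86.0],
})

# Larger-is-better signal-to-noise ratio for a single observation: SN = -10*log10(1/y^2).
runs["sn_ratio"] = -10 * np.log10(1.0 / runs["caffeine_removal_pct"] ** 2)

# Main effect of each factor = mean S/N ratio at each of its levels.
for factor in ["fructose", "tryptone", "zinc"]:
    effects = runs.groupby(factor)["sn_ratio"].mean()
    print(f"{factor}: best level = {effects.idxmax()}, mean S/N by level:\n{effects}\n")
```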

  6. The influence of capture-recapture methodology on the evolution of the North American Bird Banding Program

    USGS Publications Warehouse

    Tautin, J.; Lebreton, J.-D.; North, P.M.

    1993-01-01

    Capture-recapture methodology has advanced greatly in the last twenty years and is now a major factor driving the continuing evolution of the North American bird banding program. Bird banding studies are becoming more scientific with improved study designs and analytical procedures. Researchers and managers are gaining more reliable knowledge, which in turn improves the conservation of migratory birds. The advances in capture-recapture methodology have benefited gamebird studies primarily, but nongame bird studies will benefit similarly as they expand greatly in the next decade. Further theoretical development of capture-recapture methodology should be encouraged, and, to maximize benefits of the methodology, work on practical applications should be increased.

  7. Biomarker-Guided Adaptive Trial Designs in Phase II and Phase III: A Methodological Review

    PubMed Central

    Antoniou, Miranta; Jorgensen, Andrea L; Kolamunnage-Dona, Ruwanthi

    2016-01-01

    Background Personalized medicine is a growing area of research which aims to tailor the treatment given to a patient according to one or more personal characteristics. These characteristics can be demographic such as age or gender, or biological such as a genetic or other biomarker. Prior to utilizing a patient’s biomarker information in clinical practice, robust testing in terms of analytical validity, clinical validity and clinical utility is necessary. A number of clinical trial designs have been proposed for testing a biomarker’s clinical utility, including Phase II and Phase III clinical trials which aim to test the effectiveness of a biomarker-guided approach to treatment; these designs can be broadly classified into adaptive and non-adaptive. While adaptive designs allow planned modifications based on accumulating information during a trial, non-adaptive designs are typically simpler but less flexible. Methods and Findings We have undertaken a comprehensive review of biomarker-guided adaptive trial designs proposed in the past decade. We have identified eight distinct biomarker-guided adaptive designs and nine variations from 107 studies. Substantial variability has been observed in terms of how trial designs are described and particularly in the terminology used by different authors. We have graphically displayed the current biomarker-guided adaptive trial designs and summarised the characteristics of each design. Conclusions Our in-depth overview provides future researchers with clarity in definition, methodology and terminology for biomarker-guided adaptive trial designs. PMID:26910238

  8. A Human Factors Evaluation of a Methodology for Pressurized Crew Module Acceptability for Zero-Gravity Ingress of Spacecraft

    NASA Technical Reports Server (NTRS)

    Sanchez, Merri J.

    2000-01-01

    This project aimed to develop a methodology for evaluating the performance and acceptability characteristics of pressurized crew module volume for zero-gravity (g) ingress of a spacecraft and to evaluate the operational acceptability of the NASA crew return vehicle (CRV) for zero-g ingress of astronaut crew, volume for crew tasks, and general crew module and seat layout. No standard or methodology has been established for evaluating volume acceptability in human spaceflight vehicles. Volume affects astronauts' ability to ingress and egress the vehicle, and to maneuver in and perform critical operational tasks inside the vehicle. Much research has been conducted on aircraft ingress, egress, and rescue in order to establish military and civil aircraft standards. However, due to the extremely limited number of human-rated spacecraft, this topic has remained unaddressed. The NASA CRV was used for this study. The prototype vehicle can return a 7-member crew from the International Space Station in an emergency. The vehicle's internal arrangement must be designed to facilitate rapid zero-g ingress, zero-g maneuverability, ease of one-g egress and rescue, and ease of operational tasks in multiple acceleration environments. A full-scale crew module mockup was built and outfitted with representative adjustable seats, crew equipment, and a volumetrically equivalent hatch. Human factors testing was conducted in three acceleration environments using ground-based facilities and the KC-135 aircraft. Performance and acceptability measurements were collected. Data analysis was conducted using analysis of variance and nonparametric techniques.
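
    The abstract mentions analysis of variance and nonparametric techniques; a minimal sketch of that comparison across three acceleration environments is shown below, using invented placeholder ratings rather than the study's measurements.

```python
# Hedged sketch: comparing hypothetical acceptability ratings across three
# acceleration environments with one-way ANOVA and the nonparametric
# Kruskal-Wallis test. The ratings are invented placeholders, not study data.
from scipy import stats

one_g   = [6, 7, 5, 6, 7, 6]   # ground-based mockup ratings (assumed 1-10 scale)
zero_g  = [4, 5, 5, 3, 4, 5]   # KC-135 parabolic-flight ratings (assumed)
multi_g = [5, 6, 5, 4, 6, 5]   # third acceleration environment (assumed)

f_stat, p_anova = stats.f_oneway(one_g, zero_g, multi_g)
h_stat, p_kw = stats.kruskal(one_g, zero_g, multi_g)

print(f"One-way ANOVA: F = {f_stat:.2f}, p = {p_anova:.3f}")
print(f"Kruskal-Wallis: H = {h_stat:.2f}, p = {p_kw:.3f}")
```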

  9. Assessment of seismic design response factors of concrete wall buildings

    NASA Astrophysics Data System (ADS)

    Mwafy, Aman

    2011-03-01

    To verify the seismic design response factors of high-rise buildings, five reference structures, varying in height from 20 to 60 stories, were selected and designed according to modern design codes to represent a wide range of concrete wall structures. Verified fiber-based analytical models for inelastic simulation were developed, considering the geometric nonlinearity and material inelasticity of the structural members. The ground motion uncertainty was accounted for by employing 20 earthquake records representing two seismic scenarios, consistent with the latest understanding of the tectonic setting and seismicity of the selected reference region (UAE). A large number of Inelastic Pushover Analyses (IPAs) and Incremental Dynamic Collapse Analyses (IDCAs) were performed for the reference structures to estimate the seismic design response factors. It is concluded that the factors adopted by the design code are adequately conservative. The results of this systematic assessment of seismic design response factors apply to a wide variety of contemporary concrete wall buildings with various characteristics.
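
    One common way to turn pushover results into a response modification factor is to combine an overstrength factor with a Newmark-Hall style ductility reduction; the sketch below illustrates that arithmetic with assumed numbers and should not be read as the paper's exact procedure.

```python
# Hedged sketch: estimating a seismic response modification factor from pushover
# quantities as overstrength times a Newmark-Hall ductility reduction.
# Illustrative numbers only; not the paper's procedure or results.
import math

def response_modification(v_yield, v_design, disp_ultimate, disp_yield, long_period=True):
    overstrength = v_yield / v_design           # Omega = V_y / V_d
    ductility = disp_ultimate / disp_yield      # mu = delta_u / delta_y
    # Ductility reduction: R_mu = mu for long-period structures,
    # R_mu = sqrt(2*mu - 1) for short-period structures (Newmark-Hall).
    r_mu = ductility if long_period else math.sqrt(2 * ductility - 1)
    return overstrength * r_mu, overstrength, ductility

r, omega, mu = response_modification(v_yield=12_000, v_design=6_000,
                                     disp_ultimate=0.60, disp_yield=0.15)
print(f"Overstrength = {omega:.2f}, ductility = {mu:.2f}, R = {r:.2f}")
```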

  10. Certify for success: A methodology for human-centered certification of advanced aviation systems

    NASA Technical Reports Server (NTRS)

    Small, Ronald L.; Rouse, William B.

    1994-01-01

    This position paper uses the methodology in Design for Success as a basis for a human factors certification program. The Design for Success (DFS) methodology espouses a multi-step process for designing and developing systems in a human-centered fashion. These steps are as follows: (1) naturalizing - understand stakeholders and their concerns; (2) marketing - understand market-oriented alternatives to meeting stakeholder concerns; (3) engineering - detailed design and development of the system considering tradeoffs between technology, cost, schedule, certification requirements, etc.; (4) system evaluation - determining if the system meets its goal(s); and (5) sales and service - delivering and maintaining the system. Because the main topic of this paper is certification, we will focus our attention on step 4, System Evaluation, since it is the natural precursor to certification. Evaluation involves testing the system and its parts for their correct behaviors. Certification focuses not only on ensuring that the system exhibits the correct behaviors, but also that it exhibits only the correct behaviors.

  11. Biomarker-Guided Non-Adaptive Trial Designs in Phase II and Phase III: A Methodological Review

    PubMed Central

    Antoniou, Miranta; Kolamunnage-Dona, Ruwanthi; Jorgensen, Andrea L.

    2017-01-01

    Biomarker-guided treatment is a rapidly developing area of medicine, where treatment choice is personalised according to one or more of an individual’s biomarker measurements. A number of biomarker-guided trial designs have been proposed in the past decade, including both adaptive and non-adaptive trial designs which test the effectiveness of a biomarker-guided approach to treatment with the aim of improving patient health. A better understanding of them is needed as challenges occur both in terms of trial design and analysis. We have undertaken a comprehensive literature review based on an in-depth search strategy with a view to providing the research community with clarity in definition, methodology and terminology of the various biomarker-guided trial designs (both adaptive and non-adaptive designs) from a total of 211 included papers. In the present paper, we focus on non-adaptive biomarker-guided trial designs for which we have identified five distinct main types mentioned in 100 papers. We have graphically displayed each non-adaptive trial design and provided an in-depth overview of their key characteristics. Substantial variability has been observed in terms of how trial designs are described and particularly in the terminology used by different authors. Our comprehensive review provides guidance for those designing biomarker-guided trials. PMID:28125057

  12. Towards viable, useful and usable human factors design guidance.

    PubMed

    Burns, C M; Vicente, K J; Christoffersen, K; Pawlak, W S

    1997-01-01

    This paper investigates the factors relevant to producing effective human factors design guidance, using the Engineering Data Compendium (EDC) as a research vehicle. A series of three exploratory experiments focusing on the factors that affect the usability, usefulness and viability of human factors handbooks was conducted. The results of these studies were interpreted in the context of the process by which the EDC was developed, leading to the following recommendations: (a) human factors guidance should be organized in a manner that is steeped in context; (b) human factors guidance should be based on an explicit requirements analysis; (c) the calibration of designers' perceptions of the cost of obtaining human factors information must be improved; (d) organizational policies must be changed to induce more effective information search behaviour.

  13. [Studies on optimizing preparation technics of wumeitougu oral liquid by response surface methodology].

    PubMed

    Yu, Xiao-cui; Liu, Gao-feng; Wang, Xin

    2011-02-01

    To optimize the preparation technique of wumeitougu oral liquid (WTOL) by response surface methodology. Based on single-factor tests, the number of WTOL extractions, the alcohol precipitation concentration and the pH value were selected as the three factors for a Box-Behnken central composite design. Response surface methodology was used to optimize the parameters of the preparation. Under the conditions of extraction time 1.5 h, 2.772 extractions, relative density 1.12, alcohol precipitation concentration 68.704%, and pH value 5.0, the theoretical maximum content of Asperosaponin VI was 549.908 mg/L. Considering the actual situation, the conditions were amended to three extractions, alcohol precipitation concentration 69%, and pH value 5.0, and the measured content of Asperosaponin VI was 548.63 mg/L, which was close to the theoretical value. The preparation technique of WTOL optimized by response surface methodology is reasonable and feasible.
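
    The generic response-surface step described here, fitting a second-order model to designed-experiment data and locating the optimum numerically, can be sketched as follows; the coded design points and responses are placeholders, not the study's measurements.

```python
# Hedged sketch of the generic response-surface workflow: fit a second-order
# model to designed-experiment data, then locate the optimum numerically.
# Design points and responses below are placeholders, not the paper's data.
import numpy as np
from scipy.optimize import minimize

# Coded settings for three factors (e.g. extractions, precipitation concentration, pH)
# arranged as a Box-Behnken style design (12 edge midpoints + 3 center replicates).
X = np.array([[-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
              [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
              [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
              [0, 0, 0], [0, 0, 0], [0, 0, 0]], dtype=float)
y = np.array([480, 510, 500, 530, 470, 520, 490, 525, 475, 505, 495, 520, 545, 548, 543.0])

def quadratic_terms(x):
    x1, x2, x3 = x
    return np.array([1, x1, x2, x3, x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2])

A = np.array([quadratic_terms(row) for row in X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)          # least-squares fit of the model

# Maximise the fitted surface inside the coded design region [-1, 1]^3.
res = minimize(lambda x: -quadratic_terms(x) @ coef, x0=np.zeros(3), bounds=[(-1, 1)] * 3)
print("Optimum (coded units):", np.round(res.x, 3), "predicted response:", round(-res.fun, 1))
```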

  14. 77 FR 9256 - Design and Methodology for Postmarket Surveillance Studies Under Section 522 of the Federal Food...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-16

    ... (i) Intended to be implanted in the human body for more than 1 year or to be used to sustain or... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2012-N-0123] Design and Methodology for Postmarket Surveillance Studies Under Section 522 of the Federal Food, Drug...

  15. Longitudinal Study on the Lifestyle and Health of University Students (ELESEU): design, methodological procedures, and preliminary results.

    PubMed

    Nogueira, Patrícia Simone; Ferreira, Márcia Gonçalves; Rodrigues, Paulo Rogério Melo; Muraro, Ana Paula; Pereira, Lídia Pitaluga; Pereira, Rosangela Alves

    2018-03-29

    Admission to a university may cause significant changes in the pattern of exposure to health risks. The aim of this paper is to describe the study design and methodological procedures adopted in the Longitudinal Study on the Lifestyle and Health of University Students (ELESEU). This study examines a dynamic cohort of full-time students at a public university in the State of Mato Grosso, Brazil. This research, which started in 2015, will have four years of follow-up and is scheduled to end in 2018. A self-administered questionnaire is applied, containing questions regarding demographic and socioeconomic characteristics, and information on health conditions and risk factors such as lifestyle, perceived stress, symptoms of depression, body image, risk behaviors for eating disorders, self-assessment of health and diet quality, and other issues related to nutrition and health. Anthropometric and blood pressure measurements are also recorded. Two 24-hour dietary recalls and capillary measurements of cholesterol, triglycerides, and glucose are collected from 50% of the students. In 2015, 495 participants (82.6% of the eligible students) were assessed in the baseline study. Of these, 348 (70.3%) were followed up in 2016. In 2016, 566 participants were included in the cohort (81% of the eligible students). This study will help to identify the factors that might influence changes in the nutritional, health, and metabolic status of young adults during college life.

  16. Calibration of the live load factor in LRFD design guidelines : [revised].

    DOT National Transportation Integrated Search

    2011-07-01

    The Load and Resistance Factor Design (LRFD) approach is based on the concept of structural reliability. The approach is more rational than former design approaches such as Load Factor Design or Allowable Stress Design. The LRFD Specification ...

  17. Enviroplan—a summary methodology for comprehensive environmental planning and design

    Treesearch

    Robert Allen Jr.; George Nez; Fred Nicholson; Larry Sutphin

    1979-01-01

    This paper will discuss a comprehensive environmental assessment methodology that includes a numerical method for visual management and analysis. This methodology employs resource and human activity units as a means to produce a visual form unit which is the fundamental unit of the perceptual environment. The resource unit is based on the ecosystem as the fundamental...

  18. State of the Art Methodology for the Design and Analysis of Future Large Scale Evaluations: A Selective Examination.

    ERIC Educational Resources Information Center

    Burstein, Leigh

    Two specific methods of analysis in large-scale evaluations are considered: structural equation modeling and selection modeling/analysis of non-equivalent control group designs. Their utility in large-scale educational program evaluation is discussed. The examination of these methodological developments indicates how people (evaluators,…

  19. Situating methodology within qualitative research.

    PubMed

    Kramer-Kile, Marnie L

    2012-01-01

    Qualitative nurse researchers are required to make deliberate and sometimes complex methodological decisions about their work. Methodology in qualitative research is a comprehensive approach in which theory (ideas) and method (doing) are brought into close alignment. It can be difficult, at times, to understand the concept of methodology. The purpose of this research column is to: (1) define qualitative methodology; (2) illuminate the relationship between epistemology, ontology and methodology; (3) explicate the connection between theory and method in qualitative research design; and (4) highlight relevant examples of methodological decisions made within cardiovascular nursing research. Although there is no "one set way" to do qualitative research, all qualitative researchers should account for the choices they make throughout the research process and articulate their methodological decision-making along the way.

  20. The factor structure of posttraumatic stress disorder: a literature update, critique of methodology, and agenda for future research.

    PubMed

    Elhai, Jon D; Palmieri, Patrick A

    2011-08-01

    We present an update of recent literature (since 2007) exploring the factor structure of posttraumatic stress disorder (PTSD) symptom measures. Research supporting a four-factor emotional numbing model and a four-factor dysphoria model is presented, with these models fitting better than all other models examined. Variables accounting for factor structure differences are reviewed, including PTSD query instructions, type of PTSD measure, extent of trauma exposure, ethnicity, and timing of administration. Methodological and statistical limitations with recent studies are presented. Finally, a research agenda and recommendations are offered to push this research area forward, including suggestions to validate PTSD’s factors against external measures of psychopathology, test moderators of factor structure, and examine heterogeneity of symptom presentations based on factor structure examination.

  1. 77 FR 66471 - Methodology for Designation of Frontier and Remote Areas

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-05

    ... the use of a shorter, more intuitively appealing descriptive label in research publications and other...) Selection of final methodological approach; and (8) Analyses using final methodology on 2000 data. All the...

  2. A new design methodology of obtaining wide band high gain broadband parametric source for infrared wavelength applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maji, Partha Sona; Roy Chaudhuri, Partha

    In this article, we present a new design methodology for obtaining wideband parametric sources based on the highly nonlinear chalcogenide material As₂S₃. The dispersion profile of the photonic crystal fiber (PCF) has been carefully engineered by reducing the diameter of the second air-hole ring to obtain a favorable higher-order dispersion parameter. The dependence of the parametric gain on fiber length, pump power, and pumping wavelength has been investigated in detail. Based upon the nonlinear four-wave mixing phenomenon, we are able to achieve a wideband parametric amplifier with a peak gain of 29 dB and an FWHM of ≈2000 nm around the IR wavelength by proper tailoring of the dispersion profile of the PCF, with a continuous-wave Er³⁺-doped ZBLAN fiber laser emitting at 2.8 μm as the pump source with an average power of 5 W. The new design methodology opens a new dimension for chalcogenide-based investigations of wavelength translation in the IR band.
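
    The wideband gain described above follows from the textbook single-pump parametric gain relations for four-wave mixing; the sketch below evaluates those relations with assumed nonlinear-coefficient, pump-power and fiber-length values, not the paper's engineered dispersion profile.

```python
# Hedged sketch: textbook single-pump fiber parametric gain for four-wave mixing.
# Parameter values are illustrative assumptions, not the paper's PCF design.
import numpy as np

gamma = 2.0          # nonlinear coefficient, 1/(W*m) (assumed for a chalcogenide PCF)
P_pump = 5.0         # pump power, W
L = 0.1              # fiber length, m
delta_beta = np.linspace(-4 * gamma * P_pump, 0, 200)  # linear phase mismatch, 1/m

kappa = delta_beta + 2 * gamma * P_pump                # total phase mismatch
g_sq = (gamma * P_pump) ** 2 - (kappa / 2) ** 2        # parametric gain coefficient squared
g = np.sqrt(np.abs(g_sq)) + 1e-12

# Signal gain: G = 1 + (gamma*P/g)^2 * sinh(g*L)^2 in the exponential regime (g^2 > 0),
# and G = 1 + (gamma*P)^2 * (sin(|g|L)/|g|)^2 in the oscillatory regime (g^2 < 0).
gain = np.where(g_sq > 0,
                1 + (gamma * P_pump / g * np.sinh(g * L)) ** 2,
                1 + (gamma * P_pump * L) ** 2 * np.sinc(g * L / np.pi) ** 2)

print(f"Peak parametric gain: {10 * np.log10(gain.max()):.1f} dB")
```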

  3. An experimental strategy validated to design cost-effective culture media based on response surface methodology.

    PubMed

    Navarrete-Bolaños, J L; Téllez-Martínez, M G; Miranda-López, R; Jiménez-Islas, H

    2017-07-03

    For any fermentation process, the production cost depends on several factors, such as the genetics of the microorganism, the process conditions, and the culture medium composition. In this work, a guideline for the design of cost-efficient culture media using a sequential approach based on response surface methodology is described. The procedure was applied to analyze and optimize a registered-trademark culture medium and a base culture medium obtained as a result of a screening analysis of different culture media used to grow the same strain according to the literature. During the experiments, the procedure quantitatively identified an appropriate array of micronutrients to obtain a significant yield and to find a minimum number of culture medium ingredients without limiting the process efficiency. The resulting culture medium showed an efficiency that compares favorably with the registered-trademark medium at 95% lower cost, and it reduced the number of ingredients in the base culture medium by 60% without limiting the process efficiency. These results demonstrated that, aside from satisfying the qualitative requirements, an optimum quantity of each constituent is needed to obtain a cost-effective culture medium. Studying process variables for the optimized culture medium and scaling up production at the optimal values are desirable next steps.

  4. Archetype modeling methodology.

    PubMed

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although there exist many experiences about using archetypes in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can be also a reference for CIMs development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.

  5. Psychosocial Factors as Predictors of Mentoring among Nurses in Southwestern Nigeria

    ERIC Educational Resources Information Center

    Salami, Samuel O.

    2008-01-01

    Purpose: The purpose of this paper is to examine the psychosocial factors that predict mentoring among nurses. Design/methodology/approach: This study adopted a survey research design. Questionnaires were used to collect data on self-esteem, locus of control, emotional intelligence and demographic factors from 480 nurses (males 230; females = 250)…

  6. Methodological quality of meta-analyses of single-case experimental studies.

    PubMed

    Jamshidi, Laleh; Heyvaert, Mieke; Declercq, Lies; Fernández-Castilla, Belén; Ferron, John M; Moeyaert, Mariola; Beretvas, S Natasha; Onghena, Patrick; Van den Noortgate, Wim

    2017-12-28

    Methodological rigor is a fundamental factor in the validity and credibility of the results of a meta-analysis. Following an increasing interest in single-case experimental design (SCED) meta-analyses, the current study investigates the methodological quality of SCED meta-analyses. We assessed the methodological quality of 178 SCED meta-analyses published between 1985 and 2015 through the modified Revised-Assessment of Multiple Systematic Reviews (R-AMSTAR) checklist. The main finding of the current review is that the methodological quality of the SCED meta-analyses has increased over time, but is still low according to the R-AMSTAR checklist. A remarkable percentage of the studies (93.80% of the included SCED meta-analyses) did not even reach the midpoint score (22, on a scale of 0-44). The mean and median methodological quality scores were 15.57 and 16, respectively. Relatively high scores were observed for "providing the characteristics of the included studies" and "doing comprehensive literature search". The key areas of deficiency were "reporting an assessment of the likelihood of publication bias" and "using the methods appropriately to combine the findings of studies". Although the results of the current review reveal that the methodological quality of the SCED meta-analyses has increased over time, still more efforts are needed to improve their methodological quality. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Design of Economic Evaluations of Mindfulness-Based Interventions: Ten Methodological Questions of Which to Be Mindful.

    PubMed

    Edwards, Rhiannon Tudor; Bryning, Lucy; Crane, Rebecca

    Mindfulness-based interventions (MBIs) are being increasingly applied in a variety of settings. A growing body of evidence to support the effectiveness of these interventions exists and there are a few published cost-effectiveness studies. With limited resources available within public sectors (health care, social care, and education), it is necessary to build in concurrent economic evaluations alongside trials in order to inform service commissioning and policy. If future research studies are well-designed, they have strong potential to investigate the economic impact of MBIs. The particular challenge to the health economist is how best to capture the ways that MBIs help people adjust to or build resilience to difficult life circumstances, and to disseminate effectively to enable policy makers to judge the value of the contribution that MBIs can make within the context of the limited resourcing of public services. In anticipation of more research worldwide evaluating MBIs in various settings, this article suggests ten health economics methodological design questions that researchers may want to consider prior to conducting MBI research. These questions draw on both published standards of good methodological practice in economic evaluation of medical interventions, and on the authors' knowledge and experience of mindfulness-based practice. We argue that it is helpful to view MBIs as both complex interventions and as public health prevention initiatives. Our suggestions for well-designed economic evaluations of MBIs in health and other settings, mirror current thinking on the challenges and opportunities of public health economics.

  8. Towards Methodologies for Building Knowledge-Based Instructional Systems.

    ERIC Educational Resources Information Center

    Duchastel, Philippe

    1992-01-01

    Examines the processes involved in building instructional systems that are based on artificial intelligence and hypermedia technologies. Traditional instructional systems design methodology is discussed; design issues including system architecture and learning strategies are addressed; and a new methodology for building knowledge-based…

  9. Economic and Cultural Factors Affecting University Excellence

    ERIC Educational Resources Information Center

    Jabnoun, Naceur

    2009-01-01

    Purpose: The ranking of top universities in the world has generated increased interest in the factors that enhance university performance. The purpose of this paper is to identify economic and cultural factors that affect the number of top ranking universities in each country. Design/methodology/approach: This paper first identifies the number of…

  10. Designing Knowledge Scaffolds to Support Mathematical Problem Solving

    ERIC Educational Resources Information Center

    Rittle-Johnson, Bethany; Koedinger, Kenneth R.

    2005-01-01

    We present a methodology for designing better learning environments. In Phase 1, 6th-grade students' (n = 223) prior knowledge was assessed using a difficulty factors assessment (DFA). The assessment revealed that scaffolds designed to elicit contextual, conceptual, or procedural knowledge each improved students' ability to add and subtract…

  11. IMPAC: An Integrated Methodology for Propulsion and Airframe Control

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay; Ouzts, Peter J.; Lorenzo, Carl F.; Mattern, Duane L.

    1991-01-01

    The National Aeronautics and Space Administration is actively involved in the development of enabling technologies that will lead towards aircraft with new/enhanced maneuver capabilities such as Short Take-Off Vertical Landing (STOVL) and high angle of attack performance. Because of the high degree of dynamic coupling between the airframe and propulsion systems of these types of aircraft, one key technology is the integration of the flight and propulsion control. The NASA Lewis Research Center approach to developing Integrated Flight Propulsion Control (IFPC) technologies is an in-house research program referred to as IMPAC (Integrated Methodology for Propulsion and Airframe Control). The goals of IMPAC are to develop a viable alternative to the existing integrated control design methodologies that will allow for improved system performance and simplicity of control law synthesis and implementation, and to demonstrate the applicability of the methodology to a supersonic STOVL fighter aircraft. Based on some preliminary control design studies that included evaluation of the existing methodologies, the IFPC design methodology that is emerging at the Lewis Research Center consists of considering the airframe and propulsion system as one integrated system for an initial centralized controller design and then partitioning the centralized controller into separate airframe and propulsion system subcontrollers to ease implementation and to set meaningful design requirements for detailed subsystem control design and evaluation. An overview of IMPAC is provided and detailed discussion of the various important design and evaluation steps in the methodology are included.

  12. Factors Related to Successful Engineering Team Design

    NASA Technical Reports Server (NTRS)

    Nowaczyk, Ronald H.; Zang, Thomas A.

    1998-01-01

    The perceptions of a sample of 49 engineers and scientists from NASA Langley Research Center toward engineering design teams were evaluated. The respondents rated 60 team behaviors in terms of their relative importance for team success. They also completed a profile of their own perceptions of their strengths and weaknesses as team members. Behaviors related to team success are discussed in terms of those involving the organizational culture and commitment to the team and those dealing with internal team dynamics. The latter behaviors included the level and extent of debate and discussion regarding methods for completing the team task and the efficient use of team time to explore and discuss methodologies critical to the problem. Successful engineering teams may find their greatest challenges occurring during the early stages of their existence. In contrast to the prototypical business team, members of an engineering design team share expertise and knowledge, which allows them to deal with task issues sooner. However, discipline differences among team members can lead to conflicts regarding the best method or approach to solving the engineering problem.

  13. A novel integrated framework and improved methodology of computer-aided drug design.

    PubMed

    Chen, Calvin Yu-Chian

    2013-01-01

    Computer-aided drug design (CADD) is a critical initiating step of drug development, but a single model capable of covering all designing aspects remains to be elucidated. Hence, we developed a drug design modeling framework that integrates multiple approaches, including machine learning based quantitative structure-activity relationship (QSAR) analysis, 3D-QSAR, Bayesian network, pharmacophore modeling, and a structure-based docking algorithm. Restrictions for each model were defined for improved individual and overall accuracy. An integration method was applied to join the results from each model to minimize bias and errors. In addition, the integrated model adopts both static and dynamic analysis to validate the intermolecular stabilities of the receptor-ligand conformation. The proposed protocol was applied to identifying HER2 inhibitors from traditional Chinese medicine (TCM) as an example for validating our new protocol. Eight potent leads were identified from six TCM sources. A joint validation system comprising comparative molecular field analysis, comparative molecular similarity indices analysis, and molecular dynamics simulation further characterized the candidates into three potential binding conformations and validated the binding stability of each protein-ligand complex. Ligand pathway analysis was also performed to predict ligand entry into and exit from the binding site. In summary, we propose a novel systematic CADD methodology for the identification, analysis, and characterization of drug-like candidates.

  14. Acoustic methodology review

    NASA Technical Reports Server (NTRS)

    Schlegel, R. G.

    1982-01-01

    It is important for industry and NASA to assess the status of acoustic design technology for predicting and controlling helicopter external noise in order for a meaningful research program to be formulated which will address this problem. The prediction methodologies available to the designer and the acoustic engineer are three-fold. First is what has been described as a first principle analysis. This analysis approach attempts to remove any empiricism from the analysis process and deals with a theoretical mechanism approach to predicting the noise. The second approach attempts to combine first principle methodology (when available) with empirical data to formulate source predictors which can be combined to predict vehicle levels. The third is an empirical analysis, which attempts to generalize measured trends into a vehicle noise prediction method. This paper will briefly address each.

  15. Design methodology and results evaluation of a heating functionality in modular lab-on-chip systems

    NASA Astrophysics Data System (ADS)

    Streit, Petra; Nestler, Joerg; Shaporin, Alexey; Graunitz, Jenny; Otto, Thomas

    2018-06-01

    Lab-on-a-chip (LoC) systems offer the opportunity of fast and customized biological analyses executed at the 'point-of-need' without expensive lab equipment. Some biological processes need a temperature treatment. Therefore, it is important to ensure a defined and stable temperature distribution in the biosensor area. An integrated heating functionality is realized with discrete resistive heating elements including temperature measurement. The focus of this contribution is a design methodology and evaluation technique for the temperature distribution in the biosensor area with regard to the thermal-electrical behaviour of the heat sources. Furthermore, a sophisticated control of the biosensor temperature is proposed. A finite element (FE) model with one and more integrated heat sources in a polymer-based LoC system is used to investigate the impact of the number and arrangement of heating elements on the temperature distribution around the heating elements and in the biosensor area. Based on this model, various LoC systems are designed and fabricated. Electrical characterization of the heat sources and independent temperature measurements with infrared techniques are performed to verify the model parameters and prove the simulation approach. The FE model and the proposed methodology are the foundation for optimization and evaluation of new designs with regard to temperature requirements of the biosensor. Furthermore, a linear dependency of the heater temperature on the electric current is demonstrated in the targeted temperature range of 20 °C to 70 °C, enabling the usage of the heating functionality for biological reactions requiring a steady-state temperature up to 70 °C. The correlation between heater and biosensor area temperature is derived for direct control through the heating current.
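
    Since the paper reports a linear dependence of heater temperature on heating current in the 20 °C to 70 °C range, a simple calibration-and-inversion step, sketched below with placeholder calibration points, suffices to pick a drive current for a target biosensor temperature.

```python
# Hedged sketch: fit the reported linear heater-temperature vs current relationship
# and invert it to choose a drive current. Calibration points are placeholders,
# not measurements from the paper.
import numpy as np

current_mA = np.array([0.0, 10.0, 20.0, 30.0, 40.0])      # drive current (assumed)
heater_temp_C = np.array([20.0, 32.5, 45.0, 57.5, 70.0])  # heater temperature (assumed)

slope, intercept = np.polyfit(current_mA, heater_temp_C, 1)  # linear calibration fit

def current_for_temperature(target_C):
    """Invert the calibration line to obtain the drive current for a target temperature."""
    return (target_C - intercept) / slope

print(f"T = {slope:.2f} * I + {intercept:.2f}")
print(f"Current for 65 degC setpoint: {current_for_temperature(65.0):.1f} mA")
```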

  16. [Purification Technology Optimization for Saponins from Ziziphi Spinosae Semen with Macroporous Adsorption Resin by Box-Behnken Design-Response Surface Methodology].

    PubMed

    Zhao, Hui-ru; Ren, Zao; Liu, Chun-ye

    2015-04-01

    To compare the purification effect of saponins from Ziziphi Spinosae Semen with different types of macroporous adsorption resin, and to optimize the purification technology. The type of macroporous resin was optimized by the static adsorption method. The optimum technological conditions for saponins from Ziziphi Spinosae Semen were screened by single-factor tests and Box-Behnken Design-Response Surface Methodology. AB-8 macroporous resin had a better purification effect on total saponins than the other resins; the optimum technological parameters were as follows: column height-diameter ratio 5:1, sample solution concentration 2.52 mg/mL, resin adsorption quantity 8.915 mg/g, elution with 3 BV of water, adsorption and elution flow rate 2 BV/h, elution solvent 75% ethanol, and elution solvent volume 5 BV. AB-8 macroporous resin has a good purification effect on jujuboside A. The optimized technology is stable and feasible.

  17. A design methodology for neutral buoyancy simulation of space operations

    NASA Technical Reports Server (NTRS)

    Akin, David L.

    1988-01-01

    Neutral buoyancy has often been used in the past for EVA development activities, but little has been done to provide an analytical understanding of the environment and its correlation with space. This paper covers a set of related research topics at the MIT Space Systems Laboratory, dealing with the modeling of the space and underwater environments, validation of the models through testing in neutral buoyancy, parabolic flight, and space flight experiments, and applications of the models to gain a better design methodology for creating meaningful neutral buoyancy simulations. Examples covered include simulation validation criteria for human body dynamics, and for applied torques in a beam rotation task, which is the pacing crew operation for EVA structural assembly. Extensions of the dynamics models are presented for powered vehicles in the underwater environment, and examples given from the MIT Space Telerobotics Research Program, including the Beam Assembly Teleoperator and the Multimode Proximity Operations Device. Future expansions of the modeling theory are also presented, leading to remote vehicles which behave in neutral buoyancy exactly as the modeled system would in space.

  18. Application-specific coarse-grained reconfigurable array: architecture and design methodology

    NASA Astrophysics Data System (ADS)

    Zhou, Li; Liu, Dongpei; Zhang, Jianfeng; Liu, Hengzhu

    2015-06-01

    Coarse-grained reconfigurable arrays (CGRAs) have shown potential for application in embedded systems in recent years. Numerous reconfigurable processing elements (PEs) in CGRAs provide flexibility while maintaining high performance by exploring different levels of parallelism. However, a difference remains between the CGRA and the application-specific integrated circuit (ASIC). Some application domains, such as software-defined radios (SDRs), require flexibility as performance demands increase. More effective CGRA architectures are expected to be developed. Customisation of a CGRA according to its application can improve performance and efficiency. This study proposes an application-specific CGRA architecture template composed of generic PEs (GPEs) and special PEs (SPEs). The hardware of the SPE can be customised to accelerate specific computational patterns. An automatic design methodology that includes pattern identification and application-specific function unit generation is also presented. A mapping algorithm based on ant colony optimisation is provided. Experimental results on the SDR target domain show that compared with other ordinary and application-specific reconfigurable architectures, the CGRA generated by the proposed method performs more efficiently for given applications.

  19. A study of commuter airplane design optimization

    NASA Technical Reports Server (NTRS)

    Roskam, J.; Wyatt, R. D.; Griswold, D. A.; Hammer, J. L.

    1977-01-01

    Problems of commuter airplane configuration design were studied to effect a minimization of direct operating costs. Factors considered were the minimization of fuselage drag, methods of wing design, and the estimated drag of an airplane submerged in a propeller slipstream; all design criteria were studied under a set of fixed performance, mission, and stability constraints. Configuration design data were assembled for application by a computerized design methodology program similar to the NASA-Ames General Aviation Synthesis Program.

  20. Satisfiers and Dissatisfiers: A Two-Factor Model for Website Design and Evaluation.

    ERIC Educational Resources Information Center

    Zhang, Ping; von Dran, Gisela M.

    2000-01-01

    Investigates Web site design factors and their impact from a theoretical perspective. Presents a two-factor model that can guide Web site design and evaluation. According to the model, there are two types of design factors: hygiene and motivator. Results showed that the two-factor model provides a means for Web-user interface studies. Provides…

  1. Influencing Factors of Female Underrepresentation as School Principals in Indonesia

    ERIC Educational Resources Information Center

    Airin, Rashidah

    2010-01-01

    Purpose -- The number of women in the school principalship in Indonesia is less than half that of men. This paper aims to identify the factors behind the underrepresentation of women in the principalship. Design/methodology/approach -- The methodological approach utilised in this paper was a structured review of the literature. Twenty sources namely…

  2. Improvement of the Assignment Methodology of the Approach Embankment Design to Highway Structures in Difficult Conditions

    NASA Astrophysics Data System (ADS)

    Chistyy, Y.; Kuzakhmetova, E.; Fazilova, Z.; Tsukanova, O.

    2018-03-01

    Design issues at the junction of bridges and overhead roads with approach embankments are studied. The reasons for the formation of deformations in the road structure are indicated. Measures to ensure the stability of, and to accelerate the settlement of, a weak subgrade beneath the approach embankment are listed. The necessity of taking into account the man-made impact of the approach embankment on subgrade behavior is demonstrated. Modern stabilizing agents to improve the properties of the soils used in the embankment and the subgrade are suggested. A clarified methodology for determining the active zone of compression in the subgrade under the load of the embankment weight is described. As an addition to the existing methodology for establishing the lower bound of the active zone of compression, it is proposed to account for the accuracy of the soil compressibility evaluation when determining settlement.

  3. Response surface methodology for the determination of the design space of enantiomeric separations on cinchona-based zwitterionic chiral stationary phases by high performance liquid chromatography.

    PubMed

    Hanafi, Rasha Sayed; Lämmerhofer, Michael

    2018-01-26

    The Quality-by-Design approach to enantioselective HPLC method development surpasses Quality-by-Testing in offering the optimal separation conditions with the least number of experiments and in its ability to describe the method's Design Space visually, which helps to determine enantiorecognition to a significant extent. Although some schemes exist for enantiomeric separations on Cinchona-based zwitterionic stationary phases, the exact design space and the weights by which each of the chromatographic parameters influences the separation have not yet been statistically studied. In the current work, a screening design followed by a Response Surface Methodology optimization design was adopted for enantioseparation optimization of three model drugs, namely the acidic Fmoc-leucine, the amphoteric tryptophan and the basic salbutamol. The screening design proved that the acid/base additives are of utmost importance for the three chiral drugs, and that among three different pairs of acids and bases, acetic acid and diethylamine is the pair able to provide acceptable resolution under variable conditions. Visualization of the response surfaces of the retention factor, separation factor and resolution helped describe accurately the magnitude by which each chromatographic factor (% MeOH, concentration and ratio of acid/base modifiers) affects the separation while interacting with other parameters. The global optima, compromising between the highest enantioresolution and the shortest run time, varied markedly among the three chiral model drugs: it was best to set a low % methanol with an equal ratio of acid and base modifiers for the acidic drug, a very high % methanol and a 10-fold higher concentration of the acid for the amphoteric drug, while a 20-fold excess of the base modifier with moderate % methanol was needed for the basic drug. Considering the selected drugs as models for many series of structurally related compounds, the design space defined and the optimum conditions computed are the key for method development on

  4. A Risk Analysis Methodology to Address Human and Organizational Factors in Offshore Drilling Safety: With an Emphasis on Negative Pressure Test

    NASA Astrophysics Data System (ADS)

    Tabibzadeh, Maryam

    According to the final Presidential National Commission report on the BP Deepwater Horizon (DWH) blowout, there is a need to "integrate more sophisticated risk assessment and risk management practices" in the oil industry. Reviewing the literature of the offshore drilling industry indicates that most of the developed risk analysis methodologies do not fully and, more importantly, systematically address the contribution of Human and Organizational Factors (HOFs) in accident causation. Meanwhile, the results of a comprehensive study, from 1988 to 2005, of more than 600 well-documented major failures in offshore structures show that approximately 80% of those failures were due to HOFs. In addition, lack of safety culture, as an issue related to HOFs, has been identified as a common contributing cause of many accidents in this industry. This dissertation introduces an integrated risk analysis methodology to systematically assess the critical role of human and organizational factors in offshore drilling safety. The proposed methodology in this research focuses on a specific procedure called the Negative Pressure Test (NPT), as the primary method to ascertain well integrity during offshore drilling, and analyzes the contributing causes of misinterpreting such a critical test. In addition, the case study of the BP Deepwater Horizon accident and the NPT conducted by its crew is discussed. The risk analysis methodology in this dissertation consists of three different approaches, and their integration constitutes the overall methodology. The first approach is the comparative analysis of a "standard" NPT, proposed by the author, with the test conducted by the DWH crew. This analysis contributes to identifying the discrepancies between the two test procedures. The second approach is a conceptual risk assessment framework to analyze the causal factors of the mismatches identified in the previous step, as the main contributors of negative pressure test

  5. Developing a methodology to assess the impact of research grant funding: a mixed methods approach.

    PubMed

    Bloch, Carter; Sørensen, Mads P; Graversen, Ebbe K; Schneider, Jesper W; Schmidt, Evanthia Kalpazidou; Aagaard, Kaare; Mejlgaard, Niels

    2014-04-01

    This paper discusses the development of a mixed methods approach to analyse research funding. Research policy has taken on an increasingly prominent role in the broader political scene, where research is seen as a critical factor in maintaining and improving growth, welfare and international competitiveness. This has motivated growing emphasis on the impacts of science funding, and how funding can best be designed to promote socio-economic progress. Meeting these demands for impact assessment involves a number of complex issues that are difficult to fully address in a single study or in the design of a single methodology. However, they point to some general principles that can be explored in methodological design. We draw on a recent evaluation of the impacts of research grant funding, discussing both key issues in developing a methodology for the analysis and subsequent results. The case of research grant funding, involving a complex mix of direct and intermediate effects that contribute to the overall impact of funding on research performance, illustrates the value of a mixed methods approach to provide a more robust and complete analysis of policy impacts. Reflections on the strengths and weaknesses of the methodology are used to examine refinements for future work. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. Developing the Next Generation Shell Buckling Design Factors and Technologies

    NASA Technical Reports Server (NTRS)

    Hilburger, Mark W.

    2012-01-01

    NASA's Shell Buckling Knockdown Factor (SBKF) Project was established in the spring of 2007 by the NASA Engineering and Safety Center (NESC) in collaboration with the Constellation Program and Exploration Systems Mission Directorate. The SBKF project has the current goal of developing less-conservative, robust shell buckling design factors (a.k.a. knockdown factors) and design and analysis technologies for light-weight stiffened metallic launch vehicle (LV) structures. Preliminary design studies indicate that implementation of these new knockdown factors can enable significant reductions in mass and mass-growth in these vehicles and can help mitigate some of NASA's LV development and performance risks. In particular, it is expected that the results from this project will help reduce the reliance on testing, provide high-fidelity estimates of structural performance, reliability, robustness, and enable increased payload capability. The SBKF project objectives and approach used to develop and validate new design technologies are presented, and provide a glimpse into the future of design of the next generation of buckling-critical launch vehicle structures.
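
    For context, the legacy design factor that SBKF aims to improve upon is typified by the empirical NASA SP-8007 knockdown for axially compressed isotropic cylinders; the sketch below evaluates that classical formula with illustrative geometry and is not the project's new factor.

```python
# Hedged sketch: the classical NASA SP-8007 empirical knockdown factor for axially
# compressed isotropic cylinders, i.e. the kind of legacy design factor the SBKF
# project aims to replace. Geometry and material values are illustrative only.
import math

def sp8007_knockdown(radius, thickness):
    """gamma = 1 - 0.901 * (1 - exp(-phi)),  phi = (1/16) * sqrt(R/t)."""
    phi = (1.0 / 16.0) * math.sqrt(radius / thickness)
    return 1.0 - 0.901 * (1.0 - math.exp(-phi))

def classical_buckling_stress(E, nu, radius, thickness):
    """Classical axial buckling stress of a cylinder: sigma = E*t / (R*sqrt(3*(1-nu^2)))."""
    return E * thickness / (radius * math.sqrt(3.0 * (1.0 - nu ** 2)))

E, nu = 70e9, 0.33   # aluminum-like properties (assumed)
R, t = 4.0, 0.005    # 4 m radius, 5 mm wall (illustrative launch-vehicle scale)

gamma = sp8007_knockdown(R, t)
sigma_design = gamma * classical_buckling_stress(E, nu, R, t)
print(f"Knockdown factor: {gamma:.3f}, design buckling stress: {sigma_design / 1e6:.1f} MPa")
```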

  7. Factors Influencing Teachers' Engagement in Informal Learning Activities

    ERIC Educational Resources Information Center

    Lohman, Margaret C.

    2006-01-01

    Purpose: The purpose of this study is to examine factors influencing the engagement of public school teachers in informal learning activities. Design/methodology/approach: This study used a survey research design. Findings: Analysis of the data found that teachers rely to a greater degree on interactive than on independent informal learning…

  8. A methodology to leverage cross-sectional accelerometry to capture weather's influence in active living research.

    PubMed

    Katapally, Tarun R; Rainham, Daniel; Muhajarine, Nazeem

    2016-06-27

    While active living interventions focus on modifying urban design and built environment, weather variation, a phenomenon that perennially interacts with these environmental factors, is consistently underexplored. This study's objective is to develop a methodology to link weather data with existing cross-sectional accelerometry data in capturing weather variation. Saskatoon's neighbourhoods were classified into grid-pattern, fractured grid-pattern and curvilinear neighbourhoods. Thereafter, 137 Actical accelerometers were used to derive moderate to vigorous physical activity (MVPA) and sedentary behaviour (SB) data from 455 children in 25 sequential one-week cycles between April and June, 2010. This sequential deployment was necessary to overcome the difference in the ratio between the sample size and the number of accelerometers. A data linkage methodology was developed, where each accelerometry cycle was matched with localized (Saskatoon-specific) weather patterns derived from Environment Canada. Statistical analyses were conducted to depict the influence of urban design on MVPA and SB after factoring in localized weather patterns. Integration of cross-sectional accelerometry with localized weather patterns allowed the capture of weather variation during a single seasonal transition. Overall, during the transition from spring to summer in Saskatoon, MVPA increased and SB decreased during warmer days. After factoring in localized weather, a recurring observation was that children residing in fractured grid-pattern neighbourhoods accumulated significantly lower MVPA and higher SB. The proposed methodology could be utilized to link globally available cross-sectional accelerometry data with place-specific weather data to understand how built and social environmental factors interact with varying weather patterns in influencing active living.
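
    A minimal sketch of the cycle-level linkage is shown below: each one-week accelerometer deployment cycle is joined to cycle-averaged weather records; the column names and values are assumptions, not the study's data.

```python
# Hedged sketch: joining one-week accelerometer deployment cycles to cycle-averaged
# weather records, in the spirit of the linkage described above. Column names and
# values are assumptions, not the study's actual data.
import pandas as pd

activity = pd.DataFrame({
    "child_id": [1, 2, 3, 4],
    "cycle": [1, 1, 2, 2],                 # sequential one-week deployment cycle
    "mvpa_min_per_day": [42.0, 55.0, 61.0, 48.0],
    "sedentary_min_per_day": [510.0, 470.0, 430.0, 495.0],
})

weather = pd.DataFrame({
    "cycle": [1, 2],
    "mean_temp_c": [8.5, 14.2],            # weekly means from weather records (assumed)
    "total_precip_mm": [6.0, 1.5],
})

linked = activity.merge(weather, on="cycle", how="left")
print(linked.groupby("cycle")[["mvpa_min_per_day", "mean_temp_c"]].mean())
```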

  9. Applications of a damage tolerance analysis methodology in aircraft design and production

    NASA Technical Reports Server (NTRS)

    Woodward, M. R.; Owens, S. D.; Law, G. E.; Mignery, L. A.

    1992-01-01

    Objectives of customer mandated aircraft structural integrity initiatives in design are to guide material selection, to incorporate fracture resistant concepts in the design, to utilize damage tolerance based allowables and planned inspection procedures necessary to enhance the safety and reliability of manned flight vehicles. However, validated fracture analysis tools for composite structures are needed to accomplish these objectives in a timely and economical manner. This paper briefly describes the development, validation, and application of a damage tolerance methodology for composite airframe structures. A closed-form analysis code, entitled SUBLAM was developed to predict the critical biaxial strain state necessary to cause sublaminate buckling-induced delamination extension in an impact damaged composite laminate. An embedded elliptical delamination separating a thin sublaminate from a thick parent laminate is modelled. Predicted failure strains were correlated against a variety of experimental data that included results from compression after impact coupon and element tests. An integrated analysis package was developed to predict damage tolerance based margin-of-safety (MS) using NASTRAN generated loads and element information. Damage tolerance aspects of new concepts are quickly and cost-effectively determined without the need for excessive testing.

  10. [Methodological approach to designing a telecare system for pre-dialysis and peritoneal dialysis patients].

    PubMed

    Calvillo-Arbizu, Jorge; Roa-Romero, Laura M; Milán-Martín, José A; Aresté-Fosalba, Nuria; Tornero-Molina, Fernando; Macía-Heras, Manuel; Vega-Díaz, Nicanor

    2014-01-01

    A major obstacle that hinders the implementation of technological solutions in healthcare is the rejection of developed systems by users (healthcare professionals and patients), who consider that they do not adapt to their real needs. (1) To design technological architecture for the telecare of nephrological patients by applying a methodology that prioritises the involvement of users (professionals and patients) throughout the design and development process; (2) to show how users' needs can be determined and addressed by means of technology, increasing the acceptance level of the final systems. In order to determine the main current needs in Nephrology, a group of Spanish Nephrology Services was involved. Needs were recorded through semi-structured interviews with the medical team and questionnaires for professionals and patients. A set of requirements were garnered from professionals and patients. In parallel, the group of biomedical engineers identified requirements for patient telecare from a technological perspective. All of these requirements drove the design of modular architecture for the telecare of peritoneal dialysis and pre-dialysis patients. This work shows how it is possible to involve users in the whole process of design and development of a system. The result of this work is the design of adaptable modular architecture for the telecare of nephrological patients and it addresses the preferences and needs of patient and professional users consulted.

  11. Facilitators of Organizational Learning in Design

    ERIC Educational Resources Information Center

    Pham, Ngoc Thuy; Swierczek, Fredric William

    2006-01-01

    Purpose: The purpose of this paper is to determine the influence of organizational factors such as leadership commitment, incentives and interaction on learning outcomes defined as performance improvement and organizational climate. Design/methodology/approach: Different aspects of knowledge acquisition, sharing and utilization were examined,…

  12. Development of AMSTAR: a measurement tool to assess the methodological quality of systematic reviews.

    PubMed

    Shea, Beverley J; Grimshaw, Jeremy M; Wells, George A; Boers, Maarten; Andersson, Neil; Hamel, Candyce; Porter, Ashley C; Tugwell, Peter; Moher, David; Bouter, Lex M

    2007-02-15

    Our objective was to develop an instrument to assess the methodological quality of systematic reviews, building upon previous tools, empirical evidence and expert consensus. A 37-item assessment tool was formed by combining 1) the enhanced Overview Quality Assessment Questionnaire (OQAQ), 2) a checklist created by Sacks, and 3) three additional items recently judged to be of methodological importance. This tool was applied to 99 paper-based and 52 electronic systematic reviews. Exploratory factor analysis was used to identify underlying components. The results were considered by methodological experts using a nominal group technique aimed at item reduction and design of an assessment tool with face and content validity. The factor analysis identified 11 components. From each component, one item was selected by the nominal group. The resulting instrument was judged to have face and content validity. A measurement tool for the 'assessment of multiple systematic reviews' (AMSTAR) was developed. The tool consists of 11 items and has good face and content validity for measuring the methodological quality of systematic reviews. Additional studies are needed with a focus on the reproducibility and construct validity of AMSTAR, before strong recommendations can be made on its use.
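
    As an illustration only, the item-reduction step (exploratory factor analysis followed by selecting one item per component) could be sketched as below; the data file and the use of scikit-learn are our assumptions, not the authors' analysis:

        import pandas as pd
        from sklearn.decomposition import FactorAnalysis

        # Hypothetical matrix: one row per appraised review, one column per item of the 37-item tool.
        items = pd.read_csv("quality_item_scores.csv")

        # Exploratory factor analysis with a chosen number of components (11 in the AMSTAR study).
        fa = FactorAnalysis(n_components=11, random_state=0).fit(items.values)
        loadings = pd.DataFrame(fa.components_.T, index=items.columns)

        # For each component, flag the item with the largest absolute loading as a candidate for the
        # short instrument (the study itself used a nominal group technique for this selection).
        candidates = loadings.abs().idxmax(axis=0)
        print(candidates)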

  13. Experimental Methodology for Measuring Combustion and Injection-Coupled Responses

    NASA Technical Reports Server (NTRS)

    Cavitt, Ryan C.; Frederick, Robert A.; Bazarov, Vladimir G.

    2006-01-01

    A Russian scaling methodology for liquid rocket engines utilizing a single, full scale element is reviewed. The scaling methodology exploits the supercritical phase of the full scale propellants to simplify scaling requirements. Many assumptions are utilized in the derivation of the scaling criteria. A test apparatus design is presented to implement the Russian methodology and consequently verify the assumptions. This test apparatus will allow researchers to assess the usefulness of the scaling procedures and possibly enhance the methodology. A matrix of the apparatus capabilities for a RD-170 injector is also presented. Several methods to enhance the methodology have been generated through the design process.

  14. A Rapid Python-Based Methodology for Target-Focused Combinatorial Library Design.

    PubMed

    Li, Shiliang; Song, Yuwei; Liu, Xiaofeng; Li, Honglin

    2016-01-01

    The chemical space is so vast that only a small portion of it has been examined. As a complementary approach to systematically probing the chemical space, virtual combinatorial library design has had an enormous impact on generating novel and diverse structures for drug discovery. Despite these favorable contributions, high attrition rates in drug development, resulting mainly from lack of efficacy and from side effects, make it increasingly challenging to discover good chemical starting points. In most cases, focused libraries, which are restricted to particular regions of the chemical space, are exploited to maximize the hit rate and improve efficiency at the beginning of the drug discovery and development pipeline. This paper presents a methodology for fast target-focused combinatorial library design in both reaction-based and product-based modes, with library creation rates of approximately 70,000 molecules per second. Simple, quick and convenient operating procedures are the distinguishing features of the method. SHAFTS, a hybrid 3D similarity calculation program, was embedded to help refine the size of the libraries and improve hit rates. Two target-focused (p38-focused and COX2-focused) libraries were constructed efficiently in this study. This rapid library enumeration method is portable and applicable to other targets for the identification of good chemical starting points in combination with either structure-based or ligand-based virtual screening.
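
    To make the reaction-based enumeration idea concrete, a minimal RDKit sketch is shown below; the reaction SMARTS and reagent lists are placeholders and are unrelated to the authors' implementation:

        from itertools import product
        from rdkit import Chem
        from rdkit.Chem import AllChem

        # Illustrative amide-coupling reaction written as reaction SMARTS.
        rxn = AllChem.ReactionFromSmarts("[C:1](=[O:2])[OH].[N:3]>>[C:1](=[O:2])[N:3]")

        acids = [Chem.MolFromSmiles(s) for s in ("CC(=O)O", "OC(=O)c1ccccc1")]
        amines = [Chem.MolFromSmiles(s) for s in ("NCCO", "NC1CCCCC1")]

        # Enumerate every acid x amine combination and keep the chemically valid products.
        library = []
        for acid, amine in product(acids, amines):
            for prods in rxn.RunReactants((acid, amine)):
                mol = prods[0]
                try:
                    Chem.SanitizeMol(mol)
                    library.append(Chem.MolToSmiles(mol))
                except Exception:
                    continue

        print(len(set(library)), "unique products")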

  15. A hybrid design methodology for structuring an Integrated Environmental Management System (IEMS) for shipping business.

    PubMed

    Celik, Metin

    2009-03-01

    The International Safety Management (ISM) Code defines a broad framework for the safe management and operation of merchant ships, maintaining high standards of safety and environmental protection. On the other hand, ISO 14001:2004 provides a generic, worldwide environmental management standard that has been utilized by several industries. Both the ISM Code and ISO 14001:2004 have the practical goal of establishing a sustainable Integrated Environmental Management System (IEMS) for shipping businesses. This paper presents a hybrid design methodology that shows how requirements from both standards can be combined into a single execution scheme. Specifically, the Analytic Hierarchy Process (AHP) and Fuzzy Axiomatic Design (FAD) are used to structure an IEMS for ship management companies. This research provides decision aid to maritime executives in order to enhance the environmental performance in the shipping industry.

  16. Electronic Design Automation: Integrating the Design and Manufacturing Functions

    NASA Technical Reports Server (NTRS)

    Bachnak, Rafic; Salkowski, Charles

    1997-01-01

    As the complexity of electronic systems grows, the traditional design practice, a sequential process, is replaced by concurrent design methodologies. A major advantage of concurrent design is that the feedback from software and manufacturing engineers can be easily incorporated into the design. The implementation of concurrent engineering methodologies is greatly facilitated by employing the latest Electronic Design Automation (EDA) tools. These tools offer integrated simulation of the electrical, mechanical, and manufacturing functions and support virtual prototyping, rapid prototyping, and hardware-software co-design. This report presents recommendations for enhancing the electronic design and manufacturing capabilities and procedures at JSC based on a concurrent design methodology that employs EDA tools.

  17. Methodological, Theoretical, Infrastructural, and Design Issues in Conducting Good Outcome Studies

    ERIC Educational Resources Information Center

    Kelly, Michael P.; Moore, Tessa A.

    2011-01-01

    This article outlines a set of methodological, theoretical, and other issues relating to the conduct of good outcome studies. The article begins by considering the contribution of evidence-based medicine to the methodology of outcome research. The lessons which can be applied in outcome studies in nonmedical settings are described. The article…

  18. Integrated controls-structures design methodology development for a class of flexible spacecraft

    NASA Technical Reports Server (NTRS)

    Maghami, P. G.; Joshi, S. M.; Walz, J. E.; Armstrong, E. S.

    1990-01-01

    Future utilization of space will require large space structures in low-Earth and geostationary orbits. Example missions include: Earth observation systems, personal communication systems, space science missions, space processing facilities, etc., requiring large antennas, platforms, and solar arrays. The dimensions of such structures will range from a few meters to possibly hundreds of meters. For reducing the cost of construction, launching, and operating (e.g., energy required for reboosting and control), it will be necessary to make the structure as light as possible. However, reducing structural mass tends to increase the flexibility which would make it more difficult to control with the specified precision in attitude and shape. Therefore, there is a need to develop a methodology for designing space structures which are optimal with respect to both structural design and control design. In the current spacecraft design practice, it is customary to first perform the structural design and then the controller design. However, the structural design and the control design problems are substantially coupled and must be considered concurrently in order to obtain a truly optimal spacecraft design. For example, let C denote the set of the 'control' design variables (e.g., controller gains), and L the set of the 'structural' design variables (e.g., member sizes). If a structural member thickness is changed, the dynamics would change which would then change the control law and the actuator mass. That would, in turn, change the structural model. Thus, the sets C and L depend on each other. Future space structures can be roughly divided into four mission classes. Class 1 missions include flexible spacecraft with no articulated appendages which require fine attitude pointing and vibration suppression (e.g., large space antennas). Class 2 missions consist of flexible spacecraft with articulated multiple payloads, where the requirement is to fine-point the spacecraft and each

  19. A Human Factors Framework for Payload Display Design

    NASA Technical Reports Server (NTRS)

    Dunn, Mariea C.; Hutchinson, Sonya L.

    1998-01-01

    During missions to space, one charge of the astronaut crew is to conduct research experiments. These experiments, referred to as payloads, typically are controlled by computers. Crewmembers interact with payload computers by using visual interfaces or displays. To enhance the safety, productivity, and efficiency of crewmember interaction with payload displays, particular attention must be paid to the usability of these displays. Enhancing display usability requires adoption of a design process that incorporates human factors engineering principles at each stage. This paper presents a proposed framework for incorporating human factors engineering principles into the payload display design process.

  20. Developing a personalised self-management system for post stroke rehabilitation; utilising a user-centred design methodology.

    PubMed

    Mawson, Susan; Nasr, Nasrin; Parker, Jack; Zheng, Huiru; Davies, Richard; Mountain, Gail

    2014-11-01

    To develop and evaluate an information and communication technology (ICT) solution for a post-stroke Personalised Self-Managed Rehabilitation System (PSMrS). The PSMrS translates current models of stroke rehabilitation and theories underpinning self-management and self-efficacy into an ICT-based system for home-based post-stroke rehabilitation. The interdisciplinary research team applied a hybrid of health and social sciences research methods and user-centred design methods. This included a series of home visits, focus groups, in-depth interviews, cultural probes and technology biographies. The iterative development of both the content of the PSMrS and the interactive interfaces between the system and the user incorporates current models of post-stroke rehabilitation and addresses the factors that promote self-managed behaviour and self-efficacy such as mastery, verbal persuasion and physiological feedback. The methodological approach has ensured that the interactive technology has been driven by the needs of the stroke survivors and their carers in the context of their journey to both recovery and adaptation. Underpinned by theories of motor relearning, neuroplasticity, self-management and behaviour change, the PSMrS developed in this study has resulted in a personalised system for self-managed rehabilitation, which has the potential to change motor behaviour and promote the achievement of life goals for stroke survivors.

  1. Enhanced styrene recovery from waste polystyrene pyrolysis using response surface methodology coupled with Box-Behnken design.

    PubMed

    Mo, Yu; Zhao, Lei; Wang, Zhonghui; Chen, Chia-Lung; Tan, Giin-Yu Amy; Wang, Jing-Yuan

    2014-04-01

    Response surface methodology coupled with Box-Behnken design (RSM-BBD) was applied to enhance styrene recovery from waste polystyrene (WPS) through pyrolysis. The relationship between styrene yield and three selected operating parameters (temperature, heating rate, and carrier gas flow rate) was investigated. A second-order polynomial equation was successfully built to describe the process and predict styrene yield under the study conditions. The factors identified as statistically significant for styrene production were: temperature, with a quadratic effect; heating rate, with a linear effect; carrier gas flow rate, with a quadratic effect; the interaction between temperature and carrier gas flow rate; and the interaction between heating rate and carrier gas flow rate. The optimum conditions for the current system were determined to be a temperature range of 470-505°C, a heating rate of 40°C/min, and a carrier gas flow rate range of 115-140 mL/min. Under such conditions, 64.52% of the WPS was recovered as styrene, which was 12% more than the highest reported yield for reactors of similar size. It is concluded that RSM-BBD is an effective approach for optimizing styrene recovery from WPS pyrolysis. Copyright © 2014 Elsevier Ltd. All rights reserved.
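
    A second-order response-surface fit of the kind described can be sketched with ordinary least squares; the coded factor levels below follow a three-factor Box-Behnken layout, but the yields are invented placeholders rather than the study's data:

        import numpy as np

        # Coded levels (-1, 0, +1) for temperature, heating rate and carrier gas flow rate,
        # laid out as a three-factor Box-Behnken design (12 edge midpoints + 3 centre runs).
        X = np.array([[-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
                      [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
                      [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
                      [0, 0, 0], [0, 0, 0], [0, 0, 0]], dtype=float)
        y = np.array([48, 55, 52, 58, 47, 56, 50, 60, 49, 57, 51, 61, 64, 63, 65], dtype=float)

        def quadratic_terms(x):
            x1, x2, x3 = x
            return [1, x1, x2, x3, x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2]

        # Full second-order polynomial: intercept, linear, two-way interaction and quadratic terms.
        A = np.array([quadratic_terms(row) for row in X])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)

        # Predicted yield at an arbitrary coded setting, e.g. temperature +0.5, rate 0, flow -0.2.
        print(coef)
        print(np.dot(quadratic_terms([0.5, 0.0, -0.2]), coef))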

  2. A probabilistic methodology for radar cross section prediction in conceptual aircraft design

    NASA Astrophysics Data System (ADS)

    Hines, Nathan Robert

    System effectiveness has increasingly become the prime metric for the evaluation of military aircraft. As such, it is the decision maker's/designer's goal to maximize system effectiveness. Industry and government research documents indicate that all future military aircraft will incorporate signature reduction as an attempt to improve system effectiveness and reduce the cost of attrition. Today's operating environments demand low observable aircraft which are able to reliably take out valuable, time critical targets. Thus it is desirable to be able to design vehicles that are balanced for increased effectiveness. Previous studies have shown that shaping of the vehicle is one of the most important contributors to radar cross section, a measure of radar signature, and must be considered from the very beginning of the design process. Radar cross section estimation should be incorporated into conceptual design to develop more capable systems. This research strives to meet these needs by developing a conceptual design tool that predicts radar cross section for parametric geometries. This tool predicts the absolute radar cross section of the vehicle as well as the impact of geometry changes, allowing for the simultaneous tradeoff of the aerodynamic, performance, and cost characteristics of the vehicle with the radar cross section. Furthermore, this tool can be linked to a campaign theater analysis code to demonstrate the changes in system and system of system effectiveness due to changes in aircraft geometry. A general methodology was developed and implemented and sample computer codes applied to prototype the proposed process. Studies utilizing this radar cross section tool were subsequently performed to demonstrate the capabilities of this method and show the impact that various inputs have on the outputs of these models. The F/A-18 aircraft configuration was chosen as a case study vehicle to perform a design space exercise and to investigate the relative impact of

  3. [Perinatal mortality research in Brazil: review of methodology and results].

    PubMed

    Fonseca, Sandra Costa; Coutinho, Evandro da Silva Freire

    2004-01-01

    The perinatal mortality rate remains a public health problem, demanding epidemiological studies to describe its magnitude and time trends, identify risk factors, and define adequate interventions. There are still methodological controversies, resulting in heterogeneous studies and possible biases. In Brazil, there has been a growing scientific output on this theme, mainly in the South and Southeast of the country. Twenty-four articles from 1996 to 2003 were reviewed, focusing on definitions and classifications, data sources, study designs, measurement of variables, statistical analysis, and results. The review showed an increasing use of databases (mainly SINASC and SIM), few studies on stillbirth, the incorporation of classification schemes, and disagreement concerning risk factors.

  4. Single Group, Pre- and Post-Test Research Designs: Some Methodological Concerns

    ERIC Educational Resources Information Center

    Marsden, Emma; Torgerson, Carole J.

    2012-01-01

    This article provides two illustrations of some of the factors that can influence findings from pre- and post-test research designs in evaluation studies, including regression to the mean (RTM), maturation, history and test effects. The first illustration involves a re-analysis of data from a study by Marsden (2004), in which pre-test scores are…

  5. Space Station Human Factors: Designing a Human-Robot Interface

    NASA Technical Reports Server (NTRS)

    Rochlis, Jennifer L.; Clarke, John Paul; Goza, S. Michael

    2001-01-01

    The experiments described in this paper are part of a larger joint MIT/NASA research effort and focus on the development of a methodology for designing and evaluating integrated interfaces for highly dexterous and multifunctional telerobots. Specifically, a telerobotic workstation is being designed for an Extravehicular Activity (EVA) anthropomorphic space station telerobot called Robonaut. Previous researchers have designed telerobotic workstations based upon performance of discrete subsets of tasks (for example, peg-in-hole, tracking, etc.) without regard for the transitions that operators go through between tasks performed sequentially in the context of larger integrated tasks. The experiments presented here took an integrated approach to describing teleoperator performance and assessed how subjects operating a full-immersion telerobot perform during fine-positioning and gross-positioning tasks. In addition, a Robonaut simulation was developed as part of this research effort and experimentally tested against Robonaut itself to determine its utility. Results show that subject performance of teleoperated tasks using Robonaut and the simulation is virtually identical, with no significant difference between the two. These results indicate that the simulation can be utilized both as a Robonaut training tool and as a powerful design platform for telepresence displays and aids.

  6. Cognitive Activity-based Design Methodology for Novice Visual Communication Designers

    ERIC Educational Resources Information Center

    Kim, Hyunjung; Lee, Hyunju

    2016-01-01

    The notion of design thinking is becoming more concrete nowadays, as design researchers and practitioners study the thinking processes involved in design and employ the concept of design thinking to foster better solutions to complex and ill-defined problems. The goal of the present research is to develop a cognitive activity-based design…

  7. Analyzing the impacts of global trade and investment on non-communicable diseases and risk factors: a critical review of methodological approaches used in quantitative analyses.

    PubMed

    Cowling, Krycia; Thow, Anne Marie; Pollack Porter, Keshia

    2018-05-24

    A key mechanism through which globalization has impacted health is the liberalization of trade and investment, yet relatively few studies to date have used quantitative methods to investigate the impacts of global trade and investment policies on non-communicable diseases and risk factors. Recent reviews of this literature have found heterogeneity in results and a range of quality across studies, which may be in part attributable to a lack of conceptual clarity and methodological inconsistencies. This study is a critical review of methodological approaches used in the quantitative literature on global trade and investment and diet, tobacco, alcohol, and related health outcomes, with the objective of developing recommendations and providing resources to guide future robust, policy relevant research. A review of reviews, expert review, and reference tracing were employed to identify relevant studies, which were evaluated using a novel quality assessment tool designed for this research. Eight review articles and 34 quantitative studies were identified for inclusion. Important ways to improve this literature were identified and discussed: clearly defining exposures of interest and not conflating trade and investment; exploring mechanisms of broader relationships; increasing the use of individual-level data; ensuring consensus and consistency in key confounding variables; utilizing more sector-specific versus economy-wide trade and investment indicators; testing and adequately adjusting for autocorrelation and endogeneity when using longitudinal data; and presenting results from alternative statistical models and sensitivity analyses. To guide the development of future analyses, recommendations for international data sources for selected trade and investment indicators, as well as key gaps in the literature, are presented. More methodologically rigorous and consistent approaches in future quantitative studies on the impacts of global trade and investment policies on non

  8. Aroma profile design of wine spirits: Multi-objective optimization using response surface methodology.

    PubMed

    Matias-Guiu, Pau; Rodríguez-Bencomo, Juan José; Pérez-Correa, José R; López, Francisco

    2018-04-15

    Developing new distillation strategies can help the spirits industry improve quality, safety and process efficiency. Batch stills equipped with a packed column and an internal partial condenser are an innovative experimental system that allows fast and flexible management of rectification. In this study, the impact of four factors (heart-cut volume, head-cut volume, pH and cooling flow rate of the internal partial condenser during the head-cut fraction) on 18 major volatile compounds of Muscat spirits was optimized using response surface methodology and desirability function approaches. Results show that high rectification at the beginning of the heart cut enhances the overall positive aroma compounds of the product while reducing off-flavor compounds. In contrast, the optimum levels of the heart-cut volume, head-cut volume and pH factors varied depending on the process goal. Finally, three optimal operating conditions (head off-flavor reduction, flowery terpenic enhancement and fruity ester enhancement) were evaluated by chemical and sensory analysis. Copyright © 2017 Elsevier Ltd. All rights reserved.
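
    The desirability-function step can be sketched as below; the two responses, limits and weights are invented and do not correspond to the paper's compounds or operating conditions:

        import numpy as np

        def desirability_maximize(y, low, high, s=1.0):
            # Derringer-Suich desirability for a response to be maximized.
            d = (y - low) / (high - low)
            return float(np.clip(d, 0.0, 1.0) ** s)

        def desirability_minimize(y, low, high, s=1.0):
            # Desirability for a response to be minimized (e.g. an off-flavor compound).
            d = (high - y) / (high - low)
            return float(np.clip(d, 0.0, 1.0) ** s)

        # Hypothetical predicted responses at one candidate operating point.
        linalool = 12.0       # positive aroma compound, to be maximized (mg/L)
        ethyl_acetate = 95.0  # off-flavor compound, to be minimized (mg/L)

        d1 = desirability_maximize(linalool, low=5.0, high=20.0)
        d2 = desirability_minimize(ethyl_acetate, low=60.0, high=150.0)

        # Overall desirability is the geometric mean of the individual desirabilities.
        D = (d1 * d2) ** 0.5
        print(round(d1, 3), round(d2, 3), round(D, 3))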

  9. Air traffic control system baseline methodology guide.

    DOT National Transportation Integrated Search

    1999-06-01

    The Air Traffic Control System Baseline Methodology Guide serves as a reference in the design and conduct of baseline studies. : Engineering research psychologists are the intended audience for the Methodology Guide, which focuses primarily on techni...

  10. A literature review of applied adaptive design methodology within the field of oncology in randomised controlled trials and a proposed extension to the CONSORT guidelines.

    PubMed

    Mistry, Pankaj; Dunn, Janet A; Marshall, Andrea

    2017-07-18

    The application of adaptive design methodology within a clinical trial setting is becoming increasingly popular. However, the application of these methods is often not reported as an adaptive design, making it more difficult to capture their emerging use. Within this review, we aim to understand how adaptive design methodology is being reported, whether these methods are explicitly stated to be an 'adaptive design' or whether this has to be inferred, and whether these methods are applied prospectively or concurrently. Three databases (Embase, Ovid and PubMed) were chosen to conduct the literature search. The inclusion criteria for the review were phase II, phase III and phase II/III randomised controlled trials within the field of oncology that published trial results in 2015. A variety of search terms related to adaptive designs were used. A total of 734 results were identified; after screening, 54 were eligible. Adaptive designs were more commonly applied in phase III confirmatory trials. The majority of the papers performed an interim analysis, which included some form of stopping criteria. Only two papers explicitly stated the term 'adaptive design'; for most of the papers it had to be inferred that adaptive methods were applied. Sixty-five applications of adaptive design methods were identified, of which the most common was adaptation using group sequential methods. This review indicates that the reporting of adaptive design methodology within clinical trials needs improving. The proposed extension to the current CONSORT 2010 guidelines could help capture adaptive design methods and provide an essential aid to those involved with clinical trials.
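
    The group sequential designs that dominated the review can be illustrated by Monte-Carlo checking the overall type-I error of a set of interim stopping boundaries; the boundaries and number of looks below are illustrative, not taken from any reviewed trial:

        import numpy as np

        rng = np.random.default_rng(0)

        looks = 4                                          # equally spaced interim analyses
        boundaries = np.array([4.33, 2.96, 2.36, 2.01])    # an O'Brien-Fleming-like shape (illustrative)
        n_sim = 200_000

        # Under the null hypothesis the standardized test statistic at look k behaves like
        # a standardized partial sum of independent N(0,1) increments.
        increments = rng.standard_normal((n_sim, looks))
        partial_sums = np.cumsum(increments, axis=1)
        z_stats = partial_sums / np.sqrt(np.arange(1, looks + 1))

        # A simulated trial "stops for efficacy" if its statistic crosses the boundary at any look.
        crossed = (z_stats > boundaries).any(axis=1)
        print("simulated overall one-sided type-I error:", crossed.mean())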

  11. Design of the subject Quality Engineering and Product Safety for the Degree in Engineering in Industrial Design and Product Development, based on the case methodology

    NASA Astrophysics Data System (ADS)

    González, M. R.; Lambán, M. P.

    2012-04-01

    This paper presents the result of designing the subject Quality Engineering and Product Safety, belonging to the Degree in Engineering in Industrial Design and Product Development, on the basis of the case methodology. The practical sessions of this subject are organized around the complete Quality Management System documentation of the virtual company BeaLuc S.A.

  12. Intelligent systems engineering methodology

    NASA Technical Reports Server (NTRS)

    Fouse, Scott

    1990-01-01

    An added challenge for the designers of large scale systems such as Space Station Freedom is the appropriate incorporation of intelligent system technology (artificial intelligence, expert systems, knowledge-based systems, etc.) into their requirements and design. This presentation will describe a view of systems engineering which successfully addresses several aspects of this complex problem: design of large scale systems, design with requirements that are so complex they only completely unfold during the development of a baseline system and even then continue to evolve throughout the system's life cycle, design that involves the incorporation of new technologies, and design and development that takes place with many players in a distributed manner yet can be easily integrated to meet a single view of the requirements. The first generation of this methodology was developed and evolved jointly by ISX and the Lockheed Aeronautical Systems Company over the past five years on the Defense Advanced Research Projects Agency/Air Force Pilot's Associate Program, one of the largest, most complex, and most successful intelligent systems constructed to date. As the methodology has evolved it has also been applied successfully to a number of other projects. Some of the lessons learned from this experience may be applicable to Freedom.

  13. Design, Implementation, and Operational Methodologies for Sub-arcsecond Attitude Determination, Control, and Stabilization of the Super-pressure Balloon-Borne Imaging Telescope (SuperBIT)

    NASA Astrophysics Data System (ADS)

    Javier Romualdez, Luis

    Scientific balloon-borne instrumentation offers an attractive, competitive, and effective alternative to space-borne missions when considering the overall scope, cost, and development timescale required to design and launch scientific instruments. In particular, the balloon-borne environment provides a near-space regime that is suitable for a number of modern astronomical and cosmological experiments, where the atmospheric interference suffered by ground-based instrumentation is negligible at stratospheric altitudes. This work is centered around the analytical strategies and implementation considerations for the attitude determination and control of SuperBIT, a scientific balloon-borne payload capable of meeting the strict sub-arcsecond pointing and image stability requirements demanded by modern cosmological experiments. Broadly speaking, the designed stability specifications of SuperBIT coupled with its observational efficiency, image quality, and accessibility rivals state-of-the-art astronomical observatories such as the Hubble Space Telescope. To this end, this work presents an end-to-end design methodology for precision pointing balloon-borne payloads such as SuperBIT within an analytical yet implementationally grounded context. Simulation models of SuperBIT are analytically derived to aid in pre-assembly trade-off and case studies that are pertinent to the dynamic balloon-borne environment. From these results, state estimation techniques and control methodologies are extensively developed, leveraging the analytical framework of simulation models and design studies. This pre-assembly design phase is physically validated during assembly, integration, and testing through implementation in real-time hardware and software, which bridges the gap between analytical results and practical application. SuperBIT attitude determination and control is demonstrated throughout two engineering test flights that verify pointing and image stability requirements in flight

  14. A goal programming approach for a joint design of macroeconomic and environmental policies: a methodological proposal and an application to the Spanish economy.

    PubMed

    André, Francisco J; Cardenete, M Alejandro; Romero, Carlos

    2009-05-01

    Economic policy needs to pay increasing attention to environmental issues, which requires the development of methodologies able to incorporate environmental, as well as macroeconomic, goals in the design of public policies. Starting from this observation, this article proposes a methodology based upon a Simonian satisficing logic, made operational with the help of goal programming (GP) models, to address the joint design of macroeconomic and environmental policies. The methodology is applied to the Spanish economy, where a joint policy is elicited taking into consideration macroeconomic goals (economic growth, inflation, unemployment, public deficit) and environmental goals (CO2, NOx and SOx emissions) within the context of a computable general equilibrium model. The results show how the government can "fine-tune" its policy according to different criteria using GP models. The resulting policies aggregate the environmental and the economic goals in different ways: maximum aggregate performance, maximum balance, and a lexicographic hierarchy of the goals.
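
    The weighted goal programming formulation underlying such an exercise can be written as a small linear program; the coefficients, targets and weights below are toy numbers, not the Spanish computable general equilibrium model:

        import numpy as np
        from scipy.optimize import linprog

        # Two policy instruments x1, x2; two goals: GDP growth of at least 30 units and
        # CO2 emissions of at most 20 units, with a toy linear "economy":
        #   growth    = 2*x1 + 1*x2
        #   emissions = 3*x1 + 1*x2
        # Decision vector: [x1, x2, n_growth, p_growth, n_co2, p_co2], where n/p are the
        # negative/positive deviations from each goal.
        w_growth, w_co2 = 1.0, 2.0                      # weights on the undesirable deviations

        c = np.array([0, 0, w_growth, 0, 0, w_co2])     # penalize growth shortfall and CO2 overshoot
        A_eq = np.array([[2, 1, 1, -1, 0, 0],           # growth    + n_growth - p_growth = 30
                         [3, 1, 0, 0, 1, -1]])          # emissions + n_co2    - p_co2    = 20
        b_eq = np.array([30.0, 20.0])
        bounds = [(0, None)] * 6                        # instruments and deviations are non-negative

        res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
        x1, x2, n_g, p_g, n_c, p_c = res.x
        print("instruments:", round(x1, 3), round(x2, 3),
              "growth shortfall:", round(n_g, 3), "CO2 overshoot:", round(p_c, 3))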

  15. Systematic review of communication partner training in aphasia: methodological quality.

    PubMed

    Cherney, Leora R; Simmons-Mackie, Nina; Raymer, Anastasia; Armstrong, Elizabeth; Holland, Audrey

    2013-10-01

    Twenty-three studies identified from a previous systematic review examining the effects of communication partner training on persons with aphasia and their communication partners were evaluated for methodological quality. Two reviewers rated the studies on defined methodological quality criteria relevant to each study design. There were 11 group studies, seven single-subject participant design studies, and five qualitative studies. Quality scores were derived for each study. The mean inter-rater reliability of scores for each study design ranged from 85-93%, with Cohen's Kappa indicating substantial agreement between raters. Methodological quality of research on communication partner training in aphasia was highly varied. Overall, group studies employed the least rigorous methodology as compared to single subject and qualitative research. Only two of 11 group studies complied with more than half of the quality criteria. No group studies reported therapist blinding and only one group study reported participant blinding. Across all types of studies, the criterion of treatment fidelity was most commonly omitted. Failure to explicitly report certain methodological quality criteria may account for low ratings. Using methodological rating scales specific to the type of study design may help improve the methodological quality of aphasia treatment studies, including those on communication partner training.

  16. Bio-Medical Factors and External Hazards in Space Station Design

    NASA Technical Reports Server (NTRS)

    Olling, Edward H.

    1966-01-01

    The design of space-station configurations is influenced by many factors. Probably the most demanding and critical are the biomedical and external-hazard requirements imposed to provide the proper environment and supporting facilities for the crew and the adequate protective measures necessary to provide a configuration in which the crew can live and work efficiently in relative comfort and safety. The major biomedical factors, such as physiology, psychology, nutrition, personal hygiene, waste management, and recreation, all impose their own peculiar requirements. The commonality and integration of these requirements demand that the utmost ingenuity and inventiveness be exercised in order to achieve effective configuration compliance. The relationship of biomedical factors to the internal space-station environment will be explored with respect to internal atmospheric constituency, atmospheric pressure levels, oxygen positive pressure, temperature, humidity, CO2 concentration, and atmospheric contamination. The range of these various parameters and the recommended levels for design use will be analyzed. Requirements and criteria for specific problem areas such as zero and artificial gravity and crew private quarters will be reviewed, and the impact on the design of representative solutions will be presented. In the area of external hazards, the impact of factors such as meteoroids, radiation, vacuum, temperature extremes, and cycling on station design will be evaluated. Considerations with respect to operational effectiveness and crew safety will be discussed. The impact of such factors on spacecraft design to achieve acceptable launch and reentry g levels, crew rotation intervals, etc., will be reviewed. Examples of configurations, subsystems, and internal arrangement and installations to comply with such biomedical factor requirements will be presented. The effects of solutions to certain biomedical factors on configuration weight, operational convenience, and

  17. Bio-Medical Factors and External Hazards in Space Station Design

    NASA Technical Reports Server (NTRS)

    Olling, E. H.

    1966-01-01

    The design of space-station configurations is influenced by many factors. Probably the most demanding and critical are the biomedical and external-hazard requirements imposed to provide the proper environment and supporting facilities for the crew and the adequate protective measures necessary to provide a configuration in which the crew can live and work efficiently in relative comfort and safety. The major biomedical factors, such as physiology, psychology, nutrition, personal hygiene, waste management, and recreation, all impose their own peculiar requirements. The commonality and integration of these requirements demand that the utmost ingenuity and inventiveness be exercised in order to achieve effective configuration compliance. The relationship of biomedical factors to the internal space-station environment will be explored with respect to internal atmospheric constituency, atmospheric pressure levels, oxygen positive pressure, temperature, humidity, CO2 concentration, and atmospheric contamination. The range of these various parameters and the recommended levels for design use will be analyzed. Requirements and criteria for specific problem areas such as zero and artificial gravity and crew private quarters will be reviewed, and the impact on the design of representative solutions will be presented. In the area of external hazards, the impact of factors such as meteoroids, radiation, vacuum, temperature extremes, and cycling on station design will be evaluated. Considerations with respect to operational effectiveness and crew safety will be discussed. The impact of such factors on spacecraft design to achieve acceptable launch and reentry g levels, crew rotation intervals, etc., will be reviewed.

  18. Arab Teens Lifestyle Study (ATLS): objectives, design, methodology and implications

    PubMed Central

    Al-Hazzaa, Hazzaa M; Musaiger, Abdulrahman O

    2011-01-01

    Background There is a lack of comparable data on physical activity, sedentary behavior, and dietary habits among Arab adolescents, which limits our understanding and interpretation of the relationship between obesity and lifestyle parameters. Therefore, we initiated the Arab Teens Lifestyle Study (ATLS). The ATLS is a multicenter collaborative project for assessing lifestyle habits of Arab adolescents. The objectives of the ATLS project were to investigate the prevalence rates for overweight and obesity, physical activity, sedentary activity and dietary habits among Arab adolescents, and to examine the interrelationships between these lifestyle variables. This paper reports on the objectives, design, methodology, and implications of the ATLS. Design/Methods The ATLS is a school-based cross-sectional study involving 9182 randomly selected secondary-school students (14–19 years) from major Arab cities, using a multistage stratified sampling technique. The participating Arab cities included Riyadh, Jeddah, and Al-Khobar (Saudi Arabia), Bahrain, Dubai (United Arab Emirates), Kuwait, Amman (Jordan), Mosel (Iraq), Muscat (Oman), Tunisia (Tunisia) and Kenitra (Morocco). Measured variables included anthropometric measurements, physical activity, sedentary behavior, sleep duration, and dietary habits. Discussion The ATLS project will provide a unique opportunity to collect and analyze important lifestyle information from Arab adolescents using standardized procedures. This is the first time a collaborative Arab project will simultaneously assess broad lifestyle variables in a large sample of adolescents from numerous urbanized Arab regions. This joint research project will supply us with comprehensive and recent data on physical activity/inactivity and eating habits of Arab adolescents relative to obesity. Such invaluable lifestyle-related data are crucial for developing public health policies and regional strategies for health promotion and disease prevention. PMID

  19. Methodological issues of genetic association studies.

    PubMed

    Simundic, Ana-Maria

    2010-12-01

    Genetic association studies explore the association between genetic polymorphisms and a certain trait, disease or predisposition to disease. It has long been acknowledged that many genetic association studies fail to replicate their initial positive findings. This raises concern about the methodological quality of these reports. Case-control genetic association studies often suffer from various methodological flaws in study design and data analysis, and are often reported poorly. Flawed methodology and poor reporting leads to distorted results and incorrect conclusions. Many journals have adopted guidelines for reporting genetic association studies. In this review, some major methodological determinants of genetic association studies will be discussed.

  20. A Digital Methodology for the Design Process of Aerospace Assemblies with Sustainable Composite Processes & Manufacture

    NASA Astrophysics Data System (ADS)

    McEwan, W.; Butterfield, J.

    2011-05-01

    The well established benefits of composite materials are driving a significant shift in design and manufacture strategies for original equipment manufacturers (OEMs). Thermoplastic composites have advantages over the traditional thermosetting materials with regards to sustainability and environmental impact, features which are becoming increasingly pertinent in the aerospace arena. However, when sustainability and environmental impact are considered as design drivers, integrated methods for part design and product development must be developed so that any benefits of sustainable composite material systems can be assessed during the design process. These methods must include mechanisms to account for process induced part variation and techniques related to re-forming, recycling and decommissioning, which are in their infancy. It is proposed in this paper that predictive techniques related to material specification, part processing and product cost of thermoplastic composite components, be integrated within a Through Life Management (TLM) product development methodology as part of a larger strategy of product system modeling to improve disciplinary concurrency, realistic part performance, and to place sustainability at the heart of the design process. This paper reports the enhancement of digital manufacturing tools as a means of drawing simulated part manufacturing scenarios, real-time costing mechanisms, and broader lifecycle performance data capture into the design cycle. The work demonstrates predictive processes for sustainable composite product manufacture and how a Product-Process-Resource (PPR) structure can be customised and enhanced to include design intent driven by 'real' part geometry and consequent assembly.

  1. Methodology for the optimal design of an integrated first and second generation ethanol production plant combined with power cogeneration.

    PubMed

    Bechara, Rami; Gomez, Adrien; Saint-Antonin, Valérie; Schweitzer, Jean-Marc; Maréchal, François

    2016-08-01

    The application of methodologies for the optimal design of integrated processes has seen increased interest in the literature. This article builds on previous works and applies a systematic methodology to an integrated first and second generation ethanol production plant with power cogeneration. The methodology comprises process simulation, heat integration, thermo-economic evaluation, multi-variable evolutionary optimization of exergy efficiency versus capital costs, and process selection via profitability maximization. Optimization generated Pareto solutions with exergy efficiency ranging between 39.2% and 44.4% and capital costs from 210 M$ to 390 M$. The Net Present Value (NPV) was positive for only two scenarios, at low efficiency, low hydrolysis points. The minimum cellulosic ethanol selling price was then sought that would bring the maximum NPV to zero for the high efficiency, high hydrolysis alternatives. The resulting optimal configuration presented maximum exergy efficiency, hydrolyzed bagasse fraction, capital costs and ethanol production rate, and minimum cooling water consumption and power production rate. Copyright © 2016 Elsevier Ltd. All rights reserved.
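
    The Pareto-selection step (retaining non-dominated efficiency/cost trade-offs) can be sketched generically as below; the candidate points are random placeholders rather than the plant configurations evaluated in the study:

        import numpy as np

        rng = np.random.default_rng(1)
        # Candidate designs: column 0 = exergy efficiency (maximize), column 1 = capital cost (minimize).
        designs = np.column_stack([rng.uniform(0.35, 0.45, 200), rng.uniform(200, 400, 200)])

        def pareto_front(points):
            # Return the indices of non-dominated points for (maximize, minimize) objectives.
            keep = []
            for i, (eff_i, cost_i) in enumerate(points):
                dominated = any(
                    (eff_j >= eff_i and cost_j <= cost_i) and (eff_j > eff_i or cost_j < cost_i)
                    for j, (eff_j, cost_j) in enumerate(points) if j != i
                )
                if not dominated:
                    keep.append(i)
            return np.array(keep)

        front = designs[pareto_front(designs)]
        print(len(front), "non-dominated designs")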

  2. Bayesian methodology for the design and interpretation of clinical trials in critical care medicine: a primer for clinicians.

    PubMed

    Kalil, Andre C; Sun, Junfeng

    2014-10-01

    To review Bayesian methodology and its utility to clinical decision making and research in the critical care field. Clinical, epidemiological, and biostatistical studies on Bayesian methods in PubMed and Embase from their inception to December 2013. Bayesian methods have been extensively used by a wide range of scientific fields, including astronomy, engineering, chemistry, genetics, physics, geology, paleontology, climatology, cryptography, linguistics, ecology, and computational sciences. The application of medical knowledge in clinical research is analogous to the application of medical knowledge in clinical practice. Bedside physicians have to make most diagnostic and treatment decisions on critically ill patients every day without clear-cut evidence-based medicine (more subjective than objective evidence). Similarly, clinical researchers have to make most decisions about trial design with limited available data. Bayesian methodology allows both subjective and objective aspects of knowledge to be formally measured and transparently incorporated into the design, execution, and interpretation of clinical trials. In addition, various degrees of knowledge and several hypotheses can be tested at the same time in a single clinical trial without the risk of multiplicity. Notably, the Bayesian technology is naturally suited for the interpretation of clinical trial findings for the individualized care of critically ill patients and for the optimization of public health policies. We propose that the application of the versatile Bayesian methodology in conjunction with the conventional statistical methods is not only ripe for actual use in critical care clinical research but it is also a necessary step to maximize the performance of clinical trials and its translation to the practice of critical care medicine.
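
    The kind of Bayesian updating the authors advocate can be sketched with a Beta-Binomial model; the trial counts below are invented and the choice of a flat prior is ours:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)

        # Invented interim data: deaths / patients in each arm of a hypothetical ICU trial.
        deaths_ctrl, n_ctrl = 30, 100
        deaths_trt,  n_trt  = 22, 100

        # Weakly informative Beta(1, 1) priors; the Beta-Binomial model gives Beta posteriors.
        post_ctrl = stats.beta(1 + deaths_ctrl, 1 + n_ctrl - deaths_ctrl)
        post_trt  = stats.beta(1 + deaths_trt,  1 + n_trt  - deaths_trt)

        # Posterior probability that the treatment lowers mortality, by Monte-Carlo sampling.
        samples_ctrl = post_ctrl.rvs(100_000, random_state=rng)
        samples_trt  = post_trt.rvs(100_000, random_state=rng)
        print("P(treatment mortality < control mortality | data) =",
              round(np.mean(samples_trt < samples_ctrl), 3))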

  3. Methodological and Design Considerations in Evaluating the Impact of Prevention Programs on Violence and Related Health Outcomes.

    PubMed

    Massetti, Greta M; Simon, Thomas R; Smith, Deborah Gorman

    2016-10-01

    Drawing on research that has identified specific predictors and trajectories of risk for violence and related negative outcomes, a multitude of small- and large-scale preventive interventions for specific risk behaviors have been developed, implemented, and evaluated. One of the principal challenges of these approaches is that a number of separate problem-specific programs targeting different risk areas have emerged. However, as many negative health behaviors such as substance abuse and violence share a multitude of risk factors, many programs target identical risk factors. There are opportunities to understand whether evidence-based programs can be leveraged for potential effects across a spectrum of outcomes and over time. Some recent work has documented longitudinal effects of evidence-based interventions on generalized outcomes. This work has potential for advancing our understanding of the effectiveness of promising and evidence-based prevention strategies. However, conducting longitudinal follow-up of established interventions presents a number of methodological and design challenges. To answer some of these questions, the Centers for Disease Control and Prevention convened a panel of multidisciplinary experts to discuss opportunities to take advantage of evaluations of early prevention programs and evaluating multiple long-term outcomes. This special section of the journal Prevention Science includes a series of papers that begin to address the relevant considerations for conducting longitudinal follow-up evaluation research. This collection of papers is intended to inform our understanding of the challenges and strategies for conducting longitudinal follow-up evaluation research that could be used to drive future research endeavors.

  4. Longitudinal Research with Sexual Assault Survivors: A Methodological Review

    ERIC Educational Resources Information Center

    Campbell, Rebecca; Sprague, Heather Brown; Cottrill, Sara; Sullivan, Cris M.

    2011-01-01

    Longitudinal research designs are relatively rare in the academic literature on rape and sexual assault despite their tremendous methodological rigor and scientific utility. In the interest of promoting wider use of such methods, we conducted a methodological review of projects that have used prospective longitudinal designs to study the…

  5. Suggested criteria for evaluating systems engineering methodologies

    NASA Technical Reports Server (NTRS)

    Gates, Audrey; Paul, Arthur S.; Gill, Tepper L.

    1989-01-01

    Systems engineering is the application of mathematical and scientific principles to practical ends in the life-cycle of a system. A methodology for systems engineering is a carefully developed, relatively complex procedure or process for applying these mathematical and scientific principles. There are many systems engineering methodologies (or possibly many versions of a few methodologies) currently in use in government and industry. These methodologies are usually designed to meet the needs of a particular organization. It has been observed, however, that many technical and non-technical problems arise when inadequate systems engineering methodologies are applied by organizations to their systems development projects. Various criteria for evaluating systems engineering methodologies are discussed. Such criteria are developed to assist methodology-users in identifying and selecting methodologies that best fit the needs of the organization.

  6. Reporting and Methodology of Multivariable Analyses in Prognostic Observational Studies Published in 4 Anesthesiology Journals: A Methodological Descriptive Review.

    PubMed

    Guglielminotti, Jean; Dechartres, Agnès; Mentré, France; Montravers, Philippe; Longrois, Dan; Laouénan, Cedric

    2015-10-01

    Prognostic research studies in anesthesiology aim to identify risk factors for an outcome (explanatory studies) or to calculate the risk of this outcome on the basis of patients' risk factors (predictive studies). Multivariable models express the relationship between predictors and an outcome and are used in both explanatory and predictive studies. Model development demands a strict methodology and clear reporting so that its reliability can be assessed. In this methodological descriptive review, we critically assessed the reporting and methodology of multivariable analyses used in observational prognostic studies published in anesthesiology journals. A systematic search was conducted on Medline through Web of Knowledge, PubMed, and journal websites to identify observational prognostic studies with multivariable analysis published in Anesthesiology, Anesthesia & Analgesia, British Journal of Anaesthesia, and Anaesthesia in 2010 and 2011. Data were extracted by 2 independent readers. First, studies were analyzed with respect to reporting of outcomes, design, size, methods of analysis, model performance (discrimination and calibration), model validation, clinical usefulness, and the STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) checklist. A reporting rate was calculated on the basis of 21 items covering the aforementioned points. Second, studies were analyzed with respect to some predefined methodological points. Eighty-six studies were included: 87.2% were explanatory and 80.2% investigated a postoperative event. The reporting was fairly good, with a median reporting rate of 79% (75% in explanatory studies and 100% in predictive studies). Six items had a reporting rate <36% (i.e., the 25th percentile), with some of them not identified in the STROBE checklist: blinded evaluation of the outcome (11.9%), reason for sample size (15.1%), handling of missing data (36.0%), assessment of collinearity (17.4%), assessment of interactions (13.9%), and calibration (34
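
    The discrimination and calibration measures whose reporting the review tracks can be computed as below on synthetic data; the scikit-learn workflow is our assumption, not that of the reviewed studies:

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import roc_auc_score
        from sklearn.calibration import calibration_curve

        rng = np.random.default_rng(0)

        # Synthetic "prognostic" data set: three risk factors and a binary postoperative event.
        n = 2000
        X = rng.normal(size=(n, 3))
        logit = -2.0 + 0.8 * X[:, 0] + 0.5 * X[:, 1] - 0.3 * X[:, 2]
        y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
        model = LogisticRegression().fit(X_train, y_train)
        pred = model.predict_proba(X_test)[:, 1]

        # Discrimination: area under the ROC curve (c-statistic).
        print("AUC:", round(roc_auc_score(y_test, pred), 3))

        # Calibration: observed event rate versus mean predicted risk in quantile bins.
        obs, exp = calibration_curve(y_test, pred, n_bins=5, strategy="quantile")
        for o, e in zip(obs, exp):
            print(f"predicted {e:.2f}  observed {o:.2f}")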

  7. Design strategies from sexual exploitation and sex work studies among women and girls: Methodological considerations in a hidden and vulnerable population.

    PubMed

    Gerassi, Lara; Edmond, Tonya; Nichols, Andrea

    2017-06-01

    The study of sex trafficking, prostitution, sex work, and sexual exploitation is associated with many methodological issues and challenges. Researchers' study designs must consider the many safety issues related to this vulnerable and hidden population. Community advisory boards and key stakeholder involvement are essential to study design to increase safety of participants, usefulness of study aims, and meaningfulness of conclusions. Nonrandomized sampling strategies are most often utilized when studying exploited women and girls, which have the capacity to provide rich data and require complex sampling and recruitment methods. This article reviews the current methodological issues when studying this marginalized population as well as strategies to address challenges while working with the community in order to bring about social change. The authors also discuss their own experiences in collaborating with community organizations to conduct research in this field.

  8. Design strategies from sexual exploitation and sex work studies among women and girls: Methodological considerations in a hidden and vulnerable population

    PubMed Central

    Gerassi, Lara; Edmond, Tonya; Nichols, Andrea

    2016-01-01

    The study of sex trafficking, prostitution, sex work, and sexual exploitation is associated with many methodological issues and challenges. Researchers’ study designs must consider the many safety issues related to this vulnerable and hidden population. Community advisory boards and key stakeholder involvement are essential to study design to increase safety of participants, usefulness of study aims, and meaningfulness of conclusions. Nonrandomized sampling strategies are most often utilized when studying exploited women and girls, which have the capacity to provide rich data and require complex sampling and recruitment methods. This article reviews the current methodological issues when studying this marginalized population as well as strategies to address challenges while working with the community in order to bring about social change. The authors also discuss their own experiences in collaborating with community organizations to conduct research in this field. PMID:28824337

  9. Development of risk-based decision methodology for facility design.

    DOT National Transportation Integrated Search

    2014-06-01

    This report develops a methodology for CDOT to use in the risk analysis of various types of facilities and provides : illustrative examples for the use of the proposed framework. An overview of the current practices and applications to : illustrate t...

  10. Contemporary Research on Parenting: Conceptual, Methodological, and Translational Issues

    PubMed Central

    Sleddens, Ester F. C.; Berge, Jerica; Connell, Lauren; Govig, Bert; Hennessy, Erin; Liggett, Leanne; Mallan, Kimberley; Santa Maria, Diane; Odoms-Young, Angela; St. George, Sara M.

    2013-01-01

    Researchers over the last decade have documented the association between general parenting style and numerous factors related to childhood obesity (e.g., children's eating behaviors, physical activity, and weight status). Many recent childhood obesity prevention programs are family focused and designed to modify parenting behaviors thought to contribute to childhood obesity risk. This article presents a brief consideration of conceptual, methodological, and translational issues that can inform future research on the role of parenting in childhood obesity. They include: (1) General versus domain specific parenting styles and practices; (2) the role of ethnicity and culture; (3) assessing bidirectional influences; (4) broadening assessments beyond the immediate family; (5) novel approaches to parenting measurement; and (6) designing effective interventions. Numerous directions for future research are offered. PMID:23944927

  11. Contemporary research on parenting: conceptual, methodological, and translational issues.

    PubMed

    Power, Thomas G; Sleddens, Ester F C; Berge, Jerica; Connell, Lauren; Govig, Bert; Hennessy, Erin; Liggett, Leanne; Mallan, Kimberley; Santa Maria, Diane; Odoms-Young, Angela; St George, Sara M

    2013-08-01

    Researchers over the last decade have documented the association between general parenting style and numerous factors related to childhood obesity (e.g., children's eating behaviors, physical activity, and weight status). Many recent childhood obesity prevention programs are family focused and designed to modify parenting behaviors thought to contribute to childhood obesity risk. This article presents a brief consideration of conceptual, methodological, and translational issues that can inform future research on the role of parenting in childhood obesity. They include: (1) General versus domain specific parenting styles and practices; (2) the role of ethnicity and culture; (3) assessing bidirectional influences; (4) broadening assessments beyond the immediate family; (5) novel approaches to parenting measurement; and (6) designing effective interventions. Numerous directions for future research are offered.

  12. Surface laser marking optimization using an experimental design approach

    NASA Astrophysics Data System (ADS)

    Brihmat-Hamadi, F.; Amara, E. H.; Lavisse, L.; Jouvard, J. M.; Cicala, E.; Kellou, H.

    2017-04-01

    Laser surface marking is performed on a titanium substrate using a pulsed frequency-doubled Nd:YAG laser (λ = 532 nm, τpulse = 5 ns) to process the substrate surface under normal atmospheric conditions. The aim of the work is to investigate, following experimental and statistical approaches, the correlation between the process parameters and the response variables (output), using Design of Experiments (DOE) methods: the Taguchi methodology and a response surface methodology (RSM). A design is first created using the MINITAB program, and the laser marking process is then performed according to the planned design. The response variables, surface roughness and surface reflectance, were measured for each sample and incorporated into the design matrix. The results are then analyzed and the RSM model is developed and verified for predicting the process output for a given set of process parameter values. The analysis shows that the laser beam scanning speed is the most influential operating factor, followed by the laser pumping intensity during marking, while the other factors show complex influences on the objective functions.
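
    As a rough illustration of the response-surface step described above, the sketch below fits a second-order polynomial to coded two-factor DoE data by ordinary least squares. It is not the authors' MINITAB/Taguchi workflow; the factor levels and response values are hypothetical placeholders.

```python
# Illustrative sketch: fit a quadratic response-surface model
# y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
# to design-of-experiments data (coded factor levels are hypothetical).
import numpy as np

# Coded levels, e.g. x1 = scanning speed, x2 = pumping intensity (placeholders)
x1 = np.array([-1, -1, 1, 1, 0, 0, -1, 1, 0])
x2 = np.array([-1, 1, -1, 1, 0, 0, 0, 0, -1])
y = np.array([0.82, 0.65, 0.74, 0.55, 0.68, 0.70, 0.73, 0.63, 0.77])  # e.g. reflectance

# Build the quadratic model matrix and solve for the coefficients
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Predict the response at a new operating point (coded units 0.5, -0.5)
new = np.array([1.0, 0.5, -0.5, 0.5 * -0.5, 0.5**2, (-0.5)**2])
print("coefficients:", np.round(coef, 3))
print("predicted response at (0.5, -0.5):", round(float(new @ coef), 3))
```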

  13. How system designers think: a study of design thinking in human factors engineering.

    PubMed

    Papantonopoulos, Sotiris

    2004-11-01

    The paper presents a descriptive study of design thinking in human factors engineering. The objective of the study is to analyse the role of interpretation in design thinking and the role of design practice in guiding interpretation. The study involved 10 system designers undertaking the allocation of cognitive functions in three production planning and control task scenarios. Allocation decisions were recorded and verbal protocols of the design process were collected to elicit the subjects' thought processes. Verbal protocol analysis showed that subjects carried out the design of cognitive task allocation as a problem of applying a selected automation technology from their initial design deliberations. This design strategy stands in contrast to the predominant view of system design that stipulates that user requirements should be thoroughly analysed prior to making any decisions about technology. Theoretical frameworks from design research and ontological design showed that the system design process may be better understood by recognizing the role of design hypotheses in system design, as well as the diverse interactions between interpretation and practice, means and ends, and design practice and the designer's pre-understanding which shape the design process. Ways to balance the bias exerted on the design process were discussed.

  14. Human factors and the FDA's goals: improved medical device design.

    PubMed

    Burlington, D B

    1996-01-01

    The Food and Drug Administration's new human factors design requirements for medical devices were previewed by the director of the FDA's Center for Devices and Radiological Health (CDRH) at AAMI/FDA's Human Factors in Medical Devices Conference held in September 1995. Director Bruce Burlington, MD, said the FDA plans to take a closer look at how new medical devices are designed to ensure proper attention has been paid to human error prevention. As a medical practitioner who has witnessed use-related deaths and injuries, Burlington stressed the importance of the medical community's reporting use errors as they occur and manufacturers' creating easy-to-use labeling and packaging. He also called for simplicity and quality of design in medical products, and asked for a consolidated effort of all professionals involved in human factors issues to help implement and further the FDA's new human factors program. An edited version of his presentation appears here.

  15. Rail Passenger Vehicle Truck Design Methodology

    DOT National Transportation Integrated Search

    1981-01-01

    A procedure for the selection of rail passenger truck design parameters to meet dynamic performance indices has been developed. The procedure is based upon partitioning the design task into three tradeoff studies: (1) a vertical ride quality-secondar...

  16. Probability-based methodology for buckling investigation of sandwich composite shells with and without cut-outs

    NASA Astrophysics Data System (ADS)

    Alfano, M.; Bisagni, C.

    2017-01-01

    The objective of the running EU project DESICOS (New Robust DESign Guideline for Imperfection Sensitive COmposite Launcher Structures) is to formulate an improved shell design methodology in order to meet the demand of the aerospace industry for lighter structures. Within the project, this article discusses a probability-based methodology developed at Politecnico di Milano. It is based on the combination of the Stress-Strength Interference Method and the Latin Hypercube Method, with the aim of predicting the buckling response of three sandwich composite cylindrical shells, assuming a loading condition of pure compression. The three shells are made of the same material, but have different stacking sequences and geometric dimensions. One of them presents three circular cut-outs. Different types of input imperfections, treated as random variables, are taken into account independently and in combination: variability in longitudinal Young's modulus, ply misalignment, geometric imperfections, and boundary imperfections. The methodology enables a first assessment of the structural reliability of the shells through the calculation of a probabilistic buckling factor for a specified level of probability. The factor depends strongly on the reliability level, on the number of adopted samples, and on the assumptions made in modeling the input imperfections. The main advantage of the developed procedure is its versatility, as it can be applied to the buckling analysis of laminated composite shells and sandwich composite shells including different types of imperfections.
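
    A minimal sketch of the general idea (not the DESICOS finite-element models): basic Latin Hypercube sampling of a few imperfection variables, a placeholder surrogate for the buckling load, and a probabilistic buckling factor read off as a quantile at a chosen reliability level. All distributions and the surrogate are assumptions for illustration only.

```python
# Latin Hypercube sampling of input imperfections combined with a
# stress-strength style quantile evaluation of a buckling knock-down factor.
import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(n_samples, n_dims):
    """One stratified, permuted uniform sample per dimension (basic LHS)."""
    u = (rng.random((n_samples, n_dims)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_dims):
        u[:, j] = rng.permutation(u[:, j])
    return u

# Hypothetical random inputs: Young's modulus ratio, ply misalignment (deg),
# geometric imperfection amplitude (fraction of the wall thickness)
n = 1000
u = latin_hypercube(n, 3)
E_ratio = 1.0 + 0.04 * (u[:, 0] - 0.5) * 2    # +/- 4 % scatter
misalign = 2.0 * (u[:, 1] - 0.5) * 2          # +/- 2 degrees
geo_imp = 0.5 * u[:, 2]                       # 0 .. 0.5 t

def buckling_load(E_ratio, misalign, geo_imp):
    """Placeholder surrogate: perfect-shell load knocked down by imperfections."""
    P_perfect = 100.0  # kN, hypothetical
    return P_perfect * E_ratio * (1 - 0.01 * abs(misalign)) * (1 - 0.4 * geo_imp)

loads = buckling_load(E_ratio, misalign, geo_imp)

# Probabilistic buckling factor: load level exceeded with 99 % probability,
# normalised by the perfect-shell load
reliability = 0.99
factor = np.quantile(loads, 1 - reliability) / 100.0
print(f"probabilistic buckling factor at {reliability:.0%} reliability: {factor:.3f}")
```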

  17. [Qualitative research methodology in health care].

    PubMed

    Bedregal, Paula; Besoain, Carolina; Reinoso, Alejandro; Zubarew, Tamara

    2017-03-01

    Health care research requires different methodological approaches, such as qualitative and quantitative analyses, to understand the phenomena under study. Qualitative research is usually the least considered. Central elements of the qualitative method are an object of study constituted by perceptions, emotions and beliefs; purposive (non-random) sampling; a circular process of knowledge construction; and methodological rigor throughout the research process, from the quality of the design to the consistency of results. The objective of this work is to contribute to the methodological knowledge about qualitative research in health services, based on the implementation of the study, “The transition process from pediatric to adult services: perspectives from adolescents with chronic diseases, caregivers and health professionals”. The information gathered through the qualitative methodology facilitated the understanding of critical points, barriers and facilitators of the transition process of adolescents with chronic diseases, considering the perspective of users and the health team. This study allowed the design of a transition services model from pediatric to adult health services based on the needs of adolescents with chronic diseases, their caregivers and the health team.

  18. Human factors engineering approaches to patient identification armband design.

    PubMed

    Probst, C Adam; Wolf, Laurie; Bollini, Mara; Xiao, Yan

    2016-01-01

    The task of patient identification is performed many times each day by nurses and other members of the care team. Armbands are used for both direct verification and barcode scanning during patient identification. Armbands and information layout are critical to reducing patient identification errors and dangerous workarounds. We report the effort at two large, integrated healthcare systems that employed human factors engineering approaches to the information layout design of new patient identification armbands. The different methods used illustrate potential pathways to obtain standardized armbands across healthcare systems that incorporate human factors principles. By extension, how the designs have been adopted provides examples of how to incorporate human factors engineering into key clinical processes. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  19. Preliminary human factors guidelines for automated highway system designers. Volume 1 : guidelines for AHS designers

    DOT National Transportation Integrated Search

    1998-04-01

    Human factors can be defined as "designing to match the capabilities and limitations of the human user." The objectives of this human-centered design process are to maximize the effectiveness and efficiency of system performance, ensure a high level ...

  20. Factorial Design: An Eight Factor Experiment Using Paper Helicopters

    NASA Technical Reports Server (NTRS)

    Kozma, Michael

    1996-01-01

    The goal of this paper is to present the analysis of the multi-factor experiment (factorial design) conducted in EG490, Junior Design at Loyola College in Maryland. The discussion of this paper concludes the experimental analysis and ties the individual class papers together.

  1. A comprehensive approach to environmental and human factors into product/service design and development. A review from an ergoecological perspective.

    PubMed

    Saravia-Pinilla, Martha H; Daza-Beltrán, Carolina; García-Acosta, Gabriel

    2016-11-01

    This article presents the results of a documentary-exploratory review of design methods and concepts associated with human and environmental factors, based on a qualitative-quantitative analysis of coincidences with the fundamentals of ergoecology and in line with sustainable dynamics, with a view to putting the principles of ergoecology into practice in product/service design and development. Of the 696 documents found, 61.6% represent work on conceptual developments, while the remaining 38.4% refer to design methods. Searches were refined using Nvivo-10 software; 101 documents on theoretical aspects and 17 focused on the application of methods were retained, and these formed the analysis universe. The results show how little concern there is for working comprehensively on human and environmental aspects, and a trend toward segmentation of human and environmental aspects in the field of product/service design and development can be seen, at both concept and application/methodology levels. It was concluded that comprehensive, simultaneous work on human and environmental aspects, together with clarity and conceptual unity, is needed in order to achieve sustainability in practice and to ensure that ergoecology-compatible design methods are applied. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Design of integrated pitch axis for autopilot/autothrottle and integrated lateral axis for autopilot/yaw damper for NASA TSRV airplane using integral LQG methodology

    NASA Technical Reports Server (NTRS)

    Kaminer, Isaac; Benson, Russell A.; Coleman, Edward E.; Ebrahimi, Yaghoob S.

    1990-01-01

    Two designs are presented for control systems for the NASA Transport System Research Vehicle (TSRV) using integral Linear Quadratic Gaussian (LQG) methodology. The first is an integrated longitudinal autopilot/autothrottle design and the second design is an integrated lateral autopilot/yaw damper/sideslip controller design. It is shown that a systematic top-down approach to a complex design problem combined with proper application of modern control synthesis techniques yields a satisfactory solution in a reasonable period of time.
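
    The LQR gain computation at the core of LQG-type synthesis can be illustrated on a toy two-state model. This is only a sketch: it omits the integral states and Kalman filter of the integral LQG designs described above, and the model matrices are hypothetical.

```python
# Solve the continuous-time algebraic Riccati equation and form the
# state-feedback gain K = R^{-1} B^T P used in LQR/LQG-type synthesis.
import numpy as np
from scipy.linalg import solve_continuous_are

# Hypothetical 2-state model (e.g., pitch rate and airspeed deviation)
A = np.array([[-0.5, 1.0],
              [0.0, -0.2]])
B = np.array([[0.0],
              [0.8]])
Q = np.diag([10.0, 1.0])   # state weighting
R = np.array([[1.0]])      # control weighting

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)
print("LQR gain K:", K)
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
```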

  3. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  4. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  5. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  6. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  7. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  8. A review and preliminary evaluation of methodological factors in performance assessments of time-varying aircraft noise effects

    NASA Technical Reports Server (NTRS)

    Coates, G. D.; Alluisi, E. A.

    1975-01-01

    The effects of aircraft noise on human performance are considered. Progress is reported in the following areas: (1) review of the literature to identify the methodological and stimulus parameters involved in the study of noise effects on human performance; (2) development of a theoretical framework to provide working hypotheses as to the effects of noise on complex human performance; and (3) data collection on the first of several experimental investigations designed to provide tests of the hypotheses.

  9. Thermal sensation prediction by soft computing methodology.

    PubMed

    Jović, Srđan; Arsić, Nebojša; Vilimonović, Jovana; Petković, Dalibor

    2016-12-01

    Thermal comfort in open urban areas depends on many environmental factors, so demands for suitable thermal comfort need to be met during urban planning and design. Thermal comfort can be modeled based on climatic parameters and other factors. These factors are variable and change throughout the year and the day, so an algorithm is needed to predict thermal comfort from the input variables. The prediction results could be used to plan when urban areas are used. Since this is a highly nonlinear task, a soft computing methodology was applied in this investigation to predict thermal comfort. The main goal was to apply an extreme learning machine (ELM) for forecasting of physiological equivalent temperature (PET) values. Temperature, pressure, wind speed and irradiance were used as inputs. The prediction results are compared with some benchmark models. Based on the results, ELM can be used effectively in forecasting of PET. Copyright © 2016 Elsevier Ltd. All rights reserved.
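
    A minimal extreme learning machine regressor can be written in a few lines of numpy: the hidden-layer weights are random and fixed, and only the output weights are solved by (ridge-regularized) least squares. The four inputs mirror those named above, but the data below are simulated placeholders, not the study's measurements.

```python
# Minimal ELM sketch for regression (random hidden layer, least-squares output).
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical training data: rows = observations, columns = the 4 inputs
# (temperature, pressure, wind speed, irradiance), scaled to [0, 1]
X = rng.random((200, 4))
y = 10 + 15 * X[:, 0] - 3 * X[:, 2] + rng.normal(0, 0.5, 200)  # stand-in for PET

n_hidden = 50
W = rng.normal(size=(4, n_hidden))    # random input weights (never trained)
b = rng.normal(size=n_hidden)         # random hidden biases

def hidden(X):
    """Sigmoid hidden-layer activations."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

# Output weights via regularized least squares (ridge term for stability)
H = hidden(X)
beta = np.linalg.solve(H.T @ H + 1e-3 * np.eye(n_hidden), H.T @ y)

# Predict PET-like values for new conditions
X_new = rng.random((5, 4))
print(np.round(hidden(X_new) @ beta, 2))
```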

  10. Design and analysis of sustainable paper bicycle

    NASA Astrophysics Data System (ADS)

    Roni Sahroni, Taufik; Nasution, Januar

    2017-12-01

    This paper presents the design of a sustainable paper bicycle, describing stage by stage the production of the paper bicycle. The objective of this project is to design a sustainable paper bicycle to be used by children under five years old. The design analysis emphasizes a screening method to ensure that the design fulfils safety requirements. An evaluation concept for designing a sustainable paper bicycle is presented to determine the highest-rated option. A project methodology is proposed for developing a sustainable paper bicycle. Design analyses of the pedals, front and rear wheels, seat, and handle were presented using AutoCAD software. Design optimization was performed to satisfy the safety factors by modifying material size and dimensions. Based on the design analysis results, it is found that the optimized design met the safety factor. As a result, a sustainable paper bicycle was proposed for children under five years old.

  11. Design methodology for integrated downstream separation systems in an ethanol biorefinery

    NASA Astrophysics Data System (ADS)

    Mohammadzadeh Rohani, Navid

    and obtaining energy security. On the other hand, Process Integration (PI), defined by Natural Resources Canada as the combination of activities that aim at improving process systems, their unit operations, and their interactions in order to maximize the efficiency of using water, energy, and raw materials, can also help biorefineries lower their energy consumption and improve their economics. Energy integration techniques such as pinch analysis, adopted by different industries over the years, use heat sources within a plant to supply demand internally and decrease external utility consumption. Adopting energy integration is therefore one option biorefinery technology owners can consider in their process development, as well as in their business model, in order to improve their overall economics. The objective of this thesis is to propose a methodology for designing integrated downstream separation in a biorefinery. This methodology is tested in an ethanol biorefinery case study. Several alternative separation techniques are evaluated for their energy consumption and economics in three different scenarios: stand-alone without energy integration, stand-alone with internal energy integration, and integrated with Kraft. The energy consumption and capital costs of the separation techniques are assessed in each scenario, the costs and benefits of integration are determined, and finally the best alternative is identified through techno-economic metrics. Another advantage of this methodology is the use of a graphical tool that provides insights into decreasing energy consumption by modifying process conditions. The pivot point of this work is the use of a novel energy integration method called Bridge analysis. This systematic method, originally intended for retrofit situations, is used here for integration with the Kraft process. Integration potentials are identified through this method and savings are presented for each design. In stand-alone with

  12. An Examination of Factors Contributing to Student Satisfaction in Armenian Higher Education

    ERIC Educational Resources Information Center

    Martirosyan, Nara

    2015-01-01

    Purpose: The purpose of this paper is to investigate factors that affect student satisfaction in college environment in Armenian Higher Educational Institutions (AHEIs). Design/methodology/approach: This study used an "ex-post facto," non-experimental approach to investigate factors that affected student satisfaction in college…

  13. Guidelines for reporting evaluations based on observational methodology.

    PubMed

    Portell, Mariona; Anguera, M Teresa; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana

    2015-01-01

    Observational methodology is one of the most suitable research designs for evaluating fidelity of implementation, especially in complex interventions. However, the conduct and reporting of observational studies is hampered by the absence of specific guidelines, such as those that exist for other evaluation designs. This lack of specific guidance poses a threat to the quality and transparency of these studies and also constitutes a considerable publication hurdle. The aim of this study thus was to draw up a set of proposed guidelines for reporting evaluations based on observational methodology. The guidelines were developed by triangulating three sources of information: observational studies performed in different fields by experts in observational methodology, reporting guidelines for general studies and studies with similar designs to observational studies, and proposals from experts in observational methodology at scientific meetings. We produced a list of guidelines grouped into three domains: intervention and expected outcomes, methods, and results. The result is a useful, carefully crafted set of simple guidelines for conducting and reporting observational studies in the field of program evaluation.

  14. [Methodological Aspects of the Sampling Design for the 2015 National Mental Health Survey].

    PubMed

    Rodríguez, Nelcy; Rodríguez, Viviana Alejandra; Ramírez, Eugenia; Cediel, Sandra; Gil, Fabián; Rondón, Martín Alonso

    2016-12-01

    The WHO has encouraged the development, implementation and evaluation of policies related to mental health all over the world. In Colombia, within this framework and promoted by the Ministry of Health and Social Protection, as well as being supported by Colciencias, the fourth National Mental Health Survey (NMHST) was conducted as an observational cross-sectional study. Following the corresponding guidelines and sampling design, a summary of the methodology used for its sampling process is presented. The fourth NMHST used the Homes Master Sample for Studies in Health from the National System of Studies and Population Surveys for Health to calculate its sample. This Master Sample was developed and implemented in 2013 by the Ministry of Social Protection. The study included the non-institutionalised civilian population divided into four age groups: children aged 7-11 years, adolescents aged 12-17 years, adults aged 18-44 years, and adults aged 45 years or older. The sample size calculation was based on the prevalences reported in other studies for the outcomes of mental disorders, depression, suicide, associated morbidity, and alcohol use. A probabilistic, cluster, stratified, multistage selection process was used. Expansion factors to the total population were calculated. A total of 15,351 completed surveys were collected, distributed across the age groups as follows: 2727 for 7-11 years, 1754 for 12-17 years, 5889 for 18-44 years, and 4981 for ≥45 years. All the surveys were distributed over five regions: Atlantic, Oriental, Bogotá, Central and Pacific. A sufficient number of surveys were collected in this study to obtain a more precise approximation of mental problems and disorders at the regional and national level. Copyright © 2016 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.
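
    The expansion-factor logic of a stratified multistage design can be illustrated as follows: each respondent's weight is the inverse of the product of the stage-wise selection probabilities, and population estimates are weighted accordingly. The probabilities and outcomes below are hypothetical, not NMHST data.

```python
# Expansion (sampling) weights for a multistage design and a weighted estimate.
import numpy as np

# Selection probability per respondent = product of the stage probabilities
p_cluster = np.array([0.02, 0.02, 0.05, 0.05])    # probability the cluster was chosen
p_household = np.array([0.10, 0.10, 0.08, 0.08])  # probability within the cluster
p_person = np.array([0.50, 0.25, 0.50, 0.25])     # probability within the household

weight = 1.0 / (p_cluster * p_household * p_person)   # expansion factor

# Hypothetical binary outcome (1 = reports the disorder of interest)
y = np.array([1, 0, 0, 1])

weighted_prevalence = np.sum(weight * y) / np.sum(weight)
print("expansion factors:", np.round(weight, 1))
print("weighted prevalence estimate:", round(float(weighted_prevalence), 3))
```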

  15. A methodology for the efficient integration of transient constraints in the design of aircraft dynamic systems

    NASA Astrophysics Data System (ADS)

    Phan, Leon L.

    The motivation behind this thesis mainly stems from previous work performed at Hispano-Suiza (Safran Group) in the context of the European research project "Power Optimised Aircraft". Extensive testing on the COPPER Bird RTM, a test rig designed to characterize aircraft electrical networks, demonstrated the relevance of transient regimes in the design and development of dynamic systems. Transient regimes experienced by dynamic systems may have severe impacts on the operation of the aircraft. For example, the switching on of a high electrical load might cause a network voltage drop inducing a loss of power available to critical aircraft systems. These transient behaviors are thus often regulated by dynamic constraints, requiring the dynamic signals to remain within bounds whose values vary with time. The verification of these peculiar types of constraints, which generally requires high-fidelity time-domain simulation, intervenes late in the system development process, thus potentially causing costly design iterations. The research objective of this thesis is to develop a methodology that integrates the verification of dynamic constraints in the early specification of dynamic systems. In order to circumvent the inefficiencies of time-domain simulation, multivariate dynamic surrogate models of the original time-domain simulation models are generated, building on a nonlinear system identification technique using wavelet neural networks (or wavenets), which allow the multiscale nature of transient signals to be captured. However, training multivariate wavenets can become computationally prohibitive as the number of design variables increases. Therefore, an alternate approach is formulated, in which dynamic surrogate models using sigmoid-based neural networks are used to emulate the transient behavior of the envelopes of the time-domain response. Thus, in order to train the neural network, the envelopes are extracted by first separating the scales of the dynamic response

  16. VIII. THE PAST, PRESENT, AND FUTURE OF DEVELOPMENTAL METHODOLOGY.

    PubMed

    Little, Todd D; Wang, Eugene W; Gorrall, Britt K

    2017-06-01

    This chapter selectively reviews the evolution of quantitative practices in the field of developmental methodology. The chapter begins with an overview of the past in developmental methodology, discussing the implementation and dissemination of latent variable modeling and, in particular, longitudinal structural equation modeling. It then turns to the present state of developmental methodology, highlighting current methodological advances in the field. Additionally, this section summarizes ample quantitative resources, ranging from key quantitative methods journal articles to the various quantitative methods training programs and institutes. The chapter concludes with the future of developmental methodology and puts forth seven future innovations in the field. The innovations discussed span the topics of measurement, modeling, temporal design, and planned missing data designs. Lastly, the chapter closes with a brief overview of advanced modeling techniques such as continuous time models, state space models, and the application of Bayesian estimation in the field of developmental methodology. © 2017 The Society for Research in Child Development, Inc.

  17. A methodology for system-of-systems design in support of the engineering team

    NASA Astrophysics Data System (ADS)

    Ridolfi, G.; Mooij, E.; Cardile, D.; Corpino, S.; Ferrari, G.

    2012-04-01

    Space missions have experienced a trend of increasing complexity in the last decades, resulting in the design of very complex systems formed by many elements and sub-elements working together to meet the requirements. In a classical approach, especially in a company environment, the two steps of design-space exploration and optimization are usually performed by experts inferring on major phenomena, making assumptions and doing some trial-and-error runs on the available mathematical models. This is done especially in the very early design phases where most of the costs are locked-in. With the objective of supporting the engineering team and the decision-makers during the design of complex systems, the authors developed a modelling framework for a particular category of complex, coupled space systems called System-of-Systems. Once modelled, the System-of-Systems is solved using a computationally cheap parametric methodology, named the mixed-hypercube approach, based on the utilization of a particular type of fractional factorial design-of-experiments, and analysis of the results via global sensitivity analysis and response surfaces. As an applicative example, a system-of-systems of a hypothetical human space exploration scenario for the support of a manned lunar base is presented. The results demonstrate that using the mixed-hypercube to sample the design space, an optimal solution is reached with a limited computational effort, providing support to the engineering team and decision makers thanks to sensitivity and robustness information. The analysis of the system-of-systems model that was implemented shows that the logistic support of a human outpost on the Moon for 15 years is still feasible with currently available launcher classes. The results presented in this paper have been obtained in cooperation with Thales Alenia Space—Italy, in the framework of a regional programme called STEPS. STEPS—Sistemi e Tecnologie per l'EsPlorazione Spaziale is a research

  18. Factors Influencing Employee Learning in Small Businesses

    ERIC Educational Resources Information Center

    Coetzer, Alan; Perry, Martin

    2008-01-01

    Purpose: The purpose of this research is to identify key factors influencing employee learning from the perspective of owners/managers. Design/methodology/approach: Data were gathered from owners/managers in a total of 27 small manufacturing and services firms through interviews and analysed using content analytic procedures. Findings: The…

  19. Time-oriented experimental design method to optimize hydrophilic matrix formulations with gelation kinetics and drug release profiles.

    PubMed

    Shin, Sangmun; Choi, Du Hyung; Truong, Nguyen Khoa Viet; Kim, Nam Ah; Chu, Kyung Rok; Jeong, Seong Hoon

    2011-04-04

    A new experimental design methodology was developed by integrating the response surface methodology and the time series modeling. The major purposes were to identify significant factors in determining swelling and release rate from matrix tablets and their relative factor levels for optimizing the experimental responses. Properties of tablet swelling and drug release were assessed with ten factors and two default factors, a hydrophilic model drug (terazosin) and magnesium stearate, and compared with target values. The selected input control factors were arranged in a mixture simplex lattice design with 21 experimental runs. The obtained optimal settings for gelation were PEO, LH-11, Syloid, and Pharmacoat with weight ratios of 215.33 (88.50%), 5.68 (2.33%), 19.27 (7.92%), and 3.04 (1.25%), respectively. The optimal settings for drug release were PEO and citric acid with weight ratios of 191.99 (78.91%) and 51.32 (21.09%), respectively. Based on the results of matrix swelling and drug release, the optimal solutions, target values, and validation experiment results over time were similar and showed consistent patterns with very small biases. The experimental design methodology could be a very promising experimental design method to obtain maximum information with limited time and resources. It could also be very useful in formulation studies by providing a systematic and reliable screening method to characterize significant factors in the sustained release matrix tablet. Copyright © 2011 Elsevier B.V. All rights reserved.
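
    To illustrate the mixture-design idea, the sketch below enumerates a {q, m} simplex-lattice design, in which every run is a set of component proportions on a grid of 1/m that sums to one. The component names are taken from the abstract for readability, but the actual 21-run design and its constraints are not reproduced here.

```python
# Enumerate a {q, m} simplex-lattice mixture design.
from itertools import combinations_with_replacement

def simplex_lattice(q, m):
    """All mixture points of a {q, m} simplex-lattice design."""
    points = set()
    for combo in combinations_with_replacement(range(q), m):
        levels = tuple(combo.count(i) / m for i in range(q))
        points.add(levels)
    return sorted(points)

components = ["PEO", "LH-11", "Syloid", "Pharmacoat"]
design = simplex_lattice(q=len(components), m=2)   # {4, 2} lattice -> 10 runs
for run in design:
    print({c: round(x, 2) for c, x in zip(components, run)})
```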

  20. Impact of volunteer-related and methodology-related factors on the reproducibility of brachial artery flow-mediated vasodilation: analysis of 672 individual repeated measurements.

    PubMed

    van Mil, Anke C C M; Greyling, Arno; Zock, Peter L; Geleijnse, Johanna M; Hopman, Maria T; Mensink, Ronald P; Reesink, Koen D; Green, Daniel J; Ghiadoni, Lorenzo; Thijssen, Dick H

    2016-09-01

    Brachial artery flow-mediated dilation (FMD) is a popular technique to examine endothelial function in humans. Identifying volunteer and methodological factors related to variation in FMD is important to improve measurement accuracy and applicability. Volunteer-related and methodology-related parameters were collected in 672 volunteers from eight affiliated centres worldwide who underwent repeated measures of FMD. All centres adopted contemporary expert-consensus guidelines for FMD assessment. After calculating the coefficient of variation (%) of the FMD for each individual, we constructed quartiles (n = 168 per quartile). Based on two regression models (volunteer-related factors and methodology-related factors), statistically significant components of these two models were added to a final regression model (calculated as β-coefficient and R). This allowed us to identify factors that independently contributed to the variation in FMD%. The median coefficient of variation was 17.5%, with healthy volunteers demonstrating a coefficient of variation of 9.3%. Regression models revealed age (β = 0.248, P < 0.001), hypertension (β = 0.104, P < 0.001), dyslipidemia (β = 0.331, P < 0.001), time between measurements (β = 0.318, P < 0.001), lab experience (β = -0.133, P < 0.001) and baseline FMD% (β = 0.082, P < 0.05) as contributors to the coefficient of variation. After including all significant factors in the final model, we found that time between measurements, hypertension, baseline FMD% and lab experience with FMD independently predicted brachial artery variability (total R = 0.202). Although FMD% showed good reproducibility, larger variation was observed in conditions with longer time between measurements, hypertension, less experience and lower baseline FMD%. Accounting for these factors may improve FMD% variability.
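
    The core reproducibility metric is simple to compute: a per-volunteer coefficient of variation over repeated FMD% measurements, followed by quartile grouping. The sketch below uses simulated placeholder data, not the study cohort.

```python
# Per-volunteer coefficient of variation (%) of repeated measurements and quartiles.
import numpy as np

rng = np.random.default_rng(1)

# Two repeated FMD% measurements for each of 672 hypothetical volunteers
n = 672
true_fmd = np.clip(rng.normal(6.0, 2.0, n), 1.0, None)
visit1 = true_fmd + rng.normal(0, 0.9, n)
visit2 = true_fmd + rng.normal(0, 0.9, n)

pairs = np.column_stack([visit1, visit2])
cv = 100 * pairs.std(axis=1, ddof=1) / pairs.mean(axis=1)   # per-volunteer CV (%)

print("median CV (%):", round(float(np.median(cv)), 1))
edges = np.quantile(cv, [0.25, 0.5, 0.75])
quartile = np.digitize(cv, edges)          # groups 0..3, roughly 168 volunteers each
print("volunteers per quartile:", np.bincount(quartile))
```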

  1. A Design Pattern for Decentralised Decision Making

    PubMed Central

    Valentini, Gabriele; Fernández-Oto, Cristian; Dorigo, Marco

    2015-01-01

    The engineering of large-scale decentralised systems requires sound methodologies to guarantee the attainment of the desired macroscopic system-level behaviour given the microscopic individual-level implementation. While a general-purpose methodology is currently out of reach, specific solutions can be given to broad classes of problems by means of well-conceived design patterns. We propose a design pattern for collective decision making grounded on experimental/theoretical studies of the nest-site selection behaviour observed in honeybee swarms (Apis mellifera). The way in which honeybee swarms arrive at consensus is fairly well-understood at the macroscopic level. We provide formal guidelines for the microscopic implementation of collective decisions to quantitatively match the macroscopic predictions. We discuss implementation strategies based on both homogeneous and heterogeneous multiagent systems, and we provide means to deal with spatial and topological factors that have a bearing on the micro-macro link. Finally, we exploit the design pattern in two case studies that showcase the viability of the approach. Besides engineering, such a design pattern can prove useful for a deeper understanding of decision making in natural systems thanks to the inclusion of individual heterogeneities and spatial factors, which are often disregarded in theoretical modelling. PMID:26496359

  2. Methodological issues in the design of a rheumatoid arthritis activity score and its cut-offs.

    PubMed

    Collignon, Olivier

    2014-01-01

    Activity of rheumatoid arthritis (RA) can be evaluated using several scoring scales based on clinical features. The most widely used one is the Disease Activity Score involving 28 joint counts (DAS28) for which cut-offs were proposed to help physicians classify patients. However, inaccurate scoring can lead to inappropriate medical decisions. In this article some methodological issues in the design of such a score and its cut-offs are highlighted in order to further propose a strategy to overcome them. As long as the issues reviewed in this article are not addressed, results of studies based on standard disease activity scores such as DAS28 should be considered with caution.

  3. A transonic-small-disturbance wing design methodology

    NASA Technical Reports Server (NTRS)

    Phillips, Pamela S.; Waggoner, Edgar G.; Campbell, Richard L.

    1988-01-01

    An automated transonic design code has been developed which modifies an initial airfoil or wing in order to generate a specified pressure distribution. The design method uses an iterative approach that alternates between a potential-flow analysis and a design algorithm that relates changes in surface pressure to changes in geometry. The analysis code solves an extended small-disturbance potential-flow equation and can model a fuselage, pylons, nacelles, and a winglet in addition to the wing. A two-dimensional option is available for airfoil analysis and design. Several two- and three-dimensional test cases illustrate the capabilities of the design code.
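
    The analyze/redesign iteration described above can be caricatured with a one-dimensional toy in which a stub "analysis" returns a pressure coefficient proportional to the local surface slope and the design step integrates the pressure error back into a geometry correction. This is an assumption-laden sketch of the loop structure only; the real transonic-small-disturbance solver and design algorithm are far richer.

```python
# Toy analyze/redesign loop: drive the surface pressure toward a target
# distribution by converting the pressure error into geometry changes.
import numpy as np

x = np.linspace(0.0, 1.0, 51)                 # chordwise stations
dx = x[1] - x[0]
geometry = 0.05 * np.sin(np.pi * x)           # initial surface shape (hypothetical)
x_mid = 0.5 * (x[:-1] + x[1:])
cp_target = -0.4 * np.sin(np.pi * x_mid)      # specified pressure on the intervals

def analyze(geom):
    """Toy analysis stub: Cp proportional to the local surface slope."""
    return -2.0 * np.diff(geom) / dx

relax = 0.5                                    # under-relaxation of geometry updates
for iteration in range(100):
    residual = cp_target - analyze(geometry)
    if np.max(np.abs(residual)) < 1e-6:
        break
    # Design step: turn the pressure error into a slope (hence geometry) change
    correction = np.concatenate([[0.0], np.cumsum(-0.5 * residual) * dx])
    geometry = geometry + relax * correction

print(f"stopped at iteration {iteration}, "
      f"max |Cp error| = {np.max(np.abs(residual)):.2e}")
```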

  4. Human perception testing methodology for evaluating EO/IR imaging systems

    NASA Astrophysics Data System (ADS)

    Graybeal, John J.; Monfort, Samuel S.; Du Bosq, Todd W.; Familoni, Babajide O.

    2018-04-01

    The U.S. Army's RDECOM CERDEC Night Vision and Electronic Sensors Directorate (NVESD) Perception Lab is tasked with supporting the development of sensor systems for the U.S. Army by evaluating human performance of emerging technologies. Typical research questions involve detection, recognition and identification as a function of range, blur, noise, spectral band, image processing techniques, image characteristics, and human factors. NVESD's Perception Lab provides an essential bridge between the physics of the imaging systems and the performance of the human operator. In addition to quantifying sensor performance, perception test results can also be used to generate models of human performance and to drive future sensor requirements. The Perception Lab seeks to develop and employ scientifically valid and efficient perception testing procedures within the practical constraints of Army research, including rapid development timelines for critical technologies, unique guidelines for ethical testing of Army personnel, and limited resources. The purpose of this paper is to describe NVESD Perception Lab capabilities, recent methodological improvements designed to align our methodology more closely with scientific best practice, and to discuss goals for future improvements and expanded capabilities. Specifically, we discuss modifying our methodology to improve training, to account for human fatigue, to improve assessments of human performance, and to increase experimental design consultation provided by research psychologists. Ultimately, this paper outlines a template for assessing human perception and overall system performance related to EO/IR imaging systems.

  5. Methodology issues in implementation science.

    PubMed

    Newhouse, Robin; Bobay, Kathleen; Dykes, Patricia C; Stevens, Kathleen R; Titler, Marita

    2013-04-01

    Putting evidence into practice at the point of care delivery requires an understanding of implementation strategies that work, in what context and how. To identify methodological issues in implementation science using 4 studies as cases and make recommendations for further methods development. Four cases are presented and methodological issues identified. For each issue raised, evidence on the state of the science is described. Issues in implementation science identified include diverse conceptual frameworks, potential weaknesses in pragmatic study designs, and the paucity of standard concepts and measurement. Recommendations to advance methods in implementation include developing a core set of implementation concepts and metrics, generating standards for implementation methods including pragmatic trials, mixed methods designs, complex interventions and measurement, and endorsing reporting standards for implementation studies.

  6. Participants' Perceptions of the Instructional Design of an Online Professional Development Module for Teaching English Language Learners: A Q Methodology Study

    ERIC Educational Resources Information Center

    Collins, Linda J.

    2009-01-01

    Using Q methodology, thirteen online instructors shared subjective opinions about the instructional design of an online professional development module intended to provide teachers with basic information for supporting English language learners academically. The researcher selected a set of thirty-six sort items comprised of screen shots taken…

  7. Methodology of strength calculation under alternating stresses using the diagram of limiting amplitudes

    NASA Astrophysics Data System (ADS)

    Konovodov, V. V.; Valentov, A. V.; Kukhar, I. S.; Retyunskiy, O. Yu; Baraksanov, A. S.

    2016-08-01

    The work proposes an algorithm for strength calculation under alternating stresses using the developed methodology for constructing the diagram of limiting stresses. The overall safety factor is defined by the suggested formula. Strength calculations of components working under alternating stresses are, in the great majority of cases, conducted as checking calculations. This is primarily because the overall fatigue strength reduction factor (Kσg or Kτg) can only be chosen approximately during component design, as at this stage of the work the engineer has only an approximate idea of the component's size and shape.
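
    One widely used textbook form of this calculation (not necessarily the exact formula proposed by the authors) computes separate safety factors for normal and shear stresses from the limiting-amplitude line and then combines them. The sketch below uses placeholder values.

```python
# Classical safety-factor calculation under alternating stresses
# (limiting-amplitude line form; all numbers are placeholders).
import math

def safety_factor_normal(sigma_a, sigma_m, sigma_minus1, K_sigma, psi_sigma):
    """Safety factor for alternating normal stresses."""
    return sigma_minus1 / (K_sigma * sigma_a + psi_sigma * sigma_m)

def safety_factor_shear(tau_a, tau_m, tau_minus1, K_tau, psi_tau):
    """Safety factor for alternating shear stresses."""
    return tau_minus1 / (K_tau * tau_a + psi_tau * tau_m)

def overall_safety_factor(n_sigma, n_tau):
    """Combined safety factor for simultaneous normal and shear loading."""
    return n_sigma * n_tau / math.sqrt(n_sigma**2 + n_tau**2)

# Example with placeholder values (MPa)
n_sigma = safety_factor_normal(sigma_a=60, sigma_m=20, sigma_minus1=250,
                               K_sigma=2.0, psi_sigma=0.1)
n_tau = safety_factor_shear(tau_a=30, tau_m=30, tau_minus1=150,
                            K_tau=1.8, psi_tau=0.05)
print(round(n_sigma, 2), round(n_tau, 2),
      round(overall_safety_factor(n_sigma, n_tau), 2))
```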

  8. Methodology for the structural design of single spoke accelerating cavities at Fermilab

    NASA Astrophysics Data System (ADS)

    Passarelli, Donato; Wands, Robert H.; Merio, Margherita; Ristori, Leonardo

    2016-10-01

    Fermilab is planning to upgrade its accelerator complex to deliver a more powerful and intense proton-beam for neutrino experiments. In the framework of the so-called Proton Improvement Plan-II (PIP-II), we are designing and developing a cryomodule containing superconducting accelerating cavities, the Single Spoke Resonators of type 1 (SSR1). In this paper, we present the sequence of analysis and calculations performed for the structural design of these cavities, using the rules of the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code (BPVC). The lack of an accepted procedure for addressing the design, fabrication, and inspection of such unique pressure vessels makes the task demanding and challenging every time. Several factors such as exotic materials, unqualified brazing procedures, limited nondestructive examination, and the general R&D nature of these early generations of cavity design, conspire to make it impractical to obtain full compliance with all ASME BPVC requirements. However, the presented approach allowed us to validate the design of this new generation of single spoke cavities with values of maximum allowable working pressure that exceeds the safety requirements. This set of rules could be used as a starting point for the structural design and development of similar objects.

  9. Longitudinal Emergency Medical Technician Attributes and Demographic Study (LEADS) Design and Methodology.

    PubMed

    Levine, Roger

    2016-12-01

    The objective of this study is to describe the Longitudinal Emergency Medical Technician (EMT) Attributes and Demographic Study (LEADS) design, instrument development, pilot testing, sampling procedures, and data collection methodology. Response rates are provided, along with results of follow-up surveys of non-responders (NRs) and a special survey of Emergency Medical Services (EMS) professionals who were not nationally certified. Annual surveys from 1999 to 2008 were mailed out to a random, stratified sample of nationally registered EMT-Basics and Paramedics. Survey weights were developed to reflect each respondent's probability of selection. A special survey of NRs was mailed out to individuals who did not respond to the annual survey to estimate the probable extent and direction of response bias. Individuals who indicated they were no longer in the profession were mailed a special exit survey to determine their reasons for leaving EMS. Given the large number of comparisons between NR and regular (annual) survey respondents, it is not surprising that some statistically significant differences were found. In general, there were few differences. However, NRs tended to report higher annual EMS incomes, were younger, healthier, more physically fit, and were more likely to report that they were not practicing EMS. Comparisons of the nationally certified EMS professionals with EMS professionals who were not nationally certified indicated that nationally certified EMS providers were younger, had less EMS experience, earned less, were more likely to be female and work for private EMS services, and less likely to work for fire-based services. These differences may reflect state and local policy and practice, since many states and local agencies do not require maintenance of national certification to practice. When these differences were controlled for statistically, there were few systematic differences between non-nationally certified and nationally

  10. A Quantitative Examination of Critical Success Factors Comparing Agile and Waterfall Project Management Methodologies

    ERIC Educational Resources Information Center

    Pedersen, Mitra

    2013-01-01

    This study investigated the rate of success for IT projects using agile and standard project management methodologies. Any successful project requires the use of a project methodology. Specifically, large projects require formal project management methodologies or models, which establish a blueprint of processes and project planning activities. This…

  11. 49 CFR 192.111 - Design factor (F) for steel pipe.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Pipe Design § 192.111 Design factor (F... street, or a railroad; (3) Is supported by a vehicular, pedestrian, railroad, or pipeline bridge; or (4...

  12. Design Methodology for Automated Construction Machines

    DTIC Science & Technology

    1987-12-11

    ...are discussed along with the design of a pair of machines which automate framework installation. Preliminary analysis and testing indicate that these

  13. Peer Review of a Formal Verification/Design Proof Methodology

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The role of formal verification techniques in system validation was examined. The value and the state of the art of performance proving for fault-tolerant computers were assessed. The investigation, development, and evaluation of performance proving tools were reviewed. The technical issues related to proof methodologies are examined and summarized.

  14. Human factor roles in design of teleoperator systems

    NASA Technical Reports Server (NTRS)

    Janow, C.; Malone, T. B.

    1973-01-01

    Teleoperator systems are considered, giving attention to types of teleoperators, a manned space vehicle attached manipulator, a free-flying teleoperator, a surface exploration roving vehicle, the human factors role in total system design, the manipulator system, the sensor system, the communication system, the control system, and the mobility system. The role of human factors in the development of teleoperator systems is also discussed, taking into account visual systems, an operator control station, and the manipulators.

  15. Prevalence of chronic kidney disease and comorbidities in isolated African descent communities (PREVRENAL): methodological design of a cohort study.

    PubMed

    Salgado-Filho, Natalino; Lages, Joyce Santos; Brito, Dyego José; Salgado, João Victor; Silva, Gyl Eanes; Santos, Alcione Miranda; Monteiro-Júnior, Francisco Chagas; Santos, Elisangela Milhomen; Silva, Antônio Augusto; Araújo, Denizar Vianna; Sesso, Ricardo Castro

    2018-02-26

    Chronic kidney disease (CKD) is considered a serious public health problem, both in Brazil and worldwide, with an increasing number of cases observed in recent years. In particular, CKD has been reported to be highly prevalent in people of African descent. However, Brazil lacks data from early-stage CKD population studies, and the prevalence of CKD is unknown for both the overall population and the population of African descent. Hence, the present study aims to estimate the prevalence of early-stage CKD and its associated risk factors in African-Brazilians from isolated African-descent communities. Herein, the detailed methodological design of the study is described. This population-based, prospective, longitudinal cohort study (PREVRENAL) is performed in three stages: in the first, clinical, nutritional, and anthropometric evaluations, measurements of serum and urinary markers, and examinations of comorbidities were performed. In the second, repeated examinations of individuals with CKD, systemic arterial hypertension, and/or diabetes mellitus, image screening, and cardiac risk assessment were performed. In the third, long-term monitoring of all selected individuals will be conducted (ongoing). Using probability sampling, 1539 individuals from 32 communities were selected. CKD was defined as a glomerular filtration rate (GFR) ≤60 mL/min/1.73 m² and albuminuria >30 mg/day. This study proposes to identify and monitor individuals with and without reduced GFR and high albuminuria in isolated populations of African descendants in Brazil. As there are currently no specific recommendations for detecting CKD in African descendants, four equations for estimating the GFR based on serum creatinine and cystatin C were used and will be retrospectively compared. The present report describes the characteristics of the target population, the selection of individuals, and the detection of a population at risk, along with the imaging, clinical, and laboratory methodologies used. The first and second stages have been concluded and

  16. Human factors aspects of control room design: Guidelines and annotated bibliography

    NASA Technical Reports Server (NTRS)

    Mitchell, C. M.; Stewart, L. J.; Bocast, A. K.; Murphy, E. D.

    1982-01-01

    A human factors analysis of the workstation design for the Earth Radiation Budget Satellite mission operation room is discussed. The relevance of anthropometry, design rules, environmental design goals, and the social-psychological environment are discussed.

  17. Advanced power analysis methodology targeted to the optimization of a digital pixel readout chip design and its critical serial powering system

    NASA Astrophysics Data System (ADS)

    Marconi, S.; Orfanelli, S.; Karagounis, M.; Hemperek, T.; Christiansen, J.; Placidi, P.

    2017-02-01

    A dedicated power analysis methodology, based on modern digital design tools and integrated with the VEPIX53 simulation framework developed within the RD53 collaboration, is being used to guide vital choices for the design and optimization of the next generation ATLAS and CMS pixel chips and their critical serial powering circuit (shunt-LDO). Power consumption is studied at different stages of the design flow under different operating conditions. Significant effort is put into extensive investigations of dynamic power variations in relation to the decoupling seen by the powering network. Shunt-LDO simulations are also reported to prove reliability at the system level.

  18. Developmental Factors Associated with the Formation of the Antisocial Personality: A Literature Review.

    ERIC Educational Resources Information Center

    Cannon, Kent Wesley

    Research on factors which contribute to the development of antisocial personality disorder is reviewed. Methodological issues are critiqued, including major assessment instruments and frequently used research designs. Factors which current research indicates might lead to the continuation of antisocial behavior from childhood into adulthood are…

  19. Studying the Education of Educators: Methodology.

    ERIC Educational Resources Information Center

    Sirotnik, Kenneth A.

    1988-01-01

    Describes the methodology and research design of SEE, the study of the Education of Educators. The approach is multimethodological, exploratory, descriptive, and evaluative. The research design permits examination of working assumptions and concentration on the individual site--the college, the education departments, and specific programs within…

  20. The Promise of Virtual Teams: Identifying Key Factors in Effectiveness and Failure

    ERIC Educational Resources Information Center

    Horwitz, Frank M.; Bravington, Desmond; Silvis, Ulrik

    2006-01-01

    Purpose: The aim of the investigation is to identify enabling and disenabling factors in the development and operation of virtual teams; to evaluate the importance of factors such as team development, cross-cultural variables, leadership, communication and social cohesion as contributors to virtual team effectiveness. Design/methodology/approach:…