Science.gov

Sample records for progressive design methodology

  1. Progress in the Development of a Nozzle Design Methodology for Pulsed Detonation Engines

    NASA Technical Reports Server (NTRS)

    Leary, B. A.; Waltrup, P. J.; Rice, T.; Cybyk, B. Z.

    2002-01-01

    The Johns Hopkins University Applied Physics Laboratory (JHU/APL), in support of the NASA Glenn Research Center (NASA GRC), is investigating performance methodologies and system integration issues related to Pulsed Detonation Engine (PDE) nozzles. The primary goal of this ongoing effort is to develop design and performance assessment methodologies applicable to PDE exit nozzle(s). APL is currently focusing its efforts on a common plenum chamber design that collects the exhaust products from multiple PDE tubes prior to expansion in a single converging-diverging exit nozzle. To accomplish this goal, a time-dependent, quasi-one-dimensional analysis for determining the flow properties in and through a single plenum and exhaust nozzle is underway. In support of these design activities, parallel modeling efforts using commercial Computational Fluid Dynamics (CFD) software are on-going. These efforts include both two and three-dimensional as well as steady and time-dependent computations to assess the flow in and through these devices. This paper discusses the progress in developing this nozzle design methodology.
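
    The governing equations are not reproduced in the abstract; for orientation, a standard time-dependent, quasi-one-dimensional formulation of the kind described (flow quantities averaged over the local duct cross-section A(x) of the plenum and nozzle) can be written as follows. This is a generic textbook form, not necessarily the exact system used by APL.

    ```latex
    % Quasi-one-dimensional Euler equations with area variation A(x)
    % (generic form; the paper's discretization and loss terms are not given here).
    \[
    \frac{\partial (\rho A)}{\partial t} + \frac{\partial (\rho u A)}{\partial x} = 0
    \]
    \[
    \frac{\partial (\rho u A)}{\partial t} + \frac{\partial \left[(\rho u^{2} + p)A\right]}{\partial x} = p\,\frac{dA}{dx}
    \]
    \[
    \frac{\partial (\rho E A)}{\partial t} + \frac{\partial \left[(\rho E + p)\,u A\right]}{\partial x} = 0,
    \qquad E = e + \tfrac{1}{2}u^{2}, \qquad p = (\gamma - 1)\rho e
    \]
    ```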

  2. Establishing Equivalence: Methodological Progress in Group-Matching Design and Analysis

    ERIC Educational Resources Information Center

    Kover, Sara T.; Atwood, Amy K.

    2013-01-01

    This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs used in behavioral research on cognition and…

  3. Autonomous spacecraft design methodology

    SciTech Connect

    Divita, E.L.; Turner, P.R.

    1984-08-01

    A methodology for autonomous spacecraft design blends autonomy requirements with traditional mission requirements and assesses the impact of autonomy upon the total system resources available to support fault tolerance and automation. A baseline functional design can be examined for autonomy implementation impacts, and the costs, risk, and benefits of various options can be assessed. The result of the process is a baseline design that includes autonomous control functions.

  4. A Methodology of Analysis for Monitoring Treatment Progression with 19-Channel Z-Score Neurofeedback (19ZNF) in a Single-Subject Design.

    PubMed

    Krigbaum, Genomary; Wigton, Nancy L

    2015-09-01

    19-Channel Z-Score Neurofeedback (19ZNF) is a modality using 19-electrodes with real-time normative database z-scores, suggesting effective clinical outcomes in fewer sessions than traditional neurofeedback. Thus, monitoring treatment progression and clinical outcome is necessary. The area of focus in this study was a methodology of quantitative analysis for monitoring treatment progression and clinical outcome with 19ZNF. This methodology is noted as the Sites-of-Interest, which included repeated measures analyses of variance (rANOVA) and t-tests for z-scores; it was conducted on 10 cases in a single subject design. To avoid selection bias, the 10 sample cases were randomly selected from a pool of 17 cases that met the inclusion criteria. Available client outcome measures (including self-report) are briefly discussed. The results showed 90% of the pre-post comparisons moved in the targeted direction (z = 0) and of those, 96% (80% Bonferroni corrected) of the t-tests and 96% (91% Bonferroni corrected) of the rANOVAs were statistically significant; thus indicating a progression towards the mean in 15 or fewer 19ZNF sessions. All cases showed and reported improvement in all outcome measures (including quantitative electroencephalography assessment) at case termination.
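
    The abstract names the statistics (site-wise t-tests and rANOVAs on z-scores, Bonferroni corrected) but not how they are computed. The sketch below illustrates only the t-test portion under an assumed data layout (per-site arrays of pre- and post-treatment z-scores); the function and field names are invented and this is not the authors' Sites-of-Interest code.

    ```python
    # Hypothetical sketch of per-site paired t-tests with a Bonferroni-adjusted
    # alpha; not the published Sites-of-Interest implementation.
    from scipy import stats

    def sites_of_interest_ttests(pre, post, alpha=0.05):
        """pre, post: dicts mapping site name -> array of z-scores (assumed layout)."""
        alpha_bonf = alpha / len(pre)          # Bonferroni-corrected threshold
        results = {}
        for site in pre:
            res = stats.ttest_rel(pre[site], post[site])   # paired t-test
            mean_pre = sum(pre[site]) / len(pre[site])
            mean_post = sum(post[site]) / len(post[site])
            results[site] = {
                "t": res.statistic,
                "p": res.pvalue,
                "significant_bonferroni": res.pvalue < alpha_bonf,
                # did the site move toward the normative mean (z = 0)?
                "toward_mean": abs(mean_post) < abs(mean_pre),
            }
        return results
    ```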

  5. RAMCAD Design Methodology

    DTIC Science & Technology

    1993-05-01

    fault detection and isolation, and reduces...elements during the design process. Fault detection and isolation are simplified when an entire function can be assigned to a single hardware design element...defining the fault detection and isolation constraints and goals) are met or all of the test resources have been committed. Alternative resource

  6. Permanent magnet design methodology

    NASA Technical Reports Server (NTRS)

    Leupold, Herbert A.

    1991-01-01

    Design techniques developed for the exploitation of high energy magnetically rigid materials such as Sm-Co and Nd-Fe-B have resulted in a revolution in kind rather than in degree in the design of a variety of electron guidance structures for ballistic and aerospace applications. Salient examples are listed. Several prototype models were developed. These structures are discussed in some detail: permanent magnet solenoids, transverse field sources, periodic structures, and very high field structures.

  7. Solid lubrication design methodology

    NASA Technical Reports Server (NTRS)

    Aggarwal, B. B.; Yonushonis, T. M.; Bovenkerk, R. L.

    1984-01-01

    A single element traction rig was used to measure the traction forces at the contact of a ball against a flat disc at room temperature under combined rolling and sliding. The load and speed conditions were selected to match those anticipated for bearing applications in adiabatic diesel engines. The test program showed that the magnitude of traction forces were almost the same for all the lubricants tested; a lubricant should, therefore, be selected on the basis of its ability to prevent wear of the contact surfaces. Traction vs. slide/roll ratio curves were similar to those for liquid lubricants but the traction forces were an order of magnitude higher. The test data was used to derive equations to predict traction force as a function of contact stress and rolling speed. Qualitative design guidelines for solid lubricated concentrated contacts are proposed.
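
    The abstract says predictive equations for traction force as a function of contact stress and rolling speed were derived from the rig data, but does not give their form. The sketch below assumes a simple power-law form purely for illustration; it is not the paper's correlation.

    ```python
    # Illustrative curve fit: F = a * stress**b * speed**c (assumed form,
    # not the paper's equation), fitted by linear least squares on logarithms.
    import numpy as np

    def fit_traction_power_law(stress, speed, traction):
        """stress, speed, traction: 1-D arrays of measured values."""
        stress = np.asarray(stress, dtype=float)
        speed = np.asarray(speed, dtype=float)
        traction = np.asarray(traction, dtype=float)
        A = np.column_stack([np.ones_like(stress), np.log(stress), np.log(speed)])
        coeffs, *_ = np.linalg.lstsq(A, np.log(traction), rcond=None)
        a, b, c = np.exp(coeffs[0]), coeffs[1], coeffs[2]
        return a, b, c   # predict with: a * stress**b * speed**c
    ```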

  8. Unattended Monitoring System Design Methodology

    SciTech Connect

    Drayer, D.D.; DeLand, S.M.; Harmon, C.D.; Matter, J.C.; Martinez, R.L.; Smith, J.D.

    1999-07-08

    A methodology for designing Unattended Monitoring Systems starting at a systems level has been developed at Sandia National Laboratories. This proven methodology provides a template that describes the process for selecting and applying appropriate technologies to meet unattended system requirements, as well as providing a framework for development of both training courses and workshops associated with unattended monitoring. The design and implementation of unattended monitoring systems is generally intended to respond to some form of policy based requirements resulting from international agreements or domestic regulations. Once the monitoring requirements are established, a review of the associated process and its related facilities enables identification of strategic monitoring locations and development of a conceptual system design. The detailed design effort results in the definition of detection components as well as the supporting communications network and data management scheme. The data analysis then enables a coherent display of the knowledge generated during the monitoring effort. The resultant knowledge is then compared to the original system objectives to ensure that the design adequately addresses the fundamental principles stated in the policy agreements. Implementation of this design methodology will ensure that comprehensive unattended monitoring system designs provide appropriate answers to those critical questions imposed by specific agreements or regulations. This paper describes the main features of the methodology and discusses how it can be applied in real world situations.

  9. Design methodology of Dutch banknotes

    NASA Astrophysics Data System (ADS)

    de Heij, Hans A. M.

    2000-04-01

    Since the introduction of a design methodology for Dutch banknotes, the quality of Dutch paper currency has improved in more than one way. The methodology in question provides for (i) a design policy, which helps fix clear objectives; (ii) design management, to ensure smooth cooperation between the graphic designer, printer, papermaker and central bank; and (iii) a program of requirements, a banknote development guideline for all parties involved. This systematic approach enables an objective selection of design proposals, including security features. Furthermore, the project manager obtains regular feedback from the public by conducting market surveys. Each new design of a Netherlands Guilder banknote issued by the Nederlandsche Bank over the past 50 years has been an improvement on its predecessor in terms of value recognition, security and durability.

  10. MEIC Design Progress

    SciTech Connect

    Zhang, Y; Douglas, D; Hutton, A; Krafft, G A; Li, R; Lin, F; Morozov, V S; Nissen, E W; Pilat, F C; Satogata, T; Tennant, C; Terzic, B; Yunn, C; Barber, D P; Filatov, Y; Hyde, C; Kondratenko, A M; Manikonda, S L; Ostroumov, P N; Sullivan, M K

    2012-07-01

    This paper will report the recent progress in the conceptual design of MEIC, a high luminosity medium energy polarized ring-ring electron-ion collider at Jefferson lab. The topics and achievements that will be covered are design of the ion large booster and the ERL-circulator-ring-based electron cooling facility, optimization of chromatic corrections and dynamic aperture studies, schemes and tracking simulations of lepton and ion polarization in the figure-8 collider ring, and the beam-beam and electron cooling simulations. A proposal of a test facility for the MEIC electron cooler will also be discussed.

  11. Waste Package Design Methodology Report

    SciTech Connect

    D.A. Brownson

    2001-09-28

    The objective of this report is to describe the analytical methods and processes used by the Waste Package Design Section to establish the integrity of the various waste package designs, the emplacement pallet, and the drip shield. The scope of this report shall be the methodology used in criticality, risk-informed, shielding, source term, structural, and thermal analyses. The basic features and appropriateness of the methods are illustrated, and the processes are defined whereby input values and assumptions flow through the application of those methods to obtain designs that ensure defense-in-depth as well as satisfy requirements on system performance. Such requirements include those imposed by federal regulation, from both the U.S. Department of Energy (DOE) and U.S. Nuclear Regulatory Commission (NRC), and those imposed by the Yucca Mountain Project to meet repository performance goals. The report is to be used, in part, to describe the waste package design methods and techniques to be used for producing input to the License Application Report.

  12. Methodological Issues in Questionnaire Design.

    PubMed

    Song, Youngshin; Son, Youn Jung; Oh, Doonam

    2015-06-01

    The process of designing a questionnaire is complicated. Many questionnaires on nursing phenomena have been developed and used by nursing researchers. The purpose of this paper was to discuss questionnaire design and factors that should be considered when using existing scales. Methodological issues were discussed, such as factors in the design of questions, steps in developing questionnaires, wording and formatting methods for items, and administration methods. How to use existing scales, how to facilitate cultural adaptation, and how to prevent socially desirable responding were discussed. Moreover, the triangulation method in questionnaire development was introduced. Steps were recommended for designing questions such as appropriately operationalizing key concepts for the target population, clearly formatting response options, generating items and confirming final items through face or content validity, sufficiently piloting the questionnaire using item analysis, demonstrating reliability and validity, finalizing the scale, and training the administrator. Psychometric properties and cultural equivalence should be evaluated prior to administration when using an existing questionnaire and performing cultural adaptation. In the context of well-defined nursing phenomena, logical and systematic methods will contribute to the development of simple and precise questionnaires.

  13. Reliability based design optimization: Formulations and methodologies

    NASA Astrophysics Data System (ADS)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. A framework for performing reliability based design optimization under epistemic uncertainty is also developed
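
    For readers unfamiliar with the terminology, the nested (double-loop) formulation that the unilevel and decoupled methods reformulate can be stated, in a standard first-order reliability (FORM) setting, roughly as below; the exact statements in the dissertation may differ.

    ```latex
    % Nested RBDO, generic FORM statement (d: design variables,
    % u: standard-normal random variables, g_i: limit-state functions).
    \[
    \min_{d} \; f(d)
    \quad \text{subject to} \quad
    \beta_i(d) \ge \beta_i^{\mathrm{target}}, \qquad i = 1, \dots, m,
    \]
    % where each reliability index is itself the solution of an inner problem:
    \[
    \beta_i(d) = \min_{u} \; \lVert u \rVert
    \quad \text{subject to} \quad
    g_i(d, u) = 0 .
    \]
    % The unilevel approach replaces each inner problem by its first-order
    % optimality (KKT) conditions, yielding a single-level optimization.
    ```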

  14. Space Engineering Projects in Design Methodology

    NASA Technical Reports Server (NTRS)

    Crawford, R.; Wood, K.; Nichols, S.; Hearn, C.; Corrier, S.; DeKunder, G.; George, S.; Hysinger, C.; Johnson, C.; Kubasta, K.

    1993-01-01

    NASA/USRA is an ongoing sponsor of space design projects in the senior design courses of the Mechanical Engineering Department at The University of Texas at Austin. This paper describes the UT senior design sequence, focusing on the first-semester design methodology course. The philosophical basis and pedagogical structure of this course is summarized. A history of the Department's activities in the Advanced Design Program is then presented. The paper includes a summary of the projects completed during the 1992-93 Academic Year in the methodology course, and concludes with an example of two projects completed by student design teams.

  15. Assuring data transparency through design methodologies

    NASA Technical Reports Server (NTRS)

    Williams, Allen

    1990-01-01

    This paper addresses the role of design methodologies and practices in the assurance of technology transparency. The development of several subsystems on large, long life cycle government programs was analyzed to glean those characteristics in the design, development, test, and evaluation that precluded or enabled the insertion of new technology. The programs examined were Minuteman, DSP, B-1B, and space shuttle. All these were long life cycle, technology-intensive programs. The design methodologies (or lack thereof) and design practices for each were analyzed in terms of the success or failure in incorporating evolving technology. Common elements contributing to the success or failure were extracted and compared to current methodologies being proposed by the Department of Defense and NASA. The relevance of these practices to the design and deployment of Space Station Freedom was evaluated. In particular, appropriate methodologies now being used on the core development contract were examined.

  16. General Methodology for Designing Spacecraft Trajectories

    NASA Technical Reports Server (NTRS)

    Condon, Gerald; Ocampo, Cesar; Mathur, Ravishankar; Morcos, Fady; Senent, Juan; Williams, Jacob; Davis, Elizabeth C.

    2012-01-01

    A methodology for designing spacecraft trajectories in any gravitational environment within the solar system has been developed. The methodology facilitates modeling and optimization for problems ranging from that of a single spacecraft orbiting a single celestial body to that of a mission involving multiple spacecraft and multiple propulsion systems operating in gravitational fields of multiple celestial bodies. The methodology consolidates almost all spacecraft trajectory design and optimization problems into a single conceptual framework requiring solution of either a system of nonlinear equations or a parameter-optimization problem with equality and/or inequality constraints.
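
    As a point of reference for the "single conceptual framework" mentioned above, the parameter-optimization problem it reduces to has the generic nonlinear-programming form below; the notation is illustrative, not the tool's own.

    ```latex
    % Generic constrained parameter optimization for trajectory design
    % (p collects mission parameters: epochs, burn components, flyby
    % conditions, etc.; notation is illustrative).
    \[
    \min_{p} \; J(p)
    \quad \text{subject to} \quad
    h(p) = 0, \qquad g(p) \le 0 .
    \]
    % With no objective, the problem reduces to solving the nonlinear
    % system h(p) = 0 (e.g., targeting or shooting conditions).
    ```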

  17. A new methodology for hospital design.

    PubMed

    Mejia, Ana Maria Silva

    2013-08-01

    According to architect Ana Maria Silva Mejia, 'a new era for the design of hospitals in Guatemala has arrived', with considerable growth in interest in good healthcare facility design. Here, in a slightly adapted version of an article, 'A new methodology for design', first published in the IFHE (International Federation of Hospital Engineering) Digest 2012, she reports on the application of a new methodology designed to optimise efficient use of space, and clinical and other adjacencies, in a district hospital in the City of Zacapa. The system has subsequently been successfully applied to a number of other Guatemalan healthcare facilities.

  1. Gas turbine combustor design methodology

    SciTech Connect

    Rizk, N.K.; Mongia, H.C.

    1986-01-01

    The detailed representation of flow and combustion processes offered by multidimensional models and the predictive tool of the proven empirical correlations are combined to form a basis for a gas turbine combustor design method. Provisions are made to fully utilize the output of the analytical computations by evaluating the values of relevant parameters within subdivisions of liner sector. By this means, the impact of a systematic modification to the detail of dome swirlers and liner configuration is easily determined. A heat transfer calculation method that utilizes the variation in combustor parameters in the three dimensions and evaluates radiation flux components through a view factor is considered. In comparison with experimental data obtained for a typical production liner, the predictions of the developed method in regard to emission formation, combustion performance, and wall temperature are quite satisfactory.
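
    The radiation part of the heat-transfer calculation is not detailed in the abstract; the standard two-surface, view-factor exchange relation on which such a flux evaluation is typically built (stated here for diffuse black surfaces, a simplification of any real liner model) is:

    ```latex
    % Textbook view-factor exchange between two black surfaces; the paper's
    % actual gas/wall radiation and emissivity treatment is not reproduced here.
    \[
    \dot{Q}_{1 \to 2} = A_{1} F_{12} \, \sigma \left( T_{1}^{4} - T_{2}^{4} \right)
    \]
    ```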

  2. Applying Software Design Methodology to Instructional Design

    ERIC Educational Resources Information Center

    East, J. Philip

    2004-01-01

    The premise of this paper is that computer science has much to offer the endeavor of instructional improvement. Software design processes employed in computer science for developing software can be used for planning instruction and should improve instruction in much the same manner that design processes appear to have improved software. Techniques…

  3. Design methodology and projects for space engineering

    NASA Technical Reports Server (NTRS)

    Nichols, S.; Kleespies, H.; Wood, K.; Crawford, R.

    1993-01-01

    NASA/USRA is an ongoing sponsor of space design projects in the senior design course of the Mechanical Engineering Department at The University of Texas at Austin. This paper describes the UT senior design sequence, consisting of a design methodology course and a capstone design course. The philosophical basis of this sequence is briefly summarized. A history of the Department's activities in the Advanced Design Program is then presented. The paper concludes with a description of the projects completed during the 1991-92 academic year and the ongoing projects for the Fall 1992 semester.

  4. Monitoring Progress towards Education for All: A Methodological Guidebook.

    ERIC Educational Resources Information Center

    United Nations Educational, Scientific and Cultural Organization, Bangkok (Thailand). Principal Regional Office for Asia and the Pacific.

    This guidebook, aimed at middle-level policy advisors and planners in education, suggests a methodology that can be used to monitor progress in the implementation of Education for All (EFA) programs regardless of the priorities and targets involved at the national level of the individual country. The first chapter discusses the decision to…

  5. Waste Package Component Design Methodology Report

    SciTech Connect

    D.C. Mecham

    2004-07-12

    This Executive Summary provides an overview of the methodology being used by the Yucca Mountain Project (YMP) to design waste packages and ancillary components. This summary information is intended for readers with general interest, but also provides technical readers a general framework surrounding a variety of technical details provided in the main body of the report. The purpose of this report is to document and ensure appropriate design methods are used in the design of waste packages and ancillary components (the drip shields and emplacement pallets). The methodology includes identification of necessary design inputs, justification of design assumptions, and use of appropriate analysis methods and computational tools. This design work is subject to "Quality Assurance Requirements and Description". The document is primarily intended for internal use and technical guidance for a variety of design activities. It is recognized that a wide audience including project management, the U.S. Department of Energy (DOE), the U.S. Nuclear Regulatory Commission, and others are interested to various levels of detail in the design methods; the report therefore covers a wide range of topics at varying levels of detail. Due to the preliminary nature of the design, readers can expect to encounter varied levels of detail in the body of the report. It is expected that technical information used as input to design documents will be verified and taken from the latest versions of reference sources given herein. This revision of the methodology report has evolved with changes in the waste package, drip shield, and emplacement pallet designs over many years and may be further revised as the design is finalized. Different components and analyses are at different stages of development. Some parts of the report are detailed, while other less detailed parts are likely to undergo further refinement. The design methodology is intended to provide designs that satisfy the safety and operational

  6. Hardware design methodology for efficient reuse

    SciTech Connect

    Seepold, R.; Kunzmann, A.; Rosenstiel, W.

    1996-12-31

    The approach presented offers a design methodology for efficient reuse that is based on a Reuse Management System (RMS) and a detailed analysis of component reuse. After the presentation of the current level of work being done in this field, RMS is introduced and a basic model is shown to describe fundamental mechanisms before design for reuse techniques can be introduced. In contrast to conventional reuse approaches, which are restricted to specific support, this new approach bridges the gap between design and reuse integration. The new methodology incorporates RMS requirements and it achieves several initial targets requested for a powerful system to provide comprehensive reuse. Based on the object-oriented internal data model and the sketched architecture, easy access via common Internet services is offered, and therefore, quick access to reuse data is possible. In summary, the approach helps to reduce long term development costs, and therefore, it is an innovative way to reach the objectives of efficient cost management.

  7. Autism genetics: Methodological issues and experimental design.

    PubMed

    Sacco, Roberto; Lintas, Carla; Persico, Antonio M

    2015-10-01

    Autism is a complex neuropsychiatric disorder of developmental origin, where multiple genetic and environmental factors likely interact resulting in a clinical continuum between "affected" and "unaffected" individuals in the general population. During the last two decades, relevant progress has been made in identifying chromosomal regions and genes in linkage or association with autism, but no single gene has emerged as a major cause of disease in a large number of patients. The purpose of this paper is to discuss specific methodological issues and experimental strategies in autism genetic research, based on fourteen years of experience in patient recruitment and association studies of autism spectrum disorder in Italy.

  8. Performance-based asphalt mixture design methodology

    NASA Astrophysics Data System (ADS)

    Ali, Al-Hosain Mansour

    Today, several State D.O.T.s are investigating the use of tire rubber with local conventional materials. Several of the ongoing investigations identified potential benefits from the use of these materials, including improvements in material properties and performance. One of the major problems is associated with the transferability of asphalt rubber technology without appropriately considering the effects of the variety of conventional materials on mixture behavior and performance. Typically, the design of these mixtures is adapted to the physical properties of the conventional materials by using the empirical Marshall mixture design and without considering fundamental mixture behavior and performance. Design criteria related to the most common modes of failure for asphalt mixtures, such as rutting, fatigue cracking, and low temperature thermal cracking, have to be developed and used for identifying the "best mixture," in terms of performance, for the specific local materials and loading conditions. The main objective of this study was the development of a mixture design methodology that considers mixture behavior and performance. In order to achieve this objective, a laboratory investigation was conducted to evaluate mixture properties that can be related to mixture performance (in terms of rutting, low temperature cracking, moisture damage and fatigue) while simulating the actual field loading conditions that the material is exposed to. The results proved that the inclusion of rubber into asphalt mixtures improved physical characteristics such as elasticity, flexibility, rebound, aging properties, increased fatigue resistance, and reduced rutting potential. The possibility of coupling the traditional Marshall mix design method with parameters related to mixture behavior and performance was investigated. Also, the SHRP SUPERPAVE mix design methodology was reviewed and considered in this study for the development of an integrated

  9. An Analysis of Software Design Methodologies

    DTIC Science & Technology

    1979-08-01

    Technical Report 401, "An Analysis of Software Design Methodologies," H. Rudy Ramsey, Michael E. Atwood, and Gary D. Campbell, Science Applications, Incorporated. Submitted by: Edgar M. Johnson, Chief, Human Factors...expressed by members of the Integrated Software Research and Development Working Group (ISRAD). The authors are indebted to Martha Cichelli, Margaret

  10. CAGE IIIA Distributed Simulation Design Methodology

    DTIC Science & Technology

    2014-05-01

    Greenwich Mean Time; GOTS, Government Off The Shelf; GPS, Global Positioning System; GW, Gateway; HD, Hard Drive; HICON, Higher Control; HOA, Horn of Africa; HQ...Organisation; SF, Special Forces; SME, Subject Matter Expert; SOCET GXP, a geospatial-intelligence software package that uses imagery...Implementing Defence Experimentation (GUIDEx). The key challenges for this methodology are understanding how to: design it; define the

  11. A Design Methodology for Optoelectronic VLSI

    DTIC Science & Technology

    2007-01-01

    soldered to a copper-clad printed circuit (PC) board, are no longer sufficient for today’s high-speed ICs. A processing chip that can compute data at a rate...design approach. A new design methodology has to be adopted to take advantage of the benefits that FSOI offers. Optoelectronic VLSI is the coupling of...and connections are made from chip to chip via traces of copper wire, as shown in Figure 2-2. The signal from a logic gate on one chip to a logic gate

  12. Conceptual design methodology for vibration isolation

    NASA Astrophysics Data System (ADS)

    Hyde, T. Tupper

    1997-06-01

    High performance dynamic structures have strict requirements on structural motion that are emphasized by the flexibility inherent in lightweight space systems. Vibration isolation is used to prevent disturbances from affecting critical payload components where motion is to be minimized. Isolation, however, is often an engineering solution that is not properly considered in the early conceptual design of the spacecraft. It is at this key stage of a program that mission-driving performance targets and resource allocations are made, yet little analysis has been performed. A conceptual design methodology for isolation is developed and applied to the conceptual design of a proposed space shuttle based telescope system. In the developed methodology, frequency domain computation of the closed loop performance without isolation pinpoints frequency regimes and disturbance to performance channels targeted for improvement. A coarse fidelity structural model, with well defined disturbance and performance characterization, is more useful than a costly high fidelity analysis when evaluating the many isolation options available early in a project. Isolation design choices are made by trading their performance improvement against their complexity/cost. Simple, idealized mechanical descriptions of the passive or active isolation system provide the needed frequency domain effect on performance without the costly analysis that a detailed isolator design entails. Similarly, the effects of other integrating subsystems, such as structural or optical control, are approximated by frequency domain descriptions.
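
    The frequency-domain trade described above is easiest to see on the simplest possible isolator model. For a single-axis passive mount with natural frequency ω_n and damping ratio ζ, base motion is attenuated according to the familiar transmissibility below (a textbook relation used for coarse trades, not the study's spacecraft model).

    ```latex
    % Base-motion transmissibility of a single-axis passive isolator
    % (textbook SDOF relation; r = omega / omega_n).
    \[
    T(\omega) = \left| \frac{X(\omega)}{X_{b}(\omega)} \right|
    = \sqrt{\frac{1 + (2\zeta r)^{2}}{(1 - r^{2})^{2} + (2\zeta r)^{2}}},
    \qquad r = \frac{\omega}{\omega_{n}} .
    \]
    % Above the corner frequency (r >> 1) the response rolls off roughly as
    % 1/r^2 for light damping, which is what an isolation trade exploits.
    ```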

  13. Structural design methodology for large space structures

    NASA Astrophysics Data System (ADS)

    Dornsife, Ralph J.

    1992-02-01

    The Department of Defense requires research and development in designing, fabricating, deploying, and maintaining large space structures (LSS) in support of Army and Strategic Defense Initiative military objectives. Because of their large size, extreme flexibility, and the unique loading conditions in the space environment, LSS will present engineers with problems unlike those encountered in designing conventional civil engineering or aerospace structures. LSS will require sophisticated passive damping and active control systems in order to meet stringent mission requirements. These structures must also be optimally designed to minimize high launch costs. This report outlines a methodology for the structural design of LSS. It includes a definition of mission requirements, structural modeling and analysis, passive damping and active control system design, ground-based testing, payload integration, on-orbit system verification, and on-orbit assessment of structural damage. In support of this methodology, analyses of candidate LSS truss configurations are presented, and an algorithm correlating ground-based test behavior to expected microgravity behavior is developed.

  14. Control/structure interaction design methodology

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.; Layman, William E.

    1989-01-01

    The Control Structure Interaction Program is a technology development program for spacecraft that exhibit interactions between the control system and structural dynamics. The program objectives include development and verification of new design concepts (such as active structure) and new tools (such as a combined structure and control optimization algorithm) and their verification in ground and possibly flight test. The new CSI design methodology is centered around interdisciplinary engineers using new tools that closely integrate structures and controls. Verification is an important CSI theme and analysts will be closely integrated to the CSI Test Bed laboratory. Components, concepts, tools and algorithms will be developed and tested in the lab and in future Shuttle-based flight experiments. The design methodology is summarized in block diagrams depicting the evolution of a spacecraft design and descriptions of analytical capabilities used in the process. The multiyear JPL CSI implementation plan is described along with the essentials of several new tools. A distributed network of computation servers and workstations was designed that will provide a state-of-the-art development base for the CSI technologies.

  15. Methodology for Designing Fault-Protection Software

    NASA Technical Reports Server (NTRS)

    Barltrop, Kevin; Levison, Jeffrey; Kan, Edwin

    2006-01-01

    A document describes a methodology for designing fault-protection (FP) software for autonomous spacecraft. The methodology embodies and extends established engineering practices in the technical discipline of Fault Detection, Diagnosis, Mitigation, and Recovery; and has been successfully implemented in the Deep Impact Spacecraft, a NASA Discovery mission. Based on established concepts of Fault Monitors and Responses, this FP methodology extends the notion of Opinion, Symptom, Alarm (aka Fault), and Response with numerous new notions, sub-notions, software constructs, and logic and timing gates. For example, Monitor generates a RawOpinion, which graduates into Opinion, categorized into no-opinion, acceptable, or unacceptable opinion. RaiseSymptom, ForceSymptom, and ClearSymptom govern the establishment and then mapping to an Alarm (aka Fault). Local Response is distinguished from FP System Response. A 1-to-n and n-to-1 mapping is established among Monitors, Symptoms, and Responses. Responses are categorized by device versus by function. Responses operate in tiers, where the early tiers attempt to resolve the Fault in a localized step-by-step fashion, relegating more system-level response to later tier(s). Recovery actions are gated by epoch recovery timing, enabling strategy, urgency, MaxRetry gate, hardware availability, hazardous versus ordinary fault, and many other priority gates. This methodology is systematic, logical, and uses multiple linked tables, parameter files, and recovery command sequences. The credibility of the FP design is proven via a fault-tree analysis "top-down" approach, and a functional fault-modes-and-effects analysis via a "bottom-up" approach. Via this process, the mitigation and recovery strategy(s) per Fault Containment Region scope (width versus depth) the FP architecture.
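
    The notions named above (Monitor, RawOpinion/Opinion, Symptom, Alarm, Response, tiers) suggest a layered data model. The sketch below is a speculative, much-simplified rendering of such a chain; class and field names are invented for illustration and this is not the flight implementation.

    ```python
    # Speculative sketch of a Monitor -> Symptom -> Alarm (Fault) -> Response
    # chain with tiered responses; names are invented for illustration.
    from dataclasses import dataclass, field
    from enum import Enum, auto

    class Opinion(Enum):
        NO_OPINION = auto()
        ACCEPTABLE = auto()
        UNACCEPTABLE = auto()

    @dataclass
    class Monitor:
        name: str
        def raw_opinion(self, telemetry: dict) -> float:
            # A real monitor would filter and validate telemetry here.
            return telemetry.get(self.name, 0.0)
        def opinion(self, telemetry: dict, limit: float) -> Opinion:
            raw = self.raw_opinion(telemetry)
            return Opinion.UNACCEPTABLE if abs(raw) > limit else Opinion.ACCEPTABLE

    @dataclass
    class Symptom:
        name: str
        monitors: list        # n monitors may map to one symptom
        limit: float
        def raised(self, telemetry: dict) -> bool:
            return any(m.opinion(telemetry, self.limit) is Opinion.UNACCEPTABLE
                       for m in self.monitors)

    @dataclass
    class Response:
        name: str
        tier: int             # early tiers act locally, later tiers system-wide
        actions: list = field(default_factory=list)

    @dataclass
    class Alarm:              # i.e., a Fault, mapped from one or more symptoms
        name: str
        symptoms: list
        responses: list
        def evaluate(self, telemetry: dict) -> list:
            if any(s.raised(telemetry) for s in self.symptoms):
                return sorted(self.responses, key=lambda r: r.tier)
            return []
    ```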

  16. Sketching Designs Using the Five Design-Sheet Methodology.

    PubMed

    Roberts, Jonathan C; Headleand, Chris; Ritsos, Panagiotis D

    2016-01-01

    Sketching designs has been shown to be a useful way of planning and considering alternative solutions. The use of lo-fidelity prototyping, especially paper-based sketching, can save time and money and converge to better solutions more quickly. However, this design process is often viewed as too informal. Consequently, users do not know how to manage their thoughts and ideas (to first think divergently, to then finally converge on a suitable solution). We present the Five Design Sheet (FdS) methodology. The methodology enables users to create information visualization interfaces through lo-fidelity methods. Users sketch and plan their ideas, helping them express different possibilities, think through these ideas to consider their potential effectiveness as solutions to the task (sheet 1); they create three principle designs (sheets 2, 3, and 4); before converging on a final realization design that can then be implemented (sheet 5). In this article, we present (i) a review of the use of sketching as a planning method for visualization and the benefits of sketching, (ii) a detailed description of the Five Design Sheet (FdS) methodology, and (iii) an evaluation of the FdS using the System Usability Scale, along with a case-study of its use in industry and experience of its use in teaching.

  17. Nonlinear flight control design using backstepping methodology

    NASA Astrophysics Data System (ADS)

    Tran, Thanh Trung

    The subject of nonlinear flight control design using backstepping control methodology is investigated in the dissertation research presented here. Control design methods based on nonlinear models of the dynamic system provide higher utility and versatility because the design model more closely matches the physical system behavior. Obtaining requisite model fidelity is only half of the overall design process, however. Design of the nonlinear control loops can lessen the effects of nonlinearity, or even exploit nonlinearity, to achieve higher levels of closed-loop stability, performance, and robustness. The goal of the research is to improve control quality for a general class of strict-feedback dynamic systems and provide flight control architectures to augment the aircraft motion. The research is divided into two parts: theoretical control development for the strict-feedback form of nonlinear dynamic systems and application of the proposed theory for nonlinear flight dynamics. In the first part, the research is built on two components: transforming the nonlinear dynamic model to a canonical strict-feedback form and then applying backstepping control theory to the canonical model. The research considers a process to determine when this transformation is possible, and when it is possible, a systematic process to transfer the model is also considered when practical. When this is not the case, certain modeling assumptions are explored to facilitate the transformation. After achieving the canonical form, a systematic design procedure for formulating a backstepping control law is explored in the research. Starting with the simplest subsystem and ending with the full system, pseudo control concepts based on Lyapunov control functions are used to control each successive subsystem. Typically each pseudo control must be solved from a nonlinear algebraic equation. At the end of this process, the physical control input must be re-expressed in terms of the physical states by
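
    The recursive construction summarized above is easiest to see in the lowest-order case. For a generic two-state strict-feedback system, the standard backstepping steps are as follows (a textbook derivation, not the dissertation's flight-dynamics model).

    ```latex
    % Two-state strict-feedback system:
    \[
    \dot{x}_1 = f_1(x_1) + g_1(x_1)\,x_2, \qquad
    \dot{x}_2 = f_2(x_1, x_2) + g_2(x_1, x_2)\,u .
    \]
    % Step 1: treat x_2 as a pseudo control; with V_1 = x_1^2/2 choose
    \[
    \alpha_1(x_1) = \frac{1}{g_1}\bigl(-f_1 - k_1 x_1\bigr), \qquad k_1 > 0 .
    \]
    % Step 2: define z_2 = x_2 - alpha_1 and V_2 = V_1 + z_2^2/2; the control
    \[
    u = \frac{1}{g_2}\bigl(-f_2 + \dot{\alpha}_1 - g_1 x_1 - k_2 z_2\bigr), \qquad k_2 > 0,
    \]
    % gives dV_2/dt = -k_1 x_1^2 - k_2 z_2^2 <= 0, stabilizing the origin.
    ```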

  1. Unshrouded Centrifugal Turbopump Impeller Design Methodology

    NASA Technical Reports Server (NTRS)

    Prueger, George H.; Williams, Morgan; Chen, Wei-Chung; Paris, John; Williams, Robert; Stewart, Eric

    2001-01-01

    Turbopump weight continues to be a dominant parameter in the trade space for reduction of engine weight. Space Shuttle Main Engine weight distribution indicates that the turbomachinery makes up approximately 30% of the total engine weight. Weight reduction can be achieved through reduction of the turbopump envelope. Reduction in envelope relates to an increase in turbopump speed and an increase in impeller head coefficient. Speed can be increased until suction performance limits are reached on the pump or until, due to other constraints, the turbine or bearings limit speed. Once the speed of the turbopump is set, the impeller tip speed sets the minimum head coefficient of the machine. To reduce impeller diameter, the head coefficient must be increased. A significant limitation with increasing head coefficient is that the slope of the head-flow characteristic is affected and this can limit engine throttling range. Unshrouded impellers offer a design option for increased turbopump speed without increasing the impeller head coefficient. However, there are several issues with regard to using an unshrouded impeller: there is a pump performance penalty due to the front open face recirculation flow, there is a potential pump axial thrust problem from the unbalanced front open face and the back shroud face, and since test data is very limited for this configuration, there is uncertainty in the magnitude and phase of the rotordynamic forces due to the front impeller passage. The purpose of the paper is to discuss the design of an unshrouded impeller and to examine the hydrodynamic performance, axial thrust, and rotordynamic performance. The design methodology will also be discussed. This work will help provide some guidelines for unshrouded impeller design.
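
    The speed/head-coefficient/diameter trade referred to above follows directly from the definition of the head coefficient; the generic pump relations (not the paper's design numbers) are:

    ```latex
    % Head coefficient and its effect on impeller diameter (generic relations;
    % H: required head, N: shaft speed in rpm, U_t: impeller tip speed).
    \[
    \psi = \frac{g H}{U_t^{2}}, \qquad U_t = \frac{\pi N D}{60},
    \qquad\Longrightarrow\qquad
    D = \frac{60}{\pi N} \sqrt{\frac{g H}{\psi}} .
    \]
    % Raising either N or psi shrinks the impeller, which is the envelope
    % (weight) argument made above.
    ```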

  2. CONCEPTUAL DESIGNS FOR A NEW HIGHWAY VEHICLE EMISSIONS ESTIMATION METHODOLOGY

    EPA Science Inventory

    The report discusses six conceptual designs for a new highway vehicle emissions estimation methodology and summarizes the recommendations of each design for improving the emissions and activity factors in the emissions estimation process. The complete design reports are included a...

  3. Methodology for Preliminary Design of Electrical Microgrids

    SciTech Connect

    Jensen, Richard P.; Stamp, Jason E.; Eddy, John P.; Henry, Jordan M; Munoz-Ramos, Karina; Abdallah, Tarek

    2015-09-30

    Many critical loads rely on simple backup generation to provide electricity in the event of a power outage. An Energy Surety Microgrid™ (ESM) can protect against outages caused by single generator failures to improve reliability. An ESM will also provide a host of other benefits, including integration of renewable energy, fuel optimization, and maximizing the value of energy storage. The ESM concept includes a categorization for microgrid value propositions, and quantifies how the investment can be justified during either grid-connected or utility outage conditions. In contrast with many approaches, the ESM approach explicitly sets requirements based on unlikely extreme conditions, including the need to protect against determined cyber adversaries. During the United States (US) Department of Defense (DOD)/Department of Energy (DOE) Smart Power Infrastructure Demonstration for Energy Reliability and Security (SPIDERS) effort, the ESM methodology was successfully used to develop the preliminary designs, which directly supported the contracting, construction, and testing for three military bases. Acknowledgements: Sandia National Laboratories and the SPIDERS technical team would like to acknowledge the following for help in the project: Mike Hightower, who has been the key driving force for Energy Surety Microgrids; Juan Torres and Abbas Akhil, who developed the concept of microgrids for military installations; Merrill Smith, U.S. Department of Energy SPIDERS Program Manager; Ross Roley and Rich Trundy from U.S. Pacific Command; Bill Waugaman and Bill Beary from U.S. Northern Command; Melanie Johnson and Harold Sanborn of the U.S. Army Corps of Engineers Construction Engineering Research Laboratory; and experts from the National Renewable Energy Laboratory, Idaho National Laboratory, Oak Ridge National Laboratory, and Pacific Northwest National Laboratory.

  4. A New Methodology to Design Distributed Medical Diagnostic Centers

    DTIC Science & Technology

    2001-10-25

    A New Methodology to Design Distributed Medical Diagnostic Centers. P. A. Baziana, E. I. Karavatselou, D. K. Lymberopoulos, D. N. Serpanos, Department of...Electrical and Computer Engineering, University of Patras, Patras, Hellas. This paper introduces a new methodology for DDC design by controlling the above...TSPs) to design and support cooperative schemes among RUs and DUs in the form of DDCs [2]. This paper introduces a global methodology for such a DDC's

  5. Progressive Designs for New Curricula.

    ERIC Educational Resources Information Center

    Turner, William A.; Belida, Loren; Johnson, William C.

    2000-01-01

    Explores how school building design influences the success of children in preparing for the future. Considerations when renovating and upgrading school design to enhance learning are discussed, including issues of sustainability, collaboration, lighting, and ventilation. (GR)

  6. Enhancing the Front-End Phase of Design Methodology

    ERIC Educational Resources Information Center

    Elias, Erasto

    2006-01-01

    Design methodology (DM) is defined by the procedural path, expressed in design models, and techniques or methods used to untangle the various activities within a design model. Design education in universities is mainly based on descriptive design models. Much knowledge and organization have been built into DM to facilitate design teaching.…

  7. A Design Methodology for Medical Processes.

    PubMed

    Ferrante, Simona; Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara

    2016-01-01

    Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient's needs, the uncertainty of the patient's response, and the indeterminacy of patient's compliance to treatment. Also, the multiple actors involved in patient's care need clear and transparent communication to ensure care coordination. In this paper, we propose a methodology to model healthcare processes in order to break out complexity and provide transparency. The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests that was also implemented. Despite being only an example, our case study showed the ability of process modeling to answer the actual needs in healthcare practices. Independently from the medical domain in which the modeling effort is done, the proposed methodology is useful to create high-quality models, and to detect and take into account relevant and tricky situations that can occur during process execution.

  8. Design Research: Theoretical and Methodological Issues

    ERIC Educational Resources Information Center

    Collins, Allan; Joseph, Diana; Bielaczyc, Katerine

    2004-01-01

    The term "design experiments" was introduced in 1992, in articles by Ann Brown (1992) and Allan Collins (1992). Design experiments were developed as a way to carry out formative research to test and refine educational designs based on principles derived from prior research. More recently the term design research has been applied to this kind of…

  9. Technical report on LWR design decision methodology. Phase I

    SciTech Connect

    1980-03-01

    Energy Incorporated (EI) was selected by Sandia Laboratories to develop and test an LWR design decision methodology. Contract Number 42-4229 provided funding for Phase I of this work. This technical report on LWR design decision methodology documents the activities performed under that contract. Phase I was a short-term effort to thoroughly review the current LWR design decision process to assure complete understanding of current practices and to establish a well defined interface for development of initial quantitative design guidelines.

  10. A Design Methodology for Medical Processes

    PubMed Central

    Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara

    2016-01-01

    Background: Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient’s needs, the uncertainty of the patient’s response, and the indeterminacy of patient’s compliance to treatment. Also, the multiple actors involved in patient’s care need clear and transparent communication to ensure care coordination. Objectives: In this paper, we propose a methodology to model healthcare processes in order to break out complexity and provide transparency. Methods: The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. Results: The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests that was also implemented. Conclusions: Despite being only an example, our case study showed the ability of process modeling to answer the actual needs in healthcare practices. Independently from the medical domain in which the modeling effort is done, the proposed methodology is useful to create high-quality models, and to detect and take into account relevant and tricky situations that can occur during process execution. PMID:27081415

  11. A Methodology for Total Hospital Design

    PubMed Central

    Delon, Gerald L.

    1970-01-01

    A procedure is described that integrates three techniques into a unified approach: a computerized method for estimating departmental areas and construction costs, a computerized layout routine that produces a space-relationship diagram based on qualitative factors, and a second layout program that establishes a final layout by a series of iterations. The methodology described utilizes as input the results of earlier phases of the research, with the output of each step in turn becoming the input for the succeeding step. The method is illustrated by application to a hypothetical pediatric hospital of 100 beds. PMID:5494263
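
    The second layout program "establishes a final layout by a series of iterations," but the abstract does not say which algorithm is used. A minimal pairwise-swap improvement heuristic of the kind commonly used for departmental layout problems is sketched below; it is an illustration, not the paper's program.

    ```python
    # Illustrative pairwise-swap layout improvement (not the paper's program).
    # assign: dept -> site; weights: {(dept_a, dept_b): closeness weight};
    # dist: {(site_s, site_t): distance} (either ordering of a pair may be given).
    from itertools import combinations

    def layout_cost(assign, weights, dist):
        def d(s, t):
            return dist[(s, t)] if (s, t) in dist else dist[(t, s)]
        return sum(w * d(assign[a], assign[b]) for (a, b), w in weights.items())

    def improve_layout(assign, weights, dist):
        """Keep swapping department pairs while any swap lowers the total cost."""
        assign = dict(assign)
        improved = True
        while improved:
            improved = False
            for a, b in combinations(list(assign), 2):
                before = layout_cost(assign, weights, dist)
                assign[a], assign[b] = assign[b], assign[a]      # trial swap
                if layout_cost(assign, weights, dist) < before:
                    improved = True                              # keep the swap
                else:
                    assign[a], assign[b] = assign[b], assign[a]  # undo
        return assign
    ```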

  13. Forced vibration and flutter design methodology

    SciTech Connect

    Snyder, L.E.; Burns, D.W.

    1988-06-01

    The aeroelastic principles and considerations of designing blades, disks, and vanes to avoid high cycle fatigue failure are covered. Two types of vibration that can cause high cycle fatigue, flutter and forced vibration, will first be defined and the basic governing equations discussed. Next, under forced vibration design, the areas of source definition, types of components, vibratory mode shape definitions, and basic steps in design for adequate high cycle fatigue life will be presented. For clarification, a forced vibration design example will be shown using a high performance turbine blade/disk component. Finally, types of flutter, dominant flutter parameters, and flutter procedures and design parameters will be discussed. The overall emphasis is on the application of aeroelastic criteria to the initial design of blades, disks, and vanes to prevent high cycle fatigue failures.

  14. The Study of the Relationship between Probabilistic Design and Axiomatic Design Methodology. Volume 1

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chinyere; Onyebueke, Landon

    1996-01-01

    This program report is the final report covering all the work done on this project. The goal of this project is technology transfer of methodologies to improve design process. The specific objectives are: 1. To learn and understand the Probabilistic design analysis using NESSUS. 2. To assign Design Projects to either undergraduate or graduate students on the application of NESSUS. 3. To integrate the application of NESSUS into some selected senior level courses in Civil and Mechanical Engineering curricula. 4. To develop courseware in Probabilistic Design methodology to be included in a graduate level Design Methodology course. 5. To study the relationship between the Probabilistic design methodology and Axiomatic design methodology.

  15. Design: The Only Methodology of Technology?

    ERIC Educational Resources Information Center

    Williams, P. John

    2000-01-01

    Technology education involves procedural knowledge related to technology activity and conceptual knowledge related to content. Technology processes include design, problem solving, systems approach, invention, and manufacturing. Using a range of processes in teaching can accommodate various learning styles. (SK)

  16. A design methodology for portable software on parallel computers

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Miller, Keith W.; Chrisman, Dan A.

    1993-01-01

    This final report for research that was supported by grant number NAG-1-995 documents our progress in addressing two difficulties in parallel programming. The first difficulty is developing software that will execute quickly on a parallel computer. The second difficulty is transporting software between dissimilar parallel computers. In general, we expect that more hardware-specific information will be included in software designs for parallel computers than in designs for sequential computers. This inclusion is an instance of portability being sacrificed for high performance. New parallel computers are being introduced frequently. Trying to keep one's software on the current high performance hardware, a software developer almost continually faces yet another expensive software transportation. The problem of the proposed research is to create a design methodology that helps designers to more precisely control both portability and hardware-specific programming details. The proposed research emphasizes programming for scientific applications. We completed our study of the parallelizability of a subsystem of the NASA Earth Radiation Budget Experiment (ERBE) data processing system. This work is summarized in section two. A more detailed description is provided in Appendix A ('Programming Practices to Support Eventual Parallelism'). Mr. Chrisman, a graduate student, wrote and successfully defended a Ph.D. dissertation proposal which describes our research associated with the issues of software portability and high performance. The list of research tasks is specified in the proposal. The proposal 'A Design Methodology for Portable Software on Parallel Computers' is summarized in section three and is provided in its entirety in Appendix B. We are currently studying a proposed subsystem of the NASA Clouds and the Earth's Radiant Energy System (CERES) data processing system. This software is the proof-of-concept for the Ph.D. dissertation. We have implemented and measured

  17. A design optimization methodology for Li+ batteries

    NASA Astrophysics Data System (ADS)

    Golmon, Stephanie; Maute, Kurt; Dunn, Martin L.

    2014-05-01

    Design optimization for functionally graded battery electrodes is shown to improve the usable energy capacity of Li batteries predicted by computational simulations and numerically optimizing the electrode porosities and particle radii. A multi-scale battery model which accounts for nonlinear transient transport processes, electrochemical reactions, and mechanical deformations is used to predict the usable energy storage capacity of the battery over a range of discharge rates. A multi-objective formulation of the design problem is introduced to maximize the usable capacity over a range of discharge rates while limiting the mechanical stresses. The optimization problem is solved via a gradient based optimization. A LiMn2O4 cathode is simulated with a PEO-LiCF3SO3 electrolyte and both a Li Foil (half cell) and LiC6 anode. Studies were performed on both half and full cell configurations resulting in distinctly different optimal electrode designs. The numerical results show that the highest rate discharge drives the simulations and the optimal designs are dominated by Li+ transport rates. The results also suggest that spatially varying electrode porosities and active particle sizes provides an efficient approach to improve the power-to-energy density of Li+ batteries. For the half cell configuration, the optimal design improves the discharge capacity by 29% while for the full cell the discharge capacity was improved 61% relative to an initial design with a uniform electrode structure. Most of the improvement in capacity was due to the spatially varying porosity, with up to 5% of the gains attributed to the particle radii design variables.

  18. Design Study Methodology: Reflections from the Trenches and the Stacks.

    PubMed

    Sedlmair, M; Meyer, M; Munzner, T

    2012-12-01

    Design studies are an increasingly popular form of problem-driven visualization research, yet there is little guidance available about how to do them effectively. In this paper we reflect on our combined experience of conducting twenty-one design studies, as well as reading and reviewing many more, and on an extensive literature review of other field work methods and methodologies. Based on this foundation we provide definitions, propose a methodological framework, and provide practical guidance for conducting design studies. We define a design study as a project in which visualization researchers analyze a specific real-world problem faced by domain experts, design a visualization system that supports solving this problem, validate the design, and reflect about lessons learned in order to refine visualization design guidelines. We characterize two axes - a task clarity axis from fuzzy to crisp and an information location axis from the domain expert's head to the computer - and use these axes to reason about design study contributions, their suitability, and uniqueness from other approaches. The proposed methodological framework consists of 9 stages: learn, winnow, cast, discover, design, implement, deploy, reflect, and write. For each stage we provide practical guidance and outline potential pitfalls. We also conducted an extensive literature survey of related methodological approaches that involve a significant amount of qualitative field work, and compare design study methodology to that of ethnography, grounded theory, and action research.

  19. Progressive Failure Analysis Methodology for Laminated Composite Structures

    NASA Technical Reports Server (NTRS)

    Sleight, David W.

    1999-01-01

    A progressive failure analysis method has been developed for predicting the failure of laminated composite structures under geometrically nonlinear deformations. The progressive failure analysis uses C(exp 1) shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms and several options are available to degrade the material properties after failures. The progressive failure analysis method is implemented in the COMET finite element analysis code and can predict the damage and response of laminated composite structures from initial loading to final failure. The different failure criteria and material degradation methods are compared and assessed by performing analyses of several laminated composite structures. Results from the progressive failure method indicate good correlation with the existing test data except in structural applications where interlaminar stresses are important which may cause failure mechanisms such as debonding or delaminations.
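
    As a rough illustration of the ply-by-ply bookkeeping such a method performs, the sketch below applies a maximum-strain failure check and a simple stiffness knock-down at each load step. The allowables, the degradation factor, and the strain_model callable are illustrative assumptions, not the COMET implementation described above.

```python
# Illustrative ply-level progressive failure loop (not the COMET implementation):
# check a maximum-strain criterion for each ply and degrade its moduli on failure.
import numpy as np

# Hypothetical allowable strains and retained-stiffness fraction after ply failure
EPS_ALLOW = {"fiber": 0.012, "transverse": 0.006, "shear": 0.015}
DEGRADE = 0.1


def max_strain_failed(eps11, eps22, gamma12):
    """Return True if any strain component exceeds its allowable."""
    return (abs(eps11) > EPS_ALLOW["fiber"]
            or abs(eps22) > EPS_ALLOW["transverse"]
            or abs(gamma12) > EPS_ALLOW["shear"])


def progressive_failure(plies, load_steps, strain_model):
    """March through load steps, degrading failed plies until all have failed.

    plies: list of dicts with 'E1', 'E2', 'G12' moduli and a 'failed' flag
    strain_model: callable(plies, load) -> list of (eps11, eps22, gamma12) per ply
    """
    history = []
    for load in load_steps:
        strains = strain_model(plies, load)
        for ply, eps in zip(plies, strains):
            if not ply["failed"] and max_strain_failed(*eps):
                ply["failed"] = True
                for k in ("E1", "E2", "G12"):
                    ply[k] *= DEGRADE  # stiffness knock-down on failure
        n_failed = sum(p["failed"] for p in plies)
        history.append((load, n_failed))
        if n_failed == len(plies):
            break  # final failure of the laminate
    return history


if __name__ == "__main__":
    # Trivial demo: strain grows linearly with load and is shared equally by plies
    plies = [{"E1": 140e9, "E2": 10e9, "G12": 5e9, "failed": False} for _ in range(4)]
    model = lambda ps, load: [(load * 1e-3, load * 5e-4, load * 2e-4)] * len(ps)
    print(progressive_failure(plies, load_steps=range(1, 20), strain_model=model))
```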

  20. Methodology for Characterizing Trends | Cancer Trends Progress Report

    Cancer.gov

    The Cancer Trends Progress Report, first issued in 2001, summarizes our nation's advances against cancer in relation to Healthy People targets set forth by the Department of Health and Human Services.

  1. Implicit Shape Parameterization for Kansei Design Methodology

    NASA Astrophysics Data System (ADS)

    Nordgren, Andreas Kjell; Aoyama, Hideki

    Implicit shape parameterization for Kansei design is a procedure that uses 3D-models, or concepts, to span a shape space for surfaces in the automotive field. A low-dimensional, yet accurate shape descriptor was found by Principal Component Analysis of an ensemble of point-clouds, which were extracted from mesh-based surfaces modeled in a CAD-program. A theoretical background of the procedure is given along with step-by-step instructions for the required data-processing. The results show that complex surfaces can be described very efficiently, and encode design features by an implicit approach that does not rely on error-prone explicit parameterizations. This provides a very intuitive way for a designer to explore shapes, because various design features can simply be introduced by adding new concepts to the ensemble. Complex shapes have been difficult to analyze with Kansei methods due to the large number of parameters involved, but implicit parameterization of design features provides a low-dimensional shape descriptor for efficient data collection, model-building and analysis of emotional content in 3D-surfaces.
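
    A minimal sketch of the core data-processing step described above: principal component analysis of an ensemble of registered point clouds, yielding a low-dimensional shape descriptor. The array shapes, the number of retained modes, and the assumption that the clouds are already registered are illustrative; this is not the authors' code.

```python
# PCA of a point-cloud ensemble: mean shape + a few modes act as the shape descriptor.
import numpy as np

def pca_shape_space(point_clouds, n_modes=5):
    """point_clouds: array (n_concepts, n_points, 3) of registered surface samples."""
    n_concepts = point_clouds.shape[0]
    X = point_clouds.reshape(n_concepts, -1)        # flatten each cloud to one row
    mean_shape = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mean_shape, full_matrices=False)
    modes = Vt[:n_modes]                            # principal shape modes
    weights = (X - mean_shape) @ modes.T            # low-dimensional descriptors
    return mean_shape, modes, weights

def reconstruct(mean_shape, modes, w):
    """Blend a new surface from mode weights w (e.g., interpolated between concepts)."""
    return (mean_shape + w @ modes).reshape(-1, 3)
```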

  2. The methodology of database design in organization management systems

    NASA Astrophysics Data System (ADS)

    Chudinov, I. L.; Osipova, V. V.; Bobrova, Y. V.

    2017-01-01

    The paper describes a unified methodology of database design for management information systems. Designing the conceptual information model for the domain area is the most important and labor-intensive stage in database design. Based on the proposed integrated approach to designing the conceptual information model, the main principles of developing relational databases are provided and users' information needs are considered. According to the methodology, the process of designing the conceptual information model includes three basic stages, which are defined in detail. Finally, the article describes how the results of the analysis of users' information needs are applied and gives the rationale for the use of classifiers.

  3. [Progress in methodological characteristics of clinical practice guideline for osteoarthritis].

    PubMed

    Xing, D; Wang, B; Lin, J H

    2017-06-01

    At present, several clinical practice guidelines for the treatment of osteoarthritis have been developed by institutes or societies. The ultimate purpose of developing clinical practice guidelines is to standardize the treatment of osteoarthritis effectively. However, the methodologies used in developing clinical practice guidelines may influence their translation and application in treating osteoarthritis. The present study summarized the methodological features of individual clinical practice guidelines and presented the tools for quality evaluation of clinical practice guidelines. The limitations of current osteoarthritis guidelines of China are also indicated. The review article might help relevant institutions improve the quality of guideline development and clinical translation.

  4. Design Methodology for Multiple Microcomputer Architectures.

    DTIC Science & Technology

    1982-07-01

    The lack of multimicro design knowledge is evident both in industry and in university environments; in the industrial environment, it reduces productivity.

  5. Army Design Methodology: Commander’s Resource

    DTIC Science & Technology

    2012-02-01

    This resource is intended to help bridge the gap from design theory and classroom instruction to the application of ADM (Army Design Methodology) in the field.

  6. A Design Methodology For Industrial Vision Systems

    NASA Astrophysics Data System (ADS)

    Batchelor, B. G.; Waltz, F. M.; Snyder, M. A.

    1988-11-01

    The cost of design, rather than that of target system hardware, represents the principal factor inhibiting the adoption of machine vision systems by manufacturing industry. To reduce design costs to a minimum, a number of software and hardware aids have been developed or are currently being built by the authors. These design aids are as follows: a. An expert system for giving advice about which image acquisition techniques (i.e. lighting/viewing techniques) might be appropriate in a given situation. b. A program to assist in the selection and setup of camera lenses. c. A rich repertoire of image processing procedures, integrated with the AI language Prolog. This combination (called ProVision) provides a facility for experimenting with intelligent image processing techniques and is intended to allow rapid prototyping of algorithms and/or heuristics. d. Fast image processing hardware, capable of implementing commands in the ProVision language. The speed of operation of this equipment is sufficiently high for it to be used, without modification, in many industrial applications. Where this is not possible, even higher execution speed may be achieved by adding extra modules to the processing hardware. In this way, it is possible to trade speed against the cost of the target system hardware. New and faster implementations of a given algorithm/heuristic can usually be achieved with the expenditure of only a small effort. Throughout this article, the emphasis is on designing an industrial vision system in a smooth and effortless manner. In order to illustrate our main thesis that the design of industrial vision systems can be made very much easier through the use of suitable utilities, the article concludes with a discussion of a case study: the dissection of tiny plants using a visually controlled robot.

  7. Philosophical and Methodological Beliefs of Instructional Design Faculty and Professionals

    ERIC Educational Resources Information Center

    Sheehan, Michael D.; Johnson, R. Burke

    2012-01-01

    The purpose of this research was to probe the philosophical beliefs of instructional designers using sound philosophical constructs and quantitative data collection and analysis. We investigated the philosophical and methodological beliefs of instructional designers, including 152 instructional design faculty members and 118 non-faculty…

  8. Philosophical and Methodological Beliefs of Instructional Design Faculty and Professionals

    ERIC Educational Resources Information Center

    Sheehan, Michael D.; Johnson, R. Burke

    2012-01-01

    The purpose of this research was to probe the philosophical beliefs of instructional designers using sound philosophical constructs and quantitative data collection and analysis. We investigated the philosophical and methodological beliefs of instructional designers, including 152 instructional design faculty members and 118 non-faculty…

  9. Development of Distributed Computing Systems Software Design Methodologies.

    DTIC Science & Technology

    1982-11-05

    Final report by Stephen S. Yau (Northwestern University, Evanston, IL) on the development of distributed computing systems software design methodologies.

  10. Surface design methodology - challenge the steel

    NASA Astrophysics Data System (ADS)

    Bergman, M.; Rosen, B.-G.; Eriksson, L.; Anderberg, C.

    2014-03-01

    The way a product or material is experienced by its user can differ depending on the scenario. It is also well known that different materials and surfaces are used for different purposes. When optimizing materials and surface roughness with the intention of improving a product, it is important to capture not only the physical requirements, but also the user experience and expectations. The design of medical equipment is characterized by laws and requirements on the materials and the surface function, but also by a conservative way of thinking about materials and colours. The purpose of this paper is to link the technical and customer requirements of current materials and surface textures in medical environments. By focusing on parts of the theory of Kansei Engineering, improvements of the company's products are possible. The idea is to find correlations between the desired experience or "feeling" for a product (customer requirements), functional requirements, and product geometrical properties (design parameters), to be implemented in new, improved products. To be able to find new materials with the same (or better) technical requirements but a higher level of user stimulation, the current material (stainless steel) and its surface (brushed textures) were used as a reference. The use of focus groups of experts at the manufacturer led to a selection of twelve possible new materials for investigation in the project. In collaboration with the topical company for this project, three new materials that fulfil the requirements (easy to clean and anti-bacterial) became the focus for further investigation with regard to a new design of a washer-disinfector for medical equipment, using the Kansei-based Cleanability Approach (CAA).

  11. "MARK I" MEASUREMENT METHODOLOGY FOR POLLUTION PREVENTION PROGRESS OCCURRING AS A RESULT OF PRODUCT DECISIONS

    EPA Science Inventory

    A methodology for assessing progress in pollution prevention resulting from product redesign, reformulation or replacement is described. The method compares the pollution generated by the original product with that from the modified or replacement product, taking into account, if...

  12. Integrated Design Methodology for Highly Reliable Liquid Rocket Engine

    NASA Astrophysics Data System (ADS)

    Kuratani, Naoshi; Aoki, Hiroshi; Yasui, Masaaki; Kure, Hirotaka; Masuya, Goro

    The Integrated Design Methodology is strongly required at the conceptual design phase to achieve highly reliable space transportation systems, especially propulsion systems, not only in Japan but all over the world. In the past, catastrophic failures caused losses of mission and vehicle (LOM/LOV) in the operational phase and, moreover, severely affected schedules and costs in the later development phases. A design methodology for highly reliable liquid rocket engines is preliminarily established and investigated in this study. A sensitivity analysis is performed systematically to demonstrate the effectiveness of this methodology and to clarify, and especially to focus on, the correlation between the combustion chamber, turbopump, and main valve as main components. This study describes the essential issues in understanding the stated correlations, the need to apply this methodology to the remaining critical failure modes in the whole engine system, and the perspective on engine development in the future.

  13. A Methodology for the Neutronics Design of Space Nuclear Reactors

    SciTech Connect

    King, Jeffrey C.; El-Genk, Mohamed S.

    2004-02-04

    A methodology for the neutronics design of space power reactors is presented. This methodology involves balancing the competing requirements of having sufficient excess reactivity for the desired lifetime, keeping the reactor subcritical at launch and during submersion accidents, and providing sufficient control over the lifetime of the reactor. These requirements are addressed by three reactivity values for a given reactor design: the excess reactivity at beginning of mission, the negative reactivity at shutdown, and the negative reactivity margin in submersion accidents. These reactivity values define the control worth and the safety worth in submersion accidents, used for evaluating the merit of a proposed reactor type and design. The Heat Pipe-Segmented Thermoelectric Module Converters space reactor core design is evaluated and modified based on the proposed methodology. The final reactor core design has sufficient excess reactivity for 10 years of nominal operation at 1.82 MW of fission power and is subcritical at launch and in all water submersion accidents.
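
    The three reactivity figures of merit listed above follow from the effective multiplication factor through the standard relation rho = (k_eff - 1) / k_eff. The snippet below shows that bookkeeping; the k_eff values and the way the worths are combined are illustrative assumptions, not the paper's results.

```python
# Hedged sketch of the reactivity bookkeeping described above. The conversion
# rho = (k_eff - 1) / k_eff is standard; the example values and the way the
# worths are combined here are illustrative assumptions only.
def reactivity(k_eff):
    """Reactivity (dimensionless) from the effective multiplication factor."""
    return (k_eff - 1.0) / k_eff

# Hypothetical values for illustration
rho_bom = reactivity(1.04)         # excess reactivity at beginning of mission (> 0)
rho_shutdown = reactivity(0.96)    # all control elements inserted (< 0)
rho_submersion = reactivity(0.98)  # flooded/submersion accident, shut down (< 0)

control_worth = rho_bom - rho_shutdown   # span the control system must cover
submersion_margin = -rho_submersion      # margin below critical when flooded
print(control_worth, submersion_margin)
```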

  14. Solid lubrication design methodology, phase 2

    NASA Technical Reports Server (NTRS)

    Pallini, R. A.; Wedeven, L. D.; Ragen, M. A.; Aggarwal, B. B.

    1986-01-01

    The high temperature performance of solid lubricated rolling elements was evaluated with a specially designed traction (friction) test apparatus. Graphite lubricants containing three additives (silver, phosphate glass, and zinc orthophosphate) were evaluated from room temperature to 540 C. Two hard coats were also evaluated. The evaluation of these lubricants, using a burnishing method of application, shows a reasonable transfer of lubricant and wear protection for short duration testing except in the 200 C temperature range. The graphite lubricants containing silver and zinc orthophosphate additives were more effective than the phosphate glass material over the test conditions examined. Traction coefficients ranged from a low of 0.07 to a high of 0.6. By curve fitting the traction data, empirical equations for the slope and maximum traction coefficient as functions of contact pressure (P), rolling speed (U), and temperature (T) can be developed for each lubricant. A solid lubricant traction model was incorporated into an advanced bearing analysis code (SHABERTH). For comparison purposes, preliminary heat generation calculations were made for both oil and solid lubricated bearing operation. A preliminary analysis indicated significantly higher heat generation for a solid lubricated ball bearing in a deep groove configuration. An analysis of a cylindrical roller bearing configuration showed the potential for a low friction solid lubricated bearing.
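
    The curve-fitting step mentioned above can be sketched as a log-space least-squares fit of a power-law traction model mu = c0 * P^a * U^b * T^c; the functional form and the data points below are made-up assumptions, not the measured traction data from the report.

```python
# Hedged sketch of fitting an empirical traction-coefficient equation in P, U, T.
# Model form mu = c0 * P**a * U**b * T**c is an assumption; the data are invented.
import numpy as np

# Hypothetical measurements: contact pressure [GPa], rolling speed [m/s], temperature [K]
P = np.array([1.0, 1.5, 2.0, 1.0, 1.5, 2.0])
U = np.array([5.0, 5.0, 5.0, 10.0, 10.0, 10.0])
T = np.array([300.0, 400.0, 500.0, 350.0, 450.0, 550.0])
mu = np.array([0.30, 0.25, 0.22, 0.28, 0.24, 0.20])

# Log-linearize: log mu = log c0 + a log P + b log U + c log T, then least squares
A = np.column_stack([np.ones_like(P), np.log(P), np.log(U), np.log(T)])
coef, *_ = np.linalg.lstsq(A, np.log(mu), rcond=None)
c0, a, b, c = np.exp(coef[0]), coef[1], coef[2], coef[3]

def traction(P, U, T):
    """Fitted empirical traction coefficient."""
    return c0 * P**a * U**b * T**c

print(round(traction(1.5, 7.5, 400.0), 3))
```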

  15. Progress in multirate digital control system design

    NASA Technical Reports Server (NTRS)

    Berg, Martin C.; Mason, Gregory S.

    1991-01-01

    A new methodology for multirate sampled-data control design based on a new generalized control law structure, two new parameter-optimization-based control law synthesis methods, and a new singular-value-based robustness analysis method are described. The control law structure can represent multirate sampled-data control laws of arbitrary structure and dynamic order, with arbitrarily prescribed sampling rates for all sensors and update rates for all processor states and actuators. The two control law synthesis methods employ numerical optimization to determine values for the control law parameters. The robustness analysis method is based on the multivariable Nyquist criterion applied to the loop transfer function for the sampling period equal to the period of repetition of the system's complete sampling/update schedule. The complete methodology is demonstrated by application to the design of a combination yaw damper and modal suppression system for a commercial aircraft.

  16. Optimal Design and Purposeful Sampling: Complementary Methodologies for Implementation Research.

    PubMed

    Duan, Naihua; Bhaumik, Dulal K; Palinkas, Lawrence A; Hoagwood, Kimberly

    2015-09-01

    Optimal design has been an under-utilized methodology. However, it has significant real-world applications, particularly in mixed methods implementation research. We review the concept and demonstrate how it can be used to assess the sensitivity of design decisions and balance competing needs. For observational studies, this methodology enables selection of the most informative study units. For experimental studies, it entails selecting and assigning study units to intervention conditions in the most informative manner. We blend optimal design methods with purposeful sampling to show how these two concepts balance competing needs when there are multiple study aims, a common situation in implementation research.

  17. Optimal Design and Purposeful Sampling: Complementary Methodologies for Implementation Research

    PubMed Central

    Duan, Naihua; Bhaumik, Dulal K.; Palinkas, Lawrence A.; Hoagwood, Kimberly

    2015-01-01

    Optimal design has been an under-utilized methodology. However, it has significant real-world applications, particularly in mixed methods implementation research. We review the concept and demonstrate how it can be used to assess the sensitivity of design decisions and balance competing needs. For observational studies, this methodology enables selection of the most informative study units. For experimental studies, it entails selecting and assigning study units to intervention conditions in the most informative manner. We blend optimal design methods with purposeful sampling to show how these two concepts balance competing needs when there are multiple study aims, a common situation in implementation research. PMID:25491200

  18. PEM Fuel Cells Redesign Using Biomimetic and TRIZ Design Methodologies

    NASA Astrophysics Data System (ADS)

    Fung, Keith Kin Kei

    Two formal design methodologies, biomimetic design and the Theory of Inventive Problem Solving (TRIZ), were applied to the redesign of a Proton Exchange Membrane (PEM) fuel cell. Proof-of-concept prototyping was performed on two of the concepts for water management. The liquid water collection concept, with strategically placed wicks, demonstrated potential benefits for a fuel cell. Conversely, the periodic flow direction reversal concept might reduce water removal from a fuel cell; the causes of this reduction remain unclear. In addition, three of the concepts generated with biomimetic design were further studied and demonstrated to stimulate more creative ideas in the thermal and water management of fuel cells. The biomimetic design and TRIZ methodologies were successfully applied to fuel cells and provided different perspectives on the redesign of fuel cells. The methodologies should continue to be used to improve fuel cells.

  19. Methodological Innovation in Practice-Based Design Doctorates

    ERIC Educational Resources Information Center

    Yee, Joyce S. R.

    2010-01-01

    This article presents a selective review of recent design PhDs that identify and analyse the methodological innovation that is occurring in the field, in order to inform future provision of research training. Six recently completed design PhDs are used to highlight possible philosophical and practical models that can be adopted by future PhD…

  20. Helicopter-V/STOL dynamic wind and turbulence design methodology

    NASA Technical Reports Server (NTRS)

    Bailey, J. Earl

    1987-01-01

    Aircraft and helicopter accidents due to severe dynamic wind and turbulence continue to present challenging design problems. The development of the current set of design analysis tools for aircraft wind and turbulence design began in the 1940's and 1950's. The areas of helicopter dynamic wind and turbulence modeling and vehicle response to severe dynamic wind inputs (microburst-type phenomena) during takeoff and landing remain major unsolved design problems, owing to a lack of both environmental data and computational methodology. The development of helicopter and V/STOL dynamic wind and turbulence response computation methodology is reviewed, the current state of the design art in industry is outlined, and comments on design methodology are made which may serve to improve future flight vehicle design.

  1. Enhancing Instructional Design Efficiency: Methodologies Employed by Instructional Designers

    ERIC Educational Resources Information Center

    Roytek, Margaret A.

    2010-01-01

    Instructional systems design (ISD) has been frequently criticised as taking too long to implement, calling for a reduction in cycle time--the time that elapses between project initiation and delivery. While instructional design research has historically focused on increasing "learner" efficiencies, the study of what instructional designers do to…

  2. Enhancing Instructional Design Efficiency: Methodologies Employed by Instructional Designers

    ERIC Educational Resources Information Center

    Roytek, Margaret A.

    2010-01-01

    Instructional systems design (ISD) has been frequently criticised as taking too long to implement, calling for a reduction in cycle time--the time that elapses between project initiation and delivery. While instructional design research has historically focused on increasing "learner" efficiencies, the study of what instructional designers do to…

  3. A design methodology for nonlinear systems containing parameter uncertainty: Application to nonlinear controller design

    NASA Technical Reports Server (NTRS)

    Young, G.

    1982-01-01

    A design methodology capable of dealing with nonlinear systems, such as a controlled ecological life support system (CELSS), containing parameter uncertainty is discussed. The methodology was applied to the design of discrete time nonlinear controllers. The nonlinear controllers can be used to control either linear or nonlinear systems. Several controller strategies are presented to illustrate the design procedure.

  4. Top-down methodology in industrial mixed-signals design

    NASA Astrophysics Data System (ADS)

    Liao, E.; Postula, A.; Ding, Y.

    2005-12-01

    Analogue and mixed-signal designs are fast becoming significant in System-On-Chip (SoC) designs as digital computational cores need to interface with the real world. Cellular phones, magnetic disk drives, speech recognition hardware and other 'digital' innovations in fact rely on a core of analogue circuitry. Mature digital CAD tools competently handle the digital portions of SoC designs. This is not true for analogue and mixed-signals components, still designed manually using time-consuming techniques. A good top-down design methodology can drastically reduce the design time of analogue components in SoCs and allow comprehensive functionality verification. This paper contains a critical survey of current design processes and tools, a top-down design case study and introduces MIX-SYN, a new platform for fast-tracking exploration and design time for the analogue and mixed-signals design industry.

  5. Progress in the MITICA beam source design.

    PubMed

    Zaccaria, P; Agostinetti, P; Marcuzzi, D; Pavei, M; Pilan, N; Rizzolo, A; Sonato, P; Spada, F; Trevisan, L

    2012-02-01

    In the framework of the development of the ITER neutral beam (NB) system, a test facility is planned to be built in Padova. A full size prototype of the ITER heating NB injector (MITICA) shall be built and tested at full beam power (17 MW) as per ITER requirements. The design of the MITICA beam source has further progressed following updated optimization and overall integration criteria. In the paper, the major design choices and revisions are presented, together with some results of numerical analyses carried out in order to assess the electrostatic and thermo-mechanical behaviour of the source.

  6. PROGRESS IN DESIGN OF THE SNS LINAC

    SciTech Connect

    R. HARDEKOPF

    2000-11-01

    The Spallation Neutron Source (SNS) is a six-laboratory collaboration to build an intense pulsed neutron facility at Oak Ridge, TN. The linac design has evolved from the conceptual design presented in 1997 to achieve higher initial performance and to incorporate desirable upgrade features. The linac will initially produce 2-MW beam power using a combination of radio-frequency quadrupole (RFQ) linac, drift-tube linac (DTL), coupled-cavity linac (CCL), and superconducting-cavity linac (SCL). Designs of each of these elements support the high peak intensity and high quality beam required for injection into the SNS accumulator ring. This paper will trace the evolution of the linac design, the cost and performance factors that drove architecture decisions, and the progress made in the R&D program.

  7. Implementation of Probabilistic Design Methodology at Tennessee State University

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chinyere

    1996-01-01

    Engineering Design is one of the most important areas in engineering education. Deterministic Design Methodology (DDM) is the only design method that is taught in most engineering schools. This method does not give a direct account of uncertainties in design parameters. Hence, it is impossible to quantify the uncertainties in the response, and the actual safety margin remains unknown. The desire for a design methodology that can identify the primitive (random) variables that affect the structural behavior has led to a growing interest in Probabilistic Design Methodology (PDM). This method is gaining more recognition in industry than in educational institutions. Some of the reasons for the limited use of PDM at the moment are that many are unaware of its potential, and most of the software developed for PDM is very recent. The central goal of the PDM project at Tennessee State University is to introduce engineering students to the method. The students participating in the project learn about PDM and the computer codes that are available to the design engineer. The software being used for this project is NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), developed under the NASA probabilistic structural analysis program. NESSUS has three different modules, which make it a very comprehensive computer code for PDM. Research in technology transfer through course offerings in PDM is in effect at Tennessee State University. The aim is to familiarize students with the problem of uncertainties in engineering design. Included in the paper are some projects on PDM carried out by some students and faculty. The areas to which this method is being applied at the moment include design of gears (spur and worm), design of shafts, design of statically indeterminate frame structures, design of helical springs, and design of shock absorbers. Some of the current results of these projects are presented.

  8. Implementation of Probabilistic Design Methodology at Tennessee State University

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chinyere

    1996-01-01

    Engineering Design is one of the most important areas in engineering education. Deterministic Design Methodology (DDM) is the only design method that is taught in most engineering schools. This method does not give a direct account of uncertainties in design parameters. Hence, it is impossible to quantify the uncertainties in the response, and the actual safety margin remains unknown. The desire for a design methodology that can identify the primitive (random) variables that affect the structural behavior has led to a growing interest in Probabilistic Design Methodology (PDM). This method is gaining more recognition in industry than in educational institutions. Some of the reasons for the limited use of PDM at the moment are that many are unaware of its potential, and most of the software developed for PDM is very recent. The central goal of the PDM project at Tennessee State University is to introduce engineering students to the method. The students participating in the project learn about PDM and the computer codes that are available to the design engineer. The software being used for this project is NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), developed under the NASA probabilistic structural analysis program. NESSUS has three different modules, which make it a very comprehensive computer code for PDM. Research in technology transfer through course offerings in PDM is in effect at Tennessee State University. The aim is to familiarize students with the problem of uncertainties in engineering design. Included in the paper are some projects on PDM carried out by some students and faculty. The areas to which this method is being applied at the moment include design of gears (spur and worm), design of shafts, design of statically indeterminate frame structures, design of helical springs, and design of shock absorbers. Some of the current results of these projects are presented.

  9. A bio-inspired EAP actuator design methodology

    NASA Astrophysics Data System (ADS)

    Fernandez, Diego; Moreno, Luis; Baselga, Juan

    2005-05-01

    Current EAP actuator sheets or fibers perform reasonably well in the centimeter and mN range, but are not practical for larger force and deformation requirements. In order to make EAP actuator technology scalable, a design methodology for polymer actuators is required. Design variables, optimization formulas, and a general architecture are required, as is usual in electromagnetic or hydraulic actuator design. This will allow the development of large EAP actuators specifically designed for a particular application. It will also help to enhance the final performance of the EAP material. This approach is not new; it is found in Nature. Skeletal muscle architecture has a profound influence on muscle force-generating properties and functionality. Based on the existing literature on skeletal muscle biomechanics, Nature's design philosophy is inferred. Formulas and curves employed by Nature in the design of muscles are presented. Design units such as the fiber, tendon, aponeurosis, and motor unit are compared with the equivalent design units to be taken into account in the design of EAP actuators. Finally, a complete design methodology for actuators based on multiple EAP fibers is proposed. In addition, the procedure gives an idea of the required parameters that must be clearly modeled and characterized at the EAP material level.

  10. Extensibility of a linear rapid robust design methodology

    NASA Astrophysics Data System (ADS)

    Steinfeldt, Bradley A.; Braun, Robert D.

    2016-05-01

    The extensibility of a linear rapid robust design methodology is examined. This analysis is approached from a computational cost and accuracy perspective. The sensitivity of the solution's computational cost is examined by analysing effects such as the number of design variables, the nonlinearity of the CAs, and the nonlinearity of the response, in addition to several potential complexity metrics. Relative to traditional robust design methods, the linear rapid robust design methodology scaled better with the size of the problem and had performance that exceeded the traditional techniques examined. The accuracy of applying a method with linear fundamentals to nonlinear problems was examined. It is observed that if the magnitude of the nonlinearity is less than 1000 times that of the nominal linear response, the error associated with applying successive linearization will result in response errors of less than 10% relative to the full nonlinear response.

  11. Viability, Advantages and Design Methodologies of M-Learning Delivery

    ERIC Educational Resources Information Center

    Zabel, Todd W.

    2010-01-01

    The purpose of this study was to examine the viability and principle design methodologies of Mobile Learning models in developing regions. Demographic and market studies were utilized to determine the viability of M-Learning delivery as well as best uses for such technologies and methods given socioeconomic and political conditions within the…

  12. Chicken or Egg? Communicative Methodology or Communicative Syllabus Design.

    ERIC Educational Resources Information Center

    Yalden, Janice

    A consensus has emerged on many issues in communicative language teaching, but one question that needs attention is the question of what ought to constitute the appropriate starting point in the design and implementation of a second language program. Two positions to consider are the following: first, the development of communicative methodology,…

  13. Chicken or Egg? Communicative Methodology or Communicative Syllabus Design.

    ERIC Educational Resources Information Center

    Yalden, Janice

    A consensus has emerged on many issues in communicative language teaching, but one question that needs attention is the question of what ought to constitute the appropriate starting point in the design and implementation of a second language program. Two positions to consider are the following: first, the development of communicative methodology,…

  14. A computer simulator for development of engineering system design methodologies

    NASA Technical Reports Server (NTRS)

    Padula, S. L.; Sobieszczanski-Sobieski, J.

    1987-01-01

    A computer program designed to simulate and improve engineering system design methodology is described. The simulator mimics the qualitative behavior and data couplings occurring among the subsystems of a complex engineering system. It eliminates the engineering analyses in the subsystems by replacing them with judiciously chosen analytical functions. With the cost of analysis eliminated, the simulator is used for experimentation with a large variety of candidate algorithms for multilevel design optimization to choose the best ones for the actual application. Thus, the simulator serves as a development tool for multilevel design optimization strategy. The simulator concept, implementation, and status are described and illustrated with examples.

  15. Tornado missile simulation and design methodology. Volume 1: simulation methodology, design applications, and TORMIS computer code. Final report

    SciTech Connect

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and risk has been assessed for a hypothetical nuclear power plant design case study.
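
    In the spirit of the simulation approach described above (though not the TORMIS code itself), the sketch below estimates a per-tornado missile impact probability by Monte Carlo sampling of wind speed, missile origin, and flight direction; all distributions, parameters, and the trajectory model are crude illustrative assumptions.

```python
# Schematic Monte Carlo estimator of tornado-missile impact probability.
# Every distribution and the range model below are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)

def impact_probability(n_trials=200_000, target_x=(0.0, 50.0), target_y=(0.0, 20.0)):
    """Estimate the probability that a tornado-borne missile lands on a target area."""
    wind = 60.0 * rng.weibull(2.0, n_trials)                 # sampled wind speeds [m/s]
    x0 = rng.uniform(-500.0, 500.0, n_trials)                # missile injection points [m]
    y0 = rng.uniform(-500.0, 500.0, n_trials)
    heading = rng.uniform(0.0, 2.0 * np.pi, n_trials)
    flight = rng.uniform(0.1, 0.6, n_trials) * wind**2 / (2 * 9.81)  # crude range model
    x = x0 + flight * np.cos(heading)
    y = y0 + flight * np.sin(heading)
    hit = ((target_x[0] <= x) & (x <= target_x[1]) &
           (target_y[0] <= y) & (y <= target_y[1]))
    return hit.mean()   # per-event impact probability estimate

print(impact_probability())
```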

  16. Yakima Hatchery Experimental Design : Annual Progress Report.

    SciTech Connect

    Busack, Craig; Knudsen, Curtis; Marshall, Anne

    1991-08-01

    This progress report details the results and status of the Washington Department of Fisheries' (WDF) pre-facility monitoring, research, and evaluation efforts, through May 1991, designed to support the development of an Experimental Design Plan (EDP) for the Yakima/Klickitat Fisheries Project (YKFP), previously termed the Yakima/Klickitat Production Project (YKPP or Y/KPP). This pre-facility work has been guided by the planning efforts of various research and quality control teams of the project that are annually captured as revisions to the experimental design and pre-facility work plans. The current objectives are as follows: to develop a genetic monitoring and evaluation approach for the Y/KPP; to evaluate stock identification monitoring tools, approaches, and opportunities available to meet specific objectives of the experimental plan; and to evaluate adult and juvenile enumeration and sampling/collection capabilities in the Y/KPP necessary to measure experimental response variables.

  17. The Biocompatibility of Bone Cements: Progress in Methodological Approach

    PubMed Central

    Dall'Oca, Carlo; Maluta, Tommaso; Micheloni, Gian Mario; Cengarle, Matteo; Morbioli, Giampaolo; Bernardi, Paolo; Sbarbati, Andrea; Degl'Innocenti, Daniele; Lavini, Franco; Magnan, Bruno

    2017-01-01

    The ideal bone graft substitute should have certain properties, and there are many studies dealing with mixtures of polymethylmethacrylate (PMMA) and β-tricalcium phosphate (β-TCP) that present the best characteristics of both. Scanning Electron Microscopy (SEM), providing ultra-structural data, has proved a very reliable tool in in vivo models for better understanding the bioactivity of a cement and properly evaluating its suitability for a particular purpose. The present study aims to further improve the knowledge of osteointegration development, using both parameters obtained with Environmental Scanning Electron Microscopy (ESEM) and focused histological examination. Two hybrid bone graft substitutes were designed from among ceramic and polymer-based bone graft substitutes. Based on β-TCP granule sizes, they were created with theoretically different osteoconductive properties. An acrylic standard cement was chosen as control. The cements were implanted in twelve New Zealand White (NZW) rabbits, which were sacrificed at 1, 2, 3, 6, 9 and 12 months after cement implantation. Histological samples were prepared with an infiltration process of LR White resin, and the specimens were then studied by X-rays, histology, and Environmental Scanning Electron Microscopy (ESEM). By comparing the resulting data, it was possible to follow the development of osteointegration resulting from different sizes of β-TCP granules. In this paper, we show that this evaluation process, together with ESEM, provides further important information that allows osteointegration to be followed at every stage of development. PMID:28735526

  18. Methodological considerations for designing a community water fluoridation cessation study.

    PubMed

    Singhal, Sonica; Farmer, Julie; McLaren, Lindsay

    2017-02-22

    High-quality, up-to-date research on community water fluoridation (CWF), and especially on the implications of CWF cessation for dental health, is limited. Although CWF cessation studies have been conducted, they are few in number; one of the major reasons is the methodological complexity of conducting such a study. This article draws on a systematic review of existing cessation studies (n=15) to explore methodological considerations of conducting CWF cessation studies in future. We review nine important methodological aspects (study design, comparison community, target population, time frame, sampling strategy, clinical indicators, assessment criteria, covariates and biomarkers) and provide recommendations for planning future CWF cessation studies that examine effects on dental caries. There is no one ideal study design to answer a research question. However, recommendations proposed regarding methodological aspects to conduct an epidemiological study to observe the effects of CWF cessation on dental caries, coupled with our identification of important methodological gaps, will be useful for researchers who are looking to optimize resources to conduct such a study with standards of rigour.

  19. A design and implementation methodology for diagnostic systems

    NASA Technical Reports Server (NTRS)

    Williams, Linda J. F.

    1988-01-01

    A methodology for design and implementation of diagnostic systems is presented. Also discussed are the advantages of embedding a diagnostic system in a host system environment. The methodology utilizes an architecture for diagnostic system development that is hierarchical and makes use of object-oriented representation techniques. Additionally, qualitative models are used to describe the host system components and their behavior. The methodology architecture includes a diagnostic engine that utilizes a combination of heuristic knowledge to control the sequence of diagnostic reasoning. The methodology provides an integrated approach to development of diagnostic system requirements that is more rigorous than standard systems engineering techniques. The advantages of using this methodology during various life cycle phases of the host systems (e.g., National Aerospace Plane (NASP)) include: the capability to analyze diagnostic instrumentation requirements during the host system design phase, a ready software architecture for implementation of diagnostics in the host system, and the opportunity to analyze instrumentation for failure coverage in safety critical host system operations.

  20. FOREWORD: Computational methodologies for designing materials Computational methodologies for designing materials

    NASA Astrophysics Data System (ADS)

    Rahman, Talat S.

    2009-02-01

    It would be fair to say that in the past few decades, theory and computer modeling have played a major role in elucidating the microscopic factors that dictate the properties of functional novel materials. Together with advances in experimental techniques, theoretical methods are becoming increasingly capable of predicting properties of materials at different length scales, thereby bringing within sight the long-sought goal of designing material properties according to need. Advances in computer technology and their availability at a reasonable cost around the world have made it all the more urgent to disseminate what is now known about these modern computational techniques. In this special issue on computational methodologies for materials by design we have tried to solicit articles from authors whose works collectively represent the microcosm of developments in the area. This turned out to be a difficult task for a variety of reasons, not the least of which is space limitation in this special issue. Nevertheless, we gathered twenty articles that represent some of the important directions in which theory and modeling are proceeding in the general effort to capture the ability to produce materials by design. The majority of papers presented here focus on technique developments that are expected to uncover further the fundamental processes responsible for material properties, and for their growth modes and morphological evolutions. As for material properties, some of the articles here address the challenges that continue to emerge from attempts at accurate descriptions of magnetic properties, of electronically excited states, and of sparse matter, all of which demand new looks at density functional theory (DFT). I should hasten to add that much of the success in accurate computational modeling of materials emanates from the remarkable predictive power of DFT, without which we would not be able to place the subject on firm theoretical grounds. As we know and will also

  1. Failure detection system design methodology. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Chow, E. Y.

    1980-01-01

    The design of a failure detection and identification (FDI) system consists of designing a robust residual generation process and a high performance decision making process. The designs of these two processes are examined separately. Residual generation is based on analytical redundancy. Redundancy relations that are insensitive to modelling errors and noise effects are important for designing robust residual generation processes. The characterization of the concept of analytical redundancy in terms of a generalized parity space provides a framework in which a systematic approach to the determination of robust redundancy relations is developed. The Bayesian approach is adopted for the design of high performance decision processes. The FDI decision problem is formulated as a Bayes sequential decision problem. Since the optimal decision rule is incomputable, a methodology for designing suboptimal rules is proposed. A numerical algorithm is developed to facilitate the design and performance evaluation of suboptimal rules.
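
    As a small illustration of the analytical-redundancy idea (a static parity relation rather than the full dynamic, Bayes-sequential formulation above), the sketch below builds a parity matrix V with V C = 0 from redundant sensor measurements y = C x + noise and flags a biased sensor by thresholding the parity vector. The measurement matrix, noise level, fault size, and threshold are assumptions for illustration.

```python
# Parity-space residual generation: any V with V @ C == 0 maps measurements into
# a residual that is insensitive to the unknown state x but sensitive to faults.
import numpy as np

def parity_matrix(C):
    """Rows of V span the left null space of C, so V @ C == 0."""
    U, s, Vt = np.linalg.svd(C)
    rank = int(np.sum(s > 1e-10))
    return U[:, rank:].T

# Four sensors measuring a two-dimensional quantity (illustrative numbers)
C = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0],
              [1.0, -1.0]])
V = parity_matrix(C)

x_true = np.array([2.0, -1.0])
y = C @ x_true + 0.01 * np.random.default_rng(1).standard_normal(4)
y_faulty = y.copy()
y_faulty[2] += 0.5          # bias fault on sensor 3

for label, meas in (("healthy", y), ("faulty", y_faulty)):
    r = V @ meas                             # parity (residual) vector
    print(label, np.linalg.norm(r) > 0.1)    # simple threshold decision
```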

  2. Progressive failure methodologies for predicting residual strength and life of laminated composites

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Allen, David H.; Obrien, T. Kevin

    1991-01-01

    Two progressive failure methodologies currently under development by the Mechanics of Materials Branch at NASA Langley Research Center are discussed. The damage tolerance/fail safety methodology developed by O'Brien is an engineering approach to ensuring adequate durability and damage tolerance by treating only delamination onset and the subsequent delamination accumulation through the laminate thickness. The continuum damage model developed by Allen and Harris employs continuum damage laws to predict laminate strength and life. The philosophy, mechanics framework, and current implementation status of each methodology are presented.

  3. Computer aided die design: A new open-source methodology

    NASA Astrophysics Data System (ADS)

    Carneiro, Olga Sousa; Rajkumar, Ananth; Ferrás, Luís Lima; Fernandes, Célio; Sacramento, Alberto; Nóbrega, João Miguel

    2017-05-01

    In this work we present a detailed description of how to use open-source computer codes to aid the design of complex profile extrusion dies, aiming to improve their flow distribution. The work encompasses a description of the overall open-source die design methodology, the implementation of the energy conservation equation in an existing OpenFOAM® solver, which then becomes capable of simulating the steady non-isothermal flow of an incompressible generalized Newtonian fluid, and two case studies that illustrate the capabilities and practical usefulness of the developed methodology. The results obtained with these case studies, used to solve real industrial problems, demonstrate that computational design aid is an excellent alternative, from both economic and technical points of view, to the experimental trial-and-error procedure commonly used in industry.
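
    The constitutive side of the non-isothermal generalized Newtonian flow mentioned above can be illustrated with a temperature-shifted Bird-Carreau viscosity; the model choice and every coefficient below are assumptions for illustration, not the values or the OpenFOAM® implementation used in the paper.

```python
# Illustrative generalized Newtonian viscosity with an Arrhenius temperature shift.
# Coefficients are invented; a real die-design study would fit them to melt data.
import numpy as np

def arrhenius_shift(T, T_ref=473.15, E_over_R=5000.0):
    """Temperature shift factor a_T (dimensionless)."""
    return np.exp(E_over_R * (1.0 / T - 1.0 / T_ref))

def bird_carreau_viscosity(shear_rate, T, eta0=1.0e4, eta_inf=0.0, lam=1.0, n=0.35):
    """Shear- and temperature-dependent viscosity [Pa s]."""
    aT = arrhenius_shift(T)
    gamma_eff = aT * shear_rate
    return aT * (eta_inf + (eta0 - eta_inf) *
                 (1.0 + (lam * gamma_eff) ** 2) ** ((n - 1.0) / 2.0))

print(bird_carreau_viscosity(shear_rate=100.0, T=493.15))
```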

  4. Hybrid antibiotics - clinical progress and novel designs.

    PubMed

    Parkes, Alastair L; Yule, Ian A

    2016-07-01

    There is a growing need for new antibacterial agents, but success in development of antibiotics in recent years has been limited. This has led researchers to investigate novel approaches to finding compounds that are effective against multi-drug resistant bacteria, and that delay onset of resistance. One such strategy has been to link antibiotics to produce hybrids designed to overcome resistance mechanisms. The concept of dual-acting hybrid antibiotics was introduced and reviewed in this journal in 2010. In the present review the authors sought to discover how clinical candidates described had progressed, and to examine how the field has developed. In three sections the authors cover the clinical progress of hybrid antibiotics, novel agents produced from hybridisation of two or more small-molecule antibiotics, and novel agents produced from hybridisation of antibiotics with small-molecules that have complementary activity. Many key questions regarding dual-acting hybrid antibiotics remain to be answered, and the proposed benefits of this approach are yet to be demonstrated. While Cadazolid in particular continues to progress in the clinic, suggesting that there is promise in hybridisation through covalent linkage, it may be that properties other than antibacterial activity are key when choosing a partner molecule.

  5. A bioclimatic design methodology for urban outdoor spaces

    NASA Astrophysics Data System (ADS)

    Swaid, H.; Bar-El, M.; Hoffman, M. E.

    1993-03-01

    The development of a bioclimatic urban design methodology is described. The cluster thermal time constant (CTTC) model for predicting street-level urban air temperature variations is coupled with the wind-profile power law and the index of thermal stress (ITS) for human comfort. The CTTC model and the power law produce the diurnal air temperature and wind speed variations in various canyon-like urban forms. The thermal comfort requirements for lightly dressed, moderately walking or seated persons in the outdoor space in summer are then obtained using the ITS model. The proposed methodology enables a first-order assessment of the climatic implications of different features of the physical structure of the city, such as street orientation, canyon height-to-width ratio, building density, and street shading. The application of the proposed methodology is demonstrated for Tel Aviv.
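
    The wind-profile power law referred to above is a standard one-parameter model; a minimal sketch follows, with an illustrative exponent and reference height rather than values from the cited study.

      # Wind-profile power law used to transfer a reference wind speed to street level.
      # The exponent alpha and the reference values are illustrative assumptions.
      def wind_speed(z, u_ref=5.0, z_ref=10.0, alpha=0.28):
          """Mean wind speed at height z [m], scaled from a reference measurement."""
          return u_ref * (z / z_ref) ** alpha

      for z in (2.0, 5.0, 10.0, 20.0):
          print(f"z = {z:5.1f} m  ->  u = {wind_speed(z):.2f} m/s")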

  6. A robust optimization methodology for preliminary aircraft design

    NASA Astrophysics Data System (ADS)

    Prigent, S.; Maréchal, P.; Rondepierre, A.; Druot, T.; Belleville, M.

    2016-05-01

    This article focuses on a robust optimization of an aircraft preliminary design under operational constraints. According to engineers' know-how, the aircraft preliminary design problem can be modelled as an uncertain optimization problem whose objective (the cost or the fuel consumption) is almost affine, and whose constraints are convex. It is shown that this uncertain optimization problem can be approximated in a conservative manner by an uncertain linear optimization program, which enables the use of the techniques of robust linear programming of Ben-Tal, El Ghaoui, and Nemirovski [Robust Optimization, Princeton University Press, 2009]. This methodology is then applied to two real cases of aircraft design and numerical results are presented.
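
    A minimal sketch of the robust-linear-programming idea follows: with interval (box) uncertainty on a constraint row and nonnegative design variables, the robust counterpart simply uses the worst-case coefficients. The numbers are illustrative and are not the aircraft design model of the article.

      import numpy as np
      from scipy.optimize import linprog

      # Illustrative problem: minimize c^T x subject to a^T x <= b, where the row a is
      # only known to lie in the box [a_nom - delta, a_nom + delta] and x >= 0.
      # For nonnegative x the robust counterpart is (a_nom + delta)^T x <= b.
      c     = np.array([-1.0, -2.0])          # maximize x1 + 2*x2 (linprog minimizes)
      a_nom = np.array([1.0, 1.0])
      delta = np.array([0.1, 0.3])            # assumed uncertainty half-widths
      b     = 10.0

      nominal = linprog(c, A_ub=[a_nom], b_ub=[b], bounds=[(0, None)] * 2)
      robust  = linprog(c, A_ub=[a_nom + delta], b_ub=[b], bounds=[(0, None)] * 2)
      print("nominal optimum:", nominal.x, "objective:", nominal.fun)
      print("robust  optimum:", robust.x,  "objective:", robust.fun)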

  7. Aerodynamic configuration design using response surface methodology analysis

    NASA Astrophysics Data System (ADS)

    Engelund, Walter C.; Stanley, Douglas O.; Lepsch, Roger A.; McMillin, Mark M.; Unal, Resit

    1993-08-01

    An investigation has been conducted to determine a set of optimal design parameters for a single-stage-to-orbit reentry vehicle. Several configuration geometry parameters which had a large impact on the entry vehicle flying characteristics were selected as design variables: the fuselage fineness ratio, the nose to body length ratio, the nose camber value, the wing planform area scale factor, and the wing location. The optimal geometry parameter values were chosen using a response surface methodology (RSM) technique which allowed for a minimum dry weight configuration design that met a set of aerodynamic performance constraints on the landing speed, and on the subsonic, supersonic, and hypersonic trim and stability levels. The RSM technique utilized, specifically the central composite design method, is presented, along with the general vehicle conceptual design process. Results are presented for an optimized configuration along with several design trade cases.
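
    The sketch below illustrates the central composite design and quadratic response-surface fit named above, using two coded factors; the "response" is a made-up analytic function standing in for the vehicle analyses, not the study's data.

      import itertools
      import numpy as np

      # Two-factor, face-centered central composite design (alpha = 1) in coded units,
      # fitted with a full quadratic response surface by least squares.
      factorial = np.array(list(itertools.product([-1, 1], repeat=2)), dtype=float)
      axial     = np.array([[-1, 0], [1, 0], [0, -1], [0, 1]], dtype=float)
      center    = np.zeros((3, 2))
      X = np.vstack([factorial, axial, center])          # design points (coded)

      def response(x1, x2):                              # illustrative stand-in model
          return 5.0 + 2.0 * x1 - 3.0 * x2 + 1.5 * x1 * x2 + 0.8 * x1**2 + 0.4 * x2**2

      y = response(X[:, 0], X[:, 1])

      # Quadratic model matrix: 1, x1, x2, x1*x2, x1^2, x2^2
      A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                           X[:, 0] * X[:, 1], X[:, 0]**2, X[:, 1]**2])
      coef, *_ = np.linalg.lstsq(A, y, rcond=None)
      print("fitted coefficients:", np.round(coef, 3))   # recovers the assumed model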

  8. Aerodynamic configuration design using response surface methodology analysis

    NASA Technical Reports Server (NTRS)

    Engelund, Walter C.; Stanley, Douglas O.; Lepsch, Roger A.; Mcmillin, Mark M.; Unal, Resit

    1993-01-01

    An investigation has been conducted to determine a set of optimal design parameters for a single-stage-to-orbit reentry vehicle. Several configuration geometry parameters which had a large impact on the entry vehicle flying characteristics were selected as design variables: the fuselage fineness ratio, the nose to body length ratio, the nose camber value, the wing planform area scale factor, and the wing location. The optimal geometry parameter values were chosen using a response surface methodology (RSM) technique which allowed for a minimum dry weight configuration design that met a set of aerodynamic performance constraints on the landing speed, and on the subsonic, supersonic, and hypersonic trim and stability levels. The RSM technique utilized, specifically the central composite design method, is presented, along with the general vehicle conceptual design process. Results are presented for an optimized configuration along with several design trade cases.

  9. ProSAR: a new methodology for combinatorial library design.

    PubMed

    Chen, Hongming; Börjesson, Ulf; Engkvist, Ola; Kogej, Thierry; Svensson, Mats A; Blomberg, Niklas; Weigelt, Dirk; Burrows, Jeremy N; Lange, Tim

    2009-03-01

    A method is introduced for performing reagent selection for chemical library design based on topological (2D) pharmacophore fingerprints. Optimal reagent selection is achieved by optimizing the Shannon entropy of the 2D pharmacophore distribution for the reagent set. The method, termed ProSAR, is therefore expected to enumerate compounds that could serve as a good starting point for deriving a structure-activity relationship (SAR) in combinatorial library design. This methodology is exemplified by library design examples where the active compounds were already known. The results show that most of the pharmacophores on the substituents for the active compounds are covered by the designed library. This strategy is further expanded to include product property profiles for aqueous solubility, hERG risk assessment, etc. in the optimization process so that the reagent pharmacophore diversity and the product property profile are optimized simultaneously via a genetic algorithm. This strategy is applied to a two-dimensional library design example and compared with libraries designed by a diversity-based strategy which minimizes the average ensemble Tanimoto similarity. Our results show that by using the ProSAR methodology, libraries can be designed with simultaneously good pharmacophore coverage and product property profile.
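
    A minimal sketch of the entropy objective follows: the Shannon entropy of a pharmacophore-feature count distribution is higher when features are spread evenly across the reagent set. The counts are invented for illustration, and the sketch omits fingerprint generation and the genetic algorithm.

      import numpy as np

      # Shannon entropy of a pharmacophore-feature count distribution for a reagent set.
      def shannon_entropy(counts):
          p = np.asarray(counts, dtype=float)
          p = p[p > 0] / p.sum()
          return float(-(p * np.log2(p)).sum())

      diverse_set   = [12, 11, 10, 13, 9, 12]   # features spread evenly -> high entropy
      redundant_set = [50,  3,  2,  1, 1,  1]   # features concentrated  -> low entropy
      print("diverse  :", round(shannon_entropy(diverse_set), 3), "bits")
      print("redundant:", round(shannon_entropy(redundant_set), 3), "bits")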

  10. Evaluation of a Progressive Failure Analysis Methodology for Laminated Composite Structures

    NASA Technical Reports Server (NTRS)

    Sleight, David W.; Knight, Norman F., Jr.; Wang, John T.

    1997-01-01

    A progressive failure analysis methodology has been developed for predicting the nonlinear response and failure of laminated composite structures. The progressive failure analysis uses C plate and shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms. The progressive failure analysis model is implemented into a general purpose finite element code and can predict the damage and response of laminated composite structures from initial loading to final failure.
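
    Of the criteria named above, the maximum strain criterion is the simplest; a ply-level sketch follows, with invented strain allowables and applied strains rather than values from the cited analyses.

      # Ply-level maximum strain failure check (illustrative allowables and strains).
      def max_strain_failure(eps, allowables):
          """eps = (e1, e2, gamma12); allowables = tension/compression/shear limits."""
          e1, e2, g12 = eps
          checks = {
              "fiber":  e1 / allowables["e1t"] if e1 >= 0 else e1 / allowables["e1c"],
              "matrix": e2 / allowables["e2t"] if e2 >= 0 else e2 / allowables["e2c"],
              "shear":  abs(g12) / allowables["g12"],
          }
          failed = {mode: r for mode, r in checks.items() if r >= 1.0}
          return checks, failed

      allow = {"e1t": 0.012, "e1c": -0.010, "e2t": 0.006, "e2c": -0.018, "g12": 0.020}
      ratios, failed_modes = max_strain_failure((0.013, 0.002, 0.004), allow)
      print(ratios, "failed:", list(failed_modes))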

  11. Development of a Design Methodology for Reconfigurable Flight Control Systems

    NASA Technical Reports Server (NTRS)

    Hess, Ronald A.; McLean, C.

    2000-01-01

    A methodology is presented for the design of flight control systems that exhibit stability and performance-robustness in the presence of actuator failures. The design is based upon two elements. The first element consists of a control law that will ensure at least stability in the presence of a class of actuator failures. This law is created by inner-loop, reduced-order, linear dynamic inversion, and outer-loop compensation based upon Quantitative Feedback Theory. The second element consists of adaptive compensators obtained from simple and approximate time-domain identification of the dynamics of the 'effective vehicle' with failed actuator(s). An example involving the lateral-directional control of a fighter aircraft is employed both to introduce the proposed methodology and to demonstrate its effectiveness and limitations.

  12. An automated methodology development. [software design for combat simulation

    NASA Technical Reports Server (NTRS)

    Hawley, L. R.

    1985-01-01

    The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life cycle costs and ease the program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real-time, and one-to-one correlation between the object states and real world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance the code efficiency by, e.g., eliminating non-used subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.

  13. An automated methodology development. [software design for combat simulation

    NASA Technical Reports Server (NTRS)

    Hawley, L. R.

    1985-01-01

    The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life cycle costs and ease the program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real-time, and one-to-one correlation between the object states and real world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance the code efficiency by, e.g., eliminating non-used subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.

  14. Design methodology of an automated scattering measurement facility

    NASA Astrophysics Data System (ADS)

    Mazur, D. G.

    1985-12-01

    This thesis addresses the design methodology surrounding an automated scattering measurement facility. A brief historical survey of radar cross-section (RCS) measurements is presented. The electromagnetic theory associated with a continuous wave (CW) background cancellation technique for measuring RCS is discussed as background. In addition, problems associated with interfacing test equipment, data storage, and output are addressed. The facility used as a model for this thesis is located at the Air Force Institute of Technology, WPAFB, OH. The design methodology applies to any automated scattering measurement facility. A software package incorporating features that enhance students' operation of AFIT's facility is presented. Finally, sample output from the software package illustrates formats for displaying RCS data.

  15. A Progressive Damage Methodology for Residual Strength Predictions of Notched Composite Panels

    NASA Technical Reports Server (NTRS)

    Coats, Timothy W.; Harris, Charles E.

    1998-01-01

    The translaminate fracture behavior of carbon/epoxy structural laminates with through-penetration notches was investigated to develop a residual strength prediction methodology for composite structures. An experimental characterization of several composite materials systems revealed a fracture resistance behavior that was very similar to the R-curve behavior exhibited by ductile metals. Fractographic examinations led to the postulate that the damage growth resistance was primarily due to fractured fibers in the principal load-carrying plies being bridged by intact fibers of the adjacent plies. The load transfer associated with this bridging mechanism suggests that a progressive damage analysis methodology will be appropriate for predicting the residual strength of laminates with through-penetration notches. A progressive damage methodology developed by the authors was used to predict the initiation and growth of matrix cracks and fiber fracture. Most of the residual strength predictions for different panel widths, notch lengths, and material systems were within about 10% of the experimental failure loads.

  16. Formulation of a methodology for power circuit design optimization

    NASA Technical Reports Server (NTRS)

    Yu, Y.; Bachmann, M.; Lee, F. C. Y.; Triner, J. E.

    1976-01-01

    A methodology for optimizing power-processor designs is described which achieves optimization with respect to some power-processor characteristic deemed particularly desirable by the designer, such as weight or efficiency. Optimization theory based on Lagrange multipliers is reviewed together with nonlinear programming techniques employing penalty functions. The methodology, the task of which is to minimize an objective function subject to design constraints, is demonstrated with the aid of four examples: optimum-weight core selection for an inductor with a predetermined winding size, optimum-weight inductor design with a given loss constraint, optimum-loss inductor design with a given weight constraint, and a comparison of optimum-weight single- and two-stage input-filter designs with identical loss and other requirement constraints. Closed-form solutions for the first three examples are obtained by applying the Lagrange-multiplier method, but solutions for the last example are found numerically through the use of the sequential unconstrained minimization technique.
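
    The second example above (optimum-weight inductor design under a loss constraint) can be mimicked with a toy constrained minimization; the functional forms and numbers below are illustrative placeholders, not the paper's component models, and a numerical optimizer stands in for the closed-form Lagrange-multiplier solution.

      import numpy as np
      from scipy.optimize import minimize

      # Toy design problem: minimize a "weight" objective subject to a fixed "loss"
      # equality constraint.  x = [core size, winding size] in arbitrary units.
      def weight(x):
          return 2.0 * x[0] + 1.0 * x[1]

      def loss(x):                   # assumed loss model, decreasing in both variables
          return 4.0 / x[0] + 1.0 / x[1]

      loss_budget = 3.0
      res = minimize(weight, x0=[1.0, 1.0],
                     constraints=[{"type": "eq", "fun": lambda x: loss(x) - loss_budget}],
                     bounds=[(0.1, None), (0.1, None)])
      print("design:", np.round(res.x, 3), "weight:", round(res.fun, 3))
      # The Lagrange multiplier of the loss constraint measures the weight penalty per
      # unit of additional allowed loss (the kind of sensitivity the paper exploits).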

  17. An integrated risk analysis methodology in a multidisciplinary design environment

    NASA Astrophysics Data System (ADS)

    Hampton, Katrina Renee

    Design of complex, one-of-a-kind systems, such as space transportation systems, is characterized by high uncertainty and, consequently, high risk. It is necessary to account for these uncertainties in the design process to produce systems that are more reliable. Systems designed by including uncertainties and managing them as well are more robust and less prone to poor operations as a result of parameter variability. The quantification, analysis, and mitigation of uncertainties are challenging tasks, as many systems lack historical data. In such an environment, risk or uncertainty quantification becomes subjective because input data is based on professional judgment. Additionally, there are uncertainties associated with the analysis tools and models. Both the input data and the model uncertainties must be considered for a multidisciplinary, systems-level risk analysis. This research synthesizes an integrated approach for developing a method for risk analysis. Expert judgment methodology is employed to quantify external risk. This methodology is then combined with a Latin Hypercube Sampling-Monte Carlo simulation to propagate uncertainties across a multidisciplinary environment for the overall system. Finally, a robust design strategy is employed to mitigate risk during the optimization process. This type of approach to risk analysis is conducive to the examination of quantitative risk factors. The core of this research methodology is the theoretical framework for uncertainty propagation. The research is divided into three stages or modules. The first two modules include the identification/quantification and propagation of uncertainties. The third module involves the management of uncertainties or response optimization. This final module also incorporates the integration of risk into program decision-making. The risk analysis methodology is applied to a launch vehicle conceptual design study at NASA Langley Research Center. The launch vehicle multidisciplinary
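
    A minimal sketch of the Latin Hypercube Sampling step follows: each uncertain input is stratified into equal-probability bins and the bins are randomly paired before being pushed through a placeholder system response. The distributions and the response function are invented for illustration.

      import numpy as np

      def latin_hypercube(n, d, rng):
          """n samples in d dimensions, one sample per equal-probability stratum."""
          u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
          for j in range(d):
              u[:, j] = rng.permutation(u[:, j])   # decouple the columns
          return u

      rng = np.random.default_rng(1)
      u = latin_hypercube(1000, 2, rng)
      thrust = 900.0 + 50.0 * u[:, 0]              # uniform uncertain input (illustrative)
      drag   = 100.0 + 30.0 * u[:, 1]
      margin = thrust - 10.0 * np.sqrt(drag)       # placeholder system response
      print("P(margin < 800) ~", np.mean(margin < 800.0))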

  18. Methodology for object-oriented real-time systems analysis and design: Software engineering

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly 'seamlessly' from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structuring of the system so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation when the original specification and perhaps high-level design is non-object oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions which emphasizes data and control flows followed by the abstraction of objects where the operations or methods of the objects correspond to processes in the data flow diagrams and then design in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects) each having its own time-behavior defined by a set of states and state-transition rules and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly-connected models which progress from the object-oriented real-time systems analysis and design system analysis logical models through the physical architectural models and the high-level design stages. The methodology is appropriate to the overall specification including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.
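
    A minimal sketch of the "real-time systems-analysis object" idea follows: a concurrent entity whose time-behavior is a set of states plus state-transition rules. The valve example, its states, and its events are invented for illustration.

      # Minimal state/transition object of the kind described above (illustrative only).
      class AnalysisObject:
          def __init__(self, name, initial, transitions):
              self.name = name
              self.state = initial
              self.transitions = transitions          # {(state, event): next_state}

          def on_event(self, event):
              key = (self.state, event)
              if key in self.transitions:
                  self.state = self.transitions[key]
              return self.state

      valve = AnalysisObject("coolant_valve", "closed",
                             {("closed", "open_cmd"): "opening",
                              ("opening", "limit_switch"): "open",
                              ("open", "close_cmd"): "closed"})
      for ev in ("open_cmd", "limit_switch", "close_cmd"):
          print(ev, "->", valve.on_event(ev))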

  19. Thin Film Heat Flux Sensors: Design and Methodology

    NASA Technical Reports Server (NTRS)

    Fralick, Gustave C.; Wrbanek, John D.

    2013-01-01

    Thin Film Heat Flux Sensors: Design and Methodology: (1) heat flux is one of a number of parameters, together with pressure, temperature, flow, etc., of interest to engine designers and fluid dynamicists; (2) the measurement of heat flux is of interest in directly determining the cooling requirements of hot-section blades and vanes; and (3) in addition, if the surface and gas temperatures are known, the measurement of heat flux provides a value for the convective heat transfer coefficient that can be compared with the value provided by CFD codes.
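
    Point (3) amounts to Newton's law of cooling; a short sketch with illustrative numbers (not measured data) follows.

      # If surface heat flux q [W/m^2], gas temperature T_gas and wall temperature T_wall
      # are measured, the convective coefficient is h = q / (T_gas - T_wall).
      def convective_coefficient(q, T_gas, T_wall):
          return q / (T_gas - T_wall)

      h = convective_coefficient(q=250_000.0, T_gas=1600.0, T_wall=1100.0)  # units: W/m^2, K
      print(f"h = {h:.0f} W/(m^2 K)")   # 500 W/(m^2 K), comparable with CFD output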

  20. Integrating rock mechanics issues with repository design through design process principles and methodology

    SciTech Connect

    Bieniawski, Z.T.

    1996-04-01

    A good designer needs not only knowledge for designing (technical know-how that is used to generate alternative design solutions) but also knowledge about designing (appropriate principles and a systematic methodology to follow). Concepts such as "design for manufacture" or "concurrent engineering" are widely used in industry. In the field of rock engineering, only limited attention has been paid to the design process because the design of structures in rock masses presents unique challenges to designers as a result of the uncertainties inherent in characterization of geologic media. However, a stage has now been reached where we are able to sufficiently characterize rock masses for engineering purposes and identify the rock mechanics issues involved, but we are still lacking engineering design principles and methodology to maximize design performance. This paper discusses the principles and methodology of the engineering design process directed at integrating site characterization activities with design, construction, and performance of an underground repository. Using the latest information from the Yucca Mountain Project on geology, rock mechanics, and starter tunnel design, the current lack of integration is pointed out, and it is shown how rock mechanics issues can be effectively interwoven with repository design through a systematic design process methodology leading to improved repository performance. In essence, the design process is seen as the use of design principles within an integrating design methodology, leading to innovative problem solving. In particular, a new concept of "Design for Constructibility and Performance" is introduced. This is discussed with respect to ten rock mechanics issues identified for repository design and performance.

  1. When Playing Meets Learning: Methodological Framework for Designing Educational Games

    NASA Astrophysics Data System (ADS)

    Linek, Stephanie B.; Schwarz, Daniel; Bopp, Matthias; Albert, Dietrich

    Game-based learning builds upon the idea of using the motivational potential of video games in the educational context. Thus, the design of educational games has to address optimizing enjoyment as well as optimizing learning. Within the EC-project ELEKTRA a methodological framework for the conceptual design of educational games was developed. Thereby state-of-the-art psycho-pedagogical approaches were combined with insights of media-psychology as well as with best-practice game design. This science-based interdisciplinary approach was enriched by enclosed empirical research to answer open questions on educational game-design. Additionally, several evaluation-cycles were implemented to achieve further improvements. The psycho-pedagogical core of the methodology can be summarized by the ELEKTRA's 4Ms: Macroadaptivity, Microadaptivity, Metacognition, and Motivation. The conceptual framework is structured in eight phases which have several interconnections and feedback-cycles that enable a close interdisciplinary collaboration between game design, pedagogy, cognitive science and media psychology.

  2. Acceptance testing for PACS: from methodology to design to implementation

    NASA Astrophysics Data System (ADS)

    Liu, Brent J.; Huang, H. K.

    2004-04-01

    Acceptance Testing (AT) is a crucial step in the implementation process of a PACS within a clinical environment. AT determines whether the PACS is ready for clinical use and marks the official sign off of the PACS product. Most PACS vendors have Acceptance Testing (AT) plans, however, these plans do not provide a complete and robust evaluation of the full system. In addition, different sites will have different special requirements that vendor AT plans do not cover. The purpose of this paper is to introduce a protocol for AT design and present case studies of AT performed on clinical PACS. A methodology is presented that includes identifying testing components within PACS, quality assurance for both functionality and performance, and technical testing focusing on key single points-of-failure within the PACS product. Tools and resources that provide assistance in performing AT are discussed. In addition, implementation of the AT within the clinical environment and the overall implementation timeline of the PACS process are presented. Finally, case studies of actual AT of clinical PACS performed in the healthcare environment will be reviewed. The methodology for designing and implementing a robust AT plan for PACS was documented and has been used in PACS acceptance tests in several sites. This methodology can be applied to any PACS and can be used as a validation for the PACS product being acquired by radiology departments and hospitals. A methodology for AT design and implementation was presented that can be applied to future PACS installations. A robust AT plan for a PACS installation can increase both the utilization and satisfaction of a successful implementation of a PACS product that benefits both vendor and customer.

  3. Implementation of probabilistic design methodology at Tennessee State University

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chinyere

    1995-01-01

    The fact that the Deterministic Design Method no longer satisfies most design needs calls for methods that can cope with rapidly advancing technology. The advance in computer technology has reduced the rigors that normally accompany many design analysis methods that account for uncertainties in design parameters. Probabilistic Design Methodology (PDM) is beginning to make an impact in engineering design. This method is gaining more recognition in industry than in educational institutions. Some of the reasons for the limited use of PDM at the moment are that many are unaware of its potential, and most of the software developed for PDM is very recent. The central goal of the PDM project at Tennessee State University is to introduce engineering students to this method. The students participating in the project learn about PDM and the computer codes that are available to the design engineer. The software being used for this project is NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), developed under the NASA probabilistic structural analysis program. NESSUS has three different modules, which make it a very comprehensive computer code for PDM. Since this method is new to the students, its introduction into the engineering curriculum is to be in stages, ranging from the introduction of PDM and its software to the applications. While this program is being developed for its eventual inclusion in the engineering curriculum, some graduate and undergraduate students are already carrying out projects using this method. As the students increase their understanding of PDM, they are at the same time applying it to some common design problems. The areas to which this method is currently being applied include: Design of Gears (spur and worm); Design of Brakes; Design of Heat Exchangers; Design of Helical Springs; and Design of Shock Absorbers. Some of the current results of these projects are presented.
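
    A stress-strength Monte Carlo estimate is the simplest illustration of the probabilistic idea described above; the distributions below are invented for teaching purposes and do not correspond to any of the listed student designs.

      import numpy as np

      # Stress-strength illustration: both load-induced stress and material strength are
      # random, and the probability of non-failure P(strength > stress) replaces a fixed
      # safety factor.  Distribution parameters are illustrative assumptions.
      rng = np.random.default_rng(42)
      n = 200_000
      stress   = rng.normal(300.0, 30.0, n)    # MPa
      strength = rng.normal(420.0, 45.0, n)    # MPa
      reliability = np.mean(strength > stress)
      print(f"estimated probability of non-failure: {reliability:.4f}")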

  4. Implementation of probabilistic design methodology at Tennessee State University

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chinyere

    1995-01-01

    The fact that the Deterministic Design Method no longer satisfies most design needs calls for methods that can cope with rapidly advancing technology. The advance in computer technology has reduced the rigors that normally accompany many design analysis methods that account for uncertainties in design parameters. Probabilistic Design Methodology (PDM) is beginning to make an impact in engineering design. This method is gaining more recognition in industry than in educational institutions. Some of the reasons for the limited use of PDM at the moment are that many are unaware of its potential, and most of the software developed for PDM is very recent. The central goal of the PDM project at Tennessee State University is to introduce engineering students to this method. The students participating in the project learn about PDM and the computer codes that are available to the design engineer. The software being used for this project is NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), developed under the NASA probabilistic structural analysis program. NESSUS has three different modules, which make it a very comprehensive computer code for PDM. Since this method is new to the students, its introduction into the engineering curriculum is to be in stages, ranging from the introduction of PDM and its software to the applications. While this program is being developed for its eventual inclusion in the engineering curriculum, some graduate and undergraduate students are already carrying out projects using this method. As the students increase their understanding of PDM, they are at the same time applying it to some common design problems. The areas to which this method is currently being applied include: Design of Gears (spur and worm); Design of Brakes; Design of Heat Exchangers; Design of Helical Springs; and Design of Shock Absorbers. Some of the current results of these projects are presented.

  5. The design and methodology of premature ejaculation interventional studies

    PubMed Central

    2016-01-01

    Large well-designed clinical efficacy and safety randomized clinical trials (RCTs) are required to achieve regulatory approval of new drug treatments. The objective of this article is to make recommendations for the criteria for defining and selecting the clinical trial study population, design and efficacy outcomes measures which comprise ideal premature ejaculation (PE) interventional trial methodology. Data on clinical trial design, epidemiology, definitions, dimensions and psychological impact of PE was reviewed, critiqued and incorporated into a series of recommendations for standardisation of PE clinical trial design, outcome measures and reporting using the principles of evidence based medicine. Data from PE interventional studies are only reliable, interpretable and capable of being generalised to patients with PE, when study populations are defined by the International Society for Sexual Medicine (ISSM) multivariate definition of PE. PE intervention trials should employ a double-blind RCT methodology and include placebo control, active standard drug control, and/or dose comparison trials. Ejaculatory latency time (ELT) and subject/partner outcome measures of control, personal/partner/relationship distress and other study-specific outcome measures should be used as outcome measures. There is currently no published literature which identifies a clinically significant threshold response to intervention. The ISSM definition of PE reflects the contemporary understanding of PE and represents the state-of-the-art multi-dimensional definition of PE and is recommended as the basis of diagnosis of PE for all PE clinical trials. PMID:27652224

  6. The design and methodology of premature ejaculation interventional studies.

    PubMed

    McMahon, Chris G

    2016-08-01

    Large well-designed clinical efficacy and safety randomized clinical trials (RCTs) are required to achieve regulatory approval of new drug treatments. The objective of this article is to make recommendations for the criteria for defining and selecting the clinical trial study population, design and efficacy outcomes measures which comprise ideal premature ejaculation (PE) interventional trial methodology. Data on clinical trial design, epidemiology, definitions, dimensions and psychological impact of PE was reviewed, critiqued and incorporated into a series of recommendations for standardisation of PE clinical trial design, outcome measures and reporting using the principles of evidence based medicine. Data from PE interventional studies are only reliable, interpretable and capable of being generalised to patients with PE, when study populations are defined by the International Society for Sexual Medicine (ISSM) multivariate definition of PE. PE intervention trials should employ a double-blind RCT methodology and include placebo control, active standard drug control, and/or dose comparison trials. Ejaculatory latency time (ELT) and subject/partner outcome measures of control, personal/partner/relationship distress and other study-specific outcome measures should be used as outcome measures. There is currently no published literature which identifies a clinically significant threshold response to intervention. The ISSM definition of PE reflects the contemporary understanding of PE and represents the state-of-the-art multi-dimensional definition of PE and is recommended as the basis of diagnosis of PE for all PE clinical trials.

  7. Fuel cell cathode air filters: Methodologies for design and optimization

    NASA Astrophysics Data System (ADS)

    Kennedy, Daniel M.; Cahela, Donald R.; Zhu, Wenhua H.; Westrom, Kenneth C.; Nelms, R. Mark; Tatarchuk, Bruce J.

    Proton exchange membrane (PEM) fuel cells experience performance degradation, such as reduction in efficiency and life, as a result of poisoning of platinum catalysts by airborne contaminants. Research on these contaminant effects suggests that the best possible solution to allowing fuel cells to operate in contaminated environments is by filtration of the harmful contaminants from the cathode air. A cathode air filter design methodology was created that connects properties of cathode air stream, filter design options, and filter footprint, to a set of adsorptive filter parameters that must be optimized to efficiently operate the fuel cell. Filter optimization requires a study of the trade off between two causal factors of power loss: first, a reduction in power production due to poisoning of the platinum catalyst by chemical contaminants and second, an increase in power requirements to operate the air compressor with a larger pressure drop from additional contaminant filtration. The design methodology was successfully applied to a 1.2 kW fuel cell using a programmable algorithm and predictions were made about the relationships between inlet concentration, breakthrough time, filter design, pressure drop, and compressor power requirements.
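
    The trade-off described above can be sketched with placeholder models: a deeper adsorbent bed reduces poisoning losses but increases pressure drop and hence compressor power. All functional forms and constants below are illustrative assumptions, not the paper's validated models.

      import numpy as np

      depth = np.linspace(0.005, 0.05, 200)                     # filter bed depth [m]
      poisoning_loss  = 120.0 * np.exp(-depth / 0.01)           # W lost to catalyst poisoning
      compressor_loss = 2.0e3 * depth                           # W for the extra pressure drop
      total_loss = poisoning_loss + compressor_loss
      best = depth[np.argmin(total_loss)]
      print(f"best bed depth = {best*1000:.1f} mm, "
            f"net power penalty = {total_loss.min():.1f} W")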

  8. Behavioral headache research: methodologic considerations and research design alternatives.

    PubMed

    Hursey, Karl G; Rains, Jeanetta C; Penzien, Donald B; Nash, Justin M; Nicholson, Robert A

    2005-05-01

    Behavioral headache treatments have garnered solid empirical support in recent years, but there is substantial opportunity to strengthen the next generation of studies with improved methods and consistency across studies. Recently, Guidelines for Trials of Behavioral Treatments for Recurrent Headache were published to facilitate the production of high-quality research. The present article complements the guidelines with a discussion of methodologic and research design considerations. Since there is no research design that is applicable in every situation, selecting an appropriate research design is fundamental to producing meaningful results. Investigators in behavioral headache and other areas of research consider the developmental phase of the research, the principal objectives of the project, and the sources of error or alternative interpretations in selecting a design. Phases of clinical trials typically include pilot studies, efficacy studies, and effectiveness studies. These trials may be categorized as primarily pragmatic or explanatory. The most appropriate research designs for these different phases and different objectives vary on such characteristics as sample size and assignment to condition, types of control conditions, periods or frequency of measurement, and the dimensions along which comparisons are made. A research design also must fit within constraints on available resources. There are a large number of potential research designs that can be used, and considering these characteristics allows selection of appropriate research designs.

  9. [Methodological design of the National Health and Nutrition Survey 2016].

    PubMed

    Romero-Martínez, Martín; Shamah-Levy, Teresa; Cuevas-Nasu, Lucía; Gómez-Humarán, Ignacio Méndez; Gaona-Pineda, Elsa Berenice; Gómez-Acosta, Luz María; Rivera-Dommarco, Juan Ángel; Hernández-Ávila, Mauricio

    2017-01-01

    The design methodology of the 2016 halfway health and nutrition national survey (Ensanut-MC) is described. The Ensanut-MC is a national probabilistic survey whose target population is the inhabitants of private households in Mexico. The sample size was determined to allow inferences for urban and rural areas in four regions. The main design elements are described: target population, topics of study, sampling procedure, measurement procedure, and logistics organization. The final sample comprised 9 479 completed household interviews and 16 591 individual interviews. The response rate for households was 77.9%, and the response rate for individuals was 91.9%. The Ensanut-MC probabilistic design allows valid statistical inferences about parameters of interest for Mexico's public health and nutrition, specifically on overweight, obesity and diabetes mellitus. Updated information also supports the monitoring, updating and formulation of new policies and priority programs.

  10. Applying a user centered design methodology in a clinical context.

    PubMed

    Kashfi, Hajar

    2010-01-01

    A clinical decision support system (CDSS) is an interactive application that is used to facilitate the process of decision-making in a clinical context. Developing a usable CDSS is a challenging process, mostly because of the complex nature of domain knowledge and the context of use of those systems. This paper describes how a user centered design (UCD) approach can be used in a clinical context for developing a CDSS. In our effort, a design-based research methodology has been used. The outcomes of this work are as follows: a customized UCD approach is suggested that combines UCD and openEHR; moreover, the GUI developed in the design phase and the results of the GUI evaluation are briefly presented.

  11. Aircraft conceptual design - an adaptable parametric sizing methodology

    NASA Astrophysics Data System (ADS)

    Coleman, Gary John, Jr.

    Aerospace is a maturing industry with successful and refined baselines which work well for traditional baseline missions, markets and technologies. However, when new markets (space tourism), new constraints (environmental), or new technologies (composites, natural laminar flow) emerge, the conventional solution is not necessarily best for the new situation. This begs the question "how does a design team quickly screen and compare novel solutions to conventional solutions for new aerospace challenges?" The answer is rapid and flexible conceptual design Parametric Sizing. In the product design life-cycle, parametric sizing is the first step in screening the total vehicle in terms of mission, configuration and technology to quickly assess first-order design and mission sensitivities. During this phase, various missions and technologies are assessed, and the designer identifies design solutions of concepts and configurations to meet combinations of mission and technology. This research undertaking contributes to the state-of-the-art in aircraft parametric sizing through (1) development of a dedicated conceptual design process and disciplinary methods library, (2) development of a novel and robust parametric sizing process based on 'best-practice' approaches found in the process and disciplinary methods library, and (3) application of the parametric sizing process to a variety of design missions (transonic, supersonic and hypersonic transports), different configurations (tail-aft, blended wing body, strut-braced wing, hypersonic blended bodies, etc.), and different technologies (composite, natural laminar flow, thrust vectored control, etc.), in order to demonstrate the robustness of the methodology and unearth first-order design sensitivities to current and future aerospace design problems. This research undertaking demonstrates the importance of this early design step in selecting the correct combination of mission, technologies and configuration to

  12. Development and implementation of rotorcraft preliminary design methodology using multidisciplinary design optimization

    NASA Astrophysics Data System (ADS)

    Khalid, Adeel Syed

    Rotorcraft's evolution has lagged behind that of fixed-wing aircraft. One of the reasons for this gap is the absence of a formal methodology to accomplish a complete conceptual and preliminary design. Traditional rotorcraft methodologies are not only time consuming and expensive but also yield sub-optimal designs. Rotorcraft design is an excellent example of a multidisciplinary complex environment where several interdependent disciplines are involved. A formal framework is developed and implemented in this research for preliminary rotorcraft design using IPPD methodology. The design methodology consists of the product and process development cycles. In the product development loop, all the technical aspects of design are considered including the vehicle engineering, dynamic analysis, stability and control, aerodynamic performance, propulsion, transmission design, weight and balance, noise analysis and economic analysis. The design loop starts with a detailed analysis of requirements. A baseline is selected and upgrade targets are identified depending on the mission requirements. An Overall Evaluation Criterion (OEC) is developed that is used to measure the goodness of the design or to compare the design with competitors. The requirements analysis and baseline upgrade targets lead to the initial sizing and performance estimation of the new design. The digital information is then passed to disciplinary experts. This is where the detailed disciplinary analyses are performed. Information is transferred from one discipline to another as the design loop is iterated. To coordinate all the disciplines in the product development cycle, Multidisciplinary Design Optimization (MDO) techniques e.g. All At Once (AAO) and Collaborative Optimization (CO) are suggested. The methodology is implemented on a Light Turbine Training Helicopter (LTTH) design. Detailed disciplinary analyses are integrated through a common platform for efficient and centralized transfer of design

  13. EuroSkyWay Multipurpose Terminal: architecture and design methodology

    NASA Astrophysics Data System (ADS)

    Ciancarelli, C.; Macchia, G.

    2002-07-01

    This paper describes the architecture of the EuroSkyWay Multipurpose Terminal, which is aimed at satisfying the needs of the Provider and Gateway Terminal of the EuroSkyWay system. First, an overview of the EuroSkyWay system is given, with a short description of the proprietary EuroSkyWay protocol layer and of the terminals. The extension of the OMT methodology that has been used to design the system is then described. Finally, the architecture of the Multipurpose Terminal is described in terms of configurations, hardware and software.

  14. Experimental design methodology: the scientific tool for performance evaluation

    NASA Astrophysics Data System (ADS)

    Sadjadi, Firooz A.

    1990-09-01

    The rapid growth of signal and image processing technology over the last several decades has created a need for means of evaluating and comparing the numerous algorithms and systems that have been or are being developed. In the past, performance evaluation has been mostly ad hoc and lacking in cohesion. In this paper we present a systematic, step-by-step approach for the scientific evaluation of signal and image processing algorithms and systems. This approach is based on the methodology of Experimental Design. We illustrate this method by means of an example from the field of automatic object recognition.

  15. Application of concept selection methodology in IC process design

    NASA Astrophysics Data System (ADS)

    Kim, Myung-Kul

    1993-01-01

    The search for an effective methodology practical in IC manufacturing process development led to a trial of a quantitative 'concept selection' methodology for selecting the 'best' alternative for interlevel dielectric (ILD) processes. A cross-functional team selected multiple criteria, with scoring guidelines, to be used in the definition of the 'best'. The project targeted the three-level-metal backend process for a sub-micron gate array product. The outcome of the project showed that the maturity of the alternatives has a strong influence on the scores, because scores on the adopted criteria such as yield, reliability and maturity depend on the maturity of a particular process. At the same time, the project took longer than expected since it required data for the multiple criteria. These observations suggest that adopting a simpler procedure that can analyze the total inherent controllability of a process would be more effective. The methodology of the DFS (design for simplicity) tools used in analyzing the manufacturability of electronics products such as computers, phones and other consumer electronics could be used as an 'analogy' in constructing an evaluation method for the IC processes that produce the devices used in those products. This could be done by focusing on the basic process operation elements rather than the layers that are being built.
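
    A minimal sketch of the quantitative concept-selection scoring follows, with invented weights and scores rather than the project's actual criteria data.

      import numpy as np

      # Weighted scoring of interlevel-dielectric (ILD) process alternatives (illustrative).
      criteria = ["yield", "reliability", "maturity", "cost"]
      weights  = np.array([0.35, 0.30, 0.20, 0.15])
      scores   = np.array([[7, 6, 8, 5],      # alternative A
                           [8, 7, 5, 6],      # alternative B
                           [6, 8, 6, 8]])     # alternative C
      totals = scores @ weights
      for name, total in zip("ABC", totals):
          print(f"alternative {name}: weighted score {total:.2f}")
      print("selected:", "ABC"[int(np.argmax(totals))])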

  16. 50 years of medicinal plant research - every progress in methodology is a progress in science.

    PubMed

    Phillipson, J David

    2003-06-01

    Many scientific methods of analysis have been developed for the investigation of the constituents and biological activities of medicinal plants during the 50 years since the inaugural meeting of the Gesellschaft für Arzneipflanzenforschung (GA). The chromatographic (e. g., TLC, GLC, HPLC), spectroscopic (e. g., UV, IR, 1H- and 13C-NMR, MS), and biological (e. g., anticancer, anti-inflammatory, immunostimulant, antiprotozoal, CNS) techniques utilized for medicinal plant research are briefly reviewed. The contribution that advances in scientific methodology have made to our understanding of the actions of some herbal medicines (e. g., Echinacea, Ginkgo, St John's wort, Cannabis), as well as to ethnopharmacology and biotechnology, are briefly summarized. Plants have provided many medicinal drugs in the past and remain as a potential source of novel therapeutic agents. Despite all of the powerful analytical techniques available, the majority of plant species has not been investigated chemically or biologically in any great detail and even well known medicinal plants require further clinical study.

  17. Methodologies and study designs relevant to medical education research.

    PubMed

    Turner, Teri L; Balmer, Dorene F; Coverdale, John H

    2013-06-01

    Research is an important part of educational scholarship. Knowledge of research methodologies is essential for both conducting research as well as determining the soundness of the findings from published studies. Our goals for this paper therefore are to inform medical education researchers of the range and key components of educational research designs. We will discuss both qualitative and quantitative approaches to educational research. Qualitative methods will be presented according to traditions that have a distinguished history in particular disciplines. Quantitative methods will be presented according to an evidence-based hierarchy akin to that of evidence-based medicine with the stronger designs (systematic reviews and well conducted educational randomized controlled trials) at the top, and weaker designs (descriptive studies without comparison groups, or single case studies) at the bottom. It should be appreciated, however, that the research question determines the study design. Therefore, the onus is on the researcher to choose a design that is appropriate to answering the question. We conclude with an overview of how educational researchers should describe the study design and methods in order to provide transparency and clarity.

  18. The Study of the Relationship between Probabilistic Design and Axiomatic Design Methodology. Volume 2

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chin-Yere; Onyebueke, Landon

    1996-01-01

    The structural design, or the design of machine elements, has been traditionally based on deterministic design methodology. The deterministic method considers all design parameters to be known with certainty. This methodology is, therefore, inadequate to design complex structures that are subjected to a variety of complex, severe loading conditions. A nonlinear behavior that is dependent on stress, stress rate, temperature, number of load cycles, and time is observed on all components subjected to complex conditions. These complex conditions introduce uncertainties; hence, the actual factor of safety margin remains unknown. In the deterministic methodology, the contingency of failure is discounted; hence, there is a use of a high factor of safety. It may be most useful in situations where the design structures are simple. The probabilistic method is concerned with the probability of non-failure performance of structures or machine elements. It is much more useful in situations where the design is characterized by complex geometry, possibility of catastrophic failure, sensitive loads and material properties. Also included: Comparative Study of the use of AGMA Geometry Factors and Probabilistic Design Methodology in the Design of Compact Spur Gear Set.

  19. A Research Methodology for Green IT Systems Based on WSR and Design Science: The Case of a Chinese Company

    NASA Astrophysics Data System (ADS)

    Zhong, Yinghong; Liu, Hongwei

    Green IT is currently a hot topic in both practice and research. Much progress has been made on green technologies. However, researchers and designers cannot simply build up a green IT system from the technological aspect alone; such a system is normally considered a wicked problem. This paper puts forward a research methodology for green IT systems by introducing WSR and design science. The methodology draws on soft systems methodology and action research. It considers the research, design and building of green IT systems from a systemic perspective that can be divided into a technological dimension, a management dimension and a human dimension. The methodology consists of seven iterated stages. Each stage is presented and followed by a case study from a Chinese company.

  20. Design Evolution and Methodology for Pumpkin Super-Pressure Balloons

    NASA Astrophysics Data System (ADS)

    Farley, Rodger

    The NASA Ultra Long Duration Balloon (ULDB) program has had many technical development issues discovered and solved along its road to success as a new vehicle. It has the promise of being a sub-satellite, a means to launch up to 2700 kg to 33.5 km altitude for 100 days from a comfortable mid-latitude launch point. Current high-lift long duration ballooning is accomplished out of Antarctica with zero-pressure balloons, which cannot cope with the rigors of diurnal cycles. The ULDB design is still evolving, the product of intense analytical effort, scaled testing, improved manufacturing, and engineering intuition. The past technical problems, in particular the s-cleft deformation, their solutions, future challenges, and the methodology of pumpkin balloon design will generally be described.

  1. Fast underdetermined BSS architecture design methodology for real time applications.

    PubMed

    Mopuri, Suresh; Reddy, P Sreenivasa; Acharyya, Amit; Naik, Ganesh R

    2015-01-01

    In this paper, we propose a high speed architecture design methodology for the Under-determined Blind Source Separation (UBSS) algorithm using our recently proposed high speed Discrete Hilbert Transform (DHT), targeting real time applications. In the UBSS algorithm, unlike typical BSS, the number of sensors is less than the number of sources, which is of more interest in real time applications. The DHT architecture has been implemented based on a sub-matrix multiplication method to compute an M-point DHT, which uses the N-point architecture recursively, where M is an integer multiple of N. The DHT architecture and the state-of-the-art architecture are coded in VHDL for a 16-bit word length, and ASIC implementation is carried out using UMC 90-nm technology at VDD = 1 V and a 1 MHz clock frequency. The proposed architecture implementation and experimental comparison results show that the DHT design is two times faster than the state-of-the-art architecture.
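
    As a software reference against which a fixed-point hardware DHT could be checked, the sketch below computes a discrete Hilbert transform via the FFT. This is a generic frequency-domain definition, not the paper's sub-matrix multiplication formulation.

      import numpy as np

      def dht(x):
          """Discrete Hilbert transform via the FFT (generic definition)."""
          x = np.asarray(x, dtype=float)
          N = x.size
          mult = np.zeros(N, dtype=complex)
          mult[1:(N + 1) // 2] = -1j      # positive-frequency bins
          mult[N // 2 + 1:] = 1j          # negative-frequency bins (DC/Nyquist stay 0)
          return np.real(np.fft.ifft(mult * np.fft.fft(x)))

      n = np.arange(64)
      x = np.cos(2 * np.pi * 5 * n / 64)
      print(np.allclose(dht(x), np.sin(2 * np.pi * 5 * n / 64)))   # expect True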

  2. A methodology for double patterning compliant split and design

    NASA Astrophysics Data System (ADS)

    Wiaux, Vincent; Verhaegen, Staf; Iwamoto, Fumio; Maenhoudt, Mireille; Matsuda, Takashi; Postnikov, Sergei; Vandenberghe, Geert

    2008-11-01

    Double Patterning allows the use of water immersion lithography to be extended further at its maximum numerical aperture NA = 1.35. Splitting design layers and recombining them through Double Patterning (DP) enables an effective resolution enhancement. Single polygons may need to be split up (cut) depending on the pattern density and its 2D content. The split polygons recombine at so-called 'stitching points'. These stitching points may affect the yield due to their sensitivity to process variations. We describe a methodology to ensure robust double patterning by identifying proper split and design guidelines. Using simulations and experimental data, we discuss in particular the metal1 first interconnect layers of random LOGIC and DRAM applications at 45 nm half-pitch (hp) and 32 nm hp, where DP may become the only timely patterning solution.

  3. A variable-gain output feedback control design methodology

    NASA Technical Reports Server (NTRS)

    Halyo, Nesim; Moerder, Daniel D.; Broussard, John R.; Taylor, Deborah B.

    1989-01-01

    A digital control system design technique is developed in which the control system gain matrix varies with the plant operating point parameters. The design technique is obtained by formulating the problem as an optimal stochastic output feedback control law with variable gains. This approach provides a control theory framework within which the operating range of a control law can be significantly extended. Furthermore, the approach avoids the major shortcomings of the conventional gain-scheduling techniques. The optimal variable gain output feedback control problem is solved by embedding the Multi-Configuration Control (MCC) problem, previously solved at ICS. An algorithm to compute the optimal variable gain output feedback control gain matrices is developed. The algorithm is a modified version of the MCC algorithm improved so as to handle the large dimensionality which arises particularly in variable-gain control problems. The design methodology developed is applied to a reconfigurable aircraft control problem. A variable-gain output feedback control problem was formulated to design a flight control law for an AFTI F-16 aircraft which can automatically reconfigure its control strategy to accommodate failures in the horizontal tail control surface. Simulations of the closed-loop reconfigurable system show that the approach produces a control design which can accommodate such failures with relative ease. The technique can be applied to many other problems including sensor failure accommodation, mode switching control laws and super agility.
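
    A minimal sketch of the variable-gain idea follows: the output-feedback gain matrix is interpolated over an operating-point parameter instead of being a single fixed matrix. The design points, gain tables, and measurements are invented and have no connection to the AFTI F-16 design in the abstract.

      import numpy as np

      design_points = np.array([0.0, 0.5, 1.0])           # normalized operating parameter
      gain_tables = np.array([[[-2.0, 0.3]],              # gain matrix K at each design point
                              [[-3.1, 0.5]],
                              [[-4.4, 0.9]]])

      def gain(theta):
          """Interpolate each gain entry over the operating parameter theta."""
          flat = np.array([np.interp(theta, design_points, gain_tables[:, i, j])
                           for i in range(gain_tables.shape[1])
                           for j in range(gain_tables.shape[2])])
          return flat.reshape(gain_tables.shape[1:])

      y = np.array([0.02, -0.1])                          # measured outputs (illustrative)
      u = gain(0.7) @ y                                   # control command at theta = 0.7
      print("K(0.7) =", gain(0.7), " u =", u)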

  4. A symbolic methodology to improve disassembly process design.

    PubMed

    Rios, Pedro; Blyler, Leslie; Tieman, Lisa; Stuart, Julie Ann; Grant, Ed

    2003-12-01

    Millions of end-of-life electronic components are retired annually due to the proliferation of new models and their rapid obsolescence. The recovery of resources such as plastics from these goods requires their disassembly. The time required for each disassembly and its associated cost is defined by the operator's familiarity with the product design and its complexity. Since model proliferation serves to complicate an operator's learning curve, it is worthwhile to investigate the benefits to be gained in a disassembly operator's preplanning process. Effective disassembly process design demands the application of green engineering principles, such as those developed by Anastas and Zimmerman (Environ. Sci. Technol. 2003, 37, 94A-101A), which include regard for product complexity, structural commonality, separation energy, material value, and waste prevention. This paper introduces the concept of design symbols to help the operator more efficiently survey product complexity with respect to the location and number of fasteners to remove a structure that is common to all electronics: the housing. With a sample of 71 different computers, printers, and monitors, we demonstrate that appropriate symbols reduce the total disassembly planning time by 13.2 min. Such an improvement could well make efficient the separation of plastic that would otherwise be destined for waste-to-energy or landfill. The symbolic methodology presented may also improve Design for Recycling and Design for Maintenance and Support.

  5. A review and synthesis of late Pleistocene extinction modeling: progress delayed by mismatches between ecological realism, interpretation, and methodological transparency.

    PubMed

    Yule, Jeffrey V; Fournier, Robert J; Jensen, Christopher X J; Yang, Jinyan

    2014-06-01

    Late Pleistocene extinctions occurred globally over a period of about 50,000 years, primarily affecting mammals of ≥44 kg body mass (i.e., megafauna) first in Australia, continuing in Eurasia and, finally, in the Americas. Polarized debate about the cause(s) of the extinctions centers on the role of climate change and anthropogenic factors (especially hunting). Since the late 1960s, investigators have developed mathematical models to simulate the ecological interactions that might have contributed to the extinctions. Here, we provide an overview of the various methodologies used and conclusions reached in the modeling literature, addressing both the strengths and weaknesses of modeling as an explanatory tool. Although late Pleistocene extinction models now provide a solid foundation for viable future work, we conclude, first, that single models offer less compelling support for their respective explanatory hypotheses than many realize; second, that disparities in methodology (both in terms of model parameterization and design) prevent meaningful comparison between models and, more generally, progress from model to model in increasing our understanding of these extinctions; and third, that recent models have been presented and possibly developed without sufficient regard for the transparency of design that facilitates scientific progress.

  6. Integrated design of the CSI evolutionary structure: A verification of the design methodology

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Joshi, S. M.; Elliott, Kenny B.; Walz, J. E.

    1993-01-01

    One of the main objectives of the Controls-Structures Interaction (CSI) program is to develop and evaluate integrated controls-structures design methodology for flexible space structures. Thus far, integrated design methodologies for a class of flexible spacecraft, which require fine attitude pointing and vibration suppression with no payload articulation, have been extensively investigated. Various integrated design optimization approaches, such as single-objective and multi-objective optimization, have been implemented with an array of different objectives and constraints involving performance and cost measures such as total mass, actuator mass, steady-state pointing performance, transient performance, control power, and many more. These studies have been performed using an integrated design software tool (CSI-DESIGN CODE) which is under development by the CSI-ADM team at the NASA Langley Research Center. To date, all of these studies, irrespective of the type of integrated optimization posed or objectives and constraints used, have indicated that integrated controls-structures design results in an overall spacecraft design which is considerably superior to designs obtained through a conventional sequential approach. Consequently, it is believed that validation of some of these results through fabrication and testing of a structure which is designed through an integrated design approach is warranted. The objective of this paper is to present and discuss the efforts that have been taken thus far for the validation of the integrated design methodology.
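
    The sketch below is a hypothetical, simplified stand-in for the kind of integrated optimization described above: a weighted combination of structural mass and pointing error is minimized over combined structural and control design variables, subject to a control-power constraint. The model functions, weights, and bounds are invented for illustration and are not those of the CSI-DESIGN code.

      import numpy as np
      from scipy.optimize import minimize

      # Placeholder models: in an integrated design tool these would be finite-element
      # and control-synthesis evaluations; here they are simple algebraic stand-ins.
      def total_mass(x):          # x[0]: strut cross-section, x[1]: control bandwidth
          return 50.0 + 120.0 * x[0]

      def pointing_error(x):      # a stiffer structure and higher bandwidth both help
          return 1.0 / (0.5 + 4.0 * x[0] + 2.0 * x[1])

      def control_power(x):
          return 3.0 * x[1] ** 2

      w_mass, w_perf = 1.0, 200.0
      objective = lambda x: w_mass * total_mass(x) + w_perf * pointing_error(x)
      constraints = [{"type": "ineq", "fun": lambda x: 10.0 - control_power(x)}]
      bounds = [(0.01, 1.0), (0.1, 2.0)]

      result = minimize(objective, x0=[0.1, 0.5], bounds=bounds, constraints=constraints)
      print(result.x, objective(result.x))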

  7. Finite-element/progressive-lattice-sampling response surface methodology and application to benchmark probability quantification problems

    SciTech Connect

    Romero, V.J.; Bankston, S.D.

    1998-03-01

    Optimal response surface construction is being investigated as part of Sandia discretionary (LDRD) research into Analytic Nondeterministic Methods. The goal is to achieve an adequate representation of system behavior over the relevant parameter space of a problem with a minimum of computational and user effort. This is important in global optimization and in estimation of system probabilistic response, which are both made more viable by replacing large complex computer models with fast-running accurate and noiseless approximations. A Finite Element/Lattice Sampling (FE/LS) methodology for constructing progressively refined finite element response surfaces that reuse previous generations of samples is described here. Similar finite element implementations can be extended to N-dimensional problems and/or random fields and applied to other types of structured sampling paradigms, such as classical experimental design and Gauss, Lobatto, and Patterson sampling. Here the FE/LS model is applied in a "decoupled" Monte Carlo analysis of two sets of probability quantification test problems. The analytic test problems, spanning a large range of probabilities and very demanding failure region geometries, constitute a good testbed for comparing the performance of various nondeterministic analysis methods. In results here, FE/LS decoupled Monte Carlo analysis required orders of magnitude less computer time than direct Monte Carlo analysis, with no appreciable loss of accuracy. Thus, when arriving at probabilities or distributions by Monte Carlo, it appears to be more efficient to expend computer-model function evaluations on building a FE/LS response surface than to expend them in direct Monte Carlo sampling.
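
    A minimal sketch of the "decoupled" idea under simplified assumptions: expensive model evaluations are spent on a structured set of samples, a cheap response surface is fit to them (here an ordinary quadratic least-squares fit rather than a finite-element interpolant), and the Monte Carlo failure-probability estimate is then run entirely on the surrogate. The toy limit-state function and sample counts are placeholders.

      import numpy as np

      rng = np.random.default_rng(0)

      def expensive_model(x):
          """Stand-in for the costly computer model; returns a performance measure."""
          return 3.0 - x[0] ** 2 - 0.5 * x[1]

      # 1. Structured samples (a small grid here; FE/LS would progressively refine a lattice).
      grid = np.array([[a, b] for a in np.linspace(-2, 2, 5)
                              for b in np.linspace(-2, 2, 5)])
      responses = np.array([expensive_model(x) for x in grid])

      # 2. Fit a cheap surrogate (quadratic response surface via least squares).
      def features(X):
          x1, x2 = X[:, 0], X[:, 1]
          return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

      coef, *_ = np.linalg.lstsq(features(grid), responses, rcond=None)
      surrogate = lambda X: features(X) @ coef

      # 3. Decoupled Monte Carlo on the surrogate: P(failure) = P(g(X) < 0).
      X_mc = rng.normal(size=(1_000_000, 2))
      p_fail = np.mean(surrogate(X_mc) < 0.0)
      print(f"estimated failure probability: {p_fail:.4e}")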

  8. Design Validation Methodology Development for an Aircraft Sensor Deployment System

    NASA Astrophysics Data System (ADS)

    Wowczuk, Zenovy S.

    The OCULUS 1.0 Sensor Deployment concept design, developed in 2004 at West Virginia University (WVU), outlined the general concept of a deployment system to be used on a C-130 aircraft. As a sequel, a new system, OCULUS 1.1, has been developed and designed. The new system transfers the concept system design to a safety-of-flight design and is enhanced into a pre-production system to be used as the test bed for gaining full military certification approval. The OCULUS 1.1 system implements a standard deployment system and procedure to go along with a design suited for military certification and implementation. This design process included analysis of the system's critical components and the generation of a critical-component holistic model to be used as an analysis tool for future payload modifications made to the system. Following the completion of the OCULUS 1.1 design, preparations and procedures for obtaining military airworthiness certification are described. The airworthiness process includes working with the agency overseeing all modifications to the normal operating procedures of military C-130 aircraft and preparing the system for an experimental flight test. The critical steps in this process include developing a complete documentation package that details the analysis performed on the OCULUS 1.1 system and the design-of-experiment flight test plan used to analyze the system. Following approval of the documentation and the design of experiment, an experimental flight test of the OCULUS 1.1 system was performed to verify the safety and airworthiness of the system. This test successfully proved that the OCULUS 1.1 system design is airworthy and approved for military use. The OCULUS 1.1 deployment system offers an open-architecture design that is ideal for use as a sensor testing platform for developmental airborne sensors. The system's patented deployment methodology presents a simplistic approach to reaching the system's final operating position which

  9. Combustor design and analysis using the Rocket Combustor Interactive Design (ROCCID) methodology

    NASA Technical Reports Server (NTRS)

    Klem, Mark D.; Pieper, Jerry L.; Walker, Richard E.

    1990-01-01

    The ROCket Combustor Interactive Design (ROCCID) Methodology is a newly developed, interactive computer code for the design and analysis of a liquid propellant rocket combustion chamber. The application of ROCCID to design a liquid rocket combustion chamber is illustrated. Designs for a 50,000 lbf thrust and 1250 psi chamber pressure combustor using liquid oxygen (LOX)/RP-1 propellants are developed and evaluated. Tradeoffs between key design parameters affecting combustor performance and stability are examined. Predicted performance and combustion stability margin for these designs are provided as a function of the combustor operating mixture ratio and chamber pressure.

  10. Combustor design and analysis using the ROCket Combustor Interactive Design (ROCCID) Methodology

    NASA Technical Reports Server (NTRS)

    Klem, Mark D.; Pieper, Jerry L.; Walker, Richard E.

    1990-01-01

    The ROCket Combustor Interactive Design (ROCCID) Methodology is a newly developed, interactive computer code for the design and analysis of a liquid propellant rocket combustion chamber. The application of ROCCID to design a liquid rocket combustion chamber is illustrated. Designs for a 50,000 lbf thrust and 1250 psi chamber pressure combustor using liquid oxygen (LOX)/RP-1 propellants are developed and evaluated. Tradeoffs between key design parameters affecting combustor performance and stability are examined. Predicted performance and combustion stability margin for these designs are provided as a function of the combustor operating mixture ratio and chamber pressure.

  11. Towards a Methodology for the Design of Multimedia Public Access Interfaces.

    ERIC Educational Resources Information Center

    Rowley, Jennifer

    1998-01-01

    Discussion of information systems methodologies that can contribute to interface design for public access systems covers: the systems life cycle; advantages of adopting information systems methodologies; soft systems methodologies; task-oriented approaches to user interface design; holistic design, the Star model, and prototyping; the…

  12. Towards a Methodology for the Design of Multimedia Public Access Interfaces.

    ERIC Educational Resources Information Center

    Rowley, Jennifer

    1998-01-01

    Discussion of information systems methodologies that can contribute to interface design for public access systems covers: the systems life cycle; advantages of adopting information systems methodologies; soft systems methodologies; task-oriented approaches to user interface design; holistic design, the Star model, and prototyping; the…

  13. A new hot gas cleanup filter design methodology

    SciTech Connect

    VanOsdol, J.G.; Dennis, R.A.; Shaffer, F.D.

    1996-12-31

    The fluid dynamics of Hot Gas Cleanup (HGCU) systems having complex geometrical configurations are typically analyzed using computational fluid dynamics (CFD) codes or bench-scale laboratory test facilities called cold-flow models (CFM). At the present time, both CFD and CFM can be effectively used for simple flows limited to one or two characteristic length scales with well-defined boundary conditions. This is not the situation with HGCU devices. These devices have very complex geometries and low-Reynolds-number, multi-phase flows that operate on multiple length scales. For this reason, neither CFD nor CFM analysis can yet be considered a practical engineering tool for modeling the entire flow field inside HGCU systems. The thrust of this work is to provide an aerodynamic analysis methodology that can be easily applied to the complex geometries characteristic of HGCU filter vessels but does not require the tedious numerical solution of the entire set of transport equations. The analysis methodology performs the following tasks: predicts problem areas where ash deposition will most likely occur; predicts residence times for particles at various locations inside the filter vessel; lends itself quickly to major design changes; provides a sound technical basis for more appropriate use of CFD and CFM analysis; and provides CFD and CFM analysis in a more focused way where it is needed.

  14. Progress in the spectacle correction of presbyopia. Part 1: Design and development of progressive lenses.

    PubMed

    Meister, Darryl J; Fisher, Scott W

    2008-05-01

    Most of the commercial advances in the spectacle correction of presbyopia continue to occur in progressive lens design, which has been the focus of intense research and development over the past 60 years by major spectacle lens manufacturers. While progressive lens design and manufacturing techniques have advanced at a steady pace, recent progress in 'free-form' lens surfacing has opened up many exciting possibilities that will in all likelihood bring about a paradigm shift in the current model of progressive lens fabrication and distribution. The first installment of this two-part series will review the fundamental optical principles and early development work associated with progressive lenses.

  15. Sonic Boom Mitigation Through Aircraft Design and Adjoint Methodology

    NASA Technical Reports Server (NTRS)

    Rallabhandi, Siriam K.; Diskin, Boris; Nielsen, Eric J.

    2012-01-01

    This paper presents a novel approach to the design of a supersonic aircraft outer mold line (OML) by optimizing the A-weighted loudness of the sonic boom signature predicted on the ground. The optimization process uses sensitivity information obtained by coupling the discrete adjoint formulations for the augmented Burgers equation and the Computational Fluid Dynamics (CFD) equations. This coupled formulation links the loudness of the ground boom signature to the aircraft geometry, thus allowing efficient shape optimization aimed at minimizing loudness. The accuracy of the adjoint-based sensitivities is verified against sensitivities obtained using an independent complex-variable approach. The adjoint-based optimization methodology is applied to a configuration previously optimized using alternative state-of-the-art optimization methods and produces additional loudness reduction. The results of the optimizations are reported and discussed.
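
    The independent complex-variable check mentioned above is commonly carried out with the complex-step derivative approximation, df/dx ≈ Im[f(x + ih)]/h, which is free of subtractive cancellation and therefore allows an extremely small step. The sketch below applies it to a toy functional, not the loudness functional of the paper.

      import numpy as np

      def objective(x):
          """Toy stand-in for a design functional; it accepts complex arguments."""
          return np.sin(x) * np.exp(-0.3 * x ** 2)

      def complex_step_derivative(f, x, h=1e-30):
          """Complex-step approximation: df/dx = Im[f(x + i*h)] / h."""
          return np.imag(f(x + 1j * h)) / h

      x0 = 1.3
      analytic = np.cos(x0) * np.exp(-0.3 * x0 ** 2) - 0.6 * x0 * objective(x0)
      print(complex_step_derivative(objective, x0), analytic)   # the two values agree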

  16. Development of design and analysis methodology for composite bolted joints

    NASA Astrophysics Data System (ADS)

    Grant, Peter; Sawicki, Adam

    1991-05-01

    This paper summarizes work performed to develop composite joint design methodology for use on rotorcraft primary structure, determine joint characteristics which affect joint bearing and bypass strength, and develop analytical methods for predicting the effects of such characteristics in structural joints. Experimental results have shown that bearing-bypass interaction allowables cannot be defined using a single continuous function due to variance of failure modes for different bearing-bypass ratios. Hole wear effects can be significant at moderate stress levels and should be considered in the development of bearing allowables. A computer program has been developed and has successfully predicted bearing-bypass interaction effects for the (0/±45/90) family of laminates using filled hole and unnotched test data.

  17. SSME Investment in Turbomachinery Inducer Impeller Design Tools and Methodology

    NASA Technical Reports Server (NTRS)

    Zoladz, Thomas; Mitchell, William; Lunde, Kevin

    2010-01-01

    Within the rocket engine industry, SSME turbomachines are the de facto standards of success with regard to meeting aggressive performance requirements under challenging operational environments. Over the Shuttle era, SSME has invested heavily in our national inducer-impeller design infrastructure. While both low- and high-pressure turbopump failure and anomaly resolution efforts spurred some of these investments, the SSME program was a major beneficiary of key areas of turbomachinery inducer-impeller research outside of flight manifest pressures. Over the past several decades, key turbopump internal environments have been interrogated via highly instrumented hot-fire and cold-flow testing. Likewise, SSME has sponsored the advancement of time-accurate and cavitating inducer-impeller computational fluid dynamics (CFD) tools. Together, these investments have led to a better understanding of the complex internal flow fields within aggressive, high-performing inducers and impellers. New design tools and methodologies have evolved which are intended to provide confident blade designs that strike an appropriate balance between performance and self-induced load management.

  18. SSME Investment in Turbomachinery Inducer Impeller Design Tools and Methodology

    NASA Technical Reports Server (NTRS)

    Zoladz, Thomas; Mitchell, William; Lunde, Kevin

    2010-01-01

    Within the rocket engine industry, SSME turbomachines are the de facto standards of success with regard to meeting aggressive performance requirements under challenging operational environments. Over the Shuttle era, SSME has invested heavily in our national inducer-impeller design infrastructure. While both low- and high-pressure turbopump failure and anomaly resolution efforts spurred some of these investments, the SSME program was a major beneficiary of key areas of turbomachinery inducer-impeller research outside of flight manifest pressures. Over the past several decades, key turbopump internal environments have been interrogated via highly instrumented hot-fire and cold-flow testing. Likewise, SSME has sponsored the advancement of time-accurate and cavitating inducer-impeller computational fluid dynamics (CFD) tools. Together, these investments have led to a better understanding of the complex internal flow fields within aggressive, high-performing inducers and impellers. New design tools and methodologies have evolved which are intended to provide confident blade designs that strike an appropriate balance between performance and self-induced load management.

  19. An NAFP Project: Use of Object Oriented Methodologies and Design Patterns to Refactor Software Design

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali; Baggs, Rhoda

    2007-01-01

    In the early problem-solution era of software programming, functional decompositions were mainly used to design and implement software solutions. In functional decompositions, functions and data are introduced as two separate entities during the design phase and are followed as such in the implementation phase. Functional decompositions make use of refactoring through optimizing the algorithms, grouping similar functionalities into common reusable functions, and using abstract representations of data where possible; all of this is done during the implementation phase. This paper advocates the usage of object-oriented methodologies and design patterns as the centerpieces of refactoring software solutions. Refactoring software is a method of changing software design while explicitly preserving its external functionality. The combined usage of object-oriented methodologies and design patterns to refactor should also benefit the overall software life-cycle cost while yielding improved software.
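
    A minimal, hypothetical before/after example (not taken from the paper) of the kind of refactoring advocated here: a flag-driven functional decomposition is restructured with the Strategy pattern while its external behaviour is preserved.

      from abc import ABC, abstractmethod

      # Before: functional decomposition -- behaviour selected by a flag,
      # with the data passed around separately from the functions.
      def compute_area_functional(shape_kind, dims):
          if shape_kind == "circle":
              return 3.141592653589793 * dims[0] ** 2
          elif shape_kind == "rectangle":
              return dims[0] * dims[1]
          raise ValueError(shape_kind)

      # After: the same external behaviour, refactored with the Strategy pattern;
      # each shape encapsulates its data together with its area algorithm.
      class Shape(ABC):
          @abstractmethod
          def area(self) -> float: ...

      class Circle(Shape):
          def __init__(self, radius: float):
              self.radius = radius
          def area(self) -> float:
              return 3.141592653589793 * self.radius ** 2

      class Rectangle(Shape):
          def __init__(self, width: float, height: float):
              self.width, self.height = width, height
          def area(self) -> float:
              return self.width * self.height

      # External functionality is preserved: same inputs, same results.
      assert compute_area_functional("circle", [2.0]) == Circle(2.0).area()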

  20. Systematic and progressive implementation of the centers of excellence for rheumatoid arthritis: a methodological proposal.

    PubMed

    Santos-Moreno, Pedro; Caballero-Uribe, Carlo V; Massardo, Maria Loreto; Maldonado, Claudio Galarza; Soriano, Enrique R; Pineda, Carlos; Cardiel, Mario; Benavides, Juan Alberto; Beltrán, Paula Andrea

    2017-08-24

    The implementation of centers of excellence for specific diseases has been gaining recognition in the field of health. Specifically in rheumatoid arthritis, where the prognosis of the disease depends on early diagnosis and timely intervention, it is necessary that health services be provided in an environment of quality, opportunity, and safety with the highest standards of care. A methodology that allows this implementation in a way that is achievable by most care centers is a priority for achieving better care of populations with this disease. In this paper, we propose a systematic and progressive methodology that will help institutions develop successful models without faltering in the process. The expected impact on public health is defined by better effective coverage of high-quality treatments, obtaining better health outcomes with safety and accessibility that reduce the budgetary impact on the health systems of our countries.

  1. Octopus: A Design Methodology for Motion Capture Wearables.

    PubMed

    Marin, Javier; Blanco, Teresa; Marin, Jose J

    2017-08-15

    Human motion capture (MoCap) is widely recognised for its usefulness and application in different fields, such as health, sports, and leisure; therefore, its inclusion in current wearables (MoCap-wearables) is increasing, and it may be very useful in a context of intelligent objects interconnected with each other and to the cloud in the Internet of Things (IoT). However, capturing human movement adequately requires addressing difficult-to-satisfy requirements, which means that the applications that are possible with this technology are held back by a series of accessibility barriers, some technological and some regarding usability. To overcome these barriers and generate products with greater wearability that are more efficient and accessible, factors are compiled through a review of publications and market research. The result of this analysis is a design methodology called Octopus, which ranks these factors and schematises them. Octopus provides a tool that can help define design requirements for multidisciplinary teams, generating a common framework and offering a new method of communication between them.

  2. Octopus: A Design Methodology for Motion Capture Wearables

    PubMed Central

    2017-01-01

    Human motion capture (MoCap) is widely recognised for its usefulness and application in different fields, such as health, sports, and leisure; therefore, its inclusion in current wearables (MoCap-wearables) is increasing, and it may be very useful in a context of intelligent objects interconnected with each other and to the cloud in the Internet of Things (IoT). However, capturing human movement adequately requires addressing difficult-to-satisfy requirements, which means that the applications that are possible with this technology are held back by a series of accessibility barriers, some technological and some regarding usability. To overcome these barriers and generate products with greater wearability that are more efficient and accessible, factors are compiled through a review of publications and market research. The result of this analysis is a design methodology called Octopus, which ranks these factors and schematises them. Octopus provides a tool that can help define design requirements for multidisciplinary teams, generating a common framework and offering a new method of communication between them. PMID:28809786

  3. Database Design Methodology and Database Management System for Computer-Aided Structural Design Optimization.

    DTIC Science & Technology

    1984-12-01

    Record excerpt (fragmentary): Several researchers, including Lillehagen and Dokkar (1982), Grabowski, Eigener and Ranch (1978), and Eberlein and Wedekind (1982), have worked on database… Cited: Proceedings of the International Federation of Information Processing, pp. 335-366; Eberlein, W. and Wedekind, H., 1982, "A Methodology for Embedding Design

  4. We!Design: A Student-Centred Participatory Methodology for the Design of Educational Applications

    ERIC Educational Resources Information Center

    Triantafyllakos, George N.; Palaigeorgiou, George E.; Tsoukalas, Ioannis A.

    2008-01-01

    The development of educational applications has always been a challenging and complex issue, mainly because of the complications imposed by the cognitive and psychological aspects of student-computer interactions. This article presents a methodology, named We!Design, that tries to address the complexity of educational applications development…

  5. Supporting Space Systems Design via Systems Dependency Analysis Methodology

    NASA Astrophysics Data System (ADS)

    Guariniello, Cesare

    assess the behavior of each system based on its internal status and on the topology of its dependencies on systems connected to it. Designers and decision makers can therefore quickly analyze and explore the behavior of complex systems and evaluate different architectures under various working conditions. The methods support educated decision making both in the design and in the update process of systems architecture, reducing the need to execute extensive simulations. In particular, in the phase of concept generation and selection, the information given by the methods can be used to identify promising architectures to be further tested and improved, while discarding architectures that do not show the required level of global features. The methods, when used in conjunction with appropriate metrics, also allow for improved reliability and risk analysis, as well as for automatic scheduling and re-scheduling based on the features of the dependencies and on the accepted level of risk. This dissertation illustrates the use of the two methods in sample aerospace applications, both in the operational and in the developmental domain. The applications show how to use the developed methodology to evaluate the impact of failures, assess the criticality of systems, quantify metrics of interest, quantify the impact of delays, support informed decision making when scheduling the development of systems and evaluate the achievement of partial capabilities. A larger, well-framed case study illustrates how the Systems Operational Dependency Analysis method and the Systems Developmental Dependency Analysis method can support analysis and decision making, at the mid and high level, in the design process of architectures for the exploration of Mars. The case study also shows how the methods do not replace the classical systems engineering methodologies, but support and improve them.

  6. A Progressive Damage Methodology for Residual Strength Predictions of Center-Crack Tension Composite Panels

    NASA Technical Reports Server (NTRS)

    Coats, Timothy William

    1996-01-01

    An investigation of translaminate fracture and a progressive damage methodology was conducted to evaluate and develop a residual strength prediction capability for laminated composites with through-penetration notches. This is relevant to the damage tolerance of an aircraft fuselage that might suffer an in-flight accident such as an uncontained engine failure. An experimental characterization of several composite material systems revealed an R-curve type of behavior. Fractographic examinations led to the postulate that this crack growth resistance could be due to fiber bridging, defined here as fractured fibers of one ply bridged by intact fibers of an adjacent ply. The progressive damage methodology is currently capable of predicting the initiation and growth of matrix cracks and fiber fracture. Using two different fiber failure criteria, residual strength was predicted for different panel widths and notch lengths. A ply-discount fiber failure criterion yielded extremely conservative results, while an elastic-perfectly plastic fiber failure criterion showed that the fiber bridging concept is valid for predicting residual strength for tensile-dominated failure loads. Furthermore, the R-curves predicted by the model using the elastic-perfectly plastic fiber criterion compared very well with the experimental R-curves.
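
    The two fiber-failure idealizations named above can be contrasted with a toy fiber-direction stress-strain sketch: under a ply-discount criterion the fiber stress drops to zero once the failure strain is reached, whereas under an elastic-perfectly plastic criterion it is capped at the fiber strength but continues to carry load. The modulus and strength values below are assumptions for illustration, not data from the report.

      import numpy as np

      E_fiber = 130e9        # Pa, assumed fiber-direction modulus
      X_t = 2.0e9            # Pa, assumed fiber-direction strength
      eps_fail = X_t / E_fiber

      def stress_ply_discount(eps):
          """Fiber stress drops to zero once the failure strain is exceeded."""
          return np.where(eps < eps_fail, E_fiber * eps, 0.0)

      def stress_elastic_perfectly_plastic(eps):
          """Fiber stress is capped at its strength beyond the failure strain."""
          return np.minimum(E_fiber * eps, X_t)

      strain = np.linspace(0.0, 2.5 * eps_fail, 6)
      print(stress_ply_discount(strain))
      print(stress_elastic_perfectly_plastic(strain))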

  7. Design and fabrication of the progressive addition lenses

    NASA Astrophysics Data System (ADS)

    Qin, Linling; Qian, Lin; Yu, Jingchi

    2011-11-01

    The use of progressive addition lenses (PALs) for the correction of presbyopia has increased dramatically in recent years. These lenses are now being used as the preferred alternative to bifocal and trifocal lenses in many parts of the world. Progressive addition lenses are a kind of ophthalmic lens with a freeform surface. The surface curvature of a PAL varies gradually from a minimum value in the upper area to a maximum value in the lower area. Thus a PAL surface has three zones with very small astigmatism: the far-view zone, the near-view zone, and the intermediate zone. The far-view and near-view zones have relatively constant powers and are connected by the intermediate zone, in which the power varies progressively. The design and fabrication technologies of progressive addition lenses have advanced rapidly because of the extensive development of optical simulation software, multi-axis ultraprecision machining technologies and CNC machining technologies. The design principles of progressive addition lenses are discussed in a historic review. Several kinds of design methods are illustrated, and their advantages and disadvantages are also presented. In the current study, it is shown that the optical characteristics of the different progressive addition lens designs are significantly different from one another. The different fabrication technologies for progressive addition lenses are also discussed in the paper. Plastic injection molding and precision machine turning are the common fabrication technologies for exterior PALs and interior PALs, respectively.

  8. Arab Teens Lifestyle Study (ATLS): objectives, design, methodology and implications

    PubMed Central

    Al-Hazzaa, Hazzaa M; Musaiger, Abdulrahman O

    2011-01-01

    Background There is a lack of comparable data on physical activity, sedentary behavior, and dietary habits among Arab adolescents, which limits our understanding and interpretation of the relationship between obesity and lifestyle parameters. Therefore, we initiated the Arab Teens Lifestyle Study (ATLS). The ATLS is a multicenter collaborative project for assessing lifestyle habits of Arab adolescents. The objectives of the ATLS project were to investigate the prevalence rates for overweight and obesity, physical activity, sedentary activity and dietary habits among Arab adolescents, and to examine the interrelationships between these lifestyle variables. This paper reports on the objectives, design, methodology, and implications of the ATLS. Design/Methods The ATLS is a school-based cross-sectional study involving 9182 randomly selected secondary-school students (14–19 years) from major Arab cities, using a multistage stratified sampling technique. The participating Arab cities included Riyadh, Jeddah, and Al-Khobar (Saudi Arabia), Bahrain, Dubai (United Arab Emirates), Kuwait, Amman (Jordan), Mosel (Iraq), Muscat (Oman), Tunisia (Tunisia) and Kenitra (Morocco). Measured variables included anthropometric measurements, physical activity, sedentary behavior, sleep duration, and dietary habits. Discussion The ATLS project will provide a unique opportunity to collect and analyze important lifestyle information from Arab adolescents using standardized procedures. This is the first time a collaborative Arab project will simultaneously assess broad lifestyle variables in a large sample of adolescents from numerous urbanized Arab regions. This joint research project will supply us with comprehensive and recent data on physical activity/inactivity and eating habits of Arab adolescents relative to obesity. Such invaluable lifestyle-related data are crucial for developing public health policies and regional strategies for health promotion and disease prevention. PMID

  9. The Design of the National Assessment of Educational Progress.

    ERIC Educational Resources Information Center

    Johnson, Eugene G.

    1992-01-01

    Features of the design of the National Assessment of Educational Progress (NAEP) are discussed, with emphasis on the design of the 1992 assessment. Student sample designs for the NAEP and the Trial State Assessment are described, and the focused-balanced incomplete block spiraling method of item sampling is discussed. (SLD)

  10. Multidisciplinary design and optimization (MDO) methodology for the aircraft conceptual design

    NASA Astrophysics Data System (ADS)

    Iqbal, Liaquat Ullah

    An integrated design and optimization methodology has been developed for the conceptual design of an aircraft. The methodology brings higher fidelity Computer Aided Design, Engineering and Manufacturing (CAD, CAE and CAM) tools such as CATIA, FLUENT, ANSYS and SURFCAM into the conceptual design by utilizing Excel as the integrator and controller. The approach is demonstrated to integrate with many of the existing low to medium fidelity codes such as the aerodynamic panel code called CMARC and sizing and constraint analysis codes, thus providing multi-fidelity capabilities to the aircraft designer. The higher fidelity design information from the CAD and CAE tools for the geometry, aerodynamics, structural and environmental performance is provided for the application of structured design methods such as Quality Function Deployment (QFD) and Pugh's Method. The higher fidelity tools bring the quantitative aspects of a design such as precise measurements of weight, volume, surface areas, center of gravity (CG) location, lift over drag ratio, and structural weight, as well as the qualitative aspects such as external geometry definition, internal layout, and coloring scheme early in the design process. The performance and safety risks involved with new technologies can be reduced by modeling and assessing their impact on the performance of the aircraft more accurately. The methodology also enables the design and evaluation of novel concepts such as the blended wing body (BWB) and the hybrid wing body (HWB) concepts. Higher fidelity computational fluid dynamics (CFD) and finite element analysis (FEA) allow verification of the claims for the performance gains in aerodynamics and ascertain risks of structural failure due to the different pressure distribution in the fuselage as compared with the tube-and-wing design. The higher fidelity aerodynamics and structural models can lead to better cost estimates that help reduce the financial risks as well. This helps in

  11. A combined stochastic feedforward and feedback control design methodology with application to autoland design

    NASA Technical Reports Server (NTRS)

    Halyo, Nesim

    1987-01-01

    A combined stochastic feedforward and feedback control design methodology was developed. The objective of the feedforward control law is to track the commanded trajectory, whereas the feedback control law tries to maintain the plant state near the desired trajectory in the presence of disturbances and uncertainties about the plant. The feedforward control law design is formulated as a stochastic optimization problem and is embedded into the stochastic output feedback problem where the plant contains unstable and uncontrollable modes. An algorithm to compute the optimal feedforward is developed. In this approach, the use of error-integral feedback, dynamic compensation, and control-rate command structures is an integral part of the methodology. An incremental implementation is recommended. Results on the eigenvalues of the implemented versus designed control laws are presented. The stochastic feedforward/feedback control methodology is used to design a digital automatic landing system for the ATOPS Research Vehicle, a Boeing 737-100 aircraft. The system control modes include localizer and glideslope capture and track, and flare to touchdown. Results of a detailed nonlinear simulation of the digital control laws, actuator systems, and aircraft aerodynamics are presented.
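
    A minimal sketch of the combined structure described above, with hypothetical gains: a feedforward command tracks the commanded trajectory while proportional output feedback plus an error integral holds the plant near it, updated incrementally each time step. The gain values are placeholders, not those produced by the stochastic design procedure.

      import numpy as np

      # Assumed toy gains; the methodology above computes these from the stochastic
      # feedforward and output-feedback optimization problems.
      K_fb = np.array([[0.8, 0.2]])     # proportional output feedback
      K_i = np.array([[0.15]])          # error-integral feedback

      def control_step(u_ff, y, y_cmd, e_int, dt):
          """One incremental update of u = u_ff + K_fb*(y_cmd - y) + K_i * integral(e)."""
          e = y_cmd - y
          e_int = e_int + e[:1] * dt          # integrate the tracked output error
          u = u_ff + K_fb @ e + K_i @ e_int
          return u, e_int

      e_int = np.zeros(1)
      u, e_int = control_step(u_ff=np.array([0.1]),
                              y=np.array([0.02, 0.0]),
                              y_cmd=np.array([0.05, 0.0]),
                              e_int=e_int, dt=0.02)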

  12. A Synergy between the Technological Process and a Methodology for Web Design: Implications for Technological Problem Solving and Design

    ERIC Educational Resources Information Center

    Jakovljevic, Maria; Ankiewicz, Piet; De swardt, Estelle; Gross, Elna

    2004-01-01

    Traditional instructional methodology in the Information System Design (ISD) environment lacks explicit strategies for promoting the cognitive skills of prospective system designers. This contributes to the fragmented knowledge and low motivational and creative involvement of learners in system design tasks. In addition, present ISD methodologies,…

  13. A Synergy between the Technological Process and a Methodology for Web Design: Implications for Technological Problem Solving and Design

    ERIC Educational Resources Information Center

    Jakovljevic, Maria; Ankiewicz, Piet; De swardt, Estelle; Gross, Elna

    2004-01-01

    Traditional instructional methodology in the Information System Design (ISD) environment lacks explicit strategies for promoting the cognitive skills of prospective system designers. This contributes to the fragmented knowledge and low motivational and creative involvement of learners in system design tasks. In addition, present ISD methodologies,…

  14. Community-wide assessment of protein-interface modeling suggests improvements to design methodology

    PubMed Central

    Fleishman, Sarel J; Whitehead, Timothy A; Strauch, Eva-Maria; Corn, Jacob E; Qin, Sanbo; Zhou, Huan-Xiang; Mitchell, Julie C.; Demerdash, Omar N.A; Takeda-Shitaka, Mayuko; Terashi, Genki; Moal, Iain H.; Li, Xiaofan; Bates, Paul A.; Zacharias, Martin; Park, Hahnbeom; Ko, Jun-su; Lee, Hasup; Seok, Chaok; Bourquard, Thomas; Bernauer, Julie; Poupon, Anne; Azé, Jérôme; Soner, Seren; Ovali, Şefik Kerem; Ozbek, Pemra; Ben Tal, Nir; Haliloglu, Türkan; Hwang, Howook; Vreven, Thom; Pierce, Brian G.; Weng, Zhiping; Pérez-Cano, Laura; Pons, Carles; Fernández-Recio, Juan; Jiang, Fan; Yang, Feng; Gong, Xinqi; Cao, Libin; Xu, Xianjin; Liu, Bin; Wang, Panwen; Li, Chunhua; Wang, Cunxin; Robert, Charles H.; Guharoy, Mainak; Liu, Shiyong; Huang, Yangyu; Li, Lin; Guo, Dachuan; Chen, Ying; Xiao, Yi; London, Nir; Itzhaki, Zohar; Schueler-Furman, Ora; Inbar, Yuval; Patapov, Vladimir; Cohen, Mati; Schreiber, Gideon; Tsuchiya, Yuko; Kanamori, Eiji; Standley, Daron M.; Nakamura, Haruki; Kinoshita, Kengo; Driggers, Camden M.; Hall, Robert G.; Morgan, Jessica L.; Hsu, Victor L.; Zhan, Jian; Yang, Yuedong; Zhou, Yaoqi; Kastritis, Panagiotis L.; Bonvin, Alexandre M.J.J.; Zhang, Weiyi; Camacho, Carlos J.; Kilambi, Krishna P.; Sircar, Aroop; Gray, Jeffrey J.; Ohue, Masahito; Uchikoga, Nobuyuki; Matsuzaki, Yuri; Ishida, Takashi; Akiyama, Yutaka; Khashan, Raed; Bush, Stephen; Fouches, Denis; Tropsha, Alexander; Esquivel-Rodríguez, Juan; Kihara, Daisuke; Stranges, P Benjamin; Jacak, Ron; Kuhlman, Brian; Huang, Sheng-You; Zou, Xiaoqin; Wodak, Shoshana J; Janin, Joel; Baker, David

    2013-01-01

    The CAPRI and CASP prediction experiments have demonstrated the power of community wide tests of methodology in assessing the current state of the art and spurring progress in the very challenging areas of protein docking and structure prediction. We sought to bring the power of community wide experiments to bear on a very challenging protein design problem that provides a complementary but equally fundamental test of current understanding of protein-binding thermodynamics. We have generated a number of designed protein-protein interfaces with very favorable computed binding energies but which do not appear to be formed in experiments, suggesting there may be important physical chemistry missing in the energy calculations. 28 research groups took up the challenge of determining what is missing: we provided structures of 87 designed complexes and 120 naturally occurring complexes and asked participants to identify energetic contributions and/or structural features that distinguish between the two sets. The community found that electrostatics and solvation terms partially distinguish the designs from the natural complexes, largely due to the non-polar character of the designed interactions. Beyond this polarity difference, the community found that the designed binding surfaces were on average structurally less embedded in the designed monomers, suggesting that backbone conformational rigidity at the designed surface is important for realization of the designed function. These results can be used to improve computational design strategies, but there is still much to be learned; for example, one designed complex, which does form in experiments, was classified by all metrics as a non-binder. PMID:22001016

  15. Community-wide assessment of protein-interface modeling suggests improvements to design methodology.

    PubMed

    Fleishman, Sarel J; Whitehead, Timothy A; Strauch, Eva-Maria; Corn, Jacob E; Qin, Sanbo; Zhou, Huan-Xiang; Mitchell, Julie C; Demerdash, Omar N A; Takeda-Shitaka, Mayuko; Terashi, Genki; Moal, Iain H; Li, Xiaofan; Bates, Paul A; Zacharias, Martin; Park, Hahnbeom; Ko, Jun-su; Lee, Hasup; Seok, Chaok; Bourquard, Thomas; Bernauer, Julie; Poupon, Anne; Azé, Jérôme; Soner, Seren; Ovali, Sefik Kerem; Ozbek, Pemra; Tal, Nir Ben; Haliloglu, Türkan; Hwang, Howook; Vreven, Thom; Pierce, Brian G; Weng, Zhiping; Pérez-Cano, Laura; Pons, Carles; Fernández-Recio, Juan; Jiang, Fan; Yang, Feng; Gong, Xinqi; Cao, Libin; Xu, Xianjin; Liu, Bin; Wang, Panwen; Li, Chunhua; Wang, Cunxin; Robert, Charles H; Guharoy, Mainak; Liu, Shiyong; Huang, Yangyu; Li, Lin; Guo, Dachuan; Chen, Ying; Xiao, Yi; London, Nir; Itzhaki, Zohar; Schueler-Furman, Ora; Inbar, Yuval; Potapov, Vladimir; Cohen, Mati; Schreiber, Gideon; Tsuchiya, Yuko; Kanamori, Eiji; Standley, Daron M; Nakamura, Haruki; Kinoshita, Kengo; Driggers, Camden M; Hall, Robert G; Morgan, Jessica L; Hsu, Victor L; Zhan, Jian; Yang, Yuedong; Zhou, Yaoqi; Kastritis, Panagiotis L; Bonvin, Alexandre M J J; Zhang, Weiyi; Camacho, Carlos J; Kilambi, Krishna P; Sircar, Aroop; Gray, Jeffrey J; Ohue, Masahito; Uchikoga, Nobuyuki; Matsuzaki, Yuri; Ishida, Takashi; Akiyama, Yutaka; Khashan, Raed; Bush, Stephen; Fouches, Denis; Tropsha, Alexander; Esquivel-Rodríguez, Juan; Kihara, Daisuke; Stranges, P Benjamin; Jacak, Ron; Kuhlman, Brian; Huang, Sheng-You; Zou, Xiaoqin; Wodak, Shoshana J; Janin, Joel; Baker, David

    2011-11-25

    The CAPRI (Critical Assessment of Predicted Interactions) and CASP (Critical Assessment of protein Structure Prediction) experiments have demonstrated the power of community-wide tests of methodology in assessing the current state of the art and spurring progress in the very challenging areas of protein docking and structure prediction. We sought to bring the power of community-wide experiments to bear on a very challenging protein design problem that provides a complementary but equally fundamental test of current understanding of protein-binding thermodynamics. We have generated a number of designed protein-protein interfaces with very favorable computed binding energies but which do not appear to be formed in experiments, suggesting that there may be important physical chemistry missing in the energy calculations. A total of 28 research groups took up the challenge of determining what is missing: we provided structures of 87 designed complexes and 120 naturally occurring complexes and asked participants to identify energetic contributions and/or structural features that distinguish between the two sets. The community found that electrostatics and solvation terms partially distinguish the designs from the natural complexes, largely due to the nonpolar character of the designed interactions. Beyond this polarity difference, the community found that the designed binding surfaces were, on average, structurally less embedded in the designed monomers, suggesting that backbone conformational rigidity at the designed surface is important for realization of the designed function. These results can be used to improve computational design strategies, but there is still much to be learned; for example, one designed complex, which does form in experiments, was classified by all metrics as a nonbinder. Copyright © 2011 Elsevier Ltd. All rights reserved.

  16. Educational Design Research: Signs of Progress

    ERIC Educational Resources Information Center

    Reeves, Thomas C.

    2015-01-01

    This special issue of the "Australasian Journal of Educational Technology" includes an introductory article by the guest editors and six papers that illustrate the potential of educational design research (EDR) to address important problems in higher education. In this final paper, reflections on the papers are made. Then the rationale…

  17. Progress in aircraft design since 1903

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Significant developments in aviation history are documented to show the advancements in aircraft design which have taken place since 1903. Each aircraft is identified according to the manufacturer, powerplant, dimensions, normal weight, and typical performance. A narrative summary of the major accomplishments of the aircraft is provided. Photographs of each aircraft are included.

  18. Educational Design Research: Signs of Progress

    ERIC Educational Resources Information Center

    Reeves, Thomas C.

    2015-01-01

    This special issue of the "Australasian Journal of Educational Technology" includes an introductory article by the guest editors and six papers that illustrate the potential of educational design research (EDR) to address important problems in higher education. In this final paper, reflections on the papers are made. Then the rationale…

  19. Arab Teens Lifestyle Study (ATLS): objectives, design, methodology and implications.

    PubMed

    Al-Hazzaa, Hazzaa M; Musaiger, Abdulrahman O

    2011-01-01

    There is a lack of comparable data on physical activity, sedentary behavior, and dietary habits among Arab adolescents, which limits our understanding and interpretation of the relationship between obesity and lifestyle parameters. Therefore, we initiated the Arab Teens Lifestyle Study (ATLS). The ATLS is a multicenter collaborative project for assessing lifestyle habits of Arab adolescents. The objectives of the ATLS project were to investigate the prevalence rates for overweight and obesity, physical activity, sedentary activity and dietary habits among Arab adolescents, and to examine the interrelationships between these lifestyle variables. This paper reports on the objectives, design, methodology, and implications of the ATLS. The ATLS is a school-based cross-sectional study involving 9182 randomly selected secondary-school students (14-19 years) from major Arab cities, using a multistage stratified sampling technique. The participating Arab cities included Riyadh, Jeddah, and Al-Khobar (Saudi Arabia), Bahrain, Dubai (United Arab Emirates), Kuwait, Amman (Jordan), Mosel (Iraq), Muscat (Oman), Tunisia (Tunisia) and Kenitra (Morocco). Measured variables included anthropometric measurements, physical activity, sedentary behavior, sleep duration, and dietary habits. The ATLS project will provide a unique opportunity to collect and analyze important lifestyle information from Arab adolescents using standardized procedures. This is the first time a collaborative Arab project will simultaneously assess broad lifestyle variables in a large sample of adolescents from numerous urbanized Arab regions. This joint research project will supply us with comprehensive and recent data on physical activity/inactivity and eating habits of Arab adolescents relative to obesity. Such invaluable lifestyle-related data are crucial for developing public health policies and regional strategies for health promotion and disease prevention.

  20. An Examination of the MH-60S Common Cockpit from a Design Methodology and Acquisitions Standpoint

    DTIC Science & Technology

    2009-06-01

    Record excerpt (fragmentary table of contents and list of figures): HCI Design Methodology Based on the CCD Philosophy; Use of Design Methodology Specifically Developed for…; Figure 8, Lockheed Martin Human Computer Interface Requirements (HCIRS) contents (From: [6]); Figure 9, Lockheed Martin eight-step HCI…; …function of time (From: [9]); Figure 15, Systems engineering iterative HCI design process (From [10

  1. A rational design change methodology based on experimental and analytical modal analysis

    SciTech Connect

    Weinacht, D.J.; Bennett, J.G.

    1993-08-01

    A design methodology that integrates analytical modeling and experimental characterization is presented. This methodology represents a powerful tool for making rational design decisions and changes. An example of its implementation in the design, analysis, and testing of a precision machine tool support structure is given.

  2. Progress in material design for biomedical applications

    PubMed Central

    Tibbitt, Mark W.; Rodell, Christopher B.; Burdick, Jason A.; Anseth, Kristi S.

    2015-01-01

    Biomaterials that interface with biological systems are used to deliver drugs safely and efficiently; to prevent, detect, and treat disease; to assist the body as it heals; and to engineer functional tissues outside of the body for organ replacement. The field has evolved beyond selecting materials that were originally designed for other applications with a primary focus on properties that enabled restoration of function and mitigation of acute pathology. Biomaterials are now designed rationally with controlled structure and dynamic functionality to integrate with biological complexity and perform tailored, high-level functions in the body. The transition has been from permissive to promoting biomaterials that are no longer bioinert but bioactive. This perspective surveys recent developments in the field of polymeric and soft biomaterials with a specific emphasis on advances in nano- to macroscale control, static to dynamic functionality, and biocomplex materials. PMID:26598696

  3. Progress in material design for biomedical applications.

    PubMed

    Tibbitt, Mark W; Rodell, Christopher B; Burdick, Jason A; Anseth, Kristi S

    2015-11-24

    Biomaterials that interface with biological systems are used to deliver drugs safely and efficiently; to prevent, detect, and treat disease; to assist the body as it heals; and to engineer functional tissues outside of the body for organ replacement. The field has evolved beyond selecting materials that were originally designed for other applications with a primary focus on properties that enabled restoration of function and mitigation of acute pathology. Biomaterials are now designed rationally with controlled structure and dynamic functionality to integrate with biological complexity and perform tailored, high-level functions in the body. The transition has been from permissive to promoting biomaterials that are no longer bioinert but bioactive. This perspective surveys recent developments in the field of polymeric and soft biomaterials with a specific emphasis on advances in nano- to macroscale control, static to dynamic functionality, and biocomplex materials.

  4. Tissue microarray methodology identifies complement pathway activation and dysregulation in progressive multiple sclerosis.

    PubMed

    Loveless, Sam; Neal, James W; Howell, Owain W; Harding, Katharine E; Sarkies, Patrick; Evans, Rhian; Bevan, Ryan J; Hakobyan, Svetlana; Harris, Claire L; Robertson, Neil P; Morgan, Bryan Paul

    2017-07-14

    The complement pathway has potential contributions to both white matter (WM) and grey matter (GM) pathology in Multiple Sclerosis (MS). A quantitative assessment of complement involvement is lacking. Here we describe the use of Tissue MicroArray (TMA) methodology in conjunction with immunohistochemistry to investigate the localization of complement pathway proteins in progressive MS cortical GM and subcortical WM. Antibodies targeting the complement proteins C1q and C3b, the regulatory proteins C1 inhibitor (C1INH), complement receptor 1 (CR1), clusterin and factor H (FH), and the C5a anaphylatoxin receptor (C5aR) were utilised alongside standard markers of tissue pathology. All stained slides were digitised for quantitative analysis. We found that numbers of cells immunolabelled for HLA-DR, GFAP, C5aR, C1q and C3b were increased in WM lesions (WML) and GM lesions (GML) compared to normal appearing WM (NAWM) and GM (NAGM), respectively. The complement regulators C1INH, CR1, FH and clusterin were more abundant in WM lesions, while the number of C1q+ neurons was increased and the number of C1INH+, clusterin+, FH+ and CR1+ neurons decreased in GM lesions. The number of complement component positive cells (C1q, C3b) correlated with complement regulator expression in WM, but there was no statistical association between complement activation and regulator expression in the GM. We conclude that TMA methodology and quantitative analysis provide evidence of complement dysregulation in MS GML, including an association of the numerical density of C1q+ cells with tissue lesions. Our work confirms that complement activation and dysregulation occur in all cases of progressive MS and suggests that complement may provide potential biomarkers of the disease. © 2017 International Society of Neuropathology.

  5. Hyperbolic tangential function-based progressive addition lens design.

    PubMed

    Qiu, Gufeng; Cui, Xudong

    2015-12-10

    The diopter distribution is key to the successful design of a progressive addition lens. A hyperbolic tangential function is therefore introduced to describe the desired diopter distribution on the lens. Simulation and fabrication show that the astigmatism over the whole surface stays very close to the addition, exhibiting performance superior to that of the currently used high-order polynomials and cosine functions. Our investigations found that once the diopter distribution design is reasonable, both the direct and indirect methods of constructing a progressive addition lens give consistent results. With this function we are able to effectively control the design of critical areas: the position and sizes of the far-view and near-view zones, as well as the channel of the lens. This study provides an efficient way to customize different progressive lenses not only for presbyopia, but also for anti-fatigue and office progressive usages.
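
    A minimal illustration of how a hyperbolic tangent can describe the mean-power progression along the lens corridor: the power tends to the far-zone value toward the top of the lens and to the far-zone value plus the addition toward the bottom, with the corridor length set by a width parameter. All parameter values below are assumptions for illustration, not the authors' design.

      import numpy as np

      def corridor_power(y, p_far=0.0, addition=2.0, y_mid=-4.0, width=6.0):
          """Mean power (diopters) along the vertical corridor coordinate y (mm).
          Tends to p_far far above y_mid and to p_far + addition far below it;
          width controls the length of the progression corridor."""
          return p_far + 0.5 * addition * (1.0 - np.tanh((y - y_mid) / (0.25 * width)))

      y = np.linspace(10, -18, 8)   # mm, from the far-view zone down to the near-view zone
      print(np.round(corridor_power(y), 2))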

  6. A performance-oriented power transformer design methodology using multi-objective evolutionary optimization

    PubMed Central

    Adly, Amr A.; Abd-El-Hafiz, Salwa K.

    2014-01-01

    Transformers are regarded as crucial components in power systems. Due to market globalization, power transformer manufacturers are facing an increasingly competitive environment that mandates the adoption of design strategies yielding better performance at lower costs. In this paper, a power transformer design methodology using multi-objective evolutionary optimization is proposed. Using this methodology, which is tailored toward target-performance-oriented design, a quick rough estimate of the transformer design specifics may be inferred. Testing of the suggested approach revealed significant qualitative and quantitative agreement with measured design and performance values. Details of the proposed methodology as well as sample design results are reported in the paper. PMID:26257939

  7. A performance-oriented power transformer design methodology using multi-objective evolutionary optimization.

    PubMed

    Adly, Amr A; Abd-El-Hafiz, Salwa K

    2015-05-01

    Transformers are regarded as crucial components in power systems. Due to market globalization, power transformer manufacturers are facing an increasingly competitive environment that mandates the adoption of design strategies yielding better performance at lower costs. In this paper, a power transformer design methodology using multi-objective evolutionary optimization is proposed. Using this methodology, which is tailored toward target-performance-oriented design, a quick rough estimate of the transformer design specifics may be inferred. Testing of the suggested approach revealed significant qualitative and quantitative agreement with measured design and performance values. Details of the proposed methodology as well as sample design results are reported in the paper.
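
    The sketch below illustrates the general idea of multi-objective evolutionary optimization on a transformer-like trade-off (losses versus material cost), keeping the Pareto-nondominated designs from generation to generation. The objective formulas, design variables, and algorithm details are simplified placeholders and do not reproduce the authors' methodology.

      import numpy as np

      rng = np.random.default_rng(1)

      def objectives(x):
          """x = (core flux density [T], current density [A/mm^2]); placeholder models."""
          b, j = x
          losses = 1.2 * b ** 2 + 0.8 * j ** 2          # core + copper losses (arbitrary units)
          cost = 3.0 / b + 2.5 / j                      # material cost proxy (arbitrary units)
          return np.array([losses, cost])

      def dominates(fa, fb):
          return np.all(fa <= fb) and np.any(fa < fb)

      # Simple (mu + lambda) evolutionary loop with Pareto-based selection.
      low, high = np.array([1.0, 1.5]), np.array([1.8, 4.0])
      pop = rng.uniform(low, high, size=(40, 2))
      for _ in range(40):
          children = np.clip(pop + rng.normal(0.0, 0.05, pop.shape), low, high)
          union = np.vstack([pop, children])
          f = np.array([objectives(x) for x in union])
          nondom = [i for i in range(len(union))
                    if not any(dominates(f[j], f[i]) for j in range(len(union)) if j != i)]
          if len(nondom) < 40:
              fill = rng.choice(len(union), 40 - len(nondom), replace=False)
              nondom = nondom + list(fill)
          pop = union[nondom[:40]]

      print(np.array([objectives(x) for x in pop[:5]]).round(2))   # sampled trade-offs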

  8. New Mexico Tech Satellite Design and Progress

    NASA Astrophysics Data System (ADS)

    Landavazo, M.; Cooper, B.; Jorgensen, A. M.; Bernson, C.; Chesebrough, S.; Dang, C.; Guillette, D.; Hall, T.; Huynh, A.; Jackson, R.; Klepper, J.; MacGillivray, J.; Park, D.; Ravindran, V.; Stanton, W.; Yelton, C.; Zagrai, A. N.

    2012-12-01

    New Mexico Tech Satellite (NMTSat) is a low-budget, 3U CubeSat for correlating state-of-health information from the spacecraft with space weather in low Earth orbit (LEO). NMTSat is funded by the NASA/EPSCoR program and is built almost entirely by NMT students at the New Mexico Institute of Mining and Technology. The scientific payload of NMTSat will consist of five instruments built in-house including: a magnetometer, a Langmuir plasma probe, a dosimeter, a state-of-the-art structural health monitor and an electrical health monitor. NMTSat utilizes passive attitude control by means of a magnet and hysteresis rods and carries out attitude determination from a combination of solar panel current and magnetometer readings. NMTSat will also be built around the Space Plug-and-Play Avionics I2C interface (SPA-1) to the greatest extent practical. In this presentation we will give an overview of the NMTSat design and design-tradeoffs and provide a status report on the work of completing NMTSat.

  9. Developing a Methodology for Observing Stress-Induced Temporal Variations in Travel Time: A Progress Report

    NASA Astrophysics Data System (ADS)

    Silver, P. G.; Niu, F.; Daley, T. M.; Majer, E. L.

    2005-12-01

    The dependence of crack properties on stress means that crustal seismic velocity exhibits stress dependence. This dependence constitutes, in principle, a powerful means of studying transient changes in stress at seismogenic depth through the repeat measurement of travel time from a controlled source. While the scientific potential of this stress dependence has been known for decades, time-dependent seismic imaging has yet to become a reliable means of measuring subsurface stress changes in fault-zone environments. This is due to 1) insufficient delay-time precision necessary to detect small changes in stress, and 2) the difficulty in establishing a reliable in-situ calibration between stress and seismic velocity. These two problems are coupled because the best sources of calibration, solid-earth tides and barometric pressure, produce weak stress perturbations of order 10^2-10^3 Pa that require precision in the measurement of the fractional velocity change dlnv of order 10^-6, based on laboratory experiments. We have thus focused on developing a methodology that is capable of providing this high level of precision. For example, we have shown that precision in dlnv is maximized when there are Q/π wavelengths in the source-receiver path. This relationship provides a means of selecting an optimal geometry and/or source characteristic frequency in the planning of experiments. We have initiated a series of experiments to demonstrate the detectability of these stress-calibration signals in progressively more tectonically relevant settings. Initial tests have been completed on the smallest scale, with two boreholes 17 m deep and 3 meters apart. We have used a piezoelectric source (0.1 ms source pulse repeated every 100 ms) and a string of 24 hydrophones to record P waves with a dominant frequency of 10 kHz. Recording was conducted for 160 hours. The massive stacking of ~36,000 high-SNR traces/hr leads to delay-time precision of 6 ns (hour sampling) corresponding to dlnv
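
    Two of the quantitative relationships mentioned above can be sketched numerically: the source-receiver separation that places Q/π wavelengths along the path, and the 1/sqrt(N) improvement in delay-time precision expected from stacking N traces with uncorrelated noise. The velocity, Q, and single-trace precision values are assumptions chosen only to be roughly consistent with the borehole test described; they are not measured values.

      import numpy as np

      def optimal_path_length(Q, frequency_hz, velocity_m_s):
          """Source-receiver separation giving Q/pi wavelengths along the path,
          the condition reported above to maximize precision in dln(v)."""
          wavelength = velocity_m_s / frequency_hz
          return (Q / np.pi) * wavelength

      def stacked_delay_precision(single_trace_sigma_s, n_traces):
          """Standard 1/sqrt(N) improvement assumed for uncorrelated noise."""
          return single_trace_sigma_s / np.sqrt(n_traces)

      # Assumed values: Q = 47, 10 kHz source, 2000 m/s P velocity -> about 3 m,
      # comparable to the 3 m borehole separation quoted above.
      print(optimal_path_length(Q=47, frequency_hz=10e3, velocity_m_s=2000.0))

      # Assumed 1 microsecond single-trace precision stacked over ~36,000 traces -> ~5 ns.
      print(stacked_delay_precision(single_trace_sigma_s=1e-6, n_traces=36_000))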

  10. Design methodology of the strength properties of medical knitted meshes

    NASA Astrophysics Data System (ADS)

    Mikołajczyk, Z.; Walkowska, A.

    2016-07-01

    One of the most important utility properties of medical knitted meshes intended for hernia and urological treatment is their bidirectional strength along the courses and wales. The value of this parameter, expected by manufacturers and surgeons, is estimated at 100 N per 5 cm of sample width. Most frequently, these meshes are produced on the basis of single- or double-guide stitches. They are made of polypropylene and polyester monofilament yarns with diameters in the range from 0.6 to 1.2 mm, characterized by high medical purity. The aim of the study was to develop a design methodology for mesh strength based on the geometrical construction of the stitch and the strength of the yarn. In the ProCAD warpknit 5 software environment, the stretching process of the meshes was simulated together with an analysis of their geometry changes. Simulations were made for four selected representative stitches. The real parameters of the loop geometry of the meshes were measured both on a purpose-built measuring stand and on a tensile testing machine. A model of the mechanical stretching of warp-knitted meshes along the courses and wales was developed. The thesis was put forward that the force that breaks a loop of warp-knitted fabric is the lowest of the breaking forces of the loop link yarns or the yarns that create the straight sections of the loop. This thesis was associated with the theory of strength that uses the “weakest link” concept. Experimental verification of the model was carried out for the basic structure of the single-guide mesh. It has been shown that the real, relative strength of the mesh related to one course is equal to the strength at yarn breakage in a loop, while the strength along the wales is close to the breaking strength of a single yarn. In relation to the specific construction of the medical mesh, based on the knowledge of the density of the loop structure, the a-jour mesh geometry and the yarn strength, it is possible, with high
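
    A minimal numerical sketch of the weakest-link thesis stated above (the forces and loop counts are hypothetical, not taken from the record): the breaking force of a loop is the lowest breaking force among its link yarns and straight-section yarns, and the sample strength follows from the number of load-bearing loops across the width.

      def loop_breaking_force(link_yarn_forces, straight_yarn_forces):
          """Weakest-link thesis: the loop fails at the lowest breaking force
          among its link yarns and the yarns forming its straight sections."""
          return min(list(link_yarn_forces) + list(straight_yarn_forces))

      def mesh_strength_per_width(loops_across_width, per_loop_force):
          """Sample strength = per-loop force times load-bearing loops across the width."""
          return loops_across_width * per_loop_force

      # Hypothetical numbers: 20 loops across a 5 cm sample, each limited by a
      # 6 N link-yarn breakage -> 120 N per 5 cm, above the 100 N per 5 cm target.
      f_loop = loop_breaking_force([6.0, 7.5], [8.0])
      print(mesh_strength_per_width(20, f_loop))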

  11. A Formal Semantics for the SRI Hierarchical Program Design Methodology

    NASA Technical Reports Server (NTRS)

    Boyer, R. S.; Moore, J. S.

    1983-01-01

    A formal statement of what it means to use (a subset of) the methodology is presented. It is formally defined what it means to say that some specified module exists and what it means to say that another module is correctly implemented on top of it. No attention is paid to motivation, either of the methodology or of its formal development. Concentration is entirely upon mathematical succinctness and precision. A discussion is presented of how to use certain INTERLISP programs which implement the formal definitions. Among these is a program which generates Floyd-like verification conditions sufficient to imply the correctness of a module implementation.

  12. A Formal Semantics for the SRI Hierarchical Program Design Methodology

    NASA Technical Reports Server (NTRS)

    Boyer, R. S.; Moore, J. S.

    1983-01-01

    A formal statement of what it means to use (a subset of) the methodology is presented. It is formally defined what it means to say that some specified module exists and what it means to say that another module is correctly implemented on top of it. No attention is paid to motivation, either of the methodology or of its formal development. Concentration is entirely upon mathematical succinctness and precision. A discussion is presented of how to use certain INTERLISP programs which implement the formal definitions. Among these is a program which generates Floyd-like verification conditions sufficient to imply the correctness of a module implementation.

  13. Integrated Controls-Structures Design Methodology for Flexible Spacecraft

    NASA Technical Reports Server (NTRS)

    Maghami, P. G.; Joshi, S. M.; Price, D. B.

    1995-01-01

    This paper proposes an approach for the design of flexible spacecraft, wherein the structural design and the control system design are performed simultaneously. The integrated design problem is posed as an optimization problem in which both the structural parameters and the control system parameters constitute the design variables, which are used to optimize a common objective function, thereby resulting in an optimal overall design. The approach is demonstrated by application to the integrated design of a geostationary platform, and to a ground-based flexible structure experiment. The numerical results obtained indicate that the integrated design approach generally yields spacecraft designs that are substantially superior to the conventional approach, wherein the structural design and control design are performed sequentially.
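
    The sketch below illustrates the integrated-design idea in the simplest possible terms (it is not the paper's formulation; the surrogate mass and control-performance terms, variable names, and bounds are assumptions): structural and control parameters are packed into one design vector, and a single blended objective is minimized over both at once.

      import numpy as np
      from scipy.optimize import minimize

      def objective(x):
          t, k = x[:2], x[2:]                        # structural thicknesses, control gains
          mass = 10.0 * np.sum(t)                    # surrogate structural mass
          control_penalty = np.sum((1.0 / k) ** 2)   # surrogate control-performance cost
          return mass + 5.0 * control_penalty        # common objective for both disciplines

      x0 = np.array([0.5, 0.5, 1.0, 1.0])            # initial thicknesses and gains
      bounds = [(0.1, 2.0)] * 2 + [(0.2, 10.0)] * 2
      result = minimize(objective, x0, method="SLSQP", bounds=bounds)
      print(result.x)                                # jointly optimal structure + control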

  14. Peer Review of a Formal Verification/Design Proof Methodology

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The role of formal verification techniques in system validation was examined. The value and state of the art of performance proving for fault-tolerant computers were assessed. The investigation, development, and evaluation of performance-proving tools were reviewed. The technical issues related to proof methodologies are examined and summarized.

  15. Loss Exposure and Risk Analysis Methodology (LERAM) Project Database Design.

    DTIC Science & Technology

    1996-06-01

    MISREPS) to more capably support system safety engineering concepts such as hazard analysis and risk management. As part of the Loss Exposure and Risk ... Analysis Methodology (LERAM) project, the research into the methods which we employ to report, track, and analyze hazards has resulted in a series of low

  16. Behavioral Methodology for Designing and Evaluating Applied Programs for Women.

    ERIC Educational Resources Information Center

    Thurston, Linda P.

    To be maximally effective in solving problems, researchers must place their methodological and theoretical models of science within social and political contexts. They must become aware of biases and assumptions and move toward a more valid perception of social realities. Psychologists must view women in the situational context within which…

  17. De/signing Research in Education: Patchwork(ing) Methodologies with Theory

    ERIC Educational Resources Information Center

    Higgins, Marc; Madden, Brooke; Berard, Marie-France; Lenz Kothe, Elsa; Nordstrom, Susan

    2017-01-01

    Four education scholars extend the methodological space inspired by Jackson and Mazzei's "Thinking with Theory" through focusing on research design. The notion of de/sign is presented and employed to counter prescriptive method/ology that often sutures over pedagogical possibilities in research and educational settings. Key…

  18. Methodology for designing accelerated aging tests for predicting life of photovoltaic arrays

    NASA Technical Reports Server (NTRS)

    Gaines, G. B.; Thomas, R. E.; Derringer, G. C.; Kistler, C. W.; Bigg, D. M.; Carmichael, D. C.

    1977-01-01

    A methodology for designing aging tests in which life prediction was paramount was developed. The methodology builds upon experience with regard to aging behavior in those material classes which are expected to be utilized as encapsulant elements, viz., glasses and polymers, and upon experience with the design of aging tests. The experiences were reviewed, and results are discussed in detail.

  19. De/signing Research in Education: Patchwork(ing) Methodologies with Theory

    ERIC Educational Resources Information Center

    Higgins, Marc; Madden, Brooke; Berard, Marie-France; Lenz Kothe, Elsa; Nordstrom, Susan

    2017-01-01

    Four education scholars extend the methodological space inspired by Jackson and Mazzei's "Thinking with Theory" through focusing on research design. The notion of de/sign is presented and employed to counter prescriptive method/ology that often sutures over pedagogical possibilities in research and educational settings. Key…

  20. Methodology of Computer-Aided Design of Variable Guide Vanes of Aircraft Engines

    ERIC Educational Resources Information Center

    Falaleev, Sergei V.; Melentjev, Vladimir S.; Gvozdev, Alexander S.

    2016-01-01

    The paper presents a methodology which helps to avoid a great amount of costly experimental research. This methodology includes thermo-gas dynamic design of an engine and its mounts, the profiling of compressor flow path and cascade design of guide vanes. Employing a method elaborated by Howell, we provide a theoretical solution to the task of…

  1. 77 FR 50514 - Post-Approval Studies 2012 Workshop: Design, Methodology, and Role in Evidence Appraisal...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-21

    ... experiences with post-approval studies, improvement of implementation strategies for post-approval studies... HUMAN SERVICES Food and Drug Administration Post-Approval Studies 2012 Workshop: Design, Methodology... ``Post-Approval Studies 2012 Workshop: Design, Methodology, and Role in Evidence Appraisal Throughout the...

  2. Game Methodology for Design Methods and Tools Selection

    ERIC Educational Resources Information Center

    Ahmad, Rafiq; Lahonde, Nathalie; Omhover, Jean-françois

    2014-01-01

    Design process optimisation and intelligence are the key words of today's scientific community. A proliferation of methods has made design a convoluted area. Designers are usually afraid of selecting one method/tool over another and even expert designers may not necessarily know which method is the best to use in which circumstances. This…

  3. Design of a methodology for assessing an electrocardiographic telemonitoring system.

    PubMed

    Alfonzo, A; Huerta, M K; Wong, S; Passariello, G; Díaz, M; La Cruz, A; Cruz, J

    2007-01-01

    Recent studies in bioengineering show great interest in telemedicine projects, motivated mainly by the fast communication technologies that have matured during the last decade. Since then, many telemedicine projects in different areas have been pursued, among them electrocardiographic monitoring, together with methodological reports for the evaluation of these projects. In this work, a methodology to evaluate an electrocardiographic telemonitoring system is presented. A procedure to verify the operation of the Data Acquisition Module (DAM) of an electrocardiographic telemonitoring system is given, taking defined standards as reference, along with procedures for the measurement of the Quality of Service (QoS) parameters required by the system on a Local Area Network (LAN). Finally, a graphical model and evaluation protocols are proposed.

  4. A Multiscale Progressive Failure Modeling Methodology for Composites that Includes Fiber Strength Stochastics

    NASA Technical Reports Server (NTRS)

    Ricks, Trenton M.; Lacy, Thomas E., Jr.; Bednarcyk, Brett A.; Arnold, Steven M.; Hutchins, John W.

    2014-01-01

    A multiscale modeling methodology was developed for continuous fiber composites that incorporates a statistical distribution of fiber strengths into coupled multiscale micromechanics/finite element (FE) analyses. A modified two-parameter Weibull cumulative distribution function, which accounts for the effect of fiber length on the probability of failure, was used to characterize the statistical distribution of fiber strengths. A parametric study using the NASA Micromechanics Analysis Code with the Generalized Method of Cells (MAC/GMC) was performed to assess the effect of variable fiber strengths on local composite failure within a repeating unit cell (RUC) and subsequent global failure. The NASA code FEAMAC and the ABAQUS finite element solver were used to analyze the progressive failure of a unidirectional SCS-6/TIMETAL 21S metal matrix composite tensile dogbone specimen at 650 °C. Multiscale progressive failure analyses were performed to quantify the effect of spatially varying fiber strengths on the RUC-averaged and global stress-strain responses and failure. The ultimate composite strengths and distribution of failure locations (predominately within the gage section) reasonably matched the experimentally observed failure behavior. The predicted composite failure behavior suggests that the use of macroscale models that exploit global geometric symmetries is inappropriate for cases where the actual distribution of local fiber strengths displays no such symmetries. This issue has not received much attention in the literature. Moreover, the model discretization at a specific length scale can have a profound effect on the computational costs associated with multiscale simulations, underscoring the need for models that yield accurate yet tractable results.
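
    The length-dependent two-parameter Weibull model mentioned above can be sketched as follows (parameter values are illustrative, not those of the paper): the probability of failure of a fiber of length L at stress σ is Pf = 1 - exp[-(L/L0)(σ/σ0)^m], and fiber strengths can be drawn by inverse-transform sampling.

      import numpy as np

      def weibull_cdf(sigma, m, sigma0, L, L0):
          """Probability of failure of a fiber of length L at stress sigma."""
          return 1.0 - np.exp(-(L / L0) * (sigma / sigma0) ** m)

      def sample_fiber_strengths(n, m, sigma0, L, L0, seed=None):
          """Inverse-transform sampling of strengths for fibers of length L."""
          rng = np.random.default_rng(seed)
          u = rng.uniform(size=n)
          return sigma0 * ((L0 / L) * (-np.log(1.0 - u))) ** (1.0 / m)

      # Illustrative values: Weibull modulus m, reference strength sigma0 (MPa),
      # gauge length L and reference length L0 (m); none are taken from the paper.
      strengths = sample_fiber_strengths(1000, m=10.0, sigma0=3500.0, L=0.2, L0=25.4e-3)
      print(strengths.mean(), strengths.std())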

  5. Methodological developments in US state-level Genuine Progress Indicators: toward GPI 2.0

    USGS Publications Warehouse

    Bagstad, Kenneth J.; Berik, Günseli; Gaddis, Erica J. Brown

    2014-01-01

    The Genuine Progress Indicator (GPI) has emerged as an important monetary measure of economic well-being. Unlike mainstream economic indicators, primarily Gross Domestic Product (GDP), the GPI accounts for both the benefits and costs of economic production across diverse economic, social, and environmental domains in a more comprehensive manner. Recently, the GPI has gained traction in subnational policy in the United States, with GPI studies being conducted in a number of states and with their formal adoption by several state governments. As the GPI is applied in different locations, new methods are developed, different data sources are available, and new issues of policy relevance are addressed using its component indicators. This has led to a divergence in methods, reducing comparability between studies and yielding results that are of varying methodological sophistication. In this study, we review the “state of the art” in recent US state-level GPI studies, focusing on those from Hawaii, Maryland, Ohio, Utah, and Vermont. Through adoption of a consistent approach, these and future GPI studies could utilize a framework that supports more uniform, comparable, and accurate measurements of progress. We also identify longer-term issues, particularly related to treatment of nonrenewable resource depletion, government spending, income inequality, and ecosystem services. As these issues are successfully addressed and disseminated, a “GPI 2.0” will emerge that better measures economic well-being and has greater accuracy and policy relevance than past GPI measurements. As the GPI expands further into mainstream policy analysis, a more formal process by which methods could be updated, standardized, and applied is needed.

  6. Design methodology for optimal hardware implementation of wavelet transform domain algorithms

    NASA Astrophysics Data System (ADS)

    Johnson-Bey, Charles; Mickens, Lisa P.

    2005-05-01

    The work presented in this paper lays the foundation for the development of an end-to-end system design methodology for implementing wavelet domain image/video processing algorithms in hardware using Xilinx field programmable gate arrays (FPGAs). With the integration of the Xilinx System Generator toolbox, this methodology will allow algorithm developers to design and implement their code using the familiar MATLAB/Simulink development environment. By using this methodology, algorithm developers will not be required to become proficient in the intricacies of hardware design, thus reducing the design cycle and time-to-market.

  7. A Design Methodology for Complex (E)-Learning. Innovative Session.

    ERIC Educational Resources Information Center

    Bastiaens, Theo; van Merrienboer, Jeroen; Hoogveld, Bert

    Human resource development (HRD) specialists are searching for instructional design models that accommodate e-learning platforms. Van Merrienboer proposed the four-component instructional design model (4C/ID model) for competency-based education. The model's basic message is that well-designed learning environments can always be described in terms…

  8. Ethics of Engagement: User-Centered Design and Rhetorical Methodology.

    ERIC Educational Resources Information Center

    Salvo, Michael J.

    2001-01-01

    Explores the shift from observation of users to participation with users, describing and investigating three examples of user-centered design practice in order to consider the new ethical demands being made of technical communicators. Explores Pelle Ehn's participatory design method, Roger Whitehouse's design of tactile signage for blind users,…

  9. A methodology for designing aircraft to low sonic boom constraints

    NASA Technical Reports Server (NTRS)

    Mack, Robert J.; Needleman, Kathy E.

    1991-01-01

    A method for designing conceptual supersonic cruise aircraft to meet low sonic boom requirements is outlined and described. The aircraft design is guided through a systematic evolution from initial three view drawing to a final numerical model description, while the designer using the method controls the integration of low sonic boom, high supersonic aerodynamic efficiency, adequate low speed handling, and reasonable structure and materials technologies. Some experience in preliminary aircraft design and in the use of various analytical and numerical codes is required for integrating the volume and lift requirements throughout the design process.

  10. A transonic-small-disturbance wing design methodology

    NASA Technical Reports Server (NTRS)

    Phillips, Pamela S.; Waggoner, Edgar G.; Campbell, Richard L.

    1988-01-01

    An automated transonic design code has been developed which modifies an initial airfoil or wing in order to generate a specified pressure distribution. The design method uses an iterative approach that alternates between a potential-flow analysis and a design algorithm that relates changes in surface pressure to changes in geometry. The analysis code solves an extended small-disturbance potential-flow equation and can model a fuselage, pylons, nacelles, and a winglet in addition to the wing. A two-dimensional option is available for airfoil analysis and design. Several two- and three-dimensional test cases illustrate the capabilities of the design code.
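
    A schematic sketch of the alternate analyze/design loop described above (a toy stand-in, not the actual code; the linear Cp model, the assumed sensitivity dCp/dy, the relaxation factor, and the array sizes are all illustrative): the flow analysis produces a surface pressure, and the design step moves the geometry in proportion to the mismatch with the target pressure distribution.

      import numpy as np

      def design_loop(geometry, target_cp, analyze, dcp_dy=-2.0,
                      relaxation=0.5, tol=1e-6, max_iter=100):
          for _ in range(max_iter):
              cp = analyze(geometry)                           # flow-analysis stand-in
              dcp = target_cp - cp                             # surface-pressure mismatch
              if np.max(np.abs(dcp)) < tol:
                  break
              geometry = geometry + relaxation * dcp / dcp_dy  # pressure-to-geometry relation
          return geometry

      analyze = lambda y: -2.0 * y                 # toy: Cp responds linearly to the ordinate
      target = np.linspace(-0.4, 0.1, 21)          # desired pressure distribution
      final_geometry = design_loop(np.zeros(21), target, analyze)
      print(final_geometry)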

  11. Propulsion integration of hypersonic air-breathing vehicles utilizing a top-down design methodology

    NASA Astrophysics Data System (ADS)

    Kirkpatrick, Brad Kenneth

    In recent years, a focus of aerospace engineering design has been the development of advanced design methodologies and frameworks to account for increasingly complex and integrated vehicles. Techniques such as parametric modeling, global vehicle analyses, and interdisciplinary data sharing have been employed in an attempt to improve the design process. The purpose of this study is to introduce a new approach to integrated vehicle design known as the top-down design methodology. In the top-down design methodology, the main idea is to relate design changes on the vehicle system and sub-system level to a set of over-arching performance and customer requirements. Rather than focusing on the performance of an individual system, the system is analyzed in terms of the net effect it has on the overall vehicle and other vehicle systems. This detailed level of analysis can only be accomplished through the use of high fidelity computational tools such as Computational Fluid Dynamics (CFD) or Finite Element Analysis (FEA). The utility of the top-down design methodology is investigated through its application to the conceptual and preliminary design of a long-range hypersonic air-breathing vehicle for a hypothetical next generation hypersonic vehicle (NHRV) program. System-level design is demonstrated through the development of the nozzle section of the propulsion system. From this demonstration of the methodology, conclusions are made about the benefits, drawbacks, and cost of using the methodology.

  12. A Comparison of Angoff's Design I and Design II for Vertical Equating Using Traditional and IRT Methodology.

    ERIC Educational Resources Information Center

    Harris, Deborah J.

    1991-01-01

    Two data collection designs, counterbalanced and spiraling (Angoff's Design I and Angoff's Design II) were compared using item response theory and equipercentile equating methodology in the vertical equating of 2 mathematics achievement tests using 1,000 eleventh graders and 1,000 twelfth graders. The greater stability of Design II is discussed.…

  13. A Methodology for Quantifying Certain Design Requirements During the Design Phase

    NASA Technical Reports Server (NTRS)

    Adams, Timothy; Rhodes, Russel

    2005-01-01

    A methodology for developing and balancing quantitative design requirements for safety, reliability, and maintainability has been proposed. Conceived as the basis of a more rational approach to the design of spacecraft, the methodology would also be applicable to the design of automobiles, washing machines, television receivers, or almost any other commercial product. Heretofore, it has been common practice to start by determining the requirements for reliability of elements of a spacecraft or other system to ensure a given design life for the system. Next, safety requirements are determined by assessing the total reliability of the system and adding redundant components and subsystems necessary to attain safety goals. As thus described, common practice leaves the maintainability burden to fall to chance; therefore, there is no control of recurring costs or of the responsiveness of the system. The means that have been used in assessing maintainability have been oriented toward determining the logistical sparing of components so that the components are available when needed. The process established for developing and balancing quantitative requirements for safety (S), reliability (R), and maintainability (M) derives and integrates NASA's top-level safety requirements and the controls needed to obtain program key objectives for safety and recurring cost (see figure). Being quantitative, the process conveniently uses common mathematical models. Even though the process is shown as being worked from the top down, it can also be worked from the bottom up. This process uses three math models: (1) the binomial distribution (greater-than-or-equal-to case), (2) reliability for a series system, and (3) the Poisson distribution (less-than-or-equal-to case). The zero-fail case for the binomial distribution approximates the commonly known exponential distribution or "constant failure rate" distribution. Either model can be used. The binomial distribution was selected for
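
    The three mathematical models named above can be exercised with a few lines of code (the numerical values are illustrative only, and scipy's standard distribution functions stand in for whatever tabulations the original process used):

      import numpy as np
      from scipy.stats import binom, poisson

      # (1) Binomial, greater-than-or-equal-to case: P(at least k failures in n trials)
      p_fail, n, k = 0.01, 100, 1
      p_at_least_k = binom.sf(k - 1, n, p_fail)       # = 1 - P(X <= k - 1)

      # (2) Reliability of a series system: product of element reliabilities
      r_series = np.prod([0.999, 0.995, 0.99])

      # (3) Poisson, less-than-or-equal-to case: P(at most 3 maintenance events)
      p_at_most_3 = poisson.cdf(3, mu=2.5)

      # Zero-fail binomial vs. the constant-failure-rate (exponential) approximation
      p_zero_binom = binom.pmf(0, n, p_fail)          # (1 - p)^n
      p_zero_exp = np.exp(-n * p_fail)                # exp(-n p), close for small p
      print(p_at_least_k, r_series, p_at_most_3, p_zero_binom, p_zero_exp)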

  14. Designing trials for pressure ulcer risk assessment research: methodological challenges.

    PubMed

    Balzer, K; Köpke, S; Lühmann, D; Haastert, B; Kottner, J; Meyer, G

    2013-08-01

    For decades various pressure ulcer risk assessment scales (PURAS) have been developed and implemented into nursing practice despite uncertainty whether use of these tools helps to prevent pressure ulcers. According to current methodological standards, randomised controlled trials (RCTs) are required to conclusively determine the clinical efficacy and safety of this risk assessment strategy. In these trials, PURAS-aided risk assessment has to be compared to nurses' clinical judgment alone in terms of its impact on pressure ulcer incidence and adverse outcomes. However, RCTs evaluating diagnostic procedures are prone to specific risks of bias and threats to the statistical power which may challenge their validity and feasibility. This discussion paper critically reflects on the rigour and feasibility of experimental research needed to substantiate the clinical efficacy of PURAS-aided risk assessment. Based on reflections of the methodological literature, a critical appraisal of available trials on this subject and an analysis of a protocol developed for a methodologically robust cluster-RCT, this paper arrives at the following conclusions: First, available trials do not provide reliable estimates of the impact of PURAS-aided risk assessment on pressure ulcer incidence compared to nurses' clinical judgement alone due to serious risks of bias and insufficient sample size. Second, it seems infeasible to assess this impact by means of rigorous experimental studies since sample size would become extremely high if likely threats to validity and power are properly taken into account. Third, means of evidence linkages seem to currently be the most promising approaches for evaluating the clinical efficacy and safety of PURAS-aided risk assessment. With this kind of secondary research, the downstream effect of use of PURAS on pressure ulcer incidence could be modelled by combining best available evidence for single parts of this pathway. However, to yield reliable modelling

  15. Optimal color design of psychological counseling room by design of experiments and response surface methodology.

    PubMed

    Liu, Wenjuan; Ji, Jianlin; Chen, Hua; Ye, Chenyu

    2014-01-01

    Color is one of the most powerful aspects of a psychological counseling environment. Little scientific research has been conducted on color design and much of the existing literature is based on observational studies. Using design of experiments and response surface methodology, this paper proposes an optimal color design approach for transforming patients' perception into color elements. Six indices, pleasant-unpleasant, interesting-uninteresting, exciting-boring, relaxing-distressing, safe-fearful, and active-inactive, were used to assess patients' impression. A total of 75 patients participated, including 42 for Experiment 1 and 33 for Experiment 2. 27 representative color samples were designed in Experiment 1, and the color sample (L = 75, a = 0, b = -60) was the most preferred one. In Experiment 2, this color sample was set as the 'central point', and three color attributes were optimized to maximize the patients' satisfaction. The experimental results show that the proposed method can get the optimal solution for color design of a counseling room.
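
    A minimal sketch of the response-surface step (not the study's code; the ratings are synthetic, and the quadratic model, grid ranges, and noise level are assumptions): fit a second-order surface to ratings collected over CIELAB coordinates around the reported 'central point' (L = 75, a = 0, b = -60) and read off the setting that maximizes the fitted response.

      import numpy as np

      rng = np.random.default_rng(0)
      L = rng.uniform(65, 85, 40)
      a = rng.uniform(-10, 10, 40)
      b = rng.uniform(-70, -50, 40)
      # Synthetic satisfaction ratings peaking near (75, 0, -60), with noise.
      rating = 8 - 0.01*(L - 75)**2 - 0.02*a**2 - 0.01*(b + 60)**2 + rng.normal(0, 0.2, 40)

      def quad_terms(L, a, b):
          """Second-order model: intercept, linear, squared, and interaction terms."""
          return np.column_stack([np.ones_like(L), L, a, b, L**2, a**2, b**2,
                                  L*a, L*b, a*b])

      coef, *_ = np.linalg.lstsq(quad_terms(L, a, b), rating, rcond=None)

      # Evaluate the fitted surface on a grid and pick the maximizing color.
      Lg, ag, bg = np.meshgrid(np.linspace(65, 85, 21), np.linspace(-10, 10, 21),
                               np.linspace(-70, -50, 21), indexing="ij")
      pred = quad_terms(Lg.ravel(), ag.ravel(), bg.ravel()) @ coef
      i = np.argmax(pred)
      print(Lg.ravel()[i], ag.ravel()[i], bg.ravel()[i])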

  16. Optimal Color Design of Psychological Counseling Room by Design of Experiments and Response Surface Methodology

    PubMed Central

    Chen, Hua; Ye, Chenyu

    2014-01-01

    Color is one of the most powerful aspects of a psychological counseling environment. Little scientific research has been conducted on color design and much of the existing literature is based on observational studies. Using design of experiments and response surface methodology, this paper proposes an optimal color design approach for transforming patients’ perception into color elements. Six indices, pleasant-unpleasant, interesting-uninteresting, exciting-boring, relaxing-distressing, safe-fearful, and active-inactive, were used to assess patients’ impression. A total of 75 patients participated, including 42 for Experiment 1 and 33 for Experiment 2. 27 representative color samples were designed in Experiment 1, and the color sample (L = 75, a = 0, b = -60) was the most preferred one. In Experiment 2, this color sample was set as the ‘central point’, and three color attributes were optimized to maximize the patients’ satisfaction. The experimental results show that the proposed method can get the optimal solution for color design of a counseling room. PMID:24594683

  17. Improved FTA Methodology and Application to Subsea Pipeline Reliability Design

    PubMed Central

    Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan

    2014-01-01

    An innovative logic tree, Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different thinking approach for risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, the rather subjective nature of FTA node discovery is significantly reduced and the resulting mathematical calculations for quantitative analysis are greatly simplified. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. Resulting improvements are summarized in comparison table form. PMID:24667681

  18. Improved FTA methodology and application to subsea pipeline reliability design.

    PubMed

    Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan

    2014-01-01

    An innovative logic tree, Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different thinking approach for risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, the rather subjective nature of FTA node discovery is significantly reduced and the resulting mathematical calculations for quantitative analysis are greatly simplified. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. Resulting improvements are summarized in comparison table form.

  19. Probabilistic Design Methodology and its Application to the Design of an Umbilical Retract Mechanism

    NASA Technical Reports Server (NTRS)

    Onyebueke, Landon; Ameye, Olusesan

    2002-01-01

    A lot has been learned from past experience with structural and machine element failures. The understanding of failure modes and the application of an appropriate design analysis method can lead to improved structural and machine element safety as well as serviceability. To apply Probabilistic Design Methodology (PDM), all uncertainties are modeled as random variables with selected distribution types, means, and standard deviations. It is quite difficult to achieve a robust design without considering the randomness of the design parameters, which is the case in the use of the Deterministic Design Approach. The US Navy has a fleet of submarine-launched ballistic missiles. An umbilical plug joins the missile to the submarine in order to provide electrical and cooling water connections. As the missile leaves the submarine, an umbilical retract mechanism retracts the umbilical plug clear of the advancing missile after disengagement during launch and restrains the plug in the retracted position. The design of the current retract mechanism in use was based on the deterministic approach, which puts emphasis on the factor of safety. A new umbilical retract mechanism that is simpler in design, lighter in weight, more reliable, easier to adjust, and more cost effective has become desirable since this will increase the performance and efficiency of the system. This paper reports on a recent project performed at Tennessee State University for the US Navy that involved the application of PDM to the design of an umbilical retract mechanism. This paper demonstrates how the use of PDM led to the minimization of weight and cost and the maximization of reliability and performance.
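
    The contrast between the deterministic and probabilistic views can be sketched with a short Monte Carlo example (the load and capacity distributions below are made up for illustration and are not the mechanism's actual data):

      import numpy as np

      rng = np.random.default_rng(42)
      n = 1_000_000
      load = rng.normal(loc=12.0, scale=1.5, size=n)      # applied load, kN (assumed)
      capacity = rng.normal(loc=20.0, scale=2.0, size=n)  # structural capacity, kN (assumed)

      p_failure = np.mean(capacity <= load)               # probabilistic measure of safety
      factor_of_safety = 20.0 / 12.0                      # deterministic view: ignores scatter
      print(f"P(failure) ~ {p_failure:.1e}, nominal factor of safety {factor_of_safety:.2f}")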

  20. Probabilistic Design Methodology and its Application to the Design of an Umbilical Retract Mechanism

    NASA Technical Reports Server (NTRS)

    Onyebueke, Landon; Ameye, Olusesan

    2002-01-01

    A lot has been learned from past experience with structural and machine element failures. The understanding of failure modes and the application of an appropriate design analysis method can lead to improved structural and machine element safety as well as serviceability. To apply Probabilistic Design Methodology (PDM), all uncertainties are modeled as random variables with selected distribution types, means, and standard deviations. It is quite difficult to achieve a robust design without considering the randomness of the design parameters, which is the case in the use of the Deterministic Design Approach. The US Navy has a fleet of submarine-launched ballistic missiles. An umbilical plug joins the missile to the submarine in order to provide electrical and cooling water connections. As the missile leaves the submarine, an umbilical retract mechanism retracts the umbilical plug clear of the advancing missile after disengagement during launch and restrains the plug in the retracted position. The design of the current retract mechanism in use was based on the deterministic approach, which puts emphasis on the factor of safety. A new umbilical retract mechanism that is simpler in design, lighter in weight, more reliable, easier to adjust, and more cost effective has become desirable since this will increase the performance and efficiency of the system. This paper reports on a recent project performed at Tennessee State University for the US Navy that involved the application of PDM to the design of an umbilical retract mechanism. This paper demonstrates how the use of PDM led to the minimization of weight and cost and the maximization of reliability and performance.

  1. Probabilistic Design Methodology and its Application to the Design of an Umbilical Retract Mechanism

    NASA Astrophysics Data System (ADS)

    Onyebueke, Landon; Ameye, Olusesan

    2002-10-01

    A lot has been learned from past experience with structural and machine element failures. The understanding of failure modes and the application of an appropriate design analysis method can lead to improved structural and machine element safety as well as serviceability. To apply Probabilistic Design Methodology (PDM), all uncertainties are modeled as random variables with selected distribution types, means, and standard deviations. It is quite difficult to achieve a robust design without considering the randomness of the design parameters, which is the case in the use of the Deterministic Design Approach. The US Navy has a fleet of submarine-launched ballistic missiles. An umbilical plug joins the missile to the submarine in order to provide electrical and cooling water connections. As the missile leaves the submarine, an umbilical retract mechanism retracts the umbilical plug clear of the advancing missile after disengagement during launch and restrains the plug in the retracted position. The design of the current retract mechanism in use was based on the deterministic approach, which puts emphasis on the factor of safety. A new umbilical retract mechanism that is simpler in design, lighter in weight, more reliable, easier to adjust, and more cost effective has become desirable since this will increase the performance and efficiency of the system. This paper reports on a recent project performed at Tennessee State University for the US Navy that involved the application of PDM to the design of an umbilical retract mechanism. This paper demonstrates how the use of PDM led to the minimization of weight and cost and the maximization of reliability and performance.

  2. Highly efficient design methodology for very large scale coupled microcavities

    NASA Astrophysics Data System (ADS)

    Swillam, Mohamed A.; Ahmed, Osman S.; Bakr, Mohamed H.; Li, Xun

    2012-10-01

    We propose a novel approach for the efficient design of a large number of coupled microcavities. This approach is based on formulating the design problem as a convex optimization problem. This formulation allows for a fast, efficient solution of the design problem. A filter design using 150 coupled microcavities has been achieved in less than one second of simulation on a personal computer. The proposed technique requires no initial design to start the optimization process.

  3. Design Thinking: A Methodology towards Sustainable Problem Solving in Higher Education in South Africa

    ERIC Educational Resources Information Center

    Munyai, Keneilwe

    2016-01-01

    This short paper explores the potential contribution of design thinking methodology to the education and training system in South Africa. Design thinking is slowly gaining traction in South Africa, where it is offered by the Hasso Plattner Institute of Design Thinking at the University of Cape Town…

  4. From inhibition of radiographic progression to maintaining structural integrity: a methodological framework for radiographic progression in rheumatoid arthritis and psoriatic arthritis clinical trials.

    PubMed

    Landewé, Robert; Strand, Vibeke; van der Heijde, Désirée

    2013-07-01

    Usually, a clinical trial in rheumatoid arthritis and psoriatic arthritis aiming to demonstrate that a new antirheumatic drug treatment can inhibit progression of structural damage has a 'superiority design': the new treatment is compared to placebo or to another active treatment. Currently, many new drug treatments have been shown to be able to completely suppress progression (progression rates close to zero). For largely unknown reasons, radiographic progression rates in clinical trials have gradually decreased during the last 10 years, so that progression rates in the comparator groups are often too low to demonstrate meaningful inhibition, and thus superiority of the new treatment. We propose here an alternative framework to demonstrate that new treatments have the ability to 'preserve structural integrity' rather than to 'inhibit radiographic progression'. As of 2013, preserving structural integrity is conceptually more realistic than inhibiting radiographic progression.

  5. Manufacturing-aware design methodologies for mixed-signal communication circuits

    NASA Astrophysics Data System (ADS)

    Carballo, Juan A.; Nassif, Sani

    2004-05-01

    Mixed-signal communication circuits are becoming a very common component of systems-on-a-chip as part of modern communication systems. For these circuits, the implementation of DFM and DFT methodologies is critical to enhancing communication across the tape-out barrier. We present a manufacturing-aware design methodology specifically targeting integrated communication circuits in systems-on-a-chip (SoC). The key principle behind the methodology is that flexible design methods which can effectively adjust a design's power consumption and functionality to its application can also provide critical reductions in manufacturing-induced design risk. The methodology is based on the following four techniques: goal-based design that directly relates top-level goals to low-level manufacturing-dependent parameters; semi-custom voltage-island physical design techniques; adaptive architecture design; and intelligent on-line at-speed monitoring and problem determination techniques. We describe these four methodology features and illustrate them on a multi-protocol CMOS 3.2 Gbit/s low-power serial communications core. The presented data show how this methodology results in better and more cost-effective adaptability of the design to manufacturing and post-manufacturing conditions, thereby improving turnaround time, yield, and overall profit.

  6. Design and development of progressive tool for manufacturing washer

    NASA Astrophysics Data System (ADS)

    Annigeri, Ulhas K.; Raghavendra Ravi Kiran, K.; Deepthi, Y. P.

    2017-07-01

    In a progressive tool, the raw material is worked at different stations to finally fabricate the component. A progressive tool is a cost-effective choice for the mass production of components, and many automobile and other transport industries develop progressive tools for component production. The design of the tool involves a great deal of planning, and a comparable amount of process-planning skill is required in its fabrication. The design also involves the use of rules of thumb and standard elements, as per experience gained in practice. Manufacturing the press tool is a laborious task, as special jigs and fixtures have to be designed for the purpose. Assembly of all the press tool elements is another task in which the use of accurate measuring instruments for the alignment of the various tool elements is important. In the present study, a progressive press tool for the production of washers was designed and fabricated, and the press tool was tried out on a mechanical press. The components produced conform to the specified dimensions.

  7. Participant Observation, Anthropology Methodology and Design Anthropology Research Inquiry

    ERIC Educational Resources Information Center

    Gunn, Wendy; Løgstrup, Louise B.

    2014-01-01

    Within the design studio, and across multiple field sites, the authors compare involvement of research tools and materials during collaborative processes of designing. Their aim is to trace temporal dimensions (shifts/ movements) of where and when learning takes place along different sites of practice. They do so by combining participant…

  8. A Fundamental Methodology for Designing Management Information Systems for Schools.

    ERIC Educational Resources Information Center

    Visscher, Adrie J.

    Computer-assisted school information systems (SISs) are developed and used worldwide; however, the literature on strategies for their design and development is lacking. This paper presents the features of a fundamental approach to systems design that proved to be successful when developing SCHOLIS, a computer-assisted SIS for Dutch secondary…

  9. Participatory Pattern Workshops: A Methodology for Open Learning Design Inquiry

    ERIC Educational Resources Information Center

    Mor, Yishay; Warburton, Steven; Winters, Niall

    2012-01-01

    In order to promote pedagogically informed use of technology, educators need to develop an active, inquisitive, design-oriented mindset. Design Patterns have been demonstrated as powerful mediators of theory-praxis conversations yet widespread adoption by the practitioner community remains a challenge. Over several years, the authors and their…

  10. Participant Observation, Anthropology Methodology and Design Anthropology Research Inquiry

    ERIC Educational Resources Information Center

    Gunn, Wendy; Løgstrup, Louise B.

    2014-01-01

    Within the design studio, and across multiple field sites, the authors compare involvement of research tools and materials during collaborative processes of designing. Their aim is to trace temporal dimensions (shifts/ movements) of where and when learning takes place along different sites of practice. They do so by combining participant…

  11. Application of optimal design methodologies in clinical pharmacology experiments.

    PubMed

    Ogungbenro, Kayode; Dokoumetzidis, Aristides; Aarons, Leon

    2009-01-01

    Pharmacokinetic and pharmacodynamic data are often analysed by mixed-effects modelling techniques (also known as population analysis), which have become a standard tool in the pharmaceutical industry for drug development. The last 10 years have witnessed considerable interest in the application of experimental design theories to population pharmacokinetic and pharmacodynamic experiments. The design of population pharmacokinetic experiments involves the selection and careful balance of a number of design factors. Optimal design theory uses prior information about the model and parameter estimates to optimize a function of the Fisher information matrix and so obtain the best combination of the design factors. This paper provides a review of the different approaches that have been described in the literature for the optimal design of population pharmacokinetic and pharmacodynamic experiments. It describes the options that are available and highlights some of the issues that could be of concern in practical application. It also discusses areas of application of optimal design theories in clinical pharmacology experiments. It is expected that as awareness of the benefits of this approach increases, more people will embrace it, ultimately leading to more efficient population pharmacokinetic and pharmacodynamic experiments and helping to reduce both cost and time during drug development.
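
    As a small illustration of optimizing a function of the Fisher information matrix, the sketch below picks D-optimal sampling times for a one-compartment oral-absorption model by exhaustive search (the model, parameter values, candidate times, and constant residual variance are all simplifying assumptions; this is not a production optimal-design tool):

      import numpy as np
      from itertools import combinations

      def conc(t, ka, ke, V, dose=100.0):
          """One-compartment model with first-order absorption."""
          return dose * ka / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

      def fisher_information(times, theta, sigma=0.1, eps=1e-5):
          """FIM from numerical sensitivities, assuming constant residual variance."""
          times = np.asarray(times, dtype=float)
          S = np.zeros((len(times), len(theta)))
          for j in range(len(theta)):
              up, dn = np.array(theta), np.array(theta)
              up[j] += eps
              dn[j] -= eps
              S[:, j] = (conc(times, *up) - conc(times, *dn)) / (2 * eps)
          return S.T @ S / sigma**2

      theta = (1.0, 0.1, 10.0)                        # ka (1/h), ke (1/h), V (L) -- assumed
      candidates = [0.25, 0.5, 1, 2, 4, 8, 12, 24]    # candidate sampling times (h)
      best = max(combinations(candidates, 4),
                 key=lambda d: np.linalg.slogdet(fisher_information(d, theta))[1])
      print("D-optimal 4-point design:", best)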

  12. Design methodology and application of high speed gate arrays

    NASA Astrophysics Data System (ADS)

    Decker, R.

    A system to provide real-time signal averaging of waveforms from a 50 MHz analog to digital converter has been fabricated to operate over a wide temperature range. This system evolved from conception, through an initial simulated design for emitter coupled logic (ECL), to a pair of CMOS gate array designs. Changing the implementation technology to CMOS gate arrays resulted in savings in cost, size, weight, and power. Design rules employed to obtain working silicon on the first cycle, at double state-of-the-art gate array speeds, are discussed. Also discussed are built-in, run-time, self-test features.

  13. Engineering design methodology for bio-mechatronic products.

    PubMed

    Derelöv, Micael; Detterfelt, Jonas; Björkman, Mats; Mandenius, Carl-Fredrik

    2008-01-01

    Four complex biotechnology products/product systems (a protein purification system, a bioreactor system, a surface plasmon resonance biosensor, and an enzymatic glucose analyzer) are analyzed using conceptual design principles. A design model well-known in mechanical system design, the Hubka-Eder (HE) model, is adapted to biotechnology products that exemplify combined technical systems of mechanical, electronic, and biological components, here referred to as bio-mechatronic systems. The analysis concludes that an extension of the previous HE model with a separate biological systems entity significantly contributes to facilitating the functional and systematic analyses of bio-mechatronic systems.

  14. Modern design methodology and problems in training aircraft engineers

    NASA Technical Reports Server (NTRS)

    Liseitsev, N. K.

    1989-01-01

    A brief report on the problem of modern aircraft specialist education is presented that is devoted to the content and methods of teaching a course in General Aircraft Design in the Moscow Aviation Institute.

  15. Modern design methodology and problems in training aircraft engineers

    NASA Technical Reports Server (NTRS)

    Liseitsev, N. K.

    1989-01-01

    A brief report on the problem of modern aircraft specialist education is presented that is devoted to the content and methods of teaching a course in General Aircraft Design in the Moscow Aviation Institute.

  16. Design methodology of microstructures for enhanced mechanical reliability

    NASA Astrophysics Data System (ADS)

    Wittler, Olaf; Walter, Hans; Vogel, Dietmar; Keller, Juergen; Michel, Bernd

    2005-04-01

    The achievement of reliability is a major task during the design process of microsystems (i.e., MEMS: micro-electro-mechanical systems). In this respect, CAD (computer-aided design) simulation methods play a major role in the dimensioning of mechanical structures. A pure CAD approach becomes difficult because of the complexity of these systems, which originates from the large variety of integrated materials and thus the diversity of the resulting failure mechanisms. Therefore, strategies dealing with these uncertainties in reliability estimates need to be incorporated into the design process. The approach presented in this paper is based on the application of simulation and of advanced deformation measurement methods named microDAC (micro deformation analysis by means of grey-scale correlation) and nanoDAC. It is exemplified on different detail levels of the reliability assessment, with an emphasis on fracture. The first stage consists of a parametric simulation approach, which helps to develop design guidelines for the geometry. For a more absolute quantitative analysis and for material selection in a new design, the mechanical properties need to be specified and evaluated with respect to reliability. In addition, the described systematics of reliability assessment requires profound knowledge of the failure behavior, which is analyzed by the application of microDAC/nanoDAC techniques. In this way, it becomes possible to tackle mechanical reliability problems in early design phases.

  17. Information System Design Methodology Based on PERT/CPM Networking and Optimization Techniques.

    ERIC Educational Resources Information Center

    Bose, Anindya

    The dissertation attempts to demonstrate that the program evaluation and review technique (PERT)/Critical Path Method (CPM) or some modified version thereof can be developed into an information system design methodology. The methodology utilizes PERT/CPM which isolates the basic functional units of a system and sets them in a dynamic time/cost…

  18. Information System Design Methodology Based on PERT/CPM Networking and Optimization Techniques.

    ERIC Educational Resources Information Center

    Bose, Anindya

    The dissertation attempts to demonstrate that the program evaluation and review technique (PERT)/Critical Path Method (CPM) or some modified version thereof can be developed into an information system design methodology. The methodology utilizes PERT/CPM which isolates the basic functional units of a system and sets them in a dynamic time/cost…

  19. Design methodology for multilayer microwave filters and Balun circuits

    NASA Astrophysics Data System (ADS)

    Cho, Choonsik

    A systematic and efficient approach to the design of a broad class of passive microwave circuits in multilayer configurations is presented. Multilayer configurations are becoming popular at microwave frequencies due to their several advantages over single-layer configurations. However, systematic design procedures for multilayer circuits have not yet been available. Design procedures for several types of microwave circuits in multilayer configurations have been developed. Parallel coupled-line band-pass filters, end-coupled band-pass filters, and three-line baluns have been designed with the systematic design procedures developed. The procedures developed have been verified by comparing the results with full-wave electromagnetic simulations. These circuits have also been fabricated and measured to verify the design procedures. Wide bandwidth, size/volume compaction, flexible design, and physically realizable dimensions are the factors that multilayer structures provide compared to single-layer configurations. A network modeling approach is employed to characterize multilayer multi-conductor transmission line systems. Since the microwave circuits developed utilize multiple coupled lines in multilayer configurations, the characterization of these coupled lines plays a significant role in the derivation of design equations and the generation of design procedures. Using this modeling approach, a multiple coupled line system can be transformed into a multiple uncoupled line system. These equivalent uncoupled lines are used to derive network parameters ([S], [Y] or [Z] matrix) for coupled lines. The normal mode parameters (NMPs) for coupled lines, derived in terms of system specifications, are utilized to obtain the physical geometries. An optimization process is employed to find the geometries which yield the desired NMPs calculated from circuit specifications. For optimizing the geometry, a quasi-static field analysis program, Segmentation and Boundary Element Method (SBEM), is employed to

  20. Design for progressive fracture in composite shell structures

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon; Murthy, Pappu L. N.

    1992-01-01

    The load carrying capability and structural behavior of composite shell structures and stiffened curved panels are investigated to provide accurate early design loads. An integrated computer code is utilized for the computational simulation of composite structural degradation under practical loading for realistic design. Damage initiation, growth, accumulation, and propagation to structural fracture are included in the simulation. Progressive fracture investigations providing design insight for several classes of composite shells are presented. Results demonstrate the significance of local defects, interfacial regions, and stress concentrations on the structural durability of composite shells.

  1. Progress in multidisciplinary design optimization at NASA Langley

    NASA Technical Reports Server (NTRS)

    Padula, Sharon L.

    1993-01-01

    Multidisciplinary Design Optimization refers to some combination of disciplinary analyses, sensitivity analysis, and optimization techniques used to design complex engineering systems. The ultimate objective of this research at NASA Langley Research Center is to help the US industry reduce the costs associated with development, manufacturing, and maintenance of aerospace vehicles while improving system performance. This report reviews progress towards this objective and highlights topics for future research. Aerospace design problems selected from the author's research illustrate strengths and weaknesses in existing multidisciplinary optimization techniques. The techniques discussed include multiobjective optimization, global sensitivity equations and sequential linear programming.

  2. Structural Design Methodology Based on Concepts of Uncertainty

    NASA Technical Reports Server (NTRS)

    Lin, K. Y.; Du, Jiaji; Rusk, David

    2000-01-01

    In this report, an approach to damage-tolerant aircraft structural design is proposed based on the concept of an equivalent "Level of Safety" that incorporates past service experience in the design of new structures. The discrete "Level of Safety" for a single inspection event is defined as the complement of the probability that a single flaw size larger than the critical flaw size for residual strength of the structure exists, and that the flaw will not be detected. The cumulative "Level of Safety" for the entire structure is the product of the discrete "Level of Safety" values for each flaw of each damage type present at each location in the structure. Based on the definition of "Level of Safety", a design procedure was identified and demonstrated on a composite sandwich panel for various damage types, with results showing the sensitivity of the structural sizing parameters to the relative safety of the design. The "Level of Safety" approach has broad potential application to damage-tolerant aircraft structural design with uncertainty.
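
    The bookkeeping behind the "Level of Safety" definition can be written down directly (the probabilities below are made up for illustration): the discrete level for one inspection event is the complement of the probability that a flaw larger than critical exists and goes undetected, and the cumulative level is the product over every flaw, damage type, and location.

      import numpy as np

      def discrete_level_of_safety(p_flaw_exceeds_critical, p_missed_given_flaw):
          """Complement of P(critical flaw exists AND it is not detected)."""
          return 1.0 - p_flaw_exceeds_critical * p_missed_given_flaw

      # (P(flaw > critical size), P(not detected | such a flaw)) -- hypothetical values
      flaws = [(1e-3, 0.10), (5e-4, 0.25), (2e-3, 0.05)]
      cumulative = np.prod([discrete_level_of_safety(pe, pm) for pe, pm in flaws])
      print(f"Cumulative level of safety: {cumulative:.6f}")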

  3. Methodology Series Module 8: Designing Questionnaires and Clinical Record Forms.

    PubMed

    Setia, Maninder Singh

    2017-01-01

    As researchers, we often collect data on a clinical record form or a questionnaire. It is an important part of study design. If the questionnaire is not well designed, the data collected will not be useful. In this section of the module, we have discussed some practical aspects of designing a questionnaire. It is useful to make a list of all the variables that will be assessed in the study before preparing the questionnaire. The researcher should review all the existing questionnaires. It may be efficient to use an existing standardized questionnaire or scale. Many of these scales are freely available and may be used with an appropriate reference. However, some may be under copyright protection and permissions may be required to use the same questionnaire. While designing their own questionnaire, researchers may use open- or close-ended questions. It is important to design the responses appropriately as the format of responses will influence the analysis. Sometimes, one can collect the same information in multiple ways - continuous or categorical response. Besides these, the researcher can also use visual analog scales or Likert's scale in the questionnaire. Some practical take-home points are: (1) Use specific language while framing the questions; (2) write detailed instructions in the questionnaire; (3) use mutually exclusive response categories; (4) use skip patterns; (5) avoid double-barreled questions; and (6) anchor the time period if required.

  4. Methodology Series Module 8: Designing Questionnaires and Clinical Record Forms

    PubMed Central

    Setia, Maninder Singh

    2017-01-01

    As researchers, we often collect data on a clinical record form or a questionnaire. It is an important part of study design. If the questionnaire is not well designed, the data collected will not be useful. In this section of the module, we have discussed some practical aspects of designing a questionnaire. It is useful to make a list of all the variables that will be assessed in the study before preparing the questionnaire. The researcher should review all the existing questionnaires. It may be efficient to use an existing standardized questionnaire or scale. Many of these scales are freely available and may be used with an appropriate reference. However, some may be under copyright protection and permissions may be required to use the same questionnaire. While designing their own questionnaire, researchers may use open- or close-ended questions. It is important to design the responses appropriately as the format of responses will influence the analysis. Sometimes, one can collect the same information in multiple ways - continuous or categorical response. Besides these, the researcher can also use visual analog scales or Likert's scale in the questionnaire. Some practical take-home points are: (1) Use specific language while framing the questions; (2) write detailed instructions in the questionnaire; (3) use mutually exclusive response categories; (4) use skip patterns; (5) avoid double-barreled questions; and (6) anchor the time period if required. PMID:28400630

  5. New methodology for shaft design based on life expectancy

    NASA Technical Reports Server (NTRS)

    Loewenthal, S. H.

    1986-01-01

    The design of power transmission shafting for reliability has not historically received a great deal of attention. However, weight-sensitive aerospace and vehicle applications, and those where the penalties of shaft failure are great, require greater confidence in shaft design than earlier methods provided. This report summarizes a fatigue strength-based design method for sizing shafts under variable amplitude loading histories for limited or nonlimited service life. Moreover, application factors such as press-fitted collars, shaft size, residual stresses from shot peening or plating, and corrosive environments can be readily accommodated within the framework of the analysis. Examples are given which illustrate the use of the method, pointing out the large life penalties due to occasional cyclic overloads.
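
    The report's full fatigue-strength basis is not reproduced here, but the flavor of sizing under a variable-amplitude history can be sketched with a Miner's-rule damage sum over load blocks. The Basquin S-N constants and the load spectrum below are hypothetical, not values from the report.

```python
def cycles_to_failure(stress_amplitude_mpa, a=900.0, b=-0.085):
    """Basquin-type S-N curve S = a * N**b, inverted for life N (hypothetical constants)."""
    return (stress_amplitude_mpa / a) ** (1.0 / b)


def miners_damage(load_blocks):
    """Miner's-rule damage sum for (stress amplitude [MPa], applied cycles) blocks.
    A sum of 1.0 or more indicates predicted fatigue failure."""
    return sum(n / cycles_to_failure(s) for s, n in load_blocks)


history = [(300.0, 2.0e5), (450.0, 1.0e3), (550.0, 100.0)]  # hypothetical spectrum
print(f"Miner damage sum: {miners_damage(history):.2f}")
# Note how the 100 overload cycles at 550 MPa alone contribute roughly 0.3 of the
# damage, echoing the life penalty from occasional overloads noted above.
```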

  6. Development of probabilistic rigid pavement design methodologies for military airfields

    NASA Astrophysics Data System (ADS)

    Witczak, M. W.; Uzan, J.; Johnson, M.

    1983-12-01

    The current Corps of Engineers design procedure for rigid airfield pavements is based on the Westergaard free edge stress slab theory, and a proposed procedure is based on the multilayer elastic theory. These two design procedures have been expanded to airfield pavement designs expressed in probabilistic and reliability terms. Further developments were required in these procedures to make the analysis more practicable. Two major investigations were conducted: (1) Evaluation and use of the composite modulus of elasticity for layers beneath the rigid pavement, and (2) Evaluation of the maximum tensile stress at the bottom of the slab for different aircraft types. Derivations obtained from the investigation of the composite modulus and maximum tensile stress are reported and are included in computer programs for probabilistic/reliability analysis of rigid pavements. The approximate closed form (Taylor series expansion) is utilized. Example runs of the computer program are presented.
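
    The "approximate closed form (Taylor series expansion)" referred to above is, in essence, a first-order second-moment reliability calculation. A minimal sketch follows, with a generic strength-minus-stress margin and made-up statistics standing in for the pavement variables.

```python
import math


def fosm_reliability(g, means, stds, eps=1e-6):
    """First-order second-moment approximation: linearize g about the means and
    return (reliability index beta, approximate probability of failure)."""
    g0 = g(*means)
    var = 0.0
    for i, (m, s) in enumerate(zip(means, stds)):
        x = list(means)
        x[i] = m + eps
        dg = (g(*x) - g0) / eps          # finite-difference partial derivative
        var += (dg * s) ** 2
    beta = g0 / math.sqrt(var)
    pf = 0.5 * math.erfc(beta / math.sqrt(2.0))   # standard normal tail area
    return beta, pf


margin = lambda strength, stress: strength - stress   # generic performance function
beta, pf = fosm_reliability(margin, means=(5.0, 3.2), stds=(0.6, 0.5))
print(f"beta = {beta:.2f}, probability of failure ~ {pf:.2e}")
```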

  7. Graphene nanopores: electronic transport properties and design methodology.

    PubMed

    Qiu, Wanzhi; Nguyen, Phuong; Skafidas, Efstratios

    2014-01-28

    Graphene nanopores (GNPs) hold great promise as building blocks for electronic circuitry and sensors for biological and chemical sensing applications. Methods to design graphene nanopores that achieve desirable conduction performance and sensing characteristics have not been previously described. Here we present a study of the quantum transport properties of GNPs created by drilling pores in armchair and zigzag graphene ribbons. For the first time, our study reveals that the quantum transmission spectra of GNPs are highly tunable and GNPs with specific transport properties can be produced by properly designing pore shapes. Our investigation shows that the biological sensing capabilities of GNPs are transmission spectrum dependent, can vary dramatically, and are critically dependent on pore geometry. Our study provides design guidelines for creating graphene nanopores with specific transport properties to meet the needs of diverse applications and for developing sensitive biological/chemical sensors with required performance characteristics.

  8. Structural design methodologies for ceramic-based material systems

    NASA Technical Reports Server (NTRS)

    Duffy, Stephen F.; Chulya, Abhisak; Gyekenyesi, John P.

    1991-01-01

    One of the primary pacing items for realizing the full potential of ceramic-based structural components is the development of new design methods and protocols. The focus here is on low temperature, fast-fracture analysis of monolithic, whisker-toughened, laminated, and woven ceramic composites. A number of design models and criteria are highlighted. Public domain computer algorithms, which aid engineers in predicting the fast-fracture reliability of structural components, are mentioned. Emphasis is not placed on evaluating the models, but instead is focused on the issues relevant to the current state of the art.

  9. Local Design Methodologies for a Hierarchic Control Architecture

    DTIC Science & Technology

    1990-04-12

    decentralized manner. This design is based on the concept of the disturbance information "flowing" along the structure as "waves". A control...computer. This design is based on a view of the disturbances "flowing" along a structure as "waves", and the implementation is done in such a way as to...

  10. New Methods in Design Education: The Systemic Methodology and the Use of Sketch in the Conceptual Design Stage

    ERIC Educational Resources Information Center

    Westermeyer, Juan Carlos Briede; Ortuno, Bernabe Hernandis

    2011-01-01

    This study describes the application of a new concurrent product design methodology in the context of industrial design education. The sketch has been used many times as a tool of creative expression, especially in the conceptual design stage, in an intuitive way and somewhat out of the context of the real needs that the…

  11. Design and Methodology of the Korean Early Psychosis Cohort Study

    PubMed Central

    Kim, Sung-Wan; Lee, Bong Ju; Kim, Jung Jin; Yu, Je-Chun; Lee, Kyu Young; Won, Seung-Hee; Lee, Seung-Hwan; Kim, Seung-Hyun; Kang, Shi Hyun

    2017-01-01

    The present study details the rationale and methodology of the Korean Early Psychosis Cohort Study (KEPS), which is a clinical cohort investigation of first episode psychosis patients from a Korean population. The KEPS is a prospective naturalistic observational cohort study that follows the participants for at least 2 years. This study includes patients between 18 and 45 years of age who fulfill the criteria for one of schizophrenia spectrum and other psychotic disorders according to the diagnostic criteria of DSM-5. Early psychosis is defined as first episode patients who received antipsychotic treatment for fewer than 4 consecutive weeks after the onset of illness or stabilized patients in the early stages of the disorder whose duration of illness was less than 2 years from the initiation of antipsychotic treatment. The primary outcome measures are treatment response, remission, recovery, and relapse. Additionally, several laboratory tests are conducted and a variety of objective and subjective psychiatric measures assessing early life trauma, lifestyle pattern, and social and cognitive functioning are administered. This long-term prospective cohort study may contribute to the development of early intervention strategies and the improvement of long-term outcomes in patients with schizophrenia. PMID:28096881

  12. Situated Research Design and Methodological Choices in Formative Program Evaluation

    ERIC Educational Resources Information Center

    Supovitz, Jonathan

    2013-01-01

    Design-based implementation research offers the opportunity to rethink the relationships between intervention, research, and situation to better attune research and evaluation to the program development process. Using a heuristic called the intervention development curve, I describe the rough trajectory that programs typically follow as they…

  13. Kids in the city study: research design and methodology

    PubMed Central

    2011-01-01

    Background Physical activity is essential for optimal physical and psychological health but substantial declines in children's activity levels have occurred in New Zealand and internationally. Children's independent mobility (i.e., outdoor play and traveling to destinations unsupervised), an integral component of physical activity in childhood, has also declined radically in recent decades. Safety-conscious parenting practices, car reliance and auto-centric urban design have converged to produce children living increasingly sedentary lives. This research investigates how urban neighborhood environments can support, enable, or restrict children's independent mobility, thereby influencing physical activity accumulation and participation in daily life. Methods/Design The study is located in six Auckland, New Zealand neighborhoods, diverse in terms of urban design attributes, particularly residential density. Participants comprise 160 children aged 9-11 years and their parents/caregivers. Objective measures (global positioning systems, accelerometers, geographical information systems, observational audits) assessed children's independent mobility and physical activity, neighborhood infrastructure, and streetscape attributes. Parent and child neighborhood perceptions and experiences were assessed using qualitative research methods. Discussion This study is one of the first internationally to examine the association of specific urban design attributes with child independent mobility. Using robust, appropriate, and best practice objective measures, this study provides robust epidemiological information regarding the relationships between the built environment and health outcomes for this population. PMID:21781341

  14. Serration Design Methodology for Wind Turbine Noise Reduction

    NASA Astrophysics Data System (ADS)

    Mathew, J.; Singh, A.; Madsen, J.; Arce León, C.

    2016-09-01

    Trailing edge serrations are today an established method to reduce the aeroacoustic noise from wind turbine blades. In this paper, a brief introduction to the aerodynamic and acoustic design procedure used at LM Wind Power is given. Early field tests on serrations, retrofitted to the turbine blades, gave a preliminary indication of their noise reduction potential. However, a multitude of challenges stand in the way between a proof of concept and a viable commercial product. LM undertook a methodical test and validation procedure to understand the impact of design parameters on serration performance and to quantify the uncertainties associated with the proposed designs. Aerodynamic and acoustic validation tests were carried out in a number of wind tunnel facilities. Models were written to predict the aerodynamic, acoustic and structural performance of the serrations. LM serration designs have evolved over time to address constraints imposed by aero performance, structural reliability, manufacturing and installation. The latest LM serration offering was tested in the field on three different wind turbines. A consistent noise reduction in excess of 1.5 dB was achieved in the field for all three turbines.

  15. Design Based Research Methodology for Teaching with Technology in English

    ERIC Educational Resources Information Center

    Jetnikoff, Anita

    2015-01-01

    Design based research (DBR) is an appropriate method for small scale educational research projects involving collaboration between teachers, students and researchers. It is particularly useful in collaborative projects where an intervention is implemented and evaluated in a grounded context. The intervention can be technological, or a new program…

  16. Design Based Research Methodology for Teaching with Technology in English

    ERIC Educational Resources Information Center

    Jetnikoff, Anita

    2015-01-01

    Design based research (DBR) is an appropriate method for small scale educational research projects involving collaboration between teachers, students and researchers. It is particularly useful in collaborative projects where an intervention is implemented and evaluated in a grounded context. The intervention can be technological, or a new program…

  17. Situated Research Design and Methodological Choices in Formative Program Evaluation

    ERIC Educational Resources Information Center

    Supovitz, Jonathan

    2013-01-01

    Design-based implementation research offers the opportunity to rethink the relationships between intervention, research, and situation to better attune research and evaluation to the program development process. Using a heuristic called the intervention development curve, I describe the rough trajectory that programs typically follow as they…

  18. Optimum design criteria for a synchronous reluctance motor with concentrated winding using response surface methodology

    NASA Astrophysics Data System (ADS)

    Lee, Jung-Ho; Park, Seong-June; Jeon, Su-Jin

    2006-04-01

    This paper presents an optimization procedure using response surface methodology (RSM) to determine design parameters for reducing torque ripple. The RSM combines the experimental design method with the finite element method and is well adapted to building an analytical model for a complex problem involving many interactions among design variables.
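
    A minimal sketch of the response-surface step follows: run a small design of experiments (a hypothetical torque_ripple() function stands in for the finite element solver), fit a quadratic surface by least squares, and take its stationary point as the candidate optimum.

```python
import numpy as np


def torque_ripple(x1, x2):
    """Hypothetical stand-in for a finite element evaluation of torque ripple."""
    return 2.0 + (x1 - 0.3) ** 2 + 0.5 * (x2 + 0.2) ** 2 + 0.2 * x1 * x2


# Three-level full factorial design in coded design variables
pts = np.array([(a, b) for a in (-1, 0, 1) for b in (-1, 0, 1)], dtype=float)
y = np.array([torque_ripple(a, b) for a, b in pts])

# Quadratic response surface: y ~ b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
X = np.column_stack([np.ones(len(pts)), pts[:, 0], pts[:, 1],
                     pts[:, 0] ** 2, pts[:, 1] ** 2, pts[:, 0] * pts[:, 1]])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Stationary point of the fitted surface (candidate low-ripple design)
H = np.array([[2.0 * coef[3], coef[5]], [coef[5], 2.0 * coef[4]]])
x_opt = np.linalg.solve(H, -np.array([coef[1], coef[2]]))
print("fitted optimum (coded design variables):", x_opt)
```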

  19. A wing design methodology for low-boom low-drag supersonic business jet

    NASA Astrophysics Data System (ADS)

    Le, Daniel B.

    2009-12-01

    The arguably most critical hindrance to the successful development of a commercial supersonic aircraft is the impact of the sonic boom signature. The sonic boom signature of a supersonic aircraft is predicted using sonic boom theory, which formulates a relationship between the complex three-dimensional geometry of the aircraft and the pressure distribution, and decomposes the geometry in terms of simple geometrical components. The supersonic aircraft design process is typically based on boom minimization theory. This theory provides a theoretical equivalent area distribution which should be matched by the conceptual design in order to achieve the pre-determined sonic boom signature. The difference between the target equivalent area distribution and the actual equivalent area distribution is referred to here as the gap distribution. The primary intent of this dissertation is to provide the designer with a systematic and structured approach to designing the aircraft wings with limited changes to the baseline concept while achieving critical design goals. Without such structure, the design process can easily become overwhelming, and the effectiveness of individual design changes can be difficult to evaluate. The wing design is decoupled into two separate processes, one focused on the planform design and the other on the camber design. Moreover, this design methodology supplements the designer by allowing trade studies to be conducted between important design parameters and objectives. The wing planform design methodology incorporates a continuous gradient-based optimization scheme to supplement the design process. This is not meant to substitute for the vast amount of knowledge and design decisions that are needed for a successful design. Instead, the numerical optimization helps the designer to refine creative concepts. Last, this dissertation integrates a risk mitigation scheme throughout the wing design process. The design methodology implements minimal design changes to the wing geometry while achieving the target design goals.
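
    The gap distribution lends itself to a very small sketch: subtract the concept's equivalent-area distribution from the target distribution supplied by boom minimization theory. Both distributions below are hypothetical placeholders, not data from the dissertation.

```python
import numpy as np

x = np.linspace(0.0, 40.0, 9)                         # longitudinal station, m
target_ae = 0.05 * x ** 1.5                           # target equivalent area, m^2
actual_ae = 0.05 * x ** 1.5 + 0.3 * np.sin(x / 6.0)   # concept's equivalent area, m^2

gap = target_ae - actual_ae                           # gap distribution to be closed
worst = x[np.argmax(np.abs(gap))]
print(f"largest |gap| = {np.max(np.abs(gap)):.3f} m^2 at station x = {worst:.1f} m")
```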

  20. A general methodology and applications for conduction-like flow-channel design.

    SciTech Connect

    Cummings, Eric B.; Fiechtner, Gregory J.

    2004-02-01

    A novel design methodology is developed for creating conduction devices in which fields are piecewise uniform. This methodology allows the normally analytically intractable problem of Lagrangian transport to be solved using algebraic and trigonometric equations. Low-dispersion turns, manifolds, and expansions are developed. In this methodology, regions of piece-wise constant specific permeability (permeability per unit width) border each other with straight, generally tilted interfaces. The fields within each region are made uniform by satisfying a simple compatibility relation between the tilt angle and ratio of specific permeability of adjacent regions. This methodology has particular promise in the rational design of quasi-planar devices, in which the specific permeability is proportional to the depth of the channel. For such devices, the methodology can be implemented by connecting channel facets having two or more depths, fabricated, e.g., using a simple two-etch process.
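
    The compatibility relation itself is not quoted in the abstract; the sketch below assumes the standard tangent-law refraction condition for a uniform field crossing a straight interface between regions of specific permeability k1 and k2 (tan t1 / tan t2 = k1 / k2, angles measured from the interface normal), which is the kind of algebraic condition the methodology relies on. The paper's own relation may differ in detail.

```python
import math


def downstream_angle_deg(theta1_deg, k1, k2):
    """Field-line angle in region 2 for a uniform field arriving at theta1 in
    region 1, assuming the tangent refraction law tan(t1)/tan(t2) = k1/k2."""
    t1 = math.radians(theta1_deg)
    return math.degrees(math.atan(math.tan(t1) * k2 / k1))


# Example: a 2:1 step in channel depth (specific permeability) in a quasi-planar device
print(f"30.0 deg incoming -> {downstream_angle_deg(30.0, 1.0, 2.0):.1f} deg downstream")
```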

  1. Designing a Methodology for Future Air Travel Scenarios

    NASA Technical Reports Server (NTRS)

    Wuebbles, Donald J.; Baughcum, Steven L.; Gerstle, John H.; Edmonds, Jae; Kinnison, Douglas E.; Krull, Nick; Metwally, Munir; Mortlock, Alan; Prather, Michael J.

    1992-01-01

    -subsonic future fleet. The methodology, procedures, and recommendations for the development of future HSCT and the subsonic fleet scenarios used for this evaluation are discussed.

  2. A Cybernetic Design Methodology for 'Intelligent' Online Learning Support

    NASA Astrophysics Data System (ADS)

    Quinton, Stephen R.

    The World Wide Web (WWW) provides learners and knowledge workers convenient access to vast stores of information, so much so that present methods for refinement of a query or search result are inadequate - there is far too much potentially useful material. The problem often encountered is that users usually do not recognise what may be useful until they have progressed some way through the discovery, learning, and knowledge acquisition process. Additional support is needed to structure and identify potentially relevant information, and to provide constructive feedback. In short, support for learning is needed. The learning envisioned here is not simply the capacity to recall facts or to recognise objects. The focus is on learning that results in the construction of knowledge. Although most online learning platforms are efficient at delivering information, most do not provide tools that support learning as envisaged in this chapter. It is conceivable that Web-based learning environments can incorporate software systems that assist learners to form new associations between concepts and synthesise information to create new knowledge. This chapter details the rationale and theory behind a research study that aims to evolve Web-based learning environments into 'intelligent thinking' systems that respond to natural language human input. Rather than functioning simply as a means of delivering information, it is argued that online learning solutions will one day interact directly with students to support their conceptual thinking and cognitive development.

  3. Memristor-Based Computing Architecture: Design Methodologies and Circuit Techniques

    DTIC Science & Technology

    2013-03-01

    simply consists of an NMOS transistor (Q) and a memristor. When the input Vin is low, the transistor Q is turned off. Thus, the output Vout is...connected to ground through the memristor. Conversely, when Vin is high, turning Q on, the memristance M and the equivalent transistor resistance (RQ...synapse design was dependent on the equivalent resistance (effectively, the size) of the Q transistor (RQ). A larger Q would offer a wider range of Vout

  4. Methodological Foundations for Designing Intelligent Computer-Based Training

    DTIC Science & Technology

    1991-09-03

    metacognitive processes (Derry, in press). To this list, we would add that FSM techniques need not be restricted to intelligent tutoring systems. As Chin (1989...systems (pp. 313-333). New York: Springer-Verlag. Derry, S. (in press). Metacognitive models of learning and instructional systems design. In P.H. Winne...

  5. Kids in the city study: research design and methodology.

    PubMed

    Oliver, Melody; Witten, Karen; Kearns, Robin A; Mavoa, Suzanne; Badland, Hannah M; Carroll, Penelope; Drumheller, Chelsea; Tavae, Nicola; Asiasiga, Lanuola; Jelley, Su; Kaiwai, Hector; Opit, Simon; Lin, En-Yi Judy; Sweetsur, Paul; Barnes, Helen Moewaka; Mason, Nic; Ergler, Christina

    2011-07-24

    Physical activity is essential for optimal physical and psychological health but substantial declines in children's activity levels have occurred in New Zealand and internationally. Children's independent mobility (i.e., outdoor play and traveling to destinations unsupervised), an integral component of physical activity in childhood, has also declined radically in recent decades. Safety-conscious parenting practices, car reliance and auto-centric urban design have converged to produce children living increasingly sedentary lives. This research investigates how urban neighborhood environments can support, enable, or restrict children's independent mobility, thereby influencing physical activity accumulation and participation in daily life. The study is located in six Auckland, New Zealand neighborhoods, diverse in terms of urban design attributes, particularly residential density. Participants comprise 160 children aged 9-11 years and their parents/caregivers. Objective measures (global positioning systems, accelerometers, geographical information systems, observational audits) assessed children's independent mobility and physical activity, neighborhood infrastructure, and streetscape attributes. Parent and child neighborhood perceptions and experiences were assessed using qualitative research methods. This study is one of the first internationally to examine the association of specific urban design attributes with child independent mobility. Using robust, appropriate, and best practice objective measures, this study provides robust epidemiological information regarding the relationships between the built environment and health outcomes for this population.

  6. One Controller at a Time (1-CAT): A mimo design methodology

    NASA Technical Reports Server (NTRS)

    Mitchell, J. R.; Lucas, J. C.

    1987-01-01

    The One Controller at a Time (1-CAT) methodology for designing digital controllers for Large Space Structures (LSS's) is introduced and illustrated. The flexible mode problem is first discussed. Next, desirable features of a LSS control system design methodology are delineated. The 1-CAT approach is presented, along with an analytical technique for carrying out the 1-CAT process. Next, 1-CAT is used to design digital controllers for the proposed Space Based Laser (SBL). Finally, the SBL design is evaluated for dynamical performance, noise rejection, and robustness.

  7. Application of an integrated flight/propulsion control design methodology to a STOVL aircraft

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay; Mattern, Duane L.

    1991-01-01

    The application of an emerging Integrated Flight/Propulsion Control design methodology to a STOVL aircraft in transition flight is reported. The methodology steps consist of: (1) design of a centralized feedback controller to provide command tracking and stability and performance robustness considering the fully integrated airframe/propulsion model as one high-order system; (2) partition of the centralized controller into a decentralized, hierarchical form compatible with implementation requirements; and (3) design of command shaping prefilters from pilot control effectors to commanded variables to provide the overall desired response to pilot inputs. Intermediate design results using this methodology are presented, the complete point control design with the propulsion system operating schedule and limit protection logic included is evaluated for sample pilot control inputs, and the response is compared with that of an 'ideal response model' derived from Level I handling qualities requirements.

  8. Development of a combustor analytical design methodology for liquid rocket engines

    NASA Technical Reports Server (NTRS)

    Pieper, Jerry L.; Muss, Jeff

    1989-01-01

    The development of a user-friendly computerized methodology for the design and analysis of liquid propellant rocket engine combustion chambers is described. An overview of the methodology, consisting of a computer program containing an appropriate modular assembly of existing industry wide performance and combustion stability models, is presented. These models are linked with an interactive front end processor enabling the user to define the performance and stability traits of an existing design (point analysis) or to create the essential design features of a combustor to meet specific performance goals and combustion stability (point design). Plans for demonstration and verification of this methodology are also presented. These plans include the creation of combustor designs using the methodology, together with predictions of the performance and combustion stability for each design. A verification test program of 26 hot fire tests with up to four designs created using this methodology is described. This testing is planned using LOX/RP-1 propellants with a thrust level of approx. 220,000 N (50,000 lbf).

  9. Software Design Methodology Migration for a Distributed Ground System

    NASA Technical Reports Server (NTRS)

    Ritter, George; McNair, Ann R. (Technical Monitor)

    2002-01-01

    The Marshall Space Flight Center's (MSFC) Payload Operations Center (POC) ground system has been developed and has evolved over a period of about 10 years. During this time the software processes have migrated from more traditional to more contemporary development processes. The new software processes still emphasize requirements capture, software configuration management, design documenting, and making sure the products that have been developed are accountable to initial requirements. This paper will give an overview of how the software processes have evolved, highlighting the positives as well as the negatives. In addition, we will mention the COTS tools that have been integrated into the processes and how these COTS tools have provided value to the project.

  10. Systematic Controller Design Methodology for Variable-Speed Wind Turbines

    SciTech Connect

    Hand, M. M.; Balas, M. J.

    2002-02-01

    Variable-speed, horizontal axis wind turbines use blade-pitch control to meet specified objectives for three operational regions. This paper provides a guide for controller design for the constant power production regime. A simple, rigid, non-linear turbine model was used to systematically perform trade-off studies between two performance metrics. Minimization of both the deviation of the rotor speed from the desired speed and the motion of the actuator is desired. The robust nature of the proportional-integral-derivative controller is illustrated, and optimal operating conditions are determined. Because numerous simulation runs may be completed in a short time, the relationship between the two opposing metrics is easily visualized.
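
    A crude illustration of such a trade-off study is sketched below: a one-state rigid rotor with a PI pitch loop is simulated for several gain pairs, and the two opposing metrics (rotor-speed deviation and pitch-actuator motion) are tabulated. The model, numbers and gains are hypothetical, not the paper's turbine or controller.

```python
import math


def simulate(kp, ki, dt=0.02, t_end=400.0):
    """Simulate a toy rigid rotor under PI pitch control; return the RMS rotor-speed
    error and the RMS pitch rate (the two opposing performance metrics)."""
    j_rotor = 4.0e6          # rotor inertia, kg m^2 (hypothetical)
    rated = 1.8              # rated rotor speed, rad/s
    k_beta = 5.0e5           # aerodynamic torque change per degree of pitch, N m/deg
    omega, pitch, integ = rated, 0.0, 0.0
    se = sr = 0.0
    for i in range(int(t_end / dt)):
        t = i * dt
        gust = 2.0e5 * math.sin(0.1 * t)           # slow aerodynamic torque disturbance
        noise = 0.01 * math.sin(5.0 * t)           # stand-in for speed-sensor noise
        err = (omega - rated) + noise              # measured speed error
        integ += err * dt
        new_pitch = kp * err + ki * integ          # PI pitch command, deg
        rate = (new_pitch - pitch) / dt            # pitch actuator rate, deg/s
        pitch = new_pitch
        omega += (gust - k_beta * pitch) / j_rotor * dt   # rigid one-state rotor
        se += (omega - rated) ** 2 * dt
        sr += rate ** 2 * dt
    return math.sqrt(se / t_end), math.sqrt(sr / t_end)


for kp, ki in [(5.0, 0.5), (20.0, 2.0), (60.0, 6.0)]:
    e, r = simulate(kp, ki)
    print(f"kp={kp:5.1f}  speed RMS error={e:.3f} rad/s   pitch-rate RMS={r:.2f} deg/s")
```

    Running the loop over many gain pairs gives exactly the kind of quickly generated table from which the relationship between the two opposing metrics can be visualized.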

  11. Model free audit methodology for bias evaluation of tumour progression in oncology.

    PubMed

    Stone, Andrew; Macpherson, Euan; Smith, Ann; Jennison, Christopher

    2015-01-01

    Many oncology studies incorporate a blinded independent central review (BICR) to make an assessment of the integrity of the primary endpoint, progression free survival. Recently, it has been suggested that, in order to assess the potential for bias amongst investigators, a BICR amongst only a sample of patients could be performed; if evidence of bias is detected, according to a predefined threshold, the BICR is then assessed in all patients, otherwise, it is concluded that the sample was sufficient to rule out meaningful levels of bias. In this paper, we present an approach that adapts a method originally created for defining futility bounds in group sequential designs. The hazard ratio ratio, the ratio of the hazard ratio (HR) for the treatment effect estimated from the BICR to the corresponding HR for the investigator assessments, is used as the metric to define bias. The approach is simple to implement and ensures a high probability that a substantial true bias will be detected. In the absence of bias, there is a high probability of accepting the accuracy of local evaluations based on the sample, in which case an expensive BICR of all patients is avoided. The properties of the approach are demonstrated by retrospective application to a completed Phase III trial in colorectal cancer. The same approach could easily be adapted for other disease settings, and for test statistics other than the hazard ratio.
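
    The decision rule reduces to a one-line comparison, sketched below with hypothetical hazard ratios and an illustrative threshold; the paper derives its threshold by adapting group-sequential futility-bound methods, which is not reproduced here.

```python
def hazard_ratio_ratio(hr_bicr, hr_investigator):
    """Bias metric: BICR hazard ratio divided by investigator-assessed hazard ratio."""
    return hr_bicr / hr_investigator


def bias_audit(hr_bicr, hr_investigator, threshold=1.15):
    """Compare the hazard ratio ratio on the audited sample with a pre-specified
    threshold (the threshold value here is a hypothetical placeholder)."""
    hrr = hazard_ratio_ratio(hr_bicr, hr_investigator)
    verdict = ("extend BICR to all patients"
               if hrr > threshold
               else "sample review sufficient; no meaningful bias signal")
    return hrr, verdict


hrr, verdict = bias_audit(hr_bicr=0.78, hr_investigator=0.72)
print(f"HRR = {hrr:.3f} -> {verdict}")
```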

  12. The progression of burn depth in experimental burns: a histological and methodological study.

    PubMed

    Papp, A; Kiraly, K; Härmä, M; Lahtinen, T; Uusaro, A; Alhava, E

    2004-11-01

    This study was designed to create a reproducible model for experimental burn wound research in pigs. Previously, the thicker paraspinal skin has been used. We used the more human-like ventral skin to create burns of different depths. Contact burns were created on 11 pigs using a brass plate heated to 100 degrees C in boiling water. Different contact times were used to create burns of different depths. In pigs 1-6, the follow-up time was 72 h and in pigs 7-11 24 h. Burn depth was determined by histology. Histologically, samples were classified into five anatomical layers: epidermis, upper one-third of the dermis, middle third of the dermis, deepest third of the dermis and subcutaneous fat. The locations of both thromboses and burn marks were evaluated. The 1 s contact time led to a superficial thermal injury, 3 s to a partial-thickness and 9 s to a full-thickness injury. A progression of burn depth was found until 48 h post-injury. The intra-observer correlation after repeated histological analyses of burn depths by the same histopathologist and the repeatability of burn depth creation yielded kappa coefficients of 0.83 and 0.92, respectively. A reproducible burn model for further research purposes was thus obtained.
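
    The agreement statistics quoted above are kappa coefficients. A minimal unweighted (Cohen's) kappa computation is sketched below on a made-up 5x5 confusion matrix over the five anatomical depth layers; the study may have used a weighted variant, and these are not the study's data.

```python
def cohens_kappa(confusion):
    """Unweighted Cohen's kappa from a square confusion matrix (lists of counts)."""
    n = sum(sum(row) for row in confusion)
    p_observed = sum(confusion[i][i] for i in range(len(confusion))) / n
    p_expected = sum((sum(confusion[i]) / n) * (sum(row[i] for row in confusion) / n)
                     for i in range(len(confusion)))
    return (p_observed - p_expected) / (1.0 - p_expected)


# Rows: first reading, columns: repeated reading (five depth layers); illustrative counts
confusion = [
    [10, 1, 0, 0, 0],
    [1, 12, 1, 0, 0],
    [0, 1, 9, 1, 0],
    [0, 0, 1, 8, 1],
    [0, 0, 0, 1, 7],
]
print(f"kappa = {cohens_kappa(confusion):.2f}")
```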

  13. Weibull-Based Design Methodology for Rotating Aircraft Engine Structures

    NASA Technical Reports Server (NTRS)

    Zaretsky, Erwin; Hendricks, Robert C.; Soditus, Sherry

    2002-01-01

    The NASA Energy Efficient Engine (E3-Engine) is used as the basis of a Weibull-based life and reliability analysis. Each component's life, and thus the engine's life, is defined by high-cycle fatigue (HCF) or low-cycle fatigue (LCF). Knowing the cumulative life distribution of each of the components making up the engine, as represented by a Weibull slope, is a prerequisite to predicting the life and reliability of the entire engine. As the engine Weibull slope increases, the predicted lives decrease. The predicted engine lives L5 (95% probability of survival) of approximately 17,000 and 32,000 hr do correlate with current engine maintenance practices without and with refurbishment, respectively. The individual high pressure turbine (HPT) blade lives necessary to obtain a blade system life L0.1 (99.9% probability of survival) of 9000 hr for Weibull slopes of 3, 6, and 9 are 47,391, 20,652, and 15,658 hr, respectively. For a design life of the HPT disks having probable points of failure equal to or greater than 36,000 hr at a probability of survival of 99.9%, the predicted disk system life L0.1 can vary from 9,408 to 24,911 hr.
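
    The blade numbers follow from the weakest-link scaling of Weibull lives: for a series system of n identical components, the life at a given survival probability scales as n to the power -1/beta. The sketch below assumes that relation and an illustrative blade count of 146, which is not stated in the abstract; with that count the required individual blade lives come out close to the values quoted above.

```python
def component_life_for_system_life(system_life, n_components, weibull_slope):
    """Life each of n identical components must meet (at the same survival
    probability) so that the series system achieves system_life. Follows from
    R_sys(t) = R_comp(t)**n with a Weibull reliability R(t) = exp(-(t/eta)**beta)."""
    return system_life * n_components ** (1.0 / weibull_slope)


# Illustrative only: a hypothetical stage of 146 blades and a target blade-system
# L0.1 (99.9% survival) of 9000 hr, as in the abstract.
for beta in (3, 6, 9):
    required = component_life_for_system_life(9000.0, 146, beta)
    print(f"Weibull slope {beta}: required individual blade L0.1 ~ {required:,.0f} hr")
```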

  14. Design Methodology for Low-Speed Variable Reluctance Motors

    NASA Astrophysics Data System (ADS)

    Suriano, John Riden

    Lowering the gear reduction in actuators by utilizing high-torque low-speed motors enables the use of less expensive and simpler gear systems and decreases the overall system inertia. Variable reluctance machines can produce high torque at low speeds. Their static torque, a critical quantity for determination of low speed operation, is compared for three variable reluctance motor design variations using linear analysis. Saturation effects, which are crucial to the accurate determination of static torque, are modeled using a dual energy technique first proposed by Lord Rayleigh. Dual energy techniques utilizing flux tubes and magnetomotive force slices are developed into a numerical method for predicting nonlinear three-dimensional magnetostatic field parameters. The dual energy method offers a compromise between the accurate but laborious finite element method and the speed of simplified lumped parameter magnetic circuit calculations. A two-dimensional dual energy model of a variable reluctance motor is developed. Results of calculations on a 4 kW Oulton machine are compared to measurements and other calculation methods. Finally, as a demonstration, the model is used to evaluate two competing variable reluctance motors for use as replacements for a DC windshield wiper motor.

  15. Exercise for Methamphetamine Dependence: Rationale, Design, and Methodology

    PubMed Central

    Mooney, Larissa J.; Cooper, Christopher; London, Edythe; Chudzynski, Joy; Dolezal, Brett; Dickerson, Daniel; Brecht, Mary-Lynn; Penante, Jose; Rawson, Richard A.

    2015-01-01

    Background Effective pharmacotherapies to treat methamphetamine (MA) dependence have not been identified, and behavioral therapies are marginally effective. Based on behavioral studies demonstrating the potential efficacy of aerobic exercise for improving depressive symptoms, anxiety, cognitive deficits, and substance use outcomes, the study described here is examining exercise as a potential treatment for MA-dependent individuals. Methods This study is randomizing 150 participants with MA dependence at a residential treatment facility for addictive disorders to receive either a thrice-weekly structured aerobic and resistance exercise intervention or a health education condition. Recruitment commenced in March, 2010. Enrollment and follow-up phases are ongoing, and recruitment is exceeding targeted enrollment rates. Conclusions Seeking evidence for a possibly effective adjunct to traditional behavioral approaches for treatment of MA dependence, this study is assessing the ability of an 8-week aerobic and resistance exercise protocol to reduce relapse to MA use during a 12-week follow-up period after discharge from residential-based treatment. The study also is evaluating improvements in health and functional outcomes during and after the protocol. This paper describes the design and methods of the study. PMID:24291456

  16. Thermo-mechanical Design Methodology for ITER Cryodistribution cold boxes

    NASA Astrophysics Data System (ADS)

    Shukla, Vinit; Patel, Pratik; Das, Jotirmoy; Vaghela, Hitensinh; Bhattacharya, Ritendra; Shah, Nitin; Choukekar, Ketan; Chang, Hyun-Sik; Sarkar, Biswanath

    2017-04-01

    The ITER cryo-distribution (CD) system is in charge of proper distribution of the cryogen at the required mass flow rate, pressure and temperature level to the users, namely the superconducting (SC) magnets and cryopumps (CPs). The CD system is also capable of using the magnet structures as a thermal buffer in order to operate the cryo-plant as much as possible at a steady state condition. A typical CD cold box is equipped mainly with a liquid helium (LHe) bath, heat exchangers (HXs), cryogenic valves, a filter, heaters, a cold circulator, a cold compressor and process piping. The various load combinations which are likely to occur during the life cycle of the CD cold boxes are imposed on a representative model and their impacts on the system are analyzed. This study shows that a break of insulation vacuum during nominal operation (NO) together with a seismic event (Seismic Level-2) is the most stringent load combination, with a maximum stress of 224 MPa. However, the NO+SMHV (Séismes Maximaux Historiquement Vraisemblables = Maximum Historically Probable Earthquakes) load combination has the least safety margin and will form the basis of the design of the CD system and its sub-components. This paper presents and compares the results of different load combinations which are likely to occur on a typical CD cold box.

  17. Partnerships for the Design, Conduct, and Analysis of Effectiveness, and Implementation Research: Experiences of the Prevention Science and Methodology Group

    PubMed Central

    Brown, C. Hendricks; Kellam, Sheppard G.; Kaupert, Sheila; Muthén, Bengt O.; Wang, Wei; Muthén, Linda K.; Chamberlain, Patricia; PoVey, Craig L.; Cady, Rick; Valente, Thomas W.; Ogihara, Mitsunori; Prado, Guillermo J.; Pantin, Hilda M.; Gallo, Carlos G.; Szapocznik, José; Czaja, Sara J.; McManus, John W.

    2012-01-01

    What progress prevention research has made comes through strategic partnerships with communities and institutions that host this research, as well as professional and practice networks that facilitate the diffusion of knowledge about prevention. We discuss partnership issues related to the design, analysis, and implementation of prevention research and especially how rigorous designs, including random assignment, get resolved through a partnership between community stakeholders, institutions, and researchers. These partnerships shape not only study design, but they determine the data that can be collected and how results and new methods are disseminated. We also examine a second type of partnership to improve the implementation of effective prevention programs into practice. We draw on social networks to study partnership formation and function. The experience of the Prevention Science and Methodology Group, which itself is a networked partnership between scientists and methodologists, is highlighted. PMID:22160786

  18. Partnerships for the design, conduct, and analysis of effectiveness, and implementation research: experiences of the prevention science and methodology group.

    PubMed

    Brown, C Hendricks; Kellam, Sheppard G; Kaupert, Sheila; Muthén, Bengt O; Wang, Wei; Muthén, Linda K; Chamberlain, Patricia; PoVey, Craig L; Cady, Rick; Valente, Thomas W; Ogihara, Mitsunori; Prado, Guillermo J; Pantin, Hilda M; Gallo, Carlos G; Szapocznik, José; Czaja, Sara J; McManus, John W

    2012-07-01

    What progress prevention research has made comes through strategic partnerships with communities and institutions that host this research, as well as professional and practice networks that facilitate the diffusion of knowledge about prevention. We discuss partnership issues related to the design, analysis, and implementation of prevention research and especially how rigorous designs, including random assignment, get resolved through a partnership between community stakeholders, institutions, and researchers. These partnerships shape not only study design, but they determine the data that can be collected and how results and new methods are disseminated. We also examine a second type of partnership to improve the implementation of effective prevention programs into practice. We draw on social networks to study partnership formation and function. The experience of the Prevention Science and Methodology Group, which itself is a networked partnership between scientists and methodologists, is highlighted.

  19. Design-Based Research: Is This a Suitable Methodology for Short-Term Projects?

    ERIC Educational Resources Information Center

    Pool, Jessica; Laubscher, Dorothy

    2016-01-01

    This article reports on a design-based methodology of a thesis in which a fully face-to-face contact module was converted into a blended learning course. The purpose of the article is to report on how design-based phases, in the form of micro-, meso- and macro-cycles were applied to improve practice and to generate design principles. Design-based…

  20. Design-Based Research: Is This a Suitable Methodology for Short-Term Projects?

    ERIC Educational Resources Information Center

    Pool, Jessica; Laubscher, Dorothy

    2016-01-01

    This article reports on a design-based methodology of a thesis in which a fully face-to-face contact module was converted into a blended learning course. The purpose of the article is to report on how design-based phases, in the form of micro-, meso- and macro-cycles were applied to improve practice and to generate design principles. Design-based…

  1. A design methodology for evolutionary air transportation networks

    NASA Astrophysics Data System (ADS)

    Yang, Eunsuk

    The air transportation demand at large hubs in the U.S. is anticipated to double in the near future. Current runway construction plans at selected airports can relieve some capacity and delay problems, but many are doubtful that this solution is sufficient to accommodate the anticipated demand growth in the National Airspace System (NAS). With the worsening congestion problem, it is imperative to seek alternative solutions other than costly runway constructions. In this respect, many researchers and organizations have been building models and performing analyses of the NAS. However, the complexity and size of the problem result in an overwhelming task for transportation system modelers. This research seeks to compose an active design algorithm for an evolutionary airline network model so as to include network-specific control properties. An airline network designer, referred to as a network architect, can use this tool to assess the possibilities of gaining more capacity by changing the network configuration. Since the Airline Deregulation Act of 1978, the airline service network has evolved into a distinct Hub-and-Spoke (H&S) network. Enplanement demand on the H&S network is the sum of Origin-Destination (O-D) demand and transfer demand. Even though the flight or enplanement demand is a function of O-D demand and passenger routings on the airline network, the distinction between enplanement and O-D demand is not often made. Instead, many current demand forecasting practices are based on scale-ups from the enplanements, which include the demand to and from transferring network hubs. Based on this research, it was found that the current demand prediction practice can be improved by dissecting enplanements further into smaller pieces of information. As a result, enplanement demand is decomposed into intrinsic and variable parts. The proposed intrinsic demand model is based on the concept of 'true' O-D demand which includes the direction of each round trip
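
    The distinction between enplanements and O-D demand can be made concrete with a small sketch: enplanements at an airport are the sum of true O-D boardings plus boardings generated by hub routings. The O-D table and routings below are hypothetical.

```python
from collections import defaultdict

# Hypothetical O-D demand (passengers per day) and routings through hub "H"
od_demand = {("A", "C"): 120, ("A", "D"): 80, ("B", "C"): 60}
routings = {("A", "C"): ["A", "H", "C"],     # connects over the hub
            ("A", "D"): ["A", "D"],          # non-stop
            ("B", "C"): ["B", "H", "C"]}     # connects over the hub

enplanements = defaultdict(int)
for pair, pax in od_demand.items():
    path = routings[pair]
    for leg_origin in path[:-1]:             # one enplanement per flight leg boarded
        enplanements[leg_origin] += pax

for airport in sorted(enplanements):
    print(airport, enplanements[airport])
# Note how hub H records 180 enplanements despite having zero originating O-D demand,
# which is why scale-ups from enplanements overstate true O-D demand at hubs.
```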

  2. Cocaine use reduction with buprenorphine (CURB): rationale, design, and methodology.

    PubMed

    Mooney, Larissa J; Nielsen, Suzanne; Saxon, Andrew; Hillhouse, Maureen; Thomas, Christie; Hasson, Albert; Stablein, Don; McCormack, Jennifer; Lindblad, Robert; Ling, Walter

    2013-03-01

    Effective medications to treat cocaine dependence have not been identified. Recent pharmacotherapy trials demonstrate the potential efficacy of buprenorphine (BUP) (alone or with naltrexone) for reducing cocaine use. The National Institute on Drug Abuse Clinical Trials Network (CTN) launched the Cocaine Use Reduction with Buprenorphine (CURB) investigation to examine the safety and efficacy of sublingual BUP (as Suboxone®) in the presence of extended-release injectable naltrexone (XR-NTX, as Vivitrol®) for the treatment of cocaine dependence. This paper describes the design and rationale for this study. This multi-site, double-blind, placebo-controlled study will randomize 300 participants across 11 sites. Participants must meet the DSM-IV criteria for cocaine dependence and past or current opioid dependence or abuse. Participants are inducted onto XR-NTX after self-reporting at least 7 days of abstinence from opioids and tolerating a naloxone challenge followed by oral naltrexone and are then randomly assigned to one of three medication conditions (4 mg BUP, 16 mg BUP, or placebo) for 8 weeks. Participants receive a second injection of XR-NTX 4 weeks after the initial injection, and follow-up visits are scheduled at 1 and 3 months post-treatment. Participants receive weekly cognitive behavioral therapy (CBT). Recruitment commenced in September, 2011. Enrollment, active medication, and follow-up phases are ongoing, and recruitment is exceeding targeted enrollment rates. This research using 2 medications will demonstrate whether BUP, administered in the presence of XR-NTX, reduces cocaine use in adults with cocaine dependence and opioid use disorders and will demonstrate if XR-NTX prevents development of physiologic dependence on BUP. Copyright © 2012 Elsevier Inc. All rights reserved.

  3. Advanced piloted aircraft flight control system design methodology. Volume 2: The FCX flight control design expert system

    NASA Technical Reports Server (NTRS)

    Myers, Thomas T.; Mcruer, Duane T.

    1988-01-01

    The development of a comprehensive and eclectic methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. The FCX expert system as presently developed is only a limited prototype capable of supporting basic lateral-directional FCS design activities related to the design example used. FCX presently supports design of only one FCS architecture (yaw damper plus roll damper) and the rules are largely focused on Class IV (highly maneuverable) aircraft. Despite this limited scope, the major elements which appear necessary for application of knowledge-based software concepts to flight control design were assembled and thus FCX represents a prototype which can be tested, critiqued and evolved in an ongoing process of development.

  4. Object-oriented analysis and design: a methodology for modeling the computer-based patient record.

    PubMed

    Egyhazy, C J; Eyestone, S M; Martino, J; Hodgson, C L

    1998-08-01

    The article highlights the importance of an object-oriented analysis and design (OOAD) methodology for the computer-based patient record (CPR) in the military environment. Many OOAD methodologies do not adequately scale up, allow for efficient reuse of their products, or accommodate legacy systems. A methodology that addresses these issues is formulated and used to demonstrate its applicability in a large-scale health care service system. During a period of 6 months, a team of object modelers and domain experts formulated an OOAD methodology tailored to the Department of Defense Military Health System and used it to produce components of an object model for simple order processing. This methodology and the lessons learned during its implementation are described. This approach is necessary to achieve broad interoperability among heterogeneous automated information systems.

  5. Progressive addition lens design by optimizing NURBS surface

    NASA Astrophysics Data System (ADS)

    Liu, Yen-Liang; Hsu, Wei-Yao; Cheng, Yuan-Chieh; Su, Guo-Dung

    2011-10-01

    Progressive addition lenses (PALs) are used to compensate for presbyopia, which is induced by the loss of accommodation in older eyes. Such eyes need different optical powers, provided by eyeglasses, when viewing objects at different distances: a smaller optical power is required for far distances and a larger one in the near zone. A progressive addition lens provides these different power requirements in one piece of lens. This paper introduces the whole process of PAL production, from design and fabrication to measurement. The PAL is designed by optimizing a NURBS surface. Parameters of the merit function are adjusted to design lenses with different specifications. The simulation results confirm that the power is distributed as expected and that cylinder is kept to an acceptable level. In addition, sample lenses have been fabricated and measured. We apply precision machining to produce the molds for plastic injection. Then, the samples are produced by injecting polycarbonate into the molds. Finally, an Ultra Accuracy 3D Profilemeter is used to measure the sample PALs. Practical examination shows that our designs are achievable and feasible for practical use.

  6. Analysis and Design of Fuselage Structures Including Residual Strength Prediction Methodology

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.

    1998-01-01

    The goal of this research project is to develop and assess methodologies for the design and analysis of fuselage structures accounting for residual strength. Two primary objectives are included in this research activity: development of a structural analysis methodology for predicting the residual strength of fuselage shell-type structures; and the development of accurate, efficient analysis, design and optimization tools for fuselage shell structures. Assessment of these tools for robustness, efficiency, and usability in a fuselage shell design environment will be integrated with these two primary research objectives.

  7. A Step-by-Step Design Methodology for a Base Case Vanadium Redox-Flow Battery

    ERIC Educational Resources Information Center

    Moore, Mark; Counce, Robert M.; Watson, Jack S.; Zawodzinski, Thomas A.; Kamath, Haresh

    2012-01-01

    The purpose of this work is to develop an evolutionary procedure to be used by Chemical Engineering students for the base-case design of a Vanadium Redox-Flow Battery. The design methodology is based on the work of Douglas (1985) and provides a profitability analysis at each decision level so that more profitable alternatives and directions can be…

  8. A Step-by-Step Design Methodology for a Base Case Vanadium Redox-Flow Battery

    ERIC Educational Resources Information Center

    Moore, Mark; Counce, Robert M.; Watson, Jack S.; Zawodzinski, Thomas A.; Kamath, Haresh

    2012-01-01

    The purpose of this work is to develop an evolutionary procedure to be used by Chemical Engineering students for the base-case design of a Vanadium Redox-Flow Battery. The design methodology is based on the work of Douglas (1985) and provides a profitability analysis at each decision level so that more profitable alternatives and directions can be…

  9. Designing Trend-Monitoring Sounds for Helicopters: Methodological Issues and an Application

    ERIC Educational Resources Information Center

    Edworthy, Judy; Hellier, Elizabeth; Aldrich, Kirsteen; Loxley, Sarah

    2004-01-01

    This article explores methodological issues in sonification and sound design arising from the design of helicopter monitoring sounds. Six monitoring sounds (each with 5 levels) were tested for similarity and meaning with 3 different techniques: hierarchical cluster analysis, linkage analysis, and multidimensional scaling. In Experiment 1,…

  10. Developing organ-on-a-chip concepts using bio-mechatronic design methodology.

    PubMed

    Christoffersson, Jonas; van Noort, Danny; Mandenius, Carl-Fredrik

    2017-05-26

    Mechatronic design is an engineering methodology for conceiving, configuring and optimising the design of a technical device or product to the needs and requirements of the final user. In this article, we show how the basic principles of this methodology can be exploited for in vitro cell cultures - often referred to as organ-on-a-chip devices. Due to the key role of the biological cells, we have introduced the term bio-mechatronic design to highlight the complexity of designing a system that should integrate biology, mechanics and electronics in the same device structure. The strength of the mechatronic design approach is to match the needs of the potential users to a systematic evaluation of overall functional design alternatives. It may be especially attractive for organs-on-chips, where biological constituents such as cells and tissues in 3D settings and in a fluidic environment should be compared, screened and selected. Through this approach, design solutions ranked against customer needs are generated according to specified criteria, thereby defining the key constraints of the fabrication. As an example, the bio-mechatronic methodology is applied to a liver-on-a-chip based on information extrapolated from previous theoretical and experimental knowledge. It is concluded that the methodology can generate new fabrication solutions for devices, as well as efficient guidelines for refining the design and fabrication of many of today's organ-on-a-chip devices.

  11. Monitoring Progress in Child Poverty Reduction: Methodological Insights and Illustration to the Case Study of Bangladesh

    ERIC Educational Resources Information Center

    Roche, Jose Manuel

    2013-01-01

    Important steps have been taken at international summits to set up goals and targets to improve the wellbeing of children worldwide. Now the world also has more and better data to monitor progress. This paper presents a new approach to monitoring progress in child poverty reduction based on the Alkire and Foster adjusted headcount ratio and an…
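
    The Alkire and Foster adjusted headcount ratio underpinning the approach is M0 = H x A, the multidimensional headcount ratio times the average deprivation intensity among the poor. A minimal sketch follows with an illustrative deprivation matrix, equal weights and a 50% cutoff; these are not Bangladesh data.

```python
def adjusted_headcount(deprivations, weights, k):
    """Alkire-Foster M0 = H x A. deprivations: rows = children, columns = 0/1
    deprivation indicators; weights sum to 1; a child counts as multidimensionally
    poor if its weighted deprivation score is >= k."""
    n = len(deprivations)
    scores = [sum(w * d for w, d in zip(weights, row)) for row in deprivations]
    poor = [s for s in scores if s >= k]
    h = len(poor) / n                            # headcount ratio
    a = sum(poor) / len(poor) if poor else 0.0   # average intensity among the poor
    return h * a, h, a


# Illustrative deprivation matrix: 4 children, 4 equally weighted indicators
rows = [[1, 1, 0, 1], [0, 1, 0, 0], [1, 1, 1, 1], [0, 0, 0, 0]]
m0, h, a = adjusted_headcount(rows, weights=[0.25] * 4, k=0.5)
print(f"M0 = {m0:.3f} (H = {h:.2f}, A = {a:.2f})")
```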

  12. Monitoring Progress in Child Poverty Reduction: Methodological Insights and Illustration to the Case Study of Bangladesh

    ERIC Educational Resources Information Center

    Roche, Jose Manuel

    2013-01-01

    Important steps have been taken at international summits to set up goals and targets to improve the wellbeing of children worldwide. Now the world also has more and better data to monitor progress. This paper presents a new approach to monitoring progress in child poverty reduction based on the Alkire and Foster adjusted headcount ratio and an…

  13. Failure: A Source of Progress in Maintenance and Design

    NASA Astrophysics Data System (ADS)

    Chaïb, R.; Taleb, M.; Benidir, M.; Verzea, I.; Bellaouar, A.

    This approach allows failure to be used as a source of progress in maintenance and design: to detect the most critical components in a piece of equipment, to determine the priority order of maintenance actions to carry out, to direct the operating procedure towards the most penalizing links in the equipment, and even to define the necessary changes and recommendations for future improvement. In this way, the pathological behaviour of the material can be assessed and its availability increased, even extending its lifespan and improving its future design. In this context and in the light of these points, failures are important in managing the maintenance function. Indeed, it has become important to understand the phenomena of failure and degradation of equipment in order to establish an appropriate maintenance policy for the rational use of mechanical components and to move towards proactive maintenance [1] and towards maintenance considered at the design stage [2].

  14. Compact DEMO, SlimCS: design progress and issues

    NASA Astrophysics Data System (ADS)

    Tobita, K.; Nishio, S.; Enoeda, M.; Kawashima, H.; Kurita, G.; Tanigawa, H.; Nakamura, H.; Honda, M.; Saito, A.; Sato, S.; Hayashi, T.; Asakura, N.; Sakurai, S.; Nishitani, T.; Ozeki, T.; Ando, M.; Ezato, K.; Hamamatsu, K.; Hirose, T.; Hoshino, T.; Ide, S.; Inoue, T.; Isono, T.; Liu, C.; Kakudate, S.; Kawamura, Y.; Mori, S.; Nakamichi, M.; Nishi, H.; Nozawa, T.; Ochiai, K.; Ogiwara, H.; Oyama, N.; Sakamoto, K.; Sakamoto, Y.; Seki, Y.; Shibama, Y.; Shimizu, K.; Suzuki, S.; Takahashi, K.; Tanigawa, H.; Tsuru, D.; Yamanishi, T.; Yoshida, T.

    2009-07-01

    The design progress in a compact low aspect ratio (low A) DEMO reactor, 'SlimCS', and its design issues are reported. The design study focused mainly on the torus configuration including the blanket, divertor, materials and maintenance scheme. For continuity with the Japanese ITER-TBM, the blanket is based on a water-cooled solid breeder blanket. For vertical stability of the elongated plasma and high beta access, the blanket is segmented into replaceable and permanent blankets and a sector-wide conducting shell is arranged in between these blankets. A numerical calculation indicates that fuel self-sufficiency can be satisfied when the blanket interior is ideally fabricated. The allowable heat load to the divertor plate should be 8 MW m-2 or lower, which can be a critical constraint in determining the handling power of the DEMO.

  15. Three-dimensional viscous design methodology for advanced technology aircraft supersonic inlet systems

    NASA Technical Reports Server (NTRS)

    Anderson, B. H.

    1983-01-01

    A broad program to develop advanced, reliable, and user oriented three-dimensional viscous design techniques for supersonic inlet systems, and encourage their transfer into the general user community is discussed. Features of the program include: (1) develop effective methods of computing three-dimensional flows within a zonal modeling methodology; (2) ensure reasonable agreement between said analysis and selective sets of benchmark validation data; (3) develop user orientation into said analysis; and (4) explore and develop advanced numerical methodology.

  16. Advanced piloted aircraft flight control system design methodology. Volume 1: Knowledge base

    NASA Technical Reports Server (NTRS)

    Mcruer, Duane T.; Myers, Thomas T.

    1988-01-01

    The development of a comprehensive and eclectic methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. While theory and associated computational means are an important aspect of the design methodology, the lore, knowledge and experience elements, which guide and govern applications, are critical features. This material is presented as summary tables, outlines, recipes, empirical data, lists, etc., which encapsulate a great deal of expert knowledge. Much of this is presented in topical knowledge summaries which are attached as Supplements. The composite of the supplements and the main body elements constitutes a first cut at a Mark 1 Knowledge Base for manned-aircraft flight control.

  17. Application of an Integrated Methodology for Propulsion and Airframe Control Design to a STOVL Aircraft

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay; Mattern, Duane

    1994-01-01

    An advanced methodology for integrated flight propulsion control (IFPC) design for future aircraft, which will use propulsion system generated forces and moments for enhanced maneuver capabilities, is briefly described. This methodology has the potential to address in a systematic manner the coupling between the airframe and the propulsion subsystems typical of such enhanced maneuverability aircraft. Application of the methodology to a short take-off vertical landing (STOVL) aircraft in the landing approach to hover transition flight phase is presented with brief description of the various steps in the IFPC design methodology. The details of the individual steps have been described in previous publications and the objective of this paper is to focus on how the components of the control system designed at each step integrate into the overall IFPC system. The full nonlinear IFPC system was evaluated extensively in nonreal-time simulations as well as piloted simulations. Results from the nonreal-time evaluations are presented in this paper. Lessons learned from this application study are summarized in terms of areas of potential improvements in the STOVL IFPC design as well as identification of technology development areas to enhance the applicability of the proposed design methodology.

  18. Integrated Controls-Structures Design Methodology: Redesign of an Evolutionary Test Structure

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Elliot, Kenny B.; Joshi, Suresh M.

    1997-01-01

    An optimization-based integrated controls-structures design methodology for a class of flexible space structures is described, and the phase-0 Controls-Structures-Integration (CSI) Evolutionary Model (CEM), a laboratory testbed at NASA Langley, is redesigned using this integrated design methodology. The integrated controls-structures design is posed as a nonlinear programming problem to minimize the control effort required to maintain a specified line-of-sight pointing performance, under persistent white noise disturbance. Static and dynamic dissipative control strategies are employed for feedback control, and parameters of these controllers are considered as the control design variables. Sizes of strut elements in various sections of the CEM are used as the structural design variables. Design guides for the struts are developed and employed in the integrated design process, to ensure that the redesigned structure can be effectively fabricated. The superiority of the integrated design methodology over the conventional design approach is demonstrated analytically by observing a significant reduction in the average control power needed to maintain specified pointing performance with the integrated design approach.
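
    As a rough illustration of the kind of nonlinear programming formulation described above, the sketch below minimizes a control-effort surrogate over one structural variable and one controller gain, subject to a pointing-performance constraint, using SciPy's SLSQP solver; the algebraic forms, bounds and the error limit are placeholders, not the CEM models or dissipative controller parameterizations used in the study.

      # Toy analogue of the integrated design formulation: choose structural and
      # control design variables x to minimize a control-effort surrogate subject to
      # a line-of-sight pointing constraint. The algebraic forms below are placeholders,
      # not the CEM models used in the cited study.
      import numpy as np
      from scipy.optimize import minimize

      def control_power(x):            # surrogate for average control power
          strut_area, gain = x
          return gain**2 / (0.5 + strut_area)

      def pointing_error(x):           # surrogate for RMS line-of-sight pointing error
          strut_area, gain = x
          return 1.0 / (strut_area * (1.0 + gain))

      x0 = np.array([1.0, 1.0])
      result = minimize(control_power, x0, method="SLSQP",
                        bounds=[(0.1, 5.0), (0.1, 10.0)],
                        constraints=[{"type": "ineq",
                                      "fun": lambda x: 0.05 - pointing_error(x)}])
      print(np.round(result.x, 3), round(control_power(result.x), 3))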

  19. Co-design of RAD and ETHICS methodologies: a combination of information system development methods

    NASA Astrophysics Data System (ADS)

    Nasehi, Arezo; Shahriyari, Salman

    2011-12-01

    Co-design is a new trend in the social world which tries to capture different ideas in order to use the most appropriate features for a system. In this paper, co-design of two information system methodologies is considered: rapid application development (RAD) and effective technical and human implementation of computer-based systems (ETHICS). We tried to consider the characteristics of these methodologies to see the possibility of having a co-design or combination of them for developing an information system. To reach this purpose, four different aspects of them are analyzed: social or technical approach, user participation and user involvement, job satisfaction, and overcoming change resistance. Finally, a case study using the quantitative method is analyzed in order to examine the possibility of co-design using these factors. The paper concludes that RAD and ETHICS are appropriate to be co-designed and offers some suggestions for the co-design.

  20. Efficient preliminary floating offshore wind turbine design and testing methodologies and application to a concrete spar design

    PubMed Central

    Matha, Denis; Sandner, Frank; Molins, Climent; Campos, Alexis; Cheng, Po Wen

    2015-01-01

    The current key challenge in the floating offshore wind turbine industry and research is on designing economic floating systems that can compete with fixed-bottom offshore turbines in terms of levelized cost of energy. The preliminary platform design, as well as early experimental design assessments, are critical elements in the overall design process. In this contribution, a brief review of current floating offshore wind turbine platform pre-design and scaled testing methodologies is provided, with a focus on their ability to accommodate the coupled dynamic behaviour of floating offshore wind systems. The exemplary design and testing methodology for a monolithic concrete spar platform as performed within the European KIC AFOSP project is presented. Results from the experimental tests compared to numerical simulations are presented and analysed and show very good agreement for relevant basic dynamic platform properties. Extreme and fatigue loads and cost analysis of the AFOSP system confirm the viability of the presented design process. In summary, the exemplary application of the reduced design and testing methodology for AFOSP confirms that it represents a viable procedure during pre-design of floating offshore wind turbine platforms. PMID:25583870

  1. Efficient preliminary floating offshore wind turbine design and testing methodologies and application to a concrete spar design.

    PubMed

    Matha, Denis; Sandner, Frank; Molins, Climent; Campos, Alexis; Cheng, Po Wen

    2015-02-28

    The current key challenge in the floating offshore wind turbine industry and research is on designing economic floating systems that can compete with fixed-bottom offshore turbines in terms of levelized cost of energy. The preliminary platform design, as well as early experimental design assessments, are critical elements in the overall design process. In this contribution, a brief review of current floating offshore wind turbine platform pre-design and scaled testing methodologies is provided, with a focus on their ability to accommodate the coupled dynamic behaviour of floating offshore wind systems. The exemplary design and testing methodology for a monolithic concrete spar platform as performed within the European KIC AFOSP project is presented. Results from the experimental tests compared to numerical simulations are presented and analysed and show very good agreement for relevant basic dynamic platform properties. Extreme and fatigue loads and cost analysis of the AFOSP system confirm the viability of the presented design process. In summary, the exemplary application of the reduced design and testing methodology for AFOSP confirms that it represents a viable procedure during pre-design of floating offshore wind turbine platforms.

  2. Design for manufacturing: application of collaborative multidisciplinary decision-making methodology

    NASA Astrophysics Data System (ADS)

    Xiao, A.; Seepersad, C. C.; Allen, J. K.; Rosen, D. W.; Mistree, F.

    2007-06-01

    Design for manufacturing is often difficult for mechanical parts, since significant manufacturing knowledge is required to adjust part designs for manufacturability. The traditional trial-and-error approach usually leads to expensive iterations and compromises the quality of the final design. The authors believe the appropriate way to handle product design for manufacturing problems is not to formulate a large design problem that exhaustively incorporates design and manufacturing issues, but to separate the design and manufacturing activities and provide support for collaboration between engineering teams. In this article, the Collaborative Multidisciplinary Decision-making Methodology is used to solve a product design and manufacturing problem. First, the compromise Decision Support Problem is used as a mathematical model of each engineering team's design decisions and as a medium for information exchange. Second, game-theoretic principles are employed to resolve couplings or interactions between the teams' decisions. Third, design-capability indices are used to maintain design freedom at the early stages of product realization in order to accommodate unexpected downstream design changes. A plastic robot-arm design and manufacturing scenario is presented to demonstrate the application of this methodology and its effectiveness for solving a complex design for manufacturing problem in a streamlined manner, with minimal expensive iterations.

  3. Problems and progress in aeroelasticity for interdisciplinary design

    NASA Technical Reports Server (NTRS)

    Yates, E. Carson, Jr.

    1987-01-01

    Some problems and progress in the development of aerodynamic and aeroelastic computational capabilities are reviewed with emphasis on needs for use in current interdisciplinary design procedures as well as for stand-alone analyses. The primary focus is on integral-equation methods which are well suited for general, accurate, efficient, and unified treatment of flow around vehicles having arbitrary shapes, motions, and deformations at subsonic, transonic, and supersonic speeds up to high angles of attack. Computational methods for potential flows and viscous flows are discussed, and some applications are shown. Calculation of steady and unsteady aeroelastic characteristics of aircraft with nonlinear aerodynamic behavior is also addressed briefly.

  4. A Proposed Methodology to Assess the Accuracy of 3D Scanners and Casts and Monitor Tooth Wear Progression in Patients.

    PubMed

    Ahmed, Khaled E; Whitters, John; Ju, Xiangyang; Pierce, S Gareth; MacLeod, Charles N; Murray, Colin A

    2016-01-01

    The aim of this study was to detail and assess the capability of a novel methodology to 3D-quantify tooth wear progression in a patient over a period of 12 months. A calibrated stainless steel model was used to identify the accuracy of the scanning system by assessing the accuracy and precision of the contact scanner and the dimensional accuracy and stability of casts fabricated from three different types of impression materials. Thereafter, the overall accuracy of the 3D scanning system (scanner and casts) was ascertained. Clinically, polyether impressions were made of the patient's dentition at the initial examination and at the 12-month review, then poured in type IV dental stone to assess the tooth wear. The anterior teeth on the resultant casts were scanned, and images were analyzed using 3D matching software to detect dimensional variations between the patient's impressions. The accuracy of the 3D scanning system was established to be 33 μm. 3D clinical analysis demonstrated localized wear on the incisal and palatal surfaces of the patient's maxillary central incisors. The identified wear extended to a depth of 500 μm with a distribution of 4% to 7% of affected tooth surfaces. The newly developed 3D scanning methodology was found to be capable of assessing and accounting for the various factors affecting tooth wear scanning. Initial clinical evaluation of the methodology demonstrates successful monitoring of tooth wear progression. However, further clinical assessment is needed.

  5. Passive and semi-active heave compensator: Project design methodology and control strategies.

    PubMed

    Cuellar Sanchez, William Humberto; Linhares, Tássio Melo; Neto, André Benine; Fortaleza, Eugênio Libório Feitosa

    2017-01-01

    A heave compensator is a system that mitigates the transmission of heave motion from a vessel to the equipment it carries. In the drilling industry, a heave compensator enables drilling in offshore environments: it attenuates the motion transmitted from the vessel to the drill string and drill bit, ensuring the safety and efficiency of the offshore drilling process. Common types of heave compensators are passive, active and semi-active compensators. This article presents four main points. First, a bulk modulus analysis yields a simple condition to determine whether the bulk modulus can be neglected in the design of a hydropneumatic passive heave compensator. Second, it presents a methodology to design passive heave compensators with a desired frequency response. Third, four control methodologies for a semi-active heave compensator are tested and compared numerically. Lastly, experimental results are shown from a prototype built using the methodology developed for designing passive heave compensators.
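
    A common way to reason about the passive compensator's frequency response mentioned above is to approximate it as a base-excited mass-spring-damper and examine its transmissibility; the sketch below does that with assumed mass, stiffness and damping values, which are illustrative and not taken from the article or its prototype.

      # Sketch: a passive heave compensator approximated as a base-excited
      # mass-spring-damper, evaluating how much vessel heave is transmitted to the
      # suspended load across wave periods. Parameter values are illustrative.
      import numpy as np

      m = 2.0e4        # suspended mass, kg (assumed)
      k = 2.0e3        # effective gas-spring stiffness, N/m (assumed, deliberately soft)
      c = 2.5e3        # effective damping, N s/m (assumed)

      wn = np.sqrt(k / m)                  # natural frequency, rad/s (period ~ 20 s here)
      zeta = c / (2.0 * np.sqrt(k * m))    # damping ratio

      for wave_period in (6.0, 10.0, 14.0):        # typical wave periods, s
          w = 2.0 * np.pi / wave_period
          r = w / wn
          # base-excitation transmissibility of a linear second-order system
          T = np.sqrt((1 + (2 * zeta * r) ** 2) /
                      ((1 - r ** 2) ** 2 + (2 * zeta * r) ** 2))
          print(f"wave period {wave_period:4.1f} s  heave transmissibility = {T:.2f}")

    A soft effective spring pushes the natural period well above the wave periods (r greater than 1), which is what gives the attenuation visible in the printed transmissibilities.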

  6. Passive and semi-active heave compensator: Project design methodology and control strategies

    PubMed Central

    Cuellar Sanchez, William Humberto; Neto, André Benine; Fortaleza, Eugênio Libório Feitosa

    2017-01-01

    A heave compensator is a system that mitigates the transmission of heave motion from a vessel to the equipment it carries. In the drilling industry, a heave compensator enables drilling in offshore environments: it attenuates the motion transmitted from the vessel to the drill string and drill bit, ensuring the safety and efficiency of the offshore drilling process. Common types of heave compensators are passive, active and semi-active compensators. This article presents four main points. First, a bulk modulus analysis yields a simple condition to determine whether the bulk modulus can be neglected in the design of a hydropneumatic passive heave compensator. Second, it presents a methodology to design passive heave compensators with a desired frequency response. Third, four control methodologies for a semi-active heave compensator are tested and compared numerically. Lastly, experimental results are shown from a prototype built using the methodology developed for designing passive heave compensators. PMID:28813494

  7. Simplifying multiobjective optimization: An automated design methodology for the nondominated sorted genetic algorithm-II

    NASA Astrophysics Data System (ADS)

    Reed, Patrick; Minsker, Barbara S.; Goldberg, David E.

    2003-07-01

    Many water resources problems require careful balancing of fiscal, technical, and social objectives. Informed negotiation and balancing of objectives can be greatly aided through the use of evolutionary multiobjective optimization (EMO) algorithms, which can evolve entire tradeoff (or Pareto) surfaces within a single run. The primary difficulty in using these methods lies in the large number of parameters that must be specified to ensure that these algorithms effectively quantify design tradeoffs. This technical note addresses this difficulty by introducing a multipopulation design methodology that automates parameter specification for the nondominated sorted genetic algorithm-II (NSGA-II). The NSGA-II design methodology is successfully demonstrated on a multiobjective long-term groundwater monitoring application. Using this methodology, multiobjective optimization problems can now be solved automatically with only a few simple user inputs.
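
    The following self-contained sketch shows the non-dominated sorting step at the heart of NSGA-II on a small two-objective minimization example; it illustrates the mechanism whose parameters the cited methodology automates, and the example (cost, estimation-error) pairs are hypothetical rather than results from the groundwater monitoring application.

      # Sketch of the non-dominated sorting step at the heart of NSGA-II, shown on a
      # small two-objective minimization example. This illustrates the mechanism the
      # design methodology parameterizes; it is not the multipopulation procedure itself.

      def dominates(a, b):
          """True if objective vector a Pareto-dominates b (minimization)."""
          return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

      def nondominated_sort(points):
          """Group points into successive Pareto fronts."""
          remaining = list(points)
          fronts = []
          while remaining:
              front = [p for p in remaining
                       if not any(dominates(q, p) for q in remaining if q is not p)]
              fronts.append(front)
              remaining = [p for p in remaining if p not in front]
          return fronts

      # Hypothetical (sampling cost, estimation error) pairs for candidate monitoring designs
      designs = [(1.0, 5.0), (2.0, 3.0), (3.0, 1.0), (2.5, 3.5), (1.5, 4.0)]
      for i, front in enumerate(nondominated_sort(designs), start=1):
          print(f"front {i}: {front}")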

  8. A time-harmonic inverse methodology for the design of RF coils in MRI.

    PubMed

    Lawrence, Ben G; Crozier, Stuart; Yau, Desmond D; Doddrell, David M

    2002-01-01

    An inverse methodology is described to assist in the design of radio-frequency (RF) coils for magnetic resonance imaging (MRI) applications. The time-harmonic electromagnetic Green's functions are used to calculate current on the coil and shield cylinders that will generate a specified internal magnetic field. Stream function techniques and the method of moments are then used to implement this theoretical current density into an RF coil. A novel asymmetric coil for a 4.5 T MRI machine was designed and constructed using this methodology, and the results are presented.
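
    In discrete form, the inverse step described above amounts to solving a regularized least-squares problem for the current coefficients given a linear Green's-function map to the target field; the sketch below shows that step with a random stand-in matrix, an assumed regularization weight and a uniform target field, none of which come from the cited design.

      # Sketch of the discrete inverse step implied by the abstract: given a linear
      # Green's-function map G from surface-current coefficients J to the field at
      # target points, solve a Tikhonov-regularized least-squares problem for J.
      # The matrix G below is a random stand-in, not the electromagnetic kernel.
      import numpy as np

      rng = np.random.default_rng(0)
      n_field, n_current = 40, 25
      G = rng.normal(size=(n_field, n_current))      # stand-in Green's-function matrix
      b_target = np.ones(n_field)                    # desired uniform field at target points

      lam = 1e-2                                     # regularization weight (assumed)
      J = np.linalg.solve(G.T @ G + lam * np.eye(n_current), G.T @ b_target)

      print("relative field error:", np.linalg.norm(G @ J - b_target) / np.linalg.norm(b_target))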

  9. Energy-Based Design Methodology for Air Vehicle Systems: Aerodynamic Correlation Study

    DTIC Science & Technology

    2005-03-01

    Energy-Based Design Methodology for Air Vehicle Systems: Aerodynamic Correlation Study. AFOSR-sponsored effort (program contact Dr. John Schmisseur); the grant number beginning FA9550- is only partially legible in the source record. The recoverable portion of the abstract addresses drag estimation and vehicle-level utilization of energy, including the exergy utilization of a wing in a steady, low subsonic, three-dimensional, viscous flow.

  10. Methodological Study for Determining the Task Content of Dental Auxiliary Education Programs. Progress Report.

    ERIC Educational Resources Information Center

    Terry, David R.

    The purpose of the study was to develop a methodology of collecting data pertaining to the dental tasks taught and the responsibility levels to which they are taught in the curricula of educational institutions preparing dental assistants, hygienists, and laboratory technicians. The sample group consisted of Faculty and Preceptor respondents from…

  11. Space station definitions, design, and development. Task 5: Multiple arm telerobot coordination and control: Manipulator design methodology

    NASA Technical Reports Server (NTRS)

    Stoughton, R. M.

    1990-01-01

    A proposed methodology applicable to the design of manipulator systems is described. The current design process is especially weak in the preliminary design phase, since there is no accepted measure to be used in trading off different options available for the various subsystems. The design process described uses Cartesian End-Effector Impedance as a measure of performance for the system. Having this measure of performance, it is shown how it may be used to determine the trade-offs necessary in the preliminary design phase. The design process involves three main parts: (1) determination of desired system performance in terms of End-Effector Impedance; (2) trade-off of design options to achieve this desired performance; and (3) verification of system performance through laboratory testing. The design process is developed using numerous examples and experiments to demonstrate the feasibility of this approach to manipulator design.
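
    As a small illustration of using end-effector impedance as a performance measure, the sketch below evaluates a one-axis impedance Z(s) = M s^2 + B s + K along the frequency axis and compares a candidate design against a target; the single-axis model and all numerical values are assumptions for illustration, whereas the cited work treats the full multi-axis Cartesian case.

      # Sketch: one-axis end-effector impedance Z(s) = M s^2 + B s + K evaluated at
      # s = j*omega and compared against a target impedance. All numbers are
      # illustrative; the cited work treats the full multi-axis Cartesian case.
      import numpy as np

      def impedance(M, B, K, omega):
          s = 1j * omega
          return M * s**2 + B * s + K

      omega = np.logspace(-1, 2, 4)                 # sample frequencies, rad/s
      target = impedance(M=1.0, B=20.0, K=400.0, omega=omega)
      candidate = impedance(M=1.5, B=15.0, K=350.0, omega=omega)

      mismatch = np.abs(candidate - target) / np.abs(target)
      for w, m in zip(omega, mismatch):
          print(f"omega = {w:7.2f} rad/s   relative impedance mismatch = {m:.2f}")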

  12. Capturing community change: Active Living by Design's progress reporting system.

    PubMed

    Bors, Philip A

    2012-11-01

    The Active Living by Design (ALbD) National Program Office (NPO) developed an evaluation system to track progress of 25 community partnerships, funded by the Robert Wood Johnson Foundation (RWJF). Between June 2004 and October 2008, partnerships documented their actions and accomplishments through ALbD's online Progress Reporting System (PRS) database. All entries were verified and analyzed by the NPO. Results from the PRS suggest that the ALbD partnerships were successful fundraisers, leveraging $256 million from grants, policy decisions, in-kind and direct sources. The partnerships also documented newspaper coverage, TV, and radio air time and they developed physical activity programs such as exercise clubs and "walking school buses." Partnerships were adept at influencing decision makers to create or rewrite policies and improve built environments. Selected policy examples included, but were not limited to, approvals for capital improvements, street design standards, and development ordinances. Partnerships also contributed to the completion and approval of influential planning products, such as comprehensive land use, neighborhood, and roadway corridor plans. The most common built-environment changes were street improvements for safer pedestrian and bicycle travel, including new crosswalks, bicycle facilities, and sidewalks. The ALbD community partnerships' accomplishments and challenges contribute to knowledge and best practices in the active living field. Five years after their grant began, RWJF's initial investment showed substantial and measurable results. Copyright © 2012 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.

  13. A methodology for designing robust multivariable nonlinear control systems. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Grunberg, D. B.

    1986-01-01

    A new methodology is described for the design of nonlinear dynamic controllers for nonlinear multivariable systems providing guarantees of closed-loop stability, performance, and robustness. The methodology is an extension of the Linear-Quadratic-Gaussian with Loop-Transfer-Recovery (LQG/LTR) methodology for linear systems, thus hinging upon the idea of constructing an approximate inverse operator for the plant. A major feature of the methodology is a unification of both the state-space and input-output formulations. In addition, new results on stability theory, nonlinear state estimation, and optimal nonlinear regulator theory are presented, including the guaranteed global properties of the extended Kalman filter and optimal nonlinear regulators.

  14. Development of an Expert Judgement Elicitation and Calibration Methodology for Risk Analysis in Conceptual Vehicle Design

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina

    2004-01-01

    A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to obtain expert judgment opinion for quantifying uncertainties as a probability distribution so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates, and provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results is presented. A discussion of possible future steps in this research area is given.

  15. Probabilistic Accident Progression Analysis with application to a LMFBR design

    SciTech Connect

    Jamali, K.M.

    1982-01-01

    A method for probabilistic analysis of accident sequences in nuclear power plant systems referred to as "Probabilistic Accident Progression Analysis" (PAPA) is described. Distinctive features of PAPA include: (1) definition and analysis of initiator-dependent accident sequences on the component level; (2) a new fault-tree simplification technique; (3) a new technique for assessment of the effect of uncertainties in the failure probabilities in the probabilistic ranking of accident sequences; (4) techniques for quantification of dependent failures of similar components, including an iterative technique for high-population components. The methodology is applied to the Shutdown Heat Removal System (SHRS) of the Clinch River Breeder Reactor Plant during its short-term (0

  16. A Novel Multiscale Physics Based Progressive Failure Methodology for Laminated Composite Structures

    NASA Technical Reports Server (NTRS)

    Pineda, Evan J.; Waas, Anthony M.; Bednarcyk, Brett A.; Collier, Craig S.; Yarrington, Phillip W.

    2008-01-01

    A variable fidelity, multiscale, physics based finite element procedure for predicting progressive damage and failure of laminated continuous fiber reinforced composites is introduced. At every integration point in a finite element model, progressive damage is accounted for at the lamina-level using thermodynamically based Schapery Theory. Separate failure criteria are applied at either the global-scale or the microscale in two different FEM models. A micromechanics model, the Generalized Method of Cells, is used to evaluate failure criteria at the micro-level. The stress-strain behavior and observed failure mechanisms are compared with experimental results for both models.

  17. Cyber-Informed Engineering: The Need for a New Risk Informed and Design Methodology

    SciTech Connect

    Price, Joseph Daniel; Anderson, Robert Stephen

    2015-06-01

    Current engineering and risk management methodologies do not contain the foundational assumptions required to address the intelligent adversary’s capabilities in malevolent cyber attacks. Current methodologies focus on equipment failures or human error as initiating events for a hazard, while cyber attacks use the functionality of a trusted system to perform operations outside of the intended design and without the operator’s knowledge. These threats can by-pass or manipulate traditionally engineered safety barriers and present false information, invalidating the fundamental basis of a safety analysis. Cyber threats must be fundamentally analyzed from a completely new perspective where neither equipment nor human operation can be fully trusted. A new risk analysis and design methodology needs to be developed to address this rapidly evolving threatscape.

  18. A cost-effective methodology for the design of massively-parallel VLSI functional units

    NASA Technical Reports Server (NTRS)

    Venkateswaran, N.; Sriram, G.; Desouza, J.

    1993-01-01

    In this paper we propose a generalized methodology for the design of cost-effective massively-parallel VLSI Functional Units. This methodology is based on a technique of generating and reducing a massive bit-array on the mask-programmable PAcube VLSI array. This methodology unifies (maintains identical data flow and control) the execution of complex arithmetic functions on PAcube arrays. It is highly regular, expandable and uniform with respect to problem-size and wordlength, thereby reducing the communication complexity. The memory-functional unit interface is regular and expandable. Using this technique, functional units of dedicated processors can be mask-programmed on the naked PAcube arrays, reducing the turn-around time. The production cost of such dedicated processors can be drastically reduced since the naked PAcube arrays can be mass-produced. Analysis of the performance of functional units designed by our method yields promising results.

  19. A multi-criteria decision aid methodology to design electric vehicles public charging networks

    NASA Astrophysics Data System (ADS)

    Raposo, João; Rodrigues, Ana; Silva, Carlos; Dentinho, Tomaz

    2015-05-01

    This article presents a new multi-criteria decision aid methodology, dynamic-PROMETHEE, here used to design electric vehicle charging networks. In applying this methodology to a Portuguese city, results suggest that it is effective in designing electric vehicle charging networks, generating time and policy based scenarios, considering supply and demand and the city's urban structure. Dynamic-PROMETHEE adds to the already known PROMETHEE characteristics other useful features, such as decision memory over time, versatility and adaptability. The case study, used here to present the dynamic-PROMETHEE, served as inspiration and base to create this new methodology. It can be used to model different problems and scenarios that may present similar requirement characteristics.
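
    The sketch below shows the standard PROMETHEE II computation that underlies the methodology: pairwise preference indices aggregated into net outranking flows that rank the alternatives. The candidate sites, criteria, weights and the simple capped-linear preference function are invented for illustration, and the paper's dynamic extension is not reproduced.

      # Sketch of the PROMETHEE II ranking step: weighted pairwise preferences
      # aggregated into net outranking flows. Sites, criteria, weights and the
      # preference function are illustrative assumptions, not the paper's data.

      # scores per candidate charging site on (demand coverage, grid capacity, cost)
      sites = {"A": (0.8, 0.6, 0.3), "B": (0.5, 0.9, 0.5), "C": (0.7, 0.4, 0.2)}
      weights = (0.5, 0.3, 0.2)
      maximize = (True, True, False)        # cost is a minimization criterion

      def preference(a, b):
          """Weighted sum of per-criterion preferences of a over b (capped linear)."""
          total = 0.0
          for w, xa, xb, bigger_is_better in zip(weights, a, b, maximize):
              d = (xa - xb) if bigger_is_better else (xb - xa)
              total += w * min(max(d, 0.0), 1.0)
          return total

      names = list(sites)
      net_flow = {a: sum(preference(sites[a], sites[b]) - preference(sites[b], sites[a])
                         for b in names if b != a) / (len(names) - 1)
                  for a in names}
      for name, flow in sorted(net_flow.items(), key=lambda kv: kv[1], reverse=True):
          print(f"site {name}: net flow = {flow:+.3f}")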

  20. Cell-based top-down design methodology for RSFQ digital circuits

    NASA Astrophysics Data System (ADS)

    Yoshikawa, N.; Koshiyama, J.; Motoori, K.; Matsuzaki, F.; Yoda, K.

    2001-08-01

    We propose a cell-based top-down design methodology for rapid single flux quantum (RSFQ) digital circuits. Our design methodology employs a binary decision diagram (BDD), which is currently used for the design of CMOS pass-transistor logic circuits. The main features of the BDD RSFQ circuits are the limited primitive number, dual rail nature, non-clocking architecture, and small gate count. We have made a standard BDD RSFQ cell library and prepared a top-down design CAD environment, by which we can perform logic synthesis, logic simulation, circuit simulation and layout view extraction. In order to clarify problems expected in large-scale RSFQ circuits design, we have designed a small RSFQ microprocessor based on simple architecture using our top-down design methodology. We have estimated its system performance and compared it with that of the CMOS microprocessor with the same architecture. It was found that the RSFQ system is superior in terms of the operating speed though it requires extremely large chip area.

  1. BEAM STOP DESIGN METHODOLOGY AND DESCRIPTION OF A NEW SNS BEAM STOP

    SciTech Connect

    Polsky, Yarom; Plum, Michael A; Geoghegan, Patrick J; Jacobs, Lorelei L; Lu, Wei; McTeer, Stephen Mark

    2010-01-01

    The design of accelerator components such as magnets, accelerator cavities and beam instruments tends to be a fairly standardized and collective effort within the particle accelerator community with well established performance, reliability and, in some cases, even budgetary criteria. Beam stop design, by contrast, has been comparatively subjective historically with much more general goals. This lack of rigor has led to a variety of facility implementations with limited standardization and minimal consensus on approach to development within the particle accelerator community. At the Spallation Neutron Source (SNS), for example, there are four high power beam stops in use, three of which have significantly different design solutions. This paper describes the design of a new off-momentum beam stop for the SNS. The technical description of the system will be complemented by a discussion of design methodology. This paper presents an overview of the new SNS HEBT off-momentum beam stop and outlines a methodology for beam stop system design. The new beam stop consists of aluminium and steel blocks cooled by a closed-loop forced-air system and is expected to be commissioned this summer. The design methodology outlined in the paper represents a basic description of the process, data, analyses and critical decisions involved in the development of a beam stop system.

  2. A design methodology for effective application of pan-tilt cameras in alarm assessment systems

    SciTech Connect

    Davis, R.F.

    1993-08-01

    Effective application of pan-tilt cameras in alarm assessment systems requires that the overall system design be such that any threat for which the system is designed will be within the field of view of the camera for a sufficiently long time for the assessment of the alarm to be performed. The assessment of alarms in large, unobstructed areas requires a different type of analysis than traditionally used for clear zones between fences along fixed perimeters where an intruder's possible location is well defined. This paper presents a design methodology which integrates the threat characteristics, sensor detection pattern, system response time, and optics geometry considerations to identify all feasible locations for camera placement for effective assessment of large, unobstructed areas. The methodology also can be used to evaluate tradeoffs among these various considerations to improve candidate designs.

  3. Using Delphi Methodology to Design Assessments of Teachers' Pedagogical Content Knowledge

    ERIC Educational Resources Information Center

    Manizade, Agida Gabil; Mason, Marguerite M.

    2011-01-01

    Descriptions of methodologies that can be used to create items for assessing teachers' "professionally situated" knowledge are lacking in mathematics education research literature. In this study, researchers described and used the Delphi method to design an instrument to measure teachers' pedagogical content knowledge. The instrument focused on a…

  4. IDR: A Participatory Methodology for Interdisciplinary Design in Technology Enhanced Learning

    ERIC Educational Resources Information Center

    Winters, Niall; Mor, Yishay

    2008-01-01

    One of the important themes that emerged from the CAL'07 conference was the failure of technology to bring about the expected disruptive effect to learning and teaching. We identify one of the causes as an inherent weakness in prevalent development methodologies. While the problem of designing technology for learning is irreducibly…

  5. IDR: A Participatory Methodology for Interdisciplinary Design in Technology Enhanced Learning

    ERIC Educational Resources Information Center

    Winters, Niall; Mor, Yishay

    2008-01-01

    One of the important themes that emerged from the CAL'07 conference was the failure of technology to bring about the expected disruptive effect to learning and teaching. We identify one of the causes as an inherent weakness in prevalent development methodologies. While the problem of designing technology for learning is irreducibly…

  6. Using Delphi Methodology to Design Assessments of Teachers' Pedagogical Content Knowledge

    ERIC Educational Resources Information Center

    Manizade, Agida Gabil; Mason, Marguerite M.

    2011-01-01

    Descriptions of methodologies that can be used to create items for assessing teachers' "professionally situated" knowledge are lacking in mathematics education research literature. In this study, researchers described and used the Delphi method to design an instrument to measure teachers' pedagogical content knowledge. The instrument focused on a…

  7. Intranets and Digital Organizational Information Resources: Towards a Portable Methodology for Design and Development.

    ERIC Educational Resources Information Center

    Rosenbaum, Howard

    1997-01-01

    Discusses the concept of the intranet, comparing and contrasting it with groupware, and presents an argument for its value based on technical and information management considerations. Presents an intranet development project for an academic organization and describes a portable, user-centered and team-based methodology for the design and…

  8. Methodology for the Preliminary Design of High Performance Schools in Hot and Humid Climates

    ERIC Educational Resources Information Center

    Im, Piljae

    2009-01-01

    A methodology to develop an easy-to-use toolkit for the preliminary design of high performance schools in hot and humid climates was presented. The toolkit proposed in this research will allow decision makers without simulation knowledge easily to evaluate accurately energy efficient measures for K-5 schools, which would contribute to the…

  9. Beyond Needs Analysis: Soft Systems Methodology for Meaningful Collaboration in EAP Course Design

    ERIC Educational Resources Information Center

    Tajino, Akira; James, Robert; Kijima, Kyoichi

    2005-01-01

    Designing an EAP course requires collaboration among various concerned stakeholders, including students, subject teachers, institutional administrators and EAP teachers themselves. While needs analysis is often considered fundamental to EAP, alternative research methodologies may be required to facilitate meaningful collaboration between these…

  10. Learning Network Design: A Methodology for the Construction of Co-operative Distance Learning Environments.

    ERIC Educational Resources Information Center

    Davies, Dick

    Learning Network Design (LND) is a socially oriented methodology for construction of cooperative distance learning environments. The paper advances a social constructivist approach to learning in which learning and teaching are seen as a process of active communication, interpretation, and negotiation; offers a view of information technology as a…

  11. Curriculum Design: Nurse Educator's Role in Managing and Utilizing Various Teaching Methodologies.

    ERIC Educational Resources Information Center

    Walters, Norma J.

    The role of the nurse educator in curriculum design in the future is considered. Changing technology, shifts in patient care agencies, legislation and long-term care specialties in nursing are all factors that will have a significant impact on curricula. Plans for managing and utilizing various teaching methodologies will be an important role for…

  12. Fundamentals of clinical outcomes assessment for spinal disorders: study designs, methodologies, and analyses.

    PubMed

    Vavken, Patrick; Ganal-Antonio, Anne Kathleen B; Shen, Francis H; Chapman, Jens R; Samartzis, Dino

    2015-04-01

    Study Design: A broad narrative review. Objective: Management of spinal disorders is continuously evolving, with new technologies being constantly developed. Regardless, assessment of patient outcomes is key in understanding the safety and efficacy of various therapeutic interventions. As such, evidence-based spine care is an essential component to the armamentarium of the spine specialist in an effort to critically analyze the reported literature and execute studies in an effort to improve patient care and change clinical practice. The following article, part one of a two-part series, is meant to bring attention to the pros and cons of various study designs, their methodological issues, as well as statistical considerations. Methods: An extensive review of the peer-reviewed literature was performed, irrespective of language of publication, addressing study designs and their methodologies as well as statistical concepts. Results: Numerous articles and concepts addressing study designs and their methodological considerations as well as statistical analytical concepts have been reported. Their applications in the context of spine-related conditions and disorders were noted. Conclusion: Understanding the fundamental principles of study designs and their methodological considerations as well as statistical analyses can further advance and improve future spine-related research.

  13. A water quality monitoring network design methodology for the selection of critical sampling points: Part I.

    PubMed

    Strobl, R O; Robillard, P D; Shannon, R D; Day, R L; McDonnell, A J

    2006-01-01

    The principal instrument to temporally and spatially manage water resources is a water quality monitoring network. However, to date in most cases, there is a clear absence of a concise strategy or methodology for designing monitoring networks, especially when deciding upon the placement of sampling stations. Since water quality monitoring networks can be quite costly, it is very important to properly design the monitoring network so that maximum information extraction can be accomplished, which in turn is vital when informing decision-makers. This paper presents the development of a methodology for identifying the critical sampling locations within a watershed. Hence, it embodies the spatial component in the design of a water quality monitoring network by designating the critical stream locations that should ideally be sampled. For illustration purposes, the methodology focuses on a single contaminant, namely total phosphorus, and is applicable to small, upland, predominantly agricultural-forested watersheds. It takes a number of hydrologic, topographic, soils, vegetative, and land use factors into account. In addition, it includes an economic as well as logistical component in order to approximate the number of sampling points required for a given budget and to only consider the logistically accessible stream reaches in the analysis, respectively. The methodology utilizes a geographic information system (GIS), hydrologic simulation model, and fuzzy logic.

  14. Methodology for the Preliminary Design of High Performance Schools in Hot and Humid Climates

    ERIC Educational Resources Information Center

    Im, Piljae

    2009-01-01

    A methodology to develop an easy-to-use toolkit for the preliminary design of high performance schools in hot and humid climates was presented. The toolkit proposed in this research will allow decision makers without simulation knowledge easily to evaluate accurately energy efficient measures for K-5 schools, which would contribute to the…

  15. A use case driven object-oriented design methodology for the design of multi-level workflow schemas

    NASA Astrophysics Data System (ADS)

    Chen, Pei-Hung

    Traditional workflow schema design largely involves manual processes. It begins with business-process models and ends with delivering a workflow schema that can be executed by a workflow management system. However, little effort has been made to develop methodological approaches for workflow design [BARE99]. As a consequence, this thesis develops a design methodology which will enable workflow designers to model complex business processes in a simple and straightforward manner and easily generate workflow schemas. Our methodology is a use case based approach to algorithmic multi-level workflow schema generation. It begins with an initial analysis phase to capture requirement specifications, and incorporates workflow technology to support business process modeling that captures business processes as workflow specifications. In the process, we extend the interaction diagram for modeling workflow applications and for integrating business rules. Consequently, the extended interaction diagrams, named workflow-based interaction diagrams, can support process-related concepts, including static and dynamic rules, multiple use case scenarios, event scheduling and delay features. We develop automatic conversion algorithms, which enable designers to generate different levels of workflow schemas automatically based on the workflow-based interaction diagrams generated from the use case analysis. The workflow schemas produced can be specified at different levels of detail, and designers can further create a multi-level web interface based on these workflow schemas to support state- and process-defined data views and decision trees simultaneously.

  16. QFD: a methodological tool for integration of ergonomics at the design stage.

    PubMed

    Marsot, Jacques

    2005-03-01

    As a marked increase in the number of musculoskeletal disorders was noted in many industrialized countries and more specifically in companies that require the use of hand tools, the French National Research and Safety Institute launched in 1999 a research program on the topic of integrating ergonomics into hand tool design. After a brief review of the problems of integrating ergonomics at the design stage, the paper shows how the "Quality Function Deployment" method has been applied to the design of a boning knife and it highlights the difficulties encountered. Then, it demonstrates how this method can be a methodological tool geared to greater ergonomics consideration in product design.
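
    The core House of Quality computation used in QFD can be summarized in a few lines: the importance of each technical characteristic is the sum, over customer or ergonomic needs, of the need weight multiplied by the relationship strength. In the sketch below the needs, characteristics, weights and 1/3/9 relationship scores for a boning knife are invented for illustration and are not the values from the cited study.

      # Sketch of the core QFD (House of Quality) computation: importance of each
      # technical characteristic = sum over needs of (need weight x relationship strength).
      # Needs, characteristics, weights and the 1/3/9 scores are invented for illustration.
      needs = {                      # ergonomic requirement: weight
          "low grip force": 5,
          "wrist kept neutral": 4,
          "easy to clean": 2,
      }
      relationships = {              # need -> {technical characteristic: strength 1/3/9}
          "low grip force":     {"handle diameter": 9, "blade sharpness retention": 3, "handle material": 3},
          "wrist kept neutral": {"blade-handle angle": 9, "handle diameter": 1},
          "easy to clean":      {"handle material": 9, "blade-handle angle": 1},
      }

      importance = {}
      for need, weight in needs.items():
          for tech, strength in relationships[need].items():
              importance[tech] = importance.get(tech, 0) + weight * strength

      for tech, score in sorted(importance.items(), key=lambda kv: kv[1], reverse=True):
          print(f"{tech:28s} {score}")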

  17. A low-power photovoltaic system with energy storage for radio communications: Description and design methodology

    NASA Technical Reports Server (NTRS)

    Chapman, C. P.; Chapman, P. D.; Lewison, A. H.

    1982-01-01

    A low power photovoltaic system was constructed with approximately 500 amp hours of battery energy storage to provide power to an emergency amateur radio communications center. The system can power the communications center for about 72 hours of continuous no-sun operation. Complete construction details and a design methodology algorithm are given with abundant engineering data and adequate theory to allow similar systems to be constructed, scaled up or down, with minimum design effort.
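
    A back-of-envelope check of the storage figures quoted above (500 amp hours supporting roughly 72 hours of operation) is sketched below; the 12 V bus voltage and the usable depth of discharge are assumptions, since the record does not state them.

      # Back-of-envelope check of the storage figures quoted in the abstract: 500 Ah of
      # battery capacity supporting about 72 h of operation. The 12 V bus voltage and the
      # usable depth of discharge are assumptions, not values given in the record.
      capacity_ah = 500.0
      bus_voltage = 12.0            # assumed
      usable_fraction = 0.8         # assumed depth of discharge
      autonomy_hours = 72.0

      usable_energy_wh = capacity_ah * bus_voltage * usable_fraction
      average_load_w = usable_energy_wh / autonomy_hours
      print(f"usable storage ~ {usable_energy_wh:.0f} Wh, "
            f"supportable average load ~ {average_load_w:.0f} W over {autonomy_hours:.0f} h")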

  18. Low-power photovoltaic system with energy storage for radio communications. Description and design methodology

    SciTech Connect

    Chapman, C.P.; Chapman, P.D.; Lewison, A.H.

    1982-01-15

    A low-power photovoltaic system was constructed with approximately 500 amp-hours of battery energy storage to provide power to an emergency amateur radio communications center. The system can power the communications center for about 72 hours of continuous no-sun operation. Complete construction details and a design methodology algorithm are given with abundant engineering data and adequate theory to allow similar systems to be constructed, scaled up or down, with minimum design effort.

  19. A low-power photovoltaic system with energy storage for radio communications: description and design methodology

    SciTech Connect

    Chapman, C.P.; Chapman, P.D.

    1982-01-01

    A low power photovoltaic system was constructed with approximately 500 amp hours of battery energy storage to provide power to an emergency amateur radio communications center. The system can power the communications center for about 72 hours of continuous no-sun operation. Complete construction details and a design methodology algorithm are given with abundant engineering data and adequate theory to allow similar systems to be constructed, scaled up or down, with minimum design effort.

  20. Design Methodology for Multi-Element High-Lift Systems on Subsonic Civil Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Pepper, R. S.; vanDam, C. P.

    1996-01-01

    The choice of a high-lift system is crucial in the preliminary design process of a subsonic civil transport aircraft. Its purpose is to increase the allowable aircraft weight or decrease the aircraft's wing area for a given takeoff and landing performance. However, the implementation of a high-lift system into a design must be done carefully, for it can improve the aerodynamic performance of an aircraft but may also drastically increase the aircraft empty weight. If designed properly, a high-lift system can improve the cost effectiveness of an aircraft by increasing the payload weight for a given takeoff and landing performance. This is why the design methodology for a high-lift system should incorporate aerodynamic performance, weight, and cost. The airframe industry has experienced rapid technological growth in recent years which has led to significant advances in high-lift systems. For this reason many existing design methodologies have become obsolete since they are based on outdated low Reynolds number wind-tunnel data and can no longer accurately predict the aerodynamic characteristics or weight of current multi-element wings. Therefore, a new design methodology has been created that reflects current aerodynamic, weight, and cost data and provides enough flexibility to allow incorporation of new data when it becomes available.

  1. The Progression of Prospective Primary Teachers' Conceptions of the Methodology of Teaching

    NASA Astrophysics Data System (ADS)

    Rivero, Ana; Azcárate, Pilar; Porlán, Rafael; Martín Del Pozo, Rosa; Harres, Joao

    2011-11-01

    This article describes the evolution of prospective primary teachers' conceptions of the methodology of teaching. Three categories were analyzed: the concept of activity, the organization of activities, and the concept of teaching resources. The study was conducted with five teams of prospective teachers, who were participating in teacher education courses of a constructivist orientation. The results showed very different itineraries in the processes of change, and the presence of two major obstacles—the belief that teaching is the direct cause of learning, and epistemological absolutism. The study allows us to deduce some implications for initial teacher education.

  2. Cognitive Activity-based Design Methodology for Novice Visual Communication Designers

    ERIC Educational Resources Information Center

    Kim, Hyunjung; Lee, Hyunju

    2016-01-01

    The notion of design thinking is becoming more concrete nowadays, as design researchers and practitioners study the thinking processes involved in design and employ the concept of design thinking to foster better solutions to complex and ill-defined problems. The goal of the present research is to develop a cognitive activity-based design…

  3. Cognitive Activity-based Design Methodology for Novice Visual Communication Designers

    ERIC Educational Resources Information Center

    Kim, Hyunjung; Lee, Hyunju

    2016-01-01

    The notion of design thinking is becoming more concrete nowadays, as design researchers and practitioners study the thinking processes involved in design and employ the concept of design thinking to foster better solutions to complex and ill-defined problems. The goal of the present research is to develop a cognitive activity-based design…

  4. Turbofan engine control system design using the LQG/LTR methodology

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay

    1989-01-01

    Application of the Linear-Quadratic-Gaussian with Loop-Transfer-Recovery methodology to design of a control system for a simplified turbofan engine model is considered. The importance of properly scaling the plant to achieve the desired Target-Feedback-Loop is emphasized. The steps involved in the application of the methodology are discussed via an example, and evaluation results are presented for a reduced-order compensator. The effect of scaling the plant on the stability robustness evaluation of the closed-loop system is studied in detail.
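
    The sketch below illustrates the recovery step of the LQG/LTR procedure on a toy two-state plant: a Kalman filter defines the target feedback loop, and the regulator gain is recomputed with a progressively cheaper control weight so the compensated loop approaches the target. The plant, noise intensities and weights are illustrative and are not the turbofan engine model of the report.

      # LQG/LTR recovery sketch on a toy two-state plant. The Kalman filter sets the
      # target loop; the regulator gain is recomputed with R = rho*I as rho -> 0.
      # All matrices are illustrative, not the simplified turbofan model.
      import numpy as np
      from scipy.linalg import solve_continuous_are

      A = np.array([[-1.0, 1.0],
                    [ 0.0, -2.0]])
      B = np.array([[0.0],
                    [1.0]])
      C = np.array([[1.0, 0.0]])

      # Target loop: Kalman filter gain from the filter Riccati equation (dual problem)
      W = np.eye(2)                    # assumed process-noise intensity
      V = np.array([[0.1]])            # assumed measurement-noise intensity
      Pf = solve_continuous_are(A.T, C.T, W, V)
      Kf = Pf @ C.T @ np.linalg.inv(V)
      print("Kalman filter gain (target loop):", np.round(Kf.ravel(), 2))

      # Recovery: cheap-control LQR gains for decreasing rho
      Q = C.T @ C
      for rho in (1.0, 1e-2, 1e-4):
          R = rho * np.eye(1)
          P = solve_continuous_are(A, B, Q, R)
          Kc = np.linalg.inv(R) @ B.T @ P
          print(f"rho = {rho:g}  regulator gain = {np.round(Kc.ravel(), 2)}")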

  5. Novel thermal management system design methodology for power lithium-ion battery

    NASA Astrophysics Data System (ADS)

    Nieto, Nerea; Díaz, Luis; Gastelurrutia, Jon; Blanco, Francisco; Ramos, Juan Carlos; Rivas, Alejandro

    2014-12-01

    Battery packs made up of large-format lithium-ion cells are increasingly being adopted in hybrid and pure electric vehicles in order to use the energy more efficiently and for a better environmental performance. Safety and cycle life are two of the main concerns regarding this technology, which are closely related to the cell's operating behavior and temperature asymmetries in the system. Therefore, the temperature of the cells in battery packs needs to be controlled by thermal management systems (TMSs). In the present paper an improved design methodology for developing TMSs is proposed. This methodology involves the development of different mathematical models for heat generation, transmission, and dissipation and their coupling and integration in the battery pack product design methodology in order to improve the overall safety and performance. The methodology is validated by comparing simulation results with laboratory measurements on a single module of the battery pack designed at IK4-IKERLAN for a traction application. The maximum difference between model predictions and experimental temperature data is 2 °C. The models developed have shown potential for use in battery thermal management studies for EV/HEV applications since they allow for scalability with accuracy and reasonable simulation time.
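
    At its simplest, the heat-generation and heat-dissipation models mentioned above reduce to a lumped energy balance per cell; the sketch below integrates such a balance (Joule heating against convective cooling) with explicit Euler, using parameter values that are assumed for illustration rather than taken from the IK4-IKERLAN module.

      # Sketch of the simplest heat-generation / heat-dissipation balance behind a
      # thermal management model: a single lumped cell with Joule heating and
      # convective cooling, integrated with explicit Euler. All parameter values are
      # assumed for illustration and are not taken from the cited battery pack.
      m_c = 1.0 * 900.0        # cell mass (kg) x specific heat (J/(kg K))
      R_int = 0.002            # internal resistance, ohm
      h_A = 0.8                # convective conductance h*A, W/K
      T_amb = 25.0             # ambient / coolant temperature, C

      T = 25.0                 # initial cell temperature, C
      dt = 1.0                 # time step, s
      for step in range(3600):                       # one hour of operation
          current = 100.0 if step < 1800 else 20.0   # high-rate phase, then low-rate
          q_gen = current**2 * R_int                 # Joule heating, W
          q_out = h_A * (T - T_amb)                  # convective dissipation, W
          T += dt * (q_gen - q_out) / m_c
      print(f"cell temperature after 1 h: {T:.1f} C")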

  6. Aero-Mechanical Design Methodology for Subsonic Civil Transport High-Lift Systems

    NASA Technical Reports Server (NTRS)

    vanDam, C. P.; Shaw, S. G.; VanderKam, J. C.; Brodeur, R. R.; Rudolph, P. K. C.; Kinney, D.

    2000-01-01

    In today's highly competitive and economically driven commercial aviation market, the trend is to make aircraft systems simpler and to shorten their design cycle which reduces recurring, non-recurring and operating costs. One such system is the high-lift system. A methodology has been developed which merges aerodynamic data with kinematic analysis of the trailing-edge flap mechanism with minimum mechanism definition required. This methodology provides quick and accurate aerodynamic performance prediction for a given flap deployment mechanism early on in the high-lift system preliminary design stage. Sample analysis results for four different deployment mechanisms are presented as well as descriptions of the aerodynamic and mechanism data required for evaluation. Extensions to interactive design capabilities are also discussed.

  7. Application of Design Methodologies for Feedback Compensation Associated with Linear Systems

    NASA Technical Reports Server (NTRS)

    Smith, Monty J.

    1996-01-01

    The work that follows is concerned with the application of design methodologies for feedback compensation associated with linear systems. In general, the intent is to provide a well behaved closed loop system in terms of stability and robustness (internal signals remain bounded with a certain amount of uncertainty) and simultaneously achieve an acceptable level of performance. The approach here has been to convert the closed loop system and control synthesis problem into the interpolation setting. The interpolation formulation then serves as our mathematical representation of the design process. Lifting techniques have been used to solve the corresponding interpolation and control synthesis problems. Several applications using this multiobjective design methodology have been included to show the effectiveness of these techniques. In particular, the mixed H2/H∞ performance criterion and its associated algorithm have been used on several examples, including an F-18 HARV (High Angle of Attack Research Vehicle), for sensitivity performance.

  8. Aero-Mechanical Design Methodology for Subsonic Civil Transport High-Lift Systems

    NASA Technical Reports Server (NTRS)

    vanDam, C. P.; Shaw, S. G.; VanderKam, J. C.; Brodeur, R. R.; Rudolph, P. K. C.; Kinney, D.

    2000-01-01

    In today's highly competitive and economically driven commercial aviation market, the trend is to make aircraft systems simpler and to shorten their design cycle which reduces recurring, non-recurring and operating costs. One such system is the high-lift system. A methodology has been developed which merges aerodynamic data with kinematic analysis of the trailing-edge flap mechanism with minimum mechanism definition required. This methodology provides quick and accurate aerodynamic performance prediction for a given flap deployment mechanism early on in the high-lift system preliminary design stage. Sample analysis results for four different deployment mechanisms are presented as well as descriptions of the aerodynamic and mechanism data required for evaluation. Extensions to interactive design capabilities are also discussed.

  9. Design Methodology of a Dual-Halbach Array Linear Actuator with Thermal-Electromagnetic Coupling.

    PubMed

    Eckert, Paulo Roberto; Flores Filho, Aly Ferreira; Perondi, Eduardo; Ferri, Jeferson; Goltz, Evandro

    2016-03-11

    This paper proposes a design methodology for linear actuators, considering thermal and electromagnetic coupling with geometrical and temperature constraints, that maximizes force density and minimizes force ripple. The method allows defining an actuator for given specifications in a step-by-step way so that requirements are met and the temperature within the device is kept at or below the maximum allowed for continuous operation. According to the proposed method, the electromagnetic and thermal models are built with quasi-static parametric finite element models. The methodology was successfully applied to the design of a linear cylindrical actuator with a dual quasi-Halbach array of permanent magnets and a moving-coil. The actuator can produce an axial force of 120 N and a stroke of 80 mm. The paper also presents a comparative analysis between results obtained considering only an electromagnetic model and the thermal-electromagnetic coupled model. This comparison shows that the final designs for both cases differ significantly, especially regarding their active volume and their electrical and magnetic loading. Although in this paper the methodology was employed to design a specific actuator, its structure can be used to design a wide range of linear devices if the parametric models are adjusted for each particular actuator.

  10. Design Methodology of a Dual-Halbach Array Linear Actuator with Thermal-Electromagnetic Coupling

    PubMed Central

    Eckert, Paulo Roberto; Flores Filho, Aly Ferreira; Perondi, Eduardo; Ferri, Jeferson; Goltz, Evandro

    2016-01-01

    This paper proposes a design methodology for linear actuators that considers thermal and electromagnetic coupling with geometrical and temperature constraints, maximizing force density and minimizing force ripple. The method defines an actuator for given specifications in a step-by-step way, so that requirements are met and the temperature within the device is kept at or below the maximum allowed for continuous operation. In the proposed method, the electromagnetic and thermal models are built with quasi-static parametric finite element models. The methodology was successfully applied to the design of a linear cylindrical actuator with a dual quasi-Halbach array of permanent magnets and a moving coil. The actuator can produce an axial force of 120 N over a stroke of 80 mm. The paper also presents a comparative analysis between the results obtained with an electromagnetic model alone and with the coupled thermal-electromagnetic model. This comparison shows that the two final designs differ significantly, especially in their active volume and their electrical and magnetic loading. Although the methodology is employed here to design a specific actuator, its structure can be used to design a wide range of linear devices if the parametric models are adjusted for each particular actuator. PMID:26978370

  11. A Robust Design Methodology for Optimal Microscale Secondary Flow Control in Compact Inlet Diffusers

    NASA Technical Reports Server (NTRS)

    Anderson, Bernhard H.; Keller, Dennis J.

    2001-01-01

    It is the purpose of this study to develop an economical Robust Design methodology for microscale secondary flow control in compact inlet diffusers. To illustrate the potential of economical Robust Design methodology, two different mission strategies were considered for the subject inlet, namely Maximum Performance and Maximum HCF Life Expectancy. The Maximum Performance mission maximized total pressure recovery while the Maximum HCF Life Expectancy mission minimized the mean of the first five Fourier harmonic amplitudes, i.e., 'collectively' reduced all the harmonic half-amplitudes of engine face distortion. Each of the mission strategies was subject to a low engine face distortion constraint, i.e., DC60<0.10, which is a level acceptable for commercial engines. For each of these mission strategies, an 'Optimal Robust' (open loop control) and an 'Optimal Adaptive' (closed loop control) installation was designed over a twenty degree angle-of-incidence range. The Optimal Robust installation used economical Robust Design methodology to arrive at a single design which operated over the entire angle-of-incidence range (open loop control). The Optimal Adaptive installation optimized all the design parameters at each angle-of-incidence. Thus, the Optimal Adaptive installation would require a closed loop control system to sense a proper signal for each effector and modify that effector device, whether mechanical or fluidic, for optimal inlet performance. In general, the performance differences between the Optimal Adaptive and Optimal Robust installation designs were found to be marginal. This suggests that Optimal Robust open loop installation designs can be very competitive with Optimal Adaptive closed loop designs. Secondary flow control in inlets is inherently robust, provided it is optimally designed. Therefore, the new methodology presented in this paper, a combined-array 'Lower Order' approach to Robust DOE, offers the aerodynamicist a very viable and
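
    A minimal sketch of the 'Maximum HCF Life Expectancy' objective as described, i.e., the mean of the first five Fourier harmonic amplitudes of a circumferential engine-face distortion pattern; the pressure data below are synthetic, not from the study:

    ```python
    import numpy as np

    # Synthetic circumferential total-pressure pattern at the engine face
    # (hypothetical values at 72 equally spaced rake positions).
    theta = np.linspace(0.0, 2.0 * np.pi, 72, endpoint=False)
    pt = 1.0 - 0.03 * np.cos(theta) - 0.015 * np.cos(2 * theta + 0.4) \
             + 0.002 * np.random.default_rng(0).standard_normal(theta.size)

    # Fourier harmonic amplitudes of the distortion pattern.
    spectrum = np.fft.rfft(pt - pt.mean())
    amplitudes = 2.0 * np.abs(spectrum) / pt.size

    # Objective: mean of the first five harmonic amplitudes.
    hcf_objective = amplitudes[1:6].mean()
    print("mean of first five harmonic amplitudes: %.4f" % hcf_objective)
    ```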

  12. Applying Item Response Theory Methods to Design a Learning Progression-Based Science Assessment

    ERIC Educational Resources Information Center

    Chen, Jing

    2012-01-01

    Learning progressions are used to describe how students' understanding of a topic progresses over time and to classify the progress of students into steps or levels. This study applies Item Response Theory (IRT) based methods to investigate how to design learning progression-based science assessments. The research questions of this study are: (1)…
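
    Although the abstract is truncated, the IRT machinery it refers to can be hedged with a minimal two-parameter logistic (2PL) sketch that estimates a student's ability and maps it to a progression level; the item parameters, responses, and cut scores below are hypothetical:

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    # Hypothetical 2PL item parameters: discrimination a and difficulty b.
    a = np.array([1.2, 0.8, 1.5, 1.0, 1.7])
    b = np.array([-1.0, -0.2, 0.3, 0.8, 1.5])
    responses = np.array([1, 1, 1, 0, 0])     # one student's scored responses

    def neg_log_likelihood(theta):
        p = 1.0 / (1.0 + np.exp(-a * (theta - b)))   # 2PL response probability
        return -np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p))

    theta_hat = minimize_scalar(neg_log_likelihood, bounds=(-4, 4), method="bounded").x

    # Map the ability estimate to a learning-progression level via assumed cut scores.
    cuts = [-1.0, 0.0, 1.0]
    level = sum(theta_hat > c for c in cuts) + 1
    print("theta = %.2f -> progression level %d" % (theta_hat, level))
    ```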

  13. Applying Item Response Theory Methods to Design a Learning Progression-Based Science Assessment

    ERIC Educational Resources Information Center

    Chen, Jing

    2012-01-01

    Learning progressions are used to describe how students' understanding of a topic progresses over time and to classify the progress of students into steps or levels. This study applies Item Response Theory (IRT) based methods to investigate how to design learning progression-based science assessments. The research questions of this study are: (1)…

  14. New methods, new methodology: Advanced CFD in the Snecma turbomachinery design process

    NASA Astrophysics Data System (ADS)

    Vuillez, Christophe; Petot, Bertrand

    1994-05-01

    CFD tools represent a significant source of improvements in the design process of turbomachinery components, leading to higher performance, cost and cycle savings, as well as lower associated risks. Such methods are the backbone of compressor and turbine design methodologies at Snecma. In the 1980s, the use of 3D Euler solvers was a key factor in designing fan blades with a very high performance level. Counter-rotating high-speed propellers designed with this methodology reached measured performances very close to their ambitious objective from the first test series. In the late 1980s and early 1990s, new, more powerful methods were rapidly developed and are now commonly used in the design process: a quasi-3D, compressible, transonic inverse method; quasi-3D and 3D Navier-Stokes solvers; 3D unsteady Euler solvers. As an example, several hundred 3D Navier-Stokes computations are run yearly for the design of low and high pressure compressor and turbine blades. In addition to their modelling capabilities, the efficient use of such methods in the design process comes from their close integration in the global methodology and from an adequate exploitation environment. Their validation, their calibration, and the correlations between different levels of modelling are of critical importance to an actual improvement in design know-how. The integration of different methods in the design process is described. Several examples of application illustrate their practical utilization. Comparisons between computational results and test results show their capabilities as well as their present limitations. The prospects linked to new developments currently under way are discussed.

  15. A user-centred methodology for designing an online social network to motivate health behaviour change.

    PubMed

    Kamal, Noreen; Fels, Sidney

    2013-01-01

    Positive health behaviour is critical to preventing illness and managing chronic conditions. A user-centred methodology was employed to design an online social network to motivate health behaviour change. The methodology was augmented by the Appeal, Belonging, Commitment (ABC) Framework, which is based on theoretical models of health behaviour change and of online social network use. The user-centred methodology included four phases: 1) an initial user inquiry on health behaviour and use of online social networks; 2) interview feedback on paper prototypes; 3) a laboratory study of a medium-fidelity prototype; and 4) a field study of the high-fidelity prototype. The points of inquiry through these phases were based on the ABC Framework. This yielded an online social network system, linked to external third-party databases, that was deployed to users via an interactive website.

  16. Automated Methodologies for the Design of Flow Diagrams for Development and Maintenance Activities

    NASA Astrophysics Data System (ADS)

    Shivanand M., Handigund; Shweta, Bhat

    The Software Requirements Specification (SRS) of an organization is a text document prepared by strategic management that incorporates the requirements of the organization. The requirements of the ongoing business/project development process involve software tools, hardware devices, manual procedures, application programs and communication commands. These components are appropriately ordered to achieve the mission of the process concerned, both in project development and in ongoing business processes, in different flow diagrams, viz. the activity chart, workflow diagram, activity diagram, component diagram and deployment diagram. This paper proposes two generic, automatic methodologies for the design of the various flow diagrams of (i) project development activities and (ii) the ongoing business process. The methodologies also resolve the ensuing deadlocks in the flow diagrams and determine the critical paths for the activity chart. Though the two methodologies are independent, each complements the other in authenticating its correctness and completeness.

  17. Experiences with a prevalidation methodology for designing integrated/propulsion control system architectures

    NASA Technical Reports Server (NTRS)

    Cohen, Gerald C.; Lee, Charles W.; Strickland, Michael J.; Palumbo, Daniel L.

    1989-01-01

    This paper describes a validation methodology and supporting analytical tools developed to provide system designers with a capability of selecting, in the early stages of development, candidate architectures for an integrated airframe/propulsion control system and of predicting their reliability and performance. The results of an application of this methodology to an integrated flight and propulsion control system demonstrated that firm system requirements are being established early in the system life cycle and that an early evaluation exposes missing and conflicting specifications. It is shown that fundamental improvements and refinements can be made early in the concept life cycle when the potential for increased performance is high and the cost and schedule impacts from changes are relatively low. The application of this methodology reduces technical risks associated with integrated system concepts incorporating new technologies.

  18. Using CFD as Rocket Injector Design Tool: Recent Progress at Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Tucker, Kevin; West, Jeff; Williams, Robert; Lin, Jeff; Rocker, Marvin; Canabal, Francisco; Robles, Bryan; Garcia, Robert; Chenoweth, James

    2003-01-01

    The choice of tools used for injector design is in a transitional phase between exclusive reliance on the empirically based correlations and extensive use of computational fluid dynamics (CFD). The Next Generation Launch Technology (NGLT) Program goals emphasizing lower costs and increased reliability have produced a need to enable CFD as an injector design tool in a shorter time frame. This is the primary objective of the Staged Combustor Injector Technology Task currently under way at Marshall Space Flight Center (MSFC). The documentation of this effort begins with a very brief status of current injector design tools. MSFC's vision for use of CFD as a tool for combustion devices design is stated and discussed with emphasis on the injector. The concept of the Simulation Readiness Level (SRL), comprised of solution fidelity, robustness and accuracy, is introduced and discussed. This quantitative measurement is used to establish the gap between the current state of demonstrated capability and that necessary for regular use in the design process. MSFC's view of the validation process is presented and issues associated with obtaining the necessary data are noted and discussed. Three current experimental efforts aimed at generating validation data are presented. The importance of uncertainty analysis to understand the data quality is also demonstrated. First, a brief status of current injector design tools is provided as context for the current effort. Next, the MSFC vision for using CFD as an injector design tool is stated. A generic CFD-based injector design methodology is also outlined and briefly discussed. Three areas where MSFC is using injector CFD analyses for program support will be discussed. These include the Integrated Powerhead Development (IPD) engine which uses hydrogen and oxygen propellants in a full flow staged combustion (FFSC) cycle and the TR-107 and the RS84 engine both of which use RP-1 and oxygen in an ORSC cycle. Finally, an attempt is made to

  19. Progress towards an Optimization Methodology for Combustion-Driven Portable Thermoelectric Power Generation Systems

    SciTech Connect

    Krishnan, Shankar; Karri, Naveen K.; Gogna, Pawan K.; Chase, Jordan R.; Fleurial, Jean-Pierre; Hendricks, Terry J.

    2012-03-13

    Enormous military and commercial interest exists in developing quiet, lightweight, and compact thermoelectric (TE) power generation systems. This paper investigates the design integration and analysis of an advanced TE power generation system implementing JP-8 fueled combustion and thermal recuperation. In the design and development of a portable TE power system using a JP-8 combustor as a high-temperature heat source, optimal process flows depend on efficient heat generation, transfer, and recovery within the system. Design optimization of the system required considering the combustion system efficiency and TE conversion efficiency simultaneously. The combustor performance and TE sub-system performance were coupled directly through exhaust temperatures, fuel and air mass flow rates, heat exchanger performance, subsequent hot-side temperatures, and cold-side cooling techniques and temperatures. Systematic investigation of this system relied on accurate thermodynamic modeling of complex, high-temperature combustion processes concomitantly with detailed thermoelectric converter thermal/mechanical modeling. To this end, this work reports on the integration of system-level process flow simulations using the commercial software CHEMCAD™ with in-house thermoelectric converter and module optimization, and heat exchanger analyses using COMSOL™ software. High-performance, high-temperature TE materials and segmented TE element designs are incorporated in coupled design analyses to achieve predicted TE subsystem-level conversion efficiencies exceeding 10%. These TE advances are integrated with a high-performance microtechnology combustion reactor based on recent advances at the Pacific Northwest National Laboratory (PNNL). Predictions from this coupled simulation established a basis for optimal selection of fuel and air flow rates, thermoelectric module design and operating conditions, and microtechnology heat-exchanger design criteria. This paper will discuss this

  20. Designing Progressive and Interactive Analytics Processes for High-Dimensional Data Analysis.

    PubMed

    Turkay, Cagatay; Kaya, Erdem; Balcisoy, Selim; Hauser, Helwig

    2017-01-01

    In interactive data analysis processes, the dialogue between the human and the computer is the enabling mechanism that can lead to actionable observations about the phenomena being investigated. It is of paramount importance that this dialogue is not interrupted by slow computational mechanisms that ignore known temporal characteristics of human-computer interaction and the perceptual and cognitive capabilities of users. In cases where the analysis involves an integrated computational method, for instance to reduce the dimensionality of the data or to perform clustering, such interruptions are likely. To remedy this, progressive computations, in which results are iteratively improved, are receiving increasing interest in visual analytics. In this paper, we present techniques and design considerations for incorporating progressive methods within interactive analysis processes that involve high-dimensional data. We define methodologies to facilitate processes that adhere to the perceptual characteristics of users and describe how online algorithms can be incorporated within these. A set of design recommendations, together with methods to support analysts in accomplishing high-dimensional data analysis tasks, is then presented. Our arguments and decisions are informed by observations gathered over a series of analysis sessions with analysts from finance. We document observations and recommendations from this study and present evidence on how our approach contributes to the efficiency and productivity of interactive visual analysis sessions involving high-dimensional data.
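
    A minimal sketch of a progressive dimensionality-reduction step of the kind discussed here, assuming scikit-learn is available; the data are synthetic, and the chunked loop stands in for an online algorithm that yields intermediate results between user interactions:

    ```python
    import numpy as np
    from sklearn.decomposition import IncrementalPCA

    # Synthetic high-dimensional data standing in for an analyst's table.
    rng = np.random.default_rng(1)
    X = rng.standard_normal((10_000, 50))

    def progressive_projection(X, chunk_size=1_000):
        """Yield an improved 2-D projection after each processed chunk."""
        ipca = IncrementalPCA(n_components=2)
        for start in range(0, X.shape[0], chunk_size):
            ipca.partial_fit(X[start:start + chunk_size])
            # Intermediate result: project everything seen so far, so the view
            # can refresh without waiting for the full computation to finish.
            yield ipca.transform(X[:start + chunk_size])

    for i, projection in enumerate(progressive_projection(X), start=1):
        print("update %d: projection shape %s" % (i, projection.shape))
    ```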

  1. Methodology to design a municipal solid waste pre-collection system. A case study

    SciTech Connect

    Gallardo, A. Carlos, M. Peris, M. Colomer, F.J.

    2015-02-15

    Highlights: • MSW recovery starts at homes; therefore it is important to facilitate it for people. • Additionally, to optimize MSW collection, the pre-collection stage must be planned beforehand. • A methodology to organize pre-collection considering several factors is presented. • The methodology has been verified by applying it to a mid-sized Spanish town. - Abstract: Municipal solid waste (MSW) management is an important task that local governments as well as private companies must take into account to protect human health and the environment and to preserve natural resources. To design an adequate MSW management plan, the first step is to define the waste generation and composition patterns of the town. As these patterns depend on several socio-economic factors, it is advisable to organize this information in advance. Moreover, the waste generation and composition patterns may vary across the town and over time. Generally, the data are not homogeneous across the city, as neither the number of inhabitants nor the economic activity is constant. Therefore, if all the information is shown in thematic maps, the final waste management decisions can be made more efficiently. The main aim of this paper is to present a structured methodology that allows local authorities or private companies who deal with MSW to design their own MSW management plan depending on the available data. According to these data, this paper proposes two ways of action: a direct way when detailed data are available and an indirect way when there is a lack of data and it is necessary to rely on bibliographic data. In any case, the amount of information needed is considerable. This paper combines the planning methodology with Geographic Information Systems to present the final results in thematic maps that make them easier to interpret. The proposed methodology is a useful preliminary tool for organizing MSW collection routes, including selective collection. To verify the methodology it has

  2. Impact of User-Centered Design Methodology on the Design of Information Systems.

    ERIC Educational Resources Information Center

    Sugar, William A.

    1995-01-01

    Examines the implications of incorporating user-centered design within information systems design practices. Highlights include a definition of user-centered design based on human-computer interface; questions asked about users, including outcome, process, and task variables; and three criteria for when to use this approach in information systems…

  3. Development of an aggregation methodology for risk analysis in aerospace conceptual vehicle design

    NASA Astrophysics Data System (ADS)

    Chytka, Trina Marsh

    2003-10-01

    The growing complexity of technical systems has emphasized a need to gather as much information as possible regarding specific systems of interest in order to make robust, sound decisions about their design and deployment. Acquiring as much data as possible requires the use of empirical statistics, historical information and expert opinion. In much of the aerospace conceptual design environment, the lack of historical information and infeasibility of gathering empirical data relegates the data collection to expert opinion. The conceptual design of a space vehicle requires input from several disciplines (weights and sizing, operations, trajectory, etc.). In this multidisciplinary environment, the design variables are often not easily quantified and have a high degree of uncertainty associated with their values. Decision-makers must rely on expert assessments of the uncertainty associated with the design variables to evaluate the risk level of a conceptual design. Since multiple experts are often queried for their evaluation of uncertainty, a means to combine/aggregate multiple expert assessments must be developed. Providing decision-makers with a solitary assessment that captures the consensus of the multiple experts would greatly enhance the ability to evaluate risk associated with a conceptual design. The objective of this research has been to develop an aggregation methodology that efficiently combines the uncertainty assessments of multiple experts in multiple disciplines involved in aerospace conceptual design. Bayesian probability augmented by uncertainty modeling and expert calibration was employed in the methodology construction. Appropriate questionnaire techniques were used to acquire expert opinion; the responses served as input distributions to the aggregation algorithm. The derived techniques were applied as part of a larger expert assessment elicitation and calibration study. Results of this research demonstrate that aggregation of

  4. Methodology for the Design of Streamline-Traced External-Compression Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Slater, John W.

    2014-01-01

    A design methodology based on streamline-tracing is discussed for the design of external-compression, supersonic inlets for flight below Mach 2.0. The methodology establishes a supersonic compression surface and capture cross-section by tracing streamlines through an axisymmetric Busemann flowfield. The compression system of shock and Mach waves is altered through modifications to the leading edge and shoulder of the compression surface. An external terminal shock is established to create subsonic flow which is diffused in the subsonic diffuser. The design methodology was implemented into the SUPIN inlet design tool. SUPIN uses specified design factors to design the inlets and computes the inlet performance, which includes the flow rates, total pressure recovery, and wave drag. A design study was conducted using SUPIN and the Wind-US computational fluid dynamics code to design and analyze the properties of two streamline-traced, external-compression (STEX) supersonic inlets for Mach 1.6 freestream conditions. The STEX inlets were compared to axisymmetric pitot, two-dimensional, and axisymmetric spike inlets. The STEX inlets had slightly lower total pressure recovery and higher levels of total pressure distortion than the axisymmetric spike inlet. The cowl wave drag coefficients of the STEX inlets were 20% of those for the axisymmetric spike inlet. The STEX inlets had external sound pressures that were 37% of those of the axisymmetric spike inlet, which may result in lower adverse sonic boom characteristics. The flexibility of the shape of the capture cross-section may result in benefits for the integration of STEX inlets with aircraft.
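
    A minimal sketch of the streamline-tracing step itself (the Busemann flowfield requires solving the Taylor-Maccoll equations and is beyond a short example, so a hypothetical axisymmetric velocity field stands in for it here; this is not the SUPIN implementation):

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Hypothetical axisymmetric velocity field (x: axial, r: radial),
    # a stand-in for the Busemann flowfield used by the actual methodology.
    def velocity(x, r):
        u = 1.0 + 0.1 * x                 # axial component
        v = -0.2 * r / (1.0 + x)          # radial component (flow turning inward)
        return u, v

    def streamline_rhs(t, y):
        x, r = y
        u, v = velocity(x, r)
        return [u, v]

    # Trace streamlines from several radial stations on the capture cross-section.
    for r0 in (0.4, 0.7, 1.0):
        sol = solve_ivp(streamline_rhs, (0.0, 5.0), [0.0, r0], max_step=0.05)
        print("r0 = %.1f -> final (x, r) = (%.2f, %.3f)" % (r0, sol.y[0, -1], sol.y[1, -1]))
    ```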

  5. A knowledge management methodology for the integrated assessment of WWTP configurations during conceptual design.

    PubMed

    Garrido-Baserba, M; Reif, R; Rodriguez-Roda, I; Poch, M

    2012-01-01

    The complexity involved in wastewater management projects is growing as the twenty-first century sets new challenges, leading towards more integrated plant design. In this context, the growing number of innovative technologies, stricter legislation and the development of new methodological approaches make it difficult to design appropriate flow schemes for new wastewater projects. Thus, new tools are needed for wastewater treatment plant (WWTP) conceptual design using integrated assessment methods, in order to include different types of objectives at the same time, i.e., environmental, economic, technical, and legal. Previous work used the decision support system (DSS) methodology to handle the specific issues related to wastewater management, for example, the design of treatment facilities for small communities. However, tools that address the whole treatment process independently of the plant size and are capable of integrating knowledge from many different areas, including both conventional and innovative technologies, are not available. Therefore, the aim of this paper is to present and describe an innovative knowledge-based methodology that handles the conceptual design of WWTP process flow-diagrams (PFDs), satisfying a vast number of different criteria. This global approach is based on a hierarchy of decisions that uses the information contained in knowledge bases (KBs) with the aim of automating the generation of suitable WWTP configurations for a specific scenario. Expert interviews, legislation, specialized literature and engineering experience have been integrated within the different KBs, which indeed constitute one of the main highlights of this work. The methodology is therefore presented as a valuable tool which provides a customized PFD for each specific case, taking into account process unit interactions and the user-specified requirements and objectives.

  6. Bayesian cross-entropy methodology for optimal design of validation experiments

    NASA Astrophysics Data System (ADS)

    Jiang, X.; Mahadevan, S.

    2006-07-01

    An important concern in the design of validation experiments is how to incorporate the mathematical model in the design in order to allow conclusive comparisons of model prediction with experimental output in model assessment. The classical experimental design methods are more suitable for phenomena discovery and may result in a subjective, expensive, time-consuming and ineffective design that may adversely impact these comparisons. In this paper, an integrated Bayesian cross-entropy methodology is proposed to perform the optimal design of validation experiments incorporating the computational model. The expected cross entropy, an information-theoretic distance between the distributions of model prediction and experimental observation, is defined as a utility function to measure the similarity of two distributions. A simulated annealing algorithm is used to find optimal values of input variables through minimizing or maximizing the expected cross entropy. The measured data after testing with the optimum input values are used to update the distribution of the experimental output using Bayes theorem. The procedure is repeated to adaptively design the required number of experiments for model assessment, each time ensuring that the experiment provides effective comparison for validation. The methodology is illustrated for the optimal design of validation experiments for a three-leg bolted joint structure and a composite helicopter rotor hub component.
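
    A minimal sketch of the optimization step described here, assuming the model prediction and experimental output can be summarized as Gaussians parameterized by a single design input; the functions and bounds are hypothetical, and SciPy's dual_annealing stands in for the paper's simulated annealing algorithm:

    ```python
    import numpy as np
    from scipy.optimize import dual_annealing

    # Hypothetical Gaussian summaries of model prediction and experimental
    # observation as functions of a single design input x (e.g., a load level).
    def model_prediction(x):
        return 2.0 + 0.5 * x, 0.3          # mean, std of model output

    def experiment_observation(x):
        return 2.2 + 0.45 * x, 0.4         # mean, std of measured output

    def expected_cross_entropy(x):
        x = float(x[0])
        mu_p, s_p = model_prediction(x)
        mu_q, s_q = experiment_observation(x)
        # Cross entropy H(p, q) between two Gaussian distributions.
        return 0.5 * np.log(2.0 * np.pi * s_q**2) + (s_p**2 + (mu_p - mu_q)**2) / (2.0 * s_q**2)

    # Simulated annealing over the admissible input range (minimizing here;
    # maximizing would use the negated objective).
    result = dual_annealing(expected_cross_entropy, bounds=[(0.0, 10.0)])
    print("selected test input x = %.3f" % result.x[0])
    ```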

  7. Prognostics and health management design for rotary machinery systems—Reviews, methodology and applications

    NASA Astrophysics Data System (ADS)

    Lee, Jay; Wu, Fangji; Zhao, Wenyu; Ghaffari, Masoud; Liao, Linxia; Siegel, David

    2014-01-01

    Much research has been conducted in prognostics and health management (PHM), an emerging field in mechanical engineering that is gaining interest from both academia and industry. Most of these efforts have been in the area of machinery PHM, resulting in the development of many algorithms for this particular application. The majority of these algorithms concentrate on applications involving common rotary machinery components, such as bearings and gears. Knowledge of this prior work is a necessity for any future research efforts to be conducted; however, there has not been a comprehensive overview that details previous and on-going efforts in PHM. In addition, a systematic method for developing and deploying a PHM system has yet to be established. Such a method would enable rapid customization and integration of PHM systems for diverse applications. To address these gaps, this paper provides a comprehensive review of the PHM field, followed by an introduction of a systematic PHM design methodology, 5S methodology, for converting data to prognostics information. This methodology includes procedures for identifying critical components, as well as tools for selecting the most appropriate algorithms for specific applications. Visualization tools are presented for displaying prognostics information in an appropriate fashion for quick and accurate decision making. Industrial case studies are included in this paper to show how this methodology can help in the design of an effective PHM system.

  8. Incorporating Army Design Methodology into Army Operations: Barriers and Recommendations for Facilitating Integration

    DTIC Science & Technology

    2012-03-01

    mission. A critical thinker would not do that. (Former Brigade Commander) The beauty of Design is that it’s not hierarchical; it’s not reductionist...progress and indicated that what we were doing was making a difference (like a wedding dress store opening, which was a joint business between Sunnis

  9. Progression in Learning about "The Nature of Science": Issues of Conceptualisation and Methodology.

    ERIC Educational Resources Information Center

    Leach, John; And Others

    Recently, it was proposed that a curricular aim of science education should be to engender an understanding of the nature of the scientific enterprise among students, as well as a knowledge of the technical contents of science. Seven diagnostic instruments were designed and administered to students (between the ages of 9 and 16) in an effort to…

  10. Progression in Learning about "The Nature of Science": Issues of Conceptualisation and Methodology.

    ERIC Educational Resources Information Center

    Leach, John; And Others

    Recently, it was proposed that a curricular aim of science education should be to engender an understanding of the nature of the scientific enterprise among students, as well as a knowledge of the technical contents of science. Seven diagnostic instruments were designed and administered to students (between the ages of 9 and 16) in an effort to…

  11. Integrating uniform design and response surface methodology to optimize thiacloprid suspension

    PubMed Central

    Li, Bei-xing; Wang, Wei-chang; Zhang, Xian-peng; Zhang, Da-xia; Mu, Wei; Liu, Feng

    2017-01-01

    A model 25% suspension concentrate (SC) of thiacloprid was adopted to evaluate an integrative approach of uniform design and response surface methodology. Tersperse2700, PE1601, xanthan gum and veegum were the four experimental factors, and the aqueous separation ratio and viscosity were the two dependent variables. Linear and quadratic polynomial models of stepwise regression and partial least squares were adopted to test the fit of the experimental data. Verification tests revealed satisfactory agreement between the experimental and predicted data. The measured values for the aqueous separation ratio and viscosity were 3.45% and 278.8 mPa·s, respectively, and the relative errors of the predicted values were 9.57% and 2.65%, respectively (prepared under the proposed conditions). Comprehensive benefits could also be obtained by appropriately adjusting the amount of certain adjuvants based on practical requirements. Integrating uniform design and response surface methodology is an effective strategy for optimizing SC formulas. PMID:28383036
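
    A minimal sketch of the response-surface step, assuming a quadratic polynomial model fit by least squares; the factor levels, responses, and target value are illustrative, not data from the study:

    ```python
    import numpy as np

    # Illustrative DOE data: two factors (e.g., dispersant and thickener levels)
    # and one response (viscosity, mPa·s).
    x1 = np.array([1.0, 1.0, 2.0, 2.0, 1.5, 1.5, 1.0, 2.0, 1.5])
    x2 = np.array([0.1, 0.3, 0.1, 0.3, 0.2, 0.2, 0.2, 0.2, 0.1])
    y  = np.array([310., 260., 295., 250., 270., 272., 285., 268., 290.])

    # Quadratic response surface: y ~ b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
    X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)

    # Locate the grid point whose predicted response is closest to a target viscosity.
    g1, g2 = np.meshgrid(np.linspace(1.0, 2.0, 51), np.linspace(0.1, 0.3, 51))
    G = np.column_stack([np.ones(g1.size), g1.ravel(), g2.ravel(),
                         g1.ravel()**2, g2.ravel()**2, (g1 * g2).ravel()])
    pred = G @ coef
    best = np.argmin(np.abs(pred - 279.0))     # assumed target of 279 mPa·s
    print("suggested levels: x1 = %.2f, x2 = %.2f" % (g1.ravel()[best], g2.ravel()[best]))
    ```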

  12. Design methodology for multi-pumped discrete Raman amplifiers: case-study employing photonic crystal fibers.

    PubMed

    Castellani, C E S; Cani, S P N; Segatto, M E; Pontes, M J; Romero, M A

    2009-08-03

    This paper proposes a new design methodology for discrete multi-pumped Raman amplifiers. In a multi-objective optimization scenario, the whole solution space is first inspected with a CW analytical formulation. Then, the most promising solutions are fully investigated by a rigorous numerical treatment, and the Raman amplification performance is thus determined by the combination of analytical and numerical approaches. As an application of our methodology we designed a photonic crystal fiber Raman amplifier configuration which provides low ripple, high gain, clear eye opening and a low power penalty. The amplifier configuration also fully compensates the dispersion introduced by a 70-km singlemode fiber in a 10 Gbit/s system. We have successfully obtained a configuration with 8.5 dB average gain over the C-band and 0.71 dB ripple with almost zero eye-penalty using only two pump lasers with relatively low pump power.
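
    As a hedged reference for the analytical screening step (not necessarily the exact CW formulation used by the authors), the standard small-signal, undepleted-pump approximation gives the on-off Raman gain as

        G_{\text{on-off}} = \exp\!\left( \frac{g_R \, P_p \, L_{\text{eff}}}{A_{\text{eff}}} \right),
        \qquad L_{\text{eff}} = \frac{1 - e^{-\alpha_p L}}{\alpha_p},

    where g_R is the Raman gain coefficient, P_p the pump power, A_eff the effective mode area, and \alpha_p the fiber attenuation at the pump wavelength.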

  13. Integrating uniform design and response surface methodology to optimize thiacloprid suspension.

    PubMed

    Li, Bei-Xing; Wang, Wei-Chang; Zhang, Xian-Peng; Zhang, Da-Xia; Mu, Wei; Liu, Feng

    2017-04-06

    A model 25% suspension concentrate (SC) of thiacloprid was adopted to evaluate an integrative approach of uniform design and response surface methodology. Tersperse2700, PE1601, xanthan gum and veegum were the four experimental factors, and the aqueous separation ratio and viscosity were the two dependent variables. Linear and quadratic polynomial models of stepwise regression and partial least squares were adopted to test the fit of the experimental data. Verification tests revealed satisfactory agreement between the experimental and predicted data. The measured values for the aqueous separation ratio and viscosity were 3.45% and 278.8 mPa·s, respectively, and the relative errors of the predicted values were 9.57% and 2.65%, respectively (prepared under the proposed conditions). Comprehensive benefits could also be obtained by appropriately adjusting the amount of certain adjuvants based on practical requirements. Integrating uniform design and response surface methodology is an effective strategy for optimizing SC formulas.

  14. Designing and Implementing INTREPID, an Intensive Program in Translational Research Methodologies for New Investigators

    PubMed Central

    Aphinyanaphongs, Yindalon; Shao, Yongzhao; Micoli, Keith J.; Fang, Yixin; Goldberg, Judith D.; Galeano, Claudia R.; Stangel, Jessica H.; Chavis‐Keeling, Deborah; Hochman, Judith S.; Cronstein, Bruce N.; Pillinger, Michael H.

    2014-01-01

    Abstract Senior housestaff and junior faculty are often expected to perform clinical research, yet may not always have the requisite knowledge and skills to do so successfully. Formal degree programs provide such knowledge, but require a significant commitment of time and money. Short‐term training programs (days to weeks) provide alternative ways to accrue essential information and acquire fundamental methodological skills. Unfortunately, published information about short‐term programs is sparse. To encourage discussion and exchange of ideas regarding such programs, we here share our experience developing and implementing INtensive Training in Research Statistics, Ethics, and Protocol Informatics and Design (INTREPID), a 24‐day immersion training program in clinical research methodologies. Designing, planning, and offering INTREPID was feasible, and required significant faculty commitment, support personnel and infrastructure, as well as committed trainees. PMID:25066862

  15. Progress Towards an Optimization Methodology for Combustion-Driven Portable Thermoelectric Power Generation Systems

    NASA Astrophysics Data System (ADS)

    Krishnan, Shankar; Karri, Naveen K.; Gogna, Pawan K.; Chase, Jordan R.; Fleurial, Jean-Pierre; Hendricks, Terry J.

    2012-06-01

    There is enormous military and commercial interest in developing quiet, lightweight, and compact thermoelectric (TE) power generation systems. This paper investigates design integration and analysis of an advanced TE power generation system implementing JP-8 fueled combustion and thermal recuperation. In the design and development of this portable TE power system using a JP-8 combustor as a high-temperature heat source, optimal process flows depend on efficient heat generation, transfer, and recovery within the system. The combustor performance and TE subsystem performance were coupled directly through combustor exhaust temperatures, fuel and air mass flow rates, heat exchanger performance, subsequent hot-side temperatures, and cold-side cooling techniques and temperatures. Systematic investigation and design optimization of this TE power system relied on accurate thermodynamic modeling of complex, high-temperature combustion processes concomitantly with detailed TE converter thermal/mechanical modeling. To this end, this paper reports integration of system-level process flow simulations using CHEMCAD™ commercial software with in-house TE converter and module optimization, and heat exchanger analyses using COMSOL™ software. High-performance, high-temperature TE materials and segmented TE element designs are incorporated in coupled design analyses to achieve predicted TE subsystem-level conversion efficiencies exceeding 10%. These TE advances are integrated with a high-performance microtechnology combustion reactor based on recent advances at Pacific Northwest National Laboratory (PNNL). Predictions from this coupled simulation approach lead directly to system efficiency-power maps defining potentially available optimal system operating conditions and regimes. Further, it is shown that, for a given fuel flow rate, there exists a combination of recuperative effectiveness and hot-side heat exchanger effectiveness that provides a higher specific power output from
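
    As a hedged reference point for the quoted subsystem-level efficiencies, the textbook maximum conversion efficiency of a single thermoelectric couple (not the coupled combustor-TE system efficiency computed in the paper) is

        \eta_{\max} = \frac{T_h - T_c}{T_h} \cdot \frac{\sqrt{1 + Z\bar{T}} - 1}{\sqrt{1 + Z\bar{T}} + T_c / T_h},

    where T_h and T_c are the hot- and cold-side temperatures and Z\bar{T} is the figure of merit evaluated at the mean temperature.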

  16. HPCC Methodologies for Structural Design and Analysis on Parallel and Distributed Computing Platforms

    NASA Technical Reports Server (NTRS)

    Farhat, Charbel

    1998-01-01

    In this grant, we have proposed a three-year research effort focused on developing High Performance Computation and Communication (HPCC) methodologies for structural analysis on parallel processors and clusters of workstations, with emphasis on reducing the structural design cycle time. Besides consolidating and further improving the FETI solver technology to address plate and shell structures, we have proposed to tackle the following design related issues: (a) parallel coupling and assembly of independently designed and analyzed three-dimensional substructures with non-matching interfaces, (b) fast and smart parallel re-analysis of a given structure after it has undergone design modifications, (c) parallel evaluation of sensitivity operators (derivatives) for design optimization, and (d) fast parallel analysis of mildly nonlinear structures. While our proposal was accepted, support was provided only for one year.

  17. Spintronic logic design methodology based on spin Hall effect-driven magnetic tunnel junctions

    NASA Astrophysics Data System (ADS)

    Kang, Wang; Wang, Zhaohao; Zhang, Youguang; Klein, Jacques-Olivier; Lv, Weifeng; Zhao, Weisheng

    2016-02-01

    Conventional complementary metal-oxide-semiconductor (CMOS) technology is now approaching its physical scaling limits, making it difficult for Moore’s law to continue. Spintronic devices, as one of the potential alternatives, show great promise to replace CMOS technology for next-generation low-power integrated circuits in nanoscale technology nodes. Until now, spintronic memory has been successfully commercialized. However, spintronic logic still faces many critical challenges (e.g. direct cascading capability and small operation gain) before it can be practically applied. In this paper, we propose a standard complementary spintronic logic (CSL) design methodology to form a CMOS-like logic design paradigm. Using the spin Hall effect (SHE)-driven magnetic tunnel junction (MTJ) device as an example, we demonstrate CSL implementation, functionality and performance. This logic family provides a unified design methodology for spintronic logic circuits and partly solves the challenges of direct cascading capability and small operation gain in the previously proposed spintronic logic designs. By solving a modified Landau-Lifshitz-Gilbert equation, the magnetization dynamics in the free layer of the MTJ is theoretically described and a compact electrical model is developed. With this electrical model, numerical simulations have been performed to evaluate the functionality and performance of the proposed CSL design. Simulation results demonstrate that the proposed CSL design paradigm is rather promising for low-power logic computing.
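
    As a hedged point of reference (a generic form, not necessarily the authors' exact modified equation), the Landau-Lifshitz-Gilbert dynamics with a damping-like spin Hall torque term can be written as

        \frac{d\mathbf{m}}{dt} = -\gamma \, \mathbf{m} \times \mathbf{H}_{\text{eff}}
        + \alpha \, \mathbf{m} \times \frac{d\mathbf{m}}{dt}
        + \tau_{\text{SHE}} \, \mathbf{m} \times (\mathbf{m} \times \boldsymbol{\sigma}),

    where \mathbf{m} is the unit magnetization of the free layer, \mathbf{H}_eff the effective field, \alpha the Gilbert damping, \boldsymbol{\sigma} the spin polarization direction set by the spin Hall current, and \tau_SHE a torque prefactor proportional to the charge current density and the spin Hall angle.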

  18. Methodology for CFD Design Analysis of National Launch System Nozzle Manifold

    NASA Technical Reports Server (NTRS)

    Haire, Scot L.

    1993-01-01

    The current design environment dictates that high technology CFD (Computational Fluid Dynamics) analysis produce quality results in a timely manner if it is to be integrated into the design process. The design methodology outlined describes the CFD analysis of an NLS (National Launch System) nozzle film cooling manifold. The objective of the analysis was to obtain a qualitative estimate for the flow distribution within the manifold. A complex, 3D, multiple zone, structured grid was generated from a 3D CAD file of the geometry. A Euler solution was computed with a fully implicit compressible flow solver. Post processing consisted of full 3D color graphics and mass averaged performance. The result was a qualitative CFD solution that provided the design team with relevant information concerning the flow distribution in and performance characteristics of the film cooling manifold within an effective time frame. Also, this design methodology was the foundation for a quick turnaround CFD analysis of the next iteration in the manifold design.

  19. A Formal Methodology to Design and Deploy Dependable Wireless Sensor Networks.

    PubMed

    Testa, Alessandro; Cinque, Marcello; Coronato, Antonio; Augusto, Juan Carlos

    2016-12-23

    Wireless Sensor Networks (WSNs) are being increasingly adopted in critical applications, where verifying the correct operation of sensor nodes is a major concern. Undesired events may undermine the mission of the WSNs. Hence, their effects need to be properly assessed before deployment, to obtain a good level of expected performance; and during the operation, in order to avoid dangerous unexpected results. In this paper, we propose a methodology that aims at assessing and improving the dependability level of WSNs by means of an event-based formal verification technique. The methodology includes a process to guide designers towards the realization of a dependable WSN and a tool ("ADVISES") to simplify its adoption. The tool is applicable to homogeneous WSNs with static routing topologies. It allows the automatic generation of formal specifications used to check correctness properties and evaluate dependability metrics at design time and at runtime for WSNs where an acceptable percentage of faults can be defined. At runtime, we can check the behavior of the WSN according to the results obtained at design time, and we can detect sudden and unexpected failures in order to trigger recovery procedures. The effectiveness of the methodology is shown in the context of two case studies, as proof-of-concept, aiming to illustrate how the tool is helpful to drive design choices and to check the correctness properties of the WSN at runtime. Although the method scales up to very large WSNs, the applicability of the methodology may be compromised by the state space explosion of the reasoning model, which must be faced by partitioning large topologies into sub-topologies.

  20. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    NASA Technical Reports Server (NTRS)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  1. A Formal Methodology to Design and Deploy Dependable Wireless Sensor Networks

    PubMed Central

    Testa, Alessandro; Cinque, Marcello; Coronato, Antonio; Augusto, Juan Carlos

    2016-01-01

    Wireless Sensor Networks (WSNs) are being increasingly adopted in critical applications, where verifying the correct operation of sensor nodes is a major concern. Undesired events may undermine the mission of the WSNs. Hence, their effects need to be properly assessed before deployment, to obtain a good level of expected performance; and during the operation, in order to avoid dangerous unexpected results. In this paper, we propose a methodology that aims at assessing and improving the dependability level of WSNs by means of an event-based formal verification technique. The methodology includes a process to guide designers towards the realization of a dependable WSN and a tool (“ADVISES”) to simplify its adoption. The tool is applicable to homogeneous WSNs with static routing topologies. It allows the automatic generation of formal specifications used to check correctness properties and evaluate dependability metrics at design time and at runtime for WSNs where an acceptable percentage of faults can be defined. At runtime, we can check the behavior of the WSN according to the results obtained at design time, and we can detect sudden and unexpected failures in order to trigger recovery procedures. The effectiveness of the methodology is shown in the context of two case studies, as proof-of-concept, aiming to illustrate how the tool is helpful to drive design choices and to check the correctness properties of the WSN at runtime. Although the method scales up to very large WSNs, the applicability of the methodology may be compromised by the state space explosion of the reasoning model, which must be faced by partitioning large topologies into sub-topologies. PMID:28025568

  2. Hydrogel design of experiments methodology to optimize hydrogel for iPSC-NPC culture.

    PubMed

    Lam, Jonathan; Carmichael, S Thomas; Lowry, William E; Segura, Tatiana

    2015-03-11

    Bioactive signals can be incorporated in hydrogels to direct encapsulated cell behavior. Design of experiments methodology varies these signals systematically to determine the individual and combinatorial effects of each factor on cell activity. Using this approach enables the optimization of the concentrations of three ligands (RGD, YIGSR, IKVAV) for the survival and differentiation of neural progenitor cells. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
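
    A minimal sketch of a design-of-experiments screening step of this kind, assuming a two-level full factorial over the three ligand concentrations; the response values are illustrative placeholders, not data from the paper:

    ```python
    import itertools
    import numpy as np

    # Two-level full factorial over the three ligands (coded -1/+1).
    design = np.array(list(itertools.product([-1, 1], repeat=3)))   # RGD, YIGSR, IKVAV
    survival = np.array([52., 61., 55., 70., 58., 74., 63., 85.])   # e.g., % viable cells (hypothetical)

    # Main effect of each factor: mean response at +1 minus mean response at -1.
    for name, column in zip(["RGD", "YIGSR", "IKVAV"], design.T):
        effect = survival[column == 1].mean() - survival[column == -1].mean()
        print("%6s main effect: %+.1f" % (name, effect))
    ```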

  3. Modeling Thoracic Blunt Trauma; Towards A Finite-Element-Based Design Methodology for Body Armor

    DTIC Science & Technology

    2004-12-01

    Maio, M., Parks, S., Schilke, P., Campman, S., Meyers, C., Georgia, J., and Flemming, D., in preparation: Effects of Ballistically Induced Blunt... 1 MODELING THORACIC BLUNT TRAUMA; TOWARDS A FINITE-ELEMENT-BASED DESIGN METHODOLOGY FOR BODY ARMOR Martin N. Raftenberg, U.S. Army Research... Section 2 the WSTM is applied to the case of the M882 bullet at 445 m/s versus a multi-ply Kevlar vest plus thorax. For this situation

  4. The Atomic Intrinsic Integration Approach: A Structured Methodology for the Design of Games for the Conceptual Understanding of Physics

    ERIC Educational Resources Information Center

    Echeverria, Alejandro; Barrios, Enrique; Nussbaum, Miguel; Amestica, Matias; Leclerc, Sandra

    2012-01-01

    Computer simulations combined with games have been successfully used to teach conceptual physics. However, there is no clear methodology for guiding the design of these types of games. To remedy this, we propose a structured methodology for the design of conceptual physics games that explicitly integrates the principles of the intrinsic…

  5. The Atomic Intrinsic Integration Approach: A Structured Methodology for the Design of Games for the Conceptual Understanding of Physics

    ERIC Educational Resources Information Center

    Echeverria, Alejandro; Barrios, Enrique; Nussbaum, Miguel; Amestica, Matias; Leclerc, Sandra

    2012-01-01

    Computer simulations combined with games have been successfully used to teach conceptual physics. However, there is no clear methodology for guiding the design of these types of games. To remedy this, we propose a structured methodology for the design of conceptual physics games that explicitly integrates the principles of the intrinsic…

  6. Assessment of audit methodologies for bias evaluation of tumor progression in oncology clinical trials.

    PubMed

    Zhang, Jenny J; Zhang, Lijun; Chen, Huanyu; Murgo, Anthony J; Dodd, Lori E; Pazdur, Richard; Sridhara, Rajeshwari

    2013-05-15

    As progression-free survival (PFS) has become increasingly used as the primary endpoint in oncology phase III trials, the U.S. Food and Drug Administration (FDA) has generally required a complete-case blinded independent central review (BICR) of PFS to assess and reduce potential bias in the investigator or local site evaluation. However, recent publications and FDA analyses have shown a high correlation between local site evaluation and BICR assessments of the PFS treatment effect, which questions whether complete-case BICR is necessary. One potential alternative is to use BICR as an audit tool to detect evaluation bias in the local site evaluation. In this article, the performance characteristics of two audit methods proposed in the literature are evaluated on 26 prospective, randomized phase III registration trials in nonhematologic malignancies. The results support that a BICR audit to assess potential bias in the local site evaluation is a feasible approach. However, implementation and logistical challenges need further consideration and discussion. ©2013 AACR

  7. In Vitro Developmental Toxicology Screens: A Report on the Progress of the Methodology and Future Applications.

    PubMed

    Zhang, Cindy; Ball, Jonathan; Panzica-Kelly, Julie; Augustine-Rauch, Karen

    2016-04-18

    There has been increasing focus on generation and assessment of in vitro developmental toxicology models for assessing teratogenic liability of chemicals. The driver for this focus has been to find reliable in vitro assays that will reduce or replace the use of in vivo tests for assessing teratogenicity. Such efforts may be eventually applied in testing pharmaceutical agents where a developmental toxicology assay or battery of assays may be incorporated into regulatory testing to replace one of the two species currently used in teratogenic assessment. Such assays may be eventually applied in testing a broader spectrum of chemicals, supporting efforts aligned with Tox21 strategies and responding to REACH legislation. This review describes the developmental toxicology assays that are of focus in these assessments: rodent whole embryo culture, zebrafish embryo assays, and embryonic stem cell assays. Progress on assay development as well as future directions of how these assays are envisioned to be applied for broader safety testing of chemicals are discussed. Altogether, the developmental model systems described in this review provide rich biological systems that can be utilized in better understanding teratogenic mechanisms of action of chemotypes and are promising in providing proactive safety assessment related to developmental toxicity. Continual advancements in refining/optimizing these in vitro assays are anticipated to provide a robust data set to provide thoughtful assessment of how whole animal teratogenicity evaluations can be reduced/refined in the future.

  8. Methodology to optimize fluid-dynamic design in a redox cell

    NASA Astrophysics Data System (ADS)

    Escudero-González, Juan; López-Jiménez, P. Amparo

    2014-04-01

    The present work is aimed at the optimization of a redox cell design. The studied redox cell consists of a device designed to convert the energy of reactants into electrical energy when a liquid electrolyte reacts at the electrode in a conventional manner. In this type of cell, two electrolytes are present, separated by a proton exchange membrane. Therefore, the flow of the electrolyte and its interaction with the membrane are of paramount importance for the general performance of the cell. A methodology for designing the inlet part of the cell, based on optimizing the uniformity of the flow and the initial position of the membrane, is presented in this study. This methodology, based on the definition and optimization of several parameters related to the electrolyte flow in different regions of the geometry, is described in detail. The CFD (Computational Fluid Dynamics) model, coupled with the statistical study, pointed to several practical conclusions on how to improve the final geometry of the redox cell. A particular redox cell case study is implemented in order to validate the proposed methodology.
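
    A minimal sketch of one flow-uniformity parameter of the kind such a methodology might optimize (the abstract does not give the authors' specific parameter definitions); the channel velocities below are synthetic:

    ```python
    import numpy as np

    # Velocities sampled across parallel inlet channels of the cell (hypothetical values, m/s).
    channel_velocity = np.array([0.92, 1.05, 1.10, 0.98, 0.95, 1.00])

    # A simple uniformity index: 1 minus half the mean absolute deviation
    # relative to the mean velocity (1.0 means perfectly uniform flow).
    mean_v = channel_velocity.mean()
    uniformity = 1.0 - 0.5 * np.mean(np.abs(channel_velocity - mean_v)) / mean_v
    print("flow uniformity index: %.3f" % uniformity)
    ```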

  9. Adapt Design: A Methodology for Enabling Modular Design for Mission Specific SUAS

    DTIC Science & Technology

    2016-08-24

    the application of this approach are presented via the design of several SUAS. The capability of the design paradigm is assessed through a...stakeholders drives a need for providing users with a small set of inputs that can fully capture the mission, without requiring detailed knowledge of

  10. A Visual Analytics Based Decision Support Methodology For Evaluating Low Energy Building Design Alternatives

    NASA Astrophysics Data System (ADS)

    Dutta, Ranojoy

    The ability to design high performance buildings has acquired great importance in recent years due to numerous federal, societal and environmental initiatives. However, this endeavor is much more demanding in terms of designer expertise and time. It requires a whole new level of synergy between automated performance prediction and the human capabilities to perceive, evaluate and ultimately select a suitable solution. While performance prediction can be highly automated through the use of computers, performance evaluation cannot, unless it is with respect to a single criterion. The need to address multi-criteria requirements makes it more valuable for designers to know the "latitude" or "degrees of freedom" they have in changing certain design variables while achieving preset criteria such as energy performance, life cycle cost and environmental impacts. This requirement can be met by a decision support framework based on near-optimal "satisficing", as opposed to purely optimal, decision-making techniques. Currently, such a comprehensive design framework is lacking, which is the basis for undertaking this research. The primary objective of this research is to facilitate a complementary relationship between designers and computers for Multi-Criterion Decision Making (MCDM) during high performance building design. It is based on the application of Monte Carlo approaches to create a database of solutions using deterministic whole building energy simulations, along with data mining methods to rank variable importance and reduce the multi-dimensionality of the problem. A novel interactive visualization approach is then proposed which uses regression based models to create dynamic interplays of how varying these important variables affects the multiple criteria, while providing a visual range or band of variation of the different design parameters. The MCDM process has been incorporated into an alternative methodology for high performance building design referred to as
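
    A minimal sketch of the Monte Carlo sampling and variable-importance ranking steps described here; the building-energy surrogate, variable names, and ranges are hypothetical stand-ins for deterministic whole-building simulations:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Hypothetical surrogate for a whole-building energy simulation: annual energy
    # use as a function of window-to-wall ratio, insulation level, and setpoint.
    def annual_energy(wwr, insulation, setpoint):
        return 180 - 40 * insulation + 25 * wwr + 6 * (setpoint - 22) \
               + rng.normal(0.0, 2.0, size=np.shape(wwr))

    # Monte Carlo sampling of the design space.
    n = 2_000
    wwr = rng.uniform(0.2, 0.6, n)
    insulation = rng.uniform(0.5, 2.0, n)
    setpoint = rng.uniform(20, 26, n)
    energy = annual_energy(wwr, insulation, setpoint)

    # Rank variable importance by standardized regression coefficients.
    X = np.column_stack([wwr, insulation, setpoint])
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    ys = (energy - energy.mean()) / energy.std()
    beta, *_ = np.linalg.lstsq(np.column_stack([np.ones(n), Xs]), ys, rcond=None)
    for name, b in zip(["window-to-wall ratio", "insulation", "setpoint"], beta[1:]):
        print("%22s: |beta| = %.2f" % (name, abs(b)))
    ```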

  11. Urban design and health: progress to date and future challenges.

    PubMed

    Lowe, Melanie; Boulange, Claire; Giles-Corti, Billie

    2014-04-01

    Over the last 15 years, a growing body of Australian and international evidence has demonstrated that urban design attributes are associated with a range of health outcomes. For example, the location of employment, shops and services, provision of public and active transport infrastructure and access to open space and recreational opportunities are associated with chronic disease risk factors such as physical activity levels, access to healthy food, social connectedness, and air quality. Despite the growing knowledge base, this evidence is not being consistently translated into urban planning policy and practice in Australia. Low-density neighbourhoods with poor access to public transport, shops and services continue to be developed at a rapid rate in the sprawling outer suburbs of Australian cities. This paper provides an overview of the evidence of the association between the built environment and chronic diseases, highlighting progress and future challenges for health promotion. It argues that health promotion practitioners and researchers need to more closely engage with urban planning practitioners, policymakers and researchers to encourage the creation of healthy urban environments through integrated transport, land use and infrastructure planning. There is also a need for innovative research to evaluate the effectiveness of policy options. This would help evidence to be more effectively translated into policy and practice, making Australia a leader in planning healthy communities.

  12. Preliminary Design and Evaluation of Portable Electronic Flight Progress Strips

    NASA Technical Reports Server (NTRS)

    Doble, Nathan A.; Hansman, R. John

    2002-01-01

    There has been growing interest in using electronic alternatives to the paper Flight Progress Strip (FPS) for air traffic control. However, most research has been centered on radar-based control environments, and has not considered the unique operational needs of the airport air traffic control tower. Based on an analysis of the human factors issues for control tower Decision Support Tool (DST) interfaces, a requirement has been identified for an interaction mechanism which replicates the advantages of the paper FPS (e.g., head-up operation, portability) but also enables input and output with DSTs. An approach has been developed which uses a Portable Electronic FPS that has attributes of both a paper strip and an electronic strip. The prototype flight strip system uses Personal Digital Assistants (PDAs) to replace individual paper strips in addition to a central management interface which is displayed on a desktop computer. Each PDA is connected to the management interface via a wireless local area network. The Portable Electronic FPSs replicate the core functionality of paper flight strips and have additional features which provide a heads-up interface to a DST. A departure DST is used as a motivating example. The central management interface is used for aircraft scheduling and sequencing and provides an overview of airport departure operations. This paper will present the design of the Portable Electronic FPS system as well as preliminary evaluation results.

  13. Biomarker-Guided Adaptive Trial Designs in Phase II and Phase III: A Methodological Review

    PubMed Central

    Antoniou, Miranta; Jorgensen, Andrea L; Kolamunnage-Dona, Ruwanthi

    2016-01-01

    Background Personalized medicine is a growing area of research which aims to tailor the treatment given to a patient according to one or more personal characteristics. These characteristics can be demographic such as age or gender, or biological such as a genetic or other biomarker. Prior to utilizing a patient’s biomarker information in clinical practice, robust testing in terms of analytical validity, clinical validity and clinical utility is necessary. A number of clinical trial designs have been proposed for testing a biomarker’s clinical utility, including Phase II and Phase III clinical trials which aim to test the effectiveness of a biomarker-guided approach to treatment; these designs can be broadly classified into adaptive and non-adaptive. While adaptive designs allow planned modifications based on accumulating information during a trial, non-adaptive designs are typically simpler but less flexible. Methods and Findings We have undertaken a comprehensive review of biomarker-guided adaptive trial designs proposed in the past decade. We have identified eight distinct biomarker-guided adaptive designs and nine variations from 107 studies. Substantial variability has been observed in terms of how trial designs are described and particularly in the terminology used by different authors. We have graphically displayed the current biomarker-guided adaptive trial designs and summarised the characteristics of each design. Conclusions Our in-depth overview provides future researchers with clarity in definition, methodology and terminology for biomarker-guided adaptive trial designs. PMID:26910238

  14. Joint application of AI techniques, PRA and disturbance analysis methodology to problems in the maintenance and design of nuclear power plants

    SciTech Connect

    Okrent, D.

    1989-03-01

    This final report summarizes the accomplishments of a two-year research project entitled Joint Application of Artificial Intelligence Techniques, Probabilistic Risk Analysis, and Disturbance Analysis Methodology to Problems in the Maintenance and Design of Nuclear Power Plants. The objective of this project is to develop and apply appropriate combinations of techniques from artificial intelligence (AI), reliability and risk analysis, and disturbance analysis to well-defined programmatic problems of nuclear power plants. Reactor operations issues were added to those of design and maintenance as the project progressed.

  15. Joint application of AI techniques, PRA and disturbance analysis methodology to problems in the maintenance and design of nuclear power plants. Final report

    SciTech Connect

    Okrent, D.

    1989-03-01

    This final report summarizes the accomplishments of a two-year research project entitled Joint Application of Artificial Intelligence Techniques, Probabilistic Risk Analysis, and Disturbance Analysis Methodology to Problems in the Maintenance and Design of Nuclear Power Plants. The objective of this project is to develop and apply appropriate combinations of techniques from artificial intelligence (AI), reliability and risk analysis, and disturbance analysis to well-defined programmatic problems of nuclear power plants. Reactor operations issues were added to those of design and maintenance as the project progressed.

  16. Methodology to design a municipal solid waste generation and composition map: A case study

    SciTech Connect

    Gallardo, A.; Carlos, M.; Peris, M.; Colomer, F.J.

    2014-11-15

    Highlights: • To draw a waste generation and composition map of a town, many factors must be taken into account. • The proposed methodology offers two different approaches depending on the available data, combined with geographical information systems. • The methodology has been applied to a Spanish city with success. • The methodology will be a useful tool to organize the municipal solid waste management. - Abstract: Municipal solid waste (MSW) management is an important task that local governments as well as private companies must take into account to protect human health, the environment and to preserve natural resources. To design an adequate MSW management plan the first step consists in defining the waste generation and composition patterns of the town. As these patterns depend on several socio-economic factors it is advisable to organize them previously. Moreover, the waste generation and composition patterns may vary around the town and over time. Generally, the data are not homogeneous around the city as the number of inhabitants is not constant, nor is the economic activity. Therefore, if all the information is shown in thematic maps, the final waste management decisions can be made more efficiently. The main aim of this paper is to present a structured methodology that allows local authorities or private companies who deal with MSW to design their own MSW management plan depending on the available data. According to these data, this paper proposes two ways of action: a direct way when detailed data are available and an indirect way when there is a lack of data and it is necessary to take into account bibliographic data. In any case, the amount of information needed is considerable. This paper combines the planning methodology with Geographic Information Systems to present the final results in thematic maps that make them easier to interpret. The proposed methodology is a useful preliminary tool to organize the MSW collection routes including the
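
    A minimal sketch of the direct/indirect estimation step that such a map would be built from, using hypothetical district names, populations, measured tonnages and a bibliographic per-capita rate; joining the results to district polygons in a GIS is omitted:

      # Direct way: use measured tonnage where detailed data exist.
      # Indirect way: fall back on a bibliographic per-capita generation rate.
      districts = {
          # name: (inhabitants, measured annual tonnes, or None if unavailable)
          "Centre":    (12000, 5400.0),
          "Harbour":   (8000,  None),
          "Northside": (20000, None),
      }
      BIBLIOGRAPHIC_RATE = 1.2      # kg per inhabitant per day (assumed literature value)

      def annual_generation(inhabitants, measured_tonnes):
          if measured_tonnes is not None:                           # direct way
              return measured_tonnes
          return inhabitants * BIBLIOGRAPHIC_RATE * 365 / 1000.0    # indirect way, tonnes/year

      for name, (pop, tonnes) in districts.items():
          total = annual_generation(pop, tonnes)
          rate = 1000.0 * total / (pop * 365)
          print(f"{name:10s} {total:8.1f} t/year  ({rate:.2f} kg/inhabitant/day)")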

  17. Application of an integrated flight/propulsion control design methodology to a STOVL aircraft

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay; Mattern, Duane L.

    1991-01-01

    Results are presented from the application of an emerging Integrated Flight/Propulsion Control (IFPC) design methodology to a Short Take Off and Vertical Landing (STOVL) aircraft in transition flight. The steps in the methodology consist of designing command shaping prefilters to provide the overall desired response to pilot command inputs. A previously designed centralized controller is first validated for the integrated airframe/engine plant used. This integrated plant is derived from a different model of the engine subsystem than the one used for the centralized controller design. The centralized controller is then partitioned in a decentralized, hierarchical structure comprising airframe lateral and longitudinal subcontrollers and an engine subcontroller. Command shaping prefilters from the pilot control effector inputs are then designed and time histories of the closed loop IFPC system response to simulated pilot commands are compared to desired responses based on handling qualities requirements. Finally, the propulsion system safety and nonlinear limit protection logic is wrapped around the engine subcontroller and the response of the closed loop integrated system is evaluated for transients that encounter the propulsion surge margin limit.
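
    As a minimal illustration of the command-shaping idea mentioned above (not the paper's STOVL prefilter design), the sketch below filters a unit step pilot command through a first-order prefilter with an assumed desired time constant, using scipy.signal:

      # A pilot step command is shaped so the commanded variable follows a
      # desired first-order response.  The time constant is an assumed value.
      import numpy as np
      from scipy import signal

      tau_desired = 0.8                                # desired time constant [s]
      prefilter = signal.TransferFunction([1.0], [tau_desired, 1.0])

      t = np.linspace(0.0, 5.0, 500)
      pilot_cmd = np.ones_like(t)                      # unit step pilot input

      t_out, shaped_cmd, _ = signal.lsim(prefilter, U=pilot_cmd, T=t)
      print(f"shaped command reaches {shaped_cmd[-1]:.2f} of the step by t = {t_out[-1]:.1f} s")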

  18. A methodology for the validated design space exploration of fuel cell powered unmanned aerial vehicles

    NASA Astrophysics Data System (ADS)

    Moffitt, Blake Almy

    Unmanned Aerial Vehicles (UAVs) are the most dynamic growth sector of the aerospace industry today. The need to provide persistent intelligence, surveillance, and reconnaissance for military operations is driving the planned acquisition of over 5,000 UAVs over the next five years. The most pressing need is for quiet, small UAVs with endurance beyond what is capable with advanced batteries or small internal combustion propulsion systems. Fuel cell systems demonstrate high efficiency, high specific energy, low noise, low temperature operation, modularity, and rapid refuelability making them a promising enabler of the small, quiet, and persistent UAVs that military planners are seeking. Despite the perceived benefits, the actual near-term performance of fuel cell powered UAVs is unknown. Until the auto industry began spending billions of dollars in research, fuel cell systems were too heavy for useful flight applications. However, the last decade has seen rapid development with fuel cell gravimetric and volumetric power density nearly doubling every 2--3 years. As a result, a few design studies and demonstrator aircraft have appeared, but overall the design methodology and vehicles are still in their infancy. The design of fuel cell aircraft poses many challenges. Fuel cells differ fundamentally from combustion based propulsion in how they generate power and interact with other aircraft subsystems. As a result, traditional multidisciplinary analysis (MDA) codes are inappropriate. Building new MDAs is difficult since fuel cells are rapidly changing in design, and various competitive architectures exist for balance of plant, hydrogen storage, and all electric aircraft subsystems. In addition, fuel cell design and performance data is closely protected which makes validation difficult and uncertainty significant. Finally, low specific power and high volumes compared to traditional combustion based propulsion result in more highly constrained design spaces that are

  19. Methodology for worker neutron exposure evaluation in the PDCF facility design.

    PubMed

    Scherpelz, R I; Traub, R J; Pryor, K H

    2004-01-01

    A project headed by Washington Group International is meant to design the Pit Disassembly and Conversion Facility (PDCF) to convert the plutonium pits from excessed nuclear weapons into plutonium oxide for ultimate disposition. Battelle staff are performing the shielding calculations that will determine appropriate shielding so that the facility workers will not exceed target exposure levels. The target exposure levels for workers in the facility are 5 mSv y(-1) for the whole body and 100 mSv y(-1) for the extremity, which presents a significant challenge to the designers of a facility that will process tons of radioactive material. The design effort depended on shielding calculations to determine appropriate thickness and composition for glove box walls, and concrete wall thicknesses for storage vaults. Pacific Northwest National Laboratory (PNNL) staff used ORIGEN-S and SOURCES to generate gamma and neutron source terms, and the MCNP-4C Monte Carlo neutron-photon transport code to calculate the radiation transport in the facility. The shielding calculations were performed by a team of four scientists, so it was necessary to develop a consistent methodology. There was also a requirement for the study to be cost-effective, so efficient methods of evaluation were required. The calculations were subject to rigorous scrutiny by internal and external reviewers, so acceptability was a major feature of the methodology. Some of the issues addressed in the development of the methodology included selecting appropriate dose factors, developing a method for handling extremity doses, adopting an efficient method for evaluating effective dose equivalent in a non-uniform radiation field, modelling the reinforcing steel in concrete, and modularising the geometry descriptions for efficiency. The relative importance of the neutron dose equivalent compared with the gamma dose equivalent varied substantially depending on the specific shielding conditions and lessons

  20. Review of Recent Methodological Developments in Group-Randomized Trials: Part 1—Design

    PubMed Central

    Li, Fan; Gallis, John A.; Prague, Melanie; Murray, David M.

    2017-01-01

    In 2004, Murray et al. reviewed methodological developments in the design and analysis of group-randomized trials (GRTs). We have highlighted the developments of the past 13 years in design with a companion article to focus on developments in analysis. As a pair, these articles update the 2004 review. We have discussed developments in the topics of the earlier review (e.g., clustering, matching, and individually randomized group-treatment trials) and in new topics, including constrained randomization and a range of randomized designs that are alternatives to the standard parallel-arm GRT. These include the stepped-wedge GRT, the pseudocluster randomized trial, and the network-randomized GRT, which, like the parallel-arm GRT, require clustering to be accounted for in both their design and analysis. PMID:28426295

  1. An Optimal Design Methodology of Tapered Roller Bearings Using Genetic Algorithms

    NASA Astrophysics Data System (ADS)

    Tiwari, Rajiv; Sunil, Kumar K.; Reddy, R. S.

    2012-03-01

    In the design of tapered roller bearings, long life is one of the most important criteria. The design of bearings has to satisfy constraints of geometry and strength, while operating at its rated speed. An optimal design methodology is needed to achieve this objective (i.e., the maximization of the fatigue life). The fatigue life is directly proportional to the dynamic capacity; hence, for the present case, the latter has been chosen as the objective function. It has been optimized by using a constrained nonlinear formulation with real-coded genetic algorithms. Design variables for the bearing include four geometrical parameters: the bearing pitch diameter, the diameter of the roller, the effective length of the roller, and the number of rollers. These directly affect the dynamic capacity of tapered roller bearings. In addition to these, another five design constraint constants are included, which indirectly affect the basic dynamic capacity of tapered roller bearings. The five design constraint constants have been given bounds based on the parametric studies through initial optimization runs. There is good agreement between the optimized and standard bearings with respect to the basic dynamic capacity. A convergence study has been carried out to ensure the global optimum point in the design. A sensitivity analysis of various design parameters, using the Monte Carlo simulation method, has been performed to see changes in the dynamic capacity. Illustrations show that none of the geometric design parameters have an adverse effect on the dynamic capacity.
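
    A minimal sketch of this kind of formulation, with scipy's differential_evolution (another real-coded evolutionary optimizer) standing in for the paper's genetic algorithm; the capacity factor, bounds and the single feasibility check are illustrative assumptions, and the capacity expression follows the usual ISO-281-style form for roller bearings:

      # Maximize a roller-bearing basic dynamic capacity
      #   C ~ fc * Lwe^(7/9) * Z^(3/4) * Dwe^(29/27)
      # over pitch diameter, roller diameter, roller length and roller count.
      import numpy as np
      from scipy.optimize import differential_evolution

      FC = 90.0                                  # illustrative capacity factor

      def neg_dynamic_capacity(x):
          d_pitch, d_roller, l_eff, z = x
          z = np.floor(z)                        # roller count must be an integer
          if z * d_roller >= np.pi * d_pitch:    # rollers must fit on the pitch circle
              return 1e9                         # penalize infeasible designs
          capacity = FC * l_eff**(7 / 9) * z**0.75 * d_roller**(29 / 27)
          return -capacity                       # minimize the negative -> maximize

      bounds = [(60.0, 90.0),   # pitch diameter [mm]
                (8.0, 16.0),    # roller diameter [mm]
                (10.0, 20.0),   # effective roller length [mm]
                (10, 30)]       # number of rollers

      result = differential_evolution(neg_dynamic_capacity, bounds, seed=1)
      print("best design:", np.round(result.x, 2), " capacity ~", round(-result.fun, 1))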

  2. Advanced Turbine Systems (ATS) program conceptual design and product development. Quarterly progress report, December 1, 1995--February 29, 1996

    SciTech Connect

    1997-06-01

    This report describes the overall program status of the General Electric Advanced Gas Turbine Development program, and reports progress on three main task areas. The program is focused on two specific products: (1) a 70-MW class industrial gas turbine based on the GE90 core technology, utilizing a new air cooling methodology; and (2) a 200-MW class utility gas turbine based on an advanced GE heavy-duty machine, utilizing advanced cooling and enhancement in component efficiency. The emphasis for the industrial system is placed on cycle design and low emission combustion. For the utility system, the focus is on developing a technology base for advanced turbine cooling while achieving low emission combustion. The three tasks included in this progress report are on: conversion to a coal-fueled advanced turbine system, integrated program plan, and design and test of critical components. 13 figs., 1 tab.

  3. Device Thrombogenicity Emulator (DTE) – Design Optimization Methodology for Cardiovascular Devices: A Study in Two Bileaflet MHV Designs

    PubMed Central

    Xenos, Michalis; Girdhar, Gaurav; Alemu, Yared; Jesty, Jolyon; Slepian, Marvin; Einav, Shmuel; Bluestein, Danny

    2010-01-01

    Patients who receive prosthetic heart valve (PHV) implants require mandatory anticoagulation medication after implantation due to the thrombogenic potential of the valve. Optimization of PHV designs may facilitate reduction of flow-induced thrombogenicity and reduce or eliminate the need for post-implant anticoagulants. We present a methodology entitled Device Thrombogenicity Emulator (DTE) for optimizing the thrombo-resistance performance of PHV by combining numerical and experimental approaches. Two bileaflet mechanical heart valve (MHV) designs – St. Jude Medical (SJM) and ATS – were investigated by studying the effect of distinct flow phases on platelet activation. Transient turbulent and direct numerical simulations (DNS) were conducted, and stress loading histories experienced by the platelets were calculated along flow trajectories. The numerical simulations indicated distinct design-dependent differences between the two valves. The stress-loading waveforms extracted from the numerical simulations were programmed into a hemodynamic shearing device (HSD), emulating the flow conditions past the valves in distinct ‘hot spot’ flow regions that are implicated in MHV thrombogenicity. The resultant platelet activity was measured with a modified prothrombinase assay, and was found to be significantly higher in the SJM valve, mostly during the regurgitation phase. The experimental results were in excellent agreement with the calculated platelet activation potential. This establishes the utility of the DTE methodology for serving as a test bed for evaluating design modifications for achieving better thrombogenic performance for such devices. PMID:20483411
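
    A minimal sketch of the stress-loading bookkeeping behind such an approach: a cumulative stress-accumulation measure integrated along one platelet trajectory. The trajectory and stress values below are synthetic placeholders, not output of the valve-flow simulations:

      import numpy as np

      t = np.linspace(0.0, 0.4, 400)                        # s, one flow phase
      tau = 2.0 + 8.0 * np.exp(-((t - 0.25) / 0.03) ** 2)   # Pa, synthetic stress spike

      # Linear stress-accumulation measure: SA(t) = integral of tau dt along the trajectory
      stress_accumulation = np.cumsum(tau) * (t[1] - t[0])
      print(f"stress accumulation over the phase: {stress_accumulation[-1]:.2f} Pa*s")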

  4. A methodology for the design of experiments in computational intelligence with multiple regression models.

    PubMed

    Fernandez-Lozano, Carlos; Gestal, Marcos; Munteanu, Cristian R; Dorado, Julian; Pazos, Alejandro

    2016-01-01

    The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence and especially on a correct comparison between the different results provided for different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational Intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results are different for three out of five state-of-the-art simple datasets, and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is of relevance to use a statistical approach to indicate whether the differences are statistically significant with this kind of algorithm. Furthermore, our results with three real complex datasets report different best models than with the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as from other fields, such as bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable.
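
    The published framework is an R package; as a language-neutral illustration of the underlying idea (cross-validate several regression models on the same folds, then test whether their score differences are statistically significant), the sketch below uses a synthetic dataset and two scikit-learn models, which are assumptions of this example rather than the RRegrs model set:

      import numpy as np
      from scipy.stats import wilcoxon
      from sklearn.datasets import make_regression
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.linear_model import LinearRegression
      from sklearn.model_selection import KFold, cross_val_score

      X, y = make_regression(n_samples=300, n_features=10, noise=10.0, random_state=0)
      cv = KFold(n_splits=10, shuffle=True, random_state=0)    # same folds for every model

      models = {
          "linear":        LinearRegression(),
          "random_forest": RandomForestRegressor(n_estimators=200, random_state=0),
      }
      scores = {name: cross_val_score(m, X, y, cv=cv, scoring="r2") for name, m in models.items()}

      for name, s in scores.items():
          print(f"{name:14s} mean R2 = {s.mean():.3f} (sd {s.std():.3f})")

      # Paired, non-parametric test on the per-fold scores
      stat, p = wilcoxon(scores["linear"], scores["random_forest"])
      print(f"Wilcoxon signed-rank test: p = {p:.3f}")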

  5. Evaluation and optimization of hepatocyte culture media factors by design of experiments (DoE) methodology.

    PubMed

    Dong, Jia; Mandenius, Carl-Fredrik; Lübberstedt, Marc; Urbaniak, Thomas; Nüssler, Andreas K N; Knobeloch, Daniel; Gerlach, Jörg C; Zeilinger, Katrin

    2008-07-01

    Optimization of cell culture media based on statistical experimental design methodology is a widely used approach for improving cultivation conditions. We applied this methodology to refine the composition of an established culture medium for growth of a human hepatoma cell line, C3A. A selection of growth factors and nutrient supplements were systematically screened according to standard design of experiments (DoE) procedures. The results of the screening indicated that the medium additives hepatocyte growth factor, oncostatin M, and fibroblast growth factor 4 significantly influenced the metabolic activities of the C3A cell line. Surface response methodology revealed that the optimum levels for these factors were 30 ng/ml for hepatocyte growth factor and 35 ng/ml for oncostatin M. Additional experiments on primary human hepatocyte cultures showed high variance in metabolic activities between cells from different individuals, making determination of optimal levels of factors more difficult. Still, it was possible to conclude that hepatocyte growth factor, epidermal growth factor, and oncostatin M had decisive effects on the metabolic functions of primary human hepatocytes.
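
    A minimal sketch of the response-surface step in such a workflow: fit a full quadratic model to two factors (here HGF and OSM concentrations) and locate the predicted optimum. The responses are synthetic and deliberately centred near the reported optimum, purely for illustration; a real study would use assayed metabolic activities from a screening design:

      import numpy as np

      # 3x3 grid of factor levels (ng/ml) with synthetic responses
      hgf, osm = np.meshgrid([10, 30, 50], [15, 35, 55])
      hgf, osm = hgf.ravel(), osm.ravel()
      response = (100 - 0.05 * (hgf - 30) ** 2 - 0.04 * (osm - 35) ** 2
                  + np.random.default_rng(0).normal(0, 1, hgf.size))

      # Full quadratic model: y = b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
      A = np.column_stack([np.ones_like(hgf), hgf, osm, hgf**2, osm**2, hgf * osm])
      coef, *_ = np.linalg.lstsq(A, response, rcond=None)

      # Evaluate the fitted surface on a fine grid and report the optimum
      g1, g2 = np.meshgrid(np.linspace(10, 50, 81), np.linspace(15, 55, 81))
      G = np.column_stack([np.ones(g1.size), g1.ravel(), g2.ravel(),
                           g1.ravel()**2, g2.ravel()**2, g1.ravel() * g2.ravel()])
      best = np.argmax(G @ coef)
      print(f"predicted optimum: HGF = {g1.ravel()[best]:.0f} ng/ml, OSM = {g2.ravel()[best]:.0f} ng/ml")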

  6. Methodology for designing and manufacturing complex biologically inspired soft robotic fluidic actuators: prosthetic hand case study.

    PubMed

    Thompson-Bean, E; Das, R; McDaid, A

    2016-10-31

    We present a novel methodology for the design and manufacture of complex biologically inspired soft robotic fluidic actuators. The methodology is applied to the design and manufacture of a prosthetic for the hand. Real human hands are scanned to produce a 3D model of a finger, and pneumatic networks are implemented within it to produce a biomimetic bending motion. The finger is then partitioned into material sections, and a genetic algorithm based optimization, using finite element analysis, is employed to discover the optimal material for each section. This is based on two biomimetic performance criteria. Two sets of optimizations using two material sets are performed. Promising optimized material arrangements are fabricated using two techniques to validate the optimization routine, and the fabricated and simulated results are compared. We find that the optimization is successful in producing biomimetic soft robotic fingers and that fabrication of the fingers is possible. Limitations and paths for development are discussed. This methodology can be applied for other fluidic soft robotic devices.

  7. A methodology for the design of experiments in computational intelligence with multiple regression models

    PubMed Central

    Gestal, Marcos; Munteanu, Cristian R.; Dorado, Julian; Pazos, Alejandro

    2016-01-01

    The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence and especially on a correct comparison between the different results provided for different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational Intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results are different for three out of five state-of-the-art simple datasets, and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is of relevance to use a statistical approach to indicate whether the differences are statistically significant with this kind of algorithm. Furthermore, our results with three real complex datasets report different best models than with the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as from other fields, such as bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable. PMID:27920952

  8. The scheme machine: A case study in progress in design derivation at system levels

    NASA Technical Reports Server (NTRS)

    Johnson, Steven D.

    1995-01-01

    The Scheme Machine is one of several design projects of the Digital Design Derivation group at Indiana University. It differs from the other projects in its focus on issues of system design and its connection to surrounding research in programming language semantics, compiler construction, and programming methodology underway at Indiana and elsewhere. The genesis of the project dates to the early 1980s, when digital design derivation research branched from the surrounding research effort in programming languages. Both branches have continued to develop in parallel, with this particular project serving as a bridge. However, by 1990 there remained little real interaction between the branches and recently we have undertaken to reintegrate them. On the software side, researchers have refined a mathematically rigorous (but not mechanized) treatment starting with the fully abstract semantic definition of Scheme and resulting in an efficient implementation consisting of a compiler and virtual machine model, the latter typically realized with a general purpose microprocessor. The derivation includes a number of sophisticated factorizations and representations and is also a deep example of the underlying engineering methodology. The hardware research has created a mechanized algebra supporting the tedious and massive transformations often seen at lower levels of design. This work has progressed to the point that large-scale devices, such as processors, can be derived from first-order finite state machine specifications. This is roughly where the language oriented research stops; thus, together, the two efforts establish a thread from the highest levels of abstract specification to detailed digital implementation. The Scheme Machine project challenges hardware derivation research in several ways, although the individual components of the system are of a similar scale to those we have worked with before. The machine has a custom dual-ported memory to support garbage collection

  9. Design Methodology for Magnetic Field-Based Soft Tri-Axis Tactile Sensors.

    PubMed

    Wang, Hongbo; de Boer, Greg; Kow, Junwai; Alazmani, Ali; Ghajari, Mazdak; Hewson, Robert; Culmer, Peter

    2016-08-24

    Tactile sensors are essential if robots are to safely interact with the external world and to dexterously manipulate objects. Current tactile sensors have limitations restricting their use, notably being too fragile or having limited performance. Magnetic field-based soft tactile sensors offer a potential improvement, being durable, low cost, accurate and high bandwidth, but they are relatively undeveloped because of the complexities involved in design and calibration. This paper presents a general design methodology for magnetic field-based three-axis soft tactile sensors, enabling researchers to easily develop specific tactile sensors for a variety of applications. All aspects (design, fabrication, calibration and evaluation) of the development of tri-axis soft tactile sensors are presented and discussed. A moving least square approach is used to decouple and convert the magnetic field signal to force output to eliminate non-linearity and cross-talk effects. A case study of a tactile sensor prototype, MagOne, was developed. This achieved a resolution of 1.42 mN in normal force measurement (0.71 mN in shear force), good output repeatability, and a maximum hysteresis error of 3.4%. These results outperform comparable sensors reported previously, highlighting the efficacy of our methodology for sensor design.
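
    A minimal sketch of a moving-least-squares style calibration of the kind described: a locally weighted linear fit around each query point maps a 3-axis magnetic reading to a 3-axis force. The calibration data, Gaussian weight width and linear "true" mapping are synthetic assumptions, not MagOne data:

      import numpy as np

      rng = np.random.default_rng(0)
      B_cal = rng.uniform(-1.0, 1.0, size=(200, 3))                  # calibration signals
      true_map = np.array([[2.0, 0.1, 0.0],
                           [0.0, 1.8, 0.2],
                           [0.1, 0.0, 2.5]])
      F_cal = B_cal @ true_map.T + 0.01 * rng.normal(size=(200, 3))  # "measured" forces

      def mls_predict(b_query, B, F, h=0.4):
          """Locally weighted (moving least squares) affine fit around b_query."""
          w = np.sqrt(np.exp(-np.sum((B - b_query) ** 2, axis=1) / h**2))[:, None]
          A = np.hstack([np.ones((len(B), 1)), B])                   # affine basis
          coef, *_ = np.linalg.lstsq(w * A, w * F, rcond=None)       # (4 x 3) local model
          return np.concatenate(([1.0], b_query)) @ coef

      print(np.round(mls_predict(np.array([0.2, -0.1, 0.3]), B_cal, F_cal), 3))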

  10. Creating innovative research designs: the 10-year Methodological Think Tank case study.

    PubMed

    Katerndahl, David; Crabtree, Benjamin

    2006-01-01

    Addressing important but complex research questions often necessitates the creation of innovative mixed methods designs. This report describes an approach to developing research designs for studying important but methodologically challenging research questions. The Methodological Think Tank has been held annually in conjunction with the Primary Care Research Methods and Statistics Conference in San Antonio since 1994. A group of 3 to 4 methodologists with expertise balanced between quantitative and qualitative backgrounds is invited by the think tank coordinators to serve on a 2-day think tank to discuss a research question selected from those submitted in response to a call for proposals. During the first half-day, these experts explore the content area with the investigator, often challenging beliefs and assumptions. During the second half-day, the think tank participants systematically prune potential approaches until a desirable research method is identified. To date, the most recent 7 think tanks have produced fundable research designs, with 1 being funded by a K award and 4 by R01 grants. All participating investigators attributed much of their success to think tank participation. Lessons learned include (1) the importance of careful selection of participating methodologists, (2) all think tank communities of inquiry must go through 4 stages of development from pseudocommunity to community, and (3) the critical importance of listening by the investigator. Researchers and academic departments could use this process locally to develop innovative research designs.

  11. Creating Innovative Research Designs: The 10-Year Methodological Think Tank Case Study

    PubMed Central

    Katerndahl, David; Crabtree, Benjamin

    2006-01-01

    PURPOSE Addressing important but complex research questions often necessitates the creation of innovative mixed methods designs. This report describes an approach to developing research designs for studying important but methodologically challenging research questions. METHODS The Methodological Think Tank has been held annually in conjunction with the Primary Care Research Methods and Statistics Conference in San Antonio since 1994. A group of 3 to 4 methodologists with expertise balanced between quantitative and qualitative backgrounds is invited by the think tank coordinators to serve on a 2-day think tank to discuss a research question selected from those submitted in response to a call for proposals. During the first half-day, these experts explore the content area with the investigator, often challenging beliefs and assumptions. During the second half-day, the think tank participants systematically prune potential approaches until a desirable research method is identified. RESULTS To date, the most recent 7 think tanks have produced fundable research designs, with 1 being funded by a K award and 4 by R01 grants. All participating investigators attributed much of their success to think tank participation. Lessons learned include (1) the importance of careful selection of participating methodologists, (2) all think tank communities of inquiry must go through 4 stages of development from pseudocommunity to community, and (3) the critical importance of listening by the investigator. CONCLUSION Researchers and academic departments could use this process locally to develop innovative research designs. PMID:17003146

  12. The engagement of children with disabilities in health-related technology design processes: identifying methodology.

    PubMed

    Allsop, Matthew J; Holt, Raymond J; Levesley, Martin C; Bhakta, Bipinchandra

    2010-01-01

    This review aims to identify research methodology that is suitable for involving children with disabilities in the design of healthcare technology, such as assistive technology and rehabilitation equipment. A review of the literature included the identification of methodology that is available from domains outside of healthcare and suggested a selection of available methods. The need to involve end users within the design of healthcare technology was highlighted, with particular attention to the need for greater levels of participation from children with disabilities within all healthcare research. Issues that may arise when trying to increase such involvement included the need to consider communication via feedback and tailored information, the need to measure levels of participation occurring in current research, and caution regarding the use of proxy information. Additionally, five suitable methods were highlighted that are available for use with children with disabilities in the design of healthcare technology. The methods identified in the review need to be put into practice to establish effective and, if necessary, novel ways of designing healthcare technology when end users are children with disabilities.

  13. Design Methodology for Magnetic Field-Based Soft Tri-Axis Tactile Sensors

    PubMed Central

    Wang, Hongbo; de Boer, Greg; Kow, Junwai; Alazmani, Ali; Ghajari, Mazdak; Hewson, Robert; Culmer, Peter

    2016-01-01

    Tactile sensors are essential if robots are to safely interact with the external world and to dexterously manipulate objects. Current tactile sensors have limitations restricting their use, notably being too fragile or having limited performance. Magnetic field-based soft tactile sensors offer a potential improvement, being durable, low cost, accurate and high bandwidth, but they are relatively undeveloped because of the complexities involved in design and calibration. This paper presents a general design methodology for magnetic field-based three-axis soft tactile sensors, enabling researchers to easily develop specific tactile sensors for a variety of applications. All aspects (design, fabrication, calibration and evaluation) of the development of tri-axis soft tactile sensors are presented and discussed. A moving least square approach is used to decouple and convert the magnetic field signal to force output to eliminate non-linearity and cross-talk effects. A case study of a tactile sensor prototype, MagOne, was developed. This achieved a resolution of 1.42 mN in normal force measurement (0.71 mN in shear force), good output repeatability, and a maximum hysteresis error of 3.4%. These results outperform comparable sensors reported previously, highlighting the efficacy of our methodology for sensor design. PMID:27563908

  14. Low-Radiation Cellular Inductive Powering of Rodent Wireless Brain Interfaces: Methodology and Design Guide.

    PubMed

    Soltani, Nima; Aliroteh, Miaad S; Salam, M Tariqus; Perez Velazquez, Jose Luis; Genov, Roman

    2016-03-04

    This paper presents a general methodology of inductive power delivery in wireless chronic rodent electrophysiology applications. The focus is on such systems design considerations under the following key constraints: maximum power delivery under the allowable specific absorption rate (SAR), low cost and spatial scalability. The methodology includes inductive coil design considerations within a low-frequency ferrite-core-free power transfer link which includes a scalable coil-array power transmitter floor and a single-coil implanted or worn power receiver. A specific design example is presented that includes the concept of low-SAR cellular single-transmitter-coil powering through dynamic tracking of a magnet-less receiver spatial location. The transmitter coil instantaneous supply current is monitored using a small number of low-cost electronic components. A drop in its value indicates the proximity of the receiver due to the reflected impedance of the latter. Only the transmitter coil nearest to the receiver is activated. Operating at the low frequency of 1.5 MHz, the inductive powering floor delivers a maximum of 15.9 W below the IEEE C95 SAR limit, which is over three times greater than that in other recently reported designs. The power transfer efficiency of 39% and 13% at the nominal and maximum distances of 8 cm and 11 cm, respectively, is maintained.
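
    A minimal sketch of the coil-selection logic described above: each transmitter coil's supply current is compared with its no-load baseline, and only the coil with the largest relative drop (reflected impedance from a nearby receiver) is activated. The baselines, readings and 5% threshold are illustrative numbers, not measurements from the paper:

      baseline_mA = {"coil_1": 120.0, "coil_2": 118.0, "coil_3": 121.0, "coil_4": 119.0}
      reading_mA  = {"coil_1": 119.5, "coil_2": 104.0, "coil_3": 120.8, "coil_4": 118.7}
      DROP_THRESHOLD = 0.05        # activate when the current drops by more than 5%

      def select_active_coil(baseline, reading, threshold):
          drops = {c: (baseline[c] - reading[c]) / baseline[c] for c in baseline}
          coil, drop = max(drops.items(), key=lambda kv: kv[1])
          return coil if drop > threshold else None        # None: no receiver detected

      print("activate:", select_active_coil(baseline_mA, reading_mA, DROP_THRESHOLD))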

  15. Low-Radiation Cellular Inductive Powering of Rodent Wireless Brain Interfaces: Methodology and Design Guide.

    PubMed

    Soltani, Nima; Aliroteh, Miaad S; Salam, M Tariqus; Perez Velazquez, Jose Luis; Genov, Roman

    2016-08-01

    This paper presents a general methodology of inductive power delivery in wireless chronic rodent electrophysiology applications. The focus is on such systems design considerations under the following key constraints: maximum power delivery under the allowable specific absorption rate (SAR), low cost and spatial scalability. The methodology includes inductive coil design considerations within a low-frequency ferrite-core-free power transfer link which includes a scalable coil-array power transmitter floor and a single-coil implanted or worn power receiver. A specific design example is presented that includes the concept of low-SAR cellular single-transmitter-coil powering through dynamic tracking of a magnet-less receiver spatial location. The transmitter coil instantaneous supply current is monitored using a small number of low-cost electronic components. A drop in its value indicates the proximity of the receiver due to the reflected impedance of the latter. Only the transmitter coil nearest to the receiver is activated. Operating at the low frequency of 1.5 MHz, the inductive powering floor delivers a maximum of 15.9 W below the IEEE C95 SAR limit, which is over three times greater than that in other recently reported designs. The power transfer efficiency of 39% and 13% at the nominal and maximum distances of 8 cm and 11 cm, respectively, is maintained.

  16. Design methodology: edgeless 3D ASICs with complex in-pixel processing for pixel detectors

    SciTech Connect

    Fahim, Farah; Deptuch, Grzegorz W.; Hoff, James R.; Mohseni, Hooman

    2015-08-28

    The design methodology for the development of 3D integrated edgeless pixel detectors with in-pixel processing using Electronic Design Automation (EDA) tools is presented. A large area 3 tier 3D detector with one sensor layer and two ASIC layers containing one analog and one digital tier, is built for x-ray photon time of arrival measurement and imaging. A full custom analog pixel is 65μm x 65μm. It is connected to a sensor pixel of the same size on one side, and on the other side it has approximately 40 connections to the digital pixel. A 32 x 32 edgeless array without any peripheral functional blocks constitutes a sub-chip. The sub-chip is an indivisible unit, which is further arranged in a 6 x 6 array to create the entire 1.248cm x 1.248cm ASIC. Each chip has 720 bump-bond I/O connections, on the back of the digital tier to the ceramic PCB. All the analog tier power and biasing is conveyed through the digital tier from the PCB. The assembly has no peripheral functional blocks, and hence the active area extends to the edge of the detector. This was achieved by using a few flavors of almost identical analog pixels (minimal variation in layout) to allow for peripheral biasing blocks to be placed within pixels. The 1024 pixels within a digital sub-chip array have a variety of full custom, semi-custom and automated timing driven functional blocks placed together. The methodology uses a modified mixed-mode on-top digital implementation flow to not only harness the tool efficiency for timing and floor-planning but also to maintain designer control over compact parasitically aware layout. The methodology uses the Cadence design platform, however it is not limited to this tool.

  17. Methodology to design a municipal solid waste generation and composition map: a case study.

    PubMed

    Gallardo, A; Carlos, M; Peris, M; Colomer, F J

    2015-02-01

    Municipal solid waste (MSW) management is an important task that local governments as well as private companies must take into account to protect human health, the environment and to preserve natural resources. To design an adequate MSW management plan the first step consists in defining the waste generation and composition patterns of the town. As these patterns depend on several socio-economic factors it is advisable to organize them previously. Moreover, the waste generation and composition patterns may vary around the town and over time. Generally, the data are not homogeneous around the city as the number of inhabitants is not constant, nor is the economic activity. Therefore, if all the information is shown in thematic maps, the final waste management decisions can be made more efficiently. The main aim of this paper is to present a structured methodology that allows local authorities or private companies who deal with MSW to design their own MSW management plan depending on the available data. According to these data, this paper proposes two ways of action: a direct way when detailed data are available and an indirect way when there is a lack of data and it is necessary to take into account bibliographic data. In any case, the amount of information needed is considerable. This paper combines the planning methodology with Geographic Information Systems to present the final results in thematic maps that make them easier to interpret. The proposed methodology is a useful preliminary tool to organize the MSW collection routes including the selective collection. To verify the methodology it has been successfully applied to a Spanish town. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Methodology to design a municipal solid waste generation and composition map: a case study.

    PubMed

    Gallardo, A; Carlos, M; Peris, M; Colomer, F J

    2014-11-01

    Municipal solid waste (MSW) management is an important task that local governments as well as private companies must take into account to protect human health, the environment and to preserve natural resources. To design an adequate MSW management plan the first step consists in defining the waste generation and composition patterns of the town. As these patterns depend on several socio-economic factors it is advisable to organize them previously. Moreover, the waste generation and composition patterns may vary around the town and over time. Generally, the data are not homogeneous around the city as the number of inhabitants is not constant, nor is the economic activity. Therefore, if all the information is shown in thematic maps, the final waste management decisions can be made more efficiently. The main aim of this paper is to present a structured methodology that allows local authorities or private companies who deal with MSW to design their own MSW management plan depending on the available data. According to these data, this paper proposes two ways of action: a direct way when detailed data are available and an indirect way when there is a lack of data and it is necessary to take into account bibliographic data. In any case, the amount of information needed is considerable. This paper combines the planning methodology with Geographic Information Systems to present the final results in thematic maps that make them easier to interpret. The proposed methodology is a useful preliminary tool to organize the MSW collection routes including the selective collection. To verify the methodology it has been successfully applied to a Spanish town. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Design, Progressive Modeling, Manufacture, and Testing of Composite Shield for Turbine Engine Blade Containment

    NASA Technical Reports Server (NTRS)

    Binienda, Wieslaw K.; Sancaktar, Erol; Roberts, Gary D. (Technical Monitor)

    2002-01-01

    An effective design methodology was established for composite jet engine containment structures. The methodology included the development of full- and reduced-size prototypes and FEA models of the containment structure, experimental and numerical examination of the modes of failure due to a turbine blade-out event, identification of materials and design candidates for future industrial applications, and design and building of prototypes for testing and evaluation purposes.

  20. Multirate Flutter Suppression System Design for the Benchmark Active Controls Technology Wing. Part 2; Methodology Application Software Toolbox

    NASA Technical Reports Server (NTRS)

    Mason, Gregory S.; Berg, Martin C.; Mukhopadhyay, Vivek

    2002-01-01

    To study the effectiveness of various control system design methodologies, the NASA Langley Research Center initiated the Benchmark Active Controls Project. In this project, the various methodologies were applied to design a flutter suppression system for the Benchmark Active Controls Technology (BACT) Wing. This report describes the user's manual and software toolbox developed at the University of Washington to design a multirate flutter suppression control law for the BACT wing.

  1. Inductive Powering of Subcutaneous Stimulators: Key Parameters and Their Impact on the Design Methodology

    PubMed Central

    Godfraind, Carmen; Debelle, Adrien; Lonys, Laurent; Acuña, Vicente; Doguet, Pascal; Nonclercq, Antoine

    2016-01-01

    Inductive powering of implantable medical devices involves numerous factors acting on the system efficiency and safety in adversarial ways. This paper clarifies their role and identifies a procedure enabling the system design. The procedure decouples the problem into four principal steps: the choice of frequency, the magnetic link optimization, the secondary circuit design and, finally, the primary circuit design. The methodology has been tested for the powering system of a device requiring a power of 300 mW and implanted at a distance of 15 to 30 mm from the outside power source. It allowed the identification of the most critical parameters. A satisfying efficiency of 34% was reached at 21 mm, which tends to validate the proposed design procedure. PMID:27478572
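
    A minimal sketch of the link-budget arithmetic that sits behind such a procedure, using the standard maximum-efficiency expression for an inductive link, eta = k^2*Q1*Q2 / (1 + sqrt(1 + k^2*Q1*Q2))^2; the coupling coefficients and coil quality factors below are illustrative assumptions, not the paper's measured values:

      import math

      def max_link_efficiency(k, q1, q2):
          fom = k**2 * q1 * q2                        # link figure of merit
          return fom / (1.0 + math.sqrt(1.0 + fom)) ** 2

      Q1, Q2 = 150.0, 10.0                            # external coil Q, implanted coil Q
      for k in (0.08, 0.05, 0.03):                    # coupling falls as distance grows
          print(f"k = {k:.2f}  ->  max link efficiency = {100 * max_link_efficiency(k, Q1, Q2):.0f}%")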

  2. Design Methodology: ASICs with complex in-pixel processing for Pixel Detectors

    SciTech Connect

    Fahim, Farah

    2014-10-31

    The development of Application Specific Integrated Circuits (ASIC) for pixel detectors with complex in-pixel processing using Computer Aided Design (CAD) tools that are, themselves, mainly developed for the design of conventional digital circuits requires a specialized approach. Mixed signal pixels often require parasitically aware detailed analog front-ends and extremely compact digital back-ends with more than 1000 transistors in small areas below 100μm x 100μm. These pixels are tiled to create large arrays, which have the same clock distribution and data readout speed constraints as in, for example, micro-processors. The methodology uses a modified mixed-mode on-top digital implementation flow to not only harness the tool efficiency for timing and floor-planning but also to maintain designer control over compact parasitically aware layout.

  3. Inductive Powering of Subcutaneous Stimulators: Key Parameters and Their Impact on the Design Methodology.

    PubMed

    Godfraind, Carmen; Debelle, Adrien; Lonys, Laurent; Acuña, Vicente; Doguet, Pascal; Nonclercq, Antoine

    2016-06-13

    Inductive powering of implantable medical devices involves numerous factors acting on the system efficiency and safety in adversarial ways. This paper clarifies their role and identifies a procedure enabling the system design. The procedure decouples the problem into four principal steps: the choice of frequency, the magnetic link optimization, the secondary circuit design and, finally, the primary circuit design. The methodology has been tested for the powering system of a device requiring a power of 300 mW and implanted at a distance of 15 to 30 mm from the outside power source. It allowed the identification of the most critical parameters. A satisfying efficiency of 34% was reached at 21 mm, which tends to validate the proposed design procedure.

  4. Numerical simulation methodologies for design and development of Diffuser-Augmented Wind Turbines - analysis and comparison

    NASA Astrophysics Data System (ADS)

    Lipian, Michał; Karczewski, Maciej; Molinski, Jakub; Jozwik, Krzysztof

    2016-01-01

    Different numerical computation methods used to develop a methodology for fast, efficient, reliable design and comparison of Diffuser-Augmented Wind Turbine (DAWT) geometries are presented. The demand for such methods is evident, following the multitude of geometrical parameters that influence the flow character through ducted turbines. The results of the Actuator Disk Model (ADM) simulations will be compared with a simulation method of a higher order of accuracy, i.e. the 3D Fully-resolved Rotor Model (FRM), in the rotor design point. Both will be checked for consistency with the experimental results measured in the wind tunnel at the Institute of Turbo-machinery (IMP), Lodz University of Technology (TUL). An attempt to find an efficient method (with a compromise between accuracy and design time) for the flow analysis pertinent to the DAWT is a novel approach presented in this paper.
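
    A minimal sketch of the one-dimensional momentum relations that an Actuator Disk Model builds on: thrust coefficient Ct = 4a(1-a) and power coefficient Cp = 4a(1-a)^2, maximized at the Betz point a = 1/3. The diffuser's augmentation of the mass flow is not modeled here; this only illustrates the bare-rotor baseline underlying the ADM:

      import numpy as np

      a = np.linspace(0.0, 0.5, 501)            # axial induction factor
      ct = 4.0 * a * (1.0 - a)
      cp = 4.0 * a * (1.0 - a) ** 2

      i_best = np.argmax(cp)
      print(f"Betz optimum: a = {a[i_best]:.3f}, Cp = {cp[i_best]:.4f} (16/27 = {16/27:.4f})")
      print(f"thrust coefficient at that point: Ct = {ct[i_best]:.3f}")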

  5. A Human-Centered Design Methodology to Enhance the Usability, Human Factors, and User Experience of Connected Health Systems: A Three-Phase Methodology

    PubMed Central

    Harte, Richard; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul MA; Scharf, Thomas; ÓLaighin, Gearóid

    2017-01-01

    Background Design processes such as human-centered design, which involve the end user throughout the product development and testing process, can be crucial in ensuring that the product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of human-centered design can often present a challenge when design teams are faced with the necessary, rapid, product development life cycles associated with the competitive connected health industry. Objective We wanted to derive a structured methodology that followed the principles of human-centered design that would allow designers and developers to ensure that the needs of the user are taken into account throughout the design process, while maintaining a rapid pace of development. In this paper, we present the methodology and its rationale before outlining how it was applied to assess and enhance the usability, human factors, and user experience of a connected health system known as the Wireless Insole for Independent and Safe Elderly Living (WIISEL) system, a system designed to continuously assess fall risk by measuring gait and balance parameters associated with fall risk. Methods We derived a three-phase methodology. In Phase 1 we emphasized the construction of a use case document. This document can be used to detail the context of use of the system by utilizing storyboarding, paper prototypes, and mock-ups in conjunction with user interviews to gather insightful user feedback on different proposed concepts. In Phase 2 we emphasized the use of expert usability inspections such as heuristic evaluations and cognitive walkthroughs with small multidisciplinary groups to review the prototypes born out of the Phase 1 feedback. Finally, in Phase 3 we emphasized classical user testing with target end users, using various metrics to measure the user experience and improve the final prototypes. Results We report a successful implementation of the

  6. A Human-Centered Design Methodology to Enhance the Usability, Human Factors, and User Experience of Connected Health Systems: A Three-Phase Methodology.

    PubMed

    Harte, Richard; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul Ma; Scharf, Thomas; Quinlan, Leo R; ÓLaighin, Gearóid

    2017-03-16

    Design processes such as human-centered design, which involve the end user throughout the product development and testing process, can be crucial in ensuring that the product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of human-centered design can often present a challenge when design teams are faced with the necessary, rapid, product development life cycles associated with the competitive connected health industry. We wanted to derive a structured methodology that followed the principles of human-centered design that would allow designers and developers to ensure that the needs of the user are taken into account throughout the design process, while maintaining a rapid pace of development. In this paper, we present the methodology and its rationale before outlining how it was applied to assess and enhance the usability, human factors, and user experience of a connected health system known as the Wireless Insole for Independent and Safe Elderly Living (WIISEL) system, a system designed to continuously assess fall risk by measuring gait and balance parameters associated with fall risk. We derived a three-phase methodology. In Phase 1 we emphasized the construction of a use case document. This document can be used to detail the context of use of the system by utilizing storyboarding, paper prototypes, and mock-ups in conjunction with user interviews to gather insightful user feedback on different proposed concepts. In Phase 2 we emphasized the use of expert usability inspections such as heuristic evaluations and cognitive walkthroughs with small multidisciplinary groups to review the prototypes born out of the Phase 1 feedback. Finally, in Phase 3 we emphasized classical user testing with target end users, using various metrics to measure the user experience and improve the final prototypes. We report a successful implementation of the methodology for the design and development

  7. A Digital Methodology for the Design Process of Aerospace Assemblies with Sustainable Composite Processes & Manufacture

    NASA Astrophysics Data System (ADS)

    McEwan, W.; Butterfield, J.

    2011-05-01

    The well-established benefits of composite materials are driving a significant shift in design and manufacture strategies for original equipment manufacturers (OEMs). Thermoplastic composites have advantages over the traditional thermosetting materials with regard to sustainability and environmental impact, features which are becoming increasingly pertinent in the aerospace arena. However, when sustainability and environmental impact are considered as design drivers, integrated methods for part design and product development must be developed so that any benefits of sustainable composite material systems can be assessed during the design process. These methods must include mechanisms to account for process-induced part variation and techniques related to re-forming, recycling and decommissioning, which are in their infancy. It is proposed in this paper that predictive techniques related to material specification, part processing and product cost of thermoplastic composite components be integrated within a Through Life Management (TLM) product development methodology as part of a larger strategy of product system modeling to improve disciplinary concurrency, realistic part performance, and to place sustainability at the heart of the design process. This paper reports the enhancement of digital manufacturing tools as a means of drawing simulated part manufacturing scenarios, real-time costing mechanisms, and broader lifecycle performance data capture into the design cycle. The work demonstrates predictive processes for sustainable composite product manufacture and how a Product-Process-Resource (PPR) structure can be customised and enhanced to include design intent driven by 'Real' part geometry and consequent assembly.

  8. A system-of-systems modeling methodology for strategic general aviation design decision-making

    NASA Astrophysics Data System (ADS)

    Won, Henry Thome

    General aviation has long been studied as a means of providing an on-demand "personal air vehicle" that bypasses the traffic at major commercial hubs. This thesis continues this research through development of a system of systems modeling methodology applicable to the selection of synergistic product concepts, market segments, and business models. From the perspective of the conceptual design engineer, the design and selection of future general aviation aircraft is complicated by the definition of constraints and requirements, and the tradeoffs among performance and cost aspects. Qualitative problem definition methods have been utilized, although their accuracy in determining specific requirement and metric values is uncertain. In industry, customers are surveyed, and business plans are created through a lengthy, iterative process. In recent years, techniques have been developed for predicting the characteristics of US travel demand based on travel mode attributes, such as door-to-door time and ticket price. As of yet, these models treat the contributing systems---aircraft manufacturers and service providers---as independently variable assumptions. In this research, a methodology is developed which seeks to build a strategic design decision-making environment through the construction of a system of systems model. The demonstrated implementation brings together models of the aircraft and manufacturer, the service provider, and most importantly the travel demand. Thus represented is the behavior of the consumers and the reactive behavior of the suppliers---the manufacturers and transportation service providers---in a common modeling framework. The results indicate an ability to guide the design process---specifically the selection of design requirements---through the optimization of "capability" metrics. Additionally, results indicate the ability to find synergetic solutions, that is, solutions in which two systems might collaborate to achieve a better result than acting

  9. LOX/hydrocarbon rocket engine analytical design methodology development and validation. Volume 2: Appendices

    NASA Technical Reports Server (NTRS)

    Niiya, Karen E.; Walker, Richard E.; Pieper, Jerry L.; Nguyen, Thong V.

    1993-01-01

    This final report includes a discussion of the work accomplished during the period from Dec. 1988 through Nov. 1991. The objective of the program was to assemble existing performance and combustion stability models into a usable design methodology capable of designing and analyzing high-performance and stable LOX/hydrocarbon booster engines. The methodology was then used to design a validation engine. The capabilities and validity of the methodology were demonstrated using this engine in an extensive hot fire test program. The engine used LOX/RP-1 propellants and was tested over a range of mixture ratios, chamber pressures, and acoustic damping device configurations. This volume contains time domain and frequency domain stability plots which indicate the pressure perturbation amplitudes and frequencies from approximately 30 tests of a 50K thrust rocket engine using LOX/RP-1 propellants over a range of chamber pressures from 240 to 1750 psia with mixture ratios from 1.2 to 7.5. The data is from test configurations which used both bitune and monotune acoustic cavities and from tests with no acoustic cavities. The engine had a length of 14 inches and a contraction ratio of 2.0 using a 7.68 inch diameter injector. The data was taken from both stable and unstable tests. All combustion instabilities were spontaneous in the first tangential mode. Although stability bombs were used and generated overpressures of approximately 20 percent, no tests were driven unstable by the bombs. The stability instrumentation included six high-frequency Kistler transducers in the combustion chamber, a high-frequency Kistler transducer in each propellant manifold, and tri-axial accelerometers. Performance data are presented, both characteristic velocity efficiencies and energy release efficiencies, for those tests of sufficient duration to record steady-state values.
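
    The characteristic velocity efficiency quoted in the performance data has a standard textbook definition; the relations below are generic rocket-performance definitions and are not taken from the report itself:

        c^* = \frac{p_c A_t}{\dot{m}}, \qquad \eta_{c^*} = \frac{c^*_{\mathrm{delivered}}}{c^*_{\mathrm{theoretical}}},

    where p_c is the measured chamber pressure, A_t the nozzle throat area, and \dot{m} the total propellant mass flow rate.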

  10. Meta-Design. An Approach to the Development of Design Methodologies

    DTIC Science & Technology

    1990-01-01

    …subject to the constraints. The KKT conditions are necessary conditions for a particular value X* of the vector of design variables X to be a … optimization problem a second time. We start by applying the requirement that optimality is to be maintained, so we must also satisfy the third KKT condition … the optimal solution to this problem must satisfy the third KKT condition: ∂f/∂c1 + λ1 ∂g/∂c1 = 2(c1/10 - 1)(1/10) + λ1 = 0. Then λ1 = (1/5)(1 - c1/10)
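
    Reading the fragment above as a worked example with an objective term f(c_1) = (c_1/10 - 1)^2 and a single constraint whose gradient with respect to c_1 is unity (an interpretation of the excerpt, not a statement from the report), the cited condition and its solution are consistent:

        \frac{\partial f}{\partial c_1} + \lambda_1 \frac{\partial g}{\partial c_1}
        = \frac{2}{10}\left(\frac{c_1}{10} - 1\right) + \lambda_1 = 0
        \quad\Longrightarrow\quad
        \lambda_1 = \frac{1}{5}\left(1 - \frac{c_1}{10}\right).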

  11. New Methodology of Designing Inexpensive Hybrid Control-Acquisition Systems for Mechatronic Constructions

    PubMed Central

    Augustyn, Jacek

    2013-01-01

    This article presents a new methodology for designing a hybrid control and acquisition system consisting of a 32-bit SoC microsystem connected via a direct Universal Serial Bus (USB) with a standard commercial off-the-shelf (COTS) component running the Android operating system. It is proposed to utilize this direct connection, avoiding the use of an additional converter. An Android-based component was chosen to explore the potential for a mobile, compact and energy-efficient solution with easy-to-build user interfaces and easy wireless integration with other computer systems. This paper presents results of practical implementation and analysis of experimental real-time performance. It covers closed control loop time between the sensor/actuator module and the Android operating system as well as the real-time sensor data stream within such a system. Some optimisations are proposed and their influence on real-time performance was investigated. The proposed methodology is intended for acquisition and control of mechatronic systems, especially mobile robots. It can be used in a wide range of control applications as well as embedded acquisition-recording devices, including energy quality measurements, smart-grids and medicine. It is demonstrated that the proposed methodology can be employed without developing specific device drivers. The latency achieved was less than 0.5 ms and the sensor data stream throughput was on the order of 750 KB/s (compared to 3 ms latency and 300 KB/s in traditional solutions). PMID:24351633

  12. New methodology of designing inexpensive hybrid control-acquisition systems for mechatronic constructions.

    PubMed

    Augustyn, Jacek

    2013-12-13

    This article presents a new methodology for designing a hybrid control and acquisition system consisting of a 32-bit SoC microsystem connected via a direct Universal Serial Bus (USB) with a standard commercial off-the-shelf (COTS) component running the Android operating system. It is proposed to utilize this direct connection, avoiding the use of an additional converter. An Android-based component was chosen to explore the potential for a mobile, compact and energy-efficient solution with easy-to-build user interfaces and easy wireless integration with other computer systems. This paper presents results of practical implementation and analysis of experimental real-time performance. It covers closed control loop time between the sensor/actuator module and the Android operating system as well as the real-time sensor data stream within such a system. Some optimisations are proposed and their influence on real-time performance was investigated. The proposed methodology is intended for acquisition and control of mechatronic systems, especially mobile robots. It can be used in a wide range of control applications as well as embedded acquisition-recording devices, including energy quality measurements, smart-grids and medicine. It is demonstrated that the proposed methodology can be employed without developing specific device drivers. The latency achieved was less than 0.5 ms and the sensor data stream throughput was on the order of 750 KB/s (compared to 3 ms latency and 300 KB/s in traditional solutions).

  13. P-band Radar Retrieval of Root-Zone Soil Moisture: AirMOSS Methodology, Progress, and Improvements

    NASA Astrophysics Data System (ADS)

    Moghaddam, M.; Tabatabaeenejad, A.; Chen, R.

    2015-12-01

    The AirMOSS mission seeks to improve the estimates of the North American Net Ecosystem Exchange (NEE) by providing high-resolution observations of the root zone soil moisture (RZSM) over regions representative of the major North American biomes. The radar snapshots are used to generate estimates of RZSM. To retrieve RZSM, we use a discrete scattering model integrated with layered-soil scattering models. The soil moisture profile is represented as a quadratic function in the form of az^2 + bz + c, where z is the depth and a, b, and c are the coefficients to be retrieved. The ancillary data necessary to characterize a pixel are available from various databases. We apply the retrieval method to the radar data acquired over AirMOSS sites including Canada's BERMS, Walnut Gulch in Arizona, MOISST in Oklahoma, Tonzi Ranch in California, and Metolius in Oregon, USA. The estimated soil moisture profile is validated against in-situ soil moisture measurements. We have continued to improve the accuracy of retrievals as the delivery of the RZSM products has progressed since 2012. For example, the 'threshold depth' (the depth up to which the retrieval is mathematically valid) has been reduced from 100 cm to 50 cm after the retrieval accuracy was assessed both mathematically and physically. Moreover, we progressively change the implementation of the inversion code and its subroutines as we find more accurate and efficient ways of mathematical operations. The latest AirMOSS results (including soil moisture maps, validation plots, and scatter plots) as well as all improvements applied to the retrieval algorithm, including the one mentioned above, will be reported at the talk, following a brief description of the retrieval methodology. Fig. 1 shows a validation plot for a flight over Tonzi Ranch from September 2014 (a) and a scatter plot for various threshold depths using 2012 and 2013 data.
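
    As an illustration of the quadratic profile representation described above, the retrieved polynomial can be evaluated down to the threshold depth; the coefficients below are placeholders for illustration only, not AirMOSS retrieval values.

      # Sketch: evaluate a root-zone soil moisture profile m(z) = a*z**2 + b*z + c
      # down to a threshold depth. Coefficients are illustrative, not AirMOSS results.
      def soil_moisture_profile(z_cm, a, b, c):
          """Volumetric soil moisture at depth z (cm) for the quadratic parameterization."""
          return a * z_cm**2 + b * z_cm + c

      a, b, c = -1.0e-5, 1.5e-3, 0.12   # hypothetical retrieved coefficients
      threshold_depth_cm = 50           # depth to which the retrieval is considered valid
      for z in range(0, threshold_depth_cm + 1, 10):
          print(f"z = {z:3d} cm  ->  moisture = {soil_moisture_profile(z, a, b, c):.3f}")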

  14. Redirector design methodology for horizontal target plane applications at the Central Receiver Test Facility

    SciTech Connect

    Arvizu, D.E.; Mulholland, G.P.

    1984-11-01

    The equations necessary for designing a multifaceted redirector that directs energy from a heliostat onto a secondary, sometimes horizontal, target have been derived. Although the equations are quite general, the approach has been formulated with specific applications of the Central Receiver Test Facility (CRTF) and the Sandia Solar Furnace in mind. A computer code, ORC, has been developed that applies the derived set of equations to the CRTF heliostat field. The output of ORC is a preliminary design for the redirector. This output is subsequently used as an input to the CRTF facility code, HELIOS, to obtain a complete flux density distribution on both the redirector and receiver surfaces. Upon examination of these results, the redirector design can be modified and the above procedures repeated until a satisfactory design is obtained. The proposed design methodology is illustrated with a preliminary design example. The new capabilities that a redirector can provide to the CRTF or the Solar Furnace represent a powerful new resource for activities and experiments where radiation direction is an important variable.
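
    The derived equations themselves are not reproduced in this record, but the geometric core of any facet-aiming calculation of this kind is the specular reflection law, stated here in generic unit-vector form (not taken from the ORC code):

        \hat{r} = \hat{d} - 2(\hat{d}\cdot\hat{n})\,\hat{n}, \qquad
        \hat{n} = \frac{\hat{r}_{\mathrm{target}} - \hat{d}}{\lVert \hat{r}_{\mathrm{target}} - \hat{d} \rVert},

    where \hat{d} is the unit direction of the ray arriving from the heliostat, \hat{n} the facet normal, and \hat{r}_{\mathrm{target}} the desired unit direction toward the (possibly horizontal) target; the second relation gives the facet normal that redirects \hat{d} into \hat{r}_{\mathrm{target}}.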

  15. Novel DPT methodology co-optimized with design rules for sub-20nm device

    NASA Astrophysics Data System (ADS)

    Lee, Hyun-Jong; Choi, Soo-Han; Yang, Jae-Seok; Chun, Kwan-Young; Do, Jeong-ho; Park, Chul-Hong

    2012-11-01

    Because extreme ultraviolet (EUV) lithography is not ready due to technical challenges and low throughput, we are facing severe limitations for sub-20nm node patterning even though extreme resolution enhancement technologies (RET) such as off-axis illumination and computational lithography have been used to achieve a sufficient process window and critical dimension uniformity (CDU). As an alternative solution, double patterning technology (DPT) becomes the essential patterning scheme for the sub-20nm technology node. DPT requires complex design rules because the rules need to consider layout decomposability into two masks. In order to improve CDU and to achieve both design rule simplicity and better designability, we propose two kinds of layout decomposition methodologies in this paper: 1) a new mandrel decomposition of the Fin generation for better uniformity, and 2) chip-level decomposition and a colorless design rule for the contact to improve scalability. Co-optimized design rules, decomposition methods and process requirements enable us to obtain about 6% scaling benefit in comparison with a normal DPT flow. These DPT approaches provide benefits for both process and design.
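
    Layout decomposability into two masks, mentioned above, is commonly checked by testing whether the conflict graph of features that violate same-mask spacing is two-colorable; the sketch below illustrates that generic check and is not the authors' decomposition flow.

      from collections import deque

      def two_color(conflict_graph):
          """Assign each feature to mask 0 or 1 if the conflict graph is bipartite;
          return None if an odd conflict cycle makes the layout non-decomposable."""
          color = {}
          for start in conflict_graph:
              if start in color:
                  continue
              color[start] = 0
              queue = deque([start])
              while queue:
                  u = queue.popleft()
                  for v in conflict_graph.get(u, ()):
                      if v not in color:
                          color[v] = 1 - color[u]
                          queue.append(v)
                      elif color[v] == color[u]:
                          return None
          return color

      # Hypothetical conflict graph: features linked if they are too close for one mask.
      print(two_color({"A": ["B"], "B": ["A", "C"], "C": ["B"]}))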

  16. The Study of the Relationship between Probabilistic Design and Axiomatic Design Methodology. Volume 3

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chin-Yere; Onyebueke, Landon

    1996-01-01

    Structural failure is rarely a "sudden death" type of event; such sudden failures may occur only under abnormal loadings like bomb or gas explosions and very strong earthquakes. In most cases, structures fail due to damage accumulated under normal loadings such as wind loads, dead and live loads. The consequence of cumulative damage will affect the reliability of surviving components and finally cause collapse of the system. The cumulative damage effects on system reliability under time-invariant loadings are of practical interest in structural design and therefore will be investigated in this study. The scope of this study is, however, restricted to the consideration of damage accumulation as the increase in the number of failed components due to the violation of their strength limits.

  17. Proposal of a methodology for the design of offshore wind farms

    NASA Astrophysics Data System (ADS)

    Esteban, Dolores; Diez, J. Javier; Santos Lopez, J.; Negro, Vicente

    2010-05-01

    Wind power installed at sea is still very scarce, with only 1,500 megawatts in operation in the middle of 2009. Although the first offshore wind farm experiment took place in 1990, the facilities built up to now have been mainly pilot projects. These statements confirm the incipient state of offshore wind power. Nevertheless, this technology is currently being strongly pushed, especially by the governments of some countries - like the United Kingdom, Germany, etc. - which is due above all to the general commitments made to reduce the emission of greenhouse gases. All of these factors point to a promising future for offshore wind power. However, a general methodology for the design and management of this kind of installation has not yet been established. This paper includes some of the results of a research project, which consists of the elaboration of a methodology to enable the optimization of the global process of the operations leading to the implantation of offshore wind facilities. The proposed methodology allows the planning of offshore wind projects according to an integral management policy, enabling not only the technical and financial feasibility of the offshore wind project to be achieved, but also respect for the environment. For that, it has been necessary to take into account multiple factors, including the territory, the terrain, the physical-chemical properties of the contact area between the atmosphere and the ocean, the dynamics resulting in both as a consequence of the Earth's behaviour as a heat machine, external geodynamics, internal geodynamics, planetary dynamics, biocenosis, the legislative and financial framework, human activities, wind turbines, met masts, electric substations and lines, foundations, logistics and the project's financial profitability. For its validation, this methodology has been applied to different offshore wind farms in operation.

  18. Human factors analysis and design methods for nuclear waste retrieval systems. Human factors design methodology and integration plan

    SciTech Connect

    Casey, S.M.

    1980-06-01

    The purpose of this document is to provide an overview of the recommended activities and methods to be employed by a team of human factors engineers during the development of a nuclear waste retrieval system. This system, as it is presently conceptualized, is intended to be used for the removal of storage canisters (each canister containing a spent fuel rod assembly) located in an underground salt bed depository. This document, and the others in this series, have been developed for the purpose of implementing human factors engineering principles during the design and construction of the retrieval system facilities and equipment. The methodology presented has been structured around a basic systems development effort involving preliminary development, equipment development, personnel subsystem development, and operational test and evaluation. Within each of these phases, the recommended activities of the human engineering team have been stated, along with descriptions of the human factors engineering design techniques applicable to the specific design issues. Explicit examples of how the techniques might be used in the analysis of human tasks and equipment required in the removal of spent fuel canisters have been provided. Only those techniques having possible relevance to the design of the waste retrieval system have been reviewed. This document is intended to provide the framework for integrating human engineering with the rest of the system development effort. The activities and methodologies reviewed in this document have been discussed in the general order in which they will occur, although the time frame (the total duration of the development program in years and months) in which they should be performed has not been discussed.

  19. Combining qualitative and quantitative research within mixed method research designs: a methodological review.

    PubMed

    Östlund, Ulrika; Kidd, Lisa; Wengström, Yvonne; Rowa-Dewar, Neneh

    2011-03-01

    It has been argued that mixed methods research can be useful in nursing and health science because of the complexity of the phenomena studied. However, the integration of qualitative and quantitative approaches continues to be a topic of much debate and there is a need for a rigorous framework for designing and interpreting mixed methods research. This paper explores the analytical approaches (i.e. parallel, concurrent or sequential) used in mixed methods studies within healthcare and exemplifies the use of triangulation as a methodological metaphor for drawing inferences from qualitative and quantitative findings originating from such analyses. This review of the literature used systematic principles in searching CINAHL, Medline and PsycINFO for healthcare research studies which employed a mixed methods approach and were published in the English language between January 1999 and September 2009. In total, 168 studies were included in the results. Most studies originated in the United States of America (USA), the United Kingdom (UK) and Canada. The analytic approach most widely used was parallel data analysis. A number of studies used sequential data analysis; far fewer studies employed concurrent data analysis. Very few of these studies clearly articulated the purpose for using a mixed methods design. The use of the methodological metaphor of triangulation on convergent, complementary, and divergent results from mixed methods studies is exemplified and an example of developing theory from such data is provided. A trend for conducting parallel data analysis on quantitative and qualitative data in mixed methods healthcare research has been identified in the studies included in this review. Using triangulation as a methodological metaphor can facilitate the integration of qualitative and quantitative findings and help researchers to clarify their theoretical propositions and the basis of their results. This can offer a better understanding of the links between theory and

  20. Application of Adjoint Methodology in Various Aspects of Sonic Boom Design

    NASA Technical Reports Server (NTRS)

    Rallabhandi, Sriram K.

    2014-01-01

    One of the advances in computational design has been the development of adjoint methods allowing efficient calculation of sensitivities in gradient-based shape optimization. This paper discusses two new applications of adjoint methodology that have been developed to aid in sonic boom mitigation exercises. In the first, equivalent area targets are generated using adjoint sensitivities of selected boom metrics. These targets may then be used to drive the vehicle shape during optimization. The second application is the computation of adjoint sensitivities of boom metrics on the ground with respect to parameters such as flight conditions, propagation sampling rate, and selected inputs to the propagation algorithms. These sensitivities enable the designer to make more informed selections of flight conditions at which the chosen cost functionals are less sensitive.
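
    For context, the discrete adjoint machinery referred to above yields sensitivities of a boom metric without one flow solution per design parameter; in generic form (a standard statement, not NASA's specific implementation), with flow residuals R(U, p) = 0 and cost functional J(U, p):

        \left(\frac{\partial R}{\partial U}\right)^{T} \lambda = \left(\frac{\partial J}{\partial U}\right)^{T},
        \qquad
        \frac{dJ}{dp} = \frac{\partial J}{\partial p} - \lambda^{T} \frac{\partial R}{\partial p},

    so a single adjoint solve for \lambda provides the gradient of J with respect to all parameters p at once.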

  1. Optimization of EGFR high positive cell isolation procedure by design of experiments methodology.

    PubMed

    Levi, Ofer; Tal, Baruch; Hileli, Sagi; Shapira, Assaf; Benhar, Itai; Grabov, Pavel; Eliaz, Noam

    2015-01-01

    Circulating tumor cells (CTCs) in blood circulation may play a role in monitoring and even in early detection of metastasis patients. Due to the limited presence of CTCs in blood circulation, viable CTCs isolation technology must supply a very high recovery rate. Here, we implement design of experiments (DOE) methodology in order to optimize the Bio-Ferrography (BF) immunomagnetic isolation (IMI) procedure for the EGFR high positive CTCs application. All consequent DOE phases such as screening design, optimization experiments and validation experiments were used. A significant recovery rate of more than 95% was achieved while isolating 100 EGFR high positive CTCs from 1 mL human whole blood. The recovery achievement in this research positions BF technology as one of the most efficient IMI technologies, which is ready to be challenged with patients' blood samples. © 2015 International Clinical Cytometry Society.
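
    Screening designs of the kind described above typically start from a two-level factorial over the candidate process factors; the sketch below enumerates such a design generically, with placeholder factor names rather than the actual Bio-Ferrography parameters.

      from itertools import product

      # Generic two-level full-factorial screening design; factor names and coded
      # levels (-1/+1) are placeholders, not the optimized BF process parameters.
      factors = {
          "antibody_conc": (-1, +1),
          "incubation_min": (-1, +1),
          "bead_ratio": (-1, +1),
      }
      runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
      for i, run in enumerate(runs, 1):
          print(f"run {i}: {run}")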

  2. Piloted Evaluation of an Integrated Methodology for Propulsion and Airframe Control Design

    NASA Technical Reports Server (NTRS)

    Bright, Michelle M.; Simon, Donald L.; Garg, Sanjay; Mattern, Duane L.; Ranaudo, Richard J.; Odonoghue, Dennis P.

    1994-01-01

    An integrated methodology for propulsion and airframe control has been developed and evaluated for a Short Take-Off Vertical Landing (STOVL) aircraft using a fixed base flight simulator at NASA Lewis Research Center. For this evaluation the flight simulator is configured for transition flight using a STOVL aircraft model, a full nonlinear turbofan engine model, simulated cockpit and displays, and pilot effectors. The paper provides a brief description of the simulation models, the flight simulation environment, the displays and symbology, the integrated control design, and the piloted tasks used for control design evaluation. In the simulation, the pilots successfully completed typical transition phase tasks such as combined constant deceleration with flight path tracking, and constant acceleration wave-off maneuvers. The pilot comments of the integrated system performance and the display symbology are discussed and analyzed to identify potential areas of improvement.

  3. Application of Adjoint Methodology to Supersonic Aircraft Design Using Reversed Equivalent Areas

    NASA Technical Reports Server (NTRS)

    Rallabhandi, Sriram K.

    2013-01-01

    This paper presents an approach to shape an aircraft to equivalent area based objectives using the discrete adjoint approach. Equivalent areas can be obtained either using reversed augmented Burgers equation or direct conversion of off-body pressures into equivalent area. Formal coupling with CFD allows computation of sensitivities of equivalent area objectives with respect to aircraft shape parameters. The exactness of the adjoint sensitivities is verified against derivatives obtained using the complex step approach. This methodology has the benefit of using designer-friendly equivalent areas in the shape design of low-boom aircraft. Shape optimization results with equivalent area cost functionals are discussed and further refined using ground loudness based objectives.
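
    The complex-step check mentioned above is a generic verification technique; a minimal sketch (not the project's code) is shown below.

      import numpy as np

      def complex_step_derivative(f, x, h=1e-30):
          """Derivative of a real-analytic f at x via f'(x) ~ Im[f(x + i*h)] / h,
          which avoids the subtractive cancellation of finite differences."""
          return np.imag(f(x + 1j * h)) / h

      # Verify against the analytic derivative of a smooth test function.
      f = lambda x: np.exp(x) * np.sin(x)
      x0 = 0.7
      print(complex_step_derivative(f, x0), np.exp(x0) * (np.sin(x0) + np.cos(x0)))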

  4. A hybrid design methodology for structuring an Integrated Environmental Management System (IEMS) for shipping business.

    PubMed

    Celik, Metin

    2009-03-01

    The International Safety Management (ISM) Code defines a broad framework for the safe management and operation of merchant ships, maintaining high standards of safety and environmental protection. On the other hand, ISO 14001:2004 provides a generic, worldwide environmental management standard that has been utilized by several industries. Both the ISM Code and ISO 14001:2004 have the practical goal of establishing a sustainable Integrated Environmental Management System (IEMS) for shipping businesses. This paper presents a hybrid design methodology that shows how requirements from both standards can be combined into a single execution scheme. Specifically, the Analytic Hierarchy Process (AHP) and Fuzzy Axiomatic Design (FAD) are used to structure an IEMS for ship management companies. This research provides decision aid to maritime executives in order to enhance the environmental performance in the shipping industry.
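
    As background for the AHP step named above, criterion weights are conventionally taken from the principal eigenvector of a reciprocal pairwise-comparison matrix; the sketch below shows that generic computation with placeholder judgments, not the paper's actual criteria.

      import numpy as np

      def ahp_weights(pairwise):
          """Priority weights from a reciprocal pairwise-comparison matrix
          via its principal eigenvector (standard AHP practice)."""
          A = np.asarray(pairwise, dtype=float)
          eigvals, eigvecs = np.linalg.eig(A)
          k = np.argmax(eigvals.real)
          w = np.abs(eigvecs[:, k].real)
          return w / w.sum()

      # Hypothetical 3-criterion judgment matrix (values are placeholders).
      A = [[1, 3, 5],
           [1 / 3, 1, 2],
           [1 / 5, 1 / 2, 1]]
      print(ahp_weights(A))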

  5. Assessment of current structural design methodology for high-temperature reactors based on failure tests

    SciTech Connect

    Corum, J.M.; Sartory, W.K.

    1985-01-01

    A mature design methodology, consisting of inelastic analysis methods, provided in Department of Energy guidelines, and failure criteria, contained in ASME Code Case N-47, exists in the United States for high-temperature reactor components. The objective of this paper is to assess the adequacy of this overall methodology by comparing predicted inelastic deformations and lifetimes with observed results from structural failure tests and from an actual service failure. Comparisons are presented for three types of structural situations: (1) nozzle-to-spherical shell specimens, where stresses at structural discontinuities lead to cracking, (2) welded structures, where metallurgical discontinuities play a key role in failures, and (3) thermal shock loadings of cylinders and pipes, where thermal discontinuities can lead to failure. The comparisons between predicted and measured inelastic responses are generally reasonably good; quantities are sometimes somewhat overpredicted and sometimes underpredicted. However, even seemingly small discrepancies can have a significant effect on structural life, and lifetimes are not always as closely predicted. For a few cases, the lifetimes are substantially overpredicted, which raises questions regarding the adequacy of existing design margins.

  6. A Modulator Design Methodology Minimizing Power Dissipation in a Quantum Well Modulator-Based Optical Interconnect

    NASA Astrophysics Data System (ADS)

    Cho, Hoyeol; Kapur, Pawan; Saraswat, Krishna C.

    2007-06-01

    There is a strong need for a methodology that minimizes total power, which inherently includes device design, for short-distance optical link applications (chip-to-chip or board-to-board communications). We present such a power optimization methodology for a modulator-based optical link, where we do a full 3-D modulator parameter optimization, keeping the power of the entire link in mind. We find for low bit rates (10 Gb/s) that the optimum operational voltage for the modulator was within the supply voltage at the 65-nm technology node. At higher bit rates, the optimum voltage is found to increase and go beyond the stipulated supply voltage. In such a case, a suboptimum operation at the supply voltage incurs a 46% power penalty at 25 Gb/s. Having obtained the optimum modulator design and operation parameters and the corresponding total link power dissipation, we examine the impact of device and system parameters on the optimization. We find that a smaller device capacitance is an efficient solution to push the optimum swing voltage to be within the supply voltage. This is feasible using a monolithically integrated Ge-based complementary metal-oxide-semiconductor-compatible modulator and metal-semiconductor-metal photodetectors.

  7. Methodology for Sensitivity Analysis, Approximate Analysis, and Design Optimization in CFD for Multidisciplinary Applications

    NASA Technical Reports Server (NTRS)

    Taylor, Arthur C., III; Hou, Gene W.

    1996-01-01

    An incremental iterative formulation, together with the well-known spatially split approximate-factorization algorithm, is presented for solving the large, sparse systems of linear equations that are associated with aerodynamic sensitivity analysis. This formulation is also known as the 'delta' or 'correction' form. For the smaller two-dimensional problems, a direct method can be applied to solve these linear equations in either the standard or the incremental form, in which case the two are equivalent. However, iterative methods are needed for larger two-dimensional and three-dimensional applications because direct methods require more computer memory than is currently available. Iterative methods for solving these equations in the standard form are generally unsatisfactory due to an ill-conditioned coefficient matrix; this problem is overcome when these equations are cast in the incremental form. The methodology is successfully implemented and tested using an upwind cell-centered finite-volume formulation applied in two dimensions to the thin-layer Navier-Stokes equations for external flow over an airfoil. In three dimensions this methodology is demonstrated with a marching-solution algorithm for the Euler equations to calculate supersonic flow over the High-Speed Civil Transport configuration (HSCT 24E). The sensitivity derivatives obtained with the incremental iterative method from a marching Euler code are used in a design-improvement study of the HSCT configuration that involves thickness, camber, and planform design variables.
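
    For clarity, the 'delta' or 'correction' form referred to above drives an iterative solver with the exact residual while allowing an approximate operator on the left-hand side; in generic notation (not the paper's exact formulation), instead of solving A x = b directly one iterates

        \tilde{A}\,\Delta x^{n} = b - A x^{n}, \qquad x^{n+1} = x^{n} + \Delta x^{n},

    where \tilde{A} may be an approximate factorization of A; at convergence \Delta x^{n} \to 0, so the result still satisfies A x = b exactly.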

  8. Integrated active and passive control design methodology for the LaRC CSI evolutionary model

    NASA Technical Reports Server (NTRS)

    Voth, Christopher T.; Richards, Kenneth E., Jr.; Schmitz, Eric; Gehling, Russel N.; Morgenthaler, Daniel R.

    1994-01-01

    A general design methodology to integrate active control with passive damping was demonstrated on the NASA LaRC CSI Evolutionary Model (CEM), a ground testbed for future large, flexible spacecraft. Vibration suppression controllers designed for Line-of Sight (LOS) minimization were successfully implemented on the CEM. A frequency-shaped H2 methodology was developed, allowing the designer to specify the roll-off of the MIMO compensator. A closed loop bandwidth of 4 Hz, including the six rigid body modes and the first three dominant elastic modes of the CEM was achieved. Good agreement was demonstrated between experimental data and analytical predictions for the closed loop frequency response and random tests. Using the Modal Strain Energy (MSE) method, a passive damping treatment consisting of 60 viscoelastically damped struts was designed, fabricated and implemented on the CEM. Damping levels for the targeted modes were more than an order of magnitude larger than for the undamped structure. Using measured loss and stiffness data for the individual damped struts, analytical predictions of the damping levels were very close to the experimental values in the (1-10) Hz frequency range where the open loop model matched the experimental data. An integrated active/passive controller was successfully implemented on the CEM and was evaluated against an active-only controller. A two-fold increase in the effective control bandwidth and further reductions of 30 percent to 50 percent in the LOS RMS outputs were achieved compared to an active-only controller. Superior performance was also obtained compared to a High-Authority/Low-Authority (HAC/LAC) controller.
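
    The Modal Strain Energy method cited above estimates modal damping from the fraction of strain energy carried by the viscoelastic material; its usual first-order form (a textbook relation, not the CEM-specific model) is

        \eta_i \approx \eta_v \, \frac{U_{v,i}}{U_i}, \qquad \zeta_i \approx \frac{\eta_i}{2},

    where \eta_i is the loss factor of mode i, \eta_v the material loss factor of the viscoelastic, U_{v,i} the modal strain energy stored in the viscoelastic elements, and U_i the total modal strain energy.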

  9. Integrated active and passive control design methodology for the LaRC CSI evolutionary model

    NASA Astrophysics Data System (ADS)

    Voth, Christopher T.; Richards, Kenneth E., Jr.; Schmitz, Eric; Gehling, Russel N.; Morgenthaler, Daniel R.

    1994-04-01

    A general design methodology to integrate active control with passive damping was demonstrated on the NASA LaRC CSI Evolutionary Model (CEM), a ground testbed for future large, flexible spacecraft. Vibration suppression controllers designed for Line-of Sight (LOS) minimization were successfully implemented on the CEM. A frequency-shaped H2 methodology was developed, allowing the designer to specify the roll-off of the MIMO compensator. A closed loop bandwidth of 4 Hz, including the six rigid body modes and the first three dominant elastic modes of the CEM was achieved. Good agreement was demonstrated between experimental data and analytical predictions for the closed loop frequency response and random tests. Using the Modal Strain Energy (MSE) method, a passive damping treatment consisting of 60 viscoelastically damped struts was designed, fabricated and implemented on the CEM. Damping levels for the targeted modes were more than an order of magnitude larger than for the undamped structure. Using measured loss and stiffness data for the individual damped struts, analytical predictions of the damping levels were very close to the experimental values in the (1-10) Hz frequency range where the open loop model matched the experimental data. An integrated active/passive controller was successfully implemented on the CEM and was evaluated against an active-only controller. A two-fold increase in the effective control bandwidth and further reductions of 30 percent to 50 percent in the LOS RMS outputs were achieved compared to an active-only controller. Superior performance was also obtained compared to a High-Authority/Low-Authority (HAC/LAC) controller.

  10. Modeling and Design Analysis Methodology for Tailoring of Aircraft Structures with Composites

    NASA Technical Reports Server (NTRS)

    Rehfield, Lawrence W.

    2004-01-01

    Composite materials provide design flexibility in that fiber placement and orientation can be specified and a variety of material forms and manufacturing processes are available. It is possible, therefore, to 'tailor' the structure to a high degree in order to meet specific design requirements in an optimum manner. Common industrial practices, however, have limited the choices designers make. One of the reasons for this is that there is a dearth of conceptual/preliminary design analysis tools specifically devoted to identifying structural concepts for composite airframe structures. Large-scale finite element simulations are not suitable for such purposes. The present project has been devoted to creating modeling and design analysis methodology for use in the tailoring process of aircraft structures. Emphasis has been given to creating bend-twist elastic coupling in high aspect ratio wings or other lifting surfaces. The direction of our work was in concert with the overall NASA effort Twenty-First Century Aircraft Technology (TCAT). A multi-disciplinary team was assembled by Dr. Damodar Ambur to work on wing technology, which included our project.
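
    The bend-twist elastic coupling referred to above is usually expressed through an off-diagonal term in the beam cross-section stiffness (a generic form, not the project's specific model):

        \begin{pmatrix} M \\ T \end{pmatrix} =
        \begin{bmatrix} EI & K \\ K & GJ \end{bmatrix}
        \begin{pmatrix} \kappa \\ \phi' \end{pmatrix},

    where a nonzero coupling stiffness K, obtainable through off-axis fiber placement, ties bending curvature \kappa to twist rate \phi'; K = 0 recovers the uncoupled beam.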

  11. A robust rotorcraft flight control system design methodology utilizing quantitative feedback theory

    NASA Technical Reports Server (NTRS)

    Gorder, Peter James

    1993-01-01

    Rotorcraft flight control systems present design challenges which often exceed those associated with fixed-wing aircraft. First, large variations in the response characteristics of the rotorcraft result from the wide range of airspeeds of typical operation (hover to over 100 kts). Second, the assumption of vehicle rigidity often employed in the design of fixed-wing flight control systems is rarely justified in rotorcraft where rotor degrees of freedom can have a significant impact on the system performance and stability. This research was intended to develop a methodology for the design of robust rotorcraft flight control systems. Quantitative Feedback Theory (QFT) was chosen as the basis for the investigation. Quantitative Feedback Theory is a technique which accounts for variability in the dynamic response of the controlled element in the design of robust control systems. It was developed to address a Multiple-Input Single-Output (MISO) design problem, and utilizes two degrees of freedom to satisfy the design criteria. Two techniques were examined for extending the QFT MISO technique to the design of a Multiple-Input-Multiple-Output (MIMO) flight control system (FCS) for a UH-60 Black Hawk Helicopter. In the first, a set of MISO systems, mathematically equivalent to the MIMO system, was determined. QFT was applied to each member of the set simultaneously. In the second, the same set of equivalent MISO systems was analyzed sequentially, with closed loop response information from each loop utilized in subsequent MISO designs. The results of each technique were compared, and the advantages of the second, termed Sequential Loop Closure, were clearly evident.
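
    The two degrees of freedom mentioned above are, in standard QFT practice, a loop compensator and a prefilter; a generic statement of the structure (not the UH-60 design itself) is

        T(s) = F(s)\,\frac{P(s)G(s)}{1 + P(s)G(s)},

    where the compensator G is shaped so that, over the whole set of plant cases P, the spread of |PG/(1+PG)| at each frequency stays within the allowed tracking tolerance, and the prefilter F then places the resulting envelope inside the specified tracking bounds.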

  12. Integrated controls-structures design methodology development for a class of flexible spacecraft

    NASA Technical Reports Server (NTRS)

    Maghami, P. G.; Joshi, S. M.; Walz, J. E.; Armstrong, E. S.

    1990-01-01

    Future utilization of space will require large space structures in low-Earth and geostationary orbits. Example missions include: Earth observation systems, personal communication systems, space science missions, space processing facilities, etc., requiring large antennas, platforms, and solar arrays. The dimensions of such structures will range from a few meters to possibly hundreds of meters. For reducing the cost of construction, launching, and operating (e.g., energy required for reboosting and control), it will be necessary to make the structure as light as possible. However, reducing structural mass tends to increase the flexibility which would make it more difficult to control with the specified precision in attitude and shape. Therefore, there is a need to develop a methodology for designing space structures which are optimal with respect to both structural design and control design. In the current spacecraft design practice, it is customary to first perform the structural design and then the controller design. However, the structural design and the control design problems are substantially coupled and must be considered concurrently in order to obtain a truly optimal spacecraft design. For example, let C denote the set of the 'control' design variables (e.g., controller gains), and L the set of the 'structural' design variables (e.g., member sizes). If a structural member thickness is changed, the dynamics would change which would then change the control law and the actuator mass. That would, in turn, change the structural model. Thus, the sets C and L depend on each other. Future space structures can be roughly divided into four mission classes. Class 1 missions include flexible spacecraft with no articulated appendages which require fine attitude pointing and vibration suppression (e.g., large space antennas). Class 2 missions consist of flexible spacecraft with articulated multiple payloads, where the requirement is to fine-point the spacecraft and each

  13. Integrated controls-structures design methodology development for a class of flexible spacecraft

    NASA Technical Reports Server (NTRS)

    Maghami, P. G.; Joshi, S. M.; Walz, J. E.; Armstrong, E. S.

    1990-01-01

    Future utilization of space will require large space structures in low-Earth and geostationary orbits. Example missions include: Earth observation systems, personal communication systems, space science missions, space processing facilities, etc., requiring large antennas, platforms, and solar arrays. The dimensions of such structures will range from a few meters to possibly hundreds of meters. For reducing the cost of construction, launching, and operating (e.g., energy required for reboosting and control), it will be necessary to make the structure as light as possible. However, reducing structural mass tends to increase the flexibility which would make it more difficult to control with the specified precision in attitude and shape. Therefore, there is a need to develop a methodology for designing space structures which are optimal with respect to both structural design and control design. In the current spacecraft design practice, it is customary to first perform the structural design and then the controller design. However, the structural design and the control design problems are substantially coupled and must be considered concurrently in order to obtain a truly optimal spacecraft design. For example, let C denote the set of the 'control' design variables (e.g., controller gains), and L the set of the 'structural' design variables (e.g., member sizes). If a structural member thickness is changed, the dynamics would change which would then change the control law and the actuator mass. That would, in turn, change the structural model. Thus, the sets C and L depend on each other. Future space structures can be roughly divided into four mission classes. Class 1 missions include flexible spacecraft with no articulated appendages which require fine attitude pointing and vibration suppression (e.g., large space antennas). Class 2 missions consist of flexible spacecraft with articulated multiple payloads, where the requirement is to fine-point the spacecraft and each

  14. Formal Learning Sequences and Progression in the Studio: A Framework for Digital Design Education

    ERIC Educational Resources Information Center

    Wärnestål, Pontus

    2016-01-01

    This paper examines how to leverage the design studio learning environment throughout long-term Digital Design education in order to support students to progress from tactical, well-defined, device-centric routine design, to confidently design sustainable solutions for strategic, complex, problems for a wide range of devices and platforms in the…

  15. Formal Learning Sequences and Progression in the Studio: A Framework for Digital Design Education

    ERIC Educational Resources Information Center

    Wärnestål, Pontus

    2016-01-01

    This paper examines how to leverage the design studio learning environment throughout long-term Digital Design education in order to support students to progress from tactical, well-defined, device-centric routine design, to confidently design sustainable solutions for strategic, complex, problems for a wide range of devices and platforms in the…

  16. Experimental Validation of an Electromagnet Thermal Design Methodology for Magnetized Dusty Plasma Research

    NASA Astrophysics Data System (ADS)

    Birmingham, W. J.; Bates, E. M.; Romero-Talamás, C. A.; Rivera, W. F.

    2016-10-01

    An analytic thermal design method developed to aid in the engineering design of Bitter-type magnets, as well as finite element calculations of heat transfer, are compared against experimental measurements of temperature evolution in a prototype magnet designed to operate continuously at 1 T fields while dissipating 9 kW of heat. The analytic thermal design method is used to explore a variety of configurations of cooling holes in the Bitter plates, including their geometry and radial placement. The prototype has diagnostic ports that can accommodate thermocouples, pressure sensors, and optical access to measure the water flow. We present temperature and pressure sensor data from the prototype compared to the analytic thermal model and finite element calculations. The data is being used to guide the design of a 10 T Bitter magnet capable of sustained fields of up to 10 T for at least 10 seconds, which will be used in dusty plasma experiments at the University of Maryland Baltimore County. Preliminary design plans and progress towards the construction of the 10 T electromagnet are also presented.
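
    As a rough sanity check on the stated 9 kW steady heat load (the flow rate below is a hypothetical value, not one reported by the authors), the bulk temperature rise of the cooling water follows from a simple energy balance:

        \Delta T = \frac{P}{\dot{m} c_p} \approx \frac{9000\ \mathrm{W}}{(0.5\ \mathrm{kg/s})(4186\ \mathrm{J\,kg^{-1}\,K^{-1}})} \approx 4.3\ \mathrm{K}.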

  17. Development of a design methodology for pipelines in ice scoured seabeds

    SciTech Connect

    Clark, J.I.; Paulin, M.J.; Lach, P.R.; Yang, Q.S.; Poorooshasb, H.

    1994-12-31

    Large areas of the continental shelf of northern oceans are frequently scoured or gouged by moving bodies of ice such as icebergs and sea ice keels associated with pressure ridges. This phenomenon presents a formidable challenge when the route of a submarine pipeline is intersected by the scouring ice. It is generally acknowledged that if a pipeline, laid on the seabed, were hit by an iceberg or a pressure ridge keel, the forces imposed on the pipeline would be much greater than it could practically withstand. The pipeline must therefore be buried to avoid direct contact with ice, but it is very important to determine with some assurance the minimum depth required for safety for both economic and environmental reasons. The safe burial depth of a pipeline, however, cannot be determined directly from the relatively straightforward measurement of maximum scour depth. The major design consideration is the determination of the potential sub-scour deformation of the ice scoured soil. Forces transmitted through the soil and soil displacement around the pipeline could load the pipeline to failure if not taken into account in the design. If the designer can predict the forces transmitted through the soil, the pipeline can be designed to withstand these external forces using conventional design practice. In this paper, the authors outline a design methodology that is based on phenomenological studies of ice scoured terrain, both modern and relict, laboratory tests, centrifuge modeling, and numerical analysis. The implications of these studies, which could assist in the safe and economical design of pipelines in ice scoured terrain, will also be discussed.

  18. Assessment of biodistribution using mesenchymal stromal cells: Algorithm for study design and challenges in detection methodologies.

    PubMed

    Reyes, Blanca; Coca, Maria Isabel; Codinach, Margarita; López-Lucas, María Dolores; Del Mazo-Barbara, Anna; Caminal, Marta; Oliver-Vila, Irene; Cabañas, Valentín; Lope-Piedrafita, Silvia; García-López, Joan; Moraleda, José M; Fontecha, Cesar G; Vives, Joaquim

    2017-09-01

    Biodistribution of candidate cell-based therapeutics is a critical safety concern that must be addressed in the preclinical development program. We aimed to design a decision tree based on a series of studies included in actual dossiers approved by competent regulatory authorities, noting that the design, execution and interpretation of pharmacokinetics studies using this type of therapy is not straightforward and presents a challenge for both developers and regulators. Eight studies were evaluated for the definition of a decision tree, in which mesenchymal stromal cells (MSCs) were administered to mouse, rat and sheep models using diverse routes (local or systemic), cell labeling (chemical or genetic) and detection methodologies (polymerase chain reaction [PCR], immunohistochemistry [IHC], fluorescence bioimaging, and magnetic resonance imaging [MRI]). Moreover, labeling and detection methodologies were compared in terms of cost, throughput, speed, sensitivity and specificity. A decision tree was defined based on the model chosen: (i) small immunodeficient animals receiving heterologous MSC products for assessing biodistribution and other safety aspects and (ii) large animals receiving homologous labeled products; this contributed to gathering data not only on biodistribution but also on pharmacodynamics. PCR emerged as the most convenient technique despite the loss of spatial information on cell distribution that can be further assessed by IHC. This work contributes to the standardization in the design of biodistribution studies by improving methods for accurate assessment of safety. The evaluation of different animal models and screening of target organs through a combination of techniques is a cost-effective and timely strategy. Copyright © 2017 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.

  19. A Test Methodology for Determining Space-Readiness of Xilinx SRAM-Based FPGA Designs

    SciTech Connect

    Quinn, Heather M; Graham, Paul S; Morgan, Keith S; Caffrey, Michael P

    2008-01-01

    Using reconfigurable, static random-access memory (SRAM) based field-programmable gate arrays (FPGAs) for space-based computation has been an exciting area of research for the past decade. Since both the circuit and the circuit's state are stored in radiation-sensitive memory, both could be altered by the harsh space radiation environment. Both the circuit and the circuit's state can be protected by triple-modular redundancy (TMR), but applying TMR to FPGA user designs is often an error-prone process. Faulty application of TMR could cause the FPGA user circuit to output incorrect data. This paper will describe a three-tiered methodology for testing FPGA user designs for space-readiness. We will describe the standard approach to testing FPGA user designs using a particle accelerator, as well as two methods using fault injection and a modeling tool. While accelerator testing is the current 'gold standard' for pre-launch testing, we believe the use of fault injection and modeling tools allows for easy, cheap and uniform access for discovering errors early in the design process.
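
    To make the TMR idea above concrete, a majority voter returns the value agreed on by at least two of the three redundant copies; the sketch below is a generic illustration, not the Xilinx tool flow, and includes a trivial fault-injection check of the voter.

      def majority(a, b, c):
          """Bitwise majority vote of three redundant results (TMR voter)."""
          return (a & b) | (b & c) | (a & c)

      # Trivial fault-injection check: upset one copy and confirm the vote is unaffected.
      golden = 0b1011
      for bit in range(4):
          faulty = golden ^ (1 << bit)  # single-event upset in one redundant copy
          assert majority(golden, golden, faulty) == golden
      print("single upsets in one copy are masked by the voter")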

  20. Development of a decision-making methodology to design a water quality monitoring network.

    PubMed

    Keum, Jongho; Kaluarachchi, Jagath J

    2015-07-01

    The number of water quality monitoring stations in the USA has decreased over the past few decades. Scarcity of observations can easily produce prediction uncertainty due to unreliable model calibration. An effective water quality monitoring network is important not only for model calibration and water quality prediction but also for resources management. Redundant or improperly located monitoring stations may cause increased monitoring costs without improvement to the understanding of water quality in watersheds. In this work, a decision-making methodology is proposed to design a water quality monitoring network by providing an adequate number of monitoring stations and their approximate locations at the eight-digit hydrologic unit codes (HUC8) scale. The proposed methodology is demonstrated for an example at the Upper Colorado River Basin (UCRB), where salinity is a serious concern. The level of monitoring redundancy or scarcity is defined by an index, station ratio (SR), which represents a monitoring density based on water quality load originated within a subbasin. By comparing the number of stations from a selected target SR with the available number of stations including the actual and the potential stations, the suggested number of stations in each subbasin was decided. If monitoring stations are primarily located in the low salinity loading subbasins, the average actual SR tends to increase, and vice versa. Results indicate that the spatial distribution of monitoring locations in 2011 is concentrated on low salinity loading subbasins, and therefore, additional monitoring is required for the high salinity loading subbasins. The proposed methodology shows that the SR is a simple and a practical indicator for monitoring density.
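
    The abstract does not give the exact formula for the station ratio (SR), only that it represents monitoring density normalized by the water quality load generated within a subbasin; the sketch below is a purely illustrative reading of that idea, with hypothetical numbers and units, and should not be taken as the authors' definition.

      # Illustrative only: treat SR as stations per unit salinity load in a subbasin
      # and flag subbasins falling short of a target SR. Values are hypothetical.
      subbasins = {
          "A": {"stations": 4, "salinity_load_kt_per_yr": 20.0},
          "B": {"stations": 1, "salinity_load_kt_per_yr": 35.0},
      }
      target_sr = 0.10  # hypothetical target: stations per (kt/yr) of load

      for name, d in subbasins.items():
          sr = d["stations"] / d["salinity_load_kt_per_yr"]
          needed = round(target_sr * d["salinity_load_kt_per_yr"])
          extra = max(0, needed - d["stations"])
          print(f"{name}: SR = {sr:.3f}, additional stations suggested: {extra}")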

  1. A probabilistic methodology for radar cross section prediction in conceptual aircraft design

    NASA Astrophysics Data System (ADS)

    Hines, Nathan Robert

    System effectiveness has increasingly become the prime metric for the evaluation of military aircraft. As such, it is the decision maker's/designer's goal to maximize system effectiveness. Industry and government research documents indicate that all future military aircraft will incorporate signature reduction as an attempt to improve system effectiveness and reduce the cost of attrition. Today's operating environments demand low observable aircraft which are able to reliably take out valuable, time critical targets. Thus it is desirable to be able to design vehicles that are balanced for increased effectiveness. Previous studies have shown that shaping of the vehicle is one of the most important contributors to radar cross section, a measure of radar signature, and must be considered from the very beginning of the design process. Radar cross section estimation should be incorporated into conceptual design to develop more capable systems. This research strives to meet these needs by developing a conceptual design tool that predicts radar cross section for parametric geometries. This tool predicts the absolute radar cross section of the vehicle as well as the impact of geometry changes, allowing for the simultaneous tradeoff of the aerodynamic, performance, and cost characteristics of the vehicle with the radar cross section. Furthermore, this tool can be linked to a campaign theater analysis code to demonstrate the changes in system and system of system effectiveness due to changes in aircraft geometry. A general methodology was developed and implemented and sample computer codes applied to prototype the proposed process. Studies utilizing this radar cross section tool were subsequently performed to demonstrate the capabilities of this method and show the impact that various inputs have on the outputs of these models. The F/A-18 aircraft configuration was chosen as a case study vehicle to perform a design space exercise and to investigate the relative impact of

  2. Improving Clinical Trial Participant Tracking Tools Using Knowledge-Anchored Design Methodologies

    PubMed Central

    Payne, P.R.O.; Embi, P.J.; Johnson, S.B.; Mendonca, E.; Starren, J.

    2010-01-01

    Objective Rigorous human-computer interaction (HCI) design methodologies have not traditionally been applied to the development of clinical trial participant tracking (CTPT) tools. Given the frequent use of iconic HCI models in CTPTs, and prior evidence of usability problems associated with the use of ambiguous icons in complex interfaces, such approaches may be problematic. Presentation Discovery (PD), a knowledge-anchored HCI design method, has been previously demonstrated to improve the design of iconic HCI models. In this study, we compare the usability of a CTPT HCI model designed using PD and an intuitively designed CTPT HCI model. Methods An iconic CTPT HCI model was created using PD. The PD-generated and an existing iconic CTPT HCI model were subjected to usability testing, with an emphasis on task accuracy and completion times. Study participants also completed a qualitative survey instrument to evaluate subjective satisfaction with the two models. Results CTPT end-users reliably and reproducibly agreed on the visual manifestation and semantics of prototype graphics generated using PD. The performance of the PD-generated iconic HCI model was equivalent to an existing HCI model for tasks at multiple levels of complexity, and in some cases superior. This difference was particularly notable when tasks required an understanding of the semantic meanings of multiple icons. Conclusion The use of PD to design an iconic CTPT HCI model generated beneficial results and improved end-user subjective satisfaction, while reducing task completion time. Such results are desirable in information- and time-intensive domains, such as clinical trials management. PMID:22132037

  3. "Filming in Progress": New Spaces for Multimodal Designing

    ERIC Educational Resources Information Center

    Mills, Kathy A.

    2010-01-01

    Global trends call for new research to investigate multimodal designing mediated by new technologies and the implications for classroom spaces. This article addresses the relationship between new technologies, students' multimodal designing, and the social production of classroom spaces. Multimodal semiotics and sociological principles are applied…

  4. "Filming in Progress": New Spaces for Multimodal Designing

    ERIC Educational Resources Information Center

    Mills, Kathy A.

    2010-01-01

    Global trends call for new research to investigate multimodal designing mediated by new technologies and the implications for classroom spaces. This article addresses the relationship between new technologies, students' multimodal designing, and the social production of classroom spaces. Multimodal semiotics and sociological principles are applied…

  5. Progress and prospects for an IFE relevant FI point design

    NASA Astrophysics Data System (ADS)

    Key, M.; Amendt, P.; Bellei, C.; Clark, D.; Cohen, B.; Divol, L.; Ho, D.; Kemp, A.; Larson, D.; Marinak, M.; Patel, P.; Shay, H.; Strozzi, D.; Tabak, M.

    2013-11-01

    The physics issues involved in scaling from sub-ignition to high gain fast ignition are discussed. Successful point designs must collimate the electrons and minimise the standoff distance to avoid multi-megajoule ignition energies. Collimating B field configurations are identified and some initial designs are explored.

  6. Progress and prospects for an FI relevant point design

    SciTech Connect

    Key, M; Amendt, P; Bellei, C; Clark, D; Cohen, B; Divol, L; Ho, D; Kemp, A; Larson, D; Marinak, M; Patel, P; Shay, H; Strozzi, D; Tabak, M

    2011-11-02

    The physics issues involved in scaling from sub-ignition to high-gain fast ignition are discussed. Successful point designs must collimate the electrons and minimize the standoff distance to avoid multi-megajoule ignition energies. Collimating B field configurations are identified and some initial designs are explored.

  7. Human factors analysis and design methods for nuclear waste retrieval systems: Human factors design methodology and integration plan

    NASA Astrophysics Data System (ADS)

    Casey, S. M.

    1980-06-01

    The nuclear waste retrieval system intended to be used for the removal of storage canisters (each canister containing a spent fuel rod assembly) located in an underground salt bed depository is discussed. The implementation of human factors engineering principles during the design and construction of the retrieval system facilities and equipment is reported. The methodology is structured around a basic system development effort involving preliminary development, equipment development, personnel subsystem development, and operational test and evaluation. Examples of the application of these techniques in the analysis of the human tasks and equipment required for the removal of spent fuel canisters are provided. The framework for integrating human engineering with the rest of the system development effort is documented.

  8. Assessment of an effective quasirelativistic methodology designed to study astatine chemistry in aqueous solution.

    PubMed

    Champion, Julie; Seydou, Mahamadou; Sabatié-Gogova, Andrea; Renault, Eric; Montavon, Gilles; Galland, Nicolas

    2011-09-07

    A cost-effective computational methodology designed to study astatine (At) chemistry in aqueous solution has been established. It is based on two-component spin-orbit density functional theory calculations and solvation calculations using the conductor-like polarizable continuum model in conjunction with specific astatine cavities. Theoretical calculations are confronted with experimental data measured for complexation reactions between metallic forms of astatine (At(+) and AtO(+)) and inorganic ligands (Cl(-), Br(-) and SCN(-)). For each reaction, both 1:1 and 1:2 complexes are evidenced. The experimental trends regarding the thermodynamic constants (K) can be reproduced qualitatively and quantitatively. The mean signed error on computed Log K values is -0.4, which corresponds to a mean signed error smaller than 1 kcal mol(-1) on free energies of reaction. Theoretical investigations show that the reactivity of cationic species of astatine is highly sensitive to spin-orbit coupling and solvent effects. At the moment, the presented computational methodology appears to be the only tool to gain an insight into astatine chemistry at a molecular level. This journal is © the Owner Societies 2011
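
    The quoted error bound follows from the standard relation between an equilibrium constant and the reaction free energy, ΔG = -RT ln(10) · log10 K. A minimal sketch of that conversion (the temperature and the Python wrapper are assumptions, not taken from the paper):

    ```python
    import math

    R_KCAL = 1.987204e-3  # gas constant in kcal/(mol*K)
    T = 298.15            # assumed standard temperature, K

    def dlogk_to_dG(delta_log_k: float, temperature: float = T) -> float:
        """Convert an error in log10(K) to the corresponding error in the
        reaction free energy, via dG = -RT * ln(10) * log10(K)."""
        return -R_KCAL * temperature * math.log(10) * delta_log_k

    if __name__ == "__main__":
        # A -0.4 mean signed error on log K maps to about +0.55 kcal/mol on dG,
        # consistent with the "smaller than 1 kcal/mol" statement in the abstract.
        print(f"{dlogk_to_dG(-0.4):.2f} kcal/mol")
    ```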

  9. Design of a strong cation exchange methodology for the evaluation of charge heterogeneity in glatiramer acetate.

    PubMed

    Campos-García, Víctor R; López-Morales, Carlos A; Benites-Zaragoza, Eleuterio; Jiménez-Miranda, Armando; Espinosa-de la Garza, Carlos E; Herrera-Fernández, Daniel; Padilla-Calderón, Jesús; Pérez, Néstor O; Flores-Ortiz, Luis F; Medina-Rivero, E

    2017-01-05

    Complex pharmaceuticals demand competent analytical methods able to analyze charge heterogeneity as a critical quality attribute (CQA), in compliance with current regulatory expectations. A notable example is glatiramer acetate (GA), a complex polypeptide mixture useful for the treatment of relapsing-remitting multiple sclerosis. This pharmaceutical challenges the current state of analytical technology in terms of the capacity to study its constituent species. Thus, a strong cation exchange methodology was designed under the lifecycle approach to support the establishment of GA identity, through the evaluation of its chromatographic profile, which acts as a charge heterogeneity fingerprint. In this regard, a maximum relative margin of error of 5% for relative retention time and symmetry factor was proposed for the analytical target profile. The methodology met the proposed requirements in precision and specificity tests, the former comprising sensitivity and selectivity. Subsequently, method validation was conducted and showed that the method is able to differentiate between intact GA and heterogeneity profiles coming from stressed, fractionated or process-modified samples. In summary, these results provide evidence that the method is adequate to assess charge heterogeneity as a CQA of this complex pharmaceutical. Copyright © 2016 Elsevier B.V. All rights reserved.
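
    A minimal sketch of how the 5% relative-margin criterion on relative retention time and symmetry factor could be checked against a reference profile; the peak values and helper names are hypothetical, and only the 5% tolerance comes from the abstract:

    ```python
    def relative_error(measured: float, reference: float) -> float:
        """Relative margin of error, expressed as a percentage of the reference value."""
        return abs(measured - reference) / reference * 100.0

    def profile_matches(sample: dict, reference: dict, tol_pct: float = 5.0) -> bool:
        """Accept a chromatographic profile if relative retention time (RRT) and
        symmetry factor stay within the tolerance for every reference peak."""
        for peak, ref in reference.items():
            meas = sample[peak]
            if relative_error(meas["rrt"], ref["rrt"]) > tol_pct:
                return False
            if relative_error(meas["symmetry"], ref["symmetry"]) > tol_pct:
                return False
        return True

    # Hypothetical peak data for illustration only.
    reference = {"main": {"rrt": 1.00, "symmetry": 1.10}}
    sample = {"main": {"rrt": 1.03, "symmetry": 1.12}}
    print(profile_matches(sample, reference))  # True: both metrics within 5%
    ```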

  10. Methodology for the nuclear design validation of an Alternate Emergency Management Centre (CAGE)

    NASA Astrophysics Data System (ADS)

    Hueso, César; Fabbri, Marco; de la Fuente, Cristina; Janés, Albert; Massuet, Joan; Zamora, Imanol; Gasca, Cristina; Hernández, Héctor; Vega, J. Ángel

    2017-09-01

    The methodology is devised by coupling different codes. The study of weather conditions, as part of the site data, determines the relative concentrations of radionuclides in the air using ARCON96. The activity in the air is characterized, depending on the source and release sequence specified in NUREG-1465, by the RADTRAD code, which provides the contribution of the inner cloud source term. Once the activities are known, energy spectra are inferred using ORIGEN-S and used as input for the models of the outer cloud, filters and containment generated with MCNP5. The sum of the different contributions must meet the habitability conditions specified by the CSN (Spanish Nuclear Regulatory Body) (TEDE < 50 mSv and equivalent dose to the thyroid < 500 mSv within 30 days following the accident), so the dose is optimized by varying parameters such as CAGE location, filtering flow, need for recirculation, thicknesses and compositions of the walls, etc. The results for the most penalizing area meet the established criteria, and therefore the CAGE building design based on the presented methodology is radiologically validated.
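
    A minimal sketch of the final acceptance step described above: summing the per-pathway dose contributions and comparing the totals with the CSN habitability limits quoted in the abstract. The contribution values and function names are hypothetical; the coupled ARCON96/RADTRAD/ORIGEN-S/MCNP5 chain is not reproduced:

    ```python
    # Habitability limits quoted in the abstract (30-day accident doses).
    TEDE_LIMIT_MSV = 50.0
    THYROID_LIMIT_MSV = 500.0

    def is_habitable(contributions: dict) -> bool:
        """Sum the per-pathway dose contributions and compare the totals
        with the habitability criteria."""
        tede = sum(c["tede"] for c in contributions.values())
        thyroid = sum(c["thyroid"] for c in contributions.values())
        return tede < TEDE_LIMIT_MSV and thyroid < THYROID_LIMIT_MSV

    # Hypothetical per-pathway contributions in mSv, for illustration only.
    contributions = {
        "outer_cloud": {"tede": 12.0, "thyroid": 120.0},
        "filters":     {"tede":  6.5, "thyroid":  80.0},
        "containment": {"tede":  9.0, "thyroid": 150.0},
    }
    print(is_habitable(contributions))  # True for these assumed numbers
    ```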

  11. A Bayesian maximum entropy-based methodology for optimal spatiotemporal design of groundwater monitoring networks.

    PubMed

    Hosseini, Marjan; Kerachian, Reza

    2017-09-01

    This paper presents a new methodology for analyzing the spatiotemporal variability of water table levels and redesigning a groundwater level monitoring network (GLMN) using the Bayesian Maximum Entropy (BME) technique and a multi-criteria decision-making approach based on ordered weighted averaging (OWA). The spatial sampling is determined using a hexagonal gridding pattern and a newly proposed method that assigns a removal priority number to each pre-existing station. To design the temporal sampling, a new approach is also applied to account for uncertainty caused by lack of information. In this approach, different time lag values are tested against another source of information, the simulation results of a numerical groundwater flow model. Furthermore, to incorporate the existing uncertainties in available monitoring data, the flexibility of the BME interpolation technique is exploited by applying soft data and improving the accuracy of the calculations. To examine the methodology, it is applied to the Dehgolan plain in northwestern Iran. Based on the results, a configuration of 33 monitoring stations for a regular hexagonal grid of side length 3600 m is proposed, in which the time lag between samples is equal to 5 weeks. Since the variance estimation errors of the BME method are almost identical for the redesigned and existing networks, the redesigned monitoring network is more cost-effective and efficient than the existing monitoring network with 52 stations and a monthly sampling frequency.
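
    A minimal sketch of the spatial-sampling step only: generating candidate station locations on a regular hexagonal grid of side length 3600 m over an assumed rectangular study area. The BME interpolation and OWA ranking are not reproduced, and the extent values are placeholders:

    ```python
    import math

    def hex_grid(x_max: float, y_max: float, side: float = 3600.0):
        """Candidate sampling points on a triangular lattice, i.e. the centres
        of a regular hexagonal tessellation with the given side length (metres)."""
        spacing = side * math.sqrt(3.0)       # centre-to-centre distance
        dy = spacing * math.sqrt(3.0) / 2.0   # vertical row spacing (= 1.5 * side)
        points, row, y = [], 0, 0.0
        while y <= y_max:
            x = (spacing / 2.0) if row % 2 else 0.0   # offset alternate rows
            while x <= x_max:
                points.append((x, y))
                x += spacing
            y += dy
            row += 1
        return points

    pts = hex_grid(20_000.0, 20_000.0)   # assumed 20 km x 20 km study area
    print(len(pts), pts[:3])
    ```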

  12. Use of a qualitative methodological scaffolding process to design robust interprofessional studies.

    PubMed

    Wener, Pamela; Woodgate, Roberta L

    2013-07-01

    Increasingly, researchers are using qualitative methodology to study interprofessional collaboration (IPC). With this increase in use, there seems to be an appreciation for how qualitative studies allow us to understand the unique individual or group experience in more detail and form a basis for policy change and innovative interventions. Furthermore, there is an increased understanding of the potential of studying new or emerging phenomena qualitatively to inform further large-scale studies. Although there is a current trend toward greater acceptance of the value of qualitative studies describing the experiences of IPC, these studies are mostly descriptive in nature. Applying a process suggested by Crotty (1998) may encourage researchers to consider the value of situating research questions within a broader theoretical framework that will inform the overall research approach, including methodology and methods. This paper describes the application of a process to a research project and then illustrates how this process encouraged iterative cycles of thinking and doing. The authors describe each step of the process, share decision-making points, and suggest an additional step to the process. Applying this approach to selecting data collection methods may serve to guide and support the qualitative researcher in creating a well-designed study approach.

  13. A methodology for the efficient integration of transient constraints in the design of aircraft dynamic systems

    NASA Astrophysics Data System (ADS)

    Phan, Leon L.

    The motivation behind this thesis mainly stems from previous work performed at Hispano-Suiza (Safran Group) in the context of the European research project "Power Optimised Aircraft". Extensive testing on the COPPER Bird RTM, a test rig designed to characterize aircraft electrical networks, demonstrated the relevance of transient regimes in the design and development of dynamic systems. Transient regimes experienced by dynamic systems may have severe impacts on the operation of the aircraft. For example, the switching on of a high electrical load might cause a network voltage drop inducing a loss of power available to critical aircraft systems. These transient behaviors are thus often regulated by dynamic constraints, requiring the dynamic signals to remain within bounds whose values vary with time. The verification of these peculiar types of constraints, which generally requires high-fidelity time-domain simulation, intervenes late in the system development process, thus potentially causing costly design iterations. The research objective of this thesis is to develop a methodology that integrates the verification of dynamic constraints in the early specification of dynamic systems. In order to circumvent the inefficiencies of time-domain simulation, multivariate dynamic surrogate models of the original time-domain simulation models are generated, building on a nonlinear system identification technique using wavelet neural networks (or wavenets), which allow the multiscale nature of transient signals to be captured. However, training multivariate wavenets can become computationally prohibitive as the number of design variables increases. Therefore, an alternate approach is formulated, in which dynamic surrogate models using sigmoid-based neural networks are used to emulate the transient behavior of the envelopes of the time-domain response. Thus, in order to train the neural network, the envelopes are extracted by first separating the scales of the dynamic response
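
    The surrogate approach described above emulates the envelopes of the time-domain response rather than the full transient. A minimal sketch of one common way to extract an upper envelope (interpolating local maxima); the test signal is invented and the thesis's scale-separation procedure is not reproduced:

    ```python
    import numpy as np

    def upper_envelope(t: np.ndarray, y: np.ndarray) -> np.ndarray:
        """Piecewise-linear upper envelope obtained by interpolating
        the local maxima of a transient signal."""
        peaks = [i for i in range(1, len(y) - 1) if y[i] >= y[i - 1] and y[i] >= y[i + 1]]
        idx = [0] + peaks + [len(y) - 1]          # keep the end points
        return np.interp(t, t[idx], y[idx])

    # Illustrative damped transient, e.g. a bus voltage recovering after a load switch.
    t = np.linspace(0.0, 1.0, 2000)
    y = 1.0 - 0.4 * np.exp(-6.0 * t) * np.cos(2 * np.pi * 25 * t)
    env = upper_envelope(t, y)
    print(env[:5])
    ```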

  14. A two-phase methodology for technology selection and system design

    NASA Technical Reports Server (NTRS)

    Bard, Jonathan F.; Feinberg, Abe

    1989-01-01

    A two-phase methodology that can be used to guide R&D managers in the evaluation and selection of competing technologies is presented. Deterministic multiattribute utility theory is used in the first phase to rank the technological alternatives; the example presented involves the evaluation of electric and hybrid passenger vehicles. In all, 39 individuals from eight automotive firms were interviewed to assess their risk preferences and attitudes toward the vehicle design. In the second phase, the decision-maker must allocate a fixed amount of resources to different projects for the technology selected, some of which may be undertaken in parallel, to maximize a given measure of performance. When parallel funding is pursued the best outcome is chosen. The problem is formulated as a probabilistic network and solved heuristically using Monte Carlo simulation. Results are presented for two decision-makers and three budget options. In each case, the heuristic finds the optimal allocation of funds.
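
    A minimal sketch of phase one, a deterministic additive multiattribute utility ranking of alternatives; the attributes, weights, and single-attribute utilities below are hypothetical, not the study's elicited values:

    ```python
    def rank_alternatives(scores: dict, weights: dict) -> list:
        """Rank alternatives by an additive multiattribute utility
        U(a) = sum_i w_i * u_i(a), with single-attribute utilities in [0, 1]."""
        utilities = {
            name: sum(weights[attr] * u for attr, u in attrs.items())
            for name, attrs in scores.items()
        }
        return sorted(utilities.items(), key=lambda kv: kv[1], reverse=True)

    # Hypothetical attribute utilities for three vehicle technologies.
    weights = {"range": 0.4, "cost": 0.35, "emissions": 0.25}
    scores = {
        "electric": {"range": 0.5, "cost": 0.6, "emissions": 0.9},
        "hybrid":   {"range": 0.8, "cost": 0.5, "emissions": 0.7},
        "gasoline": {"range": 0.9, "cost": 0.7, "emissions": 0.3},
    }
    print(rank_alternatives(scores, weights))
    ```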

  15. Transmutation of singularities and zeros in graded index optical instruments: a methodology for designing practical devices.

    PubMed

    Hooper, I R; Philbin, T G

    2013-12-30

    We describe a design methodology for modifying the refractive index profile of graded-index optical instruments that incorporate singularities or zeros in their refractive index. The process maintains the device performance whilst resulting in graded profiles that are all-dielectric, do not require materials with unrealistic values, and that are impedance matched to the bounding medium. This is achieved by transmuting the singularities (or zeros) using the formalism of transformation optics, but with an additional boundary condition requiring the gradient of the co-ordinate transformation be continuous. This additional boundary condition ensures that the device is impedance matched to the bounding medium when the spatially varying permittivity and permeability profiles are scaled to realizable values. We demonstrate the method in some detail for an Eaton lens, before describing the profiles for an "invisible disc" and "multipole" lenses.

  16. Mixed culture optimization for marigold flower ensilage via experimental design and response surface methodology.

    PubMed

    Navarrete-Bolaños, José Luis; Jiménez-Islas, Hugo; Botello-Alvarez, Enrique; Rico-Martínez, Ramiro

    2003-04-09

    Endogenous microorganisms isolated from the marigold flower (Tagetes erecta) were studied to understand the events taking place during its ensilage. Studies of the cellulase enzymatic activity and the ensilage process were undertaken. In both studies, the use of approximate second-order models and multiple linear regression, within the context of an experimental mixture design using response surface methodology as the optimization strategy, determined that the microorganisms Flavobacterium IIb, Acinetobacter anitratus, and Rhizopus nigricans are the most significant in marigold flower ensilage and exhibit high cellulase activity. A mixed culture composed of 9.8% Flavobacterium IIb, 41% A. anitratus, and 49.2% R. nigricans used during ensilage resulted in an increased yield of total extracted xanthophylls of 24.94 g/kg of dry weight, compared with 12.92 g/kg for the uninoculated control ensilage.
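
    A minimal sketch of fitting an approximate second-order (Scheffé-type) mixture model by least squares, the kind of model the abstract refers to; the mixture points and yields below are invented for illustration and are not the study's data:

    ```python
    import numpy as np

    def scheffe_quadratic_design_matrix(X: np.ndarray) -> np.ndarray:
        """Design matrix of the Scheffe quadratic mixture model:
        y = b1*x1 + b2*x2 + b3*x3 + b12*x1*x2 + b13*x1*x3 + b23*x2*x3."""
        x1, x2, x3 = X[:, 0], X[:, 1], X[:, 2]
        return np.column_stack([x1, x2, x3, x1 * x2, x1 * x3, x2 * x3])

    # Hypothetical mixture proportions (rows sum to 1) and xanthophyll yields.
    X = np.array([
        [1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0],
        [0.5, 0.5, 0.0], [0.5, 0.0, 0.5], [0.0, 0.5, 0.5],
        [1/3, 1/3, 1/3],
    ])
    y = np.array([10.0, 14.0, 16.0, 13.5, 15.0, 20.0, 18.0])

    coef, *_ = np.linalg.lstsq(scheffe_quadratic_design_matrix(X), y, rcond=None)
    print(np.round(coef, 2))   # fitted blending and interaction coefficients
    ```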

  17. Progress in the planar CPn SOFC system design verification

    SciTech Connect

    Elangovan, S.; Hartvigsen, J.; Khandkar, A.

    1996-04-01

    SOFCo is developing a high-efficiency, modular and scalable planar SOFC module termed the CPn design. This design has been verified in a 1.4 kW module test operated directly on pipeline natural gas. The design features multistage oxidation of fuel, wherein the fuel is consumed incrementally over several stages. High efficiency is achieved by a uniform current density distribution per stage, which lowers the stack resistance. Additional benefits include thermal regulation and compactness. Test results from stack modules operating on pipeline natural gas are presented.

  18. Robust design of spot welds in automotive structures: A decision-making methodology

    NASA Astrophysics Data System (ADS)

    Ouisse, M.; Cogan, S.

    2010-05-01

    Automotive structures include thousands of spot welds whose design must allow the assembled vehicle to satisfy a wide variety of performance constraints including static, dynamic and crash criteria. The objective of a standard optimization strategy is to reduce the number of spot welds as much as possible while satisfying all the design objectives. However, a classical optimization of the spot weld distribution using an exhaustive search approach is simply not feasible due to the very high order of the design space and the subsequently prohibitive calculation costs. Moreover, even if this calculation could be done, the result would not necessarily be very informative with respect to the design robustness to manufacturing uncertainties (location of welds and defective welds) and to the degradation of spot welds due to fatigue effects over the lifetime of the vehicle. In this paper, a decision-making methodology is presented which allows some aspects of the robustness issues to be integrated into the spot weld design process. The starting point is a given distribution of spot welds on the structure, which is based on both engineering know-how and preliminary critical numerical results, in particular criteria such as crash behavior. An over-populated spot weld distribution is then built in order to satisfy the remaining design criteria, such as static torsion angle and modal behavior. Then, an efficient optimization procedure based on energy considerations is used to eliminate redundant spot welds while preserving as far as possible the nominal structural behavior. The resulting sub-optimal solution is then used to provide a decision indicator for defining effective quality control procedures (e.g. visual post-assembly inspection of a small number of critical spot welds) as well as designing redundancy into critical zones. The final part of the paper is related to comparing the robustness of competing designs. Some decision-making indicators are presented to help the
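
    A minimal sketch of the greedy idea described above: rank spot welds by an energy-based importance indicator and remove the least important ones while a performance check still passes. Both the indicator values and the performance criterion here are placeholders, not the paper's formulation:

    ```python
    def prune_spot_welds(energies: dict, performance_ok, max_removals: int):
        """Greedy removal of the lowest-energy (least loaded) spot welds,
        accepting each removal only if the performance check still passes."""
        kept = dict(energies)
        removed = []
        for weld in sorted(energies, key=energies.get):
            if len(removed) >= max_removals:
                break
            trial = {w: e for w, e in kept.items() if w != weld}
            if performance_ok(trial):
                kept = trial
                removed.append(weld)
        return kept, removed

    # Hypothetical strain-energy shares per weld and a toy performance check.
    energies = {"w1": 0.02, "w2": 0.30, "w3": 0.05, "w4": 0.45, "w5": 0.18}
    performance_ok = lambda welds: sum(welds.values()) >= 0.90  # placeholder criterion
    print(prune_spot_welds(energies, performance_ok, max_removals=3))
    ```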

  19. [Principles and methodology for ecological rehabilitation and security pattern design in key project construction].

    PubMed

    Chen, Li-Ding; Lu, Yi-He; Tian, Hui-Ying; Shi, Qian

    2007-03-01

    Global ecological security becomes increasingly important as human activities intensify. The function of ecological security is influenced by human activities, and in turn, the efficiency of human activities is affected by the patterns of regional ecological security. Since the 1990s, China has initiated the construction of key projects such as the "Yangtze Three Gorges Dam", "Qinghai-Tibet Railway", "West-to-East Gas Pipeline", "West-to-East Electricity Transmission" and "South-to-North Water Transfer". The interaction between these projects and regional ecological security has particularly attracted the attention of the Chinese government. Developing an ecological rehabilitation system and designing a regional ecological security pattern are important not only for regional environmental protection but also for the smooth implementation of the projects themselves. This paper makes a systematic analysis of the types and characteristics of key project construction and their effects on the environment and, on this basis, brings forward basic principles and a methodology for ecological rehabilitation and security pattern design in such construction. The following issues should be addressed in the implementation of a key project: 1) analysis and evaluation of the current regional ecological environment, 2) evaluation of anthropogenic disturbances and their ecological risk, 3) regional ecological rehabilitation and security pattern design, 4) scenario analysis of the environmental benefits of the regional ecological security pattern, 5) re-optimization of the regional ecological system framework, and 6) establishment of a regional ecosystem management plan.

  20. Design methodology accounting for fabrication errors in manufactured modified Fresnel lenses for controlled LED illumination.

    PubMed

    Shim, Jongmyeong; Kim, Joongeok; Lee, Jinhyung; Park, Changsu; Cho, Eikhyun; Kang, Shinill

    2015-07-27

    The increasing demand for lightweight, miniaturized electronic devices has prompted the development of small, high-performance optical components for light-emitting diode (LED) illumination. As such, the Fresnel lens is widely used in applications due to its compact configuration. However, the vertical groove angle between the optical axis and the groove inner facets in a conventional Fresnel lens creates an inherent Fresnel loss, which degrades optical performance. Modified Fresnel lenses (MFLs) have been proposed in which the groove angles along the optical paths are carefully controlled; however, in practice, the optical performance of MFLs is inferior to the theoretical performance due to fabrication errors, as conventional design methods do not account for fabrication errors as part of the design process. In this study, the Fresnel loss and the loss area due to microscopic fabrication errors in the MFL were theoretically derived to determine optical performance. Based on this analysis, a design method for the MFL accounting for the fabrication errors was proposed. MFLs were fabricated using an ultraviolet imprinting process and an injection molding process, two representative processes with differing fabrication errors. The MFL fabrication error associated with each process was examined analytically and experimentally to investigate our methodology.

  1. Designing reasonable accommodation of the workplace: a new methodology based on risk assessment.

    PubMed

    Pigini, L; Andrich, R; Liverani, G; Bucciarelli, P; Occhipinti, E

    2010-05-01

    If working tasks are carried out in inadequate conditions, workers with functional limitations may, over time, risk developing further disabilities. While several validated risk assessment methods exist for able-bodied workers, few studies have been carried out for workers with disabilities. This article, which reports the findings of a Study funded by the Italian Ministry of Labour, proposes a general methodology for the technical and organisational re-design of a worksite, based on risk assessment and irrespective of any worker disability. To this end, a sample of 16 disabled workers, composed of people with either mild or severe motor disabilities, was recruited. Their jobs include business administration (5), computer programmer (1), housewife (1), mechanical worker (2), textile worker (1), bus driver (1), nurse (2), electrical worker (1), teacher (1), warehouseman (1). By using a mix of risk assessment methods and the International Classification of Functioning (ICF) taxonomy, their worksites were re-designed in view of a reasonable accommodation, and prospective evaluation was carried out to check whether the new design would eliminate the risks. In one case - a man with congenital malformations who works as a help-desk operator for technical assistance in the Information and Communication Technology (ICT) department of a big organisation - the accommodation was actually carried out within the time span of the study, thus making it possible to confirm the hypotheses raised in the prospective assessment.

  2. Flexible energy-storage devices: design consideration and recent progress.

    PubMed

    Wang, Xianfu; Lu, Xihong; Liu, Bin; Chen, Di; Tong, Yexiang; Shen, Guozhen

    2014-07-23

    Flexible energy-storage devices are attracting increasing attention as they show unique promising advantages, such as flexibility, shape diversity, light weight, and so on; these properties enable applications in portable, flexible, and even wearable electronic devices, including soft electronic products, roll-up displays, and wearable devices. Consequently, considerable effort has been made in recent years to fulfill the requirements of future flexible energy-storage devices, and much progress has been witnessed. This review describes the most recent advances in flexible energy-storage devices, including flexible lithium-ion batteries and flexible supercapacitors. The latest successful examples in flexible lithium-ion batteries and their technological innovations and challenges are reviewed first. This is followed by a detailed overview of the recent progress in flexible supercapacitors based on carbon materials and a number of composites and flexible micro-supercapacitors. Some of the latest achievements regarding interesting integrated energy-storage systems are also reviewed. Further research direction is also proposed to surpass existing technological bottle-necks and realize idealized flexible energy-storage devices.

  3. Analog design optimization methodology for ultralow-power circuits using intuitive inversion-level and saturation-level parameters

    NASA Astrophysics Data System (ADS)

    Eimori, Takahisa; Anami, Kenji; Yoshimatsu, Norifumi; Hasebe, Tetsuya; Murakami, Kazuaki

    2014-01-01

    A comprehensive design optimization methodology using intuitive nondimensional parameters of inversion-level and saturation-level is proposed, especially for ultralow-power, low-voltage, and high-performance analog circuits with mixed strong, moderate, and weak inversion metal-oxide-semiconductor transistor (MOST) operations. This methodology is based on the synthesized charge-based MOST model composed of Enz-Krummenacher-Vittoz (EKV) basic concepts and advanced-compact-model (ACM) physics-based equations. The key concept of this methodology is that all circuit and system characteristics are described as some multivariate functions of inversion-level parameters, where the inversion level is used as an independent variable representative of each MOST. The analog circuit design starts from the first step of inversion-level design using universal characteristics expressed by circuit currents and inversion-level parameters without process-dependent parameters, followed by the second step of foundry-process-dependent design and the last step of verification using saturation-level criteria. This methodology also paves the way to an intuitive and comprehensive design approach for many kinds of analog circuit specifications by optimization using inversion-level log-scale diagrams and saturation-level criteria. In this paper, we introduce an example of our design methodology for a two-stage Miller amplifier.
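
    A minimal sketch of the inversion-level bookkeeping this kind of methodology relies on: an EKV-style inversion coefficient IC = I_D / I_spec with I_spec = 2·n·(μCox)·(W/L)·U_T², and the conventional weak/moderate/strong classification. The process numbers and bias point are assumed, not taken from the paper:

    ```python
    def inversion_coefficient(i_d, w_over_l, mu_cox=3e-4, n=1.3, u_t=0.0259):
        """EKV-style inversion coefficient IC = I_D / I_spec, with
        I_spec = 2 * n * (mu*Cox) * (W/L) * U_T**2 (all SI units).
        mu_cox is an assumed process transconductance parameter in A/V^2."""
        i_spec = 2.0 * n * mu_cox * w_over_l * u_t ** 2
        return i_d / i_spec

    def inversion_region(ic):
        """Conventional classification used with the inversion coefficient."""
        if ic < 0.1:
            return "weak"
        if ic <= 10.0:
            return "moderate"
        return "strong"

    # Hypothetical bias point: 1 uA drain current, W/L = 20.
    ic = inversion_coefficient(1e-6, 20.0)
    print(round(ic, 3), inversion_region(ic))
    ```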

  4. Progress in conceptual design of EU DEMO EC system

    NASA Astrophysics Data System (ADS)

    Garavaglia, Saul; Bruschi, Alex; Franke, Thomas; Granucci, Gustavo; Grossetti, Giovanni; Jelonnek, John; Moro, Alessandro; Poli, Emanuele; Rispoli, Natale; Strauss, Dirk; Tran, Quang Minh

    2017-07-01

    Since 2014, under the umbrella of the EUROfusion Consortium, the Work Package Heating and Current Drive (WPHCD) has been performing the engineering design and R&D for the electron cyclotron (EC), ion cyclotron and neutral beam systems of the future fusion power plant DEMO. This presentation covers the activities performed over the last two years on the EC system conceptual design as part of WPHCD, focusing on launchers, transmission lines, system reliability and architecture.

  5. Progress Toward Efficient Laminar Flow Analysis and Design

    NASA Technical Reports Server (NTRS)

    Campbell, Richard L.; Campbell, Matthew L.; Streit, Thomas

    2011-01-01

    A multi-fidelity system of computer codes for the analysis and design of vehicles having extensive areas of laminar flow is under development at the NASA Langley Research Center. The overall approach consists of the loose coupling of a flow solver, a transition prediction method and a design module using shell scripts, along with interface modules to prepare the input for each method. This approach allows the user to select the flow solver and transition prediction module, as well as run mode for each code, based on the fidelity most compatible with the problem and available resources. The design module can be any method that designs to a specified target pressure distribution. In addition to the interface modules, two new components have been developed: 1) an efficient, empirical transition prediction module (MATTC) that provides n-factor growth distributions without requiring boundary layer information; and 2) an automated target pressure generation code (ATPG) that develops a target pressure distribution that meets a variety of flow and geometry constraints. The ATPG code also includes empirical estimates of several drag components to allow the optimization of the target pressure distribution. The current system has been developed for the design of subsonic and transonic airfoils and wings, but may be extendable to other speed ranges and components. Several analysis and design examples are included to demonstrate the current capabilities of the system.
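
    A minimal sketch of the loose-coupling pattern described above, with placeholder functions standing in for the flow solver, transition prediction, and design modules; the module behavior and relaxation factor are invented purely to make the loop runnable:

    ```python
    def run_flow_solver(geometry):
        """Placeholder for the flow-solver module (returns a pressure metric per station)."""
        return [0.1 * g for g in geometry]

    def predict_transition(pressures):
        """Placeholder for the transition-prediction module (returns a transition indicator)."""
        return sum(pressures) / len(pressures)

    def design_update(geometry, target, transition):
        """Placeholder inverse-design step driving the geometry toward the target value.
        The relaxation factor 5.0 is arbitrary, chosen only so this toy loop converges."""
        return [g + 5.0 * (target - transition) for g in geometry]

    def coupled_design_loop(geometry, target, max_iters=20, tol=1e-3):
        """Loosely coupled analysis/design iteration: each module is treated as a
        black box exchanged through simple interfaces."""
        for it in range(max_iters):
            pressures = run_flow_solver(geometry)
            transition = predict_transition(pressures)
            if abs(target - transition) < tol:
                return geometry, it
            geometry = design_update(geometry, target, transition)
        return geometry, max_iters

    print(coupled_design_loop([1.0, 2.0, 3.0], target=0.5))
    ```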

  6. Teaching Mathematical Modelling in a Design Context: A Methodology Based on the Mechanical Analysis of a Domestic Cancrusher.

    ERIC Educational Resources Information Center

    Pace, Sydney

    2000-01-01

    Presents a methodology for teaching mathematical modeling skills to A-level students. Gives an example illustrating how mathematics teachers and design teachers can take joint perspective in devising learning opportunities that develop mathematical and design skills concurrently. (Contains 14 references.) (Author/ASK)

  7. A game-based decision support methodology for competitive systems design

    NASA Astrophysics Data System (ADS)

    Briceno, Simon Ignacio

    This dissertation describes the development of a game-based methodology that facilitates the exploration and selection of research and development (R&D) projects under uncertain competitive scenarios. The proposed method provides an approach that analyzes competitor positioning and formulates response strategies to forecast the impact of technical design choices on a project's market performance. A critical decision in the conceptual design phase of propulsion systems is the selection of the best architecture, centerline, core size, and technology portfolio. This selection can be challenging when considering evolving requirements from both the airframe manufacturing company and the airlines in the market. Furthermore, the exceedingly high cost of core architecture development and its associated risk makes this strategic architecture decision the most important one for an engine company. Traditional conceptual design processes emphasize performance and affordability as their main objectives. These areas alone however, do not provide decision-makers with enough information as to how successful their engine will be in a competitive market. A key objective of this research is to examine how firm characteristics such as their relative differences in completing R&D projects, differences in the degree of substitutability between different project types, and first/second-mover advantages affect their product development strategies. Several quantitative methods are investigated that analyze business and engineering strategies concurrently. In particular, formulations based on the well-established mathematical field of game theory are introduced to obtain insights into the project selection problem. The use of game theory is explored in this research as a method to assist the selection process of R&D projects in the presence of imperfect market information. The proposed methodology focuses on two influential factors: the schedule uncertainty of project completion times and
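
    A minimal sketch of the game-theoretic ingredient: enumerating pure-strategy Nash equilibria of a small two-firm R&D game. The strategies and payoffs below are hypothetical and are not drawn from the dissertation:

    ```python
    import itertools

    def pure_nash_equilibria(payoff_a, payoff_b):
        """Enumerate pure-strategy Nash equilibria of a two-player game
        given payoff matrices indexed as [row_strategy][col_strategy]."""
        n_rows, n_cols = len(payoff_a), len(payoff_a[0])
        equilibria = []
        for i, j in itertools.product(range(n_rows), range(n_cols)):
            best_row = all(payoff_a[i][j] >= payoff_a[k][j] for k in range(n_rows))
            best_col = all(payoff_b[i][j] >= payoff_b[i][l] for l in range(n_cols))
            if best_row and best_col:
                equilibria.append((i, j))
        return equilibria

    # Hypothetical payoffs: each firm chooses a "new core" or a "derivative" program.
    #                    col: new core  derivative
    payoff_a = [[2, 6],    # row: new core
                [4, 3]]    # row: derivative
    payoff_b = [[2, 4],
                [6, 3]]
    print(pure_nash_equilibria(payoff_a, payoff_b))  # strategy-index pairs
    ```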

  8. A Software Designed For STP Data Plot and Analysis Based on Object-oriented Methodology

    NASA Astrophysics Data System (ADS)

    Lina, L.; Murata, K.

    2006-12-01

    simply follows the present system as long as the language is object-oriented. Researchers who want to add their own data to the STARS simply add their own data class to the domain object model; this is because any satellite data has properties, such as time or date, that are inherited from the upper class. In this way, their effort is smaller than with older methodologies. In the OMT, the description format of the system is rather strictly standardized. When new developers join the STARS project, they only have to understand each model to obtain an overview of the STARS, and then follow these designs and documents to implement the system. The OMT thus makes it easy for a newcomer to join a project that is already running.
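
    A minimal sketch of the domain-object pattern described above: a new instrument-specific data class inherits common properties such as time/date from an upper satellite-data class. The class and attribute names are hypothetical, not the STARS code itself:

    ```python
    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import List

    @dataclass
    class SatelliteData:
        """Upper class of the domain object model: properties shared by all
        satellite data sets (mission name, observation start time, raw samples)."""
        mission: str
        start_time: datetime
        samples: List[float] = field(default_factory=list)

        def time_span(self) -> str:
            return f"{self.mission} data starting {self.start_time.isoformat()}"

    @dataclass
    class MagnetometerData(SatelliteData):
        """A researcher's new data class: it adds only what is instrument-specific
        and inherits time/date handling from SatelliteData."""
        sampling_rate_hz: float = 1.0

    obs = MagnetometerData("GEOTAIL", datetime(2006, 12, 1), [1.2, 1.3], 16.0)
    print(obs.time_span())
    ```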

  9. Emulating a System Dynamics Model with Agent-Based Models: A Methodological Case Study in Simulation of Diabetes Progression

    SciTech Connect

    Schryver, Jack; Nutaro, James; Shankar, Mallikarjun

    2015-10-30

    An agent-based simulation model hierarchy emulating disease states and behaviors critical to progression of diabetes type 2 was designed and implemented in the DEVS framework. The models are translations of basic elements of an established system dynamics model of diabetes. This model hierarchy, which mimics diabetes progression over an aggregated U.S. population, was dis-aggregated and reconstructed bottom-up at the individual (agent) level. Four levels of model complexity were defined in order to systematically evaluate which parameters are needed to mimic outputs of the system dynamics model. Moreover, the four estimated models attempted to replicate stock counts representing disease states in the system dynamics model, while estimating impacts of an elderliness factor, an obesity factor and health-related behavioral parameters. Health-related behavior was modeled as a simple realization of the Theory of Planned Behavior, a joint function of individual attitude and diffusion of social norms that spread over each agent's social network. Although the most complex agent-based simulation model contained 31 adjustable parameters, all models were considerably less complex than the system dynamics model, which required numerous time series inputs to make its predictions. All three elaborations of the baseline model provided significantly improved fits to the output of the system dynamics model. The performances of the baseline agent-based model and its extensions illustrate a promising approach to translate complex system dynamics models into agent-based model alternatives that are both conceptually simpler and capable of capturing main effects of complex local agent-agent interactions.
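
    A minimal sketch of the behavior rule paraphrased above: each agent adopts the healthy behavior with a probability that combines its own attitude with the norm diffusing over its social network. The weights, ring network, and parameter values are assumptions for illustration, not the study's calibration:

    ```python
    import random

    random.seed(1)

    class Agent:
        def __init__(self, attitude):
            self.attitude = attitude      # individual attitude in [0, 1]
            self.behavior = 0             # 1 = healthy behavior adopted
            self.neighbors = []

        def social_norm(self):
            """Fraction of network neighbors currently showing the behavior."""
            if not self.neighbors:
                return 0.0
            return sum(n.behavior for n in self.neighbors) / len(self.neighbors)

        def step(self, w_attitude=0.6, w_norm=0.4):
            """Adopt the behavior with probability given by a weighted
            combination of attitude and perceived social norm."""
            p = w_attitude * self.attitude + w_norm * self.social_norm()
            self.behavior = 1 if random.random() < p else 0

    # Small ring network of agents with random attitudes.
    agents = [Agent(random.random()) for _ in range(20)]
    for i, a in enumerate(agents):
        a.neighbors = [agents[(i - 1) % 20], agents[(i + 1) % 20]]

    for _ in range(10):                   # iterate the diffusion process
        for a in agents:
            a.step()
    print(sum(a.behavior for a in agents), "of 20 agents show the behavior")
    ```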

  10. Emulating a System Dynamics Model with Agent-Based Models: A Methodological Case Study in Simulation of Diabetes Progression

    DOE PAGES

    Schryver, Jack; Nutaro, James; Shankar, Mallikarjun

    2015-10-30

    An agent-based simulation model hierarchy emulating disease states and behaviors critical to progression of diabetes type 2 was designed and implemented in the DEVS framework. The models are translations of basic elements of an established system dynamics model of diabetes. This model hierarchy, which mimics diabetes progression over an aggregated U.S. population, was dis-aggregated and reconstructed bottom-up at the individual (agent) level. Four levels of model complexity were defined in order to systematically evaluate which parameters are needed to mimic outputs of the system dynamics model. Moreover, the four estimated models attempted to replicate stock counts representing disease states in the system dynamics model, while estimating impacts of an elderliness factor, an obesity factor and health-related behavioral parameters. Health-related behavior was modeled as a simple realization of the Theory of Planned Behavior, a joint function of individual attitude and diffusion of social norms that spread over each agent's social network. Although the most complex agent-based simulation model contained 31 adjustable parameters, all models were considerably less complex than the system dynamics model, which required numerous time series inputs to make its predictions. All three elaborations of the baseline model provided significantly improved fits to the output of the system dynamics model. The performances of the baseline agent-based model and its extensions illustrate a promising approach to translate complex system dynamics models into agent-based model alternatives that are both conceptually simpler and capable of capturing main effects of complex local agent-agent interactions.

  11. Experimental validation of an integrated controls-structures design methodology for a class of flexible space structures

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Elliott, Kenny B.; Joshi, Suresh M.; Walz, Joseph E.

    1994-01-01

    This paper describes the first experimental validation of an optimization-based integrated controls-structures design methodology for a class of flexible space structures. The Controls-Structures-Interaction (CSI) Evolutionary Model, a laboratory test bed at Langley, is redesigned based on the integrated design methodology with two different dissipative control strategies. The redesigned structure is fabricated, assembled in the laboratory, and experimentally compared with the original test structure. Design guides are proposed and used in the integrated design process to ensure that the resulting structure can be fabricated. Experimental results indicate that the integrated design requires greater than 60 percent less average control power (by thruster actuators) than the conventional control-optimized design while maintaining the required line-of-sight performance, thereby confirming the analytical findings about the superiority of the integrated design methodology. Amenability of the integrated design structure to other control strategies is considered and evaluated analytically and experimentally. This work also demonstrates the capabilities of the Langley-developed design tool CSI DESIGN which provides a unified environment for structural and control design.

  12. Response surface methodology and process optimization of sustained release pellets using Taguchi orthogonal array design and central composite design.

    PubMed

    Singh, Gurinder; Pai, Roopa S; Devi, V Kusum

    2012-01-01

    Furosemide is a powerful diuretic and antihypertensive drug which has low bioavailability due to hepatic first-pass metabolism and has a short half-life of 2 hours. To overcome the above drawback, the present study was carried out to formulate and evaluate sustained release (SR) pellets of furosemide for oral administration prepared by extrusion/spheronization. Drug Coat L-100 was used within the pellet core along with microcrystalline cellulose as the diluent, and the concentration of the selected binder was optimized to be 1.2%. The formulation was prepared with a drug to polymer ratio of 1:3. It was optimized using Design of Experiments by employing a 3² central composite design, which was used to systematically optimize the process parameters combined with response surface methodology. Dissolution studies were carried out with USP apparatus Type I (basket type) in both simulated gastric and intestinal pH. Statistical analysis of the in vitro data (two-tailed paired t test and one-way ANOVA) showed that there was a very significant (P≤0.05) difference in the dissolution profile of furosemide SR pellets when compared with the pure drug and a commercial product. Validation of the process optimization study indicated an extremely high degree of prognostic ability. The study effectively undertook the development of optimized process parameters for pelletization of furosemide pellets with tremendous SR characteristics.
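
    A minimal sketch of the design-of-experiments layout named in the abstract: a central composite design in coded units (factorial, axial, and centre points). The factor count, alpha, and centre-point replication below are generic choices, not the study's actual levels:

    ```python
    import itertools

    def central_composite_design(n_factors=2, alpha=1.0, n_center=3):
        """Coded-unit central composite design: 2^k factorial points,
        2k axial points at +/- alpha, and replicated centre points.
        alpha = 1.0 gives the face-centred variant."""
        factorial = list(itertools.product([-1.0, 1.0], repeat=n_factors))
        axial = []
        for i in range(n_factors):
            for a in (-alpha, alpha):
                point = [0.0] * n_factors
                point[i] = a
                axial.append(tuple(point))
        center = [tuple([0.0] * n_factors)] * n_center
        return factorial + axial + center

    for run in central_composite_design():
        print(run)
    ```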

  13. Response surface methodology and process optimization of sustained release pellets using Taguchi orthogonal array design and central composite design

    PubMed Central

    Singh, Gurinder; Pai, Roopa S.; Devi, V. Kusum

    2012-01-01

    Furosemide is a powerful diuretic and antihypertensive drug which has low bioavailability due to hepatic first-pass metabolism and has a short half-life of 2 hours. To overcome the above drawback, the present study was carried out to formulate and evaluate sustained release (SR) pellets of furosemide for oral administration prepared by extrusion/spheronization. Drug Coat L-100 was used within the pellet core along with microcrystalline cellulose as the diluent, and the concentration of the selected binder was optimized to be 1.2%. The formulation was prepared with a drug to polymer ratio of 1:3. It was optimized using Design of Experiments by employing a 3² central composite design, which was used to systematically optimize the process parameters combined with response surface methodology. Dissolution studies were carried out with USP apparatus Type I (basket type) in both simulated gastric and intestinal pH. Statistical analysis of the in vitro data (two-tailed paired t test and one-way ANOVA) showed that there was a very significant (P≤0.05) difference in the dissolution profile of furosemide SR pellets when compared with the pure drug and a commercial product. Validation of the process optimization study indicated an extremely high degree of prognostic ability. The study effectively undertook the development of optimized process parameters for pelletization of furosemide pellets with tremendous SR characteristics. PMID:22470891

  14. Science Underpinning TBC Design to Overcome the CMAS Threat to Progress in Gas Turbine Technology

    DTIC Science & Technology

    2015-09-30

    Final report on ONR Grant No. N00014-08-1-0522, "Science Underpinning TBC Design to Overcome the CMAS Threat to Progress in Gas Turbine Technology". Only fragments of the abstract survive in the source record: the work concerns CMAS-related degradation in current and future gas turbine engines with expected material temperatures at or above 1300°C, and the overarching goal was to elucidate the ...

  15. Progress in Conceptual Design and Analysis of Advanced Rotorcraft

    NASA Technical Reports Server (NTRS)

    Yamauchi, Gloria K.

    2012-01-01

    This presentation will give information on Multi-Disciplinary Analysis and Technology Development, including its objectives and how they will be met. In addition, it will also present recent highlights, including the Lift-Offset Civil Design and its study conclusions, as well as the LCTR2 Propulsion Concept's study conclusions. Recent publications and future publications will also be discussed.

  16. Design and progress report for compact cryocooled sapphire oscillator 'VCSO'

    NASA Technical Reports Server (NTRS)

    Dick, G. John; Wang, Rabi T.; Tjoelker, Robert L.

    2005-01-01

    We report on the development of a compact cryocooled sapphire oscillator 'VCSO', designed as a higher-performance replacement for ultra-stable quartz oscillators in local oscillator, cleanup, and flywheel applications in the frequency generation and distribution subsystems of NASA's Deep Space Network (DSN).

  17. Progress and Design Concerns of Nanostructured Solar Energy Harvesting Devices.

    PubMed

    Leung, Siu-Fung; Zhang, Qianpeng; Tavakoli, Mohammad Mahdi; He, Jin; Mo, Xiaoliang; Fan, Zhiyong

    2016-05-01

    Integrating devices with nanostructures is considered a promising strategy to improve the performance of solar energy harvesting devices such as photovoltaic (PV) devices and photo-electrochemical (PEC) solar water splitting devices. Extensive efforts have been exerted to improve the power conversion efficiencies (PCE) of such devices by utilizing novel nanostructures to revolutionize device structural designs. The thicknesses of light absorber and material consumption can be substantially reduced because of light trapping with nanostructures. Meanwhile, the utilization of nanostructures can also result in more effective carrier collection by shortening the photogenerated carrier collection path length. Nevertheless, performance optimization of nanostructured solar energy harvesting devices requires a rational design of various aspects of the nanostructures, such as their shape, aspect ratio, periodicity, etc. Without this, the utilization of nanostructures can lead to compromised device performance as the incorporation of these structures can result in defects and additional carrier recombination. The design guidelines of solar energy harvesting devices are summarized, including thin film non-uniformity on nanostructures, surface recombination, parasitic absorption, and the importance of uniform distribution of photo-generated carriers. A systematic view of the design concerns will assist better understanding of device physics and benefit the fabrication of high performance devices in the future. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Multi-acoustic lens design methodology for a low cost C-scan photoacoustic imaging camera

    NASA Astrophysics Data System (ADS)

    Chinni, Bhargava; Han, Zichao; Brown, Nicholas; Vallejo, Pedro; Jacobs, Tess; Knox, Wayne; Dogra, Vikram; Rao, Navalgund

    2016-03-01

    We have designed and implemented a novel acoustic-lens-based focusing technology in a prototype photoacoustic imaging camera. All photoacoustically generated waves from laser-exposed absorbers within a small volume are focused simultaneously by the lens onto an image plane. We use a multi-element ultrasound transducer array to capture the focused photoacoustic signals. The acoustic lens eliminates the need for expensive data acquisition hardware, is faster than electronic focusing, and enables real-time image reconstruction. Using this photoacoustic imaging camera, we have imaged more than 150 ex-vivo human prostate, kidney and thyroid specimens, each several centimeters in size, at millimeter resolution for cancer detection. In this paper, we share our lens design strategy and how we evaluate the resulting quality metrics (on- and off-axis point spread function, depth of field and modulation transfer function) through simulation. An advanced toolbox in MATLAB was adapted and used to simulate a two-dimensional gridded model that incorporates realistic photoacoustic signal generation and acoustic wave propagation through the lens, with medium properties defined at each grid point. Two-dimensional point spread functions have been generated and compared with experiments to demonstrate the utility of our design strategy. Finally, we present results from work in progress on the use of a two-lens system aimed at further improving some of the quality metrics of our system.

  19. The Component Packaging Problem: A Vehicle for the Development of Multidisciplinary Design and Analysis Methodologies

    NASA Technical Reports Server (NTRS)

    Fadel, Georges; Bridgewood, Michael; Figliola, Richard; Greenstein, Joel; Kostreva, Michael; Nowaczyk, Ronald; Stevenson, Steve

    1999-01-01

    This report summarizes academic research which has resulted in an increased appreciation for multidisciplinary efforts among our students, colleagues and administrators. It has also generated a number of research ideas that emerged from the interaction between disciplines. Overall, 17 undergraduate students and 16 graduate students benefited directly from the NASA grant; an additional 11 graduate students were impacted by the work and participated without financial support from NASA. The work resulted in 16 theses (with 7 to be completed in the near future) and 67 papers or reports, mostly published in 8 journals and/or presented at various conferences (a total of 83 papers, presentations and reports published based on NASA-inspired or NASA-supported work). In addition, the faculty and students presented related work at many meetings, and continuing work has been proposed to NSF, the Army, industry and other state and federal institutions to continue efforts in the direction of multidisciplinary and, recently, multi-objective design and analysis. The specific problem addressed is component packing, which was solved as a multi-objective problem using iterative genetic algorithms and decomposition. Further testing and refinement of the developed methodology is presently under investigation. Teaming issues research and classes resulted in the publication of a web site (http://design.eng.clemson.edu/psych4991) which provides pointers and techniques to interested parties. Specific advantages of using iterative genetic algorithms, hurdles faced and resolved, and institutional difficulties associated with multi-discipline teaming are described in some detail.

  20. A Rapid Python-Based Methodology for Target-Focused Combinatorial Library Design.

    PubMed

    Li, Shiliang; Song, Yuwei; Liu, Xiaofeng; Li, Honglin

    2016-01-01

    The chemical space is so vast that only a small portion of it has been examined. As a complementary approach to systematically probing the chemical space, virtual combinatorial library design has had an enormous impact on generating novel and diverse structures for drug discovery. Despite these favorable contributions, high attrition rates in drug development, mainly resulting from lack of efficacy and from side effects, make it increasingly challenging to discover good chemical starting points. In most cases, focused libraries, which are restricted to particular regions of the chemical space, are deftly exploited to maximize hit rates and improve efficiency at the beginning of the drug discovery and drug development pipeline. This paper presented a valid methodology for fast target-focused combinatorial library design in both reaction-based and production-based modes, with library creation rates of approximately 70,000 molecules per second. Simple, quick and convenient operating procedures are the specific features of the method. SHAFTS, a hybrid 3D similarity calculation software, was embedded to help refine the size of the libraries and improve hit rates. Two target-focused (p38-focused and COX2-focused) libraries were constructed efficiently in this study. This rapid library enumeration method is portable and applicable to other targets for identifying good chemical starting points in combination with either structure-based or ligand-based virtual screening.
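
    A minimal sketch of production-based enumeration in the spirit described above: every combination of R-group fragments is substituted into the placeholder sites of a scaffold string. The scaffold and fragments are schematic, chemically unvalidated examples, and the SHAFTS similarity filtering is not reproduced:

    ```python
    import itertools

    def enumerate_library(scaffold: str, r_groups: dict) -> list:
        """Production-based enumeration: substitute every combination of
        R-group fragments into the placeholder sites of a scaffold string."""
        sites = sorted(r_groups)                       # e.g. ["[R1]", "[R2]"]
        products = []
        for combo in itertools.product(*(r_groups[s] for s in sites)):
            smiles = scaffold
            for site, fragment in zip(sites, combo):
                smiles = smiles.replace(site, fragment)
            products.append(smiles)
        return products

    # Hypothetical scaffold and fragments written as schematic SMILES-like strings.
    scaffold = "c1ccc([R1])cc1C(=O)N[R2]"
    r_groups = {"[R1]": ["F", "Cl", "OC"], "[R2]": ["C", "CC", "c1ccccc1"]}
    library = enumerate_library(scaffold, r_groups)
    print(len(library), library[0])
    ```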