Sample records for design methodology application

  1. The Study of the Relationship between Probabilistic Design and Axiomatic Design Methodology. Volume 1

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chinyere; Onyebueke, Landon

    1996-01-01

    This final report covers all the work performed on this project. The goal of the project is technology transfer of methodologies that improve the design process. The specific objectives are: 1. To learn and understand probabilistic design analysis using NESSUS. 2. To assign design projects on the application of NESSUS to undergraduate and graduate students. 3. To integrate the application of NESSUS into selected senior-level courses in the Civil and Mechanical Engineering curricula. 4. To develop courseware in probabilistic design methodology for inclusion in a graduate-level Design Methodology course. 5. To study the relationship between the probabilistic and axiomatic design methodologies.
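
    The probabilistic design analysis in objective 1 ultimately reduces to estimating the probability that an applied stress exceeds a material strength. A minimal Monte Carlo sketch of that idea follows; the distributions and their parameters are illustrative assumptions, not values from the report, and NESSUS itself uses far more efficient probability-integration algorithms:

```python
import random

def failure_probability(n_samples=100_000, seed=1):
    # Crude Monte Carlo estimate of P(stress > strength) for normally
    # distributed stress and strength. The means and standard
    # deviations below are illustrative assumptions only.
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        stress = rng.gauss(300.0, 30.0)    # applied stress, MPa (assumed)
        strength = rng.gauss(400.0, 40.0)  # material strength, MPa (assumed)
        if stress > strength:
            failures += 1
    return failures / n_samples
```

    With these assumed distributions the safety margin is normal with mean 100 and standard deviation 50 MPa, so the estimate should land near the analytic value of about 2.3%.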

  2. Rational Design Methodology.

    DTIC Science & Technology

    1978-09-01

    This report describes an effort to specify a software design methodology applicable to the Air Force software environment. Available methodologies ... of techniques for proof of correctness, design specification, and performance assessment of static designs. The rational methodology selected is a ...

  3. Application of an Integrated Methodology for Propulsion and Airframe Control Design to a STOVL Aircraft

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay; Mattern, Duane

    1994-01-01

    An advanced methodology for integrated flight/propulsion control (IFPC) design for future aircraft, which will use propulsion system generated forces and moments for enhanced maneuver capabilities, is briefly described. This methodology has the potential to address in a systematic manner the coupling between the airframe and the propulsion subsystems typical of such enhanced maneuverability aircraft. Application of the methodology to a short take-off vertical landing (STOVL) aircraft in the landing approach to hover transition flight phase is presented, with a brief description of the various steps in the IFPC design methodology. The details of the individual steps have been described in previous publications, and the objective of this paper is to focus on how the components of the control system designed at each step integrate into the overall IFPC system. The full nonlinear IFPC system was evaluated extensively in non-real-time simulations as well as piloted simulations. Results from the non-real-time evaluations are presented in this paper. Lessons learned from this application study are summarized in terms of areas of potential improvement in the STOVL IFPC design, as well as identification of technology development areas that would enhance the applicability of the proposed design methodology.

  4. Railroad classification yard design methodology study: Elkhart Yard Rehabilitation, a case study

    DOT National Transportation Integrated Search

    1980-02-01

    This interim report documents the application of a railroad classification yard design methodology to CONRAIL's Elkhart Yard Rehabilitation. This case study effort represents Phase 2 of a larger effort to develop a yard design methodology, and ...

  5. Methodologies for Root Locus and Loop Shaping Control Design with Comparisons

    NASA Technical Reports Server (NTRS)

    Kopasakis, George

    2017-01-01

    This paper reviews the basics of the root-locus and loop-shaping control design methods, and establishes approaches that expedite the application of these two design methodologies so that control designs meeting requirements with superior performance are easily obtained. The two approaches are compared for their ability to meet control design specifications and for ease of application, using control design examples. They are also compared with traditional Proportional Integral Derivative (PID) control in order to demonstrate the limitations of PID control. Robustness of the designs is covered as it pertains to these control methodologies and the example problems.
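
    As a minimal illustration of the root-locus idea: for a toy unity-feedback plant G(s) = K / (s(s + 2)) (an assumed example for illustration, not one of the paper's), the closed-loop poles solve s^2 + 2s + K = 0, and tracing them as the gain K varies sketches the locus:

```python
import cmath

def closed_loop_poles(K):
    # Closed-loop poles of unity feedback around G(s) = K / (s(s + 2)):
    # the characteristic equation s^2 + 2s + K = 0, solved directly.
    disc = cmath.sqrt(4 - 4 * K)
    return (-2 + disc) / 2, (-2 - disc) / 2

low = closed_loop_poles(0.5)   # two distinct real, stable poles
high = closed_loop_poles(4.0)  # complex pair with real part -1
```

    For K < 1 both poles are real; at K = 1 they meet at s = -1; for K > 1 they split into a complex pair along Re(s) = -1, which is the branch a designer reads damping and natural frequency from.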

  6. Case-Crossover Analysis of Air Pollution Health Effects: A Systematic Review of Methodology and Application

    PubMed Central

    Carracedo-Martínez, Eduardo; Taracido, Margarita; Tobias, Aurelio; Saez, Marc; Figueiras, Adolfo

    2010-01-01

    Background: Case-crossover is one of the most used designs for analyzing the health-related effects of air pollution. Nevertheless, no one has reviewed its application and methodology in this context. Objective: We conducted a systematic review of case-crossover (CCO) designs used to study the relationship between air pollution and morbidity and mortality, from the standpoint of methodology and application. Data sources and extraction: A search was made of the MEDLINE and EMBASE databases. Reports were classified as methodologic or applied. From the latter, the following information was extracted: author, study location, year, type of population (general or patients), dependent variable(s), independent variable(s), type of CCO design, and whether effect modification was analyzed for variables at the individual level. Data synthesis: The review covered 105 reports that fulfilled the inclusion criteria. Of these, 24 addressed methodological aspects, and the remainder involved the design's application. In the methodological reports, the designs that yielded the best results in simulation were symmetric bidirectional CCO and time-stratified CCO. Furthermore, we observed an increase across time in the use of certain CCO designs, mainly symmetric bidirectional and time-stratified CCO. The dependent variables most frequently analyzed were those relating to hospital morbidity; the pollutants most often studied were those linked to particulate matter. Among the CCO-application reports, 13.6% studied effect modification for variables at the individual level. Conclusions: The use of CCO designs has undergone considerable growth; the most widely used designs were those that yielded better results in simulation studies: symmetric bidirectional and time-stratified CCO. However, the advantages of CCO as a method of analysis of variables at the individual level are put to little use. PMID:20356818
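
    The time-stratified design the review identifies as one of the best performers has a simple referent-selection rule: compare the event day with all other days of the same calendar month that fall on the same weekday. A sketch of just that step (the exposure contrast and the conditional logistic regression are not shown):

```python
from datetime import date, timedelta

def time_stratified_referents(event_day):
    # Referent (control) days for one event under the time-stratified
    # case-crossover design: every day in the same month and year that
    # shares the event day's weekday, excluding the event day itself.
    d = date(event_day.year, event_day.month, 1)
    referents = []
    while d.month == event_day.month:
        if d.weekday() == event_day.weekday() and d != event_day:
            referents.append(d)
        d += timedelta(days=1)
    return referents
```

    Matching on weekday within the month is what controls for day-of-week patterns and long-term time trends by design rather than by modelling.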

  7. A design methodology for nonlinear systems containing parameter uncertainty: Application to nonlinear controller design

    NASA Technical Reports Server (NTRS)

    Young, G.

    1982-01-01

    A design methodology capable of dealing with nonlinear systems, such as a controlled ecological life support system (CELSS), containing parameter uncertainty is discussed. The methodology was applied to the design of discrete time nonlinear controllers. The nonlinear controllers can be used to control either linear or nonlinear systems. Several controller strategies are presented to illustrate the design procedure.

  8. Development of TF coil support concepts by design methodology in the case of a Bitter-type magnet. [Bitter-type magnets]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brossmann, U.B.

    1981-01-01

    The application of methodological design is demonstrated for the development of support concepts in the case of a Bitter-type magnet designed for a compact tokamak experiment aiming at ignition of a DT plasma. With this methodology, all boundary conditions and design criteria are more easily satisfied in a technically and economically sound way.

  9. Applications of mixed-methods methodology in clinical pharmacy research.

    PubMed

    Hadi, Muhammad Abdul; Closs, S José

    2016-06-01

    Introduction: Mixed-methods methodology, as the name suggests, refers to the mixing of elements of both qualitative and quantitative methodologies in a single study. In the past decade, mixed-methods methodology has gained popularity among healthcare researchers, as it promises to bring together the strengths of both qualitative and quantitative approaches. Methodology: A number of mixed-methods designs are available in the literature; the four most commonly used in healthcare research are the convergent parallel design, the embedded design, the exploratory design, and the explanatory design. Each has its own advantages, challenges, and procedures, and selection of a particular design should be guided by the research question. Guidance on designing, conducting, and reporting mixed-methods research is available in the literature, and it is advisable to adhere to it to ensure methodological rigour. When to use: Mixed-methods methodology is best suited when the research questions require triangulating findings from different methodologies to explain a single phenomenon; clarifying the results of one method using another method; informing the design of one method based on the findings of another; developing a scale or questionnaire; or answering different research questions within a single study. Two case studies are presented to illustrate possible applications of mixed-methods methodology. Limitations: Possessing the necessary knowledge and skills to undertake qualitative and quantitative data collection, analysis, interpretation, and integration remains the biggest challenge for researchers conducting mixed-methods studies. Sequential study designs are often time consuming, being in two (or more) phases, whereas concurrent study designs may require more than one data collector to collect both qualitative and quantitative data at the same time.

  10. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  11. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  12. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  13. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  14. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  15. Designing for fiber composite structural durability in hygrothermomechanical environment

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1985-01-01

    A methodology is described which can be used to design/analyze fiber composite structures subjected to complex hygrothermomechanical environments. This methodology includes composite mechanics and advanced structural analysis methods (finite element). Select examples are described to illustrate the application of the available methodology. The examples include: (1) composite progressive fracture; (2) composite design for high cycle fatigue combined with hot-wet conditions; and (3) general laminate design.

  16. Application of the Hardman methodology to the Army Remotely Piloted Vehicle (RPV)

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The application of the HARDMAN Methodology to the Remotely Piloted Vehicle (RPV) is described. The methodology was used to analyze the manpower, personnel, and training (MPT) requirements of the proposed RPV system design for a number of operating scenarios. The RPV system is defined as consisting of the equipment, personnel, and operational procedures needed to perform five basic artillery missions: reconnaissance, target acquisition, artillery adjustment, target designation and damage assessment. The RPV design evaluated includes an air vehicle (AV), a modular integrated communications and navigation system (MICNS), a ground control station (GCS), a launch subsystem (LS), a recovery subsystem (RS), and a number of ground support requirements. The HARDMAN Methodology is an integrated set of data base management techniques and analytic tools, designed to provide timely and fully documented assessments of the human resource requirements associated with an emerging system's design.

  17. Classification of Computer-Aided Design-Computer-Aided Manufacturing Applications for the Reconstruction of Cranio-Maxillo-Facial Defects.

    PubMed

    Wauters, Lauri D J; Miguel-Moragas, Joan San; Mommaerts, Maurice Y

    2015-11-01

    To gain insight into the methodology of different computer-aided design-computer-aided manufacturing (CAD-CAM) applications for the reconstruction of cranio-maxillo-facial (CMF) defects. We reviewed and analyzed the available literature pertaining to CAD-CAM for use in CMF reconstruction. We proposed a classification system of the techniques of implant and cutting, drilling, and/or guiding template design and manufacturing. The system consisted of 4 classes (I-IV). These classes combine techniques used for both the implant and template to most accurately describe the methodology used. Our classification system can be widely applied. It should facilitate communication and immediate understanding of the methodology of CAD-CAM applications for the reconstruction of CMF defects.

  18. Turbofan engine control system design using the LQG/LTR methodology

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay

    1989-01-01

    Application of the linear-quadratic-Gaussian with loop-transfer-recovery methodology to design of a control system for a simplified turbofan engine model is considered. The importance of properly scaling the plant to achieve the desired target feedback loop is emphasized. The steps involved in the application of the methodology are discussed via an example, and evaluation results are presented for a reduced-order compensator. The effect of scaling the plant on the stability robustness evaluation of the closed-loop system is studied in detail.
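
    The LQR step at the heart of the LQG/LTR procedure can be shown on a one-state stand-in (the report's turbofan model is multivariable; the plant and weights here are illustrative assumptions). For xdot = a*x + b*u with cost integral of (q*x^2 + r*u^2) dt, the scalar algebraic Riccati equation 2*a*P - (b^2/r)*P^2 + q = 0 yields the feedback gain directly:

```python
import math

def scalar_lqr_gain(a, b, q, r):
    # Solve the scalar algebraic Riccati equation
    #   2*a*P - (b*b/r)*P*P + q = 0
    # for its positive root P, then return the LQR gain k = b*P/r
    # for the control law u = -k*x.
    P = r * (a + math.sqrt(a * a + b * b * q / r)) / (b * b)
    return b * P / r

k = scalar_lqr_gain(a=1.0, b=1.0, q=1.0, r=1.0)
# the closed-loop pole a - b*k is negative, i.e. stable
```

    Even in this toy setting the report's emphasis on scaling is visible: rescaling b or r rescales P and the gain, so plant units must be chosen consistently before the target feedback loop is shaped and recovered.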

  19. Railroad classification yard design methodology study: East Deerfield Yard, a case study

    DOT National Transportation Integrated Search

    1980-02-01

    This interim report documents the application of a railroad classification yard design methodology to Boston and Maine's East Deerfield Yard Rehabilitation. This case study effort represents Phase 2 of a larger effort to develop a yard design methodol...

  1. Application of new methodologies based on design of experiments, independent component analysis and design space for robust optimization in liquid chromatography.

    PubMed

    Debrus, Benjamin; Lebrun, Pierre; Ceccato, Attilio; Caliaro, Gabriel; Rozet, Eric; Nistor, Iolanda; Oprean, Radu; Rupérez, Francisco J; Barbas, Coral; Boulanger, Bruno; Hubert, Philippe

    2011-04-08

    HPLC separations of an unknown sample mixture and a pharmaceutical formulation have been optimized using a recently developed chemometric methodology proposed by W. Dewé et al. in 2004 and improved by P. Lebrun et al. in 2008. This methodology is based on experimental designs which are used to model retention times of compounds of interest. Then, the prediction accuracy and the optimal separation robustness, including the uncertainty study, were evaluated. Finally, the design space (ICH Q8(R1) guideline) was computed as the probability for a criterion to lie in a selected range of acceptance. Furthermore, the chromatograms were automatically read. Peak detection and peak matching were carried out with a previously developed methodology using independent component analysis published by B. Debrus et al. in 2009. The present successful applications strengthen the high potential of these methodologies for the automated development of chromatographic methods. Copyright © 2011 Elsevier B.V. All rights reserved.
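
    The design-space computation described above boils down to estimating, at each candidate operating condition, the probability that the acceptance criterion is met given the model's predictive uncertainty. A hedged Monte Carlo sketch follows; the retention-time model, noise level, and criterion are hypothetical stand-ins, not the paper's fitted models:

```python
import random

def design_space_probability(model, noise_sd, criterion, factor,
                             n=5000, seed=7):
    # Estimate P(criterion met at this factor setting) by simulating
    # the modelled response together with its predictive uncertainty.
    rng = random.Random(seed)
    hits = sum(criterion(model(factor) + rng.gauss(0.0, noise_sd))
               for _ in range(n))
    return hits / n

def retention(pH):      # hypothetical fitted retention-time model (min)
    return 2.0 + 0.5 * pH

def meets_spec(rt):     # hypothetical acceptance criterion
    return rt >= 3.0

p_ok = design_space_probability(retention, 0.5, meets_spec, factor=4.0)
```

    Repeating this over a grid of factor settings and keeping the region where the probability exceeds a chosen threshold is, in essence, the ICH Q8 design space the paper computes.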

  2. Warehouses information system design and development

    NASA Astrophysics Data System (ADS)

    Darajatun, R. A.; Sukanta

    2017-12-01

    The materials/goods handling function is fundamental for companies to ensure the smooth running of their warehouses; efficiency and organization in every aspect of the business are essential in order to gain a competitive advantage. The purpose of this research is the design and development of a Kanban-based inventory storage and delivery system. The application aims to make inventory stock checks more efficient and effective: users can easily record finished goods from the production department, the warehouse, customers, and suppliers, and the master data are designed to be as complete as possible so that the application can be used across a variety of warehouse logistics processes. The application was developed in Java, which is used for building Java Web applications, with MySQL as the database. The system development methodology used is the Waterfall methodology, which comprises the stages of analysis, system design, implementation, integration, and operation and maintenance. Data were collected through observation, interviews, and a literature review.

  3. Application of Design Methodologies for Feedback Compensation Associated with Linear Systems

    NASA Technical Reports Server (NTRS)

    Smith, Monty J.

    1996-01-01

    The work that follows is concerned with the application of design methodologies for feedback compensation associated with linear systems. In general, the intent is to provide a well-behaved closed-loop system in terms of stability and robustness (internal signals remain bounded under a certain amount of uncertainty) while simultaneously achieving an acceptable level of performance. The approach here has been to convert the closed-loop system and control synthesis problem into the interpolation setting. The interpolation formulation then serves as the mathematical representation of the design process. Lifting techniques have been used to solve the corresponding interpolation and control synthesis problems. Several applications using this multiobjective design methodology are included to show the effectiveness of these techniques. In particular, the mixed H2/H-infinity performance criterion and its algorithm have been used on several examples, including an F-18 HARV (High Angle of Attack Research Vehicle) for sensitivity performance.

  4. Efficient preliminary floating offshore wind turbine design and testing methodologies and application to a concrete spar design

    PubMed Central

    Matha, Denis; Sandner, Frank; Molins, Climent; Campos, Alexis; Cheng, Po Wen

    2015-01-01

    The current key challenge in the floating offshore wind turbine industry and research is on designing economic floating systems that can compete with fixed-bottom offshore turbines in terms of levelized cost of energy. The preliminary platform design, as well as early experimental design assessments, are critical elements in the overall design process. In this contribution, a brief review of current floating offshore wind turbine platform pre-design and scaled testing methodologies is provided, with a focus on their ability to accommodate the coupled dynamic behaviour of floating offshore wind systems. The exemplary design and testing methodology for a monolithic concrete spar platform as performed within the European KIC AFOSP project is presented. Results from the experimental tests compared to numerical simulations are presented and analysed and show very good agreement for relevant basic dynamic platform properties. Extreme and fatigue loads and cost analysis of the AFOSP system confirm the viability of the presented design process. In summary, the exemplary application of the reduced design and testing methodology for AFOSP confirms that it represents a viable procedure during pre-design of floating offshore wind turbine platforms. PMID:25583870

  5. Designing Trend-Monitoring Sounds for Helicopters: Methodological Issues and an Application

    ERIC Educational Resources Information Center

    Edworthy, Judy; Hellier, Elizabeth; Aldrich, Kirsteen; Loxley, Sarah

    2004-01-01

    This article explores methodological issues in sonification and sound design arising from the design of helicopter monitoring sounds. Six monitoring sounds (each with 5 levels) were tested for similarity and meaning with 3 different techniques: hierarchical cluster analysis, linkage analysis, and multidimensional scaling. In Experiment 1,…

  6. Multirate Flutter Suppression System Design for the Benchmark Active Controls Technology Wing. Part 2; Methodology Application Software Toolbox

    NASA Technical Reports Server (NTRS)

    Mason, Gregory S.; Berg, Martin C.; Mukhopadhyay, Vivek

    2002-01-01

    To study the effectiveness of various control system design methodologies, the NASA Langley Research Center initiated the Benchmark Active Controls Project. In this project, the various methodologies were applied to design a flutter suppression system for the Benchmark Active Controls Technology (BACT) Wing. This report describes the user's manual and software toolbox developed at the University of Washington to design a multirate flutter suppression control law for the BACT wing.

  7. Prognostics and health management design for rotary machinery systems—Reviews, methodology and applications

    NASA Astrophysics Data System (ADS)

    Lee, Jay; Wu, Fangji; Zhao, Wenyu; Ghaffari, Masoud; Liao, Linxia; Siegel, David

    2014-01-01

    Much research has been conducted in prognostics and health management (PHM), an emerging field in mechanical engineering that is gaining interest from both academia and industry. Most of these efforts have been in the area of machinery PHM, resulting in the development of many algorithms for this particular application. The majority of these algorithms concentrate on applications involving common rotary machinery components, such as bearings and gears. Knowledge of this prior work is a necessity for any future research efforts to be conducted; however, there has not been a comprehensive overview that details previous and on-going efforts in PHM. In addition, a systematic method for developing and deploying a PHM system has yet to be established. Such a method would enable rapid customization and integration of PHM systems for diverse applications. To address these gaps, this paper provides a comprehensive review of the PHM field, followed by an introduction of a systematic PHM design methodology, 5S methodology, for converting data to prognostics information. This methodology includes procedures for identifying critical components, as well as tools for selecting the most appropriate algorithms for specific applications. Visualization tools are presented for displaying prognostics information in an appropriate fashion for quick and accurate decision making. Industrial case studies are included in this paper to show how this methodology can help in the design of an effective PHM system.
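
    As a concrete taste of the kind of condition indicator such PHM pipelines compute (a generic illustration, not the paper's 5S methodology): a vibration RMS feature normalized against a healthy baseline, where values near 1 indicate health and sustained growth flags degradation:

```python
import math

def rms_health_index(vibration, baseline_rms):
    # Root-mean-square of a vibration signal segment, normalized by the
    # RMS measured on the same machine when known healthy. Values near
    # 1 mean healthy; a rising trend indicates developing damage.
    rms = math.sqrt(sum(x * x for x in vibration) / len(vibration))
    return rms / baseline_rms
```

    Trending such indices over time, then mapping them to remaining useful life, is where the prognostic algorithms surveyed in the paper take over.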

  8. End State: The Fallacy of Modern Military Planning

    DTIC Science & Technology

    2017-04-06

    operational planning for non-linear, complex scenarios requires application of non-linear, advanced planning techniques such as design methodology ... cannot be approached in a linear, mechanistic manner by a universal planning methodology. Theater/global campaign plans and theater strategies offer no ... strategic environments, and instead prescribes a universal linear methodology that pays no mind to strategic complexity. This universal application

  9. From the Analysis of Work-Processes to Designing Competence-Based Occupational Standards and Vocational Curricula

    ERIC Educational Resources Information Center

    Tutlys, Vidmantas; Spöttl, Georg

    2017-01-01

    Purpose: This paper aims to explore methodological and institutional challenges on application of the work-process analysis approach in the design and development of competence-based occupational standards for Lithuania. Design/methodology/approach: The theoretical analysis is based on the review of scientific literature and the analysis of…

  10. Investigating the application of AOP methodology in development of Financial Accounting Software using Eclipse-AJDT Environment

    NASA Astrophysics Data System (ADS)

    Sharma, Amita; Sarangdevot, S. S.

    2010-11-01

    Aspect-Oriented Programming (AOP) methodology has been investigated in the development of real-world business application software: financial accounting software. The Eclipse-AJDT environment has been used as open-source enhanced IDE support for programming in the AOP language AspectJ. Crosscutting concerns have been identified and modularized as aspects. This reduces the complexity of the design considerably, owing to the elimination of code scattering and tangling. Improvements in modularity, quality, and performance are achieved. The study concludes that the AOP methodology in the Eclipse-AJDT environment offers powerful support for modular design and implementation of real-world, quality business software.
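
    The crosscutting-concern modularization the study attributes to AspectJ can be suggested in Python with a decorator (a loose analogy, not AspectJ code): the logging "advice" lives in one place instead of being scattered through the business functions. `post_ledger_entry` is a hypothetical accounting operation introduced only for this sketch:

```python
import functools

calls = []  # audit trail collected by the cross-cutting "aspect"

def logged(fn):
    # Decorator acting as a minimal aspect: the logging concern is
    # modularized here rather than tangled into every business function.
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        calls.append(fn.__name__)  # the "advice" run at each join point
        return fn(*args, **kwargs)
    return wrapper

@logged
def post_ledger_entry(amount):
    # Hypothetical financial-accounting operation.
    return amount
```

    AspectJ goes further by matching join points declaratively with pointcuts, so the business code needs no decoration at all; the decorator only conveys the separation-of-concerns idea.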

  11. Enhanced CAX Architecture, Design and Methodology - SPHINX (Architecture, définition et méthodologie améliorées des exercices assistés par ordinateur (CAX) - SPHINX)

    DTIC Science & Technology

    2016-08-01

    STO Technical Report TR-MSG-106: Enhanced CAX Architecture, Design and Methodology - SPHINX (Architecture, définition et méthodologie améliorées des exercices assistés par ordinateur (CAX) - SPHINX) ... transition, application and field-testing, experimentation and a range of related scientific activities that include systems engineering, operational

  12. Efficient preliminary floating offshore wind turbine design and testing methodologies and application to a concrete spar design.

    PubMed

    Matha, Denis; Sandner, Frank; Molins, Climent; Campos, Alexis; Cheng, Po Wen

    2015-02-28

    The current key challenge in the floating offshore wind turbine industry and research is on designing economic floating systems that can compete with fixed-bottom offshore turbines in terms of levelized cost of energy. The preliminary platform design, as well as early experimental design assessments, are critical elements in the overall design process. In this contribution, a brief review of current floating offshore wind turbine platform pre-design and scaled testing methodologies is provided, with a focus on their ability to accommodate the coupled dynamic behaviour of floating offshore wind systems. The exemplary design and testing methodology for a monolithic concrete spar platform as performed within the European KIC AFOSP project is presented. Results from the experimental tests compared to numerical simulations are presented and analysed and show very good agreement for relevant basic dynamic platform properties. Extreme and fatigue loads and cost analysis of the AFOSP system confirm the viability of the presented design process. In summary, the exemplary application of the reduced design and testing methodology for AFOSP confirms that it represents a viable procedure during pre-design of floating offshore wind turbine platforms. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  13. Applying axiomatic design to a medication distribution system

    NASA Astrophysics Data System (ADS)

    Raguini, Pepito B.

    As the need to minimize medication errors drives many medical facilities to seek robust solutions to this most common threat to patient safety, hospitals would be wise to put concerted effort into methodologies that can deliver an optimized medication distribution system. If a hospital's upper management is looking for an optimization method, it is equally important that the right tool be selected for the application at hand. In the present work, we propose the application of Axiomatic Design (AD), a process that focuses on the generation and selection of functional requirements to meet customer needs in product and/or process design. The appeal of the axiomatic approach is that it provides both a formal design process and a set of technical coefficients for meeting the customer's needs; AD thus offers a strategy for the effective integration of people, design methods, design tools, and design data. We therefore apply the AD methodology to a medical application with the main objective of allowing nurses to provide cost-effective delivery of medications to inpatients, thereby improving the quality of patient care. The AD methodology is implemented through the use of focused stores, where medications can be readily stored conveniently near patients, as well as through a mobile apparatus commonly used by hospitals that can also store medications: the medication cart. Moreover, a robust method, the focused-store methodology, is introduced and developed for both the uncapacitated and capacitated case studies, setting up an appropriate AD framework and design problem for a medication distribution case study.
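
    The Independence Axiom at the core of AD can be checked mechanically from the design matrix that maps design parameters (DPs) to functional requirements (FRs). A generic sketch of that classification, not code from the dissertation above:

```python
def classify_design_matrix(A):
    # Classify a square axiomatic-design matrix (rows = FRs,
    # cols = DPs, nonzero entry = this DP affects this FR) under the
    # Independence Axiom: a diagonal matrix is "uncoupled", a
    # triangular one is "decoupled" (satisfiable in a fixed order),
    # anything else is "coupled".
    n = len(A)
    off = [(i, j) for i in range(n) for j in range(n) if i != j and A[i][j]]
    if not off:
        return "uncoupled"
    if all(i > j for i, j in off) or all(i < j for i, j in off):
        return "decoupled"
    return "coupled"
```

    In the medication-distribution setting, an uncoupled or decoupled matrix means storage location, stocking, and delivery decisions can be made without circular interference between requirements.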

  14. New Methods in Design Education: The Systemic Methodology and the Use of Sketch in the Conceptual Design Stage

    ERIC Educational Resources Information Center

    Westermeyer, Juan Carlos Briede; Ortuno, Bernabe Hernandis

    2011-01-01

    This study describes the application of a new concurrent product design methodology in the context of industrial design education. The sketch has often been used as a tool of creative expression, especially in the conceptual design stage, in an intuitive way and somewhat removed from the context of the real needs that the…

  15. A Methodology for Instructional Design in Mathematics--With the Generic and Epistemic Student at the Centre

    ERIC Educational Resources Information Center

    Strømskag, Heidi

    2017-01-01

    This theoretical paper presents a methodology for instructional design in mathematics. It is a theoretical analysis of a proposed model for instructional design, where tasks are embedded in situations that preserve meaning with respect to particular pieces of mathematical knowledge. The model is applicable when there is an intention of teaching…

  16. Development of an Expert Judgement Elicitation and Calibration Methodology for Risk Analysis in Conceptual Vehicle Design

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina

    2004-01-01

    A comprehensive expert-judgment elicitation methodology for quantifying input parameter uncertainty and analysis tool uncertainty in conceptual launch vehicle design has been developed. The ten-phase methodology obtains expert opinion to quantify uncertainties as probability distributions so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology aim to improve individual expert estimates and provide an approach to aggregating multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology's development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results, is presented, along with a discussion of possible future steps in this research area.

  17. Recent activities within the Aeroservoelasticity Branch at the NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Noll, Thomas E.; Perry, Boyd, III; Gilbert, Michael G.

    1989-01-01

    The objective of research in aeroservoelasticity at the NASA Langley Research Center is to enhance the modeling, analysis, and multidisciplinary design methodologies for obtaining multifunction digital control systems for application to flexible flight vehicles. Recent accomplishments are discussed, and a status report on current activities within the Aeroservoelasticity Branch is presented. In the area of modeling, improvements to the Minimum-State Method of approximating unsteady aerodynamics are shown to provide precise, low-order aeroservoelastic models for design and simulation activities. Analytical methods based on Matched Filter Theory and Random Process Theory that provide efficient and direct predictions of the critical gust profile and the time-correlated gust loads for linear structural design considerations are also discussed. Two research projects leading towards improved design methodology are summarized: the first is developing an integrated structure/control design capability based on hierarchical problem decomposition, multilevel optimization, and analytical sensitivities; the second provides procedures for obtaining low-order, robust digital control laws for aeroelastic applications. In terms of methodology validation and application, the current activities associated with the Active Flexible Wing project are reviewed.

  18. Patients' perspective of the design of provider-patients electronic communication services.

    PubMed

    Silhavy, Petr; Silhavy, Radek; Prokopova, Zdenka

    2014-06-12

    Information delivery is one of the most important tasks in healthcare practice. This article discusses patients' tasks and perspectives, which are then used to design a new Effective Electronic Methodology. The system design methods applicable to electronic communication in the healthcare sector are also described. The architecture and the methodology for the healthcare service portal are set out in the proposed system design.

  19. Reliability based design optimization: Formulations and methodologies

    NASA Astrophysics Data System (ADS)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. 
A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.
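
    The nested formulation described above can be illustrated with a minimal sketch: an outer loop searches over a design variable while an inner reliability analysis estimates the failure probability for each candidate. The beam-thickness problem, distributions, and numbers below are invented for illustration and are far simpler than the methods developed in the dissertation.

```python
import random

# Nested RBDO sketch: outer design loop, inner Monte Carlo reliability loop.
def failure_probability(d, n=20000, seed=0):
    """Inner loop: estimate P[g < 0] for limit state g = capacity - load."""
    rng = random.Random(seed)
    capacity = 50.0 * d  # member capacity grows linearly with thickness d
    failures = sum(1 for _ in range(n)
                   if capacity - rng.gauss(100.0, 20.0) < 0.0)
    return failures / n

def nested_rbdo(p_target=0.01):
    """Outer loop: smallest (cheapest) thickness meeting the reliability target."""
    for d in [x / 10.0 for x in range(10, 50)]:  # candidate designs 1.0..4.9
        if failure_probability(d) <= p_target:
            return d
    return None

d_opt = nested_rbdo()
print(d_opt)
```

    Every outer iteration pays for a full inner reliability analysis, which is exactly the cost that the unilevel and decoupled formulations in the abstract aim to reduce.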

  20. Nonlinear and adaptive control

    NASA Technical Reports Server (NTRS)

    Athans, Michael

    1989-01-01

    The primary thrust of the research was to conduct fundamental research in the theories and methodologies for designing complex high-performance multivariable feedback control systems, and to conduct feasibility studies in application areas of interest to NASA sponsors that point out advantages and shortcomings of available control system design methodologies.

  1. Interrogating discourse: the application of Foucault's methodological discussion to specific inquiry.

    PubMed

    Fadyl, Joanna K; Nicholls, David A; McPherson, Kathryn M

    2013-09-01

    Discourse analysis following the work of Michel Foucault has become a valuable methodology in the critical analysis of a broad range of topics relating to health. However, it can be a daunting task, in that there seems to be both a huge number of possible approaches to carrying out this type of project, and an abundance of different, often conflicting, opinions about what counts as 'Foucauldian'. This article takes the position that methodological design should be informed by ongoing discussion and applied as appropriate to a particular area of inquiry. The discussion given offers an interpretation and application of Foucault's methodological principles, integrating a reading of Foucault with applications of his work by other authors, showing how this is then applied to interrogate the practice of vocational rehabilitation. It is intended as a contribution to methodological discussion in this area, offering an interpretation of various methodological elements described by Foucault, alongside specific application of these aspects.

  2. Design consideration of resonance inverters with electro-technological application

    NASA Astrophysics Data System (ADS)

    Hinov, Nikolay

    2017-12-01

    This study presents design considerations for resonance inverters with electro-technological applications. The presented methodology is the result of the author's investigations and analyses of different types and operating regimes of resonance inverters. Schemes of resonant inverters without inverse diodes are considered. The first harmonic method is used in the analysis and design; for inverters with electro-technological applications this method gives very good accuracy without requiring a complex and heavy mathematical apparatus. The proposed methodology is easy to use and is suitable for training students in power electronics. The validity of the results is confirmed by simulation and by research on physical prototypes.
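
    The first harmonic method can be sketched in a few lines: replace the square-wave inverter output with its fundamental component and analyze the resonant tank as a linear AC circuit. The component values below are illustrative, not from the study.

```python
import math

# First-harmonic (fundamental) approximation for a series resonant inverter.
def first_harmonic_current(Vdc, R, L, C, f):
    """RMS load current, assuming only the square wave's fundamental acts."""
    v1_rms = (4.0 * Vdc / math.pi) / math.sqrt(2.0)   # fundamental, RMS
    w = 2.0 * math.pi * f
    z = math.sqrt(R**2 + (w * L - 1.0 / (w * C))**2)  # series RLC impedance
    return v1_rms / z

# At resonance the reactances cancel and the tank looks purely resistive.
f_res = 1.0 / (2.0 * math.pi * math.sqrt(100e-6 * 1e-6))
i_rms = first_harmonic_current(Vdc=100.0, R=5.0, L=100e-6, C=1e-6, f=f_res)
print(round(i_rms, 2))  # prints 18.01
```

    At resonance the current is simply the fundamental RMS voltage divided by R, which is why the method is so convenient near the operating point of a resonant inverter.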

  3. Application Of The Iberdrola Licensing Methodology To The Cofrentes BWR-6 110% Extended Power Up-rate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mata, Pedro; Fuente, Rafael de la; Iglesias, Javier

    Iberdrola (Spanish utility) and Iberdrola Ingenieria (its engineering branch) have spent the last two years developing the 110% Extended Power Up-rate Project (EPU 110%) for Cofrentes BWR-6. IBERDROLA has an in-house design and licensing reload methodology that has been approved by the Spanish Nuclear Regulatory Authority. This methodology has already been used to perform the nuclear design and the reload licensing analysis for Cofrentes cycles 12 to 14. The methodology has also been applied to develop a significant number of safety analyses for the Cofrentes Extended Power Up-rate, including: Reactor Heat Balance, Core and Fuel Performance, Thermal-Hydraulic Stability, ECCS LOCA Evaluation, Transient Analysis, Anticipated Transient Without Scram (ATWS), and Station Blackout (SBO). Since the scope of the licensing process of the Cofrentes Extended Power Up-rate exceeds the range of analysis included in the Cofrentes generic reload licensing process, it has been necessary to extend the applicability of the Cofrentes licensing methodology to the analysis of new transients. This is the case for the TLFW transient. This paper shows the benefits of having an in-house design and licensing methodology and describes the process of extending the applicability of the methodology to the analysis of new transients. The analysis of a total loss of feedwater with the Cofrentes RETRAN model is included as an example of this process. (authors)

  4. Intelligent systems/software engineering methodology - A process to manage cost and risk

    NASA Technical Reports Server (NTRS)

    Friedlander, Carl; Lehrer, Nancy

    1991-01-01

    A systems development methodology is discussed that has been successfully applied to the construction of a number of intelligent systems. This methodology is a refinement of both evolutionary and spiral development methodologies. It is appropriate for development of intelligent systems. The application of advanced engineering methodology to the development of software products and intelligent systems is an important step toward supporting the transition of AI technology into aerospace applications. A description of the methodology and the process model from which it derives is given. Associated documents and tools are described which are used to manage the development process and record and report the emerging design.

  5. Application of type synthesis theory to the redesign of a complex surgical instrument.

    PubMed

    Lim, Jonas J B; Erdman, Arthur G

    2002-06-01

    Surgical instruments consist of basic mechanical components such as gears, links, pivots, and sliders, which are common in mechanical design. This paper describes the application of a method to the analysis and design of complex surgical instruments such as those employed in laparoscopic surgery. This is believed to be the first application of type synthesis theory to a complex medical instrument. Type synthesis is a methodology that can be applied during the conceptual phase of mechanical design. A handle assembly from a patented laparoscopic surgical stapler is used to illustrate the application of the design method developed. Type synthesis is applied to specific subsystems of the mechanism within the handle assembly, where alternative design concepts are generated. Chosen concepts are then combined to form a new conceptual design for the handle assembly. The new handle assembly is improved because it has fewer parts, is simpler in design, and is easier to assemble. Surgical instrument designers may use the methodology presented here to analyze the mechanical subsystems within complex instruments and to create new options that may offer improvements over the original design.

  6. A literature review of applied adaptive design methodology within the field of oncology in randomised controlled trials and a proposed extension to the CONSORT guidelines.

    PubMed

    Mistry, Pankaj; Dunn, Janet A; Marshall, Andrea

    2017-07-18

    The application of adaptive design methodology within a clinical trial setting is becoming increasingly popular. However, trials applying these methods often do not report them as adaptive designs, making it more difficult to capture the emerging use of these designs. Within this review, we aim to understand how adaptive design methodology is being reported, whether these methods are explicitly described as an 'adaptive design' or whether this has to be inferred, and whether these methods are applied prospectively or concurrently. Three databases (Embase, Ovid and PubMed) were chosen to conduct the literature search. The inclusion criteria for the review were phase II, phase III and phase II/III randomised controlled trials within the field of oncology that published trial results in 2015. A variety of search terms related to adaptive designs were used. A total of 734 results were identified; after screening, 54 were eligible. Adaptive designs were more commonly applied in phase III confirmatory trials. The majority of the papers performed an interim analysis that included some form of stopping criteria. Only two papers explicitly used the term 'adaptive design', so for most of the papers it had to be inferred that adaptive methods were applied. Sixty-five applications of adaptive design methods were identified, of which the most common was adaptation using group sequential methods. This review indicates that the reporting of adaptive design methodology within clinical trials needs improving. The proposed extension to the current CONSORT 2010 guidelines could help capture adaptive design methods and provide an essential aid to those involved with clinical trials.
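
    The most common adaptation found by the review, group sequential methods, can be illustrated with a small simulation: a two-look design rejects early if the interim z-statistic crosses a constant (Pocock-style) boundary, and the boundary is chosen so the overall type I error stays at the nominal level. The setup below is a generic illustration, not a trial from the review; 2.178 is the standard two-look Pocock constant for two-sided alpha = 0.05.

```python
import math
import random

# Monte Carlo check of a two-look group sequential design under the null:
# with the Pocock boundary, the overall type I error stays near 0.05.
def type_I_error(c=2.178, sims=100_000, seed=1):
    rng = random.Random(seed)
    rejections = 0
    for _ in range(sims):
        inc1 = rng.gauss(0.0, 1.0)           # information increment, look 1
        inc2 = rng.gauss(0.0, 1.0)           # information increment, look 2
        z1 = inc1                            # z-statistic at the interim
        z2 = (inc1 + inc2) / math.sqrt(2.0)  # z-statistic at the final look
        if abs(z1) > c or abs(z2) > c:       # stop early or reject at the end
            rejections += 1
    return rejections / sims

alpha = type_I_error()
print(alpha)  # close to the nominal 0.05
```

    Note that each individual look uses a critical value well above 1.96; this is the price a group sequential design pays for the option of stopping early.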

  7. Topology optimization and laser additive manufacturing in design process of efficiency lightweight aerospace parts

    NASA Astrophysics Data System (ADS)

    Fetisov, K. V.; Maksimov, P. V.

    2018-05-01

    The paper presents the application of topology optimization and laser additive manufacturing to the design of lightweight aerospace parts. It begins with a brief overview of the SIMP topology optimization algorithm, one of the most commonly used algorithms in FEA software. The methodology of designing parts using topology optimization is then discussed, along with issues related to designing for additive manufacturing. In conclusion, the practical application of the proposed methodologies is demonstrated on one complex assembly unit: as a result of the new design approach, the mass of the product was reduced by a factor of five, and twenty parts were replaced by one.
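
    The SIMP algorithm mentioned above rests on a simple density-to-stiffness interpolation, which can be sketched directly (values illustrative):

```python
# SIMP stiffness interpolation: the penalization that drives intermediate
# element densities toward 0 (void) or 1 (solid) during optimization.
def simp_modulus(rho, E0=1.0, Emin=1e-9, p=3.0):
    """Young's modulus of an element with density rho under SIMP."""
    return Emin + (rho ** p) * (E0 - Emin)

# An intermediate density of 0.5 contributes only 12.5% stiffness at p = 3,
# so the optimizer is pushed toward crisp solid/void designs.
print(simp_modulus(1.0), simp_modulus(0.5))
```

    The small nonzero floor Emin keeps the finite element stiffness matrix nonsingular for void elements.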

  8. Characteristics of a semi-custom library development system

    NASA Technical Reports Server (NTRS)

    Yancey, M.; Cannon, R.

    1990-01-01

    Standard cell and gate array macro libraries are in common use with workstation computer-aided design (CAD) tools for application-specific integrated circuit (ASIC) semi-custom applications and have resulted in significant improvements in overall design efficiency as contrasted with custom design methodologies. Similar design methodology enhancements providing for the efficient development of the library cells are an important factor in responding to the need for continuous technology improvement. The characteristics of a library development system that provides design flexibility and productivity enhancements for library development engineers as they deliver libraries in state-of-the-art process technologies are presented. An overview of Gould's library development system ('Accolade') is also presented.

  9. Maximum Entropy/Optimal Projection (MEOP) control design synthesis: Optimal quantification of the major design tradeoffs

    NASA Technical Reports Server (NTRS)

    Hyland, D. C.; Bernstein, D. S.

    1987-01-01

    The underlying philosophy and motivation of the optimal projection/maximum entropy (OP/ME) stochastic modeling and reduced control design methodology for high order systems with parameter uncertainties are discussed. The OP/ME design equations for reduced-order dynamic compensation including the effect of parameter uncertainties are reviewed. The application of the methodology to several Large Space Structures (LSS) problems of representative complexity is illustrated.

  10. Patients’ Perspective of the Design of Provider-Patients Electronic Communication Services

    PubMed Central

    Silhavy, Petr; Silhavy, Radek; Prokopova, Zdenka

    2014-01-01

    Information delivery is one of the most important tasks in healthcare practice. This article discusses patients' tasks and perspectives, which are then used to design a new Effective Electronic Methodology. The system design methods applicable to electronic communication in the healthcare sector are also described. The architecture and the methodology for the healthcare service portal are set out in the proposed system design. PMID:24927038

  11. Novel thermal management system design methodology for power lithium-ion battery

    NASA Astrophysics Data System (ADS)

    Nieto, Nerea; Díaz, Luis; Gastelurrutia, Jon; Blanco, Francisco; Ramos, Juan Carlos; Rivas, Alejandro

    2014-12-01

    Battery packs composed of large-format lithium-ion cells are increasingly being adopted in hybrid and pure electric vehicles in order to use energy more efficiently and for better environmental performance. Safety and cycle life are two of the main concerns regarding this technology; both are closely related to the cells' operating behavior and to temperature asymmetries in the system. Therefore, the temperature of the cells in battery packs needs to be controlled by thermal management systems (TMSs). In the present paper an improved design methodology for developing TMSs is proposed. This methodology involves the development of different mathematical models for heat generation, transmission, and dissipation, and their coupling and integration into the battery pack product design methodology in order to improve overall safety and performance. The methodology is validated by comparing simulation results with laboratory measurements on a single module of the battery pack designed at IK4-IKERLAN for a traction application. The maximum difference between model predictions and experimental temperature data is 2 °C. The models developed have shown potential for use in battery thermal management studies for EV/HEV applications since they allow for scalability with accuracy and reasonable simulation time.
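
    A minimal example of the kind of heat generation/dissipation balance such TMS models couple is a lumped-capacitance model of a single cell with Joule heating and convective cooling; all parameter values below are illustrative, not the paper's.

```python
# Lumped-capacitance cell model: forward-Euler integration of the energy
# balance  m*cp*dT/dt = Q_gen - h*A*(T - T_amb).
def cell_temperature(q_gen=2.0, h=10.0, area=0.02, mass=0.45, cp=1000.0,
                     t_amb=25.0, dt=1.0, t_end=3600.0):
    """Cell temperature in deg C after t_end seconds of constant heating."""
    T = t_amb
    for _ in range(int(t_end / dt)):
        T += dt * (q_gen - h * area * (T - t_amb)) / (mass * cp)
    return T

# Steady state would be T_amb + q_gen / (h * area) = 35 deg C; after one
# hour the cell is most of the way there.
print(round(cell_temperature(), 1))  # prints 33.0
```

    Coupling many such cell models through conduction and coolant flow paths is what turns a sketch like this into a pack-level TMS model.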

  12. An applicational process for dynamic balancing of turbomachinery shafting

    NASA Technical Reports Server (NTRS)

    Verhoff, Vincent G.

    1990-01-01

    The NASA Lewis Research Center has developed and implemented a time-efficient methodology for dynamically balancing turbomachinery shafting. This methodology minimizes costly facility downtime by using a balancing arbor (mandrel) that simulates the turbomachinery (rig) shafting. The need for precision dynamic balancing of turbomachinery shafting and for a dynamic balancing methodology is discussed in detail. Additionally, the inherent problems (and their causes and effects) associated with unbalanced turbomachinery shafting as a function of increasing shaft rotational speeds are discussed. Included are the design criteria concerning rotor weight differentials for rotors made of different materials that have similar parameters and shafting. The balancing methodology for applications where rotor replaceability is a requirement is also covered. This report is intended for use as a reference when designing, fabricating, and troubleshooting turbomachinery shafting.

  13. Verification methodology for fault-tolerant, fail-safe computers applied to maglev control computer systems. Final report, July 1991-May 1993

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lala, J.H.; Nagle, G.A.; Harper, R.E.

    1993-05-01

    The Maglev control computer system should be designed to verifiably possess high reliability and safety, as well as high availability, to make maglev a dependable and attractive transportation alternative to the public. A maglev control computer system has been designed using a design-for-validation methodology developed earlier under NASA and SDIO sponsorship for real-time aerospace applications. The present study starts by defining the maglev mission scenario and ends with the definition of a maglev control computer architecture. Key intermediate steps included definitions of functional and dependability requirements, synthesis of two candidate architectures, development of qualitative and quantitative evaluation criteria, and analytical modeling of the dependability characteristics of the two architectures. Finally, the applicability of the design-for-validation methodology was also illustrated by applying it to the German Transrapid TR07 maglev control system.

  14. Investigation of Weibull statistics in fracture analysis of cast aluminum

    NASA Technical Reports Server (NTRS)

    Holland, Frederic A., Jr.; Zaretsky, Erwin V.

    1989-01-01

    The fracture strengths of two large batches of A357-T6 cast aluminum coupon specimens were compared by using two-parameter Weibull analysis. The minimum number of these specimens necessary to find the fracture strength of the material was determined. The applicability of three-parameter Weibull analysis was also investigated. A design methodology based on the combination of elementary stress analysis and Weibull statistical analysis is advanced and applied to the design of a spherical pressure vessel shell. The results from this design methodology are compared with results from the applicable ASME pressure vessel code.
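
    A two-parameter Weibull fit of the kind described can be sketched with median-rank regression on the linearized CDF, ln(-ln(1 - F)) = m ln(sigma) - m ln(sigma0); the strength values below are invented for illustration, not the A357-T6 measurements.

```python
import math

# Two-parameter Weibull sketch: estimate the Weibull modulus m (slope) and
# the characteristic strength sigma0 from ranked fracture strengths.
strengths = sorted([38.1, 41.5, 43.2, 44.0, 45.8, 47.3, 48.9, 51.2])
n = len(strengths)

# Median-rank plotting positions F_i = (i - 0.3) / (n + 0.4), then linearize.
xs = [math.log(s) for s in strengths]
ys = [math.log(-math.log(1.0 - (i + 1 - 0.3) / (n + 0.4))) for i in range(n)]

xbar = sum(xs) / n
ybar = sum(ys) / n
m = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
     / sum((x - xbar) ** 2 for x in xs))   # Weibull modulus (slope)
sigma0 = math.exp(xbar - ybar / m)         # strength at 63.2% failure prob.

print(round(m, 1), round(sigma0, 1))
```

    A higher modulus m means less scatter in strength, which is exactly what drives the minimum sample sizes and design allowables discussed in the report.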

  15. Guidelines for the Design and Conduct of Clinical Studies in Knee Articular Cartilage Repair

    PubMed Central

    Mithoefer, Kai; Saris, Daniel B.F.; Farr, Jack; Kon, Elizaveta; Zaslav, Kenneth; Cole, Brian J.; Ranstam, Jonas; Yao, Jian; Shive, Matthew; Levine, David; Dalemans, Wilfried; Brittberg, Mats

    2011-01-01

    Objective: To summarize current clinical research practice and develop methodological standards for objective scientific evaluation of knee cartilage repair procedures and products. Design: A comprehensive literature review was performed of high-level original studies providing information relevant for the design of clinical studies on articular cartilage repair in the knee. Analysis of cartilage repair publications and synopses of ongoing trials were used to identify important criteria for the design, reporting, and interpretation of studies in this field. Results: Current literature reflects the methodological limitations of the scientific evidence available for articular cartilage repair. However, clinical trial databases of ongoing trials document a trend suggesting improved study designs and clinical evaluation methodology. Based on the current scientific information and standards of clinical care, detailed methodological recommendations were developed for the statistical study design, patient recruitment, control group considerations, study endpoint definition, documentation of results, use of validated patient-reported outcome instruments, and inclusion and exclusion criteria for the design and conduct of scientifically sound cartilage repair study protocols. A consensus statement among the International Cartilage Repair Society (ICRS) and contributing authors experienced in clinical trial design and implementation was achieved. Conclusions: High-quality clinical research methodology is critical for the optimal evaluation of current and new cartilage repair technologies. In addition to generally applicable principles for orthopedic study design, specific criteria and considerations apply to cartilage repair studies. Systematic application of these criteria and considerations can facilitate study designs that are scientifically rigorous, ethical, practical, and appropriate for the question(s) being addressed in any given cartilage repair research project. 
PMID:26069574

  16. Calibration Modeling Methodology to Optimize Performance for Low Range Applications

    NASA Technical Reports Server (NTRS)

    McCollum, Raymond A.; Commo, Sean A.; Parker, Peter A.

    2010-01-01

    Calibration is a vital process in characterizing the performance of an instrument in an application environment and seeks to obtain acceptable accuracy over the entire design range. Often, project requirements specify a maximum total measurement uncertainty expressed as a percent of full-scale. However, in some applications we seek enhanced performance at the low end of the range, and expressing the accuracy as a percent of reading should then be considered as a modeling strategy. For example, it is common to want to use a force balance in multiple facilities or regimes, often well below its designed full-scale capacity. This paper presents a general statistical methodology for optimizing calibration mathematical models based on a percent-of-reading accuracy requirement, which has broad application in all types of transducer applications where low-range performance is required. A case study illustrates the proposed methodology for the Mars Entry Atmospheric Data System, which employs seven strain-gage-based pressure transducers mounted on the heatshield of the Mars Science Laboratory mission.
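
    The contrast between the two accuracy specifications is easy to make concrete: a percent-of-full-scale bound is constant across the range, while a percent-of-reading bound shrinks at the low end. The instrument range and percentages below are illustrative.

```python
# Allowable measurement error under the two accuracy specifications.
def allowable_error_full_scale(reading, full_scale=100.0, pct=0.5):
    return pct / 100.0 * full_scale   # same absolute bound everywhere

def allowable_error_of_reading(reading, pct=0.5):
    return pct / 100.0 * reading      # bound tightens at low readings

for r in (5.0, 50.0, 100.0):
    print(r, allowable_error_full_scale(r), allowable_error_of_reading(r))
```

    At 5% of range, the percent-of-reading specification is twenty times tighter than the percent-of-full-scale one, which is why it drives the calibration model toward low-range fidelity.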

  17. Design, Progressive Modeling, Manufacture, and Testing of Composite Shield for Turbine Engine Blade Containment

    NASA Technical Reports Server (NTRS)

    Binienda, Wieslaw K.; Sancaktar, Erol; Roberts, Gary D. (Technical Monitor)

    2002-01-01

    An effective design methodology was established for composite jet engine containment structures. The methodology included the development of full and reduced-size prototypes and FEA models of the containment structure, experimental and numerical examination of the modes of failure due to a turbine blade-out event, identification of materials and design candidates for future industrial applications, and the design and building of prototypes for testing and evaluation purposes.

  18. Integration Methodology For Oil-Free Shaft Support Systems: Four Steps to Success

    NASA Technical Reports Server (NTRS)

    Howard, Samuel A.; DellaCorte, Christopher; Bruckner, Robert J.

    2010-01-01

    Commercial applications for Oil-Free turbomachinery are slowly becoming a reality. Micro-turbine generators, high-speed electric motors, and electrically driven centrifugal blowers are a few examples of products available in today's commercial marketplace. Gas foil bearing technology makes most of these applications possible. A significant volume of component-level research has led to recent acceptance of gas foil bearings in several specialized applications, including those mentioned above. Component tests identifying such characteristics as load capacity, power loss, thermal behavior, and rotordynamic coefficients all help the engineer design foil bearing machines, but the development process can be just as important. As the technology gains momentum and acceptance in a wider array of machinery, the complexity and variety of applications will grow beyond the current class of machines. Following a robust integration methodology will help improve the probability of successful development of future Oil-Free turbomachinery. This paper describes a previously successful four-step integration methodology used in the development of several Oil-Free turbomachines. Proper application of the methods put forward here enables the successful design of Oil-Free turbomachinery; in addition, when significant design changes or unique machinery are developed, this four-step process must be considered.

  19. A methodological framework to support the initiation, design and institutionalization of participatory modeling processes in water resources management

    NASA Astrophysics Data System (ADS)

    Halbe, Johannes; Pahl-Wostl, Claudia; Adamowski, Jan

    2018-01-01

    Multiple barriers constrain the widespread application of participatory methods in water management, including the more technical focus of most water agencies, additional cost and time requirements for stakeholder involvement, as well as institutional structures that impede collaborative management. This paper presents a stepwise methodological framework that addresses the challenges of context-sensitive initiation, design and institutionalization of participatory modeling processes. The methodological framework consists of five successive stages: (1) problem framing and stakeholder analysis, (2) process design, (3) individual modeling, (4) group model building, and (5) institutionalized participatory modeling. The Management and Transition Framework is used for problem diagnosis (Stage One), context-sensitive process design (Stage Two) and analysis of requirements for the institutionalization of participatory water management (Stage Five). Conceptual modeling is used to initiate participatory modeling processes (Stage Three) and ensure a high compatibility with quantitative modeling approaches (Stage Four). This paper describes the proposed participatory model building (PMB) framework and provides a case study of its application in Québec, Canada. The results of the Québec study demonstrate the applicability of the PMB framework for initiating and designing participatory model building processes and analyzing barriers towards institutionalization.

  20. Structural Sizing Methodology for the Tendon-Actuated Lightweight In-Space MANipulator (TALISMAN) System

    NASA Technical Reports Server (NTRS)

    Jones, Thomas C.; Dorsey, John T.; Doggett, William R.

    2015-01-01

    The Tendon-Actuated Lightweight In-Space MANipulator (TALISMAN) is a versatile long-reach robotic manipulator that is currently being tested at NASA Langley Research Center. TALISMAN is designed to be highly mass-efficient and multi-mission capable, with applications including asteroid retrieval and manipulation, in-space servicing, and astronaut and payload positioning. The manipulator uses a modular, periodic, tension-compression design that lends itself well to analytical modeling. Given the versatility of application for TALISMAN, a structural sizing methodology was developed that could rapidly assess mass and configuration sensitivities for any specified operating work space, applied loads and mission requirements. This methodology allows the systematic sizing of the key structural members of TALISMAN, which include the truss arm links, the spreaders and the tension elements. This paper summarizes the detailed analytical derivations and methodology that support the structural sizing approach and provides results from some recent TALISMAN designs developed for current and proposed mission architectures.

  1. Application of the IBERDROLA RETRAN Licensing Methodology to the Cofrentes BWR-6 110% Extended Power Uprate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fuente, Rafael de la; Iglesias, Javier; Sedano, Pablo G.

    IBERDROLA (Spanish utility) and IBERDROLA INGENIERIA (its engineering branch) have spent the last two years developing the 110% Extended Power Uprate Project for Cofrentes BWR-6. IBERDROLA has an in-house reload design and licensing methodology that has been approved in advance by the Spanish Nuclear Regulatory Authority. This methodology has been applied to perform the nuclear design and the reload licensing analysis for Cofrentes cycles 12 and 13 and to develop a significant number of safety analyses for the Cofrentes Extended Power Uprate. Because the scope of the licensing process for the Cofrentes Extended Power Uprate exceeds the range of analysis included in the Cofrentes generic reload licensing process, the applicability of the Cofrentes RETRAN model had to be extended to the analysis of new transients. This is the case for the total loss of feedwater (TLFW) transient. This paper shows the benefits of having an in-house design and licensing methodology and describes the process of extending the applicability of the Cofrentes RETRAN model to the analysis of new transients, in particular the TLFW transient.

  2. Propulsion integration of hypersonic air-breathing vehicles utilizing a top-down design methodology

    NASA Astrophysics Data System (ADS)

    Kirkpatrick, Brad Kenneth

    In recent years, a focus of aerospace engineering design has been the development of advanced design methodologies and frameworks to account for increasingly complex and integrated vehicles. Techniques such as parametric modeling, global vehicle analyses, and interdisciplinary data sharing have been employed in an attempt to improve the design process. The purpose of this study is to introduce a new approach to integrated vehicle design known as the top-down design methodology. In the top-down design methodology, the main idea is to relate design changes on the vehicle system and sub-system level to a set of over-arching performance and customer requirements. Rather than focusing on the performance of an individual system, the system is analyzed in terms of the net effect it has on the overall vehicle and other vehicle systems. This detailed level of analysis can only be accomplished through the use of high fidelity computational tools such as Computational Fluid Dynamics (CFD) or Finite Element Analysis (FEA). The utility of the top-down design methodology is investigated through its application to the conceptual and preliminary design of a long-range hypersonic air-breathing vehicle for a hypothetical next generation hypersonic vehicle (NHRV) program. System-level design is demonstrated through the development of the nozzle section of the propulsion system. From this demonstration of the methodology, conclusions are made about the benefits, drawbacks, and cost of using the methodology.

  3. A manual for conducting environmental impact studies.

    DOT National Transportation Integrated Search

    1971-01-01

    This report suggests methodologies which should enable an interdisciplinary team to assess community values. The methodologies are applicable to the conceptual, location, and design phases of highway planning, respectively. The approach employs a wei...

  4. Design and Customization of Telemedicine Systems

    PubMed Central

    Martínez-Alcalá, Claudia I.; Muñoz, Mirna; Monguet-Fierro, Josep

    2013-01-01

    In recent years, advances in information and communication technology (ICT) have resulted in the development of systems and applications aimed at supporting rehabilitation therapy, which contributes to enriching patients' quality of life. This work focuses on improving telemedicine systems so that therapies can be customized according to the profile and disability of each patient. To this end, as its salient contribution, this work proposes the adoption of the user-centered design (UCD) methodology for the design and development of telemedicine systems that support the rehabilitation of patients with neurological disorders. Finally, some applications of the UCD methodology in the telemedicine field are presented as a proof of concept. PMID:23762191

  5. Application of an integrated flight/propulsion control design methodology to a STOVL aircraft

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay; Mattern, Duane L.

    1991-01-01

    Results are presented from the application of an emerging Integrated Flight/Propulsion Control (IFPC) design methodology to a Short Take Off and Vertical Landing (STOVL) aircraft in transition flight. The steps in the methodology consist of designing command shaping prefilters to provide the overall desired response to pilot command inputs. A previously designed centralized controller is first validated for the integrated airframe/engine plant used. This integrated plant is derived from a different model of the engine subsystem than the one used for the centralized controller design. The centralized controller is then partitioned in a decentralized, hierarchical structure comprising airframe lateral and longitudinal subcontrollers and an engine subcontroller. Command shaping prefilters from the pilot control effector inputs are then designed, and time histories of the closed-loop IFPC system response to simulated pilot commands are compared to desired responses based on handling qualities requirements. Finally, the propulsion system safety and nonlinear limit protection logic is wrapped around the engine subcontroller, and the response of the closed-loop integrated system is evaluated for transients that encounter the propulsion surge margin limit.

  6. Design of integrated pitch axis for autopilot/autothrottle and integrated lateral axis for autopilot/yaw damper for NASA TSRV airplane using integral LQG methodology

    NASA Technical Reports Server (NTRS)

    Kaminer, Isaac; Benson, Russell A.; Coleman, Edward E.; Ebrahimi, Yaghoob S.

    1990-01-01

    Two designs are presented for control systems for the NASA Transport System Research Vehicle (TSRV) using integral Linear Quadratic Gaussian (LQG) methodology. The first is an integrated longitudinal autopilot/autothrottle design and the second design is an integrated lateral autopilot/yaw damper/sideslip controller design. It is shown that a systematic top-down approach to a complex design problem combined with proper application of modern control synthesis techniques yields a satisfactory solution in a reasonable period of time.
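    The full integral LQG designs are not reproduced in the abstract, but the core LQR step they build on, solving an algebraic Riccati equation for a state-feedback gain, can be sketched on a toy pitch-axis model. The double-integrator plant and weights below are illustrative assumptions, not the TSRV model:

    ```python
    import numpy as np
    from scipy.linalg import solve_continuous_are

    # Toy pitch-axis plant: x = [theta, theta_dot], double integrator
    A = np.array([[0.0, 1.0],
                  [0.0, 0.0]])
    B = np.array([[0.0],
                  [1.0]])
    Q = np.diag([10.0, 1.0])   # state weights (illustrative)
    R = np.array([[1.0]])      # control weight (illustrative)

    # Solve the continuous-time algebraic Riccati equation, then form
    # the LQR gain K = R^-1 B^T P
    P = solve_continuous_are(A, B, Q, R)
    K = np.linalg.solve(R, B.T @ P)

    # The closed-loop dynamics A - B K should be stable
    # (all eigenvalues in the open left half-plane)
    eigs = np.linalg.eigvals(A - B @ K)
    ```

    The integral-LQG variant used in the paper augments such a plant with integrator states on tracked outputs and pairs the regulator with a Kalman filter; the Riccati machinery is the same.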

  7. A Formal Methodology to Design and Deploy Dependable Wireless Sensor Networks

    PubMed Central

    Testa, Alessandro; Cinque, Marcello; Coronato, Antonio; Augusto, Juan Carlos

    2016-01-01

    Wireless Sensor Networks (WSNs) are being increasingly adopted in critical applications, where verifying the correct operation of sensor nodes is a major concern. Undesired events may undermine the mission of the WSN. Hence, their effects need to be properly assessed before deployment, to obtain a good level of expected performance, and during operation, in order to avoid dangerous unexpected results. In this paper, we propose a methodology that aims at assessing and improving the dependability level of WSNs by means of an event-based formal verification technique. The methodology includes a process to guide designers towards the realization of a dependable WSN and a tool ("ADVISES") to simplify its adoption. The tool is applicable to homogeneous WSNs with static routing topologies. It allows the automatic generation of formal specifications used to check correctness properties and evaluate dependability metrics at design time and at runtime for WSNs in which an acceptable percentage of faults can be defined. At runtime, the behavior of the WSN can be checked against the results obtained at design time, and sudden, unexpected failures can be detected in order to trigger recovery procedures. The effectiveness of the methodology is shown in the context of two proof-of-concept case studies that illustrate how the tool helps drive design choices and check the correctness properties of the WSN at runtime. Although the method scales up to very large WSNs, the applicability of the methodology may be limited by the state-space explosion of the reasoning model, which must be faced by partitioning large topologies into sub-topologies. PMID:28025568
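    As a much simpler stand-in for the formal checks a tool like ADVISES performs, one basic dependability property of a static routing topology, "every node still reaches the sink under a given node failure", reduces to graph reachability. The topology and node names below are invented for illustration:

    ```python
    from collections import deque

    def reaches_sink(links, nodes, sink, failed):
        """Return the set of nodes that can still reach the sink over alive
        nodes, given a set of failed nodes. Links list both directions."""
        alive = nodes - failed
        if sink in failed:
            return set()
        seen, queue = {sink}, deque([sink])
        while queue:
            u = queue.popleft()
            for v in links.get(u, ()):
                if v in alive and v not in seen:
                    seen.add(v)
                    queue.append(v)
        return seen

    # Toy 4-node topology with sink "s" and a redundant route for "c"
    nodes = {"s", "a", "b", "c"}
    links = {"s": ["a", "b"], "a": ["s", "c"], "b": ["s", "c"], "c": ["a", "b"]}

    # With node "a" failed, "c" still reaches the sink through "b"
    print("c" in reaches_sink(links, nodes, "s", {"a"}))  # True
    ```

    A model checker generalizes this kind of query to temporal properties over all reachable system states, which is where the state-space explosion mentioned in the abstract arises.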

  8. Application of Adjoint Methodology in Various Aspects of Sonic Boom Design

    NASA Technical Reports Server (NTRS)

    Rallabhandi, Sriram K.

    2014-01-01

    One of the advances in computational design has been the development of adjoint methods allowing efficient calculation of sensitivities in gradient-based shape optimization. This paper discusses two new applications of adjoint methodology that have been developed to aid in sonic boom mitigation exercises. In the first, equivalent area targets are generated using adjoint sensitivities of selected boom metrics. These targets may then be used to drive the vehicle shape during optimization. The second application is the computation of adjoint sensitivities of boom metrics on the ground with respect to parameters such as flight conditions, propagation sampling rate, and selected inputs to the propagation algorithms. These sensitivities enable the designer to make more informed selections of flight conditions at which the chosen cost functionals are less sensitive.
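    The economy that makes adjoint methods attractive, one extra linear solve yielding sensitivities with respect to arbitrarily many parameters, can be shown on a generic linear model. This is the textbook discrete-adjoint identity, not the boom-propagation code itself:

    ```python
    import numpy as np

    # Forward model A(p) u = b with cost J = c^T u. A single adjoint solve,
    # A^T lam = c, gives dJ/dp = -lam^T (dA/dp) u for ANY number of
    # parameters p, with no additional forward solves.
    rng = np.random.default_rng(0)
    n = 5
    A = np.eye(n) + 0.1 * rng.standard_normal((n, n))
    b = rng.standard_normal(n)
    c = rng.standard_normal(n)
    dA_dp = rng.standard_normal((n, n))   # operator sensitivity to one parameter

    u = np.linalg.solve(A, b)             # one forward solve
    lam = np.linalg.solve(A.T, c)         # one adjoint solve
    dJ_dp_adjoint = -lam @ dA_dp @ u

    # Central finite-difference cross-check: TWO extra forward solves
    # per parameter, which is exactly what the adjoint approach avoids
    eps = 1e-6
    u_p = np.linalg.solve(A + eps * dA_dp, b)
    u_m = np.linalg.solve(A - eps * dA_dp, b)
    dJ_dp_fd = c @ (u_p - u_m) / (2.0 * eps)
    ```

    In the paper's setting, u is the flow/propagation state, J a boom metric, and p a shape or flight-condition parameter; the identity is the same.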

  9. A prototype computerized synthesis methodology for generic space access vehicle (SAV) conceptual design

    NASA Astrophysics Data System (ADS)

    Huang, Xiao

    2006-04-01

    Today's and especially tomorrow's competitive launch vehicle design environment requires the development of a dedicated generic Space Access Vehicle (SAV) design methodology. A total of 115 industrial, research, and academic aircraft, helicopter, missile, and launch vehicle design synthesis methodologies have been evaluated. As the survey indicates, each synthesis methodology tends to focus on a specific flight vehicle configuration, thus precluding the key capability to systematically compare flight vehicle design alternatives. The aim of this research investigation is to provide decision-making bodies and the practicing engineer with a design process and toolbox for robust modeling and simulation of flight vehicles where the ultimate performance characteristics may hinge on numerical subtleties. This will enable the designer of a SAV for the first time to consistently compare different classes of SAV configurations on an impartial basis. This dissertation presents the development steps required for a generic (configuration-independent) hands-on flight vehicle conceptual design synthesis methodology. The process is developed such that it can be applied to any flight vehicle class if desired. In the present context, the methodology has been put into operation for the conceptual design of a tourist Space Access Vehicle. The case study illustrates elements of the design methodology and algorithms for the class of Horizontal Takeoff and Horizontal Landing (HTHL) SAVs. The HTHL SAV design application clearly outlines how the conceptual design process can be centrally organized, executed and documented, with a focus on design transparency, physical understanding and the capability to reproduce results. This approach offers the project lead and creative design team a management process and tool which iteratively refines the individual design logic chosen, leading to mature design methods and algorithms. As illustrated, the HTHL SAV hands-on design methodology offers growth potential in that the same methodology can be continually updated and extended to other SAV configuration concepts, such as the Vertical Takeoff and Vertical Landing (VTVL) SAV class. Having developed, validated and calibrated the methodology for HTHL designs in the 'hands-on' mode, the report provides an outlook on how the methodology will be integrated into a prototype computerized design synthesis software, AVDS-PrADOSAV, in a follow-on step.

  10. Development of Boolean calculus and its applications. [digital systems design

    NASA Technical Reports Server (NTRS)

    Tapia, M. A.

    1980-01-01

    The development of Boolean calculus and its application to digital system design methodologies that reduce system complexity, size, cost, and power requirements while improving speed is discussed. Synthesis procedures for logic circuits are examined, particularly for asynchronous circuits using clock-triggered flip-flops.
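    One elementary operator of Boolean calculus, the Boolean difference dF/dx = F|x=0 XOR F|x=1, indicates exactly when an output is sensitive to an input, a property used in logic synthesis and test generation. A generic illustration (not the specific procedures developed in the report):

    ```python
    def boolean_difference(f, var_index, assignment):
        """Boolean difference dF/dx_i = F(x_i=0) XOR F(x_i=1): equals 1 exactly
        when the output is sensitive to input x_i, at the given values of the
        remaining variables."""
        a0 = list(assignment); a0[var_index] = 0
        a1 = list(assignment); a1[var_index] = 1
        return f(a0) ^ f(a1)

    # F = x0 AND x1: the output depends on x0 only when x1 = 1
    f = lambda x: x[0] & x[1]
    print(boolean_difference(f, 0, [0, 1]))  # 1: sensitive to x0
    print(boolean_difference(f, 0, [0, 0]))  # 0: x0 is masked by x1 = 0
    ```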

  11. Specification and Design Methodologies for High-Speed Fault-Tolerant Array Algorithms and Structures for VLSI.

    DTIC Science & Technology

    1987-06-01

    ...evaluation and chip layout planning for VLSI digital systems. A high-level applicative (functional) language, implemented at UCLA, allows combining of... The complexity of VLSI requires the application of CAD tools at all levels of the design process. In order to be effective, these tools must be adaptive to the specific design. In this project we studied a design method based on the use of applicative languages.

  12. Engineering of solar photocatalytic detoxification and disinfection process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goswami, D.Y.

    1995-12-31

    Use of solar radiation for photocatalytic detoxification and disinfection is a fascinating and fast-developing area. Although scientific research on these processes, especially photocatalytic oxidation, has been conducted for at least the last three decades, industrial/commercial applications, engineering systems and engineering design methodologies have been developed only recently. A number of reactor concepts and designs, including concentrating and non-concentrating types and various methods of catalyst deployment, have been developed. Some of these reactors have been used in field demonstrations of groundwater and wastewater remediation. Recent research has focused on improving catalysts to increase reaction rates, as well as on finding new applications of the process. This paper reviews the latest developments in solar detoxification and disinfection, including catalyst development, industrial/commercial applications, reactor design and engineering system design methodologies. 80 refs., 20 figs., 3 tabs.

  13. Designing application software in wide area network settings

    NASA Technical Reports Server (NTRS)

    Makpangou, Mesaac; Birman, Ken

    1990-01-01

    Progress in methodologies for developing robust local area network software has not been matched by similar results for wide area settings. The design of application software spanning multiple local area environments is examined. For important classes of applications, simple design techniques are presented that yield fault tolerant wide area programs. An implementation of these techniques as a set of tools for use within the ISIS system is described.

  14. Eighth DOD/NASA/FAA Conference on Fibrous Composites in Structural Design, part 1

    NASA Technical Reports Server (NTRS)

    Starnes, James H., Jr. (Compiler); Bohon, Herman L. (Compiler); Garzon, Sherry B. (Compiler)

    1990-01-01

    The status, problems, and requirements in the technical disciplines related to the design of composite structures are discussed. Papers are presented in the areas of applications in design; concepts in design; and methodology in design.

  15. A Methodology for the Design of Application-Specific Cyber-Physical Social Sensing Co-Simulators.

    PubMed

    Sánchez, Borja Bordel; Alcarria, Ramón; Sánchez-Picot, Álvaro; Sánchez-de-Rivera, Diego

    2017-09-22

    Cyber-Physical Social Sensing (CPSS) is a new trend in the context of pervasive sensing. In these new systems, various domains coexist in time, evolve together and influence each other. Thus, application-specific tools are necessary for specifying and validating designs and simulating systems. Nowadays, however, different tools are employed to simulate each domain independently. The main cause of the lack of co-simulation instruments able to simulate all domains together is the extreme difficulty of combining and synchronizing various tools. To reduce that difficulty, an adequate architecture for the final co-simulator must be selected. Therefore, in this paper the authors investigate and propose a methodology for the design of CPSS co-simulation tools. The paper describes the four steps that software architects should follow in order to design the most adequate co-simulator for a certain application, considering the final users' needs and requirements and additional factors such as the development team's experience. Moreover, the first practical use case of the proposed methodology is provided. An experimental validation is also included in order to evaluate the performance of the proposed co-simulator and to determine the correctness of the proposal.
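    The synchronization difficulty the abstract mentions can be illustrated with the simplest co-simulation architecture: two domain simulators advanced in lockstep, exchanging state once per step. All class names and dynamics below are invented toy stand-ins, not the authors' design:

    ```python
    # Minimal lockstep co-simulation loop, assuming each domain simulator
    # exposes a step(dt, inputs) method. A thermostat-like toy example.

    class PhysicalSim:
        """Physical domain: room temperature driven by a heater."""
        def __init__(self):
            self.temp = 20.0
        def step(self, dt, heater_on):
            self.temp += dt * (1.0 if heater_on else -0.5)
            return self.temp

    class SocialSim:
        """Crude 'user behavior' domain: people turn the heater on below 19 C."""
        def step(self, dt, temp):
            return temp < 19.0

    def cosimulate(t_end, dt):
        phys, social = PhysicalSim(), SocialSim()
        heater_on, trace, t = False, [], 0.0
        while t < t_end:
            temp = phys.step(dt, heater_on)    # advance physical domain
            heater_on = social.step(dt, temp)  # advance social domain on new state
            trace.append((round(t + dt, 6), temp, heater_on))
            t += dt
        return trace

    trace = cosimulate(5.0, 1.0)
    ```

    Real co-simulators must also reconcile different native time steps and event semantics across tools, which is precisely the architectural problem the proposed methodology addresses.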

  16. A Methodology for the Design of Application-Specific Cyber-Physical Social Sensing Co-Simulators

    PubMed Central

    Sánchez-Picot, Álvaro

    2017-01-01

    Cyber-Physical Social Sensing (CPSS) is a new trend in the context of pervasive sensing. In these new systems, various domains coexist in time, evolve together and influence each other. Thus, application-specific tools are necessary for specifying and validating designs and simulating systems. Nowadays, however, different tools are employed to simulate each domain independently. The main cause of the lack of co-simulation instruments able to simulate all domains together is the extreme difficulty of combining and synchronizing various tools. To reduce that difficulty, an adequate architecture for the final co-simulator must be selected. Therefore, in this paper the authors investigate and propose a methodology for the design of CPSS co-simulation tools. The paper describes the four steps that software architects should follow in order to design the most adequate co-simulator for a certain application, considering the final users’ needs and requirements and additional factors such as the development team’s experience. Moreover, the first practical use case of the proposed methodology is provided. An experimental validation is also included in order to evaluate the performance of the proposed co-simulator and to determine the correctness of the proposal. PMID:28937610

  17. Development of computer-assisted instruction application for statistical data analysis android platform as learning resource

    NASA Astrophysics Data System (ADS)

    Hendikawati, P.; Arifudin, R.; Zahid, M. Z.

    2018-03-01

    This study aims to design an Android statistical data analysis application that can be accessed through mobile devices, making it easier for users to access. The application covers various topics in basic statistics along with parametric statistical data analysis. The output of the system is parametric statistical analysis that can be used by students, lecturers, and other users who need the results of statistical calculations quickly and in an easily understood form. The Android application is written in the Java programming language; the server side uses PHP with the CodeIgniter framework, and the database is MySQL. The system development methodology used is the Waterfall methodology, with the stages of analysis, design, coding, testing, implementation, and system maintenance. This statistical data analysis application is expected to support statistics lectures and make it easier for students to understand statistical analysis on mobile devices.
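    As an example of the kind of parametric calculation such an application performs, the one-sample t statistic is shown below (the standard formula, sketched here in Python rather than the app's Java; the sample data are invented):

    ```python
    import math
    from statistics import mean, stdev

    def one_sample_t(sample, mu0):
        """One-sample t statistic: t = (x_bar - mu0) / (s / sqrt(n)),
        where s is the sample standard deviation."""
        n = len(sample)
        return (mean(sample) - mu0) / (stdev(sample) / math.sqrt(n))

    # Does this sample's mean differ from a hypothesized mu0 = 5.0?
    data = [4.8, 5.1, 5.3, 4.9, 5.2, 5.0]
    t = one_sample_t(data, 5.0)   # compare against t-distribution, df = n - 1
    ```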

  18. Application of control theory to dynamic systems simulation

    NASA Technical Reports Server (NTRS)

    Auslander, D. M.; Spear, R. C.; Young, G. E.

    1982-01-01

    Control theory is applied to dynamic systems simulation. Theory and methodology applicable to controlled ecological life support systems are considered. Spatial effects on system stability, design of control systems with uncertain parameters, and an interactive computing language (PARASOL-II) designed for dynamic system simulation, report-quality graphics, data acquisition, and simple real-time control are discussed.

  19. Innovation design of medical equipment based on TRIZ.

    PubMed

    Gao, Changqing; Guo, Leiming; Gao, Fenglan; Yang, Bo

    2015-01-01

    Medical equipment is closely related to personal health and safety, which makes it a matter of concern to the equipment user. Furthermore, there is intense competition among medical equipment manufacturers, and innovative design is the key to success for these enterprises. The design of medical equipment usually spans vastly different domains of knowledge, so the application of modern design methodology to medical equipment and technology invention is an urgent requirement. TRIZ (a Russian abbreviation that can be translated as 'theory of inventive problem solving') originated in Russia and comprises problem-solving methods developed through worldwide patent analysis, including the Contradiction Matrix, Substance-Field Analysis, Standard Solutions, Effects, and others. TRIZ is an inventive methodology for problem solving. As an engineering example, an infusion system is analyzed and redesigned using TRIZ; the innovative idea generated frees the caregiver from having to watch the infusion bag. The research in this paper shows the process of applying TRIZ to medical device invention and demonstrates that TRIZ is an inventive problem-solving methodology that can be used widely in medical device development.
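    A Contradiction Matrix lookup, one of the TRIZ methods named above, amounts to mapping an (improving, worsening) parameter pair to a short list of inventive principles. A toy sketch; the two matrix entries below are illustrative placeholders, not a faithful excerpt of the full 39x39 matrix:

    ```python
    # Toy TRIZ contradiction-matrix lookup. Entries are illustrative only.
    # Principle numbers follow the classical TRIZ list, e.g.
    # 1 = segmentation, 10 = preliminary action, 35 = parameter changes.
    CONTRADICTION_MATRIX = {
        # (improving parameter, worsening parameter) -> inventive principles
        ("reliability", "complexity"): [1, 10, 35],
        ("productivity", "accuracy"): [10, 18, 28],
    }

    def suggest_principles(improving, worsening):
        """Return suggested inventive principles, or [] if the pair is unknown."""
        return CONTRADICTION_MATRIX.get((improving, worsening), [])

    print(suggest_principles("reliability", "complexity"))  # [1, 10, 35]
    ```

    In practice the designer translates the suggested principles back into domain-specific concepts, which is the step the paper illustrates on the infusion system.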

  20. Fracture mechanics methodology: Evaluation of structural components integrity

    NASA Astrophysics Data System (ADS)

    Sih, G. C.; de Oliveira Faria, L.

    1984-09-01

    The application of fracture mechanics to structural-design problems is discussed in lectures presented in the AGARD Fracture Mechanics Methodology course held in Lisbon, Portugal, in June 1981. The emphasis is on aeronautical design, and chapters are included on fatigue-life prediction for metals and composites, the fracture mechanics of engineering structural components, failure mechanics and damage evaluation of structural components, flaw-acceptance methods, and reliability in probabilistic design. Graphs, diagrams, drawings, and photographs are provided.

  1. Application of the Hardman methodology to the Single Channel Ground-Airborne Radio System (SINCGARS)

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The HARDMAN methodology was applied to the various employment configurations of an emerging Army multipurpose communications system. The methodology was used to analyze the manpower, personnel and training (MPT) requirements, and associated costs, of the system concepts responsive to the Army's requirement for the Single Channel Ground-Airborne Radio System (SINCGARS). The scope of the application includes the analysis of two conceptual designs (Cincinnati Electronics and ITT Aerospace/Optical Division), with operating and maintenance support addressed through the general support maintenance echelon.

  2. Application of numerical methods to heat transfer and thermal stress analysis of aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Wieting, A. R.

    1979-01-01

    The paper describes a thermal-structural design analysis study of a fuel-injection strut for a hydrogen-cooled scramjet engine for a supersonic transport, utilizing finite-element methodology. Applications of finite-element and finite-difference codes to the thermal-structural design-analysis of space transports and structures are discussed. The interaction between the thermal and structural analyses has led to development of finite-element thermal methodology to improve the integration between these two disciplines. The integrated thermal-structural analysis capability developed within the framework of a computer code is outlined.

  3. Development of risk-based decision methodology for facility design.

    DOT National Transportation Integrated Search

    2014-06-01

    This report develops a methodology for CDOT to use in the risk analysis of various types of facilities and provides : illustrative examples for the use of the proposed framework. An overview of the current practices and applications to : illustrate t...

  4. Optical design applications for enhanced illumination performance

    NASA Astrophysics Data System (ADS)

    Gilray, Carl; Lewin, Ian

    1995-08-01

    Nonimaging optical design techniques have been applied in the illumination industry for many years. Recently however, powerful software has been developed which allows accurate simulation and optimization of illumination devices. Wide experience has been obtained in using such design techniques for practical situations. These include automotive lighting where safety is of greatest importance, commercial lighting systems designed for energy efficiency, and numerous specialized applications. This presentation will discuss the performance requirements of a variety of illumination devices. It will further cover design methodology and present a variety of examples of practical applications for enhanced system performance.

  5. Developments in Sensitivity Methodologies and the Validation of Reactor Physics Calculations

    DOE PAGES

    Palmiotti, Giuseppe; Salvatores, Massimo

    2012-01-01

    Sensitivity methodologies have a remarkable record of success in the reactor physics field. Sensitivity coefficients can be used for different objectives, such as uncertainty estimates, design optimization, determination of target accuracy requirements, adjustment of input parameters, and evaluation of the representativity of an experiment with respect to a reference design configuration. A review of the methods used is provided, and several examples illustrate the success of the methodology in reactor physics. A new application, the improvement of basic nuclear parameters using integral experiments, is also described.
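    A relative sensitivity coefficient S = (p/R)(dR/dp) can always be estimated by finite differences, a useful cross-check on the values that perturbation-theory-based methodologies compute analytically. A minimal sketch with a toy response function (not an actual reactor model):

    ```python
    def relative_sensitivity(response, p, rel_step=1e-6):
        """Relative sensitivity coefficient S = (p / R) * dR/dp, estimated
        with a central finite difference about parameter value p."""
        h = p * rel_step
        r0 = response(p)
        dr_dp = (response(p + h) - response(p - h)) / (2.0 * h)
        return p * dr_dp / r0

    # Toy 'response': R(p) = p**2 has exact relative sensitivity S = 2,
    # i.e. a 1% change in p produces a 2% change in R
    s = relative_sensitivity(lambda p: p ** 2, 3.0)
    ```

    Coefficients of this form are the building blocks for the uncertainty propagation, target-accuracy and adjustment applications listed in the abstract.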

  6. Ninth DOD/NASA/FAA Conference on Fibrous Composites in Structural Design, volume 1

    NASA Technical Reports Server (NTRS)

    Soderquist, Joseph R. (Compiler); Neri, Lawrence M. (Compiler); Bohon, Herman L. (Compiler)

    1992-01-01

    This publication contains the proceedings of the Ninth DOD/NASA/FAA conference on Fibrous Composites in structural Design. Presentations were made in the following areas of composite structural design: perspectives in composites; design methodology; design applications; design criteria; supporting technology; damage tolerance; and manufacturing.

  7. Design and Analysis of Turbines for Space Applications

    NASA Technical Reports Server (NTRS)

    Griffin, Lisa W.; Dorney, Daniel J.; Huber, Frank W.

    2003-01-01

    In order to mitigate the risk of rocket propulsion development, efficient, accurate, detailed fluid dynamics analysis of the turbomachinery is necessary. This analysis is used for component development, design parametrics, performance prediction, and environment definition. To support this requirement, a task was developed at NASA Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. There are four major objectives of this task: 1) to develop, enhance, and integrate advanced turbine aerodynamic design and analysis tools; 2) to develop the methodology for application of the analytical techniques; 3) to demonstrate the benefits of the advanced turbine design procedure through its application to a relevant turbine design point; and 4) to verify the optimized design and analysis with testing. The turbine chosen on which to demonstrate the procedure was a supersonic design suitable for a reusable launch vehicle (RLV). The hot gas path and blading were redesigned to obtain an increased efficiency. The redesign of the turbine was conducted with a consideration of system requirements, realizing that a highly efficient turbine that, for example, significantly increases engine weight, is of limited benefit. Both preliminary and detailed designs were considered. To generate an improved design, one-dimensional (1D) design and analysis tools, computational fluid dynamics (CFD), response surface methodology (RSM), and neural nets (NN) were used.
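    The response surface methodology (RSM) step mentioned above can be sketched in one dimension: fit a quadratic surrogate to sampled performance data, then take its stationary point as the candidate optimum. The data below are synthetic, not the actual turbine results:

    ```python
    import numpy as np

    # Sampled 'efficiency' versus one design variable x; the synthetic data
    # are an exact parabola peaking near x = 1.2
    x = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
    y = 0.9 - 0.2 * (x - 1.2) ** 2

    # Least-squares fit of the quadratic response surface y = a + b*x + c*x^2
    X = np.column_stack([np.ones_like(x), x, x ** 2])
    a, b, c = np.linalg.lstsq(X, y, rcond=None)[0]

    # Stationary point of the surrogate: candidate optimum design variable
    x_opt = -b / (2.0 * c)   # ~1.2 for this data
    ```

    In the multi-variable turbine case the same idea applies with a quadratic in several design variables, with CFD supplying the sampled responses.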

  8. Teaching Camera Calibration by a Constructivist Methodology

    ERIC Educational Resources Information Center

    Samper, D.; Santolaria, J.; Pastor, J. J.; Aguilar, J. J.

    2010-01-01

    This article describes the Metrovisionlab simulation software and practical sessions designed to teach the most important machine vision camera calibration aspects in courses for senior undergraduate students. By following a constructivist methodology, having received introductory theoretical classes, students use the Metrovisionlab application to…

  9. Deliverology

    ERIC Educational Resources Information Center

    Nordstrum, Lee E.; LeMahieu, Paul G.; Dodd, Karen

    2017-01-01

    Purpose: This paper is one of seven in this volume elaborating different approaches to quality improvement in education. This paper aims to delineate a methodology called Deliverology. Design/methodology/approach: The paper presents the origins, theoretical foundations, core principles and a case study showing an application of Deliverology in the…

  10. A methodology for the design and evaluation of user interfaces for interactive information systems. Ph.D. Thesis Final Report, 1 Jul. 1985 - 31 Dec. 1987

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Farooq, Mohammad U.

    1986-01-01

    The definition of proposed research addressing the development and validation of a methodology for the design and evaluation of user interfaces for interactive information systems is given. The major objectives of this research are: the development of a comprehensive, objective, and generalizable methodology for the design and evaluation of user interfaces for information systems; the development of equations and/or analytical models to characterize user behavior and the performance of a designed interface; the design of a prototype system for the development and administration of user interfaces; and the design and use of controlled experiments to support the research and test/validate the proposed methodology. The proposed design methodology views the user interface as a virtual machine composed of three layers: an interactive layer, a dialogue manager layer, and an application interface layer. A command language model of user system interactions is presented because of its inherent simplicity and structured approach based on interaction events. All interaction events have a common structure based on common generic elements necessary for a successful dialogue. It is shown that, using this model, various types of interfaces could be designed and implemented to accommodate various categories of users. The implementation methodology is discussed in terms of how to store and organize the information.

  11. The Scenario-Based Engineering Process (SEP): a user-centered approach for the development of health care systems.

    PubMed

    Harbison, K; Kelly, J; Burnell, L; Silva, J

    1995-01-01

    The Scenario-based Engineering Process (SEP) is a user-focused methodology for large and complex system design. This process supports new application development from requirements analysis with domain models to component selection, design and modification, implementation, integration, and archival placement. It is built upon object-oriented methodologies, domain modeling strategies, and scenario-based techniques to provide an analysis process for mapping application requirements to available components. We are using SEP in the health care applications that we are developing. The process has already achieved success in the manufacturing and military domains and is being adopted by many organizations. SEP should prove viable in any domain containing scenarios that can be decomposed into tasks.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faidy, C.; Gilles, P.

The objective of the seminar was to present the current state of the art in Leak-Before-Break (LBB) methodology development, validation, and application in an international forum. With particular emphasis on industrial applications and regulatory policies, the seminar provided an opportunity to compare approaches, experiences, and codifications developed by different countries. The seminar was organized into four topic areas: status of LBB applications; technical issues in LBB methodology; complementary requirements (leak detection and inspection); and LBB assessment and margins. As a result of this seminar, an improved understanding of LBB, gained through sharing different viewpoints from different countries, permits consideration of: simplified pipe support design and possible elimination of loss-of-coolant-accident (LOCA) mechanical consequences for specific cases; defense-in-depth types of applications without support modifications; and support of safety cases for plants designed without the LOCA hypothesis. In support of these activities, better estimates of the limits of the LBB approach should follow, as well as improvements in codifying methodologies. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.

  13. An object-oriented approach for harmonization of multimedia markup languages

    NASA Astrophysics Data System (ADS)

    Chen, Yih-Feng; Kuo, May-Chen; Sun, Xiaoming; Kuo, C.-C. Jay

    2003-12-01

An object-oriented methodology is proposed to harmonize several different markup languages in this research. First, we adopt the Unified Modelling Language (UML) as the data model to formalize the concept and process of harmonization between eXtensible Markup Language (XML) applications. Then, we design the Harmonization eXtensible Markup Language (HXML) based on the data model and formalize the transformation between the Document Type Definitions (DTDs) of the original XML applications and HXML. The transformation between instances is also discussed. We use the harmonization of SMIL and X3D as an example to demonstrate the proposed methodology. This methodology can be generalized to various application domains.
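The abstract's DTD-to-DTD transformation machinery is not spelled out, but the core operation, rewriting one markup vocabulary into a shared harmonized one, can be sketched in a few lines. The tag names and mapping table below are invented for illustration and do not come from HXML:

```python
# Hypothetical tag mapping onto a shared "harmonized" vocabulary; the
# names are invented and are not taken from the HXML specification.
import xml.etree.ElementTree as ET

TAG_MAP = {"video": "media", "Shape": "object3d"}

def harmonize(elem):
    """Recursively rewrite known tags into the harmonized vocabulary."""
    out = ET.Element(TAG_MAP.get(elem.tag, elem.tag), dict(elem.attrib))
    out.text, out.tail = elem.text, elem.tail
    for child in elem:
        out.append(harmonize(child))
    return out

smil = ET.fromstring('<par><video src="clip.mpg"/></par>')
hxml = harmonize(smil)
print(ET.tostring(hxml).decode())  # <par><media src="clip.mpg" /></par>
```

A real harmonizer would also carry attribute renaming and structural rearrangement, which is where the UML data model in the paper earns its keep.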

  14. Advanced piloted aircraft flight control system design methodology. Volume 1: Knowledge base

    NASA Technical Reports Server (NTRS)

    Mcruer, Duane T.; Myers, Thomas T.

    1988-01-01

The development of a comprehensive and eclectic methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. While theory and associated computational means are an important aspect of the design methodology, the lore, knowledge, and experience elements, which guide and govern applications, are critical features. This material is presented as summary tables, outlines, recipes, empirical data, lists, etc., which encapsulate a great deal of expert knowledge. Much of this is presented in topical knowledge summaries which are attached as Supplements. The composite of the supplements and the main body elements constitutes a first cut at a Mark 1 Knowledge Base for manned-aircraft flight control.

  15. Co-design of RAD and ETHICS methodologies: a combination of information system development methods

    NASA Astrophysics Data System (ADS)

    Nasehi, Arezo; Shahriyari, Salman

    2011-12-01

Co-design is a new trend in the social world which tries to capture different ideas in order to use the most appropriate features for a system. In this paper, the co-design of two information system methodologies is considered: rapid application development (RAD) and the effective technical and human implementation of computer-based systems (ETHICS). We examined the characteristics of these methodologies to assess the possibility of co-designing or combining them for developing an information system. To reach this purpose, four aspects are analyzed: social versus technical approach, user participation and user involvement, job satisfaction, and overcoming resistance to change. Finally, a case study using a quantitative method is analyzed in order to examine the possibility of co-design using these factors. The paper concludes that RAD and ETHICS are appropriate for co-design and offers some suggestions for carrying it out.

  16. Solving rational matrix equations in the state space with applications to computer-aided control-system design

    NASA Technical Reports Server (NTRS)

    Packard, A. K.; Sastry, S. S.

    1986-01-01

    A method of solving a class of linear matrix equations over various rings is proposed, using results from linear geometric control theory. An algorithm, successfully implemented, is presented, along with non-trivial numerical examples. Applications of the method to the algebraic control system design methodology are discussed.
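The abstract does not reproduce the geometric-control algorithm itself; as a stand-in, the snippet below numerically solves one classic linear matrix equation that recurs throughout computer-aided control-system design, the Sylvester equation AX + XB = Q, using SciPy:

```python
# Numerical illustration only: solve the Sylvester equation AX + XB = Q,
# a representative linear matrix equation from control-system design.
# A solution exists here because A and -B share no eigenvalues.
import numpy as np
from scipy.linalg import solve_sylvester

A = np.array([[1.0, 2.0], [0.0, 3.0]])
B = np.array([[4.0, 0.0], [1.0, 5.0]])
Q = np.eye(2)

X = solve_sylvester(A, B, Q)
residual = np.linalg.norm(A @ X + X @ B - Q)
print(residual)  # near machine precision
```

Equations of this form appear in observer design, in Lyapunov analysis (the B = Aᵀ special case), and in the model-matching problems the paper addresses over more general rings.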

  17. Adopting a Design-Thinking Multidisciplinary Learning Approach: Integrating Mobile Applications into a Marketing Research Course

    ERIC Educational Resources Information Center

    Zarzosa, Jennifer

    2018-01-01

    This article seeks to address the gap between marketing education and marketing practice by integrating a design-thinking (DT) methodology to the marketing research (MR) framework to achieve learning objectives that will enhance cross-functional, collaborative, conceptual, and technical skills. The mobile application marketing research project…

  18. Electromagnetic Modeling, Optimization and Uncertainty Quantification for Antenna and Radar Systems Surfaces Scattering and Energy Absorption

    DTIC Science & Technology

    2017-03-06

This work addresses problems in the field of electromagnetic propagation and scattering, with applicability to the design of antenna and radar systems and to energy absorption and scattering by rough surfaces. It has led to significant new methodologies, including the introduction of a certain Windowed Green Function

  19. Modified Dynamic Inversion to Control Large Flexible Aircraft: What's Going On?

    NASA Technical Reports Server (NTRS)

    Gregory, Irene M.

    1999-01-01

High performance aircraft of the future will be designed lighter, more maneuverable, and to operate over an ever-expanding flight envelope. From the flight control perspective, one of the largest differences between current and future advanced aircraft is elasticity. Over the last decade, the dynamic inversion methodology has gained considerable popularity in application to highly maneuverable fighter aircraft, which were treated as rigid vehicles. This paper explores the application of dynamic inversion to an advanced, highly flexible aircraft. An initial application has been made to a large flexible supersonic aircraft. In the course of controller design for this advanced vehicle, modifications were made to the standard dynamic inversion methodology. The results of this application were deemed rather promising. An analytical study has been undertaken to better understand the nature of these modifications and to determine their general applicability. This paper presents the results of this initial analytical look at the modifications to dynamic inversion for control of large flexible aircraft.

  20. Ground Thermal Diffusivity Calculation by Direct Soil Temperature Measurement. Application to very Low Enthalpy Geothermal Energy Systems.

    PubMed

    Andújar Márquez, José Manuel; Martínez Bohórquez, Miguel Ángel; Gómez Melgar, Sergio

    2016-02-29

This paper presents a methodology and instrumentation system for the indirect measurement of the thermal diffusivity of a soil at a given depth from measurements of its temperature at that depth. The development has been carried out with a view to its application in the design and sizing of very low enthalpy geothermal energy (VLEGE) systems, but it has many other applications, for example in construction, agriculture, or biology. The methodology is simple and inexpensive because it can take advantage of the prescriptive geotechnical drilling prior to the construction of a house or building to take temperature measurements at the same time, which allow obtaining the actual temperature and ground thermal diffusivity at the depth of interest. The methodology and developed system have been tested and used in the design of a VLEGE facility for a chalet with a basement on the outskirts of Huelva (a city in the southwest of Spain). Experimental results validate the proposed approach.
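The paper's exact estimation procedure is not given in the abstract; the sketch below uses the classical one-dimensional conduction model instead, in which a sinusoidal surface temperature wave of angular frequency w decays with depth z as A(z) = A0·exp(-z·sqrt(w/(2α))), so that amplitudes measured at two depths yield the diffusivity α. The depths and amplitudes are invented:

```python
# Classical conduction model (not the paper's specific method): estimate
# soil thermal diffusivity from temperature-wave amplitudes at two depths.
import math

def diffusivity_from_amplitudes(z1, z2, A1, A2, period_s):
    """Estimate alpha (m^2/s) from amplitudes A1, A2 at depths z1 < z2 (m)."""
    w = 2.0 * math.pi / period_s
    return w * (z2 - z1) ** 2 / (2.0 * math.log(A1 / A2) ** 2)

# Hypothetical yearly wave: amplitudes of 8 C at 1 m and 4 C at 3 m depth
alpha = diffusivity_from_amplitudes(1.0, 3.0, 8.0, 4.0, 365 * 24 * 3600.0)
print(alpha)  # on the order of 1e-6 m^2/s, a typical soil value
```

The same relation can be inverted to size the borehole depth at which a VLEGE loop sees a nearly constant ground temperature.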

  1. Ground Thermal Diffusivity Calculation by Direct Soil Temperature Measurement. Application to very Low Enthalpy Geothermal Energy Systems

    PubMed Central

    Andújar Márquez, José Manuel; Martínez Bohórquez, Miguel Ángel; Gómez Melgar, Sergio

    2016-01-01

This paper presents a methodology and instrumentation system for the indirect measurement of the thermal diffusivity of a soil at a given depth from measurements of its temperature at that depth. The development has been carried out with a view to its application in the design and sizing of very low enthalpy geothermal energy (VLEGE) systems, but it has many other applications, for example in construction, agriculture, or biology. The methodology is simple and inexpensive because it can take advantage of the prescriptive geotechnical drilling prior to the construction of a house or building to take temperature measurements at the same time, which allow obtaining the actual temperature and ground thermal diffusivity at the depth of interest. The methodology and developed system have been tested and used in the design of a VLEGE facility for a chalet with a basement on the outskirts of Huelva (a city in the southwest of Spain). Experimental results validate the proposed approach. PMID:26938534

  2. Fast underdetermined BSS architecture design methodology for real time applications.

    PubMed

    Mopuri, Suresh; Reddy, P Sreenivasa; Acharyya, Amit; Naik, Ganesh R

    2015-01-01

In this paper, we propose a high-speed architecture design methodology for the Under-determined Blind Source Separation (UBSS) algorithm using our recently proposed high-speed Discrete Hilbert Transform (DHT), targeting real-time applications. In the UBSS algorithm, unlike typical BSS, the number of sensors is less than the number of sources, which is of more interest in real-time applications. The DHT architecture has been implemented based on a sub-matrix multiplication method to compute the M-point DHT, which uses the N-point architecture recursively, where M is an integer multiple of N. The DHT architecture and the state-of-the-art architecture are coded in VHDL for a 16-bit word length, and ASIC implementation is carried out using UMC 90-nm technology at VDD = 1 V and a 1 MHz clock frequency. The implementation and experimental comparison results show that the proposed DHT design is two times faster than the state-of-the-art architecture.
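The authors' sub-matrix hardware architecture is not reproduced here; the snippet below is just a reference software DHT, the kind of golden model one would code to verify such an architecture. `scipy.signal.hilbert` returns the analytic signal, whose imaginary part is the discrete Hilbert transform of the input:

```python
# Reference (software) discrete Hilbert transform via the FFT -- a golden
# model for checking a hardware DHT, not the paper's architecture itself.
import numpy as np
from scipy.signal import hilbert

n = np.arange(256)
x = np.cos(2 * np.pi * 8 * n / 256)   # periodic cosine test vector
xh = np.imag(hilbert(x))              # DHT: cosine -> sine for a full-period tone

err = np.max(np.abs(xh - np.sin(2 * np.pi * 8 * n / 256)))
print(err)  # tiny: the DHT of this periodic cosine is the matching sine
```

Running the fixed-point VHDL implementation against such a floating-point model over a battery of test vectors is the usual way to bound the quantization error of a 16-bit datapath.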

  3. Object-oriented analysis and design: a methodology for modeling the computer-based patient record.

    PubMed

    Egyhazy, C J; Eyestone, S M; Martino, J; Hodgson, C L

    1998-08-01

    The article highlights the importance of an object-oriented analysis and design (OOAD) methodology for the computer-based patient record (CPR) in the military environment. Many OOAD methodologies do not adequately scale up, allow for efficient reuse of their products, or accommodate legacy systems. A methodology that addresses these issues is formulated and used to demonstrate its applicability in a large-scale health care service system. During a period of 6 months, a team of object modelers and domain experts formulated an OOAD methodology tailored to the Department of Defense Military Health System and used it to produce components of an object model for simple order processing. This methodology and the lessons learned during its implementation are described. This approach is necessary to achieve broad interoperability among heterogeneous automated information systems.

  4. Computational Fragment-Based Drug Design: Current Trends, Strategies, and Applications.

    PubMed

    Bian, Yuemin; Xie, Xiang-Qun Sean

    2018-04-09

Fragment-based drug design (FBDD) has been an effective methodology for drug development for decades. Successful applications of this strategy have brought both opportunities and challenges to the field of pharmaceutical science. Recent progress in computational fragment-based drug design provides an additional approach for future research in a time- and labor-efficient manner. Combining multiple in silico methodologies, computational FBDD offers flexibility in fragment library selection, protein model generation, and fragment/compound docking mode prediction. These characteristics give computational FBDD an advantage in designing novel and potential compounds for a given target. The purpose of this review is to discuss the latest advances, ranging from commonly used strategies to novel concepts and technologies in computational fragment-based drug design. In particular, this review compares the specifications and advantages of experimental and computational FBDD, and discusses limitations and future prospects.

  5. Efficient Analysis of Complex Structures

    NASA Technical Reports Server (NTRS)

    Kapania, Rakesh K.

    2000-01-01

The various accomplishments achieved during this project are: (1) a survey of Neural Network (NN) applications using the MATLAB NN Toolbox in structural engineering, especially on equivalent continuum models (Appendix A); (2) application of NNs and GAs to simulate and synthesize substructures: 1-D and 2-D beam problems (Appendix B); (3) development of an equivalent plate-model analysis method (EPA) for static and vibration analysis of general trapezoidal built-up wing structures composed of skins, spars, and ribs, with calculation of a range of test cases and comparison against measurements or FEA results (Appendix C); (4) basic work on using second-order sensitivities to simulate wing modal response, a discussion of sensitivity evaluation approaches, and some results (Appendix D); (5) establishment of a general methodology for simulating modal responses by direct application of NNs and by sensitivity techniques, in a design space composed of a number of design points, with the two methods compared through examples (Appendix E); (6) establishment of a general methodology for efficient analysis of complex wing structures by indirect application of NNs, the NN-aided Equivalent Plate Analysis, including training of the neural networks for several design spaces, applicable to the actual design of complex wings (Appendix F).

  6. IMPAC: An Integrated Methodology for Propulsion and Airframe Control

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay; Ouzts, Peter J.; Lorenzo, Carl F.; Mattern, Duane L.

    1991-01-01

    The National Aeronautics and Space Administration is actively involved in the development of enabling technologies that will lead towards aircraft with new/enhanced maneuver capabilities such as Short Take-Off Vertical Landing (STOVL) and high angle of attack performance. Because of the high degree of dynamic coupling between the airframe and propulsion systems of these types of aircraft, one key technology is the integration of the flight and propulsion control. The NASA Lewis Research Center approach to developing Integrated Flight Propulsion Control (IFPC) technologies is an in-house research program referred to as IMPAC (Integrated Methodology for Propulsion and Airframe Control). The goals of IMPAC are to develop a viable alternative to the existing integrated control design methodologies that will allow for improved system performance and simplicity of control law synthesis and implementation, and to demonstrate the applicability of the methodology to a supersonic STOVL fighter aircraft. Based on some preliminary control design studies that included evaluation of the existing methodologies, the IFPC design methodology that is emerging at the Lewis Research Center consists of considering the airframe and propulsion system as one integrated system for an initial centralized controller design and then partitioning the centralized controller into separate airframe and propulsion system subcontrollers to ease implementation and to set meaningful design requirements for detailed subsystem control design and evaluation. An overview of IMPAC is provided and detailed discussion of the various important design and evaluation steps in the methodology are included.

  7. The Design Process of Physical Security as Applied to a U.S. Border Port of Entry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wagner, G.G.

    1999-02-22

This paper details the application of a standard physical security system design process to a US Border Port of Entry (PoE) for vehicle entry/exit. The physical security design methodology is described, as well as the physical security similarities to facilities currently at a US Border PoE for vehicles. The description of the physical security design process includes the various elements that make up the methodology as well as the considerations that must be taken into account when dealing with system integration of those elements. The distinctions between preventing unlawful entry/exit of illegal contraband and of personnel are described. The potential to enhance the functions of drug/contraband detection in the Pre-Primary Inspection area through the application of emerging technologies is also addressed.

  8. Schematic representation of case study research designs.

    PubMed

    Rosenberg, John P; Yates, Patsy M

    2007-11-01

    The paper is a report of a study to demonstrate how the use of schematics can provide procedural clarity and promote rigour in the conduct of case study research. Case study research is a methodologically flexible approach to research design that focuses on a particular case - whether an individual, a collective or a phenomenon of interest. It is known as the 'study of the particular' for its thorough investigation of particular, real-life situations and is gaining increased attention in nursing and social research. However, the methodological flexibility it offers can leave the novice researcher uncertain of suitable procedural steps required to ensure methodological rigour. This article provides a real example of a case study research design that utilizes schematic representation drawn from a doctoral study of the integration of health promotion principles and practices into a palliative care organization. The issues discussed are: (1) the definition and application of case study research design; (2) the application of schematics in research; (3) the procedural steps and their contribution to the maintenance of rigour; and (4) the benefits and risks of schematics in case study research. The inclusion of visual representations of design with accompanying explanatory text is recommended in reporting case study research methods.

  9. H-infinity based integrated flight-propulsion control design for a STOVL aircraft in transition flight

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay; Mattern, Duane L.; Bright, Michelle M.; Ouzts, Peter J.

    1990-01-01

    Results are presented from an application of H-infinity control design methodology to a centralized integrated flight/propulsion control (IFPC) system design for a supersonic Short Take-Off and Vertical Landing (STOVL) fighter aircraft in transition flight. The overall design methodology consists of a centralized IFPC controller design with controller partitioning. Only the feedback controller design portion of the methodology is addressed. Design and evaluation vehicle models are summarized, and insight is provided into formulating the H-infinity control problem such that it reflects the IFPC design objectives. The H-infinity controller is shown to provide decoupled command tracking for the design model. The controller order could be significantly reduced by modal residualization of the fast controller modes without any deterioration in performance. A discussion is presented of the areas in which the controller performance needs to be improved, and ways in which these improvements can be achieved within the framework of an H-infinity based linear control design.

  10. Advanced piloted aircraft flight control system design methodology. Volume 2: The FCX flight control design expert system

    NASA Technical Reports Server (NTRS)

    Myers, Thomas T.; Mcruer, Duane T.

    1988-01-01

The development of a comprehensive and eclectic methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. The FCX expert system as presently developed is only a limited prototype capable of supporting basic lateral-directional FCS design activities related to the design example used. FCX presently supports design of only one FCS architecture (yaw damper plus roll damper), and the rules are largely focused on Class IV (highly maneuverable) aircraft. Despite this limited scope, the major elements which appear necessary for application of knowledge-based software concepts to flight control design were assembled, and thus FCX represents a prototype which can be tested, critiqued, and evolved in an ongoing process of development.

  11. Lean for Education

    ERIC Educational Resources Information Center

    LeMahieu, Paul G.; Nordstrum, Lee E.; Greco, Patricia

    2017-01-01

    Purpose: This paper is one of seven in this volume that aims to elaborate different approaches to quality improvement in education. It delineates a methodology called Lean for Education. Design/methodology/approach: The paper presents the origins, theoretical foundations, core concepts and a case study demonstrating an application in US education,…

  12. Review of the probabilistic failure analysis methodology and other probabilistic approaches for application in aerospace structural design

    NASA Technical Reports Server (NTRS)

    Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.

    1993-01-01

    Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus, costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.
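A minimal illustration of the probabilistic (versus safety-factor) view the report surveys: Monte Carlo estimation of failure probability for a stress-strength interference problem. The distributions below are invented, and none of the report's specific PFA, FPI, NESSUS, or response-surface machinery is reproduced:

```python
# Monte Carlo failure probability for stress-strength interference.
# Failure occurs when the (random) demand exceeds the (random) capability.
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000
strength = rng.normal(50.0, 5.0, N)   # capability (invented units)
stress = rng.normal(30.0, 4.0, N)     # demand

pf = np.mean(stress > strength)
print(pf)  # the analytic value is about 9.0e-4 for these parameters
```

A deterministic safety-factor check (mean strength / mean stress = 1.67) says nothing about this tail probability, which is the quantity a probabilistic design approach controls directly.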

  13. A new methodology to integrate planetary quarantine requirements into mission planning, with application to a Jupiter orbiter

    NASA Technical Reports Server (NTRS)

    Howard, R. A.; North, D. W.; Pezier, J. P.

    1975-01-01

    A new methodology is proposed for integrating planetary quarantine objectives into space exploration planning. This methodology is designed to remedy the major weaknesses inherent in the current formulation of planetary quarantine requirements. Application of the methodology is illustrated by a tutorial analysis of a proposed Jupiter Orbiter mission. The proposed methodology reformulates planetary quarantine planning as a sequential decision problem. Rather than concentrating on a nominal plan, all decision alternatives and possible consequences are laid out in a decision tree. Probabilities and values are associated with the outcomes, including the outcome of contamination. The process of allocating probabilities, which could not be made perfectly unambiguous and systematic, is replaced by decomposition and optimization techniques based on principles of dynamic programming. Thus, the new methodology provides logical integration of all available information and allows selection of the best strategy consistent with quarantine and other space exploration goals.
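The reformulation above can be sketched as a tiny decision tree rolled back by dynamic programming (backward induction). The tree shape, probabilities, and values are invented for illustration; they do not come from the Jupiter Orbiter analysis:

```python
# Toy backward induction on a decision tree: chance nodes take expectations,
# decision nodes take the best alternative. All numbers are invented.

def rollback(node):
    """Return the optimal expected value of a decision-tree node."""
    kind = node["kind"]
    if kind == "leaf":
        return node["value"]
    if kind == "chance":            # expectation over outcomes
        return sum(p * rollback(c) for p, c in node["branches"])
    if kind == "decision":          # choose the best alternative
        return max(rollback(c) for c in node["options"])
    raise ValueError(kind)

# Sterilize the spacecraft (costly but safe) or fly as-is and risk contamination.
tree = {"kind": "decision", "options": [
    {"kind": "leaf", "value": 80.0},                  # sterilize
    {"kind": "chance", "branches": [                  # fly as-is
        (0.95, {"kind": "leaf", "value": 100.0}),     # no contamination
        (0.05, {"kind": "leaf", "value": -500.0}),    # contamination
    ]},
]}
print(rollback(tree))  # flying as-is is worth 0.95*100 - 0.05*500 = 70, so 80.0
```

Laying out all alternatives and consequences this way is exactly what replaces the single "nominal plan" of the older quarantine formulation.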

  14. The design-build contracting methodology for transportation projects : a review of practice and evaluation for Connecticut applications.

    DOT National Transportation Integrated Search

    2010-06-10

Two primary contracting methods are used by most state transportation agencies to design and build infrastructure: design-bid-build and design-build. The objective of this study is to conduct a literature review to identify how ConnDOT's use of...

  15. A combined stochastic feedforward and feedback control design methodology with application to autoland design

    NASA Technical Reports Server (NTRS)

    Halyo, Nesim

    1987-01-01

    A combined stochastic feedforward and feedback control design methodology was developed. The objective of the feedforward control law is to track the commanded trajectory, whereas the feedback control law tries to maintain the plant state near the desired trajectory in the presence of disturbances and uncertainties about the plant. The feedforward control law design is formulated as a stochastic optimization problem and is embedded into the stochastic output feedback problem where the plant contains unstable and uncontrollable modes. An algorithm to compute the optimal feedforward is developed. In this approach, the use of error integral feedback, dynamic compensation, control rate command structures are an integral part of the methodology. An incremental implementation is recommended. Results on the eigenvalues of the implemented versus designed control laws are presented. The stochastic feedforward/feedback control methodology is used to design a digital automatic landing system for the ATOPS Research Vehicle, a Boeing 737-100 aircraft. The system control modes include localizer and glideslope capture and track, and flare to touchdown. Results of a detailed nonlinear simulation of the digital control laws, actuator systems, and aircraft aerodynamics are presented.
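The combined structure can be sketched in scalar form: a feedforward input computed by inverting the plant along the commanded trajectory, plus a feedback term that rejects the initial offset. The plant, gain, and trajectory below are invented; the report's stochastic output-feedback synthesis is not reproduced:

```python
# Scalar sketch of combined feedforward + feedback trajectory tracking.
# Plant: x[k+1] = a*x[k] + b*u[k]; the tracking error obeys
# e[k+1] = (a - b*K) * e[k], so it decays geometrically.
import math

a, b = 0.9, 0.5
K = 0.8                                        # stabilizing gain: |a - b*K| = 0.5
r = [math.sin(0.1 * k) for k in range(101)]    # commanded trajectory

x = 2.0                                        # start well off the trajectory
for k in range(100):
    u_ff = (r[k + 1] - a * r[k]) / b           # feedforward: exact inversion
    u = u_ff + K * (r[k] - x)                  # feedback on the tracking error
    x = a * x + b * u

err = abs(x - r[100])
print(err)  # ~2 * 0.5**100, essentially zero
```

The feedforward term does the tracking; the feedback term is what handles disturbances and the wrong initial condition, mirroring the division of labor described in the abstract.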

  16. A computer simulator for development of engineering system design methodologies

    NASA Technical Reports Server (NTRS)

    Padula, S. L.; Sobieszczanski-Sobieski, J.

    1987-01-01

    A computer program designed to simulate and improve engineering system design methodology is described. The simulator mimics the qualitative behavior and data couplings occurring among the subsystems of a complex engineering system. It eliminates the engineering analyses in the subsystems by replacing them with judiciously chosen analytical functions. With the cost of analysis eliminated, the simulator is used for experimentation with a large variety of candidate algorithms for multilevel design optimization to choose the best ones for the actual application. Thus, the simulator serves as a development tool for multilevel design optimization strategy. The simulator concept, implementation, and status are described and illustrated with examples.
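A toy version of the simulator idea: each "subsystem analysis" is replaced by a cheap analytical stand-in, and the stand-ins are coupled through shared variables so that coordination strategies can be exercised without real engineering analyses. The functions and coefficients below are invented:

```python
# Invented analytical stand-ins for two coupled subsystem analyses.

def aero(span, weight):              # stand-in for an aerodynamic analysis
    return 10.0 + 0.5 * span + 0.05 * weight   # "drag" couples to weight

def structures(span, drag):          # stand-in for a structural analysis
    return 2.0 * span + 0.1 * drag             # "weight" couples to drag

def fixed_point(span, iters=50):
    """Iterate the coupled stand-ins to a mutually consistent state."""
    drag = weight = 0.0
    for _ in range(iters):
        drag = aero(span, weight)
        weight = structures(span, drag)
    return drag, weight

drag, weight = fixed_point(8.0)
print(drag, weight)  # converges to roughly (14.87, 17.49)
```

Because each evaluation is nearly free, candidate multilevel optimization algorithms can be run thousands of times against such a surrogate system before committing to the expensive real analyses.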

  17. Application-specific coarse-grained reconfigurable array: architecture and design methodology

    NASA Astrophysics Data System (ADS)

    Zhou, Li; Liu, Dongpei; Zhang, Jianfeng; Liu, Hengzhu

    2015-06-01

    Coarse-grained reconfigurable arrays (CGRAs) have shown potential for application in embedded systems in recent years. Numerous reconfigurable processing elements (PEs) in CGRAs provide flexibility while maintaining high performance by exploring different levels of parallelism. However, a difference remains between the CGRA and the application-specific integrated circuit (ASIC). Some application domains, such as software-defined radios (SDRs), require flexibility with performance demand increases. More effective CGRA architectures are expected to be developed. Customisation of a CGRA according to its application can improve performance and efficiency. This study proposes an application-specific CGRA architecture template composed of generic PEs (GPEs) and special PEs (SPEs). The hardware of the SPE can be customised to accelerate specific computational patterns. An automatic design methodology that includes pattern identification and application-specific function unit generation is also presented. A mapping algorithm based on ant colony optimisation is provided. Experimental results on the SDR target domain show that compared with other ordinary and application-specific reconfigurable architectures, the CGRA generated by the proposed method performs more efficiently for given applications.

  18. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 1: Methodology and applications

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

An improved methodology for quantitatively evaluating the failure risk of spaceflight systems, in order to assess flight readiness and identify risk control measures, is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of the statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
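One ingredient of the structure described above, modifying an analytically derived failure-probability distribution to reflect test experience, can be loosely sketched as a conjugate Bayesian update. The Beta prior parameters and test counts below are invented and are not the PFA procedures themselves:

```python
# Loose sketch (not the PFA statistical procedures): a Beta prior on a
# per-flight failure probability, updated with hypothetical test outcomes.

def beta_update(a, b, failures, trials):
    """Conjugate update of a Beta(a, b) prior with binomial test data."""
    return a + failures, b + trials - failures

a0, b0 = 1.0, 99.0                 # prior mean 0.01 from (hypothetical) analysis
a1, b1 = beta_update(a0, b0, failures=0, trials=50)

prior_mean = a0 / (a0 + b0)
post_mean = a1 / (a1 + b1)
print(prior_mean, post_mean)       # 0.01 shrinks to 1/150 after 50 clean tests
```

This captures the abstract's central point: when tests are expensive, each failure-free trial buys a quantifiable reduction in assessed risk rather than merely anecdotal confidence.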

  19. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  20. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples, volume 1

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  1. Six Sigma in Education

    ERIC Educational Resources Information Center

    LeMahieu, Paul G.; Nordstrum, Lee E.; Cudney, Elizabeth A.

    2017-01-01

    Purpose: This paper is one of seven in this volume that aims to elaborate different approaches to quality improvement in education. It delineates a methodology called Six Sigma. Design/methodology/approach: The paper presents the origins, theoretical foundations, core principles and a case study demonstrating an application of Six Sigma in a…

  2. Rocketdyne PSAM: In-house enhancement/application

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Rajagopal, K. R.; Ohara, K.

    1991-01-01

    The development of the Probabilistic Design Analysis (PDA) process for rocket engines was initiated. This will give engineers a quantitative assessment of calculated reliability during the design process. The PDA will help choose better designs, make them more robust, and help select the critical tests that demonstrate key reliability issues, improving confidence in the engine's capabilities. Rocketdyne's involvement with the Composite Loads Spectra (CLS) and Probabilistic Structural Analysis Methodology (PSAM) contracts started this effort, and these contracts are key elements in the ongoing developments. Internal development efforts and hardware applications complement and extend the CLS and PSAM efforts. The completion of the CLS option work and the follow-on PSAM developments will also be integral parts of this methodology. A brief summary of these efforts is presented.

  3. Ninth DOD/NASA/FAA Conference on Fibrous Composites in Structural Design, volume 2

    NASA Technical Reports Server (NTRS)

    Soderquist, Joseph R. (Compiler); Neri, Lawrence M. (Compiler); Bohon, Herman L. (Compiler)

    1992-01-01

    This publication contains the proceedings of the Ninth DOD/NASA/FAA Conference on Fibrous Composites in Structural Design held at Lake Tahoe, Nevada, during 4-7 Nov. 1991. Presentations were made in the following areas of composite structural design: perspectives in composites, design methodology, design applications, design criteria, supporting technology, damage tolerance, and manufacturing.

  4. Ninth DOD/NASA/FAA Conference on Fibrous Composites in Structural Design, volume 3

    NASA Technical Reports Server (NTRS)

    Soderquist, Joseph R. (Compiler); Neri, Lawrence M. (Compiler); Bohon, Herman L. (Compiler)

    1992-01-01

    This publication contains the proceedings of the Ninth DOD/NASA/FAA Conference on Fibrous Composites in Structural Design held at Lake Tahoe, Nevada, during 4-7 Nov. 1991. Presentations were made in the following areas of composite structural design: perspectives in composites, design methodology, design applications, design criteria, supporting technology, damage tolerance, and manufacturing.

  5. DB4US: A Decision Support System for Laboratory Information Management.

    PubMed

    Carmona-Cejudo, José M; Hortas, Maria Luisa; Baena-García, Manuel; Lana-Linati, Jorge; González, Carlos; Redondo, Maximino; Morales-Bueno, Rafael

    2012-11-14

    Until recently, laboratory automation has focused primarily on improving hardware. Future advances are concentrated on intelligent software, since laboratories performing clinical diagnostic testing require improved information systems to address their data processing needs. In this paper, we propose DB4US, an application that automates the management of laboratory quality indicator information. Currently, there is a lack of ready-to-use management quality measures. This application addresses that deficiency through the extraction, consolidation, statistical analysis, and visualization of data related to demographics, reagents, and turn-around times. The design and implementation issues, as well as the technologies used for the implementation of this system, are discussed in this paper. The objective is to develop a general methodology that integrates the computation of ready-to-use management quality measures and a dashboard to easily analyze the overall performance of a laboratory, as well as to automatically detect anomalies or errors. The novelty of our approach lies in the application of integrated web-based dashboards as an information management system in hospital laboratories. We propose a new methodology for laboratory information management based on the extraction, consolidation, statistical analysis, and visualization of data related to demographics, reagents, and turn-around times, offering a dashboard-like web interface to the laboratory manager. The methodology comprises a unified data warehouse that stores and consolidates multidimensional data from different data sources. The methodology is illustrated through the implementation and validation of DB4US, a novel web application based on this methodology that constructs an interface to obtain ready-to-use indicators and offers the possibility to drill down from high-level metrics to more detailed summaries.
The offered indicators are calculated beforehand so that they are ready to use when the user needs them. The design is based on a set of different parallel processes to precalculate indicators. The application displays information related to tests, requests, samples, and turn-around times. The dashboard is designed to show the set of indicators on a single screen. DB4US was deployed for the first time in the Hospital Costa del Sol in 2008. In our evaluation we show the positive impact of this methodology for laboratory professionals, since the use of our application has reduced the time needed for the elaboration of the different statistical indicators and has also provided information that has been used to optimize the usage of laboratory resources by the discovery of anomalies in the indicators. DB4US users benefit from Internet-based communication of results, since this information is available from any computer without having to install any additional software. The proposed methodology and the accompanying web application, DB4US, automates the processing of information related to laboratory quality indicators and offers a novel approach for managing laboratory-related information, benefiting from an Internet-based communication mechanism. The application of this methodology has been shown to improve the usage of time, as well as other laboratory resources.
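The "precalculated, ready-to-use indicator" pattern the abstract describes can be sketched as follows. The indicator names, record layout, and percentile rule are assumptions for illustration, not DB4US's actual schema.

```python
import math
import statistics

def tat_indicators(requests):
    """Pre-calculate turn-around-time (TAT) indicators per test type,
    in the spirit of ready-to-use dashboard metrics.

    `requests` is an iterable of (test_name, received_ts, validated_ts)
    with timestamps in seconds; results are grouped by test type."""
    by_test = {}
    for test, received, validated in requests:
        tat_min = (validated - received) / 60.0     # seconds -> minutes
        by_test.setdefault(test, []).append(tat_min)
    return {
        test: {
            "n": len(tats),
            "median_min": statistics.median(tats),
            # crude 90th percentile: ceil-index into the sorted sample
            "p90_min": sorted(tats)[max(0, math.ceil(0.9 * len(tats)) - 1)],
        }
        for test, tats in by_test.items()
    }
```

In a production system these aggregates would be computed by scheduled parallel jobs over the data warehouse, so the dashboard only reads precomputed rows; the grouping-and-summarizing logic is the same.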

  6. A first principles based methodology for design of axial compressor configurations

    NASA Astrophysics Data System (ADS)

    Iyengar, Vishwas

    Axial compressors are widely used in many aerodynamic applications. The design of an axial compressor configuration presents many challenges. Until recently, compressor design was done using 2-D viscous flow analyses that solve the flow field around cascades or in meridional planes or 3-D inviscid analyses. With the advent of modern computational methods it is now possible to analyze the 3-D viscous flow and accurately predict the performance of 3-D multistage compressors. It is necessary to retool the design methodologies to take advantage of the improved accuracy and physical fidelity of these advanced methods. In this study, a first-principles based multi-objective technique for designing single stage compressors is described. The study accounts for stage aerodynamic characteristics, rotor-stator interactions and blade elastic deformations. A parametric representation of compressor blades that include leading and trailing edge camber line angles, thickness and camber distributions was used in this study. A design of experiment approach is used to reduce the large combinations of design variables into a smaller subset. A response surface method is used to approximately map the output variables as a function of design variables. An optimized configuration is determined as the extremum of all extrema. This method has been applied to a rotor-stator stage similar to NASA Stage 35. The study has two parts: a preliminary study where a limited number of design variables were used to give an understanding of the important design variables for subsequent use, and a comprehensive application of the methodology where a larger, more complete set of design variables are used. The extended methodology also attempts to minimize the acoustic fluctuations at the rotor-stator interface by considering a rotor-wake influence coefficient (RWIC). 
    Results presented include performance-map calculations at design and off-design speed, along with a detailed visualization of the flow field at design and off-design conditions. The present methodology provides a way to screen systematically through the plethora of design variables. By selecting the most influential design parameters and by optimizing the blade leading-edge and trailing-edge mean camber line angles, phenomena such as tip blockages, blade-to-blade shock structures, and other loss mechanisms can be weakened or alleviated. It is found that these changes to the configuration can have a beneficial effect on total pressure ratio and stage adiabatic efficiency, thereby improving the performance of the axial compression system. Aeroacoustic benefits were found by minimizing the noise-generating mechanisms associated with rotor wake-stator interactions. The new method is reliable, computationally inexpensive, and readily applicable to routine industrial design optimization of turbomachinery blades.
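The design-of-experiments and response-surface steps can be sketched generically. The quadratic surface, the two design variables standing in for camber-line angles, and the 3x3 factorial sample in the test are illustrative assumptions, not the study's actual variables or data.

```python
def solve(a, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(a)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def fit_quadratic_rs(points, y):
    """Least-squares response surface
    y = b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2,
    fitted via the normal equations over the DOE sample `points`."""
    feats = [[1.0, x1, x2, x1 * x1, x2 * x2, x1 * x2] for x1, x2 in points]
    k = len(feats[0])
    ata = [[sum(f[i] * f[j] for f in feats) for j in range(k)] for i in range(k)]
    atb = [sum(f[i] * yi for f, yi in zip(feats, y)) for i in range(k)]
    return solve(ata, atb)

def stationary_point(b):
    """Solve grad(y) = 0: [[2*b3, b5], [b5, 2*b4]] @ [x1, x2] = [-b1, -b2]."""
    return solve([[2 * b[3], b[5]], [b[5], 2 * b[4]]], [-b[1], -b[2]])
```

With the surface in hand, the optimizer evaluates the cheap polynomial instead of the expensive 3-D viscous analysis; the stationary point is a candidate optimum that would then be verified with a full simulation.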

  7. Energy modelling in sensor networks

    NASA Astrophysics Data System (ADS)

    Schmidt, D.; Krämer, M.; Kuhn, T.; Wehn, N.

    2007-06-01

    Wireless sensor networks are one of the key enabling technologies for the vision of ambient intelligence. Energy resources for sensor nodes are very scarce. A key challenge is the design of energy-efficient communication protocols. Models of the energy consumption are needed to accurately simulate the efficiency of a protocol or application design, and can also be used for automatic energy optimizations in a model driven design process. We propose a novel methodology to create models for sensor nodes based on a few simple measurements. In a case study the methodology was used to create models for MICAz nodes. The models were integrated in a simulation environment as well as in an SDL runtime framework of a model driven design process. Measurements on a test application that was created automatically from an SDL specification showed an 80% reduction in energy consumption compared to an implementation without power saving strategies.
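A state-based energy model of the kind described can be sketched as a weighted sum of per-state power and residency time. The current values below are assumed, MICAz-like figures chosen for illustration, not the measurements from the paper.

```python
# Assumed (illustrative) MICAz-like state currents in mA at 3.0 V;
# a real model derives these constants from node measurements.
CURRENT_MA = {"tx": 17.4, "rx": 19.7, "idle": 8.0, "sleep": 0.02}
VOLTAGE = 3.0

def energy_mj(schedule):
    """Energy in millijoules for a schedule of (state, seconds) pairs:
    E = sum(V * I_state * t_state)."""
    return sum(VOLTAGE * CURRENT_MA[s] * t for s, t in schedule)

# one second of an always-listening MAC vs. a 5% duty-cycled MAC
always_on = [("rx", 0.99), ("tx", 0.01)]
duty_cycled = [("rx", 0.04), ("tx", 0.01), ("sleep", 0.95)]
saving = 1 - energy_mj(duty_cycled) / energy_mj(always_on)
```

Once such a model is available, the same function can score candidate protocol schedules inside a simulator or a model-driven toolchain, which is how large savings from power-saving strategies are predicted before deployment.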

  8. Integrating Low-Cost Rapid Usability Testing into Agile System Development of Healthcare IT: A Methodological Perspective.

    PubMed

    Kushniruk, Andre W; Borycki, Elizabeth M

    2015-01-01

    The development of more usable and effective healthcare information systems has become a critical issue. In the software industry, methodologies such as agile and iterative development processes have emerged, leading to more effective and usable systems. These approaches highlight focusing on user needs and promoting iterative and flexible development practices. Evaluation and testing of iterative agile development cycles is considered an important part of the agile methodology and iterative processes for system design and re-design. However, the issue of how to effectively integrate usability testing methods into rapid and flexible agile design cycles has yet to be fully explored. In this paper we describe our application of an approach known as low-cost rapid usability testing as it has been applied within agile system development in healthcare. The advantages of the integrative approach are described, along with current methodological considerations.

  9. The Taguchi methodology as a statistical tool for biotechnological applications: a critical appraisal.

    PubMed

    Rao, Ravella Sreenivas; Kumar, C Ganesh; Prakasham, R Shetty; Hobbs, Phil J

    2008-04-01

    Success in experiments and/or technology mainly depends on a properly designed process or product. The traditional method of process optimization involves the study of one variable at a time, which requires a number of combinations of experiments that are time, cost and labor intensive. The Taguchi method of design of experiments is a simple statistical tool involving a system of tabulated designs (arrays) that allows a maximum number of main effects to be estimated in an unbiased (orthogonal) fashion with a minimum number of experimental runs. It has been applied to predict the significant contribution of the design variable(s) and the optimum combination of each variable by conducting experiments on a real-time basis. The modeling that is performed essentially relates the signal-to-noise ratio to the control variables in a 'main effect only' approach. This approach enables both multiple response and dynamic problems to be studied by handling noise factors. Taguchi principles and concepts have made extensive contributions to industry by bringing focused awareness to robustness, noise and quality. This methodology has been widely applied in many industrial sectors; however, its application in the biological sciences has been limited. In the present review, the application and comparison of the Taguchi methodology have been emphasized with specific case studies in the field of biotechnology, particularly in diverse areas like fermentation, food processing, molecular biology, wastewater treatment and bioremediation.
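The mechanics of a Taguchi analysis, an orthogonal array, a larger-is-better signal-to-noise ratio, and per-factor main effects, can be sketched briefly. The L4 array is the standard two-level design; the replicated yield data in the test are invented for illustration.

```python
import math

# L4 orthogonal array: 3 two-level factors studied in only 4 runs
L4 = [(1, 1, 1), (1, 2, 2), (2, 1, 2), (2, 2, 1)]

def sn_larger_is_better(ys):
    """Taguchi signal-to-noise ratio for a 'larger is better' response:
    S/N = -10 * log10(mean(1 / y^2)) over the replicates of one run."""
    return -10.0 * math.log10(sum(1.0 / y ** 2 for y in ys) / len(ys))

def main_effects(array, sn):
    """Mean S/N at each level of each factor; the orthogonality of the
    array makes these per-factor averages unbiased estimates."""
    effects = []
    for f in range(len(array[0])):
        lvl = {1: [], 2: []}
        for run, s in zip(array, sn):
            lvl[run[f]].append(s)
        effects.append((sum(lvl[1]) / len(lvl[1]), sum(lvl[2]) / len(lvl[2])))
    return effects
```

The factor whose two level-means differ most is the dominant one, and its better level joins the predicted optimum combination, which is then confirmed with a follow-up run.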

  10. Alternative Models for Individualized Armor Training. Part I. Interim Report: Review and Analysis of the Literature

    DTIC Science & Technology

    1980-01-01

    for an individualized instructional context is provided by Giordono (1975), in his discussion of the design of a "non-lockstep educational system...state of ATI research, summarized the methodological and theoretical problems that may have inhibited the application of ATI findings to the design ...years. In contrast, systematic modifications based on results obtained through the application of appropriate experimental designs are desired and

  11. Development of 3D pseudo pin-by-pin calculation methodology in ANC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, B.; Mayhue, L.; Huria, H.

    2012-07-01

    Advanced core and fuel assembly designs have been developed to improve operational flexibility and economic performance and to further enhance the safety features of nuclear power plants. The simulation of these new designs, along with strongly heterogeneous fuel loading, has brought new challenges to the reactor physics methodologies currently employed in the industrial codes for core analyses. Control rod insertion during normal operation is one operational feature of the AP1000(R) plant, Westinghouse's next-generation Pressurized Water Reactor (PWR) design. This design improves operational flexibility and efficiency but significantly challenges the conventional reactor physics methods, especially in pin power calculations. The mixed loading of fuel assemblies with significantly different neutron spectra causes a strong interaction between different fuel assembly types that is not fully captured with the current core design codes. To overcome the weaknesses of the conventional methods, Westinghouse has developed a state-of-the-art 3D Pin-by-Pin Calculation Methodology (P3C) and successfully implemented it in the Westinghouse core design code ANC. The new methodology has been qualified and licensed for pin power prediction. The 3D P3C methodology, along with its application and validation, is discussed in the paper. (authors)

  12. Dynamic mobility applications open source application development portal task 6.1a : architecture and high-level design task 6.1b : list of requirements included in initial architecture and high-level design.

    DOT National Transportation Integrated Search

    2016-10-12

    This document offers a detailed discussion of the systems functionality that was planned to be implemented. However, following the Agile Development methodology, during the course of system development, diligent decisions were made based on the la...

  13. VIII. THE PAST, PRESENT, AND FUTURE OF DEVELOPMENTAL METHODOLOGY.

    PubMed

    Little, Todd D; Wang, Eugene W; Gorrall, Britt K

    2017-06-01

    This chapter selectively reviews the evolution of quantitative practices in the field of developmental methodology. The chapter begins with an overview of the past in developmental methodology, discussing the implementation and dissemination of latent variable modeling and, in particular, longitudinal structural equation modeling. It then turns to the present state of developmental methodology, highlighting current methodological advances in the field. Additionally, this section summarizes ample quantitative resources, ranging from key quantitative methods journal articles to the various quantitative methods training programs and institutes. The chapter concludes with the future of developmental methodology and puts forth seven future innovations in the field. The innovations discussed span the topics of measurement, modeling, temporal design, and planned missing data designs. Lastly, the chapter closes with a brief overview of advanced modeling techniques such as continuous time models, state space models, and the application of Bayesian estimation in the field of developmental methodology. © 2017 The Society for Research in Child Development, Inc.

  14. User-Design: A Case Application in Health Care Training.

    ERIC Educational Resources Information Center

    Carr-Chellman, Alison; Cuyar, Craig; Breman, Jeroen

    1998-01-01

    Discussion of user-design as a theoretical process for creating training, software, and computer systems focuses on a case study that describes user-design methodology in home health care through the context of diffusing a new laptop patient-record-keeping system with home nurses. Research needs and implications for the design process are…

  15. Systems scenarios: a tool for facilitating the socio-technical design of work systems.

    PubMed

    Hughes, Helen P N; Clegg, Chris W; Bolton, Lucy E; Machon, Lauren C

    2017-10-01

    The socio-technical systems approach to design is well documented. Recognising the benefits of this approach, organisations are increasingly trying to work with systems, rather than their component parts. However, few tools attempt to analyse the complexity inherent in such systems in ways that generate useful, practical outputs. In this paper, we outline the 'Systems Scenarios Tool' (SST), which is a novel, applied methodology that can be used by designers, end-users, consultants or researchers to help design or re-design work systems. The paper introduces the SST using examples of its application, and describes the potential benefits of its use, before reflecting on its limitations. Finally, we discuss potential opportunities for the tool, and describe sets of circumstances in which it might be used. Practitioner Summary: The paper presents a novel, applied methodological tool, named the 'Systems Scenarios Tool'. We believe this tool can be used as a point of reference by designers, end-users, consultants or researchers, to help design or re-design work systems. Included in the paper are two worked examples, demonstrating the tool's application.

  16. Exploring the bases for a mixed reality stroke rehabilitation system, Part I: A unified approach for representing action, quantitative evaluation, and interactive feedback

    PubMed Central

    2011-01-01

    Background Although principles based in motor learning, rehabilitation, and human-computer interfaces can guide the design of effective interactive systems for rehabilitation, a unified approach that connects these key principles into an integrated design, and can form a methodology that can be generalized to interactive stroke rehabilitation, is presently unavailable. Results This paper integrates phenomenological approaches to interaction and embodied knowledge with rehabilitation practices and theories to achieve the basis for a methodology that can support effective adaptive, interactive rehabilitation. Our resulting methodology provides guidelines for the development of an action representation, quantification of action, and the design of interactive feedback. As Part I of a two-part series, this paper presents key principles of the unified approach. Part II then describes the application of this approach within the implementation of the Adaptive Mixed Reality Rehabilitation (AMRR) system for stroke rehabilitation. Conclusions The accompanying principles for composing novel mixed reality environments for stroke rehabilitation can advance the design and implementation of effective mixed reality systems for the clinical setting, and ultimately be adapted for home-based application. They furthermore can be applied to other rehabilitation needs beyond stroke. PMID:21875441

  17. Quality of Service in Networks Supporting Cultural Multimedia Applications

    ERIC Educational Resources Information Center

    Kanellopoulos, Dimitris N.

    2011-01-01

    Purpose: This paper aims to provide an overview of representative multimedia applications in the cultural heritage sector, as well as research results on quality of service (QoS) mechanisms in internet protocol (IP) networks that support such applications. Design/methodology/approach: The paper's approach is a literature review. Findings: Cultural…

  18. High-performance radial AMTEC cell design for ultra-high-power solar AMTEC systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendricks, T.J.; Huang, C.

    1999-07-01

    Alkali Metal Thermal to Electric Conversion (AMTEC) technology is rapidly maturing for potential application in the ultra-high-power solar AMTEC systems required by future US Air Force (USAF) spacecraft missions in medium-earth and geosynchronous orbits (MEO and GEO). Solar thermal AMTEC power systems potentially have several important advantages over current solar photovoltaic power systems in ultra-high-power spacecraft applications for USAF MEO and GEO missions. This work presents key aspects of radial AMTEC cell design for achieving high cell performance in solar AMTEC systems delivering more than 50 kW(e) to support high-power USAF missions. These missions typically require AMTEC cell conversion efficiency greater than 25%. A sophisticated design parameter methodology is described and demonstrated that establishes optimum design parameters in any radial cell design to satisfy high-power mission requirements. Specific relationships, which are distinct functions of cell temperatures and pressures, define critical dependencies between key cell design parameters, particularly the impact of parasitic thermal losses on Beta Alumina Solid Electrolyte (BASE) area requirements, voltage, number of BASE tubes, and system power production for both maximum power-per-BASE-area and optimum efficiency conditions. Finally, some high-level system tradeoffs are demonstrated using the design parameter methodology to establish high-power radial cell design requirements and philosophy. The discussion highlights how to incorporate this methodology with sophisticated SINDA/FLUINT AMTEC cell modeling capabilities to determine optimum radial AMTEC cell designs.

  19. Eighth DOD/NASA/FAA Conference on Fibrous Composites in Structural Design, Part 2

    NASA Technical Reports Server (NTRS)

    Starnes, James H., Jr. (Compiler); Bohon, Herman L. (Compiler); Garzon, Sherry B. (Compiler)

    1990-01-01

    Papers presented at the conference are compiled. The conference provided a forum for the scientific community to exchange composite structures design information and an opportunity to observe recent progress in composite structures design and technology. Part 2 contains papers related to the following subject areas: the application in design; methodology in design; and reliability in design.

  20. School Psychology as a Relational Enterprise: The Role and Process of Qualitative Methodology

    ERIC Educational Resources Information Center

    Newman, Daniel S.; Clare, Mary M.

    2016-01-01

    The purpose of this article is to explore the application of qualitative research to establishing a more complete understanding of relational processes inherent in school psychology practice. We identify the building blocks of rigorous qualitative research design through a conceptual overview of qualitative paradigms, methodologies, methods (i.e.,…

  1. Advanced software development workstation: Object-oriented methodologies and applications for flight planning and mission operations

    NASA Technical Reports Server (NTRS)

    Izygon, Michel

    1993-01-01

    The work accomplished during the past nine months to help three different organizations involved in Flight Planning and Mission Operations systems transition to Object-Oriented Technology, by adopting one of the most widely used Object-Oriented Analysis and Design methodologies, is summarized.

  2. Education and Training in Ethical Decision Making: Comparing Context and Orientation

    ERIC Educational Resources Information Center

    Perri, David F.; Callanan, Gerard A.; Rotenberry, Paul F.; Oehlers, Peter F.

    2009-01-01

    Purpose: The purpose of this paper is to present a teaching methodology for improving the understanding of ethical decision making. This pedagogical approach is applicable in college courses and in corporate training programs. Design/methodology/approach: Participants are asked to analyze a set of eight ethical dilemmas with differing situational…

  3. Application Development Methodology Appropriateness: An Exploratory Case Study Bridging the Gap between Framework Characteristics and Selection

    ERIC Educational Resources Information Center

    Williams, Lawrence H., Jr.

    2013-01-01

    This qualitative study analyzed experiences of twenty software developers. The research showed that all software development methodologies are distinct from each other. While some, such as waterfall, focus on traditional, plan-driven approaches that allow software requirements and design to evolve; others facilitate ambiguity and uncertainty by…

  4. Design for dependability: A simulation-based approach. Ph.D. Thesis, 1993

    NASA Technical Reports Server (NTRS)

    Goswami, Kumar K.

    1994-01-01

    This research addresses issues in simulation-based system level dependability analysis of fault-tolerant computer systems. The issues and difficulties of providing a general simulation-based approach for system level analysis are discussed, and a methodology that addresses them is presented. The proposed methodology is designed to permit the study of a wide variety of architectures under various fault conditions. It permits detailed functional modeling of architectural features such as sparing policies, repair schemes, routing algorithms as well as other fault-tolerant mechanisms, and it allows the execution of actual application software. One key benefit of this approach is that the behavior of a system under faults does not have to be pre-defined as it is normally done. Instead, a system can be simulated in detail and injected with faults to determine its failure modes. The thesis describes how object-oriented design is used to incorporate this methodology into a general purpose design and fault injection package called DEPEND. A software model is presented that uses abstractions of application programs to study the behavior and effect of software on hardware faults in the early design stage when actual code is not available. Finally, an acceleration technique that combines hierarchical simulation, time acceleration algorithms and hybrid simulation to reduce simulation time is introduced.
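The core idea, injecting random faults into a simulated architecture with a sparing policy to discover failure behaviour rather than pre-defining it, can be sketched minimally. The failure rate, module counts, and mission time below are invented, and DEPEND itself models far more detail (functional behaviour, repair schemes, application software).

```python
import random

def simulate_sparing(n_active=4, n_spares=2, fail_rate=1e-3,
                     mission=1000.0, n_runs=2000, seed=7):
    """Toy simulation-based dependability sketch: exponential fault
    arrivals are injected into the powered modules; each fault consumes
    one module and a cold spare takes over, until spares run out and the
    system drops below its required n_active modules. Returns the
    estimated probability of system failure within the mission time."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_runs):
        t, good = 0.0, n_active + n_spares
        while good >= n_active:
            # only the n_active powered modules accrue faults (cold spares don't)
            t += rng.expovariate(n_active * fail_rate)
            if t > mission:
                break            # survived the mission with spares to... spare
            good -= 1            # fault consumes one module; a spare covers it
        if good < n_active:
            failures += 1
    return failures / n_runs
```

Even this crude model reproduces the qualitative result such tools exist for: the failure modes (here, spare exhaustion) emerge from injected faults and the sparing policy rather than being specified up front, and policy changes can be compared directly.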

  5. Accelerating a MPEG-4 video decoder through custom software/hardware co-design

    NASA Astrophysics Data System (ADS)

    Díaz, Jorge L.; Barreto, Dacil; García, Luz; Marrero, Gustavo; Carballo, Pedro P.; Núñez, Antonio

    2007-05-01

    In this paper we present a novel methodology to accelerate an MPEG-4 video decoder using software/hardware co-design for wireless DAB/DMB networks. Software support includes the services provided by the embedded kernel μC/OS-II, and the application tasks mapped to software. Hardware support includes several custom co-processors and a communication architecture with bridges to the main system bus and with a dual port SRAM. Synchronization among tasks is achieved at two levels, by a hardware protocol and by kernel level scheduling services. Our reference application is an MPEG-4 video decoder composed of several software functions and written using a special C++ library named CASSE. Profiling and design space exploration techniques were previously applied to the Advanced Simple Profile (ASP) MPEG-4 decoder to determine the best HW/SW partition developed here. This research is part of the ARTEMI project and its main goal is the establishment of methodologies for the design of real-time complex digital systems using Programmable Logic Devices with embedded microprocessors as target technology and the design of multimedia systems for broadcasting networks as reference application.

  6. Methodology for designing and implementing a class for service for the transmission of medical images over a common network

    NASA Astrophysics Data System (ADS)

    Dimond, David A.; Burgess, Robert; Barrios, Nolan; Johnson, Neil D.

    2000-05-01

    Traditionally, to guarantee the network performance of medical image data transmission, imaging traffic was isolated on a separate network. Organizations are depending on a new generation of multi-purpose networks to transport both normal information and image traffic as they expand access to images throughout the enterprise. These organizations want to leverage their existing infrastructure for imaging traffic, but are not willing to accept degradations in overall network performance. To guarantee 'on demand' network performance for image transmissions anywhere at any time, networks need to be designed with the ability to 'carve out' bandwidth for specific applications and to minimize the chances of network failures. This paper will present the methodology Cincinnati Children's Hospital Medical Center (CHMC) used to enhance the physical and logical network design of the existing hospital network to guarantee a class of service for imaging traffic. PACS network designs should utilize the existing enterprise local area network (LAN) infrastructure where appropriate. Logical separation or segmentation provides the application independence from other clinical and administrative applications as required, ensuring bandwidth and service availability.

  7. pysimm: A Python Package for Simulation of Molecular Systems

    NASA Astrophysics Data System (ADS)

    Fortunato, Michael; Colina, Coray

    pysimm, short for python simulation interface for molecular modeling, is a Python package designed to facilitate the structure generation and simulation of molecular systems through convenient and programmatic access to object-oriented representations of molecular system data. This poster presents core features of pysimm and design philosophies that highlight a generalized methodology for the incorporation of third-party software packages through API interfaces. The integration with the LAMMPS simulation package is explained to demonstrate this methodology. pysimm began as a back-end Python library that powered a cloud-based application on nanohub.org for amorphous polymer simulation. The extension from a specific application library to a general-purpose simulation interface is explained. Additionally, this poster highlights the rapid development of new applications to construct polymer chains while controlling chain morphology properties such as molecular weight distribution and monomer composition.
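
    The object-oriented, adapter-style design the abstract describes can be sketched in plain Python. The classes below (Particle, MolecularSystem, LammpsAdapter) are invented for illustration and are not pysimm's actual API; they only show the pattern of wrapping system data in objects and rendering it for an external engine.

```python
# Illustrative sketch of an object-oriented molecular-system container
# plus a thin adapter that renders engine-ready input text.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Particle:
    x: float
    y: float
    z: float
    type_name: str

@dataclass
class MolecularSystem:
    particles: List[Particle] = field(default_factory=list)

    def add_particle(self, p: Particle) -> None:
        self.particles.append(p)

class LammpsAdapter:
    """Adapter that renders the system as simple input text for an engine."""
    def __init__(self, system: MolecularSystem):
        self.system = system

    def input_text(self) -> str:
        lines = [f"{len(self.system.particles)} atoms"]
        for i, p in enumerate(self.system.particles, start=1):
            lines.append(f"{i} {p.type_name} {p.x} {p.y} {p.z}")
        return "\n".join(lines)

s = MolecularSystem()
s.add_particle(Particle(0.0, 0.0, 0.0, "C"))
s.add_particle(Particle(1.54, 0.0, 0.0, "C"))
print(LammpsAdapter(s).input_text().splitlines()[0])  # "2 atoms"
```

    Because the system data lives in plain objects, a second adapter for a different simulation package could be added without touching the container classes.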

  8. Enhancing the Benefit of the Chemical Mixture Methodology: A Report on Methodology Testing and Potential Approaches for Improving Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Xiao-Ying; Yao, Juan; He, Hua

    2012-01-01

    Extensive testing shows that the current version of the Chemical Mixture Methodology (CMM) is meeting its intended mission to provide conservative estimates of the health effects from exposure to airborne chemical mixtures. However, the current version of the CMM could benefit from several enhancements that are designed to improve its application of Health Code Numbers (HCNs) and employ weighting factors to reduce over-conservatism.

  9. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    NASA Technical Reports Server (NTRS)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  10. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 3: Structure and listing of programs

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
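
    The core PFA idea, propagating uncertainty in analysis parameters through an engineering failure model to obtain a failure probability for a specific failure mode, can be sketched as a simple stress-strength Monte Carlo. The distributions and numbers below are invented for illustration and are not from the PFA documentation.

```python
# Minimal Monte Carlo sketch: uncertain strength and load parameters are
# sampled, the limit state is checked, and the failure fraction estimates
# the failure probability for this mode.
import random

def failure_probability(n_samples: int = 100_000, seed: int = 0) -> float:
    """Estimate P(stress >= strength) by sampling uncertain parameters."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        strength = rng.gauss(100.0, 10.0)  # material capability (invented units)
        stress = rng.gauss(60.0, 15.0)     # applied load (invented units)
        if stress >= strength:             # limit state violated
            failures += 1
    return failures / n_samples

p = failure_probability()
print(p)
```

    In the full methodology, such distributions would then be updated with test and flight experience rather than used in isolation.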

  11. Formal and heuristic system decomposition methods in multidisciplinary synthesis. Ph.D. Thesis, 1991

    NASA Technical Reports Server (NTRS)

    Bloebaum, Christina L.

    1991-01-01

    The multidisciplinary interactions which exist in large scale engineering design problems provide a unique set of difficulties. These difficulties are associated primarily with unwieldy numbers of design variables and constraints, and with the interdependencies of the discipline analysis modules. Such obstacles require design techniques which account for the inherent disciplinary couplings in the analyses and optimizations. The objective of this work was to develop an efficient holistic design synthesis methodology that takes advantage of the synergistic nature of integrated design. A general decomposition approach for optimization of large engineering systems is presented. The method is particularly applicable for multidisciplinary design problems which are characterized by closely coupled interactions among discipline analyses. The advantage of subsystem modularity allows for implementation of specialized methods for analysis and optimization, computational efficiency, and the ability to incorporate human intervention and decision making in the form of an expert systems capability. The resulting approach is not a method applicable to only a specific situation, but rather, a methodology which can be used for a large class of engineering design problems in which the system is non-hierarchic in nature.

  12. Application of TRIZ Methodology in Diffusion Welding System Optimization

    NASA Astrophysics Data System (ADS)

    Ravinder Reddy, N.; Satyanarayana, V. V.; Prashanthi, M.; Suguna, N.

    2017-12-01

    Welding is widely used for metal joining in manufacturing. In recent years, the diffusion welding method has significantly increased weld quality. Nevertheless, research and application progress on diffusion welding remains limited. As a result, relevant information on welding design, such as fixturing, parameter selection and integrated design, is lacking for the joining of thick and thin materials with or without interlayers. This article applies the theory of inventive problem solving (TRIZ) design method to diffusion welding design, which helps to reduce trial-and-error and failure risks in the welding process. It aims to provide welding design personnel with innovative design ideas for research and practical application.

  13. Structural Technology and Analysis Program (STAP) Delivery Order 0004: Durability Patch

    NASA Astrophysics Data System (ADS)

    Ikegami, Roy; Haugse, Eric; Trego, Angela; Rogers, Lynn; Maly, Joe

    2001-06-01

    Structural cracks in secondary structure, resulting from a high cycle fatigue (HCF) environment, are often referred to as nuisance cracks. This type of damage can result in costly inspections and repair. The repairs often do not last long because the repaired structure continues to respond in a resonant fashion to the environment. Although the use of materials for passive damping applications is well understood, there are few applications to high-cycle fatigue problems. This is because design information, such as characterization temperature, resonant response frequency, and strain levels, is difficult to determine. The Durability Patch and Damage Dosimeter Program addressed these problems by: (1) Developing a damped repair design process which includes a methodology for designing the material and application characteristics required to optimally damp the repair. (2) Designing and developing a rugged, small, and lightweight data acquisition unit called the damage dosimeter. This is a battery-operated, single-board computer, capable of collecting three channels of strain and one channel of temperature, processing this data by user-developed algorithms written in the C programming language, and storing the processed data in resident memory. The dosimeter is used to provide flight data needed to characterize the vibration environment. The vibration environment is then used to design the damping material characteristics and repair. The repair design methodology and dosimeter were demonstrated on B-52, C-130, and F-15 aircraft applications.

  14. Development of Probabilistic Life Prediction Methodologies and Testing Strategies for MEMS and CMC's

    NASA Technical Reports Server (NTRS)

    Jadaan, Osama

    2003-01-01

    This effort is to investigate probabilistic life prediction methodologies for ceramic matrix composites and MicroElectroMechanical Systems (MEMS) and to analyze designs that determine stochastic properties of MEMS. For CMCs, this includes a brief literature survey regarding lifing methodologies. Also of interest for MEMS is the design of a proper test for the Weibull size effect in thin film (bulge test) specimens. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS - similar to the GRC developed CARES/Life program for bulk ceramics. A main objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures/Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads. A second set of objectives is to determine the applicability/suitability of the CARES/Life methodology for CMC analysis, what changes would be needed to the methodology and software, and if feasible, run a demonstration problem. Also important is an evaluation of CARES/Life coupled to the ANSYS Probabilistic Design System (PDS) and the potential of coupling transient reliability analysis to the ANSYS PDS.

  15. Methodology for extracting local constants from petroleum cracking flows

    DOEpatents

    Chang, Shen-Lin; Lottes, Steven A.; Zhou, Chenn Q.

    2000-01-01

    A methodology provides for the extraction of local chemical kinetic model constants for use in a reacting flow computational fluid dynamics (CFD) computer code with chemical kinetic computations to optimize the operating conditions or design of the system, including retrofit design improvements to existing systems. The coupled CFD and kinetic computer code is used in combination with data obtained from a matrix of experimental tests to extract the kinetic constants. Local fluid dynamic effects are implicitly included in the extracted local kinetic constants for each particular application system to which the methodology is applied. The extracted local kinetic model constants work well over a fairly broad range of operating conditions for specific and complex reaction sets in specific and complex reactor systems. While disclosed in terms of use in a Fluid Catalytic Cracking (FCC) riser, the inventive methodology has application in virtually any reaction set to extract constants for any particular application and reaction set formulation. The methodology includes the steps of: (1) selecting the test data sets for various conditions; (2) establishing the general trend of the parametric effect on the measured product yields; (3) calculating product yields for the selected test conditions using coupled computational fluid dynamics and chemical kinetics; (4) adjusting the local kinetic constants to match calculated product yields with experimental data; and (5) validating the determined set of local kinetic constants by comparing the calculated results with experimental data from additional test runs at different operating conditions.
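
    Step (4), adjusting a local kinetic constant until calculated yields match experiment, can be sketched with a toy first-order conversion model. The model and names below are invented for illustration; the patented methodology fits constants through coupled CFD and full chemical kinetics, not this one-line yield expression.

```python
# Toy version of "adjust the local kinetic constant to match measured
# yield": the yield curve is monotone in k, so bisection recovers k.
import math

def predicted_yield(k: float, residence_time: float) -> float:
    # First-order conversion in a plug-flow-like element: X = 1 - exp(-k t)
    return 1.0 - math.exp(-k * residence_time)

def fit_constant(measured_yield: float, residence_time: float) -> float:
    lo, hi = 0.0, 100.0
    for _ in range(60):  # bisection on the monotone yield curve
        mid = 0.5 * (lo + hi)
        if predicted_yield(mid, residence_time) < measured_yield:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

k = fit_constant(measured_yield=0.8, residence_time=2.0)
print(round(predicted_yield(k, 2.0), 3))  # 0.8
```

    Step (5) would then re-run the model with this fitted k against data from additional operating conditions to validate it.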

  16. General Electric composite ring-disk flywheel: Recent and potential developments

    NASA Technical Reports Server (NTRS)

    Coppa, A. P.

    1984-01-01

    Recent developments of the General Electric hybrid rotor design are described. The relation of the hybrid rotor design to flywheel designs that are especially suitable for spacecraft applications is discussed. Potential performance gains that can be achieved in such rotor designs by applying latest developments in materials, processing, and design methodology are projected. Indications are that substantial improvements can be obtained.

  17. Turbine Engine Testing.

    DTIC Science & Technology

    1981-01-01

    per-rev, ring weighting factor, etc.) and with compression system design. A detailed description of the SAE methodology is provided in Ref. 1...offers insights into the practical application of experimental aeromechanical procedures and establishes the process of valid design assessment, avoiding...considerations given to the total engine system. Design Verification in the Experimental Laboratory Certain key parameters are influencing the design of modern

  18. The Engineering of Engineering Education: Curriculum Development from a Designer's Point of View

    ERIC Educational Resources Information Center

    Rompelman, Otto; De Graaff, Erik

    2006-01-01

    Engineers have a set of powerful tools at their disposal for designing robust and reliable technical systems. In educational design these tools are seldom applied. This paper explores the application of concepts from the systems approach in an educational context. The paradigms of design methodology and systems engineering appear to be suitable…

  19. System engineering toolbox for design-oriented engineers

    NASA Technical Reports Server (NTRS)

    Goldberg, B. E.; Everhart, K.; Stevens, R.; Babbitt, N., III; Clemens, P.; Stout, L.

    1994-01-01

    This system engineering toolbox is designed to provide tools and methodologies to the design-oriented systems engineer. A tool is defined as a set of procedures to accomplish a specific function. A methodology is defined as a collection of tools, rules, and postulates to accomplish a purpose. For each concept addressed in the toolbox, the following information is provided: (1) description, (2) application, (3) procedures, (4) examples, if practical, (5) advantages, (6) limitations, and (7) bibliography and/or references. The scope of the document includes concept development tools, system safety and reliability tools, design-related analytical tools, graphical data interpretation tools, a brief description of common statistical tools and methodologies, so-called total quality management tools, and trend analysis tools. Both relationship to project phase and primary functional usage of the tools are also delineated. The toolbox also includes a case study for illustrative purposes. Fifty-five tools are delineated in the text.

  20. Suggested criteria for evaluating systems engineering methodologies

    NASA Technical Reports Server (NTRS)

    Gates, Audrey; Paul, Arthur S.; Gill, Tepper L.

    1989-01-01

    Systems engineering is the application of mathematical and scientific principles to practical ends in the life-cycle of a system. A methodology for systems engineering is a carefully developed, relatively complex procedure or process for applying these mathematical and scientific principles. There are many systems engineering methodologies (or possibly many versions of a few methodologies) currently in use in government and industry. These methodologies are usually designed to meet the needs of a particular organization. It has been observed, however, that many technical and non-technical problems arise when inadequate systems engineering methodologies are applied by organizations to their systems development projects. Various criteria for evaluating systems engineering methodologies are discussed. Such criteria are developed to assist methodology-users in identifying and selecting methodologies that best fit the needs of the organization.

  1. DB4US: A Decision Support System for Laboratory Information Management

    PubMed Central

    Hortas, Maria Luisa; Baena-García, Manuel; Lana-Linati, Jorge; González, Carlos; Redondo, Maximino; Morales-Bueno, Rafael

    2012-01-01

    Background Until recently, laboratory automation has focused primarily on improving hardware. Future advances are concentrated on intelligent software since laboratories performing clinical diagnostic testing require improved information systems to address their data processing needs. In this paper, we propose DB4US, an application that automates the management of information related to laboratory quality indicators. Currently, there is a lack of ready-to-use management quality measures. This application addresses this deficiency through the extraction, consolidation, statistical analysis, and visualization of data related to the use of demographics, reagents, and turn-around times. The design and implementation issues, as well as the technologies used for the implementation of this system, are discussed in this paper. Objective To develop a general methodology that integrates the computation of ready-to-use management quality measures and a dashboard to easily analyze the overall performance of a laboratory, as well as automatically detect anomalies or errors. The novelty of our approach lies in the application of integrated web-based dashboards as an information management system in hospital laboratories. Methods We propose a new methodology for laboratory information management based on the extraction, consolidation, statistical analysis, and visualization of data related to demographics, reagents, and turn-around times, offering a dashboard-like user web interface to the laboratory manager. The methodology comprises a unified data warehouse that stores and consolidates multidimensional data from different data sources. The methodology is illustrated through the implementation and validation of DB4US, a novel web application based on this methodology that constructs an interface to obtain ready-to-use indicators, and offers the possibility to drill down from high-level metrics to more detailed summaries. 
The offered indicators are calculated beforehand so that they are ready to use when the user needs them. The design is based on a set of different parallel processes to precalculate indicators. The application displays information related to tests, requests, samples, and turn-around times. The dashboard is designed to show the set of indicators on a single screen. Results DB4US was deployed for the first time in the Hospital Costa del Sol in 2008. In our evaluation we show the positive impact of this methodology for laboratory professionals, since the use of our application has reduced the time needed for the elaboration of the different statistical indicators and has also provided information that has been used to optimize the usage of laboratory resources by the discovery of anomalies in the indicators. DB4US users benefit from Internet-based communication of results, since this information is available from any computer without having to install any additional software. Conclusions The proposed methodology and the accompanying web application, DB4US, automates the processing of information related to laboratory quality indicators and offers a novel approach for managing laboratory-related information, benefiting from an Internet-based communication mechanism. The application of this methodology has been shown to improve the usage of time, as well as other laboratory resources. PMID:23608745
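
    The precalculation idea, computing indicators ahead of time so the dashboard only reads a ready-made value, can be sketched as follows. The records and field names are invented for illustration; they are not DB4US's actual schema.

```python
# Illustrative precalculation of a turn-around-time (TAT) indicator from
# raw request records, so the dashboard reads the summary directly.
from datetime import datetime
from statistics import median

# (request id, received, result reported) -- invented sample data
requests = [
    ("R1", datetime(2008, 3, 1, 8, 0), datetime(2008, 3, 1, 9, 30)),
    ("R2", datetime(2008, 3, 1, 8, 15), datetime(2008, 3, 1, 10, 45)),
    ("R3", datetime(2008, 3, 1, 9, 0), datetime(2008, 3, 1, 9, 40)),
]

def precalculate_tat_minutes(rows):
    """Summarize turn-around times into a ready-to-use indicator."""
    tats = [(done - received).total_seconds() / 60 for _, received, done in rows]
    return {"median_tat_min": median(tats), "n": len(tats)}

indicator = precalculate_tat_minutes(requests)
print(indicator)  # {'median_tat_min': 90.0, 'n': 3}
```

    In the described system, a set of parallel processes would keep such summaries current in the data warehouse, and anomaly detection would flag indicators that drift outside expected ranges.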

  2. Self-Organisation and Capacity Building: Sustaining the Change

    ERIC Educational Resources Information Center

    Bain, Alan; Walker, Allan; Chan, Anissa

    2011-01-01

    Purpose: The paper aims to describe the application of theoretical principles derived from a study of self-organisation and complex systems theory and their application to school-based capacity building to support planned change. Design/methodology/approach: The paper employs a case example in a Hong Kong School to illustrate the application of…

  3. Aircraft optimization by a system approach: Achievements and trends

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1992-01-01

    Recently emerging methodology for optimal design of aircraft treated as a system of interacting physical phenomena and parts is examined. The methodology is found to coalesce into methods for hierarchic, non-hierarchic, and hybrid systems all dependent on sensitivity analysis. A separate category of methods has also evolved independent of sensitivity analysis, hence suitable for discrete problems. References and numerical applications are cited. Massively parallel computer processing is seen as enabling technology for practical implementation of the methodology.

  4. Mechanical system reliability for long life space systems

    NASA Technical Reports Server (NTRS)

    Kowal, Michael T.

    1994-01-01

    The creation of a compendium of mechanical limit states was undertaken in order to provide a reference base for the application of first-order reliability methods to mechanical systems in the context of the development of a system level design methodology. The compendium was conceived as a reference source specific to the problem of developing the noted design methodology, and not an exhaustive or exclusive compilation of mechanical limit states. The compendium is not intended to be a handbook of mechanical limit states for general use. The compendium provides a diverse set of limit-state relationships for use in demonstrating the application of probabilistic reliability methods to mechanical systems. The compendium is to be used in the reliability analysis of moderately complex mechanical systems.

  5. SERVQUAL Application and Adaptation for Educational Service Quality Assessments in Russian Higher Education

    ERIC Educational Resources Information Center

    Galeeva, Railya B.

    2016-01-01

    Purpose: The purpose of this study is to demonstrate an adaptation of the SERVQUAL survey method for measuring the quality of higher educational services in a Russian university context. We use a new analysis and a graphical technique for presentation of results. Design/methodology/approach: The methodology of this research follows the classic…

  6. Prototyping a Microcomputer-Based Online Library Catalog. Occasional Papers Number 177.

    ERIC Educational Resources Information Center

    Lazinger, Susan S.; Shoval, Peretz

    This report examines and evaluates the application of prototyping methodology in the design of a microcomputer-based online library catalog. The methodology for carrying out the research involves a five-part examination of the problem on both the theoretical and applied levels, each of which is discussed in a separate section as follows: (1) a…

  7. Bayesian methodology for the design and interpretation of clinical trials in critical care medicine: a primer for clinicians.

    PubMed

    Kalil, Andre C; Sun, Junfeng

    2014-10-01

    To review Bayesian methodology and its utility to clinical decision making and research in the critical care field. Clinical, epidemiological, and biostatistical studies on Bayesian methods in PubMed and Embase from their inception to December 2013. Bayesian methods have been extensively used by a wide range of scientific fields, including astronomy, engineering, chemistry, genetics, physics, geology, paleontology, climatology, cryptography, linguistics, ecology, and computational sciences. The application of medical knowledge in clinical research is analogous to the application of medical knowledge in clinical practice. Bedside physicians have to make most diagnostic and treatment decisions on critically ill patients every day without clear-cut evidence-based medicine (more subjective than objective evidence). Similarly, clinical researchers have to make most decisions about trial design with limited available data. Bayesian methodology allows both subjective and objective aspects of knowledge to be formally measured and transparently incorporated into the design, execution, and interpretation of clinical trials. In addition, various degrees of knowledge and several hypotheses can be tested at the same time in a single clinical trial without the risk of multiplicity. Notably, the Bayesian technology is naturally suited for the interpretation of clinical trial findings for the individualized care of critically ill patients and for the optimization of public health policies. We propose that the application of the versatile Bayesian methodology in conjunction with conventional statistical methods is not only ripe for actual use in critical care clinical research but is also a necessary step to maximize the performance of clinical trials and their translation to the practice of critical care medicine.
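
    The simplest instance of the Bayesian machinery the review describes is a conjugate Beta-Binomial update of a treatment response rate: prior knowledge and trial data combine into a posterior in one line. The prior and trial numbers below are invented for illustration.

```python
# Conjugate Bayesian update: Beta(a, b) prior + Binomial trial data
# gives a Beta(a + successes, b + failures) posterior.
def beta_posterior(prior_a: float, prior_b: float, successes: int, failures: int):
    return prior_a + successes, prior_b + failures

# Weakly informative Beta(2, 2) prior; 25 patients, 18 responders.
a, b = beta_posterior(2, 2, successes=18, failures=7)
posterior_mean = a / (a + b)
print(a, b, round(posterior_mean, 3))  # 20 9 0.69
```

    The posterior can then serve directly as the prior for the next interim analysis, which is how Bayesian trials incorporate accumulating evidence without the multiplicity penalty of repeated frequentist tests.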

  8. Cognitive Task Analysis, Interface Design, and Technical Troubleshooting.

    ERIC Educational Resources Information Center

    Steinberg, Linda S.; Gitomer, Drew H.

    A model of the interface design process is proposed that makes use of two interdependent levels of cognitive analysis: the study of the criterion task through an analysis of expert/novice differences and the evaluation of the working user interface design through the application of a practical interface analysis methodology (GOMS model). This dual…

  9. Developing Pedagogical Practices for English-Language Learners: A Design-Based Approach

    ERIC Educational Resources Information Center

    Iddings, Ana Christina DaSilva; Rose, Brian Christopher

    2012-01-01

    This study draws on the application of sociocultural theory to second-language learning and teaching to examine the impact of a design-based research approach on teacher development and literacy instruction to English-language learners (ELLs). Design-based research methodology was employed to derive theoretical suppositions relating to the process…

  10. A web based health technology assessment in tele-echocardiography: the experience within an Italian project.

    PubMed

    Giansanti, Daniele; Morelli, Sandra; Maccioni, Giovanni; Guerriero, Lorenzo; Bedini, Remo; Pepe, Gennaro; Colombo, Cesare; Borghi, Gabriella; Macellari, Velio

    2009-01-01

    Due to major advances in information technology, telemedicine applications are ready for widespread use. Nonetheless, to allow their diffusion in National Health Care Systems (NHCSs), specific methodologies of health technology assessment (HTA) should be used to assess standardization, overall quality, interoperability, and legal, economic and cost-benefit aspects. One of the limits to the diffusion of digital tele-echocardiography (T-E) applications in the NHCS is the lack of a specific methodology for HTA. In the present study, a solution offering a structured HTA of T-E products was designed. The methodology also assured the definition of standardized quality levels for the application. The first level represents the minimum level of acceptance; the other levels are accessory levels useful for a more accurate assessment of the product. The methodology proved useful in rationalizing the process of standardization and received a high degree of acceptance from the subjects involved in the study.

  11. The Design and Development of BMI Calc Android Application

    NASA Astrophysics Data System (ADS)

    Mohd Ali, Iliana; Samsudin, Nooraida

    2016-11-01

    Body mass index is a familiar term for those who are weight conscious. It is the term that lets users know about overall body composition in terms of fat. The available body mass index calculators, whether online or on the Play Store, do not provide Malaysian meal suggestions. Hence, this paper proposes an application for a body mass index calculator together with Malaysian meal suggestions. The objectives of the study are to design and develop the BMI Calc Android application for the purpose of calculating body mass index while embedding a meal suggestion module. The design and methodology involved in the process are also presented.
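
    The core calculation behind an app like BMI Calc can be sketched in a few lines. The category cut-offs follow the commonly used WHO bands; the meal-suggestion module is omitted here.

```python
# Body mass index: weight (kg) divided by height (m) squared,
# mapped onto the common WHO categories.
def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / (height_m ** 2)

def category(value: float) -> str:
    if value < 18.5:
        return "Underweight"
    if value < 25.0:
        return "Normal"
    if value < 30.0:
        return "Overweight"
    return "Obese"

v = bmi(70.0, 1.75)
print(round(v, 1), category(v))  # 22.9 Normal
```

    A meal-suggestion module would simply key off the returned category to select an appropriate menu.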

  12. Systematic design methodology for robust genetic transistors based on I/O specifications via promoter-RBS libraries.

    PubMed

    Lee, Yi-Ying; Hsu, Chih-Yuan; Lin, Ling-Jiun; Chang, Chih-Chun; Cheng, Hsiao-Chun; Yeh, Tsung-Hsien; Hu, Rei-Hsing; Lin, Che; Xie, Zhen; Chen, Bor-Sen

    2013-10-27

    Synthetic genetic transistors are vital for signal amplification and switching in genetic circuits. However, it is still problematic to efficiently select the adequate promoters, Ribosome Binding Sites (RBSs) and inducer concentrations to construct a genetic transistor with the desired linear amplification or switching in the Input/Output (I/O) characteristics for practical applications. Three kinds of promoter-RBS libraries, i.e., a constitutive promoter-RBS library, a repressor-regulated promoter-RBS library and an activator-regulated promoter-RBS library, are constructed for systematic genetic circuit design using the identified kinetic strengths of their promoter-RBS components. According to the dynamic model of genetic transistors, a design methodology for genetic transistors via a Genetic Algorithm (GA)-based searching algorithm is developed to search for a set of promoter-RBS components and adequate concentrations of inducers to achieve the prescribed I/O characteristics of a genetic transistor. Furthermore, according to design specifications for different types of genetic transistors, a look-up table is built for genetic transistor design, from which we could easily select an adequate set of promoter-RBS components and adequate concentrations of external inducers for a specific genetic transistor. This systematic design method will reduce the time spent using trial-and-error methods in the experimental procedure for a genetic transistor with a desired I/O characteristic. We demonstrate the applicability of our design methodology to genetic transistors that have desirable linear amplification or switching by employing promoter-RBS library searching.
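
    The GA-based library search can be sketched as follows. Everything here is invented for illustration: the toy transistor model (gain taken as promoter strength times RBS strength), the library values, and the target gain stand in for the paper's dynamic model, identified kinetic strengths, and prescribed I/O specification.

```python
# Tiny genetic algorithm that picks one promoter and one RBS from
# discrete libraries so a toy transistor gain matches a target.
import random

PROMOTERS = [0.5, 1.0, 2.0, 4.0, 8.0]   # promoter kinetic strengths (a.u.)
RBSS = [0.2, 0.5, 1.0, 2.0]             # RBS kinetic strengths (a.u.)

def gain(promoter: float, rbs: float) -> float:
    """Toy steady-state amplification of the transistor."""
    return promoter * rbs

def fitness(ind, target):
    return -abs(gain(PROMOTERS[ind[0]], RBSS[ind[1]]) - target)

def ga_search(target, pop_size=12, generations=40, seed=1):
    rng = random.Random(seed)
    pop = [(rng.randrange(len(PROMOTERS)), rng.randrange(len(RBSS)))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: fitness(ind, target), reverse=True)
        parents = pop[:pop_size // 2]           # elitist selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = rng.sample(parents, 2)
            child = (a[0], b[1])                # crossover of the two genes
            if rng.random() < 0.3:              # mutation: resample both genes
                child = (rng.randrange(len(PROMOTERS)),
                         rng.randrange(len(RBSS)))
            children.append(child)
        pop = parents + children
    best = max(pop, key=lambda ind: fitness(ind, target))
    return PROMOTERS[best[0]], RBSS[best[1]]

promoter, rbs = ga_search(target=4.0)
print(promoter, rbs)
```

    Running the search over a grid of targets would yield exactly the kind of look-up table the abstract describes, mapping a desired I/O specification to a component pair.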

  13. Systematic design methodology for robust genetic transistors based on I/O specifications via promoter-RBS libraries

    PubMed Central

    2013-01-01

    Background Synthetic genetic transistors are vital for signal amplification and switching in genetic circuits. However, it is still problematic to efficiently select the adequate promoters, Ribosome Binding Sites (RBSs) and inducer concentrations to construct a genetic transistor with the desired linear amplification or switching in the Input/Output (I/O) characteristics for practical applications. Results Three kinds of promoter-RBS libraries, i.e., a constitutive promoter-RBS library, a repressor-regulated promoter-RBS library and an activator-regulated promoter-RBS library, are constructed for systematic genetic circuit design using the identified kinetic strengths of their promoter-RBS components. According to the dynamic model of genetic transistors, a design methodology for genetic transistors via a Genetic Algorithm (GA)-based searching algorithm is developed to search for a set of promoter-RBS components and adequate concentrations of inducers to achieve the prescribed I/O characteristics of a genetic transistor. Furthermore, according to design specifications for different types of genetic transistors, a look-up table is built for genetic transistor design, from which we could easily select an adequate set of promoter-RBS components and adequate concentrations of external inducers for a specific genetic transistor. Conclusion This systematic design method will reduce the time spent using trial-and-error methods in the experimental procedure for a genetic transistor with a desired I/O characteristic. We demonstrate the applicability of our design methodology to genetic transistors that have desirable linear amplification or switching by employing promoter-RBS library searching. PMID:24160305

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Janjusic, Tommy; Kartsaklis, Christos

    Memory scalability is an enduring problem and bottleneck that plagues many parallel codes. Parallel codes designed for High Performance Systems are typically designed over the span of several, and in some instances 10+, years. As a result, optimization practices which were appropriate for earlier systems may no longer be valid and thus require careful optimization consideration. Specifically, parallel codes whose memory footprint is a function of their scalability must be carefully considered for future exa-scale systems. In this paper we present a methodology and tool to study the memory scalability of parallel codes. Using our methodology we evaluate an application's memory footprint as a function of scalability, which we coined memory efficiency, and describe our results. In particular, using our in-house tools we can pinpoint the specific application components which contribute to the application's overall memory footprint (application data structures, libraries, etc.).

  15. Systematic design of membership functions for fuzzy-logic control: A case study on one-stage partial nitritation/anammox treatment systems.

    PubMed

    Boiocchi, Riccardo; Gernaey, Krist V; Sin, Gürkan

    2016-10-01

    A methodology is developed to systematically design the membership functions of fuzzy-logic controllers for multivariable systems. The methodology consists of a systematic derivation of the critical points of the membership functions as a function of predefined control objectives. Several constrained optimization problems corresponding to different qualitative operation states of the system are defined and solved to identify, in a consistent manner, the critical points of the membership functions for the input variables. The consistently identified critical points, together with the linguistic rules, determine the long term reachability of the control objectives by the fuzzy logic controller. The methodology is highlighted using a single-stage side-stream partial nitritation/Anammox reactor as a case study. As a result, a new fuzzy-logic controller for high and stable total nitrogen removal efficiency is designed. Rigorous simulations are carried out to evaluate and benchmark the performance of the controller. The results demonstrate that the novel control strategy is capable of rejecting the long-term influent disturbances, and can achieve a stable and high TN removal efficiency. Additionally, the controller was tested, and showed robustness, against measurement noise levels typical for wastewater sensors. A feedforward-feedback configuration using the present controller would give even better performance. In comparison, a previously developed fuzzy-logic controller using merely expert and intuitive knowledge performed worse. This proved the importance of using a systematic methodology for the derivation of the membership functions for multivariable systems. These results are promising for future applications of the controller in real full-scale plants. Furthermore, the methodology can be used as a tool to help systematically design fuzzy logic control applications for other biological processes. Copyright © 2016 Elsevier Ltd. All rights reserved.
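The membership functions in such controllers are typically piecewise-linear and fully determined by a few critical points. The sketch below shows a triangular membership function parameterized by those points; the numeric critical points here are illustrative placeholders, whereas in the methodology above they are the solutions of constrained optimization problems tied to the control objectives.

```python
def tri(x, a, b, c):
    """Triangular membership function with critical points a <= b <= c.
    Setting a == b or b == c produces a left/right shoulder."""
    if x < a or x > c:
        return 0.0
    if x <= b:
        return 1.0 if a == b else (x - a) / (b - a)
    return 1.0 if b == c else (c - x) / (c - b)

# Illustrative critical points for one input variable (e.g. effluent ammonium);
# in the paper these points come from solving constrained optimization problems.
LOW  = (0.0, 0.0, 2.0)   # left shoulder: fully "low" at 0, fades out by 2
HIGH = (1.0, 3.0, 3.0)   # right shoulder: starts at 1, fully "high" from 3

x = 1.5
degrees = {"low": tri(x, *LOW), "high": tri(x, *HIGH)}
```

The `degrees` dictionary is what the linguistic rules would then combine to produce a control action.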

  16. An evaluation of the directed flow graph methodology

    NASA Technical Reports Server (NTRS)

    Snyder, W. E.; Rajala, S. A.

    1984-01-01

    The applicability of the Directed Graph Methodology (DGM) to the design and analysis of special purpose image and signal processing hardware was evaluated. A special purpose image processing system was designed and described using DGM. The design, suitable for very large scale integration (VLSI), implements a region labeling technique. Two computer chips were designed, both using metal-nitride-oxide-silicon (MNOS) technology, as well as a functional system utilizing those chips to perform real time region labeling. The system is described in terms of DGM primitives. As it is currently implemented, DGM is inappropriate for describing synchronous, tightly coupled, special purpose systems. The nature of the DGM formalism lends itself more readily to modeling networks of general purpose processors.

  17. Innovative Technologies for Multicultural Education Needs

    ERIC Educational Resources Information Center

    Ferdig, Richard E.; Coutts, Jade; DiPietro, Joseph; Lok, Benjamin; Davis, Niki

    2007-01-01

    Purpose: The purpose of this paper is to discuss several technology applications that are being used to address current problems or opportunities related to multicultural education. Design/methodology/approach: Five technology applications or technology-related projects are discussed, including a teacher education literacy tool, social networking…

  18. Data Sufficiency Assessment and Pumping Test Design for Groundwater Prediction Using Decision Theory and Genetic Algorithms

    NASA Astrophysics Data System (ADS)

    McPhee, J.; William, Y. W.

    2005-12-01

    This work presents a methodology for pumping test design based on the reliability requirements of a groundwater model. Reliability requirements take into consideration the application of the model results in groundwater management, expressed in this case as a multiobjective management model. The pumping test design is formulated as a mixed-integer nonlinear programming (MINLP) problem and solved using a combination of genetic algorithm (GA) and gradient-based optimization. Bayesian decision theory provides a formal framework for assessing the influence of parameter uncertainty over the reliability of the proposed pumping test. The proposed methodology is useful for selecting a robust design that will outperform all other candidate designs under most potential 'true' states of the system.
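The decision-theoretic core of such a design comparison can be sketched with a discrete set of candidate designs and "true" parameter states. The priors and losses below are invented placeholders, not values from the study; they only illustrate how Bayesian decision theory ranks designs under parameter uncertainty.

```python
# Hypothetical candidate pumping-test designs, discrete "true" parameter states
# (e.g. hydraulic conductivity regimes), priors, and a prediction-loss table.
PRIOR = {"low_K": 0.3, "mid_K": 0.5, "high_K": 0.2}
LOSS = {
    "design_A": {"low_K": 4.0, "mid_K": 1.0, "high_K": 6.0},
    "design_B": {"low_K": 2.5, "mid_K": 2.0, "high_K": 2.5},
}

def expected_loss(design):
    """Bayes risk of a design: prior-weighted loss over possible true states."""
    return sum(PRIOR[s] * LOSS[design][s] for s in PRIOR)

def select_design(designs):
    """Pick the design with the smallest expected loss."""
    return min(designs, key=expected_loss)
```

Here `design_B` wins: it is never the best in any single state, but it performs acceptably in all of them, which is exactly the robustness property the abstract describes.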

  19. A methodology for accident analysis of fusion breeder blankets and its application to helium-cooled lead–lithium blanket

    DOE PAGES

    Panayotov, Dobromir; Poitevin, Yves; Grief, Andrew; ...

    2016-09-23

    'Fusion for Energy' (F4E) is designing, developing, and implementing the European Helium-Cooled Lead-Lithium (HCLL) and Helium-Cooled Pebble-Bed (HCPB) Test Blanket Systems (TBSs) for ITER (Nuclear Facility INB-174). Safety demonstration is an essential element for the integration of these TBSs into ITER and accident analysis is one of its critical components. A systematic approach to accident analysis has been developed under the F4E contract on TBS safety analyses. F4E technical requirements, together with Amec Foster Wheeler and INL efforts, have resulted in a comprehensive methodology for fusion breeding blanket accident analysis that addresses the specificity of the breeding blanket designs, materials, and phenomena while remaining consistent with the approach already applied to ITER accident analyses. Furthermore, the methodology phases are illustrated in the paper by its application to the EU HCLL TBS using both MELCOR and RELAP5 codes.

  20. The TMIS life-cycle process document, revision A

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The Technical and Management Information System (TMIS) Life-Cycle Process Document describes the processes that shall be followed in the definition, design, development, test, deployment, and operation of all TMIS products and data base applications. This document is a roll out of TMIS Standards Document (SSP 30546). The purpose of this document is to define the life cycle methodology that the developers of all products and data base applications and any subsequent modifications shall follow. Included in this methodology are descriptions of the tasks, deliverables, reviews, and approvals that are required before a product or data base application is accepted in the TMIS environment.

  1. A goal programming approach for a joint design of macroeconomic and environmental policies: a methodological proposal and an application to the Spanish economy.

    PubMed

    André, Francisco J; Cardenete, M Alejandro; Romero, Carlos

    2009-05-01

    The economic policy needs to pay increasingly more attention to environmental issues, which requires the development of methodologies able to incorporate environmental, as well as macroeconomic, goals in the design of public policies. Starting from this observation, this article proposes a methodology based upon a Simonian satisficing logic, made operational with the help of goal programming (GP) models, to address the joint design of macroeconomic and environmental policies. The methodology is applied to the Spanish economy, where a joint policy is elicited, taking into consideration macroeconomic goals (economic growth, inflation, unemployment, public deficit) and environmental goals (CO2, NOx and SOx emissions) within the context of a computable general equilibrium model. The results show how the government can "fine-tune" its policy according to different criteria using GP models. The resulting policies aggregate the environmental and the economic goals in different ways: maximum aggregate performance, maximum balance and a lexicographic hierarchy of the goals.
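A weighted goal programming score of the kind described can be sketched as follows. The goals, weights, and candidate policy outcomes are invented placeholders, not figures from the Spanish application; the point is only how unwanted deviations from targets are penalized and aggregated.

```python
# Hypothetical goals as (target, weight, direction): "max" goals penalize
# shortfall below the target, "min" goals penalize overshoot above it.
GOALS = {
    "growth_pct":    (3.0, 1.0, "max"),
    "inflation_pct": (2.0, 1.0, "min"),
    "co2_index":     (100.0, 2.0, "min"),
}

def weighted_deviation(outcome):
    """Weighted sum of unwanted goal deviations; lower is better (satisficing)."""
    total = 0.0
    for goal, (target, weight, direction) in GOALS.items():
        v = outcome[goal]
        dev = max(0.0, target - v) if direction == "max" else max(0.0, v - target)
        total += weight * dev
    return total

# Two hypothetical policy simulations (e.g. outcomes of a CGE model run):
policies = {
    "policy_1": {"growth_pct": 2.5, "inflation_pct": 2.2, "co2_index": 98.0},
    "policy_2": {"growth_pct": 3.1, "inflation_pct": 2.6, "co2_index": 103.0},
}
best_policy = min(policies, key=lambda p: weighted_deviation(policies[p]))
```

Changing the weights is the "fine-tuning" lever the abstract mentions; a lexicographic variant would instead minimize deviations goal by goal in priority order.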

  2. Space station definitions, design, and development. Task 5: Multiple arm telerobot coordination and control: Manipulator design methodology

    NASA Technical Reports Server (NTRS)

    Stoughton, R. M.

    1990-01-01

    A proposed methodology applicable to the design of manipulator systems is described. The current design process is especially weak in the preliminary design phase, since there is no accepted measure to be used in trading off different options available for the various subsystems. The design process described uses Cartesian End-Effector Impedance as a measure of performance for the system. Having this measure of performance, it is shown how it may be used to determine the trade-offs necessary in the preliminary design phase. The design process involves three main parts: (1) determination of desired system performance in terms of End-Effector Impedance; (2) trade-off of design options to achieve this desired performance; and (3) verification of system performance through laboratory testing. The design process is developed using numerous examples and experiments to demonstrate the feasibility of this approach to manipulator design.

  3. Software design and documentation language

    NASA Technical Reports Server (NTRS)

    Kleine, H.

    1977-01-01

    A communications medium to support the design and documentation of complex software applications is studied. The medium also provides the following: (1) a processor which can convert design specifications into an intelligible, informative machine reproducible document; (2) a design and documentation language with forms and syntax that are simple, unrestrictive, and communicative; and (3) methodology for effective use of the language and processor.

  4. A methodological approach for designing a usable ontology-based GUI in healthcare.

    PubMed

    Lasierra, N; Kushniruk, A; Alesanco, A; Borycki, E; García, J

    2013-01-01

    This paper presents a methodological approach to the design and evaluation of an interface for an ontology-based system used for designing care plans for monitoring patients at home. In order to define the care plans, physicians need a tool for creating instances of the ontology and configuring some rules. Our purpose is to develop an interface to allow clinicians to interact with the ontology. Although ontology-driven applications do not necessarily present the ontology in the user interface, it is our hypothesis that showing selected parts of the ontology in a "usable" way could enhance clinicians' understanding and make the definition of the care plans easier. Based on prototyping and iterative testing, this methodology combines visualization techniques and usability methods. Preliminary results obtained after a formative evaluation indicate the effectiveness of the suggested combination.

  5. An automated methodology development. [software design for combat simulation

    NASA Technical Reports Server (NTRS)

    Hawley, L. R.

    1985-01-01

    The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life cycle costs and ease program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real time, and one-to-one correlation between the object states and real-world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance code efficiency by, e.g., eliminating unused subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.

  6. Compact sieve-tray distillation column for ammonia-water absorption heat pump: Part 1 -- Design methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anand, G.; Erickson, D.C.

    1999-07-01

    The distillation column is a key component of ammonia-water absorption units including advanced generator-absorber heat exchange (GAX) cycle heat pumps. The design of the distillation column is critical to unit performance, size, and cost. The distillation column can be designed with random packing, structured packing, or various tray configurations. A sieve-tray distillation column is the least complicated tray design and is less costly than high-efficiency packing. Substantial literature is available on sieve tray design and performance. However, most of the correlations and design recommendations were developed for large industrial hydrocarbon systems and are generally not directly applicable to the compact ammonia-water column discussed here. The correlations were reviewed and modified as appropriate for this application, and a sieve-tray design model was developed. This paper presents the sieve-tray design methodology for highly compact ammonia-water columns. A conceptual design of the distillation column for an 8 ton vapor exchange (VX) GAX heat pump is presented, illustrating relevant design parameters and trends. The design process revealed several issues that have to be investigated experimentally to design the final optimized rectifier. Validation of flooding and weeping limits and tray/point efficiencies are of primary importance.

  7. RLV Turbine Performance Optimization

    NASA Technical Reports Server (NTRS)

    Griffin, Lisa W.; Dorney, Daniel J.

    2001-01-01

    A task was developed at NASA/Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. There are four major objectives of this task: 1) to develop, enhance, and integrate advanced turbine aerodynamic design and analysis tools; 2) to develop the methodology for application of the analytical techniques; 3) to demonstrate the benefits of the advanced turbine design procedure through its application to a relevant turbine design point; and 4) to verify the optimized design and analysis with testing. Final results of the preliminary design and the results of the two-dimensional (2D) detailed design of the first-stage vane of a supersonic turbine suitable for a reusable launch vehicle (RLV) are presented. Analytical techniques for obtaining the results are also discussed.

  8. Data Mining for Financial Applications

    NASA Astrophysics Data System (ADS)

    Kovalerchuk, Boris; Vityaev, Evgenii

    This chapter describes Data Mining in finance by discussing financial tasks, specifics of methodologies and techniques in this Data Mining area. It includes time dependence, data selection, forecast horizon, measures of success, quality of patterns, hypothesis evaluation, problem ID, method profile, attribute-based and relational methodologies. The second part of the chapter discusses Data Mining models and practice in finance. It covers use of neural networks in portfolio management, design of interpretable trading rules and discovering money laundering schemes using decision rules and relational Data Mining methodology.
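As a toy example of the "interpretable trading rules" idea mentioned above, the following sketch implements a classic moving-average crossover rule. This particular rule is an assumption chosen for illustration; the chapter's actual rules are produced by decision-rule and relational Data Mining methods.

```python
def sma(prices, window):
    """Simple moving average over the trailing window."""
    return sum(prices[-window:]) / window

def trading_rule(prices, short=3, long=5):
    """Interpretable crossover rule: buy when the short-term average is above
    the long-term average, sell when below, otherwise hold."""
    if len(prices) < long:
        return "hold"       # not enough history to evaluate the rule
    s, l = sma(prices, short), sma(prices, long)
    return "buy" if s > l else "sell" if s < l else "hold"
```

Such a rule is "interpretable" in the sense the chapter uses: its trading decision can be stated in one sentence, unlike the output of an opaque neural network.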

  9. Hypersonics. Volume 1 - Defining the hypersonic environment; Proceedings of the First Joint Europe/U.S. Short Course on Hypersonics, Paris, France, Dec. 7-11, 1987

    NASA Astrophysics Data System (ADS)

    Bertin, John J.; Glowinski, Roland; Periaux, Jacques

    1989-05-01

    The present work discusses the general characterization of hypersonic flows, the hypersonic phenomena to be encountered by the Hermes spacecraft, industrial methodologies for the design of hypersonic vehicles, the definition of aerodynamic methodology, and hypersonic airbreathing-propulsion vehicle design practices applicable to the U.S. National Aerospace Plane. Also discussed are real gas effects in the hypersonic regime, the influence of thermochemistry and of nonequilibrium and surface catalysis on hypersonic vehicle design, the modelling of nonequilibrium effects in high speed flows, air-dissociation thermochemistry, and rarefied gas dynamics effects for spacecraft.

  10. An overview of key technology thrusts at Bell Helicopter Textron

    NASA Technical Reports Server (NTRS)

    Harse, James H.; Yen, Jing G.; Taylor, Rodney S.

    1988-01-01

    Insight is provided into several key technologies at Bell. Specific topics include the results of ongoing research and development in advanced rotors, methodology development, and new configurations. The discussion on advanced rotors highlights developments on the composite, bearingless rotor, including the development and testing of full-scale flight hardware as well as some of the design support analyses and verification testing. The discussion on methodology development concentrates on analytical development in aeromechanics, including correlation studies and design application. The discussion on new configurations presents the results of some advanced configuration studies, including hardware development.

  11. Fault Modeling of Extreme Scale Applications Using Machine Learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vishnu, Abhinav; Dam, Hubertus van; Tallent, Nathan R.

    Faults are commonplace in large scale systems. These systems experience a variety of faults such as transient, permanent and intermittent. Multi-bit faults are typically not corrected by the hardware resulting in an error. Here, this paper attempts to answer an important question: Given a multi-bit fault in main memory, will it result in an application error — and hence a recovery algorithm should be invoked — or can it be safely ignored? We propose an application fault modeling methodology to answer this question. Given a fault signature (a set of attributes comprising of system and application state), we use machine learning to create a model which predicts whether a multi-bit permanent/transient main memory fault will likely result in error. We present the design elements such as the fault injection methodology for covering important data structures, the application and system attributes which should be used for learning the model, the supervised learning algorithms (and potentially ensembles), and important metrics. Lastly, we use three applications — NWChem, LULESH and SVM — as examples for demonstrating the effectiveness of the proposed fault modeling methodology.
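The predictive core described above can be caricatured with a tiny hand-rolled learner. The sketch below trains a one-feature decision stump on made-up fault signatures; the attributes, data, and learner are all invented for illustration, whereas the real methodology uses richer system/application attributes and standard supervised learning algorithms (and potentially ensembles).

```python
# Toy fault signatures: (bits_flipped, hits_live_data_structure) -> error (1/0).
# Invented stand-in for the paper's attributes of system and application state.
SIGNATURES = [
    ((1, 0), 0), ((2, 0), 0), ((1, 1), 1), ((3, 1), 1),
    ((2, 1), 1), ((4, 0), 0), ((3, 0), 0), ((5, 1), 1),
]

def train_stump(data):
    """Pick the single (feature, threshold) split with fewest training errors."""
    best = None
    for f in range(len(data[0][0])):
        for thresh in sorted({x[f] for x, _ in data}):
            errs = sum((x[f] >= thresh) != bool(y) for x, y in data)
            if best is None or errs < best[0]:
                best = (errs, f, thresh)
    return best[1], best[2]

feat, thresh = train_stump(SIGNATURES)

def will_cause_error(signature):
    """1 -> invoke the recovery algorithm; 0 -> the fault can likely be ignored."""
    return int(signature[feat] >= thresh)
```

On this toy data the stump learns that faults landing in live data structures cause errors, regardless of how many bits flip.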

  12. Fault Modeling of Extreme Scale Applications Using Machine Learning

    DOE PAGES

    Vishnu, Abhinav; Dam, Hubertus van; Tallent, Nathan R.; ...

    2016-05-01

    Faults are commonplace in large scale systems. These systems experience a variety of faults such as transient, permanent and intermittent. Multi-bit faults are typically not corrected by the hardware resulting in an error. Here, this paper attempts to answer an important question: Given a multi-bit fault in main memory, will it result in an application error — and hence a recovery algorithm should be invoked — or can it be safely ignored? We propose an application fault modeling methodology to answer this question. Given a fault signature (a set of attributes comprising of system and application state), we use machine learning to create a model which predicts whether a multi-bit permanent/transient main memory fault will likely result in error. We present the design elements such as the fault injection methodology for covering important data structures, the application and system attributes which should be used for learning the model, the supervised learning algorithms (and potentially ensembles), and important metrics. Lastly, we use three applications — NWChem, LULESH and SVM — as examples for demonstrating the effectiveness of the proposed fault modeling methodology.

  13. 77 FR 8288 - Applications and Amendments to Facility Operating Licenses Involving Proposed No Significant...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-14

    ... not listed on the Web site, but should note that the NRC's E-Filing system does not support unlisted... (COLR), to update the methodology reference list to support the core design with the new AREVA fuel... methodologies listed in Technical Specification 5.7.1.5 has no impact on any plant configuration or system...

  14. Methodology for object-oriented real-time systems analysis and design: Software engineering

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structurings of the system, so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation but the original specification, and perhaps the high-level design, is not object-oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects whose operations or methods correspond to processes in the data flow diagrams, and then design in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time-behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced as the basic building block of a series of seamlessly connected models which progress from logical systems-analysis models through physical architectural models to the high-level design stages. The methodology covers the overall specification, including both hardware and software modules; in software modules, the systems-analysis objects are transformed into software objects.

  15. Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD)

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.

    2015-01-01

    This manual (v.1.2) describes Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD). The capability of an inspection system is established by applying various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that there is 95% confidence that the POD is greater than 90% (90/95 POD). DOEPOD is a methodology, implemented via software, that serves as a diagnostic tool providing detailed analysis of POD test data, guidance on establishing data distribution requirements, and resolution of test issues. DOEPOD relies on observed occurrences rather than assumed functional forms. It has been developed to provide an efficient and accurate methodology that yields observed POD and confidence bounds for both hit/miss and signal-amplitude testing. DOEPOD does not assume prescribed POD logarithmic or similar functions with assumed adequacy over a wide range of flaw sizes and inspection system technologies, so multi-parameter curve fitting or model optimization approaches to generate a POD curve are not required. DOEPOD applications for supporting inspector qualifications are included.
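The 90/95 POD metric quoted above has a standard exact-binomial reading for a zero-miss demonstration, which can be checked in a few lines. This is the textbook argument behind the metric (it yields the well-known 29-of-29 result), not DOEPOD's own analysis procedure.

```python
def confidence_pod_exceeds(pod_floor, hits, trials):
    """For a zero-miss demonstration (hits == trials): if the true POD were only
    pod_floor, observing all hits has probability pod_floor**trials, so the
    confidence that POD > pod_floor is 1 - pod_floor**trials."""
    assert hits == trials, "shortcut valid only for zero-miss demonstrations"
    return 1.0 - pod_floor ** trials

# Smallest zero-miss sample size demonstrating 90% POD at 95% confidence:
n_90_95 = next(k for k in range(1, 200)
               if confidence_pod_exceeds(0.9, k, k) >= 0.95)
```

With 28 consecutive hits the confidence falls just short of 95%, which is why 29 flawed specimens, all detected, is the classic minimum demonstration.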

  16. Sustainable design and manufacturing of multifunctional polymer nanocomposite coatings: A multiscale systems approach

    NASA Astrophysics Data System (ADS)

    Xiao, Jie

    Polymer nanocomposites have a great potential to be a dominant coating material in a wide range of applications in the automotive, aerospace, ship-making, construction, and pharmaceutical industries. However, how to realize design sustainability of this type of nanostructured materials and how to ensure the true optimality of the product quality and process performance in coating manufacturing remain open challenges. The major challenges arise from the intrinsic multiscale nature of the material-process-product system and the need to manipulate the high levels of complexity and uncertainty in design and manufacturing processes. This research centers on the development of a comprehensive multiscale computational methodology and a computer-aided tool set that can facilitate multifunctional nanocoating design and application from novel function envisioning and idea refinement, to knowledge discovery and design solution derivation, and further to performance testing in industrial applications and life cycle analysis. The principal idea is to achieve exceptional system performance through concurrent characterization and optimization of materials, product and associated manufacturing processes covering a wide range of length and time scales. Multiscale modeling and simulation techniques ranging from microscopic molecular modeling to classical continuum modeling are seamlessly coupled. The tight integration of different methods and theories at individual scales allows the prediction of macroscopic coating performance from the fundamental molecular behavior. Goal-oriented design is also pursued by integrating additional methods for bio-inspired dynamic optimization and computational task management that can be implemented in a hierarchical computing architecture. Furthermore, multiscale systems methodologies are developed to achieve the best possible material application towards sustainable manufacturing.
Automotive coating manufacturing, which involves paint spray and curing, is specifically discussed in this dissertation. Nevertheless, the multiscale considerations for sustainable manufacturing, the novel concept of IPP control, and the new PPDE-based optimization method are applicable to other types of manufacturing, e.g., metal coating development through electroplating. It is demonstrated that the methodological developments in this dissertation can greatly facilitate experimentalists in novel material invention and new knowledge discovery. At the same time, they can provide scientific guidance and reveal various new opportunities and effective strategies for sustainable manufacturing.

  17. Wiki-Based Rapid Prototyping for Teaching-Material Design in E-Learning Grids

    ERIC Educational Resources Information Center

    Shih, Wen-Chung; Tseng, Shian-Shyong; Yang, Chao-Tung

    2008-01-01

    Grid computing environments with abundant resources can support innovative e-Learning applications, and are promising platforms for e-Learning. To support individualized and adaptive learning, teachers are encouraged to develop various teaching materials according to different requirements. However, traditional methodologies for designing teaching…

  18. Developing Supply Chain Management Program: A Competency Model

    ERIC Educational Resources Information Center

    Sauber, Matthew H.; McSurely, Hugh B.; Tummala, V. M. Rao

    2008-01-01

    Purpose: This paper aims to show the process of designing and measuring learning competencies in program development. Design/methodology/approach: The paper includes cross-sectoral comparisons to draw on programmatic and pedagogical strategies, more commonly utilized in vocational education, and transfer the application of these strategies into…

  19. Applying Case-Based Method in Designing Self-Directed Online Instruction: A Formative Research Study

    ERIC Educational Resources Information Center

    Luo, Heng; Koszalka, Tiffany A.; Arnone, Marilyn P.; Choi, Ikseon

    2018-01-01

    This study investigated the case-based method (CBM) instructional-design theory and its application in designing self-directed online instruction. The purpose of this study was to validate and refine the theory for a self-directed online instruction context. Guided by formative research methodology, this study first developed an online tutorial…

  20. Coupled thermal, electrical, and fluid flow analyses of AMTEC converters, with illustrative application to OSC`s cell design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schock, A.; Noravian, H.; Or, C.

    1997-12-31

    This paper presents the background and introduction to the OSC AMTEC (Alkali Metal Thermal-to-Electrical Conversion) studies, which were conducted for the Department of energy (DOE) and NASA`s jet Propulsion Laboratory (JPL). After describing the basic principle of AMTEC, the paper describes and explains the operation of multi-tube vapor/vapor cells, which have been under development by AMPS (Advance Modular Power Systems, Inc.) for the Air Force Phillips Laboratory (AFPL) and JPL for possible application to the Europa Orbiter, Pluto Express, and other space missions. It then describes a novel OSC-generated methodology for analyzing the performance of such cells. This methodology consistsmore » of an iterative procedure for the coupled solution of the interdependent thermal, electrical, and fluid flow differential and integral equations governing the performance of AMTEC cells and generators, taking proper account of the non-linear axial variations of temperature, pressure, open-circuit voltage, inter-electrode voltages, current density, axial current, sodium mass flow rate, and power density. The paper illustrates that analytical procedure by applying it to OSC`s latest cell design and by presenting detailed analytical results for that design. The OSC-developed analytic methodology constitutes a unique and powerful tool for accurate parametric analyses and design optimizations of the multi-tube AMTEC cells and of radioisotope power systems. This is illustrated in two companion papers in these proceedings. The first of those papers applies the OSC-derived program to determine the effect of various design parameters on the performance of single AMTEC cells with adiabatic side walls, culminating in an OSC-recommended revised cell design. 
The second describes a number of OSC-generated AMTEC generator designs consisting of 2 and 3 GPHS heat source modules, 16 multi-tube converter cells, and a hybrid insulation design, and presents the results of applying the above analysis program to determine the applicability of those generators to possible future missions under consideration by NASA.
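    The "iterative procedure for the coupled solution" described above can be reduced to its bare structure as a fixed-point loop between the coupled equations. The sketch below is a toy illustration only: the real model couples full thermal, electrical, and fluid-flow equations with axial variation, and every coefficient here is invented.

```python
# Toy sketch of an iterative coupled solution: alternate between a "thermal"
# update and an "electrical" update until both stop changing. All numbers
# are invented; the real AMTEC model is far more detailed.
def solve_coupled(T0=900.0, I0=1.0, tol=1e-9, max_iter=200):
    T, I = T0, I0
    for _ in range(max_iter):
        T_new = 900.0 - 20.0 * I   # toy thermal balance: current removes heat as work
        I_new = 0.002 * T_new      # toy electrical relation: current rises with T
        if abs(T_new - T) < tol and abs(I_new - I) < tol:
            return T_new, I_new
        T, I = T_new, I_new
    raise RuntimeError("coupled iteration did not converge")

T, I = solve_coupled()  # converges because the coupling here is a contraction
```

    Convergence of such loops depends on the coupling being weak enough (a contraction); strongly coupled systems typically need relaxation factors or a simultaneous Newton solve instead.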

  1. Efficient Robust Optimization of Metal Forming Processes using a Sequential Metamodel Based Strategy

    NASA Astrophysics Data System (ADS)

    Wiebenga, J. H.; Klaseboer, G.; van den Boogaard, A. H.

    2011-08-01

    The coupling of Finite Element (FE) simulations to mathematical optimization techniques has contributed significantly to product improvements and cost reductions in the metal forming industries. The next challenge is to bridge the gap between deterministic optimization techniques and the industrial need for robustness. This paper introduces a new and generally applicable structured methodology for modeling and solving robust optimization problems. Stochastic design variables or noise variables are taken into account explicitly in the optimization procedure. The metamodel-based strategy is combined with a sequential improvement algorithm to efficiently increase the accuracy of the objective function prediction. This is only done at regions of interest containing the optimal robust design. Application of the methodology to an industrial V-bending process resulted in valuable process insights and an improved robust process design. Moreover, a significant improvement of the robustness (>2σ) was obtained by minimizing the deteriorating effects of several noise variables. The robust optimization results demonstrate the general applicability of the robust optimization strategy and underline the importance of including uncertainty and robustness explicitly in the numerical optimization procedure.
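    The core "robust objective" idea in this record (explicitly accounting for noise variables and penalizing spread, not just the mean response) can be sketched in a few lines. This is not the paper's sequential metamodel algorithm; the response function and all numbers below are invented.

```python
import numpy as np

# Hedged sketch of a robust objective: minimize mean response plus a multiple
# of its standard deviation over a noise variable z (e.g. material scatter).
rng = np.random.default_rng(0)
z = rng.normal(0.0, 0.1, 20_000)            # common noise sample, reused per design

def f(x, z):
    return (x - 1.0) ** 2 + 0.5 * z * x     # hypothetical forming response

def robust_objective(x, k=3.0):
    y = f(x, z)
    return y.mean() + k * y.std()           # penalize both mean and variability

xs = np.linspace(0.0, 2.0, 201)
x_robust = xs[np.argmin([robust_objective(x) for x in xs])]
# the robust optimum shifts away from the deterministic optimum x = 1.0
```

    Reusing one common noise sample for every candidate design (common random numbers) makes the comparison between designs deterministic, which is the same reason metamodel-based strategies evaluate noise effects on a fixed sampling plan.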

  2. Methodology evaluation: Effects of independent verification and integration on one class of application

    NASA Technical Reports Server (NTRS)

    Page, J.

    1981-01-01

    The effects of an independent verification and integration (V and I) methodology on one class of application are described. Resource profiles are discussed. The development environment is reviewed. Seven measures are presented to test the hypothesis that V and I improve the development process and the product. The V and I methodology provided: (1) a decrease in requirements ambiguities and misinterpretations; (2) no decrease in design errors; (3) no decrease in the cost of correcting errors; (4) a decrease in the cost of system and acceptance testing; (5) an increase in early discovery of errors; (6) no improvement in the quality of software put into operation; and (7) a decrease in productivity and an increase in cost.

  3. Integrated design optimization research and development in an industrial environment

    NASA Astrophysics Data System (ADS)

    Kumar, V.; German, Marjorie D.; Lee, S.-J.

    1989-04-01

    An overview is given of a design optimization project that has been in progress at the GE Research and Development Center for the past few years. The objective of this project is to develop a methodology and a software system for design automation and optimization of structural/mechanical components and systems. The effort focuses on research and development issues as well as on optimization applications that can be related to real-life industrial design problems. The overall technical approach is based on the integration of numerical optimization techniques, finite element methods, CAE and software engineering, and artificial intelligence/expert systems (AI/ES) concepts. The role of each of these engineering technologies in the development of a unified design methodology is illustrated. A software system, DESIGN-OPT, has been developed for both size and shape optimization of structural components subjected to static as well as dynamic loadings. By integrating this software with an automatic mesh generator, a geometric modeler and an attribute specification computer code, a software module, SHAPE-OPT, has been developed for shape optimization. Details of these software packages together with their applications to some 2- and 3-dimensional design problems are described.

  4. Integrated design optimization research and development in an industrial environment

    NASA Technical Reports Server (NTRS)

    Kumar, V.; German, Marjorie D.; Lee, S.-J.

    1989-01-01

    An overview is given of a design optimization project that has been in progress at the GE Research and Development Center for the past few years. The objective of this project is to develop a methodology and a software system for design automation and optimization of structural/mechanical components and systems. The effort focuses on research and development issues as well as on optimization applications that can be related to real-life industrial design problems. The overall technical approach is based on the integration of numerical optimization techniques, finite element methods, CAE and software engineering, and artificial intelligence/expert systems (AI/ES) concepts. The role of each of these engineering technologies in the development of a unified design methodology is illustrated. A software system, DESIGN-OPT, has been developed for both size and shape optimization of structural components subjected to static as well as dynamic loadings. By integrating this software with an automatic mesh generator, a geometric modeler and an attribute specification computer code, a software module, SHAPE-OPT, has been developed for shape optimization. Details of these software packages together with their applications to some 2- and 3-dimensional design problems are described.

  5. Heuristic decomposition for non-hierarchic systems

    NASA Technical Reports Server (NTRS)

    Bloebaum, Christina L.; Hajela, P.

    1991-01-01

    Design and optimization are substantially more complex in multidisciplinary and large-scale engineering applications due to the inherently coupled interactions among subsystems. The paper introduces a quasi-procedural methodology for multidisciplinary optimization that is applicable to nonhierarchic systems. The necessary decision-making support for the design process is provided by means of an embedded expert systems capability. The method employs a decomposition approach whose modularity allows for implementation of specialized methods for analysis and optimization within disciplines.

  6. The use of concept maps during knowledge elicitation in ontology development processes – the nutrigenomics use case

    PubMed Central

    Castro, Alexander Garcia; Rocca-Serra, Philippe; Stevens, Robert; Taylor, Chris; Nashar, Karim; Ragan, Mark A; Sansone, Susanna-Assunta

    2006-01-01

    Background: Incorporation of ontologies into annotations has enabled 'semantic integration' of complex data, making explicit the knowledge within a certain field. One of the major bottlenecks in developing bio-ontologies is the lack of a unified methodology. Different methodologies have been proposed for different scenarios, but there is no agreed-upon standard methodology for building ontologies. The involvement of geographically distributed domain experts, the need for domain experts to lead the design process, the application of the ontologies and the life cycles of bio-ontologies are amongst the features not considered by previously proposed methodologies. Results: Here, we present a methodology for developing ontologies within the biological domain. We describe our scenario, competency questions, results and milestones for each methodological stage. We introduce the use of concept maps during knowledge acquisition phases as a feasible transition between domain expert and knowledge engineer. Conclusion: The contributions of this paper are the thorough description of the steps we suggest when building an ontology, example use of concept maps, consideration of applicability to the development of lower-level ontologies and application to decentralised environments. We have found that within our scenario concept maps played an important role in the development process. PMID:16725019

  7. An overview of the impact of rare disease characteristics on research methodology.

    PubMed

    Whicher, Danielle; Philbin, Sarah; Aronson, Naomi

    2018-01-19

    About 30 million individuals in the United States are living with a rare disease, defined as a condition with a prevalence of 200,000 or fewer cases in the United States (National Organization for Rare Disorders, About NORD, 2016). Disease heterogeneity and geographic dispersion add to the difficulty of completing robust studies in small populations. Improving the ability to conduct research on rare diseases would have a significant impact on population health. The purpose of this paper is to raise awareness of methodological approaches that can address the challenges to conducting robust research on rare diseases. We conducted a landscape review of available methodological and analytic approaches to address the challenges of rare disease research. Our objectives were to: 1. identify algorithms for matching study design to rare disease attributes and the methodological approaches applicable to these algorithms; 2. draw inferences on how research communities and infrastructure can contribute to the efficiency of research on rare diseases; and 3. describe methodological approaches in the rare disease portfolio of the Patient-Centered Outcomes Research Institute (PCORI), a funder promoting both rare disease research and research infrastructure. We identified three algorithms for matching study design to rare disease or intervention characteristics (Gagne et al., BMJ 349:g6802, 2014); (Gupta et al., J Clin Epidemiol 64:1085-1094, 2011); (Cornu et al., Orphanet J Rare Dis 8:48, 2012) and summarized the applicable methodological and analytic approaches. From this literature we were also able to draw inferences on how an effective research infrastructure can set an agenda, prioritize studies, accelerate accrual, catalyze patient engagement and terminate poorly performing studies. Of the 24 rare disease projects in the PCORI portfolio, 11 are randomized controlled trials (RCTs) using standard designs. 
Thirteen are observational studies using case-control, prospective cohort, or natural history designs. PCORI has supported the development of 9 Patient-Powered Research Networks (PPRNs) focused on rare diseases. Matching research design to attributes of rare diseases and interventions can facilitate the completion of RCTs that are adequately powered. An effective research infrastructure can improve efficiency and avoid waste in rare disease research. Our review of the PCORI research portfolio demonstrates that it is feasible to conduct RCTs in rare disease. However, most of these studies use standard RCT designs. This suggests that use of a broader array of methodological approaches to RCTs, such as adaptive trials, cross-over trials, and early escape designs, can improve the productivity of robust research in rare diseases.

  8. Mechatronics by Analogy and Application to Legged Locomotion

    NASA Astrophysics Data System (ADS)

    Ragusila, Victor

    A new design methodology for mechatronic systems, dubbed Mechatronics by Analogy (MbA), is introduced and applied to designing a leg mechanism. The new methodology argues that by establishing a similarity relation between a complex system and a number of simpler models it is possible to design the former using the analysis and synthesis means developed for the latter. The methodology provides a framework for concurrent engineering of complex systems while maintaining the transparency of the system behaviour through making formal analogies between the system and those with more tractable dynamics. The application of the MbA methodology to the design of a monopod robot leg, called the Linkage Leg, is also studied. A series of simulations show that the dynamic behaviour of the Linkage Leg is similar to that of a combination of a double pendulum and a spring-loaded inverted pendulum, based on which the system kinematic, dynamic, and control parameters can be designed concurrently. The first stage of Mechatronics by Analogy is a method of extracting significant features of system dynamics through simpler models. The goal is to determine a set of simpler mechanisms with similar dynamic behaviour to that of the original system in various phases of its motion. A modular bond-graph representation of the system is determined, and subsequently simplified using two simplification algorithms. The first algorithm determines the relevant dynamic elements of the system for each phase of motion, and the second algorithm finds the simple mechanism described by the remaining dynamic elements. In addition to greatly simplifying the controller for the system, using simpler mechanisms with similar behaviour provides a greater insight into the dynamics of the system. This is seen in the second stage of the new methodology, which concurrently optimizes the simpler mechanisms together with a control system based on their dynamics. 
Once the optimal configuration of the simpler system is determined, the original mechanism is optimized such that its dynamic behaviour is analogous. It is shown that, if this analogy is achieved, the control system designed based on the simpler mechanisms can be applied directly to the more complex system, and their dynamic behaviours are close enough for the system performance to be effectively the same. Finally, it is shown that, for the employed objective of fast legged locomotion, the proposed methodology achieves a better design than Reduction-by-Feedback, a competing methodology that uses control layers to simplify the dynamics of the system.

  9. The Application of Intelligent Agents in Libraries: A Survey

    ERIC Educational Resources Information Center

    Liu, Guoying

    2011-01-01

    Purpose: The purpose of this article is to provide a comprehensive literature review on the utilisation of intelligent agent technology in the library environment. Design/methodology/approach: Research papers since 1990 on the use of various intelligent agent technologies in libraries are divided into two main application areas: digital library…

  10. A Hindu Perspective to Organizational Learning

    ERIC Educational Resources Information Center

    Shama Rao, Ashok; Kamath Burde, Jyothsna

    2017-01-01

    Purpose: This paper aims to provide an overview of the relevance and applicability of the Hindu tradition to organizational learning. Design/methodology/approach: Attempting to separate the spiritual from the religious aspects, a primarily theoretical approach is used to delineate the basic concepts in Hinduism and their applicability to various…

  11. The Environment for Application Software Integration and Execution (EASIE) version 1.0. Volume 1: Executive overview

    NASA Technical Reports Server (NTRS)

    Rowell, Lawrence F.; Davis, John S.

    1989-01-01

    The Environment for Application Software Integration and Execution (EASIE) provides a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. EASIE was designed to meet the needs of conceptual design engineers that face the task of integrating many stand-alone engineering analysis programs. Using EASIE, programs are integrated through a relational database management system. Volume 1, Executive Overview, gives an overview of the functions provided by EASIE and describes their use. Three operational design systems based upon the EASIE software are briefly described.

  12. The influence of capture-recapture methodology on the evolution of the North American Bird Banding Program

    USGS Publications Warehouse

    Tautin, J.; Lebreton, J.-D.; North, P.M.

    1993-01-01

    Capture-recapture methodology has advanced greatly in the last twenty years and is now a major factor driving the continuing evolution of the North American bird banding program. Bird banding studies are becoming more scientific with improved study designs and analytical procedures. Researchers and managers are gaining more reliable knowledge which in turn betters the conservation of migratory birds. The advances in capture-recapture methodology have benefited gamebird studies primarily, but nongame bird studies will benefit similarly as they expand greatly in the next decade. Further theoretical development of capture-recapture methodology should be encouraged, and, to maximize benefits of the methodology, work on practical applications should be increased.
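    The simplest capture-recapture abundance estimator, Chapman's bias-corrected version of the Lincoln-Petersen index, illustrates the core idea behind banding-based population estimates. This sketch is far simpler than the modern models the record refers to, and the example numbers are invented.

```python
# Hedged sketch: Chapman's bias-corrected Lincoln-Petersen estimator of
# population size from a two-session mark-recapture (banding) study.
def chapman_estimate(n1, n2, m2):
    """n1 birds banded in session 1, n2 caught in session 2, m2 recaptures."""
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

# invented example: 200 banded, 150 caught later, 30 of them carrying bands
N_hat = chapman_estimate(n1=200, n2=150, m2=30)   # about 978 birds
```

    Modern capture-recapture models generalize this by allowing unequal catchability, open populations, and survival estimation, which is what drove the methodological advances the record describes.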

  13. Application of Taguchi Design and Response Surface Methodology for Improving Conversion of Isoeugenol into Vanillin by Resting Cells of Psychrobacter sp. CSW4.

    PubMed

    Ashengroph, Morahem; Nahvi, Iraj; Amini, Jahanshir

    2013-01-01

    For all industrial processes, modelling, optimisation and control are the keys to enhancing productivity and ensuring product quality. In the current study, the optimization of process parameters for improving the conversion of isoeugenol to vanillin by Psychrobacter sp. CSW4 was investigated by means of the Taguchi approach and Box-Behnken statistical design under resting cell conditions. Taguchi design was employed for screening the significant variables in the bioconversion medium. Sequentially, Box-Behnken design experiments under Response Surface Methodology (RSM) were used for further optimization. Four factors (initial isoeugenol, NaCl, biomass and Tween 80 concentrations), which have significant effects on vanillin yield, were selected from ten variables by Taguchi experimental design. With the regression coefficient analysis in the Box-Behnken design, a relationship between vanillin production and the four significant variables was obtained, and the optimum levels of the four variables were as follows: initial isoeugenol concentration 6.5 g/L, initial Tween 80 concentration 0.89 g/L, initial NaCl concentration 113.2 g/L and initial biomass concentration 6.27 g/L. Under these optimized conditions, the maximum predicted concentration of vanillin was 2.25 g/L. These optimized values of the factors were validated in a triplicate shaking-flask study, and an average of 2.19 g/L vanillin, corresponding to a molar yield of 36.3%, was obtained after a 24 h bioconversion. The present work is the first to report the application of Taguchi design and response surface methodology for optimizing the bioconversion of isoeugenol into vanillin under resting cell conditions.
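    The central RSM step, fitting a second-order response surface to experimental yields and solving for its stationary point, can be sketched for a single factor. The data below are invented for illustration; the study itself fit a four-variable Box-Behnken design.

```python
import numpy as np

# Hedged sketch of the response-surface step: quadratic least-squares fit,
# then the stationary point of the fitted surface gives the optimum level.
x = np.array([2.0, 4.0, 6.5, 8.0, 10.0])   # hypothetical isoeugenol levels, g/L
y = np.array([0.9, 1.7, 2.2, 2.0, 1.2])    # hypothetical vanillin yields, g/L

b2, b1, b0 = np.polyfit(x, y, 2)           # fit y = b2*x^2 + b1*x + b0
x_opt = -b1 / (2.0 * b2)                   # dy/dx = 0 at the optimum level
```

    With several factors, the same least-squares fit includes cross terms, and the stationary point comes from solving the resulting linear system instead of a single division.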

  14. Automated Methodologies for the Design of Flow Diagrams for Development and Maintenance Activities

    NASA Astrophysics Data System (ADS)

    Shivanand M., Handigund; Shweta, Bhat

    The Software Requirements Specification (SRS) of the organization is a text document prepared by strategic management incorporating the requirements of the organization. These requirements of the ongoing business/project development process involve the software tools, the hardware devices, the manual procedures, the application programs and the communication commands. These components are appropriately ordered for achieving the mission of the concerned process, both in project development and in ongoing business processes, in different flow diagrams, viz. the activity chart, workflow diagram, activity diagram, component diagram and deployment diagram. This paper proposes two generic, automatic methodologies for the design of the various flow diagrams of (i) project development activities and (ii) the ongoing business process. The methodologies also resolve the ensuing deadlocks in the flow diagrams and determine the critical paths for the activity chart. Though both methodologies are independent, each complements the other in authenticating its correctness and completeness.

  15. Autonomous interplanetary constellation design

    NASA Astrophysics Data System (ADS)

    Chow, Cornelius Channing, II

    According to NASA's integrated space technology roadmaps, space-based infrastructures are envisioned as necessary ingredients to a sustained effort in continuing space exploration. Whether it be for extra-terrestrial habitats, roving/cargo vehicles, or space tourism, autonomous space networks will provide a vital communications lifeline for both future robotic and human missions alike. Projecting that the Moon will be a bustling hub of activity within a few decades, a near-term opportunity for in-situ infrastructure development is within reach. This dissertation addresses the anticipated need for in-space infrastructure by investigating a general design methodology for autonomous interplanetary constellations; to illustrate the theory, this manuscript presents results from an application to the Earth-Moon neighborhood. The constellation design methodology is formulated as an optimization problem, involving a trajectory design step followed by a spacecraft placement sequence. Modeling the dynamics as a restricted 3-body problem, the investigated design space consists of families of periodic orbits which play host to the constellations, punctuated by arrangements of spacecraft autonomously guided by a navigation strategy called LiAISON (Linked Autonomous Interplanetary Satellite Orbit Navigation). Instead of more traditional exhaustive search methods, a numerical continuation approach is implemented to map the admissible configuration space. In particular, Keller's pseudo-arclength technique is used to follow folding/bifurcating solution manifolds, which are otherwise inaccessible with other parameter continuation schemes. A succinct characterization of the underlying structure of the local, as well as global, extrema is thus achievable with little a priori intuition of the solution space. Furthermore, the proposed design methodology offers benefits in computation speed plus the ability to handle mildly stochastic systems. 
An application of the constellation design methodology to the restricted Earth-Moon system reveals optimal pairwise configurations for various L1, L2, and L5 (halo, axial, and vertical) periodic orbit families. Navigation accuracies on the order of 10^(+/-1) meters in position space are obtained for the optimal Earth-Moon constellations, given measurement noise on the order of 1 meter.
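    Keller's pseudo-arclength technique, which the record credits with following folding solution branches, can be demonstrated on a toy curve with a fold. The sketch below traces F(x, lam) = x^2 + lam - 1 = 0, which folds at (x, lam) = (0, 1); natural continuation in lam stalls at the fold, while arclength stepping goes around it. The problem and step sizes are invented for illustration.

```python
import numpy as np

# Hedged sketch of pseudo-arclength continuation: predictor step along the
# tangent, then a Newton corrector on F = 0 plus the arclength constraint.
def F(u):
    x, lam = u
    return x * x + lam - 1.0

def gradF(u):                                    # gradient of F w.r.t. (x, lam)
    return np.array([2.0 * u[0], 1.0])

def continue_branch(u0, tangent, ds=0.1, steps=40):
    u = np.array(u0, dtype=float)
    du = np.asarray(tangent, dtype=float)
    branch = [u.copy()]
    for _ in range(steps):
        v = u + ds * du                          # predictor along the tangent
        for _ in range(25):                      # Newton corrector
            r = np.array([F(v), du @ (v - u) - ds])
            A = np.vstack([gradF(v), du])
            v = v - np.linalg.solve(A, r)
        du = (v - u) / np.linalg.norm(v - u)     # secant estimate of new tangent
        u = v
        branch.append(u.copy())
    return np.array(branch)

t0 = np.array([-1.0, 2.0])                       # tangent at (1, 0), toward the fold
branch = continue_branch([1.0, 0.0], t0 / np.linalg.norm(t0))
```

    The arclength constraint keeps the augmented Jacobian nonsingular at the fold, where the plain Jacobian in lam is singular; that is what lets the branch be followed past the turning point without any a priori knowledge of where it folds.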

  16. Methodological considerations in the use of audio diaries in work psychology: Adding to the qualitative toolkit.

    PubMed

    Crozier, Sarah E; Cassell, Catherine M

    2016-06-01

    The use of longitudinal methodology as a means of capturing the intricacies of complex organizational phenomena is well documented, and many different research strategies for longitudinal designs have been put forward from both a qualitative and a quantitative stance. This study explores a specific emergent qualitative methodology, audio diaries, and assesses their utility for work psychology research, drawing on the findings from a four-stage study addressing transient working patterns and stress in UK temporary workers. Specifically, we explore some important methodological, analytical and technical issues for practitioners and researchers who seek to use these methods and explain how this type of methodology has much to offer when studying stress and affective experiences at work. We provide support for the need to implement pluralistic and complementary methodological approaches in unearthing the depth in sense-making and assert their capacity to further illuminate the process orientation of stress. This study illustrates the importance of verbalization in documenting stress and affective experience as a mechanism for accessing the cognitive processes involved in making sense of such experience. It compares audio diaries with more traditional qualitative methods to assess applicability to different research contexts, and it provides practical guidance and a methodological framework for the design of audio diary research, taking into account challenges and solutions for researchers and practitioners.

  17. Proceedings, Seminar on Probabilistic Methods in Geotechnical Engineering

    NASA Astrophysics Data System (ADS)

    Hynes-Griffin, M. E.; Buege, L. L.

    1983-09-01

    Contents: Applications of Probabilistic Methods in Geotechnical Engineering; Probabilistic Seismic and Geotechnical Evaluation at a Dam Site; Probabilistic Slope Stability Methodology; Probability of Liquefaction in a 3-D Soil Deposit; Probabilistic Design of Flood Levees; Probabilistic and Statistical Methods for Determining Rock Mass Deformability Beneath Foundations: An Overview; Simple Statistical Methodology for Evaluating Rock Mechanics Exploration Data; New Developments in Statistical Techniques for Analyzing Rock Slope Stability.
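    The probability-of-failure calculation underlying several of these talks can be sketched with plain Monte Carlo sampling of an uncertain capacity and demand. Every distribution and number below is invented for illustration.

```python
import random

# Hedged sketch: Monte Carlo probability of failure, P(capacity <= demand),
# when both quantities are uncertain. Distributions here are invented.
rng = random.Random(42)
n = 100_000
failures = 0
for _ in range(n):
    capacity = rng.gauss(1.5, 0.20)    # e.g. normalized resisting moment
    demand = rng.gauss(1.0, 0.15)      # e.g. normalized driving moment
    if capacity <= demand:
        failures += 1
p_f = failures / n                     # compare against a target reliability
```

    For normal capacity and demand this has a closed form (here the safety margin is N(0.5, 0.25), so p_f is about 2.3%), but the sampling approach carries over unchanged to the non-normal, correlated inputs typical of geotechnical problems.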

  18. Un Modelo Basico de Instruccion Directa Para la Ensenanza de la Metodologia de la Investigacion (Using the Basic Direct Model of Instruction To Teach an Introductory Research Model).

    ERIC Educational Resources Information Center

    Serafin, Ana Gil

    This study examined the application of the Basic Direct Instruction Model (BDIM), a methodology designed to maximize student interest in instrumental and methodological courses, to graduate level educational leadership students. The research used qualitative techniques and a participatory approach with a sample of 92 beginning level Masters…

  19. An Assessment of IMPAC - Integrated Methodology for Propulsion and Airframe Controls

    NASA Technical Reports Server (NTRS)

    Walker, G. P.; Wagner, E. A.; Bodden, D. S.

    1996-01-01

    This report documents the work done under a NASA-sponsored contract to transition to industry technologies developed under the NASA Lewis Research Center IMPAC (Integrated Methodology for Propulsion and Airframe Control) program. The critical steps in IMPAC are exercised on an example integrated flight/propulsion control design for linear airframe/engine models of a conceptual STOVL (Short Take-Off and Vertical Landing) aircraft, and MATRIXX (TM) executive files to implement each step are developed. The results from the example study are analyzed and lessons learned are listed, along with recommendations that will improve the application of each design step. The end product of this research is a set of software requirements for developing a user-friendly control design tool which will automate the steps in the IMPAC methodology. Prototypes for a graphical user interface (GUI) are sketched to specify how the tool will interact with the user, and it is recommended that the tool be built around existing computer-aided control design software packages.

  20. Guidance, navigation, and control subsystem equipment selection algorithm using expert system methods

    NASA Technical Reports Server (NTRS)

    Allen, Cheryl L.

    1991-01-01

    Enhanced engineering tools can be obtained through the integration of expert system methodologies and existing design software. The application of these methodologies to the spacecraft design and cost model (SDCM) software provides an improved technique for the selection of hardware for unmanned spacecraft subsystem design. The knowledge engineering system (KES) expert system development tool was used to implement a smarter equipment selection algorithm than that which is currently achievable through the use of a standard data base system. The guidance, navigation, and control subsystem of the SDCM software was chosen as the initial subsystem for implementation. The portions of the SDCM code which compute the selection criteria and constraints remain intact, and the expert system equipment selection algorithm is embedded within this existing code. The architecture of this new methodology is described and its implementation is reported. The project background and a brief overview of the expert system are described, and once the details of the design are characterized, an example of its implementation is demonstrated.

  1. DESIGN ANALYSIS FOR THE NAVAL SNF WASTE PACKAGE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    T.L. Mitchell

    2000-05-31

    The purpose of this analysis is to demonstrate the design of the naval spent nuclear fuel (SNF) waste package (WP) using the Waste Package Department's (WPD) design methodologies and processes described in the ''Waste Package Design Methodology Report'' (CRWMS M&O [Civilian Radioactive Waste Management System Management and Operating Contractor] 2000b). The calculations that support the design of the naval SNF WP will be discussed; however, only a sub-set of such analyses will be presented and shall be limited to those identified in the ''Waste Package Design Sensitivity Report'' (CRWMS M&O 2000c). The objective of this analysis is to describe the naval SNF WP design method and to show that the design of the naval SNF WP complies with the ''Naval Spent Nuclear Fuel Disposal Container System Description Document'' (CRWMS M&O 1999a) and Interface Control Document (ICD) criteria for Site Recommendation. Additional criteria for the design of the naval SNF WP have been outlined in Section 6.2 of the ''Waste Package Design Sensitivity Report'' (CRWMS M&O 2000c). The scope of this analysis is restricted to the design of the naval long WP containing one naval long SNF canister. This WP is representative of the WPs that will contain both naval short SNF and naval long SNF canisters. The following items are included in the scope of this analysis: (1) Providing a general description of the applicable design criteria; (2) Describing the design methodology to be used; (3) Presenting the design of the naval SNF waste package; and (4) Showing compliance with all applicable design criteria. The intended use of this analysis is to support Site Recommendation reports and assist in the development of WPD drawings. Activities described in this analysis were conducted in accordance with the technical product development plan (TPDP) ''Design Analysis for the Naval SNF Waste Package'' (CRWMS M&O 2000a).

  2. Development and Application of a Systems Engineering Framework to Support Online Course Design and Delivery

    ERIC Educational Resources Information Center

    Bozkurt, Ipek; Helm, James

    2013-01-01

    This paper develops a systems engineering-based framework to assist in the design of an online engineering course. Specifically, the purpose of the framework is to provide a structured methodology for the design, development and delivery of a fully online course, either brand new or modified from an existing face-to-face course. The main strength…

  3. A decision model for planetary missions

    NASA Technical Reports Server (NTRS)

    Hazelrigg, G. A., Jr.; Brigadier, W. L.

    1976-01-01

    Many techniques developed for the solution of problems in economics and operations research are directly applicable to problems involving engineering trade-offs. This paper investigates the use of utility theory for decision making in planetary exploration space missions. A decision model is derived that accounts for the objectives of the mission (science), the cost of flying the mission, and the risk of mission failure. A simulation methodology for obtaining the probability distribution of science value and costs as a function of spacecraft and mission design is presented, and an example application of the decision methodology is given for various potential alternatives in a comet Encke mission.
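    A decision model of this kind (science value, cost, failure risk) can be reduced to a toy expected-utility comparison between alternatives. Every number below is invented; the paper's model and simulation are far richer.

```python
import random

# Hedged sketch: Monte Carlo expected utility of two hypothetical mission
# options, trading science value against cost and failure risk.
rng = random.Random(1)

def expected_utility(science_value, cost, p_failure, n=20_000):
    total = 0.0
    for _ in range(n):
        success = rng.random() > p_failure        # does the mission survive?
        total += (science_value if success else 0.0) - cost
    return total / n

u_a = expected_utility(science_value=100.0, cost=40.0, p_failure=0.10)
u_b = expected_utility(science_value=160.0, cost=70.0, p_failure=0.35)
best = "A" if u_a > u_b else "B"
```

    The Monte Carlo layer is overkill for this linear toy (the expectation is just science_value * (1 - p_failure) - cost), but it is the structure that generalizes once science value and cost become distributions rather than constants, as in the paper's simulation methodology.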

  4. Bayesian Inference for Source Term Estimation: Application to the International Monitoring System Radionuclide Network

    DTIC Science & Technology

    2014-10-01

    ...of accuracy and precision), compared with the simpler measurement model that does not use multipliers. Defence significance... (3) Bayesian experimental design for receptor placement in order to maximize the expected information in the measured concentration data for... applications of the Bayesian inferential methodology for source reconstruction have used high-quality concentration data from well-designed atmospheric

  5. A review of engineering development of aqueous phase solar photocatalytic detoxification and disinfection processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goswami, D.Y.

    1997-05-01

Scientific research on photocatalytic oxidation of hazardous chemicals has been conducted extensively over the last three decades. Use of solar radiation in photocatalytic detoxification and disinfection has only been explored in the last decade. Developments of engineering scale systems, design methodologies, and commercial and industrial applications have occurred even more recently. A number of reactor concepts and designs including concentrating and nonconcentrating types and methods of catalyst deployment have been developed. Some commercial and industrial field tests of solar detoxification systems have been conducted. This paper reviews the engineering developments of the solar photocatalytic detoxification and disinfection processes, including system design methodologies.

  6. Application of Adjoint Methodology to Supersonic Aircraft Design Using Reversed Equivalent Areas

    NASA Technical Reports Server (NTRS)

    Rallabhandi, Sriram K.

    2013-01-01

    This paper presents an approach to shape an aircraft to equivalent area based objectives using the discrete adjoint approach. Equivalent areas can be obtained either using reversed augmented Burgers equation or direct conversion of off-body pressures into equivalent area. Formal coupling with CFD allows computation of sensitivities of equivalent area objectives with respect to aircraft shape parameters. The exactness of the adjoint sensitivities is verified against derivatives obtained using the complex step approach. This methodology has the benefit of using designer-friendly equivalent areas in the shape design of low-boom aircraft. Shape optimization results with equivalent area cost functionals are discussed and further refined using ground loudness based objectives.

  7. Probabilistic Design Methodology and its Application to the Design of an Umbilical Retract Mechanism

    NASA Technical Reports Server (NTRS)

    Onyebueke, Landon; Ameye, Olusesan

    2002-01-01

A lot has been learned from past experience with structural and machine element failures. The understanding of failure modes and the application of an appropriate design analysis method can lead to improved structural and machine element safety as well as serviceability. To apply Probabilistic Design Methodology (PDM), all uncertainties are modeled as random variables with selected distribution types, means, and standard deviations. It is quite difficult to achieve a robust design without considering the randomness of the design parameters, as is the case with the Deterministic Design Approach. The US Navy has a fleet of submarine-launched ballistic missiles. An umbilical plug joins the missile to the submarine in order to provide electrical and cooling water connections. As the missile leaves the submarine, an umbilical retract mechanism retracts the umbilical plug clear of the advancing missile after disengagement during launch and restrains the plug in the retracted position. The design of the current retract mechanism in use was based on the deterministic approach, which puts emphasis on factor of safety. A new umbilical retract mechanism that is simpler in design, lighter in weight, more reliable, easier to adjust, and more cost effective has become desirable since this will increase the performance and efficiency of the system. This paper reports on a recent project performed at Tennessee State University for the US Navy that involved the application of PDM to the design of an umbilical retract mechanism. This paper demonstrates how the use of PDM led to the minimization of weight and cost, and the maximization of reliability and performance.
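The contrast this record draws between a single deterministic factor of safety and a probabilistic reliability estimate can be sketched with a toy Monte Carlo calculation. The load and capacity distributions below are hypothetical placeholders, not parameters of the actual retract mechanism.

```python
import random
import statistics

random.seed(1)
N = 100_000

# Hypothetical load on, and capacity of, a single mechanism element
# (units arbitrary); both are modeled as normally distributed.
loads = [random.gauss(mu=50.0, sigma=8.0) for _ in range(N)]
capacities = [random.gauss(mu=80.0, sigma=10.0) for _ in range(N)]

# Deterministic view: one factor of safety computed from the means,
# which says nothing about scatter in either quantity.
factor_of_safety = statistics.mean(capacities) / statistics.mean(loads)

# Probabilistic view: estimated probability that load exceeds capacity.
p_failure = sum(l > c for l, c in zip(loads, capacities)) / N
```

A design can show a comfortable factor of safety (here about 1.6) while still carrying a nonzero failure probability driven by the tails of the two distributions, which is the argument PDM makes against the purely deterministic approach.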

  8. An Integrated Low-Speed Performance and Noise Prediction Methodology for Subsonic Aircraft

    NASA Technical Reports Server (NTRS)

    Olson, E. D.; Mavris, D. N.

    2000-01-01

    An integrated methodology has been assembled to compute the engine performance, takeoff and landing trajectories, and community noise levels for a subsonic commercial aircraft. Where feasible, physics-based noise analysis methods have been used to make the results more applicable to newer, revolutionary designs and to allow for a more direct evaluation of new technologies. The methodology is intended to be used with approximation methods and risk analysis techniques to allow for the analysis of a greater number of variable combinations while retaining the advantages of physics-based analysis. Details of the methodology are described and limited results are presented for a representative subsonic commercial aircraft.

  9. A study of commuter airplane design optimization

    NASA Technical Reports Server (NTRS)

    Roskam, J.; Wyatt, R. D.; Griswold, D. A.; Hammer, J. L.

    1977-01-01

Problems of commuter airplane configuration design were studied to effect a minimization of direct operating costs. Factors considered were the minimization of fuselage drag, methods of wing design, and the estimated drag of an airplane submerged in a propeller slipstream; all design criteria were studied under a set of fixed performance, mission, and stability constraints. Configuration design data were assembled for application by a computerized design methodology program similar to the NASA-Ames General Aviation Synthesis Program.

  10. An Ontology for State Analysis: Formalizing the Mapping to SysML

    NASA Technical Reports Server (NTRS)

    Wagner, David A.; Bennett, Matthew B.; Karban, Robert; Rouquette, Nicolas; Jenkins, Steven; Ingham, Michel

    2012-01-01

    State Analysis is a methodology developed over the last decade for architecting, designing and documenting complex control systems. Although it was originally conceived for designing robotic spacecraft, recent applications include the design of control systems for large ground-based telescopes. The European Southern Observatory (ESO) began a project to design the European Extremely Large Telescope (E-ELT), which will require coordinated control of over a thousand articulated mirror segments. The designers are using State Analysis as a methodology and the Systems Modeling Language (SysML) as a modeling and documentation language in this task. To effectively apply the State Analysis methodology in this context it became necessary to provide ontological definitions of the concepts and relations in State Analysis and greater flexibility through a mapping of State Analysis into a practical extension of SysML. The ontology provides the formal basis for verifying compliance with State Analysis semantics including architectural constraints. The SysML extension provides the practical basis for applying the State Analysis methodology with SysML tools. This paper will discuss the method used to develop these formalisms (the ontology), the formalisms themselves, the mapping to SysML and approach to using these formalisms to specify a control system and enforce architectural constraints in a SysML model.

  11. The Development of CK2 Inhibitors: From Traditional Pharmacology to in Silico Rational Drug Design

    PubMed Central

    Cozza, Giorgio

    2017-01-01

Casein kinase II (CK2) is a ubiquitous and pleiotropic serine/threonine protein kinase able to phosphorylate hundreds of substrates. Being implicated in several human diseases, from neurodegeneration to cancer, the biological roles of CK2 have been intensively studied. Upregulation of CK2 has been shown to be critical to tumor progression, making this kinase an attractive target for cancer therapy. Several CK2 inhibitors have been developed so far, the first being discovered by “trial and error testing”. In the last decade, the development of in silico rational drug design has prompted the discovery, de novo design and optimization of several CK2 inhibitors, active in the low nanomolar range. The screening of large chemical libraries and the optimization of hit compounds by Structure Based Drug Design (SBDD) provide telling examples of a fruitful application of rational drug design to the development of CK2 inhibitors. Ligand Based Drug Design (LBDD) models have also been applied to CK2 drug discovery; however, they were mainly focused on methodology improvements rather than being critical for de novo design and optimization. This manuscript provides a detailed description of in silico methodologies whose applications to the design and development of CK2 inhibitors proved successful and promising. PMID:28230762

  12. The Applicability of Course Experience Questionnaire for a Malaysian University Context

    ERIC Educational Resources Information Center

    Thien, Lei Mee; Ong, Mei Yean

    2016-01-01

    Purpose: The purpose of this study is to examine the applicability of Course Experience Questionnaire (CEQ) in a Malaysian university context. Design/methodology/approach: The CEQ was translated into Malay language using rigorous cross-cultural adaptation procedures. The Malay version CEQ was administered to 190 undergraduate students in one…

  13. Implementation of an innovative teaching project in a Chemical Process Design course at the University of Cantabria, Spain

    NASA Astrophysics Data System (ADS)

    Galan, Berta; Muñoz, Iciar; Viguri, Javier R.

    2016-09-01

This paper shows the planning, the teaching activities and the evaluation of the learning and teaching process implemented in the Chemical Process Design course at the University of Cantabria, Spain. Educational methods to address the knowledge, skills and attitudes that students who complete the course are expected to acquire are proposed and discussed. Undergraduate and graduate engineers' perceptions of the methodology used are evaluated by means of a questionnaire. Results of the teaching activities and the strengths and weaknesses of the proposed case study are discussed in relation to the course characteristics. The findings of the empirical evaluation show that the excessive time students had to dedicate to the case study project and having to work with limited information were the most negative aspects, whereas an increase in the students' self-confidence and the practical application of the methodology were the most positive aspects. Finally, improvements are discussed in order to extend the application of the methodology to other courses offered as part of the chemical engineering degree.

  14. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
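The PFA idea of modifying a failure-probability distribution to reflect accumulated test or flight experience can be illustrated with a standard conjugate Beta-Binomial update, which is not the specific statistical procedure of PFA but shows the same prior-plus-evidence mechanism; the prior parameters and test counts below are hypothetical.

```python
# Conjugate Beta-Binomial update: a Beta prior on a per-trial failure
# probability, refined by observed test outcomes.
prior_alpha, prior_beta = 1.0, 19.0   # hypothetical prior: mean 5% failure
tests_run, failures = 30, 0           # hypothetical test experience, no failures

# Posterior Beta parameters after folding in the test evidence.
post_alpha = prior_alpha + failures
post_beta = prior_beta + (tests_run - failures)

# Updated point estimate of the failure probability (posterior mean).
post_mean = post_alpha / (post_alpha + post_beta)
```

Thirty failure-free tests pull the estimated failure probability from 5% down to 2%, the same direction of adjustment PFA's statistical procedures make when flight experience is folded into an analytically derived distribution.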

  15. Palestine 1918: General Edmund Allenby’s Application of Operational Art and Design

    DTIC Science & Technology

the British connected a series of joint combined arms operations in time, space, and purpose to achieve the strategic goals of the Empire. The campaign...perception. This monograph analyzes the campaign in Palestine through the lens of Army Design Methodology (ADM) to illuminate how future leaders can

  16. Professional Development of Russian HEIs' Management and Faculty in CDIO Standards Application

    ERIC Educational Resources Information Center

    Chuchalin, Alexander; Malmqvist, Johan; Tayurskaya, Marina

    2016-01-01

    The paper presents the approach to complex training of managers and faculty staff for system modernisation of Russian engineering education. As a methodological basis of design and implementation of the faculty development programme, the CDIO (Conceive-Design-Implement-Operate) Approach was chosen due to compliance of its concept to the purposes…

  17. Nudging Waste Diversion at Western State Colorado University: Application of Behavioral Insights

    ERIC Educational Resources Information Center

    McCoy, Kimberly; Oliver, Justin J.; Borden, D. Scott; Cohn, Scott I.

    2018-01-01

    Purpose: This paper aims to test a nudge, or intervention, designed through behavioral insights at a university campus to discover cost-effective means for increasing recycling participation and methods for estimating waste removal cost savings. Design/methodology/approach: A series of studies were conducted demonstrating the effectiveness of…

  18. Careers in Academe: The Academic Labour Market as an Eco-System

    ERIC Educational Resources Information Center

    Baruch, Yehuda

    2013-01-01

    Purpose: This paper aims to explore the contrast between stable and dynamic labour markets in academe in light of career theories that were originally developed for business environments. Design/methodology/approach: A conceptual design, offering the eco-system as a framework. Findings: It evaluates their relevance and applicability to dynamic and…

  19. Strategically Focused Training in Six Sigma Way: A Case Study

    ERIC Educational Resources Information Center

    Pandey, Ashish

    2007-01-01

    Purpose: The purpose of the current study is to examine the utility of Six Sigma interventions as a performance measure and explore its applicability for making the training design and delivery operationally efficient and strategically effective. Design/methodology/approach: This is a single revelatory case study. Data were collected from multiple…

  20. Strategic Industrial Alliances in Paper Industry: XML- vs Ontology-Based Integration Platforms

    ERIC Educational Resources Information Center

    Naumenko, Anton; Nikitin, Sergiy; Terziyan, Vagan; Zharko, Andriy

    2005-01-01

    Purpose: To identify cases related to design of ICT platforms for industrial alliances, where the use of Ontology-driven architectures based on Semantic web standards is more advantageous than application of conventional modeling together with XML standards. Design/methodology/approach: A comparative analysis of the two latest and the most obvious…

  1. Octopus: A Design Methodology for Motion Capture Wearables

    PubMed Central

    2017-01-01

    Human motion capture (MoCap) is widely recognised for its usefulness and application in different fields, such as health, sports, and leisure; therefore, its inclusion in current wearables (MoCap-wearables) is increasing, and it may be very useful in a context of intelligent objects interconnected with each other and to the cloud in the Internet of Things (IoT). However, capturing human movement adequately requires addressing difficult-to-satisfy requirements, which means that the applications that are possible with this technology are held back by a series of accessibility barriers, some technological and some regarding usability. To overcome these barriers and generate products with greater wearability that are more efficient and accessible, factors are compiled through a review of publications and market research. The result of this analysis is a design methodology called Octopus, which ranks these factors and schematises them. Octopus provides a tool that can help define design requirements for multidisciplinary teams, generating a common framework and offering a new method of communication between them. PMID:28809786

  2. Object oriented development of engineering software using CLIPS

    NASA Technical Reports Server (NTRS)

    Yoon, C. John

    1991-01-01

Engineering applications involve numeric complexity and manipulations of a large amount of data. Traditionally, numeric computation has been the concern in developing engineering software. As engineering application software became larger and more complex, management of resources such as data, rather than the numeric complexity, has become the major software design problem. Object oriented design and implementation methodologies can improve the reliability, flexibility, and maintainability of the resulting software; however, some tasks are better solved with the traditional procedural paradigm. The C Language Integrated Production System (CLIPS), with deffunction and defgeneric constructs, supports the procedural paradigm. The natural blending of object oriented and procedural paradigms has been cited as the reason for the popularity of the C++ language. The CLIPS Object Oriented Language's (COOL) object oriented features are more versatile than those of C++. A software design methodology based on object oriented and procedural approaches appropriate for engineering software, and to be implemented in CLIPS, was outlined. A method for sensor placement for Space Station Freedom is being implemented in COOL as a sample problem.

  3. Octopus: A Design Methodology for Motion Capture Wearables.

    PubMed

    Marin, Javier; Blanco, Teresa; Marin, Jose J

    2017-08-15

    Human motion capture (MoCap) is widely recognised for its usefulness and application in different fields, such as health, sports, and leisure; therefore, its inclusion in current wearables (MoCap-wearables) is increasing, and it may be very useful in a context of intelligent objects interconnected with each other and to the cloud in the Internet of Things (IoT). However, capturing human movement adequately requires addressing difficult-to-satisfy requirements, which means that the applications that are possible with this technology are held back by a series of accessibility barriers, some technological and some regarding usability. To overcome these barriers and generate products with greater wearability that are more efficient and accessible, factors are compiled through a review of publications and market research. The result of this analysis is a design methodology called Octopus, which ranks these factors and schematises them. Octopus provides a tool that can help define design requirements for multidisciplinary teams, generating a common framework and offering a new method of communication between them.

  4. Advanced Control Considerations for Turbofan Engine Design

    NASA Technical Reports Server (NTRS)

    Connolly, Joseph W.; Csank, Jeffrey T.; Chicatelli, Amy

    2016-01-01

    This paper covers the application of a model-based engine control (MBEC) methodology featuring a self tuning on-board model for an aircraft turbofan engine simulation. The nonlinear engine model is capable of modeling realistic engine performance, allowing for a verification of the advanced control methodology over a wide range of operating points and life cycle conditions. The on-board model is a piece-wise linear model derived from the nonlinear engine model and updated using an optimal tuner Kalman Filter estimation routine, which enables the on-board model to self-tune to account for engine performance variations. MBEC is used here to show how advanced control architectures can improve efficiency during the design phase of a turbofan engine by reducing conservative operability margins. The operability margins that can be reduced, such as stall margin, can expand the engine design space and offer potential for efficiency improvements. Application of MBEC architecture to a nonlinear engine simulation is shown to reduce the thrust specific fuel consumption by approximately 1% over the baseline design, while maintaining safe operation of the engine across the flight envelope.
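The optimal-tuner idea in this record, an on-board model correcting itself from sensed data, can be sketched with a one-dimensional Kalman measurement update. The real routine operates on a multivariate piecewise-linear engine model; the scalar health parameter, variances, and measurements below are hypothetical.

```python
def kalman_update(x, P, z, R):
    """One scalar Kalman measurement update of a health parameter.

    x: current estimate, P: its variance,
    z: new measurement, R: measurement-noise variance.
    """
    K = P / (P + R)            # Kalman gain: trust in the new measurement
    x_new = x + K * (z - x)    # blend prediction with measurement
    P_new = (1 - K) * P        # uncertainty shrinks after the update
    return x_new, P_new

# Hypothetical efficiency scalar, nominally 1.0, tuned from noisy
# sensor-derived estimates as the engine deteriorates slightly.
x, P = 1.00, 0.04
for z in [0.97, 0.96, 0.965]:
    x, P = kalman_update(x, P, z, R=0.01)
```

After three updates the estimate has moved most of the way from the nominal 1.0 toward the measured values near 0.96, with steadily shrinking variance; this self-tuning is what lets a model-based controller track engine-to-engine and life-cycle variation.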

  5. Stochastic optimization of broadband reflecting photonic structures.

    PubMed

    Estrada-Wiese, D; Del Río-Chanona, E A; Del Río, J A

    2018-01-19

Photonic crystals (PCs) are built to control the propagation of light within their structure. These can be used for an assortment of applications where custom designed devices are of interest. Among them, one-dimensional PCs can be produced to achieve the reflection of specific and broad wavelength ranges. However, their design and fabrication are challenging due to the diversity of periodic arrangement and layer configuration that each different PC needs. In this study, we present a framework to design highly reflecting PCs for any desired wavelength range. Our method combines three stochastic optimization algorithms (Random Search, Particle Swarm Optimization and Simulated Annealing) along with a reduced space-search methodology to obtain a custom and optimized PC configuration. The optimization procedure is evaluated through theoretical reflectance spectra calculated by using the Equispaced Thickness Method, which improves the simulations due to the consideration of incoherent light transmission. We prove the viability of our procedure by fabricating different reflecting PCs made of porous silicon and obtain good agreement between experiment and theory using a merit function. With this methodology, diverse reflecting PCs can be designed for various applications and fabricated with different materials.
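One of the three algorithms this record combines, Simulated Annealing, can be sketched on a toy one-layer design problem. The merit function below is a stand-in for the reflectance spectra the paper computes with the Equispaced Thickness Method, and the quarter-wave thickness target is hypothetical.

```python
import math
import random

random.seed(2)

def merit(t, target=137.5):
    """Toy merit: peaks when layer thickness t (nm) hits a quarter-wave
    target; a real run would score a computed reflectance spectrum."""
    return -abs(t - target)

# Simulated annealing over a single layer thickness.
t = random.uniform(50, 300)        # random starting design
best = t
T = 50.0                           # initial temperature
for step in range(2000):
    cand = t + random.gauss(0, 5)  # perturb the current design
    delta = merit(cand) - merit(t)
    # Always accept improvements; accept worse designs with a
    # probability that shrinks as the temperature cools.
    if delta > 0 or random.random() < math.exp(delta / T):
        t = cand
    if merit(t) > merit(best):
        best = t
    T *= 0.995                     # geometric cooling schedule
```

The early high-temperature phase lets the search escape local optima; as T cools the walk becomes greedy and settles near the target thickness. The paper's framework adds Random Search and Particle Swarm Optimization plus a reduced search space on top of this basic mechanism.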

  6. Responding to User's Expectation in the Library: Innovative Web 2.0 Applications at JUIT Library: A Case Study

    ERIC Educational Resources Information Center

    Ram, Shri; Anbu K., John Paul; Kataria, Sanjay

    2011-01-01

    Purpose: This paper seeks to provide an insight into the implementation of some of the innovative Web 2.0 applications at Jaypee University of Information Technology with the aim of exploring the expectations of the users and their awareness and usage of such applications. Design/methodology/approach: The study was undertaken at the Learning…

  7. Methodology for Variable Fidelity Multistage Optimization under Uncertainty

    DTIC Science & Technology

    2011-03-31

problem selected for the application of the new optimization methodology is a Single Stage To Orbit (SSTO) expendable launch vehicle (ELV). Three...the primary exercise of the variable fidelity optimization portion of the code. SSTO vehicles have been discussed almost exclusively in the context...of reusable launch vehicles (RLV). There is very little discussion in recent literature of SSTO designs which are expendable. In the light of the

  8. VISP 2.0: Methodological Considerations for the Design and Implementation of an Audiodescription Based App to Improve Oral Skills

    ERIC Educational Resources Information Center

    Ibáñez Moreno, Ana; Vermeulen, Anna

    2015-01-01

    In this paper the methodological steps taken in the conception of a new mobile application (app) are introduced. This app, called VISP (Videos for Speaking), is easily accessible and manageable, and is aimed at helping students of English as a Foreign Language (EFL) to improve their idiomaticity in their oral production. In order to do so, the app…

  9. Sharing methodology: a worked example of theoretical integration with qualitative data to clarify practical understanding of learning and generate new theoretical development.

    PubMed

    Yardley, Sarah; Brosnan, Caragh; Richardson, Jane

    2013-01-01

    Theoretical integration is a necessary element of study design if clarification of experiential learning is to be achieved. There are few published examples demonstrating how this can be achieved. This methodological article provides a worked example of research methodology that achieved clarification of authentic early experiences (AEEs) through a bi-directional approach to theory and data. Bi-directional refers to our simultaneous use of theory to guide and interrogate empirical data and the use of empirical data to refine theory. We explain the five steps of our methodological approach: (1) understanding the context; (2) critique on existing applications of socio-cultural models to inform study design; (3) data generation; (4) analysis and interpretation and (5) theoretical development through a novel application of Metis. These steps resulted in understanding of how and why different outcomes arose from students participating in AEE. Our approach offers a mechanism for clarification without which evidence-based effective ways to maximise constructive learning cannot be developed. In our example it also contributed to greater theoretical understanding of the influence of social interactions. By sharing this example of research undertaken to develop both theory and educational practice we hope to assist others seeking to conduct similar research.

  10. Design and Implementation of Embedded Computer Vision Systems Based on Particle Filters

    DTIC Science & Technology

    2010-01-01

for hardware/software implementation of multi-dimensional particle filter application and we explore this in the third application which is a 3D...methodology for hardware/software implementation of multi-dimensional particle filter application and we explore this in the third application which is a...and hence multiprocessor implementation of particle filters is an important option to examine. A significant body of work exists on optimizing generic

  11. RF power harvesting: a review on designing methodologies and applications

    NASA Astrophysics Data System (ADS)

    Tran, Le-Giang; Cha, Hyouk-Kyu; Park, Woo-Tae

    2017-12-01

Wireless power transmission was conceptualized nearly a century ago. Certain achievements made to date have made power harvesting a reality, capable of providing alternative sources of energy. This review provides a summary of radio frequency (RF) power harvesting technologies in order to serve as a guide for the design of RF energy harvesting units. Since energy harvesting circuits are designed to operate with relatively small voltages and currents, they rely on state-of-the-art electrical technology for obtaining high efficiency. Thus, comprehensive analysis and discussions of various designs and their tradeoffs are included. Finally, recent applications of RF power harvesting are outlined.
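The link-budget reasoning behind such design guides can be sketched with the standard Friis free-space equation plus a rectifier-efficiency term. The transmit power, antenna gains, distance, and 50% conversion efficiency below are hypothetical values chosen only to illustrate the calculation.

```python
import math

def harvested_power_dbm(p_tx_dbm, g_tx_dbi, g_rx_dbi, freq_hz, dist_m,
                        efficiency=0.5):
    """Friis free-space received power, derated by rectifier efficiency."""
    wavelength = 3e8 / freq_hz
    # Free-space path loss in dB.
    fspl_db = 20 * math.log10(4 * math.pi * dist_m / wavelength)
    p_rx_dbm = p_tx_dbm + g_tx_dbi + g_rx_dbi - fspl_db
    # Fold in RF-to-DC conversion efficiency of the harvesting circuit.
    return p_rx_dbm + 10 * math.log10(efficiency)

# Hypothetical 915 MHz, 1 W EIRP-class source, harvester 3 m away.
p = harvested_power_dbm(p_tx_dbm=30, g_tx_dbi=6, g_rx_dbi=2,
                        freq_hz=915e6, dist_m=3)
```

Even this optimistic scenario yields only milliwatt-scale DC power, which is why harvesting circuits must operate at small voltages and currents and why rectifier efficiency dominates the designs the review compares.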

  12. Design of a Mobile Application for Transfusion Medicine.

    PubMed

    Albornoz, M A; Márquez, S; Rubin, L; Luna, D

    2017-01-01

One of the most frequent errors in transfusion medicine is failure to verify the patient's identity prior to transfusion. This paper describes the design and development of a Mobile Application (MA) for transfusion medicine. The app uses barcode and QR reading technology for the verification of the patient's identity and the administration of blood components when making a blood transfusion. Physicians, developers, technicians of transfusion medicine and a User Centered Design team participated in the design. The inclusion of end users was fundamental to get full representativeness of their workflow. The project was based on agile methodologies of project management and software development.

  13. System perspectives for mobile platform design in m-Health

    NASA Astrophysics Data System (ADS)

    Roveda, Janet M.; Fink, Wolfgang

    2016-05-01

    Advances in integrated circuit technologies have led to the integration of medical sensor front ends with data processing circuits, i.e., mobile platform design for wearable sensors. We discuss design methodologies for wearable sensor nodes and their applications in m-Health. From the user perspective, flexibility, comfort, appearance, fashion, ease-of-use, and visibility are key form factors. From the technology development point of view, high accuracy, low power consumption, and high signal to noise ratio are desirable features. From the embedded software design standpoint, real time data analysis algorithms, application and database interfaces are the critical components to create successful wearable sensor-based products.

  14. Centralized database for interconnection system design. [for spacecraft

    NASA Technical Reports Server (NTRS)

    Billitti, Joseph W.

    1989-01-01

    A database application called DFACS (Database, Forms and Applications for Cabling and Systems) is described. The objective of DFACS is to improve the speed and accuracy of interconnection system information flow during the design and fabrication stages of a project, while simultaneously supporting both the horizontal (end-to-end wiring) and the vertical (wiring by connector) design stratagems used by the Jet Propulsion Laboratory (JPL) project engineering community. The DFACS architecture is centered around a centralized database and program methodology which emulates the manual design process hitherto used at JPL. DFACS has been tested and successfully applied to existing JPL hardware tasks with a resulting reduction in schedule time and costs.

  15. Curriculum Design and Management in the Digital Media U: Applying the Corporate University Concept to a Business Sub-Sector

    ERIC Educational Resources Information Center

    Selby, Les; Russell, David

    2005-01-01

    Purpose: To report on the progress of Digital Media U, a tailor-made portal, learning environment and management system. Design/methodology/approach: Discusses the design of the learning content domains, acquisition of the content and the systems for managing the curriculum in the future, including the application of a new model of accreditation.…

  16. 76 FR 68769 - Bridging the Idea Development Evaluation Assessment and Long-Term Initiative and Total Product...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-07

    ... complementary methodological frameworks of the IDEAL and TPLC initiatives, more comprehensive and applicable... devices, surgical operations, and invasive medical procedures; Unique study designs and reporting methods...

  17. Ethical Dilemmas for the Computational Linguist in the Business World.

    ERIC Educational Resources Information Center

    McCallum-Bayliss, Heather

    1993-01-01

    Reports on a computer application in which collaboration did not precede project design. Important project parameters established without author input presented ethical dilemmas in balancing contract obligations and methodological rigor. (Author/CK)

  18. Intelligent systems engineering methodology

    NASA Technical Reports Server (NTRS)

    Fouse, Scott

    1990-01-01

    An added challenge for the designers of large scale systems such as Space Station Freedom is the appropriate incorporation of intelligent system technology (artificial intelligence, expert systems, knowledge-based systems, etc.) into their requirements and design. This presentation will describe a view of systems engineering which successfully addresses several aspects of this complex problem: design of large scale systems, design with requirements that are so complex they only completely unfold during the development of a baseline system and even then continue to evolve throughout the system's life cycle, design that involves the incorporation of new technologies, and design and development that takes place with many players in a distributed manner yet can be easily integrated to meet a single view of the requirements. The first generation of this methodology was developed and evolved jointly by ISX and the Lockheed Aeronautical Systems Company over the past five years on the Defense Advanced Research Projects Agency/Air Force Pilot's Associate Program, one of the largest, most complex, and most successful intelligent systems constructed to date. As the methodology has evolved it has also been applied successfully to a number of other projects. Some of the lessons learned from this experience may be applicable to Freedom.

  19. Improvement of Steam Turbine Operational Performance and Reliability with using Modern Information Technologies

    NASA Astrophysics Data System (ADS)

    Brezgin, V. I.; Brodov, Yu M.; Kultishev, A. Yu

    2017-11-01

    The report reviews methods for improving steam turbine unit design and operation through the application of modern information technologies. In accordance with the life cycle (LC) support methodology, a conceptual model of the information support system for the main life cycle stages of a steam turbine unit is suggested. A classification system is proposed that ensures the creation of sustainable information links between the engineering team (at the manufacturer's plant) and customer organizations (power plants). The report proposes, studies, and justifies the principle of extending parameterization beyond geometric construction in the design and improvement of steam turbine unit equipment. The report presents a steam turbine unit equipment design methodology based on a new oil-cooler design system developed and implemented by the authors. This design system combines a construction subsystem, characterized by extensive use of family tables and templates, with a computation subsystem that includes a methodology for zone-by-zone thermal-hydraulic design calculations of oil coolers. The report also presents the software developed for operational monitoring and assessment of equipment parameters, as well as its implementation at five power plants.

  20. An Investigation of Experiment Designs for Applications in Biofeedback-Performance Research Methodologies.

    DTIC Science & Technology

    1980-09-01

    used to accomplish the necessary research. One such experiment design and its relationship to validity will be explored next. Nonequivalent Control ...interpreting the results. The nonequivalent control group design is of the quasi-experimental variety and is widely used in educational research. As...biofeedback research literature is the controlled group outcome study. This design has also been discussed in Chapter III in two forms as the

  1. Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design

    NASA Technical Reports Server (NTRS)

    Wuerer, J. E.; Gran, M.; Held, T. W.

    1994-01-01

    The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.

  2. A Function-Behavior-State Approach to Designing Human Machine Interface for Nuclear Power Plant Operators

    NASA Astrophysics Data System (ADS)

    Lin, Y.; Zhang, W. J.

    2005-02-01

    This paper presents an approach to human-machine interface design for control room operators of nuclear power plants. The first step in designing an interface for a particular application is to determine the information content that needs to be displayed. The design methodology for this step is called the interface design framework (hereafter, the framework). Several frameworks have been proposed for applications at varying levels, including process plants. However, none is based on the design and manufacture of the plant system for which the interface is designed. This paper presents an interface design framework which originates from design theory and methodology for general technical systems. Specifically, the framework is based on a set of core concepts of a function-behavior-state model originally proposed by the artificial intelligence research community and widely applied in the design research community. Benefits of this new framework include the provision of a model-based fault diagnosis facility, and the seamless integration of the design (manufacture, maintenance) of plants and the design of human-machine interfaces. The missing linkage between design and operation of a plant was one of the causes of the Three Mile Island nuclear reactor incident. A simulated plant system is presented to explain how to apply this framework in designing an interface. The resulting human-machine interface is discussed; specifically, several fault diagnosis examples are elaborated to demonstrate how this interface could support operators' fault diagnosis in an unanticipated situation.

  3. Benefit-cost methodology study with example application of the use of wind generators

    NASA Technical Reports Server (NTRS)

    Zimmer, R. P.; Justus, C. G.; Mason, R. M.; Robinette, S. L.; Sassone, P. G.; Schaffer, W. A.

    1975-01-01

    An example application for cost-benefit methodology is presented for the use of wind generators. The approach adopted for the example application consisted of the following activities: (1) surveying of the available wind data and wind power system information, (2) developing models which quantitatively described wind distributions, wind power systems, and cost-benefit differences between conventional systems and wind power systems, and (3) applying the cost-benefit methodology to compare a conventional electrical energy generation system with systems which included wind power generators. Wind speed distribution data were obtained from sites throughout the contiguous United States and were used to compute plant factor contours shown on an annual and seasonal basis. Plant factor values (ratio of average output power to rated power) are found to be as high as 0.6 (on an annual average basis) in portions of the central U. S. and in sections of the New England coastal area. Two types of wind power systems were selected for the application of the cost-benefit methodology. A cost-benefit model was designed and implemented on a computer to establish a practical tool for studying the relative costs and benefits of wind power systems under a variety of conditions and to efficiently and effectively perform associated sensitivity analyses.
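The plant factor defined above can be computed by integrating an idealized turbine power curve over a wind-speed distribution. The sketch below uses a Rayleigh distribution and made-up cut-in, rated, and cut-out speeds; these are illustrative assumptions, not the models or data from the study.

```python
import math

def rayleigh_pdf(v, v_mean):
    """Rayleigh wind-speed probability density, a common one-parameter model."""
    return (math.pi * v / (2 * v_mean**2)) * math.exp(-math.pi * v**2 / (4 * v_mean**2))

def turbine_power(v, v_cut_in=4.0, v_rated=12.0, v_cut_out=25.0):
    """Idealized power curve, normalized so rated power = 1.0 (hypothetical speeds)."""
    if v < v_cut_in or v > v_cut_out:
        return 0.0
    if v >= v_rated:
        return 1.0
    # cubic ramp between cut-in and rated speed
    return (v**3 - v_cut_in**3) / (v_rated**3 - v_cut_in**3)

def plant_factor(v_mean, dv=0.01):
    """Plant factor = average output power / rated power,
    integrated numerically over the wind-speed distribution."""
    total, v = 0.0, 0.0
    while v <= 30.0:
        total += turbine_power(v) * rayleigh_pdf(v, v_mean) * dv
        v += dv
    return total

pf = plant_factor(8.0)
print(f"plant factor at 8 m/s mean wind speed: {pf:.2f}")
```

Windier sites yield higher plant factors, which is why the contour values reported in the study peak in the central U.S. and along the New England coast.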

  4. Evolutionary Maps: A new model for the analysis of conceptual development, with application to the diurnal cycle

    NASA Astrophysics Data System (ADS)

    Navarro, Manuel

    2014-05-01

    This paper presents a model of how children generate concrete concepts from perception through processes of differentiation and integration. The model informs the design of a novel methodology (evolutionary maps or emaps), whose implementation on certain domains unfolds the web of itineraries that children may follow in the construction of concrete conceptual knowledge and pinpoints, for each conception, the architecture of the conceptual change that leads to the scientific concept. Remarkably, the generative character of its syntax yields conceptions that, if unknown, amount to predictions that can be tested experimentally. Its application to the diurnal cycle (including the sun's trajectory in the sky) indicates that the model is correct and the methodology works (in some domains). Specifically, said emap predicts a number of exotic trajectories of the sun in the sky that, in the experimental work, were drawn spontaneously both on paper and a dome. Additionally, the application of the emaps theoretical framework in clinical interviews has provided new insight into other cognitive processes. The field of validity of the methodology and its possible applications to science education are discussed.

  5. Heuristic Evaluation of E-Learning Courses: A Comparative Analysis of Two E-Learning Heuristic Sets

    ERIC Educational Resources Information Center

    Zaharias, Panagiotis; Koutsabasis, Panayiotis

    2012-01-01

    Purpose: The purpose of this paper is to discuss heuristic evaluation as a method for evaluating e-learning courses and applications and more specifically to investigate the applicability and empirical use of two customized e-learning heuristic protocols. Design/methodology/approach: Two representative e-learning heuristic protocols were chosen…

  6. Measurement of Productivity and Quality in Non-Marketable Services: With Application to Schools

    ERIC Educational Resources Information Center

    Fare, R.; Grosskopf, S.; Forsund, F. R.; Hayes, K.; Heshmati, A.

    2006-01-01

    Purpose: This paper seeks to model and compute productivity, including a measure of quality, of a service which does not have marketable outputs--namely public education at the micro level. This application is a case study for Sweden public schools. Design/methodology/approach: A Malmquist productivity index is employed which allows for multiple…

  7. Using a Mobile Application to Support Children's Writing Motivation

    ERIC Educational Resources Information Center

    Kanala, Sari; Nousiainen, Tuula; Kankaanranta, Marja

    2013-01-01

    Purpose: The purpose of this paper is to explore the use of the prototype of a mobile application for the enhancement of children's motivation for writing. The results are explored from students' and experts' perspectives. Design/methodology/approach: This study is based on a field trial and expert evaluations of a prototype of a mobile…

  8. Design risk assessment for burst-prone mines: Application in a Canadian mine

    NASA Astrophysics Data System (ADS)

    Cheung, David J.

    A proactive stance towards improving the effectiveness and consistency of risk assessments has recently been adopted by mining companies and industry. Forecasts for the next 10-20 years indicate that ore deposits accessible using shallow mining techniques will diminish. The industry continues to strive for success in "deeper" mining projects in order to keep up with the continuing demand for raw materials. Although the returns are quite profitable, many projects have been sidelined due to high uncertainty and technical risk in the mining of the mineral deposit. Several hardrock mines have faced rockbursting and seismicity problems. Among those reported, mines in South Africa, Australia, and Canada have documented cases of severe rockburst conditions attributed to the mining depth. Severe rockburst conditions, known as "burst-prone" conditions, can be effectively managed through design. Adopting a more robust design can reduce the exposure of workers and equipment to adverse conditions and minimize the economic consequences, which can hinder the bottom line of an operation. This thesis presents a methodology for assessing design risk in burst-prone mines. The methodology includes an evaluation of relative risk ratings for scenarios with options for risk reduction through several design principles. Because rockbursts are a hazard arising from seismic events, the methodology is based on research in the area of mining seismicity, factoring in rockmass failure mechanisms, which result from a combination of mining-induced stress, geological structures, rockmass properties, and mining influences. The methodology was applied to case studies at Craig Mine of Xstrata Nickel in Sudbury, Ontario, which is known to contain seismically active fault zones. A customized risk assessment was created and applied to rockburst case studies, evaluating the seismic vulnerability and consequence for each case. Application of the methodology to Craig Mine demonstrates that changes in the design can reduce both exposure risk (personnel and equipment) and economic risk (revenue and costs). Fatal and catastrophic consequences can be averted through robust planning and design. Two customized approaches were developed to conduct risk assessment of case studies at Craig Mine. First, the Brownfield Approach utilizes the seismic database to determine the seismic hazard from a rating system that evaluates frequency-magnitude, event size, and event-blast relation. Second, the Greenfield Approach utilizes the seismic database, focusing on larger magnitude events, rock type, and geological structure. The customized Greenfield Approach can also be applied in the evaluation of design risk in deep mines with the same setting and conditions as Craig Mine. Other mines with different settings and conditions can apply the principles in the methodology to evaluate design alternatives and risk reduction strategies for burst-prone mines.

  9. Wavefront modulation and subwavelength diffractive acoustics with an acoustic metasurface.

    PubMed

    Xie, Yangbo; Wang, Wenqi; Chen, Huanyang; Konneker, Adam; Popa, Bogdan-Ioan; Cummer, Steven A

    2014-11-24

    Metasurfaces are a family of novel wavefront-shaping devices with planar profile and subwavelength thickness. Acoustic metasurfaces with ultralow profile yet extraordinary wave manipulating properties would be highly desirable for improving the performance of many acoustic wave-based applications. However, designing acoustic metasurfaces with similar functionality to their electromagnetic counterparts remains challenging with traditional metamaterial design approaches. Here we present a design and realization of an acoustic metasurface based on tapered labyrinthine metamaterials. The demonstrated metasurface can not only steer an acoustic beam as expected from the generalized Snell's law, but also exhibits various unique properties such as conversion from propagating wave to surface mode, extraordinary beam-steering and apparent negative refraction through higher-order diffraction. Such designer acoustic metasurfaces provide a new design methodology for acoustic signal modulation devices and may be useful for applications such as acoustic imaging, beam steering, ultrasound lens design and acoustic surface wave-based applications.

  10. Hot emission model for mobile sources: application to the metropolitan region of the city of Santiago, Chile.

    PubMed

    Corvalán, Roberto M; Osses, Mauricio; Urrutia, Cristian M

    2002-02-01

    Depending on the final application, several methodologies for traffic emission estimation have been developed. Emission estimation based on total miles traveled or other average factors is a sufficient approach only for extended areas such as national or worldwide areas. For road emission control and strategy design, microscale analysis based on real-world emission estimation is often required. This involves the actual driving behavior and emission factors of the local vehicle fleet under study. This paper reports on a microscale model for hot road emissions and its application to the metropolitan region of the city of Santiago, Chile. The methodology considers street-by-street hot emission estimation with its temporal and spatial distribution. The input data come from experimental emission factors based on local driving patterns and traffic surveys of traffic flows for different vehicle categories. The methodology developed is able to estimate hourly hot road CO, total unburned hydrocarbons (THCs), particulate matter (PM), and NO(x) emissions for predefined day types and vehicle categories.
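The street-by-street hot emission estimate described above reduces, for each link and hour, to flow × emission factor × link length, summed over vehicle categories. A minimal sketch, with entirely hypothetical emission factors rather than the experimental Santiago values:

```python
# Hypothetical hot emission factors in g/km per vehicle (NOT the Santiago data)
EMISSION_FACTORS_G_PER_KM = {
    "passenger_car": {"CO": 8.0, "THC": 1.1, "NOx": 1.5, "PM": 0.05},
    "diesel_bus":    {"CO": 5.5, "THC": 1.6, "NOx": 12.0, "PM": 0.90},
}

def street_hot_emissions(flows_veh_per_h, length_km):
    """Hot emissions for one street link over one hour, in grams:
    E_pollutant = sum over vehicle categories of flow * EF * link length."""
    totals = {"CO": 0.0, "THC": 0.0, "NOx": 0.0, "PM": 0.0}
    for category, flow in flows_veh_per_h.items():
        for pollutant, ef in EMISSION_FACTORS_G_PER_KM[category].items():
            totals[pollutant] += flow * ef * length_km
    return totals

# One 500 m link carrying 1200 cars/h and 40 buses/h (made-up traffic survey values)
link = street_hot_emissions({"passenger_car": 1200, "diesel_bus": 40}, length_km=0.5)
print(link)
```

Summing such link totals over all streets, day types, and hours yields the spatially and temporally distributed inventory the abstract describes.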

  11. Improving timeliness and efficiency in the referral process for safety net providers: application of the Lean Six Sigma methodology.

    PubMed

    Deckard, Gloria J; Borkowski, Nancy; Diaz, Deisell; Sanchez, Carlos; Boisette, Serge A

    2010-01-01

    Designated primary care clinics largely serve low-income and uninsured patients who present a disproportionate number of chronic illnesses and face great difficulty in obtaining the medical care they need, particularly the access to specialty physicians. With limited capacity for providing specialty care, these primary care clinics generally refer patients to safety net hospitals' specialty ambulatory care clinics. A large public safety net health system successfully improved the effectiveness and efficiency of the specialty clinic referral process through application of Lean Six Sigma, an advanced process-improvement methodology and set of tools driven by statistics and engineering concepts.

  12. Application of Network and Decision Theory to Routing Problems.

    DTIC Science & Technology

    1982-03-01

    special thanks to Major Hal Carter, faculty member, for his help in getting the authors to understand one of the underlying algorithms in the methodology...General Methodology Flowchart...Least Cost/Time Path Algorithm Flowchart...Possible Redundant Arc of Time...minimum time to travel. This was necessary because: 1. The DTN designers did not have a procedure to do so. 2. The various network algorithms to

  13. Applications of aerospace technology

    NASA Technical Reports Server (NTRS)

    Rouse, Doris J.

    1984-01-01

    The objective of the Research Triangle Institute Technology Transfer Team is to assist NASA in achieving widespread utilization of aerospace technology in terrestrial applications. Widespread utilization implies that the application of NASA technology is to benefit a significant sector of the economy and population of the Nation. This objective is best attained by stimulating the introduction of new or improved commercially available devices incorporating aerospace technology. A methodology is presented for the team's activities as an active transfer agent linking NASA Field Centers, industry associations, user groups, and the medical community. This methodology is designed to: (1) identify priority technology requirements in industry and medicine, (2) identify applicable NASA technology that represents an opportunity for a successful solution and commercial product, (3) obtain the early participation of industry in the transfer process, and (4) successfully develop a new product based on NASA technology.

  14. The Application of a Resilience Assessment Approach to Promote Campus Environmental Management: A South African Case Study

    ERIC Educational Resources Information Center

    Muller, Irene; Tempelhoff, Johann

    2016-01-01

    Purpose: This paper aims to outline the benefits of using resilience assessment instead of command and control mechanisms to evaluate sustainable campus environments. Design/Methodology/Approach: An exploratory mixed-method design was followed for the purposes of the project. During the first qualitative phase, a historical timeline of the focal…

  15. Considerations for the Systematic Analysis and Use of Single-Case Research

    ERIC Educational Resources Information Center

    Horner, Robert H.; Swaminathan, Hariharan; Sugai, George; Smolkowski, Keith

    2012-01-01

    Single-case research designs provide a rigorous research methodology for documenting experimental control. If single-case methods are to gain wider application, however, a need exists to define more clearly (a) the logic of single-case designs, (b) the process and decision rules for visual analysis, and (c) an accepted process for integrating…

  16. An application generator for rapid prototyping of Ada real-time control software

    NASA Technical Reports Server (NTRS)

    Johnson, Jim; Biglari, Haik; Lehman, Larry

    1990-01-01

    The need to increase engineering productivity and decrease software life cycle costs in real-time system development establishes a motivation for a method of rapid prototyping. The design by iterative rapid prototyping technique is described. A tool which facilitates such a design methodology for the generation of embedded control software is described.

  17. Identification of Highly Prized Commercial Fish Using a PCR-Based Methodology

    ERIC Educational Resources Information Center

    Moran, Paloma; Garcia-Vasquez, Eva

    2006-01-01

    We report a practical class designed for undergraduate students of Marine Sciences as a part of the Genetics course. The class can also be included in undergraduate studies of food technology. The exercise was designed to emphasize the application of molecular biology techniques to fish species authentication and traceability. After a simple and…

  18. Applied Coastal Oceanography--A Course That Integrates Science and Business.

    ERIC Educational Resources Information Center

    Montvilo, Jerome A.; Levin, Douglas R.

    1998-01-01

    Describes a course designed to teach students the fundamentals of coastal oceanography and the scientific methodologies used in studying this field. Business applications of this information also play an important role in the course. (DDR)

  19. Modern proposal of methodology for retrieval of characteristic synthetic rainfall hyetographs

    NASA Astrophysics Data System (ADS)

    Licznar, Paweł; Burszta-Adamiak, Ewa; Łomotowski, Janusz; Stańczyk, Justyna

    2017-11-01

    The modern engineering practice of designing and modelling complex drainage systems is based on hydrodynamic modelling and has a probabilistic character. Its practical application requires a change in the rainfall models accepted as input. Previously used artificial rainfall models of simplified form, e.g. block precipitation or Euler type II model rainfall, are no longer sufficient. Urgent clarification is needed regarding the methodology for standardized rainfall hyetographs that would take into consideration the specifics of local storm rainfall temporal dynamics. The aim of the paper is to present a proposal for an innovative methodology for determining standardized rainfall hyetographs, based on statistical processing of a collection of actual local precipitation records. The proposed methodology is based on the classification of standardized rainfall hyetographs with the use of cluster analysis. Its application is illustrated with selected rain gauges located in Poland. The synthetic rainfall hyetographs achieved as a final result may be used for hydrodynamic modelling of sewerage systems, including probabilistic determination of the necessary capacity of retention reservoirs.
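The classification step can be sketched as k-means clustering of standardized hyetographs, i.e. dimensionless cumulative rainfall curves. The synthetic storm shapes and the deterministic initialization below are illustrative assumptions, not the Polish rain-gauge data or the authors' exact procedure:

```python
import random

def normalize(hyetograph):
    """Standardize a hyetograph to a dimensionless cumulative mass curve (ends at 1)."""
    total = sum(hyetograph)
    cum, s = [], 0.0
    for depth in hyetograph:
        s += depth
        cum.append(s / total)
    return cum

def kmeans(curves, k, iters=20):
    """Plain k-means on equal-length standardized curves (squared Euclidean distance),
    with deterministic initialization from evenly spaced curves."""
    step = max(1, (len(curves) - 1) // (k - 1))
    centers = [list(curves[min(i * step, len(curves) - 1)]) for i in range(k)]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for c in curves:
            j = min(range(k), key=lambda i: sum((a - b) ** 2 for a, b in zip(c, centers[i])))
            clusters[j].append(c)
        centers = [
            [sum(vals) / len(cl) for vals in zip(*cl)] if cl else centers[i]
            for i, cl in enumerate(clusters)
        ]
    return centers, clusters

# Two made-up storm shapes: front-loaded vs rear-loaded temporal dynamics
rng = random.Random(1)
front = [normalize([8 + rng.random(), 4, 2, 1, 1]) for _ in range(5)]
rear = [normalize([1, 1, 2, 4, 8 + rng.random()]) for _ in range(5)]
centers, clusters = kmeans(front + rear, k=2)
```

Each resulting cluster center is itself a standardized curve, so it can serve directly as a characteristic synthetic hyetograph for that storm class.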

  20. Hardware accelerated high performance neutron transport computation based on AGENT methodology

    NASA Astrophysics Data System (ADS)

    Xiao, Shanjie

    The spatial heterogeneity of next-generation Gen-IV nuclear reactor core designs brings challenges to neutron transport analysis. The Arbitrary Geometry Neutron Transport (AGENT) code is a three-dimensional neutron transport analysis code being developed at the Laboratory for Neutronics and Geometry Computation (NEGE) at Purdue University. It can accurately describe spatial heterogeneity in a hierarchical structure through the R-function solid modeler. The previous version of AGENT coupled the 2D transport MOC solver and the 1D diffusion NEM solver to solve the three-dimensional Boltzmann transport equation. In this research, the 2D/1D coupling methodology was expanded to couple two transport solvers, the radial 2D MOC solver and the axial 1D MOC solver, for better accuracy. The expansion was benchmarked with the widely applied C5G7 benchmark models and two fast breeder reactor models, and showed good agreement with the reference Monte Carlo results. In practice, accurate neutron transport analysis for a full reactor core is still time-consuming, which limits its application. Therefore, another part of this research focused on designing specific hardware based on reconfigurable computing techniques to accelerate AGENT computations. This is the first application of hardware of this type to reactor physics and neutron transport for reactor design. The most time-consuming part of the AGENT algorithm was identified, and the architecture of the AGENT acceleration system was designed based on this analysis. Through parallel computation on the specially designed, highly efficient architecture, the FPGA-based acceleration design achieves high performance at a much lower working frequency than CPUs. Design simulations show that the acceleration design would be able to speed up large-scale AGENT computations by about a factor of 20. The high-performance AGENT acceleration system will drastically shorten the computation time for 3D full-core neutron transport analysis, making the AGENT methodology unique and advantageous, and thus extending the applicability of neutron transport analysis in both industrial engineering and academic research.

  1. Assessing the role of mini-applications in predicting key performance characteristics of scientific and engineering applications

    DOE PAGES

    Barrett, R. F.; Crozier, P. S.; Doerfler, D. W.; ...

    2014-09-28

    Computational science and engineering application programs are typically large, complex, and dynamic, and are often constrained by distribution limitations. As a means of making tractable rapid explorations of scientific and engineering application programs in the context of new, emerging, and future computing architectures, a suite of miniapps has been created to serve as proxies for full scale applications. Each miniapp is designed to represent a key performance characteristic that does or is expected to significantly impact the runtime performance of an application program. In this paper we introduce a methodology for assessing the ability of these miniapps to effectively represent these performance issues. We applied this methodology to four miniapps, examining the linkage between them and an application they are intended to represent. Herein we evaluate the fidelity of that linkage. This work represents the initial steps required to begin to answer the question, "Under what conditions does a miniapp represent a key performance characteristic in a full app?"

  2. Design of helicopter rotors to noise constraints

    NASA Technical Reports Server (NTRS)

    Schaeffer, E. G.; Sternfeld, H., Jr.

    1978-01-01

    Results of the initial phase of a research project to study the design constraints on helicopter noise are presented. These include the calculation of nonimpulsive rotor harmonic and broadband hover noise spectra, over a wide range of rotor design variables and the sensitivity of perceived noise level (PNL) to changes in rotor design parameters. The prediction methodology used correlated well with measured whirl tower data. Application of the predictions to variations in rotor design showed tip speed and thrust as having the most effect on changing PNL.

  3. A Perspective on Computational Human Performance Models as Design Tools

    NASA Technical Reports Server (NTRS)

    Jones, Patricia M.

    2010-01-01

    The design of interactive systems, including levels of automation, displays, and controls, is usually based on design guidelines and iterative empirical prototyping. A complementary approach is to use computational human performance models to evaluate designs. An integrated strategy of model-based and empirical test and evaluation activities is particularly attractive as a methodology for verification and validation of human-rated systems for commercial space. This talk will review several computational human performance modeling approaches and their applicability to design of display and control requirements.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koenenkamp, Rolf

    We report on the design, assembly, operation and application of an aberration-corrected photoemission electron microscope. The instrument used novel hyperbolic mirror-correctors with two and three electrodes that allowed simultaneous correction of spherical and chromatic aberrations. A spatial resolution of 5.4 nm was obtained with this instrument in 2009, and 4.7 nm in subsequent years. New imaging methodology was introduced involving interferometric imaging of light diffraction. This methodology was applied in nano-photonics and in the characterization of surface-plasmon polaritons. Photonic crystals and waveguides, optical antennas and new plasmonic devices such as routers, localizers and filters were designed and demonstrated using the new capabilities offered by the microscope.

  5. Methodology of shell structure reinforcement layout optimization

    NASA Astrophysics Data System (ADS)

    Szafrański, Tomasz; Małachowski, Jerzy; Damaziak, Krzysztof

    2018-01-01

    This paper presents an optimization process for a reinforced shell diffuser intended for a small wind turbine (rated power of 3 kW). The diffuser structure consists of multiple reinforcements and a metal skin. This kind of structure is suitable for optimization in terms of selection of reinforcement density, stringer cross sections, sheet thickness, etc. The optimization approach aims to reduce the amount of work required between the optimization process and the final product design. The proposed optimization methodology is based on the application of a genetic algorithm to generate the optimal reinforcement layout. The obtained results are the basis for modifying the existing Small Wind Turbine (SWT) design.
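A genetic algorithm for reinforcement layout selection can be sketched with a binary chromosome (one gene per candidate stringer position) and a penalty-based fitness that trades mass against a stiffness constraint. The stiffness model and all constants below are toy assumptions, not the paper's structural model:

```python
import random

N_POSITIONS = 12          # candidate stringer positions (hypothetical)
STRINGER_MASS = 1.0       # mass per stringer, arbitrary units
MIN_STIFFNESS = 6.0       # required stiffness, arbitrary units

def stiffness(layout):
    """Toy stand-in for a structural solver: each stringer adds stiffness,
    with adjacent stringers slightly less effective."""
    adjacent_pairs = sum(1 for a, b in zip(layout, layout[1:]) if a and b)
    return sum(layout) - 0.25 * adjacent_pairs

def fitness(layout):
    """Minimize mass while satisfying the stiffness constraint (penalty method)."""
    mass = STRINGER_MASS * sum(layout)
    shortfall = max(0.0, MIN_STIFFNESS - stiffness(layout))
    return -(mass + 100.0 * shortfall)

def genetic_algorithm(pop_size=40, generations=60, seed=2):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(N_POSITIONS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, N_POSITIONS)   # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(N_POSITIONS)] ^= 1  # single-bit mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = genetic_algorithm()
print("layout:", best, "stiffness:", stiffness(best))
```

In a real workflow the toy `stiffness` function would be replaced by a finite element evaluation of the stiffened shell, which is what makes the GA's population-based search attractive: it needs only objective values, not gradients.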

  6. Practicing universal design to actual hand tool design process.

    PubMed

    Lin, Kai-Chieh; Wu, Chih-Fu

    2015-09-01

    UD evaluation principles are difficult to implement in product design. This study proposes a methodology for implementing UD in the design process through user participation. The original UD principles and user experience are used to develop the evaluation items. Differences between product types were considered. Factor analysis and Quantification Theory Type I were used to eliminate evaluation items considered inappropriate and to examine the relationship between evaluation items and product design factors. Product design specifications were established for verification. The results showed that converting user evaluations into key design verification factors, via a generalized evaluation scale based on product attributes, and applying those design factors in product design can improve users' UD evaluations. The design process of this study is expected to contribute to user-centered UD application. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  7. An evaluation of the quality of statistical design and analysis of published medical research: results from a systematic survey of general orthopaedic journals.

    PubMed

    Parsons, Nick R; Price, Charlotte L; Hiskens, Richard; Achten, Juul; Costa, Matthew L

    2012-04-25

    The application of statistics in reported research in trauma and orthopaedic surgery has become ever more important and complex. Despite the extensive use of statistical analysis, it is still a subject that is often not conceptually well understood, resulting in clear methodological flaws and inadequate reporting in many papers. A detailed statistical survey sampled 100 representative orthopaedic papers using a validated questionnaire that assessed the quality of the trial design and statistical analysis methods. The survey found evidence of failings in study design, statistical methodology and presentation of the results. Overall, in 17% (95% confidence interval: 10-26%) of the studies investigated the conclusions were not clearly justified by the results, in 39% (30-49%) of studies a different analysis should have been undertaken and in 17% (10-26%) a different analysis could have made a difference to the overall conclusions. It is only by an improved dialogue between statistician, clinician, reviewer and journal editor that the failings in design methodology and analysis highlighted by this survey can be addressed.
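The intervals quoted above are binomial confidence intervals for a proportion out of the 100 sampled papers. A sketch using the Wilson score interval (the survey's exact method is not stated and may differ) reproduces approximately the reported 10-26% range for the 17% finding:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score interval for a binomial proportion (95% for z = 1.96)."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - half, center + half

# 17 of 100 papers with conclusions not clearly justified by the results
lo, hi = wilson_ci(17, 100)
print(f"{lo:.1%} - {hi:.1%}")   # close to the reported 10%-26% interval
```

The Wilson interval is preferred over the naive normal approximation for proportions near the edges of [0, 1] at this sample size.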

  8. Permanent magnet design methodology

    NASA Technical Reports Server (NTRS)

    Leupold, Herbert A.

    1991-01-01

    Design techniques developed for the exploitation of high energy magnetically rigid materials such as Sm-Co and Nd-Fe-B have resulted in a revolution in kind rather than in degree in the design of a variety of electron guidance structures for ballistic and aerospace applications. Salient examples are listed. Several prototype models were developed. These structures are discussed in some detail: permanent magnet solenoids, transverse field sources, periodic structures, and very high field structures.

  9. Organic Carbamates in Drug Design and Medicinal Chemistry

    PubMed Central

    2016-01-01

    The carbamate group is a key structural motif in many approved drugs and prodrugs. There is an increasing use of carbamates in medicinal chemistry and many derivatives are specifically designed to make drug–target interactions through their carbamate moiety. In this Perspective, we present properties and stabilities of carbamates, reagents and chemical methodologies for the synthesis of carbamates, and recent applications of carbamates in drug design and medicinal chemistry. PMID:25565044

  10. Identification of New Potential Scientific and Technology Areas for DoD Application. Summary of Activities

    DTIC Science & Technology

    1986-07-31

    designer will be able to more rapidly assemble a total software package from perfected modules that can be easily debugged or replaced with more...antinuclear interactions e. gravitational effects of antimatter 2. possible machine parameters and lattice design 3. electron and stochastic cooling needs 4...implementation, reliability requirements; development of design environments and of experimental methodology; technology transfer methods from

  11. Organic carbamates in drug design and medicinal chemistry.

    PubMed

    Ghosh, Arun K; Brindisi, Margherita

    2015-04-09

    The carbamate group is a key structural motif in many approved drugs and prodrugs. There is an increasing use of carbamates in medicinal chemistry and many derivatives are specifically designed to make drug-target interactions through their carbamate moiety. In this Perspective, we present properties and stabilities of carbamates, reagents and chemical methodologies for the synthesis of carbamates, and recent applications of carbamates in drug design and medicinal chemistry.

  12. When paradigms collide at the road rail interface: evaluation of a sociotechnical systems theory design toolkit for cognitive work analysis.

    PubMed

    Read, Gemma J M; Salmon, Paul M; Lenné, Michael G

    2016-09-01

    The Cognitive Work Analysis Design Toolkit (CWA-DT) is a recently developed approach that provides guidance and tools to assist in applying the outputs of CWA to design processes, incorporating the values and principles of sociotechnical systems theory. In this paper, the CWA-DT is evaluated based on an application to improve safety at rail level crossings (RLXs). The evaluation considered the extent to which the CWA-DT met pre-defined methodological criteria and aligned with sociotechnical values and principles. Both process and outcome measures were taken based on the ratings of workshop participants and human factors experts. Overall, workshop participants were positive about the process and indicated that it met the methodological criteria and sociotechnical values. However, expert ratings suggested that the CWA-DT achieved only limited success in producing RLX designs that fully aligned with the sociotechnical approach. Discussion about the appropriateness of the sociotechnical approach in a public safety context is provided. Practitioner Summary: Human factors and ergonomics practitioners need evidence of the effectiveness of methods. A design toolkit for cognitive work analysis, incorporating values and principles from sociotechnical systems theory, was applied to create innovative designs for rail level crossings. Evaluation results based on the application are provided and discussed.

  13. System testing of a production Ada (trademark) project: The GRODY study

    NASA Technical Reports Server (NTRS)

    Seigle, Jeffrey; Esker, Linda; Shi, Ying-Liang

    1990-01-01

    The use of the Ada language and of design methodologies that utilize its features has a strong impact on all phases of the software development project lifecycle. At the National Aeronautics and Space Administration/Goddard Space Flight Center (NASA/GSFC), the Software Engineering Laboratory (SEL) conducted an experiment in parallel development of two flight dynamics systems in FORTRAN and Ada. The teams found some qualitative differences between the system test phases of the two projects. Although planning for and conducting system tests were not generally affected by the use of Ada, solving the problems found during system testing was generally facilitated by Ada constructs and the design methodology. Most problems found in system testing were due not to difficulty with the language or methodology but to lack of experience with the application.

  14. A Step Made Toward Designing Microelectromechanical System (MEMS) Structures With High Reliability

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.

    2003-01-01

    The mechanical design of microelectromechanical systems-particularly for micropower generation applications-requires the ability to predict the strength capacity of load-carrying components over the service life of the device. These microdevices, which typically are made of brittle materials such as polysilicon, show wide scatter (stochastic behavior) in strength as well as a different average strength for different sized structures (size effect). These behaviors necessitate either costly and time-consuming trial-and-error designs or, more efficiently, the development of a probabilistic design methodology for MEMS. Over the years, the NASA Glenn Research Center's Life Prediction Branch has developed the CARES/Life probabilistic design methodology to predict the reliability of advanced ceramic components. In this study, done in collaboration with Johns Hopkins University, the ability of the CARES/Life code to predict the reliability of polysilicon microsized structures with stress concentrations is successfully demonstrated.
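
Probabilistic design codes such as CARES/Life are built on Weibull strength statistics. The sketch below is not the CARES/Life algorithm, only a minimal illustration of the two ideas named in the abstract, stochastic strength and the size effect, using a two-parameter Weibull model with hypothetical polysilicon-like parameters:

```python
import math

def weibull_pof(stress, sigma0, m, volume, v0=1.0):
    """Two-parameter Weibull probability of failure with a simple
    volume (size-effect) scaling: larger specimens are weaker."""
    return 1.0 - math.exp(-(volume / v0) * (stress / sigma0) ** m)

# Hypothetical parameters: Weibull modulus m ~ 10, characteristic
# strength sigma0 = 3.0 GPa, applied stress 2.0 GPa
m, sigma0 = 10.0, 3.0
p_small = weibull_pof(2.0, sigma0, m, volume=1.0)
p_large = weibull_pof(2.0, sigma0, m, volume=10.0)
# The tenfold-larger structure fails more often at the same stress --
# the size effect noted in the abstract.
```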

  15. Design of experiments applications in bioprocessing: concepts and approach.

    PubMed

    Kumar, Vijesh; Bhalla, Akriti; Rathore, Anurag S

    2014-01-01

    Most biotechnology unit operations are complex in nature, with numerous process variables, feed material attributes, and raw material attributes that can have a significant impact on the performance of the process. A design of experiments (DOE)-based approach offers a solution to this conundrum and allows for an efficient estimation of the main effects and the interactions with a minimal number of experiments. Numerous publications illustrate the application of DOE to the development of different bioprocessing unit operations. However, a systematic approach for evaluation of the different DOE designs and for choosing the optimal design for a given application has not been published yet. Through this work we have compared the I-optimal and D-optimal designs to the commonly used central composite and Box-Behnken designs for bioprocess applications. A systematic methodology is proposed for construction of the model and for precise prediction of the responses for the three case studies involving some of the commonly used unit operations in downstream processing. Use of the Akaike information criterion for model selection has been examined and found to be suitable for the applications under consideration. © 2013 American Institute of Chemical Engineers.
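
The Akaike information criterion (AIC) mentioned for model selection trades goodness of fit against model complexity. A minimal sketch on toy data (not the paper's case studies), comparing an intercept-only model against a straight-line fit; the additive constant in the AIC is dropped consistently for both models:

```python
import math

def aic(rss, n, k):
    """AIC for a least-squares fit with k fitted parameters
    (additive constants dropped, same for all candidate models)."""
    return n * math.log(rss / n) + 2 * k

# Toy response data, roughly linear in one factor
x = [0, 1, 2, 3, 4]
y = [0.1, 3.0, 6.1, 8.9, 12.1]
n = len(x)

# Model A: intercept only
ybar = sum(y) / n
rss_a = sum((yi - ybar) ** 2 for yi in y)

# Model B: straight line fitted by ordinary least squares
xbar = sum(x) / n
slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
intercept = ybar - slope * xbar
rss_b = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))

# The lower-AIC model is preferred; here the linear model wins easily.
best = "linear" if aic(rss_b, n, 2) < aic(rss_a, n, 1) else "mean-only"
```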

  16. Methodology for the optimal design of an integrated first and second generation ethanol production plant combined with power cogeneration.

    PubMed

    Bechara, Rami; Gomez, Adrien; Saint-Antonin, Valérie; Schweitzer, Jean-Marc; Maréchal, François

    2016-08-01

    The application of methodologies for the optimal design of integrated processes has seen increased interest in the literature. This article builds on previous works and applies a systematic methodology to an integrated first and second generation ethanol production plant with power cogeneration. The methodology comprises process simulation, heat integration, thermo-economic evaluation, multi-variable evolutionary optimization of exergy efficiency versus capital costs, and process selection via profitability maximization. Optimization generated Pareto solutions with exergy efficiency ranging between 39.2% and 44.4% and capital costs from 210M$ to 390M$. The Net Present Value was positive for only two scenarios, both at low-efficiency, low-hydrolysis points. The minimum cellulosic ethanol selling price was then sought to bring the maximum NPV to zero for the high-efficiency, high-hydrolysis alternatives. The optimal configuration obtained presented maximum exergy efficiency, hydrolyzed bagasse fraction, capital costs and ethanol production rate, and minimum cooling water consumption and power production rate. Copyright © 2016 Elsevier Ltd. All rights reserved.
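
Pareto solutions such as those described are the non-dominated designs when exergy efficiency is maximized and capital cost is minimized. A minimal filter, applied to hypothetical design points chosen within the reported ranges (the points themselves are not from the paper):

```python
def pareto_front(points):
    """Return the non-dominated (efficiency, cost) points when
    efficiency is maximized and cost is minimized."""
    front = []
    for eff, cost in points:
        dominated = any(e >= eff and c <= cost and (e, c) != (eff, cost)
                        for e, c in points)
        if not dominated:
            front.append((eff, cost))
    return sorted(front)

# Hypothetical design alternatives: (exergy efficiency, capital cost in M$)
designs = [(0.392, 210), (0.405, 255), (0.405, 290),
           (0.430, 320), (0.444, 390), (0.420, 360)]
front = pareto_front(designs)
# Dominated points -- more expensive at equal or lower efficiency -- drop out.
```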

  17. Using Mixed Methods to Evaluate a Community Intervention for Sexual Assault Survivors: A Methodological Tale.

    PubMed

    Campbell, Rebecca; Patterson, Debra; Bybee, Deborah

    2011-03-01

    This article reviews current epistemological and design issues in the mixed methods literature and then examines the application of one specific design, a sequential explanatory mixed methods design, in an evaluation of a community-based intervention to improve postassault care for sexual assault survivors. Guided by a pragmatist epistemological framework, this study collected quantitative and qualitative data to understand how the implementation of a Sexual Assault Nurse Examiner (SANE) program affected prosecution rates of adult sexual assault cases in a large midwestern community. Quantitative results indicated that the program was successful in affecting legal systems change and the qualitative data revealed the mediating mechanisms of the intervention's effectiveness. Challenges of implementing this design are discussed, including epistemological and practical difficulties that developed from blending methodologies into a single project. © The Author(s) 2011.

  18. Design considerations and tradeoffs for passive RFID tags

    NASA Astrophysics Data System (ADS)

    Hussien, Faisal A.; Turker, Didem Z.; Srinivasan, Rangakrishnan; Mobarak, Mohamed S.; Cortes, Fernando P.; Sanchez-Sinencio, Edgar

    2005-06-01

    Radio Frequency Identification (RFID) systems are widely used in a variety of tracking, security and tagging applications. Their operation in non-line-of-sight environments makes them superior to similar devices such as barcode and infrared tags. RFID systems span a wide range of applications: medical history storage, dental prosthesis tracking, oil drilling pipe and concrete stress monitoring, tollway services, animal tracking, etc. Passive RFID tags generate their power from the incoming signal and therefore do not require a power source; accordingly, minimizing power consumption and implementation area are usually the main design considerations. This paper presents a complete analysis of the design of a passive RFID tag. A system design methodology is introduced, covering the main issues and tradeoffs between different design parameters. The uplink modulation techniques used (ASK, PSK, FSK, and PWM) are illustrated, showing how to choose the appropriate signaling scheme for a specific data rate, a certain distance of operation and a limited power consumption budget. An antenna system (transmitter and receiver) is proposed that provides the maximum distance of operation within the transmitted power limits stated by FCC regulations. The choice of backscatter modulation scheme for the downlink, ASK-BM or PSK-BM, is examined and the differences between the two are discussed. The key building blocks, such as the charge pump, voltage reference, and the regulator used to generate the DC supply voltage from the incoming RF signal, are discussed along with their design tradeoffs. A complete architecture for a passive RFID tag is provided as an example to illustrate the proposed RFID tag design methodology.
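
For the reader-to-tag link, the maximum operating distance of a passive tag is commonly estimated from the Friis transmission equation. The sketch below uses assumed illustrative values (the FCC-limit 4 W EIRP at 915 MHz, a tag antenna gain of 1.64, and a 100 uW rectifier sensitivity), none of which come from the paper:

```python
import math

def max_read_range(eirp_w, tag_gain, tag_sensitivity_w, freq_hz):
    """Forward-link-limited read range from the Friis equation:
    P_tag = EIRP * G_tag * (lambda / (4*pi*r))^2, solved for r."""
    lam = 3e8 / freq_hz
    return (lam / (4 * math.pi)) * math.sqrt(eirp_w * tag_gain
                                             / tag_sensitivity_w)

# Assumed values: 4 W EIRP at 915 MHz, 2.15 dBi tag antenna (gain 1.64),
# -10 dBm (100 uW) tag power-up threshold
r = max_read_range(4.0, 1.64, 100e-6, 915e6)   # a few metres
```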

  19. A novel anti-windup framework for cascade control systems: an application to underactuated mechanical systems.

    PubMed

    Mehdi, Niaz; Rehan, Muhammad; Malik, Fahad Mumtaz; Bhatti, Aamer Iqbal; Tufail, Muhammad

    2014-05-01

    This paper describes anti-windup compensator (AWC) design methodologies for stable and unstable cascade plants whose cascade controllers face actuator saturation. Two novel full-order decoupling AWC architectures, based on equivalence of the overall closed-loop system, are developed to deal with windup effects. The decoupled architectures formulate the AWC synthesis problem by assuring equivalence of the coupled and the decoupled architectures, instead of relying on an analogy, for cascade control systems. A comparison of the two AWC architectures from an application point of view is provided to consolidate their utilities: one is better in terms of computational complexity for implementation, while the other is suitable for unstable cascade systems. On the basis of these architectures, global AWC design methodologies utilizing linear matrix inequalities (LMIs) are developed for cascade systems facing stability and performance degradation in the event of actuator saturation. These LMIs are synthesized by application of Lyapunov theory, the global sector condition and the ℒ2 gain reduction of the uncertain decoupled nonlinear component of the decoupled architecture. Further, an LMI-based local AWC design methodology is derived by utilizing a local sector condition with a quadratic Lyapunov function to resolve the windup problem for unstable cascade plants under saturation. To demonstrate the effectiveness of the proposed AWC schemes, an underactuated mechanical system, the ball-and-beam, is considered, and details of the simulation and practical implementation results are described. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
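
The LMI-based decoupled AWC synthesis described is too involved for a short sketch, but the windup problem it addresses can be illustrated with classical back-calculation anti-windup on a simple PI loop; the plant and all gains below are hypothetical:

```python
def simulate(awc=True, t_end=10.0, dt=0.001):
    """PI control of a first-order plant (x' = -x + u) with actuator
    saturation; back-calculation anti-windup when awc is True."""
    kp, ki, tt, u_max, r = 1.0, 10.0, 0.1, 1.5, 1.0
    x, integ, peak = 0.0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        e = r - x
        u = kp * e + integ
        u_sat = max(-u_max, min(u_max, u))
        if awc:
            # back-calculation: bleed the integrator while saturated
            integ += dt * (ki * e + (u_sat - u) / tt)
        else:
            integ += dt * ki * e
        x += dt * (-x + u_sat)
        peak = max(peak, x)
    return x, peak

x_awc, peak_awc = simulate(awc=True)
x_raw, peak_raw = simulate(awc=False)
# Integrator windup inflates the overshoot; the compensator reduces it.
```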

  20. Fracture Probability of MEMS Optical Devices for Space Flight Applications

    NASA Technical Reports Server (NTRS)

    Fettig, Rainer K.; Kuhn, Jonathan L.; Moseley, S. Harvey; Kutyrev, Alexander S.; Orloff, Jon

    1999-01-01

    A bending fracture test specimen design is presented for thin elements used in optical devices for space flight applications. The specimen design is insensitive to load position, avoids end effect complications, and can be used to measure strength of membranes less than 2 microns thick. The theoretical equations predicting stress at failure are presented, and a detailed finite element model is developed to validate the equations for this application. An experimental procedure using a focused ion beam machine is outlined, and results from preliminary tests of 1.9 microns thick single crystal silicon are presented. These tests are placed in the context of a methodology for the design and evaluation of mission critical devices comprised of large arrays of cells.

  1. Computational Fluid Dynamics (CFD) Analysis for the Reduction of Impeller Discharge Flow Distortion

    NASA Technical Reports Server (NTRS)

    Garcia, R.; McConnaughey, P. K.; Eastland, A.

    1993-01-01

    The use of Computational Fluid Dynamics (CFD) in the design and analysis of high performance rocket engine pumps has increased in recent years. This increase has been aided by the activities of the Marshall Space Flight Center (MSFC) Pump Stage Technology Team (PSTT). The team's goals include assessing the accuracy and efficiency of several methodologies and then applying the appropriate methodology(s) to understand and improve the flow inside a pump. The PSTT's objectives, team membership, and past activities are discussed in Garcia1 and Garcia2. The PSTT is one of three teams that form the NASA/MSFC CFD Consortium for Applications in Propulsion Technology (McConnaughey3). The PSTT first applied CFD in the design of the baseline consortium impeller. This impeller was designed for the Space Transportation Main Engine's (STME) fuel turbopump. The STME fuel pump was designed with three impeller stages because a two-stage design was deemed to pose a high developmental risk. The PSTT used CFD to design an impeller whose performance allowed for a two-stage STME fuel pump design. The availability of this design would have led to a reduction in parts, weight, and cost had the STME reached production. One sample of the baseline consortium impeller was manufactured and tested in a water rig. The test data showed that the impeller performance was as predicted and that a two-stage design for the STME fuel pump was possible with minimal risk. The test data also verified another CFD-predicted characteristic of the design that was not desirable. The classical 'jet-wake' pattern at the impeller discharge was strengthened by two aspects of the design: by the high head coefficient necessary for the required pressure rise and by the relatively few impeller exit blades, 12, necessary to reduce manufacturing cost. This 'jet-wake' pattern produces an unsteady loading on the diffuser vanes and has, in past rocket engine programs, led to diffuser structural failure.
In industrial applications, this problem is typically avoided by increasing the space between the impeller and the diffuser to allow the dissipation of this pattern and, hence, the reduction of diffuser vane unsteady loading. This approach leads to small performance losses and, more importantly in rocket engine applications, to significant increases in the pump's size and weight. This latter consideration typically makes this approach unacceptable in high performance rocket engines.

  2. Design Methodology for Magnetic Field-Based Soft Tri-Axis Tactile Sensors.

    PubMed

    Wang, Hongbo; de Boer, Greg; Kow, Junwai; Alazmani, Ali; Ghajari, Mazdak; Hewson, Robert; Culmer, Peter

    2016-08-24

    Tactile sensors are essential if robots are to safely interact with the external world and to dexterously manipulate objects. Current tactile sensors have limitations restricting their use, notably being too fragile or having limited performance. Magnetic field-based soft tactile sensors offer a potential improvement, being durable, low cost, accurate and high bandwidth, but they are relatively undeveloped because of the complexities involved in design and calibration. This paper presents a general design methodology for magnetic field-based three-axis soft tactile sensors, enabling researchers to easily develop specific tactile sensors for a variety of applications. All aspects (design, fabrication, calibration and evaluation) of the development of tri-axis soft tactile sensors are presented and discussed. A moving least square approach is used to decouple and convert the magnetic field signal to force output to eliminate non-linearity and cross-talk effects. A case study of a tactile sensor prototype, MagOne, was developed. This achieved a resolution of 1.42 mN in normal force measurement (0.71 mN in shear force), good output repeatability and has a maximum hysteresis error of 3.4%. These results outperform comparable sensors reported previously, highlighting the efficacy of our methodology for sensor design.
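
Moving least squares fits a locally weighted polynomial at each query point. A one-dimensional sketch with a linear basis and Gaussian weights follows; the paper's sensors map a multi-axis magnetic signal to force, so this shows only the core idea, not the authors' actual calibration:

```python
import math

def mls_predict(xq, xs, ys, h=0.3):
    """Moving least squares with a linear basis: solve the 2x2 weighted
    normal equations locally at the query point xq."""
    w = [math.exp(-((xi - xq) / h) ** 2) for xi in xs]
    s0 = sum(w)
    s1 = sum(wi * xi for wi, xi in zip(w, xs))
    s2 = sum(wi * xi * xi for wi, xi in zip(w, xs))
    t0 = sum(wi * yi for wi, yi in zip(w, ys))
    t1 = sum(wi * xi * yi for wi, xi, yi in zip(w, xs, ys))
    det = s0 * s2 - s1 * s1
    a = (s2 * t0 - s1 * t1) / det   # local intercept
    b = (s0 * t1 - s1 * t0) / det   # local slope
    return a + b * xq

# Calibration-style data: sensor signal x -> force y (here exactly y = 2x,
# which a linear-basis MLS reproduces to machine precision)
xs = [i / 10 for i in range(11)]
ys = [2 * xi for xi in xs]
f = mls_predict(0.55, xs, ys)   # ~1.10
```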

  3. Design Methodology for Magnetic Field-Based Soft Tri-Axis Tactile Sensors

    PubMed Central

    Wang, Hongbo; de Boer, Greg; Kow, Junwai; Alazmani, Ali; Ghajari, Mazdak; Hewson, Robert; Culmer, Peter

    2016-01-01

    Tactile sensors are essential if robots are to safely interact with the external world and to dexterously manipulate objects. Current tactile sensors have limitations restricting their use, notably being too fragile or having limited performance. Magnetic field-based soft tactile sensors offer a potential improvement, being durable, low cost, accurate and high bandwidth, but they are relatively undeveloped because of the complexities involved in design and calibration. This paper presents a general design methodology for magnetic field-based three-axis soft tactile sensors, enabling researchers to easily develop specific tactile sensors for a variety of applications. All aspects (design, fabrication, calibration and evaluation) of the development of tri-axis soft tactile sensors are presented and discussed. A moving least square approach is used to decouple and convert the magnetic field signal to force output to eliminate non-linearity and cross-talk effects. A case study of a tactile sensor prototype, MagOne, was developed. This achieved a resolution of 1.42 mN in normal force measurement (0.71 mN in shear force), good output repeatability and has a maximum hysteresis error of 3.4%. These results outperform comparable sensors reported previously, highlighting the efficacy of our methodology for sensor design. PMID:27563908

  4. Cavitation Inside High-Pressure Optically Transparent Fuel Injector Nozzles

    NASA Astrophysics Data System (ADS)

    Falgout, Z.; Linne, M.

    2015-12-01

    Nozzle-orifice flow and cavitation have an important effect on primary breakup of sprays. For this reason, a number of studies in recent years have used injectors with optically transparent nozzles so that orifice flow cavitation can be examined directly. Many of these studies use injection pressures scaled down from realistic injection pressures used in modern fuel injectors, and so the geometry must be scaled up so that the Reynolds number can be matched with the industrial applications of interest. A relatively small number of studies have shown results at or near the injection pressures used in real systems. Unfortunately, neither the specifics of the design of the optical nozzle nor the design methodology used is explained in detail in these papers. Here, a methodology demonstrating how to prevent failure of a finished design made from commonly used optically transparent materials will be explained in detail, and a description of a new design for transparent nozzles which minimizes size and cost will be shown. The design methodology combines Finite Element Analysis with relevant materials science to evaluate the potential for failure of the finished assembly. Finally, test results imaging a cavitating flow at elevated pressures are presented.

  5. Optimization of helicopter airframe structures for vibration reduction considerations, formulations and applications

    NASA Technical Reports Server (NTRS)

    Murthy, T. Sreekanta

    1988-01-01

    Several key issues involved in the application of formal optimization techniques to helicopter airframe structures for vibration reduction are addressed. Considerations important in the optimization of real airframe structures are discussed, including those necessary to establish a relevant set of design variables, constraints and objectives appropriate to the conceptual, preliminary and detailed design phases and to the ground and flight test phases of airframe design. A methodology is suggested for optimization of airframes in various phases of design. Optimization formulations that are unique to helicopter airframes are described and expressions for vibration-related functions are derived. Using a recently developed computer code, the optimization of a Bell AH-1G helicopter airframe is demonstrated.

  6. New methodology for shaft design based on life expectancy

    NASA Technical Reports Server (NTRS)

    Loewenthal, S. H.

    1986-01-01

    The design of power transmission shafting for reliability has not historically received a great deal of attention. However, weight-sensitive aerospace and vehicle applications, and those where the penalties of shaft failure are great, require greater confidence in shaft design than earlier methods provided. This report summarizes a fatigue-strength-based design method for sizing shafts under variable-amplitude loading histories for limited or nonlimited service life. Moreover, application factors such as press-fitted collars, shaft size, residual stresses from shot peening or plating, and corrosive environments can be readily accommodated within the framework of the analysis. Examples are given which illustrate the use of the method, pointing out the large life penalties due to occasional cyclic overloads.
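
As a reference point for what a fatigue-strength-based sizing calculation looks like, the classical ASME-elliptic combined bending-torsion criterion (a textbook formula, not the report's variable-amplitude method) gives a minimum diameter; all numbers below are hypothetical:

```python
import math

def shaft_diameter(m_a, t_m, s_e, s_y, n=2.0):
    """Minimum shaft diameter under fully reversed bending moment m_a
    and steady torque t_m, ASME-elliptic combined criterion:
    d^3 = (32 n / pi) * sqrt((m_a/s_e)^2 + 0.75 (t_m/s_y)^2)."""
    d3 = (32 * n / math.pi) * math.sqrt((m_a / s_e) ** 2
                                        + 0.75 * (t_m / s_y) ** 2)
    return d3 ** (1 / 3)

# Hypothetical steel shaft: 200 N*m reversed bending, 150 N*m steady
# torque, corrected endurance limit 200 MPa, yield strength 400 MPa
d = shaft_diameter(200.0, 150.0, 200e6, 400e6)   # ~0.028 m
```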

  7. SU-E-T-785: Using Systems Engineering to Design HDR Skin Treatment Operation for Small Lesions to Enhance Patient Safety

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saw, C; Baikadi, M; Peters, C

    2015-06-15

    Purpose: Using systems engineering to design the HDR skin treatment operation for small lesions using shielded applicators to enhance patient safety. Methods: Systems engineering is an interdisciplinary field that offers formal methodologies to study, design, implement, and manage complex engineering systems as a whole over their life-cycles. The methodologies deal with human work-processes, coordination of different teams, optimization, and risk management. The V-model of systems engineering emphasizes two streams, the specification and the testing streams. The specification stream consists of user requirements, functional requirements, and design specifications, while the testing stream covers installation, operational, and performance specifications. In applying systems engineering to this project, the user and functional requirements are (a) HDR unit parameters be downloaded from the treatment planning system, (b) dwell times and positions be generated by the treatment planning system, (c) source decay be computer calculated, (d) a double-check system of treatment parameters to comply with the NRC regulation. These requirements are intended to reduce human intervention to improve patient safety. Results: A formal investigation indicated that the user requirements can be satisfied. The treatment operation consists of using the treatment planning system to generate a pseudo plan that is adjusted for different shielded applicators to compute the dwell times. The dwell positions, channel numbers, and the dwell times are verified by the medical physicist and downloaded into the HDR unit. The decayed source strength is transferred to a spreadsheet that computes the dwell times based on the type of applicators and prescribed dose used. Prior to treatment, the source strength, dwell times, dwell positions, and channel numbers are double-checked by the radiation oncologist. No dosimetric parameters are manually calculated.
Conclusion: Systems engineering provides methodologies to effectively design the HDR treatment operation so that human intervention is minimized and patient safety is improved.

  8. Applications of Advanced Experimental Methodologies to AWAVS Training Research. Final Report, May 1977-July 1978.

    ERIC Educational Resources Information Center

    Simon, Charles W.

    A major part of the Naval Training Equipment Center's Aviation Wide Angle Visual System (AWAVS) program involves behavioral research to provide a basis for establishing design criteria for flight trainers. As part of the task of defining the purpose and approach of this program, the applications of advanced experimental methods are explained and…

  9. Mobile Applications and 4G Wireless Networks: A Framework for Analysis

    ERIC Educational Resources Information Center

    Yang, Samuel C.

    2012-01-01

    Purpose: The use of mobile wireless data services continues to increase worldwide. New fourth-generation (4G) wireless networks can deliver data rates exceeding 2 Mbps. The purpose of this paper is to develop a framework of 4G mobile applications that utilize such high data rates and run on small form-factor devices. Design/methodology/approach:…

  10. Effects of Screencasting on the Turkish Undergraduate Students' Achievement and Knowledge Acquisitions in Spreadsheet Applications

    ERIC Educational Resources Information Center

    Tekinarslan, Erkan

    2013-01-01

    The purpose of this study is to investigate the effects of screencasts on the Turkish undergraduate students' achievement and knowledge acquisitions in spreadsheet applications. The methodology of the study is based on a pretest-posttest experimental design with a control group. A total of 66 undergraduate students in two groups (n = 33 in…

  11. The Applications of Computers in Education in Developing Countries--with Specific Reference to the Cost-Effectiveness of Computer-Assisted Instruction.

    ERIC Educational Resources Information Center

    Lai, Kwok-Wing

    Designed to examine the application and cost-effectiveness of computer-assisted instruction (CAI) for secondary education in developing countries, this document is divided into eight chapters. A general introduction defines the research problem, describes the research methodology, and provides definitions of key terms used throughout the paper.…

  12. Space system operations and support cost analysis using Markov chains

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Dean, Edwin B.; Moore, Arlene A.; Fairbairn, Robert E.

    1990-01-01

    This paper evaluates the use of Markov chain process in probabilistic life cycle cost analysis and suggests further uses of the process as a design aid tool. A methodology is developed for estimating operations and support cost and expected life for reusable space transportation systems. Application of the methodology is demonstrated for the case of a hypothetical space transportation vehicle. A sensitivity analysis is carried out to explore the effects of uncertainty in key model inputs.
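
Expected life and accumulated operations-and-support cost follow from the fundamental matrix N = (I - Q)^-1 of an absorbing Markov chain, where Q holds the transitions among transient states. A minimal two-transient-state sketch with notional transition probabilities and per-mission costs (none taken from the paper):

```python
def absorbing_chain_stats(q, cost):
    """Expected steps to absorption and expected accumulated cost for a
    2-transient-state absorbing Markov chain, via N = (I - Q)^-1."""
    a, b = 1 - q[0][0], -q[0][1]
    c, d = -q[1][0], 1 - q[1][1]
    det = a * d - b * c
    n = [[d / det, -b / det], [-c / det, a / det]]   # fundamental matrix
    life = [row[0] + row[1] for row in n]            # N * 1 (expected missions)
    exp_cost = [row[0] * cost[0] + row[1] * cost[1] for row in n]
    return life, exp_cost

# Transient states: 0 = operational, 1 = degraded; absorbing state = retired.
# Per-mission transition probabilities and support costs (M$) are notional.
q = [[0.90, 0.08],
     [0.05, 0.85]]
life, exp_cost = absorbing_chain_stats(q, cost=[1.0, 2.5])
# life[0] ~ 20.9 missions starting operational; exp_cost[0] ~ M$31.8
```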

  13. A study of the selection of microcomputer architectures to automate planetary spacecraft power systems

    NASA Technical Reports Server (NTRS)

    Nauda, A.

    1982-01-01

    Performance and reliability models of alternative microcomputer architectures were examined as a methodology for optimizing system design. A methodology for selecting an optimum microcomputer architecture for autonomous operation of planetary spacecraft power systems was developed. Various microcomputer system architectures were analyzed to determine their application to spacecraft power systems. It is suggested that no standardization formula or common set of guidelines exists which provides an optimum configuration for a given set of specifications.

  14. Application of an Optimal Tuner Selection Approach for On-Board Self-Tuning Engine Models

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Armstrong, Jeffrey B.; Garg, Sanjay

    2012-01-01

    An enhanced design methodology for minimizing the error in on-line Kalman filter-based aircraft engine performance estimation applications is presented in this paper. It specifically addresses the under-determined estimation problem, in which there are more unknown parameters than available sensor measurements. This work builds upon an existing technique for systematically selecting a model tuning parameter vector of appropriate dimension to enable estimation by a Kalman filter, while minimizing the estimation error in the parameters of interest. While the existing technique was optimized for open-loop engine operation at a fixed design point, in this paper an alternative formulation is presented that enables the technique to be optimized for an engine operating under closed-loop control throughout the flight envelope. The theoretical Kalman filter mean squared estimation error at a steady-state closed-loop operating point is derived, and the tuner selection approach applied to minimize this error is discussed. A technique for constructing a globally optimal tuning parameter vector, which enables full-envelope application of the technology, is also presented, along with design steps for adjusting the dynamic response of the Kalman filter state estimates. Results from the application of the technique to linear and nonlinear aircraft engine simulations are presented and compared to the conventional approach of tuner selection. The new methodology is shown to yield a significant improvement in on-line Kalman filter estimation accuracy.
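
The Kalman filter estimation error at a steady-state operating point is governed by the discrete Riccati equation. A scalar sketch (illustrating the machinery only, not the paper's tuner-selection algorithm) iterates the recursion until the gain settles:

```python
def steady_state_gain(a, c, q, r, iters=2000):
    """Iterate the discrete Riccati recursion for the scalar system
    x_k+1 = a*x_k + w (var q), y_k = c*x_k + v (var r)
    until the Kalman gain reaches steady state."""
    p = 1.0                              # posterior covariance, any start
    for _ in range(iters):
        m = a * p * a + q                # predicted (prior) covariance
        k = m * c / (c * m * c + r)      # Kalman gain
        p = (1 - k * c) * m              # updated (posterior) covariance
    return k

# Random-walk state (a = 1) observed in unit-variance noise
k = steady_state_gain(a=1.0, c=1.0, q=0.01, r=1.0)   # ~0.095
```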

  15. Time-resolved methods in biophysics. 9. Laser temperature-jump methods for investigating biomolecular dynamics.

    PubMed

    Kubelka, Jan

    2009-04-01

    Many important biochemical processes occur on the time-scales of nanoseconds and microseconds. The introduction of the laser temperature-jump (T-jump) to biophysics more than a decade ago opened these previously inaccessible time regimes up to direct experimental observation. Since then, laser T-jump methodology has evolved into one of the most versatile and generally applicable methods for studying fast biomolecular kinetics. This perspective is a review of the principles and applications of the laser T-jump technique in biophysics. A brief overview of the T-jump relaxation kinetics and the historical development of laser T-jump methodology is presented. The physical principles and practical experimental considerations that are important for the design of the laser T-jump experiments are summarized. These include the Raman conversion for generating heating pulses, considerations of size, duration and uniformity of the temperature jump, as well as potential adverse effects due to photo-acoustic waves, cavitation and thermal lensing, and their elimination. The laser T-jump apparatus developed at the NIH Laboratory of Chemical Physics is described in detail along with a brief survey of other laser T-jump designs in use today. Finally, applications of the laser T-jump in biophysics are reviewed, with an emphasis on the broad range of problems where the laser T-jump methodology has provided important new results and insights into the dynamics of the biomolecular processes.
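
    For a two-state system, the relaxation after a small T-jump is single-exponential with observed rate k_obs = k_fold + k_unfold. A minimal sketch with hypothetical rate constants shows how that rate is recovered from a synthetic trace:

```python
import numpy as np

# two-state folding kinetics after a T-jump; all rate and signal
# values here are illustrative, not from the review
k_fold, k_unfold = 4.0e4, 1.0e4            # s^-1
k_obs = k_fold + k_unfold

t = np.linspace(0.0, 2.0e-4, 400)          # 0-200 us observation window
trace = 0.8 + 0.2 * np.exp(-k_obs * t)     # e.g. a fluorescence signal

# log-linear fit of the early decay recovers the observed relaxation rate
amp = trace[:100] - trace[-1]              # subtract the final baseline
slope, _ = np.polyfit(t[:100], np.log(amp), 1)
k_fit = -slope

print(abs(k_fit - k_obs) / k_obs < 0.01)   # recovered within 1%
```

    Note that k_obs alone does not separate the folding and unfolding rates; in practice the equilibrium constant (from the relaxation amplitude or an independent melt) supplies the second equation.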

  16. Discrete Abstractions of Hybrid Systems: Verification of Safety and Application to User-Interface Design

    NASA Technical Reports Server (NTRS)

    Oishi, Meeko; Tomlin, Claire; Degani, Asaf

    2003-01-01

    Human interaction with complex hybrid systems involves the user, the automation's discrete mode logic, and the underlying continuous dynamics of the physical system. Often the user-interface of such systems displays a reduced set of information about the entire system. In safety-critical systems, how can we identify user-interface designs which do not have adequate information, or which may confuse the user? Here we describe a methodology, based on hybrid system analysis, to verify that a user-interface contains information necessary to safely complete a desired procedure or task. Verification within a hybrid framework allows us to account for the continuous dynamics underlying the simple, discrete representations displayed to the user. We provide two examples: a car traveling through a yellow light at an intersection and an aircraft autopilot in a landing/go-around maneuver. The examples demonstrate the general nature of this methodology, which is applicable to hybrid systems (not fully automated) which have operational constraints we can pose in terms of safety. This methodology differs from existing work in hybrid system verification in that we directly account for the user's interactions with the system.
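
    In its simplest form, the yellow-light example reduces to a two-mode safety predicate over the continuous state (speed, distance). A minimal sketch with hypothetical parameter values:

```python
# discrete abstraction of the yellow-light example (all numbers
# hypothetical): a state is "safe" if the car can either stop before
# the stop line or clear the intersection before the light turns red
def safe(v, d, t_yellow=3.0, a_max=6.0, width=15.0):
    """v: speed (m/s); d: distance to the stop line (m)."""
    can_stop = v * v / (2.0 * a_max) <= d      # braking-distance check
    can_clear = v * t_yellow >= d + width      # constant-speed crossing
    return can_stop or can_clear

# a verified interface need only let the driver distinguish these modes
print(safe(10.0, 40.0))   # slow and far: can stop
print(safe(25.0, 10.0))   # fast and close: can clear
print(safe(35.0, 95.0))   # neither stop nor clear: unsafe
```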

  17. Methodology of citrate-based biomaterial development and application

    NASA Astrophysics Data System (ADS)

    Tran, M. Richard

    Biomaterials play central roles in modern strategies of regenerative medicine and tissue engineering. Attempts to find tissue-engineered solutions to cure various injuries or diseases have led to an enormous increase in the number of polymeric biomaterials over the past decade. The breadth of new materials arises from the multiplicity of anatomical locations, cell types, and modes of application, which all place application-specific requirements on the biomaterial. Unfortunately, many of the currently available biodegradable polymers are limited in their versatility to meet the wide range of requirements for tissue engineering. Therefore, a methodology of biomaterial development, which is able to address a broad spectrum of requirements, would be beneficial to the biomaterial field. This work presents a methodology of citrate-based biomaterial design and application to meet the multifaceted needs of tissue engineering. We hypothesize that (1) citric acid, a non-toxic metabolic product of the body (Krebs Cycle), can be exploited as a universal multifunctional monomer and reacted with various diols to produce a new class of soft biodegradable elastomers with the flexibility to tune the material properties of the resulting material to meet a wide range of requirements; (2) the newly developed citrate-based polymers can be used as platform biomaterials for the design of novel tissue engineering scaffolding; and (3) microengineering approaches in the form of thin scaffold sheets, microchannels, and a new porogen design can be used to generate complex cell-cell and cell-microenvironment interactions to mimic tissue complexity and architecture.
To test these hypotheses, we first developed a methodology of citrate-based biomaterial development through the synthesis and characterization of a family of in situ crosslinkable and urethane-doped elastomers, which are synthesized using simple, cost-effective strategies and offer a variety of methods to tailor the material properties to meet the needs of a particular application. Next, we introduced a new porogen generation technique, and showed the potential application of the newly developed materials through the fabrication and characterization of scaffold sheets, multiphasic small-diameter vascular grafts, and multichanneled nerve guides. Finally, the in vivo applications of citrate-based materials are exemplified through the evaluation of peripheral nerve regeneration using multichanneled guides and the ability to assist in injection-based endoscopic mucosal resection therapy. The results presented in this work show that citric acid can be utilized as a cornerstone in the development of novel biodegradable materials, and combined with microengineering approaches to produce the next generation of tissue engineering scaffolding. These enabling new biomaterials and scaffolding strategies should address many of the existing challenges in tissue engineering and advance the field as a whole.

  18. A product-service system approach to telehealth application design.

    PubMed

    Flores-Vaquero, Paul; Tiwari, Ashutosh; Alcock, Jeffrey; Hutabarat, Windo; Turner, Chris

    2016-06-01

    A considerable proportion of current point-of-care devices do not offer a wide enough set of capabilities if they are to function in any telehealth system. There is a need for intermediate devices that lie between healthcare devices and service networks. The development of an application is suggested that allows a smartphone to take the role of an intermediate device. This research seeks to identify the telehealth service requirements for long-term condition management using a product-service system approach. The product-service system approach has proven to be a suitable methodology for the design and development of telehealth smartphone applications. © The Author(s) 2014.

  19. MOEMS optical delay line for optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Choudhary, Om P.; Chouksey, S.; Sen, P. K.; Sen, P.; Solanki, J.; Andrews, J. T.

    2014-09-01

    Micro-Opto-Electro-Mechanical optical coherence tomography, a lab-on-chip for biomedical applications, is designed, studied, fabricated and characterized. The standard PolyMUMPS process is adopted to fabricate the device. We report the utilization of an electro-optic modulator as a fast scanning optical delay line for time-domain optical coherence tomography. Design optimization is performed using Tanner EDA, while simulations are performed using COMSOL. The paper summarizes the various results and the fabrication methodology adopted. The success of the device promises a future hand-held or endoscopic optical coherence tomography system for biomedical applications.

  20. The application of quadratic optimal cooperative control synthesis to a CH-47 helicopter

    NASA Technical Reports Server (NTRS)

    Townsend, Barbara K.

    1987-01-01

    A control-system design method, quadratic optimal cooperative control synthesis (CCS), is applied to the design of a stability and control augmentation system (SCAS). The CCS design method is different from other design methods in that it does not require detailed a priori design criteria, but instead relies on an explicit optimal pilot-model to create desired performance. The design method, which was developed previously for fixed-wing aircraft, is simplified and modified for application to a Boeing CH-47 helicopter. Two SCAS designs are developed using the CCS design methodology. The resulting CCS designs are then compared with designs obtained using classical/frequency-domain methods and linear quadratic regulator (LQR) theory in a piloted fixed-base simulation. Results indicate that the CCS method, with slight modifications, can be used to produce controller designs which compare favorably with the frequency-domain approach.

  1. The application of quadratic optimal cooperative control synthesis to a CH-47 helicopter

    NASA Technical Reports Server (NTRS)

    Townsend, Barbara K.

    1986-01-01

    A control-system design method, Quadratic Optimal Cooperative Control Synthesis (CCS), is applied to the design of a Stability and Control Augmentation System (SCAS). The CCS design method is different from other design methods in that it does not require detailed a priori design criteria, but instead relies on an explicit optimal pilot-model to create desired performance. The design method, which was developed previously for fixed-wing aircraft, is simplified and modified for application to a Boeing Vertol CH-47 helicopter. Two SCAS designs are developed using the CCS design methodology. The resulting CCS designs are then compared with designs obtained using classical/frequency-domain methods and Linear Quadratic Regulator (LQR) theory in a piloted fixed-base simulation. Results indicate that the CCS method, with slight modifications, can be used to produce controller designs which compare favorably with the frequency-domain approach.

  2. A response surface methodology based damage identification technique

    NASA Astrophysics Data System (ADS)

    Fang, S. E.; Perera, R.

    2009-06-01

    Response surface methodology (RSM) is a combination of statistical and mathematical techniques used to represent the relationship between the inputs and outputs of a physical system by explicit functions. This methodology has been widely employed in many applications such as design optimization, response prediction and model validation. So far, however, the literature related to its application in structural damage identification (SDI) is scarce. Therefore this study attempts to present a systematic SDI procedure comprising four sequential steps of feature selection, parameter screening, primary response surface (RS) modeling and updating, and reference-state RS modeling with SDI realization using the factorial design (FD) and the central composite design (CCD). The last two steps imply the implementation of inverse problems by model updating, in which the RS models substitute the FE models. The proposed method was verified against a numerical beam, a tested reinforced concrete (RC) frame and an experimental full-scale bridge, with the modal frequency being the output response. It was found that the proposed RSM-based method performs well in predicting the damage of both numerical and experimental structures having single and multiple damage scenarios. The screening capacity of the FD can provide quantitative estimation of the significance levels of updating parameters. Meanwhile, the second-order polynomial model established by the CCD provides adequate accuracy in expressing the dynamic behavior of a physical system.
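
    The core RS step, replacing the expensive FE model with a second-order polynomial fitted at CCD points, can be sketched on a toy two-parameter damage model. The frequency function below is hypothetical, standing in for an FE modal analysis:

```python
import numpy as np
from itertools import product

# toy stand-in for the FE model: first modal frequency (Hz) as a
# function of two stiffness-damage parameters d1, d2
def modal_freq(d1, d2):
    return 100.0 * np.sqrt((1.0 - d1) * (1.0 - 0.5 * d2))

# face-centered central composite design (CCD) in coded units
factorial = list(product((-1.0, 1.0), repeat=2))
axial = [(-1.0, 0.0), (1.0, 0.0), (0.0, -1.0), (0.0, 1.0)]
ccd = np.array(factorial + axial + [(0.0, 0.0)])

lo, hi = 0.0, 0.5
X = lo + (ccd + 1.0) / 2.0 * (hi - lo)      # decode to physical range
y = modal_freq(X[:, 0], X[:, 1])

# second-order polynomial response surface fitted by least squares
def basis(d):
    d1, d2 = d[..., 0], d[..., 1]
    return np.stack([np.ones_like(d1), d1, d2, d1 * d2, d1**2, d2**2], -1)

beta, *_ = np.linalg.lstsq(basis(X), y, rcond=None)

# the cheap RS model replaces the FE model inside the updating loop;
# check its accuracy over the whole damage domain
grid = np.array(list(product(np.linspace(lo, hi, 51), repeat=2)))
max_err = np.max(np.abs(basis(grid) @ beta - modal_freq(grid[:, 0], grid[:, 1])))
print(max_err)   # well under 1 Hz on a ~100 Hz response
```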

  3. Assessing reservoir operations risk under climate change

    USGS Publications Warehouse

    Brekke, L.D.; Maurer, E.P.; Anderson, J.D.; Dettinger, M.D.; Townsley, E.S.; Harrison, A.; Pruitt, T.

    2009-01-01

    Risk-based planning offers a robust way to identify strategies that permit adaptive water resources management under climate change. This paper presents a flexible methodology for conducting climate change risk assessments involving reservoir operations. Decision makers can apply this methodology to their systems by selecting future periods and risk metrics relevant to their planning questions and by collectively evaluating system impacts relative to an ensemble of climate projection scenarios (weighted or not). This paper shows multiple applications of this methodology in a case study involving California's Central Valley Project and State Water Project systems. Multiple applications were conducted to show how choices made in conducting the risk assessment, choices known as analytical design decisions, can affect assessed risk. Specifically, risk was reanalyzed for every choice combination of two design decisions: (1) whether to assume climate change will influence flood-control constraints on water supply operations (and how), and (2) whether to weight climate change scenarios (and how). Results show that assessed risk would motivate different planning pathways depending on decision-maker attitudes toward risk (e.g., risk neutral versus risk averse). Results also show that assessed risk at a given risk attitude is sensitive to the analytical design choices listed above, with the choice of whether to adjust flood-control rules under climate change having considerably more influence than the choice on whether to weight climate scenarios. Copyright 2009 by the American Geophysical Union.
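
    The effect of scenario weighting on assessed risk, one of the two design decisions examined in this record, can be sketched with a toy ensemble. The storage values and weights below are hypothetical:

```python
import numpy as np

# hypothetical end-of-season storage (% of capacity) for one system
# under an ensemble of six downscaled climate projections
storage = np.array([62.0, 48.0, 71.0, 35.0, 55.0, 41.0])

def assessed_risk(weights, threshold=50.0):
    """Probability-weighted chance that storage falls below threshold."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return float(w[storage < threshold].sum())

equal = assessed_risk(np.ones(6))          # unweighted ensemble: 0.5
skill = assessed_risk([1, 2, 1, 3, 1, 2])  # skill-based weights: 0.7
print(equal, skill)
```

    The same shortfall data yields a different assessed risk under the two weighting choices, which is exactly why the paper treats weighting as an analytical design decision.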

  4. Methodology for Sensitivity Analysis, Approximate Analysis, and Design Optimization in CFD for Multidisciplinary Applications

    NASA Technical Reports Server (NTRS)

    Taylor, Arthur C., III; Hou, Gene W.

    1996-01-01

    An incremental iterative formulation, together with the well-known spatially split approximate-factorization algorithm, is presented for solving the large, sparse systems of linear equations that are associated with aerodynamic sensitivity analysis. This formulation is also known as the 'delta' or 'correction' form. For the smaller two-dimensional problems, a direct method can be applied to solve these linear equations in either the standard or the incremental form, in which case the two are equivalent. However, iterative methods are needed for larger two-dimensional and three-dimensional applications because direct methods require more computer memory than is currently available. Iterative methods for solving these equations in the standard form are generally unsatisfactory due to an ill-conditioned coefficient matrix; this problem is overcome when these equations are cast in the incremental form. The methodology is successfully implemented and tested using an upwind cell-centered finite-volume formulation applied in two dimensions to the thin-layer Navier-Stokes equations for external flow over an airfoil. In three dimensions this methodology is demonstrated with a marching-solution algorithm for the Euler equations to calculate supersonic flow over the High-Speed Civil Transport configuration (HSCT 24E). The sensitivity derivatives obtained with the incremental iterative method from a marching Euler code are used in a design-improvement study of the HSCT configuration that involves thickness, camber, and planform design variables.
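
    The incremental ('delta') form can be sketched on a small linear system: an approximate operator is solved cheaply at each pass, but the residual of the exact system drives the update, so the iteration converges to the exact solution. In the sketch below the diagonal of the matrix stands in for the approximate-factorization operator; this substitution is an assumption made purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50
# diagonally dominant system standing in for the sensitivity equations
A = rng.normal(size=(n, n)) + 2.0 * n * np.eye(n)
b = rng.normal(size=n)

# stand-in for the approximate-factorization operator: just diag(A);
# cheap to invert, but only an approximation of A
M = np.diag(np.diag(A))

# incremental ("delta"/correction) form: M dx = b - A x, then x += dx.
# The residual is always that of the *exact* system, so the iteration
# converges to the exact solution despite the approximate operator.
x = np.zeros(n)
for _ in range(200):
    r = b - A @ x
    if np.linalg.norm(r) < 1e-10:
        break
    x += np.linalg.solve(M, r)

print(np.linalg.norm(b - A @ x) < 1e-8)
```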

  5. JPRS Report, East Europe

    DTIC Science & Technology

    1987-11-16

    technological engineering, and design in the subordinate units and takes steps to provide them with necessary technical-material resources; b) It... methodologies and the uniform standards and norms for the branch, subbranches, and other activities, and oversees their manner of application; it...dual subordination, in the field of preparing and fulfilling the annual plans for research, design, and microproduction; f) It participates in the

  6. The Prekindergarten Age-Cutoff Regression-Discontinuity Design: Methodological Issues and Implications for Application

    ERIC Educational Resources Information Center

    Lipsey, Mark W.; Weiland, Christina; Yoshikawa, Hirokazu; Wilson, Sandra Jo; Hofer, Kerry G.

    2015-01-01

    Much of the currently available evidence on the causal effects of public prekindergarten programs on school readiness outcomes comes from studies that use a regression-discontinuity design (RDD) with the age cutoff to enter a program in a given year as the basis for assignment to treatment and control conditions. Because the RDD has high internal…

  7. MODEL-Based Methodology for System of Systems Architecture Development with Application to the Recapitalization of the Future Towing and Salvage Platform

    DTIC Science & Technology

    2008-09-01

    SEP) is a comprehensive, iterative and recursive problem solving process, applied sequentially top-down by integrated teams. It transforms needs...central integrated design repository. It includes a comprehensive behavior modeling notation to understand the dynamics of a design. CORE is a MBSE...

  8. Social capital: theory, evidence, and implications for oral health.

    PubMed

    Rouxel, Patrick L; Heilmann, Anja; Aida, Jun; Tsakos, Georgios; Watt, Richard G

    2015-04-01

    In the last two decades, there has been increasing application of the concept of social capital in various fields of public health, including oral health. However, social capital is a contested concept with debates on its definition, measurement, and application. This study provides an overview of the concept of social capital, highlights the various pathways linking social capital to health, and discusses the potential implication of this concept for health policy. An extensive and diverse international literature has examined the relationship between social capital and a range of general health outcomes across the life course. A more limited but expanding literature has also demonstrated the potential influence of social capital on oral health. Much of the evidence in relation to oral health is limited by methodological shortcomings mainly related to the measurement of social capital, cross-sectional study designs, and inadequate controls for confounding factors. Further research using stronger methodological designs should explore the role of social capital in oral health and assess its potential application in the development of oral health improvement interventions. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  9. Analysis of the possibility of SysML and BPMN application in formal data acquisition system description

    NASA Astrophysics Data System (ADS)

    Ćwikła, G.; Gwiazda, A.; Banaś, W.; Monica, Z.; Foit, K.

    2017-08-01

    The article presents a study of the possible application of selected methods of complex description that can be used to support the Manufacturing Information Acquisition System (MIAS) methodology, which describes how to design a data acquisition system allowing for collecting and processing real-time data on the functioning of a production system, necessary for management of a company. MIAS can enable conversion into a Cyber-Physical Production System. MIAS gathers and pre-processes data on the state of the production system, including e.g. realisation of production orders, state of machines, materials and human resources. A systematised approach and model-based development are proposed for improving the quality of the design of MIAS methodology-based complex systems supporting data acquisition in various types of companies. Graphical specification can be the baseline for any model-based development in specified areas. The possibility of applying SysML and BPMN, two standardised modelling languages representing different approaches to modelling the requirements, architecture and implementation of a data acquisition system, as tools supporting the description of the required features of MIAS was considered.

  10. Statistical Methodologies for the Optimization of Lipase and Biosurfactant by Ochrobactrum intermedium Strain MZV101 in an Identical Medium for Detergent Applications.

    PubMed

    Ebrahimipour, Gholamhossein; Sadeghi, Hossein; Zarinviarsagh, Mina

    2017-09-11

    The Plackett-Burman design and the Box-Behnken design, two statistical methodologies, were employed for the optimization of lipase and biosurfactant production by Ochrobactrum intermedium strain MZV101 in an identical broth medium for detergent applications. The environmental factor pH was determined to be the most significant variable affecting production. A high concentration of molasses at high temperature and pH has a negative effect on lipase and biosurfactant production by O. intermedium strain MZV101. The chosen mathematical method of medium optimization was sufficient for improving the industrial production of lipase and biosurfactant by the bacterium, which were increased 3.46- and 1.89-fold, respectively. The duration of maximum production became 24 h shorter, so the process was fast and cost-saving. In conclusion, lipase and biosurfactant production by O. intermedium strain MZV101 in an identical culture medium at pH 10.5-11 and 50-60 °C, with 1 g/L of molasses, seemed to be economical, fast, and effective for the enhancement of yield percentage for use in detergent applications.
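
    A Plackett-Burman screening design of the kind named in this record can be sketched directly: the standard 12-run generator row is cyclically shifted and a row of minus ones appended, giving an orthogonal design from which main effects are estimated independently. The response model and factor roles below are hypothetical:

```python
import numpy as np

# 12-run Plackett-Burman screening design (standard generator row)
gen = np.array([1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1])
rows = [np.roll(gen, i) for i in range(11)]
design = np.vstack(rows + [-np.ones(11, dtype=int)])   # 12 x 11, orthogonal

# in a balanced orthogonal design, each main effect is simply
# mean(y | x_j = +1) - mean(y | x_j = -1), computed independently
def main_effects(y):
    return design.T @ y / 6.0

# hypothetical screening response in which only factor 0 (say, pH) and
# factor 3 (say, temperature) actually matter
rng = np.random.default_rng(2)
y = 5.0 + 2.0 * design[:, 0] - 1.5 * design[:, 3] + 0.05 * rng.normal(size=12)
effects = main_effects(y)
print(np.round(effects, 2))   # effects 0 and 3 dominate the rest
```

    Screening with such a design flags the few influential factors, which are then carried into a response-surface design such as the Box-Behnken for quantitative optimization.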

  11. Design and experimental characterization of flexure activated by SMA wires for microassembly operations

    NASA Astrophysics Data System (ADS)

    Flores, Abiud; Ahuett, Horacio; Song, Gangbing

    2006-03-01

    Compliant mechanisms have a wide range of applications in microassembly, micromanipulation and microsurgery. This article presents a low-cost Flexure-Stage actuated by two SMA wires that produces displacement in one direction over a range of 0 to 10 μm. The Flexure-Stage acts as a mechanical transformer by reducing and changing the direction of the SMA actuator output displacement. The Flexure-Stage system has its application in microassembly operations and was built at a cost of US$35. The design methodology of a flexure-stage, from concept design through FEA modeling and finally to construction and characterization, is presented in this paper.

  12. Preloaded joint analysis methodology for space flight systems

    NASA Technical Reports Server (NTRS)

    Chambers, Jeffrey A.

    1995-01-01

    This report contains a compilation of some of the most basic equations governing simple preloaded joint systems and discusses the more common modes of failure associated with such hardware. It is intended to provide the mechanical designer with the tools necessary for designing a basic bolted joint. Although the information presented is intended to aid in the engineering of space flight structures, the fundamentals are equally applicable to other forms of mechanical design.
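
    The basic preloaded-joint relations such a compilation covers (the short-form torque-preload equation and the stiffness-based split of external load between bolt and clamped members) can be sketched numerically. All values below are illustrative, not taken from the report:

```python
# basic preloaded-joint sizing sketch using standard bolted-joint
# relations; every number here is a made-up example value
T = 20.0          # installation torque, N*m
K = 0.2           # nut factor (dry thread, typical assumption)
d = 0.008         # nominal bolt diameter, m
F_preload = T / (K * d)              # short-form torque-preload relation

# external load splits between bolt and clamped members by stiffness
k_bolt, k_members = 2.0e8, 6.0e8     # N/m
phi = k_bolt / (k_bolt + k_members)  # joint stiffness factor
P_ext = 4000.0                       # applied external tensile load, N

F_bolt = F_preload + phi * P_ext     # total bolt load under P_ext
F_sep = F_preload / (1.0 - phi)      # external load at joint separation

print(F_preload, F_bolt, F_sep > P_ext)   # joint stays clamped
```

    The stiffness factor phi shows why a preloaded joint works: the bolt sees only a small fraction of the external load until the members separate.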

  13. A methodology for double patterning compliant split and design

    NASA Astrophysics Data System (ADS)

    Wiaux, Vincent; Verhaegen, Staf; Iwamoto, Fumio; Maenhoudt, Mireille; Matsuda, Takashi; Postnikov, Sergei; Vandenberghe, Geert

    2008-11-01

    Double Patterning allows the use of water immersion lithography to be extended further at its maximum numerical aperture, NA=1.35. Splitting of design layers to recombine through Double Patterning (DP) enables an effective resolution enhancement. Single polygons may need to be split up (cut) depending on the pattern density and its 2D content. The split polygons recombine at the so-called 'stitching points'. These stitching points may affect the yield due to their sensitivity to process variations. We describe a methodology to ensure robust double patterning by identifying proper split and design guidelines. Using simulations and experimental data, we discuss in particular metal1 first interconnect layers of random LOGIC and DRAM applications at 45nm half-pitch (hp) and 32nm hp, where DP may become the only timely patterning solution.

  14. Aeroelastic optimization methodology for viscous and turbulent flows

    NASA Astrophysics Data System (ADS)

    Barcelos Junior, Manuel Nascimento Dias

    2007-12-01

    In recent years, the development of faster computers and parallel processing allowed the application of high-fidelity analysis methods to the aeroelastic design of aircraft. However, these methods are restricted to the final design verification, mainly due to the computational cost involved in iterative design processes. Therefore, this work is concerned with the creation of a robust and efficient aeroelastic optimization methodology for inviscid, viscous and turbulent flows by using high-fidelity analysis and sensitivity analysis techniques. Most of the research in aeroelastic optimization, for practical reasons, treats the aeroelastic system as a quasi-static inviscid problem. In this work, as a first step toward the creation of a more complete aeroelastic optimization methodology for realistic problems, an analytical sensitivity computation technique was developed and tested for quasi-static aeroelastic viscous and turbulent flow configurations. Viscous and turbulent effects are included by using an averaged discretization of the Navier-Stokes equations, coupled with an eddy viscosity turbulence model. For quasi-static aeroelastic problems, the traditional staggered solution strategy has unsatisfactory performance when applied to cases where there is a strong fluid-structure coupling. Consequently, this work also proposes a solution methodology for aeroelastic and sensitivity analyses of quasi-static problems, which is based on the fixed point of an iterative nonlinear block Gauss-Seidel scheme. The methodology can also be interpreted as the solution of the Schur complement of the aeroelastic and sensitivity analyses linearized systems of equations. The methodologies developed in this work are tested and verified by using realistic aeroelastic systems.
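
    The staggered, nonlinear block Gauss-Seidel fixed point at the heart of this methodology can be sketched on a one-degree-of-freedom toy: a torsional spring (the "structure") coupled to a twist-dependent aerodynamic moment (the "fluid"). All values are hypothetical:

```python
import numpy as np

# toy quasi-static aeroelastic model, illustrative values only
k_theta = 2000.0                 # torsional stiffness, N*m/rad
q, S, c = 500.0, 1.5, 0.3        # dynamic pressure, area, chord

def fluid_solve(theta):
    """'Fluid' block: aerodynamic moment for a given twist angle."""
    return q * S * c * (1.2 * np.sin(theta) + 0.05)

def structure_solve(moment):
    """'Structure' block: twist produced by a given moment."""
    return moment / k_theta

# nonlinear block Gauss-Seidel: alternate the two "solvers" until the
# coupled state stops changing, i.e. until the fixed point is reached
theta = 0.0
for _ in range(100):
    theta_new = structure_solve(fluid_solve(theta))
    if abs(theta_new - theta) < 1e-14:
        break
    theta = theta_new

residual = k_theta * theta - fluid_solve(theta)
print(theta, abs(residual) < 1e-9)
```

    The iteration converges here because the aerodynamic stiffness is much smaller than the structural stiffness; for strong coupling the contraction degrades, which is the abstract's motivation for moving beyond the plain staggered scheme.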

  15. Health and safety impacts of nuclear, geothermal, and fossil-fuel electric generation in California. Volume 9. Methodologies for review of the health and safety aspects of proposed nuclear, geothermal, and fossil-fuel sites and facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nero, A.V.; Quinby-Hunt, M.S.

    1977-01-01

    This report sets forth methodologies for review of the health and safety aspects of proposed nuclear, geothermal, and fossil-fuel sites and facilities for electric power generation. The review is divided into a Notice of Intention process and an Application for Certification process, in accordance with the structure to be used by the California Energy Resources Conservation and Development Commission, the first emphasizing site-specific considerations, the second examining the detailed facility design as well. The Notice of Intention review is divided into three possible stages: an examination of emissions and site characteristics, a basic impact analysis, and an assessment of public impacts. The Application for Certification review is divided into five possible stages: a review of the Notice of Intention treatment, review of the emission control equipment, review of the safety design, review of the general facility design, and an overall assessment of site and facility acceptability.

  16. Hybrid computational and experimental approach for the study and optimization of mechanical components

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    1998-05-01

    Increased demands on the performance and efficiency of mechanical components impose challenges on their engineering design and optimization, especially when new and more demanding applications must be developed in relatively short periods of time while satisfying design objectives, as well as cost and manufacturability. In addition, reliability and durability must be taken into consideration. As a consequence, effective quantitative methodologies, computational and experimental, should be applied in the study and optimization of mechanical components. Computational investigations enable parametric studies and the determination of critical engineering design conditions, while experimental investigations, especially those using optical techniques, provide qualitative and quantitative information on the actual response of the structure of interest to the applied load and boundary conditions. We discuss a hybrid experimental and computational approach for the investigation and optimization of mechanical components. The approach is based on analytical, computational, and experimental resolution methodologies in the form of computational, noninvasive optical techniques, and fringe prediction analysis tools. Practical application of the hybrid approach is illustrated with representative examples that demonstrate the viability of the approach as an effective engineering tool for analysis and optimization.

  17. Hybrid modeling for quality by design and PAT-benefits and challenges of applications in biopharmaceutical industry.

    PubMed

    von Stosch, Moritz; Davy, Steven; Francois, Kjell; Galvanauskas, Vytautas; Hamelink, Jan-Martijn; Luebbert, Andreas; Mayer, Martin; Oliveira, Rui; O'Kennedy, Ronan; Rice, Paul; Glassey, Jarka

    2014-06-01

    This report highlights the drivers, challenges, and enablers of hybrid modeling applications in the biopharmaceutical industry. It is a summary of an expert panel discussion of European academics and industrialists with relevant scientific and engineering backgrounds. Hybrid modeling is viewed in its broader sense, namely as the integration of different knowledge sources in the form of parametric and nonparametric models into a hybrid semi-parametric model, for instance the integration of fundamental and data-driven models. A brief description of the current state-of-the-art and industrial uptake of the methodology is provided. The report concludes with a number of recommendations to facilitate further developments and a wider industrial application of this modeling approach. These recommendations are limited to further exploiting the benefits of this methodology within process analytical technology (PAT) applications in the biopharmaceutical industry. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
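
    A hybrid semi-parametric model in the sense used here couples a fundamental balance equation with a data-driven sub-model. The minimal sketch below, with invented data, uses a fitted polynomial as the "nonparametric" specific growth rate inside a parametric biomass/substrate balance:

```python
import numpy as np

# data-driven part: specific growth rate fitted to (invented) lab data
S_data = np.array([0.5, 1.0, 2.0, 4.0, 8.0])        # substrate, g/L
mu_data = np.array([0.10, 0.17, 0.24, 0.29, 0.32])  # measured rates, 1/h
coef = np.polyfit(S_data, mu_data, 2)               # "black-box" rate model

def mu(S):
    return np.polyval(coef, S)

# fundamental part: Euler integration of biomass and substrate balances
# dX/dt = mu(S) X,  dS/dt = -mu(S) X / Yxs
X, S, Yxs, dt = 0.1, 8.0, 0.5, 0.01
for _ in range(1000):                                # simulate 10 h
    growth = mu(S) * X
    X += growth * dt
    S = max(S - growth / Yxs * dt, 0.0)

print(round(X, 2), round(S, 2))
```

    The mass balance supplies structure the data alone cannot, while the fitted rate captures kinetics too complex to derive mechanistically; this division of labor is the appeal of the hybrid approach for PAT.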

  18. Integrated Evaluation of Reliability and Power Consumption of Wireless Sensor Networks.

    PubMed

    Dâmaso, Antônio; Rosa, Nelson; Maciel, Paulo

    2017-11-05

    Power consumption is a primary interest in Wireless Sensor Networks (WSNs), and a large number of strategies have been proposed to evaluate it. However, those approaches usually consider neither reliability issues nor the power consumption of applications executing in the network. A central concern is the lack of consolidated solutions that enable us to evaluate the power consumption of applications and the network stack while also considering their reliabilities. To solve this problem, we introduce a fully automatic solution for designing power-consumption-aware WSN applications and communication protocols. The solution presented in this paper comprises a methodology to evaluate the power consumption based on the integration of formal models, a set of power consumption and reliability models, a sensitivity analysis strategy to select WSN configurations, and a toolbox named EDEN to fully support the proposed methodology. This solution allows accurately estimating the power consumption of WSN applications and the network stack in an automated way.
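
    A back-of-envelope version of such an integrated evaluation combines a duty-cycle energy model of the node with a series-reliability model of the route. All figures below are hypothetical:

```python
# simple duty-cycle energy / end-to-end reliability estimate for one
# sensor node; every figure here is an illustrative assumption
states = {                        # (current draw in mA, fraction of time)
    "sleep":    (0.005, 0.988),
    "sense":    (1.8,   0.005),
    "cpu":      (8.0,   0.004),
    "radio_tx": (17.4,  0.001),
    "radio_rx": (19.7,  0.002),
}
avg_mA = sum(i * f for i, f in states.values())
battery_mAh = 2400.0
lifetime_days = battery_mAh / avg_mA / 24.0

p_link, hops = 0.98, 4
p_end_to_end = p_link ** hops     # series reliability of the route

print(round(avg_mA, 3), round(lifetime_days), round(p_end_to_end, 3))
```

    Even this crude model exposes the trade-off the paper automates: raising per-hop retransmissions improves p_end_to_end but inflates the radio's time fractions and thus shortens lifetime.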

  19. Concurrent application of TMS and near-infrared optical imaging: methodological considerations and potential artifacts

    PubMed Central

    Parks, Nathan A.

    2013-01-01

    The simultaneous application of transcranial magnetic stimulation (TMS) with non-invasive neuroimaging provides a powerful method for investigating functional connectivity in the human brain and the causal relationships between areas in distributed brain networks. TMS has been combined with numerous neuroimaging techniques including electroencephalography (EEG), functional magnetic resonance imaging (fMRI), and positron emission tomography (PET). Recent work has also demonstrated the feasibility and utility of combining TMS with non-invasive near-infrared optical imaging techniques, functional near-infrared spectroscopy (fNIRS) and the event-related optical signal (EROS). Simultaneous TMS and optical imaging affords a number of advantages over other neuroimaging methods but also involves a unique set of methodological challenges and considerations. This paper describes the methodology of concurrently performing optical imaging during the administration of TMS, focusing on experimental design, potential artifacts, and approaches to controlling for these artifacts. PMID:24065911

  20. Design space pruning heuristics and global optimization method for conceptual design of low-thrust asteroid tour missions

    NASA Astrophysics Data System (ADS)

    Alemany, Kristina

    Electric propulsion has recently become a viable technology for spacecraft, enabling shorter flight times, fewer required planetary gravity assists, larger payloads, and/or smaller launch vehicles. With the maturation of this technology, however, comes a new set of challenges in the area of trajectory design. Because low-thrust trajectory optimization has historically required long run-times and significant user-manipulation, mission design has relied on expert-based knowledge for selecting departure and arrival dates, times of flight, and/or target bodies and gravitational swing-bys. These choices are generally based on known configurations that have worked well in previous analyses or simply on trial and error. At the conceptual design level, however, the ability to explore the full extent of the design space is imperative to locating the best solutions in terms of mass and/or flight times. Beginning in 2005, the Global Trajectory Optimization Competition posed a series of difficult mission design problems, all requiring low-thrust propulsion and visiting one or more asteroids. These problems all had large ranges on the continuous variables---launch date, time of flight, and asteroid stay times (when applicable)---as well as being characterized by millions or even billions of possible asteroid sequences. Even with recent advances in low-thrust trajectory optimization, full enumeration of these problems was not possible within the stringent time limits of the competition. This investigation develops a systematic methodology for determining a broad suite of good solutions to the combinatorial, low-thrust, asteroid tour problem. The target application is for conceptual design, where broad exploration of the design space is critical, with the goal being to rapidly identify a reasonable number of promising solutions for future analysis. The proposed methodology has two steps. 
The first step applies a three-level heuristic sequence developed from the physics of the problem, which allows for efficient pruning of the design space. The second step applies a global optimization scheme to locate a broad suite of good solutions to the reduced problem. The global optimization scheme combines a novel branch-and-bound algorithm with a genetic algorithm and an industry-standard low-thrust trajectory optimization program to solve for the following design variables: asteroid sequence, launch date, times of flight, and asteroid stay times. The methodology is developed on a small sample problem, which is enumerated and solved so that all possible discretized solutions are known. The methodology is then validated by applying it to a larger intermediate sample problem, which also has a known solution. Next, the methodology is applied to several larger combinatorial asteroid rendezvous problems, using previously identified good solutions as validation benchmarks. These problems include the 2nd and 3rd Global Trajectory Optimization Competition problems. The methodology is shown to reduce the number of asteroid sequences requiring low-thrust optimization by 6-7 orders of magnitude relative to the original problem. More than 70% of the previously known good solutions are identified, along with several new solutions that were not previously reported by any of the competitors. Overall, the methodology developed in this investigation provides an organized search technique for the low-thrust mission design of asteroid rendezvous problems.
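    The branch-and-bound idea behind the sequence search can be sketched in a few lines: partial sequences whose accumulated cost already exceeds the best complete sequence are pruned. The asteroids, transfer costs, and the simple additive bound below are invented for illustration; the actual work couples this search with physics-based heuristics, a genetic algorithm, and a low-thrust trajectory optimizer.

```python
# Sketch of branch-and-bound over asteroid sequences.
# Pairwise costs (think delta-v) are hypothetical illustration values.

COST = {
    ('A', 'B'): 1.0, ('A', 'C'): 2.5, ('B', 'A'): 1.1,
    ('B', 'C'): 0.8, ('C', 'A'): 2.4, ('C', 'B'): 0.9,
}
START_COST = {'A': 0.5, 'B': 1.5, 'C': 2.0}  # cost to reach first target

def branch_and_bound(asteroids, seq_len):
    best = (float('inf'), None)
    def expand(seq, cost):
        nonlocal best
        if cost >= best[0]:          # prune: bound already exceeded
            return
        if len(seq) == seq_len:
            best = (cost, seq)       # new incumbent
            return
        for nxt in asteroids:
            if nxt in seq:
                continue
            step = START_COST[nxt] if not seq else COST[(seq[-1], nxt)]
            expand(seq + (nxt,), cost + step)
    expand((), 0.0)
    return best

print(branch_and_bound(['A', 'B', 'C'], 3))
```

    With real problems the branching factor is millions of sequences, which is why the heuristic pruning step that precedes the search is essential.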

  1. Updated methods for traffic impact analysis, including evaluation of innovative intersection designs.

    DOT National Transportation Integrated Search

    2013-12-01

    In 1992, an Applicant's Guide and a Reviewer's Guide to Traffic Impact Analyses to standardize the methodologies for conducting traffic impact analyses (TIAs) in Indiana were developed for the Indiana Department of Transportation (INDOT). The m...

  2. Microfluidic tools for cell biological research

    PubMed Central

    Velve-Casquillas, Guilhem; Le Berre, Maël; Piel, Matthieu; Tran, Phong T.

    2010-01-01

    Summary Microfluidic technology is creating powerful tools for cell biologists to control the complete cellular microenvironment, leading to new questions and new discoveries. We review here the basic concepts and methodologies in designing microfluidic devices, and their diverse cell biological applications. PMID:21152269

  3. Disposable Screen Printed Electrochemical Sensors: Tools for Environmental Monitoring

    PubMed Central

    Hayat, Akhtar; Marty, Jean Louis

    2014-01-01

    Screen printing technology is a widely used technique for the fabrication of electrochemical sensors. This methodology is likely to underpin the progressive drive towards miniaturized, sensitive and portable devices, and has already established its route from “lab-to-market” for a plethora of sensors. The application of these sensors for analysis of environmental samples has been the major focus of research in this field. As a consequence, this work will focus on recent important advances in the design and fabrication of disposable screen printed sensors for the electrochemical detection of environmental contaminants. Special emphasis is given to sensor fabrication methodology, operating details, and performance characteristics for environmental applications. PMID:24932865

  4. Methodological approach and tools for systems thinking in health systems research: technical assistants' support of health administration reform in the Democratic Republic of Congo as an application.

    PubMed

    Ribesse, Nathalie; Bossyns, Paul; Marchal, Bruno; Karemere, Hermes; Burman, Christopher J; Macq, Jean

    2017-03-01

    In the field of development cooperation, interest in systems thinking and complex systems theories as a methodological approach is increasingly recognised, and so it is in health systems research, which informs health development aid interventions. However, practical applications remain scarce to date. The objective of this article is to contribute to the body of knowledge by presenting the tools inspired by systems thinking and complexity theories and the methodological lessons learned from their application. These tools were used in a case study; detailed results of this study are being prepared for publication in additional articles. Applying a complexity 'lens', the subject of the case study is the role of long-term international technical assistance in supporting health administration reform at the provincial level in the Democratic Republic of Congo. The Methods section presents the guiding principles of systems thinking and complex systems, their relevance and implications for the subject under study, and the existing tools associated with those theories which inspired the design of the data collection and analysis process. The tools and their application processes are presented in the Results section, followed in the Discussion section by a critical analysis of their innovative potential and emergent challenges. The overall methodology provides a coherent whole, each tool bringing a different and complementary perspective on the system.

  5. Demonstration of a software design and statistical analysis methodology with application to patient outcomes data sets

    PubMed Central

    Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard

    2013-01-01

    Purpose: With the emergence of clinical outcomes databases as tools utilized routinely within institutions comes the need for software tools to support automated statistical analysis of these large data sets and interinstitutional exchange from independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that addresses both issues. Methods: A software application was constructed to automate analysis of patient outcomes data using a wide range of statistical metrics, by combining use of C#.Net and R code. The accuracy and speed of the code were evaluated using benchmark data sets. Results: The approach provides the data needed to evaluate combinations of statistical measurements for their ability to identify patterns of interest in the data. Through application of the tools to a benchmark data set for dose-response threshold and to SBRT lung data sets, an algorithm was developed that uses receiver operating characteristic curves to identify a threshold value and combines contingency tables, Fisher exact tests, Welch t-tests, and Kolmogorov-Smirnov tests to filter the large data set to identify values demonstrating dose-response. Kullback-Leibler divergences were used to provide additional confirmation. Conclusions: The work demonstrates the viability of the design approach and the software tool for analysis of large data sets. PMID:24320426
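    One common way to pick a threshold from an ROC curve, which may be in the spirit of what the algorithm described above does, is to maximize the Youden index (sensitivity + specificity − 1) over candidate cuts. The sketch below is a generic illustration, not the authors' C#/R implementation, and the dose values and toxicity outcomes are invented.

```python
# Sketch: ROC-based threshold selection via the Youden index.
# The filtered data would then go to Fisher exact, Welch t-, and
# Kolmogorov-Smirnov tests in the paper's pipeline (not shown here).

def youden_threshold(values, outcomes):
    """Return the observed value maximizing sensitivity + specificity - 1."""
    best_j, best_t = -1.0, None
    for t in sorted(set(values)):
        tp = sum(1 for v, o in zip(values, outcomes) if v >= t and o)
        fn = sum(1 for v, o in zip(values, outcomes) if v < t and o)
        tn = sum(1 for v, o in zip(values, outcomes) if v < t and not o)
        fp = sum(1 for v, o in zip(values, outcomes) if v >= t and not o)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0
        if j > best_j:
            best_j, best_t = j, t
    return best_t

# Hypothetical doses (Gy) and toxicity outcomes (True = event observed).
doses = [10, 12, 15, 18, 20, 24, 26, 30]
toxicity = [False, False, False, False, True, True, False, True]
print(youden_threshold(doses, toxicity))
```

    Here the cut that best separates events from non-events is selected; a contingency-table test at that cut then quantifies the evidence for a dose-response relationship.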

  6. Demonstration of a software design and statistical analysis methodology with application to patient outcomes data sets.

    PubMed

    Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard

    2013-11-01

    With the emergence of clinical outcomes databases as tools utilized routinely within institutions comes the need for software tools to support automated statistical analysis of these large data sets and interinstitutional exchange from independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that addresses both issues. A software application was constructed to automate analysis of patient outcomes data using a wide range of statistical metrics, by combining use of C#.Net and R code. The accuracy and speed of the code were evaluated using benchmark data sets. The approach provides the data needed to evaluate combinations of statistical measurements for their ability to identify patterns of interest in the data. Through application of the tools to a benchmark data set for dose-response threshold and to SBRT lung data sets, an algorithm was developed that uses receiver operating characteristic curves to identify a threshold value and combines contingency tables, Fisher exact tests, Welch t-tests, and Kolmogorov-Smirnov tests to filter the large data set to identify values demonstrating dose-response. Kullback-Leibler divergences were used to provide additional confirmation. The work demonstrates the viability of the design approach and the software tool for analysis of large data sets.

  7. Display/control requirements for automated VTOL aircraft

    NASA Technical Reports Server (NTRS)

    Hoffman, W. C.; Kleinman, D. L.; Young, L. R.

    1976-01-01

    A systematic design methodology for pilot displays in advanced commercial VTOL aircraft was developed and refined. The analyst is provided with a step-by-step procedure for conducting conceptual display/control configuration evaluations for simultaneous monitoring and control pilot tasks. The approach consists of three phases: formulation of information requirements, configuration evaluation, and system selection. Both the monitoring and control performance models are based upon the optimal control model of the human operator. Extensions to the conventional optimal control model required in the display design methodology include explicit optimization of control/monitoring attention; simultaneous monitoring and control performance predictions; and indifference threshold effects. The methodology was applied to NASA's experimental CH-47 helicopter in support of the VALT program. The CH-47 application examined system performance at six flight conditions. Four candidate configurations are suggested for evaluation in pilot-in-the-loop simulations and eventual flight tests.

  8. Fuzzy logic controllers for electrotechnical devices - On-site tuning approach

    NASA Astrophysics Data System (ADS)

    Hissel, D.; Maussion, P.; Faucher, J.

    2001-12-01

    Fuzzy logic nowadays offers an interesting alternative to designers of nonlinear control laws for electrical or electromechanical systems. However, due to the large number of tuning parameters, this kind of control is used in only a few industrial applications. This paper proposes a new, very simple, on-site tuning strategy for a PID-like fuzzy logic controller. Using the experimental design methodology, we propose sets of optimized pre-established settings for this kind of fuzzy controller. The proposed parameters depend only on a single on-site open-loop identification test. In this way, this on-site tuning methodology is comparable to the Ziegler-Nichols method for conventional controllers. Experimental results (on a permanent magnet synchronous motor and on a DC/DC converter) demonstrate the effectiveness of this tuning methodology. Finally, the field of validity of the proposed pre-established settings is given.
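    As a point of comparison only: the classic Ziegler-Nichols open-loop (reaction-curve) rules also map one open-loop identification test to controller settings. The sketch below shows those conventional rules, not the paper's pre-established fuzzy settings; the identification values are hypothetical.

```python
# Classic Ziegler-Nichols open-loop PID tuning from a first-order-plus-
# dead-time reaction curve. NOT the paper's fuzzy settings; shown only
# as the conventional analogue of a one-test tuning rule.

def ziegler_nichols_open_loop(K, L, T):
    """PID gains from process gain K, dead time L (s), time constant T (s)."""
    kp = 1.2 * T / (K * L)   # proportional gain
    ti = 2.0 * L             # integral time
    td = 0.5 * L             # derivative time
    return kp, ti, td

# Hypothetical identification result: K = 2.0, L = 0.5 s, T = 4.0 s.
print(ziegler_nichols_open_loop(2.0, 0.5, 4.0))
```

    The paper's contribution is the analogous one-test mapping for the many parameters of a PID-like fuzzy controller, derived via experimental design rather than a fixed formula.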

  9. Semi-Empirical Prediction of Aircraft Low-Speed Aerodynamic Characteristics

    NASA Technical Reports Server (NTRS)

    Olson, Erik D.

    2015-01-01

    This paper lays out a comprehensive methodology for computing a low-speed, high-lift polar, without requiring additional details about the aircraft design beyond what is typically available at the conceptual design stage. Introducing low-order, physics-based aerodynamic analyses allows the methodology to be more applicable to unconventional aircraft concepts than traditional, fully-empirical methods. The methodology uses empirical relationships for flap lift effectiveness, chord extension, drag-coefficient increment and maximum lift coefficient of various types of flap systems as a function of flap deflection, and combines these increments with the characteristics of the unflapped airfoils. Once the aerodynamic characteristics of the flapped sections are known, a vortex-lattice analysis calculates the three-dimensional lift, drag and moment coefficients of the whole aircraft configuration. This paper details the results of two validation cases: a supercritical airfoil model with several types of flaps; and a 12-foot, full-span aircraft model with slats and double-slotted flaps.

  10. Stand-alone flat-plate photovoltaic power systems: System sizing and life-cycle costing methodology for Federal agencies

    NASA Technical Reports Server (NTRS)

    Borden, C. S.; Volkmer, K.; Cochrane, E. H.; Lawson, A. C.

    1984-01-01

    A simple methodology to estimate photovoltaic system size and life-cycle costs in stand-alone applications is presented. It is designed to assist engineers at Government agencies in determining the feasibility of using small stand-alone photovoltaic systems to supply ac or dc power to the load. Photovoltaic system design considerations are presented as well as the equations for sizing the flat-plate array and the battery storage to meet the required load. Cost effectiveness of a candidate photovoltaic system is based on comparison with the life-cycle cost of alternative systems. Examples of alternative systems addressed are batteries, diesel generators, the utility grid, and other renewable energy systems.
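    The sizing equations such a methodology rests on are simple energy balances: the array must replace the daily load at the site's insolation, and the battery must carry the load through a chosen number of sunless days at an allowed depth of discharge. The relationships below are generic; the derating factor, autonomy, and depth-of-discharge values are illustrative assumptions, not the document's coefficients.

```python
# Sketch of generic stand-alone PV sizing equations.
# Derate, autonomy, and depth-of-discharge values are illustrative.

def size_pv_system(load_wh_day, sun_h_day, derate=0.8,
                   autonomy_days=3, depth_of_discharge=0.5, batt_v=12.0):
    """Return (array watts, battery amp-hours) for a stand-alone dc load."""
    array_w = load_wh_day / (sun_h_day * derate)
    batt_ah = load_wh_day * autonomy_days / (depth_of_discharge * batt_v)
    return array_w, batt_ah

# Hypothetical remote load: 600 Wh/day at a site with 5 peak sun hours.
array_w, batt_ah = size_pv_system(600.0, 5.0)
print(round(array_w), round(batt_ah))
```

    Life-cycle costing then compares the capital and replacement costs of this array/battery combination against the alternatives (diesel, grid extension, primary batteries) over the system lifetime.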

  11. Model-Driven Approach for Body Area Network Application Development.

    PubMed

    Venčkauskas, Algimantas; Štuikys, Vytautas; Jusas, Nerijus; Burbaitė, Renata

    2016-05-12

    This paper introduces the sensor-networked IoT model as a prototype to support the design of Body Area Network (BAN) applications for healthcare. Using the model, we analyze the synergistic effect of the functional requirements (data collection from the human body and transferring it to the top level) and non-functional requirements (trade-offs between energy-security-environmental factors, treated as Quality-of-Service (QoS)). We use feature models to represent the requirements at the earliest stage for the analysis and describe a model-driven methodology to design the possible BAN applications. Firstly, we specify the requirements as the problem domain (PD) variability model for the BAN applications. Next, we introduce the generative technology (meta-programming as the solution domain (SD)) and the mapping procedure to map the PD feature-based variability model onto the SD feature model. Finally, we create an executable meta-specification that represents the BAN functionality to describe the variability of the problem domain through transformations. The meta-specification (along with the meta-language processor) is a software generator for multiple BAN-oriented applications. We validate the methodology with experiments and a case study to generate a family of programs for the BAN sensor controllers. This enables an adequate QoS measure to be obtained efficiently through interactive adjustment of the meta-parameter values and re-generation for the concrete BAN application.
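    The generator idea can be illustrated minimally: a meta-specification (here just a parameterized template) is instantiated with feature/QoS parameter values to emit a concrete sensor-controller program. The template, parameter names, and emitted calls below are invented for illustration; the paper's meta-language and processor are far richer.

```python
# Minimal sketch of meta-programming-based generation of a sensor
# controller from feature/QoS parameters. Names are hypothetical.

META_SPEC = """\
// generated BAN sensor controller
const int SAMPLE_PERIOD_MS = {period_ms};
const int CRYPTO_LEVEL = {crypto};   // 0=none, 1=light, 2=full
void loop() {{ read_sensor(); {send_call}; }}
"""

def generate(period_ms, crypto, low_power):
    """Instantiate the meta-specification for one feature configuration."""
    send_call = "send_buffered()" if low_power else "send_now()"
    return META_SPEC.format(period_ms=period_ms, crypto=crypto,
                            send_call=send_call)

program = generate(period_ms=500, crypto=1, low_power=True)
print(program)
```

    Re-running the generator with adjusted meta-parameters yields a new member of the program family, which is how the interactive QoS tuning described above proceeds.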

  12. Model-Driven Approach for Body Area Network Application Development

    PubMed Central

    Venčkauskas, Algimantas; Štuikys, Vytautas; Jusas, Nerijus; Burbaitė, Renata

    2016-01-01

    This paper introduces the sensor-networked IoT model as a prototype to support the design of Body Area Network (BAN) applications for healthcare. Using the model, we analyze the synergistic effect of the functional requirements (data collection from the human body and transferring it to the top level) and non-functional requirements (trade-offs between energy-security-environmental factors, treated as Quality-of-Service (QoS)). We use feature models to represent the requirements at the earliest stage for the analysis and describe a model-driven methodology to design the possible BAN applications. Firstly, we specify the requirements as the problem domain (PD) variability model for the BAN applications. Next, we introduce the generative technology (meta-programming as the solution domain (SD)) and the mapping procedure to map the PD feature-based variability model onto the SD feature model. Finally, we create an executable meta-specification that represents the BAN functionality to describe the variability of the problem domain through transformations. The meta-specification (along with the meta-language processor) is a software generator for multiple BAN-oriented applications. We validate the methodology with experiments and a case study to generate a family of programs for the BAN sensor controllers. This enables an adequate QoS measure to be obtained efficiently through interactive adjustment of the meta-parameter values and re-generation for the concrete BAN application. PMID:27187394

  13. Metropolitan Forensic Anthropology Team (MFAT) studies in identification: 1. Race and sex assessment by discriminant function analysis of the postcranial skeleton.

    PubMed

    Taylor, J V; DiBennardo, R; Linares, G H; Goldman, A D; DeForest, P R

    1984-07-01

    A case study is presented to demonstrate the utility of the team approach to the identification of human remains, and to illustrate a methodological innovation developed by MFAT. Case 1 represents the first of several planned case studies, each designed to present new methodological solutions to standard problems in identification. The present case describes a test, by application, of race and sex assessment of the postcranial skeleton by discriminant function analysis.
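    Applying a postcranial discriminant function amounts to computing a weighted sum of skeletal measurements and comparing it with a sectioning point. The sketch below shows only that mechanical step; the coefficients, measurements, and cutoff are invented for illustration and are NOT the MFAT functions.

```python
# Sketch of applying a linear discriminant function for sex assessment.
# Weights, constant, and cutoff are hypothetical, not published values.

def discriminant_score(measurements_mm, weights, constant):
    """Weighted sum of measurements plus a constant."""
    return sum(w * m for w, m in zip(weights, measurements_mm)) + constant

# Hypothetical two-variable function: femoral head diameter, bicondylar width.
weights, constant, cutoff = (0.10, 0.05), -8.0, 0.0
score = discriminant_score((46.0, 74.0), weights, constant)
print("male" if score > cutoff else "female", round(score, 2))
```

    In practice the coefficients come from discriminant function analysis of a reference population, and classification accuracy depends on how well that population matches the case at hand.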

  14. Enhanced Learning Methodologies and the Implementation of an Identification Course

    NASA Astrophysics Data System (ADS)

    Guidorzi, Roberto

    This paper offers some considerations on the role played by information and communication technologies in the evolution of educational systems, and describes the design philosophy and realization of a basic course on dynamic system identification that relies on constructivist methodologies and on the use of e-learning environments. It also reports some of the students' opinions on the effectiveness of the available tools and on their role in acquiring proficiency in applying identification techniques to modeling real processes.

  15. Emohawk: Searching for a "Good" Emergent Narrative

    NASA Astrophysics Data System (ADS)

    Brom, Cyril; Bída, Michal; Gemrot, Jakub; Kadlec, Rudolf; Plch, Tomáš

    We report on the progress we have achieved in the development of Emohawk, a 3D virtual reality application with an emergent narrative for teaching high-school students and undergraduates the basics of virtual character control, emotion modelling, and narrative generation. In addition, we present a new methodology, used in Emohawk, for purposeful authoring of emergent narratives of Façade's complexity. The methodology is based on a massive automatic search, during the design phase, for stories that are appealing to the audience while forbidding the unappealing ones.

  16. RTL validation methodology on high complexity wireless microcontroller using OVM technique for fast time to market

    NASA Astrophysics Data System (ADS)

    Zhafirah Muhammad, Nurul; Harun, A.; Hambali, N. A. M. A.; Murad, S. A. Z.; Mohyar, S. N.; Isa, M. N.; Jambek, AB

    2017-11-01

    Increased demand for Internet of Things (IoT) based applications has forced a move towards integrated circuits of higher complexity supporting SoCs. Such an increase in complexity poses significant validation challenges and has led researchers to devise a variety of methodologies to overcome them, including dynamic verification, formal verification, and hybrid techniques. It is also very important to discover bugs early in the verification process of an SoC in order to reduce time spent and achieve fast time to market. This paper therefore focuses on a verification methodology that can be applied at the Register Transfer Level (RTL) of an SoC based on the AMBA bus design. The Open Verification Methodology (OVM) offers an easier approach to RTL validation, not as a replacement for the traditional method but as an effort towards fast time to market. Thus, OVM is proposed in this paper as the verification method for larger designs, to avert bottlenecks in the validation platform.

  17. Computer-Aided Sensor Development Focused on Security Issues.

    PubMed

    Bialas, Andrzej

    2016-05-26

    The paper examines intelligent sensor and sensor system development according to the Common Criteria methodology, which is the basic security assurance methodology for IT products and systems. The paper presents how the development process can be supported by software tools, design patterns and knowledge engineering. The automation of this process brings cost-, quality-, and time-related advantages, because the most difficult and most laborious activities are software-supported and design reusability is increased. The paper includes a short introduction to the Common Criteria methodology and its sensor-related applications. In the experimental section the computer-supported and patterns-based IT security development process is presented using the example of an intelligent methane detection sensor. This process is supported by an ontology-based tool for security modeling and analyses. The verified and justified models are transferred straight to the security target specification representing security requirements for the IT product. The novelty of the paper is to provide a patterns-based and computer-aided methodology for sensor development with a view to achieving IT security assurance. The paper summarizes a validation experiment focused on this methodology, adapted for sensor system development, and presents directions for future research.

  18. Computer-Aided Sensor Development Focused on Security Issues

    PubMed Central

    Bialas, Andrzej

    2016-01-01

    The paper examines intelligent sensor and sensor system development according to the Common Criteria methodology, which is the basic security assurance methodology for IT products and systems. The paper presents how the development process can be supported by software tools, design patterns and knowledge engineering. The automation of this process brings cost-, quality-, and time-related advantages, because the most difficult and most laborious activities are software-supported and design reusability is increased. The paper includes a short introduction to the Common Criteria methodology and its sensor-related applications. In the experimental section the computer-supported and patterns-based IT security development process is presented using the example of an intelligent methane detection sensor. This process is supported by an ontology-based tool for security modeling and analyses. The verified and justified models are transferred straight to the security target specification representing security requirements for the IT product. The novelty of the paper is to provide a patterns-based and computer-aided methodology for sensor development with a view to achieving IT security assurance. The paper summarizes a validation experiment focused on this methodology, adapted for sensor system development, and presents directions for future research. PMID:27240360

  19. Materials Selection Criteria for Nuclear Power Applications: A Decision Algorithm

    NASA Astrophysics Data System (ADS)

    Rodríguez-Prieto, Álvaro; Camacho, Ana María; Sebastián, Miguel Ángel

    2016-02-01

    An innovative methodology based on stringency levels is proposed in this paper, improving the current selection method for structural materials used in demanding industrial applications. This paper describes a new approach for quantifying the stringency of materials requirements, based on a novel deterministic algorithm, to prevent potential failures. We have applied the new methodology to different standardized specifications used in pressure vessel design, such as SA-533 Grade B Cl.1 and SA-508 Cl.3 (issued by the American Society of Mechanical Engineers), DIN 20MnMoNi55 (issued by the German Institute of Standardization) and 16MND5 (issued by the French Nuclear Commission), and determined the influence of design code selection. This study is based on key scientific publications on the influence of chemical composition on the mechanical behavior of materials, which were not considered when the technological requirements were established in the aforementioned specifications. For this purpose, a new method to quantify the efficacy of each standard has been developed using a deterministic algorithm. The process of assigning relative weights was performed by consulting a panel of experts in materials selection for reactor pressure vessels to provide a more objective methodology; thus, the resulting mathematical calculations for quantitative analysis are greatly simplified. The final results show that the steel DIN 20MnMoNi55 is the best material option. Additionally, the more recently developed materials DIN 20MnMoNi55, 16MND5 and SA-508 Cl.3 exhibit mechanical requirements more stringent than SA-533 Grade B Cl.1. The methodology presented in this paper can be used as a decision tool in the selection of materials for a wide range of applications.
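    The weighted-scoring step at the heart of such a decision algorithm can be sketched simply: each requirement gets an expert-assigned weight, each candidate material a per-requirement stringency score, and candidates are ranked by the weighted sum. The requirement names, weights, and scores below are illustrative assumptions, not the paper's expert-panel values.

```python
# Sketch of weighted stringency scoring for material selection.
# Weights and per-requirement scores are hypothetical illustrations.

def stringency(scores, weights):
    """Weighted sum of per-requirement stringency scores (weights sum to 1)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[req] * scores[req] for req in weights)

weights = {'chemistry': 0.4, 'toughness': 0.35, 'inspection': 0.25}
candidates = {
    'DIN 20MnMoNi55':      {'chemistry': 9, 'toughness': 8, 'inspection': 7},
    'SA-533 Grade B Cl.1': {'chemistry': 6, 'toughness': 7, 'inspection': 7},
}
ranked = sorted(candidates, key=lambda m: stringency(candidates[m], weights),
                reverse=True)
print(ranked[0])
```

    Eliciting the weights from a panel of domain experts, as the paper does, is what turns this simple arithmetic into a defensible and repeatable selection procedure.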

  20. C3I system modification and EMC (electromagnetic compatibility) methodology, volume 1

    NASA Astrophysics Data System (ADS)

    Wilson, J. L.; Jolly, M. B.

    1984-01-01

    A methodology (i.e., consistent set of procedures) for assessing the electromagnetic compatibility (EMC) of RF subsystem modifications on C3I aircraft was generated during this study (Volume 1). An IEMCAP (Intrasystem Electromagnetic Compatibility Analysis Program) database for the E-3A (AWACS) C3I aircraft RF subsystem was extracted to support the design of the EMC assessment methodology (Volume 2). Mock modifications were performed on the E-3A database to assess, using a preliminary form of the methodology, the resulting EMC impact. Application of the preliminary assessment methodology to modifications in the E-3A database served to fine tune the form of a final assessment methodology. The resulting final assessment methodology is documented in this report in conjunction with the overall study goals, procedures, and database. It is recommended that a similar EMC assessment methodology be developed for the power subsystem within C3I aircraft. It is further recommended that future EMC assessment methodologies be developed around expert systems (i.e., computer intelligent agents) to control both the excruciating detail and user requirement for transparency.

  1. NEW SAMPLING THEORY FOR MEASURING ECOSYSTEM STRUCTURE

    EPA Science Inventory

    This research considered the application of systems analysis to the study of laboratory ecosystems. The work concerned the development of a methodology which was shown to be useful in the design of laboratory experiments, the processing and interpretation of the results of these ...

  2. 34 CFR 428.21 - What selection criteria does the Secretary use?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... individuals enrolled in vocational education programs, and how those needs should influence teaching strategies and program design. (ii) Understanding of bilingual vocational training methodologies. (iii... points) The Secretary reviews each application for an effective plan of management that ensures proper...

  3. 34 CFR 428.21 - What selection criteria does the Secretary use?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... individuals enrolled in vocational education programs, and how those needs should influence teaching strategies and program design. (ii) Understanding of bilingual vocational training methodologies. (iii... points) The Secretary reviews each application for an effective plan of management that ensures proper...

  4. Automated sizing of large structures by mixed optimization methods

    NASA Technical Reports Server (NTRS)

    Sobieszczanski, J.; Loendorf, D.

    1973-01-01

    A procedure for automating the sizing of wing-fuselage airframes was developed and implemented in the form of an operational program. The program combines fully stressed design to determine an overall material distribution with mass-strength and mathematical programming methods to design structural details accounting for realistic design constraints. The practicality and efficiency of the procedure is demonstrated for transport aircraft configurations. The methodology is sufficiently general to be applicable to other large and complex structures.

  5. Application of damage tolerance methodology in certification of the Piaggio P-180 Avanti

    NASA Technical Reports Server (NTRS)

    Johnson, Jerry

    1992-01-01

The Piaggio P-180 Avanti, a twin pusher-prop, nine-passenger business aircraft, was certified in 1990 to the requirements of FAR Part 23 and Associated Special Conditions for Composite Structure. Certification included the application of a damage tolerance methodology to the design of the composite forward wing and empennage (vertical fin, horizontal stabilizer, tailcone, and rudder) structure. This methodology included an extensive analytical evaluation coupled with sub-component and full-scale testing of the structure. The work from the Damage Tolerance Analysis Assessment was incorporated into the full-scale testing. Damage representing hazards such as dropped tools, ground equipment, handling, and runway debris was applied to the test articles. Additional substantiation included allowing manufacturing discrepancies to exist unrepaired on the full-scale articles and simulating bondline failures in critical elements. The importance of full-scale testing in the critical environmental conditions and the application of critical damage are addressed. The implication of damage tolerance on static and fatigue testing is discussed. Good correlation between finite element solutions and experimental test data was observed.

  6. Global Design Optimization for Fluid Machinery Applications

    NASA Technical Reports Server (NTRS)

    Shyy, Wei; Papila, Nilay; Tucker, Kevin; Vaidyanathan, Raj; Griffin, Lisa

    2000-01-01

Recent experiences in utilizing global optimization methodology, based on polynomial and neural network techniques, for fluid machinery design are summarized. Global optimization methods can utilize information collected from various sources and by different tools. These methods offer multi-criterion optimization, handle multiple design points and trade-offs via insight into the entire design space, can easily perform tasks in parallel, and are often effective in filtering the noise intrinsic to numerical and experimental data. Another advantage is that these methods do not need to calculate the sensitivity of each design variable locally. However, a successful application of the global optimization method needs to address issues related to data requirements, which grow with the number of design variables, and methods for predicting model performance. Examples selected from rocket propulsion components, including a supersonic turbine, an injector element, and a turbulent flow diffuser, are used to illustrate the usefulness of the global optimization method.
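The polynomial-based surrogate idea above can be sketched in a few lines: fit a low-order response surface to noisy evaluations of an objective, then optimize the inexpensive surrogate instead of the expensive simulation. This is a minimal one-variable illustration with invented data, not the authors' rocket-component models:

```python
import numpy as np

# Hypothetical noisy evaluations of an expensive objective at sampled design points.
rng = np.random.default_rng(0)
x = np.linspace(-2.0, 2.0, 15)                        # sampled design-variable values
y = (x - 0.5) ** 2 + 0.05 * rng.normal(size=x.size)   # noisy "simulation" results

# Fit a quadratic response surface; least squares filters the noise.
c2, c1, c0 = np.polyfit(x, y, deg=2)

# The surrogate's minimizer follows analytically: d/dx (c2 x^2 + c1 x + c0) = 0.
x_opt = -c1 / (2.0 * c2)
print(round(x_opt, 2))
```

In a real multi-variable design-space study the same fit-then-optimize pattern applies, with cross terms in the polynomial and a numerical optimizer on the surrogate.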

  7. A systematic methodology for the robust quantification of energy efficiency at wastewater treatment plants featuring Data Envelopment Analysis.

    PubMed

    Longo, S; Hospido, A; Lema, J M; Mauricio-Iglesias, M

    2018-05-10

This article examines the potential benefits of using Data Envelopment Analysis (DEA) for conducting energy-efficiency assessments of wastewater treatment plants (WWTPs). WWTPs are characteristically heterogeneous (in size, technology, climate, function …), which limits the correct application of DEA. This paper proposes and describes the Robust Energy Efficiency DEA (REED) in its various stages, a systematic state-of-the-art methodology aimed at including exogenous variables in nonparametric frontier models and especially designed for WWTP operation. In particular, the methodology systematizes the modelling process by presenting an integrated framework for selecting the correct variables and appropriate models, possibly tackling the effect of exogenous factors. As a result, the application of REED improves the quality of the efficiency estimates and hence the significance of benchmarking. For the reader's convenience, this article is presented as a step-by-step guideline that guides the user in the determination of WWTP energy efficiency from beginning to end. The application and benefits of the developed methodology are demonstrated by a case study comparing the energy efficiency of a set of 399 WWTPs operating in different countries and under heterogeneous environmental conditions.
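As an illustration of the core DEA computation that REED builds on (not the REED methodology itself), an input-oriented CCR efficiency score can be obtained from a small linear program. The two-plant data below are invented:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of unit j0.

    X: (m_inputs, n_units), Y: (s_outputs, n_units).
    Decision variables: [theta, lambda_1 .. lambda_n].
    min theta  s.t.  X @ lam <= theta * X[:, j0],  Y @ lam >= Y[:, j0],  lam >= 0.
    """
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                 # objective: minimize theta
    A_in = np.hstack([-X[:, [j0]], X])          # X @ lam - theta * x0 <= 0
    b_in = np.zeros(m)
    A_out = np.hstack([np.zeros((s, 1)), -Y])   # -Y @ lam <= -y0
    b_out = -Y[:, j0]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[b_in, b_out], bounds=[(0, None)] * (n + 1))
    return res.fun

# Two hypothetical plants: equal output, but plant B uses twice the energy input.
X = np.array([[1.0, 2.0]])   # inputs (e.g. kWh consumed)
Y = np.array([[1.0, 1.0]])   # outputs (e.g. m3 of wastewater treated)
print(ccr_efficiency(X, Y, 0), ccr_efficiency(X, Y, 1))  # plant A efficient, plant B 50%
```

REED adds the variable-selection and exogenous-factor machinery on top of this basic envelopment model.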

  8. Application of Haddon's matrix in qualitative research methodology: an experience in burns epidemiology.

    PubMed

    Deljavan, Reza; Sadeghi-Bazargani, Homayoun; Fouladi, Nasrin; Arshi, Shahnam; Mohammadi, Reza

    2012-01-01

    Little has been done to investigate the application of injury specific qualitative research methods in the field of burn injuries. The aim of this study was to use an analytical tool (Haddon's matrix) through qualitative research methods to better understand people's perceptions about burn injuries. This study applied Haddon's matrix as a framework and an analytical tool for a qualitative research methodology in burn research. Both child and adult burn injury victims were enrolled into a qualitative study conducted using focus group discussion. Haddon's matrix was used to develop an interview guide and also through the analysis phase. The main analysis clusters were pre-event level/human (including risky behaviors, belief and cultural factors, and knowledge and education), pre-event level/object, pre-event phase/environment and event and post-event phase (including fire control, emergency scald and burn wound management, traditional remedies, medical consultation, and severity indicators). This research gave rise to results that are possibly useful both for future injury research and for designing burn injury prevention plans. Haddon's matrix is applicable in a qualitative research methodology both at data collection and data analysis phases. The study using Haddon's matrix through a qualitative research methodology yielded substantially rich information regarding burn injuries that may possibly be useful for prevention or future quantitative research.

  9. Numerical methods for engine-airframe integration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murthy, S.N.B.; Paynter, G.C.

    1986-01-01

Various papers on numerical methods for engine-airframe integration are presented. The individual topics considered include: scientific computing environment for the 1980s, overview of prediction of complex turbulent flows, numerical solutions of the compressible Navier-Stokes equations, elements of computational engine/airframe integrations, computational requirements for efficient engine installation, application of CAE and CFD techniques to complete tactical missile design, CFD applications to engine/airframe integration, and application of second-generation low-order panel methods to powerplant installation studies. Also addressed are: three-dimensional flow analysis of turboprop inlet and nacelle configurations, application of computational methods to the design of large turbofan engine nacelles, comparison of full potential and Euler solution algorithms for aeropropulsive flow field computations, subsonic/transonic and supersonic nozzle flows and nozzle integration, subsonic/transonic prediction capabilities for nozzle/afterbody configurations, three-dimensional viscous design methodology of supersonic inlet systems for advanced technology aircraft, and a user's technology assessment.

  10. Modeling Methodologies for Design and Control of Solid Oxide Fuel Cell APUs

    NASA Astrophysics Data System (ADS)

    Pianese, C.; Sorrentino, M.

    2009-08-01

Among the existing fuel cell technologies, Solid Oxide Fuel Cells (SOFC) are particularly suitable for both stationary and mobile applications, due to their high energy conversion efficiencies, modularity, high fuel flexibility, and low emissions and noise. Moreover, the high working temperatures enable their use for efficient cogeneration applications. SOFCs are entering a pre-industrial era, and a strong interest in design tools has grown in recent years. Optimal system configuration, component sizing, and control and diagnostic system design require computational tools that meet the conflicting needs of accuracy, affordable computational time, limited experimental effort, and flexibility. The paper gives an overview of control-oriented modeling of SOFCs at both single-cell and stack level. Such an approach provides useful simulation tools for designing and controlling SOFC APUs destined for a wide application area, ranging from automotive to marine and airplane APUs.

  11. Recent Advances of MEMS Resonators for Lorentz Force Based Magnetic Field Sensors: Design, Applications and Challenges.

    PubMed

    Herrera-May, Agustín Leobardo; Soler-Balcazar, Juan Carlos; Vázquez-Leal, Héctor; Martínez-Castillo, Jaime; Vigueras-Zuñiga, Marco Osvaldo; Aguilera-Cortés, Luz Antonio

    2016-08-24

Microelectromechanical systems (MEMS) resonators have allowed the development of magnetic field sensors with potential applications in biomedicine, the automotive industry, navigation systems, space satellites, telecommunications and non-destructive testing. We present a review of recent magnetic field sensors based on MEMS resonators, which operate on the Lorentz force. These sensors have a compact structure, wide measurement range, low energy consumption, high sensitivity and suitable performance. The design methodology, simulation tools, damping sources, sensing techniques and future applications of magnetic field sensors are discussed. The design process is fundamental in achieving correct selection of the operation principle, sensing technique, materials, fabrication process and readout systems of the sensors. In addition, the main sensing systems and challenges of the MEMS sensors are discussed. To develop the best devices, research into their mechanical reliability, vacuum packaging, design optimization and temperature compensation circuits is needed. Future applications will require multifunctional sensors for monitoring several physical parameters (e.g., magnetic field, acceleration, angular rate, humidity, temperature and gases).
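The operating principle reviewed here admits a simple first-order estimate: the Lorentz force on a current-carrying beam of length L in a field B is F = B·I·L, and driving the beam at its resonance amplifies the static deflection F/k by the quality factor Q. A back-of-the-envelope sketch; every parameter value below is an assumed illustration, not a figure from the review:

```python
# First-order estimate of a Lorentz-force MEMS resonator's response.
# All parameter values are illustrative assumptions, not from the review.
B = 100e-6      # magnetic flux density to sense, T (roughly Earth-field scale)
I = 1e-3        # excitation current through the beam, A
L = 500e-6      # current-path length, m
k = 10.0        # effective stiffness of the resonator, N/m
Q = 1000.0      # quality factor (vacuum packaging keeps damping low)

F = B * I * L   # Lorentz force, N
z = Q * F / k   # displacement when driven at resonance, m

print(f"force = {F:.3e} N, displacement = {z:.3e} m")
```

The nanometer-scale displacement this yields explains why high Q (and hence vacuum packaging and damping control, as discussed above) is central to sensor sensitivity.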

  12. Recent Advances of MEMS Resonators for Lorentz Force Based Magnetic Field Sensors: Design, Applications and Challenges

    PubMed Central

    Herrera-May, Agustín Leobardo; Soler-Balcazar, Juan Carlos; Vázquez-Leal, Héctor; Martínez-Castillo, Jaime; Vigueras-Zuñiga, Marco Osvaldo; Aguilera-Cortés, Luz Antonio

    2016-01-01

Microelectromechanical systems (MEMS) resonators have allowed the development of magnetic field sensors with potential applications in biomedicine, the automotive industry, navigation systems, space satellites, telecommunications and non-destructive testing. We present a review of recent magnetic field sensors based on MEMS resonators, which operate on the Lorentz force. These sensors have a compact structure, wide measurement range, low energy consumption, high sensitivity and suitable performance. The design methodology, simulation tools, damping sources, sensing techniques and future applications of magnetic field sensors are discussed. The design process is fundamental in achieving correct selection of the operation principle, sensing technique, materials, fabrication process and readout systems of the sensors. In addition, the main sensing systems and challenges of the MEMS sensors are discussed. To develop the best devices, research into their mechanical reliability, vacuum packaging, design optimization and temperature compensation circuits is needed. Future applications will require multifunctional sensors for monitoring several physical parameters (e.g., magnetic field, acceleration, angular rate, humidity, temperature and gases). PMID:27563912

  13. Design of experiments enhanced statistical process control for wind tunnel check standard testing

    NASA Astrophysics Data System (ADS)

    Phillips, Ben D.

The current wind tunnel check standard testing program at NASA Langley Research Center is focused on increasing data quality, uncertainty quantification and overall control and improvement of wind tunnel measurement processes. The statistical process control (SPC) methodology employed in the check standard testing program allows for the tracking of variations in measurements over time as well as an overall assessment of facility health. While the SPC approach can and does provide researchers with valuable information, it has certain limitations in the areas of process improvement and uncertainty quantification. It is thought that by utilizing design of experiments methodology in conjunction with current SPC practices, one can efficiently and more robustly characterize uncertainties and develop enhanced process improvement procedures. In this research, methodologies were developed to generate regression models for wind tunnel calibration coefficients, balance force coefficients and wind tunnel flow angularities. The coefficients of these regression models were then tracked in statistical process control charts, giving a higher level of understanding of the processes. The methodology outlined is sufficiently generic that this research can be applicable to any wind tunnel check standard testing program.

  14. Comparison of Requirements for Composite Structures for Aircraft and Space Applications

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.; Elliott, Kenny B.; Hampton, Roy W.; Knight, Norman F., Jr.; Aggarwal, Pravin; Engelstad, Stephen P.; Chang, James B.

    2010-01-01

In this paper, the aircraft and space vehicle requirements for composite structures are compared. It is a valuable exercise to study composite structural design approaches used in the airframe industry, and to adopt methodology that is applicable to space vehicles. The missions, environments, analysis methods, analysis validation approaches, testing programs, build quantities, inspection, and maintenance procedures used by the airframe industry, in general, are not transferable to spaceflight hardware. Therefore, while the application of composite design approaches from other industries is appealing, many aspects cannot be directly utilized. Nevertheless, experiences and research for composite aircraft structures may be of use in unexpected arenas as space exploration technology develops, and so continued technology exchanges are encouraged.

  15. A new design methodology of obtaining wide band high gain broadband parametric source for infrared wavelength applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maji, Partha Sona; Roy Chaudhuri, Partha

In this article, we present a new design methodology for obtaining wide-band parametric sources based on the highly nonlinear chalcogenide material As2S3. The dispersion profile of the photonic crystal fiber (PCF) has been engineered by reducing the diameter of the second air-hole ring to obtain a favorable higher-order dispersion parameter. The dependence of the parametric gain upon fiber length, pump power, and pumping wavelength has been investigated in detail. Based upon the nonlinear four-wave mixing phenomenon, we are able to achieve a wideband parametric amplifier with a peak gain of 29 dB and an FWHM of ≈2000 nm around the IR wavelength by proper tailoring of the dispersion profile of the PCF, with a continuous-wave erbium (Er3+)-doped ZBLAN fiber laser emitting at 2.8 μm as the pump source with an average power of 5 W. The new design methodology opens a new dimension for chalcogenide-material-based investigation of wavelength translation around the IR wavelength band.

  16. Creep Life Prediction of Ceramic Components Using the Finite Element Based Integrated Design Program (CARES/Creep)

    NASA Technical Reports Server (NTRS)

    Jadaan, Osama M.; Powers, Lynn M.; Gyekenyesi, John P.

    1997-01-01

The desirable properties of ceramics at high temperatures have generated interest in their use for structural applications such as in advanced turbine systems. Design lives for such systems can exceed 10,000 hours. Such long life requirements necessitate subjecting the components to relatively low stresses. The combination of high temperatures and low stresses typically places failure for monolithic ceramics in the creep regime. The objective of this work is to present a design methodology for predicting the lifetimes of structural components subjected to multiaxial creep loading. This methodology utilizes commercially available finite element packages and takes into account the time-varying creep stress distributions (stress relaxation). In this methodology, the creep life of a component is divided into short time steps, during which the stress and strain distributions are assumed constant. The damage, D, is calculated for each time step based on a modified Monkman-Grant creep rupture criterion. For components subjected to predominantly tensile loading, failure is assumed to occur when the normalized accumulated damage at any point in the component is greater than or equal to unity.
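The stepwise damage summation can be sketched as follows; the rupture-life form and constants here are illustrative placeholders rather than the actual CARES/Creep criterion or material data:

```python
# Schematic of stepwise creep-damage accumulation. The rupture criterion and
# constants are illustrative placeholders, not CARES/Creep values.

def rupture_time(creep_rate, C=1e-4, m=1.0):
    """Monkman-Grant-type rupture life: t_f = C / (creep_rate ** m)."""
    return C / creep_rate ** m

def accumulated_damage(steps):
    """Sum damage D = sum(dt / t_f) over (dt, creep_rate) time steps;
    failure is predicted once D >= 1."""
    D = 0.0
    for dt, rate in steps:
        D += dt / rupture_time(rate)
    return D

# Stress relaxation: the creep rate decays over three equal 1000-hour steps.
steps = [(1000.0, 5e-8), (1000.0, 2e-8), (1000.0, 1e-8)]
D = accumulated_damage(steps)
print(D, D >= 1.0)
```

In the full methodology, D is evaluated at every integration point of the finite element model, with the stress field recomputed at each step to capture relaxation.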

  17. An approach to quantitative sustainability assessment in the early stages of process design.

    PubMed

    Tugnoli, Alessandro; Santarelli, Francesco; Cozzani, Valerio

    2008-06-15

    A procedure was developed for the quantitative assessment of key performance indicators suitable for the sustainability analysis of alternative processes, mainly addressing the early stages of process design. The methodology was based on the calculation of a set of normalized impact indices allowing a direct comparison of the additional burden of each process alternative on a selected reference area. Innovative reference criteria were developed to compare and aggregate the impact indicators on the basis of the site-specific impact burden and sustainability policy. An aggregation procedure also allows the calculation of overall sustainability performance indicators and of an "impact fingerprint" of each process alternative. The final aim of the method is to support the decision making process during process development, providing a straightforward assessment of the expected sustainability performances. The application of the methodology to case studies concerning alternative waste disposal processes allowed a preliminary screening of the expected critical sustainability impacts of each process. The methodology was shown to provide useful results to address sustainability issues in the early stages of process design.
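The normalization-and-aggregation scheme can be sketched as dividing each impact by the reference area's existing burden and combining the results with policy weights. The categories, weights, and numbers below are invented for illustration only:

```python
# Hedged sketch of the normalization-and-aggregation idea: each alternative's
# impacts are divided by the reference area's existing burden, then combined
# with policy weights. Category names and numbers are invented for illustration.

reference_burden = {"CO2_t": 5.0e5, "water_m3": 2.0e6, "waste_t": 1.0e4}
policy_weights  = {"CO2_t": 0.5,   "water_m3": 0.3,    "waste_t": 0.2}

def impact_fingerprint(process_impacts):
    """Normalized impact index per category (dimensionless)."""
    return {k: process_impacts[k] / reference_burden[k] for k in process_impacts}

def overall_index(process_impacts):
    """Policy-weighted aggregate of the normalized indices (lower is better)."""
    fp = impact_fingerprint(process_impacts)
    return sum(policy_weights[k] * fp[k] for k in fp)

alt_A = {"CO2_t": 1.0e4, "water_m3": 4.0e4, "waste_t": 2.0e2}
print(round(overall_index(alt_A), 4))
```

Comparing `overall_index` across alternatives mirrors the screening role the indicators play in early process design; the per-category dictionary is the "impact fingerprint" of the alternative.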

  18. SynGenics Optimization System (SynOptSys)

    NASA Technical Reports Server (NTRS)

    Ventresca, Carol; McMilan, Michelle L.; Globus, Stephanie

    2013-01-01

The SynGenics Optimization System (SynOptSys) software application optimizes a product with respect to multiple, competing criteria using statistical Design of Experiments, Response-Surface Methodology, and the Desirability Optimization Methodology. The user is not required to be skilled in the underlying math; thus, SynOptSys can help designers and product developers overcome the barriers that prevent them from using powerful techniques to develop better products in a less costly manner. SynOptSys is applicable to the design of any product or process with multiple criteria to meet, and at least two factors that influence achievement of those criteria. The user begins with a selected solution principle or system concept and a set of criteria that needs to be satisfied. The criteria may be expressed in terms of documented desirements or defined responses that the future system needs to achieve. Documented desirements can be imported into SynOptSys or created and documented directly within SynOptSys. Subsequent steps include identifying factors, specifying model order for each response, designing the experiment, running the experiment and gathering the data, analyzing the results, and determining the specifications for the optimized system. The user may also enter textual information as the project progresses. Data is easily edited within SynOptSys, and the software design enables full traceability within any step in the process, and facilitates reporting as needed. SynOptSys is unique in the way responses are defined and the nuances of the goodness associated with changes in response values for each of the responses of interest. The Desirability Optimization Methodology provides the basis of this novel feature. Moreover, this is a complete, guided design and optimization process tool with embedded math that can remain invisible to the user. It is not a standalone statistical program; it is a design and optimization system.

  19. Low-Radiation Cellular Inductive Powering of Rodent Wireless Brain Interfaces: Methodology and Design Guide.

    PubMed

    Soltani, Nima; Aliroteh, Miaad S; Salam, M Tariqus; Perez Velazquez, Jose Luis; Genov, Roman

    2016-08-01

This paper presents a general methodology for inductive power delivery in wireless chronic rodent electrophysiology applications. The focus is on the design considerations for such systems under the following key constraints: maximum power delivery under the allowable specific absorption rate (SAR), low cost, and spatial scalability. The methodology includes inductive coil design considerations within a low-frequency, ferrite-core-free power transfer link, which comprises a scalable coil-array power-transmitter floor and a single-coil implanted or worn power receiver. A specific design example is presented that includes the concept of low-SAR cellular single-transmitter-coil powering through dynamic tracking of a magnet-less receiver's spatial location. The transmitter coil's instantaneous supply current is monitored using a small number of low-cost electronic components. A drop in its value indicates the proximity of the receiver, due to the reflected impedance of the latter. Only the transmitter coil nearest to the receiver is activated. Operating at the low frequency of 1.5 MHz, the inductive powering floor delivers a maximum of 15.9 W below the IEEE C95 SAR limit, which is over three times greater than that in other recently reported designs. The power transfer efficiency of 39% and 13% at the nominal and maximum distances of 8 cm and 11 cm, respectively, is maintained.
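The receiver-tracking idea, activating only the transmitter coil whose supply current drops when loaded by the receiver's reflected impedance, can be sketched as a simple selection rule. The baseline currents and threshold below are invented values, not from the paper:

```python
# Sketch of the receiver-localization idea: the transmitter coil whose supply
# current drops the most (reflected impedance of the nearby receiver) is the
# one activated. Baseline and measured currents are invented for illustration.

def select_active_coil(baseline_mA, measured_mA, threshold_mA=2.0):
    """Return the index of the coil nearest the receiver, or None if no coil
    sees a significant current drop (receiver absent)."""
    drops = [b - m for b, m in zip(baseline_mA, measured_mA)]
    best = max(range(len(drops)), key=lambda i: drops[i])
    return best if drops[best] >= threshold_mA else None

baseline = [120.0, 120.0, 120.0, 120.0]        # unloaded coil currents, mA
measured = [119.5, 112.0, 118.0, 120.1]        # coil 1 is loaded by the receiver
print(select_active_coil(baseline, measured))
```

Activating a single coil at a time is what keeps the whole-floor SAR low while the receiver roams the array.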

  20. Training and Farmers' Organizations' Performance

    ERIC Educational Resources Information Center

    Miiro, Richard F.; Matsiko, Frank B.; Mazur, Robert E.

    2014-01-01

    Purpose: This study sought to determine the influence of training transfer factors and actual application of training on organization level outcomes among farmer owned produce marketing organizations in Uganda. Design/methodology/approach: Interviews based on the Learning Transfer Systems Inventory (LTSI) were conducted with 120 PMO leaders…

  1. Building quality into medical product software design.

    PubMed

    Mallory, S R

    1993-01-01

    The software engineering and quality assurance disciplines are a requisite to the design of safe and effective software-based medical devices. It is in the areas of software methodology and process that the most beneficial application of these disciplines to software development can be made. Software is a product of complex operations and methodologies and is not amenable to the traditional electromechanical quality assurance processes. Software quality must be built in by the developers, with the software verification and validation engineers acting as the independent instruments for ensuring compliance with performance objectives and with development and maintenance standards. The implementation of a software quality assurance program is a complex process involving management support, organizational changes, and new skill sets, but the benefits are profound. Its rewards provide safe, reliable, cost-effective, maintainable, and manageable software, which may significantly speed the regulatory review process and therefore potentially shorten the overall time to market. The use of a trial project can greatly facilitate the learning process associated with the first-time application of a software quality assurance program.

  2. De/colonizing methodologies in science education: rebraiding research theory-practice-ethics with Indigenous theories and theorists

    NASA Astrophysics Data System (ADS)

    Higgins, Marc; Kim, Eun-Ji Amy

    2018-02-01

    The purpose of this article is to differentially engage in the work of thinking with Indigenous theorists and theories with decolonizing science education research methodologies in mind. As a rejoinder to Tracey McMahon, Emily Griese, and DenYelle Baete Kenyon's Cultivating Native American scientists: An application of an Indigenous model to an undergraduate research experience, we extend the notion of educationally centering Indigenous processes, pedagogies, and protocols by considering methodology a site in which (neo-)colonial logics often linger. We suggest that (re)designing methodology with Indigenous theorists and theories is an important act of resistance, refusal, and resignification; we demonstrate this significance through braiding together narratives of our engagement in this task and provide insights as to what is produced or producible.

  3. Structural Loads Analysis for Wave Energy Converters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    van Rij, Jennifer A; Yu, Yi-Hsiang; Guo, Yi

    2017-06-03

This study explores and verifies the generalized body-modes method for evaluating the structural loads on a wave energy converter (WEC). Historically, WEC design methodologies have focused primarily on accurately evaluating hydrodynamic loads, while methodologies for evaluating structural loads have yet to be fully considered and incorporated into the WEC design process. As wave energy technologies continue to advance, however, it has become increasingly evident that an accurate evaluation of the structural loads will enable an optimized structural design, as well as the potential utilization of composites and flexible materials, and hence reduce WEC costs. Although there are many computational fluid dynamics, structural analysis and fluid-structure-interaction (FSI) codes available, the application of these codes is typically too computationally intensive to be practical in the early stages of the WEC design process. The generalized body-modes method, however, is a reduced-order, linearized, frequency-domain FSI approach, performed in conjunction with the linear hydrodynamic analysis, with computation times that could realistically be incorporated into the WEC design process.

  4. Towards a general object-oriented software development methodology

    NASA Technical Reports Server (NTRS)

    Seidewitz, ED; Stark, Mike

    1986-01-01

An object is an abstract software model of a problem domain entity. Objects are packages of both data and operations on that data (Goldberg 83, Booch 83). The Ada (tm) package construct is representative of this general notion of an object. Object-oriented design is the technique of using objects as the basic unit of modularity in systems design. The Software Engineering Laboratory at the Goddard Space Flight Center is currently involved in a pilot program to develop a flight dynamics simulator in Ada (approximately 40,000 statements) using object-oriented methods. Several authors have applied object-oriented concepts to Ada (e.g., Booch 83, Cherry 85), but these methodologies were found to be limited. As a result, a more general approach was synthesized which allows a designer to apply powerful object-oriented principles to a wide range of applications and at all stages of design. An overview of this approach is provided, and how object-oriented design fits into the overall software life-cycle is considered.

  5. Application of Plackett-Burman experimental design in the development of muffin using adlay flour

    NASA Astrophysics Data System (ADS)

    Valmorida, J. S.; Castillo-Israel, K. A. T.

    2018-01-01

Plackett-Burman experimental design was applied to identify significant formulation and process variables in the development of a muffin using adlay flour. Of the seven screened variables, level of sugar, level of butter, and baking temperature had the most significant influence on the product model in terms of physicochemical properties and sensory acceptability. The results further demonstrate the effectiveness of the Plackett-Burman design in choosing the best adlay variety for muffin production. Hence, the statistical method used in the study permits an efficient selection of the important variables needed in the development of a muffin from adlay, which can then be optimized using response surface methodology.
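A Plackett-Burman design like the one used here can be constructed directly: the classic 8-run design screens up to seven two-level factors, which matches the seven variables in this study. A sketch using the standard N = 8 cyclic generator:

```python
import numpy as np

def plackett_burman_8():
    """8-run Plackett-Burman design for up to 7 two-level factors,
    built from the standard N=8 cyclic generator plus an all-minus row."""
    gen = np.array([1, 1, 1, -1, 1, -1, -1])
    rows = [np.roll(gen, i) for i in range(7)]   # seven cyclic shifts
    rows.append(-np.ones(7, dtype=int))          # final all-minus run
    return np.array(rows, dtype=int)

D = plackett_burman_8()
print(D.shape)   # 8 runs x 7 factors
# Columns are pairwise orthogonal, the property that lets the main effects
# of 7 factors be screened in only 8 runs.
print(np.all(D.T @ D == 8 * np.eye(7, dtype=int)))
```

Each row is one experimental run, with +1/-1 mapped to the high/low level of each factor; main effects are estimated from the column-wise contrasts of the measured responses.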

  6. A FPGA implementation for linearly unmixing a hyperspectral image using OpenCL

    NASA Astrophysics Data System (ADS)

    Guerra, Raúl; López, Sebastián.; Sarmiento, Roberto

    2017-10-01

Hyperspectral imaging systems provide images in which single pixels have information from across the electromagnetic spectrum of the scene under analysis. These systems divide the spectrum into many contiguous channels, which may even lie outside the visible part of the spectrum. The main advantage of hyperspectral imaging technology is that certain objects leave unique fingerprints in the electromagnetic spectrum, known as spectral signatures, which make it possible to distinguish between different materials that may look the same in a traditional RGB image. Accordingly, the most important hyperspectral imaging applications are related to distinguishing or identifying materials in a particular scene. In hyperspectral imaging applications under real-time constraints, the huge amount of information provided by the hyperspectral sensors has to be rapidly processed and analysed. For such purposes, parallel hardware devices such as Field Programmable Gate Arrays (FPGAs) are typically used. However, developing hardware applications typically requires expertise in the specific targeted device, as well as in the tools and methodologies which can be used to perform the implementation of the desired algorithms on the specific device. In this scenario, the Open Computing Language (OpenCL) emerges as a very interesting solution in which a single high-level synthesis design language can be used to efficiently develop applications on multiple and different hardware devices. In this work, the Fast Algorithm for Linearly Unmixing Hyperspectral Images (FUN) has been implemented on a Bitware Stratix V Altera FPGA using OpenCL. The obtained results demonstrate the suitability of OpenCL as a viable design methodology for quickly creating efficient FPGA designs for real-time hyperspectral imaging applications.
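The linear unmixing problem that the FUN algorithm targets can be illustrated independently of the FPGA implementation: each pixel spectrum is modeled as a non-negative combination of endmember signatures, and the abundances are recoverable by non-negative least squares. The endmembers and pixel below are invented 4-band spectra, and this is a generic solver, not the FUN algorithm itself:

```python
import numpy as np
from scipy.optimize import nnls

# Endmember matrix: columns are the spectral signatures of two pure
# materials over four bands (invented values).
E = np.array([[0.9, 0.1],
              [0.7, 0.2],
              [0.3, 0.6],
              [0.1, 0.8]])

true_abundances = np.array([0.25, 0.75])
pixel = E @ true_abundances        # noiseless mixed-pixel spectrum

# Recover the per-material abundances with non-negative least squares.
abundances, residual = nnls(E, pixel)
print(np.round(abundances, 3))
```

Real-time systems must solve this per pixel across millions of pixels and hundreds of bands, which is why the massive parallelism of an FPGA pays off.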

  7. The application of remote sensing to resource management and environmental quality programs in Kansas

    NASA Technical Reports Server (NTRS)

    Barr, B. G.; Martinko, E. A.

    1976-01-01

    Activities of the Kansas Applied Remote Sensing Program (KARS) designed to establish interactions on cooperative projects with decision makers in Kansas agencies in the development and application of remote sensing procedures are reported. Cooperative demonstration projects undertaken with several different agencies involved three principal areas of effort: Wildlife Habitat and Environmental Analysis; Urban and Regional Analysis; Agricultural and Rural Analysis. These projects were designed to concentrate remote sensing concepts and methodologies on existing agency problems to insure the continued relevancy of the program and maximize the possibility for immediate operational use. Completed projects are briefly discussed.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hugo, Jacques

    The software application is called "HFE-Trace". This is an integrated method and tool for the management of Human Factors Engineering analyses and related data. Its primary purpose is to support the coherent and consistent application of the nuclear industry's best practices for human factors engineering work. The software is a custom Microsoft® Access® application. The application is used (in conjunction with other tools such as spreadsheets, checklists and normal documents where necessary) to collect data on the design of a new nuclear power plant from subject matter experts and other sources. This information is then used to identify potential system and functional breakdowns of the intended power plant design. This information is expanded by developing extensive descriptions of all functions, as well as system performance parameters, operating limits and constraints, and operational conditions. Once these have been verified, the human factors elements are added to each function, including intended operator role, function allocation considerations, prohibited actions, primary task categories, and primary work station. In addition, the application includes a computational method to assess a number of factors (such as system and process complexity, workload, environmental conditions, procedures, and regulations) that may shape operator performance. This is a unique methodology based upon principles described in NUREG/CR-3331 ("A methodology for allocating nuclear power plant control functions to human or automatic control") and it results in a semi-quantified allocation of functions to three or more levels of automation for a conceptual automation system. The aggregate of all this information is then linked to the Task Analysis section of the application where the existing information on all operator functions is transformed into task information and ultimately into design requirements for Human-System Interfaces and Control Rooms. This final step includes assessment of methods to prevent potential operator errors.
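    The semi-quantified allocation step can be illustrated schematically. This is a generic weighted-score sketch, not the NUREG/CR-3331 method or the HFE-Trace implementation; the factors, weights, and thresholds are all hypothetical:

```python
# Illustrative sketch: score a plant function on performance-shaping factors
# and map the score to an automation level. All numbers are hypothetical.

FACTORS = {"complexity": 0.4, "workload": 0.3, "time_pressure": 0.3}  # weights

def allocate(ratings):
    """ratings: factor -> 0..10. A higher weighted score favours automation."""
    score = sum(FACTORS[f] * ratings[f] for f in FACTORS)
    if score < 3.0:
        return "manual"
    if score < 7.0:
        return "shared"
    return "automatic"

print(allocate({"complexity": 9, "workload": 8, "time_pressure": 8}))  # automatic
print(allocate({"complexity": 2, "workload": 3, "time_pressure": 1}))  # manual
```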

  9. CFD applications: The Lockheed perspective

    NASA Technical Reports Server (NTRS)

    Miranda, Luis R.

    1987-01-01

    The Numerical Aerodynamic Simulator (NAS) epitomizes the coming of age of supercomputing and opens exciting horizons in the world of numerical simulation. An overview of supercomputing at Lockheed Corporation in the area of Computational Fluid Dynamics (CFD) is presented. This overview will focus on developments and applications of CFD as an aircraft design tool and will attempt to present an assessment, within this context, of the state-of-the-art in CFD methodology.

  10. Protein Design Using Unnatural Amino Acids

    NASA Astrophysics Data System (ADS)

    Bilgiçer, Basar; Kumar, Krishna

    2003-11-01

    With the increasing availability of whole organism genome sequences, understanding protein structure and function is of capital importance. Recent developments in the methodology of incorporation of unnatural amino acids into proteins allow the exploration of proteins at a very detailed level. Furthermore, de novo design of novel protein structures and function is feasible with unprecedented sophistication. Using examples from the literature, this article describes the available methods for unnatural amino acid incorporation and highlights some recent applications including the design of hyperstable protein folds.

  11. Integrated Controls-Structures Design Methodology for Flexible Spacecraft

    NASA Technical Reports Server (NTRS)

    Maghami, P. G.; Joshi, S. M.; Price, D. B.

    1995-01-01

    This paper proposes an approach for the design of flexible spacecraft, wherein the structural design and the control system design are performed simultaneously. The integrated design problem is posed as an optimization problem in which both the structural parameters and the control system parameters constitute the design variables, which are used to optimize a common objective function, thereby resulting in an optimal overall design. The approach is demonstrated by application to the integrated design of a geostationary platform, and to a ground-based flexible structure experiment. The numerical results obtained indicate that the integrated design approach generally yields spacecraft designs that are substantially superior to the conventional approach, wherein the structural design and control design are performed sequentially.
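    The paper's central claim, that jointly optimizing structural and control variables beats designing them sequentially, can be illustrated with a toy coupled objective. The cost function below is entirely hypothetical, not the paper's formulation:

```python
# Toy illustration: when a structure parameter s and a controller gain k
# interact in the objective, joint optimization beats the sequential
# structure-then-control approach. The objective is hypothetical.

def cost(s, k):
    # stand-in for mass + control-effort with s-k coupling
    return (s - 2.0) ** 2 + (k - s) ** 2 + 0.5 * s * k

grid = [i / 100 for i in range(-100, 301)]

# sequential: pick s ignoring the controller, then pick k for that fixed s
s_seq = min(grid, key=lambda s: (s - 2.0) ** 2)
k_seq = min(grid, key=lambda k: cost(s_seq, k))

# integrated: search over both design variables simultaneously
s_int, k_int = min(((s, k) for s in grid for k in grid), key=lambda p: cost(*p))

print(cost(s_int, k_int) < cost(s_seq, k_seq))  # True: joint design wins here
```

    On this coupled objective the sequential optimum (s = 2.0, k = 1.5) is strictly worse than the joint optimum, mirroring the paper's numerical finding at toy scale.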

  12. The Design and Development of a Web-Interface for the Software Engineering Automation System

    DTIC Science & Technology

    2001-09-01

    application on the Internet. 14. SUBJECT TERMS Computer Aided Prototyping, Real-Time Systems, Java 15. NUMBER OF...difficult. Developing the entire system only to find it does not meet the customer’s needs is a tremendous waste of time. Real-time systems need a...software prototyping is an iterative software development methodology utilized to improve the analysis and design of real-time systems [2]. One

  13. European Science Notes Information Bulletin Reports on Current European/ Middle Eastern Science

    DTIC Science & Technology

    1991-04-01

    Fault tolerance Technology and VLSI/WSI Implementation 10th IFAC Workshop on Distributed Computer Optimal designs Commercial and experimental Control...catalysts that would facilitate cooperation between applications experts and computer architects in designing and implementing a new generation of parallel...speculative. However, they demonstrate the methodology for...Sediments immediately north of Iceland are up to 1 km thick but thin rapidly to less than 200 m

  14. SAGA: A project to automate the management of software production systems

    NASA Technical Reports Server (NTRS)

    Campbell, Roy H.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.

    1987-01-01

    The Software Automation, Generation and Administration (SAGA) project is investigating the design and construction of practical software engineering environments for developing and maintaining aerospace systems and applications software. The research includes the practical organization of the software lifecycle, configuration management, software requirements specifications, executable specifications, design methodologies, programming, verification, validation and testing, version control, maintenance, the reuse of software, software libraries, documentation, and automated management.

  15. The futility study—progress over the last decade

    PubMed Central

    Levin, Bruce

    2015-01-01

    We review the futility clinical trial design (also known as the non-superiority design) with respect to its emergence and methodologic developments over the last decade, especially in regard to its application to clinical trials for neurological disorders. We discuss the design’s strengths as a programmatic screening device to weed out unpromising new treatments, its limitations and pitfalls, and a recent critique of the logic of the method. PMID:26123873

  16. Expert Systems Development Methodology

    DTIC Science & Technology

    1989-07-28

    application. Chapter 9, Design and Prototyping, discusses the problems of designing the user interface and other characteristics of the ES and the prototyping...severely in question as to whether they were computable. In order to work with this problem, Turing created what he called the universal machine. These...about the theory of computers and their relationship to problem solving. It was here at Princeton that he first began to experiment directly with

  17. Methodology for sensitivity analysis, approximate analysis, and design optimization in CFD for multidisciplinary applications. [computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Taylor, Arthur C., III; Hou, Gene W.

    1992-01-01

    Fundamental equations of aerodynamic sensitivity analysis and approximate analysis for the two dimensional thin layer Navier-Stokes equations are reviewed, and special boundary condition considerations necessary to apply these equations to isolated lifting airfoils on 'C' and 'O' meshes are discussed in detail. An efficient strategy which is based on the finite element method and an elastic membrane representation of the computational domain is successfully tested, which circumvents the costly 'brute force' method of obtaining grid sensitivity derivatives, and is also useful in mesh regeneration. The issue of turbulence modeling is addressed in a preliminary study. Aerodynamic shape sensitivity derivatives are efficiently calculated, and their accuracy is validated on two viscous test problems, including: (1) internal flow through a double throat nozzle, and (2) external flow over a NACA 4-digit airfoil. An automated aerodynamic design optimization strategy is outlined which includes the use of a design optimization program, an aerodynamic flow analysis code, an aerodynamic sensitivity and approximate analysis code, and a mesh regeneration and grid sensitivity analysis code. Application of the optimization methodology to the two test problems in each case resulted in a new design having a significantly improved performance in the aerodynamic response of interest.
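    The contrast the abstract draws between analytic sensitivity derivatives and the costly "brute force" alternative can be sketched on a toy response function. The function below is a hypothetical stand-in for a flow-solver output, not the thin-layer Navier-Stokes formulation:

```python
# Sketch: design sensitivity of a toy "aerodynamic response" f with respect
# to a shape parameter x, comparing an analytic derivative with the
# finite-difference ("brute force") estimate the abstract refers to.
import math

def response(x):      # hypothetical stand-in for a flow-solver output
    return x ** 2 * math.exp(-x)

def d_response(x):    # analytic sensitivity derivative of the same response
    return (2 * x - x ** 2) * math.exp(-x)

x0, h = 1.5, 1e-6
fd = (response(x0 + h) - response(x0 - h)) / (2 * h)  # central difference
print(abs(fd - d_response(x0)) < 1e-6)  # True: the two estimates agree
```

    The analytic derivative costs one evaluation; the central difference costs two per design variable, which is what makes differencing prohibitive when the "response" is a full CFD solve.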

  18. A novel neural network based image reconstruction model with scale and rotation invariance for target identification and classification for Active millimetre wave imaging

    NASA Astrophysics Data System (ADS)

    Agarwal, Smriti; Bisht, Amit Singh; Singh, Dharmendra; Pathak, Nagendra Prasad

    2014-12-01

    Millimetre wave imaging (MMW) is gaining tremendous interest among researchers, with potential applications in security checks, standoff personal screening, automotive collision avoidance, and more. Current state-of-the-art imaging techniques, viz. microwave and X-ray imaging, suffer from lower resolution and harmful ionizing radiation, respectively. In contrast, MMW imaging operates at lower power and is non-ionizing, and hence medically safe. Despite these favourable attributes, MMW imaging encounters various challenges: it is still a relatively unexplored area and lacks a suitable imaging methodology for extracting complete target information. In view of these challenges, a MMW active imaging radar system at 60 GHz was designed for standoff imaging applications. A C-scan (horizontal and vertical scanning) methodology was developed that provides a cross-range resolution of 8.59 mm. The paper further details a suitable target identification and classification methodology. For identification of regular-shape targets, a mean-standard-deviation-based segmentation technique was formulated and further validated using a different target shape. For classification, a probability-density-function-based target-material discrimination methodology was proposed and further validated on a different dataset. Lastly, a novel artificial-neural-network-based, scale- and rotation-invariant image reconstruction methodology has been proposed to counter the distortions in the image caused by noise, rotation or scale variations. The designed neural network, once trained with sample images, automatically takes care of these deformations and successfully reconstructs the corrected image for the test targets. Techniques developed in this paper are tested and validated using four different regular shapes, viz. rectangle, square, triangle and circle.
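    A mean/standard-deviation threshold segmentation of the kind described can be sketched as follows; the pixel intensities are hypothetical, not MMW radar data:

```python
# Sketch of a mean/standard-deviation threshold segmentation: pixels more
# than k standard deviations above the image mean are marked as target.
import statistics

def segment(image, k=1.0):
    flat = [p for row in image for p in row]
    mu = statistics.mean(flat)
    sigma = statistics.pstdev(flat)
    return [[1 if p > mu + k * sigma else 0 for p in row] for row in image]

image = [            # hypothetical intensities: bright 2x2 target on background
    [10, 11, 10, 12],
    [11, 90, 92, 10],
    [10, 91, 93, 11],
    [12, 10, 11, 10],
]
mask = segment(image)
print(mask[1][1], mask[0][0])  # 1 0: target separated from background
```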

  19. Case Selection via Matching

    ERIC Educational Resources Information Center

    Nielsen, Richard A.

    2016-01-01

    This article shows how statistical matching methods can be used to select "most similar" cases for qualitative analysis. I first offer a methodological justification for research designs based on selecting most similar cases. I then discuss the applicability of existing matching methods to the task of selecting most similar cases and…
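    The idea of selecting a "most similar" case by matching can be sketched as a nearest-neighbour search on standardized covariates. The case names and covariates below are hypothetical, not from the article:

```python
# Minimal sketch of "most similar" case selection: given a target case, pick
# the comparison case with the smallest Euclidean distance on standardized
# covariates. Cases and covariate values are hypothetical.
import math
import statistics

cases = {  # name -> (gdp_per_capita_k, population_m), hypothetical covariates
    "A": (30.0, 10.0), "B": (31.0, 11.0), "C": (80.0, 90.0), "D": (29.0, 60.0),
}

def standardize(values):
    mu, sd = statistics.mean(values), statistics.pstdev(values)
    return [(v - mu) / sd for v in values]

names = list(cases)
cols = list(zip(*(cases[n] for n in names)))          # covariate columns
zrows = list(zip(*(standardize(c) for c in cols)))    # standardized rows
zmap = dict(zip(names, zrows))

def most_similar(target, pool):
    return min(pool, key=lambda n: math.dist(zmap[target], zmap[n]))

print(most_similar("A", ["B", "C", "D"]))  # B: closest on both covariates
```

    Standardizing first keeps a large-scale covariate (population) from dominating the distance, the same reason matching methods typically use Mahalanobis or standardized distances.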

  20. Design guidelines for an umbilical cord blood stem cell therapy quality assessment model

    NASA Astrophysics Data System (ADS)

    Januszewski, Witold S.; Michałek, Krzysztof; Yagensky, Oleksandr; Wardzińska, Marta

    The paper sets out the pivotal guidelines for producing an empirical umbilical cord blood stem cell therapy quality assessment model. The methodology adopted was a single-equation linear model with domain knowledge derived from the MEDAFAR classification. The resulting model is ready for therapeutic application.

  1. A General Survey of Qualitative Research Methodology.

    ERIC Educational Resources Information Center

    Cary, Rick

    Current definitions and philosophical foundations of qualitative research are presented; and designs, evaluation methods, and issues in application of qualitative research to education are discussed. The effects of positivism and the post-positivist era on qualitative research are outlined, and naturalist and positivist approaches are contrasted.…

  2. Mobile Student Information System

    ERIC Educational Resources Information Center

    Asif, Muhammad; Krogstie, John

    2011-01-01

    Purpose: A mobile student information system (MSIS) based on mobile computing and context-aware application concepts can provide more user-centric information services to students. The purpose of this paper is to describe a system for providing relevant information to students on a mobile platform. Design/methodology/approach: The research…

  3. Principles of Protein Stability and Their Application in Computational Design.

    PubMed

    Goldenzweig, Adi; Fleishman, Sarel

    2018-01-26

    Proteins are increasingly used in basic and applied biomedical research. Many proteins, however, are only marginally stable and can be expressed in limited amounts, thus hampering research and applications. Research has revealed the thermodynamic, cellular, and evolutionary principles and mechanisms that underlie marginal stability. With this growing understanding, computational stability design methods have advanced over the past two decades, starting from methods that selectively addressed only some aspects of marginal stability. Current methods are more general and, by combining phylogenetic analysis with atomistic design, have shown drastic improvements in solubility, thermal stability, and aggregation resistance while maintaining the protein's primary molecular activity. Stability design is opening the way to rational engineering of improved enzymes, therapeutics, and vaccines and to the application of protein design methodology to large proteins and molecular activities that have proven challenging in the past. Expected final online publication date for the Annual Review of Biochemistry Volume 87 is June 20, 2018. Please see http://www.annualreviews.org/page/journal/pubdates for revised estimates.

  4. Design Requirements for Communication-Intensive Interactive Applications

    NASA Astrophysics Data System (ADS)

    Bolchini, Davide; Garzotto, Franca; Paolini, Paolo

    Online interactive applications call for new requirements paradigms to capture the growing complexity of computer-mediated communication. Crafting successful interactive applications (such as websites and multimedia) involves modeling the requirements for the user experience, including those leading to content design, usable information architecture and interaction, in profound coordination with the communication goals of all stakeholders involved, ranging from persuasion to social engagement, to call for action. To face this grand challenge, we propose a methodology for modeling communication requirements and provide a set of operational conceptual tools to be used in complex projects with multiple stakeholders. Through examples from real-life projects and lessons-learned from direct experience, we draw on the concepts of brand, value, communication goals, information and persuasion requirements to systematically guide analysts to master the multifaceted connections of these elements as drivers to inform successful communication designs.

  5. Reliability and availability evaluation of Wireless Sensor Networks for industrial applications.

    PubMed

    Silva, Ivanovitch; Guedes, Luiz Affonso; Portugal, Paulo; Vasques, Francisco

    2012-01-01

    Wireless Sensor Networks (WSN) currently represent the best candidate to be adopted as the communication solution for the last mile connection in process control and monitoring applications in industrial environments. Most of these applications have stringent dependability (reliability and availability) requirements, as a system failure may result in economic losses, put people in danger or lead to environmental damage. Among the different types of faults that can lead to a system failure, permanent faults on network devices have a major impact. They can hamper communications over long periods of time and consequently disturb, or even disable, control algorithms. The lack of a structured approach for evaluating permanent faults prevents system designers from optimizing decisions that minimize these occurrences. In this work we propose a methodology based on the automatic generation of a fault tree to evaluate the reliability and availability of Wireless Sensor Networks when permanent faults occur on network devices. The proposal supports any topology, different levels of redundancy, network reconfigurations, criticality of devices and arbitrary failure conditions. The proposed methodology is particularly suitable for the design and validation of Wireless Sensor Networks when trying to optimize their reliability and availability.
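    The fault-tree evaluation underlying the proposal can be illustrated with the basic gate algebra; the topology and failure probabilities below are hypothetical, not produced by the authors' generation tool:

```python
# Minimal sketch of fault-tree evaluation: combine device unavailabilities
# through AND/OR gates to obtain system availability. Numbers hypothetical.

def and_gate(*unavail):   # the gate's output fails only if ALL inputs fail
    p = 1.0
    for u in unavail:
        p *= u
    return p

def or_gate(*unavail):    # the gate's output fails if ANY input fails
    p = 1.0
    for u in unavail:
        p *= (1.0 - u)
    return 1.0 - p

# Two redundant routers (AND gate) in series with a single gateway (OR gate).
router_u, gateway_u = 0.01, 0.001
system_unavail = or_gate(and_gate(router_u, router_u), gateway_u)
print(round(1.0 - system_unavail, 7))  # system availability ≈ 0.9989001
```

    The redundancy shows up directly in the algebra: the AND gate turns two 1% unavailabilities into 0.01%, so the single gateway dominates the system figure.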

  6. Reliability and Availability Evaluation of Wireless Sensor Networks for Industrial Applications

    PubMed Central

    Silva, Ivanovitch; Guedes, Luiz Affonso; Portugal, Paulo; Vasques, Francisco

    2012-01-01

    Wireless Sensor Networks (WSN) currently represent the best candidate to be adopted as the communication solution for the last mile connection in process control and monitoring applications in industrial environments. Most of these applications have stringent dependability (reliability and availability) requirements, as a system failure may result in economic losses, put people in danger or lead to environmental damage. Among the different types of faults that can lead to a system failure, permanent faults on network devices have a major impact. They can hamper communications over long periods of time and consequently disturb, or even disable, control algorithms. The lack of a structured approach for evaluating permanent faults prevents system designers from optimizing decisions that minimize these occurrences. In this work we propose a methodology based on the automatic generation of a fault tree to evaluate the reliability and availability of Wireless Sensor Networks when permanent faults occur on network devices. The proposal supports any topology, different levels of redundancy, network reconfigurations, criticality of devices and arbitrary failure conditions. The proposed methodology is particularly suitable for the design and validation of Wireless Sensor Networks when trying to optimize their reliability and availability. PMID:22368497

  7. Optimization of the ultrasonic assisted removal of methylene blue by gold nanoparticles loaded on activated carbon using experimental design methodology.

    PubMed

    Roosta, M; Ghaedi, M; Daneshfar, A; Sahraei, R; Asghari, A

    2014-01-01

    The present study was focused on the removal of methylene blue (MB) from aqueous solution by ultrasound-assisted adsorption onto gold nanoparticles loaded on activated carbon (Au-NP-AC). This nanomaterial was characterized using different techniques such as SEM, XRD, and BET. The effects of variables such as pH, initial dye concentration, adsorbent dosage (g), temperature and sonication time (min) on MB removal were studied using central composite design (CCD), and the optimum experimental conditions were found with a desirability function (DF) combined with response surface methodology (RSM). Fitting the experimental equilibrium data to various isotherm models such as the Langmuir, Freundlich, Temkin and Dubinin-Radushkevich models shows the suitability and applicability of the Langmuir model. Fitting the experimental adsorption data to various kinetic models such as the pseudo-first and pseudo-second order, Elovich and intraparticle diffusion models shows the applicability of the pseudo-second-order model. The small amount of proposed adsorbent (0.01 g) is applicable for successful removal of MB (RE>95%) in a short time (1.6 min) with high adsorption capacity (104-185 mg g(-1)). Copyright © 2013 Elsevier B.V. All rights reserved.
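    As an illustration of the isotherm-fitting step, the Langmuir model q = qm·K·C/(1+K·C) can be fitted by linear regression on its linearized form C/q = C/qm + 1/(qm·K). The data below are synthetic, not the paper's measurements:

```python
# Sketch: fit the Langmuir isotherm by ordinary least squares on the
# linearized form C/q = C/qm + 1/(qm*K). Synthetic, noise-free data.

def linear_fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

qm_true, K_true = 180.0, 0.5              # hypothetical capacity (mg/g), constant
C = [1.0, 2.0, 5.0, 10.0, 20.0]           # equilibrium concentrations
q = [qm_true * K_true * c / (1 + K_true * c) for c in C]

slope, intercept = linear_fit(C, [c / qi for c, qi in zip(C, q)])
qm, K = 1 / slope, slope / intercept      # invert the linearization
print(round(qm, 1), round(K, 2))          # recovers 180.0 and 0.5
```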

  8. Computational Fluid Dynamics (CFD) applications in rocket propulsion analysis and design

    NASA Technical Reports Server (NTRS)

    Mcconnaughey, P. K.; Garcia, R.; Griffin, L. W.; Ruf, J. H.

    1993-01-01

    Computational Fluid Dynamics (CFD) has been used in recent applications to affect subcomponent designs in liquid propulsion rocket engines. This paper elucidates three such applications for turbine stage, pump stage, and combustor chamber geometries. Details of these applications include the development of a high-turning airfoil for a gas generator (GG) powered, liquid oxygen (LOX) turbopump, single-stage turbine using CFD as an integral part of the design process. CFD application to pump stage design has emphasized analysis of inducers, impellers, and diffuser/volute sections. Improvements in pump stage impeller discharge flow uniformity have been seen through CFD optimization on coarse grid models. In the area of combustor design, recent CFD analysis of a film cooled ablating combustion chamber has been used to quantify the interaction between film cooling rate, chamber wall contraction angle, and geometry, and the effects of these quantities on local wall temperature. The results are currently guiding combustion chamber design and coolant flow rate for an upcoming subcomponent test. Critical aspects of successful integration of CFD into the design cycle include close coupling of CFD and design organizations, quick turnaround of parametric analyses once a baseline CFD benchmark has been established, and the use of CFD methodology and approaches that address pertinent design issues. In this latter area, some problem details can be simplified while retaining key physical aspects to maintain analytical integrity.

  9. Decentralized PID controller for TITO systems using characteristic ratio assignment with an experimental application.

    PubMed

    Hajare, V D; Patre, B M

    2015-11-01

    This paper presents a decentralized PID controller design method for two input two output (TITO) systems with time delay using the characteristic ratio assignment (CRA) method. The ability of the CRA method to design controllers for a desired transient response has been explored for TITO systems. The design methodology uses an ideal decoupler to reduce the interaction. Each decoupled subsystem is reduced to a first order plus dead time (FOPDT) model to design independent diagonal controllers. Based on specified overshoot and settling time, the controller parameters are computed using the CRA method. To verify the performance of the proposed controller, two benchmark simulation examples are presented. To demonstrate the applicability of the proposed controller, experimentation is performed on a real-life interacting coupled-tank level system. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
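    The role of the ideal decoupler can be sketched at steady state: for a 2x2 plant with gain matrix G, choosing D = G⁻¹·diag(G) makes G·D diagonal, so the two PID loops can then be designed independently. The gain values below are hypothetical, not the paper's benchmark models:

```python
# Sketch of an ideal static decoupler for a 2x2 TITO plant: with gain
# matrix G, the decoupler D = inv(G) * diag(g11, g22) makes G*D diagonal.
# Gains are hypothetical.

def decouple(g11, g12, g21, g22):
    det = g11 * g22 - g12 * g21
    # inverse of G with its columns scaled by the diagonal gains
    return ((g22 * g11 / det, -g12 * g22 / det),
            (-g21 * g11 / det, g11 * g22 / det))

G = ((2.0, 0.5), (0.4, 1.5))
D = decouple(*G[0], *G[1])

# off-diagonal elements of G*D vanish, removing loop interaction:
gd12 = G[0][0] * D[0][1] + G[0][1] * D[1][1]
gd21 = G[1][0] * D[0][0] + G[1][1] * D[1][0]
print(abs(gd12) < 1e-9, abs(gd21) < 1e-9)  # True True
```

    A dynamic ideal decoupler applies the same construction transfer-function-wise; the static version shown here is the steady-state special case.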

  10. Circuit design advances for ultra-low power sensing platforms

    NASA Astrophysics Data System (ADS)

    Wieckowski, Michael; Dreslinski, Ronald G.; Mudge, Trevor; Blaauw, David; Sylvester, Dennis

    2010-04-01

    This paper explores the recent advances in circuit structures and design methodologies that have enabled ultra-low power sensing platforms and opened up a host of new applications. Central to this theme is the development of Near Threshold Computing (NTC) as a viable design space for low power sensing platforms. In this paradigm, the system's supply voltage is approximately equal to the threshold voltage of its transistors. Operating in this "near-threshold" region provides much of the energy savings previously demonstrated for subthreshold operation while offering more favorable performance and variability characteristics. This makes NTC applicable to a broad range of power-constrained computing segments including energy constrained sensing platforms. This paper explores the barriers to the adoption of NTC and describes current work aimed at overcoming these obstacles in the circuit design space.
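    The NTC trade-off can be sketched with textbook scaling relations: dynamic energy per operation falls as C·V², while delay grows as the supply voltage approaches the threshold (an alpha-power delay model). All device numbers below are hypothetical:

```python
# Back-of-the-envelope sketch of why near-threshold operation saves energy.
# Dynamic energy per op scales as C*V^2; delay follows a simplified
# alpha-power model. All device parameters are hypothetical.

C_EFF = 1e-9   # effective switched capacitance per operation (F)
VT = 0.3       # threshold voltage (V)

def energy_per_op(vdd):
    return C_EFF * vdd ** 2

def relative_delay(vdd, alpha=1.3):
    def d(v):
        return v / (v - VT) ** alpha
    return d(vdd) / d(1.1)   # normalized to nominal 1.1 V operation

for vdd in (1.1, 0.5, 0.35):   # nominal, near-threshold, near-subthreshold
    print(f"{vdd:4.2f} V: energy x{energy_per_op(vdd) / energy_per_op(1.1):.2f},"
          f" delay x{relative_delay(vdd):.1f}")
```

    The printout shows the NTC sweet spot the paper describes: at ~0.5 V the energy per operation drops several-fold while the delay penalty stays far milder than in the subthreshold regime.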

  11. Applications of a damage tolerance analysis methodology in aircraft design and production

    NASA Technical Reports Server (NTRS)

    Woodward, M. R.; Owens, S. D.; Law, G. E.; Mignery, L. A.

    1992-01-01

    Objectives of customer mandated aircraft structural integrity initiatives in design are to guide material selection, to incorporate fracture resistant concepts in the design, to utilize damage tolerance based allowables and planned inspection procedures necessary to enhance the safety and reliability of manned flight vehicles. However, validated fracture analysis tools for composite structures are needed to accomplish these objectives in a timely and economical manner. This paper briefly describes the development, validation, and application of a damage tolerance methodology for composite airframe structures. A closed-form analysis code, entitled SUBLAM was developed to predict the critical biaxial strain state necessary to cause sublaminate buckling-induced delamination extension in an impact damaged composite laminate. An embedded elliptical delamination separating a thin sublaminate from a thick parent laminate is modelled. Predicted failure strains were correlated against a variety of experimental data that included results from compression after impact coupon and element tests. An integrated analysis package was developed to predict damage tolerance based margin-of-safety (MS) using NASTRAN generated loads and element information. Damage tolerance aspects of new concepts are quickly and cost-effectively determined without the need for excessive testing.

  12. Development of a prosthesis shoulder mechanism for upper limb amputees: application of an original design methodology to optimize functionality and wearability.

    PubMed

    Troncossi, Marco; Borghi, Corrado; Chiossi, Marco; Davalli, Angelo; Parenti-Castelli, Vincenzo

    2009-05-01

    The application of a design methodology for the determination of the optimal prosthesis architecture for a given upper limb amputee is presented in this paper along with a discussion of its results. In particular, a novel procedure was used to provide the main guidelines for the design of an actuated shoulder articulation for externally powered prostheses. The topology and the geometry of the new articulation were determined as the optimal compromise between wearability (for the ease of use and the patient's comfort) and functionality of the device (in terms of mobility, velocity, payload, etc.). This choice was based on kinematic and kinetostatic analyses of different upper limb prosthesis models and on purpose-built indices that were set up to evaluate the models from different viewpoints. Only 12 of the 31 simulated prostheses provided a sufficient level of functionality: among these, the optimal solution was an articulation having two actuated revolute joints with orthogonal axes for the elevation of the upper arm in any vertical plane and a frictional joint for the passive adjustment of the humeral intra-extra rotation. A prototype of the mechanism is at the clinical test stage.

  13. Synthesis design of artificial magnetic metamaterials using a genetic algorithm.

    PubMed

    Chen, P Y; Chen, C H; Wang, H; Tsai, J H; Ni, W X

    2008-08-18

    In this article, we present a genetic algorithm (GA) as one branch of artificial intelligence (AI) for the optimization-design of the artificial magnetic metamaterial whose structure is automatically generated by computer through the filling element methodology. A representative design example, metamaterials with permeability of negative unity, is investigated and the optimized structures found by the GA are presented. It is also demonstrated that our approach is effective for the synthesis of functional magnetic and electric metamaterials with optimal structures. This GA-based optimization-design technique shows great versatility and applicability in the design of functional metamaterials.
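    The GA loop described (a binary filling-element genome, selection, crossover, mutation) can be sketched as follows. The fitness function is a hypothetical stand-in for the electromagnetic simulation a real metamaterial design would call:

```python
# Toy genetic-algorithm sketch in the spirit of the filling-element approach:
# a binary genome marks which grid cells are metallized, and the GA maximizes
# a stand-in fitness (a real design would score genomes with an EM solver).
import random

random.seed(1)
GENOME_LEN, POP, GENS = 16, 20, 40
TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1, 0, 0, 1, 1]  # hypothetical optimum

def fitness(g):                       # stand-in for an EM-simulation score
    return sum(a == b for a, b in zip(g, TARGET))

def evolve():
    pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP)]
    for _ in range(GENS):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:POP // 2]      # truncation selection with elitism
        children = []
        while len(children) < POP - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, GENOME_LEN)   # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(GENOME_LEN)        # occasional point mutation
            child[i] ^= random.random() < 0.1
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(fitness(best))  # close to the maximum score of 16
```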

  14. Software design and documentation language, revision 1

    NASA Technical Reports Server (NTRS)

    Kleine, H.

    1979-01-01

    The Software Design and Documentation Language (SDDL) developed to provide an effective communications medium to support the design and documentation of complex software applications is described. Features of the system include: (1) a processor which can convert design specifications into an intelligible, informative machine-reproducible document; (2) a design and documentation language with forms and syntax that are simple, unrestrictive, and communicative; and (3) methodology for effective use of the language and processor. The SDDL processor is written in the SIMSCRIPT II programming language and is implemented on the UNIVAC 1108, the IBM 360/370, and Control Data machines.

  15. An Overview of R in Health Decision Sciences.

    PubMed

    Jalal, Hawre; Pechlivanoglou, Petros; Krijkamp, Eline; Alarid-Escudero, Fernando; Enns, Eva; Hunink, M G Myriam

    2017-10-01

    As the complexity of health decision science applications increases, high-level programming languages are increasingly adopted for statistical analyses and numerical computations. These programming languages facilitate sophisticated modeling, model documentation, and analysis reproducibility. Among the high-level programming languages, the statistical programming framework R is gaining increased recognition. R is freely available, cross-platform compatible, and open source. A large community of users who have generated an extensive collection of well-documented packages and functions supports it. These functions facilitate applications of health decision science methodology as well as the visualization and communication of results. Although R's popularity is increasing among health decision scientists, methodological extensions of R in the field of decision analysis remain isolated. The purpose of this article is to provide an overview of existing R functionality that is applicable to the various stages of decision analysis, including model design, input parameter estimation, and analysis of model outputs.

  16. Applications of artificial neural network in AIDS research and therapy.

    PubMed

    Sardari, S; Sardari, D

    2002-01-01

    In recent years considerable effort has been devoted to applying pattern recognition techniques to the complex task of data analysis in drug research. Artificial neural network (ANN) methodology is a modeling method with great ability to adapt to a new situation, or control an unknown system, using data acquired in previous experiments. In this paper, a brief history of ANN, the basic concepts behind the computing, the mathematical and algorithmic formulation of each of the techniques, and their developmental background are presented. Based on the abilities of ANNs in pattern recognition and estimation of system outputs from known inputs, the neural network can be considered as a tool for molecular data analysis and interpretation. Analysis by neural networks improves classification accuracy and data quantification and reduces the number of analogues necessary for correct classification of biologically active compounds. Conformational analysis and quantifying the components in mixtures using NMR spectra, aqueous solubility prediction and structure-activity correlation are among the reported applications of ANN as a new modeling method. Ranging from drug design and discovery to structure and dosage form design, the potential pharmaceutical applications of the ANN methodology are significant. In the areas of clinical monitoring, utilization of molecular simulation and design of bioactive structures, ANN would make it possible to study health and disease status and bring predicted chemotherapeutic responses closer to reality.

  17. Assessment of Reliable Change Using 95% Credible Intervals for the Differences in Proportions: A Statistical Analysis for Case-Study Methodology.

    PubMed

    Unicomb, Rachael; Colyvas, Kim; Harrison, Elisabeth; Hewat, Sally

    2015-06-01

    Case-study methodology is often used in the field of speech-language pathology to study change, but it can be criticized for not being statistically robust. Yet, given the heterogeneous nature of many communication disorders, case studies allow clinicians and researchers to closely observe and report on change. Such information is valuable and can further inform large-scale experimental designs. In this research note, a statistical analysis for case-study data is outlined that employs a modification of the Reliable Change Index (Jacobson & Truax, 1991). The relationship between reliable change and clinical significance is discussed. Example data are used to guide the reader through the use and application of this analysis. A method of analysis is detailed that is suitable for assessing change in measures with binary categorical outcomes. The analysis is illustrated using data from one individual, measured before and after treatment for stuttering. This approach to assessing change in categorical, binary data has potential application in speech-language pathology. It enables clinicians and researchers to analyze results from case studies for their statistical and clinical significance. The new method addresses a gap in the research design literature: the lack of analysis methods for noncontinuous data (such as counts, rates, or proportions of events) that may be used in case-study designs.
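The credible-interval idea can be sketched with conjugate Beta posteriors: draw from the posterior of each proportion, form the difference, and read off the central 95% interval. The counts below are hypothetical, not the note's data, and the uniform Beta(1, 1) prior is an assumption of this sketch:

```python
import random

# Beta-Binomial posterior for a before/after difference in proportions
# (hypothetical counts: 6 stutter-free utterances of 50 before treatment,
# 33 of 50 after; Beta(1, 1) priors assumed).
random.seed(1)

pre_success, pre_n = 6, 50
post_success, post_n = 33, 50
draws = 20000

diffs = sorted(
    random.betavariate(post_success + 1, post_n - post_success + 1)
    - random.betavariate(pre_success + 1, pre_n - pre_success + 1)
    for _ in range(draws)
)

lo = diffs[int(0.025 * draws)]     # 2.5th percentile of the difference
hi = diffs[int(0.975 * draws)]     # 97.5th percentile
print(f"95% credible interval for the change: ({lo:.3f}, {hi:.3f})")
# An interval that excludes zero indicates a credibly nonzero change.
```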

  18. HSTDEK: Developing a methodology for construction of large-scale, multi-use knowledge bases

    NASA Technical Reports Server (NTRS)

    Freeman, Michael S.

    1987-01-01

    The primary research objective of the Hubble Space Telescope Design/Engineering Knowledgebase (HSTDEK) is to develop a methodology for constructing and maintaining large-scale knowledge bases which can be used to support multiple applications. To ensure the validity of its results, this research is being pursued in the context of a real-world system, the Hubble Space Telescope. The HSTDEK objectives are described in detail. The history and motivation of the project are briefly described. The technical challenges faced by the project are outlined.

  19. A New Finite-Time Observer for Nonlinear Systems: Applications to Synchronization of Lorenz-Like Systems.

    PubMed

    Aguilar-López, Ricardo; Mata-Machuca, Juan L

    2016-01-01

    This paper proposes a synchronization methodology for two chaotic oscillators under the framework of identical synchronization and master-slave configuration. The proposed methodology is based on state observer design within the framework of control theory; the observer structure provides finite-time synchronization convergence by cancelling the upper bounds of the main nonlinearities of the chaotic oscillator. This is shown via an analysis of the dynamics of the so-called synchronization error. Numerical experiments corroborate the satisfactory results of the proposed scheme.
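A minimal numerical sketch of master-slave synchronization: below, a slave Lorenz system is driven by a plain high-gain full-state correction term. This is a simple stand-in for the paper's finite-time observer, and the gain, initial states, and step size are illustrative:

```python
# Master-slave synchronization of two Lorenz systems via a simple
# high-gain full-state correction (a stand-in for the paper's
# finite-time observer; all numbers here are illustrative).

SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0
K = 50.0          # observer gain on every state
DT = 0.001        # Euler step

def lorenz(s):
    x, y, z = s
    return (SIGMA * (y - x), x * (RHO - z) - y, x * y - BETA * z)

def err(a, b):
    return sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5

master = [1.0, 1.0, 1.0]
slave = [8.0, -5.0, 20.0]          # deliberately wrong initial estimate
e0 = err(master, slave)

for _ in range(2000):              # 2 time units of forward Euler
    fm = lorenz(master)
    fs = lorenz(slave)
    correction = [K * (m - s) for m, s in zip(master, slave)]
    master = [s + DT * f for s, f in zip(master, fm)]
    slave = [s + DT * (f + c) for s, f, c in zip(slave, fs, correction)]

print(f"synchronization error: {e0:.2f} -> {err(master, slave):.2e}")
```

The finite-time design in the paper sharpens this to convergence in bounded time; the high-gain version above only converges exponentially.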

  20. A New Finite-Time Observer for Nonlinear Systems: Applications to Synchronization of Lorenz-Like Systems

    PubMed Central

    Aguilar-López, Ricardo

    2016-01-01

    This paper proposes a synchronization methodology for two chaotic oscillators under the framework of identical synchronization and master-slave configuration. The proposed methodology is based on state observer design within the framework of control theory; the observer structure provides finite-time synchronization convergence by cancelling the upper bounds of the main nonlinearities of the chaotic oscillator. This is shown via an analysis of the dynamics of the so-called synchronization error. Numerical experiments corroborate the satisfactory results of the proposed scheme. PMID:27738651

  1. A methodology based on reduced complexity algorithm for system applications using microprocessors

    NASA Technical Reports Server (NTRS)

    Yan, T. Y.; Yao, K.

    1988-01-01

    The paper presents a methodology for the analysis and design of a minimum mean-square-error linear system incorporating a tapped delay line (TDL), where all the full-precision multiplications in the TDL are constrained to be powers of two. A linear equalizer based on the dispersive and additive noise channel is presented. This microprocessor implementation with optimized power-of-two TDL coefficients achieves system performance comparable to optimum linear equalization with full-precision multiplications for an input data rate of 300 baud.
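The power-of-two constraint can be sketched directly: round each tap coefficient to the nearest power of two in log scale, which replaces every multiplication by a binary shift at the cost of a bounded (at most a factor of √2) coefficient error. The tap values below are hypothetical, and this naive rounding is simpler than the optimization the paper performs:

```python
import math

# Rounding tapped-delay-line coefficients to the nearest power of two
# lets a microprocessor replace each multiplication by a binary shift
# (the saving the paper exploits; these tap values are hypothetical).

def power_of_two(c):
    """Nearest power of two to |c| (in log2 scale), keeping the sign."""
    if c == 0.0:
        return 0.0
    return math.copysign(2.0 ** round(math.log2(abs(c))), c)

taps = [0.73, -0.21, 0.048, -0.011]
quantized = [power_of_two(c) for c in taps]
print(quantized)
```

Because rounding happens in log2, each quantized tap is within a factor of √2 of the original, so the equalizer's frequency response degrades gracefully.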

  2. Observational studies of the association between glucose-lowering medications and cardiovascular outcomes: addressing methodological limitations.

    PubMed

    Patorno, Elisabetta; Patrick, Amanda R; Garry, Elizabeth M; Schneeweiss, Sebastian; Gillet, Victoria G; Bartels, Dorothee B; Masso-Gonzalez, Elvira; Seeger, John D

    2014-11-01

    Recent years have witnessed a growing body of observational literature on the association between glucose-lowering treatments and cardiovascular disease. However, many of the studies are based on designs or analyses that inadequately address the methodological challenges involved. We reviewed recent observational literature on the association between glucose-lowering medications and cardiovascular outcomes and assessed the design and analysis methods used, with a focus on their ability to address specific methodological challenges. We describe and illustrate these methodological issues and their impact on observed associations, providing examples from the reviewed literature. We suggest approaches that may be employed to manage these methodological challenges. From the evaluation of 81 publications of observational investigations assessing the association between glucose-lowering treatments and cardiovascular outcomes, we identified the following methodological challenges: 1) handling of temporality in administrative databases; 2) handling of risks that vary with time and treatment duration; 3) definitions of the exposure risk window; 4) handling of exposures that change over time; and 5) handling of confounding by indication. Most of these methodological challenges may be suitably addressed through application of appropriate methods. Observational research plays an increasingly important role in the evaluation of the clinical effects of diabetes treatment. Implementation of appropriate research methods holds the promise of reducing the potential for spurious findings and the risk that the spurious findings will mislead the medical community about risks and benefits of diabetes medications.

  3. UWB Tracking Algorithms: AOA and TDOA

    NASA Technical Reports Server (NTRS)

    Ni, Jianjun David; Arndt, D.; Ngo, P.; Gross, J.; Refford, Melinda

    2006-01-01

    Ultra-Wideband (UWB) tracking prototype systems are currently under development at NASA Johnson Space Center for various applications in space exploration. For long-range applications, a two-cluster Angle of Arrival (AOA) tracking method is employed for implementation of the tracking system; for close-in applications, a Time Difference of Arrival (TDOA) positioning methodology is exploited. Both AOA and TDOA are chosen to utilize the achievable fine time resolution of UWB signals. This talk presents a brief introduction to the AOA and TDOA methodologies. The theoretical analysis of these two algorithms reveals the impact of the relevant parameters on tracking resolution. For the AOA algorithm, simulations show that a tracking resolution of less than 0.5% of the range can be achieved with the currently achievable time resolution of UWB signals. For the TDOA algorithm used in close-in applications, simulations show that sub-inch tracking resolution is achieved with a chosen tracking baseline configuration. The analytical and simulated results provide insightful guidance for UWB tracking system design.
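In the plane, the two-cluster AOA idea reduces to intersecting two bearing rays. A minimal sketch with hypothetical station and target positions (noise-free bearings, so the fix is exact):

```python
import math

# Two-station angle-of-arrival (AOA) fix in 2-D: each station measures
# only a bearing to the target; the position follows from intersecting
# the two bearing rays (station/target positions are hypothetical).

def aoa_fix(p1, theta1, p2, theta2):
    """Intersect rays p1 + t*(cos t1, sin t1) and p2 + s*(cos t2, sin t2)."""
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    # Solve p1 + t*d1 = p2 + s*d2 for t with Cramer's rule.
    det = d1[0] * (-d2[1]) - (-d2[0]) * d1[1]
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t = (rx * (-d2[1]) - (-d2[0]) * ry) / det
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

target = (4.0, 3.0)
s1, s2 = (0.0, 0.0), (10.0, 0.0)
b1 = math.atan2(target[1] - s1[1], target[0] - s1[0])   # ideal bearings
b2 = math.atan2(target[1] - s2[1], target[0] - s2[0])

x, y = aoa_fix(s1, b1, s2, b2)
print(f"recovered position: ({x:.3f}, {y:.3f})")  # -> (4.000, 3.000)
```

With noisy bearings the rays no longer intersect exactly, which is where the resolution analysis in the talk comes in.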

  4. Integrated Evaluation of Reliability and Power Consumption of Wireless Sensor Networks

    PubMed Central

    Dâmaso, Antônio; Maciel, Paulo

    2017-01-01

    Power consumption is a primary concern in Wireless Sensor Networks (WSNs), and a large number of strategies have been proposed to evaluate it. However, those approaches usually consider neither reliability issues nor the power consumption of applications executing in the network. A central concern is the lack of consolidated solutions that enable us to evaluate the power consumption of applications and of the network stack while also considering their reliabilities. To solve this problem, we introduce a fully automatic solution for the design of power-consumption-aware WSN applications and communication protocols. The solution presented in this paper comprises a methodology to evaluate power consumption based on the integration of formal models, a set of power consumption and reliability models, a sensitivity analysis strategy to select WSN configurations, and a toolbox named EDEN to fully support the proposed methodology. This solution allows the power consumption of WSN applications and the network stack to be accurately estimated in an automated way. PMID:29113078

  5. Evaluation of the instream flow incremental methodology by U.S. Fish and Wildlife Service field users

    USGS Publications Warehouse

    Armour, Carl L.; Taylor, Jonathan G.

    1991-01-01

    This paper summarizes results of a survey conducted in 1988 of 57 U.S. Fish and Wildlife Service field offices. The purpose was to document opinions of biologists experienced in applying the Instream Flow Incremental Methodology (IFIM). Responses were received from 35 offices, where 616 IFIM applications were reported. The existence of six monitoring studies designed to evaluate the adequacy of flows provided at sites was confirmed. The two principal categories reported as stumbling blocks to the successful application of IFIM were beliefs that the methodology is technically too simplistic or that it is too complex to apply. Recommendations receiving the highest scores for future initiatives to enhance IFIM use were (1) training and workshops for field biologists; and (2) improving suitability index (SI) curves and computer models, and evaluating the relationship of weighted usable area (WUA) to fish responses. The authors concur that emphasis for research should be on addressing technical concerns about SI curves and WUA.

  6. A methodology for extending domain coverage in SemRep.

    PubMed

    Rosemblat, Graciela; Shin, Dongwook; Kilicoglu, Halil; Sneiderman, Charles; Rindflesch, Thomas C

    2013-12-01

    We describe a domain-independent methodology to extend SemRep coverage beyond the biomedical domain. SemRep, a natural language processing application originally designed for biomedical texts, uses the knowledge sources provided by the Unified Medical Language System (UMLS©). Ontological and terminological extensions to the system are needed in order to support other areas of knowledge. We extended SemRep's application by developing a semantic representation of a previously unsupported domain. This was achieved by adapting well-known ontology engineering phases and integrating them with the UMLS knowledge sources on which SemRep crucially depends. While the process to extend SemRep coverage has been successfully applied in earlier projects, this paper presents in detail the step-wise approach we followed and the mechanisms implemented. A case study in the field of medical informatics illustrates how the ontology engineering phases have been adapted for optimal integration with the UMLS. We provide qualitative and quantitative results, which indicate the validity and usefulness of our methodology. Published by Elsevier Inc.

  7. Applications of Quantum Cascade Laser Spectroscopy in the Analysis of Pharmaceutical Formulations.

    PubMed

    Galán-Freyle, Nataly J; Pacheco-Londoño, Leonardo C; Román-Ospino, Andrés D; Hernandez-Rivera, Samuel P

    2016-09-01

    Quantum cascade laser spectroscopy was used to quantify active pharmaceutical ingredient content in a model formulation. The analyses were conducted in non-contact mode by mid-infrared diffuse reflectance. Measurements were carried out at a distance of 15 cm, covering the spectral range 1000-1600 cm(-1). Calibrations were generated by applying multivariate analysis using partial least squares models. Among the figures of merit of the proposed methodology are a high analytical sensitivity equivalent to 0.05% active pharmaceutical ingredient in the formulation, high repeatability (2.7%), high reproducibility (5.4%), and a low limit of detection (1%). The relatively high power of the quantum-cascade-laser-based spectroscopic system resulted in the design of detection and quantification methodologies for pharmaceutical applications with accuracy and precision comparable to those of methodologies based on near-infrared spectroscopy, attenuated total reflection mid-infrared Fourier transform infrared spectroscopy, and Raman spectroscopy. © The Author(s) 2016.

  8. Integrating end-to-end threads of control into object-oriented analysis and design

    NASA Technical Reports Server (NTRS)

    Mccandlish, Janet E.; Macdonald, James R.; Graves, Sara J.

    1993-01-01

    Current object-oriented analysis and design methodologies fall short in their use of mechanisms for identifying threads of control for the system being developed. The scenarios which typically describe a system are more global than the behavior of individual objects considered in isolation. Unlike conventional methodologies that use data flow and process-dependency diagrams, object-oriented methodologies do not provide a model for representing these global threads end-to-end. Tracing through threads of control is key to ensuring that a system is complete and that timing constraints are addressed. The existence of multiple threads of control in a system necessitates a partitioning of the system into processes. This paper describes the application and representation of end-to-end threads of control in the object-oriented analysis and design process using object-oriented constructs. The issue of representation is viewed as a grouping problem, that is, how to group classes/objects at a higher level of abstraction so that the system may be viewed as a whole, with both classes/objects and their associated dynamic behavior. Existing object-oriented development methodology techniques are extended by adding design-level constructs termed logical composite classes and process composite classes. Logical composite classes are design-level classes which group classes/objects both logically and by thread-of-control information. Process composite classes further refine the logical composite class groupings by using process partitioning criteria to produce optimum concurrent execution results. The goal of these design-level constructs is ultimately to provide the basis for a mechanism that can support the creation of process composite classes in an automated way. Using an automated mechanism makes it easier to partition a system into concurrently executing elements that can be run in parallel on multiple processors.

  9. Hard Constraints in Optimization Under Uncertainty

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Giesy, Daniel P.; Kenny, Sean P.

    2008-01-01

    This paper proposes a methodology for the analysis and design of systems subject to parametric uncertainty where design requirements are specified via hard inequality constraints. Hard constraints are those that must be satisfied for all parameter realizations within a given uncertainty model. Uncertainty models given by norm-bounded perturbations from a nominal parameter value, i.e., hyper-spheres, and by sets of independently bounded uncertain variables, i.e., hyper-rectangles, are the focus of this paper. These models, which are also quite practical, allow for a rigorous mathematical treatment within the proposed framework. Hard constraint feasibility is determined by sizing the largest uncertainty set for which the design requirements are satisfied. Analytically verifiable assessments of robustness are attained by comparing this set with the actual uncertainty model. Strategies that enable the comparison of the robustness characteristics of competing design alternatives, the description and approximation of the robust design space, and the systematic search for designs with improved robustness are also proposed. Since the problem formulation is generic and the tools derived only require standard optimization algorithms for their implementation, this methodology is applicable to a broad range of engineering problems.
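The set-sizing step can be sketched for a toy case: bisect on the hypersphere radius, checking the hard constraint over a sweep of directions from the nominal parameter. The constraint and nominal value below are hypothetical, chosen so the analytic answer (a radius of 0.5) is known:

```python
import math

# Sizing the largest uncertainty hypersphere on which a hard constraint
# g(p) <= 0 holds everywhere, via bisection on the radius with a
# direction sweep (constraint and nominal point are hypothetical).

def g(p):                      # hard constraint: must satisfy g(p) <= 0
    return p[0] ** 2 + p[1] ** 2 - 1.0

nominal = (0.5, 0.0)
directions = [(math.cos(2 * math.pi * k / 360), math.sin(2 * math.pi * k / 360))
              for k in range(360)]

def feasible(r):
    """True if g <= 0 at every sampled point on the sphere of radius r."""
    return all(g((nominal[0] + r * dx, nominal[1] + r * dy)) <= 0.0
               for dx, dy in directions)

lo, hi = 0.0, 2.0
for _ in range(50):            # bisection on the critical radius
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if feasible(mid) else (lo, mid)

print(f"largest feasible radius ~ {lo:.4f}")   # analytic answer: 0.5
```

Comparing this critical radius against the radius of the actual uncertainty model gives the robustness verdict the paper formalizes.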

  10. Elements of oxygen production systems using Martian atmosphere

    NASA Technical Reports Server (NTRS)

    Ash, R. L.; Huang, J.-K.; Johnson, P. B.; Sivertson, W. E., Jr.

    1986-01-01

    Hardware elements have been studied in terms of their applicability to Mars oxygen production systems. Various aspects of the system design are discussed and areas requiring further research are identified. Initial work on system reliability is discussed and a methodology for applying expert system technology to the oxygen processor is described.

  11. International Service and Public Health Learning Objectives for Medical Students

    ERIC Educational Resources Information Center

    Block, Robert C.; Duron, Vincent; Creigh, Peter; McIntosh, Scott

    2013-01-01

    Objective: We aimed to improve the education of medical students involved in a longitudinal perinatal health improvement project in Gowa, Malawi. Design: We conducted qualitative interviews with students who participated in the project, reviewed their quantitative reports, and assessed the application of methodologies consonant with the learning…

  12. Large-Scale Networked Virtual Environments: Architecture and Applications

    ERIC Educational Resources Information Center

    Lamotte, Wim; Quax, Peter; Flerackers, Eddy

    2008-01-01

    Purpose: Scalability is an important research topic in the context of networked virtual environments (NVEs). This paper aims to describe the ALVIC (Architecture for Large-scale Virtual Interactive Communities) approach to NVE scalability. Design/methodology/approach: The setup and results from two case studies are shown: a 3-D learning environment…

  13. Applications of landscape genetics in conservation biology: concepts and challenges

    Treesearch

    Gernot Segelbacher; Samuel A. Cushman; Bryan K. Epperson; Marie-Josee Fortin; Olivier Francois; Olivier J. Hardy; Rolf Holderegger; Stephanie Manel

    2010-01-01

    Landscape genetics plays an increasingly important role in the management and conservation of species. Here, we highlight some of the opportunities and challenges in using landscape genetic approaches in conservation biology. We first discuss challenges related to sampling design and introduce several recent methodological developments in landscape genetics (analyses...

  14. Performance Scripts Creation: Processes and Applications

    ERIC Educational Resources Information Center

    Lyons, Paul

    2006-01-01

    Purpose: Seeks to explain some of the dynamics of scripts creation as used in training, to offer some theoretical underpinning regarding the influence of script creation on behavior and performance, and to offer some examples of how script creation is applied in training activities. Design/methodology/approach: The paper explains in detail and…

  15. Graduate Recruitment and Development: Sector Influence on a Local Market/Regional Economy

    ERIC Educational Resources Information Center

    Heaton, Norma; McCracken, Martin; Harrison, Jeanette

    2008-01-01

    Purpose: The aim of this article is to illustrate how employers have used more innovative "localised" strategies to address what appears to be "globalised" problems of attracting and retaining high calibre applicants with the appropriate "work ready" skills. Design/methodology/approach: A series of interviews were…

  16. A New Admission System Model for Teacher Colleges

    ERIC Educational Resources Information Center

    Katz, Sara; Frish, Yehiel

    2016-01-01

    Purpose: Aspects of intellectual competence would not be sufficient for quality teaching that requires a mix of intellectual and personal qualities. The purpose of this paper was to elicit personal attributes of teachers' college applicants. Design/methodology/approach: This qualitative case study consisted of 99 participants aged 20-24 years of…

  17. Bringing Person-Centeredness and Active Involvement into Reality

    ERIC Educational Resources Information Center

    Torenholt, Rikke; Engelund, Gitte; Willaing, Ingrid

    2015-01-01

    Purpose: The purpose of this paper is to examine the use and applicability of cultural probes--an explorative participatory method to gain insights into a person's life and thoughts--to achieve person-centeredness and active involvement in self-management education for people with chronic illness. Design/methodology/approach: An education toolkit…

  18. Automatic mathematical modeling for real time simulation program (AI application)

    NASA Technical Reports Server (NTRS)

    Wang, Caroline; Purinton, Steve

    1989-01-01

    A methodology is described for automatic mathematical modeling and the generation of simulation models. The major objective was to create a user-friendly environment for engineers to design, maintain, and verify their models; to automatically convert the mathematical models into conventional code for computation; and finally, to document the models automatically.

  19. Towards Text Copyright Detection Using Metadata in Web Applications

    ERIC Educational Resources Information Center

    Poulos, Marios; Korfiatis, Nikolaos; Bokos, George

    2011-01-01

    Purpose: This paper aims to present the semantic content identifier (SCI), a permanent identifier, computed through a linear-time onion-peeling algorithm that enables the extraction of semantic features from a text, and the integration of this information within the permanent identifier. Design/methodology/approach: The authors employ SCI to…

  20. Ethics in the Information Exploitation and Manipulation Age

    ERIC Educational Resources Information Center

    Snow, Richard; Snow, Mary

    2007-01-01

    Purpose: The purpose of this paper is to elaborate the need to educate and encourage students to seek an ethical realm in which the researcher not only accurately analyses and documents a problem, but also actually advocates involvement to mitigate negative impacts. Design/methodology/approach: Geographic information systems (GIS) applications are…

  1. Photovoltaic system criteria documents. Volume 5: Safety criteria for photovoltaic applications

    NASA Technical Reports Server (NTRS)

    Koenig, John C.; Billitti, Joseph W.; Tallon, John M.

    1979-01-01

    A methodology is described for determining potential safety hazards involved in the construction and operation of photovoltaic power systems, and guidelines are provided for the implementation of safety considerations in the specification, design, and operation of photovoltaic systems. Safety verification procedures for use in solar photovoltaic systems are established.

  2. Personal Name Identification in the Practice of Digital Repositories

    ERIC Educational Resources Information Center

    Xia, Jingfeng

    2006-01-01

    Purpose: To propose improvements to the identification of authors' names in digital repositories. Design/methodology/approach: Analysis of current name authorities in digital resources, particularly in digital repositories, and analysis of some features of existing repository applications. Findings: This paper finds that the variations of authors'…

  3. Serious Game Evaluation as a Meta-Game

    ERIC Educational Resources Information Center

    Hall, Lynne; Jones, Susan Jane; Aylett, Ruth; Hall, Marc; Tazzyman, Sarah; Paiva, Ana; Humphries, Lynne

    2013-01-01

    Purpose: This paper aims to briefly outline the seamless evaluation approach and its application during an evaluation of ORIENT, a serious game aimed at young adults. Design/methodology/approach: In this paper, the authors detail an unobtrusive, embedded evaluation approach that occurs within the game context, adding value and entertainment to the…

  4. Optimization of EGFR high positive cell isolation procedure by design of experiments methodology.

    PubMed

    Levi, Ofer; Tal, Baruch; Hileli, Sagi; Shapira, Assaf; Benhar, Itai; Grabov, Pavel; Eliaz, Noam

    2015-01-01

    Circulating tumor cells (CTCs) in blood circulation may play a role in monitoring and even in early detection of metastasis in patients. Due to the limited presence of CTCs in blood circulation, viable CTC isolation technology must provide a very high recovery rate. Here, we implement design of experiments (DOE) methodology in order to optimize the Bio-Ferrography (BF) immunomagnetic isolation (IMI) procedure for the EGFR high positive CTC application. All consequent DOE phases, such as screening design, optimization experiments, and validation experiments, were used. A significant recovery rate of more than 95% was achieved while isolating 100 EGFR high positive CTCs from 1 mL of human whole blood. The recovery achieved in this research positions BF technology as one of the most efficient IMI technologies, ready to be challenged with patients' blood samples. © 2015 International Clinical Cytometry Society.

  5. Application of Statistical Design for the Production of Cellulase by Trichoderma reesei Using Mango Peel.

    PubMed

    Saravanan, P; Muthuvelayudham, R; Viruthagiri, T

    2012-01-01

    Optimization of the culture medium for cellulase production using Trichoderma reesei was carried out. The optimization of cellulase production using mango peel as substrate was performed with statistical methodology based on experimental designs. The screening of nine nutrients for their influence on cellulase production was achieved using a Plackett-Burman design. Avicel, soybean cake flour, KH(2)PO(4), and CoCl(2)·6H(2)O were selected based on their positive influence on cellulase production. The composition of the selected components was optimized using Response Surface Methodology (RSM). The optimum conditions are as follows: Avicel, 25.30 g/L; soybean cake flour, 23.53 g/L; KH(2)PO(4), 4.90 g/L; and CoCl(2)·6H(2)O, 0.95 g/L. These conditions were validated experimentally, revealing an enhanced cellulase activity of 7.8 IU/mL.
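The Plackett-Burman screening step can be sketched by constructing the design matrix itself: cyclic shifts of the textbook 12-run generator row plus a final all-minus run. The construction below is generic and is not tied to the paper's particular nutrient assignment:

```python
# Building the 12-run Plackett-Burman screening design (as used for
# this kind of nutrient screening): cyclic shifts of the standard
# generator row plus a final all-low run. Factor assignment to the 11
# columns is left to the experimenter.

generator = [+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1]  # textbook N=12 row

design = [generator[-i:] + generator[:-i] for i in range(11)]
design.append([-1] * 11)       # final run sets every factor to its low level

for run in design:
    print("".join("+" if v > 0 else "-" for v in run))
```

Each of the 11 columns is balanced (six high and six low settings over the 12 runs), which is what lets main effects be estimated from so few experiments.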

  6. Progress in multirate digital control system design

    NASA Technical Reports Server (NTRS)

    Berg, Martin C.; Mason, Gregory S.

    1991-01-01

    A new methodology for multirate sampled-data control design based on a new generalized control law structure, two new parameter-optimization-based control law synthesis methods, and a new singular-value-based robustness analysis method are described. The control law structure can represent multirate sampled-data control laws of arbitrary structure and dynamic order, with arbitrarily prescribed sampling rates for all sensors and update rates for all processor states and actuators. The two control law synthesis methods employ numerical optimization to determine values for the control law parameters. The robustness analysis method is based on the multivariable Nyquist criterion applied to the loop transfer function for the sampling period equal to the period of repetition of the system's complete sampling/update schedule. The complete methodology is demonstrated by application to the design of a combination yaw damper and modal suppression system for a commercial aircraft.

  7. Use of Taguchi methodology to enhance the yield of caffeine removal with growing cultures of Pseudomonas pseudoalcaligenes.

    PubMed

    Ashengroph, Morahem; Ababaf, Sajad

    2014-12-01

    Microbial caffeine removal is a green solution for the treatment of caffeinated products and agro-industrial effluents. We directed this investigation at optimizing a bio-decaffeination process with growing cultures of Pseudomonas pseudoalcaligenes through Taguchi methodology, a structured statistical approach that can lower variation in a process through Design of Experiments (DOE). Five parameters, i.e. initial fructose, tryptone, Zn(+2) ion and caffeine concentrations and also incubation time, were selected, and an L16 orthogonal array was applied to design experiments with four 4-level factors and one 3-level factor (4(4) × 1(3)). Data analysis was performed using the statistical analysis of variance (ANOVA) method. Furthermore, the optimal conditions were determined by combining the optimal levels of the significant factors and verified by a confirming experiment. Measurement of residual caffeine concentration in the reaction mixture was performed using high-performance liquid chromatography (HPLC). Use of Taguchi methodology for optimization of design parameters resulted in about 86.14% reduction of caffeine in 48 h of incubation when 5 g/l fructose, 3 mM Zn(+2) ion, and 4.5 g/l caffeine are present in the designed medium. Under the optimized conditions, the yield of degradation of caffeine (4.5 g/l) by the native strain Pseudomonas pseudoalcaligenes TPS8 increased from 15.8% to 86.14%, which is 5.4-fold higher than the normal yield. According to the experimental results, Taguchi methodology provides a powerful approach for identifying the factors that favor caffeine removal using strain TPS8, and has potential application with similar strains to improve the yield of caffeine removal from caffeine-containing solutions.
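Taguchi analysis ranks factor levels by a signal-to-noise ratio; for a "larger is better" response such as percent caffeine removed, S/N = -10·log10(mean(1/y²)). The replicate yields below are hypothetical, not the paper's measurements:

```python
import math

# Taguchi larger-is-better signal-to-noise ratio for comparing factor
# levels (the replicate removal yields below are hypothetical).

def sn_larger_is_better(ys):
    """S/N = -10 * log10(mean(1 / y^2)); higher is better."""
    return -10.0 * math.log10(sum(1.0 / y ** 2 for y in ys) / len(ys))

removal_by_fructose_level = {   # % caffeine removed, two replicates each
    "2.5 g/l": [44.0, 47.0],
    "5.0 g/l": [84.0, 86.0],
    "7.5 g/l": [71.0, 69.0],
}

ratios = {lvl: sn_larger_is_better(ys)
          for lvl, ys in removal_by_fructose_level.items()}
best = max(ratios, key=ratios.get)
print(f"best level by S/N: {best}")
```

In a full Taguchi analysis this S/N averaging is done per level of every factor in the orthogonal array, and ANOVA then ranks the factors' contributions.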

  8. Process control integration requirements for advanced life support systems applicable to manned space missions

    NASA Technical Reports Server (NTRS)

    Spurlock, Paul; Spurlock, Jack M.; Evanich, Peggy L.

    1991-01-01

    An overview of recent developments in process-control technology which might have applications in future advanced life support systems for long-duration space operations is presented. Consideration is given to design criteria related to control system selection and optimization, and process-control interfacing methodology. Attention is also given to current life support system process control strategies, innovative sensors, instrumentation and control, and innovations in process supervision.

  9. Stream habitat analysis using the instream flow incremental methodology

    USGS Publications Warehouse

    Bovee, Ken D.; Lamb, Berton L.; Bartholow, John M.; Stalnaker, Clair B.; Taylor, Jonathan; Henriksen, Jim

    1998-01-01

    This document describes the Instream Flow Incremental Methodology (IFIM) in its entirety and serves as a comprehensive introductory textbook on IFIM for training courses, containing the most complete description of IFIM in existence today. It is also intended as an official published guide to IFIM, counteracting misconceptions about the methodology that have pervaded the professional literature since the mid-1980s, as it describes IFIM as envisioned by its developers. The document is aimed both at decision makers in the management and allocation of natural resources, providing them with an overview, and at those who design and implement studies to inform those decision makers. There is enough background on model concepts, data requirements, calibration techniques, and quality assurance to help the technical user design and implement a cost-effective application of IFIM that will provide policy-relevant information. Individual chapters cover the basic organization of IFIM and the procedural sequence of applying it, starting with problem identification, continuing through study planning and implementation, and ending with problem resolution.

  10. Biosynthesis of Inorganic Nanoparticles: A Fresh Look at the Control of Shape, Size and Composition

    PubMed Central

    Dahoumane, Si Amar; Jeffryes, Clayton; Mechouet, Mourad; Agathos, Spiros N.

    2017-01-01

    Several methodologies have been devised for the design of nanomaterials. The “Holy Grail” for materials scientists is the cost-effective, eco-friendly synthesis of nanomaterials with controlled sizes, shapes and compositions, as these features confer unique properties on the as-produced nanocrystals, making them appropriate candidates for valuable bio-applications. The present review summarizes published data regarding the production of nanomaterials with special features via sustainable methodologies based on the utilization of natural bioresources. The richness of the latter, the diversity of the routes adopted and the tuning of experimental parameters have led to the fabrication of nanomaterials belonging to different chemical families, with appropriate compositions and displaying interesting sizes and shapes. It is expected that these findings will encourage researchers, and attract newcomers, to continue exploring the possibilities offered by nature and to design innovative and safer methodologies for the synthesis of unique nanomaterials possessing desired features and exhibiting valuable properties that can be exploited in a profusion of fields. PMID:28952493

  11. Anticipating the Chaotic Behaviour of Industrial Systems Based on Stochastic, Event-Driven Simulations

    NASA Astrophysics Data System (ADS)

    Bruzzone, Agostino G.; Revetria, Roberto; Simeoni, Simone; Viazzo, Simone; Orsoni, Alessandra

    2004-08-01

    In logistics and industrial production, managers must deal with the impact of stochastic events to improve performance and reduce costs. In fact, production and logistics systems are generally designed treating some parameters as deterministic. While this assumption is mostly used for preliminary prototyping, it is sometimes also retained during the final design stage, especially for estimated parameters (i.e., market request). The proposed methodology determines the impact of stochastic events on the system by evaluating the chaotic threshold level. Such an approach, based on the application of a new and innovative methodology, can be used to find the conditions under which chaos makes the system uncontrollable. Starting from problem identification and risk assessment, several classification techniques are used to carry out an effect analysis and to estimate a contingency plan. The authors illustrate the methodology with a real industrial case: a production problem related to the logistics of distributed chemical processing.

  12. Comparison of Requirements for Composite Structures for Aircraft and Space Applications

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.; Elliot, Kenny B.; Hampton, Roy W.; Knight, Norman F., Jr.; Aggarwal, Pravin; Engelstad, Stephen P.; Chang, James B.

    2010-01-01

    In this report, the aircraft and space vehicle requirements for composite structures are compared. It is a valuable exercise to study composite structural design approaches used in the airframe industry and to adopt methodology that is applicable for space vehicles. The missions, environments, analysis methods, analysis validation approaches, testing programs, build quantities, inspection, and maintenance procedures used by the airframe industry, in general, are not transferable to spaceflight hardware. Therefore, while the application of composite design approaches from aircraft and other industries is appealing, many aspects cannot be directly utilized. Nevertheless, experiences and research for composite aircraft structures may be of use in unexpected arenas as space exploration technology develops, and so continued technology exchanges are encouraged.

  13. Application of optimal design methodologies in clinical pharmacology experiments.

    PubMed

    Ogungbenro, Kayode; Dokoumetzidis, Aristides; Aarons, Leon

    2009-01-01

    Pharmacokinetic and pharmacodynamic data are often analysed by mixed-effects modelling techniques (also known as population analysis), which have become a standard tool in the pharmaceutical industry for drug development. The last 10 years have witnessed considerable interest in the application of experimental design theory to population pharmacokinetic and pharmacodynamic experiments. Design of population pharmacokinetic experiments involves the selection and careful balance of a number of design factors. Optimal design theory uses prior information about the model and parameter estimates to optimize a function of the Fisher information matrix and so obtain the best combination of the design factors. This paper reviews the different approaches that have been described in the literature for optimal design of population pharmacokinetic and pharmacodynamic experiments. It describes the options that are available and highlights some of the issues that could be of concern in practical application. It also discusses areas of application of optimal design theory in clinical pharmacology experiments. It is expected that, as awareness of the benefits of this approach increases, more people will embrace it, ultimately leading to more efficient population pharmacokinetic and pharmacodynamic experiments and helping to reduce both the cost and time of drug development. Copyright (c) 2008 John Wiley & Sons, Ltd.
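    The idea of optimizing a function of the Fisher information matrix can be illustrated concretely. The sketch below uses a hypothetical one-compartment bolus model with parameters CL and V under additive Gaussian error (the model, parameter values, and sampling schedules are invented for illustration, not taken from the paper), and compares two candidate sampling schedules by the D-optimality criterion, the determinant of the FIM:

    ```python
    import numpy as np

    def conc(t, cl, v, dose=100.0):
        """One-compartment IV bolus model: C(t) = (dose/V) * exp(-(CL/V) * t)."""
        return (dose / v) * np.exp(-(cl / v) * t)

    def fisher_information(times, cl, v, sigma=1.0, dose=100.0):
        """FIM for (CL, V) under additive Gaussian error, from analytic sensitivities."""
        t = np.asarray(times, dtype=float)
        c = conc(t, cl, v, dose)
        d_cl = -c * t / v                    # dC/dCL
        d_v = (c / v) * (cl * t / v - 1.0)   # dC/dV
        g = np.vstack([d_cl, d_v])           # 2 x n sensitivity matrix
        return (g @ g.T) / sigma**2

    cl, v = 2.0, 10.0
    spread = [0.5, 1.0, 2.0, 8.0]     # samples across the elimination phase
    clustered = [0.5, 0.6, 0.7, 0.8]  # early samples only

    d_spread = np.linalg.det(fisher_information(spread, cl, v))
    d_clustered = np.linalg.det(fisher_information(clustered, cl, v))
    print(d_spread > d_clustered)  # True: the spread design is D-better
    ```

    Spreading samples across the elimination phase yields a larger determinant, i.e. jointly tighter attainable parameter estimates; an optimal-design algorithm searches over exactly this kind of trade-off among the design factors.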

  14. Comprehensive Solar-Terrestrial Environment Model (COSTEM) for Space Weather Predictions

    DTIC Science & Technology

    2007-07-01

    research in data assimilation methodologies applicable to the space environment, as well as "threat adaptive" grid computing technologies, where we … The Space Weather Modeling Framework (SWMF) [29, 43] was designed in 2001 and has been developed to integrate and couple several … The SWMF and its components are tested on several computer/compiler platforms and documented. The main design goals of the SWMF were to minimize …

  15. Fractional representation theory - Robustness results with applications to finite dimensional control of a class of linear distributed systems

    NASA Technical Reports Server (NTRS)

    Nett, C. N.; Jacobson, C. A.; Balas, M. J.

    1983-01-01

    This paper reviews and extends the fractional representation theory. In particular, new and powerful robustness results are presented. This new theory is utilized to develop a preliminary design methodology for finite dimensional control of a class of linear evolution equations on a Banach space. The design is for stability in an input-output sense, but particular attention is paid to internal stability as well.

  16. Laminated Object Manufacturing-Based Design Ceramic Matrix Composites

    DTIC Science & Technology

    2001-04-01

    components for DoD applications. Program goals included the development of (1) a new LOM-based design methodology for CMCs and (2) optimized preceramic polymer … Advanced materials such as polymer matrix composites have faced similar barriers to implementation. These barriers have been overcome through the development of suitable …

  17. Wideband QAMC reflector's antenna for low profile applications

    NASA Astrophysics Data System (ADS)

    Grelier, M.; Jousset, M.; Mallégol, S.; Lepage, A. C.; Begaud, X.; LeMener, J. M.

    2011-06-01

    A wideband reflector antenna based on a quasi-artificial magnetic conductor (QAMC) is proposed. To validate the design, an Archimedean spiral antenna has been backed with this new reflector. In comparison to the classical solution using absorbent material, the prototype presents a very low thickness of λ/15 at the lowest operating frequency and an improved gain over a 2.4:1 bandwidth. The design methodology for this reflector can be applied to other wideband antennas.

  18. Recommendations for research design and reporting in computer-assisted diagnosis to facilitate meta-analysis.

    PubMed

    Eadie, Leila H; Taylor, Paul; Gibson, Adam P

    2012-04-01

    Computer-assisted diagnosis (CAD) describes a diverse, heterogeneous range of applications rather than a single entity. The aims and functions of CAD systems vary considerably and comparing studies and systems is challenging due to methodological and design differences. In addition, poor study quality and reporting can reduce the value of some publications. Meta-analyses of CAD are therefore difficult and may not provide reliable conclusions. Aiming to determine the major sources of heterogeneity and thereby what CAD researchers could change to allow this sort of assessment, this study reviews a sample of 147 papers concerning CAD used with imaging for cancer diagnosis. It discusses sources of variability, including the goal of the CAD system, learning methodology, study population, design, outcome measures, inclusion of radiologists, and study quality. Based upon this evidence, recommendations are made to help researchers optimize the quality and comparability of their trial design and reporting. Copyright © 2011 Elsevier Inc. All rights reserved.

  19. Improving scanner wafer alignment performance by target optimization

    NASA Astrophysics Data System (ADS)

    Leray, Philippe; Jehoul, Christiane; Socha, Robert; Menchtchikov, Boris; Raghunathan, Sudhar; Kent, Eric; Schoonewelle, Hielke; Tinnemans, Patrick; Tuffy, Paul; Belen, Jun; Wise, Rich

    2016-03-01

    At process nodes of 10 nm and below, patterning complexity, along with the processing and materials required, has created a need to optimize alignment targets in order to achieve the required precision, accuracy, and throughput. Recent industry publications on metrology target optimization have shown a move away from expensive and time-consuming empirical methodologies towards a faster computational approach. ASML's Design for Control (D4C) application, currently used to optimize YieldStar diffraction-based overlay (DBO) metrology targets, has been extended to support the optimization of scanner wafer alignment targets. This allows the process information and design methodology used for DBO target design to be leveraged for the optimization of alignment targets. In this paper, we show how we applied this computational approach to wafer alignment target design. We verify the correlation between predictions and measurements for the key alignment performance metrics, and finally show the potential alignment and overlay performance improvements that an optimized alignment target could achieve.

  20. Study of Design Knowledge Capture (DKC) schemes implemented in magnetic bearing applications

    NASA Technical Reports Server (NTRS)

    1990-01-01

    A design knowledge capture (DKC) scheme was implemented using frame-based techniques. The objective of such a system is to capture not only the knowledge which describes a design, but also that which explains how the design decisions were reached. These knowledge types were labelled definitive and explanatory, respectively. Examination of the design process helped determine what knowledge to retain and at what stage that knowledge is used. A discussion of frames resulted in the recognition of their value to knowledge representation and organization. The FORMS frame system was used as a basis for further development, and for examples using magnetic bearing design. The specific contributions made by this research include: determination that frame-based systems provide a useful methodology for management and application of design knowledge; definition of specific user interface requirements, (this consists of a window-based browser); specification of syntax for DKC commands; and demonstration of the feasibility of DKC by applications to existing designs. It was determined that design knowledge capture could become an extremely valuable engineering tool for complicated, long-life systems, but that further work was needed, particularly the development of a graphic, window-based interface.

  1. Robust Feedback Control of Flow Induced Structural Radiation of Sound

    NASA Technical Reports Server (NTRS)

    Heatwole, Craig M.; Bernhard, Robert J.; Franchek, Matthew A.

    1997-01-01

    A significant component of the interior noise of aircraft and automobiles is a result of turbulent boundary layer excitation of the vehicular structure. In this work, active robust feedback control of the noise due to this non-predictable excitation is investigated. Both an analytical model and experimental investigations are used to determine the characteristics of the flow-induced structural sound radiation problem. The problem is shown to be broadband in nature, with large system uncertainties associated with the various operating conditions. Furthermore, the delay associated with sound propagation is shown to restrict the use of microphone feedback. State-of-the-art control methodologies, μ-synthesis and adaptive feedback control, are evaluated and shown to have limited success in solving this problem. A robust frequency-domain controller design methodology is developed for the problem of sound radiated from turbulent-flow-driven plates. The control design methodology uses frequency-domain sequential loop-shaping techniques. System uncertainty, sound pressure level reduction performance, and actuator constraints are included in the design process. Using this design method, phase lag was added using non-minimum-phase zeros such that the beneficial plant dynamics could be used. This general control approach has application to lightly damped vibration and sound radiation problems with high-bandwidth control objectives requiring a low controller DC gain and a low controller order.

  2. Evaluation of Structural Robustness against Column Loss: Methodology and Application to RC Frame Buildings.

    PubMed

    Bao, Yihai; Main, Joseph A; Noh, Sam-Young

    2017-08-01

    A computational methodology is presented for evaluating structural robustness against column loss. The methodology is illustrated through application to reinforced concrete (RC) frame buildings, using a reduced-order modeling approach for three-dimensional RC framing systems that includes the floor slabs. Comparisons with high-fidelity finite-element model results are presented to verify the approach. Pushdown analyses of prototype buildings under column loss scenarios are performed using the reduced-order modeling approach, and an energy-based procedure is employed to account for the dynamic effects associated with sudden column loss. Results obtained using the energy-based approach are found to be in good agreement with results from direct dynamic analysis of sudden column loss. A metric for structural robustness is proposed, calculated by normalizing the ultimate capacities of the structural system under sudden column loss by the applicable service-level gravity loading and by evaluating the minimum value of this normalized ultimate capacity over all column removal scenarios. The procedure is applied to two prototype 10-story RC buildings, one employing intermediate moment frames (IMFs) and the other employing special moment frames (SMFs). The SMF building, with its more stringent seismic design and detailing, is found to have greater robustness.
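    The energy-based procedure and the robustness metric described above can be sketched numerically (illustrative numbers only; the actual study uses detailed reduced-order RC frame models). The capacity under sudden column loss is obtained from the static pushdown curve by an energy balance, and the robustness index is the minimum, over column-removal scenarios, of ultimate capacity normalized by the applicable service-level gravity load:

    ```python
    import numpy as np

    def dynamic_capacity(u, p_static):
        """Energy-based estimate of capacity under sudden column loss:
        P_dyn(u) = (1/u) * integral_0^u P_static(u') du'
        (work done by the suddenly applied load balances the strain energy absorbed)."""
        e = np.concatenate([[0.0],
                            np.cumsum(0.5 * (p_static[1:] + p_static[:-1]) * np.diff(u))])
        return e[1:] / u[1:]  # skip u = 0 to avoid division by zero

    def robustness_index(ultimate_capacities, service_load):
        """Minimum over all column-removal scenarios of the ultimate capacity
        normalized by the service-level gravity load; values > 1 indicate robustness."""
        return min(c / service_load for c in ultimate_capacities)

    # A linear static pushdown P = k*u gives P_dyn = k*u/2 exactly.
    u = np.linspace(0.0, 1.0, 101)
    p = 10.0 * u
    pd = dynamic_capacity(u, p)
    print(round(pd[-1], 6))  # 5.0

    # Hypothetical normalized capacities for three column-removal scenarios.
    print(robustness_index([1.8, 1.4, 2.2], 1.0))  # 1.4 (governing scenario)
    ```

    The factor-of-two reduction for the linear case is the familiar dynamic amplification of a suddenly applied load; for nonlinear pushdown curves the same energy balance gives the curve-dependent result used in the paper's metric.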

  3. Automated control of hierarchical systems using value-driven methods

    NASA Technical Reports Server (NTRS)

    Pugh, George E.; Burke, Thomas E.

    1990-01-01

    An introduction is given to the Value-driven methodology, which has been successfully applied to solve a variety of difficult decision, control, and optimization problems. Many real-world decision processes (e.g., those encountered in scheduling, allocation, and command and control) involve a hierarchy of complex planning considerations. For such problems it is virtually impossible to define a fixed set of rules that will operate satisfactorily over the full range of probable contingencies. Decision Science Applications' value-driven methodology offers a systematic way of automating the intuitive, common-sense approach used by human planners. The inherent responsiveness of value-driven systems to user-controlled priorities makes them particularly suitable for semi-automated applications in which the user must remain in command of the system's operation. Three examples of the practical application of the approach in the automation of hierarchical decision processes are discussed: the TAC Brawler air-to-air combat simulation is a four-level computerized hierarchy; the autonomous underwater vehicle mission planning system is a three-level control system; and the Space Station Freedom electrical power control and scheduling system is designed as a two-level hierarchy. The methodology is compared with rule-based systems and with other more widely known optimization techniques.

  4. At the cross-roads of participatory research and biomarker discovery in autism: the need for empirical data.

    PubMed

    Yusuf, Afiqah; Elsabbagh, Mayada

    2015-12-15

    Identifying biomarkers for autism can improve outcomes for those affected by autism. Engaging the diverse stakeholders in the research process using community-based participatory research (CBPR) can accelerate biomarker discovery into clinical applications. However, there are limited examples of stakeholder involvement in autism research, possibly due to conceptual and practical concerns. We evaluate the applicability of CBPR principles to biomarker discovery in autism and critically review empirical studies adopting these principles. Using a scoping review methodology, we identified and evaluated seven studies using CBPR principles in biomarker discovery. The limited number of studies in biomarker discovery adopting CBPR principles coupled with their methodological limitations suggests that such applications are feasible but challenging. These studies illustrate three CBPR themes: community assessment, setting global priorities, and collaboration in research design. We propose that further research using participatory principles would be useful in accelerating the pace of discovery and the development of clinically meaningful biomarkers. For this goal to be successful we advocate for increased attention to previously identified conceptual and methodological challenges to participatory approaches in health research, including improving scientific rigor and developing long-term partnerships among stakeholders.

  5. SAGA: A project to automate the management of software production systems

    NASA Technical Reports Server (NTRS)

    Campbell, Roy H.; Beckman, Carol S.; Benzinger, Leonora; Beshers, George; Hammerslag, David; Kimball, John; Kirslis, Peter A.; Render, Hal; Richards, Paul; Terwilliger, Robert

    1985-01-01

    The SAGA system is a software environment that is designed to support most of the software development activities that occur in a software lifecycle. The system can be configured to support specific software development applications using given programming languages, tools, and methodologies. Meta-tools are provided to ease configuration. The SAGA system consists of a small number of software components that are adapted by the meta-tools into specific tools for use in the software development application. The modules are designed so that the meta-tools can construct an environment which is both integrated and flexible. The SAGA project is documented in several papers, which are presented here.

  6. 26th National Medicinal Chemistry Symposium--Developments in chemokines, carbohydrates, p53 and drug metabolism. 14-18 June 1998, Richmond, Virginia, USA.

    PubMed

    Swords, B

    1998-08-01

    This symposium, organized by the American Chemical Society, is held every two years. This year's meeting, sponsored by the ACS and The Virginia Commonwealth University, was attended by approximately 300 delegates and covered developments in chemokines, carbohydrates, p53, drug metabolism, prodrugs, structure-based design and molecular modeling. At the opening ceremony, John Topliss began by paying tribute to the distinguished medicinal chemistry career of Alfred Burger (University of Virginia, USA). He then reviewed the application of physicochemical principles to drug design, including the development and application of quantitative structure-activity relationship methodology.

  7. Serial multiplier arrays for parallel computation

    NASA Technical Reports Server (NTRS)

    Winters, Kel

    1990-01-01

    Arrays of systolic serial-parallel multiplier elements are proposed as an alternative to conventional SIMD mesh serial adder arrays for applications that are multiplication-intensive and require few stored operands. The design and operation of a number of multiplier and array configurations featuring locality of connection, modularity, and regularity of structure are discussed. A design methodology combining top-down and bottom-up techniques is described to facilitate development of custom high-performance CMOS multiplier element arrays as well as rapid synthesis of simulation models and semicustom prototype CMOS components. Finally, a differential version of NORA dynamic circuits requiring a single-phase uncomplemented clock signal is introduced for this application.
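    A behavioral model of a serial-parallel multiplier is easy to sketch (a functional sketch only; the paper's CMOS elements accumulate with carry-save hardware rather than integer addition). One operand arrives one bit per clock, LSB first, and is ANDed with the parallel-held operand to form a shifted partial product each cycle:

    ```python
    def serial_parallel_multiply(a, b, n_bits):
        """Cycle-by-cycle model of a serial-parallel multiplier: operand `a` enters
        one bit per clock (LSB first) while `b` is held in parallel. Each cycle,
        the serial bit gates every bit of `b` (a row of AND gates in hardware)
        and the shifted partial product is accumulated."""
        acc = 0
        for cycle in range(n_bits):
            serial_bit = (a >> cycle) & 1
            if serial_bit:            # gate the parallel operand with the serial bit
                acc += b << cycle     # add the partial product at this bit weight
        return acc

    print(serial_parallel_multiply(13, 11, 8))  # 143
    ```

    The appeal for SIMD meshes is that each element needs only local connections and a single serial input line per operand, at the cost of n clock cycles per product.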

  8. Application of experimental design in geothermal resources assessment of Ciwidey-Patuha, West Java, Indonesia

    NASA Astrophysics Data System (ADS)

    Ashat, Ali; Pratama, Heru Berian

    2017-12-01

    Successful assessment of the size of the Ciwidey-Patuha geothermal field required integrated analysis of data from all aspects of the field to determine the optimum capacity to be installed. Resource assessment involves significant uncertainty in subsurface information and multiple development scenarios for the field. Therefore, this paper applies an experimental design approach to the geothermal numerical simulation of Ciwidey-Patuha to generate a probabilistic resource assessment. This process assesses the impact of the evaluated parameters on resources and the interactions between these parameters. The methodology has successfully estimated the maximum resources with a polynomial function covering the entire range of possible values of the important reservoir parameters.
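    The core experimental-design step, fitting a low-order polynomial proxy (response surface) to simulator outputs over a factorial design of reservoir parameters, can be sketched as follows (the factor names, coded levels, and outputs are invented for illustration and are not the paper's data):

    ```python
    import numpy as np

    def fit_response_surface(x1, x2, y):
        """Least-squares fit of the proxy y = b0 + b1*x1 + b2*x2 + b3*x1*x2
        over a two-level full-factorial design with a centre point."""
        A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2])
        coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
        return coeffs

    # Coded factor levels (-1, +1) for two reservoir parameters, plus a centre run.
    x1 = np.array([-1.0, 1.0, -1.0, 1.0, 0.0])
    x2 = np.array([-1.0, -1.0, 1.0, 1.0, 0.0])
    # Synthetic "simulator outputs" (e.g. resource estimates, MWe) per design run.
    y = 50.0 + 8.0 * x1 + 3.0 * x2 + 1.5 * x1 * x2

    b = fit_response_surface(x1, x2, y)
    print(np.round(b, 6))  # recovers [50. 8. 3. 1.5]
    ```

    Once fitted, the cheap polynomial stands in for the expensive reservoir simulator, so a Monte Carlo sweep over the full parameter ranges yields the probabilistic (e.g. P10/P50/P90) resource distribution.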

  9. Experimental Design and Bioinformatics Analysis for the Application of Metagenomics in Environmental Sciences and Biotechnology.

    PubMed

    Ju, Feng; Zhang, Tong

    2015-11-03

    Recent advances in DNA sequencing technologies have prompted the widespread application of metagenomics for the investigation of novel bioresources (e.g., industrial enzymes and bioactive molecules) and unknown biohazards (e.g., pathogens and antibiotic resistance genes) in natural and engineered microbial systems across multiple disciplines. This review discusses the rigorous experimental design and sample preparation in the context of applying metagenomics in environmental sciences and biotechnology. Moreover, this review summarizes the principles, methodologies, and state-of-the-art bioinformatics procedures, tools and database resources for metagenomics applications and discusses two popular strategies (analysis of unassembled reads versus assembled contigs/draft genomes) for quantitative or qualitative insights of microbial community structure and functions. Overall, this review aims to facilitate more extensive application of metagenomics in the investigation of uncultured microorganisms, novel enzymes, microbe-environment interactions, and biohazards in biotechnological applications where microbial communities are engineered for bioenergy production, wastewater treatment, and bioremediation.

  10. Capabilities, methodologies, and use of the cambio file-translation application.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lasche, George P.

    2007-03-01

    This report describes the capabilities, methodologies, and uses of the Cambio computer application, designed to automatically read and display nuclear spectral data files of any known format in the world and to convert spectral data to one of several commonly used analysis formats. To further assist responders, Cambio incorporates an analysis method based on non-linear fitting techniques found in open literature and implemented in openly published source code in the late 1980s. A brief description is provided of how Cambio works, of what basic formats it can currently read, and how it can be used. Cambio was developed at Sandia National Laboratories and is provided as a free service to assist nuclear emergency response analysts anywhere in the world in the fight against nuclear terrorism.

  11. Integrated Systems Health Management (ISHM) Toolkit

    NASA Technical Reports Server (NTRS)

    Venkatesh, Meera; Kapadia, Ravi; Walker, Mark; Wilkins, Kim

    2013-01-01

    A framework of software components has been implemented to facilitate the development of ISHM systems according to a methodology based on Reliability Centered Maintenance (RCM). This framework is collectively referred to as the Toolkit and was developed using General Atomics' Health MAP (TM) technology. The toolkit is intended to provide assistance to software developers of mission-critical system health monitoring applications in the specification, implementation, configuration, and deployment of such applications. In addition to software tools designed to facilitate these objectives, the toolkit also provides direction to software developers in accordance with an ISHM specification and development methodology. The development tools are based on an RCM approach for the development of ISHM systems. This approach focuses on defining, detecting, and predicting the likelihood of system functional failures and their undesirable consequences.

  12. Recent efforts directed to the development of more sustainable asymmetric organocatalysis.

    PubMed

    Hernández, José G; Juaristi, Eusebio

    2012-06-04

    In line with the principles of "green" chemistry, organocatalysis seeks to reduce energy consumption and to optimize the use of the available resources, aiming to become a sustainable strategy in chemical transformations. Nevertheless, during the last decade diverse experimental protocols have made organocatalysis an even "greener" alternative by the use of friendlier reaction conditions, or via the application of solvent-free methodologies, or through the design and synthesis of more selective catalysts, or via the development of multicomponent one-pot organocatalytic reactions, or by the recycling and reuse of organocatalysts, or by means of the application of more energy-efficient activation techniques, among other approaches. In this feature article we review some of the remarkable advancements that have made it possible to develop even more sustainable asymmetric organocatalyzed methodologies.

  13. Air-kerma evaluation at the maze entrance of HDR brachytherapy facilities.

    PubMed

    Pujades, M C; Granero, D; Vijande, J; Ballester, F; Perez-Calatayud, J; Papagiannis, P; Siebert, F A

    2014-12-01

    In the absence of procedures for evaluating the design of brachytherapy (BT) facilities for radiation protection purposes, the methodology used for external beam radiotherapy facilities is often adapted. The purpose of this study is to adapt the NCRP 151 methodology for estimating the air-kerma rate at the door in BT facilities. Such methodology was checked against Monte Carlo (MC) techniques using the code Geant4. Five different facility designs were studied for (192)Ir and (60)Co HDR applications to account for several different bunker layouts. For the estimation of the lead thickness needed at the door, the use of transmission data for the real spectra at the door instead of the ones emitted by (192)Ir and (60)Co will reduce the lead thickness by a factor of five for (192)Ir and ten for (60)Co. This will significantly lighten the door and hence simplify construction and operating requirements for all bunkers. The adaptation proposed in this study to estimate the air-kerma rate at the door depends on the complexity of the maze: it provides good results for bunkers with a maze (i.e. similar to those used for linacs, for which the NCRP 151 methodology was developed) but fails for less conventional designs. For those facilities, a specific Monte Carlo study is in order for reasons of safety and cost-effectiveness.

  14. Watermark: An Application and Methodology for Interactive and Intelligent Decision Support for Groundwater Systems

    NASA Astrophysics Data System (ADS)

    Pierce, S. A.; Wagner, K.; Schwartz, S.; Gentle, J. N., Jr.

    2016-12-01

    As critical water resources face the effects of historic drought, increased demand, and potential contamination, the need has never been greater to develop resources that effectively communicate conservation and protection across a broad audience and geographical area. The Watermark application and macro-analysis methodology merge topical analysis of a context-rich corpus of policy texts with multi-attributed solution sets from integrated models of water resources and other subsystems, such as mineral, food, energy, or environmental systems, to construct a scalable, robust, and reproducible approach for identifying links between policy and science knowledge bases. The Watermark application is an open-source, interactive workspace to support science-based visualization and decision making. Designed with generalization in mind, Watermark is a flexible platform that allows for data analysis and inclusion of large datasets, with an interactive front end capable of connecting with other applications as well as advanced computing resources. In addition, the Watermark analysis methodology offers functionality that streamlines communication with non-technical users for policy, education, or engagement with groups around scientific topics of societal relevance. The technology stack for Watermark was selected with the goal of creating a robust and dynamic modular codebase that can be adjusted to fit many use cases and scale to support usage loads that range from simple data display to complex scientific simulation-based modelling and analytics. The methodology uses topical analysis and simulation-optimization to systematically analyze the policy and management realities of resource systems and explicitly connect the social and problem contexts with science-based and engineering knowledge from models. A case example demonstrates use in a complex groundwater resources management study, highlighting multi-criteria spatial decision making and uncertainty comparisons.

  15. A normative price for energy from an electricity generation system: An Owner-dependent Methodology for Energy Generation (system) Assessment (OMEGA). Volume 1: Summary

    NASA Technical Reports Server (NTRS)

    Chamberlain, R. G.; Mcmaster, K. M.

    1981-01-01

    The utility-owned solar electric system methodology is generalized and updated. The net present value of the system is determined by consideration of all financial benefits and costs (including a specified return on investment). Life-cycle costs, life-cycle revenues, and residual system values are obtained. Break-even values of system parameters are estimated by setting the net present value to zero. While the model was designed for photovoltaic generators with a possible thermal energy byproduct, its applicability is not limited to such systems. The resulting owner-dependent methodology for energy generation system assessment consists of a few equations that can be evaluated without the aid of a high-speed computer.
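    The break-even logic described above can be illustrated with a minimal sketch (hypothetical cash-flow figures; the actual OMEGA model includes many more financial terms, such as taxes and residual value). The normative price is the energy price at which the net present value of the system is exactly zero:

    ```python
    def normative_energy_price(capex, annual_om, annual_energy_kwh, rate, years):
        """Break-even price obtained by setting net present value to zero:
        PV(revenues) = PV(costs), hence price* = PV(costs) / PV(energy)."""
        pv_factor = sum(1.0 / (1.0 + rate) ** t for t in range(1, years + 1))
        pv_costs = capex + annual_om * pv_factor
        pv_energy = annual_energy_kwh * pv_factor
        return pv_costs / pv_energy

    def net_present_value(price, capex, annual_om, annual_energy_kwh, rate, years):
        """NPV of the system at a given energy price (discounted net revenues
        minus the up-front capital cost)."""
        return sum((price * annual_energy_kwh - annual_om) / (1.0 + rate) ** t
                   for t in range(1, years + 1)) - capex

    # Hypothetical system: $5000 capex, $100/yr O&M, 2000 kWh/yr, 8% rate, 20 yr.
    p_star = normative_energy_price(5000.0, 100.0, 2000.0, 0.08, 20)
    # By construction, the NPV at the normative price is zero.
    print(round(net_present_value(p_star, 5000.0, 100.0, 2000.0, 0.08, 20), 6))
    ```

    The discount rate stands in for the specified return on investment: any sale price above the normative price makes the net present value positive for the owner.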

  16. Conjoint analysis: using a market-based research model for healthcare decision making.

    PubMed

    Mele, Nancy L

    2008-01-01

    Conjoint analysis is a market-based research model that has been used by businesses for more than 35 years to predict consumer preferences in product design and purchasing. Researchers in medicine, healthcare economics, and health policy have discovered the value of this methodology in determining treatment preferences, resource allocation, and willingness to pay. To describe the conjoint analysis methodology and explore value-added applications in nursing research. Conjoint analysis methodology is described, using examples from the healthcare and business literature, and personal experience with the method. Nurses are called upon to increase interdisciplinary research, provide an evidence base for nursing practice, create patient-centered treatments, and revise nursing education. Other disciplines have met challenges like these using conjoint analysis and discrete choice modeling.
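
    The core of conjoint analysis is the estimation of part-worth utilities from stated preferences, typically by regression on dummy-coded attribute levels. A minimal sketch with hypothetical healthcare profiles and ratings (not data from the article):

```python
import numpy as np

# Hypothetical dummy-coded profiles: [intercept, visit=long, provider=nurse].
X = np.array([
    [1, 0, 0],   # short visit, physician
    [1, 1, 0],   # long visit, physician
    [1, 0, 1],   # short visit, nurse
    [1, 1, 1],   # long visit, nurse
], dtype=float)
ratings = np.array([3.0, 5.0, 2.0, 4.0])  # one respondent's stated preferences

# Part-worth utilities via least squares: ratings ≈ X @ w.
w, *_ = np.linalg.lstsq(X, ratings, rcond=None)
intercept, worth_long_visit, worth_nurse = w
```

    Here a longer visit adds 2 rating points of utility while a nurse provider subtracts 1; real studies use many more profiles, fractional-factorial designs, and per-respondent or hierarchical models, but the underlying arithmetic is this regression.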

  17. Comparative Effectiveness Research in Lung Diseases and Sleep Disorders

    PubMed Central

    Lieu, Tracy A.; Au, David; Krishnan, Jerry A.; Moss, Marc; Selker, Harry; Harabin, Andrea; Connors, Alfred

    2011-01-01

    The Division of Lung Diseases of the National Heart, Lung, and Blood Institute (NHLBI) held a workshop to develop recommendations on topics, methodologies, and resources for comparative effectiveness research (CER) that will guide clinical decision making about available treatment options for lung diseases and sleep disorders. A multidisciplinary group of experts with experience in efficacy, effectiveness, implementation, and economic research identified (a) what types of studies the domain of CER in lung diseases and sleep disorders should include, (b) the criteria and process for setting priorities, and (c) current resources for and barriers to CER in lung diseases. Key recommendations were to (1) increase efforts to engage stakeholders in developing CER questions and study designs; (2) invest in further development of databases and other infrastructure, including efficient methods for data sharing; (3) make full use of a broad range of study designs; (4) increase the appropriate use of observational designs and the support of methodologic research; (5) ensure that committees that review CER grant applications include persons with appropriate perspective and expertise; and (6) further develop the workforce for CER by supporting training opportunities that focus on the methodologic and practical skills needed. PMID:21965016

  18. On sustainable and efficient design of ground-source heat pump systems

    NASA Astrophysics Data System (ADS)

    Grassi, W.; Conti, P.; Schito, E.; Testi, D.

    2015-11-01

    This paper is mainly aimed at stressing some fundamental features of GSHP design and is based on broad research we are performing at the University of Pisa. In particular, we focus the discussion on an environmentally sustainable approach based on performance optimization over the entire operational life. The proposed methodology investigates design and management strategies to find the optimal level of exploitation of the ground source, resorting to other technical means to cover the remaining energy requirements and to modulate the power peaks. The method is holistic, considering the system as a whole rather than focusing only on some components usually considered the most important ones. Each subsystem is modeled and coupled to the others in a full set of equations, which is used within an optimization routine to reproduce the operative performance of the overall GSHP system. As a matter of fact, the recommended methodology is a 4-in-1 activity, including sizing of components, life-cycle performance evaluation, optimization, and feasibility analysis. The paper also reviews some previous works concerning possible applications of the proposed methodology. In conclusion, we describe ongoing research activities and objectives of future work.

  19. Reassessing SERS enhancement factors: using thermodynamics to drive substrate design.

    PubMed

    Guicheteau, J A; Tripathi, A; Emmons, E D; Christesen, S D; Fountain, Augustus W

    2017-12-04

    Over the past 40 years, fundamental and applied research into Surface-Enhanced Raman Scattering (SERS) has been explored by academia, industry, and government laboratories. To date, however, SERS has achieved little commercial success as an analytical technique. Researchers are pursuing a variety of paths to help break through the commercial barrier by addressing the reproducibility of both SERS substrates and SERS signals, as well as continuing to explore the underlying mechanisms. To this end, investigators use a variety of methodologies, typically studying strongly binding analytes such as aromatic thiols and azarenes, and report SERS enhancement factor calculations. However, a drawback of the traditional SERS enhancement factor calculation is that it does not yield enough information to understand substrate reproducibility, application potential with another analyte, or the driving factors behind the molecule-metal interaction. Our work at the US Army Edgewood Chemical Biological Center has focused on these questions, and we have shown that thermodynamic principles play a key role in the SERS response and are an essential factor in future designs of substrates and applications. This work discusses the advantages and disadvantages of various experimental techniques used to report SERS enhancement with planar SERS substrates and presents our alternative SERS enhancement value. We report on three types of analysis scenarios that yield different information concerning the effectiveness of the SERS substrate, the practical application of the substrate, and the thermodynamic properties of the substrate. We believe that through this work a greater understanding of substrate design will be achieved, one based on both thermodynamic and plasmonic properties as opposed to plasmonic properties alone. This new understanding and potential change in substrate design will enable more applications for SERS-based methodologies, including targeting molecules that are traditionally not easily detected with SERS due to the perceived weak molecule-metal interaction of substrates.

  20. Application of Taguchi methods to infrared window design

    NASA Astrophysics Data System (ADS)

    Osmer, Kurt A.; Pruszynski, Charles J.

    1990-10-01

    Dr. Genichi Taguchi, a prominent quality consultant, reduced a branch of statistics known as "Design of Experiments" to a cookbook methodology that can be employed by any competent engineer. This technique has been extensively employed by Japanese manufacturers and is widely credited with helping them attain their current level of success in low-cost, high-quality product design and fabrication. Although the technique was originally put forth as a tool to streamline the determination of improved production processes, it can also be applied to a wide range of engineering problems. As part of an internal research project, this method of experimental design has been adapted to window trade studies and materials research. Two of these analyses are presented herein; they have been chosen to illustrate the breadth of applications in which the Taguchi method can be employed.
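
    The main-effects arithmetic at the heart of a Taguchi analysis is simple: responses from an orthogonal array of runs are averaged at each factor level. A minimal sketch with the L4 array and made-up responses (not values from the window studies):

```python
import numpy as np

# L4 orthogonal array: 4 runs cover 3 two-level factors (-1 = low, +1 = high).
L4 = np.array([
    [-1, -1, -1],
    [-1, +1, +1],
    [+1, -1, +1],
    [+1, +1, -1],
])
y = np.array([10.0, 12.0, 15.0, 17.0])  # hypothetical measured responses

# Main effect of factor j: mean response at +1 minus mean response at -1.
effects = [y[L4[:, j] == +1].mean() - y[L4[:, j] == -1].mean()
           for j in range(L4.shape[1])]
```

    With these made-up responses the first factor dominates (effect 5), the second is smaller (2), and the third is negligible (0); a full Taguchi study would add signal-to-noise ratios and confirmation runs, but the screening logic is this comparison of level means.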

  1. Attitude control/momentum management and payload pointing in advanced space vehicles

    NASA Technical Reports Server (NTRS)

    Parlos, Alexander G.; Jayasuriya, Suhada

    1990-01-01

    The design and evaluation of an attitude control/momentum management system for highly asymmetric spacecraft configurations are presented. The preliminary development and application of a nonlinear control system design methodology for tracking control of uncertain systems, such as spacecraft payload pointing systems, are also presented. Control issues relevant to both linear and nonlinear rigid-body spacecraft dynamics are addressed, whereas structural flexibilities are not taken into consideration. Results from the first task indicate that certain commonly used simplifications in the equations of motion result in unstable attitude control systems when used for highly asymmetric spacecraft configurations. An approach for circumventing this problem is suggested. Additionally, even though preliminary results from the second task are encouraging, the proposed nonlinear control system design method requires further investigation prior to its application and use as an effective payload pointing system design technique.

  2. Modeling and Dynamic Analysis of Paralleled dc/dc Converters With Master-Slave Current Sharing Control

    NASA Technical Reports Server (NTRS)

    Rajagopalan, J.; Xing, K.; Guo, Y.; Lee, F. C.; Manners, Bruce

    1996-01-01

    A simple, application-oriented, transfer function model of paralleled converters employing Master-Slave Current-sharing (MSC) control is developed. Dynamically, the Master converter retains its original design characteristics, while all the Slave converters are forced to depart significantly from their original design characteristics, behaving as current-controlled current sources. Five distinct loop gains to assess system stability and performance are identified and their physical significance is described. A design methodology for the current-share compensator is presented. The effect of this current-sharing scheme on system output impedance is analyzed.

  3. The Use of Exergy and Decomposition Techniques in the Development of Generic Analysis, and Optimization Methodologies Applicable to the Synthesis/Design of Aircraft/Aerospace Systems

    DTIC Science & Technology

    2006-04-21

    C. M., and Prendergast, J. P., 2002, "Thermal Analysis of Hypersonic Inlet Flow with Exergy-Based Design Methods," International Journal of Applied...parametric study of the PS and its components is first presented in order to show the type of detailed information on internal system losses which an exergy ...Thermoeconomic Isolation Applied to the Optimal Synthesis/Design of an Advanced Fighter Aircraft System," International Journal of Thermodynamics, ICAT

  4. Multidisciplinary Optimization Approach for Design and Operation of Constrained and Complex-shaped Space Systems

    NASA Astrophysics Data System (ADS)

    Lee, Dae Young

    The design of a small satellite is challenging, since it is constrained by mass, volume, and power. To mitigate these constraints, designers adopt deployable configurations on the spacecraft, which result in an interesting and difficult optimization problem. The resulting optimization problem is challenging due to the computational complexity caused by the large number of design variables and the model complexity created by the deployables. Adding to these complexities, there is a lack of integration of design optimization systems into operational optimization and the utility maximization of spacecraft in orbit. The developed methodology enables satellite Multidisciplinary Design Optimization (MDO) that is extendable to on-orbit operation. Optimization of on-orbit operations is possible with MDO, since the model predictive controller developed in this dissertation guarantees the achievement of the on-ground design behavior in orbit. To enable the design optimization of highly constrained and complex-shaped space systems, the spherical coordinate analysis technique called the "Attitude Sphere" is extended and merged with additional engineering tools such as OpenGL. OpenGL's graphics acceleration facilitates accurate estimation of the shadow-degraded photovoltaic cell area. This technique is applied to the design optimization of the satellite Electric Power System (EPS), and the design result shows that photovoltaic power generation can be increased by more than 9%. Based on this initial methodology, the goal of this effort is extended from Single Discipline Optimization to Multidisciplinary Optimization, which includes the design and operation of the EPS, the Attitude Determination and Control System (ADCS), and the communication system. The geometry optimization satisfies the conditions of the ground development phase; however, the operation optimization may not be as successful as expected in orbit due to disturbances. To address this issue, controllers based on Model Predictive Control, which are effective for constraint handling, were developed and implemented for the ADCS operations. All the suggested design and operation methodologies are applied to the mission "CADRE", a space weather mission scheduled for operation in 2016. This application demonstrates the usefulness and capability of the methodology to enhance CADRE's capabilities and its ability to be applied to a variety of missions.

  5. A Test Methodology for Determining Space-Readiness of Xilinx SRAM-Based FPGA Designs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quinn, Heather M; Graham, Paul S; Morgan, Keith S

    2008-01-01

    Using reconfigurable, static random-access memory (SRAM) based field-programmable gate arrays (FPGAs) for space-based computation has been an exciting area of research for the past decade. Since both the circuit and the circuit's state are stored in SRAM, both could be altered by the harsh space radiation environment. Both the circuit and the circuit's state can be protected by triple-modular redundancy (TMR), but applying TMR to FPGA user designs is often an error-prone process. Faulty application of TMR could cause the FPGA user circuit to output incorrect data. This paper describes a three-tiered methodology for testing FPGA user designs for space-readiness. We describe the standard approach to testing FPGA user designs using a particle accelerator, as well as two methods using fault injection and a modeling tool. While accelerator testing is the current 'gold standard' for pre-launch testing, we believe the use of fault injection and modeling tools allows for easy, cheap, and uniform access for discovering errors early in the design process.
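
    The TMR protection mentioned above reduces, at each protected signal, to a bitwise majority vote over three redundant copies. A minimal sketch of the voter logic (in software, for illustration only; on an FPGA this would be synthesized hardware):

```python
def tmr_vote(a: int, b: int, c: int) -> int:
    """Bitwise majority vote over three redundant copies: each output bit
    is 1 iff at least two of the corresponding input bits are 1."""
    return (a & b) | (b & c) | (a & c)

# A single-event upset flipping one bit in one copy is masked by the other two.
golden = 0b1011
upset = golden ^ 0b0100  # one bit flipped in the second copy
assert tmr_vote(golden, upset, golden) == golden
```

    The error-prone part the abstract alludes to is not the voter itself but ensuring that every domain of the design (logic, routing, clocking, I/O) is triplicated and voted consistently, which is why accelerator testing, fault injection, and modeling are all needed to validate the result.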

  6. Application of a statistical design to the optimization of parameters and culture medium for alpha-amylase production by Aspergillus oryzae CBS 819.72 grown on gruel (wheat grinding by-product).

    PubMed

    Kammoun, Radhouane; Naili, Belgacem; Bejar, Samir

    2008-09-01

    The production optimization of alpha-amylase (E.C.3.2.1.1) from the Aspergillus oryzae CBS 819.72 fungus, using a by-product of wheat grinding (gruel) as the sole carbon source, was performed with a statistical methodology based on three experimental designs. The optimization of temperature, agitation, and inoculum size was attempted using a Box-Behnken design under the response surface methodology. The screening of nineteen nutrients for their influence on alpha-amylase production was achieved using a Plackett-Burman design. KH(2)PO(4), urea, glycerol, (NH(4))(2)SO(4), CoCl(2), casein hydrolysate, soybean meal hydrolysate, and MgSO(4) were selected based on their positive influence on enzyme formation. The optimized nutrient concentrations were obtained using a Taguchi experimental design, and the analysis of the data predicts a theoretical increase in alpha-amylase expression of 73.2% (from 40.1 to 151.1 U/ml). These conditions were validated experimentally and revealed an enhanced alpha-amylase yield of 72.7%.
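
    The Plackett-Burman screening step mentioned above ranks factors by main effects computed from a small orthogonal design. A minimal sketch with the standard 8-run design and hypothetical yields (not the study's data):

```python
import numpy as np

# 8-run Plackett-Burman design for up to 7 two-level factors, built by
# cycling the standard generator row and appending an all-minus run.
gen = [+1, +1, +1, -1, +1, -1, -1]
design = np.array([np.roll(gen, i) for i in range(7)] + [[-1] * 7])

# Hypothetical enzyme yields (U/ml) for the eight runs.
y = np.array([52.0, 41.0, 38.0, 55.0, 40.0, 47.0, 50.0, 33.0])

# Main effect of factor j: mean yield at +1 minus mean yield at -1.
# Each column is balanced (four +1s, four -1s), so this is a dot product / 4.
effects = (design.T @ y) / 4.0
```

    Factors with the largest positive effects would be carried into the follow-up Taguchi optimization; because the columns of a Plackett-Burman design are mutually orthogonal, the main-effect estimates do not confound one another (ignoring interactions).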

  7. Unique considerations in the design and experimental evaluation of tailored wings with elastically produced chordwise camber

    NASA Technical Reports Server (NTRS)

    Rehfield, Lawrence W.; Zischka, Peter J.; Fentress, Michael L.; Chang, Stephen

    1992-01-01

    Some of the unique considerations associated with the design and experimental evaluation of chordwise deformable wing structures are addressed. Since chordwise elastic camber deformations are desired and must be free to develop, traditional rib concepts and experimental methodology cannot be used. New rib design concepts are presented and discussed. An experimental methodology based upon the use of a flexible sling support and load application system has been created and utilized to evaluate a model box beam experimentally. Experimental data correlate extremely well with design analysis predictions based upon a beam model for the global properties of camber compliance and spanwise bending compliance. Local strain measurements exhibit trends in agreement with intuition and theory but depart slightly from theoretical perfection based upon beam-like behavior alone. It is conjectured that some additional refinement of experimental technique is needed to explain or eliminate these (minor) departures, which appear as asymmetric behavior of the upper and lower box cover strains. Overall, a solid basis for the design of box structures based upon the bending method of elastic camber production has been confirmed by the experiments.

  8. Some consideration for evaluation of structural integrity of aging aircraft

    NASA Astrophysics Data System (ADS)

    Terada, Hiroyuki; Asada, Hiroo

    The objective of this paper is to examine the achievements and limitations of the state of the art in damage-tolerant design methodology and the subjects to be solved for further improvement. The topics discussed are: the basic concept of full-scale fatigue testing; fracture mechanics applications; repair of detected damage; inspection technology and determination of inspection intervals; reliability assessment for practical application; and the importance of various kinds of data acquisition.

  9. An Efficient Optimal Design Methodology for Nonlinear Multibody Dynamics Systems with Application to Vehicle Occupant Restraint Systems

    DTIC Science & Technology

    2011-04-01

    Lai, W., Carhart, M., Richards, D., Brown, J. and Raasch, C., (2006), Modeling the Effects of Seat Belt Pretensioners on Occupant Kinematics During...from being ejected from the vehicle but also be able to assist rapid entry into the vehicle during a rollover or other accidents to avoid injury or...vehicles, such as gunner restraint systems, blast-protective seating systems and other restraint systems, and commercial applications, such as

  10. Application of design sensitivity analysis for greater improvement on machine structural dynamics

    NASA Technical Reports Server (NTRS)

    Yoshimura, Masataka

    1987-01-01

    Methodologies are presented for greatly improving machine structural dynamics by using design sensitivity analyses and evaluative parameters. First, design sensitivity coefficients and evaluative parameters of structural dynamics are described. Next, the relations between the design sensitivity coefficients and the evaluative parameters are clarified. Then, design improvement procedures for structural dynamics are proposed for the following three cases: (1) addition of elastic structural members, (2) addition of mass elements, and (3) substantial changes of joint design variables. Cases (1) and (2) correspond to changes of the initial framework or configuration, and case (3) corresponds to the alteration of poor initial design variables. Finally, numerical examples are given to demonstrate the applicability of the proposed methods.

  11. Structural design optimization with survivability dependent constraints application: Primary wing box of a multi-role fighter

    NASA Technical Reports Server (NTRS)

    Dolvin, Douglas J.

    1992-01-01

    The superior survivability of a multirole fighter depends upon balanced integration of technologies for reduced vulnerability and susceptibility. The objective is to develop a methodology for structural design optimization with survivability-dependent constraints. The design criterion for optimization is survivability in a tactical laser environment. The following analyses are studied to establish a dependent design relationship between structural weight and survivability: (1) develop a physically linked global design model of survivability variables; and (2) apply conventional constraints to quantify survivability-dependent design. It was not possible to develop an exact approach that would include all aspects of survivability-dependent design; therefore, guidelines are offered for solving similar problems.

  12. Changing Paradigms Managed Learning Environments and Web 2.0

    ERIC Educational Resources Information Center

    Craig, Emory M.

    2007-01-01

    Purpose: The purpose of this paper is to understand how emerging technologies and Web 2.0 services are transforming the structure of the web and their potential impact on managed learning environments (MLEs) and learning content management systems (LCMSs). Design/methodology/approach: Innovative Web 2.0 applications are reviewed in the paper to…

  13. Learning Probabilities in Computer Engineering by Using a Competency- and Problem-Based Approach

    ERIC Educational Resources Information Center

    Khoumsi, Ahmed; Hadjou, Brahim

    2005-01-01

    Our department has redesigned its electrical and computer engineering programs by adopting a learning methodology based on competence development, problem solving, and the realization of design projects. In this article, we show how this pedagogical approach has been successfully used for learning probabilities and their application to computer…

  14. Consumer Learning for University Students: A Case for a Curriculum

    ERIC Educational Resources Information Center

    Crafford, Sharon; Bitzer, Eli

    2009-01-01

    This article indicates how the application of a simplified version of the analytical abstraction method (AAM) was used in curriculum development for consumer learning at one higher education institution in South Africa. We used a case study design and qualitative research methodology to generate data through semi-structured interviews with eight…

  15. Use of Problem-Based Learning in the Teaching and Learning of Horticultural Production

    ERIC Educational Resources Information Center

    Abbey, Lord; Dowsett, Eric; Sullivan, Jan

    2017-01-01

    Purpose: Problem-based learning (PBL), a relatively novel teaching and learning process in horticulture, was investigated. Proper application of PBL can potentially create a learning context that enhances student learning. Design/Methodology/Approach: Students worked on two complex ill-structured problems: (1) to produce fresh baby greens for a…

  16. European Social Fund in Portugal: A Complex Question for Human Resource Development

    ERIC Educational Resources Information Center

    Tome, Eduardo

    2012-01-01

    Purpose: This article aims to review the application of the funds awarded by the European Social Fund (ESF) to Portugal, since 1986, from a human resource development (HRD) perspective. Design/methodology/approach: Several variables are analyzed: investment, absorption, people, impact of investment, evolution of skills, main programs, supply and…

  17. 21 CFR 514.8 - Supplements and other changes to an approved application.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... the drug as manufactured without the change; (C) Changes that may affect drug substance or drug... the proposed change; (C) The drug(s) involved; (D) The manufacturing site(s) or area(s) affected; (E...) Replacement of equipment with that of a different design that does not affect the process methodology or...

  18. 21 CFR 514.8 - Supplements and other changes to an approved application.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... the drug as manufactured without the change; (C) Changes that may affect drug substance or drug... the proposed change; (C) The drug(s) involved; (D) The manufacturing site(s) or area(s) affected; (E...) Replacement of equipment with that of a different design that does not affect the process methodology or...

  19. Research in Special Education: Designs, Methods, and Applications. Second Edition

    ERIC Educational Resources Information Center

    Rumrill, Phillip D., Jr.; Cook, Bryan G.; Wiley, Andrew L.

    2011-01-01

    The goal of this second edition is to provide a comprehensive overview of the philosophical, ethical, methodological, and analytical fundamentals of social science and educational research, as well as specify aspects of special education research that distinguish it from scientific inquiry in other fields of education and human services. Foremost…

  20. An Analytic Hierarchy Process for School Quality and Inspection: Model Development and Application

    ERIC Educational Resources Information Center

    Al Qubaisi, Amal; Badri, Masood; Mohaidat, Jihad; Al Dhaheri, Hamad; Yang, Guang; Al Rashedi, Asma; Greer, Kenneth

    2016-01-01

    Purpose: The purpose of this paper is to develop an analytic hierarchy planning-based framework to establish criteria weights and to develop a school performance system commonly called school inspections. Design/methodology/approach: The analytic hierarchy process (AHP) model uses pairwise comparisons and a measurement scale to generate the…
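
    The AHP computation referred to above derives criterion weights from a pairwise comparison matrix, commonly via the geometric-mean approximation of the principal eigenvector, together with a consistency check. A minimal sketch with a hypothetical 3-criterion matrix (not the paper's data):

```python
import numpy as np

# Hypothetical pairwise comparison matrix on Saaty's 1-9 scale:
# A[i, j] is how strongly criterion i is preferred over criterion j.
A = np.array([
    [1.0,   3.0,   5.0],
    [1/3.0, 1.0,   3.0],
    [1/5.0, 1/3.0, 1.0],
])
n = A.shape[0]

# Geometric-mean (row) approximation of the principal eigenvector.
gm = A.prod(axis=1) ** (1.0 / n)
weights = gm / gm.sum()

# Consistency check: lambda_max, consistency index, consistency ratio
# (0.58 is Saaty's random index for n = 3; CR < 0.1 is conventionally acceptable).
lam = (A @ weights / weights).mean()
cr = ((lam - n) / (n - 1)) / 0.58
```

    Here the first criterion receives roughly 64% of the weight and the matrix is acceptably consistent (CR ≈ 0.03); in a school-inspection model, weights like these would multiply the scores of sub-criteria down the hierarchy.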
