Science.gov

Sample records for systematic design methodology

  1. Systematic Controller Design Methodology for Variable-Speed Wind Turbines

    SciTech Connect

    Hand, M. M.; Balas, M. J.

    2002-02-01

    Variable-speed, horizontal axis wind turbines use blade-pitch control to meet specified objectives for three operational regions. This paper provides a guide for controller design for the constant power production regime. A simple, rigid, non-linear turbine model was used to systematically perform trade-off studies between two performance metrics. Minimization of both the deviation of the rotor speed from the desired speed and the motion of the actuator is desired. The robust nature of the proportional-integral-derivative controller is illustrated, and optimal operating conditions are determined. Because numerous simulation runs may be completed in a short time, the relationship between the two opposing metrics is easily visualized.

  2. Systematic defect filtering and data analysis methodology for design based metrology

    NASA Astrophysics Data System (ADS)

    Yang, Hyunjo; Kim, Jungchan; Lee, Taehyeong; Jung, Areum; Yoo, Gyun; Yim, Donggyu; Park, Sungki; Hasebe, Toshiaki; Yamamoto, Masahiro; Cai, Jun

    2009-03-01

    Recently, several Design Based Metrologies (DBMs) have been introduced and are in use for wafer verification. The major applications of DBM are OPC accuracy improvement, DFM feedback through Process Window Qualification (PWQ) and advanced process control. In general, however, the amount of output data from DBM is so large that it is very hard to handle for valuable feedback. In the case of PWQ, thousands of hot spots are detected on a single chip at the edge of the process window, so it takes considerable time and labor to review and analyze all of them. Design-related systematic defects, however, appear repeatedly, and if they can be classified into groups it is possible to save a great deal of analysis time. We have demonstrated an EDA tool that handles the large amount of output data from DBM by reducing pattern defects to groups; it can classify millions of patterns into fewer than thousands of pattern groups. It has been evaluated on the PWQ analysis of a metal layer in a NAND Flash memory device and on random contact hole patterns in a DRAM device. The results show that this EDA tool handles the CD measurement data easily and saves considerable time and labor in the analysis. The procedures of systematic defect filtering and data handling using an EDA tool are presented in detail.

  3. Systematic Review Methodology in Higher Education

    ERIC Educational Resources Information Center

    Bearman, Margaret; Smith, Calvin D.; Carbone, Angela; Slade, Susan; Baik, Chi; Hughes-Warrington, Marnie; Neumann, David L.

    2012-01-01

    Systematic review methodology can be distinguished from narrative reviews of the literature through its emphasis on transparent, structured and comprehensive approaches to searching the literature and its requirement for formal synthesis of research findings. There appears to be relatively little use of the systematic review methodology within the…

  4. Variable-Speed Wind Turbine Controller Systematic Design Methodology: A Comparison of Non-Linear and Linear Model-Based Designs

    SciTech Connect

    Hand, M. M.

    1999-07-30

    Variable-speed, horizontal axis wind turbines use blade-pitch control to meet specified objectives for three regions of operation. This paper focuses on controller design for the constant power production regime. A simple, rigid, non-linear turbine model was used to systematically perform trade-off studies between two performance metrics. Minimization of both the deviation of the rotor speed from the desired speed and the motion of the actuator is desired. The robust nature of the proportional-integral-derivative (PID) controller is illustrated, and optimal operating conditions are determined. Because numerous simulation runs may be completed in a short time, the relationship between the two opposing metrics is easily visualized. Traditional controller design generally consists of linearizing a model about an operating point. This step was taken for two different operating points, and the systematic design approach was used. A comparison of the optimal regions selected using the non-linear model and the two linear models shows similarities. The linearization point selection does, however, affect the turbine performance slightly. Exploitation of the simplicity of the model allows surfaces consisting of operation under a wide range of gain values to be created. This methodology provides a means of visually observing turbine performance based upon the two metrics chosen for this study. Design of a PID controller is simplified, and it is possible to ascertain the best possible combination of controller parameters. The wide, flat surfaces indicate that a PID controller is very robust in this variable-speed wind turbine application.
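
The trade-off study described above can be sketched in a few lines. This is a toy stand-in, not the NREL model: a rigid single-state rotor with an assumed aerodynamic torque curve, a PID loop commanding blade-pitch rate, and the two opposing metrics accumulated over the run. All numerical values are illustrative assumptions.

```python
import math

def simulate(kp, ki, kd, dt=0.01, t_end=60.0):
    """Toy rigid-rotor pitch-control loop; returns the two metrics."""
    J = 4.0e5              # rotor inertia [kg m^2] (assumed)
    omega_ref = 4.0        # rated rotor speed [rad/s] (assumed)
    q_gen = 1.0e5          # rated generator torque [N m] (assumed)
    omega, pitch = omega_ref, 10.0
    integ, prev_err = 0.0, 0.0
    speed_dev = actuator_motion = 0.0     # the two opposing metrics
    for k in range(int(t_end / dt)):
        wind = 18.0 + 2.0 * math.sin(0.2 * k * dt)   # gusty wind [m/s]
        # crude aerodynamic torque: falls off as pitch increases (assumed shape)
        q_aero = 1.0e5 * (wind / 18.0) ** 2 * max(0.0, 1.5 - 0.05 * pitch)
        omega += dt * (q_aero - q_gen) / J
        err = omega - omega_ref
        integ += err * dt
        rate = kp * err + ki * integ + kd * (err - prev_err) / dt
        rate = max(-10.0, min(10.0, rate))           # pitch-rate limit
        prev_err = err
        pitch += dt * rate
        speed_dev += dt * err ** 2           # metric 1: speed regulation
        actuator_motion += dt * rate ** 2    # metric 2: actuator duty
    return speed_dev, actuator_motion

# Sweeping the gains over a grid yields the kind of trade-off surface the
# paper visualizes; here just a one-dimensional slice:
for kp in (2.0, 5.0, 10.0):
    dev, act = simulate(kp, ki=1.0, kd=0.5)
    print(f"kp={kp:5.1f}  speed-dev={dev:.4f}  actuator-motion={act:.2f}")
```

Because each run is cheap, thousands of gain combinations can be evaluated to expose the flat, robust regions the paper reports.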

  5. Autonomous spacecraft design methodology

    SciTech Connect

    Divita, E.L.; Turner, P.R.

    1984-08-01

    A methodology for autonomous spacecraft design blends autonomy requirements with traditional mission requirements and assesses the impact of autonomy upon the total system resources available to support fault tolerance and automation. A baseline functional design can be examined for autonomy implementation impacts, and the costs, risk, and benefits of various options can be assessed. The result of the process is a baseline design that includes autonomous control functions.

  6. Vending machine assessment methodology. A systematic review.

    PubMed

    Matthews, Melissa A; Horacek, Tanya M

    2015-07-01

    The nutritional quality of food and beverage products sold in vending machines has been implicated as a contributing factor to the development of an obesogenic food environment. How comprehensive, reliable, and valid are the current assessment tools for vending machines to support or refute these claims? A systematic review was conducted to summarize, compare, and evaluate the current methodologies and available tools for vending machine assessment. A total of 24 relevant research studies published between 1981 and 2013 met inclusion criteria for this review. The methodological variables reviewed in this study include assessment tool type, study location, machine accessibility, product availability, healthfulness criteria, portion size, price, product promotion, and quality of scientific practice. There were wide variations in the depth of the assessment methodologies and product healthfulness criteria utilized among the reviewed studies. Of the reviewed studies, 39% evaluated machine accessibility, 91% evaluated product availability, 96% established healthfulness criteria, 70% evaluated portion size, 48% evaluated price, 52% evaluated product promotion, and 22% evaluated the quality of scientific practice. Of all reviewed articles, 87% reached conclusions that provided insight into the healthfulness of vended products and/or vending environment. Product healthfulness criteria and complexity for snack and beverage products were also found to vary between the reviewed studies. These findings make it difficult to compare results between studies. A universal, valid, and reliable vending machine assessment tool that is comprehensive yet user-friendly is recommended.

  7. RAMCAD Design Methodology

    DTIC Science & Technology

    1993-05-01

    fault detection and isolation, and reduces...elements during the design process. Fault detection and isolation are simplified when an entire function can be assigned to a single hardware design element...defining the fault detection and isolation constraints and goals) are met or all of the test resources have been committed. Alternative resource

  8. Permanent magnet design methodology

    NASA Technical Reports Server (NTRS)

    Leupold, Herbert A.

    1991-01-01

    Design techniques developed for the exploitation of high energy magnetically rigid materials such as Sm-Co and Nd-Fe-B have resulted in a revolution in kind rather than in degree in the design of a variety of electron guidance structures for ballistic and aerospace applications. Salient examples are listed. Several prototype models were developed. These structures are discussed in some detail: permanent magnet solenoids, transverse field sources, periodic structures, and very high field structures.

  9. Solid lubrication design methodology

    NASA Technical Reports Server (NTRS)

    Aggarwal, B. B.; Yonushonis, T. M.; Bovenkerk, R. L.

    1984-01-01

    A single element traction rig was used to measure the traction forces at the contact of a ball against a flat disc at room temperature under combined rolling and sliding. The load and speed conditions were selected to match those anticipated for bearing applications in adiabatic diesel engines. The test program showed that the magnitude of traction forces were almost the same for all the lubricants tested; a lubricant should, therefore, be selected on the basis of its ability to prevent wear of the contact surfaces. Traction vs. slide/roll ratio curves were similar to those for liquid lubricants but the traction forces were an order of magnitude higher. The test data was used to derive equations to predict traction force as a function of contact stress and rolling speed. Qualitative design guidelines for solid lubricated concentrated contacts are proposed.

  10. Systematic Comparison of Operating Reserve Methodologies: Preprint

    SciTech Connect

    Ibanez, E.; Krad, I.; Ela, E.

    2014-04-01

    Operating reserve requirements are a key component of modern power systems, and they contribute to maintaining reliable operations with minimum economic impact. No universal method exists for determining reserve requirements, thus there is a need for a thorough study and performance comparison of the different existing methodologies. Increasing penetrations of variable generation (VG) on electric power systems are poised to increase system uncertainty and variability, thus the need for additional reserve also increases. This paper presents background information on operating reserve and its relationship to VG. A consistent comparison of three methodologies to calculate regulating and flexibility reserve in systems with VG is performed.

  11. A Systematic Methodology for Verifying Superscalar Microprocessors

    NASA Technical Reports Server (NTRS)

    Srivas, Mandayam; Hosabettu, Ravi; Gopalakrishnan, Ganesh

    1999-01-01

    We present a systematic approach to decompose and incrementally build the proof of correctness of pipelined microprocessors. The central idea is to construct the abstraction function by using completion functions, one per unfinished instruction, each of which specifies the effect (on the observables) of completing the instruction. In addition to avoiding the term size and case explosion problem that limits the pure flushing approach, our method helps localize errors, and also handles stages with interactive loops. The technique is illustrated on pipelined and superscalar pipelined implementations of a subset of the DLX architecture. It has also been applied to a processor with out-of-order execution.

  12. Waste Package Design Methodology Report

    SciTech Connect

    D.A. Brownson

    2001-09-28

    The objective of this report is to describe the analytical methods and processes used by the Waste Package Design Section to establish the integrity of the various waste package designs, the emplacement pallet, and the drip shield. The scope of this report shall be the methodology used in criticality, risk-informed, shielding, source term, structural, and thermal analyses. The basic features and appropriateness of the methods are illustrated, and the processes are defined whereby input values and assumptions flow through the application of those methods to obtain designs that ensure defense-in-depth as well as satisfy requirements on system performance. Such requirements include those imposed by federal regulation, from both the U.S. Department of Energy (DOE) and U.S. Nuclear Regulatory Commission (NRC), and those imposed by the Yucca Mountain Project to meet repository performance goals. The report is to be used, in part, to describe the waste package design methods and techniques to be used for producing input to the License Application Report.

  13. Methodology for Designing Fault-Protection Software

    NASA Technical Reports Server (NTRS)

    Barltrop, Kevin; Levison, Jeffrey; Kan, Edwin

    2006-01-01

    A document describes a methodology for designing fault-protection (FP) software for autonomous spacecraft. The methodology embodies and extends established engineering practices in the technical discipline of Fault Detection, Diagnosis, Mitigation, and Recovery; and has been successfully implemented in the Deep Impact Spacecraft, a NASA Discovery mission. Based on established concepts of Fault Monitors and Responses, this FP methodology extends the notion of Opinion, Symptom, Alarm (aka Fault), and Response with numerous new notions, sub-notions, software constructs, and logic and timing gates. For example, a Monitor generates a RawOpinion, which graduates into an Opinion, categorized into no-opinion, acceptable, or unacceptable opinion. RaiseSymptom, ForceSymptom, and ClearSymptom govern the establishment of, and then mapping to, an Alarm (aka Fault). Local Response is distinguished from FP System Response. A 1-to-n and n-to-1 mapping is established among Monitors, Symptoms, and Responses. Responses are categorized by device versus by function. Responses operate in tiers, where the early tiers attempt to resolve the Fault in a localized step-by-step fashion, relegating more system-level response to later tier(s). Recovery actions are gated by epoch recovery timing, enabling strategy, urgency, MaxRetry gate, hardware availability, hazardous versus ordinary fault, and many other priority gates. This methodology is systematic, logical, and uses multiple linked tables, parameter files, and recovery command sequences. The credibility of the FP design is proven via a "top-down" fault-tree analysis and a "bottom-up" functional failure-mode-and-effects analysis. Via this process, the mitigation and recovery strategies per Fault Containment Region scope (in width versus depth) the FP architecture.

  14. Poor methodological quality and reporting standards of systematic reviews in burn care management.

    PubMed

    Wasiak, Jason; Tyack, Zephanie; Ware, Robert; Goodwin, Nicholas; Faggion, Clovis M

    2016-12-18

    The methodological and reporting quality of burn-specific systematic reviews has not been established. The aim of this study was to evaluate the methodological quality of systematic reviews in burn care management. Computerised searches were performed in Ovid MEDLINE, Ovid EMBASE and The Cochrane Library through to February 2016 for systematic reviews relevant to burn care using medical subject and free-text terms such as 'burn', 'systematic review' or 'meta-analysis'. Additional studies were identified by hand-searching five discipline-specific journals. Two authors independently screened papers, extracted data and evaluated methodological quality using the 11-item A Measurement Tool to Assess Systematic Reviews (AMSTAR) tool and reporting quality using the 27-item Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) checklist. Characteristics of systematic reviews associated with methodological and reporting quality were identified. Descriptive statistics and linear regression identified features associated with improved methodological quality. A total of 60 systematic reviews met the inclusion criteria. Six of the 11 AMSTAR items reporting on 'a priori' design, duplicate study selection, grey literature, included/excluded studies, publication bias and conflict of interest were reported in less than 50% of the systematic reviews. Of the 27 items listed for PRISMA, 13 items reporting on introduction, methods, results and the discussion were addressed in less than 50% of systematic reviews. Multivariable analyses showed that systematic reviews associated with higher methodological or reporting quality incorporated a meta-analysis (AMSTAR regression coefficient 2.1; 95% CI: 1.1, 3.1; PRISMA regression coefficient 6.3; 95% CI: 3.8, 8.7), were published in the Cochrane Library (AMSTAR regression coefficient 2.9; 95% CI: 1.6, 4.2; PRISMA regression coefficient 6.1; 95% CI: 3.1, 9.2) and included a randomised controlled trial (AMSTAR regression

  15. Reliability based design optimization: Formulations and methodologies

    NASA Astrophysics Data System (ADS)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. 
A framework for performing reliability based design optimization under epistemic uncertainty is also developed.
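
In standard textbook notation (not necessarily the author's), the nested "double-loop" formulation that the unilevel method is shown to be equivalent to reads:

```latex
\begin{aligned}
&\text{Outer (design) loop:} &&\min_{d}\; f(d)
 \quad\text{s.t.}\quad \beta_i(d) \ge \beta_i^{\mathrm{target}},\quad i=1,\dots,m,\\
&\text{Inner (reliability) loop:} &&\beta_i(d) = \min_{u}\;\lVert u\rVert
 \quad\text{s.t.}\quad g_i\bigl(d,\,T(u)\bigr) = 0.
\end{aligned}
```

Here f is the design objective, the g_i are limit-state functions, and T maps the standard-normal variables u to the physical random variables; beta_i is the first-order reliability index. The unilevel approach replaces each inner minimization with its first-order (KKT) optimality conditions, imposed as equality constraints, which collapses the two loops into a single optimization problem.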

  16. Nonlinear flight control design using backstepping methodology

    NASA Astrophysics Data System (ADS)

    Tran, Thanh Trung

    The subject of nonlinear flight control design using backstepping control methodology is investigated in the dissertation research presented here. Control design methods based on nonlinear models of the dynamic system provide higher utility and versatility because the design model more closely matches the physical system behavior. Obtaining requisite model fidelity is only half of the overall design process, however. Design of the nonlinear control loops can lessen the effects of nonlinearity, or even exploit nonlinearity, to achieve higher levels of closed-loop stability, performance, and robustness. The goal of the research is to improve control quality for a general class of strict-feedback dynamic systems and provide flight control architectures to augment the aircraft motion. The research is divided into two parts: theoretical control development for the strict-feedback form of nonlinear dynamic systems and application of the proposed theory for nonlinear flight dynamics. In the first part, the research is built on two components: transforming the nonlinear dynamic model to a canonical strict-feedback form and then applying backstepping control theory to the canonical model. The research considers a process to determine when this transformation is possible, and when it is possible, a systematic process to transfer the model is also considered when practical. When this is not the case, certain modeling assumptions are explored to facilitate the transformation. After achieving the canonical form, a systematic design procedure for formulating a backstepping control law is explored in the research. Starting with the simplest subsystem and ending with the full system, pseudo control concepts based on Lyapunov control functions are used to control each successive subsystem. Typically each pseudo control must be solved from a nonlinear algebraic equation. At the end of this process, the physical control input must be re-expressed in terms of the physical states by
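
The two-step procedure sketched above can be made concrete on a toy strict-feedback system (this is my minimal example, not the dissertation's flight-dynamics model): x1' = x1^2 + x2, x2' = u, where x2 acts as the pseudo control for the x1 subsystem.

```python
# Backstepping sketch on a toy strict-feedback system (illustrative only):
#   x1_dot = x1**2 + x2       (x2 is the pseudo control for this subsystem)
#   x2_dot = u
# Step 1: virtual control alpha = -x1**2 - k1*x1 makes V1 = x1**2/2 decrease.
# Step 2: with z = x2 - alpha, choosing u = alpha_dot - x1 - k2*z gives
#         V2_dot = -k1*x1**2 - k2*z**2, so the origin is asymptotically stable.

def simulate(x1=0.5, x2=0.0, k1=2.0, k2=2.0, dt=1e-3, t_end=5.0):
    """Forward-Euler simulation of the backstepping closed loop."""
    for _ in range(int(t_end / dt)):
        alpha = -x1**2 - k1 * x1
        z = x2 - alpha
        # alpha_dot by the chain rule: d(alpha)/dx1 * x1_dot
        alpha_dot = -(2.0 * x1 + k1) * (x1**2 + x2)
        u = alpha_dot - x1 - k2 * z          # physical control input
        x1 += dt * (x1**2 + x2)
        x2 += dt * u
    return x1, x2

x1f, x2f = simulate()
print(f"final state: x1={x1f:.5f}, x2={x2f:.5f}")   # both driven near zero
```

Note how the pseudo control alpha is solved from the x1 subsystem alone and the physical input u is then re-expressed in terms of the states, mirroring the design procedure the abstract describes.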

  17. Integrated Design Methodology for Highly Reliable Liquid Rocket Engine

    NASA Astrophysics Data System (ADS)

    Kuratani, Naoshi; Aoki, Hiroshi; Yasui, Masaaki; Kure, Hirotaka; Masuya, Goro

    An Integrated Design Methodology is strongly required at the conceptual design phase to achieve highly reliable space transportation systems, especially propulsion systems, not only in Japan but worldwide. In the past, catastrophic failures caused losses of mission and vehicle (LOM/LOV) in the operational phase, and severely affected schedules and costs in the later development phases. A design methodology for highly reliable liquid rocket engines is preliminarily established and investigated in this study. A sensitivity analysis is performed systematically to demonstrate the effectiveness of the methodology and to clarify, with particular focus, the correlations between the combustion chamber, turbopump and main valve as main components. This study describes the essential issues in understanding the stated correlations, the need to apply the methodology to the remaining critical failure modes in the whole engine system, and the perspective on future engine development.

  18. Space Engineering Projects in Design Methodology

    NASA Technical Reports Server (NTRS)

    Crawford, R.; Wood, K.; Nichols, S.; Hearn, C.; Corrier, S.; DeKunder, G.; George, S.; Hysinger, C.; Johnson, C.; Kubasta, K.

    1993-01-01

    NASA/USRA is an ongoing sponsor of space design projects in the senior design courses of the Mechanical Engineering Department at The University of Texas at Austin. This paper describes the UT senior design sequence, focusing on the first-semester design methodology course. The philosophical basis and pedagogical structure of this course are summarized. A history of the Department's activities in the Advanced Design Program is then presented. The paper includes a summary of the projects completed during the 1992-93 Academic Year in the methodology course, and concludes with an example of two projects completed by student design teams.

  19. Application of systematic review methodology to the field of nutrition.

    PubMed

    Lichtenstein, Alice H; Yetley, Elizabeth A; Lau, Joseph

    2008-12-01

    Systematic reviews represent a rigorous and transparent approach to synthesizing scientific evidence that minimizes bias. They evolved within the medical community to support development of clinical and public health practice guidelines, set research agendas, and formulate scientific consensus statements. The use of systematic reviews for nutrition-related topics is more recent. Systematic reviews provide independently conducted comprehensive and objective assessments of available information addressing precise questions. This approach to summarizing available data is a useful tool for identifying the state of science including knowledge gaps and associated research needs, supporting development of science-based recommendations and guidelines, and serving as the foundation for updates as new data emerge. Our objective is to describe the steps for performing systematic reviews and highlight areas unique to the discipline of nutrition that are important to consider in data assessment. The steps involved in generating systematic reviews include identifying staffing and planning for outside expert input, forming a research team, developing an analytic framework, developing and refining research questions, defining eligibility criteria, identifying search terms, screening abstracts according to eligibility criteria, retrieving articles for evaluation, constructing evidence and summary tables, assessing methodological quality and applicability, and synthesizing results including performing meta-analysis, if appropriate. Unique and at times challenging, nutrition-related considerations include baseline nutrient exposure, nutrient status, bioequivalence of bioactive compounds, bioavailability, multiple and interrelated biological functions, undefined nature of some interventions, and uncertainties in intake assessment. 
Systematic reviews are a valuable and independent component of decision-making processes by groups responsible for developing science-based recommendations.

  20. Assuring data transparency through design methodologies

    NASA Technical Reports Server (NTRS)

    Williams, Allen

    1990-01-01

    This paper addresses the role of design methodologies and practices in the assurance of technology transparency. The development of several subsystems on large, long life cycle government programs was analyzed to glean those characteristics in the design, development, test, and evaluation that precluded or enabled the insertion of new technology. The programs examined were Minuteman, DSP, the B-1B, and the space shuttle. All these were long life cycle, technology-intensive programs. The design methodologies (or lack thereof) and design practices for each were analyzed in terms of the success or failure in incorporating evolving technology. Common elements contributing to the success or failure were extracted and compared to current methodologies being proposed by the Department of Defense and NASA. The relevance of these practices to the design and deployment of Space Station Freedom was evaluated. In particular, appropriate methodologies now being used on the core development contract were examined.

  1. General Methodology for Designing Spacecraft Trajectories

    NASA Technical Reports Server (NTRS)

    Condon, Gerald; Ocampo, Cesar; Mathur, Ravishankar; Morcos, Fady; Senent, Juan; Williams, Jacob; Davis, Elizabeth C.

    2012-01-01

    A methodology for designing spacecraft trajectories in any gravitational environment within the solar system has been developed. The methodology facilitates modeling and optimization for problems ranging from that of a single spacecraft orbiting a single celestial body to that of a mission involving multiple spacecraft and multiple propulsion systems operating in gravitational fields of multiple celestial bodies. The methodology consolidates almost all spacecraft trajectory design and optimization problems into a single conceptual framework requiring solution of either a system of nonlinear equations or a parameter-optimization problem with equality and/or inequality constraints.
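
As a toy illustration of the "system of nonlinear equations" half of that framework (my example, not the authors' software), even the simplest trajectory subproblem, propagating a Keplerian orbit, requires iterating on a nonlinear equation, Kepler's equation M = E - e*sin(E):

```python
import math

# Solve Kepler's equation M = E - e*sin(E) for the eccentric anomaly E
# by Newton's method -- a minimal stand-in for the nonlinear root-finding
# at the core of trajectory-design frameworks (illustrative only).
def solve_kepler(M, e, tol=1e-12, max_iter=50):
    E = M if e < 0.8 else math.pi          # common starting guess
    for _ in range(max_iter):
        f = E - e * math.sin(E) - M        # residual of Kepler's equation
        if abs(f) < tol:
            break
        E -= f / (1.0 - e * math.cos(E))   # Newton step
    return E

E = solve_kepler(M=1.0, e=0.3)
print(E, E - 0.3 * math.sin(E))   # second value recovers M = 1.0
```

Full mission problems in such a framework simply scale this idea up: many such residual equations (boundary conditions, patch-point continuity, maneuver constraints) solved simultaneously, or folded into a constrained parameter-optimization problem.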

  2. A new methodology for hospital design.

    PubMed

    Mejia, Ana Maria Silva

    2013-08-01

    According to architect Ana Maria Silva Mejia, 'a new era for the design of hospitals in Guatemala has arrived', with a considerable growth in interest around good healthcare facility design. Here, in a slightly adapted version of an article, 'A new methodology for design', first published in the IFHE (International Federation of Hospital Engineering) Digest 2012, she reports on the application of a new methodology designed to optimise efficient use of space, and clinical and other adjacencies, in a district hospital in the City of Zacapa. The system has subsequently been successfully applied to a number of other Guatemalan healthcare facilities.

  3. Saving Material with Systematic Process Designs

    NASA Astrophysics Data System (ADS)

    Kerausch, M.

    2011-08-01

    Global competition is forcing the stamping industry to further increase quality, to shorten time-to-market and to reduce total cost. Continuous balancing between these classical time-cost-quality targets throughout the product development cycle is required to ensure future economical success. In today's industrial practice, die layout standards are typically assumed to implicitly ensure the balancing of company-specific time-cost-quality targets. Although die layout standards are a very successful approach, there are two methodical disadvantages. First, the capabilities for tool design have to be continuously adapted to technological innovations, e.g. to take advantage of the full forming capability of new materials. Secondly, the great variety of die design aspects has to be reduced to a generic rule or guideline, e.g. binder shape, draw-in conditions or the use of drawbeads. Therefore, it is important to not overlook cost or quality opportunities when applying die design standards. This paper describes a systematic workflow with focus on minimizing material consumption. The starting point of the investigation is a full process plan for a typical structural part. All requirements are defined according to a predefined set of die design standards with industrial relevance. In a first step, binder and addendum geometry is systematically checked for material saving potentials. In a second step, blank shape and draw-in are adjusted to meet thinning, wrinkling and springback targets for a minimum blank solution. Finally, the identified die layout is validated with respect to production robustness versus splits, wrinkles and springback. For all three steps the applied methodology is based on finite element simulation combined with stochastic variation of input variables. With the proposed workflow, a well-balanced (time-cost-quality) production process assuring minimal material consumption can be achieved.

  4. Applying Software Design Methodology to Instructional Design

    ERIC Educational Resources Information Center

    East, J. Philip

    2004-01-01

    The premise of this paper is that computer science has much to offer the endeavor of instructional improvement. Software design processes employed in computer science for developing software can be used for planning instruction and should improve instruction in much the same manner that design processes appear to have improved software. Techniques…

  5. Design methodology and projects for space engineering

    NASA Technical Reports Server (NTRS)

    Nichols, S.; Kleespies, H.; Wood, K.; Crawford, R.

    1993-01-01

    NASA/USRA is an ongoing sponsor of space design projects in the senior design course of the Mechanical Engineering Department at The University of Texas at Austin. This paper describes the UT senior design sequence, consisting of a design methodology course and a capstone design course. The philosophical basis of this sequence is briefly summarized. A history of the Department's activities in the Advanced Design Program is then presented. The paper concludes with a description of the projects completed during the 1991-92 academic year and the ongoing projects for the Fall 1992 semester.

  6. Integrating rock mechanics issues with repository design through design process principles and methodology

    SciTech Connect

    Bieniawski, Z.T.

    1996-04-01

    A good designer needs not only knowledge for designing (technical know-how that is used to generate alternative design solutions) but also knowledge about designing (appropriate principles and a systematic methodology to follow). Concepts such as "design for manufacture" or "concurrent engineering" are widely used in industry. In the field of rock engineering, only limited attention has been paid to the design process, because the design of structures in rock masses presents unique challenges to the designers as a result of the uncertainties inherent in characterization of geologic media. However, a stage has now been reached where we are able to sufficiently characterize rock masses for engineering purposes and identify the rock mechanics issues involved, but we still lack engineering design principles and a methodology to maximize design performance. This paper discusses the principles and methodology of the engineering design process directed at integrating site characterization activities with design, construction and performance of an underground repository. Using the latest information from the Yucca Mountain Project on geology, rock mechanics and starter tunnel design, the current lack of integration is pointed out, and it is shown how rock mechanics issues can be effectively interwoven with repository design through a systematic design process methodology leading to improved repository performance. In essence, the design process is seen as the use of design principles within an integrating design methodology, leading to innovative problem solving. In particular, a new concept of "Design for Constructibility and Performance" is introduced. This is discussed with respect to ten rock mechanics issues identified for repository design and performance.

  7. Failure detection system design methodology. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Chow, E. Y.

    1980-01-01

    The design of a failure detection and identification (FDI) system consists of designing a robust residual generation process and a high performance decision making process. The design of these two processes is examined separately. Residual generation is based on analytical redundancy. Redundancy relations that are insensitive to modelling errors and noise effects are important for designing robust residual generation processes. The characterization of the concept of analytical redundancy in terms of a generalized parity space provides a framework in which a systematic approach to the determination of robust redundancy relations is developed. The Bayesian approach is adopted for the design of high performance decision processes. The FDI decision problem is formulated as a Bayes sequential decision problem. Since the optimal decision rule is incomputable, a methodology for designing suboptimal rules is proposed. A numerical algorithm is developed to facilitate the design and performance evaluation of suboptimal rules.
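
The parity-space idea behind residual generation can be sketched numerically. The sensor layout, numbers, and fault size below are hypothetical, chosen only to illustrate how a parity matrix yields residuals that stay near zero for a healthy system and deviate under a sensor fault; the Bayesian decision stage the abstract describes is not sketched here:

```python
import numpy as np

# Hypothetical direct-redundancy setup: four sensors observing two states,
# y = H @ x (+ noise + faults). Any row vector v with v @ H = 0 is a parity
# (redundancy) relation: v @ y depends only on noise and faults, not on x.
H = np.array([[1.0,  0.0],
              [0.0,  1.0],
              [1.0,  1.0],
              [1.0, -1.0]])

# Rows spanning the left null space of H form the parity matrix V.
U, _, _ = np.linalg.svd(H)
V_parity = U[:, H.shape[1]:].T          # V_parity @ H ~= 0

x_true = np.array([1.0, 2.0])
y = H @ x_true                          # healthy measurements
p_healthy = V_parity @ y                # residual ~ 0

y_faulty = y.copy()
y_faulty[2] += 0.5                      # bias fault on sensor 3
p_fault = V_parity @ y_faulty           # residual moves away from zero
```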

  8. Single-Case Experimental Designs: A Systematic Review of Published Research and Current Standards

    ERIC Educational Resources Information Center

    Smith, Justin D.

    2012-01-01

    This article systematically reviews the research design and methodological characteristics of single-case experimental design (SCED) research published in peer-reviewed journals between 2000 and 2010. SCEDs provide researchers with a flexible and viable alternative to group designs with large sample sizes. However, methodological challenges have…

  9. Waste Package Component Design Methodology Report

    SciTech Connect

    D.C. Mecham

    2004-07-12

    This Executive Summary provides an overview of the methodology being used by the Yucca Mountain Project (YMP) to design waste packages and ancillary components. This summary information is intended for readers with general interest, but also provides technical readers a general framework surrounding the variety of technical details provided in the main body of the report. The purpose of this report is to document, and ensure the use of, appropriate design methods in the design of waste packages and ancillary components (the drip shields and emplacement pallets). The methodology includes identification of necessary design inputs, justification of design assumptions, and use of appropriate analysis methods and computational tools. This design work is subject to ''Quality Assurance Requirements and Description''. The document is primarily intended for internal use and technical guidance for a variety of design activities. It is recognized that a wide audience, including project management, the U.S. Department of Energy (DOE), the U.S. Nuclear Regulatory Commission, and others, is interested, to various levels of detail, in the design methods; the report therefore covers a wide range of topics at varying levels of detail. Due to the preliminary nature of the design, readers can expect to encounter varied levels of detail in the body of the report. It is expected that technical information used as input to design documents will be verified and taken from the latest versions of the reference sources given herein. This revision of the methodology report has evolved with changes in the waste package, drip shield, and emplacement pallet designs over many years and may be further revised as the design is finalized. Different components and analyses are at different stages of development. Some parts of the report are detailed, while other, less detailed parts are likely to undergo further refinement. The design methodology is intended to provide designs that satisfy the safety and operational

  10. Case-Crossover Analysis of Air Pollution Health Effects: A Systematic Review of Methodology and Application

    PubMed Central

    Carracedo-Martínez, Eduardo; Taracido, Margarita; Tobias, Aurelio; Saez, Marc; Figueiras, Adolfo

    2010-01-01

    Background Case-crossover is one of the most used designs for analyzing the health-related effects of air pollution. Nevertheless, no one has reviewed its application and methodology in this context. Objective We conducted a systematic review of case-crossover (CCO) designs used to study the relationship between air pollution and morbidity and mortality, from the standpoint of methodology and application. Data sources and extraction A search was made of the MEDLINE and EMBASE databases. Reports were classified as methodologic or applied. From the latter, the following information was extracted: author, study location, year, type of population (general or patients), dependent variable(s), independent variable(s), type of CCO design, and whether effect modification was analyzed for variables at the individual level. Data synthesis The review covered 105 reports that fulfilled the inclusion criteria. Of these, 24 addressed methodological aspects, and the remainder involved the design’s application. In the methodological reports, the designs that yielded the best results in simulation were symmetric bidirectional CCO and time-stratified CCO. Furthermore, we observed an increase across time in the use of certain CCO designs, mainly symmetric bidirectional and time-stratified CCO. The dependent variables most frequently analyzed were those relating to hospital morbidity; the pollutants most often studied were those linked to particulate matter. Among the CCO-application reports, 13.6% studied effect modification for variables at the individual level. Conclusions The use of CCO designs has undergone considerable growth; the most widely used designs were those that yielded better results in simulation studies: symmetric bidirectional and time-stratified CCO. However, the advantages of CCO as a method of analysis of variables at the individual level are put to little use. PMID:20356818

  11. Measuring service line competitive position. A systematic methodology for hospitals.

    PubMed

    Studnicki, J

    1991-01-01

    To mount a broad effort aimed at improving their competitive position for some service or group of services, hospitals have begun to pursue product line management techniques. A few hospitals have even reorganized completely under the product line framework. The benefits include focusing accountability for operations and results, facilitating coordination between departments and functions, stimulating market segmentation, and promoting rigorous examination of new and existing programs. As part of its strategic planning process, a suburban Baltimore hospital developed a product line management methodology with six basic steps: (1) define the service lines (which they did by grouping all existing diagnosis-related groups into 35 service lines), (2) determine the contribution of each service line to total inpatient volume, (3) determine trends in service line volumes (by comparing data over time), (4) derive a useful comparison group (competing hospitals or groups of hospitals with comparable size, scope of services, payer mix, and financial status), (5) review multiple time frames, and (6) summarize the long- and short-term performance of the hospital's service lines to focus further analysis. This type of systematic and disciplined analysis can become part of a permanent strategic intelligence program. When hospitals have such a program in place, their market research, planning, budgeting, and operations will be tied together in a true management decision support system.
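
Steps 2 and 3 of the methodology (each service line's volume contribution, and its trend over time) reduce to simple tabulations once DRGs are mapped to service lines. The DRG-to-line mapping and discharge records below are made-up placeholders, not the article's 35-line grouping:

```python
from collections import defaultdict

# Hypothetical discharge records: (DRG code, year).
drg_to_line = {"470": "Orthopedics", "247": "Cardiology", "871": "Medicine"}
discharges = [("470", 2019), ("470", 2020), ("247", 2019),
              ("871", 2019), ("871", 2020), ("871", 2020)]

# Step 2: each service line's contribution to total inpatient volume.
volume = defaultdict(int)
for drg, _ in discharges:
    volume[drg_to_line[drg]] += 1
total = sum(volume.values())
share = {line: n / total for line, n in volume.items()}

# Step 3: trend in service-line volume, as year-over-year change.
by_year = defaultdict(int)
for drg, year in discharges:
    by_year[(drg_to_line[drg], year)] += 1
trend = {line: by_year[(line, 2020)] - by_year[(line, 2019)] for line in volume}
```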

  12. Methodological considerations for designing a community water fluoridation cessation study.

    PubMed

    Singhal, Sonica; Farmer, Julie; McLaren, Lindsay

    2017-02-22

    High-quality, up-to-date research on community water fluoridation (CWF), and especially on the implications of CWF cessation for dental health, is limited. Although CWF cessation studies have been conducted, they are few in number; one of the major reasons is the methodological complexity of conducting such a study. This article draws on a systematic review of existing cessation studies (n=15) to explore methodological considerations for conducting CWF cessation studies in the future. We review nine important methodological aspects (study design, comparison community, target population, time frame, sampling strategy, clinical indicators, assessment criteria, covariates and biomarkers) and provide recommendations for planning future CWF cessation studies that examine effects on dental caries. There is no one ideal study design to answer a research question. However, the recommendations proposed regarding methodological aspects of conducting an epidemiological study to observe the effects of CWF cessation on dental caries, coupled with our identification of important methodological gaps, will be useful for researchers looking to optimize resources to conduct such a study with standards of rigour.

  13. An Analysis of Software Design Methodologies

    DTIC Science & Technology

    1979-08-01

    Technical Report 401: "An Analysis of Software Design Methodologies," by H. Rudy Ramsey, Michael E. Atwood, and Gary D. Campbell, Science Applications, Incorporated; submitted by Edgar M. Johnson, Chief, Human Factors. The surviving front-matter fragments note views expressed by members of the Integrated Software Research and Development Working Group (ISRAD) and the authors' indebtedness to Martha Cichelli, Margaret...

  14. A Design Methodology for Optoelectronic VLSI

    DTIC Science & Technology

    2007-01-01

    ...soldered to a copper-clad printed circuit (PC) board, are no longer sufficient for today's high-speed ICs. A processing chip that can compute data at a rate...design approach. A new design methodology has to be adopted to take advantage of the benefits that FSOI offers. Optoelectronic VLSI is the coupling of...and connections are made from chip to chip via traces of copper wire, as shown in Figure 2-2. The signal from a logic gate on one chip to a logic gate...

  15. Structural design methodology for large space structures

    NASA Astrophysics Data System (ADS)

    Dornsife, Ralph J.

    1992-02-01

    The Department of Defense requires research and development in designing, fabricating, deploying, and maintaining large space structures (LSS) in support of Army and Strategic Defense Initiative military objectives. Because of their large size, extreme flexibility, and the unique loading conditions in the space environment, LSS will present engineers with problems unlike those encountered in designing conventional civil engineering or aerospace structures. LSS will require sophisticated passive damping and active control systems in order to meet stringent mission requirements. These structures must also be optimally designed to minimize high launch costs. This report outlines a methodology for the structural design of LSS. It includes a definition of mission requirements, structural modeling and analysis, passive damping and active control system design, ground-based testing, payload integration, on-orbit system verification, and on-orbit assessment of structural damage. In support of this methodology, analyses of candidate LSS truss configurations are presented, and an algorithm correlating ground-based test behavior to expected microgravity behavior is developed.

  17. Control/structure interaction design methodology

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.; Layman, William E.

    1989-01-01

    The Control Structure Interaction (CSI) Program is a technology development program for spacecraft that exhibit interactions between the control system and structural dynamics. The program objectives include the development of new design concepts (such as active structure) and new tools (such as a combined structure and control optimization algorithm) and their verification in ground and, possibly, flight tests. The new CSI design methodology is centered around interdisciplinary engineers using new tools that closely integrate structures and controls. Verification is an important CSI theme, and analysts will be closely integrated with the CSI Test Bed laboratory. Components, concepts, tools and algorithms will be developed and tested in the lab and in future Shuttle-based flight experiments. The design methodology is summarized in block diagrams depicting the evolution of a spacecraft design and in descriptions of the analytical capabilities used in the process. The multiyear JPL CSI implementation plan is described, along with the essentials of several new tools. A distributed network of computation servers and workstations was designed to provide a state-of-the-art development base for the CSI technologies.

  18. Statins, cognition, and dementia—systematic review and methodological commentary

    PubMed Central

    Power, Melinda C.; Weuve, Jennifer; Sharrett, A. Richey; Blacker, Deborah

    2015-01-01

    Firm conclusions about whether mid-life or long-term statin use has an impact on cognitive decline and dementia remain elusive. Here, our objective was to systematically review, synthesize and critique the epidemiological literature that examines the relationship between statin use and cognition, so as to assess the current state of knowledge, identify gaps in our understanding, and make recommendations for future research. We summarize the findings of randomized controlled trials (RCTs) and observational studies, grouped according to study design. We discuss the methods for each, and consider likely sources of bias, such as reverse causation and confounding. Although observational studies that considered statin use at or near the time of dementia diagnosis suggest a protective effect of statins, these findings could be attributable to reverse causation. RCTs and well-conducted observational studies of baseline statin use and subsequent cognition over several years of follow-up do not support a causal preventative effect of late-life statin use on cognitive decline or dementia. Given that much of the human research on statins and cognition in the future will be observational, careful study design and analysis will be essential. PMID:25799928

  19. A critical appraisal of the methodology and quality of evidence of systematic reviews and meta-analyses of traditional Chinese medical nursing interventions: a systematic review of reviews

    PubMed Central

    Jin, Ying-Hui; Wang, Guo-Hao; Sun, Yi-Rong; Li, Qi; Zhao, Chen; Li, Ge; Si, Jin-Hua; Li, Yan; Lu, Cui; Shang, Hong-Cai

    2016-01-01

    Objective To assess the methodology and quality of evidence of systematic reviews and meta-analyses of traditional Chinese medical nursing (TCMN) interventions in Chinese journals. These interventions include acupressure, massage, Tai Chi, Qi Gong, electroacupuncture and the use of Chinese herbal medicines—for example, in enemas, foot massage and compressing the umbilicus. Design A systematic literature search for systematic reviews and meta-analyses of TCMN interventions was performed. Review characteristics were extracted. The methodological quality and the quality of the evidence were evaluated using the Assessment of Multiple Systematic Reviews (AMSTAR) and Grading of Recommendations Assessment, Development and Evaluation (GRADE) approaches. Result We included 20 systematic reviews and meta-analyses; a total of 11 TCMN interventions were assessed in the 20 reviews. Compliance with AMSTAR checklist items ranged from 4.5 to 8, and the systematic reviews/meta-analyses were, on average, of medium methodological quality. The quality of the evidence we assessed ranged from very low to moderate; no high-quality evidence was found. The top two causes for downrating confidence in effect estimates among the 31 bodies of evidence assessed were risk of bias and inconsistency. Conclusions There is room for improvement in the methodological quality of systematic reviews/meta-analyses of TCMN interventions published in Chinese journals. Greater efforts should be devoted to ensuring a more comprehensive search strategy, clearer specification of the interventions of interest in the eligibility criteria and identification of meaningful outcomes for clinicians and patients (consumers). The overall quality of evidence among the reviews remains suboptimal, which raises concerns about their role in influencing clinical practice. Thus, the conclusions in the reviews we assessed must be treated with caution and their role in influencing clinical practice should be limited. A critical

  20. Systematic Review of the Methodological Quality of Studies Aimed at Creating Gestational Weight Gain Charts

    PubMed Central

    Ohadike, Corah O; Cheikh-Ismail, Leila; Ohuma, Eric O; Giuliani, Francesca; Bishop, Deborah; Kac, Gilberto; Puglia, Fabien; Maia-Schlüssel, Michael; Kennedy, Stephen H; Villar, José; Hirst, Jane E

    2016-01-01

    A range of adverse outcomes is associated with insufficient and excessive maternal weight gain in pregnancy, but there is no consensus regarding what constitutes optimal gestational weight gain (GWG). Differences in the methodological quality of GWG studies may explain the varying chart recommendations. The goal of this systematic review was to evaluate the methodological quality of studies that aimed to create GWG charts by scoring them against a set of predefined, independently agreed-upon criteria. These criteria were divided into 3 domains: study design (12 criteria), statistical methods (7 criteria), and reporting methods (4 criteria). The criteria were broken down further into items, and studies were assigned a quality score (QS) based on these criteria. For each item, studies were scored as either high (score = 0) or low (score = 1) risk of bias; a high QS correlated with a low risk of bias. The maximum possible QS was 34. The systematic search identified 12 eligible studies involving 2,268,556 women from 9 countries; their QSs ranged from 9 (26%) to 29 (85%) (median, 18; 53%). The most common sources for bias were found in study designs (i.e., not prospective); assessments of prepregnancy weight and gestational age; descriptions of weighing protocols; sample size calculations; and the multiple measurements taken at each visit. There is wide variation in the methodological quality of GWG studies constructing charts. High-quality studies are needed to guide future clinical recommendations. We recommend the following main requirements for future studies: prospective design, reliable evaluation of prepregnancy weight and gestational age, detailed description of measurement procedures and protocols, description of sample-size calculation, and the creation of smooth centile charts or z scores. PMID:26980814
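
The QS tally itself is mechanical and can be sketched as below. The per-item scores are placeholders; the review's real instrument expands its 23 criteria into items summing to a maximum QS of 34, while this sketch scores one item per criterion (so its maximum is 23):

```python
# Illustrative quality-score (QS) tally: each item scores 1 (low risk of
# bias) or 0 (high risk). The item lists are invented placeholders, not
# the review's actual instrument.
domains = {
    "study_design": [1, 1, 0, 1, 0, 1, 1, 0, 1, 1, 0, 1],  # 12 criteria
    "statistics":   [1, 0, 1, 1, 0, 1, 1],                 # 7 criteria
    "reporting":    [1, 1, 0, 1],                          # 4 criteria
}
qs = sum(sum(items) for items in domains.values())
max_qs = sum(len(items) for items in domains.values())
print(f"QS = {qs}/{max_qs} ({100 * qs / max_qs:.0f}%)")
```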

  1. Application of systematic review methodology to the field of nutrition

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Systematic reviews represent a rigorous and transparent approach of synthesizing scientific evidence that minimizes bias. They evolved within the medical community to support development of clinical and public health practice guidelines, set research agendas and formulate scientific consensus state...

  2. Sketching Designs Using the Five Design-Sheet Methodology.

    PubMed

    Roberts, Jonathan C; Headleand, Chris; Ritsos, Panagiotis D

    2016-01-01

    Sketching designs has been shown to be a useful way of planning and considering alternative solutions. The use of lo-fidelity prototyping, especially paper-based sketching, can save time and money and converge on better solutions more quickly. However, this design process is often viewed as too informal. Consequently, users do not know how to manage their thoughts and ideas (to first think divergently, and then finally converge on a suitable solution). We present the Five Design-Sheet (FdS) methodology. The methodology enables users to create information visualization interfaces through lo-fidelity methods. Users sketch and plan their ideas, helping them express different possibilities and think through their potential effectiveness as solutions to the task (sheet 1); they create three principal designs (sheets 2, 3 and 4); before converging on a final realization design that can then be implemented (sheet 5). In this article, we present (i) a review of the use of sketching as a planning method for visualization and the benefits of sketching, (ii) a detailed description of the Five Design-Sheet (FdS) methodology, and (iii) an evaluation of the FdS using the System Usability Scale, along with a case study of its use in industry and experience of its use in teaching.

  3. Toward systematic design of multi-standard converters

    NASA Astrophysics Data System (ADS)

    Rivas, V. J.; Castro-López, R.; Morgado, A.; Guerra, O.; Roca, E.; del Río, R.; de la Rosa, J. M.; Fernández, F. V.

    2007-05-01

    In the last few years, we have been witnessing the convergence of more and more communication capabilities into a single terminal. A basic component of these communication transceivers is the multi-standard Analog-to-Digital Converter (ADC). Many systematic, partially automated approaches for the design of ADCs dealing with a single communication standard have been reported. However, most multi-standard converters reported in the literature follow an ad-hoc approach, which guarantees neither an efficient occupation of silicon area nor power efficiency across the different standards. This paper aims at the core of this problem by formulating a systematic design approach based on the following key elements: (1) definition of a set of metrics for reconfigurability: impact on area and power consumption, design complexity and performance; (2) definition of the reconfiguration capabilities of the component blocks at different hierarchical levels, with assessment of the associated metrics; (3) exploration of candidate architectures by using a combination of simulated annealing and evolutionary algorithms; (4) improved top-down synthesis with bottom-up generated low-level design information. The systematic design methodology is illustrated via the design of a multi-standard ΣΔ modulator meeting the specifications of three wireless communication standards.
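
Element (3), architecture exploration, can be illustrated with a toy simulated-annealing loop over discrete architectural knobs. The design space, cost model, and cooling schedule below are invented stand-ins for the paper's area/power/complexity metrics, not its actual formulation:

```python
import math
import random

random.seed(0)

# Hypothetical design space for a reconfigurable modulator: each knob is
# a discrete architectural choice.
space = {"order": [2, 3, 4], "bits": [1, 3, 4], "osr": [16, 32, 64, 128]}

def cost(cfg):
    # Toy metric: "area" grows with loop order x quantizer bits,
    # "power" with oversampling ratio.
    area = cfg["order"] * cfg["bits"]
    power = cfg["osr"] / 16
    return area + 0.5 * power

def neighbour(cfg):
    # Re-sample one randomly chosen architectural knob.
    knob = random.choice(list(space))
    new = dict(cfg)
    new[knob] = random.choice(space[knob])
    return new

cfg = {k: random.choice(v) for k, v in space.items()}
best, best_cost = dict(cfg), cost(cfg)
temp = 5.0
for _ in range(500):
    cand = neighbour(cfg)
    delta = cost(cand) - cost(cfg)
    # Always accept improvements; accept worsenings with Boltzmann probability.
    if delta < 0 or random.random() < math.exp(-delta / temp):
        cfg = cand
        if cost(cfg) < best_cost:
            best, best_cost = dict(cfg), cost(cfg)
    temp *= 0.99  # geometric cooling
```

In a real flow the cost would come from the bottom-up performance models the paper describes, and the annealer would be hybridized with an evolutionary population rather than a single walker.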

  4. Lean management in health care: definition, concepts, methodology and effects reported (systematic review protocol)

    PubMed Central

    2014-01-01

    Background Lean is a set of operating philosophies and methods that help create maximum value for patients by reducing waste and waits. It emphasizes consideration of the customer's needs, employee involvement and continuous improvement. Research on the application and implementation of lean principles in health care has been limited. Methods This is a protocol for a systematic review, following the Cochrane Effective Practice and Organisation of Care (EPOC) methodology. The review aims to document, catalogue and synthesize the existing literature on the effects of lean implementation in health care settings, especially the potential effects on professional practice and health care outcomes. We have developed a Medline keyword search strategy, and this focused strategy will be translated into other databases. All search strategies will be provided in the review. The method proposed by the Cochrane EPOC group regarding randomized study designs, non-randomised controlled trials, controlled before-and-after studies and interrupted time series will be followed. In addition, we will also include cohort and case–control studies, and relevant non-comparative publications such as case reports. We will categorize and analyse the review findings according to the study design employed, the study quality (low- versus high-quality studies) and the reported types of implementation in the primary studies. We will present the results of the studies in tabular form. Discussion Overall, the systematic review aims to identify, assess and synthesize the evidence to underpin the implementation of lean activities in health care settings as defined in this protocol. As a result, the review will provide an evidence base for the effectiveness of lean and of the implementation methodologies reported in health care. Systematic review registration PROSPERO CRD42014008853 PMID:25238974

  5. Systematic design assessment techniques for solar buildings

    NASA Astrophysics Data System (ADS)

    Page, J. K.; Rodgers, G. G.; Souster, C. G.

    1980-02-01

    The paper describes the various approaches developed for the detailed modelling of the relevant climatic input variables for systematic design assessments for solar housing techniques. A report is made of the techniques developed to generate systematic short wave radiation data for vertical and inclined surfaces for different types of weather. The analysis is based on different types of days, such as sunny, average and overcast. Work on the accurate estimation of the magnitude of the associated weather variables affecting heat transfer in the external environment is also reported, covering air temperature, wind speed and long wave radiation exchanges.
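
As an illustration of the kind of calculation such inclined-surface radiation data supports, the sketch below evaluates beam irradiance on an equator-facing tilted surface from the standard incidence-angle relation. The function name and the latitude, tilt, and irradiance values are illustrative assumptions, not taken from the paper:

```python
import math

def beam_on_tilt(dni, lat_deg, tilt_deg, decl_deg, hour_angle_deg):
    # Standard relation for a surface tilted toward the equator:
    # cos(theta) = sin(d)*sin(lat - tilt) + cos(d)*cos(lat - tilt)*cos(h)
    # with solar declination d and hour angle h (0 at solar noon).
    lat, tilt = math.radians(lat_deg), math.radians(tilt_deg)
    d, h = math.radians(decl_deg), math.radians(hour_angle_deg)
    cos_theta = (math.sin(d) * math.sin(lat - tilt)
                 + math.cos(d) * math.cos(lat - tilt) * math.cos(h))
    return max(0.0, dni * cos_theta)  # clip sun-behind-surface cases

# Sunny equinox noon (declination 0, hour angle 0) at latitude 53 N:
flat = beam_on_tilt(800.0, 53.0, 0.0, 0.0, 0.0)     # horizontal surface
tilted = beam_on_tilt(800.0, 53.0, 53.0, 0.0, 0.0)  # tilt equal to latitude
```

Tilting the surface to the latitude makes the equinox-noon beam normal to it, which is why design assessments need radiation data resolved by surface inclination rather than horizontal values alone.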

  6. Methodology Series Module 6: Systematic Reviews and Meta-analysis

    PubMed Central

    Setia, Maninder Singh

    2016-01-01

    Systematic reviews and meta-analyses have become an important part of the biomedical literature, and they provide the “highest level of evidence” for various clinical questions. There are a lot of studies – sometimes with contradictory conclusions – on a particular topic in the literature. Hence, as a clinician, which results will you believe? What will you tell your patient? Which drug is better? A systematic review or a meta-analysis may help us answer these questions. In addition, it may also help us understand the quality of the articles in the literature or the type of studies that have been conducted and published (for example, randomized trials or observational studies). The first step is to identify a research question for the systematic review or meta-analysis. The next step is to identify the articles that will be included in the study. This will be done by searching various databases; it is important that the researcher search for articles in more than one database. It will also be useful to form a group of researchers and statisticians with expertise in conducting systematic reviews and meta-analyses before initiating them. We strongly encourage readers to register their proposed review/meta-analysis with PROSPERO. Finally, these studies should be reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) checklist. PMID:27904176

  7. Systematic Reviews of Animal Models: Methodology versus Epistemology

    PubMed Central

    Greek, Ray; Menache, Andre

    2013-01-01

    Systematic reviews are currently favored methods of evaluating research in order to reach conclusions regarding medical practice. The need for such reviews is necessitated by the fact that no research is perfect and experts are prone to bias. By combining many studies that fulfill specific criteria, one hopes that the strengths can be multiplied and thus reliable conclusions attained. Potential flaws in this process include the assumptions that underlie the research under examination. If the assumptions, or axioms, upon which the research studies are based, are untenable either scientifically or logically, then the results must be highly suspect regardless of the otherwise high quality of the studies or the systematic reviews. We outline recent criticisms of animal-based research, namely that animal models are failing to predict human responses. It is this failure that is purportedly being corrected via systematic reviews. We then examine the assumption that animal models can predict human outcomes to perturbations such as disease or drugs, even under the best of circumstances. We examine the use of animal models in light of empirical evidence comparing human outcomes to those from animal models, complexity theory, and evolutionary biology. We conclude that even if legitimate criticisms of animal models were addressed, through standardization of protocols and systematic reviews, the animal model would still fail as a predictive modality for human response to drugs and disease. Therefore, systematic reviews and meta-analyses of animal-based research are poor tools for attempting to reach conclusions regarding human interventions. PMID:23372426

  8. Systematic reviews of animal models: methodology versus epistemology.

    PubMed

    Greek, Ray; Menache, Andre

    2013-01-01

    Systematic reviews are currently favored methods of evaluating research in order to reach conclusions regarding medical practice. The need for such reviews is necessitated by the fact that no research is perfect and experts are prone to bias. By combining many studies that fulfill specific criteria, one hopes that the strengths can be multiplied and thus reliable conclusions attained. Potential flaws in this process include the assumptions that underlie the research under examination. If the assumptions, or axioms, upon which the research studies are based, are untenable either scientifically or logically, then the results must be highly suspect regardless of the otherwise high quality of the studies or the systematic reviews. We outline recent criticisms of animal-based research, namely that animal models are failing to predict human responses. It is this failure that is purportedly being corrected via systematic reviews. We then examine the assumption that animal models can predict human outcomes to perturbations such as disease or drugs, even under the best of circumstances. We examine the use of animal models in light of empirical evidence comparing human outcomes to those from animal models, complexity theory, and evolutionary biology. We conclude that even if legitimate criticisms of animal models were addressed, through standardization of protocols and systematic reviews, the animal model would still fail as a predictive modality for human response to drugs and disease. Therefore, systematic reviews and meta-analyses of animal-based research are poor tools for attempting to reach conclusions regarding human interventions.

  9. Environmental and Sustainability Education Policy Research: A Systematic Review of Methodological and Thematic Trends

    ERIC Educational Resources Information Center

    Aikens, Kathleen; McKenzie, Marcia; Vaughter, Philip

    2016-01-01

    This paper reports on a systematic literature review of policy research in the area of environmental and sustainability education. We analyzed 215 research articles, spanning four decades and representing 71 countries, and which engaged a range of methodologies. Our analysis combines quantification of geographic and methodological trends with…

  10. Unshrouded Centrifugal Turbopump Impeller Design Methodology

    NASA Technical Reports Server (NTRS)

    Prueger, George H.; Williams, Morgan; Chen, Wei-Chung; Paris, John; Williams, Robert; Stewart, Eric

    2001-01-01

Turbopump weight continues to be a dominant parameter in the trade space for reduction of engine weight. The Space Shuttle Main Engine weight distribution indicates that the turbomachinery makes up approximately 30% of the total engine weight. Weight reduction can be achieved by reducing the envelope of the turbopump. Reduction in envelope relates to an increase in turbopump speed and an increase in impeller head coefficient. Speed can be increased until suction performance limits are reached on the pump, or until alternate constraints (the turbine or bearings) limit speed. Once the speed of the turbopump is set, the impeller tip speed sets the minimum head coefficient of the machine. To reduce impeller diameter, the head coefficient must be increased. A significant limitation with increasing head coefficient is that the slope of the head-flow characteristic is affected, and this can limit engine throttling range. Unshrouded impellers offer a design option for increased turbopump speed without increasing the impeller head coefficient. However, there are several issues with regard to using an unshrouded impeller: there is a pump performance penalty due to the front open face recirculation flow, there is a potential pump axial thrust problem from the unbalanced front open face and the back shroud face, and since test data are very limited for this configuration, there is uncertainty in the magnitude and phase of the rotordynamic forces due to the front impeller passage. The purpose of the paper is to discuss the design of an unshrouded impeller and to examine the hydrodynamic performance, axial thrust, and rotordynamic performance. The design methodology will also be discussed. This work will help provide some guidelines for unshrouded impeller design.
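The diameter-versus-head-coefficient trade described above can be made explicit with the standard turbomachinery definitions (a sketch from textbook relations, not equations given in the paper):

```latex
% Head coefficient and impeller tip speed (standard definitions):
\psi = \frac{g H}{U_t^{2}}, \qquad U_t = \frac{\omega D}{2}
% Solving for the impeller diameter at fixed head H and shaft speed \omega:
D = \frac{2}{\omega} \sqrt{\frac{g H}{\psi}}
% so raising either the shaft speed \omega or the head coefficient \psi shrinks D.
```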

  11. Methodological guidance documents for evaluation of ethical considerations in health technology assessment: a systematic review.

    PubMed

    Assasi, Nazila; Schwartz, Lisa; Tarride, Jean-Eric; Campbell, Kaitryn; Goeree, Ron

    2014-04-01

    Despite the advances made in the development of ethical frameworks for health technology assessment (HTA), there is no clear agreement on the scope and details of a practical approach to address ethical aspects in HTA. This systematic review aimed to identify existing guidance documents for incorporation of ethics in HTA to provide an overview of their methodological features. The review identified 43 conceptual frameworks or practical guidelines, varying in their philosophical approach, structure, and comprehensiveness. They were designed for different purposes throughout the HTA process, ranging from helping HTA-producers in identification, appraisal and analysis of ethical data to supporting decision-makers in making value-sensitive decisions. They frequently promoted using analytical methods that combined normative reflection with participatory approaches. The choice of a method for collection and analysis of ethical data seems to depend on the context in which technology is being assessed, the purpose of analysis, and availability of required resources.

  12. CONCEPTUAL DESIGNS FOR A NEW HIGHWAY VEHICLE EMISSIONS ESTIMATION METHODOLOGY

    EPA Science Inventory

The report discusses six conceptual designs for a new highway vehicle emissions estimation methodology and summarizes the recommendations of each design for improving the emissions and activity factors in the emissions estimation process. The complete design reports are included a...

  13. Hydrogel design of experiments methodology to optimize hydrogel for iPSC-NPC culture.

    PubMed

    Lam, Jonathan; Carmichael, S Thomas; Lowry, William E; Segura, Tatiana

    2015-03-11

Bioactive signals can be incorporated in hydrogels to direct encapsulated cell behavior. Design of experiments methodology varies these signals systematically to determine the individual and combinatorial effects of each factor on cell activity. Using this approach enables the optimization of the concentrations of three ligands (RGD, YIGSR, IKVAV) for the survival and differentiation of neural progenitor cells.
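A full-factorial design of experiments over three ligand concentrations can be enumerated in a few lines. The concentration levels below are hypothetical placeholders, not values from the study:

```python
from itertools import product

# Hypothetical concentration levels for each ligand; the actual values
# used by Lam et al. are not given in this abstract.
levels = {
    "RGD":   [0, 100, 500],
    "YIGSR": [0, 100, 500],
    "IKVAV": [0, 100, 500],
}

def full_factorial(factors):
    """Enumerate every combination of factor levels as a list of dicts."""
    names = list(factors)
    return [dict(zip(names, combo))
            for combo in product(*(factors[n] for n in names))]

runs = full_factorial(levels)
print(len(runs))  # 3 levels ** 3 factors = 27 runs
```

Each run prescribes one hydrogel formulation; measuring a response (e.g., survival) for every run lets main effects and interactions be separated.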

  14. Methodology for Preliminary Design of Electrical Microgrids

    SciTech Connect

    Jensen, Richard P.; Stamp, Jason E.; Eddy, John P.; Henry, Jordan M; Munoz-Ramos, Karina; Abdallah, Tarek

    2015-09-30

    Many critical loads rely on simple backup generation to provide electricity in the event of a power outage. An Energy Surety Microgrid TM can protect against outages caused by single generator failures to improve reliability. An ESM will also provide a host of other benefits, including integration of renewable energy, fuel optimization, and maximizing the value of energy storage. The ESM concept includes a categorization for microgrid value proposi- tions, and quantifies how the investment can be justified during either grid-connected or utility outage conditions. In contrast with many approaches, the ESM approach explic- itly sets requirements based on unlikely extreme conditions, including the need to protect against determined cyber adversaries. During the United States (US) Department of Defense (DOD)/Department of Energy (DOE) Smart Power Infrastructure Demonstration for Energy Reliability and Security (SPIDERS) effort, the ESM methodology was successfully used to develop the preliminary designs, which direct supported the contracting, construction, and testing for three military bases. Acknowledgements Sandia National Laboratories and the SPIDERS technical team would like to acknowledge the following for help in the project: * Mike Hightower, who has been the key driving force for Energy Surety Microgrids * Juan Torres and Abbas Akhil, who developed the concept of microgrids for military installations * Merrill Smith, U.S. Department of Energy SPIDERS Program Manager * Ross Roley and Rich Trundy from U.S. Pacific Command * Bill Waugaman and Bill Beary from U.S. Northern Command * Melanie Johnson and Harold Sanborn of the U.S. Army Corps of Engineers Construc- tion Engineering Research Laboratory * Experts from the National Renewable Energy Laboratory, Idaho National Laboratory, Oak Ridge National Laboratory, and Pacific Northwest National Laboratory

  15. Enhancing the Front-End Phase of Design Methodology

    ERIC Educational Resources Information Center

    Elias, Erasto

    2006-01-01

    Design methodology (DM) is defined by the procedural path, expressed in design models, and techniques or methods used to untangle the various activities within a design model. Design education in universities is mainly based on descriptive design models. Much knowledge and organization have been built into DM to facilitate design teaching.…

  16. Impact of searching clinical trial registries in systematic reviews of pharmaceutical treatments: methodological systematic review and reanalysis of meta-analyses.

    PubMed

    Baudard, Marie; Yavchitz, Amélie; Ravaud, Philippe; Perrodeau, Elodie; Boutron, Isabelle

    2017-02-17

Objective: To evaluate the impact of searching clinical trial registries in systematic reviews. Design: Methodological systematic review and reanalyses of meta-analyses. Data sources: Medline was searched to identify systematic reviews of randomised controlled trials (RCTs) assessing pharmaceutical treatments published between June 2014 and January 2015. For all systematic reviews that did not report a trial registry search but reported the information to perform it, the World Health Organization International Clinical Trials Registry Platform (WHO ICTRP search portal) was searched for completed or terminated RCTs not originally included in the systematic review. Data extraction: For each systematic review, two researchers independently extracted the outcomes analysed, the number of patients included, and the treatment effect estimated. For each RCT identified, two researchers independently determined whether the results were available (ie, posted, published, or available on the sponsor website) and extracted the data. When additional data were retrieved, we reanalysed meta-analyses and calculated the weight of the additional RCTs and the change in summary statistics by comparison with the original meta-analysis. Results: Among 223 selected systematic reviews, 116 (52%) did not report a search of trial registries; 21 of these did not report the information to perform the search (key words, search date). A search was performed for 95 systematic reviews; for 54 (57%), no additional RCTs were found and for 41 (43%) 122 additional RCTs were identified. The search allowed for increasing the number of patients by more than 10% in 19 systematic reviews, 20% in 10, 30% in seven, and 50% in four. Moreover, 63 RCTs had results available; the results for 45 could be included in a meta-analysis. 14 systematic reviews including 45 RCTs were reanalysed. The weight of the additional RCTs in the recalculated meta-analyses ranged from 0% to 58% and was greater than 10% in five of 14 systematic
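The reanalysis step described above amounts to recomputing a meta-analysis with the registry-found trials added. A minimal fixed-effect, inverse-variance sketch with invented effect sizes (not data from the paper):

```python
# Fixed-effect inverse-variance meta-analysis: each trial contributes
# weight w_i = 1 / se_i**2; the pooled estimate is sum(w_i * y_i) / sum(w_i).
# The effect sizes and standard errors below are illustrative only.

def pooled(trials):
    """trials: list of (effect, standard_error) pairs."""
    weights = [1.0 / se ** 2 for _, se in trials]
    est = sum(w * y for w, (y, _) in zip(weights, trials)) / sum(weights)
    return est, weights

original = [(0.30, 0.10), (0.20, 0.15)]  # trials in the published review
extra    = [(0.05, 0.12)]                # RCT found via a registry search

est_before, _ = pooled(original)
est_after, weights = pooled(original + extra)

# Relative weight of the additional RCT in the recalculated meta-analysis.
extra_weight = weights[-1] / sum(weights)
print(round(est_before, 3), round(est_after, 3), round(extra_weight, 3))
```

Here the unregistered-in-the-review trial carries roughly a third of the recalculated weight and pulls the summary estimate down, which is the kind of shift the authors quantified.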

  17. Technical report on LWR design decision methodology. Phase I

    SciTech Connect

    1980-03-01

Energy Incorporated (EI) was selected by Sandia Laboratories to develop and test an LWR design decision methodology. Contract Number 42-4229 provided funding for Phase I of this work. This technical report on LWR design decision methodology documents the activities performed under that contract. Phase I was a short-term effort to thoroughly review the current LWR design decision process, to ensure complete understanding of current practices, and to establish a well-defined interface for development of initial quantitative design guidelines.

  18. Design Research: Theoretical and Methodological Issues

    ERIC Educational Resources Information Center

    Collins, Allan; Joseph, Diana; Bielaczyc, Katerine

    2004-01-01

    The term "design experiments" was introduced in 1992, in articles by Ann Brown (1992) and Allan Collins (1992). Design experiments were developed as a way to carry out formative research to test and refine educational designs based on principles derived from prior research. More recently the term design research has been applied to this kind of…

  19. A Design Methodology for Medical Processes

    PubMed Central

    Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara

    2016-01-01

Background: Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient's needs, the uncertainty of the patient's response, and the indeterminacy of the patient's compliance to treatment. Also, the multiple actors involved in the patient's care need clear and transparent communication to ensure care coordination. Objectives: In this paper, we propose a methodology to model healthcare processes in order to break down complexity and provide transparency. Methods: The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. Results: The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests, which was also implemented. Conclusions: Despite being only an example, our case study showed the ability of process modeling to answer the actual needs in healthcare practices. Independently of the medical domain in which the modeling effort is done, the proposed methodology is useful for creating high-quality models, and for detecting and taking into account relevant and tricky situations that can occur during process execution. PMID:27081415

  20. Instruments for Assessing Risk of Bias and Other Methodological Criteria of Published Animal Studies: A Systematic Review

    PubMed Central

    Krauth, David; Woodruff, Tracey J.

    2013-01-01

    Background: Results from animal toxicology studies are critical to evaluating the potential harm from exposure to environmental chemicals or the safety of drugs prior to human testing. However, there is significant debate about how to evaluate the methodology and potential biases of the animal studies. There is no agreed-upon approach, and a systematic evaluation of current best practices is lacking. Objective: We performed a systematic review to identify and evaluate instruments for assessing the risk of bias and/or other methodological criteria of animal studies. Method: We searched Medline (January 1966–November 2011) to identify all relevant articles. We extracted data on risk of bias criteria (e.g., randomization, blinding, allocation concealment) and other study design features included in each assessment instrument. Discussion: Thirty distinct instruments were identified, with the total number of assessed risk of bias, methodological, and/or reporting criteria ranging from 2 to 25. The most common criteria assessed were randomization (25/30, 83%), investigator blinding (23/30, 77%), and sample size calculation (18/30, 60%). In general, authors failed to empirically justify why these or other criteria were included. Nearly all (28/30, 93%) of the instruments have not been rigorously tested for validity or reliability. Conclusion: Our review highlights a number of risk of bias assessment criteria that have been empirically tested for animal research, including randomization, concealment of allocation, blinding, and accounting for all animals. In addition, there is a need for empirically testing additional methodological criteria and assessing the validity and reliability of a standard risk of bias assessment instrument. Citation: Krauth D, Woodruff TJ, Bero L. 2013. Instruments for assessing risk of bias and other methodological criteria of published animal studies: a systematic review. Environ Health Perspect 121:985–992 (2013); http://dx.doi.org/10

  1. Systematic risk assessment methodology for critical infrastructure elements - Oil and Gas subsectors

    NASA Astrophysics Data System (ADS)

    Gheorghiu, A.-D.; Ozunu, A.

    2012-04-01

    The concern for the protection of critical infrastructure has been rapidly growing in the last few years in Europe. The level of knowledge and preparedness in this field is beginning to develop in a lawfully organized manner, for the identification and designation of critical infrastructure elements of national and European interest. Oil and gas production, refining, treatment, storage and transmission by pipelines facilities, are considered European critical infrastructure sectors, as per Annex I of the Council Directive 2008/114/EC of 8 December 2008 on the identification and designation of European critical infrastructures and the assessment of the need to improve their protection. Besides identifying European and national critical infrastructure elements, member states also need to perform a risk analysis for these infrastructure items, as stated in Annex II of the above mentioned Directive. In the field of risk assessment, there are a series of acknowledged and successfully used methods in the world, but not all hazard identification and assessment methods and techniques are suitable for a given site, situation, or type of hazard. As Theoharidou, M. et al. noted (Theoharidou, M., P. Kotzanikolaou, and D. Gritzalis 2009. Risk-Based Criticality Analysis. In Critical Infrastructure Protection III. Proceedings. Third Annual IFIP WG 11.10 International Conference on Critical Infrastructure Protection. Hanover, New Hampshire, USA, March 23-25, 2009: revised selected papers, edited by C. Palmer and S. Shenoi, 35-49. Berlin: Springer.), despite the wealth of knowledge already created, there is a need for simple, feasible, and standardized criticality analyses. 
The proposed systematic risk assessment methodology includes three basic steps: the first step (preliminary analysis) includes the identification of hazards (including possible natural hazards) for each installation/section within a given site, followed by a criterial analysis and then a detailed analysis step.

  2. The Study of the Relationship between Probabilistic Design and Axiomatic Design Methodology. Volume 1

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chinyere; Onyebueke, Landon

    1996-01-01

This final report covers all the work done on this project. The goal of this project is the transfer of methodologies to improve the design process. The specific objectives are: 1. To learn and understand probabilistic design analysis using NESSUS. 2. To assign design projects on the application of NESSUS to either undergraduate or graduate students. 3. To integrate the application of NESSUS into selected senior-level courses in the Civil and Mechanical Engineering curricula. 4. To develop courseware in probabilistic design methodology to be included in a graduate-level Design Methodology course. 5. To study the relationship between the probabilistic design methodology and the axiomatic design methodology.

  3. Epidemiology characteristics, reporting characteristics, and methodological quality of systematic reviews and meta-analyses on traditional Chinese medicine nursing interventions published in Chinese journals.

    PubMed

    Yang, Min; Jiang, Li; Wang, Aihong; Xu, Guihua

    2017-02-01

    To evaluate the epidemiological characteristics, reporting characteristics, and methodological quality of systematic reviews in the traditional Chinese medicine nursing field published in Chinese journals. The number of systematic reviews in the traditional Chinese medicine nursing field has increased, but their epidemiology, quality, and reporting characteristics have not been assessed completely. We generated an overview of reviews using a narrative approach. Four Chinese databases were searched for systematic reviews from inception to December 2015. The Preferred Reporting Items of Systematic Reviews and Meta-analyses and the Assessment of Multiple Systematic Reviews checklists were adopted to evaluate reporting and methodological quality, respectively. A total of 73 eligible systematic reviews, published from 2005 to 2015, were included. The deficiencies in reporting characteristics mainly lay in the lack of structured abstract or protocol, incomplete reporting of search strategies, study selection, and risk of bias. The deficiencies in methodological quality were reflected in the lack of a priori design and conflict of interest, incomplete literature searches, and assessment of publication bias. The quality of the evaluated reviews was unsatisfactory; attention should be paid to the improvement of reporting and methodological quality in the conduct of systematic reviews.

  4. A Methodology for Total Hospital Design

    PubMed Central

    Delon, Gerald L.

    1970-01-01

    A procedure is described that integrates three techniques into a unified approach: a computerized method for estimating departmental areas and construction costs, a computerized layout routine that produces a space-relationship diagram based on qualitative factors, and a second layout program that establishes a final layout by a series of iterations. The methodology described utilizes as input the results of earlier phases of the research, with the output of each step in turn becoming the input for the succeeding step. The method is illustrated by application to a hypothetical pediatric hospital of 100 beds. PMID:5494263

  5. CAGE IIIA Distributed Simulation Design Methodology

    DTIC Science & Technology

    2014-05-01

The Federation Development and Execution Process (FEDEP), the Synthetic Environment Development and Exploitation Process (SEDEP), the Distributed Simulation Engineering and Execution Process (DSEEP), and Kweley and Wood [Andreas Tolk et al. (2012)] all assume a central design authority and thus full control of the system of...

  6. The methodology of database design in organization management systems

    NASA Astrophysics Data System (ADS)

    Chudinov, I. L.; Osipova, V. V.; Bobrova, Y. V.

    2017-01-01

The paper describes a unified methodology of database design for management information systems. Designing the conceptual information model for the domain area is the most important and labor-intensive stage in database design. Based on the proposed integrated approach to design, the conceptual information model and the main principles of developing relational databases are provided, and users' information needs are considered. According to the methodology, the process of designing the conceptual information model includes three basic stages, which are defined in detail. Finally, the article describes the process of applying the results of the analysis of users' information needs, and the rationale for the use of classifiers.

  7. A design optimization methodology for Li+ batteries

    NASA Astrophysics Data System (ADS)

    Golmon, Stephanie; Maute, Kurt; Dunn, Martin L.

    2014-05-01

Design optimization of functionally graded battery electrodes is shown to improve the usable energy capacity of Li+ batteries predicted by computational simulations, by numerically optimizing the electrode porosities and particle radii. A multi-scale battery model, which accounts for nonlinear transient transport processes, electrochemical reactions, and mechanical deformations, is used to predict the usable energy storage capacity of the battery over a range of discharge rates. A multi-objective formulation of the design problem is introduced to maximize the usable capacity over a range of discharge rates while limiting the mechanical stresses. The optimization problem is solved via gradient-based optimization. A LiMn2O4 cathode is simulated with a PEO-LiCF3SO3 electrolyte and either a Li-foil (half cell) or LiC6 (full cell) anode. Studies were performed on both half- and full-cell configurations, resulting in distinctly different optimal electrode designs. The numerical results show that the highest discharge rate drives the simulations and that the optimal designs are dominated by Li+ transport rates. The results also suggest that spatially varying electrode porosities and active particle sizes provide an efficient approach to improving the power-to-energy density of Li+ batteries. For the half-cell configuration, the optimal design improves the discharge capacity by 29%, while for the full cell the discharge capacity was improved by 61% relative to an initial design with a uniform electrode structure. Most of the improvement in capacity was due to the spatially varying porosity, with up to 5% of the gains attributed to the particle-radii design variables.
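The multi-objective formulation described above can be illustrated with a toy weighted-sum scalarization solved by gradient descent. The objective functions and the single design variable here are invented stand-ins, not the authors' multi-scale battery model:

```python
# Toy weighted-sum scalarization of a two-objective design problem:
# maximize capacity(x) while penalizing stress(x), solved by gradient
# descent on one design variable x (think: a normalized porosity).
# Both objective functions are invented for illustration only.

def capacity(x):      # peaks at x = 0.6
    return -(x - 0.6) ** 2

def stress(x):        # grows with x
    return x ** 2

def objective(x, lam=0.5):
    # Negate capacity so the scalarized problem is a minimization.
    return -capacity(x) + lam * stress(x)

def grad(x, lam=0.5, h=1e-6):
    # Central-difference gradient of the scalarized objective.
    return (objective(x + h, lam) - objective(x - h, lam)) / (2 * h)

x = 0.0
for _ in range(2000):
    x -= 0.05 * grad(x)

# The optimum trades capacity (pulling x toward 0.6) against stress
# (pulling x toward 0); analytically x* = 0.6 / (1 + lam) = 0.4.
print(round(x, 3))
```

Sweeping the weight `lam` traces out a Pareto front between the two objectives, which is the essence of the multi-objective trade-off the paper explores with far richer physics.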

  8. Neuropsychological investigations in obsessive-compulsive disorder: A systematic review of methodological challenges.

    PubMed

    Abramovitch, Amitai; Mittelman, Andrew; Tankersley, Amelia P; Abramowitz, Jonathan S; Schweiger, Avraham

    2015-07-30

    The inconsistent nature of the neuropsychology literature pertaining to obsessive-compulsive disorder (OCD) has long been recognized. However, individual studies, systematic reviews, and recent meta-analytic reviews were unsuccessful in establishing a consensus regarding a disorder-specific neuropsychological profile. In an attempt to identify methodological factors that may contribute to the inconsistency that is characteristic of this body of research, a systematic review of methodological factors in studies comparing OCD patients and non-psychiatric controls on neuropsychological tests was conducted. This review covered 115 studies that included nearly 3500 patients. Results revealed a range of methodological weaknesses. Some of these weaknesses have been previously noted in the broader neuropsychological literature, while some are more specific to psychiatric disorders, and to OCD. These methodological shortcomings have the potential to hinder the identification of a specific neuropsychological profile associated with OCD as well as to obscure the association between neurocognitive dysfunctions and contemporary neurobiological models. Rectifying these weaknesses may facilitate replicability, and promote our ability to extract cogent, meaningful, and more unified inferences regarding the neuropsychology of OCD. To that end, we present a set of methodological recommendations to facilitate future neuropsychology research in psychiatric disorders in general, and in OCD in particular.

  9. Implicit Shape Parameterization for Kansei Design Methodology

    NASA Astrophysics Data System (ADS)

    Nordgren, Andreas Kjell; Aoyama, Hideki

Implicit shape parameterization for Kansei design is a procedure that uses 3D models, or concepts, to span a shape space for surfaces in the automotive field. A low-dimensional, yet accurate shape descriptor was found by Principal Component Analysis of an ensemble of point clouds, which were extracted from mesh-based surfaces modeled in a CAD program. A theoretical background of the procedure is given along with step-by-step instructions for the required data processing. The results show that complex surfaces can be described very efficiently, and encode design features by an implicit approach that does not rely on error-prone explicit parameterizations. This provides a very intuitive way for a designer to explore shapes, because various design features can simply be introduced by adding new concepts to the ensemble. Complex shapes have been difficult to analyze with Kansei methods due to the large number of parameters involved, but implicit parameterization of design features provides a low-dimensional shape descriptor for efficient data collection, model building, and analysis of emotional content in 3D surfaces.
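The descriptor-extraction step (PCA over an ensemble of flattened point clouds) can be sketched as follows; the ensemble here is synthetic random data standing in for CAD-derived surfaces, and the dimensions are arbitrary:

```python
import numpy as np

# Ensemble of 20 "point clouds", each flattened to a 300-dim vector
# (100 points x 3 coordinates); synthetic data stands in for surfaces
# sampled from mesh-based CAD models.
rng = np.random.default_rng(0)
ensemble = rng.normal(size=(20, 300))

# PCA via SVD of the mean-centered ensemble matrix.
mean = ensemble.mean(axis=0)
centered = ensemble - mean
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

k = 5                                 # retain the first 5 principal modes
scores = centered @ Vt[:k].T          # low-dimensional shape descriptor
reconstructed = mean + scores @ Vt[:k]

print(scores.shape)                   # each shape is now 5 numbers
```

New concepts added to `ensemble` simply enlarge the spanned shape space, which is what makes the implicit approach convenient for designers.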

  10. Philosophical and Methodological Beliefs of Instructional Design Faculty and Professionals

    ERIC Educational Resources Information Center

    Sheehan, Michael D.; Johnson, R. Burke

    2012-01-01

    The purpose of this research was to probe the philosophical beliefs of instructional designers using sound philosophical constructs and quantitative data collection and analysis. We investigated the philosophical and methodological beliefs of instructional designers, including 152 instructional design faculty members and 118 non-faculty…

  11. Development of Distributed Computing Systems Software Design Methodologies.

    DTIC Science & Technology

    1982-11-05

Final report: Development of Distributed Computing Systems Software Design Methodologies. Stephen S. Yau, Dept. of Electrical..., Northwestern Univ., Evanston, IL. September 22...

  12. Design Methodology for Multiple Microcomputer Architectures.

    DTIC Science & Technology

    1982-07-01

multimicro design knowledge is true both in industry and in university environments. In the industrial environment, it reduces productivity and increases... "Real-Time Processor Problems," Proc. of ELECTRO-81 / Tercer Seminario de Ingenieria Electronica, Nov. 9-13, 1981.

  13. A Design Methodology For Industrial Vision Systems

    NASA Astrophysics Data System (ADS)

    Batchelor, B. G.; Waltz, F. M.; Snyder, M. A.

    1988-11-01

The cost of design, rather than that of target system hardware, represents the principal factor inhibiting the adoption of machine vision systems by manufacturing industry. To reduce design costs to a minimum, a number of software and hardware aids have been developed or are currently being built by the authors. These design aids are as follows: a. An expert system for giving advice about which image acquisition techniques (i.e. lighting/viewing techniques) might be appropriate in a given situation. b. A program to assist in the selection and setup of camera lenses. c. A rich repertoire of image processing procedures, integrated with the AI language Prolog. This combination (called ProVision) provides a facility for experimenting with intelligent image processing techniques and is intended to allow rapid prototyping of algorithms and/or heuristics. d. Fast image processing hardware, capable of implementing commands in the ProVision language. The speed of operation of this equipment is sufficiently high for it to be used, without modification, in many industrial applications. Where this is not possible, even higher execution speed may be achieved by adding extra modules to the processing hardware. In this way, it is possible to trade speed against the cost of the target system hardware. New and faster implementations of a given algorithm/heuristic can usually be achieved with the expenditure of only a small effort. Throughout this article, the emphasis is on designing an industrial vision system in a smooth and effortless manner. In order to illustrate our main thesis that the design of industrial vision systems can be made very much easier through the use of suitable utilities, the article concludes with a discussion of a case study: the dissection of tiny plants using a visually controlled robot.

  14. Surface design methodology - challenge the steel

    NASA Astrophysics Data System (ADS)

    Bergman, M.; Rosen, B.-G.; Eriksson, L.; Anderberg, C.

    2014-03-01

The way a product or material is experienced by its user can differ depending on the scenario. It is also well known that different materials and surfaces are used for different purposes. When optimizing materials and surface roughness for a specific application with the intention of improving a product, it is important to capture not only the physical requirements but also the user's experience and expectations. The design of medical equipment is characterized by laws and requirements on the materials and the surface function, but also by a conservative way of thinking about materials and colours. The purpose of this paper is to link the technical and customer requirements of current materials and surface textures in medical environments. By focusing on parts of the theory of Kansei Engineering, improvements to the company's products are possible. The idea is to find correlations between the desired experience or "feeling" for a product (customer requirements), functional requirements, and product geometrical properties (design parameters), to be implemented in new, improved products. To find new materials with the same (or better) technical requirements but a higher level of user stimulation, the current material (stainless steel) and its surface (brushed textures) were used as a reference. The use of focus groups of experts at the manufacturer led to a selection of twelve possible new materials for investigation in the project. In collaboration with the partner company for this project, three new materials that fulfil the requirements (easy to clean and anti-bacterial) came into focus for further investigation toward a new design of a washer-disinfector for medical equipment, using the Kansei-based cleanability approach (CAA).

  15. A Methodology for the Neutronics Design of Space Nuclear Reactors

    SciTech Connect

    King, Jeffrey C.; El-Genk, Mohamed S.

    2004-02-04

    A methodology for the neutronics design of space power reactors is presented. This methodology involves balancing the competing requirements of having sufficient excess reactivity for the desired lifetime, keeping the reactor subcritical at launch and during submersion accidents, and providing sufficient control over the lifetime of the reactor. These requirements are addressed by three reactivity values for a given reactor design: the excess reactivity at beginning of mission, the negative reactivity at shutdown, and the negative reactivity margin in submersion accidents. These reactivity values define the control worth and the safety worth in submersion accidents, used for evaluating the merit of a proposed reactor type and design. The Heat Pipe-Segmented Thermoelectric Module Converters space reactor core design is evaluated and modified based on the proposed methodology. The final reactor core design has sufficient excess reactivity for 10 years of nominal operation at 1.82 MW of fission power and is subcritical at launch and in all water submersion accidents.
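
    The balance among the three reactivity values can be illustrated with a small sketch. Reactivity is related to the effective multiplication factor by rho = (k_eff - 1)/k_eff; the k_eff values below are hypothetical placeholders, not taken from the HP-STMC design.

```python
# Illustrative check of the three reactivity requirements described above.
# The k_eff numbers are hypothetical, not from the HP-STMC core design.

def reactivity(k_eff):
    """Reactivity rho = (k_eff - 1) / k_eff."""
    return (k_eff - 1.0) / k_eff

cases = {
    "beginning of mission": (1.02, "supercritical"),  # excess reactivity needed
    "shutdown at launch": (0.98, "subcritical"),
    "water submersion": (0.97, "subcritical"),
}

for name, (k, requirement) in cases.items():
    rho = reactivity(k)
    ok = rho > 0 if requirement == "supercritical" else rho < 0
    print(f"{name}: k_eff = {k:.3f}, rho = {rho:+.4f}, {requirement}: {ok}")
```

    The control worth in such a scheme is the span between the beginning-of-mission and shutdown reactivities, which the control elements must cover with margin.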

  16. Methodological quality and descriptive characteristics of prosthodontic-related systematic reviews.

    PubMed

    Aziz, T; Compton, S; Nassar, U; Matthews, D; Ansari, K; Flores-Mir, C

    2013-04-01

    Ideally, healthcare systematic reviews (SRs) should help practicing professionals make evidence-based clinical decisions. However, the conclusions drawn from SRs are directly related to the quality of the SR and of the included studies. The aim was to investigate the methodological quality and key descriptive characteristics of SRs published in prosthodontics. Methodological quality was analysed using the Assessment of Multiple Reviews (AMSTAR) tool. Several electronic resources (MEDLINE, EMBASE, Web of Science and the American Dental Association's Evidence-based Dentistry website) were searched. In total, 106 SRs were located. Key descriptive characteristics and methodological quality features were gathered and assessed, and descriptive and inferential statistical testing was performed. Most SRs in this sample originated from the European continent, followed by North America. Most SRs were conducted by two to five authors; the majority were affiliated with academic institutions and had prior experience publishing SRs. The majority of SRs were published in specialty dentistry journals, with implant or implant-related topics being the primary topics of interest. According to AMSTAR, most quality aspects were adequately fulfilled by fewer than half of the reviews. Publication bias assessment and grey literature searches were the most poorly adhered-to components. Overall, the methodological quality of the prosthodontic-related systematic reviews was deemed limited. Future recommendations include that authors have prior training in conducting SRs and that journals adopt a universal checklist addressing all key characteristics of an unbiased SR process.

  17. Application of an Integrated Methodology for Propulsion and Airframe Control Design to a STOVL Aircraft

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay; Mattern, Duane

    1994-01-01

    An advanced methodology for integrated flight propulsion control (IFPC) design for future aircraft, which will use propulsion system generated forces and moments for enhanced maneuver capabilities, is briefly described. This methodology has the potential to address in a systematic manner the coupling between the airframe and the propulsion subsystems typical of such enhanced maneuverability aircraft. Application of the methodology to a short take-off and vertical landing (STOVL) aircraft in the landing approach to hover transition flight phase is presented, with a brief description of the various steps in the IFPC design methodology. The details of the individual steps have been described in previous publications, and the objective of this paper is to focus on how the components of the control system designed at each step integrate into the overall IFPC system. The full nonlinear IFPC system was evaluated extensively in nonreal-time simulations as well as piloted simulations. Results from the nonreal-time evaluations are presented in this paper. Lessons learned from this application study are summarized in terms of areas of potential improvement in the STOVL IFPC design as well as identification of technology development areas to enhance the applicability of the proposed design methodology.

  18. The use of the cluster randomized crossover design in clinical trials: protocol for a systematic review

    PubMed Central

    2014-01-01

    Background The cluster randomized crossover (CRXO) design is gaining popularity in trial settings where individual randomization or parallel group cluster randomization is not feasible or practical. In a CRXO trial, not only are clusters of individuals rather than individuals themselves randomized to trial arms, but also each cluster participates in each arm of the trial at least once in separate periods of time. We will review publications of clinical trials undertaken in humans that have used the CRXO design. The aim of this systematic review is to summarize, as reported: the motivations for using the CRXO design, the values of the CRXO design parameters, the justification and methodology for the sample size calculations and analyses, and the quality of reporting the CRXO design aspects. Methods/Design We will identify reports of CRXO trials by systematically searching MEDLINE, PubMed, Cochrane Methodology Register, EMBASE, and CINAHL Plus. In addition, we will search for methodological articles that describe the CRXO design and conduct citation searches to identify any further CRXO trials. The references of all eligible trials will also be searched. We will screen the identified abstracts, and retrieve and assess for inclusion the full text for any potentially relevant articles. Data will be extracted from the full text independently by two reviewers. Descriptive summary statistics will be presented for the extracted data. Discussion This systematic review will inform both researchers addressing CRXO methodology and trialists considering implementing the design. The results will allow focused methodological research of the CRXO design, provide practical examples for researchers of how CRXO trials have been conducted, including any shortcomings, and highlight areas where reporting and conduct may be improved. PMID:25115725

  19. Solid lubrication design methodology, phase 2

    NASA Technical Reports Server (NTRS)

    Pallini, R. A.; Wedeven, L. D.; Ragen, M. A.; Aggarwal, B. B.

    1986-01-01

    The high-temperature performance of solid-lubricated rolling elements was evaluated with a specially designed traction (friction) test apparatus. Graphite lubricants containing three additives (silver, phosphate glass, and zinc orthophosphate) were evaluated from room temperature to 540 C. Two hard coats were also evaluated. The evaluation of these lubricants, using a burnishing method of application, shows reasonable transfer of lubricant and wear protection for short-duration testing, except in the 200 C temperature range. The graphite lubricants containing silver and zinc orthophosphate additives were more effective than the phosphate glass material over the test conditions examined. Traction coefficients ranged from a low of 0.07 to a high of 0.6. By curve fitting the traction data, empirical equations for the slope and maximum traction coefficient as a function of contact pressure (P), rolling speed (U), and temperature (T) can be developed for each lubricant. A solid lubricant traction model was incorporated into an advanced bearing analysis code (SHABERTH). For comparison purposes, preliminary heat generation calculations were made for both oil- and solid-lubricated bearing operation. A preliminary analysis indicated significantly higher heat generation for a solid-lubricated ball bearing in a deep-groove configuration. An analysis of a cylindrical roller bearing configuration showed the potential for a low-friction solid-lubricated bearing.

  20. Optimal Design and Purposeful Sampling: Complementary Methodologies for Implementation Research.

    PubMed

    Duan, Naihua; Bhaumik, Dulal K; Palinkas, Lawrence A; Hoagwood, Kimberly

    2015-09-01

    Optimal design has been an under-utilized methodology. However, it has significant real-world applications, particularly in mixed methods implementation research. We review the concept and demonstrate how it can be used to assess the sensitivity of design decisions and balance competing needs. For observational studies, this methodology enables selection of the most informative study units. For experimental studies, it entails selecting and assigning study units to intervention conditions in the most informative manner. We blend optimal design methods with purposeful sampling to show how these two concepts balance competing needs when there are multiple study aims, a common situation in implementation research.

  1. PEM Fuel Cells Redesign Using Biomimetic and TRIZ Design Methodologies

    NASA Astrophysics Data System (ADS)

    Fung, Keith Kin Kei

    Two formal design methodologies, biomimetic design and the Theory of Inventive Problem Solving, TRIZ, were applied to the redesign of a Proton Exchange Membrane (PEM) fuel cell. Proof-of-concept prototyping was performed on two of the concepts for water management. The concept of collecting liquid water with strategically placed wicks demonstrated potential benefits for a fuel cell. Conversely, the periodic flow direction reversal concept might cause a reduction in water removal from a fuel cell. The causes of this water removal reduction remain unclear. In addition, three of the concepts generated with biomimetic design were further studied and demonstrated to stimulate more creative ideas in the thermal and water management of fuel cells. The biomimetic design and TRIZ methodologies were successfully applied to fuel cells and provided different perspectives on the redesign of fuel cells. The methodologies should continue to be used to improve fuel cells.

  2. Methodological Innovation in Practice-Based Design Doctorates

    ERIC Educational Resources Information Center

    Yee, Joyce S. R.

    2010-01-01

    This article presents a selective review of recent design PhDs that identify and analyse the methodological innovation that is occurring in the field, in order to inform future provision of research training. Six recently completed design PhDs are used to highlight possible philosophical and practical models that can be adopted by future PhD…

  3. Helicopter-V/STOL dynamic wind and turbulence design methodology

    NASA Technical Reports Server (NTRS)

    Bailey, J. Earl

    1987-01-01

    Aircraft and helicopter accidents due to severe dynamic wind and turbulence continue to present challenging design problems. The development of the current set of design analysis tools for aircraft wind and turbulence design began in the 1940's and 1950's. The areas of helicopter dynamic wind and turbulence modeling and vehicle response to severe dynamic wind inputs (microburst type phenomena) during takeoff and landing remain major unsolved design problems, owing to a lack of both environmental data and computational methodology. The development of helicopter and V/STOL dynamic wind and turbulence response computation methodology is reviewed, the current state of the design art in industry is outlined, and comments on design methodology are made which may serve to improve future flight vehicle design.

  4. Fast detection of manufacturing systematic design pattern failures causing device yield loss

    NASA Astrophysics Data System (ADS)

    Le Denmat, Jean-Christophe; Feldman, Nelly; Riewer, Olivia; Yesilada, Emek; Vallet, Michel; Suzor, Christophe; Talluto, Salvatore

    2015-03-01

    Starting from the 45nm technology node, systematic defectivity has a significant impact on device yield loss with each new technology node. The effort required to achieve patterning maturity with zero yield detractors is also significantly increasing with technology nodes. Within the manufacturing environment, new in-line wafer inspection methods have been developed to identify device systematic defects, including the process window qualification (PWQ) methodology used to characterize process robustness. Although patterning is characterized with the PWQ methodology, some questions remain: How can we demonstrate that the measured process window is large enough to avoid design-based defects which will impact the device yield? Can we monitor the systematic yield loss on nominal wafers? From the device test engineering point of view, systematic yield detractors are expected to be identified by Automated Test Pattern Generator (ATPG) test result diagnostics performed after electrical wafer sort (EWS). Test diagnostics can identify failed nets or cells causing systematic yield loss [1],[2]. Convergence from device failed nets and cells to failed manufacturing design patterns is usually based on assumptions that should be confirmed by electrical failure analysis (EFA). However, many EFA investigations are required before the design pattern failures are found, and thus design pattern failure identification was costly in time and resources. With this situation, an opportunity exists to share knowledge between the device test engineering and manufacturing environments to help with device yield improvement. This paper presents a new yield diagnostics flow dedicated to correlating critical design patterns detected within the manufacturing environment with the observed device yield loss. The results obtained with this new flow on a 28nm technology device are described, with the defects of interest and the device yield impact for each design pattern. The EFA done to validate the design

  5. A methodology for designing aircraft to low sonic boom constraints

    NASA Technical Reports Server (NTRS)

    Mack, Robert J.; Needleman, Kathy E.

    1991-01-01

    A method for designing conceptual supersonic cruise aircraft to meet low sonic boom requirements is outlined and described. The aircraft design is guided through a systematic evolution from initial three view drawing to a final numerical model description, while the designer using the method controls the integration of low sonic boom, high supersonic aerodynamic efficiency, adequate low speed handling, and reasonable structure and materials technologies. Some experience in preliminary aircraft design and in the use of various analytical and numerical codes is required for integrating the volume and lift requirements throughout the design process.

  6. A design methodology for nonlinear systems containing parameter uncertainty: Application to nonlinear controller design

    NASA Technical Reports Server (NTRS)

    Young, G.

    1982-01-01

    A design methodology capable of dealing with nonlinear systems, such as a controlled ecological life support system (CELSS), containing parameter uncertainty is discussed. The methodology was applied to the design of discrete time nonlinear controllers. The nonlinear controllers can be used to control either linear or nonlinear systems. Several controller strategies are presented to illustrate the design procedure.

  7. Enhancing Instructional Design Efficiency: Methodologies Employed by Instructional Designers

    ERIC Educational Resources Information Center

    Roytek, Margaret A.

    2010-01-01

    Instructional systems design (ISD) has been frequently criticised as taking too long to implement, calling for a reduction in cycle time--the time that elapses between project initiation and delivery. While instructional design research has historically focused on increasing "learner" efficiencies, the study of what instructional designers do to…

  8. Prognostics and health management design for rotary machinery systems—Reviews, methodology and applications

    NASA Astrophysics Data System (ADS)

    Lee, Jay; Wu, Fangji; Zhao, Wenyu; Ghaffari, Masoud; Liao, Linxia; Siegel, David

    2014-01-01

    Much research has been conducted in prognostics and health management (PHM), an emerging field in mechanical engineering that is gaining interest from both academia and industry. Most of these efforts have been in the area of machinery PHM, resulting in the development of many algorithms for this particular application. The majority of these algorithms concentrate on applications involving common rotary machinery components, such as bearings and gears. Knowledge of this prior work is a necessity for any future research efforts to be conducted; however, there has not been a comprehensive overview that details previous and on-going efforts in PHM. In addition, a systematic method for developing and deploying a PHM system has yet to be established. Such a method would enable rapid customization and integration of PHM systems for diverse applications. To address these gaps, this paper provides a comprehensive review of the PHM field, followed by an introduction of a systematic PHM design methodology, 5S methodology, for converting data to prognostics information. This methodology includes procedures for identifying critical components, as well as tools for selecting the most appropriate algorithms for specific applications. Visualization tools are presented for displaying prognostics information in an appropriate fashion for quick and accurate decision making. Industrial case studies are included in this paper to show how this methodology can help in the design of an effective PHM system.
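
    As a concrete instance of the algorithm-selection step above, condition indicators such as RMS and kurtosis are commonly extracted from vibration signals of bearings and gears. The sketch below illustrates this on a synthetic signal; it is an invented example, not part of the 5S methodology itself.

```python
# Illustrative extraction of two common rotary-machinery condition
# indicators (RMS and kurtosis). The signal is synthetic: Gaussian noise
# plus periodic impacts, roughly mimicking an incipient bearing fault.
import math
import random

random.seed(0)
signal = [random.gauss(0.0, 1.0) for _ in range(1000)]
for i in range(0, 1000, 100):
    signal[i] += 5.0  # periodic impact

n = len(signal)
mean = sum(signal) / n
rms = math.sqrt(sum(x * x for x in signal) / n)
var = sum((x - mean) ** 2 for x in signal) / n
kurtosis = sum((x - mean) ** 4 for x in signal) / (n * var ** 2)

print(f"RMS = {rms:.3f}, kurtosis = {kurtosis:.2f}")
# Kurtosis well above 3 (the Gaussian value) flags impulsive content,
# a classic early symptom of rolling-element bearing damage.
```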

  9. External validation of multivariable prediction models: a systematic review of methodological conduct and reporting

    PubMed Central

    2014-01-01

    Background Before considering whether to use a multivariable (diagnostic or prognostic) prediction model, it is essential that its performance be evaluated in data that were not used to develop the model (referred to as external validation). We critically appraised the methodological conduct and reporting of external validation studies of multivariable prediction models. Methods We conducted a systematic review of articles describing some form of external validation of one or more multivariable prediction models indexed in PubMed core clinical journals published in 2010. Study data were extracted in duplicate on design, sample size, handling of missing data, reference to the original study developing the prediction models and predictive performance measures. Results 11,826 articles were identified and 78 were included for full review, which described the evaluation of 120 prediction models in participant data that were not used to develop the model. Thirty-three articles described both the development of a prediction model and an evaluation of its performance on a separate dataset, and 45 articles described only the evaluation of an existing published prediction model on another dataset. Fifty-seven percent of the prediction models were presented and evaluated as simplified scoring systems. Sixteen percent of articles failed to report the number of outcome events in the validation datasets. Fifty-four percent of studies made no explicit mention of missing data. Sixty-seven percent did not report evaluating model calibration whilst most studies evaluated model discrimination. It was often unclear whether the reported performance measures were for the full regression model or for the simplified models. Conclusions The vast majority of studies describing some form of external validation of a multivariable prediction model were poorly reported with key details frequently not presented. The validation studies were characterised by poor design, inappropriate handling
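
    Two of the performance measures discussed, discrimination (the c-statistic) and calibration-in-the-large, can be computed directly; a minimal pure-Python sketch on hypothetical validation data:

```python
# Discrimination via the c-statistic (equivalent to the ROC AUC for a
# binary outcome) and calibration-in-the-large (observed event rate vs
# mean predicted risk). Predictions and outcomes below are invented.

def c_statistic(pred, outcome):
    """Probability a random event case receives a higher prediction
    than a random non-case, counting ties as one half."""
    events = [p for p, y in zip(pred, outcome) if y == 1]
    nonevents = [p for p, y in zip(pred, outcome) if y == 0]
    pairs = concordant = 0
    for e in events:
        for ne in nonevents:
            pairs += 1
            if e > ne:
                concordant += 1
            elif e == ne:
                concordant += 0.5
    return concordant / pairs

pred = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]   # predicted risks in new data
outcome = [1, 1, 0, 1, 0, 0]            # observed binary outcomes

auc = c_statistic(pred, outcome)
mean_pred = sum(pred) / len(pred)
observed = sum(outcome) / len(outcome)
print(f"c-statistic = {auc:.3f}")
print(f"calibration-in-the-large: observed {observed:.2f} vs predicted {mean_pred:.2f}")
```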

  10. Implementation of Probabilistic Design Methodology at Tennessee State University

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chinyere

    1996-01-01

    Engineering Design is one of the most important areas in engineering education. Deterministic Design Methodology (DDM) is the only design method taught in most engineering schools. This method does not give a direct account of uncertainties in design parameters. Hence, it is impossible to quantify the uncertainties in the response, and the actual safety margin remains unknown. The desire for a design methodology that can identify the primitive (random) variables that affect the structural behavior has led to a growing interest in Probabilistic Design Methodology (PDM). This method is gaining more recognition in industry than in educational institutions. Some of the reasons for the limited use of PDM at the moment are that many are unaware of its potential, and most of the software developed for PDM is very recent. The central goal of the PDM project at Tennessee State University is to introduce engineering students to the method. The students participating in the project learn about PDM and the computer codes that are available to the design engineer. The software being used for this project is NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), developed under the NASA probabilistic structural analysis program. NESSUS has three different modules, which make it a very comprehensive computer code for PDM. Research in technology transfer through course offerings in PDM is in effect at Tennessee State University. The aim is to familiarize students with the problem of uncertainties in engineering design. Included in the paper are some projects on PDM carried out by students and faculty. The areas to which this method is currently being applied include: Design of Gears (spur and worm); Design of Shafts; Design of Statistically Indeterminate Frame Structures; Design of Helical Springs; and Design of Shock Absorbers. Some of the current results of these projects are presented.
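
    The core PDM idea, quantifying the safety margin when design parameters are random, can be conveyed with a minimal stress-strength reliability sketch. The distributions below are hypothetical and unrelated to the NESSUS student projects.

```python
# Monte Carlo estimate of the probability of failure when both the
# applied stress and the material strength are random variables.
# All distribution parameters are invented for illustration.
import random

random.seed(42)

N = 100_000
failures = 0
for _ in range(N):
    stress = random.gauss(300.0, 30.0)    # MPa, applied stress
    strength = random.gauss(400.0, 40.0)  # MPa, material strength
    if stress > strength:
        failures += 1

p_fail = failures / N
print(f"estimated probability of failure: {p_fail:.4f}")
# A deterministic safety factor of 400/300 ~ 1.33 would hide this
# residual risk entirely, which is the motivation for PDM.
```

    For these normal distributions the margin strength - stress is N(100, 50) MPa, so the exact failure probability is about 0.023, which the simulation should reproduce.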

  11. A bio-inspired EAP actuator design methodology

    NASA Astrophysics Data System (ADS)

    Fernandez, Diego; Moreno, Luis; Baselga, Juan

    2005-05-01

    Current EAP actuator sheets or fibers perform reasonably well in the centimeter and mN range, but are not practical for larger force and deformation requirements. In order to make EAP actuator technology scalable, a design methodology for polymer actuators is required. Design variables, optimization formulas and a general architecture are required, as is usual in electromagnetic or hydraulic actuator design. This will allow the development of large EAP actuators specifically designed for a particular application. It will also help to enhance the final performance of the EAP material. This approach is not new; it is found in Nature. Skeletal muscle architecture has a profound influence on muscle force-generating properties and functionality. Based on the existing literature on skeletal muscle biomechanics, Nature's design philosophy is inferred. Formulas and curves employed by Nature in the design of muscles are presented. Design units such as the fiber, tendon, aponeurosis, and motor unit are compared with the equivalent design units to be taken into account in the design of EAP actuators. Finally, a complete methodology for the design of actuators based on multiple EAP fibers is proposed. In addition, the procedure gives an idea of the parameters that must be clearly modeled and characterized at the EAP material level.

  12. SYSTEMATIC PROCEDURE FOR DESIGNING PROCESSES WITH MULTIPLE ENVIRONMENTAL OBJECTIVES

    EPA Science Inventory

    Evaluation of multiple objectives is very important in designing environmentally benign processes. It requires a systematic procedure for solving multiobjective decision-making problems, due to the complex nature of the problems, the need for complex assessments, and complicated ...

  13. Extensibility of a linear rapid robust design methodology

    NASA Astrophysics Data System (ADS)

    Steinfeldt, Bradley A.; Braun, Robert D.

    2016-05-01

    The extensibility of a linear rapid robust design methodology is examined. This analysis is approached from a computational cost and accuracy perspective. The sensitivity of the solution's computational cost is examined by analysing effects such as the number of design variables, nonlinearity of the CAs, and nonlinearity of the response, in addition to several potential complexity metrics. Relative to traditional robust design methods, the linear rapid robust design methodology scaled better with the size of the problem, and its performance exceeded that of the traditional techniques examined. The accuracy of applying a method with linear fundamentals to nonlinear problems was also examined. It is observed that if the magnitude of the nonlinearity is less than 1000 times that of the nominal linear response, the error associated with applying successive linearization results in response errors of less than 10% relative to the full nonlinear solution.

  14. Viability, Advantages and Design Methodologies of M-Learning Delivery

    ERIC Educational Resources Information Center

    Zabel, Todd W.

    2010-01-01

    The purpose of this study was to examine the viability and principle design methodologies of Mobile Learning models in developing regions. Demographic and market studies were utilized to determine the viability of M-Learning delivery as well as best uses for such technologies and methods given socioeconomic and political conditions within the…

  15. Chicken or Egg? Communicative Methodology or Communicative Syllabus Design.

    ERIC Educational Resources Information Center

    Yalden, Janice

    A consensus has emerged on many issues in communicative language teaching, but one question that needs attention is the question of what ought to constitute the appropriate starting point in the design and implementation of a second language program. Two positions to consider are the following: first, the development of communicative methodology,…

  16. Tornado missile simulation and design methodology. Volume 1: simulation methodology, design applications, and TORMIS computer code. Final report

    SciTech Connect

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and risk has been assessed for a hypothetical nuclear power plant design case study.
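
    The event-sequencing Monte Carlo idea can be sketched as a chain of conditional random events simulated year by year. All probabilities and the debris model below are invented placeholders, not TORMIS data or models.

```python
# Toy Monte Carlo event chain in the spirit of the simulation methodology
# described above: tornado occurrence -> missile injection -> transport ->
# impact/damage. Every probability here is an invented placeholder.
import random

random.seed(1)

def one_year():
    """Simulate one year: does a damaging missile impact occur?"""
    if random.random() > 1e-3:              # no tornado strike near the plant
        return False
    n_missiles = random.randint(0, 50)      # debris injected by the tornado
    for _ in range(n_missiles):
        hits_target = random.random() < 0.01   # transport brings it to the target
        damages = random.random() < 0.1        # impact exceeds damage threshold
        if hits_target and damages:
            return True
    return False

years = 1_000_000
events = sum(one_year() for _ in range(years))
print(f"estimated annual damage probability ~ {events / years:.2e}")
```

    Chaining the conditional models this way, rather than multiplying point probabilities, is what lets the time-history simulation capture correlations between events, which is the design choice the methodology above emphasizes.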

  17. A Practical Methodology for Quantifying Random and Systematic Components of Unexplained Variance in a Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Deloach, Richard; Obara, Clifford J.; Goodman, Wesley L.

    2012-01-01

    This paper documents a check standard wind tunnel test conducted in the Langley 0.3-Meter Transonic Cryogenic Tunnel (0.3M TCT) that was designed and analyzed using the Modern Design of Experiments (MDOE). The test was designed to partition the unexplained variance of typical wind tunnel data samples into two constituent components, one attributable to ordinary random error, and one attributable to systematic error induced by covariate effects. Covariate effects in wind tunnel testing are discussed, with examples. The impact of systematic (non-random) unexplained variance on the statistical independence of sequential measurements is reviewed. The corresponding correlation among experimental errors is discussed, as is the impact of such correlation on experimental results generally. The specific experiment documented herein was organized as a formal test for the presence of unexplained variance in representative samples of wind tunnel data, in order to quantify the frequency with which such systematic error was detected, and its magnitude relative to ordinary random error. Levels of systematic and random error reported here are representative of those quantified in other facilities, as cited in the references.
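
    One simple way to perform the partition described above is a one-way random-effects decomposition of replicated measurements grouped into blocks (e.g. separate tunnel entries): within-block scatter estimates random error, while the excess between-block spread reflects systematic covariate effects. The numbers below are invented, not 0.3M TCT data.

```python
# Partition unexplained variance into random and systematic components
# from replicated measurements in separate blocks. Data are illustrative
# drag-coefficient replicates; the second block is shifted to mimic a
# covariate effect between blocks.

blocks = [
    [0.0251, 0.0249, 0.0252, 0.0250],
    [0.0259, 0.0261, 0.0260, 0.0258],
    [0.0243, 0.0241, 0.0244, 0.0242],
]

def mean(xs):
    return sum(xs) / len(xs)

grand = mean([x for b in blocks for x in b])
n = len(blocks[0])

# Within-block variance estimates ordinary random error.
within = mean([sum((x - mean(b)) ** 2 for x in b) / (len(b) - 1) for b in blocks])
# Between-block mean square; its excess over 'within' reflects systematic error.
between = n * sum((mean(b) - grand) ** 2 for b in blocks) / (len(blocks) - 1)
systematic = max(0.0, (between - within) / n)

print(f"random variance     ~ {within:.3e}")
print(f"systematic variance ~ {systematic:.3e}")
```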

  18. Systematic review of foodborne burden of disease studies: quality assessment of data and methodology.

    PubMed

    Haagsma, Juanita A; Polinder, Suzanne; Stein, Claudia E; Havelaar, Arie H

    2013-08-16

    Burden of disease (BoD) studies aim to identify the public health impact of different health problems and risk factors. To assess BoD, detailed knowledge is needed on epidemiology, disability and mortality in the population under study. This is particularly challenging for foodborne disease, because of the multitude of causative agents and their health effects. The purpose of this study is to systematically review the methodology of foodborne BoD studies. Three key questions were addressed: 1) which data sources and approaches were used to assess mortality, morbidity and disability?, 2) which methodological choices were made to calculate Disability Adjusted Life Years (DALY), and 3) were uncertainty analyses performed and if so, how? Studies (1990-June 2012) in international peer-reviewed journals and grey literature were identified, with the main inclusion criterion being that the study assessed disability adjusted life years related to foodborne disease. Twenty-four studies met our inclusion criteria. To assess the incidence or prevalence of foodborne disease in the population, four approaches could be distinguished, each using a different data source as a starting point, namely 1) laboratory-confirmed cases, 2) cohort or cross-sectional data, 3) syndrome surveillance data and 4) exposure data. Considerable variation existed in BoD methodology (e.g. disability weights, discounting, age-weighting). Almost all studies analyzed the effect of uncertainty as a result of possible imprecision in the parameter values. Awareness of epidemiological and methodological rigor across foodborne BoD studies using the DALY approach is a critical priority for advancing burden of disease studies. Harmonization of the methodology and modeling techniques used, together with high-quality data, can improve the detection of real variation in DALY outcomes between pathogens, between populations or over time. This harmonization can be achieved by identifying substantial data gaps and uncertainty and
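
    The DALY arithmetic whose methodological choices (disability weights, discounting) the review compares can be sketched directly; all parameter values below are hypothetical, not drawn from any of the reviewed studies.

```python
# DALY = YLL + YLD, optionally with continuous time discounting at rate r.
# A hypothetical foodborne pathogen: many mild cases, a few deaths.
import math

def yld(cases, disability_weight, duration_years, r=0.0):
    """Years Lived with Disability."""
    if r == 0.0:
        return cases * disability_weight * duration_years
    return cases * disability_weight * (1 - math.exp(-r * duration_years)) / r

def yll(deaths, life_expectancy_years, r=0.0):
    """Years of Life Lost due to premature mortality."""
    if r == 0.0:
        return deaths * life_expectancy_years
    return deaths * (1 - math.exp(-r * life_expectancy_years)) / r

daly_undisc = yld(10_000, 0.1, 0.02) + yll(5, 40)
daly_disc = yld(10_000, 0.1, 0.02, r=0.03) + yll(5, 40, r=0.03)
print(f"DALYs, no discounting: {daly_undisc:.1f}")
print(f"DALYs, 3% discounting: {daly_disc:.1f}")
```

    Even this toy example shows why harmonization matters: the discounting choice alone changes the estimate substantially, mostly through the mortality term.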

  19. FOREWORD: Computational methodologies for designing materials Computational methodologies for designing materials

    NASA Astrophysics Data System (ADS)

    Rahman, Talat S.

    2009-02-01

    It would be fair to say that in the past few decades, theory and computer modeling have played a major role in elucidating the microscopic factors that dictate the properties of functional novel materials. Together with advances in experimental techniques, theoretical methods are becoming increasingly capable of predicting properties of materials at different length scales, thereby bringing within sight the long-sought goal of designing material properties according to need. Advances in computer technology and their availability at a reasonable cost around the world have made it all the more urgent to disseminate what is now known about these modern computational techniques. In this special issue on computational methodologies for materials by design we have tried to solicit articles from authors whose works collectively represent the microcosm of developments in the area. This turned out to be a difficult task for a variety of reasons, not the least of which is space limitation in this special issue. Nevertheless, we gathered twenty articles that represent some of the important directions in which theory and modeling are proceeding in the general effort to capture the ability to produce materials by design. The majority of papers presented here focus on technique developments that are expected to uncover further the fundamental processes responsible for material properties, and for their growth modes and morphological evolutions. As for material properties, some of the articles here address the challenges that continue to emerge from attempts at accurate descriptions of magnetic properties, of electronically excited states, and of sparse matter, all of which demand new looks at density functional theory (DFT). I should hasten to add that much of the success in accurate computational modeling of materials emanates from the remarkable predictive power of DFT, without which we would not be able to place the subject on firm theoretical grounds. As we know and will also

  20. Variance estimation for systematic designs in spatial surveys.

    PubMed

    Fewster, R M

    2011-12-01

    In spatial surveys for estimating the density of objects in a survey region, systematic designs will generally yield lower variance than random designs. However, estimating the systematic variance is well known to be a difficult problem. Existing methods tend to overestimate the variance, so although the variance is genuinely reduced, it is over-reported, and the gain from the more efficient design is lost. The current approaches to estimating a systematic variance for spatial surveys are to approximate the systematic design by a random design, or approximate it by a stratified design. Previous work has shown that approximation by a random design can perform very poorly, while approximation by a stratified design is an improvement but can still be severely biased in some situations. We develop a new estimator based on modeling the encounter process over space. The new "striplet" estimator has negligible bias and excellent precision in a wide range of simulation scenarios, including strip-sampling, distance-sampling, and quadrat-sampling surveys, and including populations that are highly trended or have strong aggregation of objects. We apply the new estimator to survey data for the spotted hyena (Crocuta crocuta) in the Serengeti National Park, Tanzania, and find that the reported coefficient of variation for estimated density is 20% using approximation by a random design, 17% using approximation by a stratified design, and 11% using the new striplet estimator. This large reduction in reported variance is verified by simulation.
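
    A toy simulation makes the abstract's central point concrete: when a systematic sample of a spatially trended population is analysed as if it were a simple random sample, the reported variance can exceed the true sampling variance many times over. Everything below (population, trend, sample sizes) is invented for illustration and is not Fewster's striplet estimator.

```python
import random
import statistics

# Hypothetical transect of 1000 units whose object counts trend upward in space.
random.seed(0)
population = [i * 0.1 + random.gauss(0, 1) for i in range(1000)]
k = 10  # systematic sample: every 100th unit

def systematic_sample(start):
    return [population[(start + j * 100) % 1000] for j in range(k)]

# True sampling variance of the systematic mean, over all 100 possible starts.
means = [statistics.mean(systematic_sample(s)) for s in range(100)]
true_var = statistics.pvariance(means)

# "Approximate by a random design": the usual s^2/n from one systematic sample.
sample = systematic_sample(0)
reported_var = statistics.variance(sample) / k

print(true_var < reported_var)  # the random-design approximation over-reports
```

    Because each systematic sample spans the whole trend, its mean is stable across starting points, while the within-sample spread (inflated by the trend) drives the random-design estimate far above the truth.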

  1. Single-Case Experimental Designs: A Systematic Review of Published Research and Current Standards

    PubMed Central

    Smith, Justin D.

    2013-01-01

    This article systematically reviews the research design and methodological characteristics of single-case experimental design (SCED) research published in peer-reviewed journals between 2000 and 2010. SCEDs provide researchers with a flexible and viable alternative to group designs with large sample sizes. However, methodological challenges have precluded widespread implementation and acceptance of the SCED as a viable complementary methodology to the predominant group design. This article includes a description of the research design, measurement, and analysis domains distinctive to the SCED; a discussion of the results within the framework of contemporary standards and guidelines in the field; and a presentation of updated benchmarks for key characteristics (e.g., baseline sampling, method of analysis), and overall, it provides researchers and reviewers with a resource for conducting and evaluating SCED research. The results of the systematic review of 409 studies suggest that recently published SCED research is largely in accordance with contemporary criteria for experimental quality. Analytic method emerged as an area of discord. Comparison of the findings of this review with historical estimates of the use of statistical analysis indicates an upward trend, but visual analysis remains the most common analytic method and also garners the most support amongst those entities providing SCED standards. Although consensus exists along key dimensions of single-case research design and researchers appear to be practicing within these parameters, there remains a need for further evaluation of assessment and sampling techniques and data analytic methods. PMID:22845874

  2. A bioclimatic design methodology for urban outdoor spaces

    NASA Astrophysics Data System (ADS)

    Swaid, H.; Bar-El, M.; Hoffman, M. E.

    1993-03-01

    The development of a bioclimatic urban design methodology is described. The cluster thermal time constant (CTTC) model for predicting street-level urban air temperature variations is coupled with the wind-profile power law and the index of thermal stress (ITS) for human comfort. The CTTC model and the power law produce the diurnal air temperature and wind speed variations in various canyonlike urban forms. The thermal comfort requirements for lightly-dressed, moderately-walking/seated persons in the outdoor space in summer are then obtained using the ITS model. The proposed methodology enables a first-order assessment of the climatic implications of different features of the physical structure of the city such as street orientation, canyon height-to-width ratio, building density, and street shading. The application of the proposed methodology is demonstrated for Tel Aviv.
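
    The wind-profile power law used alongside the CTTC model can be sketched in a few lines: u(z) = u_ref * (z / z_ref) ** alpha. The reference height, reference speed and exponent below (0.28, a value often quoted for rough urban terrain) are assumptions for illustration, not values from the paper.

```python
def wind_speed(z, u_ref=5.0, z_ref=10.0, alpha=0.28):
    """Wind speed (m/s) at height z (m), scaled from a reference measurement
    via the power law u(z) = u_ref * (z / z_ref) ** alpha."""
    return u_ref * (z / z_ref) ** alpha

# Street-level (2 m) speed estimated from a 10 m reference of 5 m/s.
print(round(wind_speed(2.0), 2))
```

    Street-level speeds like this feed the comfort (ITS) calculation together with the CTTC air temperatures.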

  3. A robust optimization methodology for preliminary aircraft design

    NASA Astrophysics Data System (ADS)

    Prigent, S.; Maréchal, P.; Rondepierre, A.; Druot, T.; Belleville, M.

    2016-05-01

    This article focuses on a robust optimization of an aircraft preliminary design under operational constraints. According to engineers' know-how, the aircraft preliminary design problem can be modelled as an uncertain optimization problem whose objective (the cost or the fuel consumption) is almost affine, and whose constraints are convex. It is shown that this uncertain optimization problem can be approximated in a conservative manner by an uncertain linear optimization program, which enables the use of the techniques of robust linear programming of Ben-Tal, El Ghaoui, and Nemirovski [Robust Optimization, Princeton University Press, 2009]. This methodology is then applied to two real cases of aircraft design and numerical results are presented.
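
    A minimal sketch of the robust-counterpart idea the article borrows from Ben-Tal, El Ghaoui and Nemirovski: a linear constraint whose coefficients are only known to lie in boxes is replaced by its worst case, which for nonnegative design variables is again linear. All coefficients below are invented for illustration.

```python
# Constraint a.x <= b with each a_i known only to lie in [a_i - d_i, a_i + d_i].
a_nom = [2.0, 3.0]   # nominal coefficients
delta = [0.2, 0.5]   # uncertainty half-widths
b = 10.0

def robust_feasible(x):
    """True iff a.x <= b holds for every realisation of a in the boxes.
    For x >= 0 the worst case is attained at a_i + d_i, so the robust
    counterpart is itself a linear constraint."""
    assert all(xi >= 0 for xi in x)
    worst = sum((ai + di) * xi for ai, di, xi in zip(a_nom, delta, x))
    return worst <= b

print(robust_feasible([1.0, 2.0]))  # worst case 2.2 + 7.0  = 9.2   -> True
print(robust_feasible([1.0, 2.3]))  # worst case 2.2 + 8.05 = 10.25 -> False
```

    Replacing every uncertain constraint this way yields the robust linear program that the article's approximation makes tractable.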

  4. Aerodynamic configuration design using response surface methodology analysis

    NASA Astrophysics Data System (ADS)

    Engelund, Walter C.; Stanley, Douglas O.; Lepsch, Roger A.; McMillin, Mark M.; Unal, Resit

    1993-08-01

    An investigation has been conducted to determine a set of optimal design parameters for a single-stage-to-orbit reentry vehicle. Several configuration geometry parameters which had a large impact on the entry vehicle flying characteristics were selected as design variables: the fuselage fineness ratio, the nose to body length ratio, the nose camber value, the wing planform area scale factor, and the wing location. The optimal geometry parameter values were chosen using a response surface methodology (RSM) technique which allowed for a minimum dry weight configuration design that met a set of aerodynamic performance constraints on the landing speed, and on the subsonic, supersonic, and hypersonic trim and stability levels. The RSM technique utilized, specifically the central composite design method, is presented, along with the general vehicle conceptual design process. Results are presented for an optimized configuration along with several design trade cases.
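
    The core RSM loop can be illustrated with a one-variable surrogate: sample an expensive analysis at a few design points, fit a quadratic response surface, and optimise the surface instead of the analysis. The toy weight function and design points below are assumptions, not the study's vehicle model.

```python
def expensive_weight(x):          # stand-in for a full vehicle analysis
    return (x - 2.0) ** 2 + 3.0   # true optimum at x = 2

xs = [0.0, 1.0, 3.0]              # a small "design of experiments"
ys = [expensive_weight(x) for x in xs]

# Fit y = c0 + c1*x + c2*x^2 exactly through the three points
# (divided differences keep the sketch dependency-free).
x0, x1, x2 = xs
y0, y1, y2 = ys
d1 = (y1 - y0) / (x1 - x0)
d2 = ((y2 - y1) / (x2 - x1) - d1) / (x2 - x0)
c2 = d2
c1 = d1 - d2 * (x0 + x1)
x_opt = -c1 / (2 * c2)            # minimiser of the quadratic surrogate
print(x_opt)
```

    In the study the same idea runs in several dimensions, with a central composite design supplying the sample points and aerodynamic constraints bounding the search.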

  5. Aerodynamic configuration design using response surface methodology analysis

    NASA Technical Reports Server (NTRS)

    Engelund, Walter C.; Stanley, Douglas O.; Lepsch, Roger A.; Mcmillin, Mark M.; Unal, Resit

    1993-01-01

    An investigation has been conducted to determine a set of optimal design parameters for a single-stage-to-orbit reentry vehicle. Several configuration geometry parameters which had a large impact on the entry vehicle flying characteristics were selected as design variables: the fuselage fineness ratio, the nose to body length ratio, the nose camber value, the wing planform area scale factor, and the wing location. The optimal geometry parameter values were chosen using a response surface methodology (RSM) technique which allowed for a minimum dry weight configuration design that met a set of aerodynamic performance constraints on the landing speed, and on the subsonic, supersonic, and hypersonic trim and stability levels. The RSM technique utilized, specifically the central composite design method, is presented, along with the general vehicle conceptual design process. Results are presented for an optimized configuration along with several design trade cases.

  6. ProSAR: a new methodology for combinatorial library design.

    PubMed

    Chen, Hongming; Börjesson, Ulf; Engkvist, Ola; Kogej, Thierry; Svensson, Mats A; Blomberg, Niklas; Weigelt, Dirk; Burrows, Jeremy N; Lange, Tim

    2009-03-01

    A method is introduced for performing reagent selection for chemical library design based on topological (2D) pharmacophore fingerprints. Optimal reagent selection is achieved by optimizing the Shannon entropy of the 2D pharmacophore distribution for the reagent set. The method, termed ProSAR, is therefore expected to enumerate compounds that could serve as a good starting point for deriving a structure activity relationship (SAR) in combinatorial library design. This methodology is exemplified by library design examples where the active compounds were already known. The results show that most of the pharmacophores on the substituents for the active compounds are covered by the designed library. This strategy is further expanded to include product property profiles for aqueous solubility, hERG risk assessment, etc. in the optimization process so that the reagent pharmacophore diversity and the product property profile are optimized simultaneously via a genetic algorithm. This strategy is applied to a two-dimensional library design example and compared with libraries designed by a diversity based strategy which minimizes the average ensemble Tanimoto similarity. Our results show that by using the ProSAR methodology, libraries can be designed with simultaneously good pharmacophore coverage and product property profile.
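
    The entropy score at the heart of the selection step is easy to sketch. The feature labelling below is hypothetical; the real method operates on topological 2D pharmacophore fingerprints, but the principle is the same: a flatter feature distribution scores higher, so maximising entropy favours diverse reagent sets.

```python
import math
from collections import Counter

def pharmacophore_entropy(features):
    """Shannon entropy (bits) of a feature distribution over a reagent set."""
    counts = Counter(features)
    n = len(features)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

diverse = ["donor", "acceptor", "aromatic", "cation"]    # four distinct features
redundant = ["donor", "donor", "donor", "acceptor"]
print(pharmacophore_entropy(diverse))     # 2.0 bits (uniform over four)
print(pharmacophore_entropy(redundant))   # ~0.81 bits
```

    A genetic algorithm can then maximise this score jointly with product-property terms, as the abstract describes.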

  7. Are we talking the same paradigm? Considering methodological choices in health education systematic review

    PubMed Central

    Gordon, Morris

    2016-01-01

    Abstract For the past two decades, there have been calls for medical education to become more evidence-based. Whilst previous works have described how to use such methods, there are no works discussing when or why to select different methods from either a conceptual or pragmatic perspective. This question is not to suggest the superiority of such methods, but that having a clear rationale to underpin such choices is key and should be communicated to the reader of such works. Our goal within this manuscript is to consider the philosophical alignment of these different review and synthesis modalities and how this impacts on their suitability to answer different systematic review questions within health education. The key characteristic of a systematic review that should impact the synthesis choice is discussed in detail. By clearly defining this and the related outcome expected from the review and for educators who will receive this outcome, the alignment will become apparent. This will then allow deployment of an appropriate methodology that is fit for purpose and will indeed justify the significant work needed to complete a systematic review. Key items discussed are the positivist synthesis methods meta-analysis and content analysis to address questions in the form of ‘whether and what’ education is effective. These can be juxtaposed with the constructivist aligned thematic analysis and meta-ethnography to address questions in the form of ‘why’. The concept of the realist review is also considered. It is proposed that authors of such work should describe their research alignment and the link between question, alignment and evidence synthesis method selected. The process of exploring the range of modalities and their alignment highlights gaps in the researcher’s arsenal. Future works are needed to explore the impact of such changes in writing from authors of medical education systematic review. PMID:27007488

  8. Are we talking the same paradigm? Considering methodological choices in health education systematic review.

    PubMed

    Gordon, Morris

    2016-07-01

    For the past two decades, there have been calls for medical education to become more evidence-based. Whilst previous works have described how to use such methods, there are no works discussing when or why to select different methods from either a conceptual or pragmatic perspective. This question is not to suggest the superiority of such methods, but that having a clear rationale to underpin such choices is key and should be communicated to the reader of such works. Our goal within this manuscript is to consider the philosophical alignment of these different review and synthesis modalities and how this impacts on their suitability to answer different systematic review questions within health education. The key characteristic of a systematic review that should impact the synthesis choice is discussed in detail. By clearly defining this and the related outcome expected from the review and for educators who will receive this outcome, the alignment will become apparent. This will then allow deployment of an appropriate methodology that is fit for purpose and will indeed justify the significant work needed to complete a systematic review. Key items discussed are the positivist synthesis methods meta-analysis and content analysis to address questions in the form of 'whether and what' education is effective. These can be juxtaposed with the constructivist aligned thematic analysis and meta-ethnography to address questions in the form of 'why'. The concept of the realist review is also considered. It is proposed that authors of such work should describe their research alignment and the link between question, alignment and evidence synthesis method selected. The process of exploring the range of modalities and their alignment highlights gaps in the researcher's arsenal. Future works are needed to explore the impact of such changes in writing from authors of medical education systematic review.

  9. An automated methodology development. [software design for combat simulation

    NASA Technical Reports Server (NTRS)

    Hawley, L. R.

    1985-01-01

    The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life cycle costs and ease the program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real-time, and one-to-one correlation between the object states and real world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance the code efficiency by, e.g., eliminating non-used subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.

  10. Design methodology of an automated scattering measurement facility

    NASA Astrophysics Data System (ADS)

    Mazur, D. G.

    1985-12-01

    This thesis addresses the design methodology surrounding an automated scattering measurement facility. A brief historical survey of radar cross-section (RCS) measurements is presented. The electromagnetic theory associated with a continuous wave (CW) background cancellation technique for measuring RCS is discussed as background. In addition, problems associated with interfacing test equipment, data storage and output are addressed. The facility used as a model for this thesis is located at the Air Force Institute of Technology, WPAFB, OH. The design methodology applies to any automated scattering measurement facility. A software package incorporating features that enhance the operation of AFIT's facility by students is presented. Finally, sample output from the software package illustrates formats for displaying RCS data.

  11. Development of a Design Methodology for Reconfigurable Flight Control Systems

    NASA Technical Reports Server (NTRS)

    Hess, Ronald A.; McLean, C.

    2000-01-01

    A methodology is presented for the design of flight control systems that exhibit stability and performance-robustness in the presence of actuator failures. The design is based upon two elements. The first element consists of a control law that will ensure at least stability in the presence of a class of actuator failures. This law is created by inner-loop, reduced-order, linear dynamic inversion, and outer-loop compensation based upon Quantitative Feedback Theory. The second element consists of adaptive compensators obtained from simple and approximate time-domain identification of the dynamics of the 'effective vehicle' with failed actuator(s). An example involving the lateral-directional control of a fighter aircraft is employed both to introduce the proposed methodology and to demonstrate its effectiveness and limitations.

  12. An integrated risk analysis methodology in a multidisciplinary design environment

    NASA Astrophysics Data System (ADS)

    Hampton, Katrina Renee

    Design of complex, one-of-a-kind systems, such as space transportation systems, is characterized by high uncertainty and, consequently, high risk. It is necessary to account for these uncertainties in the design process to produce systems that are more reliable. Systems designed by including uncertainties and managing them, as well, are more robust and less prone to poor operations as a result of parameter variability. The quantification, analysis and mitigation of uncertainties are challenging tasks as many systems lack historical data. In such an environment, risk or uncertainty quantification becomes subjective because input data is based on professional judgment. Additionally, there are uncertainties associated with the analysis tools and models. Both the input data and the model uncertainties must be considered for a multidisciplinary, systems-level risk analysis. This research synthesizes an integrated approach for developing a method for risk analysis. Expert judgment methodology is employed to quantify external risk. This methodology is then combined with a Latin Hypercube Sampling - Monte Carlo simulation to propagate uncertainties across a multidisciplinary environment for the overall system. Finally, a robust design strategy is employed to mitigate risk during the optimization process. This type of approach to risk analysis is conducive to the examination of quantitative risk factors. The core of this research methodology is the theoretical framework for uncertainty propagation. The research is divided into three stages or modules. The first two modules include the identification/quantification and propagation of uncertainties. The third module involves the management of uncertainties or response optimization. This final module also incorporates the integration of risk into program decision-making. The risk analysis methodology is applied to a launch vehicle conceptual design study at NASA Langley Research Center. The launch vehicle multidisciplinary
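
    The propagation step (the second module) can be sketched with a hand-rolled Latin Hypercube sampler pushed through a toy performance model. The stratified structure, not the model, is the point; the sampler, dimensions and coefficients below are all illustrative stand-ins.

```python
import random

def latin_hypercube(n, dims, rng):
    """n points in [0,1)^dims; each input's n equal strata are hit exactly once."""
    columns = []
    for _ in range(dims):
        column = [(i + rng.random()) / n for i in range(n)]  # one per stratum
        rng.shuffle(column)  # randomise the pairing across inputs
        columns.append(column)
    return list(zip(*columns))

rng = random.Random(42)
pts = latin_hypercube(10, 2, rng)

# Uncertainty propagation: push each sample through a toy system-response model.
responses = [3.0 * u + 2.0 * v for u, v in pts]
print(len(pts), len(responses))
```

    Compared with plain Monte Carlo, the stratification covers each uncertain input's range with far fewer runs, which matters when every "response" is an expensive multidisciplinary analysis.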

  13. Methodology used in comparative studies assessing programmes of transition from paediatrics to adult care programmes: a systematic review

    PubMed Central

    Le Roux, E; Mellerio, H; Guilmin-Crépon, S; Gottot, S; Jacquin, P; Boulkedid, R; Alberti, C

    2017-01-01

    Objective To explore the methodologies employed in studies assessing transition of care interventions, with the aim of defining goals for the improvement of future studies. Design Systematic review of comparative studies assessing transition to adult care interventions for young people with chronic conditions. Data sources MEDLINE, EMBASE, ClinicalTrial.gov. Eligibility criteria for selecting studies 2 reviewers screened comparative studies with experimental and quasi-experimental designs, published or registered before July 2015. Eligible studies evaluate transition interventions at least in part after transfer to adult care of young people with chronic conditions with at least one outcome assessed quantitatively. Results 39 studies were reviewed, 26/39 (67%) published their final results and 13/39 (33%) were in progress. In 9 studies (9/39, 23%) comparisons were made between preintervention and postintervention in a single group. Randomised control groups were used in 9/39 (23%) studies. 2 (2/39, 5%) reported blinding strategies. Use of validated questionnaires was reported in 28% (11/39) of studies. In terms of reporting in published studies 15/26 (58%) did not report age at transfer, and 6/26 (23%) did not report the time of collection of each outcome. Conclusions Few evaluative studies exist and their level of methodological quality is variable. The complexity of interventions, the multiplicity of outcomes, the difficulty of blinding and the small patient groups make it difficult to draw conclusions about the effectiveness of interventions. The evaluation of transition interventions requires an appropriate and common methodology which will provide access to a better level of evidence. We identified areas for improvement in terms of randomisation, recruitment and external validity, blinding, measurement validity, standardised assessment and reporting. Improvements will increase our capacity to determine effective interventions for transition care. PMID:28131998

  14. [Comments on methodological quality of systematic review/meta-analysis on acupuncture therapy in China].

    PubMed

    Xiong, Jun; Du, Yuan-hao

    2011-02-01

    Along with the development of evidence-based medicine, more and more systematic review/meta-analysis papers on acupuncture therapy have been published in China. Most researches have played an important part in guiding clinical study and practice on acupuncture. However, low quality researches may mislead the users. In the present paper, we analyze shortcomings of the published papers in China about systematic review/meta-analysis on acupuncture therapy in terms of methodological quality, according to the assessment tool Oxman-Guyatt Overview Quality Assessment Questionnaire (OQAQ). Moreover, we also analyze some possible factors affecting the quality of systematic review/meta-analysis and put forward a few measures for improving the quality of systematic reviews.

  15. Thin Film Heat Flux Sensors: Design and Methodology

    NASA Technical Reports Server (NTRS)

    Fralick, Gustave C.; Wrbanek, John D.

    2013-01-01

    Thin Film Heat Flux Sensors: Design and Methodology: (1) heat flux is one of a number of parameters, together with pressure, temperature, flow, etc., of interest to engine designers and fluid dynamicists; (2) the measurement of heat flux is of interest in directly determining the cooling requirements of hot-section blades and vanes; and (3) in addition, if the surface and gas temperatures are known, the measurement of heat flux provides a value for the convective heat transfer coefficient that can be compared with the value provided by CFD codes.
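
    The last point can be shown directly: with a measured heat flux and known surface and gas temperatures, the convective coefficient follows from Newton's law of cooling, h = q / (T_gas - T_surface). The numbers below are invented for illustration.

```python
def convective_coefficient(q_w_per_m2, t_gas_k, t_surface_k):
    """Convective heat transfer coefficient h, in W/(m^2 K),
    from Newton's law of cooling: q = h * (T_gas - T_surface)."""
    return q_w_per_m2 / (t_gas_k - t_surface_k)

# 500 kW/m^2 measured across a 500 K gas-to-surface temperature difference.
h = convective_coefficient(5.0e5, 1600.0, 1100.0)
print(h)  # 1000.0 W/(m^2 K)
```

    A value obtained this way from sensor data can be compared point-for-point with the coefficient a CFD code predicts at the same location.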

  16. When Playing Meets Learning: Methodological Framework for Designing Educational Games

    NASA Astrophysics Data System (ADS)

    Linek, Stephanie B.; Schwarz, Daniel; Bopp, Matthias; Albert, Dietrich

    Game-based learning builds upon the idea of using the motivational potential of video games in the educational context. Thus, the design of educational games has to address optimizing enjoyment as well as optimizing learning. Within the EC-project ELEKTRA a methodological framework for the conceptual design of educational games was developed. Thereby state-of-the-art psycho-pedagogical approaches were combined with insights of media-psychology as well as with best-practice game design. This science-based interdisciplinary approach was enriched by enclosed empirical research to answer open questions on educational game-design. Additionally, several evaluation-cycles were implemented to achieve further improvements. The psycho-pedagogical core of the methodology can be summarized by the ELEKTRA's 4Ms: Macroadaptivity, Microadaptivity, Metacognition, and Motivation. The conceptual framework is structured in eight phases which have several interconnections and feedback-cycles that enable a close interdisciplinary collaboration between game design, pedagogy, cognitive science and media psychology.

  17. Methodological developments in searching for studies for systematic reviews: past, present and future?

    PubMed Central

    2013-01-01

    The Cochrane Collaboration was established in 1993, following the opening of the UK Cochrane Centre in 1992, at a time when searching for studies for inclusion in systematic reviews was not well-developed. Review authors largely conducted their own searches or depended on medical librarians, who often possessed limited awareness and experience of systematic reviews. Guidance on the conduct and reporting of searches was limited. When work began to identify reports of randomized controlled trials (RCTs) for inclusion in Cochrane Reviews in 1992, there were only approximately 20,000 reports indexed as RCTs in MEDLINE and none indexed as RCTs in Embase. No search filters had been developed with the aim of identifying all RCTs in MEDLINE or other major databases. This presented The Cochrane Collaboration with a considerable challenge in identifying relevant studies. Over time, the number of studies indexed as RCTs in the major databases has grown considerably and the Cochrane Central Register of Controlled Trials (CENTRAL) has become the best single source of published controlled trials, with approximately 700,000 records, including records identified by the Collaboration from Embase and MEDLINE. Search filters for various study types, including systematic reviews and the Cochrane Highly Sensitive Search Strategies for RCTs, have been developed. There have been considerable advances in the evidence base for methodological aspects of information retrieval. The Cochrane Handbook for Systematic Reviews of Interventions now provides detailed guidance on the conduct and reporting of searches. Initiatives across The Cochrane Collaboration to improve the quality inter alia of information retrieval include: the recently introduced Methodological Expectations for Cochrane Intervention Reviews (MECIR) programme, which stipulates 'mandatory' and 'highly desirable' standards for various aspects of review conduct and reporting including searching, the development of Standard

  18. Methodological developments in searching for studies for systematic reviews: past, present and future?

    PubMed

    Lefebvre, Carol; Glanville, Julie; Wieland, L Susan; Coles, Bernadette; Weightman, Alison L

    2013-09-25

    The Cochrane Collaboration was established in 1993, following the opening of the UK Cochrane Centre in 1992, at a time when searching for studies for inclusion in systematic reviews was not well-developed. Review authors largely conducted their own searches or depended on medical librarians, who often possessed limited awareness and experience of systematic reviews. Guidance on the conduct and reporting of searches was limited. When work began to identify reports of randomized controlled trials (RCTs) for inclusion in Cochrane Reviews in 1992, there were only approximately 20,000 reports indexed as RCTs in MEDLINE and none indexed as RCTs in Embase. No search filters had been developed with the aim of identifying all RCTs in MEDLINE or other major databases. This presented The Cochrane Collaboration with a considerable challenge in identifying relevant studies. Over time, the number of studies indexed as RCTs in the major databases has grown considerably and the Cochrane Central Register of Controlled Trials (CENTRAL) has become the best single source of published controlled trials, with approximately 700,000 records, including records identified by the Collaboration from Embase and MEDLINE. Search filters for various study types, including systematic reviews and the Cochrane Highly Sensitive Search Strategies for RCTs, have been developed. There have been considerable advances in the evidence base for methodological aspects of information retrieval. The Cochrane Handbook for Systematic Reviews of Interventions now provides detailed guidance on the conduct and reporting of searches.
Initiatives across The Cochrane Collaboration to improve the quality inter alia of information retrieval include: the recently introduced Methodological Expectations for Cochrane Intervention Reviews (MECIR) programme, which stipulates 'mandatory' and 'highly desirable' standards for various aspects of review conduct and reporting including searching, the development of Standard Training

  19. Idiopathic pulmonary fibrosis - a systematic review on methodology for the collection of epidemiological data

    PubMed Central

    2013-01-01

    Background Recent studies suggest that the incidence of idiopathic pulmonary fibrosis (IPF) is rising. Accurate epidemiological data on IPF, however, are sparse and the results of previous studies are contradictory. This study was undertaken to gain insight into the various methods used in the epidemiological research of IPF, and to get accurate and comparable data on these different methodologies. Methods A systematic database search was performed in order to identify all epidemiological studies on IPF after the previous guidelines for diagnosis and treatment were published in 2000. Medline (via PubMed), Science Citation Index (via Web of Science) and Embase databases were searched for original epidemiological articles published in English in international peer-reviewed journals starting from 2001. After pre-screening and a full-text review, 13 articles were accepted for data abstraction. Results Three different methodologies of epidemiological studies were most commonly used, namely: 1) national registry databases, 2) questionnaire-based studies, and 3) analysis of the health care system’s own registry databases. The overall prevalence and incidence of IPF varied in these studies between 0.5–27.9/100,000 and 0.22–8.8/100,000, respectively. According to four studies the mortality and incidence of IPF are rising. Conclusions We conclude that there are numerous ways to execute epidemiological research in the field of IPF. This review offers the possibility to compare the different methodologies that have been used, and this information could form a basis for future studies investigating the prevalence and incidence of IPF. PMID:23962167

  20. Implementation of probabilistic design methodology at Tennessee State University

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chinyere

    1995-01-01

    The fact that the Deterministic Design Method no longer satisfies most design needs calls for methods that can keep pace with advancing technology. The advance in computer technology has reduced the rigors that normally accompany many design analysis methods that account for uncertainties in design parameters. Probabilistic Design Methodology (PDM) is beginning to make an impact in engineering design. This method is gaining more recognition in industries than in educational institutions. Some of the reasons for the limited use of PDM at the moment are that many are unaware of its potentials, and most of the software developed for PDM is very recent. The central goal of the PDM project at Tennessee State University is to introduce engineering students to this method. The students participating in the project learn about PDM and the computer codes that are available to the design engineer. The software being used for this project is NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), developed under the NASA probabilistic structural analysis program. NESSUS has three different modules which make it a very comprehensive computer code for PDM. Since this method is new to the students, its introduction into the engineering curriculum is to be in stages. These range from the introduction of PDM and its software to the applications. While this program is being developed for its eventual inclusion into the engineering curriculum, some graduate and undergraduate students are already carrying out some projects using this method. As the students are increasing their understanding of PDM, they are at the same time applying it to some common design problems. The areas to which this method is currently being applied include: Design of Gears (spur and worm); Design of Brakes; Design of Heat Exchangers; Design of Helical Springs; and Design of Shock Absorbers. Some of the current results of these projects are presented.
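
    A toy stress-strength calculation of the kind such student projects might run (a Monte Carlo stand-in, not NESSUS itself, with invented distributions) shows what PDM adds over a deterministic check: instead of a single safety factor, it yields an estimated probability of failure.

```python
import random

random.seed(1)
N = 100_000
failures = 0
for _ in range(N):
    stress = random.gauss(300.0, 30.0)    # MPa, hypothetical load scatter
    strength = random.gauss(450.0, 40.0)  # MPa, hypothetical material scatter
    if stress > strength:
        failures += 1
p_fail = failures / N
print(p_fail)  # analytically ~1.3e-3, since strength - stress ~ N(150, 50)
```

    A deterministic check (mean strength 450 > mean stress 300) would simply pass; the probabilistic view quantifies how often the scatter in both quantities overlaps.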

  1. Fuel cell cathode air filters: Methodologies for design and optimization

    NASA Astrophysics Data System (ADS)

    Kennedy, Daniel M.; Cahela, Donald R.; Zhu, Wenhua H.; Westrom, Kenneth C.; Nelms, R. Mark; Tatarchuk, Bruce J.

    Proton exchange membrane (PEM) fuel cells experience performance degradation, such as reduced efficiency and life, as a result of poisoning of platinum catalysts by airborne contaminants. Research on these contaminant effects suggests that the best available means of allowing fuel cells to operate in contaminated environments is filtration of the harmful contaminants from the cathode air. A cathode air filter design methodology was created that connects properties of the cathode air stream, filter design options, and filter footprint to a set of adsorptive filter parameters that must be optimized to operate the fuel cell efficiently. Filter optimization requires a study of the trade-off between two causal factors of power loss: first, a reduction in power production due to poisoning of the platinum catalyst by chemical contaminants, and second, an increase in power requirements to operate the air compressor against the larger pressure drop from additional contaminant filtration. The design methodology was successfully applied to a 1.2 kW fuel cell using a programmable algorithm, and predictions were made about the relationships between inlet concentration, breakthrough time, filter design, pressure drop, and compressor power requirements.
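
    The two opposing power-loss terms described above can be sketched as a one-dimensional optimization. The model below is a hypothetical toy: the 1/depth poisoning term, the linear pressure-drop term, and both coefficients are invented for illustration, whereas the article's methodology couples real adsorption-breakthrough and compressor models.

```python
def total_power_loss(depth_mm, k_poison=60.0, k_drop=0.8):
    """Hypothetical trade-off in watts: poisoning loss falls roughly as
    1/depth (more sorbent delays breakthrough), while compressor power
    rises linearly with bed depth through the added pressure drop."""
    poisoning_loss_w = k_poison / depth_mm
    compressor_loss_w = k_drop * depth_mm
    return poisoning_loss_w + compressor_loss_w

def best_depth(candidate_depths_mm):
    """Pick the candidate bed depth minimizing total power loss."""
    return min(candidate_depths_mm, key=total_power_loss)
```

    With these made-up coefficients the optimum falls near sqrt(60 / 0.8) ≈ 8.7 mm, illustrating how filter sizing balances catalyst protection against parasitic compressor load.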

  2. The design and methodology of premature ejaculation interventional studies.

    PubMed

    McMahon, Chris G

    2016-08-01

    Large, well-designed clinical efficacy and safety randomized clinical trials (RCTs) are required to achieve regulatory approval of new drug treatments. The objective of this article is to make recommendations for the criteria for defining and selecting the clinical trial study population, design, and efficacy outcome measures which comprise ideal premature ejaculation (PE) interventional trial methodology. Data on clinical trial design, epidemiology, definitions, dimensions, and psychological impact of PE were reviewed, critiqued, and incorporated into a series of recommendations for the standardisation of PE clinical trial design, outcome measures, and reporting using the principles of evidence-based medicine. Data from PE interventional studies are only reliable, interpretable, and capable of being generalised to patients with PE when study populations are defined by the International Society for Sexual Medicine (ISSM) multivariate definition of PE. PE intervention trials should employ a double-blind RCT methodology and include placebo control, active standard drug control, and/or dose comparison trials. Ejaculatory latency time (ELT) and subject/partner outcome measures of control, personal/partner/relationship distress, and other study-specific outcome measures should be used as outcome measures. There is currently no published literature which identifies a clinically significant threshold response to intervention. The ISSM definition of PE reflects the contemporary understanding of PE, represents the state-of-the-art multi-dimensional definition, and is recommended as the basis of diagnosis of PE for all PE clinical trials.

  3. The design and methodology of premature ejaculation interventional studies

    PubMed Central

    2016-01-01

    Large, well-designed clinical efficacy and safety randomized clinical trials (RCTs) are required to achieve regulatory approval of new drug treatments. The objective of this article is to make recommendations for the criteria for defining and selecting the clinical trial study population, design, and efficacy outcome measures which comprise ideal premature ejaculation (PE) interventional trial methodology. Data on clinical trial design, epidemiology, definitions, dimensions, and psychological impact of PE were reviewed, critiqued, and incorporated into a series of recommendations for the standardisation of PE clinical trial design, outcome measures, and reporting using the principles of evidence-based medicine. Data from PE interventional studies are only reliable, interpretable, and capable of being generalised to patients with PE when study populations are defined by the International Society for Sexual Medicine (ISSM) multivariate definition of PE. PE intervention trials should employ a double-blind RCT methodology and include placebo control, active standard drug control, and/or dose comparison trials. Ejaculatory latency time (ELT) and subject/partner outcome measures of control, personal/partner/relationship distress, and other study-specific outcome measures should be used as outcome measures. There is currently no published literature which identifies a clinically significant threshold response to intervention. The ISSM definition of PE reflects the contemporary understanding of PE, represents the state-of-the-art multi-dimensional definition, and is recommended as the basis of diagnosis of PE for all PE clinical trials. PMID:27652224

  4. Behavioral headache research: methodologic considerations and research design alternatives.

    PubMed

    Hursey, Karl G; Rains, Jeanetta C; Penzien, Donald B; Nash, Justin M; Nicholson, Robert A

    2005-05-01

    Behavioral headache treatments have garnered solid empirical support in recent years, but there is substantial opportunity to strengthen the next generation of studies with improved methods and consistency across studies. Recently, Guidelines for Trials of Behavioral Treatments for Recurrent Headache were published to facilitate the production of high-quality research. The present article complements the guidelines with a discussion of methodologic and research design considerations. Since no research design is applicable in every situation, selecting an appropriate research design is fundamental to producing meaningful results. Investigators in behavioral headache and other areas of research consider the developmental phase of the research, the principal objectives of the project, and the sources of error or alternative interpretations in selecting a design. Phases of clinical trials typically include pilot studies, efficacy studies, and effectiveness studies. These trials may be categorized as primarily pragmatic or explanatory. The most appropriate research designs for these different phases and objectives vary on such characteristics as sample size and assignment to condition, types of control conditions, periods or frequency of measurement, and the dimensions along which comparisons are made. A research design also must fit within constraints on available resources. A large number of potential research designs can be used, and considering these characteristics allows selection of an appropriate design.

  5. Systematic review of sensory integration therapy for individuals with disabilities: Single case design studies.

    PubMed

    Leong, H M; Carter, Mark; Stephenson, Jennifer

    2015-12-01

    Sensory integration therapy (SIT) is a controversial intervention that is widely used for people with disabilities. A systematic analysis was conducted of the outcomes of 17 single case design studies on sensory integration therapy for people with, or at risk of, a developmental or learning disability, disorder, or delay. An assessment of the methodological quality of the studies found most used weak designs and poor methodology, with a tendency for higher quality studies to produce negative results. Based on limited comparative evidence, functional analysis-based interventions for challenging behavior were more effective than SIT. Overall, the studies do not provide convincing evidence for the efficacy of sensory integration therapy. Given the findings of the present review and other recent analyses, it is advised that the use of SIT be limited to experimental contexts. Issues with the studies and possible improvements for future research are discussed, including the need to employ designs that allow for adequate demonstration of experimental control.

  6. Optimal input design for aircraft instrumentation systematic error estimation

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    1991-01-01

    A new technique for designing optimal flight test inputs for accurate estimation of instrumentation systematic errors was developed and demonstrated. A simulation model of the F-18 High Angle of Attack Research Vehicle (HARV) aircraft was used to evaluate the effectiveness of the optimal input compared to an input recorded during flight test. Instrumentation systematic error parameter estimates and their standard errors were compared. It was found that, for a fixed input duration, the optimal input design improved the error parameter estimates and their accuracies. Pilot acceptability of the optimal input design was demonstrated using a six degree-of-freedom fixed-base piloted simulation of the F-18 HARV. The technique described in this work provides a practical, optimal procedure for designing inputs for data compatibility experiments.
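
    As a much-simplified illustration of what a data-compatibility experiment estimates, the sketch below fits a constant bias and a scale-factor error for a single instrument by ordinary least squares, assuming a reference ("true") signal is available. This toy error model and fitting step are assumptions for illustration; the article's technique additionally designs the flight-test input itself to maximize estimation accuracy, which is not reproduced here.

```python
def fit_bias_and_scale(true_vals, measured_vals):
    """Least-squares fit of measured = scale * true + bias, a toy model of
    instrumentation systematic error (constant bias plus scale factor)."""
    n = len(true_vals)
    mean_x = sum(true_vals) / n
    mean_y = sum(measured_vals) / n
    sxx = sum((x - mean_x) ** 2 for x in true_vals)
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(true_vals, measured_vals))
    scale = sxy / sxx          # estimated scale-factor error
    bias = mean_y - scale * mean_x  # estimated constant bias
    return scale, bias
```

    The quality of such estimates depends on how much the reference signal varies during the maneuver, which is exactly why the input design matters.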

  7. Self-efficacy instruments for patients with chronic diseases suffer from methodological limitations - a systematic review

    PubMed Central

    Frei, Anja; Svarin, Anna; Steurer-Stey, Claudia; Puhan, Milo A

    2009-01-01

    instruments was often not specified and for most instruments, not all measurement properties that are important to support the specific aim of the instrument (for example responsiveness for evaluative instruments) were assessed. Researchers who develop and validate self-efficacy instruments should adhere more closely to important methodological concepts for development and validation of patient-reported outcomes and report their methods more transparently. We propose a systematic five step approach for the development and validation of self-efficacy instruments. PMID:19781095

  8. Applying a user centered design methodology in a clinical context.

    PubMed

    Kashfi, Hajar

    2010-01-01

    A clinical decision support system (CDSS) is an interactive application that is used to facilitate the process of decision-making in a clinical context. Developing a usable CDSS is a challenging process, mostly because of the complex nature of domain knowledge and the context of use of those systems. This paper describes how a user centered design (UCD) approach can be used in a clinical context for developing a CDSS. In our effort, a design-based research methodology has been used. The outcomes of this work are as follows: a customized UCD approach is suggested that combines UCD and openEHR. Moreover, the GUI developed in the design phase and the results of the GUI evaluation are briefly presented.

  9. Aircraft conceptual design - an adaptable parametric sizing methodology

    NASA Astrophysics Data System (ADS)

    Coleman, Gary John, Jr.

    Aerospace is a maturing industry with successful and refined baselines which work well for traditional baseline missions, markets, and technologies. However, when new markets (space tourism), new constraints (environmental), or new technologies (composites, natural laminar flow) emerge, the conventional solution is not necessarily best for the new situation. This raises the question: how does a design team quickly screen and compare novel solutions against conventional solutions for new aerospace challenges? The answer is rapid and flexible conceptual design parametric sizing. In the product design life-cycle, parametric sizing is the first step in screening the total vehicle in terms of mission, configuration, and technology to quickly assess first-order design and mission sensitivities. During this phase, various missions and technologies are assessed, and the designer identifies design solutions of concepts and configurations to meet combinations of mission and technology. This research undertaking contributes to the state of the art in aircraft parametric sizing through (1) development of a dedicated conceptual design process and disciplinary methods library, (2) development of a novel and robust parametric sizing process based on 'best-practice' approaches found in the process and disciplinary methods library, and (3) application of the parametric sizing process to a variety of design missions (transonic, supersonic, and hypersonic transports), different configurations (tail-aft, blended wing body, strut-braced wing, hypersonic blended bodies, etc.), and different technologies (composites, natural laminar flow, thrust vectored control, etc.), in order to demonstrate the robustness of the methodology and unearth first-order design sensitivities to current and future aerospace design problems. This research undertaking demonstrates the importance of this early design step in selecting the correct combination of mission, technologies and configuration to

  10. Development and implementation of rotorcraft preliminary design methodology using multidisciplinary design optimization

    NASA Astrophysics Data System (ADS)

    Khalid, Adeel Syed

    The evolution of rotorcraft has lagged behind that of fixed-wing aircraft. One of the reasons for this gap is the absence of a formal methodology to accomplish a complete conceptual and preliminary design. Traditional rotorcraft methodologies are not only time-consuming and expensive but also yield sub-optimal designs. Rotorcraft design is an excellent example of a multidisciplinary complex environment where several interdependent disciplines are involved. A formal framework is developed and implemented in this research for preliminary rotorcraft design using the Integrated Product and Process Development (IPPD) methodology. The design methodology consists of the product and process development cycles. In the product development loop, all the technical aspects of design are considered, including vehicle engineering, dynamic analysis, stability and control, aerodynamic performance, propulsion, transmission design, weight and balance, noise analysis, and economic analysis. The design loop starts with a detailed analysis of requirements. A baseline is selected and upgrade targets are identified depending on the mission requirements. An Overall Evaluation Criterion (OEC) is developed that is used to measure the goodness of the design or to compare the design with competitors. The requirements analysis and baseline upgrade targets lead to the initial sizing and performance estimation of the new design. The digital information is then passed to disciplinary experts, where the detailed disciplinary analyses are performed. Information is transferred from one discipline to another as the design loop is iterated. To coordinate all the disciplines in the product development cycle, Multidisciplinary Design Optimization (MDO) techniques, e.g., All At Once (AAO) and Collaborative Optimization (CO), are suggested. The methodology is implemented on a Light Turbine Training Helicopter (LTTH) design. Detailed disciplinary analyses are integrated through a common platform for efficient and centralized transfer of design

  11. Designs and Methods in School Improvement Research: A Systematic Review

    ERIC Educational Resources Information Center

    Feldhoff, Tobias; Radisch, Falk; Bischof, Linda Marie

    2016-01-01

    Purpose: The purpose of this paper is to focus on challenges faced by longitudinal quantitative analyses of school improvement processes and to offer a systematic literature review of current papers that use longitudinal analyses. In this context, the authors assessed designs and methods that are used to analyze the relation between school…

  12. A SYSTEMATIC PROCEDURE FOR DESIGNING PROCESSES WITH MULTIPLE ENVIRONMENTAL OBJECTIVES

    EPA Science Inventory

    Evaluation and analysis of multiple objectives are very important in designing environmentally benign processes. They require a systematic procedure for solving multi-objective decision-making problems due to the complex nature of the problems and the need for complex assessment....

  13. Evaluation of methodology and quality characteristics of systematic reviews in orthodontics.

    PubMed

    Papageorgiou, S N; Papadopoulos, M A; Athanasiou, A E

    2011-08-01

    Systematic reviews (SRs) are published at an increasing rate in many fields of the biomedical literature, including orthodontics. Although SRs should consolidate the evidence-based characteristics of contemporary orthodontic practice, doubts about the validity of their conclusions have frequently been expressed. The aim of this study was to evaluate the methodology and quality characteristics of orthodontic SRs as well as to assess their quality of reporting in recent years. Electronic databases were searched for SRs (without any meta-analytical data synthesis) in the field of orthodontics, indexed up to the start of 2010. The Assessment of Multiple Systematic Reviews (AMSTAR) tool was used for quality assessment of the included articles. Data were analyzed with Student's t-test, one-way ANOVA, and linear regression. Risk ratios (RR) with 95% confidence intervals were calculated to represent changes over the years in the reporting of key items associated with quality. A total of 110 SRs were included in this evaluation. About half of the SRs (46.4%) were published in orthodontic journals, while few (5.5%) were updates of previously published reviews. Using the AMSTAR tool, 30 (27.3%) of the SRs were found to be of low quality, 63 (57.3%) of medium quality, and 17 (15.5%) of high quality. No significant trend toward quality improvement was observed in recent years. The overall quality of orthodontic SRs may be considered medium. Although the number of orthodontic SRs has increased over the last decade, their quality characteristics can be characterized as moderate.

  14. The uniqueness of the human dentition as forensic evidence: a systematic review on the technological methodology.

    PubMed

    Franco, Ademir; Willems, Guy; Souza, Paulo Henrique Couto; Bekkering, Geertruida E; Thevissen, Patrick

    2015-11-01

    The uniqueness of the human dentition is routinely invoked as identification evidence in forensic odontology. Specifically, in bitemark and human identification cases, positive identifications are obtained under the hypothesis that two individuals do not have the same dental features. The present study compiles methodological information from articles on the uniqueness of the human dentition to support investigations into this hypothesis. In April 2014, three electronic library databases (SciELO®, MEDLINE®/PubMed®, and LILACS®) were systematically searched. In parallel, reference lists of relevant studies were also screened. From the obtained articles (n = 1235), 13 full-text articles were considered eligible. They were examined according to the studied parameters: the sample size, the number of examined teeth, the registration technique for data collection, the methods for data analysis, and the study outcomes. Six combinations of studied data were detected: (1) dental shape, size, angulation, and position (n = 1); (2) dental shape, size, and angulation (n = 4); (3) dental shape and size (n = 5); (4) dental angulation and position (n = 2); (5) dental shape and angulation (n = 1); and (6) dental shape (n = 1). The sample size ranged between 10 and 1099 human dentitions. Ten articles examined the six anterior teeth, while three articles examined more teeth. Four articles exclusively addressed three-dimensional (3D) data registration, while six articles used two-dimensional (2D) imaging. In three articles, both imaging registrations were combined. Most articles (n = 9) explored the data using landmark placement. The other articles (n = 4) comprised digital comparison of superimposed dental contours. Although there were large methodological variations among the investigated articles, the uniqueness of the human dentition remains unproven.

  15. The Study of the Relationship between Probabilistic Design and Axiomatic Design Methodology. Volume 2

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chin-Yere; Onyebueke, Landon

    1996-01-01

    The structural design, or the design of machine elements, has traditionally been based on deterministic design methodology. The deterministic method considers all design parameters to be known with certainty. This methodology is, therefore, inadequate for designing complex structures that are subjected to a variety of complex, severe loading conditions. A nonlinear behavior that is dependent on stress, stress rate, temperature, number of load cycles, and time is observed in all components subjected to complex conditions. These complex conditions introduce uncertainties; hence, the actual factor of safety margin remains unknown. In the deterministic methodology, the contingency of failure is discounted; hence, a high factor of safety is used. It may be most useful in situations where the designed structures are simple. The probabilistic method is concerned with the probability of non-failure performance of structures or machine elements. It is much more useful in situations where the design is characterized by complex geometry, the possibility of catastrophic failure, or sensitive loads and material properties. Also included: Comparative Study of the Use of AGMA Geometry Factors and Probabilistic Design Methodology in the Design of Compact Spur Gear Set.

  16. Checklists of Methodological Issues for Review Authors to Consider When Including Non-Randomized Studies in Systematic Reviews

    ERIC Educational Resources Information Center

    Wells, George A.; Shea, Beverley; Higgins, Julian P. T.; Sterne, Jonathan; Tugwell, Peter; Reeves, Barnaby C.

    2013-01-01

    Background: There is increasing interest from review authors about including non-randomized studies (NRS) in their systematic reviews of health care interventions. This series from the Ottawa Non-Randomized Studies Workshop consists of six papers identifying methodological issues when doing this. Aim: To format the guidance from the preceding…

  17. Fast underdetermined BSS architecture design methodology for real time applications.

    PubMed

    Mopuri, Suresh; Reddy, P Sreenivasa; Acharyya, Amit; Naik, Ganesh R

    2015-01-01

    In this paper, we propose a high speed architecture design methodology for the Under-determined Blind Source Separation (UBSS) algorithm using our recently proposed high speed Discrete Hilbert Transform (DHT), targeting real-time applications. In the UBSS algorithm, unlike typical BSS, the number of sensors is smaller than the number of sources, which is of more interest in real-time applications. The DHT architecture has been implemented based on a sub-matrix multiplication method to compute an M-point DHT, which uses an N-point architecture recursively, where M is an integer multiple of N. The DHT architecture and the state-of-the-art architecture were coded in VHDL for a 16-bit word length, and ASIC implementation was carried out using UMC 90-nm technology at VDD = 1 V and a 1 MHz clock frequency. The implementation and experimental comparison results show that the proposed DHT design is two times faster than the state-of-the-art architecture.
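
    For reference, the standard analytic-signal definition of the discrete Hilbert transform that such architectures compute can be sketched as follows. A plain O(n²) DFT is used for clarity; the paper's actual contribution, building an M-point transform recursively from N-point hardware blocks, is not reproduced here.

```python
import cmath

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n)) for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)) / n for t in range(n)]

def hilbert(x):
    """Discrete Hilbert transform of a real sequence via the analytic
    signal: zero the negative frequencies, double the positive ones,
    and take the imaginary part of the inverse transform."""
    n = len(x)
    X = dft(x)
    h = [0.0] * n
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        positive = range(1, n // 2)
    else:
        positive = range(1, (n + 1) // 2)
    for k in positive:
        h[k] = 2.0
    analytic = idft([Xk * hk for Xk, hk in zip(X, h)])
    return [a.imag for a in analytic]
```

    Applied to cos(2πt/n) this returns sin(2πt/n), the defining property of the transform and a convenient hardware self-test.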

  18. Design Evolution and Methodology for Pumpkin Super-Pressure Balloons

    NASA Astrophysics Data System (ADS)

    Farley, Rodger

    The NASA Ultra Long Duration Balloon (ULDB) program has had many technical development issues discovered and solved along its road to success as a new vehicle. It has the promise of being a sub-satellite, a means to launch up to 2700 kg to 33.5 km altitude for 100 days from a comfortable mid-latitude launch point. Current high-lift long duration ballooning is accomplished out of Antarctica with zero-pressure balloons, which cannot cope with the rigors of diurnal cycles. The ULDB design is still evolving, the product of intense analytical effort, scaled testing, improved manufacturing, and engineering intuition. The past technical problems, in particular the s-cleft deformation, their solutions, future challenges, and the methodology of pumpkin balloon design will generally be described.

  19. A variable-gain output feedback control design methodology

    NASA Technical Reports Server (NTRS)

    Halyo, Nesim; Moerder, Daniel D.; Broussard, John R.; Taylor, Deborah B.

    1989-01-01

    A digital control system design technique is developed in which the control system gain matrix varies with the plant operating point parameters. The design technique is obtained by formulating the problem as an optimal stochastic output feedback control law with variable gains. This approach provides a control theory framework within which the operating range of a control law can be significantly extended. Furthermore, the approach avoids the major shortcomings of the conventional gain-scheduling techniques. The optimal variable gain output feedback control problem is solved by embedding the Multi-Configuration Control (MCC) problem, previously solved at ICS. An algorithm to compute the optimal variable gain output feedback control gain matrices is developed. The algorithm is a modified version of the MCC algorithm improved so as to handle the large dimensionality which arises particularly in variable-gain control problems. The design methodology developed is applied to a reconfigurable aircraft control problem. A variable-gain output feedback control problem was formulated to design a flight control law for an AFTI F-16 aircraft which can automatically reconfigure its control strategy to accommodate failures in the horizontal tail control surface. Simulations of the closed-loop reconfigurable system show that the approach produces a control design which can accommodate such failures with relative ease. The technique can be applied to many other problems including sensor failure accommodation, mode switching control laws and super agility.
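
    The core idea of this abstract, a feedback gain that is a continuous function of the plant operating point rather than a switched schedule, can be sketched very simply. The linear blend between two design-point gain matrices below is an illustrative simplification invented for this sketch; the article instead solves an optimal stochastic output feedback problem for the gain variation.

```python
def variable_gain(p, p_lo, K_lo, p_hi, K_hi):
    """Gain matrix as a continuous function of an operating-point
    parameter p: linear blend of two design-point gain matrices,
    clamped to the design range [p_lo, p_hi]."""
    w = (p - p_lo) / (p_hi - p_lo)
    w = min(max(w, 0.0), 1.0)
    return [[(1.0 - w) * a + w * b for a, b in zip(row_lo, row_hi)]
            for row_lo, row_hi in zip(K_lo, K_hi)]

def output_feedback(K, y):
    """Static output feedback control law u = -K y."""
    return [-sum(k * yi for k, yi in zip(row, y)) for row in K]
```

    Because K(p) varies smoothly with p, the closed loop avoids the gain discontinuities of conventional gain scheduling, one of the shortcomings the abstract mentions.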

  20. A symbolic methodology to improve disassembly process design.

    PubMed

    Rios, Pedro; Blyler, Leslie; Tieman, Lisa; Stuart, Julie Ann; Grant, Ed

    2003-12-01

    Millions of end-of-life electronic components are retired annually due to the proliferation of new models and their rapid obsolescence. The recovery of resources such as plastics from these goods requires their disassembly. The time required for each disassembly and its associated cost is defined by the operator's familiarity with the product design and its complexity. Since model proliferation serves to complicate an operator's learning curve, it is worthwhile to investigate the benefits to be gained in a disassembly operator's preplanning process. Effective disassembly process design demands the application of green engineering principles, such as those developed by Anastas and Zimmerman (Environ. Sci. Technol. 2003, 37, 94A-101A), which include regard for product complexity, structural commonality, separation energy, material value, and waste prevention. This paper introduces the concept of design symbols to help the operator more efficiently survey product complexity with respect to the location and number of fasteners to remove a structure that is common to all electronics: the housing. With a sample of 71 different computers, printers, and monitors, we demonstrate that appropriate symbols reduce the total disassembly planning time by 13.2 min. Such an improvement could well make efficient the separation of plastic that would otherwise be destined for waste-to-energy or landfill. The symbolic methodology presented may also improve Design for Recycling and Design for Maintenance and Support.

  1. Gaining system design knowledge by systematic design space exploration with graph based design languages

    NASA Astrophysics Data System (ADS)

    Schmidt, Jens; Rudolph, Stephan

    2014-10-01

    The conceptual design phase in the design of complex systems, such as satellite propulsion systems, relies heavily on an exploration of the feasible design space. This exploration requires both topological changes in the potential system architecture and consistent parametrical changes in the dimensioning of the existing system components. Since advanced engineering design techniques nowadays advocate a model-based systems engineering (MBSE) approach, graph-based design languages, which embed a superset of MBSE features, are consequently used in this work to systematically explore the feasible design space. Design languages allow the design knowledge to be represented, modeled, and executed using model-based transformations, and combine this, among other features, with constraint processing techniques. The execution of the design language, shown for satellite propulsion systems in this work, yields topologically varied designs (i.e. the selection of a monergol, a diergol, or a coldgas system) with consistent parameters. Based on an a posteriori performance analysis of the automatically generated system designs, novel system knowledge (most notably in the form of so-called "topology change points") can be gained and extracted from the original point cloud of numerical results.

  2. Development of the Spanish version of the Systematized Nomenclature of Medicine: methodology and main issues.

    PubMed Central

    Reynoso, G. A.; March, A. D.; Berra, C. M.; Strobietto, R. P.; Barani, M.; Iubatti, M.; Chiaradio, M. P.; Serebrisky, D.; Kahn, A.; Vaccarezza, O. A.; Leguiza, J. L.; Ceitlin, M.; Luna, D. A.; Bernaldo de Quirós, F. G.; Otegui, M. I.; Puga, M. C.; Vallejos, M.

    2000-01-01

    This presentation features linguistic and terminology management issues related to the development of the Spanish version of the Systematized Nomenclature of Medicine (SNOMED). It aims to describe the aspects of translating, and the difficulties encountered in delivering, a natural and consistent medical nomenclature. Bunge's three-layered model is referenced to analyze the sequence of symbolic concept representations. It further explains how a communicative translation based on a concept-to-concept approach was used to achieve the highest level of flawlessness and naturalness for the Spanish rendition of SNOMED. Translation procedures and techniques are described and exemplified. Both computer-aided and human translation methods are portrayed. The scientific and translation team tasks are detailed, with focus on Newmark's four-level principle for the translation process, extended with a fifth level relevant to the ontology to control the consistency of the typology of concepts. Finally, the convenience of a common methodology for developing non-English versions of SNOMED is suggested. PMID:11079973

  3. Towards a Methodology for the Design of Multimedia Public Access Interfaces.

    ERIC Educational Resources Information Center

    Rowley, Jennifer

    1998-01-01

    Discussion of information systems methodologies that can contribute to interface design for public access systems covers: the systems life cycle; advantages of adopting information systems methodologies; soft systems methodologies; task-oriented approaches to user interface design; holistic design, the Star model, and prototyping; the…

  4. Combustor design and analysis using the Rocket Combustor Interactive Design (ROCCID) methodology

    NASA Technical Reports Server (NTRS)

    Klem, Mark D.; Pieper, Jerry L.; Walker, Richard E.

    1990-01-01

    The ROCket Combustor Interactive Design (ROCCID) Methodology is a newly developed, interactive computer code for the design and analysis of a liquid propellant rocket combustion chamber. The application of ROCCID to design a liquid rocket combustion chamber is illustrated. Designs for a 50,000 lbf thrust and 1250 psi chamber pressure combustor using liquid oxygen (LOX)/RP-1 propellants are developed and evaluated. Tradeoffs between key design parameters affecting combustor performance and stability are examined. Predicted performance and combustion stability margin for these designs are provided as a function of the combustor operating mixture ratio and chamber pressure.

  6. Developing Risk Prediction Models for Postoperative Pancreatic Fistula: a Systematic Review of Methodology and Reporting Quality.

    PubMed

    Wen, Zhang; Guo, Ya; Xu, Banghao; Xiao, Kaiyin; Peng, Tao; Peng, Minhao

    2016-04-01

    Postoperative pancreatic fistula is still a major complication after pancreatic surgery, despite improvements in surgical technique and perioperative management. We sought to systematically review and critically assess the conduct and reporting of methods used to develop risk prediction models for predicting postoperative pancreatic fistula. We conducted a systematic search of PubMed and EMBASE databases to identify articles published before January 1, 2015, which described the development of models to predict the risk of postoperative pancreatic fistula. We extracted information on the development of each prediction model, including study design, sample size and number of events, definition of postoperative pancreatic fistula, risk predictor selection, missing data, model-building strategies, and model performance. Seven studies developing seven risk prediction models were included. In three studies (42 %), the number of events per variable was less than 10. The number of candidate risk predictors ranged from 9 to 32. Five studies (71 %) reported using univariate screening, which is not recommended for building a multivariate model, to reduce the number of risk predictors. Six risk prediction models (86 %) were developed by categorizing all continuous risk predictors. The treatment and handling of missing data were not mentioned in any of the studies. We found use of inappropriate methods that could endanger the development of the models, including univariate pre-screening of variables, categorization of continuous risk predictors, and inadequate model validation. The use of inappropriate methods affects the reliability and accuracy of the probability estimates for predicting postoperative pancreatic fistula.
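    The events-per-variable (EPV) criterion cited above is simple to compute: divide the number of outcome events by the number of candidate predictors and compare against the conventional rule of thumb of at least 10. A minimal sketch; the cohort numbers below are hypothetical, not taken from any of the reviewed studies:

    ```python
    def events_per_variable(n_events: int, n_candidate_predictors: int) -> float:
        """EPV = number of outcome events / number of candidate risk predictors."""
        return n_events / n_candidate_predictors

    # Hypothetical cohort: 90 fistula events observed, 12 candidate predictors screened.
    epv = events_per_variable(n_events=90, n_candidate_predictors=12)
    print(f"EPV = {epv:.1f}")  # 7.5 -- below the conventional EPV >= 10 rule of thumb
    ```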

  7. Design of integrated pitch axis for autopilot/autothrottle and integrated lateral axis for autopilot/yaw damper for NASA TSRV airplane using integral LQG methodology

    NASA Technical Reports Server (NTRS)

    Kaminer, Isaac; Benson, Russell A.; Coleman, Edward E.; Ebrahimi, Yaghoob S.

    1990-01-01

    Two designs are presented for control systems for the NASA Transport System Research Vehicle (TSRV) using integral Linear Quadratic Gaussian (LQG) methodology. The first is an integrated longitudinal autopilot/autothrottle design and the second design is an integrated lateral autopilot/yaw damper/sideslip controller design. It is shown that a systematic top-down approach to a complex design problem combined with proper application of modern control synthesis techniques yields a satisfactory solution in a reasonable period of time.

  8. A new hot gas cleanup filter design methodology

    SciTech Connect

    VanOsdol, J.G.; Dennis, R.A.; Shaffer, F.D.

    1996-12-31

    The fluid dynamics of Hot Gas Cleanup (HGCU) systems having complex geometrical configurations are typically analyzed using computational fluid dynamics (CFD) codes or bench-scale laboratory test facilities called cold-flow models (CFM). At the present time, both CFD and CFM can be used effectively for simple flows limited to one or two characteristic length scales with well-defined boundary conditions. This is not the situation with HGCU devices, which have very complex geometries and low Reynolds number, multi-phase flows that operate on multiple length scales. For this reason, neither CFD nor CFM analysis can yet be considered a practical engineering tool for modeling the entire flow field inside HGCU systems. The thrust of this work is to provide an aerodynamic analysis methodology that can be easily applied to the complex geometries characteristic of HGCU filter vessels without requiring a tedious numerical solution of the entire set of transport equations. The analysis methodology performs the following tasks: predicts problem areas where ash deposition will most likely occur; predicts residence times for particles at various locations inside the filter vessel; lends itself quickly to major design changes; provides a sound technical basis for more appropriate use of CFD and CFM analysis; and focuses CFD and CFM analysis where it is needed.

  9. Systematic search for major genes in schizophrenia: Methodological issues and results from chromosome 12

    SciTech Connect

    Dawson, E.; Powell, J.F.; Sham, P.

    1995-10-09

    We describe a method of systematically searching for major genes in disorders of unknown mode of inheritance, using linkage analysis. Our method is designed to minimize the probability of missing linkage due to inadequate exploration of data. We illustrate this method with the results of a search for a locus for schizophrenia on chromosome 12 using 22 highly polymorphic markers in 23 high density pedigrees. The markers span approximately 85-90% of the chromosome and are on average 9.35 cM apart. We have analysed the data using the most plausible current genetic models and allowing for the presence of genetic heterogeneity. None of the markers was supportive of linkage and the distribution of the heterogeneity statistics was in accordance with the null hypothesis. 53 refs., 2 figs., 4 tabs.

  10. Systematic review of the methodological quality of controlled trials evaluating Chinese herbal medicine in patients with rheumatoid arthritis

    PubMed Central

    Pan, Xin; Lopez-Olivo, Maria A; Song, Juhee; Pratt, Gregory; Suarez-Almazor, Maria E

    2017-01-01

    Objectives We appraised the methodological and reporting quality of randomised controlled clinical trials (RCTs) evaluating the efficacy and safety of Chinese herbal medicine (CHM) in patients with rheumatoid arthritis (RA). Design For this systematic review, electronic databases were searched from inception until June 2015. The search was limited to humans and non-case report studies, but was not limited by language, year of publication or type of publication. Two independent reviewers selected RCTs evaluating CHM in RA (herbals and decoctions). Descriptive statistics were used to report on risk of bias and adherence to reporting standards. Multivariable logistic regression analysis was performed to determine study characteristics associated with high or unclear risk of bias. Results Out of 2342 unique citations, we selected 119 RCTs including 18 919 patients: 10 108 patients received CHM alone and 6550 received one of 11 treatment combinations. A high risk of bias was observed across all domains: 21% had a high risk for selection bias (11% from sequence generation and 30% from allocation concealment), 85% for performance bias, 89% for detection bias, 4% for attrition bias and 40% for reporting bias. In multivariable analysis, fewer authors were associated with selection bias (allocation concealment), performance bias and attrition bias, and earlier year of publication and funding source not reported or disclosed were associated with selection bias (sequence generation). Studies published in non-English language were associated with reporting bias. Poor adherence to recommended reporting standards (<60% of the studies providing sufficient information) was observed in 11 of the 23 sections evaluated. Limitations Study quality and data extraction were performed by one reviewer and cross-checked by a second reviewer. Translation to English was performed by one reviewer in 85% of the included studies. Conclusions Studies evaluating CHM often fail to

  11. A design methodology for portable software on parallel computers

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Miller, Keith W.; Chrisman, Dan A.

    1993-01-01

    This final report for research that was supported by grant number NAG-1-995 documents our progress in addressing two difficulties in parallel programming. The first difficulty is developing software that will execute quickly on a parallel computer. The second difficulty is transporting software between dissimilar parallel computers. In general, we expect that more hardware-specific information will be included in software designs for parallel computers than in designs for sequential computers. This inclusion is an instance of portability being sacrificed for high performance. New parallel computers are being introduced frequently. To keep software running on the current high-performance hardware, a software developer almost continually faces yet another expensive software port. The aim of the proposed research is to create a design methodology that helps designers to more precisely control both portability and hardware-specific programming details. The proposed research emphasizes programming for scientific applications. We completed our study of the parallelizability of a subsystem of the NASA Earth Radiation Budget Experiment (ERBE) data processing system. This work is summarized in section two. A more detailed description is provided in Appendix A ('Programming Practices to Support Eventual Parallelism'). Mr. Chrisman, a graduate student, wrote and successfully defended a Ph.D. dissertation proposal which describes our research associated with the issues of software portability and high performance. The list of research tasks is specified in the proposal. The proposal 'A Design Methodology for Portable Software on Parallel Computers' is summarized in section three and is provided in its entirety in Appendix B. We are currently studying a proposed subsystem of the NASA Clouds and the Earth's Radiant Energy System (CERES) data processing system. This software is the proof-of-concept for the Ph.D. dissertation. 
We have implemented and measured

  12. Sonic Boom Mitigation Through Aircraft Design and Adjoint Methodology

    NASA Technical Reports Server (NTRS)

    Rallabhandi, Sriram K.; Diskin, Boris; Nielsen, Eric J.

    2012-01-01

    This paper presents a novel approach to the design of a supersonic aircraft outer mold line (OML) by optimizing the A-weighted loudness of the sonic boom signature predicted on the ground. The optimization process uses sensitivity information obtained by coupling the discrete adjoint formulations for the augmented Burgers equation and the Computational Fluid Dynamics (CFD) equations. This coupled formulation links the loudness of the ground boom signature to the aircraft geometry, thus allowing efficient shape optimization to minimize loudness. The accuracy of the adjoint-based sensitivities is verified against sensitivities obtained using an independent complex-variable approach. The adjoint-based optimization methodology is applied to a configuration previously optimized using alternative state-of-the-art optimization methods and produces additional loudness reduction. The results of the optimizations are reported and discussed.
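    The complex-variable verification mentioned above is the complex-step derivative technique: perturb the input along the imaginary axis and read the derivative off the imaginary part of the output, which avoids the subtractive cancellation of finite differences and allows an extremely small step. A minimal sketch with an illustrative analytic test function (not the paper's CFD functional):

    ```python
    import cmath

    def complex_step_derivative(f, x, h=1e-30):
        """df/dx ~= Im(f(x + i*h)) / h; no subtraction, so h can be near machine-tiny."""
        return f(complex(x, h)).imag / h

    f = lambda x: cmath.exp(x) * cmath.sin(x)  # illustrative test function
    dfdx = complex_step_derivative(f, 1.0)
    exact = (cmath.exp(1) * (cmath.sin(1) + cmath.cos(1))).real  # analytic d/dx
    print(abs(dfdx - exact))  # agrees with the analytic derivative to machine precision
    ```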

  13. Development of design and analysis methodology for composite bolted joints

    NASA Astrophysics Data System (ADS)

    Grant, Peter; Sawicki, Adam

    1991-05-01

    This paper summarizes work performed to develop composite joint design methodology for use on rotorcraft primary structure, determine joint characteristics which affect joint bearing and bypass strength, and develop analytical methods for predicting the effects of such characteristics in structural joints. Experimental results have shown that bearing-bypass interaction allowables cannot be defined using a single continuous function due to variance of failure modes for different bearing-bypass ratios. Hole wear effects can be significant at moderate stress levels and should be considered in the development of bearing allowables. A computer program has been developed and has successfully predicted bearing-bypass interaction effects for the (0/±45/90) family of laminates using filled hole and unnotched test data.

  14. The usefulness of systematic reviews of animal experiments for the design of preclinical and clinical studies.

    PubMed

    de Vries, Rob B M; Wever, Kimberley E; Avey, Marc T; Stephens, Martin L; Sena, Emily S; Leenaars, Marlies

    2014-01-01

    The question of how animal studies should be designed, conducted, and analyzed remains underexposed in societal debates on animal experimentation. This is not only a scientific but also a moral question. After all, if animal experiments are not appropriately designed, conducted, and analyzed, the results produced are unlikely to be reliable and the animals have in effect been wasted. In this article, we focus on one particular method to address this moral question, namely systematic reviews of previously performed animal experiments. We discuss how the design, conduct, and analysis of future (animal and human) experiments may be optimized through such systematic reviews. In particular, we illustrate how these reviews can help improve the methodological quality of animal experiments, make the choice of an animal model and the translation of animal data to the clinic more evidence-based, and implement the 3Rs. Moreover, we discuss which measures are being taken and which need to be taken in the future to ensure that systematic reviews will actually contribute to optimizing experimental design and thereby to meeting a necessary condition for making the use of animals in these experiments justified.

  15. SSME Investment in Turbomachinery Inducer Impeller Design Tools and Methodology

    NASA Technical Reports Server (NTRS)

    Zoladz, Thomas; Mitchell, William; Lunde, Kevin

    2010-01-01

    Within the rocket engine industry, SSME turbomachines are the de facto standards of success with regard to meeting aggressive performance requirements under challenging operational environments. Over the Shuttle era, SSME has invested heavily in our national inducer impeller design infrastructure. While both low and high pressure turbopump failure/anomaly resolution efforts spurred some of these investments, the SSME program was a major beneficiary of key areas of turbomachinery inducer-impeller research outside of flight manifest pressures. Over the past several decades, key turbopump internal environments have been interrogated via highly instrumented hot-fire and cold-flow testing. Likewise, SSME has sponsored the advancement of time-accurate and cavitating inducer impeller computational fluid dynamics (CFD) tools. Together, these investments have led to a better understanding of the complex internal flow fields within aggressive, high-performing inducers and impellers. New design tools and methodologies have evolved which intend to provide confident blade designs that strike an appropriate balance between performance and self-induced load management.

  16. An NAFP Project: Use of Object Oriented Methodologies and Design Patterns to Refactor Software Design

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali; Baggs, Rhoda

    2007-01-01

    In the early problem-solution era of software programming, functional decompositions were mainly used to design and implement software solutions. In functional decompositions, functions and data are introduced as two separate entities during the design phase, and are followed as such in the implementation phase. Functional decompositions make use of refactoring through optimizing the algorithms, grouping similar functionalities into common reusable functions, and using abstract representations of data where possible; all these are done during the implementation phase. This paper advocates the usage of object-oriented methodologies and design patterns as the centerpieces of refactoring software solutions. Refactoring software is a method of changing software design while explicitly preserving its external functionalities. The combined usage of object-oriented methodologies and design patterns to refactor should also benefit the overall software life cycle cost with improved software.

  17. Author-paper affiliation network architecture influences the methodological quality of systematic reviews and meta-analyses of psoriasis.

    PubMed

    Sanz-Cabanillas, Juan Luis; Ruano, Juan; Gomez-Garcia, Francisco; Alcalde-Mellado, Patricia; Gay-Mimbrera, Jesus; Aguilar-Luque, Macarena; Maestre-Lopez, Beatriz; Gonzalez-Padilla, Marcelino; Carmona-Fernandez, Pedro J; Velez Garcia-Nieto, Antonio; Isla-Tejera, Beatriz

    2017-01-01

    Moderate-to-severe psoriasis is associated with significant comorbidity, an impaired quality of life, and increased medical costs, including those associated with treatments. Systematic reviews (SRs) and meta-analyses (MAs) of randomized clinical trials are considered two of the best approaches to the summarization of high-quality evidence. However, methodological bias can reduce the validity of conclusions from these types of studies and subsequently impair the quality of decision making. As co-authorship is among the most well-documented forms of research collaboration, the present study aimed to explore whether authors' collaboration methods might influence the methodological quality of SRs and MAs of psoriasis. Methodological quality was assessed by two raters who extracted information from full articles. After calculating total and per-item Assessment of Multiple Systematic Reviews (AMSTAR) scores, reviews were classified as low (0-4), medium (5-8), or high (9-11) quality. Article metadata and journal-related bibliometric indices were also obtained. A total of 741 authors from 520 different institutions and 32 countries published 220 reviews that were classified as high (17.2%), moderate (55%), or low (27.7%) methodological quality. The high methodological quality subnetwork was larger but had a lower connection density than the low and moderate methodological quality subnetworks; specifically, the former contained relatively fewer nodes (authors and reviews), reviews by authors, and collaborators per author. Furthermore, the high methodological quality subnetwork was highly compartmentalized, with several modules representing few poorly interconnected communities. In conclusion, structural differences in author-paper affiliation network may influence the methodological quality of SRs and MAs on psoriasis. As the author-paper affiliation network structure affects study quality in this research field, authors who maintain an appropriate balance between

  18. Database Design Methodology and Database Management System for Computer-Aided Structural Design Optimization.

    DTIC Science & Technology

    1984-12-01

    1983). Several researchers (Lillehagen and Dokkar, 1982; Grabowski, Eigener and Ranch, 1978; Eberlein and Wedekind, 1982) have worked on database...Proceedings of International Federation of Information Processing. pp. 335-366. Eberlein, W. and Wedekind, H., 1982, "A Methodology for Embedding Design

  19. Supporting Space Systems Design via Systems Dependency Analysis Methodology

    NASA Astrophysics Data System (ADS)

    Guariniello, Cesare

    assess the behavior of each system based on its internal status and on the topology of its dependencies on systems connected to it. Designers and decision makers can therefore quickly analyze and explore the behavior of complex systems and evaluate different architectures under various working conditions. The methods support educated decision making both in the design and in the update process of systems architecture, reducing the need to execute extensive simulations. In particular, in the phase of concept generation and selection, the information given by the methods can be used to identify promising architectures to be further tested and improved, while discarding architectures that do not show the required level of global features. The methods, when used in conjunction with appropriate metrics, also allow for improved reliability and risk analysis, as well as for automatic scheduling and re-scheduling based on the features of the dependencies and on the accepted level of risk. This dissertation illustrates the use of the two methods in sample aerospace applications, both in the operational and in the developmental domain. The applications show how to use the developed methodology to evaluate the impact of failures, assess the criticality of systems, quantify metrics of interest, quantify the impact of delays, support informed decision making when scheduling the development of systems and evaluate the achievement of partial capabilities. A larger, well-framed case study illustrates how the Systems Operational Dependency Analysis method and the Systems Developmental Dependency Analysis method can support analysis and decision making, at the mid and high level, in the design process of architectures for the exploration of Mars. The case study also shows how the methods do not replace the classical systems engineering methodologies, but support and improve them.

  20. Toward More Effective Microcomputer Courseware through Application of Systematic Instructional Design Methods.

    ERIC Educational Resources Information Center

    Roblyer, M. D.

    1983-01-01

    Discusses some reasons courseware developers have not embraced the systematic approach to microcomputer courseware design and describes ways systematic design could have a positive impact on courseware acceptance in classrooms. Twenty-two references are listed. (MBR)

  1. The HIV care cascade: a systematic review of data sources, methodology and comparability

    PubMed Central

    Medland, Nicholas A; McMahon, James H; Chow, Eric PF; Elliott, Julian H; Hoy, Jennifer F; Fairley, Christopher K

    2015-01-01

    Introduction The cascade of HIV diagnosis, care and treatment (HIV care cascade) is increasingly used to direct and evaluate interventions to increase population antiretroviral therapy (ART) coverage, a key component of treatment as prevention. The ability to compare cascades over time, sub-population, jurisdiction or country is important. However, differences in data sources and methodology used to construct the HIV care cascade might limit its comparability and ultimately its utility. Our aim was to review systematically the different methods used to estimate and report the HIV care cascade and their comparability. Methods A search of published and unpublished literature through March 2015 was conducted. Cascades that reported the continuum of care from diagnosis to virological suppression in a demographically definable population were included. Data sources and methods of measurement or estimation were extracted. We defined the most comparable cascade elements as those that directly measured diagnosis or care from a population-based data set. Results and discussions Thirteen reports were included after screening 1631 records. The undiagnosed HIV-infected population was reported in seven cascades, each of which used different data sets and methods and could not be considered to be comparable. All 13 used mandatory HIV diagnosis notification systems to measure the diagnosed population. Population-based data sets, derived from clinical data or mandatory reporting of CD4 cell counts and viral load tests from all individuals, were used in 6 of 12 cascades reporting linkage, 6 of 13 reporting retention, 3 of 11 reporting ART and 6 of 13 cascades reporting virological suppression. Cascades with access to population-based data sets were able to directly measure cascade elements and are therefore comparable over time, place and sub-population. Other data sources and methods are less comparable. 
Conclusions To ensure comparability, countries wishing to accurately measure

  2. Arab Teens Lifestyle Study (ATLS): objectives, design, methodology and implications

    PubMed Central

    Al-Hazzaa, Hazzaa M; Musaiger, Abdulrahman O

    2011-01-01

    Background There is a lack of comparable data on physical activity, sedentary behavior, and dietary habits among Arab adolescents, which limits our understanding and interpretation of the relationship between obesity and lifestyle parameters. Therefore, we initiated the Arab Teens Lifestyle Study (ATLS). The ATLS is a multicenter collaborative project for assessing lifestyle habits of Arab adolescents. The objectives of the ATLS project were to investigate the prevalence rates for overweight and obesity, physical activity, sedentary activity and dietary habits among Arab adolescents, and to examine the interrelationships between these lifestyle variables. This paper reports on the objectives, design, methodology, and implications of the ATLS. Design/Methods The ATLS is a school-based cross-sectional study involving 9182 randomly selected secondary-school students (14–19 years) from major Arab cities, using a multistage stratified sampling technique. The participating Arab cities included Riyadh, Jeddah, and Al-Khobar (Saudi Arabia), Bahrain, Dubai (United Arab Emirates), Kuwait, Amman (Jordan), Mosul (Iraq), Muscat (Oman), Tunis (Tunisia) and Kenitra (Morocco). Measured variables included anthropometric measurements, physical activity, sedentary behavior, sleep duration, and dietary habits. Discussion The ATLS project will provide a unique opportunity to collect and analyze important lifestyle information from Arab adolescents using standardized procedures. This is the first time a collaborative Arab project will simultaneously assess broad lifestyle variables in a large sample of adolescents from numerous urbanized Arab regions. This joint research project will supply us with comprehensive and recent data on physical activity/inactivity and eating habits of Arab adolescents relative to obesity. Such invaluable lifestyle-related data are crucial for developing public health policies and regional strategies for health promotion and disease prevention. PMID

  3. Bioslurry phase remediation of chlorpyrifos contaminated soil: process evaluation and optimization by Taguchi design of experimental (DOE) methodology.

    PubMed

    Venkata Mohan, S; Sirisha, K; Sreenivasa Rao, R; Sarma, P N

    2007-10-01

    Design of experiments (DOE) methodology using a Taguchi orthogonal array (OA) was applied to evaluate the influence of eight biotic and abiotic factors (substrate-loading rate, slurry phase pH, slurry phase dissolved oxygen (DO), soil-water ratio, temperature, soil microflora load, application of bioaugmentation, and humic substance concentration) on the bioremediation of soil-bound chlorpyrifos in a bioslurry phase reactor. The selected eight factors were considered at three levels (18 experiments) in the experimental design. Among the selected factors, substrate-loading rate showed a significant influence on the bioremediation process. The optimum operating conditions derived by the methodology enhanced chlorpyrifos degradation from 1479.99 to 2458.33 µg/g (overall, a 39.82% enhancement). The proposed method provided a systematic mathematical approach to understanding the complex bioremediation process and optimizing near-optimum design parameters with only a few well-defined experimental sets.
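    An orthogonal array screens many factors in few runs because every pair of factor columns is balanced: each combination of levels appears equally often. A minimal sketch using the standard, smaller L9 array (4 factors at 3 levels in 9 runs; the study's L18 covers 8 factors in 18 runs). The array is the textbook L9; the factor assignment is illustrative:

    ```python
    from itertools import combinations

    # Standard Taguchi L9 orthogonal array: 9 runs, 4 columns, levels coded 0/1/2.
    L9 = [
        [0, 0, 0, 0],
        [0, 1, 1, 1],
        [0, 2, 2, 2],
        [1, 0, 1, 2],
        [1, 1, 2, 0],
        [1, 2, 0, 1],
        [2, 0, 2, 1],
        [2, 1, 0, 2],
        [2, 2, 1, 0],
    ]

    # Orthogonality check: every pair of columns contains each of the 3x3 = 9
    # (level, level) combinations exactly once.
    for c1, c2 in combinations(range(4), 2):
        pairs = {(row[c1], row[c2]) for row in L9}
        assert len(pairs) == 9

    print("L9 is orthogonal: 9 runs instead of 3**4 = 81 full-factorial runs")
    ```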

  4. Developing risk prediction models for type 2 diabetes: a systematic review of methodology and reporting

    PubMed Central

    2011-01-01

    Background The World Health Organisation estimates that by 2030 there will be approximately 350 million people with type 2 diabetes. Associated with renal complications, heart disease, stroke and peripheral vascular disease, early identification of patients with undiagnosed type 2 diabetes or those at an increased risk of developing type 2 diabetes is an important challenge. We sought to systematically review and critically assess the conduct and reporting of methods used to develop risk prediction models for predicting the risk of having undiagnosed (prevalent) or future risk of developing (incident) type 2 diabetes in adults. Methods We conducted a systematic search of PubMed and EMBASE databases to identify studies published before May 2011 that describe the development of models combining two or more variables to predict the risk of prevalent or incident type 2 diabetes. We extracted key information describing aspects of model development, including study design, sample size and number of events, outcome definition, risk predictor selection and coding, missing data, model-building strategies and aspects of performance. Results Thirty-nine studies comprising 43 risk prediction models were included. Seventeen studies (44%) reported the development of models to predict incident type 2 diabetes, whilst 15 studies (38%) described the derivation of models to predict prevalent type 2 diabetes. In nine studies (23%), the number of events per variable was less than ten, whilst in fourteen studies there was insufficient information reported for this measure to be calculated. The number of candidate risk predictors ranged from four to sixty-four, and in seven studies it was unclear how many risk predictors were considered. Univariate screening based on statistical significance, a method not recommended for selecting risk predictors for inclusion in the multivariate model, was carried out in eight studies (21%), whilst the selection procedure was unclear in

  5. Optimizing drug delivery systems using systematic "design of experiments." Part I: fundamental aspects.

    PubMed

    Singh, Bhupinder; Kumar, Rajiv; Ahuja, Naveen

    2005-01-01

    Design of an impeccable drug delivery product normally encompasses multiple objectives. For decades, this task has been attempted through trial and error, supplemented with the previous experience, knowledge, and wisdom of the formulator. Optimization of a pharmaceutical formulation or process using this traditional approach involves changing one variable at a time. Using this methodology, a solution to a specific problematic formulation characteristic can certainly be achieved, but attainment of the true optimal composition is never guaranteed, and improvement in one characteristic may have to be traded for degeneration in another. This customary approach to developing a drug product or process has proved not only uneconomical in terms of time, money, and effort, but also unpredictable, poorly suited to tracing errors, and at times even unsuccessful. On the other hand, modern formulation optimization approaches, employing systematic Design of Experiments (DoE), are extensively practiced in the development of diverse kinds of drug delivery devices to overcome such irregularities. Such systematic approaches are far more advantageous, because they require fewer experiments to achieve an optimum formulation, make problem tracing and rectification easier, reveal drug/polymer interactions, simulate product performance, and improve understanding of the process to assist in better formulation development and subsequent scale-up. Optimization techniques using DoE represent effective and cost-effective analytical tools to yield the "best solution" to a particular "problem." Through quantification of drug delivery systems, these approaches provide a depth of understanding as well as an ability to explore and defend ranges for formulation factors, where experimentation is completed before optimization is attempted. The key elements of a DoE optimization methodology encompass planning the study objectives, screening of influential variables, experimental designs
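    The screening step described above is often a two-level factorial design: each factor is coded to low/high levels and a main effect is estimated as the mean response at the high level minus the mean at the low level. A hedged sketch; the factor names and responses below are hypothetical, not from the article:

    ```python
    from itertools import product

    factors = ["polymer_conc", "stir_rate", "drug_load"]   # hypothetical factors
    design = list(product([-1, +1], repeat=len(factors)))  # 2**3 = 8 coded runs

    # Hypothetical responses (e.g. % drug released) for the 8 runs, in design order.
    y = [52, 61, 48, 58, 70, 81, 65, 77]

    def main_effect(j: int) -> float:
        """Mean response at factor j's high level minus mean at its low level."""
        hi = [yi for yi, run in zip(y, design) if run[j] == +1]
        lo = [yi for yi, run in zip(y, design) if run[j] == -1]
        return sum(hi) / len(hi) - sum(lo) / len(lo)

    for j, name in enumerate(factors):
        print(f"{name}: {main_effect(j):+.2f}")
    ```

    With these numbers, the first and third factors dominate and the second is a candidate for dropping before response-surface optimization.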

  6. Methodology for the optimal design of an integrated first and second generation ethanol production plant combined with power cogeneration.

    PubMed

    Bechara, Rami; Gomez, Adrien; Saint-Antonin, Valérie; Schweitzer, Jean-Marc; Maréchal, François

    2016-08-01

    The application of methodologies for the optimal design of integrated processes has seen increased interest in the literature. This article builds on previous works and applies a systematic methodology to an integrated first and second generation ethanol production plant with power cogeneration. The methodology comprises process simulation, heat integration, thermo-economic evaluation, multi-variable evolutionary optimization of exergy efficiency versus capital costs, and process selection via profitability maximization. Optimization generated Pareto solutions with exergy efficiency ranging between 39.2% and 44.4% and capital costs from 210 M$ to 390 M$. The Net Present Value was positive for only two scenarios, at low-efficiency, low-hydrolysis points. The minimum cellulosic ethanol selling price was sought to obtain a maximum NPV of zero for high-efficiency, high-hydrolysis alternatives. The obtained optimal configuration presented maximum exergy efficiency, hydrolyzed bagasse fraction, capital costs and ethanol production rate, and minimum cooling water consumption and power production rate.
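
    The Pareto-selection step underlying such an efficiency-versus-cost optimization can be sketched as a nondominance filter. The candidate (efficiency, cost) points below are invented, loosely echoing the ranges quoted in the abstract, and are assumed to be distinct.

```python
def pareto_front(points):
    """Keep points not dominated by any other.
    Each point is (efficiency, cost); higher efficiency and lower cost are better."""
    front = []
    for eff, cost in points:
        dominated = any(e >= eff and c <= cost and (e, c) != (eff, cost)
                        for e, c in points)
        if not dominated:
            front.append((eff, cost))
    return front

# Illustrative (efficiency %, capital cost M$) alternatives -- made-up numbers.
alts = [(39.2, 210.0), (41.0, 250.0), (41.0, 300.0), (44.4, 390.0), (40.0, 320.0)]
front = pareto_front(alts)
```

    Here (41.0, 300.0) and (40.0, 320.0) are removed because (41.0, 250.0) offers at least the same efficiency at lower cost; profitability criteria would then pick one design from the surviving front.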

  7. A systematic approach to design for lifelong aircraft evolution

    NASA Astrophysics Data System (ADS)

    Lim, Dongwook

    This research proposes a systematic approach with which the decision makers can evaluate the value and risk of a new aircraft development program, including potential derivative development opportunities. The proposed Evaluation of Lifelong Vehicle Evolution (EvoLVE) method is a two- or multi-stage representation of the aircraft design process that accommodates initial development phases as well as follow-on phases. One of the key elements of this method is the Stochastic Programming with Recourse (SPR) technique, which accounts for uncertainties associated with future requirements. The remedial approach of SPR in its two distinctive problem-solving steps is well suited to aircraft design problems where derivatives, retrofits, and upgrades have been used to fix designs that were once but no longer optimal. The solution approach of SPR is complemented by the Risk-Averse Strategy Selection (RASS) technique to gauge risk associated with vehicle evolution options. In the absence of a full description of the random space, a scenario-based approach captures the randomness with a few probable scenarios and reveals implications of different future events. Last, an interactive framework for decision-making support allows simultaneous navigation of the current and future design space with a greater degree of freedom. A cantilevered beam design problem was set up and solved using the SPR technique to showcase its application to an engineering design setting. The full EvoLVE method was conducted on a notional multi-role fighter based on the F/A-18 Hornet.
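
    The two-stage structure of stochastic programming with recourse can be caricatured in a few lines: commit to a first-stage design before the future requirement is known, then pay a scenario-dependent recourse (retrofit) cost. All design names, costs, and probabilities below are invented; the EvoLVE method itself is far richer than this sketch.

```python
# Minimal two-stage stochastic program with recourse, solved by enumeration.
first_stage = {"baseline": 100.0, "growth_provisioned": 120.0}  # initial cost

# Scenarios: (probability, future payload requirement) -- assumed values.
scenarios = [(0.6, "same"), (0.4, "heavier")]

def recourse_cost(design, requirement):
    """Cost of the second-stage fix (derivative/retrofit) once demand is known."""
    if requirement == "same":
        return 0.0
    # Retrofit is cheap if growth was provisioned up front, expensive otherwise.
    return 10.0 if design == "growth_provisioned" else 50.0

def expected_total(design):
    return first_stage[design] + sum(p * recourse_cost(design, req)
                                     for p, req in scenarios)

best = min(first_stage, key=expected_total)
```

    With these assumed numbers the cheaper baseline wins (120 vs. 124 expected); raising the probability of the "heavier" scenario to 0.6 would flip the decision, which is exactly the kind of sensitivity a scenario-based SPR study exposes.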

  8. Multidisciplinary design and optimization (MDO) methodology for the aircraft conceptual design

    NASA Astrophysics Data System (ADS)

    Iqbal, Liaquat Ullah

    An integrated design and optimization methodology has been developed for the conceptual design of an aircraft. The methodology brings higher-fidelity Computer Aided Design, Engineering and Manufacturing (CAD, CAE and CAM) tools such as CATIA, FLUENT, ANSYS and SURFCAM into the conceptual design by utilizing Excel as the integrator and controller. The approach is demonstrated to integrate with many of the existing low- to medium-fidelity codes, such as the aerodynamic panel code CMARC and sizing and constraint analysis codes, thus providing multi-fidelity capabilities to the aircraft designer. The higher-fidelity design information from the CAD and CAE tools for the geometry, aerodynamics, structural and environmental performance is provided for the application of structured design methods such as Quality Function Deployment (QFD) and Pugh's Method. The higher-fidelity tools bring the quantitative aspects of a design, such as precise measurements of weight, volume, surface areas, center of gravity (CG) location, lift-over-drag ratio, and structural weight, as well as the qualitative aspects, such as external geometry definition, internal layout, and coloring scheme, early into the design process. The performance and safety risks involved with new technologies can be reduced by modeling and assessing their impact on the performance of the aircraft more accurately. The methodology also enables the design and evaluation of novel concepts such as the blended wing body (BWB) and the hybrid wing body (HWB). Higher-fidelity computational fluid dynamics (CFD) and finite element analysis (FEA) allow verification of the claims for performance gains in aerodynamics and ascertain the risks of structural failure due to the different pressure distribution in the fuselage as compared with the tube-and-wing design. The higher-fidelity aerodynamics and structural models can lead to better cost estimates that help reduce the financial risks as well.
This helps in

  9. A combined stochastic feedforward and feedback control design methodology with application to autoland design

    NASA Technical Reports Server (NTRS)

    Halyo, Nesim

    1987-01-01

    A combined stochastic feedforward and feedback control design methodology was developed. The objective of the feedforward control law is to track the commanded trajectory, whereas the feedback control law tries to maintain the plant state near the desired trajectory in the presence of disturbances and uncertainties about the plant. The feedforward control law design is formulated as a stochastic optimization problem and is embedded into the stochastic output feedback problem, where the plant contains unstable and uncontrollable modes. An algorithm to compute the optimal feedforward is developed. In this approach, the use of error integral feedback, dynamic compensation, and control rate command structures is an integral part of the methodology. An incremental implementation is recommended. Results on the eigenvalues of the implemented versus designed control laws are presented. The stochastic feedforward/feedback control methodology is used to design a digital automatic landing system for the ATOPS Research Vehicle, a Boeing 737-100 aircraft. The system control modes include localizer and glideslope capture and track, and flare to touchdown. Results of a detailed nonlinear simulation of the digital control laws, actuator systems, and aircraft aerodynamics are presented.
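
    The division of labor between the two laws can be shown on a toy discrete-time plant: feedforward inverts a nominal model to track the command, while feedback corrects for a disturbance the model does not know about. The plant, gains, and disturbance below are assumed for illustration only.

```python
# Toy feedforward/feedback split on a scalar plant x[k+1] = a*x[k] + b*u[k] + d.
a, b = 0.9, 1.0          # nominal plant parameters (assumed)
d = 0.2                  # constant unmodeled disturbance
K = 0.8                  # proportional feedback gain (assumed)

def simulate(use_feedback, steps=50, r=1.0):
    x = 0.0
    for _ in range(steps):
        u_ff = (r - a * r) / b           # model inversion: holds x at r nominally
        u_fb = K * (r - x) if use_feedback else 0.0
        x = a * x + b * (u_ff + u_fb) + d
    return abs(r - x)                    # tracking error near steady state

err_ff_only = simulate(False)
err_with_fb = simulate(True)
```

    Feedforward alone settles with a large offset error d/(1-a) = 2.0, while adding feedback shrinks it to d/(1-a+K) ≈ 0.22 -- the qualitative point of combining the two laws.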

  10. A Synergy between the Technological Process and a Methodology for Web Design: Implications for Technological Problem Solving and Design

    ERIC Educational Resources Information Center

    Jakovljevic, Maria; Ankiewicz, Piet; De swardt, Estelle; Gross, Elna

    2004-01-01

    Traditional instructional methodology in the Information System Design (ISD) environment lacks explicit strategies for promoting the cognitive skills of prospective system designers. This contributes to the fragmented knowledge and low motivational and creative involvement of learners in system design tasks. In addition, present ISD methodologies,…

  11. An Examination of the MH-60S Common Cockpit from a Design Methodology and Acquisitions Standpoint

    DTIC Science & Technology

    2009-06-01

    HCI Design Methodology Based on the CCD Philosophy ...................................51 a. Use of Design Methodology Specifically Developed for...19 Figure 8. Lockheed Martin Human Computer Interface Requirements (HCIRS) contents (From: [6]).......36 Figure 9. Lockheed Martin eight step HCI ...function of time (From: [9]).....................................46 Figure 15. Systems engineering iterative HCI design process (From [10

  12. SysSon - A Framework for Systematic Sonification Design

    NASA Astrophysics Data System (ADS)

    Vogt, Katharina; Goudarzi, Visda; Holger Rutz, Hanns

    2015-04-01

    SysSon is a research approach for introducing sonification systematically to a scientific community where it is not yet commonly used - e.g., climate science. Both technical and socio-cultural barriers have to be overcome. The approach was further developed with climate scientists, who participated in contextual inquiries, usability tests and a collaborative design workshop. These extensive user tests informed the final software framework. As a frontend, a graphical user interface allows climate scientists to parametrize standard sonifications with their own data sets. Additionally, an interactive shell lets users competent in sound design code new sonifications. The framework is a standalone desktop application, available as open source (for details see http://sysson.kug.ac.at/), and works with data in NetCDF format.

  13. Systematic design of highly birefringent photonic crystal fibers

    NASA Astrophysics Data System (ADS)

    Hsu, Jui-Ming

    2017-03-01

    This article systematically designs and theoretically investigates a highly birefringent photonic crystal fiber (HB-PCF) for reducing the effect of polarization mode dispersion in high-speed optical communication systems. To achieve a high modal birefringence in the proposed HB-PCF, four types of HB-PCF were designed by adding birefringence-enhancing factors step by step. Ultimately, as per the simulation results, under single-mode operation the modal birefringence and confinement loss of the proposed HB-PCF are about 21.85 × 10⁻³ and 0.47 dB/km, respectively, at the standard optical-fiber communication wavelength λ = 1.55 µm.
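
    A quick consequence of the quoted birefringence: the polarization beat length follows directly from L_B = λ / B. This is a standard relation, applied here to the values in the abstract.

```python
# Polarization beat length implied by the reported modal birefringence.
wavelength = 1.55e-6          # m, the quoted operating wavelength
B = 21.85e-3                  # reported modal birefringence (dimensionless)
beat_length = wavelength / B  # L_B = lambda / B, about 71 micrometers
```

    A beat length of roughly 71 µm is very short, which is what makes the fiber strongly polarization-maintaining.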

  14. New systematic methodology for incorporating dynamic heat transfer modelling in multi-phase biochemical reactors.

    PubMed

    Fernández-Arévalo, T; Lizarralde, I; Grau, P; Ayesa, E

    2014-09-01

    This paper presents a new modelling methodology for dynamically predicting the heat produced or consumed in the transformations of any biological reactor using Hess's law. Starting from a complete description of model component stoichiometry and formation enthalpies, the proposed modelling methodology successfully integrates the simultaneous calculation of both the conventional mass balances and the enthalpy change of reaction in an expandable multi-phase matrix structure, which facilitates a detailed prediction of the main heat fluxes in biochemical reactors. The methodology has been implemented in a plant-wide modelling framework in order to facilitate the dynamic description of mass and heat throughout the plant. After validation with literature data, two case studies are described as illustrative examples of the capability of the methodology. In the first, a predenitrification-nitrification dynamic process is analysed, with the aim of demonstrating the easy integration of the methodology in any system. In the second, the simulation of a thermal model for an ATAD shows the potential of the proposed methodology for analysing the effect of ventilation and influent characterization.
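
    The Hess's-law bookkeeping at the core of the methodology is simple to sketch: the enthalpy change of a reaction is the formation enthalpy of the products minus that of the reactants. The example below uses standard textbook values for methane combustion (kJ/mol), not a biochemical transformation from the paper.

```python
# Reaction enthalpy from standard formation enthalpies (Hess's law).
H_f = {"CH4": -74.8, "O2": 0.0, "CO2": -393.5, "H2O": -285.8}  # kJ/mol, H2O(l)

def reaction_enthalpy(reactants, products):
    """reactants/products: dict of species -> stoichiometric coefficient."""
    return (sum(n * H_f[s] for s, n in products.items())
            - sum(n * H_f[s] for s, n in reactants.items()))

# CH4 + 2 O2 -> CO2 + 2 H2O(l): the familiar -890.3 kJ/mol
dH = reaction_enthalpy({"CH4": 1, "O2": 2}, {"CO2": 1, "H2O": 2})
```

    The methodology's matrix formulation applies the same products-minus-reactants sum, but simultaneously over every transformation and phase in the reactor model.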

  15. A performance-oriented power transformer design methodology using multi-objective evolutionary optimization.

    PubMed

    Adly, Amr A; Abd-El-Hafiz, Salwa K

    2015-05-01

    Transformers are regarded as crucial components in power systems. Due to market globalization, power transformer manufacturers are facing an increasingly competitive environment that mandates the adoption of design strategies yielding better performance at lower costs. In this paper, a power transformer design methodology using multi-objective evolutionary optimization is proposed. Using this methodology, which is tailored to be target performance design-oriented, quick rough estimation of transformer design specifics may be inferred. Testing of the suggested approach revealed significant qualitative and quantitative match with measured design and performance values. Details of the proposed methodology as well as sample design results are reported in the paper.
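
    The evolutionary multi-objective loop can be sketched as mutation plus a nondominated archive. The "transformer" model below (two conflicting objectives of a single design variable) is a toy stand-in for the authors' model, with all functions and bounds invented.

```python
# Minimal multi-objective evolutionary loop keeping a nondominated archive.
import random

def objectives(x):
    # Toy trade-off: material cost grows with x, load losses shrink toward x = 2.
    return (x * x, (x - 2.0) ** 2)

def dominates(p, q):
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def evolve(generations=200, seed=0):
    rng = random.Random(seed)
    archive = {}                                  # design x -> objective tuple
    x = rng.uniform(0.0, 3.0)
    for _ in range(generations):
        x = min(3.0, max(0.0, x + rng.gauss(0.0, 0.3)))  # mutate within bounds
        f = objectives(x)
        if not any(dominates(g, f) for g in archive.values()):
            # Drop archive members the new point dominates, then keep it.
            archive = {y: g for y, g in archive.items() if not dominates(f, g)}
            archive[x] = f
    return archive

front = evolve()
```

    The archive is mutually nondominated by construction, which is the essential invariant of methods such as NSGA-II; real implementations add crowding, crossover, and constraint handling.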

  16. A performance-oriented power transformer design methodology using multi-objective evolutionary optimization

    PubMed Central

    Adly, Amr A.; Abd-El-Hafiz, Salwa K.

    2014-01-01

    Transformers are regarded as crucial components in power systems. Due to market globalization, power transformer manufacturers are facing an increasingly competitive environment that mandates the adoption of design strategies yielding better performance at lower costs. In this paper, a power transformer design methodology using multi-objective evolutionary optimization is proposed. Using this methodology, which is tailored to be target performance design-oriented, quick rough estimation of transformer design specifics may be inferred. Testing of the suggested approach revealed significant qualitative and quantitative match with measured design and performance values. Details of the proposed methodology as well as sample design results are reported in the paper. PMID:26257939

  17. [Principles and methodology for ecological rehabilitation and security pattern design in key project construction].

    PubMed

    Chen, Li-Ding; Lu, Yi-He; Tian, Hui-Ying; Shi, Qian

    2007-03-01

    Global ecological security becomes increasingly important with intensive human activities. The function of ecological security is influenced by human activities, and in return, the efficiency of human activities is also affected by the patterns of regional ecological security. Since the 1990s, China has initiated the construction of key projects such as the "Yangtze Three Gorges Dam", "Qinghai-Tibet Railway", "West-to-East Gas Pipeline", "West-to-East Electricity Transmission" and "South-to-North Water Transfer". The interaction between these projects and regional ecological security has particularly attracted the attention of the Chinese government. Developing an ecological rehabilitation system and designing a regional ecological security pattern are important not only for regional environmental protection, but also for the smooth implementation of the various projects. This paper made a systematic analysis of the types and characteristics of key project construction and its effects on the environment and, on this basis, brought forward the basic principles and methodology for ecological rehabilitation and security pattern design in such construction. The following issues should be addressed in the implementation of a key project: 1) analysis and evaluation of the current regional ecological environment, 2) evaluation of anthropogenic disturbances and their ecological risk, 3) regional ecological rehabilitation and security pattern design, 4) scenario analysis of the environmental benefits of the regional ecological security pattern, 5) re-optimization of the regional ecological system framework, and 6) establishment of a regional ecosystem management plan.

  18. Systematic Reviews and Meta-Analyses of Home Telemonitoring Interventions for Patients With Chronic Diseases: A Critical Assessment of Their Methodological Quality

    PubMed Central

    2013-01-01

    While several criteria were met satisfactorily by either all or nearly all reviews, such as the establishment of an a priori design with inclusion and exclusion criteria, use of electronic searches on multiple databases, and reporting of study characteristics, there were other important areas that needed improvement. Duplicate data extraction, manual searches of highly relevant journals, inclusion of gray and non-English literature, and assessment of the methodological quality of included studies and quality of evidence were key methodological procedures that were performed infrequently. Furthermore, certain methodological limitations identified in the synthesis of study results have affected the results and conclusions of some reviews. Conclusions Despite the availability of methodological guidelines that can be utilized to guide the proper conduct of systematic reviews and meta-analyses and eliminate potential risks of bias, this knowledge has not yet been fully integrated in the area of home telemonitoring. Further efforts should be made to improve the design, conduct, reporting, and publication of systematic reviews and meta-analyses in this area. PMID:23880072

  19. [Marxism as a theoretical and methodological framework in collective health: implications for systematic review and synthesis of evidence].

    PubMed

    Soares, Cassia Baldini; Campos, Celia Maria Sivalli; Yonekura, Tatiana

    2013-12-01

    In this study, we discuss the integration in systematic reviews of research developed from a Marxist perspective of knowledge production, and of its results as evidence in healthcare. The study objectives are to review the assumptions of dialectical and historical materialism (DHM) and discuss the implications of dialectics for literature review and the synthesis of evidence. DHM is a powerful framework for knowledge generation and for the transformation of policies and practices in healthcare. It assumes that social contradictions underlie the health-disease process, the fundamental theoretical construct in the field of collective health. Currently, we observe a considerable influence of the critical paradigm, of Marxist origin, on the construction of knowledge in health. Studies based on this critical paradigm incorporate complex methods, inherent to the guidelines of dialectics, to identify the object and arrive at results that constitute evidence in healthcare. Systematic reviews should address the methodological difficulties associated with fully integrating these results into healthcare.

  20. Design methodology of the strength properties of medical knitted meshes

    NASA Astrophysics Data System (ADS)

    Mikołajczyk, Z.; Walkowska, A.

    2016-07-01

    One of the most important utility properties of medical knitted meshes intended for hernia and urological treatment is their bidirectional strength along the courses and wales. The value of this parameter expected by manufacturers and surgeons is estimated at 100 N per 5 cm of sample width. Most frequently, these meshes are produced on the basis of single- or double-guide stitches. They are made of polypropylene and polyester monofilament yarns with diameters in the range from 0.6 to 1.2 mm, characterized by high medical purity. The aim of the study was to develop a design methodology for mesh strength based on the geometrical construction of the stitch and the strength of the yarn. In the environment of the ProCAD warpknit 5 software, a simulated stretching process of the meshes, together with an analysis of their geometry changes, was carried out. Simulations were made for four selected representative stitches. The real parameters of the loop geometry of the meshes were measured both on a purpose-built, unique measuring stand and on a tensile testing machine. A model of the mechanical stretching of warp-knitted meshes along the courses and wales was developed. The thesis was advanced that the force that breaks a loop of warp-knitted fabric is the lowest of the breaking forces of the yarns linking the loop or forming its straight sections. This thesis was associated with the theory of strength that uses the "weakest link concept". Experimental verification of the model was carried out for the basic structure of the single-guide mesh. It has been shown that the real, relative strength of the mesh related to one course is equal to the strength of yarn breakage in a loop, while the strength along the wales is close to the breaking strength of a single yarn. In relation to the specific construction of the medical mesh, based on knowledge of the density of the loop structure, the a-jour mesh geometry and the yarn strength, it is possible, with high
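
    The "weakest link" thesis above reduces to a minimum over yarn breaking forces, scaled by loop density across the sample width. The forces and loop count below are invented for illustration.

```python
# Weakest-link estimate of mesh strength per 5 cm sample width.
def loop_strength(segment_forces):
    """A loop fails at the lowest breaking force among its yarn segments."""
    return min(segment_forces)

def mesh_strength_per_width(segment_forces, loops_per_5cm):
    """Total breaking force over a 5 cm sample width."""
    return loop_strength(segment_forces) * loops_per_5cm

forces = [28.0, 31.5, 26.0]   # assumed breaking forces of link/straight yarns (N)
total = mesh_strength_per_width(forces, 5)   # 5 loops per 5 cm of width
```

    With these assumed values the mesh delivers 130 N per 5 cm, clearing the 100 N target quoted in the abstract; the design question is then which stitch geometry keeps the weakest segment strong enough.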

  1. Methodological quality of systematic reviews and clinical trials on women's health published in a Brazilian evidence-based health journal

    PubMed Central

    Macedo, Cristiane Rufino; Riera, Rachel; Torloni, Maria Regina

    2013-01-01

    OBJECTIVES: To assess the quality of systematic reviews and clinical trials on women's health recently published in a Brazilian evidence-based health journal. METHOD: All systematic reviews and clinical trials on women's health published in the last five years in the Brazilian Journal of Evidence-based Health were retrieved. Two independent reviewers critically assessed the methodological quality of reviews and trials using AMSTAR and the Cochrane Risk of Bias Table, respectively. RESULTS: Systematic reviews and clinical trials accounted for less than 10% of the 61 original studies on women's health published in the São Paulo Medical Journal over the last five years. All five reviews were considered to be of moderate quality; the worst domains were publication bias and the appropriate use of study quality in formulating conclusions. All three clinical trials were judged to have a high risk of bias. The participant blinding, personnel and outcome assessors and allocation concealment domains had the worst scores. CONCLUSIONS: Most of the systematic reviews and clinical trials on women's health recently published in a Brazilian evidence-based journal are of low to moderate quality. The quality of these types of studies needs improvement. PMID:23778332

  2. Methodological Quality and Reporting of Generalized Linear Mixed Models in Clinical Medicine (2000–2012): A Systematic Review

    PubMed Central

    Casals, Martí; Girabent-Farrés, Montserrat; Carrasco, Josep L.

    2014-01-01

    Background Modeling count and binary data collected in hierarchical designs has increased the use of Generalized Linear Mixed Models (GLMMs) in medicine. This article presents a systematic review of the application and quality of results and information reported from GLMMs in the field of clinical medicine. Methods A search using the Web of Science database was performed for published original articles in medical journals from 2000 to 2012. The search strategy included the topics "generalized linear mixed models", "hierarchical generalized linear models" and "multilevel generalized linear model", and the research domain was refined to science technology. Papers reporting methodological considerations without application, and those that were not involved in clinical medicine or written in English, were excluded. Results A total of 443 articles were detected, with an increase over time in the number of articles. In total, 108 articles fit the inclusion criteria. Of these, 54.6% were declared to be longitudinal studies, whereas 58.3% and 26.9% were defined as repeated measurements and multilevel designs, respectively. Twenty-two articles belonged to environmental and occupational public health, 10 articles to clinical neurology, 8 to oncology, and 7 to infectious diseases and pediatrics. The distribution of the response variable was reported in 88% of the articles, predominantly Binomial (n = 64) or Poisson (n = 22). Most of the useful information about GLMMs was not reported in most cases. Variance estimates of random effects were described in only 8 articles (9.2%). The model validation, the method of covariate selection and the method of goodness of fit were reported in only 8.0%, 36.8% and 14.9% of the articles, respectively. Conclusions During recent years, the use of GLMMs in the medical literature has increased to take into account the correlation of data when modeling qualitative data or counts. According to the current recommendations, the quality of

  3. Aerospace engineering design by systematic decomposition and multilevel optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.; Barthelemy, J. F. M.; Giles, G. L.

    1984-01-01

    A method for systematic analysis and optimization of large engineering systems, by decomposition of a large task into a set of smaller subtasks that are solved concurrently, is described. The subtasks may be arranged in hierarchical levels. Analyses are carried out in each subtask using inputs received from other subtasks, and are followed by optimizations carried out from the bottom up. Each optimization at the lower levels is augmented by an analysis of its sensitivity to the inputs received from other subtasks, to account for the couplings among the subtasks in a formal manner. The analysis and optimization operations alternate iteratively until they converge to a system design whose performance is maximized with all constraints satisfied. The method, which is still under development, is tentatively validated by test cases in structural applications and an aircraft configuration optimization.
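
    The alternate-and-iterate idea can be caricatured with two "subtasks" that each optimize their own variable while holding the coupling variable fixed, repeating until the coupled system converges. The quadratic objective is an invented stand-in for a real engineering system, and this sketch omits the sensitivity analysis the method uses to formalize the coupling.

```python
# Block-coordinate sketch of decomposed optimization of
#   f(x, y) = (x - y)^2 + x^2 + (y - 4)^2.
def solve_sub1(y):
    # Subtask 1: minimize (x - y)^2 + x^2 over x  ->  x = y / 2
    return y / 2.0

def solve_sub2(x):
    # Subtask 2: minimize (x - y)^2 + (y - 4)^2 over y  ->  y = (x + 4) / 2
    return (x + 4.0) / 2.0

def coordinate(iters=50):
    x, y = 0.0, 0.0
    for _ in range(iters):
        x = solve_sub1(y)    # lower-level optimization with coupling input y
        y = solve_sub2(x)    # upper-level optimization with coupling input x
    return x, y

x, y = coordinate()
```

    For this convex example the iteration converges to the true joint optimum (x, y) = (4/3, 8/3); in general, convergence of such decompositions is exactly what the sensitivity augmentation is meant to secure.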

  4. A Formal Semantics for the SRI Hierarchical Program Design Methodology

    NASA Technical Reports Server (NTRS)

    Boyer, R. S.; Moore, J. S.

    1983-01-01

    A formal statement of what it means to use (a subset of) the methodology is presented. It is formally defined what it means to say that some specified module exists and that another module is correctly implemented on top of it. No attention is paid to motivation, either of the methodology or of its formal development. Concentration is entirely upon mathematical succinctness and precision. A discussion is presented of how to use certain INTERLISP programs which implement the formal definitions. Among these is a program which generates Floyd-style verification conditions sufficient to imply the correctness of a module implementation.

  5. Methodological Quality Assessment of Meta-Analyses and Systematic Reviews of Probiotics in Inflammatory Bowel Disease and Pouchitis

    PubMed Central

    Teng, Guigen; Wei, Tiantong; Gao, Wen; Wang, Huahong

    2016-01-01

    Background Probiotics are widely used for the induction and maintenance of remission in inflammatory bowel disease (IBD) and pouchitis. There are a large number of meta-analyses (MAs)/ systematic reviews (SRs) on this subject, the methodological quality of which has not been evaluated. Objectives This study aimed to evaluate the methodological quality of and summarize the evidence obtained from MAs/SRs of probiotic treatments for IBD and pouchitis patients. Methods The PubMed, EMBASE, Cochrane Library and China National Knowledge Infrastructure (CNKI) databases were searched to identify Chinese and English language MAs/SRs of the use of probiotics for IBD and pouchitis. The Assessment of Multiple Systematic Reviews (AMSTAR) scale was used to assess the methodological quality of the studies. Results A total of 36 MAs/SRs were evaluated. The AMSTAR scores of the included studies ranged from 1 to 10, and the average score was 5.81. According to the Canadian Agency for Drugs and Technologies in Health, 4 articles were classified as high quality, 24 articles were classified as moderate quality, and 8 articles were classified as low quality. Most of the MAs/SRs suggested that probiotics had potential benefits for patients with ulcerative colitis (UC), but failed to show effectiveness in the induction and maintenance of remission in Crohn’s disease (CD). The probiotic preparation VSL#3 may play a beneficial role in pouchitis. Conclusion The overall methodological quality of the current MAs/SRs in the field of probiotics for IBD and pouchitis was found to be low to moderate. More MAs/SRs of high quality are required to support using probiotics to treat IBD and pouchitis. PMID:28005973

  6. Applying Set Based Methodology In Submarine Concept Design

    DTIC Science & Technology

    2010-06-01

    powering, trim, etc. By systematically addressing each element in sequence and doing so in increasing detail in each pass around the spiral, a single...constant and proactive oversight. Involvement of the Technical Warrant Holders (TWHs) 4 was fundamental to the SSC implementation of SBD. They...4 Technical Warrant Holders are individuals holding Technical Authority (TA) for a given technical area. The TWH is

  7. Staffing by Design: A Methodology for Staffing Reference

    ERIC Educational Resources Information Center

    Ward, David; Phetteplace, Eric

    2012-01-01

    The growth in number and kind of online reference services has resulted in both new users consulting library research services as well as new patterns of service use. Staffing in-person and virtual reference services desks adequately requires a systematic analysis of patterns of use across service points in order to successfully meet fluctuating…

  8. Establishing Equivalence: Methodological Progress in Group-Matching Design and Analysis

    ERIC Educational Resources Information Center

    Kover, Sara T.; Atwood, Amy K.

    2013-01-01

    This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs used in behavioral research on cognition and…

  9. De/signing Research in Education: Patchwork(ing) Methodologies with Theory

    ERIC Educational Resources Information Center

    Higgins, Marc; Madden, Brooke; Berard, Marie-France; Lenz Kothe, Elsa; Nordstrom, Susan

    2017-01-01

    Four education scholars extend the methodological space inspired by Jackson and Mazzei's "Thinking with Theory" through focusing on research design. The notion of de/sign is presented and employed to counter prescriptive method/ology that often sutures over pedagogical possibilities in research and educational settings. Key…

  10. Methodology for designing accelerated aging tests for predicting life of photovoltaic arrays

    NASA Technical Reports Server (NTRS)

    Gaines, G. B.; Thomas, R. E.; Derringer, G. C.; Kistler, C. W.; Bigg, D. M.; Carmichael, D. C.

    1977-01-01

    A methodology for designing aging tests in which life prediction was paramount was developed. The methodology builds upon experience with regard to aging behavior in those material classes which are expected to be utilized as encapsulant elements, viz., glasses and polymers, and upon experience with the design of aging tests. The experiences were reviewed, and results are discussed in detail.
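
    One common ingredient of such accelerated-aging test design for polymers is an Arrhenius acceleration factor relating life at the stress temperature to life at the use temperature. The activation energy below is an assumed example value, not taken from the report.

```python
# Arrhenius temperature acceleration factor for an accelerated aging test.
import math

K_B = 8.617e-5        # Boltzmann constant, eV/K

def acceleration_factor(ea_ev, t_use_c, t_stress_c):
    """AF = exp[(Ea/k) * (1/T_use - 1/T_stress)], temperatures in Celsius."""
    t_use = t_use_c + 273.15
    t_stress = t_stress_c + 273.15
    return math.exp((ea_ev / K_B) * (1.0 / t_use - 1.0 / t_stress))

# Assumed Ea = 0.7 eV: aging at 85 C proceeds roughly 100x faster than at 25 C.
af = acceleration_factor(0.7, 25.0, 85.0)
```

    A predicted field life then follows by multiplying the observed stress-test life by the factor, which is why the choice of activation energy dominates the test design.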

  11. Methodology of Computer-Aided Design of Variable Guide Vanes of Aircraft Engines

    ERIC Educational Resources Information Center

    Falaleev, Sergei V.; Melentjev, Vladimir S.; Gvozdev, Alexander S.

    2016-01-01

    The paper presents a methodology which helps to avoid a great amount of costly experimental research. This methodology includes thermo-gas dynamic design of an engine and its mounts, the profiling of compressor flow path and cascade design of guide vanes. Employing a method elaborated by Howell, we provide a theoretical solution to the task of…

  12. Design guided data analysis for summarizing systematic pattern defects and process window

    NASA Astrophysics Data System (ADS)

    Xie, Qian; Venkatachalam, Panneerselvam; Lee, Julie; Chen, Zhijin; Zafar, Khurram

    2016-03-01

    As the semiconductor process technology moves into more advanced nodes, design and process induced systematic defects become increasingly significant yield limiters. Therefore, early detection of these defects is crucial. Focus Exposure Matrix (FEM) and Process Window Qualification (PWQ) are routine methods for discovering systematic patterning defects and establishing the lithography process window. These methods require the stepper to expose a reticle onto the wafer at various focus and exposure settings (also known as modulations). The wafer is subsequently inspected by a bright field, broadband plasma or an E-Beam Inspection tool using a high sensitivity inspection recipe (i.e. hot scan) that often reports a million or more defects. Analyzing this vast stream of data to identify the weak patterns and arrive at the optimal focus/exposure settings requires a significant amount of data reduction through aggressive sampling and nuisance filtering schemes. However, these schemes increase alpha risk, i.e. the probability of not catching some systematic or otherwise important defects within a modulation and thus reporting that modulation as a good condition for production wafers. In order to reduce this risk and establish a more accurate process window, we describe a technique that introduces image-and-design integration methodologies into the inspection data analysis workflow. These image-and-design integration methodologies include contour extraction and alignment to design, contour-to-design defect detection, defective/nuisance pattern retrieval, confirmed defective/nuisance pattern overlay with inspection data, and modulation-related weak-pattern ranking. The technique we present provides greater automation, from defect detection to defective pattern retrieval to decision-making steps, that allows for statistically summarized results and increased coverage of the wafer to be achieved without an adverse impact on cycle time. 
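
    The data-reduction step described above, collapsing a huge defect stream into repeating pattern groups ranked for review, can be sketched as a group-by on a pattern signature. The field names and pattern labels below are invented.

```python
# Group reported defects by a design-pattern signature and rank groups by size,
# so the most systematic (most repeated) patterns are reviewed first.
from collections import defaultdict

def group_defects(defects):
    """defects: list of dicts with a 'pattern' signature (e.g. a clip hash)."""
    groups = defaultdict(list)
    for d in defects:
        groups[d["pattern"]].append(d)
    return sorted(groups.items(), key=lambda kv: -len(kv[1]))

# Illustrative stream: two repeating weak patterns with made-up names.
defects = [{"id": i, "pattern": "dense_line_end" if i % 3 else "iso_via"}
           for i in range(9)]
ranked = group_defects(defects)
```

    Reviewing one representative per group, largest group first, is what turns millions of raw detections into a tractable list of candidate systematic defects.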
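The defective-pattern retrieval and grouping step described above reduces, at its core, to hashing each defect report to a coarse design-pattern signature so that repeated design-induced defects collapse into one group. A minimal sketch follows; the defect record layout and grid size are hypothetical, chosen only for illustration:

```python
from collections import defaultdict

def group_defects(defects, grid=4):
    """Collapse raw defect reports into design-pattern groups.

    Each defect record carries the layer and location of the design clip
    around it (a hypothetical layout, for illustration only); snapping the
    coordinates to a coarse grid makes repeated design-induced defects
    hash to the same signature.
    """
    groups = defaultdict(list)
    for d in defects:
        signature = (d["layer"], d["x"] // grid, d["y"] // grid)
        groups[signature].append(d)
    return groups

defects = [
    {"layer": "M1", "x": 100, "y": 50},
    {"layer": "M1", "x": 101, "y": 49},   # same pattern, slight placement noise
    {"layer": "CT", "x": 10,  "y": 10},
]
print(len(group_defects(defects)))  # 2 pattern groups instead of 3 raw defects
```

Real design-based metrology tools match full design clips rather than snapped coordinates, but the data-reduction principle is the same: millions of raw reports become a reviewable number of groups.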

  13. Behavioral Methodology for Designing and Evaluating Applied Programs for Women.

    ERIC Educational Resources Information Center

    Thurston, Linda P.

    To be maximally effective in solving problems, researchers must place their methodological and theoretical models of science within social and political contexts. They must become aware of biases and assumptions and move toward a more valid perception of social realities. Psychologists must view women in the situational context within which…

  14. Peer Review of a Formal Verification/Design Proof Methodology

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The role of formal verification techniques in system validation was examined. The value and the state of the art of performance proving for fault-tolerant computers were assessed. The investigation, development, and evaluation of performance-proving tools were reviewed. The technical issues related to proof methodologies are examined and summarized.

  15. Loss Exposure and Risk Analysis Methodology (LERAM) Project Database Design.

    DTIC Science & Technology

    1996-06-01

    MISREPS) to more capably support system safety engineering concepts such as hazard analysis and risk management. As part of the Loss Exposure and Risk Analysis Methodology (LERAM) project, the research into the methods which we employ to report, track, and analyze hazards has resulted in a series of low

  16. Integrated Controls-Structures Design Methodology for Flexible Spacecraft

    NASA Technical Reports Server (NTRS)

    Maghami, P. G.; Joshi, S. M.; Price, D. B.

    1995-01-01

    This paper proposes an approach for the design of flexible spacecraft, wherein the structural design and the control system design are performed simultaneously. The integrated design problem is posed as an optimization problem in which both the structural parameters and the control system parameters constitute the design variables, which are used to optimize a common objective function, thereby resulting in an optimal overall design. The approach is demonstrated by application to the integrated design of a geostationary platform, and to a ground-based flexible structure experiment. The numerical results obtained indicate that the integrated design approach generally yields spacecraft designs that are substantially superior to the conventional approach, wherein the structural design and control design are performed sequentially.

  17. USDA Nutrition Evidence Library: methodology used to identify topics and develop systematic review questions for the birth-to-24-mo population.

    PubMed

    Obbagy, Julie E; Blum-Kemelor, Donna M; Essery, Eve V; Lyon, Joan M G; Spahn, Joanne M

    2014-03-01

    The USDA's Nutrition Evidence Library (NEL) specializes in conducting food- and nutrition-related systematic reviews that are used to inform federal government decision making. To ensure the utility of NEL systematic reviews, the most relevant topics must be addressed, questions must be clearly focused and appropriate in scope, and review frameworks must reflect the state of the science. Identifying the optimal topics and questions requires input from a variety of stakeholders, including scientists with technical expertise, as well as government policy and program leaders. The objective of this article is to describe the rationale and NEL methodology for identifying topics and developing systematic review questions implemented as part of the "Evaluating the evidence base to support the inclusion of infants and children from birth to 24 months of age in the Dietary Guidelines for Americans--the B-24 Project." This is the first phase of a larger project designed to develop dietary guidance for the birth to 24-mo population in the United States.

  18. Nominating under Constraints: A Systematic Comparison of Unlimited and Limited Peer Nomination Methodologies in Elementary School

    ERIC Educational Resources Information Center

    Gommans, Rob; Cillessen, Antonius H. N.

    2015-01-01

    Children's peer relationships are frequently assessed with peer nominations. An important methodological issue is whether to collect unlimited or limited nominations. Some researchers have argued that the psychometric differences between both methods are negligible, while others have claimed that one is superior over the other. The current study…

  19. A Systematic Review of Brief Functional Analysis Methodology with Typically Developing Children

    ERIC Educational Resources Information Center

    Gardner, Andrew W.; Spencer, Trina D.; Boelter, Eric W.; DuBard, Melanie; Jennett, Heather K.

    2012-01-01

    Brief functional analysis (BFA) is an abbreviated assessment methodology derived from traditional extended functional analysis methods. BFAs are often conducted when time constraints in clinics, schools or homes are of concern. While BFAs have been used extensively to identify the function of problem behavior for children with disabilities, their…

  20. Response surface methodology and process optimization of sustained release pellets using Taguchi orthogonal array design and central composite design

    PubMed Central

    Singh, Gurinder; Pai, Roopa S.; Devi, V. Kusum

    2012-01-01

    Furosemide is a powerful diuretic and antihypertensive drug which has low bioavailability due to hepatic first-pass metabolism and has a short half-life of 2 hours. To overcome the above drawback, the present study was carried out to formulate and evaluate sustained release (SR) pellets of furosemide for oral administration prepared by extrusion/spheronization. Drug Coat L-100 was used within the pellet core along with microcrystalline cellulose as the diluent, and the concentration of the selected binder was optimized to be 1.2%. The formulation was prepared with a drug to polymer ratio of 1:3. It was optimized using Design of Experiments by employing a 3² central composite design combined with response surface methodology to systematically optimize the process parameters. Dissolution studies were carried out with USP apparatus Type I (basket type) in both simulated gastric and intestinal pH. Statistical analysis of the in vitro data (two-tailed paired t test and one-way ANOVA) showed a significant (P≤0.05) difference in the dissolution profile of furosemide SR pellets when compared with the pure drug and a commercial product. Validation of the process optimization study indicated an extremely high degree of prognostic ability. The study effectively undertook the development of optimized process parameters for pelletization of furosemide pellets with excellent SR characteristics. PMID:22470891
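As a rough illustration of the 3² design mentioned in the abstract: with two process factors at three coded levels each, the full run matrix has 3² = 9 points. The sketch below enumerates such a matrix; the factor names are invented, and the study's actual central composite structure (axial and replicated center points) is not reproduced:

```python
from itertools import product

def full_factorial(levels, n_factors):
    """All level combinations for an n-factor design at the given coded levels."""
    return list(product(levels, repeat=n_factors))

# Two hypothetical process factors (e.g. binder level and spheronization
# speed) at coded levels -1/0/+1: a 3^2 design gives 9 runs.
runs = full_factorial([-1, 0, 1], 2)
print(len(runs))       # 9
print((0, 0) in runs)  # the center point is one of them
```

Each coded run is then decoded to physical settings, executed, and the measured responses are fit with a quadratic response-surface model.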

  1. Game Methodology for Design Methods and Tools Selection

    ERIC Educational Resources Information Center

    Ahmad, Rafiq; Lahonde, Nathalie; Omhover, Jean-françois

    2014-01-01

    Design process optimisation and intelligence are the key words of today's scientific community. A proliferation of methods has made design a convoluted area. Designers are usually afraid of selecting one method/tool over another and even expert designers may not necessarily know which method is the best to use in which circumstances. This…

  2. A Rapid Python-Based Methodology for Target-Focused Combinatorial Library Design.

    PubMed

    Li, Shiliang; Song, Yuwei; Liu, Xiaofeng; Li, Honglin

    2016-01-01

    The chemical space is so vast that only a small portion of it has been examined. As a complementary approach to systematically probe the chemical space, virtual combinatorial library design has had an enormous impact on generating novel and diverse structures for drug discovery. Despite these favorable contributions, high attrition rates in drug development, resulting mainly from lack of efficacy and side effects, make it increasingly challenging to discover good chemical starting points. In most cases, focused libraries, which are restricted to particular regions of the chemical space, are deftly exploited to maximize hit rate and improve efficiency at the beginning of the drug discovery and drug development pipeline. This paper presents a valid methodology for fast target-focused combinatorial library design in both reaction-based and product-based ways, with library creation rates of approximately 70,000 molecules per second. Simple, quick and convenient operating procedures are the specific features of the method. SHAFTS, a hybrid 3D similarity calculation software, was embedded to help refine the size of the libraries and improve hit rates. Two target-focused (p38-focused and COX2-focused) libraries were constructed efficiently in this study. This rapid library enumeration method is portable and applicable to any other targets for identification of good chemical starting points in combination with either structure-based or ligand-based virtual screening.
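Product-based enumeration of a combinatorial library reduces, at its core, to a Cartesian product over substituent pools, which is why enumeration rates can be very high. A minimal sketch with hypothetical fragment labels (a real enumerator such as the one described would combine actual chemical structures):

```python
from itertools import product

# Hypothetical substituent pools at two variation points of a scaffold;
# real tools would combine reagent or fragment structures instead.
r1 = ["F", "Cl", "OMe", "NH2"]
r2 = ["Me", "Et", "Ph"]

library = [f"scaffold({a},{b})" for a, b in product(r1, r2)]
print(len(library))  # 4 * 3 = 12 enumerated products
```

The library size is the product of the pool sizes, so focusing the pools (e.g. by 3D similarity filtering, as with SHAFTS) is what keeps a target-focused library tractable.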

  3. Total Synthesis of Vinblastine, Related Natural Products, and Key Analogues and Development of Inspired Methodology Suitable for the Systematic Study of Their Structure–Function Properties

    PubMed Central

    2015-01-01

    Conspectus Biologically active natural products composed of fascinatingly complex structures are often regarded as not amenable to traditional systematic structure–function studies enlisted in medicinal chemistry for the optimization of their properties beyond what might be accomplished by semisynthetic modification. Herein, we summarize our recent studies on the Vinca alkaloids vinblastine and vincristine, often considered as prototypical members of such natural products, that not only inspired the development of powerful new synthetic methodology designed to expedite their total synthesis but have subsequently led to the discovery of several distinct classes of new, more potent, and previously inaccessible analogues. With use of the newly developed methodology and in addition to ongoing efforts to systematically define the importance of each embedded structural feature of vinblastine, two classes of analogues already have been discovered that enhance the potency of the natural products >10-fold. In one instance, remarkable progress has also been made on the refractory problem of reducing Pgp transport responsible for clinical resistance with a series of derivatives made accessible only using the newly developed synthetic methodology. Unlike the removal of vinblastine structural features or substituents, which typically has a detrimental impact, the additions of new structural features have been found that can enhance target tubulin binding affinity and functional activity while simultaneously disrupting Pgp binding, transport, and functional resistance. Already analogues are in hand that are deserving of full preclinical development, and it is a tribute to the advances in organic synthesis that they are readily accessible even on a natural product of a complexity once thought refractory to such an approach. PMID:25586069

  4. Drift design methodology and preliminary application for the Yucca Mountain Site Characterization Project; Yucca Mountain Site Characterization Project

    SciTech Connect

    Hardy, M.P.; Bauer, S.J.

    1991-12-01

    Excavation stability in an underground nuclear waste repository is required during construction, emplacement, retrieval (if required), and closure phases to ensure worker health and safety, and to prevent development of potential pathways for radionuclide migration in the post-closure period. Stable excavations are developed by appropriate excavation procedures, design of the room shape, design and installation of rock support reinforcement systems, and implementation of appropriate monitoring and maintenance programs. In addition to the loads imposed by the in situ stress field, the repository drifts will be impacted by thermal loads developed after waste emplacement and, periodically, by seismic loads from naturally occurring earthquakes and underground nuclear events. A priori evaluation of stability is required for design of the ground support system, to confirm that the thermal loads are reasonable, and to support the license application process. In this report, a design methodology for assessing drift stability is presented. This is based on site conditions, together with empirical and analytical methods. Analytical numerical methods are emphasized at this time because empirical data are unavailable for excavations in welded tuff either at elevated temperatures or under seismic loads. The analytical methodology incorporates analysis of rock masses that are systematically jointed, randomly jointed, and sparsely jointed. In situ thermal and seismic loads are considered. Methods of evaluating the analytical results and estimating ground support requirements for the full range of expected ground conditions are outlined. The results of a preliminary application of the methodology using the limited available data are presented. 26 figs., 55 tabs.

  5. Design methodology for optimal hardware implementation of wavelet transform domain algorithms

    NASA Astrophysics Data System (ADS)

    Johnson-Bey, Charles; Mickens, Lisa P.

    2005-05-01

    The work presented in this paper lays the foundation for the development of an end-to-end system design methodology for implementing wavelet domain image/video processing algorithms in hardware using Xilinx field programmable gate arrays (FPGAs). With the integration of the Xilinx System Generator toolbox, this methodology will allow algorithm developers to design and implement their code using the familiar MATLAB/Simulink development environment. By using this methodology, algorithm developers will not be required to become proficient in the intricacies of hardware design, thus reducing the design cycle and time-to-market.

  6. An overview of systematic review.

    PubMed

    Baker, Kathy A; Weeks, Susan Mace

    2014-12-01

    Systematic review is an invaluable tool for the practicing clinician. A well-designed systematic review represents the latest and most complete information available on a particular topic or intervention. This article highlights the key elements of systematic review, what it is and is not, and provides an overview of several reputable organizations supporting the methodological development and conduct of systematic review. Important aspects for evaluating the quality of a systematic review are also included.

  7. Design of a methodology for assessing an electrocardiographic telemonitoring system.

    PubMed

    Alfonzo, A; Huerta, M K; Wong, S; Passariello, G; Díaz, M; La Cruz, A; Cruz, J

    2007-01-01

    Recent studies in bioengineering show great interest in telemedicine projects, motivated mainly by the fast communication technologies developed during the last decade. Since then many telemedicine projects in different areas have been pursued, among them electrocardiographic monitoring, as well as methodological reports for the evaluation of these projects. In this work a methodology to evaluate an electrocardiographic telemonitoring system is presented. A procedure to verify the operation of the Data Acquisition Module (DAM) of an electrocardiographic telemonitoring system is given, taking defined standards as reference, along with procedures for the measurement of the Quality of Service (QoS) parameters required by the system in a Local Area Network (LAN). Finally a graphical model and protocols of evaluation are proposed.

  8. Using QALYs in telehealth evaluations: a systematic review of methodology and transparency

    PubMed Central

    2014-01-01

    Background The quality-adjusted life-year (QALY) is a recognised outcome measure in health economic evaluations. QALY incorporates individual preferences and identifies health gains by combining mortality and morbidity into one single index number. A literature review was conducted to examine and discuss the use of QALYs to measure outcomes in telehealth evaluations. Methods Evaluations were identified via a literature search in all relevant databases. Only economic evaluations measuring both costs and QALYs using primary patient level data of two or more alternatives were included. Results A total of 17 economic evaluations estimating QALYs were identified. All evaluations used validated generic health-related quality of life (HRQoL) instruments to describe health states. They used accepted methods for transforming the quality scores into utility values. The methodology used varied between the evaluations. The evaluations used four different preference measures (EQ-5D, SF-6D, QWB and HUI3), and utility scores were elicited from the general population. Most studies reported the methodology used in calculating QALYs. The evaluations were less transparent in reporting utility weights at different time points and variability around utilities and QALYs. Few made adjustments for differences in baseline utilities. The QALYs gained in the reviewed evaluations varied from 0.001 to 0.118, implying a small but positive effect of telehealth intervention on patients' health. The evaluations reported mixed cost-effectiveness results. Conclusion The use of QALYs in telehealth evaluations has increased over the last few years. Different methodologies and utility measures have been used to calculate QALYs. A more harmonised methodology and utility measure are needed to ensure comparability across telehealth evaluations. PMID:25086443
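The QALY calculation the review examines is, at heart, the area under a utility-time curve: utility weights measured at follow-up time points are integrated over time. A minimal trapezoidal-rule sketch (the utility values are illustrative, not taken from the reviewed evaluations):

```python
def qalys(times_years, utilities):
    """QALYs as the area under the utility-time curve (trapezoidal rule).

    `utilities` are HRQoL index scores in [0, 1] measured at `times_years`.
    """
    total = 0.0
    for i in range(len(times_years) - 1):
        dt = times_years[i + 1] - times_years[i]
        total += (utilities[i] + utilities[i + 1]) / 2 * dt
    return total

# One year of follow-up with utilities measured at 0, 6 and 12 months:
print(round(qalys([0.0, 0.5, 1.0], [0.70, 0.80, 0.78]), 3))  # 0.77
```

The incremental QALY gain of an intervention is then the difference between such areas for the two study arms, ideally after adjusting for baseline utility differences, which the review notes few evaluations did.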

  9. Patient education in osteoporosis prevention: a systematic review focusing on methodological quality of randomised controlled trials.

    PubMed

    Morfeld, Jana-Carina; Vennedey, Vera; Müller, Dirk; Pieper, Dawid; Stock, Stephanie

    2017-02-24

    This review summarizes evidence regarding the effects of patient education in osteoporosis prevention and treatment. The included studies reveal mixed results on a variety of endpoints. Methodological improvement of future RCTs (e.g. with regard to randomization and duration of follow-up) might yield more conclusive evidence on the effects of patient education in osteoporosis. INTRODUCTION: This review aims to evaluate the effects of patient education on osteoporosis prevention and treatment results.

  10. Clinical tests of the sacroiliac joint. A systematic methodological review. Part 1: Reliability.

    PubMed

    van der Wurff, P; Hagmeijer, R H; Meyne, W

    2000-02-01

    In the literature concerning the sacroiliac joint (SIJ) there are numerous specific tests used to detect joint mobility or pain provocation. In this article the authors have reviewed 11 studies which investigated the reliability of these tests. The methodological quality of the studies was tested by a list of criteria developed by the authors. This list consisted of three categories: (1) study population, (2) test procedures and (3) test results. To each criterion a weighting was attached. The methodological score for nine out of the 11 studies was found to be acceptable. The results of this review, however, could not demonstrate reliable outcomes and therefore no evidence on which to base acceptance of mobility tests of the SIJ into daily clinical practice. There are no indications that 'upgrading' of methodological quality would have improved the final conclusions. With respect to pain provocation tests, the findings did not show the same trend. Two studies demonstrated reliable results using the Gaenslen test and the Thigh thrust test. One study showed acceptable reliability for five other pain provocation tests; however, since other authors have described contradictory results, there is a necessity for further research in this area with an emphasis on multiple test scores and pain provocation tests of the SIJ.

  11. Ethics of Engagement: User-Centered Design and Rhetorical Methodology.

    ERIC Educational Resources Information Center

    Salvo, Michael J.

    2001-01-01

    Explores the shift from observation of users to participation with users, describing and investigating three examples of user-centered design practice in order to consider the new ethical demands being made of technical communicators. Explores Pelle Ehn's participatory design method, Roger Whitehouse's design of tactile signage for blind users,…

  12. Propulsion integration of hypersonic air-breathing vehicles utilizing a top-down design methodology

    NASA Astrophysics Data System (ADS)

    Kirkpatrick, Brad Kenneth

    In recent years, a focus of aerospace engineering design has been the development of advanced design methodologies and frameworks to account for increasingly complex and integrated vehicles. Techniques such as parametric modeling, global vehicle analyses, and interdisciplinary data sharing have been employed in an attempt to improve the design process. The purpose of this study is to introduce a new approach to integrated vehicle design known as the top-down design methodology. In the top-down design methodology, the main idea is to relate design changes on the vehicle system and sub-system level to a set of over-arching performance and customer requirements. Rather than focusing on the performance of an individual system, the system is analyzed in terms of the net effect it has on the overall vehicle and other vehicle systems. This detailed level of analysis can only be accomplished through the use of high fidelity computational tools such as Computational Fluid Dynamics (CFD) or Finite Element Analysis (FEA). The utility of the top-down design methodology is investigated through its application to the conceptual and preliminary design of a long-range hypersonic air-breathing vehicle for a hypothetical next generation hypersonic vehicle (NHRV) program. System-level design is demonstrated through the development of the nozzle section of the propulsion system. From this demonstration of the methodology, conclusions are made about the benefits, drawbacks, and cost of using the methodology.

  13. A transonic-small-disturbance wing design methodology

    NASA Technical Reports Server (NTRS)

    Phillips, Pamela S.; Waggoner, Edgar G.; Campbell, Richard L.

    1988-01-01

    An automated transonic design code has been developed which modifies an initial airfoil or wing in order to generate a specified pressure distribution. The design method uses an iterative approach that alternates between a potential-flow analysis and a design algorithm that relates changes in surface pressure to changes in geometry. The analysis code solves an extended small-disturbance potential-flow equation and can model a fuselage, pylons, nacelles, and a winglet in addition to the wing. A two-dimensional option is available for airfoil analysis and design. Several two- and three-dimensional test cases illustrate the capabilities of the design code.

  14. The Maltreatment-Offending Association: A Systematic Review of the Methodological Features of Prospective and Longitudinal Studies.

    PubMed

    Malvaso, Catia Gaetana; Delfabbro, Paul; Day, Andrew

    2015-12-09

    Although the association between childhood maltreatment and the subsequent development of offending behavior is well documented, the association does not necessarily reflect a causal relationship. This paper provides a systematic review of prospective and longitudinal studies using official records of maltreatment to gain insights into the extent to which methodological variations are likely to influence the conclusions drawn about the likely relationship between maltreatment and offending. Sixty-two original studies met the inclusion criteria. These studies were assessed according to a set of seven methodological criteria: (1) inclusion of comparison groups, (2) the use of statistical controls, (3) valid outcome measures, (4) operationalization of maltreatment, (5) proper temporal order of associations, (6) data relating to unsubstantiated maltreatment, and (7) consideration of mediating and moderating factors. The strength of evidence in support of the maltreatment-offending association was influenced by a number of methodological factors. Despite the increasing sophistication of studies, there is a need to be mindful of how these factors are taken into account in future research in order to gain a deeper understanding of the adverse consequences of maltreatment and how this might influence outcomes and inform interventions.

  15. Design Thinking: A Methodology towards Sustainable Problem Solving in Higher Education in South Africa

    ERIC Educational Resources Information Center

    Munyai, Keneilwe

    2016-01-01

    This short paper explores the potential contribution of design thinking methodology to the education and training system in South Africa. Design thinking is slowly gaining traction in South Africa, where it is offered by the Hasso Plattner Institute of Design Thinking at the University of Cape Town…

  16. Designing trials for pressure ulcer risk assessment research: methodological challenges.

    PubMed

    Balzer, K; Köpke, S; Lühmann, D; Haastert, B; Kottner, J; Meyer, G

    2013-08-01

    For decades various pressure ulcer risk assessment scales (PURAS) have been developed and implemented into nursing practice despite uncertainty whether use of these tools helps to prevent pressure ulcers. According to current methodological standards, randomised controlled trials (RCTs) are required to conclusively determine the clinical efficacy and safety of this risk assessment strategy. In these trials, PURAS-aided risk assessment has to be compared to nurses' clinical judgment alone in terms of its impact on pressure ulcer incidence and adverse outcomes. However, RCTs evaluating diagnostic procedures are prone to specific risks of bias and threats to the statistical power which may challenge their validity and feasibility. This discussion paper critically reflects on the rigour and feasibility of experimental research needed to substantiate the clinical efficacy of PURAS-aided risk assessment. Based on reflections of the methodological literature, a critical appraisal of available trials on this subject and an analysis of a protocol developed for a methodologically robust cluster-RCT, this paper arrives at the following conclusions: First, available trials do not provide reliable estimates of the impact of PURAS-aided risk assessment on pressure ulcer incidence compared to nurses' clinical judgment alone due to serious risks of bias and insufficient sample size. Second, it seems infeasible to assess this impact by means of rigorous experimental studies since sample size would become extremely high if likely threats to validity and power are properly taken into account. Third, means of evidence linkages currently seem to be the most promising approaches for evaluating the clinical efficacy and safety of PURAS-aided risk assessment. With this kind of secondary research, the downstream effect of use of PURAS on pressure ulcer incidence could be modelled by combining best available evidence for single parts of this pathway. However, to yield reliable modelling

  17. A Methodology for Quantifying Certain Design Requirements During the Design Phase

    NASA Technical Reports Server (NTRS)

    Adams, Timothy; Rhodes, Russel

    2005-01-01

    A methodology for developing and balancing quantitative design requirements for safety, reliability, and maintainability has been proposed. Conceived as the basis of a more rational approach to the design of spacecraft, the methodology would also be applicable to the design of automobiles, washing machines, television receivers, or almost any other commercial product. Heretofore, it has been common practice to start by determining the requirements for reliability of elements of a spacecraft or other system to ensure a given design life for the system. Next, safety requirements are determined by assessing the total reliability of the system and adding redundant components and subsystems necessary to attain safety goals. As thus described, common practice leaves the maintainability burden to fall to chance; therefore, there is no control of recurring costs or of the responsiveness of the system. The means that have been used in assessing maintainability have been oriented toward determining the logistical sparing of components so that the components are available when needed. The process established for developing and balancing quantitative requirements for safety (S), reliability (R), and maintainability (M) derives and integrates NASA's top-level safety requirements and the controls needed to obtain program key objectives for safety and recurring cost (see figure). Being quantitative, the process conveniently uses common mathematical models. Even though the process is shown as being worked from the top down, it can also be worked from the bottom up. This process uses three math models: (1) the binomial distribution (greater-than-or-equal-to case), (2) reliability for a series system, and (3) the Poisson distribution (less-than-or-equal-to case). The zero-fail case for the binomial distribution approximates the commonly known exponential distribution or "constant failure rate" distribution. Either model can be used. The binomial distribution was selected for
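The three math models named in the abstract can be sketched directly with the standard library; the numbers below are illustrative, not taken from the NASA process:

```python
from math import comb, exp, factorial

def series_reliability(element_rs):
    """Model 2: reliability of a series system is the product of its
    elements' reliabilities (independent elements assumed)."""
    r = 1.0
    for ri in element_rs:
        r *= ri
    return r

def binomial_at_least(n, k, p):
    """Model 1: P(at least k of n redundant items succeed), the
    greater-than-or-equal-to binomial case."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def poisson_at_most(lam, k):
    """Model 3: P(at most k failures) when failures follow a Poisson
    distribution with mean lam, the less-than-or-equal-to case."""
    return sum(exp(-lam) * lam**i / factorial(i) for i in range(k + 1))

# Three elements in series at R = 0.99, 0.98, 0.97:
print(round(series_reliability([0.99, 0.98, 0.97]), 4))  # 0.9411
```

Note that the zero-fail binomial case, `binomial_at_least(n, n, p) = p**n`, is the "constant failure rate" approximation the abstract mentions.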

  18. Optimal color design of psychological counseling room by design of experiments and response surface methodology.

    PubMed

    Liu, Wenjuan; Ji, Jianlin; Chen, Hua; Ye, Chenyu

    2014-01-01

    Color is one of the most powerful aspects of a psychological counseling environment. Little scientific research has been conducted on color design and much of the existing literature is based on observational studies. Using design of experiments and response surface methodology, this paper proposes an optimal color design approach for transforming patients' perception into color elements. Six indices, pleasant-unpleasant, interesting-uninteresting, exciting-boring, relaxing-distressing, safe-fearful, and active-inactive, were used to assess patients' impression. A total of 75 patients participated, including 42 for Experiment 1 and 33 for Experiment 2. 27 representative color samples were designed in Experiment 1, and the color sample (L = 75, a = 0, b = -60) was the most preferred one. In Experiment 2, this color sample was set as the 'central point', and three color attributes were optimized to maximize the patients' satisfaction. The experimental results show that the proposed method can get the optimal solution for color design of a counseling room.
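The response-surface step of such a study, i.e. fitting a quadratic to responses at coded factor levels and solving for the stationary point, can be illustrated on a one-factor slice. The satisfaction scores below are invented; the actual study fit a multi-factor surface over three color attributes:

```python
def quadratic_vertex(xs, ys):
    """Fit the exact quadratic through three (x, y) points and return its
    stationary point, i.e. the factor level that extremizes the response."""
    (x0, y0), (x1, y1), (x2, y2) = zip(xs, ys)
    # second and first divided differences give the quadratic coefficients
    a = ((y2 - y0) / (x2 - x0) - (y1 - y0) / (x1 - x0)) / (x2 - x1)
    b = (y1 - y0) / (x1 - x0) - a * (x0 + x1)
    return -b / (2 * a)

# Mean satisfaction at coded levels -1/0/+1 of one color attribute:
best = quadratic_vertex([-1, 0, 1], [3.1, 4.0, 3.5])
print(round(best, 3))  # 0.143: the optimum sits slightly above the center level
```

With more factors (and replicated runs) the same idea generalizes to least-squares fitting of a full second-order surface and solving the resulting system for the stationary point.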

  19. Improved FTA Methodology and Application to Subsea Pipeline Reliability Design

    PubMed Central

    Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan

    2014-01-01

    An innovative logic tree, Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different thinking approach for risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, the rather subjective nature of FTA node discovery is significantly reduced and the resulting mathematical calculations for quantitative analysis are greatly simplified. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. Resulting improvements are summarized in comparison table form. PMID:24667681
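The quantitative side of any such logic tree (FTA or FET) comes down to combining basic-event probabilities through AND/OR gates. A toy sketch, assuming independent basic events as in elementary FTA, with invented probabilities:

```python
def gate_and(ps):
    """AND gate: all independent basic events must occur."""
    p = 1.0
    for pi in ps:
        p *= pi
    return p

def gate_or(ps):
    """OR gate: at least one independent basic event occurs."""
    q = 1.0
    for pi in ps:
        q *= 1.0 - pi
    return 1.0 - q

# Toy pipeline-failure tree: top = corrosion OR (impact AND coating loss)
top = gate_or([0.01, gate_and([0.05, 0.10])])
print(round(top, 6))  # 0.01495
```

The paper's point is that building the tree itself (node discovery) is the subjective, error-prone part; once the structure is fixed, the probability arithmetic above is mechanical.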

  20. Probabilistic Design Methodology and its Application to the Design of an Umbilical Retract Mechanism

    NASA Technical Reports Server (NTRS)

    Onyebueke, Landon; Ameye, Olusesan

    2002-01-01

    A lot has been learned from past experience with structural and machine element failures. The understanding of failure modes and the application of an appropriate design analysis method can lead to improved structural and machine element safety as well as serviceability. To apply Probabilistic Design Methodology (PDM), all uncertainties are modeled as random variables with selected distribution types, means, and standard deviations. It is quite difficult to achieve a robust design without considering the randomness of the design parameters, which is the case in the use of the Deterministic Design Approach. The US Navy has a fleet of submarine-launched ballistic missiles. An umbilical plug joins the missile to the submarine in order to provide electrical and cooling water connections. As the missile leaves the submarine, an umbilical retract mechanism retracts the umbilical plug clear of the advancing missile after disengagement during launch and restrains the plug in the retracted position. The design of the current retract mechanism in use was based on the deterministic approach, which puts emphasis on factor of safety. A new umbilical retract mechanism that is simpler in design, lighter in weight, more reliable, easier to adjust, and more cost effective has become desirable since this will increase the performance and efficiency of the system. This paper reports on a recent project performed at Tennessee State University for the US Navy that involved the application of PDM to the design of an umbilical retract mechanism. This paper demonstrates how the use of PDM led to the minimization of weight and cost, and the maximization of reliability and performance.

  1. Probabilistic Design Methodology and its Application to the Design of an Umbilical Retract Mechanism

    NASA Astrophysics Data System (ADS)

    Onyebueke, Landon; Ameye, Olusesan

    2002-10-01

A lot has been learned from past experience with structural and machine element failures. The understanding of failure modes and the application of an appropriate design analysis method can lead to improved structural and machine element safety as well as serviceability. To apply Probabilistic Design Methodology (PDM), all uncertainties are modeled as random variables with selected distribution types, means, and standard deviations. It is quite difficult to achieve a robust design without considering the randomness of the design parameters, as is the case with the Deterministic Design Approach. The US Navy has a fleet of submarine-launched ballistic missiles. An umbilical plug joins the missile to the submarine to provide electrical and cooling-water connections. As the missile leaves the submarine, an umbilical retract mechanism retracts the umbilical plug clear of the advancing missile after disengagement during launch and retains the plug in the retracted position. The design of the current retract mechanism was based on the deterministic approach, which puts emphasis on the factor of safety. A new umbilical retract mechanism that is simpler in design, lighter in weight, more reliable, easier to adjust, and more cost-effective has become desirable, since this will increase the performance and efficiency of the system. This paper reports on a recent project performed at Tennessee State University for the US Navy that involved the application of PDM to the design of an umbilical retract mechanism. This paper demonstrates how the use of PDM led to the minimization of weight and cost and the maximization of reliability and performance.

  2. A systematic review of mosquito coils and passive emanators: defining recommendations for spatial repellency testing methodologies.

    PubMed

    Ogoma, Sheila B; Moore, Sarah J; Maia, Marta F

    2012-12-07

Mosquito coils, vaporizer mats and emanators confer protection against mosquito bites through the spatial action of emanated vapor or airborne pyrethroid particles. These products dominate the pest control market; therefore, it is vital to characterize the mosquito responses elicited by the chemical actives and their potential for disease prevention. The aim of this review was to determine the effects of mosquito coils and emanators on mosquito responses that reduce human-vector contact and to propose a scientific consensus on the terminologies and methodologies used for evaluation of product formats that could contain spatial chemical actives, including indoor residual spraying (IRS), long-lasting insecticide-treated nets (LLINs) and insecticide-treated materials (ITMs). The PubMed (National Centre for Biotechnology Information (NCBI), U.S. National Library of Medicine, NIH), MEDLINE, LILAC, Cochrane Library, IBECS and Armed Forces Pest Management Board Literature Retrieval System search engines were used to identify studies of pyrethroid-based coils and emanators with the keywords "Mosquito coils", "Mosquito emanators" and "Spatial repellents". It was concluded that there is a need to improve the statistical reporting of studies and to reach consensus on the methodologies and terminologies used through standardized testing guidelines. Despite differing evaluation methodologies, data showed that coils and emanators induce mortality, deterrence and repellency, as well as reduce the ability of mosquitoes to feed on humans. Available data on outdoor efficacy, dose-response relationships and the effective distance of coils and emanators are inadequate for developing a target product profile (TPP), which will be required for such chemicals before optimized implementation can occur for maximum benefit in disease control.

  3. A systematic review of mosquito coils and passive emanators: defining recommendations for spatial repellency testing methodologies

    PubMed Central

    2012-01-01

Mosquito coils, vaporizer mats and emanators confer protection against mosquito bites through the spatial action of emanated vapor or airborne pyrethroid particles. These products dominate the pest control market; therefore, it is vital to characterize the mosquito responses elicited by the chemical actives and their potential for disease prevention. The aim of this review was to determine the effects of mosquito coils and emanators on mosquito responses that reduce human-vector contact and to propose a scientific consensus on the terminologies and methodologies used for evaluation of product formats that could contain spatial chemical actives, including indoor residual spraying (IRS), long-lasting insecticide-treated nets (LLINs) and insecticide-treated materials (ITMs). The PubMed (National Centre for Biotechnology Information (NCBI), U.S. National Library of Medicine, NIH), MEDLINE, LILAC, Cochrane Library, IBECS and Armed Forces Pest Management Board Literature Retrieval System search engines were used to identify studies of pyrethroid-based coils and emanators with the keywords "Mosquito coils", "Mosquito emanators" and "Spatial repellents". It was concluded that there is a need to improve the statistical reporting of studies and to reach consensus on the methodologies and terminologies used through standardized testing guidelines. Despite differing evaluation methodologies, data showed that coils and emanators induce mortality, deterrence and repellency, as well as reduce the ability of mosquitoes to feed on humans. Available data on outdoor efficacy, dose-response relationships and the effective distance of coils and emanators are inadequate for developing a target product profile (TPP), which will be required for such chemicals before optimized implementation can occur for maximum benefit in disease control. PMID:23216844

  4. Narrowing the focus on the assessment of psychosis-related PTSD: a methodologically orientated systematic review

    PubMed Central

    Fornells-Ambrojo, Miriam; Gracie, Alison; Brewin, Chris R.; Hardy, Amy

    2016-01-01

Background Posttraumatic stress disorder (PTSD) in response to psychosis and associated experiences (psychosis-related PTSD, or PR-PTSD) is the subject of a growing field of research. However, a wide range of PR-PTSD prevalence rates has been reported. This may be due to definitional and methodological inconsistencies in the assessment of PR-PTSD. Objective The focus of the review is two-fold. (1) To identify factors that enhance, or detract from, the robustness of PR-PTSD assessment and (2) to critically evaluate the evidence in relation to these identified criteria, including the impact on PR-PTSD prevalence rates. Method Four quality criteria, whose development was informed by mainstream PTSD research, were selected to evaluate findings on PR-PTSD prevalence. Two criteria related to assessment of psychosis-related stressors (participant identification of worst moments of discrete threat events; psychometrically robust trauma measure) and two focussed on PR-PTSD symptom measurement (adequate time elapsed since trauma; use of validated PTSD interview) in the context of psychosis. Results Twenty-one studies of PR-PTSD, with prevalence rates ranging from 11 to 51%, were evaluated. Fourteen studies (67%) used robust PTSD measures but PR-trauma was not specifically defined or assessed with validated measures. Eleven studies (52%) assessed PTSD before sufficient time had elapsed since the trauma. Due to significant methodological limitations, it was not possible to review PR-PTSD rates and provide a revised estimate of prevalence. Conclusions Methodological limitations are common in existing studies of PR-PTSD prevalence. Specific recommendations for improving assessment of psychosis-related trauma are made to guide the development of this new and emerging field. The review concludes with a proposed conceptualisation of PR-PTSD in the context of current diagnostic systems. The utility of the PR-PTSD term and its theoretical underpinnings are discussed.

  5. Aerospace engineering design by systematic decomposition and multilevel optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.; Giles, G. L.; Barthelemy, J.-F. M.

    1984-01-01

    This paper describes a method for systematic analysis and optimization of large engineering systems, e.g., aircraft, by decomposition of a large task into a set of smaller, self-contained subtasks that can be solved concurrently. The subtasks may be arranged in many hierarchical levels with the assembled system at the top level. Analyses are carried out in each subtask using inputs received from other subtasks, and are followed by optimizations carried out from the bottom up. Each optimization at the lower levels is augmented by analysis of its sensitivity to the inputs received from other subtasks to account for the couplings among the subtasks in a formal manner. The analysis and optimization operations alternate iteratively until they converge to a system design whose performance is maximized with all constraints satisfied. The method, which is still under development, is tentatively validated by test cases in structural applications and an aircraft configuration optimization. It is pointed out that the method is intended to be compatible with the typical engineering organization and the modern technology of distributed computing.
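The alternating analyze-then-optimize loop described above can be caricatured with two quadratic subsystems coupled through one shared variable z: each subproblem is solved in closed form at its own level, and the top level updates z using finite-difference sensitivities of the subsystem optima. All functions and numbers are invented for illustration; the paper's actual formulation is far richer:

```python
def subsystem1(z):
    # closed-form local optimum of (x1 - z)**2 + z**2 with respect to x1
    x1_opt = z
    return x1_opt, z**2

def subsystem2(z):
    # closed-form local optimum of (x2 + z)**2 + (z - 1)**2 with respect to x2
    x2_opt = -z
    return x2_opt, (z - 1.0)**2

def system_objective(z):
    # assembled-system objective: sum of the subsystem optima for a given z
    return subsystem1(z)[1] + subsystem2(z)[1]

def system_opt(z=0.0, step=0.1, iters=200, h=1e-6):
    """Top-level coordination: nudge the shared variable z along the
    finite-difference sensitivity of the summed subsystem optima."""
    for _ in range(iters):
        grad = (system_objective(z + h) - system_objective(z - h)) / (2 * h)
        z -= step * grad
    return z, system_objective(z)

z_star, f_star = system_opt()
```

The loop converges to z = 0.5, the compromise that neither subsystem would pick in isolation, which is exactly the coupling effect the sensitivity feedback is meant to capture.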

  6. Participatory Pattern Workshops: A Methodology for Open Learning Design Inquiry

    ERIC Educational Resources Information Center

    Mor, Yishay; Warburton, Steven; Winters, Niall

    2012-01-01

    In order to promote pedagogically informed use of technology, educators need to develop an active, inquisitive, design-oriented mindset. Design Patterns have been demonstrated as powerful mediators of theory-praxis conversations yet widespread adoption by the practitioner community remains a challenge. Over several years, the authors and their…

  7. Participant Observation, Anthropology Methodology and Design Anthropology Research Inquiry

    ERIC Educational Resources Information Center

    Gunn, Wendy; Løgstrup, Louise B.

    2014-01-01

Within the design studio, and across multiple field sites, the authors compare the involvement of research tools and materials during collaborative processes of designing. Their aim is to trace the temporal dimensions (shifts/movements) of where and when learning takes place along different sites of practice. They do so by combining participant…

  8. A Fundamental Methodology for Designing Management Information Systems for Schools.

    ERIC Educational Resources Information Center

    Visscher, Adrie J.

    Computer-assisted school information systems (SISs) are developed and used worldwide; however, the literature on strategies for their design and development is lacking. This paper presents the features of a fundamental approach to systems design that proved to be successful when developing SCHOLIS, a computer-assisted SIS for Dutch secondary…

  9. Breast cancer statistics and prediction methodology: a systematic review and analysis.

    PubMed

    Dubey, Ashutosh Kumar; Gupta, Umesh; Jain, Sonal

    2015-01-01

Breast cancer is a menacing cancer, primarily affecting women. Research is ongoing into detecting breast cancer at an early stage, when the prospects of a cure are brightest. This study has two main objectives: first, to establish statistics for breast cancer, and second, to identify methodologies, based on previous studies, that can be helpful in early-stage detection of breast cancer. Breast cancer statistics for incidence and mortality in the UK, US, India and Egypt were considered. The findings show that the overall mortality rates of the UK and US have improved because of awareness, improved medical technology and screening, but in India and Egypt the situation is less positive because of a lack of awareness. The methodological findings of this study suggest a combined framework based on data mining and evolutionary algorithms, which provides a strong bridge to improving the classification and detection accuracy of breast cancer data.
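As a sketch of the kind of combined data-mining/evolutionary framework the review points to (not the authors' actual framework), here is a toy genetic algorithm that evolves a feature mask to maximize nearest-centroid classification accuracy on synthetic two-class data. Every name and number is an assumption for illustration:

```python
import random

random.seed(1)

def make_data(n=60):
    """Synthetic two-class data: features 0 and 1 carry signal, 2 and 3 are noise."""
    data = []
    for i in range(n):
        label = i % 2
        base = 1.0 if label else -1.0
        row = [base + random.gauss(0, 0.3),
               base + random.gauss(0, 0.3),
               random.gauss(0, 1.0),
               random.gauss(0, 1.0)]
        data.append((row, label))
    return data

def accuracy(mask, data):
    """Nearest-centroid classification accuracy using only the features in `mask`."""
    feats = [j for j in range(4) if mask[j]]
    if not feats:
        return 0.0
    cents = {}
    for lab in (0, 1):
        rows = [r for r, l in data if l == lab]
        cents[lab] = [sum(r[j] for r in rows) / len(rows) for j in feats]
    correct = 0
    for row, lab in data:
        dists = {l: sum((row[j] - c[k]) ** 2 for k, j in enumerate(feats))
                 for l, c in cents.items()}
        if min(dists, key=dists.get) == lab:
            correct += 1
    return correct / len(data)

def evolve(data, pop_size=20, gens=30):
    """Toy genetic algorithm: evolve a 4-bit feature mask maximizing accuracy."""
    pop = [[random.randint(0, 1) for _ in range(4)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda m: accuracy(m, data), reverse=True)
        survivors = pop[:pop_size // 2]
        children = []
        for _ in range(pop_size - len(survivors)):
            a, b = random.sample(survivors, 2)
            child = [random.choice(pair) for pair in zip(a, b)]  # crossover
            if random.random() < 0.2:                            # mutation
                j = random.randrange(4)
                child[j] = 1 - child[j]
            children.append(child)
        pop = survivors + children
    return max(pop, key=lambda m: accuracy(m, data))

data = make_data()
best = accuracy(evolve(data), data)
```

The evolutionary search reliably discards the noise features, which is the classification-accuracy gain the reviewed framework aims at on real clinical data.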

10. [Variations in the epidemiology of adverse events: methodology of the Harvard Medical Practice Design].

    PubMed

    Lessing, C; Schmitz, A; Schrappe, M

    2012-02-01

The Harvard Medical Practice (HMP) Design is based on a multi-staged retrospective review of inpatient records and is used to assess the frequency of (preventable) adverse events ([P]AE) in large study populations. To date, HMP studies have been conducted in 9 countries. Results vary widely, from 2.9-3.7% of patients with AE in the USA up to 16.6% in Australia. In our analysis we systematically compare the methodology of 9 HMP studies published in English and discuss possible impacts on the reported frequencies. Modifications in HMP studies can be identified at each stage of planning, conducting, and reporting. The 2 studies from the USA with the lowest AE rates are characterised by their liability context and the absence of screening for nosocomial infections, while studies with a high proportion of AE are marked by intense training of the reviewers. Further conclusions are hindered by divergences in the definition of observation periods, by the presentation of frequencies as cumulative prevalences, and by differences in the reporting of study results. Future HMP studies should therefore aim for complete, consistent and transparent reporting. Further research should concentrate on advancing methods for collecting data on (P)AE.

  11. Information System Design Methodology Based on PERT/CPM Networking and Optimization Techniques.

    ERIC Educational Resources Information Center

    Bose, Anindya

    The dissertation attempts to demonstrate that the program evaluation and review technique (PERT)/Critical Path Method (CPM) or some modified version thereof can be developed into an information system design methodology. The methodology utilizes PERT/CPM which isolates the basic functional units of a system and sets them in a dynamic time/cost…

  12. Application of optimal design methodologies in clinical pharmacology experiments.

    PubMed

    Ogungbenro, Kayode; Dokoumetzidis, Aristides; Aarons, Leon

    2009-01-01

Pharmacokinetics and pharmacodynamics data are often analysed by mixed-effects modelling techniques (also known as population analysis), which have become a standard tool in the pharmaceutical industry for drug development. The last 10 years have witnessed considerable interest in the application of experimental design theory to population pharmacokinetic and pharmacodynamic experiments. The design of population pharmacokinetic experiments involves the selection and careful balancing of a number of design factors. Optimal design theory uses prior information about the model and parameter estimates to optimize a function of the Fisher information matrix and so obtain the best combination of the design factors. This paper reviews the different approaches that have been described in the literature for the optimal design of population pharmacokinetic and pharmacodynamic experiments. It describes the options that are available, highlights some of the issues that could be of concern in practical application, and discusses areas of application of optimal design theory in clinical pharmacology experiments. As awareness of the benefits of this approach increases, more people are expected to embrace it, ultimately leading to more efficient population pharmacokinetic and pharmacodynamic experiments that can also help to reduce both the cost and duration of drug development.
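The core computation, optimizing a function of the Fisher information matrix over candidate designs, can be illustrated on a toy single-subject model y = A*exp(-k*t) with assumed prior estimates: a D-optimal pair of sampling times is the pair maximizing det(J^T J), where J holds the model sensitivities. This is a deliberately simplified stand-in for the population-level designs the review covers:

```python
import math
from itertools import combinations

A, k = 100.0, 0.2   # assumed prior parameter estimates for y = A*exp(-k*t)

def fim_det(times):
    """Determinant of the 2x2 Fisher information matrix J^T J, where each
    row of J holds the sensitivities (dy/dA, dy/dk) at one sampling time."""
    s11 = s12 = s22 = 0.0
    for t in times:
        e = math.exp(-k * t)
        dA, dk = e, -A * t * e
        s11 += dA * dA
        s12 += dA * dk
        s22 += dk * dk
    return s11 * s22 - s12 * s12

grid = [0.5 * i for i in range(1, 41)]              # candidate times, 0.5..20 h
best_times = max(combinations(grid, 2), key=fim_det)  # D-optimal pair
```

For this model the search lands on one sample as early as the grid allows and one near t = t_min + 1/k, matching the analytic D-optimal solution for a two-parameter exponential.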

  13. A Systematic Methodology for Constructing High-Order Energy-Stable WENO Schemes

    NASA Technical Reports Server (NTRS)

    Yamaleev, Nail K.; Carpenter, Mark H.

    2008-01-01

A third-order Energy Stable Weighted Essentially Non-Oscillatory (ESWENO) finite difference scheme developed by Yamaleev and Carpenter (AIAA 2008-2876, 2008) was proven to be stable in the energy norm for both continuous and discontinuous solutions of systems of linear hyperbolic equations. Herein, a systematic approach is presented that enables "energy stable" modifications for existing WENO schemes of any order. The technique is demonstrated by developing a one-parameter family of fifth-order upwind-biased ESWENO schemes; ESWENO schemes up to eighth order are presented in the appendix. New weight functions are also developed that provide (1) formal consistency, (2) much faster convergence for smooth solutions with an arbitrary number of vanishing derivatives, and (3) improved resolution near strong discontinuities.
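For orientation, the classical fifth-order WENO reconstruction whose weight functions the ESWENO work modifies can be sketched as follows (Jiang-Shu smoothness indicators and ideal weights; the paper's new weight functions are not reproduced here):

```python
def weno5(fm2, fm1, f0, fp1, fp2, eps=1e-6):
    """Classical Jiang-Shu fifth-order WENO reconstruction of f at x_{i+1/2}
    from values f_{i-2}..f_{i+2}. The ESWENO schemes in the paper replace
    the weight functions below with energy-stable alternatives."""
    # candidate third-order reconstructions on the three sub-stencils
    q0 = (2*fm2 - 7*fm1 + 11*f0) / 6.0
    q1 = ( -fm1 + 5*f0  +  2*fp1) / 6.0
    q2 = (2*f0  + 5*fp1 -   fp2) / 6.0
    # smoothness indicators
    b0 = 13/12*(fm2 - 2*fm1 + f0)**2 + 1/4*(fm2 - 4*fm1 + 3*f0)**2
    b1 = 13/12*(fm1 - 2*f0 + fp1)**2 + 1/4*(fm1 - fp1)**2
    b2 = 13/12*(f0 - 2*fp1 + fp2)**2 + 1/4*(3*f0 - 4*fp1 + fp2)**2
    # nonlinear weights built from the ideal weights d = (0.1, 0.6, 0.3)
    a0, a1, a2 = 0.1/(eps + b0)**2, 0.6/(eps + b1)**2, 0.3/(eps + b2)**2
    s = a0 + a1 + a2
    return (a0*q0 + a1*q1 + a2*q2) / s
```

On smooth data the smoothness indicators agree, the nonlinear weights collapse to the ideal ones, and the full fifth-order stencil is recovered; near a discontinuity the weights of the crossing stencils vanish, suppressing oscillations.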

  14. A Systematic Methodology for Constructing High-Order Energy Stable WENO Schemes

    NASA Technical Reports Server (NTRS)

    Yamaleev, Nail K.; Carpenter, Mark H.

    2009-01-01

A third-order Energy Stable Weighted Essentially Non-Oscillatory (ESWENO) finite difference scheme developed by Yamaleev and Carpenter [1] was proven to be stable in the energy norm for both continuous and discontinuous solutions of systems of linear hyperbolic equations. Herein, a systematic approach is presented that enables "energy stable" modifications for existing WENO schemes of any order. The technique is demonstrated by developing a one-parameter family of fifth-order upwind-biased ESWENO schemes; ESWENO schemes up to eighth order are presented in the appendix. New weight functions are also developed that provide (1) formal consistency, (2) much faster convergence for smooth solutions with an arbitrary number of vanishing derivatives, and (3) improved resolution near strong discontinuities.

  15. Design methodology and application of high speed gate arrays

    NASA Astrophysics Data System (ADS)

    Decker, R.

    A system to provide real-time signal averaging of waveforms from a 50 MHz analog to digital converter has been fabricated to operate over a wide temperature range. This system evolved from conception, through an initial simulated design for emitter coupled logic (ECL), to a pair of CMOS gate array designs. Changing the implementation technology to CMOS gate arrays resulted in savings in cost, size, weight, and power. Design rules employed to obtain working silicon on the first cycle, at double state-of-the-art gate array speeds, are discussed. Also discussed are built-in, run-time, self-test features.

  16. Modern design methodology and problems in training aircraft engineers

    NASA Technical Reports Server (NTRS)

    Liseitsev, N. K.

    1989-01-01

    A brief report on the problem of modern aircraft specialist education is presented that is devoted to the content and methods of teaching a course in General Aircraft Design in the Moscow Aviation Institute.

  17. A Systematic Software, Firmware, and Hardware Codesign Methodology for Digital Signal Processing

    DTIC Science & Technology

    2014-03-01

Interface / Communications: Basic 53, Streaming Data 67, Shared Bus 54, Message Passing 68, Token Ring 55, Remote-Procedure Call 69 ... technologies for the feasibility check. These design concepts and implementation technologies are in the form of models, and they can also be used for ... hardware partitioning involves a diversity of applications, design styles and implementation technologies; ultimately it depends on human expert

  18. 77 FR 66471 - Methodology for Designation of Frontier and Remote Areas

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-05

    ... HUMAN SERVICES Health Resources and Services Administration Methodology for Designation of Frontier and Remote Areas AGENCY: Health Resources and Services Administration, HHS. ACTION: Request for public... of Rural Health Policy (ORHP) in the Health Resources and Services Administration (HRSA); and...

  19. New Methods in Design Education: The Systemic Methodology and the Use of Sketch in the Conceptual Design Stage

    ERIC Educational Resources Information Center

    Westermeyer, Juan Carlos Briede; Ortuno, Bernabe Hernandis

    2011-01-01

This study describes the application of a new concurrent product design methodology in the context of industrial design education. The sketch has often been used as a tool of creative expression, especially in the conceptual design stage, in an intuitive way and somewhat removed from the context of the real needs that the…

  20. Methodology Series Module 8: Designing Questionnaires and Clinical Record Forms

    PubMed Central

    Setia, Maninder Singh

    2017-01-01

As researchers, we often collect data on a clinical record form or a questionnaire. This is an important part of study design: if the questionnaire is not well designed, the data collected will not be useful. In this section of the module, we discuss some practical aspects of designing a questionnaire. It is useful to list all the variables to be assessed in the study before preparing the questionnaire, and the researcher should review all existing questionnaires. It may be efficient to use an existing standardized questionnaire or scale. Many of these scales are freely available and may be used with an appropriate reference; however, some are under copyright protection and permission may be required to use them. While designing their own questionnaire, researchers may use open- or closed-ended questions. It is important to design the responses appropriately, as the format of the responses will influence the analysis. Sometimes one can collect the same information in multiple ways - as a continuous or a categorical response. Besides these, the researcher can also use visual analog scales or a Likert scale in the questionnaire. Some practical take-home points are: (1) use specific language while framing the questions; (2) write detailed instructions in the questionnaire; (3) use mutually exclusive response categories; (4) use skip patterns; (5) avoid double-barreled questions; and (6) anchor the time period if required.

  1. Structural Design Methodology Based on Concepts of Uncertainty

    NASA Technical Reports Server (NTRS)

    Lin, K. Y.; Du, Jiaji; Rusk, David

    2000-01-01

In this report, an approach to damage-tolerant aircraft structural design is proposed based on the concept of an equivalent "Level of Safety" that incorporates past service experience in the design of new structures. The discrete "Level of Safety" for a single inspection event is defined as the complement of the probability that a flaw larger than the critical flaw size for residual strength of the structure exists and that the flaw will not be detected. The cumulative "Level of Safety" for the entire structure is the product of the discrete "Level of Safety" values for each flaw of each damage type present at each location in the structure. Based on this definition, a design procedure was identified and demonstrated on a composite sandwich panel for various damage types, with results showing the sensitivity of the structural sizing parameters to the relative safety of the design. The "Level of Safety" approach has broad potential application to damage-tolerant aircraft structural design under uncertainty.
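The two definitions above translate directly into code; the probabilities below are illustrative placeholders, not values from the report:

```python
def discrete_level_of_safety(p_flaw_exceeds_crit, p_missed):
    """Complement of the probability that a flaw larger than the critical
    size exists AND goes undetected at a single inspection event."""
    return 1.0 - p_flaw_exceeds_crit * p_missed

def cumulative_level_of_safety(events):
    """Product of the discrete values over every flaw of every damage
    type at every location in the structure."""
    los = 1.0
    for p_exceed, p_missed in events:
        los *= discrete_level_of_safety(p_exceed, p_missed)
    return los

# three hypothetical (P[flaw > critical], P[not detected]) pairs
los = cumulative_level_of_safety([(1e-3, 0.2), (5e-4, 0.1), (2e-3, 0.05)])
```

Because the cumulative value is a product, every additional damage location can only lower it, which is what makes the metric useful for trading inspection intervals against structural sizing.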

  2. Influence of Glenosphere Design on Outcomes and Complications of Reverse Arthroplasty: A Systematic Review

    PubMed Central

    Lawrence, Cassandra; Williams, Gerald R.

    2016-01-01

    Background Different implant designs are utilized in reverse shoulder arthroplasty. The purpose of this systematic review was to evaluate the results of reverse shoulder arthroplasty using a traditional (Grammont) prosthesis and a lateralized prosthesis for the treatment of cuff tear arthropathy and massive irreparable rotator cuff tears. Methods A systematic review of the literature was performed via a search of two electronic databases. Two reviewers evaluated the quality of methodology and retrieved data from each included study. In cases where the outcomes data were similar between studies, the data were pooled using frequency-weighted mean values to generate summary outcomes. Results Thirteen studies met the inclusion and exclusion criteria. Demographics were similar between treatment groups. The frequency-weighted mean active external rotation was 24° in the traditional group and 46° in the lateralized group (p = 0.0001). Scapular notching was noted in 44.9% of patients in the traditional group compared to 5.4% of patients in the lateralized group (p = 0.0001). The rate of clinically significant glenoid loosening was 1.8% in the traditional group and 8.8% in the lateralized group (p = 0.003). Conclusions Both the traditional Grammont and the lateralized offset reverse arthroplasty designs can improve pain and function in patients with diagnoses of cuff tear arthropathy and irreparable rotator cuff tear. While a lateralized design can result in increased active external rotation and decreased rates of scapular notching, there may be a higher rate of glenoid baseplate loosening. PMID:27583112
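The pooling step the review describes, combining per-study outcomes into frequency-weighted mean values, is simply a sample-size-weighted average. The numbers below are hypothetical, not the review's data:

```python
def frequency_weighted_mean(values, ns):
    """Pool per-study outcome means, weighting each study by its sample size."""
    total_n = sum(ns)
    return sum(v * n for v, n in zip(values, ns)) / total_n

# hypothetical active-external-rotation means (degrees) from three studies
pooled = frequency_weighted_mean([20.0, 25.0, 30.0], [50, 30, 20])
```

Weighting by n keeps a small outlier study from dominating the summary outcome, which is why the review pools this way rather than averaging study means directly.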

  3. New methodology for shaft design based on life expectancy

    NASA Technical Reports Server (NTRS)

    Loewenthal, S. H.

    1986-01-01

The design of power transmission shafting for reliability has not historically received a great deal of attention. However, weight-sensitive aerospace and vehicle applications, and those where the penalties of shaft failure are great, require greater confidence in shaft design than earlier methods provided. This report summarizes a fatigue-strength-based design method for sizing shafts under variable-amplitude loading histories for limited or nonlimited service life. Moreover, application factors such as press-fitted collars, shaft size, residual stresses from shot peening or plating, and corrosive environments can be readily accommodated within the framework of the analysis. Examples are given which illustrate the use of the method, pointing out the large life penalties due to occasional cyclic overloads.

  4. Optimum design criteria for a synchronous reluctance motor with concentrated winding using response surface methodology

    NASA Astrophysics Data System (ADS)

    Lee, Jung-Ho; Park, Seong-June; Jeon, Su-Jin

    2006-04-01

This paper presents an optimization procedure using response surface methodology (RSM) to determine design parameters for reducing torque ripple. The RSM combines the design of experiments with the finite element method and is well suited to building an analytical model for a complex problem involving many interacting design variables.
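A response surface in the RSM sense is typically a second-order polynomial fitted by least squares to simulated or measured responses. The sketch below fits one to a made-up torque-ripple function of two design variables; all names and numbers are assumptions, and the real workflow would sample the response from finite element runs:

```python
import numpy as np

def ripple(x1, x2):
    """Stand-in for a torque-ripple response from FEM runs (invented)."""
    return 5.0 + (x1 - 1.0)**2 + 2.0*(x2 + 0.5)**2 + 0.3*x1*x2

# "experiments" on a 5x5 grid of the two design variables
pts = [(a, b) for a in np.linspace(-2, 2, 5) for b in np.linspace(-2, 2, 5)]
X = np.array([[1, x1, x2, x1**2, x2**2, x1*x2] for x1, x2 in pts])
y = np.array([ripple(x1, x2) for x1, x2 in pts])

# least-squares fit of the second-order response surface
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
```

Once fitted, the polynomial is cheap to minimize or to interrogate for interaction effects, which is what makes RSM attractive when each true evaluation is an expensive finite element solve.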

  5. Towards uniform accelerometry analysis: a standardization methodology to minimize measurement bias due to systematic accelerometer wear-time variation.

    PubMed

    Katapally, Tarun R; Muhajarine, Nazeem

    2014-05-01

Accelerometers are predominantly used to objectively measure the entire range of activity intensities - sedentary behaviour (SED), light physical activity (LPA) and moderate to vigorous physical activity (MVPA). However, studies consistently report results without accounting for systematic accelerometer wear-time variation (within and between participants), jeopardizing the validity of these results. This study describes the development of a standardization methodology to understand and minimize measurement bias due to wear-time variation. Accelerometry is generally conducted over seven consecutive days, with participants' data commonly considered 'valid' only if wear-time is at least 10 hours/day. However, even within 'valid' data there can be systematic wear-time variation. To explore this variation, accelerometer data from the Smart Cities, Healthy Kids study (www.smartcitieshealthykids.com) were analyzed descriptively and with repeated-measures multivariate analysis of variance (MANOVA). Subsequently, a standardization method was developed in which case-specific observed wear-time is controlled to an analyst-specified time period; case-specific accelerometer data are then interpolated to this controlled wear-time to produce standardized variables. To understand discrepancies owing to wear-time variation, all analyses were conducted pre- and post-standardization. Descriptive analyses revealed systematic wear-time variation, both between and within participants. Pre- and post-standardized descriptive analyses of SED, LPA and MVPA revealed a persistent and often significant trend of wear-time's influence on activity. SED was consistently higher on weekdays before standardization; however, this trend was reversed post-standardization. Even though MVPA was significantly higher on weekdays both pre- and post-standardization, the magnitude of this difference decreased post-standardization. Multivariable analyses with standardized SED, LPA and MVPA as outcome
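The standardization step described, controlling observed wear-time to an analyst-specified period and interpolating activity to it, might look like the following if the interpolation is linear (an assumption; the study's exact scheme may differ):

```python
def standardize(minutes_active, observed_wear_hours, target_wear_hours=13.0):
    """Interpolate a participant-day's activity minutes to a controlled
    wear-time so days with different wear-times become comparable.
    Linear scaling is an illustrative assumption."""
    return minutes_active * (target_wear_hours / observed_wear_hours)

# 45 min of MVPA over 10 h of wear vs. 50 min over 15 h of wear
a = standardize(45.0, 10.0)
b = standardize(50.0, 15.0)
```

Unstandardized, the second day looks more active (50 vs. 45 min); per unit of controlled wear-time the ordering flips, which is exactly the wear-time bias the methodology is built to remove.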

  6. Structural design methodologies for ceramic-based material systems

    NASA Technical Reports Server (NTRS)

    Duffy, Stephen F.; Chulya, Abhisak; Gyekenyesi, John P.

    1991-01-01

    One of the primary pacing items for realizing the full potential of ceramic-based structural components is the development of new design methods and protocols. The focus here is on low temperature, fast-fracture analysis of monolithic, whisker-toughened, laminated, and woven ceramic composites. A number of design models and criteria are highlighted. Public domain computer algorithms, which aid engineers in predicting the fast-fracture reliability of structural components, are mentioned. Emphasis is not placed on evaluating the models, but instead is focused on the issues relevant to the current state of the art.

  7. Designing and Integrating Purposeful Learning in Game Play: A Systematic Review

    ERIC Educational Resources Information Center

    Ke, Fengfeng

    2016-01-01

    Via a systematic review of the literature on learning games, this article presents a systematic discussion on the design of intrinsic integration of domain-specific learning in game mechanics and game world design. A total of 69 articles ultimately met the inclusion criteria and were coded for the literature synthesis. Exemplary learning games…

  8. Are QALYs based on time trade-off comparable?--A systematic review of TTO methodologies.

    PubMed

    Arnesen, Trude; Trommald, Mari

    2005-01-01

A wide range of methods is used to elicit quality-of-life weights for different health states to generate 'quality-adjusted life years' (QALYs). The comparability between different types of health outcomes at a numerical level is the main advantage of using a 'common currency for health' such as the QALY. It has been warned that results of different methods and perspectives should not be directly compared in QALY league tables. But do we know that QALYs are comparable if they are based on the same method and perspective? The time trade-off (TTO) consists of a hypothetical trade-off between living shorter and living healthier. We performed a literature review of the TTO methodology used to elicit quality-of-life weights for own, current health. Fifty-six journal articles, with quality-of-life weights assigned to 102 diagnostic groups, were included. We found extensive differences in how the TTO question was asked. The time frame varied from 1 month to 30 years and was not reported for one-fourth of the weights. The samples in which the quality-of-life weights were elicited were generally small, with a median size of 53 respondents. Comprehensive inclusion criteria were given for half the diagnostic groups. Co-morbidity was described in less than one-tenth of the groups of respondents. For two-thirds of the quality-of-life weights, there was no discussion of the influence of other factors, such as age, sex, employment and children. The different methodological approaches did not influence the TTO weights in a predictable or clear pattern. Whether or not it is possible to standardise the TTO method and the sampling procedure, and whether or not the TTO will then give valid quality-of-life weights, remains an open question. This review of TTO weights elicited on respondents' own behalf shows that limiting cost-utility analysis to quality-of-life weights from one method and one perspective is not enough to ensure that QALYs are comparable.
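The TTO arithmetic itself is simple: if a respondent is indifferent between a given time in the health state and a shorter time in full health, the implied quality-of-life weight is the ratio of the two durations. The numbers below are a made-up example:

```python
def tto_weight(years_in_state, years_traded):
    """Time trade-off: indifference between `years_in_state` in the health
    state and (`years_in_state` - `years_traded`) in full health implies
    a QALY weight equal to the ratio of the two durations."""
    return (years_in_state - years_traded) / years_in_state

# respondent indifferent between 10 years in the state and 8 in full health
w = tto_weight(10.0, 2.0)
qalys = w * 10.0   # QALYs accrued over those 10 years
```

The review's point follows directly from this formula: the elicited weight depends on the time frame offered (1 month vs. 30 years changes what respondents will trade), so weights from differently framed TTO questions are not automatically comparable.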

  9. Design and Methodology of the Korean Early Psychosis Cohort Study

    PubMed Central

    Kim, Sung-Wan; Lee, Bong Ju; Kim, Jung Jin; Yu, Je-Chun; Lee, Kyu Young; Won, Seung-Hee; Lee, Seung-Hwan; Kim, Seung-Hyun; Kang, Shi Hyun

    2017-01-01

The present study details the rationale and methodology of the Korean Early Psychosis Cohort Study (KEPS), which is a clinical cohort investigation of first-episode psychosis patients from a Korean population. The KEPS is a prospective naturalistic observational cohort study that follows the participants for at least 2 years. This study includes patients between 18 and 45 years of age who fulfill the criteria for one of the schizophrenia spectrum and other psychotic disorders according to the diagnostic criteria of DSM-5. Early psychosis is defined as first-episode patients who received antipsychotic treatment for fewer than 4 consecutive weeks after the onset of illness, or stabilized patients in the early stages of the disorder whose duration of illness was less than 2 years from the initiation of antipsychotic treatment. The primary outcome measures are treatment response, remission, recovery, and relapse. Additionally, several laboratory tests are conducted, and a variety of objective and subjective psychiatric measures assessing early life trauma, lifestyle pattern, and social and cognitive functioning are administered. This long-term prospective cohort study may contribute to the development of early intervention strategies and the improvement of long-term outcomes in patients with schizophrenia. PMID:28096881

  10. A general methodology and applications for conduction-like flow-channel design.

    SciTech Connect

    Cummings, Eric B.; Fiechtner, Gregory J.

    2004-02-01

A novel design methodology is developed for creating conduction devices in which fields are piecewise uniform. This methodology allows the normally analytically intractable problem of Lagrangian transport to be solved using algebraic and trigonometric equations. Low-dispersion turns, manifolds, and expansions are developed. In this methodology, regions of piecewise-constant specific permeability (permeability per unit width) border each other with straight, generally tilted interfaces. The fields within each region are made uniform by satisfying a simple compatibility relation between the tilt angle and the ratio of specific permeabilities of adjacent regions. This methodology holds particular promise for the rational design of quasi-planar devices, in which the specific permeability is proportional to the depth of the channel. For such devices, the methodology can be implemented by connecting channel facets having two or more depths, fabricated, e.g., using a simple two-etch process.
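The abstract does not state the compatibility relation explicitly. A plausible sketch, assuming the standard refraction condition for conduction fields at an interface (continuous tangential field, continuous normal flux), under which the tangents of the field angles scale with the specific permeabilities:

```python
import math

def refracted_angle(kappa1: float, kappa2: float, angle1_deg: float) -> float:
    """Field angle (measured from the interface normal) in region 2, given
    the angle in region 1 and the two specific permeabilities.

    Assumes the tangent refraction law tan(a1)/tan(a2) = kappa1/kappa2,
    which follows from continuity of the tangential field and of the
    normal flux across the interface; this is an inference, not a
    relation quoted from the paper.
    """
    t = (kappa2 / kappa1) * math.tan(math.radians(angle1_deg))
    return math.degrees(math.atan(t))

# Equal permeabilities leave the field direction unchanged:
print(refracted_angle(1.0, 1.0, 30.0))  # 30.0
```

Choosing facet tilt angles to satisfy this relation is what keeps the field uniform within each region, making Lagrangian transport algebraically tractable.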

  11. Cost of arthritis: a systematic review of methodologies used for direct costs.

    PubMed

    Lo, T K T; Parkinson, Lynne; Cunich, Michelle; Byles, Julie

    2016-01-01

A substantial amount of healthcare use and cost is attributable to arthritis, a very common chronic disease. This paper presents the results of a systematic review of arthritis cost studies published from 2008 to 2013. The MEDLINE, Embase and EconLit databases were searched, as well as governmental and nongovernmental organization websites. Seventy-one reports met the inclusion/exclusion criteria, and 24 studies were included in the review. Among these studies, common methods included the use of individual-level data, a bottom-up costing approach, use of both an arthritis group and a control group to enable incremental cost computation of the disease, and use of regression methods such as generalized linear models and ordinary least squares regression to control for confounding variables. Estimates of the healthcare cost of arthritis varied considerably across the studies depending on the study methods, the form of arthritis and the population studied. In the USA, for example, the estimated healthcare cost of arthritis ranged from $1862 to $14,021 per person, per year. The reviewed study methods have strengths, weaknesses and potential improvements in relation to estimating the cost of disease, which are outlined in this paper. Caution must be exercised when these methods are applied to cost estimation and monitoring of the economic burden of arthritis.
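The incremental-cost approach the review highlights, comparing an arthritis group with a control group while adjusting for confounders, can be sketched with ordinary least squares on synthetic data. All numbers below are illustrative, not estimates from any reviewed study:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
arthritis = rng.integers(0, 2, n)            # 1 = arthritis group, 0 = control
age = rng.normal(60, 10, n)                  # confounder
# Synthetic annual cost: baseline + disease effect + age effect + noise
cost = 2000 + 3000 * arthritis + 40 * age + rng.normal(0, 500, n)

# Design matrix: intercept, group indicator, confounder
X = np.column_stack([np.ones(n), arthritis, age])
beta, *_ = np.linalg.lstsq(X, cost, rcond=None)

# beta[1] is the incremental cost of arthritis, adjusted for age;
# it should recover roughly the $3000 built into the synthetic data.
print(f"incremental cost: ${beta[1]:,.0f} per person per year")
```

Real studies add many more confounders and typically use generalized linear models with a log link to handle the skewness of cost data; the OLS version above only illustrates the incremental-cost logic.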

  12. Operationalising resilience in longitudinal studies: a systematic review of methodological approaches

    PubMed Central

    Cosco, T D; Kaushal, A; Hardy, R; Richards, M; Kuh, D; Stafford, M

    2017-01-01

    Over the life course, we are invariably faced with some form of adversity. The process of positively adapting to adverse events is known as ‘resilience’. Despite the acknowledgement of 2 common components of resilience, that is, adversity and positive adaptation, no consensus operational definition has been agreed. Resilience operationalisations have been reviewed in a cross-sectional context; however, a review of longitudinal methods of operationalising resilience has not been conducted. The present study conducts a systematic review across Scopus and Web of Science capturing studies of ageing that posited operational definitions of resilience in longitudinal studies of ageing. Thirty-six studies met inclusion criteria. Non-acute events, for example, cancer, were the most common form of adversity identified and psychological components, for example, the absence of depression, the most common forms of positive adaptation. Of the included studies, 4 used psychometrically driven methods, that is, repeated administration of established resilience metrics, 9 used definition-driven methods, that is, a priori establishment of resilience components and criteria, and 23 used data-driven methods, that is, techniques that identify resilient individuals using latent variable models. Acknowledging the strengths and limitations of each operationalisation is integral to the appropriate application of these methods to life course and longitudinal resilience research. PMID:27502781
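A definition-driven operationalisation of the kind the review counts can be sketched as an a priori rule combining the two components, adversity and positive adaptation. The criteria and cutoff below are illustrative assumptions, not taken from any reviewed study:

```python
def classify_resilience(adversity: bool, depression_score: float,
                        cutoff: float = 10.0) -> str:
    """Definition-driven operationalisation: resilience requires both
    exposure to adversity and positive adaptation (here, a depression
    score below an illustrative clinical cutoff)."""
    if not adversity:
        return "not exposed"          # resilience is undefined without adversity
    return "resilient" if depression_score < cutoff else "not resilient"

# A participant exposed to a non-acute event (e.g. cancer) who shows
# no clinically significant depression would be classed as resilient.
print(classify_resilience(adversity=True, depression_score=4.0))
```

Psychometrically driven and data-driven approaches differ in that the criteria are not fixed a priori: the former repeat an established resilience scale, the latter let latent variable models identify resilient trajectories from the data.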

  13. Operationalising resilience in longitudinal studies: a systematic review of methodological approaches.

    PubMed

    Cosco, T D; Kaushal, A; Hardy, R; Richards, M; Kuh, D; Stafford, M

    2017-01-01

    Over the life course, we are invariably faced with some form of adversity. The process of positively adapting to adverse events is known as 'resilience'. Despite the acknowledgement of 2 common components of resilience, that is, adversity and positive adaptation, no consensus operational definition has been agreed. Resilience operationalisations have been reviewed in a cross-sectional context; however, a review of longitudinal methods of operationalising resilience has not been conducted. The present study conducts a systematic review across Scopus and Web of Science capturing studies of ageing that posited operational definitions of resilience in longitudinal studies of ageing. Thirty-six studies met inclusion criteria. Non-acute events, for example, cancer, were the most common form of adversity identified and psychological components, for example, the absence of depression, the most common forms of positive adaptation. Of the included studies, 4 used psychometrically driven methods, that is, repeated administration of established resilience metrics, 9 used definition-driven methods, that is, a priori establishment of resilience components and criteria, and 23 used data-driven methods, that is, techniques that identify resilient individuals using latent variable models. Acknowledging the strengths and limitations of each operationalisation is integral to the appropriate application of these methods to life course and longitudinal resilience research.

  14. Design Based Research Methodology for Teaching with Technology in English

    ERIC Educational Resources Information Center

    Jetnikoff, Anita

    2015-01-01

    Design based research (DBR) is an appropriate method for small scale educational research projects involving collaboration between teachers, students and researchers. It is particularly useful in collaborative projects where an intervention is implemented and evaluated in a grounded context. The intervention can be technological, or a new program…

  15. Serration Design Methodology for Wind Turbine Noise Reduction

    NASA Astrophysics Data System (ADS)

    Mathew, J.; Singh, A.; Madsen, J.; Arce León, C.

    2016-09-01

Trailing edge serrations are today an established method to reduce the aeroacoustic noise from wind turbine blades. In this paper, a brief introduction to the aerodynamic and acoustic design procedure used at LM Wind Power is given. Early field tests on serrations, retrofitted to the turbine blades, gave a preliminary indication of their noise reduction potential. However, a multitude of challenges stand between any proof of concept and a viable commercial product. LM undertook a methodical test and validation procedure to understand the impact of design parameters on serration performance, and to quantify the uncertainties associated with the proposed designs. Aerodynamic and acoustic validation tests were carried out in a number of wind tunnel facilities. Models were written to predict the aerodynamic, acoustic and structural performance of the serrations. LM serration designs have evolved over time to address constraints imposed by aerodynamic performance, structural reliability, manufacturing and installation. The latest LM serration offering was tested in the field on three different wind turbines. A consistent noise reduction in excess of 1.5 dB was achieved in the field for all three turbines.

  16. Kids in the city study: research design and methodology

    PubMed Central

    2011-01-01

Background Physical activity is essential for optimal physical and psychological health but substantial declines in children's activity levels have occurred in New Zealand and internationally. Children's independent mobility (i.e., outdoor play and traveling to destinations unsupervised), an integral component of physical activity in childhood, has also declined radically in recent decades. Safety-conscious parenting practices, car reliance and auto-centric urban design have converged to produce children living increasingly sedentary lives. This research investigates how urban neighborhood environments can support and enable, or restrict, children's independent mobility, thereby influencing physical activity accumulation and participation in daily life. Methods/Design The study is located in six Auckland, New Zealand neighborhoods, diverse in terms of urban design attributes, particularly residential density. Participants comprise 160 children aged 9-11 years and their parents/caregivers. Objective measures (global positioning systems, accelerometers, geographical information systems, observational audits) assessed children's independent mobility and physical activity, neighborhood infrastructure, and streetscape attributes. Parent and child neighborhood perceptions and experiences were assessed using qualitative research methods. Discussion This study is one of the first internationally to examine the association of specific urban design attributes with child independent mobility. Using appropriate, best-practice objective measures, this study provides robust epidemiological information regarding the relationships between the built environment and health outcomes for this population. PMID:21781341

  17. Situated Research Design and Methodological Choices in Formative Program Evaluation

    ERIC Educational Resources Information Center

    Supovitz, Jonathan

    2013-01-01

    Design-based implementation research offers the opportunity to rethink the relationships between intervention, research, and situation to better attune research and evaluation to the program development process. Using a heuristic called the intervention development curve, I describe the rough trajectory that programs typically follow as they…

  18. The methodological quality of economic evaluation studies in obstetrics and gynecology: a systematic review.

    PubMed

    Vijgen, Sylvia M C; Opmeer, Brent C; Mol, Ben Willem J

    2013-04-01

    We evaluated the methodological quality of economic evaluation studies in the field of obstetrics and gynecology published in the last decade. A MEDLINE search was performed to find economic evaluation studies in obstetrics and gynecology from the years 1997 through 2009. We included full economic evaluation studies concerning tests or interventions in the field of obstetrics or gynecology. Each included study was evaluated by two reviewers using a quality checklist that was based on international guidelines for medical economic evaluation studies and a checklist used in a previous review. The mean number of quality criteria adhered to was 23 of 30 items, whereas five articles (3%) met all 30 criteria. Compliance was low for the description of the perspective (40%), the completeness of costs looking at the perspective (48%) or time horizon (48%), and reporting of quantities of resources (47%). Furthermore, if no discounting was applied, an explanation was infrequently given (14%). A comparison of study quality to that reported by Smith and Blackmore showed a considerable improvement in the following criteria: presentation perspective (from 19 to 40%), statement of primary outcome measure (from 72 to 81%), completeness costs looking at the time horizon (from 14 to 48%), the presentation of discount rates (from 10 to 54%), details of sensitivity analyses (from 21 to 61%), reporting incremental results (from 17 to 70%), and reporting a summary measure (from 57 to 74%). The quality of economic studies in obstetrics and gynecology has considerably improved in the last decade, but room for further improvement is present.

  19. A Practical Methodology for the Systematic Development of Multiple Choice Tests.

    ERIC Educational Resources Information Center

    Blumberg, Phyllis; Felner, Joel

    Using Guttman's facet design analysis, four parallel forms of a multiple-choice test were developed. A mapping sentence, logically representing the universe of content of a basic cardiology course, specified the facets of the course and the semantic structural units linking them. The facets were: cognitive processes, disease priority, specific…

  20. Dynamic testing of learning potential in adults with cognitive impairments: A systematic review of methodology and predictive value.

    PubMed

    Boosman, Hileen; Bovend'Eerdt, Thamar J H; Visser-Meily, Johanna M A; Nijboer, Tanja C W; van Heugten, Caroline M

    2016-09-01

Dynamic testing includes procedures that examine the effects of brief training on test performance, where pre- to post-training change reflects patients' learning potential. The objective of this systematic review was to provide clinicians and researchers with insight into the concept and methodology of dynamic testing and to explore its predictive validity in adult patients with cognitive impairments. The following electronic databases were searched: PubMed, PsycINFO, and Embase/Medline. Of 1141 potentially relevant articles, 24 studies met the inclusion criteria. The mean methodological quality score was 4.6 of 8. Eleven different dynamic tests were used. The majority of studies used dynamic versions of the Wisconsin Card Sorting Test. The training mostly consisted of a combination of performance feedback, reinforcement, expanded instruction, or strategy training. Learning potential was quantified using numerical (post-test score, difference score, gain score, regression residuals) and categorical (groups) indices. In five of six longitudinal studies, learning potential significantly predicted rehabilitation outcome. Three of four studies supported the added value of dynamic testing over conventional testing in predicting rehabilitation outcome. This review provides preliminary support that dynamic tests can provide a valuable addition to conventional tests to assess patients' abilities. Although promising, there was a large variability in methods used for dynamic testing and, therefore, it remains unclear which dynamic testing methods are most appropriate for patients with cognitive impairments. More research is warranted to further evaluate and refine dynamic testing methodology and to further elucidate its predictive validity concerning rehabilitation outcomes relative to other cognitive and functional status indices.
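The numerical learning-potential indices the review lists (difference scores and regression residuals) can be sketched as follows; the test scores are invented for illustration:

```python
import numpy as np

# Hypothetical pre- and post-training test scores for five patients
pre  = np.array([10., 12., 8., 15., 11.])
post = np.array([14., 13., 12., 16., 15.])

# Difference score: raw pre- to post-training change
diff = post - pre

# Regression residuals: post-test regressed on pre-test; a positive
# residual means more improvement than the pre-test score predicts,
# which corrects the difference score for baseline level.
X = np.column_stack([np.ones_like(pre), pre])
beta, *_ = np.linalg.lstsq(X, post, rcond=None)
residuals = post - X @ beta
```

Categorical indices then group patients (e.g. into high and low learning potential) from either quantity; which index best predicts rehabilitation outcome is exactly the open question the review raises.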

  1. Designing a Methodology for Future Air Travel Scenarios

    NASA Technical Reports Server (NTRS)

    Wuebbles, Donald J.; Baughcum, Steven L.; Gerstle, John H.; Edmonds, Jae; Kinnison, Douglas E.; Krull, Nick; Metwally, Munir; Mortlock, Alan; Prather, Michael J.

    1992-01-01

    -subsonic future fleet. The methodology, procedures, and recommendations for the development of future HSCT and the subsonic fleet scenarios used for this evaluation are discussed.

  2. Memristor-Based Computing Architecture: Design Methodologies and Circuit Techniques

    DTIC Science & Technology

    2013-03-01

    simply consists of an NMOS transistor (Q) and a memristor. When the input Vin is low, the transistor Q is turned off. Thus, the output Vout is...connected to ground through the memristor. Conversely, when Vin is high, turning Q on, the memristance M and the equivalent transistor resistance (RQ...synapse design was dependent on the equivalent resistance (effectively, the size) of the Q transistor (RQ). A larger Q would offer a wider range of Vout

  3. Methodological Foundations for Designing Intelligent Computer-Based Training

    DTIC Science & Technology

    1991-09-03

metacognitive processes (Derry, in press). To this list, we would add that FSM techniques need not be restricted to intelligent tutoring systems. As Chin (1989...

  4. Development of a combustor analytical design methodology for liquid rocket engines

    NASA Technical Reports Server (NTRS)

    Pieper, Jerry L.; Muss, Jeff

    1989-01-01

    The development of a user friendly computerized methodology for the design and analysis of liquid propellant rocket engine combustion chambers is described. An overview of the methodology, consisting of a computer program containing an appropriate modular assembly of existing industry wide performance and combustion stability models, is presented. These models are linked with an interactive front end processor enabling the user to define the performance and stability traits of an existing design (point analysis) or to create the essential design features of a combustor to meet specific performance goals and combustion stability (point design). Plans for demonstration and verification of this methodology are also presented. These plans include the creation of combustor designs using the methodology, together with predictions of the performance and combustion stability for each design. A verification test program of 26 hot fire tests with up to four designs created using this methodology is described. This testing is planned using LOX/RP-1 propellants with a thrust level of approx. 220,000 N (50,000 lbf).

  5. Theories and Research Methodologies for Design-Based Implementation Research: Examples from Four Cases

    ERIC Educational Resources Information Center

    Russell, Jennifer Lin; Jackson, Kara; Krumm, Andrew E.; Frank, Kenneth A.

    2013-01-01

    Design-Based Implementation Research is the process of engaging "learning scientists, policy researchers, and practitioners in a model of collaborative, iterative, and systematic research and development" designed to address persistent problems of teaching and learning. Addressing persistent problems of teaching and learning requires…

  6. Software Design Methodology Migration for a Distributed Ground System

    NASA Technical Reports Server (NTRS)

    Ritter, George; McNair, Ann R. (Technical Monitor)

    2002-01-01

The Marshall Space Flight Center's (MSFC) Payload Operations Center (POC) ground system has been developed and has evolved over a period of about 10 years. During this time the software processes have migrated from more traditional to more contemporary development processes. The new software processes still emphasize requirements capture, software configuration management, design documentation, and making sure the products that have been developed are accountable to initial requirements. This paper will give an overview of how the software processes have evolved, highlighting the positives as well as the negatives. In addition, we will mention the COTS tools that have been integrated into the processes and how the COTS have provided value to the project.

  7. Design-Based Research: Is This a Suitable Methodology for Short-Term Projects?

    ERIC Educational Resources Information Center

    Pool, Jessica; Laubscher, Dorothy

    2016-01-01

    This article reports on a design-based methodology of a thesis in which a fully face-to-face contact module was converted into a blended learning course. The purpose of the article is to report on how design-based phases, in the form of micro-, meso- and macro-cycles were applied to improve practice and to generate design principles. Design-based…

  8. Systematic optimization model and algorithm for binding sequence selection in computational enzyme design

    PubMed Central

    Huang, Xiaoqiang; Han, Kehang; Zhu, Yushan

    2013-01-01

    A systematic optimization model for binding sequence selection in computational enzyme design was developed based on the transition state theory of enzyme catalysis and graph-theoretical modeling. The saddle point on the free energy surface of the reaction system was represented by catalytic geometrical constraints, and the binding energy between the active site and transition state was minimized to reduce the activation energy barrier. The resulting hyperscale combinatorial optimization problem was tackled using a novel heuristic global optimization algorithm, which was inspired and tested by the protein core sequence selection problem. The sequence recapitulation tests on native active sites for two enzyme catalyzed hydrolytic reactions were applied to evaluate the predictive power of the design methodology. The results of the calculation show that most of the native binding sites can be successfully identified if the catalytic geometrical constraints and the structural motifs of the substrate are taken into account. Reliably predicting active site sequences may have significant implications for the creation of novel enzymes that are capable of catalyzing targeted chemical reactions. PMID:23649589
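The combinatorial selection problem can be illustrated with a deliberately tiny brute-force sketch: choose one residue per active-site position to minimize a total binding energy subject to a catalytic constraint. The energies, residues, and constraint below are invented for illustration and bear no relation to the paper's physics-based energy function or heuristic global optimizer:

```python
from itertools import product

# Toy per-site binding energies (arbitrary units) for three positions;
# real calculations use rotamer libraries and physical energy terms.
site_energies = [
    {"S": -1.2, "H": -0.4, "D": -2.0},
    {"H": -1.5, "A":  0.3, "N": -0.9},
    {"D": -0.7, "E": -1.8, "S": -0.2},
]
# Illustrative catalytic geometrical constraint: position 0 must carry
# a residue from a permitted catalytic set.
allowed_first = {"S", "D"}

best = min(
    (seq for seq in product(*site_energies) if seq[0] in allowed_first),
    key=lambda seq: sum(site_energies[i][aa] for i, aa in enumerate(seq)),
)
print("".join(best))  # lowest-energy sequence satisfying the constraint
```

At realistic active-site sizes the search space is hyperscale, which is why the paper resorts to a heuristic global optimization algorithm rather than enumeration.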

  9. Systematic optimization model and algorithm for binding sequence selection in computational enzyme design.

    PubMed

    Huang, Xiaoqiang; Han, Kehang; Zhu, Yushan

    2013-07-01

    A systematic optimization model for binding sequence selection in computational enzyme design was developed based on the transition state theory of enzyme catalysis and graph-theoretical modeling. The saddle point on the free energy surface of the reaction system was represented by catalytic geometrical constraints, and the binding energy between the active site and transition state was minimized to reduce the activation energy barrier. The resulting hyperscale combinatorial optimization problem was tackled using a novel heuristic global optimization algorithm, which was inspired and tested by the protein core sequence selection problem. The sequence recapitulation tests on native active sites for two enzyme catalyzed hydrolytic reactions were applied to evaluate the predictive power of the design methodology. The results of the calculation show that most of the native binding sites can be successfully identified if the catalytic geometrical constraints and the structural motifs of the substrate are taken into account. Reliably predicting active site sequences may have significant implications for the creation of novel enzymes that are capable of catalyzing targeted chemical reactions.

  10. Object-oriented analysis and design: a methodology for modeling the computer-based patient record.

    PubMed

    Egyhazy, C J; Eyestone, S M; Martino, J; Hodgson, C L

    1998-08-01

    The article highlights the importance of an object-oriented analysis and design (OOAD) methodology for the computer-based patient record (CPR) in the military environment. Many OOAD methodologies do not adequately scale up, allow for efficient reuse of their products, or accommodate legacy systems. A methodology that addresses these issues is formulated and used to demonstrate its applicability in a large-scale health care service system. During a period of 6 months, a team of object modelers and domain experts formulated an OOAD methodology tailored to the Department of Defense Military Health System and used it to produce components of an object model for simple order processing. This methodology and the lessons learned during its implementation are described. This approach is necessary to achieve broad interoperability among heterogeneous automated information systems.

  11. Design Methodology for Low-Speed Variable Reluctance Motors

    NASA Astrophysics Data System (ADS)

    Suriano, John Riden

    Lowering the gear reduction in actuators by utilizing high-torque low-speed motors enables the use of less expensive and simpler gear systems and decreases the overall system inertia. Variable reluctance machines can produce high torque at low speeds. Their static torque, a critical quantity for determination of low speed operation, is compared for three variable reluctance motor design variations using linear analysis. Saturation effects, which are crucial to the accurate determination of static torque, are modeled using a dual energy technique first proposed by Lord Rayleigh. Dual energy techniques utilizing flux tubes and magnetomotive force slices are developed into a numerical method for predicting nonlinear three-dimensional magnetostatic field parameters. The dual energy method offers a compromise between the accurate but laborious finite element method and the speed of simplified lumped parameter magnetic circuit calculations. A two-dimensional dual energy model of a variable reluctance motor is developed. Results of calculations on a 4 kW Oulton machine are compared to measurements and other calculation methods. Finally, as a demonstration, the model is used to evaluate two competing variable reluctance motors for use as replacements for a DC windshield wiper motor.
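In the linear (unsaturated) analysis the abstract refers to, static torque follows from the angular derivative of phase inductance, T = ½ i² dL/dθ. A numerical sketch with an illustrative inductance profile (the motor parameters are invented, not taken from the Oulton machine):

```python
import numpy as np

theta = np.linspace(0, np.pi / 4, 100)   # rotor angle over one pole pitch, rad
L = 0.02 + 0.01 * np.cos(4 * theta)      # illustrative phase inductance, H
i = 5.0                                  # phase current, A

# Linear variable-reluctance torque: T = 0.5 * i^2 * dL/dtheta
dL_dtheta = np.gradient(L, theta)
torque = 0.5 * i**2 * dL_dtheta          # N*m, negative where L decreases
```

Saturation makes the real torque deviate strongly from this linear estimate, which is why the dissertation turns to the dual energy method to bound the nonlinear magnetostatic field solution.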

  12. Weibull-Based Design Methodology for Rotating Aircraft Engine Structures

    NASA Technical Reports Server (NTRS)

    Zaretsky, Erwin; Hendricks, Robert C.; Soditus, Sherry

    2002-01-01

The NASA Energy Efficient Engine (E(sup 3)-Engine) is used as the basis of a Weibull-based life and reliability analysis. Each component's life, and thus the engine's life, is defined by high-cycle fatigue (HCF) or low-cycle fatigue (LCF). Knowing the cumulative life distribution of each of the components making up the engine, as represented by a Weibull slope, is a prerequisite to predicting the life and reliability of the entire engine. As the engine Weibull slope increases, the predicted lives decrease. The predicted engine lives L(sub 5) (95% probability of survival) of approximately 17,000 and 32,000 hr do correlate with current engine maintenance practices without and with refurbishment, respectively. The individual high pressure turbine (HPT) blade lives necessary to obtain a blade system life L(sub 0.1) (99.9% probability of survival) of 9000 hr, for Weibull slopes of 3, 6 and 9, are 47,391, 20,652 and 15,658 hr, respectively. For a design life of the HPT disks having probable points of failure equal to or greater than 36,000 hr at a probability of survival of 99.9%, the predicted disk system life L(sub 0.1) can vary from 9,408 to 24,911 hr.
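The quoted blade lives follow from standard series-system Weibull scaling: for n identical, independent components with common slope beta, the individual life at a given survival probability is L_blade = L_sys * n**(1/beta). Taking n = 146 blades (a count inferred here because it matches the quoted numbers; the abstract does not state it) reproduces all three values:

```python
def blade_life_for_system_life(system_life: float, n_blades: int,
                               weibull_slope: float) -> float:
    """Individual component life required so that a series system of
    n identical Weibull components reaches the same survival
    probability at `system_life` that each component reaches at the
    returned life: L_blade = L_sys * n**(1/beta)."""
    return system_life * n_blades ** (1.0 / weibull_slope)

# For a 9000-hr blade-system L0.1 with 146 blades (assumed count):
for beta in (3, 6, 9):
    print(beta, round(blade_life_for_system_life(9000, 146, beta)))
# prints 3 47391, 6 20652, 9 15658
```

The steepening penalty at low slopes also shows why the engine-level Weibull slope drives the predicted engine lives so strongly.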

  13. Advanced piloted aircraft flight control system design methodology. Volume 2: The FCX flight control design expert system

    NASA Technical Reports Server (NTRS)

    Myers, Thomas T.; Mcruer, Duane T.

    1988-01-01

The development of a comprehensive and eclectic methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages, starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. The FCX expert system as presently developed is only a limited prototype capable of supporting basic lateral-directional FCS design activities related to the design example used. FCX presently supports design of only one FCS architecture (yaw damper plus roll damper) and the rules are largely focused on Class IV (highly maneuverable) aircraft. Despite this limited scope, the major elements which appear necessary for application of knowledge-based software concepts to flight control design were assembled, and thus FCX represents a prototype which can be tested, critiqued and evolved in an ongoing process of development.

  14. The inclusion of ergonomic tools in the informational, conceptual and preliminary phases of the product design methodology.

    PubMed

    Medeiros, Ivan Luiz de; Batiz, Eduardo Concepción

    2012-01-01

The process of product development has received special attention as it is being recognized as a source of competitive gain. Through its systematic use companies reduce costs, increase quality and decrease development time. However, one can find products being launched on the market that cause dissatisfaction to their users, and in consequence, if customers feel harmed or injured they will no longer purchase a product from the same brand. This concerns only the commercial aspect; usually the danger of an accident or injury is not even considered. This paper forms the basis of a master's dissertation and used a literature review to build its repertoire, analyzing the methodologies applied by product design engineers, designers and ergonomists. The analysis results demonstrate how inefficiently the design methodologies address ergonomic issues. The contribution of this work lies in the suggestion to include ergonomic tools in all phases of product development, and in the presentation of a table of those tools that points out the most suitable time of application and the expected results for each.

  15. Analysis and Design of Fuselage Structures Including Residual Strength Prediction Methodology

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.

    1998-01-01

The goal of this research project is to develop and assess methodologies for the design and analysis of fuselage structures accounting for residual strength. Two primary objectives are included in this research activity: development of a structural analysis methodology for predicting the residual strength of fuselage shell-type structures; and the development of an accurate, efficient analysis, design, and optimization tool for fuselage shell structures. Assessment of these tools for robustness, efficiency, and usability in a fuselage shell design environment will be integrated with these two primary research objectives.

  16. Operational Design Cognitive Methodology: An Analysis of COMISAF 30 August 2009 Initial Assessment

    DTIC Science & Technology

    2010-04-01

the problem must exist between the instruments of power. This research paper applies the Operational Design Cognitive Methodology to the...AIR COMMAND AND STAFF COLLEGE, AIR UNIVERSITY. OPERATIONAL DESIGN COGNITIVE METHODOLOGY: AN ANALYSIS OF COMISAF 30 AUGUST 2009 INITIAL...ASSESSMENT, by Abraham L. Jackson, Major, USAF. A Research Report Submitted to the Faculty in Partial Fulfillment of the Graduation

  17. A Step-by-Step Design Methodology for a Base Case Vanadium Redox-Flow Battery

    ERIC Educational Resources Information Center

    Moore, Mark; Counce, Robert M.; Watson, Jack S.; Zawodzinski, Thomas A.; Kamath, Haresh

    2012-01-01

    The purpose of this work is to develop an evolutionary procedure to be used by Chemical Engineering students for the base-case design of a Vanadium Redox-Flow Battery. The design methodology is based on the work of Douglas (1985) and provides a profitability analysis at each decision level so that more profitable alternatives and directions can be…

  18. A design methodology for evolutionary air transportation networks

    NASA Astrophysics Data System (ADS)

    Yang, Eunsuk

The air transportation demand at large hubs in the U.S. is anticipated to double in the near future. Current runway construction plans at selected airports can relieve some capacity and delay problems, but many are doubtful that this solution is sufficient to accommodate the anticipated demand growth in the National Airspace System (NAS). With the worsening congestion problem, it is imperative to seek alternative solutions other than costly runway construction. In this respect, many researchers and organizations have been building models and performing analyses of the NAS. However, the complexity and size of the problem result in an overwhelming task for transportation system modelers. This research seeks to compose an active design algorithm for an evolutionary airline network model so as to include network-specific control properties. An airline network designer, referred to as a network architect, can use this tool to assess the possibilities of gaining more capacity by changing the network configuration. Since the Airline Deregulation Act of 1978, the airline service network has evolved into a distinct Hub-and-Spoke (H&S) network. Enplanement demand on the H&S network is the sum of Origin-Destination (O-D) demand and transfer demand. Even though flight or enplanement demand is a function of O-D demand and passenger routings on the airline network, the distinction between enplanement and O-D demand is not often made. Instead, many current demand-forecasting practices are based on scale-ups from enplanements, which include the demand to and from transferring network hubs. Based on this research, it was found that current demand-prediction practice can be improved by dissecting enplanements further into smaller pieces of information. As a result, enplanement demand is decomposed into intrinsic and variable parts. The proposed intrinsic demand model is based on the concept of 'true' O-D demand which includes the direction of each round trip

  19. Advanced piloted aircraft flight control system design methodology. Volume 1: Knowledge base

    NASA Technical Reports Server (NTRS)

    Mcruer, Duane T.; Myers, Thomas T.

    1988-01-01

The development of a comprehensive and eclectic methodology for the conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. While theory and associated computational means are an important aspect of the design methodology, the lore, knowledge and experience elements which guide and govern applications are critical features. This material is presented as summary tables, outlines, recipes, empirical data, lists, etc., which encapsulate a great deal of expert knowledge. Much of this is presented in topical knowledge summaries which are attached as Supplements. The composite of the supplements and the main body elements constitutes a first cut at a Mark 1 Knowledge Base for manned-aircraft flight control.

  20. Reporting of planned statistical methods in published surgical randomised trial protocols: a protocol for a methodological systematic review

    PubMed Central

    Madden, Kim; Arseneau, Erika; Evaniew, Nathan; Smith, Christopher S; Thabane, Lehana

    2016-01-01

Introduction Poor reporting can lead to inadequate presentation of data, confusion regarding the research methodology used, selective reporting of results, and other misinformation regarding health research. One of the most recent attempts to improve quality of reporting comes from the Standard Protocol Items: Recommendations for Interventional Trials (SPIRIT) Group, which makes recommendations for the reporting of protocols. In this report, we present a protocol for a systematic review of published surgical randomised controlled trial (RCT) protocols, with the purpose of assessing the reporting quality and completeness of the statistical aspects. Methods We will include all published protocols of randomised trials that investigate surgical interventions. We will search MEDLINE, EMBASE, and CENTRAL for relevant studies. Author pairs will independently review all titles, abstracts, and full texts identified by the literature search, and extract data using a structured data extraction form. We will extract the following: year of publication, country, sample size, description of study population, description of intervention and control, primary outcome, important methodological qualities, and quality of reporting of planned statistical methods based on the SPIRIT guidelines. Ethics and dissemination The results of this review will demonstrate the quality of statistical reporting of published surgical RCT protocols. This knowledge will inform recommendations to surgeons, researchers, journal editors and peer reviewers, and other knowledge users that focus on common deficiencies in reporting and how to rectify them. Ethics approval for this study is not required. We will disseminate the results of this review in peer-reviewed publications and conference presentations, and at a doctoral independent study oral defence. PMID:27259528

  1. Integrated Controls-Structures Design Methodology: Redesign of an Evolutionary Test Structure

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Elliot, Kenny B.; Joshi, Suresh M.

    1997-01-01

An optimization-based integrated controls-structures design methodology for a class of flexible space structures is described, and the phase-0 Controls-Structures-Integration evolutionary model (CEM), a laboratory testbed at NASA Langley, is redesigned using this integrated design methodology. The integrated controls-structures design is posed as a nonlinear programming problem to minimize the control effort required to maintain a specified line-of-sight pointing performance, under persistent white noise disturbance. Static and dynamic dissipative control strategies are employed for feedback control, and parameters of these controllers are considered as the control design variables. Sizes of strut elements in various sections of the CEM are used as the structural design variables. Design guides for the struts are developed and employed in the integrated design process, to ensure that the redesigned structure can be effectively fabricated. The superiority of the integrated design methodology over the conventional design approach is demonstrated analytically by observing a significant reduction in the average control power needed to maintain specified pointing performance with the integrated design approach.

  2. Co-design of RAD and ETHICS methodologies: a combination of information system development methods

    NASA Astrophysics Data System (ADS)

    Nasehi, Arezo; Shahriyari, Salman

    2011-12-01

Co-design is a new trend in the social world which tries to capture different ideas in order to use the most appropriate features for a system. In this paper, the co-design of two information system methodologies is considered: rapid application development (RAD) and effective technical and human implementation of computer-based systems (ETHICS). We examined the characteristics of these methodologies to assess the possibility of a co-design, or combination, of them for developing an information system. To this end, four different aspects are analyzed: social or technical approach, user participation and user involvement, job satisfaction, and overcoming resistance to change. Finally, a case study using a quantitative method is analyzed in order to examine the possibility of co-design using these factors. The paper concludes that RAD and ETHICS are appropriate for co-design and offers some suggestions for it.

  3. How to Assess the External Validity and Model Validity of Therapeutic Trials: A Conceptual Approach to Systematic Review Methodology

    PubMed Central

    2014-01-01

Background. Evidence rankings do not equally consider internal validity (IV), external validity (EV), and model validity (MV) for clinical studies, including complementary and alternative medicine/integrative medicine (CAM/IM) research. This paper describes this model and offers an EV assessment tool (EVAT©) for weighing studies according to EV and MV in addition to IV. Methods. An abbreviated systematic review methodology was employed to search, assemble, and evaluate the literature published on EV/MV criteria. Standard databases were searched for keywords relating to EV, MV, and bias-scoring from inception to January 2013. The tools identified and concepts described were pooled to assemble a robust tool for evaluating these quality criteria. Results. This study assembled a streamlined, objective tool for evaluating the EV/MV quality of research that is more sensitive to CAM/IM research. Conclusion. Improved reporting on EV can provide information that will help guide policy makers, public health researchers, and other scientists in the selection, development, and improvement of research-tested interventions. Overall, clinical studies with high EV have the potential to provide the most useful information about “real-world” consequences of health interventions. It is hoped that this novel tool, which considers IV, EV, and MV on equal footing, will better guide clinical decision making. PMID:24734111

  4. Efficient preliminary floating offshore wind turbine design and testing methodologies and application to a concrete spar design

    PubMed Central

    Matha, Denis; Sandner, Frank; Molins, Climent; Campos, Alexis; Cheng, Po Wen

    2015-01-01

    The current key challenge in the floating offshore wind turbine industry and research is on designing economic floating systems that can compete with fixed-bottom offshore turbines in terms of levelized cost of energy. The preliminary platform design, as well as early experimental design assessments, are critical elements in the overall design process. In this contribution, a brief review of current floating offshore wind turbine platform pre-design and scaled testing methodologies is provided, with a focus on their ability to accommodate the coupled dynamic behaviour of floating offshore wind systems. The exemplary design and testing methodology for a monolithic concrete spar platform as performed within the European KIC AFOSP project is presented. Results from the experimental tests compared to numerical simulations are presented and analysed and show very good agreement for relevant basic dynamic platform properties. Extreme and fatigue loads and cost analysis of the AFOSP system confirm the viability of the presented design process. In summary, the exemplary application of the reduced design and testing methodology for AFOSP confirms that it represents a viable procedure during pre-design of floating offshore wind turbine platforms. PMID:25583870

  5. Efficient preliminary floating offshore wind turbine design and testing methodologies and application to a concrete spar design.

    PubMed

    Matha, Denis; Sandner, Frank; Molins, Climent; Campos, Alexis; Cheng, Po Wen

    2015-02-28

    The current key challenge in the floating offshore wind turbine industry and research is on designing economic floating systems that can compete with fixed-bottom offshore turbines in terms of levelized cost of energy. The preliminary platform design, as well as early experimental design assessments, are critical elements in the overall design process. In this contribution, a brief review of current floating offshore wind turbine platform pre-design and scaled testing methodologies is provided, with a focus on their ability to accommodate the coupled dynamic behaviour of floating offshore wind systems. The exemplary design and testing methodology for a monolithic concrete spar platform as performed within the European KIC AFOSP project is presented. Results from the experimental tests compared to numerical simulations are presented and analysed and show very good agreement for relevant basic dynamic platform properties. Extreme and fatigue loads and cost analysis of the AFOSP system confirm the viability of the presented design process. In summary, the exemplary application of the reduced design and testing methodology for AFOSP confirms that it represents a viable procedure during pre-design of floating offshore wind turbine platforms.

  6. Reach, engagement, and effectiveness: a systematic review of evaluation methodologies used in health promotion via social networking sites.

    PubMed

    Lim, Megan S C; Wright, Cassandra J C; Carrotte, Elise R; Pedrana, Alisa E

    2016-10-06

Issue addressed: Social networking sites (SNS) are increasingly popular platforms for health promotion. Advancements in SNS health promotion require quality evidence; however, interventions are often not formally evaluated. This study aims to describe evaluation practices used in SNS health promotion. Methods: A systematic review was undertaken of Medline, PsycINFO, Scopus, EMBASE, CINAHL Plus, Communication and Mass Media Complete, and Cochrane Library databases. Articles published between 2006 and 2013 describing any health promotion intervention delivered using SNS were included. Results: Forty-seven studies were included. There were two main evaluation approaches: closed designs (n=23), which used traditional research designs and formal recruitment procedures; and open designs (n=19), which evaluated the intervention in a real-world setting, allowing unknown SNS users to interact with the content without enrolling in research. Closed designs were unable to assess reach and engagement beyond their research sample. Open designs often relied on weaker study designs with no use of objective outcome measures and yielded low response rates. Conclusions: Barriers to evaluation included low participation rates, high attrition, unknown representativeness and lack of comparison groups. Acceptability was typically assessed among those engaged with the intervention, with limited population data available to accurately assess intervention reach. Few studies were able to assess uptake of the intervention in a real-life setting while simultaneously assessing effectiveness of interventions with research rigour. So what?: Through use of quasi-experimental or well-designed before-after evaluations, in combination with detailed engagement metrics, it is possible to balance assessment of effectiveness and reach to evaluate SNS health promotion.

  7. Energy-Based Design Methodology for Air Vehicle Systems: Aerodynamic Correlation Study

    DTIC Science & Technology

    2005-03-01

ENERGY-BASED DESIGN METHODOLOGY FOR AIR VEHICLE SYSTEMS: AERODYNAMIC CORRELATION STUDY. AFOSR/Dr. John Schmisseur. ...drag estimation and vehicle-level utilization of energy. The exergy utilization of a wing in a steady, low subsonic, three-dimensional, viscous flow...

  8. Systematic Assessment of a High-Impact Course Design Institute

    ERIC Educational Resources Information Center

    Palmer, Michael S.; Streifer, Adriana C.; Williams-Duncan, Stacy

    2016-01-01

    Herein, we describe an intensive, week-long course design institute (CDI) designed to introduce participants to the scholarly and evidence-driven process of learning-focused course design. Impact of this intervention is demonstrated using a multifaceted approach: (a) post-CDI satisfaction and perception surveys, (b) pre-/post-CDI surveys probing…

  9. Model-Driven Design: Systematically Building Integrated Blended Learning Experiences

    ERIC Educational Resources Information Center

    Laster, Stephen

    2010-01-01

    Developing and delivering curricula that are integrated and that use blended learning techniques requires a highly orchestrated design. While institutions have demonstrated the ability to design complex curricula on an ad-hoc basis, these projects are generally successful at a great human and capital cost. Model-driven design provides a…

  10. Systematic design of membership functions for fuzzy-logic control: A case study on one-stage partial nitritation/anammox treatment systems.

    PubMed

    Boiocchi, Riccardo; Gernaey, Krist V; Sin, Gürkan

    2016-10-01

    A methodology is developed to systematically design the membership functions of fuzzy-logic controllers for multivariable systems. The methodology consists of a systematic derivation of the critical points of the membership functions as a function of predefined control objectives. Several constrained optimization problems corresponding to different qualitative operation states of the system are defined and solved to identify, in a consistent manner, the critical points of the membership functions for the input variables. The consistently identified critical points, together with the linguistic rules, determine the long term reachability of the control objectives by the fuzzy logic controller. The methodology is highlighted using a single-stage side-stream partial nitritation/Anammox reactor as a case study. As a result, a new fuzzy-logic controller for high and stable total nitrogen removal efficiency is designed. Rigorous simulations are carried out to evaluate and benchmark the performance of the controller. The results demonstrate that the novel control strategy is capable of rejecting the long-term influent disturbances, and can achieve a stable and high TN removal efficiency. Additionally, the controller was tested, and showed robustness, against measurement noise levels typical for wastewater sensors. A feedforward-feedback configuration using the present controller would give even better performance. In comparison, a previously developed fuzzy-logic controller using merely expert and intuitive knowledge performed worse. This proved the importance of using a systematic methodology for the derivation of the membership functions for multivariable systems. These results are promising for future applications of the controller in real full-scale plants. Furthermore, the methodology can be used as a tool to help systematically design fuzzy logic control applications for other biological processes.
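The membership functions whose critical points the methodology derives can be sketched with triangular shapes. The critical points below are hypothetical placeholders, not values from the paper, which obtains them by solving constrained optimization problems for each qualitative operating state:

```python
def tri(x, a, b, c):
    """Triangular membership function with critical points a <= b <= c.
    A shoulder is formed when a == b (left) or b == c (right)."""
    if x < a or x > c:
        return 0.0
    if x <= b:
        return 1.0 if a == b else (x - a) / (b - a)
    return 1.0 if b == c else (c - x) / (c - b)

# Hypothetical critical points for a total-nitrogen-removal input variable (%):
LOW = (0.0, 0.0, 50.0)        # fully "low" at 0 % removal, not "low" above 50 %
HIGH = (50.0, 100.0, 100.0)   # mirror image

removal = 25.0
mu_low, mu_high = tri(removal, *LOW), tri(removal, *HIGH)

# One-rule, Sugeno-style defuzzification: full corrective action when removal
# is "low", none when it is "high".
action = (mu_low * 1.0 + mu_high * 0.0) / (mu_low + mu_high)
```

The design choice the abstract emphasizes is that `LOW` and `HIGH` would be outputs of an optimization, not hand-tuned expert guesses, which is what makes the resulting controller's reachable objectives consistent.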

  11. A systematic review of the performance of instruments designed to measure the dimensions of pressure ulcers.

    PubMed

    O'Meara, Susan M; Bland, J Martin; Dumville, Jo C; Cullum, Nicky A

    2012-01-01

    The objective was to undertake a systematic review of the performance of wound measurement instruments used for patients with pressure ulcers. Studies of any design, evaluating methods for estimating wound diameter, depth, surface area, or volume in patients with pressure ulcers were included. Eligible evaluations had to report intra- or inter-rater reliability, accuracy, agreement, or feasibility of methods. Electronic databases and other sources were accessed for study identification. Included studies were critically appraised using a modified checklist for diagnostic test evaluations. Twelve studies were included. Most had methodological problems and/or used inappropriate statistical methods. Reliable methods for measuring pressure ulcer surface area may include: grid tracings from photographs combined with whole plus partial square count; a portable digital pad; and stereophotogrammetry combined with computerized image analysis. The agreement between photographic tracing and direct transparency tracing may be satisfactory (both methods being combined with computerized planimetry). No definitive conclusions could be reached about studies of diameter or depth; this means that there is little evidence to underpin recommendations in clinical guidelines. Evaluations of volume measurement were of poor quality, and there were few data on feasibility. Further primary research is needed to evaluate methods of wound measurement used in clinical practice.

  12. Experimental design in caecilian systematics: phylogenetic information of mitochondrial genomes and nuclear rag1.

    PubMed

    San Mauro, Diego; Gower, David J; Massingham, Tim; Wilkinson, Mark; Zardoya, Rafael; Cotton, James A

    2009-08-01

    In molecular phylogenetic studies, a major aspect of experimental design concerns the choice of markers and taxa. Although previous studies have investigated the phylogenetic performance of different genes and the effectiveness of increasing taxon sampling, their conclusions are partly contradictory, probably because they are highly context specific and dependent on the group of organisms used in each study. Goldman introduced a method for experimental design in phylogenetics based on the expected information to be gained that has barely been used in practice. Here we use this method to explore the phylogenetic utility of mitochondrial (mt) genes, mt genomes, and nuclear rag1 for studies of the systematics of caecilian amphibians, as well as the effect of taxon addition on the stabilization of a controversial branch of the tree. Overall phylogenetic information estimates per gene, specific estimates per branch of the tree, estimates for combined (mitogenomic) data sets, and estimates as a hypothetical new taxon is added to different parts of the caecilian tree are calculated and compared. In general, the most informative data sets are those for mt transfer and ribosomal RNA genes. Our results also show at which positions in the caecilian tree the addition of taxa have the greatest potential to increase phylogenetic information with respect to the controversial relationships of Scolecomorphus, Boulengerula, and all other teresomatan caecilians. These positions are, as intuitively expected, mostly (but not all) adjacent to the controversial branch. Generating whole mitogenomic and rag1 data for additional taxa joining the Scolecomorphus branch may be a more efficient strategy than sequencing a similar amount of additional nucleotides spread across the current caecilian taxon sampling. The methodology employed in this study allows an a priori evaluation and testable predictions of the appropriateness of particular experimental designs to solve specific questions at

  13. A methodology for designing robust multivariable nonlinear control systems. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Grunberg, D. B.

    1986-01-01

    A new methodology is described for the design of nonlinear dynamic controllers for nonlinear multivariable systems providing guarantees of closed-loop stability, performance, and robustness. The methodology is an extension of the Linear-Quadratic-Gaussian with Loop-Transfer-Recovery (LQG/LTR) methodology for linear systems, thus hinging upon the idea of constructing an approximate inverse operator for the plant. A major feature of the methodology is a unification of both the state-space and input-output formulations. In addition, new results on stability theory, nonlinear state estimation, and optimal nonlinear regulator theory are presented, including the guaranteed global properties of the extended Kalman filter and optimal nonlinear regulators.
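For the linear case that this methodology extends, the LQG/LTR recipe can be sketched in scalar form: solve a Riccati equation for the regulator, then inflate the fictitious process noise so the filter loop recovers the regulator loop. The plant and weights below are illustrative, not taken from the thesis:

```python
import math

def care_scalar(a, b, q, r):
    """Positive root p of the scalar continuous-time Riccati equation
    2*a*p - (b*p)**2 / r + q = 0, and the LQR gain k = b*p / r."""
    p = r * (a + math.sqrt(a * a + b * b * q / r)) / (b * b)
    return p, b * p / r

def filter_gain_scalar(a, c, w, v):
    """Dual (Kalman filter) Riccati: 2*a*s - (c*s)**2 / v + w = 0; gain l = c*s / v."""
    s = v * (a + math.sqrt(a * a + c * c * w / v)) / (c * c)
    return c * s / v

a, b, c = -1.0, 1.0, 1.0            # stable first-order plant dx = a*x + b*u, y = c*x
p, k = care_scalar(a, b, q=1.0, r=1.0)

# Loop-transfer recovery: set process noise w = w0 + rho*b*b; as rho grows, the
# filter gain grows and the compensator loop approaches the full-state LQR loop.
gains = [filter_gain_scalar(a, c, w=1.0 + rho * b * b, v=1.0)
         for rho in (0.0, 10.0, 1000.0)]
```

The recovery step is what motivates the "approximate inverse operator for the plant" phrasing in the abstract: for large rho the filter effectively inverts the plant dynamics inside the loop.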

  14. Development of an Expert Judgement Elicitation and Calibration Methodology for Risk Analysis in Conceptual Vehicle Design

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina

    2004-01-01

    A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to obtain expert judgment opinion for quantifying uncertainties as a probability distribution so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates, and provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results is presented. A discussion of possible future steps in this research area is given.
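One standard way to aggregate multiple expert judgments into a single probability distribution, in the spirit of the report's calibration-and-aggregation step, is a calibration-weighted linear opinion pool. The experts' densities and weights below are made-up stand-ins for real elicited data:

```python
def linear_opinion_pool(pdfs, weights):
    """Return a pooled density: a weighted mixture of the experts' densities,
    with weights (e.g. calibration scores) normalized to sum to one."""
    total = sum(weights)
    w = [wi / total for wi in weights]
    return lambda x: sum(wi * f(x) for wi, f in zip(w, pdfs))

# Two hypothetical experts stating uniform densities over different ranges
# for the same uncertain design input.
expert1 = lambda x: 1.0 if 0.0 <= x <= 1.0 else 0.0   # Uniform(0, 1)
expert2 = lambda x: 0.5 if 0.0 <= x <= 2.0 else 0.0   # Uniform(0, 2)

# Expert 1 is assumed better calibrated, so gets three times the weight.
pooled = linear_opinion_pool([expert1, expert2], weights=[3.0, 1.0])
# pooled(0.5) = 0.75 * 1.0 + 0.25 * 0.5 = 0.875
```

A mixture preserves each expert's support, so disagreement shows up as a wider pooled distribution, which is exactly the input a downstream multidisciplinary risk analysis needs.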

  15. Methodology for object-oriented real-time systems analysis and design: Software engineering

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1991-01-01

Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structurings of the system, so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation and the original specification, and perhaps the high-level design, is not object-oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects, where the operations or methods of the objects correspond to processes in the data flow diagrams, and then designing in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time-behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly connected models which progress from the object-oriented real-time systems-analysis logical models through the physical architectural models to the high-level design stages. The methodology is appropriate to the overall specification, including hardware and software modules. In software modules, the systems-analysis objects are transformed into software objects.

  16. Space station definitions, design, and development. Task 5: Multiple arm telerobot coordination and control: Manipulator design methodology

    NASA Technical Reports Server (NTRS)

    Stoughton, R. M.

    1990-01-01

A proposed methodology applicable to the design of manipulator systems is described. The current design process is especially weak in the preliminary design phase, since there is no accepted measure to be used in trading off the different options available for the various subsystems. The design process described uses Cartesian End-Effector Impedance as a measure of performance for the system. Having this measure of performance, it is shown how it may be used to determine the trade-offs necessary in the preliminary design phase. The design process involves three main parts: (1) determination of desired system performance in terms of End-Effector Impedance; (2) trade-off of design options to achieve this desired performance; and (3) verification of system performance through laboratory testing. The design process is developed using numerous examples and experiments to demonstrate the feasibility of this approach to manipulator design.
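The static part of Cartesian end-effector impedance, the stiffness felt at the tool tip, is commonly obtained from joint stiffness through the manipulator Jacobian via K_x = J^{-T} K_q J^{-1}. The two-link arm and stiffness values below are an illustrative stand-in, not the manipulator from the report:

```python
import numpy as np

def planar_jacobian(l1, l2, q1, q2):
    """Jacobian of a 2-link planar arm (link lengths l1, l2; joint angles q1, q2)."""
    s1, c1 = np.sin(q1), np.cos(q1)
    s12, c12 = np.sin(q1 + q2), np.cos(q1 + q2)
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

def cartesian_stiffness(J, Kq):
    """Map joint-space stiffness Kq to Cartesian space: K_x = J^-T Kq J^-1.
    Valid away from singular configurations, where J is invertible."""
    J_inv = np.linalg.inv(J)
    return J_inv.T @ Kq @ J_inv

J = planar_jacobian(1.0, 1.0, 0.3, 0.9)              # a non-singular pose
Kx = cartesian_stiffness(J, np.diag([100.0, 50.0]))  # hypothetical joint stiffnesses
```

Evaluating `Kx` across the workspace gives a configuration-dependent performance map, which is the kind of quantitative trade-off measure the preliminary design phase otherwise lacks.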

  17. Methodological quality of meta-analyses on treatments for chronic obstructive pulmonary disease: a cross-sectional study using the AMSTAR (Assessing the Methodological Quality of Systematic Reviews) tool

    PubMed Central

    Ho, Robin ST; Wu, Xinyin; Yuan, Jinqiu; Liu, Siya; Lai, Xin; Wong, Samuel YS; Chung, Vincent CH

    2015-01-01

    Background: Meta-analysis (MA) of randomised trials is considered to be one of the best approaches for summarising high-quality evidence on the efficacy and safety of treatments. However, methodological flaws in MAs can reduce the validity of conclusions, subsequently impairing the quality of decision making. Aims: To assess the methodological quality of MAs on COPD treatments. Methods: A cross-sectional study on MAs of COPD trials. MAs published during 2000–2013 were sampled from the Cochrane Database of Systematic Reviews and Database of Abstracts of Reviews of Effect. Methodological quality was assessed using the validated AMSTAR (Assessing the Methodological Quality of Systematic Reviews) tool. Results: Seventy-nine MAs were sampled. Only 18% considered the scientific quality of primary studies when formulating conclusions and 49% used appropriate meta-analytic methods to combine findings. The problems were particularly acute among MAs on pharmacological treatments. In 48% of MAs the authors did not report conflict of interest. Fifty-eight percent reported harmful effects of treatment. Publication bias was not assessed in 65% of MAs, and only 10% had searched non-English databases. Conclusions: The methodological quality of the included MAs was disappointing. Consideration of scientific quality when formulating conclusions should be made explicit. Future MAs should improve on reporting conflict of interest and harm, assessment of publication bias, prevention of language bias and use of appropriate meta-analytic methods. PMID:25569783

  18. Cyber-Informed Engineering: The Need for a New Risk Informed and Design Methodology

    SciTech Connect

    Price, Joseph Daniel; Anderson, Robert Stephen

    2015-06-01

    Current engineering and risk management methodologies do not contain the foundational assumptions required to address the intelligent adversary’s capabilities in malevolent cyber attacks. Current methodologies focus on equipment failures or human error as initiating events for a hazard, while cyber attacks use the functionality of a trusted system to perform operations outside of the intended design and without the operator’s knowledge. These threats can by-pass or manipulate traditionally engineered safety barriers and present false information, invalidating the fundamental basis of a safety analysis. Cyber threats must be fundamentally analyzed from a completely new perspective where neither equipment nor human operation can be fully trusted. A new risk analysis and design methodology needs to be developed to address this rapidly evolving threatscape.

  19. A multi-criteria decision aid methodology to design electric vehicles public charging networks

    NASA Astrophysics Data System (ADS)

    Raposo, João; Rodrigues, Ana; Silva, Carlos; Dentinho, Tomaz

    2015-05-01

This article presents a new multi-criteria decision aid methodology, dynamic-PROMETHEE, here used to design electric vehicle charging networks. In applying this methodology to a Portuguese city, results suggest that it is effective in designing electric vehicle charging networks, generating time- and policy-based scenarios, considering supply and demand and the city's urban structure. Dynamic-PROMETHEE adds to PROMETHEE's already known characteristics other useful features, such as decision memory over time, versatility and adaptability. The case study, used here to present dynamic-PROMETHEE, served as the inspiration and basis for creating this new methodology. It can be used to model different problems and scenarios that may present similar requirements and characteristics.
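The outranking core that dynamic-PROMETHEE extends can be sketched in a few lines. Below is a minimal PROMETHEE II ranking step under a "usual" (step) preference function; the site names, criterion scores and weights are invented for illustration and are not taken from the Portuguese case study.

```python
# Minimal PROMETHEE II sketch: rank alternatives by net outranking flow.
# Assumes every criterion is already normalised so that higher is better,
# and uses the "usual" preference function (1 if strictly better, else 0).

def promethee_ii(scores, weights):
    """scores: {alternative: [criterion values]}; returns alternatives ranked best-first."""
    alts = list(scores)
    n = len(alts)

    def pref(a, b):  # weighted preference intensity of a over b
        return sum(w for w, ga, gb in zip(weights, scores[a], scores[b]) if ga > gb)

    phi = {}
    for a in alts:
        plus = sum(pref(a, b) for b in alts if b != a) / (n - 1)   # leaving flow
        minus = sum(pref(b, a) for b in alts if b != a) / (n - 1)  # entering flow
        phi[a] = plus - minus                                      # net flow
    return sorted(alts, key=phi.get, reverse=True)

# Three hypothetical charging sites scored on coverage, demand and cost
# attractiveness (all higher-is-better), with weights summing to 1.
sites = {"A": [0.8, 0.6, 0.4], "B": [0.5, 0.9, 0.7], "C": [0.3, 0.4, 0.9]}
ranking = promethee_ii(sites, [0.5, 0.3, 0.2])
```

The dynamic variant described in the abstract would re-run such a ranking over time while carrying decision memory between periods.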

  20. BEAM STOP DESIGN METHODOLOGY AND DESCRIPTION OF A NEW SNS BEAM STOP

    SciTech Connect

    Polsky, Yarom; Plum, Michael A; Geoghegan, Patrick J; Jacobs, Lorelei L; Lu, Wei; McTeer, Stephen Mark

    2010-01-01

The design of accelerator components such as magnets, accelerator cavities and beam instruments tends to be a fairly standardized and collective effort within the particle accelerator community, with well established performance, reliability and, in some cases, even budgetary criteria. Beam stop design, by contrast, has historically been comparatively subjective, with much more general goals. This lack of rigor has led to a variety of facility implementations with limited standardization and minimal consensus on approach to development within the particle accelerator community. At the Spallation Neutron Source (SNS), for example, there are four high power beam stops in use, three of which have significantly different design solutions. This paper describes the design of a new off-momentum beam stop for the SNS. The technical description of the system is complemented by a discussion of design methodology. The paper presents an overview of the new SNS HEBT off-momentum beam stop and outlines a methodology for beam stop system design. The new beam stop consists of aluminium and steel blocks cooled by a closed-loop forced-air system and is expected to be commissioned this summer. The design methodology outlined here represents a basic description of the process, data, analyses and critical decisions involved in the development of a beam stop system.

  1. Cell-based top-down design methodology for RSFQ digital circuits

    NASA Astrophysics Data System (ADS)

    Yoshikawa, N.; Koshiyama, J.; Motoori, K.; Matsuzaki, F.; Yoda, K.

    2001-08-01

We propose a cell-based top-down design methodology for rapid single flux quantum (RSFQ) digital circuits. Our design methodology employs a binary decision diagram (BDD), which is currently used for the design of CMOS pass-transistor logic circuits. The main features of BDD RSFQ circuits are the limited number of primitives, dual-rail nature, non-clocking architecture, and small gate count. We have made a standard BDD RSFQ cell library and prepared a top-down design CAD environment with which we can perform logic synthesis, logic simulation, circuit simulation and layout view extraction. In order to clarify problems expected in large-scale RSFQ circuit design, we have designed a small RSFQ microprocessor based on a simple architecture using our top-down design methodology. We have estimated its system performance and compared it with that of a CMOS microprocessor with the same architecture. It was found that the RSFQ system is superior in terms of operating speed, though it requires an extremely large chip area.
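The binary decision diagram underlying this methodology reduces a logic function to a chain of two-way tests, which is what maps naturally onto dual-rail primitives. A minimal sketch of classical BDD evaluation follows; the node layout is a hand-built illustration for a two-input AND, not the paper's cell library.

```python
# A BDD reduced to its essentials: each internal node tests one input variable
# and branches to a "low" (input = 0) or "high" (input = 1) child; terminals
# are the constants 0 and 1.

class Node:
    def __init__(self, var, low, high):
        self.var, self.low, self.high = var, low, high

def evaluate(node, assignment):
    """Walk the diagram from the root down to a terminal 0/1 leaf."""
    while isinstance(node, Node):
        node = node.high if assignment[node.var] else node.low
    return node  # terminal leaf: 0 or 1

# BDD for (a AND b): test a first; if a = 1 the result is b, otherwise 0.
and_bdd = Node("a", 0, Node("b", 0, 1))
assert evaluate(and_bdd, {"a": 1, "b": 1}) == 1
assert evaluate(and_bdd, {"a": 1, "b": 0}) == 0
```

In the cell-based flow described above, each such node would correspond to a library cell and the two branches to the dual-rail signal pair.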

  2. Designing Needs Statements in a Systematic Iterative Way

    ERIC Educational Resources Information Center

    Verstegen, D. M. L.; Barnard, Y. F.; Pilot, A.

    2009-01-01

    Designing specifications for technically advanced instructional products, such as e-learning, simulations or simulators requires different kinds of expertise. The SLIM method proposes to involve all stakeholders from the beginning in a series of workshops under the guidance of experienced instructional designers. These instructional designers…

  3. Conjecture Mapping: An Approach to Systematic Educational Design Research

    ERIC Educational Resources Information Center

    Sandoval, William

    2014-01-01

    Design research is strongly associated with the learning sciences community, and in the 2 decades since its conception it has become broadly accepted. Yet within and without the learning sciences there remains confusion about how to do design research, with most scholarship on the approach describing what it is rather than how to do it. This…

  4. A design methodology for effective application of pan-tilt cameras in alarm assessment systems

    SciTech Connect

    Davis, R.F.

    1993-08-01

Effective application of pan-tilt cameras in alarm assessment systems requires that the overall system design be such that any threat for which the system is designed will be within the field of view of the camera for a sufficiently long time for the assessment of the alarm to be performed. The assessment of alarms in large, unobstructed areas requires a different type of analysis than traditionally used for clear zones between fences along fixed perimeters where an intruder's possible location is well defined. This paper presents a design methodology which integrates the threat characteristics, sensor detection pattern, system response time, and optics geometry considerations to identify all feasible locations for camera placement for effective assessment of large, unobstructed areas. The methodology also can be used to evaluate tradeoffs among these various considerations to improve candidate designs.
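One geometric constraint of the kind this methodology integrates can be sketched directly: for a given camera orientation, an intruder crossing the field of view at speed v must remain in frame long enough for assessment. The formula and the numbers below are illustrative assumptions, not values from the paper.

```python
import math

# Standoff distance at which the chord an intruder walks across the field of
# view yields at least `assess_time_s` seconds in frame. Assumes a straight
# crossing perpendicular to the optical axis and a fixed (non-slewing) camera.

def min_standoff_m(intruder_speed_mps, assess_time_s, fov_deg):
    chord_per_metre = 2.0 * math.tan(math.radians(fov_deg) / 2.0)  # chord width per metre of range
    return intruder_speed_mps * assess_time_s / chord_per_metre

# A 5 m/s runner needing 4 s of assessment under a 30-degree lens.
d = min_standoff_m(5.0, 4.0, 30.0)
```

A wider lens shortens the required standoff, which is one of the optics-versus-placement tradeoffs the methodology evaluates.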

  5. Curriculum Design: Nurse Educator's Role in Managing and Utilizing Various Teaching Methodologies.

    ERIC Educational Resources Information Center

    Walters, Norma J.

    The role of the nurse educator in curriculum design in the future is considered. Changing technology, shifts in patient care agencies, legislation and long-term care specialties in nursing are all factors that will have a significant impact on curricula. Plans for managing and utilizing various teaching methodologies will be an important role for…

  6. Beyond Needs Analysis: Soft Systems Methodology for Meaningful Collaboration in EAP Course Design

    ERIC Educational Resources Information Center

    Tajino, Akira; James, Robert; Kijima, Kyoichi

    2005-01-01

    Designing an EAP course requires collaboration among various concerned stakeholders, including students, subject teachers, institutional administrators and EAP teachers themselves. While needs analysis is often considered fundamental to EAP, alternative research methodologies may be required to facilitate meaningful collaboration between these…

  7. A water quality monitoring network design methodology for the selection of critical sampling points: Part I.

    PubMed

    Strobl, R O; Robillard, P D; Shannon, R D; Day, R L; McDonnell, A J

    2006-01-01

    The principal instrument to temporally and spatially manage water resources is a water quality monitoring network. However, to date in most cases, there is a clear absence of a concise strategy or methodology for designing monitoring networks, especially when deciding upon the placement of sampling stations. Since water quality monitoring networks can be quite costly, it is very important to properly design the monitoring network so that maximum information extraction can be accomplished, which in turn is vital when informing decision-makers. This paper presents the development of a methodology for identifying the critical sampling locations within a watershed. Hence, it embodies the spatial component in the design of a water quality monitoring network by designating the critical stream locations that should ideally be sampled. For illustration purposes, the methodology focuses on a single contaminant, namely total phosphorus, and is applicable to small, upland, predominantly agricultural-forested watersheds. It takes a number of hydrologic, topographic, soils, vegetative, and land use factors into account. In addition, it includes an economic as well as logistical component in order to approximate the number of sampling points required for a given budget and to only consider the logistically accessible stream reaches in the analysis, respectively. The methodology utilizes a geographic information system (GIS), hydrologic simulation model, and fuzzy logic.
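The fuzzy-logic step of such a methodology can be sketched as mapping each raw factor onto [0, 1] with a membership function and aggregating the memberships into a criticality score per stream reach. The factors, breakpoints and weights below are invented for illustration, not the paper's calibration.

```python
# Score a candidate sampling reach by fuzzifying each factor with a triangular
# membership function and taking a weighted average of the memberships.

def triangular(x, lo, peak, hi):
    """Membership rising linearly from lo to peak, falling from peak to hi."""
    if x <= lo or x >= hi:
        return 0.0
    if x <= peak:
        return (x - lo) / (peak - lo)
    return (hi - x) / (hi - peak)

def criticality(factors, memberships, weights):
    score = sum(w * triangular(factors[name], *memberships[name])
                for name, w in weights.items())
    return score / sum(weights.values())

# Hypothetical reach: 6% slope, 70% agricultural land in the drainage area.
reach = {"slope_pct": 6.0, "agri_land_pct": 70.0}
mf = {"slope_pct": (0, 10, 20), "agri_land_pct": (0, 100, 150)}
score = criticality(reach, mf, {"slope_pct": 0.4, "agri_land_pct": 0.6})
```

Reaches would then be ranked by score, with the budget and accessibility screens deciding how many of the top-ranked points are actually instrumented.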

  8. IDR: A Participatory Methodology for Interdisciplinary Design in Technology Enhanced Learning

    ERIC Educational Resources Information Center

    Winters, Niall; Mor, Yishay

    2008-01-01

    One of the important themes that emerged from the CAL'07 conference was the failure of technology to bring about the expected disruptive effect to learning and teaching. We identify one of the causes as an inherent weakness in prevalent development methodologies. While the problem of designing technology for learning is irreducibly…

  9. Using Delphi Methodology to Design Assessments of Teachers' Pedagogical Content Knowledge

    ERIC Educational Resources Information Center

    Manizade, Agida Gabil; Mason, Marguerite M.

    2011-01-01

    Descriptions of methodologies that can be used to create items for assessing teachers' "professionally situated" knowledge are lacking in mathematics education research literature. In this study, researchers described and used the Delphi method to design an instrument to measure teachers' pedagogical content knowledge. The instrument focused on a…

  10. Methodology for the Preliminary Design of High Performance Schools in Hot and Humid Climates

    ERIC Educational Resources Information Center

    Im, Piljae

    2009-01-01

A methodology to develop an easy-to-use toolkit for the preliminary design of high performance schools in hot and humid climates was presented. The toolkit proposed in this research will allow decision makers without simulation knowledge to easily and accurately evaluate energy-efficient measures for K-5 schools, which would contribute to the…

  11. Physical Activity Participation of Disabled Children: A Systematic Review of Conceptual and Methodological Approaches in Health Research

    PubMed Central

    Ross, Samantha Mae; Bogart, Kathleen R.; Logan, Samuel W.; Case, Layne; Fine, Jeremiah; Thompson, Hanna

    2016-01-01

Physical activity (PA) participation is widely recognized as a critical component of health and development for disabled and non-disabled children. Emergent literature reflects a paradigm shift in the conceptualization of childhood PA as a multi-dimensional construct, encompassing aspects of physical performance and self-perceived engagement. However, ambiguity remains around how participation as a health construct is integrated into PA research. The primary objective of the present mini-review is to critically examine current conceptual and methodological approaches to evaluating PA participation among disabled children. We conducted a systematic review of contemporary literature (published between 2000 and 2016). Seventeen articles met inclusion criteria, and their research approach was classified into guiding framework, definition of the key construct, and measurement used. The primary guiding framework was the International Classification of Functioning, Disability and Health (ICF). An explicit definition of PA participation was absent from all studies. Eight studies (47%) operationalized PA and participation as independent constructs. Measurements included traditional performance-based aspects of PA (frequency, duration, and intensity), and alternative participation measures (subjective perception of involvement, inclusion, or enjoyment). Approximately 64% of included articles were published in the past 2 years (2014–2016), indicating a rising interest in the topic of PA participation. Drawing from the broader discussion of participation in the literature, we offer a working definition of PA participation as it pertains to active, health-associated behaviors. Further description of alternative approaches to framing and measuring PA participation is offered to support effective assessment of health status among disabled children. PMID:27656639

  12. Non-Communicable Disease Clinical Practice Guidelines in Brazil: A Systematic Assessment of Methodological Quality and Transparency

    PubMed Central

    Romano-Lieber, Nicolina Silvana; Ribeiro, Eliane; de Melo, Daniela Oliveira

    2016-01-01

Background Annually, non-communicable diseases (NCDs) kill 38 million people worldwide, with low and middle-income countries accounting for three-quarters of these deaths. High-quality clinical practice guidelines (CPGs) are fundamental to improving NCD management. The present study evaluated the methodological rigor and transparency of Brazilian CPGs that recommend pharmacological treatment for the most prevalent NCDs. Methods We conducted a systematic search for CPGs of the following NCDs: asthma, atrial fibrillation, benign prostatic hyperplasia, chronic obstructive pulmonary disease, congestive heart failure, coronary artery disease and/or stable angina, dementia, depression, diabetes, gastroesophageal reflux disease, hypercholesterolemia, hypertension, osteoarthritis, and osteoporosis. CPGs comprising pharmacological treatment recommendations were included. No language or year restrictions were applied. CPGs were excluded if they were merely for local use and referred to NCDs not listed above. CPG quality was independently assessed by two reviewers using the Appraisal of Guidelines for Research and Evaluation instrument, version II (AGREE II). Main Findings “Scope and purpose” and “clarity and presentation” domains received the highest scores. Sixteen of 26 CPGs were classified as low quality, and none were classified as high overall quality. No CPG was recommended without modification (77% were not recommended at all). After 2009, 2 domain scores (“rigor of development” and “clarity and presentation”) increased (61% and 73%, respectively). However, “rigor of development” was still rated < 30%. Conclusion Brazilian healthcare professionals should be concerned with CPG quality for the treatment of selected NCDs. Features that undermined AGREE II scores included the lack of a multidisciplinary team for the development group, no consideration of patients’ preferences, insufficient information regarding literature searches, lack of selection

  13. Physical Activity Participation of Disabled Children: A Systematic Review of Conceptual and Methodological Approaches in Health Research.

    PubMed

    Ross, Samantha Mae; Bogart, Kathleen R; Logan, Samuel W; Case, Layne; Fine, Jeremiah; Thompson, Hanna

    2016-01-01

Physical activity (PA) participation is widely recognized as a critical component of health and development for disabled and non-disabled children. Emergent literature reflects a paradigm shift in the conceptualization of childhood PA as a multi-dimensional construct, encompassing aspects of physical performance and self-perceived engagement. However, ambiguity remains around how participation as a health construct is integrated into PA research. The primary objective of the present mini-review is to critically examine current conceptual and methodological approaches to evaluating PA participation among disabled children. We conducted a systematic review of contemporary literature (published between 2000 and 2016). Seventeen articles met inclusion criteria, and their research approach was classified into guiding framework, definition of the key construct, and measurement used. The primary guiding framework was the International Classification of Functioning, Disability and Health (ICF). An explicit definition of PA participation was absent from all studies. Eight studies (47%) operationalized PA and participation as independent constructs. Measurements included traditional performance-based aspects of PA (frequency, duration, and intensity), and alternative participation measures (subjective perception of involvement, inclusion, or enjoyment). Approximately 64% of included articles were published in the past 2 years (2014-2016), indicating a rising interest in the topic of PA participation. Drawing from the broader discussion of participation in the literature, we offer a working definition of PA participation as it pertains to active, health-associated behaviors. Further description of alternative approaches to framing and measuring PA participation is offered to support effective assessment of health status among disabled children.

  14. A low-power photovoltaic system with energy storage for radio communications: description and design methodology

    SciTech Connect

    Chapman, C.P.; Chapman, P.D.

    1982-01-01

A low-power photovoltaic system was constructed with approximately 500 amp-hours of battery energy storage to provide power to an emergency amateur radio communications center. The system can power the communications center for about 72 hours of continuous no-sun operation. Complete construction details and a design methodology algorithm are given, with abundant engineering data and adequate theory to allow similar systems to be constructed, scaled up or down, with minimum design effort.

  15. Low-power photovoltaic system with energy storage for radio communications. Description and design methodology

    SciTech Connect

    Chapman, C.P.; Chapman, P.D.; Lewison, A.H.

    1982-01-15

    A low-power photovoltaic system was constructed with approximately 500 amp-hours of battery energy storage to provide power to an emergency amateur radio communications center. The system can power the communications center for about 72 hours of continuous no-sun operation. Complete construction details and a design methodology algorithm are given with abundant engineering data and adequate theory to allow similar systems to be constructed, scaled up or down, with minimum design effort.

  16. A low-power photovoltaic system with energy storage for radio communications: Description and design methodology

    NASA Technical Reports Server (NTRS)

    Chapman, C. P.; Chapman, P. D.; Lewison, A. H.

    1982-01-01

A low-power photovoltaic system was constructed with approximately 500 amp-hours of battery energy storage to provide power to an emergency amateur radio communications center. The system can power the communications center for about 72 hours of continuous no-sun operation. Complete construction details and a design methodology algorithm are given, with abundant engineering data and adequate theory to allow similar systems to be constructed, scaled up or down, with minimum design effort.
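The sizing arithmetic behind the abstract's figures (500 Ah of storage supporting roughly 72 hours of no-sun operation) can be checked back-of-the-envelope. The bus voltage (12 V) and usable depth of discharge (80%) are assumptions for illustration, not values reported in the paper.

```python
# Hours of autonomy from battery capacity, bus voltage, usable depth of
# discharge and a constant load. All parameters except the 500 Ah capacity
# are assumed values, not the paper's.

def autonomy_hours(capacity_ah, bus_voltage_v, depth_of_discharge, load_w):
    usable_wh = capacity_ah * bus_voltage_v * depth_of_discharge
    return usable_wh / load_w

# Under these assumptions, 500 Ah at 12 V and 80% DoD sustains a ~67 W
# communications load for about 72 hours.
hours = autonomy_hours(500, 12.0, 0.8, 66.7)
```

Scaling the system "up or down", as the abstract puts it, amounts to rerunning this balance for the target load and autonomy.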

  17. Design Methodology for Multi-Element High-Lift Systems on Subsonic Civil Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Pepper, R. S.; vanDam, C. P.

    1996-01-01

    The choice of a high-lift system is crucial in the preliminary design process of a subsonic civil transport aircraft. Its purpose is to increase the allowable aircraft weight or decrease the aircraft's wing area for a given takeoff and landing performance. However, the implementation of a high-lift system into a design must be done carefully, for it can improve the aerodynamic performance of an aircraft but may also drastically increase the aircraft empty weight. If designed properly, a high-lift system can improve the cost effectiveness of an aircraft by increasing the payload weight for a given takeoff and landing performance. This is why the design methodology for a high-lift system should incorporate aerodynamic performance, weight, and cost. The airframe industry has experienced rapid technological growth in recent years which has led to significant advances in high-lift systems. For this reason many existing design methodologies have become obsolete since they are based on outdated low Reynolds number wind-tunnel data and can no longer accurately predict the aerodynamic characteristics or weight of current multi-element wings. Therefore, a new design methodology has been created that reflects current aerodynamic, weight, and cost data and provides enough flexibility to allow incorporation of new data when it becomes available.

  18. Coach design for the Korean high-speed train: a systematic approach to passenger seat design and layout.

    PubMed

    Jung, E S; Han, S H; Jung, M; Choe, J

    1998-12-01

Proper ergonomic design of a passenger seat and coach layout for a high-speed train is an essential component that is directly related to passenger comfort. In this research, a systematic approach to the design of passenger seats was described and the coach layout which reflected the tradeoff between transportation capacity and passenger comfort was investigated for the Korean high-speed train. As a result, design recommendations and specifications of the passenger seat and its layout were suggested. The whole design process is composed of four stages. A survey and analysis of design requirements was first conducted, which formed the base for designing the first and second class passenger seats. Prototypes were made and evaluated iteratively, and seat arrangement and coach layout were finally obtained. The systematic approach and recommendations suggested in this study are expected to be applicable to seat design for public transportation and to help modify and redesign existing vehicular seats.

  19. Novel thermal management system design methodology for power lithium-ion battery

    NASA Astrophysics Data System (ADS)

    Nieto, Nerea; Díaz, Luis; Gastelurrutia, Jon; Blanco, Francisco; Ramos, Juan Carlos; Rivas, Alejandro

    2014-12-01

Battery packs composed of large-format lithium-ion cells are increasingly being adopted in hybrid and pure electric vehicles in order to use energy more efficiently and for better environmental performance. Safety and cycle life are two of the main concerns regarding this technology, which are closely related to the cells' operating behavior and temperature asymmetries in the system. Therefore, the temperature of the cells in battery packs needs to be controlled by thermal management systems (TMSs). In the present paper an improved design methodology for developing TMSs is proposed. This methodology involves the development of different mathematical models for heat generation, transmission, and dissipation, and their coupling and integration into the battery pack product design methodology in order to improve overall safety and performance. The methodology is validated by comparing simulation results with laboratory measurements on a single module of the battery pack designed at IK4-IKERLAN for a traction application. The maximum difference between model predictions and experimental temperature data is 2 °C. The models developed have shown potential for use in battery thermal management studies for EV/HEV applications since they allow for scalability with accuracy and reasonable simulation time.
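The heat generation / dissipation balance that such TMS models couple can be sketched at its simplest as a first-order lumped-capacitance cell model. All parameter values below are illustrative assumptions, not the IK4-IKERLAN module's.

```python
# One-node thermal model of a cell: heat is generated at a constant rate Q_gen
# and shed to ambient through a thermal resistance R_th, with thermal mass
# C_th. Integrated with explicit Euler steps.

def simulate_cell_temp(q_gen_w, r_th_k_per_w, c_th_j_per_k, t_amb=25.0,
                       dt=1.0, steps=36000):
    t = t_amb
    for _ in range(steps):
        # dT/dt = (Q_gen - (T - T_amb) / R_th) / C_th
        t += dt * (q_gen_w - (t - t_amb) / r_th_k_per_w) / c_th_j_per_k
    return t

# Steady state approaches T_amb + Q_gen * R_th = 25 + 2 * 5 = 35 degC;
# 36000 s is about nine time constants (R_th * C_th = 4000 s) here.
final = simulate_cell_temp(q_gen_w=2.0, r_th_k_per_w=5.0, c_th_j_per_k=800.0)
```

A pack-level TMS model would chain many such nodes and add the coolant path; the point of the sketch is only the generation/transmission/dissipation coupling the abstract describes.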

  20. The conceptual framework and assessment methodology for the systematic reviews of community-based interventions for the prevention and control of infectious diseases of poverty

    PubMed Central

    2014-01-01

    This paper describes the conceptual framework and the methodology used to guide the systematic reviews of community-based interventions (CBIs) for the prevention and control of infectious diseases of poverty (IDoP). We adapted the conceptual framework from the 3ie work on the ‘Community-Based Intervention Packages for Preventing Maternal Morbidity and Mortality and Improving Neonatal Outcomes’ to aid in the analyzing of the existing CBIs for IDoP. The conceptual framework revolves around objectives, inputs, processes, outputs, outcomes, and impacts showing the theoretical linkages between the delivery of the interventions targeting these diseases through various community delivery platforms and the consequent health impacts. We also describe the methodology undertaken to conduct the systematic reviews and the meta-analyses. PMID:25105014

  1. The conceptual framework and assessment methodology for the systematic reviews of community-based interventions for the prevention and control of infectious diseases of poverty.

    PubMed

    Lassi, Zohra S; Salam, Rehana A; Das, Jai K; Bhutta, Zulfiqar A

    2014-01-01

    This paper describes the conceptual framework and the methodology used to guide the systematic reviews of community-based interventions (CBIs) for the prevention and control of infectious diseases of poverty (IDoP). We adapted the conceptual framework from the 3ie work on the 'Community-Based Intervention Packages for Preventing Maternal Morbidity and Mortality and Improving Neonatal Outcomes' to aid in the analyzing of the existing CBIs for IDoP. The conceptual framework revolves around objectives, inputs, processes, outputs, outcomes, and impacts showing the theoretical linkages between the delivery of the interventions targeting these diseases through various community delivery platforms and the consequent health impacts. We also describe the methodology undertaken to conduct the systematic reviews and the meta-analyses.

  2. Application of Design Methodologies for Feedback Compensation Associated with Linear Systems

    NASA Technical Reports Server (NTRS)

    Smith, Monty J.

    1996-01-01

The work that follows is concerned with the application of design methodologies for feedback compensation associated with linear systems. In general, the intent is to provide a well behaved closed loop system in terms of stability and robustness (internal signals remain bounded under a certain amount of uncertainty) while simultaneously achieving an acceptable level of performance. The approach here has been to convert the closed loop system and control synthesis problem into the interpolation setting. The interpolation formulation then serves as our mathematical representation of the design process. Lifting techniques have been used to solve the corresponding interpolation and control synthesis problems. Several applications using this multiobjective design methodology have been included to show the effectiveness of these techniques. In particular, the mixed H2/H-infinity performance criterion and its algorithm have been used on several examples, including an F-18 HARV (High Angle of Attack Research Vehicle) for sensitivity performance.

  3. Aero-Mechanical Design Methodology for Subsonic Civil Transport High-Lift Systems

    NASA Technical Reports Server (NTRS)

    vanDam, C. P.; Shaw, S. G.; VanderKam, J. C.; Brodeur, R. R.; Rudolph, P. K. C.; Kinney, D.

    2000-01-01

    In today's highly competitive and economically driven commercial aviation market, the trend is to make aircraft systems simpler and to shorten their design cycle which reduces recurring, non-recurring and operating costs. One such system is the high-lift system. A methodology has been developed which merges aerodynamic data with kinematic analysis of the trailing-edge flap mechanism with minimum mechanism definition required. This methodology provides quick and accurate aerodynamic performance prediction for a given flap deployment mechanism early on in the high-lift system preliminary design stage. Sample analysis results for four different deployment mechanisms are presented as well as descriptions of the aerodynamic and mechanism data required for evaluation. Extensions to interactive design capabilities are also discussed.

  4. Design Methodology of a Dual-Halbach Array Linear Actuator with Thermal-Electromagnetic Coupling.

    PubMed

    Eckert, Paulo Roberto; Flores Filho, Aly Ferreira; Perondi, Eduardo; Ferri, Jeferson; Goltz, Evandro

    2016-03-11

This paper proposes a design methodology for linear actuators, considering thermal and electromagnetic coupling with geometrical and temperature constraints, that maximizes force density and minimizes force ripple. The method allows defining an actuator for given specifications in a step-by-step way so that requirements are met and the temperature within the device is kept at or below the maximum allowed for continuous operation. According to the proposed method, the electromagnetic and thermal models are built with quasi-static parametric finite element models. The methodology was successfully applied to the design of a linear cylindrical actuator with a dual quasi-Halbach array of permanent magnets and a moving coil. The actuator can produce an axial force of 120 N and a stroke of 80 mm. The paper also presents a comparative analysis between results obtained considering only an electromagnetic model and the thermal-electromagnetic coupled model. This comparison shows that the final designs for both cases differ significantly, especially regarding their active volume and their electrical and magnetic loading. Although in this paper the methodology was employed to design a specific actuator, its structure can be used to design a wide range of linear devices if the parametric models are adjusted for each particular actuator.

  5. Design Methodology of a Dual-Halbach Array Linear Actuator with Thermal-Electromagnetic Coupling

    PubMed Central

    Eckert, Paulo Roberto; Flores Filho, Aly Ferreira; Perondi, Eduardo; Ferri, Jeferson; Goltz, Evandro

    2016-01-01

This paper proposes a design methodology for linear actuators, considering thermal and electromagnetic coupling with geometrical and temperature constraints, that maximizes force density and minimizes force ripple. The method allows defining an actuator for given specifications in a step-by-step way so that requirements are met and the temperature within the device is kept at or below the maximum allowed for continuous operation. According to the proposed method, the electromagnetic and thermal models are built with quasi-static parametric finite element models. The methodology was successfully applied to the design of a linear cylindrical actuator with a dual quasi-Halbach array of permanent magnets and a moving coil. The actuator can produce an axial force of 120 N and a stroke of 80 mm. The paper also presents a comparative analysis between results obtained considering only an electromagnetic model and the thermal-electromagnetic coupled model. This comparison shows that the final designs for both cases differ significantly, especially regarding their active volume and their electrical and magnetic loading. Although in this paper the methodology was employed to design a specific actuator, its structure can be used to design a wide range of linear devices if the parametric models are adjusted for each particular actuator. PMID:26978370

  6. A Robust Design Methodology for Optimal Microscale Secondary Flow Control in Compact Inlet Diffusers

    NASA Technical Reports Server (NTRS)

    Anderson, Bernhard H.; Keller, Dennis J.

    2001-01-01

It is the purpose of this study to develop an economical Robust design methodology for microscale secondary flow control in compact inlet diffusers. To illustrate the potential of economical Robust Design methodology, two different mission strategies were considered for the subject inlet, namely Maximum Performance and Maximum HCF Life Expectancy. The Maximum Performance mission maximized total pressure recovery while the Maximum HCF Life Expectancy mission minimized the mean of the first five Fourier harmonic amplitudes, i.e., 'collectively' reduced all the harmonic half-amplitudes of engine face distortion. Each of the mission strategies was subject to a low engine face distortion constraint, i.e., DC60<0.10, which is a level acceptable for commercial engines. For each of these mission strategies, an 'Optimal Robust' (open loop control) and an 'Optimal Adaptive' (closed loop control) installation was designed over a twenty degree angle-of-incidence range. The Optimal Robust installation used economical Robust Design methodology to arrive at a single design which operated over the entire angle-of-incidence range (open loop control). The Optimal Adaptive installation optimized all the design parameters at each angle-of-incidence. Thus, the Optimal Adaptive installation would require a closed loop control system to sense a proper signal for each effector and modify that effector device, whether mechanical or fluidic, for optimal inlet performance. In general, the performance differences between the Optimal Adaptive and Optimal Robust installation designs were found to be marginal. This suggests, however, that Optimal Robust open loop installation designs can be very competitive with Optimal Adaptive closed loop designs. Secondary flow control in inlets is inherently robust, provided it is optimally designed. Therefore, the new methodology presented in this paper, combined array 'Lower Order' approach to Robust DOE, offers the aerodynamicist a very viable and

  7. Design methodology for integrated downstream separation systems in an ethanol biorefinery

    NASA Astrophysics Data System (ADS)

    Mohammadzadeh Rohani, Navid

    and obtaining energy security. On the other hand, Process Integration (PI), defined by Natural Resources Canada as the combination of activities which aim at improving process systems, their unit operations and their interactions in order to maximize the efficiency of using water, energy and raw materials, can also help biorefineries lower their energy consumption and improve their economics. Energy integration techniques such as pinch analysis, adopted by different industries over the years, ensure that heat sources within a plant supply the demand internally and decrease external utility consumption. Therefore, adopting energy integration is one of the options biorefinery technology owners can consider in their process development, as well as in their business model, in order to improve their overall economics. The objective of this thesis is to propose a methodology for designing integrated downstream separation in a biorefinery. This methodology is tested in an ethanol biorefinery case study. Several alternative separation techniques are evaluated for their energy consumption and economics in three different scenarios: stand-alone without energy integration, stand-alone with internal energy integration, and integrated with Kraft. The energy consumption and capital costs of the separation techniques are assessed in each scenario, the costs and benefits of integration are determined, and finally the best alternative is found through techno-economic metrics. Another advantage of this methodology is the use of a graphical tool which provides insights on decreasing energy consumption by modifying the process conditions. The pivot point of this work is the use of a novel energy integration method called Bridge analysis. This systematic method, originally intended for retrofit situations, is used here for integration with the Kraft process. Integration potentials are identified through this method and savings are presented for each design. In stand-alone with

  8. A Digital Tool Set for Systematic Model Design in Process-Engineering Education

    ERIC Educational Resources Information Center

    van der Schaaf, Hylke; Tramper, Johannes; Hartog, Rob J.M.; Vermue, Marian

    2006-01-01

    One of the objectives of the process technology curriculum at Wageningen University is that students learn how to design mathematical models in the context of process engineering, using a systematic problem analysis approach. Students find it difficult to learn to design a model and little material exists to meet this learning objective. For these…

  9. Cognitive Activity-based Design Methodology for Novice Visual Communication Designers

    ERIC Educational Resources Information Center

    Kim, Hyunjung; Lee, Hyunju

    2016-01-01

    The notion of design thinking is becoming more concrete nowadays, as design researchers and practitioners study the thinking processes involved in design and employ the concept of design thinking to foster better solutions to complex and ill-defined problems. The goal of the present research is to develop a cognitive activity-based design…

  10. New methods, new methodology: Advanced CFD in the Snecma turbomachinery design process

    NASA Astrophysics Data System (ADS)

    Vuillez, Christophe; Petot, Bertrand

    1994-05-01

    CFD tools represent a significant source of improvements in the design process of turbomachinery components, leading to higher performances, cost and cycle savings as well as lower associated risks. Such methods are the backbone of compressor and turbine design methodologies at Snecma. In the 80's, the use of 3D Euler solvers was a key factor in designing fan blades with very high performance level. Counter rotating high speed propellers designed with this methodology reached measured performances very close to their ambitious objective from the first test series. In the late 80's and the beginning of the 90's, new, more powerful methods were rapidly developed and are now commonly used in the design process: a quasi-3D, compressible, transonic inverse method; quasi-3D and 3D Navier-Stokes solvers; 3D unsteady Euler solvers. As an example, several hundred 3D Navier-Stokes computations are run yearly for the design of low and high pressure compressor and turbine blades. In addition to their modelling capabilities, the efficient use of such methods in the design process comes from their close integration in the global methodology and from an adequate exploitation environment. Their validation, their calibration, and the correlations between different levels of modelling are of critical importance to an actual improvement in design know-how. The integration of different methods in the design process is described. Several examples of application illustrate their practical utilization. Comparisons between computational results and test results show their capabilities as well as their present limitations. The prospects linked to new developments currently under way are discussed.

  11. Automated Methodologies for the Design of Flow Diagrams for Development and Maintenance Activities

    NASA Astrophysics Data System (ADS)

    Shivanand M., Handigund; Shweta, Bhat

    The Software Requirements Specification (SRS) of an organization is a text document prepared by strategic management incorporating the requirements of the organization. These requirements of the ongoing business/project development process involve software tools, hardware devices, manual procedures, application programs and communication commands. These components are appropriately ordered for achieving the mission of the concerned process, both in project development and in ongoing business processes, in different flow diagrams, viz. the activity chart, workflow diagram, activity diagram, component diagram and deployment diagram. This paper proposes two generic, automatic methodologies for the design of the various flow diagrams of (i) project development activities and (ii) the ongoing business process. The methodologies also resolve the ensuing deadlocks in the flow diagrams and determine the critical paths for the activity chart. Though the two methodologies are independent, each complements the other in authenticating its correctness and completeness.

  12. A user-centred methodology for designing an online social network to motivate health behaviour change.

    PubMed

    Kamal, Noreen; Fels, Sidney

    2013-01-01

    Positive health behaviour is critical to preventing illness and managing chronic conditions. A user-centred methodology was employed to design an online social network to motivate health behaviour change. The methodology was augmented by utilizing the Appeal, Belonging, Commitment (ABC) Framework, which is based on theoretical models for health behaviour change and use of online social networks. The user-centred methodology included four phases: 1) initial user inquiry on health behaviour and use of online social networks; 2) interview feedback on paper prototypes; 3) a laboratory study on a medium-fidelity prototype; and 4) a field study on the high-fidelity prototype. The points of inquiry through these phases were based on the ABC Framework. This yielded an online social network system that linked to external third-party databases and was deployed to users via an interactive website.

  13. Systematic design of a magneto-rheological fluid embedded pneumatic vibration isolator subject to practical constraints

    NASA Astrophysics Data System (ADS)

    Zhu, Xiaocong; Jing, Xingjian; Cheng, Li

    2012-03-01

    A systematic design of a magneto-rheological fluid embedded pneumatic vibration isolator (MrEPI) considering practical constraints and optimal performance is proposed. The design procedure consists of three steps: system-level design, component-level design and practical realization. The system-level design involves synthesizing appropriate non-dimensional system parameters of the pneumatic spring and MR damper elements based on parameter sensitivity analysis, considering requirements for compact and efficient hardware utilization. The component-level design involves optimal design of the MR valve by minimizing an objective function in terms of non-dimensional geometric, material and excitation parameters, while guaranteeing the required performance in the worst cases. Practical realization then involves determining actual plant parameters from the non-dimensional analysis in the system- and component-level designs with consideration of practical requirements and constraints. To verify the effectiveness of this optimization procedure, the semi-active vibration control performance of the optimized MrEPI subject to harmonic disturbances is evaluated, showing good isolation performance in all tested cases. This study provides a systematic method for the optimal analysis and design of nonlinear vibration isolators consisting of pneumatic spring and MR damper elements. This is achieved first through an effective sensitivity analysis of the dominant design parameters with respect to the adjustable stiffness and damping capacity, irrespective of bulky or small system mass configurations, and subsequently through a systematic realization design that takes into account the practical constraints of applications.

  14. Methodology to design a municipal solid waste pre-collection system. A case study

    SciTech Connect

    Gallardo, A. Carlos, M. Peris, M. Colomer, F.J.

    2015-02-15

    Highlights: • MSW recovery starts at homes; therefore it is important to facilitate it for people. • Additionally, to optimize MSW collection, pre-collection must be planned beforehand. • A methodology to organize pre-collection considering several factors is presented. • The methodology has been verified by applying it to a Spanish middle-sized town. - Abstract: Municipal solid waste (MSW) management is an important task that local governments as well as private companies must take into account to protect human health and the environment and to preserve natural resources. To design an adequate MSW management plan, the first step is to define the waste generation and composition patterns of the town. As these patterns depend on several socio-economic factors, it is advisable to characterize them beforehand. Moreover, the waste generation and composition patterns may vary across the town and over time. Generally, the data are not homogeneous across the city, as neither the number of inhabitants nor the economic activity is constant. Therefore, if all the information is shown in thematic maps, the final waste management decisions can be made more efficiently. The main aim of this paper is to present a structured methodology that allows local authorities or private companies dealing with MSW to design their own MSW management plan depending on the available data. According to these data, this paper proposes two ways of action: a direct way when detailed data are available, and an indirect way when there is a lack of data and it is necessary to rely on bibliographic data. In any case, the amount of information needed is considerable. This paper combines the planning methodology with Geographic Information Systems to present the final results in thematic maps that make them easier to interpret. The proposed methodology is a useful preliminary tool for organizing MSW collection routes, including selective collection. To verify the methodology it has

  15. Methodology for the Design of Streamline-Traced External-Compression Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Slater, John W.

    2014-01-01

    A design methodology based on streamline-tracing is discussed for the design of external-compression, supersonic inlets for flight below Mach 2.0. The methodology establishes a supersonic compression surface and capture cross-section by tracing streamlines through an axisymmetric Busemann flowfield. The compression system of shock and Mach waves is altered through modifications to the leading edge and shoulder of the compression surface. An external terminal shock is established to create subsonic flow which is diffused in the subsonic diffuser. The design methodology was implemented into the SUPIN inlet design tool. SUPIN uses specified design factors to design the inlets and computes the inlet performance, which includes the flow rates, total pressure recovery, and wave drag. A design study was conducted using SUPIN and the Wind-US computational fluid dynamics code to design and analyze the properties of two streamline-traced, external-compression (STEX) supersonic inlets for Mach 1.6 freestream conditions. The STEX inlets were compared to axisymmetric pitot, two-dimensional, and axisymmetric spike inlets. The STEX inlets had slightly lower total pressure recovery and higher levels of total pressure distortion than the axisymmetric spike inlet. The cowl wave drag coefficients of the STEX inlets were 20% of those for the axisymmetric spike inlet. The STEX inlets had external sound pressures that were 37% of those of the axisymmetric spike inlet, which may result in lower adverse sonic boom characteristics. The flexibility of the shape of the capture cross-section may result in benefits for the integration of STEX inlets with aircraft.

  16. A knowledge management methodology for the integrated assessment of WWTP configurations during conceptual design.

    PubMed

    Garrido-Baserba, M; Reif, R; Rodriguez-Roda, I; Poch, M

    2012-01-01

    The complexity involved in wastewater management projects is increasing as the twenty-first century sets new challenges leading towards more integrated plant design. In this context, the growing number of innovative technologies, stricter legislation and the development of new methodological approaches make it difficult to design appropriate flow schemes for new wastewater projects. Thus, new tools are needed for wastewater treatment plant (WWTP) conceptual design that use integrated assessment methods in order to include different types of objectives at the same time, i.e. environmental, economic, technical and legal. Previous experiences used the decision support system (DSS) methodology to handle specific issues related to wastewater management, for example, the design of treatment facilities for small communities. However, tools developed for addressing the whole treatment process independently of plant size, capable of integrating knowledge from many different areas, including both conventional and innovative technologies, are not available. Therefore, the aim of this paper is to present and describe an innovative knowledge-based methodology that handles the conceptual design of WWTP process flow diagrams (PFDs), satisfying a vast number of different criteria. This global approach is based on a hierarchy of decisions that uses the information contained in knowledge bases (KBs) with the aim of automating the generation of suitable WWTP configurations for a specific scenario. Expert interviews, legislation, specialized literature and engineering experience have been integrated within the different KBs, which indeed constitute one of the main highlights of this work. The methodology is therefore presented as a valuable tool which provides a customized PFD for each specific case, taking into account process unit interactions and the user-specified requirements and objectives.

  17. Bayesian cross-entropy methodology for optimal design of validation experiments

    NASA Astrophysics Data System (ADS)

    Jiang, X.; Mahadevan, S.

    2006-07-01

    An important concern in the design of validation experiments is how to incorporate the mathematical model in the design in order to allow conclusive comparisons of model prediction with experimental output in model assessment. The classical experimental design methods are more suitable for phenomena discovery and may result in a subjective, expensive, time-consuming and ineffective design that may adversely impact these comparisons. In this paper, an integrated Bayesian cross-entropy methodology is proposed to perform the optimal design of validation experiments incorporating the computational model. The expected cross entropy, an information-theoretic distance between the distributions of model prediction and experimental observation, is defined as a utility function to measure the similarity of two distributions. A simulated annealing algorithm is used to find optimal values of input variables through minimizing or maximizing the expected cross entropy. The measured data after testing with the optimum input values are used to update the distribution of the experimental output using Bayes theorem. The procedure is repeated to adaptively design the required number of experiments for model assessment, each time ensuring that the experiment provides effective comparison for validation. The methodology is illustrated for the optimal design of validation experiments for a three-leg bolted joint structure and a composite helicopter rotor hub component.
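    The optimization step described above, simulated annealing over the input variables, can be sketched as follows. This is a minimal illustration: the quadratic stand-in utility and all parameter values are hypothetical, whereas the paper's actual utility is the expected cross entropy between the model-prediction and experimental-output distributions.

```python
import math
import random

def simulated_annealing(utility, x0, lo, hi, t0=1.0, cooling=0.95, steps=500, seed=0):
    """Minimize a scalar utility over [lo, hi] by simulated annealing."""
    rng = random.Random(seed)
    x, fx = x0, utility(x0)
    best_x, best_f = x, fx
    t = t0
    for _ in range(steps):
        # Propose a perturbed design point, clipped to the bounds.
        cand = min(hi, max(lo, x + rng.gauss(0.0, 0.1 * (hi - lo))))
        fc = utility(cand)
        # Always accept downhill moves; accept uphill moves with Boltzmann probability.
        if fc < fx or rng.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling
    return best_x, best_f

# Stand-in utility minimized at x = 2 (hypothetical; the real objective is
# an information-theoretic distance evaluated through the computational model).
x_opt, f_opt = simulated_annealing(lambda x: (x - 2.0) ** 2, x0=0.0, lo=-5.0, hi=5.0)
print(abs(x_opt - 2.0) < 0.5)
```

    In the paper's adaptive loop, each accepted optimum would be tested experimentally and the output distribution updated via Bayes' theorem before the next design iteration.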

  18. Impact of User-Centered Design Methodology on the Design of Information Systems.

    ERIC Educational Resources Information Center

    Sugar, William A.

    1995-01-01

    Examines the implications of incorporating user-centered design within information systems design practices. Highlights include a definition of user-centered design based on human-computer interface; questions asked about users, including outcome, process, and task variables; and three criteria for when to use this approach in information systems…

  19. Integrating uniform design and response surface methodology to optimize thiacloprid suspension

    PubMed Central

    Li, Bei-xing; Wang, Wei-chang; Zhang, Xian-peng; Zhang, Da-xia; Mu, Wei; Liu, Feng

    2017-01-01

    A model 25% suspension concentrate (SC) of thiacloprid was adopted to evaluate an integrative approach of uniform design and response surface methodology. Tersperse2700, PE1601, xanthan gum and veegum were the four experimental factors, and the aqueous separation ratio and viscosity were the two dependent variables. Linear and quadratic polynomial models of stepwise regression and partial least squares were adopted to test the fit of the experimental data. Verification tests revealed satisfactory agreement between the experimental and predicted data. The measured values for the aqueous separation ratio and viscosity were 3.45% and 278.8 mPa·s, respectively, and the relative errors of the predicted values were 9.57% and 2.65%, respectively (prepared under the proposed conditions). Comprehensive benefits could also be obtained by appropriately adjusting the amount of certain adjuvants based on practical requirements. Integrating uniform design and response surface methodology is an effective strategy for optimizing SC formulas. PMID:28383036
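    The quadratic-polynomial fitting step of response surface methodology can be sketched as follows. This is a minimal single-factor least-squares illustration with synthetic data; the study fits four factors with stepwise regression and partial least squares, which this sketch does not reproduce.

```python
import numpy as np

def fit_quadratic_response(X, y):
    """Least-squares fit of y ≈ b0 + Σ b_i·x_i + Σ b_ii·x_i², a pure
    quadratic response surface without cross terms (a simplification)."""
    X = np.asarray(X, dtype=float)
    A = np.hstack([np.ones((len(X), 1)), X, X ** 2])
    coef, *_ = np.linalg.lstsq(A, np.asarray(y, dtype=float), rcond=None)
    return coef

# Toy data generated from a known surface y = 1 + 2x + 3x², one factor.
x = np.linspace(-1.0, 1.0, 11).reshape(-1, 1)
y = 1.0 + 2.0 * x[:, 0] + 3.0 * x[:, 0] ** 2
b = fit_quadratic_response(x, y)
print(np.round(b, 6))  # recovers the coefficients [1, 2, 3]
```

    Once fitted, the polynomial is optimized over the factor ranges to propose the formula, and verification runs compare predicted and measured responses, as in the verification tests reported above.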

  20. Design methodology for multi-pumped discrete Raman amplifiers: case-study employing photonic crystal fibers.

    PubMed

    Castellani, C E S; Cani, S P N; Segatto, M E; Pontes, M J; Romero, M A

    2009-08-03

    This paper proposes a new design methodology for discrete multi-pumped Raman amplifiers. In a multi-objective optimization scenario, the whole solution space is first inspected by a CW analytical formulation. Then, the most promising solutions are fully investigated by a rigorous numerical treatment, and the Raman amplification performance is thus determined by the combination of analytical and numerical approaches. As an application of our methodology we designed a photonic crystal fiber Raman amplifier configuration which provides low ripple, high gain, a clear eye opening and a low power penalty. The amplifier configuration also fully compensates the dispersion introduced by a 70-km singlemode fiber in a 10 Gbit/s system. We have successfully obtained a configuration with 8.5 dB average gain over the C-band and 0.71 dB ripple, with almost zero eye penalty, using only two pump lasers with relatively low pump power.

  1. Integrating uniform design and response surface methodology to optimize thiacloprid suspension.

    PubMed

    Li, Bei-Xing; Wang, Wei-Chang; Zhang, Xian-Peng; Zhang, Da-Xia; Mu, Wei; Liu, Feng

    2017-04-06

    A model 25% suspension concentrate (SC) of thiacloprid was adopted to evaluate an integrative approach of uniform design and response surface methodology. Tersperse2700, PE1601, xanthan gum and veegum were the four experimental factors, and the aqueous separation ratio and viscosity were the two dependent variables. Linear and quadratic polynomial models of stepwise regression and partial least squares were adopted to test the fit of the experimental data. Verification tests revealed satisfactory agreement between the experimental and predicted data. The measured values for the aqueous separation ratio and viscosity were 3.45% and 278.8 mPa·s, respectively, and the relative errors of the predicted values were 9.57% and 2.65%, respectively (prepared under the proposed conditions). Comprehensive benefits could also be obtained by appropriately adjusting the amount of certain adjuvants based on practical requirements. Integrating uniform design and response surface methodology is an effective strategy for optimizing SC formulas.

  2. Increasing Airpower’s Effectiveness: Applying the U.S. Army’s Operational Design Methodology to Airpower in Warfare

    DTIC Science & Technology

    2010-04-01

    used the U.S. Army’s Operational Design methodology. The lack of systems thinking, reflective thinking, environmental framing and the inability to...the U.S. Army’s Operational Design methodology has a positive impact on airpower. The presence of systems thinking and reflective thinking by

  3. The Atomic Intrinsic Integration Approach: A Structured Methodology for the Design of Games for the Conceptual Understanding of Physics

    ERIC Educational Resources Information Center

    Echeverria, Alejandro; Barrios, Enrique; Nussbaum, Miguel; Amestica, Matias; Leclerc, Sandra

    2012-01-01

    Computer simulations combined with games have been successfully used to teach conceptual physics. However, there is no clear methodology for guiding the design of these types of games. To remedy this, we propose a structured methodology for the design of conceptual physics games that explicitly integrates the principles of the intrinsic…

  4. HPCC Methodologies for Structural Design and Analysis on Parallel and Distributed Computing Platforms

    NASA Technical Reports Server (NTRS)

    Farhat, Charbel

    1998-01-01

    In this grant, we have proposed a three-year research effort focused on developing High Performance Computation and Communication (HPCC) methodologies for structural analysis on parallel processors and clusters of workstations, with emphasis on reducing the structural design cycle time. Besides consolidating and further improving the FETI solver technology to address plate and shell structures, we have proposed to tackle the following design related issues: (a) parallel coupling and assembly of independently designed and analyzed three-dimensional substructures with non-matching interfaces, (b) fast and smart parallel re-analysis of a given structure after it has undergone design modifications, (c) parallel evaluation of sensitivity operators (derivatives) for design optimization, and (d) fast parallel analysis of mildly nonlinear structures. While our proposal was accepted, support was provided only for one year.

  5. Spintronic logic design methodology based on spin Hall effect-driven magnetic tunnel junctions

    NASA Astrophysics Data System (ADS)

    Kang, Wang; Wang, Zhaohao; Zhang, Youguang; Klein, Jacques-Olivier; Lv, Weifeng; Zhao, Weisheng

    2016-02-01

    Conventional complementary metal-oxide-semiconductor (CMOS) technology is now approaching its physical scaling limits to enable Moore’s law to continue. Spintronic devices, as one of the potential alternatives, show great promise to replace CMOS technology for next-generation low-power integrated circuits in nanoscale technology nodes. Until now, spintronic memory has been successfully commercialized. However, spintronic logic still faces many critical challenges (e.g. direct cascading capability and small operation gain) before it can be practically applied. In this paper, we propose a standard complementary spintronic logic (CSL) design methodology to form a CMOS-like logic design paradigm. Using the spin Hall effect (SHE)-driven magnetic tunnel junction (MTJ) device as an example, we demonstrate CSL implementation, functionality and performance. This logic family provides a unified design methodology for spintronic logic circuits and partly solves the challenges of direct cascading capability and small operation gain in previously proposed spintronic logic designs. By solving a modified Landau-Lifshitz-Gilbert equation, the magnetization dynamics in the free layer of the MTJ is theoretically described and a compact electrical model is developed. With this electrical model, numerical simulations have been performed to evaluate the functionality and performance of the proposed CSL design. Simulation results demonstrate that the proposed CSL design paradigm is promising for low-power logic computing.
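    The magnetization dynamics referred to above are governed by the Landau-Lifshitz-Gilbert equation. In its standard Gilbert form, with a Slonczewski-type spin-torque term added for the SHE-generated spin current, it reads (the paper's "modified" form is not given in the abstract, so the torque prefactor below is the commonly used expression and should be treated as an assumption):

```latex
\frac{d\mathbf{m}}{dt}
  = -\gamma\,\mathbf{m}\times\mathbf{H}_{\mathrm{eff}}
    + \alpha\,\mathbf{m}\times\frac{d\mathbf{m}}{dt}
    + \frac{\gamma\,\hbar\,\theta_{SH}\,J_c}{2e\,\mu_0 M_s t_F}\,
      \mathbf{m}\times\left(\boldsymbol{\sigma}\times\mathbf{m}\right)
```

    Here m is the unit magnetization of the free layer, γ the gyromagnetic ratio, H_eff the effective field, α the Gilbert damping, θ_SH the spin Hall angle, J_c the charge current density in the heavy-metal channel, M_s the saturation magnetization, t_F the free-layer thickness, and σ the spin polarization direction set by the SHE.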

  6. Methodology for CFD Design Analysis of National Launch System Nozzle Manifold

    NASA Technical Reports Server (NTRS)

    Haire, Scot L.

    1993-01-01

    The current design environment dictates that high-technology CFD (Computational Fluid Dynamics) analysis produce quality results in a timely manner if it is to be integrated into the design process. The design methodology outlined describes the CFD analysis of an NLS (National Launch System) nozzle film cooling manifold. The objective of the analysis was to obtain a qualitative estimate of the flow distribution within the manifold. A complex, 3D, multiple-zone, structured grid was generated from a 3D CAD file of the geometry. An Euler solution was computed with a fully implicit compressible flow solver. Post-processing consisted of full 3D color graphics and mass-averaged performance. The result was a qualitative CFD solution that provided the design team with relevant information concerning the flow distribution in and performance characteristics of the film cooling manifold within an effective time frame. This design methodology was also the foundation for a quick-turnaround CFD analysis of the next iteration in the manifold design.

  7. A Formal Methodology to Design and Deploy Dependable Wireless Sensor Networks

    PubMed Central

    Testa, Alessandro; Cinque, Marcello; Coronato, Antonio; Augusto, Juan Carlos

    2016-01-01

    Wireless Sensor Networks (WSNs) are being increasingly adopted in critical applications, where verifying the correct operation of sensor nodes is a major concern. Undesired events may undermine the mission of the WSNs. Hence, their effects need to be properly assessed before deployment, to obtain a good level of expected performance; and during the operation, in order to avoid dangerous unexpected results. In this paper, we propose a methodology that aims at assessing and improving the dependability level of WSNs by means of an event-based formal verification technique. The methodology includes a process to guide designers towards the realization of a dependable WSN and a tool (“ADVISES”) to simplify its adoption. The tool is applicable to homogeneous WSNs with static routing topologies. It allows the automatic generation of formal specifications used to check correctness properties and evaluate dependability metrics at design time and at runtime for WSNs where an acceptable percentage of faults can be defined. At runtime, we can check the behavior of the WSN according to the results obtained at design time, and we can detect sudden and unexpected failures in order to trigger recovery procedures. The effectiveness of the methodology is shown in the context of two case studies, as proof-of-concept, aiming to illustrate how the tool is helpful to drive design choices and to check the correctness properties of the WSN at runtime. Although the method scales up to very large WSNs, the applicability of the methodology may be compromised by the state space explosion of the reasoning model, which must be faced by partitioning large topologies into sub-topologies. PMID:28025568

  8. Design Methodology and Experimental Verification of Serpentine/Folded Waveguide TWTs

    DTIC Science & Technology

    2016-03-17

    serpentine amplifier, which embodies the design methodology described herein. Particular attention will be paid to the comparison between code prediction...are computed with the 3-D electromagnetic code ANALYST [17]. In the parametric study shown in Fig. 7, we keep p, b, and beam tunnel radius constant

  9. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    NASA Technical Reports Server (NTRS)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  10. A Formal Methodology to Design and Deploy Dependable Wireless Sensor Networks.

    PubMed

    Testa, Alessandro; Cinque, Marcello; Coronato, Antonio; Augusto, Juan Carlos

    2016-12-23

    Wireless Sensor Networks (WSNs) are being increasingly adopted in critical applications, where verifying the correct operation of sensor nodes is a major concern. Undesired events may undermine the mission of the WSNs. Hence, their effects need to be properly assessed before deployment, to obtain a good level of expected performance; and during the operation, in order to avoid dangerous unexpected results. In this paper, we propose a methodology that aims at assessing and improving the dependability level of WSNs by means of an event-based formal verification technique. The methodology includes a process to guide designers towards the realization of a dependable WSN and a tool ("ADVISES") to simplify its adoption. The tool is applicable to homogeneous WSNs with static routing topologies. It allows the automatic generation of formal specifications used to check correctness properties and evaluate dependability metrics at design time and at runtime for WSNs where an acceptable percentage of faults can be defined. At runtime, we can check the behavior of the WSN according to the results obtained at design time, and we can detect sudden and unexpected failures in order to trigger recovery procedures. The effectiveness of the methodology is shown in the context of two case studies, as proof-of-concept, aiming to illustrate how the tool is helpful to drive design choices and to check the correctness properties of the WSN at runtime. Although the method scales up to very large WSNs, the applicability of the methodology may be compromised by the state space explosion of the reasoning model, which must be faced by partitioning large topologies into sub-topologies.

  11. A Visual Analytics Based Decision Support Methodology For Evaluating Low Energy Building Design Alternatives

    NASA Astrophysics Data System (ADS)

    Dutta, Ranojoy

The ability to design high performance buildings has acquired great importance in recent years due to numerous federal, societal and environmental initiatives. However, this endeavor is much more demanding in terms of designer expertise and time. It requires a whole new level of synergy between automated performance prediction and the human capabilities to perceive, evaluate and ultimately select a suitable solution. While performance prediction can be highly automated through the use of computers, performance evaluation cannot, unless it is with respect to a single criterion. The need to address multi-criteria requirements makes it more valuable for a designer to know the "latitude" or "degrees of freedom" he has in changing certain design variables while achieving preset criteria such as energy performance, life cycle cost, and environmental impacts. This requirement can be met by a decision support framework based on near-optimal "satisficing" rather than purely optimal decision-making techniques. Currently, such a comprehensive design framework is lacking, which is the basis for undertaking this research. The primary objective of this research is to facilitate a complementary relationship between designers and computers for Multi-Criterion Decision Making (MCDM) during high performance building design. It is based on the application of Monte Carlo approaches to create a database of solutions using deterministic whole building energy simulations, along with data mining methods to rank variable importance and reduce the multi-dimensionality of the problem. A novel interactive visualization approach is then proposed which uses regression based models to create dynamic interplays of how varying these important variables affects the multiple criteria, while providing a visual range or band of variation of the different design parameters. The MCDM process has been incorporated into an alternative methodology for high performance building design referred to as
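The Monte Carlo sampling plus variable-importance ranking and "satisficing" filter described above can be sketched in a few lines. Everything below is a toy stand-in: the three design variables, the closed-form "energy model" replacing a whole-building simulation, and the 90 kWh/m2/yr criterion are illustrative assumptions, not the study's actual setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical design variables: window-to-wall ratio, insulation R-value, setpoint (degC)
n = 2000
X = np.column_stack([
    rng.uniform(0.1, 0.9, n),    # window-to-wall ratio
    rng.uniform(1.0, 10.0, n),   # insulation R-value
    rng.uniform(20.0, 26.0, n),  # thermostat setpoint
])

# Stand-in for a deterministic whole-building energy simulation (kWh/m2/yr)
energy = 80 + 60 * X[:, 0] - 4.0 * X[:, 1] + 3.0 * (X[:, 2] - 23) ** 2 + rng.normal(0, 2, n)

# Rank variable importance via standardized regression coefficients (a simple
# data-mining proxy for the paper's importance-ranking step)
Xs = (X - X.mean(0)) / X.std(0)
coef, *_ = np.linalg.lstsq(np.column_stack([np.ones(n), Xs]), energy, rcond=None)
importance = np.abs(coef[1:])
order = np.argsort(importance)[::-1]

# "Satisficing" filter: keep every sampled design meeting a preset criterion,
# rather than searching for the single optimum
ok = energy <= 90.0
print("variable ranking (most influential first):", order)
print(f"{ok.sum()} of {n} sampled designs satisfy the criterion")
```

The band of variable values within the `ok` subset is what a designer would then explore visually.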

  12. Design of psychosocial factors questionnaires: a systematic measurement approach

    PubMed Central

    Vargas, Angélica; Felknor, Sarah A

    2012-01-01

Background Evaluation of psychosocial factors requires instruments that measure dynamic complexities. This study explains the design of a set of questionnaires to evaluate work and non-work psychosocial risk factors for stress-related illnesses. Methods The measurement model was based on a review of the literature. Content validity was performed by experts and cognitive interviews. Pilot testing was carried out with a convenience sample of 132 workers. Cronbach's alpha evaluated internal consistency, and concurrent validity was estimated by Spearman correlation coefficients. Results Three questionnaires were constructed to evaluate exposure to work and non-work risk factors. Content validity improved the questionnaires' coherence with the measurement model. Internal consistency was adequate (α=0.85–0.95). Concurrent validity resulted in moderate correlations of psychosocial factors with stress symptoms. Conclusions The questionnaires' content reflected a wide spectrum of psychosocial factor sources. Cognitive interviews improved understanding of questions and dimensions. The structure of the measurement model was confirmed. PMID:22628068
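The internal-consistency check reported above (Cronbach's alpha on item scores) is straightforward to compute; the Likert responses below are simulated placeholders, not the study's data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal consistency for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses from 8 workers on 4 correlated items
rng = np.random.default_rng(1)
base = rng.integers(1, 6, size=(8, 1))                       # shared trait level
scores = np.clip(base + rng.integers(-1, 2, size=(8, 4)), 1, 5)
print(f"Cronbach's alpha = {cronbach_alpha(scores.astype(float)):.2f}")
```

Values above roughly 0.7 to 0.8 are conventionally read as adequate, matching the 0.85 to 0.95 range reported.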

  13. Methodology to design a municipal solid waste generation and composition map: A case study

    SciTech Connect

Gallardo, A.; Carlos, M.; Peris, M.; Colomer, F.J.

    2014-11-15

Highlights: • To draw a waste generation and composition map of a town, many factors must be taken into account. • The proposed methodology offers two different approaches, depending on the available data, combined with geographical information systems. • The methodology has been applied to a Spanish city with success. • The methodology will be a useful tool to organize municipal solid waste management. - Abstract: Municipal solid waste (MSW) management is an important task that local governments as well as private companies must take into account to protect human health and the environment and to preserve natural resources. To design an adequate MSW management plan, the first step consists of defining the waste generation and composition patterns of the town. As these patterns depend on several socio-economic factors, it is advisable to organize them beforehand. Moreover, the waste generation and composition patterns may vary around the town and over time. Generally, the data are not homogeneous across the city, as neither the number of inhabitants nor the economic activity is constant. Therefore, if all the information is shown in thematic maps, the final waste management decisions can be made more efficiently. The main aim of this paper is to present a structured methodology that allows local authorities or private companies who deal with MSW to design their own MSW management plan depending on the available data. According to these data, this paper proposes two ways of action: a direct way when detailed data are available, and an indirect way when there is a lack of data and it is necessary to draw on bibliographic data. In either case, the amount of information needed is considerable. This paper combines the planning methodology with Geographic Information Systems to present the final results in thematic maps that make them easier to interpret. The proposed methodology is a useful preliminary tool to organize the MSW collection routes, including the selective collection.
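The direct/indirect split can be sketched as a small dispatch over the data available per district; the district names, populations, and the bibliographic per-capita rate below are invented placeholders, not figures from the case study:

```python
# Per-district MSW generation: the direct way uses measured collection data,
# the indirect way falls back on a bibliographic per-capita rate.
BIBLIO_RATE_KG_PER_CAP_DAY = 1.2   # illustrative literature value

districts = [
    {"name": "Centro", "inhabitants": 12000, "measured_t_per_yr": 5400},
    {"name": "Norte",  "inhabitants": 8000,  "measured_t_per_yr": None},
]

def annual_generation_t(d):
    if d["measured_t_per_yr"] is not None:             # direct way
        return d["measured_t_per_yr"]
    # indirect way: inhabitants x rate x 365 days, converted kg -> tonnes
    return d["inhabitants"] * BIBLIO_RATE_KG_PER_CAP_DAY * 365 / 1000

for d in districts:
    print(f"{d['name']}: {annual_generation_t(d):.0f} t/yr")
```

Joining these per-district estimates to polygon geometries in a GIS layer then yields the thematic map.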

  14. Application of an integrated flight/propulsion control design methodology to a STOVL aircraft

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay; Mattern, Duane L.

    1991-01-01

Results are presented from the application of an emerging Integrated Flight/Propulsion Control (IFPC) design methodology to a Short Take Off and Vertical Landing (STOVL) aircraft in transition flight. The steps in the methodology consist of designing command shaping prefilters to provide the overall desired response to pilot command inputs. A previously designed centralized controller is first validated for the integrated airframe/engine plant used. This integrated plant is derived from a different model of the engine subsystem than the one used for the centralized controller design. The centralized controller is then partitioned in a decentralized, hierarchical structure comprising airframe lateral and longitudinal subcontrollers and an engine subcontroller. Command shaping prefilters from the pilot control effector inputs are then designed, and time histories of the closed-loop IFPC system response to simulated pilot commands are compared to desired responses based on handling qualities requirements. Finally, the propulsion system safety and nonlinear limit protection logic is wrapped around the engine subcontroller, and the response of the closed-loop integrated system is evaluated for transients that encounter the propulsion surge margin limit.

  15. A methodology for the validated design space exploration of fuel cell powered unmanned aerial vehicles

    NASA Astrophysics Data System (ADS)

    Moffitt, Blake Almy

    Unmanned Aerial Vehicles (UAVs) are the most dynamic growth sector of the aerospace industry today. The need to provide persistent intelligence, surveillance, and reconnaissance for military operations is driving the planned acquisition of over 5,000 UAVs over the next five years. The most pressing need is for quiet, small UAVs with endurance beyond what is capable with advanced batteries or small internal combustion propulsion systems. Fuel cell systems demonstrate high efficiency, high specific energy, low noise, low temperature operation, modularity, and rapid refuelability making them a promising enabler of the small, quiet, and persistent UAVs that military planners are seeking. Despite the perceived benefits, the actual near-term performance of fuel cell powered UAVs is unknown. Until the auto industry began spending billions of dollars in research, fuel cell systems were too heavy for useful flight applications. However, the last decade has seen rapid development with fuel cell gravimetric and volumetric power density nearly doubling every 2--3 years. As a result, a few design studies and demonstrator aircraft have appeared, but overall the design methodology and vehicles are still in their infancy. The design of fuel cell aircraft poses many challenges. Fuel cells differ fundamentally from combustion based propulsion in how they generate power and interact with other aircraft subsystems. As a result, traditional multidisciplinary analysis (MDA) codes are inappropriate. Building new MDAs is difficult since fuel cells are rapidly changing in design, and various competitive architectures exist for balance of plant, hydrogen storage, and all electric aircraft subsystems. In addition, fuel cell design and performance data is closely protected which makes validation difficult and uncertainty significant. Finally, low specific power and high volumes compared to traditional combustion based propulsion result in more highly constrained design spaces that are

  16. A Systematic Approach to Optimize Organizations Operating in Uncertain Environments: Design Methodology and Applications

    DTIC Science & Technology

    2002-09-01

conditions force the company to periodically undertake various efforts directed at reducing the cost of its products, improving the quality of products, marketing... expected events (e.g., improving the quality of products, marketing the upgraded products, and so on) also necessitate the completion of a series of tasks

  17. Methodology for worker neutron exposure evaluation in the PDCF facility design.

    PubMed

    Scherpelz, R I; Traub, R J; Pryor, K H

    2004-01-01

A project headed by Washington Group International aims to design the Pit Disassembly and Conversion Facility (PDCF) to convert the plutonium pits from excessed nuclear weapons into plutonium oxide for ultimate disposition. Battelle staff are performing the shielding calculations that will determine appropriate shielding so that the facility workers will not exceed target exposure levels. The target exposure levels for workers in the facility are 5 mSv y(-1) for the whole body and 100 mSv y(-1) for the extremity, which presents a significant challenge to the designers of a facility that will process tons of radioactive material. The design effort depended on shielding calculations to determine appropriate thickness and composition for glove box walls, and concrete wall thicknesses for storage vaults. Pacific Northwest National Laboratory (PNNL) staff used ORIGEN-S and SOURCES to generate gamma and neutron source terms, and the Monte Carlo neutron-photon transport code MCNP-4C to calculate the radiation transport in the facility. The shielding calculations were performed by a team of four scientists, so it was necessary to develop a consistent methodology. There was also a requirement for the study to be cost-effective, so efficient methods of evaluation were required. The calculations were subject to rigorous scrutiny by internal and external reviewers, so acceptability was a major feature of the methodology. Some of the issues addressed in the development of the methodology included selecting appropriate dose factors, developing a method for handling extremity doses, adopting an efficient method for evaluating effective dose equivalent in a non-uniform radiation field, modelling the reinforcing steel in concrete, and modularising the geometry descriptions for efficiency. The relative importance of the neutron dose equivalent compared with the gamma dose equivalent varied substantially depending on the specific shielding conditions and lessons
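The "effective dose equivalent in a non-uniform radiation field" evaluation reduces to a tissue-weighted sum, HE = Σ_T w_T H_T. A minimal sketch using the ICRP Publication 26 weighting factors that underlie the HE quantity; the organ doses are made-up placeholders, and the paper's actual dose factors are not reproduced here:

```python
# Effective dose equivalent in a non-uniform field: HE = sum over tissues of w_T * H_T.
# Tissue weighting factors per ICRP Publication 26; organ doses are illustrative.
ICRP26_WT = {
    "gonads": 0.25, "breast": 0.15, "red_marrow": 0.12,
    "lung": 0.12, "thyroid": 0.03, "bone_surface": 0.03, "remainder": 0.30,
}

def effective_dose_equivalent(organ_dose_mSv: dict) -> float:
    return sum(ICRP26_WT[t] * h for t, h in organ_dose_mSv.items())

# Hypothetical non-uniform exposure: the trunk is shielded more than head/neck
doses = {"gonads": 0.8, "breast": 1.1, "red_marrow": 1.0,
         "lung": 1.0, "thyroid": 2.5, "bone_surface": 2.0, "remainder": 1.2}
print(f"HE = {effective_dose_equivalent(doses):.3f} mSv")
```

In a uniform field this collapses to the whole-body dose, since the weighting factors sum to one.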

  18. A methodology for the design of experiments in computational intelligence with multiple regression models.

    PubMed

    Fernandez-Lozano, Carlos; Gestal, Marcos; Munteanu, Cristian R; Dorado, Julian; Pazos, Alejandro

    2016-01-01

The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence, and especially on the correct comparison of the results provided by different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational Intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results differ for three out of five state-of-the-art simple datasets, and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is important to use a statistical approach to indicate whether the differences are statistically significant when using this kind of algorithm. Furthermore, our results with three real complex datasets report different best models than the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as in other fields, such as bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable.
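The kind of fold-wise statistical comparison advocated here can be sketched with two models and a paired t statistic over cross-validation folds. The dataset, the ridge-versus-baseline pairing, and the fold count are illustrative choices, not RRegrs internals:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy regression dataset: linear signal plus noise
X = rng.normal(size=(120, 5))
y = X @ np.array([1.5, -2.0, 0.0, 0.5, 0.0]) + rng.normal(0, 0.5, 120)

def ridge_fit_predict(Xtr, ytr, Xte, lam=1.0):
    # Closed-form ridge regression
    w = np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(Xtr.shape[1]), Xtr.T @ ytr)
    return Xte @ w

def cv_rmse(predict, X, y, k=10):
    # Per-fold RMSE over k-fold cross-validation
    folds = np.array_split(np.arange(len(y)), k)
    out = []
    for te in folds:
        tr = np.setdiff1d(np.arange(len(y)), te)
        pred = predict(X[tr], y[tr], X[te])
        out.append(np.sqrt(np.mean((y[te] - pred) ** 2)))
    return np.array(out)

mean_baseline = lambda Xtr, ytr, Xte: np.full(len(Xte), ytr.mean())
r1 = cv_rmse(ridge_fit_predict, X, y)
r2 = cv_rmse(mean_baseline, X, y)

# Paired t statistic over the k matched folds (k-1 degrees of freedom)
d = r2 - r1
t = d.mean() / (d.std(ddof=1) / np.sqrt(len(d)))
print(f"ridge RMSE {r1.mean():.2f} vs baseline {r2.mean():.2f}, paired t = {t:.1f}")
```

Pairing by fold controls for fold-to-fold difficulty, which is what makes the significance claim about "the best model" defensible.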

  19. A methodology for the design of experiments in computational intelligence with multiple regression models

    PubMed Central

    Gestal, Marcos; Munteanu, Cristian R.; Dorado, Julian; Pazos, Alejandro

    2016-01-01

The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence, and especially on the correct comparison of the results provided by different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational Intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results differ for three out of five state-of-the-art simple datasets, and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is important to use a statistical approach to indicate whether the differences are statistically significant when using this kind of algorithm. Furthermore, our results with three real complex datasets report different best models than the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as in other fields, such as bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable. PMID:27920952

20. Device Thrombogenicity Emulator (DTE) – Design Optimization Methodology for Cardiovascular Devices: A Study in Two Bileaflet MHV Designs

    PubMed Central

    Xenos, Michalis; Girdhar, Gaurav; Alemu, Yared; Jesty, Jolyon; Slepian, Marvin; Einav, Shmuel; Bluestein, Danny

    2010-01-01

Patients who receive prosthetic heart valve (PHV) implants require mandatory anticoagulation medication after implantation due to the thrombogenic potential of the valve. Optimization of PHV designs may facilitate reduction of flow-induced thrombogenicity and reduce or eliminate the need for post-implant anticoagulants. We present a methodology entitled Device Thrombogenicity Emulator (DTE) for optimizing the thrombo-resistance performance of PHVs by combining numerical and experimental approaches. Two bileaflet mechanical heart valve (MHV) designs – St. Jude Medical (SJM) and ATS – were investigated by studying the effect of distinct flow phases on platelet activation. Transient turbulent and direct numerical simulations (DNS) were conducted, and stress loading histories experienced by the platelets were calculated along flow trajectories. The numerical simulations indicated distinct design-dependent differences between the two valves. The stress-loading waveforms extracted from the numerical simulations were programmed into a hemodynamic shearing device (HSD), emulating the flow conditions past the valves in distinct 'hot spot' flow regions that are implicated in MHV thrombogenicity. The resultant platelet activity was measured with a modified prothrombinase assay, and was found to be significantly higher in the SJM valve, mostly during the regurgitation phase. The experimental results were in excellent agreement with the calculated platelet activation potential. This establishes the utility of the DTE methodology for serving as a test bed for evaluating design modifications for achieving better thrombogenic performance for such devices. PMID:20483411
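The "stress loading histories experienced by the platelets" reduce, in the simplest linear model, to accumulating scalar shear stress along a trajectory, SA = ∫ τ(t) dt. A sketch with an invented waveform, not the SJM/ATS simulation output:

```python
import numpy as np

# Linear stress-accumulation model along one platelet trajectory over a
# cardiac cycle; the waveform below is illustrative only.
t = np.linspace(0.0, 0.85, 500)                        # one cycle, seconds
tau = 20 + 60 * np.exp(-((t - 0.35) / 0.05) ** 2)      # forward-flow peak
tau += 120 * np.exp(-((t - 0.70) / 0.02) ** 2)         # short regurgitation spike

# Trapezoidal integration of tau(t) dt
sa = float(np.sum(0.5 * (tau[1:] + tau[:-1]) * np.diff(t)))
print(f"stress accumulation over the cycle: {sa:.1f} dyne*s/cm^2")
```

Replaying such waveforms in the hemodynamic shearing device is what lets the in-vitro assay probe each "hot spot" phase separately; note how the brief regurgitation spike contributes a disproportionate share of the integral.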

  1. Digital Games, Design, and Learning: A Systematic Review and Meta-Analysis

    ERIC Educational Resources Information Center

    Clark, Douglas B.; Tanner-Smith, Emily E.; Killingsworth, Stephen S.

    2016-01-01

    In this meta-analysis, we systematically reviewed research on digital games and learning for K-16 students. We synthesized comparisons of game versus nongame conditions (i.e., media comparisons) and comparisons of augmented games versus standard game designs (i.e., value-added comparisons). We used random-effects meta-regression models with robust…

  2. The Design, Implementation, and Evaluation of Online Credit Nutrition Courses: A Systematic Review

    ERIC Educational Resources Information Center

    Cohen, Nancy L.; Carbone, Elena T.; Beffa-Negrini, Patricia A.

    2011-01-01

    Objective: To assess how postsecondary online nutrition education courses (ONEC) are delivered, determine ONEC effectiveness, identify theoretical models used, and identify future research needs. Design: Systematic search of database literature. Setting: Postsecondary education. Participants: Nine research articles evaluating postsecondary ONEC.…

  3. Design Methodology for Magnetic Field-Based Soft Tri-Axis Tactile Sensors

    PubMed Central

    Wang, Hongbo; de Boer, Greg; Kow, Junwai; Alazmani, Ali; Ghajari, Mazdak; Hewson, Robert; Culmer, Peter

    2016-01-01

    Tactile sensors are essential if robots are to safely interact with the external world and to dexterously manipulate objects. Current tactile sensors have limitations restricting their use, notably being too fragile or having limited performance. Magnetic field-based soft tactile sensors offer a potential improvement, being durable, low cost, accurate and high bandwidth, but they are relatively undeveloped because of the complexities involved in design and calibration. This paper presents a general design methodology for magnetic field-based three-axis soft tactile sensors, enabling researchers to easily develop specific tactile sensors for a variety of applications. All aspects (design, fabrication, calibration and evaluation) of the development of tri-axis soft tactile sensors are presented and discussed. A moving least square approach is used to decouple and convert the magnetic field signal to force output to eliminate non-linearity and cross-talk effects. A case study of a tactile sensor prototype, MagOne, was developed. This achieved a resolution of 1.42 mN in normal force measurement (0.71 mN in shear force), good output repeatability and has a maximum hysteresis error of 3.4%. These results outperform comparable sensors reported previously, highlighting the efficacy of our methodology for sensor design. PMID:27563908
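The moving least squares decoupling step can be sketched as locally weighted linear regression from the three magnetic field channels to force; the synthetic field-to-force relation and the Gaussian kernel bandwidth are assumptions for illustration, not the MagOne calibration:

```python
import numpy as np

def mls_predict(B_train, F_train, B_query, h=0.3):
    """Moving least squares: for each query field reading, fit a locally
    weighted linear map B -> F using a Gaussian kernel of bandwidth h."""
    preds = []
    for b in np.atleast_2d(B_query):
        w = np.exp(-np.sum((B_train - b) ** 2, axis=1) / (2 * h * h))
        A = np.column_stack([np.ones(len(B_train)), B_train])
        W = np.diag(w)
        coef = np.linalg.solve(A.T @ W @ A, A.T @ W @ F_train)
        preds.append(np.concatenate([[1.0], b]) @ coef)
    return np.array(preds)

# Synthetic calibration data: a mildly nonlinear field-to-force relation,
# standing in for measured 3-axis Hall readings under known loads
rng = np.random.default_rng(3)
B = rng.uniform(-1, 1, size=(200, 3))                  # field readings (a.u.)
F = B[:, 0] + 0.5 * B[:, 1] ** 2 - 0.3 * B[:, 2]       # normal force (a.u.)
f_hat = mls_predict(B, F, np.array([[0.2, -0.4, 0.1]]))
print(f"predicted force: {f_hat[0]:.3f}")
```

Because each prediction refits locally, the map absorbs the nonlinearity and cross-talk that a single global linear calibration would leave as error.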

  4. Design Methodology for Magnetic Field-Based Soft Tri-Axis Tactile Sensors.

    PubMed

    Wang, Hongbo; de Boer, Greg; Kow, Junwai; Alazmani, Ali; Ghajari, Mazdak; Hewson, Robert; Culmer, Peter

    2016-08-24

    Tactile sensors are essential if robots are to safely interact with the external world and to dexterously manipulate objects. Current tactile sensors have limitations restricting their use, notably being too fragile or having limited performance. Magnetic field-based soft tactile sensors offer a potential improvement, being durable, low cost, accurate and high bandwidth, but they are relatively undeveloped because of the complexities involved in design and calibration. This paper presents a general design methodology for magnetic field-based three-axis soft tactile sensors, enabling researchers to easily develop specific tactile sensors for a variety of applications. All aspects (design, fabrication, calibration and evaluation) of the development of tri-axis soft tactile sensors are presented and discussed. A moving least square approach is used to decouple and convert the magnetic field signal to force output to eliminate non-linearity and cross-talk effects. A case study of a tactile sensor prototype, MagOne, was developed. This achieved a resolution of 1.42 mN in normal force measurement (0.71 mN in shear force), good output repeatability and has a maximum hysteresis error of 3.4%. These results outperform comparable sensors reported previously, highlighting the efficacy of our methodology for sensor design.

  5. Low-Radiation Cellular Inductive Powering of Rodent Wireless Brain Interfaces: Methodology and Design Guide.

    PubMed

    Soltani, Nima; Aliroteh, Miaad S; Salam, M Tariqus; Perez Velazquez, Jose Luis; Genov, Roman

    2016-03-04

This paper presents a general methodology of inductive power delivery in wireless chronic rodent electrophysiology applications. The focus is on the design considerations for such systems under the following key constraints: maximum power delivery under the allowable specific absorption rate (SAR), low cost, and spatial scalability. The methodology includes inductive coil design considerations within a low-frequency ferrite-core-free power transfer link which includes a scalable coil-array power transmitter floor and a single-coil implanted or worn power receiver. A specific design example is presented that includes the concept of low-SAR cellular single-transmitter-coil powering through dynamic tracking of a magnet-less receiver spatial location. The transmitter coil instantaneous supply current is monitored using a small number of low-cost electronic components. A drop in its value indicates the proximity of the receiver due to the reflected impedance of the latter. Only the transmitter coil nearest to the receiver is activated. Operating at the low frequency of 1.5 MHz, the inductive powering floor delivers a maximum of 15.9 W below the IEEE C95 SAR limit, which is over three times greater than that in other recently reported designs. The power transfer efficiency of 39% and 13% at the nominal and maximum distances of 8 cm and 11 cm, respectively, is maintained.
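The coil-selection logic (activate only the transmitter coil whose supply current drops because of the receiver's reflected impedance) can be sketched directly; the current values and threshold below are illustrative, not the paper's measurements:

```python
# Activate only the transmitter coil loaded by the receiver's reflected
# impedance, detected as a drop in instantaneous supply current.
# All current values (A) are illustrative placeholders.
IDLE_CURRENT = 1.20        # nominal coil current with no receiver nearby
DROP_THRESHOLD = 0.05      # minimum drop that signals receiver proximity

def select_active_coil(currents):
    drops = [IDLE_CURRENT - i for i in currents]
    best = max(range(len(currents)), key=lambda k: drops[k])
    return best if drops[best] >= DROP_THRESHOLD else None

floor = [1.19, 1.21, 1.08, 1.20]   # coil 2 sees the reflected load
print("activate coil:", select_active_coil(floor))
```

Powering one coil at a time is what keeps the whole-floor SAR low while the receiver roams the cage.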

  6. Low-Radiation Cellular Inductive Powering of Rodent Wireless Brain Interfaces: Methodology and Design Guide.

    PubMed

    Soltani, Nima; Aliroteh, Miaad S; Salam, M Tariqus; Perez Velazquez, Jose Luis; Genov, Roman

    2016-08-01

This paper presents a general methodology of inductive power delivery in wireless chronic rodent electrophysiology applications. The focus is on the design considerations for such systems under the following key constraints: maximum power delivery under the allowable specific absorption rate (SAR), low cost, and spatial scalability. The methodology includes inductive coil design considerations within a low-frequency ferrite-core-free power transfer link which includes a scalable coil-array power transmitter floor and a single-coil implanted or worn power receiver. A specific design example is presented that includes the concept of low-SAR cellular single-transmitter-coil powering through dynamic tracking of a magnet-less receiver spatial location. The transmitter coil instantaneous supply current is monitored using a small number of low-cost electronic components. A drop in its value indicates the proximity of the receiver due to the reflected impedance of the latter. Only the transmitter coil nearest to the receiver is activated. Operating at the low frequency of 1.5 MHz, the inductive powering floor delivers a maximum of 15.9 W below the IEEE C95 SAR limit, which is over three times greater than that in other recently reported designs. The power transfer efficiency of 39% and 13% at the nominal and maximum distances of 8 cm and 11 cm, respectively, is maintained.

  7. Multirate Flutter Suppression System Design for the Benchmark Active Controls Technology Wing. Part 2; Methodology Application Software Toolbox

    NASA Technical Reports Server (NTRS)

    Mason, Gregory S.; Berg, Martin C.; Mukhopadhyay, Vivek

    2002-01-01

    To study the effectiveness of various control system design methodologies, the NASA Langley Research Center initiated the Benchmark Active Controls Project. In this project, the various methodologies were applied to design a flutter suppression system for the Benchmark Active Controls Technology (BACT) Wing. This report describes the user's manual and software toolbox developed at the University of Washington to design a multirate flutter suppression control law for the BACT wing.

  8. Design methodology: edgeless 3D ASICs with complex in-pixel processing for pixel detectors

    SciTech Connect

Fahim, Farah; Deptuch, Grzegorz W.; Hoff, James R.; Mohseni, Hooman

    2015-08-28

The design methodology for the development of 3D integrated edgeless pixel detectors with in-pixel processing using Electronic Design Automation (EDA) tools is presented. A large-area, three-tier 3D detector, with one sensor layer and two ASIC layers containing one analog and one digital tier, is built for x-ray photon time-of-arrival measurement and imaging. A full custom analog pixel is 65μm x 65μm. It is connected to a sensor pixel of the same size on one side, and on the other side it has approximately 40 connections to the digital pixel. A 32 x 32 edgeless array without any peripheral functional blocks constitutes a sub-chip. The sub-chip is an indivisible unit, which is further arranged in a 6 x 6 array to create the entire 1.248cm x 1.248cm ASIC. Each chip has 720 bump-bond I/O connections on the back of the digital tier to the ceramic PCB. All the analog tier power and biasing is conveyed through the digital tier from the PCB. The assembly has no peripheral functional blocks, and hence the active area extends to the edge of the detector. This was achieved by using a few flavors of almost identical analog pixels (minimal variation in layout) to allow for peripheral biasing blocks to be placed within pixels. The 1024 pixels within a digital sub-chip array have a variety of full custom, semi-custom and automated timing-driven functional blocks placed together. The methodology uses a modified mixed-mode on-top digital implementation flow to not only harness the tool efficiency for timing and floor-planning but also to maintain designer control over compact parasitically aware layout. The methodology uses the Cadence design platform; however, it is not limited to this tool.

  9. Methodology to design a municipal solid waste generation and composition map: a case study.

    PubMed

    Gallardo, A; Carlos, M; Peris, M; Colomer, F J

    2014-11-01

Municipal solid waste (MSW) management is an important task that local governments as well as private companies must take into account to protect human health and the environment and to preserve natural resources. To design an adequate MSW management plan, the first step consists of defining the waste generation and composition patterns of the town. As these patterns depend on several socio-economic factors, it is advisable to organize them beforehand. Moreover, the waste generation and composition patterns may vary around the town and over time. Generally, the data are not homogeneous across the city, as neither the number of inhabitants nor the economic activity is constant. Therefore, if all the information is shown in thematic maps, the final waste management decisions can be made more efficiently. The main aim of this paper is to present a structured methodology that allows local authorities or private companies who deal with MSW to design their own MSW management plan depending on the available data. According to these data, this paper proposes two ways of action: a direct way when detailed data are available, and an indirect way when there is a lack of data and it is necessary to draw on bibliographic data. In either case, the amount of information needed is considerable. This paper combines the planning methodology with Geographic Information Systems to present the final results in thematic maps that make them easier to interpret. The proposed methodology is a useful preliminary tool to organize the MSW collection routes, including the selective collection. To verify the methodology, it has been successfully applied to a Spanish town.

  10. Methodology to design a municipal solid waste generation and composition map: a case study.

    PubMed

    Gallardo, A; Carlos, M; Peris, M; Colomer, F J

    2015-02-01

Municipal solid waste (MSW) management is an important task that local governments as well as private companies must take into account to protect human health and the environment and to preserve natural resources. To design an adequate MSW management plan, the first step consists of defining the waste generation and composition patterns of the town. As these patterns depend on several socio-economic factors, it is advisable to organize them beforehand. Moreover, the waste generation and composition patterns may vary around the town and over time. Generally, the data are not homogeneous across the city, as neither the number of inhabitants nor the economic activity is constant. Therefore, if all the information is shown in thematic maps, the final waste management decisions can be made more efficiently. The main aim of this paper is to present a structured methodology that allows local authorities or private companies who deal with MSW to design their own MSW management plan depending on the available data. According to these data, this paper proposes two ways of action: a direct way when detailed data are available, and an indirect way when there is a lack of data and it is necessary to draw on bibliographic data. In either case, the amount of information needed is considerable. This paper combines the planning methodology with Geographic Information Systems to present the final results in thematic maps that make them easier to interpret. The proposed methodology is a useful preliminary tool to organize the MSW collection routes, including the selective collection. To verify the methodology, it has been successfully applied to a Spanish town.

  11. A Human-Centered Design Methodology to Enhance the Usability, Human Factors, and User Experience of Connected Health Systems: A Three-Phase Methodology

    PubMed Central

    Harte, Richard; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul MA; Scharf, Thomas; ÓLaighin, Gearóid

    2017-01-01

    Background Design processes such as human-centered design, which involve the end user throughout the product development and testing process, can be crucial in ensuring that the product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of human-centered design can often present a challenge when design teams are faced with the necessary, rapid, product development life cycles associated with the competitive connected health industry. Objective We wanted to derive a structured methodology that followed the principles of human-centered design that would allow designers and developers to ensure that the needs of the user are taken into account throughout the design process, while maintaining a rapid pace of development. In this paper, we present the methodology and its rationale before outlining how it was applied to assess and enhance the usability, human factors, and user experience of a connected health system known as the Wireless Insole for Independent and Safe Elderly Living (WIISEL) system, a system designed to continuously assess fall risk by measuring gait and balance parameters associated with fall risk. Methods We derived a three-phase methodology. In Phase 1 we emphasized the construction of a use case document. This document can be used to detail the context of use of the system by utilizing storyboarding, paper prototypes, and mock-ups in conjunction with user interviews to gather insightful user feedback on different proposed concepts. In Phase 2 we emphasized the use of expert usability inspections such as heuristic evaluations and cognitive walkthroughs with small multidisciplinary groups to review the prototypes born out of the Phase 1 feedback. Finally, in Phase 3 we emphasized classical user testing with target end users, using various metrics to measure the user experience and improve the final prototypes. Results We report a successful implementation of the

  12. Inductive Powering of Subcutaneous Stimulators: Key Parameters and Their Impact on the Design Methodology.

    PubMed

    Godfraind, Carmen; Debelle, Adrien; Lonys, Laurent; Acuña, Vicente; Doguet, Pascal; Nonclercq, Antoine

    2016-06-13

    Inductive powering of implantable medical devices involves numerous factors that act on system efficiency and safety in adversarial ways. This paper highlights their roles and identifies a procedure enabling the system design. The procedure decouples the problem into four principal steps: the frequency choice, the magnetic link optimization, the secondary circuit design, and finally the primary circuit design. The methodology was tested on the powering system of a device requiring a power of 300 mW and implanted at a distance of 15 to 30 mm from the outside power source. It allowed the identification of the most critical parameters. A satisfying efficiency of 34% was reached at 21 mm, which tends to validate the proposed design procedure.
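The trade-off between coil coupling and quality factors that drives such a design procedure can be illustrated with the standard two-coil link-efficiency formula from inductive-link theory; the coil parameters below are hypothetical, not values from the paper:

```python
import math

def max_link_efficiency(k: float, q1: float, q2: float) -> float:
    """Maximum achievable efficiency of a resonant two-coil inductive
    link with coupling coefficient k and coil quality factors q1, q2
    (standard result from inductive-link theory)."""
    x = k * k * q1 * q2
    return x / (1.0 + math.sqrt(1.0 + x)) ** 2

# Hypothetical coil parameters -- not taken from the paper:
for k in (0.05, 0.10, 0.20):
    print(f"k = {k:.2f}: eta_max = {max_link_efficiency(k, 100.0, 50.0):.1%}")
```

Efficiency falls off quickly with implant depth because k drops steeply as the coils separate, which is one reason the frequency and magnetic-link steps come first in such a procedure.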

  13. Design Methodology: ASICs with complex in-pixel processing for Pixel Detectors

    SciTech Connect

    Fahim, Farah

    2014-10-31

    The development of Application Specific Integrated Circuits (ASICs) for pixel detectors with complex in-pixel processing using Computer Aided Design (CAD) tools that are, themselves, mainly developed for the design of conventional digital circuits requires a specialized approach. Mixed-signal pixels often require detailed, parasitic-aware analog front-ends and extremely compact digital back-ends with more than 1000 transistors in small areas below 100 μm x 100 μm. These pixels are tiled to create large arrays, which have the same clock distribution and data readout speed constraints as, for example, micro-processors. The methodology uses a modified mixed-mode on-top digital implementation flow to not only harness the tool efficiency for timing and floor-planning but also maintain designer control over a compact, parasitic-aware layout.

  14. Numerical simulation methodologies for design and development of Diffuser-Augmented Wind Turbines - analysis and comparison

    NASA Astrophysics Data System (ADS)

    Lipian, Michał; Karczewski, Maciej; Molinski, Jakub; Jozwik, Krzysztof

    2016-01-01

    Different numerical computation methods used to develop a methodology for fast, efficient, reliable design and comparison of Diffuser-Augmented Wind Turbine (DAWT) geometries are presented. The demand for such methods is evident, given the multitude of geometrical parameters that influence the flow character through ducted turbines. The results of Actuator Disk Model (ADM) simulations are confronted with a simulation method of a higher order of accuracy, the 3D Fully-resolved Rotor Model (FRM), at the rotor design point. Both are checked for consistency with the experimental results measured in the wind tunnel at the Institute of Turbomachinery (IMP), Lodz University of Technology (TUL). The attempt to find an efficient method (a compromise between accuracy and design time) for the flow analysis pertinent to the DAWT is the novel approach presented in this paper.

  15. Inductive Powering of Subcutaneous Stimulators: Key Parameters and Their Impact on the Design Methodology

    PubMed Central

    Godfraind, Carmen; Debelle, Adrien; Lonys, Laurent; Acuña, Vicente; Doguet, Pascal; Nonclercq, Antoine

    2016-01-01

    Inductive powering of implantable medical devices involves numerous factors that act on system efficiency and safety in adversarial ways. This paper highlights their roles and identifies a procedure enabling the system design. The procedure decouples the problem into four principal steps: the frequency choice, the magnetic link optimization, the secondary circuit design, and finally the primary circuit design. The methodology was tested on the powering system of a device requiring a power of 300 mW and implanted at a distance of 15 to 30 mm from the outside power source. It allowed the identification of the most critical parameters. A satisfying efficiency of 34% was reached at 21 mm, which tends to validate the proposed design procedure. PMID:27478572

  16. Methods for comparing data across differently designed agronomic studies: examples of different meta-analysis methods used to compare relative composition of plant foods grown using organic or conventional production methods and a protocol for a systematic review.

    PubMed

    Brandt, Kirsten; Srednicka-Tober, Dominika; Barański, Marcin; Sanderson, Roy; Leifert, Carlo; Seal, Chris

    2013-07-31

    Meta-analyses are methods to combine outcomes from different studies to investigate consistent effects of relatively small magnitude, which are difficult to distinguish from random variation within a single study. Several published meta-analyses addressed whether organic and conventional production methods affect the composition of plant foods differently. The meta-analyses were carried out using different options for the methodology and resulted in different conclusions. The types of designs of field trials and farm comparisons widely used in horticultural and agronomic research differ substantially from the clinical trials and epidemiological studies that most meta-analysis methodologies were developed for. Therefore, it is proposed that a systematic review and meta-analysis be carried out with the aim of developing a consolidated methodology. If successful, this methodology can then be used to determine effects of different production systems on plant food composition as well as other comparable factors with small but systematic effects across studies.
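As a minimal illustration of the pooling step that all such meta-analyses share, here is a fixed-effect inverse-variance combination in plain Python; the study effects and standard errors are invented for illustration, not data from the review:

```python
import math

def fixed_effect_meta(effects, std_errs):
    """Inverse-variance fixed-effect pooling: each study is weighted by
    1/SE^2, and the pooled standard error is 1/sqrt(sum of weights)."""
    weights = [1.0 / se ** 2 for se in std_errs]
    pooled = sum(w * y for w, y in zip(weights, effects)) / sum(weights)
    return pooled, math.sqrt(1.0 / sum(weights))

# Invented per-study effects (e.g. log ratios of a nutrient concentration,
# organic vs. conventional) with their standard errors:
pooled, se = fixed_effect_meta([0.10, 0.05, 0.20], [0.04, 0.08, 0.10])
print(f"pooled effect = {pooled:.3f}, 95% CI +/- {1.96 * se:.3f}")
```

Random-effects models add a between-study variance term to the weights; the choice between such model options is exactly the kind of methodological decision the proposed systematic review aims to consolidate.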

  17. Applying Quality Indicators to Single-Case Research Designs Used in Special Education: A Systematic Review

    ERIC Educational Resources Information Center

    Moeller, Jeremy D.; Dattilo, John; Rusch, Frank

    2015-01-01

    This study examined how specific guidelines and heuristics have been used to identify methodological rigor associated with single-case research designs based on quality indicators developed by Horner et al. Specifically, this article describes how literature reviews have applied Horner et al.'s quality indicators and evidence-based criteria.…

  18. Community-wide assessment of protein-interface modeling suggests improvements to design methodology.

    PubMed

    Fleishman, Sarel J; Whitehead, Timothy A; Strauch, Eva-Maria; Corn, Jacob E; Qin, Sanbo; Zhou, Huan-Xiang; Mitchell, Julie C; Demerdash, Omar N A; Takeda-Shitaka, Mayuko; Terashi, Genki; Moal, Iain H; Li, Xiaofan; Bates, Paul A; Zacharias, Martin; Park, Hahnbeom; Ko, Jun-su; Lee, Hasup; Seok, Chaok; Bourquard, Thomas; Bernauer, Julie; Poupon, Anne; Azé, Jérôme; Soner, Seren; Ovali, Sefik Kerem; Ozbek, Pemra; Tal, Nir Ben; Haliloglu, Türkan; Hwang, Howook; Vreven, Thom; Pierce, Brian G; Weng, Zhiping; Pérez-Cano, Laura; Pons, Carles; Fernández-Recio, Juan; Jiang, Fan; Yang, Feng; Gong, Xinqi; Cao, Libin; Xu, Xianjin; Liu, Bin; Wang, Panwen; Li, Chunhua; Wang, Cunxin; Robert, Charles H; Guharoy, Mainak; Liu, Shiyong; Huang, Yangyu; Li, Lin; Guo, Dachuan; Chen, Ying; Xiao, Yi; London, Nir; Itzhaki, Zohar; Schueler-Furman, Ora; Inbar, Yuval; Potapov, Vladimir; Cohen, Mati; Schreiber, Gideon; Tsuchiya, Yuko; Kanamori, Eiji; Standley, Daron M; Nakamura, Haruki; Kinoshita, Kengo; Driggers, Camden M; Hall, Robert G; Morgan, Jessica L; Hsu, Victor L; Zhan, Jian; Yang, Yuedong; Zhou, Yaoqi; Kastritis, Panagiotis L; Bonvin, Alexandre M J J; Zhang, Weiyi; Camacho, Carlos J; Kilambi, Krishna P; Sircar, Aroop; Gray, Jeffrey J; Ohue, Masahito; Uchikoga, Nobuyuki; Matsuzaki, Yuri; Ishida, Takashi; Akiyama, Yutaka; Khashan, Raed; Bush, Stephen; Fouches, Denis; Tropsha, Alexander; Esquivel-Rodríguez, Juan; Kihara, Daisuke; Stranges, P Benjamin; Jacak, Ron; Kuhlman, Brian; Huang, Sheng-You; Zou, Xiaoqin; Wodak, Shoshana J; Janin, Joel; Baker, David

    2011-11-25

    The CAPRI (Critical Assessment of Predicted Interactions) and CASP (Critical Assessment of protein Structure Prediction) experiments have demonstrated the power of community-wide tests of methodology in assessing the current state of the art and spurring progress in the very challenging areas of protein docking and structure prediction. We sought to bring the power of community-wide experiments to bear on a very challenging protein design problem that provides a complementary but equally fundamental test of current understanding of protein-binding thermodynamics. We have generated a number of designed protein-protein interfaces with very favorable computed binding energies but which do not appear to be formed in experiments, suggesting that there may be important physical chemistry missing in the energy calculations. A total of 28 research groups took up the challenge of determining what is missing: we provided structures of 87 designed complexes and 120 naturally occurring complexes and asked participants to identify energetic contributions and/or structural features that distinguish between the two sets. The community found that electrostatics and solvation terms partially distinguish the designs from the natural complexes, largely due to the nonpolar character of the designed interactions. Beyond this polarity difference, the community found that the designed binding surfaces were, on average, structurally less embedded in the designed monomers, suggesting that backbone conformational rigidity at the designed surface is important for realization of the designed function. These results can be used to improve computational design strategies, but there is still much to be learned; for example, one designed complex, which does form in experiments, was classified by all metrics as a nonbinder.

  19. Community-wide assessment of protein-interface modeling suggests improvements to design methodology

    PubMed Central

    Fleishman, Sarel J; Whitehead, Timothy A; Strauch, Eva-Maria; Corn, Jacob E; Qin, Sanbo; Zhou, Huan-Xiang; Mitchell, Julie C.; Demerdash, Omar N.A; Takeda-Shitaka, Mayuko; Terashi, Genki; Moal, Iain H.; Li, Xiaofan; Bates, Paul A.; Zacharias, Martin; Park, Hahnbeom; Ko, Jun-su; Lee, Hasup; Seok, Chaok; Bourquard, Thomas; Bernauer, Julie; Poupon, Anne; Azé, Jérôme; Soner, Seren; Ovali, Şefik Kerem; Ozbek, Pemra; Ben Tal, Nir; Haliloglu, Türkan; Hwang, Howook; Vreven, Thom; Pierce, Brian G.; Weng, Zhiping; Pérez-Cano, Laura; Pons, Carles; Fernández-Recio, Juan; Jiang, Fan; Yang, Feng; Gong, Xinqi; Cao, Libin; Xu, Xianjin; Liu, Bin; Wang, Panwen; Li, Chunhua; Wang, Cunxin; Robert, Charles H.; Guharoy, Mainak; Liu, Shiyong; Huang, Yangyu; Li, Lin; Guo, Dachuan; Chen, Ying; Xiao, Yi; London, Nir; Itzhaki, Zohar; Schueler-Furman, Ora; Inbar, Yuval; Patapov, Vladimir; Cohen, Mati; Schreiber, Gideon; Tsuchiya, Yuko; Kanamori, Eiji; Standley, Daron M.; Nakamura, Haruki; Kinoshita, Kengo; Driggers, Camden M.; Hall, Robert G.; Morgan, Jessica L.; Hsu, Victor L.; Zhan, Jian; Yang, Yuedong; Zhou, Yaoqi; Kastritis, Panagiotis L.; Bonvin, Alexandre M.J.J.; Zhang, Weiyi; Camacho, Carlos J.; Kilambi, Krishna P.; Sircar, Aroop; Gray, Jeffrey J.; Ohue, Masahito; Uchikoga, Nobuyuki; Matsuzaki, Yuri; Ishida, Takashi; Akiyama, Yutaka; Khashan, Raed; Bush, Stephen; Fouches, Denis; Tropsha, Alexander; Esquivel-Rodríguez, Juan; Kihara, Daisuke; Stranges, P Benjamin; Jacak, Ron; Kuhlman, Brian; Huang, Sheng-You; Zou, Xiaoqin; Wodak, Shoshana J; Janin, Joel; Baker, David

    2013-01-01

    The CAPRI and CASP prediction experiments have demonstrated the power of community wide tests of methodology in assessing the current state of the art and spurring progress in the very challenging areas of protein docking and structure prediction. We sought to bring the power of community wide experiments to bear on a very challenging protein design problem that provides a complementary but equally fundamental test of current understanding of protein-binding thermodynamics. We have generated a number of designed protein-protein interfaces with very favorable computed binding energies but which do not appear to be formed in experiments, suggesting there may be important physical chemistry missing in the energy calculations. 28 research groups took up the challenge of determining what is missing: we provided structures of 87 designed complexes and 120 naturally occurring complexes and asked participants to identify energetic contributions and/or structural features that distinguish between the two sets. The community found that electrostatics and solvation terms partially distinguish the designs from the natural complexes, largely due to the non-polar character of the designed interactions. Beyond this polarity difference, the community found that the designed binding surfaces were on average structurally less embedded in the designed monomers, suggesting that backbone conformational rigidity at the designed surface is important for realization of the designed function. These results can be used to improve computational design strategies, but there is still much to be learned; for example, one designed complex, which does form in experiments, was classified by all metrics as a non-binder. PMID:22001016

  20. A system-of-systems modeling methodology for strategic general aviation design decision-making

    NASA Astrophysics Data System (ADS)

    Won, Henry Thome

    General aviation has long been studied as a means of providing an on-demand "personal air vehicle" that bypasses the traffic at major commercial hubs. This thesis continues this research through development of a system-of-systems modeling methodology applicable to the selection of synergistic product concepts, market segments, and business models. From the perspective of the conceptual design engineer, the design and selection of future general aviation aircraft is complicated by the definition of constraints and requirements, and by the tradeoffs among performance and cost aspects. Qualitative problem definition methods have been utilized, although their accuracy in determining specific requirement and metric values is uncertain. In industry, customers are surveyed, and business plans are created through a lengthy, iterative process. In recent years, techniques have been developed for predicting the characteristics of US travel demand based on travel mode attributes, such as door-to-door time and ticket price. As of yet, these models treat the contributing systems---aircraft manufacturers and service providers---as independently variable assumptions. In this research, a methodology is developed that seeks to build a strategic design decision-making environment through the construction of a system-of-systems model. The demonstrated implementation brings together models of the aircraft and manufacturer, the service provider, and, most importantly, the travel demand. Thus represented is the behavior of the consumers and the reactive behavior of the suppliers---the manufacturers and transportation service providers---in a common modeling framework. The results indicate an ability to guide the design process---specifically the selection of design requirements---through the optimization of "capability" metrics. Additionally, the results indicate the ability to find synergistic solutions, that is, solutions in which two systems might collaborate to achieve a better result than acting

  1. Health-related quality of life after TBI: a systematic review of study design, instruments, measurement properties, and outcome.

    PubMed

    Polinder, Suzanne; Haagsma, Juanita A; van Klaveren, David; Steyerberg, Ewout W; van Beeck, Ed F

    2015-01-01

    Measurement of health-related quality of life (HRQL) is essential to quantify the subjective burden of traumatic brain injury (TBI) in survivors. We performed a systematic review of HRQL studies in TBI to evaluate study design, instruments used, methodological quality, and outcome. Fifty-eight studies were included, showing large variation in HRQL instruments and assessment time points used. The Short Form-36 (SF-36) was most frequently used. A high prevalence of health problems during and after the first year of TBI was a common finding of the studies included. In the long term, patients with a TBI still showed large deficits from full recovery compared to population norms. Positive results for internal consistency and interpretability of the SF-36 were reported in validity studies. The Quality of Life after Brain Injury instrument (QOLIBRI), European Brain Injury Questionnaire (EBIQ), Child Health Questionnaire (CHQ), and the World Health Organization Quality of Life short version (WHOQOL-BREF) showed positive results, but evidence was limited. Meta-analysis of SF-36 showed that TBI outcome is heterogeneous, encompassing a broad spectrum of HRQL, with most problems reported in the physical, emotional, and social functioning domain. The use of SF-36 in combination with a TBI-specific instrument, i.e., QOLIBRI, seems promising. Consensus on preferred methodologies of HRQL measurement in TBI would facilitate comparability across studies, resulting in improved insights in recovery patterns and better estimates of the burden of TBI.

  2. LOX/hydrocarbon rocket engine analytical design methodology development and validation. Volume 2: Appendices

    NASA Technical Reports Server (NTRS)

    Niiya, Karen E.; Walker, Richard E.; Pieper, Jerry L.; Nguyen, Thong V.

    1993-01-01

    This final report includes a discussion of the work accomplished during the period from Dec. 1988 through Nov. 1991. The objective of the program was to assemble existing performance and combustion stability models into a usable design methodology capable of designing and analyzing high-performance and stable LOX/hydrocarbon booster engines. The methodology was then used to design a validation engine. The capabilities and validity of the methodology were demonstrated using this engine in an extensive hot fire test program. The engine used LOX/RP-1 propellants and was tested over a range of mixture ratios, chamber pressures, and acoustic damping device configurations. This volume contains time domain and frequency domain stability plots which indicate the pressure perturbation amplitudes and frequencies from approximately 30 tests of a 50K thrust rocket engine using LOX/RP-1 propellants over a range of chamber pressures from 240 to 1750 psia with mixture ratios of from 1.2 to 7.5. The data is from test configurations which used both bitune and monotune acoustic cavities and from tests with no acoustic cavities. The engine had a length of 14 inches and a contraction ratio of 2.0 using a 7.68 inch diameter injector. The data was taken from both stable and unstable tests. All combustion instabilities were spontaneous in the first tangential mode. Although stability bombs were used and generated overpressures of approximately 20 percent, no tests were driven unstable by the bombs. The stability instrumentation included six high-frequency Kistler transducers in the combustion chamber, a high-frequency Kistler transducer in each propellant manifold, and tri-axial accelerometers. Performance data is presented, both characteristic velocity efficiencies and energy release efficiencies, for those tests of sufficient duration to record steady state values.

  3. Optimization of Electrospray Ionization by Statistical Design of Experiments and Response Surface Methodology: Protein-Ligand Equilibrium Dissociation Constant Determinations.

    PubMed

    Pedro, Liliana; Van Voorhis, Wesley C; Quinn, Ronald J

    2016-09-01

    Electrospray ionization mass spectrometry (ESI-MS) binding studies between proteins and ligands under native conditions require that instrumental ESI source conditions are optimized if relative solution-phase equilibrium concentrations between the protein-ligand complex and free protein are to be retained. Instrumental ESI source conditions that simultaneously maximize the relative ionization efficiency of the protein-ligand complex over free protein and minimize the protein-ligand complex dissociation during the ESI process and the transfer from atmospheric pressure to vacuum are generally specific for each protein-ligand system and should be established when an accurate equilibrium dissociation constant (KD) is to be determined via titration. In this paper, a straightforward and systematic approach for ESI source optimization is presented. The method uses statistical design of experiments (DOE) in conjunction with response surface methodology (RSM) and is demonstrated for the complexes between Plasmodium vivax guanylate kinase (PvGK) and two ligands: 5'-guanosine monophosphate (GMP) and 5'-guanosine diphosphate (GDP). It was verified that even though the ligands are structurally similar, the most appropriate ESI conditions for KD determination by titration are different for each.
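A minimal sketch of the DOE/RSM idea (not the authors' code): fit a full quadratic response surface to a 3x3 face-centred design in two coded source parameters, then locate its stationary point. The factor names and response values are synthetic:

```python
import numpy as np

# Two coded ESI source factors on a 3x3 face-centred design, e.g.
# capillary voltage and cone voltage scaled to [-1, 1] (hypothetical names):
X = np.array([(x, y) for x in (-1.0, 0.0, 1.0) for y in (-1.0, 0.0, 1.0)])

# Synthetic responses from a surface that peaks at (0.2, -0.4):
z = 1.0 - (X[:, 0] - 0.2) ** 2 - 0.5 * (X[:, 1] + 0.4) ** 2

# Full quadratic RSM model: b0 + b1*x + b2*y + b3*x^2 + b4*y^2 + b5*x*y
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0] ** 2, X[:, 1] ** 2, X[:, 0] * X[:, 1]])
b, *_ = np.linalg.lstsq(A, z, rcond=None)

# Stationary point of the fitted surface: solve the 2x2 gradient system
H = np.array([[2.0 * b[3], b[5]], [b[5], 2.0 * b[4]]])
optimum = np.linalg.solve(H, [-b[1], -b[2]])
print("fitted optimum (coded units):", optimum.round(3))
```

In a real titration study the response would be a measured quantity such as relative complex-ion intensity, and checking that the Hessian is negative definite confirms the stationary point is a maximum.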

  4. Methodology for the design of accelerated stress tests for non-precious metal catalysts in fuel cell cathodes

    NASA Astrophysics Data System (ADS)

    Sharabi, Ronit; Wijsboom, Yair Haim; Borchtchoukova, Nino; Finkelshtain, Gennadi; Elbaz, Lior

    2016-12-01

    In this work we propose systematic methods for testing the degradation of non-precious-group-metal catalysts and supports in alkaline fuel cell cathodes. In this case study, we used a cathode composed of a pyrolyzed non-precious metal catalyst (NPMC) on activated carbon. The vulnerabilities of the cathode components were studied in order to develop the methodology and design an accelerated stress test (AST) for NPMC-based cathodes in an alkaline environment. Cyclic voltammetry (CV), chronoamperometry (CA), and electrochemical impedance spectroscopy (EIS) were used to characterize the electrochemical behavior of the cathode and to follow the changes that occur as a result of exposing the cathodes to extreme operating conditions. A rotating ring-disk electrode (RRDE) was used to study the cathode kinetics; Raman spectroscopy and X-ray fluorescence (XRF) were used to study the structural changes at the electrode surface as well as depletion of the catalyst's active sites from the electrode. Changes in the composition of the electrode and catalyst were detected using X-ray diffraction (XRD). For the first time, we show that NPMCs degrade rapidly at low operating potentials whereas the support degrades at high operating potentials, and we developed a tailor-made AST to take these findings into account.

  5. Optimization of Electrospray Ionization by Statistical Design of Experiments and Response Surface Methodology: Protein-Ligand Equilibrium Dissociation Constant Determinations

    NASA Astrophysics Data System (ADS)

    Pedro, Liliana; Van Voorhis, Wesley C.; Quinn, Ronald J.

    2016-09-01

    Electrospray ionization mass spectrometry (ESI-MS) binding studies between proteins and ligands under native conditions require that instrumental ESI source conditions are optimized if relative solution-phase equilibrium concentrations between the protein-ligand complex and free protein are to be retained. Instrumental ESI source conditions that simultaneously maximize the relative ionization efficiency of the protein-ligand complex over free protein and minimize the protein-ligand complex dissociation during the ESI process and the transfer from atmospheric pressure to vacuum are generally specific for each protein-ligand system and should be established when an accurate equilibrium dissociation constant (KD) is to be determined via titration. In this paper, a straightforward and systematic approach for ESI source optimization is presented. The method uses statistical design of experiments (DOE) in conjunction with response surface methodology (RSM) and is demonstrated for the complexes between Plasmodium vivax guanylate kinase (PvGK) and two ligands: 5'-guanosine monophosphate (GMP) and 5'-guanosine diphosphate (GDP). It was verified that even though the ligands are structurally similar, the most appropriate ESI conditions for KD determination by titration are different for each.

  6. New Methodology of Designing Inexpensive Hybrid Control-Acquisition Systems for Mechatronic Constructions

    PubMed Central

    Augustyn, Jacek

    2013-01-01

    This article presents a new methodology for designing a hybrid control and acquisition system consisting of a 32-bit SoC microsystem connected via a direct Universal Serial Bus (USB) link with a standard commercial off-the-shelf (COTS) component running the Android operating system. It is proposed to use this direct connection, avoiding an additional converter. An Android-based component was chosen to explore the potential for a mobile, compact and energy-efficient solution with easy-to-build user interfaces and easy wireless integration with other computer systems. This paper presents the results of a practical implementation and an analysis of experimental real-time performance. It covers the closed control loop time between the sensor/actuator module and the Android operating system as well as the real-time sensor data stream within such a system. Some optimisations are proposed and their influence on real-time performance is investigated. The proposed methodology is intended for acquisition and control of mechatronic systems, especially mobile robots. It can be used in a wide range of control applications as well as embedded acquisition-recording devices, including energy quality measurements, smart grids and medicine. It is demonstrated that the proposed methodology can be employed without developing specific device drivers. The latency achieved was less than 0.5 ms and the sensor data stream throughput was on the order of 750 KB/s (compared to 3 ms latency and 300 KB/s in traditional solutions). PMID:24351633

  7. An Introduction to Methodological Issues When Including Non-Randomised Studies in Systematic Reviews on the Effects of Interventions

    ERIC Educational Resources Information Center

    Reeves, Barnaby C.; Higgins, Julian P. T.; Ramsay, Craig; Shea, Beverley; Tugwell, Peter; Wells, George A.

    2013-01-01

    Background: Methods need to be further developed to include non-randomised studies (NRS) in systematic reviews of the effects of health care interventions. NRS are often required to answer questions about harms and interventions for which evidence from randomised controlled trials (RCTs) is not available. Methods used to review randomised…

  8. Meta-Design. An Approach to the Development of Design Methodologies

    DTIC Science & Technology

    1990-01-01

    subject to the constraints. The KKT conditions are necessary conditions for a particular value X* of the vector of design variables X to be a... optimization problem a second time. We start by applying the requirement that optimality is to be maintained, so we must also satisfy the third KKT condition... optimal solution to this problem must satisfy the third KKT condition: ∂f/∂c1 + λ ∂g/∂c1 = 2(c1/10 - 1)(1/10) + λ1 = 0. Then λ1 = (1/5)(1 - c1/10
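The fragmentary snippet above amounts to a KKT stationarity check for an objective f(c1) = (c1/10 - 1)^2 with a linear constraint. A minimal numeric check, under the assumption (not stated in the snippet) that the constraint has the form g(c1) = c1 - b <= 0 with a hypothetical bound b:

```python
def f(c1):
    """Objective from the snippet: f(c1) = (c1/10 - 1)**2."""
    return (c1 / 10.0 - 1.0) ** 2

def stationarity_residual(c1, lam, h=1e-6):
    """df/dc1 (central difference) + lam * dg/dc1, with g(c1) = c1 - b
    so that dg/dc1 = 1.  A zero residual means KKT stationarity holds."""
    df = (f(c1 + h) - f(c1 - h)) / (2.0 * h)
    return df + lam

b = 5.0                               # hypothetical constraint bound
lam = (1.0 / 5.0) * (1.0 - b / 10.0)  # the multiplier from the snippet
print(abs(stationarity_residual(b, lam)))  # ~0 at the active constraint
```

Note that lam >= 0 only for b <= 10, i.e. the multiplier is meaningful exactly when the constraint binds the unconstrained minimizer c1 = 10.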

  9. Novel DPT methodology co-optimized with design rules for sub-20nm device

    NASA Astrophysics Data System (ADS)

    Lee, Hyun-Jong; Choi, Soo-Han; Yang, Jae-Seok; Chun, Kwan-Young; Do, Jeong-ho; Park, Chul-Hong

    2012-11-01

    Because extreme ultraviolet (EUV) lithography is not ready, owing to technical challenges and low throughput, we face severe limitations in sub-20nm node patterning even though extreme resolution enhancement technologies (RET), such as off-axis illumination and computational lithography, have been used to achieve a sufficient process window and critical dimension uniformity (CDU). As an alternative solution, double patterning technology (DPT) has become the essential patterning scheme for the sub-20nm technology node. DPT requires complex design rules because they must take layout decomposability into two masks into account. In order to improve CDU and to achieve both design rule simplicity and better designability, we propose two layout decomposition methodologies in this paper: 1) a new mandrel decomposition of the Fin generation, for better uniformity, and 2) chip-level decomposition and a colorless design rule of the contact, to improve scalability. The co-optimized design rules, decomposition method and process requirements enable us to obtain about 6% scaling benefit in comparison with a normal DPT flow. These DPT approaches provide benefits for both process and design.

  10. Proposal of a methodology for the design of offshore wind farms

    NASA Astrophysics Data System (ADS)

    Esteban, Dolores; Diez, J. Javier; Santos Lopez, J.; Negro, Vicente

    2010-05-01

    Installed offshore wind power capacity is still very scarce, with only 1,500 megawatts in operation in the middle of 2009. Although the first offshore wind farm experiment took place in 1990, the facilities built up to now have been mainly pilot projects, which confirms the incipient state of offshore wind power. At present, however, this technology is being strongly promoted, especially by the governments of some countries, such as the United Kingdom and Germany, above all because of general commitments made to reduce the emission of greenhouse gases. All of these factors point to a promising future for offshore wind power. Nevertheless, no general methodology has yet been established for the design and management of this kind of installation. This paper includes some of the results of a research project consisting of the elaboration of a methodology to enable the optimization of the global process of the operations leading to the implantation of offshore wind facilities. The proposed methodology allows the planning of offshore wind projects according to an integral management policy, enabling not only the technical and financial feasibility of the offshore wind project to be achieved, but also respect for the environment. For that, it has been necessary to take into account multiple factors, including the territory, the terrain, the physical-chemical properties of the contact area between the atmosphere and the ocean, the dynamics resulting in both as a consequence of the Earth's behaviour as a heat machine, external geodynamics, internal geodynamics, planetary dynamics, biocoenosis, the legislative and financial framework, human activities, wind turbines, met masts, electric substations and lines, foundations, logistics and the project's financial profitability. For its validation, this methodology has been applied to different offshore wind farms in operation.

  11. Toward a systematic design theory for silicon solar cells using optimization techniques

    NASA Technical Reports Server (NTRS)

    Misiakos, K.; Lindholm, F. A.

    1986-01-01

    This work is a first detailed attempt to systematize the design of silicon solar cells. Design principles follow from three theorems. Although the results hold only under low injection conditions in base and emitter regions, they hold for arbitrary doping profiles and include the effects of drift fields, high/low junctions and heavy doping concentrations of donor or acceptor atoms. Several optimal designs are derived from the theorems, one of which involves a three-dimensional morphology in the emitter region. The theorems are derived from a nonlinear differential equation of the Riccati form, the dependent variable of which is a normalized recombination particle current.
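A Riccati equation has the general form y' = q0(x) + q1(x)·y + q2(x)·y², nonlinear through the quadratic term. The sketch below integrates an illustrative constant-coefficient instance (not the paper's device equation) and checks it against the known closed-form solution:

```python
import math

def rk4(rhs, x0, y0, x1, n=1000):
    """Classical fourth-order Runge-Kutta integration of y' = rhs(x, y)."""
    h = (x1 - x0) / n
    x, y = x0, y0
    for _ in range(n):
        k1 = rhs(x, y)
        k2 = rhs(x + h / 2.0, y + h * k1 / 2.0)
        k3 = rhs(x + h / 2.0, y + h * k2 / 2.0)
        k4 = rhs(x + h, y + h * k3)
        y += h * (k1 + 2.0 * k2 + 2.0 * k3 + k4) / 6.0
        x += h
    return y

# Riccati form y' = q0 + q1*y + q2*y**2 with q0 = 1, q1 = 0, q2 = -1,
# i.e. y' = 1 - y**2, y(0) = 0, whose exact solution is y = tanh(x):
y_num = rk4(lambda x, y: 1.0 - y * y, 0.0, 0.0, 1.0)
print(y_num, math.tanh(1.0))
```

In the paper's setting the dependent variable is a normalized recombination particle current rather than this toy example, but the same quadratic nonlinearity is what makes closed-form design theorems, rather than case-by-case numerics, valuable.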

  12. All-dielectric structure development for electromagnetic wave shielding using a systematic design approach

    NASA Astrophysics Data System (ADS)

    Shin, H.; Heo, N.; Park, J.; Seo, I.; Yoo, J.

    2017-01-01

    Common dielectric metamaterials for electromagnetic (EM) interference shielding, stealth applications, and EM cloaking generally require thicknesses larger than the wavelength of the incident light. We propose an all-dielectric, metamaterial-inspired structure designed using a systematic approach based on the phase field design method. The structure is composed of periodically arranged unit structures with a 2D configuration and is sub-wavelength in thickness over its entire extent. The proposed structure provides anomalous reflections that prevent reflection back toward the wave source, and it is anti-penetrative over the microwave band with no conductive materials. We digitally fabricated the designed structure using 3D printing and verified the design specifications experimentally.

  13. The Study of the Relationship between Probabilistic Design and Axiomatic Design Methodology. Volume 3

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chin-Yere; Onyebueke, Landon

    1996-01-01

    Structural failure is rarely a "sudden death" type of event; such sudden failures may occur only under abnormal loadings like bomb or gas explosions and very strong earthquakes. In most cases, structures fail due to damage accumulated under normal loadings such as wind loads and dead and live loads. The consequence of cumulative damage affects the reliability of the surviving components and may finally cause collapse of the system. The cumulative damage effects on system reliability under time-invariant loadings are of practical interest in structural design and are therefore investigated in this study. The scope of this study is, however, restricted to the consideration of damage accumulation as the increase in the number of failed components due to the violation of their strength limits.

  14. Application of Adjoint Methodology to Supersonic Aircraft Design Using Reversed Equivalent Areas

    NASA Technical Reports Server (NTRS)

    Rallabhandi, Sriram K.

    2013-01-01

    This paper presents an approach to shaping an aircraft to meet equivalent-area-based objectives using the discrete adjoint approach. Equivalent areas can be obtained either by using the reversed augmented Burgers equation or by direct conversion of off-body pressures into equivalent areas. Formal coupling with CFD allows computation of the sensitivities of equivalent-area objectives with respect to aircraft shape parameters. The exactness of the adjoint sensitivities is verified against derivatives obtained using the complex-step approach. This methodology has the benefit of using designer-friendly equivalent areas in the shape design of low-boom aircraft. Shape optimization results with equivalent-area cost functionals are discussed and further refined using ground-loudness-based objectives.
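    The complex-step approach mentioned for verifying adjoint sensitivities is easy to sketch: for an analytic function, Im[f(x + ih)]/h recovers the derivative without subtractive cancellation, so the step h can be made tiny. A minimal illustration with a stand-in cost functional (not the paper's actual objective) follows:

```python
import cmath

def f(x):
    # Stand-in for a smooth scalar cost functional J(shape parameter)
    return cmath.exp(x) * cmath.sin(x)

def complex_step(func, x, h=1e-30):
    # Derivative via Im[f(x + ih)] / h: no subtractive cancellation,
    # so the step can be taken absurdly small
    return func(complex(x, h)).imag / h

def central_diff(func, x, h=1e-6):
    # Conventional central difference for comparison
    return (func(x + h).real - func(x - h).real) / (2 * h)

x0 = 0.7
exact = (cmath.exp(x0) * (cmath.sin(x0) + cmath.cos(x0))).real
print(abs(complex_step(f, x0) - exact))  # essentially machine precision
print(abs(central_diff(f, x0) - exact))  # limited by truncation and rounding
```

    The same check generalizes to any differentiable code path, which is why the complex step serves as a reference for adjoint exactness.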

  15. A hybrid design methodology for structuring an Integrated Environmental Management System (IEMS) for shipping business.

    PubMed

    Celik, Metin

    2009-03-01

    The International Safety Management (ISM) Code defines a broad framework for the safe management and operation of merchant ships, maintaining high standards of safety and environmental protection. On the other hand, ISO 14001:2004 provides a generic, worldwide environmental management standard that has been utilized by several industries. Both the ISM Code and ISO 14001:2004 have the practical goal of establishing a sustainable Integrated Environmental Management System (IEMS) for shipping businesses. This paper presents a hybrid design methodology that shows how requirements from both standards can be combined into a single execution scheme. Specifically, the Analytic Hierarchy Process (AHP) and Fuzzy Axiomatic Design (FAD) are used to structure an IEMS for ship management companies. This research provides decision aid to maritime executives in order to enhance the environmental performance in the shipping industry.
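    As background on the AHP step, priorities are commonly extracted from a pairwise-comparison matrix via its principal eigenvector; the geometric-mean row approximation is a standard shortcut. The matrix and criteria below are invented for illustration and are not taken from the paper:

```python
import math

# Hypothetical pairwise-comparison matrix over three IEMS criteria,
# e.g. safety, environmental protection, cost (values invented)
A = [[1.0, 3.0, 5.0],
     [1 / 3, 1.0, 2.0],
     [1 / 5, 1 / 2, 1.0]]

def ahp_weights(matrix):
    # Geometric-mean row approximation to the principal eigenvector
    gm = [math.prod(row) ** (1 / len(row)) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

w = ahp_weights(A)
print([round(v, 3) for v in w])  # normalized weights, ordered by importance
```

    In a full AHP exercise one would also compute a consistency ratio before accepting the weights; that check is omitted here for brevity.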

  16. Piloted Evaluation of an Integrated Methodology for Propulsion and Airframe Control Design

    NASA Technical Reports Server (NTRS)

    Bright, Michelle M.; Simon, Donald L.; Garg, Sanjay; Mattern, Duane L.; Ranaudo, Richard J.; Odonoghue, Dennis P.

    1994-01-01

    An integrated methodology for propulsion and airframe control has been developed and evaluated for a Short Take-Off Vertical Landing (STOVL) aircraft using a fixed base flight simulator at NASA Lewis Research Center. For this evaluation the flight simulator is configured for transition flight using a STOVL aircraft model, a full nonlinear turbofan engine model, simulated cockpit and displays, and pilot effectors. The paper provides a brief description of the simulation models, the flight simulation environment, the displays and symbology, the integrated control design, and the piloted tasks used for control design evaluation. In the simulation, the pilots successfully completed typical transition phase tasks such as combined constant deceleration with flight path tracking, and constant acceleration wave-off maneuvers. The pilot comments of the integrated system performance and the display symbology are discussed and analyzed to identify potential areas of improvement.

  17. Application of Adjoint Methodology in Various Aspects of Sonic Boom Design

    NASA Technical Reports Server (NTRS)

    Rallabhandi, Sriram K.

    2014-01-01

    One of the advances in computational design has been the development of adjoint methods allowing efficient calculation of sensitivities in gradient-based shape optimization. This paper discusses two new applications of adjoint methodology that have been developed to aid in sonic boom mitigation exercises. In the first, equivalent area targets are generated using adjoint sensitivities of selected boom metrics. These targets may then be used to drive the vehicle shape during optimization. The second application is the computation of adjoint sensitivities of boom metrics on the ground with respect to parameters such as flight conditions, propagation sampling rate, and selected inputs to the propagation algorithms. These sensitivities enable the designer to make more informed selections of flight conditions at which the chosen cost functionals are less sensitive.

  18. Human factors analysis and design methods for nuclear waste retrieval systems. Human factors design methodology and integration plan

    SciTech Connect

    Casey, S.M.

    1980-06-01

    The purpose of this document is to provide an overview of the recommended activities and methods to be employed by a team of human factors engineers during the development of a nuclear waste retrieval system. This system, as it is presently conceptualized, is intended to be used for the removal of storage canisters (each canister containing a spent fuel rod assembly) located in an underground salt bed depository. This document, and the others in this series, have been developed for the purpose of implementing human factors engineering principles during the design and construction of the retrieval system facilities and equipment. The methodology presented has been structured around a basic systems development effort involving preliminary development, equipment development, personnel subsystem development, and operational test and evaluation. Within each of these phases, the recommended activities of the human engineering team have been stated, along with descriptions of the human factors engineering design techniques applicable to the specific design issues. Explicit examples of how the techniques might be used in the analysis of human tasks and equipment required in the removal of spent fuel canisters have been provided. Only those techniques having possible relevance to the design of the waste retrieval system have been reviewed. This document is intended to provide the framework for integrating human engineering with the rest of the system development effort. The activities and methodologies reviewed in this document have been discussed in the general order in which they will occur, although the time frame (the total duration of the development program in years and months) in which they should be performed has not been discussed.

  19. Methodology for Sensitivity Analysis, Approximate Analysis, and Design Optimization in CFD for Multidisciplinary Applications

    NASA Technical Reports Server (NTRS)

    Taylor, Arthur C., III; Hou, Gene W.

    1996-01-01

    An incremental iterative formulation, together with the well-known spatially split approximate-factorization algorithm, is presented for solving the large, sparse systems of linear equations associated with aerodynamic sensitivity analysis. This formulation is also known as the 'delta' or 'correction' form. For smaller two-dimensional problems, a direct method can be applied to solve these linear equations in either the standard or the incremental form, in which case the two are equivalent. However, iterative methods are needed for larger two-dimensional and three-dimensional applications because direct methods require more computer memory than is currently available. Iterative methods for solving these equations in the standard form are generally unsatisfactory due to an ill-conditioned coefficient matrix; this problem is overcome when the equations are cast in the incremental form. The methodology is successfully implemented and tested using an upwind cell-centered finite-volume formulation applied in two dimensions to the thin-layer Navier-Stokes equations for external flow over an airfoil. In three dimensions the methodology is demonstrated with a marching-solution algorithm for the Euler equations to calculate supersonic flow over the High-Speed Civil Transport configuration (HSCT 24E). The sensitivity derivatives obtained with the incremental iterative method from a marching Euler code are used in a design-improvement study of the HSCT configuration that involves thickness, camber, and planform design variables.
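    The incremental ('delta') form can be sketched in a few lines: rather than iterating on A x = b directly, one repeatedly solves M dx = r for a correction, with r = b - A x the residual and M an easily inverted approximation of A. The toy system below (a diagonally dominant matrix with a Jacobi-style splitting) is illustrative only and far simpler than the CFD operators in the paper:

```python
# Incremental ("delta"/correction) form: update x <- x + dx, where the
# correction solves M dx = b - A x and M approximates A.
n = 8
A = [[4.0 if i == j else 0.2 for j in range(n)] for i in range(n)]
b = [float(i + 1) for i in range(n)]

x = [0.0] * n
for _ in range(200):
    # Residual r = b - A x is the right-hand side of the delta form
    r = [b[i] - sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
    # M = diag(A): each correction component solves a scalar equation
    x = [x[i] + r[i] / A[i][i] for i in range(n)]

residual_norm = max(abs(b[i] - sum(A[i][j] * x[j] for j in range(n)))
                    for i in range(n))
print(residual_norm)  # driven toward zero as the corrections shrink
```

    The key property mirrored here is that the iteration works entirely on the residual, so the converged answer satisfies the original system exactly regardless of how crude M is, provided the iteration converges.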

  20. Assessment of current structural design methodology for high-temperature reactors based on failure tests

    SciTech Connect

    Corum, J.M.; Sartory, W.K.

    1985-01-01

    A mature design methodology, consisting of inelastic analysis methods provided in Department of Energy guidelines and failure criteria contained in ASME Code Case N-47, exists in the United States for high-temperature reactor components. The objective of this paper is to assess the adequacy of this overall methodology by comparing predicted inelastic deformations and lifetimes with observed results from structural failure tests and from an actual service failure. Comparisons are presented for three types of structural situations: (1) nozzle-to-spherical-shell specimens, where stresses at structural discontinuities lead to cracking; (2) welded structures, where metallurgical discontinuities play a key role in failures; and (3) thermal shock loadings of cylinders and pipes, where thermal discontinuities can lead to failure. The comparisons between predicted and measured inelastic responses are generally reasonably good; quantities are sometimes somewhat overpredicted and sometimes underpredicted. However, even seemingly small discrepancies can have a significant effect on structural life, and lifetimes are not always as closely predicted. For a few cases, the lifetimes are substantially overpredicted, which raises questions regarding the adequacy of existing design margins.

  1. Integrated active and passive control design methodology for the LaRC CSI evolutionary model

    NASA Technical Reports Server (NTRS)

    Voth, Christopher T.; Richards, Kenneth E., Jr.; Schmitz, Eric; Gehling, Russel N.; Morgenthaler, Daniel R.

    1994-01-01

    A general design methodology to integrate active control with passive damping was demonstrated on the NASA LaRC CSI Evolutionary Model (CEM), a ground testbed for future large, flexible spacecraft. Vibration suppression controllers designed for Line-of-Sight (LOS) minimization were successfully implemented on the CEM. A frequency-shaped H2 methodology was developed, allowing the designer to specify the roll-off of the MIMO compensator. A closed-loop bandwidth of 4 Hz, including the six rigid-body modes and the first three dominant elastic modes of the CEM, was achieved. Good agreement was demonstrated between experimental data and analytical predictions for the closed-loop frequency response and random tests. Using the Modal Strain Energy (MSE) method, a passive damping treatment consisting of 60 viscoelastically damped struts was designed, fabricated, and implemented on the CEM. Damping levels for the targeted modes were more than an order of magnitude larger than for the undamped structure. Using measured loss and stiffness data for the individual damped struts, analytical predictions of the damping levels were very close to the experimental values in the 1-10 Hz frequency range, where the open-loop model matched the experimental data. An integrated active/passive controller was successfully implemented on the CEM and was evaluated against an active-only controller. A two-fold increase in the effective control bandwidth and further reductions of 30 to 50 percent in the LOS RMS outputs were achieved compared to the active-only controller. Superior performance was also obtained compared to a High-Authority/Low-Authority (HAC/LAC) controller.

  2. Modeling and Design Analysis Methodology for Tailoring of Aircraft Structures with Composites

    NASA Technical Reports Server (NTRS)

    Rehfield, Lawrence W.

    2004-01-01

    Composite materials provide design flexibility in that fiber placement and orientation can be specified and a variety of material forms and manufacturing processes are available. It is possible, therefore, to 'tailor' the structure to a high degree in order to meet specific design requirements in an optimum manner. Common industrial practices, however, have limited the choices designers make. One of the reasons for this is the dearth of conceptual/preliminary design analysis tools specifically devoted to identifying structural concepts for composite airframe structures. Large-scale finite element simulations are not suitable for such purposes. The present project has been devoted to creating modeling and design analysis methodology for use in the tailoring process of aircraft structures. Emphasis has been given to creating bend-twist elastic coupling in high-aspect-ratio wings and other lifting surfaces. The direction of our work was in concert with the overall NASA effort Twenty-First Century Aircraft Technology (TCAT). A multi-disciplinary team was assembled by Dr. Damodar Ambur to work on wing technology, which included our project.

  3. A robust rotorcraft flight control system design methodology utilizing quantitative feedback theory

    NASA Technical Reports Server (NTRS)

    Gorder, Peter James

    1993-01-01

    Rotorcraft flight control systems present design challenges which often exceed those associated with fixed-wing aircraft. First, large variations in the response characteristics of the rotorcraft result from the wide range of airspeeds of typical operation (hover to over 100 kts). Second, the assumption of vehicle rigidity often employed in the design of fixed-wing flight control systems is rarely justified in rotorcraft, where rotor degrees of freedom can have a significant impact on system performance and stability. This research was intended to develop a methodology for the design of robust rotorcraft flight control systems. Quantitative Feedback Theory (QFT) was chosen as the basis for the investigation. Quantitative Feedback Theory is a technique which accounts for variability in the dynamic response of the controlled element in the design of robust control systems. It was developed to address a Multiple-Input Single-Output (MISO) design problem, and utilizes two degrees of freedom to satisfy the design criteria. Two techniques were examined for extending the QFT MISO technique to the design of a Multiple-Input Multiple-Output (MIMO) flight control system (FCS) for a UH-60 Black Hawk helicopter. In the first, a set of MISO systems, mathematically equivalent to the MIMO system, was determined, and QFT was applied to each member of the set simultaneously. In the second, the same set of equivalent MISO systems was analyzed sequentially, with closed-loop response information from each loop utilized in subsequent MISO designs. The results of each technique were compared, and the advantages of the second, termed Sequential Loop Closure, were clearly evident.

  4. Towards a methodology for cluster searching to provide conceptual and contextual “richness” for systematic reviews of complex interventions: case study (CLUSTER)

    PubMed Central

    2013-01-01

    Background Systematic review methodologies can be harnessed to help researchers to understand and explain how complex interventions may work. Typically, when reviewing complex interventions, a review team will seek to understand the theories that underpin an intervention and the specific context for that intervention. A single published report from a research project does not typically contain this required level of detail. A review team may find it more useful to examine a “study cluster”; a group of related papers that explore and explain various features of a single project and thus supply necessary detail relating to theory and/or context. We sought to conduct a preliminary investigation, from a single case study review, of techniques required to identify a cluster of related research reports, to document the yield from such methods, and to outline a systematic methodology for cluster searching. Methods In a systematic review of community engagement we identified a relevant project – the Gay Men’s Task Force. From a single “key pearl citation” we conducted a series of related searches to find contextually or theoretically proximate documents. We followed up Citations, traced Lead authors, identified Unpublished materials, searched Google Scholar, tracked Theories, undertook ancestry searching for Early examples and followed up Related projects (embodied in the CLUSTER mnemonic). Results Our structured, formalised procedure for cluster searching identified useful reports that are not typically identified from topic-based searches on bibliographic databases. Items previously rejected by an initial sift were subsequently found to inform our understanding of underpinning theory (for example Diffusion of Innovations Theory), context or both. Relevant material included book chapters, a Web-based process evaluation, and peer reviewed reports of projects sharing a common ancestry. We used these reports to understand the context for the intervention and to

  5. Integrated controls-structures design methodology development for a class of flexible spacecraft

    NASA Technical Reports Server (NTRS)

    Maghami, P. G.; Joshi, S. M.; Walz, J. E.; Armstrong, E. S.

    1990-01-01

    Future utilization of space will require large space structures in low-Earth and geostationary orbits. Example missions include: Earth observation systems, personal communication systems, space science missions, space processing facilities, etc., requiring large antennas, platforms, and solar arrays. The dimensions of such structures will range from a few meters to possibly hundreds of meters. For reducing the cost of construction, launching, and operating (e.g., energy required for reboosting and control), it will be necessary to make the structure as light as possible. However, reducing structural mass tends to increase the flexibility which would make it more difficult to control with the specified precision in attitude and shape. Therefore, there is a need to develop a methodology for designing space structures which are optimal with respect to both structural design and control design. In the current spacecraft design practice, it is customary to first perform the structural design and then the controller design. However, the structural design and the control design problems are substantially coupled and must be considered concurrently in order to obtain a truly optimal spacecraft design. For example, let C denote the set of the 'control' design variables (e.g., controller gains), and L the set of the 'structural' design variables (e.g., member sizes). If a structural member thickness is changed, the dynamics would change which would then change the control law and the actuator mass. That would, in turn, change the structural model. Thus, the sets C and L depend on each other. Future space structures can be roughly divided into four mission classes. Class 1 missions include flexible spacecraft with no articulated appendages which require fine attitude pointing and vibration suppression (e.g., large space antennas). 
Class 2 missions consist of flexible spacecraft with articulated multiple payloads, where the requirement is to fine-point the spacecraft and each

  6. Development of a design methodology for pipelines in ice scoured seabeds

    SciTech Connect

    Clark, J.I.; Paulin, M.J.; Lach, P.R.; Yang, Q.S.; Poorooshasb, H.

    1994-12-31

    Large areas of the continental shelf of northern oceans are frequently scoured or gouged by moving bodies of ice such as icebergs and sea ice keels associated with pressure ridges. This phenomenon presents a formidable challenge when the route of a submarine pipeline is intersected by the scouring ice. It is generally acknowledged that if a pipeline, laid on the seabed, were hit by an iceberg or a pressure ridge keel, the forces imposed on the pipeline would be much greater than it could practically withstand. The pipeline must therefore be buried to avoid direct contact with ice, but it is very important, for both economic and environmental reasons, to determine with some assurance the minimum depth required for safety. The safe burial depth of a pipeline, however, cannot be determined directly from the relatively straightforward measurement of maximum scour depth. The major design consideration is the determination of the potential sub-scour deformation of the ice-scoured soil. Forces transmitted through the soil and soil displacement around the pipeline could load the pipeline to failure if not taken into account in the design. If the designer can predict the forces transmitted through the soil, the pipeline can be designed to withstand these external forces using conventional design practice. In this paper, the authors outline a design methodology that is based on phenomenological studies of ice-scoured terrain, both modern and relict, laboratory tests, centrifuge modeling, and numerical analysis. The implications of these studies, which could assist in the safe and economical design of pipelines in ice-scoured terrain, are also discussed.

  7. A systematic design approach for two planetary gear split hybrid vehicles

    NASA Astrophysics Data System (ADS)

    Liu, Jinming; Peng, Huei

    2010-11-01

    Multiple power sources in a hybrid vehicle allow for flexible vehicle power-train operations, but also impose kinematic constraints due to component characteristics. This paper presents a design process that enables systematic search and screening through all three major dimensions of hybrid vehicle design - system configuration, component sizing, and control - to achieve optimal performance while satisfying the imposed constraints. An automated dynamic modelling method is first developed which enables hybrid vehicle models to be constructed efficiently. A screening process then narrows the candidates down to configurations that satisfy drivability and operation constraints. Finally, a design and control optimisation strategy is carried out to obtain the best execution of each configuration. A case study on the design of a power-split hybrid vehicle with optimal fuel economy is used to demonstrate the overall design process.
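    The configuration/sizing screening idea can be caricatured in a few lines: enumerate candidates, discard those violating a drivability constraint, and rank the survivors by a fuel-economy surrogate. All component values and surrogate models below are invented for illustration and bear no relation to the paper's models:

```python
from itertools import product

# Invented candidate space: three split configurations, three engine sizes,
# three motor sizes (kW). The surrogate models below are toys.
configs = ["input-split", "output-split", "compound-split"]
engine_kw = [60, 80, 100]
motor_kw = [30, 50, 70]

def accel_time(engine, motor):
    # Toy drivability surrogate: 0-100 km/h time shrinks with total power
    return 2000.0 / (engine + motor)

def fuel_per_100km(config, engine, motor):
    # Toy fuel-economy surrogate with a per-configuration penalty
    penalty = {"input-split": 0.0, "output-split": 0.3, "compound-split": 0.6}
    return 3.0 + 0.02 * engine + 0.01 * motor + penalty[config]

# Screening: keep only candidates meeting the drivability constraint
feasible = [(c, e, m) for c, e, m in product(configs, engine_kw, motor_kw)
            if accel_time(e, m) <= 15.0]

# Ranking: best execution among the survivors
best = min(feasible, key=lambda t: fuel_per_100km(*t))
print(best)
```

    In the paper's process the surrogates are replaced by automatically generated dynamic models and an optimal control step, but the screen-then-rank structure is the same.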

  8. A Research Methodology for Green IT Systems Based on WSR and Design Science: The Case of a Chinese Company

    NASA Astrophysics Data System (ADS)

    Zhong, Yinghong; Liu, Hongwei

    Green IT is currently a hotspot in both practice and research. Much progress has been made on green technologies. However, researchers and designers cannot simply build up a green IT system from the technological aspect alone, which is normally considered a wicked problem. This paper puts forward a research methodology for green IT systems by introducing WSR and design science. The methodology absorbs its essence from soft systems methodology and action research. It considers the research, design, and building of green IT systems from a systemic perspective divided into technological, management, and human dimensions. The methodology consists of seven iterated stages; each stage is presented and then illustrated by a case study from a Chinese company.

  9. A Test Methodology for Determining Space-Readiness of Xilinx SRAM-Based FPGA Designs

    SciTech Connect

    Quinn, Heather M; Graham, Paul S; Morgan, Keith S; Caffrey, Michael P

    2008-01-01

    Using reconfigurable, static random-access memory (SRAM) based field-programmable gate arrays (FPGAs) for space-based computation has been an exciting area of research for the past decade. Since both the circuit and the circuit's state are stored in radiation-sensitive memory, both could be altered by the harsh space radiation environment. Both the circuit and the circuit's state can be protected by triple-modular redundancy (TMR), but applying TMR to FPGA user designs is often an error-prone process. Faulty application of TMR could cause the FPGA user circuit to output incorrect data. This paper describes a three-tiered methodology for testing FPGA user designs for space-readiness. We describe the standard approach to testing FPGA user designs using a particle accelerator, as well as two methods using fault injection and a modeling tool. While accelerator testing is the current 'gold standard' for pre-launch testing, we believe the use of fault injection and modeling tools allows for easy, cheap, and uniform access for discovering errors early in the design process.
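    The core of TMR is a bitwise majority voter across three replicas, and fault injection amounts to corrupting one replica and checking that the voted output is unaffected. The minimal software sketch below is a stand-in for the FPGA-level tools discussed in the paper:

```python
import random

def majority(a, b, c):
    # Bitwise majority voter: the core of triple-modular redundancy (TMR)
    return (a & b) | (b & c) | (a & c)

# Toy fault injection: corrupt one bit in one replica per trial and check
# that the voted output still equals the fault-free value
random.seed(1)
errors = 0
for _ in range(10_000):
    value = random.getrandbits(16)
    replicas = [value, value, value]
    replicas[random.randrange(3)] ^= 1 << random.randrange(16)  # single upset
    if majority(*replicas) != value:
        errors += 1
print(errors)  # 0: one upset can never outvote the two healthy replicas
```

    Real fault-injection campaigns corrupt configuration memory bits rather than data words, which is exactly where a faulty TMR application (e.g. a shared voter or unreplicated net) lets a single upset slip through.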

  10. Development of a decision-making methodology to design a water quality monitoring network.

    PubMed

    Keum, Jongho; Kaluarachchi, Jagath J

    2015-07-01

    The number of water quality monitoring stations in the USA has decreased over the past few decades. Scarcity of observations can easily produce prediction uncertainty due to unreliable model calibration. An effective water quality monitoring network is important not only for model calibration and water quality prediction but also for resources management. Redundant or improperly located monitoring stations may increase monitoring costs without improving the understanding of water quality in watersheds. In this work, a decision-making methodology is proposed to design a water quality monitoring network by providing an adequate number of monitoring stations and their approximate locations at the eight-digit hydrologic unit code (HUC8) scale. The proposed methodology is demonstrated with an example from the Upper Colorado River Basin (UCRB), where salinity is a serious concern. The level of monitoring redundancy or scarcity is defined by an index, the station ratio (SR), which represents monitoring density based on the water quality load originating within a subbasin. By comparing the number of stations implied by a selected target SR with the available number of stations, including both actual and potential stations, the suggested number of stations in each subbasin was decided. If monitoring stations are primarily located in low-salinity-loading subbasins, the average actual SR tends to increase, and vice versa. Results indicate that the spatial distribution of monitoring locations in 2011 is concentrated in low-salinity-loading subbasins; therefore, additional monitoring is required for the high-salinity-loading subbasins. The proposed methodology shows that the SR is a simple and practical indicator of monitoring density.
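    The station-ratio idea can be sketched as stations per unit of pollutant load generated within a subbasin, compared against a target density. Subbasin names, loads, and the target value below are all hypothetical:

```python
# Sketch of the station-ratio (SR) idea: monitoring density expressed as
# stations per unit of salinity load generated within a subbasin.
subbasins = {
    # name: (existing monitoring stations, salinity load in kt/year)
    "A": (4, 20.0),
    "B": (1, 80.0),
    "C": (2, 40.0),
}
target_sr = 0.05  # assumed design target, stations per kt/year

suggestions = {}
for name, (stations, load) in subbasins.items():
    sr = stations / load
    status = ("redundant" if sr > target_sr
              else "scarce" if sr < target_sr else "adequate")
    # Suggested station count scales with the load, not with current coverage
    suggestions[name] = (round(target_sr * load), status)

print(suggestions)
```

    The sketch reproduces the pattern described in the abstract: a heavily loaded but sparsely monitored subbasin ("B") is flagged for additional stations, while a lightly loaded, densely monitored one ("A") is flagged as redundant.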

  11. A Systematic Review on the Designs of Clinical Technology: Findings and Recommendations for Future Research

    PubMed Central

    Alexander, Greg; Staggers, Nancy

    2010-01-01

    Human factors (HF) studies are increasingly important as technology infuses into clinical settings. No nursing research reviews exist in this area. The authors conducted a systematic review on designs of clinical technology; 34 articles with 50 studies met the inclusion criteria. Findings were classified into three categories based on HF research goals. The majority of studies evaluated the effectiveness of clinical design; studies of efficiency were fewest. Current research ranges across many interface types examined with no apparent pattern or obvious rationale. Future research should expand types, settings, and participants; integrate displays; and expand outcome variables. PMID:19707093

  12. Methodology to improve design of accelerated life tests in civil engineering projects.

    PubMed

    Lin, Jing; Yuan, Yongbo; Zhou, Jilai; Gao, Jie

    2014-01-01

    For reliability testing, an Energy Expansion Tree (EET) and a companion Energy Function Model (EFM) are proposed and described in this paper. Unlike conventional approaches, the EET provides a more comprehensive and objective way to systematically identify external energy factors affecting reliability. The EFM introduces energy loss into a traditional Function Model to identify internal energy sources affecting reliability. The combination creates a sound way to enumerate the energies to which a system may be exposed during its lifetime. These energies are input into the planning of an accelerated life test, a Multi-Environment Over-Stress Test. The test objective is to discover weak links and interactions among the system and the energies to which it is exposed, and to design them out. As an example, the methods are applied to pipe in a subsea pipeline; however, they can be widely used in other civil engineering industries as well. The proposed method is compared with current methods.

  13. Designing multidisciplinary longitudinal studies of human development: analyzing past research to inform methodology.

    PubMed

    Shulruf, Boaz; Morton, Susan; Goodyear-Smith, Felicity; O'Loughlin, Claire; Dixon, Robyn

    2007-09-01

    This review identifies key issues associated with the design of future longitudinal studies of human development. Sixteen international studies were compared for initial response and retention rate, sample size, type of data collected, and sampling frames. The studies had little information about the influences of fathers, extended family members, childcare, and educational institutions; the effects of peers; children's use of time; the needs of disabled children; urban versus rural environments; or the influence of genetic factors. A contemporary longitudinal study should include measures of physical and mental health, cognitive capacity, educational attainment, social adjustment, conduct and behavior, resiliency, and risk-taking behaviors. It needs to address genetic and intergenerational factors, cultural identity, and the influences of neighborhood, community, and wider social and political environments and to encompass outcomes at all life stages to systematically determine the role each factor plays in individuals' lives, including interactions within and across variables.

  14. Methodology to Improve Design of Accelerated Life Tests in Civil Engineering Projects

    PubMed Central

    Lin, Jing; Yuan, Yongbo; Zhou, Jilai; Gao, Jie

    2014-01-01

    For reliability testing, an Energy Expansion Tree (EET) and a companion Energy Function Model (EFM) are proposed and described in this paper. Unlike conventional approaches, the EET provides a more comprehensive and objective way to systematically identify external energy factors affecting reliability. The EFM introduces energy loss into a traditional Function Model to identify internal energy sources affecting reliability. The combination creates a sound way to enumerate the energies to which a system may be exposed during its lifetime. These energies are input into the planning of an accelerated life test, a Multi-Environment Over-Stress Test. The test objective is to discover weak links and interactions among the system and the energies to which it is exposed, and to design them out. As an example, the methods are applied to pipe in a subsea pipeline; however, they can be widely used in other civil engineering industries as well. The proposed method is compared with current methods. PMID:25111800

  15. A probabilistic methodology for radar cross section prediction in conceptual aircraft design

    NASA Astrophysics Data System (ADS)

    Hines, Nathan Robert

    System effectiveness has increasingly become the prime metric for the evaluation of military aircraft. As such, it is the decision maker's/designer's goal to maximize system effectiveness. Industry and government research documents indicate that all future military aircraft will incorporate signature reduction as an attempt to improve system effectiveness and reduce the cost of attrition. Today's operating environments demand low observable aircraft which are able to reliably take out valuable, time critical targets. Thus it is desirable to be able to design vehicles that are balanced for increased effectiveness. Previous studies have shown that shaping of the vehicle is one of the most important contributors to radar cross section, a measure of radar signature, and must be considered from the very beginning of the design process. Radar cross section estimation should be incorporated into conceptual design to develop more capable systems. This research strives to meet these needs by developing a conceptual design tool that predicts radar cross section for parametric geometries. This tool predicts the absolute radar cross section of the vehicle as well as the impact of geometry changes, allowing for the simultaneous tradeoff of the aerodynamic, performance, and cost characteristics of the vehicle with the radar cross section. Furthermore, this tool can be linked to a campaign theater analysis code to demonstrate the changes in system and system of system effectiveness due to changes in aircraft geometry. A general methodology was developed and implemented and sample computer codes applied to prototype the proposed process. Studies utilizing this radar cross section tool were subsequently performed to demonstrate the capabilities of this method and show the impact that various inputs have on the outputs of these models. The F/A-18 aircraft configuration was chosen as a case study vehicle to perform a design space exercise and to investigate the relative impact of

  16. Assessment of an effective quasirelativistic methodology designed to study astatine chemistry in aqueous solution.

    PubMed

    Champion, Julie; Seydou, Mahamadou; Sabatié-Gogova, Andrea; Renault, Eric; Montavon, Gilles; Galland, Nicolas

    2011-09-07

    A cost-effective computational methodology designed to study astatine (At) chemistry in aqueous solution has been established. It is based on two-component spin-orbit density functional theory calculations and solvation calculations using the conductor-like polarizable continuum model in conjunction with specific astatine cavities. Theoretical calculations are confronted with experimental data measured for complexation reactions between metallic forms of astatine (At(+) and AtO(+)) and inorganic ligands (Cl(-), Br(-) and SCN(-)). For each reaction, both 1:1 and 1:2 complexes are evidenced. The experimental trends regarding the thermodynamic constants (K) can be reproduced qualitatively and quantitatively. The mean signed error on computed Log K values is -0.4, which corresponds to a mean signed error smaller than 1 kcal mol(-1) on free energies of reaction. Theoretical investigations show that the reactivity of cationic species of astatine is highly sensitive to spin-orbit coupling and solvent effects. At the moment, the presented computational methodology appears to be the only tool to gain an insight into astatine chemistry at a molecular level.
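    The stated correspondence between a mean signed error of -0.4 on log K and an error under 1 kcal mol(-1) on free energies follows from the relation dG = -RT ln K. A quick back-of-envelope check, assuming T = 298.15 K:

```python
import math

# An error d in log10 K maps to a free-energy error |ddG| = R*T*ln(10)*|d|,
# so -0.4 log units is comfortably below 1 kcal/mol at room temperature.
R = 1.987204e-3   # gas constant, kcal/(mol*K)
T = 298.15        # assumed temperature, K

def dG_error_from_logK_error(d_logK):
    return R * T * math.log(10) * abs(d_logK)

err = dG_error_from_logK_error(-0.4)
print(f"{err:.2f} kcal/mol")   # well under 1 kcal/mol
```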

  17. Design of a strong cation exchange methodology for the evaluation of charge heterogeneity in glatiramer acetate.

    PubMed

    Campos-García, Víctor R; López-Morales, Carlos A; Benites-Zaragoza, Eleuterio; Jiménez-Miranda, Armando; Espinosa-de la Garza, Carlos E; Herrera-Fernández, Daniel; Padilla-Calderón, Jesús; Pérez, Néstor O; Flores-Ortiz, Luis F; Medina-Rivero, E

    2017-01-05

    Complex pharmaceuticals demand competent analytical methods able to analyze charge heterogeneity as a critical quality attribute (CQA), in compliance with current regulatory expectations. A notable example is glatiramer acetate (GA), a complex polypeptide mixture used for the treatment of relapsing-remitting multiple sclerosis. This pharmaceutical challenges the current state of analytical technology in terms of the capacity to study its constituent species. Thus, a strong cation exchange methodology was designed under the lifecycle approach to support the establishment of GA identity, through the evaluation of its chromatographic profile, which acts as a charge heterogeneity fingerprint. In this regard, a maximum relative margin of error of 5% for relative retention time and symmetry factor was proposed for the analytical target profile. The methodology met the proposed requirements in precision and specificity test results, the former comprising sensitivity and selectivity. Subsequently, method validation was conducted and showed that the method is able to differentiate between intact GA and heterogeneity profiles coming from stressed, fractionated, or process-modified samples. In summary, these results provide evidence that the method is adequate to assess charge heterogeneity as a CQA of this complex pharmaceutical.

  18. Use of a qualitative methodological scaffolding process to design robust interprofessional studies.

    PubMed

    Wener, Pamela; Woodgate, Roberta L

    2013-07-01

    Increasingly, researchers are using qualitative methodology to study interprofessional collaboration (IPC). With this increase in use, there seems to be an appreciation for how qualitative studies allow us to understand the unique individual or group experience in more detail and form a basis for policy change and innovative interventions. Furthermore, there is an increased understanding of the potential of studying new or emerging phenomena qualitatively to inform further large-scale studies. Although there is a current trend toward greater acceptance of the value of qualitative studies describing the experiences of IPC, these studies are mostly descriptive in nature. Applying a process suggested by Crotty (1998) may encourage researchers to consider the value of situating research questions within a broader theoretical framework that informs the overall research approach, including methodology and methods. This paper describes the application of this process to a research project and then illustrates how the process encouraged iterative cycles of thinking and doing. The authors describe each step of the process, share decision-making points, and suggest an additional step to the process. Applying this approach to selecting data collection methods may serve to guide and support the qualitative researcher in creating a well-designed study.

  19. A methodology for the efficient integration of transient constraints in the design of aircraft dynamic systems

    NASA Astrophysics Data System (ADS)

    Phan, Leon L.

    The motivation behind this thesis mainly stems from previous work performed at Hispano-Suiza (Safran Group) in the context of the European research project "Power Optimised Aircraft". Extensive testing on the COPPER Bird RTM, a test rig designed to characterize aircraft electrical networks, demonstrated the relevance of transient regimes in the design and development of dynamic systems. Transient regimes experienced by dynamic systems may have severe impacts on the operation of the aircraft. For example, the switching on of a high electrical load might cause a network voltage drop inducing a loss of power available to critical aircraft systems. These transient behaviors are thus often regulated by dynamic constraints, requiring the dynamic signals to remain within bounds whose values vary with time. The verification of these peculiar types of constraints, which generally requires high-fidelity time-domain simulation, intervenes late in the system development process, thus potentially causing costly design iterations. The research objective of this thesis is to develop a methodology that integrates the verification of dynamic constraints in the early specification of dynamic systems. In order to circumvent the inefficiencies of time-domain simulation, multivariate dynamic surrogate models of the original time-domain simulation models are generated, building on a nonlinear system identification technique using wavelet neural networks (or wavenets), which allow the multiscale nature of transient signals to be captured. However, training multivariate wavenets can become computationally prohibitive as the number of design variables increases. Therefore, an alternate approach is formulated, in which dynamic surrogate models using sigmoid-based neural networks are used to emulate the transient behavior of the envelopes of the time-domain response. 
Thus, in order to train the neural network, the envelopes are extracted by first separating the scales of the dynamic response
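The envelope-extraction idea can be caricatured with a sliding-window maximum of the rectified response; the signal, decay rate, and window width below are invented for illustration and merely stand in for the scale-separation step described above.

```python
import math

# Caricature of envelope extraction: a sliding-window maximum of |x| applied
# to an invented decaying transient. Window width is an assumed parameter.
n, dt = 400, 1e-3
signal = [math.exp(-50 * i * dt) * math.sin(2 * math.pi * 50 * i * dt)
          for i in range(n)]

def upper_envelope(x, win=21):
    """Sliding-window maximum of |x|; crude but dominates the signal."""
    half = win // 2
    return [max(abs(v) for v in x[max(0, i - half):i + half + 1])
            for i in range(len(x))]

env = upper_envelope(signal)   # smooth bound that tracks the exponential decay
```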

  20. Systematic review of enriched enrolment, randomised withdrawal trial designs in chronic pain: a new framework for design and reporting.

    PubMed

    Moore, R Andrew; Wiffen, Philip J; Eccleston, Christopher; Derry, Sheena; Baron, Ralf; Bell, Rae F; Furlan, Andrea D; Gilron, Ian; Haroutounian, Simon; Katz, Nathaniel P; Lipman, Arthur G; Morley, Stephen; Peloso, Paul M; Quessy, Steve N; Seers, Kate; Strassels, Scott A; Straube, Sebastian

    2015-08-01

    Enriched enrolment, randomised withdrawal (EERW) pain trials select, before randomisation, patients who respond by demonstrating a predetermined degree of pain relief and acceptance of adverse events. There is uncertainty over the value of this design. We report a systematic review of EERW trials in chronic noncancer pain together with a critical appraisal of methods and potential biases in the methods used and recommendations for the design and reporting of future EERW trials. Electronic and other searches found 25 EERW trials published between 1995 and June 2014, involving 5669 patients in a randomised withdrawal phase comparing drug with placebo; 13 (median, 107 patients) had a randomised withdrawal phase of 6 weeks or less, and 12 (median, 334) lasted 12 to 26 weeks. Risks of bias included short duration, inadequate outcome definition, incomplete outcome data reporting, small size, and inadequate dose tapering on randomisation to placebo. Active treatment was usually better than placebo (22/25 trials). This review reduces the uncertainty around the value of EERW trials in pain. If properly designed, conducted, and reported, they are feasible and useful for making decisions about pain therapies. Shorter, small studies can be explanatory; longer, larger studies can inform practice. Current evidence is inadequate for valid comparisons in outcome between EERW and classical trials, although no gross differences were found. This systematic review provides a framework for assessing potential biases and the value of the EERW trials, and for the design of future studies by making recommendations for the conduct and reporting of EERW trials.

  1. Human factors analysis and design methods for nuclear waste retrieval systems: Human factors design methodology and integration plan

    NASA Astrophysics Data System (ADS)

    Casey, S. M.

    1980-06-01

    The nuclear waste retrieval system intended for the removal of storage canisters (each containing a spent fuel rod assembly) located in an underground salt bed depository is discussed. The implementation of human factors engineering principles during the design and construction of the retrieval system facilities and equipment is reported. The methodology is structured around a basic system development effort involving preliminary development, equipment development, personnel subsystem development, and operational test and evaluation. Examples of applying the techniques to the analysis of the human tasks and equipment required for removing spent fuel canisters are provided. The framework for integrating human engineering with the rest of the system development effort is documented.

  2. Mixed culture optimization for marigold flower ensilage via experimental design and response surface methodology.

    PubMed

    Navarrete-Bolaños, José Luis; Jiménez-Islas, Hugo; Botello-Alvarez, Enrique; Rico-Martínez, Ramiro

    2003-04-09

    Endogenous microorganisms isolated from the marigold flower (Tagetes erecta) were studied to understand the events taking place during its ensilage. Studies of the cellulase enzymatic activity and the ensilage process were undertaken. In both studies, the use of approximate second-order models and multiple linear regression, within the context of an experimental mixture design using the response surface methodology as the optimization strategy, determined that the microorganisms Flavobacterium IIb, Acinetobacter anitratus, and Rhizopus nigricans are the most significant in marigold flower ensilage and exhibit high cellulase activity. A mixed culture comprised of 9.8% Flavobacterium IIb, 41% A. anitratus, and 49.2% R. nigricans used during ensilage resulted in an increased yield of total xanthophylls extracted of 24.94 g/kg of dry weight compared with 12.92 for the uninoculated control ensilage.
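    A mixture-design optimization of this kind can be sketched with a Scheffe second-order model and a brute-force search over the mixture simplex. The coefficients below are invented for illustration, not the values fitted in the study.

```python
# Invented Scheffe second-order mixture model of the kind used in response
# surface methodology; the coefficients are NOT the study's fitted values.
b1, b2, b3, b12, b13, b23 = 10.0, 16.0, 18.0, 8.0, 20.0, 14.0

def predicted_yield(x1, x2, x3):
    return b1*x1 + b2*x2 + b3*x3 + b12*x1*x2 + b13*x1*x3 + b23*x2*x3

# Brute-force search over the mixture simplex x1 + x2 + x3 = 1 (1% steps).
best = max(
    ((i / 100, j / 100, (100 - i - j) / 100)
     for i in range(101) for j in range(101 - i)),
    key=lambda x: predicted_yield(*x),
)
print(best, predicted_yield(*best))   # the toy model's optimal blend
```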

  3. Analog design optimization methodology for ultralow-power circuits using intuitive inversion-level and saturation-level parameters

    NASA Astrophysics Data System (ADS)

    Eimori, Takahisa; Anami, Kenji; Yoshimatsu, Norifumi; Hasebe, Tetsuya; Murakami, Kazuaki

    2014-01-01

    A comprehensive design optimization methodology using intuitive nondimensional parameters of inversion-level and saturation-level is proposed, especially for ultralow-power, low-voltage, and high-performance analog circuits with mixed strong, moderate, and weak inversion metal-oxide-semiconductor transistor (MOST) operations. This methodology is based on the synthesized charge-based MOST model composed of Enz-Krummenacher-Vittoz (EKV) basic concepts and advanced-compact-model (ACM) physics-based equations. The key concept of this methodology is that all circuit and system characteristics are described as some multivariate functions of inversion-level parameters, where the inversion level is used as an independent variable representative of each MOST. The analog circuit design starts from the first step of inversion-level design using universal characteristics expressed by circuit currents and inversion-level parameters without process-dependent parameters, followed by the second step of foundry-process-dependent design and the last step of verification using saturation-level criteria. This methodology also paves the way to an intuitive and comprehensive design approach for many kinds of analog circuit specifications by optimization using inversion-level log-scale diagrams and saturation-level criteria. In this paper, we introduce an example of our design methodology for a two-stage Miller amplifier.
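    The inversion-level parameter at the heart of this methodology is the inversion coefficient IC = ID/Ispec of EKV theory. The sketch below, with an assumed slope factor and specific current, classifies operating regimes and evaluates a standard EKV-style gm/ID interpolation.

```python
import math

# Hedged sketch of the EKV inversion-coefficient view. The slope factor n and
# the specific current Ispec use assumed, illustrative values.
UT = 0.0258                     # thermal voltage at ~300 K, V
n = 1.3                         # subthreshold slope factor (assumed)
Ispec = 2 * n * 100e-6 * UT**2  # specific current, assuming 100 uA/V^2 * (W/L)

def inversion_coefficient(Id):
    """IC < 0.1: weak inversion; 0.1..10: moderate; > 10: strong."""
    return Id / Ispec

def gm_over_id(IC):
    """EKV-style interpolation of transconductance efficiency, all regimes."""
    return 1.0 / (n * UT * (0.5 + math.sqrt(0.25 + IC)))

for Id in (10e-9, 1e-6, 100e-6):
    IC = inversion_coefficient(Id)
    regime = "weak" if IC < 0.1 else "strong" if IC > 10 else "moderate"
    print(f"Id={Id:.0e} A  IC={IC:.3g}  {regime}  gm/Id={gm_over_id(IC):.1f} S/A")
```

The gm/ID curve falls monotonically from its weak-inversion ceiling of 1/(n*UT) as IC grows, which is why a log-scale IC diagram makes the power-versus-speed trade-off easy to read.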

  4. Design methodology accounting for fabrication errors in manufactured modified Fresnel lenses for controlled LED illumination.

    PubMed

    Shim, Jongmyeong; Kim, Joongeok; Lee, Jinhyung; Park, Changsu; Cho, Eikhyun; Kang, Shinill

    2015-07-27

    The increasing demand for lightweight, miniaturized electronic devices has prompted the development of small, high-performance optical components for light-emitting diode (LED) illumination. As such, the Fresnel lens is widely used in applications due to its compact configuration. However, the vertical groove angle between the optical axis and the groove inner facets in a conventional Fresnel lens creates an inherent Fresnel loss, which degrades optical performance. Modified Fresnel lenses (MFLs) have been proposed in which the groove angles along the optical paths are carefully controlled; however, in practice, the optical performance of MFLs is inferior to the theoretical performance due to fabrication errors, as conventional design methods do not account for fabrication errors as part of the design process. In this study, the Fresnel loss and the loss area due to microscopic fabrication errors in the MFL were theoretically derived to determine optical performance. Based on this analysis, a design method for the MFL accounting for the fabrication errors was proposed. MFLs were fabricated using an ultraviolet imprinting process and an injection molding process, two representative processes with differing fabrication errors. The MFL fabrication error associated with each process was examined analytically and experimentally to investigate our methodology.
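    The inherent Fresnel loss mentioned above can be illustrated at normal incidence with the textbook reflectance formula. The lens material index below is an assumption (PMMA), and the paper's groove-angle-dependent derivation goes well beyond this sketch.

```python
# Textbook normal-incidence Fresnel reflectance R = ((n1 - n2)/(n1 + n2))**2,
# illustrating the per-facet loss; PMMA index is an assumed material choice.
def fresnel_reflectance(n1, n2):
    return ((n1 - n2) / (n1 + n2)) ** 2

n_air, n_pmma = 1.0, 1.49
R = fresnel_reflectance(n_air, n_pmma)
transmission = (1 - R) ** 2   # two air/material interfaces, absorption ignored
print(f"per-interface loss {R:.1%}, two-interface transmission {transmission:.1%}")
```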

  5. Designing reasonable accommodation of the workplace: a new methodology based on risk assessment.

    PubMed

    Pigini, L; Andrich, R; Liverani, G; Bucciarelli, P; Occhipinti, E

    2010-05-01

    If working tasks are carried out in inadequate conditions, workers with functional limitations may, over time, risk developing further disabilities. While several validated risk assessment methods exist for able-bodied workers, few studies have been carried out for workers with disabilities. This article, which reports the findings of a study funded by the Italian Ministry of Labour, proposes a general methodology for the technical and organisational re-design of a worksite, based on risk assessment and irrespective of any worker disability. To this end, a sample of 16 disabled workers, composed of people with either mild or severe motor disabilities, was recruited. Their jobs include business administration (5), computer programmer (1), housewife (1), mechanical worker (2), textile worker (1), bus driver (1), nurse (2), electrical worker (1), teacher (1), and warehouseman (1). By using a mix of risk assessment methods and the International Classification of Functioning (ICF) taxonomy, their worksites were re-designed with a view to reasonable accommodation, and a prospective evaluation was carried out to check whether the new design would eliminate the risks. In one case - a man with congenital malformations who works as a help-desk operator for technical assistance in the Information and Communication Technology (ICT) department of a big organisation - the accommodation was actually carried out within the time span of the study, thus making it possible to confirm the hypotheses raised in the prospective assessment.

  6. Robust design of spot welds in automotive structures: A decision-making methodology

    NASA Astrophysics Data System (ADS)

    Ouisse, M.; Cogan, S.

    2010-05-01

    Automotive structures include thousands of spot welds whose design must allow the assembled vehicle to satisfy a wide variety of performance constraints including static, dynamic and crash criteria. The objective of a standard optimization strategy is to reduce the number of spot welds as much as possible while satisfying all the design objectives. However, a classical optimization of the spot weld distribution using an exhaustive search approach is simply not feasible due to the very high order of the design space and the subsequently prohibitive calculation costs. Moreover, even if this calculation could be done, the result would not necessarily be very informative with respect to the design robustness to manufacturing uncertainties (location of welds and defective welds) and to the degradation of spot welds due to fatigue effects over the lifetime of the vehicle. In this paper, a decision-making methodology is presented which allows some aspects of the robustness issues to be integrated into the spot weld design process. The starting point is a given distribution of spot welds on the structure, which is based on both engineering know-how and preliminary critical numerical results, in particular criteria such as crash behavior. An over-populated spot weld distribution is then built in order to satisfy the remaining design criteria, such as static torsion angle and modal behavior. Then, an efficient optimization procedure based on energy considerations is used to eliminate redundant spot welds while preserving as far as possible the nominal structural behavior. The resulting sub-optimal solution is then used to provide a decision indicator for defining effective quality control procedures (e.g. visual post-assembly inspection of a small number of critical spot welds) as well as designing redundancy into critical zones. The final part of the paper is related to comparing the robustness of competing designs. 
Some decision-making indicators are presented to help the
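The energy-based elimination of redundant spot welds might be caricatured as a greedy pruning loop; the weld names, energy contributions, and retention threshold below are all invented for illustration.

```python
# Toy sketch of energy-based pruning: starting from an over-populated weld set,
# repeatedly drop the weld with the smallest (made-up) energy contribution as
# long as the total retained energy stays above a required fraction.
welds = {"w1": 5.0, "w2": 0.3, "w3": 2.1, "w4": 0.1, "w5": 1.5}  # J, invented
required_fraction = 0.9
total = sum(welds.values())

kept = dict(welds)
for name in sorted(welds, key=welds.get):   # weakest contributors first
    if (sum(kept.values()) - welds[name]) >= required_fraction * total:
        del kept[name]

print(sorted(kept))   # welds retained after pruning
```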

  7. A Systematic Composite Service Design Modeling Method Using Graph-Based Theory

    PubMed Central

    Elhag, Arafat Abdulgader Mohammed; Mohamad, Radziah; Aziz, Muhammad Waqar; Zeshan, Furkh

    2015-01-01

    The composite service design modeling is an essential process of the service-oriented software development life cycle, where the candidate services, composite services, operations and their dependencies are required to be identified and specified before their design. However, a systematic service-oriented design modeling method for composite services is still in its infancy, as most of the existing approaches provide the modeling of atomic services only. For these reasons, a new method (ComSDM) is proposed in this work for modeling the concept of service-oriented design to increase the reusability and decrease the complexity of the system while keeping the service composition considerations in mind. Furthermore, the ComSDM method provides a mathematical representation of the components of service-oriented design using graph-based theory to facilitate design quality measurement. To demonstrate that the ComSDM method is also suitable for composite service design modeling of distributed embedded real-time systems along with enterprise software development, it is implemented in the case study of a smart home. The results of the case study not only check the applicability of ComSDM but can also be used to validate its complexity and reusability. This also guides future research toward design quality measurement, such as using the ComSDM method to measure the quality of composite service design in service-oriented software systems. PMID:25928358
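    A minimal sketch of a graph-based service representation: services as nodes, invocation dependencies as edges, with fan-in and fan-out as crude reuse and coupling indicators. The service names are invented, and ComSDM's actual formalism is considerably richer.

```python
# Illustrative dependency graph for composite services. Edges point from a
# composite service to each service it invokes; names are invented.
edges = [
    ("OrderService", "PaymentService"),
    ("OrderService", "InventoryService"),
    ("ReportService", "InventoryService"),
]

def fan_out(svc):
    """Number of services this service depends on (coupling indicator)."""
    return sum(1 for a, _ in edges if a == svc)

def fan_in(svc):
    """Number of services depending on this one (reuse indicator)."""
    return sum(1 for _, b in edges if b == svc)

print(fan_out("OrderService"), fan_in("InventoryService"))
```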

  8. A systematic composite service design modeling method using graph-based theory.

    PubMed

    Elhag, Arafat Abdulgader Mohammed; Mohamad, Radziah; Aziz, Muhammad Waqar; Zeshan, Furkh

    2015-01-01

    The composite service design modeling is an essential process of the service-oriented software development life cycle, where the candidate services, composite services, operations and their dependencies are required to be identified and specified before their design. However, a systematic service-oriented design modeling method for composite services is still in its infancy, as most of the existing approaches provide the modeling of atomic services only. For these reasons, a new method (ComSDM) is proposed in this work for modeling the concept of service-oriented design to increase the reusability and decrease the complexity of the system while keeping the service composition considerations in mind. Furthermore, the ComSDM method provides a mathematical representation of the components of service-oriented design using graph-based theory to facilitate design quality measurement. To demonstrate that the ComSDM method is also suitable for composite service design modeling of distributed embedded real-time systems along with enterprise software development, it is implemented in the case study of a smart home. The results of the case study not only check the applicability of ComSDM but can also be used to validate its complexity and reusability. This also guides future research toward design quality measurement, such as using the ComSDM method to measure the quality of composite service design in service-oriented software systems.

  9. Teaching Mathematical Modelling in a Design Context: A Methodology Based on the Mechanical Analysis of a Domestic Cancrusher.

    ERIC Educational Resources Information Center

    Pace, Sydney

    2000-01-01

    Presents a methodology for teaching mathematical modeling skills to A-level students. Gives an example illustrating how mathematics teachers and design teachers can take joint perspective in devising learning opportunities that develop mathematical and design skills concurrently. (Contains 14 references.) (Author/ASK)

  10. A game-based decision support methodology for competitive systems design

    NASA Astrophysics Data System (ADS)

    Briceno, Simon Ignacio

    This dissertation describes the development of a game-based methodology that facilitates the exploration and selection of research and development (R&D) projects under uncertain competitive scenarios. The proposed method provides an approach that analyzes competitor positioning and formulates response strategies to forecast the impact of technical design choices on a project's market performance. A critical decision in the conceptual design phase of propulsion systems is the selection of the best architecture, centerline, core size, and technology portfolio. This selection can be challenging when considering evolving requirements from both the airframe manufacturing company and the airlines in the market. Furthermore, the exceedingly high cost of core architecture development and its associated risk makes this strategic architecture decision the most important one for an engine company. Traditional conceptual design processes emphasize performance and affordability as their main objectives. These areas alone however, do not provide decision-makers with enough information as to how successful their engine will be in a competitive market. A key objective of this research is to examine how firm characteristics such as their relative differences in completing R&D projects, differences in the degree of substitutability between different project types, and first/second-mover advantages affect their product development strategies. Several quantitative methods are investigated that analyze business and engineering strategies concurrently. In particular, formulations based on the well-established mathematical field of game theory are introduced to obtain insights into the project selection problem. The use of game theory is explored in this research as a method to assist the selection process of R&D projects in the presence of imperfect market information. 
The proposed methodology focuses on two influential factors: the schedule uncertainty of project completion times and
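The game-theoretic framing can be illustrated with a toy two-firm project-selection game. The payoffs below are invented and merely encode the idea that two firms crowding into the same project type lowers returns for both.

```python
# Toy bimatrix game: each firm picks project type A or B; entries are invented
# payoffs (firm1, firm2). Pure Nash equilibria are found by checking that
# neither firm can gain by unilaterally deviating.
payoffs = {
    ("A", "A"): (2, 2), ("A", "B"): (5, 3),
    ("B", "A"): (3, 5), ("B", "B"): (1, 1),
}

def pure_nash():
    eqs = []
    for s1 in "AB":
        for s2 in "AB":
            u1, u2 = payoffs[(s1, s2)]
            best1 = all(u1 >= payoffs[(t, s2)][0] for t in "AB")
            best2 = all(u2 >= payoffs[(s1, t)][1] for t in "AB")
            if best1 and best2:
                eqs.append((s1, s2))
    return eqs

print(pure_nash())   # the two firms differentiate their project portfolios
```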

  11. Experimental validation of an integrated controls-structures design methodology for a class of flexible space structures

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Elliott, Kenny B.; Joshi, Suresh M.; Walz, Joseph E.

    1994-01-01

    This paper describes the first experimental validation of an optimization-based integrated controls-structures design methodology for a class of flexible space structures. The Controls-Structures-Interaction (CSI) Evolutionary Model, a laboratory test bed at Langley, is redesigned based on the integrated design methodology with two different dissipative control strategies. The redesigned structure is fabricated, assembled in the laboratory, and experimentally compared with the original test structure. Design guides are proposed and used in the integrated design process to ensure that the resulting structure can be fabricated. Experimental results indicate that the integrated design requires over 60 percent less average control power (by thruster actuators) than the conventional control-optimized design while maintaining the required line-of-sight performance, thereby confirming the analytical findings about the superiority of the integrated design methodology. Amenability of the integrated design structure to other control strategies is considered and evaluated analytically and experimentally. This work also demonstrates the capabilities of the Langley-developed design tool CSI DESIGN, which provides a unified environment for structural and control design.

  12. A Software Designed For STP Data Plot and Analysis Based on Object-oriented Methodology

    NASA Astrophysics Data System (ADS)

    Lina, L.; Murata, K.

    2006-12-01

    simply follows the present system as long as the language is object-oriented. Researchers who want to add their data to the STARS simply add their own data class to the domain object model; this works because any satellite data set has properties, such as time or date, that are inherited from the upper class. In this way, the effort required is less than with older methodologies. In the OMT, the description format of the system is rather strictly standardized. When new developers join the STARS project, they have only to understand each model to obtain an overview of the STARS; they then follow these designs and documents to implement the system. The OMT thus makes it easy for a newcomer to join a project that is already running.

  13. Methodology for Benefit Analysis of CAD/CAM (Computer-Aided Design/Computer-Aided Manufacturing) in USN Shipyards.

    DTIC Science & Technology

    1984-03-01

    Thesis, Naval Postgraduate School, Monterey, California: Methodology for Benefit Analysis of CAD/CAM in USN Shipyards, by Richard B. Grahlman, March 1984.

  14. Applications of a damage tolerance analysis methodology in aircraft design and production

    NASA Technical Reports Server (NTRS)

    Woodward, M. R.; Owens, S. D.; Law, G. E.; Mignery, L. A.

    1992-01-01

    Objectives of customer mandated aircraft structural integrity initiatives in design are to guide material selection, to incorporate fracture resistant concepts in the design, to utilize damage tolerance based allowables and planned inspection procedures necessary to enhance the safety and reliability of manned flight vehicles. However, validated fracture analysis tools for composite structures are needed to accomplish these objectives in a timely and economical manner. This paper briefly describes the development, validation, and application of a damage tolerance methodology for composite airframe structures. A closed-form analysis code, entitled SUBLAM was developed to predict the critical biaxial strain state necessary to cause sublaminate buckling-induced delamination extension in an impact damaged composite laminate. An embedded elliptical delamination separating a thin sublaminate from a thick parent laminate is modelled. Predicted failure strains were correlated against a variety of experimental data that included results from compression after impact coupon and element tests. An integrated analysis package was developed to predict damage tolerance based margin-of-safety (MS) using NASTRAN generated loads and element information. Damage tolerance aspects of new concepts are quickly and cost-effectively determined without the need for excessive testing.

  15. The Component Packaging Problem: A Vehicle for the Development of Multidisciplinary Design and Analysis Methodologies

    NASA Technical Reports Server (NTRS)

    Fadel, Georges; Bridgewood, Michael; Figliola, Richard; Greenstein, Joel; Kostreva, Michael; Nowaczyk, Ronald; Stevenson, Steve

    1999-01-01

    This report summarizes academic research which has resulted in an increased appreciation for multidisciplinary efforts among our students, colleagues and administrators. It has also generated a number of research ideas that emerged from the interaction between disciplines. Overall, 17 undergraduate students and 16 graduate students benefited directly from the NASA grant: an additional 11 graduate students were impacted and participated without financial support from NASA. The work resulted in 16 theses (with 7 to be completed in the near future), 67 papers or reports mostly published in 8 journals and/or presented at various conferences (a total of 83 papers, presentations and reports published based on NASA inspired or supported work). In addition, the faculty and students presented related work at many meetings, and continuing work has been proposed to NSF, the Army, Industry and other state and federal institutions to continue efforts in the direction of multidisciplinary and recently multi-objective design and analysis. The specific problem addressed is component packing which was solved as a multi-objective problem using iterative genetic algorithms and decomposition. Further testing and refinement of the methodology developed is presently under investigation. Teaming issues research and classes resulted in the publication of a web site, (http://design.eng.clemson.edu/psych4991) which provides pointers and techniques to interested parties. Specific advantages of using iterative genetic algorithms, hurdles faced and resolved, and institutional difficulties associated with multi-discipline teaming are described in some detail.

  16. Systematic design of output filters for audio class-D amplifiers via Simplified Real Frequency Technique

    NASA Astrophysics Data System (ADS)

    Hintzen, E.; Vennemann, T.; Mathis, W.

    2014-11-01

    In this paper a new filter design concept is proposed and implemented which takes into account the complex loudspeaker impedance. By means of broadband matching techniques, which have been successfully applied in radio technology, we are able to optimize the reconstruction filter to achieve an overall linear frequency response. Here, a passive filter network is inserted between source and load that matches the complex load impedance to the complex source impedance within a desired frequency range. The design and calculation of the filter is usually done using numerical approximation methods which are known as Real Frequency Techniques (RFT). A first approach to the systematic design of reconstruction filters for class-D amplifiers is proposed, using the Simplified Real Frequency Technique (SRFT). Some fundamental considerations are introduced, as well as the benefits and challenges of impedance matching between class-D amplifiers and loudspeakers. Current simulation data using MATLAB is presented and supports some first conclusions.
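
To illustrate why the complex load matters, the sketch below evaluates a second-order LC reconstruction filter driving a loudspeaker modelled as voice-coil resistance plus inductance. All component values are assumptions for illustration, not taken from the paper:

```python
import numpy as np

# Assumed values for illustration only.
Lf, Cf = 22e-6, 1.0e-6        # class-D output filter: series L, shunt C
Re, Le = 4.0, 0.25e-3         # crude loudspeaker model: R plus voice-coil L

f = np.logspace(2, np.log10(40e3), 400)   # 100 Hz .. 40 kHz
w = 2 * np.pi * f
Zload = Re + 1j * w * Le                  # complex loudspeaker impedance
Zc = 1.0 / (1j * w * Cf)
Zp = Zload * Zc / (Zload + Zc)            # shunt capacitor in parallel with load
H = Zp / (1j * w * Lf + Zp)               # voltage divider against the series L

gain_db = 20 * np.log10(np.abs(H))
audio = f <= 20e3
print(f"gain deviation over the audio band: {np.ptp(gain_db[audio]):.2f} dB")
```

Repeating the calculation with a purely resistive `Zload` shows how a filter flat into 4 Ohm deviates once the inductive part of the real load is included, which is the mismatch the SRFT optimization targets.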

  17. Experimental validation of systematically designed acoustic hyperbolic metamaterial slab exhibiting negative refraction

    NASA Astrophysics Data System (ADS)

    Christiansen, Rasmus E.; Sigmund, Ole

    2016-09-01

    This Letter reports on the experimental validation of a two-dimensional acoustic hyperbolic metamaterial slab optimized to exhibit negative refractive behavior. The slab was designed using a topology optimization based systematic design method allowing for tailoring the refractive behavior. The experimental results confirm the predicted refractive capability as well as the predicted transmission at an interface. The study simultaneously provides an estimate of the attenuation inside the slab stemming from the boundary layer effects—insight which can be utilized in the further design of the metamaterial slabs. The capability of tailoring the refractive behavior opens possibilities for different applications. For instance, a slab exhibiting zero refraction across a wide angular range is capable of funneling acoustic energy through it, while a material exhibiting the negative refractive behavior across a wide angular range provides lensing and collimating capabilities.
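
Negative refraction can be pictured through Snell's law with a negative effective index: the refracted ray lands on the same side of the normal as the incident ray. The indices below are assumed illustrative values, not the slab's measured properties:

```python
import math

def refraction_angle_deg(n1, n2, theta_i_deg):
    """Snell's law n1*sin(ti) = n2*sin(tt); a negative effective index n2
    places the refracted ray on the same side of the normal (negative refraction)."""
    s = n1 * math.sin(math.radians(theta_i_deg)) / n2
    if abs(s) > 1.0:
        return None   # no transmitted ray (total reflection)
    return math.degrees(math.asin(s))

# Assumed effective indices, for illustration only.
print(refraction_angle_deg(1.0, -1.5, 30.0))   # negative angle: negative refraction
```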

  18. Transcranial direct current stimulation (tDCS) in behavioral and food addiction: a systematic review of efficacy, technical, and methodological issues

    PubMed Central

    Sauvaget, Anne; Trojak, Benoît; Bulteau, Samuel; Jiménez-Murcia, Susana; Fernández-Aranda, Fernando; Wolz, Ines; Menchón, José M.; Achab, Sophia; Vanelle, Jean-Marie; Grall-Bronnec, Marie

    2015-01-01

    Objectives: Behavioral addictions (BA) are complex disorders for which pharmacological and psychotherapeutic treatments have shown their limits. Non-invasive brain stimulation, including transcranial direct current stimulation (tDCS), has opened up new perspectives in addiction treatment. The purpose of this work is to conduct a critical and systematic review of tDCS efficacy, and of technical and methodological considerations in the field of BA. Methods: A bibliographic search has been conducted on the Medline and ScienceDirect databases until December 2014, based on the following selection criteria: clinical studies on tDCS and BA (namely eating disorders, compulsive buying, Internet addiction, pathological gambling, sexual addiction, sports addiction, video games addiction). Study selection, data analysis, and reporting were conducted according to the PRISMA guidelines. Results: Out of 402 potential articles, seven studies were selected. These studies, which so far focus essentially on abnormal eating, suggest that tDCS (right prefrontal anode/left prefrontal cathode) reduces food craving induced by visual stimuli. Conclusions: Despite methodological and technical differences between studies, the results are promising. So far, only a few studies of tDCS in BA have been conducted. New research is recommended on the use of tDCS in BA, other than eating disorders. PMID:26500478

  19. A methodology for system-of-systems design in support of the engineering team

    NASA Astrophysics Data System (ADS)

    Ridolfi, G.; Mooij, E.; Cardile, D.; Corpino, S.; Ferrari, G.

    2012-04-01

    Space missions have experienced a trend of increasing complexity in the last decades, resulting in the design of very complex systems formed by many elements and sub-elements working together to meet the requirements. In a classical approach, especially in a company environment, the two steps of design-space exploration and optimization are usually performed by experts inferring on major phenomena, making assumptions and doing some trial-and-error runs on the available mathematical models. This is done especially in the very early design phases where most of the costs are locked-in. With the objective of supporting the engineering team and the decision-makers during the design of complex systems, the authors developed a modelling framework for a particular category of complex, coupled space systems called System-of-Systems. Once modelled, the System-of-Systems is solved using a computationally cheap parametric methodology, named the mixed-hypercube approach, based on the utilization of a particular type of fractional factorial design-of-experiments, and analysis of the results via global sensitivity analysis and response surfaces. As an applicative example, a system-of-systems of a hypothetical human space exploration scenario for the support of a manned lunar base is presented. The results demonstrate that using the mixed-hypercube to sample the design space, an optimal solution is reached with a limited computational effort, providing support to the engineering team and decision makers thanks to sensitivity and robustness information. The analysis of the system-of-systems model that was implemented shows that the logistic support of a human outpost on the Moon for 15 years is still feasible with currently available launcher classes. The results presented in this paper have been obtained in cooperation with Thales Alenia Space—Italy, in the framework of a regional programme called STEPS. STEPS—Sistemi e Tecnologie per l'EsPlorazione Spaziale is a research programme.
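
The kind of cheap sampling the mixed-hypercube approach builds on can be sketched with a two-level fractional factorial design plus main-effect estimates as a rough sensitivity measure. The response function below is a toy stand-in, not the authors' model:

```python
import numpy as np

# Half-fraction 2^(3-1) design, generator C = A*B: 4 runs instead of 8.
A = np.array([-1, 1, -1, 1])
B = np.array([-1, -1, 1, 1])
C = A * B

def response(a, b, c):
    # Toy stand-in for an expensive system-of-systems model evaluation.
    return 10.0 + 4.0 * a + 1.5 * b + 0.2 * c

y = response(A, B, C)

# Main effect = mean(y at +1) - mean(y at -1); note C is aliased with A*B,
# the price paid for halving the number of runs.
effects = {name: float(y @ col) / 2.0
           for name, col in (("A", A), ("B", B), ("C", C))}
ranking = sorted(effects, key=lambda k: abs(effects[k]), reverse=True)
print(effects, ranking)
```

Ranking factors by the magnitude of their main effects is the simplest form of the global sensitivity information the abstract mentions.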

  20. Systematic design of flat band slow light in photonic crystal waveguides.

    PubMed

    Li, Juntao; White, Thomas P; O'Faolain, Liam; Gomez-Iglesias, Alvaro; Krauss, Thomas F

    2008-04-28

    We present a systematic procedure for designing "flat bands" of photonic crystal waveguides for slow light propagation. The procedure aims to maximize the group index-bandwidth product by changing the position of the first two rows of holes of W1 line-defect photonic crystal waveguides. A nearly constant group index-bandwidth product is achieved for group indices of 30-90 and, as an example, we experimentally demonstrate flat band slow light with nearly constant group indices of 32.5, 44 and 49 over 14 nm, 11 nm and 9.5 nm bandwidth around 1550 nm, respectively.
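
The three reported operating points correspond to a nearly constant normalized group index-bandwidth product, n_g * (bandwidth / wavelength) of roughly 0.3, which a few lines of arithmetic on the quoted figures confirm:

```python
# (group index, bandwidth in metres) for the three demonstrated designs.
cases = [(32.5, 14e-9), (44.0, 11e-9), (49.0, 9.5e-9)]
wavelength = 1550e-9

for ng, bw in cases:
    gbp = ng * bw / wavelength    # normalized group index-bandwidth product
    print(f"n_g = {ng:4.1f}: normalized GBP = {gbp:.3f}")
```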

  1. On the engineering design for systematic integration of agent-orientation in industrial automation.

    PubMed

    Yu, Liyong; Schüller, Andreas; Epple, Ulrich

    2014-09-01

    In today's automation industry, agent-oriented development of system functionalities appears to have a great potential for increasing autonomy and flexibility of complex operations, while lowering the workload of users. In this paper, we present a reference model for the harmonious and systematic integration of agent-orientation in industrial automation. Considering compatibility with existing automation systems and best practice, this model combines the advantages of function block technology, service orientation and native description methods from the automation standard IEC 61131-3. This approach can be applied as a guideline for the engineering design of future agent-oriented automation systems.

  2. Engineering Bacteria to Search for Specific Concentrations of Molecules by a Systematic Synthetic Biology Design Method

    PubMed Central

    Chen, Bor-Sen

    2016-01-01

    Bacteria navigate environments full of various chemicals to seek favorable places for survival by controlling the flagella’s rotation using a complicated signal transduction pathway. By influencing the pathway, bacteria can be engineered to search for specific molecules, which has great potential for application to biomedicine and bioremediation. In this study, genetic circuits were constructed to make bacteria search for a specific molecule at particular concentrations in their environment through a synthetic biology method. In addition, by replacing the “brake component” in the synthetic circuit with one of a different sensitivity, the bacteria can be engineered to locate areas containing specific concentrations of the molecule. Measured qualitatively by the swarm assay and quantitatively by microfluidic techniques, the characteristics of each “brake component” were identified and represented by a mathematical model. Furthermore, we established another mathematical model to anticipate the characteristics of the “brake component”. Based on this model, an abundant component library can be established to provide adequate component selection for different searching conditions without identifying all components individually. Finally, a systematic design procedure was proposed. Following this systematic procedure, one can design a genetic circuit for bacteria to rapidly search for and locate different concentrations of particular molecules by selecting the most adequate “brake component” in the library. Moreover, following simple procedures, one can also establish an exclusive component library suitable for other cultivated environments, promoter systems, or bacterial strains. PMID:27096615

  3. Systematic model researches on the stability limits of the DVL series of float designs

    NASA Technical Reports Server (NTRS)

    Sottorf, W.

    1949-01-01

    To determine the trim range in which a seaplane can take off without porpoising, stability tests were made of a Plexiglas model, composed of float, wing, and tailplane, which corresponded to a full-size research airplane. The model and full-size stability limits are in good agreement. After all structural parts pertaining to the air frame were removed gradually, the aerodynamic forces replaced by weight forces, and the moment of inertia and position of the center of gravity changed, no marked change of limits of the stable zone was noticeable. The latter, therefore, is for practical purposes affected only by hydrodynamic phenomena. The stability limits of the DVL family of floats were determined by a systematic investigation independent of any particular sea-plane design, thus a seaplane may be designed to give a run free from porpoising.

  4. Optimization of Cu(II) biosorption onto Ascophyllum nodosum by factorial design methodology.

    PubMed

    Freitas, Olga; Delerue-Matos, Cristina; Boaventura, Rui

    2009-08-15

    A Box-Behnken factorial design coupled with surface response methodology was used to evaluate the effects of temperature, pH and initial concentration in the Cu(II) sorption process onto the marine macro-algae Ascophyllum nodosum. The effect of the operating variables on metal uptake capacity was studied in a batch system and a mathematical model showing the influence of each variable and their interactions was obtained. Study ranges were 10-40 degrees C for temperature, 3.0-5.0 for pH and 50-150 mg L(-1) for initial Cu(II) concentration. Within these ranges, the biosorption capacity is slightly dependent on temperature but markedly increases with pH and initial concentration of Cu(II). The uptake capacities predicted by the model are in good agreement with the experimental values. Maximum biosorption capacity of Cu(II) by A. nodosum is 70 mg g(-1) and corresponds to the following values of those variables: temperature=40 degrees C, pH=5.0 and initial Cu(II) concentration=150 mg L(-1).
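
A minimal sketch of the design-and-fit step: build a Box-Behnken design in coded units for the three factors and fit a full quadratic response surface by least squares. The data below are synthetic, with effect sizes chosen merely to mimic the reported trend (weak temperature effect, strong pH and concentration effects); they are not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Box-Behnken design in coded units (-1, 0, +1) for T, pH, Cu(II) concentration:
# midpoints of the cube edges plus three centre points (15 runs).
design = np.array([[a, b, 0] for a in (-1, 1) for b in (-1, 1)] +
                  [[a, 0, b] for a in (-1, 1) for b in (-1, 1)] +
                  [[0, a, b] for a in (-1, 1) for b in (-1, 1)] +
                  [[0, 0, 0]] * 3, dtype=float)

t, ph, c = design.T
# Synthetic uptake data: weakly T-dependent, strongly pH/concentration-dependent.
y = 40 + 2*t + 10*ph + 12*c - 3*ph**2 + rng.normal(0, 0.5, len(design))

# Full quadratic model: intercept, linear, square and interaction terms.
X = np.column_stack([np.ones(len(design)), t, ph, c,
                     t**2, ph**2, c**2, t*ph, t*c, ph*c])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted linear effects (T, pH, C):", coef[1:4].round(2))
```

Because the Box-Behnken design supports the full quadratic model, the fitted surface can then be maximized over the coded cube to locate the optimum, as done in the study.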

  5. Optimization of Chitinase Production by Bacillus pumilus Using Plackett-Burman Design and Response Surface Methodology

    PubMed Central

    Tasharrofi, Noshin; Adrangi, Sina; Fazeli, Mehdi; Rastegar, Hossein; Khoshayand, Mohammad Reza; Faramarzi, Mohammad Ali

    2011-01-01

    A soil bacterium capable of degrading chitin on chitin agar plates was isolated and identified as Bacillus pumilus isolate U5 on the basis of 16S rDNA sequence analysis. In order to optimize culture conditions for chitinase production by this bacterium, a two-step approach was employed. First, the effects of several medium components were studied using the Plackett-Burman design. Among various components tested, chitin and yeast extract showed a positive effect on enzyme production while MgSO4 and FeSO4 had a negative effect. However, the linear model proved to be insufficient for determining the optimum levels for these components due to a highly significant curvature effect. In the second step, Box-Behnken response surface methodology was used to determine the optimum values. It was noticed that a quadratic polynomial equation fitted the experimental data appropriately. The optimum concentrations for chitin, yeast extract, MgSO4 and FeSO4 were found to be 4.76, 0.439, 0.0055 and 0.019 g/L, respectively, with a predicted value of chitinase production of 97.67 U/100 mL. Using this statistically optimized medium, the practical chitinase production reached 96.1 U/100 mL. PMID:24250411
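
A Plackett-Burman screening design like the first step above can be written down directly. This sketch constructs the standard 8-run design for up to 7 two-level factors and checks the balance and orthogonality that let main effects be estimated independently from so few runs:

```python
import numpy as np

# 8-run Plackett-Burman design: cyclic shifts of a generator row,
# closed with a row of all -1s.
gen = np.array([1, 1, 1, -1, 1, -1, -1])
design = np.array([np.roll(gen, i) for i in range(7)] + [-np.ones(7, dtype=int)])

print(design)
print("balanced columns:", (design.sum(axis=0) == 0).all())
print("orthogonal columns:", (design.T @ design == 8 * np.eye(7, dtype=int)).all())
```

Main effects are then estimated exactly as in any two-level design: the difference between the mean response at +1 and at -1 for each column.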

  6. Methodological and ethical considerations in designing an Internet study of quality of life: a discussion paper.

    PubMed

    Holmes, Susan

    2009-03-01

    Use of the Internet in research is a relatively new phenomenon offering a potentially valuable research resource that, although increasingly used, appears largely untapped in nursing and healthcare more generally. This paper discusses methodological and ethical issues that need consideration when designing an Internet-based study, concluding that, in general, online research methods are simply adaptations of traditional methods of data collection. Issues such as the representativeness of the data and ethical concerns are discussed. It considers whether the ethical dilemmas faced by online researchers differ from those faced by those seeking to use other, more 'traditional' approaches. Using the example of a study that employed the Internet as a means of distributing questionnaires, this paper shows that this can be an efficient and effective means of gathering data from a geographically dispersed sample. Furthermore, since typewritten data is obtained in the same format from all respondents, the need for transcription and the potential for error are reduced, potentially enhancing the quality of any such study.

  7. A methodology for formulating a minimal uncertainty model for robust control system design and analysis

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Chang, B.-C.; Fischl, Robert

    1989-01-01

    In the design and analysis of robust control systems for uncertain plants, the technique of formulating what is termed an M-delta model has become widely accepted and applied in the robust control literature. The M represents the transfer function matrix M(s) of the nominal system, and delta represents an uncertainty matrix acting on M(s). The uncertainty can arise from various sources, such as structured uncertainty from parameter variations or multiple unstructured uncertainties from unmodeled dynamics and other neglected phenomena. In general, delta is a block diagonal matrix, and for real parameter variations the diagonal elements are real. As stated in the literature, this structure can always be formed for any linear interconnection of inputs, outputs, transfer functions, parameter variations, and perturbations. However, very little of the literature addresses methods for obtaining this structure, and none of this literature addresses a general methodology for obtaining a minimal M-delta model for a wide class of uncertainty. Since a delta matrix of minimum order would improve the efficiency of structured singular value (or multivariable stability margin) computations, a method of obtaining a minimal M-delta model would be useful. A generalized method of obtaining a minimal M-delta structure for systems with real parameter variations is given.
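
The payoff of a minimal delta block can be illustrated on a static example: if an uncertain gain G(delta) = G0 + delta*G1 has a rank-one G1, the parameter pulls out as a single scalar block, rather than being repeated once per affected entry as a naive pull-out would do. The matrices below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

G0 = np.array([[2.0, 0.5],
               [1.0, 3.0]])
l = np.array([[1.0], [2.0]])
r = np.array([[0.5, 1.0]])
G1 = l @ r                 # rank-1 perturbation: all four entries move with one delta

# Upper LFT: F_u(M, Delta) = M22 + M21 @ Delta @ inv(I - M11 @ Delta) @ M12
M11, M12, M21, M22 = np.zeros((1, 1)), r, l, G0

def upper_lft(delta):
    Delta = np.atleast_2d(float(delta))
    return M22 + M21 @ Delta @ np.linalg.inv(np.eye(1) - M11 @ Delta) @ M12

# A single 1x1 Delta block reproduces the uncertain gain exactly:
print(np.allclose(upper_lft(0.3), G0 + 0.3 * G1))
```

The general minimal-order construction in the paper handles dynamic systems and multiple parameters, but the rank-based pull-out shown here is the core idea.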

  8. Application-specific coarse-grained reconfigurable array: architecture and design methodology

    NASA Astrophysics Data System (ADS)

    Zhou, Li; Liu, Dongpei; Zhang, Jianfeng; Liu, Hengzhu

    2015-06-01

    Coarse-grained reconfigurable arrays (CGRAs) have shown potential for application in embedded systems in recent years. Numerous reconfigurable processing elements (PEs) in CGRAs provide flexibility while maintaining high performance by exploring different levels of parallelism. However, a difference remains between the CGRA and the application-specific integrated circuit (ASIC). Some application domains, such as software-defined radios (SDRs), require flexibility as performance demands increase. More effective CGRA architectures are expected to be developed. Customisation of a CGRA according to its application can improve performance and efficiency. This study proposes an application-specific CGRA architecture template composed of generic PEs (GPEs) and special PEs (SPEs). The hardware of the SPE can be customised to accelerate specific computational patterns. An automatic design methodology that includes pattern identification and application-specific function unit generation is also presented. A mapping algorithm based on ant colony optimisation is provided. Experimental results on the SDR target domain show that compared with other ordinary and application-specific reconfigurable architectures, the CGRA generated by the proposed method performs more efficiently for given applications.

  9. Online Intelligent Controllers for an Enzyme Recovery Plant: Design Methodology and Performance

    PubMed Central

    Leite, M. S.; Fujiki, T. L.; Silva, F. V.; Fileti, A. M. F.

    2010-01-01

    This paper focuses on the development of intelligent controllers for use in a process of enzyme recovery from pineapple rind. The proteolytic enzyme bromelain (EC 3.4.22.4) is precipitated with alcohol at low temperature in a fed-batch jacketed tank. Temperature control is crucial to avoid irreversible protein denaturation. Fuzzy or neural controllers offer a way of implementing solutions that cover dynamic and nonlinear processes. The design methodology and a comparative study on the performance of fuzzy-PI, neurofuzzy, and neural network intelligent controllers are presented. To tune the fuzzy PI Mamdani controller, various universes of discourse, rule bases, and membership function support sets were tested. A neurofuzzy inference system (ANFIS), based on Takagi-Sugeno rules, and a model predictive controller, based on neural modeling, were developed and tested as well. Using a Fieldbus network architecture, a coolant variable speed pump was driven by the controllers. The experimental results show the effectiveness of fuzzy controllers in comparison to the neural predictive control. The fuzzy PI controller exhibited a reduced error parameter (ITAE), lower power consumption, and better recovery of enzyme activity. PMID:21234106
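
The ITAE criterion used to compare the controllers is the integral of time-weighted absolute error, ITAE = integral of t*|e(t)| dt, which penalizes errors that persist late in the response. A small sketch on a synthetic error trace (not the measured plant data):

```python
import numpy as np

def itae(t, e):
    """Integral of time-weighted absolute error, via the trapezoidal rule."""
    y = t * np.abs(e)
    return float(np.sum((y[1:] + y[:-1]) * np.diff(t)) / 2.0)

# Synthetic temperature-error trace after a disturbance, in deg C.
t = np.linspace(0.0, 50.0, 501)
e = 2.0 * np.exp(-0.2 * t)
print(f"ITAE = {itae(t, e):.1f}")
```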

  10. A methodology for using nonlinear aerodynamics in aeroservoelastic analysis and design

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.

    1991-01-01

    A methodology is presented for using the Volterra-Wiener theory of nonlinear systems in aeroservoelastic (ASE) analyses and design. The theory is applied to the development of nonlinear aerodynamic response models that can be defined in state-space form and are, therefore, appropriate for use in modern control theory. The theory relies on the identification of nonlinear kernels that can be used to predict the response of a nonlinear system due to an arbitrary input. A numerical kernel identification technique, based on unit impulse responses, is presented and applied to a simple bilinear, single-input single-output (SISO) system. The linear kernel (unit impulse response) and the nonlinear second-order kernel of the system are numerically-identified and compared with the exact, analytically-defined linear and second-order kernels. This kernel identification technique is then applied to the CAP-TSD (Computational Aeroelasticity Program-Transonic Small Disturbance) code for identification of the linear and second-order kernels of a NACA64A010 rectangular wing undergoing pitch at M = 0.5, M = 0.85 (transonic), and M = 0.93 (transonic). Results presented demonstrate the feasibility of this approach for use with nonlinear, unsteady aerodynamic responses.
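
The impulse-based idea behind second-order kernel identification can be sketched on a discrete bilinear SISO system (coefficients assumed for illustration): the response to two impulses, minus the two single-impulse responses, isolates the second-order interaction that the linear kernel misses, and vanishes for a linear system where superposition holds.

```python
import numpy as np

def simulate(u, a=0.6, b=0.4, c=1.0):
    """Discrete bilinear SISO system x[k+1] = a*x[k] + b*x[k]*u[k] + c*u[k], y = x."""
    x, y = 0.0, np.zeros(len(u))
    for k, uk in enumerate(u):
        y[k] = x
        x = a * x + b * x * uk + c * uk
    return y

n = 40
u1 = np.zeros(n); u1[0] = 1.0        # unit impulse at k = 0
u2 = np.zeros(n); u2[5] = 1.0        # unit impulse at k = 5

# Cross term: second-order (and higher) interaction between the two impulses.
cross = simulate(u1 + u2) - simulate(u1) - simulate(u2)
print("bilinear cross term, max |.|:", float(np.abs(cross).max()))

# For a linear system (b = 0) superposition holds and the cross term vanishes.
lin = simulate(u1 + u2, b=0.0) - simulate(u1, b=0.0) - simulate(u2, b=0.0)
print("linear cross term, max |.|:", float(np.abs(lin).max()))
```

Sweeping the spacing between the two impulses maps out a slice of the second-order kernel, which is the essence of the numerical identification technique described above.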

  11. Design, Development and Optimization of S (-) Atenolol Floating Sustained Release Matrix Tablets Using Surface Response Methodology

    PubMed Central

    Gunjal, P. T.; Shinde, M. B.; Gharge, V. S.; Pimple, S. V.; Gurjar, M. K.; Shah, M. N.

    2015-01-01

    The objective of this present investigation was to develop and formulate floating sustained release matrix tablets of s (-) atenolol, by using different polymer combinations and filler, to optimize by using surface response methodology for different drug release variables and to evaluate the drug release pattern of the optimized product. Floating sustained release matrix tablets of various combinations were prepared with cellulose-based polymers: Hydroxypropyl methylcellulose, sodium bicarbonate as a gas generating agent, polyvinyl pyrrolidone as a binder and lactose monohydrate as filler. The 3^2 full factorial design was employed to investigate the effect of formulation variables on different properties of tablets applicable to floating lag time, buoyancy time, % drug release in 1 and 6 h (D1h, D6h) and time required for 90% drug release (t90%). Significance of results was analyzed using analysis of variance (ANOVA) and P < 0.05 was considered statistically significant. S (-) atenolol floating sustained release matrix tablets followed Higuchi drug release kinetics, which indicates that the release of drug follows an anomalous (non-Fickian) diffusion mechanism. The developed floating sustained release matrix tablet of improved efficacy can perform therapeutically better than a conventional tablet. PMID:26798171
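
The Higuchi kinetics mentioned above model cumulative release as Q(t) = kH * sqrt(t). A sketch fitting kH and the implied t90% on hypothetical release data (the numbers are invented for illustration, not the paper's measurements):

```python
import numpy as np

t_h = np.array([0.5, 1, 2, 4, 6, 8, 10], dtype=float)     # hours
q = np.array([14.2, 19.8, 28.5, 40.1, 49.3, 56.6, 63.4])  # % released (hypothetical)

# Least-squares fit of Q = kH * sqrt(t) through the origin.
kH = float(np.linalg.lstsq(np.sqrt(t_h)[:, None], q, rcond=None)[0][0])
t90 = (90.0 / kH) ** 2          # time to 90% release implied by the fitted model
print(f"kH = {kH:.1f} %/h^0.5, t90% = {t90:.1f} h")
```

A good straight-line fit of Q against sqrt(t) is the usual quick check that release is diffusion-controlled in the Higuchi sense.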

  13. Hybrid intelligent methodology to design translation invariant morphological operators for Brazilian stock market prediction.

    PubMed

    Araújo, Ricardo de A

    2010-12-01

    This paper presents a hybrid intelligent methodology to design increasing translation invariant morphological operators applied to Brazilian stock market prediction (overcoming the random walk dilemma). The proposed Translation Invariant Morphological Robust Automatic phase-Adjustment (TIMRAA) method consists of a hybrid intelligent model composed of a Modular Morphological Neural Network (MMNN) with a Quantum-Inspired Evolutionary Algorithm (QIEA), which searches for the best time lags to reconstruct the phase space of the time series generator phenomenon and determines the initial (sub-optimal) parameters of the MMNN. Each individual of the QIEA population is further trained by the Back Propagation (BP) algorithm to improve the MMNN parameters supplied by the QIEA. Also, for each prediction model generated, it uses a behavioral statistical test and a phase fix procedure to adjust time phase distortions observed in stock market time series. Furthermore, an experimental analysis is conducted with the proposed method through four Brazilian stock market time series, and the achieved results are discussed and compared to results found with random walk models and the previously introduced Time-delay Added Evolutionary Forecasting (TAEF) and Morphological-Rank-Linear Time-lag Added Evolutionary Forecasting (MRLTAEF) methods.
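
The phase-space reconstruction that the QIEA searches time lags for is plain time-delay embedding of the scalar series. A sketch with an assumed set of lags on a toy series (not real market data):

```python
import numpy as np

def delay_embed(series, lags):
    """Stack the series with lagged copies of itself: row t is
    (x[t], x[t-lags[0]], x[t-lags[1]], ...)."""
    m = max(lags)
    rows = len(series) - m
    return np.column_stack([series[m - lag : m - lag + rows] for lag in (0, *lags)])

x = np.sin(0.3 * np.arange(100))        # toy series standing in for a stock index
X = delay_embed(x, lags=(1, 3, 7))
print(X.shape)                           # -> (93, 4)
```

Each embedded row becomes one input vector for the predictor; the evolutionary search simply picks which lags make those vectors most informative.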

  14. Innovative Mixed-Methods Research: Moving beyond Design Technicalities to Epistemological and Methodological Realizations

    ERIC Educational Resources Information Center

    Riazi, A. Mehdi

    2016-01-01

    Mixed-methods research (MMR), as an inter-discourse (quantitative and qualitative) methodology, can provide applied linguistics researchers the opportunity to draw on and integrate the strengths of the two research methodological approaches in favour of making more rigorous inferences about research problems. In this article, the argument is made…

  15. Methodological, Theoretical, Infrastructural, and Design Issues in Conducting Good Outcome Studies

    ERIC Educational Resources Information Center

    Kelly, Michael P.; Moore, Tessa A.

    2011-01-01

    This article outlines a set of methodological, theoretical, and other issues relating to the conduct of good outcome studies. The article begins by considering the contribution of evidence-based medicine to the methodology of outcome research. The lessons which can be applied in outcome studies in nonmedical settings are described. The article…

  16. LOX/hydrocarbon rocket engine analytical design methodology development and validation. Volume 1: Executive summary and technical narrative

    NASA Technical Reports Server (NTRS)

    Pieper, Jerry L.; Walker, Richard E.

    1993-01-01

    During the past three decades, an enormous amount of resources was expended in the design and development of Liquid Oxygen/Hydrocarbon and Hydrogen (LOX/HC and LOX/H2) rocket engines. A significant portion of these resources was used to develop and demonstrate the performance and combustion stability for each new engine. During these efforts, many analytical and empirical models were developed that characterize design parameters and combustion processes that influence performance and stability. Many of these models are suitable as design tools, but they have not been assembled into an industry-wide usable analytical design methodology. The objective of this program was to assemble existing performance and combustion stability models into a usable methodology capable of producing high performing and stable LOX/hydrocarbon and LOX/hydrogen propellant booster engines.

  17. Introduction to the Design and Optimization of Experiments Using Response Surface Methodology. A Gas Chromatography Experiment for the Instrumentation Laboratory

    ERIC Educational Resources Information Center

    Lang, Patricia L.; Miller, Benjamin I.; Nowak, Abigail Tuttle

    2006-01-01

    The study describes how to design and optimize an experiment with multiple factors and multiple responses. The experiment uses fractional factorial analysis as a screening experiment only to identify important instrumental factors and does not use response surface methodology to find the optimal set of conditions.

  18. Methodology for the preliminary design of high performance schools in hot and humid climates

    NASA Astrophysics Data System (ADS)

    Im, Piljae

    A methodology to develop an easy-to-use toolkit for the preliminary design of high performance schools in hot and humid climates was presented. The toolkit proposed in this research will allow decision makers without simulation knowledge to evaluate energy-efficient measures for K-5 schools easily and accurately, which would contribute to the accelerated dissemination of energy-efficient design. For the development of the toolkit, first, a survey was performed to identify high performance measures available today being implemented in new K-5 school buildings. Then an existing case-study school building in a hot and humid climate was selected and analyzed to understand the energy use pattern in a school building and to be used in developing a calibrated simulation. Based on the information from the previous step, an as-built and calibrated simulation was then developed. To accomplish this, five calibration steps were performed to match the simulation results with the measured energy use. The five steps include: (1) Using an actual 2006 weather file with measured solar radiation, (2) Modifying lighting & equipment schedule using ASHRAE's RP-1093 methods, (3) Using actual equipment performance curves (i.e., scroll chiller), (4) Using Winkelmann's method for the underground floor heat transfer, and (5) Modifying the HVAC and room setpoint temperature based on the measured field data. Next, the calibrated simulation of the case-study K-5 school was compared to an ASHRAE Standard 90.1-1999 code-compliant school. In the next step, the energy savings potentials from the application of several high performance measures to an equivalent ASHRAE Standard 90.1-1999 code-compliant school were estimated. The high performance measures applied included the recommendations from the ASHRAE Advanced Energy Design Guides (AEDG) for K-12 and other high performance measures from the literature review as well as a daylighting strategy and solar PV and thermal systems. The results show that the net

  19. A computer modeling methodology and tool for assessing design concepts for the Space Station Data Management System

    NASA Technical Reports Server (NTRS)

    Jones, W. R.

    1986-01-01

    A computer modeling tool is being developed to assess candidate designs for the Space Station Data Management System (DMS). The DMS is to be a complex distributed computer system including the processor, storage devices, local area networks, and software that will support all processing functions onboard the Space Station. The modeling tool will allow a candidate design for the DMS, or for other subsystems that use the DMS, to be evaluated in terms of parameters. The tool and its associated modeling methodology are intended for use by DMS and subsystem designers to perform tradeoff analyses between design concepts using varied architectures and technologies.

  20. What Evidence Underlies Clinical Practice in Paediatric Surgery? A Systematic Review Assessing Choice of Study Design

    PubMed Central

    Allin, Benjamin; Knight, Marian

    2016-01-01

    Objective Identify every paediatric surgical article published in 1998 and every paediatric surgical article published in 2013, and determine which study designs were used and whether they were appropriate for robustly assessing interventions in surgical conditions. Methods A systematic review was conducted according to a pre-specified protocol (CRD42014007629), using EMBASE and Medline. Non-English language studies were excluded. Studies were included if meeting population criteria and either condition or intervention criteria. Population: Children under the age of 18, or adults who underwent intervention for a condition managed by paediatric surgeons when they were under 18 years of age. Condition: One managed by general paediatric surgeons. Intervention: Used for treatment of a condition managed by general paediatric surgeons. Main Outcome Measure Studies were classified according to whether the IDEAL collaboration recommended their design for assessing surgical interventions or not. Change in proportions between 1998 and 2013 was calculated. Results 1581 paediatric surgical articles were published in 1998, and 3453 in 2013. The most commonly used design, accounting for 45% of studies in 1998 and 46.8% in 2013, was the retrospective case series. Only 1.8% of studies were RCTs in 1998, and 1.9% in 2013. Overall, in 1998, 9.8% of studies used a recommended design. In 2013, 11.9% used a recommended design (proportion increase 2.3%, 95% confidence interval 0.5% increase to 4% increase, p = 0.017). Conclusions and Relevance A low proportion of published paediatric surgical manuscripts utilise a design that is recommended for assessing surgical interventions. RCTs represent fewer than 1 in 50 studies. In 2013, 88.1% of studies used a less robust design, suggesting the need for a new way of approaching paediatric surgical research. PMID:26959824
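
The reported change in proportions can be reproduced approximately from the published percentages and denominators. The counts below are reconstructed from 9.8% of 1581 and 11.9% of 3453 and may differ from the exact ones by rounding, which is why the figures land close to, but not exactly on, the quoted 2.3% (0.5% to 4%):

```python
import math

def prop_diff_ci(x1, n1, x2, n2, z=1.96):
    """Difference in two proportions with a Wald 95% confidence interval."""
    p1, p2 = x1 / n1, x2 / n2
    d = p2 - p1
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return d, d - z * se, d + z * se

# Reconstructed counts: ~9.8% of 1581 studies in 1998, ~11.9% of 3453 in 2013.
d, lo, hi = prop_diff_ci(155, 1581, 411, 3453)
print(f"increase = {d:.1%}, 95% CI ({lo:.1%}, {hi:.1%})")
```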

  1. A systematic review of the preventive effect of oral hygiene on pneumonia and respiratory tract infection in elderly people in hospitals and nursing homes: effect estimates and methodological quality of randomized controlled trials.

    PubMed

    Sjögren, Petteri; Nilsson, Erika; Forsell, Marianne; Johansson, Olle; Hoogstraate, Janet

    2008-11-01

    The objective of this study was to investigate the preventive effect of oral hygiene on pneumonia and respiratory tract infection, focusing on elderly people in hospitals and nursing homes, by systematically reviewing effect estimates and methodological quality of randomized controlled trials (RCTs) and to provide an overview of additional clinical studies in this area. Literature searches were conducted in the Medline database, the Cochrane library databases, and by hand-searching reference lists. Included publications were analyzed for intervention (or topic) studied, main conclusions, strength of evidence, and study design. RCTs were further analyzed for effect magnitudes and methodological details. Absolute risk reductions (ARRs) and numbers needed to treat (NNTs) were calculated. Fifteen publications fulfilled the inclusion criteria. There was a wide variation in the design and quality of the studies included. The RCTs revealed positive preventive effects of oral hygiene on pneumonia and respiratory tract infection in hospitalized elderly people and elderly nursing home residents, with ARRs from 6.6% to 11.7% and NNTs from 8.6 to 15.3 individuals. The non-RCT studies contributed to inconclusive evidence on the association and correlation between oral hygiene and pneumonia or respiratory tract infection in elderly people. Mechanical oral hygiene has a preventive effect on mortality from pneumonia, and non-fatal pneumonia in hospitalized elderly people and elderly nursing home residents. Approximately one in 10 cases of death from pneumonia in elderly nursing home residents may be prevented by improving oral hygiene. Future research in this area should be focused on high-quality RCTs with appropriate sample size calculations.
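    The number needed to treat quoted above is simply the reciprocal of the absolute risk reduction. A minimal sketch (small rounding differences against the published NNTs are expected, since the ARRs are reported to one decimal place):

```python
def nnt(arr):
    """Number needed to treat: reciprocal of the absolute risk reduction."""
    if arr <= 0:
        raise ValueError("ARR must be positive to compute an NNT")
    return 1.0 / arr

# The reported ARR range of 6.6%-11.7% maps to roughly 9 to 15 individuals
nnt_low, nnt_high = nnt(0.117), nnt(0.066)
print(f"NNT range: {nnt_low:.1f} to {nnt_high:.1f}")
```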

  2. A systematic approach to designing a multiphase unsaturated zone monitoring network

    SciTech Connect

    Cullen, S.J.; Kramer, J.H.; Luellen, J.R.

    1995-10-01

    A systematic approach is presented for the design of a multiphase vadose zone monitoring system, recognizing that, as in ground water monitoring system design, complete subsurface coverage is not practical. The approach includes identification and prioritization of vulnerable areas; selection of cost-effective indirect monitoring methods that will provide early warning of contaminant migration; selection of direct monitoring methods for diagnostic confirmation; identification of background monitoring locations; and identification of an appropriate temporal monitoring plan. An example of a monitoring system designed for a solid waste landfill is presented and utilized to illustrate the approach and provide details of system implementation. The example design described incorporates the use of neutron moisture probes deployed in both vertical and horizontal access tubes beneath the leachate recovery collection system of the landfill. Early warning of gaseous phase contaminant migration is monitored utilizing whole-air active soil gas sampling points deployed in gravel-filled trenches beneath the subgrade. Diagnostic confirmation of contaminant migration is provided utilizing pore-liquid samplers. Conservative tracers can be used to distinguish chemical species released by a landfill from those attributable to other (e.g., off-site) sources or present naturally in the subsurface. A discussion of background monitoring point location is also presented.

  3. Tornado missile simulation and design methodology. Volume 2: model verification and data base updates. Final report

    SciTech Connect

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments.
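    TORMIS itself is not reproduced here; the toy sketch below only illustrates the underlying Monte Carlo technique of estimating a small event probability by sampling, with made-up per-trial probabilities standing in for the tornado wind-field, missile injection, and transport models.

```python
import random

def estimate_impact_probability(trials, p_tornado, p_missile_hit, rng=None):
    """Monte Carlo estimate of P(tornado occurs AND a missile impacts the target).

    p_tornado and p_missile_hit are hypothetical per-trial probabilities; the
    real TORMIS code samples full tornado and missile-transport models instead.
    """
    rng = rng or random.Random(0)  # fixed seed for reproducibility
    hits = sum(
        1
        for _ in range(trials)
        if rng.random() < p_tornado and rng.random() < p_missile_hit
    )
    return hits / trials

p = estimate_impact_probability(200_000, p_tornado=0.01, p_missile_hit=0.05)
print(f"estimated impact probability ~ {p:.2e}")  # analytic value is 0.01 * 0.05 = 5e-4
```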

  4. The Trial of Mid-Urethral Slings (TOMUS): Design and Methodology

    PubMed Central

    2013-01-01

    Objective Mid-urethral slings (MUS) are increasingly common surgical procedures for the treatment of stress urinary incontinence (SUI) in women. There are currently no adequately powered trials with sufficient length of follow-up comparing the efficacy or safety of the transobturator and retropubic MUS. As a result, no selection criteria are available to guide surgeons or patients. This article describes the methodology and rationale for the Trial Of Mid-Urethral Slings (TOMUS). Patients and Methods The primary aim of this randomized controlled trial is to compare subjective and objective success rates for urinary incontinence (UI) at 12 and 24 months following retropubic and transobturator MUS procedures. Secondary aims are to compare the resolution of overall and stress-specific UI, morbidity, the time to adequate voiding, satisfaction, and quality of life in the two groups. TOMUS will also assess the clinical utility of pre-operative urodynamics in women undergoing MUS procedures. The primary outcome will be obtained at 12 months and 24 months. The definition of treatment success is two-fold. Objective treatment success is defined by a negative stress test, a negative 24-hour pad test and no retreatment for SUI. Subjective treatment success is defined by no self-reported leakage on 3-day diary and no self-reported SUI symptoms. Enrollment began April 2006 and is expected to be complete in 2 years. Conclusions The TOMUS trial is designed to provide outcome and safety information to pelvic surgeons and their patients on the two most commonly performed MUS techniques. PMID:24772006

  5. Rationale, Design, Methodology and Hospital Characteristics of the First Gulf Acute Heart Failure Registry (Gulf CARE)

    PubMed Central

    Sulaiman, Kadhim J.; Panduranga, Prashanth; Al-Zakwani, Ibrahim; Alsheikh-Ali, Alawi; Al-Habib, Khalid; Al-Suwaidi, Jassim; Al-Mahmeed, Wael; Al-Faleh, Husam; El-Asfar, Abdelfatah; Al-Motarreb, Ahmed; Ridha, Mustafa; Bulbanat, Bassam; Al-Jarallah, Mohammed; Bazargani, Nooshin; Asaad, Nidal; Amin, Haitham

    2014-01-01

    Background: There is paucity of data on heart failure (HF) in the Gulf Middle East. The present paper describes the rationale, design, methodology and hospital characteristics of the first Gulf acute heart failure registry (Gulf CARE). Materials and Methods: Gulf CARE is a prospective, multicenter, multinational registry of patients >18 years of age admitted with diagnosis of acute HF (AHF). The data collected included demographics, clinical characteristics, etiology, precipitating factors, management and outcomes of patients admitted with AHF. In addition, data about hospital readmission rates, procedures and mortality at 3 months and 1-year follow-up were recorded. Hospital characteristics and care provider details were collected. Data were entered in a dedicated website using an electronic case record form. Results: A total of 5005 consecutive patients were enrolled from February 14, 2012 to November 13, 2012. Forty-seven hospitals in 7 Gulf States (Oman, Saudi Arabia, Yemen, Kuwait, United Arab Emirates, Qatar and Bahrain) participated in the project. The majority of hospitals were community hospitals (46%; 22/47) followed by non-University teaching hospitals (32%; 15/47) and University hospitals (17%). Most of the hospitals had intensive or coronary care unit facilities (93%; 44/47) with 59% (28/47) having catheterization laboratory facilities. However, only 29% (14/47) had a dedicated HF clinic facility. Most patients (71%) were cared for by a cardiologist. Conclusions: Gulf CARE is the first prospective registry of AHF in the Middle East, intending to provide a unique insight into the demographics, etiology, management and outcomes of AHF in the Middle East. HF management in the Middle East is predominantly provided by cardiologists. The data obtained from this registry will help the local clinicians to identify the deficiencies in HF management as well as provide a platform to implement evidence based preventive and treatment strategies to reduce the burden of HF in

  6. SMET: systematic multiple enzyme targeting - a method to rationally design optimal strains for target chemical overproduction.

    PubMed

    Flowers, David; Thompson, R Adam; Birdwell, Douglas; Wang, Tsewei; Trinh, Cong T

    2013-05-01

    Identifying multiple enzyme targets for metabolic engineering is very critical for redirecting cellular metabolism to achieve desirable phenotypes, e.g., overproduction of a target chemical. The challenge is to determine which enzymes and how much of these enzymes should be manipulated by adding, deleting, under-, and/or over-expressing associated genes. In this study, we report the development of a systematic multiple enzyme targeting method (SMET), to rationally design optimal strains for target chemical overproduction. The SMET method combines both elementary mode analysis and ensemble metabolic modeling to derive SMET metrics including l-values and c-values that can identify rate-limiting reaction steps and suggest which enzymes and how much of these enzymes to manipulate to enhance product yields, titers, and productivities. We illustrated, tested, and validated the SMET method by analyzing two networks, a simple network for concept demonstration and an Escherichia coli metabolic network for aromatic amino acid overproduction. The SMET method could systematically predict simultaneous multiple enzyme targets and their optimized expression levels, consistent with experimental data from the literature, without performing an iterative sequence of single-enzyme perturbation. The SMET method was much more efficient and effective than single-enzyme perturbation in terms of computation time and finding improved solutions.

  7. Generic design methodology for the development of three-dimensional structured-light sensory systems for measuring complex objects

    NASA Astrophysics Data System (ADS)

    Marin, Veronica E.; Chang, Wei Hao Wayne; Nejat, Goldie

    2014-11-01

    Structured-light (SL) techniques are emerging as popular noncontact approaches for obtaining three-dimensional (3-D) measurements of complex objects for real-time applications in manufacturing, bioengineering, and robotics. The performance of SL systems is determined by the emitting (i.e., projector) and capturing (i.e., camera) hardware components and the triangulation configuration between them and an object of interest. A generic design methodology is presented to determine optimal triangulation configurations for SL systems. These optimal configurations are determined with respect to a set of performance metrics: (1) minimizing the 3-D reconstruction errors, (2) maximizing the pixel-to-pixel correspondence between the projector and camera, and (3) maximizing the dispersion of the measured 3-D points within a measurement volume, while satisfying design constraints based on hardware and user-defined specifications. The proposed methodology utilizes a 3-D geometric triangulation model based on ray-tracing geometry and pin-hole models for the projector and camera. Using the methodology, a set of optimal system configurations can be determined for a given set of hardware components. The design methodology was applied to a real-time SL system for surface profiling of complex objects. Experiments were conducted with an optimal sensor configuration and its performance verified with respect to a nonoptimal hardware configuration.
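    At the core of such a ray-tracing triangulation model is intersecting a camera ray with a projector ray. A minimal 2-D sketch of that single step follows; the paper's actual model is 3-D and incorporates pin-hole calibration and hardware constraints not shown here.

```python
def triangulate_2d(cam_pos, cam_dir, proj_pos, proj_dir):
    """Intersect two 2-D rays (camera and projector) to recover a surface point.

    Solves cam_pos + t*cam_dir = proj_pos + s*proj_dir for t via Cramer's rule.
    """
    (cx, cy), (dx, dy) = cam_pos, cam_dir
    (px, py), (ex, ey) = proj_pos, proj_dir
    det = dx * (-ey) - (-ex) * dy
    if abs(det) < 1e-12:
        raise ValueError("rays are parallel: degenerate triangulation")
    rx, ry = px - cx, py - cy
    t = (rx * (-ey) - (-ex) * ry) / det
    return cx + t * dx, cy + t * dy

# Camera at the origin looking along 45 deg, projector at (1, 0) along 135 deg
point = triangulate_2d((0.0, 0.0), (1.0, 1.0), (1.0, 0.0), (-1.0, 1.0))
print(point)
```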

  8. Design, implementation and reporting strategies to reduce the instance and impact of missing patient-reported outcome (PRO) data: a systematic review

    PubMed Central

    Mercieca-Bebber, Rebecca; Palmer, Michael J; Brundage, Michael; Stockler, Martin R; King, Madeleine T

    2016-01-01

    Objectives Patient-reported outcomes (PROs) provide important information about the impact of treatment from the patients' perspective. However, missing PRO data may compromise the interpretability and value of the findings. We aimed to report: (1) a non-technical summary of problems caused by missing PRO data; and (2) a systematic review by collating strategies to: (A) minimise rates of missing PRO data, and (B) facilitate transparent interpretation and reporting of missing PRO data in clinical research. Our systematic review does not address statistical handling of missing PRO data. Data sources MEDLINE and Cumulative Index to Nursing and Allied Health Literature (CINAHL) databases (inception to 31 March 2015), and citing articles and reference lists from relevant sources. Eligibility criteria English articles providing recommendations for reducing missing PRO data rates, or strategies to facilitate transparent interpretation and reporting of missing PRO data were included. Methods 2 reviewers independently screened articles against eligibility criteria. Discrepancies were resolved with the research team. Recommendations were extracted and coded according to framework synthesis. Results 117 sources (55% discussion papers, 26% original research) met the eligibility criteria. Design and methodological strategies for reducing rates of missing PRO data included: incorporating PRO-specific information into the protocol; carefully designing PRO assessment schedules and defining termination rules; minimising patient burden; appointing a PRO coordinator; PRO-specific training for staff; ensuring PRO studies are adequately resourced; and continuous quality assurance. Strategies for transparent interpretation and reporting of missing PRO data include utilising auxiliary data to inform analysis; transparently reporting baseline PRO scores, rates and reasons for missing data; and methods for handling missing PRO data. Conclusions The instance of missing PRO data and its

  9. Using Economic Evidence to Set Healthcare Priorities in Low‐Income and Lower‐Middle‐Income Countries: A Systematic Review of Methodological Frameworks

    PubMed Central

    Mitton, Craig; Doyle‐Waters, Mary M.; Drake, Tom; Conteh, Lesong; Newall, Anthony T.; Onwujekwe, Obinna; Jan, Stephen

    2016-01-01

    Abstract Policy makers in low‐income and lower‐middle‐income countries (LMICs) are increasingly looking to develop ‘evidence‐based’ frameworks for identifying priority health interventions. This paper synthesises and appraises the literature on methodological frameworks – which incorporate economic evaluation evidence – for the purpose of setting healthcare priorities in LMICs. A systematic search of Embase, MEDLINE, Econlit and PubMed identified 3968 articles with a further 21 articles identified through manual searching. A total of 36 papers were eligible for inclusion. These covered a wide range of health interventions with only two studies including health systems strengthening interventions related to financing, governance and human resources. A little under half of the studies (39%) included multiple criteria for priority setting, most commonly equity, feasibility and disease severity. Most studies (91%) specified a measure of ‘efficiency’ defined as cost per disability‐adjusted life year averted. Ranking of health interventions using multi‐criteria decision analysis and generalised cost‐effectiveness were the most common frameworks for identifying priority health interventions. Approximately a third of studies discussed the affordability of priority interventions. Only one study identified priority areas for the release or redeployment of resources. The paper concludes by highlighting the need for local capacity to conduct evaluations (including economic analysis) and empowerment of local decision‐makers to act on this evidence. PMID:26804361

  10. Using Economic Evidence to Set Healthcare Priorities in Low-Income and Lower-Middle-Income Countries: A Systematic Review of Methodological Frameworks.

    PubMed

    Wiseman, Virginia; Mitton, Craig; Doyle-Waters, Mary M; Drake, Tom; Conteh, Lesong; Newall, Anthony T; Onwujekwe, Obinna; Jan, Stephen

    2016-02-01

    Policy makers in low-income and lower-middle-income countries (LMICs) are increasingly looking to develop 'evidence-based' frameworks for identifying priority health interventions. This paper synthesises and appraises the literature on methodological frameworks--which incorporate economic evaluation evidence--for the purpose of setting healthcare priorities in LMICs. A systematic search of Embase, MEDLINE, Econlit and PubMed identified 3968 articles with a further 21 articles identified through manual searching. A total of 36 papers were eligible for inclusion. These covered a wide range of health interventions with only two studies including health systems strengthening interventions related to financing, governance and human resources. A little under half of the studies (39%) included multiple criteria for priority setting, most commonly equity, feasibility and disease severity. Most studies (91%) specified a measure of 'efficiency' defined as cost per disability-adjusted life year averted. Ranking of health interventions using multi-criteria decision analysis and generalised cost-effectiveness were the most common frameworks for identifying priority health interventions. Approximately a third of studies discussed the affordability of priority interventions. Only one study identified priority areas for the release or redeployment of resources. The paper concludes by highlighting the need for local capacity to conduct evaluations (including economic analysis) and empowerment of local decision-makers to act on this evidence.

  11. Systematic approach for PID controller design for pitch-regulated, variable-speed wind turbines

    SciTech Connect

    Hand, M.M.; Balas, M.J.

    1997-11-01

    Variable-speed, horizontal axis wind turbines use blade-pitch control to meet specified objectives for three regions of operation. This paper focuses on controller design for the constant power production regime. A simple, rigid, non-linear turbine model was used to systematically perform trade-off studies between two performance metrics. Minimization of both the deviation of the rotor speed from the desired speed and the motion of the actuator is desired. The robust nature of the proportional-integral-derivative (PID) controller is illustrated, and optimal operating conditions are determined. Because numerous simulation runs may be completed in a short time, the relationship of the two opposing metrics is easily visualized. 2 refs., 9 figs.
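    The PID law evaluated in these trade-off studies can be sketched in a few lines. The gains, time step, and first-order rotor response below are invented for illustration; they are not the turbine model or tuning used in the paper.

```python
class PID:
    """Discrete proportional-integral-derivative controller."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a crude rotor-speed model toward 42 rpm (all numbers are illustrative)
pid = PID(kp=0.8, ki=0.5, kd=0.05, dt=0.1)
speed = 30.0
for _ in range(200):
    pitch_command = pid.update(42.0, speed)
    speed += 0.1 * pitch_command  # toy plant: speed responds linearly to the command
print(f"final rotor speed: {speed:.2f} rpm")
```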

  12. A Multi-Objective Advanced Design Methodology of Composite Beam-to-Column Joints Subjected to Seismic and Fire Loads

    NASA Astrophysics Data System (ADS)

    Pucinotti, Raffaele; Ferrario, Fabio; Bursi, Oreste S.

    2008-07-01

    A multi-objective advanced design methodology dealing with seismic actions followed by fire on steel-concrete composite full strength joints with concrete filled tubes is proposed in this paper. The specimens were designed in detail in order to exhibit a suitable fire behaviour after a severe earthquake. The major aspects of the cyclic behaviour of composite joints are presented and commented upon. The data obtained from monotonic and cyclic experimental tests have been used to calibrate a model of the joint in order to perform seismic simulations on several moment resisting frames. A hysteretic law was used to take into account the seismic degradation of the joints. Finally, fire tests were conducted with the objective to evaluate fire resistance of the connection already damaged by an earthquake. The experimental activity together with FE simulation demonstrated the adequacy of the advanced design methodology.

  13. Design methodology for nano-engineered surfaces to control adhesion: Application to the anti-adhesion of particles

    NASA Astrophysics Data System (ADS)

    Kim, Taekyung; Min, Cheongwan; Jung, Myungki; Lee, Jinhyung; Park, Changsu; Kang, Shinill

    2016-12-01

    With increasing demand for means of controlling surface adhesion in various applications, including the semiconductor industry, optics, micro/nanoelectromechanical systems, and the medical industry, nano-engineered surfaces have attracted much attention. This study suggests a design methodology for nanostructures using the Derjaguin approximation in conjunction with finite element analysis for the control of adhesion forces. The suggested design methodology was applied for designing a nano-engineered surface with low-adhesion properties. To verify this, rectangular and sinusoidal nanostructures were fabricated and analyzed using force-distance curve measurements using atomic force microscopy and centrifugal detachment testing. For force-distance curve measurements, modified cantilevers with tips formed with atypical particles were used. Subsequently, centrifugal detachment tests were also conducted. The surface wettability of rectangular and sinusoidal nanostructures was measured and compared with the measured adhesion force and the number of particles remaining after centrifugal detachment tests.

  14. The role of intervention mapping in designing disease prevention interventions: A systematic review of the literature

    PubMed Central

    Garba, Rayyan M.; Gadanya, Muktar A.

    2017-01-01

    Objective To assess the role of Intervention Mapping (IM) in designing disease prevention interventions worldwide. Methods Systematic search and review of the relevant literature—peer-reviewed and grey—was conducted using the Preferred Reporting Items for Systematic Reviews and Meta-analysis (PRISMA) guidelines. Findings Only five of the twenty-two included studies were RCTs that compared an intervention using the IM protocol with a placebo intervention and provided outcomes in terms of percentage increase in the uptake of disease-prevention programmes; only one of the five studies provided an effect measure in the form of relative risk (RR = 1.59, 95% CI = 1.08–2.34, p = 0.02). Of the five RCTs, three were rated as strong evidence, one as medium evidence and one as weak evidence, and all reported statistically significant differences between the two study groups; interventions that used the intervention mapping approach generally reported significant increases in the uptake of disease-prevention interventions, ranging from 9% to 28.5% (0.0001 ≤ p ≤ 0.02). On the other hand, all 22 studies successfully identified the determinants of the uptake of disease prevention interventions, which is essential to the success of disease prevention programmes. Conclusion Intervention Mapping has been successfully used to plan, implement and evaluate interventions that showed significant increases in uptake of disease prevention programmes. This study has provided a good understanding of the role of intervention mapping in designing disease prevention interventions, and a good foundation upon which subsequent reviews can be guided. PMID:28358821
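    The single effect measure quoted above (RR = 1.59, 95% CI 1.08–2.34) is a relative risk with a log-scale confidence interval. The sketch below shows that standard calculation on hypothetical 2×2 counts (48/100 vs 30/100, chosen only to give numbers of similar magnitude; they are not the study's data).

```python
import math

def relative_risk_ci(a, n1, b, n2, z=1.96):
    """Relative risk (a/n1)/(b/n2) with a 95% CI computed on the log scale."""
    rr = (a / n1) / (b / n2)
    se_log = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi

# Hypothetical counts: 48/100 uptake with intervention mapping vs 30/100 without
rr, lo, hi = relative_risk_ci(48, 100, 30, 100)
print(f"RR = {rr:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```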

  15. Review of the probabilistic failure analysis methodology and other probabilistic approaches for application in aerospace structural design

    NASA Technical Reports Server (NTRS)

    Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.

    1993-01-01

    Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus, costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.

  16. Systematic generation of chemical structures for rational drug design based on QSAR models.

    PubMed

    Funatsu, Kimito; Miyao, Tomoyuki; Arakawa, Masamoto

    2011-03-01

    The first step in the process of drug development is to determine those lead compounds that demonstrate significant biological activity with regard to a target protein. Because this process is often costly and time consuming, there is a need to develop efficient methodologies for the generation of lead compounds for practical drug design. One promising approach for determining a potent lead compound is computational virtual screening. The biological activities of candidate structures found in virtual libraries are estimated by using quantitative structure activity relationship (QSAR) models and/or computational docking simulations. In virtual screening studies, databases of existing drugs or natural products are commonly used as a source of lead candidates. However, these databases are not sufficient for the purpose of finding lead candidates having novel scaffolds. Therefore, a method must be developed to generate novel molecular structures to indicate high activity for efficient lead discovery. In this paper, we review current trends in structure generation methods for drug design and discuss future directions. First, we present an overview of lead discovery and drug design, and then, we review structure generation methods. Here, the structure generation methods are classified on the basis of whether or not they employ QSAR models for generating structures. We conclude that the use of QSAR models for structure generation is an effective method for computational lead discovery. Finally, we discuss the problems regarding the applicability domain of QSAR models and future directions in this field.

  17. Design and construction of nanoscale material for ultrasonic assisted adsorption of dyes: Application of derivative spectrophotometry and experimental design methodology.

    PubMed

    Bagheri, Ahmad Reza; Ghaedi, Mehrorang; Asfaram, Arash; Jannesar, Ramin; Goudarzi, Alireza

    2017-03-01

    Response surface methodology (RSM) based on a central rotatable experimental design was used to investigate the effect of process variables on the ultrasound-assisted simultaneous adsorption of dyes onto Cu: ZnS-NPs-AC from aqueous solution. Cu: ZnS-NPs-AC was characterized using field emission scanning electron microscopy (FE-SEM), Energy Dispersive X-ray Spectroscopy (EDX) and X-ray diffraction (XRD). To overcome the severe methylene blue (MB) and brilliant green (BG) dyes spectral overlapping, derivative spectrophotometric methods were successfully applied for the simultaneous determination of dyes in their binary solutions. Simultaneous determination of the dyes can be carried out using the first-order and second-order derivative signal at 664 and 663nm for BG and MB, respectively. The factors investigated were pH (2.5-8.5), adsorbent mass (0.006-0.030g), sonication time (1-5min) and initial MB and BG concentration (3-15mgL(-1)). Five levels, which were low level, center point, upper level and two axial points, were considered for each of the factors. The desirability function (DF: 0.9853) on the STATISTICA version 10.0 software showed that the optimum removal (99.832 and 99.423% for MB and BG, respectively) was obtained at pH 8.0, adsorbent mass 0.024g, sonication time 4min and 9mgL(-1) initial concentration for each dye. Besides, the results show that the obtained data were adequately fitted by the second-order polynomial model, since the calculated model F value (172.96 and 96.35 for MB and BG, respectively) is higher than the critical F value. The values of the coefficient of determination (0.9968 and 0.9943 for MB and BG, respectively) and adjusted coefficient of determination (0.9911 and 0.9840 for MB and BG, respectively) are close to 1, indicating a high correlation between the observed and the predicted values. The ultrasonic amplitude and adsorbent mass were found to be the most effective variables influencing the adsorption process. The adsorption equilibrium was well
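    The adjusted coefficient of determination reported above follows from R², the number of runs, and the number of model terms. The run and term counts below (31 runs for one common 4-factor, 5-level central composite design; 14 non-intercept terms in a full quadratic model) are assumptions for illustration, not figures from the paper, so the result only approximates the reported 0.9911.

```python
def adjusted_r2(r2, n, p):
    """Adjusted coefficient of determination for n runs and p model terms."""
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

# Assumed design: 16 factorial + 8 axial + 7 center runs, full quadratic model
print(adjusted_r2(0.9968, n=31, p=14))
```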

  18. Thermal Hydraulics Design and Analysis Methodology for a Solid-Core Nuclear Thermal Rocket Engine Thrust Chamber

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Canabal, Francisco; Chen, Yen-Sen; Cheng, Gary; Ito, Yasushi

    2013-01-01

    Nuclear thermal propulsion is a leading candidate for in-space propulsion for human Mars missions. This chapter describes a thermal hydraulics design and analysis methodology developed at the NASA Marshall Space Flight Center, in support of the nuclear thermal propulsion development effort. The objective of this campaign is to bridge the design methods of the Rover/NERVA era with a modern computational fluid dynamics and heat transfer methodology, to predict thermal, fluid, and hydrogen environments of a hypothetical solid-core nuclear thermal engine, the Small Engine, designed in the 1960s. The computational methodology is based on an unstructured-grid, pressure-based, all speeds, chemically reacting, computational fluid dynamics and heat transfer platform, while formulations of flow and heat transfer through porous and solid media were implemented to describe those of hydrogen flow channels inside the solid core. Design analyses of a single flow element and the entire solid-core thrust chamber of the Small Engine were performed and the results are presented herein.

  19. Study-design selection criteria in systematic reviews of effectiveness of health systems interventions and reforms: A meta-review.

    PubMed

    Rockers, Peter C; Feigl, Andrea B; Røttingen, John-Arne; Fretheim, Atle; de Ferranti, David; Lavis, John N; Melberg, Hans Olav; Bärnighausen, Till

    2012-03-01

    At present, there exists no widely agreed upon set of study-design selection criteria for systematic reviews of health systems research, except for those proposed by the Cochrane Collaboration's Effective Practice and Organisation of Care (EPOC) review group (which comprises randomized controlled trials, controlled clinical trials, controlled before-after studies, and interrupted time series). We conducted a meta-review of the study-design selection criteria used in systematic reviews available in the McMaster University's Health Systems Evidence or the EPOC database. Of 414 systematic reviews, 13% did not indicate any study-design selection criteria. Of the 359 studies that described such criteria, 50% limited their synthesis to controlled trials and 68% to some or all of the designs defined by the EPOC criteria. Seven out of eight reviews identified at least one controlled trial that was relevant for the review topic. Seven percent of the reviews included either no or only one relevant primary study. Our meta-review reveals reviewers' preferences for restricting synthesis to controlled experiments or study designs that comply with the EPOC criteria. We discuss the advantages and disadvantages of the current practices regarding study-design selection in systematic reviews of health systems research as well as alternative approaches.

  20. Active lower limb prosthetics: a systematic review of design issues and solutions.

    PubMed

    Windrich, Michael; Grimmer, Martin; Christ, Oliver; Rinderknecht, Stephan; Beckerle, Philipp

    2016-12-19

    This paper presents a review of design issues and solutions found in active lower limb prostheses. The review is based on a systematic literature search with a methodical search strategy. The search was carried out across four major technical databases, and the retrieved records were screened for relevance. A total of 21 different active prostheses were identified, including 8 above-knee, 9 below-knee and 4 combined knee-ankle prostheses. While an active prosthesis may help to restore the functional performance of an amputee, the requirements on the actuation unit and the control system are high, and development becomes a challenging task. Regarding the mechanical design and the actuation unit, high force/torque delivery, high efficiency, small size and low weight are conflicting goals. The actuation principle and variable impedance actuators are discussed. The control system is paramount for a "natural functioning" of the prosthesis: it has to enable locomotion and should react to the amputee's intent. For this, multi-level control approaches are reviewed.

  1. Grounded Theory as a Methodology to Design Teaching Strategies for Historically Informed Musical Performance

    ERIC Educational Resources Information Center

    Mateos-Moreno, Daniel; Alcaraz-Iborra, Mario

    2013-01-01

    Our work highlights the necessity of revising the materials employed in instrumental education, which are systematically based on a progressive development of technical abilities and address issues concerning the interpretation of different periods and styles only transversally, without a structured sequence of contents. In order to elaborate…

  2. Contentious issues in research on trafficked women working in the sex industry: study design, ethics, and methodology.

    PubMed

    Cwikel, Julie; Hoban, Elizabeth

    2005-11-01

    The trafficking of women and children for work in the globalized sex industry is a worldwide social problem. Quality data are needed to provide a basis for legislation, policy, and programs, but first, numerous research design, ethical, and methodological problems must be addressed. Research design issues in studying women trafficked for sex work (WTSW) include how to (a) develop coalitions to fund and support research, (b) maintain a critical stance on prostitution, and therefore WTSW, (c) use multiple paradigms and methods to accurately reflect WTSW's reality, (d) present the purpose of the study, and (e) protect respondents' identities. Ethical issues include (a) complications with informed consent procedures, (b) problematic access to WTSW, (c) loss of WTSW to follow-up, (d) inability to intervene in illegal acts or human rights violations, and (e) the need to maintain trustworthiness as researchers. Methodological issues include (a) constructing representative samples, (b) managing media interest, and (c) handling incriminating materials about law enforcement and immigration.

  3. Observations on computational methodologies for use in large-scale, gradient-based, multidisciplinary design incorporating advanced CFD codes

    NASA Technical Reports Server (NTRS)

    Newman, P. A.; Hou, G. J.-W.; Jones, H. E.; Taylor, A. C., III; Korivi, V. M.

    1992-01-01

    It is briefly outlined how a combination of various computational methodologies could reduce the enormous computational costs envisioned in using advanced CFD codes in gradient-based, optimized multidisciplinary design (MdD) procedures. The implications of these MdD requirements for advanced CFD codes are somewhat different from those imposed by single-discipline design. A means of satisfying these MdD requirements for gradient information is presented which appears to permit: (1) some leeway in the CFD solution algorithms that can be used; (2) an extension to 3-D problems; and (3) straightforward use of other computational methodologies. Many of these observations have previously been discussed as possibilities for doing parts of the problem more efficiently; the contribution here is observing how they fit together in a mutually beneficial way.

  4. Switching from usual brand cigarettes to a tobacco-heating cigarette or snus: Part 1. Study design and methodology

    PubMed Central

    Ogden, Michael W.; Marano, Kristin M.; Jones, Bobbette A.; Stiles, Mitchell F.

    2015-01-01

    A randomized, multi-center study was conducted to assess potential improvement in health status measures, as well as changes in biomarkers of tobacco exposure and biomarkers of biological effect, in current adult cigarette smokers switched to tobacco-heating cigarettes, snus or ultra-low machine yield tobacco-burning cigarettes (50/group) evaluated over 24 weeks. Study design, conduct and methodology are presented here along with subjects' disposition, characteristics, compliance and safety results. This design and methodology, evaluating generally healthy adult smokers over a relatively short duration, proved feasible. Findings from this randomized study provide generalized knowledge of the risk continuum among various tobacco products (ClinicalTrials.gov Identifier: NCT02061917). PMID:26525849

  5. Switching from usual brand cigarettes to a tobacco-heating cigarette or snus: Part 1. Study design and methodology.

    PubMed

    Ogden, Michael W; Marano, Kristin M; Jones, Bobbette A; Stiles, Mitchell F

    2015-01-01

    A randomized, multi-center study was conducted to assess potential improvement in health status measures, as well as changes in biomarkers of tobacco exposure and biomarkers of biological effect, in current adult cigarette smokers switched to tobacco-heating cigarettes, snus or ultra-low machine yield tobacco-burning cigarettes (50/group) evaluated over 24 weeks. Study design, conduct and methodology are presented here along with subjects' disposition, characteristics, compliance and safety results. This design and methodology, evaluating generally healthy adult smokers over a relatively short duration, proved feasible. Findings from this randomized study provide generalized knowledge of the risk continuum among various tobacco products (ClinicalTrials.gov Identifier: NCT02061917).

  6. Design of integrated autopilot/autothrottle for NASA TSRV airplane using integral LQG methodology. [transport systems research vehicle

    NASA Technical Reports Server (NTRS)

    Kaminer, Isaac; Benson, Russell A.

    1989-01-01

    An integrated autopilot/autothrottle control system has been developed for the NASA transport systems research vehicle using a two-degree-of-freedom approach. Based on this approach, the feedback regulator was designed using an integral linear quadratic regulator design technique, which offers a systematic approach to satisfying desired feedback performance requirements and guarantees stability margins in both control and sensor loops. The resulting feedback controller was discretized and implemented using a delta coordinate concept, which allows for transient-free controller switching by initializing all controller states to zero and provides a simple solution for dealing with throttle-limiting cases.
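    The integral-LQR design step described above can be sketched numerically. The plant below is a hypothetical one-state system augmented with an integral-of-output state (it is not the TSRV model), and the weights are arbitrary tuning values:

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Illustrative 1-state plant x' = a*x + b*u, augmented with z' = x so the
# regulator gains integral action on the tracked output.
a, b = -0.5, 2.0
A = np.array([[a, 0.0],
              [1.0, 0.0]])        # second row integrates the plant output
B = np.array([[b], [0.0]])
Q = np.diag([10.0, 1.0])          # state and integral weights (tuning knobs)
R = np.array([[1.0]])             # control effort weight

P = solve_continuous_are(A, B, Q, R)     # Riccati solution
K = np.linalg.solve(R, B.T @ P)          # optimal gain, u = -K @ x
closed_loop = A - B @ K
# LQR guarantees a stable closed loop for a controllable pair (A, B):
assert np.all(np.linalg.eigvals(closed_loop).real < 0)
```

    The guaranteed stability margins the abstract refers to are a classical property of LQR state feedback; the discretization and delta-coordinate implementation steps are not shown here.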

  7. A framework for the systematic design of fed-batch strategies in mammalian cell culture.

    PubMed

    Kyriakopoulos, Sarantos; Kontoravdi, Cleo

    2014-12-01

    A methodology to calculate the required amounts of amino acids (a.a.) and glucose in feeds for animal cell culture, from monitoring of their levels in batch experiments, is presented herein. Experiments with the designed feeds on an antibody-producing Chinese hamster ovary cell line resulted in a 3-fold increase in titer compared to batch culture. Adding 40% more nutrients to the same feed further increased the yield, to 3.5-fold higher than in batch culture. Our results show that above a certain threshold there is no linear correlation between nutrient addition and the integral of viable cell concentration. In addition, although high ammonia levels hinder cell growth, they do not appear to affect specific antibody productivity, while we hypothesize that high extracellular lactate concentration is the cause of the metabolic shift towards lactate consumption in the cell line used. Overall, the performance of the designed feeds is comparable to that of a commercial feed that was tested in parallel. Expanding this approach to more nutrients, as well as changing the ratio of certain amino acids as informed by flux balance analysis, could achieve even higher yields.
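    The feed-design idea in this abstract (estimate cell-specific nutrient demand from batch time-course data, then size the feed for the target culture) can be sketched as below. All numbers are invented for illustration, and the single-nutrient calculation is a simplification of the paper's methodology:

```python
import numpy as np

# Hypothetical batch-culture measurements (illustrative, not from the paper)
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])      # culture time, days
vcd = np.array([0.3, 0.8, 1.9, 3.5, 4.2])    # viable cell density, 1e6 cells/mL
gln = np.array([4.0, 3.6, 2.7, 1.2, 0.3])    # glutamine concentration, mM

# Integral of viable cell density (trapezoid rule, computed explicitly)
ivcd = float(np.sum((vcd[1:] + vcd[:-1]) / 2.0 * np.diff(t)))

# Cell-specific consumption: total depletion per unit of IVCD
q_gln = (gln[0] - gln[-1]) / ivcd

# Feed sized to cover a target IVCD in the fed-batch run
target_ivcd = 25.0
feed_gln = q_gln * target_ivcd               # mM-equivalent to supply via feed
```

    The abstract's observation that nutrient addition stops paying off above a threshold is a reminder that this linear scaling is only a starting point for feed design.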

  8. Methodological Complications of Matching Designs under Real World Constraints: Lessons from a Study of Deeper Learning

    ERIC Educational Resources Information Center

    Zeiser, Kristina; Rickles, Jordan; Garet, Michael S.

    2014-01-01

    To help researchers understand potential issues one can encounter when conducting propensity matching studies in complex settings, this paper describes methodological complications faced when studying schools using deeper learning practices to improve college and career readiness. The study uses data from high schools located in six districts…

  9. Qualitative Case Study Methodology: Study Design and Implementation for Novice Researchers

    ERIC Educational Resources Information Center

    Baxter, Pamela; Jack, Susan

    2008-01-01

    Qualitative case study methodology provides tools for researchers to study complex phenomena within their contexts. When the approach is applied correctly, it becomes a valuable method for health science research to develop theory, evaluate programs, and develop interventions. The purpose of this paper is to guide the novice researcher in…

  10. Evaluation of the Probabilistic Design Methodology and Computer Code for Composite Structures

    DTIC Science & Technology

    2001-06-01

    conditional expectation method (CEM) to determine the failure probability of a specified failure event. This methodology was first verified by...function into several conditional failure functions. The probability of failure for each conditional failure function is first calculated using the CEM. The

  11. Impact Evaluation of Quality Assurance in Higher Education: Methodology and Causal Designs

    ERIC Educational Resources Information Center

    Leiber, Theodor; Stensaker, Bjørn; Harvey, Lee

    2015-01-01

    In this paper, the theoretical perspectives and general methodological elements of impact evaluation of quality assurance in higher education institutions are discussed, which should be a cornerstone of quality development in higher education and contribute to improving the knowledge about the effectiveness (or ineffectiveness) of quality…

  12. Future Directions in Adventure-based Therapy Research: Methodological Considerations and Design Suggestions.

    ERIC Educational Resources Information Center

    Newes, Sandra L.

    2001-01-01

    More methodologically sound research in adventure therapy is needed if the field is to claim empirically-based efficacy as a treatment modality. Some considerations for conducting outcome studies in adventure therapy relate to standardization, multiple domain assessment, regression techniques, objective assessment of participant change, client and…

  13. Implications of Functional Analysis Methodology for the Design of Intervention Programs

    ERIC Educational Resources Information Center

    Iwata, Brian A.; Worsdell, April S.

    2005-01-01

    Functional analysis methodology is an assessment strategy that identifies sources of reinforcement that maintain problem behavior and prescribes individualized interventions that directly alter the conditions under which behavior occurs. In this article we describe the environmental determinants of problem behavior, methods for conducting…

  14. Systematic review: Effects, design choices, and context of pay-for-performance in health care

    PubMed Central

    2010-01-01

    Background Pay-for-performance (P4P) is one of the primary tools used to support healthcare delivery reform. Substantial heterogeneity exists in the development and implementation of P4P in health care and its effects. This paper summarizes evidence, obtained from studies published between January 1990 and July 2009, concerning P4P effects, as well as evidence on the impact of design choices and contextual mediators on these effects. Effect domains include clinical effectiveness, access and equity, coordination and continuity, patient-centeredness, and cost-effectiveness. Methods The systematic review made use of electronic database searching, reference screening, forward citation tracking and expert consultation. The following databases were searched: Cochrane Library, EconLit, Embase, Medline, PsycINFO, and Web of Science. Studies that evaluate P4P effects in primary care or acute hospital care medicine were included. Papers concerning other target groups or settings, having no empirical evaluation design, or not complying with the P4P definition were excluded. Depending on study design, nine validated quality appraisal tools and reporting statements were applied. Data were extracted and summarized into evidence tables independently by two reviewers. Results One hundred twenty-eight evaluation studies provide a large body of evidence (to be interpreted with caution) concerning the effects of P4P on clinical effectiveness and equity of care. However, less evidence on the impact on coordination, continuity, patient-centeredness and cost-effectiveness was found. P4P effects can be judged to be encouraging or disappointing, depending on the primary mission of the P4P program: supporting minimal quality standards and/or boosting quality improvement. Moreover, the effects of P4P interventions varied according to design choices and characteristics of the context in which it was introduced. Future P4P programs should (1) select and define P4P targets on the basis of

  15. Design methodology for a confocal imaging system using an objective microlens array with an increased working distance

    PubMed Central

    Choi, Woojae; Shin, Ryung; Lim, Jiseok; Kang, Shinill

    2016-01-01

    In this study, a design methodology for a multi-optical probe confocal imaging system was developed. To develop an imaging system that has the required resolving power and imaging area, this study focused on a design methodology to create a scalable and easy-to-implement confocal imaging system. This system overcomes the limitations of the optical complexities of conventional multi-optical probe confocal imaging systems and the short working distance using a micro-objective lens module composed of two microlens arrays and a telecentric relay optical system. The micro-objective lens module was fabricated on a glass substrate using backside alignment photolithography and thermal reflow processes. To test the feasibility of the developed methodology, an optical system with a resolution of 1 μm/pixel using multi-optical probes with an array size of 10 × 10 was designed and constructed. The developed system provides a 1 mm × 1 mm field of view and a sample scanning range of 100 μm. The optical resolution was evaluated by conducting sample tests using a knife-edge detecting method. The measured lateral resolution of the system was 0.98 μm. PMID:27615370

  16. Design methodology for a confocal imaging system using an objective microlens array with an increased working distance

    NASA Astrophysics Data System (ADS)

    Choi, Woojae; Shin, Ryung; Lim, Jiseok; Kang, Shinill

    2016-09-01

    In this study, a design methodology for a multi-optical probe confocal imaging system was developed. To develop an imaging system that has the required resolving power and imaging area, this study focused on a design methodology to create a scalable and easy-to-implement confocal imaging system. This system overcomes the limitations of the optical complexities of conventional multi-optical probe confocal imaging systems and the short working distance using a micro-objective lens module composed of two microlens arrays and a telecentric relay optical system. The micro-objective lens module was fabricated on a glass substrate using backside alignment photolithography and thermal reflow processes. To test the feasibility of the developed methodology, an optical system with a resolution of 1 μm/pixel using multi-optical probes with an array size of 10 × 10 was designed and constructed. The developed system provides a 1 mm × 1 mm field of view and a sample scanning range of 100 μm. The optical resolution was evaluated by conducting sample tests using a knife-edge detecting method. The measured lateral resolution of the system was 0.98 μm.
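    The knife-edge resolution measurement mentioned in both confocal-imaging records can be sketched numerically: scan across an edge, differentiate the edge-spread function (ESF) to obtain the line-spread function (LSF), and take its full width at half maximum. The Gaussian width below is synthetic, not the reported 0.98 μm system:

```python
import numpy as np
from math import erf, sqrt, log

# Synthetic knife-edge scan: ESF of a Gaussian point-spread function
x = np.linspace(-5.0, 5.0, 1001)    # scan position, micrometres (illustrative)
sigma = 0.5                          # Gaussian PSF width we expect to recover
esf = np.array([0.5 * (1.0 + erf(xi / (sigma * sqrt(2.0)))) for xi in x])

lsf = np.gradient(esf, x)            # differentiate ESF -> LSF
half = lsf.max() / 2.0
above = x[lsf >= half]
fwhm = above[-1] - above[0]          # full width at half maximum
# For a Gaussian PSF, FWHM = 2*sqrt(2*ln 2)*sigma, so fwhm should be ~1.18
```

    In a real measurement the ESF comes from intensity samples as the knife edge crosses the focal spot, and the derivative step usually needs smoothing to suppress noise.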

  17. Longitudinal Intergenerational Birth Cohort Designs: A Systematic Review of Australian and New Zealand Studies

    PubMed Central

    Townsend, Michelle L.; Riepsamen, Angelique; Georgiou, Christos; Flood, Victoria M.; Caputi, Peter; Wright, Ian M.; Davis, Warren S.; Jones, Alison; Larkin, Theresa A.; Williamson, Moira J.; Grenyer, Brin F. S.

    2016-01-01

    Background The longitudinal birth cohort design has yielded a substantial contribution to knowledge of child health and development. The last full review in New Zealand and Australia, in 2004, identified 13 studies. Since then, birth cohort designs have continued to be an important tool in understanding how intrauterine, infant and childhood development affect long-term health and well-being. This updated review in a defined geographical area was conducted to better understand the factors associated with successful quality and productivity, and greater scientific and policy contribution and scope. Methods We adopted the preferred reporting items for systematic reviews and meta-analyses (PRISMA) approach, searching PubMed, Scopus, Cinahl, Medline, Science Direct and ProQuest between 1963 and 2013. Experts were consulted regarding further studies. Five inclusion criteria were used: (1) have longitudinally tracked a birth cohort, (2) have collected data on the child and at least one parent or caregiver, (3) be based in Australia or New Zealand, (4) be empirical in design, and (5) have been published in English. Results 10,665 records were initially retrieved, from which 23 birth cohort studies met the selection criteria. Together these studies recruited 91,196 participants, with 38,600 mothers, 14,206 fathers and 38,390 live births. Seventeen studies were located in Australia and six in New Zealand. Research questions initially focused on the perinatal period, but as studies matured, longer-term effects and outcomes were examined. Conclusions This review demonstrates the significant yield from this effort both in terms of scientific discovery and social policy impact. Further opportunities have been recognised with cross-study collaboration and pooling of data between established and newer studies and international studies to investigate global health determinants. PMID:26991330

  18. Rating the methodological quality of single-subject designs and n-of-1 trials: introducing the Single-Case Experimental Design (SCED) Scale.

    PubMed

    Tate, Robyn L; McDonald, Skye; Perdices, Michael; Togher, Leanne; Schultz, Regina; Savage, Sharon

    2008-08-01

    Rating scales that assess methodological quality of clinical trials provide a means to critically appraise the literature. Scales are currently available to rate randomised and non-randomised controlled trials, but there are none that assess single-subject designs. The Single-Case Experimental Design (SCED) Scale was developed for this purpose and evaluated for reliability. Six clinical researchers who were trained and experienced in rating methodological quality of clinical trials developed the scale and participated in reliability studies. The SCED Scale is an 11-item rating scale for single-subject designs, of which 10 items are used to assess methodological quality and use of statistical analysis. The scale was developed and refined over a 3-year period. Content validity was addressed by identifying items to reduce the main sources of bias in single-case methodology as stipulated by authorities in the field, which were empirically tested against 85 published reports. Inter-rater reliability was assessed using a random sample of 20/312 single-subject reports archived in the Psychological Database of Brain Impairment Treatment Efficacy (PsycBITE). Inter-rater reliability for the total score was excellent, both for individual raters (overall ICC = 0.84; 95% confidence interval 0.73-0.92) and for consensus ratings between pairs of raters (overall ICC = 0.88; 95% confidence interval 0.78-0.95). Item reliability was fair to excellent for consensus ratings between pairs of raters (range k = 0.48 to 1.00). The results were replicated with two independent novice raters who were trained in the use of the scale (ICC = 0.88, 95% confidence interval 0.73-0.95). The SCED Scale thus provides a brief and valid evaluation of methodological quality of single-subject designs, with the total score demonstrating excellent inter-rater reliability using both individual and consensus ratings. Items from the scale can also be used as a checklist in the design, reporting and critical
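    The inter-rater reliability figures quoted in this abstract are intraclass correlation coefficients. A minimal sketch of ICC(2,1) (two-way random effects, absolute agreement, single rater) using the standard ANOVA decomposition; this is the conventional formula, not necessarily the exact procedure used by the SCED authors:

```python
import numpy as np

def icc_2_1(x):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    x: (n_targets, k_raters) matrix of ratings.
    """
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)   # per-target means
    col_means = x.mean(axis=0)   # per-rater means
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)   # between targets
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)   # between raters
    resid = x - row_means[:, None] - col_means[None, :] + grand
    mse = np.sum(resid ** 2) / ((n - 1) * (k - 1))          # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

    With perfectly agreeing raters the residual and rater mean squares vanish and the coefficient is exactly 1, which is a useful unit test for any implementation.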

  19. A methodology for hypersonic transport technology planning

    NASA Technical Reports Server (NTRS)

    Repic, E. M.; Olson, G. A.; Milliken, R. J.

    1973-01-01

    A systematic procedure by which the relative economic value of technology factors affecting design, configuration, and operation of a hypersonic cruise transport can be evaluated is discussed. Use of the methodology results in identification of first-order economic gains potentially achievable by projected advances in each of the definable, hypersonic technologies. Starting with a baseline vehicle, the formulas, procedures and forms which are integral parts of this methodology are developed. A demonstration of the methodology is presented for one specific hypersonic vehicle system.

  20. Methodology of design and analysis of external walls of space station for hypervelocity impacts by meteoroids and space debris

    NASA Technical Reports Server (NTRS)

    Batla, F. A.

    1986-01-01

    The development of criteria and methodology for the design and analysis of Space Station wall elements for collisions with meteoroids and space debris at hypervelocities is discussed. These collisions will occur at velocities of 10 km/s or more and can be damaging to the external wall elements of the Space Station. The wall elements need to be designed to protect the pressurized modules of the Space Station from functional or structural failure due to these collisions at hypervelocities, for a given environment and population of meteoroids and space debris. The design and analysis approach presented, and the associated computer program, are intended to achieve this objective, including optimization of the design for a required overall probability of no penetration. The approach is based on the presently available experimental and observational data on meteoroid and space debris flux and damage assessments, and on the empirical relationships resulting from hypervelocity impact studies in laboratories.
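    The "overall probability of no penetration" is conventionally computed from a Poisson model of impact arrivals. The sketch below uses that standard model; the flux and area values are invented for illustration and do not represent the actual meteoroid/debris environment:

```python
import math

def prob_no_penetration(flux, area, years):
    """Poisson model: expected penetrating impacts N = flux * area * time,
    so P(no penetration) = exp(-N)."""
    n_expected = flux * area * years
    return math.exp(-n_expected)

# Overall probability for several wall elements is the product of the
# per-element probabilities (equivalently, exp of minus the summed N's).
elements = [(1.0e-5, 12.0),   # (penetrating flux per m^2 per yr, area m^2)
            (2.0e-5, 8.0)]    # illustrative values only
years = 10.0
p_overall = math.prod(prob_no_penetration(f, a, years) for f, a in elements)
```

    Because the overall probability multiplies across elements, design optimization of the kind the abstract describes amounts to allocating shielding mass so the summed expected penetrations meet the required overall figure.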