Science.gov

Sample records for systematic design methodology

  1. Systematic Controller Design Methodology for Variable-Speed Wind Turbines

    SciTech Connect

    Hand, M. M.; Balas, M. J.

    2002-02-01

    Variable-speed, horizontal axis wind turbines use blade-pitch control to meet specified objectives for three operational regions. This paper provides a guide for controller design for the constant power production regime. A simple, rigid, non-linear turbine model was used to systematically perform trade-off studies between two performance metrics. Minimization of both the deviation of the rotor speed from the desired speed and the motion of the actuator is desired. The robust nature of the proportional-integral-derivative controller is illustrated, and optimal operating conditions are determined. Because numerous simulation runs may be completed in a short time, the relationship between the two opposing metrics is easily visualized.

  2. Systematic defect filtering and data analysis methodology for design based metrology

    NASA Astrophysics Data System (ADS)

    Yang, Hyunjo; Kim, Jungchan; Lee, Taehyeong; Jung, Areum; Yoo, Gyun; Yim, Donggyu; Park, Sungki; Hasebe, Toshiaki; Yamamoto, Masahiro; Cai, Jun

    2009-03-01

    Recently, several Design Based Metrology (DBM) systems have been introduced and are in use for wafer verification. The major applications of DBM are OPC accuracy improvement, DFM feedback through Process Window Qualification (PWQ), and advanced process control. In general, however, the amount of output data from DBM is so large that extracting valuable feedback from it is very hard. In the case of PWQ, thousands of hot spots are detected on a single chip at the edge of the process window, so reviewing and analyzing all of them takes considerable time and labor. Design-related systematic defects, however, recur; if they can be classified into groups, much of the analysis time can be saved. We have demonstrated an EDA tool that handles the large volume of DBM output by reducing pattern defects to groups: it can classify millions of patterns into at most a few thousand pattern groups. It was evaluated on the PWQ analysis of a metal layer in a NAND Flash memory device and of random contact-hole patterns in a DRAM device. The results show that the tool handles the CD measurement data easily and saves substantial time and labor in the analysis. The procedures for systematic defect filtering and data handling using the EDA tool are presented in detail.
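
    The grouping step the abstract describes can be sketched in a few lines. This is a hypothetical illustration, not the vendor's EDA tool: it assumes each detected hot spot carries a `signature` field (e.g. a hash of the layout clip around the defect), so that repeated design-related defects collapse into one group per signature.

```python
from collections import defaultdict

def group_defects(defects):
    """Group hot-spot detections by their design-pattern signature.

    Each defect is a dict with a 'signature' key plus arbitrary
    metadata.  Repeated design-related defects share a signature, so
    many raw detections collapse into far fewer pattern groups.
    """
    groups = defaultdict(list)
    for d in defects:
        groups[d["signature"]].append(d)
    return dict(groups)

# Three detections, two underlying layout patterns (hypothetical data).
defects = [
    {"signature": "clipA", "x": 10, "y": 5},
    {"signature": "clipB", "x": 40, "y": 7},
    {"signature": "clipA", "x": 12, "y": 90},
]
groups = group_defects(defects)
```

    Reviewing one representative per group then covers every repeated systematic defect, which is where the time and labor savings come from.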

  3. Systematic Review Methodology in Higher Education

    ERIC Educational Resources Information Center

    Bearman, Margaret; Smith, Calvin D.; Carbone, Angela; Slade, Susan; Baik, Chi; Hughes-Warrington, Marnie; Neumann, David L.

    2012-01-01

    Systematic review methodology can be distinguished from narrative reviews of the literature through its emphasis on transparent, structured and comprehensive approaches to searching the literature and its requirement for formal synthesis of research findings. There appears to be relatively little use of the systematic review methodology within the…

  4. Variable-Speed Wind Turbine Controller Systematic Design Methodology: A Comparison of Non-Linear and Linear Model-Based Designs

    SciTech Connect

    Hand, M. M.

    1999-07-30

    Variable-speed, horizontal axis wind turbines use blade-pitch control to meet specified objectives for three regions of operation. This paper focuses on controller design for the constant power production regime. A simple, rigid, non-linear turbine model was used to systematically perform trade-off studies between two performance metrics. Minimization of both the deviation of the rotor speed from the desired speed and the motion of the actuator is desired. The robust nature of the proportional-integral-derivative (PID) controller is illustrated, and optimal operating conditions are determined. Because numerous simulation runs may be completed in a short time, the relationship between the two opposing metrics is easily visualized. Traditional controller design generally consists of linearizing a model about an operating point. This step was taken for two different operating points, and the systematic design approach was used. A comparison of the optimal regions selected using the non-linear model and the two linear models shows similarities. The linearization point selection does, however, affect the turbine performance slightly. Exploitation of the simplicity of the model allows surfaces consisting of operation under a wide range of gain values to be created. This methodology provides a means of visually observing turbine performance based upon the two metrics chosen for this study. Design of a PID controller is simplified, and it is possible to ascertain the best possible combination of controller parameters. The wide, flat surfaces indicate that a PID controller is very robust in this variable-speed wind turbine application.
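
    The kind of trade-off study described can be mimicked with a deliberately toy model. The sketch below is not the paper's turbine model: it assumes a rigid rotor whose speed integrates net torque, a sinusoidal gust disturbance, and a PI pitch command, then sweeps the proportional gain to trace the two competing metrics (RMS speed deviation and total actuator motion).

```python
import math

def simulate(kp, ki, dt=0.05, steps=400):
    """Toy rigid-rotor speed loop under a PI pitch controller.
    Illustrative stand-in for the paper's model: the rotor integrates
    net torque, a sinusoidal gust disturbs it, and the pitch command
    acts to restore the speed set point."""
    omega, target = 1.0, 1.0          # rotor speed and set point
    integ, prev_u = 0.0, 0.0
    sq_err, travel = 0.0, 0.0         # the two competing metrics
    for n in range(steps):
        gust = 0.2 * math.sin(0.3 * n * dt)   # wind disturbance
        err = target - omega
        integ += err * dt
        u = kp * err + ki * integ             # PI pitch command
        omega += (gust + 0.5 * u) * dt        # Euler step of rotor speed
        sq_err += err ** 2 * dt
        travel += abs(u - prev_u)             # total actuator motion
        prev_u = u
    return math.sqrt(sq_err / (steps * dt)), travel

# Sweep the proportional gain: the 1-D analogue of the gain surfaces
# described in the abstract.
tradeoff = {kp: simulate(kp, ki=0.5) for kp in (0.5, 1.0, 2.0, 4.0)}
```

    Printing `tradeoff` exposes the tension the paper visualizes: tighter speed regulation generally comes at the cost of more actuator motion.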

  5. Vending machine assessment methodology. A systematic review.

    PubMed

    Matthews, Melissa A; Horacek, Tanya M

    2015-07-01

    The nutritional quality of food and beverage products sold in vending machines has been implicated as a contributing factor to the development of an obesogenic food environment. How comprehensive, reliable, and valid are the current assessment tools for vending machines to support or refute these claims? A systematic review was conducted to summarize, compare, and evaluate the current methodologies and available tools for vending machine assessment. A total of 24 relevant research studies published between 1981 and 2013 met inclusion criteria for this review. The methodological variables reviewed in this study include assessment tool type, study location, machine accessibility, product availability, healthfulness criteria, portion size, price, product promotion, and quality of scientific practice. There were wide variations in the depth of the assessment methodologies and product healthfulness criteria utilized among the reviewed studies. Of the reviewed studies, 39% evaluated machine accessibility, 91% evaluated product availability, 96% established healthfulness criteria, 70% evaluated portion size, 48% evaluated price, 52% evaluated product promotion, and 22% evaluated the quality of scientific practice. Of all reviewed articles, 87% reached conclusions that provided insight into the healthfulness of vended products and/or vending environment. Product healthfulness criteria and complexity for snack and beverage products were also found to be variable between the reviewed studies. These findings make it difficult to compare results between studies. A universal, valid, and reliable vending machine assessment tool that is comprehensive yet user-friendly is recommended. PMID:25772195

  6. Methodology of systematic reviews and recommendations.

    PubMed

    Furlan, Julio C; Singh, Jeffrey; Hsieh, Jane; Fehlings, Michael G

    2011-08-01

    Although research in the field of spinal cord injury (SCI) is a relatively new endeavor, a remarkable number of papers focused on this subspecialty have been published in a broad variety of journals over the last two decades. A multidisciplinary group of experts, including clinical epidemiologists, neurosurgical and orthopedic spine surgeons, basic scientists, rehabilitation specialists, intensivists, and allied health professionals (nursing and physical therapy) was assembled through the Spinal Cord Injury Solutions Network to summarize the existing literature focusing on 12 key topics related to acute traumatic SCI, which have not been recently reviewed. The objective was to develop evidence-based recommendations to help translate current science into clinical practice and to identify new directions for research. For each topic, one to three specific questions were formulated by consensus through the expert panel. A systematic review of the literature was performed to determine the current evidence for the specific questions. A primary literature search was performed using MEDLINE, CINAHL, EMBASE, and Cochrane databases. A secondary search strategy incorporated additional articles referenced in significant publications (i.e., meta-analysis, systematic and nonsystematic review articles). Two reviewers independently reviewed the titles and abstracts yielded by this comprehensive search and subsequently selected articles based on the predetermined inclusion and exclusion criteria. Data were extracted and compiled into evidentiary tables. Selected articles were rated for level of evidence and methodological quality, information that was also included in the evidentiary tables. Disagreements were resolved by a third reviewer or consensus-based discussion. Based on the evidence compiled, answers to the targeted questions were formulated and recommendations generated by consensus-based discussion and anonymized voting using Delphi methodology. A level of consensus of 80

  7. Permanent magnet design methodology

    NASA Technical Reports Server (NTRS)

    Leupold, Herbert A.

    1991-01-01

    Design techniques developed for the exploitation of high energy magnetically rigid materials such as Sm-Co and Nd-Fe-B have resulted in a revolution in kind rather than in degree in the design of a variety of electron guidance structures for ballistic and aerospace applications. Salient examples are listed. Several prototype models were developed. These structures are discussed in some detail: permanent magnet solenoids, transverse field sources, periodic structures, and very high field structures.

  8. Solid lubrication design methodology

    NASA Technical Reports Server (NTRS)

    Aggarwal, B. B.; Yonushonis, T. M.; Bovenkerk, R. L.

    1984-01-01

    A single element traction rig was used to measure the traction forces at the contact of a ball against a flat disc at room temperature under combined rolling and sliding. The load and speed conditions were selected to match those anticipated for bearing applications in adiabatic diesel engines. The test program showed that the magnitude of traction forces were almost the same for all the lubricants tested; a lubricant should, therefore, be selected on the basis of its ability to prevent wear of the contact surfaces. Traction vs. slide/roll ratio curves were similar to those for liquid lubricants but the traction forces were an order of magnitude higher. The test data was used to derive equations to predict traction force as a function of contact stress and rolling speed. Qualitative design guidelines for solid lubricated concentrated contacts are proposed.

  9. Fixed or random effects meta-analysis? Common methodological issues in systematic reviews of effectiveness.

    PubMed

    Tufanaru, Catalin; Munn, Zachary; Stephenson, Matthew; Aromataris, Edoardo

    2015-09-01

    Systematic review aims to systematically identify, critically appraise, and summarize all relevant studies that match predefined criteria and answer predefined questions. The most common type of systematic review is that assessing the effectiveness of an intervention or therapy. In this article, we discuss some of the common methodological issues that arise when conducting systematic reviews and meta-analyses of effectiveness data, including issues related to study designs, meta-analysis, and the use and interpretation of effect sizes. PMID:26355603
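
    The fixed- versus random-effects choice discussed here comes down to two pooling formulas. A minimal sketch using the standard inverse-variance and DerSimonian-Laird estimators, with made-up study data:

```python
def fixed_effect(effects, variances):
    """Inverse-variance fixed-effect pooling."""
    w = [1.0 / v for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    return pooled, 1.0 / sum(w)      # pooled estimate and its variance

def random_effects_dl(effects, variances):
    """DerSimonian-Laird random-effects pooling: estimate the
    between-study variance tau^2 from Cochran's Q, then re-weight."""
    w = [1.0 / v for v in variances]
    sw = sum(w)
    mu_fe, _ = fixed_effect(effects, variances)
    q = sum(wi * (yi - mu_fe) ** 2 for wi, yi in zip(w, effects))
    c = sw - sum(wi * wi for wi in w) / sw
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    return pooled, 1.0 / sum(w_re), tau2

# Three hypothetical trials: log odds ratios and their variances.
effects, variances = [0.2, 0.5, 1.1], [0.04, 0.09, 0.25]
fe, fe_var = fixed_effect(effects, variances)
re, re_var, tau2 = random_effects_dl(effects, variances)
```

    When the studies are heterogeneous (tau^2 > 0), the random-effects estimate has a wider variance than the fixed-effect one, which is the interpretive point such reviews must handle.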

  10. Methodological quality of behavioural weight loss studies: a systematic review.

    PubMed

    Lemon, S C; Wang, M L; Haughton, C F; Estabrook, D P; Frisard, C F; Pagoto, S L

    2016-07-01

    This systematic review assessed the methodological quality of behavioural weight loss intervention studies conducted among adults and associations between quality and statistically significant weight loss outcome, strength of intervention effectiveness and sample size. Searches for trials published between January, 2009 and December, 2014 were conducted using PUBMED, MEDLINE and PSYCINFO and identified ninety studies. Methodological quality indicators included study design, anthropometric measurement approach, sample size calculations, intent-to-treat (ITT) analysis, loss to follow-up rate, missing data strategy, sampling strategy, report of treatment receipt and report of intervention fidelity (mean number of indicators met = 6.3). Indicators most commonly utilized included randomized design (100%), objectively measured anthropometrics (96.7%), ITT analysis (86.7%) and reporting treatment adherence (76.7%). Most studies (62.2%) had a follow-up rate > 75% and reported a loss to follow-up analytic strategy or minimal missing data (69.9%). Describing intervention fidelity (34.4%) and sampling from a known population (41.1%) were least common. Methodological quality was not associated with reporting a statistically significant result, effect size or sample size. This review found the published literature of behavioural weight loss trials to be of high quality for specific indicators, including study design and measurement. Areas identified for improvement include the use of more rigorous statistical approaches to loss to follow-up and better fidelity reporting. PMID:27071775

  11. Systematic Comparison of Operating Reserve Methodologies: Preprint

    SciTech Connect

    Ibanez, E.; Krad, I.; Ela, E.

    2014-04-01

    Operating reserve requirements are a key component of modern power systems, and they contribute to maintaining reliable operations with minimum economic impact. No universal method exists for determining reserve requirements, thus there is a need for a thorough study and performance comparison of the different existing methodologies. Increasing penetrations of variable generation (VG) on electric power systems are poised to increase system uncertainty and variability, thus the need for additional reserve also increases. This paper presents background information on operating reserve and its relationship to VG. A consistent comparison of three methodologies to calculate regulating and flexibility reserve in systems with VG is performed.

  12. A Systematic Methodology for Verifying Superscalar Microprocessors

    NASA Technical Reports Server (NTRS)

    Srivas, Mandayam; Hosabettu, Ravi; Gopalakrishnan, Ganesh

    1999-01-01

    We present a systematic approach to decompose and incrementally build the proof of correctness of pipelined microprocessors. The central idea is to construct the abstraction function by using completion functions, one per unfinished instruction, each of which specifies the effect (on the observables) of completing the instruction. In addition to avoiding the term size and case explosion problem that limits the pure flushing approach, our method helps localize errors, and also handles stages with interactive loops. The technique is illustrated on pipelined and superscalar pipelined implementations of a subset of the DLX architecture. It has also been applied to a processor with out-of-order execution.

  13. Waste Package Design Methodology Report

    SciTech Connect

    D.A. Brownson

    2001-09-28

    The objective of this report is to describe the analytical methods and processes used by the Waste Package Design Section to establish the integrity of the various waste package designs, the emplacement pallet, and the drip shield. The scope of this report shall be the methodology used in criticality, risk-informed, shielding, source term, structural, and thermal analyses. The basic features and appropriateness of the methods are illustrated, and the processes are defined whereby input values and assumptions flow through the application of those methods to obtain designs that ensure defense-in-depth as well as satisfy requirements on system performance. Such requirements include those imposed by federal regulation, from both the U.S. Department of Energy (DOE) and U.S. Nuclear Regulatory Commission (NRC), and those imposed by the Yucca Mountain Project to meet repository performance goals. The report is to be used, in part, to describe the waste package design methods and techniques to be used for producing input to the License Application Report.

  14. Methodology for Designing Fault-Protection Software

    NASA Technical Reports Server (NTRS)

    Barltrop, Kevin; Levison, Jeffrey; Kan, Edwin

    2006-01-01

    A document describes a methodology for designing fault-protection (FP) software for autonomous spacecraft. The methodology embodies and extends established engineering practices in the technical discipline of Fault Detection, Diagnosis, Mitigation, and Recovery, and has been successfully implemented on the Deep Impact spacecraft, a NASA Discovery mission. Based on established concepts of Fault Monitors and Responses, this FP methodology extends the notions of Opinion, Symptom, Alarm (aka Fault), and Response with numerous new notions, sub-notions, software constructs, and logic and timing gates. For example, a Monitor generates a RawOpinion, which graduates into an Opinion, categorized as no-opinion, acceptable, or unacceptable. RaiseSymptom, ForceSymptom, and ClearSymptom govern the establishment of a Symptom and its mapping to an Alarm (aka Fault). Local Response is distinguished from FP System Response. 1-to-n and n-to-1 mappings are established among Monitors, Symptoms, and Responses. Responses are categorized by device versus by function. Responses operate in tiers, where the early tiers attempt to resolve the Fault in a localized, step-by-step fashion, relegating more system-level responses to later tiers. Recovery actions are gated by epoch recovery timing, enabling strategy, urgency, a MaxRetry gate, hardware availability, hazardous versus ordinary fault, and many other priority gates. This methodology is systematic and logical, and it uses multiple linked tables, parameter files, and recovery command sequences. The credibility of the FP design is proven via a "top-down" fault-tree analysis and a "bottom-up" functional failure-modes-and-effects analysis. Via this process, the mitigation and recovery strategies per Fault Containment Region determine the scope (width versus depth) of the FP architecture.
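
    The n-to-1 Symptom-to-Alarm mapping and tiered Responses can be pictured as linked tables. The names below are hypothetical, not the Deep Impact flight tables; the sketch only shows the structure the methodology prescribes.

```python
# Hypothetical example tables; every name here is invented for
# illustration.  Two symptoms mapping to one alarm shows the n-to-1
# relationship described in the abstract.
SYMPTOM_TO_ALARM = {
    "star_tracker_dropout": "attitude_knowledge_fault",
    "gyro_bias_drift": "attitude_knowledge_fault",   # n-to-1 mapping
    "heater_overcurrent": "thermal_fault",
}

# Tiered responses: early tiers are local fixes, later tiers escalate
# to system-level action.
ALARM_RESPONSES = {
    "attitude_knowledge_fault": ["reset_tracker",
                                 "swap_to_backup_gyro",
                                 "safe_mode"],
    "thermal_fault": ["cycle_heater", "safe_mode"],
}

def respond(symptom, tier):
    """Look up the alarm for a symptom and pick the response for the
    given escalation tier, saturating at the last (most global) tier."""
    alarm = SYMPTOM_TO_ALARM[symptom]
    responses = ALARM_RESPONSES[alarm]
    return responses[min(tier, len(responses) - 1)]
```

    Escalation past the last tier simply stays at the most system-level response, mirroring the early-local, late-global tier ordering the document describes.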

  15. Integrated Design Methodology for Highly Reliable Liquid Rocket Engine

    NASA Astrophysics Data System (ADS)

    Kuratani, Naoshi; Aoki, Hiroshi; Yasui, Masaaki; Kure, Hirotaka; Masuya, Goro

    The Integrated Design Methodology is strongly required at the conceptual design phase to achieve highly reliable space transportation systems, especially propulsion systems, not only in Japan but worldwide. In the past, catastrophic failures caused losses of mission and vehicle (LOM/LOV) during the operational phase and severely affected schedules and costs in the later development phases. A design methodology for highly reliable liquid rocket engines is preliminarily established and investigated in this study. A sensitivity analysis is performed systematically to demonstrate the effectiveness of the methodology and, in particular, to clarify the correlations among the combustion chamber, turbopump, and main valve as main components. This study describes the essential issues for understanding these correlations, the need to apply the methodology to the remaining critical failure modes in the whole engine system, and the perspective on future engine development.

  16. Space Engineering Projects in Design Methodology

    NASA Technical Reports Server (NTRS)

    Crawford, R.; Wood, K.; Nichols, S.; Hearn, C.; Corrier, S.; DeKunder, G.; George, S.; Hysinger, C.; Johnson, C.; Kubasta, K.

    1993-01-01

    NASA/USRA is an ongoing sponsor of space design projects in the senior design courses of the Mechanical Engineering Department at The University of Texas at Austin. This paper describes the UT senior design sequence, focusing on the first-semester design methodology course. The philosophical basis and pedagogical structure of this course is summarized. A history of the Department's activities in the Advanced Design Program is then presented. The paper includes a summary of the projects completed during the 1992-93 Academic Year in the methodology course, and concludes with an example of two projects completed by student design teams.

  17. Assuring data transparency through design methodologies

    NASA Technical Reports Server (NTRS)

    Williams, Allen

    1990-01-01

    This paper addresses the role of design methodologies and practices in the assurance of technology transparency. The development of several subsystems on large, long life cycle government programs was analyzed to glean those characteristics in the design, development, test, and evaluation that precluded or enabled the insertion of new technology. The programs examined were Minuteman, DSP, B1-B, and space shuttle. All these were long life cycle, technology-intensive programs. The design methodologies (or lack thereof) and design practices for each were analyzed in terms of the success or failure in incorporating evolving technology. Common elements contributing to the success or failure were extracted and compared to current methodologies being proposed by the Department of Defense and NASA. The relevance of these practices to the design and deployment of Space Station Freedom were evaluated. In particular, appropriate methodologies now being used on the core development contract were examined.

  18. A design methodology for unattended monitoring systems

    SciTech Connect

    SMITH,JAMES D.; DELAND,SHARON M.

    2000-03-01

    The authors presented a high-level methodology for the design of unattended monitoring systems, focusing on a system to detect diversion of nuclear materials from a storage facility. The methodology is composed of seven, interrelated analyses: Facility Analysis, Vulnerability Analysis, Threat Assessment, Scenario Assessment, Design Analysis, Conceptual Design, and Performance Assessment. The design of the monitoring system is iteratively improved until it meets a set of pre-established performance criteria. The methodology presented here is based on other, well-established system analysis methodologies and hence they believe it can be adapted to other verification or compliance applications. In order to make this approach more generic, however, there needs to be more work on techniques for establishing evaluation criteria and associated performance metrics. They found that defining general-purpose evaluation criteria for verifying compliance with international agreements was a significant undertaking in itself. They finally focused on diversion of nuclear material in order to simplify the problem so that they could work out an overall approach for the design methodology. However, general guidelines for the development of evaluation criteria are critical for a general-purpose methodology. A poor choice in evaluation criteria could result in a monitoring system design that solves the wrong problem.

  19. General Methodology for Designing Spacecraft Trajectories

    NASA Technical Reports Server (NTRS)

    Condon, Gerald; Ocampo, Cesar; Mathur, Ravishankar; Morcos, Fady; Senent, Juan; Williams, Jacob; Davis, Elizabeth C.

    2012-01-01

    A methodology for designing spacecraft trajectories in any gravitational environment within the solar system has been developed. The methodology facilitates modeling and optimization for problems ranging from that of a single spacecraft orbiting a single celestial body to that of a mission involving multiple spacecraft and multiple propulsion systems operating in gravitational fields of multiple celestial bodies. The methodology consolidates almost all spacecraft trajectory design and optimization problems into a single conceptual framework requiring solution of either a system of nonlinear equations or a parameter-optimization problem with equality and/or inequality constraints.

  20. Applying Software Design Methodology to Instructional Design

    ERIC Educational Resources Information Center

    East, J. Philip

    2004-01-01

    The premise of this paper is that computer science has much to offer the endeavor of instructional improvement. Software design processes employed in computer science for developing software can be used for planning instruction and should improve instruction in much the same manner that design processes appear to have improved software. Techniques…

  1. Applying Software Design Methodology to Instructional Design

    NASA Astrophysics Data System (ADS)

    East, J. Philip

    2004-12-01

    The premise of this paper is that computer science has much to offer the endeavor of instructional improvement. Software design processes employed in computer science for developing software can be used for planning instruction and should improve instruction in much the same manner that design processes appear to have improved software. Techniques for examining the software development process can be applied to an examination of the instructional process. Furthermore, the computer science discipline is particularly well suited to these tasks. Thus, computer science can develop instructional design expertise for export to other disciplines to improve education in all disciplines and, eventually, at all levels.

  2. Design methodologies for silicon photonic integrated circuits

    NASA Astrophysics Data System (ADS)

    Chrostowski, Lukas; Flueckiger, Jonas; Lin, Charlie; Hochberg, Michael; Pond, James; Klein, Jackson; Ferguson, John; Cone, Chris

    2014-03-01

    This paper describes design methodologies developed for silicon photonics integrated circuits. The approach presented is inspired by methods employed in the Electronics Design Automation (EDA) community. This is complemented by well established photonic component design tools, compact model synthesis, and optical circuit modelling. A generic silicon photonics design kit, as described here, is available for download at http://www.siepic.ubc.ca/GSiP.

  3. Methodological Alignment in Design-Based Research

    ERIC Educational Resources Information Center

    Hoadley, Christopher M.

    2004-01-01

    Empirical research is all about trying to model and predict the world. In this article, I discuss how design-based research methods can help do this effectively. In particular, design-based research methods can help with the problem of methodological alignment: ensuring that the research methods we use actually test what we think they are testing.…

  4. Design methodology and projects for space engineering

    NASA Technical Reports Server (NTRS)

    Nichols, S.; Kleespies, H.; Wood, K.; Crawford, R.

    1993-01-01

    NASA/USRA is an ongoing sponsor of space design projects in the senior design course of the Mechanical Engineering Department at The University of Texas at Austin. This paper describes the UT senior design sequence, consisting of a design methodology course and a capstone design course. The philosophical basis of this sequence is briefly summarized. A history of the Department's activities in the Advanced Design Program is then presented. The paper concludes with a description of the projects completed during the 1991-92 academic year and the ongoing projects for the Fall 1992 semester.

  5. Integrating rock mechanics issues with repository design through design process principles and methodology

    SciTech Connect

    Bieniawski, Z.T.

    1996-04-01

    A good designer needs not only knowledge for designing (technical know-how that is used to generate alternative design solutions) but also knowledge about designing (appropriate principles and a systematic methodology to follow). Concepts such as "design for manufacture" or "concurrent engineering" are widely used in industry. In the field of rock engineering, only limited attention has been paid to the design process, because the design of structures in rock masses presents unique challenges as a result of the uncertainties inherent in the characterization of geologic media. However, a stage has now been reached where we are able to sufficiently characterize rock masses for engineering purposes and identify the rock mechanics issues involved, but we still lack engineering design principles and methodology to maximize design performance. This paper discusses the principles and methodology of the engineering design process directed at integrating site characterization activities with the design, construction and performance of an underground repository. Using the latest information from the Yucca Mountain Project on geology, rock mechanics and starter tunnel design, the current lack of integration is pointed out, and it is shown how rock mechanics issues can be effectively interwoven with repository design through a systematic design process methodology leading to improved repository performance. In essence, the design process is seen as the use of design principles within an integrating design methodology, leading to innovative problem solving. In particular, a new concept of "Design for Constructibility and Performance" is introduced. This is discussed with respect to ten rock mechanics issues identified for repository design and performance.

  6. Single-Case Experimental Designs: A Systematic Review of Published Research and Current Standards

    ERIC Educational Resources Information Center

    Smith, Justin D.

    2012-01-01

    This article systematically reviews the research design and methodological characteristics of single-case experimental design (SCED) research published in peer-reviewed journals between 2000 and 2010. SCEDs provide researchers with a flexible and viable alternative to group designs with large sample sizes. However, methodological challenges have…

  7. Failure detection system design methodology. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Chow, E. Y.

    1980-01-01

    The design of a failure detection and identification system consists of designing a robust residual generation process and a high performance decision making process. The designs of these two processes are examined separately. Residual generation is based on analytical redundancy. Redundancy relations that are insensitive to modelling errors and noise effects are important for designing robust residual generation processes. The characterization of the concept of analytical redundancy in terms of a generalized parity space provides a framework in which a systematic approach to the determination of robust redundancy relations is developed. The Bayesian approach is adopted for the design of high performance decision processes. The FDI decision problem is formulated as a Bayes sequential decision problem. Since the optimal decision rule is incomputable, a methodology for designing suboptimal rules is proposed. A numerical algorithm is developed to facilitate the design and performance evaluation of suboptimal rules.
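
    The parity-space idea can be illustrated with the simplest case of hardware redundancy: three like sensors measuring one quantity. This toy sketch is not the thesis's generalized parity space; it forms pairwise residuals and isolates the faulty sensor as the one implicated by two large residuals.

```python
def parity_residuals(y):
    """Pairwise parity residuals for three like sensors measuring the
    same quantity.  With no fault all residuals stay near zero; a fault
    in sensor i drives exactly the two residuals involving y[i]."""
    r01 = y[0] - y[1]
    r02 = y[0] - y[2]
    r12 = y[1] - y[2]
    return r01, r02, r12

def isolate(y, threshold=0.5):
    """Return the index of the faulty sensor, or None if no residual
    pattern exceeds the threshold (single-fault assumption)."""
    r01, r02, r12 = parity_residuals(y)
    if abs(r01) > threshold and abs(r02) > threshold:
        return 0
    if abs(r01) > threshold and abs(r12) > threshold:
        return 1
    if abs(r02) > threshold and abs(r12) > threshold:
        return 2
    return None
```

    The robustness concern in the abstract is visible even here: the threshold trades false alarms against missed detections in the presence of noise.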

  8. Parametric design methodology for chemical processes using a simulator

    SciTech Connect

    Diwekar, U.M.; Rubin, E.S.

    1994-02-01

    Parameter design is a method popularized by the Japanese quality expert G. Taguchi, for designing products and manufacturing processes that are robust in the face of uncontrollable variations. At the design stage, the goal of parameter design is to identify design settings that make the product performance less sensitive to the effects of manufacturing and environmental variations and deterioration. Because parameter design reduces performance variation by reducing the influence of the sources of variation rather than by controlling them, it is a cost-effective technique for improving quality. A recent study on the application of parameter design methodology for chemical processes reported that the use of Taguchi's method was not justified and a method based on Monte Carlo simulation combined with optimization was shown to be more effective. However, this method is computationally intensive as a large number of samples are necessary to achieve the given accuracy. Additionally, determination of the number of sample runs required is based on experimentation due to a lack of systematic sampling methods. In an attempt to overcome these problems, the use of a stochastic modeling capability combined with an optimizer is presented in this paper. The objective is that of providing an effective means for application of parameter design methodologies to chemical processes using the ASPEN simulator. This implementation not only presents a generalized tool for use by chemical engineers at large but also provides systematic estimates of the number of sample runs required to attain the specified accuracy. The stochastic model employs the technique of Latin hypercube sampling instead of the traditional Monte Carlo technique and hence has a great potential to reduce the required number of samples. The methodology is illustrated via an example problem of designing a chemical process.
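
    The sampling idea behind the stochastic model can be sketched independently of the ASPEN simulator. Below is a minimal Latin hypercube sampler over the unit hypercube; it shows the standard technique only, and any resemblance to the paper's implementation is an assumption.

```python
import random

def latin_hypercube(n_samples, n_dims, seed=0):
    """Stratify each dimension into n_samples equal-probability bins,
    draw one point per bin, and shuffle the bin order per dimension so
    strata are combined randomly across dimensions."""
    rng = random.Random(seed)
    samples = [[0.0] * n_dims for _ in range(n_samples)]
    for d in range(n_dims):
        strata = list(range(n_samples))
        rng.shuffle(strata)
        for i, s in enumerate(strata):
            # one uniform draw inside stratum s of this dimension
            samples[i][d] = (s + rng.random()) / n_samples
    return samples

pts = latin_hypercube(10, 2)
```

    Unlike plain Monte Carlo, every marginal bin is hit exactly once, which is why fewer samples can suffice for a given accuracy.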

  9. Case-Crossover Analysis of Air Pollution Health Effects: A Systematic Review of Methodology and Application

    PubMed Central

    Carracedo-Martínez, Eduardo; Taracido, Margarita; Tobias, Aurelio; Saez, Marc; Figueiras, Adolfo

    2010-01-01

    Background Case-crossover is one of the most used designs for analyzing the health-related effects of air pollution. Nevertheless, no one has reviewed its application and methodology in this context. Objective We conducted a systematic review of case-crossover (CCO) designs used to study the relationship between air pollution and morbidity and mortality, from the standpoint of methodology and application. Data sources and extraction A search was made of the MEDLINE and EMBASE databases. Reports were classified as methodologic or applied. From the latter, the following information was extracted: author, study location, year, type of population (general or patients), dependent variable(s), independent variable(s), type of CCO design, and whether effect modification was analyzed for variables at the individual level. Data synthesis The review covered 105 reports that fulfilled the inclusion criteria. Of these, 24 addressed methodological aspects, and the remainder involved the design’s application. In the methodological reports, the designs that yielded the best results in simulation were symmetric bidirectional CCO and time-stratified CCO. Furthermore, we observed an increase across time in the use of certain CCO designs, mainly symmetric bidirectional and time-stratified CCO. The dependent variables most frequently analyzed were those relating to hospital morbidity; the pollutants most often studied were those linked to particulate matter. Among the CCO-application reports, 13.6% studied effect modification for variables at the individual level. Conclusions The use of CCO designs has undergone considerable growth; the most widely used designs were those that yielded better results in simulation studies: symmetric bidirectional and time-stratified CCO. However, the advantages of CCO as a method of analysis of variables at the individual level are put to little use. PMID:20356818
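As a hedged illustration of one design discussed above (not code from the review itself), the referent-selection rule of the time-stratified case-crossover design can be sketched as follows: the control days for an event are the other days in the same calendar month and year that fall on the same day of the week.

```python
from datetime import date, timedelta

def time_stratified_referents(event_day):
    """Referent (control) days for a time-stratified case-crossover
    design: all days in the same month and year sharing the event
    day's day-of-week, excluding the event day itself."""
    referents = []
    d = date(event_day.year, event_day.month, 1)
    while d.month == event_day.month:
        if d.weekday() == event_day.weekday() and d != event_day:
            referents.append(d)
        d += timedelta(days=1)
    return referents

refs = time_stratified_referents(date(2010, 1, 13))  # a Wednesday
# -> the other Wednesdays of January 2010: Jan 6, 20, 27
```

Because the strata are fixed calendar blocks, referents are selected symmetrically before and after the event, which is what avoids the overlap bias of unidirectional designs.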

  10. Waste Package Component Design Methodology Report

    SciTech Connect

    D.C. Mecham

    2004-07-12

    This Executive Summary provides an overview of the methodology being used by the Yucca Mountain Project (YMP) to design waste packages and ancillary components. This summary information is intended for readers with general interest, but also provides technical readers a general framework surrounding a variety of technical details provided in the main body of the report. The purpose of this report is to document and ensure appropriate design methods are used in the design of waste packages and ancillary components (the drip shields and emplacement pallets). The methodology includes identification of necessary design inputs, justification of design assumptions, and use of appropriate analysis methods, and computational tools. This design work is subject to ''Quality Assurance Requirements and Description''. The document is primarily intended for internal use and technical guidance for a variety of design activities. It is recognized that a wide audience including project management, the U.S. Department of Energy (DOE), the U.S. Nuclear Regulatory Commission, and others are interested to various levels of detail in the design methods and therefore covers a wide range of topics at varying levels of detail. Due to the preliminary nature of the design, readers can expect to encounter varied levels of detail in the body of the report. It is expected that technical information used as input to design documents will be verified and taken from the latest versions of reference sources given herein. This revision of the methodology report has evolved with changes in the waste package, drip shield, and emplacement pallet designs over many years and may be further revised as the design is finalized. Different components and analyses are at different stages of development. Some parts of the report are detailed, while other less detailed parts are likely to undergo further refinement. The design methodology is intended to provide designs that satisfy the safety and operational

  11. Performance-based asphalt mixture design methodology

    NASA Astrophysics Data System (ADS)

    Ali, Al-Hosain Mansour

    Today, several State D.O.T.s are investigating the use of tire rubber with local conventional materials. Several of the ongoing investigations identified potential benefits from the use of these materials, including improvements in material properties and performance. One of the major problems is associated with the transferability of asphalt rubber technology without appropriately considering the effects of the variety of conventional materials on mixture behavior and performance. Typically, the design of these mixtures is adapted to the physical properties of the conventional materials by using the empirical Marshall mixture design, without considering fundamental mixture behavior and performance. Design criteria related to the most common modes of failure for asphalt mixtures, such as rutting, fatigue cracking, and low-temperature thermal cracking, have to be developed and used for identifying the "best mixture," in terms of performance, for the specific local materials and loading conditions. The main objective of this study was the development of a mixture design methodology that considers mixture behavior and performance. To achieve this objective, a laboratory investigation was conducted to evaluate mixture properties that can be related to mixture performance (in terms of rutting, low-temperature cracking, moisture damage and fatigue) while simulating the actual field loading conditions to which the material is exposed. The results proved that the inclusion of rubber into asphalt mixtures improved physical characteristics such as elasticity, flexibility, rebound, and aging properties, increased fatigue resistance, and reduced rutting potential. The possibility of coupling the traditional Marshall mix design method with parameters related to mixture behavior and performance was investigated.
Also, the SHRP SUPERPAVE mix design methodology was reviewed and considered in this study for the development of an integrated

  12. Statins, cognition, and dementia—systematic review and methodological commentary

    PubMed Central

    Power, Melinda C.; Weuve, Jennifer; Sharrett, A. Richey; Blacker, Deborah

    2015-01-01

    Firm conclusions about whether mid-life or long-term statin use has an impact on cognitive decline and dementia remain elusive. Here, our objective was to systematically review, synthesize and critique the epidemiological literature that examines the relationship between statin use and cognition, so as to assess the current state of knowledge, identify gaps in our understanding, and make recommendations for future research. We summarize the findings of randomized controlled trials (RCTs) and observational studies, grouped according to study design. We discuss the methods for each, and consider likely sources of bias, such as reverse causation and confounding. Although observational studies that considered statin use at or near the time of dementia diagnosis suggest a protective effect of statins, these findings could be attributable to reverse causation. RCTs and well-conducted observational studies of baseline statin use and subsequent cognition over several years of follow-up do not support a causal preventative effect of late-life statin use on cognitive decline or dementia. Given that much of the human research on statins and cognition in the future will be observational, careful study design and analysis will be essential. PMID:25799928

  13. Statins, cognition, and dementia—systematic review and methodological commentary.

    PubMed

    Power, Melinda C; Weuve, Jennifer; Sharrett, A Richey; Blacker, Deborah; Gottesman, Rebecca F

    2015-04-01

    Firm conclusions about whether mid-life or long-term statin use has an impact on cognitive decline and dementia remain elusive. Here, our objective was to systematically review, synthesize and critique the epidemiological literature that examines the relationship between statin use and cognition, so as to assess the current state of knowledge, identify gaps in our understanding, and make recommendations for future research. We summarize the findings of randomized controlled trials (RCTs) and observational studies, grouped according to study design. We discuss the methods for each, and consider likely sources of bias, such as reverse causation and confounding. Although observational studies that considered statin use at or near the time of dementia diagnosis suggest a protective effect of statins, these findings could be attributable to reverse causation. RCTs and well-conducted observational studies of baseline statin use and subsequent cognition over several years of follow-up do not support a causal preventative effect of late-life statin use on cognitive decline or dementia. Given that much of the human research on statins and cognition in the future will be observational, careful study design and analysis will be essential. PMID:25799928

  14. Control/structure interaction design methodology

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.; Layman, William E.

    1989-01-01

    The Control Structure Interaction Program is a technology development program for spacecraft that exhibit interactions between the control system and structural dynamics. The program objectives include development and verification of new design concepts (such as active structure) and new tools (such as a combined structure and control optimization algorithm) and their verification in ground and possibly flight test. The new CSI design methodology is centered around interdisciplinary engineers using new tools that closely integrate structures and controls. Verification is an important CSI theme and analysts will be closely integrated to the CSI Test Bed laboratory. Components, concepts, tools and algorithms will be developed and tested in the lab and in future Shuttle-based flight experiments. The design methodology is summarized in block diagrams depicting the evolution of a spacecraft design and descriptions of analytical capabilities used in the process. The multiyear JPL CSI implementation plan is described along with the essentials of several new tools. A distributed network of computation servers and workstations was designed that will provide a state-of-the-art development base for the CSI technologies.

  15. Design Methodology of Micro Vibration Energy Harvesters

    NASA Astrophysics Data System (ADS)

    Tanaka, Shuji

    Recently, micro vibration energy harvesters have been attracting much attention for wireless sensor applications. To meet the power requirements of practical applications, the design methodology is important. This paper first reviews the fundamental theory of vibration energy harvesting, and then discusses how to design a micro vibration energy harvester at a concept level. For micro vibration energy harvesters, the only independent top-level design parameters are the mass and stroke of the seismic mass and the quality factor, while the frequency and acceleration of the vibration input are given parameters determined by the application. The key design point is simply to make the mass and stroke of the seismic mass as large as possible within the available device size. Some case studies based on the theory are also presented. This paper provides a guideline for the development of micro vibration energy harvesters.
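The design rule above, make mass and stroke as large as possible, follows from the textbook linear mass-spring-damper idealization (an assumption here, not the paper's own analysis): at resonance, with the damping tuned so the seismic mass just fills its stroke limit z_max under base acceleration amplitude a, the average extractable power is P = 0.5 * m * a * omega * z_max.

```python
import math

def stroke_limited_power(mass_kg, accel_ms2, freq_hz, stroke_m):
    """Average power at resonance for an idealized linear harvester
    whose damping is tuned so the seismic-mass motion just fills the
    stroke limit: P = 0.5 * m * a * omega * z_max."""
    omega = 2 * math.pi * freq_hz
    return 0.5 * mass_kg * accel_ms2 * omega * stroke_m

p1 = stroke_limited_power(1e-3, 1.0, 100.0, 0.5e-3)  # 1 g mass, 0.5 mm stroke
p2 = stroke_limited_power(2e-3, 1.0, 100.0, 1.0e-3)  # double mass and stroke
assert abs(p2 / p1 - 4.0) < 1e-9  # power scales with mass x stroke
```

The product form makes the paper's point directly: at a fixed input frequency and acceleration, only the mass and the stroke are left for the designer to maximize.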

  16. Application of systematic review methodology to the field of nutrition

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Systematic reviews represent a rigorous and transparent approach of synthesizing scientific evidence that minimizes bias. They evolved within the medical community to support development of clinical and public health practice guidelines, set research agendas and formulate scientific consensus state...

  17. Systematic Review of the Methodological Quality of Studies Aimed at Creating Gestational Weight Gain Charts.

    PubMed

    Ohadike, Corah O; Cheikh-Ismail, Leila; Ohuma, Eric O; Giuliani, Francesca; Bishop, Deborah; Kac, Gilberto; Puglia, Fabien; Maia-Schlüssel, Michael; Kennedy, Stephen H; Villar, José; Hirst, Jane E

    2016-03-01

    A range of adverse outcomes is associated with insufficient and excessive maternal weight gain in pregnancy, but there is no consensus regarding what constitutes optimal gestational weight gain (GWG). Differences in the methodological quality of GWG studies may explain the varying chart recommendations. The goal of this systematic review was to evaluate the methodological quality of studies that aimed to create GWG charts by scoring them against a set of predefined, independently agreed-upon criteria. These criteria were divided into 3 domains: study design (12 criteria), statistical methods (7 criteria), and reporting methods (4 criteria). The criteria were broken down further into items, and studies were assigned a quality score (QS) based on these criteria. For each item, studies were scored as either high (score = 0) or low (score = 1) risk of bias; a high QS correlated with a low risk of bias. The maximum possible QS was 34. The systematic search identified 12 eligible studies involving 2,268,556 women from 9 countries; their QSs ranged from 9 (26%) to 29 (85%) (median, 18; 53%). The most common sources for bias were found in study designs (i.e., not prospective); assessments of prepregnancy weight and gestational age; descriptions of weighing protocols; sample size calculations; and the multiple measurements taken at each visit. There is wide variation in the methodological quality of GWG studies constructing charts. High-quality studies are needed to guide future clinical recommendations. We recommend the following main requirements for future studies: prospective design, reliable evaluation of prepregnancy weight and gestational age, detailed description of measurement procedures and protocols, description of sample-size calculation, and the creation of smooth centile charts or z scores. PMID:26980814
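The quality-score percentages quoted above follow directly from the 34-point maximum; a one-line check:

```python
def qs_percent(score, max_score=34):
    """Quality score expressed as a percentage of the 34 points."""
    return round(100 * score / max_score)

assert qs_percent(9) == 26    # lowest-scoring study
assert qs_percent(29) == 85   # highest-scoring study
assert qs_percent(18) == 53   # median
```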

  18. Systematic Reviews of Animal Models: Methodology versus Epistemology

    PubMed Central

    Greek, Ray; Menache, Andre

    2013-01-01

    Systematic reviews are currently favored methods of evaluating research in order to reach conclusions regarding medical practice. The need for such reviews is necessitated by the fact that no research is perfect and experts are prone to bias. By combining many studies that fulfill specific criteria, one hopes that the strengths can be multiplied and thus reliable conclusions attained. Potential flaws in this process include the assumptions that underlie the research under examination. If the assumptions, or axioms, upon which the research studies are based, are untenable either scientifically or logically, then the results must be highly suspect regardless of the otherwise high quality of the studies or the systematic reviews. We outline recent criticisms of animal-based research, namely that animal models are failing to predict human responses. It is this failure that is purportedly being corrected via systematic reviews. We then examine the assumption that animal models can predict human outcomes to perturbations such as disease or drugs, even under the best of circumstances. We examine the use of animal models in light of empirical evidence comparing human outcomes to those from animal models, complexity theory, and evolutionary biology. We conclude that even if legitimate criticisms of animal models were addressed, through standardization of protocols and systematic reviews, the animal model would still fail as a predictive modality for human response to drugs and disease. Therefore, systematic reviews and meta-analyses of animal-based research are poor tools for attempting to reach conclusions regarding human interventions. PMID:23372426

  19. Development of a systematic methodology for evaluation of soil vapor extraction at Rocky Mountain Arsenal

    SciTech Connect

    Aamodt, E.C.; Gilmore, J.E.; Weaver, J.D.; Dahm, M.A.; Riese, A.C.; Tortoso, A.

    1994-12-31

    A systematic methodology was developed to evaluate the feasibility of using soil vapor extraction (SVE) to treat South Plants and former Basin F soil media in support of the ongoing Onpost Operable Unit Feasibility Study (FS) at Rocky Mountain Arsenal. The methodology used in situ air permeability testing, chemical and physical property characterization, and computer modeling to evaluate the potential for using SVE to treat soil contaminated with volatile organic compounds (VOCs), semivolatile organic compounds (SVOCs), and pesticide manufacturing process wastes, including potential odor-causing compounds. In situ air permeability tests were performed to measure air permeabilities and extracted vapor flow rates. Soil samples were collected at each test location and were analyzed for VOCs, low molecular weight SVOCs, potential odor-causing compounds, and physical property characteristics. In situ air permeability test and chemical and physical property characterization results were used during computer modeling evaluations to develop SVE conceptual designs, estimate remediation timeframes, select appropriate treatment technologies, and develop preliminary cost estimates for full-scale implementation. The methodology developed provides information necessary to evaluate SVE at the FS stage and provides a sound technical basis for design of full-scale SVE systems.

  20. Methodology on zoom system design and optimization

    NASA Astrophysics Data System (ADS)

    Ding, Quanxin; Liu, Hua

    2008-03-01

    For aim to establish effective methodology in research to design and evaluate on typical zoom sensor system, to satisfy the system requirements and achieve an advanced characteristics. Some methods about system analysis, especially task principle and key technique of core system, are analyzed deeply. Base on Gaussian photonics theory, zoom system differential equation, solves vector space distribution and integrated balance algorithm on global optimization system is studied. Dominate configuration of new idea system design and optimization, with which consecutive zoom and diffractive module equipped by great format photonics device, is established. The results of evaluated on a kind of typical zoom sensor system is presented, and achieves remarkable advantages on some criterions, such as Modulation Transfer Function (MTF), Spot Diagram (RMS) and Point Spread Function (PSF) etc., and in volume, weight, system efficiency and otherwise.

  1. Lean management in health care: definition, concepts, methodology and effects reported (systematic review protocol)

    PubMed Central

    2014-01-01

    Background Lean is a set of operating philosophies and methods that help create a maximum value for patients by reducing waste and waits. It emphasizes the consideration of the customer’s needs, employee involvement and continuous improvement. Research on the application and implementation of lean principles in health care has been limited. Methods This is a protocol for a systematic review, following the Cochrane Effective Practice and Organisation of Care (EPOC) methodology. The review aims to document, catalogue and synthesize the existing literature on the effects of lean implementation in health care settings, especially the potential effects on professional practice and health care outcomes. We have developed a Medline keyword search strategy, and this focused strategy will be translated into other databases. All search strategies will be provided in the review. The method proposed by the Cochrane EPOC group regarding randomized study designs, non-randomised controlled trials, controlled before-and-after studies and interrupted time series will be followed. In addition, we will also include cohort and case–control studies, and relevant non-comparative publications such as case reports. We will categorize and analyse the review findings according to the study design employed, the study quality (low- versus high-quality studies) and the reported types of implementation in the primary studies. We will present the results of studies in a tabular form. Discussion Overall, the systematic review aims to identify, assess and synthesize the evidence to underpin the implementation of lean activities in health care settings as defined in this protocol. As a result, the review will provide an evidence base for the effectiveness of lean and implementation methodologies reported in health care. Systematic review registration PROSPERO CRD42014008853 PMID:25238974

  2. Methodological Considerations in Designing and Evaluating Animal-Assisted Interventions

    PubMed Central

    Stern, Cindy; Chur-Hansen, Anna

    2013-01-01

    Simple Summary There is a growing literature on the benefits of companion animals to human mental and physical health. Despite the literature base, these benefits are not well understood, because of flawed methodologies. This paper draws upon four systematic reviews, focusing exclusively on the use of canine-assisted interventions for older people residing in long-term care. Two guides are offered for researchers, one for qualitative research, one for quantitative studies, in order to improve the empirical basis of knowledge. Research in the area of the human-animal bond and the potential benefits that derive from it can be better promoted with the use of uniform and rigorous methodological approaches. Abstract This paper presents a discussion of the literature on animal-assisted interventions and describes limitations surrounding current methodological quality. Benefits to human physical, psychological and social health cannot be empirically confirmed due to the methodological limitations of the existing body of research, and comparisons cannot validly be made across different studies. Without a solid research base animal-assisted interventions will not receive recognition and acceptance as a credible alternative health care treatment. The paper draws on the work of four systematic reviews conducted over April–May 2009, with no date restrictions, focusing exclusively on the use of canine-assisted interventions for older people residing in long-term care. The reviews revealed a lack of good quality studies. Although the literature base has grown in volume since its inception, it predominantly consists of anecdotal accounts and reports. Experimental studies undertaken are often flawed in aspects of design, conduct and reporting. There are few qualitative studies available leading to the inability to draw definitive conclusions. It is clear that due to the complexities associated with these interventions not all weaknesses can be eliminated. However, there are

  3. The Use and Reporting of the Cross-Over Study Design in Clinical Trials and Systematic Reviews: A Systematic Assessment

    PubMed Central

    Hambleton, Ian; Dwan, Kerry

    2016-01-01

    Background Systematic reviews of treatment interventions in stable or chronic conditions often require the synthesis of clinical trials with a cross-over design. Previous work has indicated that methodology for analysing cross-over data is inadequate in trial reports and in systematic reviews assessing trials with this design. Objective We assessed systematic review methodology for synthesising cross-over trials among Cochrane Cystic Fibrosis and Genetic Disorders Group reviews published to July 2015, and assessed the quality of reporting among the cross-over trials included in these reviews. Methodology We performed data extraction of methodology and reporting in reviews, trials identified and trials included within reviews. Principal Findings We reviewed a total of 142 Cochrane systematic reviews including 53 reviews which synthesised evidence from 218 cross-over trials. Thirty-three (63%) Cochrane reviews described a clear and appropriate method for the inclusion of cross-over data, and of these 19 (56%) used the same method to analyse results. 145 cross-over trials were described narratively or treated as parallel trials in reviews but in 30 (21%) of these trials data existed in the trial reports to account for the cross-over design. At the trial level, the analysis and presentation of results were often inappropriate or unclear, with only 69 (32%) trials presenting results that could be included in meta-analysis. Conclusions Despite development of accessible, technical guidance and training for Cochrane systematic reviewers, statistical analysis and reporting of cross-over data is inadequate at both the systematic review and the trial level. Plain language and practical guidance for the inclusion of cross-over data in meta-analysis would benefit systematic reviewers, who come from a wide range of health specialties. Minimum reporting standards for cross-over trials are needed. PMID:27409076
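A hedged sketch (with hypothetical data) of the paired analysis that the assessment finds missing from many reports: a cross-over trial yields two measurements per subject, so the analysis should be based on within-subject differences rather than treating the two periods as independent parallel groups.

```python
import math
import statistics

# Hypothetical outcomes for 8 subjects, each measured under both
# conditions (the defining feature of an AB/BA cross-over trial).
treatment = [12.1, 9.8, 11.4, 10.2, 13.0, 9.5, 11.9, 10.7]
control   = [11.0, 9.9, 10.1,  9.6, 11.8, 9.4, 10.8, 10.0]

# Paired analysis: within-subject differences remove between-subject
# variability from the comparison.
diffs = [t - c for t, c in zip(treatment, control)]
mean_diff = statistics.mean(diffs)
se = statistics.stdev(diffs) / math.sqrt(len(diffs))
# 95% CI using the t critical value for 7 degrees of freedom (2.365)
ci = (mean_diff - 2.365 * se, mean_diff + 2.365 * se)
```

It is the paired mean difference and its standard error, not the two period means analysed separately, that a meta-analysis needs from each trial; the assessment found only 32% of trials reported results in a form permitting this.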

  4. Environmental and Sustainability Education Policy Research: A Systematic Review of Methodological and Thematic Trends

    ERIC Educational Resources Information Center

    Aikens, Kathleen; McKenzie, Marcia; Vaughter, Philip

    2016-01-01

    This paper reports on a systematic literature review of policy research in the area of environmental and sustainability education. We analyzed 215 research articles, spanning four decades and representing 71 countries, and which engaged a range of methodologies. Our analysis combines quantification of geographic and methodological trends with…

  5. Unshrouded Centrifugal Turbopump Impeller Design Methodology

    NASA Technical Reports Server (NTRS)

    Prueger, George H.; Williams, Morgan; Chen, Wei-Chung; Paris, John; Williams, Robert; Stewart, Eric

    2001-01-01

    Turbopump weight continues to be a dominant parameter in the trade space for reduction of engine weight. The Space Shuttle Main Engine weight distribution indicates that the turbomachinery makes up approximately 30% of the total engine weight. Weight reduction can be achieved through reduction of the turbopump envelope. Reduction in envelope corresponds to an increase in turbopump speed and an increase in impeller head coefficient. Speed can be increased until suction performance limits are reached on the pump, or until other constraints, such as the turbine or bearings, limit speed. Once the speed of the turbopump is set, the impeller tip speed sets the minimum head coefficient of the machine. To reduce impeller diameter, the head coefficient must be increased. A significant limitation of increasing head coefficient is that the slope of the head-flow characteristic is affected, which can limit engine throttling range. Unshrouded impellers offer a design option for increased turbopump speed without increasing the impeller head coefficient. However, there are several issues with using an unshrouded impeller: there is a pump performance penalty due to the front open-face recirculation flow, there is a potential pump axial thrust problem from the unbalanced front open face and the back shroud face, and, since test data are very limited for this configuration, there is uncertainty in the magnitude and phase of the rotordynamic forces due to the front impeller passage. The purpose of this paper is to discuss the design of an unshrouded impeller and to examine its hydrodynamic performance, axial thrust, and rotordynamic performance. The design methodology is also discussed. This work helps provide guidelines for unshrouded impeller design.
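The sizing logic in the abstract, that at a fixed shaft speed a higher head coefficient shrinks the impeller, can be sketched from the standard definition psi = g*H / U_tip**2 (an idealized relation, not the authors' design code; the head and speed values below are illustrative only).

```python
import math

def impeller_diameter(head_m, speed_rpm, head_coeff, g=9.81):
    """Impeller tip diameter implied by head coefficient
    psi = g*H / U_tip**2 at a given shaft speed: solve for the
    tip speed U, then D = 2*U / omega."""
    u_tip = math.sqrt(g * head_m / head_coeff)
    omega = speed_rpm * 2 * math.pi / 60
    return 2 * u_tip / omega

d_low  = impeller_diameter(4000.0, 30000, 0.45)  # illustrative pump duty
d_high = impeller_diameter(4000.0, 30000, 0.60)
assert d_high < d_low  # raising head coefficient shrinks the impeller
```

The same relation shows the alternative route the paper pursues: if an unshrouded impeller allows a higher shaft speed, omega grows and the diameter falls with the head coefficient left unchanged.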

  6. Methodologies and study designs relevant to medical education research.

    PubMed

    Turner, Teri L; Balmer, Dorene F; Coverdale, John H

    2013-06-01

    Research is an important part of educational scholarship. Knowledge of research methodologies is essential for both conducting research as well as determining the soundness of the findings from published studies. Our goals for this paper therefore are to inform medical education researchers of the range and key components of educational research designs. We will discuss both qualitative and quantitative approaches to educational research. Qualitative methods will be presented according to traditions that have a distinguished history in particular disciplines. Quantitative methods will be presented according to an evidence-based hierarchy akin to that of evidence-based medicine with the stronger designs (systematic reviews and well conducted educational randomized controlled trials) at the top, and weaker designs (descriptive studies without comparison groups, or single case studies) at the bottom. It should be appreciated, however, that the research question determines the study design. Therefore, the onus is on the researcher to choose a design that is appropriate to answering the question. We conclude with an overview of how educational researchers should describe the study design and methods in order to provide transparency and clarity. PMID:23859093

  7. New methodology for systematic construction of systolic arrays

    SciTech Connect

    Faroughi, N.

    1987-01-01

    Transforming an algorithm, represented by mathematical expressions with uniform and bounded index spaces, into a systolic-array architecture is discussed. Systolic arrays are highly structured architectures tailored to a specific application. They have specific architectural properties such as simple processing elements (cells), simple and regular data and control communication, and local cell interconnections. The new design method is based on an understanding of the relationship between two highly structured representations of an algorithm: the mathematical expressions and their systolic solutions. The method consists of three major steps: algorithm representation, algorithm model, and architecture specification. Algorithm representation involves the translation of mathematical expressions into a set of equivalent simple computations, which are grouped into subsets based on the required set of operations and operands of the same type. In the algorithm model, the properties of systolic arrays are represented in terms of feature interrelationships. A sub-systolic array is designed separately for each subset of the simple computations. The final array is constructed by joining the sub-systolic arrays. Other architecture specifications, such as data movement and cell-count ratio, are determined early in the design process and thus can be used to select systolic solutions that require the fewest cells and the lowest I/O bandwidth.
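For illustration only (this example is not from the paper), the kind of architecture the method targets can be sketched as a cycle-level simulation of an output-stationary systolic array for matrix multiplication, the canonical systolic example: operands march through the grid one cell per cycle, and each cell repeatedly multiplies the pair passing through it and accumulates locally.

```python
def systolic_matmul(A, B):
    """Cycle-level simulation of an output-stationary systolic array
    computing C = A @ B.  A's rows enter from the left and B's
    columns from the top, each skewed by one cycle per row/column,
    so operand pair (A[i][p], B[p][j]) reaches cell (i, j) at cycle
    t = p + i + j, where it is multiplied and accumulated."""
    n, k, m = len(A), len(B), len(B[0])
    C = [[0] * m for _ in range(n)]
    cycles = n + m + k  # enough cycles to drain the pipeline
    for t in range(cycles):
        for i in range(n):
            for j in range(m):
                p = t - i - j  # index of the operand pair arriving now
                if 0 <= p < k:
                    C[i][j] += A[i][p] * B[p][j]
    return C

C = systolic_matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]])
# -> [[19, 22], [43, 50]]
```

Each cell uses only one multiplier, one adder, and nearest-neighbour links, which is exactly the "simple cells, regular local communication" property the design method exploits.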

  8. CONCEPTUAL DESIGNS FOR A NEW HIGHWAY VEHICLE EMISSIONS ESTIMATION METHODOLOGY

    EPA Science Inventory

    The report discusses six conceptual designs for a new highway vehicle emissions estimation methodology and summarizes the recommendations of each design for improving the emissions and activity factors in the emissions estimation process. he complete design reports are included a...

  9. Methodology for Preliminary Design of Electrical Microgrids

    SciTech Connect

    Jensen, Richard P.; Stamp, Jason E.; Eddy, John P.; Henry, Jordan M; Munoz-Ramos, Karina; Abdallah, Tarek

    2015-09-30

    Many critical loads rely on simple backup generation to provide electricity in the event of a power outage. An Energy Surety Microgrid (ESM) can protect against outages caused by single generator failures to improve reliability. An ESM will also provide a host of other benefits, including integration of renewable energy, fuel optimization, and maximizing the value of energy storage. The ESM concept includes a categorization for microgrid value propositions, and quantifies how the investment can be justified during either grid-connected or utility outage conditions. In contrast with many approaches, the ESM approach explicitly sets requirements based on unlikely extreme conditions, including the need to protect against determined cyber adversaries. During the United States (US) Department of Defense (DOD)/Department of Energy (DOE) Smart Power Infrastructure Demonstration for Energy Reliability and Security (SPIDERS) effort, the ESM methodology was successfully used to develop the preliminary designs, which directly supported the contracting, construction, and testing for three military bases. Acknowledgements: Sandia National Laboratories and the SPIDERS technical team would like to acknowledge the following for help in the project:
    * Mike Hightower, who has been the key driving force for Energy Surety Microgrids
    * Juan Torres and Abbas Akhil, who developed the concept of microgrids for military installations
    * Merrill Smith, U.S. Department of Energy SPIDERS Program Manager
    * Ross Roley and Rich Trundy from U.S. Pacific Command
    * Bill Waugaman and Bill Beary from U.S. Northern Command
    * Melanie Johnson and Harold Sanborn of the U.S. Army Corps of Engineers Construction Engineering Research Laboratory
    * Experts from the National Renewable Energy Laboratory, Idaho National Laboratory, Oak Ridge National Laboratory, and Pacific Northwest National Laboratory

  10. Improving spacecraft design using a multidisciplinary design optimization methodology

    NASA Astrophysics Data System (ADS)

    Mosher, Todd Jon

    2000-10-01

    Spacecraft design has gone from maximizing performance under technology constraints to minimizing cost under performance constraints. This is characteristic of the "faster, better, cheaper" movement that has emerged within NASA. Currently spacecraft are "optimized" manually through a tool-assisted evaluation of a limited set of design alternatives. With this approach there is no guarantee that a systems-level focus will be taken, and "feasibility" rather than "optimality" is commonly all that is achieved. To improve spacecraft design in the "faster, better, cheaper" era, a new approach using multidisciplinary design optimization (MDO) is proposed. Using MDO methods brings structure to conceptual spacecraft design by casting a spacecraft design problem into an optimization framework. Then, through the construction of a model that captures design and cost, this approach facilitates a quicker and more straightforward option synthesis. The final step is to automatically search the design space. As computer processor speed continues to increase, enumeration of all combinations, while not elegant, is one method that is straightforward to perform. As an alternative to enumeration, genetic algorithms are used and find solutions by reviewing fewer possible solutions, with some limitations. Both methods increase the likelihood of finding an optimal design, or at least the most promising area of the design space. This spacecraft design methodology using MDO is demonstrated on three examples. A retrospective test for validation is performed using the Near Earth Asteroid Rendezvous (NEAR) spacecraft design. For the second example, the premise that aerobraking was needed to minimize mission cost and was mission enabling for the Mars Global Surveyor (MGS) mission is challenged. While one might expect no feasible design space for an MGS mission without aerobraking, a counterintuitive result is discovered. Several design options that don't use aerobraking are feasible and cost
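    The enumeration-versus-genetic-algorithm trade described in this abstract can be illustrated with a toy model. The design options and cost function below are invented stand-ins, not the NEAR or MGS models; the sketch only shows why exhaustive enumeration guarantees the optimum on a small discrete space, while a minimal GA reviews fewer designs without that guarantee.

```python
import itertools
import random

# Hypothetical discrete design space (illustrative assumption, not a real
# spacecraft model): propulsion type, solar array size, aerobraking choice.
OPTIONS = {
    "propulsion": ["chemical", "electric"],
    "array_m2":   [4, 8, 12],
    "aerobrake":  [False, True],
}

def cost(design):
    """Toy cost model: electric propulsion and aerobraking are cheaper,
    larger arrays cost more (numbers are invented for illustration)."""
    c = 100.0
    c += 30.0 if design["propulsion"] == "chemical" else 10.0
    c += 2.5 * design["array_m2"]
    c -= 15.0 if design["aerobrake"] else 0.0
    if design["propulsion"] == "electric" and design["array_m2"] < 8:
        c += 1e6  # infeasible: electric propulsion needs a larger array
    return c

keys = list(OPTIONS)

# 1) Exhaustive enumeration: guaranteed optimum on a small discrete space.
best_enum = min(
    (dict(zip(keys, vals)) for vals in itertools.product(*OPTIONS.values())),
    key=cost,
)

# 2) Minimal genetic algorithm: elitist selection plus single-gene mutation.
random.seed(0)
def rand_design():
    return {k: random.choice(v) for k, v in OPTIONS.items()}

pop = [rand_design() for _ in range(6)]
for _ in range(20):
    pop.sort(key=cost)
    child = dict(pop[0])                 # copy the current best (elitism)
    k = random.choice(keys)              # mutate one design variable
    child[k] = random.choice(OPTIONS[k])
    pop[-1] = child                      # replace the worst member
best_ga = min(pop, key=cost)

print(best_enum, cost(best_enum))
print(best_ga, cost(best_ga))
```

    On this toy space the GA usually finds the same optimum as enumeration, but only enumeration is guaranteed to.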

  11. Systematic design of an anastigmatic lens axicon

    NASA Astrophysics Data System (ADS)

    Goncharov, Alexander V.; Burvall, Anna; Dainty, Christopher

    2007-08-01

    We present an analytical method for systematic optical design of a double-pass axicon that shows almost no astigmatism in oblique illumination compared to a conventional linear axicon. The anastigmatic axicon is a singlet lens with nearly concentric spherical surfaces applied in double pass, making it possible to form a long narrow focal line of uniform width. The front and the back surfaces have reflective coatings in the central and annular zones, respectively, to provide the double pass. Our design method finds the radii of curvatures and axial thickness of the lens for a given angle between the exiting rays and the optical axis. It also finds the optimal position of the reflecting zones for minimal vignetting. This method is based on ray tracing of the real rays at the marginal heights of the aperture and therefore is superior to any paraxial method. We illustrate the efficiency of the method by designing a test axicon with optical parameters used for a prototype axicon, which was manufactured and experimentally tested. We compare the optical characteristics of our test axicon with those of the experimental prototype.

  12. Enhancing the Front-End Phase of Design Methodology

    ERIC Educational Resources Information Center

    Elias, Erasto

    2006-01-01

    Design methodology (DM) is defined by the procedural path, expressed in design models, and techniques or methods used to untangle the various activities within a design model. Design education in universities is mainly based on descriptive design models. Much knowledge and organization have been built into DM to facilitate design teaching.…

  13. Design and Methodology in Institutional Research.

    ERIC Educational Resources Information Center

    Bagley, Clarence H., Ed.

    The proceedings of this forum contain 23 papers focusing mainly on the technical aspects of institutional research. Most of the authors are institutional research officers in various colleges and universities. Part 1 contains papers on institutional research methodology. They deal with such topics as faculty load studies, enrollment projections,…

  14. Technical report on LWR design decision methodology. Phase I

    SciTech Connect

    1980-03-01

    Energy Incorporated (EI) was selected by Sandia Laboratories to develop and test an LWR design decision methodology. Contract Number 42-4229 provided funding for Phase I of this work. This technical report on LWR design decision methodology documents the activities performed under that contract. Phase I was a short-term effort to thoroughly review the current LWR design decision process to assure complete understanding of current practices and to establish a well-defined interface for development of initial quantitative design guidelines.

  15. Systematic risk assessment methodology for critical infrastructure elements - Oil and Gas subsectors

    NASA Astrophysics Data System (ADS)

    Gheorghiu, A.-D.; Ozunu, A.

    2012-04-01

    The concern for the protection of critical infrastructure has been rapidly growing in the last few years in Europe. The level of knowledge and preparedness in this field is beginning to develop in a lawfully organized manner, for the identification and designation of critical infrastructure elements of national and European interest. Oil and gas production, refining, treatment, storage and transmission by pipelines facilities, are considered European critical infrastructure sectors, as per Annex I of the Council Directive 2008/114/EC of 8 December 2008 on the identification and designation of European critical infrastructures and the assessment of the need to improve their protection. Besides identifying European and national critical infrastructure elements, member states also need to perform a risk analysis for these infrastructure items, as stated in Annex II of the above mentioned Directive. In the field of risk assessment, there are a series of acknowledged and successfully used methods in the world, but not all hazard identification and assessment methods and techniques are suitable for a given site, situation, or type of hazard. As Theoharidou, M. et al. noted (Theoharidou, M., P. Kotzanikolaou, and D. Gritzalis 2009. Risk-Based Criticality Analysis. In Critical Infrastructure Protection III. Proceedings. Third Annual IFIP WG 11.10 International Conference on Critical Infrastructure Protection. Hanover, New Hampshire, USA, March 23-25, 2009: revised selected papers, edited by C. Palmer and S. Shenoi, 35-49. Berlin: Springer.), despite the wealth of knowledge already created, there is a need for simple, feasible, and standardized criticality analyses. 
The proposed systematic risk assessment methodology includes three basic steps: a preliminary analysis that identifies the hazards (including possible natural hazards) for each installation/section within a given site, followed by a criterial (screening) analysis, and then a detailed analysis step
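    A minimal sketch of that three-step flow (preliminary hazard identification, criterial screening, detailed analysis of the survivors) might look as follows; the installations, hazards, and the severity-times-likelihood threshold are invented examples, not values from the methodology itself.

```python
# Invented example data: installations at a hypothetical oil and gas site.
installations = [
    {"name": "tank farm",  "hazards": ["fire", "flood"], "severity": 4, "likelihood": 3},
    {"name": "pump house", "hazards": ["fire"],          "severity": 2, "likelihood": 2},
    {"name": "pipeline",   "hazards": ["leak", "quake"], "severity": 5, "likelihood": 2},
]

# Step 1: preliminary analysis -- keep installations with identified hazards.
preliminary = [i for i in installations if i["hazards"]]

# Step 2: criterial (screening) analysis -- a simple risk-index threshold.
CRITICAL_INDEX = 8
screened = [i for i in preliminary
            if i["severity"] * i["likelihood"] >= CRITICAL_INDEX]

# Step 3: detailed analysis is performed only on the screened candidates.
for site in screened:
    print(site["name"], site["severity"] * site["likelihood"])
```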

  16. Methodology used in studies reporting chronic kidney disease prevalence: a systematic literature review

    PubMed Central

    Brück, Katharina; Jager, Kitty J.; Dounousi, Evangelia; Kainz, Alexander; Nitsch, Dorothea; Ärnlöv, Johan; Rothenbacher, Dietrich; Browne, Gemma; Capuano, Vincenzo; Ferraro, Pietro Manuel; Ferrieres, Jean; Gambaro, Giovanni; Guessous, Idris; Hallan, Stein; Kastarinen, Mika; Navis, Gerjan; Gonzalez, Alfonso Otero; Palmieri, Luigi; Romundstad, Solfrid; Spoto, Belinda; Stengel, Benedicte; Tomson, Charles; Tripepi, Giovanni; Völzke, Henry; Wiȩcek, Andrzej; Gansevoort, Ron; Schöttker, Ben; Wanner, Christoph; Vinhas, Jose; Zoccali, Carmine; Van Biesen, Wim; Stel, Vianda S.

    2015-01-01

    Background Many publications report the prevalence of chronic kidney disease (CKD) in the general population. Comparisons across studies are hampered as CKD prevalence estimations are influenced by study population characteristics and laboratory methods. Methods For this systematic review, two researchers independently searched PubMed, MEDLINE and EMBASE to identify all original research articles that were published between 1 January 2003 and 1 November 2014 reporting the prevalence of CKD in the European adult general population. Data on study methodology and reporting of CKD prevalence results were independently extracted by two researchers. Results We identified 82 eligible publications and included 48 publications of individual studies for the data extraction. There was considerable variation in population sample selection. The majority of studies did not report the sampling frame used, and the response ranged from 10 to 87%. With regard to the assessment of kidney function, 67% used a Jaffe assay, whereas 13% used the enzymatic assay for creatinine determination. Isotope dilution mass spectrometry calibration was used in 29%. The CKD-EPI (52%) and MDRD (75%) equations were most often used to estimate glomerular filtration rate (GFR). CKD was defined as estimated GFR (eGFR) <60 mL/min/1.73 m2 in 92% of studies. Urinary markers of CKD were assessed in 60% of the studies. CKD prevalence was reported by sex and age strata in 54 and 50% of the studies, respectively. In publications with a primary objective of reporting CKD prevalence, 39% reported a 95% confidence interval. Conclusions The findings from this systematic review showed considerable variation in methods for sampling the general population and assessment of kidney function across studies reporting CKD prevalence. These results are utilized to provide recommendations to help optimize both the design and the reporting of future CKD prevalence studies, which will enhance comparability of study results
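    For context, the eGFR equations named in this abstract reduce CKD classification to a formula plus a threshold. The sketch below uses the IDMS-traceable 4-variable MDRD study equation (with the race coefficient omitted for brevity) and the eGFR < 60 mL/min/1.73 m2 cutoff used by 92% of the reviewed studies; it is illustrative only, not clinical software.

```python
def egfr_mdrd(scr_mg_dl, age, female):
    """IDMS-traceable 4-variable MDRD study equation, mL/min/1.73 m^2.
    scr_mg_dl: serum creatinine in mg/dL. Race coefficient omitted here."""
    egfr = 175.0 * scr_mg_dl ** -1.154 * age ** -0.203
    return egfr * (0.742 if female else 1.0)

def has_ckd(egfr):
    # Threshold used by 92% of the studies in the review.
    return egfr < 60.0

e = egfr_mdrd(1.0, 50, female=False)
print(round(e, 1), has_ckd(e))
```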

  17. Reporting of results from network meta-analyses: methodological systematic review

    PubMed Central

    Trinquart, Ludovic; Seror, Raphaèle; Ravaud, Philippe

    2014-01-01

    Objective To examine how the results of network meta-analyses are reported. Design Methodological systematic review of published reports of network meta-analyses. Data sources Cochrane Database of Systematic Reviews, Database of Abstracts of Reviews of Effects, Medline, and Embase, searched from inception to 12 July 2012. Study selection All network meta-analyses comparing the clinical efficacy of three or more interventions in randomised controlled trials were included, excluding meta-analyses with an open loop network of three interventions. Data extraction and synthesis The reporting of the network and results was assessed. A composite outcome included the description of the network (number of interventions, direct comparisons, and randomised controlled trials and patients for each comparison) and the reporting of effect sizes derived from direct evidence, indirect evidence, and the network meta-analysis. Results 121 network meta-analyses (55 published in general journals; 48 funded by at least one private source) were included. The network and its geometry (network graph) were not reported in 100 (83%) articles. The effect sizes derived from direct evidence, indirect evidence, and the network meta-analysis were not reported in 48 (40%), 108 (89%), and 43 (36%) articles, respectively. In 52 reports that ranked interventions, 43 did not report the uncertainty in ranking. Overall, 119 (98%) reports of network meta-analyses did not give a description of the network or effect sizes from direct evidence, indirect evidence, and the network meta-analysis. This finding did not differ by journal type or funding source. Conclusions The results of network meta-analyses are heterogeneously reported. Development of reporting guidelines to assist authors in writing and readers in critically appraising reports of network meta-analyses is timely. PMID:24618053
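    The "indirect evidence" assessed above is typically obtained by an adjusted indirect comparison through a common comparator (the Bucher method). A minimal sketch, assuming effect sizes are already on a common scale such as log odds ratios:

```python
import math

def indirect_effect(d_ab, se_ab, d_cb, se_cb):
    """Adjusted indirect comparison (Bucher): effect of A vs C obtained
    through the common comparator B, assuming independent trial sets."""
    d_ac = d_ab - d_cb                      # d(A vs C) = d(A vs B) - d(C vs B)
    se_ac = math.sqrt(se_ab**2 + se_cb**2)  # variances add
    return d_ac, se_ac

# Invented example numbers: A vs B and C vs B effects with standard errors.
d, se = indirect_effect(0.5, 0.1, 0.2, 0.1)
lo, hi = d - 1.96 * se, d + 1.96 * se
print(round(d, 3), round(se, 3), (round(lo, 3), round(hi, 3)))
```

    Note how the indirect estimate carries more uncertainty than either direct comparison, which is one reason the review's finding (indirect effect sizes unreported in 89% of articles) matters.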

  18. Forced vibration and flutter design methodology

    SciTech Connect

    Snyder, L.E.; Burns, D.W.

    1988-06-01

    The aeroelastic principles and considerations of designing blades, disks, and vanes to avoid high cycle fatigue failure are covered. Two types of vibration that can cause high cycle fatigue, flutter and forced vibration, will first be defined and the basic governing equations discussed. Next, under forced vibration design, the areas of source definition, types of components, vibratory mode shape definitions, and basic steps in design for adequate high cycle fatigue life will be presented. For clarification, a forced vibration design example will be shown using a high performance turbine blade/disk component. Finally, types of flutter, dominant flutter parameters, and flutter procedures and design parameters will be discussed. The overall emphasis is on the application of aeroelastic criteria to the initial design of blades, disks, and vanes to prevent high cycle fatigue failures.
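    One basic forced-vibration screening step, finding where per-rev excitation orders cross blade natural frequencies inside the operating speed range (a Campbell-diagram check), can be sketched as follows. The frequencies, engine orders, speed range, and margin are invented illustrative numbers, not values from the paper.

```python
# Invented example data for a Campbell-diagram style resonance check.
natural_freqs_hz = [670.0, 1200.0, 3400.0]   # blade mode frequencies
engine_orders = [1, 2, 4, 8]                 # per-rev excitation sources
speed_range_rpm = (8000.0, 12000.0)          # operating speed range

def resonant_crossings(freqs, orders, rpm_range, margin=0.05):
    """Return (mode_hz, order, rpm) wherever an engine-order line
    crosses a natural frequency within the operating range, widened
    by a +/- fractional margin."""
    hits = []
    for f in freqs:
        for n in orders:
            rpm = f * 60.0 / n               # order line: f_exc = n * rpm / 60
            if rpm_range[0] * (1 - margin) <= rpm <= rpm_range[1] * (1 + margin):
                hits.append((f, n, rpm))
    return hits

for mode, order, rpm in resonant_crossings(natural_freqs_hz, engine_orders,
                                           speed_range_rpm):
    print(f"mode {mode} Hz crosses order {order} at {rpm:.0f} rpm")
```

    Crossings flagged this way are the candidates for detailed high cycle fatigue life assessment.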

  19. The Study of the Relationship between Probabilistic Design and Axiomatic Design Methodology. Volume 1

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chinyere; Onyebueke, Landon

    1996-01-01

    This program report is the final report covering all the work done on this project. The goal of this project is technology transfer of methodologies to improve the design process. The specific objectives are: 1. To learn and understand Probabilistic design analysis using NESSUS. 2. To assign Design Projects on the application of NESSUS to either undergraduate or graduate students. 3. To integrate the application of NESSUS into selected senior level courses in the Civil and Mechanical Engineering curricula. 4. To develop courseware in Probabilistic Design methodology to be included in a graduate level Design Methodology course. 5. To study the relationship between the Probabilistic design methodology and Axiomatic design methodology.

  20. A prototype computerized synthesis methodology for generic space access vehicle (SAV) conceptual design

    NASA Astrophysics Data System (ADS)

    Huang, Xiao

    2006-04-01

    Today's and especially tomorrow's competitive launch vehicle design environment requires the development of a dedicated generic Space Access Vehicle (SAV) design methodology. A total of 115 industrial, research, and academic aircraft, helicopter, missile, and launch vehicle design synthesis methodologies have been evaluated. As the survey indicates, each synthesis methodology tends to focus on a specific flight vehicle configuration, thus precluding the key capability to systematically compare flight vehicle design alternatives. The aim of the research investigation is to provide decision-making bodies and the practicing engineer a design process and tool box for robust modeling and simulation of flight vehicles where the ultimate performance characteristics may hinge on numerical subtleties. This will enable the designer of a SAV for the first time to consistently compare different classes of SAV configurations on an impartial basis. This dissertation presents the development steps required towards a generic (configuration independent) hands-on flight vehicle conceptual design synthesis methodology. This process is developed such that it can be applied to any flight vehicle class if desired. In the present context, the methodology has been put into operation for the conceptual design of a tourist Space Access Vehicle. The case study illustrates elements of the design methodology & algorithm for the class of Horizontal Takeoff and Horizontal Landing (HTHL) SAVs. The HTHL SAV design application clearly outlines how the conceptual design process can be centrally organized, executed and documented with focus on design transparency, physical understanding and the capability to reproduce results. This approach offers the project lead and creative design team a management process and tool which iteratively refines the individual design logic chosen, leading to mature design methods and algorithms. As illustrated, the HTHL SAV hands-on design methodology offers growth

  1. Qualitative research synthesis: methodological guidance for systematic reviewers utilizing meta-aggregation.

    PubMed

    Lockwood, Craig; Munn, Zachary; Porritt, Kylie

    2015-09-01

    Qualitative synthesis informs important aspects of evidence-based healthcare, particularly within the practical decision-making contexts that health professionals work in. Of the qualitative methodologies available for synthesis, meta-aggregation is most transparently aligned with accepted conventions for the conduct of high-quality systematic reviews. Meta-aggregation is philosophically grounded in pragmatism and transcendental phenomenology. The essential characteristics of a meta-aggregative review are that the reviewer avoids re-interpretation of included studies, but instead accurately and reliably presents the findings of the included studies as intended by the original authors. This study reports on the methodology and methods of meta-aggregation within the structure of an a priori protocol and standardized frameworks for reporting of results by over-viewing the essential components of a systematic review report. PMID:26262565

  2. Implicit Shape Parameterization for Kansei Design Methodology

    NASA Astrophysics Data System (ADS)

    Nordgren, Andreas Kjell; Aoyama, Hideki

    Implicit shape parameterization for Kansei design is a procedure that uses 3D models, or concepts, to span a shape space for surfaces in the automotive field. A low-dimensional, yet accurate shape descriptor was found by Principal Component Analysis of an ensemble of point-clouds, which were extracted from mesh-based surfaces modeled in a CAD program. A theoretical background of the procedure is given along with step-by-step instructions for the required data-processing. The results show that complex surfaces can be described very efficiently, and encode design features by an implicit approach that does not rely on error-prone explicit parameterizations. This provides a very intuitive way for a designer to explore shapes, because various design features can simply be introduced by adding new concepts to the ensemble. Complex shapes have been difficult to analyze with Kansei methods due to the large number of parameters involved, but implicit parameterization of design features provides a low-dimensional shape descriptor for efficient data collection, model-building and analysis of emotional content in 3D surfaces.
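    The PCA-based shape descriptor described above can be sketched on synthetic data. The "surfaces" below are one-dimensional profiles generated from two underlying design features plus noise, standing in for flattened point clouds; the point is that a two-number descriptor reconstructs the ensemble almost exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)

# Ensemble of 20 "concept" profiles driven by two hidden design features
# (bump height a, tilt b) plus small measurement noise.
clouds = np.stack([
    a * np.sin(np.pi * x) + b * x + 0.01 * rng.standard_normal(50)
    for a, b in rng.uniform(-1.0, 1.0, size=(20, 2))
])

# PCA via SVD of the mean-centered ensemble matrix.
mean = clouds.mean(axis=0)
centered = clouds - mean
U, s, Vt = np.linalg.svd(centered, full_matrices=False)

explained = (s**2) / (s**2).sum()
print(np.round(explained[:3], 3))

# Two principal components -> a 2-number implicit shape descriptor.
descriptor = centered @ Vt[:2].T          # shape (20, 2)
recon = mean + descriptor @ Vt[:2]        # reconstruction from 2 numbers
print(float(np.abs(recon - clouds).max()))
```

    Adding a new concept to `clouds` and recomputing the PCA is exactly the "introduce a design feature by adding a concept" workflow the abstract describes.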

  3. A wing design methodology for low-boom low-drag supersonic business jet

    NASA Astrophysics Data System (ADS)

    Le, Daniel B.

    2009-12-01

    Arguably the most critical hindrance to the successful development of a commercial supersonic aircraft is the impact of the sonic boom signature. The sonic boom signature of a supersonic aircraft is predicted using sonic boom theory, which formulates a relationship between the complex three-dimensional geometry of the aircraft and the pressure distribution, and decomposes the geometry in terms of simple geometrical components. The supersonic aircraft design process is typically based on boom minimization theory. This theory provides a theoretical equivalent area distribution which should be matched by the conceptual design in order to achieve the pre-determined sonic boom signature. The difference between the target equivalent area distribution and the actual equivalent area distribution is referred to here as the gap distribution. The primary intent of this dissertation is to provide the designer with a systematic and structured approach to designing the aircraft wings with limited changes to the baseline concept while achieving critical design goals. The design process can easily become overwhelming, and the effectiveness of individual design decisions may be difficult to evaluate. The wing design is decoupled into two separate processes, one focused on the planform design and the other on the camber design. Moreover, this design methodology supplements the designer by allowing trade studies to be conducted between important design parameters and objectives. The wing planform design methodology incorporates a continuous gradient-based optimization scheme to supplement the design process. This is not meant to substitute the vast amount of knowledge and design decisions that are needed for a successful design. Instead, the numerical optimization helps the designer to refine creative concepts. Last, this dissertation integrates a risk mitigation scheme throughout the wing design process. 
The design methodology implements minimal design changes to the wing geometry while achieving the target design goal

  4. Philosophical and Methodological Beliefs of Instructional Design Faculty and Professionals

    ERIC Educational Resources Information Center

    Sheehan, Michael D.; Johnson, R. Burke

    2012-01-01

    The purpose of this research was to probe the philosophical beliefs of instructional designers using sound philosophical constructs and quantitative data collection and analysis. We investigated the philosophical and methodological beliefs of instructional designers, including 152 instructional design faculty members and 118 non-faculty…

  5. Experimental Validation of an Integrated Controls-Structures Design Methodology

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Elliot, Kenny B.; Walz, Joseph E.

    1996-01-01

    The first experimental validation of an integrated controls-structures design methodology for a class of large order, flexible space structures is described. Integrated redesign of the controls-structures-interaction evolutionary model, a laboratory testbed at NASA Langley, was described earlier. The redesigned structure was fabricated, assembled in the laboratory, and experimentally tested against the original structure. Experimental results indicate that the structure redesigned using the integrated design methodology requires significantly less average control power than the nominal structure with control-optimized designs, while maintaining the required line-of-sight pointing performance. Thus, the superiority of the integrated design methodology over the conventional design approach is experimentally demonstrated. Furthermore, amenability of the integrated design structure to other control strategies is evaluated, both analytically and experimentally. Using Linear-Quadratic-Guassian optimal dissipative controllers, it is observed that the redesigned structure leads to significantly improved performance with alternate controllers as well.

  6. Changes in Clinical Trials Methodology Over Time: A Systematic Review of Six Decades of Research in Psychopharmacology

    PubMed Central

    Brunoni, André R.; Tadini, Laura; Fregni, Felipe

    2010-01-01

    Background There have been many changes in clinical trials methodology since the introduction of lithium and the beginning of the modern era of psychopharmacology in 1949. The nature and importance of these changes have not been fully addressed to date. As methodological flaws in trials can lead to false-negative or false-positive results, the objective of our study was to evaluate the impact of methodological changes in psychopharmacology clinical research over the past 60 years. Methodology/Principal Findings We performed a systematic review from 1949 to 2009 on MEDLINE and Web of Science electronic databases, and a hand search of high impact journals on studies of seven major drugs (chlorpromazine, clozapine, risperidone, lithium, fluoxetine and lamotrigine). All controlled studies published 100 months after the first trial were included. Ninety-one studies met our inclusion criteria. We analyzed the major changes in abstract reporting, study design, participants' assessment and enrollment, methodology and statistical analysis. Our results showed that the methodology of psychiatric clinical trials changed substantially, with quality gains in abstract reporting, results reporting, and statistical methodology. Recent trials use more informed consent, periods of washout, intention-to-treat approach and parametric tests. Placebo use remains high and unchanged over time. Conclusions/Significance Clinical trial quality of psychopharmacological studies has changed significantly in most of the aspects we analyzed. There was significant improvement in quality reporting and internal validity. These changes have increased study efficiency; however, there is room for improvement in some aspects such as rating scales, diagnostic criteria and better trial reporting. Therefore, despite the advancements observed, there are still several areas that can be improved in psychopharmacology clinical trials. PMID:20209133

  7. Surface design methodology - challenge the steel

    NASA Astrophysics Data System (ADS)

    Bergman, M.; Rosen, B.-G.; Eriksson, L.; Anderberg, C.

    2014-03-01

    The way a product or material is experienced by its user can differ depending on the scenario. It is also well known that different materials and surfaces are used for different purposes. When optimizing materials and surface roughness with the intention of improving a product, it is important to capture not only the physical requirements but also the user experience and expectations. The design of medical equipment is characterized by laws and requirements on the materials and the surface function, but also by a conservative way of thinking about materials and colours. The purpose of this paper is to link the technical and customer requirements of current materials and surface textures in medical environments. By focusing on parts of the theory of Kansei Engineering, improvements of the company's products are possible. The idea is to find correlations between the desired experience or "feeling" for a product (customer requirements, functional requirements) and product geometrical properties (design parameters), to be implemented in new, improved products. To find new materials with the same (or better) technical requirements but a higher level of user stimulation, the current material (stainless steel) and its surface (brushed textures) were used as a reference. Focus groups of experts at the manufacturer led to a selection of twelve possible new materials for investigation in the project. In collaboration with the topical company for this project, three new materials that fulfil the requirements (easy to clean and anti-bacterial) came into focus for further investigation with regard to a new design of a washer-disinfector for medical equipment using the Kansei-based Cleanability Approach (CAA).

  8. FOREWORD: Computational methodologies for designing materials Computational methodologies for designing materials

    NASA Astrophysics Data System (ADS)

    Rahman, Talat S.

    2009-02-01

    It would be fair to say that in the past few decades, theory and computer modeling have played a major role in elucidating the microscopic factors that dictate the properties of functional novel materials. Together with advances in experimental techniques, theoretical methods are becoming increasingly capable of predicting properties of materials at different length scales, thereby bringing within sight the long-sought goal of designing material properties according to need. Advances in computer technology and their availability at a reasonable cost around the world have made it all the more urgent to disseminate what is now known about these modern computational techniques. In this special issue on computational methodologies for materials by design we have tried to solicit articles from authors whose works collectively represent the microcosm of developments in the area. This turned out to be a difficult task for a variety of reasons, not the least of which is space limitation in this special issue. Nevertheless, we gathered twenty articles that represent some of the important directions in which theory and modeling are proceeding in the general effort to capture the ability to produce materials by design. The majority of papers presented here focus on technique developments that are expected to uncover further the fundamental processes responsible for material properties, and for their growth modes and morphological evolutions. As for material properties, some of the articles here address the challenges that continue to emerge from attempts at accurate descriptions of magnetic properties, of electronically excited states, and of sparse matter, all of which demand new looks at density functional theory (DFT). I should hasten to add that much of the success in accurate computational modeling of materials emanates from the remarkable predictive power of DFT, without which we would not be able to place the subject on firm theoretical grounds. 
As we know and will also

  9. A Methodology for the Neutronics Design of Space Nuclear Reactors

    SciTech Connect

    King, Jeffrey C.; El-Genk, Mohamed S.

    2004-02-04

    A methodology for the neutronics design of space power reactors is presented. This methodology involves balancing the competing requirements of having sufficient excess reactivity for the desired lifetime, keeping the reactor subcritical at launch and during submersion accidents, and providing sufficient control over the lifetime of the reactor. These requirements are addressed by three reactivity values for a given reactor design: the excess reactivity at beginning of mission, the negative reactivity at shutdown, and the negative reactivity margin in submersion accidents. These reactivity values define the control worth and the safety worth in submersion accidents, used for evaluating the merit of a proposed reactor type and design. The Heat Pipe-Segmented Thermoelectric Module Converters space reactor core design is evaluated and modified based on the proposed methodology. The final reactor core design has sufficient excess reactivity for 10 years of nominal operation at 1.82 MW of fission power and is subcritical at launch and in all water submersion accidents.
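    The three reactivity values and the derived control and safety worths described above reduce to simple bookkeeping. The dollar values below are invented for illustration, not those of the Heat Pipe-Segmented Thermoelectric Module Converters design; the sign convention is positive = supercritical.

```python
# Invented example reactivity values, in dollars ($).
rho_excess_bom = 2.5    # excess reactivity at beginning of mission
rho_shutdown = -4.0     # reactivity with all control elements inserted
rho_submersion = -1.0   # worst-case water-submersion configuration

# Control worth must span from the full excess down to safe shutdown.
control_worth = rho_excess_bom - rho_shutdown    # $ of control required
# Safety worth in submersion: margin below critical in the worst accident.
submersion_margin = -rho_submersion

# Launch and submersion safety both require staying subcritical.
assert rho_shutdown < 0 and rho_submersion < 0, "must remain subcritical"
print(control_worth, submersion_margin)
```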

  10. Solid lubrication design methodology, phase 2

    NASA Technical Reports Server (NTRS)

    Pallini, R. A.; Wedeven, L. D.; Ragen, M. A.; Aggarwal, B. B.

    1986-01-01

    The high temperature performance of solid lubricated rolling elements was evaluated with a specially designed traction (friction) test apparatus. Graphite lubricants containing three additives (silver, phosphate glass, and zinc orthophosphate) were evaluated from room temperature to 540 C. Two hard coats were also evaluated. The evaluation of these lubricants, using a burnishing method of application, shows a reasonable transfer of lubricant and wear protection for short duration testing except in the 200 C temperature range. The graphite lubricants containing silver and zinc orthophosphate additives were more effective than the phosphate glass material over the test conditions examined. Traction coefficients ranged from a low of 0.07 to a high of 0.6. By curve fitting the traction data, empirical equations for slope and maximum traction coefficient as a function of contact pressure (P), rolling speed (U), and temperature (T) can be developed for each lubricant. A solid lubricant traction model was incorporated into an advanced bearing analysis code (SHABERTH). For comparison purposes, preliminary heat generation calculations were made for both oil and solid lubricated bearing operation. A preliminary analysis indicated a significantly higher heat generation for a solid lubricated ball bearing in a deep groove configuration. An analysis of a cylindrical roller bearing configuration showed a potential for a low friction solid lubricated bearing.
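    Deriving empirical traction equations by curve fitting, as described above, amounts to a least-squares fit of a model mu(P, U, T). The sketch below fits a simple linear form to synthetic data generated from known coefficients; the functional form, units, and coefficient values are illustrative assumptions, not the measured data from the report.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
P = rng.uniform(0.5, 2.0, n)     # contact pressure (illustrative units, GPa)
U = rng.uniform(1.0, 10.0, n)    # rolling speed, m/s
T = rng.uniform(25.0, 540.0, n)  # temperature, C (matches the tested range)

# Synthetic "measurements" from an assumed linear model plus noise.
true_c = np.array([0.30, 0.05, -0.01, -2.0e-4])
mu = true_c[0] + true_c[1] * P + true_c[2] * U + true_c[3] * T
mu += 0.005 * rng.standard_normal(n)

# Least-squares fit of mu = c0 + c1*P + c2*U + c3*T.
A = np.column_stack([np.ones(n), P, U, T])
c_fit, *_ = np.linalg.lstsq(A, mu, rcond=None)
print(np.round(c_fit, 4))
```

    A fitted equation like this is what a bearing code such as SHABERTH can evaluate in place of tabulated traction data.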

  11. Optimization Methodology for Unconventional Rocket Nozzle Design

    NASA Technical Reports Server (NTRS)

    Follett, W.

    1996-01-01

Several current rocket engine concepts, such as the bell-annular tripropellant engine and the linear aerospike being proposed for the X-33, require unconventional three-dimensional rocket nozzles which must conform to rectangular or sector-shaped envelopes to meet integration constraints. These types of nozzles exist outside the current experience database; therefore, development of efficient design methods for these propulsion concepts is critical to the success of launch vehicle programs. Several approaches for optimizing rocket nozzles, including streamline tracing techniques and the coupling of CFD analysis to optimization algorithms, are described. The relative strengths and weaknesses of four classes of optimization algorithms are discussed: gradient-based methods, genetic algorithms, simplex methods, and response surface methods. Additionally, a streamline tracing technique, which provides a very computationally efficient means of defining a three-dimensional contour, is discussed. The performance of the various optimization methods on thrust optimization problems for tripropellant and aerospike concepts is assessed and recommendations are made for future development efforts.
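Two of the optimizer classes discussed (gradient-based and simplex) can be contrasted on a toy problem. The two-variable "thrust surrogate" objective below is invented for illustration and is not the paper's nozzle model; it simply has a smooth optimum that both methods should locate.

```python
# Toy comparison of a gradient-based optimizer (BFGS) and a simplex optimizer
# (Nelder-Mead) on an invented smooth thrust-surrogate objective.
import numpy as np
from scipy.optimize import minimize

def neg_thrust(x):
    # Invented 2-variable surrogate: expansion ratio eps and a contour parameter c.
    eps, c = x
    return -(np.log(eps) - 0.05 * eps - (c - 0.7)**2)

x0 = np.array([5.0, 0.2])
grad = minimize(neg_thrust, x0, method="BFGS")
simplex = minimize(neg_thrust, x0, method="Nelder-Mead")
print(np.round(grad.x, 3), np.round(simplex.x, 3))   # both converge near [20, 0.7]
```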

  12. Fast detection of manufacturing systematic design pattern failures causing device yield loss

    NASA Astrophysics Data System (ADS)

    Le Denmat, Jean-Christophe; Feldman, Nelly; Riewer, Olivia; Yesilada, Emek; Vallet, Michel; Suzor, Christophe; Talluto, Salvatore

    2015-03-01

Starting from the 45nm technology node, systematic defectivity has had a significant impact on device yield loss at each new technology node. The effort required to achieve patterning maturity with zero yield detractors is also increasing significantly with each node. Within the manufacturing environment, new in-line wafer inspection methods have been developed to identify systematic device defects, including the process window qualification (PWQ) methodology used to characterize process robustness. Although patterning is characterized with the PWQ methodology, some questions remain: How can we demonstrate that the measured process window is large enough to avoid design-based defects which will impact the device yield? Can we monitor the systematic yield loss on nominal wafers? From the device test engineering point of view, systematic yield detractors are expected to be identified by Automated Test Pattern Generator (ATPG) test results diagnostics performed after electrical wafer sort (EWS). Test diagnostics can identify failed nets or cells causing systematic yield loss [1],[2]. Convergence from device failed nets and cells to failed manufacturing design patterns is usually based on assumptions that should be confirmed by electrical failure analysis (EFA). However, many EFA investigations are required before the design pattern failures are found, so design pattern failure identification has been costly in time and resources. In this situation, an opportunity exists to share knowledge between the device test engineering and manufacturing environments to help with device yield improvement. This paper presents a new yield diagnostics flow dedicated to correlating critical design patterns detected within the manufacturing environment with the observed device yield loss. The results obtained with this new flow on a 28nm technology device are described, with the defects of interest and the device yield impact for each design pattern. The EFA done to validate the design
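The grouping idea behind this kind of flow (and the EDA-tool filtering described in the section header) can be sketched simply: reduce a large hot-spot list to a small set of recurring design-pattern groups by hashing a canonical local-geometry signature. The signature definition below is invented for illustration; production tools use far richer pattern matching.

```python
# Minimal sketch of systematic-defect pattern grouping: hot spots whose local
# layout clips are translation-equivalent fall into the same group.
from collections import defaultdict

def signature(clip):
    """Canonical signature of a layout clip (list of (x, y, w, h) rectangles):
    rectangles sorted and translated so the clip's lower-left is at the origin."""
    x0 = min(r[0] for r in clip)
    y0 = min(r[1] for r in clip)
    return tuple(sorted((x - x0, y - y0, w, h) for x, y, w, h in clip))

def group_hot_spots(hot_spots):
    groups = defaultdict(list)
    for location, clip in hot_spots:
        groups[signature(clip)].append(location)
    return groups

# Three hot spots; two are the same pattern at different die locations.
hs = [((10, 10), [(10, 10, 2, 5), (13, 10, 2, 5)]),
      ((90, 40), [(90, 40, 2, 5), (93, 40, 2, 5)]),
      ((50, 50), [(50, 50, 8, 1)])]
groups = group_hot_spots(hs)
print(len(groups))  # 2
```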

  13. A methodological survey of the analysis, reporting and interpretation of Absolute Risk ReductiOn in systematic revieWs (ARROW): a study protocol

    PubMed Central

    2013-01-01

    Background Clinicians, providers and guideline panels use absolute effects to weigh the advantages and downsides of treatment alternatives. Relative measures have the potential to mislead readers. However, little is known about the reporting of absolute measures in systematic reviews. The objectives of our study are to determine the proportion of systematic reviews that report absolute measures of effect for the most important outcomes, and ascertain how they are analyzed, reported and interpreted. Methods/design We will conduct a methodological survey of systematic reviews published in 2010. We will conduct a 1:1 stratified random sampling of Cochrane vs. non-Cochrane systematic reviews. We will calculate the proportion of systematic reviews reporting at least one absolute estimate of effect for the most patient-important outcome for the comparison of interest. We will conduct multivariable logistic regression analyses with the reporting of an absolute estimate of effect as the dependent variable and pre-specified study characteristics as the independent variables. For systematic reviews reporting an absolute estimate of effect, we will document the methods used for the analysis, reporting and interpretation of the absolute estimate. Discussion Our methodological survey will inform current practices regarding reporting of absolute estimates in systematic reviews. Our findings may influence recommendations on reporting, conduct and interpretation of absolute estimates. Our results are likely to be of interest to systematic review authors, funding agencies, clinicians, guideline developers and journal editors. PMID:24330779
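The distinction the protocol turns on, absolute versus relative effect measures, is easy to make concrete. The event rates below are hypothetical, chosen only to show how the same relative reduction can correspond to very different absolute effects.

```python
# Sketch: absolute vs. relative effect measures from two event rates.
# The rates are hypothetical, purely for illustration.

def effect_measures(risk_control, risk_treatment):
    """Return (ARR, RRR, NNT) for a beneficial treatment."""
    arr = risk_control - risk_treatment   # absolute risk reduction
    rrr = arr / risk_control              # relative risk reduction
    nnt = 1.0 / arr                       # number needed to treat
    return arr, rrr, nnt

# A 50% relative risk reduction can mean very different absolute effects:
high = effect_measures(0.40, 0.20)    # common outcome: ARR 0.20, NNT 5
low = effect_measures(0.004, 0.002)   # rare outcome: ARR 0.002, NNT 500
print([round(v, 4) for v in high])
print([round(v, 4) for v in low])
```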

  14. PEM Fuel Cells Redesign Using Biomimetic and TRIZ Design Methodologies

    NASA Astrophysics Data System (ADS)

    Fung, Keith Kin Kei

Two formal design methodologies, biomimetic design and the Theory of Inventive Problem Solving (TRIZ), were applied to the redesign of a Proton Exchange Membrane (PEM) fuel cell. Proof-of-concept prototyping was performed on two of the concepts for water management. The liquid water collection concept, with strategically placed wicks, demonstrated potential benefits for a fuel cell. Conversely, the periodic flow direction reversal concept might cause a reduction in water removal from a fuel cell. The causes of this water removal reduction remain unclear. In addition, three of the concepts generated with biomimetic design were further studied and shown to stimulate more creative ideas in the thermal and water management of fuel cells. The biomimetic design and TRIZ methodologies were successfully applied to fuel cells and provided different perspectives on the redesign of fuel cells. The methodologies should continue to be used to improve fuel cells.

  15. SYSTEMATIC PROCEDURE FOR DESIGNING PROCESSES WITH MULTIPLE ENVIRONMENTAL OBJECTIVES

    EPA Science Inventory

    Evaluation of multiple objectives is very important in designing environmentally benign processes. It requires a systematic procedure for solving multiobjective decision-making problems, due to the complex nature of the problems, the need for complex assessments, and complicated ...

  16. Methodological Innovation in Practice-Based Design Doctorates

    ERIC Educational Resources Information Center

    Yee, Joyce S. R.

    2010-01-01

This article presents a selective review of recent design PhDs that identifies and analyses the methodological innovation occurring in the field, in order to inform future provision of research training. Six recently completed design PhDs are used to highlight possible philosophical and practical models that can be adopted by future PhD…

  17. Helicopter-V/STOL dynamic wind and turbulence design methodology

    NASA Technical Reports Server (NTRS)

    Bailey, J. Earl

    1987-01-01

Aircraft and helicopter accidents due to severe dynamic wind and turbulence continue to present challenging design problems. The development of the current set of design analysis tools for aircraft wind and turbulence design began in the 1940s and 1950s. The areas of helicopter dynamic wind and turbulence modeling and vehicle response to severe dynamic wind inputs (microburst-type phenomena) during takeoff and landing remain major unsolved design problems, owing to a lack of both environmental data and computational methodology. The development of helicopter and V/STOL dynamic wind and turbulence response computation methodology is reviewed, the current state of the design art in industry is outlined, and comments on design methodology are made which may serve to improve future flight vehicle design.

  18. Application of an Integrated Methodology for Propulsion and Airframe Control Design to a STOVL Aircraft

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay; Mattern, Duane

    1994-01-01

An advanced methodology for integrated flight/propulsion control (IFPC) design for future aircraft, which will use propulsion system generated forces and moments for enhanced maneuver capabilities, is briefly described. This methodology has the potential to address in a systematic manner the coupling between the airframe and the propulsion subsystems typical of such enhanced maneuverability aircraft. Application of the methodology to a short take-off vertical landing (STOVL) aircraft in the landing approach to hover transition flight phase is presented with a brief description of the various steps in the IFPC design methodology. The details of the individual steps have been described in previous publications, and the objective of this paper is to focus on how the components of the control system designed at each step integrate into the overall IFPC system. The full nonlinear IFPC system was evaluated extensively in nonreal-time simulations as well as piloted simulations. Results from the nonreal-time evaluations are presented in this paper. Lessons learned from this application study are summarized in terms of areas of potential improvements in the STOVL IFPC design as well as identification of technology development areas to enhance the applicability of the proposed design methodology.

  19. Methodologies for measuring travelers' risk perception of infectious diseases: A systematic review.

    PubMed

    Sridhar, Shruti; Régner, Isabelle; Brouqui, Philippe; Gautret, Philippe

    2016-01-01

Numerous studies in the past have stressed the importance of travelers' psychology and perception in the implementation of preventive measures. The aim of this systematic review was to identify the methodologies used in studies reporting on travelers' risk perception of infectious diseases. A systematic search for relevant literature was conducted according to Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. There were 39 studies identified. In 35 of 39 studies, the methodology used was that of a knowledge, attitude and practice (KAP) survey based on questionnaires. One study used a combination of questionnaires and a visual psychometric measuring instrument called the "pictorial representation of illness and self-measurement" or PRISM. One study used a self-representation model (SRM) method. Two studies measured psychosocial factors. Valuable information was obtained from KAP surveys showing an overall lack of knowledge among travelers about the most frequent travel-associated infections and associated preventive measures. This methodological approach, however, is mainly descriptive, addressing knowledge, attitudes, and practices separately and lacking an examination of the interrelationships between these three components. Another limitation of the KAP method is that it underestimates psychosocial variables that have proved influential in health-related behaviors, including perceived benefits and costs of preventive measures, perceived social pressure, perceived personal control, unrealistic optimism and risk propensity. Future risk perception studies in travel medicine should consider psychosocial variables with inferential and multivariate statistical analyses. The use of implicit measurements of attitudes could also provide new insights in the field of travelers' risk perception of travel-associated infectious diseases. PMID:27238906

  20. A methodology for designing aircraft to low sonic boom constraints

    NASA Technical Reports Server (NTRS)

    Mack, Robert J.; Needleman, Kathy E.

    1991-01-01

    A method for designing conceptual supersonic cruise aircraft to meet low sonic boom requirements is outlined and described. The aircraft design is guided through a systematic evolution from initial three view drawing to a final numerical model description, while the designer using the method controls the integration of low sonic boom, high supersonic aerodynamic efficiency, adequate low speed handling, and reasonable structure and materials technologies. Some experience in preliminary aircraft design and in the use of various analytical and numerical codes is required for integrating the volume and lift requirements throughout the design process.

  1. External validation of multivariable prediction models: a systematic review of methodological conduct and reporting

    PubMed Central

    2014-01-01

Background Before considering whether to use a multivariable (diagnostic or prognostic) prediction model, it is essential that its performance be evaluated in data that were not used to develop the model (referred to as external validation). We critically appraised the methodological conduct and reporting of external validation studies of multivariable prediction models. Methods We conducted a systematic review of articles describing some form of external validation of one or more multivariable prediction models indexed in PubMed core clinical journals published in 2010. Study data were extracted in duplicate on design, sample size, handling of missing data, reference to the original study developing the prediction models and predictive performance measures. Results 11,826 articles were identified and 78 were included for full review, which described the evaluation of 120 prediction models in participant data that were not used to develop the model. Thirty-three articles described both the development of a prediction model and an evaluation of its performance on a separate dataset, and 45 articles described only the evaluation of an existing published prediction model on another dataset. Fifty-seven percent of the prediction models were presented and evaluated as simplified scoring systems. Sixteen percent of articles failed to report the number of outcome events in the validation datasets. Fifty-four percent of studies made no explicit mention of missing data. Sixty-seven percent did not report evaluating model calibration whilst most studies evaluated model discrimination. It was often unclear whether the reported performance measures were for the full regression model or for the simplified models. Conclusions The vast majority of studies describing some form of external validation of a multivariable prediction model were poorly reported with key details frequently not presented. The validation studies were characterised by poor design, inappropriate handling
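The two performance domains the review focuses on, discrimination and calibration, are simple to compute. The sketch below shows a pairwise-comparison c-statistic (AUC) and calibration-in-the-large; the outcomes and predicted risks are invented for illustration.

```python
# Sketch of the two external-validation measures discussed above. Data are invented.

def c_statistic(y, p):
    """Probability that a random event receives a higher predicted risk
    than a random non-event (ties count 1/2); equals the ROC AUC."""
    events = [pi for yi, pi in zip(y, p) if yi == 1]
    nonevents = [pi for yi, pi in zip(y, p) if yi == 0]
    wins = sum(1.0 if e > ne else 0.5 if e == ne else 0.0
               for e in events for ne in nonevents)
    return wins / (len(events) * len(nonevents))

def calibration_in_the_large(y, p):
    """Observed event rate minus mean predicted risk (0 = well calibrated)."""
    return sum(y) / len(y) - sum(p) / len(p)

y = [1, 0, 1, 0, 0, 1, 0, 0]
p = [0.9, 0.2, 0.35, 0.4, 0.1, 0.6, 0.3, 0.2]
print(round(c_statistic(y, p), 3))             # 0.933
print(round(calibration_in_the_large(y, p), 3))
```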

  2. Enhancing Instructional Design Efficiency: Methodologies Employed by Instructional Designers

    ERIC Educational Resources Information Center

    Roytek, Margaret A.

    2010-01-01

    Instructional systems design (ISD) has been frequently criticised as taking too long to implement, calling for a reduction in cycle time--the time that elapses between project initiation and delivery. While instructional design research has historically focused on increasing "learner" efficiencies, the study of what instructional designers do to…

  3. A design methodology for nonlinear systems containing parameter uncertainty: Application to nonlinear controller design

    NASA Technical Reports Server (NTRS)

    Young, G.

    1982-01-01

    A design methodology capable of dealing with nonlinear systems, such as a controlled ecological life support system (CELSS), containing parameter uncertainty is discussed. The methodology was applied to the design of discrete time nonlinear controllers. The nonlinear controllers can be used to control either linear or nonlinear systems. Several controller strategies are presented to illustrate the design procedure.

  4. Variance estimation for systematic designs in spatial surveys.

    PubMed

    Fewster, R M

    2011-12-01

    In spatial surveys for estimating the density of objects in a survey region, systematic designs will generally yield lower variance than random designs. However, estimating the systematic variance is well known to be a difficult problem. Existing methods tend to overestimate the variance, so although the variance is genuinely reduced, it is over-reported, and the gain from the more efficient design is lost. The current approaches to estimating a systematic variance for spatial surveys are to approximate the systematic design by a random design, or approximate it by a stratified design. Previous work has shown that approximation by a random design can perform very poorly, while approximation by a stratified design is an improvement but can still be severely biased in some situations. We develop a new estimator based on modeling the encounter process over space. The new "striplet" estimator has negligible bias and excellent precision in a wide range of simulation scenarios, including strip-sampling, distance-sampling, and quadrat-sampling surveys, and including populations that are highly trended or have strong aggregation of objects. We apply the new estimator to survey data for the spotted hyena (Crocuta crocuta) in the Serengeti National Park, Tanzania, and find that the reported coefficient of variation for estimated density is 20% using approximation by a random design, 17% using approximation by a stratified design, and 11% using the new striplet estimator. This large reduction in reported variance is verified by simulation. PMID:21534940
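The core claim above, that systematic designs genuinely reduce variance on trended populations, is easy to verify by simulation. The population, sample sizes, and trend below are invented; this is a sketch of the phenomenon, not of the striplet estimator itself.

```python
# Simulation sketch: systematic vs. random plot sampling of a trended population.
import numpy as np

rng = np.random.default_rng(1)
density = np.linspace(1.0, 10.0, 1000)   # strongly trended population over 1000 plots
n = 100                                  # plots sampled per survey

def survey_means(sampler, reps=2000):
    return np.array([sampler().mean() for _ in range(reps)])

# Simple random sampling without replacement.
random_means = survey_means(lambda: density[rng.choice(1000, n, replace=False)])
# Systematic sampling: every 10th plot from a random start.
systematic_means = survey_means(lambda: density[rng.integers(10)::10])

print(random_means.var(), systematic_means.var())   # systematic is far smaller
```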

  5. A Practical Methodology for Quantifying Random and Systematic Components of Unexplained Variance in a Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Deloach, Richard; Obara, Clifford J.; Goodman, Wesley L.

    2012-01-01

This paper documents a check standard wind tunnel test conducted in the Langley 0.3-Meter Transonic Cryogenic Tunnel (0.3M TCT) that was designed and analyzed using the Modern Design of Experiments (MDOE). The test was designed to partition the unexplained variance of typical wind tunnel data samples into two constituent components, one attributable to ordinary random error, and one attributable to systematic error induced by covariate effects. Covariate effects in wind tunnel testing are discussed, with examples. The impact of systematic (non-random) unexplained variance on the statistical independence of sequential measurements is reviewed. The corresponding correlation among experimental errors is discussed, as is the impact of such correlation on experimental results generally. The specific experiment documented herein was organized as a formal test for the presence of systematic unexplained variance in representative samples of wind tunnel data, in order to quantify the frequency with which such systematic error was detected, and its magnitude relative to ordinary random error. Levels of systematic and random error reported here are representative of those quantified in other facilities, as cited in the references.
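The partition described above can be sketched with a one-way variance decomposition: replicates taken close together in time estimate ordinary random error, while drift between blocks of replicates estimates the systematic, covariate-induced component. The data-generating model and its parameters below are synthetic, not the tunnel's.

```python
# Sketch of partitioning unexplained variance into within-block (random) and
# between-block (systematic) components. Synthetic data, invented parameters.
import numpy as np

rng = np.random.default_rng(2)
n_blocks, n_reps = 20, 10
block_shift = rng.normal(0.0, 0.5, n_blocks)              # slow covariate drift
data = block_shift[:, None] + rng.normal(0.0, 0.2, (n_blocks, n_reps))

within_var = data.var(axis=1, ddof=1).mean()              # ordinary random error
# Between-block component: variance of block means, less the replication noise.
between_var = data.mean(axis=1).var(ddof=1) - within_var / n_reps
print(round(float(within_var), 3), round(float(between_var), 3))
```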

  6. Prognostics and health management design for rotary machinery systems—Reviews, methodology and applications

    NASA Astrophysics Data System (ADS)

    Lee, Jay; Wu, Fangji; Zhao, Wenyu; Ghaffari, Masoud; Liao, Linxia; Siegel, David

    2014-01-01

    Much research has been conducted in prognostics and health management (PHM), an emerging field in mechanical engineering that is gaining interest from both academia and industry. Most of these efforts have been in the area of machinery PHM, resulting in the development of many algorithms for this particular application. The majority of these algorithms concentrate on applications involving common rotary machinery components, such as bearings and gears. Knowledge of this prior work is a necessity for any future research efforts to be conducted; however, there has not been a comprehensive overview that details previous and on-going efforts in PHM. In addition, a systematic method for developing and deploying a PHM system has yet to be established. Such a method would enable rapid customization and integration of PHM systems for diverse applications. To address these gaps, this paper provides a comprehensive review of the PHM field, followed by an introduction of a systematic PHM design methodology, 5S methodology, for converting data to prognostics information. This methodology includes procedures for identifying critical components, as well as tools for selecting the most appropriate algorithms for specific applications. Visualization tools are presented for displaying prognostics information in an appropriate fashion for quick and accurate decision making. Industrial case studies are included in this paper to show how this methodology can help in the design of an effective PHM system.

  7. Methodology for a stormwater sensitive urban watershed design

    NASA Astrophysics Data System (ADS)

    Romnée, Ambroise; Evrard, Arnaud; Trachte, Sophie

    2015-11-01

In urban stormwater management, decentralized systems, including stormwater best management practices (BMPs), are nowadays being tested worldwide. However, a watershed-scale approach, relevant for urban hydrology, is almost always neglected when designing a stormwater management plan with best management practices. As a consequence, urban designers fail to convince public authorities of the actual hydrologic effectiveness of such an approach to urban watershed stormwater management. In this paper, we develop a design-oriented methodology for studying the morphology of an urban watershed in terms of sustainable stormwater management. The methodology is a five-step method, firstly based on the cartographic analysis of many stormwater-relevant indicators regarding the landscape, the urban fabric and the governance. The second step focuses on the identification of many territorial stakes and their corresponding strategies for decentralized stormwater management. Based on the indicators, the stakes and the strategies, the third step defines many spatial typologies regarding the roadway system and the urban fabric system. The fourth step determines many stormwater management scenarios to be applied to both spatial typology systems. The fifth step is the design of decentralized stormwater management projects integrating BMPs into each spatial typology. The methodology aims to advise urban designers and engineering offices in the right location and selection of BMPs without giving them a hypothetical unique solution. Since every location and every watershed is different due to local guidelines and stakeholders, this paper provides a methodology for a stormwater-sensitive urban watershed design that could be reproduced everywhere. As an example, the methodology is applied as a case study to an urban watershed in Belgium, confirming that the method is applicable to any urban watershed. This paper should be helpful for engineering and design offices in urban hydrology to define a

  8. Implementation of Probabilistic Design Methodology at Tennessee State University

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chinyere

    1996-01-01

Engineering design is one of the most important areas in engineering education. Deterministic Design Methodology (DDM) is the only design method that is taught in most engineering schools. This method does not give a direct account of uncertainties in design parameters. Hence, it is impossible to quantify the uncertainties in the response, and the actual safety margin remains unknown. The desire for a design methodology that can identify the primitive (random) variables that affect the structural behavior has led to a growing interest in Probabilistic Design Methodology (PDM). This method is gaining more recognition in industry than in educational institutions. Some of the reasons for the limited use of PDM at the moment are that many are unaware of its potential, and most of the software developed for PDM is very recent. The central goal of the PDM project at Tennessee State University is to introduce engineering students to the method. The students participating in the project learn about PDM and the computer codes that are available to the design engineer. The software being used for this project is NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), developed under the NASA probabilistic structural analysis program. NESSUS has three different modules, which make it a very comprehensive computer code for PDM. Research in technology transfer through course offerings in PDM is in effect at Tennessee State University. The aim is to familiarize students with the problem of uncertainties in engineering design. Included in the paper are some projects on PDM carried out by students and faculty. The areas in which this method is being applied at the moment include design of gears (spur and worm); design of shafts; design of statically indeterminate frame structures; design of helical springs; and design of shock absorbers. Some of the current results of these projects are presented.

  9. Developing an evidence-based methodological framework to systematically compare HTA coverage decisions: A mixed methods study.

    PubMed

    Nicod, Elena; Kanavos, Panos

    2016-01-01

Health Technology Assessment (HTA) often results in different coverage recommendations across countries for the same medicine despite similar methodological approaches. This paper develops and pilots a methodological framework that systematically identifies the reasons for these differences using an exploratory sequential mixed methods research design. The study countries were England, Scotland, Sweden and France. The methodological framework was built around three stages of the HTA process: (a) evidence, (b) its interpretation, and (c) its influence on the final recommendation; and was applied to two orphan medicinal products. The criteria accounted for at each stage were qualitatively analyzed through thematic analysis. Piloting the framework for two medicines, eight trials, 43 clinical endpoints and seven economic models were coded 155 times. Eighteen different uncertainties about this evidence were coded 28 times, 56% of which pertained to evidence commonly appraised and 44% to evidence considered by only some agencies. The poor agreement in interpreting this evidence (κ=0.183) was partly explained by stakeholder input (ns=48 times), or by agency-specific risk (nu=28 uncertainties) and value preferences (noc=62 "other considerations"), derived through correspondence analysis. Accounting for variability at each stage of the process can be achieved by codifying its existence and quantifying its impact through the application of this framework. The transferability of this framework to other disease areas, medicines and countries is ensured by its iterative and flexible nature, and detailed description. PMID:26723201
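The agreement statistic quoted above (κ) is Cohen's kappa, which corrects observed agreement for agreement expected by chance. The sketch below computes it for two raters; the coverage judgements are invented for illustration.

```python
# Cohen's kappa for two raters over categorical judgements. Ratings invented.
from collections import Counter

def cohens_kappa(r1, r2):
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n       # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    pe = sum(c1[k] * c2[k] for k in c1) / n**2         # chance agreement
    return (po - pe) / (1 - pe)

agency_a = ["accept", "reject", "accept", "accept", "reject", "accept"]
agency_b = ["accept", "accept", "reject", "accept", "reject", "accept"]
print(round(cohens_kappa(agency_a, agency_b), 3))   # 0.25
```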

  10. Single-Case Experimental Designs: A Systematic Review of Published Research and Current Standards

    PubMed Central

    Smith, Justin D.

    2013-01-01

    This article systematically reviews the research design and methodological characteristics of single-case experimental design (SCED) research published in peer-reviewed journals between 2000 and 2010. SCEDs provide researchers with a flexible and viable alternative to group designs with large sample sizes. However, methodological challenges have precluded widespread implementation and acceptance of the SCED as a viable complementary methodology to the predominant group design. This article includes a description of the research design, measurement, and analysis domains distinctive to the SCED; a discussion of the results within the framework of contemporary standards and guidelines in the field; and a presentation of updated benchmarks for key characteristics (e.g., baseline sampling, method of analysis), and overall, it provides researchers and reviewers with a resource for conducting and evaluating SCED research. The results of the systematic review of 409 studies suggest that recently published SCED research is largely in accordance with contemporary criteria for experimental quality. Analytic method emerged as an area of discord. Comparison of the findings of this review with historical estimates of the use of statistical analysis indicates an upward trend, but visual analysis remains the most common analytic method and also garners the most support amongst those entities providing SCED standards. Although consensus exists along key dimensions of single-case research design and researchers appear to be practicing within these parameters, there remains a need for further evaluation of assessment and sampling techniques and data analytic methods. PMID:22845874

  11. The methodological quality of systematic reviews comparing intravitreal bevacizumab and alternates for neovascular age related macular degeneration: A systematic review of reviews

    PubMed Central

    George, Pradeep Paul; DeCastro Molina, Joseph Antonio; Heng, Bee Hoon

    2014-01-01

Objective: To systematically collate and evaluate the evidence from recent SRs of bevacizumab for neo-vascular age related macular degeneration. Materials and Methods: Literature searches were carried out in Medline, Embase, Cochrane databases for all systematic reviews (SRs) on the effectiveness of bevacizumab for neo-vascular age related macular degeneration, published between 2000 and 2013. Titles and abstracts were assessed against the inclusion/exclusion criteria using Joanna Briggs Institute (JBI) study eligibility form. Data was extracted using the JBI data extraction form. The quality of the SRs was assessed using JBI critical appraisal checklist for SRs. Decisions on study eligibility and quality were made by two reviewers; any disagreements were resolved by discussion. Results: Nine relevant reviews were identified from 30 citations, of which 5 reviews fulfilled the review's inclusion criteria. All 5 reviews showed bevacizumab to be effective for neovascular AMD in the short-term when used alone or in combination with PDT or Pegaptanib. The average quality score of the reviews was 7; 95% confidence interval 6.2 to 7.8 (maximum possible quality score is 10). The selection and publication bias were not addressed in all included reviews. Three-fifths of the reviews had a quality score of 7 or lower; these reviews had some methodological limitations: search strategies were identified in only 2 (40%) reviews, and independent study selection and quality assessment of included studies (4 (80%)) were infrequently performed. Conclusion: Overall, the reviews on the effectiveness of intravitreal/systemic bevacizumab for neovascular age-related macular degeneration (AMD) received good JBI quality scores (mean score = 7.0 points), with a few exceptions. The study also highlights the suboptimal reporting of SRs on this topic. Reviews with poor methodology may limit the validity of the reported results; hence efforts should be made to improve the design, reporting and

  12. Extensibility of a linear rapid robust design methodology

    NASA Astrophysics Data System (ADS)

    Steinfeldt, Bradley A.; Braun, Robert D.

    2016-05-01

The extensibility of a linear rapid robust design methodology is examined. This analysis is approached from a computational cost and accuracy perspective. The sensitivity of the solution's computational cost is examined by analysing effects such as the number of design variables, nonlinearity of the contributing analyses, and nonlinearity of the response, in addition to several potential complexity metrics. Relative to traditional robust design methods, the linear rapid robust design methodology scaled better with the size of the problem and had performance that exceeded the traditional techniques examined. The accuracy of applying a method with linear fundamentals to nonlinear problems was examined. It is observed that if the magnitude of the nonlinearity is less than 1000 times that of the nominal linear response, the error associated with applying successive linearization results in response errors of less than 10% relative to the full nonlinear response.
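The flavor of the accuracy claim above, that a small nonlinear term relative to the linear response yields a small linearization error, can be illustrated numerically. The response model, its scaling, and the error measure below are all invented; this is not the paper's methodology.

```python
# Illustration: relative error of a linearized model as the magnitude of an
# invented nonlinear term grows relative to the linear response.
import numpy as np

x = np.linspace(0.5, 1.5, 101)
errors = {}
for scale in [1e-4, 1e-2, 1.0]:
    linear = 3.0 * x                      # linearized response
    true = linear + scale * x**3          # invented nonlinear response
    errors[scale] = float(np.max(np.abs(true - linear) / np.abs(true)))
print(errors)   # error grows with the relative size of the nonlinear term
```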

  13. Viability, Advantages and Design Methodologies of M-Learning Delivery

    ERIC Educational Resources Information Center

    Zabel, Todd W.

    2010-01-01

    The purpose of this study was to examine the viability and principal design methodologies of Mobile Learning models in developing regions. Demographic and market studies were utilized to determine the viability of M-Learning delivery as well as best uses for such technologies and methods given socioeconomic and political conditions within the…

  14. A computer simulator for development of engineering system design methodologies

    NASA Technical Reports Server (NTRS)

    Padula, S. L.; Sobieszczanski-Sobieski, J.

    1987-01-01

    A computer program designed to simulate and improve engineering system design methodology is described. The simulator mimics the qualitative behavior and data couplings occurring among the subsystems of a complex engineering system. It eliminates the engineering analyses in the subsystems by replacing them with judiciously chosen analytical functions. With the cost of analysis eliminated, the simulator is used for experimentation with a large variety of candidate algorithms for multilevel design optimization to choose the best ones for the actual application. Thus, the simulator serves as a development tool for multilevel design optimization strategy. The simulator concept, implementation, and status are described and illustrated with examples.
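    The simulator idea, replacing each subsystem analysis with a cheap analytic function and iterating the data couplings to convergence, can be sketched as below; the two stand-in functions are invented for illustration:

```python
def subsystem_a(y_b):
    """Stand-in analytic function replacing an expensive subsystem analysis."""
    return 1.0 + 0.3 * y_b

def subsystem_b(y_a):
    """Second stand-in subsystem, coupled back to the first."""
    return 2.0 - 0.4 * y_a

def coupled_solve(tol=1e-10, max_iter=100):
    """Gauss-Seidel iteration over the coupled stand-in subsystems."""
    y_a, y_b = 0.0, 0.0
    for _ in range(max_iter):
        y_a_new = subsystem_a(y_b)
        y_b_new = subsystem_b(y_a_new)
        if abs(y_a_new - y_a) < tol and abs(y_b_new - y_b) < tol:
            return y_a_new, y_b_new
        y_a, y_b = y_a_new, y_b_new
    return y_a, y_b

y_a, y_b = coupled_solve()
```

    Because each "analysis" costs microseconds, many coordination strategies can be compared on the same coupled structure before committing to the real multilevel optimization.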

  15. Tornado missile simulation and design methodology. Volume 1: simulation methodology, design applications, and TORMIS computer code. Final report

    SciTech Connect

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and risk has been assessed for a hypothetical nuclear power plant design case study.
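    A minimal sketch of the Monte Carlo step, estimating an impact probability as the hit fraction over sampled missile trajectories, is shown below. The uniform transport model and target size are illustrative placeholders, not TORMIS models or data:

```python
import random

def simulate_impact(rng):
    """One hypothetical missile flight: hit iff the landing point falls in the
    target area. Transport model and target geometry are invented."""
    x = rng.uniform(0.0, 100.0)   # landing coordinates (m)
    y = rng.uniform(0.0, 100.0)
    return 40.0 <= x <= 60.0 and 40.0 <= y <= 60.0   # 20 m x 20 m target

def estimate_risk(n=100_000, seed=1):
    """Monte Carlo estimate of the impact probability."""
    rng = random.Random(seed)
    hits = sum(simulate_impact(rng) for _ in range(n))
    return hits / n

p_hat = estimate_risk()
```

    Real assessments chain many such sampled events (tornado occurrence, injection, transport, impact) and attach sampling-error estimates to the resulting probabilities.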

  16. Are we talking the same paradigm? Considering methodological choices in health education systematic review.

    PubMed

    Gordon, Morris

    2016-07-01

    For the past two decades, there have been calls for medical education to become more evidence-based. Whilst previous works have described how to use such methods, there are no works discussing when or why to select different methods from either a conceptual or pragmatic perspective. This question is not meant to suggest the superiority of such methods, but that having a clear rationale to underpin such choices is key and should be communicated to the reader of such works. Our goal within this manuscript is to consider the philosophical alignment of these different review and synthesis modalities and how this impacts on their suitability to answer different systematic review questions within health education. The key characteristic of a systematic review that should impact the synthesis choice is discussed in detail. By clearly defining this and the related outcome expected from the review and for educators who will receive this outcome, the alignment will become apparent. This will then allow deployment of an appropriate methodology that is fit for purpose and will indeed justify the significant work needed to complete a systematic review. Key items discussed are the positivist synthesis methods, meta-analysis and content analysis, to address questions in the form of 'whether and what' education is effective. These can be juxtaposed with the constructivist-aligned thematic analysis and meta-ethnography to address questions in the form of 'why'. The concept of the realist review is also considered. It is proposed that authors of such work should describe their research alignment and the link between question, alignment and evidence synthesis method selected. The process of exploring the range of modalities and their alignment highlights gaps in the researcher's arsenal. Future works are needed to explore the impact of such changes in writing from authors of medical education systematic review. PMID:27007488

  18. FOREWORD: Computational methodologies for designing materials Computational methodologies for designing materials

    NASA Astrophysics Data System (ADS)

    Rahman, Talat S.

    2009-02-01

    It would be fair to say that in the past few decades, theory and computer modeling have played a major role in elucidating the microscopic factors that dictate the properties of functional novel materials. Together with advances in experimental techniques, theoretical methods are becoming increasingly capable of predicting properties of materials at different length scales, thereby bringing within sight the long-sought goal of designing material properties according to need. Advances in computer technology and their availability at a reasonable cost around the world have made it all the more urgent to disseminate what is now known about these modern computational techniques. In this special issue on computational methodologies for materials by design we have tried to solicit articles from authors whose works collectively represent the microcosm of developments in the area. This turned out to be a difficult task for a variety of reasons, not the least of which is space limitation in this special issue. Nevertheless, we gathered twenty articles that represent some of the important directions in which theory and modeling are proceeding in the general effort to capture the ability to produce materials by design. The majority of papers presented here focus on technique developments that are expected to uncover further the fundamental processes responsible for material properties, and for their growth modes and morphological evolutions. As for material properties, some of the articles here address the challenges that continue to emerge from attempts at accurate descriptions of magnetic properties, of electronically excited states, and of sparse matter, all of which demand new looks at density functional theory (DFT). I should hasten to add that much of the success in accurate computational modeling of materials emanates from the remarkable predictive power of DFT, without which we would not be able to place the subject on firm theoretical grounds.
As we know and will also

  19. A robust optimization methodology for preliminary aircraft design

    NASA Astrophysics Data System (ADS)

    Prigent, S.; Maréchal, P.; Rondepierre, A.; Druot, T.; Belleville, M.

    2016-05-01

    This article focuses on a robust optimization of an aircraft preliminary design under operational constraints. According to engineers' know-how, the aircraft preliminary design problem can be modelled as an uncertain optimization problem whose objective (the cost or the fuel consumption) is almost affine, and whose constraints are convex. It is shown that this uncertain optimization problem can be approximated in a conservative manner by an uncertain linear optimization program, which enables the use of the techniques of robust linear programming of Ben-Tal, El Ghaoui, and Nemirovski [Robust Optimization, Princeton University Press, 2009]. This methodology is then applied to two real cases of aircraft design and numerical results are presented.
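    For the interval-uncertainty case treated by Ben-Tal, El Ghaoui, and Nemirovski, the robust counterpart of a linear constraint a·x <= b with each a_i in [a_bar_i - delta_i, a_bar_i + delta_i] tightens to sum(a_bar_i*x_i + delta_i*|x_i|) <= b. A sketch with invented numbers, not the aircraft model:

```python
import random

def robust_feasible(x, a_bar, delta, b):
    """Robust counterpart of a.x <= b for interval uncertainty on a."""
    return sum(ab * xi + d * abs(xi)
               for ab, xi, d in zip(a_bar, x, delta)) <= b

def sampled_ok(x, a_bar, delta, b, n=1000, seed=0):
    """Spot-check: every sampled realisation of a satisfies the constraint."""
    rng = random.Random(seed)
    for _ in range(n):
        a = [ab + rng.uniform(-d, d) for ab, d in zip(a_bar, delta)]
        if sum(ai * xi for ai, xi in zip(a, x)) > b:
            return False
    return True

# Hypothetical design point and uncertain constraint data.
a_bar, delta, b = [1.0, 2.0], [0.1, 0.2], 10.0
x = [2.0, 3.0]
ok_robust = robust_feasible(x, a_bar, delta, b)
ok_sampled = sampled_ok(x, a_bar, delta, b)
```

    A robust-feasible point remains feasible for every realisation of the uncertain coefficients, which is the guarantee the methodology buys at the price of some conservatism.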

  20. Methodological developments in searching for studies for systematic reviews: past, present and future?

    PubMed

    Lefebvre, Carol; Glanville, Julie; Wieland, L Susan; Coles, Bernadette; Weightman, Alison L

    2013-01-01

    The Cochrane Collaboration was established in 1993, following the opening of the UK Cochrane Centre in 1992, at a time when searching for studies for inclusion in systematic reviews was not well-developed. Review authors largely conducted their own searches or depended on medical librarians, who often possessed limited awareness and experience of systematic reviews. Guidance on the conduct and reporting of searches was limited. When work began to identify reports of randomized controlled trials (RCTs) for inclusion in Cochrane Reviews in 1992, there were only approximately 20,000 reports indexed as RCTs in MEDLINE and none indexed as RCTs in Embase. No search filters had been developed with the aim of identifying all RCTs in MEDLINE or other major databases. This presented The Cochrane Collaboration with a considerable challenge in identifying relevant studies.Over time, the number of studies indexed as RCTs in the major databases has grown considerably and the Cochrane Central Register of Controlled Trials (CENTRAL) has become the best single source of published controlled trials, with approximately 700,000 records, including records identified by the Collaboration from Embase and MEDLINE. Search filters for various study types, including systematic reviews and the Cochrane Highly Sensitive Search Strategies for RCTs, have been developed. There have been considerable advances in the evidence base for methodological aspects of information retrieval. The Cochrane Handbook for Systematic Reviews of Interventions now provides detailed guidance on the conduct and reporting of searches. 
Initiatives across The Cochrane Collaboration to improve, inter alia, the quality of information retrieval include: the recently introduced Methodological Expectations for Cochrane Intervention Reviews (MECIR) programme, which stipulates 'mandatory' and 'highly desirable' standards for various aspects of review conduct and reporting including searching, the development of Standard Training

  2. An automated methodology development. [software design for combat simulation

    NASA Technical Reports Server (NTRS)

    Hawley, L. R.

    1985-01-01

    The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life cycle costs and ease the program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real-time, and one-to-one correlation between the object states and real world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance the code efficiency by, e.g., eliminating non-used subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.

  3. Development of a Design Methodology for Reconfigurable Flight Control Systems

    NASA Technical Reports Server (NTRS)

    Hess, Ronald A.; McLean, C.

    2000-01-01

    A methodology is presented for the design of flight control systems that exhibit stability and performance-robustness in the presence of actuator failures. The design is based upon two elements. The first element consists of a control law that will ensure at least stability in the presence of a class of actuator failures. This law is created by inner-loop, reduced-order, linear dynamic inversion, and outer-loop compensation based upon Quantitative Feedback Theory. The second element consists of adaptive compensators obtained from simple and approximate time-domain identification of the dynamics of the 'effective vehicle' with failed actuator(s). An example involving the lateral-directional control of a fighter aircraft is employed both to introduce the proposed methodology and to demonstrate its effectiveness and limitations.
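    The inner-loop inversion element can be illustrated on a scalar plant x' = f(x) + g(x)u, where the control u = (v - f(x))/g(x) cancels the plant dynamics and imposes first-order error dynamics through the pseudo-control v. The plant functions below are hypothetical, not the fighter-aircraft model:

```python
def f(x):
    """Hypothetical plant drift term."""
    return -0.5 * x + 0.1 * x ** 2

def g(x):
    """Hypothetical control effectiveness (nonzero over the range used)."""
    return 1.0 + 0.05 * x

def inversion_control(x, x_ref, k=4.0):
    """Dynamic inversion: cancel f and g, impose x' = k*(x_ref - x)."""
    v = k * (x_ref - x)          # pseudo-control from proportional feedback
    return (v - f(x)) / g(x)

# Euler simulation of the closed loop tracking a constant reference.
x, x_ref, dt = 0.0, 1.0, 0.01
for _ in range(1000):
    u = inversion_control(x, x_ref)
    x += dt * (f(x) + g(x) * u)
```

    With perfect model knowledge the closed loop is exactly x' = k*(x_ref - x); the paper's adaptive compensators exist precisely because f and g are no longer known after an actuator failure.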

  4. Methodology for system description using the software design & documentation language

    NASA Technical Reports Server (NTRS)

    Kleine, H.

    1985-01-01

    The Software Design and Documentation Language (SDDL) can be loosely characterized as a text processor with built-in knowledge of, and methods for handling the concepts of structure and abstraction which are essential for developing software and other information intensive systems. Several aspects of system descriptions to which SDDL has been applied are presented and specific SDDL methodologies developed for these applications are discussed.

  5. A SYSTEMATIC PROCEDURE FOR DESIGNING PROCESSES WITH MULTIPLE ENVIRONMENTAL OBJECTIVES

    EPA Science Inventory

    Evaluation and analysis of multiple objectives are very important in designing environmentally benign processes. They require a systematic procedure for solving multi-objective decision-making problems due to the complex nature of the problems and the need for complex assessment....

  6. Designs and Methods in School Improvement Research: A Systematic Review

    ERIC Educational Resources Information Center

    Feldhoff, Tobias; Radisch, Falk; Bischof, Linda Marie

    2016-01-01

    Purpose: The purpose of this paper is to focus on challenges faced by longitudinal quantitative analyses of school improvement processes and offers a systematic literature review of current papers that use longitudinal analyses. In this context, the authors assessed designs and methods that are used to analyze the relation between school…

  7. A systematic computational methodology applied to a three-dimensional film-cooling flowfield

    SciTech Connect

    Walters, D.K.; Leylek, J.H.

    1997-10-01

    Numerical results are presented for a three-dimensional discrete-jet in crossflow problem typical of a realistic film-cooling application in gas turbines. Key aspects of the study include: (1) application of a systematic computational methodology that stresses an accurate computational model of the physical problem, including simultaneous, fully elliptic solution of the crossflow, film-hole, and plenum regions; high-quality three-dimensional unstructured grid generation techniques, which have yet to be documented for this class of problems; the use of a high-order discretization scheme to reduce numerical errors significantly; and effective turbulence modeling; (2) a three-way comparison of results to both code-validation-quality experimental data and a previously documented structured grid simulation; and (3) identification of sources of discrepancy between predicted and measured results, as well as recommendations to alleviate these discrepancies. Solutions were obtained with a multiblock, unstructured/adaptive grid, fully explicit, time-marching, Reynolds-averaged Navier-Stokes code with multigrid, local time stepping, and residual smoothing type acceleration techniques. The computational methodology was applied to the validation test case of a row of discrete jets on a flat plate with a streamwise injection angle of 35 deg, and two film-hole length-to-diameter ratios of 3.5 and 1.75. The density ratio for all cases was 2.0, blowing ratio was varied from 0.5 to 2.0, and free-stream turbulence intensity was 2%. The results demonstrate that the prescribed computational methodology yields consistently more accurate solutions for this class of problems than previous attempts published in the open literature. Sources of disagreement between measured and computed results have been identified, and recommendations made for future prediction of film-cooling problems.

  8. Thin Film Heat Flux Sensors: Design and Methodology

    NASA Technical Reports Server (NTRS)

    Fralick, Gustave C.; Wrbanek, John D.

    2013-01-01

    Thin Film Heat Flux Sensors: Design and Methodology: (1) Heat flux is one of a number of parameters, together with pressure, temperature, flow, etc., of interest to engine designers and fluid dynamicists; (2) the measurement of heat flux is of interest in directly determining the cooling requirements of hot-section blades and vanes; and (3) in addition, if the surface and gas temperatures are known, the measurement of heat flux provides a value for the convective heat transfer coefficient that can be compared with the value provided by CFD codes.
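    Point (3) is the relation h = q / (T_gas - T_surface). A minimal helper, with illustrative numbers rather than sensor data:

```python
def convective_h(q_flux, t_gas, t_surface):
    """Convective heat transfer coefficient h = q / (T_gas - T_surface),
    in W/(m^2 K) when q is in W/m^2 and temperatures are in K."""
    dt = t_gas - t_surface
    if dt == 0:
        raise ValueError("gas and surface temperatures must differ")
    return q_flux / dt

# Illustrative values: 50 kW/m^2 measured across a 400 K gas-wall difference.
h = convective_h(50_000.0, 1600.0, 1200.0)
```

    The resulting h is the quantity that can be compared directly against the heat transfer coefficient predicted by a CFD code for the same surface.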

  9. When Playing Meets Learning: Methodological Framework for Designing Educational Games

    NASA Astrophysics Data System (ADS)

    Linek, Stephanie B.; Schwarz, Daniel; Bopp, Matthias; Albert, Dietrich

    Game-based learning builds upon the idea of using the motivational potential of video games in the educational context. Thus, the design of educational games has to address optimizing enjoyment as well as optimizing learning. Within the EC-project ELEKTRA a methodological framework for the conceptual design of educational games was developed. Thereby state-of-the-art psycho-pedagogical approaches were combined with insights of media-psychology as well as with best-practice game design. This science-based interdisciplinary approach was enriched by enclosed empirical research to answer open questions on educational game-design. Additionally, several evaluation-cycles were implemented to achieve further improvements. The psycho-pedagogical core of the methodology can be summarized by the ELEKTRA's 4Ms: Macroadaptivity, Microadaptivity, Metacognition, and Motivation. The conceptual framework is structured in eight phases which have several interconnections and feedback-cycles that enable a close interdisciplinary collaboration between game design, pedagogy, cognitive science and media psychology.

  10. Implementation of probabilistic design methodology at Tennessee State University

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chinyere

    1995-01-01

    The fact that the Deterministic Design Method no longer satisfies most design needs calls for methods that can cope with the rapid pace of technology. The advance in computer technology has reduced the rigors that normally accompany many design analysis methods that account for uncertainties in design parameters. Probabilistic Design Methodology (PDM) is beginning to make an impact in engineering design. This method is gaining more recognition in industries than in educational institutions. Some of the reasons for the limited use of PDM at the moment are that many are unaware of its potential, and most of the software developed for PDM is very recent. The central goal of the PDM project at Tennessee State University is to introduce engineering students to this method. The students participating in the project learn about PDM and the computer codes that are available to the design engineer. The software being used for this project is NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), developed under the NASA probabilistic structural analysis program. NESSUS has three different modules which make it a very comprehensive computer code for PDM. Since this method is new to the students, its introduction into the engineering curriculum is to be in stages, ranging from the introduction of PDM and its software to the applications. While this program is being developed for its eventual inclusion into the engineering curriculum, some graduate and undergraduate students are already carrying out projects using this method. As the students increase their understanding of PDM, they are at the same time applying it to some common design problems. The areas in which this method is currently being applied include: Design of Gears (spur and worm); Design of Brakes; Design of Heat Exchangers; Design of Helical Springs; and Design of Shock Absorbers. Some of the current results of these projects are presented.
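    One of the simplest probabilistic design calculations students meet is normal stress-strength interference, where the failure probability follows from a reliability index beta. The spring numbers below are invented for illustration and do not come from the projects described:

```python
import math

def failure_probability(mu_strength, sd_strength, mu_stress, sd_stress):
    """Normal stress-strength interference: P(stress > strength) = Phi(-beta),
    with beta the reliability (safety) index."""
    beta = (mu_strength - mu_stress) / math.sqrt(sd_strength ** 2 + sd_stress ** 2)
    return 0.5 * math.erfc(beta / math.sqrt(2.0))

# Hypothetical helical-spring design: strength ~ N(500, 40) MPa,
# operating stress ~ N(380, 30) MPa.
p_f = failure_probability(500.0, 40.0, 380.0, 30.0)
```

    A deterministic check (500 > 380, safety factor 1.3) would call this design safe; the probabilistic view quantifies the residual chance that a weak part meets a high load.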

  11. Gaining system design knowledge by systematic design space exploration with graph based design languages

    NASA Astrophysics Data System (ADS)

    Schmidt, Jens; Rudolph, Stephan

    2014-10-01

    The conceptual design phase in the design of complex systems such as satellite propulsion systems heavily relies on an exploration of the feasible design space. This exploration requires both: topological changes in the potential system architecture and consistent parametrical changes in the dimensioning of the existing system components. Since advanced engineering design techniques nowadays advocate a model-based systems engineering (MBSE) approach, graph-based design languages which embed a superset of MBSE-features are consequently used in this work to systematically explore the feasible design space. Design languages allow the design knowledge to be represented, modeled and executed using model-based transformations and combine this among other features with constraint processing techniques. The execution of the design language shown for the satellite propulsion systems in this work yields topologically varied designs (i.e. the selection of a monergol, a diergol or a coldgas system) with consistent parameters. Based on an a posteriori performance analysis of the automatically generated system designs, novel system knowledge (most notably in form of so-called "topology change points") can be gained and extracted from the original point cloud of numerical results.
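    The explore-then-analyze pattern, enumerating topology and parameter combinations into a point cloud and mining it afterwards, can be sketched as below. The three topology labels follow the abstract; the performance models are invented placeholders, not the design-language output:

```python
def performance(topology, tank_mass):
    """Stand-in performance model per propulsion topology; the base values
    and the quadratic sizing penalty are purely illustrative."""
    base = {"monergol": 200.0, "diergol": 320.0, "coldgas": 70.0}[topology]
    return base - 0.5 * (tank_mass - 10.0) ** 2

def explore():
    """Enumerate topology x parameter combinations into a point cloud and
    pick the best record a posteriori."""
    cloud = [(t, m, performance(t, m))
             for t in ("monergol", "diergol", "coldgas")
             for m in range(5, 16)]          # tank mass sweep (kg)
    return max(cloud, key=lambda rec: rec[2])

best = explore()
```

    Plotting the full cloud rather than just the optimum is what exposes the "topology change points" the abstract mentions, i.e. where the best-performing architecture switches.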

  12. Evaluation of methodology and quality characteristics of systematic reviews in orthodontics.

    PubMed

    Papageorgiou, S N; Papadopoulos, M A; Athanasiou, A E

    2011-08-01

    Systematic reviews (SRs) are published at an increasing rate in many fields of the biomedical literature, including orthodontics. Although SRs should consolidate the evidence-based characteristics of contemporary orthodontic practice, doubts about the validity of their conclusions have frequently been expressed. The aim of this study was to evaluate the methodology and quality characteristics of orthodontic SRs as well as to assess their quality of reporting during the last years. Electronic databases were searched for SRs (without any meta-analytical data synthesis) in the field of orthodontics, indexed up to the start of 2010. The Assessment of Multiple Systematic Reviews (AMSTAR) tool was used for quality assessment of the included articles. Data were analyzed with Student's t-test, one-way ANOVA, and linear regression. Risk ratios (RR) with 95% confidence intervals were calculated to represent changes during the years in reporting of key items associated with quality. A total of 110 SRs were included in this evaluation. About half of the SRs (46.4%) were published in orthodontic journals, while few (5.5%) were updates of previously published reviews. Using the AMSTAR tool, 30 (27.3%) of the SRs were found to be of low quality, 63 (57.3%) of medium quality, and 17 (15.5%) of high quality. No significant trend for quality improvement was observed during the last years. The overall quality of orthodontic SRs may be considered as medium. Although the number of orthodontic SRs has increased over the last decade, their quality characteristics can be characterized as moderate. PMID:21771267
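    The risk ratios with 95% confidence intervals mentioned above are conventionally computed on the log scale. A sketch with hypothetical counts, not the study's data:

```python
import math

def risk_ratio_ci(events_a, n_a, events_b, n_b, z=1.96):
    """Risk ratio with its large-sample 95% CI computed on the log scale."""
    rr = (events_a / n_a) / (events_b / n_b)
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts: 30/60 recent reviews report a key item vs 15/50 earlier.
rr, lo, hi = risk_ratio_ci(30, 60, 15, 50)
```

    A confidence interval that excludes 1.0 would indicate a statistically detectable change in reporting over time; one that straddles 1.0 would not.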

  13. The uniqueness of the human dentition as forensic evidence: a systematic review on the technological methodology.

    PubMed

    Franco, Ademir; Willems, Guy; Souza, Paulo Henrique Couto; Bekkering, Geertruida E; Thevissen, Patrick

    2015-11-01

    The uniqueness of human dentition is routinely approached as identification evidence in forensic odontology. Specifically in bitemark and human identification cases, positive identifications are obtained under the hypothesis that two individuals do not have the same dental features. The present study compiles methodological information from articles on the uniqueness of human dentition to support investigations into the mentioned hypothesis. In April 2014, three electronic library databases (SciELO®, MEDLINE®/PubMed®, and LILACS®) were systematically searched. In parallel, reference lists of relevant studies were also screened. From the obtained articles (n = 1235), 13 full-text articles were considered eligible. They were examined according to the studied parameters: the sample size, the number of examined teeth, the registration technique for data collection, the methods for data analysis, and the study outcomes. Six combinations of studied data were detected: (1) dental shape, size, angulation, and position (n = 1); (2) dental shape, size, and angulation (n = 4); (3) dental shape and size (n = 5); (4) dental angulation and position (n = 2); (5) dental shape and angulation (n = 1); and (6) dental shape (n = 1). The sample size ranged between 10 and 1099 human dentitions. Ten articles examined the six anterior teeth, while three articles examined more teeth. Four articles exclusively addressed three-dimensional (3D) data registration, while six articles used two-dimensional (2D) imaging. In three articles, both imaging registrations were combined. Most articles (n = 9) explored the data using landmark placement. The other articles (n = 4) comprised digital comparison of superimposed dental contours. Although there were large methodological variations within the investigated articles, the uniqueness of human dentition remains unproved. PMID:25398633

  14. Aircraft conceptual design - an adaptable parametric sizing methodology

    NASA Astrophysics Data System (ADS)

    Coleman, Gary John, Jr.

    Aerospace is a maturing industry with successful and refined baselines which work well for traditional baseline missions, markets and technologies. However, when new markets (space tourism), new constraints (environmental) or new technologies (composites, natural laminar flow) emerge, the conventional solution is not necessarily best for the new situation. This begs the question: "How does a design team quickly screen and compare novel solutions to conventional solutions for new aerospace challenges?" The answer is rapid and flexible conceptual design Parametric Sizing. In the product design life-cycle, parametric sizing is the first step in screening the total vehicle in terms of mission, configuration and technology to quickly assess first-order design and mission sensitivities. During this phase, various missions and technologies are assessed, and the designer identifies design solutions of concepts and configurations to meet combinations of mission and technology. This research undertaking contributes to the state of the art in aircraft parametric sizing through (1) development of a dedicated conceptual design process and disciplinary methods library, (2) development of a novel and robust parametric sizing process based on 'best-practice' approaches found in the process and disciplinary methods library, and (3) application of the parametric sizing process to a variety of design missions (transonic, supersonic and hypersonic transports), different configurations (tail-aft, blended wing body, strut-braced wing, hypersonic blended bodies, etc.), and different technologies (composites, natural laminar flow, thrust vectored control, etc.), in order to demonstrate the robustness of the methodology and unearth first-order design sensitivities to current and future aerospace design problems. This research undertaking demonstrates the importance of this early design step in selecting the correct combination of mission, technologies and configuration to
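    The initial sizing step described here classically reduces to a fixed-point iteration on takeoff weight, W_TO = W_payload / (1 - EWF - FF), where EWF and FF are the empty-weight and fuel fractions. The fractions and the mild weight dependence below are illustrative assumptions, not the dissertation's methods library:

```python
def size_takeoff_weight(w_payload, empty_fraction, fuel_fraction,
                        tol=1e-6, max_iter=200):
    """Fixed-point sizing W_TO = W_payload / (1 - EWF - FF), iterated because
    the empty-weight fraction is taken (hypothetically) to vary with size."""
    w = 10_000.0                                     # initial guess (kg)
    for _ in range(max_iter):
        ewf = empty_fraction * (w / 10_000.0) ** -0.06   # mild size effect
        w_new = w_payload / (1.0 - ewf - fuel_fraction)
        if abs(w_new - w) < tol:
            return w_new
        w = w_new
    return w

# Hypothetical transport: 1800 kg payload, EWF ~ 0.55, fuel fraction 0.25.
w_to = size_takeoff_weight(1_800.0, 0.55, 0.25)
```

    Because the denominator is small, takeoff weight is highly sensitive to the fractions, which is exactly why parametric sizing exposes first-order mission and technology sensitivities so quickly.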

  15. Development and implementation of rotorcraft preliminary design methodology using multidisciplinary design optimization

    NASA Astrophysics Data System (ADS)

    Khalid, Adeel Syed

    Rotorcraft's evolution has lagged behind that of fixed-wing aircraft. One of the reasons for this gap is the absence of a formal methodology to accomplish a complete conceptual and preliminary design. Traditional rotorcraft methodologies are not only time consuming and expensive but also yield sub-optimal designs. Rotorcraft design is an excellent example of a complex multidisciplinary environment where several interdependent disciplines are involved. A formal framework is developed and implemented in this research for preliminary rotorcraft design using the Integrated Product and Process Development (IPPD) methodology. The design methodology consists of the product and process development cycles. In the product development loop, all the technical aspects of design are considered, including vehicle engineering, dynamic analysis, stability and control, aerodynamic performance, propulsion, transmission design, weight and balance, noise analysis, and economic analysis. The design loop starts with a detailed analysis of requirements. A baseline is selected and upgrade targets are identified depending on the mission requirements. An Overall Evaluation Criterion (OEC) is developed that is used to measure the goodness of the design or to compare the design with competitors. The requirements analysis and baseline upgrade targets lead to the initial sizing and performance estimation of the new design. The digital information is then passed to disciplinary experts, where the detailed disciplinary analyses are performed. Information is transferred from one discipline to another as the design loop is iterated. To coordinate all the disciplines in the product development cycle, Multidisciplinary Design Optimization (MDO) techniques, e.g., All At Once (AAO) and Collaborative Optimization (CO), are suggested. The methodology is implemented on a Light Turbine Training Helicopter (LTTH) design. Detailed disciplinary analyses are integrated through a common platform for efficient and centralized transfer of design

  16. Development of the Spanish version of the Systematized Nomenclature of Medicine: methodology and main issues.

    PubMed Central

    Reynoso, G. A.; March, A. D.; Berra, C. M.; Strobietto, R. P.; Barani, M.; Iubatti, M.; Chiaradio, M. P.; Serebrisky, D.; Kahn, A.; Vaccarezza, O. A.; Leguiza, J. L.; Ceitlin, M.; Luna, D. A.; Bernaldo de Quirós, F. G.; Otegui, M. I.; Puga, M. C.; Vallejos, M.

    2000-01-01

    This presentation features linguistic and terminology management issues related to the development of the Spanish version of the Systematized Nomenclature of Medicine (SNOMED). It aims at describing the aspects of translating and the difficulties encountered in delivering a natural and consistent medical nomenclature. Bunge's three-layered model is referenced to analyze the sequence of symbolic concept representations. It further explains how a communicative translation based on a concept-to-concept approach was used to achieve the highest level of flawlessness and naturalness for the Spanish rendition of SNOMED. Translation procedures and techniques are described and exemplified. Both the computer-aided and human translation methods are portrayed. The scientific and translation team tasks are detailed, with focus on Newmark's four-level principle for the translation process, extended with a fifth level relevant to the ontology to control the consistency of the typology of concepts. Finally, the convenience of a common methodology for developing non-English versions of SNOMED is suggested. PMID:11079973

  17. Experimental facility and methodology for systematic studies of cold startability in direct injection Diesel engines

    NASA Astrophysics Data System (ADS)

    Pastor, J. V.; García-Oliver, J. M.; Pastor, J. M.; Ramírez-Hernández, J. G.

    2009-09-01

    Cold start at low temperatures in current direct injection (DI) Diesel engines is a problem which has not yet been properly solved, and it becomes particularly critical with the current trend to reduce the engine compression ratio. Although it is clear that there are some key factors whose control leads to a proper cold start process, their individual relevance and relationships are not clearly understood. Thus, efforts on optimization of the cold start process are mainly based on a trial-and-error procedure in climatic chambers at low ambient temperature, with serious limitations in terms of measurement reliability during such a transient process, low repeatability, and experimental cost. This paper presents a novel approach for an experimental facility capable of simulating real engine cold start, at room temperature and under well-controlled low speed and low temperature conditions. It is based on an optical single-cylinder engine adapted to reproduce in-cylinder conditions representative of those of a real engine during start at cold ambient temperatures (of the order of -20 °C). Such conditions must be realistic, controlled, and repeatable in order to perform systematic studies at the borderline between ignition success and misfiring. An analysis methodology, combining optical techniques and heat release analysis of individual cycles, has been applied.
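
    The paper's heat-release analysis is not detailed in the abstract; as a hedged illustration, a common single-zone first-law formulation (assuming a constant ratio of specific heats; all numbers below are invented) computes an apparent heat release rate from sampled pressure and volume traces:

```python
# Single-zone apparent heat release rate dQ/dtheta from cylinder pressure
# and volume samples, assuming a constant ratio of specific heats (gamma).
# Illustrative sketch only -- not the formulation used in the paper.
def heat_release_rate(theta, p, V, gamma=1.35):
    """Return dQ/dtheta at interior sample points via central differences:
    dQ/dtheta = gamma/(gamma-1) * p * dV/dtheta + 1/(gamma-1) * V * dp/dtheta
    """
    q = []
    for i in range(1, len(theta) - 1):
        dtheta = theta[i + 1] - theta[i - 1]
        dV = (V[i + 1] - V[i - 1]) / dtheta
        dp = (p[i + 1] - p[i - 1]) / dtheta
        q.append(gamma / (gamma - 1) * p[i] * dV + 1.0 / (gamma - 1) * V[i] * dp)
    return q

# Sanity check: for an adiabatic (isentropic) compression, p * V**gamma is
# constant, so the apparent heat release should be numerically close to zero.
gamma = 1.35
theta = [i * 0.01 for i in range(101)]
V = [1.0 - 0.5 * t for t in theta]      # volume decreasing linearly
p = [V0 ** -gamma for V0 in V]          # isentropic pressure trace
q = heat_release_rate(theta, p, V, gamma)
```

    In a real cold-start cycle, positive excursions of this quantity after injection would mark ignition success, while a flat trace would indicate misfire.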

  18. The Study of the Relationship between Probabilistic Design and Axiomatic Design Methodology. Volume 2

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chin-Yere; Onyebueke, Landon

    1996-01-01

    The structural design, or the design of machine elements, has traditionally been based on deterministic design methodology. The deterministic method considers all design parameters to be known with certainty. This methodology is, therefore, inadequate for designing complex structures that are subjected to a variety of complex, severe loading conditions. A nonlinear behavior that is dependent on stress, stress rate, temperature, number of load cycles, and time is observed on all components subjected to complex conditions. These complex conditions introduce uncertainties; hence, the actual factor of safety margin remains unknown. In the deterministic methodology, the contingency of failure is discounted; hence, a high factor of safety is used. It may be most useful in situations where the structures being designed are simple. The probabilistic method is concerned with the probability of non-failure performance of structures or machine elements. It is much more useful in situations where the design is characterized by complex geometry, possibility of catastrophic failure, or sensitive loads and material properties. Also included: Comparative Study of the use of AGMA Geometry Factors and Probabilistic Design Methodology in the Design of Compact Spur Gear Set.
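
    As a toy sketch of the probabilistic view contrasted above (not the paper's own analysis; the distributions and numbers are invented), a Monte Carlo estimate of the probability that stress exceeds strength shows how a design can carry a comfortable deterministic safety factor and still have a nonzero failure probability:

```python
import random

def failure_probability(n=100_000, seed=42):
    """Estimate P(stress > strength) for normally distributed stress and
    strength. Means/standard deviations are illustrative, not from the paper."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        stress = rng.gauss(mu=300.0, sigma=30.0)     # applied stress, MPa
        strength = rng.gauss(mu=450.0, sigma=45.0)   # material strength, MPa
        if stress > strength:
            failures += 1
    return failures / n

pf = failure_probability()
# Deterministic factor of safety = 450/300 = 1.5, yet pf is small but nonzero,
# which a single safety-factor number cannot express.
```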

  19. Developing Risk Prediction Models for Postoperative Pancreatic Fistula: a Systematic Review of Methodology and Reporting Quality.

    PubMed

    Wen, Zhang; Guo, Ya; Xu, Banghao; Xiao, Kaiyin; Peng, Tao; Peng, Minhao

    2016-04-01

    Postoperative pancreatic fistula is still a major complication after pancreatic surgery, despite improvements in surgical technique and perioperative management. We sought to systematically review and critically assess the conduct and reporting of methods used to develop risk prediction models for predicting postoperative pancreatic fistula. We conducted a systematic search of the PubMed and EMBASE databases to identify articles published before January 1, 2015, which described the development of models to predict the risk of postoperative pancreatic fistula. We extracted information on the development of each prediction model, including study design, sample size and number of events, definition of postoperative pancreatic fistula, risk predictor selection, missing data, model-building strategies, and model performance. Seven studies developing seven risk prediction models were included. In three studies (42 %), the number of events per variable was less than 10. The number of candidate risk predictors ranged from 9 to 32. Five studies (71 %) reported using univariate screening, which is not recommended in building a multivariate model, to reduce the number of risk predictors. Six risk prediction models (86 %) were developed by categorizing all continuous risk predictors. The treatment and handling of missing data were not mentioned in any of the studies. We found use of inappropriate methods that could endanger the development of the models, including univariate pre-screening of variables, categorization of continuous risk predictors, and inadequate model validation. The use of inappropriate methods affects the reliability and the accuracy of the probability estimates for predicting postoperative pancreatic fistula. PMID:27303124
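
    The events-per-variable (EPV) criterion the review checks can be sketched as a trivial calculation (the >= 10 threshold is the common rule of thumb in the prediction-modelling literature; the example numbers are invented, not taken from the reviewed studies):

```python
def events_per_variable(n_events, n_candidate_predictors):
    """EPV = number of outcome events / number of candidate predictors."""
    return n_events / n_candidate_predictors

def epv_ok(n_events, n_candidate_predictors, threshold=10):
    """Rule of thumb: an EPV below ~10 risks an overfitted, unstable model."""
    return events_per_variable(n_events, n_candidate_predictors) >= threshold

# E.g. a hypothetical study with 45 fistula events and 9 candidate predictors:
epv = events_per_variable(45, 9)   # 5.0, below the usual threshold
```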

  20. Empirical studies of clinical supervision in psychiatric nursing: A systematic literature review and methodological critique.

    PubMed

    Buus, Niels; Gonge, Henrik

    2009-08-01

    The objective of this paper was to systematically review and critically evaluate all English language research papers reporting empirical studies of clinical supervision in psychiatric nursing. The first part of the search strategy was a combination of brief and building block strategies in the PubMed, CINAHL, and PsycINFO databases. The second part was a citation pearl growing strategy with reviews of 179 reference lists. In total, the search strategy demonstrated a low level of precision and a high level of recall. Thirty-four articles met the criteria of the review and were systematically evaluated using three checklists. The findings were summarized by using a new checklist with nine overall questions regarding the studies' design, methods, findings, and limitations. The studies were categorized as: (i) effect studies; (ii) survey studies; (iii) interview studies; and (iv) case studies. In general, the studies were relatively small scale; they used relatively new and basic methods for data collection and analysis, and rarely included sufficient strategies for identifying confounding factors or how the researchers' preconceptions influenced the analyses. Empirical research of clinical supervision in psychiatric nursing was characterized by a basic lack of agreement about which models and instruments to use. Challenges and recommendations for future research are discussed. Clinical supervision in psychiatric nursing was commonly perceived as a good thing, but there was limited empirical evidence supporting this claim. PMID:19594645

  1. Design Evolution and Methodology for Pumpkin Super-Pressure Balloons

    NASA Astrophysics Data System (ADS)

    Farley, Rodger

    The NASA Ultra Long Duration Balloon (ULDB) program has had many technical development issues discovered and solved along its road to success as a new vehicle. It has the promise of being a sub-satellite, a means to launch up to 2700 kg to 33.5 km altitude for 100 days from a comfortable mid-latitude launch point. Current high-lift long duration ballooning is accomplished out of Antarctica with zero-pressure balloons, which cannot cope with the rigors of diurnal cycles. The ULDB design is still evolving, the product of intense analytical effort, scaled testing, improved manufacturing, and engineering intuition. The past technical problems, in particular the s-cleft deformation, their solutions, future challenges, and the methodology of pumpkin balloon design will generally be described.

  2. Fast underdetermined BSS architecture design methodology for real time applications.

    PubMed

    Mopuri, Suresh; Reddy, P Sreenivasa; Acharyya, Amit; Naik, Ganesh R

    2015-01-01

    In this paper, we propose a high-speed architecture design methodology for the Under-determined Blind Source Separation (UBSS) algorithm using our recently proposed high-speed Discrete Hilbert Transform (DHT), targeting real-time applications. In the UBSS algorithm, unlike typical BSS, the number of sensors is less than the number of sources, which is of more interest in real-time applications. The DHT architecture has been implemented based on a sub-matrix multiplication method to compute an M-point DHT, which uses the N-point architecture recursively, where M is an integer multiple of N. The DHT architecture and the state-of-the-art architecture are coded in VHDL for a 16-bit word length, and ASIC implementation is carried out using UMC 90-nm technology at VDD = 1 V and a 1 MHz clock frequency. The proposed architecture implementation and experimental comparison results show that the DHT design is two times faster than the state-of-the-art architecture. PMID:26737514
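
    The sub-matrix hardware architecture itself is not reproduced here; as a hedged reference sketch of what a discrete Hilbert transform computes, the frequency-domain definition (multiply the spectrum by -j·sgn(frequency)) can be written in a few lines of pure Python with a naive DFT:

```python
import cmath
import math

def dft(x):
    """Naive O(N^2) discrete Fourier transform."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    """Inverse DFT."""
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

def discrete_hilbert(x):
    """DHT: multiply the spectrum by -j for positive frequencies and +j for
    negative frequencies; the DC and Nyquist bins are zeroed."""
    n = len(x)
    X = dft(x)
    H = [0j] * n
    for k in range(1, n // 2):
        H[k] = -1j * X[k]          # positive frequencies
    for k in range(n // 2 + 1, n):
        H[k] = 1j * X[k]           # negative frequencies
    return [v.real for v in idft(H)]

# Classic property: the Hilbert transform of cos is sin.
n = 16
x = [math.cos(2 * math.pi * t / n) for t in range(n)]
h = discrete_hilbert(x)
```

    A hardware architecture like the one in the paper would replace the naive transforms with structured matrix multiplications, but the input/output relationship is the same.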

  3. The Usefulness of Systematic Reviews of Animal Experiments for the Design of Preclinical and Clinical Studies

    PubMed Central

    de Vries, Rob B. M.; Wever, Kimberley E.; Avey, Marc T.; Stephens, Martin L.; Sena, Emily S.; Leenaars, Marlies

    2014-01-01

    The question of how animal studies should be designed, conducted, and analyzed remains underexposed in societal debates on animal experimentation. This is not only a scientific but also a moral question. After all, if animal experiments are not appropriately designed, conducted, and analyzed, the results produced are unlikely to be reliable and the animals have in effect been wasted. In this article, we focus on one particular method to address this moral question, namely systematic reviews of previously performed animal experiments. We discuss how the design, conduct, and analysis of future (animal and human) experiments may be optimized through such systematic reviews. In particular, we illustrate how these reviews can help improve the methodological quality of animal experiments, make the choice of an animal model and the translation of animal data to the clinic more evidence-based, and implement the 3Rs. Moreover, we discuss which measures are being taken and which need to be taken in the future to ensure that systematic reviews will actually contribute to optimizing experimental design and thereby to meeting a necessary condition for making the use of animals in these experiments justified. PMID:25541545

  4. A design methodology for biologically inspired dry fibrillar adhesives

    NASA Astrophysics Data System (ADS)

    Aksak, Burak

    Realization of the unique aspects of gecko adhesion and incorporating these aspects into a comprehensive design methodology is essential to enable fabrication of application-oriented gecko-inspired dry fibrillar adhesives. To address the need for such a design methodology, we propose a fibrillar adhesion model that evaluates the effect of fiber dimensions and material on the adhesive performance of fiber arrays. A fibrillar adhesion model is developed to predict the adhesive characteristics of an array of fibrillar structures, and to quantify the effect of fiber length, radius, spacing, and material. Photolithography techniques were utilized to fabricate elastomer microfiber arrays. Fibers that are fabricated from stiff SU-8 photoresist are used to fabricate a flexible negative mold that facilitates fabrication of fiber arrays from various elastomers with high yield. The tips of the cylindrical fibers are modified to mushroom-like tip shapes. Adhesive strengths in excess of 100 kPa are obtained with mushroom-tipped elastomer microfibers. Vertically aligned carbon nanofibers (VACNFs) are utilized as enhanced friction materials by partially embedding them inside soft polyurethanes. Friction coefficients up to 1 were repeatedly obtained from the resulting VACNF composite structures. A novel fabrication method is used to attach Poly(n-butyl acrylate) (PBA) molecular brush-like structures on the surface of polydimethylsiloxane (PDMS). These brushes are grown on unstructured PDMS and on PDMS fibers with mushroom tips. Pull-off force is enhanced by up to 7 times with PBA brush-grafted microfiber arrays over an unstructured PDMS substrate. The adhesion model, initially developed for curved smooth surfaces, is extended to self-affine fractal surfaces to better reflect the adhesion performance of fiber arrays on natural surfaces. The developed adhesion model for fiber arrays is used in an optimization scheme which estimates optimal design parameters to obtain maximum adhesive strength on a given

  5. A variable-gain output feedback control design methodology

    NASA Technical Reports Server (NTRS)

    Halyo, Nesim; Moerder, Daniel D.; Broussard, John R.; Taylor, Deborah B.

    1989-01-01

    A digital control system design technique is developed in which the control system gain matrix varies with the plant operating point parameters. The design technique is obtained by formulating the problem as an optimal stochastic output feedback control law with variable gains. This approach provides a control theory framework within which the operating range of a control law can be significantly extended. Furthermore, the approach avoids the major shortcomings of the conventional gain-scheduling techniques. The optimal variable gain output feedback control problem is solved by embedding the Multi-Configuration Control (MCC) problem, previously solved at ICS. An algorithm to compute the optimal variable gain output feedback control gain matrices is developed. The algorithm is a modified version of the MCC algorithm improved so as to handle the large dimensionality which arises particularly in variable-gain control problems. The design methodology developed is applied to a reconfigurable aircraft control problem. A variable-gain output feedback control problem was formulated to design a flight control law for an AFTI F-16 aircraft which can automatically reconfigure its control strategy to accommodate failures in the horizontal tail control surface. Simulations of the closed-loop reconfigurable system show that the approach produces a control design which can accommodate such failures with relative ease. The technique can be applied to many other problems including sensor failure accommodation, mode switching control laws and super agility.
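
    A drastically simplified, invented scalar example can illustrate the variable-gain idea (an output-feedback gain interpolated between gains designed at two operating points; this is not the paper's AFTI F-16 design, and all numbers are illustrative):

```python
# Hypothetical scalar plant x[k+1] = a(p) * x[k] + u[k], output y = x,
# where the pole a depends on an operating-point parameter p in [0, 1].
def a(p):
    return 0.5 + 0.4 * p           # plant dynamics vary with operating point

# Gains designed offline at the two operating-point extremes p = 0 and p = 1.
K0, K1 = 0.3, 0.7

def gain(p):
    """Variable gain: linear interpolation over the operating range."""
    return (1 - p) * K0 + p * K1

def simulate(p, x0=1.0, steps=50):
    """Closed-loop response with output feedback u = -K(p) * y."""
    x = x0
    for _ in range(steps):
        u = -gain(p) * x
        x = a(p) * x + u
    return x

# The interpolated gain keeps the loop stable across the whole range.
residuals = [abs(simulate(p / 10)) for p in range(11)]
```

    An optimal variable-gain synthesis, as in the paper, would compute the gain schedule from a stochastic output-feedback criterion rather than by simple interpolation, but the runtime structure (gain as a function of operating-point parameters) is the same.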

  6. Systematic search for major genes in schizophrenia: Methodological issues and results from chromosome 12

    SciTech Connect

    Dawson, E.; Powell, J.F.; Sham, P.

    1995-10-09

    We describe a method of systematically searching for major genes in disorders of unknown mode of inheritance, using linkage analysis. Our method is designed to minimize the probability of missing linkage due to inadequate exploration of data. We illustrate this method with the results of a search for a locus for schizophrenia on chromosome 12 using 22 highly polymorphic markers in 23 high density pedigrees. The markers span approximately 85-90% of the chromosome and are on average 9.35 cM apart. We have analysed the data using the most plausible current genetic models and allowing for the presence of genetic heterogeneity. None of the markers was supportive of linkage and the distribution of the heterogeneity statistics was in accordance with the null hypothesis. 53 refs., 2 figs., 4 tabs.

  7. Integrated design of the CSI evolutionary structure: A verification of the design methodology

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Joshi, S. M.; Elliott, Kenny B.; Walz, J. E.

    1993-01-01

    One of the main objectives of the Controls-Structures Interaction (CSI) program is to develop and evaluate integrated controls-structures design methodology for flexible space structures. Thus far, integrated design methodologies for a class of flexible spacecraft, which require fine attitude pointing and vibration suppression with no payload articulation, have been extensively investigated. Various integrated design optimization approaches, such as single-objective optimization, and multi-objective optimization, have been implemented with an array of different objectives and constraints involving performance and cost measures such as total mass, actuator mass, steady-state pointing performance, transient performance, control power, and many more. These studies have been performed using an integrated design software tool (CSI-DESIGN CODE) which is under development by the CSI-ADM team at the NASA Langley Research Center. To date, all of these studies, irrespective of the type of integrated optimization posed or objectives and constraints used, have indicated that integrated controls-structures design results in an overall spacecraft design which is considerably superior to designs obtained through a conventional sequential approach. Consequently, it is believed that validation of some of these results through fabrication and testing of a structure which is designed through an integrated design approach is warranted. The objective of this paper is to present and discuss the efforts that have been taken thus far for the validation of the integrated design methodology.

  8. Combustor design and analysis using the Rocket Combustor Interactive Design (ROCCID) methodology

    NASA Technical Reports Server (NTRS)

    Klem, Mark D.; Pieper, Jerry L.; Walker, Richard E.

    1990-01-01

    The ROCket Combustor Interactive Design (ROCCID) Methodology is a newly developed, interactive computer code for the design and analysis of a liquid propellant rocket combustion chamber. The application of ROCCID to design a liquid rocket combustion chamber is illustrated. Designs for a 50,000 lbf thrust and 1250 psi chamber pressure combustor using liquid oxygen (LOX)/RP-1 propellants are developed and evaluated. Tradeoffs between key design parameters affecting combustor performance and stability are examined. Predicted performance and combustion stability margin for these designs are provided as a function of the combustor operating mixture ratio and chamber pressure.

  10. Towards a Methodology for the Design of Multimedia Public Access Interfaces.

    ERIC Educational Resources Information Center

    Rowley, Jennifer

    1998-01-01

    Discussion of information systems methodologies that can contribute to interface design for public access systems covers: the systems life cycle; advantages of adopting information systems methodologies; soft systems methodologies; task-oriented approaches to user interface design; holistic design, the Star model, and prototyping; the…

  11. A design methodology for portable software on parallel computers

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Miller, Keith W.; Chrisman, Dan A.

    1993-01-01

    This final report for research that was supported by grant number NAG-1-995 documents our progress in addressing two difficulties in parallel programming. The first difficulty is developing software that will execute quickly on a parallel computer. The second difficulty is transporting software between dissimilar parallel computers. In general, we expect that more hardware-specific information will be included in software designs for parallel computers than in designs for sequential computers. This inclusion is an instance of portability being sacrificed for high performance. New parallel computers are being introduced frequently. Trying to keep one's software on the current high performance hardware, a software developer almost continually faces yet another expensive software transportation. The problem of the proposed research is to create a design methodology that helps designers to more precisely control both portability and hardware-specific programming details. The proposed research emphasizes programming for scientific applications. We completed our study of the parallelizability of a subsystem of the NASA Earth Radiation Budget Experiment (ERBE) data processing system. This work is summarized in section two. A more detailed description is provided in Appendix A ('Programming Practices to Support Eventual Parallelism'). Mr. Chrisman, a graduate student, wrote and successfully defended a Ph.D. dissertation proposal which describes our research associated with the issues of software portability and high performance. The list of research tasks are specified in the proposal. The proposal 'A Design Methodology for Portable Software on Parallel Computers' is summarized in section three and is provided in its entirety in Appendix B. We are currently studying a proposed subsystem of the NASA Clouds and the Earth's Radiant Energy System (CERES) data processing system. This software is the proof-of-concept for the Ph.D. dissertation. We have implemented and measured

  12. Bond energy analysis revisited and designed toward a rigorous methodology

    NASA Astrophysics Data System (ADS)

    Nakai, Hiromi; Ohashi, Hideaki; Imamura, Yutaka; Kikuchi, Yasuaki

    2011-09-01

    The present study theoretically revisits and numerically assesses two-body energy decomposition schemes, including a newly proposed one. The new decomposition scheme is designed so that the equilibrium bond distance coincides with the minimum point of the bond energy. Although the other decomposition schemes generally predict the wrong order of the C-C bond strengths of C2H2, C2H4, and C2H6, the new decomposition scheme is capable of reproducing the C-C bond strengths. Numerical assessment on a training set of molecules demonstrates that the present scheme exhibits a stronger correlation with bond dissociation energies than the other decomposition schemes do, which suggests that the new decomposition scheme is a reliable and powerful analysis methodology.

  13. Sonic Boom Mitigation Through Aircraft Design and Adjoint Methodology

    NASA Technical Reports Server (NTRS)

    Rallabhandi, Siriam K.; Diskin, Boris; Nielsen, Eric J.

    2012-01-01

    This paper presents a novel approach to the design of a supersonic aircraft outer mold line (OML) by optimizing the A-weighted loudness of the sonic boom signature predicted on the ground. The optimization process uses sensitivity information obtained by coupling the discrete adjoint formulations for the augmented Burgers equation and the Computational Fluid Dynamics (CFD) equations. This coupled formulation links the loudness of the ground boom signature to the aircraft geometry, thus allowing efficient shape optimization for the purpose of minimizing loudness. The accuracy of the adjoint-based sensitivities is verified against sensitivities obtained using an independent complex-variable approach. The adjoint-based optimization methodology is applied to a configuration previously optimized using alternative state-of-the-art optimization methods and produces additional loudness reduction. The results of the optimizations are reported and discussed.

  14. Development of design and analysis methodology for composite bolted joints

    NASA Astrophysics Data System (ADS)

    Grant, Peter; Sawicki, Adam

    1991-05-01

    This paper summarizes work performed to develop composite joint design methodology for use on rotorcraft primary structure, determine joint characteristics which affect joint bearing and bypass strength, and develop analytical methods for predicting the effects of such characteristics in structural joints. Experimental results have shown that bearing-bypass interaction allowables cannot be defined using a single continuous function due to variance of failure modes for different bearing-bypass ratios. Hole wear effects can be significant at moderate stress levels and should be considered in the development of bearing allowables. A computer program has been developed and has successfully predicted bearing-bypass interaction effects for the (0/±45/90) family of laminates using filled hole and unnotched test data.

  15. A novel methodology for building robust design rules by using design based metrology (DBM)

    NASA Astrophysics Data System (ADS)

    Lee, Myeongdong; Choi, Seiryung; Choi, Jinwoo; Kim, Jeahyun; Sung, Hyunju; Yeo, Hyunyoung; Shim, Myoungseob; Jin, Gyoyoung; Chung, Eunseung; Roh, Yonghan

    2013-03-01

    This paper addresses a methodology for building robust design rules by using design based metrology (DBM). The conventional method for building design rules has been to use a simulation tool and a simple pattern spider mask. At the early stage of a device, the accuracy of the simulation tool is poor, and the evaluation of the simple pattern spider mask is rather subjective because it depends on the experiential judgment of an engineer. In this work, we designed a huge number of pattern situations including various 1D and 2D design structures. In order to overcome the difficulties of inspecting many types of patterns, we introduced the Design Based Metrology (DBM) of Nano Geometry Research, Inc., with which those mass patterns could be inspected at high speed. We also carried out quantitative analysis on PWQ silicon data to estimate process variability. Our methodology demonstrates high speed and accuracy for building design rules. All of the test patterns were inspected within a few hours. Mass silicon data were handled not by personal judgment but by statistical processing. From the results, robust design rules are successfully verified and extracted. Finally, we found that our methodology is appropriate for building robust design rules.

  16. SSME Investment in Turbomachinery Inducer Impeller Design Tools and Methodology

    NASA Technical Reports Server (NTRS)

    Zoladz, Thomas; Mitchell, William; Lunde, Kevin

    2010-01-01

    Within the rocket engine industry, SSME turbomachines are the de facto standards of success with regard to meeting aggressive performance requirements under challenging operational environments. Over the Shuttle era, SSME has invested heavily in our national inducer-impeller design infrastructure. While both low- and high-pressure turbopump failure/anomaly resolution efforts spurred some of these investments, the SSME program was a major beneficiary of key areas of turbomachinery inducer-impeller research outside of flight manifest pressures. Over the past several decades, key turbopump internal environments have been interrogated via highly instrumented hot-fire and cold-flow testing. Likewise, SSME has sponsored the advancement of time-accurate and cavitating inducer-impeller computational fluid dynamics (CFD) tools. These investments together have led to a better understanding of the complex internal flow fields within aggressive high-performing inducers and impellers. New design tools and methodologies have evolved which are intended to provide confident blade designs that strike an appropriate balance between performance and self-induced load management.

  17. Design of integrated pitch axis for autopilot/autothrottle and integrated lateral axis for autopilot/yaw damper for NASA TSRV airplane using integral LQG methodology

    NASA Technical Reports Server (NTRS)

    Kaminer, Isaac; Benson, Russell A.; Coleman, Edward E.; Ebrahimi, Yaghoob S.

    1990-01-01

    Two designs are presented for control systems for the NASA Transport System Research Vehicle (TSRV) using integral Linear Quadratic Gaussian (LQG) methodology. The first is an integrated longitudinal autopilot/autothrottle design and the second design is an integrated lateral autopilot/yaw damper/sideslip controller design. It is shown that a systematic top-down approach to a complex design problem combined with proper application of modern control synthesis techniques yields a satisfactory solution in a reasonable period of time.

  18. An NAFP Project: Use of Object Oriented Methodologies and Design Patterns to Refactor Software Design

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali; Baggs, Rhoda

    2007-01-01

    In the early problem-solution era of software programming, functional decomposition was mainly used to design and implement software solutions. In functional decomposition, functions and data are introduced as two separate entities during the design phase and are treated as such in the implementation phase. Functional decomposition makes use of refactoring by optimizing the algorithms, grouping similar functionalities into common reusable functions, and using abstract representations of data where possible; all of this is done during the implementation phase. This paper advocates the use of object-oriented methodologies and design patterns as the centerpieces of refactoring software solutions. Refactoring software is a method of changing a software design while explicitly preserving its external functionality. The combined use of object-oriented methodologies and design patterns in refactoring should also reduce overall software life-cycle cost through improved software.

  19. A systematic approach to design for lifelong aircraft evolution

    NASA Astrophysics Data System (ADS)

    Lim, Dongwook

    This research proposes a systematic approach with which decision makers can evaluate the value and risk of a new aircraft development program, including potential derivative development opportunities. The proposed Evaluation of Lifelong Vehicle Evolution (EvoLVE) method is a two- or multi-stage representation of the aircraft design process that accommodates initial development phases as well as follow-on phases. One of the key elements of this method is the Stochastic Programming with Recourse (SPR) technique, which accounts for uncertainties associated with future requirements. The remedial approach of SPR, in its two distinctive problem-solving steps, is well suited to aircraft design problems where derivatives, retrofits, and upgrades have been used to fix designs that were once optimal but are no longer. The solution approach of SPR is complemented by the Risk-Averse Strategy Selection (RASS) technique to gauge the risk associated with vehicle evolution options. In the absence of a full description of the random space, a scenario-based approach captures the randomness with a few probable scenarios and reveals the implications of different future events. Lastly, an interactive framework for decision-making support allows simultaneous navigation of the current and future design spaces with a greater degree of freedom. A cantilevered beam design problem was set up and solved using the SPR technique to showcase its application in an engineering design setting. The full EvoLVE method was conducted on a notional multi-role fighter based on the F/A-18 Hornet.
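The two-stage structure of stochastic programming with recourse can be illustrated with a deliberately tiny scenario-based program, unrelated to the thesis's beam or fighter studies and with all numbers invented: commit to a first-stage design capacity now, then pay a more expensive second-stage recourse (a retrofit) in any scenario where the future requirement exceeds it.

```python
# Toy two-stage stochastic program with recourse (scenario-based sketch;
# costs, loads, and probabilities are hypothetical, not from EvoLVE).
import numpy as np

scenarios = [(0.5, 10.0), (0.3, 14.0), (0.2, 20.0)]  # (probability, future requirement)
c_first, c_retrofit = 1.0, 3.0   # retrofitting later costs 3x building capacity now

def total_cost(x):
    # First-stage cost plus the expected cost of second-stage (recourse)
    # fixes: any capacity shortfall is covered by the pricier retrofit.
    recourse = sum(p * c_retrofit * max(load - x, 0.0) for p, load in scenarios)
    return c_first * x + recourse

# Brute-force the first-stage decision over a fine grid
xs = np.linspace(0.0, 25.0, 2501)
best = min(xs, key=total_cost)
```

The optimum hedges: it covers the two likelier scenarios outright and accepts an expected retrofit cost in the rare high-load case, which is exactly the flavor of trade-off SPR formalizes for derivative aircraft.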

  20. Preliminary methodology for design of stable drifts for the Yucca Mountain Project

    SciTech Connect

    Bauer, S.J.; Ehgartner, B.L.; Hardy, M.P.

    1989-12-31

    This paper defines a methodology and criteria by which the stability of underground repository drifts in tuff is determined and from which the ground support system is designed. Preconstruction evaluations of stability are required for planning and to support the license application process. The emphasis is on analytical numerical methods because, at this time, empirical data are generally not available for excavations in welded tuff at elevated temperatures or in seismic environments, and observational methods are only applicable during construction. The methodology suggests analytical techniques for the range of structural conditions of the rock currently expected at the Yucca Mountain site: systematically jointed rock masses, randomly jointed rock masses, and widely spaced discrete joints. The analyses must also account for loads on the rock in the vicinity of excavations that result from in situ stresses, thermal expansion, and seismic events. As large-scale field experiments at the Exploratory Shaft Facility (ESF) and laboratory experiments on ESF samples define the controlling deformation mechanisms and allow evaluation of in situ properties, the methodology presented is expected to evolve. 2 refs., 2 figs.

  1. The Navigation Guide Systematic Review Methodology: A Rigorous and Transparent Method for Translating Environmental Health Science into Better Health Outcomes

    PubMed Central

    Sutton, Patrice

    2014-01-01

    Background: Synthesizing what is known about the environmental drivers of health is instrumental to taking prevention-oriented action. Methods of research synthesis commonly used in environmental health lag behind systematic review methods developed in the clinical sciences over the past 20 years. Objectives: We sought to develop a proof of concept of the “Navigation Guide,” a systematic and transparent method of research synthesis in environmental health. Discussion: The Navigation Guide methodology builds on best practices in research synthesis in evidence-based medicine and environmental health. Key points of departure from current methods of expert-based narrative review prevalent in environmental health include a prespecified protocol, standardized and transparent documentation including expert judgment, a comprehensive search strategy, assessment of “risk of bias,” and separation of the science from values and preferences. Key points of departure from evidence-based medicine include assigning a “moderate” quality rating to human observational studies and combining diverse evidence streams. Conclusions: The Navigation Guide methodology is a systematic and rigorous approach to research synthesis that has been developed to reduce bias and maximize transparency in the evaluation of environmental health information. Although novel aspects of the method will require further development and validation, our findings demonstrated that improved methods of research synthesis under development at the National Toxicology Program and under consideration by the U.S. Environmental Protection Agency are fully achievable. The institutionalization of robust methods of systematic and transparent review would provide a concrete mechanism for linking science to timely action to prevent harm. Citation: Woodruff TJ, Sutton P. 2014. The Navigation Guide systematic review methodology: a rigorous and transparent method for translating environmental health science into better health outcomes.

  2. European COPD Audit: design, organisation of work and methodology.

    PubMed

    López-Campos, Jose Luis; Hartl, Sylvia; Pozo-Rodriguez, Francisco; Roberts, C Michael

    2013-02-01

    Clinical audit has an important role as an indicator of the clinical practice in a given community. The European Respiratory Society (ERS) chronic obstructive pulmonary disease (COPD) audit was designed as a pilot study to evaluate clinical practice variability as well as clinical and organisational factors related to outcomes for COPD hospital admissions across Europe. The study was designed as a prospective observational noninterventional cohort trial, in which 422 hospitals from 13 European countries participated. There were two databases: one for hospital's resources and organisation and one for clinical information. The study was comprised of an initial 8-week phase during which all consecutive cases admitted to hospital due to an exacerbation of COPD were identified and information on clinical practice was gathered. During the 90-day second phase, mortality and readmissions were recorded. Patient data were anonymised and encrypted through a multi-lingual web-tool. As there is no pan-European Ethics Committee for audits, all partners accepted the general ethical rules of the ERS and ensured compliance with their own national ethical requirements. This paper describes the methodological issues encountered in organising and delivering a multi-national European audit, highlighting goals, barriers and achievements, and provides valuable information for those interested in developing clinical audits. PMID:22599361

  3. Design Methodology of Long Complex Helium Cryogenic Transfer Lines

    NASA Astrophysics Data System (ADS)

    Fydrych, J.; Chorowski, M.; Polinski, J.; Skrzypacz, J.

    2010-04-01

    Big scientific facilities, like superconducting particle accelerators or fusion reactors, require high cooling power, usually produced locally by large helium refrigerators and transferred, by means of liquid or supercritical helium, over distances that may exceed several kilometres. The construction of cold helium transfer lines must take many different issues into consideration. The lines are exposed to thermal loads that can constitute an important part of the cryogenic system's thermal budget. The pressure difference between the vacuum insulation and the inner content of the pipes causes significant mechanical stresses, and cyclic changes of temperature can lead to considerable fatigue stresses. Additionally, due to the complex structure of the scientific facilities, access to the cryogenic lines can be partly or totally limited. All of these thermal and mechanical aspects therefore have to be analyzed and balanced during the design phase of a complex helium transfer line. The paper presents a design methodology for long multi-channel helium cryogenic transfer lines. It describes aspects of process line arrangement, thermo-mechanical calculation, supporting structure, and contraction protection, taking as a case study the cryogenic transfer line XATL1, dedicated to the Accelerator Module Test Facility (AMTF) of the European X-ray Free-Electron Laser (XFEL).

  4. SysSon - A Framework for Systematic Sonification Design

    NASA Astrophysics Data System (ADS)

    Vogt, Katharina; Goudarzi, Visda; Holger Rutz, Hanns

    2015-04-01

    SysSon is a research approach for introducing sonification systematically to a scientific community where it is not yet commonly used, e.g., climate science. Both technical and socio-cultural barriers have to be overcome. The approach was developed together with climate scientists, who participated in contextual inquiries, usability tests and a collaborative design workshop. These extensive user tests informed the final software framework. As a frontend, a graphical user interface allows climate scientists to parametrize standard sonifications with their own data sets. Additionally, an interactive shell allows users competent in sound design to code new sonifications. The framework is a standalone desktop application, available as open source (for details see http://sysson.kug.ac.at/), and works with data in NetCDF format.

  5. We!Design: A Student-Centred Participatory Methodology for the Design of Educational Applications

    ERIC Educational Resources Information Center

    Triantafyllakos, George N.; Palaigeorgiou, George E.; Tsoukalas, Ioannis A.

    2008-01-01

    The development of educational applications has always been a challenging and complex issue, mainly because of the complications imposed by the cognitive and psychological aspects of student-computer interactions. This article presents a methodology, named We!Design, that tries to encounter the complexity of educational applications development…

  6. Architectural Exploration and Design Methodologies of Photonic Interconnection Networks

    NASA Astrophysics Data System (ADS)

    Chan, Jong Wu

    Photonic technology is becoming an increasingly attractive solution to the problems facing today's electronic chip-scale interconnection networks. Recent progress in silicon photonics research has enabled the demonstration of all the necessary optical building blocks for creating extremely high-bandwidth-density and energy-efficient links for on- and off-chip communications. From the feasibility and architecture perspective, however, photonics represents a dramatic paradigm shift from traditional electronic network designs due to fundamental differences in how electronics and photonics function and behave. As a result of these differences, new modeling and analysis methods must be employed in order to properly realize a functional photonic chip-scale interconnect design. In this work, we present a methodology for characterizing and modeling fundamental photonic building blocks which can subsequently be combined to form full photonic network architectures. We also describe a set of tools which can be utilized to assess the physical-layer and system-level performance properties of a photonic network. The models and tools are integrated in a novel open-source design and simulation environment called PhoenixSim. Next, we leverage PhoenixSim for the study of chip-scale photonic networks. We examine several photonic networks through the synergistic study of both physical-layer metrics and system-level metrics. This holistic analysis method enables us to provide deeper insight into architecture scalability since it considers insertion loss, crosstalk, and power dissipation. In addition to these novel physical-layer metrics, traditional system-level metrics of bandwidth and latency are also obtained. Lastly, we propose a novel routing architecture known as wavelength-selective spatial routing. This routing architecture is analogous to electronic virtual channels since it enables the transmission of multiple logical optical channels through a single physical plane (i.e. the

  7. Arab Teens Lifestyle Study (ATLS): objectives, design, methodology and implications

    PubMed Central

    Al-Hazzaa, Hazzaa M; Musaiger, Abdulrahman O

    2011-01-01

    Background There is a lack of comparable data on physical activity, sedentary behavior, and dietary habits among Arab adolescents, which limits our understanding and interpretation of the relationship between obesity and lifestyle parameters. Therefore, we initiated the Arab Teens Lifestyle Study (ATLS). The ATLS is a multicenter collaborative project for assessing lifestyle habits of Arab adolescents. The objectives of the ATLS project were to investigate the prevalence rates for overweight and obesity, physical activity, sedentary activity and dietary habits among Arab adolescents, and to examine the interrelationships between these lifestyle variables. This paper reports on the objectives, design, methodology, and implications of the ATLS. Design/Methods The ATLS is a school-based cross-sectional study involving 9182 randomly selected secondary-school students (14–19 years) from major Arab cities, using a multistage stratified sampling technique. The participating Arab cities included Riyadh, Jeddah, and Al-Khobar (Saudi Arabia), Bahrain, Dubai (United Arab Emirates), Kuwait, Amman (Jordan), Mosel (Iraq), Muscat (Oman), Tunisia (Tunisia) and Kenitra (Morocco). Measured variables included anthropometric measurements, physical activity, sedentary behavior, sleep duration, and dietary habits. Discussion The ATLS project will provide a unique opportunity to collect and analyze important lifestyle information from Arab adolescents using standardized procedures. This is the first time a collaborative Arab project will simultaneously assess broad lifestyle variables in a large sample of adolescents from numerous urbanized Arab regions. This joint research project will supply us with comprehensive and recent data on physical activity/inactivity and eating habits of Arab adolescents relative to obesity. Such invaluable lifestyle-related data are crucial for developing public health policies and regional strategies for health promotion and disease prevention. PMID

  8. Using systematic reviews for evidence-based health promotion: basic methodology issues.

    PubMed

    Buendía-Rodríguez, Jefferson A; Sánchez-Villamil, Juana P

    2006-12-01

    Systematic reviews and evidence-based recommendations are becoming increasingly important for decision-making in health and medicine. Systematic reviews of population-health interventions are challenging, and their methods will continue evolving. This paper provides an overview of how evidence-based approaches in public health and health promotion are being reviewed to provide a basis for the Colombian Guide to Health Promotion, analysing limitations and offering recommendations for future reviews. PMID:17361581

  9. New systematic methodology for incorporating dynamic heat transfer modelling in multi-phase biochemical reactors.

    PubMed

    Fernández-Arévalo, T; Lizarralde, I; Grau, P; Ayesa, E

    2014-09-01

    This paper presents a new modelling methodology for dynamically predicting the heat produced or consumed in the transformations of any biological reactor using Hess's law. Starting from a complete description of model component stoichiometry and formation enthalpies, the proposed modelling methodology successfully integrates the simultaneous calculation of both the conventional mass balances and the enthalpy change of reaction in an expandable multi-phase matrix structure, which facilitates a detailed prediction of the main heat fluxes in biochemical reactors. The methodology has been implemented in a plant-wide modelling framework in order to facilitate the dynamic description of mass and heat throughout the plant. After validation with literature data, two case studies are described as illustrative examples of the capability of the methodology. In the first, a predenitrification-nitrification dynamic process is analysed, with the aim of demonstrating the easy integration of the methodology into any system. In the second, the simulation of a thermal model for an autothermal thermophilic aerobic digester (ATAD) shows the potential of the proposed methodology for analysing the effect of ventilation and influent characterization. PMID:24852412
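The enthalpy bookkeeping at the heart of the methodology is plain Hess's law: the heat of any transformation follows from stoichiometric coefficients and formation enthalpies. A sketch using textbook methane-combustion values (the paper applies the same identity to biochemical transformations inside a multi-phase matrix, not to this reaction):

```python
# Hess's law: dH_rxn = sum(nu_i * dHf_i), with nu < 0 for reactants and
# nu > 0 for products. Standard formation enthalpies in kJ/mol.
dHf = {"CH4": -74.8, "O2": 0.0, "CO2": -393.5, "H2O(l)": -285.8}

# CH4 + 2 O2 -> CO2 + 2 H2O(l)
stoich = {"CH4": -1, "O2": -2, "CO2": 1, "H2O(l)": 2}

dH_rxn = sum(nu * dHf[sp] for sp, nu in stoich.items())
# dH_rxn = -890.3 kJ/mol: exothermic, as expected for combustion
```

In the paper's matrix formulation, the same sum is evaluated per transformation and per phase, which is what allows the heat fluxes to be predicted alongside the conventional mass balances.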

  10. Using the systematic review methodology to evaluate factors that influence the persistence of influenza virus in environmental matrices.

    PubMed

    Irwin, C K; Yoon, K J; Wang, C; Hoff, S J; Zimmerman, J J; Denagamage, T; O'Connor, A M

    2011-02-01

    Understanding factors that influence persistence of influenza virus in an environment without host animals is critical to appropriate decision-making for issues such as quarantine downtimes, setback distances, and eradication programs in livestock production systems. This systematic review identifies literature describing persistence of influenza virus in environmental samples, i.e., air, water, soil, feces, and fomites. An electronic search of PubMed, CAB, AGRICOLA, Biosis, and Compendex was performed, and citation relevance was determined according to the aim of the review. Quality assessment of relevant studies was performed using criteria from experts in virology, disease ecology, and environmental science. A total of 9,760 abstracts were evaluated, and 40 appeared to report the persistence of influenza virus in environmental samples. Evaluation of full texts revealed that 19 of the 40 studies were suitable for review, as they described virus concentration measured at multiple sampling times, with viruses detectable at least twice. Seven studies reported persistence in air (six published before 1970), seven in water (five published after 1990), two in feces, and three on surfaces. All three fomite and five air studies addressed human influenza virus, and all water and feces studies pertained to avian influenza virus. Outcome measurements were transformed to half-lives, and resultant multivariate mixed linear regression models identified influenza virus surviving longer in water than in air. Temperature was a significant predictor of persistence over all matrices. Salinity and pH were significant predictors of persistence in water conditions. An assessment of the methodological quality review of the included studies revealed significant gaps in reporting critical aspects of study design. PMID:21148699
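The review's outcome transformation, converting reported decay measurements into half-lives, amounts to a log-linear fit of titre against time. A sketch on synthetic data (the values are made up, not drawn from the included studies):

```python
# Half-life from exponential viral decay via a log-linear fit
# (synthetic titre data; real studies report noisy measurements).
import numpy as np

days  = np.array([0.0, 1.0, 2.0, 4.0, 7.0])
titre = 1e6 * np.exp(-0.35 * days)        # decay at rate k = 0.35 / day

# Fit ln(titre) = ln(titre0) - k * t; the slope gives the decay rate
k = -np.polyfit(days, np.log(titre), 1)[0]
half_life = np.log(2.0) / k               # days for the titre to halve
```

Expressing each study's persistence as a half-life is what makes the multivariate regression across matrices (air, water, feces, fomites) possible.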

  11. Aerospace engineering design by systematic decomposition and multilevel optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.; Barthelemy, J. F. M.; Giles, G. L.

    1984-01-01

    A method for the systematic analysis and optimization of large engineering systems, by decomposition of a large task into a set of smaller subtasks that are solved concurrently, is described. The subtasks may be arranged in hierarchical levels. Analyses are carried out in each subtask using inputs received from other subtasks, and are followed by optimizations carried out from the bottom up. Each optimization at the lower levels is augmented by an analysis of its sensitivity to the inputs received from other subtasks, to account for the couplings among the subtasks in a formal manner. The analysis and optimization operations alternate iteratively until they converge to a system design whose performance is maximized with all constraints satisfied. The method, which is still under development, is tentatively validated by test cases in structural applications and an aircraft configuration optimization.
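The alternate-analyze-and-optimize loop can be caricatured by block-coordinate iteration on a convex objective split across two subsystems, a drastic simplification of the hierarchical, sensitivity-augmented scheme the abstract describes (the objective and coupling term below are invented for illustration):

```python
# Two coupled subtasks sharing the objective
#   f(x, y) = (x - 2)^2 + (y - 3)^2 + (x - y)^2,
# where (x - y)^2 is the coupling. Each subtask optimizes its own
# variable with the other's latest value as a fixed input.
def solve_subtask_x(y):
    # argmin_x (x - 2)^2 + (x - y)^2  ->  x = (2 + y) / 2
    return (2.0 + y) / 2.0

def solve_subtask_y(x):
    # argmin_y (y - 3)^2 + (x - y)^2  ->  y = (3 + x) / 2
    return (3.0 + x) / 2.0

x, y = 0.0, 0.0
for _ in range(100):               # alternate until the coupling settles
    x = solve_subtask_x(y)
    y = solve_subtask_y(x)
# Converges to the system optimum x = 7/3, y = 8/3
```

In the actual method the subtasks are full analyses, the coordination flows through sensitivity derivatives rather than exact coupling terms, and convexity is not assumed; the sketch only conveys the alternation-to-convergence structure.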

  12. Multidisciplinary design and optimization (MDO) methodology for the aircraft conceptual design

    NASA Astrophysics Data System (ADS)

    Iqbal, Liaquat Ullah

    An integrated design and optimization methodology has been developed for the conceptual design of an aircraft. The methodology brings higher-fidelity Computer Aided Design, Engineering and Manufacturing (CAD, CAE and CAM) tools such as CATIA, FLUENT, ANSYS and SURFCAM into the conceptual design by utilizing Excel as the integrator and controller. The approach is demonstrated to integrate with many of the existing low- to medium-fidelity codes, such as the aerodynamic panel code CMARC and sizing and constraint analysis codes, thus providing multi-fidelity capabilities to the aircraft designer. The higher-fidelity design information from the CAD and CAE tools for the geometry, aerodynamics, and structural and environmental performance is provided for the application of structured design methods such as Quality Function Deployment (QFD) and Pugh's Method. The higher-fidelity tools bring the quantitative aspects of a design, such as precise measurements of weight, volume, surface areas, center of gravity (CG) location, lift-over-drag ratio, and structural weight, as well as the qualitative aspects, such as external geometry definition, internal layout, and coloring scheme, early into the design process. The performance and safety risks involved with new technologies can be reduced by modeling and assessing their impact more accurately on the performance of the aircraft. The methodology also enables the design and evaluation of novel concepts such as the blended wing body (BWB) and the hybrid wing body (HWB). Higher-fidelity computational fluid dynamics (CFD) and finite element analysis (FEA) allow verification of the claims for performance gains in aerodynamics and ascertain the risks of structural failure due to the different pressure distribution in the fuselage as compared with the tube-and-wing design. The higher-fidelity aerodynamics and structural models can lead to better cost estimates that help reduce the financial risks as well. 
This helps in

  13. A combined stochastic feedforward and feedback control design methodology with application to autoland design

    NASA Technical Reports Server (NTRS)

    Halyo, Nesim

    1987-01-01

    A combined stochastic feedforward and feedback control design methodology was developed. The objective of the feedforward control law is to track the commanded trajectory, whereas the feedback control law tries to maintain the plant state near the desired trajectory in the presence of disturbances and uncertainties about the plant. The feedforward control law design is formulated as a stochastic optimization problem and is embedded into the stochastic output feedback problem, where the plant contains unstable and uncontrollable modes. An algorithm to compute the optimal feedforward is developed. In this approach, the use of error integral feedback, dynamic compensation, and control rate command structures is an integral part of the methodology. An incremental implementation is recommended. Results on the eigenvalues of the implemented versus designed control laws are presented. The stochastic feedforward/feedback control methodology is used to design a digital automatic landing system for the ATOPS Research Vehicle, a Boeing 737-100 aircraft. The system control modes include localizer and glideslope capture and track, and flare to touchdown. Results of a detailed nonlinear simulation of the digital control laws, actuator systems, and aircraft aerodynamics are presented.

  14. Systematic methodology for estimating direct capital costs for blanket tritium processing systems

    SciTech Connect

    Finn, P.A.

    1985-01-01

    This paper describes the methodology developed for estimating the relative capital costs of blanket processing systems. The capital costs of the nine blanket concepts selected in the Blanket Comparison and Selection Study are presented and compared.

  15. Risk of bias and methodological appraisal practices in systematic reviews published in anaesthetic journals: a meta-epidemiological study.

    PubMed

    Detweiler, B N; Kollmorgen, L E; Umberham, B A; Hedin, R J; Vassar, B M

    2016-08-01

    The validity of primary study results included in systematic reviews plays an important role in drawing conclusions about intervention effectiveness and carries implications for clinical decision-making. We evaluated the prevalence of methodological quality and risk of bias assessments in systematic reviews published in the five highest-ranked anaesthesia journals since 2007. The initial PubMed search yielded 315 citations, and our final sample after screening consisted of 207 systematic reviews. One hundred and seventy-four reviews conducted methodological quality/risk of bias analyses. The Jadad scale was most frequently used. Forty-four of the 83 reviews that included high risk of bias studies re-analysed their data omitting these trials: 20 showed differences in pooled effect estimates. Reviews containing a greater number of primary studies evaluated quality less frequently than smaller reviews. Overall, the majority of reviews evaluated bias; however, many applied questionable methods. Given the potential effects of bias on summary outcomes, greater attention is warranted. PMID:27396249

  16. Methodology for the optimal design of an integrated first and second generation ethanol production plant combined with power cogeneration.

    PubMed

    Bechara, Rami; Gomez, Adrien; Saint-Antonin, Valérie; Schweitzer, Jean-Marc; Maréchal, François

    2016-08-01

    The application of methodologies for the optimal design of integrated processes has seen increased interest in the literature. This article builds on previous works and applies a systematic methodology to an integrated first and second generation ethanol production plant with power cogeneration. The methodology breaks down into process simulation, heat integration, thermo-economic evaluation, multi-variable evolutionary optimization of exergy efficiency versus capital costs, and process selection via profitability maximization. Optimization generated Pareto solutions with exergy efficiency ranging between 39.2% and 44.4% and capital costs from $210M to $390M. The Net Present Value (NPV) was positive for only two scenarios, both at low-efficiency, low-hydrolysis points. For the high-efficiency, high-hydrolysis alternatives, the minimum cellulosic ethanol selling price was sought such that the maximum NPV reached zero. The obtained optimal configuration presented maximum exergy efficiency, hydrolyzed bagasse fraction, capital costs and ethanol production rate, and minimum cooling water consumption and power production rate. PMID:27160954
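The profitability screen used for process selection reduces to two small calculations: the NPV of a candidate configuration, and the minimum ethanol selling price (MESP) that drives NPV to zero. A sketch with entirely hypothetical cost figures, not the paper's data:

```python
# NPV and minimum selling price for a candidate plant configuration
# (all figures are invented placeholders).
from scipy.optimize import brentq

capex = 300.0          # M$, spent at year 0
volume = 100.0         # ML ethanol produced per year
opex = 20.0            # M$ per year
rate, years = 0.08, 20

# Present value of a 1 M$/yr annuity over the plant lifetime
annuity = (1.0 - (1.0 + rate) ** -years) / rate

def npv(price):        # price in $/L; price * volume gives M$/yr revenue
    return -capex + (price * volume - opex) * annuity

mesp = brentq(npv, 0.0, 5.0)   # selling price at which NPV = 0
```

Because NPV is monotone in the selling price, a bracketing root-finder such as `brentq` recovers the MESP directly; the paper performs this screen on each Pareto-optimal configuration.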

  17. A Synergy between the Technological Process and a Methodology for Web Design: Implications for Technological Problem Solving and Design

    ERIC Educational Resources Information Center

    Jakovljevic, Maria; Ankiewicz, Piet; De swardt, Estelle; Gross, Elna

    2004-01-01

    Traditional instructional methodology in the Information System Design (ISD) environment lacks explicit strategies for promoting the cognitive skills of prospective system designers. This contributes to the fragmented knowledge and low motivational and creative involvement of learners in system design tasks. In addition, present ISD methodologies,…

  18. A multi-stage, multi-response Bayesian methodology for surrogate modeling in engineering design

    NASA Astrophysics Data System (ADS)

    Romero, David A.

    To design products, designers often need models of the system behavior as a function of a set of input (or design) variables; these models allow designers to learn about the influence of different design variables on the system's performance. In practice, however, system models are either unavailable or expensive to evaluate and thus unsuited for systematic use in preliminary design. The creation of surrogate models, or metamodels, of system behavior is a common approach that has been proposed to circumvent these problems. In this work, set in engineering design based on computer experiments with multiple performance criteria, we propose the creation of multi-response metamodels that model several metrics of system behavior jointly, instead of modeling each individually. In particular, we develop the Multi-stage, Multi-Response Bayesian Surrogate Models (MMRBSM) methodology, a flexible, multi-stage framework that allows for modeling the correlation among different response variables for their simultaneous prediction, while also enabling the integration of different sources of information about the response values into a single, global model of the system's responses. In this thesis, the mathematical formulation of MMRBSM metamodels is developed, including the required multi-stage, multi-response covariance functions and multi-response adaptive sampling techniques. The proposed metamodeling framework is tested, first with commonly used analytical test functions and then in the engineering design of an electronic device based on multiple performance metrics. Results indicate that the proposed MMRBSM outperforms individual metamodels, though the relative performance depends on the sample size, the sampling method and the true correlation among the observed response values. 
Results also indicate that the proposed multi-stage formulation enables the incorporation of expert knowledge into the multi-response metamodels, leading to order-of-magnitude improvements in the predictive

  19. A rational design change methodology based on experimental and analytical modal analysis

    SciTech Connect

    Weinacht, D.J.; Bennett, J.G.

    1993-08-01

    A design methodology that integrates analytical modeling and experimental characterization is presented. This methodology represents a powerful tool for making rational design decisions and changes. An example of its implementation in the design, analysis, and testing of a precision machine-tool support structure is given.

  20. Reporting of financial and non-financial conflicts of interest by authors of systematic reviews: a methodological survey

    PubMed Central

    Anouti, Sirine; Al-Gibbawi, Mounir; Abou-Jaoude, Elias A; Hasbani, Divina Justina; Guyatt, Gordon; Akl, Elie A

    2016-01-01

    Background Conflicts of interest may bias the findings of systematic reviews. The objective of this methodological survey was to assess the frequency and different types of conflicts of interest that authors of Cochrane and non-Cochrane systematic reviews report. Methods We searched for systematic reviews using the Cochrane Database of Systematic Reviews and Ovid MEDLINE (limited to the 119 Core Clinical Journals and the year 2015). We defined a conflict of interest disclosure as the reporting of whether a conflict of interest exists or not, and used a framework to classify conflicts of interest into individual (financial, professional and intellectual) and institutional (financial and advocatory) conflicts of interest. We conducted descriptive and regression analyses. Results Of the 200 systematic reviews, 194 (97%) reported authors' conflicts of interest disclosures, typically in the main document, and in a few cases either online (2%) or on request (5%). Of the 194 Cochrane and non-Cochrane reviews, 49% and 33%, respectively, had at least one author reporting any type of conflict of interest (p=0.023). Institutional conflicts of interest were less frequently reported than individual conflicts of interest, and Cochrane reviews were more likely to report individual intellectual conflicts of interest compared with non-Cochrane reviews (19% and 5%, respectively, p=0.004). Regression analyses showed a positive association between reporting of conflicts of interest (at least one type of conflict of interest, individual financial conflict of interest, institutional financial conflict of interest) and journal impact factor and between reporting individual financial conflicts of interest and pharmacological versus non-pharmacological intervention. Conclusions Although close to half of the published systematic reviews report that authors (typically many) have conflicts of interest, more than half report that they do not. 
Authors reported individual conflicts of interest

  1. Design guided data analysis for summarizing systematic pattern defects and process window

    NASA Astrophysics Data System (ADS)

    Xie, Qian; Venkatachalam, Panneerselvam; Lee, Julie; Chen, Zhijin; Zafar, Khurram

    2016-03-01

    As the semiconductor process technology moves into more advanced nodes, design and process induced systematic defects become increasingly significant yield limiters. Therefore, early detection of these defects is crucial. Focus Exposure Matrix (FEM) and Process Window Qualification (PWQ) are routine methods for discovering systematic patterning defects and establishing the lithography process window. These methods require the stepper to expose a reticle onto the wafer at various focus and exposure settings (also known as modulations). The wafer is subsequently inspected by a bright field, broadband plasma or an E-Beam Inspection tool using a high sensitivity inspection recipe (i.e. hot scan) that often reports a million or more defects. Analyzing this vast stream of data to identify the weak patterns and arrive at the optimal focus/exposure settings requires a significant amount of data reduction through aggressive sampling and nuisance filtering schemes. However, these schemes increase alpha risk, i.e. the probability of not catching some systematic or otherwise important defects within a modulation and thus reporting that modulation as a good condition for production wafers. In order to reduce this risk and establish a more accurate process window, we describe a technique that introduces image-and-design integration methodologies into the inspection data analysis workflow. These image-and-design integration methodologies include contour extraction and alignment to design, contour-to-design defect detection, defective/nuisance pattern retrieval, confirmed defective/nuisance pattern overlay with inspection data, and modulation-related weak-pattern ranking. The technique we present provides greater automation, from defect detection to defective pattern retrieval to decision-making steps, that allows for statistically summarized results and increased coverage of the wafer to be achieved without an adverse impact on cycle time. 
Statistically summarized results, lead
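
    The data-reduction step described above — collapsing a raw stream of a million or more defect reports into a ranked set of repeating pattern groups — can be sketched as below. This is a minimal illustration under stated assumptions, not the tool's actual workflow; the `clip` key is a hypothetical stand-in for a hash of the local design layout retrieved around each defect.

```python
from collections import defaultdict

def group_defects(defects):
    """Group raw defect reports by their design-clip signature so that
    repeating systematic defects collapse into a few pattern groups.
    Each defect is a dict; 'clip' stands in for a hash of the design
    layout in a small window around the defect location."""
    groups = defaultdict(list)
    for d in defects:
        groups[d["clip"]].append(d)
    # Rank pattern groups by how often each pattern repeats on the wafer.
    return sorted(groups.items(), key=lambda kv: len(kv[1]), reverse=True)

# Hypothetical hot-scan output: four defects, two distinct design clips.
defects = [
    {"id": 1, "clip": "A"}, {"id": 2, "clip": "B"},
    {"id": 3, "clip": "A"}, {"id": 4, "clip": "A"},
]
ranked = group_defects(defects)
```

    Reviewing one representative defect per group, rather than every raw report, is what makes the statistically summarized result tractable without aggressive sampling.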

  2. [Marxism as a theoretical and methodological framework in collective health: implications for systematic review and synthesis of evidence].

    PubMed

    Soares, Cassia Baldini; Campos, Celia Maria Sivalli; Yonekura, Tatiana

    2013-12-01

    In this study, we discuss the integration into systematic reviews of research developed from a Marxist perspective of knowledge production, and the use of its results as evidence in healthcare. The study objectives are to review the assumptions of dialectical and historical materialism (DHM) and to discuss the implications of dialectics for literature review and the synthesis of evidence. DHM is a powerful framework for knowledge generation and for the transformation of policies and practices in healthcare. It assumes that social contradictions underlie the health-disease process, the fundamental theoretical construction in the field of collective health. Currently, we observe a considerable influence of the critical paradigm, of Marxist origin, on the construction of knowledge in health. Studies based on this critical paradigm incorporate complex methods, inherent to the guidelines of dialectics, to identify the object and arrive at results that constitute evidence in healthcare. Systematic reviews should address the methodological difficulties associated with fully integrating these results into healthcare. PMID:24626368

  3. Methodological quality of systematic reviews and clinical trials on women's health published in a Brazilian evidence-based health journal

    PubMed Central

    Macedo, Cristiane Rufino; Riera, Rachel; Torloni, Maria Regina

    2013-01-01

    OBJECTIVES: To assess the quality of systematic reviews and clinical trials on women's health recently published in a Brazilian evidence-based health journal. METHOD: All systematic reviews and clinical trials on women's health published in the last five years in the Brazilian Journal of Evidence-based Health were retrieved. Two independent reviewers critically assessed the methodological quality of the reviews and trials using AMSTAR and the Cochrane Risk of Bias Table, respectively. RESULTS: Systematic reviews and clinical trials accounted for less than 10% of the 61 original studies on women's health published in the São Paulo Medical Journal over the last five years. All five reviews were considered to be of moderate quality; the worst domains were publication bias and the appropriate use of study quality in formulating conclusions. All three clinical trials were judged to have a high risk of bias. The blinding of participants, personnel, and outcome assessors and the allocation concealment domains received the worst scores. CONCLUSIONS: Most of the systematic reviews and clinical trials on women's health recently published in a Brazilian evidence-based journal are of low to moderate quality. The quality of these types of studies needs improvement. PMID:23778332

  4. SOME RECENT IDEAS IN RESEARCH METHODOLOGY--FACET DESIGN AND THEORY OF DATA.

    ERIC Educational Resources Information Center

    Runkel, Philip J.

    Facet design, as originated by Louis Guttman, is a method of systematically ordering a problem for research. Facet analysis enables the validity of an assessment of the ordering process to be tested. The logic of facet design and analysis is based upon systematic delineation of the important variables prior to data collection and the evaluation of…

  5. A performance-oriented power transformer design methodology using multi-objective evolutionary optimization.

    PubMed

    Adly, Amr A; Abd-El-Hafiz, Salwa K

    2015-05-01

    Transformers are regarded as crucial components in power systems. Due to market globalization, power transformer manufacturers are facing an increasingly competitive environment that mandates the adoption of design strategies yielding better performance at lower costs. In this paper, a power transformer design methodology using multi-objective evolutionary optimization is proposed. Using this methodology, which is tailored to be target performance design-oriented, quick rough estimates of transformer design specifics may be obtained. Testing of the suggested approach revealed a significant qualitative and quantitative match with measured design and performance values. Details of the proposed methodology as well as sample design results are reported in the paper. PMID:26257939
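
    Multi-objective evolutionary optimizers of the kind described rank candidate designs by Pareto dominance rather than a single score. A minimal, generic sketch of extracting the non-dominated front — not the authors' algorithm — assuming two objectives, say material cost and power loss, are both minimized:

```python
def pareto_front(points):
    """Return the non-dominated subset of points, assuming every objective
    is minimized. A point q dominates p if q is <= p in all objectives
    and differs from p (strictly better in at least one, given no ties)."""
    front = []
    for p in points:
        dominated = any(
            all(qi <= pi for qi, pi in zip(q, p)) and q != p
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

# Hypothetical (cost, loss) pairs for five candidate transformer designs.
designs = [(10, 5), (8, 7), (12, 4), (9, 9), (8, 5)]
front = pareto_front(designs)
```

    An evolutionary optimizer keeps breeding new candidates and retaining this front; the designer then trades off cost against loss among the survivors.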

  6. A novel integrated framework and improved methodology of computer-aided drug design.

    PubMed

    Chen, Calvin Yu-Chian

    2013-01-01

    Computer-aided drug design (CADD) is a critical initiating step of drug development, but a single model capable of covering all design aspects has yet to be established. Hence, we developed a drug design modeling framework that integrates multiple approaches, including machine learning based quantitative structure-activity relationship (QSAR) analysis, 3D-QSAR, Bayesian networks, pharmacophore modeling, and a structure-based docking algorithm. Restrictions for each model were defined for improved individual and overall accuracy. An integration method was applied to join the results from each model to minimize bias and errors. In addition, the integrated model adopts both static and dynamic analysis to validate the intermolecular stabilities of the receptor-ligand conformation. The proposed protocol was applied to identifying HER2 inhibitors from traditional Chinese medicine (TCM) as an example for validating our new protocol. Eight potent leads were identified from six TCM sources. A joint validation system comprised of comparative molecular field analysis, comparative molecular similarity indices analysis, and molecular dynamics simulation further characterized the candidates into three potential binding conformations and validated the binding stability of each protein-ligand complex. Ligand pathway analysis was also performed to predict ligand entry into and exit from the binding site. In summary, we propose a novel systematic CADD methodology for the identification, analysis, and characterization of drug-like candidates. PMID:23651478

  7. Appraisal of the methodological quality and summary of the findings of systematic reviews on the relationship between SSRIs and suicidality

    PubMed Central

    LI, Wei; LI, Wei; WAN, Yumei; REN, Juanjuan; LI, Ting; LI, Chunbo

    2014-01-01

    Background Several systematic reviews have been published on the relationship between the use of selective serotonin reuptake inhibitors (SSRIs) and the risk of suicidal ideation or behavior, but there has been no formal assessment of the quality of these reports. Aim Assess the methodological quality of systematic reviews on the relationship between SSRI use and suicidal ideation and behavior, and provide overall conclusions based on this assessment. Methods Systematic reviews of RCTs that compared SSRIs to placebo and used suicidal ideation or behavior as a key outcome variable were identified by searching Pubmed, Embase, The Cochrane Library, EBSCO, PsycINFO, Chinese National Knowledge Infrastructure, Chongqing VIP database for Chinese Technical Periodicals, WANFANG DATA, and the Chinese Biological Medical Literature Database. The methodological quality of included reviews was independently assessed by two expert raters using the 11-item Assessment of Multiple Systematic Reviews (AMSTAR) scale. Results Twelve systematic reviews and meta-analyses were identified. The inter-rater reliability of the overall AMSTAR quality score was excellent (ICC=0.86) but the inter-rater reliability of 5 of the 11 AMSTAR items was poor (Kappa <0.60). Based on the AMSTAR total score, there was one high-quality review, eight moderate-quality reviews, and three low-quality reviews. The high-quality review and three of the moderate-quality reviews reported a significantly increased risk of suicidal ideation or behavior in the SSRI group compared to the placebo group. Three of the four reviews limited to children and adolescents found a significantly increased risk of suicidal ideation or behavior with SSRI use, which was most evident in teenagers taking paroxetine and in teenagers with depressive disorders. 
Conclusions The available evidence suggests that adolescents may experience an increase in suicidal ideation and behavior with SSRI use, particularly those who have a depressive disorder and

  8. Design methodology of the strength properties of medical knitted meshes

    NASA Astrophysics Data System (ADS)

    Mikołajczyk, Z.; Walkowska, A.

    2016-07-01

    One of the most important utility properties of medical knitted meshes intended for hernia and urological treatment is their bidirectional strength along the courses and wales. The value of this parameter expected by manufacturers and surgeons is estimated at 100 N per 5 cm of sample width. Most frequently, these meshes are produced on the basis of single- or double-guide stitches. They are made of polypropylene and polyester monofilament yarns with diameters in the range from 0.6 to 1.2 mm, characterized by high medical purity. The aim of the study was to develop a design methodology for mesh strength based on the geometrical construction of the stitch and the strength of the yarn. In the environment of the ProCAD warpknit 5 software, a simulated stretching process of the meshes, together with an analysis of their geometry changes, was carried out. Simulations were made for four selected representative stitches. The real parameters of the loop geometry of the meshes were measured both on a purpose-built measuring stand and on a tensile testing machine. A model of the mechanical stretching of warp-knitted meshes along the courses and wales was developed. The thesis was advanced that the force that breaks a loop of the warp-knitted fabric is the lowest of the breaking forces of the yarns linking the loop or the yarns that create the straight sections of the loop. This thesis was associated with the theory of strength based on the "weakest link" concept. Experimental verification of the model was carried out for the basic structure of the single-guide mesh. It has been shown that the real, relative strength of the mesh related to one course is equal to the strength of the yarn breakage in a loop, while the strength along the wales is close to the breaking strength of a single yarn. In relation to the specific construction of the medical mesh, based on the knowledge of the density of the loops structure, the a-jour mesh geometry and the yarns strength, it is possible, with high
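
    The weakest-link thesis stated above reduces to taking the minimum breaking force over a loop's constituent yarn segments, then scaling by the number of load-bearing loops across the sample width. A minimal numeric sketch with hypothetical forces (not measured values from the study):

```python
def loop_strength(link_forces_N):
    """Weakest-link model: a loop fails at the lowest breaking force
    among its constituent yarn segments (loop links and straight sections)."""
    return min(link_forces_N)

def mesh_strength(loops_per_5cm, link_forces_N):
    """Strength per 5 cm of sample width = number of load-bearing loops
    across the width times the strength of a single loop."""
    return loops_per_5cm * loop_strength(link_forces_N)

# Hypothetical yarn-segment breaking forces (N) for one loop:
forces = [28.0, 31.5, 25.0]
s = mesh_strength(5, forces)  # 5 loops across a 5 cm sample width
```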

  9. A systematic review and analysis of factors associated with methodological quality in laparoscopic randomized controlled trials.

    PubMed

    Antoniou, Stavros Athanasios; Andreou, Alexandros; Antoniou, George Athanasios; Bertsias, Antonios; Köhler, Gernot; Koch, Oliver Owen; Pointner, Rudolph; Granderath, Frank-Alexander

    2015-01-01

    Several methods for assessing methodological quality in randomized controlled trials (RCTs) have been developed over the past few years, but factors associated with quality in laparoscopic surgery have not yet been defined. The aim of this study was to investigate the relationship between bibliometric factors and the methodological quality of laparoscopic RCTs. The PubMed search engine was queried to identify RCTs on minimally invasive surgery published in 2012 in the 10 highest impact factor surgery journals and the 5 highest impact factor laparoscopic journals. Eligible studies were blindly assessed by two independent investigators using the Scottish Intercollegiate Guidelines Network (SIGN) tool for RCTs. Univariate and multivariate analyses were performed to identify potential associations with methodological quality. A total of 114 relevant RCTs were identified. More than half of the trials were of high or acceptable quality. Half of the reports provided comparative demographic data, and only 21% performed intention-to-treat analysis. RCTs with a sample size of at least 60 patients presented higher methodological quality (p = 0.025). On multiple regression, reporting on preoperative care and the experience level of the surgeons were independent factors of quality. PMID:25896540

  10. A Systematic Review of Brief Functional Analysis Methodology with Typically Developing Children

    ERIC Educational Resources Information Center

    Gardner, Andrew W.; Spencer, Trina D.; Boelter, Eric W.; DuBard, Melanie; Jennett, Heather K.

    2012-01-01

    Brief functional analysis (BFA) is an abbreviated assessment methodology derived from traditional extended functional analysis methods. BFAs are often conducted when time constraints in clinics, schools or homes are of concern. While BFAs have been used extensively to identify the function of problem behavior for children with disabilities, their…

  11. Nominating under Constraints: A Systematic Comparison of Unlimited and Limited Peer Nomination Methodologies in Elementary School

    ERIC Educational Resources Information Center

    Gommans, Rob; Cillessen, Antonius H. N.

    2015-01-01

    Children's peer relationships are frequently assessed with peer nominations. An important methodological issue is whether to collect unlimited or limited nominations. Some researchers have argued that the psychometric differences between both methods are negligible, while others have claimed that one is superior over the other. The current…

  12. Staffing by Design: A Methodology for Staffing Reference

    ERIC Educational Resources Information Center

    Ward, David; Phetteplace, Eric

    2012-01-01

    The growth in number and kind of online reference services has resulted in both new users consulting library research services as well as new patterns of service use. Staffing in-person and virtual reference services desks adequately requires a systematic analysis of patterns of use across service points in order to successfully meet fluctuating…

  13. USDA Nutrition Evidence Library: methodology used to identify topics and develop systematic review questions for the birth-to-24-mo population.

    PubMed

    Obbagy, Julie E; Blum-Kemelor, Donna M; Essery, Eve V; Lyon, Joan M G; Spahn, Joanne M

    2014-03-01

    The USDA's Nutrition Evidence Library (NEL) specializes in conducting food- and nutrition-related systematic reviews that are used to inform federal government decision making. To ensure the utility of NEL systematic reviews, the most relevant topics must be addressed, questions must be clearly focused and appropriate in scope, and review frameworks must reflect the state of the science. Identifying the optimal topics and questions requires input from a variety of stakeholders, including scientists with technical expertise, as well as government policy and program leaders. The objective of this article is to describe the rationale and NEL methodology for identifying topics and developing systematic review questions implemented as part of the "Evaluating the evidence base to support the inclusion of infants and children from birth to 24 months of age in the Dietary Guidelines for Americans--the B-24 Project." This is the first phase of a larger project designed to develop dietary guidance for the birth to 24-mo population in the United States. PMID:24452234

  14. A Formal Semantics for the SRI Hierarchical Program Design Methodology

    NASA Technical Reports Server (NTRS)

    Boyer, R. S.; Moore, J. S.

    1983-01-01

    A formal statement of what it means to use (a subset of) the methodology is presented. It is formally defined what it means to say that some specified module exists and that another module is correctly implemented on top of it. No attention is paid to motivation, either of the methodology or of its formal development. Concentration is entirely upon mathematical succinctness and precision. A discussion is presented of how to use certain INTERLISP programs which implement the formal definitions. Among these is a program which generates Floyd-like verification conditions sufficient to imply the correctness of a module implementation.

  15. Peer Review of a Formal Verification/Design Proof Methodology

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The role of formal verification techniques in system validation was examined. The value and the state of the art of performance proving for fault-tolerant computers were assessed. The investigation, development, and evaluation of performance-proving tools were reviewed, and the technical issues related to proof methodologies were examined and summarized.

  16. Visual Methodology in Classroom Inquiry: Enhancing Complementary Qualitative Research Designs

    ERIC Educational Resources Information Center

    Kingsley, Joanne

    2009-01-01

    This article presents the argument that combining visual methods with other qualitative research methods enhances the inherent strengths of each methodology and allows new understandings to emerge. These would otherwise remain hidden if only one method were used in isolation. In a qualitative inquiry of an elementary teacher's constructivist…

  17. Behavioral Methodology for Designing and Evaluating Applied Programs for Women.

    ERIC Educational Resources Information Center

    Thurston, Linda P.

    To be maximally effective in solving problems, researchers must place their methodological and theoretical models of science within social and political contexts. They must become aware of biases and assumptions and move toward a more valid perception of social realities. Psychologists must view women in the situational context within which…

  18. Integrated Controls-Structures Design Methodology for Flexible Spacecraft

    NASA Technical Reports Server (NTRS)

    Maghami, P. G.; Joshi, S. M.; Price, D. B.

    1995-01-01

    This paper proposes an approach for the design of flexible spacecraft, wherein the structural design and the control system design are performed simultaneously. The integrated design problem is posed as an optimization problem in which both the structural parameters and the control system parameters constitute the design variables, which are used to optimize a common objective function, thereby resulting in an optimal overall design. The approach is demonstrated by application to the integrated design of a geostationary platform, and to a ground-based flexible structure experiment. The numerical results obtained indicate that the integrated design approach generally yields spacecraft designs that are substantially superior to the conventional approach, wherein the structural design and control design are performed sequentially.

  19. Using QALYs in telehealth evaluations: a systematic review of methodology and transparency

    PubMed Central

    2014-01-01

    Background The quality-adjusted life-year (QALY) is a recognised outcome measure in health economic evaluations. The QALY incorporates individual preferences and identifies health gains by combining mortality and morbidity into one single index number. A literature review was conducted to examine and discuss the use of QALYs to measure outcomes in telehealth evaluations. Methods Evaluations were identified via a literature search in all relevant databases. Only economic evaluations measuring both costs and QALYs, using primary patient-level data, of two or more alternatives were included. Results A total of 17 economic evaluations estimating QALYs were identified. All evaluations used validated generic health-related quality of life (HRQoL) instruments to describe health states. They used accepted methods for transforming the quality scores into utility values. The methodology used varied between the evaluations. The evaluations used four different preference measures (EQ-5D, SF-6D, QWB and HUI3), and utility scores were elicited from the general population. Most studies reported the methodology used in calculating QALYs. The evaluations were less transparent in reporting utility weights at different time points and the variability around utilities and QALYs. Few made adjustments for differences in baseline utilities. The QALYs gained in the reviewed evaluations varied from 0.001 to 0.118, implying a small but positive effect of the telehealth intervention on patients' health. The evaluations reported mixed cost-effectiveness results. Conclusion The use of QALYs in telehealth evaluations has increased over the last few years. Different methodologies and utility measures have been used to calculate QALYs. A more harmonised methodology and utility measure is needed to ensure comparability across telehealth evaluations. PMID:25086443
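
    A QALY is the area under the utility-versus-time curve, with utilities on the 0 (dead) to 1 (full health) scale elicited from instruments such as EQ-5D. A minimal trapezoidal-rule sketch with hypothetical utility weights (not data from the reviewed evaluations), including the baseline-adjusted incremental gain the review says was often unreported:

```python
def qalys(times_years, utilities):
    """Area under the utility-time curve via the trapezoidal rule.
    Utilities are on the 0 (dead) to 1 (full health) scale."""
    total = 0.0
    for (t0, u0), (t1, u1) in zip(zip(times_years, utilities),
                                  zip(times_years[1:], utilities[1:])):
        total += (t1 - t0) * (u0 + u1) / 2.0
    return total

# Hypothetical follow-up: utilities measured at 0, 0.5 and 1 year,
# with equal baseline utilities in the two arms.
telehealth = qalys([0.0, 0.5, 1.0], [0.70, 0.78, 0.80])
usual_care = qalys([0.0, 0.5, 1.0], [0.70, 0.72, 0.73])
gain = telehealth - usual_care  # incremental QALYs
```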

  20. Preliminary methodology for design of stable drifts for the Yucca Mountain Project

    SciTech Connect

    Bauer, S.J.; Ehgartner, B.L.; Hardy, M.P.

    1990-10-01

    This paper discusses the Yucca Mountain Project, which is investigating the feasibility of locating a high-level radioactive nuclear waste repository at Yucca Mountain, Nevada. The conceptual design of the repository is described, and the design methodology is presented.

  1. Methodology for designing accelerated aging tests for predicting life of photovoltaic arrays

    NASA Technical Reports Server (NTRS)

    Gaines, G. B.; Thomas, R. E.; Derringer, G. C.; Kistler, C. W.; Bigg, D. M.; Carmichael, D. C.

    1977-01-01

    A methodology for designing aging tests in which life prediction was paramount was developed. The methodology builds upon experience with regard to aging behavior in those material classes which are expected to be utilized as encapsulant elements, viz., glasses and polymers, and upon experience with the design of aging tests. The experiences were reviewed, and results are discussed in detail.

  2. Establishing Equivalence: Methodological Progress in Group-Matching Design and Analysis

    ERIC Educational Resources Information Center

    Kover, Sara T.; Atwood, Amy K.

    2013-01-01

    This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs used in behavioral research on cognition and…

  3. Game Methodology for Design Methods and Tools Selection

    ERIC Educational Resources Information Center

    Ahmad, Rafiq; Lahonde, Nathalie; Omhover, Jean-françois

    2014-01-01

    Design process optimisation and intelligence are the key words of today's scientific community. A proliferation of methods has made design a convoluted area. Designers are usually afraid of selecting one method/tool over another and even expert designers may not necessarily know which method is the best to use in which circumstances. This…

  4. Response surface methodology and process optimization of sustained release pellets using Taguchi orthogonal array design and central composite design

    PubMed Central

    Singh, Gurinder; Pai, Roopa S.; Devi, V. Kusum

    2012-01-01

    Furosemide is a powerful diuretic and antihypertensive drug which has low bioavailability due to hepatic first pass metabolism and has a short half-life of 2 hours. To overcome the above drawback, the present study was carried out to formulate and evaluate sustained release (SR) pellets of furosemide for oral administration prepared by extrusion/spheronization. Drug Coat L-100 was used within the pellet core along with microcrystalline cellulose as the diluent, and the concentration of the selected binder was optimized to be 1.2%. The formulation was prepared with a drug to polymer ratio of 1:3. It was optimized using Design of Experiments, employing a 3² central composite design combined with response surface methodology to systematically optimize the process parameters. Dissolution studies were carried out with USP apparatus Type I (basket type) in both simulated gastric and intestinal pH. Statistical analysis, i.e., the two-tailed paired t test and one-way ANOVA of the in vitro data, showed a very significant (P≤0.05) difference in the dissolution profile of furosemide SR pellets compared with the pure drug and a commercial product. Validation of the process optimization study indicated an extremely high degree of prognostic ability. The study effectively undertook the development of optimized process parameters for the pelletization of furosemide pellets with excellent SR characteristics. PMID:22470891

  5. Total synthesis of vinblastine, related natural products, and key analogues and development of inspired methodology suitable for the systematic study of their structure-function properties.

    PubMed

    Sears, Justin E; Boger, Dale L

    2015-03-17

    Biologically active natural products composed of fascinatingly complex structures are often regarded as not amenable to traditional systematic structure-function studies enlisted in medicinal chemistry for the optimization of their properties beyond what might be accomplished by semisynthetic modification. Herein, we summarize our recent studies on the Vinca alkaloids vinblastine and vincristine, often considered as prototypical members of such natural products, that not only inspired the development of powerful new synthetic methodology designed to expedite their total synthesis but have subsequently led to the discovery of several distinct classes of new, more potent, and previously inaccessible analogues. With use of the newly developed methodology and in addition to ongoing efforts to systematically define the importance of each embedded structural feature of vinblastine, two classes of analogues already have been discovered that enhance the potency of the natural products >10-fold. In one instance, remarkable progress has also been made on the refractory problem of reducing Pgp transport responsible for clinical resistance with a series of derivatives made accessible only using the newly developed synthetic methodology. Unlike the removal of vinblastine structural features or substituents, which typically has a detrimental impact, the additions of new structural features have been found that can enhance target tubulin binding affinity and functional activity while simultaneously disrupting Pgp binding, transport, and functional resistance. Already analogues are in hand that are deserving of full preclinical development, and it is a tribute to the advances in organic synthesis that they are readily accessible even on a natural product of a complexity once thought refractory to such an approach. PMID:25586069

  6. A design methodology for robust failure detection and isolation

    NASA Technical Reports Server (NTRS)

    Pattipati, K. R.; Willsky, A. S.; Deckert, J. C.; Eterno, J. S.; Weiss, J. S.

    1984-01-01

    A decentralized failure detection and isolation (FDI) methodology, which is robust with respect to model uncertainties and noise, is presented. Redundancy metrics are developed, and optimization problems are posed for the choice of robust parity relations. Closed-form solutions for some special failure cases are given. Connections are drawn with other disciplines, and the use of the metrics to evaluate alternative FDI schemes is discussed.

  7. A Systematic Review of Methodology: Time Series Regression Analysis for Environmental Factors and Infectious Diseases

    PubMed Central

    Imai, Chisato; Hashizume, Masahiro

    2015-01-01

    Background: Time series analysis is suitable for investigations of relatively direct and short-term effects of exposures on outcomes. In environmental epidemiology studies, this method has been one of the standard approaches to assess impacts of environmental factors on acute non-infectious diseases (e.g. cardiovascular deaths), conventionally with generalized linear or additive models (GLM and GAM). However, the same analysis practices are often observed with infectious diseases, despite their substantial differences from non-infectious diseases, which may result in analytical challenges. Methods: Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, a systematic review was conducted to elucidate important issues in assessing the associations between environmental factors and infectious diseases using time series analysis with GLM and GAM. Published studies on the associations between weather factors and malaria, cholera, dengue, and influenza were targeted. Findings: Our review raised issues regarding the estimation of the susceptible population and exposure lag times, the adequacy of seasonal adjustments, the presence of strong autocorrelations, and the lack of a smaller observation time unit for outcomes (i.e. daily data). These concerns may be attributable to features specific to infectious diseases, such as transmission among individuals and complicated causal mechanisms. Conclusion: The consequence of not taking adequate measures to address these issues is distortion of the appropriate risk quantification of exposure factors. Future studies should pay careful attention to details and examine alternative models or methods that improve studies using time series regression analysis for environmental determinants of infectious diseases. PMID:25859149
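
    Two building blocks in such analyses — a distributed-lag design matrix for an exposure series, and a residual autocorrelation check — can be sketched as below. This is a hedged illustration with a hypothetical temperature series; a real analysis would fit a GLM/GAM with seasonal terms on top of these columns.

```python
import numpy as np

def lag_matrix(x, max_lag):
    """Design columns x[t-0], x[t-1], ..., x[t-max_lag] for a
    distributed-lag model; the first max_lag observations are dropped
    because they have no complete exposure history."""
    n = len(x)
    rows = [[x[t - k] for k in range(max_lag + 1)] for t in range(max_lag, n)]
    return np.array(rows, dtype=float)

def autocorr(e, lag=1):
    """Lag-k sample autocorrelation of residuals; values near zero
    suggest the model has absorbed the serial dependence."""
    e = np.asarray(e, dtype=float) - np.mean(e)
    return float(np.dot(e[:-lag], e[lag:]) / np.dot(e, e))

# Hypothetical daily mean temperatures; lags 0-2 enter the model.
temps = [20, 22, 25, 24, 23, 26, 28, 27]
L = lag_matrix(temps, max_lag=2)
```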

  9. A systematic review of mosquito coils and passive emanators: defining recommendations for spatial repellency testing methodologies.

    PubMed

    Ogoma, Sheila B; Moore, Sarah J; Maia, Marta F

    2012-01-01

    Mosquito coils, vaporizer mats and emanators confer protection against mosquito bites through the spatial action of emanated vapor or airborne pyrethroid particles. These products dominate the pest control market; therefore, it is vital to characterize mosquito responses elicited by the chemical actives and their potential for disease prevention. The aim of this review was to determine the effects of mosquito coils and emanators on mosquito responses that reduce human-vector contact, and to propose a scientific consensus on the terminologies and methodologies used for evaluation of product formats that could contain spatial chemical actives, including indoor residual spraying (IRS), long-lasting insecticide-treated nets (LLINs) and insecticide-treated materials (ITMs). The PubMed (National Centre for Biotechnology Information (NCBI), U.S. National Library of Medicine, NIH), MEDLINE, LILAC, Cochrane Library, IBECS and Armed Forces Pest Management Board Literature Retrieval System search engines were used to identify studies of pyrethroid-based coils and emanators with the key words "Mosquito coils", "Mosquito emanators" and "Spatial repellents". It was concluded that there is a need to improve the statistical reporting of studies and to reach consensus on the methodologies and terminologies used, through standardized testing guidelines. Despite differing evaluation methodologies, the data showed that coils and emanators induce mortality, deterrence and repellency, as well as reducing the ability of mosquitoes to feed on humans. Available data on outdoor efficacy, dose-response relationships and the effective distance of coils and emanators are inadequate for developing a target product profile (TPP), which will be required for such chemicals before optimized implementation can occur for maximum benefit in disease control. PMID:23216844

  10. A Design Methodology for Complex (E)-Learning. Innovative Session.

    ERIC Educational Resources Information Center

    Bastiaens, Theo; van Merrienboer, Jeroen; Hoogveld, Bert

    Human resource development (HRD) specialists are searching for instructional design models that accommodate e-learning platforms. Van Merrienboer proposed the four-component instructional design model (4C/ID model) for competency-based education. The model's basic message is that well-designed learning environments can always be described in terms…

  11. Researching experiences of terminal cancer: a systematic review of methodological issues and approaches.

    PubMed

    Harris, F M; Kendall, M; Bentley, A; Maguire, R; Worth, A; Murray, S; Boyd, K; Brown, D; Kearney, N; Sheikh, A

    2008-07-01

    The objectives of this review were to assess the methods and approaches applied to end-of-life cancer research based on papers focusing on approaches or methodological issues related to seeking the views of people affected by terminal cancer. A comprehensive search of 10 databases (January 1980-February 2004) was undertaken. References were screened, quality assessed and data extracted by two reviewers. Analysis followed a meta-narrative approach. Fifteen papers were included. They discussed 'traditional' approaches, such as focus groups, interviews, surveys, as well as innovative approaches allied to the arts. They reveal that mixed methods are gaining popularity. The emotional demands placed on researchers and the ethical issues involved in this research area were also discussed. We concluded that researchers should embrace innovative approaches from other areas of social science, such as the use of arts-based techniques. This may facilitate recruitment of the hard-to-reach groups and engage with experiences that may be otherwise difficult to verbalize. Although researching the needs of the dying carries challenges, these are not the exclusive domain of the cancer field. This study reveals that diverse methods, from research-based drama to postal questionnaires, can enhance end-of-life research. However, this review reveals the need for more methodological work to be undertaken and disseminated. PMID:18485015

  12. Breast cancer statistics and prediction methodology: a systematic review and analysis.

    PubMed

    Dubey, Ashutosh Kumar; Gupta, Umesh; Jain, Sonal

    2015-01-01

    Breast cancer is a menacing cancer that primarily affects women, and research into detecting it at an early stage is ongoing because the prospects of a cure are best when it is caught early. This study had two main objectives: first, to establish statistics for breast cancer, and second, to identify methodologies from previous studies that can help in its early-stage detection. Breast cancer incidence and mortality statistics for the UK, US, India and Egypt were considered. The findings show that overall mortality rates in the UK and US have improved because of awareness, improved medical technology and screening, but in India and Egypt the situation is less positive because of a lack of awareness. The methodological findings of this study suggest a combined framework based on data mining and evolutionary algorithms, which provides a strong bridge to improved classification and detection accuracy for breast cancer data. PMID:26028079

  13. Drift design methodology and preliminary application for the Yucca Mountain Site Characterization Project; Yucca Mountain Site Characterization Project

    SciTech Connect

    Hardy, M.P.; Bauer, S.J.

    1991-12-01

    Excavation stability in an underground nuclear waste repository is required during construction, emplacement, retrieval (if required), and closure phases to ensure worker health and safety, and to prevent development of potential pathways for radionuclide migration in the post-closure period. Stable excavations are developed by appropriate excavation procedures, design of the room shape, design and installation of rock support reinforcement systems, and implementation of appropriate monitoring and maintenance programs. In addition to the loads imposed by the in situ stress field, the repository drifts will be impacted by thermal loads developed after waste emplacement and, periodically, by seismic loads from naturally occurring earthquakes and underground nuclear events. A priori evaluation of stability is required for design of the ground support system, to confirm that the thermal loads are reasonable, and to support the license application process. In this report, a design methodology for assessing drift stability is presented. This is based on site conditions, together with empirical and analytical methods. Analytical numerical methods are emphasized at this time because empirical data are unavailable for excavations in welded tuff either at elevated temperatures or under seismic loads. The analytical methodology incorporates analysis of rock masses that are systematically jointed, randomly jointed, and sparsely jointed. In situ thermal and seismic loads are considered. Methods of evaluating the analytical results and estimating ground support requirements for the full range of expected ground conditions are outlined. The results of a preliminary application of the methodology using the limited available data are presented. 26 figs., 55 tabs.

  14. A transonic-small-disturbance wing design methodology

    NASA Technical Reports Server (NTRS)

    Phillips, Pamela S.; Waggoner, Edgar G.; Campbell, Richard L.

    1988-01-01

    An automated transonic design code has been developed which modifies an initial airfoil or wing in order to generate a specified pressure distribution. The design method uses an iterative approach that alternates between a potential-flow analysis and a design algorithm that relates changes in surface pressure to changes in geometry. The analysis code solves an extended small-disturbance potential-flow equation and can model a fuselage, pylons, nacelles, and a winglet in addition to the wing. A two-dimensional option is available for airfoil analysis and design. Several two- and three-dimensional test cases illustrate the capabilities of the design code.

  15. 77 FR 9256 - Design and Methodology for Postmarket Surveillance Studies Under Section 522 of the Federal Food...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-16

    ... HUMAN SERVICES Food and Drug Administration Design and Methodology for Postmarket Surveillance Studies... announcing a public workshop entitled ``Design and Methodology for Postmarket Surveillance Studies under... stakeholders with experience in epidemiology, statistics, and biomedical research to advance the design...

  16. A Systematic Methodology for Constructing High-Order Energy Stable WENO Schemes

    NASA Technical Reports Server (NTRS)

    Yamaleev, Nail K.; Carpenter, Mark H.

    2009-01-01

    A third-order Energy Stable Weighted Essentially Non-Oscillatory (ESWENO) finite difference scheme developed by Yamaleev and Carpenter [1] was proven to be stable in the energy norm for both continuous and discontinuous solutions of systems of linear hyperbolic equations. Herein, a systematic approach is presented that enables "energy stable" modifications for existing WENO schemes of any order. The technique is demonstrated by developing a one-parameter family of fifth-order upwind-biased ESWENO schemes; ESWENO schemes up to eighth order are presented in the appendix. New weight functions are also developed that provide (1) formal consistency, (2) much faster convergence for smooth solutions with an arbitrary number of vanishing derivatives, and (3) improved resolution near strong discontinuities.

  18. Influence of Glenosphere Design on Outcomes and Complications of Reverse Arthroplasty: A Systematic Review

    PubMed Central

    Lawrence, Cassandra; Williams, Gerald R.

    2016-01-01

    Background Different implant designs are utilized in reverse shoulder arthroplasty. The purpose of this systematic review was to evaluate the results of reverse shoulder arthroplasty using a traditional (Grammont) prosthesis and a lateralized prosthesis for the treatment of cuff tear arthropathy and massive irreparable rotator cuff tears. Methods A systematic review of the literature was performed via a search of two electronic databases. Two reviewers evaluated the quality of methodology and retrieved data from each included study. In cases where the outcomes data were similar between studies, the data were pooled using frequency-weighted mean values to generate summary outcomes. Results Thirteen studies met the inclusion and exclusion criteria. Demographics were similar between treatment groups. The frequency-weighted mean active external rotation was 24° in the traditional group and 46° in the lateralized group (p = 0.0001). Scapular notching was noted in 44.9% of patients in the traditional group compared to 5.4% of patients in the lateralized group (p = 0.0001). The rate of clinically significant glenoid loosening was 1.8% in the traditional group and 8.8% in the lateralized group (p = 0.003). Conclusions Both the traditional Grammont and the lateralized offset reverse arthroplasty designs can improve pain and function in patients with diagnoses of cuff tear arthropathy and irreparable rotator cuff tear. While a lateralized design can result in increased active external rotation and decreased rates of scapular notching, there may be a higher rate of glenoid baseplate loosening. PMID:27583112
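    The review's pooling step, combining similar outcomes across studies with frequency-weighted means, can be sketched as follows (the study means and sample sizes below are hypothetical, not values from the review):

```python
def freq_weighted_mean(values, ns):
    """Frequency-weighted mean: each study's value is weighted by its sample size."""
    assert len(values) == len(ns) and sum(ns) > 0
    return sum(v * n for v, n in zip(values, ns)) / sum(ns)

# Hypothetical external-rotation means (degrees) from three studies
means = [22.0, 25.0, 26.0]
sizes = [40, 60, 100]
pooled = freq_weighted_mean(means, sizes)
# (22*40 + 25*60 + 26*100) / 200 = 24.9 degrees
```

    Weighting by sample size lets larger studies contribute proportionally more to the summary outcome, which is the behavior the review describes.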

  19. Designing and Integrating Purposeful Learning in Game Play: A Systematic Review

    ERIC Educational Resources Information Center

    Ke, Fengfeng

    2016-01-01

    Via a systematic review of the literature on learning games, this article presents a systematic discussion on the design of intrinsic integration of domain-specific learning in game mechanics and game world design. A total of 69 articles ultimately met the inclusion criteria and were coded for the literature synthesis. Exemplary learning games…

  20. Improved FTA methodology and application to subsea pipeline reliability design.

    PubMed

    Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan

    2014-01-01

    An innovative logic tree, Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different thinking approach for risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, the rather subjective nature of FTA node discovery is significantly reduced and the resulting mathematical calculations for quantitative analysis are greatly simplified. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. Resulting improvements are summarized in comparison table form. PMID:24667681
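    For context on the quantitative calculations that the abstract says FET simplifies, classical fault-tree gate arithmetic under an assumption of independent basic events looks like this (a generic FTA sketch with hypothetical probabilities, not the paper's FET construction):

```python
def p_and(*ps):
    """AND gate: all basic events occur (independence assumed)."""
    out = 1.0
    for p in ps:
        out *= p
    return out

def p_or(*ps):
    """OR gate: at least one basic event occurs (independence assumed)."""
    none = 1.0
    for p in ps:
        none *= (1.0 - p)
    return 1.0 - none

# Hypothetical subsea-pipeline top event:
# external corrosion AND (coating failure OR anode failure)
p_top = p_and(0.01, p_or(0.05, 0.02))  # 0.01 * (1 - 0.95*0.98) = 0.00069
```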

  2. A Methodology for Quantifying Certain Design Requirements During the Design Phase

    NASA Technical Reports Server (NTRS)

    Adams, Timothy; Rhodes, Russel

    2005-01-01

    A methodology for developing and balancing quantitative design requirements for safety, reliability, and maintainability has been proposed. Conceived as the basis of a more rational approach to the design of spacecraft, the methodology would also be applicable to the design of automobiles, washing machines, television receivers, or almost any other commercial product. Heretofore, it has been common practice to start by determining the requirements for reliability of elements of a spacecraft or other system to ensure a given design life for the system. Next, safety requirements are determined by assessing the total reliability of the system and adding redundant components and subsystems necessary to attain safety goals. As thus described, common practice leaves the maintainability burden to fall to chance; therefore, there is no control of recurring costs or of the responsiveness of the system. The means that have been used in assessing maintainability have been oriented toward determining the logistical sparing of components so that the components are available when needed. The process established for developing and balancing quantitative requirements for safety (S), reliability (R), and maintainability (M) derives and integrates NASA's top-level safety requirements and the controls needed to obtain program key objectives for safety and recurring cost (see figure). Being quantitative, the process conveniently uses common mathematical models. Even though the process is shown as being worked from the top down, it can also be worked from the bottom up. This process uses three math models: (1) the binomial distribution (greater-than-or-equal-to case), (2) reliability for a series system, and (3) the Poisson distribution (less-than-or-equal-to case). The zero-fail case for the binomial distribution approximates the commonly known exponential distribution or "constant failure rate" distribution. Either model can be used. The binomial distribution was selected for
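    The three math models named above are standard results and can be sketched as follows (a minimal illustration, not the authors' implementation):

```python
import math

def binom_at_least(n, k, p):
    """P(at least k successes in n trials), each with probability p
    (the greater-than-or-equal-to case)."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def series_reliability(element_reliabilities):
    """A series system works only if every element works: R = product of R_i."""
    r = 1.0
    for ri in element_reliabilities:
        r *= ri
    return r

def poisson_at_most(lam, k):
    """P(at most k events) when the event count is Poisson with mean lam
    (the less-than-or-equal-to case)."""
    return sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k + 1))

# Zero-fail binomial case: P(no failures) = (1 - p)**n, which is close to
# exp(-n * p), the constant-failure-rate (exponential) form the abstract notes.
```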

  3. Propulsion integration of hypersonic air-breathing vehicles utilizing a top-down design methodology

    NASA Astrophysics Data System (ADS)

    Kirkpatrick, Brad Kenneth

    In recent years, a focus of aerospace engineering design has been the development of advanced design methodologies and frameworks to account for increasingly complex and integrated vehicles. Techniques such as parametric modeling, global vehicle analyses, and interdisciplinary data sharing have been employed in an attempt to improve the design process. The purpose of this study is to introduce a new approach to integrated vehicle design known as the top-down design methodology. In the top-down design methodology, the main idea is to relate design changes on the vehicle system and sub-system level to a set of over-arching performance and customer requirements. Rather than focusing on the performance of an individual system, the system is analyzed in terms of the net effect it has on the overall vehicle and other vehicle systems. This detailed level of analysis can only be accomplished through the use of high fidelity computational tools such as Computational Fluid Dynamics (CFD) or Finite Element Analysis (FEA). The utility of the top-down design methodology is investigated through its application to the conceptual and preliminary design of a long-range hypersonic air-breathing vehicle for a hypothetical next generation hypersonic vehicle (NHRV) program. System-level design is demonstrated through the development of the nozzle section of the propulsion system. From this demonstration of the methodology, conclusions are made about the benefits, drawbacks, and cost of using the methodology.

  4. Probabilistic Design Methodology and its Application to the Design of an Umbilical Retract Mechanism

    NASA Astrophysics Data System (ADS)

    Onyebueke, Landon; Ameye, Olusesan

    2002-10-01

    A lot has been learned from past experience with structural and machine element failures. The understanding of failure modes and the application of an appropriate design analysis method can lead to improved structural and machine element safety as well as serviceability. To apply Probabilistic Design Methodology (PDM), all uncertainties are modeled as random variables with selected distribution types, means, and standard deviations. It is quite difficult to achieve a robust design without considering the randomness of the design parameters, which is the case in the use of the Deterministic Design Approach. The US Navy has a fleet of submarine-launched ballistic missiles. An umbilical plug joins the missile to the submarine in order to provide electrical and cooling water connections. As the missile leaves the submarine, an umbilical retract mechanism retracts the umbilical plug clear of the advancing missile after disengagement during launch and restrains the plug in the retracted position. The design of the current retract mechanism in use was based on the deterministic approach, which puts emphasis on factor of safety. A new umbilical retract mechanism that is simpler in design, lighter in weight, more reliable, easier to adjust, and more cost effective has become desirable since this will increase the performance and efficiency of the system. This paper reports on a recent project performed at Tennessee State University for the US Navy that involved the application of PDM to the design of an umbilical retract mechanism. This paper demonstrates how the use of PDM led to the minimization of weight and cost, and the maximization of reliability and performance.

  6. Systematic reviews with language restrictions and no author contact have lower overall credibility: a methodology study

    PubMed Central

    Wang, Zhen; Brito, Juan P; Tsapas, Apostolos; Griebeler, Marcio L; Alahdab, Fares; Murad, Mohammad Hassan

    2015-01-01

    Background High-quality systematic reviews (SRs) require rigorous approaches to identify, appraise, select, and synthesize research evidence relevant to a specific question. In this study, we evaluated the association between two steps in the conduct of an SR – restricting the search to English, and author contact for missing data – and the overall credibility of an SR. Methods All SRs cited by the Endocrine Society’s Clinical Practice Guidelines published from October 2006 through January 2012 were included. The main outcome was the overall A Measurement Tool to Assess Systematic Reviews (AMSTAR) score, as a surrogate of SR credibility. Nonparametric Kruskal–Wallis tests and multivariable linear regression models were used to investigate the association between language restriction, author contact for missing data, and the overall AMSTAR score. Results In all, 69 SRs were included in the analysis. Only 31 SRs (45%) reported searching non-English literature, with an average AMSTAR score of 7.90 (standard deviation [SD] = 1.64). SRs that reported language restriction received significantly lower AMSTAR scores (mean = 5.25, SD = 2.32) (P<0.001). Only 30 SRs (43%) reported contacting authors for missing data, and these received, on average, 2.59 more AMSTAR points (SD = 1.95) than those who did not (P<0.001). In multivariable analyses, AMSTAR score was significantly correlated with language restriction (beta = −1.31, 95% confidence interval [CI]: −2.62, −0.01, P=0.05) and author contact for missing data (beta = 2.16, 95% CI: 0.91, 3.41, P=0.001). However, after adjusting for compliance with reporting guidelines, language restriction was no longer significantly associated with the AMSTAR score. Conclusion Fewer than half of the SRs conducted to support the clinical practice guidelines we examined reported contacting study authors or searched non-English literature. SRs that did not conduct these two steps had lower quality scores, suggesting the importance of

  7. Participant Observation, Anthropology Methodology and Design Anthropology Research Inquiry

    ERIC Educational Resources Information Center

    Gunn, Wendy; Løgstrup, Louise B.

    2014-01-01

    Within the design studio, and across multiple field sites, the authors compare involvement of research tools and materials during collaborative processes of designing. Their aim is to trace temporal dimensions (shifts/ movements) of where and when learning takes place along different sites of practice. They do so by combining participant…

  8. Participatory Pattern Workshops: A Methodology for Open Learning Design Inquiry

    ERIC Educational Resources Information Center

    Mor, Yishay; Warburton, Steven; Winters, Niall

    2012-01-01

    In order to promote pedagogically informed use of technology, educators need to develop an active, inquisitive, design-oriented mindset. Design Patterns have been demonstrated as powerful mediators of theory-praxis conversations yet widespread adoption by the practitioner community remains a challenge. Over several years, the authors and their…

  9. Flexible design clinical trial methodology in regulatory applications.

    PubMed

    Hung, H M James; Wang, Sue-Jane; O'Neill, Robert

    2011-06-15

    Adaptive designs or flexible designs in a broader sense have increasingly been considered in planning pivotal registration clinical trials. Sample size reassessment design and adaptive selection design are two of such designs that appear in regulatory applications. At the design stage, consideration of sample size reassessment at an interim time of the trial should lead to extensive discussion about how to appropriately size the trial. Additionally, careful attention needs to be paid to the issue of how the size of the trial is impacted by the requirement that the final p-value of the trial meets the specific threshold of a clinically meaningful effect. These issues are not straightforward and will be discussed in this work. In a trial design that allows selection between a pre-specified patient subgroup and the initially planned overall patient population based on the accumulating data, there is an issue of what the 'overall' population means. In addition, it is critically important to know how such selection influences the validity of statistical inferences on the potentially modified overall population. This work presents the biases that may incur under adaptive patient selection designs. PMID:21344470

  10. LWR design decision methodology: Phase II. Final report

    SciTech Connect

    1981-01-01

    Techniques were identified to augment the existing design process at the component and system levels in order to optimize cost and safety across alternative system designs. The method was demonstrated using the Surry Low Pressure Injection System (LPIS). Three possible backfit options were analyzed for the Surry LPIS, assessing the safety level of each option and estimating the acquisition and installation costs for each. (DLC)

  11. A Methodology for Multi-Criteria Information System Design.

    ERIC Educational Resources Information Center

    Chandler, John S.; DeLutis, Thomas G.

    The complexity of the design problem for modern computer-based information systems has increased significantly over its predecessors. The problem presented to the designer is to configure a system which satisfies user criteria while achieving system-resource-related performance criteria. The purpose of this paper is to present an evaluation…

  12. Systems analysis and design methodologies: practicalities and use in today's information systems development efforts.

    PubMed

    Jerva, M

    2001-05-01

    Historically, systems analysis and design methodologies have been used as a guide in software development. Such methods provide structure to software engineers in their efforts to create quality solutions in the real world of information systems. This article looks at the elements that constitute a systems analysis methodology and examines the historical development of systems analysis in software development. It concludes with observations on the strengths and weaknesses of four methodologies and the state of the art of practice today. PMID:11378979

  13. Systematic Neighborhood Observations at High Spatial Resolution: Methodology and Assessment of Potential Benefits

    PubMed Central

    Leonard, Tammy C. M.; Caughy, Margaret O'Brien; Mays, Judith K.; Murdoch, James C.

    2011-01-01

    There is a growing body of public health research documenting how characteristics of neighborhoods are associated with differences in the health status of residents. However, little is known about how the spatial resolution of neighborhood observational data or community audits affects the identification of neighborhood differences in health. We developed a systematic neighborhood observation instrument for collecting data at very high spatial resolution (we observe each parcel independently) and used it to collect data in a low-income minority neighborhood in Dallas, TX. In addition, we collected data on the health status of individuals residing in this neighborhood. We then assessed the inter-rater reliability of the instrument and compared the costs and benefits of using data at this high spatial resolution. Our instrument provides a reliable and cost-effective method for collecting neighborhood observational data at high spatial resolution, which then allows researchers to explore the impact of varying geographic aggregations. Furthermore, these data facilitate a demonstration of the predictive accuracy of self-reported health status. We find that ordered logit models of health status using observational data at different spatial resolutions produce different results. This implies a need to analyze the variation in correlative relationships at different geographic resolutions when there is no solid theoretical rationale for choosing a particular resolution. We argue that neighborhood data at high spatial resolution greatly facilitate the evaluation of alternative geographic specifications in studies of neighborhood and health. PMID:21673983

  14. Towards uniform accelerometry analysis: a standardization methodology to minimize measurement bias due to systematic accelerometer wear-time variation.

    PubMed

    Katapally, Tarun R; Muhajarine, Nazeem

    2014-05-01

    Accelerometers are predominantly used to objectively measure the entire range of activity intensities - sedentary behaviour (SED), light physical activity (LPA) and moderate to vigorous physical activity (MVPA). However, studies consistently report results without accounting for systematic accelerometer wear-time variation (within and between participants), jeopardizing the validity of these results. This study describes the development of a standardization methodology to understand and minimize measurement bias due to wear-time variation. Accelerometry is generally conducted over seven consecutive days, with participants' data commonly being considered 'valid' only if wear-time is at least 10 hours/day. However, even within 'valid' data, there could be systematic wear-time variation. To explore this variation, accelerometer data from the Smart Cities, Healthy Kids study (www.smartcitieshealthykids.com) were analyzed descriptively and with repeated-measures multivariate analysis of variance (MANOVA). Subsequently, a standardization method was developed, in which case-specific observed wear-time is controlled to an analyst-specified time period. Next, case-specific accelerometer data are interpolated to this controlled wear-time to produce standardized variables. To understand discrepancies owing to wear-time variation, all analyses were conducted pre- and post-standardization. Descriptive analyses revealed systematic wear-time variation, both between and within participants. Pre- and post-standardized descriptive analyses of SED, LPA and MVPA revealed a persistent and often significant trend of wear-time's influence on activity. SED was consistently higher on weekdays before standardization; however, this trend was reversed post-standardization. Even though MVPA was significantly higher on weekdays both pre- and post-standardization, the magnitude of this difference decreased post-standardization. Multivariable analyses with standardized SED, LPA and MVPA as outcome
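    The standardization step described above, controlling each case's wear-time and interpolating activity to it, might be sketched as a simple proportional rescaling (one plausible reading of the method; the function name, field names and numbers are illustrative, not from the study):

```python
def standardize_to_wear_time(minutes_by_intensity, observed_wear_min, target_wear_min=600):
    """Rescale observed activity minutes to a controlled (analyst-specified)
    wear-time via simple linear scaling."""
    if observed_wear_min <= 0:
        raise ValueError("observed wear-time must be positive")
    scale = target_wear_min / observed_wear_min
    return {k: v * scale for k, v in minutes_by_intensity.items()}

# Participant wore the device for 12 h (720 min); standardize to 10 h (600 min)
day = {"SED": 480.0, "LPA": 180.0, "MVPA": 60.0}
std = standardize_to_wear_time(day, observed_wear_min=720, target_wear_min=600)
# SED 400.0, LPA 150.0, MVPA 50.0: totals are now comparable across participants
```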

  15. Systematic review finds major deficiencies in sample size methodology and reporting for stepped-wedge cluster randomised trials

    PubMed Central

    Martin, James; Taljaard, Monica; Girling, Alan; Hemming, Karla

    2016-01-01

    Background Stepped-wedge cluster randomised trials (SW-CRT) are increasingly being used in health policy and services research, but unless they are conducted and reported to the highest methodological standards, they are unlikely to be useful to decision-makers. Sample size calculations for these designs require allowance for clustering, time effects and repeated measures. Methods We carried out a methodological review of SW-CRTs up to October 2014. We assessed adherence to reporting each of the 9 sample size calculation items recommended in the 2012 extension of the CONSORT statement to cluster trials. Results We identified 32 completed trials and 28 independent protocols published between 1987 and 2014. Of these, 45 (75%) reported a sample size calculation, with a median of 5.0 (IQR 2.5–6.0) of the 9 CONSORT items reported. Of those that reported a sample size calculation, the majority, 33 (73%), allowed for clustering, but just 15 (33%) allowed for time effects. There was a small increase in the proportions reporting a sample size calculation (from 64% before to 84% after publication of the CONSORT extension, p=0.07). The type of design (cohort or cross-sectional) was not reported clearly in the majority of studies, but cohort designs seemed to be most prevalent. Sample size calculations in cohort designs were particularly poor with only 3 out of 24 (13%) of these studies allowing for repeated measures. Discussion The quality of reporting of sample size items in stepped-wedge trials is suboptimal. There is an urgent need for dissemination of the appropriate guidelines for reporting and methodological development to match the proliferation of the use of this design in practice. Time effects and repeated measures should be considered in all SW-CRT power calculations, and there should be clarity in reporting trials as cohort or cross-sectional designs. PMID:26846897

  16. Detecting and overcoming systematic bias in high-throughput screening technologies: a comprehensive review of practical issues and methodological solutions.

    PubMed

    Caraus, Iurie; Alsuwailem, Abdulaziz A; Nadon, Robert; Makarenkov, Vladimir

    2015-11-01

    Significant efforts have been made recently to improve data throughput and data quality in screening technologies related to drug design. The modern pharmaceutical industry relies heavily on high-throughput screening (HTS) and high-content screening (HCS) technologies, which include small molecule, complementary DNA (cDNA) and RNA interference (RNAi) types of screening. Data generated by these screening technologies are subject to several environmental and procedural systematic biases, which introduce errors into the hit identification process. We first review systematic biases typical of HTS and HCS screens. We highlight that study design issues and the way in which data are generated are crucial for providing unbiased screening results. Considering various data sets, including the publicly available ChemBank data, we assess the rates of systematic bias in experimental HTS by using plate-specific and assay-specific error detection tests. We describe main data normalization and correction techniques and introduce a general data preprocessing protocol. This protocol can be recommended for academic and industrial researchers involved in the analysis of current or next-generation HTS data. PMID:25750417
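
As one concrete instance of the normalization techniques such reviews survey, a plate-level robust z-score (based on the plate median and MAD) is a common way to dampen plate-specific systematic bias. The function below is a generic sketch, not the preprocessing protocol proposed in the paper.

```python
import statistics

# Robust z-scores for one HTS plate: (x - median) / (1.4826 * MAD).
# Generic illustration of plate-level normalization, not the paper's protocol.
def robust_z_scores(plate_measurements):
    med = statistics.median(plate_measurements)
    mad = statistics.median([abs(x - med) for x in plate_measurements])
    scale = 1.4826 * mad  # consistency factor for normally distributed data
    return [(x - med) / scale for x in plate_measurements]

scores = robust_z_scores([0.9, 1.0, 1.1, 1.0, 5.0])  # last well is a likely hit
```

Median and MAD are preferred over mean and standard deviation here because a handful of true hits on a plate would otherwise distort the normalization itself.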

  17. A Practical Methodology for the Systematic Development of Multiple Choice Tests.

    ERIC Educational Resources Information Center

    Blumberg, Phyllis; Felner, Joel

    Using Guttman's facet design analysis, four parallel forms of a multiple-choice test were developed. A mapping sentence, logically representing the universe of content of a basic cardiology course, specified the facets of the course and the semantic structural units linking them. The facets were: cognitive processes, disease priority, specific…

  18. eLSE Methodology: A Systematic Approach to the e-Learning Systems Evaluation

    ERIC Educational Resources Information Center

    Lanzilotti, Rosa; Ardito, Carmelo; Costabile, Maria F.; De Angeli, Antonella

    2006-01-01

    The quality of e-learning systems is one of the important topics that researchers have been investigating in recent years. This paper refines the concept of quality of e-learning systems and proposes a new framework, called TICS (Technology, Interaction, Content, Services), which focuses on the most important aspects to be considered when designing or…

  19. Information System Design Methodology Based on PERT/CPM Networking and Optimization Techniques.

    ERIC Educational Resources Information Center

    Bose, Anindya

    The dissertation attempts to demonstrate that the program evaluation and review technique (PERT)/Critical Path Method (CPM) or some modified version thereof can be developed into an information system design methodology. The methodology utilizes PERT/CPM which isolates the basic functional units of a system and sets them in a dynamic time/cost…

  20. Structural Design Methodology Based on Concepts of Uncertainty

    NASA Technical Reports Server (NTRS)

    Lin, K. Y.; Du, Jiaji; Rusk, David

    2000-01-01

    In this report, an approach to damage-tolerant aircraft structural design is proposed based on the concept of an equivalent "Level of Safety" that incorporates past service experience in the design of new structures. The discrete "Level of Safety" for a single inspection event is defined as the complement of the probability that a single flaw size larger than the critical flaw size for residual strength of the structure exists, and that the flaw will not be detected. The cumulative "Level of Safety" for the entire structure is the product of the discrete "Level of Safety" values for each flaw of each damage type present at each location in the structure. Based on the definition of "Level of Safety", a design procedure was identified and demonstrated on a composite sandwich panel for various damage types, with results showing the sensitivity of the structural sizing parameters to the relative safety of the design. The "Level of Safety" approach has broad potential application to damage-tolerant aircraft structural design with uncertainty.
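
The "Level of Safety" definitions above translate directly into arithmetic: the discrete value is the complement of the joint probability that a critical flaw exists and goes undetected (independence of the two events is assumed in this sketch), and the cumulative value is the product over all flaws. The probabilities below are illustrative only:

```python
# Sketch of the "Level of Safety" arithmetic defined in the abstract above.
def discrete_level_of_safety(p_flaw_exceeds_critical, p_not_detected):
    """Complement of (critical flaw exists AND the flaw goes undetected)."""
    return 1.0 - p_flaw_exceeds_critical * p_not_detected

def cumulative_level_of_safety(discrete_values):
    """Product of discrete values over every flaw, damage type, and location."""
    result = 1.0
    for v in discrete_values:
        result *= v
    return result

los = cumulative_level_of_safety(
    [discrete_level_of_safety(0.01, 0.5), discrete_level_of_safety(0.02, 0.25)]
)
```

Because the cumulative value is a product of terms below one, every additional flaw location lowers the structure-wide level of safety, which is what drives the sizing trade-offs mentioned in the abstract.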

  1. Impact of Molecular Diagnostics for Tuberculosis on Patient-Important Outcomes: A Systematic Review of Study Methodologies

    PubMed Central

    Schumacher, Samuel G.; Sohn, Hojoon; Qin, Zhi Zhen; Gore, Genevieve; Davis, J. Lucian; Denkinger, Claudia M.; Pai, Madhukar

    2016-01-01

    Background Several reviews on the accuracy of Tuberculosis (TB) Nucleic Acid Amplification Tests (NAATs) have been performed but the evidence on their impact on patient-important outcomes has not been systematically reviewed. Given the recent increase in research evaluating such outcomes and the growing list of TB NAATs that will reach the market over the coming years, there is a need to bring together the existing evidence on impact, rather than accuracy. We aimed to assess the approaches that have been employed to measure the impact of TB NAATs on patient-important outcomes in adults with possible pulmonary TB and/or drug-resistant TB. Methods We first develop a conceptual framework to clarify through which mechanisms the improved technical performance of a novel TB test may lead to improved patient outcomes and outline which designs may be used to measure them. We then systematically review the literature on studies attempting to assess the impact of molecular TB diagnostics on such outcomes and provide a narrative synthesis of designs used, outcomes assessed and risk of bias across different study designs. Results We found 25 eligible studies that assessed a wide range of outcomes and utilized a variety of experimental and observational study designs. Many potentially strong design options have never been used. We found that much of the available evidence on patient-important outcomes comes from a small number of settings with particular epidemiological and operational context and that confounding, time trends and incomplete outcome data receive insufficient attention. Conclusions A broader range of designs should be considered when designing studies to assess the impact of TB diagnostics on patient outcomes and more attention needs to be paid to the analysis as concerns about confounding and selection bias become relevant in addition to those on measurement that are of greatest concern in accuracy studies. PMID:26954678

  2. New methodology for shaft design based on life expectancy

    NASA Technical Reports Server (NTRS)

    Loewenthal, S. H.

    1986-01-01

    The design of power transmission shafting for reliability has not historically received a great deal of attention. However, weight sensitive aerospace and vehicle applications, and those where the penalties of shaft failure are great, require greater confidence in shaft design than earlier methods provided. This report summarizes a fatigue strength-based design method for sizing shafts under variable amplitude loading histories for limited or nonlimited service life. Moreover, application factors such as press-fitted collars, shaft size, residual stresses from shot peening or plating, and corrosive environments can be readily accommodated into the framework of the analysis. Examples are given which illustrate the use of the method, pointing out the large life penalties due to occasional cyclic overloads.
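
A standard building block for the variable-amplitude loading histories mentioned above is Palmgren-Miner cumulative damage, D = sum of n_i/N_i, with failure predicted as D approaches 1. The sketch below illustrates that general rule with made-up numbers; it is not claimed to reproduce the report's specific method or application factors:

```python
# Palmgren-Miner cumulative damage for a variable-amplitude load history:
# D = sum(n_i / N_i), where n_i cycles are applied at a stress level whose
# fatigue life is N_i cycles. Illustrative sketch only.
def miner_damage(cycles_and_lives):
    return sum(n / N for n, N in cycles_and_lives)

# 1000 cycles at a level with a 10,000-cycle life, plus 500 cycles at a
# 1,000-cycle life (an occasional overload contributing most of the damage):
damage = miner_damage([(1000, 10_000), (500, 1_000)])
```

The example shows why occasional cyclic overloads carry a large life penalty: the 500 overload cycles contribute five times the damage of the 1000 nominal cycles.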

  3. Structural design methodologies for ceramic-based material systems

    NASA Technical Reports Server (NTRS)

    Duffy, Stephen F.; Chulya, Abhisak; Gyekenyesi, John P.

    1991-01-01

    One of the primary pacing items for realizing the full potential of ceramic-based structural components is the development of new design methods and protocols. The focus here is on low temperature, fast-fracture analysis of monolithic, whisker-toughened, laminated, and woven ceramic composites. A number of design models and criteria are highlighted. Public domain computer algorithms, which aid engineers in predicting the fast-fracture reliability of structural components, are mentioned. Emphasis is not placed on evaluating the models, but instead is focused on the issues relevant to the current state of the art.

  4. New Methods in Design Education: The Systemic Methodology and the Use of Sketch in the Conceptual Design Stage

    ERIC Educational Resources Information Center

    Westermeyer, Juan Carlos Briede; Ortuno, Bernabe Hernandis

    2011-01-01

    This study describes the application of a new concurrent product design methodology in the context of industrial design education. The sketch has been utilized many times as a tool of creative expression, especially in the conceptual design stage, in an intuitive way and a little out of the context of the reality needs that the…

  5. Design Based Research Methodology for Teaching with Technology in English

    ERIC Educational Resources Information Center

    Jetnikoff, Anita

    2015-01-01

    Design based research (DBR) is an appropriate method for small scale educational research projects involving collaboration between teachers, students and researchers. It is particularly useful in collaborative projects where an intervention is implemented and evaluated in a grounded context. The intervention can be technological, or a new program…

  6. A Methodology for the Design of Learning Environments

    ERIC Educational Resources Information Center

    Page, Tom; Thorsteinsson, Gisli

    2009-01-01

    This article presents and discusses some theoretical starting points and design considerations for addressing emotional and aesthetic aspects of virtual learning environments (VLEs) for support of ubiquitous teaching, studying and learning. In this article, we note that a VLE should be viewed upon as an interactive and sensations arousing…

  7. Kids in the city study: research design and methodology

    PubMed Central

    2011-01-01

    Background Physical activity is essential for optimal physical and psychological health but substantial declines in children's activity levels have occurred in New Zealand and internationally. Children's independent mobility (i.e., outdoor play and traveling to destinations unsupervised), an integral component of physical activity in childhood, has also declined radically in recent decades. Safety-conscious parenting practices, car reliance and auto-centric urban design have converged to produce children living increasingly sedentary lives. This research investigates how urban neighborhood environments can support, enable, or restrict children's independent mobility, thereby influencing physical activity accumulation and participation in daily life. Methods/Design The study is located in six Auckland, New Zealand neighborhoods, diverse in terms of urban design attributes, particularly residential density. Participants comprise 160 children aged 9-11 years and their parents/caregivers. Objective measures (global positioning systems, accelerometers, geographical information systems, observational audits) assessed children's independent mobility and physical activity, neighborhood infrastructure, and streetscape attributes. Parent and child neighborhood perceptions and experiences were assessed using qualitative research methods. Discussion This study is one of the first internationally to examine the association of specific urban design attributes with child independent mobility. Using appropriate, best-practice objective measures, this study provides robust epidemiological information regarding the relationships between the built environment and health outcomes for this population. PMID:21781341

  8. Designing institutions and incentives in hospitals: an organization economics methodology.

    PubMed

    Eid, Florence

    2004-01-01

    Recent seminal developments in organization economics, namely the decision rights approach, offer an opportunity to shed new light on an old question, the design of effective institutions. Drawing on conclusions about how and why firm organizational boundaries change, the decision rights approach is used in this article as an analytical lens to develop a new method for assessing institutional and incentive design in restructured hospitals. The article explains the decision rights approach and shows how the Decision Rights Framework developed from it is a way of mapping incentive structures to allow a comparative assessment of institutional design, an understudied area, as most work on hospitals has focused on assessing equity versus efficiency tradeoffs. The new method is illustrated drawing on one example from a case study of an innovative self-corporatized hospital in Lebanon that was at the vanguard of hospital restructuring legislation, adopted for system-wide reforms. A country with a strong private sector tradition, Lebanon was fertile territory for analyzing how high-powered incentive schemes emerge from a public sector setting, in a manner similar to the evolution of a firm in reaction to market forces. Among the findings revealed by the approach is that key to "good" design is the identification of requisite incentives and the matching up of incentives with goals through decision rights allocations. The appropriate organizational form is then a logical result. PMID:15839525

  9. Optimum design criteria for a synchronous reluctance motor with concentrated winding using response surface methodology

    NASA Astrophysics Data System (ADS)

    Lee, Jung-Ho; Park, Seong-June; Jeon, Su-Jin

    2006-04-01

    This paper presents an optimization procedure using response surface methodology (RSM) to determine design parameters for reducing torque ripple. RSM combines the experimental design method with the finite element method and is well suited to building an analytical model of a complex problem involving many interactions among design variables.
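
To make the RSM idea concrete, the sketch below fits a one-variable second-order response surface y = a + b*x + c*x^2 by least squares (normal equations solved with Cramer's rule) and locates its stationary point, i.e. the design value minimizing the fitted response such as torque ripple. The data are synthetic; a real RSM study like the one above would use several design variables with interaction terms:

```python
# Least-squares fit of a one-variable second-order response-surface model
# y = a + b*x + c*x**2, via the 3x3 normal equations and Cramer's rule.
def fit_quadratic(xs, ys):
    n = len(xs)
    s1 = sum(xs); s2 = sum(x**2 for x in xs)
    s3 = sum(x**3 for x in xs); s4 = sum(x**4 for x in xs)
    t0 = sum(ys)
    t1 = sum(x * y for x, y in zip(xs, ys))
    t2 = sum(x * x * y for x, y in zip(xs, ys))
    M = [[n, s1, s2], [s1, s2, s3], [s2, s3, s4]]
    T = [t0, t1, t2]

    def det3(m):  # determinant of a 3x3 matrix
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    d = det3(M)
    coeffs = []
    for col in range(3):  # Cramer's rule: replace one column with T
        Mc = [row[:] for row in M]
        for r in range(3):
            Mc[r][col] = T[r]
        coeffs.append(det3(Mc) / d)
    return coeffs  # [a, b, c]

def stationary_point(b, c):
    """Design value minimizing the fitted response (valid when c > 0)."""
    return -b / (2 * c)

# Synthetic torque-ripple measurements at five design-variable settings:
a, b, c = fit_quadratic([-2, -1, 0, 1, 2], [9.0, 3.0, 1.0, 3.0, 9.0])
best_x = stationary_point(b, c)
```

The fitted surface then stands in for expensive finite-element runs when searching for the ripple-minimizing design point.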

  10. Designing a Methodology for Future Air Travel Scenarios

    NASA Technical Reports Server (NTRS)

    Wuebbles, Donald J.; Baughcum, Steven L.; Gerstle, John H.; Edmonds, Jae; Kinnison, Douglas E.; Krull, Nick; Metwally, Munir; Mortlock, Alan; Prather, Michael J.

    1992-01-01

    -subsonic future fleet. The methodology, procedures, and recommendations for the development of future HSCT and the subsonic fleet scenarios used for this evaluation are discussed.

  11. Advanced design methodologies and novel applications of reflectarray antennas

    NASA Astrophysics Data System (ADS)

    Nayeri, Payam

    Reflectarray antennas combine the numerous advantages of printed antenna arrays and reflector antennas and create a hybrid high-gain antenna with a low profile, low mass, and diversified radiation performance. Reflectarrays are now emerging as the new generation of high-gain antennas for long-distance communications. In this dissertation, some advanced concepts demonstrating novel features of reflectarray antennas are presented. • First, various approaches for radiation analysis of reflectarray antennas are described and implemented. Numerical results are then presented for a variety of systems and the advantages, limitations, and accuracy of these approaches are discussed and compared with each other. • A broadband technique by using sub-wavelength elements is proposed and prototypes are fabricated and tested. This technique enables the reflectarray to achieve a significant bandwidth improvement with no additional cost. • Infrared reflectarray antennas are studied for possible applications in concentrating solar power systems. Material losses, an important design issue at infrared frequencies, are investigated and reflectarrays consisting of dielectric resonant elements are proposed with low-loss features at infrared frequencies. • Multi-beam reflectarray antennas are studied and it is demonstrated that by optimizing the phase of the elements, a desirable multi-beam performance can be achieved using a single feed. Local and global phase-only optimization techniques have been implemented. Two Ka-band quad-beam prototypes with symmetric and asymmetric beams have been fabricated and tested. • Different approaches for beam-scanning with reflectarray antennas are also reviewed and it is shown that for moderately wide angle beam-scanning, utilizing a feed displacement technique is more suitable than an aperture phase tuning approach. A feed displacement beam-scanning design with novel aperture phase distribution is proposed for the reflectarray antenna, and is further

  12. Model-Driven Design: Systematically Building Integrated Blended Learning Experiences

    ERIC Educational Resources Information Center

    Laster, Stephen

    2010-01-01

    Developing and delivering curricula that are integrated and that use blended learning techniques requires a highly orchestrated design. While institutions have demonstrated the ability to design complex curricula on an ad-hoc basis, these projects are generally successful at a great human and capital cost. Model-driven design provides a…

  13. Software Design Methodology Migration for a Distributed Ground System

    NASA Technical Reports Server (NTRS)

    Ritter, George; McNair, Ann R. (Technical Monitor)

    2002-01-01

    The Marshall Space Flight Center's (MSFC) Payload Operations Center (POC) ground system has been developed and has evolved over a period of about 10 years. During this time the software processes have migrated from more traditional to more contemporary development processes. The new software processes still emphasize requirements capture, software configuration management, design documentation, and making sure the products that have been developed are accountable to the initial requirements. This paper will give an overview of how the software processes have evolved, highlighting the positives as well as the negatives. In addition, we will mention the COTS tools that have been integrated into the processes and how the COTS tools have provided value to the project.

  14. Development of a combustor analytical design methodology for liquid rocket engines

    NASA Technical Reports Server (NTRS)

    Pieper, Jerry L.; Muss, Jeff

    1989-01-01

    The development of a user friendly computerized methodology for the design and analysis of liquid propellant rocket engine combustion chambers is described. An overview of the methodology, consisting of a computer program containing an appropriate modular assembly of existing industry wide performance and combustion stability models, is presented. These models are linked with an interactive front end processor enabling the user to define the performance and stability traits of an existing design (point analysis) or to create the essential design features of a combustor to meet specific performance goals and combustion stability (point design). Plans for demonstration and verification of this methodology are also presented. These plans include the creation of combustor designs using the methodology, together with predictions of the performance and combustion stability for each design. A verification test program of 26 hot fire tests with up to four designs created using this methodology is described. This testing is planned using LOX/RP-1 propellants with a thrust level of approx. 220,000 N (50,000 lbf).

  15. Theories and Research Methodologies for Design-Based Implementation Research: Examples from Four Cases

    ERIC Educational Resources Information Center

    Russell, Jennifer Lin; Jackson, Kara; Krumm, Andrew E.; Frank, Kenneth A.

    2013-01-01

    Design-Based Implementation Research is the process of engaging "learning scientists, policy researchers, and practitioners in a model of collaborative, iterative, and systematic research and development" designed to address persistent problems of teaching and learning. Addressing persistent problems of teaching and learning requires…

  16. Reporting of planned statistical methods in published surgical randomised trial protocols: a protocol for a methodological systematic review

    PubMed Central

    Madden, Kim; Arseneau, Erika; Evaniew, Nathan; Smith, Christopher S; Thabane, Lehana

    2016-01-01

    Introduction Poor reporting can lead to inadequate presentation of data, confusion regarding research methodology used, selective reporting of results, and other misinformation regarding health research. One of the most recent attempts to improve quality of reporting comes from the Standard Protocol Items: Recommendations for Interventional Trials (SPIRIT) Group, which makes recommendations for the reporting of protocols. In this report, we present a protocol for a systematic review of published surgical randomised controlled trial (RCT) protocols, with the purpose of assessing the reporting quality and completeness of the statistical aspects. Methods We will include all published protocols of randomised trials that investigate surgical interventions. We will search MEDLINE, EMBASE, and CENTRAL for relevant studies. Author pairs will independently review all titles, abstracts, and full texts identified by the literature search, and extract data using a structured data extraction form. We will extract the following: year of publication, country, sample size, description of study population, description of intervention and control, primary outcome, important methodological qualities, and quality of reporting of planned statistical methods based on the SPIRIT guidelines. Ethics and dissemination The results of this review will demonstrate the quality of statistical reporting of published surgical RCT protocols. This knowledge will inform recommendations to surgeons, researchers, journal editors and peer reviewers, and other knowledge users that focus on common deficiencies in reporting and how to rectify them. Ethics approval for this study is not required. We will disseminate the results of this review in peer-reviewed publications and conference presentations, and at a doctoral independent study of oral defence. PMID:27259528

  17. Prescribed wake methodologies for wind turbine design codes

    SciTech Connect

    Galbraith, R.A.M.; Coton, F.N.; Robison, D.J.

    1995-12-31

    Prescribed wake performance assessment models have been developed successfully for both vertical (VAWT) and horizontal (HAWT) axis wind turbines. In the case of the VAWT model the Beddoes and Leishman dynamic stall model has been incorporated. This has resulted in a fully unsteady 3-D code, establishing extremely accurate performance prediction across a wide range of operating conditions. Comparison of performance estimates from the prescribed wake model with those from free wake models has shown excellent correlation. To date, the HAWT model has been developed for the consideration of steady axial and yawed inflows. In the axial flow case, comparisons of predicted power output with field data and free wake predictions have shown excellent agreement. Full validation of the yawed flow model is currently underway, with very encouraging initial results. The capabilities of the HAWT model are currently being extended by the inclusion of the Beddoes and Leishman dynamic stall model. Consideration of the significant unsteady aerodynamic influences acting on HAWTs while operating in yaw will significantly improve the model's performance. The power of this modelling technique is the significant reduction in the computational overhead it offers. The prescribed wake models offer performance estimates of comparable detail and accuracy to those from free vortex analyses in minutes rather than hours. As such these models are highly suited to design assessment, with particular application to fatigue load analysis.

  18. Weibull-Based Design Methodology for Rotating Aircraft Engine Structures

    NASA Technical Reports Server (NTRS)

    Zaretsky, Erwin; Hendricks, Robert C.; Soditus, Sherry

    2002-01-01

    The NASA Energy Efficient Engine (E³-Engine) is used as the basis of a Weibull-based life and reliability analysis. Each component's life, and thus the engine's life, is defined by high-cycle fatigue (HCF) or low-cycle fatigue (LCF). Knowing the cumulative life distribution of each of the components making up the engine, as represented by a Weibull slope, is a prerequisite to predicting the life and reliability of the entire engine. As the engine Weibull slope increases, the predicted lives decrease. The predicted engine lives L5 (95% probability of survival) of approximately 17,000 and 32,000 hr do correlate with current engine maintenance practices without and with refurbishment, respectively. The individual high-pressure turbine (HPT) blade lives necessary to obtain a blade system life L0.1 (99.9% probability of survival) of 9000 hr for Weibull slopes of 3, 6 and 9 are 47,391, 20,652 and 15,658 hr, respectively. For a design life of the HPT disks having probable points of failure equal to or greater than 36,000 hr at a probability of survival of 99.9%, the predicted disk system life L0.1 can vary from 9,408 to 24,911 hr.
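
The life quantities quoted above (L5, L0.1) follow from the two-parameter Weibull survival function: at survival probability R, the life is L = eta * (-ln R)^(1/beta), where eta is the characteristic life and beta the Weibull slope. The characteristic-life value in the sketch below is illustrative, not taken from the E³-Engine analysis:

```python
import math

# Life at a given survival probability for a two-parameter Weibull
# distribution: L = eta * (-ln(R)) ** (1 / beta).
# eta (characteristic life) below is an illustrative assumption.
def weibull_life(eta_hr, beta, survival):
    return eta_hr * (-math.log(survival)) ** (1.0 / beta)

l5 = weibull_life(50_000.0, 3.0, 0.95)     # L5: 95% probability of survival
l01 = weibull_life(50_000.0, 3.0, 0.999)   # L0.1: 99.9% probability of survival
```

Requiring a higher probability of survival (L0.1 versus L5) pushes the quoted life far below the characteristic life, which is why the blade lives in the abstract must greatly exceed the 9000-hr system target.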

  19. A design methodology for evolutionary air transportation networks

    NASA Astrophysics Data System (ADS)

    Yang, Eunsuk

    The air transportation demand at large hubs in the U.S. is anticipated to double in the near future. Current runway construction plans at selected airports can relieve some capacity and delay problems, but many are doubtful that this solution is sufficient to accommodate the anticipated demand growth in the National Airspace System (NAS). With the worsening congestion problem, it is imperative to seek alternative solutions other than costly runway construction. In this respect, many researchers and organizations have been building models and performing analyses of the NAS. However, the complexity and size of the problem result in an overwhelming task for transportation system modelers. This research seeks to compose an active design algorithm for an evolutionary airline network model so as to include network-specific control properties. An airline network designer, referred to as a network architect, can use this tool to assess the possibilities of gaining more capacity by changing the network configuration. Since the Airline Deregulation Act of 1978, the airline service network has evolved into a distinct Hub-and-Spoke (H&S) network. Enplanement demand on the H&S network is the sum of Origin-Destination (O-D) demand and transfer demand. Even though the flight or enplanement demand is a function of O-D demand and passenger routings on the airline network, the distinction between enplanement and O-D demand is not often made. Instead, many current demand forecast practices are based on scale-ups from the enplanements, which include the demand to and from transferring network hubs. Based on this research, it was found that the current demand prediction practice can be improved by dissecting enplanements further into smaller pieces of information. As a result, enplanement demand is decomposed into intrinsic and variable parts. The proposed intrinsic demand model is based on the concept of 'true' O-D demand which includes the direction of each round trip

  20. Applied design methodology for lunar rover elastic wheel

    NASA Astrophysics Data System (ADS)

    Cardile, Diego; Viola, Nicole; Chiesa, Sergio; Rougier, Alessandro

    2012-12-01

    In recent years an increasing interest in Moon surface operations has been experienced. In the future, robotic and manned missions of Moon surface exploration will be fundamental in order to lay the groundwork for more ambitious space exploration programs. Surface mobility systems will be the key elements to ensure an efficient and safe Moon exploration. Future lunar rovers are likely to be heavier and able to travel longer distances than the previously developed Moon rover systems. The Lunar Roving Vehicle (LRV) is the only manned rover that has so far been launched and used on the Moon surface. Its mobility system included flexible wheels that cannot be scaled to the heavier and longer-range vehicles. Thus the previously developed wheels are likely not to be suitable for the new larger vehicles. Taking all these considerations into account, on the basis of the system requirements and assumptions, several wheel concepts have been discussed and evaluated through a trade-off analysis. Semi-empirical equations have been utilized to predict the wheel geometrical characteristics, as well as to estimate the motion resistances and the ability of the system to generate thrust. A numerical model has also been implemented, in order to define the whole wheel design in more detail, in terms of wheel geometry and physical properties. As a result of the trade-off analysis, the ellipse wheel concept has shown the best behavior in terms of stiffness, mass budget and dynamic performance. The results presented in the paper have been obtained in cooperation with Thales Alenia Space-Italy and Sicme motori, in the framework of a regional program called STEPS. STEPS (Sistemi e Tecnologie per l'EsPlorazione Spaziale) is a research project co-financed by the Piedmont Region and by firms and universities of the Piedmont Aerospace District within the P.O.R-F.E.S.R. 2007-2013 program.

  1. Design-Based Research: Is This a Suitable Methodology for Short-Term Projects?

    ERIC Educational Resources Information Center

    Pool, Jessica; Laubscher, Dorothy

    2016-01-01

    This article reports on a design-based methodology of a thesis in which a fully face-to-face contact module was converted into a blended learning course. The purpose of the article is to report on how design-based phases, in the form of micro-, meso- and macro-cycles were applied to improve practice and to generate design principles. Design-based…

  2. Advanced piloted aircraft flight control system design methodology. Volume 2: The FCX flight control design expert system

    NASA Technical Reports Server (NTRS)

    Myers, Thomas T.; Mcruer, Duane T.

    1988-01-01

    The development of a comprehensive and eclectic methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. The FCX expert system as presently developed is only a limited prototype capable of supporting basic lateral-directional FCS design activities related to the design example used. FCX presently supports design of only one FCS architecture (yaw damper plus roll damper) and the rules are largely focused on Class IV (highly maneuverable) aircraft. Despite this limited scope, the major elements which appear necessary for application of knowledge-based software concepts to flight control design were assembled, and thus FCX represents a prototype which can be tested, critiqued and evolved in an ongoing process of development.

  3. Experimental design in caecilian systematics: phylogenetic information of mitochondrial genomes and nuclear rag1.

    PubMed

    San Mauro, Diego; Gower, David J; Massingham, Tim; Wilkinson, Mark; Zardoya, Rafael; Cotton, James A

    2009-08-01

    In molecular phylogenetic studies, a major aspect of experimental design concerns the choice of markers and taxa. Although previous studies have investigated the phylogenetic performance of different genes and the effectiveness of increasing taxon sampling, their conclusions are partly contradictory, probably because they are highly context specific and dependent on the group of organisms used in each study. Goldman introduced a method for experimental design in phylogenetics based on the expected information to be gained that has barely been used in practice. Here we use this method to explore the phylogenetic utility of mitochondrial (mt) genes, mt genomes, and nuclear rag1 for studies of the systematics of caecilian amphibians, as well as the effect of taxon addition on the stabilization of a controversial branch of the tree. Overall phylogenetic information estimates per gene, specific estimates per branch of the tree, estimates for combined (mitogenomic) data sets, and estimates as a hypothetical new taxon is added to different parts of the caecilian tree are calculated and compared. In general, the most informative data sets are those for mt transfer and ribosomal RNA genes. Our results also show at which positions in the caecilian tree the addition of taxa has the greatest potential to increase phylogenetic information with respect to the controversial relationships of Scolecomorphus, Boulengerula, and all other teresomatan caecilians. These positions are, as intuitively expected, mostly (but not all) adjacent to the controversial branch. Generating whole mitogenomic and rag1 data for additional taxa joining the Scolecomorphus branch may be a more efficient strategy than sequencing a similar amount of additional nucleotides spread across the current caecilian taxon sampling. The methodology employed in this study allows an a priori evaluation and testable predictions of the appropriateness of particular experimental designs to solve specific questions at

  4. Conjecture Mapping: An Approach to Systematic Educational Design Research

    ERIC Educational Resources Information Center

    Sandoval, William

    2014-01-01

    Design research is strongly associated with the learning sciences community, and in the 2 decades since its conception it has become broadly accepted. Yet within and without the learning sciences there remains confusion about how to do design research, with most scholarship on the approach describing what it is rather than how to do it. This…

  5. Designing Needs Statements in a Systematic Iterative Way

    ERIC Educational Resources Information Center

    Verstegen, D. M. L.; Barnard, Y. F.; Pilot, A.

    2009-01-01

    Designing specifications for technically advanced instructional products, such as e-learning, simulations or simulators requires different kinds of expertise. The SLIM method proposes to involve all stakeholders from the beginning in a series of workshops under the guidance of experienced instructional designers. These instructional designers…

  6. Analysis and Design of Fuselage Structures Including Residual Strength Prediction Methodology

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.

    1998-01-01

    The goal of this research project is to develop and assess methodologies for the design and analysis of fuselage structures accounting for residual strength. Two primary objectives are included in this research activity: development of structural analysis methodology for predicting the residual strength of fuselage shell-type structures; and development of accurate, efficient analysis, design and optimization tools for fuselage shell structures. Assessment of these tools for robustness, efficiency, and usage in a fuselage shell design environment will be integrated with these two primary research objectives.

  7. The inclusion of ergonomic tools in the informational, conceptual and preliminary phases of the product design methodology.

    PubMed

    Medeiros, Ivan Luiz de; Batiz, Eduardo Concepción

    2012-01-01

    The process of product development has received special attention as it is recognized as a source of competitive gain. Through its systematic use, companies reduce costs, increase quality and decrease development time. However, one can find products being launched on the market that cause dissatisfaction to their users, and in consequence, if customers feel harmed or injured they will no longer purchase a product from the same brand. This concerns only the commercial aspect; usually the danger of an accident or injury is not even considered. This paper is based on a master's dissertation and used a literature review to build the repertoire, analyzing the methodologies applied by product design engineers, designers and ergonomists. The analysis results demonstrate the inefficiency of the design methodologies in addressing ergonomic issues. The contribution of this work lies in the suggestion to include ergonomic tools in all phases of product development and in the presentation of a table of tools that points out the most suitable time of application and the expected results. PMID:22316854

  8. Design methodology for high-speed video processing system based on signal integrity analysis

    NASA Astrophysics Data System (ADS)

    Wang, Rui; Zhang, Hao

    2009-07-01

    On account of the high performance requirements of video processing systems and the shortcomings of conventional circuit design methods, a design methodology based on signal integrity (SI) theory was proposed for a high-speed video processing system built around TI's digital signal processor TMS320DM642. With this methodology, the PCB stack-up and construction of the system, as well as the transmission line characteristic impedance, are first set and calculated with the impedance control tool Si8000. Then crucial signals, such as the SDRAM data lines, are simulated and analyzed with IBIS models so that reasonable layout and routing rules are established. Finally the system's high-density PCB design is completed on the Cadence SPB15.7 platform. The design result shows that this methodology can effectively restrain signal reflection, crosstalk, rail collapse noise and electromagnetic interference (EMI). Thus it significantly improves the stability of the system and shortens development cycles.
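    As a rough illustration of the characteristic-impedance control step that a tool such as Si8000 automates, the classic IPC-2141 surface-microstrip approximation can be sketched (the geometry values below are assumed examples, and the formula is only valid over a limited range of trace geometries):

```python
import math

def microstrip_z0(w, h, t, er):
    """Characteristic impedance (ohms) of a surface microstrip trace, using the
    classic IPC-2141 approximation: Z0 = 87/sqrt(er + 1.41) * ln(5.98*h / (0.8*w + t)).
    w = trace width, h = dielectric height, t = trace thickness (same units); er = relative permittivity."""
    return 87.0 / math.sqrt(er + 1.41) * math.log(5.98 * h / (0.8 * w + t))

# Assumed example: 0.35 mm trace over 0.2 mm FR-4 dielectric, 35 um copper
z0 = microstrip_z0(w=0.35e-3, h=0.2e-3, t=0.035e-3, er=4.3)
```

    Widening the trace lowers the impedance, which is the basic trade-off a stack-up/impedance tool iterates on when targeting, say, 50 ohm single-ended lines.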

  9. Designing Trend-Monitoring Sounds for Helicopters: Methodological Issues and an Application

    ERIC Educational Resources Information Center

    Edworthy, Judy; Hellier, Elizabeth; Aldrich, Kirsteen; Loxley, Sarah

    2004-01-01

    This article explores methodological issues in sonification and sound design arising from the design of helicopter monitoring sounds. Six monitoring sounds (each with 5 levels) were tested for similarity and meaning with 3 different techniques: hierarchical cluster analysis, linkage analysis, and multidimensional scaling. In Experiment 1,…

  10. NATIONAL RESEARCH PROGRAM ON DESIGN-BASED/MODEL-ASSISTED SURVEY METHODOLOGY FOR AQUATIC RESOURCES

    EPA Science Inventory

    We expect to accomplish five major goals with the Program. The first is to extend design-based statistical methodology to cover the unique circumstances encountered in EMAP. The second is to make both existing and newly-developed model-assisted design-based statistical tools m...

  11. A Step-by-Step Design Methodology for a Base Case Vanadium Redox-Flow Battery

    ERIC Educational Resources Information Center

    Moore, Mark; Counce, Robert M.; Watson, Jack S.; Zawodzinski, Thomas A.; Kamath, Haresh

    2012-01-01

    The purpose of this work is to develop an evolutionary procedure to be used by Chemical Engineering students for the base-case design of a Vanadium Redox-Flow Battery. The design methodology is based on the work of Douglas (1985) and provides a profitability analysis at each decision level so that more profitable alternatives and directions can be…

  12. Three-dimensional viscous design methodology for advanced technology aircraft supersonic inlet systems

    NASA Technical Reports Server (NTRS)

    Anderson, B. H.

    1983-01-01

    A broad program to develop advanced, reliable, and user oriented three-dimensional viscous design techniques for supersonic inlet systems, and encourage their transfer into the general user community is discussed. Features of the program include: (1) develop effective methods of computing three-dimensional flows within a zonal modeling methodology; (2) ensure reasonable agreement between said analysis and selective sets of benchmark validation data; (3) develop user orientation into said analysis; and (4) explore and develop advanced numerical methodology.

  13. Advanced piloted aircraft flight control system design methodology. Volume 1: Knowledge base

    NASA Technical Reports Server (NTRS)

    Mcruer, Duane T.; Myers, Thomas T.

    1988-01-01

    The development of a comprehensive and eclectic methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. While theory and associated computational means are an important aspect of the design methodology, the lore, knowledge and experience elements, which guide and govern applications, are critical features. This material is presented as summary tables, outlines, recipes, empirical data, lists, etc., which encapsulate a great deal of expert knowledge. Much of this is presented in topical knowledge summaries which are attached as Supplements. The composite of the supplements and the main body elements constitutes a first cut at a Mark 1 Knowledge Base for manned-aircraft flight control.

  14. Systematic design of membership functions for fuzzy-logic control: A case study on one-stage partial nitritation/anammox treatment systems.

    PubMed

    Boiocchi, Riccardo; Gernaey, Krist V; Sin, Gürkan

    2016-10-01

    A methodology is developed to systematically design the membership functions of fuzzy-logic controllers for multivariable systems. The methodology consists of a systematic derivation of the critical points of the membership functions as a function of predefined control objectives. Several constrained optimization problems corresponding to different qualitative operation states of the system are defined and solved to identify, in a consistent manner, the critical points of the membership functions for the input variables. The consistently identified critical points, together with the linguistic rules, determine the long term reachability of the control objectives by the fuzzy logic controller. The methodology is highlighted using a single-stage side-stream partial nitritation/Anammox reactor as a case study. As a result, a new fuzzy-logic controller for high and stable total nitrogen removal efficiency is designed. Rigorous simulations are carried out to evaluate and benchmark the performance of the controller. The results demonstrate that the novel control strategy is capable of rejecting the long-term influent disturbances, and can achieve a stable and high TN removal efficiency. Additionally, the controller was tested, and showed robustness, against measurement noise levels typical for wastewater sensors. A feedforward-feedback configuration using the present controller would give even better performance. In comparison, a previously developed fuzzy-logic controller using merely expert and intuitive knowledge performed worse. This proved the importance of using a systematic methodology for the derivation of the membership functions for multivariable systems. These results are promising for future applications of the controller in real full-scale plants. Furthermore, the methodology can be used as a tool to help systematically design fuzzy logic control applications for other biological processes. PMID:27390035
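    Membership functions with critical points of the kind derived by this methodology are commonly piecewise-linear. A generic sketch (not the paper's actual controller or variable names) of a triangular membership function defined by three critical points:

```python
def triangular(x, a, b, c):
    """Triangular membership function with critical points a < b < c.
    Returns the degree of membership of x, in [0, 1]."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)   # rising edge
    return (c - x) / (c - b)       # falling edge

# Hypothetical example: membership in 'effluent ammonium is HIGH', peaking at 5 mg N/L
mu = triangular(4.0, a=2.0, b=5.0, c=8.0)
```

    The optimization step described in the abstract would, in effect, choose the critical points a, b, c for each input variable so that the fuzzy rules meet the control objectives across operating states.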

  15. Design methodology for integrated downstream separation systems in an ethanol biorefinery

    NASA Astrophysics Data System (ADS)

    Mohammadzadeh Rohani, Navid

    and obtaining energy security. On the other hand, Process Integration (PI), defined by Natural Resources Canada as the combination of activities which aim at improving process systems, their unit operations and their interactions in order to maximize the efficiency of using water, energy and raw materials, can also help biorefineries lower their energy consumption and improve their economics. Energy integration techniques such as pinch analysis, adopted by different industries over the years, have ensured that heat sources within a plant supply the demand internally and decrease external utility consumption. Therefore, adopting energy integration is one of the ways biorefinery technology owners can improve their overall economics, both in their process development and in their business model. The objective of this thesis is to propose a methodology for designing integrated downstream separation in a biorefinery. This methodology is tested in an ethanol biorefinery case study. Several alternative separation techniques are evaluated for their energy consumption and economics in three different scenarios: stand-alone without energy integration, stand-alone with internal energy integration, and integrated with Kraft. The energy consumption and capital costs of the separation techniques are assessed in each scenario, the costs and benefits of integration are determined, and finally the best alternative is found through techno-economic metrics. Another advantage of this methodology is the use of a graphical tool which provides insights on decreasing energy consumption by modifying the process conditions. The pivot point of this work is the use of a novel energy integration method called Bridge analysis. This systematic method, originally intended for retrofit situations, is used here for integration with the Kraft process. Integration potentials are identified through this method and savings are presented for each design. In stand-alone with

  16. Integrated Controls-Structures Design Methodology: Redesign of an Evolutionary Test Structure

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Elliot, Kenny B.; Joshi, Suresh M.

    1997-01-01

    An optimization-based integrated controls-structures design methodology for a class of flexible space structures is described, and the phase-0 Controls-Structures-Integration evolutionary model (CEM), a laboratory testbed at NASA Langley, is redesigned using this integrated design methodology. The integrated controls-structures design is posed as a nonlinear programming problem to minimize the control effort required to maintain a specified line-of-sight pointing performance, under persistent white noise disturbance. Static and dynamic dissipative control strategies are employed for feedback control, and parameters of these controllers are considered as the control design variables. Sizes of strut elements in various sections of the CEM are used as the structural design variables. Design guides for the struts are developed and employed in the integrated design process, to ensure that the redesigned structure can be effectively fabricated. The superiority of the integrated design methodology over the conventional design approach is demonstrated analytically by observing a significant reduction in the average control power needed to maintain specified pointing performance with the integrated design approach.

  17. Systematic design of cantilever beams for muscle research.

    PubMed

    McLaughlin, R J

    1977-05-01

    Experimental studies of muscle contraction often involve difficult problems in the design of cantilever beams for movable levers, transducers, or mechanical supports. Equations are presented for the calculation of mass, inertia, stress distribution, strain, deflection curve, compliance, and resonant frequency of uniform or nonuniform cantilever beams made of structural materials of different density or elastic modulus. Formulas are listed for solid, thick-wall, and thin-wall uniform beams of rectangular and circular cross section. Physical properties including density, elastic and torsional moduli, stress and strain limits, thermal expansion coefficients, Poisson's ratio, and certain elastic-modulus-to-density ratios are tabulated for structural materials including common metals, glass, plastic, and wood. A graphical design procedure is presented based on a chart containing loci of constant beam parameter values as a function of beam length and height or diameter, for the simple geometries. The choice of structural material is discussed for design problems with typical constraints, and examples are given of the design of beams of nonuniform cross section. Methods for extending the design chart to other geometries and materials are included. PMID:863848
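    For the simplest case in the paper's scope, a uniform solid rectangular cantilever, the standard Euler-Bernoulli textbook relations can be sketched as follows (general engineering formulas, not reproduced from the paper; the example material values are assumptions):

```python
import math

def cantilever_rect(E, rho, L, b, h):
    """Uniform solid rectangular cantilever of length L, width b, thickness h:
    tip stiffness, tip compliance, and first resonant frequency (Euler-Bernoulli).
    E = elastic modulus (Pa), rho = density (kg/m^3)."""
    I = b * h**3 / 12.0            # area moment of inertia (m^4)
    A = b * h                      # cross-section area (m^2)
    k = 3.0 * E * I / L**3         # tip stiffness for an end load (N/m)
    compliance = 1.0 / k           # tip deflection per unit load (m/N)
    # First bending mode: f1 = (lambda1^2 / 2pi) * sqrt(E*I / (rho*A*L^4)), lambda1 ~ 1.8751
    f1 = (1.8751**2 / (2.0 * math.pi)) * math.sqrt(E * I / (rho * A * L**4))
    return k, compliance, f1

# Assumed example: steel lever beam, 50 mm long, 5 mm wide, 1 mm thick
k, c, f1 = cantilever_rect(E=200e9, rho=7850.0, L=0.05, b=0.005, h=0.001)
```

    These relations make the design trade-off in the chart-based procedure explicit: halving the length raises stiffness eightfold and resonant frequency fourfold, at the cost of working room around the preparation.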

  18. Co-design of RAD and ETHICS methodologies: a combination of information system development methods

    NASA Astrophysics Data System (ADS)

    Nasehi, Arezo; Shahriyari, Salman

    2011-12-01

    Co-design is a new trend in the social world which tries to capture different ideas in order to use the most appropriate features for a system. In this paper, the co-design of two information system methodologies is considered: rapid application development (RAD) and effective technical and human implementation of computer-based systems (ETHICS). We considered the characteristics of these methodologies to assess the possibility of a co-design or combination of them for developing an information system. To this end, four different aspects are analyzed: social or technical approach, user participation and user involvement, job satisfaction, and overcoming resistance to change. Finally, a case study using a quantitative method is analyzed in order to examine the possibility of co-design using these factors. The paper concludes that RAD and ETHICS are appropriate for co-design and offers some suggestions for carrying it out.

  19. Systematic designs of buffers in macropipelines of systolic arrays

    SciTech Connect

    Wah, B.W.; Aboelaze, M.; Shang, W.

    1988-02-01

    In a macropipeline of systolic arrays, outputs of one systolic array in a given format have to be fed as inputs to another systolic array in a possibly different format. A common memory becomes a bottleneck and limits the number of systolic arrays that can be connected together. In this paper, the authors study designs of buffers to convert data from one format to another. The minimum number of buffers is determined by a dynamic-programming algorithm with Θ(n²) computational complexity, where n is the problem size. A general-purpose converter to convert data from any distribution to any other in a subset of the possible data distributions is also proposed. Lastly, buffer designs for a macropipeline to perform feature extraction and pattern classification are used to exemplify the design process.

  20. Phylogenetic information and experimental design in molecular systematics.

    PubMed Central

    Goldman, N

    1998-01-01

    Despite the widespread perception that evolutionary inference from molecular sequences is a statistical problem, there has been very little attention paid to questions of experimental design. Previous consideration of this topic has led to little more than an empirical folklore regarding the choice of suitable genes for analysis, and to dispute over the best choice of taxa for inclusion in data sets. I introduce what I believe are new methods that permit the quantification of phylogenetic information in a sequence alignment. The methods use likelihood calculations based on Markov-process models of nucleotide substitution allied with phylogenetic trees, and allow a general approach to optimal experimental design. Two examples are given, illustrating realistic problems in experimental design in molecular phylogenetics and suggesting more general conclusions about the choice of genomic regions, sequence lengths and taxa for evolutionary studies. PMID:9787470
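    As a hedged, single-branch illustration of quantifying expected information with likelihood under a Markov substitution model (a toy reduction, not Goldman's full tree-based method), the expected Fisher information per site about a branch length t under the Jukes-Cantor model follows from the probability p(t) that the two sequence ends differ:

```python
import math

def jc_fisher_info(t):
    """Expected Fisher information per site about branch length t (expected
    substitutions per site) under Jukes-Cantor, treating each site as a
    Bernoulli observation: 'sequences differ' with probability p(t)."""
    p = 0.75 * (1.0 - math.exp(-4.0 * t / 3.0))   # P(observed difference)
    dp = math.exp(-4.0 * t / 3.0)                  # dp/dt
    return dp * dp / (p * (1.0 - p))               # (p')^2 / (p(1-p))

# Short branches carry far more information per site than near-saturated long ones,
# which is one reason sequence length and taxon choice trade off against each other.
```

    Goldman's method generalizes this idea from one Bernoulli observable to full site-pattern likelihoods on a tree, which is what makes optimal design over genes and taxa possible.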

  1. Efficient preliminary floating offshore wind turbine design and testing methodologies and application to a concrete spar design.

    PubMed

    Matha, Denis; Sandner, Frank; Molins, Climent; Campos, Alexis; Cheng, Po Wen

    2015-02-28

    The current key challenge in the floating offshore wind turbine industry and research is on designing economic floating systems that can compete with fixed-bottom offshore turbines in terms of levelized cost of energy. The preliminary platform design, as well as early experimental design assessments, are critical elements in the overall design process. In this contribution, a brief review of current floating offshore wind turbine platform pre-design and scaled testing methodologies is provided, with a focus on their ability to accommodate the coupled dynamic behaviour of floating offshore wind systems. The exemplary design and testing methodology for a monolithic concrete spar platform as performed within the European KIC AFOSP project is presented. Results from the experimental tests compared to numerical simulations are presented and analysed and show very good agreement for relevant basic dynamic platform properties. Extreme and fatigue loads and cost analysis of the AFOSP system confirm the viability of the presented design process. In summary, the exemplary application of the reduced design and testing methodology for AFOSP confirms that it represents a viable procedure during pre-design of floating offshore wind turbine platforms. PMID:25583870

  2. Efficient preliminary floating offshore wind turbine design and testing methodologies and application to a concrete spar design

    PubMed Central

    Matha, Denis; Sandner, Frank; Molins, Climent; Campos, Alexis; Cheng, Po Wen

    2015-01-01

    The current key challenge in the floating offshore wind turbine industry and research is on designing economic floating systems that can compete with fixed-bottom offshore turbines in terms of levelized cost of energy. The preliminary platform design, as well as early experimental design assessments, are critical elements in the overall design process. In this contribution, a brief review of current floating offshore wind turbine platform pre-design and scaled testing methodologies is provided, with a focus on their ability to accommodate the coupled dynamic behaviour of floating offshore wind systems. The exemplary design and testing methodology for a monolithic concrete spar platform as performed within the European KIC AFOSP project is presented. Results from the experimental tests compared to numerical simulations are presented and analysed and show very good agreement for relevant basic dynamic platform properties. Extreme and fatigue loads and cost analysis of the AFOSP system confirm the viability of the presented design process. In summary, the exemplary application of the reduced design and testing methodology for AFOSP confirms that it represents a viable procedure during pre-design of floating offshore wind turbine platforms. PMID:25583870

  3. Taking STOX: developing a cross disciplinary methodology for systematic reviews of research on the built environment and the health of the public

    PubMed Central

    Weaver, N; Williams, J; Weightman, A; Kitcher, H; Temple, J; Jones, P; Palmer, S

    2002-01-01

    Study objective: To develop a cross disciplinary literature search methodology for conducting systematic reviews of all types of research investigating aspects of the built environment and the health of the public. Design: The method was developed following a comprehensive search of literature in the area of housing and injuries, using 30 databases covering many disciplines including medicine, social science, architecture, science, engineering, environment, planning and psychology. The results of the database searches, including the type (or evidence) of research papers identified, were analysed to identify the most productive databases and improve the efficiency of the strategy. The revised strategy for literature searching was then applied to the area of neighbourhoods and mental health, and an analysis of the evidence type of references was carried out. In recognition of the large number and variety of observational studies, an expanded evidence type classification was developed for this purpose. Main results: From an analysis of 722 citations obtained by a housing and injuries search, an overlap of only 9% was found between medical and social science databases and only 1% between medical and built environment databases. A preliminary evidence type classification of those citations that could be assessed (from information in the abstracts and titles) suggested that the majority of intervention studies on housing and injuries are likely to be found in the medical and social science databases. A number of relevant observational studies (10% of all research studies) would have been missed, however, by excluding built environment and grey literature databases. In an area lacking in interventional research (housing/neighbourhoods and mental health) as many as 25% of all research studies would have been missed by ignoring the built environment and grey literature. Conclusions: When planning a systematic review of all types of evidence in a topic relating to the built

  4. Development of an Expert Judgement Elicitation and Calibration Methodology for Risk Analysis in Conceptual Vehicle Design

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina

    2004-01-01

    A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to obtain expert judgment opinion for quantifying uncertainties as a probability distribution so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates, and provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results is presented. A discussion of possible future steps in this research area is given.

  5. Space station definitions, design, and development. Task 5: Multiple arm telerobot coordination and control: Manipulator design methodology

    NASA Technical Reports Server (NTRS)

    Stoughton, R. M.

    1990-01-01

    A proposed methodology applicable to the design of manipulator systems is described. The current design process is especially weak in the preliminary design phase, since there is no accepted measure to be used in trading off the different options available for the various subsystems. The design process described uses Cartesian End-Effector Impedance as a measure of performance for the system. Having this measure of performance, it is shown how it may be used to determine the trade-offs necessary in the preliminary design phase. The design process involves three main parts: (1) determination of desired system performance in terms of End-Effector Impedance; (2) trade-off of design options to achieve this desired performance; and (3) verification of system performance through laboratory testing. The design process is developed using numerous examples and experiments to demonstrate the feasibility of this approach to manipulator design.

  6. Enriched Environments, Cortical Plasticity, and Implications for the Systematic Design of Instruction.

    ERIC Educational Resources Information Center

    Rice, Stephen; And Others

    1996-01-01

    Discussion of learning theories focuses on cortical plasticity, enriched learning environments, and the systematic design of instruction. Topics include neurophysiology; research on cortical plasticity and its implications for instructional systems design; linking theory with practice; discovery learning; and motivation and arousal, including…

  7. A Digital Tool Set for Systematic Model Design in Process-Engineering Education

    ERIC Educational Resources Information Center

    van der Schaaf, Hylke; Tramper, Johannes; Hartog, Rob J.M.; Vermue, Marian

    2006-01-01

    One of the objectives of the process technology curriculum at Wageningen University is that students learn how to design mathematical models in the context of process engineering, using a systematic problem analysis approach. Students find it difficult to learn to design a model and little material exists to meet this learning objective. For these…

  8. Outcomes of a Systematically Designed Strategy for the Implementation of Sex Education in Dutch Secondary Schools

    ERIC Educational Resources Information Center

    Wiefferink, C. H.; Poelman, J.; Linthorst, M.; Vanwesenbeeck, I.; Van Wijngaarden, J. C. M.; Paulussen, T. G. W.

    2005-01-01

    This study examines the effects of a systematically designed innovation strategy on teachers' implementation of a sex education curriculum and its related determinants. A quasi-experimental group design was used to assess the effectiveness of the innovation strategy. Teachers filled in questionnaires on the determinants of curriculum implementation…

  9. Methodology for object-oriented real-time systems analysis and design: Software engineering

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly 'seamlessly' from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structurings of the system, so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation but the original specification, and perhaps the high-level design, is not object oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects whose operations or methods correspond to processes in the data flow diagrams, and then design in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time-behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly connected models which progress from the object-oriented real-time systems analysis logical models through the physical architectural models and the high-level design stages. The methodology is appropriate to the overall specification, including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.

  10. A multi-criteria decision aid methodology to design electric vehicles public charging networks

    NASA Astrophysics Data System (ADS)

    Raposo, João; Rodrigues, Ana; Silva, Carlos; Dentinho, Tomaz

    2015-05-01

    This article presents a new multi-criteria decision aid methodology, dynamic-PROMETHEE, here used to design electric vehicle charging networks. In applying this methodology to a Portuguese city, results suggest that it is effective in designing electric vehicle charging networks, generating time- and policy-based scenarios, considering supply and demand and the city's urban structure. Dynamic-PROMETHEE adds to PROMETHEE's already known characteristics other useful features, such as decision memory over time, versatility and adaptability. The case study, used here to present the dynamic-PROMETHEE, served as inspiration and basis for creating this new methodology. It can be used to model different problems and scenarios that may present similar requirement characteristics.
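For readers unfamiliar with the underlying method, classical PROMETHEE II ranks alternatives by net outranking flow. The sketch below uses the simplest ("usual") preference function and invented charging-site data; it does not reproduce the dynamic extension (decision memory over time) described in the article:

```python
# PROMETHEE II outranking sketch with the "usual" preference function
# (P(a,b) = 1 when a strictly beats b on a criterion). Site names, scores
# and weights are hypothetical, not from the Portuguese case study.

def promethee_ii(scores, weights):
    """scores: {alternative: [criterion values]}, higher is better;
    weights: criterion weights summing to 1. Returns net flows."""
    alts = list(scores)
    n = len(alts)

    def pref(a, b):  # weighted preference degree of a over b
        return sum(w for w, x, y in zip(weights, scores[a], scores[b]) if x > y)

    phi = {}
    for a in alts:
        plus = sum(pref(a, b) for b in alts if b != a) / (n - 1)   # leaving flow
        minus = sum(pref(b, a) for b in alts if b != a) / (n - 1)  # entering flow
        phi[a] = plus - minus  # net flow; higher = preferred
    return phi

# Candidate charging sites scored on demand, grid access and cost
sites = {"A": [8, 6, 7], "B": [6, 9, 5], "C": [7, 5, 9]}
flows = promethee_ii(sites, [0.5, 0.3, 0.2])
print(max(flows, key=flows.get))  # -> A
```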

  11. Cyber-Informed Engineering: The Need for a New Risk Informed and Design Methodology

    SciTech Connect

    Price, Joseph Daniel; Anderson, Robert Stephen

    2015-06-01

    Current engineering and risk management methodologies do not contain the foundational assumptions required to address the intelligent adversary’s capabilities in malevolent cyber attacks. Current methodologies focus on equipment failures or human error as initiating events for a hazard, while cyber attacks use the functionality of a trusted system to perform operations outside of the intended design and without the operator’s knowledge. These threats can bypass or manipulate traditionally engineered safety barriers and present false information, invalidating the fundamental basis of a safety analysis. Cyber threats must be fundamentally analyzed from a completely new perspective where neither equipment nor human operation can be fully trusted. A new risk analysis and design methodology needs to be developed to address this rapidly evolving threatscape.

  12. A cost-effective methodology for the design of massively-parallel VLSI functional units

    NASA Technical Reports Server (NTRS)

    Venkateswaran, N.; Sriram, G.; Desouza, J.

    1993-01-01

    In this paper we propose a generalized methodology for the design of cost-effective massively-parallel VLSI Functional Units. This methodology is based on a technique of generating and reducing a massive bit-array on the mask-programmable PAcube VLSI array. This methodology unifies (maintains identical data flow and control) the execution of complex arithmetic functions on PAcube arrays. It is highly regular, expandable and uniform with respect to problem-size and wordlength, thereby reducing the communication complexity. The memory-functional unit interface is regular and expandable. Using this technique, functional units of dedicated processors can be mask-programmed on the naked PAcube arrays, reducing the turn-around time. The production cost of such dedicated processors can be drastically reduced since the naked PAcube arrays can be mass-produced. Analysis of the performance of functional units designed by our method yields promising results.

  13. BEAM STOP DESIGN METHODOLOGY AND DESCRIPTION OF A NEW SNS BEAM STOP

    SciTech Connect

    Polsky, Yarom; Plum, Michael A; Geoghegan, Patrick J; Jacobs, Lorelei L; Lu, Wei; McTeer, Stephen Mark

    2010-01-01

    The design of accelerator components such as magnets, accelerator cavities and beam instruments tends to be a fairly standardized and collective effort within the particle accelerator community with well established performance, reliability and, in some cases, even budgetary criteria. Beam stop design, by contrast, has historically been comparatively subjective with much more general goals. This lack of rigor has led to a variety of facility implementations with limited standardization and minimal consensus on approach to development within the particle accelerator community. At the Spallation Neutron Source (SNS), for example, there are four high power beam stops in use, three of which have significantly different design solutions. This paper describes the design of a new off-momentum beam stop for the SNS. The technical description of the system will be complemented by a discussion of design methodology. This paper presented an overview of the new SNS HEBT off-momentum beam stop and outlined a methodology for beam stop system design. The new beam stop consists of aluminium and steel blocks cooled by a closed-loop forced-air system and is expected to be commissioned this summer. The design methodology outlined in the paper represents a basic description of the process, data, analyses and critical decisions involved in the development of a beam stop system.

  14. The Case in Case-Based Design of Educational Software: A Methodological Interrogation

    ERIC Educational Resources Information Center

    Khan, S.

    2008-01-01

    This research assessed the value of case study methodology in the design of an educational computer simulation. Three sources of knowledge were compared to assess the value of case study: practitioner and programmer knowledge, disciplinary knowledge, and knowledge obtained from a case study of teacher practice. A retrospective analysis revealed…

  15. A Methodological Framework for Instructional Design Model Development: Critical Dimensions and Synthesized Procedures

    ERIC Educational Resources Information Center

    Lee, Jihyun; Jang, Seonyoung

    2014-01-01

    Instructional design (ID) models have been developed to promote understandings of ID reality and guide ID performance. As the number and diversity of ID practices grows, implicit doubts regarding the reliability, validity, and usefulness of ID models suggest the need for methodological guidance that would help to generate ID models that are…

  16. Learning Network Design: A Methodology for the Construction of Co-operative Distance Learning Environments.

    ERIC Educational Resources Information Center

    Davies, Dick

    Learning Network Design (LND) is a socially oriented methodology for construction of cooperative distance learning environments. The paper advances a social constructivist approach to learning in which learning and teaching are seen as a process of active communication, interpretation, and negotiation; offers a view of information technology as a…

  17. IDR: A Participatory Methodology for Interdisciplinary Design in Technology Enhanced Learning

    ERIC Educational Resources Information Center

    Winters, Niall; Mor, Yishay

    2008-01-01

    One of the important themes that emerged from the CAL'07 conference was the failure of technology to bring about the expected disruptive effect to learning and teaching. We identify one of the causes as an inherent weakness in prevalent development methodologies. While the problem of designing technology for learning is irreducibly…

  18. Intranets and Digital Organizational Information Resources: Towards a Portable Methodology for Design and Development.

    ERIC Educational Resources Information Center

    Rosenbaum, Howard

    1997-01-01

    Discusses the concept of the intranet, comparing and contrasting it with groupware, and presents an argument for its value based on technical and information management considerations. Presents an intranet development project for an academic organization and describes a portable, user-centered and team-based methodology for the design and…

  19. Methodology for the Preliminary Design of High Performance Schools in Hot and Humid Climates

    ERIC Educational Resources Information Center

    Im, Piljae

    2009-01-01

    A methodology to develop an easy-to-use toolkit for the preliminary design of high performance schools in hot and humid climates was presented. The toolkit proposed in this research will allow decision makers without simulation knowledge to easily and accurately evaluate energy-efficient measures for K-5 schools, which would contribute to the…

  20. Using Delphi Methodology to Design Assessments of Teachers' Pedagogical Content Knowledge

    ERIC Educational Resources Information Center

    Manizade, Agida Gabil; Mason, Marguerite M.

    2011-01-01

    Descriptions of methodologies that can be used to create items for assessing teachers' "professionally situated" knowledge are lacking in mathematics education research literature. In this study, researchers described and used the Delphi method to design an instrument to measure teachers' pedagogical content knowledge. The instrument focused on a…

  1. Curriculum Design: Nurse Educator's Role in Managing and Utilizing Various Teaching Methodologies.

    ERIC Educational Resources Information Center

    Walters, Norma J.

    The role of the nurse educator in curriculum design in the future is considered. Changing technology, shifts in patient care agencies, legislation and long-term care specialties in nursing are all factors that will have a significant impact on curricula. Plans for managing and utilizing various teaching methodologies will be an important role for…

  2. Design of psychosocial factors questionnaires: a systematic measurement approach

    PubMed Central

    Vargas, Angélica; Felknor, Sarah A

    2012-01-01

    Background Evaluation of psychosocial factors requires instruments that measure dynamic complexities. This study explains the design of a set of questionnaires to evaluate work and non-work psychosocial risk factors for stress-related illnesses. Methods The measurement model was based on a review of literature. Content validity was performed by experts and cognitive interviews. Pilot testing was carried out with a convenience sample of 132 workers. Cronbach’s alpha evaluated internal consistency and concurrent validity was estimated by Spearman correlation coefficients. Results Three questionnaires were constructed to evaluate exposure to work and non-work risk factors. Content validity improved the questionnaires’ coherence with the measurement model. Internal consistency was adequate (α=0.85–0.95). Concurrent validity resulted in moderate correlations of psychosocial factors with stress symptoms. Conclusions Questionnaires’ content reflected a wide spectrum of sources of psychosocial factors. Cognitive interviews improved understanding of questions and dimensions. The structure of the measurement model was confirmed. PMID:22628068
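The internal-consistency statistic reported here (Cronbach's alpha, 0.85–0.95) can be computed directly from item scores. The respondent data below are invented for illustration; only the formula matches the method used in the study:

```python
# Cronbach's alpha from raw item scores: alpha = k/(k-1) * (1 - sum(item
# variances) / variance(total score)). Data below are hypothetical.

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance

def cronbach_alpha(items):
    """items: one list of scores per item, each of equal length
    (one entry per respondent)."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    return k / (k - 1) * (1 - sum(variance(it) for it in items) / variance(totals))

# Four Likert-type items answered by six hypothetical respondents
items = [
    [3, 4, 4, 5, 2, 3],
    [3, 5, 4, 5, 2, 2],
    [4, 4, 3, 5, 3, 3],
    [3, 4, 4, 4, 2, 3],
]
print(round(cronbach_alpha(items), 3))  # -> 0.921
```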

  3. The conceptual framework and assessment methodology for the systematic reviews of community-based interventions for the prevention and control of infectious diseases of poverty

    PubMed Central

    2014-01-01

    This paper describes the conceptual framework and the methodology used to guide the systematic reviews of community-based interventions (CBIs) for the prevention and control of infectious diseases of poverty (IDoP). We adapted the conceptual framework from the 3ie work on the ‘Community-Based Intervention Packages for Preventing Maternal Morbidity and Mortality and Improving Neonatal Outcomes’ to aid in the analyzing of the existing CBIs for IDoP. The conceptual framework revolves around objectives, inputs, processes, outputs, outcomes, and impacts showing the theoretical linkages between the delivery of the interventions targeting these diseases through various community delivery platforms and the consequent health impacts. We also describe the methodology undertaken to conduct the systematic reviews and the meta-analyses. PMID:25105014

  4. The conceptual framework and assessment methodology for the systematic reviews of community-based interventions for the prevention and control of infectious diseases of poverty.

    PubMed

    Lassi, Zohra S; Salam, Rehana A; Das, Jai K; Bhutta, Zulfiqar A

    2014-01-01

    This paper describes the conceptual framework and the methodology used to guide the systematic reviews of community-based interventions (CBIs) for the prevention and control of infectious diseases of poverty (IDoP). We adapted the conceptual framework from the 3ie work on the 'Community-Based Intervention Packages for Preventing Maternal Morbidity and Mortality and Improving Neonatal Outcomes' to aid in the analyzing of the existing CBIs for IDoP. The conceptual framework revolves around objectives, inputs, processes, outputs, outcomes, and impacts showing the theoretical linkages between the delivery of the interventions targeting these diseases through various community delivery platforms and the consequent health impacts. We also describe the methodology undertaken to conduct the systematic reviews and the meta-analyses. PMID:25105014

  5. QFD: a methodological tool for integration of ergonomics at the design stage.

    PubMed

    Marsot, Jacques

    2005-03-01

    Following a marked increase in the number of musculoskeletal disorders noted in many industrialized countries, and more specifically in companies that require the use of hand tools, the French National Research and Safety Institute launched in 1999 a research program on the topic of integrating ergonomics into hand tool design. After a brief review of the problems of integrating ergonomics at the design stage, the paper shows how the "Quality Function Deployment" method has been applied to the design of a boning knife and it highlights the difficulties encountered. Then, it demonstrates how this method can be a methodological tool geared to greater ergonomics consideration in product design. PMID:15694072
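The core quantitative step of Quality Function Deployment is the "house of quality" prioritization: customer (here, ergonomic) requirements are weighted and related to technical characteristics, whose importance is the weighted sum of relationship strengths. The knife-design requirements, characteristics and ratings below are hypothetical, not from the study:

```python
# QFD technical-importance sketch. Weights (1-5) and relationship strengths
# (9 strong / 3 medium / 1 weak / 0 none) are invented for illustration.

requirements = {"low grip force": 5, "blade control": 4, "wrist posture": 3}

relations = {  # how strongly each technical characteristic serves each requirement
    "handle diameter": {"low grip force": 9, "blade control": 3, "wrist posture": 1},
    "blade angle":     {"low grip force": 1, "blade control": 9, "wrist posture": 9},
    "handle texture":  {"low grip force": 3, "blade control": 3, "wrist posture": 0},
}

importance = {
    tech: sum(requirements[req] * strength for req, strength in rel.items())
    for tech, rel in relations.items()
}
print(max(importance, key=importance.get))  # -> blade angle
```

Ranking characteristics this way is what lets ergonomics enter the design stage on the same footing as functional requirements.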

  6. A low-power photovoltaic system with energy storage for radio communications: description and design methodology

    SciTech Connect

    Chapman, C.P.; Chapman, P.D.

    1982-01-01

    A low power photovoltaic system was constructed with approximately 500 amp hours of battery energy storage to provide power to an emergency amateur radio communications center. The system can power the communications center for about 72 hours of continuous no-sun operation. Complete construction details and a design methodology algorithm are given with abundant engineering data and adequate theory to allow similar systems to be constructed, scaled up or down, with minimum design effort.
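The sizing arithmetic behind such a design can be sketched as a simple autonomy check. The system voltage, depth of discharge and load below are assumed values chosen to be consistent with the abstract's figures (~500 Ah storage, ~72 h of no-sun operation), not numbers from the paper:

```python
# Back-of-envelope battery autonomy: usable energy divided by load power.
# Voltage, depth of discharge and load are assumptions, not from the report.

def autonomy_hours(capacity_ah, system_v, depth_of_discharge, load_w):
    usable_wh = capacity_ah * system_v * depth_of_discharge  # usable watt-hours
    return usable_wh / load_w

hours = autonomy_hours(capacity_ah=500, system_v=12,
                       depth_of_discharge=0.6, load_w=50)
print(round(hours))  # -> 72
```

Scaling a similar system up or down amounts to re-running this balance with the new load and the local no-sun duration requirement.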

  7. Low-power photovoltaic system with energy storage for radio communications. Description and design methodology

    SciTech Connect

    Chapman, C.P.; Chapman, P.D.; Lewison, A.H.

    1982-01-15

    A low-power photovoltaic system was constructed with approximately 500 amp-hours of battery energy storage to provide power to an emergency amateur radio communications center. The system can power the communications center for about 72 hours of continuous no-sun operation. Complete construction details and a design methodology algorithm are given with abundant engineering data and adequate theory to allow similar systems to be constructed, scaled up or down, with minimum design effort.

  8. A low-power photovoltaic system with energy storage for radio communications: Description and design methodology

    NASA Technical Reports Server (NTRS)

    Chapman, C. P.; Chapman, P. D.; Lewison, A. H.

    1982-01-01

    A low power photovoltaic system was constructed with approximately 500 amp hours of battery energy storage to provide power to an emergency amateur radio communications center. The system can power the communications center for about 72 hours of continuous no-sun operation. Complete construction details and a design methodology algorithm are given with abundant engineering data and adequate theory to allow similar systems to be constructed, scaled up or down, with minimum design effort.

  9. A design and experimental verification methodology for an energy harvester skin structure

    NASA Astrophysics Data System (ADS)

    Lee, Soobum; Youn, Byeng D.

    2011-05-01

    This paper presents a design and experimental verification methodology for energy harvesting (EH) skin, which opens up a practical and compact piezoelectric energy harvesting concept. In the past, EH research has primarily focused on the design improvement of a cantilever-type EH device. However, such EH devices require additional space for proof mass and fixture and sometimes result in significant energy loss as the clamping condition becomes loose. Unlike the cantilever-type device, the proposed design is simply implemented by laminating a thin piezoelectric patch onto a vibrating structure. The design methodology proposed, which determines a highly efficient piezoelectric material distribution, is composed of two tasks: (i) topology optimization and (ii) shape optimization of the EH material. An outdoor condensing unit is chosen as a case study among many engineered systems with harmonic vibrating configuration. The proposed design methodology determined an optimal PZT material configuration on the outdoor unit skin structure. The designed EH skin was carefully prototyped to demonstrate that it can generate power up to 3.7 mW, which is sustainable for operating wireless sensor units for structural health monitoring and/or building automation.

  10. Design Methodology for Multi-Element High-Lift Systems on Subsonic Civil Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Pepper, R. S.; vanDam, C. P.

    1996-01-01

    The choice of a high-lift system is crucial in the preliminary design process of a subsonic civil transport aircraft. Its purpose is to increase the allowable aircraft weight or decrease the aircraft's wing area for a given takeoff and landing performance. However, the implementation of a high-lift system into a design must be done carefully, for it can improve the aerodynamic performance of an aircraft but may also drastically increase the aircraft empty weight. If designed properly, a high-lift system can improve the cost effectiveness of an aircraft by increasing the payload weight for a given takeoff and landing performance. This is why the design methodology for a high-lift system should incorporate aerodynamic performance, weight, and cost. The airframe industry has experienced rapid technological growth in recent years which has led to significant advances in high-lift systems. For this reason many existing design methodologies have become obsolete since they are based on outdated low Reynolds number wind-tunnel data and can no longer accurately predict the aerodynamic characteristics or weight of current multi-element wings. Therefore, a new design methodology has been created that reflects current aerodynamic, weight, and cost data and provides enough flexibility to allow incorporation of new data when it becomes available.

  11. Methodology for design of adaptive interfaces for diagnostic workstations with integrated images and reports

    NASA Astrophysics Data System (ADS)

    Harreld, Michael R.; Valentino, Daniel J.; Liu, Brent J.; El-Saden, Suzie; Duckwiler, Gary R.

    1998-06-01

    Diagnostic workstations have generally lacked acceptance due to awkward interfaces, poor usability and lack of clinical data integration. We developed a new methodology for the design and implementation of diagnostic workstations and applied the methodology in diagnostic neuroradiology. The methodology facilitated the objective design and evaluation of optimal diagnostic features, including the integration of images and reports, and the implementation of intelligent and adaptive graphical user interfaces. As a test of this new methodology, we developed and evaluated a neuroradiological diagnostic workstation. The general goals of diagnostic neuroradiologists were modeled and directly used in the design of the UCLA Digital ViewBox, an object-oriented toolkit for medical imaging workstations. For case-specific goals, an object-oriented protocol toolkit was developed for rapid development and integration of new protocols, modes, and tools. Each protocol defines a way to arrange and process data in order to accomplish diagnostic goals that are specific to anatomy (e.g., a spine protocol), or to a suspected pathology (e.g., a tumor protocol). Each protocol was divided into modes that represent diagnostic reading tasks. Each mode was further broken down into functions supporting that task. Via a data mediator engine, the workstation communicated with clinical data repositories, including the UCLA HIS, Clinical RIS/PACS and individual DICOM compatible scanners. The data mediator served to transparently integrate, retrieve, and cache image and report data. Task-oriented Reading protocols automatically present the appropriate diagnostic information and diagnostic tools to the radiologist. We describe a protocol toolkit that enables the rapid design and implementation of customized reading protocols. We also present an intelligent layer that enables the automatic presentation of the appropriate information. This new methodology for diagnostic workstation design led to an

  12. Turbofan engine control system design using the LQG/LTR methodology

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay

    1989-01-01

    Application of the Linear-Quadratic-Gaussian with Loop-Transfer-Recovery methodology to design of a control system for a simplified turbofan engine model is considered. The importance of properly scaling the plant to achieve the desired Target-Feedback-Loop is emphasized. The steps involved in the application of the methodology are discussed via an example, and evaluation results are presented for a reduced-order compensator. The effect of scaling the plant on the stability robustness evaluation of the closed-loop system is studied in detail.
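A scalar toy problem illustrates the LQR half of an LQG/LTR design (the full methodology adds a Kalman filter whose gains are tuned to recover the LQR loop shape). The plant and weights below are assumptions, not the turbofan model from the report:

```python
import math

# Scalar LQR illustration. For x' = a*x + b*u with cost integral(q*x^2 + r*u^2),
# the algebraic Riccati equation 2*a*P - (b*P)**2/r + q = 0 has the stabilizing
# root below, giving the state-feedback gain k = b*P/r. Toy numbers only.

def lqr_scalar(a, b, q, r):
    p = r * (a + math.sqrt(a * a + b * b * q / r)) / (b * b)  # stabilizing root
    return b * p / r

k = lqr_scalar(a=1.0, b=1.0, q=1.0, r=1.0)
pole = 1.0 - 1.0 * k  # closed-loop pole a - b*k
print(k, pole)  # pole is negative, so the closed loop is stable
```

In the full LQG/LTR procedure the same Riccati machinery is applied in matrix form, which is why the paper's point about scaling the plant matters: poorly scaled states distort the target feedback loop the recovery step aims at.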

  13. Novel thermal management system design methodology for power lithium-ion battery

    NASA Astrophysics Data System (ADS)

    Nieto, Nerea; Díaz, Luis; Gastelurrutia, Jon; Blanco, Francisco; Ramos, Juan Carlos; Rivas, Alejandro

    2014-12-01

    Battery packs composed of large-format lithium-ion cells are increasingly being adopted in hybrid and pure electric vehicles in order to use the energy more efficiently and for a better environmental performance. Safety and cycle life are two of the main concerns regarding this technology, which are closely related to the cell's operating behavior and temperature asymmetries in the system. Therefore, the temperature of the cells in battery packs needs to be controlled by thermal management systems (TMSs). In the present paper an improved design methodology for developing TMSs is proposed. This methodology involves the development of different mathematical models for heat generation, transmission, and dissipation and their coupling and integration in the battery pack product design methodology in order to improve the overall safety and performance. The methodology is validated by comparing simulation results with laboratory measurements on a single module of the battery pack designed at IK4-IKERLAN for a traction application. The maximum difference between model predictions and experimental temperature data is 2 °C. The models developed have shown potential for use in battery thermal management studies for EV/HEV applications since they allow for scalability with accuracy and reasonable simulation time.
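The simplest form of the heat generation/dissipation balance such models couple is a lumped-capacitance cell: heat generated minus heat convected away drives the temperature of a single thermal mass. The parameters below are illustrative, not IK4-IKERLAN's:

```python
# Lumped-capacitance cell thermal model, m*c*dT/dt = Q_gen - h*A*(T - T_amb),
# integrated with forward Euler. All parameter values are assumptions.

def cell_temperature(q_gen_w, ha_w_per_k, t_amb_c, mc_j_per_k, t_end_s, dt_s=1.0):
    t = t_amb_c  # cell starts at ambient temperature
    for _ in range(int(t_end_s / dt_s)):
        t += dt_s * (q_gen_w - ha_w_per_k * (t - t_amb_c)) / mc_j_per_k
    return t

# Steady state approaches T_amb + Q/(h*A): 25 + 5/1 = 30 C
t_final = cell_temperature(q_gen_w=5.0, ha_w_per_k=1.0, t_amb_c=25.0,
                           mc_j_per_k=800.0, t_end_s=10000.0)
print(round(t_final, 2))
```

A pack-level model stacks many such nodes with conduction between them, which is where the temperature asymmetries the abstract mentions come from.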

  14. Design methodology for integrated downstream separation systems in an ethanol biorefinery

    NASA Astrophysics Data System (ADS)

    Mohammadzadeh Rohani, Navid

    and obtaining energy security. On the other hand, Process Integration (PI), defined by Natural Resources Canada as the combination of activities which aim at improving process systems, their unit operations and their interactions in order to maximize the efficiency of using water, energy and raw materials, can also help biorefineries lower their energy consumption and improve their economics. Energy integration techniques such as pinch analysis, adopted by different industries over the years, have ensured that heat sources within a plant are used to supply demand internally and decrease external utility consumption. Therefore, adopting energy integration is one of the ways biorefinery technology owners can improve their overall economics, both in process development and in their business model. The objective of this thesis is to propose a methodology for designing integrated downstream separation in a biorefinery. This methodology is tested in an ethanol biorefinery case study. Several alternative separation techniques are evaluated for their energy consumption and economics in three different scenarios: stand-alone without energy integration, stand-alone with internal energy integration and integrated-with-Kraft. The energy consumption and capital costs of separation techniques are assessed in each scenario, the cost and benefit of integration are determined, and finally the best alternative is found through techno-economic metrics. Another advantage of this methodology is the use of a graphical tool which provides insights on decreasing energy consumption by modifying the process conditions. The pivot point of this work is the use of a novel energy integration method called Bridge analysis. This systematic method, originally intended for retrofit situations, is used here for integration with the Kraft process. Integration potentials are identified through this method and savings are presented for each design. In stand-alone with

  15. Aero-Mechanical Design Methodology for Subsonic Civil Transport High-Lift Systems

    NASA Technical Reports Server (NTRS)

    vanDam, C. P.; Shaw, S. G.; VanderKam, J. C.; Brodeur, R. R.; Rudolph, P. K. C.; Kinney, D.

    2000-01-01

    In today's highly competitive and economically driven commercial aviation market, the trend is to make aircraft systems simpler and to shorten their design cycle which reduces recurring, non-recurring and operating costs. One such system is the high-lift system. A methodology has been developed which merges aerodynamic data with kinematic analysis of the trailing-edge flap mechanism with minimum mechanism definition required. This methodology provides quick and accurate aerodynamic performance prediction for a given flap deployment mechanism early on in the high-lift system preliminary design stage. Sample analysis results for four different deployment mechanisms are presented as well as descriptions of the aerodynamic and mechanism data required for evaluation. Extensions to interactive design capabilities are also discussed.

  16. Application of Design Methodologies for Feedback Compensation Associated with Linear Systems

    NASA Technical Reports Server (NTRS)

    Smith, Monty J.

    1996-01-01

    The work that follows is concerned with the application of design methodologies for feedback compensation associated with linear systems. In general, the intent is to provide a well behaved closed loop system in terms of stability and robustness (internal signals remain bounded with a certain amount of uncertainty) and simultaneously achieve an acceptable level of performance. The approach here has been to convert the closed loop system and control synthesis problem into the interpolation setting. The interpolation formulation then serves as our mathematical representation of the design process. Lifting techniques have been used to solve the corresponding interpolation and control synthesis problems. Several applications using this multiobjective design methodology have been included to show the effectiveness of these techniques. In particular, the mixed H2-H∞ performance criteria and algorithm have been used on several examples including an F-18 HARV (High Angle of Attack Research Vehicle) for sensitivity performance.

  17. Design Methodology of a Dual-Halbach Array Linear Actuator with Thermal-Electromagnetic Coupling.

    PubMed

    Eckert, Paulo Roberto; Flores Filho, Aly Ferreira; Perondi, Eduardo; Ferri, Jeferson; Goltz, Evandro

    2016-01-01

    This paper proposes a design methodology for linear actuators, considering thermal and electromagnetic coupling with geometrical and temperature constraints, that maximizes force density and minimizes force ripple. The method allows defining an actuator for given specifications in a step-by-step way so that requirements are met and the temperature within the device is kept at or below its maximum allowed for continuous operation. According to the proposed method, the electromagnetic and thermal models are built with quasi-static parametric finite element models. The methodology was successfully applied to the design of a linear cylindrical actuator with a dual quasi-Halbach array of permanent magnets and a moving coil. The actuator can produce an axial force of 120 N and a stroke of 80 mm. The paper also presents a comparative analysis between results obtained considering only an electromagnetic model and the thermal-electromagnetic coupled model. This comparison shows that the final designs for both cases differ significantly, especially regarding its active volume and its electrical and magnetic loading. Although in this paper the methodology was employed to design a specific actuator, its structure can be used to design a wide range of linear devices if the parametric models are adjusted for each particular actuator. PMID:26978370

  18. Design Methodology of a Dual-Halbach Array Linear Actuator with Thermal-Electromagnetic Coupling

    PubMed Central

    Eckert, Paulo Roberto; Flores Filho, Aly Ferreira; Perondi, Eduardo; Ferri, Jeferson; Goltz, Evandro

    2016-01-01

    This paper proposes a design methodology for linear actuators, considering thermal and electromagnetic coupling with geometrical and temperature constraints, that maximizes force density and minimizes force ripple. The method allows defining an actuator for given specifications in a step-by-step way so that requirements are met and the temperature within the device is kept at or below its maximum allowed for continuous operation. According to the proposed method, the electromagnetic and thermal models are built with quasi-static parametric finite element models. The methodology was successfully applied to the design of a linear cylindrical actuator with a dual quasi-Halbach array of permanent magnets and a moving coil. The actuator can produce an axial force of 120 N and a stroke of 80 mm. The paper also presents a comparative analysis between results obtained considering only an electromagnetic model and the thermal-electromagnetic coupled model. This comparison shows that the final designs for both cases differ significantly, especially regarding its active volume and its electrical and magnetic loading. Although in this paper the methodology was employed to design a specific actuator, its structure can be used to design a wide range of linear devices if the parametric models are adjusted for each particular actuator. PMID:26978370

  19. A Robust Design Methodology for Optimal Microscale Secondary Flow Control in Compact Inlet Diffusers

    NASA Technical Reports Server (NTRS)

    Anderson, Bernhard H.; Keller, Dennis J.

    2001-01-01

    It is the purpose of this study to develop an economical Robust Design methodology for microscale secondary flow control in compact inlet diffusers. To illustrate the potential of economical Robust Design methodology, two different mission strategies were considered for the subject inlet, namely Maximum Performance and Maximum HCF Life Expectancy. The Maximum Performance mission maximized total pressure recovery, while the Maximum HCF Life Expectancy mission minimized the mean of the first five Fourier harmonic amplitudes, i.e., 'collectively' reduced all the harmonic half-amplitudes of engine face distortion. Each of the mission strategies was subject to a low engine face distortion constraint, i.e., DC60<0.10, which is a level acceptable for commercial engines. For each of these mission strategies, an 'Optimal Robust' (open loop control) and an 'Optimal Adaptive' (closed loop control) installation was designed over a twenty degree angle-of-incidence range. The Optimal Robust installation used economical Robust Design methodology to arrive at a single design which operated over the entire angle-of-incidence range (open loop control). The Optimal Adaptive installation optimized all the design parameters at each angle-of-incidence. Thus, the Optimal Adaptive installation would require a closed loop control system to sense a proper signal for each effector and modify that effector device, whether mechanical or fluidic, for optimal inlet performance. In general, the performance differences between the Optimal Adaptive and Optimal Robust installation designs were found to be marginal. This suggests that Optimal Robust open loop installation designs can be very competitive with Optimal Adaptive closed loop designs. Secondary flow control in inlets is inherently robust, provided it is optimally designed. Therefore, the new methodology presented in this paper, a combined-array 'Lower Order' approach to Robust DOE, offers the aerodynamicist a very viable and
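    The open-loop-versus-closed-loop trade described above can be illustrated with a toy model: an 'Optimal Adaptive' controller re-optimizes an effector setting at every angle of incidence, while an 'Optimal Robust' design fixes one setting for the whole range. The recovery model below is invented purely for illustration; the study itself used DOE and response-surface models of the actual inlet.

```python
# Toy illustration of "Optimal Robust" (one fixed open-loop setting) versus
# "Optimal Adaptive" (re-optimized at each angle of incidence). The recovery
# model is hypothetical, not taken from the study.

def recovery(setting, angle_deg):
    """Hypothetical total-pressure recovery vs. effector setting and incidence."""
    best = 0.5 + 0.01 * angle_deg          # angle-dependent optimum setting
    return 0.96 - 0.002 * angle_deg - 0.3 * (setting - best) ** 2

angles = range(0, 21, 2)                   # 0..20 deg angle-of-incidence range
settings = [i / 100 for i in range(0, 101)]

# Optimal Adaptive: re-optimize the setting at every angle (closed loop).
adaptive = sum(max(recovery(s, a) for s in settings) for a in angles) / len(angles)

# Optimal Robust: one setting maximizing mean recovery over all angles.
robust_setting = max(settings, key=lambda s: sum(recovery(s, a) for a in angles))
robust = sum(recovery(robust_setting, a) for a in angles) / len(angles)

print(f"adaptive {adaptive:.4f} vs robust {robust:.4f}")
```

Even in this toy, the gap between the two strategies is small, echoing the study's finding that the performance differences were marginal.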

  20. A methodology for robust structural design with application to active aeroelastic wings

    NASA Astrophysics Data System (ADS)

    Zink, Paul Scott

    A new design process for Active Aeroelastic Wing (AAW) technology was developed, in which control surface gear ratios and structural design variables were treated together in the same optimization problem, acting towards the same objective of weight minimization. This is in contrast to traditional AAW design processes that treat design of the gear ratios and design of the structure as separate optimization problems, each with their own different objectives and constraints, executed in an iterative fashion. The demonstration of the new AAW design process, implemented in an efficient modal-based structural analysis and optimization code, on a lightweight fighter resulted in a 15% reduction in wing box skin weight over a more traditional AAW design process. In addition, the new process was far more streamlined than the traditional approach in that it was performed in one continuous run and did not require the exchange of data between modules. The new AAW design process was then used in the development of a methodology for the design of AAW structures that are robust to uncertainty in maneuver loads which arise from the use of linear aerodynamics. Maneuver load uncertainty was modeled probabilistically and based on typical differences between rigid loads as predicted by nonlinear and linear aerodynamic theory. These models were used to augment the linear aerodynamic loads that had been used in the AAW design process. Characteristics of the robust design methodology included: use of a criticality criterion based on a strain energy formulation to determine what loads were most critical to the structure, Latin Hypercube Sampling for the propagation of uncertainty to the criterion function, and redesign of the structure, using the new AAW design process, to the most critical loads identified. The demonstration of the methodology resulted in a wing box skin structure that was 11% heavier than an AAW structure designed only with linear aerodynamics. However, it was
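    Latin Hypercube Sampling, named in the abstract as the mechanism for propagating load uncertainty to the criterion function, can be implemented with the standard library alone. The toy response function below stands in for the strain-energy criticality criterion, whose exact form is not given here.

```python
import random

def latin_hypercube(n_samples, bounds, seed=0):
    """Latin Hypercube Sampling: one sample per equal-probability stratum in
    each dimension, with strata randomly permuted across dimensions.
    `bounds` is a list of (low, high) tuples, one per variable."""
    rng = random.Random(seed)
    dims = len(bounds)
    samples = [[0.0] * dims for _ in range(n_samples)]
    for d, (low, high) in enumerate(bounds):
        strata = list(range(n_samples))
        rng.shuffle(strata)
        for i in range(n_samples):
            u = (strata[i] + rng.random()) / n_samples  # point inside stratum
            samples[i][d] = low + u * (high - low)
    return samples

# Propagate hypothetical load-uncertainty factors through a toy criterion
# (a stand-in for the strain-energy-based criticality criterion).
pts = latin_hypercube(100, [(0.9, 1.1), (0.8, 1.2)])
responses = [0.5 * a * b**2 for a, b in pts]
print(f"mean response: {sum(responses) / len(responses):.3f}")
```

Compared with plain Monte Carlo, the stratification guarantees every slice of each input range is represented, which is why LHS needs far fewer samples for the same coverage.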

  1. Electric Utility Rate Design Study: comments on An Evaluation of Four Marginal-Costing Methodologies

    SciTech Connect

    Not Available

    1980-06-12

    This report is an extension of NP-24255 (EAPA 6:1820), An Evaluation of Four Marginal Costing Methodologies (RDS No. 66), which summarizes, contrasts, and evaluates four marginal costing methodologies currently in use by various electric utilities. The proponents of the four methodologies evaluated by Temple, Barker, and Sloane (TBS), along with other selected reviewers, were asked to comment on the TBS report (RDS No. 66). This report, RDS No. 67, is an anthology of all those comments plus a response to them by TBS. The rebuttal comments from TBS appear first, followed by comments submitted by Ralph Turvey, an authority in microeconomics. Next are comments on the Rate Design Study by members of Advisory Group I, experts in the field of electricity pricing. The final four sections present detailed comments submitted by the four marginal-cost proponents: Cicchetti, Gillen, and Smolensky; Ernst and Ernst; Gordian Associates; and National Economic Research Associates.

  2. Study designs for identifying risk compensation behavior among users of biomedical HIV prevention technologies: balancing methodological rigor and research ethics.

    PubMed

    Underhill, Kristen

    2013-10-01

    The growing evidence base for biomedical HIV prevention interventions - such as oral pre-exposure prophylaxis, microbicides, male circumcision, treatment as prevention, and eventually prevention vaccines - has given rise to concerns about the ways in which users of these biomedical products may adjust their HIV risk behaviors based on the perception that they are protected from infection. Known as risk compensation, this behavioral adjustment draws on the theory of "risk homeostasis," which has previously been applied to phenomena as diverse as Lyme disease vaccination, insurance mandates, and automobile safety. Little rigorous evidence exists to answer risk compensation concerns in the biomedical HIV prevention literature, in part because the field has not systematically evaluated the study designs available for testing these behaviors. The goals of this Commentary are to explain the origins of risk compensation behavior in risk homeostasis theory, to reframe risk compensation as a testable response to the perception of reduced risk, and to assess the methodological rigor and ethical justification of study designs aiming to isolate risk compensation responses. Although the most rigorous methodological designs for assessing risk compensation behavior may be unavailable due to ethical flaws, several strategies can help investigators identify potential risk compensation behavior during Phase II, Phase III, and Phase IV testing of new technologies. Where concerns arise regarding risk compensation behavior, empirical evidence about the incidence, types, and extent of these behavioral changes can illuminate opportunities to better support the users of new HIV prevention strategies. This Commentary concludes by suggesting a new way to conceptualize risk compensation behavior in the HIV prevention context. PMID:23597916

  3. Methodology to design a municipal solid waste pre-collection system. A case study

    SciTech Connect

    Gallardo, A.; Carlos, M.; Peris, M.; Colomer, F.J.

    2015-02-15

    Highlights: • MSW recovery starts at homes; therefore it is important to facilitate it for people. • Additionally, to optimize MSW collection, a previous pre-collection stage must be planned. • A methodology to organize pre-collection considering several factors is presented. • The methodology has been verified by applying it to a Spanish middle-sized town. - Abstract: Municipal solid waste (MSW) management is an important task that local governments as well as private companies must take into account to protect human health and the environment and to preserve natural resources. To design an adequate MSW management plan, the first step consists in defining the waste generation and composition patterns of the town. As these patterns depend on several socio-economic factors, it is advisable to analyze them beforehand. Moreover, the waste generation and composition patterns may vary around the town and over time. Generally, the data are not homogeneous across the city, as the number of inhabitants is not constant, nor is the economic activity. Therefore, if all the information is shown in thematic maps, the final waste management decisions can be made more efficiently. The main aim of this paper is to present a structured methodology that allows local authorities or private companies who deal with MSW to design their own MSW management plan depending on the available data. According to these data, this paper proposes two ways of action: a direct way when detailed data are available and an indirect way when there is a lack of data and it is necessary to rely on bibliographic data. In any case, the amount of information needed is considerable. This paper combines the planning methodology with Geographic Information Systems to present the final results in thematic maps that make them easier to interpret. The proposed methodology is a useful preliminary tool to organize the MSW collection routes, including selective collection. To verify the methodology it has

  4. A methodology for the validated design space exploration of fuel cell powered unmanned aerial vehicles

    NASA Astrophysics Data System (ADS)

    Moffitt, Blake Almy

    Unmanned Aerial Vehicles (UAVs) are the most dynamic growth sector of the aerospace industry today. The need to provide persistent intelligence, surveillance, and reconnaissance for military operations is driving the planned acquisition of over 5,000 UAVs over the next five years. The most pressing need is for quiet, small UAVs with endurance beyond what is possible with advanced batteries or small internal combustion propulsion systems. Fuel cell systems demonstrate high efficiency, high specific energy, low noise, low temperature operation, modularity, and rapid refuelability, making them a promising enabler of the small, quiet, and persistent UAVs that military planners are seeking. Despite the perceived benefits, the actual near-term performance of fuel cell powered UAVs is unknown. Until the auto industry began spending billions of dollars in research, fuel cell systems were too heavy for useful flight applications. However, the last decade has seen rapid development, with fuel cell gravimetric and volumetric power density nearly doubling every 2--3 years. As a result, a few design studies and demonstrator aircraft have appeared, but overall the design methodology and vehicles are still in their infancy. The design of fuel cell aircraft poses many challenges. Fuel cells differ fundamentally from combustion-based propulsion in how they generate power and interact with other aircraft subsystems. As a result, traditional multidisciplinary analysis (MDA) codes are inappropriate. Building new MDAs is difficult since fuel cells are rapidly changing in design, and various competitive architectures exist for balance of plant, hydrogen storage, and all-electric aircraft subsystems. In addition, fuel cell design and performance data are closely protected, which makes validation difficult and uncertainty significant. Finally, low specific power and high volumes compared to traditional combustion-based propulsion result in more highly constrained design spaces that are

  5. Methodology for the Design of Streamline-Traced External-Compression Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Slater, John W.

    2014-01-01

    A design methodology based on streamline-tracing is discussed for the design of external-compression, supersonic inlets for flight below Mach 2.0. The methodology establishes a supersonic compression surface and capture cross-section by tracing streamlines through an axisymmetric Busemann flowfield. The compression system of shock and Mach waves is altered through modifications to the leading edge and shoulder of the compression surface. An external terminal shock is established to create subsonic flow which is diffused in the subsonic diffuser. The design methodology was implemented into the SUPIN inlet design tool. SUPIN uses specified design factors to design the inlets and computes the inlet performance, which includes the flow rates, total pressure recovery, and wave drag. A design study was conducted using SUPIN and the Wind-US computational fluid dynamics code to design and analyze the properties of two streamline-traced, external-compression (STEX) supersonic inlets for Mach 1.6 freestream conditions. The STEX inlets were compared to axisymmetric pitot, two-dimensional, and axisymmetric spike inlets. The STEX inlets had slightly lower total pressure recovery and higher levels of total pressure distortion than the axisymmetric spike inlet. The cowl wave drag coefficients of the STEX inlets were 20% of those for the axisymmetric spike inlet. The STEX inlets had external sound pressures that were 37% of those of the axisymmetric spike inlet, which may result in lower adverse sonic boom characteristics. The flexibility of the shape of the capture cross-section may result in benefits for the integration of STEX inlets with aircraft.

  6. Development of an aggregation methodology for risk analysis in aerospace conceptual vehicle design

    NASA Astrophysics Data System (ADS)

    Chytka, Trina Marsh

    2003-10-01

    The growing complexity of technical systems has emphasized a need to gather as much information as possible regarding specific systems of interest in order to make robust, sound decisions about their design and deployment. Acquiring as much data as possible requires the use of empirical statistics, historical information and expert opinion. In much of the aerospace conceptual design environment, the lack of historical information and infeasibility of gathering empirical data relegates the data collection to expert opinion. The conceptual design of a space vehicle requires input from several disciplines (weights and sizing, operations, trajectory, etc.). In this multidisciplinary environment, the design variables are often not easily quantified and have a high degree of uncertainty associated with their values. Decision-makers must rely on expert assessments of the uncertainty associated with the design variables to evaluate the risk level of a conceptual design. Since multiple experts are often queried for their evaluation of uncertainty, a means to combine/aggregate multiple expert assessments must be developed. Providing decision-makers with a solitary assessment that captures the consensus of the multiple experts would greatly enhance the ability to evaluate risk associated with a conceptual design. The objective of this research has been to develop an aggregation methodology that efficiently combines the uncertainty assessments of multiple experts in multiple disciplines involved in aerospace conceptual design. Bayesian probability augmented by uncertainty modeling and expert calibration was employed in the methodology construction. Appropriate questionnaire techniques were used to acquire expert opinion; the responses served as input distributions to the aggregation algorithm. The derived techniques were applied as part of a larger expert assessment elicitation and calibration study. Results of this research demonstrate that aggregation of
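    The abstract does not give the aggregation algorithm itself. As a hedged illustration, the sketch below uses a calibration-weighted linear opinion pool over triangular expert assessments, a common baseline for combining expert uncertainty; the thesis's Bayesian scheme is more elaborate, and the experts and weights here are hypothetical.

```python
import random, statistics

def pool_experts(assessments, weights, n=10000, seed=1):
    """Linear opinion pool: sample each expert's triangular (low, mode, high)
    uncertainty assessment in proportion to a calibration weight, yielding one
    aggregate distribution. A common aggregation baseline, offered only as an
    illustration of combining multiple expert assessments."""
    rng = random.Random(seed)
    total = sum(weights)
    probs = [w / total for w in weights]
    draws = []
    for _ in range(n):
        low, mode, high = rng.choices(assessments, probs)[0]
        draws.append(rng.triangular(low, high, mode))
    return draws

# Three hypothetical experts assess a vehicle dry-mass growth factor.
experts = [(1.05, 1.10, 1.20), (1.00, 1.12, 1.25), (1.08, 1.15, 1.30)]
calibration = [0.5, 0.3, 0.2]          # assumed calibration weights
agg = pool_experts(experts, calibration)
print(f"aggregate mean: {statistics.mean(agg):.3f}")
```

The aggregate distribution, rather than any single expert's, is what a decision-maker would use to judge the risk level of the concept.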

  7. Progress in the Development of a Nozzle Design Methodology for Pulsed Detonation Engines

    NASA Technical Reports Server (NTRS)

    Leary, B. A.; Waltrup, P. J.; Rice, T.; Cybyk, B. Z.

    2002-01-01

    The Johns Hopkins University Applied Physics Laboratory (JHU/APL), in support of the NASA Glenn Research Center (NASA GRC), is investigating performance methodologies and system integration issues related to Pulsed Detonation Engine (PDE) nozzles. The primary goal of this ongoing effort is to develop design and performance assessment methodologies applicable to PDE exit nozzle(s). APL is currently focusing its efforts on a common plenum chamber design that collects the exhaust products from multiple PDE tubes prior to expansion in a single converging-diverging exit nozzle. To accomplish this goal, a time-dependent, quasi-one-dimensional analysis for determining the flow properties in and through a single plenum and exhaust nozzle is underway. In support of these design activities, parallel modeling efforts using commercial Computational Fluid Dynamics (CFD) software are ongoing. These efforts include both two- and three-dimensional as well as steady and time-dependent computations to assess the flow in and through these devices. This paper discusses the progress in developing this nozzle design methodology.
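    A building block of any quasi-one-dimensional nozzle analysis like the one described is the isentropic area-Mach relation; the sketch below inverts it by bisection. The ratio of specific heats and the example area ratio are illustrative choices, not values from the JHU/APL study (detonation products would have a different gamma).

```python
GAMMA = 1.4  # ratio of specific heats; air value used purely for illustration

def area_ratio(mach):
    """Isentropic quasi-1D area ratio A/A* as a function of Mach number."""
    t = (2.0 / (GAMMA + 1.0)) * (1.0 + 0.5 * (GAMMA - 1.0) * mach**2)
    return (1.0 / mach) * t ** ((GAMMA + 1.0) / (2.0 * (GAMMA - 1.0)))

def mach_from_area(ratio, supersonic=True):
    """Invert the area-Mach relation by bisection on the chosen branch
    (the relation is double-valued: one subsonic, one supersonic root)."""
    lo, hi = (1.0, 50.0) if supersonic else (1e-6, 1.0)
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        too_big = area_ratio(mid) > ratio
        if supersonic:
            lo, hi = (lo, mid) if too_big else (mid, hi)
        else:
            lo, hi = (mid, hi) if too_big else (lo, mid)
    return 0.5 * (lo + hi)

# Exit Mach number for a hypothetical converging-diverging nozzle, A_exit/A* = 4.
m_exit = mach_from_area(4.0)
print(f"exit Mach: {m_exit:.3f}")
```

In a time-dependent plenum analysis this steady relation would be replaced by unsteady quasi-1D Euler equations, but the area-Mach inversion remains a useful sanity check on the computed exit state.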

  8. The Design, Implementation, and Evaluation of Online Credit Nutrition Courses: A Systematic Review

    ERIC Educational Resources Information Center

    Cohen, Nancy L.; Carbone, Elena T.; Beffa-Negrini, Patricia A.

    2011-01-01

    Objective: To assess how postsecondary online nutrition education courses (ONEC) are delivered, determine ONEC effectiveness, identify theoretical models used, and identify future research needs. Design: Systematic search of database literature. Setting: Postsecondary education. Participants: Nine research articles evaluating postsecondary ONEC.…

  9. Storytelling to Enhance Teaching and Learning: The Systematic Design, Development, and Testing of Two Online Courses

    ERIC Educational Resources Information Center

    Hirumi, Atsusi; Sivo, Stephen; Pounds, Kelly

    2012-01-01

    Storytelling may be a powerful instructional approach for engaging learners and facilitating e-learning. However, relatively little is known about how to apply story within the context of systematic instructional design processes and claims for the effectiveness of storytelling in training and education have been primarily anecdotal and…

  10. Digital Games, Design, and Learning: A Systematic Review and Meta-Analysis

    ERIC Educational Resources Information Center

    Clark, Douglas B.; Tanner-Smith, Emily E.; Killingsworth, Stephen S.

    2016-01-01

    In this meta-analysis, we systematically reviewed research on digital games and learning for K-16 students. We synthesized comparisons of game versus nongame conditions (i.e., media comparisons) and comparisons of augmented games versus standard game designs (i.e., value-added comparisons). We used random-effects meta-regression models with robust…

  11. Robust model matching design methodology for a stochastic synthetic gene network.

    PubMed

    Chen, Bor-Sen; Chang, Chia-Hung; Wang, Yu-Chao; Wu, Chih-Hung; Lee, Hsiao-Ching

    2011-03-01

    Synthetic biology has shown its potential and promising applications in the last decade. However, many synthetic gene networks cannot work properly and maintain their desired behaviors due to intrinsic parameter variations and extrinsic disturbances. In this study, the intrinsic parameter uncertainties and external disturbances are modeled in a non-linear stochastic gene network to mimic the real environment in the host cell. Then a non-linear stochastic robust matching design methodology is introduced to withstand the intrinsic parameter fluctuations and to attenuate the extrinsic disturbances in order to achieve a desired reference matching purpose. To avoid solving the Hamilton-Jacobi inequality (HJI) in the non-linear stochastic robust matching design, a global linearization technique is used to simplify the design procedure by solving a set of linear matrix inequalities (LMIs). As a result, the proposed matching design methodology for the robust synthetic gene network can be efficiently carried out with the help of the LMI toolbox in MATLAB. Finally, two in silico design examples of the robust synthetic gene network are given to illustrate the design procedure and to confirm that the robust model matching achieves the desired behavior in spite of stochastic parameter fluctuations and environmental disturbances in the host cell. PMID:21215760

  12. Methodology development of an engineering design expert system utilizing a modular knowledge-base inference process

    NASA Astrophysics Data System (ADS)

    Winter, Steven John

    Methodology development was conducted to incorporate a modular knowledge-base representation into an expert system engineering design application. The objective of using multidisciplinary methodologies in defining a design system was to develop a system framework applicable to a wide range of engineering applications. The technique of "knowledge clustering" was used to construct a general decision tree for all factual information relating to the design application. This construction combined the surface knowledge of the design process with application-specific depth knowledge. Utilization of both levels of knowledge created a system capable of processing multiple controlling tasks, including: organizing factual information relative to the cognitive levels of the design process, building finite element models for depth-knowledge analysis, developing a standardized finite element code for parallel processing, and determining the best solution generated by design optimization procedures. Proof of concept for the methodology developed here is shown in the implementation of an application defining the analysis and optimization of a composite aircraft canard subjected to a general compound loading condition. This application contained a wide range of factual information and heuristic rules. The analysis tools used included a finite element (FE) processor and a numerical optimizer. An advisory knowledge-base was also developed to provide a standard for conversion of serial FE code for parallel processing. All knowledge-bases developed operated as advisory, selection, or classification systems. Laminate properties are limited to even-numbered, quasi-isotropic ply stacking sequences; this retains the full influence of the coupled in-plane and bending effects of the structure's behavior. The canard is modeled as a constant-thickness plate and discretized into a varying number of four- or nine-noded, quadrilateral, shear-deformable plate elements.
The benefit gained by

  13. Designing and Implementing INTREPID, an Intensive Program in Translational Research Methodologies for New Investigators

    PubMed Central

    Plottel, Claudia S.; Aphinyanaphongs, Yindalon; Shao, Yongzhao; Micoli, Keith J.; Fang, Yixin; Galeano, Claudia R.; Stangel, Jessica H.; Hochman, Judith S.; Cronstein, Bruce N.; Pillinger, Michael H.

    2014-01-01

    Senior housestaff and junior faculty are often expected to perform clinical research, yet may not always have the requisite knowledge and skills to do so successfully. Formal degree programs provide such knowledge, but require a significant commitment of time and money. Short-term training programs (days to weeks) provide alternative ways to accrue essential information and acquire fundamental methodological skills. Unfortunately, published information about short-term programs is sparse. To encourage discussion and exchange of ideas regarding such programs, we here share our experience developing and implementing INTREPID (INtensive Training in Research Statistics, Ethics, and Protocol Informatics and Design), a 24-day immersion training program in clinical research methodologies. Designing, planning, and offering INTREPID was feasible, and required significant faculty commitment, support personnel and infrastructure, as well as committed trainees. PMID:25066862

  14. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    NASA Technical Reports Server (NTRS)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  15. Methodology for CFD Design Analysis of National Launch System Nozzle Manifold

    NASA Technical Reports Server (NTRS)

    Haire, Scot L.

    1993-01-01

    The current design environment dictates that high technology CFD (Computational Fluid Dynamics) analysis produce quality results in a timely manner if it is to be integrated into the design process. The design methodology outlined describes the CFD analysis of an NLS (National Launch System) nozzle film cooling manifold. The objective of the analysis was to obtain a qualitative estimate for the flow distribution within the manifold. A complex, 3D, multiple zone, structured grid was generated from a 3D CAD file of the geometry. A Euler solution was computed with a fully implicit compressible flow solver. Post processing consisted of full 3D color graphics and mass averaged performance. The result was a qualitative CFD solution that provided the design team with relevant information concerning the flow distribution in and performance characteristics of the film cooling manifold within an effective time frame. Also, this design methodology was the foundation for a quick turnaround CFD analysis of the next iteration in the manifold design.

  16. Spintronic logic design methodology based on spin Hall effect-driven magnetic tunnel junctions

    NASA Astrophysics Data System (ADS)

    Kang, Wang; Wang, Zhaohao; Zhang, Youguang; Klein, Jacques-Olivier; Lv, Weifeng; Zhao, Weisheng

    2016-02-01

    Conventional complementary metal-oxide-semiconductor (CMOS) technology is now approaching its physical scaling limits to enable Moore’s law to continue. Spintronic devices, as one of the potential alternatives, show great promise to replace CMOS technology for next-generation low-power integrated circuits in nanoscale technology nodes. To date, spintronic memory has been successfully commercialized. However, spintronic logic still faces many critical challenges (e.g. direct cascading capability and small operation gain) before it can be practically applied. In this paper, we propose a standard complementary spintronic logic (CSL) design methodology to form a CMOS-like logic design paradigm. Using the spin Hall effect (SHE)-driven magnetic tunnel junction (MTJ) device as an example, we demonstrate CSL implementation, functionality and performance. This logic family provides a unified design methodology for spintronic logic circuits and partly solves the challenges of direct cascading capability and small operation gain in previously proposed spintronic logic designs. By solving a modified Landau-Lifshitz-Gilbert equation, the magnetization dynamics in the free layer of the MTJ is theoretically described and a compact electrical model is developed. With this electrical model, numerical simulations have been performed to evaluate the functionality and performance of the proposed CSL design. Simulation results demonstrate that the proposed CSL design paradigm is rather promising for low-power logic computing.
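    The Landau-Lifshitz-Gilbert dynamics mentioned above can be sketched as a macrospin integration. The SHE spin-torque terms of the paper's modified LLG equation are omitted here, so this is only the standard precession-plus-damping skeleton in reduced units, with an illustrative damping constant.

```python
# Minimal macrospin sketch of Landau-Lifshitz-Gilbert dynamics in reduced units
# (|m| = 1, field h in units of the applied field, time in inverse precession
# periods). The paper's model adds SHE spin-torque terms, omitted here.

ALPHA = 0.02  # Gilbert damping constant (illustrative)

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def llg_step(m, h, dt):
    """One explicit-Euler step of dm/dt = -(m x h) - alpha * m x (m x h),
    followed by renormalization to keep |m| = 1."""
    mxh = cross(m, h)
    mxmxh = cross(m, mxh)
    m = tuple(mi + dt * (-p - ALPHA * q) for mi, p, q in zip(m, mxh, mxmxh))
    norm = sum(mi * mi for mi in m) ** 0.5
    return tuple(mi / norm for mi in m)

# Relax a magnetization tilted away from +z toward an effective field along +z.
m = (0.6, 0.0, 0.8)
h = (0.0, 0.0, 1.0)
for _ in range(20000):
    m = llg_step(m, h, dt=0.01)
print(f"m_z after relaxation: {m[2]:.4f}")
```

The damping term pulls the magnetization toward the effective field while the precession term circulates it around the field axis; a compact electrical model like the paper's embeds such dynamics in a circuit simulator.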

  17. Neotype designation for Calotes versicolor Daudin, 1802 (Sauria: Agamidae) with notes on its systematics.

    PubMed

    Gowande, Gaurang; Mishra, Anurag; Mirza, Zeeshan A

    2016-01-01

    Calotes versicolor Daudin, 1802 is one of the most widespread agamid lizard species and was described without a locality. The type specimen of the species has long been considered lost; however, most workers have considered Pondicherry the type locality for the species. Studies by Zug et al. (2006) confirmed that C. versicolor is a complex of multiple species, which necessitates fixing a type locality and type specimen in order to resolve the systematics of the species complex. An adult male from Pondicherry was collected and is here designated as the neotype. A re-description of the species is provided along with notes on its systematics. PMID:27395587

  18. A methodology for evacuation design for urban areas: theoretical aspects and experimentation

    NASA Astrophysics Data System (ADS)

    Russo, F.; Vitetta, A.

    2009-04-01

    This paper proposes a unifying approach for the simulation and design of a transportation system under incoming safety and/or security conditions. Safety and security are concerned with threats generated by very different factors which, in turn, generate emergency conditions, such as the 9/11, Madrid and London attacks, the Asian tsunami, and Hurricane Katrina, considering just the last five years. In transportation systems, when an exogenous event happens and there is a sufficient time interval between the instant when the event happens and the instant when it takes effect on the population, it is possible to reduce the negative effects by evacuating the population. For such events, the evacuation can be prepared over the short and long term; for other events it is also possible to plan real-time evacuation within the general risk methodology. The development of models for emergency conditions in transportation systems has not received much attention in the literature. The main findings in this area are limited to only a few public research centres and private companies. In general, there is no systematic analysis of risk theory applied to transportation systems. Very often, in practice, vulnerability and exposure in the transportation system are considered as similar variables, or, worse, the exposure variables are treated as vulnerability variables. Models and algorithms specified and calibrated under ordinary conditions cannot be directly applied in emergency conditions under the usual hypotheses.
This paper is developed with the following main objectives: (a) to formalize the risk problem with a clear diversification (for the consequences) in the definition of vulnerability and exposure in a transportation system; the paper thus offers improvements over consolidated quantitative risk analysis models, especially transportation risk analysis models (risk assessment); (b) to formalize a system

  19. The Atomic Intrinsic Integration Approach: A Structured Methodology for the Design of Games for the Conceptual Understanding of Physics

    ERIC Educational Resources Information Center

    Echeverria, Alejandro; Barrios, Enrique; Nussbaum, Miguel; Amestica, Matias; Leclerc, Sandra

    2012-01-01

    Computer simulations combined with games have been successfully used to teach conceptual physics. However, there is no clear methodology for guiding the design of these types of games. To remedy this, we propose a structured methodology for the design of conceptual physics games that explicitly integrates the principles of the intrinsic…

  20. Nuclear design methodology for analyzing ultra high temperature highly compact ternary carbide reactor

    NASA Astrophysics Data System (ADS)

    Gouw, Reza Raymond

    Recent studies at the Innovative Nuclear Space Power and Propulsion Institute (INSPI) have demonstrated the feasibility of fabricating solid solutions of ternary carbide fuels such as (U,Zr,Nb)C, (U,Zr,Ta)C, (U,Zr,Hf)C and (U,Zr,W)C. The necessity for accurate nuclear design analysis of these ternary carbides in highly compact nuclear space systems prompted the development of a nuclear design methodology for analyzing these systems. This study presents the improvements made in high-temperature nuclear cross-sections, shows the relation between Monte Carlo and deterministic calculations, and demonstrates the significant role of the energy spectrum in multigroup nuclear cross-section generation for highly thermalized nuclear systems. The nuclear design methodology addresses several issues in the homogenization of a nuclear system, such as the comparison of energy spectra between a heterogeneous and a homogeneous system, as well as several key points in continuous and multigroup nuclear cross-section generation. The study presents the methodology for selecting broad energy group structures. Finally, a comparison between the Monte Carlo and deterministic methods is performed for the Square-Lattice Honeycomb Nuclear Space Reactor. The comparison includes system characterization calculations, such as energy spectrum comparison, 2-D power distributions, temperature coefficient analysis, and water submersion accident analysis.
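    The flux-weighted collapse at the heart of multigroup cross-section generation, where the energy spectrum plays the decisive role emphasized above, reduces to a short routine: each broad-group cross-section is the spectrum-weighted average of the fine-group values it spans. The fine-group data below are invented for illustration; the thesis derives its spectra from transport calculations.

```python
# Flux-weighted collapse of fine-group cross-sections onto broad groups:
#   sigma_G = sum(sigma_g * phi_g) / sum(phi_g)  over fine groups g in G.
# Cross-sections and spectrum below are illustrative, not reactor data.

def collapse(sigma, phi, boundaries):
    """Collapse fine-group data to broad groups. `boundaries` lists the first
    fine-group index of each broad group; the last broad group runs to the end."""
    broad = []
    edges = list(boundaries) + [len(sigma)]
    for lo, hi in zip(edges, edges[1:]):
        flux = sum(phi[lo:hi])
        broad.append(sum(s * p for s, p in zip(sigma[lo:hi], phi[lo:hi])) / flux)
    return broad

sigma_fine = [2.0, 2.5, 3.0, 8.0, 12.0, 20.0]   # barns, fast -> thermal
phi_fine = [0.30, 0.25, 0.15, 0.10, 0.10, 0.10] # normalized spectrum weights
print(collapse(sigma_fine, phi_fine, [0, 3]))   # two broad groups
```

Because the weights phi depend on the system (heterogeneous versus homogeneous, fast versus thermalized), the same fine-group library collapses to different broad-group constants, which is exactly the spectrum sensitivity the methodology addresses.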

  1. A Visual Analytics Based Decision Support Methodology For Evaluating Low Energy Building Design Alternatives

    NASA Astrophysics Data System (ADS)

    Dutta, Ranojoy

    The ability to design high performance buildings has acquired great importance in recent years due to numerous federal, societal and environmental initiatives. However, this endeavor is much more demanding in terms of designer expertise and time. It requires a whole new level of synergy between automated performance prediction and the human capabilities to perceive, evaluate and ultimately select a suitable solution. While performance prediction can be highly automated through the use of computers, performance evaluation cannot, unless it is with respect to a single criterion. The need to address multi-criteria requirements makes it more valuable for a designer to know the "latitude" or "degrees of freedom" he has in changing certain design variables while achieving preset criteria such as energy performance, life cycle cost, and environmental impacts. This requirement can be met by a decision support framework based on near-optimal "satisficing" as opposed to purely optimal decision making techniques. Currently, such a comprehensive design framework is lacking, which is the basis for undertaking this research. The primary objective of this research is to facilitate a complementary relationship between designers and computers for Multi-Criterion Decision Making (MCDM) during high performance building design. It is based on the application of Monte Carlo approaches to create a database of solutions using deterministic whole building energy simulations, along with data mining methods to rank variable importance and reduce the multi-dimensionality of the problem. A novel interactive visualization approach is then proposed which uses regression based models to create dynamic interplays of how varying these important variables affects the multiple criteria, while providing a visual range or band of variation of the different design parameters. The MCDM process has been incorporated into an alternative methodology for high performance building design referred to as
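
    The sampling-then-data-mining loop described above can be sketched as follows. This is a minimal illustration, not the study's actual framework: the design variables (`wwr`, `r_value`), the stand-in criteria functions, and the satisficing thresholds are all invented for the example, and a simple correlation ranking stands in for the data-mining step.

```python
import random

# Hypothetical design variables and ranges (invented for illustration):
# window-to-wall ratio and envelope insulation R-value.
BOUNDS = {"wwr": (0.1, 0.6), "r_value": (10.0, 40.0)}

def energy_use(d):
    """Stand-in for one deterministic whole-building simulation run."""
    return 100 + 20 * d["wwr"] - 2.0 * d["r_value"]

def life_cycle_cost(d):
    """Stand-in cost model: better insulation costs more up front."""
    return 50 + 2.0 * d["r_value"] + 10 * d["wwr"]

def build_database(n, seed=0):
    """Monte Carlo step: evaluate each randomly sampled design on all criteria."""
    rng = random.Random(seed)
    rows = []
    for _ in range(n):
        d = {k: rng.uniform(lo, hi) for k, (lo, hi) in BOUNDS.items()}
        rows.append({**d, "energy": energy_use(d), "cost": life_cycle_cost(d)})
    return rows

def rank_importance(rows, criterion):
    """Rank variables by |Pearson correlation| with one criterion -- a very
    simple stand-in for the data-mining step described in the abstract."""
    def corr(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)
    scores = {v: abs(corr([r[v] for r in rows], [r[criterion] for r in rows]))
              for v in BOUNDS}
    return sorted(scores, key=scores.get, reverse=True)

rows = build_database(500)
# "Satisficing": keep every design meeting preset criteria rather than
# searching for a single optimum (thresholds are illustrative).
feasible = [r for r in rows if r["energy"] < 60 and r["cost"] < 115]
```

    The feasible set, rather than one optimum, is what a designer would then explore visually, which is the "latitude" the abstract refers to.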

  2. Applying Quality Indicators to Single-Case Research Designs Used in Special Education: A Systematic Review

    ERIC Educational Resources Information Center

    Moeller, Jeremy D.; Dattilo, John; Rusch, Frank

    2015-01-01

    This study examined how specific guidelines and heuristics have been used to identify methodological rigor associated with single-case research designs based on quality indicators developed by Horner et al. Specifically, this article describes how literature reviews have applied Horner et al.'s quality indicators and evidence-based criteria.…

  3. Biomarker-Guided Adaptive Trial Designs in Phase II and Phase III: A Methodological Review

    PubMed Central

    Antoniou, Miranta; Jorgensen, Andrea L; Kolamunnage-Dona, Ruwanthi

    2016-01-01

    Background Personalized medicine is a growing area of research which aims to tailor the treatment given to a patient according to one or more personal characteristics. These characteristics can be demographic such as age or gender, or biological such as a genetic or other biomarker. Prior to utilizing a patient’s biomarker information in clinical practice, robust testing in terms of analytical validity, clinical validity and clinical utility is necessary. A number of clinical trial designs have been proposed for testing a biomarker’s clinical utility, including Phase II and Phase III clinical trials which aim to test the effectiveness of a biomarker-guided approach to treatment; these designs can be broadly classified into adaptive and non-adaptive. While adaptive designs allow planned modifications based on accumulating information during a trial, non-adaptive designs are typically simpler but less flexible. Methods and Findings We have undertaken a comprehensive review of biomarker-guided adaptive trial designs proposed in the past decade. We have identified eight distinct biomarker-guided adaptive designs and nine variations from 107 studies. Substantial variability has been observed in terms of how trial designs are described and particularly in the terminology used by different authors. We have graphically displayed the current biomarker-guided adaptive trial designs and summarised the characteristics of each design. Conclusions Our in-depth overview provides future researchers with clarity in definition, methodology and terminology for biomarker-guided adaptive trial designs. PMID:26910238

  4. Methodology to design a municipal solid waste generation and composition map: A case study

    SciTech Connect

    Gallardo, A.; Carlos, M.; Peris, M.; Colomer, F.J.

    2014-11-15

    Highlights: • To draw a waste generation and composition map of a town, many factors must be taken into account. • The proposed methodology offers two different approaches depending on the available data, combined with geographical information systems. • The methodology has been applied successfully to a Spanish city. • The methodology will be a useful tool for organizing municipal solid waste management. - Abstract: Municipal solid waste (MSW) management is an important task that local governments as well as private companies must undertake to protect human health and the environment and to preserve natural resources. To design an adequate MSW management plan, the first step consists of defining the waste generation and composition patterns of the town. As these patterns depend on several socio-economic factors, it is advisable to characterize those factors first. Moreover, the waste generation and composition patterns may vary across the town and over time. Generally, the data are not homogeneous across the city, as neither the number of inhabitants nor the economic activity is constant. Therefore, if all the information is shown in thematic maps, final waste management decisions can be made more efficiently. The main aim of this paper is to present a structured methodology that allows local authorities or private companies dealing with MSW to design their own MSW management plan depending on the available data. According to these data, this paper proposes two ways of action: a direct way when detailed data are available, and an indirect way when data are lacking and bibliographic data must be used instead. In either case, the amount of information needed is considerable. This paper combines the planning methodology with Geographic Information Systems to present the final results in thematic maps that make them easier to interpret. The proposed methodology is a useful preliminary tool for organizing the MSW collection routes including the

  5. Application of an integrated flight/propulsion control design methodology to a STOVL aircraft

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay; Mattern, Duane L.

    1991-01-01

    Results are presented from the application of an emerging Integrated Flight/Propulsion Control (IFPC) design methodology to a Short Take Off and Vertical Landing (STOVL) aircraft in transition flight. A previously designed centralized controller is first validated for the integrated airframe/engine plant used; this integrated plant is derived from a different model of the engine subsystem than the one used for the centralized controller design. The centralized controller is then partitioned into a decentralized, hierarchical structure comprising airframe lateral and longitudinal subcontrollers and an engine subcontroller. Command shaping prefilters from the pilot control effector inputs are then designed to provide the overall desired response to pilot command inputs, and time histories of the closed-loop IFPC system response to simulated pilot commands are compared to desired responses based on handling qualities requirements. Finally, the propulsion system safety and nonlinear limit protection logic is wrapped around the engine subcontroller, and the response of the closed-loop integrated system is evaluated for transients that encounter the propulsion surge margin limit.

  6. A methodology for the validated design space exploration of fuel cell powered unmanned aerial vehicles

    NASA Astrophysics Data System (ADS)

    Moffitt, Blake Almy

    Unmanned Aerial Vehicles (UAVs) are the most dynamic growth sector of the aerospace industry today. The need to provide persistent intelligence, surveillance, and reconnaissance for military operations is driving the planned acquisition of over 5,000 UAVs over the next five years. The most pressing need is for quiet, small UAVs with endurance beyond what is achievable with advanced batteries or small internal combustion propulsion systems. Fuel cell systems demonstrate high efficiency, high specific energy, low noise, low temperature operation, modularity, and rapid refuelability, making them a promising enabler of the small, quiet, and persistent UAVs that military planners are seeking. Despite the perceived benefits, the actual near-term performance of fuel cell powered UAVs is unknown. Until the auto industry began spending billions of dollars in research, fuel cell systems were too heavy for useful flight applications. However, the last decade has seen rapid development, with fuel cell gravimetric and volumetric power density nearly doubling every 2-3 years. As a result, a few design studies and demonstrator aircraft have appeared, but overall the design methodology and vehicles are still in their infancy. The design of fuel cell aircraft poses many challenges. Fuel cells differ fundamentally from combustion based propulsion in how they generate power and interact with other aircraft subsystems. As a result, traditional multidisciplinary analysis (MDA) codes are inappropriate. Building new MDAs is difficult since fuel cells are rapidly changing in design, and various competitive architectures exist for balance of plant, hydrogen storage, and all electric aircraft subsystems. In addition, fuel cell design and performance data are closely protected, which makes validation difficult and uncertainty significant. Finally, low specific power and high volumes compared to traditional combustion based propulsion result in more highly constrained design spaces that are

  7. Methodology for worker neutron exposure evaluation in the PDCF facility design.

    PubMed

    Scherpelz, R I; Traub, R J; Pryor, K H

    2004-01-01

    A project headed by Washington Group International aims to design the Pit Disassembly and Conversion Facility (PDCF) to convert the plutonium pits from excess nuclear weapons into plutonium oxide for ultimate disposition. Battelle staff are performing the shielding calculations that will determine appropriate shielding so that facility workers will not exceed target exposure levels. The target exposure levels for workers in the facility are 5 mSv/y for the whole body and 100 mSv/y for the extremities, which presents a significant challenge to the designers of a facility that will process tons of radioactive material. The design effort depended on shielding calculations to determine appropriate thickness and composition for glove box walls and concrete wall thicknesses for storage vaults. Pacific Northwest National Laboratory (PNNL) staff used ORIGEN-S and SOURCES to generate gamma and neutron source terms, and the Monte Carlo N-Particle transport code (MCNP-4C) to calculate the radiation transport in the facility. The shielding calculations were performed by a team of four scientists, so it was necessary to develop a consistent methodology. There was also a requirement for the study to be cost-effective, so efficient methods of evaluation were required. The calculations were subject to rigorous scrutiny by internal and external reviewers, so acceptability was a major feature of the methodology. Issues addressed in the development of the methodology included selecting appropriate dose factors, developing a method for handling extremity doses, adopting an efficient method for evaluating effective dose equivalent in a non-uniform radiation field, modelling the reinforcing steel in concrete, and modularising the geometry descriptions for efficiency. The relative importance of the neutron dose equivalent compared with the gamma dose equivalent varied substantially depending on the specific shielding conditions and lessons

  8. Empirical Evidence of Study Design Biases in Randomized Trials: Systematic Review of Meta-Epidemiological Studies

    PubMed Central

    Page, Matthew J.; Higgins, Julian P. T.; Clayton, Gemma; Sterne, Jonathan A. C.; Hróbjartsson, Asbjørn; Savović, Jelena

    2016-01-01

    Objective To synthesise evidence on the average bias and heterogeneity associated with reported methodological features of randomized trials. Design Systematic review of meta-epidemiological studies. Methods We retrieved eligible studies included in a recent AHRQ-EPC review on this topic (latest search September 2012), and searched Ovid MEDLINE and Ovid EMBASE for studies indexed from Jan 2012-May 2015. Data were extracted by one author and verified by another. We combined estimates of average bias (e.g. ratio of odds ratios (ROR) or difference in standardised mean differences (dSMD)) in meta-analyses using the random-effects model. Analyses were stratified by type of outcome (“mortality” versus “other objective” versus “subjective”). Direction of effect was standardised so that ROR < 1 and dSMD < 0 denotes a larger intervention effect estimate in trials with an inadequate or unclear (versus adequate) characteristic. Results We included 24 studies. The available evidence suggests that intervention effect estimates may be exaggerated in trials with inadequate/unclear (versus adequate) sequence generation (ROR 0.93, 95% CI 0.86 to 0.99; 7 studies) and allocation concealment (ROR 0.90, 95% CI 0.84 to 0.97; 7 studies). For these characteristics, the average bias appeared to be larger in trials of subjective outcomes compared with other objective outcomes. Also, intervention effects for subjective outcomes appear to be exaggerated in trials with lack of/unclear blinding of participants (versus blinding) (dSMD -0.37, 95% CI -0.77 to 0.04; 2 studies), lack of/unclear blinding of outcome assessors (ROR 0.64, 95% CI 0.43 to 0.96; 1 study) and lack of/unclear double blinding (ROR 0.77, 95% CI 0.61 to 0.93; 1 study). The influence of other characteristics (e.g. unblinded trial personnel, attrition) is unclear. Conclusions Certain characteristics of randomized trials may exaggerate intervention effect estimates. The average bias appears to be greatest in trials of
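
    The pooling step described above can be sketched with a random-effects (DerSimonian-Laird) combination of log-scale ratios of odds ratios. The per-study log-RORs and standard errors below are invented for illustration and are not the review's data:

```python
import math

def pool_random_effects(log_effects, ses, z=1.96):
    """Random-effects meta-analysis (DerSimonian-Laird) of log-scale
    effect estimates, e.g. log ratios of odds ratios (RORs)."""
    w = [1.0 / se ** 2 for se in ses]                      # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, log_effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(log_effects) - 1)) / c)      # between-study variance
    w_star = [1.0 / (se ** 2 + tau2) for se in ses]        # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, log_effects)) / sum(w_star)
    se_pooled = math.sqrt(1.0 / sum(w_star))
    lo, hi = pooled - z * se_pooled, pooled + z * se_pooled
    return math.exp(pooled), (math.exp(lo), math.exp(hi)), tau2

# Invented per-study log-RORs and standard errors (not the review's data):
log_rors = [math.log(0.90), math.log(0.95), math.log(0.88)]
ses = [0.05, 0.08, 0.06]
ror, ci, tau2 = pool_random_effects(log_rors, ses)
```

    A pooled ROR below 1 with a CI excluding 1 would indicate, on average, exaggerated intervention effects in trials with the inadequate/unclear characteristic.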

  9. A novel generalized design methodology and realization of Boolean operations using DNA.

    PubMed

    Zoraida, B S E; Arock, Michael; Ronald, B S M; Ponalagusamy, R

    2009-09-01

    The biological deoxyribonucleic acid (DNA) strand has increasingly been seen as a promising computing unit. A new algorithm is formulated in this paper to design any DNA Boolean operator with molecular beacons (MBs) as its input. Boolean operators realized using the proposed design methodology are presented. The developed operators adopt a uniform representation of logical 0 and 1 for any Boolean operator, and employ only a hybridization operation at each stage. Further, this paper presents, for the first time, the realization of a binary adder and subtractor using molecular beacons. Simulation results of the DNA-based binary adder and subtractor are given to validate the design. PMID:19505531

  10. Toward a systematic design theory for silicon solar cells using optimization techniques

    NASA Technical Reports Server (NTRS)

    Misiakos, K.; Lindholm, F. A.

    1986-01-01

    This work is a first detailed attempt to systematize the design of silicon solar cells. Design principles follow from three theorems. Although the results hold only under low injection conditions in base and emitter regions, they hold for arbitrary doping profiles and include the effects of drift fields, high/low junctions and heavy doping concentrations of donor or acceptor atoms. Several optimal designs are derived from the theorems, one of which involves a three-dimensional morphology in the emitter region. The theorems are derived from a nonlinear differential equation of the Riccati form, the dependent variable of which is a normalized recombination particle current.
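
    The abstract names the governing relation but does not reproduce it. As a hedged sketch (symbols and coefficient functions are generic placeholders, not the paper's), a first-order ODE of the Riccati form for a normalized recombination current i(x) reads:

```latex
% Generic Riccati form; in the paper the dependent variable is a normalized
% recombination particle current, and the coefficients would encode the doping
% profile, drift fields and heavy-doping effects (exact forms not given in the
% abstract).
\frac{di}{dx} \;=\; q_0(x) \;+\; q_1(x)\, i \;+\; q_2(x)\, i^{2}
```

    The quadratic term is what distinguishes a Riccati equation from a linear first-order ODE; a single known particular solution reduces it to a linear equation, which helps explain why the form is amenable to deriving closed-form design theorems.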

  11. An Optimal Design Methodology of Tapered Roller Bearings Using Genetic Algorithms

    NASA Astrophysics Data System (ADS)

    Tiwari, Rajiv; Sunil, Kumar K.; Reddy, R. S.

    2012-03-01

    In the design of tapered roller bearings, long life is one of the most important criteria. The design of bearings has to satisfy constraints of geometry and strength while operating at rated speed. An optimal design methodology is needed to achieve this objective (i.e., the maximization of fatigue life). The fatigue life is directly proportional to the dynamic capacity; hence, for the present case, the latter has been chosen as the objective function. It has been optimized using a constrained nonlinear formulation with real-coded genetic algorithms. Design variables for the bearing include four geometrical parameters: the bearing pitch diameter, the diameter of the roller, the effective length of the roller, and the number of rollers. These directly affect the dynamic capacity of tapered roller bearings. In addition, another five design constraint constants are included, which indirectly affect the basic dynamic capacity; these constants have been given bounds based on parametric studies through initial optimization runs. There is good agreement between the optimized and standard bearings with respect to the basic dynamic capacity. A convergence study has been carried out to ensure a global optimum point in the design. A sensitivity analysis of various design parameters, using the Monte Carlo simulation method, has been performed to assess changes in the dynamic capacity. Illustrations show that none of the geometric design parameters has an adverse effect on the dynamic capacity.
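
    A minimal sketch of this optimization setup follows. The bounds are invented, the objective is a stand-in monotone surrogate (a real study would use the standardised roller-bearing dynamic-capacity formula), and a single toy geometric constraint replaces the full constraint set:

```python
import math
import random

rng = random.Random(42)

# Illustrative bounds for the four geometric design variables named in the
# abstract (pitch diameter, roller diameter, roller effective length, roller
# count); the numbers are placeholders, not catalogue data.
BOUNDS = [(60.0, 90.0), (8.0, 16.0), (10.0, 25.0), (12.0, 28.0)]

def dynamic_capacity(x):
    """Stand-in monotone surrogate for basic dynamic capacity."""
    _pitch, d, l, z = x
    return (l ** (7 / 9)) * (z ** (3 / 4)) * (d ** (29 / 27))

def feasible(x):
    """Toy geometric constraint: rollers must fit around the pitch circle."""
    pitch, d, _l, z = x
    return z * d < math.pi * pitch

def ga_maximize(objective, bounds, pop_size=40, generations=60):
    """Minimal real-coded GA: tournament selection, per-gene blend
    crossover, uniform mutation; infeasible candidates score zero."""
    def fitness(x):
        return objective(x) if feasible(x) else 0.0
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        for _ in range(pop_size):
            a = max(rng.sample(pop, 3), key=fitness)   # tournament parent 1
            b = max(rng.sample(pop, 3), key=fitness)   # tournament parent 2
            child = [ai + rng.random() * (bi - ai) for ai, bi in zip(a, b)]
            if rng.random() < 0.2:                     # mutate one gene
                i = rng.randrange(len(bounds))
                child[i] = rng.uniform(*bounds[i])
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = ga_maximize(dynamic_capacity, BOUNDS)
```

    Because blend crossover keeps every gene between its parents' values, candidates always stay inside the bounds; the constraint is enforced only through the zero-fitness penalty.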

  12. Hypnosis for procedure-related pain and distress in pediatric cancer patients: a systematic review of effectiveness and methodology related to hypnosis interventions.

    PubMed

    Richardson, Janet; Smith, Joanna E; McCall, Gillian; Pilkington, Karen

    2006-01-01

    The aim of this study was to systematically review and critically appraise the evidence on the effectiveness of hypnosis for procedure-related pain and distress in pediatric cancer patients. A comprehensive search of major biomedical and specialist complementary and alternative medicine databases was conducted. Citations were included from the databases' inception to March 2005. Efforts were made to identify unpublished and ongoing research. Controlled trials were appraised using predefined criteria. Clinical commentaries were obtained for each study. Seven randomized controlled clinical trials and one controlled clinical trial were found. Studies report positive results, including statistically significant reductions in pain and anxiety/distress, but a number of methodological limitations were identified. Systematic searching and appraisal has demonstrated that hypnosis has potential as a clinically valuable intervention for procedure-related pain and distress in pediatric cancer patients. Further research into the effectiveness and acceptability of hypnosis for pediatric cancer patients is recommended. PMID:16442484

  13. Device Thrombogenicity Emulator (DTE) – Design Optimization Methodology for Cardiovascular Devices: A Study in Two Bileaflet MHV Designs

    PubMed Central

    Xenos, Michalis; Girdhar, Gaurav; Alemu, Yared; Jesty, Jolyon; Slepian, Marvin; Einav, Shmuel; Bluestein, Danny

    2010-01-01

    Patients who receive prosthetic heart valve (PHV) implants require mandatory anticoagulation medication after implantation due to the thrombogenic potential of the valve. Optimization of PHV designs may facilitate reduction of flow-induced thrombogenicity and reduce or eliminate the need for post-implant anticoagulants. We present a methodology entitled Device Thrombogenicity Emulator (DTE) for optimizing the thrombo-resistance performance of PHVs by combining numerical and experimental approaches. Two bileaflet mechanical heart valve (MHV) designs – St. Jude Medical (SJM) and ATS – were investigated by studying the effect of distinct flow phases on platelet activation. Transient turbulent and direct numerical simulations (DNS) were conducted, and stress loading histories experienced by the platelets were calculated along flow trajectories. The numerical simulations indicated distinct design-dependent differences between the two valves. The stress-loading waveforms extracted from the numerical simulations were programmed into a hemodynamic shearing device (HSD), emulating the flow conditions past the valves in distinct ‘hot spot’ flow regions that are implicated in MHV thrombogenicity. The resultant platelet activity was measured with a modified prothrombinase assay, and was found to be significantly higher in the SJM valve, mostly during the regurgitation phase. The experimental results were in excellent agreement with the calculated platelet activation potential. This establishes the utility of the DTE methodology as a test bed for evaluating design modifications aimed at better thrombogenic performance for such devices. PMID:20483411
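
    The per-trajectory loading histories above can be illustrated with a linear stress-accumulation measure, a common scalar surrogate for platelet activation potential. The waveform numbers below are invented; the DTE work programs the full numerically derived waveforms into the shearing device rather than a single scalar:

```python
def stress_accumulation(history):
    """Linear stress accumulation along one platelet trajectory:
    SA = sum(tau_i * dt_i), a simple scalar surrogate for platelet
    activation potential."""
    return sum(tau * dt for tau, dt in history)

# Invented (shear stress in Pa, exposure time in s) waveforms for two
# hypothetical 'hot spot' flow regions:
forward_phase = [(2.0, 0.010), (5.0, 0.004), (1.5, 0.020)]
regurgitation = [(40.0, 0.002), (25.0, 0.003), (10.0, 0.005)]

sa_forward = stress_accumulation(forward_phase)
sa_regurg = stress_accumulation(regurgitation)
```

    Even with shorter exposure times, the high-shear regurgitation waveform accumulates more stress, mirroring the abstract's finding that most platelet activation occurred during the regurgitation phase.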

  14. A methodology towards virtualisation-based high performance simulation platform supporting multidisciplinary design of complex products

    NASA Astrophysics Data System (ADS)

    Ren, Lei; Zhang, Lin; Tao, Fei; (Luke) Zhang, Xiaolong; Luo, Yongliang; Zhang, Yabin

    2012-08-01

    Multidisciplinary design of complex products leads to an increasing demand for high performance simulation (HPS) platforms. One great challenge is how to achieve highly efficient utilisation of large-scale simulation resources in distributed and heterogeneous environments. This article reports a virtualisation-based methodology to realise an HPS platform. The research is driven by issues concerning large-scale simulation resource deployment and complex simulation environment construction, efficient and transparent utilisation of fine-grained simulation resources, and highly reliable simulation with fault tolerance. A framework of a virtualisation-based simulation platform (VSIM) is first proposed. The article then investigates and discusses key approaches in VSIM, including simulation resource modelling, a method for automatically deploying simulation resources for dynamic construction of the system environment, and a live migration mechanism in case of faults in run-time simulation. Furthermore, the proposed methodology is applied to a multidisciplinary design system for aircraft virtual prototyping and some experiments are conducted. The experimental results show that the proposed methodology can (1) significantly improve the utilisation of fine-grained simulation resources, (2) result in a great reduction in deployment time and an increased flexibility for simulation environment construction, and (3) achieve fault-tolerant simulation.

  15. Design Methodology for Magnetic Field-Based Soft Tri-Axis Tactile Sensors.

    PubMed

    Wang, Hongbo; de Boer, Greg; Kow, Junwai; Alazmani, Ali; Ghajari, Mazdak; Hewson, Robert; Culmer, Peter

    2016-01-01

    Tactile sensors are essential if robots are to safely interact with the external world and to dexterously manipulate objects. Current tactile sensors have limitations restricting their use, notably being too fragile or having limited performance. Magnetic field-based soft tactile sensors offer a potential improvement, being durable, low cost, accurate and high bandwidth, but they are relatively undeveloped because of the complexities involved in design and calibration. This paper presents a general design methodology for magnetic field-based three-axis soft tactile sensors, enabling researchers to easily develop specific tactile sensors for a variety of applications. All aspects (design, fabrication, calibration and evaluation) of the development of tri-axis soft tactile sensors are presented and discussed. A moving least square approach is used to decouple and convert the magnetic field signal to force output to eliminate non-linearity and cross-talk effects. A case study of a tactile sensor prototype, MagOne, was developed. This achieved a resolution of 1.42 mN in normal force measurement (0.71 mN in shear force), good output repeatability and has a maximum hysteresis error of 3.4%. These results outperform comparable sensors reported previously, highlighting the efficacy of our methodology for sensor design. PMID:27563908

  16. Pushover Analysis Methodologies: A Tool For Limited Damage Based Design Of Structure For Seismic Vibration

    NASA Astrophysics Data System (ADS)

    Dutta, Sekhar Chandra; Chakroborty, Suvonkar; Raychaudhuri, Anusrita

    Vibration transmitted to a structure during an earthquake may vary in magnitude over a wide range. Design methodology should, therefore, enumerate steps so that structures are able to survive even severe ground motion. For economic reasons, however, strength can be provided to structures in such a way that the structure remains in the elastic range during low-to-moderate earthquakes and is allowed to undergo inelastic deformation, without collapse, in severe earthquakes. To implement this design philosophy, a rigorous nonlinear dynamic analysis must be performed to estimate the inelastic demands; such analysis is time-consuming and requires expertise to judge its results. In this context, the present paper discusses and demonstrates an alternative simple method, known as the pushover method, which can be easily used by practicing engineers, bypassing intricate nonlinear dynamic analysis, and can be thought of as a substitute for the latter. This method is still being developed and is increasingly becoming popular for its simplicity. The objective of this paper is to emphasize and demonstrate the basic concept, strength, and ease of this state-of-the-art methodology for regular use in design offices for performance-based seismic design of structures.

  17. Low-Radiation Cellular Inductive Powering of Rodent Wireless Brain Interfaces: Methodology and Design Guide.

    PubMed

    Soltani, Nima; Aliroteh, Miaad S; Salam, M Tariqus; Perez Velazquez, Jose Luis; Genov, Roman

    2016-08-01

    This paper presents a general methodology of inductive power delivery in wireless chronic rodent electrophysiology applications. The focus is on the design of such systems under the following key constraints: maximum power delivery under the allowable specific absorption rate (SAR), low cost, and spatial scalability. The methodology covers inductive coil design considerations within a low-frequency ferrite-core-free power transfer link, which includes a scalable coil-array power transmitter floor and a single-coil implanted or worn power receiver. A specific design example is presented that includes the concept of low-SAR cellular single-transmitter-coil powering through dynamic tracking of a magnet-less receiver's spatial location. The transmitter coil instantaneous supply current is monitored using a small number of low-cost electronic components; a drop in its value indicates the proximity of the receiver due to the reflected impedance of the latter, and only the transmitter coil nearest the receiver is activated. Operating at the low frequency of 1.5 MHz, the inductive powering floor delivers a maximum of 15.9 W below the IEEE C95 SAR limit, which is over three times greater than in other recently reported designs. A power transfer efficiency of 39% and 13% is maintained at the nominal and maximum distances of 8 cm and 11 cm, respectively. PMID:26960227
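
    The proximity-detection logic described (activate only the coil whose supply current drops due to the receiver's reflected impedance) can be sketched as a simple threshold rule. The baseline currents, readings, and threshold value are illustrative assumptions, not figures from the paper:

```python
def select_active_coil(baseline, readings, threshold=0.05):
    """Pick the transmitter coil with the largest fractional supply-current
    drop relative to its no-receiver baseline; the reflected impedance of a
    nearby receiver lowers that coil's current. Returns None when no coil
    drops past the threshold (receiver absent or out of range)."""
    drops = [(base - now) / base for base, now in zip(baseline, readings)]
    best = max(range(len(drops)), key=lambda i: drops[i])
    return best if drops[best] >= threshold else None

# Invented per-coil currents in mA: coil 1 sees a ~7.6% drop.
baseline = [120.0, 118.0, 121.0, 119.0]
readings = [119.5, 109.0, 120.8, 118.9]
active = select_active_coil(baseline, readings)
```

    Re-evaluating this rule periodically as the animal moves yields the dynamic, magnet-less receiver tracking the abstract describes.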

  18. Study design, methodology and statistical analyses in the clinical development of sparfloxacin.

    PubMed

    Genevois, E; Lelouer, V; Vercken, J B; Caillon, R

    1996-05-01

    Many publications in the past 10 years have emphasised the difficulties of evaluating anti-infective drugs and the need for well-designed clinical trials in this therapeutic field. The clinical development of sparfloxacin in Europe, involving more than 4000 patients in ten countries, provided the opportunity to implement a methodology for evaluation and statistical analyses which would take into account actual requirements and past insufficiencies. This methodology focused on a rigorous and accurate patient classification for evaluability, subgroups of particular interest, efficacy assessment based on automation (algorithm) and individual case review by expert panel committees. In addition, the statistical analyses did not use significance testing but rather confidence intervals to determine whether sparfloxacin was therapeutically equivalent to the reference comparator antibacterial agents. PMID:8737126
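
    The confidence-interval approach to therapeutic equivalence can be sketched for a two-arm cure-rate comparison. The equivalence margin, z-value, and trial numbers below are invented for illustration and are not from the sparfloxacin programme:

```python
import math

def equivalence_by_ci(cured1, n1, cured2, n2, margin=0.10, z=1.96):
    """Equivalence via confidence intervals rather than significance
    testing: conclude therapeutic equivalence when the 95% CI for the
    difference in cure rates lies entirely within +/- margin."""
    p1, p2 = cured1 / n1, cured2 / n2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    diff = p1 - p2
    lo, hi = diff - z * se, diff + z * se
    return (lo, hi), (-margin < lo and hi < margin)

# Invented example: 370/400 cured on the test drug vs 365/400 on the comparator.
(ci_lo, ci_hi), equivalent = equivalence_by_ci(370, 400, 365, 400)
```

    Unlike a non-significant p-value, a CI wholly inside the margin actively supports equivalence rather than merely failing to detect a difference.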

  19. Design methodology: edgeless 3D ASICs with complex in-pixel processing for pixel detectors

    NASA Astrophysics Data System (ADS)

    Fahim, Farah; Deptuch, Grzegorz W.; Hoff, James R.; Mohseni, Hooman

    2015-08-01

    The design methodology for the development of 3D integrated edgeless pixel detectors with in-pixel processing using Electronic Design Automation (EDA) tools is presented. A large-area, 3-tier 3D detector with one sensor layer and two ASIC layers, containing one analog and one digital tier, is built for x-ray photon time-of-arrival measurement and imaging. A full custom analog pixel is 65 μm x 65 μm. It is connected to a sensor pixel of the same size on one side, and on the other side it has approximately 40 connections to the digital pixel. A 32 x 32 edgeless array without any peripheral functional blocks constitutes a sub-chip. The sub-chip is an indivisible unit, which is further arranged in a 6 x 6 array to create the entire 1.248 cm x 1.248 cm ASIC. Each chip has 720 bump-bond I/O connections on the back of the digital tier to the ceramic PCB. All the analog tier power and biasing is conveyed through the digital tier from the PCB. The assembly has no peripheral functional blocks, so the active area extends to the edge of the detector. This was achieved by using a few flavors of almost identical analog pixels (with minimal variation in layout) to allow peripheral biasing blocks to be placed within pixels. The 1024 pixels within a digital sub-chip array have a variety of full custom, semi-custom and automated timing-driven functional blocks placed together. The methodology uses a modified mixed-mode on-top digital implementation flow to not only harness the tool efficiency for timing and floor-planning but also to maintain designer control over compact, parasitically aware layout. The methodology uses the Cadence design platform, but is not limited to this tool.

  20. Design methodology: edgeless 3D ASICs with complex in-pixel processing for pixel detectors

    SciTech Connect

    Fahim, Farah; Deptuch, Grzegorz W.; Hoff, James R.; Mohseni, Hooman

    2015-08-28

    The design methodology for the development of 3D integrated edgeless pixel detectors with in-pixel processing using Electronic Design Automation (EDA) tools is presented. A large-area, 3-tier 3D detector with one sensor layer and two ASIC layers, containing one analog and one digital tier, is built for x-ray photon time-of-arrival measurement and imaging. A full custom analog pixel is 65 μm x 65 μm. It is connected to a sensor pixel of the same size on one side, and on the other side it has approximately 40 connections to the digital pixel. A 32 x 32 edgeless array without any peripheral functional blocks constitutes a sub-chip. The sub-chip is an indivisible unit, which is further arranged in a 6 x 6 array to create the entire 1.248 cm x 1.248 cm ASIC. Each chip has 720 bump-bond I/O connections on the back of the digital tier to the ceramic PCB. All the analog tier power and biasing is conveyed through the digital tier from the PCB. The assembly has no peripheral functional blocks, so the active area extends to the edge of the detector. This was achieved by using a few flavors of almost identical analog pixels (with minimal variation in layout) to allow peripheral biasing blocks to be placed within pixels. The 1024 pixels within a digital sub-chip array have a variety of full custom, semi-custom and automated timing-driven functional blocks placed together. The methodology uses a modified mixed-mode on-top digital implementation flow to not only harness the tool efficiency for timing and floor-planning but also to maintain designer control over compact, parasitically aware layout. The methodology uses the Cadence design platform, but is not limited to this tool.

  1. Pharmacological and methodological aspects of the separation-induced vocalization test in guinea pig pups; a systematic review and meta-analysis.

    PubMed

    Groenink, Lucianne; Verdouw, P Monika; Bakker, Brenda; Wever, Kimberley E

    2015-04-15

    The separation-induced vocalization test in guinea pig pups is one of many tests that have been used to screen for anxiolytic-like properties of drugs. The test is based on the cross-species phenomenon that infants emit distress calls when placed in social isolation. Here we report a systematic review and meta-analysis of pharmacological intervention in the separation-induced vocalization test in guinea pig pups. Electronic databases were searched for original research articles, yielding 32 studies that met the inclusion criteria. We extracted data on pharmacological intervention, animal and methodological characteristics, and study quality indicators. Meta-analysis showed that the different drug classes in clinical use for the treatment of anxiety disorders have comparable effects on vocalization behaviour, irrespective of their mechanism of action. Of the experimental drugs, nociceptin (NOP) receptor agonists proved very effective in this test. Analysis further indicated that the commonly used read-outs, total number and total duration of vocalizations, are equally valid. With regard to methodological characteristics, repeated testing of pups as well as selecting pups with moderate or high levels of vocalization were associated with larger treatment effects. Finally, reporting of study methodology, randomization and blinding was poor, and Egger's test for small-study effects showed that publication bias likely occurred. This review illustrates the value of systematic reviews and meta-analyses in improving the translational value and methodological aspects of animal models. It further shows the urgent need to implement existing publication guidelines to maximize the output and impact of experimental animal studies. PMID:25460027
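
    The small-study analysis mentioned above (Egger's test) regresses the standardized effect against precision and checks whether the intercept differs from zero. A minimal sketch follows; the effect sizes and standard errors are hypothetical, purely to illustrate the form of the test:

```python
import numpy as np
from scipy import stats

# Hypothetical effect sizes (standardized mean differences) and their
# standard errors for a set of studies; values are illustrative only.
effects = np.array([0.8, 0.6, 1.1, 0.4, 0.9, 1.4, 0.3, 1.2])
se = np.array([0.20, 0.25, 0.35, 0.15, 0.30, 0.45, 0.18, 0.40])

# Egger's regression: standardized effect (effect / SE) on precision
# (1 / SE); an intercept significantly different from zero suggests
# small-study effects such as publication bias.
z = effects / se
precision = 1.0 / se
slope, intercept, r, p_slope, stderr = stats.linregress(precision, z)

# Two-sided p-value for the intercept via the usual OLS formulas.
n = len(z)
resid = z - (slope * precision + intercept)
s2 = np.sum(resid**2) / (n - 2)
se_intercept = np.sqrt(s2 * (1.0 / n + precision.mean()**2 /
                             np.sum((precision - precision.mean())**2)))
t_stat = intercept / se_intercept
p_intercept = 2 * stats.t.sf(abs(t_stat), df=n - 2)
print(f"intercept = {intercept:.3f}, p = {p_intercept:.3f}")
```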

  2. Methodology to design a municipal solid waste generation and composition map: a case study.

    PubMed

    Gallardo, A; Carlos, M; Peris, M; Colomer, F J

    2014-11-01

    Municipal solid waste (MSW) management is an important task that local governments as well as private companies must undertake to protect human health and the environment and to preserve natural resources. To design an adequate MSW management plan, the first step is to define the waste generation and composition patterns of the town. As these patterns depend on several socio-economic factors, it is advisable to compile those factors beforehand. Moreover, the waste generation and composition patterns may vary across the town and over time. Generally, the data are not homogeneous across the city, as neither the number of inhabitants nor the economic activity is constant. Therefore, if all the information is shown in thematic maps, the final waste management decisions can be made more efficiently. The main aim of this paper is to present a structured methodology that allows local authorities or private companies who deal with MSW to design their own MSW management plan depending on the available data. According to these data, this paper proposes two ways of action: a direct way when detailed data are available and an indirect way when there is a lack of data and it is necessary to rely on bibliographic data. In any case, the amount of information needed is considerable. This paper combines the planning methodology with Geographic Information Systems to present the final results in thematic maps that are easier to interpret. The proposed methodology is a useful preliminary tool for organizing MSW collection routes, including selective collection. To verify the methodology, it has been successfully applied to a Spanish town. PMID:25008298
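
    The "direct way" of the methodology, where detailed per-district data are available, amounts to combining each district's population with a measured per-capita generation rate before joining the results to a GIS layer. A minimal sketch, with hypothetical district names and figures:

```python
# Illustrative sketch of per-district MSW generation (hypothetical
# figures): the quantities computed here are what would be joined to
# a GIS layer to produce the thematic generation map.
districts = {
    # name: (inhabitants, per-capita generation in kg/person/day)
    "Centre":   (12000, 1.35),
    "Harbour":  ( 8500, 1.10),
    "Suburb-N": (20000, 0.95),
}

for name, (pop, rate) in districts.items():
    tonnes_per_year = pop * rate * 365 / 1000.0
    print(f"{name}: {tonnes_per_year:.0f} t/year")
```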

  3. Methodology to design a municipal solid waste generation and composition map: a case study.

    PubMed

    Gallardo, A; Carlos, M; Peris, M; Colomer, F J

    2015-02-01

    Municipal solid waste (MSW) management is an important task that local governments as well as private companies must undertake to protect human health and the environment and to preserve natural resources. To design an adequate MSW management plan, the first step is to define the waste generation and composition patterns of the town. As these patterns depend on several socio-economic factors, it is advisable to compile those factors beforehand. Moreover, the waste generation and composition patterns may vary across the town and over time. Generally, the data are not homogeneous across the city, as neither the number of inhabitants nor the economic activity is constant. Therefore, if all the information is shown in thematic maps, the final waste management decisions can be made more efficiently. The main aim of this paper is to present a structured methodology that allows local authorities or private companies who deal with MSW to design their own MSW management plan depending on the available data. According to these data, this paper proposes two ways of action: a direct way when detailed data are available and an indirect way when there is a lack of data and it is necessary to rely on bibliographic data. In any case, the amount of information needed is considerable. This paper combines the planning methodology with Geographic Information Systems to present the final results in thematic maps that are easier to interpret. The proposed methodology is a useful preliminary tool for organizing MSW collection routes, including selective collection. To verify the methodology, it has been successfully applied to a Spanish town. PMID:25443095

  4. Multirate Flutter Suppression System Design for the Benchmark Active Controls Technology Wing. Part 2; Methodology Application Software Toolbox

    NASA Technical Reports Server (NTRS)

    Mason, Gregory S.; Berg, Martin C.; Mukhopadhyay, Vivek

    2002-01-01

    To study the effectiveness of various control system design methodologies, the NASA Langley Research Center initiated the Benchmark Active Controls Project. In this project, the various methodologies were applied to design a flutter suppression system for the Benchmark Active Controls Technology (BACT) Wing. This report describes the user's manual and software toolbox developed at the University of Washington to design a multirate flutter suppression control law for the BACT wing.

  5. Inductive Powering of Subcutaneous Stimulators: Key Parameters and Their Impact on the Design Methodology

    PubMed Central

    Godfraind, Carmen; Debelle, Adrien; Lonys, Laurent; Acuña, Vicente; Doguet, Pascal; Nonclercq, Antoine

    2016-01-01

    Inductive powering of implantable medical devices involves numerous factors that act on system efficiency and safety in adversarial ways. This paper clarifies their roles and identifies a procedure enabling the system design. The procedure decouples the problem into four principal steps: the frequency choice, the magnetic link optimization, and the secondary and then primary circuit designs. The methodology was tested on the powering system of a device requiring a power of 300 mW and implanted at a distance of 15 to 30 mm from the outside power source. It allowed the identification of the most critical parameters. A satisfying efficiency of 34% was reached at 21 mm, which tends to validate the proposed design procedure. PMID:27478572
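
    The magnetic-link optimization step rests on the standard maximum-efficiency expression for a resonant inductive link with an optimally loaded secondary, eta_max = k^2*Q1*Q2 / (1 + sqrt(1 + k^2*Q1*Q2))^2. A minimal sketch, with coupling coefficients and coil quality factors chosen purely for illustration (not the paper's values):

```python
import math

def max_link_efficiency(k, q1, q2):
    """Maximum achievable efficiency of a resonant inductive link with
    coupling coefficient k and coil quality factors q1, q2 (standard
    result for an optimally loaded secondary)."""
    x = k * k * q1 * q2
    return x / (1.0 + math.sqrt(1.0 + x)) ** 2

# Hypothetical values for a subcutaneous link: weak coupling across
# tissue, moderately good coils.
for k in (0.02, 0.05, 0.10):
    eta = max_link_efficiency(k, q1=150, q2=80)
    print(f"k = {k:.2f}: eta_max = {eta:.1%}")
```

The steep dependence on k is why the implant depth (15 to 30 mm) dominates the achievable efficiency.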

  6. Design Methodology: ASICs with complex in-pixel processing for Pixel Detectors

    SciTech Connect

    Fahim, Farah

    2014-10-31

    The development of Application Specific Integrated Circuits (ASIC) for pixel detectors with complex in-pixel processing using Computer Aided Design (CAD) tools that are, themselves, mainly developed for the design of conventional digital circuits requires a specialized approach. Mixed signal pixels often require parasitically aware detailed analog front-ends and extremely compact digital back-ends with more than 1000 transistors in small areas below 100μm x 100μm. These pixels are tiled to create large arrays, which have the same clock distribution and data readout speed constraints as in, for example, micro-processors. The methodology uses a modified mixed-mode on-top digital implementation flow to not only harness the tool efficiency for timing and floor-planning but also to maintain designer control over compact parasitically aware layout.

  7. A Digital Methodology for the Design Process of Aerospace Assemblies with Sustainable Composite Processes & Manufacture

    NASA Astrophysics Data System (ADS)

    McEwan, W.; Butterfield, J.

    2011-05-01

    The well established benefits of composite materials are driving a significant shift in design and manufacture strategies for original equipment manufacturers (OEMs). Thermoplastic composites have advantages over the traditional thermosetting materials with regards to sustainability and environmental impact, features which are becoming increasingly pertinent in the aerospace arena. However, when sustainability and environmental impact are considered as design drivers, integrated methods for part design and product development must be developed so that any benefits of sustainable composite material systems can be assessed during the design process. These methods must include mechanisms to account for process induced part variation and techniques related to re-forming, recycling and decommissioning, which are in their infancy. It is proposed in this paper that predictive techniques related to material specification, part processing and product cost of thermoplastic composite components, be integrated within a Through Life Management (TLM) product development methodology as part of a larger strategy of product system modeling to improve disciplinary concurrency, realistic part performance, and to place sustainability at the heart of the design process. This paper reports the enhancement of digital manufacturing tools as a means of drawing simulated part manufacturing scenarios, real time costing mechanisms, and broader lifecycle performance data capture into the design cycle. The work demonstrates predictive processes for sustainable composite product manufacture and how a Product-Process-Resource (PPR) structure can be customised and enhanced to include design intent driven by `Real' part geometry and consequent assembly.

  8. A systematic design approach for two planetary gear split hybrid vehicles

    NASA Astrophysics Data System (ADS)

    Liu, Jinming; Peng, Huei

    2010-11-01

    Multiple power sources in a hybrid vehicle allow for flexible vehicle power-train operations, but also impose kinematic constraints due to component characteristics. This paper presents a design process that enables a systematic search and screening through all three major dimensions of hybrid vehicle design - system configuration, component sizing and control - to achieve optimal performance while satisfying the imposed constraints. An automated dynamic modelling method is first developed to enable hybrid vehicle models to be constructed efficiently. A screening process then narrows the search down to configurations that satisfy drivability and operation constraints. Finally, a design and control optimisation strategy is carried out to obtain the best execution of each configuration. A case study for the design of a power-split hybrid vehicle with optimal fuel economy is used to demonstrate this overall hybrid vehicle design process.
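
    The three-stage search described above - enumerate configurations, screen by drivability, then optimize each survivor - can be sketched as follows. The configuration list, the screening constraint, and the fuel-economy model are hypothetical placeholders, not taken from the paper:

```python
from itertools import product

# Placeholder set of candidate power-train configurations.
configurations = [f"config-{i}" for i in range(12)]

def meets_drivability(cfg, motor_kw, battery_kwh):
    # Placeholder screen: demand a minimum motor power rating.
    return motor_kw >= 60

def fuel_economy(cfg, motor_kw, battery_kwh):
    # Placeholder objective: economy falls off with over/under-sizing.
    return 55 - 0.05 * abs(motor_kw - 80) - 0.3 * abs(battery_kwh - 6)

best = None
for cfg, motor_kw, battery_kwh in product(
        configurations, range(40, 121, 20), range(2, 11, 2)):
    if not meets_drivability(cfg, motor_kw, battery_kwh):
        continue  # screened out before the costly optimization stage
    mpg = fuel_economy(cfg, motor_kw, battery_kwh)
    if best is None or mpg > best[0]:
        best = (mpg, cfg, motor_kw, battery_kwh)

print("best:", best)
```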

  9. A system-of-systems modeling methodology for strategic general aviation design decision-making

    NASA Astrophysics Data System (ADS)

    Won, Henry Thome

    General aviation has long been studied as a means of providing an on-demand "personal air vehicle" that bypasses the traffic at major commercial hubs. This thesis continues this research through development of a system of systems modeling methodology applicable to the selection of synergistic product concepts, market segments, and business models. From the perspective of the conceptual design engineer, the design and selection of future general aviation aircraft is complicated by the definition of constraints and requirements, and the tradeoffs among performance and cost aspects. Qualitative problem definition methods have been utilized, although their accuracy in determining specific requirement and metric values is uncertain. In industry, customers are surveyed, and business plans are created through a lengthy, iterative process. In recent years, techniques have developed for predicting the characteristics of US travel demand based on travel mode attributes, such as door-to-door time and ticket price. As of yet, these models treat the contributing systems---aircraft manufacturers and service providers---as independently variable assumptions. In this research, a methodology is developed which seeks to build a strategic design decision making environment through the construction of a system of systems model. The demonstrated implementation brings together models of the aircraft and manufacturer, the service provider, and most importantly the travel demand. Thus represented is the behavior of the consumers and the reactive behavior of the suppliers---the manufacturers and transportation service providers---in a common modeling framework. The results indicate an ability to guide the design process---specifically the selection of design requirements---through the optimization of "capability" metrics. 
Additionally, the results indicate the ability to find synergistic solutions, that is, solutions in which two systems might collaborate to achieve a better result than either acting alone.

  10. Application of experimental design methodology in development and optimization of drug release method.

    PubMed

    Kincl, M; Turk, S; Vrecer, F

    2005-03-01

    The aim of our research was to apply experimental design methodology in the development and optimization of drug release methods. Diclofenac sodium (2-[(2,6-dichlorophenyl)amino]benzeneacetic acid monosodium salt) was selected as a model drug, and Naklofen retard prolonged release tablets, containing 100 mg of diclofenac sodium, were chosen as a model prolonged release system. On the basis of previous results, a three-level three-factor Box-Behnken experimental design was used to characterize and optimize three physicochemical parameters affecting the release of diclofenac sodium from the tablets: rotation speed of the stirring element, pH, and ionic strength of the dissolution medium. The chosen dependent variables (responses) were the cumulative percentages of dissolved diclofenac sodium at 2, 6, 12 and 24 h. For the estimation of the coefficients in the approximating polynomial function, the least-squares regression method was applied. Afterwards, the reliability of the model was verified using analysis of variance (ANOVA). The significance of the model factors was estimated by Student's t-test. Canonical analysis was applied to investigate the shape of the predicted response surfaces and to optimize the model. Our study proved that experimental design methodology can be efficiently applied for the characterization and optimization of analytical parameters affecting drug release, and that it is an economical way of obtaining the maximum amount of information in a short period of time and with the fewest experiments. PMID:15707730
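
    The fitting step above - a full second-order polynomial estimated by least squares over a three-factor Box-Behnken design - can be sketched in a few lines. The responses here are simulated from a known quadratic (an assumption for illustration) so the recovered coefficients can be checked; in practice they would be measured dissolution values:

```python
import numpy as np

# Three-factor Box-Behnken design in coded units (-1, 0, +1):
# 12 edge midpoints plus a center point.
bb = np.array([
    [-1, -1,  0], [ 1, -1,  0], [-1,  1,  0], [ 1,  1,  0],
    [-1,  0, -1], [ 1,  0, -1], [-1,  0,  1], [ 1,  0,  1],
    [ 0, -1, -1], [ 0,  1, -1], [ 0, -1,  1], [ 0,  1,  1],
    [ 0,  0,  0],
], dtype=float)

# Simulated responses (e.g. % drug dissolved at 12 h) from a known
# quadratic, so the fit is verifiable.
x1, x2, x3 = bb.T
y = 60 + 5*x1 + 3*x2 - 2*x3 - 4*x1**2 + 1.5*x1*x2

# Full second-order model: intercept, linear, interaction, square terms.
X = np.column_stack([np.ones(len(bb)), x1, x2, x3,
                     x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted coefficients:", np.round(coef, 3))
```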

  11. LOX/hydrocarbon rocket engine analytical design methodology development and validation. Volume 2: Appendices

    NASA Technical Reports Server (NTRS)

    Niiya, Karen E.; Walker, Richard E.; Pieper, Jerry L.; Nguyen, Thong V.

    1993-01-01

    This final report includes a discussion of the work accomplished during the period from Dec. 1988 through Nov. 1991. The objective of the program was to assemble existing performance and combustion stability models into a usable design methodology capable of designing and analyzing high-performance and stable LOX/hydrocarbon booster engines. The methodology was then used to design a validation engine. The capabilities and validity of the methodology were demonstrated using this engine in an extensive hot fire test program. The engine used LOX/RP-1 propellants and was tested over a range of mixture ratios, chamber pressures, and acoustic damping device configurations. This volume contains time domain and frequency domain stability plots which indicate the pressure perturbation amplitudes and frequencies from approximately 30 tests of a 50K thrust rocket engine using LOX/RP-1 propellants over a range of chamber pressures from 240 to 1750 psia with mixture ratios of from 1.2 to 7.5. The data is from test configurations which used both bitune and monotune acoustic cavities and from tests with no acoustic cavities. The engine had a length of 14 inches and a contraction ratio of 2.0 using a 7.68 inch diameter injector. The data was taken from both stable and unstable tests. All combustion instabilities were spontaneous in the first tangential mode. Although stability bombs were used and generated overpressures of approximately 20 percent, no tests were driven unstable by the bombs. The stability instrumentation included six high-frequency Kistler transducers in the combustion chamber, a high-frequency Kistler transducer in each propellant manifold, and tri-axial accelerometers. Performance data is presented, both characteristic velocity efficiencies and energy release efficiencies, for those tests of sufficient duration to record steady state values.
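
    The characteristic velocity efficiency reported above is the ratio of measured to theoretical c*, with c* = Pc * At / mdot. A minimal sketch with hypothetical numbers (not the test data from the report):

```python
import math

def c_star(pc_pa, throat_area_m2, mdot_kg_s):
    """Characteristic velocity from chamber pressure, throat area,
    and total propellant mass flow: c* = Pc * At / mdot."""
    return pc_pa * throat_area_m2 / mdot_kg_s

# Hypothetical operating point, roughly in the report's pressure range.
pc = 6.89e6                 # chamber pressure, Pa (~1000 psia)
at = math.pi * 0.06 ** 2    # throat area for a 12 cm diameter throat, m^2
mdot = 45.0                 # total propellant mass flow, kg/s

measured = c_star(pc, at, mdot)
theoretical = 1800.0        # m/s, from an ideal-combustion code (assumed)
print(f"c* = {measured:.0f} m/s, efficiency = {measured / theoretical:.1%}")
```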

  12. Control of multiterminal HVDC systems embedded in AC networks. Volume 1. Methodologies for control system design

    NASA Astrophysics Data System (ADS)

    Hauth, R. L.; Nozari, F.; Winkelman, J. R.; Athans, M.; Chan, S. M.

    1982-05-01

    Control concepts applicable to future multiterminal high voltage dc (MTDC) networks embedded in bulk power ac systems are discussed. The control's objectives are to enhance the steady state and/or dynamic performance of the integrated MTDC/ac power system. A multi-terminal HVdc system is one with more than two converter terminals. The three basic control levels of an MTDC system are: primary control, supplementary power modulation (damping) controls, and dispatch control. Techniques for use in all three levels of control are described. The application of modern control robustness theories to the MTDC power modulation control design methodology is discussed.

  13. Optimization of Electrospray Ionization by Statistical Design of Experiments and Response Surface Methodology: Protein-Ligand Equilibrium Dissociation Constant Determinations

    NASA Astrophysics Data System (ADS)

    Pedro, Liliana; Van Voorhis, Wesley C.; Quinn, Ronald J.

    2016-09-01

    Electrospray ionization mass spectrometry (ESI-MS) binding studies between proteins and ligands under native conditions require that instrumental ESI source conditions are optimized if relative solution-phase equilibrium concentrations between the protein-ligand complex and free protein are to be retained. Instrumental ESI source conditions that simultaneously maximize the relative ionization efficiency of the protein-ligand complex over free protein and minimize the protein-ligand complex dissociation during the ESI process and the transfer from atmospheric pressure to vacuum are generally specific for each protein-ligand system and should be established when an accurate equilibrium dissociation constant (KD) is to be determined via titration. In this paper, a straightforward and systematic approach for ESI source optimization is presented. The method uses statistical design of experiments (DOE) in conjunction with response surface methodology (RSM) and is demonstrated for the complexes between Plasmodium vivax guanylate kinase (PvGK) and two ligands: 5'-guanosine monophosphate (GMP) and 5'-guanosine diphosphate (GDP). It was verified that even though the ligands are structurally similar, the most appropriate ESI conditions for KD determination by titration are different for each.

  14. Optimization of Electrospray Ionization by Statistical Design of Experiments and Response Surface Methodology: Protein-Ligand Equilibrium Dissociation Constant Determinations.

    PubMed

    Pedro, Liliana; Van Voorhis, Wesley C; Quinn, Ronald J

    2016-09-01

    Electrospray ionization mass spectrometry (ESI-MS) binding studies between proteins and ligands under native conditions require that instrumental ESI source conditions are optimized if relative solution-phase equilibrium concentrations between the protein-ligand complex and free protein are to be retained. Instrumental ESI source conditions that simultaneously maximize the relative ionization efficiency of the protein-ligand complex over free protein and minimize the protein-ligand complex dissociation during the ESI process and the transfer from atmospheric pressure to vacuum are generally specific for each protein-ligand system and should be established when an accurate equilibrium dissociation constant (KD) is to be determined via titration. In this paper, a straightforward and systematic approach for ESI source optimization is presented. The method uses statistical design of experiments (DOE) in conjunction with response surface methodology (RSM) and is demonstrated for the complexes between Plasmodium vivax guanylate kinase (PvGK) and two ligands: 5'-guanosine monophosphate (GMP) and 5'-guanosine diphosphate (GDP). It was verified that even though the ligands are structurally similar, the most appropriate ESI conditions for KD determination by titration are different for each. PMID:27225419

  15. Optimization of Electrospray Ionization by Statistical Design of Experiments and Response Surface Methodology: Protein-Ligand Equilibrium Dissociation Constant Determinations

    NASA Astrophysics Data System (ADS)

    Pedro, Liliana; Van Voorhis, Wesley C.; Quinn, Ronald J.

    2016-05-01

    Electrospray ionization mass spectrometry (ESI-MS) binding studies between proteins and ligands under native conditions require that instrumental ESI source conditions are optimized if relative solution-phase equilibrium concentrations between the protein-ligand complex and free protein are to be retained. Instrumental ESI source conditions that simultaneously maximize the relative ionization efficiency of the protein-ligand complex over free protein and minimize the protein-ligand complex dissociation during the ESI process and the transfer from atmospheric pressure to vacuum are generally specific for each protein-ligand system and should be established when an accurate equilibrium dissociation constant (KD) is to be determined via titration. In this paper, a straightforward and systematic approach for ESI source optimization is presented. The method uses statistical design of experiments (DOE) in conjunction with response surface methodology (RSM) and is demonstrated for the complexes between Plasmodium vivax guanylate kinase (PvGK) and two ligands: 5'-guanosine monophosphate (GMP) and 5'-guanosine diphosphate (GDP). It was verified that even though the ligands are structurally similar, the most appropriate ESI conditions for KD determination by titration are different for each.
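
    Once the ESI source is optimized, the KD itself comes from fitting the 1:1 binding isotherm to the titration data. A sketch using simulated response ratios; the protein concentration, ligand series, true KD and noise level are assumptions for illustration only:

```python
import numpy as np
from scipy.optimize import curve_fit

P_TOT = 5.0  # total protein concentration, uM (hypothetical)

def fraction_bound(l_tot, kd):
    """Fraction of protein in complex for 1:1 binding at total ligand
    concentration l_tot and dissociation constant kd (same units)."""
    b = P_TOT + l_tot + kd
    pl = (b - np.sqrt(b * b - 4.0 * P_TOT * l_tot)) / 2.0
    return pl / P_TOT

# Simulated titration: response ratio R = I(PL)/(I(PL)+I(P)) at
# increasing ligand concentration, true Kd = 2 uM, small noise.
rng = np.random.default_rng(0)
l = np.array([0.5, 1, 2, 4, 8, 16, 32, 64])
r = fraction_bound(l, 2.0) + rng.normal(0, 0.01, l.size)

# Bound kd away from negative values so the square root stays real.
kd_fit, cov = curve_fit(fraction_bound, l, r, p0=[1.0],
                        bounds=(0, np.inf))
print(f"fitted Kd = {kd_fit[0]:.2f} uM")
```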

  16. New methodology of designing inexpensive hybrid control-acquisition systems for mechatronic constructions.

    PubMed

    Augustyn, Jacek

    2013-01-01

    This article presents a new methodology for designing a hybrid control and acquisition system consisting of a 32-bit SoC microsystem connected via a direct Universal Serial Bus (USB) link with a standard commercial off-the-shelf (COTS) component running the Android operating system. It is proposed to use this direct connection, avoiding an additional converter. An Android-based component was chosen to explore the potential for a mobile, compact and energy-efficient solution with easy-to-build user interfaces and easy wireless integration with other computer systems. This paper presents results of practical implementation and analysis of experimental real-time performance. It covers the closed control loop time between the sensor/actuator module and the Android operating system as well as the real-time sensor data stream within such a system. Some optimisations are proposed and their influence on real-time performance was investigated. The proposed methodology is intended for acquisition and control of mechatronic systems, especially mobile robots. It can be used in a wide range of control applications as well as embedded acquisition-recording devices, including energy quality measurements, smart grids and medicine. It is demonstrated that the proposed methodology can be employed without developing specific device drivers. The latency achieved was less than 0.5 ms and the sensor data stream throughput was on the order of 750 KB/s (compared to 3 ms latency and 300 KB/s in traditional solutions). PMID:24351633
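
    Closed-loop latency figures like the one above are typically obtained by timestamping a small request/echo exchange and collecting round-trip statistics. The sketch below substitutes a local TCP echo for the USB-attached sensor/actuator module (an assumption for illustration; the measurement pattern, not the transport, is the point):

```python
import socket
import statistics
import threading
import time

def echo_server(sock):
    # Accept one connection and echo everything back until close.
    conn, _ = sock.accept()
    with conn:
        while data := conn.recv(64):
            conn.sendall(data)

srv = socket.socket()
srv.bind(("127.0.0.1", 0))   # let the OS pick a free port
srv.listen(1)
threading.Thread(target=echo_server, args=(srv,), daemon=True).start()

cli = socket.socket()
cli.connect(srv.getsockname())
cli.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)

samples = []
for _ in range(200):
    t0 = time.perf_counter()
    cli.sendall(b"ping")
    cli.recv(64)             # wait for the echo
    samples.append((time.perf_counter() - t0) * 1000.0)  # ms

print(f"median round-trip: {statistics.median(samples):.3f} ms")
cli.close()
```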

  17. New Methodology of Designing Inexpensive Hybrid Control-Acquisition Systems for Mechatronic Constructions

    PubMed Central

    Augustyn, Jacek

    2013-01-01

    This article presents a new methodology for designing a hybrid control and acquisition system consisting of a 32-bit SoC microsystem connected via a direct Universal Serial Bus (USB) link with a standard commercial off-the-shelf (COTS) component running the Android operating system. It is proposed to use this direct connection, avoiding an additional converter. An Android-based component was chosen to explore the potential for a mobile, compact and energy-efficient solution with easy-to-build user interfaces and easy wireless integration with other computer systems. This paper presents results of practical implementation and analysis of experimental real-time performance. It covers the closed control loop time between the sensor/actuator module and the Android operating system as well as the real-time sensor data stream within such a system. Some optimisations are proposed and their influence on real-time performance was investigated. The proposed methodology is intended for acquisition and control of mechatronic systems, especially mobile robots. It can be used in a wide range of control applications as well as embedded acquisition-recording devices, including energy quality measurements, smart grids and medicine. It is demonstrated that the proposed methodology can be employed without developing specific device drivers. The latency achieved was less than 0.5 ms and the sensor data stream throughput was on the order of 750 KB/s (compared to 3 ms latency and 300 KB/s in traditional solutions). PMID:24351633

  18. A systematic investigation of large-scale diffractive coded aperture designs

    NASA Astrophysics Data System (ADS)

    Gottesman, Stephen R.; Shrekenhamer, Abraham; Isser, Abraham; Gigioli, George

    2012-10-01

    One obstacle to optimizing performance of large-scale coded aperture systems operating in the diffractive regime has been the lack of a robust, rapid, and efficient method for generating diffraction patterns that are projected by the system onto the focal plane. We report on the use of the 'Shrekenhamer Transform' for a systematic investigation of various types of coded aperture designs operating in the diffractive mode. Each design is evaluated in terms of its autocorrelation function for potential use in future imaging applications. The motivation of our study is to gain insight into more efficient optimization methods of image reconstruction algorithms.
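
    The autocorrelation evaluation of each candidate design can be computed efficiently via the Wiener-Khinchin relation, autocorr = IFFT(|FFT(mask)|^2). A minimal sketch; the random binary mask is a stand-in for an actual coded-aperture pattern:

```python
import numpy as np

# Random binary mask as a placeholder coded-aperture design.
rng = np.random.default_rng(1)
mask = rng.integers(0, 2, size=(64, 64)).astype(float)

# Circular autocorrelation via FFT (Wiener-Khinchin).
spectrum = np.fft.fft2(mask)
autocorr = np.fft.ifft2(np.abs(spectrum) ** 2).real

# The zero-lag peak equals the number of open elements; for a good
# imaging code the off-peak sidelobes should be flat.
peak = autocorr[0, 0]
sidelobe_mean = (autocorr.sum() - peak) / (autocorr.size - 1)
print(f"peak = {peak:.0f}, mean sidelobe = {sidelobe_mean:.1f}")
```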

  19. The Study of the Relationship between Probabilistic Design and Axiomatic Design Methodology. Volume 3

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chin-Yere; Onyebueke, Landon

    1996-01-01

    Structural failure is rarely a "sudden death" type of event; such sudden failures may occur only under abnormal loadings like bomb or gas explosions and very strong earthquakes. In most cases, structures fail due to damage accumulated under normal loadings such as wind loads and dead and live loads. The consequences of cumulative damage affect the reliability of the surviving components and may finally cause collapse of the system. The cumulative damage effects on system reliability under time-invariant loadings are of practical interest in structural design and are therefore investigated in this study. The scope of this study is, however, restricted to the consideration of damage accumulation as the increase in the number of failed components due to the violation of their strength limits.

  20. Effectiveness of interventions designed to prevent female genital mutilation/cutting: a systematic review.

    PubMed

    Berg, Rigmor C; Denison, Eva

    2012-06-01

    Female genital mutilation/cutting (FGM/C) is widely considered a human rights infringement, although communities that practice the tradition view it as an integral part of their culture. Given these vastly different views, the effectiveness of efforts to abandon FGM/C is uncertain. We conducted a systematic review of the best available evidence regarding evaluations of interventions to prevent FGM/C, including eight controlled before-and-after studies with 7,042 participants from Africa. Findings indicate that 19 of 49 outcomes (with baseline similarity) were significantly different at study level, mostly favoring the intervention, but results from four meta-analyses showed considerable heterogeneity. The limited effectiveness and weak overall quality of the evidence from the studies appear related to methodological limitations of the studies and shortcomings in the implementation of the interventions. Nevertheless, the findings point to possible advantageous developments from the interventions. PMID:23175952
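
    Random-effects pooling (DerSimonian-Laird) is the standard response to the considerable heterogeneity the meta-analyses above report. A minimal sketch; the effect sizes and variances are illustrative, not drawn from the review:

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate via DerSimonian-Laird."""
    w = 1.0 / variances                        # fixed-effect weights
    theta_fe = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - theta_fe) ** 2)  # Cochran's Q
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)              # between-study variance
    w_re = 1.0 / (variances + tau2)            # random-effects weights
    theta_re = np.sum(w_re * effects) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))
    return theta_re, se_re, tau2, q

# Hypothetical study-level effects and within-study variances.
effects = np.array([0.30, 0.55, 0.10, 0.80, 0.25])
variances = np.array([0.02, 0.05, 0.03, 0.08, 0.04])
theta, se, tau2, q = dersimonian_laird(effects, variances)
print(f"pooled effect = {theta:.3f} +/- {1.96 * se:.3f}, tau^2 = {tau2:.3f}")
```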

  1. Deformable Surface Accommodating Intraocular Lens: Second Generation Prototype Design Methodology and Testing

    PubMed Central

    McCafferty, Sean J.; Schwiegerling, Jim T.

    2015-01-01

    Purpose: To present an analysis methodology for developing and evaluating accommodating intraocular lenses incorporating a deformable interface. Methods: The next-generation design of an extruded-gel-interface intraocular lens is presented. A prototype based upon a similar, previously in vivo proven design was tested with measurements of actuation force, lens power, interface contour, optical transfer function, and visual Strehl ratio. Prototype-verified mathematical models were used to optimize optical and mechanical design parameters to maximize the image quality and minimize the force required to accommodate. Results: The prototype lens produced adequate image quality with the available physiologic accommodating force. The iterative mathematical modeling based upon the prototype yielded maximized optical and mechanical performance through the maximum allowable gel thickness to extrusion diameter ratio, the maximum feasible refractive index change at the interface, and minimum gel material properties in Poisson's ratio and Young's modulus. Conclusions: The design prototype performed well. It operated within the physiologic constraints of the human eye, including the force available for full accommodative amplitude using the eye's natural focusing feedback, while maintaining image quality in the space available. The parameters that optimized optical and mechanical performance were delineated as those which minimize both asphericity and actuation pressure. The design parameters outlined herein can be used as a template to maximize the performance of a deformable-interface intraocular lens. Translational Relevance: The article combines a multidisciplinary basic science approach from biomechanics, optical science, and ophthalmology to optimize an intraocular lens design suitable for preliminary animal trials. PMID:25938005
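
    The optical side of such an analysis rests on the single-surface power relation P = (n2 - n1) / R, which links interface curvature to lens power and shows why a larger refractive index change at the interface reduces the curvature change needed to accommodate. A minimal sketch; the index values and radii below are hypothetical, not the paper's:

```python
def surface_power_diopters(n1, n2, radius_m):
    """Refractive power of a single spherical interface between media
    of indices n1 and n2, P = (n2 - n1) / R, in diopters for R in m."""
    return (n2 - n1) / radius_m

n_gel, n_fluid = 1.48, 1.336  # hypothetical gel and aqueous indices

# Steepening the deformable interface raises the power contribution.
for r_mm in (20.0, 10.0, 6.7):
    p = surface_power_diopters(n_fluid, n_gel, r_mm / 1000.0)
    print(f"R = {r_mm:4.1f} mm -> {p:5.1f} D")
```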

  2. Proposal of a methodology for the design of offshore wind farms

    NASA Astrophysics Data System (ADS)

    Esteban, Dolores; Diez, J. Javier; Santos Lopez, J.; Negro, Vicente

    2010-05-01

    Wind power installed at sea is still very scarce, with only 1,500 megawatts in operation as of mid-2009. Although the first offshore wind farm experiment took place in 1990, the facilities built so far have been mainly pilot projects. These figures confirm the incipient state of offshore wind power. Nevertheless, the technology is currently being pushed strongly, especially by the governments of some countries, such as the United Kingdom and Germany, above all because of general commitments to reduce greenhouse gas emissions. All of these factors point to a promising future for offshore wind power. However, no general methodology has yet been established for the design and management of this kind of installation. This paper presents some results of a research project consisting of the elaboration of a methodology to optimize the global process of the operations leading to the implantation of offshore wind facilities. The proposed methodology allows offshore wind projects to be planned according to an integral management policy, achieving not only technical and financial feasibility of the offshore wind project, but also respect for the environment. To that end, multiple factors have been taken into account, including the territory, the terrain, the physical-chemical properties of the contact area between the atmosphere and the ocean, the dynamics arising in both as a consequence of the Earth's behaviour as a heat machine, external geodynamics, internal geodynamics, planetary dynamics, biocenosis, the legislative and financial framework, human activities, wind turbines, met masts, electric substations and lines, foundations, logistics, and the project's financial profitability. For its validation, the methodology has been applied to different offshore wind farms in operation.

  3. Systematic study of high-frequency ultrasonic transducer design for laser-scanning photoacoustic ophthalmoscopy

    PubMed Central

    Ma, Teng; Zhang, Xiangyang; Chiu, Chi Tat; Chen, Ruimin; Kirk Shung, K.; Zhou, Qifa; Jiao, Shuliang

    2014-01-01

    Photoacoustic ophthalmoscopy (PAOM) is a high-resolution in vivo imaging modality that is capable of providing specific optical absorption information for the retina. A high-frequency ultrasonic transducer is one of the key components in PAOM, which is in contact with the eyelid through coupling gel during imaging. The ultrasonic transducer plays a crucial role in determining the image quality affected by parameters such as spatial resolution, signal-to-noise ratio, and field of view. In this paper, we present the results of a systematic study on a high-frequency ultrasonic transducer design for PAOM. The design includes piezoelectric material selection, frequency selection, and the fabrication process. Transducers of various designs were successfully applied for capturing images of biological samples in vivo. The performances of these designs are compared and evaluated. PMID:24441942

  4. Systematic study of high-frequency ultrasonic transducer design for laser-scanning photoacoustic ophthalmoscopy

    NASA Astrophysics Data System (ADS)

    Ma, Teng; Zhang, Xiangyang; Chiu, Chi Tat; Chen, Ruimin; Kirk Shung, K.; Zhou, Qifa; Jiao, Shuliang

    2014-01-01

    Photoacoustic ophthalmoscopy (PAOM) is a high-resolution in vivo imaging modality that is capable of providing specific optical absorption information for the retina. A high-frequency ultrasonic transducer is one of the key components in PAOM, which is in contact with the eyelid through coupling gel during imaging. The ultrasonic transducer plays a crucial role in determining the image quality affected by parameters such as spatial resolution, signal-to-noise ratio, and field of view. In this paper, we present the results of a systematic study on a high-frequency ultrasonic transducer design for PAOM. The design includes piezoelectric material selection, frequency selection, and the fabrication process. Transducers of various designs were successfully applied for capturing images of biological samples in vivo. The performances of these designs are compared and evaluated.

  5. Human factors analysis and design methods for nuclear waste retrieval systems. Human factors design methodology and integration plan

    SciTech Connect

    Casey, S.M.

    1980-06-01

    The purpose of this document is to provide an overview of the recommended activities and methods to be employed by a team of human factors engineers during the development of a nuclear waste retrieval system. This system, as it is presently conceptualized, is intended to be used for the removal of storage canisters (each canister containing a spent fuel rod assembly) located in an underground salt bed depository. This document, and the others in this series, have been developed for the purpose of implementing human factors engineering principles during the design and construction of the retrieval system facilities and equipment. The methodology presented has been structured around a basic systems development effort involving preliminary development, equipment development, personnel subsystem development, and operational test and evaluation. Within each of these phases, the recommended activities of the human engineering team have been stated, along with descriptions of the human factors engineering design techniques applicable to the specific design issues. Explicit examples of how the techniques might be used in the analysis of human tasks and equipment required in the removal of spent fuel canisters have been provided. Only those techniques having possible relevance to the design of the waste retrieval system have been reviewed. This document is intended to provide the framework for integrating human engineering with the rest of the system development effort. The activities and methodologies reviewed in this document have been discussed in the general order in which they will occur, although the time frame (the total duration of the development program in years and months) in which they should be performed has not been discussed.

  6. Systematic review of effects of current transtibial prosthetic socket designs--Part 2: Quantitative outcomes.

    PubMed

    Safari, Mohammad Reza; Meier, Margrit Regula

    2015-01-01

    This review is an attempt to untangle the complexity of transtibial prosthetic socket fit and perhaps find some indication of whether a particular prosthetic socket type might be best for a given situation. In addition, we identified knowledge gaps, thus providing direction for possible future research. We followed the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines, using medical subject headings and standard key words to search for articles in relevant databases. No restrictions were made on study design and type of outcome measure used. From the obtained search results (n = 1,863), 35 articles were included. The relevant data were entered into a predefined data form that included the Downs and Black risk of bias assessment checklist. This article presents the results from the systematic review of the quantitative outcomes (n = 27 articles). Trends indicate that vacuum-assisted suction sockets improve gait symmetry, volume control, and residual limb health more than other socket designs. Hydrostatic sockets seem to produce more consistent socket fits, reducing a source of variability that greatly influences outcome measures. Knowledge gaps exist in the understanding of clinically meaningful changes in socket fit and their effect on biomechanical outcomes. Further, safe and comfortable pressure thresholds under various conditions should be determined through a systematic approach. PMID:26436733

  7. Piloted Evaluation of an Integrated Methodology for Propulsion and Airframe Control Design

    NASA Technical Reports Server (NTRS)

    Bright, Michelle M.; Simon, Donald L.; Garg, Sanjay; Mattern, Duane L.; Ranaudo, Richard J.; Odonoghue, Dennis P.

    1994-01-01

    An integrated methodology for propulsion and airframe control has been developed and evaluated for a Short Take-Off Vertical Landing (STOVL) aircraft using a fixed-base flight simulator at NASA Lewis Research Center. For this evaluation the flight simulator is configured for transition flight using a STOVL aircraft model, a full nonlinear turbofan engine model, simulated cockpit and displays, and pilot effectors. The paper provides a brief description of the simulation models, the flight simulation environment, the displays and symbology, the integrated control design, and the piloted tasks used for control design evaluation. In the simulation, the pilots successfully completed typical transition-phase tasks such as combined constant deceleration with flight path tracking, and constant acceleration wave-off maneuvers. Pilot comments on the integrated system performance and the display symbology are discussed and analyzed to identify potential areas of improvement.

  8. Application of Adjoint Methodology in Various Aspects of Sonic Boom Design

    NASA Technical Reports Server (NTRS)

    Rallabhandi, Sriram K.

    2014-01-01

    One of the advances in computational design has been the development of adjoint methods allowing efficient calculation of sensitivities in gradient-based shape optimization. This paper discusses two new applications of adjoint methodology that have been developed to aid in sonic boom mitigation exercises. In the first, equivalent area targets are generated using adjoint sensitivities of selected boom metrics. These targets may then be used to drive the vehicle shape during optimization. The second application is the computation of adjoint sensitivities of boom metrics on the ground with respect to parameters such as flight conditions, propagation sampling rate, and selected inputs to the propagation algorithms. These sensitivities enable the designer to make more informed selections of flight conditions at which the chosen cost functionals are less sensitive.

  9. Application of Adjoint Methodology to Supersonic Aircraft Design Using Reversed Equivalent Areas

    NASA Technical Reports Server (NTRS)

    Rallabhandi, Sriram K.

    2013-01-01

    This paper presents an approach to shaping an aircraft to equivalent-area-based objectives using the discrete adjoint approach. Equivalent areas can be obtained either by using the reversed augmented Burgers equation or by direct conversion of off-body pressures into equivalent area. Formal coupling with CFD allows computation of sensitivities of equivalent-area objectives with respect to aircraft shape parameters. The exactness of the adjoint sensitivities is verified against derivatives obtained using the complex-step approach. This methodology has the benefit of using designer-friendly equivalent areas in the shape design of low-boom aircraft. Shape optimization results with equivalent-area cost functionals are discussed and further refined using ground-loudness-based objectives.
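
    The complex-step verification mentioned in this abstract is easy to illustrate in isolation. A minimal sketch follows; the scalar functional f below is a made-up stand-in, not the paper's equivalent-area objective, and only the complex-step formula itself is the standard technique:

```python
import numpy as np

def f(x):
    # Hypothetical smooth cost functional standing in for an
    # equivalent-area objective; any differentiable scalar works.
    return np.sin(x) * np.exp(x)

def complex_step_derivative(func, x, h=1e-30):
    # Complex-step approximation: f'(x) ~= Im(f(x + i*h)) / h.
    # Unlike finite differences there is no subtractive cancellation,
    # so h can be made tiny and the result is accurate to machine
    # precision -- which is why it is a good check on adjoint gradients.
    return func(x + 1j * h).imag / h

x0 = 0.7
exact = (np.cos(x0) + np.sin(x0)) * np.exp(x0)   # analytic derivative
approx = complex_step_derivative(f, x0)
print(abs(approx - exact))  # agreement to machine precision
```

    In a design code, `exact` would be replaced by the adjoint-computed sensitivity, and the two values compared component by component.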

  10. A structural design methodology for large angle articulated trusses considering realistic joint modeling

    NASA Astrophysics Data System (ADS)

    Thorwald, Gregory; Mikulas, Martin M., Jr.

    1994-01-01

    A structural design methodology is developed by quantifying the degree to which large-angle articulations and realistic modeling considerations adversely affect a truss's structural stiffness. Batten actuators provide the ability for the truss both to deploy and to articulate. Such an articulated truss can be used in space crane applications. With geometry and modeling considerations identified and examined, strategies to alleviate the truss's stiffness reduction are developed and evaluated. Using these strategies, an improved articulated truss is then demonstrated. Having observed that the design strategies are effective for the planar truss models, similar 3-D truss models are then analyzed. The results show that the improvement strategies benefit both the 2-D and 3-D truss models.

  11. A hybrid design methodology for structuring an Integrated Environmental Management System (IEMS) for shipping business.

    PubMed

    Celik, Metin

    2009-03-01

    The International Safety Management (ISM) Code defines a broad framework for the safe management and operation of merchant ships, maintaining high standards of safety and environmental protection. On the other hand, ISO 14001:2004 provides a generic, worldwide environmental management standard that has been utilized by several industries. Both the ISM Code and ISO 14001:2004 have the practical goal of establishing a sustainable Integrated Environmental Management System (IEMS) for shipping businesses. This paper presents a hybrid design methodology that shows how requirements from both standards can be combined into a single execution scheme. Specifically, the Analytic Hierarchy Process (AHP) and Fuzzy Axiomatic Design (FAD) are used to structure an IEMS for ship management companies. This research provides a decision aid to maritime executives for enhancing environmental performance in the shipping industry. PMID:19038488
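
    The AHP step of such a hybrid methodology can be sketched briefly. The three criteria and the pairwise-comparison values below are invented for illustration (they are not from the paper); only the principal-eigenvector computation and the consistency check are standard AHP:

```python
import numpy as np

# Hypothetical pairwise comparisons over three IEMS criteria
# (say: safety, environmental protection, documentation),
# using Saaty's 1-9 scale; A[i, j] = importance of i relative to j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# AHP priority weights = normalized principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = eigvecs[:, k].real
w = w / w.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI,
# with random index RI = 0.58 for n = 3; CR < 0.1 is acceptable.
n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)
CR = CI / 0.58
print(w, CR)
```

    The resulting weights would then feed the FAD stage, which the paper uses to score design alternatives against these criteria.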

  12. Design optimization of a permanent magnet synchronous motor by the response surface methodology

    NASA Astrophysics Data System (ADS)

    Fujishima, Y.; Wakao, S.; Yamashita, A.; Katsuta, T.; Matsuoka, K.; Kondo, M.

    2002-05-01

    This article proposes an effective computational approach to the design optimization of an outer-rotor-type permanent magnet synchronous motor. Because of the complicated rotor configuration and the complex magnetic saturation effects, it is difficult to design a lightweight permanent magnet synchronous motor structure that makes good use of reluctance torque within an acceptable CPU time. In this article, we adopt the finite element method for magnetic field analysis and genetic algorithms as the search method. Furthermore, the response surface methodology, which enables the objective physical quantities to be evaluated in a much shorter time, is incorporated into the proposed approach. This results in an overall increase in optimization speed, that is, a substantial CPU-time reduction in comparison with a conventional approach. Some numerical results that demonstrate the validity of the proposed approach are also presented.
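
    The interplay described here, an expensive field analysis replaced by a cheap response surface that a genetic algorithm then searches, can be sketched in one dimension. The stand-in objective, population size, and mutation scale below are all illustrative assumptions, not the paper's actual finite-element model or GA settings:

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_eval(x):
    # Stand-in for a costly finite-element evaluation; the real
    # objective in the paper comes from magnetic field analysis.
    return (x - 1.7) ** 2 + 0.4 * np.sin(3 * x)

# Step 1: sample a few designs and fit a quadratic response surface.
xs = np.linspace(0.0, 3.0, 7)
ys = expensive_eval(xs)
c2, c1, c0 = np.polyfit(xs, ys, 2)       # surrogate y ~ c2 x^2 + c1 x + c0

def surrogate(x):
    return c2 * x ** 2 + c1 * x + c0

# Step 2: run a tiny genetic algorithm on the cheap surrogate instead
# of the expensive model -- this is where the CPU time is saved.
pop = rng.uniform(0.0, 3.0, size=40)
for _ in range(60):
    fit = surrogate(pop)
    parents = pop[np.argsort(fit)[:20]]              # truncation selection
    children = parents + rng.normal(0.0, 0.05, 20)   # Gaussian mutation
    pop = np.clip(np.concatenate([parents, children]), 0.0, 3.0)

best = pop[np.argmin(surrogate(pop))]
print(best)  # lands near the surrogate's minimum
```

    In practice the surrogate is refitted as promising regions are found, and the final candidates are re-checked with the full field analysis.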

  13. A Systematic Composite Service Design Modeling Method Using Graph-Based Theory

    PubMed Central

    Elhag, Arafat Abdulgader Mohammed; Mohamad, Radziah; Aziz, Muhammad Waqar; Zeshan, Furkh

    2015-01-01

    Composite service design modeling is an essential process of the service-oriented software development life cycle, in which the candidate services, composite services, operations, and their dependencies are required to be identified and specified before their design. However, a systematic service-oriented design modeling method for composite services is still in its infancy, as most of the existing approaches provide the modeling of atomic services only. For these reasons, a new method (ComSDM) is proposed in this work for modeling the concept of service-oriented design to increase the reusability and decrease the complexity of the system while keeping the service composition considerations in mind. Furthermore, the ComSDM method provides a mathematical representation of the components of service-oriented design using graph-based theory to facilitate design quality measurement. To demonstrate that the ComSDM method is also suitable for composite service design modeling of distributed embedded real-time systems along with enterprise software development, it is implemented in the case study of a smart home. The results of the case study not only check the applicability of ComSDM, but can also be used to validate its complexity and reusability. This also guides future research toward design quality measurement, such as using the ComSDM method to measure the quality of composite service design in a service-oriented software system. PMID:25928358
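
    As a loose illustration of representing a composite service design as a graph, consider the toy dependency graph below. The service names and the fan-in/fan-out coupling metric are invented for the sketch; they are not the actual ComSDM formulation:

```python
# Toy smart-home composite service: each key depends on the listed services.
edges = {
    "ClimateControl": ["TemperatureSensor", "HVACActuator"],
    "SecurityMonitor": ["MotionSensor", "AlarmService"],
    "EnergyManager": ["ClimateControl", "SmartMeter"],
}

# Collect all nodes and count fan-out (dependencies used)
# and fan-in (dependents) for every service.
nodes = set(edges) | {d for deps in edges.values() for d in deps}
fan_out = {n: len(edges.get(n, [])) for n in nodes}
fan_in = {n: 0 for n in nodes}
for deps in edges.values():
    for d in deps:
        fan_in[d] += 1

# A simple structural-complexity proxy: total dependency count per node.
coupling = {n: fan_in[n] + fan_out[n] for n in nodes}
print(max(coupling, key=coupling.get))  # the most entangled service
```

    On such a graph representation, design-quality measures (coupling, reuse, composition depth) reduce to graph computations, which is the general idea the method exploits.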

  14. A systematic composite service design modeling method using graph-based theory.

    PubMed

    Elhag, Arafat Abdulgader Mohammed; Mohamad, Radziah; Aziz, Muhammad Waqar; Zeshan, Furkh

    2015-01-01

    Composite service design modeling is an essential process of the service-oriented software development life cycle, in which the candidate services, composite services, operations, and their dependencies are required to be identified and specified before their design. However, a systematic service-oriented design modeling method for composite services is still in its infancy, as most of the existing approaches provide the modeling of atomic services only. For these reasons, a new method (ComSDM) is proposed in this work for modeling the concept of service-oriented design to increase the reusability and decrease the complexity of the system while keeping the service composition considerations in mind. Furthermore, the ComSDM method provides a mathematical representation of the components of service-oriented design using graph-based theory to facilitate design quality measurement. To demonstrate that the ComSDM method is also suitable for composite service design modeling of distributed embedded real-time systems along with enterprise software development, it is implemented in the case study of a smart home. The results of the case study not only check the applicability of ComSDM, but can also be used to validate its complexity and reusability. This also guides future research toward design quality measurement, such as using the ComSDM method to measure the quality of composite service design in a service-oriented software system. PMID:25928358

  15. Methodology for Sensitivity Analysis, Approximate Analysis, and Design Optimization in CFD for Multidisciplinary Applications

    NASA Technical Reports Server (NTRS)

    Taylor, Arthur C., III; Hou, Gene W.

    1996-01-01

    An incremental iterative formulation, together with the well-known spatially split approximate-factorization algorithm, is presented for solving the large, sparse systems of linear equations associated with aerodynamic sensitivity analysis. This formulation is also known as the 'delta' or 'correction' form. For the smaller two-dimensional problems, a direct method can be applied to solve these linear equations in either the standard or the incremental form, in which case the two are equivalent. However, iterative methods are needed for larger two-dimensional and three-dimensional applications because direct methods require more computer memory than is currently available. Iterative methods for solving these equations in the standard form are generally unsatisfactory due to an ill-conditioned coefficient matrix; this problem is overcome when the equations are cast in the incremental form. The methodology is successfully implemented and tested using an upwind cell-centered finite-volume formulation applied in two dimensions to the thin-layer Navier-Stokes equations for external flow over an airfoil. In three dimensions the methodology is demonstrated with a marching-solution algorithm for the Euler equations to calculate supersonic flow over the High-Speed Civil Transport configuration (HSCT 24E). The sensitivity derivatives obtained with the incremental iterative method from a marching Euler code are used in a design-improvement study of the HSCT configuration that involves thickness, camber, and planform design variables.
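
    The incremental ('delta') form is simple to sketch with a generic approximate operator. In the toy example below the exact operator A is a random diagonally dominant matrix and the approximate solve is a single Jacobi step, standing in for the spatially split approximate-factorization operator of the paper:

```python
import numpy as np

# Solve A x = b in correction form: repeatedly solve the *easier*
# system P dx = r with an approximate operator P, then update
# x <- x + dx. Convergence is driven by the true residual r,
# so the converged x satisfies the exact equations.
rng = np.random.default_rng(1)
n = 50
A = 4.0 * np.eye(n) + rng.uniform(-1, 1, (n, n)) / n   # diagonally dominant
b = rng.uniform(-1, 1, n)

P_inv = 1.0 / np.diag(A)          # Jacobi: P = diag(A), trivially invertible
x = np.zeros(n)
for _ in range(200):
    r = b - A @ x                 # residual of the current iterate
    if np.linalg.norm(r) < 1e-12:
        break
    dx = P_inv * r                # cheap approximate solve for the delta
    x = x + dx

print(np.linalg.norm(A @ x - b))  # tiny: the exact system is satisfied
```

    The same structure carries over when P is an approximately factored flow Jacobian and r comes from the sensitivity equations.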

  16. Design methodology for multi-pumped discrete Raman amplifiers: case-study employing photonic crystal fibers.

    PubMed

    Castellani, C E S; Cani, S P N; Segatto, M E; Pontes, M J; Romero, M A

    2009-08-01

    This paper proposes a new design methodology for discrete multi-pumped Raman amplifiers. In a multi-objective optimization scenario, the whole solution space is first inspected by a CW analytical formulation. Then, the most promising solutions are fully investigated by a rigorous numerical treatment, so that Raman amplification performance is determined by the combination of analytical and numerical approaches. As an application of our methodology we designed a photonic crystal fiber Raman amplifier configuration which provides low ripple, high gain, clear eye opening, and a low power penalty. The amplifier configuration also fully compensates the dispersion introduced by a 70-km singlemode fiber in a 10 Gbit/s system. We have successfully obtained a configuration with 8.5 dB average gain over the C-band and 0.71 dB ripple with almost zero eye penalty, using only two pump lasers with relatively low pump power. PMID:19654822

  17. Design of roundness measurement model with multi-systematic error for cylindrical components with large radius

    NASA Astrophysics Data System (ADS)

    Sun, Chuanzhi; Wang, Lei; Tan, Jiubin; Zhao, Bo; Tang, Yangchao

    2016-02-01

    The paper designs a roundness measurement model with multiple systematic errors, taking eccentricity, probe offset, the radius of the probe tip, and tilt error into account for roundness measurement of cylindrical components. The effects of the systematic errors and of the component radius on roundness measurement are analysed. The proposed method is built on an instrument with a high-precision rotating spindle. The effectiveness of the proposed method is verified by experiment with a standard cylindrical component measured on a roundness measuring machine. Compared to the traditional limacon measurement model, the accuracy of roundness measurement can be increased by about 2.2 μm using the proposed roundness measurement model for an object with a large radius of around 37 mm. The proposed method can improve the accuracy of roundness measurement and can be used for error separation, calibration, and comparison, especially for cylindrical components with a large radius.
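
    For context, the classical limacon model that the authors compare against can be sketched as a least-squares fit that separates spindle eccentricity from the form error of interest. The radius, eccentricity, and form-error amplitude below are invented test values, not the paper's measurements:

```python
import numpy as np

theta = np.linspace(0.0, 2 * np.pi, 360, endpoint=False)

# Synthetic probe signal: nominal radius 37 mm, a few microns of
# eccentricity, plus a small 5-lobe form error (the quantity we want).
R, ecc_x, ecc_y = 37.0, 0.003, -0.002            # mm
form = 0.001 * np.cos(5 * theta)                 # 1 um amplitude, 5 lobes
r = R + ecc_x * np.cos(theta) + ecc_y * np.sin(theta) + form

# Least-squares limacon fit r ~ R + a cos(theta) + b sin(theta):
# the first-order model whose limitations for large radii motivate
# the multi-systematic-error model of the paper.
M = np.column_stack([np.ones_like(theta), np.cos(theta), np.sin(theta)])
coef, *_ = np.linalg.lstsq(M, r, rcond=None)
residual = r - M @ coef                          # eccentricity removed

roundness = residual.max() - residual.min()      # peak-to-valley, mm
print(coef[1], coef[2], roundness)               # recovers ecc and the 2 um PV
```

    Because the harmonics are orthogonal over a full revolution, the fit recovers the eccentricity exactly here; on a real instrument the extra error sources named in the abstract bias this simple model, which is what the proposed model corrects.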

  18. Design of roundness measurement model with multi-systematic error for cylindrical components with large radius.

    PubMed

    Sun, Chuanzhi; Wang, Lei; Tan, Jiubin; Zhao, Bo; Tang, Yangchao

    2016-02-01

    The paper designs a roundness measurement model with multiple systematic errors, taking eccentricity, probe offset, the radius of the probe tip, and tilt error into account for roundness measurement of cylindrical components. The effects of the systematic errors and of the component radius on roundness measurement are analysed. The proposed method is built on an instrument with a high-precision rotating spindle. The effectiveness of the proposed method is verified by experiment with a standard cylindrical component measured on a roundness measuring machine. Compared to the traditional limacon measurement model, the accuracy of roundness measurement can be increased by about 2.2 μm using the proposed roundness measurement model for an object with a large radius of around 37 mm. The proposed method can improve the accuracy of roundness measurement and can be used for error separation, calibration, and comparison, especially for cylindrical components with a large radius. PMID:26931894

  19. Integrated active and passive control design methodology for the LaRC CSI evolutionary model

    NASA Technical Reports Server (NTRS)

    Voth, Christopher T.; Richards, Kenneth E., Jr.; Schmitz, Eric; Gehling, Russel N.; Morgenthaler, Daniel R.

    1994-01-01

    A general design methodology to integrate active control with passive damping was demonstrated on the NASA LaRC CSI Evolutionary Model (CEM), a ground testbed for future large, flexible spacecraft. Vibration suppression controllers designed for Line-of-Sight (LOS) minimization were successfully implemented on the CEM. A frequency-shaped H2 methodology was developed, allowing the designer to specify the roll-off of the MIMO compensator. A closed-loop bandwidth of 4 Hz, including the six rigid-body modes and the first three dominant elastic modes of the CEM, was achieved. Good agreement was demonstrated between experimental data and analytical predictions for the closed-loop frequency response and random tests. Using the Modal Strain Energy (MSE) method, a passive damping treatment consisting of 60 viscoelastically damped struts was designed, fabricated, and implemented on the CEM. Damping levels for the targeted modes were more than an order of magnitude larger than for the undamped structure. Using measured loss and stiffness data for the individual damped struts, analytical predictions of the damping levels were very close to the experimental values in the 1-10 Hz frequency range where the open-loop model matched the experimental data. An integrated active/passive controller was successfully implemented on the CEM and was evaluated against an active-only controller. A two-fold increase in the effective control bandwidth and further reductions of 30 percent to 50 percent in the LOS RMS outputs were achieved compared to an active-only controller. Superior performance was also obtained compared to a High-Authority/Low-Authority (HAC/LAC) controller.

  20. A Methodological Approach for Designating Management Zones in Mount Spil National Park, Turkey.

    PubMed

    Hepcan

    2000-09-01

    This study was undertaken to (1) determine the suitability of ecosystems within Mount Spil National Park (Turkey) for human activities through a systematic zoning procedure, and (2) provide the basis for developing sound management strategies based on the natural-cultural resource attributes of the park. After assessing natural-cultural resources and human activity requirements, the suitability of three zones (Strict Protection Zone, SPZ; Restricted Use Zone, RUZ; and Recreation and Administration Zone, RAZ) for proposed human activities/land uses was determined in order to maintain ecological sustainability and integrity through a weighting-ranking methodology, based on a grid cell resolution of 1 km x 1 km. Results showed that, of the three management zones, the RUZ, in which recreational activities that do not require physical development are allowed, constituted 82% of the park area as the first-priority management zone. The proposed zoning procedure is believed to be a key step toward improving management for both the study area and other national parks with similar landscape features. PMID:10977885
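
    A weighting-ranking overlay of the kind described can be sketched on a toy grid. The two criterion layers, the weights, and the zone thresholds below are invented for illustration; the study's actual scores come from its resource surveys:

```python
import numpy as np

# Hypothetical 1 km x 1 km grid-cell scores (1 = low, 5 = high sensitivity)
# for two criteria; a real analysis would stack many such layers.
habitat = np.array([[5, 4, 2, 1],
                    [4, 4, 3, 2],
                    [3, 3, 2, 1],
                    [2, 2, 1, 1]])
slope = np.array([[4, 3, 2, 2],
                  [5, 4, 2, 1],
                  [3, 2, 2, 1],
                  [2, 1, 1, 1]])

# Weighted overlay: higher combined sensitivity calls for a stricter zone.
score = 0.6 * habitat + 0.4 * slope

# Threshold into the study's three zones:
# 0 = RAZ (recreation/administration), 1 = RUZ, 2 = SPZ (strict protection).
zone = np.digitize(score, bins=[2.0, 3.5])
print((zone == 1).mean())  # share of cells falling in the restricted-use zone
```

    The zone shares then drive management rules per cell, which is how the 82% RUZ figure in the study arises from its own layers and weights.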

  1. Modeling and Design Analysis Methodology for Tailoring of Aircraft Structures with Composites

    NASA Technical Reports Server (NTRS)

    Rehfield, Lawrence W.

    2004-01-01

    Composite materials provide design flexibility in that fiber placement and orientation can be specified and a variety of material forms and manufacturing processes are available. It is possible, therefore, to 'tailor' the structure to a high degree in order to meet specific design requirements in an optimum manner. Common industrial practices, however, have limited the choices designers make. One of the reasons for this is that there is a dearth of conceptual/preliminary design analysis tools specifically devoted to identifying structural concepts for composite airframe structures. Large-scale finite element simulations are not suitable for such purposes. The present project has been devoted to creating modeling and design analysis methodology for use in the tailoring process of aircraft structures. Emphasis has been given to creating bend-twist elastic coupling in high-aspect-ratio wings or other lifting surfaces. The direction of our work was in concert with the overall NASA effort Twenty-First Century Aircraft Technology (TCAT). A multi-disciplinary team was assembled by Dr. Damodar Ambur to work on wing technology, which included our project.

  2. A robust rotorcraft flight control system design methodology utilizing quantitative feedback theory

    NASA Technical Reports Server (NTRS)

    Gorder, Peter James

    1993-01-01

    Rotorcraft flight control systems present design challenges which often exceed those associated with fixed-wing aircraft. First, large variations in the response characteristics of the rotorcraft result from the wide range of airspeeds of typical operation (hover to over 100 kts). Second, the assumption of vehicle rigidity often employed in the design of fixed-wing flight control systems is rarely justified in rotorcraft, where rotor degrees of freedom can have a significant impact on system performance and stability. This research was intended to develop a methodology for the design of robust rotorcraft flight control systems. Quantitative Feedback Theory (QFT) was chosen as the basis for the investigation. Quantitative Feedback Theory is a technique which accounts for variability in the dynamic response of the controlled element in the design of robust control systems. It was developed to address a Multiple-Input Single-Output (MISO) design problem, and utilizes two degrees of freedom to satisfy the design criteria. Two techniques were examined for extending the QFT MISO technique to the design of a Multiple-Input Multiple-Output (MIMO) flight control system (FCS) for a UH-60 Black Hawk helicopter. In the first, a set of MISO systems, mathematically equivalent to the MIMO system, was determined, and QFT was applied to each member of the set simultaneously. In the second, the same set of equivalent MISO systems was analyzed sequentially, with closed-loop response information from each loop utilized in subsequent MISO designs. The results of each technique were compared, and the advantages of the second, termed Sequential Loop Closure, were clearly evident.

  3. Electronic Symptom Reporting Between Patient and Provider for Improved Health Care Service Quality: A Systematic Review of Randomized Controlled Trials. Part 2: Methodological Quality and Effects

    PubMed Central

    Berntsen, Gro K Rosvold; Schuster, Tibor; Henriksen, Eva; Horsch, Alexander

    2012-01-01

    Background We conducted in two parts a systematic review of randomized controlled trials (RCTs) on electronic symptom reporting between patients and providers to improve health care service quality. Part 1 reviewed the typology of patient groups, health service innovations, and research targets. Four innovation categories were identified: consultation support, monitoring with clinician support, self-management with clinician support, and therapy. Objective To assess the methodological quality of the RCTs, and summarize effects and benefits from the methodologically best studies. Methods We searched Medline, EMBASE, PsycINFO, Cochrane Central Register of Controlled Trials, and IEEE Xplore for original studies presented in English-language articles between 1990 and November 2011. Risk of bias and feasibility were judged according to the Cochrane recommendation, and theoretical evidence and preclinical testing were evaluated according to the Framework for Design and Evaluation of Complex Interventions to Improve Health. Three authors assessed the risk of bias and two authors extracted the effect data independently. Disagreement regarding bias assessment, extraction, and interpretation of results were resolved by consensus discussions. Results Of 642 records identified, we included 32 articles representing 29 studies. No articles fulfilled all quality requirements. All interventions were feasible to implement in a real-life setting, and theoretical evidence was provided for almost all studies. However, preclinical testing was reported in only a third of the articles. We judged three-quarters of the articles to have low risk for random sequence allocation and approximately half of the articles to have low risk for the following biases: allocation concealment, incomplete outcome data, and selective reporting. Slightly more than one fifth of the articles were judged as low risk for blinding of outcome assessment. Only 1 article had low risk of bias for blinding of

  4. Integrated controls-structures design methodology development for a class of flexible spacecraft

    NASA Technical Reports Server (NTRS)

    Maghami, P. G.; Joshi, S. M.; Walz, J. E.; Armstrong, E. S.

    1990-01-01

    Future utilization of space will require large space structures in low-Earth and geostationary orbits. Example missions include: Earth observation systems, personal communication systems, space science missions, space processing facilities, etc., requiring large antennas, platforms, and solar arrays. The dimensions of such structures will range from a few meters to possibly hundreds of meters. For reducing the cost of construction, launching, and operating (e.g., energy required for reboosting and control), it will be necessary to make the structure as light as possible. However, reducing structural mass tends to increase the flexibility which would make it more difficult to control with the specified precision in attitude and shape. Therefore, there is a need to develop a methodology for designing space structures which are optimal with respect to both structural design and control design. In the current spacecraft design practice, it is customary to first perform the structural design and then the controller design. However, the structural design and the control design problems are substantially coupled and must be considered concurrently in order to obtain a truly optimal spacecraft design. For example, let C denote the set of the 'control' design variables (e.g., controller gains), and L the set of the 'structural' design variables (e.g., member sizes). If a structural member thickness is changed, the dynamics would change which would then change the control law and the actuator mass. That would, in turn, change the structural model. Thus, the sets C and L depend on each other. Future space structures can be roughly divided into four mission classes. Class 1 missions include flexible spacecraft with no articulated appendages which require fine attitude pointing and vibration suppression (e.g., large space antennas). 
Class 2 missions consist of flexible spacecraft with articulated multiple payloads, where the requirement is to fine-point the spacecraft and each

  5. Development of a design methodology for pipelines in ice scoured seabeds

    SciTech Connect

    Clark, J.I.; Paulin, M.J.; Lach, P.R.; Yang, Q.S.; Poorooshasb, H.

    1994-12-31

    Large areas of the continental shelf of northern oceans are frequently scoured or gouged by moving bodies of ice such as icebergs and sea ice keels associated with pressure ridges. This phenomenon presents a formidable challenge when the route of a submarine pipeline is intersected by the scouring ice. It is generally acknowledged that if a pipeline, laid on the seabed, were hit by an iceberg or a pressure ridge keel, the forces imposed on the pipeline would be much greater than it could practically withstand. The pipeline must therefore be buried to avoid direct contact with ice, but it is very important to determine with some assurance the minimum depth required for safety for both economical and environmental reasons. The safe burial depth of a pipeline, however, cannot be determined directly from the relatively straightforward measurement of maximum scour depth. The major design consideration is the determination of the potential sub-scour deformation of the ice scoured soil. Forces transmitted through the soil and soil displacement around the pipeline could load the pipeline to failure if not taken into account in the design. If the designer can predict the forces transmitted through the soil, the pipeline can be designed to withstand these external forces using conventional design practice. In this paper, the authors outline a design methodology that is based on phenomenological studies of ice scoured terrain, both modern and relict, laboratory tests, centrifuge modeling, and numerical analysis. The implications of these studies, which could assist in the safe and economical design of pipelines in ice scoured terrain, are also discussed.

  6. Systematic design of output filters for audio class-D amplifiers via Simplified Real Frequency Technique

    NASA Astrophysics Data System (ADS)

    Hintzen, E.; Vennemann, T.; Mathis, W.

    2014-11-01

    In this paper a new filter design concept is proposed and implemented which takes into account the complex loudspeaker impedance. By means of broadband matching techniques, which have been successfully applied in radio technology, we are able to optimize the reconstruction filter to achieve an overall linear frequency response. Here, a passive filter network is inserted between source and load that matches the complex load impedance to the complex source impedance within a desired frequency range. The design and calculation of the filter is usually done using numerical approximation methods known as Real Frequency Techniques (RFT). A first approach to the systematic design of reconstruction filters for class-D amplifiers is proposed, using the Simplified Real Frequency Technique (SRFT). Some fundamental considerations are introduced, as well as the benefits and challenges of impedance matching between class-D amplifiers and loudspeakers. Current simulation data using MATLAB is presented and supports some first conclusions.

  7. A Test Methodology for Determining Space-Readiness of Xilinx SRAM-Based FPGA Designs

    SciTech Connect

    Quinn, Heather M; Graham, Paul S; Morgan, Keith S; Caffrey, Michael P

    2008-01-01

    Using reconfigurable, static random-access memory (SRAM) based field-programmable gate arrays (FPGAs) for space-based computation has been an exciting area of research for the past decade. Since both the circuit and the circuit's state are stored in radiation-sensitive memory, both could be altered by the harsh space radiation environment. Both the circuit and the circuit's state can be protected by triple-modular redundancy (TMR), but applying TMR to FPGA user designs is often an error-prone process. Faulty application of TMR could cause the FPGA user circuit to output incorrect data. This paper describes a three-tiered methodology for testing FPGA user designs for space-readiness. We describe the standard approach to testing FPGA user designs using a particle accelerator, as well as two methods using fault injection and a modeling tool. While accelerator testing is the current 'gold standard' for pre-launch testing, we believe the use of fault injection and modeling tools allows for easy, cheap, and uniform access for discovering errors early in the design process.
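    The protection mechanism named in this abstract, triple-modular redundancy, reduces to a per-bit majority vote across three copies of a value. A minimal illustrative sketch in Python (hypothetical; the actual Xilinx TMR flow triplicates logic and voters in the FPGA netlist rather than voting in software):

```python
# Illustrative sketch of triple-modular redundancy (TMR): three redundant
# copies of a word are reduced by a per-bit majority vote, so an upset in
# any single copy is masked.
def tmr_vote(a: int, b: int, c: int) -> int:
    """Bitwise majority of three redundant copies."""
    return (a & b) | (a & c) | (b & c)

golden = 0b1011_0010
corrupted = golden ^ 0b0000_1000  # single-event upset flips one bit
assert tmr_vote(golden, corrupted, golden) == golden
```

    A faulty TMR application, as the abstract warns, breaks exactly this guarantee: if two of the three copies share logic, a single upset can corrupt a majority.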

  8. Methodology to Improve Design of Accelerated Life Tests in Civil Engineering Projects

    PubMed Central

    Lin, Jing; Yuan, Yongbo; Zhou, Jilai; Gao, Jie

    2014-01-01

    For reliability testing, an Energy Expansion Tree (EET) and a companion Energy Function Model (EFM) are proposed and described in this paper. Different from conventional approaches, the EET provides a more comprehensive and objective way to systematically identify external energy factors affecting reliability. The EFM introduces energy loss into a traditional Function Model to identify internal energy sources affecting reliability. The combination creates a sound way to enumerate the energies to which a system may be exposed during its lifetime. We input these energies into planning an accelerated life test, a Multi Environment Over Stress Test. The test objective is to discover weak links and interactions among the system and the energies to which it is exposed, and design them out. As an example, the methods are applied to the pipe in a subsea pipeline. However, they can be widely used in other civil engineering industries as well. The proposed method is compared with current methods. PMID:25111800

  9. Development of a decision-making methodology to design a water quality monitoring network.

    PubMed

    Keum, Jongho; Kaluarachchi, Jagath J

    2015-07-01

    The number of water quality monitoring stations in the USA has decreased over the past few decades. Scarcity of observations can easily produce prediction uncertainty due to unreliable model calibration. An effective water quality monitoring network is important not only for model calibration and water quality prediction but also for resources management. Redundant or improperly located monitoring stations may cause increased monitoring costs without improvement to the understanding of water quality in watersheds. In this work, a decision-making methodology is proposed to design a water quality monitoring network by providing an adequate number of monitoring stations and their approximate locations at the eight-digit hydrologic unit codes (HUC8) scale. The proposed methodology is demonstrated with an example from the Upper Colorado River Basin (UCRB), where salinity is a serious concern. The level of monitoring redundancy or scarcity is defined by an index, the station ratio (SR), which represents a monitoring density based on the water quality load originating within a subbasin. By comparing the number of stations derived from a selected target SR with the available number of stations (both actual and potential), the suggested number of stations in each subbasin was determined. If monitoring stations are primarily located in low salinity loading subbasins, the average actual SR tends to increase, and vice versa. Results indicate that the spatial distribution of monitoring locations in 2011 is concentrated on low salinity loading subbasins; therefore, additional monitoring is required for the high salinity loading subbasins. The proposed methodology shows that the SR is a simple and practical indicator of monitoring density. PMID:26113203
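    The station-ratio idea lends itself to a back-of-the-envelope sketch. The code below is a hypothetical illustration only: the load figures, station counts, and the exact SR formula are invented, not the paper's UCRB data.

```python
# Hypothetical sketch of the station-ratio (SR) concept: monitoring density
# relative to a subbasin's share of the basin-wide salinity load. All
# numbers and the SR formula are illustrative assumptions.
def station_ratio(n_stations: int, subbasin_load: float, total_load: float) -> float:
    """Stations per unit share of the total load."""
    return n_stations / (subbasin_load / total_load)

loads = {"A": 800.0, "B": 150.0, "C": 50.0}   # salinity load per subbasin (made up)
stations = {"A": 2, "B": 4, "C": 3}           # existing stations (made up)
total = sum(loads.values())
sr = {k: station_ratio(stations[k], loads[k], total) for k in loads}
# Subbasin A carries most of the load but has the lowest SR, flagging it
# as under-monitored relative to the low-load subbasins B and C.
```

    Comparing each subbasin's SR against a target SR then suggests where stations should be added or retired, which is the decision step the abstract describes.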

  10. Systematic screening of the cellular uptake of designed alpha-helix peptides.

    PubMed

    Usui, Kenji; Kikuchi, Takuya; Mie, Masayasu; Kobatake, Eiry; Mihara, Hisakazu

    2013-05-01

    The cellular penetration (CP) activity of functional molecules has attracted significant attention as one of the most promising new approaches for drug delivery. In particular, cell-penetrating peptides (CPPs) have been studied extensively in cellular engineering. Because there have been few large-scale systematic studies to identify peptide sequences with optimal CP activity or that are suitable for further applications in cell engineering, such as cell-specific penetration and cell-selective culture, we screened and compared the cellular uptake (CU) activity of 54 systematically designed α-helical peptides in HeLa cells. Furthermore, the CU activity of 24 designed peptides was examined in four cell lines using a cell fingerprinting technique and statistical approaches. The CU activities in various cells depended on amino acid residues of peptide sequences as well as charge, α-helical content and hydrophobicity of the peptides. Notably, the mutation of a single residue significantly altered the CU ability of a peptide, highlighting the variability of cell uptake mechanisms. Moreover, these results demonstrated the feasibility of cell-selective culture by conducting cell-selective permeation and death in cultures containing two cell types. These studies may lead to further peptide library design and screening for new classes of CPPs with useful functions. PMID:23498920

  11. Improving Clinical Trial Participant Tracking Tools Using Knowledge-Anchored Design Methodologies

    PubMed Central

    Payne, P.R.O.; Embi, P.J.; Johnson, S.B.; Mendonca, E.; Starren, J.

    2010-01-01

    Objective Rigorous human-computer interaction (HCI) design methodologies have not traditionally been applied to the development of clinical trial participant tracking (CTPT) tools. Given the frequent use of iconic HCI models in CTPTs, and prior evidence of usability problems associated with the use of ambiguous icons in complex interfaces, such approaches may be problematic. Presentation Discovery (PD), a knowledge-anchored HCI design method, has been previously demonstrated to improve the design of iconic HCI models. In this study, we compare the usability of a CTPT HCI model designed using PD and an intuitively designed CTPT HCI model. Methods An iconic CPTP HCI model was created using PD. The PD-generated and an existing iconic CTPT HCI model were subjected to usability testing, with an emphasis on task accuracy and completion times. Study participants also completed a qualitative survey instrument to evaluate subjective satisfaction with the two models. Results CTPT end-users reliably and reproducibly agreed on the visual manifestation and semantics of prototype graphics generated using PD. The performance of the PD-generated iconic HCI model was equivalent to an existing HCI model for tasks at multiple levels of complexity, and in some cases superior. This difference was particularly notable when tasks required an understanding of the semantic meanings of multiple icons. Conclusion The use of PD to design an iconic CTPT HCI model generated beneficial results and improved end-user subjective satisfaction, while reducing task completion time. Such results are desirable in information and time intensive domains, such as clinical trials management. PMID:22132037

  12. A probabilistic methodology for radar cross section prediction in conceptual aircraft design

    NASA Astrophysics Data System (ADS)

    Hines, Nathan Robert

    System effectiveness has increasingly become the prime metric for the evaluation of military aircraft. As such, it is the decision maker's/designer's goal to maximize system effectiveness. Industry and government research documents indicate that all future military aircraft will incorporate signature reduction in an attempt to improve system effectiveness and reduce the cost of attrition. Today's operating environments demand low observable aircraft which are able to reliably take out valuable, time critical targets. Thus it is desirable to be able to design vehicles that are balanced for increased effectiveness. Previous studies have shown that shaping of the vehicle is one of the most important contributors to radar cross section, a measure of radar signature, and must be considered from the very beginning of the design process. Radar cross section estimation should be incorporated into conceptual design to develop more capable systems. This research strives to meet these needs by developing a conceptual design tool that predicts radar cross section for parametric geometries. This tool predicts the absolute radar cross section of the vehicle as well as the impact of geometry changes, allowing for the simultaneous tradeoff of the aerodynamic, performance, and cost characteristics of the vehicle with the radar cross section. Furthermore, this tool can be linked to a campaign theater analysis code to demonstrate the changes in system and system-of-systems effectiveness due to changes in aircraft geometry. A general methodology was developed and implemented, and sample computer codes were applied to prototype the proposed process. Studies utilizing this radar cross section tool were subsequently performed to demonstrate the capabilities of this method and show the impact that various inputs have on the outputs of these models. The F/A-18 aircraft configuration was chosen as a case study vehicle to perform a design space exercise and to investigate the relative impact of

  13. Assessment of an effective quasirelativistic methodology designed to study astatine chemistry in aqueous solution.

    PubMed

    Champion, Julie; Seydou, Mahamadou; Sabatié-Gogova, Andrea; Renault, Eric; Montavon, Gilles; Galland, Nicolas

    2011-09-01

    A cost-effective computational methodology designed to study astatine (At) chemistry in aqueous solution has been established. It is based on two-component spin-orbit density functional theory calculations and solvation calculations using the conductor-like polarizable continuum model in conjunction with specific astatine cavities. Theoretical calculations are confronted with experimental data measured for complexation reactions between metallic forms of astatine (At(+) and AtO(+)) and inorganic ligands (Cl(-), Br(-) and SCN(-)). For each reaction, both 1:1 and 1:2 complexes are evidenced. The experimental trends regarding the thermodynamic constants (K) can be reproduced qualitatively and quantitatively. The mean signed error on computed Log K values is -0.4, which corresponds to a mean signed error smaller than 1 kcal mol(-1) on free energies of reaction. Theoretical investigations show that the reactivity of cationic species of astatine is highly sensitive to spin-orbit coupling and solvent effects. At the moment, the presented computational methodology appears to be the only tool to gain an insight into astatine chemistry at a molecular level. PMID:21769335

  14. Methodologies used in cost-effectiveness models for evaluating treatments in major depressive disorder: a systematic review

    PubMed Central

    2012-01-01

    Background Decision makers in many jurisdictions use cost-effectiveness estimates as an aid for selecting interventions with an appropriate balance between health benefits and costs. This systematic literature review aims to provide an overview of published cost-effectiveness models in major depressive disorder (MDD) with a focus on the methods employed. Key components of the identified models are discussed and any challenges in developing models are highlighted. Methods A systematic literature search was performed to identify all primary model-based economic evaluations of MDD interventions indexed in MEDLINE, the Cochrane Library, EMBASE, EconLit, and PsycINFO between January 2000 and May 2010. Results A total of 37 studies were included in the review. These studies predominantly evaluated antidepressant medications. The analyses were performed across a broad set of countries. The majority of models were decision-trees; eight were Markov models. Most models had a time horizon of less than 1 year. The majority of analyses took a payer perspective. Clinical input data were obtained from pooled placebo-controlled comparative trials, single head-to-head trials, or meta-analyses. The majority of studies (24 of 37) used treatment success or symptom-free days as main outcomes, 14 studies incorporated health state utilities, and 2 used disability-adjusted life-years. A few models (14 of 37) incorporated probabilities and costs associated with suicide and/or suicide attempts. Two models examined the cost-effectiveness of second-line treatment in patients who had failed to respond to initial therapy. Resource use data used in the models were obtained mostly from expert opinion. All studies, with the exception of one, explored parameter uncertainty. 
Conclusions The review identified several model input data gaps, including utility values in partial responders, efficacy of second-line treatments, and resource utilisation estimates obtained from relevant, high-quality studies

  15. Detailed Methodology for Systematic Reviews of Interventions to Improve the Sexual and Reproductive Health of Young People in Low- and Middle-Income Countries.

    PubMed

    Hindin, Michelle J; Kalamar, Amanda M

    2016-09-01

    The goal of this project was to systematically review and compile evidence on interventions in low- and middle-income countries, which targeted three adverse health-related outcomes for young people (ages 10-24): (1) early pregnancy and repeat pregnancy; (2) child marriage; and (3) sexually transmitted infections including human immunodeficiency virus. We searched the gray and published literature to identify interventions and developed a scoring system to assess whether these interventions and their evaluations were of high quality. The three review articles in this volume focus on behavioral outcomes and provide a summary of interventions and evaluations that were both successful and unsuccessful in their impact on the targeted outcomes. This commentary provides the details of the methodology that are common across all three review articles. PMID:27562451

  16. Mixed culture optimization for marigold flower ensilage via experimental design and response surface methodology.

    PubMed

    Navarrete-Bolaños, José Luis; Jiménez-Islas, Hugo; Botello-Alvarez, Enrique; Rico-Martínez, Ramiro

    2003-04-01

    Endogenous microorganisms isolated from the marigold flower (Tagetes erecta) were studied to understand the events taking place during its ensilage. Studies of the cellulase enzymatic activity and the ensilage process were undertaken. In both studies, the use of approximate second-order models and multiple linear regression, within the context of an experimental mixture design using the response surface methodology as the optimization strategy, determined that the microorganisms Flavobacterium IIb, Acinetobacter anitratus, and Rhizopus nigricans are the most significant in marigold flower ensilage and exhibit high cellulase activity. A mixed culture composed of 9.8% Flavobacterium IIb, 41% A. anitratus, and 49.2% R. nigricans used during ensilage resulted in an increased yield of total xanthophylls extracted of 24.94 g/kg of dry weight, compared with 12.92 for the uninoculated control ensilage. PMID:12670157
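    The optimization step described above, a second-order model searched over a three-component mixture simplex, can be sketched as follows. This is a hypothetical illustration using a Scheffé-type quadratic mixture model; the coefficients are invented, not the paper's fitted values.

```python
import itertools

# Hypothetical Scheffe quadratic mixture model for predicted xanthophyll
# yield, maximized by grid search over the simplex x_F + x_A + x_R = 1.
# Coefficients are illustrative assumptions, not fitted values.
linear = {"F": 14.0, "A": 18.0, "R": 20.0}                      # pure-blend responses
binary = {("F", "A"): 6.0, ("F", "R"): 9.0, ("A", "R"): 22.0}   # blend synergies

def predicted_yield(x):
    y = sum(linear[c] * x[c] for c in linear)
    y += sum(binary[(i, j)] * x[i] * x[j]
             for i, j in itertools.combinations(("F", "A", "R"), 2))
    return y

# Exhaustive search over the mixture simplex in 1% steps.
candidates = ({"F": i / 100, "A": j / 100, "R": (100 - i - j) / 100}
              for i in range(101) for j in range(101 - i))
best = max(candidates, key=predicted_yield)
```

    Because the model is quadratic in the proportions, any blend with a positive interaction term can outperform every pure culture, which is exactly the effect the reported 9.8/41/49.2% mixture exploits.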

  17. Transmutation of singularities and zeros in graded index optical instruments: a methodology for designing practical devices.

    PubMed

    Hooper, I R; Philbin, T G

    2013-12-30

    We describe a design methodology for modifying the refractive index profile of graded-index optical instruments that incorporate singularities or zeros in their refractive index. The process maintains the device performance whilst resulting in graded profiles that are all-dielectric, do not require materials with unrealistic values, and that are impedance matched to the bounding medium. This is achieved by transmuting the singularities (or zeros) using the formalism of transformation optics, but with an additional boundary condition requiring the gradient of the co-ordinate transformation be continuous. This additional boundary condition ensures that the device is impedance matched to the bounding medium when the spatially varying permittivity and permeability profiles are scaled to realizable values. We demonstrate the method in some detail for an Eaton lens, before describing the profiles for an "invisible disc" and "multipole" lenses. PMID:24514824

  18. A systematic methodology for large scale compound screening: A case study on the discovery of novel S1PL inhibitors.

    PubMed

    Deniz, Utku; Ozkirimli, Elif; Ulgen, Kutlu O

    2016-01-01

    A decrease in sphingosine 1-phosphate (S1P) concentration induces migration of pathogenic T cells to the blood stream and disrupts the central nervous system (CNS); it is implicated in multiple sclerosis (MS), a progressive inflammatory disorder of the CNS, and in Alzheimer's disease (AD). A promising treatment alternative for MS and AD is inhibition of the activity of the microsomal enzyme sphingosine 1-phosphate lyase (S1PL), which degrades intracellular S1P. This report describes an integrated systematic approach comprising virtual screening, molecular docking, substructure search, and molecular dynamics simulation to discover novel S1PL inhibitors. Virtual screening of the ZINC database via ligand-based and structure-based pharmacophore models yielded 10000 hits. After molecular docking, common substructures of the top-ranking hits were identified. The ligand binding poses were optimized by induced fit docking. MD simulations were performed on the complex structures to determine the stability of the S1PL-ligand complex and to calculate the binding free energy. Selectivity of the selected molecules was examined by docking them to hERG and cytochrome P450 receptors. As a final outcome, 15 compounds from different chemotypes were proposed as potential S1PL inhibitors. These molecules may guide future medicinal chemistry efforts in the discovery of new compounds against the destructive action of pathogenic T cells. PMID:26724452

  19. Design methodology accounting for fabrication errors in manufactured modified Fresnel lenses for controlled LED illumination.

    PubMed

    Shim, Jongmyeong; Kim, Joongeok; Lee, Jinhyung; Park, Changsu; Cho, Eikhyun; Kang, Shinill

    2015-07-27

    The increasing demand for lightweight, miniaturized electronic devices has prompted the development of small, high-performance optical components for light-emitting diode (LED) illumination. As such, the Fresnel lens is widely used in applications due to its compact configuration. However, the vertical groove angle between the optical axis and the groove inner facets in a conventional Fresnel lens creates an inherent Fresnel loss, which degrades optical performance. Modified Fresnel lenses (MFLs) have been proposed in which the groove angles along the optical paths are carefully controlled; however, in practice, the optical performance of MFLs is inferior to the theoretical performance due to fabrication errors, as conventional design methods do not account for fabrication errors as part of the design process. In this study, the Fresnel loss and the loss area due to microscopic fabrication errors in the MFL were theoretically derived to determine optical performance. Based on this analysis, a design method for the MFL accounting for the fabrication errors was proposed. MFLs were fabricated using an ultraviolet imprinting process and an injection molding process, two representative processes with differing fabrication errors. The MFL fabrication error associated with each process was examined analytically and experimentally to investigate our methodology. PMID:26367631

  20. Designing reasonable accommodation of the workplace: a new methodology based on risk assessment.

    PubMed

    Pigini, L; Andrich, R; Liverani, G; Bucciarelli, P; Occhipinti, E

    2010-05-01

    If working tasks are carried out in inadequate conditions, workers with functional limitations may, over time, risk developing further disabilities. While several validated risk assessment methods exist for able-bodied workers, few studies have been carried out for workers with disabilities. This article, which reports the findings of a study funded by the Italian Ministry of Labour, proposes a general methodology for the technical and organisational re-design of a worksite, based on risk assessment and irrespective of any worker disability. To this end, a sample of 16 disabled workers, composed of people with either mild or severe motor disabilities, was recruited. Their jobs included business administration (5), computer programming (1), housework (1), mechanical work (2), textile work (1), bus driving (1), nursing (2), electrical work (1), teaching (1), and warehousing (1). By using a mix of risk assessment methods and the International Classification of Functioning (ICF) taxonomy, their worksites were re-designed in view of a reasonable accommodation, and a prospective evaluation was carried out to check whether the new design would eliminate the risks. In one case - a man with congenital malformations who works as a help-desk operator for technical assistance in the Information and Communication Technology (ICT) department of a big organisation - the accommodation was actually carried out within the time span of the study, thus making it possible to confirm the hypotheses raised in the prospective assessment. PMID:20131973

  1. Robust design of spot welds in automotive structures: A decision-making methodology

    NASA Astrophysics Data System (ADS)

    Ouisse, M.; Cogan, S.

    2010-05-01

    Automotive structures include thousands of spot welds whose design must allow the assembled vehicle to satisfy a wide variety of performance constraints including static, dynamic and crash criteria. The objective of a standard optimization strategy is to reduce the number of spot welds as much as possible while satisfying all the design objectives. However, a classical optimization of the spot weld distribution using an exhaustive search approach is simply not feasible due to the very high order of the design space and the subsequently prohibitive calculation costs. Moreover, even if this calculation could be done, the result would not necessarily be very informative with respect to the design robustness to manufacturing uncertainties (location of welds and defective welds) and to the degradation of spot welds due to fatigue effects over the lifetime of the vehicle. In this paper, a decision-making methodology is presented which allows some aspects of the robustness issues to be integrated into the spot weld design process. The starting point is a given distribution of spot welds on the structure, which is based on both engineering know-how and preliminary critical numerical results, in particular criteria such as crash behavior. An over-populated spot weld distribution is then built in order to satisfy the remaining design criteria, such as static torsion angle and modal behavior. Then, an efficient optimization procedure based on energy considerations is used to eliminate redundant spot welds while preserving as far as possible the nominal structural behavior. The resulting sub-optimal solution is then used to provide a decision indicator for defining effective quality control procedures (e.g. visual post-assembly inspection of a small number of critical spot welds) as well as designing redundancy into critical zones. The final part of the paper is related to comparing the robustness of competing designs. 
Some decision-making indicators are presented to help the

  2. Analog design optimization methodology for ultralow-power circuits using intuitive inversion-level and saturation-level parameters

    NASA Astrophysics Data System (ADS)

    Eimori, Takahisa; Anami, Kenji; Yoshimatsu, Norifumi; Hasebe, Tetsuya; Murakami, Kazuaki

    2014-01-01

    A comprehensive design optimization methodology using intuitive nondimensional parameters of inversion-level and saturation-level is proposed, especially for ultralow-power, low-voltage, and high-performance analog circuits with mixed strong, moderate, and weak inversion metal-oxide-semiconductor transistor (MOST) operations. This methodology is based on the synthesized charge-based MOST model composed of Enz-Krummenacher-Vittoz (EKV) basic concepts and advanced-compact-model (ACM) physics-based equations. The key concept of this methodology is that all circuit and system characteristics are described as some multivariate functions of inversion-level parameters, where the inversion level is used as an independent variable representative of each MOST. The analog circuit design starts from the first step of inversion-level design using universal characteristics expressed by circuit currents and inversion-level parameters without process-dependent parameters, followed by the second step of foundry-process-dependent design and the last step of verification using saturation-level criteria. This methodology also paves the way to an intuitive and comprehensive design approach for many kinds of analog circuit specifications by optimization using inversion-level log-scale diagrams and saturation-level criteria. In this paper, we introduce an example of our design methodology for a two-stage Miller amplifier.
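    The nondimensional inversion-level parameter at the heart of this methodology is the EKV inversion coefficient, IC = ID/Ispec, from which transconductance efficiency follows by a standard interpolation formula. A small sketch (the slope factor n, thermal voltage, and spec-current treatment are simplified illustrative assumptions):

```python
from math import sqrt

# Sketch of EKV/ACM-style inversion-level design: the inversion
# coefficient IC = ID / Ispec classifies the operating region, and the
# EKV interpolation gives transconductance efficiency gm/ID across all
# regions. n = 1.3 and U_T = 25.9 mV are illustrative values.
def inversion_coefficient(i_d: float, i_spec: float) -> float:
    """IC << 1: weak inversion, IC ~ 1: moderate, IC >> 1: strong."""
    return i_d / i_spec

def gm_over_id(ic: float, n: float = 1.3, u_t: float = 0.0259) -> float:
    """Transconductance efficiency (1/V) via the EKV interpolation."""
    return 1.0 / (n * u_t * (0.5 + sqrt(0.25 + ic)))

# In weak inversion gm/ID approaches the bipolar-like limit 1/(n*U_T);
# pushing into strong inversion trades efficiency for speed.
```

    Plotting gm/ID against IC on a log scale produces exactly the kind of inversion-level diagram the abstract uses for the first, process-independent design step.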

  3. Time-Varying Characteristics Analysis and Fuzzy Controller Systematic Design Method for Pressurized Water Reactor Power Control

    SciTech Connect

    Liu Shengzhi; Zhang Naiyao; Cui Zhenhua

    2004-11-15

    In this paper a systematic design method for fuzzy control systems is applied to pressurized water reactor (PWR) power control. The paper includes three parts. In the first part, a simplified time-varying linear model of the PWR power system is constructed, and its inner structure and time-varying characteristics are analyzed. This provides a solid basis for the study and design of the nuclear reactor power control system. In the second part, a systematic design method for fuzzy control systems is introduced and applied to control the nuclear reactor power process. The design procedures and parameters are given in detail. This systematic design method has some notable advantages. The control of a global fuzzy model can be decomposed into controlling a set of linear submodels. Each submodel controller can be independently designed using a linear quadratic regulator approach. This systematic design method gives a sufficient and necessary condition to guarantee the stability of fuzzy control systems; thus, better control performance can be obtained due to the accurate control gains. In the third part, the control performance of the nuclear reactor fuzzy control system is examined by simulation experiments, including nuclear reactor power shutdown, start-up, and adjustment operations. The satisfactory experimental results show that the systematic design method for fuzzy control systems is effective and feasible.
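    The decomposition described in this abstract, a set of linear submodels each with its own LQR controller blended by fuzzy membership, can be sketched as follows. Everything here is hypothetical for illustration: the operating points, triangular membership shapes, and gain values are invented, not the paper's design.

```python
# Hypothetical Takagi-Sugeno-style gain blending: each linear submodel
# (indexed by normalized power level) has gains from a local LQR design,
# and the global controller blends them by normalized membership.
def triangular(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# (normalized power level, (Kp, Ki) gains from a local LQR design) - made up
submodels = [(0.3, (-2.0, -0.5)), (0.6, (-3.5, -0.8)), (0.9, (-5.0, -1.2))]

def blended_gains(power: float):
    w = [triangular(power, p - 0.3, p, p + 0.3) for p, _ in submodels]
    s = sum(w) or 1.0
    return tuple(sum(wi * g[k] for wi, (_, g) in zip(w, submodels)) / s
                 for k in range(2))
```

    At an operating point the blended gains reduce to that submodel's LQR gains; between operating points they interpolate smoothly, which is what lets a stability condition proved per submodel carry over to the global fuzzy controller.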

  4. Engineering Bacteria to Search for Specific Concentrations of Molecules by a Systematic Synthetic Biology Design Method

    PubMed Central

    Chen, Bor-Sen

    2016-01-01

    Bacteria navigate environments full of various chemicals to seek favorable places for survival, controlling the rotation of their flagella through a complicated signal transduction pathway. By influencing this pathway, bacteria can be engineered to search for specific molecules, which has great potential for application to biomedicine and bioremediation. In this study, genetic circuits were constructed through a synthetic biology method to make bacteria search for a specific molecule at particular concentrations in their environment. In addition, by replacing the “brake component” in the synthetic circuit with one of a specific sensitivity, the bacteria can be engineered to locate areas containing specific concentrations of the molecule. Measured qualitatively by the swarm assay and quantitatively by microfluidic techniques, the characteristics of each “brake component” were identified and represented by a mathematical model. Furthermore, we established another mathematical model to predict the characteristics of a “brake component”. Based on this model, an abundant component library can be established to provide adequate component selection for different searching conditions without identifying all components individually. Finally, a systematic design procedure was proposed. Following this procedure, one can design a genetic circuit for bacteria to rapidly search for and locate different concentrations of particular molecules by selecting the most adequate “brake component” from the library. Moreover, following simple procedures, one can also establish an exclusive component library suitable for other cultivated environments, promoter systems, or bacterial strains. PMID:27096615
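The idea of a parameterized "brake component" library can be caricatured with a Hill-function response curve. Everything below (the function form, parameter values, and library entries) is a hypothetical illustration, not the paper's identified model.

```python
# Hypothetical brake-component library sketch: each brake's activity is a
# Hill function of the target molecule concentration, so brakes with
# different half-activation points stop the cells in different bands.
def brake_response(conc, k_half, hill_n):
    """Fraction of maximal brake activity at a given concentration."""
    return conc**hill_n / (k_half**hill_n + conc**hill_n)

# Component "library": name -> (half-activation concentration, Hill exponent).
library = {"brakeA": (1.0, 2), "brakeB": (10.0, 2), "brakeC": (100.0, 2)}

def select_brake(target_conc):
    """Pick the brake whose half-activation point is nearest the target."""
    return min(library, key=lambda name: abs(library[name][0] - target_conc))

print(select_brake(8.0))                        # brakeB
print(round(brake_response(10.0, 10.0, 2), 2))  # 0.5 at the half point
```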

  5. Low-dose computed tomography screening for lung cancer in populations highly exposed to tobacco: A systematic methodological appraisal of published randomised controlled trials.

    PubMed

    Coureau, Gaëlle; Salmi, L Rachid; Etard, Cécile; Sancho-Garnier, Hélène; Sauvaget, Catherine; Mathoulin-Pélissier, Simone

    2016-07-01

    Low-dose computed tomography (LDCT) screening recommendations for lung cancer are contradictory. The French National Authority for Health commissioned experts to carry out a systematic review on the effectiveness, acceptability and safety of lung cancer screening with LDCT in subjects highly exposed to tobacco. We used MEDLINE and Embase databases (2003-2014) and identified 83 publications representing ten randomised control trials. Control arms and methodology varied considerably, precluding a full comparison and calling into question the reproducibility of the findings. From five trials reporting mortality results, only the National Lung Screening Trial found a significant decrease of disease-specific and all-cause mortality with LDCT screening compared to chest X-ray screening. None of the studies provided all information needed to document the risk-benefit balance. The lack of statistical power and the methodological heterogeneity of the European trials call into question the possibility of obtaining valid results separately or by pooling. We conclude, given the lack of strong scientific evidence, that LDCT screening should not be recommended in subjects highly exposed to tobacco. PMID:27211572

  6. Transcranial direct current stimulation (tDCS) in behavioral and food addiction: a systematic review of efficacy, technical, and methodological issues

    PubMed Central

    Sauvaget, Anne; Trojak, Benoît; Bulteau, Samuel; Jiménez-Murcia, Susana; Fernández-Aranda, Fernando; Wolz, Ines; Menchón, José M.; Achab, Sophia; Vanelle, Jean-Marie; Grall-Bronnec, Marie

    2015-01-01

    Objectives: Behavioral addictions (BA) are complex disorders for which pharmacological and psychotherapeutic treatments have shown their limits. Non-invasive brain stimulation, including transcranial direct current stimulation (tDCS), has opened up new perspectives in addiction treatment. The purpose of this work is to conduct a critical and systematic review of tDCS efficacy, and of technical and methodological considerations, in the field of BA. Methods: A bibliographic search was conducted on the Medline and ScienceDirect databases until December 2014, based on the following selection criteria: clinical studies on tDCS and BA (namely eating disorders, compulsive buying, Internet addiction, pathological gambling, sexual addiction, sports addiction, video games addiction). Study selection, data analysis, and reporting were conducted according to the PRISMA guidelines. Results: Out of 402 potential articles, seven studies were selected. These studies, which so far focus essentially on abnormal eating, suggest that tDCS (right prefrontal anode/left prefrontal cathode) reduces food craving induced by visual stimuli. Conclusions: Despite methodological and technical differences between studies, the results are promising. So far, only a few studies of tDCS in BA have been conducted. New research is recommended on the use of tDCS in BA other than eating disorders. PMID:26500478

  7. A game-based decision support methodology for competitive systems design

    NASA Astrophysics Data System (ADS)

    Briceno, Simon Ignacio

    This dissertation describes the development of a game-based methodology that facilitates the exploration and selection of research and development (R&D) projects under uncertain competitive scenarios. The proposed method provides an approach that analyzes competitor positioning and formulates response strategies to forecast the impact of technical design choices on a project's market performance. A critical decision in the conceptual design phase of propulsion systems is the selection of the best architecture, centerline, core size, and technology portfolio. This selection can be challenging when considering evolving requirements from both the airframe manufacturing company and the airlines in the market. Furthermore, the exceedingly high cost of core architecture development and its associated risk makes this strategic architecture decision the most important one for an engine company. Traditional conceptual design processes emphasize performance and affordability as their main objectives. These areas alone however, do not provide decision-makers with enough information as to how successful their engine will be in a competitive market. A key objective of this research is to examine how firm characteristics such as their relative differences in completing R&D projects, differences in the degree of substitutability between different project types, and first/second-mover advantages affect their product development strategies. Several quantitative methods are investigated that analyze business and engineering strategies concurrently. In particular, formulations based on the well-established mathematical field of game theory are introduced to obtain insights into the project selection problem. The use of game theory is explored in this research as a method to assist the selection process of R&D projects in the presence of imperfect market information. 
The proposed methodology focuses on two influential factors: the schedule uncertainty of project completion times and
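The game-theoretic view of R&D project selection can be illustrated with a toy two-firm game: each firm picks one of two projects and we enumerate the pure-strategy Nash equilibria of a 2x2 payoff matrix. The strategies and payoffs are hypothetical, not taken from the dissertation.

```python
# Toy 2x2 R&D project-selection game: enumerate pure-strategy Nash equilibria.
import itertools

# payoff[(a, b)] = (firm1 payoff, firm2 payoff) for firm1 playing a, firm2 b.
payoff = {
    ("core", "core"): (1, 1),        # both develop the same core architecture
    ("core", "derivative"): (4, 2),  # hypothetical first-mover advantage
    ("derivative", "core"): (2, 4),
    ("derivative", "derivative"): (3, 3),
}

def nash_equilibria():
    """A profile is an equilibrium iff neither firm gains by deviating alone."""
    strategies = ("core", "derivative")
    eq = []
    for a, b in itertools.product(strategies, repeat=2):
        best1 = all(payoff[(a, b)][0] >= payoff[(x, b)][0] for x in strategies)
        best2 = all(payoff[(a, b)][1] >= payoff[(a, y)][1] for y in strategies)
        if best1 and best2:
            eq.append((a, b))
    return eq

print(nash_equilibria())
```

With these payoffs the equilibria are the two asymmetric profiles, which mirrors the dissertation's interest in how first/second-mover advantages steer firms toward differentiated project portfolios.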

  8. A Software Designed For STP Data Plot and Analysis Based on Object-oriented Methodology

    NASA Astrophysics Data System (ADS)

    Lina, L.; Murata, K.

    2006-12-01

    simply follows the present system as long as the language is object-oriented. Researchers who want to add their own data to the STARS simply add a data class in the domain object model, because any satellite data set has properties such as time or date that are inherited from the upper class. In this way, their effort is less than with older methodologies. In the OMT, the description format of the system is rather strictly standardized. When new developers take part in the STARS project, they only have to understand each model to obtain an overview of the STARS; they then follow these designs and documents to implement the system. The OMT thus makes it easy for a newcomer to join a project that is already running.
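The inheritance mechanism described above can be sketched as follows. The class names and fields are illustrative, not the actual STARS design.

```python
# Sketch of the domain-object-model idea: data classes inherit shared
# properties (time/date) from an upper class, so adding a new data set
# means adding one subclass.
import datetime

class SatelliteData:
    """Upper class: properties shared by every satellite data set."""
    def __init__(self, timestamp):
        self.timestamp = timestamp

    def summary(self):
        return f"{type(self).__name__} at {self.timestamp.isoformat()}"

class MagnetometerData(SatelliteData):
    """A researcher's new data set: only the payload is new."""
    def __init__(self, timestamp, b_field):
        super().__init__(timestamp)
        self.b_field = b_field  # field strength in nT (illustrative)

# Generic code written against SatelliteData works on the new class unchanged.
d = MagnetometerData(datetime.datetime(2006, 12, 1), b_field=31000.0)
print(d.summary())
```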

  9. Experimental validation of an integrated controls-structures design methodology for a class of flexible space structures

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Elliott, Kenny B.; Joshi, Suresh M.; Walz, Joseph E.

    1994-01-01

    This paper describes the first experimental validation of an optimization-based integrated controls-structures design methodology for a class of flexible space structures. The Controls-Structures-Interaction (CSI) Evolutionary Model, a laboratory test bed at Langley, is redesigned based on the integrated design methodology with two different dissipative control strategies. The redesigned structure is fabricated, assembled in the laboratory, and experimentally compared with the original test structure. Design guides are proposed and used in the integrated design process to ensure that the resulting structure can be fabricated. Experimental results indicate that the integrated design requires greater than 60 percent less average control power (by thruster actuators) than the conventional control-optimized design while maintaining the required line-of-sight performance, thereby confirming the analytical findings about the superiority of the integrated design methodology. Amenability of the integrated design structure to other control strategies is considered and evaluated analytically and experimentally. This work also demonstrates the capabilities of the Langley-developed design tool CSI DESIGN which provides a unified environment for structural and control design.

  10. Applications of a damage tolerance analysis methodology in aircraft design and production

    NASA Technical Reports Server (NTRS)

    Woodward, M. R.; Owens, S. D.; Law, G. E.; Mignery, L. A.

    1992-01-01

    Objectives of customer mandated aircraft structural integrity initiatives in design are to guide material selection, to incorporate fracture resistant concepts in the design, and to utilize damage tolerance based allowables and planned inspection procedures necessary to enhance the safety and reliability of manned flight vehicles. However, validated fracture analysis tools for composite structures are needed to accomplish these objectives in a timely and economical manner. This paper briefly describes the development, validation, and application of a damage tolerance methodology for composite airframe structures. A closed-form analysis code, entitled SUBLAM, was developed to predict the critical biaxial strain state necessary to cause sublaminate buckling-induced delamination extension in an impact damaged composite laminate. An embedded elliptical delamination separating a thin sublaminate from a thick parent laminate is modelled. Predicted failure strains were correlated against a variety of experimental data that included results from compression after impact coupon and element tests. An integrated analysis package was developed to predict damage tolerance based margin-of-safety (MS) using NASTRAN generated loads and element information. Damage tolerance aspects of new concepts are quickly and cost-effectively determined without the need for excessive testing.

  11. Demonstration of a software design and statistical analysis methodology with application to patient outcomes data sets

    PubMed Central

    Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard

    2013-01-01

    Purpose: With the emergence of clinical outcomes databases as tools utilized routinely within institutions comes the need for software tools to support automated statistical analysis of these large data sets and interinstitutional exchange among independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that addresses both issues. Methods: A software application was constructed to automate analysis of patient outcomes data using a wide range of statistical metrics, by combining use of C#.Net and R code. The accuracy and speed of the code were evaluated using benchmark data sets. Results: The approach provides data needed to evaluate combinations of statistical measurements for ability to identify patterns of interest in the data. Through application of the tools to a benchmark data set for dose-response threshold and to SBRT lung data sets, an algorithm was developed that uses receiver operator characteristic curves to identify a threshold value and combines use of contingency tables, Fisher exact tests, Welch t-tests, and Kolmogorov-Smirnov tests to filter the large data set to identify values demonstrating dose-response. Kullback-Leibler divergences were used to provide additional confirmation. Conclusions: The work demonstrates the viability of the design approach and the software tool for analysis of large data sets. PMID:24320426
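One step of the algorithm described above, picking a threshold from a receiver operating characteristic curve, can be sketched with the Youden index (sensitivity + specificity - 1) on synthetic data. The paper's actual pipeline combines several further statistical tests; this only illustrates the threshold-selection step.

```python
# ROC-based threshold selection via the Youden index, on synthetic data.
def youden_threshold(doses, outcomes):
    """Return the candidate threshold maximizing sensitivity + specificity - 1."""
    best_t, best_j = None, -1.0
    for t in sorted(set(doses)):
        tp = sum(1 for d, y in zip(doses, outcomes) if d >= t and y)
        fn = sum(1 for d, y in zip(doses, outcomes) if d < t and y)
        tn = sum(1 for d, y in zip(doses, outcomes) if d < t and not y)
        fp = sum(1 for d, y in zip(doses, outcomes) if d >= t and not y)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t

doses = [10, 12, 15, 20, 22, 30, 35, 40]      # synthetic dose values
outcomes = [0, 0, 0, 0, 1, 1, 1, 1]           # 1 = complication observed
print(youden_threshold(doses, outcomes))
```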

  12. The Component Packaging Problem: A Vehicle for the Development of Multidisciplinary Design and Analysis Methodologies

    NASA Technical Reports Server (NTRS)

    Fadel, Georges; Bridgewood, Michael; Figliola, Richard; Greenstein, Joel; Kostreva, Michael; Nowaczyk, Ronald; Stevenson, Steve

    1999-01-01

    This report summarizes academic research which has resulted in an increased appreciation for multidisciplinary efforts among our students, colleagues and administrators. It has also generated a number of research ideas that emerged from the interaction between disciplines. Overall, 17 undergraduate students and 16 graduate students benefited directly from the NASA grant: an additional 11 graduate students were impacted and participated without financial support from NASA. The work resulted in 16 theses (with 7 to be completed in the near future), 67 papers or reports mostly published in 8 journals and/or presented at various conferences (a total of 83 papers, presentations and reports published based on NASA inspired or supported work). In addition, the faculty and students presented related work at many meetings, and continuing work has been proposed to NSF, the Army, Industry and other state and federal institutions to continue efforts in the direction of multidisciplinary and recently multi-objective design and analysis. The specific problem addressed is component packing which was solved as a multi-objective problem using iterative genetic algorithms and decomposition. Further testing and refinement of the methodology developed is presently under investigation. Teaming issues research and classes resulted in the publication of a web site, (http://design.eng.clemson.edu/psych4991) which provides pointers and techniques to interested parties. Specific advantages of using iterative genetic algorithms, hurdles faced and resolved, and institutional difficulties associated with multi-discipline teaming are described in some detail.
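The iterative-genetic-algorithm approach to component packing can be caricatured in one dimension: evolve the order in which components are placed so that strongly connected components end up close together. The objective, component sizes, and connection weights below are hypothetical stand-ins for the multi-objective packing problem described above.

```python
# Toy 1-D packing GA: permutation encoding, truncation selection,
# swap mutation. All data are illustrative assumptions.
import random

sizes = [4, 2, 3, 1]                 # component lengths
connect = {(0, 2): 5, (1, 3): 3}     # connection weights between components

def cost(order):
    """Weighted sum of centre-to-centre distances of connected components."""
    pos, x = {}, 0.0
    for c in order:
        pos[c] = x + sizes[c] / 2.0
        x += sizes[c]
    return sum(w * abs(pos[a] - pos[b]) for (a, b), w in connect.items())

def evolve(generations=200, pop_size=20, seed=1):
    rng = random.Random(seed)
    pop = [rng.sample(range(len(sizes)), len(sizes)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        parents = pop[: pop_size // 2]          # keep the best half (elitist)
        children = []
        for p in parents:                       # mutate: swap two positions
            child = p[:]
            i, j = rng.sample(range(len(child)), 2)
            child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = parents + children
    return min(pop, key=cost)

best = evolve()
print(best, cost(best))
```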

  13. What Evidence Underlies Clinical Practice in Paediatric Surgery? A Systematic Review Assessing Choice of Study Design

    PubMed Central

    Allin, Benjamin; Knight, Marian

    2016-01-01

    Objective Identify every paediatric surgical article published in 1998 and every paediatric surgical article published in 2013, and determine which study designs were used and whether they were appropriate for robustly assessing interventions in surgical conditions. Methods A systematic review was conducted according to a pre-specified protocol (CRD42014007629), using EMBASE and Medline. Non-English language studies were excluded. Studies were included if meeting population criteria and either condition or intervention criteria. Population: Children under the age of 18, or adults who underwent intervention for a condition managed by paediatric surgeons when they were under 18 years of age. Condition: One managed by general paediatric surgeons. Intervention: Used for treatment of a condition managed by general paediatric surgeons. Main Outcome Measure Studies were classified according to whether the IDEAL collaboration recommended their design for assessing surgical interventions or not. Change in proportions between 1998 and 2013 was calculated. Results 1581 paediatric surgical articles were published in 1998, and 3453 in 2013. The most commonly used design, accounting for 45% of studies in 1998 and 46.8% in 2013, was the retrospective case series. Only 1.8% of studies were RCTs in 1998, and 1.9% in 2013. Overall, in 1998, 9.8% of studies used a recommended design. In 2013, 11.9% used a recommended design (proportion increase 2.3%, 95% confidence interval 0.5% increase to 4% increase, p = 0.017). Conclusions and Relevance A low proportion of published paediatric surgical manuscripts utilise a design that is recommended for assessing surgical interventions. RCTs represent fewer than 1 in 50 studies. In 2013, 88.1% of studies used a less robust design, suggesting the need for a new way of approaching paediatric surgical research. PMID:26959824

  14. A methodology for system-of-systems design in support of the engineering team

    NASA Astrophysics Data System (ADS)

    Ridolfi, G.; Mooij, E.; Cardile, D.; Corpino, S.; Ferrari, G.

    2012-04-01

    Space missions have experienced a trend of increasing complexity in the last decades, resulting in the design of very complex systems formed by many elements and sub-elements working together to meet the requirements. In a classical approach, especially in a company environment, the two steps of design-space exploration and optimization are usually performed by experts inferring on major phenomena, making assumptions and doing some trial-and-error runs on the available mathematical models. This is done especially in the very early design phases where most of the costs are locked-in. With the objective of supporting the engineering team and the decision-makers during the design of complex systems, the authors developed a modelling framework for a particular category of complex, coupled space systems called System-of-Systems. Once modelled, the System-of-Systems is solved using a computationally cheap parametric methodology, named the mixed-hypercube approach, based on the utilization of a particular type of fractional factorial design-of-experiments, and analysis of the results via global sensitivity analysis and response surfaces. As an applicative example, a system-of-systems of a hypothetical human space exploration scenario for the support of a manned lunar base is presented. The results demonstrate that using the mixed-hypercube to sample the design space, an optimal solution is reached with a limited computational effort, providing support to the engineering team and decision makers thanks to sensitivity and robustness information. The analysis of the system-of-systems model that was implemented shows that the logistic support of a human outpost on the Moon for 15 years is still feasible with currently available launcher classes. The results presented in this paper have been obtained in cooperation with Thales Alenia Space—Italy, in the framework of a regional programme called STEPS. STEPS—Sistemi e Tecnologie per l'EsPlorazione Spaziale is a research

  15. My Interventional Drug-Eluting Stent Educational App (MyIDEA): Patient-Centered Design Methodology

    PubMed Central

    Shroff, Adhir; Groo, Vicki; Dickens, Carolyn; Field, Jerry; Baumann, Matthew; Welland, Betty; Gutowski, Gerry; Flores Jr, Jose D; Zhao, Zhongsheng; Bahroos, Neil; Hynes, Denise M; Wilkie, Diana J

    2015-01-01

    Background Patient adherence to medication regimens is critical in most chronic disease treatment plans. This study uses a patient-centered tablet app, “My Interventional Drug-Eluting Stent Educational App (MyIDEA).” This is an educational program designed to improve patient medication adherence. Objective Our goal is to describe the design, methodology, limitations, and results of the MyIDEA tablet app. We created a mobile technology-based patient education app to improve dual antiplatelet therapy adherence in patients who underwent a percutaneous coronary intervention and received a drug-eluting stent. Methods Patient advisers were involved in the development process of MyIDEA from the initial wireframe to the final launch of the product. The program was restructured and redesigned based on the patient advisers’ suggestions as well as those from multidisciplinary team members. To accommodate those with low health literacy, we modified the language and employed attractive color schemes to improve ease of use. We assumed that the target patient population may have little to no experience with electronic tablets, and therefore, we designed the interface to be as intuitive as possible. Results The MyIDEA app has been successfully deployed to a low-health-literate elderly patient population in the hospital setting. A total of 6 patients have interacted with MyIDEA for an average of 17.6 minutes/session. Conclusions Including patient advisers in the early phases of a mobile patient education development process is critical. A number of changes in text order, language, and color schemes occurred to improve ease of use. The MyIDEA program has been successfully deployed to a low-health-literate elderly patient population. Leveraging patient advisers throughout the development process helps to ensure implementation success. PMID:26139587

  16. Application of new methodologies based on design of experiments, independent component analysis and design space for robust optimization in liquid chromatography.

    PubMed

    Debrus, Benjamin; Lebrun, Pierre; Ceccato, Attilio; Caliaro, Gabriel; Rozet, Eric; Nistor, Iolanda; Oprean, Radu; Rupérez, Francisco J; Barbas, Coral; Boulanger, Bruno; Hubert, Philippe

    2011-04-01

    HPLC separations of an unknown sample mixture and a pharmaceutical formulation have been optimized using a recently developed chemometric methodology proposed by W. Dewé et al. in 2004 and improved by P. Lebrun et al. in 2008. This methodology is based on experimental designs which are used to model retention times of compounds of interest. Then, the prediction accuracy and the optimal separation robustness, including an uncertainty study, were evaluated. Finally, the design space (ICH Q8(R1) guideline) was computed as the probability for a criterion to lie in a selected range of acceptance. Furthermore, the chromatograms were read automatically: peak detection and peak matching were carried out with a previously developed methodology using independent component analysis, published by B. Debrus et al. in 2009. These successful applications underscore the high potential of these methodologies for the automated development of chromatographic methods. PMID:21458628

  17. A systematic approach for introducing innovative product design in courses with engineering and nonengineering students.

    PubMed

    Patterson, P E

    2007-01-01

    In our new global economy, biomedical product development teams need to be even more innovative in an environment constrained by fewer resources with less time from concept to market. Teams are often comprised of individuals spread around the world. To simulate this setting, we revised an existing course to incorporate teams of on-campus and distance students, with each team including both engineers and other specialties. Through interactive lectures and projects, we presented a systematic approach to innovation that should be useful to engineers and non-engineers alike. Students found the course challenging and exciting, displaying an improved ability to work in distributed teams and in developing innovative design solutions. PMID:17487063

  18. Systematic approach for PID controller design for pitch-regulated, variable-speed wind turbines

    SciTech Connect

    Hand, M.M.; Balas, M.J.

    1997-11-01

    Variable-speed, horizontal axis wind turbines use blade-pitch control to meet specified objectives for three regions of operation. This paper focuses on controller design for the constant power production regime. A simple, rigid, non-linear turbine model was used to systematically perform trade-off studies between two performance metrics. Minimization of both the deviation of the rotor speed from the desired speed and the motion of the actuator is desired. The robust nature of the proportional-integral-derivative (PID) controller is illustrated, and optimal operating conditions are determined. Because numerous simulation runs may be completed in a short time, the relationship of the two opposing metrics is easily visualized. 2 refs., 9 figs.
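The trade-off study described above, accumulating both rotor-speed deviation and actuator motion over many quick simulation runs, can be sketched with a toy first-order rotor model. The model constant and the two gain sets are assumptions for illustration, not the paper's turbine model or tuned gains.

```python
# Toy PID trade-off study: one metric for speed deviation, one for
# actuator (pitch) motion, accumulated over a short simulation run.
def simulate(kp, ki, kd, steps=500, dt=0.01):
    speed, target = 0.0, 1.0                   # normalized rotor speed, setpoint
    integral, prev_err, prev_pitch = 0.0, target, 0.0
    err_metric = act_metric = 0.0
    for _ in range(steps):
        err = target - speed
        integral += err * dt
        pitch = kp * err + ki * integral + kd * (err - prev_err) / dt
        speed += dt * (-0.5 * speed + pitch)   # assumed rigid first-order rotor
        err_metric += abs(err) * dt            # deviation from desired speed
        act_metric += abs(pitch - prev_pitch)  # actuator motion
        prev_err, prev_pitch = err, pitch
    return err_metric, act_metric

loose = simulate(1.0, 0.5, 0.0)
tight = simulate(8.0, 2.0, 0.1)
print(loose, tight)   # tighter gains: smaller speed error, more actuator motion
```

Sweeping the gains and plotting the two metrics against each other reproduces the kind of trade-off curve the paper uses to pick operating conditions.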

  19. A methodology for formulating a minimal uncertainty model for robust control system design and analysis

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Chang, B.-C.; Fischl, Robert

    1989-01-01

    In the design and analysis of robust control systems for uncertain plants, the technique of formulating what is termed an M-delta model has become widely accepted and applied in the robust control literature. The M represents the transfer function matrix M(s) of the nominal system, and delta represents an uncertainty matrix acting on M(s). The uncertainty can arise from various sources, such as structured uncertainty from parameter variations or multiple unstructured uncertainties from unmodeled dynamics and other neglected phenomena. In general, delta is a block diagonal matrix, and for real parameter variations the diagonal elements are real. As stated in the literature, this structure can always be formed for any linear interconnection of inputs, outputs, transfer functions, parameter variations, and perturbations. However, very little of the literature addresses methods for obtaining this structure, and none of it addresses a general methodology for obtaining a minimal M-delta model for a wide class of uncertainty. Since having a delta matrix of minimum order would improve the efficiency of structured singular value (or multivariable stability margin) computations, a method of obtaining a minimal M-delta model would be useful. A generalized method of obtaining a minimal M-delta structure for systems with real parameter variations is given.
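For reference, the standard textbook M-delta interconnection (not the paper's specific construction) has the form:

```latex
% Standard M-Delta feedback interconnection: the block-diagonal
% uncertainty Delta is wrapped around the nominal system M(s).
\[
\begin{bmatrix} z \\ y \end{bmatrix}
=
\begin{bmatrix} M_{11}(s) & M_{12}(s) \\ M_{21}(s) & M_{22}(s) \end{bmatrix}
\begin{bmatrix} w \\ u \end{bmatrix},
\qquad
w = \Delta z,
\qquad
\Delta = \operatorname{diag}\!\bigl(\delta_1 I_{r_1}, \ldots, \delta_p I_{r_p}\bigr),
\quad \delta_i \in \mathbb{R}.
\]
```

The order of Delta is the sum of the repetition counts r_i, which is exactly what a minimal M-delta model minimizes.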

  20. Design, Development and Optimization of S (-) Atenolol Floating Sustained Release Matrix Tablets Using Surface Response Methodology

    PubMed Central

    Gunjal, P. T.; Shinde, M. B.; Gharge, V. S.; Pimple, S. V.; Gurjar, M. K.; Shah, M. N.

    2015-01-01

    The objective of this present investigation was to develop and formulate floating sustained release matrix tablets of s (-) atenolol, by using different polymer combinations and filler, to optimize by using surface response methodology for different drug release variables and to evaluate the drug release pattern of the optimized product. Floating sustained release matrix tablets of various combinations were prepared with cellulose-based polymers: Hydroxypropyl methylcellulose, sodium bicarbonate as a gas generating agent, polyvinyl pyrrolidone as a binder and lactose monohydrate as filler. The 3² full factorial design was employed to investigate the effect of formulation variables on different properties of tablets applicable to floating lag time, buoyancy time, % drug release in 1 and 6 h (D1 h, D6 h) and time required for 90% drug release (t90%). Significance of results was analyzed using analysis of variance, and P < 0.05 was considered statistically significant. S (-) atenolol floating sustained release matrix tablets followed Higuchi drug release kinetics, which indicates that drug release follows an anomalous (non-Fickian) diffusion mechanism. The developed floating sustained release matrix tablet of improved efficacy can perform therapeutically better than a conventional tablet. PMID:26798171
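The Higuchi kinetics cited above state that cumulative release grows with the square root of time, Q(t) = k_H·√t. A minimal sketch with an assumed rate constant (not fitted to the paper's data):

```python
# Higuchi square-root-of-time release model; k_H is an assumed constant.
from math import sqrt

def higuchi_q(t_hours, k_h):
    """Cumulative % released after t hours under the Higuchi model."""
    return k_h * sqrt(t_hours)

def time_for_release(q_percent, k_h):
    """Invert the model: hours needed to release q_percent of the dose."""
    return (q_percent / k_h) ** 2

k_h = 30.0                                  # assumed %/sqrt(h)
print(round(higuchi_q(6, k_h), 1))          # release at 6 h under this model
print(round(time_for_release(90, k_h), 1))  # t90% under this model
```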

  1. Area overhead analysis of SEF: A design methodology for tolerating SEU

    SciTech Connect

    Blaquiere, Y.; Savaria, Y.

    1987-12-01

    Soft-error filtering (SEF) is a design methodology proposed recently for implementing machines tolerant to SEU. This paper deals mainly with the evaluation and reduction of the area overhead introduced by SEF. A new shift register filtering latch configuration is proposed. The use of this latch, optimized for minimum area, reduces the area overhead by a factor of 2.6 when compared with latches optimized for time performance. A detailed analysis of the area overhead with SEF implemented on two relatively complex machines produced the following results: a SEF version of the 6800 microprocessor would require an area overhead varying between 12% and 69% depending on the SEF latch used, and a SEF version of the RISCII microprocessor would result in a 38.8% area overhead. An analysis of the cost of implementing the Hamming error correcting code on a register array is presented, and this cost is compared with that of implementing SEU tolerance directly with SEF. Finally, a hybrid approach is proposed where a large register array is protected by an error correcting code whereas the isolated latches are replaced by filtering latches. This hybrid approach reduces the area overhead to 18.8% for the RISCII architecture.
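The error-correcting-code cost compared in the paper can be reproduced at back-of-the-envelope level: a single-error-correcting Hamming code over n data bits needs the smallest k with 2^k ≥ n + k + 1 check bits. The register widths below are illustrative, not the paper's register array.

```python
# Hamming check-bit count for single-error correction: smallest k
# with 2**k >= n + k + 1, and the resulting per-register bit overhead.
def hamming_check_bits(data_bits):
    k = 0
    while 2**k < data_bits + k + 1:
        k += 1
    return k

for n in (8, 16, 32):
    k = hamming_check_bits(n)
    print(n, k, f"{100 * k / n:.1f}% bit overhead")
```

The relative cost of the code falls as registers get wider, which is why the hybrid approach in the paper reserves the code for the large register array and uses filtering latches elsewhere.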

  2. Application-specific coarse-grained reconfigurable array: architecture and design methodology

    NASA Astrophysics Data System (ADS)

    Zhou, Li; Liu, Dongpei; Zhang, Jianfeng; Liu, Hengzhu

    2015-06-01

    Coarse-grained reconfigurable arrays (CGRAs) have shown potential for application in embedded systems in recent years. Numerous reconfigurable processing elements (PEs) in CGRAs provide flexibility while maintaining high performance by exploiting different levels of parallelism. However, a gap remains between the CGRA and the application-specific integrated circuit (ASIC), and some application domains, such as software-defined radios (SDRs), require flexibility even as performance demands increase. More effective CGRA architectures are therefore expected to be developed. Customisation of a CGRA according to its application can improve performance and efficiency. This study proposes an application-specific CGRA architecture template composed of generic PEs (GPEs) and special PEs (SPEs). The hardware of the SPE can be customised to accelerate specific computational patterns. An automatic design methodology that includes pattern identification and application-specific function unit generation is also presented. A mapping algorithm based on ant colony optimisation is provided. Experimental results on the SDR target domain show that, compared with other ordinary and application-specific reconfigurable architectures, the CGRA generated by the proposed method performs more efficiently for given applications.

  3. Design, Development and Optimization of S (-) Atenolol Floating Sustained Release Matrix Tablets Using Surface Response Methodology.

    PubMed

    Gunjal, P T; Shinde, M B; Gharge, V S; Pimple, S V; Gurjar, M K; Shah, M N

    2015-01-01

    The objective of this present investigation was to develop and formulate floating sustained release matrix tablets of s (-) atenolol, by using different polymer combinations and filler, to optimize by using surface response methodology for different drug release variables and to evaluate the drug release pattern of the optimized product. Floating sustained release matrix tablets of various combinations were prepared with cellulose-based polymers: Hydroxypropyl methylcellulose, sodium bicarbonate as a gas generating agent, polyvinyl pyrrolidone as a binder and lactose monohydrate as filler. The 3² full factorial design was employed to investigate the effect of formulation variables on different properties of tablets applicable to floating lag time, buoyancy time, % drug release in 1 and 6 h (D1 h, D6 h) and time required for 90% drug release (t90%). Significance of results was analyzed using analysis of variance, and P < 0.05 was considered statistically significant. S (-) atenolol floating sustained release matrix tablets followed Higuchi drug release kinetics, which indicates that drug release follows an anomalous (non-Fickian) diffusion mechanism. The developed floating sustained release matrix tablet of improved efficacy can perform therapeutically better than a conventional tablet. PMID:26798171

  4. Online Intelligent Controllers for an Enzyme Recovery Plant: Design Methodology and Performance

    PubMed Central

    Leite, M. S.; Fujiki, T. L.; Silva, F. V.; Fileti, A. M. F.

    2010-01-01

    This paper focuses on the development of intelligent controllers for use in a process of enzyme recovery from pineapple rind. The proteolytic enzyme bromelain (EC 3.4.22.4) is precipitated with alcohol at low temperature in a fed-batch jacketed tank. Temperature control is crucial to avoid irreversible protein denaturation. Fuzzy or neural controllers offer a way of implementing solutions that cover dynamic and nonlinear processes. The design methodology and a comparative study on the performance of fuzzy-PI, neurofuzzy, and neural network intelligent controllers are presented. To tune the fuzzy PI Mamdani controller, various universes of discourse, rule bases, and membership function support sets were tested. A neurofuzzy inference system (ANFIS), based on Takagi-Sugeno rules, and a model predictive controller, based on neural modeling, were developed and tested as well. Using a Fieldbus network architecture, a coolant variable speed pump was driven by the controllers. The experimental results show the effectiveness of fuzzy controllers in comparison to the neural predictive control. The fuzzy PI controller exhibited a reduced error parameter (ITAE), lower power consumption, and better recovery of enzyme activity. PMID:21234106
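    The ITAE figure used to compare the controllers is the time-weighted integral of absolute error, which penalizes errors that persist late in the run. A minimal sketch, assuming a sampled time/error series and trapezoidal integration (not the authors' code):

    ```python
    def itae(times, errors):
        """Integral of Time-weighted Absolute Error via the trapezoidal rule.

        ITAE = integral of t * |e(t)| dt. Lower values indicate a controller
        that drives the error to zero quickly and keeps it there.
        """
        total = 0.0
        for i in range(1, len(times)):
            f0 = times[i - 1] * abs(errors[i - 1])
            f1 = times[i] * abs(errors[i])
            total += 0.5 * (f0 + f1) * (times[i] - times[i - 1])
        return total
    ```

    As a sanity check, a constant error of 1 over t in [0, 2] gives the integral of t, i.e. 2.0.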

  5. Methodological trends in the design of recent microenvironmental studies of personal CO exposure

    NASA Astrophysics Data System (ADS)

    Flachsbart, Peter G.

    This paper describes the designs of three recent microenvironmental studies of personal exposure to carbon monoxide (CO) from motor vehicle exhaust. These studies were conducted sequentially, first in four California cities (Los Angeles, Mountain View, Palo Alto, and San Francisco), then in Honolulu, and, most recently, in metropolitan Washington, D.C. Though study purposes differed, each study faced common methodological issues related to personal exposure monitors (PEMs), quality assurance and data collection procedures, and the selection of microenvironments for study. Two major objectives of the California cities study were to determine the CO concentrations typically found in commercial settings and to define and classify microenvironments applicable to such settings. The Honolulu study measured merchant exposure to CO in shopping centers attached to semienclosed parking garages during business hours and commuter exposure to CO in vehicles (passenger cars and buses) on congested roadways during peak periods. The intent of the Washington study was to develop a model of commuter exposure to motor vehicle exhaust using CO as an indicator pollutant. Certain trends are discernible from reviewing the three studies. There are clearly trends in PEM development that have expanded instrument capabilities and automated data collection and storage. There are also trends towards more rigorous quality assurance procedures and more standardized protocols for collecting exposure data. Further, one can see a trend towards more elaborate indicators for identifying microenvironments for study. Finally, there is a trend towards using personal monitors in public policy review and evaluation.

  6. Optimization of Chitinase Production by Bacillus pumilus Using Plackett-Burman Design and Response Surface Methodology

    PubMed Central

    Tasharrofi, Noshin; Adrangi, Sina; Fazeli, Mehdi; Rastegar, Hossein; Khoshayand, Mohammad Reza; Faramarzi, Mohammad Ali

    2011-01-01

    A soil bacterium capable of degrading chitin on chitin agar plates was isolated and identified as Bacillus pumilus isolate U5 on the basis of 16S rDNA sequence analysis. In order to optimize culture conditions for chitinase production by this bacterium, a two-step approach was employed. First, the effects of several medium components were studied using the Plackett-Burman design. Among the various components tested, chitin and yeast extract showed a positive effect on enzyme production, while MgSO4 and FeSO4 had a negative effect. However, the linear model proved to be insufficient for determining the optimum levels of these components due to a highly significant curvature effect. In the second step, Box-Behnken response surface methodology was used to determine the optimum values. A quadratic polynomial equation fitted the experimental data appropriately. The optimum concentrations for chitin, yeast extract, MgSO4 and FeSO4 were found to be 4.76, 0.439, 0.0055 and 0.019 g/L, respectively, with a predicted chitinase production of 97.67 U/100 mL. Using this statistically optimized medium, the actual chitinase production reached 96.1 U/100 mL. PMID:24250411
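    Fitting the quadratic polynomial model behind a Box-Behnken analysis reduces to ordinary least squares on an expanded design matrix. A sketch for two coded factors (the study fitted four components; the data in the check below are synthetic, not the study's measurements):

    ```python
    import numpy as np

    def fit_quadratic(x1, x2, y):
        """Least-squares fit of the second-order RSM model
        y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2,
        returning the coefficient vector [b0, b1, b2, b12, b11, b22]."""
        X = np.column_stack([np.ones_like(x1), x1, x2,
                             x1 * x2, x1 ** 2, x2 ** 2])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        return coef
    ```

    With noise-free responses generated from known coefficients on a 3x3 grid of coded levels, the fit recovers those coefficients exactly; with real data, the curvature terms (b11, b22) are what the linear Plackett-Burman model cannot capture.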

  7. Conceptual design and analysis methodology for knowledge acquisition for expert systems

    SciTech Connect

    Adiga, S.

    1986-01-01

    The field of Artificial Intelligence, particularly expert systems, has been identified by experts as the technology with the most promise for handling the complex information processing needs of modern manufacturing systems. Knowledge acquisition, the process of building the knowledge base for expert systems, needs precise and well-formulated methods to pass from being an art to a theory. This research is a step in that direction. The approach evolves at the conceptual level from Pask's work on conversation theory, which provides the minimal structural requirements for development and validation of the method. An integrated approach is developed with guidelines for structured knowledge elicitation, analysis, and mapping of the verbal data into well-defined, object-oriented, generic knowledge structures capable of representing both structural and operational knowledge. The research extends and blends the concepts of protocol analysis, object-oriented design, and semantic data modeling into an integrated framework. Because the methodology is domain-independent, it can in principle be used to acquire knowledge for any expert performance system.

  8. Development of designer chicken shred with response surface methodology and evaluation of its quality characteristics.

    PubMed

    Reddy, K Jalarama; Jayathilakan, K; Pandey, M C

    2016-01-01

    Meat is considered to be an excellent source of protein, essential minerals, trace elements and vitamins, but concerns regarding meat consumption and its impact on human health have prompted research into the development of novel functional meat products. In the present study, rice bran oil (RBO) and flaxseed oil (FSO) were used to attain an ideal lipid profile in the product. The experiment was designed to optimise the RBO and FSO concentrations for development of a product with an ideal lipid profile and maximum acceptability by applying the central composite rotatable design of response surface methodology (RSM). Levels of RBO and FSO were taken as independent variables, and overall acceptability (OAA), n-6 and n-3 fatty acids as responses. A quadratic fit model was found to be suitable for optimising the product. The sample with RBO (20.51 ml) and FSO (2.57 ml) yielded an OAA score of 8.25, 29.54 % n-6 and 7.70 % n-3, giving an n-6/n-3 ratio of 3.8:1. The optimised product was analysed for its physico-chemical, sensory and microbial profile during storage at 4 ± 1 °C for 30 days. An increase in lipid oxidation parameters was observed during storage, but it was not significant (p > 0.05). The study revealed the great potential of developing functional poultry products with improved nutritional quality and good shelf stability by incorporating RBO and FSO. PMID:26787966

  9. Validating a new methodology for optical probe design and image registration in fNIRS studies

    PubMed Central

    Wijeakumar, Sobanawartiny; Spencer, John P.; Bohache, Kevin; Boas, David A.; Magnotta, Vincent A.

    2015-01-01

    Functional near-infrared spectroscopy (fNIRS) is an imaging technique that relies on the principle of shining near-infrared light through tissue to detect changes in hemodynamic activation. An important methodological issue encountered is the creation of optimized probe geometry for fNIRS recordings. Here, across three experiments, we describe and validate a processing pipeline designed to create an optimized, yet scalable probe geometry based on selected regions of interest (ROIs) from the functional magnetic resonance imaging (fMRI) literature. In experiment 1, we created a probe geometry optimized to record changes in activation from target ROIs important for visual working memory. Positions of the sources and detectors of the probe geometry on an adult head were digitized using a motion sensor and projected onto a generic adult atlas and a segmented head obtained from the subject's MRI scan. In experiment 2, the same probe geometry was scaled down to fit a child's head and later digitized and projected onto the generic adult atlas and a segmented volume obtained from the child's MRI scan. Using visualization tools and by quantifying the amount of intersection between target ROIs and channels, we show that out of 21 ROIs, 17 and 19 ROIs intersected with fNIRS channels from the adult and child probe geometries, respectively. Further, both the adult atlas and adult subject-specific MRI approaches yielded similar results and can be used interchangeably. However, results suggest that segmented heads obtained from MRI scans be used for registering children's data. Finally, in experiment 3, we further validated our processing pipeline by creating a different probe geometry designed to record from target ROIs involved in language and motor processing. PMID:25705757

  10. A systematic approach for designing a HBM pilot study for Europe.

    PubMed

    Becker, Kerstin; Seiwert, Margarete; Casteleyn, Ludwine; Joas, Reinhard; Joas, Anke; Biot, Pierre; Aerts, Dominique; Castaño, Argelia; Esteban, Marta; Angerer, Jürgen; Koch, Holger M; Schoeters, Greet; Den Hond, Elly; Sepai, Ovnair; Exley, Karen; Knudsen, Lisbeth E; Horvat, Milena; Bloemen, Louis; Kolossa-Gehring, Marike

    2014-03-01

    The objective of COPHES (Consortium to Perform Human biomonitoring on a European Scale) was to develop a harmonised approach to conduct human biomonitoring on a European scale. COPHES developed a systematic approach for designing and conducting a pilot study for an EU-wide cross-sectional human biomonitoring (HBM) study and for the implementation of the fieldwork procedures. The approach gave the basis for discussion of the main aspects of study design and conduct, and provided a decision making tool which can be applied to many other studies. Each decision that had to be taken was listed in a table of options with their advantages and disadvantages. Based on this the rationale of the decisions could be explained and be transparent. This was important because an EU-wide HBM study demands openness of all decisions taken to encourage as many countries as possible to participate and accept the initiative undertaken. Based on this approach the following study design was suggested: a cross-sectional study including 120 children aged 6-11 years and their mothers aged up to 45 years from each participating country. For the pilot study the children should be sampled in equal shares in an urban and a rural location. Only healthy children and mothers (no metabolic disturbances) should be included, who have a sufficient knowledge of the local language and have been living at least for 5 years at the sampling location. Occupational exposure should not be an exclusion criterion. Recruitment should be performed via inhabitant registries or schools as an alternative option. Measures suitable to increase the response rate should be applied. Preferably, the families should be visited at home and interviewed face-to-face. Various quality control measures to guarantee a good fieldwork performance were recommended. This comprehensive overview aims to provide scientists, EU officials, partners and stakeholders involved in the EU implementation process full transparency of the work

  11. METHODOLOGY FOR DESIGNING AIR QUALITY MONITORING NETWORKS: 2. APPLICATION TO LAS VEGAS, NEVADA, FOR CARBON MONOXIDE

    EPA Science Inventory

    An objective methodology presented in a companion paper (Liu et al., 1986) for determining the optimum number and disposition of ambient air quality stations in a monitoring network for carbon monoxide is applied to the Las Vegas, Nevada, area. The methodology utilizes an air qua...

  12. Methodological, Theoretical, Infrastructural, and Design Issues in Conducting Good Outcome Studies

    ERIC Educational Resources Information Center

    Kelly, Michael P.; Moore, Tessa A.

    2011-01-01

    This article outlines a set of methodological, theoretical, and other issues relating to the conduct of good outcome studies. The article begins by considering the contribution of evidence-based medicine to the methodology of outcome research. The lessons which can be applied in outcome studies in nonmedical settings are described. The article…

  13. Innovative Mixed-Methods Research: Moving beyond Design Technicalities to Epistemological and Methodological Realizations

    ERIC Educational Resources Information Center

    Riazi, A. Mehdi

    2016-01-01

    Mixed-methods research (MMR), as an inter-discourse (quantitative and qualitative) methodology, can provide applied linguistics researchers the opportunity to draw on and integrate the strengths of the two research methodological approaches in favour of making more rigorous inferences about research problems. In this article, the argument is made…

  14. Design, implementation and reporting strategies to reduce the instance and impact of missing patient-reported outcome (PRO) data: a systematic review

    PubMed Central

    Mercieca-Bebber, Rebecca; Palmer, Michael J; Brundage, Michael; Stockler, Martin R; King, Madeleine T

    2016-01-01

    Objectives Patient-reported outcomes (PROs) provide important information about the impact of treatment from the patients' perspective. However, missing PRO data may compromise the interpretability and value of the findings. We aimed to report: (1) a non-technical summary of problems caused by missing PRO data; and (2) a systematic review by collating strategies to: (A) minimise rates of missing PRO data, and (B) facilitate transparent interpretation and reporting of missing PRO data in clinical research. Our systematic review does not address statistical handling of missing PRO data. Data sources MEDLINE and Cumulative Index to Nursing and Allied Health Literature (CINAHL) databases (inception to 31 March 2015), and citing articles and reference lists from relevant sources. Eligibility criteria English articles providing recommendations for reducing missing PRO data rates, or strategies to facilitate transparent interpretation and reporting of missing PRO data were included. Methods 2 reviewers independently screened articles against eligibility criteria. Discrepancies were resolved with the research team. Recommendations were extracted and coded according to framework synthesis. Results 117 sources (55% discussion papers, 26% original research) met the eligibility criteria. Design and methodological strategies for reducing rates of missing PRO data included: incorporating PRO-specific information into the protocol; carefully designing PRO assessment schedules and defining termination rules; minimising patient burden; appointing a PRO coordinator; PRO-specific training for staff; ensuring PRO studies are adequately resourced; and continuous quality assurance. Strategies for transparent interpretation and reporting of missing PRO data include utilising auxiliary data to inform analysis; transparently reporting baseline PRO scores, rates and reasons for missing data; and methods for handling missing PRO data. Conclusions The instance of missing PRO data and its

  15. LOX/hydrocarbon rocket engine analytical design methodology development and validation. Volume 1: Executive summary and technical narrative

    NASA Technical Reports Server (NTRS)

    Pieper, Jerry L.; Walker, Richard E.

    1993-01-01

    During the past three decades, an enormous amount of resources was expended on the design and development of Liquid Oxygen/Hydrocarbon and Hydrogen (LOX/HC and LOX/H2) rocket engines. A significant portion of these resources was used to develop and demonstrate the performance and combustion stability of each new engine. During these efforts, many analytical and empirical models were developed that characterize design parameters and combustion processes influencing performance and stability. Many of these models are suitable as design tools, but they had not been assembled into an industry-wide usable analytical design methodology. The objective of this program was to assemble existing performance and combustion stability models into a usable methodology capable of producing high-performing and stable LOX/hydrocarbon and LOX/hydrogen propellant booster engines.

  16. Introduction to the Design and Optimization of Experiments Using Response Surface Methodology. A Gas Chromatography Experiment for the Instrumentation Laboratory

    ERIC Educational Resources Information Center

    Lang, Patricia L.; Miller, Benjamin I.; Nowak, Abigail Tuttle

    2006-01-01

    The study describes how to design and optimize an experiment with multiple factors and multiple responses. The experiment uses fractional factorial analysis as a screening experiment only to identify important instrumental factors and does not use response surface methodology to find the optimal set of conditions.

  17. Revised Design-Based Research Methodology for College Course Improvement and Application to Education Courses in Japan

    ERIC Educational Resources Information Center

    Akahori, Kanji

    2011-01-01

    The author describes a research methodology for college course improvement, and applies the results to education courses. In Japan, it is usually difficult to carry out research on college course improvement, because faculty cannot introduce experimental design approaches based on control and treatment groupings of students in actual classroom…

  18. COMPUTER AIDED CHEMICAL PROCESS DESIGN METHODOLOGIES FOR POLLUTION REDUCTION(SYSTEMS ANALYSIS BRANCH, SUSTAINABLE TECHNOLOGY DIVISION, NRMRL)

    EPA Science Inventory

    The objective of the project is to develop computer optimization and simulation methodologies for the design of economical chemical manufacturing processes with a minimum of impact on the environment. The computer simulation and optimization tools developed in this project can be...

  19. School Reform and Student Diversity, Volume III: Technical Appendix: Research Design and Methodology. Studies of Education Reform.

    ERIC Educational Resources Information Center

    Berman, Paul; And Others

    More than one-fifth of American school-age children and youth come from language-minority families--homes in which languages other than English are spoken. This volume, the last in a series of three, describes the research design and methodology for a study that examined exemplary school-reform efforts involving the education of limited…

  20. A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework

    NASA Astrophysics Data System (ADS)

    Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo

    An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources, such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, and Data Base Management Systems (DBMS), in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), to provide a quantitative analysis of the qualitative criteria of MDO frameworks. The framework requirements and their corresponding solutions were identified and organized into hierarchies for two reference MDO frameworks: a general one and an aircraft-oriented one. The reference frameworks were then quantitatively characterized using AHP and QFD, and an assessment of three in-house frameworks was performed. The results produced clear and useful guidelines for improving the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human intervention.
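    The AHP step derives priority weights for the qualitative criteria from a reciprocal pairwise-comparison matrix. A minimal sketch using the row geometric-mean approximation (the judgement values below are illustrative, not the paper's):

    ```python
    from math import prod

    def ahp_weights(M):
        """Approximate AHP priority vector via the row geometric mean.

        M[i][j] holds the pairwise judgement 'criterion i vs criterion j'
        on Saaty's 1-9 scale; M is reciprocal (M[j][i] == 1 / M[i][j]).
        """
        n = len(M)
        gm = [prod(row) ** (1.0 / n) for row in M]  # row geometric means
        s = sum(gm)
        return [g / s for g in gm]  # normalise to sum to 1
    ```

    For a 2-criterion example where criterion A is judged 3 times as important as B, `ahp_weights([[1, 3], [1/3, 1]])` yields weights of 0.75 and 0.25; a full AHP analysis would also check the consistency ratio of the judgement matrix.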

  1. Multi-Period Many-Objective Groundwater Monitoring Design Given Systematic Model Errors and Uncertainty

    NASA Astrophysics Data System (ADS)

    Kollat, J. B.; Reed, P. M.

    2011-12-01

    This study demonstrates how many-objective long-term groundwater monitoring (LTGM) network design tradeoffs evolve across multiple management periods given systematic model errors (i.e., predictive bias), groundwater flow-and-transport forecasting uncertainties, and contaminant observation uncertainties. Our analysis utilizes the Adaptive Strategies for Sampling in Space and Time (ASSIST) framework, which is composed of three primary components: (1) bias-aware Ensemble Kalman Filtering, (2) many-objective hierarchical Bayesian optimization, and (3) interactive visual analytics for understanding spatiotemporal network design tradeoffs. A physical aquifer experiment is utilized to develop a severely challenging multi-period observation system simulation experiment (OSSE) that reflects the challenges and decisions faced in monitoring contaminated groundwater systems. The experimental aquifer OSSE shows both the influence and consequences of plume dynamics as well as alternative cost-savings strategies in shaping how LTGM many-objective tradeoffs evolve. Our findings highlight the need to move beyond least-cost, purely statistical monitoring frameworks to consider many-objective evaluations of LTGM tradeoffs. The ASSIST framework provides a highly flexible approach for measuring the value of observables that simultaneously improves how the data are used to inform decisions.

  2. Systematic review of effects of current transtibial prosthetic socket designs-Part 1: Qualitative outcomes.

    PubMed

    Safari, Mohammad Reza; Meier, Margrit Regula

    2015-01-01

    This review is an attempt to untangle the complexity of transtibial prosthetic socket fit, determine the most important characteristic for a successful fitting, and perhaps find some indication of whether a particular prosthetic socket type might be best for a given situation. Further, it is intended to provide directions for future research. We followed the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines and used medical subject headings and standard key words to search for articles in relevant databases. No restrictions were made on study design or type of outcome measure. From the obtained search results (n = 1,863), 35 articles were included. The relevant data were entered into a predefined data form that incorporated the Downs and Black risk of bias assessment checklist. Results for the qualitative outcomes (n = 19 articles) are synthesized. Total surface bearing sockets lead to greater activity levels and satisfaction in active persons with amputation, those with a traumatic cause of amputation, and younger persons with amputation than patellar tendon bearing sockets. Evidence on vacuum-assisted suction and hydrostatic sockets is inadequate, and further studies are much needed. To improve the scientific basis for prescription, comparison of and correlation between mechanical properties of interface material, socket designs, user characteristics, and outcome measures should be conducted and reported in future studies. PMID:26436666

  3. A computer modeling methodology and tool for assessing design concepts for the Space Station Data Management System

    NASA Technical Reports Server (NTRS)

    Jones, W. R.

    1986-01-01

    A computer modeling tool is being developed to assess candidate designs for the Space Station Data Management System (DMS). The DMS is to be a complex distributed computer system including the processor, storage devices, local area networks, and software that will support all processing functions onboard the Space Station. The modeling tool will allow a candidate design for the DMS, or for other subsystems that use the DMS, to be evaluated in terms of parameters. The tool and its associated modeling methodology are intended for use by DMS and subsystem designers to perform tradeoff analyses between design concepts using varied architectures and technologies.

  4. Tornado missile simulation and design methodology. Volume 2: model verification and data base updates. Final report

    SciTech Connect

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data-based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments.
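    The Monte Carlo estimation of impact probabilities can be sketched in miniature: sample missile landing points from assumed distributions and count hits on a target footprint. All geometry and distributions below are illustrative placeholders, not TORMIS models or data:

    ```python
    import random

    def impact_probability(n_trials=100_000, seed=1):
        """Toy Monte Carlo in the spirit of TORMIS: sample where a
        tornado-borne missile lands and count hits on a target footprint."""
        rng = random.Random(seed)
        hits = 0
        for _ in range(n_trials):
            # landing point, uniform over a hypothetical 1000 m x 1000 m site
            x = rng.uniform(0.0, 1000.0)
            y = rng.uniform(0.0, 1000.0)
            # hypothetical target structure: 50 m x 20 m footprint
            if 475.0 <= x <= 525.0 and 490.0 <= y <= 510.0:
                hits += 1
        return hits / n_trials
    ```

    With a uniform landing distribution the estimate converges to the footprint's area fraction (here 0.001); the real methodology replaces the uniform draw with sequenced models of tornado occurrence, missile injection, and transport.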

  5. Multiplexed actuation using ultra dielectrophoresis for proteomics applications: a comprehensive electrical and electrothermal design methodology.

    PubMed

    Emaminejad, Sam; Dutton, Robert W; Davis, Ronald W; Javanmard, Mehdi

    2014-06-21

    In this work, we present a methodological approach to analyze an enhanced dielectrophoresis (DEP) system from both circuit-analysis and electrothermal viewpoints. In our developed model, we have taken into account various phenomena and constraints such as voltage degradation (due to the presence of the protecting oxide layer), oxide breakdown, instrumentation limitations, and thermal effects. The results from this analysis are applicable generally to a wide variety of geometries and high voltage microsystems. Here, these design guidelines were applied to develop a robust electronic actuation system to perform a multiplexed bead-based protein assay. To carry out the multiplexed functionality, an array of proteins is patterned along a single microfluidic channel, where each element targets a specific secondary protein coated on micron-sized beads in the subsequently introduced sample solution. Below each element of the array is a pair of addressable interdigitated electrodes. By selectively applying voltage at the terminals of each interdigitated electrode pair, the enhanced DEP, or equivalently 'ultra'-DEP (uDEP), force detaches protein-bound beads from each element of the array, one by one, without disturbing the bound beads in the neighboring regions. The detached beads can be quantified optically or electrically downstream. For proof of concept, we illustrated the 16-plex actuation capability of our device to elute micron-sized beads that are bound to the surface through anti-IgG and IgG interaction, which is on the same order of magnitude in strength as typical antibody-antigen interactions. In addition to its application in multiplexed protein analysis, our platform can potentially be utilized to statistically characterize the strength profile of biological bonds, since the multiplexed format allows for high throughput force spectroscopy using the array of uDEP devices, under the same buffer and assay preparation conditions. PMID:24801800

  6. Rationale, Design, Methodology and Hospital Characteristics of the First Gulf Acute Heart Failure Registry (Gulf CARE)

    PubMed Central

    Sulaiman, Kadhim J.; Panduranga, Prashanth; Al-Zakwani, Ibrahim; Alsheikh-Ali, Alawi; Al-Habib, Khalid; Al-Suwaidi, Jassim; Al-Mahmeed, Wael; Al-Faleh, Husam; El-Asfar, Abdelfatah; Al-Motarreb, Ahmed; Ridha, Mustafa; Bulbanat, Bassam; Al-Jarallah, Mohammed; Bazargani, Nooshin; Asaad, Nidal; Amin, Haitham

    2014-01-01

    Background: There is a paucity of data on heart failure (HF) in the Gulf Middle East. The present paper describes the rationale, design, methodology and hospital characteristics of the first Gulf acute heart failure registry (Gulf CARE). Materials and Methods: Gulf CARE is a prospective, multicenter, multinational registry of patients >18 years of age admitted with a diagnosis of acute HF (AHF). The data collected included demographics, clinical characteristics, etiology, precipitating factors, management and outcomes of patients admitted with AHF. In addition, data about hospital readmission rates, procedures and mortality at 3 months and 1-year follow-up were recorded. Hospital characteristics and care provider details were collected. Data were entered in a dedicated website using an electronic case record form. Results: A total of 5005 consecutive patients were enrolled from February 14, 2012 to November 13, 2012. Forty-seven hospitals in 7 Gulf States (Oman, Saudi Arabia, Yemen, Kuwait, United Arab Emirates, Qatar and Bahrain) participated in the project. The majority of hospitals were community hospitals (46%; 22/47), followed by non-University teaching hospitals (32%; 15/47) and University hospitals (17%). Most of the hospitals had intensive or coronary care unit facilities (93%; 44/47), with 59% (28/47) having catheterization laboratory facilities. However, only 29% (14/47) had a dedicated HF clinic facility. Most patients (71%) were cared for by a cardiologist. Conclusions: Gulf CARE is the first prospective registry of AHF in the Middle East, intending to provide a unique insight into the demographics, etiology, management and outcomes of AHF in the Middle East. HF management in the Middle East is predominantly provided by cardiologists. The data obtained from this registry will help the local clinicians to identify the deficiencies in HF management as well as provide a platform to implement evidence-based preventive and treatment strategies to reduce the burden of HF in

  7. Neutralization of red mud with pickling waste liquor using Taguchi's design of experimental methodology.

    PubMed

    Rai, Suchita; Wasewar, Kailas L; Lataye, Dilip H; Mishra, Rajshekhar S; Puttewar, Suresh P; Chaddha, Mukesh J; Mahindiran, P; Mukhopadhyay, Jyoti

    2012-09-01

    'Red mud' or 'bauxite residue', a waste generated from alumina refineries, is highly alkaline in nature, with a pH of 10.5-12.5. Red mud poses serious environmental problems such as alkali seepage into ground water and alkaline dust generation. One of the options to make red mud less hazardous and environmentally benign is its neutralization with an acid or an acidic waste. Hence, in the present study, neutralization of alkaline red mud was carried out using a highly acidic waste (pickling waste liquor). Pickling waste liquor is a mixture of strong acids used for descaling or cleaning surfaces in the steel making industry. The aim of the study was to examine the feasibility of neutralizing the two wastes using Taguchi's design of experimental methodology, which would make both wastes less hazardous and safe for disposal. The effects of slurry solids, volume of pickling liquor, stirring time and temperature on the neutralization process were investigated. The analysis of variance (ANOVA) shows that the volume of the pickling liquor is the most significant parameter, followed by the quantity of red mud, contributing 69.18% and 18.48%, respectively. Under the optimized parameters, a pH value of 7 can be achieved by mixing the two wastes. About 25-30% of the total soda in the red mud is neutralized, and the alkalinity is reduced by 80-85%. The mineralogy and morphology of the neutralized red mud have also been studied. The data presented will be useful in view of the environmental concerns of red mud disposal. PMID:22751850
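    The percent-contribution figures quoted above are each factor's ANOVA sum of squares expressed as a share of the total sum of squares. A minimal sketch (the SS values in the check are made up for illustration, not the study's):

    ```python
    def percent_contribution(factor_ss):
        """Taguchi-style ANOVA percent contribution: each factor's sum of
        squares as a percentage of the total sum of squares."""
        total = sum(factor_ss.values())
        return {name: 100.0 * ss / total for name, ss in factor_ss.items()}
    ```

    For example, with hypothetical sums of squares `{"pickling liquor volume": 3.0, "red mud quantity": 1.0}`, the contributions come out to 75% and 25%; in practice an error term is included in the total as well.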

  8. Longitudinal Intergenerational Birth Cohort Designs: A Systematic Review of Australian and New Zealand Studies

    PubMed Central

    Townsend, Michelle L.; Riepsamen, Angelique; Georgiou, Christos; Flood, Victoria M.; Caputi, Peter; Wright, Ian M.; Davis, Warren S.; Jones, Alison; Larkin, Theresa A.; Williamson, Moira J.; Grenyer, Brin F. S.

    2016-01-01

    Background The longitudinal birth cohort design has yielded a substantial contribution to knowledge of child health and development. The last full review in New Zealand and Australia in 2004 identified 13 studies. Since then, birth cohort designs continue to be an important tool in understanding how intrauterine, infant and childhood development affect long-term health and well-being. This updated review in a defined geographical area was conducted to better understand the factors associated with successful quality and productivity, and greater scientific and policy contribution and scope. Methods We adopted the preferred reporting items for systematic reviews and meta-analyses (PRISMA) approach, searching PubMed, Scopus, Cinahl, Medline, Science Direct and ProQuest between 1963 and 2013. Experts were consulted regarding further studies. Five inclusion criteria were used: (1) have longitudinally tracked a birth cohort, (2) have collected data on the child and at least one parent or caregiver, (3) be based in Australia or New Zealand, (4) be empirical in design, and (5) have been published in English. Results 10665 records were initially retrieved, from which 23 birth cohort studies met the selection criteria. Together these studies recruited 91,196 participants, with 38,600 mothers, 14,206 fathers and 38,390 live births. Seventeen studies were located in Australia and six in New Zealand. Research questions initially focused on the perinatal period, but as studies matured, longer-term effects and outcomes were examined. Conclusions This review demonstrates the significant yield from this effort both in terms of scientific discovery and social policy impact. Further opportunities have been recognised with cross-study collaboration and pooling of data between established and newer studies and international studies to investigate global health determinants. PMID:26991330

  9. A Multi-Objective Advanced Design Methodology of Composite Beam-to-Column Joints Subjected to Seismic and Fire Loads

    SciTech Connect

    Pucinotti, Raffaele; Ferrario, Fabio; Bursi, Oreste S.

    2008-07-08

    A multi-objective advanced design methodology dealing with seismic actions followed by fire on steel-concrete composite full strength joints with concrete filled tubes is proposed in this paper. The specimens were designed in detail in order to exhibit a suitable fire behaviour after a severe earthquake. The major aspects of the cyclic behaviour of composite joints are presented and commented upon. The data obtained from monotonic and cyclic experimental tests have been used to calibrate a model of the joint in order to perform seismic simulations on several moment resisting frames. A hysteretic law was used to take into account the seismic degradation of the joints. Finally, fire tests were conducted with the objective to evaluate fire resistance of the connection already damaged by an earthquake. The experimental activity together with FE simulation demonstrated the adequacy of the advanced design methodology.

  10. Public health economics: a systematic review of guidance for the economic evaluation of public health interventions and discussion of key methodological issues

    PubMed Central

    2013-01-01

    Background If Public Health is the science and art of how society collectively aims to improve health, and reduce inequalities in health, then Public Health Economics is the science and art of supporting decision making as to how society can use its available resources to best meet these objectives and minimise opportunity cost. A systematic review of published guidance for the economic evaluation of public health interventions within this broad public policy paradigm was conducted. Methods Electronic databases and organisation websites were searched using a 22 year time horizon (1990–2012). References of papers were hand searched for additional papers for inclusion. Government reports or peer-reviewed published papers were included if they: referred to the methods of economic evaluation of public health interventions, identified key challenges of conducting economic evaluations of public health interventions or made recommendations for conducting economic evaluations of public health interventions. Guidance was divided into three categories: UK guidance, international guidance and observations or guidance provided by individual commentators in the field of public health economics. An assessment of the theoretical frameworks underpinning the guidance was made and served as a rationale for categorising the papers. Results We identified 5 international guidance documents, 7 UK guidance documents and 4 documents by individual commentators. The papers reviewed identify the main methodological challenges that face analysts when conducting such evaluations. There is a consensus within the guidance that wider social and environmental costs and benefits should be looked at due to the complex nature of public health. This was reflected in the theoretical underpinning, as the majority of guidance was categorised as extra-welfarist. Conclusions In this novel review we argue that health economics may have come full circle from its roots in broad public policy economics. We may

  11. Systematic Design of Pore Size and Functionality in Isoreticular MOFs and Their Application in Methane Storage

    NASA Astrophysics Data System (ADS)

    Eddaoudi, Mohamed; Kim, Jaheon; Rosi, Nathaniel; Vodak, David; Wachter, Joseph; O'Keeffe, Michael; Yaghi, Omar M.

    2002-01-01

    A strategy based on reticulating metal ions and organic carboxylate links into extended networks has been advanced to a point that allowed the design of porous structures in which pore size and functionality could be varied systematically. Metal-organic framework (MOF-5), a prototype of a new class of porous materials and one that is constructed from octahedral Zn-O-C clusters and benzene links, was used to demonstrate that its three-dimensional porous system can be functionalized with the organic groups -Br, -NH2, -OC3H7, -OC5H11, -C2H4, and -C4H4 and that its pore size can be expanded with the long molecular struts biphenyl, tetrahydropyrene, pyrene, and terphenyl. We synthesized an isoreticular series (one that has the same framework topology) of 16 highly crystalline materials whose open space represented up to 91.1% of the crystal volume, as well as homogeneous periodic pores that can be incrementally varied from 3.8 to 28.8 angstroms. One member of this series exhibited a high capacity for methane storage (240 cubic centimeters at standard temperature and pressure per gram at 36 atmospheres and ambient temperature), and others the lowest densities (0.41 to 0.21 gram per cubic centimeter) for a crystalline material at room temperature.

  12. Systematic design of pore size and functionality in isoreticular MOFs and their application in methane storage.

    PubMed

    Eddaoudi, Mohamed; Kim, Jaheon; Rosi, Nathaniel; Vodak, David; Wachter, Joseph; O'Keeffe, Michael; Yaghi, Omar M

    2002-01-18

    A strategy based on reticulating metal ions and organic carboxylate links into extended networks has been advanced to a point that allowed the design of porous structures in which pore size and functionality could be varied systematically. Metal-organic framework (MOF-5), a prototype of a new class of porous materials and one that is constructed from octahedral Zn-O-C clusters and benzene links, was used to demonstrate that its three-dimensional porous system can be functionalized with the organic groups -Br, -NH2, -OC3H7, -OC5H11, -C2H4, and -C4H4 and that its pore size can be expanded with the long molecular struts biphenyl, tetrahydropyrene, pyrene, and terphenyl. We synthesized an isoreticular series (one that has the same framework topology) of 16 highly crystalline materials whose open space represented up to 91.1% of the crystal volume, as well as homogeneous periodic pores that can be incrementally varied from 3.8 to 28.8 angstroms. One member of this series exhibited a high capacity for methane storage (240 cubic centimeters at standard temperature and pressure per gram at 36 atmospheres and ambient temperature), and others the lowest densities (0.41 to 0.21 gram per cubic centimeter) for a crystalline material at room temperature. PMID:11799235

  13. Systematic optimization of human pluripotent stem cells media using Design of Experiments

    NASA Astrophysics Data System (ADS)

    Marinho, Paulo A.; Chailangkarn, Thanathom; Muotri, Alysson R.

    2015-05-01

    Human pluripotent stem cells (hPSC) are used to study the early stages of human development in vitro and, increasingly due to somatic cell reprogramming, cellular and molecular mechanisms of disease. Cell culture medium is a critical factor for hPSC to maintain pluripotency and self-renewal. Numerous defined culture media have been empirically developed but never systematically optimized for culturing hPSC. We applied design of experiments (DOE), a powerful statistical tool, to improve the medium formulation for hPSC. Using pluripotency and cell growth as read-outs, we determined the optimal concentration of both basic fibroblast growth factor (bFGF) and neuregulin-1 beta 1 (NRG1β1). The resulting formulation, named iDEAL, improved the maintenance and passage of hPSC in both normal and stressful conditions, and affected trimethylated histone 3 lysine 27 (H3K27me3) epigenetic status after genetic reprogramming. It also enhances efficient hPSC plating as single cells. Altogether, iDEAL potentially allows scalable and controllable hPSC culture routine in translational research. Our DOE strategy could also be applied to hPSC differentiation protocols, which often require numerous and complex cell culture media.
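    The DOE approach described here can be illustrated with a response-surface step: fit a quadratic model to a small factorial screen of the two growth factors and read off the predicted optimum. Only the factor names (bFGF, NRG1β1) come from the abstract; the concentration grid and growth read-outs below are invented for illustration.

```python
import numpy as np

# Hypothetical 3x3 factorial screen: bFGF and NRG1b1 concentrations (ng/mL)
# with a growth read-out for each combination (invented numbers).
bfgf = np.repeat([10.0, 55.0, 100.0], 3)
nrg = np.tile([0.1, 0.55, 1.0], 3)
growth = np.array([1.0, 1.4, 1.3, 1.5, 1.9, 1.8, 1.2, 1.7, 1.6])

# Fit a full quadratic response surface:
# y = b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
X = np.column_stack([np.ones_like(bfgf), bfgf, nrg,
                     bfgf**2, nrg**2, bfgf * nrg])
coef, *_ = np.linalg.lstsq(X, growth, rcond=None)

# Evaluate the fitted surface on a fine grid and report the predicted optimum.
g1, g2 = np.meshgrid(np.linspace(10, 100, 50), np.linspace(0.1, 1.0, 50))
pred = (coef[0] + coef[1] * g1 + coef[2] * g2
        + coef[3] * g1**2 + coef[4] * g2**2 + coef[5] * g1 * g2)
i, j = np.unravel_index(np.argmax(pred), pred.shape)
print(f"predicted optimum: bFGF={g1[i, j]:.0f} ng/mL, NRG1b1={g2[i, j]:.2f} ng/mL")
```

    A real DOE campaign would add replicates, check lack-of-fit, and confirm the predicted optimum experimentally, but the fit-then-optimize step is the core of the method.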

  14. Systematic optimization of human pluripotent stem cells media using Design of Experiments

    PubMed Central

    Marinho, Paulo A.; Chailangkarn, Thanathom; Muotri, Alysson R.

    2015-01-01

    Human pluripotent stem cells (hPSC) are used to study the early stages of human development in vitro and, increasingly due to somatic cell reprogramming, cellular and molecular mechanisms of disease. Cell culture medium is a critical factor for hPSC to maintain pluripotency and self-renewal. Numerous defined culture media have been empirically developed but never systematically optimized for culturing hPSC. We applied design of experiments (DOE), a powerful statistical tool, to improve the medium formulation for hPSC. Using pluripotency and cell growth as read-outs, we determined the optimal concentration of both basic fibroblast growth factor (bFGF) and neuregulin-1 beta 1 (NRG1β1). The resulting formulation, named iDEAL, improved the maintenance and passage of hPSC in both normal and stressful conditions, and affected trimethylated histone 3 lysine 27 (H3K27me3) epigenetic status after genetic reprogramming. It also enhances efficient hPSC plating as single cells. Altogether, iDEAL potentially allows scalable and controllable hPSC culture routine in translational research. Our DOE strategy could also be applied to hPSC differentiation protocols, which often require numerous and complex cell culture media. PMID:25940691

  15. Review of the probabilistic failure analysis methodology and other probabilistic approaches for application in aerospace structural design

    NASA Technical Reports Server (NTRS)

    Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.

    1993-01-01

    Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus, costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.
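    The stress-strength formulation underlying all of these probabilistic methods can be sketched with brute-force Monte Carlo sampling; FPI and NESSUS estimate the same failure probability far more efficiently. The distribution parameters below are assumed for illustration and do not come from the report.

```python
import random

random.seed(42)
N = 100_000

# Count trials where the sampled applied stress exceeds the sampled strength.
failures = 0
for _ in range(N):
    stress = random.gauss(300.0, 30.0)    # applied stress, MPa (assumed)
    strength = random.gauss(450.0, 40.0)  # material strength, MPa (assumed)
    if stress > strength:
        failures += 1

pf = failures / N
print(f"estimated failure probability: {pf:.4f}")
```

    The contrast with the safety-factor approach is visible here: instead of a single deterministic margin (450/300 = 1.5), the design is characterized by a small but quantified probability of failure.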

  16. Systematic Description of Functional Knowledge based on Functional Ontologies and Its Use for Supporting Design of Functional Structures

    NASA Astrophysics Data System (ADS)

    Kitamura, Yoshinobu; Kasai, Toshinobu; Yoshikawa, Mariko; Takahashi, Masaru; Kozaki, Kouji; Mizoguchi, Riichiro

    In conceptual design, a designer decomposes a required function into sub-functions, so-called functional decomposition, using a kind of functional knowledge representing achievement relations among functions. Aiming at systematization of such functional knowledge, we proposed ontologies that guide conceptualization of artifacts from the functional point of view. This paper discusses its systematic description based on the functional ontologies. Firstly, we propose a new concept named “way of achievement” as a key concept for its systematization. Categorization of typical representations of the knowledge and organization as is-a hierarchies are also discussed. This concept, the categorization, and the functional ontologies make the functional knowledge consistent and applicable to other domains. Next, the implementation of the functional ontologies and their utility on description of the knowledge are shown. Lastly, we discuss development of a knowledge-based system to help human designers redesign an existing artifact. The ontology of functional concepts and the systematic description of functional knowledge enable the supporting system to show designers a wide range of alternative ways and then to facilitate innovative redesign.

  17. Thermal Hydraulics Design and Analysis Methodology for a Solid-Core Nuclear Thermal Rocket Engine Thrust Chamber

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Canabal, Francisco; Chen, Yen-Sen; Cheng, Gary; Ito, Yasushi

    2013-01-01

    Nuclear thermal propulsion is a leading candidate for in-space propulsion for human Mars missions. This chapter describes a thermal hydraulics design and analysis methodology developed at the NASA Marshall Space Flight Center, in support of the nuclear thermal propulsion development effort. The objective of this campaign is to bridge the design methods of the Rover/NERVA era with a modern computational fluid dynamics and heat transfer methodology, to predict thermal, fluid, and hydrogen environments of a hypothetical solid-core nuclear thermal engine, the Small Engine, designed in the 1960s. The computational methodology is based on an unstructured-grid, pressure-based, all-speeds, chemically reacting, computational fluid dynamics and heat transfer platform, while formulations of flow and heat transfer through porous and solid media were implemented to describe those of the hydrogen flow channels inside the solid core. Design analyses of a single flow element and the entire solid-core thrust chamber of the Small Engine were performed, and the results are presented herein.

  18. Grounded Theory as a Methodology to Design Teaching Strategies for Historically Informed Musical Performance

    ERIC Educational Resources Information Center

    Mateos-Moreno, Daniel; Alcaraz-Iborra, Mario

    2013-01-01

    Our work highlights the necessity of revising the materials employed in instrumental education, which are systematically based on a progressive development of technical abilities and, though only transversely, without a structured sequence of contents, on issues referring to the interpretation of different periods and styles. In order to elaborate…

  19. Systematic design of transmitter and receiver architectures for flexible filter bank multi-carrier signals

    NASA Astrophysics Data System (ADS)

    Gutiérrez, Esteban; López-Salcedo, José A.; Seco-Granados, Gonzalo

    2014-12-01

    Multi-carrier (MC) signaling is currently in the forefront of a myriad of systems, either wired or wireless, due to its high spectral efficiency, simple equalization, and robustness against multipath and narrowband interference sources. Despite its widespread deployment, the design of efficient architectures for MC systems becomes a challenging task when adopting filter bank multi-carrier (FBMC) modulation, due to the inclusion of band-limited shaping pulses into the signal model. The reason to employ these pulses is the numerous improvements they offer in terms of performance, such as providing higher spectral confinement and no frequency overlap between adjacent subcarriers. These attributes lead to a reduced out-of-band power emission and a higher effective throughput. The latter is indeed possible by removing the need for a cyclic prefix, which is responsible for preserving orthogonality among subcarriers in conventional MC systems. Nevertheless, the potential benefits of FBMC modulations are often obscured from an implementation point of view. In order to circumvent this limitation, the present paper provides a unified framework to describe all FBMC signals in which both signal design and implementation criteria are explicitly combined. In addition to this, we introduce the concept of flexible FBMC signals that, unlike their traditional MC counterparts, do not impose restrictions on the signal parameters (i.e., symbol rate, carrier spacing, or sampling frequency). Moreover, our framework also proposes a methodology that overcomes the implementation issues that characterize FBMC systems and allows us to derive simple, efficient, and time-invariant transmitter and receiver architectures.
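    The synthesis side of an FBMC transmitter can be sketched roughly: each subcarrier symbol is shaped by a band-limited prototype pulse before the subcarriers are summed, unlike the rectangular pulse of plain OFDM. A Hann window stands in here for a typical prototype filter (a root-raised-cosine or PHYDYAS-type filter would be more usual), and all parameters are illustrative.

```python
import cmath
import math

M = 8            # number of subcarriers (illustrative)
L = 4 * M        # prototype filter length, overlapping factor 4
# Hann window as a stand-in band-limited prototype pulse.
prototype = [0.5 - 0.5 * math.cos(2 * math.pi * n / (L - 1)) for n in range(L)]

def fbmc_symbol(symbols):
    """Synthesize one FBMC symbol period from M complex subcarrier symbols."""
    out = [0j] * L
    for k, s in enumerate(symbols):
        for n in range(L):
            carrier = cmath.exp(2j * math.pi * k * n / M)  # k-th subcarrier
            out[n] += s * prototype[n] * carrier           # pulse-shaped symbol
    return out

tx = fbmc_symbol([1, -1, 1j, -1j, 1, 1, -1, 1j])
print("samples per symbol period:", len(tx))
```

    An efficient implementation would replace the double loop with an IFFT plus polyphase filtering, which is precisely the kind of architecture the paper's framework aims to derive.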

  20. An Investigation of the Shortcomings of the CONSORT 2010 Statement for the Reporting of Group Sequential Randomised Controlled Trials: A Methodological Systematic Review

    PubMed Central

    Stevely, Abigail; Dimairo, Munyaradzi; Todd, Susan; Julious, Steven A.; Nicholl, Jonathan; Hind, Daniel; Cooper, Cindy L.

    2015-01-01

    Background It can be argued that adaptive designs are underused in clinical research. We have explored concerns related to inadequate reporting of such trials, which may influence their uptake. Through a careful examination of the literature, we evaluated the standards of reporting of group sequential (GS) randomised controlled trials, one form of a confirmatory adaptive design. Methods We undertook a systematic review, by searching Ovid MEDLINE from the 1st January 2001 to 23rd September 2014, supplemented with trials from an audit study. We included parallel group, confirmatory, GS trials that were prospectively designed using a Frequentist approach. Eligible trials were examined for compliance in their reporting against the CONSORT 2010 checklist. In addition, as part of our evaluation, we developed a supplementary checklist to explicitly capture group sequential specific reporting aspects, and investigated how these are currently being reported. Results Of the 284 screened trials, 68 (24%) were eligible. Most trials were published in “high impact” peer-reviewed journals. Examination of trials established that 46 (68%) were stopped early, predominantly either for futility or efficacy. Suboptimal reporting compliance was found in general items relating to: access to full trials protocols; methods to generate randomisation list(s); details of randomisation concealment, and its implementation. Benchmarking against the supplementary checklist, GS aspects were largely inadequately reported. Only 3 (7%) trials which stopped early reported use of statistical bias correction. Moreover, 52 (76%) trials failed to disclose methods used to minimise the risk of operational bias, due to the knowledge or leakage of interim results. Occurrence of changes to trial methods and outcomes could not be determined in most trials, due to inaccessible protocols and amendments. Discussion and Conclusions There are issues with the reporting of GS trials, particularly those specific to the

  1. Design of integrated autopilot/autothrottle for NASA TSRV airplane using integral LQG methodology. [transport systems research vehicle

    NASA Technical Reports Server (NTRS)

    Kaminer, Isaac; Benson, Russell A.

    1989-01-01

    An integrated autopilot/autothrottle control system has been developed for the NASA transport system research vehicle using a two-degree-of-freedom approach. Based on this approach, the feedback regulator was designed using an integral linear quadratic regulator design technique, which offers a systematic approach to satisfy desired feedback performance requirements and guarantees stability margins in both control and sensor loops. The resulting feedback controller was discretized and implemented using a delta coordinate concept, which allows for transient free controller switching by initializing all controller states to zero and provides a simple solution for dealing with throttle limiting cases.
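    The integral-LQR idea can be sketched on a toy first-order plant: augmenting the state with the integral of the tracking error gives the regulator built-in steady-state rejection, while the Riccati solution supplies guaranteed stability margins. The one-state airspeed model and weights below are invented for illustration; the TSRV design naturally used full vehicle dynamics.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Toy plant: airspeed deviation v_dot = a*v + b*u (hypothetical numbers).
a, b = -0.5, 2.0

# Augmented state [v, integral of v]: adds integral action to the regulator.
A = np.array([[a, 0.0],
              [1.0, 0.0]])
B = np.array([[b],
              [0.0]])
Q = np.diag([1.0, 0.5])   # penalize speed error and its integral
R = np.array([[0.1]])     # penalize throttle activity

# Solve the continuous algebraic Riccati equation for the optimal gain.
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.inv(R) @ B.T @ P   # state feedback u = -K x
print("LQR gain:", K)

# Closed-loop eigenvalues should all have negative real parts (stable).
eigs = np.linalg.eigvals(A - B @ K)
print("closed-loop eigenvalues:", eigs)
```

    Discretizing K and implementing it in delta coordinates, as the abstract describes, then allows transient-free switching by zero-initializing the controller states.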

  2. Switching from usual brand cigarettes to a tobacco-heating cigarette or snus: Part 1. Study design and methodology.

    PubMed

    Ogden, Michael W; Marano, Kristin M; Jones, Bobbette A; Stiles, Mitchell F

    2015-01-01

    A randomized, multi-center study was conducted to assess potential improvement in health status measures, as well as changes in biomarkers of tobacco exposure and biomarkers of biological effect, in current adult cigarette smokers switched to tobacco-heating cigarettes, snus or ultra-low machine yield tobacco-burning cigarettes (50/group) evaluated over 24 weeks. Study design, conduct and methodology are presented here along with subjects' disposition, characteristics, compliance and safety results. This design and methodology, evaluating generally healthy adult smokers over a relatively short duration, proved feasible. Findings from this randomized study provide generalized knowledge of the risk continuum among various tobacco products (ClinicalTrials.gov Identifier: NCT02061917). PMID:26525849

  3. Observations on computational methodologies for use in large-scale, gradient-based, multidisciplinary design incorporating advanced CFD codes

    NASA Technical Reports Server (NTRS)

    Newman, P. A.; Hou, G. J.-W.; Jones, H. E.; Taylor, A. C., III; Korivi, V. M.

    1992-01-01

    How a combination of various computational methodologies could reduce the enormous computational costs envisioned in using advanced CFD codes in gradient-based multidisciplinary design (MdD) optimization procedures is briefly outlined. Implications of these MdD requirements upon advanced CFD codes are somewhat different from those imposed by a single-discipline design. A means for satisfying these MdD requirements for gradient information is presented which appears to permit: (1) some leeway in the CFD solution algorithms which can be used; (2) an extension to 3-D problems; and (3) straightforward use of other computational methodologies. Many of these observations have previously been discussed as possibilities for doing parts of the problem more efficiently; the contribution here is observing how they fit together in a mutually beneficial way.

  4. Switching from usual brand cigarettes to a tobacco-heating cigarette or snus: Part 1. Study design and methodology

    PubMed Central

    Ogden, Michael W.; Marano, Kristin M.; Jones, Bobbette A.; Stiles, Mitchell F.

    2015-01-01

    Abstract A randomized, multi-center study was conducted to assess potential improvement in health status measures, as well as changes in biomarkers of tobacco exposure and biomarkers of biological effect, in current adult cigarette smokers switched to tobacco-heating cigarettes, snus or ultra-low machine yield tobacco-burning cigarettes (50/group) evaluated over 24 weeks. Study design, conduct and methodology are presented here along with subjects’ disposition, characteristics, compliance and safety results. This design and methodology, evaluating generally healthy adult smokers over a relatively short duration, proved feasible. Findings from this randomized study provide generalized knowledge of the risk continuum among various tobacco products (ClinicalTrials.gov Identifier: NCT02061917). PMID:26525849

  5. Contentious issues in research on trafficked women working in the sex industry: study design, ethics, and methodology.

    PubMed

    Cwikel, Julie; Hoban, Elizabeth

    2005-11-01

    The trafficking of women and children for work in the globalized sex industry is a global social problem. Quality data is needed to provide a basis for legislation, policy, and programs, but first, numerous research design, ethical, and methodological problems must be addressed. Research design issues in studying women trafficked for sex work (WTSW) include how to (a) develop coalitions to fund and support research, (b) maintain a critical stance on prostitution, and therefore WTSW, (c) use multiple paradigms and methods to accurately reflect WTSW's reality, (d) present the purpose of the study, and (e) protect respondents' identities. Ethical issues include (a) complications with informed consent procedures, (b) problematic access to WTSW, (c) loss of WTSW to follow-up, (d) inability to intervene in illegal acts or human rights violations, and (e) the need to maintain trustworthiness as researchers. Methodological issues include (a) constructing representative samples, (b) managing media interest, and (c) handling incriminating materials about law enforcement and immigration. PMID:19827235

  6. Impact Evaluation of Quality Assurance in Higher Education: Methodology and Causal Designs

    ERIC Educational Resources Information Center

    Leiber, Theodor; Stensaker, Bjørn; Harvey, Lee

    2015-01-01

    In this paper, the theoretical perspectives and general methodological elements of impact evaluation of quality assurance in higher education institutions are discussed, which should be a cornerstone of quality development in higher education and contribute to improving the knowledge about the effectiveness (or ineffectiveness) of quality…

  7. Class Size and Educational Achievement: A Review of Methodology with Particular Reference to Study Design.

    ERIC Educational Resources Information Center

    Goldstein, Harvey; Blatchford, Peter

    1998-01-01

    Reviews research into class size effects from a methodological viewpoint, concentrating on various strengths and weaknesses of randomized controlled trials (RCT) and observational studies. Discusses population definitions, causation, and generally sets out criteria for valid inferences from such studies. Illustrates with new findings from data in…

  8. Methodological Complications of Matching Designs under Real World Constraints: Lessons from a Study of Deeper Learning

    ERIC Educational Resources Information Center

    Zeiser, Kristina; Rickles, Jordan; Garet, Michael S.

    2014-01-01

    To help researchers understand potential issues one can encounter when conducting propensity matching studies in complex settings, this paper describes methodological complications faced when studying schools using deeper learning practices to improve college and career readiness. The study uses data from high schools located in six districts…

  9. An integrated controls-structures design methodology for a flexible spacecraft

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Joshi, Suresh M.; Price, Douglas B.

    1992-01-01

    This paper proposes an approach for the design of flexible spacecraft, wherein the structural design and the control system design are performed simultaneously. The integrated design problem is posed as an optimization problem in which both the structural parameters and the control system parameters constitute the design variables, which are used to optimize a common objective function, thereby resulting in an optimal overall design. The approach is demonstrated by application to the integrated design of a geostationary platform, and to a ground-based flexible structure experiment. The numerical results obtained indicate that the integrated design approach generally yields spacecraft designs that are substantially superior compared to the conventional approach, wherein the structural design and control design are performed sequentially.
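    The simultaneous treatment of structural and control parameters can be caricatured with a single-mode toy problem: a stiffness and a control (damping) gain are searched over one combined objective, rather than fixing the structure first and designing the controller afterwards. The model, grid, and weights below are invented for illustration only.

```python
import math

def cost(k, g, m=1.0, w_mass=0.05, w_perf=1.0):
    """Combined objective: structural 'mass' cost plus regulation cost.

    Toy closed loop m*x'' + g*x' + k*x = 0: settling time scales like
    2*m/g, and flexibility (low natural frequency) is penalized via 1/wn.
    """
    wn = math.sqrt(k / m)
    settle = 2.0 * m / g
    return w_mass * (k + g) + w_perf * (settle + 1.0 / wn)

# Integrated design: search both design variables against the common objective.
best = min((cost(k, g), k, g)
           for k in [1, 2, 4, 8, 16, 32]
           for g in [0.5, 1, 2, 4, 8])
print(f"integrated optimum: k={best[1]}, g={best[2]}, cost={best[0]:.3f}")
```

    A sequential approach would fix k (e.g., by mass alone) before tuning g; because the two terms trade off against each other, the joint search generally finds a lower combined cost, which is the paper's central observation at full scale.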

  10. Design methodology for a confocal imaging system using an objective microlens array with an increased working distance.

    PubMed

    Choi, Woojae; Shin, Ryung; Lim, Jiseok; Kang, Shinill

    2016-01-01

    In this study, a design methodology for a multi-optical probe confocal imaging system was developed. To develop an imaging system that has the required resolving power and imaging area, this study focused on a design methodology to create a scalable and easy-to-implement confocal imaging system. This system overcomes the limitations of the optical complexities of conventional multi-optical probe confocal imaging systems and the short working distance using a micro-objective lens module composed of two microlens arrays and a telecentric relay optical system. The micro-objective lens module was fabricated on a glass substrate using backside alignment photolithography and thermal reflow processes. To test the feasibility of the developed methodology, an optical system with a resolution of 1 μm/pixel using multi-optical probes with an array size of 10 × 10 was designed and constructed. The developed system provides a 1 mm × 1 mm field of view and a sample scanning range of 100 μm. The optical resolution was evaluated by conducting sample tests using a knife-edge detecting method. The measured lateral resolution of the system was 0.98 μm. PMID:27615370
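    The knife-edge evaluation mentioned at the end works by scanning across a sharp edge and reading a width off the edge response; one common convention takes the 10%-90% width. The sketch below uses a synthetic Gaussian-blurred step in place of real scan data, with illustrative numbers that are not the paper's.

```python
import math

def edge_response(x, sigma):
    """Ideal knife-edge profile: a unit step convolved with a Gaussian PSF."""
    return 0.5 * (1.0 + math.erf(x / (sigma * math.sqrt(2))))

# Simulate a scan across the edge with an assumed PSF sigma of 0.42 um.
sigma_true = 0.42
xs = [i * 0.01 - 2.0 for i in range(401)]        # scan positions, -2.0..2.0 um
profile = [edge_response(x, sigma_true) for x in xs]

# 10-90% width read off the profile; for a Gaussian PSF this width
# equals about 2.563 * sigma.
x10 = next(x for x, v in zip(xs, profile) if v >= 0.10)
x90 = next(x for x, v in zip(xs, profile) if v >= 0.90)
print(f"10-90% edge width: {x90 - x10:.2f} um")
```

    With real data one would fit the error-function model to the measured profile rather than read thresholds off noisy samples, but the width-to-resolution conversion is the same.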

  11. Systematic reviews need systematic searchers

    PubMed Central

    McGowan, Jessie; Sampson, Margaret

    2005-01-01

    Purpose: This paper will provide a description of the methods, skills, and knowledge of expert searchers working on systematic review teams. Brief Description: Systematic reviews and meta-analyses are very important to health care practitioners, who need to keep abreast of the medical literature and make informed decisions. Searching is a critical part of conducting these systematic reviews, as errors made in the search process potentially result in a biased or otherwise incomplete evidence base for the review. Searches for systematic reviews need to be constructed to maximize recall and deal effectively with a number of potentially biasing factors. Librarians who conduct the searches for systematic reviews must be experts. Discussion/Conclusion: Expert searchers need to understand the specifics about data structure and functions of bibliographic and specialized databases, as well as the technical and methodological issues of searching. Search methodology must be based on research about retrieval practices, and it is vital that expert searchers keep informed about, advocate for, and, moreover, conduct research in information retrieval. Expert searchers are an important part of the systematic review team, crucial throughout the review process—from the development of the proposal and research question to publication. PMID:15685278

  12. Integrating Evidence From Systematic Reviews, Qualitative Research, and Expert Knowledge Using Co-Design Techniques to Develop a Web-Based Intervention for People in the Retirement Transition

    PubMed Central

    O'Brien, Nicola; Heaven, Ben; Teal, Gemma; Evans, Elizabeth H; Cleland, Claire; Moffatt, Suzanne; Sniehotta, Falko F; White, Martin; Mathers, John C

    2016-01-01

    Background Integrating stakeholder involvement in complex health intervention design maximizes acceptability and potential effectiveness. However, there is little methodological guidance about how to integrate evidence systematically from various sources in this process. Scientific evidence derived from different approaches can be difficult to integrate and the problem is compounded when attempting to include diverse, subjective input from stakeholders. Objective The intent of the study was to describe and appraise a systematic, sequential approach to integrate scientific evidence, expert knowledge and experience, and stakeholder involvement in the co-design and development of a complex health intervention. The development of a Web-based lifestyle intervention for people in retirement is used as an example. Methods Evidence from three systematic reviews, qualitative research findings, and expert knowledge was compiled to produce evidence statements (stage 1). Face validity of these statements was assessed by key stakeholders in a co-design workshop resulting in a set of intervention principles (stage 2). These principles were assessed for face validity in a second workshop, resulting in core intervention concepts and hand-drawn prototypes (stage 3). The outputs from stages 1-3 were translated into a design brief and specification (stage 4), which guided the building of a functioning prototype, Web-based intervention (stage 5). This prototype was de-risked, resulting in an optimized functioning prototype (stage 6), which was subject to iterative testing and optimization (stage 7), prior to formal pilot evaluation. Results The evidence statements (stage 1) highlighted the effectiveness of physical activity, dietary and social role interventions in retirement; the idiosyncratic nature of retirement and well-being; the value of using specific behavior change techniques including those derived from the Health Action Process Approach; and the need for signposting to local

  13. A systematic framework for computer-aided design of engineering rubber formulations

    NASA Astrophysics Data System (ADS)

    Ghosh, Prasenjeet

    This thesis considers the design of engineering rubber formulations, whose unique properties of elasticity and resilience enable diverse applications. Engineering rubber formulations are a complex mixture of different materials, called curatives, including elastomers, fillers, crosslinking agents, accelerators, activators, retarders, anti-oxidants and processing aids, and the amount of each curative must be adjusted for each application. The characterization of the final properties of the rubber in application is complex and depends on the chemical interplay between the different curatives in the formulation via vulcanization chemistry. The details of the processing conditions and the thermal, deformational, and chemical environment encountered in application also have a pronounced effect on the performance of the rubber. Consequently, for much of the history of rubber as an engineering material, its recipe formulations have been developed largely by trial-and-error rather than by fundamental understanding. A computer-aided, systematic and automated framework for the design of such materials is proposed in this thesis. The framework requires the solution of two sub-problems: (a) the forward problem, which involves prediction of the desired properties when the formulation is known, and (b) the inverse problem, which requires identification of an appropriate formulation given the desired target properties. As part of the forward model, the chemistry of accelerated sulfur vulcanization is reviewed, integrating five decades of knowledge from the literature to answer some old questions, reconcile some of the contradictory mechanisms, and present a holistic description of the governing chemistry. Based on this mechanistic chemistry, a fundamental kinetic model is derived using population balance equations. The model quantitatively describes, for the first time, the different aspects of vulcanization chemistry. Subsequently, a novel three
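The forward/inverse split described above can be sketched with a toy surrogate: a forward map from formulation to properties, and an inverse step that searches for a formulation hitting target properties. The polynomial forward model, variable names, and bounds below are illustrative assumptions, not the thesis's kinetic model.

```python
import numpy as np
from scipy.optimize import minimize

# Toy forward model (hypothetical): maps a formulation vector
# [sulfur, accelerator] to two properties [modulus, scorch time].
# The thesis derives such a map from vulcanization kinetics; here an
# illustrative polynomial surrogate stands in for it.
def forward(x):
    sulfur, accel = x
    modulus = 2.0 * sulfur + 1.5 * accel * sulfur
    scorch = 10.0 / (1.0 + 3.0 * accel)
    return np.array([modulus, scorch])

# Inverse problem: find a formulation whose predicted properties match
# the targets, by least-squares search over the forward model.
def inverse(target, x0=(1.0, 1.0)):
    cost = lambda x: np.sum((forward(x) - target) ** 2)
    res = minimize(cost, x0, bounds=[(0.1, 5.0), (0.1, 5.0)])
    return res.x

target = np.array([6.0, 4.0])   # desired [modulus, scorch time]
x_star = inverse(target)
print(forward(x_star))          # should be close to the target
```

In a real framework the surrogate would be replaced by the population-balance kinetic model, and the inverse search would typically add manufacturability constraints.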

  14. An optimization-based integrated controls-structures design methodology for flexible space structures

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Joshi, Suresh M.; Armstrong, Ernest S.

    1993-01-01

    An approach for an optimization-based integrated controls-structures design is presented for a class of flexible spacecraft that require fine attitude pointing and vibration suppression. The integrated design problem is posed in the form of simultaneous optimization of both structural and control design variables. The approach is demonstrated by application to the integrated design of a generic space platform and to a model of a ground-based flexible structure. The numerical results obtained indicate that the integrated design approach can yield spacecraft designs that have substantially superior performance over a conventional design wherein the structural and control designs are performed sequentially. For example, a 40-percent reduction in the pointing error is observed along with a slight reduction in mass, or an almost twofold increase in the controlled performance is indicated with more than a 5-percent reduction in the overall mass of the spacecraft (a reduction of hundreds of kilograms).
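The advantage of integrated over sequential design can be illustrated with a toy cost model: the terms below (mass proportional to stiffness, a pointing-error term improving with stiffness and control gain, plus control effort) are purely hypothetical stand-ins for the structural and control design variables.

```python
from scipy.optimize import minimize, minimize_scalar

# Toy combined cost (hypothetical): structural mass k, pointing error
# that improves with stiffness k and control gain g, and control effort.
def total_cost(k, g):
    mass = k
    pointing_error = 1.0 / (k * g)
    control_effort = 0.1 * g
    return mass + pointing_error + control_effort

# Sequential design: freeze the structure first (k chosen without regard
# to control, here k = 1.0), then optimize the controller alone.
k_seq = 1.0
res_g = minimize_scalar(lambda g: total_cost(k_seq, g),
                        bounds=(0.01, 100.0), method="bounded")
J_seq = total_cost(k_seq, res_g.x)

# Integrated design: optimize structure and controller simultaneously.
res = minimize(lambda x: total_cost(x[0], x[1]), [1.0, 1.0],
               bounds=[(0.01, 100.0), (0.01, 100.0)])
J_int = total_cost(*res.x)

print(J_int <= J_seq)  # the integrated optimum is never worse
```

Because the sequential solution is one feasible point of the joint problem, the integrated optimum can only match or improve it, which is the effect the numerical results above report.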

  15. Methodology of design and analysis of external walls of space station for hypervelocity impacts by meteoroids and space debris

    NASA Technical Reports Server (NTRS)

    Batla, F. A.

    1986-01-01

    The development of criteria and methodology for the design and analysis of Space Station wall elements for collisions with meteoroids and space debris at hypervelocities is discussed. These collisions will occur at velocities of 10 km/s or more and can be damaging to the external wall elements of the Space Station. The wall elements need to be designed to protect the pressurized modules of the Space Station from functional or structural failure due to these collisions at hypervelocities for a given environment and population of meteoroids and space debris. The design and analysis approach and the associated computer program presented here are intended to achieve this objective, including optimization of the design for a required overall probability of no penetration. The approach is based on the presently available experimental and actual data on meteoroid and space debris flux and damage assessments, and on the empirical relationships resulting from hypervelocity impact studies in laboratories.
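The "overall probability of no penetration" in such analyses is conventionally computed with a Poisson model of random impacts; the flux and area numbers below are illustrative only, not actual Space Station values.

```python
import math

# Poisson model commonly used for meteoroid/debris risk: if the expected
# number of penetrating impacts on a wall element over the mission is N,
# the probability of no penetration (PNP) of that element is exp(-N).
def pnp(flux_per_m2_yr, area_m2, years):
    """flux_per_m2_yr: expected penetrating impacts per m^2 per year,
    i.e. the environment flux already filtered by the wall's ballistic
    limit (the damage-assessment step)."""
    expected_hits = flux_per_m2_yr * area_m2 * years
    return math.exp(-expected_hits)

# Overall PNP of a module built from independent wall elements is the
# product of the element PNPs (illustrative numbers).
elements = [(1e-6, 50.0), (2e-6, 30.0)]  # (penetrating flux, area) pairs
overall = 1.0
for flux, area in elements:
    overall *= pnp(flux, area, 10.0)     # 10-year exposure
print(overall)
```

Design optimization then amounts to adjusting shield thicknesses and spacings (which change each element's penetrating flux) until the overall PNP meets the requirement at minimum mass.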

  16. A systematic review of the incidence of schizophrenia: the distribution of rates and the influence of sex, urbanicity, migrant status and methodology

    PubMed Central

    McGrath, John; Saha, Sukanta; Welham, Joy; El Saadi, Ossama; MacCauley, Clare; Chant, David

    2004-01-01

    Background Understanding variations in the incidence of schizophrenia is a crucial step in unravelling the aetiology of this group of disorders. The aims of this review are to systematically identify studies related to the incidence of schizophrenia, to describe the key features of these studies, and to explore the distribution of rates derived from these studies. Methods Studies with original data related to the incidence of schizophrenia (published 1965–2001) were identified via searching electronic databases, reviewing citations and writing to authors. These studies were divided into core studies, migrant studies, cohort studies and studies based on Other Special Groups. Between- and within-study filters were applied in order to identify discrete rates. Cumulative plots of these rates were made and these distributions were compared when the underlying rates were sorted according to sex, urbanicity, migrant status and various methodological features. Results We identified 100 core studies, 24 migrant studies, 23 cohort studies and 14 studies based on Other Special Groups. These studies, which were drawn from 33 countries, generated a total of 1,458 rates. Based on discrete core data for persons (55 studies and 170 rates), the distribution of rates was asymmetric and had a median value (10%–90% quantile) of 15.2 (7.7–43.0) per 100,000. The distribution of rates was significantly higher in males compared to females; the male/female rate ratio median (10%–90% quantile) was 1.40 (0.9–2.4). Those studies conducted in urban versus mixed urban-rural catchment areas generated significantly higher rate distributions. The distribution of rates in migrants was significantly higher compared to native-born; the migrant/native-born rate ratio median (10%–90% quantile) was 4.6 (1.0–12.8). Apart from the finding that older studies reported higher rates, other study features were not associated with significantly different rate distributions (e.g. overall quality
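The summary statistics used throughout the review (a median with 10%–90% quantiles over a skewed distribution of rates) take only a few lines of NumPy; the rates below are synthetic stand-ins for the review's data.

```python
import numpy as np

# Synthetic, right-skewed incidence rates per 100,000 person-years
# (illustrative only; not the review's actual 170 core rates).
rng = np.random.default_rng(0)
rates = rng.lognormal(mean=np.log(15.0), sigma=0.6, size=170)

median = np.quantile(rates, 0.5)
q10, q90 = np.quantile(rates, [0.10, 0.90])
print(f"median (10%-90% quantile): {median:.1f} ({q10:.1f}-{q90:.1f})")
```

Reporting quantiles rather than a mean with standard deviation is the natural choice here because the distribution of rates is asymmetric, as the review notes.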

  17. Design methodology for micro-discrete planar optics with minimum illumination loss for an extended source.

    PubMed

    Shim, Jongmyeong; Park, Changsu; Lee, Jinhyung; Kang, Shinill

    2016-08-01

    Recently, studies have examined techniques for modeling the light distribution of light-emitting diodes (LEDs) for various applications owing to their low power consumption, longevity, and light weight. The energy mapping technique, a design method that matches the energy distributions of an LED light source and target area, has been the focus of active research because of its design efficiency and accuracy. However, these studies have not considered the effects of the emitting area of the LED source. Therefore, there are limitations to the design accuracy for small, high-power applications with a short distance between the light source and optical system. A design method for compensating for the light distribution of an extended source after an initial optics design based on a point source was proposed to overcome such limits, but its time-consuming process and limited design accuracy over multiple iterations raised the need for a new design method that considers an extended source in the initial design stage. This study proposed a method for designing discrete planar optics that controls the light distribution and minimizes the optical loss with an extended source, and verified the proposed method experimentally. First, the extended source was modeled theoretically, and a design method for discrete planar optics with the optimum groove angle through energy mapping was proposed. To verify the design method, discrete planar optics were designed for an LED flash illumination application. In addition, discrete planar optics for LED illumination were designed and fabricated to create a uniform illuminance distribution. Optical characterization of these structures showed that the design was optimal; i.e., when we plotted the optical losses as a function of the groove angle, we found a clear minimum. Simulations and measurements showed that an efficient optical design was achieved for an extended source. PMID:27505823
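A minimal sketch of the energy-mapping step for a rotationally symmetric point source: cumulative source energy in angle is equated to cumulative target energy in radius. The Lambertian intensity profile, the 60° design half-angle, and the uniform-disk target are assumptions for illustration; the paper's contribution is extending such a mapping to an extended source.

```python
import numpy as np

# Source: Lambertian LED, I(theta) ~ cos(theta); the cumulative flux
# emitted into a cone of half-angle theta is proportional to sin(theta)**2.
theta_max = np.radians(60.0)            # design half-angle (assumed)
R = 10.0                                # target disk radius, mm (assumed)
thetas = np.linspace(0.0, theta_max, 50)

# Normalized cumulative source energy up to each angle.
src_cdf = np.sin(thetas) ** 2 / np.sin(theta_max) ** 2

# Target: uniform illuminance on a disk, so cumulative energy ~ r**2.
# Equating the two CDFs and inverting gives the mapping theta -> r.
r_of_theta = R * np.sqrt(src_cdf)

# Each (theta_i, r_i) pair tells one discrete facet (groove) of the optic
# where to redirect its bundle of rays.
print(r_of_theta[0], r_of_theta[-1])
```

The mapping is monotone, so adjacent angular bundles land on adjacent annuli of the target, which is what lets the optic be discretized into planar facets without overlap.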

  18. A Sizing Methodology for the Conceptual Design of Blended-Wing-Body Transports. Degree awarded by George Washington Univ.

    NASA Technical Reports Server (NTRS)

    Kimmel, William M. (Technical Monitor); Bradley, Kevin R.

    2004-01-01

    This paper describes the development of a methodology for sizing Blended-Wing-Body (BWB) transports and how the capabilities of the Flight Optimization System (FLOPS) have been expanded using that methodology. In this approach, BWB transports are sized based on the number of passengers in each class that must fit inside the centerbody or pressurized vessel. Weight estimation equations for this centerbody structure were developed using Finite Element Analysis (FEA). This paper shows how the sizing methodology has been incorporated into FLOPS to enable the design and analysis of BWB transports. Previous versions of FLOPS did not have the ability to accurately represent or analyze BWB configurations in any reliable, logical way. The expanded capabilities allow the design and analysis of a 200 to 450-passenger BWB transport or the analysis of a BWB transport for which the geometry is already known. The modifications to FLOPS resulted in differences of less than 4 percent for the ramp weight of a BWB transport in this range when compared to previous studies performed by NASA and Boeing.
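A toy version of the passenger-driven sizing idea: the pressurized centerbody must provide enough floor area to seat each class. All unit areas and the overhead fraction below are invented for illustration and are not FLOPS values.

```python
# Hypothetical floor area per passenger by class, m^2.
AREA_PER_PAX_M2 = {"first": 2.2, "business": 1.5, "economy": 0.9}
AISLE_AND_GALLEY_FRACTION = 0.25  # assumed overhead for aisles, galleys, lavs

def centerbody_floor_area(passengers):
    """Required pressurized floor area for a class mix like
    {"first": 12, "business": 36, "economy": 250}."""
    seats = sum(AREA_PER_PAX_M2[c] * n for c, n in passengers.items())
    return seats * (1.0 + AISLE_AND_GALLEY_FRACTION)

area = centerbody_floor_area({"first": 12, "business": 36, "economy": 250})
print(f"required floor area: {area:.0f} m^2")
```

In the actual methodology this cabin requirement drives the centerbody planform, whose structural weight is then estimated from the FEA-derived equations.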

  19. Application of modern control design methodology to oblique wing research aircraft

    NASA Technical Reports Server (NTRS)

    Vincent, James H.

    1991-01-01

    A Linear Quadratic Regulator synthesis technique was used to design an explicit model following control system for the Oblique Wing Research Aircraft (OWRA). The forward path model (Maneuver Command Generator) was designed to incorporate the desired flying qualities and response decoupling. The LQR synthesis was based on the use of generalized controls, and it was structured to provide a proportional/integral error regulator with feedforward compensation. An unexpected consequence of this design approach was the ability to decouple the control synthesis into separate longitudinal and lateral directional designs. Longitudinal and lateral directional control laws were generated for each of the nine design flight conditions, and gain scheduling requirements were addressed. A fully coupled 6 degree of freedom open loop model of the OWRA along with the longitudinal and lateral directional control laws was used to assess the closed loop performance of the design. Evaluations were performed for each of the nine design flight conditions.
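The core LQR computation behind such a synthesis can be sketched with SciPy on a toy two-state model; the matrices and weights below are illustrative, not the OWRA dynamics.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Toy 2-state, 1-input linear model x' = A x + B u.
A = np.array([[0.0, 1.0],
              [-2.0, -0.5]])
B = np.array([[0.0],
              [1.0]])
Q = np.diag([10.0, 1.0])   # state weighting (assumed)
R = np.array([[1.0]])      # control weighting (assumed)

# Solve the continuous-time algebraic Riccati equation, then form the
# optimal state-feedback gain K = R^-1 B^T P.
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)

# LQR guarantees a stable closed loop: all eigenvalues of A - B K lie
# in the open left half-plane.
eigs = np.linalg.eigvals(A - B @ K)
print(np.all(eigs.real < 0))
```

Gain scheduling of the kind described above then amounts to repeating this computation at each design flight condition and interpolating the resulting gains.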

  20. Games and Diabetes: A Review Investigating Theoretical Frameworks, Evaluation Methodologies, and Opportunities for Design Grounded in Learning Theories.

    PubMed

    Lazem, Shaimaa; Webster, Mary; Holmes, Wayne; Wolf, Motje

    2016-03-01

    Here we review 18 articles that describe the design and evaluation of 1 or more games for diabetes from technical, methodological, and theoretical perspectives. We undertook searches covering the period 2010 to May 2015 in the ACM, IEEE, Journal of Medical Internet Research, Studies in Health Technology and Informatics, and Google Scholar online databases using the keywords "children," "computer games," "diabetes," "games," "type 1," and "type 2" in various Boolean combinations. The review sets out to establish, for future research, an understanding of the current landscape of digital games designed for children with diabetes. We briefly explored the use and impact of well-established learning theories in such games. The most frequently mentioned theoretical frameworks were social cognitive theory and social constructivism. Due to the limitations of the reported evaluation methodologies, little evidence was found to support the strong promise of games for diabetes. Furthermore, we could not establish a relation between design features and the game outcomes. We argue that an in-depth discussion about the extent to which learning theories could and should be manifested in the design decisions is required. PMID:26337753

  1. A methodology for hypersonic transport technology planning

    NASA Technical Reports Server (NTRS)

    Repic, E. M.; Olson, G. A.; Milliken, R. J.

    1973-01-01

    A systematic procedure by which the relative economic value of technology factors affecting design, configuration, and operation of a hypersonic cruise transport can be evaluated is discussed. Use of the methodology results in identification of first-order economic gains potentially achievable by projected advances in each of the definable, hypersonic technologies. Starting with a baseline vehicle, the formulas, procedures and forms which are integral parts of this methodology are developed. A demonstration of the methodology is presented for one specific hypersonic vehicle system.

  2. Duct injection technology prototype development: Scale-up methodology and engineering design criteria

    SciTech Connect

    Not Available

    1991-04-01

    The objective of the Duct Injection Technology Prototype Development project is to develop a sound design basis for applying duct injection technology as a post-combustion SO{sub 2} emissions control method to existing, pre-NSPS, coal-fired power plants. This report is divided into five major topics: (1) design criteria; (2) engineering drawings; (3) equipment sizing and design; (4) plant and equipment arrangement considerations; and (5) equipment bid specification guidelines.

  3. The theory and methodology of capturing and representing the design process and its application to the task of rapid redesign

    NASA Astrophysics Data System (ADS)

    Nii, Kendall M.

    The paradigm under which engineering design is being performed in the Aerospace industry is changing. There is an increased emphasis on a "faster, better, and cheaper" way of doing business. Designers are tasked with developing a better product, in a shorter time, with less money. Engineers are continually trying to improve their products, lower their costs, and reduce their schedules. So at first glance, it might seem difficult if not impossible to perform these three tasks simultaneously and achieve order-of-magnitude improvements in each area. Indeed, it might well be impossible for an engineer using only traditional tools and techniques. However, there is a new tool, known as design capture, available to the designer. A design capture system can aid the designer in a variety of ways. One specific use for a design capture system is to aid the designer in performing rapid redesign. This thesis presents a new methodology for a Design Capture System (DCS) which can aid the designer in performing rapid redesign. The Design Capture for Rapid Redesign (DCARRD) method facilitates rapid redesign in three ways: it allows the designer to assess the impact of changing an initial requirement, it allows the designer to assess the impact of changing a decision, and it enhances the ability of the designer to assess the impact of a completely new requirement. The DCARRD method was implemented in an HTML-based design capture system accessible through a Web browser. This implementation demonstrates the feasibility of the DCARRD method. The most important features of DCARRD are that it is focused on performing rapid redesign, it places design decisions within the framework of the design process, it is simple to use and implement, and it has the ability to track subsystem baselines. The many complex issues surrounding testing of design tools in general, and DCARRD in particular, are discussed at length.
There are a number of complex issues which must be addressed
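The impact-assessment capability described for DCARRD can be illustrated as a traversal of a dependency graph linking requirements to the decisions that flow from them; the node names and dict-based representation below are hypothetical, not the thesis's data model.

```python
from collections import deque

# edges: node -> nodes that depend on it (all names illustrative)
DEPENDENTS = {
    "REQ-payload-mass": ["DEC-structure-material"],
    "DEC-structure-material": ["DEC-panel-thickness", "DEC-fastener-type"],
    "DEC-panel-thickness": ["DEC-thermal-coating"],
    "DEC-fastener-type": [],
    "DEC-thermal-coating": [],
}

def impacted(changed):
    """Breadth-first traversal: every decision downstream of a changed
    requirement or decision must be revisited during rapid redesign."""
    seen, queue = set(), deque(DEPENDENTS.get(changed, []))
    while queue:
        node = queue.popleft()
        if node not in seen:
            seen.add(node)
            queue.extend(DEPENDENTS.get(node, []))
    return seen

print(sorted(impacted("REQ-payload-mass")))
```

Changing a mid-level decision rather than a root requirement simply starts the traversal lower in the graph, which is why the same capture structure supports all three redesign scenarios listed above.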

  4. Design and application of complementary educational resources for self-learning methodology

    NASA Astrophysics Data System (ADS)

    Andrés Gilarranz Casado, Carlos; Rodriguez-Sinobas, Leonor

    2016-04-01

    The main goal of this work is to enhance students' self-learning in subjects regarding irrigation and its technology. The use of visual media (video recordings) of the lectures (master classes and practicals) helps students understand the scope of the course, since they can watch the recorded material at any time and as many times as they wish. The study comprised two parts. In the first, lectures were video-filmed inside the classroom during one semester (16 weeks, four hours per week) in the course "Irrigation Systems and Technology", which is taught at the Technical University of Madrid. In total, 200 videos, approximately 12 min long, were recorded. Since YouTube is a worldwide platform commonly used by students and professors, the videos were uploaded there, and the URLs were inserted into the Moodle platform that hosts the materials for the course. In the second part, the videos were edited and formatted, taking special care to maintain image and audio quality. Finally, thirty videos were produced, focused on the main areas of the course and containing a clear and brief explanation of their basis; each video lasted between 30 and 45 min. A survey was administered at the end of the semester to assess the students' opinion of the methodology. In the questionnaire, the students highlighted the key aspects of the learning process and, in general, they were very satisfied with the methodology.

  5. Application of the MIAS methodology in design of the data acquisition system for wastewater treatment plant

    NASA Astrophysics Data System (ADS)

    Ćwikła, G.; Krenczyk, D.; Kampa, A.; Gołda, G.

    2015-11-01

    This paper presents the application of the MIAS (Manufacturing Information Acquisition System) methodology to develop a customized data acquisition system supporting management of the Central Wastewater Treatment Plant (CWWTP) in Gliwice, Poland, an example of a production system running continuous-flow, automated production processes. Access to current data on the state of the production system is key to efficient management of a company, allowing fast reaction to, or even anticipation of, equipment problems and reduction of waste. An overview of both the analysis and the synthesis of organisational solutions, data sources, data pre-processing and communication interfaces, realised according to the proposed MIAS methodology, is presented. The analysis stage covered, among others: the organisational structure of the company, the IT systems used in the company, the specifics of the technological processes, machines and equipment, the structure of the control systems, the assignments of crew members, and the materials used in the technological processes. The paper also presents the results of the synthesis stage of MIAS technical and organisational solutions for the CWWTP, including proposed solutions covering the MIAS architecture and its connections with other IT systems, currently available and newly created data sources in the production system, data pre-processing procedures, and the necessary communication interfaces.

  6. Optical binary de Bruijn networks for massively parallel computing: design methodology and feasibility study

    NASA Astrophysics Data System (ADS)

    Louri, Ahmed; Sung, Hongki

    1995-10-01

    The interconnection network structure can be the deciding and limiting factor in the cost and the performance of parallel computers. One of the most popular point-to-point interconnection networks for parallel computers today is the hypercube. The regularity, logarithmic diameter, symmetry, high connectivity, fault tolerance, simple routing, and reconfigurability (easy embedding of other network topologies) of the hypercube make it a very attractive choice for parallel computers. Unfortunately, the hypercube possesses a major drawback: the number of links per node increases as the network grows in size. As an alternative to the hypercube, the binary de Bruijn (BdB) network has recently received much attention. The BdB not only provides a logarithmic diameter, fault tolerance, and simple routing but also requires fewer links than the hypercube for the same network size. Additionally, a major advantage of the BdB is that the number of edges per node is independent of the network size. This makes it very desirable for large-scale parallel systems. However, because of its asymmetrical nature and global connectivity, it poses a major challenge for VLSI technology. Optics, owing to its three-dimensional and global-connectivity nature, seems to be very suitable for implementing BdB networks. We present an implementation methodology for optical BdB networks. The distinctive feature of the proposed implementation methodology is partitionability of the network into a few primitive operations that can be implemented efficiently. We further show feasibility of the
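The constant-degree and logarithmic-diameter properties claimed above follow directly from the shift-register structure of the binary de Bruijn graph, sketched here: with 2**n nodes, node x connects to (2x) mod 2**n and (2x+1) mod 2**n, and routing shifts in the destination's bits one at a time.

```python
# Binary de Bruijn graph BdB(n): 2**n nodes, out-degree 2 regardless of
# network size (the constant-degree property noted above).
def successors(x, n):
    mask = (1 << n) - 1
    return [(x << 1) & mask, ((x << 1) | 1) & mask]

def route(src, dst, n):
    """Shift-register routing: feed the n bits of dst in from the right,
    most significant bit first. Path length is at most n hops, giving
    the logarithmic diameter."""
    path, cur = [src], src
    mask = (1 << n) - 1
    for i in range(n - 1, -1, -1):
        bit = (dst >> i) & 1
        cur = ((cur << 1) | bit) & mask
        path.append(cur)
    return path

print(successors(0b101, 3))        # [2, 3]
print(route(0b101, 0b010, 3))      # [5, 2, 5, 2]
```

The routing scheme never consults a table, which is part of what makes the network attractive for the optical primitive-operation implementation described above.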

  7. A Design Heritage-Based Forecasting Methodology for Risk Informed Management of Advanced Systems

    NASA Technical Reports Server (NTRS)

    Maggio, Gaspare; Fragola, Joseph R.

    1999-01-01

    The development of next generation systems often carries with it the promise of improved performance, greater reliability, and reduced operational costs. These expectations arise from the use of novel designs, new materials, and advanced integration and production technologies intended to replace the functionality of the previous generation. However, the novelty of these nascent technologies is accompanied by a lack of operational experience and, in many cases, no actual testing. Therefore some of the enthusiasm surrounding most new technologies may be due to inflated aspirations born of limited knowledge rather than realistic future expectations. This paper proposes a design heritage approach for improved reliability forecasting of advanced system components. The basis of the design heritage approach is to relate advanced system components to similar designs currently in operation. The demonstrated performance of these components can then be used to forecast the expected performance and reliability of comparable advanced technology components. In this approach, the greater the divergence of the advanced component designs from the current systems, the higher the uncertainty that accompanies the associated failure estimates. Designers of advanced systems are faced with many difficult decisions. One of the most common and most difficult of these decisions is the choice between design alternatives. In the past, decision-makers have found these decisions extremely difficult to make because they often involve a trade-off between a known performing fielded design and a promising paper design. When it comes to expected reliability performance, the paper design always looks better because it is on paper and addresses all the known failure modes of the fielded design. On the other hand, there is a long and sometimes very difficult road between the promise of a paper design and its fulfillment; with the possibility that sometimes the reliability

  8. SEISMIC DESIGN REQUIREMENTS SELECTION METHODOLOGY FOR THE SLUDGE TREATMENT & M-91 SOLID WASTE PROCESSING FACILITIES PROJECTS

    SciTech Connect

    RYAN GW

    2008-04-25

    In complying with direction from the U.S. Department of Energy (DOE), Richland Operations Office (RL) (07-KBC-0055, 'Direction Associated with Implementation of DOE-STD-1189 for the Sludge Treatment Project,' and 08-SED-0063, 'RL Action on the Safety Design Strategy (SDS) for Obtaining Additional Solid Waste Processing Capabilities (M-91 Project) and Use of Draft DOE-STD-1189-YR'), it has been determined that the seismic design requirements currently in the Project Hanford Management Contract (PHMC) will be modified by DOE-STD-1189, Integration of Safety into the Design Process (March 2007 draft), for these two key PHMC projects. Seismic design requirements for other PHMC facilities and projects will remain unchanged. Considering the current early Critical Decision (CD) phases of both the Sludge Treatment Project (STP) and the Solid Waste Processing Facilities (M-91) Project and a strong intent to avoid potentially costly re-work of both engineering and nuclear safety analyses, this document describes how Fluor Hanford, Inc. (FH) will maintain compliance with the PHMC by considering both the current seismic standards referenced by DOE O 420.1B, Facility Safety, and draft DOE-STD-1189 (i.e., ASCE/SEI 43-05, Seismic Design Criteria for Structures, Systems, and Components in Nuclear Facilities, and ANSI/ANS 2.26-2004, Categorization of Nuclear Facility Structures, Systems and Components for Seismic Design, as modified by draft DOE-STD-1189) to choose the criteria that will result in the most conservative seismic design categorization and engineering design. Following the process described in this document will result in a conservative seismic design categorization and design products. This approach is expected to resolve discrepancies between the existing and new requirements and reduce the risk that project designs and analyses will require revision when the draft DOE-STD-1189 is finalized.

  9. Persuasive System Design Does Matter: A Systematic Review of Adherence to Web-Based Interventions

    PubMed Central

    Kok, Robin N; Ossebaard, Hans C; Van Gemert-Pijnen, Julia EWC

    2012-01-01

    Background Although web-based interventions for promoting health and health-related behavior can be effective, poor adherence is a common issue that needs to be addressed. Technology as a means to communicate the content in web-based interventions has been neglected in research. Indeed, technology is often seen as a black-box, a mere tool that has no effect or value and serves only as a vehicle to deliver intervention content. In this paper we examine technology from a holistic perspective. We see it as a vital and inseparable aspect of web-based interventions to help explain and understand adherence. Objective This study aims to review the literature on web-based health interventions to investigate whether intervention characteristics and persuasive design affect adherence to a web-based intervention. Methods We conducted a systematic review of studies into web-based health interventions. Per intervention, intervention characteristics, persuasive technology elements and adherence were coded. We performed a multiple regression analysis to investigate whether these variables could predict adherence. Results We included 101 articles on 83 interventions. The typical web-based intervention is meant to be used once a week, is modular in set-up, is updated once a week, lasts for 10 weeks, includes interaction with the system and a counselor and peers on the web, includes some persuasive technology elements, and about 50% of the participants adhere to the intervention. Regarding persuasive technology, we see that primary task support elements are most commonly employed (mean 2.9 out of a possible 7.0). Dialogue support and social support are less commonly employed (mean 1.5 and 1.2 out of a possible 7.0, respectively). 
When comparing the interventions of the different health care areas, we find significant differences in intended usage (p = .004), setup (p < .001), updates (p < .001), frequency of interaction with a counselor (p < .001), the system (p = .003) and peers (p
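The multiple-regression step described in the Methods can be sketched with ordinary least squares on synthetic data; the predictors, coefficients, noise level, and sample size below are invented for illustration and do not reproduce the review's dataset.

```python
import numpy as np

# Synthetic dataset: one row per intervention (n = 83, as in the review's
# intervention count, but with made-up values).
rng = np.random.default_rng(42)
n = 83
dialogue_support = rng.integers(0, 8, n)   # persuasive-design element counts
primary_task = rng.integers(0, 8, n)
updates_per_week = rng.integers(0, 4, n)

# Synthetic "true" relationship plus noise: adherence as the fraction of
# participants completing the intervention.
adherence = (0.30 + 0.04 * dialogue_support + 0.02 * primary_task
             + 0.03 * updates_per_week + rng.normal(0.0, 0.05, n))

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), dialogue_support,
                     primary_task, updates_per_week])
beta, *_ = np.linalg.lstsq(X, adherence, rcond=None)
print(np.round(beta, 3))  # intercept followed by three slopes
```

A positive fitted slope on a persuasive-design count is the kind of evidence the review uses to argue that persuasive system design, not just content, predicts adherence.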

  10. MODeLeR: A Virtual Constructivist Learning Environment and Methodology for Object-Oriented Design

    ERIC Educational Resources Information Center

    Coffey, John W.; Koonce, Robert

    2008-01-01

    This article contains a description of the organization and method of use of an active learning environment named MODeLeR, (Multimedia Object Design Learning Resource), a tool designed to facilitate the learning of concepts pertaining to object modeling with the Unified Modeling Language (UML). MODeLeR was created to provide an authentic,…

  11. Partnerships for the Design, Conduct, and Analysis of Effectiveness, and Implementation Research: Experiences of the Prevention Science and Methodology Group

    PubMed Central

    Brown, C. Hendricks; Kellam, Sheppard G.; Kaupert, Sheila; Muthén, Bengt O.; Wang, Wei; Muthén, Linda K.; Chamberlain, Patricia; PoVey, Craig L.; Cady, Rick; Valente, Thomas W.; Ogihara, Mitsunori; Prado, Guillermo J.; Pantin, Hilda M.; Gallo, Carlos G.; Szapocznik, José; Czaja, Sara J.; McManus, John W.

    2012-01-01

    What progress prevention research has made comes through strategic partnerships with communities and institutions that host this research, as well as professional and practice networks that facilitate the diffusion of knowledge about prevention. We discuss partnership issues related to the design, analysis, and implementation of prevention research and especially how rigorous designs, including random assignment, get resolved through a partnership between community stakeholders, institutions, and researchers. These partnerships shape not only study design but also determine the data that can be collected and how results and new methods are disseminated. We also examine a second type of partnership to improve the implementation of effective prevention programs into practice. We draw on social networks to study partnership formation and function. The experience of the Prevention Science and Methodology Group, which itself is a networked partnership between scientists and methodologists, is highlighted. PMID:22160786

  12. Partnerships for the design, conduct, and analysis of effectiveness, and implementation research: experiences of the prevention science and methodology group.

    PubMed

    Brown, C Hendricks; Kellam, Sheppard G; Kaupert, Sheila; Muthén, Bengt O; Wang, Wei; Muthén, Linda K; Chamberlain, Patricia; PoVey, Craig L; Cady, Rick; Valente, Thomas W; Ogihara, Mitsunori; Prado, Guillermo J; Pantin, Hilda M; Gallo, Carlos G; Szapocznik, José; Czaja, Sara J; McManus, John W

    2012-07-01

    What progress prevention research has made comes through strategic partnerships with communities and institutions that host this research, as well as professional and practice networks that facilitate the diffusion of knowledge about prevention. We discuss partnership issues related to the design, analysis, and implementation of prevention research and especially how rigorous designs, including random assignment, get resolved through a partnership between community stakeholders, institutions, and researchers. These partnerships shape not only study design but also determine the data that can be collected and how results and new methods are disseminated. We also examine a second type of partnership to improve the implementation of effective prevention programs into practice. We draw on social networks to study partnership formation and function. The experience of the Prevention Science and Methodology Group, which itself is a networked partnership between scientists and methodologists, is highlighted. PMID:22160786

  13. Multi-variable control of the GE T700 engine using the LQG/LTR design methodology

    NASA Technical Reports Server (NTRS)

    Pfeil, W. H.; Athans, M.; Spang, H. A., III

    1986-01-01

    The design of scalar and multi-variable feedback control systems for the GE T700 turboshaft engine coupled to a helicopter rotor system is examined. A series of linearized models are presented and analyzed. Robustness and performance specifications are posed in the frequency domain. The linear-quadratic-Gaussian with loop-transfer-recovery (LQG/LTR) methodology is used to obtain a sequence of three feedback designs. Even in the single-input/single-output case, comparison of the current control system with that derived from the LQG/LTR approach shows significant performance improvement. The multi-variable designs, evaluated using linear and nonlinear simulations, show even more potential for performance improvement.
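
    The regulator half of an LQG/LTR design reduces to solving an algebraic Riccati equation for the state-feedback gain. The sketch below shows that step only (the loop-transfer-recovery tuning of the Kalman filter is omitted), on a hypothetical 2-state linearized model whose numbers are purely illustrative and unrelated to the actual T700 model:

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Hypothetical 2-state linearized plant (illustrative numbers only).
A = np.array([[-1.0, 0.5],
              [ 0.0, -2.0]])
B = np.array([[0.0],
              [1.0]])
Q = np.diag([10.0, 1.0])   # state weighting: penalize speed deviation more
R = np.array([[1.0]])      # control-effort weighting

# LQR step: solve the continuous algebraic Riccati equation, then K = R^-1 B^T P.
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)

# LQR guarantees the closed-loop matrix A - B K is Hurwitz (stable).
eigs = np.linalg.eigvals(A - B @ K)
```

In a full LQG/LTR design, a Kalman filter would be designed next and its noise covariances tuned so the loop transfer function recovers the LQR loop shape at the plant input.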

  14. The conceptual development of a methodology for solving multi-objective hierarchical thermal design problems

    SciTech Connect

    Bascaran, E.; Bannerot, R.; Mistree, F.

    1987-01-01

    The design of thermal systems is complicated by changing operating conditions, the large number of alternatives, the strong dependence of thermal properties on temperature and pressure, and sometimes a lack of good understanding of the basic phenomena involved. A conceptual development is presented for organizing multi-objective hierarchical thermal design problems into a series of decision support problems that are compatible and solvable with DSIDES, a software system under development in the Systems Design Laboratory in the Department of Mechanical Engineering at the University of Houston. The software is currently being used to support a variety of mechanical design problems, including ships and airplanes. In this paper, a hierarchical coupled thermal problem is presented and solved by way of example.

  15. Experimental validation of optimization-based integrated controls-structures design methodology for flexible space structures

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Joshi, Suresh M.; Walz, Joseph E.

    1993-01-01

    An optimization-based integrated design approach for flexible space structures is experimentally validated using three types of dissipative controllers: static, dynamic, and LQG. The nominal phase-0 controls-structures interaction evolutionary model (CEM) structure is redesigned to minimize the average control power required to maintain a specified root-mean-square line-of-sight pointing error under persistent disturbances. The redesigned structure, the phase-1 CEM, was assembled and tested against the phase-0 CEM. It is demonstrated both analytically and experimentally that the integrated controls-structures design is substantially superior to that obtained through the traditional sequential approach. The capability of a software design tool based on an automated design procedure in a unified environment for structural and control designs is also demonstrated.

  16. A Systematic Framework of Virtual Laboratories Using Mobile Agent and Design Pattern Technologies

    ERIC Educational Resources Information Center

    Li, Yi-Hsung; Dow, Chyi-Ren; Lin, Cheng-Min; Chen, Sheng-Chang; Hsu, Fu-Wei

    2009-01-01

    Innovations in network and information technology have transformed traditional classroom lectures into new approaches that have given universities the opportunity to create a virtual laboratory. However, there is no systematic framework in existing approaches for the development of virtual laboratories. Further, developing a virtual laboratory…

  17. Playground Designs to Increase Physical Activity Levels during School Recess: A Systematic Review

    ERIC Educational Resources Information Center

    Escalante, Yolanda; García-Hermoso, Antonio; Backx, Karianne; Saavedra, Jose M.

    2014-01-01

    School recess provides a major opportunity to increase children's physical activity levels. Various studies have described strategies to increase levels of physical activity. The purpose of this systematic review is therefore to examine the interventions proposed as forms of increasing children's physical activity levels during recess. A…

  18. Mixing design for enzymatic hydrolysis of sugarcane bagasse: methodology for selection of impeller configuration.

    PubMed

    Corrêa, Luciano Jacob; Badino, Alberto Colli; Cruz, Antonio José Gonçalves

    2016-02-01

    One of the major process bottlenecks for viable industrial production of second-generation ethanol is the technical-economic difficulty of the hydrolysis step. Developing a methodology for choosing the impeller configuration that best improves mass transfer and hydrolysis yield at low power consumption is important for making the process cost-effective. In this work, four dual impeller configurations (DICs) were evaluated in sugarcane bagasse (SCB) hydrolysis experiments in a stirred tank reactor (3 L). The systems tested were dual Rushton turbine impellers (DIC1), Rushton and elephant ear (down-pumping) turbines (DIC2), Rushton and elephant ear (up-pumping) turbines (DIC3), and down-pumping and up-pumping elephant ear turbines (DIC4). The experiments were conducted for 96 h, using 10 % (m/v) SCB, pH 4.8, 50 °C, 10 FPU/g biomass, and 470 rpm. The mixing time was successfully used as the characteristic parameter for selecting the best impeller configuration. Rheological parameters were determined using a rotational rheometer, and the power consumption of each of the four DICs was measured on-line with a dynamometer. The values obtained for the energetic efficiency (the ratio between the cellulose-to-glucose conversion and the total energy) showed that the proposed methodology succeeded in choosing a suitable impeller configuration, with DIC4 achieving approximately three times the energetic efficiency of DIC1. Furthermore, a scale-up protocol (scale-up factor of 1000) for the enzymatic hydrolysis reactor was proposed. PMID:26650719
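
    The selection metric in this abstract, energetic efficiency as conversion divided by total stirring energy, is straightforward to compute. The sketch below uses entirely hypothetical conversion and power numbers (the paper's measured values and exact normalization may differ):

```python
def energetic_efficiency(conversion, power_w, hours):
    """Ratio of cellulose-to-glucose conversion (fraction) to total
    stirring energy in kWh. Illustrative definition only."""
    energy_kwh = power_w * hours / 1000.0
    return conversion / energy_kwh

# Hypothetical numbers for two impeller configurations over a 96 h run:
eff_dic1 = energetic_efficiency(0.60, power_w=6.0, hours=96)  # dual Rushton
eff_dic4 = energetic_efficiency(0.65, power_w=2.2, hours=96)  # dual elephant ear
```

With numbers of this shape, a configuration that draws far less power at similar conversion dominates the metric, which is the pattern the abstract reports for DIC4 versus DIC1.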

  19. Benign by design: catalyst-free in-water, on-water green chemical methodologies in organic synthesis.

    PubMed

    Gawande, Manoj B; Bonifácio, Vasco D B; Luque, Rafael; Branco, Paula S; Varma, Rajender S

    2013-06-21

    Catalyst-free reactions developed during the last decade and the latest developments in this emerging field are summarized, with a focus on catalyst-free reactions in-water and on-water. Various named reactions, multi-component reactions, and the synthesis of heterocyclic compounds are discussed, including the use of alternative energy inputs such as microwave and ultrasound irradiation. It is hoped that organic chemists in both academia and industry will continue to design benign methodologies for organic synthesis in aqueous media under catalyst-free conditions, using alternative energy inputs based on fundamental principles. PMID:23529409

  20. Design methodology of focusing elements for multilevel planar optical systems in optical interconnects

    NASA Astrophysics Data System (ADS)

    Al Hafiz, Md. Abdullah; MacKenzie, Mark R.; Kwok, Chee-Yee

    2009-12-01

    We present a simple technique for determining the design parameters of an optical interconnect system that uses integral planar lenses. The technique is based on the ABCD transformation matrix method. This analysis technique is significantly simpler and more efficient than previously published methods for finding the design parameters and predicting the coupling efficiency of the system. The proposed method is applied to compute the coupling efficiency of single- and two-level optical systems.
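
    The ABCD (ray transfer) matrix method named in this abstract composes 2x2 matrices for each optical element and propagates a Gaussian beam via the complex q-parameter, q' = (Aq + B)/(Cq + D). A minimal sketch, with illustrative wavelength, waist, and distances not taken from the paper:

```python
import numpy as np

def free_space(d):
    """ABCD matrix for propagation over distance d."""
    return np.array([[1.0, d], [0.0, 1.0]])

def thin_lens(f):
    """ABCD matrix for a thin lens of focal length f."""
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

def propagate_q(q_in, M):
    """Transform the complex beam parameter: q' = (A q + B)/(C q + D)."""
    (A, B), (C, D) = M
    return (A * q_in + B) / (C * q_in + D)

# Illustrative example: waist at input plane, lens 100 um away, observe 200 um later.
wavelength = 1.55e-6                       # 1550 nm, typical for interconnects
w0 = 5e-6                                  # 5 um input waist
zR = np.pi * w0**2 / wavelength            # Rayleigh range
q0 = 1j * zR                               # q-parameter at the waist

# Rightmost matrix acts first, so compose right-to-left.
M = free_space(200e-6) @ thin_lens(100e-6) @ free_space(100e-6)
q_out = propagate_q(q0, M)

# Recover the spot size from Im(1/q) = -lambda / (pi w^2).
w_out = np.sqrt(-wavelength / (np.pi * np.imag(1.0 / q_out)))
```

Knowing the output spot size and wavefront curvature at the receiving element is what lets such a method predict coupling efficiency via mode overlap.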