Reliability based design optimization: Formulations and methodologies
NASA Astrophysics Data System (ADS)
Agarwal, Harish
Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.
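As a point of reference for the formulations being compared, the block below states the standard nested RBDO problem in generic textbook notation (the symbols are assumptions for illustration, not taken from the thesis); a unilevel method replaces the inner reliability search by its first-order optimality conditions so that design and reliability variables are found in a single loop.

```latex
% Standard nested RBDO formulation (generic notation, assumed for illustration):
% d = design variables, u = standard-normal image of the random variables X,
% beta_{t,i} = target reliability index for probabilistic constraint g_i.
\begin{align}
\min_{d}\quad & f(d) \\
\text{s.t.}\quad & \beta_i(d) \ \ge\ \beta_{t,i}, \qquad i = 1,\dots,m, \\
\text{where}\quad & \beta_i(d) \;=\; \min_{u}\ \|u\| \quad \text{s.t.}\ g_i\bigl(d, x(u)\bigr) = 0 .
\end{align}
% A unilevel reformulation replaces each inner minimization by its KKT conditions,
% eliminating the nested reliability loop.
```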
2017-11-01
ARL-TR-8225 ● NOV 2017 ● US Army Research Laboratory. Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques.
Memristor-Based Computing Architecture: Design Methodologies and Circuit Techniques
2013-03-01
Memristor-Based Computing Architecture: Design Methodologies and Circuit Techniques. Polytechnic Institute of New York University; technical report, dates covered Oct 2010 – Oct 2012. ...schemes for a memristor-based reconfigurable architecture design have not been fully explored yet. Therefore, in this project, we investigated...
Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria
NASA Astrophysics Data System (ADS)
Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong
2017-08-01
In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for reliability analysis and sensitivity analysis of complex components with arbitrary distribution parameters is investigated by using the perturbation method, the response surface method, the Edgeworth series and the sensitivity analysis approach. The RSBDO methodology is then established by incorporating a sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. By comparison with Monte Carlo simulation, the numerical results demonstrate that the proposed methodology provides an accurate, convergent and computationally efficient method for reliability-analysis-based finite element modeling in engineering practice.
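As a rough sketch of the moment-based quantities such a reliability-sensitivity criterion rests on (standard second-moment notation, not reproduced from the paper), the reliability index and its sensitivity to a distribution parameter can be written as:

```latex
% Second-moment reliability index for a limit state g(X) and its sensitivity
% to a distribution parameter theta_k (generic notation, assumed):
\beta = \frac{\mu_g}{\sigma_g}, \qquad P_f \approx \Phi(-\beta), \qquad
\frac{\partial \beta}{\partial \theta_k}
  = \frac{1}{\sigma_g}\,\frac{\partial \mu_g}{\partial \theta_k}
  - \frac{\mu_g}{\sigma_g^{2}}\,\frac{\partial \sigma_g}{\partial \theta_k}.
```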
Designing a Strategic Plan through an Emerging Knowledge Generation Process: The ATM Experience
ERIC Educational Resources Information Center
Zanotti, Francesco
2012-01-01
Purpose: The aim of this contribution is to describe a new methodology for designing strategic plans and how it was implemented by ATM, a public transportation agency based in Milan, Italy. Design/methodology/approach: This methodology is founded on a new system theory, called "quantum systemics". It is based on models and metaphors both…
Towards Methodologies for Building Knowledge-Based Instructional Systems.
ERIC Educational Resources Information Center
Duchastel, Philippe
1992-01-01
Examines the processes involved in building instructional systems that are based on artificial intelligence and hypermedia technologies. Traditional instructional systems design methodology is discussed; design issues including system architecture and learning strategies are addressed; and a new methodology for building knowledge-based…
Experimental Design of a UCAV-Based High-Energy Laser Weapon
2016-12-01
propagation. The Design of Experiments (DOE) methodology is then applied to determine the significance of the UCAV-HEL design parameters and their effect on the...
ERIC Educational Resources Information Center
Khalil, Deena; Kier, Meredith
2017-01-01
This article is about introducing Critical Race Design (CRD), a research methodology that centers race and equity at the nucleus of educational opportunities by design. First, the authors define design-based implementation research (DBIR; Penuel, Fishman, Cheng, & Sabelli, 2011) as an equity-oriented education research methodology where…
Design-Based Research: Is This a Suitable Methodology for Short-Term Projects?
ERIC Educational Resources Information Center
Pool, Jessica; Laubscher, Dorothy
2016-01-01
This article reports on a design-based methodology of a thesis in which a fully face-to-face contact module was converted into a blended learning course. The purpose of the article is to report on how design-based phases, in the form of micro-, meso- and macro-cycles were applied to improve practice and to generate design principles. Design-based…
ERIC Educational Resources Information Center
Tutlys, Vidmantas; Spöttl, Georg
2017-01-01
Purpose: This paper aims to explore methodological and institutional challenges on application of the work-process analysis approach in the design and development of competence-based occupational standards for Lithuania. Design/methodology/approach: The theoretical analysis is based on the review of scientific literature and the analysis of…
Using Design-Based Research in Gifted Education
ERIC Educational Resources Information Center
Jen, Enyi; Moon, Sidney; Samarapungavan, Ala
2015-01-01
Design-based research (DBR) is a new methodological framework that was developed in the context of the learning sciences; however, it has not been used very often in the field of gifted education. Compared with other methodologies, DBR is more process-oriented and context-sensitive. In this methodological brief, the authors introduce DBR and…
Design and analysis of sustainable computer mouse using design for disassembly methodology
NASA Astrophysics Data System (ADS)
Roni Sahroni, Taufik; Fitri Sukarman, Ahmad; Agung Mahardini, Karunia
2017-12-01
This paper presents the design and analysis of a computer mouse using the Design for Disassembly methodology. The existing computer mouse model consists of a number of unnecessary parts that increase assembly and disassembly time in production. The objective of this project is to design a new computer mouse based on the Design for Disassembly (DFD) methodology. The methodology of this paper proceeds from sketch generation through concept selection and concept scoring. Based on the design screening, design concept B was selected for further analysis. A new computer mouse design using a fastening system is proposed. Furthermore, three materials, ABS, polycarbonate, and high-density PE, were evaluated to determine the environmental impact category. Sustainability analysis was conducted using SolidWorks software. As a result, high-density PE gives the lowest values in the environmental categories while retaining a high maximum stress value.
A Method for Co-Designing Theory-Based Behaviour Change Systems for Health Promotion.
Janols, Rebecka; Lindgren, Helena
2017-01-01
A methodology was defined and developed for designing theory-based behaviour change systems for health promotion that can be tailored to the individual. Theories from two research fields were combined with a participatory action research methodology. Two case studies applying the methodology were conducted. During and between group sessions, the participants created material and designs following the behaviour change strategy themes, which were discussed, analysed and transformed into a design of a behaviour change system. Theories in behavioural change and persuasive technology guided the data collection, data analyses, and the design of a behaviour change system. The methodology places strong emphasis on the target group's participation in the design process. The different aspects brought forward relate to behaviour change strategies defined in the literature on persuasive technology, and their dynamics are associated with needs and motivation defined in the literature on behaviour change. It was concluded that the methodology aids the integration of theories into a participatory action research design process, and aids the analysis and motivation of design choices.
ERIC Educational Resources Information Center
Cochrane, Todd; Davis, Niki; Morrow, Donna
2013-01-01
A methodology for design based research (DBR) into effective development and use of Multi-User Virtual Environments (MUVE) in vocational education is proposed. It blends software development with DBR with two theories selected to inform the methodology. Legitimate peripheral participation LPP (Lave & Wenger, 1991) provides a filter when…
NASA Astrophysics Data System (ADS)
Eimori, Takahisa; Anami, Kenji; Yoshimatsu, Norifumi; Hasebe, Tetsuya; Murakami, Kazuaki
2014-01-01
A comprehensive design optimization methodology using intuitive nondimensional parameters of inversion-level and saturation-level is proposed, especially for ultralow-power, low-voltage, and high-performance analog circuits with mixed strong, moderate, and weak inversion metal-oxide-semiconductor transistor (MOST) operations. This methodology is based on the synthesized charge-based MOST model composed of Enz-Krummenacher-Vittoz (EKV) basic concepts and advanced-compact-model (ACM) physics-based equations. The key concept of this methodology is that all circuit and system characteristics are described as some multivariate functions of inversion-level parameters, where the inversion level is used as an independent variable representative of each MOST. The analog circuit design starts from the first step of inversion-level design using universal characteristics expressed by circuit currents and inversion-level parameters without process-dependent parameters, followed by the second step of foundry-process-dependent design and the last step of verification using saturation-level criteria. This methodology also paves the way to an intuitive and comprehensive design approach for many kinds of analog circuit specifications by optimization using inversion-level log-scale diagrams and saturation-level criteria. In this paper, we introduce an example of our design methodology for a two-stage Miller amplifier.
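For readers unfamiliar with inversion-level parameters, the short sketch below tabulates transconductance efficiency versus inversion coefficient using a generic EKV-style weak/strong-inversion interpolation; this is an illustrative stand-in, not the synthesized EKV/ACM model described in the paper.

```python
import math

def gm_over_id(ic, n=1.3, ut=0.0258):
    """Transconductance efficiency gm/ID [1/V] versus inversion coefficient IC,
    using a common EKV-style interpolation (an assumption for illustration,
    not the paper's synthesized charge-based MOST model)."""
    return (2.0 / (1.0 + math.sqrt(1.0 + 4.0 * ic))) / (n * ut)

# Sweep from weak (IC << 1) to strong (IC >> 1) inversion; this is the kind of
# process-independent, universal curve used in the first design step described
# in the abstract, before foundry-dependent sizing.
for ic in (0.01, 0.1, 1.0, 10.0, 100.0):
    print(f"IC = {ic:7.2f}  ->  gm/ID = {gm_over_id(ic):5.1f} 1/V")
```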
2016-06-01
characteristics, experimental design techniques, and analysis methodologies that distinguish each phase of the MBSE MEASA. To ensure consistency... Experimental design selection, simulation analysis, and trade space analysis support the final two stages. Figure 27 segments the MBSE MEASA... Rounding has the potential to increase the correlation between columns of the experimental design matrix. The design methodology presented in Vieira...
Solar energy program evaluation: an introduction
DOE Office of Scientific and Technical Information (OSTI.GOV)
deLeon, P.
The Program Evaluation Methodology provides an overview of the practice and methodology of program evaluation and defines more precisely the evaluation techniques and methodologies that would be most appropriate to government organizations which are actively involved in the research, development, and commercialization of solar energy systems. Formal evaluation cannot be treated as a single methodological approach for assessing a program. There are four basic types of evaluation designs - the pre-experimental design; the quasi-experimental design based on time series; the quasi-experimental design based on comparison groups; and the true experimental design. This report is organized to first introduce the role and issues of evaluation. This is to provide a set of issues to organize the subsequent sections detailing the national solar energy programs. Then, these two themes are integrated by examining the evaluation strategies and methodologies tailored to fit the particular needs of the various individual solar energy programs. (MCW)
Mechanistic-empirical Pavement Design Guide Implementation
DOT National Transportation Integrated Search
2010-06-01
The recently introduced Mechanistic-Empirical Pavement Design Guide (MEPDG) and its associated computer software provide a state-of-practice mechanistic-empirical highway pavement design methodology. The MEPDG methodology is based on pavement responses ...
Transportation Energy Conservation Data Book: A Selected Bibliography. Edition 3,
1978-11-01
Charlottesville, VA 22901. TITLE: Computer-Based Resource Accounting Model for Automobile Technology Impact Evaluation System (PIES) Documentation, Volume 6. TITLE: Methodology for the Design of Urban Transportation... TITLE: Energy Flows in the U.S., 1973 and 1974, Volume 1: Methodology. Update to the National Energy...
A Framework for Preliminary Design of Aircraft Structures Based on Process Information. Part 1
NASA Technical Reports Server (NTRS)
Rais-Rohani, Masoud
1998-01-01
This report discusses the general framework and development of a computational tool for preliminary design of aircraft structures based on process information. The described methodology is suitable for multidisciplinary design optimization (MDO) activities associated with integrated product and process development (IPPD). The framework consists of three parts: (1) product and process definitions; (2) engineering synthesis; and (3) optimization. The product and process definitions are part of input information provided by the design team. The backbone of the system is its ability to analyze a given structural design for performance as well as manufacturability and cost assessment. The system uses a database on material systems and manufacturing processes. Based on the identified set of design variables and an objective function, the system is capable of performing optimization subject to manufacturability, cost, and performance constraints. The accuracy of the manufacturability measures and cost models discussed here depends largely on the available data on specific methods of manufacture and assembly and associated labor requirements. As such, our focus in this research has been on the methodology itself and not so much on its accurate implementation in an industrial setting. A three-tier approach is presented for an IPPD-MDO based design of aircraft structures. The variable-complexity cost estimation methodology and an approach for integrating manufacturing cost assessment into the design process are also discussed. This report is presented in two parts. In the first part, the design methodology is presented, and the computational design tool is described. In the second part, a prototype model of the preliminary design Tool for Aircraft Structures based on Process Information (TASPI) is described. Part two also contains an example problem that applies the methodology described here for evaluation of six different design concepts for a wing spar.
Design, Development and Analysis of Centrifugal Blower
NASA Astrophysics Data System (ADS)
Baloni, Beena Devendra; Channiwala, Salim Abbasbhai; Harsha, Sugnanam Naga Ramannath
2018-06-01
Centrifugal blowers are widely used turbomachines in modern industrial and domestic applications. Manufacturing of blowers seldom follows an optimum design solution for the individual blower. Although centrifugal blowers have developed into highly efficient machines, design is still based on various empirical and semi-empirical rules proposed by fan designers. Different methodologies are used to design the impeller and other components of blowers. The objective of the present study is to examine explicit design methodologies and trace a unified design to obtain better design-point performance. This unified design methodology is based more on fundamental concepts and minimum assumptions. A parametric study is also carried out for the effect of design parameters on pressure ratio and their interdependency in the design. A code based on the unified design is developed in the C programming language. Numerical analysis is carried out to check the flow parameters inside the blower. Two blowers, one based on the present design and the other on an industrial design, were developed with a standard OEM blower manufacturing unit. A comparison of both designs is made based on experimental performance analysis as per IS standards. The results suggest better efficiency and a higher flow rate for the same pressure head for the present design compared with the industrial one.
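One of the fundamental relations a unified blower design of this kind typically starts from is the Euler turbomachinery equation; the statement below uses generic notation and is not reproduced from the paper.

```latex
% Euler work / ideal pressure rise across a centrifugal impeller (generic notation):
\Delta p_{\text{ideal}} = \rho \left( U_2\, C_{u2} - U_1\, C_{u1} \right),
\qquad
\eta = \frac{\Delta p_{\text{actual}}}{\Delta p_{\text{ideal}}},
% where U is the blade speed, C_u the tangential component of the absolute velocity,
% and subscripts 1 and 2 denote the impeller inlet and exit.
```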
A Step-by-Step Design Methodology for a Base Case Vanadium Redox-Flow Battery
ERIC Educational Resources Information Center
Moore, Mark; Counce, Robert M.; Watson, Jack S.; Zawodzinski, Thomas A.; Kamath, Haresh
2012-01-01
The purpose of this work is to develop an evolutionary procedure to be used by Chemical Engineering students for the base-case design of a Vanadium Redox-Flow Battery. The design methodology is based on the work of Douglas (1985) and provides a profitability analysis at each decision level so that more profitable alternatives and directions can be…
A novel methodology for building robust design rules by using design based metrology (DBM)
NASA Astrophysics Data System (ADS)
Lee, Myeongdong; Choi, Seiryung; Choi, Jinwoo; Kim, Jeahyun; Sung, Hyunju; Yeo, Hyunyoung; Shim, Myoungseob; Jin, Gyoyoung; Chung, Eunseung; Roh, Yonghan
2013-03-01
This paper addresses a methodology for building robust design rules by using design based metrology (DBM). The conventional method for building design rules has relied on a simulation tool and a simple-pattern spider mask. At the early stage of a device, the estimates of the simulation tool are poor, and the evaluation of the simple-pattern spider mask is rather subjective because it depends on the experiential judgment of an engineer. In this work, we designed a large number of pattern situations including various 1D and 2D design structures. In order to overcome the difficulties of inspecting many types of patterns, we introduced Design Based Metrology (DBM) from Nano Geometry Research, Inc., with which these mass patterns could be inspected at high speed. We also carried out quantitative analysis on PWQ silicon data to estimate process variability. Our methodology demonstrates high speed and accuracy for building design rules. All of the test patterns were inspected within a few hours. Mass silicon data were handled by statistical processing rather than personal judgment. From the results, robust design rules were successfully verified and extracted. Finally, we found that our methodology is appropriate for building robust design rules.
Design and Diagnosis Problem Solving with Multifunctional Technical Knowledge Bases
1992-09-29
Design problem solving is a complex activity involving a number of subtasks and a number of alternative methods potentially available... Conference on Artificial Intelligence. London: The British Computer Society, pp. 621-633. Friedland, P. (1979). Knowledge-based experimental design... General Terms: Design, Methodology.
TRAC Innovative Visualization Techniques
2016-11-14
Therefore, TRAC analysts need a way to analyze the effectiveness of their visualization design choices. Currently, TRAC does not have a methodology... to analyze visualizations used to support an analysis story. Our research team developed a visualization design methodology to create effective visualizations that support an analysis story. First, we based our methodology on the latest research on design thinking, cognitive learning, and...
Enhancing the Front-End Phase of Design Methodology
ERIC Educational Resources Information Center
Elias, Erasto
2006-01-01
Design methodology (DM) is defined by the procedural path, expressed in design models, and techniques or methods used to untangle the various activities within a design model. Design education in universities is mainly based on descriptive design models. Much knowledge and organization have been built into DM to facilitate design teaching.…
Methodological convergence of program evaluation designs.
Chacón-Moscoso, Salvador; Anguera, M Teresa; Sanduvete-Chaves, Susana; Sánchez-Martín, Milagrosa
2014-01-01
Nowadays, the confronting dichotomous view between experimental/quasi-experimental and non-experimental/ethnographic studies still exists but, despite the extensive use of non-experimental/ethnographic studies, the most systematic work on methodological quality has been developed based on experimental and quasi-experimental studies. This hinders evaluators and planners' practice of empirical program evaluation, a sphere in which the distinction between types of study is changing continually and is less clear. Based on the classical validity framework of experimental/quasi-experimental studies, we carry out a review of the literature in order to analyze the convergence of design elements in methodological quality in primary studies in systematic reviews and ethnographic research. We specify the relevant design elements that should be taken into account in order to improve validity and generalization in program evaluation practice in different methodologies from a practical methodological and complementary view. We recommend ways to improve design elements so as to enhance validity and generalization in program evaluation practice.
NASA Technical Reports Server (NTRS)
Maghami, Peiman G.; Gupta, Sandeep; Elliott, Kenny B.; Joshi, Suresh M.; Walz, Joseph E.
1994-01-01
This paper describes the first experimental validation of an optimization-based integrated controls-structures design methodology for a class of flexible space structures. The Controls-Structures-Interaction (CSI) Evolutionary Model, a laboratory test bed at Langley, is redesigned based on the integrated design methodology with two different dissipative control strategies. The redesigned structure is fabricated, assembled in the laboratory, and experimentally compared with the original test structure. Design guides are proposed and used in the integrated design process to ensure that the resulting structure can be fabricated. Experimental results indicate that the integrated design requires greater than 60 percent less average control power (by thruster actuators) than the conventional control-optimized design while maintaining the required line-of-sight performance, thereby confirming the analytical findings about the superiority of the integrated design methodology. Amenability of the integrated design structure to other control strategies is considered and evaluated analytically and experimentally. This work also demonstrates the capabilities of the Langley-developed design tool CSI DESIGN which provides a unified environment for structural and control design.
A multi-criteria decision aid methodology to design electric vehicles public charging networks
NASA Astrophysics Data System (ADS)
Raposo, João; Rodrigues, Ana; Silva, Carlos; Dentinho, Tomaz
2015-05-01
This article presents a new multi-criteria decision aid methodology, dynamic-PROMETHEE, here used to design electric vehicle charging networks. When the methodology is applied to a Portuguese city, the results suggest that it is effective in designing electric vehicle charging networks, generating time- and policy-based scenarios, and considering supply and demand as well as the city's urban structure. Dynamic-PROMETHEE adds to PROMETHEE's already known characteristics other useful features, such as decision memory over time, versatility and adaptability. The case study, used here to present the dynamic-PROMETHEE, served as the inspiration and basis for creating this new methodology. It can be used to model different problems and scenarios that may present similar requirement characteristics.
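For background on the underlying outranking method, the sketch below implements a plain PROMETHEE II net-flow ranking with the "usual" preference function; the dynamic extension described in the article is not reproduced, and the alternatives, criteria, and weights are made-up placeholders.

```python
import numpy as np

def promethee_ii(scores, weights, maximize):
    """Plain PROMETHEE II net outranking flows.
    scores: (n_alternatives, n_criteria) performance table
    weights: criterion weights summing to 1
    maximize: per-criterion flags (True = larger is better)
    Uses the 'usual' preference function P(d) = 1 if d > 0 else 0."""
    n, m = scores.shape
    pi = np.zeros((n, n))                         # aggregated preference of a over b
    for k in range(m):
        col = scores[:, k] if maximize[k] else -scores[:, k]
        d = col[:, None] - col[None, :]           # pairwise differences
        pi += weights[k] * (d > 0).astype(float)  # usual criterion
    phi_plus = pi.sum(axis=1) / (n - 1)           # leaving flow
    phi_minus = pi.sum(axis=0) / (n - 1)          # entering flow
    return phi_plus - phi_minus                   # net flow (higher = better)

# Hypothetical charging-station siting example: three candidate sites scored on
# demand coverage (maximize), installation cost (minimize), and grid capacity (maximize).
scores = np.array([[0.8, 120.0, 0.6],
                   [0.6,  80.0, 0.7],
                   [0.9, 150.0, 0.4]])
weights = np.array([0.5, 0.3, 0.2])
print(promethee_ii(scores, weights, maximize=[True, False, True]))
```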
A top-down design methodology and its implementation for VCSEL-based optical links design
NASA Astrophysics Data System (ADS)
Li, Jiguang; Cao, Mingcui; Cai, Zilong
2005-01-01
In order to find the optimal design for a given specification of an optical communication link, an integrated simulation of the electronic, optoelectronic, and optical components of a complete system is required. It is very important to be able to simulate at both the system level and the detailed model level. This kind of model is feasible thanks to the high potential of the Verilog-AMS language. In this paper, we propose an effective top-down design methodology and employ it in the development of a complete VCSEL-based optical link simulation. The principle of the top-down methodology is that development proceeds from the system level to the device level. To design a hierarchical model for VCSEL-based optical links, the design framework is organized in three levels of hierarchy. The models are developed and implemented in Verilog-AMS, and the model parameters are fitted to measured data. A sample transient simulation demonstrates the functioning of our implementation. Suggestions for future directions in top-down methodology for optoelectronic systems technology are also presented.
Haptic Technologies for MEMS Design
NASA Astrophysics Data System (ADS)
Calis, Mustafa; Desmulliez, Marc P. Y.
2006-04-01
This paper presents for the first time a design methodology for MEMS/NEMS based on haptic sensing technologies. The software tool created as a result of this methodology will enable designers to model and interact in real time with their virtual prototype. One of the main advantages of haptic sensing is the ability to bring unusual microscopic forces back to the designer's world. Other significant benefits of developing such a methodology include gains in productivity and the capability to include manufacturing costs within the design cycle.
Roetzheim, Richard G.; Freund, Karen M.; Corle, Don K.; Murray, David M.; Snyder, Frederick R.; Kronman, Andrea C.; Jean-Pierre, Pascal; Raich, Peter C.; Holden, Alan E. C.; Darnell, Julie S.; Warren-Mears, Victoria; Patierno, Steven; PNRP Design and Analysis Committee
2013-01-01
Background: The Patient Navigation Research Program (PNRP) is a cooperative effort of nine research projects, each employing its own unique study design. To evaluate projects such as PNRP, it is desirable to perform a pooled analysis to increase power relative to the individual projects. There is no agreed upon prospective methodology, however, for analyzing combined data arising from different study designs. Expert opinions were thus solicited from members of the PNRP Design and Analysis Committee. Purpose: To review possible methodologies for analyzing combined data arising from heterogeneous study designs. Methods: The Design and Analysis Committee critically reviewed the pros and cons of five potential methods for analyzing combined PNRP project data. Conclusions were based on simple consensus. The five approaches reviewed included: 1) analyzing and reporting each project separately; 2) combining data from all projects and performing an individual-level analysis; 3) pooling data from projects having similar study designs; 4) analyzing pooled data using a prospective meta-analytic technique; 5) analyzing pooled data utilizing a novel simulated group randomized design. Results: Methodologies varied in their ability to incorporate data from all PNRP projects, to appropriately account for differing study designs, and in their impact from differing project sample sizes. Limitations: The conclusions reached were based on expert opinion and not derived from actual analyses performed. Conclusions: The ability to analyze pooled data arising from differing study designs may provide pertinent information to inform programmatic, budgetary, and policy perspectives. Multi-site community-based research may not lend itself well to the more stringent explanatory and pragmatic standards of a randomized controlled trial design. Given our growing interest in community-based population research, the challenges inherent in the analysis of heterogeneous study designs are likely to become more salient. Discussion of the analytic issues faced by the PNRP and the methodological approaches we considered may be of value to other prospective community-based research programs. PMID:22273587
NASA Technical Reports Server (NTRS)
Dominick, Wayne D. (Editor); Farooq, Mohammad U.
1986-01-01
The definition of proposed research addressing the development and validation of a methodology for the design and evaluation of user interfaces for interactive information systems is given. The major objectives of this research are: the development of a comprehensive, objective, and generalizable methodology for the design and evaluation of user interfaces for information systems; the development of equations and/or analytical models to characterize user behavior and the performance of a designed interface; the design of a prototype system for the development and administration of user interfaces; and the design and use of controlled experiments to support the research and test/validate the proposed methodology. The proposed design methodology views the user interface as a virtual machine composed of three layers: an interactive layer, a dialogue manager layer, and an application interface layer. A command language model of user system interactions is presented because of its inherent simplicity and structured approach based on interaction events. All interaction events have a common structure based on common generic elements necessary for a successful dialogue. It is shown that, using this model, various types of interfaces could be designed and implemented to accommodate various categories of users. The implementation methodology is discussed in terms of how to store and organize the information.
Electronic Design Automation: Integrating the Design and Manufacturing Functions
NASA Technical Reports Server (NTRS)
Bachnak, Rafic; Salkowski, Charles
1997-01-01
As the complexity of electronic systems grows, the traditional design practice, a sequential process, is replaced by concurrent design methodologies. A major advantage of concurrent design is that the feedback from software and manufacturing engineers can be easily incorporated into the design. The implementation of concurrent engineering methodologies is greatly facilitated by employing the latest Electronic Design Automation (EDA) tools. These tools offer integrated simulation of the electrical, mechanical, and manufacturing functions and support virtual prototyping, rapid prototyping, and hardware-software co-design. This report presents recommendations for enhancing the electronic design and manufacturing capabilities and procedures at JSC based on a concurrent design methodology that employs EDA tools.
A Physics-Based Approach for Power Integrity in Multi-Layered PCBs
NASA Astrophysics Data System (ADS)
Zhao, Biyao
Developing a power distribution network (PDN) for ASICs and ICs that meets the low-voltage ripple specifications of current digital designs is challenging for high-speed, low-voltage ICs. Present methods are typically guided by best engineering practices for achieving a low impedance looking into the PDN from the IC. A pre-layout design methodology for power integrity in multi-layered PCB PDN geometry is proposed in this thesis. The PCB PDN geometry is segmented into four parts and every part is modelled using different methods based on the geometry details of the part. Physics-based circuit models are built for every part and the four parts are re-assembled into one model. The influence of geometry details is clearly revealed in this methodology. Based on the physics-based circuit model, the procedure for using the pre-layout design methodology as a guideline during PDN design is illustrated. Some commonly used geometries are used to build the design space, and design curves with the geometry details are provided as a lookup library for engineering use. The pre-layout methodology is based on the resonant cavity model of parallel planes for the cavity structures, parallel-plane PEEC (PPP) for the irregularly shaped plane inductance, and PEEC for the decoupling capacitor connection above the top-most or bottom-most power-return planes. The PCB PDN is analyzed based on the input impedance looking into the PCB from the IC. The pre-layout design methodology can be used to obtain the best possible PCB PDN design. With the switching current profile, the target impedance can be selected to evaluate the PDN performance, and the frequency-domain PDN input impedance can be used to obtain the voltage ripple in the time domain, giving intuitive insight into the impact of the geometry on the voltage ripple.
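The target-impedance criterion mentioned at the end of the abstract is commonly stated as the rule of thumb below; the symbols are generic and not drawn from the thesis.

```latex
% Target impedance for a PDN (a common rule of thumb, assumed notation):
Z_{\text{target}} = \frac{V_{dd}\, r}{\Delta I_{\max}},
% where r is the allowed ripple fraction (e.g. 0.05) and Delta I_max the worst-case
% transient current step; the design goal is |Z_{PDN}(f)| \le Z_{\text{target}}
% over the frequency band occupied by the switching-current spectrum.
```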
Optimization-based controller design for rotorcraft
NASA Technical Reports Server (NTRS)
Tsing, N.-K.; Fan, M. K. H.; Barlow, J.; Tits, A. L.; Tischler, M. B.
1993-01-01
An optimization-based methodology for linear control system design is outlined by considering the design of a controller for a UH-60 rotorcraft in hover. A wide range of design specifications is taken into account: internal stability, decoupling between longitudinal and lateral motions, handling qualities, and rejection of wind gusts. These specifications are investigated while taking into account physical limitations in the swashplate displacements and rates of displacement. The methodology crucially relies on user-machine interaction for tradeoff exploration.
Advanced piloted aircraft flight control system design methodology. Volume 1: Knowledge base
NASA Technical Reports Server (NTRS)
Mcruer, Duane T.; Myers, Thomas T.
1988-01-01
The development of a comprehensive and eclectic methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. While theory and associated computational means are an important aspect of the design methodology, the lore, knowledge and experience elements, which guide and govern applications, are critical features. This material is presented as summary tables, outlines, recipes, empirical data, lists, etc., which encapsulate a great deal of expert knowledge. Much of this is presented in topical knowledge summaries which are attached as Supplements. The composite of the supplements and the main body elements constitutes a first cut at a Mark 1 Knowledge Base for manned-aircraft flight control.
Debrus, Benjamin; Lebrun, Pierre; Ceccato, Attilio; Caliaro, Gabriel; Rozet, Eric; Nistor, Iolanda; Oprean, Radu; Rupérez, Francisco J; Barbas, Coral; Boulanger, Bruno; Hubert, Philippe
2011-04-08
HPLC separations of an unknown sample mixture and a pharmaceutical formulation have been optimized using a recently developed chemometric methodology proposed by W. Dewé et al. in 2004 and improved by P. Lebrun et al. in 2008. This methodology is based on experimental designs which are used to model retention times of compounds of interest. Then, the prediction accuracy and the robustness of the optimal separation, including an uncertainty study, were evaluated. Finally, the design space (ICH Q8(R1) guideline) was computed as the probability for a criterion to lie in a selected range of acceptance. Furthermore, the chromatograms were read automatically; peak detection and peak matching were carried out with a previously developed methodology using independent component analysis published by B. Debrus et al. in 2009. These successful applications underscore the high potential of these methodologies for the automated development of chromatographic methods. Copyright © 2011 Elsevier B.V. All rights reserved.
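The design-space computation described (the probability that a separation criterion lies in an acceptance range, per ICH Q8) can be illustrated with the minimal Monte Carlo sketch below; the response-surface model, its coefficient uncertainty, and the acceptance limit are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def predicted_resolution(ph, gradient_time, theta):
    """Toy quadratic response-surface model for the resolution of the critical
    peak pair as a function of two method factors (illustrative only)."""
    a, b, c, d, e = theta
    return a + b * ph + c * gradient_time + d * ph**2 + e * ph * gradient_time

def acceptance_probability(ph, gradient_time, theta_mean, theta_cov,
                           limit=1.5, n_draws=5000):
    """Probability that resolution >= limit, propagating the uncertainty of the
    fitted model coefficients (multivariate-normal assumption)."""
    draws = rng.multivariate_normal(theta_mean, theta_cov, size=n_draws)
    rs = np.array([predicted_resolution(ph, gradient_time, t) for t in draws])
    return (rs >= limit).mean()

# Hypothetical fitted coefficients and covariance from a designed experiment.
theta_mean = np.array([-4.0, 1.2, 0.08, -0.08, -0.002])
theta_cov = np.diag([0.05, 0.01, 1e-4, 1e-4, 1e-6])

# Map the probability over a grid of operating conditions; the design space is
# the region where this probability exceeds a chosen threshold (e.g. 0.9).
for ph in (4.0, 6.0, 8.0):
    for tg in (10.0, 20.0, 30.0):
        p = acceptance_probability(ph, tg, theta_mean, theta_cov)
        print(f"pH={ph:3.1f}, tG={tg:4.1f} min -> P(Rs >= 1.5) = {p:.2f}")
```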
Assessing Consequential Scenarios in a Complex Operational Environment Using Agent Based Simulation
2017-03-16
Conflict Modeling, Planning, and Outcomes Experimentation Program (COMPOEX); Joint Non-Kinetic Effects Model (JNEM)/Athena... experimental design and testing. Types and Attributes of Agent-Based Model Design Patterns: using the aforementioned ABM flowchart design methodology... speed, or flexibility during tactical US Army wargaming. The report considers methodologies to improve analysis of the human domain, identifies...
1983-12-30
AD-A146 577, NSWC TR 83-324: Architecture, Design, and System Performance Assessment and Development Methodology. Naval Surface Weapons Center, Silver Spring, MD.
One Controller at a Time (1-CAT): A mimo design methodology
NASA Technical Reports Server (NTRS)
Mitchell, J. R.; Lucas, J. C.
1987-01-01
The One Controller at a Time (1-CAT) methodology for designing digital controllers for Large Space Structures (LSS's) is introduced and illustrated. The flexible mode problem is first discussed. Next, desirable features of a LSS control system design methodology are delineated. The 1-CAT approach is presented, along with an analytical technique for carrying out the 1-CAT process. Next, 1-CAT is used to design digital controllers for the proposed Space Based Laser (SBL). Finally, the SBL design is evaluated for dynamical performance, noise rejection, and robustness.
Information System Design Methodology Based on PERT/CPM Networking and Optimization Techniques.
ERIC Educational Resources Information Center
Bose, Anindya
The dissertation attempts to demonstrate that the program evaluation and review technique (PERT)/Critical Path Method (CPM) or some modified version thereof can be developed into an information system design methodology. The methodology utilizes PERT/CPM which isolates the basic functional units of a system and sets them in a dynamic time/cost…
ERIC Educational Resources Information Center
Hall, Wayne; Palmer, Stuart; Bennett, Mitchell
2012-01-01
Project-based learning (PBL) is a well-known student-centred methodology for engineering design education. The methodology claims to offer a number of educational benefits. This paper evaluates the student perceptions of the initial and second offering of a first-year design unit at Griffith University in Australia. It builds on an earlier…
Optimization Under Uncertainty for Electronics Cooling Design
NASA Astrophysics Data System (ADS)
Bodla, Karthik K.; Murthy, Jayathi Y.; Garimella, Suresh V.
Optimization under uncertainty is a powerful methodology used in design and optimization to produce robust, reliable designs. Such an optimization methodology, employed when the input quantities of interest are uncertain, produces output uncertainties, helping the designer choose input parameters that would result in satisfactory thermal solutions. Apart from providing basic statistical information such as mean and standard deviation in the output quantities, auxiliary data from an uncertainty based optimization, such as local and global sensitivities, help the designer decide the input parameter(s) to which the output quantity of interest is most sensitive. This helps the design of experiments based on the most sensitive input parameter(s). A further crucial output of such a methodology is the solution to the inverse problem - finding the allowable uncertainty range in the input parameter(s), given an acceptable uncertainty range in the output quantity of interest...
Methodological Issues in Research on Web-Based Behavioral Interventions
Danaher, Brian G; Seeley, John R
2013-01-01
Background: Web-based behavioral intervention research is rapidly growing. Purpose: We review methodological issues shared across Web-based intervention research to help inform future research in this area. Methods: We examine measures and their interpretation using exemplar studies and our own research. Results: We report on research designs used to evaluate Web-based interventions and recommend newer, blended designs. We review and critique methodological issues associated with recruitment, engagement, and social validity. Conclusions: We suggest that there is value to viewing this burgeoning realm of research from the broader context of behavior change research. We conclude that many studies use blended research designs, that innovative designs such as the Multiphase Optimization Strategy and Sequential Multiple Assignment Randomized Trial methods hold considerable promise and should be used more widely, and that Web-based controls should be used instead of usual care or no-treatment controls in public health research. We recommend topics for future research that address participant recruitment, engagement, and social validity. PMID:19806416
NASA Technical Reports Server (NTRS)
Wang, Ten-See; Canabal, Francisco; Chen, Yen-Sen; Cheng, Gary; Ito, Yasushi
2013-01-01
Nuclear thermal propulsion is a leading candidate for in-space propulsion for human Mars missions. This chapter describes a thermal hydraulics design and analysis methodology developed at the NASA Marshall Space Flight Center in support of the nuclear thermal propulsion development effort. The objective of this campaign is to bridge the design methods of the Rover/NERVA era with a modern computational fluid dynamics and heat transfer methodology, to predict the thermal, fluid, and hydrogen environments of a hypothetical solid-core nuclear thermal engine, the Small Engine, designed in the 1960s. The computational methodology is based on an unstructured-grid, pressure-based, all-speeds, chemically reacting, computational fluid dynamics and heat transfer platform, while formulations of flow and heat transfer through porous and solid media were implemented to describe those of the hydrogen flow channels inside the solid core. Design analyses of a single flow element and the entire solid-core thrust chamber of the Small Engine were performed and the results are presented herein.
NASA Technical Reports Server (NTRS)
Garg, Sanjay; Mattern, Duane L.; Bright, Michelle M.; Ouzts, Peter J.
1990-01-01
Results are presented from an application of H-infinity control design methodology to a centralized integrated flight/propulsion control (IFPC) system design for a supersonic Short Take-Off and Vertical Landing (STOVL) fighter aircraft in transition flight. The overall design methodology consists of a centralized IFPC controller design with controller partitioning. Only the feedback controller design portion of the methodology is addressed. Design and evaluation vehicle models are summarized, and insight is provided into formulating the H-infinity control problem such that it reflects the IFPC design objectives. The H-infinity controller is shown to provide decoupled command tracking for the design model. The controller order could be significantly reduced by modal residualization of the fast controller modes without any deterioration in performance. A discussion is presented of the areas in which the controller performance needs to be improved, and ways in which these improvements can be achieved within the framework of an H-infinity based linear control design.
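As background on how such design objectives are typically encoded in an H-infinity synthesis, the block below states a generic mixed-sensitivity problem; this is an assumed textbook form, not the specific weighting structure used in the study.

```latex
% Generic mixed-sensitivity H-infinity synthesis (assumed form):
\min_{K\ \text{stabilizing}}
\left\|
\begin{bmatrix}
W_1 S \\
W_2 K S \\
W_3 T
\end{bmatrix}
\right\|_{\infty},
\qquad
S = (I + G K)^{-1}, \quad T = G K\, (I + G K)^{-1},
% where W_1, W_2, W_3 are frequency-dependent weights shaping command tracking,
% control effort, and robustness/decoupling objectives, respectively.
```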
Methodology for designing psychological habitability for the space station.
Komastubara, A
2000-09-01
Psychological habitability is a critical quality issue for the International Space Station because poor habitability degrades performance shaping factors (PSFs) and increases human errors. However, habitability often receives rather limited design attention, based on someone's superficial tastes, because systematic design procedures for habitability quality are lacking. To improve the design treatment of psychological habitability, this paper proposes and discusses a design methodology for psychological habitability for the International Space Station.
Gu, Jing; Wang, Qi; Wang, Xiaogang; Li, Hailong; Gu, Mei; Ming, Haixia; Dong, Xiaoli; Yang, Kehu; Wu, Hongyan
2014-01-01
Background. This review provides the first methodological assessment of the protocols of acupuncture RCTs registered in the WHO International Clinical Trials Registry Platform (ICTRP). Methods. All records of acupuncture RCTs registered in the ICTRP were collected. The methodological design assessment examined whether the randomization methods, allocation concealment, and blinding were adequate, based on the information in the registration records (protocols of acupuncture RCTs). Results. A total of 453 records, found in 11 registries, were examined. Methodological details were insufficient in the registration records; 76.4%, 89.0%, and 21.4% of records did not provide information on randomization methods, allocation concealment, and blinding, respectively. The proportions of adequate randomization methods, allocation concealment, and blinding were only 107 (23.6%), 48 (10.6%), and 210 (46.4%), respectively. The methodological design improved year by year, especially after 2007. Additionally, the methodology of RCTs with ethics approval was clearly superior to that of those without ethics approval and differed among registries. Conclusions. The overall methodological design based on registration records of acupuncture RCTs is not very good but has improved year by year. The insufficient information on randomization methods, allocation concealment, and blinding may be because the relevant descriptions are not taken seriously in the registration of acupuncture RCTs. PMID:24688591
Impact of computational structure-based methods on drug discovery.
Reynolds, Charles H
2014-01-01
Structure-based drug design has become an indispensable tool in drug discovery. The emergence of structure-based design is due to gains in structural biology that have provided exponential growth in the number of protein crystal structures, new computational algorithms and approaches for modeling protein-ligand interactions, and the tremendous growth of raw computer power in the last 30 years. Computer modeling and simulation have made major contributions to the discovery of many groundbreaking drugs in recent years. Examples are presented that highlight the evolution of computational structure-based design methodology, and the impact of that methodology on drug discovery.
Object-oriented analysis and design: a methodology for modeling the computer-based patient record.
Egyhazy, C J; Eyestone, S M; Martino, J; Hodgson, C L
1998-08-01
The article highlights the importance of an object-oriented analysis and design (OOAD) methodology for the computer-based patient record (CPR) in the military environment. Many OOAD methodologies do not adequately scale up, allow for efficient reuse of their products, or accommodate legacy systems. A methodology that addresses these issues is formulated and used to demonstrate its applicability in a large-scale health care service system. During a period of 6 months, a team of object modelers and domain experts formulated an OOAD methodology tailored to the Department of Defense Military Health System and used it to produce components of an object model for simple order processing. This methodology and the lessons learned during its implementation are described. This approach is necessary to achieve broad interoperability among heterogeneous automated information systems.
Methodological Innovation in Practice-Based Design Doctorates
ERIC Educational Resources Information Center
Yee, Joyce S. R.
2010-01-01
This article presents a selective review of recent design PhDs that identify and analyse the methodological innovation that is occurring in the field, in order to inform future provision of research training. Six recently completed design PhDs are used to highlight possible philosophical and practical models that can be adopted by future PhD…
Design of an integrated airframe/propulsion control system architecture
NASA Technical Reports Server (NTRS)
Cohen, Gerald C.; Lee, C. William; Strickland, Michael J.; Torkelson, Thomas C.
1990-01-01
The design of an integrated airframe/propulsion control system architecture is described. The design is based on a prevalidation methodology that uses both reliability and performance. A detailed account is given for the testing associated with a subset of the architecture and concludes with general observations of applying the methodology to the architecture.
The methodology of database design in organization management systems
NASA Astrophysics Data System (ADS)
Chudinov, I. L.; Osipova, V. V.; Bobrova, Y. V.
2017-01-01
The paper describes a unified methodology of database design for management information systems. Designing the conceptual information model for the domain area is the most important and labor-intensive stage in database design. Based on the proposed integrated approach to design, the conceptual information model, the main principles of developing relational databases, and users' information needs are considered. According to the methodology, the process of designing the conceptual information model includes three basic stages, which are defined in detail. Finally, the article describes the process of presenting the results of the analysis of users' information needs and the rationale for the use of classifiers.
When Playing Meets Learning: Methodological Framework for Designing Educational Games
NASA Astrophysics Data System (ADS)
Linek, Stephanie B.; Schwarz, Daniel; Bopp, Matthias; Albert, Dietrich
Game-based learning builds upon the idea of using the motivational potential of video games in the educational context. Thus, the design of educational games has to address optimizing enjoyment as well as optimizing learning. Within the EC project ELEKTRA, a methodological framework for the conceptual design of educational games was developed. State-of-the-art psycho-pedagogical approaches were combined with insights from media psychology as well as with best-practice game design. This science-based interdisciplinary approach was enriched by accompanying empirical research to answer open questions on educational game design. Additionally, several evaluation cycles were implemented to achieve further improvements. The psycho-pedagogical core of the methodology can be summarized by ELEKTRA's 4Ms: Macroadaptivity, Microadaptivity, Metacognition, and Motivation. The conceptual framework is structured in eight phases which have several interconnections and feedback cycles that enable close interdisciplinary collaboration between game design, pedagogy, cognitive science and media psychology.
Bayesian experimental design for models with intractable likelihoods.
Drovandi, Christopher C; Pettitt, Anthony N
2013-12-01
In this paper we present a methodology for designing experiments for efficiently estimating the parameters of models with computationally intractable likelihoods. The approach combines a commonly used methodology for robust experimental design, based on Markov chain Monte Carlo sampling, with approximate Bayesian computation (ABC) to ensure that no likelihood evaluations are required. The utility function considered for precise parameter estimation is based upon the precision of the ABC posterior distribution, which we form efficiently via the ABC rejection algorithm based on pre-computed model simulations. Our focus is on stochastic models and, in particular, we investigate the methodology for Markov process models of epidemics and macroparasite population evolution. The macroparasite example involves a multivariate process and we assess the loss of information from not observing all variables. © 2013, The International Biometric Society.
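A minimal sketch of the likelihood-free design loop described here is given below; it uses ABC rejection with pre-computed simulations, but substitutes a simple Poisson decay surrogate for the epidemic/macroparasite Markov process models studied in the paper, and takes the reciprocal of the ABC posterior variance as the utility. All model and prior choices are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
N0 = 50  # initial population size (illustrative)

def simulate(theta, t):
    """Poisson surrogate for the number of survivors at observation time t under
    decay rate theta (a stand-in for a Markov death process, illustrative only)."""
    return rng.poisson(N0 * np.exp(-theta * t))

def expected_abc_precision(t, prior_draws, n_datasets=200, eps=2.0):
    """Expected precision (1/variance) of the ABC rejection posterior for theta
    when the experiment observes the process once, at time t."""
    sims = simulate(prior_draws, t)           # pre-computed simulations, reused below
    precisions = []
    for _ in range(n_datasets):
        theta_true = rng.choice(prior_draws)  # draw a synthetic "truth" from the prior
        y_obs = simulate(theta_true, t)
        accepted = prior_draws[np.abs(sims - y_obs) <= eps]
        if accepted.size > 1:
            precisions.append(1.0 / np.var(accepted))
    return float(np.mean(precisions)) if precisions else 0.0

prior_draws = rng.uniform(0.05, 0.5, size=20000)  # uniform prior on the decay rate

# Compare candidate observation times; the best design maximizes expected precision.
for t in (1.0, 5.0, 10.0, 20.0):
    print(f"t = {t:5.1f}  ->  expected ABC posterior precision = "
          f"{expected_abc_precision(t, prior_draws):8.1f}")
```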
A systematic review of the use of an expertise-based randomised controlled trial design.
Cook, Jonathan A; Elders, Andrew; Boachie, Charles; Bassinga, Ted; Fraser, Cynthia; Altman, Doug G; Boutron, Isabelle; Ramsay, Craig R; MacLennan, Graeme S
2015-05-30
Under a conventional two-arm randomised trial design, participants are allocated to an intervention and participating health professionals are expected to deliver both interventions. However, health professionals often have differing levels of expertise in skill-based interventions such as surgery or psychotherapy. An expertise-based approach to trial design, where health professionals only deliver an intervention in which they have expertise, has been proposed as an alternative. The aim of this project was to systematically review the use of an expertise-based trial design in the medical literature. We carried out a comprehensive search of nine databases--AMED, BIOSIS, CENTRAL, CINAHL, Cochrane Methodology Register, EMBASE, MEDLINE, Science Citation Index, and PsycINFO--from 1966 to 2012 and performed citation searches using the ISI Citation Indexes and Scopus. Studies that used an expertise-based trial design were included. Two review authors independently screened the titles and abstracts and assessed full-text reports. Data were extracted and summarised on the study characteristics, general and expertise-specific study methodology, and conduct. In total, 7476 titles and abstracts were identified, leading to 43 included studies (54 articles). The vast majority (88%) used a pure expertise-based design; three (7%) adopted a hybrid design, and two (5%) used a design that was unclear. Most studies compared substantially different interventions (79%). In many cases, key information relating to the expertise-based design was absent; only 12 (28%) reported criteria for delivering both interventions. Most studies recruited the target sample size or very close to it (median of 101, interquartile range of 94 to 118), although the target was reported for only 40% of studies. The proportion of participants who received the allocated intervention was high (92%, interquartile range of 82 to 99%). While use of an expertise-based trial design is growing, it remains uncommon. Reporting of study methodology and, particularly, expertise-related methodology was poor. Empirical evidence provided some support for purported benefits such as high levels of recruitment and compliance with allocation. An expertise-based trial design should be considered but its value seems context-specific, particularly when interventions differ substantially or interventions are typically delivered by different health professionals.
NASA Technical Reports Server (NTRS)
Stepniewski, W. Z.; Shinn, R. A.
1983-01-01
A detailed comparative insight into the design and operational philosophies of Soviet vs. Western helicopters is provided. This is accomplished by examining conceptual approaches, producibility and maintainability, and weight trends/prediction methodology. The application of Soviet methodology (Tishchenko) to various weight classes of helicopters is compared extensively with the results of using Western-based methodology.
Guidelines for reporting evaluations based on observational methodology.
Portell, Mariona; Anguera, M Teresa; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana
2015-01-01
Observational methodology is one of the most suitable research designs for evaluating fidelity of implementation, especially in complex interventions. However, the conduct and reporting of observational studies is hampered by the absence of specific guidelines, such as those that exist for other evaluation designs. This lack of specific guidance poses a threat to the quality and transparency of these studies and also constitutes a considerable publication hurdle. The aim of this study thus was to draw up a set of proposed guidelines for reporting evaluations based on observational methodology. The guidelines were developed by triangulating three sources of information: observational studies performed in different fields by experts in observational methodology, reporting guidelines for general studies and studies with similar designs to observational studies, and proposals from experts in observational methodology at scientific meetings. We produced a list of guidelines grouped into three domains: intervention and expected outcomes, methods, and results. The result is a useful, carefully crafted set of simple guidelines for conducting and reporting observational studies in the field of program evaluation.
The multi-copy simultaneous search methodology: a fundamental tool for structure-based drug design.
Schubert, Christian R; Stultz, Collin M
2009-08-01
Fragment-based ligand design approaches, such as the multi-copy simultaneous search (MCSS) methodology, have proven to be useful tools in the search for novel therapeutic compounds that bind pre-specified targets of known structure. MCSS offers a variety of advantages over more traditional high-throughput screening methods, and has been applied successfully to challenging targets. The methodology is quite general and can be used to construct functionality maps for proteins, DNA, and RNA. In this review, we describe the main aspects of the MCSS method and outline the general use of the methodology as a fundamental tool to guide the design of de novo lead compounds. We focus our discussion on the evaluation of MCSS results and the incorporation of protein flexibility into the methodology. In addition, we demonstrate on several specific examples how the information arising from the MCSS functionality maps has been successfully used to predict ligand binding to protein targets and RNA.
Integrated Controls-Structures Design Methodology: Redesign of an Evolutionary Test Structure
NASA Technical Reports Server (NTRS)
Maghami, Peiman G.; Gupta, Sandeep; Elliot, Kenny B.; Joshi, Suresh M.
1997-01-01
An optimization-based integrated controls-structures design methodology for a class of flexible space structures is described, and the phase-0 Controls-Structures-Integration (CSI) Evolutionary Model (CEM), a laboratory testbed at NASA Langley, is redesigned using this integrated design methodology. The integrated controls-structures design is posed as a nonlinear programming problem to minimize the control effort required to maintain a specified line-of-sight pointing performance, under persistent white noise disturbance. Static and dynamic dissipative control strategies are employed for feedback control, and parameters of these controllers are considered as the control design variables. Sizes of strut elements in various sections of the CEM are used as the structural design variables. Design guides for the struts are developed and employed in the integrated design process to ensure that the redesigned structure can be effectively fabricated. The superiority of the integrated design methodology over the conventional design approach is demonstrated analytically by observing a significant reduction in the average control power needed to maintain specified pointing performance with the integrated design approach.
Status of Single-Case Research Designs for Evidence-Based Practice
ERIC Educational Resources Information Center
Matson, Johnny L.; Turygin, Nicole C.; Beighley, Jennifer; Matson, Michael L.
2012-01-01
The single-case research design has become a paradoxical methodology in the applied sciences. While various experimental designs have been in place for over 50 years, there has not been wide acceptance of single-case methodology outside clinical and school psychology, or the field of special education. These methods were developed in the U.S.A.,…
Unmanned Tactical Autonomous Control and Collaboration Situation Awareness
2017-06-01
methodology framework using interdependence analysis (IA) tables for informing design requirements based on SA requirements. Future research should seek...requirements of UTACC. The authors then apply SA principles to Coactive Design in order to inform robotic design. The result is a methodology framework using...
Methodological standards in single-case experimental design: Raising the bar.
Ganz, Jennifer B; Ayres, Kevin M
2018-04-12
Single-case experimental designs (SCEDs), or small-n experimental research, are frequently implemented to assess approaches to improving outcomes for people with disabilities, particularly those with low-incidence disabilities, such as some developmental disabilities. SCED has become increasingly accepted as a research design. As this literature base is needed to determine which interventions are evidence-based practices, the acceptance of SCED has resulted in increased critiques with regard to methodological quality. Recent trends include recommendations from a number of expert scholars and institutions. The purpose of this article is to summarize the recent history of methodological quality considerations, synthesize the recommendations found in the SCED literature, and provide recommendations to researchers designing SCEDs with regard to essential and aspirational standards for methodological quality. Conclusions include imploring SCED researchers to increase the quality of their experiments, with particular consideration given to the applied nature of SCED research published in Research in Developmental Disabilities and beyond. Copyright © 2018 Elsevier Ltd. All rights reserved.
Roetzheim, Richard G; Freund, Karen M; Corle, Don K; Murray, David M; Snyder, Frederick R; Kronman, Andrea C; Jean-Pierre, Pascal; Raich, Peter C; Holden, Alan Ec; Darnell, Julie S; Warren-Mears, Victoria; Patierno, Steven
2012-04-01
The Patient Navigation Research Program (PNRP) is a cooperative effort of nine research projects, with similar clinical criteria but with different study designs. To evaluate projects such as PNRP, it is desirable to perform a pooled analysis to increase power relative to the individual projects. There is no agreed-upon prospective methodology, however, for analyzing combined data arising from different study designs. Expert opinions were thus solicited from the members of the PNRP Design and Analysis Committee. To review possible methodologies for analyzing combined data arising from heterogeneous study designs. The Design and Analysis Committee critically reviewed the pros and cons of five potential methods for analyzing combined PNRP project data. The conclusions were based on simple consensus. The five approaches reviewed included the following: (1) analyzing and reporting each project separately, (2) combining data from all projects and performing an individual-level analysis, (3) pooling data from projects having similar study designs, (4) analyzing pooled data using a prospective meta-analytic technique, and (5) analyzing pooled data utilizing a novel simulated group-randomized design. Methodologies varied in their ability to incorporate data from all PNRP projects, to appropriately account for differing study designs, and to accommodate differing project sample sizes. The conclusions reached were based on expert opinion and not derived from actual analyses performed. The ability to analyze pooled data arising from differing study designs may provide pertinent information to inform programmatic, budgetary, and policy perspectives. Multisite community-based research may not lend itself well to the more stringent explanatory and pragmatic standards of a randomized controlled trial design. Given our growing interest in community-based population research, the challenges inherent in the analysis of heterogeneous study design are likely to become more salient. Discussion of the analytic issues faced by the PNRP and the methodological approaches we considered may be of value to other prospective community-based research programs.
Applications of mixed-methods methodology in clinical pharmacy research.
Hadi, Muhammad Abdul; Closs, S José
2016-06-01
Introduction: Mixed-methods methodology, as the name suggests, refers to the mixing of elements of both qualitative and quantitative methodologies in a single study. In the past decade, mixed-methods methodology has gained popularity among healthcare researchers as it promises to bring together the strengths of both qualitative and quantitative approaches. Methodology: A number of mixed-methods designs are available in the literature, and the four most commonly used designs in healthcare research are the convergent parallel design, the embedded design, the exploratory design, and the explanatory design. Each has its own unique advantages, challenges and procedures, and selection of a particular design should be guided by the research question. Guidance on designing, conducting and reporting mixed-methods research is available in the literature, so it is advisable to adhere to this to ensure methodological rigour. When to use it: Mixed methods are best suited when the research questions require triangulating findings from different methodologies to explain a single phenomenon; clarifying the results of one method using another method; informing the design of one method based on the findings of another method; developing a scale/questionnaire; or answering different research questions within a single study. Two case studies are presented to illustrate possible applications of mixed-methods methodology. Limitations: Possessing the necessary knowledge and skills to undertake qualitative and quantitative data collection, analysis, interpretation and integration remains the biggest challenge for researchers conducting mixed-methods studies. Sequential study designs are often time consuming, being in two (or more) phases, whereas concurrent study designs may require more than one data collector to collect both qualitative and quantitative data at the same time.
Logic Design Pathology and Space Flight Electronics
NASA Technical Reports Server (NTRS)
Katz, Richard; Barto, Rod L.; Erickson, K.
1997-01-01
Logic design errors have been observed in space flight missions and in the final stages of ground test. The technologies used by designers and their design/analysis methodologies will be analyzed, giving insight into the root causes of the failures. These technologies include discrete integrated circuit based systems, systems based on field and mask programmable logic, and the use of computer-aided engineering (CAE) systems. State-of-the-art (SOTA) design tools and methodologies will be analyzed with respect to high-reliability spacecraft design, and potential pitfalls are discussed. Case studies of faults from large expensive programs to "smaller, faster, cheaper" missions will be used to explore the fundamental reasons for logic design problems.
NASA Technical Reports Server (NTRS)
Onwubiko, Chin-Yere; Onyebueke, Landon
1996-01-01
The structural design, or the design of machine elements, has traditionally been based on deterministic design methodology. The deterministic method considers all design parameters to be known with certainty. This methodology is, therefore, inadequate for designing complex structures that are subjected to a variety of complex, severe loading conditions. A nonlinear behavior that is dependent on stress, stress rate, temperature, number of load cycles, and time is observed in all components subjected to complex conditions. These complex conditions introduce uncertainties; hence, the actual factor-of-safety margin remains unknown. In the deterministic methodology, the contingency of failure is discounted; hence, a high factor of safety is used. It may be most useful in situations where the structures being designed are simple. The probabilistic method is concerned with the probability of non-failure performance of structures or machine elements. It is much more useful in situations where the design is characterized by complex geometry, the possibility of catastrophic failure, or sensitive loads and material properties. Also included: Comparative Study of the use of AGMA Geometry Factors and Probabilistic Design Methodology in the Design of Compact Spur Gear Set.
Tungsten fiber reinforced superalloy composite high temperature component design considerations
NASA Technical Reports Server (NTRS)
Winsa, E. A.
1982-01-01
Tungsten fiber reinforced superalloy composites (TFRS) are intended for use in high temperature turbine components. Current turbine component design methodology is based on applying the experience, sometimes semiempirical, gained from over 30 years of superalloy component design. Current composite component design capability is generally limited to the methodology for low temperature resin matrix composites. Often the tendency is to treat TFRS as just another superalloy or low temperature composite. However, TFRS behavior is significantly different from that of superalloys, and the high temperature environment adds considerations not common in low temperature composite component design. The methodology used for preliminary design of TFRS components is described. Considerations unique to TFRS are emphasized.
Passenger rail vehicle safety assessment methodology. Volume I, Summary of safe performance limits.
DOT National Transportation Integrated Search
2000-04-01
This report presents a methodology based on computer simulation that assesses the safe dynamic performance limits of commuter passenger vehicles. The methodology consists of determining the critical design parameters and characteristic properties of bo...
A methodological approach for designing a usable ontology-based GUI in healthcare.
Lasierra, N; Kushniruk, A; Alesanco, A; Borycki, E; García, J
2013-01-01
This paper presents a methodological approach to the design and evaluation of an interface for an ontology-based system used for designing care plans for monitoring patients at home. In order to define the care plans, physicians need a tool for creating instances of the ontology and configuring some rules. Our purpose is to develop an interface that allows clinicians to interact with the ontology. Although ontology-driven applications do not necessarily present the ontology in the user interface, it is our hypothesis that showing selected parts of the ontology in a "usable" way could enhance clinicians' understanding and make the definition of the care plans easier. Based on prototyping and iterative testing, this methodology combines visualization techniques and usability methods. Preliminary results obtained after a formative evaluation indicate the effectiveness of the suggested combination.
Integrated layout based Monte-Carlo simulation for design arc optimization
NASA Astrophysics Data System (ADS)
Shao, Dongbing; Clevenger, Larry; Zhuang, Lei; Liebmann, Lars; Wong, Robert; Culp, James
2016-03-01
Design rules are created considering a wafer fail mechanism with the relevant design levels under various design cases, and the values are set to cover the worst scenario. Because of this simplification and generalization, design rules hinder, rather than help, dense device scaling. As an example, SRAM designs always need extensive ground rule waivers. Furthermore, dense design also often involves a "design arc", a collection of design rules the sum of which equals the critical pitch defined by the technology. In a design arc, a single rule change can lead to a chain reaction of other rule violations. In this talk we present a methodology using Layout Based Monte-Carlo Simulation (LBMCS) with integrated multiple ground rule checks. We apply this methodology to an SRAM word line contact, and the result is a layout that has balanced wafer fail risks based on Process Assumptions (PAs). This work was performed at the IBM Microelectronics Div, Semiconductor Research and Development Center, Hopewell Junction, NY 12533.
ERIC Educational Resources Information Center
Abdallah, Mahmoud M. S.; Wegerif, Rupert B.
2014-01-01
This article discusses educational design-based research (DBR) as an emerging paradigm/methodology in educational enquiry that can be used as a mixed-method, problem-oriented research framework, and thus can act as an alternative to other traditional paradigms/methodologies prominent within the Egyptian context of educational enquiry. DBR is often…
ERIC Educational Resources Information Center
Osler, James Edward
2015-01-01
This monograph provides a neuroscience-based systemological, epistemological, and methodological rational for the design of an advanced and novel parametric statistical analytics designed for the biological sciences referred to as "Biotrichotomy". The aim of this new arena of statistics is to provide dual metrics designed to analyze the…
A quality-based cost model for new electronic systems and products
NASA Astrophysics Data System (ADS)
Shina, Sammy G.; Saigal, Anil
1998-04-01
This article outlines a method for developing a quality-based cost model for the design of new electronic systems and products. The model incorporates a methodology for determining a cost-effective design margin allocation for electronic products and systems and its impact on manufacturing quality and cost. A spreadsheet-based cost estimating tool was developed to help implement this methodology in order for the system design engineers to quickly estimate the effect of design decisions and tradeoffs on the quality and cost of new products. The tool was developed with automatic spreadsheet connectivity to current process capability and with provisions to consider the impact of capital equipment and tooling purchases to reduce the product cost.
Methodology for cloud-based design of robots
NASA Astrophysics Data System (ADS)
Ogorodnikova, O. M.; Vaganov, K. A.; Putimtsev, I. D.
2017-09-01
This paper presents some important results from the cloud-based design of a robot arm by a group of students. A methodology for cloud-based design was developed and used to initiate an interdisciplinary project on the research and development of a specific manipulator. All project data files were hosted by the Ural Federal University data center. The 3D (three-dimensional) model of the robot arm was created using Siemens PLM software (Product Lifecycle Management) and structured as a complex mechatronics product by means of the Siemens Teamcenter thin client; all processes were performed in the cloud. The robot arm was designed to load blanks of up to 1 kg into the work space of a milling machine for student research.
Boulesteix, Anne-Laure; Wilson, Rory; Hapfelmeier, Alexander
2017-09-09
The goal of medical research is to develop interventions that are in some sense superior, with respect to patient outcome, to interventions currently in use. Similarly, the goal of research in methodological computational statistics is to develop data analysis tools that are themselves superior to the existing tools. The methodology of the evaluation of medical interventions continues to be discussed extensively in the literature and it is now well accepted that medicine should be at least partly "evidence-based". Although we statisticians are convinced of the importance of unbiased, well-thought-out study designs and evidence-based approaches in the context of clinical research, we tend to ignore these principles when designing our own studies for evaluating statistical methods in the context of our methodological research. In this paper, we draw an analogy between clinical trials and real-data-based benchmarking experiments in methodological statistical science, with datasets playing the role of patients and methods playing the role of medical interventions. Through this analogy, we suggest directions for improvement in the design and interpretation of studies which use real data to evaluate statistical methods, in particular with respect to dataset inclusion criteria and the reduction of various forms of bias. More generally, we discuss the concept of "evidence-based" statistical research, its limitations and its impact on the design and interpretation of real-data-based benchmark experiments. We suggest that benchmark studies-a method of assessment of statistical methods using real-world datasets-might benefit from adopting (some) concepts from evidence-based medicine towards the goal of more evidence-based statistical research.
Constraint-Driven Software Design: An Escape from the Waterfall Model.
ERIC Educational Resources Information Center
de Hoog, Robert; And Others
1994-01-01
Presents the principles of a development methodology for software design based on a nonlinear, product-driven approach that integrates quality aspects. Two examples are given to show that the flexibility needed for building high quality systems leads to integrated development environments in which methodology, product, and tools are closely…
ERIC Educational Resources Information Center
Razak, Rafiza Abdul; Yusop, Farrah Dina; Idris, Aizal Yusrina; Al-Sinaiyah, Yanbu; Halili, Siti Hajar
2016-01-01
The paper introduces Teacher Interactive Electronic Continuous Professional Development (TIE-CPD), an online interactive training system. The framework and methodology of TIE-CPD are designed with functionalities comparable with existing e-training systems. The system design and development literature offers several methodology and framework…
Masoumi, Hamid Reza Fard; Basri, Mahiran; Kassim, Anuar; Abdullah, Dzulkefly Kuang; Abdollahi, Yadollah; Abd Gani, Siti Salwa; Rezaee, Malahat
2013-01-01
Lipase-catalyzed production of triethanolamine-based esterquat by esterification of oleic acid (OA) with triethanolamine (TEA) in n-hexane was performed in a 2 L stirred-tank reactor. A set of experiments was designed by central composite design to model the process and statistically evaluate the findings. Five independent process variables, including enzyme amount, reaction time, reaction temperature, substrate molar ratio of OA to TEA, and agitation speed, were studied under the given conditions designed by Design Expert software. Experimental data were examined for normality before the data processing stage, and skewness and kurtosis indices were determined. The mathematical model developed was found to be adequate and statistically accurate in predicting the optimum conversion of product. Response surface methodology with central composite design gave the best performance in this study, and the methodology as a whole has been proven to be adequate for the design and optimization of the enzymatic process.
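As a concrete illustration of the central composite design and response-surface step, here is a minimal Python sketch that builds a face-centred CCD in coded units and fits a full quadratic surface by least squares; the two factors and the toy response are illustrative assumptions, not the five-variable Design Expert setup used in the study.

    import itertools
    import numpy as np

    def central_composite_design(k, alpha=1.0):
        """Face-centred CCD in coded units: 2^k factorial points,
        2k axial points at +/-alpha, and one centre point."""
        factorial = np.array(list(itertools.product([-1, 1], repeat=k)), float)
        axial = np.zeros((2 * k, k))
        for i in range(k):
            axial[2 * i, i], axial[2 * i + 1, i] = -alpha, alpha
        centre = np.zeros((1, k))
        return np.vstack([factorial, axial, centre])

    def fit_quadratic_surface(X, y):
        """Ordinary least squares fit of a full second-order response surface."""
        n, k = X.shape
        cols = [np.ones(n)] + [X[:, i] for i in range(k)]
        cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i, k)]
        A = np.column_stack(cols)
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        return beta  # coefficients of the quadratic model

    # Example with two coded factors (e.g. enzyme amount and temperature):
    X = central_composite_design(k=2)
    y = 70 + 5 * X[:, 0] - 3 * X[:, 1] - 2 * X[:, 0] * X[:, 1]  # toy response
    print(fit_quadratic_surface(X, y))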
Garg, Harish
2013-03-01
The main objective of the present paper is to propose a methodology for analyzing the behavior of complex repairable industrial systems. In real-life situations, it is difficult to find the most optimal design policies for MTBF (mean time between failures), MTTR (mean time to repair) and related costs by utilizing available resources and uncertain data. For this purpose, an availability-cost optimization model has been constructed for determining the optimal design parameters for improving the system design efficiency. The uncertainties in the data related to each component of the system are estimated with the help of fuzzy and statistical methodology in the form of triangular fuzzy numbers. Using these data, the various reliability parameters, which affect the system performance, are obtained in the form of fuzzy membership functions by the proposed confidence interval based fuzzy Lambda-Tau (CIBFLT) methodology. The computed results of CIBFLT are compared with the existing fuzzy Lambda-Tau methodology. Sensitivity analysis of the system MTBF has also been addressed. The methodology has been illustrated through a case study of the washing unit, the main part of a paper industry plant. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.
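The following is a rough Python sketch of the kind of alpha-cut interval propagation that underlies fuzzy Lambda-Tau analysis for components in series (an OR gate), using triangular fuzzy failure rates and repair times; the component values, the +/-15% spreads and the crude endpoint pairing are assumptions for illustration, not the paper's CIBFLT procedure.

    def alpha_cut(tfn, alpha):
        """Interval [lo, hi] of a triangular fuzzy number (a, m, b) at level alpha."""
        a, m, b = tfn
        return (a + alpha * (m - a), b - alpha * (b - m))

    def series_lambda_tau(lams, taus, alpha):
        """Interval estimates of system failure rate and repair time for
        components in series (OR gate): lambda_s = sum(lambda_i) and
        tau_s = sum(lambda_i * tau_i) / sum(lambda_i), evaluated with a
        conservative pairing of interval ends (a sketch, not the full
        fuzzy-arithmetic treatment of the paper)."""
        lam_lo = sum(alpha_cut(l, alpha)[0] for l in lams)
        lam_hi = sum(alpha_cut(l, alpha)[1] for l in lams)
        num_lo = sum(alpha_cut(l, alpha)[0] * alpha_cut(t, alpha)[0]
                     for l, t in zip(lams, taus))
        num_hi = sum(alpha_cut(l, alpha)[1] * alpha_cut(t, alpha)[1]
                     for l, t in zip(lams, taus))
        return (lam_lo, lam_hi), (num_lo / lam_hi, num_hi / lam_lo)

    # Two hypothetical washing-unit components with +/-15% spreads:
    lams = [(0.85e-3, 1.0e-3, 1.15e-3), (1.7e-3, 2.0e-3, 2.3e-3)]
    taus = [(2.55, 3.0, 3.45), (4.25, 5.0, 5.75)]
    print(series_lambda_tau(lams, taus, alpha=0.5))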
Martínez-Moreno, J M; Sánchez-González, P; Luna, M; Roig, T; Tormos, J M; Gómez, E J
2016-01-01
Brain Injury (BI) has become one of the most common causes of neurological disability in developed countries. Cognitive disorders result in a loss of independence and of patients' quality of life. Cognitive rehabilitation aims to promote patients' skills to achieve their highest degree of personal autonomy. New technologies such as virtual reality or interactive video allow the development of rehabilitation therapies based on reproducible Activities of Daily Living (ADLs), increasing the ecological validity of the therapy. However, the lack of frameworks to formalize and represent the definition of this kind of therapy can be a barrier to the widespread use of interactive virtual environments in clinical routine. The objective is to provide neuropsychologists with a methodology and an instrument to design and evaluate cognitive rehabilitation therapeutic intervention strategies based on ADLs performed in interactive virtual environments. The proposed methodology is used to model therapeutic interventions during virtual ADLs, considering cognitive deficits, expected abnormal interactions and therapeutic hypotheses. It allows identifying abnormal behavioural patterns and designing intervention strategies in order to achieve errorless-based rehabilitation. An ADL case study ('buying bread') is defined according to the guidelines established by the ADL intervention model. This case study is developed, as a proof of principle, using interactive video technology and is used to assess the feasibility of the proposed methodology in the definition of therapeutic intervention procedures. The proposed methodology provides neuropsychologists with an instrument to design and evaluate ADL-based therapeutic intervention strategies, addressing current limitations of virtual scenarios and supporting their use for ecological rehabilitation of cognitive deficits in daily clinical practice. The developed case study proves the potential of the methodology to design therapeutic intervention strategies; our current work is devoted to designing more experiments in order to present more evidence about its value.
Cognitive Activity-based Design Methodology for Novice Visual Communication Designers
ERIC Educational Resources Information Center
Kim, Hyunjung; Lee, Hyunju
2016-01-01
The notion of design thinking is becoming more concrete nowadays, as design researchers and practitioners study the thinking processes involved in design and employ the concept of design thinking to foster better solutions to complex and ill-defined problems. The goal of the present research is to develop a cognitive activity-based design…
NASA Astrophysics Data System (ADS)
Echavarria, E.; Tomiyama, T.; van Bussel, G. J. W.
2007-07-01
The objective of this on-going research is to develop a design methodology to increase the availability for offshore wind farms, by means of an intelligent maintenance system capable of responding to faults by reconfiguring the system or subsystems, without increasing service visits, complexity, or costs. The idea is to make use of the existing functional redundancies within the system and sub-systems to keep the wind turbine operational, even at a reduced capacity if necessary. Re-configuration is intended to be a built-in capability to be used as a repair strategy, based on these existing functionalities provided by the components. The possible solutions can range from using information from adjacent wind turbines, such as wind speed and direction, to setting up different operational modes, for instance re-wiring, re-connecting, changing parameters or control strategy. The methodology described in this paper is based on qualitative physics and consists of a fault diagnosis system based on a model-based reasoner (MBR), and on a functional redundancy designer (FRD). Both design tools make use of a function-behaviour-state (FBS) model. A design methodology based on the re-configuration concept to achieve self-maintained wind turbines is an interesting and promising approach to reduce stoppage rate, failure events, maintenance visits, and to maintain energy output possibly at reduced rate until the next scheduled maintenance.
Technological Leverage in Higher Education: An Evolving Pedagogy
ERIC Educational Resources Information Center
Pillai, K. Rajasekharan; Prakash, Ashish Viswanath
2017-01-01
Purpose: The purpose of the study is to analyse the perception of students toward a computer-based exam on a custom-made digital device and their willingness to adopt the same for high-stake summative assessment. Design/methodology/approach: This study followed an analytical methodology using survey design. A modified version of students'…
A Protean Practice? Perspectives on the Practice of Action Learning
ERIC Educational Resources Information Center
Brook, Cheryl; Pedler, Mike; Burgoyne, John G
2013-01-01
Purpose: The purpose of the paper is to assess the extent to which these practitioners ' perspectives and practices match Willis's conception of a Revans "gold standard" of action learning. Design/methodology/approach: This study adopts a qualitative design and methodology based on interviews and the collection of cases or accounts of…
The Research and Evaluation of Serious Games: Toward a Comprehensive Methodology
ERIC Educational Resources Information Center
Mayer, Igor; Bekebrede, Geertje; Harteveld, Casper; Warmelink, Harald; Zhou, Qiqi; van Ruijven, Theo; Lo, Julia; Kortmann, Rens; Wenzler, Ivo
2014-01-01
The authors present the methodological background to and underlying research design of an ongoing research project on the scientific evaluation of serious games and/or computer-based simulation games (SGs) for advanced learning. The main research questions are: (1) what are the requirements and design principles for a comprehensive social…
Development of a Design Methodology for Reconfigurable Flight Control Systems
NASA Technical Reports Server (NTRS)
Hess, Ronald A.; McLean, C.
2000-01-01
A methodology is presented for the design of flight control systems that exhibit stability and performance-robustness in the presence of actuator failures. The design is based upon two elements. The first element consists of a control law that will ensure at least stability in the presence of a class of actuator failures. This law is created by inner-loop, reduced-order, linear dynamic inversion, and outer-loop compensation based upon Quantitative Feedback Theory. The second element consists of adaptive compensators obtained from simple and approximate time-domain identification of the dynamics of the 'effective vehicle' with failed actuator(s). An example involving the lateral-directional control of a fighter aircraft is employed both to introduce the proposed methodology and to demonstrate its effectiveness and limitations.
NASA Astrophysics Data System (ADS)
Demourant, F.; Ferreres, G.
2013-12-01
This article presents a methodology for linear parameter-varying (LPV) multiobjective flight control law design for a blended wing body (BWB) aircraft, along with results. The method is a direct design of a parametrized control law (with respect to some measured flight parameters) through a multimodel convex design to optimize a set of specifications over the full flight domain and different mass cases. The methodology is based on the Youla parameterization, which is very useful since closed-loop specifications are affine with respect to the Youla parameter. The LPV multiobjective design method is detailed and applied to the BWB flexible aircraft example.
Validation of a SysML based design for wireless sensor networks
NASA Astrophysics Data System (ADS)
Berrachedi, Amel; Rahim, Messaoud; Ioualalen, Malika; Hammad, Ahmed
2017-07-01
When developing complex systems, the verification of the system design is one of the main challenges. Wireless Sensor Networks (WSNs) are examples of such systems. We address the problem of how WSNs must be designed to fulfil the system requirements. Using the SysML language, we propose a Model Based System Engineering (MBSE) specification and verification methodology for designing WSNs. This methodology uses SysML to describe the WSN requirements, structure and behaviour. It then translates the SysML elements to an analytic model, specifically a Deterministic and Stochastic Petri Net. The proposed approach allows designers to model WSNs and to study their behaviour and energy performance.
Design Science Methodology Applied to a Chemical Surveillance Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Zhuanyi; Han, Kyungsik; Charles-Smith, Lauren E.
Public health surveillance systems gain significant benefits from integrating existing early incident detection systems, supported by closed data sources, with open source data. However, identifying potential alerting incidents relies on finding accurate, reliable sources and presenting the high volume of data in a way that increases analysts' work efficiency, a challenge for any system that leverages open source data. In this paper, we present the design concept and the applied design science research methodology of ChemVeillance, a chemical analyst surveillance system. Our work portrays a system design and approach that translates theoretical methodology into practice, creating a powerful surveillance system built for specific use cases. Researchers, designers, developers, and related professionals in the health surveillance community can build upon the principles and methodology described here to enhance and broaden current surveillance systems, leading to improved situational awareness based on a robust integrated early warning system.
Application of the Hardman methodology to the Army Remotely Piloted Vehicle (RPV)
NASA Technical Reports Server (NTRS)
1983-01-01
The application of the HARDMAN Methodology to the Remotely Piloted Vehicle (RPV) is described. The methodology was used to analyze the manpower, personnel, and training (MPT) requirements of the proposed RPV system design for a number of operating scenarios. The RPV system is defined as consisting of the equipment, personnel, and operational procedures needed to perform five basic artillery missions: reconnaissance, target acquisition, artillery adjustment, target designation and damage assessment. The RPV design evaluated includes an air vehicle (AV), a modular integrated communications and navigation system (MICNS), a ground control station (GCS), a launch subsystem (LS), a recovery subsystem (RS), and a number of ground support requirements. The HARDMAN Methodology is an integrated set of data base management techniques and analytic tools, designed to provide timely and fully documented assessments of the human resource requirements associated with an emerging system's design.
Topology-optimized broadband surface relief transmission grating
NASA Astrophysics Data System (ADS)
Andkjær, Jacob; Ryder, Christian P.; Nielsen, Peter C.; Rasmussen, Thomas; Buchwald, Kristian; Sigmund, Ole
2014-03-01
We propose a design methodology for systematic design of surface relief transmission gratings with optimized diffraction efficiency. The methodology is based on a gradient-based topology optimization formulation along with 2D frequency domain finite element simulations for TE and TM polarized plane waves. The goal of the optimization is to find a grating design that maximizes diffraction efficiency for the -1st transmission order when illuminated by unpolarized plane waves. Results indicate that a surface relief transmission grating can be designed with a diffraction efficiency of more than 40% in a broadband range going from the ultraviolet region, through the visible region and into the near-infrared region.
Computational Fragment-Based Drug Design: Current Trends, Strategies, and Applications.
Bian, Yuemin; Xie, Xiang-Qun Sean
2018-04-09
Fragment-based drug design (FBDD) has been an effective methodology for drug development for decades. Successful applications of this strategy have brought both opportunities and challenges to the field of pharmaceutical science. Recent progress in computational fragment-based drug design provides an additional approach for future research in a time- and labor-efficient manner. Combining multiple in silico methodologies, computational FBDD offers flexibility in fragment library selection, protein model generation, and fragment/compound docking mode prediction. These characteristics give computational FBDD an advantage in designing novel and potential compounds for a given target. The purpose of this review is to discuss the latest advances, ranging from commonly used strategies to novel concepts and technologies in computational fragment-based drug design. In particular, specifications and advantages are compared between experimental and computational FBDD, and limitations and future prospects are discussed and emphasized.
The comparison of various approach to evaluation erosion risks and design control erosion measures
NASA Astrophysics Data System (ADS)
Kapicka, Jiri
2015-04-01
At present, a single methodology is used in the Czech Republic to compute and compare erosion risks; it also includes a method for designing erosion control measures. It is based on the Universal Soil Loss Equation (USLE) and its result, the long-term average annual rate of erosion (G), and it is used by landscape planners. Data and statistics from the database of erosion events in the Czech Republic show that many problems and damages arise from local episodic erosion events. The extent of these events and their impact depend on local precipitation, the current plant growth phase, and soil conditions. Such erosion events can cause damage to agricultural land, municipal property, and hydraulic structures even where a location is in good condition from the point of view of the long-term average annual erosion rate. An alternative way to compute and compare erosion risks is an episode-based approach. This paper presents a comparison of various approaches to computing erosion risks. The comparison was carried out for a locality from the database of erosion events on agricultural land in the Czech Republic where two erosion events have been recorded. The study area is a simple agricultural parcel without barriers that could strongly influence water flow and sediment transport. The computation of erosion risks for all methodologies was based on laboratory analysis of soil samples taken in the study area. Results of the USLE and MUSLE methodologies and results from the mathematical model Erosion 3D were compared. Variances in the spatial distribution of the places with the highest soil erosion were compared and discussed. A further part presents variances in the design of erosion control measures when the design is based on the different methodologies. The results show variances in the computed erosion risks obtained with the different methodologies. These variances can open a discussion about how erosion risks should be computed and evaluated in areas of differing importance.
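For reference, the long-term average annual soil loss G in the USLE-based methodology is the product of the rainfall erosivity, soil erodibility, slope length-steepness, cover and support-practice factors; a short Python sketch with purely illustrative factor values (not taken from the study area) is:

    def usle_soil_loss(R, K, LS, C, P):
        """Universal Soil Loss Equation: long-term average annual soil loss
        A = R * K * LS * C * P (units depend on the factor system used)."""
        return R * K * LS * C * P

    # Illustrative values only, not data from the locality in the abstract:
    A = usle_soil_loss(R=40.0, K=0.35, LS=1.2, C=0.25, P=1.0)
    print(f"Estimated long-term average annual soil loss G ~ {A:.2f} t/ha/yr")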
Thompson-Bean, E; Das, R; McDaid, A
2016-10-31
We present a novel methodology for the design and manufacture of complex biologically inspired soft robotic fluidic actuators. The methodology is applied to the design and manufacture of a prosthetic for the hand. Real human hands are scanned to produce a 3D model of a finger, and pneumatic networks are implemented within it to produce a biomimetic bending motion. The finger is then partitioned into material sections, and a genetic algorithm based optimization, using finite element analysis, is employed to discover the optimal material for each section. This is based on two biomimetic performance criteria. Two sets of optimizations using two material sets are performed. Promising optimized material arrangements are fabricated using two techniques to validate the optimization routine, and the fabricated and simulated results are compared. We find that the optimization is successful in producing biomimetic soft robotic fingers and that fabrication of the fingers is possible. Limitations and paths for development are discussed. This methodology can be applied for other fluidic soft robotic devices.
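A minimal Python sketch of the genetic-algorithm layer is given below; because the finite element analysis is not available here, the fitness function is a toy placeholder standing in for the two biomimetic performance criteria, and the material set and number of sections are assumptions.

    import random

    MATERIALS = ["soft", "medium", "stiff"]     # hypothetical material set
    N_SECTIONS = 8                              # finger partitioned into sections

    def fitness(genome):
        """Placeholder for the two biomimetic criteria; in the paper this
        score would come from a finite element analysis of the bending motion."""
        stiffness = sum(MATERIALS.index(m) for m in genome)
        return -abs(stiffness - 6)              # toy target stiffness profile

    def genetic_algorithm(pop_size=30, generations=50, mutation_rate=0.1):
        pop = [[random.choice(MATERIALS) for _ in range(N_SECTIONS)]
               for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            parents = pop[: pop_size // 2]      # truncation selection
            children = []
            while len(children) < pop_size - len(parents):
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, N_SECTIONS)
                child = a[:cut] + b[cut:]       # one-point crossover
                if random.random() < mutation_rate:
                    child[random.randrange(N_SECTIONS)] = random.choice(MATERIALS)
                children.append(child)
            pop = parents + children
        return max(pop, key=fitness)

    print(genetic_algorithm())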
Mixed-Reality Prototypes to Support Early Creative Design
NASA Astrophysics Data System (ADS)
Safin, Stéphane; Delfosse, Vincent; Leclercq, Pierre
The domain we address is creative design, mainly architecture. Rooted in a multidisciplinary approach as well as a deep understanding of architecture and design, our method aims at proposing adapted mixed-reality solutions to support two crucial activities: sketch-based preliminary design and distant synchronous collaboration in design. This chapter provides a summary of our work on a mixed-reality device, based on a drawing table (the Virtual Desktop), designed specifically to address real-life/business-focused issues. We explain our methodology, describe the two supported activities and the related users’ needs, detail the technological solution we have developed, and present the main results of multiple evaluation sessions. We conclude with a discussion of the usefulness of a profession-centered methodology and the relevance of mixed reality to support creative design activities.
Problem-Based Learning: Lessons for Administrators, Educators and Learners
ERIC Educational Resources Information Center
Yeo, Roland
2005-01-01
Purpose: The paper aims to explore the challenges of problem-based learning (PBL) as an unconventional teaching methodology experienced by a higher learning institute in Singapore. Design/methodology/approach: The exploratory study was conducted using focus group discussions and semi-structured interviews. Four groups of people were invited to…
ERIC Educational Resources Information Center
Zapata-Rivera, Diego; VanWinkle, Waverely; Doyle, Bryan; Buteux, Alyssa; Bauer, Malcolm
2009-01-01
Purpose: The purpose of this paper is to propose and demonstrate an evidence-based scenario design framework for assessment-based computer games. Design/methodology/approach: The evidence-based scenario design framework is presented and demonstrated by using BELLA, a new assessment-based gaming environment aimed at supporting student learning of…
NASA Astrophysics Data System (ADS)
González, M. R.; Lambán, M. P.
2012-04-01
This paper presents the results of designing the subject Quality Engineering and Security of the Product, which belongs to the Degree of Engineering in Industrial Design and Product Development, on the basis of the case methodology. The practical sessions of this subject are organized using the complete Quality Management System documentation of the virtual company BeaLuc S.A.
IMPAC: An Integrated Methodology for Propulsion and Airframe Control
NASA Technical Reports Server (NTRS)
Garg, Sanjay; Ouzts, Peter J.; Lorenzo, Carl F.; Mattern, Duane L.
1991-01-01
The National Aeronautics and Space Administration is actively involved in the development of enabling technologies that will lead towards aircraft with new/enhanced maneuver capabilities such as Short Take-Off Vertical Landing (STOVL) and high angle of attack performance. Because of the high degree of dynamic coupling between the airframe and propulsion systems of these types of aircraft, one key technology is the integration of flight and propulsion control. The NASA Lewis Research Center approach to developing Integrated Flight Propulsion Control (IFPC) technologies is an in-house research program referred to as IMPAC (Integrated Methodology for Propulsion and Airframe Control). The goals of IMPAC are to develop a viable alternative to the existing integrated control design methodologies that will allow for improved system performance and simplicity of control law synthesis and implementation, and to demonstrate the applicability of the methodology to a supersonic STOVL fighter aircraft. Based on some preliminary control design studies that included evaluation of the existing methodologies, the IFPC design methodology emerging at the Lewis Research Center consists of considering the airframe and propulsion system as one integrated system for an initial centralized controller design, and then partitioning the centralized controller into separate airframe and propulsion system subcontrollers to ease implementation and to set meaningful design requirements for detailed subsystem control design and evaluation. An overview of IMPAC is provided, and detailed discussions of the various important design and evaluation steps in the methodology are included.
Assessment methodology for computer-based instructional simulations.
Koenig, Alan; Iseli, Markus; Wainess, Richard; Lee, John J
2013-10-01
Computer-based instructional simulations are becoming more and more ubiquitous, particularly in military and medical domains. As the technology that drives these simulations grows ever more sophisticated, the underlying pedagogical models for how instruction, assessment, and feedback are implemented within these systems must evolve accordingly. In this article, we review some of the existing educational approaches to medical simulations, and present pedagogical methodologies that have been used in the design and development of games and simulations at the University of California, Los Angeles, Center for Research on Evaluation, Standards, and Student Testing. In particular, we present a methodology for how automated assessments of computer-based simulations can be implemented using ontologies and Bayesian networks, and discuss their advantages and design considerations for pedagogical use. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.
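As a toy illustration of how a Bayesian network can turn an observed simulation action into an assessment update, here is a two-node sketch in Python; the skill levels and conditional probabilities are invented for illustration and are not taken from the CRESST systems described above.

    import numpy as np

    # Toy two-node network: latent skill level -> observed simulation action.
    p_skill = np.array([0.3, 0.5, 0.2])          # P(skill = low, medium, high)
    p_correct_given_skill = np.array([0.2, 0.6, 0.9])  # P(correct action | skill)

    def posterior_skill(correct_action: bool) -> np.ndarray:
        """Bayes update of the skill estimate after observing one action."""
        like = p_correct_given_skill if correct_action else 1.0 - p_correct_given_skill
        unnorm = like * p_skill
        return unnorm / unnorm.sum()

    print(posterior_skill(True))    # belief shifts toward higher skill
    print(posterior_skill(False))   # belief shifts toward lower skill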
Fatigue criterion to system design, life and reliability
NASA Technical Reports Server (NTRS)
Zaretsky, E. V.
1985-01-01
A generalized methodology for structural life prediction, design, and reliability based upon a fatigue criterion is advanced. The life prediction methodology is based in part on the work of W. Weibull and of G. Lundberg and A. Palmgren. The approach incorporates the computed lives of elemental stress volumes of a complex machine element to predict system life. The results of coupon fatigue testing can be incorporated into the analysis, allowing for life prediction and component or structural renewal rates with reasonable statistical certainty.
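A common Weibull-based way to roll elemental lives up into a system life, in the spirit of the Lundberg-Palmgren approach referenced above, is sketched below in Python; the element lives and Weibull slope are illustrative assumptions.

    def system_life(element_lives, weibull_slope):
        """Combine element lives (taken at a common reliability level) into a
        system life using the two-parameter Weibull relation
            (1 / L_sys)**e = sum_i (1 / L_i)**e,
        the strict-series assumption underlying Lundberg-Palmgren-type analyses."""
        e = weibull_slope
        return sum(L ** (-e) for L in element_lives) ** (-1.0 / e)

    # Three hypothetical stressed volumes with lives of 5000, 8000 and 12000 h:
    print(system_life([5000.0, 8000.0, 12000.0], weibull_slope=1.5))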
NASA Technical Reports Server (NTRS)
Newman, P. A.; Hou, G. J.-W.; Jones, H. E.; Taylor, A. C., III; Korivi, V. M.
1992-01-01
How a combination of various computational methodologies could reduce the enormous computational costs envisioned in using advanced CFD codes in gradient-based optimized multidisciplinary design (MdD) procedures is briefly outlined. Implications of these MdD requirements upon advanced CFD codes are somewhat different than those imposed by a single-discipline design. A means for satisfying these MdD requirements for gradient information is presented which appears to permit: (1) some leeway in the CFD solution algorithms which can be used; (2) an extension to 3-D problems; and (3) straightforward use of other computational methodologies. Many of these observations have previously been discussed as possibilities for doing parts of the problem more efficiently; the contribution here is observing how they fit together in a mutually beneficial way.
Semi-Supervised Learning of Lift Optimization of Multi-Element Three-Segment Variable Camber Airfoil
NASA Technical Reports Server (NTRS)
Kaul, Upender K.; Nguyen, Nhan T.
2017-01-01
This chapter describes a new intelligent platform for learning optimal designs of morphing wings based on Variable Camber Continuous Trailing Edge Flaps (VCCTEF) in conjunction with a leading edge flap called the Variable Camber Krueger (VCK). The new platform consists of a Computational Fluid Dynamics (CFD) methodology coupled with a semi-supervised learning methodology. The CFD component of the intelligent platform comprises a full Navier-Stokes solution capability (NASA OVERFLOW solver with the Spalart-Allmaras turbulence model) that computes flow over a tri-element inboard NASA Generic Transport Model (GTM) wing section. Various VCCTEF/VCK settings and configurations were considered to explore optimal designs for high-lift flight during take-off and landing. To determine a globally optimal design of such a system, an extremely large set of CFD simulations is needed, which is not feasible to achieve in practice. To alleviate this problem, recourse was taken to a semi-supervised learning (SSL) methodology, which is based on manifold regularization techniques. A reasonable space of CFD solutions was populated and then the SSL methodology was used to fit this manifold in its entirety, including the gaps in the manifold where no CFD solutions were available. The SSL methodology, in conjunction with an elastodynamic solver (FiDDLE), was demonstrated in an earlier study involving structural health monitoring. These CFD-SSL methodologies define the new intelligent platform that forms the basis for our search for optimal designs of wings. Although the present platform can be used in various other design and operational problems in engineering, this chapter focuses on the high-lift study of the VCK-VCCTEF system. The top few candidate design configurations were identified by solving the CFD problem in a small subset of the design space. The SSL component was trained on the design space, and was then used in a predictive mode to populate a selected set of test points outside of the given design space. The new design test space thus populated was evaluated using the CFD component by determining the error between the SSL predictions and the true (CFD) solutions, which was found to be small. This demonstrates the proposed CFD-SSL methodologies for isolating the best design of the VCK-VCCTEF system, and it holds promise for quantitatively identifying the best designs of flight systems in general.
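The manifold-regularization idea can be illustrated with a small Laplacian-regularized regression over a design space in which only a few points carry "CFD" labels; the Python sketch below is a generic semi-supervised fit under assumed kernel and regularization settings, not the SSL code used with OVERFLOW or FiDDLE.

    import numpy as np

    def laplacian_ssl_regression(X, y, labeled_mask, sigma=1.0, lam=1.0):
        """Semi-supervised regression via Laplacian (manifold) regularization:
        minimise ||f_L - y_L||^2 + lam * f^T L f over predictions f at all
        points, where L is the graph Laplacian of a Gaussian similarity graph."""
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        W = np.exp(-d2 / (2 * sigma ** 2))
        np.fill_diagonal(W, 0.0)
        L = np.diag(W.sum(1)) - W                      # graph Laplacian
        S = np.diag(labeled_mask.astype(float))        # selects labeled points
        rhs = S @ np.where(labeled_mask, y, 0.0)
        return np.linalg.solve(S + lam * L, rhs)       # predictions everywhere

    # Toy 1-D design space with a few "CFD-evaluated" (labeled) points:
    X = np.linspace(0, 1, 20)[:, None]
    y_true = np.sin(2 * np.pi * X[:, 0])
    labeled = np.zeros(20, bool)
    labeled[[0, 5, 10, 15, 19]] = True
    f = laplacian_ssl_regression(X, y_true, labeled, sigma=0.15, lam=0.05)
    print(np.round(f, 2))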
INTEGRATION OF POLLUTION PREVENTION TOOLS
A prototype computer-based decision support system was designed to provide small businesses with an integrated pollution prevention methodology. Preliminary research involved compilation of an inventory of existing pollution prevention tools (i.e., methodologies, software, etc.),...
Developing Pedagogical Practices for English-Language Learners: A Design-Based Approach
ERIC Educational Resources Information Center
Iddings, Ana Christina DaSilva; Rose, Brian Christopher
2012-01-01
This study draws on the application of sociocultural theory to second-language learning and teaching to examine the impact of a design-based research approach on teacher development and literacy instruction to English-language learners (ELLs). Design-based research methodology was employed to derive theoretical suppositions relating to the process…
Kamal, Noreen; Fels, Sidney
2013-01-01
Positive health behaviour is critical to preventing illness and managing chronic conditions. A user-centred methodology was employed to design an online social network to motivate health behaviour change. The methodology was augmented by utilizing the Appeal, Belonging, Commitment (ABC) Framework, which is based on theoretical models for health behaviour change and use of online social networks. The user-centred methodology included four phases: 1) initial user inquiry on health behaviour and use of online social networks; 2) interview feedback on paper prototypes; 3) laboratory study on a medium fidelity prototype; and 4) a field study on the high fidelity prototype. The points of inquiry through these phases were based on the ABC Framework. This yielded an online social network system that linked to external third party databases to deploy to users via an interactive website.
Control/structure interaction design methodology
NASA Technical Reports Server (NTRS)
Briggs, Hugh C.; Layman, William E.
1989-01-01
The Control Structure Interaction Program is a technology development program for spacecraft that exhibit interactions between the control system and structural dynamics. The program objectives include development and verification of new design concepts (such as active structure) and new tools (such as a combined structure and control optimization algorithm) and their verification in ground and possibly flight test. The new CSI design methodology is centered around interdisciplinary engineers using new tools that closely integrate structures and controls. Verification is an important CSI theme and analysts will be closely integrated to the CSI Test Bed laboratory. Components, concepts, tools and algorithms will be developed and tested in the lab and in future Shuttle-based flight experiments. The design methodology is summarized in block diagrams depicting the evolution of a spacecraft design and descriptions of analytical capabilities used in the process. The multiyear JPL CSI implementation plan is described along with the essentials of several new tools. A distributed network of computation servers and workstations was designed that will provide a state-of-the-art development base for the CSI technologies.
Bayesian design of decision rules for failure detection
NASA Technical Reports Server (NTRS)
Chow, E. Y.; Willsky, A. S.
1984-01-01
The formulation of the decision making process of a failure detection algorithm as a Bayes sequential decision problem provides a simple conceptualization of the decision rule design problem. As the optimal Bayes rule is not computable, a methodology that is based on the Bayesian approach and aimed at a reduced computational requirement is developed for designing suboptimal rules. A numerical algorithm is constructed to facilitate the design and performance evaluation of these suboptimal rules. The result of applying this design methodology to an example shows that this approach is potentially a useful one.
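One simple suboptimal rule in this Bayesian spirit is to track the posterior probability that a failure has occurred and alarm when it crosses a cost-derived threshold; the Python sketch below assumes a bias-shift failure model with Gaussian residuals, which is an illustrative assumption rather than the paper's example.

    from math import exp
    import numpy as np

    def posterior_threshold_detector(residuals, p_fail_prior=0.01, q=0.01,
                                     sigma=1.0, bias=2.0, threshold=0.95):
        """Suboptimal sequential rule: recursively update the posterior
        probability that a failure (a bias shift in the residual) has
        occurred and declare a failure when it exceeds the threshold."""
        p = p_fail_prior
        for k, r in enumerate(residuals):
            p = p + (1.0 - p) * q                      # failure may occur now
            l_fail = exp(-0.5 * ((r - bias) / sigma) ** 2)
            l_ok = exp(-0.5 * (r / sigma) ** 2)
            p = p * l_fail / (p * l_fail + (1.0 - p) * l_ok)
            if p > threshold:
                return k, p                            # declare failure at step k
        return None, p

    rng = np.random.default_rng(0)
    res = np.concatenate([rng.normal(0, 1, 30), rng.normal(2, 1, 30)])  # fault at k=30
    print(posterior_threshold_detector(res))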
Design Optimization of Liquid Nitrogen Based IQF Tunnel
NASA Astrophysics Data System (ADS)
Datye, A. B.; Narayankhedkar, K. G.; Sharma, G. K.
2006-04-01
A design methodology for an Individual Quick Freezing (IQF) tunnel using liquid nitrogen is developed, and the design based on this methodology is validated using data from commercial tunnels. The design accounts for heat gains due to the conveyor belt, which is exposed to the atmosphere at the infeed and outfeed ends. The design also considers the heat gains through the insulation as well as those due to circulating fans located within the tunnel. For minimum liquid nitrogen consumption, the ratio of the belt length L (from infeed to outfeed) to the belt width W can be considered as a design parameter. The comparison of predicted and reported liquid nitrogen consumption (experimental data) shows good agreement, within 10%.
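The liquid nitrogen demand implied by such a design follows from a steady-state heat balance: the total heat load divided by the latent heat of vaporisation plus the sensible heat picked up by the cold vapour. The Python sketch below uses illustrative load and property values that are not taken from the paper.

    def ln2_consumption(q_product_kw, q_belt_kw, q_insulation_kw, q_fans_kw,
                        h_fg_kj_per_kg=199.0, cp_vap_kj_per_kgk=1.04,
                        t_exhaust_c=-100.0, t_sat_c=-196.0):
        """Rough liquid-nitrogen mass flow (kg/s) from a steady-state heat
        balance; property values and the split of loads are assumptions."""
        q_total = q_product_kw + q_belt_kw + q_insulation_kw + q_fans_kw  # kW = kJ/s
        dh = h_fg_kj_per_kg + cp_vap_kj_per_kgk * (t_exhaust_c - t_sat_c)  # kJ/kg
        return q_total / dh

    print(f"{ln2_consumption(60.0, 8.0, 3.0, 4.0) * 3600:.0f} kg/h of LN2")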
NASA Astrophysics Data System (ADS)
McPhee, J.; William, Y. W.
2005-12-01
This work presents a methodology for pumping test design based on the reliability requirements of a groundwater model. Reliability requirements take into consideration the application of the model results in groundwater management, expressed in this case as a multiobjective management model. The pumping test design is formulated as a mixed-integer nonlinear programming (MINLP) problem and solved using a combination of a genetic algorithm (GA) and gradient-based optimization. Bayesian decision theory provides a formal framework for assessing the influence of parameter uncertainty on the reliability of the proposed pumping test. The proposed methodology is useful for selecting a robust design that will outperform all other candidate designs under most potential 'true' states of the system.
Finite element analysis of container ship's cargo hold using ANSYS and POSEIDON software
NASA Astrophysics Data System (ADS)
Tanny, Tania Tamiz; Akter, Naznin; Amin, Osman Md.
2017-12-01
Nowadays, ship structural analysis has become an integral part of preliminary ship design, providing further support for the development and detail design of ship structures. Structural analyses of container ships' cargo holds are carried out to balance safety and capacity, as those ships are exposed to a high risk of structural damage during a voyage. Two different design methodologies have been considered for the structural analysis of a container ship's cargo hold. One is a rule-based methodology and the other is a more conventional software-based analysis. The rule-based analysis is done with DNV-GL's software POSEIDON and the conventional package-based analysis is done with the ANSYS structural module. Both methods have been applied to analyze some of the mechanical properties of the model, such as total deformation, stress-strain distribution, von Mises stress, and fatigue, following different design bases and approaches, to provide some guidance for further improvements in ship structural design.
Design Methodology for Magnetic Field-Based Soft Tri-Axis Tactile Sensors.
Wang, Hongbo; de Boer, Greg; Kow, Junwai; Alazmani, Ali; Ghajari, Mazdak; Hewson, Robert; Culmer, Peter
2016-08-24
Tactile sensors are essential if robots are to safely interact with the external world and to dexterously manipulate objects. Current tactile sensors have limitations restricting their use, notably being too fragile or having limited performance. Magnetic field-based soft tactile sensors offer a potential improvement, being durable, low cost, accurate and high bandwidth, but they are relatively undeveloped because of the complexities involved in design and calibration. This paper presents a general design methodology for magnetic field-based three-axis soft tactile sensors, enabling researchers to easily develop specific tactile sensors for a variety of applications. All aspects (design, fabrication, calibration and evaluation) of the development of tri-axis soft tactile sensors are presented and discussed. A moving least square approach is used to decouple and convert the magnetic field signal to force output to eliminate non-linearity and cross-talk effects. A case study of a tactile sensor prototype, MagOne, was developed. This achieved a resolution of 1.42 mN in normal force measurement (0.71 mN in shear force), good output repeatability and has a maximum hysteresis error of 3.4%. These results outperform comparable sensors reported previously, highlighting the efficacy of our methodology for sensor design.
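A moving least squares fit of the kind mentioned above can be sketched compactly: weight the calibration samples by their distance to the query field reading and solve a local weighted least-squares problem. The synthetic calibration data, the linear local model, and the Gaussian weight bandwidth below are illustrative assumptions, not MagOne's calibration data.

```python
# Minimal moving-least-squares sketch mapping a 3-axis magnetic field
# reading to a 3-axis force estimate. The synthetic calibration data and
# Gaussian weight width are illustrative assumptions, not MagOne's data.
import numpy as np

def mls_predict(b_query, B_cal, F_cal, bandwidth=0.2):
    """Locally weighted linear fit: weight calibration samples by their
    distance to the query field reading, then solve weighted least squares."""
    d2 = np.sum((B_cal - b_query) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))
    X = np.hstack([B_cal, np.ones((len(B_cal), 1))])   # linear + offset terms
    sw = np.sqrt(w)[:, None]
    coeffs, *_ = np.linalg.lstsq(X * sw, F_cal * sw, rcond=None)
    return np.append(b_query, 1.0) @ coeffs

rng = np.random.default_rng(1)
B_cal = rng.uniform(-1, 1, size=(200, 3))              # simulated field readings
true_map = np.array([[2.0, 0.1, 0.0],
                     [0.1, 2.0, 0.0],
                     [0.0, 0.0, 5.0]])
F_cal = B_cal @ true_map + 0.01 * rng.normal(size=(200, 3))
print(mls_predict(np.array([0.3, -0.2, 0.5]), B_cal, F_cal))
```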
NASA Astrophysics Data System (ADS)
Ren, Lei; Zhang, Lin; Tao, Fei; (Luke) Zhang, Xiaolong; Luo, Yongliang; Zhang, Yabin
2012-08-01
Multidisciplinary design of complex products leads to an increasing demand for high performance simulation (HPS) platforms. One great challenge is how to achieve highly efficient utilisation of large-scale simulation resources in distributed and heterogeneous environments. This article reports a virtualisation-based methodology to realise a HPS platform. This research is driven by the issues concerning large-scale simulation resources deployment and complex simulation environment construction, efficient and transparent utilisation of fine-grained simulation resources, and highly reliable simulation with fault tolerance. A framework of a virtualisation-based simulation platform (VSIM) is first proposed. Then the article investigates and discusses key approaches in VSIM, including simulation resources modelling, a method for automatically deploying simulation resources for dynamic construction of the system environment, and a live migration mechanism in case of faults in run-time simulation. Furthermore, the proposed methodology is applied to a multidisciplinary design system for aircraft virtual prototyping and some experiments are conducted. The experimental results show that the proposed methodology can (1) significantly improve the utilisation of fine-grained simulation resources, (2) result in a great reduction in deployment time and an increased flexibility for simulation environment construction and (3) achieve fault-tolerant simulation.
Alppay, Cem; Bayazit, Nigan
2015-11-01
In this paper, we study the arrangement of displays in flight instrument panels of multi-purpose civil helicopters following a user-centered design method based on ergonomics principles. Our methodology can also be described as a user-interface arrangement methodology based on user opinions and preferences. This study can be outlined as gathering user-centered data using two different research methods and then analyzing and integrating the collected data to arrive at an optimal instrument panel design. An interview with helicopter pilots formed the first step of our research. In that interview, pilots were asked to provide a quantitative evaluation of basic interface arrangement principles. In the second phase of the research, a paper prototyping study was conducted with the same pilots. The final phase of the study entailed synthesizing the findings from the interviews and observational studies to formulate an optimal flight instrument arrangement methodology. The primary results presented in this paper are the methodology that we developed and three new interface arrangement concepts, namely the relationship of inseparability, integrated value and locational value. An optimum instrument panel arrangement is also proposed by the researchers. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Designing Online Problem Representation Engine for Conceptual Change
ERIC Educational Resources Information Center
Lee, Chwee Beng; Ling, Keck Voon
2012-01-01
Purpose: This paper aims to describe the web-based scaffold dynamic simulation system (PRES-on) designed for pre-service teachers. Design/methodology/approach: The paper describes the initial design of a web-based scaffold dynamic simulation system (PRES-on) as a cognitive tool for learners to represent problems. For the widespread use of the…
NASA Technical Reports Server (NTRS)
Myers, Thomas T.; Mcruer, Duane T.
1988-01-01
The development of a comprehensive and eclectic methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. The FCX expert system as presently developed is only a limited prototype capable of supporting basic lateral-directional FCS design activities related to the design example used. FCX presently supports design of only one FCS architecture (yaw damper plus roll damper) and the rules are largely focused on Class IV (highly maneuverable) aircraft. Despite this limited scope, the major elements which appear necessary for application of knowledge-based software concepts to flight control design were assembled, and thus FCX represents a prototype which can be tested, critiqued and evolved in an ongoing process of development.
NASA Technical Reports Server (NTRS)
Celaya, Jose R.; Saha, Sankalita; Goebel, Kai
2011-01-01
Accelerated aging methodologies for electrolytic components have been designed and accelerated aging experiments have been carried out. The methodology is based on imposing electrical and/or thermal overstresses via electrical power cycling in order to mimic real-world operating behavior. Data are collected in-situ and offline in order to periodically characterize the devices' electrical performance as they age. The data generated through these experiments are meant to provide a capability for the validation of prognostic algorithms (both model-based and data-driven). Furthermore, the data allow validation of physics-based and empirically based degradation models for this type of capacitor. A first set of models and algorithms has been designed and tested on the data.
SRB ascent aerodynamic heating design criteria reduction study, volume 1
NASA Technical Reports Server (NTRS)
Crain, W. K.; Frost, C. L.; Engel, C. D.
1989-01-01
An independent set of solid rocket booster (SRB) convective ascent design environments was produced to serve as a check on the Rockwell IVBC-3 environments used to design the ascent phase of flight. In addition, support was provided for lowering the design environments such that Thermal Protection System (TPS) applied on the basis of conservative estimates could be removed, leading to a reduction in SRB refurbishment time and cost. Ascent convective heating rates and loads were generated at locations on the SRB where lowering the thermal environment would impact the TPS design. The ascent thermal environments are documented along with the wind tunnel/flight test data base used, as well as the trajectory and environment generation methodology. The methodology, as well as environment summaries compared to the 1980 Design and Rockwell IVBC-3 Design Environments, is presented in this volume (volume 1).
1987-03-01
contends his soft systems methodology is such an approach. [Ref. 2: pp. 105-107] Overview of this Methodology is meant for addressing fuzzy, ill...could form the basis of office systems development: Checkland's (1981) soft systems methodology, Pava's (1983) sociotechnical design, and Mumford and
DOT National Transportation Integrated Search
1995-01-01
This report describes the development of a methodology designed to assure that a sufficiently high level of safety is achieved and maintained in computer-based systems which perform safety critical functions in high-speed rail or magnetic levitation ...
DOT National Transportation Integrated Search
1995-09-01
This report describes the development of a methodology designed to assure that a sufficiently high level of safety is achieved and maintained in computer-based systems which perform safety critical functions in high-speed rail or magnetic levitation ...
Methodological, Theoretical, Infrastructural, and Design Issues in Conducting Good Outcome Studies
ERIC Educational Resources Information Center
Kelly, Michael P.; Moore, Tessa A.
2011-01-01
This article outlines a set of methodological, theoretical, and other issues relating to the conduct of good outcome studies. The article begins by considering the contribution of evidence-based medicine to the methodology of outcome research. The lessons which can be applied in outcome studies in nonmedical settings are described. The article…
Modeling Web-Based Educational Systems: Process Design Teaching Model
ERIC Educational Resources Information Center
Rokou, Franca Pantano; Rokou, Elena; Rokos, Yannis
2004-01-01
Using modeling languages is essential to the construction of educational systems based on software engineering principles and methods. Furthermore, the instructional design is undoubtedly the cornerstone of the design and development of educational systems. Although several methodologies and languages have been proposed for the specification of…
ERIC Educational Resources Information Center
Joseph, Diana
2004-01-01
Recent interest in design-based research as a research and development methodology in education has begun to clarify the goals and commitments involved in this practice. So far, we have limited views into how the work of design and the work of research impact each other in the course of design-based investigations. In this article, I use the…
Stoll, S; Roelcke, V; Raspe, H
2005-07-29
The article addresses the history of evidence-based medicine in Germany. Its aim was to reconstruct the standard of clinical-therapeutic investigation in Germany at the beginning of the 20th century. Through a historical investigation of five important German general medical journals for the period between 1918 and 1932, an overview of the situation of clinical investigation is given. 268 clinical trials are identified and analysed with regard to their methodological design. Heterogeneous results are found: while a few examples of sophisticated methodology exist, the design of the majority of the studies is poor. A response to the situation described can be seen in Paul Martini's book "Methodology of Therapeutic Investigation", first published in 1932. Paul Martini's biography, his criticism of the situation of clinical-therapeutic investigation of his time, the major points of his methodology and the reception of the book in Germany and abroad are described.
ERIC Educational Resources Information Center
Carter, Erik W.; Brock, Matthew E.; Bottema-Beutel, Kristen; Bartholomew, Audrey; Boehm, Thomas L.; Cease-Cook, Jennifer
2013-01-01
Prevailing policy and practice in the field of transition emphasizes the importance of designing services and supports based on research-based practices. We reviewed every article published across the 35-year history of "Career Development and Transition for Exceptional Individuals" (CDTEI) to document methodological trends in research…
Systemic Sustainability in RtI Using Intervention-Based Scheduling Methodologies
ERIC Educational Resources Information Center
Dallas, William P.
2017-01-01
This study evaluated a scheduling methodology referred to as intervention-based scheduling to address the problem of practice regarding the fidelity of implementing Response to Intervention (RtI) in an existing school schedule design. Employing panel data, this study used fixed-effects regressions and first differences ordinary least squares (OLS)…
A flight-test methodology for identification of an aerodynamic model for a V/STOL aircraft
NASA Technical Reports Server (NTRS)
Bach, Ralph E., Jr.; Mcnally, B. David
1988-01-01
Described is a flight test methodology for developing a data base to be used to identify an aerodynamic model of a vertical and short takeoff and landing (V/STOL) fighter aircraft. The aircraft serves as a test bed at Ames for ongoing research in advanced V/STOL control and display concepts. The flight envelope to be modeled includes hover, transition to conventional flight and back to hover, STOL operation, and normal cruise. Although the aerodynamic model is highly nonlinear, it has been formulated to be linear in the parameters to be identified. Motivation for the flight test methodology advocated in this paper is based on the choice of a linear least-squares method for model identification. The paper covers elements of the methodology from maneuver design to the completed data base. Major emphasis is placed on the use of state estimation with tracking data to ensure consistency among maneuver variables prior to their entry into the data base. The design and processing of a typical maneuver are illustrated.
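Because the model is formulated to be linear in the unknown parameters, identification reduces to ordinary least squares over the flight-test data base. The sketch below illustrates that step with an assumed lift-coefficient regressor set (angle of attack, its square, and elevator deflection); the regressors and coefficient values are hypothetical, not the V/STOL model of the paper.

```python
# Sketch of identifying aerodynamic-model parameters that enter linearly,
# using ordinary least squares as the abstract describes. The regressors
# (alpha, alpha^2, elevator deflection) and coefficients are assumptions.
import numpy as np

rng = np.random.default_rng(2)
n = 400
alpha = rng.uniform(-0.2, 0.3, n)        # angle of attack, rad
delta_e = rng.uniform(-0.1, 0.1, n)      # elevator deflection, rad

# Model is nonlinear in alpha but linear in the unknown coefficients:
# CL = CL0 + CL_alpha*alpha + CL_alpha2*alpha**2 + CL_de*delta_e
X = np.column_stack([np.ones(n), alpha, alpha**2, delta_e])
theta_true = np.array([0.2, 5.0, -8.0, 0.6])
CL_measured = X @ theta_true + 0.01 * rng.normal(size=n)

theta_hat, *_ = np.linalg.lstsq(X, CL_measured, rcond=None)
print("identified coefficients:", np.round(theta_hat, 3))
```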
NASA Astrophysics Data System (ADS)
Adhikari, Pashupati Raj
Materials selection is among the most important aspects of product design and development. A knowledge-based system (KBS) and some of the methodologies used in materials selection for the design of aircraft cabin metallic structures are discussed. Overall aircraft weight reduction means substantially less fuel consumption; part of the solution to this problem is to find a way to reduce the overall weight of the metallic structures inside the cabin. Among the various materials selection methodologies using Multi Criterion Decision Making (MCDM) techniques, a few are demonstrated with examples and the results are compared with those obtained using Ashby's approach to materials selection. Pre-defined constraint values, mainly mechanical properties, are employed as the relevant attributes in the process. Aluminum alloys with high strength-to-weight ratio have been second to none in the manufacture of most aircraft parts. Magnesium alloys, which are much lighter than the Al alloys currently used in these structures, are evaluated as alternatives using the methodologies, and the ranked results are compared. Each material attribute considered in the design is categorized as a benefit or non-benefit attribute. Using Ashby's approach, material indices that must be maximized for optimum performance are determined, and materials are ranked based on the average of the consolidated index rankings. The ranking results are compared for any disparity among the methodologies.
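The Ashby-style step described above (compute material indices to be maximized, rank each candidate per index, then average the ranks) can be sketched as follows. The handbook-order property values and the two indices used are illustrative assumptions, not the attribute set used in the study.

```python
# Sketch of Ashby-style material indices and consolidated ranking, in the
# spirit of the comparison described above. Property values are rough
# handbook-order-of-magnitude figures used only for illustration.
import numpy as np

materials = ["Al 7075-T6", "Mg AZ31B", "Ti-6Al-4V"]
rho   = np.array([2810.0, 1770.0, 4430.0])   # kg/m^3
E     = np.array([71.7e9, 45.0e9, 113.8e9])  # Pa
sigma = np.array([503e6, 200e6, 880e6])      # Pa, yield strength

# Indices to maximize for a light, stiff, strong structure
indices = np.vstack([E / rho, sigma / rho])

# Rank each material per index (0 = best), then average the ranks
ranks = np.argsort(np.argsort(-indices, axis=1), axis=1)
consolidated = ranks.mean(axis=0)
order = np.argsort(consolidated)
for i in order:
    print(f"{materials[i]:12s} avg rank {consolidated[i]:.1f}")
```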
Base Stabilization Guidance and Additive Selection for Pavement Design and Rehabilitation
DOT National Transportation Integrated Search
2017-12-01
Significant improvements have been made in base stabilization practice that include design specifications and methodology, experience with the selection of stabilizing additives, and equipment for distribution and uniform blending of additives. For t...
Co-design of RAD and ETHICS methodologies: a combination of information system development methods
NASA Astrophysics Data System (ADS)
Nasehi, Arezo; Shahriyari, Salman
2011-12-01
Co-design is a new trend in the social world which tries to capture different ideas in order to use the most appropriate features for a system. In this paper, the co-design of two information system methodologies is considered: rapid application development (RAD) and effective technical and human implementation of computer-based systems (ETHICS). We consider the characteristics of these methodologies to assess the possibility of a co-design, or combination, of them for developing an information system. To this end, four different aspects are analyzed: social or technical approach, user participation and user involvement, job satisfaction, and overcoming resistance to change. Finally, a case study using a quantitative method is analyzed in order to examine the possibility of co-design using these factors. The paper concludes that RAD and ETHICS are appropriate for co-design and offers some suggestions for the co-design.
Statistical power calculations for mixed pharmacokinetic study designs using a population approach.
Kloprogge, Frank; Simpson, Julie A; Day, Nicholas P J; White, Nicholas J; Tarning, Joel
2014-09-01
Simultaneous modelling of dense and sparse pharmacokinetic data is possible with a population approach. To determine the number of individuals required to detect the effect of a covariate, simulation-based power calculation methodologies can be employed. The Monte Carlo Mapped Power method (a simulation-based power calculation methodology using the likelihood ratio test) was extended in the current study to perform sample size calculations for mixed pharmacokinetic studies (i.e. both sparse and dense data collection). A workflow guiding an easy and straightforward pharmacokinetic study design, considering also the cost-effectiveness of alternative study designs, was used in this analysis. Initially, data were simulated for a hypothetical drug and then for the anti-malarial drug, dihydroartemisinin. Two datasets (sampling design A: dense; sampling design B: sparse) were simulated using a pharmacokinetic model that included a binary covariate effect and subsequently re-estimated using (1) the same model and (2) a model not including the covariate effect in NONMEM 7.2. Power calculations were performed for varying numbers of patients with sampling designs A and B. Study designs with statistical power >80% were selected and further evaluated for cost-effectiveness. The simulation studies of the hypothetical drug and the anti-malarial drug dihydroartemisinin demonstrated that the simulation-based power calculation methodology, based on the Monte Carlo Mapped Power method, can be utilised to evaluate and determine the sample size of mixed (part sparsely and part densely sampled) study designs. The developed method can contribute to the design of robust and efficient pharmacokinetic studies.
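The core idea of a simulation-based power calculation with the likelihood ratio test can be illustrated with a deliberately simplified stand-in: here a plain linear model with a binary covariate replaces the NONMEM population pharmacokinetic model, and power is the fraction of simulated datasets in which the test statistic exceeds the chi-square critical value. Effect size, variability, and sample sizes are assumed values.

```python
# Simplified simulation-based power calculation via the likelihood ratio
# test, illustrating the idea behind the workflow above. A population
# pharmacokinetic model is replaced here by a plain linear model with a
# binary covariate; the effect size and variability are assumed values.
import numpy as np

rng = np.random.default_rng(3)

def lrt_significant(n, effect=0.3, sigma=1.0, crit=3.84):
    cov = rng.integers(0, 2, n)                 # binary covariate
    y = 1.0 + effect * cov + rng.normal(0, sigma, n)
    # Full model: separate means per covariate group; reduced: single mean
    rss_full = sum(((y[cov == g] - y[cov == g].mean()) ** 2).sum() for g in (0, 1))
    rss_red = ((y - y.mean()) ** 2).sum()
    lrt = n * np.log(rss_red / rss_full)        # -2 * log-likelihood ratio
    return lrt > crit                           # chi-square(1) at alpha=0.05

for n in (50, 100, 200, 400):
    power = np.mean([lrt_significant(n) for _ in range(500)])
    print(f"n={n:4d}  estimated power={power:.2f}")
```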
NASA Technical Reports Server (NTRS)
1979-01-01
Information to identify viable coal gasification and utilization technologies is presented. Analysis capabilities required to support the design and implementation of coal-based synthetic fuels complexes are identified. The potential market in the Southeast United States for coal-based synthetic fuels is investigated. A requirements analysis to identify the types of modeling and analysis capabilities required to conduct and monitor coal gasification project designs is discussed. Models and methodologies to satisfy these requirements are identified and evaluated, and recommendations are developed. Requirements for development of the technology and data needed to improve gasification feasibility and economics are examined.
Designing for Inquiry-Based Learning with the Learning Activity Management System
ERIC Educational Resources Information Center
Levy, P.; Aiyegbayo, O.; Little, S.
2009-01-01
This paper explores the relationship between practitioners' pedagogical purposes, values and practices in designing for inquiry-based learning in higher education, and the affordances of the Learning Activity Management System (LAMS) as a tool for creating learning designs in this context. Using a qualitative research methodology, variation was…
Web-Based Learning Design Tool
ERIC Educational Resources Information Center
Bruno, F. B.; Silva, T. L. K.; Silva, R. P.; Teixeira, F. G.
2012-01-01
Purpose: The purpose of this paper is to propose a web-based tool that enables the development and provision of learning designs and its reuse and re-contextualization as generative learning objects, aimed at developing educational materials. Design/methodology/approach: The use of learning objects can facilitate the process of production and…
Hafnium transistor process design for neural interfacing.
Parent, David W; Basham, Eric J
2009-01-01
A design methodology is presented that uses 1-D process simulations of Metal Insulator Semiconductor (MIS) structures to design the threshold voltage of hafnium oxide based transistors used for neural recording. The methodology comprises 1-D analytical equations for threshold voltage specification and doping profiles, and 1-D MIS Technology Computer-Aided Design (TCAD) to design a process that implements a specific threshold voltage, which minimized simulation time. The process was then verified with a 2-D process/electrical TCAD simulation. Hafnium oxide films (HfO) were grown and characterized for dielectric constant and fixed oxide charge at various annealing temperatures, two important design variables in threshold voltage design.
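A minimal 1-D analytical threshold-voltage calculation of the sort the methodology starts from is sketched below. The dielectric constant, fixed oxide charge, work-function difference, and doping sweep are illustrative assumptions, not the paper's measured HfO film data.

```python
# Minimal 1-D analytical threshold-voltage sketch in the spirit of the
# methodology above. The dielectric constant, fixed charge, doping and
# work-function values are illustrative assumptions, not the paper's data.
import numpy as np

Q = 1.602e-19        # C
EPS0 = 8.854e-12     # F/m
EPS_SI = 11.7 * EPS0
KT_Q = 0.0259        # V at 300 K
NI = 1.0e16          # 1/m^3, intrinsic carrier concentration of Si

def threshold_voltage(na_m3, t_ox_m, k_ox=20.0, q_fixed=1e-8, phi_ms=-0.5):
    """Vt = Vfb + 2*phi_F + Qdep/Cox for an n-channel MIS structure.
    q_fixed is the fixed oxide charge density in C/cm^2 (converted below)."""
    c_ox = k_ox * EPS0 / t_ox_m                       # F/m^2
    phi_f = KT_Q * np.log(na_m3 / NI)                 # bulk Fermi potential, V
    v_fb = phi_ms - (q_fixed * 1e4) / c_ox            # C/cm^2 -> C/m^2
    q_dep = np.sqrt(2.0 * Q * EPS_SI * na_m3 * 2.0 * phi_f)
    return v_fb + 2.0 * phi_f + q_dep / c_ox

# Sweep substrate doping to hit a target Vt with a 10 nm hafnium oxide film
for na in (1e22, 1e23, 1e24):                         # 1/m^3
    print(f"Na={na:.0e} m^-3  Vt={threshold_voltage(na, 10e-9):.2f} V")
```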
NASA Astrophysics Data System (ADS)
Hunka, Frantisek; Matula, Jiri
2017-07-01
A transaction-based approach is utilized in some business process modeling methodologies. Essential parts of these transactions are human beings; the notion of an agent or actor role is usually used for them. Using a particular example, the paper describes the possibilities of the Design and Engineering Methodology for Organizations (DEMO) and the Resource-Event-Agent (REA) methodology. Whereas the DEMO methodology can be regarded as a generic methodology having its foundation in the theory of Enterprise Ontology, the REA methodology is regarded as a domain-specific methodology and has its origin in accountancy systems. The result of these approaches is that the DEMO methodology captures everything that happens in reality with good empirical evidence, whereas the REA methodology captures only changes connected with economic events. Economic events represent either a change of property rights to an economic resource or the consumption or production of economic resources. This results from the essence of economic events and their connection to economic resources.
Recent activities within the Aeroservoelasticity Branch at the NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Noll, Thomas E.; Perry, Boyd, III; Gilbert, Michael G.
1989-01-01
The objective of research in aeroservoelasticity at the NASA Langley Research Center is to enhance the modeling, analysis, and multidisciplinary design methodologies for obtaining multifunction digital control systems for application to flexible flight vehicles. Recent accomplishments are discussed, and a status report on current activities within the Aeroservoelasticity Branch is presented. In the area of modeling, improvements to the Minimum-State Method of approximating unsteady aerodynamics are shown to provide precise, low-order aeroservoelastic models for design and simulation activities. Analytical methods based on Matched Filter Theory and Random Process Theory to provide efficient and direct predictions of the critical gust profile and the time-correlated gust loads for linear structural design considerations are also discussed. Two research projects leading towards improved design methodology are summarized. The first program is developing an integrated structure/control design capability based on hierarchical problem decomposition, multilevel optimization and analytical sensitivities. The second program provides procedures for obtaining low-order, robust digital control laws for aeroelastic applications. In terms of methodology validation and application the current activities associated with the Active Flexible Wing project are reviewed.
Mandated Change Gone Wrong? A Case Study of Law-Based School Reform in South Africa
ERIC Educational Resources Information Center
Bisschoff, Tom
2009-01-01
Purpose: This paper aims to explore and describe the limits of recent law-based school reform in South Africa from an education management perspective. Design/methodology/approach: The research design consists of a qualitative, investigative, descriptive and contextual design which Merriam would classify as a basic or generic design type.…
Rotter, Thomas; Kinsman, Leigh; James, Erica; Machotta, Andreas; Steyerberg, Ewout W
2012-06-18
The purpose of this article is to report on the quality of the existing evidence base regarding the effectiveness of clinical pathway (CPW) research in the hospital setting. The analysis is based on a recently published Cochrane review of the effectiveness of CPWs. An integral component of the review process was a rigorous appraisal of the methodological quality of published CPW evaluations. This allowed the identification of strengths and limitations of the evidence base for CPW effectiveness. We followed the validated Cochrane Effective Practice and Organisation of Care Group (EPOC) criteria for randomized and non-randomized clinical pathway evaluations. In addition, we tested the hypothesis that simple pre-post studies tend to overestimate the reported CPW effects. Out of the 260 primary studies meeting CPW content criteria, only 27 studies met the EPOC study design criteria, with the majority of CPW studies (more than 70%) excluded from the review on the basis that they were simple pre-post evaluations, mostly comparing two or more annual patient cohorts. Methodologically poor study designs are often used to evaluate CPWs and this compromises the quality of the existing evidence base. Cochrane EPOC methodological criteria, including the selection of rigorous study designs along with detailed descriptions of CPW development and implementation processes, are recommended for quantitative evaluations to improve the evidence base for the use of CPWs in hospitals.
High-performance radial AMTEC cell design for ultra-high-power solar AMTEC systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hendricks, T.J.; Huang, C.
1999-07-01
Alkali Metal Thermal to Electric Conversion (AMTEC) technology is rapidly maturing for potential application in ultra-high-power solar AMTEC systems required by potential future US Air Force (USAF) spacecraft missions in medium-earth and geosynchronous orbits (MEO and GEO). Solar thermal AMTEC power systems potentially have several important advantages over current solar photovoltaic power systems in ultra-high-power spacecraft applications for USAF MEO and GEO missions. This work presents key aspects of radial AMTEC cell design to achieve high cell performance in solar AMTEC systems delivering more than 50 kW(e) to support high-power USAF missions. These missions typically require AMTEC cell conversion efficiency greater than 25%. A sophisticated design parameter methodology is described and demonstrated which establishes optimum design parameters in any radial cell design to satisfy high-power mission requirements. Specific relationships, which are distinct functions of cell temperatures and pressures, define critical dependencies between key cell design parameters, particularly the impact of parasitic thermal losses on Beta Alumina Solid Electrolyte (BASE) area requirements, voltage, number of BASE tubes, and system power production for both maximum power-per-BASE-area and optimum efficiency conditions. Finally, some high-level system tradeoffs are demonstrated using the design parameter methodology to establish high-power radial cell design requirements and philosophy. The discussion highlights how to incorporate this methodology with sophisticated SINDA/FLUINT AMTEC cell modeling capabilities to determine optimum radial AMTEC cell designs.
Enviroplan—a summary methodology for comprehensive environmental planning and design
Robert Allen Jr.; George Nez; Fred Nicholson; Larry Sutphin
1979-01-01
This paper will discuss a comprehensive environmental assessment methodology that includes a numerical method for visual management and analysis. This methodology employs resource and human activity units as a means to produce a visual form unit which is the fundamental unit of the perceptual environment. The resource unit is based on the ecosystem as the fundamental...
ERIC Educational Resources Information Center
Diamond, Michael Jay; Shapiro, Jerrold Lee
This paper proposes a model for the long-term scientific study of encounter, T-, and sensitivity groups. The authors see the need for overcoming major methodological and design inadequacies of such research. They discuss major methodological flaws in group outcome research as including: (1) lack of adequate base rate or pretraining measures; (2)…
ERIC Educational Resources Information Center
Osler, James Edward, II
2015-01-01
This monograph provides an epistemological rationale for the Accumulative Manifold Validation Analysis [also referred to by the acronym "AMOVA"] statistical methodology designed to test psychometric instruments. This form of inquiry is a form of mathematical optimization in the discipline of linear stochastic modelling. AMOVA is an in-depth…
Computer-Aided Sensor Development Focused on Security Issues.
Bialas, Andrzej
2016-05-26
The paper examines intelligent sensor and sensor system development according to the Common Criteria methodology, which is the basic security assurance methodology for IT products and systems. The paper presents how the development process can be supported by software tools, design patterns and knowledge engineering. The automation of this process brings cost-, quality-, and time-related advantages, because the most difficult and most laborious activities are software-supported and the design reusability is growing. The paper includes a short introduction to the Common Criteria methodology and its sensor-related applications. In the experimental section the computer-supported and patterns-based IT security development process is presented using the example of an intelligent methane detection sensor. This process is supported by an ontology-based tool for security modeling and analyses. The verified and justified models are transferred straight to the security target specification representing security requirements for the IT product. The novelty of the paper is to provide a patterns-based and computer-aided methodology for the sensors development with a view to achieving their IT security assurance. The paper summarizes the validation experiment focused on this methodology adapted for the sensors system development, and presents directions of future research.
NASA Astrophysics Data System (ADS)
Perry, Dan; Nakamoto, Mark; Verghese, Nishath; Hurat, Philippe; Rouse, Rich
2007-03-01
Model-based hotspot detection and silicon-aware parametric analysis help designers optimize their chips for yield, area and performance without the high cost of applying foundries' recommended design rules. This set of DFM/ recommended rules is primarily litho-driven, but cannot guarantee a manufacturable design without imposing overly restrictive design requirements. This rule-based methodology of making design decisions based on idealized polygons that no longer represent what is on silicon needs to be replaced. Using model-based simulation of the lithography, OPC, RET and etch effects, followed by electrical evaluation of the resulting shapes, leads to a more realistic and accurate analysis. This analysis can be used to evaluate intelligent design trade-offs and identify potential failures due to systematic manufacturing defects during the design phase. The successful DFM design methodology consists of three parts: 1. Achieve a more aggressive layout through limited usage of litho-related recommended design rules. A 10% to 15% area reduction is achieved by using more aggressive design rules. DFM/recommended design rules are used only if there is no impact on cell size. 2. Identify and fix hotspots using a model-based layout printability checker. Model-based litho and etch simulation are done at the cell level to identify hotspots. Violations of recommended rules may cause additional hotspots, which are then fixed. The resulting design is ready for step 3. 3. Improve timing accuracy with a process-aware parametric analysis tool for transistors and interconnect. Contours of diffusion, poly and metal layers are used for parametric analysis. In this paper, we show the results of this physical and electrical DFM methodology at Qualcomm. We describe how Qualcomm was able to develop more aggressive cell designs that yielded a 10% to 15% area reduction using this methodology. Model-based shape simulation was employed during library development to validate architecture choices and to optimize cell layout. At the physical verification stage, the shape simulator was run at full-chip level to identify and fix residual hotspots on interconnect layers, on poly or metal 1 due to interaction between adjacent cells, or on metal 1 due to interaction between routing (via and via cover) and cell geometry. To determine an appropriate electrical DFM solution, Qualcomm developed an experiment to examine various electrical effects. After reporting the silicon results of this experiment, which showed sizeable delay variations due to lithography-related systematic effects, we also explain how contours of diffusion, poly and metal can be used for silicon-aware parametric analysis of transistors and interconnect at the cell-, block- and chip-level.
The National Visitor Use Monitoring methodology and final results for round 1
S.J. Zarnoch; E.M. White; D.B.K. English; Susan M. Kocis; Ross Arnold
2011-01-01
A nationwide, systematic monitoring process has been developed to provide improved estimates of recreation visitation on National Forest System lands. Methodology is presented to provide estimates of site visits and national forest visits based on an onsite sampling design of site-days and last-exiting recreationists. Stratification of the site days, based on site type...
Prototyping a Microcomputer-Based Online Library Catalog. Occasional Papers Number 177.
ERIC Educational Resources Information Center
Lazinger, Susan S.; Shoval, Peretz
This report examines and evaluates the application of prototyping methodology in the design of a microcomputer-based online library catalog. The methodology for carrying out the research involves a five-part examination of the problem on both the theoretical and applied levels, each of which is discussed in a separate section as follows: (1) a…
ERIC Educational Resources Information Center
Fernandes, Joana; Costa, Rute; Peres, Paula
2016-01-01
This paper aims at discussing the advantages of a methodology design grounded on a concept-based approach to Terminology applied to the most prominent scenario of current Higher Education: "blended learning." Terminology is a discipline that aims at representing, describing and defining specialized knowledge through language, putting…
Advanced Design Methodology for Robust Aircraft Sizing and Synthesis
NASA Technical Reports Server (NTRS)
Mavris, Dimitri N.
1997-01-01
Contract efforts are focused on refining the Robust Design Methodology for Conceptual Aircraft Design. Robust Design Simulation (RDS) was developed earlier as a potential solution to the need to do rapid trade-offs while accounting for risk, conflict, and uncertainty. The core of the simulation revolved around Response Surface Equations as approximations of bounded design spaces. An ongoing investigation is concerned with the advantages of using Neural Networks in conceptual design. Thought was also given to the development of a systematic way to choose or create a baseline configuration based on specific mission requirements. An expert system was developed that selects aerodynamics, performance, and weights models from several configurations based on the user's mission requirements for a subsonic civil transport. The research has also resulted in a step-by-step illustration of how to use the AMV method for distribution generation and the search for robust design solutions to multivariate constrained problems.
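The Response Surface Equation step at the core of RDS can be sketched as a quadratic surrogate fitted to a handful of samples of an expensive analysis over a bounded design space; the stand-in analysis function and the two design variables below are assumptions for illustration only.

```python
# Sketch of a quadratic Response Surface Equation fitted over a bounded
# two-variable design space, as used in the RDS approach described above.
# The underlying "expensive analysis" function is a stand-in assumption.
import numpy as np

def expensive_analysis(x1, x2):
    # Placeholder for a disciplinary analysis code (e.g. weight or range)
    return 100 - 5*x1 + 3*x2 + 0.8*x1*x2 - 0.6*x1**2 + 0.2*x2**2

rng = np.random.default_rng(4)
x1 = rng.uniform(-2, 2, 30)
x2 = rng.uniform(-2, 2, 30)
y = expensive_analysis(x1, x2) + 0.1 * rng.normal(size=30)

# Full quadratic basis: 1, x1, x2, x1*x2, x1^2, x2^2
X = np.column_stack([np.ones_like(x1), x1, x2, x1*x2, x1**2, x2**2])
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
print("RSE coefficients:", np.round(coeffs, 2))

# The cheap surrogate can now replace the analysis inside trade studies
def rse(x1, x2, c=coeffs):
    return c @ np.array([1.0, x1, x2, x1*x2, x1**2, x2**2])
print("surrogate at (1, 1):", round(rse(1.0, 1.0), 2),
      "truth:", expensive_analysis(1.0, 1.0))
```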
The use of experimental design to find the operating maximum power point of PEM fuel cells
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crăciunescu, Aurelian; Pătularu, Laurenţiu; Ciumbulea, Gloria
2015-03-10
Proton Exchange Membrane (PEM) Fuel Cells are difficult to model due to their complex nonlinear nature. In this paper, the development of a PEM Fuel Cell mathematical model based on the Design of Experiments methodology is described. The Design of Experiments provides a very efficient methodology for obtaining a mathematical model of the studied multivariable system with only a few experiments. The obtained results can be used for optimization and control of PEM Fuel Cell systems.
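A minimal two-level full-factorial Design of Experiments of the kind described above is sketched below: eight runs over three operating factors, fitted to a main-effects-plus-interactions model in coded units. The cell-voltage response function and factor ranges are assumed stand-ins, not the paper's experimental data.

```python
# Sketch of a two-level full-factorial Design of Experiments fitted to a
# linear-with-interactions model, illustrating how a PEM fuel cell voltage
# response might be approximated from only a few runs. The response
# function and factor ranges are assumptions, not the paper's test data.
import itertools
import numpy as np

def cell_voltage(temp_c, pressure_bar, humidity):
    # Stand-in for the measured stack response at one operating point
    return 0.60 + 0.002*(temp_c - 60) + 0.03*(pressure_bar - 1.5) \
           + 0.05*(humidity - 0.8) + 0.01*(pressure_bar - 1.5)*(humidity - 0.8)

levels = {"temp_c": (50, 70), "pressure_bar": (1.0, 2.0), "humidity": (0.6, 1.0)}
runs = list(itertools.product(*levels.values()))        # 2^3 = 8 experiments
y = np.array([cell_voltage(*r) for r in runs])

# Coded units (-1/+1) and a model with main effects plus two-factor interactions
coded = np.array([[-1 if v == lo else 1 for v, (lo, hi) in zip(r, levels.values())]
                  for r in runs], dtype=float)
X = np.column_stack([np.ones(8), coded,
                     coded[:, 0]*coded[:, 1],
                     coded[:, 0]*coded[:, 2],
                     coded[:, 1]*coded[:, 2]])
effects, *_ = np.linalg.lstsq(X, y, rcond=None)
for name, e in zip(["mean", "T", "P", "RH", "T*P", "T*RH", "P*RH"], effects):
    print(f"{name:5s} {e:+.4f}")
```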
ERIC Educational Resources Information Center
Li, Yanyan; Huang, Zhinan; Jiang, Menglu; Chang, Ting-Wen
2016-01-01
Incorporating scientific fundamentals via engineering through a design-based methodology has proven to be highly effective for STEM education. Engineering design can be instantiated for learning as it involves mental and physical stimulation and develops practical skills, especially in solving problems. Lego bricks, as a set of toys based on design…
ERIC Educational Resources Information Center
Singh, Oma B.
2009-01-01
This study used a design-based research (DBR) methodology to examine how an Instructional Systems Design (ISD) process such as ADDIE (Analysis, Design, Development, Implementation, Evaluation) can be employed to develop a web-based module to teach metacognitive learning strategies to students in higher education. The goal of the study was…
NASA Astrophysics Data System (ADS)
Brezgin, V. I.; Brodov, Yu M.; Kultishev, A. Yu
2017-11-01
The report presents a review of methods for improving steam turbine unit design and operation based on the application of modern information technologies. In accordance with the life cycle (LC) support methodology, a conceptual model of the information support system for the main life cycle stages of a steam turbine unit is suggested. A classification system, which ensures the creation of sustainable information links between the engineering team (manufacturer's plant) and customer organizations (power plants), is proposed. Within the report, the principle of extending parameterization beyond geometric constructions in the design and improvement of steam turbine unit equipment is proposed, studied and justified. The report presents the steam turbine unit equipment design methodology based on the brand new oil-cooler design system that has been developed and implemented by the authors. This design system combines a construction subsystem, which is characterized by extensive usage of family tables and templates, and a computation subsystem, which includes a methodology for zone-by-zone thermal-hydraulic oil cooler design calculations. The report presents data about the developed software for operational monitoring and assessment of equipment parameters, as well as its implementation at five power plants.
2011-01-01
Background: Although principles based in motor learning, rehabilitation, and human-computer interfaces can guide the design of effective interactive systems for rehabilitation, a unified approach that connects these key principles into an integrated design, and can form a methodology that can be generalized to interactive stroke rehabilitation, is presently unavailable. Results: This paper integrates phenomenological approaches to interaction and embodied knowledge with rehabilitation practices and theories to achieve the basis for a methodology that can support effective adaptive, interactive rehabilitation. Our resulting methodology provides guidelines for the development of an action representation, quantification of action, and the design of interactive feedback. As Part I of a two-part series, this paper presents key principles of the unified approach. Part II then describes the application of this approach within the implementation of the Adaptive Mixed Reality Rehabilitation (AMRR) system for stroke rehabilitation. Conclusions: The accompanying principles for composing novel mixed reality environments for stroke rehabilitation can advance the design and implementation of effective mixed reality systems for the clinical setting, and ultimately be adapted for home-based application. They furthermore can be applied to other rehabilitation needs beyond stroke. PMID:21875441
Design for dependability: A simulation-based approach. Ph.D. Thesis, 1993
NASA Technical Reports Server (NTRS)
Goswami, Kumar K.
1994-01-01
This research addresses issues in simulation-based system-level dependability analysis of fault-tolerant computer systems. The issues and difficulties of providing a general simulation-based approach for system-level analysis are discussed, and a methodology that addresses these issues is presented. The proposed methodology is designed to permit the study of a wide variety of architectures under various fault conditions. It permits detailed functional modeling of architectural features such as sparing policies, repair schemes, routing algorithms as well as other fault-tolerant mechanisms, and it allows the execution of actual application software. One key benefit of this approach is that the behavior of a system under faults does not have to be pre-defined as is normally done. Instead, a system can be simulated in detail and injected with faults to determine its failure modes. The thesis describes how object-oriented design is used to incorporate this methodology into a general-purpose design and fault injection package called DEPEND. A software model is presented that uses abstractions of application programs to study the behavior and effect of software on hardware faults in the early design stage when actual code is not available. Finally, an acceleration technique that combines hierarchical simulation, time acceleration algorithms and hybrid simulation to reduce simulation time is introduced.
Performer-centric Interface Design.
ERIC Educational Resources Information Center
McGraw, Karen L.
1995-01-01
Describes performer-centric interface design and explains a model-based approach for conducting performer-centric analysis and design. Highlights include design methodology, including cognitive task analysis; creating task scenarios; creating the presentation model; creating storyboards; proof of concept screens; object models and icons;…
Using a Game Environment to Foster Collaborative Learning: A Design-Based Study
ERIC Educational Resources Information Center
Hamalainen, Raija
2011-01-01
Designing collaborative three-dimensional learning games for vocational learning may be one way to respond to the needs of working life. The theoretical vantage points of collaborative learning for game development and the "design-based research" methodology are described; these have been used to support collaborative learning in the…
Methodological Considerations in Designing and Evaluating Animal-Assisted Interventions.
Stern, Cindy; Chur-Hansen, Anna
2013-02-27
This paper presents a discussion of the literature on animal-assisted interventions and describes limitations surrounding current methodological quality. Benefits to human physical, psychological and social health cannot be empirically confirmed due to the methodological limitations of the existing body of research, and comparisons cannot validly be made across different studies. Without a solid research base animal-assisted interventions will not receive recognition and acceptance as a credible alternative health care treatment. The paper draws on the work of four systematic reviews conducted over April-May 2009, with no date restrictions, focusing exclusively on the use of canine-assisted interventions for older people residing in long-term care. The reviews revealed a lack of good quality studies. Although the literature base has grown in volume since its inception, it predominantly consists of anecdotal accounts and reports. Experimental studies undertaken are often flawed in aspects of design, conduct and reporting. There are few qualitative studies available leading to the inability to draw definitive conclusions. It is clear that due to the complexities associated with these interventions not all weaknesses can be eliminated. However, there are basic methodological weaknesses that can be addressed in future studies in the area. Checklists for quantitative and qualitative research designs to guide future research are offered to help address methodological rigour.
Development of risk-based decision methodology for facility design.
DOT National Transportation Integrated Search
2014-06-01
This report develops a methodology for CDOT to use in the risk analysis of various types of facilities and provides illustrative examples for the use of the proposed framework. An overview of the current practices and applications to illustrate t...
Discrete Adjoint-Based Design for Unsteady Turbulent Flows On Dynamic Overset Unstructured Grids
NASA Technical Reports Server (NTRS)
Nielsen, Eric J.; Diskin, Boris
2012-01-01
A discrete adjoint-based design methodology for unsteady turbulent flows on three-dimensional dynamic overset unstructured grids is formulated, implemented, and verified. The methodology supports both compressible and incompressible flows and is amenable to massively parallel computing environments. The approach provides a general framework for performing highly efficient and discretely consistent sensitivity analysis for problems involving arbitrary combinations of overset unstructured grids which may be static, undergoing rigid or deforming motions, or any combination thereof. General parent-child motions are also accommodated, and the accuracy of the implementation is established using an independent verification based on a complex-variable approach. The methodology is used to demonstrate aerodynamic optimizations of a wind turbine geometry, a biologically-inspired flapping wing, and a complex helicopter configuration subject to trimming constraints. The objective function for each problem is successfully reduced and all specified constraints are satisfied.
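The complex-variable (complex-step) verification mentioned above can be illustrated on a toy objective: perturb each design variable by a tiny imaginary step and read the derivative from the imaginary part, which avoids subtractive cancellation. The objective function here is a stand-in, not the flow solver's cost functional.

```python
# Sketch of the complex-step check used to verify adjoint sensitivities:
# perturb a design variable by a tiny imaginary step and read the
# derivative from the imaginary part. The objective below is a toy
# stand-in, not the flow solver's cost functional.
import numpy as np

def objective(x):
    # Analytic in x, so it accepts complex perturbations unchanged
    return np.exp(x[0]) * np.sin(x[1]) + x[0] * x[1] ** 2

def complex_step_grad(f, x, h=1e-30):
    g = np.zeros(len(x))
    for i in range(len(x)):
        xp = np.array(x, dtype=complex)
        xp[i] += 1j * h
        g[i] = f(xp).imag / h          # no subtractive cancellation
    return g

x0 = np.array([0.7, 1.2])
exact = np.array([np.exp(0.7) * np.sin(1.2) + 1.2 ** 2,
                  np.exp(0.7) * np.cos(1.2) + 2 * 0.7 * 1.2])
print("complex step:", complex_step_grad(objective, x0))
print("exact       :", exact)
```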
NASA Technical Reports Server (NTRS)
Hanagud, S.; Uppaluri, B.
1975-01-01
This paper describes a methodology for making cost-effective fatigue design decisions. The methodology is based on a probabilistic model for the stochastic process of fatigue crack growth with time. The development of a particular model for the stochastic process is also discussed in the paper. The model is based on the assumption of continuous time and a discrete space of crack lengths. Statistical decision theory and the developed probabilistic model are used to develop a procedure for making fatigue design decisions on the basis of a minimum expected cost or risk function and reliability bounds. Selection of the initial flaw size distribution, NDT, repair threshold crack lengths, and inspection intervals is discussed.
Design Optimization of Gas Generator Hybrid Propulsion Boosters
NASA Technical Reports Server (NTRS)
Weldon, Vincent; Phillips, Dwight; Fink, Larry
1990-01-01
A methodology used in support of a study for NASA/MSFC to optimize the design of a gas generator hybrid propulsion booster for uprating the National Space Transportation System (NSTS) is presented. The objective was to compare alternative configurations for this booster approach, optimizing each candidate concept on different bases, in order to develop data for a trade table on which a final decision was based. The methodology is capable of processing a large number of independent and dependent variables, adjusting the overall subsystem characteristics to arrive at a best-compromise integrated design that meets various specific optimization criteria subject to selected constraints. For each system considered, a detailed weight statement was generated along with preliminary cost and reliability estimates.
Application of Adjoint Methodology to Supersonic Aircraft Design Using Reversed Equivalent Areas
NASA Technical Reports Server (NTRS)
Rallabhandi, Sriram K.
2013-01-01
This paper presents an approach to shaping an aircraft to meet equivalent-area-based objectives using the discrete adjoint approach. Equivalent areas can be obtained either by using the reversed augmented Burgers equation or by direct conversion of off-body pressures into equivalent area. Formal coupling with CFD allows computation of sensitivities of equivalent-area objectives with respect to aircraft shape parameters. The exactness of the adjoint sensitivities is verified against derivatives obtained using the complex-step approach. This methodology has the benefit of using designer-friendly equivalent areas in the shape design of low-boom aircraft. Shape optimization results with equivalent-area cost functionals are discussed and further refined using ground-loudness-based objectives.
Rivera, José; Carrillo, Mariano; Chacón, Mario; Herrera, Gilberto; Bojorquez, Gilberto
2007-01-01
The development of smart sensors involves the design of reconfigurable systems capable of working with different input sensors. Reconfigurable systems should ideally spend the least possible amount of time on their calibration. An autocalibration algorithm for intelligent sensors should be able to fix major problems such as offset, variation of gain and lack of linearity as accurately as possible. This paper describes a new autocalibration methodology for nonlinear intelligent sensors based on artificial neural networks (ANN). The methodology involves analysis of several network topologies and training algorithms. The proposed method was compared against the piecewise and polynomial linearization methods. The method comparison was carried out using different numbers of calibration points and several nonlinearity levels of the input signal. This paper also shows that the proposed method turned out to have better overall accuracy than the other two methods. Besides the experimental results and analysis of the complete study, the paper describes the implementation of the ANN in a microcontroller unit (MCU). In order to illustrate the method's capability to build autocalibration and reconfigurable systems, a temperature measurement system was designed and tested. The proposed method is an improvement over the classic autocalibration methodologies because it impacts the design process of intelligent sensors, autocalibration methodologies and their associated factors, like time and cost.
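A minimal numpy version of the ANN autocalibration idea is sketched below: a one-hidden-layer network trained by batch gradient descent to invert a nonlinear sensor characteristic with offset and gain error. The sensor model, network size, and training settings are illustrative assumptions, not the paper's MCU implementation.

```python
# Minimal one-hidden-layer ANN trained by gradient descent to invert a
# nonlinear sensor with offset and gain error, in the spirit of the
# autocalibration idea above. Network size, learning rate and the sensor
# model are illustrative assumptions, not the paper's MCU implementation.
import numpy as np

rng = np.random.default_rng(5)
true_temp = rng.uniform(0.0, 100.0, (256, 1))
raw = 0.15 + 0.9 * np.tanh(true_temp / 60.0) + 0.005 * rng.normal(size=(256, 1))

x = (raw - raw.mean()) / raw.std()               # normalized sensor reading
t = true_temp / 100.0                            # normalized target

W1 = 0.5 * rng.normal(size=(1, 8)); b1 = np.zeros(8)
W2 = 0.5 * rng.normal(size=(8, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(5000):
    h = np.tanh(x @ W1 + b1)                     # forward pass
    y = h @ W2 + b2
    err = y - t
    gW2 = h.T @ err / len(x); gb2 = err.mean(0)  # backprop
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

calibrated = (np.tanh(x @ W1 + b1) @ W2 + b2) * 100.0
print("max abs error (deg):", float(np.max(np.abs(calibrated - true_temp))))
```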
Probabilistic sizing of laminates with uncertainties
NASA Technical Reports Server (NTRS)
Shah, A. R.; Liaw, D. G.; Chamis, C. C.
1993-01-01
A reliability-based design methodology for laminate sizing and configuration for a special case of composite structures is described. The methodology combines probabilistic composite mechanics with probabilistic structural analysis. The uncertainties of the constituent materials (fiber and matrix) are simulated using probabilistic theory in order to predict macroscopic behavior. Uncertainties in the degradation of composite material properties are included in this design methodology. A multi-factor interaction equation is used to evaluate the load- and environment-dependent degradation of the composite material properties at the micromechanics level. The methodology is integrated into the computer code IPACS (Integrated Probabilistic Assessment of Composite Structures). The versatility of this design approach is demonstrated by performing a multi-level probabilistic analysis to size the laminates for the design structural reliability of random-type structures. The results show that laminate configurations can be selected to improve the structural reliability from three failures in 1,000 to no failures in one million. Results also show that the laminates with the highest reliability are the least sensitive to the loading conditions.
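The flavor of the probabilistic assessment can be conveyed with a crude Monte Carlo sketch: sample uncertain constituent properties and loads, evaluate a simple limit state, and count violations. The rule-of-mixtures strength, the distributions, and the load case are assumed stand-ins, not IPACS's micromechanics or the laminate configurations of the study.

```python
# Monte Carlo sketch of the kind of probabilistic assessment described
# above: sample uncertain constituent properties and loads, and count how
# often a simple strength margin is violated. The limit state and the
# distributions are assumed stand-ins, not IPACS's micromechanics models.
import numpy as np

rng = np.random.default_rng(6)
N = 200_000

# Uncertain inputs (means and scatter are illustrative)
fiber_strength = rng.normal(1800e6, 90e6, N)     # Pa
fiber_fraction = rng.normal(0.60, 0.02, N)
applied_stress = rng.normal(700e6, 70e6, N)      # Pa

# Crude rule-of-mixtures laminate strength and a simple limit state g > 0
laminate_strength = fiber_fraction * fiber_strength
g = laminate_strength - applied_stress
p_failure = np.mean(g <= 0.0)
print(f"estimated probability of failure: {p_failure:.2e}")
```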
Rail Passenger Vehicle Truck Design Methodology
DOT National Transportation Integrated Search
1981-01-01
A procedure for the selection of rail passenger truck design parameters to meet dynamic performance indices has been developed. The procedure is based upon partitioning the design task into three tradeoff studies: (1) a vertical ride quality-secondar...
Mechatronics by Analogy and Application to Legged Locomotion
NASA Astrophysics Data System (ADS)
Ragusila, Victor
A new design methodology for mechatronic systems, dubbed Mechatronics by Analogy (MbA), is introduced and applied to designing a leg mechanism. The new methodology argues that by establishing a similarity relation between a complex system and a number of simpler models it is possible to design the former using the analysis and synthesis means developed for the latter. The methodology provides a framework for concurrent engineering of complex systems while maintaining the transparency of the system behaviour through making formal analogies between the system and those with more tractable dynamics. The application of the MbA methodology to the design of a monopod robot leg, called the Linkage Leg, is also studied. A series of simulations show that the dynamic behaviour of the Linkage Leg is similar to that of a combination of a double pendulum and a spring-loaded inverted pendulum, based on which the system kinematic, dynamic, and control parameters can be designed concurrently. The first stage of Mechatronics by Analogy is a method of extracting significant features of system dynamics through simpler models. The goal is to determine a set of simpler mechanisms with dynamic behaviour similar to that of the original system in various phases of its motion. A modular bond-graph representation of the system is determined, and subsequently simplified using two simplification algorithms. The first algorithm determines the relevant dynamic elements of the system for each phase of motion, and the second algorithm finds the simple mechanism described by the remaining dynamic elements. In addition to greatly simplifying the controller for the system, using simpler mechanisms with similar behaviour provides greater insight into the dynamics of the system. This is seen in the second stage of the new methodology, which concurrently optimizes the simpler mechanisms together with a control system based on their dynamics. Once the optimal configuration of the simpler system is determined, the original mechanism is optimized such that its dynamic behaviour is analogous. It is shown that, if this analogy is achieved, the control system designed based on the simpler mechanisms can be directly implemented on the more complex system, and their dynamic behaviours are close enough for the system performance to be effectively the same. Finally, it is shown that, for the employed objective of fast legged locomotion, the proposed methodology achieves a better design than Reduction-by-Feedback, a competing methodology that uses control layers to simplify the dynamics of the system.
Preliminary structural design of composite main rotor blades for minimum weight
NASA Technical Reports Server (NTRS)
Nixon, Mark W.
1987-01-01
A methodology is developed to perform minimum weight structural design for composite or metallic main rotor blades subject to aerodynamic performance, material strength, autorotation, and frequency constraints. The constraints and load cases are developed such that the final preliminary rotor design will satisfy U.S. Army military specifications, as well as take advantage of the versatility of composite materials. A minimum weight design is first developed subject to satisfying the aerodynamic performance, strength, and autorotation constraints for all static load cases. The minimum weight design is then dynamically tuned to avoid resonant frequencies occurring at the design rotor speed. With this methodology, three rotor blade designs were developed based on the geometry of the UH-60A Black Hawk titanium-spar rotor blade. The first design is of a single titanium-spar cross section, which is compared with the UH-60A Black Hawk rotor blade. The second and third designs use single and multiple graphite/epoxy-spar cross sections. These are compared with the titanium-spar design to demonstrate weight savings from use of this design methodology in conjunction with advanced composite materials.
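The flavor of such a constrained sizing problem can be sketched on a toy single-cell spar box (not the UH-60A model); the material properties, loads, and constraint limits below are assumptions, and SciPy's SLSQP merely stands in for the methodology's actual optimizer.

```python
# Minimize section mass subject to bending strength, a frequency-placement
# surrogate (minimum bending inertia), and an autorotation-style mass floor.
import numpy as np
from scipy.optimize import minimize

rho     = 1600.0       # composite density, kg/m^3 (assumed)
b, h    = 0.12, 0.08   # spar box width and depth, m (assumed)
M_bend  = 9.0e3        # design bending moment, N*m (assumed static load case)
sigma_a = 600.0e6      # allowable stress including knockdowns, Pa (assumed)
I_freq  = 3.0e-6       # minimum inertia as a frequency-placement surrogate, m^4 (assumed)
m_auto  = 2.5          # minimum mass/length standing in for autorotation, kg/m (assumed)

def section(x):
    t_cap, t_web = x
    A = 2.0 * b * t_cap + 2.0 * h * t_web                               # area
    I = 2.0 * (b * t_cap) * (h / 2.0) ** 2 + 2.0 * t_web * h**3 / 12.0  # bending inertia
    return A, I

mass   = lambda x: rho * section(x)[0]                            # objective, kg/m
g_strs = lambda x: sigma_a - M_bend * (h / 2.0) / section(x)[1]   # strength, >= 0
g_freq = lambda x: section(x)[1] - I_freq                         # stiffness, >= 0
g_auto = lambda x: mass(x) - m_auto                               # autorotation, >= 0

res = minimize(mass, x0=[0.010, 0.005],
               bounds=[(0.001, 0.03), (0.001, 0.03)],
               constraints=[{"type": "ineq", "fun": g} for g in (g_strs, g_freq, g_auto)],
               method="SLSQP")
t_cap, t_web = res.x
print(f"t_cap = {1e3*t_cap:.2f} mm, t_web = {1e3*t_web:.2f} mm, mass = {mass(res.x):.2f} kg/m")
```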
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maji, Partha Sona; Roy Chaudhuri, Partha
In this article, we have presented a new design methodology for obtaining wideband parametric sources based on the highly nonlinear chalcogenide material As2S3. The dispersion profile of the photonic crystal fiber (PCF) has been engineered by reducing the diameter of the second air-hole ring to obtain a favorable higher-order dispersion parameter. The parametric gain dependence upon fiber length, pump power, and different pumping wavelengths has been investigated in detail. Based upon the nonlinear four-wave mixing phenomenon, we are able to achieve a wideband parametric amplifier with a peak gain of 29 dB and an FWHM of ≈2000 nm around the IR wavelength by proper tailoring of the dispersion profile of the PCF, with a continuous-wave erbium (Er3+)-doped ZBLAN fiber laser emitting at 2.8 μm as the pump source with an average power of 5 W. The new design methodology opens a new dimension for chalcogenide-material-based investigations of wavelength translation around the IR wavelength band.
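For orientation, the following sketch evaluates the standard single-pump four-wave-mixing gain expression G = 1 + (gamma*P0*sinh(gL)/g)^2 across signal wavelength; the fiber parameters (gamma, beta2, beta4, length) are illustrative assumptions and are not the As2S3 PCF design reported in the paper.

```python
import numpy as np

c     = 299792458.0   # m/s
gamma = 2.0           # nonlinear coefficient, 1/(W*m) (assumed)
P0    = 5.0           # pump power, W (from the abstract)
L     = 0.5           # fiber length, m (assumed)
lam_p = 2.8e-6        # pump wavelength, m (from the abstract)
beta2 = -5.0e-27      # GVD at the pump, s^2/m (assumed, anomalous)
beta4 = -1.0e-55      # fourth-order dispersion, s^4/m (assumed)

omega_p = 2.0 * np.pi * c / lam_p
lam_s   = np.linspace(1.8e-6, 4.5e-6, 4000)          # scanned signal wavelengths
dw      = 2.0 * np.pi * c / lam_s - omega_p          # pump-signal detuning

d_beta  = beta2 * dw**2 + beta4 * dw**4 / 12.0       # linear phase mismatch
kappa   = d_beta + 2.0 * gamma * P0                  # total phase mismatch
g       = np.sqrt(((gamma * P0) ** 2 - (kappa / 2.0) ** 2).astype(complex))
g_safe  = np.where(g == 0, 1e-30, g)                 # avoid 0/0 at exact phase matching edge
G       = 1.0 + np.abs(gamma * P0 * np.sinh(g_safe * L) / g_safe) ** 2
gain_dB = 10.0 * np.log10(G)

i = int(np.argmax(gain_dB))
print(f"peak parametric gain ~ {gain_dB[i]:.1f} dB near {1e9 * lam_s[i]:.0f} nm")
```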
ERIC Educational Resources Information Center
Gammage, David T.
2008-01-01
Purpose: The purpose of this paper is to explore how the process of implementation of school-based management (SBM) has worked within the public school systems in the Australian Capital Territory (ACT) and Victoria in Australia. The period covered was 1976-2006. Design/methodology/approach: The approach adopted was the mixed methodology which…
Promoting the Health and Wellbeing of Young Black Men Using Community-Based Drama
ERIC Educational Resources Information Center
Kemp, Martin
2006-01-01
Purpose--This paper aims to explore the role of drama and theatre in promoting the emotional and social wellbeing of a group of young Black men living in south London. Design/methodology/approach--A qualitative methodology was used in a process and outcome evaluation of a drama-based initiative that aimed to promote young Black men's sexual and…
[Methodology for clinical research in Orthodontics, the assets of the beOrtho website].
Ruiz, Martial; Thibult, François
2014-06-01
The rules applying to the "evidence-based" methodology strongly influenced the clinical research in orthodontics. However, the implementation of clinical studies requires rigour, important statistical and methodological knowledge, as well as a reliable environment in order to compile and store the data obtained from research. We developed the project "beOrtho.com" (based on orthodontic evidence) in order to fill up the gap between our desire to drive clinical research and the necessity of methodological rigour in the exploitation of its results. BeOrtho website was created to answer the issue of sample recruitment, data compilation and storage, while providing help for the methodological design of clinical studies. It allows the development and monitoring of clinical studies, as well as the creation of databases. On the other hand, we designed an evaluation grid for clinical studies which helps developing systematic reviews. In order to illustrate our point, we tested a research protocol evaluating the interest of the mandibular advancement in the framework of Class II treatment. © EDP Sciences, SFODF, 2014.
[Qualitative research methodology in health care].
Bedregal, Paula; Besoain, Carolina; Reinoso, Alejandro; Zubarew, Tamara
2017-03-01
Health care research requires different methodological approaches, such as qualitative and quantitative analyses, to understand the phenomena under study. Qualitative research is usually the least considered. Central elements of the qualitative method are that the object of study is constituted by perceptions, emotions and beliefs; non-random, purposive sampling; a circular process of knowledge construction; and methodological rigor throughout the research process, from the quality of the design to the consistency of the results. The objective of this work is to contribute to the methodological knowledge about qualitative research in health services, based on the implementation of the study "The transition process from pediatric to adult services: perspectives from adolescents with chronic diseases, caregivers and health professionals". The information gathered through the qualitative methodology facilitated the understanding of critical points, barriers and facilitators of the transition process of adolescents with chronic diseases, considering the perspective of users and the health team. This study allowed the design of a transition services model from pediatric to adult health services based on the needs of adolescents with chronic diseases, their caregivers and the health team.
An Integrated Low-Speed Performance and Noise Prediction Methodology for Subsonic Aircraft
NASA Technical Reports Server (NTRS)
Olson, E. D.; Mavris, D. N.
2000-01-01
An integrated methodology has been assembled to compute the engine performance, takeoff and landing trajectories, and community noise levels for a subsonic commercial aircraft. Where feasible, physics-based noise analysis methods have been used to make the results more applicable to newer, revolutionary designs and to allow for a more direct evaluation of new technologies. The methodology is intended to be used with approximation methods and risk analysis techniques to allow for the analysis of a greater number of variable combinations while retaining the advantages of physics-based analysis. Details of the methodology are described and limited results are presented for a representative subsonic commercial aircraft.
C-Based Design Methodology and Topological Change for an Indian Agricultural Tractor Component
NASA Astrophysics Data System (ADS)
Matta, Anil Kumar; Raju, D. Ranga; Suman, K. N. S.; Kranthi, A. S.
2018-06-01
The failure of tractor components and their replacement have now become very common in India because of re-cycling, re-sale, and duplication. To overcome the problem of failure, we propose a design methodology for topological change based on co-simulation with software tools. In the proposed design methodology, the designer checks Paxial, Pcr, Pfailure, and τ by hand calculations, from which refined topological changes of the R.S. Arm are derived. We explain several techniques employed in the component for the reduction and removal of rib material to change the center of gravity and centroid point, using SystemC for mixed-level simulation and faster topological changes. The design process in SystemC can be compiled and executed with the TURBO C7 software. The modified component is developed in Pro/E and analyzed in ANSYS. The topologically changed component, with a slot of 120 × 4.75 × 32.5 mm at the center, showed greater effectiveness than the original component.
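The kind of hand checks the abstract refers to (Paxial against the Euler buckling load Pcr, the squash load Pfailure, and the shear stress τ) can be sketched as follows; the section dimensions, material values, and loads are assumptions for illustration, not the actual R.S. Arm data.

```python
import math

E       = 200e9             # steel Young's modulus, Pa (assumed)
sigma_y = 350e6             # yield strength, Pa (assumed)
tau_all = 0.577 * sigma_y   # allowable shear (distortion-energy criterion), Pa

L    = 0.30                 # effective column length of the arm, m (assumed)
K    = 1.0                  # end-condition factor, pinned-pinned (assumed)
b, h = 0.030, 0.012         # rectangular cross-section, m (assumed)
A    = b * h
I    = b * h**3 / 12.0      # weak-axis second moment of area

P_axial   = 8.0e3                                  # applied axial load, N (assumed)
P_cr      = math.pi**2 * E * I / (K * L) ** 2      # Euler buckling load
P_failure = sigma_y * A                            # squash (yield) load
V         = 5.0e3                                  # applied transverse shear, N (assumed)
tau       = 1.5 * V / A                            # peak shear in a rectangular section

print(f"Pcr      = {P_cr/1e3:8.1f} kN (buckling margin {P_cr/P_axial:5.2f})")
print(f"Pfailure = {P_failure/1e3:8.1f} kN (yield margin    {P_failure/P_axial:5.2f})")
print(f"tau      = {tau/1e6:8.1f} MPa (allowable {tau_all/1e6:.0f} MPa)")
```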
Evaluation of Model-Based Training for Vertical Guidance Logic
NASA Technical Reports Server (NTRS)
Feary, Michael; Palmer, Everett; Sherry, Lance; Polson, Peter; Alkin, Marty; McCrobie, Dan; Kelley, Jerry; Rosekind, Mark (Technical Monitor)
1997-01-01
This paper will summarize the results of a study which introduces a structured, model-based approach to learning how the automated vertical guidance system works on a modern commercial air transport. The study proposes a framework to provide accurate and complete information in an attempt to eliminate confusion about 'what the system is doing'. This study will examine a structured methodology for organizing the ideas on which the system was designed, communicating this information through the training material, and displaying it in the airplane. Previous research on model-based, computer-aided instructional technology has shown reductions in the amount of time to reach a specified level of competence. The lessons learned from the development of these technologies are well suited for use with the design methodology that was used to develop the vertical guidance logic for a large commercial air transport. The design methodology presents the model from which to derive the training material, and the content of information to be displayed to the operator. The study consists of a 2 x 2 factorial experiment which will compare a new method of training vertical guidance logic and a new type of display. The format of the material used to derive both the training and the display will be provided by the Operational Procedure Methodology. The training condition will compare current training material to the new structured format. The display condition will involve a change in the content of the information displayed into pieces that agree with the concepts with which the system was designed.
A cost-effective methodology for the design of massively-parallel VLSI functional units
NASA Technical Reports Server (NTRS)
Venkateswaran, N.; Sriram, G.; Desouza, J.
1993-01-01
In this paper we propose a generalized methodology for the design of cost-effective massively-parallel VLSI Functional Units. This methodology is based on a technique of generating and reducing a massive bit-array on the mask-programmable PAcube VLSI array. This methodology unifies (maintains identical data flow and control) the execution of complex arithmetic functions on PAcube arrays. It is highly regular, expandable and uniform with respect to problem-size and wordlength, thereby reducing the communication complexity. The memory-functional unit interface is regular and expandable. Using this technique, functional units of dedicated processors can be mask-programmed on the naked PAcube arrays, reducing the turn-around time. The production cost of such dedicated processors can be drastically reduced since the naked PAcube arrays can be mass-produced. Analysis of the performance of functional units designed by our method yields promising results.
Does Maltreatment Beget Maltreatment? A Systematic Review of the Intergenerational Literature
Thornberry, Terence P.; Knight, Kelly E.; Lovegrove, Peter J.
2014-01-01
In this paper, we critically review the literature testing the cycle of maltreatment hypothesis which posits continuity in maltreatment across adjacent generations. That is, we examine whether a history of maltreatment victimization is a significant risk factor for the later perpetration of maltreatment. We begin by establishing 11 methodological criteria that studies testing this hypothesis should meet. They include such basic standards as using representative samples, valid and reliable measures, prospective designs, and different reporters for each generation. We identify 47 studies that investigated this issue and then evaluate them with regard to the 11 methodological criteria. Overall, most of these studies report findings consistent with the cycle of maltreatment hypothesis. Unfortunately, at the same time, few of them satisfy the basic methodological criteria that we established; indeed, even the stronger studies in this area only meet about half of them. Moreover, the methodologically stronger studies present mixed support for the hypothesis. As a result, the positive association often reported in the literature appears to be based largely on the methodologically weaker designs. Based on our systematic methodological review, we conclude that this small and methodologically weak body of literature does not provide a definitive test of the cycle of maltreatment hypothesis. We conclude that it is imperative to develop more robust and methodologically adequate assessments of this hypothesis to more accurately inform the development of prevention and treatment programs. PMID:22673145
Design Considerations for Creating a Chemical Information Workstation.
ERIC Educational Resources Information Center
Mess, John A.
1995-01-01
Discusses what a functional chemical information workstation should provide to support the users in an academic library and examines how it can be implemented. Highlights include basic design considerations; natural language interface, including grammar-based, context-based, and statistical methodologies; expert system interface; and programming…
Experiences of Practice-Based Learning in Phenomenographic Perspective
ERIC Educational Resources Information Center
Rovio-Johansson, Airi
2018-01-01
Purpose: The paper aims to examine, within the context of professional practice and learning, how designers collaboratively working in international teams experience practice-based learning and how such occasions contribute to professional development. Design/methodology/approach: The paper introduces the cooperation project between Tibro Training…
Establishing equivalence: methodological progress in group-matching design and analysis.
Kover, Sara T.; Atwood, Amy K.
2013-01-01
This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs used in behavioral research on cognition and language in neurodevelopmental disorders, including autism spectrum disorder, Fragile X syndrome, Down syndrome, and Williams syndrome. The limitations of relying on p values to establish group equivalence are discussed in the context of other existing methods: equivalence tests, propensity scores, and regression-based analyses. Our primary recommendation for advancing research on intellectual and developmental disabilities is the use of descriptive indices of adequate group matching: effect sizes (i.e., standardized mean differences) and variance ratios.
Establishing Equivalence: Methodological Progress in Group-Matching Design and Analysis
Kover, Sara T.; Atwood, Amy K.
2017-01-01
This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs utilized in behavioral research on cognition and language in neurodevelopmental disorders, including autism spectrum disorder, fragile X syndrome, Down syndrome, and Williams syndrome. The limitations of relying on p-values to establish group equivalence are discussed in the context of other existing methods: equivalence tests, propensity scores, and regression-based analyses. Our primary recommendation for advancing research on intellectual and developmental disabilities is the use of descriptive indices of adequate group matching: effect sizes (i.e., standardized mean differences) and variance ratios. PMID:23301899
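A small sketch of the recommended descriptive matching indices, computed on made-up data for two groups, is given below; the grouping variable and values are purely illustrative and not drawn from either article.

```python
# Standardized mean difference (pooled-SD form) and variance ratio for two
# groups matched on, e.g., nonverbal mental age (invented data).
import numpy as np

group_a = np.array([52.1, 47.3, 55.0, 49.8, 51.2, 46.7, 53.4, 50.5])
group_b = np.array([50.6, 48.9, 54.2, 47.5, 52.0, 49.1, 51.8, 46.3])

m_a, m_b = group_a.mean(), group_b.mean()
v_a, v_b = group_a.var(ddof=1), group_b.var(ddof=1)
n_a, n_b = len(group_a), len(group_b)

pooled_sd = np.sqrt(((n_a - 1) * v_a + (n_b - 1) * v_b) / (n_a + n_b - 2))
smd = (m_a - m_b) / pooled_sd          # standardized mean difference
var_ratio = v_a / v_b                  # variance ratio

print(f"standardized mean difference: {smd:+.3f}")
print(f"variance ratio:               {var_ratio:.3f}")
```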
Applying Case-Based Method in Designing Self-Directed Online Instruction: A Formative Research Study
ERIC Educational Resources Information Center
Luo, Heng; Koszalka, Tiffany A.; Arnone, Marilyn P.; Choi, Ikseon
2018-01-01
This study investigated the case-based method (CBM) instructional-design theory and its application in designing self-directed online instruction. The purpose of this study was to validate and refine the theory for a self-directed online instruction context. Guided by formative research methodology, this study first developed an online tutorial…
Onboard FPGA-based SAR processing for future spaceborne systems
NASA Technical Reports Server (NTRS)
Le, Charles; Chan, Samuel; Cheng, Frank; Fang, Winston; Fischman, Mark; Hensley, Scott; Johnson, Robert; Jourdan, Michael; Marina, Miguel; Parham, Bruce;
2004-01-01
We present a real-time, high-performance, and fault-tolerant FPGA-based hardware architecture for the processing of synthetic aperture radar (SAR) images in future spaceborne systems. In particular, we will discuss the integrated design approach, from top-level algorithm specifications and system requirements, through design methodology, functional verification, and performance validation, down to hardware design and implementation.
Toward a Web Based Environment for Evaluation and Design of Pedagogical Hypermedia
ERIC Educational Resources Information Center
Trigano, Philippe C.; Pacurar-Giacomini, Ecaterina
2004-01-01
We are working on a method called CEPIAH. We propose a web-based system used to help teachers design multimedia documents and evaluate their prototypes. Our current research objectives are to create a methodology to support educational hypermedia design and evaluation. A module is used to evaluate multimedia software applied in…
A Computer Environment for Beginners' Learning of Sorting Algorithms: Design and Pilot Evaluation
ERIC Educational Resources Information Center
Kordaki, M.; Miatidis, M.; Kapsampelis, G.
2008-01-01
This paper presents the design, features and pilot evaluation study of a web-based environment--the SORTING environment--for the learning of sorting algorithms by secondary level education students. The design of this environment is based on modeling methodology, taking into account modern constructivist and social theories of learning while at…
Jitendra, Asha K; Petersen-Brown, Shawna; Lein, Amy E; Zaslofsky, Anne F; Kunkel, Amy K; Jung, Pyung-Gang; Egan, Andrea M
2015-01-01
This study examined the quality of the research base related to strategy instruction priming the underlying mathematical problem structure for students with learning disabilities and those at risk for mathematics difficulties. We evaluated the quality of methodological rigor of 18 group research studies using the criteria proposed by Gersten et al. and 10 single case design (SCD) research studies using criteria suggested by Horner et al. and the What Works Clearinghouse. Results indicated that 14 group design studies met the criteria for high-quality or acceptable research, whereas SCD studies did not meet the standards for an evidence-based practice. Based on these findings, strategy instruction priming the mathematics problem structure is considered an evidence-based practice using only group design methodological criteria. Implications for future research and for practice are discussed. © Hammill Institute on Disabilities 2013.
Progress in multirate digital control system design
NASA Technical Reports Server (NTRS)
Berg, Martin C.; Mason, Gregory S.
1991-01-01
A new methodology for multirate sampled-data control design based on a new generalized control law structure, two new parameter-optimization-based control law synthesis methods, and a new singular-value-based robustness analysis method are described. The control law structure can represent multirate sampled-data control laws of arbitrary structure and dynamic order, with arbitrarily prescribed sampling rates for all sensors and update rates for all processor states and actuators. The two control law synthesis methods employ numerical optimization to determine values for the control law parameters. The robustness analysis method is based on the multivariable Nyquist criterion applied to the loop transfer function for the sampling period equal to the period of repetition of the system's complete sampling/update schedule. The complete methodology is demonstrated by application to the design of a combination yaw damper and modal suppression system for a commercial aircraft.
NASA Astrophysics Data System (ADS)
Lazcano Olea, Miguel; Ramos Astudillo, Reynaldo; Sanhueza Robles, René; Rodriguez Rubke, Leopoldo; Ruiz-Caballero, Domingo Antonio
This paper presents the analysis and design of a power factor pre-regulator based on a symmetrical charge pump circuit applied to electronic ballast. The operation stages of the circuit are analyzed and its main design equations are obtained. Simulation and experimental results are presented in order to show the design methodology feasibility.
Multidisciplinary analysis and design of printed wiring boards
NASA Astrophysics Data System (ADS)
Fulton, Robert E.; Hughes, Joseph L.; Scott, Waymond R., Jr.; Umeagukwu, Charles; Yeh, Chao-Pin
1991-04-01
Modern printed wiring board design depends on electronic prototyping using computer-based simulation and design tools. Existing electrical computer-aided design (ECAD) tools emphasize circuit connectivity with only rudimentary analysis capabilities. This paper describes a prototype integrated PWB design environment denoted Thermal Structural Electromagnetic Testability (TSET) being developed at Georgia Tech in collaboration with companies in the electronics industry. TSET provides design guidance based on enhanced electrical and mechanical CAD capabilities, including electromagnetic modeling, testability analysis, thermal management, and solid mechanics analysis. TSET development is based on a strong analytical and theoretical science base and incorporates an integrated information framework and a common database designed using a systematic, structured methodology.
Sketching Designs Using the Five Design-Sheet Methodology.
Roberts, Jonathan C; Headleand, Chris; Ritsos, Panagiotis D
2016-01-01
Sketching designs has been shown to be a useful way of planning and considering alternative solutions. The use of lo-fidelity prototyping, especially paper-based sketching, can save time and money and converge on better solutions more quickly. However, this design process is often viewed as too informal. Consequently, users do not know how to manage their thoughts and ideas (to first think divergently, and then finally converge on a suitable solution). We present the Five Design-Sheet (FdS) methodology. The methodology enables users to create information visualization interfaces through lo-fidelity methods. Users sketch and plan their ideas, helping them express different possibilities and think through these ideas to consider their potential effectiveness as solutions to the task (sheet 1); they create three principal designs (sheets 2, 3 and 4); before converging on a final realization design that can then be implemented (sheet 5). In this article, we present (i) a review of the use of sketching as a planning method for visualization and the benefits of sketching, (ii) a detailed description of the Five Design-Sheet (FdS) methodology, and (iii) an evaluation of the FdS using the System Usability Scale, along with a case study of its use in industry and experience of its use in teaching.
Guidelines for the Design and Conduct of Clinical Studies in Knee Articular Cartilage Repair
Mithoefer, Kai; Saris, Daniel B.F.; Farr, Jack; Kon, Elizaveta; Zaslav, Kenneth; Cole, Brian J.; Ranstam, Jonas; Yao, Jian; Shive, Matthew; Levine, David; Dalemans, Wilfried; Brittberg, Mats
2011-01-01
Objective: To summarize current clinical research practice and develop methodological standards for objective scientific evaluation of knee cartilage repair procedures and products. Design: A comprehensive literature review was performed of high-level original studies providing information relevant for the design of clinical studies on articular cartilage repair in the knee. Analysis of cartilage repair publications and synopses of ongoing trials were used to identify important criteria for the design, reporting, and interpretation of studies in this field. Results: Current literature reflects the methodological limitations of the scientific evidence available for articular cartilage repair. However, clinical trial databases of ongoing trials document a trend suggesting improved study designs and clinical evaluation methodology. Based on the current scientific information and standards of clinical care, detailed methodological recommendations were developed for the statistical study design, patient recruitment, control group considerations, study endpoint definition, documentation of results, use of validated patient-reported outcome instruments, and inclusion and exclusion criteria for the design and conduct of scientifically sound cartilage repair study protocols. A consensus statement among the International Cartilage Repair Society (ICRS) and contributing authors experienced in clinical trial design and implementation was achieved. Conclusions: High-quality clinical research methodology is critical for the optimal evaluation of current and new cartilage repair technologies. In addition to generally applicable principles for orthopedic study design, specific criteria and considerations apply to cartilage repair studies. Systematic application of these criteria and considerations can facilitate study designs that are scientifically rigorous, ethical, practical, and appropriate for the question(s) being addressed in any given cartilage repair research project. PMID:26069574
NASA Astrophysics Data System (ADS)
Wang, Lynn T.-N.; Schroeder, Uwe Paul; Madhavan, Sriram
2017-03-01
A pattern-based methodology for optimizing SADP-compliant layout designs is developed based on identifying cut mask patterns and replacing them with pre-characterized fixing solutions. A pattern-based library of difficult-to-manufacture cut patterns with pre-characterized fixing solutions is built. A pattern-based engine searches for matching patterns in the decomposed layouts. When a match is found, the engine opportunistically replaces the detected pattern with a pre-characterized fixing solution. The methodology was demonstrated on a 7nm routed metal2 block. A small library of 30 cut patterns increased the number of more manufacturable cuts by 38% and metal-via enclosure by 13% with a small parasitic capacitance impact of 0.3%.
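A toy version of the pattern-match-and-replace step is sketched below on a small binary cut map; the library pattern, its fixing solution, and the map are invented for illustration and do not represent real cut-mask data or the production tool chain.

```python
# Scan a binary cut map for a known "difficult" pattern and swap in its
# pre-characterized fix wherever it occurs.
import numpy as np

# Hypothetical library entry: one difficult 2x2 cut pattern and its fix.
bad_pattern = np.array([[1, 1],
                        [1, 0]])
fix_pattern = np.array([[1, 0],
                        [1, 1]])

cut_map = np.array([
    [0, 1, 1, 0, 0],
    [0, 1, 0, 0, 1],
    [0, 0, 0, 1, 1],
    [0, 0, 0, 1, 0],
])

def replace_patterns(layout, bad, fix):
    out = layout.copy()
    ph, pw = bad.shape
    n_fixed = 0
    for r in range(layout.shape[0] - ph + 1):
        for c in range(layout.shape[1] - pw + 1):
            if np.array_equal(out[r:r + ph, c:c + pw], bad):
                out[r:r + ph, c:c + pw] = fix
                n_fixed += 1
    return out, n_fixed

fixed_map, n = replace_patterns(cut_map, bad_pattern, fix_pattern)
print(f"patterns replaced: {n}")
print(fixed_map)
```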
2017-10-01
to patient safety by addressing key methodological and conceptual gaps in healthcare simulation-based team training. The investigators are developing...primary outcome of Aim 1a is a conceptually and methodologically sound training design architecture that supports the development and integration of team...should be delivered. This subtask was delayed by approximately 1 month and is now completed. Completed Evaluation of existing experimental dataset to
2010-08-01
This study presents a methodology for computing stochastic sensitivities with respect to the design variables, which are the ... random variables.
Perceived Managerial and Leadership Effectiveness in Colombia
ERIC Educational Resources Information Center
Torres, Luis Eduardo; Ruiz, Carlos Enrique; Hamlin, Bob; Velez-Calle, Andres
2015-01-01
Purpose: The purpose of this study was to identify what Colombians perceive as effective and least effective/ineffective managerial behavior. Design/methodology/approach: This study was conducted following a qualitative methodology based on the philosophical assumptions of pragmatism and the "pragmatic approach" (Morgan, 2007). The…
Multi-Step Usage of in Vivo Models During Rational Drug Design and Discovery
Williams, Charles H.; Hong, Charles C.
2011-01-01
In this article we propose a systematic development method for rational drug design while reviewing paradigms in industry and emerging techniques and technologies in the field. Although the process of drug development today has been accelerated by the emergence of computational methodologies, it is a herculean challenge requiring exorbitant resources and often fails to yield clinically viable results. The current paradigm of target-based drug design is often misguided and tends to yield compounds that have poor absorption, distribution, metabolism, excretion, and toxicology (ADMET) properties. Therefore, an in vivo organism-based approach allowing for a multidisciplinary inquiry into potent and selective molecules is an excellent place to begin rational drug design. We will review how organisms like the zebrafish and Caenorhabditis elegans can not only be starting points, but can be used at various steps of the drug development process, from target identification to pre-clinical trial models. This systems-biology-based approach, paired with the power of computational biology, genetics, and developmental biology, provides a methodological framework to avoid the pitfalls of traditional target-based drug design. PMID:21731440
NASA Technical Reports Server (NTRS)
Rehfield, Lawrence W.; Zischka, Peter J.; Fentress, Michael L.; Chang, Stephen
1992-01-01
Some of the unique considerations that are associated with the design and experimental evaluation of chordwise deformable wing structures are addressed. Since chordwise elastic camber deformations are desired and must be free to develop, traditional rib concepts and experimental methodology cannot be used. New rib design concepts are presented and discussed. An experimental methodology based upon the use of a flexible sling support and load application system has been created and utilized to evaluate a model box beam experimentally. Experimental data correlate extremely well with design analysis predictions based upon a beam model for the global properties of camber compliance and spanwise bending compliance. Local strain measurements exhibit trends in agreement with intuition and theory but depart slightly from theoretical perfection based upon beam-like behavior alone. It is conjectured that some additional refinement of experimental technique is needed to explain or eliminate these (minor) departures from asymmetric behavior of upper and lower box cover strains. Overall, a solid basis for the design of box structures based upon the bending method of elastic camber production has been confirmed by the experiments.
Experience-based co-design in an adult psychological therapies service.
Cooper, Kate; Gillmore, Chris; Hogg, Lorna
2016-01-01
Experience-based co-design (EBCD) is a methodology for service improvement and development, which puts service-user voices at the heart of improving health services. The aim of this paper was to implement the EBCD methodology in a mental health setting, and to investigate the challenges which arise during this process. In order to achieve this, a modified version of the EBCD methodology was undertaken, which involved listening to the experiences of the people who work in and use the mental health setting and sharing these experiences with the people who could effect change within the service, through collaborative work between service-users, staff and managers. EBCD was implemented within the mental health setting and was well received by service-users, staff and stakeholders. A number of modifications were necessary in this setting, for example high levels of support available to participants. It was concluded that EBCD is a suitable methodology for service improvement in mental health settings.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Augustine, Chad
Existing methodologies for estimating the electricity generation potential of Enhanced Geothermal Systems (EGS) assume thermal recovery factors of 5% or less, resulting in relatively low volumetric electricity generation potentials for EGS reservoirs. This study proposes and develops a methodology for calculating EGS electricity generation potential based on the Gringarten conceptual model and analytical solution for heat extraction from fractured rock. The electricity generation potential of a cubic kilometer of rock as a function of temperature is calculated assuming limits on the allowed produced water temperature decline and reservoir lifetime based on surface power plant constraints. The resulting estimates of EGS electricity generation potential can be one to nearly two orders of magnitude larger than those from existing methodologies. The flow per unit fracture surface area from the Gringarten solution is found to be a key term in describing the conceptual reservoir behavior. The methodology can be applied to aid in the design of EGS reservoirs by giving minimum reservoir volume, fracture spacing, number of fractures, and flow requirements for a target reservoir power output. Limitations of the idealized model compared to actual reservoir performance and the implications on reservoir design are discussed.
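For comparison, a back-of-the-envelope volumetric estimate of the kind the proposed methodology improves upon can be written as below; the rock properties, temperatures, recovery factor, and conversion efficiency are assumptions, not the paper's Gringarten-based values.

```python
# Volumetric heat-in-place estimate for 1 km^3 of rock with a low thermal
# recovery factor, converted to an average electric power over a project life.
rho_c  = 2.55e6        # volumetric heat capacity of rock, J/(m^3*K) (assumed)
V      = 1.0e9         # reservoir volume: 1 km^3 in m^3
T_res  = 225.0         # initial reservoir temperature, degC (assumed)
T_inj  = 70.0          # injection/abandonment temperature, degC (assumed)
R_th   = 0.05          # thermal recovery factor (the ~5% of older methodologies)
eta    = 0.12          # heat-to-electricity conversion efficiency (assumed)
life_s = 30 * 365.25 * 24 * 3600   # 30-year project life, s (assumed)

E_thermal  = rho_c * V * (T_res - T_inj)   # heat in place, J
E_electric = R_th * eta * E_thermal        # recoverable electric energy, J
P_avg_MWe  = E_electric / life_s / 1.0e6

print(f"heat in place: {E_thermal:.2e} J")
print(f"average electric power over 30 yr: {P_avg_MWe:.1f} MWe")
# Raising R_th toward the larger values implied by the Gringarten-based
# methodology scales this result proportionally.
```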
Calculation of Shuttle Base Heating Environments and Comparison with Flight Data
NASA Technical Reports Server (NTRS)
Greenwood, T. F.; Lee, Y. C.; Bender, R. L.; Carter, R. E.
1983-01-01
The techniques, analytical tools, and experimental programs used initially to generate and later to improve and validate the Shuttle base heating design environments are discussed. In general, the measured base heating environments for STS-1 through STS-5 were in good agreement with the preflight predictions. However, some changes were made in the methodology after reviewing the flight data. The flight data is described, preflight predictions are compared with the flight data, and improvements in the prediction methodology based on the data are discussed.
Venture Creation Programs: Bridging Entrepreneurship Education and Technology Transfer
ERIC Educational Resources Information Center
Lackéus, Martin; Williams Middleton, Karen
2015-01-01
Purpose: The purpose of this paper is to explore how university-based entrepreneurship programs, incorporating real-life venture creation into educational design and delivery, can bridge the gap between entrepreneurship education and technology transfer within the university environment. Design/methodology/approach: Based on a literature review…
Design-Based Implementation Research
ERIC Educational Resources Information Center
LeMahieu, Paul G.; Nordstrum, Lee E.; Potvin, Ashley Seidel
2017-01-01
Purpose: This paper is second of seven in this volume elaborating different approaches to quality improvement in education. It delineates a methodology called design-based implementation research (DBIR). The approach used in this paper is aimed at iteratively improving the quality of classroom teaching and learning practices in defined problem…
NASA Astrophysics Data System (ADS)
Chen, Daniel; Chen, Damian; Yen, Ray; Cheng, Mingjen; Lan, Andy; Ghaskadvi, Rajesh
2008-11-01
Identifying hotspots--structures that limit the lithography process window--becomes increasingly important as the industry relies heavily on RET to print sub-wavelength designs. KLA-Tencor's patented Process Window Qualification (PWQ) methodology has been used for this purpose in various fabs. The PWQ methodology has three key advantages: (a) the PWQ layout, for the best sensitivity; (b) Design Based Binning, for pattern repeater analysis; and (c) intelligent sampling, for the best DOI sampling rate. This paper evaluates two different analysis strategies for SEM review sampling successfully deployed at Inotera Memories, Inc. We propose a new approach combining location repeaters and pattern repeaters. Based on a recent case study, the new sampling flow reduces the data analysis and sampling time from 6 hours to 1.5 hours while maintaining the maximum DOI sample rate.
Evaluation of subgrade moduli for flexible pavements in Virginia : final report.
DOT National Transportation Integrated Search
1980-01-01
Advances in pavement design technology in recent years have led to more dependence on mechanistic approaches and less reliance on subjective design criteria. In Virginia, the tendency is toward a pavement design and evaluation methodology based on el...
Development of asphalt dynamic modulus master curve using falling weight deflectometer measurements.
DOT National Transportation Integrated Search
2014-06-01
The asphalt concrete (AC) dynamic modulus (|E*|) is a key design parameter in mechanistic-based pavement design : methodologies such as the American Association of State Highway and Transportation Officials (AASHTO) MEPDG/Pavement-ME Design. The obje...
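The MEPDG-style sigmoidal master curve form, log|E*| = delta + alpha / (1 + exp(beta + gamma*log10(tr))), can be fitted as sketched below; the data are synthetic and the parameter values are assumptions, not results from an actual FWD data set.

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(log_tr, delta, alpha, beta, gamma):
    return delta + alpha / (1.0 + np.exp(beta + gamma * log_tr))

# Synthetic "measurements": reduced time (log10 s) vs log10 |E*| (ksi), with noise.
rng    = np.random.default_rng(3)
log_tr = np.linspace(-4, 4, 25)
log_E  = sigmoid(log_tr, 0.8, 3.2, -0.6, -0.5) + rng.normal(0.0, 0.03, log_tr.size)

popt, _ = curve_fit(sigmoid, log_tr, log_E, p0=[1.0, 3.0, 0.0, -0.5])
delta, alpha, beta, gamma = popt
print(f"delta={delta:.2f}, alpha={alpha:.2f}, beta={beta:.2f}, gamma={gamma:.2f}")
print(f"|E*| limits: {10**delta:.1f} to {10**(delta + alpha):.1f} ksi")
```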
Investigation of Dynamic Modulus and Flow Number Properties of Asphalt Mixtures In Washington State
DOT National Transportation Integrated Search
2011-11-11
Pavement design is now moving toward more mechanistic based design methodologies for the purpose of producing long : lasting and higher performance pavements in a cost-effective manner. The recent Mechanistic-Empirical pavement : design guide (MEPDG)...
NASA Astrophysics Data System (ADS)
Glezil, Dorothy
NEHRP's Provisions currently govern conventional seismic-resistant design. Although these provisions ensure the life safety of building occupants, extensive damage and economic losses may still occur in the structures. This minimum performance can be enhanced using the Performance-Based Earthquake Engineering (PBEE) methodology and passive control systems like base isolation and energy dissipation systems. Even though these technologies and the PBEE methodology are effective in reducing economic losses and fatalities during earthquakes, getting them implemented into seismic-resistant design has been challenging. One of the many barriers to their implementation has been their upfront costs. The green building community has faced some of the same challenges that the high performance seismic design community currently faces. The goal of this thesis is to draw on the success of the green building industry to provide recommendations that may be used to overcome the barriers that high performance seismic design (HPSD) currently faces.
ERIC Educational Resources Information Center
Yavuzcan, H. Güçlü; Sahin, Damla
2017-01-01
In industrial design (ID) education, mechanics-based courses are mainly based on a traditional lecture approach and they are highly abstract for ID students to comprehend. The existing studies highlight the requirement of a new approach for mechanics-based courses in ID departments. This study presents a combined teaching model for mechanisms…
Utilizing the Design Charrette for Teaching Sustainability
ERIC Educational Resources Information Center
Walker, Jason B.; Seymour, Michael W.
2008-01-01
Purpose: This paper aims to investigate the design charrette as a method for teaching sustainability. Design/methodology/approach: The paper utilizes a student-based design charrette for the Mississippi Gulf Coast as a framework for teaching sustainability. An assessment of the charrette's role in promoting sustainability in higher…
Quasi-experimental study designs series-paper 9: collecting data from quasi-experimental studies.
Aloe, Ariel M; Becker, Betsy Jane; Duvendack, Maren; Valentine, Jeffrey C; Shemilt, Ian; Waddington, Hugh
2017-09-01
To identify variables that must be coded when synthesizing primary studies that use quasi-experimental (QE) designs. All quasi-experimental designs. When designing a systematic review of QE studies, potential sources of heterogeneity, both theory-based and methodological, must be identified. We outline key components of inclusion criteria for syntheses of quasi-experimental studies. We provide recommendations for coding content-relevant and methodological variables and outline the distinction between bivariate effect sizes and partial (i.e., adjusted) effect sizes. The designs used and the controls used are viewed as of greatest importance. Potential sources of bias and confounding are also addressed. Careful consideration must be given to inclusion criteria and the coding of theoretical and methodological variables during the design phase of a synthesis of quasi-experimental studies. The success of the meta-regression analysis relies on the data available to the meta-analyst. Omission of critical moderator variables (i.e., effect modifiers) will undermine the conclusions of a meta-analysis. Copyright © 2017 Elsevier Inc. All rights reserved.
Design optimization of gas generator hybrid propulsion boosters
NASA Technical Reports Server (NTRS)
Weldon, Vincent; Phillips, Dwight U.; Fink, Lawrence E.
1990-01-01
A methodology used in support of a contract study for NASA/MSFC to optimize the design of a gas generator hybrid propulsion booster for uprating the National Space Transportation System (NSTS) is presented. The objective was to compare alternative configurations for this booster approach, optimizing each candidate concept on different bases, in order to develop data for a trade table on which a final decision was based. The methodology is capable of processing a large number of independent and dependent variables, adjusting the overall subsystem characteristics to arrive at a best-compromise integrated design that meets various specified optimization criteria subject to selected constraints. For each system considered, a detailed weight statement was generated along with preliminary cost and reliability estimates.
Toward a Framework for Comparative HRD Research
ERIC Educational Resources Information Center
Wang, Greg G.; Sun, Judy Y.
2012-01-01
Purpose: This paper seeks to address the recent challenges in the international human resource development (HRD) research and the related methodological strategy. Design/methodology/approach: This inquiry is based on a survey of literatures and integrates various comparative research strategies adopted in other major social science disciplines.…
Does Class Matter? Mentoring Small Businesses' Owner-Managers
ERIC Educational Resources Information Center
Greenbank, Paul
2006-01-01
Purpose: This paper examines the way social class influences the relationship between business mentors and small business owner-managers. Design/methodology/approach: The paper is based on the author's experience of mentoring businesses with The Prince's Trust. Three businesses were selected as cases. The methodological approach involved…
The Interrelations between Competences for Sustainable Development and Research Competences
ERIC Educational Resources Information Center
Lambrechts, Wim; Van Petegem, Peter
2016-01-01
Purpose: The purpose of this paper is to explore how competences for sustainable development and research interrelate within a context of competence-based higher education. Specific focus is oriented towards strengthening research competences for sustainability. Design/methodology/approach: Following a hermeneutic-interpretive methodology, this…
Canadian Chefs' Workplace Learning
ERIC Educational Resources Information Center
Cormier-MacBurnie, Paulette; Doyle, Wendy; Mombourquette, Peter; Young, Jeffrey D.
2015-01-01
Purpose: This paper aims to examine the formal and informal workplace learning of professional chefs. In particular, it considers chefs' learning strategies and outcomes as well as the barriers to and facilitators of their workplace learning. Design/methodology/approach: The methodology is based on in-depth, face-to-face, semi-structured…
A POLLUTION REDUCTION METHODOLOGY FOR CHEMICAL PROCESS SIMULATORS
A pollution minimization methodology was developed for chemical process design using computer simulation. It is based on a pollution balance that at steady state is used to define a pollution index with units of mass of pollution per mass of products. The pollution balance has be...
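A minimal sketch of the steady-state pollution index described above (mass of pollutants per mass of products) is shown below; the stream names and flows are invented for illustration.

```python
# Steady-state pollution index from a pollution balance over process streams.
streams = {
    # name: (mass flow kg/h, is_product, is_pollutant)
    "product_A":           (950.0, True,  False),
    "byproduct_B":         (120.0, True,  False),
    "vent_VOC":            (4.5,   False, True),
    "wastewater_organics": (12.0,  False, True),
}

pollutant_mass = sum(m for m, _, pol in streams.values() if pol)
product_mass   = sum(m for m, prod, _ in streams.values() if prod)

pollution_index = pollutant_mass / product_mass   # kg pollutant per kg product
print(f"pollution index = {pollution_index:.4f} kg/kg")
```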
A four stage approach for ontology-based health information system design.
Kuziemsky, Craig E; Lau, Francis
2010-11-01
To describe and illustrate a four stage methodological approach to capture user knowledge in a biomedical domain area, use that knowledge to design an ontology, and then implement and evaluate the ontology as a health information system (HIS). A hybrid participatory design-grounded theory (GT-PD) method was used to obtain data and code them for ontology development. Prototyping was used to implement the ontology as a computer-based tool. Usability testing evaluated the computer-based tool. An empirically derived domain ontology and set of three problem-solving approaches were developed as a formalized model of the concepts and categories from the GT coding. The ontology and problem-solving approaches were used to design and implement a HIS that tested favorably in usability testing. The four stage approach illustrated in this paper is useful for designing and implementing an ontology as the basis for a HIS. The approach extends existing ontology development methodologies by providing an empirical basis for theory incorporated into ontology design. Copyright © 2010 Elsevier B.V. All rights reserved.
Design Methodology for Multi-Element High-Lift Systems on Subsonic Civil Transport Aircraft
NASA Technical Reports Server (NTRS)
Pepper, R. S.; vanDam, C. P.
1996-01-01
The choice of a high-lift system is crucial in the preliminary design process of a subsonic civil transport aircraft. Its purpose is to increase the allowable aircraft weight or decrease the aircraft's wing area for a given takeoff and landing performance. However, the implementation of a high-lift system into a design must be done carefully, for it can improve the aerodynamic performance of an aircraft but may also drastically increase the aircraft empty weight. If designed properly, a high-lift system can improve the cost effectiveness of an aircraft by increasing the payload weight for a given takeoff and landing performance. This is why the design methodology for a high-lift system should incorporate aerodynamic performance, weight, and cost. The airframe industry has experienced rapid technological growth in recent years which has led to significant advances in high-lift systems. For this reason many existing design methodologies have become obsolete since they are based on outdated low Reynolds number wind-tunnel data and can no longer accurately predict the aerodynamic characteristics or weight of current multi-element wings. Therefore, a new design methodology has been created that reflects current aerodynamic, weight, and cost data and provides enough flexibility to allow incorporation of new data when it becomes available.
Modeling ground-based timber harvesting systems using computer simulation
Jingxin Wang; Chris B. LeDoux
2001-01-01
Modeling ground-based timber harvesting systems with an object-oriented methodology was investigated. Object-oriented modeling and design promote a better understanding of requirements, cleaner designs, and better maintainability of the harvesting simulation system. The model developed simulates chainsaw felling, drive-to-tree feller-buncher, swing-to-tree single-grip...
Design Based Research Methodology for Teaching with Technology in English
ERIC Educational Resources Information Center
Jetnikoff, Anita
2015-01-01
Design based research (DBR) is an appropriate method for small scale educational research projects involving collaboration between teachers, students and researchers. It is particularly useful in collaborative projects where an intervention is implemented and evaluated in a grounded context. The intervention can be technological, or a new program…
Engaging Workers in Simulation-Based E-Learning
ERIC Educational Resources Information Center
Slotte, Virpi; Herbert, Anne
2008-01-01
Purpose: The purpose of this paper is to evaluate learners' attitudes to the use of simulation-based e-learning as part of workplace learning when socially situated interaction and blended learning are specifically included in the instructional design. Design/methodology/approach: Responses to a survey questionnaire of 298 sales personnel were…
A methodology and supply chain management inspired reference ontology for modeling healthcare teams.
Kuziemsky, Craig E; Yazdi, Sara
2011-01-01
Numerous studies and strategic plans are advocating more team based healthcare delivery that is facilitated by information and communication technologies (ICTs). However before we can design ICTs to support teams we need a solid conceptual model of team processes and a methodology for using such a model in healthcare settings. This paper draws upon success in the supply chain management domain to develop a reference ontology of healthcare teams and a methodology for modeling teams to instantiate the ontology in specific settings. This research can help us understand how teams function and how we can design ICTs to support teams.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klymenko, M. V.; Remacle, F., E-mail: fremacle@ulg.ac.be
2014-10-28
A methodology is proposed for designing a low-energy consuming ternary-valued full adder based on a quantum dot (QD) electrostatically coupled with a single electron transistor operating as a charge sensor. The methodology is based on design optimization: the values of the physical parameters of the system required for implementing the logic operations are optimized using a multiobjective genetic algorithm. The searching space is determined by elements of the capacitance matrix describing the electrostatic couplings in the entire device. The objective functions are defined as the maximal absolute error over actual device logic outputs relative to the ideal truth tables for the sum and the carry-out in base 3. The logic units are implemented on the same device: a single dual-gate quantum dot and a charge sensor. Their physical parameters are optimized to compute either the sum or the carry-out outputs and are compatible with current experimental capabilities. The outputs are encoded in the value of the electric current passing through the charge sensor, while the logic inputs are supplied by the voltage levels on the two gate electrodes attached to the QD. The complex ternary logic operations are directly implemented on an extremely simple device, characterized by small size and low energy consumption compared to devices based on switching single-electron transistors. The design methodology is general and provides a rational approach for realizing non-switching logic operations on QD devices.
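The optimization target can be illustrated as below: the base-3 sum and carry-out truth tables and a maximum-absolute-error objective of the kind the genetic algorithm minimizes. The stand-in "device" response is an assumption for illustration, not the quantum-dot model.

```python
import itertools
import numpy as np

inputs      = list(itertools.product(range(3), repeat=3))      # (a, b, carry_in)
sum_table   = np.array([(a + b + c) % 3  for a, b, c in inputs])
carry_table = np.array([(a + b + c) // 3 for a, b, c in inputs])

def max_abs_error(device_outputs, target):
    """Objective per output: worst-case deviation from the ideal truth table."""
    return np.max(np.abs(np.asarray(device_outputs) - target))

# Stand-in "device": a noisy copy of the ideal sum output, as one candidate
# individual in the genetic algorithm might produce.
rng = np.random.default_rng(7)
candidate_sum = sum_table + rng.normal(0.0, 0.2, sum_table.size)

print("objective (sum output):  ", round(float(max_abs_error(candidate_sum, sum_table)), 3))
print("objective (carry output):", round(float(max_abs_error(carry_table, carry_table)), 3))
```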
2006-10-01
agents. Research Study Design & Survey Development: Study Design. We designed a prospective quantitative study looking at the...to bioterrorism to different groups of healthcare learners as follows: This research study design was developed in collaboration with the...PDA-based format, and printed monograph-based format. The research will focus on the effectiveness of distance learning and self-study methodologies
Handley, Margaret A; Schillinger, Dean; Shiboski, Stephen
2011-01-01
Although randomized controlled trials are often a gold standard for determining intervention effects, in the area of practice-based research (PBR) there are many situations in which individual randomization is not possible. Alternative approaches to evaluating interventions have received increased attention, particularly those that can retain elements of randomization such that they can be considered "controlled" trials. Methodological design elements and practical implementation considerations for two quasi-experimental design approaches that have considerable promise in PBR settings--the stepped-wedge design and a variant of it, the wait-list cross-over design--are presented, along with a case study from a recent PBR intervention for patients with diabetes. PBR-relevant design features include: creation of a cohort over time that collects control data but allows all participants (clusters or patients) to receive the intervention; staggered introduction of clusters; multiple data collection points; and one-way cross-over into the intervention arm. Practical considerations include randomization versus stratification, training run in phases, and an extended time period for overall study completion. Several design features of practice-based research studies can be adapted to local circumstances yet retain elements to improve methodological rigor. Studies that utilize these methods, such as the stepped-wedge design and the wait-list cross-over design, can increase the evidence base for controlled studies conducted within the complex environment of PBR.
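A small sketch of a stepped-wedge assignment schedule with one-way cross-over is given below; the numbers of clusters and steps are arbitrary assumptions chosen for illustration.

```python
# Every cluster starts in the control condition (0) and crosses over to the
# intervention (1) at a randomly assigned step, never crossing back.
import numpy as np

rng        = np.random.default_rng(11)
n_clusters = 6
n_steps    = 4                      # measurement periods after baseline

# Randomize the order in which clusters cross over (roughly equal per step).
crossover_step = rng.permutation(
    np.repeat(np.arange(1, n_steps + 1),
              int(np.ceil(n_clusters / n_steps)))[:n_clusters])

# Rows: clusters; columns: periods 0..n_steps; 0 = control, 1 = intervention.
schedule = (np.arange(n_steps + 1)[None, :] >= crossover_step[:, None]).astype(int)
for c, row in enumerate(schedule):
    print(f"cluster {c + 1}: {' '.join(map(str, row))}")
```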
Design of crashworthy structures with controlled behavior in HCA framework
NASA Astrophysics Data System (ADS)
Bandi, Punit
The field of crashworthiness design is gaining more interest and attention from automakers around the world due to increasing competition and tighter safety norms. In the last two decades, topology and topometry optimization methods from structural optimization have been widely explored to improve existing designs or conceive new designs with better crashworthiness. Although many gradient-based and heuristic methods for topology- and topometry-based crashworthiness design are available these days, most of them result in stiff structures that are suitable only for a set of vehicle components in which maximizing the energy absorption or minimizing the intrusion is the main concern. However, there are some other components in a vehicle structure that should have characteristics of both stiffness and flexibility. Moreover, the load paths within the structure and potential buckle modes also play an important role in efficient functioning of such components. For example, the front bumper, side frame rails, steering column, and occupant protection devices like the knee bolster should all exhibit controlled deformation and collapse behavior. The primary objective of this research is to develop new methodologies to design crashworthy structures with controlled behavior. The well established Hybrid Cellular Automaton (HCA) method is used as the basic framework for the new methodologies, and compliant mechanism-type (sub)structures are the highlight of this research. The ability of compliant mechanisms to efficiently transfer force and/or motion from points of application of input loads to desired points within the structure is used to design solid and tubular components that exhibit controlled deformation and collapse behavior under crash loads. In addition, a new methodology for controlling the behavior of a structure under multiple crash load scenarios by adaptively changing the contributions from individual load cases is developed. Applied to practical design problems, the results demonstrate that the methodologies provide a practical tool to aid the design engineer in generating design concepts for crashworthy structures with controlled behavior. Although developed in the HCA framework, the basic ideas behind these methods are generic and can be easily implemented with other available topology- and topometry-based optimization methods.
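A minimal sketch of the basic HCA design update that the framework builds on: element densities are driven toward a uniform internal energy density setpoint after each crash analysis. The variable names and gain are assumptions, and the dissertation's controlled-behavior extensions (compliant-mechanism substructures, adaptive multi-load weighting) are not shown.

    # Hedged sketch of a proportional-like Hybrid Cellular Automaton update rule.
    import numpy as np

    def hca_update(density, ied, ied_target, gain=0.05, d_min=0.01, d_max=1.0):
        """One HCA design update from a (nonlinear FE) crash analysis result.

        density    : array of element relative densities
        ied        : array of element internal energy densities from the last analysis
        ied_target : scalar setpoint (e.g., chosen to meet a mass constraint)
        """
        error = ied - ied_target                   # local control error
        new_density = density + gain * error       # drive IED toward the setpoint
        return np.clip(new_density, d_min, d_max)  # enforce physical bounds

    rho = hca_update(np.full(5, 0.5), ied=np.array([1.0, 2.0, 0.5, 3.0, 1.5]), ied_target=1.5)
    print(rho)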
Infusing Technology Driven Design Thinking in Industrial Design Education: A Case Study
ERIC Educational Resources Information Center
Mubin, Omar; Novoa, Mauricio; Al Mahmud, Abdullah
2017-01-01
Purpose: This paper narrates a case study on design thinking-based education work in an industrial design honours program. Student projects were developed in a multi-disciplinary setting across a Computing and Engineering faculty that allowed promoting technologically and user-driven innovation strategies. Design/methodology/approach: A renewed…
Sustainability in the Education of Industrial Designers: The Case for Australia
ERIC Educational Resources Information Center
Ramirez, Mariano
2006-01-01
Purpose: The paper intends to determine the extent to which environmental sustainability issues are integrated in the curricula of industrial design programs in Australian universities. Design/methodology/approach: Industrial design lecturers and program heads were invited to participate in a web-based survey on their university's industrial…
A Design Methodology for Complex (E)-Learning. Innovative Session.
ERIC Educational Resources Information Center
Bastiaens, Theo; van Merrienboer, Jeroen; Hoogveld, Bert
Human resource development (HRD) specialists are searching for instructional design models that accommodate e-learning platforms. Van Merrienboer proposed the four-component instructional design model (4C/ID model) for competency-based education. The model's basic message is that well-designed learning environments can always be described in terms…
Towards Design Guidelines for Work Related Learning Arrangements
ERIC Educational Resources Information Center
Lappia, Josephine H.
2011-01-01
Purpose: The purpose of the study is to produce design guidelines based on insights from both practice and theory that will enable teachers and educational developers to execute the design, implementation and evaluation of their work-related learning arrangements with stakeholders involved. Design/methodology/approach: The first study reported in…
NASA Astrophysics Data System (ADS)
Moffitt, Blake Almy
Unmanned Aerial Vehicles (UAVs) are the most dynamic growth sector of the aerospace industry today. The need to provide persistent intelligence, surveillance, and reconnaissance for military operations is driving the planned acquisition of over 5,000 UAVs over the next five years. The most pressing need is for quiet, small UAVs with endurance beyond what is capable with advanced batteries or small internal combustion propulsion systems. Fuel cell systems demonstrate high efficiency, high specific energy, low noise, low temperature operation, modularity, and rapid refuelability making them a promising enabler of the small, quiet, and persistent UAVs that military planners are seeking. Despite the perceived benefits, the actual near-term performance of fuel cell powered UAVs is unknown. Until the auto industry began spending billions of dollars in research, fuel cell systems were too heavy for useful flight applications. However, the last decade has seen rapid development with fuel cell gravimetric and volumetric power density nearly doubling every 2--3 years. As a result, a few design studies and demonstrator aircraft have appeared, but overall the design methodology and vehicles are still in their infancy. The design of fuel cell aircraft poses many challenges. Fuel cells differ fundamentally from combustion based propulsion in how they generate power and interact with other aircraft subsystems. As a result, traditional multidisciplinary analysis (MDA) codes are inappropriate. Building new MDAs is difficult since fuel cells are rapidly changing in design, and various competitive architectures exist for balance of plant, hydrogen storage, and all electric aircraft subsystems. In addition, fuel cell design and performance data is closely protected which makes validation difficult and uncertainty significant. Finally, low specific power and high volumes compared to traditional combustion based propulsion result in more highly constrained design spaces that are problematic for design space exploration. To begin addressing the current gaps in fuel cell aircraft development, a methodology has been developed to explore and characterize the near-term performance of fuel cell powered UAVs. The first step of the methodology is the development of a valid MDA. This is accomplished by using propagated uncertainty estimates to guide the decomposition of a MDA into key contributing analyses (CAs) that can be individually refined and validated to increase the overall accuracy of the MDA. To assist in MDA development, a flexible framework for simultaneously solving the CAs is specified. This enables the MDA to be easily adapted to changes in technology and the changes in data that occur throughout a design process. Various CAs that model a polymer electrolyte membrane fuel cell (PEMFC) UAV are developed, validated, and shown to be in agreement with hardware-in-the-loop simulations of a fully developed fuel cell propulsion system. After creating a valid MDA, the final step of the methodology is the synthesis of the MDA with an uncertainty propagation analysis, an optimization routine, and a chance constrained problem formulation. This synthesis allows an efficient calculation of the probabilistic constraint boundaries and Pareto frontiers that will govern the design space and influence design decisions relating to optimization and uncertainty mitigation. A key element of the methodology is uncertainty propagation. 
The methodology uses Systems Sensitivity Analysis (SSA) to estimate the uncertainty of key performance metrics due to uncertainties in design variables and uncertainties in the accuracy of the CAs. A summary of SSA is given, along with key rules for properly decomposing an MDA for use with SSA. Verification of SSA uncertainty estimates via Monte Carlo simulations is provided for both an example problem and a detailed MDA of a fuel cell UAV. Implementation of the methodology was performed on a small fuel cell UAV designed to carry a 2.2 kg payload with 24 hours of endurance. Uncertainty distributions for both the design variables and the CAs were estimated based on experimental results and were found to dominate the design space. To reduce uncertainty and test the flexibility of the MDA framework, CAs were replaced with either empirical or semi-empirical relationships during the optimization process. The final design was validated via a hardware-in-the-loop simulation. Finally, the fuel cell UAV probabilistic design space was studied. A graphical representation of the design space was generated and the optima due to deterministic and probabilistic constraints were identified. The methodology was used to identify Pareto frontiers of the design space, which were shown on contour plots of the design space. Unanticipated discontinuities of the Pareto fronts were observed as different constraints became active, providing useful information on which to base design and development decisions.
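A minimal sketch of the kind of first-order, sensitivity-based uncertainty propagation that SSA performs, checked against Monte Carlo sampling. The model f and the input standard deviations are illustrative placeholders, not the fuel cell UAV analyses.

    # Hedged sketch: gradient-based variance propagation vs. Monte Carlo.
    import numpy as np

    def f(x):
        """Placeholder performance metric, e.g. endurance as a function of design variables."""
        return x[0] ** 2 + 3.0 * x[1] + np.sin(x[2])

    def first_order_sigma(f, x0, sigmas, h=1e-6):
        """sigma_y^2 ~= sum_i (df/dx_i)^2 sigma_i^2, gradients by central differences."""
        var = 0.0
        for i, s in enumerate(sigmas):
            xp, xm = x0.copy(), x0.copy()
            xp[i] += h; xm[i] -= h
            dfdx = (f(xp) - f(xm)) / (2.0 * h)
            var += (dfdx * s) ** 2
        return np.sqrt(var)

    x0 = np.array([1.0, 2.0, 0.5])
    sigmas = np.array([0.05, 0.10, 0.02])
    analytic = first_order_sigma(f, x0, sigmas)

    rng = np.random.default_rng(0)
    samples = rng.normal(x0, sigmas, size=(100_000, 3))
    mc = np.std([f(x) for x in samples])
    print(analytic, mc)   # the two estimates should agree for this mildly nonlinear model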
Problems related to the integration of fault tolerant aircraft electronic systems
NASA Technical Reports Server (NTRS)
Bannister, J. A.; Adlakha, V.; Triyedi, K.; Alspaugh, T. A., Jr.
1982-01-01
Problems related to the design of the hardware for an integrated aircraft electronic system are considered. Taxonomies of concurrent systems are reviewed and a new taxonomy is proposed. An informal methodology intended to identify feasible regions of the taxonomic design space is described. Specific tools are recommended for use in the methodology. Based on the methodology, a preliminary strawman integrated fault tolerant aircraft electronic system is proposed. Next, problems related to the programming and control of integrated aircraft electronic systems are discussed. Issues of system resource management, including the scheduling and allocation of real-time periodic tasks in a multiprocessor environment, are treated in detail. The role of software design in integrated fault tolerant aircraft electronic systems is discussed. Conclusions and recommendations for further work are included.
Management Information System Based on the Balanced Scorecard
ERIC Educational Resources Information Center
Kettunen, Juha; Kantola, Ismo
2005-01-01
Purpose: This study seeks to describe the planning and implementation in Finland of a campus-wide management information system using a rigorous planning methodology. Design/methodology/approach: The structure of the management information system is planned on the basis of the management process, where strategic management and the balanced…
Developing International Managers: The Contribution of Cultural Experience to Learning
ERIC Educational Resources Information Center
Townsend, Peter; Regan, Padraic; Li, Liang Liang
2015-01-01
Purpose: The purpose of this paper is to evaluate cultural experience as a learning strategy for developing international managers. Design/methodology/approach: Using an integrated framework, two quantitative studies, based on empirical methodology, are conducted. Study 1, with an undergraduate sample situated in the Asia Pacific, aimed to examine…
Barriers and Coping Mechanisms Relating to Agroforestry Adoption by Smallholder Farmers in Zimbabwe
ERIC Educational Resources Information Center
Chitakira, Munyaradzi; Torquebiau, Emmanuel
2010-01-01
Purpose: The purpose of the present study was to investigate agroforestry adoption by smallholder farmers in Gutu District, Zimbabwe. Design/Methodology/Approach: The methodology was based on field data collected through household questionnaires, key informant interviews and direct observations. Findings: Major findings reveal that traditional…
Three-Dimensional Extension of a Digital Library Service System
ERIC Educational Resources Information Center
Xiao, Long
2010-01-01
Purpose: The paper aims to provide an overall methodology and case study for the innovation and extension of a digital library, especially the service system. Design/methodology/approach: Based on the three-dimensional structure theory of the information service industry, this paper combines a comprehensive analysis with the practical experiences…
How Methodological Features Affect Effect Sizes in Education
ERIC Educational Resources Information Center
Cheung, Alan; Slavin, Robert
2016-01-01
As evidence-based reform becomes increasingly important in educational policy, it is becoming essential to understand how research design might contribute to reported effect sizes in experiments evaluating educational programs. The purpose of this study was to examine how methodological features such as types of publication, sample sizes, and…
Sensor-based activity recognition using extended belief rule-based inference methodology.
Calzada, A; Liu, J; Nugent, C D; Wang, H; Martinez, L
2014-01-01
The recently developed extended belief rule-based inference methodology (RIMER+) recognizes the need to model the different types of information and uncertainty that usually coexist in real environments. A home setting with sensors located in different rooms and on different appliances is a particularly relevant example of such an environment, and it brings a range of challenges for sensor-based activity recognition. Although RIMER+ has been designed as a generic decision model that could be applied in a wide range of situations, this paper discusses how the methodology can be adapted to recognize human activities using binary sensors within smart environments. The evaluation of RIMER+ against other state-of-the-art classifiers in terms of accuracy, efficiency and applicability yielded particularly favourable results in situations of incomplete input data, demonstrating the potential of the methodology and providing a basis for further research on the topic.
Song, Chuan-xia; Chen, Hong-mei; Dai, Yu; Kang, Min; Hu, Jia; Deng, Yun
2014-11-01
To optimize the enzymatic hydrolysis of icariin to baohuoside I by cellulase using a Plackett-Burman design combined with central composite design (CCD) response surface methodology. The main influencing factors were screened with the Plackett-Burman design, and CCD response surface methodology was then used to optimize the hydrolysis process. Taking substrate concentration, buffer pH and reaction time as independent variables and the conversion rate of icariin as the dependent variable, a full quadratic response surface was fitted between the independent and dependent variables; the optimum hydrolysis conditions were identified from 3D response surface plots and confirmed by verification tests and predictive analysis. The best enzymatic hydrolysis conditions were: substrate concentration 8.23 mg/mL, buffer pH 5.12, reaction time 35.34 h. The optimum process for the cellulase hydrolysis of icariin to baohuoside I was thus determined by a Plackett-Burman design combined with CCD response surface methodology. The optimized enzymatic hydrolysis process is simple, convenient, accurate, reproducible and predictable.
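A minimal sketch of the quadratic response-surface step used in CCD-based optimization: fit a full second-order model and locate its stationary point. The data are synthetic placeholders (peaking near the abstract's reported optimum), not the study's measurements.

    # Hedged sketch: fit a full quadratic response surface and find its stationary point.
    import numpy as np

    def quad_features(X):
        """[1, x1, x2, x3, x1^2, x2^2, x3^2, x1x2, x1x3, x2x3] for each row of X."""
        x1, x2, x3 = X.T
        return np.column_stack([np.ones(len(X)), x1, x2, x3,
                                x1**2, x2**2, x3**2, x1*x2, x1*x3, x2*x3])

    rng = np.random.default_rng(1)
    X = rng.uniform([4, 4.5, 20], [12, 6.0, 48], size=(20, 3))   # conc [mg/mL], pH, time [h]
    y = -(X[:, 0] - 8)**2 - 5*(X[:, 1] - 5.1)**2 - 0.02*(X[:, 2] - 35)**2 + 90 \
        + rng.normal(0, 0.5, 20)                                  # synthetic conversion (%)

    beta, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)   # least-squares fit

    # Stationary point of y = b0 + b.x + x'Bx  ->  solve 2Bx = -b
    b = beta[1:4]
    B = np.array([[beta[4],   beta[7]/2, beta[8]/2],
                  [beta[7]/2, beta[5],   beta[9]/2],
                  [beta[8]/2, beta[9]/2, beta[6]  ]])
    x_opt = np.linalg.solve(2*B, -b)
    print("estimated optimum (conc, pH, time):", x_opt)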
NASA Astrophysics Data System (ADS)
Zhafirah Muhammad, Nurul; Harun, A.; Hambali, N. A. M. A.; Murad, S. A. Z.; Mohyar, S. N.; Isa, M. N.; Jambek, AB
2017-11-01
Increased demand for internet of things (IoT) applications has forced a move towards higher-complexity integrated circuits supporting system-on-chip (SoC) designs. This increase in complexity calls for correspondingly sophisticated validation strategies, and researchers have proposed various methodologies to address the problem, including dynamic verification, formal verification and hybrid techniques. It is important to discover bugs early in the SoC verification process in order to reduce time consumption and achieve a fast time to market. This paper therefore focuses on verification methodology at the Register Transfer Level (RTL) of an SoC based on the AMBA bus design. The Open Verification Methodology (OVM) offers an easier route to RTL validation, not as a replacement for the traditional method but as a means to shorten time to market. OVM is thus proposed in this paper as the verification method for larger designs, to avoid bottlenecks in the validation platform.
Srinivasan, Srikant; Broderick, Scott R; Zhang, Ruifeng; Mishra, Amrita; Sinnott, Susan B; Saxena, Surendra K; LeBeau, James M; Rajan, Krishna
2015-12-18
A data driven methodology is developed for tracking the collective influence of the multiple attributes of alloying elements on both thermodynamic and mechanical properties of metal alloys. Cobalt-based superalloys are used as a template to demonstrate the approach. By mapping the high dimensional nature of the systematics of elemental data embedded in the periodic table into the form of a network graph, one can guide targeted first principles calculations that identify the influence of specific elements on phase stability, crystal structure and elastic properties. This provides a fundamentally new means to rapidly identify new stable alloy chemistries with enhanced high temperature properties. The resulting visualization scheme exhibits the grouping and proximity of elements based on their impact on the properties of intermetallic alloys. Unlike the periodic table however, the distance between neighboring elements uncovers relationships in a complex high dimensional information space that would not have been easily seen otherwise. The predictions of the methodology are found to be consistent with reported experimental and theoretical studies. The informatics based methodology presented in this study can be generalized to a framework for data analysis and knowledge discovery that can be applied to many material systems and recreated for different design objectives.
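A minimal sketch of how such an element network can be assembled from tabulated attributes. The attribute values and the similarity threshold are rough illustrative assumptions, not the curated data or metrics used in the study.

    # Hedged sketch: build a network graph of alloying elements from attribute similarity.
    import numpy as np
    import networkx as nx

    # element: (atomic radius [pm], electronegativity, valence electrons) -- illustrative
    attrs = {"Co": (125, 1.88, 9), "Ni": (124, 1.91, 10), "Al": (143, 1.61, 3),
             "W":  (139, 2.36, 6), "Ta": (146, 1.50, 5),  "Ti": (147, 1.54, 4)}

    X = np.array(list(attrs.values()), dtype=float)
    X = (X - X.mean(axis=0)) / X.std(axis=0)          # standardize attributes

    G = nx.Graph()
    names = list(attrs)
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            d = np.linalg.norm(X[i] - X[j])           # distance in attribute space
            if d < 1.5:                               # keep only "close" element pairs
                G.add_edge(names[i], names[j], weight=1.0 / (1e-9 + d))

    print(sorted(G.edges(data="weight")))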
Culturally Responsive Online Design: Learning at Intercultural Intersections
ERIC Educational Resources Information Center
Morong, Gail; DesBiens, Donna
2016-01-01
This article presents evidence-based guidelines to inform culturally responsive online learning design in higher education. Intercultural understanding is now a recognised core learning outcome in a large majority of Canadian public universities; however, supporting design methodology is underdeveloped, especially in online contexts. Our search…
Stochastic Analysis and Design of Heterogeneous Microstructural Materials System
NASA Astrophysics Data System (ADS)
Xu, Hongyi
Advanced materials systems refer to new materials that are composed of multiple traditional constituents but have complex microstructure morphologies, which lead to properties superior to those of conventional materials. To accelerate the development of new advanced materials systems, the objective of this dissertation is to develop a computational design framework and the associated techniques for design automation of microstructural materials systems, with an emphasis on addressing the uncertainties associated with the heterogeneity of microstructural materials. Five key research tasks are identified: design representation, design evaluation, design synthesis, material informatics and uncertainty quantification. Design representation of microstructure includes statistical characterization and stochastic reconstruction. This dissertation develops a new descriptor-based methodology, which characterizes 2D microstructures using descriptors of composition, dispersion and geometry. Statistics of 3D descriptors are predicted based on 2D information to enable 2D-to-3D reconstruction. An efficient sequential reconstruction algorithm is developed to reconstruct statistically equivalent random 3D digital microstructures. In design evaluation, a stochastic decomposition and reassembly strategy is developed to deal with the high computational costs and uncertainties induced by material heterogeneity. The properties of Representative Volume Elements (RVEs) are predicted by stochastically reassembling Statistical Volume Elements (SVEs) with stochastic properties into a coarse representation of the RVE. In design synthesis, a new descriptor-based design framework is developed, which integrates computational methods of microstructure characterization and reconstruction, sensitivity analysis, Design of Experiments (DOE), metamodeling and optimization to enable parametric optimization of the microstructure for achieving the desired material properties. Material informatics is studied to efficiently reduce the dimension of the microstructure design space. This dissertation develops a machine learning-based methodology to identify the key microstructure descriptors that highly impact the properties of interest. In uncertainty quantification, a comparative study on data-driven random process models is conducted to provide guidance for choosing the most accurate model in statistical uncertainty quantification. Two new goodness-of-fit metrics are developed to provide quantitative measurements of random process models' accuracy. The benefits of the proposed methods are demonstrated by the example of designing the microstructure of polymer nanocomposites. This dissertation provides material-generic, intelligent modeling/design methodologies and techniques to accelerate the process of analyzing and designing new microstructural materials systems.
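A minimal sketch of descriptor-based characterization for a 2D two-phase image, computing composition and simple dispersion/geometry descriptors. The descriptor set and the synthetic image are illustrative assumptions, not the dissertation's descriptor library.

    # Hedged sketch: simple 2D microstructure descriptors from a binary image.
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(2)
    img = ndimage.binary_dilation(rng.random((128, 128)) > 0.97, iterations=2)  # filler phase

    # Composition: volume (area) fraction of the filler phase
    vf = img.mean()

    # Dispersion / geometry: label clusters, then take count and equivalent radii
    labels, n_clusters = ndimage.label(img)
    areas = ndimage.sum(img, labels, index=range(1, n_clusters + 1))
    eq_radii = np.sqrt(areas / np.pi)

    print(f"volume fraction {vf:.3f}, clusters {n_clusters}, "
          f"mean equivalent radius {eq_radii.mean():.2f} px")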
Harbison, K; Kelly, J; Burnell, L; Silva, J
1995-01-01
The Scenario-based Engineering Process (SEP) is a user-focused methodology for large and complex system design. This process supports new application development from requirements analysis with domain models to component selection, design and modification, implementation, integration, and archival placement. It is built upon object-oriented methodologies, domain modeling strategies, and scenario-based techniques to provide an analysis process for mapping application requirements to available components. We are using SEP in the health care applications that we are developing. The process has already achieved success in the manufacturing and military domains and is being adopted by many organizations. SEP should prove viable in any domain containing scenarios that can be decomposed into tasks.
Investigation of Weibull statistics in fracture analysis of cast aluminum
NASA Technical Reports Server (NTRS)
Holland, Frederic A., Jr.; Zaretsky, Erwin V.
1989-01-01
The fracture strengths of two large batches of A357-T6 cast aluminum coupon specimens were compared by using two-parameter Weibull analysis. The minimum number of these specimens necessary to find the fracture strength of the material was determined. The applicability of three-parameter Weibull analysis was also investigated. A design methodology based on the combination of elementary stress analysis and Weibull statistical analysis is advanced and applied to the design of a spherical pressure vessel shell. The results from this design methodology are compared with results from the applicable ASME pressure vessel code.
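A minimal sketch of a two-parameter Weibull strength analysis of coupon data. The strengths below are synthetic placeholders, not the A357-T6 data, and the report's combined stress/Weibull design procedure is not reproduced.

    # Hedged sketch: fit a two-parameter Weibull model and evaluate survival probability.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    strengths = stats.weibull_min.rvs(c=12.0, scale=300.0, size=60, random_state=rng)  # MPa

    # Fit shape (Weibull modulus m) and characteristic strength, with location fixed at zero
    m, loc, sigma0 = stats.weibull_min.fit(strengths, floc=0)

    # Probability of survival at a design stress, P_s = exp[-(sigma/sigma0)^m]
    sigma_design = 250.0
    p_survival = np.exp(-(sigma_design / sigma0) ** m)
    print(f"Weibull modulus m = {m:.1f}, characteristic strength = {sigma0:.0f} MPa, "
          f"P_s({sigma_design:.0f} MPa) = {p_survival:.3f}")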
Direct manipulation of wave amplitude and phase through inverse design of isotropic media
NASA Astrophysics Data System (ADS)
Liu, Y.; Vial, B.; Horsley, S. A. R.; Philbin, T. G.; Hao, Y.
2017-07-01
In this article we propose a new design methodology allowing us to control both amplitude and phase of electromagnetic waves from a cylindrical incident wave. This results in isotropic materials and does not resort to transformation optics or its quasi-conformal approximations. Our method leads to two-dimensional isotropic, inhomogeneous material profiles of permittivity and permeability, to which a general class of scattering-free wave solutions arise. Our design is based on the separation of the complex wave solution into amplitude and phase. We give two types of examples to validate our methodology.
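A minimal sketch of the amplitude-phase separation that underlies this kind of inverse design, written for a scalar field in an isotropic medium; the paper works with full permittivity and permeability profiles, so the expressions below are only the scalar analogue.

    Writing the field as \psi(\mathbf{r}) = A(\mathbf{r})\, e^{i\phi(\mathbf{r})} and substituting into the Helmholtz equation \nabla^2 \psi + k_0^2\, n^2(\mathbf{r})\, \psi = 0 gives two real equations:

        \nabla^2 A - A\,|\nabla\phi|^2 + k_0^2\, n^2 A = 0,
        \nabla \cdot \left( A^2\, \nabla\phi \right) = 0.

    Prescribing the desired amplitude A and phase \phi (subject to the second, power-conservation condition) then fixes the material profile through n^2(\mathbf{r}) = \left( |\nabla\phi|^2 - \nabla^2 A / A \right) / k_0^2.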
Contingency theoretic methodology for agent-based web-oriented manufacturing systems
NASA Astrophysics Data System (ADS)
Durrett, John R.; Burnell, Lisa J.; Priest, John W.
2000-12-01
The development of distributed, agent-based, web-oriented, N-tier Information Systems (IS) must be supported by a design methodology capable of responding to the convergence of shifts in business process design, organizational structure, computing, and telecommunications infrastructures. We introduce a contingency theoretic model for the use of open, ubiquitous software infrastructure in the design of flexible organizational IS. Our basic premise is that developers should change the way they view the software design process: from solving a fixed problem to dynamically creating teams of software components. We postulate that developing effective, efficient, flexible, component-based distributed software requires reconceptualizing the current development model. The basic concepts of distributed software design are merged with the environment-causes-structure relationship from contingency theory; the task-uncertainty of organizational-information-processing relationships from information processing theory; and the concept of inter-process dependencies from coordination theory. Software processes are considered as employees, groups of processes as software teams, and distributed systems as software organizations. Design techniques already used in the design of flexible business processes and well researched in the domain of the organizational sciences are presented. Guidelines that can be utilized in the creation of component-based distributed software will be discussed.
Web-Based Online Public Access Catalogues of IIT Libraries in India: An Evaluative Study
ERIC Educational Resources Information Center
Madhusudhan, Margam; Aggarwal, Shalini
2011-01-01
Purpose: The purpose of the paper is to examine the various features and components of web-based online public access catalogues (OPACs) of IIT libraries in India with the help of a specially designed evaluation checklist. Design/methodology/approach: The various features of the web-based OPACs in six IIT libraries (IIT Delhi, IIT Bombay, IIT…
Design for Usability; practice-oriented research for user-centered product design.
van Eijk, Daan; van Kuijk, Jasper; Hoolhorst, Frederik; Kim, Chajoong; Harkema, Christelle; Dorrestijn, Steven
2012-01-01
The Design for Usability project aims at improving the usability of electronic professional and consumer products by creating new methodology and methods for user-centred product development, which are feasible to apply in practice. The project was focused on 5 key areas: (i) design methodology, expanding the existing approach of scenario-based design to incorporate the interaction between product design, user characteristics, and user behaviour; (ii) company processes, barriers and enablers for usability in practice; (iii) user characteristics in relation to types of products and use-situations; (iv) usability decision-making; and (v) product impact on user behaviour. The project team developed methods and techniques in each of these areas to support the design of products with a high level of usability. This paper brings together and summarizes the findings.
André, Francisco J; Cardenete, M Alejandro; Romero, Carlos
2009-05-01
Economic policy needs to pay increasing attention to environmental issues, which requires the development of methodologies able to incorporate environmental as well as macroeconomic goals in the design of public policies. Starting from this observation, this article proposes a methodology, based upon a Simonian satisficing logic made operational with the help of goal programming (GP) models, to address the joint design of macroeconomic and environmental policies. The methodology is applied to the Spanish economy, where a joint policy is elicited taking into consideration macroeconomic goals (economic growth, inflation, unemployment, public deficit) and environmental goals (CO2, NOx and SOx emissions) within the context of a computable general equilibrium model. The results show how the government can "fine-tune" its policy according to different criteria using GP models. The resulting policies aggregate the environmental and the economic goals in different ways: maximum aggregate performance, maximum balance and a lexicographic hierarchy of the goals.
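A generic weighted goal-programming formulation of the kind the abstract refers to (a sketch only; the article's actual model is embedded in a computable general equilibrium framework, and its specific goals, weights and normalizations are not reproduced here):

    \min_{x,\, n,\, p} \; \sum_{i=1}^{q} \left( \alpha_i \frac{n_i}{k_i} + \beta_i \frac{p_i}{k_i} \right)
    \text{subject to} \quad f_i(x) + n_i - p_i = t_i, \quad n_i, p_i \ge 0, \quad i = 1, \dots, q, \quad x \in F,

where the f_i(x) are the macroeconomic and environmental indicators (growth, inflation, unemployment, deficit, CO2, NOx, SOx), t_i their target levels, n_i and p_i the negative and positive deviations from each target, k_i normalization constants, and \alpha_i, \beta_i weights that penalize only the unwanted deviations (e.g., emissions above target, growth below target). Replacing the weighted sum by a Chebyshev (MINMAX) objective or by a lexicographic ordering of priority levels yields the "maximum balance" and hierarchical policies mentioned in the abstract.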
Singh, Pankaj Kumar; Negi, Arvind; Gupta, Pawan Kumar; Chauhan, Monika; Kumar, Raj
2016-08-01
Toxicity is a common drawback of newly designed chemotherapeutic agents. With the exception of pharmacophore-induced toxicity (lack of selectivity at higher concentrations of a drug), the toxicity due to chemotherapeutic agents is based on the toxicophore moiety present in the drug. To date, methodologies implemented to determine toxicophores may be broadly classified into biological, bioanalytical and computational approaches. The biological approach involves analysis of bioactivated metabolites, whereas the computational approach involves a QSAR-based method, mapping techniques, an inverse docking technique and a few toxicophore identification/estimation tools. Being one of the major steps in the drug discovery process, toxicophore identification has proven to be an essential screening step in drug design and development. The paper is the first of its kind, attempting to cover and compare different methodologies employed in predicting and determining toxicophores with an emphasis on their scope and limitations. Such information may prove vital in the appropriate selection of methodology and can be used as screening technology by researchers to discover the toxicophoric potentials of their designed and synthesized moieties. Additionally, it can be utilized in the manipulation of molecules containing toxicophores in such a manner that their toxicities might be reduced or eliminated.
Strategic Industrial Alliances in Paper Industry: XML- vs Ontology-Based Integration Platforms
ERIC Educational Resources Information Center
Naumenko, Anton; Nikitin, Sergiy; Terziyan, Vagan; Zharko, Andriy
2005-01-01
Purpose: To identify cases related to design of ICT platforms for industrial alliances, where the use of Ontology-driven architectures based on Semantic web standards is more advantageous than application of conventional modeling together with XML standards. Design/methodology/approach: A comparative analysis of the two latest and the most obvious…
ERIC Educational Resources Information Center
Terrazas-Arellanes, Fatima E.; Knox, Carolyn; Strycker, Lisa A.; Walden, Emily D.
2017-01-01
This article reports on how design-based research methodology was used to guide a line of intervention research that developed, implemented, revised, and evaluated online learning science curricula for middle school students, including general education students and English language learners (primarily of Hispanic origin). The iterative,…
NASA Astrophysics Data System (ADS)
Dutta, Sekhar Chandra; Chakroborty, Suvonkar; Raychaudhuri, Anusrita
Vibration transmitted to a structure during an earthquake may vary in magnitude over a wide range. Design methodology should, therefore, enumerate steps so that structures are able to survive even severe ground motion. For economic reasons, however, strength can be provided such that the structure remains in the elastic range in low-to-moderate earthquakes and is allowed to undergo inelastic deformation in a severe earthquake without collapse. To implement this design philosophy, a rigorous nonlinear dynamic analysis needs to be performed to estimate the inelastic demands. However, such analysis is time consuming and requires expertise to judge the results obtained. In this context, the present paper discusses and demonstrates an alternative simple method, known as the pushover method, which can easily be used by practicing engineers, bypassing intricate nonlinear dynamic analysis, and can be thought of as a substitute for the latter. This method is still under development and is becoming increasingly popular for its simplicity. The objective of this paper is to emphasize and demonstrate the basic concept, strength and ease of this state-of-the-art methodology for regular use in design offices in performance-based seismic design of structures.
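A minimal sketch of the capacity-curve idea behind the pushover method, using a displacement-controlled elastoplastic single-story model; the stiffness and yield values are illustrative assumptions, not from the paper.

    # Hedged sketch: base-shear vs. roof-displacement capacity curve for one elastoplastic story.
    import numpy as np

    k = 2.0e4       # elastic lateral stiffness [kN/m]
    V_yield = 400.0 # yield base shear [kN]

    roof_disp = np.linspace(0.0, 0.10, 101)                # imposed roof displacement [m]
    base_shear = np.minimum(k * roof_disp, V_yield)        # elastic-perfectly-plastic response

    # The (roof_disp, base_shear) pairs form the capacity curve; in practice it comes from a
    # nonlinear static analysis of the full model under an assumed lateral load pattern and
    # is compared against a demand spectrum.
    for d, v in zip(roof_disp[::25], base_shear[::25]):
        print(f"{d:.3f} m  ->  {v:6.1f} kN")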
2018-01-01
14. ABSTRACT The objective of this effort was to: (a) develop novel and fundamental methodologies for data representation using hardware-based spike...Distribution Unlimited. 1 1.0 SUMMARY This effort is a critical part of an overall program to develop novel and fundamental methodologies for data...to fabrication a dynamic-reservoir circuit that utilizes sensory encoding methodologies similar to those employed in biological brains. Inspired
Nonlinear maneuver autopilot for the F-15 aircraft
NASA Technical Reports Server (NTRS)
Menon, P. K. A.; Badgett, M. E.; Walker, R. A.
1989-01-01
A methodology is described for the development of flight test trajectory control laws based on singular perturbation methodology and nonlinear dynamic modeling. The control design methodology is applied to a detailed nonlinear six-degree-of-freedom simulation of the F-15, and results are presented for a level acceleration, a pushover/pullup maneuver, a zoom and pushover maneuver, an excess thrust windup turn, a constant thrust windup turn, and a constant dynamic pressure/constant load factor trajectory.
A Test Methodology for Determining Space-Readiness of Xilinx SRAM-Based FPGA Designs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quinn, Heather M; Graham, Paul S; Morgan, Keith S
2008-01-01
Using reconfigurable, static random-access memory (SRAM) based field-programmable gate arrays (FPGAs) for space-based computation has been an exciting area of research for the past decade. Since both the circuit and the circuit's state are stored in radiation-tolerant memory, both could be altered by the harsh space radiation environment. Both the circuit and the circuit's state can be protected by triple-modular redundancy (TMR), but applying TMR to FPGA user designs is often an error-prone process. Faulty application of TMR could cause the FPGA user circuit to output incorrect data. This paper will describe a three-tiered methodology for testing FPGA user designs for space-readiness. We will describe the standard approach to testing FPGA user designs using a particle accelerator, as well as two methods using fault injection and a modeling tool. While accelerator testing is the current 'gold standard' for pre-launch testing, we believe the use of fault injection and modeling tools allows for easy, cheap and uniform access for discovering errors early in the design process.
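A minimal software analogy of TMR and fault-injection testing: triplicate a function, majority-vote the outputs, and inject single bit flips to confirm the voted output stays correct. This is not the paper's FPGA tool flow; the example "circuit" is a placeholder.

    # Hedged sketch: bitwise majority voting plus single-upset fault injection.
    import random

    def circuit(x):                     # stand-in for the FPGA user design under test
        return (x ^ 0b1010) & 0xFF

    def voter(a, b, c):                 # bitwise majority vote of three replica outputs
        return (a & b) | (a & c) | (b & c)

    def run_tmr(x, flip_replica=None, flip_bit=0):
        outs = [circuit(x) for _ in range(3)]
        if flip_replica is not None:                 # inject a single-event upset into one copy
            outs[flip_replica] ^= (1 << flip_bit)
        return voter(*outs)

    random.seed(0)
    errors = 0
    for _ in range(10_000):
        x = random.randrange(256)
        if run_tmr(x, flip_replica=random.randrange(3), flip_bit=random.randrange(8)) != circuit(x):
            errors += 1
    print("voted-output errors under single-replica upsets:", errors)   # expected: 0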
Schünemann, Holger J
2013-01-01
In this brief article which summarises a presentation given at the "6. Diskussionsforum zur Nutzenbewertung im Gesundheitswesen" of the German Ministry of Education and Research "Gesundheitsforschungsrat (GFR)" and the Institute for Quality and Efficiency in Healthcare (IQWiG) I will analyse some methodological idiosyncrasies of studies evaluating non-pharmacological non-technical interventions (NPNTI). I will focus on how the methodological framework of the Grading of Recommendations Assessment, Development and Evaluation (GRADE) working group may support design and appraisal of NPNTI. Specific design features that may be of particular value in NPNTI research, such as expertise-based randomised controlled trials, will be briefly described. Finally, based on an example, I will argue that - despite the methodological idiosyncrasies - there is neither a sufficient reason to accept different standards for the assessment of the confidence in the evidence from NPNTI nor for using study designs that are less rigorous compared to "simpler" interventions but that special measures have to be taken to reduce the risk of bias. The example that will be used in this article will primarily come from the field of respiratory rehabilitation, a typical multi-component or complex intervention and by definition a complex NPNTI, which has been evaluated in many randomised controlled trials. (As supplied by publisher). Copyright © 2013. Published by Elsevier GmbH.
The Historical and Situated Nature of Design Experiments--Implications for Data Analysis
ERIC Educational Resources Information Center
Krange, I.; Ludvigsen, Sten
2009-01-01
This article is a methodological contribution to the use of design experiments in educational research. We will discuss the implications of a historical and situated interpretation to design experiments, the consequences this has for the analysis of the collected data and empirically based suggestions to improve the designs of the computer-based…
E-Laboratory Design and Implementation for Enhanced Science, Technology and Engineering Education
ERIC Educational Resources Information Center
Morton, William; Uhomoibhi, James
2011-01-01
Purpose: This paper aims to report on the design and implementation of an e-laboratory for enhanced science, technology and engineering education studies. Design/methodology/approach: The paper assesses a computer-based e-laboratory, designed for new entrants to science, technology and engineering programmes of study in further and higher…
Using Appreciative Intelligence for Ice-Breaking: A New Design
ERIC Educational Resources Information Center
Verma, Neena; Pathak, Anil Anand
2011-01-01
Purpose: The purpose of this paper is to highlight the importance of applying appreciative intelligence and appreciative inquiry concepts to design a possibly new model of ice-breaking, which is strengths-based and very often used in any training in general and team building training in particular. Design/methodology/approach: The design has…
NASA Astrophysics Data System (ADS)
Zhang, Lin
2014-02-01
Educators design and create various technology tools to scaffold students' learning. As more and more technology designs are incorporated into learning, growing attention has been paid to the study of technology-based learning tools. This paper discusses emerging issues, such as how learning effectiveness can be understood in relation to different technology features, and how pieces of qualitative and quantitative results can be integrated to achieve a broader understanding of technology designs. To address these issues, this paper proposes a meta-analysis method. Detailed explanations about the structure of the methodology and its scientific mechanism are provided for discussion and suggestions. The paper ends with an in-depth discussion of the concerns and questions that educational researchers might raise, such as how this methodology takes care of learning contexts.
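A minimal sketch of the quantitative core of a meta-analysis, inverse-variance pooling of study effect sizes. The numbers are made-up placeholders, and the paper's proposed method may differ in how qualitative and quantitative results are combined.

    # Hedged sketch: fixed-effect (inverse-variance) pooling of effect sizes.
    import numpy as np

    effects = np.array([0.30, 0.45, 0.10, 0.25])   # per-study standardized effect sizes
    ses     = np.array([0.10, 0.15, 0.08, 0.12])   # per-study standard errors

    w = 1.0 / ses**2                               # inverse-variance weights
    pooled = np.sum(w * effects) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
    print(f"pooled effect = {pooled:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")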
Game-Like Technology Innovation Education
ERIC Educational Resources Information Center
Magnussen, Rikke
2011-01-01
This paper examines the methodological challenges and perspectives of designing game-like scenarios for the implementation of innovation processes in school science education. This paper presents a design-based research study of a game-like innovation scenario designed for technology education for Danish public school students aged 13-15. Students…
A Perspective on Computational Human Performance Models as Design Tools
NASA Technical Reports Server (NTRS)
Jones, Patricia M.
2010-01-01
The design of interactive systems, including levels of automation, displays, and controls, is usually based on design guidelines and iterative empirical prototyping. A complementary approach is to use computational human performance models to evaluate designs. An integrated strategy of model-based and empirical test and evaluation activities is particularly attractive as a methodology for verification and validation of human-rated systems for commercial space. This talk will review several computational human performance modeling approaches and their applicability to design of display and control requirements.
Light Bulbs and Change: Systems Thinking and Organisational Learning for New Ventures
ERIC Educational Resources Information Center
Hebel, Misha
2007-01-01
Purpose: The purpose of the paper is to revisit the practical worth of different systems thinking tools applied to three different business clients, which may be dismissed by academic researchers as theoretically old fashioned. Design/methodology/approach: The methodologies used are systems-based (SSM, VSM and causal loop diagrams), culminating in…
ERIC Educational Resources Information Center
Kessler, Seth A.; Horton, Karissa D.; Gottlieb, Nell H.; Atwood, Robin
2012-01-01
Purpose: The purpose of this study is to describe preceptors' implementation experiences after implementing a workplace learning program in Texas WIC (women, infant, and children) agencies and identify implementation best practices. Design/methodology/approach: This research used qualitative description methodology. Data collection consisted of 11…
A Study on Learning Organizations in Indian Higher Educational Institutes
ERIC Educational Resources Information Center
Chawla, Saniya; Lenka, Usha
2015-01-01
Purpose: This paper aims to study the antecedents and consequences of learning organizations (LOs) in Indian higher educational institutes. Design/methodology/approach: The methodology used is survey-based. Primary data were collected from 300 faculty members of Indian higher educational institutes. Findings: It was found that all the variables,…
Lean vs Agile in the Context of Complexity Management in Organizations
ERIC Educational Resources Information Center
Putnik, Goran D.; Putnik, Zlata
2012-01-01
Purpose: The objective of this paper is to provide a deeper insight into the relationship of the issue "lean vs agile" in order to inform managers towards more coherent decisions especially in a dynamic, unpredictable, uncertain, non-linear environment. Design/methodology/approach: The methodology is an exploratory study based on secondary data…
Perspectives Do Matter: "Joint Screen", a Promising Methodology for Multimodal Interaction Analysis
ERIC Educational Resources Information Center
Arend, Béatrice; Sunnen, Patrick; Fixmer, Pierre; Sujbert, Monika
2014-01-01
This paper discusses theoretical and methodological issues arising from a video-based research design and the emergent tool "Joint Screen" when grasping joint activity. We share our reflections regarding the combined reading of four synchronised camera perspectives combined in one screen. By these means we reconstruct and analyse…
ERIC Educational Resources Information Center
Parent, F.; Baulana, R.; Kahombo, G.; Coppieters, Y.; Garant, M.; De Ketele, J.-M.
2011-01-01
Objective: To describe the methodological steps of developing an integrated reference guide for competences according to the profile of the healthcare professionals concerned. Design: Human resources in healthcare represent a complex issue, which needs conceptual and methodological frameworks and tools to help one understand reality and the limits…
An automated methodology development. [software design for combat simulation]
NASA Technical Reports Server (NTRS)
Hawley, L. R.
1985-01-01
The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life cycle costs and ease the program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real-time, and one-to-one correlation between the object states and real world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance the code efficiency by, e.g., eliminating unused subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.
Zarit, Steven H.; Liu, Yin; Bangerter, Lauren R.; Rovine, Michael J.
2017-01-01
Objectives: There is growing emphasis on empirical validation of the efficacy of community-based services for older people and their families, but research on services such as respite care faces methodological challenges that have limited the growth of outcome studies. We identify problems associated with the usual research approaches for studying respite care, with the goal of stimulating use of novel and more appropriate research designs that can lead to improved studies of community-based services. Method: Using the concept of research validity, we evaluate the methodological approaches in the current literature on respite services, including adult day services, in-home respite and overnight respite. Results: Although randomized control trials (RCTs) are possible in community settings, validity is compromised by practical limitations of randomization and other problems. Quasi-experimental and interrupted time series designs offer comparable validity to RCTs and can be implemented effectively in community settings. Conclusion: An emphasis on RCTs by funders and researchers is not supported by scientific evidence. Alternative designs can lead to development of a valid body of research on community services such as respite. PMID:26729467
NASA Technical Reports Server (NTRS)
Jadaan, Osama M.; Powers, Lynn M.; Gyekenyesi, John P.
1997-01-01
The desirable properties of ceramics at high temperatures have generated interest in their use for structural applications such as in advanced turbine systems. Design lives for such systems can exceed 10,000 hours. Such long life requirements necessitate subjecting the components to relatively low stresses. The combination of high temperatures and low stresses typically places failure for monolithic ceramics in the creep regime. The objective of this work is to present a design methodology for predicting the lifetimes of structural components subjected to multiaxial creep loading. This methodology utilizes commercially available finite element packages and takes into account the time varying creep stress distributions (stress relaxation). In this methodology, the creep life of a component is divided into short time steps, during which, the stress and strain distributions are assumed constant. The damage, D, is calculated for each time step based on a modified Monkman-Grant creep rupture criterion. For components subjected to predominantly tensile loading, failure is assumed to occur when the normalized accumulated damage at any point in the component is greater than or equal to unity.
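A minimal sketch of the time-fraction damage bookkeeping described above; the report uses a modified Monkman-Grant criterion whose exact form is not reproduced here, so treat these expressions as a generic illustration.

    D = \sum_{i} \frac{\Delta t_i}{t_f(\sigma_i, T_i)}, \qquad \text{failure assumed when } D \ge 1 \text{ at any material point,}

    with the rupture time taken from a Monkman-Grant-type relation \dot{\varepsilon}_{\min}^{\,m}\, t_f / \varepsilon_f = C_{MG}, i.e. t_f = C_{MG}\, \varepsilon_f / \dot{\varepsilon}_{\min}^{\,m},

    where \dot{\varepsilon}_{\min} is the minimum creep rate at the local (relaxing) stress \sigma_i, e.g. from a Norton law \dot{\varepsilon}_{\min} = A\, \sigma_i^{\,n}. The stresses \sigma_i come from the finite element solution at each time step, so stress relaxation is reflected in the accumulated damage.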
1978-09-01
This report describes an effort to specify a software design methodology applicable to the Air Force software environment. Available methodologies...of techniques for proof of correctness, design specification, and performance assessment of static designs. The rational methodology selected is a
NASA Astrophysics Data System (ADS)
Marconi, S.; Orfanelli, S.; Karagounis, M.; Hemperek, T.; Christiansen, J.; Placidi, P.
2017-02-01
A dedicated power analysis methodology, based on modern digital design tools and integrated with the VEPIX53 simulation framework developed within RD53 collaboration, is being used to guide vital choices for the design and optimization of the next generation ATLAS and CMS pixel chips and their critical serial powering circuit (shunt-LDO). Power consumption is studied at different stages of the design flow under different operating conditions. Significant effort is put into extensive investigations of dynamic power variations in relation with the decoupling seen by the powering network. Shunt-LDO simulations are also reported to prove the reliability at the system level.
Application of an integrated flight/propulsion control design methodology to a STOVL aircraft
NASA Technical Reports Server (NTRS)
Garg, Sanjay; Mattern, Duane L.
1991-01-01
Results are presented from the application of an emerging Integrated Flight/Propulsion Control (IFPC) design methodology to a Short Take Off and Vertical Landing (STOVL) aircraft in transition flight. The steps in the methodology consist of designing command shaping prefilters to provide the overall desired response to pilot command inputs. A previously designed centralized controller is first validated for the integrated airframe/engine plant used. This integrated plant is derived from a different model of the engine subsystem than the one used for the centralized controller design. The centralized controller is then partitioned into a decentralized, hierarchical structure comprising airframe lateral and longitudinal subcontrollers and an engine subcontroller. Command shaping prefilters from the pilot control effector inputs are then designed, and time histories of the closed-loop IFPC system response to simulated pilot commands are compared to desired responses based on handling qualities requirements. Finally, the propulsion system safety and nonlinear limited protection logic is wrapped around the engine subcontroller and the response of the closed-loop integrated system is evaluated for transients that encounter the propulsion surge margin limit.
NASA Technical Reports Server (NTRS)
Onwubiko, Chinyere; Onyebueke, Landon
1996-01-01
This program report is the final report covering all the work done on this project. The goal of this project is the technology transfer of methodologies to improve the design process. The specific objectives are: 1. To learn and understand probabilistic design analysis using NESSUS. 2. To assign design projects on the application of NESSUS to either undergraduate or graduate students. 3. To integrate the application of NESSUS into selected senior-level courses in the Civil and Mechanical Engineering curricula. 4. To develop courseware in probabilistic design methodology to be included in a graduate-level design methodology course. 5. To study the relationship between the probabilistic design methodology and the axiomatic design methodology.
NASA Astrophysics Data System (ADS)
Alemany, Kristina
Electric propulsion has recently become a viable technology for spacecraft, enabling shorter flight times, fewer required planetary gravity assists, larger payloads, and/or smaller launch vehicles. With the maturation of this technology, however, comes a new set of challenges in the area of trajectory design. Because low-thrust trajectory optimization has historically required long run-times and significant user-manipulation, mission design has relied on expert-based knowledge for selecting departure and arrival dates, times of flight, and/or target bodies and gravitational swing-bys. These choices are generally based on known configurations that have worked well in previous analyses or simply on trial and error. At the conceptual design level, however, the ability to explore the full extent of the design space is imperative to locating the best solutions in terms of mass and/or flight times. Beginning in 2005, the Global Trajectory Optimization Competition posed a series of difficult mission design problems, all requiring low-thrust propulsion and visiting one or more asteroids. These problems all had large ranges on the continuous variables---launch date, time of flight, and asteroid stay times (when applicable)---as well as being characterized by millions or even billions of possible asteroid sequences. Even with recent advances in low-thrust trajectory optimization, full enumeration of these problems was not possible within the stringent time limits of the competition. This investigation develops a systematic methodology for determining a broad suite of good solutions to the combinatorial, low-thrust, asteroid tour problem. The target application is for conceptual design, where broad exploration of the design space is critical, with the goal being to rapidly identify a reasonable number of promising solutions for future analysis. The proposed methodology has two steps. The first step applies a three-level heuristic sequence developed from the physics of the problem, which allows for efficient pruning of the design space. The second phase applies a global optimization scheme to locate a broad suite of good solutions to the reduced problem. The global optimization scheme developed combines a novel branch-and-bound algorithm with a genetic algorithm and an industry-standard low-thrust trajectory optimization program to solve for the following design variables: asteroid sequence, launch date, times of flight, and asteroid stay times. The methodology is developed based on a small sample problem, which is enumerated and solved so that all possible discretized solutions are known. The methodology is then validated by applying it to a larger intermediate sample problem, which also has a known solution. Next, the methodology is applied to several larger combinatorial asteroid rendezvous problems, using previously identified good solutions as validation benchmarks. These problems include the 2nd and 3rd Global Trajectory Optimization Competition problems. The methodology is shown to be capable of achieving a reduction in the number of asteroid sequences of 6-7 orders of magnitude, in terms of the number of sequences that require low-thrust optimization as compared to the number of sequences in the original problem. More than 70% of the previously known good solutions are identified, along with several new solutions that were not previously reported by any of the competitors. 
Overall, the methodology developed in this investigation provides an organized search technique for the low-thrust mission design of asteroid rendezvous problems.
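A minimal sketch of the two-phase idea: prune the combinatorial sequence space with a cheap physics-inspired metric, then hand the shortlist to the expensive global optimization. The orbital-element "distance" and the asteroid data are illustrative placeholders, not the heuristics or ephemerides used in the dissertation.

    # Hedged sketch: heuristic pruning of asteroid sequences before low-thrust optimization.
    import itertools
    import random

    random.seed(4)
    # asteroid id -> (semi-major axis [AU], eccentricity, inclination [deg]) -- made-up values
    asteroids = {i: (random.uniform(1.8, 3.2), random.uniform(0, 0.3), random.uniform(0, 15))
                 for i in range(30)}

    def transfer_metric(a1, a2):
        """Cheap proxy for transfer difficulty between two asteroid orbits."""
        (sa1, e1, i1), (sa2, e2, i2) = asteroids[a1], asteroids[a2]
        return abs(sa1 - sa2) + abs(e1 - e2) + 0.05 * abs(i1 - i2)

    def sequence_cost(seq):
        return sum(transfer_metric(x, y) for x, y in zip(seq, seq[1:]))

    # Phase 1: enumerate / sample sequences and keep only the cheapest ~1%
    candidates = list(itertools.permutations(range(30), 3))          # 3-asteroid tours
    ranked = sorted(candidates, key=sequence_cost)
    shortlist = ranked[: len(ranked) // 100]

    # Phase 2 (not shown): run the genetic algorithm / branch-and-bound with a low-thrust
    # trajectory optimizer over the shortlist, together with launch date, flight times,
    # and stay times.
    print(len(candidates), "sequences ->", len(shortlist), "kept; best proxy cost",
          round(sequence_cost(ranked[0]), 3))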
An Event-Based Approach to Design a Teamwork Training Scenario and Assessment Tool in Surgery.
Nguyen, Ngan; Watson, William D; Dominguez, Edward
2016-01-01
Simulation is a technique recommended for teaching and measuring teamwork, but few published methodologies are available on how best to design simulation for teamwork training in surgery and health care in general. The purpose of this article is to describe a general methodology, called the event-based approach to training (EBAT), to guide the design of simulation for teamwork training and to discuss its application to surgery. The EBAT methodology draws on the science of training by systematically introducing training exercise events that are linked to training requirements (i.e., the competencies being trained and the learning objectives) and to performance assessment. The EBAT process involves linking these competencies and learning objectives to scenario events, targeted responses, and assessment tools; its application here proceeded as follows. Of the 4 teamwork competencies endorsed by the Agency for Healthcare Research and Quality and the Department of Defense, "communication" was chosen to be the focus of our training efforts. A total of 5 learning objectives were defined based on 5 validated teamwork and communication techniques. Diagnostic laparoscopy was chosen as the clinical context to frame the training scenario, and 29 KSAs were defined based on a review of the published literature on patient safety and input from subject matter experts. Critical events included those that correspond to a specific phase in the normal flow of a surgical procedure as well as clinical events that may occur when performing the operation. Similar to the targeted KSAs, targeted responses to the critical events were developed based on the existing literature and input from content experts. Finally, a 29-item EBAT-derived checklist was created to assess communication performance. Like any instructional tool, simulation is only effective if it is designed and implemented appropriately. It is recognized that the effectiveness of simulation depends on whether (1) it is built upon a theoretical framework, (2) it uses preplanned structured exercises or events to allow learners the opportunity to exhibit the targeted KSAs, (3) it assesses performance, and (4) it provides formative and constructive feedback to bridge the gap between the learners' KSAs and the targeted KSAs. The EBAT methodology guides the design of simulation that incorporates these 4 features and, thus, enhances training effectiveness with simulation. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
Designing Social Videogames for Educational Uses
ERIC Educational Resources Information Center
Gonzalez-Gonzalez, Carina; Blanco-Izquierdo, Francisco
2012-01-01
In this paper we analyze the main areas of research into educational videogames and in the evolution of the technologies and design methodologies that are making these interactive systems increasingly natural, immersive and social. We present the design and development of a prototype for a collaborative educational videogame based on a Massively…
Ethical and methodological issues in research with Sami experiencing disability.
Melbøe, Line; Hansen, Ketil Lenert; Johnsen, Bjørn-Eirik; Fedreheim, Gunn Elin; Dinesen, Tone; Minde, Gunn-Tove; Rustad, Marit
2016-01-01
A study of disability among the indigenous Sami people in Norway presented a number of ethical and methodological challenges rarely addressed in the literature. The main study was designed to examine and understand the everyday life, transitions between life stages and democratic participation of Norwegian Sami people experiencing disability. Hence, the purpose of this article is to increase the understanding of possible ethical and methodological issues in research within this field. The article describes and discusses ethical and methodological issues that arose when conducting our study and identifies some strategies for addressing issues like these. The ethical and methodological issues addressed in the article are based on a qualitative study among indigenous Norwegian Sami people experiencing disability. The data in this study were collected through 31 semi-structured in-depth interviews with altogether 24 Sami people experiencing disability and 13 next of kin of Sami people experiencing disability (8 mothers, 2 fathers, 2 sisters and 1 guardian). The researchers identified 4 main areas of ethical and methodological issues. We present these issues chronologically as they emerged in the research process: 1) concept of knowledge when designing the study, 2) gaining access, 3) data collection and 4) analysis and accountability. The knowledge generated from this study has the potential to benefit future health research, specifically of Norwegian Sami people experiencing disability, as well as health research concerning indigenous people in general, providing scientific-based insight into important ethical and methodological issues in research with indigenous people experiencing disability.
NASA Astrophysics Data System (ADS)
Tangen, Steven Anthony
Due to the complexities of modern military operations and the technologies employed on today's military systems, acquisition costs and development times are becoming increasingly large. Meanwhile, the transformation of the global security environment is driving the U.S. military's own transformation. In order to meet the required capabilities of the next generation without buying prohibitively costly new systems, it is necessary for the military to evolve across the spectrum of doctrine, organization, training, materiel, leadership and education, personnel, and facilities (DOTMLPF). However, the methods for analyzing DOTMLPF approaches within the early acquisition phase of a capability-based assessment (CBA) are not as well established as the traditional technology design techniques. This makes it difficult for decision makers to decide if investments should be made in materiel or non-materiel solutions. This research develops an agent-based constructive simulation to quantitatively assess doctrine alongside materiel approaches. Additionally, life-cycle cost techniques are provided to enable a cost-effectiveness trade. These techniques are wrapped together in a decision-making environment that brings crucial information forward so informed and appropriate acquisition choices can be made. The methodology is tested on a future unmanned aerial vehicle design problem. Through the implementation of this quantitative methodology on the proof-of-concept study, it is shown that doctrinal changes including fleet composition, asset allocation, and patrol pattern were capable of dramatic improvements in system effectiveness at a much lower cost than the incorporation of candidate technologies. Additionally, this methodology was able to quantify the precise nature of strong doctrine-doctrine and doctrine-technology interactions which have been observed only qualitatively throughout military history. This dissertation outlines the methodology and demonstrates how potential approaches to capability-gaps can be identified with respect to effectiveness, cost, and time. When implemented, this methodology offers the opportunity to achieve system capabilities in a new way, improve the design of acquisition programs, and field the right combination of ways and means to address future challenges to national security.
Pérez Suárez, Santiago T.; Travieso González, Carlos M.; Alonso Hernández, Jesús B.
2013-01-01
This article presents a design methodology for designing an artificial neural network as an equalizer for a binary signal. Firstly, the system is modelled in floating point format using Matlab. Afterward, the design is described for a Field Programmable Gate Array (FPGA) using fixed point format. The FPGA design is based on the System Generator from Xilinx, which is a design tool over Simulink of Matlab. System Generator allows one to design in a fast and flexible way. It uses low level details of the circuits and the functionality of the system can be fully tested. System Generator can be used to check the architecture and to analyse the effect of the number of bits on the system performance. Finally the System Generator design is compiled for the Xilinx Integrated System Environment (ISE) and the system is described using a hardware description language. In ISE the circuits are managed with high level details and physical performances are obtained. In the Conclusions section, some modifications are proposed to improve the methodology and to ensure portability across FPGA manufacturers.
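The bit-width exploration attributed above to System Generator can be illustrated outside the Xilinx tool chain. The sketch below is a hypothetical stand-in, in plain Python/NumPy rather than the System Generator/ISE flow: it trains a small LMS linear equalizer for a binary signal in floating point and then measures the bit-error rate after quantizing its taps to fixed-point formats with different numbers of fractional bits. The channel, noise level, and equalizer parameters are arbitrary placeholders.

"""Hypothetical illustration (not the Xilinx System Generator flow): train a linear
equalizer in floating point, then study how fixed-point quantization of its
coefficients affects the bit-error rate."""
import numpy as np

rng = np.random.default_rng(0)
N = 20000
symbols = rng.choice([-1.0, 1.0], size=N)             # binary antipodal signal
channel = np.array([0.8, 0.45, -0.2])                 # hypothetical dispersive channel
rx = np.convolve(symbols, channel, mode="full")[:N] + 0.05 * rng.standard_normal(N)

# LMS training of an FIR equalizer in floating point
order, delay, mu = 7, 3, 0.01
taps = np.zeros(order)
for n in range(order, N):
    x = rx[n - order + 1:n + 1][::-1]                 # most recent samples first
    err = symbols[n - delay] - taps @ x
    taps += mu * err * x

def quantize(w, frac_bits):
    """Round to a signed fixed-point grid with the given number of fractional bits."""
    scale = 2.0 ** frac_bits
    return np.round(w * scale) / scale

def ber(w):
    y = np.array([w @ rx[n - order + 1:n + 1][::-1] for n in range(order, N)])
    decisions = np.sign(y)
    return np.mean(decisions != symbols[order - delay:N - delay])

print(f"floating point BER: {ber(taps):.4f}")
for frac_bits in (10, 8, 6, 4, 2):
    print(f"fixed point, {frac_bits} fractional bits: BER {ber(quantize(taps, frac_bits)):.4f}")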
Methodological issues in the design of a rheumatoid arthritis activity score and its cut-offs.
Collignon, Olivier
2014-01-01
Activity of rheumatoid arthritis (RA) can be evaluated using several scoring scales based on clinical features. The most widely used one is the Disease Activity Score involving 28 joint counts (DAS28) for which cut-offs were proposed to help physicians classify patients. However, inaccurate scoring can lead to inappropriate medical decisions. In this article some methodological issues in the design of such a score and its cut-offs are highlighted in order to further propose a strategy to overcome them. As long as the issues reviewed in this article are not addressed, results of studies based on standard disease activity scores such as DAS28 should be considered with caution.
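For readers unfamiliar with the score itself, the short sketch below evaluates the commonly published DAS28-ESR formula and the conventional activity cut-offs. The coefficients and thresholds are quoted from the widely cited published version of the score, not from this article, whose point is precisely that such cut-offs deserve methodological caution.

"""Sketch of the commonly published DAS28-ESR formula and conventional cut-offs,
for illustration only."""
import math

def das28_esr(tender28, swollen28, esr_mm_h, global_health_0_100):
    """DAS28 from 28 tender/swollen joint counts, ESR (mm/h) and patient global health (0-100 VAS)."""
    return (0.56 * math.sqrt(tender28)
            + 0.28 * math.sqrt(swollen28)
            + 0.70 * math.log(esr_mm_h)
            + 0.014 * global_health_0_100)

def activity_category(score):
    if score < 2.6:
        return "remission"
    if score <= 3.2:
        return "low disease activity"
    if score <= 5.1:
        return "moderate disease activity"
    return "high disease activity"

score = das28_esr(tender28=6, swollen28=4, esr_mm_h=30, global_health_0_100=50)
print(f"DAS28-ESR = {score:.2f} -> {activity_category(score)}")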
ERIC Educational Resources Information Center
Donnelly, David S.
2013-01-01
This study employed a design-based research methodology to develop a theoretically sound approach for designing instructional treatments. The instruction of interest addressed the broad issue of physician wellness among medical school faculty, with particular emphasis on physician self-diagnosis and self-care. The theoretically sound approach…
ERIC Educational Resources Information Center
Forsythe, Susan K.
2015-01-01
This article describes a project using Design Based Research methodology to ascertain whether a pedagogical task based on a dynamic figure designed in a Dynamic Geometry Software (DGS) program could be instrumental in developing students' geometrical reasoning. A dragging strategy which I have named "Dragging Maintaining Symmetry" (DMS)…
ERIC Educational Resources Information Center
Moeller, Jeremy D.; Dattilo, John; Rusch, Frank
2015-01-01
This study examined how specific guidelines and heuristics have been used to identify methodological rigor associated with single-case research designs based on quality indicators developed by Horner et al. Specifically, this article describes how literature reviews have applied Horner et al.'s quality indicators and evidence-based criteria.…
ERIC Educational Resources Information Center
Cole, Elaine J.; Fieselman, Laura
2013-01-01
Purpose: The purpose of this paper is to design a community-based social marketing (CBSM) campaign to foster sustainable behavior change in paper reduction, commingled recycling, and purchasing environmentally preferred products (EPP) with faculty and staff at Pacific University Oregon. Design/methodology/approach: A CBSM campaign was developed…
Lee, Yi-Ying; Hsu, Chih-Yuan; Lin, Ling-Jiun; Chang, Chih-Chun; Cheng, Hsiao-Chun; Yeh, Tsung-Hsien; Hu, Rei-Hsing; Lin, Che; Xie, Zhen; Chen, Bor-Sen
2013-10-27
Synthetic genetic transistors are vital for signal amplification and switching in genetic circuits. However, it is still problematic to efficiently select adequate promoters, Ribosome Binding Sites (RBSs) and inducer concentrations to construct a genetic transistor with the desired linear amplification or switching in the Input/Output (I/O) characteristics for practical applications. Three kinds of promoter-RBS libraries, i.e., a constitutive promoter-RBS library, a repressor-regulated promoter-RBS library and an activator-regulated promoter-RBS library, are constructed for systematic genetic circuit design using the identified kinetic strengths of their promoter-RBS components. According to the dynamic model of genetic transistors, a design methodology for genetic transistors via a Genetic Algorithm (GA)-based searching algorithm is developed to search for a set of promoter-RBS components and adequate concentrations of inducers to achieve the prescribed I/O characteristics of a genetic transistor. Furthermore, according to design specifications for different types of genetic transistors, a look-up table is built for genetic transistor design, from which we could easily select an adequate set of promoter-RBS components and adequate concentrations of external inducers for a specific genetic transistor. This systematic design method will reduce the time spent using trial-and-error methods in the experimental procedure for a genetic transistor with a desired I/O characteristic. We demonstrate the applicability of our design methodology to genetic transistors that have desirable linear amplification or switching by employing promoter-RBS library searching.
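A toy Python sketch of the GA-based library search idea follows. The library strengths, the Hill-type response model, and the target curve are hypothetical placeholders, not the identified kinetic parameters or the dynamic model of the paper; the sketch only shows how a GA can pick discrete library components plus a continuous parameter to match a prescribed I/O characteristic.

"""Toy GA-based promoter-RBS selection: choose library components and a half-saturation
constant so a simple Hill-type transfer curve matches a prescribed target I/O curve."""
import random
import numpy as np

random.seed(0)
PROMOTERS = {"P1": 0.4, "P2": 1.0, "P3": 2.5}           # hypothetical transcription strengths
RBSS = {"R1": 0.3, "R2": 1.0, "R3": 3.0}                # hypothetical translation strengths
INDUCER = np.linspace(0.0, 10.0, 30)                    # input range (arbitrary units)

def transfer_curve(promoter, rbs, k_half):
    """Steady-state output of a simple activator-type Hill model (n = 2)."""
    return PROMOTERS[promoter] * RBSS[rbs] * INDUCER**2 / (k_half**2 + INDUCER**2)

TARGET = 2.0 * INDUCER**2 / (3.0**2 + INDUCER**2)       # prescribed I/O characteristic

def fitness(ind):
    promoter, rbs, k_half = ind
    return -np.sum((transfer_curve(promoter, rbs, k_half) - TARGET) ** 2)

def random_individual():
    return [random.choice(list(PROMOTERS)), random.choice(list(RBSS)),
            random.uniform(0.5, 8.0)]

def mutate(ind):
    child = list(ind)
    gene = random.randrange(3)
    if gene == 0:
        child[0] = random.choice(list(PROMOTERS))
    elif gene == 1:
        child[1] = random.choice(list(RBSS))
    else:
        child[2] = min(8.0, max(0.5, child[2] + random.gauss(0, 0.5)))
    return child

population = [random_individual() for _ in range(40)]
for _ in range(60):
    population.sort(key=fitness, reverse=True)
    elite = population[:10]
    population = elite + [mutate(random.choice(elite)) for _ in range(30)]

best = max(population, key=fitness)
print("best promoter-RBS set and K_half:", best, "fit error:", -fitness(best))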
ERIC Educational Resources Information Center
Eseryel, Deniz; Schuver-van Blanken, Marian J.; Spector, J. Michael
ADAPT[IT] (Advanced Design Approach for Personalized Training - Interactive Tools) is a European project coordinated by the Dutch National Aerospace Laboratory. The aim of ADAPT[IT] is to create and validate an effective training design methodology, based on cognitive science and leading to the integration of advanced technologies, so that the…
Design of strength characteristics on the example of a mining support
NASA Astrophysics Data System (ADS)
Gwiazda, A.; Sękala, A.; Banaś, W.; Topolska, S.; Foit, K.; Monica, Z.
2017-08-01
A particular group of design approaches can be characterized as “design for X”. The specific areas of this design methodology, which take into account the requirements of the product life cycle, are collectively described by the acronym DfX. DfX denotes an integrated, computing-platform-based approach to design that binds together the area of design knowledge and the area of computer systems. In this perspective, computer systems are responsible for linking design requirements with the subject of the project and for filtering the information circulated throughout the execution of the project. Together, the DfX methodologies form an approach that integrates different functional areas of an industrial organization. Among the internal elements one can distinguish the structure of the project team, the people composing it, the design process itself, the control of the design process, and the tools implemented to assist this process. Among the outcomes obtained within this framework are higher operating efficiency, professionalism, the ability to innovate, incremental progress of the project, and an appropriate focus of the project team. Attempts have been made to integrate the identified specific areas of action in the field of design methodology. They took place earlier in design practice in the form of Economic Design for Manufacture, an approach characteristic of European industry. In that case, a methodology was developed that can be defined as Design to/for Cost. The article presents the idea of an integrated design approach related to the DfX approach. The results are described on the basis of a virtual 3D model of a mining support, elaborated in an advanced engineering platform, Siemens PLM NX.
Conjoint analysis: using a market-based research model for healthcare decision making.
Mele, Nancy L
2008-01-01
Conjoint analysis is a market-based research model that has been used by businesses for more than 35 years to predict consumer preferences in product design and purchasing. Researchers in medicine, healthcare economics, and health policy have discovered the value of this methodology in determining treatment preferences, resource allocation, and willingness to pay. To describe the conjoint analysis methodology and explore value-added applications in nursing research. Conjoint analysis methodology is described, using examples from the healthcare and business literature, and personal experience with the method. Nurses are called upon to increase interdisciplinary research, provide an evidence base for nursing practice, create patient-centered treatments, and revise nursing education. Other disciplines have met challenges like these using conjoint analysis and discrete choice modeling.
NASA Astrophysics Data System (ADS)
Lee, Dae Young
The design of a small satellite is challenging since it is constrained by mass, volume, and power. To mitigate these constraint effects, designers adopt deployable configurations on the spacecraft that result in an interesting and difficult optimization problem. The resulting optimization problem is challenging due to the computational complexity caused by the large number of design variables and the model complexity created by the deployables. Adding to these complexities, there is a lack of integration of the design optimization systems into operational optimization, and the utility maximization of spacecraft in orbit. The developed methodology enables satellite Multidisciplinary Design Optimization (MDO) that is extendable to on-orbit operation. Optimization of on-orbit operations is possible with MDO since the model predictive controller developed in this dissertation guarantees the achievement of the on-ground design behavior in orbit. To enable the design optimization of highly constrained and complex-shaped space systems, the spherical coordinate analysis technique, called the "Attitude Sphere", is extended and merged with additional engineering tools such as OpenGL. OpenGL's graphic acceleration facilitates the accurate estimation of the shadow-degraded photovoltaic cell area. This technique is applied to the design optimization of the satellite Electric Power System (EPS), and the design result shows that the amount of photovoltaic power generation can be increased by more than 9%. Based on this initial methodology, the goal of this effort is extended from Single Discipline Optimization to Multidisciplinary Optimization, which includes the design and operation of the EPS, Attitude Determination and Control System (ADCS), and communication system. The geometry optimization satisfies the conditions of the ground development phase; however, the operation optimization may not be as successful as expected in orbit due to disturbances. To address this issue, for the ADCS operations, controllers based on Model Predictive Control that are effective for constraint handling were developed and implemented. All the suggested design and operation methodologies are applied to the mission "CADRE", which is a space weather mission scheduled for operation in 2016. This application demonstrates the usefulness and capability of the methodology to enhance CADRE's capabilities, and its ability to be applied to a variety of missions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Das, Sonjoy; Goswami, Kundan; Datta, Biswa N.
2014-12-10
Failure of structural systems under dynamic loading can be prevented via active vibration control which shifts the damped natural frequencies of the systems away from the dominant range of loading spectrum. The damped natural frequencies and the dynamic load typically show significant variations in practice. A computationally efficient methodology based on quadratic partial eigenvalue assignment technique and optimization under uncertainty has been formulated in the present work that will rigorously account for these variations and result in an economic and resilient design of structures. A novel scheme based on hierarchical clustering and importance sampling is also developed in this work for accurate and efficient estimation of probability of failure to guarantee the desired resilience level of the designed system. Numerical examples are presented to illustrate the proposed methodology.
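The clustering details are specific to the report, but the role importance sampling plays in estimating a small probability of failure can be illustrated generically. The sketch below uses a hypothetical limit-state function and a simple shifted-normal proposal density; it is not the clustered scheme of the report.

"""Generic importance-sampling estimate of a small failure probability P[g(X) < 0];
the limit-state function and proposal density are hypothetical."""
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def g(x):
    """Hypothetical limit state: failure when the sum of two standard normals exceeds 7."""
    return 7.0 - (x[:, 0] + x[:, 1])

n = 20000
# Crude Monte Carlo with standard normal inputs: very few samples land in the failure tail
x_mc = rng.standard_normal((n, 2))
pf_mc = np.mean(g(x_mc) < 0.0)

# Importance sampling: shift the proposal toward the failure region (near the design point)
shift = np.array([3.5, 3.5])
x_is = rng.standard_normal((n, 2)) + shift
log_w = (stats.norm.logpdf(x_is).sum(axis=1)
         - stats.norm.logpdf(x_is, loc=shift).sum(axis=1))   # weights f(x)/h(x)
pf_is = np.mean((g(x_is) < 0.0) * np.exp(log_w))

exact = stats.norm.sf(7.0 / np.sqrt(2.0))        # P[N(0,2) > 7] = P[N(0,1) > 7/sqrt(2)]
print(f"crude MC: {pf_mc:.2e}, importance sampling: {pf_is:.2e}, exact: {exact:.2e}")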
[Digital learning object for diagnostic reasoning in nursing applied to the integumentary system].
da Costa, Cecília Passos Vaz; Luz, Maria Helena Barros Araújo
2015-12-01
To describe the creation of a digital learning object for diagnostic reasoning in nursing applied to the integumentary system at a public university of Piaui. A methodological study applied to technological production based on the pedagogical framework of problem-based learning. The methodology for creating the learning object observed the stages of analysis, design, development, implementation and evaluation recommended for contextualized instructional design. The revised taxonomy of Bloom was used to list the educational goals. The four modules of the developed learning object were inserted into the educational platform Moodle. The theoretical assumptions allowed the design of an important online resource that promotes effective learning in the scope of nursing education. This study should add value to nursing teaching practices through the use of digital learning objects for teaching diagnostic reasoning applied to skin and skin appendages.
Integrating Design and Manufacturing for a High Speed Civil Transport Wing
NASA Technical Reports Server (NTRS)
Marx, William J.; Mavris, Dimitri N.; Schrage, Daniel P.
1994-01-01
The aerospace industry is currently addressing the problem of integrating design and manufacturing. Because of the difficulties associated with using conventional, procedural techniques and algorithms, it is the authors' belief that the only feasible way to integrate the two concepts is with the development of an appropriate Knowledge-Based System (KBS). The authors propose a methodology for an aircraft producibility assessment, including a KBS, that addresses both procedural and heuristic aspects of integrating design and manufacturing of a High Speed Civil Transport (HSCT) wing. The HSCT was chosen as the focus of this investigation since it is a current NASA/aerospace industry initiative full of technological challenges involving many disciplines. The paper gives a brief background of selected previous supersonic transport studies followed by descriptions of key relevant design and manufacturing methodologies. Georgia Tech's Concurrent Engineering/Integrated Product and Process Development methodology is discussed with reference to this proposed conceptual producibility assessment. Evaluation criteria are presented that relate pertinent product and process parameters to overall product producibility. In addition, the authors' integration methodology and reasons for selecting a KBS to integrate design and manufacturing are presented in this paper. Finally, a proposed KBS is given, as well as statements of future work and overall investigation objectives.
An Ontology for State Analysis: Formalizing the Mapping to SysML
NASA Technical Reports Server (NTRS)
Wagner, David A.; Bennett, Matthew B.; Karban, Robert; Rouquette, Nicolas; Jenkins, Steven; Ingham, Michel
2012-01-01
State Analysis is a methodology developed over the last decade for architecting, designing and documenting complex control systems. Although it was originally conceived for designing robotic spacecraft, recent applications include the design of control systems for large ground-based telescopes. The European Southern Observatory (ESO) began a project to design the European Extremely Large Telescope (E-ELT), which will require coordinated control of over a thousand articulated mirror segments. The designers are using State Analysis as a methodology and the Systems Modeling Language (SysML) as a modeling and documentation language in this task. To effectively apply the State Analysis methodology in this context it became necessary to provide ontological definitions of the concepts and relations in State Analysis and greater flexibility through a mapping of State Analysis into a practical extension of SysML. The ontology provides the formal basis for verifying compliance with State Analysis semantics including architectural constraints. The SysML extension provides the practical basis for applying the State Analysis methodology with SysML tools. This paper will discuss the method used to develop these formalisms (the ontology), the formalisms themselves, the mapping to SysML and approach to using these formalisms to specify a control system and enforce architectural constraints in a SysML model.
NASA Astrophysics Data System (ADS)
de La Cal, E. A.; Fernández, E. M.; Quiroga, R.; Villar, J. R.; Sedano, J.
In previous works a methodology was defined, based on the design of a GAP genetic algorithm and an incremental training technique adapted to the learning of series of stock market values. The GAP technique consists of a fusion of GP and GA. The GAP algorithm implements the automatic search for crisp trading rules, taking as objectives of the training both the optimization of the return obtained and the minimization of the assumed risk. Applying the proposed methodology, rules have been obtained for an eight-year period of the S&P500 index. The achieved tuning of the return-risk relation has generated rules whose returns in the testing period are far superior to those obtained with the usual methodologies, and even clearly superior to Buy&Hold. This work shows that the proposed methodology is valid for different assets in a different market than in previous work.
Lithography-based automation in the design of program defect masks
NASA Astrophysics Data System (ADS)
Vakanas, George P.; Munir, Saghir; Tejnil, Edita; Bald, Daniel J.; Nagpal, Rajesh
2004-05-01
In this work, we are reporting on a lithography-based methodology and automation in the design of Program Defect Masks (PDMs). Leading edge technology masks have ever-shrinking primary features and more pronounced model-based secondary features such as optical proximity corrections (OPC), sub-resolution assist features (SRAFs) and phase-shifted mask (PSM) structures. In order to define defect disposition specifications for critical layers of a technology node, experience alone in deciding worst-case scenarios for the placement of program defects is necessary but may not be sufficient. MEEF calculations initiated from layout pattern data and their integration in a PDM layout flow provide a natural approach for improvements, relevance and accuracy in the placement of programmed defects. This methodology provides closed-loop feedback between layout and hard defect disposition specifications, thereby minimizing engineering test restarts, improving quality and reducing cost of high-end masks. Apart from SEMI and industry standards, best-known methods (BKMs) in integrated lithographically-based layout methodologies and automation specific to PDMs are scarce. The contribution of this paper lies in the implementation of Design-For-Test (DFT) principles to a synergistic interaction of CAD Layout and Aerial Image Simulator to drive layout improvements, highlight layout-to-fracture interactions and output accurate program defect placement coordinates to be used by tools in the mask shop.
ERIC Educational Resources Information Center
Colomar, M. Pilar Alberola; Guzman, Eva Gil
2009-01-01
We are presenting a methodological approach that aims to increase students' motivation by asking them to develop tasks based on professional settings. In order to meet this objective a collaborative methodology was designed and applied to two multidisciplinary projects: MARKETOUR and ICT-SUSTOUR. Both projects made students face real workplace…
ERIC Educational Resources Information Center
Baltes, Kenneth G.; Hendrix, Vernon L.
Two recent developments in management information system technology and higher education administration have brought about the need for this study, designed to develop a methodology for revealing a relational model of the data base that administrators are operating from currently or would like to be able to operate from in the future.…
Moving towards Optimising Demand-Led Learning: The 2005-2007 ECUANET Leonardo Da Vinci Project
ERIC Educational Resources Information Center
Dealtry, Richard; Howard, Keith
2008-01-01
Purpose: The purpose of this paper is to present the key project learning points and outcomes as a guideline for the future quality management of demand-led learning and development. Design/methodology/approach: The research methodology was based upon a corporate university blueprint architecture and browser toolkit developed by a member of the…
ERIC Educational Resources Information Center
Bonometti, Patrizia
2012-01-01
Purpose: The aim of this contribution is to describe a new complexity-science-based approach for improving safety, quality and efficiency and the way it was implemented by TenarisDalmine. Design/methodology/approach: This methodology is called "a safety-building community". It consists of a safety-behaviour social self-construction…
ERIC Educational Resources Information Center
Neupert, Kent E.; Baughn, C. Cristopher; Dao, Thi Thanh Lam
2005-01-01
Purpose: This paper identifies skills necessary in order to succeed in Vietnam and proposes a training program to develop such skills. Design/methodology/approach: To determine necessary skills, 74 managers were interviewed using critical incident methodology to identify training needs. Critical incident approach asks respondents to describe the…
International Harmonization of Training and Qualification in the Manufacturing Industry
ERIC Educational Resources Information Center
Quintino, L.; Fernandes, I.; Miranda, R. M.
2011-01-01
Purpose: The aim of this paper is to propose a model for international harmonization of the training and qualification of human resources for industrial professions. The outcome is a system based on training guidelines and a quality assurance methodology that is now in use in 42 countries around the world. Design/methodology/approach: The paper…
ERIC Educational Resources Information Center
Healey, Nigel Martin
2018-01-01
Purpose: The purpose of this paper is to investigate the challenges of managing transnational education (TNE) partnerships from the perspective of the home university managers. Design/methodology/approach: The study adopts a qualitative, "insider researcher" methodology. It uses a sample set of eight mangers who operate from the home…
A Methodology for Developing Learning Objects for Web Course Delivery
ERIC Educational Resources Information Center
Stauffer, Karen; Lin, Fuhua; Koole, Marguerite
2008-01-01
This article presents a methodology for developing learning objects for web-based courses using the IMS Learning Design (IMS LD) specification. We first investigated the IMS LD specification, determining how to use it with online courses and the student delivery model, and then applied this to a Unit of Learning (UOL) for online computer science…
ERIC Educational Resources Information Center
Raz, Aviad E.
2007-01-01
Purpose: The purpose of this paper is to describe and analyse the formation of CoPs (communities of practice) in three call centres of cellular communication operating companies in Israel. Design/methodology/approach: This study is based on a qualitative methodology including observations, interviews and textual analysis. Findings: In all three…
ERIC Educational Resources Information Center
Pillay, Hitendra; Kelly, Kathy; Tones, Megan
2010-01-01
Purpose: The purpose of this paper is to identify the transitional employment (TE) aspirations and training and development needs of older and younger workers at risk of early retirement due to limited education and/or employment in blue-collar (BC) occupations. Design/methodology/approach: A computer-based methodology is used to evaluate the…
Optimization of lamp arrangement in a closed-conduit UV reactor based on a genetic algorithm.
Sultan, Tipu; Ahmad, Zeshan; Cho, Jinsoo
2016-01-01
The choice for the arrangement of the UV lamps in a closed-conduit ultraviolet (CCUV) reactor significantly affects the performance. However, a systematic methodology for the optimal lamp arrangement within the chamber of the CCUV reactor is not well established in the literature. In this research work, we propose a viable systematic methodology for the lamp arrangement based on a genetic algorithm (GA). In addition, we analyze the impacts of the diameter, angle, and symmetry of the lamp arrangement on the reduction equivalent dose (RED). The results are compared based on the simulated RED values and evaluated using the computational fluid dynamics simulations software ANSYS FLUENT. The fluence rate was calculated using commercial software UVCalc3D, and the GA-based lamp arrangement optimization was achieved using MATLAB. The simulation results provide detailed information about the GA-based methodology for the lamp arrangement, the pathogen transport, and the simulated RED values. A significant increase in the RED values was achieved by using the GA-based lamp arrangement methodology. This increase in RED value was highest for the asymmetric lamp arrangement within the chamber of the CCUV reactor. These results demonstrate that the proposed GA-based methodology for symmetric and asymmetric lamp arrangement provides a viable technical solution to the design and optimization of the CCUV reactor.
Evaluation of complex community-based childhood obesity prevention interventions.
Karacabeyli, D; Allender, S; Pinkney, S; Amed, S
2018-05-16
Multi-setting, multi-component community-based interventions have shown promise in preventing childhood obesity; however, evaluation of these complex interventions remains a challenge. The objective of the study is to systematically review published methodological approaches to outcome evaluation for multi-setting community-based childhood obesity prevention interventions and synthesize a set of pragmatic recommendations. MEDLINE, CINAHL and PsycINFO were searched from inception to 6 July 2017. Papers were included if the intervention targeted children ≤18 years, engaged at least two community sectors and described their outcome evaluation methodology. A single reviewer conducted title and abstract scans, full article review and data abstraction. Directed content analysis was performed by three reviewers to identify prevailing themes. Thirty-three studies were included, and of these, 26 employed a quasi-experimental design; the remaining were randomized control trials. Body mass index was the most commonly measured outcome, followed by health behaviour change and psychosocial outcomes. Six themes emerged, highlighting advantages and disadvantages of active vs. passive consent, quasi-experimental vs. randomized control trials, longitudinal vs. repeat cross-sectional designs and the roles of process evaluation and methodological flexibility in evaluating complex interventions. Selection of study designs and outcome measures compatible with community infrastructure, accompanied by process evaluation, may facilitate successful outcome evaluation. © 2018 World Obesity Federation.
NASA Technical Reports Server (NTRS)
Allen, Cheryl L.
1991-01-01
Enhanced engineering tools can be obtained through the integration of expert system methodologies and existing design software. The application of these methodologies to the spacecraft design and cost model (SDCM) software provides an improved technique for the selection of hardware for unmanned spacecraft subsystem design. The knowledge engineering system (KES) expert system development tool was used to implement a smarter equipment selection algorithm than that which is currently achievable through the use of a standard data base system. The guidance, navigation, and control subsystem of the SDCM software was chosen as the initial subsystem for implementation. The portions of the SDCM code which compute the selection criteria and constraints remain intact, and the expert system equipment selection algorithm is embedded within this existing code. The architecture of this new methodology is described and its implementation is reported. The project background and a brief overview of the expert system are described, and once the details of the design are characterized, an example of its implementation is demonstrated.
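Neither KES nor the SDCM code is reproduced here; as a rough illustration of the general idea of embedding rule-based selection logic in place of a plain database query, the Python sketch below filters an entirely hypothetical hardware catalogue against constraints and then ranks the survivors with heuristic preference rules.

"""Illustration only: rule-based selection over a hypothetical catalogue of guidance,
navigation and control hardware (not the actual KES/SDCM implementation)."""

CATALOGUE = [  # hypothetical candidate units
    {"name": "StarTracker-A", "mass_kg": 2.1, "power_w": 8.0, "accuracy_arcsec": 10, "heritage": True},
    {"name": "StarTracker-B", "mass_kg": 1.2, "power_w": 12.0, "accuracy_arcsec": 25, "heritage": False},
    {"name": "SunSensor-C", "mass_kg": 0.3, "power_w": 1.0, "accuracy_arcsec": 600, "heritage": True},
]

REQUIREMENTS = {"max_mass_kg": 2.5, "max_power_w": 10.0, "max_accuracy_arcsec": 60}

def satisfies_constraints(unit, req):
    return (unit["mass_kg"] <= req["max_mass_kg"]
            and unit["power_w"] <= req["max_power_w"]
            and unit["accuracy_arcsec"] <= req["max_accuracy_arcsec"])

def heuristic_score(unit):
    """Expert-style preference rules: favour flight heritage, then low mass and power."""
    score = 10.0 if unit["heritage"] else 0.0
    score -= unit["mass_kg"] + 0.5 * unit["power_w"]
    return score

candidates = [u for u in CATALOGUE if satisfies_constraints(u, REQUIREMENTS)]
candidates.sort(key=heuristic_score, reverse=True)
print("selected:", candidates[0]["name"] if candidates else "no compliant unit")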
NASA Astrophysics Data System (ADS)
Lin, Y.; Zhang, W. J.
2005-02-01
This paper presents an approach to human-machine interface design for control room operators of nuclear power plants. The first step in designing an interface for a particular application is to determine the information content that needs to be displayed. The design methodology for this step is called the interface design framework (hereafter, the framework). Several frameworks have been proposed for applications at varying levels, including process plants. However, none is based on the design and manufacture of the plant system for which the interface is designed. This paper presents an interface design framework which originates from design theory and methodology for general technical systems. Specifically, the framework is based on a set of core concepts of a function-behavior-state model originally proposed by the artificial intelligence research community and widely applied in the design research community. Benefits of this new framework include the provision of a model-based fault diagnosis facility, and the seamless integration of the design (manufacture, maintenance) of plants and the design of human-machine interfaces. The missing linkage between design and operation of a plant was one of the causes of the Three Mile Island nuclear reactor incident. A simulated plant system is presented to explain how to apply this framework in designing an interface. The resulting human-machine interface is discussed; specifically, several fault diagnosis examples are elaborated to demonstrate how this interface could support operators' fault diagnosis in an unanticipated situation.
Failure detection system design methodology. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Chow, E. Y.
1980-01-01
The design of a failure detection and identification system consists of designing a robust residual generation process and a high performance decision making process. The designs of these two processes are examined separately. Residual generation is based on analytical redundancy. Redundancy relations that are insensitive to modelling errors and noise effects are important for designing robust residual generation processes. The characterization of the concept of analytical redundancy in terms of a generalized parity space provides a framework in which a systematic approach to the determination of robust redundancy relations is developed. The Bayesian approach is adopted for the design of high performance decision processes. The FDI decision problem is formulated as a Bayes sequential decision problem. Since the optimal decision rule is incomputable, a methodology for designing suboptimal rules is proposed. A numerical algorithm is developed to facilitate the design and performance evaluation of suboptimal rules.
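A minimal numerical sketch of residual generation from analytical redundancy follows: with more sensors than states, any vector in the left null space of the measurement matrix yields a residual that is (ideally) zero for healthy data and nonzero under a sensor fault. The sensor configuration is hypothetical, and the robustness treatment and Bayes sequential decision rule of the thesis are not reproduced.

"""Minimal parity-space residual generation for sensor fault detection: residuals are
projections of the measurements onto the left null space of the measurement matrix."""
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: 4 sensors measuring 2 physical states (redundant measurements)
C = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0],
              [1.0, -1.0]])

# Parity matrix V: rows span the left null space of C, i.e. V @ C = 0
_, _, vt = np.linalg.svd(C.T)
V = vt[C.shape[1]:]                      # rows associated with zero singular values

x = rng.standard_normal(2)               # true (unknown) state
noise = 0.01 * rng.standard_normal(4)

y_healthy = C @ x + noise
y_faulty = y_healthy.copy()
y_faulty[2] += 0.8                       # bias fault on sensor 3

for label, y in (("healthy", y_healthy), ("sensor-3 fault", y_faulty)):
    residual = V @ y                     # insensitive to x, sensitive to sensor faults
    print(f"{label}: |residual| = {np.linalg.norm(residual):.3f}")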
NASA Astrophysics Data System (ADS)
Belapurkar, Rohit K.
Future aircraft engine control systems will be based on a distributed architecture, in which, the sensors and actuators will be connected to the Full Authority Digital Engine Control (FADEC) through an engine area network. Distributed engine control architecture will allow the implementation of advanced, active control techniques along with achieving weight reduction, improvement in performance and lower life cycle cost. The performance of a distributed engine control system is predominantly dependent on the performance of the communication network. Due to the serial data transmission policy, network-induced time delays and sampling jitter are introduced between the sensor/actuator nodes and the distributed FADEC. Communication network faults and transient node failures may result in data dropouts, which may not only degrade the control system performance but may even destabilize the engine control system. Three different architectures for a turbine engine control system based on a distributed framework are presented. A partially distributed control system for a turbo-shaft engine is designed based on ARINC 825 communication protocol. Stability conditions and control design methodology are developed for the proposed partially distributed turbo-shaft engine control system to guarantee the desired performance under the presence of network-induced time delay and random data loss due to transient sensor/actuator failures. A fault tolerant control design methodology is proposed to benefit from the availability of an additional system bandwidth and from the broadcast feature of the data network. It is shown that a reconfigurable fault tolerant control design can help to reduce the performance degradation in presence of node failures. A T-700 turbo-shaft engine model is used to validate the proposed control methodology based on both single input and multiple-input multiple-output control design techniques.
An autonomous satellite architecture integrating deliberative reasoning and behavioural intelligence
NASA Technical Reports Server (NTRS)
Lindley, Craig A.
1993-01-01
This paper describes a method for the design of autonomous spacecraft, based upon behavioral approaches to intelligent robotics. First, a number of previous spacecraft automation projects are reviewed. A methodology for the design of autonomous spacecraft is then presented, drawing upon both the European Space Agency technological center (ESTEC) automation and robotics methodology and the subsumption architecture for autonomous robots. A layered competency model for autonomous orbital spacecraft is proposed. A simple example of low level competencies and their interaction is presented in order to illustrate the methodology. Finally, the general principles adopted for the control hardware design of the AUSTRALIS-1 spacecraft are described. This system will provide an orbital experimental platform for spacecraft autonomy studies, supporting the exploration of different logical control models, different computational metaphors within the behavioral control framework, and different mappings from the logical control model to its physical implementation.
Jiang, Zheng; Wang, Hong; Wu, Qi-nan
2015-06-01
To optimize the processing of polysaccharide extraction from Spirodela polyrrhiza. Five factors related to the extraction rate of the polysaccharide were screened with a Plackett-Burman design. Based on this screening, three factors, namely alcohol volume fraction, extraction temperature and ratio of material to liquid, were selected as the investigation factors for Box-Behnken response surface methodology. The order of influence of the three factors on the extraction rate of polysaccharide from Spirodela polyrrhiza was as follows: extraction temperature, alcohol volume fraction, ratio of material to liquid. According to the Box-Behnken response surface, the best extraction conditions were: an alcohol volume fraction of 81%, a ratio of material to liquid of 1:42, an extraction temperature of 100 °C, and an extraction time of 60 min with four extraction cycles. The Plackett-Burman design and Box-Behnken response surface methodology used to optimize the extraction process for the polysaccharide in this study are effective and stable.
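The arithmetic behind response surface methodology can be sketched generically: fit a second-order polynomial to responses measured over coded factor levels and read off the stationary point of the fitted surface. The data below are fabricated placeholders, not the study's measurements, and a full Box-Behnken run matrix is replaced by a simple three-level factorial for brevity.

"""Generic response-surface sketch on fabricated data: fit a second-order model over
three coded factors and locate the stationary point of the fitted surface."""
import itertools
import numpy as np

rng = np.random.default_rng(2)

# Coded levels (-1, 0, +1) for alcohol fraction, temperature, material:liquid ratio
levels = np.array(list(itertools.product([-1, 0, 1], repeat=3)), dtype=float)

def true_response(x):
    """Hypothetical 'true' extraction rate with an interior optimum."""
    return 5.0 - 0.8 * (x[:, 0] - 0.3)**2 - 1.2 * (x[:, 1] - 0.5)**2 - 0.5 * (x[:, 2] + 0.2)**2

y = true_response(levels) + 0.05 * rng.standard_normal(len(levels))

def model_matrix(x):
    """Intercept, linear, squared and two-factor interaction terms."""
    cols = [np.ones(len(x))]
    cols += [x[:, i] for i in range(3)]
    cols += [x[:, i]**2 for i in range(3)]
    cols += [x[:, i] * x[:, j] for i, j in itertools.combinations(range(3), 2)]
    return np.column_stack(cols)

beta, *_ = np.linalg.lstsq(model_matrix(levels), y, rcond=None)

# Stationary point of the fitted quadratic y = b0 + b.x + x' B x: solve grad = b + 2 B x = 0
b = beta[1:4]
B = np.diag(beta[4:7])
for idx, (i, j) in enumerate(itertools.combinations(range(3), 2)):
    B[i, j] = B[j, i] = beta[7 + idx] / 2.0
x_opt = np.linalg.solve(-2.0 * B, b)
print("fitted optimum (coded units):", np.round(x_opt, 2))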
Methodologic ramifications of paying attention to sex and gender differences in clinical research.
Prins, Martin H; Smits, Kim M; Smits, Luc J
2007-01-01
Methodologic standards for studies on sex and gender differences should be developed to improve reporting of studies and facilitate their inclusion in systematic reviews. The essence of these studies lies within the concept of effect modification. This article reviews important methodologic issues in the design and reporting of pharmacogenetic studies. Differences in effect based on sex or gender should preferably be expressed in absolute terms (risk differences) to facilitate clinical decisions on treatment. Information on the distribution of potential effect modifiers or prognostic factors should be available to prevent a biased comparison of differences in effect between genotypes. Other considerations included the possibility of selective nonavailability of biomaterial and the choice of a statistical model to study effect modification. To ensure high study quality, additional methodologic issues should be taken into account when designing and reporting studies on sex and gender differences.
MEMS product engineering: methodology and tools
NASA Astrophysics Data System (ADS)
Ortloff, Dirk; Popp, Jens; Schmidt, Thilo; Hahn, Kai; Mielke, Matthias; Brück, Rainer
2011-03-01
The development of MEMS comprises the structural design as well as the definition of an appropriate manufacturing process. Technology constraints have a considerable impact on the device design and vice-versa. Product design and technology development are therefore concurrent tasks. Based on a comprehensive methodology the authors introduce a software environment that links commercial design tools from both area into a common design flow. In this paper emphasis is put on automatic low threshold data acquisition. The intention is to collect and categorize development data for further developments with minimum overhead and minimum disturbance of established business processes. As a first step software tools that automatically extract data from spreadsheets or file-systems and put them in context with existing information are presented. The developments are currently carried out in a European research project.
Towards a general object-oriented software development methodology
NASA Technical Reports Server (NTRS)
Seidewitz, ED; Stark, Mike
1986-01-01
Object diagrams were used to design a 5000 statement team training exercise and to design the entire dynamics simulator. The object diagrams are also being used to design another 50,000 statement Ada system and a personal computer based system that will be written in Modula II. The design methodology evolves out of these experiences as well as the limitations of other methods that were studied. Object diagrams, abstraction analysis, and associated principles provide a unified framework which encompasses concepts from Yourdin, Booch, and Cherry. This general object-oriented approach handles high level system design, possibly with concurrency, through object-oriented decomposition down to a completely functional level. How object-oriented concepts can be used in other phases of the software life-cycle, such as specification and testing is being studied concurrently.
Data Mining for Financial Applications
NASA Astrophysics Data System (ADS)
Kovalerchuk, Boris; Vityaev, Evgenii
This chapter describes Data Mining in finance by discussing financial tasks, specifics of methodologies and techniques in this Data Mining area. It includes time dependence, data selection, forecast horizon, measures of success, quality of patterns, hypothesis evaluation, problem ID, method profile, attribute-based and relational methodologies. The second part of the chapter discusses Data Mining models and practice in finance. It covers use of neural networks in portfolio management, design of interpretable trading rules and discovering money laundering schemes using decision rules and relational Data Mining methodology.
Lost in translation: bridging gaps between design and evidence-based design.
Watkins, Nicholas; Keller, Amy
2008-01-01
The healthcare design community is adopting evidence-based design (EBD) at a startling rate. However, the role of research within an architectural practice is unclear. Reasons for the lack of clarity include multiple connotations of EBD, the tension between a research-driven market and market-driven research, and the competing expectations and standards of design practitioners and researchers. Research as part of EBD should be integral with the design process so that research directly contributes to building projects. Characteristics of a comprehensive programming methodology to close the gap between design and EBD are suggested.
Effective Teaching of the Physical Design of Integrated Circuits Using Educational Tools
ERIC Educational Resources Information Center
Aziz, Syed Mahfuzul; Sicard, Etienne; Ben Dhia, Sonia
2010-01-01
This paper presents the strategies used for effective teaching and skill development in integrated circuit (IC) design using project-based learning (PBL) methodologies. It presents the contexts in which these strategies are applied to IC design courses at the University of South Australia, Adelaide, Australia, and the National Institute of Applied…
Basic Employability Skills: A Triangular Design Approach
ERIC Educational Resources Information Center
Rosenberg, Stuart; Heimler, Ronald; Morote, Elsa-Sofia
2012-01-01
Purpose: This paper seeks to examine the basic employability skills needed for job performance, the reception of these skills in college, and the need for additional training in these skills after graduation. Design/methodology/approach: The research was based on a triangular design approach, in which the attitudes of three distinct groups--recent…
Preparing Turnaround Leaders for High Needs Urban Schools
ERIC Educational Resources Information Center
Lochmiller, Chad R.; Chesnut, Colleen E.
2017-01-01
Purpose: The purpose of this paper is to describe the program structure and design considerations of a 25-day, full-time apprenticeship in a university-based principal preparation program. Design/Methodology/ Approach: The study used a qualitative case study design that drew upon interviews and focus groups with program participants as well as…
Reasserting the Fundamentals of Systems Analysis and Design through the Rudiments of Artifacts
ERIC Educational Resources Information Center
Jafar, Musa; Babb, Jeffry
2012-01-01
In this paper we present an artifacts-based approach to teaching a senior level Object-Oriented Analysis and Design course. Regardless of the systems development methodology and process model, and in order to facilitate communication across the business modeling, analysis, design, construction and deployment disciplines, we focus on (1) the…
Designing a Field Experience Tracking System in the Area of Special Education
ERIC Educational Resources Information Center
He, Wu; Watson, Silvana
2014-01-01
Purpose: To improve the quality of field experience, support field experience cooperation and streamline field experience management, the purpose of this paper is to describe the experience in using Activity Theory to design and develop a web-based field experience tracking system for a special education program. Design/methodology/approach: The…
Rethinking the NTCIP Design and Protocols - Analyzing the Issues
DOT National Transportation Integrated Search
1998-03-03
This working paper discusses the issues involved in changing the current draft NTCIP standard from an X.25-based protocol stack to an Internet-based protocol stack. It contains a methodology which could be used to change NTCIP's base protocols. This ...
A first principles based methodology for design of axial compressor configurations
NASA Astrophysics Data System (ADS)
Iyengar, Vishwas
Axial compressors are widely used in many aerodynamic applications. The design of an axial compressor configuration presents many challenges. Until recently, compressor design was done using 2-D viscous flow analyses that solve the flow field around cascades or in meridional planes or 3-D inviscid analyses. With the advent of modern computational methods it is now possible to analyze the 3-D viscous flow and accurately predict the performance of 3-D multistage compressors. It is necessary to retool the design methodologies to take advantage of the improved accuracy and physical fidelity of these advanced methods. In this study, a first-principles based multi-objective technique for designing single stage compressors is described. The study accounts for stage aerodynamic characteristics, rotor-stator interactions and blade elastic deformations. A parametric representation of compressor blades that include leading and trailing edge camber line angles, thickness and camber distributions was used in this study. A design of experiment approach is used to reduce the large combinations of design variables into a smaller subset. A response surface method is used to approximately map the output variables as a function of design variables. An optimized configuration is determined as the extremum of all extrema. This method has been applied to a rotor-stator stage similar to NASA Stage 35. The study has two parts: a preliminary study where a limited number of design variables were used to give an understanding of the important design variables for subsequent use, and a comprehensive application of the methodology where a larger, more complete set of design variables are used. The extended methodology also attempts to minimize the acoustic fluctuations at the rotor-stator interface by considering a rotor-wake influence coefficient (RWIC). Results presented include performance map calculations at design and off-design speed along with a detailed visualization of the flow field at design and off-design conditions. The present methodology provides a way to systematically screening through the plethora of design variables. By selecting the most influential design parameters and by optimizing the blade leading edge and trailing edge mean camber line angles, phenomenon's such as tip blockages, blade-to-blade shock structures and other loss mechanisms can be weakened or alleviated. It is found that these changes to the configuration can have a beneficial effect on total pressure ratio and stage adiabatic efficiency, thereby improving the performance of the axial compression system. Aeroacoustic benefits were found by minimizing the noise generating mechanisms associated with rotor wake-stator interactions. The new method presented is reliable, low time cost, and easily applicable to industry daily design optimization of turbomachinery blades.
Shielding of medical imaging X-ray facilities: a simple and practical method.
Bibbo, Giovanni
2017-12-01
The most widely accepted method for shielding design of X-ray facilities is that contained in the National Council on Radiation Protection and Measurements Report 147 whereby the computation of the barrier thickness for primary, secondary and leakage radiations is based on the knowledge of the distances from the radiation sources, the assumptions of the clinical workload, and usage and occupancy of adjacent areas. The shielding methodology used in this report is complex. With this methodology, the shielding designers need to make assumptions regarding the use of the X-ray room and the adjoining areas. Different shielding designers may make different assumptions resulting in different shielding requirements for a particular X-ray room. A more simple and practical method is to base the shielding design on the shielding principle used to shield X-ray tube housing to limit the leakage radiation from the X-ray tube. In this case, the shielding requirements of the X-ray room would depend only on the maximum radiation output of the X-ray equipment regardless of workload, usage or occupancy of the adjacent areas of the room. This shielding methodology, which has been used in South Australia since 1985, has proven to be practical and, to my knowledge, has not led to excess shielding of X-ray installations.
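The tube-housing style of calculation described above reduces to simple attenuation arithmetic driven by the maximum equipment output. The sketch below shows only that arithmetic; the half-value-layer and output figures are illustrative placeholders rather than regulatory values, and real designs must use data from the applicable standard.

"""Illustrative attenuation arithmetic only (placeholder numbers): size a barrier from
the maximum radiation output of the equipment, independent of workload or occupancy."""
import math

max_output_uGy_per_h = 2000.0   # hypothetical maximum air kerma rate at the barrier
target_uGy_per_h = 20.0         # hypothetical design limit behind the barrier
hvl_mm_lead = 0.25              # placeholder half-value layer of lead at this beam quality

attenuation_needed = max_output_uGy_per_h / target_uGy_per_h
n_hvls = math.log2(attenuation_needed)          # each HVL halves the transmitted rate
thickness_mm = n_hvls * hvl_mm_lead

print(f"required attenuation factor: {attenuation_needed:.0f}")
print(f"approx. {n_hvls:.1f} half-value layers -> {thickness_mm:.2f} mm lead (illustrative)")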
Innovation design of medical equipment based on TRIZ.
Gao, Changqing; Guo, Leiming; Gao, Fenglan; Yang, Bo
2015-01-01
Medical equipment is closely related to personal health and safety, and this can be of concern to the equipment user. Furthermore, there is much competition among medical equipment manufacturers. Innovative design is the key to success for those enterprises. The design of medical equipment usually covers vastly different domains of knowledge. The application of modern design methodology in medical equipment and technology invention is an urgent requirement. TRIZ (a Russian acronym that can be translated as 'theory of inventive problem solving') was born in Russia and contains problem-solving methods developed through the analysis of patents around the world, including the Conflict Matrix, Substance-Field Analysis, Standard Solutions, Effects, etc. TRIZ is an inventive problem-solving methodology. As an engineering example, an infusion system is analyzed and redesigned using TRIZ. An innovative idea is generated to free the caregiver from having to watch the infusion bag. The research in this paper shows the process of applying TRIZ to medical device inventions. It demonstrates that TRIZ is an effective inventive problem-solving methodology that can be widely used in medical device development.
Design of an integrated airframe/propulsion control system architecture
NASA Technical Reports Server (NTRS)
Cohen, Gerald C.; Lee, C. William; Strickland, Michael J.
1990-01-01
The design of an integrated airframe/propulsion control system architecture is described. The design is based on a prevalidation methodology that used both reliability and performance tools. An account is given of the motivation for the final design and problems associated with both reliability and performance modeling. The appendices contain a listing of the code for both the reliability and performance model used in the design.
Towards a Methodology for the Design of Multimedia Public Access Interfaces.
ERIC Educational Resources Information Center
Rowley, Jennifer
1998-01-01
Discussion of information systems methodologies that can contribute to interface design for public access systems covers: the systems life cycle; advantages of adopting information systems methodologies; soft systems methodologies; task-oriented approaches to user interface design; holistic design, the Star model, and prototyping; the…
Techniques for designing rotorcraft control systems
NASA Technical Reports Server (NTRS)
Yudilevitch, Gil; Levine, William S.
1994-01-01
Over the last two and a half years we have been demonstrating a new methodology for the design of rotorcraft flight control systems (FCS) to meet handling qualities requirements. This method is based on multicriterion optimization as implemented in the optimization package CONSOL-OPTCAD (C-O), which has been developed at the Institute for Systems Research (ISR) at the University of Maryland at College Park. The design methodology has been applied to the design of an FCS for the UH-60A helicopter in hover, using the ADOCS control structure. The controller parameters were optimized to meet the ADS-33C specifications. Furthermore, using this approach, an optimal (minimum control energy) controller has been obtained and trade-off studies have been performed.
Digital redesign of anti-wind-up controller for cascaded analog system.
Chen, Y S; Tsai, J S H; Shieh, L S; Moussighi, M M
2003-01-01
The cascaded conventional anti-wind-up (CAW) design method for integral controllers is discussed. A prediction-based digital redesign methodology is then used to find a new pulse-amplitude-modulated (PAM) digital controller for effective digital control of an analog plant with an input saturation constraint. The desired digital controller is determined from an existing or pre-designed CAW analog controller. The proposed method provides a novel methodology for indirect digital design of a continuous-time unity output-feedback system with a cascaded analog controller, as in the case of PID controllers for industrial control processes in the presence of actuator saturation. It enables an existing or pre-designed cascaded CAW analog controller to be implemented effectively via a digital controller.
Dong, Jia; Mandenius, Carl-Fredrik; Lübberstedt, Marc; Urbaniak, Thomas; Nüssler, Andreas K N; Knobeloch, Daniel; Gerlach, Jörg C; Zeilinger, Katrin
2008-07-01
Optimization of cell culture media based on statistical experimental design methodology is a widely used approach for improving cultivation conditions. We applied this methodology to refine the composition of an established culture medium for growth of a human hepatoma cell line, C3A. A selection of growth factors and nutrient supplements were systematically screened according to standard design of experiments (DoE) procedures. The results of the screening indicated that the medium additives hepatocyte growth factor, oncostatin M, and fibroblast growth factor 4 significantly influenced the metabolic activities of the C3A cell line. Surface response methodology revealed that the optimum levels for these factors were 30 ng/ml for hepatocyte growth factor and 35 ng/ml for oncostatin M. Additional experiments on primary human hepatocyte cultures showed high variance in metabolic activities between cells from different individuals, making determination of optimal levels of factors more difficult. Still, it was possible to conclude that hepatocyte growth factor, epidermal growth factor, and oncostatin M had decisive effects on the metabolic functions of primary human hepatocytes.
NASA Technical Reports Server (NTRS)
Kimmel, William M. (Technical Monitor); Bradley, Kevin R.
2004-01-01
This paper describes the development of a methodology for sizing Blended-Wing-Body (BWB) transports and how the capabilities of the Flight Optimization System (FLOPS) have been expanded using that methodology. In this approach, BWB transports are sized based on the number of passengers in each class that must fit inside the centerbody or pressurized vessel. Weight estimation equations for this centerbody structure were developed using Finite Element Analysis (FEA). This paper shows how the sizing methodology has been incorporated into FLOPS to enable the design and analysis of BWB transports. Previous versions of FLOPS did not have the ability to accurately represent or analyze BWB configurations in any reliable, logical way. The expanded capabilities allow the design and analysis of a 200 to 450-passenger BWB transport or the analysis of a BWB transport for which the geometry is already known. The modifications to FLOPS resulted in differences of less than 4 percent for the ramp weight of a BWB transport in this range when compared to previous studies performed by NASA and Boeing.
NASA Astrophysics Data System (ADS)
Hanan, Lu; Qiushi, Li; Shaobin, Li
2016-12-01
This paper presents an integrated optimization design method in which uniform design, response surface methodology, and a genetic algorithm are used in combination. In detail, uniform design is used to select the experimental sampling points in the experimental domain, and the system performance is evaluated by means of computational fluid dynamics to construct a database. After that, response surface methodology is employed to generate a surrogate mathematical model relating the optimization objective to the design variables. Subsequently, a genetic algorithm is applied to the surrogate model to find the optimal solution subject to the constraints. The method has been applied to the optimization design of an axisymmetric diverging duct involving three design variables: one qualitative variable and two quantitative variables. The modeling and optimization method performs well in improving the duct's aerodynamic performance and, by reducing design time and computational cost, can also be applied to wider fields of mechanical design as a useful tool for engineering designers.
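The surrogate-plus-optimizer loop described above can be illustrated with a minimal sketch (not the authors' code): a quadratic response surface is fitted to a small synthetic "CFD database" and then searched with SciPy's differential evolution, standing in for the genetic algorithm. The two design variables and all performance values are invented for the example.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Toy "CFD database": design points (as if from a uniform design) and a
# synthetic performance metric evaluated at each point.
rng = np.random.default_rng(0)
X = rng.uniform([0.1, 5.0], [0.5, 20.0], size=(20, 2))   # e.g. area ratio, half-angle
y = 0.9 - 2.0*(X[:, 0] - 0.3)**2 - 0.002*(X[:, 1] - 12.0)**2 + 0.01*rng.standard_normal(20)

def quad_features(X):
    """Full quadratic basis for a two-variable response surface."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1*x2, x1**2, x2**2])

coef, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)   # fit the surrogate

def surrogate(x):
    return quad_features(np.atleast_2d(x)) @ coef

# Maximize the surrogate (minimize its negative) over the design bounds.
res = differential_evolution(lambda x: -surrogate(x)[0],
                             bounds=[(0.1, 0.5), (5.0, 20.0)], seed=1)
print("surrogate optimum at", res.x, "predicted performance", -res.fun)
```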
ERIC Educational Resources Information Center
Higa, Yoshikazu; Shimojima, Ken
2018-01-01
This report describes a workshop on the Dynamics of Machinery based on the fabrication of a gyro-bicycle in a summer school program for junior high school students. The workshop was conducted by engineering students who had completed "Creative Research", an engineering design course at the National Institute of Technology, Okinawa…
ERIC Educational Resources Information Center
Wang, Dongxu; Stewart, Donald; Chang, Chun
2016-01-01
Purpose: The purpose of this paper is to examine the effectiveness of a holistic school-based nutrition programme using the health-promoting school (HPS) approach, on teachers' knowledge, attitudes and behaviour in relation to nutrition in rural China. Design/methodology/approach: A cluster-randomised intervention trial design was employed. Two…
ERIC Educational Resources Information Center
Roy, Robin; Potter, Stephen; Yarrow, Karen
2008-01-01
Purpose: This paper aims to summarise the methods and main findings of a study of the environmental impacts of providing higher education (HE) courses by campus-based and distance/open-learning methods. Design/methodology/approach: The approach takes the form of an environmental audit, with data from surveys of 20 UK courses--13 campus-based,…
ERIC Educational Resources Information Center
Blanco, Teresa; López-Forniés, Ignacio; Zarazaga-Soria, Francisco Javier
2017-01-01
The competence-based education recently launched in Spanish universities presents a set of abilities and skills that are difficult to teach to students in higher and more technologically-oriented grades. In this paper, a teaching intervention that is based on design methodologies is proposed, to upgrade the competitive capacities of computer…
ERIC Educational Resources Information Center
Alorda, B.; Suenaga, K.; Pons, P.
2011-01-01
This paper reports on the design, implementation and assessment of a new course structure based on the combination of three cooperative methodologies. The main goal is to reduce the percentage of students who do not pass by focusing the learning process on students, offering different alternatives and motivational activities based on working in…
ERIC Educational Resources Information Center
Carney, Robert D.
2010-01-01
This dissertation rationalizes the best use of Web-based instruction (WBI) for teaching music theory to private piano students in the later primary grades. It uses an integrative research methodology for defining, designing, and implementing a curriculum that includes WBI. Research from the fields of music education, educational technology,…
Methodology for the Design of Streamline-Traced External-Compression Supersonic Inlets
NASA Technical Reports Server (NTRS)
Slater, John W.
2014-01-01
A design methodology based on streamline-tracing is discussed for the design of external-compression, supersonic inlets for flight below Mach 2.0. The methodology establishes a supersonic compression surface and capture cross-section by tracing streamlines through an axisymmetric Busemann flowfield. The compression system of shock and Mach waves is altered through modifications to the leading edge and shoulder of the compression surface. An external terminal shock is established to create subsonic flow which is diffused in the subsonic diffuser. The design methodology was implemented into the SUPIN inlet design tool. SUPIN uses specified design factors to design the inlets and computes the inlet performance, which includes the flow rates, total pressure recovery, and wave drag. A design study was conducted using SUPIN and the Wind-US computational fluid dynamics code to design and analyze the properties of two streamline-traced, external-compression (STEX) supersonic inlets for Mach 1.6 freestream conditions. The STEX inlets were compared to axisymmetric pitot, two-dimensional, and axisymmetric spike inlets. The STEX inlets had slightly lower total pressure recovery and higher levels of total pressure distortion than the axisymmetric spike inlet. The cowl wave drag coefficients of the STEX inlets were 20% of those for the axisymmetric spike inlet. The STEX inlets had external sound pressures that were 37% of those of the axisymmetric spike inlet, which may result in lower adverse sonic boom characteristics. The flexibility of the shape of the capture cross-section may result in benefits for the integration of STEX inlets with aircraft.
Accounting for Uncertainties in Strengths of SiC MEMS Parts
NASA Technical Reports Server (NTRS)
Nemeth, Noel; Evans, Laura; Beheim, Glen; Trapp, Mark; Jadaan, Osama; Sharpe, William N., Jr.
2007-01-01
A methodology has been devised for accounting for uncertainties in the strengths of silicon carbide structural components of microelectromechanical systems (MEMS). The methodology enables prediction of the probabilistic strengths of complexly shaped MEMS parts using data from tests of simple specimens. This methodology is intended to serve as a part of a rational basis for designing SiC MEMS, supplementing methodologies that have been borrowed from the art of designing macroscopic brittle material structures. The need for this or a similar methodology arises as a consequence of the fundamental nature of MEMS and the brittle silicon-based materials of which they are typically fabricated. When tested to fracture, MEMS and structural components thereof show wide part-to-part scatter in strength. The methodology involves the use of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) software in conjunction with the ANSYS Probabilistic Design System (PDS) software to simulate or predict the strength responses of brittle material components while simultaneously accounting for the effects of variability of geometrical features on the strength responses. As such, the methodology involves the use of an extended version of the ANSYS/CARES/PDS software system described in Probabilistic Prediction of Lifetimes of Ceramic Parts (LEW-17682-1/4-1), Software Tech Briefs supplement to NASA Tech Briefs, Vol. 30, No. 9 (September 2006), page 10. The ANSYS PDS software enables the ANSYS finite-element-analysis program to account for uncertainty in the design-and-analysis process, accounting for uncertainty in material properties, dimensions, and loading by assigning probabilistic distributions to user-specified model parameters and performing simulations using various sampling techniques.
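The weakest-link statistics underlying CARES/Life-style strength prediction can be sketched in a few lines. The Weibull parameters, the flexure geometry, and the stress and area scalings below are illustrative assumptions, not data from the SiC test program described in the brief.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative two-parameter Weibull strength parameters (not real SiC data):
m = 8.0            # Weibull modulus
sigma0 = 400.0     # characteristic strength, MPa, for the reference specimen area

def failure_probability(stress, area, ref_area=1.0):
    """Weibull failure probability with a simple area scaling, in the spirit
    of the weakest-link models used by CARES/Life."""
    return 1.0 - np.exp(-(area / ref_area) * (stress / sigma0) ** m)

# Monte Carlo over geometric uncertainty: a nominally 2 um wide flexure whose
# width varies part to part, changing both peak stress and stressed area.
n = 100_000
width = rng.normal(2.0, 0.1, n)                  # um, hypothetical fabrication scatter
stress = 300.0 * (2.0 / width) ** 2              # peak stress rises as width shrinks
area = 50.0 * width / 2.0                        # stressed surface area, relative units
pf = failure_probability(stress, area)
print(f"mean failure probability {pf.mean():.3f}, 95th percentile {np.quantile(pf, 0.95):.3f}")
```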
NASA Astrophysics Data System (ADS)
Perez, Pedro B.; Hamawi, John N.
2017-09-01
Nuclear power plant radiation protection design features are based on radionuclide source terms derived from conservative assumptions that envelope expected operating experience. Two parameters that significantly affect the radionuclide concentrations in the source term are the failed fuel fraction and the effective fission product appearance rate coefficients. The failed fuel fraction may be a regulatory assumption, as in the U.S. Appearance rate coefficients are not specified in regulatory requirements but have been referenced to experimental data that are over 50 years old. No doubt the source terms are conservative, as demonstrated by operating experience that has included failed fuel, but they may be too conservative, leading, for example, to over-designed shielding for normal operations. Design basis source term methodologies for normal operations had not advanced until EPRI published an updated ANSI/ANS 18.1 source term basis document in 2015. Our paper revisits the fission product appearance rate coefficients as applied in the derivation of source terms following the original U.S. NRC NUREG-0017 methodology. New coefficients have been calculated based on recent EPRI results, which demonstrate the conservatism in nuclear power plant shielding design.
Digital design of scaffold for mandibular defect repair based on tissue engineering.
Liu, Yun-feng; Zhu, Fu-dong; Dong, Xing-tao; Peng, Wei
2011-09-01
Mandibular defects have occurred more frequently in recent years, and clinical repair operations via bone transplantation are difficult to improve further due to some intrinsic flaws. Tissue engineering, a very active research field within biomedical engineering, provides a new direction for mandibular defect repair. As the basis and key part of tissue engineering, scaffolds have been studied widely and deeply with regard to basic theory as well as the principles of biomaterials, structure, design, and fabrication methods. However, little research is targeted at tissue regeneration for clinical repair operations. Since mandibular bone has a special structure, rather than the uniform and regular structure assumed in existing studies, a methodology based on tissue engineering is proposed for mandibular defect repair in this paper. Key steps of scaffold digital design, such as external shape design and internal microstructure design directly based on triangular meshes, are discussed in detail. By analyzing the theoretical model and the measured data from test parts fabricated by rapid prototyping, the feasibility and effectiveness of the proposed methodology are verified. More work on mechanical and biological improvements is needed to promote its clinical application in the future.
New geometric design consistency model based on operating speed profiles for road safety evaluation.
Camacho-Torregrosa, Francisco J; Pérez-Zuriaga, Ana M; Campoy-Ungría, J Manuel; García-García, Alfredo
2013-12-01
To assist in the ongoing effort to reduce road fatalities as much as possible, this paper presents a new methodology to evaluate road safety in both the design and redesign stages of two-lane rural highways. This methodology is based on the analysis of road geometric design consistency, a value which serves as a surrogate measure of the safety level of a two-lane rural road segment. The consistency model presented in this paper is based on continuous operating speed profiles. The models used for their construction were obtained with an innovative GPS-based data collection method that records continuous operating speed profiles from individual drivers. This new methodology allowed the researchers to observe the actual behavior of drivers and to develop more accurate operating speed models than was previously possible with spot-speed data collection, thereby enabling a more accurate approximation to the real phenomenon and thus a better consistency measurement. Operating speed profiles were built for 33 Spanish two-lane rural road segments, and several consistency measurements based on global and local operating speed were checked. The final consistency model takes into account not only the global dispersion of the operating speed, but also indexes that consider local speed decelerations and speeds over posted speeds. For the development of the consistency model, the crash frequency for each study site was considered, which allowed the number of crashes on a road segment to be estimated from its geometric design consistency. Consequently, the presented consistency evaluation method is a promising innovative tool that can be used as a surrogate measure to estimate the safety of a road segment. Copyright © 2012 Elsevier Ltd. All rights reserved.
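The calibrated Spanish model itself is not reproduced here, but the kind of global indicators it builds on can be sketched from a continuous V85 profile. The indicator definitions, the synthetic speed profile, and the posted-speed value below are illustrative only.

```python
import numpy as np

def consistency_indicators(station_m, v85_kmh, posted_kmh):
    """Generic global indicators from a continuous operating-speed (V85) profile.
    These are simple surrogates, not the calibrated model from the paper."""
    v = np.asarray(v85_kmh, dtype=float)
    s = np.asarray(station_m, dtype=float)
    dispersion = v.std() / v.mean()                 # global operating-speed dispersion
    dv_ds = np.gradient(v / 3.6, s)                 # speed gradient, (m/s) per m
    decel_index = -dv_ds[dv_ds < 0].sum() if (dv_ds < 0).any() else 0.0
    mean_exceedance = np.clip(v - posted_kmh, 0, None).mean()
    return {"dispersion": dispersion,
            "deceleration_index": decel_index,
            "mean_exceedance_kmh": mean_exceedance}

# Synthetic 2 km profile with a sharp curve around station 1200 m:
s = np.linspace(0, 2000, 201)
v85 = 95 - 18 * np.exp(-((s - 1200) / 150) ** 2)
print(consistency_indicators(s, v85, posted_kmh=90))
```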
j5 DNA assembly design automation.
Hillson, Nathan J
2014-01-01
Modern standardized methodologies, described in detail in the previous chapters of this book, have enabled the software-automated design of optimized DNA construction protocols. This chapter describes how to design (combinatorial) scar-less DNA assembly protocols using the web-based software j5. j5 assists biomedical and biotechnological researchers in constructing DNA by automating the design of optimized protocols for flanking-homology as well as type IIS endonuclease-mediated DNA assembly methodologies. Unlike any other software tool available today, j5 designs scar-less combinatorial DNA assembly protocols, performs a cost-benefit analysis to identify which portions of an assembly process would be less expensive to outsource to a DNA synthesis service provider, and designs hierarchical DNA assembly strategies to mitigate anticipated poor assembly junction sequence performance. Software tools integrated with j5 add significant value to the j5 design process through graphical user-interface enhancements and downstream liquid-handling robotic laboratory automation.
Yap, Christina; Billingham, Lucinda J; Cheung, Ying Kuen; Craddock, Charlie; O'Quigley, John
2017-12-15
The ever-increasing pace of development of novel therapies mandates efficient methodologies for assessment of their tolerability and activity. Evidence increasingly supports the merits of model-based dose-finding designs in identifying the recommended phase II dose compared with conventional rule-based designs such as the 3 + 3, but despite this, their use remains limited. Here, we propose a useful tool, dose transition pathways (DTP), which helps overcome several commonly faced practical and methodologic challenges in the implementation of model-based designs. DTP projects in advance the doses recommended by a model-based design for subsequent patients (stay, escalate, de-escalate, or stop early), using all the accumulated information. After specifying a model with favorable statistical properties, we utilize the DTP to fine-tune the model and tailor it to the trial's specific requirements, reflecting important clinical judgments. In particular, it can help to determine how stringent the stopping rules should be if the investigated therapy is too toxic. Its use to design and implement a modified continual reassessment method is illustrated in an acute myeloid leukemia trial. DTP removes the fear of model-based designs as unknown, complex systems and can serve as a handbook, guiding decision-making for each dose update. In the illustrated trial, the seamless, clear transition for each dose recommendation aided the investigators' understanding of the design and facilitated decision-making to enable finer calibration of a tailored model. We advocate the use of the DTP as an integral procedure in the co-development and successful implementation of practical model-based designs by statisticians and investigators. Clin Cancer Res; 23(24); 7440-7. ©2017 American Association for Cancer Research.
Ethical and methodological issues in research with Sami experiencing disability.
Melbøe, Line; Hansen, Ketil Lenert; Johnsen, Bjørn-Eirik; Fedreheim, Gunn Elin; Dinesen, Tone; Minde, Gunn-Tove; Rustad, Marit
2016-01-01
Background A study of disability among the indigenous Sami people in Norway presented a number of ethical and methodological challenges rarely addressed in the literature. Objectives The main study was designed to examine and understand the everyday life, transitions between life stages and democratic participation of Norwegian Sami people experiencing disability. Hence, the purpose of this article is to increase the understanding of possible ethical and methodological issues in research within this field. The article describes and discusses ethical and methodological issues that arose when conducting our study and identifies some strategies for addressing issues like these. Methods The ethical and methodological issues addressed in the article are based on a qualitative study among indigenous Norwegian Sami people experiencing disability. The data in this study were collected through 31 semi-structured in-depth interviews with altogether 24 Sami people experiencing disability and 13 next of kin of Sami people experiencing disability (8 mothers, 2 fathers, 2 sisters and 1 guardian). Findings and discussion The researchers identified 4 main areas of ethical and methodological issues. We present these issues chronologically as they emerged in the research process: 1) concept of knowledge when designing the study, 2) gaining access, 3) data collection and 4) analysis and accountability. Conclusion The knowledge generated from this study has the potential to benefit future health research, specifically of Norwegian Sami people experiencing disability, as well as health research concerning indigenous people in general, providing science-based insight into important ethical and methodological issues in research with indigenous people experiencing disability.
Materiality in a Practice-Based Approach
ERIC Educational Resources Information Center
Svabo, Connie
2009-01-01
Purpose: The paper aims to provide an overview of the vocabulary for materiality which is used by practice-based approaches to organizational knowing. Design/methodology/approach: The overview is theoretically generated and is based on the anthology Knowing in Organizations: A Practice-based Approach edited by Nicolini, Gherardi and Yanow. The…
Edwards, Rhiannon Tudor; Bryning, Lucy; Crane, Rebecca
Mindfulness-based interventions (MBIs) are being increasingly applied in a variety of settings. A growing body of evidence to support the effectiveness of these interventions exists and there are a few published cost-effectiveness studies. With limited resources available within public sectors (health care, social care, and education), it is necessary to build in concurrent economic evaluations alongside trials in order to inform service commissioning and policy. If future research studies are well-designed, they have strong potential to investigate the economic impact of MBIs. The particular challenge to the health economist is how best to capture the ways that MBIs help people adjust to or build resilience to difficult life circumstances, and to disseminate effectively to enable policy makers to judge the value of the contribution that MBIs can make within the context of the limited resourcing of public services. In anticipation of more research worldwide evaluating MBIs in various settings, this article suggests ten health economics methodological design questions that researchers may want to consider prior to conducting MBI research. These questions draw on both published standards of good methodological practice in economic evaluation of medical interventions, and on the authors' knowledge and experience of mindfulness-based practice. We argue that it is helpful to view MBIs as both complex interventions and as public health prevention initiatives. Our suggestions for well-designed economic evaluations of MBIs in health and other settings, mirror current thinking on the challenges and opportunities of public health economics.
Universal Verification Methodology Based Register Test Automation Flow.
Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu
2016-05-01
In today's SoC designs, the number of registers has increased along with the complexity of hardware blocks. Register validation is a time-consuming and error-prone task; therefore, we need an efficient way to perform verification with less effort in a shorter time. In this work, we suggest a register test automation flow based on UVM (Universal Verification Methodology). UVM provides a standard mechanism, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or to integrate the models into the test-bench environment, because doing so requires knowledge of SystemVerilog and the UVM libraries. For the creation of register models, many commercial tools support generating a register model from a register specification described in IP-XACT, but it is time-consuming to describe the register specification in IP-XACT format. For easy creation of register models, we propose a spreadsheet-based register template which is translated to an IP-XACT description, from which register models can be easily generated using commercial tools. On the other hand, we also automate all the steps involved in integrating the test-bench and generating test-cases, so that designers may use the register model without detailed knowledge of UVM or SystemVerilog. This automation flow involves generating and connecting test-bench components (e.g., driver, checker, bus adaptor, etc.) and writing a test sequence for each type of register test-case. With the proposed flow, designers can save a considerable amount of time when verifying the functionality of registers.
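The spreadsheet-to-IP-XACT step can be pictured with a small script. The column layout and the XML skeleton below are invented for illustration (real IP-XACT requires namespaces and considerably more metadata), so this is only a sketch of the translation idea, not the authors' template.

```python
import csv, io
from xml.etree.ElementTree import Element, SubElement, tostring

# Hypothetical spreadsheet export: one row per register (format invented here).
SHEET = """name,offset,width,access,reset
CTRL,0x00,32,read-write,0x00000000
STATUS,0x04,32,read-only,0x00000001
IRQ_MASK,0x08,32,read-write,0xFFFFFFFF
"""

def sheet_to_ipxact(sheet_csv, block_name="regblock"):
    """Translate a flat register sheet into a skeletal, IP-XACT-flavoured
    <memoryMap>.  Tag names are simplified; the point is the column-to-element
    mapping a register-model generator would consume."""
    mmap = Element("memoryMap")
    SubElement(mmap, "name").text = block_name
    block = SubElement(mmap, "addressBlock")
    for row in csv.DictReader(io.StringIO(sheet_csv)):
        reg = SubElement(block, "register")
        SubElement(reg, "name").text = row["name"]
        SubElement(reg, "addressOffset").text = row["offset"]
        SubElement(reg, "size").text = row["width"]
        SubElement(reg, "access").text = row["access"]
        SubElement(reg, "reset").text = row["reset"]
    return tostring(mmap, encoding="unicode")

print(sheet_to_ipxact(SHEET))
```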
Calibration Modeling Methodology to Optimize Performance for Low Range Applications
NASA Technical Reports Server (NTRS)
McCollum, Raymond A.; Commo, Sean A.; Parker, Peter A.
2010-01-01
Calibration is a vital process in characterizing the performance of an instrument in an application environment and seeks to obtain acceptable accuracy over the entire design range. Often, project requirements specify a maximum total measurement uncertainty expressed as a percent of full-scale. However, in some applications we seek enhanced performance at the low end of the range; therefore, expressing the accuracy as a percent of reading should be considered as a modeling strategy. For example, it is common to desire to use a force balance in multiple facilities or regimes, often well below its designed full-scale capacity. This paper presents a general statistical methodology for optimizing calibration mathematical models based on a percent-of-reading accuracy requirement, which has broad application in all types of transducer applications where low-range performance is required. A case study illustrates the proposed methodology for the Mars Entry Atmospheric Data System, which employs seven strain-gage based pressure transducers mounted on the heatshield of the Mars Science Laboratory mission.
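A minimal sketch of the modeling strategy: weighting the calibration regression by the reciprocal of the reading makes the fit control percent-of-reading rather than percent-of-full-scale error. The data and the linear calibration model below are synthetic, not the MEADS calibration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic calibration data: applied pressure (Pa) vs transducer output (counts).
applied = np.linspace(50.0, 5000.0, 25)
counts = 0.8 * applied + 15.0 + rng.normal(0, 4.0, applied.size)

A = np.column_stack([counts, np.ones_like(counts)])        # linear calibration model

# Ordinary least squares minimizes absolute residuals (full-scale style).
ols, *_ = np.linalg.lstsq(A, applied, rcond=None)

# Scaling each row by 1/applied makes the fit minimize *relative* residuals,
# which is what a percent-of-reading accuracy requirement cares about.
w = 1.0 / applied
wls = np.linalg.lstsq(A * w[:, None], applied * w, rcond=None)[0]

for name, c in [("OLS", ols), ("WLS (percent-of-reading)", wls)]:
    pred = A @ c
    pct = 100.0 * np.abs(pred - applied) / applied
    print(f"{name:25s} worst percent-of-reading error {pct.max():.2f}%")
```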
NASA Astrophysics Data System (ADS)
Bruzzone, Agostino G.; Revetria, Roberto; Simeoni, Simone; Viazzo, Simone; Orsoni, Alessandra
2004-08-01
In logistics and industrial production, managers must deal with the impact of stochastic events to improve performance and reduce costs. In fact, production and logistics systems are generally designed by treating some parameters as deterministic. While this assumption is mostly used for preliminary prototyping, it is sometimes also retained during the final design stage, especially for estimated parameters (i.e., market request). The proposed methodology can determine the impact of stochastic events on the system by evaluating the chaotic threshold level. Such an approach, based on the application of a new and innovative methodology, can be implemented to find the conditions under which chaos makes the system uncontrollable. Starting from problem identification and risk assessment, several classification techniques are used to carry out an effect analysis and contingency plan estimation. In this paper the authors illustrate the methodology with respect to a real industrial case: a production problem related to the logistics of distributed chemical processing.
Discrete Adjoint-Based Design Optimization of Unsteady Turbulent Flows on Dynamic Unstructured Grids
NASA Technical Reports Server (NTRS)
Nielsen, Eric J.; Diskin, Boris; Yamaleev, Nail K.
2009-01-01
An adjoint-based methodology for design optimization of unsteady turbulent flows on dynamic unstructured grids is described. The implementation relies on an existing unsteady three-dimensional unstructured grid solver capable of dynamic mesh simulations and discrete adjoint capabilities previously developed for steady flows. The discrete equations for the primal and adjoint systems are presented for the backward-difference family of time-integration schemes on both static and dynamic grids. The consistency of sensitivity derivatives is established via comparisons with complex-variable computations. The current work is believed to be the first verified implementation of an adjoint-based optimization methodology for the true time-dependent formulation of the Navier-Stokes equations in a practical computational code. Large-scale shape optimizations are demonstrated for turbulent flows over a tiltrotor geometry and a simulated aeroelastic motion of a fighter jet.
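The adjoint idea scales down to a toy steady problem: for a residual R(u, x) = A(x)u - b = 0 and an objective J(u), a single adjoint solve yields the sensitivity dJ/dx, verified here by finite differences (standing in for the complex-variable check mentioned above). This is only a schematic illustration with an invented 2x2 system, not the unstructured-grid implementation described in the paper.

```python
import numpy as np

def solve_state(x):
    """Toy steady problem: A(x) u = b, with a single design variable x."""
    A = np.array([[2.0 + x, -1.0], [-1.0, 2.0]])
    b = np.array([1.0, 0.0])
    return A, b, np.linalg.solve(A, b)

def objective(u):
    return u @ u            # J(u) = ||u||^2

x = 0.5
A, b, u = solve_state(x)

# Adjoint system: A^T lam = dJ/du, then dJ/dx = -lam^T (dA/dx) u
dJdu = 2.0 * u
lam = np.linalg.solve(A.T, dJdu)
dAdx = np.array([[1.0, 0.0], [0.0, 0.0]])
dJdx_adjoint = -lam @ (dAdx @ u)

# Finite-difference check (playing the role of the complex-variable verification)
eps = 1e-6
dJdx_fd = (objective(solve_state(x + eps)[2]) - objective(solve_state(x - eps)[2])) / (2 * eps)
print(f"adjoint dJ/dx = {dJdx_adjoint:.6f}, finite difference = {dJdx_fd:.6f}")
```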
The "Push-Pull" Approach to Fast-Track Management Development: A Case Study in Scientific Publishing
ERIC Educational Resources Information Center
Fojt, Martin; Parkinson, Stephen; Peters, John; Sandelands, Eric
2008-01-01
Purpose: The purpose of this paper is to explore how a medium sized business has addressed what it has termed a "push-pull" method of management and organization development, based around an action learning approach. Design/methodology/approach: The paper sets out a methodology that other SMEs might look to replicate in their management and…
Challenges to the Learning Organization in the Context of Generational Diversity and Social Networks
ERIC Educational Resources Information Center
Kaminska, Renata; Borzillo, Stefano
2018-01-01
Purpose: The purpose of this paper is to gain a better understanding of the challenges to the emergence of a learning organization (LO) posed by a context of generational diversity and an enterprise social networking system (ESNS). Design/methodology/approach: This study uses a qualitative methodology based on an analysis of 20 semi-structured…
Grant, A M; Richard, Y; Deland, E; Després, N; de Lorenzi, F; Dagenais, A; Buteau, M
1997-01-01
The Autocontrol methodology has been developed to support the optimisation of decision-making and the use of resources in the context of a clinical unit. Its theoretical basis relates to quality assurance and information systems and is influenced by management and cognitive research in the health domain. The methodology uses population-level rather than individual decision making and, because of its dynamic feedback design, promises to have a rapid and profound effect on practice. Most importantly, the health care professional is the principal user of the Autocontrol system. In this methodology we distinguish three types of evidence necessary for practice change: practice-based or internal evidence; best evidence derived from the literature, or external evidence, concerning the practice in question; and process-based evidence on how to optimise the process of practice change. The software used by the system is of the executive decision support type, which facilitates interrogation of large databases. The Autocontrol system is designed to interrogate the data of the patient medical record; however, the latter often lacks data on concomitant resource use, and this must be supplemented. This paper reviews the Autocontrol methodology and gives examples from current studies.
Auditing as Part of the Terminology Design Life Cycle
Min, Hua; Perl, Yehoshua; Chen, Yan; Halper, Michael; Geller, James; Wang, Yue
2006-01-01
Objective To develop and test an auditing methodology for detecting errors in medical terminologies satisfying systematic inheritance. This methodology is based on various abstraction taxonomies that provide high-level views of a terminology and highlight potentially erroneous concepts. Design Our auditing methodology is based on dividing concepts of a terminology into smaller, more manageable units. First, we divide the terminology’s concepts into areas according to their relationships/roles. Then each multi-rooted area is further divided into partial-areas (p-areas) that are singly-rooted. Each p-area contains a set of structurally and semantically uniform concepts. Two kinds of abstraction networks, called the area taxonomy and p-area taxonomy, are derived. These taxonomies form the basis for the auditing approach. Taxonomies tend to highlight potentially erroneous concepts in areas and p-areas. Human reviewers can focus their auditing efforts on the limited number of problematic concepts following two hypotheses on the probable concentration of errors. Results A sample of the area taxonomy and p-area taxonomy for the Biological Process (BP) hierarchy of the National Cancer Institute Thesaurus (NCIT) was derived from the application of our methodology to its concepts. These views led to the detection of a number of different kinds of errors that are reported, and to confirmation of the hypotheses on error concentration in this hierarchy. Conclusion Our auditing methodology based on area and p-area taxonomies is an efficient tool for detecting errors in terminologies satisfying systematic inheritance of roles, and thus facilitates their maintenance. This methodology concentrates a domain expert’s manual review on portions of the concepts with a high likelihood of errors. PMID:16929044
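The first partitioning step can be illustrated with a toy fragment: concepts are grouped into areas by their exact set of role types, and the roots of each area (concepts whose parents fall outside it) seed the partial-areas a reviewer would examine first. The concepts, roles, and hierarchy below are invented, not NCI Thesaurus content, and the p-area construction is simplified to root detection only.

```python
from collections import defaultdict

# Toy terminology fragment: concept -> set of relationship (role) types it exhibits.
roles = {
    "Cell Death":          {"has_location"},
    "Apoptosis":           {"has_location", "has_initiator"},
    "Necrosis":            {"has_location"},
    "Signal Transduction": {"has_initiator"},
    "Kinase Cascade":      {"has_initiator"},
}
parents = {                       # is-a hierarchy (child -> parents)
    "Apoptosis": {"Cell Death"},
    "Necrosis": {"Cell Death"},
    "Kinase Cascade": {"Signal Transduction"},
}

# Step 1: an "area" groups all concepts with exactly the same set of roles.
areas = defaultdict(list)
for concept, rset in roles.items():
    areas[frozenset(rset)].append(concept)

# Step 2: the roots of each area (concepts whose parents lie outside the area)
# are the seeds of its partial-areas and the first place to look for errors.
for rset, members in areas.items():
    in_area = set(members)
    area_roots = [c for c in members if not (parents.get(c, set()) & in_area)]
    print(sorted(rset), "-> members:", members, "| roots:", area_roots)
```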
ARC Collaborative Research Seminar Series
Designing the Alluvial Riverbeds in Curved Paths
NASA Astrophysics Data System (ADS)
Macura, Viliam; Škrinár, Andrej; Štefunková, Zuzana; Muchová, Zlatica; Majorošová, Martina
2017-10-01
The paper presents a method of determining the shape of the riverbed in curves of a watercourse, based on the method of Ikeda (1975) developed for a slightly curved path in a sandy riverbed. Regulated rivers have essentially slightly and smoothly curved paths; therefore, this methodology provides an appropriate basis for river restoration. Based on research in the experimental reach of the Holeška Brook and several alluvial mountain streams, the methodology was adjusted. The method also takes into account other important characteristics of the bottom material: the shape and orientation of the particles, settling velocity, and drag coefficients. Thus, the method is mainly meant for natural sand-gravel material, which is heterogeneous and whose particle shape is very different from spherical. The calculation of the river channel in the curved path provides the basis for the design of an optimal habitat, and also for the design of the foundations of bankside armouring. The input data are adapted to the conditions of design practice.
NASA Technical Reports Server (NTRS)
Wang, Ten-See; Foote, John; Litchford, Ron
2006-01-01
The objective of this effort is to perform design analyses for a non-nuclear hot-hydrogen materials tester, as a first step towards developing an efficient and accurate multiphysics, thermo-fluid computational methodology to predict environments for a hypothetical solid-core, nuclear thermal engine thrust chamber design and analysis. The computational methodology is based on a multidimensional, finite-volume, turbulent, chemically reacting, thermally radiating, unstructured-grid, and pressure-based formulation. The multiphysics invoked in this study include hydrogen dissociation kinetics and thermodynamics, turbulent flow, and convective and thermal radiative heat transfer. The goals of the design analyses are to maintain maximum hot-hydrogen jet impingement energy and to minimize chamber wall heating. The results of analyses of three test fixture configurations and the rationale for the final selection are presented. Interrogation of the physics revealed that hydrogen dissociation and recombination reactions are highly correlated with local temperature and are necessary for accurate prediction of the hot-hydrogen jet temperature.
Siksik, May; Krishnamurthy, Vikram
2017-09-01
This paper proposes a multi-dielectric Brownian dynamics simulation framework for design-space-exploration (DSE) studies of ion-channel permeation. The goal of such DSE studies is to estimate the channel modeling-parameters that minimize the mean-squared error between the simulated and expected "permeation characteristics." To address this computational challenge, we use a methodology based on statistical inference that utilizes the knowledge of channel structure to prune the design space. We demonstrate the proposed framework and DSE methodology using a case study based on the KcsA ion channel, in which the design space is successfully reduced from a 6-D space to a 2-D space. Our results show that the channel dielectric map computed using the framework matches with that computed directly using molecular dynamics with an error of 7%. Finally, the scalability and resolution of the model used are explored, and it is shown that the memory requirements needed for DSE remain constant as the number of parameters (degree of heterogeneity) increases.
Leong, H M; Carter, Mark; Stephenson, Jennifer
2015-12-01
Sensory integration therapy (SIT) is a controversial intervention that is widely used for people with disabilities. A systematic analysis was conducted of the outcomes of 17 single case design studies on sensory integration therapy for people with, or at risk of, a developmental or learning disability, disorder or delay. An assessment of the methodological quality of the studies found most used weak designs and poor methodology, with a tendency for higher quality studies to produce negative results. Based on limited comparative evidence, functional analysis-based interventions for challenging behavior were more effective than SIT. Overall, the studies do not provide convincing evidence for the efficacy of sensory integration therapy. Given the findings of the present review and other recent analyses, it is advised that the use of SIT be limited to experimental contexts. Issues with the studies and possible improvements for future research are discussed, including the need to employ designs that allow for adequate demonstration of experimental control. Copyright © 2015 Elsevier Ltd. All rights reserved.
Evaluating and redesigning teaching learning sequences at the introductory physics level
NASA Astrophysics Data System (ADS)
Guisasola, Jenaro; Zuza, Kristina; Ametller, Jaume; Gutierrez-Berraondo, José
2017-12-01
In this paper we put forward a proposal for the design and evaluation of teaching and learning sequences in upper secondary school and university. We will connect our proposal with relevant contributions on the design of teaching sequences, ground it on the design-based research methodology, and discuss how teaching and learning sequences designed according to our proposal relate to learning progressions. An iterative methodology for evaluating and redesigning the teaching and learning sequence (TLS) is presented. The proposed assessment strategy focuses on three aspects: (a) evaluation of the activities of the TLS, (b) evaluation of learning achieved by students in relation to the intended objectives, and (c) a document for gathering the difficulties found when implementing the TLS to serve as a guide to teachers. Discussion of this guide with external teachers provides feedback used for the TLS redesign. The context of our implementation and evaluation is an innovative calculus-based physics course for first-year engineering and science degree students at the University of the Basque Country.
Hybrid CMS methods with model reduction for assembly of structures
NASA Technical Reports Server (NTRS)
Farhat, Charbel
1991-01-01
Future on-orbit structures will be designed and built in several stages, each with specific control requirements. Therefore there must be a methodology which can predict the dynamic characteristics of the assembled structure, based on the dynamic characteristics of the subassemblies and their interfaces. The methodology developed by CSC to address this issue is Hybrid Component Mode Synthesis (HCMS). HCMS distinguishes itself from standard component mode synthesis algorithms in the following features: (1) it does not require the subcomponents to have displacement compatible models, which makes it ideal for analyzing the deployment of heterogeneous flexible multibody systems, (2) it incorporates a second-level model reduction scheme at the interface, which makes it much faster than other algorithms and therefore suitable for control purposes, and (3) it does answer specific questions such as 'how does the global fundamental frequency vary if I change the physical parameters of substructure k by a specified amount?'. Because it is based on an energy principle rather than displacement compatibility, this methodology can also help the designer to define an assembly process. Current and future efforts are devoted to applying the HCMS method to design and analyze docking and berthing procedures in orbital construction.
On sustainable and efficient design of ground-source heat pump systems
NASA Astrophysics Data System (ADS)
Grassi, W.; Conti, P.; Schito, E.; Testi, D.
2015-11-01
This paper is mainly aimed at stressing some fundamental features of GSHP design and is based on broad research we are performing at the University of Pisa. In particular, we focus the discussion on an environmentally sustainable approach based on performance optimization over the entire operational life. The proposed methodology investigates design and management strategies to find the optimal level of exploitation of the ground source, using other technical means to cover the remaining energy requirements and modulate the power peaks. The method is holistic, considering the system as a whole rather than focusing only on the components usually regarded as the most important ones. Each subsystem is modeled and coupled to the others in a full set of equations, which is used within an optimization routine to reproduce the operating performance of the overall GSHP system. As a matter of fact, the recommended methodology is a 4-in-1 activity, including sizing of components, lifecycle performance evaluation, optimization, and feasibility analysis. The paper also reviews some previous works concerning possible applications of the proposed methodology. In conclusion, we describe ongoing research activities and objectives of future work.
Saravanan, P; Muthuvelayudham, R; Viruthagiri, T
2012-01-01
Optimization of the culture medium for cellulase production using Trichoderma reesei was carried out. The optimization of cellulase production using mango peel as substrate was performed with a statistical methodology based on experimental designs. The screening of nine nutrients for their influence on cellulase production was achieved using a Plackett-Burman design. Avicel, soybean cake flour, KH(2)PO(4), and CoCl(2)·6H(2)O were selected based on their positive influence on cellulase production. The composition of the selected components was optimized using response surface methodology (RSM). The optimum conditions are as follows: Avicel, 25.30 g/L; soybean cake flour, 23.53 g/L; KH(2)PO(4), 4.90 g/L; and CoCl(2)·6H(2)O, 0.95 g/L. These conditions were validated experimentally and revealed an enhanced cellulase activity of 7.8 IU/mL.
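The screening step can be sketched as follows: a 12-run Plackett-Burman design built from the standard generating row, with each nutrient's main effect estimated as the difference between mean responses at its high and low levels. Only four of the nine nutrients are named in the abstract; the remaining factor names and all response values below are illustrative.

```python
import numpy as np

# 12-run Plackett-Burman design: cyclic shifts of the standard generating row
# plus a final all-minus run.  Columns beyond the nine factors stay unused.
gen = np.array([+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1])
design = np.array([np.roll(gen, i) for i in range(11)] + [-np.ones(11, dtype=int)])

factors = ["Avicel", "Soybean cake flour", "KH2PO4", "CoCl2.6H2O",
           "Peptone", "Urea", "MgSO4", "CaCl2", "Tween 80"]   # last five are placeholders

# Synthetic cellulase activities for the 12 runs (IU/mL); real values would come
# from the fermentation experiments.
rng = np.random.default_rng(7)
y = (4.0 + design[:, 0]*0.9 + design[:, 1]*0.7 + design[:, 2]*0.4
     + design[:, 3]*0.3 + rng.normal(0, 0.2, 12))

# Main effect of each factor = mean response at +1 minus mean response at -1.
for j, name in enumerate(factors):
    col = design[:, j]
    effect = y[col == 1].mean() - y[col == -1].mean()
    print(f"{name:20s} effect = {effect:+.2f} IU/mL")
```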
FAME, a microprocessor based front-end analysis and modeling environment
NASA Technical Reports Server (NTRS)
Rosenbaum, J. D.; Kutin, E. B.
1980-01-01
Higher order software (HOS) is a methodology for the specification and verification of large scale, complex, real-time systems. The HOS methodology was implemented as FAME (front-end analysis and modeling environment), a microprocessor-based system for interactively developing, analyzing, and displaying system models in a low-cost, user-friendly environment. The nature of the model is such that, when completed, it can be the basis for projection to a variety of forms such as structured design diagrams, Petri nets, data flow diagrams, and PSL/PSA source code. The user's interface with the analyzer is easily recognized by any current user of a structured modeling approach; therefore extensive training is unnecessary. Furthermore, when all the system capabilities are used, one can check proper usage of data types, functions, and control structures, thereby adding a new dimension to the design process that will lead to better and more easily verified software designs.
A comprehensive risk assessment framework for offsite transportation of inflammable hazardous waste.
Das, Arup; Gupta, A K; Mazumder, T N
2012-08-15
A framework for risk assessment due to offsite transportation of hazardous wastes is designed based on the type of event that can be triggered by an accident involving a hazardous waste carrier. The objective of this study is to design a framework for computing the risk to population associated with offsite transportation of inflammable and volatile wastes. The framework is based on the traditional definition of risk and is designed for conditions where accident databases are not available. The probability-based variable in the risk assessment framework is substituted by a composite accident index proposed in this study. The framework computes the impacts of a vapour cloud explosion based on the TNO multi-energy model. The methodology also estimates the vulnerable population in terms of disability adjusted life years (DALY), which takes into consideration the demographic profile of the population and the degree of mortality and morbidity sustained. The methodology is illustrated using a case study of a pharmaceutical industry in the Kolkata metropolitan area. Copyright © 2012 Elsevier B.V. All rights reserved.
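The overall bookkeeping can be sketched as likelihood times consequence, with the consequence expressed in DALYs (years of life lost plus years lived with disability). The composite accident index, the TNO blast consequence calculation, and the population and disability figures are all replaced here by placeholder numbers, so the sketch shows only the structure of the framework, not its actual models.

```python
def daly(deaths, injured, life_years_lost_per_death=35.0,
         disability_weight=0.3, injury_duration_years=2.0):
    """Disability-adjusted life years = years of life lost + years lived with
    disability (standard composition; all parameter values are illustrative)."""
    yll = deaths * life_years_lost_per_death
    yld = injured * disability_weight * injury_duration_years
    return yll + yld

def segment_risk(accident_index, p_ignition, exposed_population,
                 fatality_fraction, injury_fraction):
    """Risk for one route segment: likelihood surrogate (composite accident
    index x ignition probability) times consequence expressed in DALYs."""
    deaths = exposed_population * fatality_fraction
    injured = exposed_population * injury_fraction
    return accident_index * p_ignition * daly(deaths, injured)

# Compare two hypothetical routes for the waste carrier:
routes = {"through city core": (0.8, 0.1, 5000, 0.002, 0.02),
          "ring road":         (0.5, 0.1, 800, 0.002, 0.02)}
for name, args in routes.items():
    print(f"{name:18s} risk = {segment_risk(*args):8.1f} DALY-weighted units")
```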
Intelligent systems engineering methodology
NASA Technical Reports Server (NTRS)
Fouse, Scott
1990-01-01
An added challenge for the designers of large scale systems such as Space Station Freedom is the appropriate incorporation of intelligent system technology (artificial intelligence, expert systems, knowledge-based systems, etc.) into their requirements and design. This presentation will describe a view of systems engineering which successfully addresses several aspects of this complex problem: design of large scale systems, design with requirements that are so complex they only completely unfold during the development of a baseline system and even then continue to evolve throughout the system's life cycle, design that involves the incorporation of new technologies, and design and development that takes place with many players in a distributed manner yet can be easily integrated to meet a single view of the requirements. The first generation of this methodology was developed and evolved jointly by ISX and the Lockheed Aeronautical Systems Company over the past five years on the Defense Advanced Research Projects Agency/Air Force Pilot's Associate Program, one of the largest, most complex, and most successful intelligent systems constructed to date. As the methodology has evolved it has also been applied successfully to a number of other projects. Some of the lessons learned from this experience may be applicable to Freedom.
Extended cooperative control synthesis
NASA Technical Reports Server (NTRS)
Davidson, John B.; Schmidt, David K.
1994-01-01
This paper reports on research extending the Cooperative Control Synthesis methodology to include a more accurate modeling of the pilot's controller dynamics. Cooperative Control Synthesis (CCS) is a methodology that addresses the problem of how to design control laws for piloted, high-order, multivariate systems and/or non-conventional dynamic configurations in the absence of flying qualities specifications. This is accomplished by emphasizing the parallel structure inherent in any pilot-controlled, augmented vehicle. The original CCS methodology is extended to include the Modified Optimal Control Model (MOCM), which is based upon the optimal control model of the human operator developed by Kleinman, Baron, and Levison in 1970. This model provides a representation of the pilot's compensation dynamics that is more accurate than the simplified pilot dynamic representation currently in the CCS methodology. Inclusion of the MOCM in the CCS also enables the modeling of pilot-observation perception thresholds and pilot-observation attention-allocation effects. This Extended Cooperative Control Synthesis (ECCS) allows for the direct calculation of pilot and system open- and closed-loop transfer functions in pole/zero form and is readily implemented in current software capable of analysis and design for dynamic systems. Example results based upon synthesizing an augmentation control law for an acceleration command system in a compensatory tracking task using the ECCS are compared with a similar synthesis performed using the original CCS methodology. The ECCS is shown to provide augmentation control laws that yield more favorable predicted closed-loop flying qualities and tracking performance than those synthesized using the original CCS methodology.
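The optimal-control machinery shared by the OCM pilot model and the augmentation synthesis reduces, in its simplest form, to solving a Riccati equation. The sketch below computes an LQR gain for a double-integrator, acceleration-command plant with illustrative weights; it omits the OCM's observation noise, time delay, and neuromuscular dynamics and is not the ECCS algorithm itself.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Double-integrator plant (acceleration-command tracking task):
# states = [tracking error, rate]; input = commanded acceleration.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])

# Quadratic weights on tracking error, rate, and control effort (illustrative).
Q = np.diag([10.0, 1.0])
R = np.array([[0.1]])

# Solve the continuous-time algebraic Riccati equation and form the LQR gain,
# the same machinery that underlies the optimal control model of the pilot.
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)
print("LQR gain K =", K)

# Closed-loop eigenvalues show the resulting tracking dynamics.
print("closed-loop poles:", np.linalg.eigvals(A - B @ K))
```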
ERIC Educational Resources Information Center
Puma, Michael J.; Ellis, Richard
Part of a study of program management procedures in the campus-based and Basic Educational Opportunity Grant programs, this report describes the design of the site visit component of the study and the results of the student survey, both in terms of the yield obtained and the quality of the data. Chapter 2 describes the design of the sampling methodology employed…
NASA Astrophysics Data System (ADS)
Huang, Xiao
2006-04-01
Today's and especially tomorrow's competitive launch vehicle design environment requires the development of a dedicated generic Space Access Vehicle (SAV) design methodology. A total of 115 industrial, research, and academic aircraft, helicopter, missile, and launch vehicle design synthesis methodologies have been evaluated. As the survey indicates, each synthesis methodology tends to focus on a specific flight vehicle configuration, thus precluding the key capability to systematically compare flight vehicle design alternatives. The aim of this research investigation is to provide decision-making bodies and practicing engineers with a design process and toolbox for robust modeling and simulation of flight vehicles where the ultimate performance characteristics may hinge on numerical subtleties. This will enable the designer of a SAV, for the first time, to consistently compare different classes of SAV configurations on an impartial basis. This dissertation presents the development steps required towards a generic (configuration-independent) hands-on flight vehicle conceptual design synthesis methodology. The process is developed such that it can be applied to any flight vehicle class if desired. In the present context, the methodology has been put into operation for the conceptual design of a tourist Space Access Vehicle. The case study illustrates elements of the design methodology and algorithm for the class of Horizontal Takeoff and Horizontal Landing (HTHL) SAVs. The HTHL SAV design application clearly outlines how the conceptual design process can be centrally organized, executed and documented with a focus on design transparency, physical understanding and the capability to reproduce results. This approach offers the project lead and creative design team a management process and tool which iteratively refines the individual design logic chosen, leading to mature design methods and algorithms. As illustrated, the HTHL SAV hands-on design methodology offers growth potential in that the same methodology can be continually updated and extended to other SAV configuration concepts, such as the Vertical Takeoff and Vertical Landing (VTVL) SAV class. Having developed, validated and calibrated the methodology for HTHL designs in the 'hands-on' mode, the report provides an outlook on how the methodology will be integrated into the prototype computerized design synthesis software AVDS-PrADOSAV in a follow-on step.
Read, Gemma J M; Salmon, Paul M; Lenné, Michael G
2016-09-01
The Cognitive Work Analysis Design Toolkit (CWA-DT) is a recently developed approach that provides guidance and tools to assist in applying the outputs of CWA to design processes that incorporate the values and principles of sociotechnical systems theory. In this paper, the CWA-DT is evaluated based on an application to improve safety at rail level crossings (RLXs). The evaluation considered the extent to which the CWA-DT met pre-defined methodological criteria and aligned with sociotechnical values and principles. Both process and outcome measures were taken, based on the ratings of workshop participants and human factors experts. Overall, workshop participants were positive about the process and indicated that it met the methodological criteria and sociotechnical values. However, expert ratings suggested that the CWA-DT achieved only limited success in producing RLX designs that fully aligned with the sociotechnical approach. A discussion of the appropriateness of the sociotechnical approach in a public safety context is provided. Practitioner Summary: Human factors and ergonomics practitioners need evidence of the effectiveness of methods. A design toolkit for cognitive work analysis, incorporating values and principles from sociotechnical systems theory, was applied to create innovative designs for rail level crossings. Evaluation results based on the application are provided and discussed.
Evaluating and Redesigning Teaching Learning Sequences at the Introductory Physics Level
ERIC Educational Resources Information Center
Guisasola, Jenaro; Zuza, Kristina; Ametller, Jaume; Gutierrez-Berraondo, José
2017-01-01
In this paper we put forward a proposal for the design and evaluation of teaching and learning sequences in upper secondary school and university. We will connect our proposal with relevant contributions on the design of teaching sequences, ground it on the design-based research methodology, and discuss how teaching and learning sequences designed…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-01
... survey tools, and the research methodology to test content, format, and design of labeling is based on... information will have practical utility. (Response) The survey is designed to elicit responses on the... do so. This survey is designed for the health care provider and their feedback. (Comment 2) Comment 2...
ERIC Educational Resources Information Center
Collins, Linda J.; Liang, Xin
2014-01-01
Online professional development (oPD) for teachers should focus on designing web-based learning opportunities that help practicing educators solve the tough problems of practice when working in their schools. Technology, pedagogy, and content knowledge can be integrated in the design of online professional development modules to enhance task…
NASA Astrophysics Data System (ADS)
Whitehead, James Joshua
The analysis documented herein provides an integrated approach for the conduct of optimization under uncertainty (OUU) using Monte Carlo Simulation (MCS) techniques coupled with response surface-based methods for characterization of mixture-dependent variables. This novel methodology provides an innovative means of conducting optimization studies under uncertainty in propulsion system design. Analytic inputs are based upon empirical regression rate information obtained from design of experiments (DOE) mixture studies utilizing a mixed oxidizer hybrid rocket concept. Hybrid fuel regression rate was selected as the target response variable for optimization under uncertainty, with maximization of regression rate chosen as the driving objective. Characteristic operational conditions and propellant mixture compositions from experimental efforts conducted during previous foundational work were combined with elemental uncertainty estimates as input variables. Response surfaces for mixture-dependent variables and their associated uncertainty levels were developed using quadratic response equations incorporating single and two-factor interactions. These analysis inputs, response surface equations and associated uncertainty contributions were applied to a probabilistic MCS to develop dispersed regression rates as a function of operational and mixture input conditions within design space. Illustrative case scenarios were developed and assessed using this analytic approach including fully and partially constrained operational condition sets over all of design mixture space. In addition, optimization sets were performed across an operationally representative region in operational space and across all investigated mixture combinations. These scenarios were selected as representative examples relevant to propulsion system optimization, particularly for hybrid and solid rocket platforms. Ternary diagrams, including contour and surface plots, were developed and utilized to aid in visualization. The concept of Expanded-Durov diagrams was also adopted and adapted to this study to aid in visualization of uncertainty bounds. Regions of maximum regression rate and associated uncertainties were determined for each set of case scenarios. Application of response surface methodology coupled with probabilistic-based MCS allowed for flexible and comprehensive interrogation of mixture and operating design space during optimization cases. Analyses were also conducted to assess sensitivity of uncertainty to variations in key elemental uncertainty estimates. The methodology developed during this research provides an innovative optimization tool for future propulsion design efforts.
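As a rough, self-contained illustration of the approach described above, the Python sketch below disperses a fuel regression rate through Monte Carlo sampling of a quadratic response surface with a two-factor interaction, then screens candidate operating points by a conservative percentile of the dispersed rate. The surface coefficients, input distributions, and percentile criterion are illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical quadratic response surface for fuel regression rate r (mm/s) as a
# function of oxidizer mass flux Go and oxidizer mixture fraction x_ox, with a
# two-factor interaction. All coefficients are illustrative, not fitted values.
def regression_rate(Go, x_ox):
    return (0.8 + 0.012*Go + 0.5*x_ox + 0.004*Go*x_ox
            - 2.0e-5*Go**2 - 0.3*x_ox**2)

# Monte Carlo dispersion around a nominal operating point; the input standard
# deviations stand in for the elemental uncertainty estimates.
n = 100_000
Go    = rng.normal(150.0, 7.5, n)    # oxidizer mass flux, kg/(m^2 s)
x_ox  = rng.normal(0.60, 0.02, n)    # oxidizer mixture fraction
noise = rng.normal(0.0, 0.02, n)     # residual response-surface uncertainty

r = regression_rate(Go, x_ox) + noise
print(f"mean rate {r.mean():.3f} mm/s, "
      f"95% interval [{np.percentile(r, 2.5):.3f}, {np.percentile(r, 97.5):.3f}]")

# Crude optimization under uncertainty: sweep candidate operating points and keep
# the one that maximizes a conservative (5th percentile) dispersed regression rate.
def p5(g, x, m=5000):
    return np.percentile(regression_rate(rng.normal(g, 7.5, m),
                                         rng.normal(x, 0.02, m)), 5)

candidates = [(g, x) for g in np.linspace(100, 200, 11)
                     for x in np.linspace(0.50, 0.70, 11)]
best = max(candidates, key=lambda c: p5(*c))
print("best (Go, x_ox) by 5th-percentile rate:", best)
```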
NASA Astrophysics Data System (ADS)
Agarwal, Smriti; Bisht, Amit Singh; Singh, Dharmendra; Pathak, Nagendra Prasad
2014-12-01
Millimetre wave (MMW) imaging is gaining tremendous interest among researchers, with potential applications in security checks, standoff personal screening, automotive collision avoidance, and many more. Current state-of-the-art imaging techniques, viz. microwave and X-ray imaging, suffer from lower resolution and harmful ionizing radiation, respectively. In contrast, MMW imaging operates at lower power and is non-ionizing, and hence medically safe. Despite these favourable attributes, MMW imaging faces several challenges: it is still a relatively unexplored area and lacks a suitable imaging methodology for extracting complete target information. In view of these challenges, an MMW active imaging radar system at 60 GHz was designed for standoff imaging applications. A C-scan (horizontal and vertical scanning) methodology was developed that provides a cross-range resolution of 8.59 mm. The paper further details a suitable target identification and classification methodology. For identification of regular-shape targets, a mean-standard deviation based segmentation technique was formulated and further validated using a different target shape. For classification, a probability density function based target material discrimination methodology was proposed and further validated on a different dataset. Lastly, a novel artificial neural network based scale- and rotation-invariant image reconstruction methodology has been proposed to counter distortions in the image caused by noise, rotation or scale variations. The designed neural network, once trained with sample images, automatically takes care of these deformations and successfully reconstructs the corrected image for the test targets. The techniques developed in this paper are tested and validated using four different regular shapes, viz. rectangle, square, triangle and circle.
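The following minimal Python sketch illustrates what a mean/standard-deviation based segmentation of a C-scan intensity map can look like on synthetic data; the threshold multiplier k and the synthetic target are assumptions, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic C-scan reflectivity map: background noise plus a bright rectangular target.
image = rng.normal(0.2, 0.05, size=(64, 64))
image[20:40, 15:45] += 0.6   # hypothetical metallic target

# Mean / standard-deviation based segmentation: pixels brighter than mean + k*std
# are labelled as target. k is a tuning parameter (assumed here, not from the paper).
k = 2.0
threshold = image.mean() + k * image.std()
mask = image > threshold
print(f"threshold = {threshold:.3f}, target pixels = {mask.sum()}")

# A simple shape descriptor on the segmented region (bounding-box fill ratio),
# which could feed a later identification/classification step.
rows, cols = np.nonzero(mask)
bbox_area = (rows.ptp() + 1) * (cols.ptp() + 1)
print(f"bounding-box fill ratio = {mask.sum() / bbox_area:.2f}")
```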
Kammoun, Radhouane; Naili, Belgacem; Bejar, Samir
2008-09-01
The production optimization of alpha-amylase (E.C. 3.2.1.1) from the fungus Aspergillus oryzae CBS 819.72, using a by-product of wheat grinding (gruel) as the sole carbon source, was performed with a statistical methodology based on three experimental designs. The optimization of temperature, agitation and inoculum size was carried out using a Box-Behnken design under the response surface methodology. The screening of nineteen nutrients for their influence on alpha-amylase production was achieved using a Plackett-Burman design. KH2PO4, urea, glycerol, (NH4)2SO4, CoCl2, casein hydrolysate, soybean meal hydrolysate and MgSO4 were selected based on their positive influence on enzyme formation. The optimized nutrient concentrations were obtained using a Taguchi experimental design, and analysis of the data predicts a theoretical increase in alpha-amylase expression of 73.2% (from 40.1 to 151.1 U/ml). These conditions were validated experimentally and revealed an enhanced alpha-amylase yield of 72.7%.
CFD Aided Design and Production of Hydraulic Turbines
NASA Astrophysics Data System (ADS)
Kaplan, Alper; Cetinturk, Huseyin; Demirel, Gizem; Ayli, Ece; Celebioglu, Kutay; Aradag, Selin; ETU Hydro Research Center Team
2014-11-01
Hydraulic turbines are turbomachines which produce electricity from hydraulic energy. Francis-type turbines are the most common type in use today. The design of these turbines requires considerable engineering effort, since each turbine is tailor-made for a specific head and discharge; therefore, each component of the turbine is designed specifically. During the last decades, Computational Fluid Dynamics (CFD) has become a very useful tool to predict hydraulic machinery performance and to save designers time and money. This paper describes a design methodology to optimize a Francis turbine by integrating theoretical and experimental fundamentals of hydraulic machines with commercial CFD codes. Specific turbines are designed and manufactured for hydraulic electric power plants with the help of a collaborative CFD/CAD/CAM methodology based on computational fluid dynamics and five-axis machining. The details are presented in this study. This study is financially supported by the Turkish Ministry of Development.
77 FR 71794 - Agency Forms Undergoing Paperwork Reduction Act Review
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-04
... in that community. Formative research is research that occurs before a program is designed and... entirely behavioral but most often they are cycles of interviews and focus groups designed to inform the... specific data collection instruments, (3) methodological research (4) usability testing of technology-based...
Experimental Learning Enhancing Improvisation Skills
ERIC Educational Resources Information Center
Pereira Christopoulos, Tania; Wilner, Adriana; Trindade Bestetti, Maria Luisa
2016-01-01
Purpose: This study aims to present improvisation training and experimentation as an alternative method to deal with unexpected events in which structured processes do not seem to work. Design/Methodology/Approach: Based on the literature of sensemaking and improvisation, the study designs a framework and process model of experimental learning…
Wiki-Based Rapid Prototyping for Teaching-Material Design in E-Learning Grids
ERIC Educational Resources Information Center
Shih, Wen-Chung; Tseng, Shian-Shyong; Yang, Chao-Tung
2008-01-01
Grid computing environments with abundant resources can support innovative e-Learning applications, and are promising platforms for e-Learning. To support individualized and adaptive learning, teachers are encouraged to develop various teaching materials according to different requirements. However, traditional methodologies for designing teaching…
ERIC Educational Resources Information Center
Halliburton, Cal; Roza, Victoria
2006-01-01
Technology educators are constantly in search of new tools and methods to enhance the education of their students. This article is an excerpt from a longer article published in "The Technology Teacher" that introduced the technology education community to a research- and knowledge-based methodology for design--invention and innovation. This…
Landing Gear Integration in Aircraft Conceptual Design. Revision
NASA Technical Reports Server (NTRS)
Chai, Sonny T.; Mason, William H.
1997-01-01
The design of the landing gear is one of the more fundamental aspects of aircraft design. The design and integration process encompasses numerous engineering disciplines, e.g., structures, weights, runway design, and economics, and has become extremely sophisticated in the last few decades. Although the design process is well documented, no attempt has been made until now to develop a design methodology that can be used within an automated environment. As a result, the process remains a key responsibility of the configuration designer and is largely experience-based and graphically oriented. However, as industry and government try to incorporate multidisciplinary design optimization (MDO) methods in the conceptual design phase, the need for a more systematic procedure has become apparent. The development of an MDO-capable design methodology as described in this work is focused on providing the conceptual designer with tools to help automate the disciplinary analyses, i.e., geometry, kinematics, flotation, and weight. Documented design procedures and analyses were examined to determine their applicability and to ensure compliance with current practices and regulations. Using the latest information obtained from industry during an initial survey, the analyses were in turn modified and expanded to accommodate the design criteria associated with advanced large subsonic transports. Algorithms were then developed based on the updated analysis procedures to be incorporated into existing MDO codes.
Automated software development workstation
NASA Technical Reports Server (NTRS)
1986-01-01
Engineering software development was automated using an expert system (rule-based) approach. The use of this technology offers benefits not available from current software development and maintenance methodologies. A workstation was built with a library, or program database, with methods for browsing the stored designs; a system for graphical specification of designs, including a capability for hierarchical refinement and definition in a graphical design system; and an automated code generation capability in FORTRAN. The workstation was then used in a demonstration with examples from an attitude control subsystem design for the space station. Documentation and recommendations are presented.
NASA Technical Reports Server (NTRS)
Li, Fei; Choudhari, Meelan M.; Carpenter, Mark H.; Malik, Mujeeb R.; Eppink, Jenna; Chang, Chau-Lyan; Streett, Craig L.
2010-01-01
A high fidelity transition prediction methodology has been applied to a swept airfoil design at a Mach number of 0.75 and chord Reynolds number of approximately 17 million, with the dual goal of an assessment of the design for the implementation and testing of roughness based crossflow transition control and continued maturation of such methodology in the context of realistic aerodynamic configurations. Roughness based transition control involves controlled seeding of suitable, subdominant crossflow modes in order to weaken the growth of naturally occurring, linearly more unstable instability modes via a nonlinear modification of the mean boundary layer profiles. Therefore, a synthesis of receptivity, linear and nonlinear growth of crossflow disturbances, and high-frequency secondary instabilities becomes desirable to model this form of control. Because experimental data is currently unavailable for passive crossflow transition control for such high Reynolds number configurations, a holistic computational approach is used to assess the feasibility of roughness based control methodology. Potential challenges inherent to this control application as well as associated difficulties in modeling this form of control in a computational setting are highlighted. At high Reynolds numbers, a broad spectrum of stationary crossflow disturbances amplify and, while it may be possible to control a specific target mode using Discrete Roughness Elements (DREs), nonlinear interaction between the control and target modes may yield strong amplification of the difference mode that could have an adverse impact on the transition delay using spanwise periodic roughness elements.
NASA Astrophysics Data System (ADS)
Alfano, M.; Bisagni, C.
2017-01-01
The objective of the running EU project DESICOS (New Robust DESign Guideline for Imperfection Sensitive COmposite Launcher Structures) is to formulate an improved shell design methodology in order to meet the demand of the aerospace industry for lighter structures. Within the project, this article discusses a probability-based methodology developed at Politecnico di Milano. It is based on the combination of the Stress-Strength Interference Method and the Latin Hypercube Method, with the aim of predicting the buckling response of three sandwich composite cylindrical shells under a loading condition of pure compression. The three shells are made of the same material but have different stacking sequences and geometric dimensions; one of them presents three circular cut-outs. Different types of input imperfections, treated as random variables, are taken into account independently and in combination: variability in longitudinal Young's modulus, ply misalignment, geometric imperfections, and boundary imperfections. The methodology enables a first assessment of the structural reliability of the shells through the calculation of a probabilistic buckling factor for a specified level of probability. The factor depends highly on the reliability level, on the number of adopted samples, and on the assumptions made in modeling the input imperfections. The main advantage of the developed procedure is its versatility, as it can be applied to the buckling analysis of laminated composite shells and sandwich composite shells including different types of imperfections.
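A minimal sketch of how Latin Hypercube sampling and stress-strength interference can be combined to estimate a failure probability and a probabilistic buckling factor is given below; the linear knockdown surrogate, the distributions, and all numerical values are assumptions standing in for the finite element buckling analyses used in the project.

```python
import numpy as np
from scipy.stats import qmc, norm

# Latin Hypercube samples of three imperfection variables (all values illustrative).
n = 5000
sampler = qmc.LatinHypercube(d=3, seed=2)
u = sampler.random(n)

E_long  = norm(loc=60e9, scale=2e9).ppf(u[:, 0])     # longitudinal modulus, Pa
ply_err = norm(loc=0.0,  scale=1.0).ppf(u[:, 1])     # ply misalignment, deg
geo_imp = norm(loc=0.1,  scale=0.03).ppf(u[:, 2])    # geometric imperfection / thickness

# Hypothetical surrogate for the buckling load (stands in for the FE analysis):
# nominal load knocked down linearly by each imperfection. Coefficients are assumed.
P_nom = 350e3  # N
strength = P_nom * (E_long / 60e9) * (1 - 0.02 * np.abs(ply_err)) * (1 - 0.8 * geo_imp)

# Stress-strength interference: compare against the applied compression load.
P_applied = 290e3  # N
prob_failure = np.mean(strength < P_applied)
print(f"estimated failure probability: {prob_failure:.4f}")

# Probabilistic buckling factor for a specified reliability level (e.g. 99%):
# the load level survived with 99% probability, normalized by the nominal load.
reliability = 0.99
buckling_factor = np.percentile(strength, (1 - reliability) * 100) / P_nom
print(f"probabilistic buckling factor at {reliability:.0%} reliability: {buckling_factor:.3f}")
```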
DB4US: A Decision Support System for Laboratory Information Management.
Carmona-Cejudo, José M; Hortas, Maria Luisa; Baena-García, Manuel; Lana-Linati, Jorge; González, Carlos; Redondo, Maximino; Morales-Bueno, Rafael
2012-11-14
Until recently, laboratory automation has focused primarily on improving hardware. Future advances are concentrated on intelligent software since laboratories performing clinical diagnostic testing require improved information systems to address their data processing needs. In this paper, we propose DB4US, an application that automates information related to laboratory quality indicators information. Currently, there is a lack of ready-to-use management quality measures. This application addresses this deficiency through the extraction, consolidation, statistical analysis, and visualization of data related to the use of demographics, reagents, and turn-around times. The design and implementation issues, as well as the technologies used for the implementation of this system, are discussed in this paper. To develop a general methodology that integrates the computation of ready-to-use management quality measures and a dashboard to easily analyze the overall performance of a laboratory, as well as automatically detect anomalies or errors. The novelty of our approach lies in the application of integrated web-based dashboards as an information management system in hospital laboratories. We propose a new methodology for laboratory information management based on the extraction, consolidation, statistical analysis, and visualization of data related to demographics, reagents, and turn-around times, offering a dashboard-like user web interface to the laboratory manager. The methodology comprises a unified data warehouse that stores and consolidates multidimensional data from different data sources. The methodology is illustrated through the implementation and validation of DB4US, a novel web application based on this methodology that constructs an interface to obtain ready-to-use indicators, and offers the possibility to drill down from high-level metrics to more detailed summaries. The offered indicators are calculated beforehand so that they are ready to use when the user needs them. The design is based on a set of different parallel processes to precalculate indicators. The application displays information related to tests, requests, samples, and turn-around times. The dashboard is designed to show the set of indicators on a single screen. DB4US was deployed for the first time in the Hospital Costa del Sol in 2008. In our evaluation we show the positive impact of this methodology for laboratory professionals, since the use of our application has reduced the time needed for the elaboration of the different statistical indicators and has also provided information that has been used to optimize the usage of laboratory resources by the discovery of anomalies in the indicators. DB4US users benefit from Internet-based communication of results, since this information is available from any computer without having to install any additional software. The proposed methodology and the accompanying web application, DB4US, automates the processing of information related to laboratory quality indicators and offers a novel approach for managing laboratory-related information, benefiting from an Internet-based communication mechanism. The application of this methodology has been shown to improve the usage of time, as well as other laboratory resources.
NASA Astrophysics Data System (ADS)
Tai, Wei; Abbasi, Mortez; Ricketts, David S.
2018-01-01
We present the analysis and design of high-power millimetre-wave power amplifier (PA) systems using zero-degree combiners (ZDCs). The methodology presented optimises the PA device sizing and the number of combined unit PAs based on device load-pull simulations, driver power consumption analysis and loss analysis of the ZDC. Our analysis shows that an optimal number of N-way combined unit PAs leads to the highest power-added efficiency (PAE) for a given output power. To illustrate our design methodology, we designed a 1-W PA system at 45 GHz using a 45 nm silicon-on-insulator process and showed that an 8-way combined PA has the highest PAE, yielding a simulated output power of 30.6 dBm and 31% peak PAE.
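The toy Python sketch below shows the shape of such an N-way combining trade-off: splitting a fixed output power over more unit PAs reduces the per-device burden (assumed here to improve device efficiency) but adds combiner loss and driver consumption, so an intermediate N maximizes a simplified PAE. All device and loss numbers are assumptions for illustration and do not reproduce the paper's 45 GHz design.

```python
import numpy as np

# Toy trade-off model (all numbers are assumptions, not from the paper):
# for a fixed target output power, splitting it over more unit PAs means
# smaller devices (assumed more efficient) but more combiner loss.
P_target_w = 1.0            # desired system output power, W
loss_per_stage_db = 0.6     # insertion loss of each binary zero-degree combining stage
P_driver_w = 0.05           # driver DC power per unit PA, W

def unit_efficiency(p_unit_w):
    """Assumed device-size dependence: larger devices lose efficiency (toy model)."""
    return max(0.05, 0.45 - 0.25 * p_unit_w)

def system_pae(n_way):
    # Simplified PAE: output power over total DC power, driver consumption included.
    stages = int(np.log2(n_way))
    comb_eff = 10 ** (-stages * loss_per_stage_db / 10)   # combiner power efficiency
    p_unit = P_target_w / (n_way * comb_eff)              # power each unit must deliver
    p_dc = n_way * (p_unit / unit_efficiency(p_unit) + P_driver_w)
    return 100 * P_target_w / p_dc

for n in (1, 2, 4, 8, 16):
    print(f"{n:2d}-way combining: PAE ≈ {system_pae(n):4.1f} %")
```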
Deterministic Multiaxial Creep and Creep Rupture Enhancements for CARES/Creep Integrated Design Code
NASA Technical Reports Server (NTRS)
Jadaan, Osama M.
1998-01-01
High temperature and long duration applications of monolithic ceramics can place their failure mode in the creep rupture regime. A previous model advanced by the authors described a methodology by which the creep rupture life of a loaded component can be predicted. That model was based on the life fraction damage accumulation rule in association with the modified Monkman-Grant creep rupture criterion. However, that model did not take into account the deteriorating state of the material due to creep damage (e.g., cavitation) as time elapsed. In addition, the material creep parameters used in that life prediction methodology were based on uniaxial creep curves displaying primary and secondary creep behavior, with no tertiary regime. The objective of this paper is to present a creep life prediction methodology based on a modified form of the Kachanov-Rabotnov continuum damage mechanics (CDM) theory. In this theory, the uniaxial creep rate is described in terms of stress, temperature, time, and the current state of material damage. This scalar damage state parameter is basically an abstract measure of the current state of material damage due to creep deformation. The damage rate is assumed to vary with stress, temperature, time, and the current state of damage itself. Multiaxial creep and creep rupture formulations of the CDM approach are presented in this paper. Parameter estimation methodologies based on nonlinear regression analysis are also described for both isothermal constant stress states and anisothermal variable stress conditions. This creep life prediction methodology was preliminarily added to the integrated design code CARES/Creep (Ceramics Analysis and Reliability Evaluation of Structures/Creep), which is a postprocessor program for commercially available finite element analysis (FEA) packages. Two examples, showing comparisons between experimental and predicted creep lives of ceramic specimens, are used to demonstrate the viability of this methodology and the CARES/Creep program.
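For orientation, the sketch below integrates the standard Kachanov-Rabotnov coupled creep/damage equations at constant stress and temperature and compares the numerical rupture time with the closed-form result; the material constants are placeholders, and this is not the CARES/Creep multiaxial formulation itself.

```python
# Standard Kachanov-Rabotnov forms (constants below are illustrative placeholders):
#   strain rate : d(eps)/dt = A * sigma**n / (1 - w)**m
#   damage rate : d(w)/dt   = B * sigma**chi / (1 - w)**phi
A, n_exp, m_exp = 1.0e-12, 4.0, 2.0      # creep constants
B, chi, phi     = 5.0e-11, 3.5, 3.0      # damage constants
sigma = 100.0                            # applied stress, MPa (constant, isothermal)

def integrate(dt=1.0):
    """Explicit Euler integration until damage approaches 1 (taken as rupture)."""
    t, eps, w = 0.0, 0.0, 0.0
    while w < 0.95:
        eps += dt * A * sigma**n_exp / (1.0 - w)**m_exp
        w   += dt * B * sigma**chi    / (1.0 - w)**phi
        t   += dt
    return t, eps

t_rupture, eps_rupture = integrate()
# Closed-form rupture time for constant stress: t_r = 1 / (B*(phi+1)*sigma**chi)
t_closed = 1.0 / (B * (phi + 1) * sigma**chi)
print(f"numerical rupture time : {t_rupture:,.0f} h")
print(f"closed-form rupture    : {t_closed:,.0f} h")
print(f"accumulated creep strain near rupture: {eps_rupture:.4f}")
```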
In Defense of the Randomized Controlled Trial for Health Promotion Research
Rosen, Laura; Manor, Orly; Engelhard, Dan; Zucker, David
2006-01-01
The overwhelming evidence about the role lifestyle plays in mortality, morbidity, and quality of life has pushed the young field of modern health promotion to center stage. The field is beset with intense debate about appropriate evaluation methodologies. Increasingly, randomized designs are considered inappropriate for health promotion research. We have reviewed criticisms against randomized trials that raise philosophical and practical issues, and we will show how most of these criticisms can be overcome with minor design modifications. By providing rebuttal to arguments against randomized trials, our work contributes to building a sound methodological base for health promotion research. PMID:16735622
NASA Technical Reports Server (NTRS)
Jones, Corey; LaPha, Steven
2013-01-01
This presentation will focus on the modernization of design and engineering practices through the use of Model Based Definition methodology. By gathering important engineering data into one 3D digital data set, applying model annotations, and setting up model view states directly in the 3D CAD model, model-specific information can be published to Windchill and CreoView for use during the Design Review Process. This presentation will describe the methods that have been incorporated into the modeling.
Integrating industrial seminars within a graduate engineering programme
NASA Astrophysics Data System (ADS)
Ringwood, John. V.
2013-05-01
The benefit of external, often industry-based, speakers for a seminar series associated with both undergraduate and graduate programmes is relatively unchallenged. However, the means by which such a seminar series can be encapsulated within a structured learning module, and the appropriate design of an accompanying assessment methodology, is not so obvious. This paper examines how such a learning module can be formulated and addresses the main issues involved in the design of such a module, namely the selection of speakers, format of seminars, method of delivery and assessment methodology, informed by the objectives of the module.
Value-centric design architecture based on analysis of space system characteristics
NASA Astrophysics Data System (ADS)
Xu, Q.; Hollingsworth, P.; Smith, K.
2018-03-01
Emerging design concepts such as miniaturisation, modularity, and standardisation have contributed to the rapid development of small and inexpensive platforms, particularly cubesats. This is stimulating a revolution in space system design and development, leading satellites into the era of "smaller, faster, and cheaper". However, the current requirement-centric design philosophy, focused on bespoke monolithic systems, along with the associated development and production process, does not inherently fit with these innovative modular, standardised, and mass-produced technologies. This paper presents a new categorisation, characterisation, and value-centric design architecture to address this need for both traditional and novel system designs. Based on the categorisation of system configurations, a characterisation of space systems, comprised of duplication, fractionation, and derivation, is proposed to capture the overall system configuration characteristics and promote potential hybrid designs. Complying with the definitions of the system characterisation, mathematical mapping relations between the system characterisation and the system properties are described to establish the mathematical foundation of the proposed value-centric design methodology. To illustrate the methodology, subsystem reliability relationships are analysed to explore potential system configurations in the design space. The results of the system characteristic analysis clearly show that the effects of different configuration characteristics on the system properties can be effectively analysed and evaluated, enabling the optimisation of system configurations.
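As a small illustration of the kind of subsystem reliability relationships mentioned above, the sketch below compares a monolithic unit against duplicated and fractionated (k-out-of-n) arrangements under an exponential failure model; the failure rate and mission time are assumed values, not figures from the paper.

```python
from math import comb, exp

# Exponential subsystem reliability over a mission time t (assumed failure rate).
def reliability(lmbda, t):
    return exp(-lmbda * t)

def k_out_of_n(k, n, r):
    """Reliability of a k-out-of-n arrangement of identical, independent units."""
    return sum(comb(n, i) * r**i * (1 - r)**(n - i) for i in range(k, n + 1))

t = 5 * 8760.0                    # 5-year mission, hours
r_unit = reliability(2e-6, t)     # single payload unit (failure rate assumed)

print(f"single unit reliability           : {r_unit:.4f}")
print(f"duplicated, 1-out-of-2 (hot spare): {k_out_of_n(1, 2, r_unit):.4f}")
print(f"fractionated, 3-out-of-4 units    : {k_out_of_n(3, 4, r_unit):.4f}")
```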
Modern proposal of methodology for retrieval of characteristic synthetic rainfall hyetographs
NASA Astrophysics Data System (ADS)
Licznar, Paweł; Burszta-Adamiak, Ewa; Łomotowski, Janusz; Stańczyk, Justyna
2017-11-01
The modern engineering practice of designing and modelling complex drainage systems is based on hydrodynamic modelling and has a probabilistic character. Its practical application requires a change in the rainfall models accepted at the input. Previously used artificial rainfall models of simplified form, e.g. block precipitation or Euler's type II model rainfall, are no longer sufficient. There is an urgent need to clarify the methodology for standardized rainfall hyetographs that take into consideration the specifics of local storm rainfall temporal dynamics. The aim of the paper is to present a proposal for an innovative methodology for determining standardized rainfall hyetographs, based on statistical processing of collections of actual local precipitation characteristics. The proposed methodology is based on the classification of standardized rainfall hyetographs with the use of cluster analysis. Its application is presented on the example of selected rain gauges located in Poland. The synthetic rainfall hyetographs obtained as a final result may be used for hydrodynamic modelling of sewerage systems, including probabilistic determination of the necessary capacity of retention reservoirs.
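A minimal sketch of the classification step is shown below: synthetic rainfall events are standardized into dimensionless cumulative hyetographs and grouped by k-means clustering, whose centroids play the role of characteristic synthetic hyetographs. The synthetic events and the choice of k-means are illustrative assumptions rather than the authors' exact procedure.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)

# Synthetic 30-minute rainfall events sampled at 5-min resolution (depths in mm).
def synthetic_event(peak_position):
    depths = rng.gamma(shape=2.0, scale=1.0, size=6)
    depths[peak_position] += rng.uniform(3, 6)     # force an early or late peak
    return depths

events = np.array([synthetic_event(rng.choice([0, 1])) for _ in range(40)]
                  + [synthetic_event(rng.choice([4, 5])) for _ in range(40)])

# Standardize: dimensionless cumulative hyetograph (fraction of total depth vs. time).
cum = np.cumsum(events, axis=1)
standardized = cum / cum[:, [-1]]

# Cluster the standardized shapes; cluster centroids play the role of
# characteristic synthetic hyetographs for the gauge.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(standardized)
for c, centroid in enumerate(km.cluster_centers_):
    print(f"cluster {c}: {np.sum(km.labels_ == c)} events, "
          f"centroid cumulative fractions = {np.round(centroid, 2)}")
```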
NASA Astrophysics Data System (ADS)
Li, Leihong
A modular structural design methodology for composite blades is developed. This design method can be used to design composite rotor blades with sophisticated geometric cross-sections. The method hierarchically decomposes the highly coupled interdisciplinary rotor analysis into global and local levels. At the global level, aeroelastic response analysis and rotor trim are conducted based on multi-body dynamic models. At the local level, variational asymptotic beam sectional analysis methods are used to obtain the equivalent one-dimensional beam properties. Compared with the traditional design methodology, the proposed method is more efficient and accurate. The proposed method is then used to study three design problems that have not been investigated before. The first is the addition of manufacturing constraints to the design optimization. The introduction of manufacturing constraints complicates the optimization process; however, a design that satisfies manufacturing constraints benefits the manufacturing process and reduces the risk of violating major performance constraints. Next, a new design procedure for structural design against fatigue failure is proposed. This procedure combines fatigue analysis with the optimization process. The durability or fatigue analysis employs a strength-based model. The design is subject to stiffness, frequency, and durability constraints. Finally, the impacts of manufacturing uncertainty on rotor blade aeroelastic behavior are investigated, and a probabilistic design method is proposed to control the impacts of uncertainty on blade structural performance. The uncertainty factors include dimensions, shapes, material properties, and service loads.
E.M. (Ted) Bilek
2007-01-01
The model ChargeOut! was developed to determine charge-out rates or rates of return for machines and capital equipment. This paper introduces a costing methodology and applies it to a piece of capital equipment. Although designed for the forest industry, the methodology is readily transferable to other sectors. Based on discounted cash-flow analysis, ChargeOut!...
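A minimal sketch of the underlying discounted cash-flow idea is given below: find the hourly charge-out rate at which the present value of charged hours covers the present value of ownership and operating costs. The machine figures are invented for illustration and are not taken from ChargeOut!.

```python
# Solve for the hourly charge-out rate that makes the NPV of revenues equal
# the NPV of costs. All machine data are illustrative, not the ChargeOut! model.
purchase_price = 250_000.0         # $
salvage_value  = 50_000.0          # $ at end of life
life_years     = 5
hours_per_year = 1_500.0
annual_operating_cost = 60_000.0   # fuel, labour, maintenance, $
discount_rate  = 0.08

def pv(amount, year):
    """Present value of an amount received or paid at the end of `year`."""
    return amount / (1 + discount_rate) ** year

pv_costs = purchase_price - pv(salvage_value, life_years) + sum(
    pv(annual_operating_cost, y) for y in range(1, life_years + 1))
pv_hours = sum(pv(hours_per_year, y) for y in range(1, life_years + 1))

charge_out_rate = pv_costs / pv_hours
print(f"break-even charge-out rate: ${charge_out_rate:.2f} per machine hour")
```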
Rita C.L.B. Rodrigues; William R. Kenealy; Diane Dietrich; Thomas W. Jeffries
2012-01-01
Response surface methodology (RSM), based on a 2² full factorial design, evaluated the moisture effects in recovering xylose by diethyloxalate (DEO) hydrolysis. Experiments were carried out in laboratory reactors (10 mL glass ampoules) containing corn stover (0.5 g) properly ground. The ampoules were kept at 160 °C for 90 min. Both DEO...
Adapting Rational Unified Process (RUP) approach in designing a secure e-Tendering model
NASA Astrophysics Data System (ADS)
Mohd, Haslina; Robie, Muhammad Afdhal Muhammad; Baharom, Fauziah; Darus, Norida Muhd; Saip, Mohamed Ali; Yasin, Azman
2016-08-01
e-Tendering is the electronic processing of tender documents via the internet, allowing tenderers to publish, communicate, access, receive and submit all tender-related information and documentation online. This study aims to design an e-Tendering system using the Rational Unified Process (RUP) approach. RUP provides a disciplined approach to assigning tasks and responsibilities within the software development process. RUP has four phases that can assist researchers in adjusting to the requirements of projects with different scopes, problems and sizes. RUP is characterized as a use-case driven, architecture-centered, iterative and incremental process model. However, the scope of this study focuses only on the Inception and Elaboration phases as steps to develop the model, and performs only three of the nine workflows (business modeling, requirements, and analysis and design). RUP has a strong focus on documents, and the activities in the Inception and Elaboration phases mainly concern the creation of diagrams and the writing of textual descriptions. The UML notation and the software program Star UML are used to support the design of the e-Tendering system. The e-Tendering design based on the RUP approach can contribute to e-Tendering developers and researchers in the e-Tendering domain. In addition, this study shows that RUP is one of the best system development methodologies and can serve as a research methodology in the Software Engineering (SE) domain for the secure design of any observed application. This methodology has been tested in various studies in domains such as simulation-based decision support, security requirement engineering, business modeling and secure system requirements. In conclusion, these studies show that RUP is a good research methodology that can be adapted in any SE research domain that requires a few artifacts to be generated, such as use case models, misuse case models, activity diagrams, and an initial class diagram derived from the list of requirements identified earlier by the SE researchers.
FINDING A METHOD FOR THE MADNESS: A COMPARATIVE ANALYSIS OF STRATEGIC DESIGN METHODOLOGIES
2017-06-01
A thesis by Amanda Donnelly. The work develops a comparative model for strategic design methodologies, focusing on the primary elements of vision, time, process, communication and collaboration, and risk assessment. My analysis dissects and compares three potential design methodologies including net assessment, scenarios and
Organisational Memories in Project-Based Companies: An Autopoietic View
ERIC Educational Resources Information Center
Koskinen, Kaj U.
2010-01-01
Purpose: The purpose of this paper is to describe project-based companies' knowledge production and memory development with the help of autopoietic epistemology. Design/methodology/approach: The discussion first defines the concept of a project-based company. Then the discussion deals with the two epistemological assumptions, namely cognitivist…
Computer-Based Training: Capitalizing on Lessons Learned
ERIC Educational Resources Information Center
Bedwell, Wendy L.; Salas, Eduardo
2010-01-01
Computer-based training (CBT) is a methodology for providing systematic, structured learning; a useful tool when properly designed. CBT has seen a resurgence given the serious games movement, which is at the forefront of integrating primarily entertainment computer-based games into education and training. This effort represents a multidisciplinary…
ERIC Educational Resources Information Center
Cantor, Jeffrey A.
This paper describes a formative/summative process for educational program evaluation, which is appropriate for higher education programs and is based on M. Provus' Discrepancy Evaluation Model and the principles of instructional design. The Discrepancy Based Methodology for Educational Program Evaluation facilitates systematic and detailed…
A methodology for collecting valid software engineering data
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Weiss, David M.
1983-01-01
An effective data collection method for evaluating software development methodologies and for studying the software development process is described. The method uses goal-directed data collection to evaluate methodologies with respect to the claims made for them. Such claims are used as a basis for defining the goals of the data collection, establishing a list of questions of interest to be answered by data analysis, defining a set of data categorization schemes, and designing a data collection form. The data to be collected are based on the changes made to the software during development, and are obtained when the changes are made. To insure accuracy of the data, validation is performed concurrently with software development and data collection. Validation is based on interviews with those people supplying the data. Results from using the methodology show that data validation is a necessary part of change data collection. Without it, as much as 50% of the data may be erroneous. Feasibility of the data collection methodology was demonstrated by applying it to five different projects in two different environments. The application showed that the methodology was both feasible and useful.
Situated Research Design and Methodological Choices in Formative Program Evaluation
ERIC Educational Resources Information Center
Supovitz, Jonathan
2013-01-01
Design-based implementation research offers the opportunity to rethink the relationships between intervention, research, and situation to better attune research and evaluation to the program development process. Using a heuristic called the intervention development curve, I describe the rough trajectory that programs typically follow as they…
Beam-Flattener Design for High Energy Radiographic Inspection
NASA Technical Reports Server (NTRS)
Grandin, Robert; Rudolphi, Thomas
2009-01-01
This report documents the work done to develop a beam flattener for use in the inspection of rocket motors at ATK Space Systems Utah facilities. The following pages provide a brief introduction to the necessity of this project, comprehensive description of the design methodology, and experimentally-based conclusions regarding project success.
Non-Linear Modeling of Growth Prerequisites in a Finnish Polytechnic Institution of Higher Education
ERIC Educational Resources Information Center
Nokelainen, Petri; Ruohotie, Pekka
2009-01-01
Purpose: This study aims to examine the factors of growth-oriented atmosphere in a Finnish polytechnic institution of higher education with categorical exploratory factor analysis, multidimensional scaling and Bayesian unsupervised model-based visualization. Design/methodology/approach: This study was designed to examine employee perceptions of…
Emerging Models of the New Paradigm.
ERIC Educational Resources Information Center
Howser, Lee; Schwinn, Carole
Working with the Philadelphia-based Institute of Interactive Management, several teams at Jackson Community College (JCC), in Michigan, set out in 1994 to learn and apply an interactive design methodology to selected college subsystems. Interactive design begins with understanding problems faced by the system as a whole, which in the case of JCC…
Reporting and methodological quality of meta-analyses in urological literature.
Xia, Leilei; Xu, Jing; Guzzo, Thomas J
2017-01-01
To assess the overall quality of published urological meta-analyses and identify predictive factors for high quality. We systematically searched PubMed to identify meta-analyses published from January 1st, 2011 to December 31st, 2015 in 10 predetermined major paper-based urology journals. The characteristics of the included meta-analyses were collected, and their reporting and methodological qualities were assessed with the PRISMA checklist (27 items) and the AMSTAR tool (11 items), respectively. Descriptive statistics were used for individual items as a measure of overall compliance, and PRISMA and AMSTAR scores were calculated as the sum of adequately reported domains. Logistic regression was used to identify predictive factors for high quality. A total of 183 meta-analyses were included. The mean PRISMA and AMSTAR scores were 22.74 ± 2.04 and 7.57 ± 1.41, respectively. PRISMA item 5 (protocol and registration), items 15 and 22 (risk of bias across studies), and items 16 and 23 (additional analysis) had less than 50% adherence. AMSTAR item 1 ("a priori" design), item 5 (list of studies) and item 10 (publication bias) had less than 50% adherence. Logistic regression analyses showed that funding support and "a priori" design were associated with superior reporting quality, while following the PRISMA guideline and "a priori" design were associated with superior methodological quality. The reporting and methodological qualities of recently published meta-analyses in major paper-based urology journals are generally good. Further improvement could potentially be achieved by strictly adhering to the PRISMA guideline and having an "a priori" protocol.
NASA Astrophysics Data System (ADS)
Siade, Adam J.; Hall, Joel; Karelse, Robert N.
2017-11-01
Regional groundwater flow models play an important role in decision making regarding water resources; however, the uncertainty embedded in model parameters and model assumptions can significantly hinder the reliability of model predictions. One way to reduce this uncertainty is to collect new observation data from the field. However, determining where and when to obtain such data is not straightforward. A number of data-worth and experimental design strategies have been developed for this purpose. However, these studies often ignore issues related to real-world groundwater models, such as computational expense, existing observation data, and high parameter dimensionality. In this study, we propose a methodology, based on existing methods and software, to efficiently conduct such analyses for large-scale, complex regional groundwater flow systems for which there is a wealth of available observation data. The method utilizes the well-established D-optimality criterion, and the minimax criterion for robust sampling strategies. The so-called Null-Space Monte Carlo method is used to reduce the computational burden associated with uncertainty quantification. In addition, a heuristic methodology based on the concept of the greedy algorithm is proposed for developing robust designs with subsets of the posterior parameter samples. The proposed methodology is tested on a synthetic regional groundwater model and subsequently applied to an existing, complex, regional groundwater system in the Perth region of Western Australia. The results indicate that robust designs can be obtained efficiently, within reasonable computational resources, for making regional decisions regarding groundwater level sampling.
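The sketch below illustrates a greedy, minimax-flavoured selection step on random stand-in sensitivities: candidate observations are added one at a time to maximize the worst-case gain in a D-optimality criterion (log-determinant of a Fisher-type information matrix) across a handful of posterior parameter samples. The Jacobians, budget, and ridge term are assumptions, not outputs of the Perth model.

```python
import numpy as np

rng = np.random.default_rng(4)

n_candidates, n_params, n_posterior = 30, 6, 5

# Sensitivity (Jacobian) rows of each candidate observation w.r.t. the parameters,
# one set per posterior parameter sample (random stand-ins for model sensitivities).
J = rng.normal(size=(n_posterior, n_candidates, n_params))

def log_det_information(selected, jac, ridge=1e-6):
    X = jac[selected]                       # rows of the selected observations
    M = X.T @ X + ridge * np.eye(n_params)  # Fisher-type information matrix
    return np.linalg.slogdet(M)[1]

budget, selected = 6, []
for _ in range(budget):
    best_gain, best_c = -np.inf, None
    for c in range(n_candidates):
        if c in selected:
            continue
        # Robust (minimax-style) criterion: worst-case gain over posterior samples.
        gain = min(log_det_information(selected + [c], J[s])
                   - log_det_information(selected, J[s]) for s in range(n_posterior))
        if gain > best_gain:
            best_gain, best_c = gain, c
    selected.append(best_c)
    print(f"pick candidate {best_c:2d}, worst-case log-det gain = {best_gain:.3f}")

print("selected monitoring locations:", selected)
```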
A dynamic multi-scale Markov model based methodology for remaining life prediction
NASA Astrophysics Data System (ADS)
Yan, Jihong; Guo, Chaozhong; Wang, Xing
2011-05-01
The ability to accurately predict the remaining life of partially degraded components is crucial in prognostics. In this paper, a performance degradation index is designed using multi-feature fusion techniques to represent the deterioration severity of facilities. Based on this indicator, an improved Markov model is proposed for remaining life prediction. The Fuzzy C-Means (FCM) algorithm is employed to perform state division for the Markov model in order to avoid the uncertainty of state division caused by hard division approaches. Considering the influence of both historical and real-time data, a dynamic prediction method is introduced into the Markov model through a weighting coefficient. Multi-scale theory is employed to solve the state division problem of multi-sample prediction. Consequently, a dynamic multi-scale Markov model is constructed. An experiment based on a Bently-RK4 rotor testbed was designed to validate the dynamic multi-scale Markov model; the experimental results illustrate the effectiveness of the methodology.
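As a small illustration of the remaining-life step, the sketch below takes a set of degradation states (which in the paper's workflow would come from FCM clustering of the fused degradation index) with an assumed transition matrix and computes the expected time to the absorbing failure state using the standard fundamental-matrix formula.

```python
import numpy as np

# Degradation states 0 (healthy) .. 3 (failed). The transition probabilities are
# simply assumed for illustration; in practice they would be estimated from data.
P = np.array([
    [0.95, 0.04, 0.01, 0.00],
    [0.00, 0.90, 0.08, 0.02],
    [0.00, 0.00, 0.85, 0.15],
    [0.00, 0.00, 0.00, 1.00],   # failure is absorbing
])

# Expected remaining life (in inspection intervals) from each transient state:
# t = (I - Q)^{-1} * 1, where Q is the transient-to-transient block of P.
Q = P[:3, :3]
fundamental = np.linalg.inv(np.eye(3) - Q)
expected_life = fundamental @ np.ones(3)

for state, life in enumerate(expected_life):
    print(f"state {state}: expected remaining life ≈ {life:.1f} intervals")
```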
The Research of Improving the Particleboard Glue Dosing Process Based on TRIZ Analysis
NASA Astrophysics Data System (ADS)
Yu, Huiling; Fan, Delin; Zhang, Yizhuo
This research creates a design methodology by synthesizing the Theory of Inventive Problem Solving (TRIZ) with cascade control based on a Smith predictor. The particleboard glue supplying and dosing system case study defines the problem and the solution using the methodology proposed in the paper. Status differences in the glue dosing process of particleboard production usually cause inaccurate glue dosing volumes. In order to solve this problem, we applied TRIZ technical contradictions and inventive principles to improve the key process of particleboard production. The improvement method mapped the inaccuracy problem to a TRIZ technical contradiction, and the prior action principle proposed a Smith predictor as the control algorithm in the glue dosing system. This research examines the usefulness of a TRIZ-based problem-solving process designed to improve the problem-solving ability of users in addressing difficult or recurring problems, and also testifies to the practicality and validity of TRIZ. Several suggestions are presented on how to approach this problem.
A Model for Oil-Gas Pipelines Cost Prediction Based on a Data Mining Process
NASA Astrophysics Data System (ADS)
Batzias, Fragiskos A.; Spanidis, Phillip-Mark P.
2009-08-01
This paper addresses the problems associated with the cost estimation of oil/gas pipelines during the elaboration of feasibility assessments. Techno-economic parameters, i.e., cost, length and diameter, are critical for such studies at the preliminary design stage. A methodology for the development of a cost prediction model based on a Data Mining (DM) process is proposed. The design and implementation of a Knowledge Base (KB), maintaining data collected from various disciplines of the pipeline industry, are presented. The formulation of a cost prediction equation is demonstrated by applying multiple regression analysis to data sets extracted from the KB. Following the proposed methodology, a learning context is inductively developed as background pipeline data are acquired, grouped and stored in the KB and, through a linear regression model, provide statistically substantial results useful for project managers and decision makers.
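A minimal sketch of the regression step is shown below: pipeline cost is fitted against length, diameter, and their interaction by ordinary least squares on a synthetic data set standing in for records extracted from the knowledge base; the coefficients and noise level are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic stand-in for pipeline records extracted from the knowledge base:
# length (km), diameter (inches) and realized cost (million USD).
n = 60
length   = rng.uniform(50, 800, n)
diameter = rng.uniform(12, 48, n)
cost = (5 + 0.9 * length + 1.8 * diameter + 0.015 * length * diameter
        + rng.normal(0, 30, n))                 # assumed "true" relation + noise

# Ordinary least squares: cost ~ b0 + b1*L + b2*D + b3*L*D
X = np.column_stack([np.ones(n), length, diameter, length * diameter])
beta, *_ = np.linalg.lstsq(X, cost, rcond=None)
pred = X @ beta
r2 = 1 - np.sum((cost - pred) ** 2) / np.sum((cost - cost.mean()) ** 2)

print("coefficients [b0, b1, b2, b3]:", np.round(beta, 3))
print(f"R^2 = {r2:.3f}")

# Screening-stage estimate for a hypothetical new project.
L_new, D_new = 300.0, 36.0
print(f"predicted cost for {L_new:.0f} km, {D_new:.0f}-inch line: "
      f"{np.dot(beta, [1, L_new, D_new, L_new * D_new]):.0f} MUSD")
```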
Recent developments of axial flow compressors under transonic flow conditions
NASA Astrophysics Data System (ADS)
Srinivas, G.; Raghunandana, K.; Satish Shenoy, B.
2017-05-01
The objective of this paper is to give a holistic view of the most advanced technologies and procedures practiced in the field of turbomachinery design. In CFD, the compressor flow solver relies on turbulence models to solve viscous flow problems. Popular techniques like Jameson's rotated difference scheme were used to solve the potential flow equation in transonic conditions for two-dimensional aerofoils and later for three-dimensional wings. The gradient-based method is also popular, especially for compressor blade shape optimization. Other available optimization techniques are evolutionary algorithms (EAs) and response surface methodology (RSM). It is observed that, in order to improve compressor flow solvers and obtain agreeable results, careful attention needs to be paid in CFD to viscous relations, grid resolution, turbulence modeling and artificial viscosity. Advanced techniques like Jameson's rotated difference scheme had the most substantial impact on aerofoil and wing design. For compressor blade shape optimization, evolutionary algorithms are simpler than gradient-based techniques because they handle the parameters simultaneously by searching from multiple points in the given design space. Response surface methodology is used to build empirical models of observed responses and to study experimental data systematically. This methodology analyses the relationship between expected responses (output) and design variables (input), solving the function systematically through a series of mathematical and statistical processes. RSM has recently been implemented successfully for turbomachinery blade optimization. Well-designed, high-performance axial flow compressors find application in air-breathing jet engines.
HSTDEK: Developing a methodology for construction of large-scale, multi-use knowledge bases
NASA Technical Reports Server (NTRS)
Freeman, Michael S.
1987-01-01
The primary research objectives of the Hubble Space Telescope Design/Engineering Knowledgebase (HSTDEK) are to develop a methodology for constructing and maintaining large scale knowledge bases which can be used to support multiple applications. To ensure the validity of its results, this research is being pursued in the context of a real world system, the Hubble Space Telescope. The HSTDEK objectives are described in detail. The history and motivation of the project are briefly described. The technical challenges faced by the project are outlined.
A methodology based on reduced complexity algorithm for system applications using microprocessors
NASA Technical Reports Server (NTRS)
Yan, T. Y.; Yao, K.
1988-01-01
The paper considers a methodology for the analysis and design of a minimum mean-square error criterion linear system incorporating a tapped delay line (TDL), where all the full-precision multiplications in the TDL are constrained to be powers of two. A linear equalizer based on a dispersive and additive-noise channel is presented. This microprocessor implementation with optimized power-of-two TDL coefficients achieves system performance comparable to optimum linear equalization with full-precision multiplications for an input data rate of 300 baud.
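The sketch below illustrates the basic idea on a toy dispersive channel: design a least-squares TDL equalizer, quantize its taps to signed powers of two, and compare output mean-square error against the full-precision design. The channel, tap count, and data are assumptions, not the paper's 300-baud system.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy dispersive channel with additive noise (assumed; not the paper's channel model).
channel = np.array([0.1, 0.9, 0.3])
symbols = rng.choice([-1.0, 1.0], size=5000)
received = np.convolve(symbols, channel, mode="full")[: len(symbols)]
received += rng.normal(0, 0.05, len(symbols))

# Full-precision MMSE-style TDL equalizer via least squares (decision delay of 1).
n_taps, delay = 7, 1
X = np.column_stack([np.roll(received, k) for k in range(n_taps)])[n_taps:]
d = symbols[n_taps - delay:len(symbols) - delay]
w_full, *_ = np.linalg.lstsq(X, d, rcond=None)

def nearest_power_of_two(w):
    """Quantize each tap to +/- 2**k (or 0 for very small taps)."""
    q = np.zeros_like(w)
    nz = np.abs(w) > 1e-3
    q[nz] = np.sign(w[nz]) * 2.0 ** np.round(np.log2(np.abs(w[nz])))
    return q

w_pow2 = nearest_power_of_two(w_full)

for name, w in [("full precision", w_full), ("power-of-two", w_pow2)]:
    mse = np.mean((X @ w - d) ** 2)
    print(f"{name:15s} taps: {np.round(w, 3)}  MSE = {mse:.4f}")
```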
Espié, Stéphane; Boubezoul, Abderrahmane; Aupetit, Samuel; Bouaziz, Samir
2013-09-01
Instrumented vehicles are key tools for the in-depth understanding of drivers' behaviours, and thus for the design of scientifically based countermeasures to reduce fatalities and injuries. The instrumentation of Powered Two-Wheelers (PTWs) has been less widely implemented than that of cars, in part due to the technical challenges involved. The last decade has seen the development in Europe of several tools and methodologies to study motorcycle riders' behaviours and motorcycle dynamics for a range of situations, including crash events involving falls. Thanks to these tools, a broad-ranging research programme has been conducted, from the design and tuning of real-time fall detection to the study of riding training systems, as well as studies focusing on naturalistic riding situations such as filtering and line splitting. The methodology designed for the in-depth study of riders' behaviours in naturalistic situations can be based upon the combination of several sources of data, such as PTW sensors, a context-based video retrieval system, the Global Positioning System (GPS) and verbal data on the riders' decision-making processes. The goals of this paper are: (1) to present the methodological tools developed and used by INRETS-MSIS (now Ifsttar-TS2/Simu) in the last decade for the study of riders' behaviours in real-world environments as well as on track for situations up to falls, (2) to illustrate the kind of results that can be gained from the conducted studies, (3) to identify the advantages and limitations of the proposed methodology for conducting large-scale naturalistic riding studies, and (4) to highlight how the knowledge gained from this approach will fill many of the knowledge gaps about PTW riders' behaviours and risk factors. Copyright © 2013 Elsevier Ltd. All rights reserved.
Susceptibility of Redundant Versus Singular Clock Domains Implemented in SRAM-Based FPGA TMR Designs
NASA Technical Reports Server (NTRS)
Berg, Melanie D.; LaBel, Kenneth A.; Pellish, Jonathan
2016-01-01
We present the challenges that arise when using redundant clock domains due to their clock skew. Radiation data show that a singular clock domain (DTMR) provides an improved triple modular redundancy (TMR) methodology for SRAM-based FPGAs over redundant clocks.
ERIC Educational Resources Information Center
Bozkurt, Ipek; Helm, James
2013-01-01
This paper develops a systems engineering-based framework to assist in the design of an online engineering course. Specifically, the purpose of the framework is to provide a structured methodology for the design, development and delivery of a fully online course, either brand new or modified from an existing face-to-face course. The main strength…
An Analysis of Factors that Inhibit Business Use of User-Centered Design Principles: A Delphi Study
ERIC Educational Resources Information Center
Hilton, Tod M.
2010-01-01
The use of user-centered design (UCD) principles has a positive impact on the use of web-based interactive systems in customer-centric organizations. User-centered design methodologies are not widely adopted in organizations due to intraorganizational factors. A qualitative study using a modified Delphi technique was used to identify the factors…
Augmented Reality M-Learning to Enhance Nursing Skills Acquisition in the Clinical Skills Laboratory
ERIC Educational Resources Information Center
Garrett, Bernard M.; Jackson, Cathryn; Wilson, Brian
2015-01-01
Purpose: This paper aims to report on a pilot research project designed to explore if new mobile augmented reality (AR) technologies have the potential to enhance the learning of clinical skills in the lab. Design/methodology/approach: An exploratory action-research-based pilot study was undertaken to explore an initial proof-of-concept design in…
ERIC Educational Resources Information Center
Lau, Kung Wong; Kan, Chi Wai; Lee, Pui Yuen
2017-01-01
Purpose: The purpose of this paper is to discuss the use of stereoscopic virtual technology in textile and fashion studies, particularly in the area of chemical experiments. The development of a designed virtual platform, called Stereoscopic Chemical Laboratory (SCL), is introduced. Design/methodology/approach: To implement the suggested…
NASA Astrophysics Data System (ADS)
Acri, Antonio; Offner, Guenter; Nijman, Eugene; Rejlek, Jan
2016-10-01
Noise legislation and increasing customer demands determine the Noise, Vibration and Harshness (NVH) development of modern commercial vehicles. In order to meet the stringent legislative requirements for vehicle noise emission, exact knowledge of all vehicle noise sources and their acoustic behavior is required. Transfer path analysis (TPA) is a fairly well established technique for estimating and ranking individual low-frequency noise or vibration contributions via the different transmission paths. Transmission paths from different sources to target points of interest and their contributions can be analyzed by applying TPA. However, this technique is applied to test measurements, which are only available on prototypes at the end of the design process. In order to overcome the limits of TPA, a numerical transfer path analysis methodology based on the substructuring of a multibody system is proposed in this paper. Being based on numerical simulation, this methodology can be performed from the first steps of the design process. The main target of the proposed methodology is to obtain information on the noise source contributions of a dynamic system, considering the possibility of multiple forces acting on the system simultaneously. The contributions of these forces are investigated with particular focus on distributed or moving forces. In this paper, the mathematical basics of the proposed methodology and its advantages in comparison with TPA are discussed. Then, a dynamic system is investigated with a combination of two methods. Being based on the dynamic substructuring (DS) of the investigated model, the proposed methodology requires the evaluation of the contact forces at the interfaces, which are computed with a flexible multi-body dynamic (FMBD) simulation. Then, the structure-borne noise paths are computed with the wave based method (WBM). As an example application, a 4-cylinder engine is investigated and the proposed methodology is applied to the engine block. The aim is to obtain accurate and clear relationships between the excitations and responses of the simulated dynamic system, analyzing the noise and vibration sources inside a car engine and showing the main advantages of a numerical methodology.
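To make the path-contribution bookkeeping concrete, the sketch below (a toy illustration, not the FMBD/WBM tool chain described above) sums per-path partial responses as the product of each path's frequency response function and its interface force spectrum; the FRF and force arrays are random placeholders.

```python
import numpy as np

# Toy path-contribution bookkeeping: the response at the target point is the
# superposition of (FRF x interface force) over all structure-borne paths.
# H and F below are random placeholders, not FMBD/WBM results.
rng = np.random.default_rng(0)
freqs = np.linspace(20.0, 200.0, 16)                          # Hz
n_paths = 3
H = (rng.normal(size=(n_paths, freqs.size))
     + 1j * rng.normal(size=(n_paths, freqs.size))) * 1e-3    # FRFs, (m/s^2)/N
F = rng.normal(size=(n_paths, freqs.size)) * 50.0             # force spectra, N

path_contrib = H * F                     # partial response of each path
total = path_contrib.sum(axis=0)         # superposed response at the target

for i, c in enumerate(path_contrib):
    share = np.linalg.norm(c) / np.linalg.norm(total)
    print(f"path {i}: relative contribution {share:.2f}")
```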
Designing an activity-based costing model for a non-admitted prisoner healthcare setting.
Cai, Xiao; Moore, Elizabeth; McNamara, Martin
2013-09-01
To design and deliver an activity-based costing model within a non-admitted prisoner healthcare setting. Key phases from the NSW Health clinical redesign methodology were utilised: diagnostic, solution design and implementation. The diagnostic phase utilised a range of strategies to identify issues requiring attention in the development of the costing model. The solution design phase conceptualised distinct 'building blocks' of activity and cost based on the speciality of clinicians providing care. These building blocks enabled the classification of activity and comparisons of costs between similar facilities. The implementation phase validated the model. The project generated an activity-based costing model based on actual activity performed, gained acceptability among clinicians and managers, and provided the basis for ongoing efficiency and benchmarking efforts.
Aeroelastic optimization methodology for viscous and turbulent flows
NASA Astrophysics Data System (ADS)
Barcelos Junior, Manuel Nascimento Dias
2007-12-01
In recent years, the development of faster computers and parallel processing has allowed the application of high-fidelity analysis methods to the aeroelastic design of aircraft. However, these methods are restricted to final design verification, mainly due to the computational cost involved in iterative design processes. Therefore, this work is concerned with the creation of a robust and efficient aeroelastic optimization methodology for inviscid, viscous and turbulent flows by using high-fidelity analysis and sensitivity analysis techniques. Most of the research in aeroelastic optimization, for practical reasons, treats the aeroelastic system as a quasi-static inviscid problem. In this work, as a first step toward the creation of a more complete aeroelastic optimization methodology for realistic problems, an analytical sensitivity computation technique was developed and tested for quasi-static aeroelastic viscous and turbulent flow configurations. Viscous and turbulent effects are included by using an averaged discretization of the Navier-Stokes equations, coupled with an eddy-viscosity turbulence model. For quasi-static aeroelastic problems, the traditional staggered solution strategy has unsatisfactory performance when applied to cases with strong fluid-structure coupling. Consequently, this work also proposes a solution methodology for aeroelastic and sensitivity analyses of quasi-static problems, which is based on the fixed point of an iterative nonlinear block Gauss-Seidel scheme. The methodology can also be interpreted as the solution of the Schur complement of the linearized systems of equations for the aeroelastic and sensitivity analyses. The methodologies developed in this work are tested and verified using realistic aeroelastic systems.
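As a rough illustration of the fixed-point idea behind the nonlinear block Gauss-Seidel coupling, the sketch below alternates between a stand-in fluid solver and a stand-in structural solver until the interface state stops changing; fluid_load() and structure_deflect() are hypothetical one-line surrogates for the high-fidelity solvers.

```python
from math import tanh

# Nonlinear block Gauss-Seidel (fixed-point) iteration for a quasi-static
# fluid-structure problem; fluid_load() and structure_deflect() are hypothetical
# surrogates for the high-fidelity fluid and structural solvers.
def fluid_load(u):
    return 1.0 + 0.3 * tanh(u)        # aerodynamic load as a function of deflection

def structure_deflect(f):
    return 0.8 * f                    # structural deflection under the applied load

u = 0.0                               # initial guess for the interface deflection
for k in range(100):
    f = fluid_load(u)                 # "fluid" sub-step with the structure frozen
    u_new = structure_deflect(f)      # "structural" sub-step with the load frozen
    if abs(u_new - u) < 1e-12:        # fixed point reached: coupled equilibrium
        u = u_new
        break
    u = u_new
print(f"converged after {k + 1} iterations: deflection = {u:.8f}")
```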
NASA Astrophysics Data System (ADS)
Sirirojvisuth, Apinut
In complex aerospace system design, making an effective design decision requires multidisciplinary knowledge from both product and process perspectives. Integrating manufacturing considerations into the design process is most valuable during the early design stages, since designers have more freedom to integrate new ideas when changes are relatively inexpensive in terms of time and effort. Several metrics related to manufacturability are cost, time, and manufacturing readiness level (MRL). Yet there is a lack of structured methodology that quantifies how changes in design decisions impact these metrics. As a result, a new set of integrated cost analysis tools is proposed in this study to quantify the impacts. Equally important is the capability to integrate this new cost tool into the existing design methodologies without sacrificing the agility and flexibility required during the early design phases. To demonstrate the applicability of this concept, a ModelCenter environment is used to develop a software architecture that represents the Integrated Product and Process Development (IPPD) methodology used in several aerospace system designs. The environment seamlessly integrates product and process analysis tools and makes an effective transition from one design phase to the next while retaining knowledge gained a priori. Then, an advanced cost estimating tool called the Hybrid Lifecycle Cost Estimating Tool (HLCET), a hybrid combination of weight-, process-, and activity-based estimating techniques, is integrated with the design framework. A new weight-based lifecycle cost model is created based on Tailored Cost Model (TCM) equations [3]. This lifecycle cost tool estimates the program cost based on vehicle component weights and programmatic assumptions. Additional high-fidelity cost tools, such as process-based and activity-based cost analysis methods, can be used to modify the baseline TCM result as more knowledge is accumulated over design iterations. Therefore, with this concept, the additional manufacturing knowledge can be used to identify a more accurate lifecycle cost and facilitate higher-fidelity tradeoffs during conceptual and preliminary design. The Advanced Composite Cost Estimating Model (ACCEM) is employed as a process-based cost component to replace the original TCM estimate of the composite part production cost. The reason for the replacement is that TCM estimates production costs from part weights on the basis of subtractive manufacturing of metallic origin, such as casting, forging, and machining processes. A complexity factor can sometimes be adjusted to reflect different types of metal and machine settings. The TCM assumption, however, gives erroneous results when applied to additive processes like those of composite manufacturing. Another innovative aspect of this research is the introduction of a work measurement technique called the Maynard Operation Sequence Technique (MOST) to be used, similarly to the Activity-Based Costing (ABC) approach, to estimate the manufacturing time of a part by breaking down the operations that occur during its production. ABC allows a realistic determination of the cost incurred in each activity, as opposed to the traditional method of time estimation by analogy or by using response surface equations from historical process data. The MOST concept provides a tailored study of an individual process typically required for a new, innovative design. Nevertheless, the MOST idea has some challenges, one of which is its requirement to build a new process from the ground up.
The process development requires Subject Matter Expertise (SME) in the manufacturing method of the particular design. The SME must also have a comprehensive understanding of the MOST system so that the correct parameters are chosen. In practice, these knowledge requirements may demand people from outside the design discipline and prior training in MOST. To relieve this constraint, this study includes an entirely new sub-system architecture that comprises 1) a knowledge-based system to provide the required knowledge during process selection; and 2) a new user interface to guide the parameter selection when building the process using MOST. Also included in this study is a demonstration of how the HLCET and its constituents can be integrated with Georgia Tech's Integrated Product and Process Development (IPPD) methodology. The applicability of this work is shown through a complex aerospace design example to gain insights into how manufacturing knowledge helps make better design decisions during the early stages. The setup process is explained, with an example of its utility demonstrated in a hypothetical fighter aircraft wing redesign. An evaluation of the system's effectiveness against existing methodologies concludes the thesis.
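A minimal sketch of the weight-based baseline estimate that process-based corrections (such as ACCEM for composite parts) would later replace follows; the cost-estimating-relationship coefficients and component weights are illustrative assumptions, not TCM or ACCEM values.

```python
# Weight-based baseline cost estimate: cost = a * W^b per component, of the kind
# a process-based model would later override for composite parts. Coefficients
# and weights are illustrative assumptions, not TCM or ACCEM values.
components_lb = {"wing": 950.0, "fuselage": 1400.0, "empennage": 220.0}
a, b = 2.5e3, 0.7                                    # assumed CER constants

def component_cost(weight_lb):
    return a * weight_lb ** b                        # cost-estimating relationship

baseline = {name: component_cost(w) for name, w in components_lb.items()}
print({name: round(cost) for name, cost in baseline.items()},
      "total:", round(sum(baseline.values())))
```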
Risk-based process safety assessment and control measures design for offshore process facilities.
Khan, Faisal I; Sadiq, Rehan; Husain, Tahir
2002-09-02
Process operation is the most hazardous activity next to transportation and drilling operations on an offshore oil and gas (OOG) platform. Past experience of onshore and offshore oil and gas activities has revealed that a minor mishap in process operations can escalate into a catastrophe. This is of special concern on an OOG platform due to the limited space and compact geometry of the process area, reduced ventilation, and difficult escape routes. On an OOG platform, each extra control measure that is implemented not only occupies space on the platform and increases congestion but also adds extra load to the platform. Eventualities in OOG platform process operations can be avoided by incorporating the appropriate control measures at the early design stage. In this paper, the authors describe a methodology for risk-based process safety decision making for OOG activities. The methodology is applied to various offshore process units, that is, the compressor, separators, flash drum and driers of an OOG platform. Based on the risk potential, appropriate safety measures are designed for each unit. This paper also illustrates that implementation of the designed safety measures reduces high fatal accident rate (FAR) values to an acceptable level.
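For reference, the fatal accident rate used in such risk rankings is commonly expressed as fatalities per 10^8 exposed hours; the sketch below computes it for illustrative before/after figures that are assumptions, not the paper's data.

```python
# Fatal accident rate (FAR): expected fatalities per 1e8 exposed working hours.
# The before/after figures below are illustrative, not the paper's results.
def fatal_accident_rate(fatalities_per_year, exposed_persons, hours_per_person_year):
    exposed_hours = exposed_persons * hours_per_person_year
    return fatalities_per_year / exposed_hours * 1e8

far_without = fatal_accident_rate(0.012, 120, 4000)   # unit without added safeguards
far_with = fatal_accident_rate(0.002, 120, 4000)      # unit with designed-in measures
print(f"FAR without measures: {far_without:.1f}, with measures: {far_with:.1f} "
      "(per 1e8 exposed hours)")
```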
Simulation environment based on the Universal Verification Methodology
NASA Astrophysics Data System (ADS)
Fiergolski, A.
2017-01-01
The Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The CDV flow differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan by setting the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to the device under test (DUT). Progress is measured by coverage monitors added to the simulation environment, so that non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new workflow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. Based on the experience from these projects, this paper briefly introduces UVM and presents a set of tips and advice applicable at different stages of the verification cycle.
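UVM testbenches are written in SystemVerilog; purely as a language-neutral illustration of the coverage-driven loop described above (constrained-random stimulus, self-checking scoreboard, functional coverage bins), here is a toy Python sketch in which the DUT, the reference model, and the bins are all hypothetical stand-ins.

```python
import random

# Language-neutral toy of a coverage-driven loop: constrained-random stimulus,
# a self-checking scoreboard, and functional coverage bins. dut(), the reference
# model, and the bin definitions are hypothetical stand-ins (UVM itself is
# SystemVerilog, not Python).
coverage_bins = {(op, low_half): False
                 for op in ("READ", "WRITE") for low_half in (True, False)}

def dut(op, addr):                    # stand-in for the device under test
    return addr ^ 0xFF if op == "READ" else addr

def reference_model(op, addr):        # scoreboard's golden behaviour
    return addr ^ 0xFF if op == "READ" else addr

while not all(coverage_bins.values()):
    op = random.choice(["READ", "WRITE"])                # constrained-random stimulus
    addr = random.randrange(0, 256)
    assert dut(op, addr) == reference_model(op, addr)    # self-checking comparison
    coverage_bins[(op, addr < 128)] = True               # mark the functional bin hit

print("functional coverage: 100% -", sorted(coverage_bins))
```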
NASA Technical Reports Server (NTRS)
Madrid, G. A.; Westmoreland, P. T.
1983-01-01
A progress report is presented on a program to upgrade the existing NASA Deep Space Network in terms of a redesigned computer-controlled data acquisition system for channelling tracking, telemetry, and command data between a California-based control center and three signal processing centers in Australia, California, and Spain. The methodology for the improvements is oriented towards single-subsystem development with consideration for a multi-system and multi-subsystem network of operational software. Details of the existing hardware configurations and data transmission links are provided. The program methodology includes data flow design, interface design and coordination, incremental capability availability, increased inter-subsystem developmental synthesis and testing, system- and network-level synthesis and testing, and system verification and validation. The software has so far been implemented to a 65 percent completion level, and the methodology being used to effect the changes, which will permit enhanced tracking of and communication with spacecraft, has proved to feature effective techniques.
Adjoint-Based Methodology for Time-Dependent Optimal Control (AMTOC)
NASA Technical Reports Server (NTRS)
Yamaleev, Nail; Diskin, Boris; Nishikawa, Hiroaki
2012-01-01
During the five years of this project, the AMTOC team developed an adjoint-based methodology for design and optimization of complex time-dependent flows, implemented AMTOC in a testbed environment, directly assisted in the implementation of this methodology in NASA's state-of-the-art unstructured CFD code FUN3D, and successfully demonstrated applications of this methodology to large-scale optimization of several supersonic and other aerodynamic systems, such as fighter jet, subsonic aircraft, rotorcraft, high-lift, wind-turbine, and flapping-wing configurations. In the course of this project, the AMTOC team published 13 refereed journal articles, 21 refereed conference papers, and 2 NIA reports. The AMTOC team presented the results of this research at 36 international and national conferences, meetings and seminars, including the International Conference on CFD and numerous AIAA conferences and meetings. Selected publications that include the major results of the AMTOC project are enclosed in this report.
Integrating uniform design and response surface methodology to optimize thiacloprid suspension
Li, Bei-xing; Wang, Wei-chang; Zhang, Xian-peng; Zhang, Da-xia; Mu, Wei; Liu, Feng
2017-01-01
A model 25% suspension concentrate (SC) of thiacloprid was adopted to evaluate an integrative approach of uniform design and response surface methodology. Tersperse2700, PE1601, xanthan gum and veegum were the four experimental factors, and the aqueous separation ratio and viscosity were the two dependent variables. Linear and quadratic polynomial models of stepwise regression and partial least squares were adopted to test the fit of the experimental data. Verification tests revealed satisfactory agreement between the experimental and predicted data. The measured values for the aqueous separation ratio and viscosity were 3.45% and 278.8 mPa·s, respectively, and the relative errors of the predicted values were 9.57% and 2.65%, respectively (prepared under the proposed conditions). Comprehensive benefits could also be obtained by appropriately adjusting the amount of certain adjuvants based on practical requirements. Integrating uniform design and response surface methodology is an effective strategy for optimizing SC formulas. PMID:28383036
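A quadratic response surface of the kind used here can be fitted by ordinary least squares; the sketch below does this for two coded factors with made-up factor levels and responses, not the formulation data reported above.

```python
import numpy as np

# Fit a full quadratic response surface y = b0 + b1*x1 + b2*x2 + b11*x1^2
# + b22*x2^2 + b12*x1*x2 by least squares. The coded factor levels and the
# responses are made up for illustration, not the formulation data above.
x1 = np.array([-1., -1., 1., 1., 0., 0., 0., -1., 1.])
x2 = np.array([-1., 1., -1., 1., 0., -1., 1., 0., 0.])
y = np.array([5.2, 4.1, 4.8, 3.9, 3.4, 4.0, 3.6, 4.5, 4.2])

X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)      # fitted coefficients b0..b12
y_hat = X @ beta
print("coefficients:", np.round(beta, 3))
print("max fit residual:", np.round(np.max(np.abs(y - y_hat)), 3))
```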
ERIC Educational Resources Information Center
Construction Systems Management, Inc., Anchorage, AK.
Volume II of a 3-volume report demonstrates the use of Design Determinants and Options (presented in Volume I) in the planning and design of small rural Alaskan secondary schools. Section I, a checklist for gathering site-specific information to be used as a data base for facility design, is organized in the same format as Volume I, which can be…
ERIC Educational Resources Information Center
Cannon, Joanna E.; Guardino, Caroline; Antia, Shirin D.; Luckner, John L.
2015-01-01
The field of education of deaf and hard of hearing (DHH) students has a paucity of evidence-based practices (EBPs) to guide instruction. The authors discussed how the research methodology of single-case design (SCD) can be used to build EBPs through direct and systematic replication of studies. An overview of SCD research methods is presented,…
The Development of CK2 Inhibitors: From Traditional Pharmacology to in Silico Rational Drug Design
Cozza, Giorgio
2017-01-01
Casein kinase II (CK2) is a ubiquitous and pleiotropic serine/threonine protein kinase able to phosphorylate hundreds of substrates. Being implicated in several human diseases, from neurodegeneration to cancer, the biological roles of CK2 have been intensively studied. Upregulation of CK2 has been shown to be critical to tumor progression, making this kinase an attractive target for cancer therapy. Several CK2 inhibitors have been developed so far, the first being discovered by “trial and error testing”. In the last decade, the development of in silico rational drug design has prompted the discovery, de novo design and optimization of several CK2 inhibitors active in the low nanomolar range. The screening of large chemical libraries and the optimization of hit compounds by Structure-Based Drug Design (SBDD) provide telling examples of a fruitful application of rational drug design to the development of CK2 inhibitors. Ligand-Based Drug Design (LBDD) models have also been applied to CK2 drug discovery; however, they were mainly focused on methodology improvements rather than being critical for de novo design and optimization. This manuscript provides a detailed description of in silico methodologies whose applications to the design and development of CK2 inhibitors proved successful and promising. PMID:28230762
Error Estimation and Uncertainty Propagation in Computational Fluid Mechanics
NASA Technical Reports Server (NTRS)
Zhu, J. Z.; He, Guowei; Bushnell, Dennis M. (Technical Monitor)
2002-01-01
Numerical simulation has now become an integral part of the engineering design process. Critical design decisions are routinely made based on simulation results and conclusions. Verification and validation of the reliability of the numerical simulation is therefore vitally important in the engineering design process. We propose to develop theories and methodologies that can automatically provide quantitative information about the reliability of a numerical simulation by estimating the numerical approximation error, computational-model-induced errors, and the uncertainties contained in the mathematical models, so that the reliability of the numerical simulation can be verified and validated. We also propose to develop and implement methodologies and techniques that can control the error and uncertainty during the numerical simulation so that its reliability can be improved.
Re-Engineering Complex Legacy Systems at NASA
NASA Technical Reports Server (NTRS)
Ruszkowski, James; Meshkat, Leila
2010-01-01
The Flight Production Process (FPP) Re-engineering project has established a Model-Based Systems Engineering (MBSE) methodology and the technological infrastructure for the design and development of a reference, product-line architecture as well as an integrated workflow model for the Mission Operations System (MOS) for human space exploration missions at NASA Johnson Space Center. The design and architectural artifacts have been developed based on the expertise and knowledge of numerous Subject Matter Experts (SMEs). The technological infrastructure developed by the FPP Re-engineering project has enabled the structured collection and integration of this knowledge and further provides simulation and analysis capabilities for optimization purposes. A key strength of this strategy has been the judicious combination of COTS products with custom coding. The lean management approach that has led to the success of this project is based on having a strong vision for the whole lifecycle of the project and its progress over time, a goal-based design and development approach, a small team of highly specialized people in areas that are critical to the project, and an interactive approach for infusing new technologies into existing processes. This project, which has had a relatively small amount of funding, is on the cutting edge with respect to the utilization of model-based design and systems engineering. An overarching challenge that was overcome by this project was to convince upper management of the needs and merits of giving up more conventional design methodologies (such as paper-based documents and unwieldy and unstructured flow diagrams and schedules) in favor of advanced model-based systems engineering approaches.
Initial Ares I Bending Filter Design
NASA Technical Reports Server (NTRS)
Jang, Jiann-Woei; Bedrossian, Nazareth; Hall, Robert; Norris, H. Lee; Hall, Charles; Jackson, Mark
2007-01-01
The Ares-I launch vehicle represents a challenging flex-body structural environment for control system design. Software filtering of the inertial sensor output will be required to ensure control system stability and adequate performance. This paper presents a design methodology employing numerical optimization to develop the Ares-I bending filters. The filter design methodology was based on a numerical constrained-optimization approach to maximize stability margins while meeting performance requirements. The resulting bending filter designs achieved stability by adding lag to the first structural frequency and hence phase-stabilizing the first Ares-I flex mode. To minimize rigid-body performance impacts, constraints in the optimization algorithm were prioritized to minimize the bandwidth decrease introduced by the bending filters. The bending filters provided here have been demonstrated to provide a stable first-stage control system in both the frequency domain and the MSFC MAVERIC time-domain simulation.
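The sketch below illustrates the flavor of such a constrained filter optimization on a deliberately simplified loop: a first-order lag filter pole is chosen to maximize phase margin subject to a floor on the gain-crossover frequency and a cap on the loop gain at the flex-mode frequency. The rigid-body model, control law, bending-mode data, and limits are all illustrative assumptions, not the Ares-I models.

```python
import numpy as np

# Toy constrained filter optimization: choose the pole of a first-order lag
# (bending) filter to maximize phase margin, subject to a floor on the gain
# crossover and a cap on loop gain at the flex-mode frequency. The plant,
# control law, mode data, and limits are illustrative, not the Ares-I models.
w = np.logspace(-1, 2, 4000)                 # rad/s
s = 1j * w
rigid = 2.0 / s**2                           # rigid-body attitude dynamics
ctrl = 1.5 * (1 + 0.8 * s)                   # stabilizing PD-type control law
wf, zf = 15.0, 0.02                          # first bending mode: frequency, damping
flex = 0.002 * wf**2 / (s**2 + 2 * zf * wf * s + wf**2)

def metrics(pole):
    L = ctrl * (rigid + flex) / (1 + s / pole)             # open loop with lag filter
    mag, ph = np.abs(L), np.angle(L, deg=True)
    ic = np.argmin(np.abs(mag - 1.0))                      # gain-crossover index
    return 180.0 + ph[ic], w[ic], mag[np.argmin(np.abs(w - wf))]

feasible = [p for p in np.linspace(2.0, 60.0, 300)
            if metrics(p)[1] >= 1.0 and metrics(p)[2] <= 0.5]
best = max(feasible, key=lambda p: metrics(p)[0])
pm, wc, flex_gain = metrics(best)
print(f"lag pole {best:.1f} rad/s: PM {pm:.1f} deg, "
      f"crossover {wc:.2f} rad/s, flex-mode loop gain {flex_gain:.2f}")
```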
A Simple and Reliable Method of Design for Standalone Photovoltaic Systems
NASA Astrophysics Data System (ADS)
Srinivasarao, Mantri; Sudha, K. Rama; Bhanu, C. V. K.
2017-06-01
Standalone photovoltaic (SAPV) systems are seen as a promising method of electrifying areas of the developing world that lack power grid infrastructure. Proliferation of these systems requires a design procedure that is simple and reliable and that exhibits good performance over the system's lifetime. The proposed methodology uses simple empirical formulae and easily available parameters to design SAPV systems, that is, array size with energy storage. After arriving at different array sizes (areas), performance curves are obtained for the optimal design of the SAPV system with a high degree of reliability in terms of autonomy at a specified value of loss-of-load probability (LOLP). Based on the array-to-load ratio (ALR) and the levelized energy cost (LEC) obtained through life cycle cost (LCC) analysis, it is shown that the proposed methodology gives better performance, requires simple data and is more reliable when compared with a conventional design using monthly average daily load and insolation.
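The kind of sizing arithmetic behind such a procedure can be sketched as below: array area from average daily load and insolation with a reliability margin, battery capacity from days of autonomy, and an array-to-load ratio. All efficiencies, margins, and load/insolation figures are illustrative assumptions.

```python
# Back-of-envelope SAPV sizing: array area from average daily load and insolation
# with a reliability (LOLP-driven) margin, battery from days of autonomy, plus the
# array-to-load ratio. All figures and efficiencies are illustrative assumptions.
daily_load_kwh = 6.0            # average daily load
insolation = 5.2                # kWh/m^2/day (peak-sun-hour equivalent)
eta_module, eta_system = 0.17, 0.80
design_margin = 1.25            # oversizing chosen to meet the target LOLP
autonomy_days, depth_of_discharge = 3, 0.7

array_area = design_margin * daily_load_kwh / (insolation * eta_module * eta_system)
array_kwp = array_area * eta_module                          # rating at 1 kW/m^2
battery_kwh = autonomy_days * daily_load_kwh / depth_of_discharge
alr = array_kwp * insolation * eta_system / daily_load_kwh   # array-to-load ratio
print(f"array = {array_kwp:.2f} kWp ({array_area:.1f} m^2), "
      f"battery = {battery_kwh:.0f} kWh, ALR = {alr:.2f}")
```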
Fashion sketch design by interactive genetic algorithms
NASA Astrophysics Data System (ADS)
Mok, P. Y.; Wang, X. X.; Xu, J.; Kwok, Y. L.
2012-11-01
Computer-aided design is vitally important for modern industry, particularly for the creative industries. The fashion industry faces intense pressure to shorten the product development process. In this paper, a methodology is proposed for sketch design based on interactive genetic algorithms. The sketch design system consists of a sketch design model, a database and a multi-stage sketch design engine. First, a sketch design model is developed based on fashion design knowledge to describe fashion product characteristics using parameters. Second, a database is built based on the proposed sketch design model to define general style elements. Third, a multi-stage sketch design engine is used to construct the design. Moreover, an interactive genetic algorithm (IGA) is used to accelerate the sketch design process. The experimental results demonstrate that the proposed method is effective in helping laypersons achieve satisfactory fashion design sketches.
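A minimal sketch of the interactive-GA loop follows: candidate sketches are encoded as parameter vectors and the fitness comes from the user's ratings rather than an objective function. The gene names, ranges, population size, and console-based rating prompt are illustrative assumptions.

```python
import random

# Toy interactive-GA loop: each candidate sketch is a parameter vector and its
# "fitness" is the user's rating rather than a computed objective. Gene names,
# ranges, population size, and the console rating prompt are all illustrative.
GENES = {"collar": 4, "sleeve": 5, "silhouette": 3, "pattern": 6}

def random_design():
    return {g: random.randrange(n) for g, n in GENES.items()}

def crossover(a, b):
    return {g: random.choice((a[g], b[g])) for g in GENES}

def mutate(d, rate=0.15):
    return {g: (random.randrange(GENES[g]) if random.random() < rate else v)
            for g, v in d.items()}

def user_rating(design):              # interactive step: the designer scores it
    return float(input(f"rate {design} from 0 to 10: "))

population = [random_design() for _ in range(6)]
for generation in range(3):
    ranked = sorted(population, key=user_rating, reverse=True)
    parents = ranked[:2]              # keep the two best-liked sketches
    population = parents + [mutate(crossover(*parents)) for _ in range(4)]
print("final candidate sketches:", population)
```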
Hanafi, Rasha Sayed; Lämmerhofer, Michael
2018-01-26
The Quality-by-Design approach to enantioselective HPLC method development surpasses Quality-by-Testing in offering the optimal separation conditions with the smallest number of experiments and in its ability to describe the method's design space visually, which helps to determine enantiorecognition to a significant extent. Although some schemes exist for enantiomeric separations on Cinchona-based zwitterionic stationary phases, the exact design space and the weights by which each of the chromatographic parameters influences the separation have not yet been statistically studied. In the current work, a screening design followed by a Response Surface Methodology optimization design was adopted for enantioseparation optimization of three model drugs, namely the acidic Fmoc-leucine, the amphoteric tryptophan and the basic salbutamol. The screening design proved that the acid/base additives are of utmost importance for the three chiral drugs, and that among three different pairs of acids and bases, acetic acid and diethylamine is the pair able to provide acceptable resolution under variable conditions. Visualization of the response surfaces of the retention factor, separation factor and resolution helped describe accurately the magnitude by which each chromatographic factor (% MeOH, concentration and ratio of acid-base modifiers) affects the separation while interacting with the other parameters. The global optima, balancing the highest enantioresolution with the shortest run time for the three chiral model drugs, varied widely: it was best to set a low % methanol with an equal ratio of acid-base modifiers for the acidic drug, a very high % methanol and a 10-fold higher concentration of the acid for the amphoteric drug, while a 20-fold excess of the base modifier with moderate % methanol was needed for the basic drug. Considering the selected drugs as models for many series of structurally related compounds, the design space defined and the optimum conditions computed are the key for method development on Cinchona-based chiral stationary phases. Copyright © 2017 Elsevier B.V. All rights reserved.
Predictor-Based Model Reference Adaptive Control
NASA Technical Reports Server (NTRS)
Lavretsky, Eugene; Gadient, Ross; Gregory, Irene M.
2009-01-01
This paper is devoted to robust, Predictor-based Model Reference Adaptive Control (PMRAC) design. The proposed adaptive system is compared with the now-classical Model Reference Adaptive Control (MRAC) architecture. Simulation examples are presented. Numerical evidence indicates that the proposed PMRAC tracking architecture has better transient characteristics than MRAC. In this paper, we present a state-predictor-based direct adaptive tracking design methodology for multi-input dynamical systems with partially known dynamics. The efficiency of the design is demonstrated using the short-period dynamics of an aircraft. Formal proof of the reported PMRAC benefits constitutes future research and will be reported elsewhere.
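For context, a scalar version of the classical MRAC baseline that PMRAC is compared against can be sketched as below (this is not the predictor-based architecture itself); the plant parameters, reference model, adaptation gains, and command are illustrative assumptions.

```python
# Classical scalar MRAC baseline (not the predictor-based PMRAC architecture).
# Plant: x' = a*x + b*u with a unknown and sign(b) known; reference model:
# xm' = -2*xm + 2*r; Lyapunov-based adaptive gain updates, Euler integration.
a_true, b_true = 1.0, 3.0            # unknown to the controller (only sign(b) used)
am, bm = -2.0, 2.0                   # stable reference model
gx, gr = 8.0, 8.0                    # adaptation gains
sgn_b = 1.0                          # known sign of the control effectiveness
dt, steps = 0.001, 10000
x = xm = kx = kr = 0.0
for k in range(steps):
    t = k * dt
    r = 1.0 if (t % 4.0) < 2.0 else -1.0          # square-wave reference command
    u = kx * x + kr * r                           # adaptive control law
    e = x - xm                                    # tracking error
    x += dt * (a_true * x + b_true * u)           # Euler step: plant
    xm += dt * (am * xm + bm * r)                 # Euler step: reference model
    kx += dt * (-gx * x * e * sgn_b)              # Lyapunov-based adaptive laws
    kr += dt * (-gr * r * e * sgn_b)
print(f"final tracking error {e:+.4f}, kx -> {kx:.2f} (ideal -1.00), "
      f"kr -> {kr:.2f} (ideal 0.67)")
```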
Utility of Army Design Methodology in U.S. Coast Guard Counter Narcotic Interdiction Strategy
2017-06-09
A thesis; dates covered: AUG 2016 – JUN 2017; distribution is unlimited. This study investigates the utility of using Army Design Methodology (ADM) to…
Study design and "evidence" in patient-oriented research.
Concato, John
2013-06-01
Individual studies in patient-oriented research, whether described as "comparative effectiveness" or using other terms, are based on underlying methodological designs. A simple taxonomy of study designs includes randomized controlled trials on the one hand, and observational studies (such as case series, cohort studies, and case-control studies) on the other. A rigid hierarchy of these design types is a fairly recent phenomenon, promoted as a tenet of "evidence-based medicine," with randomized controlled trials receiving gold-standard status in terms of producing valid results. Although randomized trials have many strengths, and contribute substantially to the evidence base in clinical care, making presumptions about the quality of a study based solely on category of research design is unscientific. Both the limitations of randomized trials as well as the strengths of observational studies tend to be overlooked when a priori assumptions are made. This essay presents an argument in support of a more balanced approach to evaluating evidence, and discusses representative examples from the general medical as well as pulmonary and critical care literature. The simultaneous consideration of validity (whether results are correct "internally") and generalizability (how well results apply to "external" populations) is warranted in assessing whether a study's results are accurate for patients likely to receive the intervention-examining the intersection of clinical and methodological issues in what can be called a medicine-based evidence approach. Examination of cause-effect associations in patient-oriented research should recognize both the strengths and limitations of randomized trials as well as observational studies.
A Formal Methodology to Design and Deploy Dependable Wireless Sensor Networks
Testa, Alessandro; Cinque, Marcello; Coronato, Antonio; Augusto, Juan Carlos
2016-01-01
Wireless Sensor Networks (WSNs) are being increasingly adopted in critical applications, where verifying the correct operation of sensor nodes is a major concern. Undesired events may undermine the mission of the WSNs. Hence, their effects need to be properly assessed before deployment, to obtain a good level of expected performance, and during operation, in order to avoid dangerous unexpected results. In this paper, we propose a methodology that aims at assessing and improving the dependability level of WSNs by means of an event-based formal verification technique. The methodology includes a process to guide designers towards the realization of a dependable WSN and a tool (“ADVISES”) to simplify its adoption. The tool is applicable to homogeneous WSNs with static routing topologies. It allows the automatic generation of formal specifications used to check correctness properties and evaluate dependability metrics at design time and at runtime for WSNs where an acceptable percentage of faults can be defined. At runtime, we can check the behavior of the WSN against the results obtained at design time and detect sudden and unexpected failures, in order to trigger recovery procedures. The effectiveness of the methodology is shown in the context of two case studies, as proofs of concept, aiming to illustrate how the tool helps drive design choices and check the correctness properties of the WSN at runtime. Although the method scales up to very large WSNs, the applicability of the methodology may be compromised by the state-space explosion of the reasoning model, which must be addressed by partitioning large topologies into sub-topologies. PMID:28025568
NASA Astrophysics Data System (ADS)
Shi, Jin-Xing; Ohmura, Keiichiro; Shimoda, Masatoshi; Lei, Xiao-Wen
2018-07-01
In recent years, the shape design of graphene sheets (GSs) by introducing topological defects to enhance their mechanical behavior has attracted the attention of scholars. In the present work, we propose a consistent methodology for the optimal shape design of GSs using a combination of the molecular mechanics (MM) method, a non-parametric shape optimization method, the phase field crystal (PFC) method, Voronoi tessellation, and molecular dynamics (MD) simulation to maximize their fundamental frequencies. First, we model GSs as continuum frame models using a link between the MM method and continuum mechanics. Then, we carry out optimal shape design of GSs for the fundamental frequency maximization problem based on a shape optimization method developed for frames. However, the obtained optimal shapes of GSs consisting only of hexagonal carbon rings are unstable and do not satisfy the principle of least action, so we relocate carbon atoms on the optimal shapes by introducing topological defects using the PFC method and Voronoi tessellation. Finally, we perform structural relaxation through MD simulation to determine the final optimal shapes of the GSs. We design two examples of GSs, and the optimal results show that the fundamental frequencies of GSs can be significantly enhanced by the proposed optimal shape design methodology.
A design methodology of magnetorheological fluid damper using Herschel-Bulkley model
NASA Astrophysics Data System (ADS)
Liao, Linqing; Liao, Changrong; Cao, Jianguo; Fu, L. J.
2003-09-01
Magnetorheological (MR) fluid is a highly concentrated suspension of very small magnetic particles in inorganic oil. The essential behavior of MR fluid is its ability to reversibly change from a free-flowing, linear viscous liquid to a semi-solid with controllable yield strength in milliseconds when exposed to a magnetic field. This feature provides simple, quiet, rapid-response interfaces between electronic controls and mechanical systems. In this paper, a mini-bus MR fluid damper based on the plate Poiseuille flow mode is analyzed using the Herschel-Bulkley model, which can account for post-yield shear thinning or thickening under the quasi-steady flow condition. For various values of the flow behavior index, the influences of post-yield shear thinning or thickening on the flow velocity profiles of the MR fluid in the annular damping orifice are examined numerically. Analytical damping coefficient predictions are also compared via the nonlinear Bingham plastic model and the Herschel-Bulkley constitutive model. An MR fluid damper, designed and fabricated according to the design method presented in this paper, was tested with an electro-hydraulic servo vibrator and its control system at the National Center for Test and Supervision of Coach Quality. The experimental results reveal that the analysis methodology and design theory are reasonable and that MR fluid dampers can be designed according to the proposed methodology.
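The Herschel-Bulkley constitutive law referred to above is tau = tau_y + K*(shear rate)^n, where n < 1 gives post-yield shear thinning, n > 1 shear thickening, and n = 1 recovers the Bingham plastic model; the sketch below evaluates it with illustrative MR-fluid parameters, not the damper's identified values.

```python
# Herschel-Bulkley law: tau = tau_y + K * gamma_dot**n. n < 1 -> post-yield shear
# thinning, n > 1 -> thickening, n = 1 -> Bingham plastic. Parameters below are
# illustrative MR-fluid values, not the damper's identified ones.
def herschel_bulkley_stress(gamma_dot, tau_y=30e3, K=1.2, n=0.8):
    return tau_y + K * gamma_dot ** n          # shear stress, Pa

for gamma_dot in (10.0, 100.0, 1000.0, 10000.0):       # shear rate, 1/s
    tau = herschel_bulkley_stress(gamma_dot)
    print(f"shear rate {gamma_dot:7.0f} 1/s -> shear stress {tau / 1e3:6.2f} kPa")
```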
ERIC Educational Resources Information Center
McQuade, Eamonn; Sjoer, Ellen; Fabian, Peter; Nascimento, Jose Carlos; Schroeder, Sanaz
2007-01-01
Purpose: The purpose of this paper is to report on a research project, the aim of which was to identify the potential loss of company knowledge and expertise as experienced and expert employees retire. Design/methodology/approach: The methodology used in this research was based on interviewing experienced and expert people who had retired or were…
A strategy for developing a launch vehicle system for orbit insertion: Methodological aspects
NASA Astrophysics Data System (ADS)
Klyushnikov, V. Yu.; Kuznetsov, I. I.; Osadchenko, A. S.
2014-12-01
The article addresses methodological aspects of a development strategy to design a launch vehicle system for orbit insertion. The development and implementation of the strategy are broadly outlined. An analysis is provided of the criterial base and input data needed to define the main requirements for the launch vehicle system. Approaches are suggested for solving individual problems in working out the launch vehicle system development strategy.
ERIC Educational Resources Information Center
Ibáñez Moreno, Ana; Vermeulen, Anna
2015-01-01
In this paper the methodological steps taken in the conception of a new mobile application (app) are introduced. This app, called VISP (Videos for Speaking), is easily accessible and manageable, and is aimed at helping students of English as a Foreign Language (EFL) to improve their idiomaticity in their oral production. In order to do so, the app…
Staff Training for Business Process Improvement: The Benefit of Role-Plays in the Case of KreditSim
ERIC Educational Resources Information Center
Borner, Rene; Moormann, Jurgen; Wang, Minhong
2012-01-01
Purpose: The paper aims to explore staff's experience with role-plays using the example of training bank employees in Six Sigma as a major methodology for business process improvement. Design/methodology/approach: The research is based on a case study. A role-play, KreditSim, is used to simulate a loan approval process that has to be improved by…
Liu, Wenjuan; Ji, Jianlin; Chen, Hua; Ye, Chenyu
2014-01-01
Color is one of the most powerful aspects of a psychological counseling environment. Little scientific research has been conducted on color design, and much of the existing literature is based on observational studies. Using design of experiments and response surface methodology, this paper proposes an optimal color design approach for transforming patients' perceptions into color elements. Six indices (pleasant-unpleasant, interesting-uninteresting, exciting-boring, relaxing-distressing, safe-fearful, and active-inactive) were used to assess patients' impressions. A total of 75 patients participated: 42 in Experiment 1 and 33 in Experiment 2. In Experiment 1, 27 representative color samples were designed, and the color sample (L = 75, a = 0, b = -60) was the most preferred. In Experiment 2, this color sample was set as the 'central point', and three color attributes were optimized to maximize patient satisfaction. The experimental results show that the proposed method can obtain the optimal solution for the color design of a counseling room.
Web-Enhanced Learning: Engaging Students in Constructivist Learning
ERIC Educational Resources Information Center
Neo, Mai
2005-01-01
Purpose: The purpose of this paper is to study the impact on student learning of a web-based constructivist learning environment developed for a course given to students in the Faculty of Creative Multimedia (FCM). Design/methodology/approach: In this paper, a web-based multimedia-mediated project was developed based on an Internet…
Evidence-Based Administration for Decision Making in the Framework of Knowledge Strategic Management
ERIC Educational Resources Information Center
Del Junco, Julio Garcia; Zaballa, Rafael De Reyna; de Perea, Juan Garcia Alvarez
2010-01-01
Purpose: This paper seeks to present a model based on evidence-based administration (EBA), which aims to facilitate the creation, transformation and diffusion of knowledge in learning organizations. Design/methodology/approach: A theoretical framework is proposed based on EBA and the case method. Accordingly, an empirical study was carried out in…
NASA Astrophysics Data System (ADS)
Zeitz, Christian; Scheidat, Tobias; Dittmann, Jana; Vielhauer, Claus; González Agulla, Elisardo; Otero Muras, Enrique; García Mateo, Carmen; Alba Castro, José L.
2008-02-01
Besides the optimization of biometric error rates, the overall security-system performance with respect to intentional security attacks plays an important role for biometric-enabled authentication schemes. As most traditional user authentication schemes are knowledge- and/or possession-based, in this paper we first present a methodology for the security analysis of Internet-based biometric authentication systems by enhancing known methodologies, such as the CERT attack taxonomy, with a more detailed view of the OSI model. Secondly, as a proof of concept, the guidelines extracted from this methodology are strictly applied to an open-source Internet-based biometric authentication system (BioWebAuth). As case studies, two exemplary attacks, based on the security leaks found, are investigated, and the attack performance is presented to show that, in biometric authentication schemes, security issues need to be addressed alongside biometric error performance tuning. Finally, some design recommendations are given in order to ensure a minimum security level.
The TMIS life-cycle process document, revision A
NASA Technical Reports Server (NTRS)
1991-01-01
The Technical and Management Information System (TMIS) Life-Cycle Process Document describes the processes that shall be followed in the definition, design, development, test, deployment, and operation of all TMIS products and data base applications. This document is a roll out of TMIS Standards Document (SSP 30546). The purpose of this document is to define the life cycle methodology that the developers of all products and data base applications and any subsequent modifications shall follow. Included in this methodology are descriptions of the tasks, deliverables, reviews, and approvals that are required before a product or data base application is accepted in the TMIS environment.
Campbell, Rebecca; Patterson, Debra; Bybee, Deborah
2011-03-01
This article reviews current epistemological and design issues in the mixed methods literature and then examines the application of one specific design, a sequential explanatory mixed methods design, in an evaluation of a community-based intervention to improve postassault care for sexual assault survivors. Guided by a pragmatist epistemological framework, this study collected quantitative and qualitative data to understand how the implementation of a Sexual Assault Nurse Examiner (SANE) program affected prosecution rates of adult sexual assault cases in a large midwestern community. Quantitative results indicated that the program was successful in affecting legal systems change and the qualitative data revealed the mediating mechanisms of the intervention's effectiveness. Challenges of implementing this design are discussed, including epistemological and practical difficulties that developed from blending methodologies into a single project. © The Author(s) 2011.
A software engineering approach to expert system design and verification
NASA Technical Reports Server (NTRS)
Bochsler, Daniel C.; Goodwin, Mary Ann
1988-01-01
Software engineering design and verification methods for developing expert systems are not yet well defined. Integration of expert system technology into software production environments will require effective software engineering methodologies to support the entire life cycle of expert systems. The software engineering methods used to design and verify an expert system, RENEX, are discussed. RENEX demonstrates autonomous rendezvous and proximity operations, including replanning of trajectory events and subsystem fault detection, onboard a space vehicle during flight. The RENEX designers utilized a number of software engineering methodologies to deal with the complex problems inherent in this system. An overview is presented of the methods utilized. Details of the verification process receive special emphasis. The benefits and weaknesses of the methods for supporting the development life cycle of expert systems are evaluated, and recommendations are made based on the overall experiences with the methods.
ERIC Educational Resources Information Center
Ismail, Noor Azizi
2010-01-01
Purpose: The purpose of this paper is to discuss how activity-based costing (ABC) technique can be applied in the context of higher education institutions. It also discusses the obstacles and challenges to the successful implementation of activity-based management (ABM) in the higher education environment. Design/methodology/approach: This paper…
Multi-Drafting Feedback Process in a Web-Based Environment
ERIC Educational Resources Information Center
Peled, Yehuda; Sarid, Miriam
2010-01-01
Purpose: The purpose of this paper is to explore the nature of multi-drafting among college students according to demographic characteristics and measure its impact on students' achievements. Design/methodology/approach: The research was conducted in two stages. First, a preliminary research based on data from the Highlearn web-based content…
Performance-Based Service Quality Model: An Empirical Study on Japanese Universities
ERIC Educational Resources Information Center
Sultan, Parves; Wong, Ho
2010-01-01
Purpose: This paper aims to develop and empirically test a performance-based higher education service quality model. Design/methodology/approach: The study develops a 67-item instrument for measuring performance-based service quality with a particular focus on the higher education sector. Scale reliability is confirmed using Cronbach's alpha.…
Pre-Service Teachers' TPACK Development and Conceptions through a TPACK-Based Course
ERIC Educational Resources Information Center
Durdu, Levent; Dag, Funda
2017-01-01
This study examines pre-service teachers' Technological Pedagogical Content Knowledge (TPACK) development and analyses their conceptions of learning and teaching with technology. With this aim in mind, researchers designed and implemented a computer-based mathematics course based on a TPACK framework. As a research methodology, a parallel mixed…
Ethical and Organisational Tensions for Work-Based Learners
ERIC Educational Resources Information Center
Moore, Lesley J.
2007-01-01
Purpose: This paper aims to focus on examples of the perceived tensions of the healthcare work-based learners as they experienced paradigm shifts in both practice and education. Design/methodology/approach: Examples are drawn from a qualitative study to examine work-based learning (WBL) workshops in a Dutch healthcare setting, and a developmental…
A Review of Online Evidence-based Practice Point-of-Care Information Summary Providers
Liberati, Alessandro; Moschetti, Ivan; Tagliabue, Ludovica; Moja, Lorenzo
2010-01-01
Background Busy clinicians need easy access to evidence-based information to inform their clinical practice. Publishers and organizations have designed specific tools to meet doctors’ needs at the point of care. Objective The aim of this study was to describe online point-of-care summaries and evaluate their breadth, content development, and editorial policy against their claims of being “evidence-based.” Methods We searched Medline, Google, librarian association websites, and information conference proceedings from January to December 2008. We included English Web-based point-of-care summaries designed to deliver predigested, rapidly accessible, comprehensive, periodically updated, evidence-based information to clinicians. Two investigators independently extracted data on the general characteristics and content presentation of summaries. We assessed and ranked point-of-care products according to: (1) coverage (volume) of medical conditions, (2) editorial quality, and (3) evidence-based methodology. We explored how these factors were associated. Results We retrieved 30 eligible summaries. Of these products, 18 met our inclusion criteria and were qualitatively described, and 16 provided sufficient data for quantitative evaluation. The median volume of medical conditions covered was 80.6% (interquartile range, 68.9% - 84.2%) and varied for the different products. Similarly, differences emerged for editorial policy (median 8.0, interquartile range 5.8 - 10.3) and evidence-based methodology scores (median 10.0, interquartile range 1.0 - 12.8) on a 15-point scale. None of these dimensions turned out to be significantly associated with the other dimensions (editorial quality and volume, Spearman rank correlation r = -0.001, P = .99; evidence-based methodology and volume, r = -0.19, P = .48; editorial and evidence-based methodology, r = 0.43, P =.09). Conclusions Publishers are moving to develop point-of-care summary products. Some of these have better profiles than others, and there is room for improved reporting of the strengths and weaknesses of these products. PMID:20610379
Persistent misunderstandings about evidence-based (sorry: informed!) policy-making.
Bédard, Pierre-Olivier; Ouimet, Mathieu
2016-01-01
The field of research on knowledge mobilization and evidence-informed policy-making has seen enduring debates related to various fundamental assumptions such as the definition of 'evidence', the relative validity of various research methods, the actual role of evidence to inform policy-making, etc. In many cases, these discussions serve a useful purpose, but they also stem from serious disagreement on methodological and epistemological issues. This essay reviews the rationale for evidence-informed policy-making by examining some of the common claims made about the aims and practices of this perspective on public policy. Supplementing the existing justifications for evidence-based policy making, we argue in favor of a greater inclusion of research evidence in the policy process but in a structured fashion, based on methodological considerations. In this respect, we present an overview of the intricate relation between policy questions and appropriate research designs. By closely examining the relation between research questions and research designs, we claim that the usual points of disagreement are mitigated. For instance, when focusing on the variety of research designs that can answer a range of policy questions, the common critical claim about 'RCT-based policy-making' seems to lose some, if not all of its grip.
ERIC Educational Resources Information Center
Jakovljevic, Maria; Ankiewicz, Piet; De swardt, Estelle; Gross, Elna
2004-01-01
Traditional instructional methodology in the Information System Design (ISD) environment lacks explicit strategies for promoting the cognitive skills of prospective system designers. This contributes to the fragmented knowledge and low motivational and creative involvement of learners in system design tasks. In addition, present ISD methodologies,…
Toward a Knowledge Base for School Climate in Cyprus's Schools
ERIC Educational Resources Information Center
Pashiardis, Georgia
2008-01-01
Purpose: The main purpose of this study was to explore and analyze secondary school students' (8th grade) perceptions about school climate in three areas, namely: the physical environment of the school, the social environment and the learning environment. Design/methodology/approach: A questionnaire, which was designed and pilot-tested around the…
ERIC Educational Resources Information Center
Karpudewan, Mageswary; Ismail, Zurida Hg; Mohamed, Norita
2009-01-01
Purpose: The purpose of this paper is to introduce green chemistry experiments as laboratory-based pedagogy and to evaluate effectiveness of green chemistry experiments in delivering sustainable development concepts (SDCs) and traditional environmental concepts (TECs). Design/methodology/approach: Repeated measure design was employed to evaluate…
An Exploratory Study of Sustainable Development at Italian Universities
ERIC Educational Resources Information Center
Vagnoni, Emidia; Cavicchi, Caterina
2015-01-01
Purpose: This paper aims to outline the current status of the implementation of sustainability practices in the context of Italian public universities, highlighting the strengths and gaps. Design/methodology/approach: Based on a qualitative approach, an exploratory study design has been outlined using the model of Glavic and Lukman (2007) focusing…
Institutionalisation in a Newly Created Private University
ERIC Educational Resources Information Center
Hodson, Peter; Connolly, Michael; Younes, Said
2008-01-01
Purpose: The purpose of this paper is to examine the introduction of a quality assurance system in a new, private university in Syria, and considers the extent to which the theoretical model based on institutional theory and isomorphism is reflected in practice. Design/methodology/approach: A five year longitudinal study which reviews the design,…
77 FR 46094 - Proposed Data Collections Submitted for Public Comment and Recommendations
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-02
... that occurs before a program is designed and implemented, or while a program is being conducted and is... behavioral but most often they are cycles of interviews and focus groups designed to inform the development... instruments, (3) methodological research, (4) usability testing of technology-based instruments and materials...
Consequences of No Child Left Behind on Evaluation Purpose, Design, and Impact
ERIC Educational Resources Information Center
Mabry, Linda
2008-01-01
As an outgrowth of No Child Left Behind's narrow definition of scientifically based research, the priority given to certain quantitative evaluation designs has sparked debate among those in the evaluation community. Federal mandates for particular evaluation methodologies run counter to evaluation practice and to the direction of most evaluation…
ERIC Educational Resources Information Center
Diesel, Vivien; Miná Dias, Marcelo
2016-01-01
Purpose: To analyze the Brazilian experience in designing and implementing a recent extension policy reform based on agroecology, and reflect on its wider theoretical implications for extension reform literature. Design/methodology/approach: Using a critical public analysis we characterize the evolution of Brazilian federal extension policy…
The LACIE data bases: Design considerations
NASA Technical Reports Server (NTRS)
Westberry, L. E. (Principal Investigator)
1979-01-01
The implementation of direct access storage devices for LACIE is discussed with emphasis on the storage and retrieval of image data. Topics covered include the definition of the problem, the solution methodology (design decisions), the initial operational structure, and the modifications which were incorporated. Some conclusions and projections of future problems to be solved are also presented.
Evidence-Based Leadership Development: The 4L Framework
ERIC Educational Resources Information Center
Scott, Shelleyann; Webber, Charles F.
2008-01-01
Purpose: This paper aims to use the results of three research initiatives to present the life-long learning leader 4L framework, a model for leadership development intended for use by designers and providers of leadership development programming. Design/methodology/approach: The 4L model is a conceptual framework that emerged from the analysis of…
Teachers' Pedagogical Reasoning and Reframing of Practice in Digital Contexts
ERIC Educational Resources Information Center
Holmberg, Jörgen; Fransson, Göran; Fors, Uno
2018-01-01
Purpose: The purpose of this paper is to advance the understanding of teachers' reframing of practice in digital contexts by analysing teachers' pedagogical reasoning processes as they explore ways of using information and communication technologies (ICT) to create added pedagogical value. Design/methodology/approach: A design-based research (DBR)…
The Need for Private Universities in Japan to Be Agents of Change
ERIC Educational Resources Information Center
Zhang, Rong; McCornac, Dennis C.
2013-01-01
Purpose: The purpose of this paper is to examine a number of current innovations made by private higher educational institutions in Japan to counter decreased enrollments and financial constraints. Design/methodology/approach: The design of this study is both descriptive and conceptual, based on the latest data available. Additional information…
Learning Opportunities for Nurses Working within Home Care
ERIC Educational Resources Information Center
Lundgren, Solveig
2011-01-01
Purpose: The purpose of this study is to explore home care nurses' experience of learning in a multicultural environment. Design/methodology/approach: The study was based on qualitative research design. Data were collected through repeated interviews with registered home care nurses working in a multicultural area. The data were analyzed through a…
Inquiry-Based Learning in Mathematics: Designing Collaborative Research with Schools
ERIC Educational Resources Information Center
Makar, Katie; Dole, Shelley
2013-01-01
A series of research projects was implemented over seven years to understand and facilitate teachers' experiences in adopting inquiry. An overview of the project, methodology and key outcomes is provided as a basis for the partnership described in this symposium. We end the paper with a list of recommendations for designing collaborative…
A Novel Approach to Rotorcraft Damage Tolerance
NASA Technical Reports Server (NTRS)
Forth, Scott C.; Everett, Richard A.; Newman, John A.
2002-01-01
Damage-tolerance methodology is positioned to replace safe-life methodologies for designing rotorcraft structures. The argument for implementing a damage-tolerance method comes from the fundamental fact that rotorcraft structures typically fail by fatigue cracking. Therefore, if technology permits prediction of fatigue-crack growth in structures, a damage-tolerance method should deliver the most accurate prediction of component life. Implementing damage tolerance (DT) into high-cycle-fatigue (HCF) components will require a shift from traditional DT methods that rely on detecting an initial flaw with nondestructive inspection (NDI) methods. The rapid accumulation of cycles in an HCF component means that a design based on a traditional DT method is either impractical because of frequent inspections, or too heavy to operate efficiently. Furthermore, once an HCF component develops a detectable propagating crack, the remaining fatigue life is short, sometimes less than one flight hour, which does not leave sufficient time for inspection. Therefore, designing an HCF component will require basing the life analysis on an initial flaw that is undetectable with current NDI technology.
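The life prediction underlying a damage-tolerance analysis can be sketched by integrating a Paris-type crack-growth law from an assumed initial flaw to a critical crack size; the constants, geometry factor, stress range, and crack sizes below are illustrative assumptions, not rotorcraft design values.

```python
import math

# Integrate a Paris-type law da/dN = C * (delta_K)^m from an assumed initial flaw
# to a critical crack size. C, m, geometry factor, stress range, and crack sizes
# are illustrative, not rotorcraft design values.
C, m = 1e-12, 3.0                    # Paris constants (m/cycle, MPa*sqrt(m))
delta_sigma = 120.0                  # cyclic stress range, MPa
Y = 1.12                             # geometry factor (edge-crack assumption)
a, a_crit = 0.1e-3, 5.0e-3           # initial flaw and critical crack length, m
da = 1.0e-6                          # crack-length increment per integration step, m

cycles = 0.0
while a < a_crit:
    delta_K = Y * delta_sigma * math.sqrt(math.pi * a)   # stress-intensity range
    cycles += da / (C * delta_K ** m)                    # cycles to grow by da
    a += da
print(f"predicted crack-growth life = {cycles:,.0f} cycles (0.1 mm to 5 mm)")
```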