Sample records for design methodology suitable

  1. Design-Based Research: Is This a Suitable Methodology for Short-Term Projects?

    ERIC Educational Resources Information Center

    Pool, Jessica; Laubscher, Dorothy

    2016-01-01

    This article reports on a design-based methodology of a thesis in which a fully face-to-face contact module was converted into a blended learning course. The purpose of the article is to report on how design-based phases, in the form of micro-, meso- and macro-cycles were applied to improve practice and to generate design principles. Design-based…

  2. The engagement of children with disabilities in health-related technology design processes: identifying methodology.

    PubMed

    Allsop, Matthew J; Holt, Raymond J; Levesley, Martin C; Bhakta, Bipinchandra

    2010-01-01

    This review aims to identify research methodology that is suitable for involving children with disabilities in the design of healthcare technology, such as assistive technology and rehabilitation equipment. A review of the literature included the identification of methodology that is available from domains outside of healthcare and suggested a selection of available methods. The need to involve end users within the design of healthcare technology was highlighted, with particular attention to the need for greater levels of participation from children with disabilities within all healthcare research. Issues that may arise when trying to increase such involvement included the need to consider communication via feedback and tailored information, the need to measure levels of participation occurring in current research, and caution regarding the use of proxy information. Additionally, five suitable methods were highlighted that are available for use with children with disabilities in the design of healthcare technology. The methods identified in the review need to be put into practice to establish effective and, if necessary, novel ways of designing healthcare technology when end users are children with disabilities.

  3. OVERMODED HIGH-POWER RF MAGNETIC SWITCHES AND CIRCULATORS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tantawi, Sami

    2002-08-20

We present a design methodology for active rf magnetic components suitable for pulse compression systems of future X-band linear colliders. These components comprise an array of active elements arranged together so that the total electromagnetic field is reduced and the power handling capabilities are increased. The active element of choice is a magnetic material (garnet), which can be switched by changing a biasing magnetic field. A novel design allows these components to operate in the low-loss circular waveguide mode TE01. We describe the design methodology, the switching elements and circuits.

  4. Habitat Suitability Index Models: Yellow perch

    USGS Publications Warehouse

    Krieger, Douglas A.; Terrell, James W.; Nelson, Patrick C.

    1983-01-01

A review and synthesis of existing information were used to develop riverine and lacustrine habitat models for yellow perch (Perca flavescens). The models are scaled to produce an index of habitat suitability between 0 (unsuitable habitat) and 1 (optimally suitable habitat) for riverine, lacustrine, and palustrine habitat in the 48 contiguous United States. Habitat Suitability Indexes (HSIs) are designed for use with the Habitat Evaluation Procedures developed by the U.S. Fish and Wildlife Service. Also included are discussions of Suitability Index (SI) curves as used in the Instream Flow Incremental Methodology (IFIM) and SI curves available for an IFIM analysis of yellow perch habitat.
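The 0-to-1 scaling described above can be illustrated with a small sketch. The piecewise-linear SI curves, the two habitat variables, and the geometric-mean aggregation below are illustrative assumptions, not the published yellow perch model.

```python
# Illustrative sketch of a Habitat Suitability Index (HSI) calculation.
# The variables, breakpoints, and geometric-mean aggregation are
# hypothetical examples, not the published yellow perch model.

def si_curve(x, points):
    """Piecewise-linear suitability index from (value, SI) breakpoints."""
    if x <= points[0][0]:
        return points[0][1]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    return points[-1][1]

# Hypothetical SI curves for two habitat variables.
si_temp = [(5.0, 0.0), (20.0, 1.0), (30.0, 0.2)]   # summer temperature, deg C
si_veg = [(0.0, 0.1), (40.0, 1.0), (100.0, 0.6)]   # percent vegetation cover

def hsi(temp, veg_cover):
    """Aggregate SI values into a 0-1 HSI via a geometric mean."""
    a = si_curve(temp, si_temp)
    b = si_curve(veg_cover, si_veg)
    return (a * b) ** 0.5

print(round(hsi(20.0, 40.0), 2))  # optimal on both curves -> 1.0
```

Real HSI models aggregate many more variables, and the choice of aggregation (minimum, arithmetic mean, geometric mean) is itself part of the model design.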

  5. General Electric composite ring-disk flywheel: Recent and potential developments

    NASA Technical Reports Server (NTRS)

    Coppa, A. P.

    1984-01-01

    Recent developments of the General Electric hybrid rotor design are described. The relation of the hybrid rotor design to flywheel designs that are especially suitable for spacecraft applications is discussed. Potential performance gains that can be achieved in such rotor designs by applying latest developments in materials, processing, and design methodology are projected. Indications are that substantial improvements can be obtained.

  6. LOX/hydrocarbon rocket engine analytical design methodology development and validation. Volume 1: Executive summary and technical narrative

    NASA Technical Reports Server (NTRS)

    Pieper, Jerry L.; Walker, Richard E.

    1993-01-01

During the past three decades, an enormous amount of resources was expended in the design and development of Liquid Oxygen/Hydrocarbon and Hydrogen (LOX/HC and LOX/H2) rocket engines. A significant portion of these resources was used to develop and demonstrate the performance and combustion stability of each new engine. During these efforts, many analytical and empirical models were developed that characterize design parameters and combustion processes that influence performance and stability. Many of these models are suitable as design tools, but they have not been assembled into an industry-wide usable analytical design methodology. The objective of this program was to assemble existing performance and combustion stability models into a usable methodology capable of producing high-performing and stable LOX/hydrocarbon and LOX/hydrogen propellant booster engines.

  7. Aircraft optimization by a system approach: Achievements and trends

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1992-01-01

Recently emerging methodology for optimal design of aircraft treated as a system of interacting physical phenomena and parts is examined. The methodology is found to coalesce into methods for hierarchic, non-hierarchic, and hybrid systems, all dependent on sensitivity analysis. A separate category of methods has also evolved independent of sensitivity analysis, hence suitable for discrete problems. References and numerical applications are cited. Massively parallel computer processing is seen as an enabling technology for practical implementation of the methodology.

  8. Measuring the Success of Library 2.0 Technologies in the African Context: The Suitability of the DeLone and McLean's Model

    ERIC Educational Resources Information Center

    Lwoga, Edda Tandi

    2013-01-01

Purpose: This study aims to examine the suitability of the information systems (IS) success model for the adoption of library 2.0 technologies among undergraduate students in the African context, focusing on the Muhimbili University of Health and Allied Sciences (MUHAS) in Tanzania. Design/methodology/approach: Based on the IS success model, the…

  9. Guidelines for reporting evaluations based on observational methodology.

    PubMed

    Portell, Mariona; Anguera, M Teresa; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana

    2015-01-01

    Observational methodology is one of the most suitable research designs for evaluating fidelity of implementation, especially in complex interventions. However, the conduct and reporting of observational studies is hampered by the absence of specific guidelines, such as those that exist for other evaluation designs. This lack of specific guidance poses a threat to the quality and transparency of these studies and also constitutes a considerable publication hurdle. The aim of this study thus was to draw up a set of proposed guidelines for reporting evaluations based on observational methodology. The guidelines were developed by triangulating three sources of information: observational studies performed in different fields by experts in observational methodology, reporting guidelines for general studies and studies with similar designs to observational studies, and proposals from experts in observational methodology at scientific meetings. We produced a list of guidelines grouped into three domains: intervention and expected outcomes, methods, and results. The result is a useful, carefully crafted set of simple guidelines for conducting and reporting observational studies in the field of program evaluation.

  10. An evaluation of the directed flow graph methodology

    NASA Technical Reports Server (NTRS)

    Snyder, W. E.; Rajala, S. A.

    1984-01-01

The applicability of the Directed Graph Methodology (DGM) to the design and analysis of special purpose image and signal processing hardware was evaluated. A special purpose image processing system was designed and described using DGM. The design, suitable for very large scale integration (VLSI), implements a region labeling technique. Two computer chips were designed, both using metal-nitride-oxide-silicon (MNOS) technology, as well as a functional system utilizing those chips to perform real time region labeling. The system is described in terms of DGM primitives. As it is currently implemented, DGM is inappropriate for describing synchronous, tightly coupled, special purpose systems. The nature of the DGM formalism lends itself more readily to modeling networks of general purpose processors.

  11. A Multi-Objective Advanced Design Methodology of Composite Beam-to-Column Joints Subjected to Seismic and Fire Loads

    NASA Astrophysics Data System (ADS)

    Pucinotti, Raffaele; Ferrario, Fabio; Bursi, Oreste S.

    2008-07-01

A multi-objective advanced design methodology dealing with seismic actions followed by fire on steel-concrete composite full-strength joints with concrete-filled tubes is proposed in this paper. The specimens were designed in detail in order to exhibit suitable fire behaviour after a severe earthquake. The major aspects of the cyclic behaviour of composite joints are presented and commented upon. The data obtained from monotonic and cyclic experimental tests have been used to calibrate a model of the joint in order to perform seismic simulations on several moment resisting frames. A hysteretic law was used to take into account the seismic degradation of the joints. Finally, fire tests were conducted with the objective of evaluating the fire resistance of a connection already damaged by an earthquake. The experimental activity, together with FE simulation, demonstrated the adequacy of the advanced design methodology.

  12. The balanced incomplete block design is not suitable for the evaluation of complex interventions.

    PubMed

    Trietsch, Jasper; Leffers, Pieter; van Steenkiste, Ben; Grol, Richard; van der Weijden, Trudy

    2014-12-01

In quality of care research, the balanced incomplete block (BIB) design is regularly claimed to have been used when evaluating complex interventions. In this article, we reflect on the appropriateness of using this design for evaluating complex interventions. Literature study using PubMed and handbooks. After studying various articles on health services research that claim to have applied the BIB and the original methodological literature on this design, it became clear that the applied method is in fact not a BIB design. We conclude that this design is not suitable for evaluating complex interventions. We stress that, to prevent improper use of terms, more attention should be paid to proper referencing of the original methodological literature.

  13. A methodology for identification and control of electro-mechanical actuators

    PubMed Central

    Tutunji, Tarek A.; Saleem, Ashraf

    2015-01-01

Mechatronic systems are fully-integrated engineering systems that are composed of mechanical, electronic, and computer control sub-systems. These integrated systems use electro-mechanical actuators to produce the required motion. Therefore, the design of appropriate controllers for these actuators is an essential step in mechatronic system design. In this paper, a three-stage methodology for real-time identification and control of electro-mechanical actuator plants is presented, tested, and validated. First, identification models are constructed from experimental data to approximate the plants’ response. Second, the identified model is used in a simulation environment to design a suitable controller. Finally, the designed controller is applied and tested on the real plant in a Hardware-in-the-Loop (HIL) environment. The described three-stage methodology provides the following practical contributions: • Establishes an easy-to-follow methodology for controller design of electro-mechanical actuators. • Combines off-line and on-line controller design for practical performance. • Modifies the HIL concept by using physical plants with computer control (rather than virtual plants with physical controllers). Simulated and experimental results for two case studies, an induction motor and a vehicle drive system, are presented in order to validate the proposed methodology. These results showed that electro-mechanical actuators can be identified and controlled using an easy-to-duplicate and flexible procedure. PMID:26150992
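The first two stages of the methodology above (identify a model from data, then tune a controller against it in simulation) can be sketched in a few lines. The discrete first-order motor model, the synthetic step-response data, and the proportional controller are illustrative assumptions; stage 3 (HIL testing) requires physical hardware.

```python
# Sketch of stages 1 and 2 of the three-stage methodology:
# (1) identify a plant model from experimental data via least squares,
# (2) use the identified model in simulation to evaluate a controller.
# Model structure and numbers are hypothetical, not the authors' plants.
import numpy as np

# Stage 1: identify y[k+1] = a*y[k] + b*u[k] from (synthetic) step data.
a_true, b_true = 0.98, 0.05
u = np.ones(200)                      # step input
y = np.zeros(201)
for k in range(200):
    y[k + 1] = a_true * y[k] + b_true * u[k]

X = np.column_stack([y[:-1], u])      # regressors [y[k], u[k]]
a_hat, b_hat = np.linalg.lstsq(X, y[1:], rcond=None)[0]

# Stage 2: closed-loop simulation with a proportional controller
# u = Kp * (r - y), run to steady state on the identified model.
def simulate(kp, r=1.0, steps=500):
    yk = 0.0
    for _ in range(steps):
        yk = a_hat * yk + b_hat * kp * (r - yk)
    return yk                         # steady-state output

print(round(a_hat, 3), round(b_hat, 3))   # recovers the plant parameters
print(round(simulate(kp=10.0), 3))        # steady-state tracking of r = 1
```

A pure proportional loop leaves a steady-state error on this plant; the point of the simulation stage is exactly to expose such behaviour before the controller touches the real actuator.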

  15. An automated procedure for developing hybrid computer simulations of turbofan engines

    NASA Technical Reports Server (NTRS)

    Szuch, J. R.; Krosel, S. M.

    1980-01-01

A systematic, computer-aided, self-documenting methodology for developing hybrid computer simulations of turbofan engines is presented. The methodology makes use of a host program that can run on a large digital computer and a machine-dependent target (hybrid) program. The host program performs all of the calculations and data manipulations needed to transform user-supplied engine design information to a form suitable for the hybrid computer. The host program also trims the self-contained engine model to match specified design point information. A test case is described and comparisons between hybrid simulation and specified engine performance data are presented.

  16. Development of weight and cost estimates for lifting surfaces with active controls

    NASA Technical Reports Server (NTRS)

    Anderson, R. D.; Flora, C. C.; Nelson, R. M.; Raymond, E. T.; Vincent, J. H.

    1976-01-01

    Equations and methodology were developed for estimating the weight and cost incrementals due to active controls added to the wing and horizontal tail of a subsonic transport airplane. The methods are sufficiently generalized to be suitable for preliminary design. Supporting methodology and input specifications for the weight and cost equations are provided. The weight and cost equations are structured to be flexible in terms of the active control technology (ACT) flight control system specification. In order to present a self-contained package, methodology is also presented for generating ACT flight control system characteristics for the weight and cost equations. Use of the methodology is illustrated.

  17. Design consideration of resonance inverters with electro-technological application

    NASA Astrophysics Data System (ADS)

    Hinov, Nikolay

    2017-12-01

This study presents design considerations for resonance inverters with electro-technological applications. The methodology results from the author's investigations and analyses of different types and operating regimes of resonance inverters. Schemes of resonant inverters without inverse diodes are considered. The first-harmonic method is used in the analysis and design; for inverters with electro-technological applications, this method gives very good accuracy and does not require a complex and heavy mathematical apparatus. The proposed methodology is easy to use and is suitable for training students in power electronics. The results are confirmed by simulation and by research on physical prototypes.

  18. Development of Probabilistic Life Prediction Methodologies and Testing Strategies for MEMS and CMC's

    NASA Technical Reports Server (NTRS)

    Jadaan, Osama

    2003-01-01

This effort investigates probabilistic life prediction methodologies for ceramic matrix composites (CMCs) and MicroElectroMechanical Systems (MEMS) and analyzes designs that determine stochastic properties of MEMS. For CMCs this includes a brief literature survey regarding lifing methodologies. Also of interest for MEMS is the design of a proper test for the Weibull size effect in thin-film (bulge test) specimens. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS, similar to the GRC-developed CARES/Life program for bulk ceramics. A main objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures/Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads. A second set of objectives is to determine the applicability/suitability of the CARES/Life methodology for CMC analysis, what changes would be needed to the methodology and software, and, if feasible, to run a demonstration problem. Also important is an evaluation of CARES/Life coupled to the ANSYS Probabilistic Design System (PDS) and the potential of coupling transient reliability analysis to the ANSYS PDS.
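The Weibull size effect mentioned above follows directly from the two-parameter Weibull strength distribution: for a fixed failure probability, larger specimens are statistically weaker. The sketch below illustrates this; the modulus m and reference values are illustrative assumptions, not properties of any specific MEMS material.

```python
# Sketch of the Weibull size effect: failure probability for a specimen
# of volume V under uniform stress, with a volume-scaled Weibull law.
# Parameters (m, sigma0, v0) are hypothetical illustration values.
import math

def failure_prob(stress, volume, m=10.0, sigma0=1.0, v0=1.0):
    """Two-parameter Weibull failure probability with a volume term."""
    return 1.0 - math.exp(-(volume / v0) * (stress / sigma0) ** m)

def median_strength(volume, m=10.0, sigma0=1.0, v0=1.0):
    """Stress at 50% failure probability for a given specimen volume."""
    return sigma0 * (math.log(2.0) * v0 / volume) ** (1.0 / m)

# Doubling the volume lowers the median strength by a factor 2**(-1/m).
s1 = median_strength(1.0)
s2 = median_strength(2.0)
print(round(s2 / s1, 4))
```

Measuring whether bulge-test strengths actually scale this way with specimen size is precisely the kind of test the effort above proposes for confirming Weibull-controlled MEMS strength.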

  19. Sketching Designs Using the Five Design-Sheet Methodology.

    PubMed

    Roberts, Jonathan C; Headleand, Chris; Ritsos, Panagiotis D

    2016-01-01

Sketching designs has been shown to be a useful way of planning and considering alternative solutions. The use of lo-fidelity prototyping, especially paper-based sketching, can save time and money and converge to better solutions more quickly. However, this design process is often viewed as too informal. Consequently, users do not know how to manage their thoughts and ideas (to first think divergently, then finally converge on a suitable solution). We present the Five Design-Sheet (FdS) methodology. The methodology enables users to create information visualization interfaces through lo-fidelity methods. Users sketch and plan their ideas, helping them express different possibilities and think through these ideas to consider their potential effectiveness as solutions to the task (sheet 1); they create three principal designs (sheets 2, 3 and 4); before converging on a final realization design that can then be implemented (sheet 5). In this article, we present (i) a review of the use of sketching as a planning method for visualization and the benefits of sketching, (ii) a detailed description of the Five Design-Sheet (FdS) methodology, and (iii) an evaluation of the FdS using the System Usability Scale, along with a case study of its use in industry and experience of its use in teaching.

  20. Automated procedure for developing hybrid computer simulations of turbofan engines. Part 1: General description

    NASA Technical Reports Server (NTRS)

    Szuch, J. R.; Krosel, S. M.; Bruton, W. M.

    1982-01-01

A systematic, computer-aided, self-documenting methodology for developing hybrid computer simulations of turbofan engines is presented. The methodology makes use of a host program that can run on a large digital computer and a machine-dependent target (hybrid) program. The host program performs all the calculations and data manipulations that are needed to transform user-supplied engine design information to a form suitable for the hybrid computer. The host program also trims the self-contained engine model to match specified design-point information. Part I contains a general discussion of the methodology, describes a test case, and presents comparisons between hybrid simulation and specified engine performance data. Part II, a companion document, contains documentation, in the form of computer printouts, for the test case.

  1. Methodology of shell structure reinforcement layout optimization

    NASA Astrophysics Data System (ADS)

    Szafrański, Tomasz; Małachowski, Jerzy; Damaziak, Krzysztof

    2018-01-01

This paper presents an optimization process for a reinforced shell diffuser intended for a small wind turbine (rated power of 3 kW). The diffuser structure consists of multiple reinforcements and a metal skin. This kind of structure is suitable for optimization in terms of selection of reinforcement density, stringer cross sections, sheet thickness, etc. The optimization approach aims to reduce the amount of work to be done between the optimization process and the final product design. The proposed optimization methodology is based on the application of a genetic algorithm to generate the optimal reinforcement layout. The obtained results are the basis for modifying the existing Small Wind Turbine (SWT) design.
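A genetic algorithm for a layout problem of this kind can be sketched very compactly. The bit-string encoding (1 = stringer present at a candidate position) and the toy fitness function below, which trades mass against a crude stiffness constraint, are illustrative assumptions, not the authors' actual model.

```python
# Minimal genetic-algorithm sketch for a reinforcement layout problem,
# in the spirit of the approach described above. Encoding and fitness
# are hypothetical illustrations, not the paper's FE-based formulation.
import random

random.seed(1)
N = 12                               # candidate stringer positions
REQUIRED = 5                         # stiffness proxy: need >= 5 stringers

def fitness(layout):
    """Penalize mass (stringer count) and under-stiff designs."""
    mass = sum(layout)
    penalty = 100 * max(0, REQUIRED - mass)
    return -(mass + penalty)         # higher is better

def crossover(p1, p2):
    cut = random.randrange(1, N)     # single-point crossover
    return p1[:cut] + p2[cut:]

def mutate(layout, rate=0.05):
    return [b ^ (random.random() < rate) for b in layout]

pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(30)]
for _ in range(60):                  # generations, with elitism
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]
    pop = parents + [mutate(crossover(random.choice(parents),
                                      random.choice(parents)))
                     for _ in range(20)]

best = max(pop, key=fitness)
print(sum(best))                     # lightest layout meeting the constraint
```

In the real problem the fitness evaluation would be a finite-element run per candidate layout, which is why GA population sizes and generation counts are a major cost driver there.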

  2. Designing a Pedagogical Model for Web Engineering Education: An Evolutionary Perspective

    ERIC Educational Resources Information Center

    Hadjerrouit, Said

    2005-01-01

    In contrast to software engineering, which relies on relatively well established development approaches, there is a lack of a proven methodology that guides Web engineers in building reliable and effective Web-based systems. Currently, Web engineering lacks process models, architectures, suitable techniques and methods, quality assurance, and a…

  3. Personalised Information Services Using a Hybrid Recommendation Method Based on Usage Frequency

    ERIC Educational Resources Information Center

    Kim, Yong; Chung, Min Gyo

    2008-01-01

    Purpose: This paper seeks to describe a personal recommendation service (PRS) involving an innovative hybrid recommendation method suitable for deployment in a large-scale multimedia user environment. Design/methodology/approach: The proposed hybrid method partitions content and user into segments and executes association rule mining,…

  4. Performance degradation mechanisms and modes in terrestrial photovoltaic arrays and technology for their diagnosis

    NASA Technical Reports Server (NTRS)

    Noel, G. T.; Sliemers, F. A.; Derringer, G. C.; Wood, V. E.; Wilkes, K. E.; Gaines, G. B.; Carmichael, D. C.

    1978-01-01

    Accelerated life-prediction test methodologies have been developed for the validation of a 20-year service life for low-cost photovoltaic arrays. Array failure modes, relevant materials property changes, and primary degradation mechanisms are discussed as a prerequisite to identifying suitable measurement techniques and instruments. Measurements must provide sufficient confidence to permit selection among alternative designs and materials and to stimulate widespread deployment of such arrays. Furthermore, the diversity of candidate materials and designs, and the variety of potential environmental stress combinations, degradation mechanisms and failure modes require that combinations of measurement techniques be identified which are suitable for the characterization of various encapsulation system-cell structure-environment combinations.

  5. A Framework for Preliminary Design of Aircraft Structures Based on Process Information. Part 1

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    1998-01-01

This report discusses the general framework and development of a computational tool for preliminary design of aircraft structures based on process information. The described methodology is suitable for multidisciplinary design optimization (MDO) activities associated with integrated product and process development (IPPD). The framework consists of three parts: (1) product and process definitions; (2) engineering synthesis; and (3) optimization. The product and process definitions are part of the input information provided by the design team. The backbone of the system is its ability to analyze a given structural design for performance as well as manufacturability and cost assessment. The system uses a database on material systems and manufacturing processes. Based on the identified set of design variables and an objective function, the system is capable of performing optimization subject to manufacturability, cost, and performance constraints. The accuracy of the manufacturability measures and cost models discussed here depends largely on the available data on specific methods of manufacture and assembly and associated labor requirements. As such, our focus in this research has been on the methodology itself and not so much on its accurate implementation in an industrial setting. A three-tier approach is presented for an IPPD-MDO based design of aircraft structures. The variable-complexity cost estimation methodology and an approach for integrating manufacturing cost assessment into the design process are also discussed. This report is presented in two parts. In the first part, the design methodology is presented, and the computational design tool is described. In the second part, a prototype model of the preliminary design Tool for Aircraft Structures based on Process Information (TASPI) is described. Part two also contains an example problem that applies the methodology described here for evaluation of six different design concepts for a wing spar.

  6. Quality Concerns in Technical Education in India: A Quantifiable Quality Enabled Model

    ERIC Educational Resources Information Center

    Gambhir, Victor; Wadhwa, N. C.; Grover, Sandeep

    2016-01-01

    Purpose: The paper aims to discuss current Technical Education scenarios in India. It proposes modelling the factors affecting quality in a technical institute and then applying a suitable technique for assessment, comparison and ranking. Design/methodology/approach: The paper chose graph theoretic approach for quantification of quality-enabled…

  7. Experience-based co-design in an adult psychological therapies service.

    PubMed

    Cooper, Kate; Gillmore, Chris; Hogg, Lorna

    2016-01-01

Experience-based co-design (EBCD) is a methodology for service improvement and development, which puts service-user voices at the heart of improving health services. The aim of this paper was to implement the EBCD methodology in a mental health setting, and to investigate the challenges which arise during this process. In order to achieve this, a modified version of the EBCD methodology was undertaken, which involved listening to the experiences of the people who work in and use the mental health setting and sharing these experiences with the people who could effect change within the service, through collaborative work between service-users, staff and managers. EBCD was implemented within the mental health setting and was well received by service-users, staff and stakeholders. A number of modifications were necessary in this setting, for example, providing high levels of support to participants. It was concluded that EBCD is a suitable methodology for service improvement in mental health settings.

  8. Investigating the Release of a Hydrophobic Peptide from Matrices of Biodegradable Polymers: An Integrated Method Approach

    PubMed Central

    Gubskaya, Anna V.; Khan, I. John; Valenzuela, Loreto M.; Lisnyak, Yuriy V.; Kohn, Joachim

    2013-01-01

    The objectives of this work were: (1) to select suitable compositions of tyrosine-derived polycarbonates for controlled delivery of voclosporin, a potent drug candidate to treat ocular diseases, (2) to establish a structure-function relationship between key molecular characteristics of biodegradable polymer matrices and drug release kinetics, and (3) to identify factors contributing in the rate of drug release. For the first time, the experimental study of polymeric drug release was accompanied by a hierarchical sequence of three computational methods. First, suitable polymer compositions used in subsequent neural network modeling were determined by means of response surface methodology (RSM). Second, accurate artificial neural network (ANN) models were built to predict drug release profiles for fifteen polymers located outside the initial design space. Finally, thermodynamic properties and hydrogen-bonding patterns of model drug-polymer complexes were studied using molecular dynamics (MD) technique to elucidate a role of specific interactions in drug release mechanism. This research presents further development of methodological approaches to meet challenges in the design of polymeric drug delivery systems. PMID:24039300

  9. What values in design? The challenge of incorporating moral values into design.

    PubMed

    Manders-Huits, Noëmi

    2011-06-01

Recently, there has been increased attention to the integration of moral values into the conception, design, and development of emerging IT. The most reviewed approach for this purpose in ethics and technology so far is Value-Sensitive Design (VSD). This article considers VSD as the prime candidate for implementing normative considerations into design. Its methodology is considered from a conceptual, analytical, normative perspective. The focus here is on the suitability of VSD for integrating moral values into the design of technologies in a way that joins in with an analytical perspective on ethics of technology. Despite its promising character, it turns out that VSD falls short in several respects: (1) VSD does not have a clear methodology for identifying stakeholders, (2) the integration of empirical methods with conceptual research within the methodology of VSD is obscure, (3) VSD runs the risk of committing the naturalistic fallacy when using empirical knowledge for implementing values in design, (4) the concept of values, as well as their realization, is left undetermined and (5) VSD lacks a complementary or explicit ethical theory for dealing with value trade-offs. For the normative evaluation of a technology, I claim that an explicit and justified ethical starting point or principle is required. Moreover, explicit attention should be given to the value aims and assumptions of a particular design. The criteria of adequacy for such an approach or methodology follow from the evaluation of VSD as the prime candidate for implementing moral values in design.

  10. Laminated Object Manufacturing-Based Design of Ceramic Matrix Composites

    DTIC Science & Technology

    2001-04-01

    components for DoD applications. Program goals included the development of (1) a new LOM-based design methodology for CMC, (2) optimized preceramic polymer ... [list-of-figures residue omitted; recoverable caption: "Detail of LOM Composites Forming System w/ glass fiber/polymer laminate"] ... such as polymer matrix composites have faced similar barriers to implementation. These barriers have been overcome through the development of suitable

  11. A novel neural network based image reconstruction model with scale and rotation invariance for target identification and classification for Active millimetre wave imaging

    NASA Astrophysics Data System (ADS)

    Agarwal, Smriti; Bisht, Amit Singh; Singh, Dharmendra; Pathak, Nagendra Prasad

    2014-12-01

    Millimetre wave (MMW) imaging is gaining tremendous interest among researchers, with potential applications in security checks, standoff personal screening, automotive collision avoidance, and more. Current state-of-the-art imaging techniques, viz. microwave and X-ray imaging, suffer from lower resolution and harmful ionizing radiation, respectively. In contrast, MMW imaging operates at lower power and is non-ionizing, hence medically safe. Despite these favourable attributes, MMW imaging encounters various challenges: it is still a relatively unexplored area and lacks a suitable imaging methodology for extracting complete target information. In view of these challenges, an MMW active imaging radar system at 60 GHz was designed for standoff imaging applications. A C-scan (horizontal and vertical scanning) methodology was developed that provides a cross-range resolution of 8.59 mm. The paper further details a suitable target identification and classification methodology. For identification of regular-shaped targets, a mean-standard deviation based segmentation technique was formulated and further validated using a different target shape. For classification, a probability density function based target material discrimination methodology was proposed and further validated on a different dataset. Lastly, a novel artificial neural network based, scale and rotation invariant image reconstruction methodology has been proposed to counter distortions in the image caused by noise, rotation or scale variations. The designed neural network, once trained with sample images, automatically takes care of these deformations and successfully reconstructs the corrected image for the test targets. Techniques developed in this paper are tested and validated using four different regular shapes, viz. rectangle, square, triangle and circle.
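
    A minimal sketch of a mean-standard deviation segmentation rule of the kind the abstract mentions. The exact rule used by the authors is not given here; the threshold form mean + k*std, and the choice k = 1, are illustrative assumptions:

```python
def mean_std_segment(image, k=1.0):
    # flag pixels whose reflectivity exceeds the image mean by k standard deviations
    pixels = [v for row in image for v in row]
    mu = sum(pixels) / len(pixels)
    sigma = (sum((v - mu) ** 2 for v in pixels) / len(pixels)) ** 0.5
    thr = mu + k * sigma
    return [[1 if v > thr else 0 for v in row] for row in image]
```

    On a toy scene with a uniform background and one bright region, such a rule isolates the bright region as the candidate target mask.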

  12. Bayesian cross-entropy methodology for optimal design of validation experiments

    NASA Astrophysics Data System (ADS)

    Jiang, X.; Mahadevan, S.

    2006-07-01

    An important concern in the design of validation experiments is how to incorporate the mathematical model in the design in order to allow conclusive comparisons of model prediction with experimental output in model assessment. The classical experimental design methods are more suitable for phenomena discovery and may result in a subjective, expensive, time-consuming and ineffective design that may adversely impact these comparisons. In this paper, an integrated Bayesian cross-entropy methodology is proposed to perform the optimal design of validation experiments incorporating the computational model. The expected cross entropy, an information-theoretic distance between the distributions of model prediction and experimental observation, is defined as a utility function to measure the similarity of two distributions. A simulated annealing algorithm is used to find optimal values of input variables through minimizing or maximizing the expected cross entropy. The measured data after testing with the optimum input values are used to update the distribution of the experimental output using Bayes theorem. The procedure is repeated to adaptively design the required number of experiments for model assessment, each time ensuring that the experiment provides effective comparison for validation. The methodology is illustrated for the optimal design of validation experiments for a three-leg bolted joint structure and a composite helicopter rotor hub component.
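
    The optimization step described above can be sketched generically. In the paper the objective is the expected cross entropy between the model-prediction and experimental-output distributions; the quadratic `utility` below is a toy stand-in for that computation, so only the annealing loop itself is illustrated. The cooling schedule and step size are assumptions:

```python
import math
import random

def simulated_annealing(objective, x0, lo, hi, steps=5000, t0=1.0, seed=0):
    # generic simulated-annealing minimiser over one bounded design variable
    rng = random.Random(seed)
    x, fx = x0, objective(x0)
    best, fbest = x, fx
    for i in range(steps):
        t = t0 * (1.0 - i / steps) + 1e-9          # linear cooling schedule
        cand = min(hi, max(lo, x + rng.gauss(0.0, 0.1 * (hi - lo))))
        fc = objective(cand)
        # accept improvements always, worse moves with Boltzmann probability
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
        if fx < fbest:
            best, fbest = x, fx
    return best, fbest

def utility(x):
    # toy stand-in for the expected cross entropy between model prediction
    # and experimental observation; assume it is minimised near x = 0.7
    return (x - 0.7) ** 2
```

    In the adaptive procedure of the paper, each run of such a minimiser yields the next experiment's input settings, after which the output distribution is updated by Bayes theorem and the search is repeated.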

  13. Methodological Quality of Randomized Clinical Trials of Respiratory Physiotherapy in Coronary Artery Bypass Grafting Patients in the Intensive Care Unit: a Systematic Review

    PubMed Central

    Lorscheitter, Jaqueline; Stein, Cinara; Plentz, Rodrigo Della Méa

    2017-01-01

    Objective To assess the methodological quality of randomized controlled trials of physiotherapy in patients undergoing coronary artery bypass grafting in the intensive care unit. Methods Studies published until May 2015 in MEDLINE, Cochrane and PEDro were included. The primary outcome extracted was proper completion of the Cochrane Collaboration's tool's items, and the secondary was conformity with the requirements of the CONSORT Statement and its extension. Results From 807 studies identified, 39 were included. Most of the CONSORT items showed better adequacy after the statement's publication. Studies with positive outcomes presented better methodological quality. Conclusion The methodological quality of the studies has been improving over the years. However, many aspects can still be better designed. PMID:28977205

  14. Evidence-based decision-making in infectious diseases epidemiology, prevention and control: matching research questions to study designs and quality appraisal tools

    PubMed Central

    2014-01-01

    Background The Project on a Framework for Rating Evidence in Public Health (PRECEPT) was initiated and is being funded by the European Centre for Disease Prevention and Control (ECDC) to define a methodology for evaluating and grading evidence and strength of recommendations in the field of public health, with emphasis on infectious disease epidemiology, prevention and control. One of the first steps was to review existing quality appraisal tools (QATs) for individual research studies of various designs relevant to this area, using a question-based approach. Methods Through team discussions and expert consultations, we identified 20 relevant types of public health questions, which were grouped into six domains, i.e. characteristics of the pathogen, burden of disease, diagnosis, risk factors, intervention, and implementation of intervention. Previously published systematic reviews were used and supplemented by expert consultation to identify suitable QATs. Finally, a matrix was constructed for matching questions to study designs suitable to address them and respective QATs. Key features of each of the included QATs were then analyzed, in particular with respect to its intended use, types of questions and answers, presence/absence of a quality score, and whether a validation had been performed. Results In total we identified 21 QATs and 26 study designs, and matched them. Four QATs were suitable for experimental quantitative study designs, eleven for observational quantitative studies, two for qualitative studies, three for economic studies, one for diagnostic test accuracy studies, and one for animal studies. Included QATs consisted of six to 28 items. Six of the QATs had a summary quality score. Fourteen QATs had undergone at least one validation procedure. 
Conclusions The results of this methodological study can be used as an inventory of potentially relevant questions, appropriate study designs and QATs for researchers and authorities engaged with evidence-based decision-making in infectious disease epidemiology, prevention and control. PMID:24886571

  15. Evidence-based decision-making in infectious diseases epidemiology, prevention and control: matching research questions to study designs and quality appraisal tools.

    PubMed

    Harder, Thomas; Takla, Anja; Rehfuess, Eva; Sánchez-Vivar, Alex; Matysiak-Klose, Dorothea; Eckmanns, Tim; Krause, Gérard; de Carvalho Gomes, Helena; Jansen, Andreas; Ellis, Simon; Forland, Frode; James, Roberta; Meerpohl, Joerg J; Morgan, Antony; Schünemann, Holger; Zuiderent-Jerak, Teun; Wichmann, Ole

    2014-05-21

    The Project on a Framework for Rating Evidence in Public Health (PRECEPT) was initiated and is being funded by the European Centre for Disease Prevention and Control (ECDC) to define a methodology for evaluating and grading evidence and strength of recommendations in the field of public health, with emphasis on infectious disease epidemiology, prevention and control. One of the first steps was to review existing quality appraisal tools (QATs) for individual research studies of various designs relevant to this area, using a question-based approach. Through team discussions and expert consultations, we identified 20 relevant types of public health questions, which were grouped into six domains, i.e. characteristics of the pathogen, burden of disease, diagnosis, risk factors, intervention, and implementation of intervention. Previously published systematic reviews were used and supplemented by expert consultation to identify suitable QATs. Finally, a matrix was constructed for matching questions to study designs suitable to address them and respective QATs. Key features of each of the included QATs were then analyzed, in particular with respect to its intended use, types of questions and answers, presence/absence of a quality score, and whether a validation had been performed. In total we identified 21 QATs and 26 study designs, and matched them. Four QATs were suitable for experimental quantitative study designs, eleven for observational quantitative studies, two for qualitative studies, three for economic studies, one for diagnostic test accuracy studies, and one for animal studies. Included QATs consisted of six to 28 items. Six of the QATs had a summary quality score. Fourteen QATs had undergone at least one validation procedure. 
The results of this methodological study can be used as an inventory of potentially relevant questions, appropriate study designs and QATs for researchers and authorities engaged with evidence-based decision-making in infectious disease epidemiology, prevention and control.

  16. Schematic representation of case study research designs.

    PubMed

    Rosenberg, John P; Yates, Patsy M

    2007-11-01

    The paper is a report of a study to demonstrate how the use of schematics can provide procedural clarity and promote rigour in the conduct of case study research. Case study research is a methodologically flexible approach to research design that focuses on a particular case - whether an individual, a collective or a phenomenon of interest. It is known as the 'study of the particular' for its thorough investigation of particular, real-life situations and is gaining increased attention in nursing and social research. However, the methodological flexibility it offers can leave the novice researcher uncertain of suitable procedural steps required to ensure methodological rigour. This article provides a real example of a case study research design that utilizes schematic representation drawn from a doctoral study of the integration of health promotion principles and practices into a palliative care organization. The issues discussed are: (1) the definition and application of case study research design; (2) the application of schematics in research; (3) the procedural steps and their contribution to the maintenance of rigour; and (4) the benefits and risks of schematics in case study research. The inclusion of visual representations of design with accompanying explanatory text is recommended in reporting case study research methods.

  17. Quality Appraisal of Single-Subject Experimental Designs: An Overview and Comparison of Different Appraisal Tools

    ERIC Educational Resources Information Center

    Wendt, Oliver; Miller, Bridget

    2012-01-01

    Critical appraisal of the research literature is an essential step in informing and implementing evidence-based practice. Quality appraisal tools that assess the methodological quality of experimental studies provide a means to identify the most rigorous research suitable for evidence-based decision-making. In single-subject experimental research,…

  18. Advanced electrostatic ion thruster for space propulsion

    NASA Technical Reports Server (NTRS)

    Masek, T. D.; Macpherson, D.; Gelon, W.; Kami, S.; Poeschel, R. L.; Ward, J. W.

    1978-01-01

    The suitability of the baseline 30 cm thruster for future space missions was examined. Preliminary design concepts for several advanced thrusters were developed to assess the potential practical difficulties of a new design. Useful methodologies were produced for assessing both planetary and earth orbit missions. Payload performance as a function of propulsion system technology level and cost sensitivity to propulsion system technology level are among the topics assessed. A 50 cm diameter thruster designed to operate with a beam voltage of about 2400 V is suggested to satisfy most of the requirements of future space missions.

  19. Applying machine learning to pattern analysis for automated in-design layout optimization

    NASA Astrophysics Data System (ADS)

    Cain, Jason P.; Fakhry, Moutaz; Pathak, Piyush; Sweis, Jason; Gennari, Frank; Lai, Ya-Chieh

    2018-04-01

    Building on previous work for cataloging unique topological patterns in an integrated circuit physical design, a new process is defined in which a risk scoring methodology is used to rank patterns based on manufacturing risk. Patterns with high risk are then mapped to functionally equivalent patterns with lower risk. The higher risk patterns are then replaced in the design with their lower risk equivalents. The pattern selection and replacement is fully automated and suitable for use for full-chip designs. Results from 14nm product designs show that the approach can identify and replace risk patterns with quantifiable positive impact on the risk score distribution after replacement.
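
    The rank-and-replace step described above can be sketched as a lookup over a pre-computed catalog. The data structures below (`risk_score`, `equivalents`) are hypothetical names, and the risk scoring itself is assumed to have been done elsewhere:

```python
def replace_risky_patterns(layout, risk_score, equivalents, threshold):
    # layout: sequence of pattern ids; risk_score: id -> manufacturing-risk score;
    # equivalents: id -> list of functionally equivalent pattern ids
    out = []
    for pat in layout:
        if risk_score[pat] > threshold:
            lower = [q for q in equivalents.get(pat, []) if risk_score[q] < risk_score[pat]]
            if lower:
                pat = min(lower, key=risk_score.get)  # pick the lowest-risk equivalent
        out.append(pat)
    return out
```

    Because every decision is table-driven, a pass of this kind can run unattended over a full-chip pattern catalog, which is what makes the approach suitable for automated in-design use.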

  20. MEMS Reliability Assurance Guidelines for Space Applications

    NASA Technical Reports Server (NTRS)

    Stark, Brian (Editor)

    1999-01-01

    This guide is a reference for understanding the various aspects of microelectromechanical systems, or MEMS, with an emphasis on device reliability. Material properties, failure mechanisms, processing techniques, device structures, and packaging techniques common to MEMS are addressed in detail. Design and qualification methodologies provide the reader with the means to develop suitable qualification plans for the insertion of MEMS into the space environment.

  1. Adaptive Capacity in the Pacific Region: A Study of Continuous Professional Development for In-Service Teachers in Kiribati

    ERIC Educational Resources Information Center

    Martin, Tess; Thomson, Ian

    2018-01-01

    This study of I-Kiribati secondary school teachers used a project-based approach to investigate the notions of school-based and collaborative learning as a suitable model for in-service teacher continuous professional development (CPD). The design and methodology adopted by the study framed the argument that since collaborative behavior is…

  2. An approach to quantitative sustainability assessment in the early stages of process design.

    PubMed

    Tugnoli, Alessandro; Santarelli, Francesco; Cozzani, Valerio

    2008-06-15

    A procedure was developed for the quantitative assessment of key performance indicators suitable for the sustainability analysis of alternative processes, mainly addressing the early stages of process design. The methodology was based on the calculation of a set of normalized impact indices allowing a direct comparison of the additional burden of each process alternative on a selected reference area. Innovative reference criteria were developed to compare and aggregate the impact indicators on the basis of the site-specific impact burden and sustainability policy. An aggregation procedure also allows the calculation of overall sustainability performance indicators and of an "impact fingerprint" of each process alternative. The final aim of the method is to support the decision making process during process development, providing a straightforward assessment of the expected sustainability performances. The application of the methodology to case studies concerning alternative waste disposal processes allowed a preliminary screening of the expected critical sustainability impacts of each process. The methodology was shown to provide useful results to address sustainability issues in the early stages of process design.
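
    A minimal sketch of the normalization-and-aggregation idea, under the assumption that each impact category is divided by the existing burden of the reference area and that aggregation is a simple weighted average; the paper's actual reference criteria are more elaborate, and the category names and weights here are placeholders:

```python
def normalized_indices(process_impacts, reference_burden):
    # express each impact category as a fraction of the reference area's burden
    return {c: process_impacts[c] / reference_burden[c] for c in process_impacts}

def overall_indicator(indices, weights):
    # aggregate the normalized indices into one sustainability performance number
    total_w = sum(weights[c] for c in indices)
    return sum(indices[c] * weights[c] for c in indices) / total_w
```

    The dictionary of normalized indices plays the role of the "impact fingerprint": alternatives can be compared category by category before any aggregation hides the trade-offs.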

  3. Multiqubit subradiant states in N -port waveguide devices: ɛ-and-μ-near-zero hubs and nonreciprocal circulators

    NASA Astrophysics Data System (ADS)

    Liberal, Iñigo; Engheta, Nader

    2018-02-01

    Quantum emitters interacting through a waveguide setup have been proposed as a promising platform for basic research on light-matter interactions and quantum information processing. We propose to augment waveguide setups with the use of multiport devices. Specifically, we demonstrate theoretically the possibility of exciting N -qubit subradiant, maximally entangled, states with the use of suitably designed N -port devices. Our general methodology is then applied to two different devices: an epsilon-and-mu-near-zero waveguide hub and a nonreciprocal circulator. A sensitivity analysis is carried out to assess the robustness of the system against a number of nonidealities. These findings link and merge the designs of devices for quantum state engineering with classical communication network methodologies.

  4. Opening a Gateway for Chemiluminescence Cell Imaging: Distinctive Methodology for Design of Bright Chemiluminescent Dioxetane Probes

    PubMed Central

    2017-01-01

    Chemiluminescence probes are considered to be among the most sensitive diagnostic tools that provide high signal-to-noise ratio for various applications such as DNA detection and immunoassays. We have developed a new molecular methodology to design and foresee light-emission properties of turn-ON chemiluminescence dioxetane probes suitable for use under physiological conditions. The methodology is based on incorporation of a substituent on the benzoate species obtained during the chemiexcitation pathway of Schaap’s adamantylidene–dioxetane probe. The substituent effect was initially evaluated on the fluorescence emission generated by the benzoate species and then on the chemiluminescence of the dioxetane luminophores. A striking substituent effect on the chemiluminescence efficiency of the probes was obtained when acrylate and acrylonitrile electron-withdrawing groups were installed. The chemiluminescence quantum yield of the best probe was more than 3 orders of magnitude higher than that of a standard, commercially available adamantylidene–dioxetane probe. These are the most powerful chemiluminescence dioxetane probes synthesized to date that are suitable for use under aqueous conditions. One of our probes was capable of providing high-quality chemiluminescence cell images based on endogenous activity of β-galactosidase. This is the first demonstration of cell imaging achieved by a non-luciferin small-molecule probe with direct chemiluminescence mode of emission. We anticipate that the strategy presented here will lead to development of efficient chemiluminescence probes for various applications in the field of sensing and imaging. PMID:28470053

  5. Design and development of molecularly imprinted polymers for the selective extraction of deltamethrin in olive oil: An integrated computational-assisted approach.

    PubMed

    Martins, Nuno; Carreiro, Elisabete P; Locati, Abel; Ramalho, João P Prates; Cabrita, Maria João; Burke, Anthony J; Garcia, Raquel

    2015-08-28

    This work first addresses the design and development of molecularly imprinted systems selective for deltamethrin, aiming to provide a suitable sorbent for solid phase extraction (SPE) that will be further used for the implementation of an analytical methodology for the trace analysis of the target pesticide in spiked olive oil samples. To achieve this goal, a preliminary evaluation of the molecular recognition and selectivity of the molecularly imprinted polymers has been performed. In order to investigate the complexity of the mechanistic basis for template-selective recognition in these polymeric matrices, a quantum chemical approach has been attempted, providing new insights into the mechanisms underlying template recognition, and in particular the crucial role of the crosslinker agent and the solvent used. Thus, DFT calculations corroborate the results obtained by experimental molecular recognition assays, enabling one to select the most suitable imprinting system for the MISPE extraction technique, which encompasses acrylamide as functional monomer and ethylene glycol dimethacrylate as crosslinker. Furthermore, an analytical methodology comprising a sample preparation step based on solid phase extraction has been implemented using this "tailor made" imprinting system as sorbent, for the selective isolation/pre-concentration of deltamethrin from olive oil samples. Molecularly imprinted solid phase extraction (MISPE) methodology was successfully applied for the clean-up of spiked olive oil samples, with recovery rates up to 94%. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Software engineering techniques and CASE tools in RD13

    NASA Astrophysics Data System (ADS)

    Buono, S.; Gaponenko, I.; Jones, R.; Khodabandeh, A.; Mapelli, L.; Mornacchi, G.; Prigent, D.; Sanchez-Corral, E.; Skiadelli, M.; Toppers, A.; Duval, P. Y.; Ferrato, D.; Le Van Suu, A.; Qian, Z.; Rondot, C.; Ambrosini, G.; Fumagalli, G.; Polesello, G.; Aguer, M.; Huet, M.

    1994-12-01

    The RD13 project was approved in April 1991 for the development of a scalable data-taking system suitable for hosting various LHC studies. One of its goals is the exploitation of software engineering techniques, in order to indicate their overall suitability for data acquisition (DAQ), software design and implementation. This paper describes how such techniques have been applied to the development of components of the RD13 DAQ used in test-beam runs at CERN. We describe our experience with the Artifex CASE tool and its associated methodology. The issues raised when code generated by a CASE tool has to be integrated into an existing environment are also discussed.

  7. Modeling and Design Analysis Methodology for Tailoring of Aircraft Structures with Composites

    NASA Technical Reports Server (NTRS)

    Rehfield, Lawrence W.

    2004-01-01

    Composite materials provide design flexibility in that fiber placement and orientation can be specified and a variety of material forms and manufacturing processes are available. It is possible, therefore, to 'tailor' the structure to a high degree in order to meet specific design requirements in an optimum manner. Common industrial practices, however, have limited the choices designers make. One of the reasons for this is that there is a dearth of conceptual/preliminary design analysis tools specifically devoted to identifying structural concepts for composite airframe structures. Large scale finite element simulations are not suitable for such purposes. The present project has been devoted to creating modeling and design analysis methodology for use in the tailoring process of aircraft structures. Emphasis has been given to creating bend-twist elastic coupling in high aspect ratio wings or other lifting surfaces. The direction of our work was in concert with the overall NASA effort Twenty-First Century Aircraft Technology (TCAT). A multi-disciplinary team was assembled by Dr. Damodar Ambur to work on wing technology, which included our project.

  8. GAMES II Project: a general architecture for medical knowledge-based systems.

    PubMed

    Bruno, F; Kindler, H; Leaning, M; Moustakis, V; Scherrer, J R; Schreiber, G; Stefanelli, M

    1994-10-01

    GAMES II aims at developing a comprehensive and commercially viable methodology to avoid problems that ordinarily occur in KBS development. The GAMES II methodology proposes to design a KBS starting from an epistemological model of medical reasoning (the Select and Test Model). The design is viewed as a process of adding symbol-level information to the epistemological model. The architectural framework provided by GAMES II integrates the use of different formalisms and techniques, providing a large set of tools. The user can select the most suitable one for representing a piece of knowledge after a careful analysis of its epistemological characteristics. Special attention is devoted to the tools dealing with knowledge acquisition (both manual and automatic). A panel of practicing physicians is assessing the medical value of such a framework and its related tools by using it in a practical application.

  9. Hybrid CMS methods with model reduction for assembly of structures

    NASA Technical Reports Server (NTRS)

    Farhat, Charbel

    1991-01-01

    Future on-orbit structures will be designed and built in several stages, each with specific control requirements. Therefore there must be a methodology which can predict the dynamic characteristics of the assembled structure, based on the dynamic characteristics of the subassemblies and their interfaces. The methodology developed by CSC to address this issue is Hybrid Component Mode Synthesis (HCMS). HCMS distinguishes itself from standard component mode synthesis algorithms in the following features: (1) it does not require the subcomponents to have displacement compatible models, which makes it ideal for analyzing the deployment of heterogeneous flexible multibody systems, (2) it incorporates a second-level model reduction scheme at the interface, which makes it much faster than other algorithms and therefore suitable for control purposes, and (3) it does answer specific questions such as 'how does the global fundamental frequency vary if I change the physical parameters of substructure k by a specified amount?'. Because it is based on an energy principle rather than displacement compatibility, this methodology can also help the designer to define an assembly process. Current and future efforts are devoted to applying the HCMS method to design and analyze docking and berthing procedures in orbital construction.

  10. Application and testing of a procedure to evaluate transferability of habitat suitability criteria

    USGS Publications Warehouse

    Thomas, Jeff A.; Bovee, Ken D.

    1993-01-01

    A procedure designed to test the transferability of habitat suitability criteria was evaluated in the Cache la Poudre River, Colorado. Habitat suitability criteria were developed for active adult and juvenile rainbow trout in the South Platte River, Colorado. These criteria were tested by comparing microhabitat use predicted from the criteria with observed microhabitat use by adult rainbow trout in the Cache la Poudre River. A one-sided chi-square test, using counts of occupied and unoccupied cells in each suitability classification, was used to test for non-random selection of optimum habitat over usable habitat and of suitable habitat over unsuitable habitat. Criteria for adult rainbow trout were judged to be transferable to the Cache la Poudre River, but juvenile criteria (applied to adults) were not transferable. Random subsampling of occupied and unoccupied cells was conducted to determine the effect of sample size on the reliability of the test procedure. The incidence of type I and type II errors increased rapidly as the sample size was reduced below 55 occupied and 200 unoccupied cells. Recommended modifications to the procedure included the adoption of a systematic or randomized sampling design and direct measurement of microhabitat variables. With these modifications, the procedure is economical, simple and reliable. Use of the procedure as a quality assurance device in routine applications of the instream flow incremental methodology was encouraged.
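
    The core of such a transferability test can be sketched as a Pearson chi-square computation on a 2x2 table of occupied/unoccupied counts in two suitability classes, combined with a direction check so the test is one-sided. The critical value of 2.71 (one-sided, alpha = 0.05, df = 1) is an assumption for illustration, not a value taken from the paper:

```python
def chi_square_2x2(occ_high, unocc_high, occ_low, unocc_low):
    # Pearson chi-square for a 2x2 table:
    # rows = occupied / unoccupied cells, columns = higher / lower suitability class
    obs = [[occ_high, occ_low], [unocc_high, unocc_low]]
    row = [sum(r) for r in obs]
    col = [obs[0][j] + obs[1][j] for j in range(2)]
    total = sum(row)
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            exp = row[i] * col[j] / total
            chi2 += (obs[i][j] - exp) ** 2 / exp
    return chi2

def transferable(occ_high, unocc_high, occ_low, unocc_low, crit=2.71):
    # one-sided: selection must also run in the expected direction,
    # i.e. occupancy rate higher in the higher-suitability class
    prefers_high = occ_high * (occ_low + unocc_low) > occ_low * (occ_high + unocc_high)
    return prefers_high and chi_square_2x2(occ_high, unocc_high, occ_low, unocc_low) > crit
```

    With counts near the 55 occupied / 200 unoccupied cells mentioned in the abstract, the statistic is stable; as the subsampled counts shrink, the same computation increasingly misclassifies, which is the type I/type II behavior the study reports.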

  11. Observational studies of the association between glucose-lowering medications and cardiovascular outcomes: addressing methodological limitations.

    PubMed

    Patorno, Elisabetta; Patrick, Amanda R; Garry, Elizabeth M; Schneeweiss, Sebastian; Gillet, Victoria G; Bartels, Dorothee B; Masso-Gonzalez, Elvira; Seeger, John D

    2014-11-01

    Recent years have witnessed a growing body of observational literature on the association between glucose-lowering treatments and cardiovascular disease. However, many of the studies are based on designs or analyses that inadequately address the methodological challenges involved. We reviewed recent observational literature on the association between glucose-lowering medications and cardiovascular outcomes and assessed the design and analysis methods used, with a focus on their ability to address specific methodological challenges. We describe and illustrate these methodological issues and their impact on observed associations, providing examples from the reviewed literature. We suggest approaches that may be employed to manage these methodological challenges. From the evaluation of 81 publications of observational investigations assessing the association between glucose-lowering treatments and cardiovascular outcomes, we identified the following methodological challenges: 1) handling of temporality in administrative databases; 2) handling of risks that vary with time and treatment duration; 3) definitions of the exposure risk window; 4) handling of exposures that change over time; and 5) handling of confounding by indication. Most of these methodological challenges may be suitably addressed through application of appropriate methods. Observational research plays an increasingly important role in the evaluation of the clinical effects of diabetes treatment. Implementation of appropriate research methods holds the promise of reducing the potential for spurious findings and the risk that the spurious findings will mislead the medical community about risks and benefits of diabetes medications.

  12. A rotor technology assessment of the advancing blade concept

    NASA Technical Reports Server (NTRS)

    Pleasants, W. A.

    1983-01-01

    A rotor technology assessment of the Advancing Blade Concept (ABC) was conducted in support of a preliminary design study. The analytical methodology modifications and inputs, the correlation, and the results of the assessment are documented. The primary emphasis was on the high-speed forward flight performance of the rotor. The correlation data base included both the wind tunnel and the flight test results. An advanced ABC rotor design was examined; the suitability of the ABC for a particular mission was not considered. The objective of this technology assessment was to provide estimates of the performance potential of an advanced ABC rotor designed for high speed forward flight.

  13. The Challenge of Peat Substitution in Organic Seedling Production: Optimization of Growing Media Formulation through Mixture Design and Response Surface Analysis

    PubMed Central

    Ceglie, Francesco Giovanni; Bustamante, Maria Angeles; Ben Amara, Mouna; Tittarelli, Fabio

    2015-01-01

    Peat replacement is an increasing demand in containerized and transplant production, due to the environmental constraints associated with peat use. However, despite the wide body of information concerning the use of alternative materials as substrates, it is very complex to establish the best materials and mixtures. This work evaluates the use of mixture design and response surface methodology in a peat substitution experiment using two alternative materials (green compost and palm fibre trunk waste) for transplant production of tomato (Lycopersicon esculentum Mill.), melon (Cucumis melo L.) and lettuce (Lactuca sativa L.) under organic farming conditions. In general, the substrates showed suitable properties for use in seedling production, with the best plant response shown by the mixture of 20% green compost, 39% palm fibre and 31% peat. The mixture design and response surface methodology applied have been shown to be a useful approach to optimizing substrate formulations in peat substitution experiments and standardizing plant responses. PMID:26070163
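
    A sketch of how a mixture design couples with a response-surface model: candidate three-component blends are enumerated on a simplex grid (components summing to 1) and scored with a second-order Scheffé mixture polynomial. The coefficients below are placeholders, not values fitted in the study:

```python
def simplex_grid(step=0.1):
    # candidate three-component blends (e.g. compost, fibre, peat) summing to 1
    n = round(1 / step)
    return [(i * step, j * step, (n - i - j) * step)
            for i in range(n + 1) for j in range(n + 1 - i)]

def scheffe_quadratic(b, x):
    # Scheffe second-order mixture polynomial in three components
    x1, x2, x3 = x
    return (b[0] * x1 + b[1] * x2 + b[2] * x3
            + b[3] * x1 * x2 + b[4] * x1 * x3 + b[5] * x2 * x3)

def best_blend(b, step=0.1):
    # scan the simplex for the blend with the highest predicted response
    return max(simplex_grid(step), key=lambda x: scheffe_quadratic(b, x))
```

    Once the polynomial is fitted to the measured plant responses, scanning the simplex in this way is what yields an optimized blend such as the compost/fibre/peat mixture reported above.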

  14. Designing normative open virtual enterprises

    NASA Astrophysics Data System (ADS)

    Garcia, Emilia; Giret, Adriana; Botti, Vicente

    2016-03-01

    There is increasing interest in developing virtual enterprises in order to deal with the globalisation of the economy, the rapid growth of information technologies and the increase in competitiveness. In this paper we deal with the development of normative open virtual enterprises (NOVEs). They are systems with a global objective that are composed of a set of heterogeneous entities and enterprises that exchange services following a specific normative context. For analysing and designing systems of this kind, the multi-agent paradigm seems suitable because it offers a specific solution for supporting the social and contractual relationships between enterprises and for formalising their business processes. This paper presents how the Regulated Open Multi-agent Systems (ROMAS) methodology, an agent-oriented software methodology, can be used to analyse and design NOVEs. ROMAS offers a complete development process that supports identifying and formalising the structure of NOVEs, their normative context and the interactions among their members. The use of ROMAS is exemplified by means of a case study that represents an automotive supply chain.

  15. Advanced electric motor technology: Flux mapping

    NASA Technical Reports Server (NTRS)

    Doane, George B., III; Campbell, Warren; Brantley, Larry W.; Dean, Garvin

    1992-01-01

    This report contains the assumptions, mathematical models, design methodology, and design points involved with the design of an electromechanical actuator (EMA) suitable for directing the thrust vector of a large MSFC/NASA launch vehicle. Specifically the design of such an actuator for use on the upcoming liquid fueled National Launch System (NLS) is considered culminating in a point design of both the servo system and the electric motor needed. A major thrust of the work is in selecting spur gear and roller screw reduction ratios to achieve simultaneously wide bandwidth, maximum power transfer, and disturbance rejection while meeting specified horsepower requirements at a given stroking speed as well as a specified maximum stall force. An innovative feedback signal is utilized in meeting these diverse objectives.

  16. RLV Turbine Performance Optimization

    NASA Technical Reports Server (NTRS)

    Griffin, Lisa W.; Dorney, Daniel J.

    2001-01-01

    A task was developed at NASA/Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. There are four major objectives of this task: 1) to develop, enhance, and integrate advanced turbine aerodynamic design and analysis tools; 2) to develop the methodology for application of the analytical techniques; 3) to demonstrate the benefits of the advanced turbine design procedure through its application to a relevant turbine design point; and 4) to verify the optimized design and analysis with testing. Final results of the preliminary design and the results of the two-dimensional (2D) detailed design of the first-stage vane of a supersonic turbine suitable for a reusable launch vehicle (RLV) are presented. Analytical techniques for obtaining the results are also discussed.

  17. Structured representation for requirements and specifications

    NASA Technical Reports Server (NTRS)

    Cohen, Gerald C.; Fisher, Gene; Frincke, Deborah; Wolber, Dave

    1991-01-01

    This document was generated in support of NASA contract NAS1-18586, Design and Validation of Digital Flight Control Systems suitable for Fly-By-Wire Applications, Task Assignment 2. Task 2 is associated with a formal representation of requirements and specifications. In particular, this document contains results associated with the development of a Wide-Spectrum Requirements Specification Language (WSRSL) that can be used to express system requirements and specifications in both stylized and formal forms. Included with this development are prototype tools to support the specification language. In addition a preliminary requirements specification methodology based on the WSRSL has been developed. Lastly, the methodology has been applied to an Advanced Subsonic Civil Transport Flight Control System.

  18. Optimisation of surfactant decontamination and pre-treatment of waste chicken feathers by using response surface methodology.

    PubMed

    Tesfaye, Tamrat; Sithole, Bruce; Ramjugernath, Deresh; Ndlela, Luyanda

    2018-02-01

    Commercially processed, untreated chicken feathers are biologically hazardous due to the presence of blood-borne pathogens. Prior to valorisation, it is crucial that they are decontaminated to remove microbial contamination. The present study focuses on evaluating the best technologies to decontaminate and pre-treat chicken feathers in order to make them suitable for valorisation. Waste chicken feathers were washed with three surfactants (sodium dodecyl sulphate, dimethyl dioctadecyl ammonium chloride, and polyoxyethylene (40) stearate) using statistically designed experiments. Process conditions were optimised using response surface methodology with a Box-Behnken experimental design. The data were compared with decontamination using an autoclave. Under optimised conditions, the microbial counts of the decontaminated and pre-treated chicken feathers were significantly reduced, making them safe for handling and use in valorisation applications. Copyright © 2017 Elsevier Ltd. All rights reserved.
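
    For readers unfamiliar with the Box-Behnken design mentioned above, the coded design matrix is straightforward to generate: every pair of factors gets a 2x2 factorial at the -1/+1 levels while the remaining factors sit at their centre level, plus replicated centre points. The sketch below is a generic illustration, not the authors' run sheet; factor names in the comment are examples.

```python
from itertools import combinations

def box_behnken(n_factors, n_center=3):
    """Coded Box-Behnken design: for each pair of factors a 2x2
    factorial at -1/+1 with all other factors at 0, plus centre runs."""
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                run = [0] * n_factors
                run[i], run[j] = a, b
                runs.append(run)
    runs += [[0] * n_factors for _ in range(n_center)]
    return runs

# Three factors (e.g. surfactant dose, wash time, temperature):
# 12 edge runs + 3 centre replicates = 15 runs
design = box_behnken(3)
```

    The coded levels are then mapped linearly onto each factor's real range before running the experiments.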

  19. An evolving-requirements technology assessment process for advanced propulsion concepts

    NASA Astrophysics Data System (ADS)

    McClure, Erin Kathleen

    The following dissertation investigates the development of a methodology suitable for the evaluation of advanced propulsion concepts. At early stages of development, both the future performance of these concepts and their requirements are highly uncertain, making it difficult to forecast their future value. Developing advanced propulsion concepts requires a huge investment of resources. The methodology was developed to enhance decision-makers' understanding of the concepts so that they can mitigate the risks associated with developing them. A systematic methodology to identify potential advanced propulsion concepts and assess their robustness is necessary to reduce that risk. Existing advanced design methodologies have evaluated the robustness of technologies or concepts to variations in requirements, but they are not suitable for evaluating a large number of dissimilar concepts. Variations in requirements have been shown to impact the development of advanced propulsion concepts, and any method designed to evaluate these concepts must incorporate the possible variations of the requirements into the assessment. In order to do so, a methodology was formulated to account for two aspects of the problem. First, it had to systematically identify a probabilistic distribution for the future requirements. Such a distribution allows decision-makers to quantify the uncertainty introduced by variations in requirements. Second, the methodology had to be able to assess the robustness of the propulsion concepts as a function of that distribution. This dissertation describes these enabling elements in depth and proceeds to synthesize them into a new method, the Evolving Requirements Technology Assessment (ERTA).
As a proof of concept, the ERTA method was used to evaluate and compare advanced propulsion systems capable of powering a hurricane-tracking, High Altitude, Long Endurance (HALE) unmanned aerial vehicle (UAV). The use of the ERTA methodology to assess HALE UAV propulsion concepts demonstrated that potential variations in requirements do significantly impact the assessment and selection of propulsion concepts. The proof of concept also demonstrated that traditional forecasting techniques, such as cross-impact analysis, can be used to forecast the requirements for advanced propulsion concepts probabilistically. "Fitness", a measure of relative goodness, was used to evaluate the concepts. Finally, stochastic optimizations were used to evaluate the propulsion concepts across the range of requirement sets considered.

  20. Breaking the Link between Environmental Degradation and Oil Palm Expansion: A Method for Enabling Sustainable Oil Palm Expansion

    PubMed Central

    Smit, Hans Harmen; Meijaard, Erik; van der Laan, Carina; Mantel, Stephan; Budiman, Arif; Verweij, Pita

    2013-01-01

    Land degradation is a global concern. In tropical areas it primarily concerns the conversion of forest into non-forest lands and the associated losses of environmental services. Defining such degradation is not straightforward, which hampers effective reduction of degradation and the use of already degraded lands for more productive purposes. To facilitate the processes of avoided degradation and land rehabilitation, we have developed a methodology in which international environmental and social sustainability standards are used to determine the suitability of lands for sustainable agricultural expansion. The method was developed and tested in one of the frontiers of agricultural expansion, West Kalimantan province in Indonesia. The focus was on oil palm expansion, which is considered a major driver of deforestation in tropical regions globally. The results suggest that substantial changes in current land-use planning are necessary for most new plantations to comply with international sustainability standards. By visualizing options for sustainable expansion with our methodology, we demonstrate that the link between oil palm expansion and degradation can be broken. Application of the methodology with criteria and thresholds similar to ours could help the Indonesian government and the industry achieve their pro-growth, pro-job, pro-poor and pro-environment development goals. For sustainable agricultural production, context-specific guidance has to be developed in areas suitable for expansion. Our methodology can serve as a template for designing such commodity- and country-specific tools and delivering such guidance. PMID:24039700

  1. Application of Response Surface Methodology on Leaching of Iron from Partially Laterised Khondalite Rocks: A Bauxite Mining Waste

    NASA Astrophysics Data System (ADS)

    Swain, Ranjita; Bhima Rao, R.

    2018-04-01

    In the present investigation, response surface methodology (RSM) is used to build a quadratic model relating the process parameters to the response. This model is used to optimize the removal of iron oxide from Partially Laterised Khondalite (PLK) rocks, which is influenced by several independent variables, namely acid concentration, time and temperature. Second-order response functions are produced for leaching of iron oxide from PLK rocks, a bauxite mining waste. In RSM, a Box-Behnken design is used for process optimization to achieve maximum removal of iron oxide. The influence of the process variables on leaching of iron oxide is presented in the form of 3-D response graphs. The results of this investigation reveal that a 3 M hydrochloric acid concentration, 240 min leaching time and 373 K temperature are the best conditions for removal of 99% of the Fe2O3. The product obtained under these conditions has 80% brightness, which is suitable for ceramic and filler industry applications. The novelty of the work is that the waste can become a value-added product after suitable physical beneficiation and chemical treatment.
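
    The core RSM step, fitting a second-order response function and reading off the optimum, can be sketched as follows. The data here are synthetic (a known quadratic plus noise standing in for iron-removal measurements), so only the mechanics are illustrated, not the paper's numbers.

```python
import numpy as np

# Illustrative RSM mechanics: fit a second-order response surface
#   z = b0 + b1*x + b2*y + b3*x^2 + b4*y^2 + b5*x*y
# to noisy observations, then solve grad(z) = 0 for the stationary point.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 30)            # coded factor 1 (e.g. acid conc.)
y = rng.uniform(-1, 1, 30)            # coded factor 2 (e.g. temperature)
z = 90 - 5*(x - 0.3)**2 - 8*(y + 0.2)**2 + rng.normal(0, 0.1, 30)

X = np.column_stack([np.ones_like(x), x, y, x**2, y**2, x*y])
b, *_ = np.linalg.lstsq(X, z, rcond=None)

# Stationary point: [[2*b3, b5], [b5, 2*b4]] @ [x*, y*] = -[b1, b2]
H = np.array([[2*b[3], b[5]], [b[5], 2*b[4]]])
opt = np.linalg.solve(H, -b[1:3])
print("optimum (coded units):", opt)  # near (0.3, -0.2) by construction
```

    With a negative-definite Hessian H the stationary point is a maximum; the 3-D response graphs in the paper are simply this fitted surface plotted over the coded factor space.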

  2. Dynamic Compression of the Signal in a Charge Sensitive Amplifier: From Concept to Design

    NASA Astrophysics Data System (ADS)

    Manghisoni, Massimo; Comotti, Daniele; Gaioni, Luigi; Ratti, Lodovico; Re, Valerio

    2015-10-01

    This work is concerned with the design of a low-noise Charge Sensitive Amplifier featuring a dynamic signal compression based on the non-linear features of an inversion-mode MOS capacitor. These features make the device suitable for applications where a non-linear characteristic of the front-end is required, such as in imaging instrumentation for free electron laser experiments. The aim of the paper is to discuss a methodology for the proper design of the feedback network enabling the dynamic signal compression. Starting from this compression solution, the design of a low-noise Charge Sensitive Amplifier is also discussed. The study has been carried out by referring to a 65 nm CMOS technology.

  3. Novel Composites for Wing and Fuselage Applications

    NASA Technical Reports Server (NTRS)

    Suarez, J. A.; Buttitta, C.

    1996-01-01

    Design development was successfully completed for textile preforms with continuous cross-stiffened epoxy panels with cut-outs. The preforms developed included 3-D angle interlock weaving of graphite structural fibers impregnated by resin film infiltration (RFI) and shown to be structurally suitable under conditions requiring minimum acquisition costs. Design guidelines/analysis methodology for such textile structures are given. The development was expanded to a fuselage side-panel component of a subsonic commercial airframe and found to be readily scalable. The successfully manufactured panel was delivered to NASA Langley for biaxial testing. This report covers the work performed under Task 3 -- Cross-Stiffened Subcomponent; Task 4 -- Design Guidelines/Analysis of Textile-Reinforced Composites; and Task 5 -- Integrally Woven Fuselage Panel.

  4. Genetic mechanism for designing new generation of buildings from data obtained by sensor agent robots

    NASA Astrophysics Data System (ADS)

    Ono, Chihiro; Mita, Akira

    2012-04-01

    Due to the increase in elderly households and to global warming, the design of building spaces requires careful consideration of the needs of elderly people. Studies of intelligent spaces that can control devices suitable for residents may provide some of the functions needed. However, these intelligent spaces are based on predefined scenarios, so it is difficult for them to handle unexpected circumstances and adapt to people's needs. This study suggests a genetic adaptation algorithm for building spaces. The feasibility of the algorithm is tested by simulation. The algorithm extends the existing design methodology by quickly reflecting ongoing living information in a variety of patterns.

  5. Advanced Turbine Technology Applications Project (ATTAP)

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This report summarizes work performed in support of the development and demonstration of a structural ceramic technology for automotive gas turbine engines. The AGT101 regenerated gas turbine engine developed under the previous DOE/NASA Advanced Gas Turbine (AGT) program is being utilized for verification testing of the durability of next-generation ceramic components and their suitability for service at reference powertrain design conditions. Topics covered in this report include ceramic processing definition and refinement, design improvements to the test bed engine and test rigs, and design methodologies related to ceramic impact and fracture mechanisms. Appendices include reports by ATTAP subcontractors addressing the development of silicon nitride and silicon carbide families of materials and processes.

  6. A methodology for producing small scale rural land use maps in semi-arid developing countries using orbital imagery

    NASA Technical Reports Server (NTRS)

    Vangenderen, J. L. (Principal Investigator); Lock, B. F.

    1976-01-01

    The author has identified the following significant results. Results have shown that it is feasible to design a methodology that can provide suitable guidelines for operational production of small scale rural land use maps of semiarid developing regions from LANDSAT MSS imagery, using inexpensive and unsophisticated visual techniques. The suggested methodology provides immediate practical benefits to map makers attempting to produce land use maps in countries with limited budgets and equipment. Many preprocessing and interpretation techniques were considered but rejected on the grounds that they were inappropriate, mainly due to the high cost of imagery and/or equipment or their inadequacy for use in operational projects in developing countries. The suggested imagery and interpretation techniques, consisting of color composites and monocular magnification, proved to be the simplest, fastest, and most versatile methods.

  7. Robust decentralized controller for minimizing coupling effect in single inductor multiple output DC-DC converter operating in continuous conduction mode.

    PubMed

    Medeiros, Renan Landau Paiva de; Barra, Walter; Bessa, Iury Valente de; Chaves Filho, João Edgar; Ayres, Florindo Antonio de Cavalho; Neves, Cleonor Crescêncio das

    2018-02-01

    This paper describes a novel robust decentralized control design methodology for a single inductor multiple output (SIMO) DC-DC converter. Based on a nominal multiple input multiple output (MIMO) plant model and performance requirements, a pairing input-output analysis is performed to select the suitable input to control each output aiming to attenuate the loop coupling. Thus, the plant uncertainty limits are selected and expressed in interval form with parameter values of the plant model. A single inductor dual output (SIDO) DC-DC buck converter board is developed for experimental tests. The experimental results show that the proposed methodology can maintain a desirable performance even in the presence of parametric uncertainties. Furthermore, the performance indexes calculated from experimental data show that the proposed methodology outperforms classical MIMO control techniques. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
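
    The input-output pairing analysis mentioned above is commonly carried out with the relative gain array (RGA). A small sketch with a made-up 2x2 steady-state gain matrix (not the actual SIDO converter gains):

```python
import numpy as np

# RGA for input-output pairing in a 2x2 plant: Lambda = G o (G^-1)^T,
# where "o" is the element-wise (Hadamard) product. The gain matrix
# below is invented for illustration.
G = np.array([[2.0, 0.5],
              [0.4, 1.5]])          # rows: outputs, columns: inputs

RGA = G * np.linalg.inv(G).T        # element-wise product

print(RGA)
# Pair each output with the input whose relative gain is closest to 1;
# here the diagonal dominates, so pair u1 -> y1 and u2 -> y2.
```

    Relative gains far from 1 (or negative) on the chosen pairing indicate strong loop coupling, which is exactly what the decentralized design tries to attenuate.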

  8. Cardiological database management system as a mediator to clinical decision support.

    PubMed

    Pappas, C; Mavromatis, A; Maglaveras, N; Tsikotis, A; Pangalos, G; Ambrosiadou, V

    1996-03-01

    An object-oriented medical database management system is presented for a typical cardiologic center, facilitating epidemiological trials. Object-oriented analysis and design were used for the system design, offering advantages for the integrity and extendibility of medical information systems. The system was developed using object-oriented design and programming methodology, the C++ language and the Borland Paradox Relational Data Base Management System on an MS-Windows NT environment. Particular attention was paid to system compatibility, portability, the ease of use, and the suitable design of the patient record so as to support the decisions of medical personnel in cardiovascular centers. The system was designed to accept complex, heterogeneous, distributed data in various formats and from different kinds of examinations such as Holter, Doppler and electrocardiography.

  9. An expert-based approach to forest road network planning by combining Delphi and spatial multi-criteria evaluation.

    PubMed

    Hayati, Elyas; Majnounian, Baris; Abdi, Ehsan; Sessions, John; Makhdoum, Majid

    2013-02-01

    Changes in forest landscapes resulting from road construction have increased remarkably in the last few years. On the other hand, the sustainable management of forest resources can only be achieved through a well-organized road network. In order to minimize the environmental impacts of forest roads, forest road managers must design the road network both efficiently and in an environmentally sound manner. Efficient planning methodologies can assist forest road managers in considering the technical, economic, and environmental factors that affect forest road planning. This paper describes a three-stage methodology using the Delphi method for selecting the important criteria, the Analytic Hierarchy Process for obtaining the relative importance of the criteria, and, finally, a spatial multi-criteria evaluation in a geographic information system (GIS) environment for identifying the lowest-impact road network alternative. Results of the Delphi method revealed that ground slope, lithology, distance from stream network, distance from faults, landslide susceptibility, erosion susceptibility, geology, and soil texture are the most important criteria for forest road planning in the study area. The suitability map for road planning was then obtained by combining the fuzzy map layers of these criteria with respect to their weights. Nine road network alternatives were designed using PEGGER, an ArcView GIS extension, and their values were extracted from the suitability map. Results showed that the methodology was useful for identifying road networks that met environmental and cost considerations. Based on this work, we suggest that multi-criteria evaluation and decision making be applied to forest road planning in other regions, where the road planning criteria identified in this study may prove useful.
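
    The AHP stage can be illustrated in a few lines: criteria weights are the normalised principal eigenvector of a pairwise comparison matrix on the Saaty scale. The judgements below are invented for three of the listed criteria; they are not the values elicited from the study's experts.

```python
import numpy as np

# Hypothetical pairwise comparisons for ground slope, lithology,
# and distance from streams (Saaty 1-9 scale, reciprocal matrix).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)              # principal (Perron) eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                             # normalised criteria weights

CI = (eigvals.real[k] - len(A)) / (len(A) - 1)   # consistency index
print("weights:", w, "CI:", CI)
```

    The weights then scale the fuzzy criterion layers in the GIS overlay; a consistency index well below 0.1 indicates the expert judgements are acceptably coherent.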

  10. Design and Analysis of Turbines for Space Applications

    NASA Technical Reports Server (NTRS)

    Griffin, Lisa W.; Dorney, Daniel J.; Huber, Frank W.

    2003-01-01

    In order to mitigate the risk of rocket propulsion development, efficient, accurate, detailed fluid dynamics analysis of the turbomachinery is necessary. This analysis is used for component development, design parametrics, performance prediction, and environment definition. To support this requirement, a task was developed at NASA/Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. There are four major objectives of this task: 1) to develop, enhance, and integrate advanced turbine aerodynamic design and analysis tools; 2) to develop the methodology for application of the analytical techniques; 3) to demonstrate the benefits of the advanced turbine design procedure through its application to a relevant turbine design point; and 4) to verify the optimized design and analysis with testing. The turbine chosen on which to demonstrate the procedure was a supersonic design suitable for a reusable launch vehicle (RLV). The hot gas path and blading were redesigned to obtain an increased efficiency. The redesign of the turbine was conducted with a consideration of system requirements, realizing that a highly efficient turbine that, for example, significantly increases engine weight, is of limited benefit. Both preliminary and detailed designs were considered. To generate an improved design, one-dimensional (1D) design and analysis tools, computational fluid dynamics (CFD), response surface methodology (RSM), and neural nets (NN) were used.

  11. Towards automating the discovery of certain innovative design principles through a clustering-based optimization technique

    NASA Astrophysics Data System (ADS)

    Bandaru, Sunith; Deb, Kalyanmoy

    2011-09-01

    In this article, a methodology is proposed for automatically extracting innovative design principles which make a system or process (subject to conflicting objectives) optimal using its Pareto-optimal dataset. Such 'higher knowledge' would not only help designers to execute the system better, but also enable them to predict how changes in one variable would affect other variables if the system has to retain its optimal behaviour. This in turn would help solve other similar systems with different parameter settings easily without the need to perform a fresh optimization task. The proposed methodology uses a clustering-based optimization technique and is capable of discovering hidden functional relationships between the variables, objective and constraint functions and any other function that the designer wishes to include as a 'basis function'. A number of engineering design problems are considered for which the mathematical structure of these explicit relationships exists and has been revealed by a previous study. A comparison with the multivariate adaptive regression splines (MARS) approach reveals the practicality of the proposed approach due to its ability to find meaningful design principles. The success of this procedure for automated innovization is highly encouraging and indicates its suitability for further development in tackling more complex design scenarios.

  12. Prediction of ground water quality index to assess suitability for drinking purposes using fuzzy rule-based approach

    NASA Astrophysics Data System (ADS)

    Gorai, A. K.; Hasni, S. A.; Iqbal, Jawed

    2016-11-01

    Groundwater is the most important natural source of drinking water for many people around the world, especially in rural areas where a supply of treated water is not available. Drinking water resources cannot be optimally used and sustained unless the quality of the water is properly assessed. To this end, an attempt has been made to develop a suitable methodology for the assessment of drinking water quality on the basis of 11 physico-chemical parameters. The present study aims to select a fuzzy aggregation approach for estimating the water quality index of a sample to check its suitability for drinking purposes. Based on experts' opinions and the authors' judgement, 11 water quality (pollutant) variables (alkalinity, dissolved solids (DS), hardness, pH, Ca, Mg, Fe, fluoride, As, sulphate, nitrates) were selected for the quality assessment. The output of the proposed methodology is compared with that of the widely used deterministic method (weighted arithmetic mean aggregation) to assess the suitability of the developed methodology.
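
    The deterministic baseline, weighted arithmetic mean aggregation, is simple to state: each parameter gets a quality rating q_i = 100 * C_i / S_i against its permissible standard S_i and a weight inversely proportional to S_i. A sketch with invented standards and sample values, using only four of the study's 11 parameters for brevity:

```python
# Weighted-arithmetic water quality index (WQI) sketch.
# Standards S_i and sample concentrations C_i below are illustrative.
standards = {"TDS": 500.0, "hardness": 300.0, "fluoride": 1.0, "nitrate": 45.0}
sample    = {"TDS": 350.0, "hardness": 180.0, "fluoride": 0.6, "nitrate": 20.0}

k = 1.0 / sum(1.0 / s for s in standards.values())      # proportionality const.
weights = {p: k / s for p, s in standards.items()}      # weights sum to 1

wqi = sum(weights[p] * 100.0 * sample[p] / standards[p] for p in standards)
print(f"WQI = {wqi:.1f}")   # below 100 means within standards on average
```

    The fuzzy approach in the paper replaces these crisp ratings and weights with membership functions and rule-based aggregation, which is why the two outputs are compared.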

  13. Use of multidimensional modeling to evaluate a channel restoration design for the Kootenai River, Idaho

    USGS Publications Warehouse

    Logan, B.L.; McDonald, R.R.; Nelson, J.M.; Kinzel, P.J.; Barton, G.J.

    2011-01-01

    River channel construction projects aimed at restoring or improving degraded waterways have become common but have met with varying success. In this report a methodology is proposed to evaluate channel designs before channels are built by using multidimensional modeling and analysis. This approach allows detailed analysis of water-surface profiles, sediment transport, and aquatic habitat that may result if the design is implemented. The method presented here addresses the need to model a range of potential stream-discharge and channel-roughness conditions to best assess the function of the design channel for a suite of possible conditions. This methodology is demonstrated by using a preliminary channel-restoration design proposed for a part of the Kootenai River in northern Idaho designated as critical habitat for the endangered white sturgeon (Acipenser transmontanus) and evaluating the design on the basis of simulations with the Flow and Sediment Transport with Morphologic Evolution of Channels (FaSTMECH) model. This evaluation indicated substantial problems with the preliminary design because boundary conditions used in the design were inconsistent with best estimates of future conditions. As a result, simulated water-surface levels did not meet target levels that corresponded to the designed bankfull surfaces; therefore, the flood plain would not function as intended. Sediment-transport analyses indicated that both the current channel of the Kootenai River and the design channel are largely unable to move the bed material through the reach at bankfull discharge. Therefore, sediment delivered to the design channel would likely be deposited within the reach instead of passing through it as planned. Consequently, the design channel geometry would adjust through time.
Despite these issues, the design channel would provide more aquatic habitat suitable for spawning white sturgeon (Acipenser transmontanus) at lower discharges than is currently available in the Kootenai River. The evaluation methodology identified potential problems with the design channel that can be addressed through design modifications to better meet project objectives before channel construction.

  14. User-oriented views in health care information systems.

    PubMed

    Portoni, Luisa; Combi, Carlo; Pinciroli, Francesco

    2002-12-01

    In this paper, we present the methodology we adopted in designing and developing an object-oriented database system for the management of medical records. The designed system provides technical solutions to important requirements of most clinical information systems, such as 1) the support of tools to create and manage views on data and view schemas, offering to different users specific perspectives on data tailored to their needs; 2) the capability to handle in a suitable way the temporal aspects related to clinical information; and 3) the effective integration of multimedia data. Remote data access for authorized users is also considered. As clinical application, we describe here the prototype of a user-oriented clinical information system for the archiving and the management of multimedia and temporally oriented clinical data related to percutaneous transluminal coronary angioplasty (PTCA) patients. Suitable view schemas for various user roles (cath-lab physician, ward nurse, general practitioner) have been modeled and implemented on the basis of a detailed analysis of the considered clinical environment, carried out by an object-oriented approach.

  15. Self-adaptive multi-objective harmony search for optimal design of water distribution networks

    NASA Astrophysics Data System (ADS)

    Choi, Young Hwan; Lee, Ho Min; Yoo, Do Guen; Kim, Joong Hoon

    2017-11-01

    In multi-objective optimization computing, it is important to assign suitable parameters to each optimization problem to obtain better solutions. In this study, a self-adaptive multi-objective harmony search (SaMOHS) algorithm is developed to apply the parameter-setting-free technique, which is an example of a self-adaptive methodology. The SaMOHS algorithm attempts to remove some of the inconvenience of parameter setting and selects the most adaptive parameters during the iterative solution search process. To verify the proposed algorithm, a least-cost water distribution network design problem is solved for three different target networks. The results are compared with other well-known algorithms such as multi-objective harmony search and the non-dominated sorting genetic algorithm-II. The efficiency of the proposed algorithm is quantified by suitable performance indices. The results indicate that SaMOHS can be efficiently applied to the search for Pareto-optimal solutions in a multi-objective solution space.
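
    A minimal single-objective harmony search conveys the mechanics that SaMOHS builds on; SaMOHS additionally self-adapts the HMCR and PAR parameters and handles multiple objectives via Pareto dominance. The parameter values, bandwidth rule, and toy sphere objective below are illustrative choices, not the paper's settings.

```python
import random

def harmony_search(f, bounds, hms=10, hmcr=0.9, par=0.3, iters=2000, seed=1):
    """Basic harmony search minimising f over box bounds."""
    rng = random.Random(seed)
    new = lambda: [rng.uniform(lo, hi) for lo, hi in bounds]
    memory = sorted((new() for _ in range(hms)), key=f)   # harmony memory
    for _ in range(iters):
        x = []
        for d, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:                 # memory consideration
                v = memory[rng.randrange(hms)][d]
                if rng.random() < par:              # pitch adjustment
                    v += rng.uniform(-1, 1) * 0.05 * (hi - lo)
                x.append(min(max(v, lo), hi))
            else:                                   # random selection
                x.append(rng.uniform(lo, hi))
        if f(x) < f(memory[-1]):                    # replace worst harmony
            memory[-1] = x
            memory.sort(key=f)
    return memory[0]

# Toy run: minimise the sphere function; the best harmony approaches the origin.
best = harmony_search(lambda v: sum(t * t for t in v), [(-5, 5)] * 3)
print(best)
```

    The parameter-setting-free idea is to track which of these operators produced accepted harmonies and adjust hmcr and par accordingly during the search, instead of fixing them in advance.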

  16. A product-service system approach to telehealth application design.

    PubMed

    Flores-Vaquero, Paul; Tiwari, Ashutosh; Alcock, Jeffrey; Hutabarat, Windo; Turner, Chris

    2016-06-01

    A considerable proportion of current point-of-care devices do not offer a wide enough set of capabilities if they are to function in any telehealth system. There is a need for intermediate devices that lie between healthcare devices and service networks. The development of an application is suggested that allows a smartphone to take the role of an intermediate device. This research seeks to identify the telehealth service requirements for long-term condition management using a product-service system approach. The product-service system approach has proven to be a suitable methodology for the design and development of telehealth smartphone applications. © The Author(s) 2014.

  17. Evaluation of Soil Loss and Erosion Control Measures on Ranges and Range Structures at Installations in Temperate Climates

    DTIC Science & Technology

    2006-06-01

    Soil Loss Equation (USLE) and the Revised Universal Soil Loss Equation (RUSLE) continue to be widely accepted methods for estimating sediment loss... range areas. Therefore, a generalized design methodology using the Universal Soil Loss Equation (USLE) is presented to accommodate the variations... constructed use the slope most suitable to the area topography (3:1 or 4:1). Step 4: Using the Universal Soil Loss Equation (USLE), find the values of A
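
    The USLE computation referred to in Step 4 is a simple product of factors, A = R * K * LS * C * P. A worked example with invented factor values, not taken from the report:

```python
# Worked USLE example: A = R * K * LS * C * P gives average annual soil
# loss (tons/acre/yr in US customary units). All factor values below
# are illustrative, not from the report.
R  = 170.0   # rainfall-runoff erosivity
K  = 0.28    # soil erodibility
LS = 1.2     # slope length-steepness (e.g. a 4:1 berm face)
C  = 0.05    # cover-management (grassed surface)
P  = 1.0     # support practice (none)

A = R * K * LS * C * P
print(f"A = {A:.2f} tons/acre/yr")
```

    Comparing A against a tolerable soil-loss threshold for the range area then drives the choice of slope and cover in the design steps above.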

  18. Optimising reversed-phase liquid chromatographic separation of an acidic mixture on a monolithic stationary phase with the aid of response surface methodology and experimental design.

    PubMed

    Wang, Y; Harrison, M; Clark, B J

    2006-02-10

    An optimization strategy for the separation of an acidic mixture on a monolithic stationary phase is presented, with the aid of experimental design and response surface methodology (RSM). An orthogonal array design (OAD) OA(16) (2(15)) was used to choose the significant parameters for the optimization. The significant factors were optimized using a central composite design (CCD), and quadratic models relating the dependent and independent parameters were built. The mathematical models were tested on a number of simulated data sets and yielded coefficients of determination R(2) > 0.97 (n = 16). On applying the optimization strategy, the factor effects were visualized as three-dimensional (3D) response surfaces and contour plots. The optimal condition was achieved in less than 40 min using the monolithic packing with a mobile phase of methanol/20 mM phosphate buffer pH 2.7 (25.5/74.5, v/v). The method showed good agreement between the experimental data and predicted values throughout the studied parameter space and was suitable for optimization studies on the monolithic stationary phase for acidic compounds.
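
    The quadratic response-surface fit described above can be sketched as an ordinary least-squares fit of a full second-order model over a central composite design. This is a generic illustration in coded variables, not the paper's chromatographic model; the design points and coefficients below are assumptions:

```python
def quad_features(x1, x2):
    # Full quadratic model terms: intercept, linear, square, interaction.
    return [1.0, x1, x2, x1 * x1, x2 * x2, x1 * x2]

def solve(A, b):
    """Solve A @ x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def fit_quadratic(points, responses):
    """Least-squares fit of the quadratic RSM model via normal equations."""
    X = [quad_features(x1, x2) for x1, x2 in points]
    n = len(X[0])
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(n)] for i in range(n)]
    Xty = [sum(row[i] * y for row, y in zip(X, responses)) for i in range(n)]
    return solve(XtX, Xty)

# Face-centred central composite design in two coded factors
# (factorial corners, axial points, centre point):
design = [(-1, -1), (1, -1), (-1, 1), (1, 1),
          (-1, 0), (1, 0), (0, -1), (0, 1), (0, 0)]
true = [2.0, 1.0, -0.5, -1.5, -0.8, 0.3]   # coefficients to recover
ys = [sum(c * f for c, f in zip(true, quad_features(x1, x2)))
      for x1, x2 in design]
coef = fit_quadratic(design, ys)
```

    With noise-free synthetic responses the fit recovers the generating coefficients exactly, which is a quick sanity check that the design supports a full quadratic model.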

  19. Dataflow computing approach in high-speed digital simulation

    NASA Technical Reports Server (NTRS)

    Ercegovac, M. D.; Karplus, W. J.

    1984-01-01

    New computational tools and methodologies for the digital simulation of continuous systems were explored. Programmability and cost-effective performance in multiprocessor organizations for real-time simulation were investigated. The approach is based on functional-style languages and dataflow computing principles, which allow for the natural representation of parallelism in algorithms and provide a suitable basis for the design of cost-effective, high-performance distributed systems. The objectives of this research are to: (1) perform a comparative evaluation of several existing dataflow languages and develop an experimental dataflow language suitable for real-time simulation using multiprocessor systems; (2) investigate the main issues that arise in the architecture and organization of dataflow multiprocessors for real-time simulation; and (3) develop and apply performance evaluation models in typical applications.

  20. The Design and Validation of the Colorado Learning Attitudes about Science Survey

    NASA Astrophysics Data System (ADS)

    Adams, W. K.; Perkins, K. K.; Dubson, M.; Finkelstein, N. D.; Wieman, C. E.

    2005-09-01

    The Colorado Learning Attitudes about Science Survey (CLASS) is a new instrument designed to measure various facets of student attitudes and beliefs about learning physics. This instrument extends previous work by probing additional facets of student attitudes and beliefs. It has been worded to be suitable for students in a variety of different courses. This paper introduces the CLASS and its design and validation studies, which include analysis of results from over 2400 students, interviews, and factor analyses. The methodology used to determine the categories, and analyses of the robustness of those categories for probing various facets of student learning, are also described. This paper serves as the foundation for the results and conclusions drawn from the analysis of our survey data.

  1. Design of crashworthy structures with controlled behavior in HCA framework

    NASA Astrophysics Data System (ADS)

    Bandi, Punit

    The field of crashworthiness design is gaining more interest and attention from automakers around the world due to increasing competition and tighter safety norms. In the last two decades, topology and topometry optimization methods from structural optimization have been widely explored to improve existing designs or conceive new designs with better crashworthiness. Although many gradient-based and heuristic methods for topology- and topometry-based crashworthiness design are available these days, most of them result in stiff structures that are suitable only for a set of vehicle components in which maximizing the energy absorption or minimizing the intrusion is the main concern. However, there are some other components in a vehicle structure that should have characteristics of both stiffness and flexibility. Moreover, the load paths within the structure and potential buckle modes also play an important role in efficient functioning of such components. For example, the front bumper, side frame rails, steering column, and occupant protection devices like the knee bolster should all exhibit controlled deformation and collapse behavior. The primary objective of this research is to develop new methodologies to design crashworthy structures with controlled behavior. The well established Hybrid Cellular Automaton (HCA) method is used as the basic framework for the new methodologies, and compliant mechanism-type (sub)structures are the highlight of this research. The ability of compliant mechanisms to efficiently transfer force and/or motion from points of application of input loads to desired points within the structure is used to design solid and tubular components that exhibit controlled deformation and collapse behavior under crash loads. In addition, a new methodology for controlling the behavior of a structure under multiple crash load scenarios by adaptively changing the contributions from individual load cases is developed. 
Applied to practical design problems, the results demonstrate that the methodologies provide a practical tool to aid the design engineer in generating design concepts for crashworthy structures with controlled behavior. Although developed in the HCA framework, the basic ideas behind these methods are generic and can be easily implemented with other available topology- and topometry-based optimization methods.

  2. Design of broadband time-domain impedance boundary conditions using the oscillatory-diffusive representation of acoustical models.

    PubMed

    Monteghetti, Florian; Matignon, Denis; Piot, Estelle; Pascal, Lucas

    2016-09-01

    A methodology to design broadband time-domain impedance boundary conditions (TDIBCs) from the analysis of acoustical models is presented. The derived TDIBCs are recast exclusively as first-order differential equations, well-suited for high-order numerical simulations. Broadband approximations are yielded from an elementary linear least squares optimization that is, for most models, independent of the absorbing material geometry. This methodology relies on a mathematical technique referred to as the oscillatory-diffusive (or poles and cuts) representation, and is applied to a wide range of acoustical models, drawn from duct acoustics and outdoor sound propagation, which covers perforates, semi-infinite ground layers, as well as cavities filled with a porous medium. It is shown that each of these impedance models leads to a different TDIBC. Comparison with existing numerical models, such as multi-pole or extended Helmholtz resonator, provides insights into their suitability. Additionally, the broadly-applicable fractional polynomial impedance models are analyzed using fractional calculus.

  3. Investigation of Inner Loop Flight Control Strategies for High-Speed Research

    NASA Technical Reports Server (NTRS)

    Newman, Brett; Kassem, Ayman

    1999-01-01

    This report describes the activities and findings conducted under contract NAS1-19858 with NASA Langley Research Center. The subject matter is the investigation of suitable flight control design methodologies and solutions for large, flexible high-speed vehicles. Specifically, the methodologies address the inner control loops used for stabilization and augmentation of a highly coupled airframe system possibly involving rigid-body motion, structural vibrations, unsteady aerodynamics, and actuator dynamics. Techniques considered in this body of work are primarily conventional, and the vehicle of interest is the High-Speed Civil Transport (HSCT). Major findings include: 1) current aeroelastic vehicle modeling procedures require further emphasis and refinement; 2) traditional and nontraditional inner loop flight control strategies employing a single feedback loop do not appear sufficient for highly flexible HSCT-class vehicles; 3) inner loop flight control systems will, in all likelihood, require multiple interacting feedback loops; and 4) the Ref. H HSCT configuration presents major challenges to designing acceptable closed-loop flight dynamics.

  4. Modelling white-water rafting suitability in a hydropower regulated Alpine River.

    PubMed

    Carolli, Mauro; Zolezzi, Guido; Geneletti, Davide; Siviglia, Annunziato; Carolli, Fabiano; Cainelli, Oscar

    2017-02-01

    Cultural and recreational river ecosystem services and their relations with the flow regime are still poorly investigated. We develop a modelling-based approach to assess recreational flow requirements and the spatially distributed river suitability for white-water rafting, a typical service offered by mountain streams, with potential conflicts of interest with hydropower regulation. The approach is based on the principles of habitat suitability modelling using water depth as the main attribute, with preference curves defined through interviews with local rafting guides. The methodology allows computation of streamflow thresholds for conditions of suitability and optimality of a river reach in relation to rafting. The rafting suitability response to past, present and future flow management scenarios can be predicted on the basis of a hydrological model, which is incorporated in the methodology and is able to account for anthropic effects. Rafting suitability is expressed through a novel metric, the "Rafting hydro-suitability index" (RHSI), which quantifies the cumulative duration of suitable and optimal conditions for rafting. The approach is applied to the Noce River (NE Italy), an Alpine river regulated by hydropower production and affected by hydropeaking, which influences suitability at a sub-daily scale. A dedicated algorithm is developed within the hydrological model to resemble hydropeaking conditions with daily flow data. In the Noce River, peak flows associated with hydropeaking support rafting activities in late summer, highlighting the dual nature of hydropeaking in regulated rivers. Rafting suitability is slightly reduced under present, hydropower-regulated flow conditions compared to an idealized flow regime characterised by no water abstractions. Localized water abstractions for small, run-of-the-river hydropower plants are predicted to negatively affect rafting suitability. 
The proposed methodology can be extended to support decision making for flow management in hydropower regulated streams, as it has the potential to quantify the response of different ecosystem services to flow regulation. Copyright © 2016 Elsevier B.V. All rights reserved.
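
    The abstract defines the RHSI via the cumulative duration of suitable and optimal conditions but does not give its exact formula, so the following is only a simplified stand-in that tallies time steps above assumed flow thresholds (all numbers hypothetical):

```python
def suitability_durations(flows, q_suitable, q_optimal):
    """Cumulative duration (in time steps) of suitable and optimal flows.

    A simplified stand-in for the paper's RHSI: it only counts the number
    of time steps at or above each assumed flow threshold.
    """
    suitable = sum(1 for q in flows if q >= q_suitable)
    optimal = sum(1 for q in flows if q >= q_optimal)
    return suitable, optimal

# Hypothetical sub-daily flow series (m^3/s) with an afternoon hydropeak:
hourly_q = [8, 8, 9, 10, 12, 18, 25, 30, 30, 28, 20, 12]
n_suit, n_opt = suitability_durations(hourly_q, q_suitable=15, q_optimal=25)
```

    On this toy series the hydropeak contributes all of the suitable hours, mirroring the abstract's observation that peak flows can support rafting.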

  5. Research design: the methodology for interdisciplinary research framework.

    PubMed

    Tobi, Hilde; Kampen, Jarl K

    2018-01-01

    Many of today's global scientific challenges require the joint involvement of researchers from different disciplinary backgrounds (social sciences, environmental sciences, climatology, medicine, etc.). Such interdisciplinary research teams face many challenges resulting from differences in training and scientific culture. Interdisciplinary education programs are required to train truly interdisciplinary scientists with respect to the critical skills and competences. For that purpose, this paper presents the Methodology for Interdisciplinary Research (MIR) framework. The MIR framework was developed to help cross disciplinary borders, especially those between the natural sciences and the social sciences. The framework has been specifically constructed to facilitate the design of interdisciplinary scientific research, and can be applied in an educational program, as a reference for monitoring the phases of interdisciplinary research, and as a tool to design such research in a process approach. It is suitable for research projects of different sizes and levels of complexity, and it allows for a range of combinations of methods (case study, mixed methods, etc.). The different phases of designing interdisciplinary research in the MIR framework are described and illustrated by real-life applications in teaching and research. We further discuss the framework's utility for research design in landscape architecture and mixed methods research, and provide an outlook on the framework's potential in inclusive interdisciplinary research and, last but not least, research integrity.

  6. Subjective comparison and evaluation of speech enhancement algorithms

    PubMed Central

    Hu, Yi; Loizou, Philipos C.

    2007-01-01

    Making meaningful comparisons between the performance of the various speech enhancement algorithms proposed over the years has been elusive due to the lack of a common speech database, differences in the types of noise used, and differences in testing methodology. To facilitate such comparisons, we report on the development of a noisy speech corpus suitable for the evaluation of speech enhancement algorithms. This corpus is subsequently used for the subjective evaluation of 13 speech enhancement methods encompassing four classes of algorithms: spectral subtractive, subspace, statistical-model-based, and Wiener-type algorithms. The subjective evaluation was performed by Dynastat, Inc. using the ITU-T P.835 methodology, designed to evaluate speech quality along three dimensions: signal distortion, noise distortion, and overall quality. This paper reports the results of the subjective tests. PMID:18046463

  7. Mining EEG with SVM for Understanding Cognitive Underpinnings of Math Problem Solving Strategies

    PubMed Central

    López, Julio

    2018-01-01

    We have developed a new methodology for examining and extracting patterns from brain electric activity by using data mining and machine learning techniques. Data was collected from experiments focused on the study of cognitive processes that might evoke different specific strategies in the resolution of math problems. A binary classification problem was constructed using correlations and phase synchronization between different electroencephalographic channels as characteristics and, as labels or classes, the math performances of individuals participating in specially designed experiments. The proposed methodology is based on using well-established procedures of feature selection, which were used to determine a suitable brain functional network size related to math problem solving strategies and also to discover the most relevant links in this network without including noisy connections or excluding significant connections. PMID:29670667

  8. Mining EEG with SVM for Understanding Cognitive Underpinnings of Math Problem Solving Strategies.

    PubMed

    Bosch, Paul; Herrera, Mauricio; López, Julio; Maldonado, Sebastián

    2018-01-01

    We have developed a new methodology for examining and extracting patterns from brain electric activity by using data mining and machine learning techniques. Data was collected from experiments focused on the study of cognitive processes that might evoke different specific strategies in the resolution of math problems. A binary classification problem was constructed using correlations and phase synchronization between different electroencephalographic channels as characteristics and, as labels or classes, the math performances of individuals participating in specially designed experiments. The proposed methodology is based on using well-established procedures of feature selection, which were used to determine a suitable brain functional network size related to math problem solving strategies and also to discover the most relevant links in this network without including noisy connections or excluding significant connections.
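
    The pipeline described in this abstract, inter-channel correlation features followed by filter-style feature selection, can be sketched on synthetic data. This is not the authors' SVM pipeline; the channel counts, noise levels, and scoring rule below are illustrative assumptions:

```python
import math
import random

def pearson(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb) if va and vb else 0.0

def trial_features(trial):
    """One feature per channel pair: the inter-channel signal correlation."""
    ch = list(trial)
    return [pearson(ch[i], ch[j])
            for i in range(len(ch)) for j in range(i + 1, len(ch))]

def rank_features(feature_rows, labels, top_k):
    """Filter-style selection: rank features by |correlation| with the label."""
    n_feat = len(feature_rows[0])
    scores = [abs(pearson([row[f] for row in feature_rows], labels))
              for f in range(n_feat)]
    return sorted(range(n_feat), key=lambda f: -scores[f])[:top_k]

# Synthetic demo: 3-channel trials where channels 0 and 1 co-vary only
# in class-1 trials, so the (0, 1) correlation feature should rank first.
rng = random.Random(1)
rows, labels = [], []
for label in [0, 1] * 20:
    ch0 = [rng.gauss(0, 1) for _ in range(50)]
    ch1 = ([v + rng.gauss(0, 0.2) for v in ch0] if label
           else [rng.gauss(0, 1) for _ in range(50)])
    ch2 = [rng.gauss(0, 1) for _ in range(50)]
    rows.append(trial_features([ch0, ch1, ch2]))
    labels.append(float(label))
selected = rank_features(rows, labels, top_k=1)
```

    In the actual study, the selected connectivity features would then be fed to a classifier such as an SVM; the filter step here only illustrates how noisy connections can be excluded before classification.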

  9. Optimization of Extraction Conditions for Phenolic Acids from the Leaves of Melissa officinalis L. Using Response Surface Methodology

    PubMed Central

    Yoo, Guijae; Lee, Il Kyun; Park, Seonju; Kim, Nanyoung; Park, Jun Hyung; Kim, Seung Hyun

    2018-01-01

    Background: Melissa officinalis L. is a well-known medicinal plant from the family Lamiaceae, which is distributed throughout the Eastern Mediterranean region and Western Asia. Objective: In this study, response surface methodology (RSM) was utilized to optimize the extraction conditions for bioactive compounds from the leaves of M. officinalis L. Materials and Methods: A Box–Behnken design (BBD) was utilized to evaluate the effects of three independent variables, namely extraction temperature (°C), methanol concentration (%), and solvent-to-material ratio (mL/g), on the responses of the contents of caffeic acid and rosmarinic acid. Results: Regression analysis showed a good fit to the experimental data. The optimal condition was obtained at an extraction temperature of 80.53°C, a methanol concentration of 29.89%, and a solvent-to-material ratio of 30 mL/g. Conclusion: These results indicate the suitability of the model employed and the successful application of RSM in optimizing the extraction conditions. This study may be useful for standardizing production quality, including improving the efficiency of large-scale extraction systems. SUMMARY The optimum conditions for the extraction of major phenolic acids from the leaves of Melissa officinalis L. were determined using response surface methodology. A Box–Behnken design was utilized to evaluate the effects of three independent variables. A quadratic polynomial model provided a satisfactory description of the experimental data. The optimized condition for simultaneous maximum contents of caffeic acid and rosmarinic acid was determined. Abbreviations used: RSM: Response surface methodology, BBD: Box–Behnken design, CA: Caffeic acid, RA: Rosmarinic acid, HPLC: High-performance liquid chromatography. PMID:29720824

  10. Design, Implementation, and Operational Methodologies for Sub-arcsecond Attitude Determination, Control, and Stabilization of the Super-pressure Balloon-Borne Imaging Telescope (SuperBIT)

    NASA Astrophysics Data System (ADS)

    Javier Romualdez, Luis

    Scientific balloon-borne instrumentation offers an attractive, competitive, and effective alternative to space-borne missions when considering the overall scope, cost, and development timescale required to design and launch scientific instruments. In particular, the balloon-borne environment provides a near-space regime that is suitable for a number of modern astronomical and cosmological experiments, where the atmospheric interference suffered by ground-based instrumentation is negligible at stratospheric altitudes. This work is centered around the analytical strategies and implementation considerations for the attitude determination and control of SuperBIT, a scientific balloon-borne payload capable of meeting the strict sub-arcsecond pointing and image stability requirements demanded by modern cosmological experiments. Broadly speaking, the designed stability specifications of SuperBIT coupled with its observational efficiency, image quality, and accessibility rivals state-of-the-art astronomical observatories such as the Hubble Space Telescope. To this end, this work presents an end-to-end design methodology for precision pointing balloon-borne payloads such as SuperBIT within an analytical yet implementationally grounded context. Simulation models of SuperBIT are analytically derived to aid in pre-assembly trade-off and case studies that are pertinent to the dynamic balloon-borne environment. From these results, state estimation techniques and control methodologies are extensively developed, leveraging the analytical framework of simulation models and design studies. This pre-assembly design phase is physically validated during assembly, integration, and testing through implementation in real-time hardware and software, which bridges the gap between analytical results and practical application. 
SuperBIT attitude determination and control is demonstrated throughout two engineering test flights that verify pointing and image stability requirements in flight, where the post-flight results close the overall design loop by suggesting practical improvements to pre-design methodologies. Overall, the analytical and practical results presented in this work, though centered around the SuperBIT project, provide generically useful and implementationally viable methodologies for high precision balloon-borne instrumentation, all of which are validated, justified, and improved both theoretically and practically. As such, the continuing development of SuperBIT, built from the work presented in this thesis, strives to further the potential for scientific balloon-borne astronomy in the near future.

  11. Learning Methodology in the Classroom to Encourage Participation

    ERIC Educational Resources Information Center

    Luna, Esther; Folgueiras, Pilar

    2014-01-01

    Service learning is a methodology that promotes the participation of citizens in their community. This article presents a brief conceptualization of citizen participation, characteristics of service learning methodology, and validation of a programme that promotes service-learning projects. This validation highlights the suitability of this…

  12. Evaluation of the instream flow incremental methodology by U.S. Fish and Wildlife Service field users

    USGS Publications Warehouse

    Armour, Carl L.; Taylor, Jonathan G.

    1991-01-01

    This paper summarizes results of a survey conducted in 1988 of 57 U.S. Fish and Wildlife Service field offices. The purpose was to document opinions of biologists experienced in applying the Instream Flow Incremental Methodology (IFIM). Responses were received from 35 offices where 616 IFIM applications were reported. The existence of six monitoring studies designed to evaluate the adequacy of flows provided at sites was confirmed. The two principal categories reported as stumbling blocks to the successful application of IFIM were beliefs that the methodology is technically too simplistic or that it is too complex to apply. Recommendations receiving the highest scores for future initiatives to enhance IFIM use were (1) training and workshops for field biologists; and (2) improving suitability index (SI) curves and computer models, and evaluating the relationship of weighted usable area (WUA) to fish responses. The authors concur that emphasis for research should be on addressing technical concerns about SI curves and WUA.

  13. A general architecture for intelligent training systems

    NASA Technical Reports Server (NTRS)

    Loftin, R. Bowen

    1987-01-01

    A preliminary design of a general architecture for autonomous intelligent training systems was developed. The architecture integrates expert system technology with teaching/training methodologies to permit the production of systems suitable for use by NASA, other government agencies, industry, and academia in the training of personnel for the performance of complex, mission-critical tasks. The proposed architecture consists of five elements: a user interface, a domain expert, a training session manager, a trainee model, and a training scenario generator. The design of this architecture was guided and its efficacy tested through the development of a system for use by Mission Control Center Flight Dynamics Officers in training to perform Payload-Assist Module Deploys from the orbiter.

  14. Aerodynamic optimization studies on advanced architecture computers

    NASA Technical Reports Server (NTRS)

    Chawla, Kalpana

    1995-01-01

    The approach to carrying out multi-discipline aerospace design studies in the future, especially in massively parallel computing environments, comprises choosing (1) suitable solvers to compute solutions to the equations characterizing a discipline, and (2) efficient optimization methods. In addition, for aerodynamic optimization problems, (3) smart methodologies must be selected to modify the surface shape. In this research effort, a 'direct' optimization method is implemented on the Cray C-90 to improve aerodynamic design. It is coupled with an existing implicit Navier-Stokes solver, OVERFLOW, to compute flow solutions. The optimization method is chosen such that it can accommodate multi-discipline optimization in future computations. In this work, however, only single-discipline aerodynamic optimization is included.

  15. A scalable method for O-antigen purification applied to various Salmonella serovars

    PubMed Central

    Micoli, F.; Rondini, S.; Gavini, M.; Pisoni, I.; Lanzilao, L.; Colucci, A.M.; Giannelli, C.; Pippi, F.; Sollai, L.; Pinto, V.; Berti, F.; MacLennan, C.A.; Martin, L.B.; Saul, A.

    2014-01-01

    The surface lipopolysaccharide of gram-negative bacteria is both a virulence factor and a B cell antigen. Antibodies against O-antigen of lipopolysaccharide may confer protection against infection, and O-antigen conjugates have been designed against multiple pathogens. Here, we describe a simplified methodology for extraction and purification of the O-antigen core portion of Salmonella lipopolysaccharide, suitable for large-scale production. Lipopolysaccharide extraction and delipidation are performed by acetic acid hydrolysis of whole bacterial culture and can take place directly in a bioreactor, without previous isolation and inactivation of bacteria. Further O-antigen core purification consists of rapid filtration and precipitation steps, without using enzymes or hazardous chemicals. The process was successfully applied to various Salmonella enterica serovars (Paratyphi A, Typhimurium, and Enteritidis), obtaining good yields of high-quality material, suitable for conjugate vaccine preparations. PMID:23142430

  16. Small Molecules Targeting the miRNA-Binding Domain of Argonaute 2: From Computer-Aided Molecular Design to RNA Immunoprecipitation.

    PubMed

    Bellissimo, Teresa; Masciarelli, Silvia; Poser, Elena; Genovese, Ilaria; Del Rio, Alberto; Colotti, Gianni; Fazi, Francesco

    2017-01-01

    The development of small-molecule-based targeted therapy design for human disease and cancer is the object of growing attention. Recently, specific microRNA (miRNA)-mimicking compounds able to bind the miRNA-binding domain of the Argonaute 2 protein (AGO2), inhibiting miRNA loading and its functional activity, were described. Computer-aided molecular design techniques and RNA immunoprecipitation represent suitable approaches for identifying compounds and experimentally determining whether a compound is able to impair the loading of miRNAs onto AGO2. Here, we describe these two methodologies, which we recently used to select a specific compound able to interfere with AGO2 functional activity and improve the retinoic acid-dependent myeloid differentiation of leukemic cells.

  17. Proceedings of the Interservice/Industry Training Systems Conference (9th), Held at Washington, DC, on 30 November - 2 December 1987

    DTIC Science & Technology

    1987-12-01

    requires much more data, but holds fast to the idea that the FV approach, or some other model, is critical if the job analysis process is to have its...Ada compiled code executes twice as fast as Microsoft's Fortran compiled code. This conclusion is at variance with the results obtained from...finish is not so important. Hence, if a design methodology produces code that will not execute fast enough on processors suitable for flight

  18. The formal verification of generic interpreters

    NASA Technical Reports Server (NTRS)

    Windley, P.; Levitt, K.; Cohen, G. C.

    1991-01-01

    Task assignment 3 of the design and validation of digital flight control systems suitable for fly-by-wire applications is studied. Task 3 is associated with the formal verification of embedded systems. In particular, results are presented that provide a methodological approach to microprocessor verification. A hierarchical decomposition strategy for specifying microprocessors is also presented. A theory of generic interpreters is presented that can be used to model microprocessor behavior. The generic interpreter theory abstracts away the details of instruction functionality, leaving a general model of what an interpreter does.

  19. Diet expert subsystem for CELSS

    NASA Technical Reports Server (NTRS)

    Yendler, Boris S.; Nguyen, Thoi K.; Waleh, Ahmad

    1991-01-01

    An account is given of the mathematical basis of a diet-controlling expert system, designated 'Ceres', for the human crews of a Controlled Ecological Life Support System (CELSS). The Ceres methodology can furnish both steady-state and dynamic diet solutions; the differences between Ceres and a conventional nutritional-modeling method are illustrated by the case of a three-component, potato-wheat-soybean food system. Attention is given to the role of food processing in furnishing flexibility in diet-planning management. Crew diet solutions based on simple optimizations are not necessarily the most suitable for optimum CELSS operation.

  20. The characteristics and interpretability of land surface change and implications for project design

    USGS Publications Warehouse

    Sohl, Terry L.; Gallant, Alisa L.; Loveland, Thomas R.

    2004-01-01

    The need for comprehensive, accurate information on land-cover change has never been greater. While remotely sensed imagery affords the opportunity to provide information on land-cover change over large geographic expanses at a relatively low cost, the characteristics of land-surface change bring into question the suitability of many commonly used methodologies. Algorithm-based methodologies to detect change generally cannot provide the same level of accuracy as the analyses done by human interpreters. Results from the Land Cover Trends project, a cooperative venture that includes the U.S. Geological Survey, Environmental Protection Agency, and National Aeronautics and Space Administration, have shown that land-cover conversion is a relatively rare event, occurs locally in small patches, varies geographically and temporally, and is spectrally ambiguous. Based on these characteristics of change and the type of information required, manual interpretation was selected as the primary means of detecting change in the Land Cover Trends project. Mixtures of algorithm-based detection and manual interpretation may often prove to be the most feasible and appropriate design for change-detection applications. Serious examination of the expected characteristics and measurability of change must be considered during the design and implementation phase of any change analysis project.

  1. Suitable RF spectrum in ISM band for 2-way advanced metering network in India

    NASA Astrophysics Data System (ADS)

    Mishra, A.; Khan, M. A.; Gaur, M. S.

    2013-01-01

    The ISM (Industrial, Scientific and Medical) bands in the radio frequency space in India offer two alternative spectra for implementing a wireless network for advanced metering infrastructure (AMI). These bands lie in the range of 2.4 GHz and the sub-GHz frequencies of 865 to 867 MHz. This paper aims to examine the suitability of both options by designing and executing experiments in the laboratory as well as carrying out field trials on electricity meters to validate the selected option. A parameter, the communication effectiveness index (CEI2), is defined to measure the effectiveness of two-way data communication (packet exchange) between two points under different scenarios of buildings and free space. Both 2.4 GHz and sub-GHz designs were implemented to compare the results. The experiments were conducted across 3 floors of a building. Validation of the selected option was carried out through a field trial, by integrating the selected radio frequency (RF) modem into single-phase electricity meters and installing these meters across three floors of the building. The methodology, implementation details, observations and resulting analytical conclusions are described in the paper.

  2. Using FEP's List and a PA Methodology for Evaluating Suitable Areas for the LLW Repository in Italy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Risoluti, P.; Ciabatti, P.; Mingrone, G.

    2002-02-26

    In Italy, following a referendum held in 1987, nuclear energy has been phased out. Since 1998, a general site selection process covering the whole Italian territory has been under way. A GIS (Geographic Information System) methodology was implemented in three steps using the ESRI Arc/Info and Arc/View platforms. The screening identified approximately 0.8% of the Italian territory as suitable for locating the LLW Repository, and 200 areas were identified as suitable using a multiple exclusion criteria procedure applied at national (1:500,000), regional (1:100,000) and local (1:25,000-1:10,000) scales. A methodology for evaluating these areas has been developed that allows, along with the evaluation of the long-term efficiency of the engineered barrier system (EBS), the characterization of the selected areas in terms of physical, safety and planning factors. The first step was to identify, on a referenced FEPs list, a group of geomorphological, geological, hydrogeological, climatic and human-induced processes and/or events considered important for site evaluation, taking the Italian situation into account. A site evaluation system was established by ascribing weighted scores to each of these processes and events, which were identified as the parameters of the new evaluation system. The score of each parameter ranges from 1 (low suitability) to 3 (high suitability). The corresponding weight is calculated by considering the effect of the parameter on the total dose to the critical group, using an upgraded AMBER model for PA calculation. At the end of the process, an index obtained as a weighted sum of the scores gives the degree of suitability of the selected areas for the LLW Repository location. The application of the methodology to two selected sites is given in the paper.
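
    The scoring scheme described above lends itself to a direct sketch. In this minimal illustration the parameter names, scores and weights are invented; in the paper each weight is derived from the parameter's effect on total dose to the critical group via an AMBER performance-assessment model.

```python
def suitability_index(scores, weights):
    """Weighted-sum suitability index: each parameter scores 1 (low
    suitability) to 3 (high suitability); weights reflect the parameter's
    importance (here invented and normalised to sum to 1)."""
    assert set(scores) == set(weights)
    assert all(s in (1, 2, 3) for s in scores.values())
    return sum(scores[p] * weights[p] for p in scores)

# Illustrative parameters for one candidate area (all values are made up)
scores = {"erosion": 3, "flooding": 2, "seismicity": 3, "land_use": 1}
weights = {"erosion": 0.2, "flooding": 0.3, "seismicity": 0.4, "land_use": 0.1}
print(suitability_index(scores, weights))  # ~2.5 on the 1-3 scale
```

    With normalised weights the index stays on the same 1-3 scale as the individual scores, which makes candidate areas directly comparable.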

  3. A climate responsive urban design tool: a platform to improve energy efficiency in a dry hot climate

    NASA Astrophysics Data System (ADS)

    El Dallal, Norhan; Visser, Florentine

    2017-09-01

    In the Middle East and North Africa (MENA) region, new urban developments should address the climatic conditions to improve outdoor comfort and to reduce the energy consumption of buildings. This article describes a design tool that supports climate responsive design for a dry hot climate. The approach takes the climate as an initiator for the conceptual urban form with a more energy-efficient urban morphology. The methodology relates the different passive strategies suitable for major climate conditions in MENA region (dry-hot) to design parameters that create the urban form. This parametric design approach is the basis for a tool that generates conceptual climate responsive urban forms so as to assist the urban designer early in the design process. Various conceptual scenarios, generated by a computational model, are the results of the proposed platform. A practical application of the approach is conducted on a New Urban Community in Aswan (Egypt), showing the economic feasibility of the resulting urban form and morphology, and the proposed tool.

  4. Roughness Based Crossflow Transition Control for a Swept Airfoil Design Relevant to Subsonic Transports

    NASA Technical Reports Server (NTRS)

    Li, Fei; Choudhari, Meelan M.; Carpenter, Mark H.; Malik, Mujeeb R.; Eppink, Jenna; Chang, Chau-Lyan; Streett, Craig L.

    2010-01-01

    A high fidelity transition prediction methodology has been applied to a swept airfoil design at a Mach number of 0.75 and chord Reynolds number of approximately 17 million, with the dual goal of an assessment of the design for the implementation and testing of roughness based crossflow transition control and continued maturation of such methodology in the context of realistic aerodynamic configurations. Roughness based transition control involves controlled seeding of suitable, subdominant crossflow modes in order to weaken the growth of naturally occurring, linearly more unstable instability modes via a nonlinear modification of the mean boundary layer profiles. Therefore, a synthesis of receptivity, linear and nonlinear growth of crossflow disturbances, and high-frequency secondary instabilities becomes desirable to model this form of control. Because experimental data is currently unavailable for passive crossflow transition control for such high Reynolds number configurations, a holistic computational approach is used to assess the feasibility of roughness based control methodology. Potential challenges inherent to this control application as well as associated difficulties in modeling this form of control in a computational setting are highlighted. At high Reynolds numbers, a broad spectrum of stationary crossflow disturbances amplify and, while it may be possible to control a specific target mode using Discrete Roughness Elements (DREs), nonlinear interaction between the control and target modes may yield strong amplification of the difference mode that could have an adverse impact on the transition delay using spanwise periodic roughness elements.

  5. Gadamerian philosophical hermeneutics as a useful methodological framework for the Delphi technique.

    PubMed

    Guzys, Diana; Dickson-Swift, Virginia; Kenny, Amanda; Threlkeld, Guinever

    2015-01-01

    In this article we aim to demonstrate how Gadamerian philosophical hermeneutics may provide a sound methodological framework for researchers using the Delphi Technique (Delphi) in studies exploring health and well-being. Reporting of the use of Delphi in health and well-being research is increasing, but less attention has been given to covering its methodological underpinnings. In Delphi, a structured anonymous conversation between participants is facilitated, via an iterative survey process. Participants are specifically selected for their knowledge and experience with the topic of interest. The purpose of structuring conversation in this manner is to cultivate collective opinion and highlight areas of disagreement, using a process that minimizes the influence of group dynamics. The underlying premise is that the opinion of a collective is more useful than that of an individual. In designing our study into health literacy, Delphi aligned well with our research focus and would enable us to capture collective views. However, we were interested in the methodology that would inform our study. As researchers, we believe that methodology provides the framework and principles for a study and is integral to research integrity. In assessing the suitability of Delphi for our research purpose, we found little information about underpinning methodology. The absence of a universally recognized or consistent methodology associated with Delphi was highlighted through a scoping review we undertook to assist us in our methodological thinking. This led us to consider alternative methodologies, which might be congruent with the key principles of Delphi. We identified Gadamerian philosophical hermeneutics as a methodology that could provide a supportive framework and principles. We suggest that this methodology may be useful in health and well-being studies utilizing the Delphi method.

  6. Gadamerian philosophical hermeneutics as a useful methodological framework for the Delphi technique

    PubMed Central

    Guzys, Diana; Dickson-Swift, Virginia; Kenny, Amanda; Threlkeld, Guinever

    2015-01-01

    In this article we aim to demonstrate how Gadamerian philosophical hermeneutics may provide a sound methodological framework for researchers using the Delphi Technique (Delphi) in studies exploring health and well-being. Reporting of the use of Delphi in health and well-being research is increasing, but less attention has been given to covering its methodological underpinnings. In Delphi, a structured anonymous conversation between participants is facilitated, via an iterative survey process. Participants are specifically selected for their knowledge and experience with the topic of interest. The purpose of structuring conversation in this manner is to cultivate collective opinion and highlight areas of disagreement, using a process that minimizes the influence of group dynamics. The underlying premise is that the opinion of a collective is more useful than that of an individual. In designing our study into health literacy, Delphi aligned well with our research focus and would enable us to capture collective views. However, we were interested in the methodology that would inform our study. As researchers, we believe that methodology provides the framework and principles for a study and is integral to research integrity. In assessing the suitability of Delphi for our research purpose, we found little information about underpinning methodology. The absence of a universally recognized or consistent methodology associated with Delphi was highlighted through a scoping review we undertook to assist us in our methodological thinking. This led us to consider alternative methodologies, which might be congruent with the key principles of Delphi. We identified Gadamerian philosophical hermeneutics as a methodology that could provide a supportive framework and principles. We suggest that this methodology may be useful in health and well-being studies utilizing the Delphi method. PMID:25948132

  7. Deformable Surface Accommodating Intraocular Lens: Second Generation Prototype Design Methodology and Testing.

    PubMed

    McCafferty, Sean J; Schwiegerling, Jim T

    2015-04-01

    Present an analysis methodology for developing and evaluating accommodating intraocular lenses incorporating a deformable interface. The next-generation design of an extruded gel interface intraocular lens is presented. A prototype based upon a similar, previously in vivo proven design was tested with measurements of actuation force, lens power, interface contour, optical transfer function, and visual Strehl ratio. Prototype-verified mathematical models were used to optimize optical and mechanical design parameters to maximize image quality and minimize the force required to accommodate. The prototype lens produced adequate image quality with the available physiologic accommodating force. The iterative mathematical modeling based upon the prototype maximized optical and mechanical performance through the maximum allowable gel thickness to extrusion diameter ratio, the maximum feasible refractive index change at the interface, and minimum gel material properties in Poisson's ratio and Young's modulus. The design prototype performed well. It operated within the physiologic constraints of the human eye, including the force available for full accommodative amplitude using the eye's natural focusing feedback, while maintaining image quality in the space available. The parameters that optimized optical and mechanical performance were delineated as those that minimize both asphericity and actuation pressure. The design parameters outlined herein can be used as a template to maximize the performance of a deformable interface intraocular lens. The article combines a multidisciplinary basic science approach from biomechanics, optical science, and ophthalmology to optimize an intraocular lens design suitable for preliminary animal trials.

  8. Modeling Methodologies for Design and Control of Solid Oxide Fuel Cell APUs

    NASA Astrophysics Data System (ADS)

    Pianese, C.; Sorrentino, M.

    2009-08-01

    Among the existing fuel cell technologies, Solid Oxide Fuel Cells (SOFCs) are particularly suitable for both stationary and mobile applications, due to their high energy conversion efficiencies, modularity, high fuel flexibility, and low emissions and noise. Moreover, their high working temperatures enable efficient cogeneration applications. SOFCs are entering a pre-industrial era, and interest in design tools has grown strongly in recent years. Optimal system configuration, component sizing, and control and diagnostic system design require computational tools that meet the conflicting needs of accuracy, affordable computational time, limited experimental effort and flexibility. The paper gives an overview of control-oriented modeling of SOFCs at both single-cell and stack level. Such an approach provides useful simulation tools for designing and controlling SOFC APUs intended for a wide application area, ranging from automotive to marine and airplane APUs.

  9. Energy audit in small wastewater treatment plants: methodology, energy consumption indicators, and lessons learned.

    PubMed

    Foladori, P; Vaccari, M; Vitali, F

    2015-01-01

    Energy audits in wastewater treatment plants (WWTPs) reveal large differences in energy consumption across the various treatment stages, depending also on the indicators used in the audits. This work is aimed at formulating a suitable methodology for performing audits in WWTPs and identifying the most suitable key energy consumption indicators for comparison among different plants and for benchmarking. Hydraulic-based stages, stages based on chemical oxygen demand, sludge-based stages and building stages were distinguished in the WWTPs and analysed with different energy indicators. Detailed energy audits were carried out on five small WWTPs treating fewer than 10,000 population equivalents, using two years of continuous data. The plants have in common a low designed-capacity utilization (52% on average) and equipment oversizing, which leads to wasted energy in the absence of controls and inverters (a common situation in small plants). The study confirms that there are several opportunities for reducing energy consumption in small WWTPs: in addition to the pumping of influent wastewater and aeration, small plants demonstrate low energy efficiency in the recirculation of settled sludge and in aerobic stabilization. Denitrification above 75% is ensured through intermittent aeration and without recirculation of mixed liquor. Automation in place of manual controls is mandatory for illumination and electrical heating.
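
    Two of the quantities discussed above, a specific energy indicator and the capacity utilisation, can be sketched directly. The figures below are invented, and kWh per population equivalent per year is only one of several possible indicators (the audit also distinguishes COD-based and sludge-based ones).

```python
def kwh_per_pe_year(annual_kwh, population_equivalent):
    """Specific energy consumption: kWh per population equivalent per year."""
    return annual_kwh / population_equivalent

def capacity_utilisation(actual_pe, design_pe):
    """Fraction of the designed treatment capacity actually used."""
    return actual_pe / design_pe

# Invented audit figures for two small plants (< 10,000 PE)
plant_a = kwh_per_pe_year(annual_kwh=180_000, population_equivalent=4_000)
plant_b = kwh_per_pe_year(annual_kwh=150_000, population_equivalent=2_500)
print(plant_a, plant_b)  # 45.0 60.0 -- plant B uses more energy per PE served
print(capacity_utilisation(2_600, 5_000))  # 0.52, the average reported in the study
```

    Normalising by population equivalent rather than by hydraulic load is what makes plants of different sizes comparable in a benchmark.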

  10. Traceable Co-C eutectic points for thermocouple calibration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jahan, F.; Ballico, M. J.

    2013-09-11

    National Measurement Institute of Australia (NMIA) has developed a miniature crucible design suitable for measurement by both thermocouples and radiation thermometry, and has established an ensemble of five Co-C eutectic-point cells based on this design. The cells in this ensemble have been individually calibrated using both ITS-90 radiation thermometry and thermocouples calibrated on the ITS-90 by the NMIA mini-coil methodology. The assigned ITS-90 temperatures obtained using these different techniques are both repeatable and consistent, despite the use of different furnaces and measurement conditions. The results demonstrate that, if individually calibrated, such cells can be practically used as part of a national traceability scheme for thermocouple calibration, providing a useful intermediate calibration point between Cu and Pd.

  11. Thermostabilisation of membrane proteins for structural studies

    PubMed Central

    Magnani, Francesca; Serrano-Vega, Maria J.; Shibata, Yoko; Abdul-Hussein, Saba; Lebon, Guillaume; Miller-Gallacher, Jennifer; Singhal, Ankita; Strege, Annette; Thomas, Jennifer A.; Tate, Christopher G.

    2017-01-01

    The thermostability of an integral membrane protein in detergent solution is a key parameter that dictates the likelihood of obtaining well-diffracting crystals suitable for structure determination. However, many mammalian membrane proteins are too unstable for crystallisation. We developed a thermostabilisation strategy based on systematic mutagenesis coupled to a radioligand-binding thermostability assay that can be applied to receptors, ion channels and transporters. It takes approximately 6-12 months to thermostabilise a G protein-coupled receptor (GPCR) containing 300 amino acid residues. The resulting thermostabilised membrane proteins are more easily crystallised and result in high-quality structures. This methodology has facilitated structure-based drug design applied to GPCRs, because it is possible to determine multiple structures of the thermostabilised receptors bound to low affinity ligands. Protocols and advice are given on how to develop thermostability assays for membrane proteins and how to combine mutations to make an optimally stable mutant suitable for structural studies. PMID:27466713

  12. Native American nurse leadership.

    PubMed

    Nichols, Lee A

    2004-07-01

    To identify which characteristics, wisdom, and skills are essential in becoming an effective Native American nurse leader. This will lead to the development of a curriculum suitable for Native American nurses. A qualitative, descriptive design was used for this study. Focus groups were conducted in Polson, Montana. A total of 67 Native and non-Native nurses participated. Sixty-seven percent of them were members of Indian tribes. Data were content analyzed using Spradley's ethnographic methodology. Three domains of analysis emerged: point of reference for the leader (individual, family, community), what a leader is (self-actualized, wise, experienced, political, bicultural, recognized, quiet presence, humble, spiritual, and visionary), and what a leader does (mentors, role models, communicates, listens, demonstrates values, mobilizes, and inspires). Native nurse leaders lead differently. Thus, a leadership curriculum suitable for Native nurses may lead to increased work productivity and therefore improved patient care for Native Americans.

  13. Variable speed wind turbine control by discrete-time sliding mode approach.

    PubMed

    Torchani, Borhen; Sellami, Anis; Garcia, Germain

    2016-05-01

    The aim of this paper is to propose a new variable speed wind turbine control design based on a discrete-time sliding mode approach. The methodology is designed for linear saturated systems, with the saturation constraint imposed on the input vector. To this end, a backstepping design procedure is followed to construct a suitable sliding manifold that guarantees attainment of the stabilization control objective. The mechanisms are investigated under commonly adopted assumptions to deal with the damping, shaft stiffness and inertia effect of the gear. The objectives are to synthesize robust controllers that maximize the energy extracted from the wind while reducing mechanical loads, combining rotor speed tracking with an electromagnetic torque. Simulation results of the proposed scheme are presented. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  14. Effects of aerodynamic heating and TPS thermal performance uncertainties on the Shuttle Orbiter

    NASA Technical Reports Server (NTRS)

    Goodrich, W. D.; Derry, S. M.; Maraia, R. J.

    1980-01-01

    A procedure for estimating uncertainties in the aerodynamic-heating and thermal protection system (TPS) thermal-performance methodologies developed for the Shuttle Orbiter is presented. This procedure is used in predicting uncertainty bands around expected or nominal TPS thermal responses for the Orbiter during entry. Individual flowfield and TPS parameters that make major contributions to these uncertainty bands are identified and, by statistical considerations, combined in a manner suitable for making engineering estimates of the TPS thermal confidence intervals and temperature margins relative to design limits. Thus, for a fixed TPS design, entry trajectories for future Orbiter missions can be shaped subject to both the thermal-margin and confidence-interval requirements. This procedure is illustrated by assessing the thermal margins offered by selected areas of the existing Orbiter TPS design for an entry trajectory typifying early flight test missions.
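
    The abstract states that individual flowfield and TPS parameter contributions are combined "by statistical considerations" but does not give the rule; a common engineering sketch, assuming independent contributions combined in root-sum-square fashion, is:

```python
import math

def combined_uncertainty(contributions):
    """Root-sum-square combination of independent one-sigma temperature
    uncertainty contributions (a common engineering assumption; the
    Orbiter report's exact statistical treatment is not given here)."""
    return math.sqrt(sum(u * u for u in contributions))

# Invented contributions (degrees): heating rate, TPS conductivity, catalycity
band = combined_uncertainty([60.0, 25.0, 30.0])
print(round(band, 1))  # 71.6 -- half-width of the band around the nominal response
```

    The RSS band is narrower than a straight sum of the contributions (115 degrees here), which is exactly why a statistical combination yields less conservative, more useful thermal margins.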

  15. Bibliography of Research Reports and Publications Issued by the Biodynamics and Bioengineering Division, 1944-1984.

    DTIC Science & Technology

    1985-04-01

    Ratino, R.K.H. Geber, A.A. Karl, D.R. Nelson, "OPTO Electronic Methodology Suitable for Electroretinographic Investigations During Environmental Stress."

  16. Open space suitability analysis for emergency shelter after an earthquake

    NASA Astrophysics Data System (ADS)

    Anhorn, J.; Khazai, B.

    2014-06-01

    In an emergency situation, shelter space is crucial for people affected by natural hazards. Emergency planners in disaster relief and mass care can greatly benefit from a sound methodology that identifies suitable shelter areas and sites where shelter services need to be improved. A methodology to rank the suitability of open spaces for contingency planning and placement of shelter in the immediate aftermath of a disaster is introduced. The Open Space Suitability Index (OSSI) combines two different measures: a qualitative evaluation criterion for the suitability and manageability of open spaces to be used as shelter sites, and a quantitative criterion using a capacitated accessibility analysis based on network analysis. For the qualitative assessment, implementation issues, environmental considerations, and basic utility supply are the main categories used to rank candidate shelter sites. A Geographic Information System (GIS) is used to reveal spatial patterns of shelter demand. Advantages and limitations of this method are discussed on the basis of a case study in Kathmandu Metropolitan City (KMC). According to the results, out of 410 open spaces under investigation, 12.2% must be considered not suitable (Categories D and E), while 10.7% are Category A and 17.6% are Category B. Almost two-thirds (59.5%) are fairly suitable (Category C).
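
    The abstract does not state how the qualitative category and the accessibility measure are aggregated; a minimal sketch, assuming the category is mapped to a score and scaled by a 0-1 accessibility value from the network analysis (site names and numbers invented), is:

```python
CATEGORY_SCORE = {"A": 1.0, "B": 0.8, "C": 0.5, "D": 0.2, "E": 0.0}

def open_space_suitability(category, accessibility):
    """Hypothetical aggregation: qualitative category (A best, E worst)
    scaled by a capacitated-accessibility score in [0, 1]."""
    assert 0.0 <= accessibility <= 1.0
    return CATEGORY_SCORE[category] * accessibility

# Invented candidate sites: (category, accessibility)
sites = {"site_1": ("A", 0.9), "site_2": ("C", 0.7), "site_3": ("D", 0.4)}
ranked = sorted(sites, key=lambda s: open_space_suitability(*sites[s]), reverse=True)
print(ranked)  # ['site_1', 'site_2', 'site_3']
```

    A multiplicative combination means a highly rated site that is effectively unreachable still ranks low, which matches the intent of pairing the two criteria.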

  17. Open space suitability analysis for emergency shelter after an earthquake

    NASA Astrophysics Data System (ADS)

    Anhorn, J.; Khazai, B.

    2015-04-01

    In an emergency situation, shelter space is crucial for people affected by natural hazards. Emergency planners in disaster relief and mass care can greatly benefit from a sound methodology that identifies suitable shelter areas and sites where shelter services need to be improved. A methodology to rank the suitability of open spaces for contingency planning and placement of shelter in the immediate aftermath of a disaster is introduced. The Open Space Suitability Index combines two different measures: a qualitative evaluation criterion for the suitability and manageability of open spaces to be used as shelter sites, and a quantitative criterion using a capacitated accessibility analysis based on network analysis. For the qualitative assessment, implementation issues, environmental considerations and basic utility supply are the main categories used to rank candidate shelter sites. A geographic information system is used to reveal spatial patterns of shelter demand. Advantages and limitations of this method are discussed on the basis of an earthquake hazard case study in the Kathmandu Metropolitan City. According to the results, out of 410 open spaces under investigation, 12.2% must be considered not suitable (Categories D and E), while 10.7% are Category A and 17.6% are Category B. Almost two-thirds (59.55%) are fairly suitable (Category C).

  18. Robust modular product family design

    NASA Astrophysics Data System (ADS)

    Jiang, Lan; Allada, Venkat

    2001-10-01

    This paper presents a modified Taguchi methodology to improve the robustness of modular product families against changes in customer requirements. The general research questions posed in this paper are: (1) How can a product family (PF) be effectively designed so that it is robust enough to accommodate future customer requirements? (2) How far into the future should designers look to design a robust product family? An example of a simplified vacuum product family is used to illustrate the methodology. In the example, customer requirements are selected as signal factors; future changes in customer requirements are selected as noise factors; an index called the quality characteristic (QC) is defined to evaluate the vacuum product family; and the module instance matrix (M) is selected as the control factor. Initially a relation between the objective function (QC) and the control factor (M) is established, and then the feasible M space is systematically explored using a simplex method to determine the optimum M and the corresponding QC values. Next, various noise levels at different time points are introduced into the system. For each noise level, the optimal values of M and QC are computed and plotted on a QC-chart. The tunable time period of the control factor (the module matrix, M) is computed using the QC-chart. The tunable time period represents the maximum time for which a given control factor can be used to satisfy current and future customer needs. Finally, a robustness index is used to break the tunable time period into suitable time periods that designers should consider while designing product families.

  19. Optimal illusion and invisibility of multilayered anisotropic cylinders and spheres.

    PubMed

    Zhang, Lin; Shi, Yan; Liang, Chang-Hong

    2016-10-03

    In this paper, full-wave electromagnetic scattering theory is employed to investigate illusion and invisibility of inhomogeneous anisotropic cylinders and spheres. With the use of a shell designed according to Mie series theory for multiple piecewise anisotropic layers, radar cross section (RCS) of the coated inhomogeneous anisotropic object can be dramatically reduced or disguised as another object in the long-wavelength limit. With the suitable adjustment of the anisotropy parameters of the shell, optimal illusion and invisibility characteristics of the coated inhomogeneous anisotropic object can be achieved. Details of theoretical analysis and numerical examples are presented to validate the proposed methodology.

  20. Presenting an Approach for Conducting Knowledge Architecture within Large-Scale Organizations

    PubMed Central

    Varaee, Touraj; Habibi, Jafar; Mohaghar, Ali

    2015-01-01

    Knowledge architecture (KA) establishes the basic groundwork for the successful implementation of a short-term or long-term knowledge management (KM) program. An example of KA is the design of a prototype before a new vehicle is manufactured. Due to a transformation to large-scale organizations, the traditional architecture of organizations is undergoing fundamental changes. This paper explores the main strengths and weaknesses in the field of KA within large-scale organizations and provides a suitable methodology and supervising framework to overcome specific limitations. This objective was achieved by applying and updating the concepts from the Zachman information architectural framework and the information architectural methodology of enterprise architecture planning (EAP). The proposed solution may be beneficial for architects in knowledge-related areas to successfully accomplish KM within large-scale organizations. The research method is descriptive; its validity is confirmed by performing a case study and polling the opinions of KA experts. PMID:25993414

  1. HOLA: Human-like Orthogonal Network Layout.

    PubMed

    Kieffer, Steve; Dwyer, Tim; Marriott, Kim; Wybrow, Michael

    2016-01-01

    Over the last 50 years a wide variety of automatic network layout algorithms have been developed. Some are fast heuristic techniques suitable for networks with hundreds of thousands of nodes, while others are multi-stage frameworks for higher-quality layout of smaller networks. However, despite decades of research, no current algorithm produces layouts of comparable quality to those of a human. We give a new "human-centred" methodology for automatic network layout algorithm design that is intended to overcome this deficiency. User studies are first used to identify the aesthetic criteria algorithms should encode; then an algorithm is developed that is informed by these criteria; and finally, a follow-up study evaluates the algorithm's output. We have used this new methodology to develop an automatic orthogonal network layout method, HOLA, that achieves measurably better layout (by user study) than the best available orthogonal layout algorithm and produces layouts of comparable quality to those produced by hand.

  2. Work Domain Analysis Methodology for Development of Operational Concepts for Advanced Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hugo, Jacques

    2015-05-01

    This report describes a methodology to conduct a Work Domain Analysis in preparation for the development of operational concepts for new plants. This method has been adapted from the classical method described in the literature in order to better deal with the uncertainty and incomplete information typical of first-of-a-kind designs. The report outlines the strategy for undertaking a Work Domain Analysis of a new nuclear power plant and the methods to be used in the development of the various phases of the analysis. Basic principles are described to the extent necessary to explain why and how the classical method was adapted to make it suitable as a tool for the preparation of operational concepts for a new nuclear power plant. Practical examples are provided of the systematic application of the method and the various presentation formats in the operational analysis of advanced reactors.

  3. Presenting an Approach for Conducting Knowledge Architecture within Large-Scale Organizations.

    PubMed

    Varaee, Touraj; Habibi, Jafar; Mohaghar, Ali

    2015-01-01

    Knowledge architecture (KA) establishes the basic groundwork for the successful implementation of a short-term or long-term knowledge management (KM) program. An example of KA is the design of a prototype before a new vehicle is manufactured. Due to a transformation to large-scale organizations, the traditional architecture of organizations is undergoing fundamental changes. This paper explores the main strengths and weaknesses in the field of KA within large-scale organizations and provides a suitable methodology and supervising framework to overcome specific limitations. This objective was achieved by applying and updating the concepts from the Zachman information architectural framework and the information architectural methodology of enterprise architecture planning (EAP). The proposed solution may be beneficial for architects in knowledge-related areas to successfully accomplish KM within large-scale organizations. The research method is descriptive; its validity is confirmed by performing a case study and polling the opinions of KA experts.

  4. Foucault, the subject and the research interview: a critique of methods.

    PubMed

    Fadyl, Joanna K; Nicholls, David A

    2013-03-01

    Research interviews are a widely used method in qualitative health research and have been adapted to suit a range of methodologies. Just as it is valuable that new approaches are explored, it is also important to continue to examine their appropriate use. In this article, we question the suitability of research interviews for 'history of the present' studies informed by the work of Michel Foucault - a form of qualitative research that is being increasingly employed in the analysis of healthcare systems and processes. We argue that several aspects of research interviewing produce philosophical and methodological complications that can interfere with achieving the aims of the analysis in this type of study. The article comprises an introduction to these tensions and examination of them in relation to key aspects of a Foucauldian philosophical position, and discussion of where this might position researchers when it comes to designing a study. © 2012 Blackwell Publishing Ltd.

  5. In-line monitoring of the coffee roasting process with near infrared spectroscopy: Measurement of sucrose and colour.

    PubMed

    Santos, João Rodrigo; Viegas, Olga; Páscoa, Ricardo N M J; Ferreira, Isabel M P L V O; Rangel, António O S S; Lopes, João Almeida

    2016-10-01

    In this work, a real-time and in-situ analytical tool based on near infrared spectroscopy is proposed to predict two of the most relevant coffee parameters during the roasting process: sucrose content and colour. The methodology was developed taking into consideration different coffee varieties (Arabica and Robusta), coffee origins (Brazil, East-Timor, India and Uganda) and roasting procedures (slow and fast). All near infrared spectroscopy-based calibrations were developed resorting to partial least squares regression. The results proved the suitability of this methodology, as demonstrated by a range-error ratio above 10 and a coefficient of determination above 0.85 for all modelled parameters. The relationship between sucrose and colour development during the roasting process is further discussed, in light of designing in real time coffee products with similar visual appearance and distinct organoleptic profiles. Copyright © 2016 Elsevier Ltd. All rights reserved.
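    The calibration metrics quoted above (a range-error ratio above 10 and a coefficient of determination above 0.85) can be sketched on synthetic data. A plain least-squares calibration on a few wavelengths stands in here for the paper's PLS models; all numbers are illustrative assumptions, not coffee spectra.

```python
import numpy as np

# Synthetic "spectra": three absorbance channels linearly related to a
# hypothetical sucrose content, plus small measurement noise.
rng = np.random.default_rng(0)
sucrose = rng.uniform(2.0, 9.0, 60)                       # hypothetical sucrose (%)
X = np.column_stack([sucrose * w + rng.normal(0, 0.05, 60) for w in (0.3, 0.6, 0.9)])
X = np.column_stack([X, np.ones(60)])                     # add an intercept column

beta, *_ = np.linalg.lstsq(X[:40], sucrose[:40], rcond=None)   # calibration set
pred = X[40:] @ beta                                            # validation set

resid = sucrose[40:] - pred
r2 = 1 - np.sum(resid**2) / np.sum((sucrose[40:] - sucrose[40:].mean())**2)
rer = (sucrose[40:].max() - sucrose[40:].min()) / np.sqrt(np.mean(resid**2))
print(f"R2 = {r2:.3f}, range-error ratio = {rer:.1f}")
```

    With a clean synthetic relationship like this, both acceptance thresholds are easily met; real NIR calibrations require the full PLS treatment described in the abstract.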

  6. Engineering Concepts in Stem Cell Research.

    PubMed

    Narayanan, Karthikeyan; Mishra, Sachin; Singh, Satnam; Pei, Ming; Gulyas, Balazs; Padmanabhan, Parasuraman

    2017-12-01

    The field of regenerative medicine integrates advancements made in stem cells, molecular biology, engineering, and clinical methodologies. Stem cells serve as a fundamental ingredient for therapeutic application in regenerative medicine. Apart from stem cells, engineering concepts have contributed equally to the success of stem cell based applications in improving human health. The purpose of the various engineering methodologies is to develop regenerative and preventive medicine to combat various diseases and deformities. The explosion of stem cell discoveries and their implementation in clinical settings warrants new engineering concepts and new biomaterials. Biomaterials, microfluidics, and nanotechnology are the major engineering concepts used for the implementation of stem cells in regenerative medicine. Many of these engineering technologies target the specific niche of the cell for better functional capability. Controlling the niche is the key to the various developmental activities leading to organogenesis and tissue homeostasis. Biomimetic understanding has not only helped to improve the design of matrices and scaffolds by incorporating suitable biological and physical components, but has also aided the adoption of designs that give these materials and devices better function. The adoption of engineering concepts in stem cell research has improved overall achievement; however, several important issues, such as long-term effects with respect to systems biology, need to be addressed. In this review the authors highlight some interesting breakthroughs in stem cell biology that use engineering methodologies. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Reliable design of a closed loop supply chain network under uncertainty: An interval fuzzy possibilistic chance-constrained model

    NASA Astrophysics Data System (ADS)

    Vahdani, Behnam; Tavakkoli-Moghaddam, Reza; Jolai, Fariborz; Baboli, Arman

    2013-06-01

    This article seeks to offer a systematic approach to establishing a reliable network of facilities in closed loop supply chains (CLSCs) under uncertainty. The facility locations determined in this article concurrently satisfy both traditional objective functions and reliability considerations in CLSC network design. To address this problem, a novel mathematical model is developed that integrates the network design decisions in both forward and reverse supply chain networks. The model also utilizes an effective reliability approach to find a robust network design. In order to make the results of this article more realistic, a CLSC for a case study in the iron and steel industry has been explored. The considered CLSC is multi-echelon, multi-facility, multi-product and multi-supplier. Furthermore, multiple facilities exist in the reverse logistics network, leading to high complexity. Since the collection centres play an important role in this network, the reliability of these facilities is taken into consideration. To solve the proposed model, a novel interactive hybrid solution methodology is developed by combining a number of efficient solution approaches from the recent literature. The proposed solution methodology is a bi-objective interval fuzzy possibilistic chance-constrained mixed integer linear programming (BOIFPCCMILP) model. Finally, computational experiments are provided to demonstrate the applicability and suitability of the proposed model in a supply chain environment and to help decision makers facilitate their analyses.

  8. Rapid Airplane Parametric Input Design (RAPID)

    NASA Technical Reports Server (NTRS)

    Smith, Robert E.

    1995-01-01

    RAPID is a methodology and software system to define a class of airplane configurations and directly evaluate surface grids, volume grids, and grid sensitivity on and about the configurations. A distinguishing characteristic that separates RAPID from other airplane surface modellers is that the output grids and grid sensitivity are directly applicable in CFD analysis. A small set of design parameters and grid control parameters govern the process, which is incorporated into interactive software for 'real time' visual analysis and into batch software for the application of optimization technology. The computed surface grids and volume grids are suitable for a wide range of Computational Fluid Dynamics (CFD) simulations. The general airplane configuration has wing, fuselage, horizontal tail, and vertical tail components. The double-delta wing and tail components are generated by solving a fourth-order partial differential equation (PDE) subject to Dirichlet and Neumann boundary conditions. The design parameters are incorporated into the boundary conditions and therefore govern the shapes of the surfaces. The PDE solution yields a smooth transition between boundaries. Surface grids suitable for CFD calculation are created by establishing an H-type topology about the configuration and incorporating grid spacing functions in the PDE for the lifting components and in the fuselage definition equations. User-specified grid parameters govern the location and degree of grid concentration. A two-block volume grid about a configuration is calculated using the Control Point Form (CPF) technique. The interactive software, which runs on Silicon Graphics IRIS workstations, allows design parameters to be varied continuously and the resulting surface grid to be observed in real time. The batch software computes both the surface and volume grids and also computes the sensitivity of the output grid with respect to the input design parameters by applying the precompiler tool ADIFOR to the grid generation program. The output of ADIFOR is a new source code containing the old code plus expressions for derivatives of specified dependent variables (grid coordinates) with respect to specified independent variables (design parameters). The RAPID methodology and software provide a means of rapidly defining numerical prototypes, grids, and grid sensitivity for a class of airplane configurations. This technology and software are highly useful in CFD research and in preliminary design and optimization processes.

  9. Direct access to dithiobenzoate RAFT agent fragmentation rate coefficients by ESR spin-trapping.

    PubMed

    Ranieri, Kayte; Delaittre, Guillaume; Barner-Kowollik, Christopher; Junkers, Thomas

    2014-12-01

    The β-scission rate coefficient of tert-butyl radicals fragmenting off the intermediate resulting from their addition to tert-butyl dithiobenzoate - a reversible addition-fragmentation chain transfer (RAFT) agent - is estimated via the recently introduced electron spin resonance (ESR) spin-trapping methodology as a function of temperature. The ESR spin-trapping methodology is critically evaluated and found to be reliable. At 20 °C, a fragmentation rate coefficient close to 0.042 s⁻¹ is observed, whereas the activation parameters for the fragmentation reaction - determined for the first time - read EA = 82 ± 13.3 kJ mol⁻¹ and A = (1.4 ± 0.25) × 10¹³ s⁻¹. The ESR spin-trapping methodology thus efficiently probes the stability of the RAFT adduct radical under conditions relevant for the pre-equilibrium of the RAFT process. It particularly indicates that stable RAFT adduct radicals are indeed formed in early stages of the RAFT polymerization, at least when dithiobenzoates are employed as controlling agents, as stipulated by the so-called slow fragmentation theory. By design of the methodology, the obtained fragmentation rate coefficients represent an upper limit. The ESR spin-trapping methodology is thus seen as a suitable tool for evaluating the fragmentation rate coefficients of a wide range of RAFT adduct radicals. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
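    As a consistency check, the quoted activation parameters reproduce a rate coefficient of the reported order of magnitude at 20 °C via the Arrhenius relation k = A·exp(−EA/RT), using only the values quoted in the abstract:

```python
import math

R = 8.314      # gas constant, J mol^-1 K^-1
E_A = 82e3     # activation energy from the abstract, J mol^-1
A = 1.4e13     # pre-exponential factor from the abstract, s^-1

def k_fragmentation(T_kelvin):
    """Arrhenius rate coefficient k = A * exp(-E_A / (R * T))."""
    return A * math.exp(-E_A / (R * T_kelvin))

k_20C = k_fragmentation(293.15)
print(f"k(20 C) = {k_20C:.3f} s^-1")  # same order as the ~0.042 s^-1 reported
```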

  10. Methodology for safety optimization of highway cross-sections for horizontal curves with restricted sight distance.

    PubMed

    Ibrahim, Shewkar E; Sayed, Tarek; Ismail, Karim

    2012-11-01

    Several earlier studies have noted shortcomings in existing geometric design guides, which provide deterministic standards. In these standards the safety margin of the design output is generally unknown and there is little knowledge of the safety implications of deviating from the standards. To mitigate these shortcomings, probabilistic geometric design has been advocated, where reliability analysis can be used to account for the uncertainty in the design parameters and to provide a mechanism for risk measurement to evaluate the safety impact of deviations from design standards. This paper applies reliability analysis to optimizing the safety of highway cross-sections. The paper presents an original methodology to select a suitable combination of cross-section elements with restricted sight distance so as to reduce collisions and achieve consistent risk levels. The purpose of this optimization method is to provide designers with a proactive approach to the design of cross-section elements in order to (i) minimize the risk associated with restricted sight distance, (ii) balance the risk across the two carriageways of the highway, and (iii) reduce the expected collision frequency. A case study involving nine cross-sections that are part of two major highway developments in British Columbia, Canada, is presented. The results showed that an additional reduction in collisions can be realized by incorporating the reliability component, P_nc (denoting the probability of non-compliance), in the optimization process. The proposed approach results in reduced and consistent risk levels for both travel directions in addition to further collision reductions. Copyright © 2012 Elsevier Ltd. All rights reserved.
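    The probability of non-compliance at the heart of such reliability analyses can be illustrated with a small Monte Carlo sketch: under assumed (hypothetical) distributions for the sight distance a cross-section supplies and the stopping sight distance drivers demand, non-compliance is the chance that demand exceeds supply. The distributions below are illustrative, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical distributions: available sight distance (ASD) supplied by the
# cross-section, and stopping sight distance (SSD) demanded by drivers.
asd = rng.normal(loc=160.0, scale=15.0, size=n)   # metres
ssd = rng.normal(loc=140.0, scale=20.0, size=n)   # metres

# Non-compliance occurs when demand exceeds supply.
p_nc = np.mean(ssd > asd)
print(f"Estimated P_nc = {p_nc:.3f}")
```

    In the paper's methodology this quantity feeds the optimization that balances risk across carriageways; here it simply shows how P_nc is defined and estimated.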

  11. A novel anti-windup framework for cascade control systems: an application to underactuated mechanical systems.

    PubMed

    Mehdi, Niaz; Rehan, Muhammad; Malik, Fahad Mumtaz; Bhatti, Aamer Iqbal; Tufail, Muhammad

    2014-05-01

    This paper describes anti-windup compensator (AWC) design methodologies for stable and unstable cascade plants with cascade controllers facing actuator saturation. Two novel full-order decoupling AWC architectures, based on equivalence of the overall closed-loop system, are developed to deal with windup effects. The decoupled architectures have been developed to formulate the AWC synthesis problem by assuring equivalence of the coupled and the decoupled architectures, instead of using an analogy, for cascade control systems. A comparison of both AWC architectures from an application point of view is provided to consolidate their utilities: one of the architectures is better in terms of computational complexity for implementation, while the other is suitable for unstable cascade systems. On the basis of these architectures for cascade systems facing stability and performance degradation in the event of actuator saturation, global AWC design methodologies utilizing linear matrix inequalities (LMIs) are developed. These LMIs are synthesized by application of Lyapunov theory, the global sector condition and the ℒ2 gain reduction of the uncertain decoupled nonlinear component of the decoupled architecture. Further, an LMI-based local AWC design methodology is derived by utilizing a local sector condition by means of a quadratic Lyapunov function to resolve the windup problem for unstable cascade plants under saturation. To demonstrate the effectiveness of the proposed AWC schemes, an underactuated mechanical system, the ball-and-beam system, is considered, and details of the simulation and practical implementation results are described. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  12. Economics of ion propulsion for large space systems

    NASA Technical Reports Server (NTRS)

    Masek, T. D.; Ward, J. W.; Rawlin, V. K.

    1978-01-01

    This study of advanced electrostatic ion thrusters for space propulsion was initiated to determine the suitability of the baseline 30-cm thruster for future missions and to identify other thruster concepts that would better satisfy mission requirements. The general scope of the study was to review mission requirements, select thruster designs to meet these requirements, assess the associated thruster technology requirements, and recommend short- and long-term technology directions that would support future thruster needs. Preliminary design concepts for several advanced thrusters were developed to assess the potential practical difficulties of a new design. This study produced useful general methodologies for assessing both planetary and earth orbit missions. For planetary missions, the assessment is in terms of payload performance as a function of propulsion system technology level. For earth orbit missions, the assessment is made on the basis of cost (cost sensitivity to propulsion system technology level).

  13. Application of the analytic hierarchy process to a sustainability assessment of coastal beach exploitation: a case study of the wind power projects on the coastal beaches of Yancheng, China.

    PubMed

    Tian, Weijun; Bai, Jie; Sun, Huimei; Zhao, Yangguo

    2013-01-30

    Sustainability assessments of coastal beach exploitation are difficult because the identification of appropriate monitoring methodologies and evaluation procedures is still ongoing. In particular, the most suitable procedure for the application of sustainability assessment to coastal beaches remains uncertain. This paper presents a complete sustainability assessment process for coastal beach exploitation based on the analytic hierarchy process (AHP). We developed an assessment framework consisting of 14 indicators derived from the three dimensions of suitability, economic and social value, and ecosystem. We chose a wind power project on a coastal beach of Yancheng as a case study. The results indicated that the wind power farms on the coastal beach were not completely in keeping with sustainable development theory. The construction of the wind power farms had some negative impacts. Therefore, in the design stage, wind turbines should be designed and planned carefully to minimize these negative impacts. In addition, the case study demonstrated that the AHP was capable of addressing the complexities associated with the sustainability of coastal beaches. Copyright © 2012 Elsevier Ltd. All rights reserved.
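    The AHP step at the core of such an assessment, deriving criterion weights from a pairwise comparison matrix and checking their consistency, can be sketched as follows. The matrix values below are hypothetical and do not come from the Yancheng study; its framework uses 14 indicators across the three dimensions named above.

```python
import numpy as np

# Hypothetical 3x3 pairwise comparison matrix on Saaty's 1-9 scale for the three
# dimensions (suitability, economic/social value, ecosystem) - illustrative only.
A = np.array([[1.0, 2.0, 3.0],
              [1/2, 1.0, 2.0],
              [1/3, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                             # priority weights (sum to 1)

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)     # consistency index
ri = 0.58                                # random index for n = 3 (Saaty)
cr = ci / ri                             # consistency ratio; < 0.1 is acceptable
print("weights:", np.round(w, 3), "CR:", round(cr, 3))
```

    The principal eigenvector gives the weights, and the consistency ratio flags whether the pairwise judgements are coherent enough to use.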

  14. Use of the My Health Record by people with communication disability in Australia: A review to inform the design and direction of future research.

    PubMed

    Hemsley, Bronwyn; Georgiou, Andrew; Carter, Rob; Hill, Sophie; Higgins, Isabel; van Vliet, Paulette; Balandin, Susan

    2016-12-01

    People with communication disability often struggle to convey their health information to multiple service providers and are at increased risk of adverse health outcomes related to the poor exchange of health information. The purpose of this article was to (a) review the literature informing future research on the Australian personally controlled electronic health record, 'My Health Record' (MyHR), specifically to include people with communication disability and their family members or service providers, and (b) to propose a range of suitable methodologies that might be applied in research to inform training, policy and practice in relation to supporting people with communication disability and their representatives to engage in using MyHR. The authors reviewed the literature and, with a cross-disciplinary perspective, considered ways to apply sociotechnical, health informatics, and inclusive methodologies to research on MyHR use by adults with communication disability. This article outlines a range of research methods suitable for investigating the use of MyHR by people who have communication disability associated with a range of acquired or lifelong health conditions, and their family members, and direct support workers. In planning the allocation of funds towards the health and well-being of adults with disabilities, both disability and health service providers must consider the supports needed for people with communication disability to use MyHR. There is an urgent need to focus research efforts on MyHR in populations with communication disability, who struggle to communicate their health information across multiple health and disability service providers. The design of studies and priorities for future research should be set in consultation with people with communication disability and their representatives. © The Author(s) 2016.

  15. 'Your DNA, Your Say': global survey gathering attitudes toward genomics: design, delivery and methods.

    PubMed

    Middleton, Anna; Niemiec, Emilia; Prainsack, Barbara; Bobe, Jason; Farley, Lauren; Steed, Claire; Smith, James; Bevan, Paul; Bonhomme, Natasha; Kleiderman, Erika; Thorogood, Adrian; Schickhardt, Christoph; Garattini, Chiara; Vears, Danya; Littler, Katherine; Banner, Natalie; Scott, Erick; Kovalevskaya, Nadezda V; Levin, Elissa; Morley, Katherine I; Howard, Heidi C

    2018-06-01

    Our international study, 'Your DNA, Your Say', uses film and an online cross-sectional survey to gather public attitudes toward the donation, access and sharing of DNA information. We describe the methodological approach used to create an engaging and bespoke survey, suitable for translation into many different languages. We address some of the particular challenges in designing a survey on the subject of genomics. In order to understand the significance of a genomic result, researchers and clinicians alike use external databases containing DNA and medical information from thousands of people. We ask how publics would like their 'anonymous' data to be used (or not to be used) and whether they are concerned by the potential risks of reidentification; the results will be used to inform policy.

  16. Open loop model for WDM links

    NASA Astrophysics Data System (ADS)

    D, Meena; Francis, Fredy; T, Sarath K.; E, Dipin; Srinivas, T.; K, Jayasree V.

    2014-10-01

    Wavelength Division Multiplexing (WDM) techniques over fibre links help to exploit the high bandwidth capacity of single-mode fibres. A typical WDM link consisting of a laser source, multiplexer/demultiplexer, amplifier and detector is considered for obtaining the open-loop gain model of the link. The methodology used here is to obtain individual component models using mathematical and different curve-fitting techniques. These individual models are then combined to obtain the WDM link model. The objective is to deduce a single-variable model for the WDM link in terms of the input current to the system. Thus it provides a black-box solution for a link. The Root Mean Square Error (RMSE) associated with each of the approximated models is given for comparison. This will help the designer select a suitable WDM link model during a complex link design.
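    The model-selection step, fitting each component's response with candidate curves and comparing them by RMSE, can be sketched with a hypothetical laser L-I characteristic. The threshold current and slope efficiency below are assumed for illustration, not taken from the paper.

```python
import numpy as np

# Hypothetical laser L-I characteristic: output power vs drive current,
# with a lasing threshold at 15 mA (illustrative values only).
current = np.linspace(10e-3, 60e-3, 50)                   # A
rng = np.random.default_rng(2)
power = np.clip(0.25 * (current - 15e-3), 0.0, None)      # W
power = power + rng.normal(0, 1e-5, current.size)         # measurement noise

def rmse(y, yhat):
    return float(np.sqrt(np.mean((y - yhat) ** 2)))

# Fit candidate polynomial models and compare by RMSE, as the link-model
# methodology does when selecting a curve fit for each component.
errors = {}
for deg in (1, 2, 3):
    coeffs = np.polyfit(current, power, deg)
    errors[deg] = rmse(power, np.polyval(coeffs, current))
    print(f"degree {deg}: RMSE = {errors[deg]:.2e} W")
```

    Once each component model is chosen this way, composing them yields the single-variable open-loop link model the abstract describes.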

  17. Use of steel slag as a new material for roads

    NASA Astrophysics Data System (ADS)

    Ochoa Díaz, R.; Romero Farfán, M.; Cardenas, J.; Forero, J.

    2017-12-01

    This research paper aims to analyse the behaviour of MDC-19 hot dense asphalt mixtures with steel slag as coarse aggregate, using asphalt 80-100, in order to verify whether this residue has characteristics suitable for such use. The physical and mechanical characterization was accomplished using phosphorous slag from the company Acerías Paz del Río S.A. The working formula was then determined for each mixture using the RAMCODES methodology, the briquettes were produced in the laboratory and the design verification was then performed. Taking into account the results obtained, it is concluded that the use of phosphorous slag as coarse aggregate in asphalt mixtures is feasible, since the design parameters and verification results obtained meet the specifications for use as a rolling layer.

  18. Balancing energy development and conservation: A method utilizing species distribution models

    USGS Publications Warehouse

    Jarnevich, C.S.; Laubhan, M.K.

    2011-01-01

    Alternative energy development is increasing, potentially leading to negative impacts on wildlife populations already stressed by other factors. Resource managers require a scientifically based methodology to balance energy development and species conservation, so we investigated modeling habitat suitability using Maximum Entropy to develop maps that could be used with other information to help site energy developments. We selected one species of concern, the Lesser Prairie-Chicken (LPCH; Tympanuchus pallidicinctus) found on the southern Great Plains of North America, as our case study. LPCH populations have been declining and are potentially further impacted by energy development. We used LPCH lek locations in the state of Kansas along with several environmental and anthropogenic parameters to develop models that predict the probability of lek occurrence across the landscape. The models all performed well as indicated by the high test area under the curve (AUC) scores (all >0.9). The inclusion of anthropogenic parameters in models resulted in slightly better performance based on AUC values, indicating that anthropogenic features may impact LPCH lek habitat suitability. Given the positive model results, this methodology may provide additional guidance in designing future survey protocols, as well as siting of energy development in areas of marginal or unsuitable habitat for species of concern. This technique could help to standardize and quantify the impacts various developments have upon at-risk species. © 2011 Springer Science+Business Media, LLC (outside the USA).

  19. A SYSTEMIC APPROACH TO MITIGATING URBAN STORM WATER RUNOFF VIA DEVELOPMENT PLANS BASED ON LAND SUITABILITY ANALYSIS

    EPA Science Inventory

    We advocate an approach to reduce the anticipated increase in stormwater runoff from conventional development by demonstrating a low-impact development that incorporates hydrologic factors into an expanded land suitability analysis. This methodology was applied to a 3 hectare exp...

  20. GC-MS (GAS CHROMATOGRAPHIC-MASS SPECTROMETRIC) SUITABILITY TESTING OF RCRA APPENDIX VIII AND MICHIGAN LIST ANALYTES

    EPA Science Inventory

    As a first step in a hierarchical scheme to demonstrate the suitability of present U.S. Environmental Protection Agency (USEPA) analysis methods and/or develop new methodology, the gas chromatographic (GC) separation and mass spectrometric (MS) detection characteristics of 328 to...

  1. Interdisciplinary Robotics Project for First-Year Engineering Degree Students

    ERIC Educational Resources Information Center

    Aznar, Mercedes; Zacarés, José; López, Jaime; Sánchez, Rafael; Pastor, José M.; Llorca, Jaume

    2015-01-01

    The acquisition of both transversal and specific competences cannot be achieved using conventional methodologies. New methodologies must be applied that promote the necessary competences for proper professional development. Interdisciplinary projects can be a suitable tool for competence-based learning. A priori, this might be complicated, as…

  2. Crop suitability monitoring for improved yield estimations with 100m PROBA-V data

    NASA Astrophysics Data System (ADS)

    Özüm Durgun, Yetkin; Gilliams, Sven; Gobin, Anne; Duveiller, Grégory; Djaby, Bakary; Tychon, Bernard

    2015-04-01

    This study was realised within the framework of a PhD aiming to advance agricultural monitoring with improved yield estimations using SPOT VEGETATION remotely sensed data. For the first research question, the aim was to improve dry matter productivity (DMP) for C3 and C4 plants by adding a water stress factor. Additionally, the relation between actual crop yield and DMP was studied. One of the limitations was the lack of crop-specific maps, which leads to the second research question on 'crop suitability monitoring'. The objective of this work is to create a methodological approach based on the spectral and temporal characteristics of PROBA-V images and ancillary data such as meteorology, soil and topographic data to improve the estimation of annual crop yields. The PROBA-V satellite was launched on 6th May 2013 and was designed to bridge the gap in space-borne vegetation measurements between SPOT-VGT (March 1998 - May 2014) and the upcoming Sentinel-3 satellites scheduled for launch in 2015/2016. PROBA-V has products in four spectral bands: BLUE (centred at 0.463 µm), RED (0.655 µm), NIR (0.845 µm), and SWIR (1.600 µm), with a spatial resolution ranging from 1 km to 300 m. Due to the construction of the sensor, the central camera can provide a 100 m data product with a 5- to 8-day revisit time. Although the 100 m data product is still in its test phase, a methodology for crop suitability monitoring was developed. The multi-spectral composites, NDVI (Normalised Difference Vegetation Index, (NIR - RED)/(NIR + RED)) and NDII (Normalised Difference Infrared Index, (NIR - SWIR)/(NIR + SWIR)) profiles, are used in addition to secondary data such as digital elevation data, precipitation, temperature, soil types and administrative boundaries to improve the accuracy of crop yield estimations. The methodology is evaluated on several FP7 SIGMA test sites for the 2014-2015 period. Reference data in the form of vector GIS with boundaries and cover type of agricultural fields are available through the SIGMA site partners. References: http://proba-v.vgt.vito.be/ http://www.geoglam-sigma.info/
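    The two vegetation indices used in the profiles are simple band ratios; a minimal sketch, with assumed reflectance values rather than PROBA-V measurements:

```python
def ndvi(nir, red):
    """Normalised Difference Vegetation Index: (NIR - RED) / (NIR + RED)."""
    return (nir - red) / (nir + red)

def ndii(nir, swir):
    """Normalised Difference Infrared Index: (NIR - SWIR) / (NIR + SWIR)."""
    return (nir - swir) / (nir + swir)

# Illustrative top-of-canopy reflectances for a vegetated pixel (assumed values);
# the same functions apply element-wise to NumPy band arrays.
nir, red, swir = 0.45, 0.08, 0.20
print(f"NDVI = {ndvi(nir, red):.3f}")   # high values indicate dense green vegetation
print(f"NDII = {ndii(nir, swir):.3f}")  # sensitive to canopy water content
```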

  3. Multivariable Techniques for High-Speed Research Flight Control Systems

    NASA Technical Reports Server (NTRS)

    Newman, Brett A.

    1999-01-01

    This report describes the activities and findings conducted under contract with NASA Langley Research Center. Subject matter is the investigation of suitable multivariable flight control design methodologies and solutions for large, flexible high-speed vehicles. Specifically, methodologies are to address the inner control loops used for stabilization and augmentation of a highly coupled airframe system possibly involving rigid-body motion, structural vibrations, unsteady aerodynamics, and actuator dynamics. Design and analysis techniques considered in this body of work are both conventional-based and contemporary-based, and the vehicle of interest is the High-Speed Civil Transport (HSCT). Major findings include: (1) control architectures based on aft tail only are not well suited for highly flexible, high-speed vehicles, (2) theoretical underpinnings of the Wykes structural mode control logic is based on several assumptions concerning vehicle dynamic characteristics, and if not satisfied, the control logic can break down leading to mode destabilization, (3) two-loop control architectures that utilize small forward vanes with the aft tail provide highly attractive and feasible solutions to the longitudinal axis control challenges, and (4) closed-loop simulation sizing analyses indicate the baseline vane model utilized in this report is most likely oversized for normal loading conditions.

  4. Design of Passive Power Filter for Hybrid Series Active Power Filter using Estimation, Detection and Classification Method

    NASA Astrophysics Data System (ADS)

    Swain, Sushree Diptimayee; Ray, Pravat Kumar; Mohanty, K. B.

    2016-06-01

    This research paper presents the design of a shunt Passive Power Filter (PPF) for a Hybrid Series Active Power Filter (HSAPF) that employs a novel analytic methodology superior to FFT analysis. This novel approach consists of the estimation, detection and classification of the signals. The proposed method is applied to estimate, detect and classify power quality (PQ) disturbances such as harmonics. This work deals with three methods: harmonic detection through the wavelet transform method, harmonic estimation by the Kalman filter algorithm and harmonic classification by the decision tree method. Among the different mother wavelets available for the wavelet transform method, db8 is selected as the most suitable mother wavelet because of its good transient response and compact oscillation in the frequency domain. In the harmonic compensation process, the detected harmonic is compensated through the Hybrid Series Active Power Filter (HSAPF) based on Instantaneous Reactive Power Theory (IRPT). The efficacy of the proposed method is verified in the MATLAB/SIMULINK domain as well as with an experimental set-up. The obtained results confirm the superiority of the proposed methodology over FFT analysis. The newly proposed PPF makes the conventional HSAPF more robust and stable.
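    The harmonic-estimation step can be illustrated with a minimal linear Kalman filter tracking the in-phase and quadrature components of a single harmonic. The signal, frequencies and noise settings below are assumptions for illustration, not the paper's data, and the paper's full pipeline additionally uses wavelet detection and decision-tree classification.

```python
import numpy as np

# Estimate the amplitude of the 5th harmonic (250 Hz) of a 50 Hz signal with a
# linear Kalman filter; the state is the harmonic's [in-phase, quadrature] pair.
f, fs = 250.0, 10_000.0                 # harmonic frequency, sampling rate (assumed)
t = np.arange(0, 0.1, 1 / fs)
true_amp = 0.2
signal = 1.0 * np.sin(2 * np.pi * 50 * t) + true_amp * np.sin(2 * np.pi * f * t)
signal += np.random.default_rng(3).normal(0, 0.01, t.size)

x = np.zeros(2)                         # state estimate
P = np.eye(2)                           # state covariance
R_noise = 1.0                           # assumed measurement noise variance
for ti, z in zip(t, signal):
    H = np.array([np.cos(2 * np.pi * f * ti), np.sin(2 * np.pi * f * ti)])
    S = H @ P @ H + R_noise             # innovation variance
    K = P @ H / S                       # Kalman gain
    x = x + K * (z - H @ x)             # measurement update
    P = P - np.outer(K, H) @ P

amp = np.hypot(*x)
print(f"estimated 5th-harmonic amplitude = {amp:.3f} (true {true_amp})")
```

    Because the state is constant, this recursion converges to the least-squares estimate of the harmonic's amplitude and phase, which is the role Kalman estimation plays in the methodology above.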

  5. Reliability and availability evaluation of Wireless Sensor Networks for industrial applications.

    PubMed

    Silva, Ivanovitch; Guedes, Luiz Affonso; Portugal, Paulo; Vasques, Francisco

    2012-01-01

    Wireless Sensor Networks (WSN) currently represent the best candidate for the communication solution for the last-mile connection in process control and monitoring applications in industrial environments. Most of these applications have stringent dependability (reliability and availability) requirements, as a system failure may result in economic losses, put people in danger or lead to environmental damage. Among the different types of faults that can lead to a system failure, permanent faults on network devices have a major impact. They can hamper communications over long periods of time and consequently disturb, or even disable, control algorithms. The lack of a structured approach for evaluating permanent faults prevents system designers from optimizing decisions that would minimize their occurrence. In this work we propose a methodology based on the automatic generation of a fault tree to evaluate the reliability and availability of Wireless Sensor Networks when permanent faults occur on network devices. The proposal supports any topology, different levels of redundancy, network reconfigurations, criticality of devices and arbitrary failure conditions. The proposed methodology is particularly suitable for the design and validation of Wireless Sensor Networks when trying to optimize their reliability and availability requirements.

  6. Reliability and Availability Evaluation of Wireless Sensor Networks for Industrial Applications

    PubMed Central

    Silva, Ivanovitch; Guedes, Luiz Affonso; Portugal, Paulo; Vasques, Francisco

    2012-01-01

    Wireless Sensor Networks (WSN) currently represent the best candidate to be adopted as the communication solution for the last-mile connection in process control and monitoring applications in industrial environments. Most of these applications have stringent dependability (reliability and availability) requirements, as a system failure may result in economic losses, put people in danger or lead to environmental damage. Among the different types of faults that can lead to a system failure, permanent faults on network devices have a major impact. They can hamper communications over long periods of time and consequently disturb, or even disable, control algorithms. The lack of a structured approach for evaluating permanent faults prevents system designers from making optimized decisions that minimize these occurrences. In this work we propose a methodology, based on the automatic generation of a fault tree, to evaluate the reliability and availability of Wireless Sensor Networks when permanent faults occur on network devices. The proposal supports any topology, different levels of redundancy, network reconfigurations, criticality of devices and arbitrary failure conditions. The proposed methodology is particularly suitable for the design and validation of Wireless Sensor Networks when optimizing their reliability and availability requirements. PMID:22368497

  7. Optimization of the ultrasonic assisted removal of methylene blue by gold nanoparticles loaded on activated carbon using experimental design methodology.

    PubMed

    Roosta, M; Ghaedi, M; Daneshfar, A; Sahraei, R; Asghari, A

    2014-01-01

    The present study focused on the removal of methylene blue (MB) from aqueous solution by ultrasound-assisted adsorption onto gold nanoparticles loaded on activated carbon (Au-NP-AC). This nanomaterial was characterized using different techniques such as SEM, XRD and BET. The effects of variables such as pH, initial dye concentration, adsorbent dosage (g), temperature and sonication time (min) on MB removal were studied using central composite design (CCD), and the optimum experimental conditions were found with a desirability function (DF) combined with response surface methodology (RSM). Fitting the experimental equilibrium data to various isotherm models such as the Langmuir, Freundlich, Tempkin and Dubinin-Radushkevich models showed the suitability and applicability of the Langmuir model. Analysis of the experimental adsorption data with various kinetic models such as the pseudo-first- and second-order, Elovich and intraparticle diffusion models showed the applicability of the second-order equation model. A small amount of the proposed adsorbent (0.01 g) is applicable for successful removal of MB (RE>95%) in a short time (1.6 min) with high adsorption capacity (104-185 mg g(-1)). Copyright © 2013 Elsevier B.V. All rights reserved.
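As a hedged sketch of the isotherm-fitting step mentioned above, the Langmuir model can be checked through its common linearized form Ce/qe = Ce/qm + 1/(K·qm); the concentrations and true parameters below are invented for illustration, not the paper's data:

```python
import numpy as np

# Illustrative sketch: assessing Langmuir suitability via the linearized form
#   Ce/qe = Ce/qm + 1/(K*qm)
# where qm is maximum capacity (mg/g) and K the Langmuir constant (L/mg).
qm_true, K_true = 150.0, 2.0
Ce = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])     # equilibrium concentration
qe = qm_true * K_true * Ce / (1.0 + K_true * Ce)    # Langmuir isotherm

# A straight line in (Ce, Ce/qe) space indicates a Langmuir-type fit;
# the slope and intercept recover the parameters.
slope, intercept = np.polyfit(Ce, Ce / qe, 1)
qm_fit = 1.0 / slope
K_fit = slope / intercept
```

With real (noisy) equilibrium data the quality of the straight-line fit, alongside alternatives such as Freundlich, is what distinguishes the models.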

  8. Recent Advances of MEMS Resonators for Lorentz Force Based Magnetic Field Sensors: Design, Applications and Challenges.

    PubMed

    Herrera-May, Agustín Leobardo; Soler-Balcazar, Juan Carlos; Vázquez-Leal, Héctor; Martínez-Castillo, Jaime; Vigueras-Zuñiga, Marco Osvaldo; Aguilera-Cortés, Luz Antonio

    2016-08-24

    Microelectromechanical systems (MEMS) resonators have allowed the development of magnetic field sensors with potential applications in biomedicine, the automotive industry, navigation systems, space satellites, telecommunications and non-destructive testing. We present a review of recent magnetic field sensors based on MEMS resonators, which operate using the Lorentz force. These sensors have a compact structure, wide measurement range, low energy consumption, high sensitivity and suitable performance. The design methodology, simulation tools, damping sources, sensing techniques and future applications of magnetic field sensors are discussed. The design process is fundamental in achieving correct selection of the operation principle, sensing technique, materials, fabrication process and readout systems of the sensors. In addition, the main sensing systems and challenges of the MEMS sensors are described. To develop the best devices, research on their mechanical reliability, vacuum packaging, design optimization and temperature compensation circuits is needed. Future applications will require multifunctional sensors for monitoring several physical parameters (e.g., magnetic field, acceleration, angular rate, humidity, temperature and gases).

  9. Recent Advances of MEMS Resonators for Lorentz Force Based Magnetic Field Sensors: Design, Applications and Challenges

    PubMed Central

    Herrera-May, Agustín Leobardo; Soler-Balcazar, Juan Carlos; Vázquez-Leal, Héctor; Martínez-Castillo, Jaime; Vigueras-Zuñiga, Marco Osvaldo; Aguilera-Cortés, Luz Antonio

    2016-01-01

    Microelectromechanical systems (MEMS) resonators have allowed the development of magnetic field sensors with potential applications in biomedicine, the automotive industry, navigation systems, space satellites, telecommunications and non-destructive testing. We present a review of recent magnetic field sensors based on MEMS resonators, which operate using the Lorentz force. These sensors have a compact structure, wide measurement range, low energy consumption, high sensitivity and suitable performance. The design methodology, simulation tools, damping sources, sensing techniques and future applications of magnetic field sensors are discussed. The design process is fundamental in achieving correct selection of the operation principle, sensing technique, materials, fabrication process and readout systems of the sensors. In addition, the main sensing systems and challenges of the MEMS sensors are described. To develop the best devices, research on their mechanical reliability, vacuum packaging, design optimization and temperature compensation circuits is needed. Future applications will require multifunctional sensors for monitoring several physical parameters (e.g., magnetic field, acceleration, angular rate, humidity, temperature and gases). PMID:27563912

  10. Designing and optimizing a healthcare kiosk for the community.

    PubMed

    Lyu, Yongqiang; Vincent, Christopher James; Chen, Yu; Shi, Yuanchun; Tang, Yida; Wang, Wenyao; Liu, Wei; Zhang, Shuangshuang; Fang, Ke; Ding, Ji

    2015-03-01

    Investigating new ways to deliver care, such as the use of self-service kiosks to collect and monitor signs of wellness, supports healthcare efficiency and inclusivity. Self-service kiosks offer this potential, but there is a need for solutions to meet acceptable standards, e.g. provision of accurate measurements. This study investigates the design and optimization of a prototype healthcare kiosk to collect vital signs measures. The design problem was decomposed, formalized, focused and used to generate multiple solutions. Systematic implementation and evaluation allowed for the optimization of measurement accuracy, first for individuals and then for a population. The optimized solution was tested independently to check the suitability of the methods, and quality of the solution. The process resulted in a reduction of measurement noise and an optimal fit, in terms of the positioning of measurement devices. This guaranteed the accuracy of the solution and provided a general methodology for similar design problems. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  11. Rapid Airplane Parametric Input Design(RAPID)

    NASA Technical Reports Server (NTRS)

    Smith, Robert E.; Bloor, Malcolm I. G.; Wilson, Michael J.; Thomas, Almuttil M.

    2004-01-01

    An efficient methodology is presented for defining a class of airplane configurations. Inclusive in this definition are surface grids, volume grids, and grid sensitivity. A small set of design parameters and grid control parameters govern the process. The general airplane configuration has wing, fuselage, vertical tail, horizontal tail, and canard components. The wing, tail, and canard components are manifested by solving a fourth-order partial differential equation subject to Dirichlet and Neumann boundary conditions. The design variables are incorporated into the boundary conditions, and the solution is expressed as a Fourier series. The fuselage has a circular cross section, and the radius is an algebraic function of four design parameters and an independent computational variable. Volume grids are obtained through an application of the Control Point Form method. Grid sensitivity is obtained by applying the automatic differentiation precompiler ADIFOR to software for the grid generation. The computed surface grids, volume grids, and sensitivity derivatives are suitable for a wide range of Computational Fluid Dynamics simulations and configuration optimizations.

  12. Design of experiments applications in bioprocessing: concepts and approach.

    PubMed

    Kumar, Vijesh; Bhalla, Akriti; Rathore, Anurag S

    2014-01-01

    Most biotechnology unit operations are complex in nature, with numerous process variables, feed material attributes, and raw material attributes that can have a significant impact on the performance of the process. A design of experiments (DOE)-based approach offers a solution to this conundrum and allows for an efficient estimation of the main effects and the interactions with a minimal number of experiments. Numerous publications illustrate the application of DOE towards the development of different bioprocessing unit operations. However, a systematic approach for evaluating the different DOE designs and for choosing the optimal design for a given application has not yet been published. Through this work we have compared the I-optimal and D-optimal designs to the commonly used central composite and Box-Behnken designs for bioprocess applications. A systematic methodology is proposed for construction of the model and for precise prediction of the responses for three case studies involving some of the commonly used unit operations in downstream processing. Use of the Akaike information criterion for model selection has been examined and found to be suitable for the applications under consideration. © 2013 American Institute of Chemical Engineers.
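The Akaike information criterion mentioned above can be sketched for least-squares model selection; the polynomial candidate models and synthetic data are illustrative assumptions, not the paper's case studies:

```python
import numpy as np

# Hedged sketch of AIC-based model selection for least-squares fits:
#   AIC = n * ln(RSS / n) + 2k,   k = number of fitted parameters.
def aic(y, y_hat, k):
    n = len(y)
    rss = float(np.sum((y - y_hat) ** 2))
    return n * np.log(rss / n) + 2 * k

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 30)
y = 1.0 + 2.0 * x + 3.0 * x ** 2 + rng.normal(0, 0.1, x.size)  # quadratic truth

# Fit polynomial models of increasing order and score each with AIC;
# the underfitting linear model is penalized by its large residuals.
fits = {}
for deg in (1, 2, 3):
    coeffs = np.polyfit(x, y, deg)
    fits[deg] = aic(y, np.polyval(coeffs, x), deg + 1)

best = min(fits, key=fits.get)   # the model with the lowest AIC wins
```

The same scoring applies unchanged to response-surface models from CCD, Box-Behnken, I-optimal or D-optimal designs; only the candidate model set differs.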

  13. ASPEN Plus in the Chemical Engineering Curriculum: Suitable Course Content and Teaching Methodology

    ERIC Educational Resources Information Center

    Rockstraw, David A.

    2005-01-01

    An established methodology involving the sequential presentation of five skills on ASPEN Plus to undergraduate seniors majoring in ChE is presented in this document: (1) specifying unit operations; (2) manipulating physical properties; (3) accessing variables; (4) specifying nonstandard components; and (5) applying advanced features. This…

  14. Seismic behavior of a low-rise horizontal cylindrical tank

    NASA Astrophysics Data System (ADS)

    Fiore, Alessandra; Rago, Carlo; Vanzi, Ivo; Greco, Rita; Briseghella, Bruno

    2018-05-01

    Cylindrical storage tanks are widely used for various types of liquids, including hazardous contents, thus requiring suitable and careful design for seismic actions. The study herein presented deals with the dynamic analysis of a ground-based horizontal cylindrical tank containing butane and with its safety verification. The analyses are based on a detailed finite element (FE) model; a simplified one-degree-of-freedom idealization is also set up and used for verification of the FE results. Particular attention is paid to sloshing and asynchronous seismic input effects. Sloshing effects are investigated according to the current literature state of the art. An efficient methodology based on an "impulsive-convective" decomposition of the container-fluid motion is adopted for the calculation of the seismic force. The effects of asynchronous ground motion are studied by suitable pseudo-static analyses. Comparison between seismic action effects, obtained with and without consideration of sloshing and asynchronous seismic input, shows a rather important influence of these conditions on the final results.

  15. Understanding leachate flow in municipal solid waste landfills by combining time-lapse ERT and subsurface flow modelling - Part II: Constraint methodology of hydrodynamic models.

    PubMed

    Audebert, M; Oxarango, L; Duquennoi, C; Touze-Foltz, N; Forquet, N; Clément, R

    2016-09-01

    Leachate recirculation is a key process in the operation of municipal solid waste landfills as bioreactors. To ensure optimal water content distribution, bioreactor operators need tools to design leachate injection systems. Prediction of leachate flow by subsurface flow modelling could provide useful information for the design of such systems. However, hydrodynamic models require additional data to constrain them and to assess hydrodynamic parameters. Electrical resistivity tomography (ERT) is a suitable method to study leachate infiltration at the landfill scale. It can provide spatially distributed information which is useful for constraining hydrodynamic models. However, this geophysical method does not allow ERT users to directly measure water content in waste. The MICS (multiple inversions and clustering strategy) methodology was proposed to delineate the infiltration area precisely during time-lapse ERT surveys, in order to avoid the use of empirical petrophysical relationships, which are not adapted to a medium as heterogeneous as waste. The infiltration shapes and hydrodynamic information extracted with MICS were used to constrain hydrodynamic models when assessing parameters. The constraint methodology developed in this paper was tested on two hydrodynamic models: an equilibrium model, where flow within the waste medium is estimated using a single-continuum approach, and a non-equilibrium model, where flow is estimated using a dual-continuum approach that represents leachate flow into fractures. Finally, this methodology provides insight into the advantages and limitations of hydrodynamic models. Furthermore, we suggest an explanation for the large volume detected by MICS when a small volume of leachate is injected. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Automated control of hierarchical systems using value-driven methods

    NASA Technical Reports Server (NTRS)

    Pugh, George E.; Burke, Thomas E.

    1990-01-01

    An introduction is given to the value-driven methodology, which has been successfully applied to solve a variety of difficult decision, control, and optimization problems. Many real-world decision processes (e.g., those encountered in scheduling, allocation, and command and control) involve a hierarchy of complex planning considerations. For such problems it is virtually impossible to define a fixed set of rules that will operate satisfactorily over the full range of probable contingencies. Decision Science Applications' value-driven methodology offers a systematic way of automating the intuitive, common-sense approach used by human planners. The inherent responsiveness of value-driven systems to user-controlled priorities makes them particularly suitable for semi-automated applications in which the user must remain in command of the system's operation. Three examples of the practical application of the approach in the automation of hierarchical decision processes are discussed: the TAC Brawler air-to-air combat simulation is a four-level computerized hierarchy; the autonomous underwater vehicle mission planning system is a three-level control system; and the Space Station Freedom electrical power control and scheduling system is designed as a two-level hierarchy. The methodology is compared with rule-based systems and with other more widely known optimization techniques.

  17. Inclusive design--assistive technology for people with cerebral palsy.

    PubMed

    Heidrich, Regina; Bassani, Patrícia

    2012-01-01

    This study reports the work of the Inclusive Design research project conducted with a group of children with cerebral palsy. The project has been working with assistive technology and has developed an expanded mouse and a keyboard. We are currently working as researchers in cognitive ergonomics and inclusive education. The goal of the project is to establish an interdisciplinary study focused on research in ergonomics design, contributing to improved assistance for people with special needs. A pedagogical approach was applied, using Vygotsky's socio-historical theory, which holds that each individual's experiences are important to their development. The development methodology was based on user-centered design. The results showed that as the students used the new technologies they developed superior psychological processes towards social interaction and autonomy, taking part in class activities more efficiently. We also verified how important the new technologies were in class, considering the methodologies and objectives fully and effectively described in this study. From the data obtained in this research, we hope to contribute to those who believe that improved classroom inclusion of students with disabilities is an attainable reality.

  18. Rainfall simulation experiments: Influence of water temperature, water quality and plot design on soil erosion and runoff

    NASA Astrophysics Data System (ADS)

    Iserloh, Thomas; Pegoraro, Dominique; Schlösser, Angelika; Thesing, Hannah; Seeger, Manuel; Ries, Johannes B.

    2015-04-01

    Field rainfall simulators are designed to study soil erosion processes and provide urgently needed data for various geomorphological, hydrological and pedological issues. Due to the different conditions and technologies applied, there are several methodological aspects under review of the scientific community, particularly concerning design, procedures and conditions of measurement for infiltration, runoff and soil erosion. This study aims at contributing fundamental data for understanding rainfall simulations in depth by studying the effect of the following parameters on the measurement results: 1. Plot design - round or rectangular plot: Can we identify differences in amount of runoff and erosion? 2. Water quality: What is the influence of the water's salt load on interrill erosion and infiltration as measured by rainfall experiments? 3. Water temperature: How much are the results conditioned by the temperature of water, which is subject to changes due to environmental conditions during the experiments? Preliminary results show a moderate increase of soil erosion with the water's salt load while runoff stays almost on the same level. With increasing water temperature, runoff increases continuously. At very high temperatures, soil erosion is clearly increased. A first comparison between round and rectangular plot indicates the rectangular plot to be the most suitable plot shape, but ambiguous results make further research necessary. The analysis of these three factors concerning their influence on runoff and erosion shows that clear methodological standards are necessary in order to make rainfall simulation experiments comparable.

  19. Improved intracellular PHA determinations with novel spectrophotometric quantification methodologies based on Sudan black dye.

    PubMed

    Porras, Mauricio A; Villar, Marcelo A; Cubitto, María A

    2018-05-01

    The presence of intracellular polyhydroxyalkanoates (PHAs) is usually studied using Sudan black dye solution (SB). In a previous work it was shown that the PHA could be directly quantified using the absorbance of SB fixed by PHA granules in wet cell samples. In the present paper, the optimum SB amount and the optimum conditions for SB assays were determined following an experimental design based on hybrid response surface methodology and a desirability function. In addition, a new methodology was developed in which it is shown that the amount of SB fixed by PHA granules can also be determined indirectly through the absorbance of the supernatant obtained from the stained cell samples. This alternative methodology allows a faster determination of the PHA content (involving 23 and 42 min for indirect and direct determinations, respectively), and can be undertaken with basic laboratory equipment and reagents. The correlation between PHA content in wet cell samples and the spectra of the SB-stained supernatant was determined by means of multivariate and linear regression analysis. The best calibration adjustment (R² = 0.91, RSE = 1.56%) and the good PHA prediction obtained (RSE = 1.81%) show that the proposed methodology constitutes a reasonably precise way to determine PHA content. Thus, this methodology could anticipate the probable results of the above-mentioned direct PHA determination. Compared with the most used techniques described in the scientific literature, the combined implementation of these two methodologies seems to be one of the most economical and environmentally friendly, suitable for rapid monitoring of the intracellular PHA content. Copyright © 2018 Elsevier B.V. All rights reserved.

  20. A GIS-based assessment of groundwater suitability for irrigation purposes in flat areas of the wet Pampa plain, Argentina.

    PubMed

    Romanelli, Asunción; Lima, María Lourdes; Quiroz Londoño, Orlando Mauricio; Martínez, Daniel Emilio; Massone, Héctor Enrique

    2012-09-01

    The Pampa in Argentina is a large plain with an obvious dependence on agriculture, water availability and water quality. It is an environment sensitive to weather changes and slope variations. Supplementary irrigation is a useful practice for sustaining production in the zone. However, potential negative impacts of this type of irrigation on salinization and sodification of soils are evident. Most conventional methodologies for assessing irrigation water quality are difficult to apply in the region because their underlying assumptions are not met. Consequently, a new GIS-based methodology integrating multiparametric data was proposed for evaluating and delineating groundwater suitability zones for irrigation purposes in flat areas. Hydrogeological surveys including water level measurements, groundwater samples for chemical analysis and electrical conductivity (EC) measurements were performed. The combination of EC, sodium adsorption ratio, residual sodium carbonate, slope and hydraulic gradient parameters generated an irrigation water index (IWI). By integrating IWI classes 1 to 3 (categories of water suitable for irrigation) with aquifer thickness, a restricted irrigation water index (RIWI) was obtained. Application of the IWI showed that 61.3% of the area has "Very high" to "Moderate" potential for irrigation, while 31.4% of it has unsuitable waters. Approximately 46% of the tested area has high suitability for irrigation and moderate groundwater availability. This proposed methodology has advantages over traditional methods because it allows for better discrimination in homogeneous areas.
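One ingredient of the irrigation water index, the sodium adsorption ratio (SAR), has a standard closed form; the sample concentrations below are assumed, and this sketch is not the authors' GIS workflow:

```python
import math

# Illustrative sketch: the sodium adsorption ratio, one input layer of the
# irrigation water index.  Concentrations are in meq/L; values are assumed.
def sar(na, ca, mg):
    # SAR = Na+ / sqrt((Ca2+ + Mg2+) / 2)
    return na / math.sqrt((ca + mg) / 2.0)

value = sar(na=6.0, ca=4.0, mg=4.0)   # 6 / sqrt(4) = 3.0
```

In the GIS methodology, a raster of such per-sample values would be interpolated and combined with the EC, RSC, slope and hydraulic-gradient layers to score each cell.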

  1. Assessment of Reliable Change Using 95% Credible Intervals for the Differences in Proportions: A Statistical Analysis for Case-Study Methodology.

    PubMed

    Unicomb, Rachael; Colyvas, Kim; Harrison, Elisabeth; Hewat, Sally

    2015-06-01

    Case-study methodology is often used to study change in the field of speech-language pathology, but it can be criticized for not being statistically robust. Yet with the heterogeneous nature of many communication disorders, case studies allow clinicians and researchers to closely observe and report on change. Such information is valuable and can further inform large-scale experimental designs. In this research note, a statistical analysis for case-study data is outlined that employs a modification of the Reliable Change Index (Jacobson & Truax, 1991). The relationship between reliable change and clinical significance is discussed. Example data are used to guide the reader through the use and application of this analysis. A method of analysis is detailed that is suitable for assessing change in measures with binary categorical outcomes. The analysis is illustrated using data from one individual, measured before and after treatment for stuttering. The application of this approach to assess change in categorical, binary data has potential application in speech-language pathology. It enables clinicians and researchers to analyze results from case studies for their statistical and clinical significance. This new method addresses a gap in the research design literature, namely the lack of analysis methods for noncontinuous data (such as counts, rates and proportions of events) that may be used in case-study designs.
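The paper uses 95% credible intervals for differences in proportions; as a simpler frequentist stand-in (an assumption, not the authors' Bayesian method), a Wald confidence interval for the difference of two proportions can illustrate the reliable-change decision:

```python
import math

# Hedged sketch: Wald 95% confidence interval for the difference of two
# proportions (the paper itself uses Bayesian credible intervals).
# Counts below are hypothetical pre/post-treatment event proportions.
def diff_prop_ci(x1, n1, x2, n2, z=1.96):
    p1, p2 = x1 / n1, x2 / n2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    d = p2 - p1
    return d - z * se, d + z * se

# E.g., 40/100 stuttered syllables before treatment, 10/100 after.
lo, hi = diff_prop_ci(x1=40, n1=100, x2=10, n2=100)
# Change is deemed reliable (at this level) if the interval excludes zero.
reliable = not (lo <= 0.0 <= hi)
```

The Bayesian version would replace the normal approximation with posterior sampling of the two proportions, which behaves better at small counts typical of single-case data.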

  2. A Methodological Proposal for Learning Games Selection and Quality Assessment

    ERIC Educational Resources Information Center

    Dondi, Claudio; Moretti, Michela

    2007-01-01

    This paper presents a methodological proposal elaborated in the framework of two European projects dealing with game-based learning, both of which have focused on "quality" aspects in order to create suitable tools that support European educators, practitioners and lifelong learners in selecting and assessing learning games for use in…

  3. The Role of System Thinking Development and Experiential Learning on Enterprise Transformation

    NASA Astrophysics Data System (ADS)

    Lopez, Gabriel

    The recent economic downturn has had global repercussions for all businesses alike. Competition is fierce and a survival-of-the-fittest model is always present; fast delivery times and innovative designs ultimately translate into the enterprise's bottom line. In such market conditions, enterprises have to find ways to develop and train their workforce in a manner that enhances the innovative capabilities of the enterprise. Additionally, if companies are to stay competitive, they have to ensure critical skills in their workforce are transferred from generation to generation. This study builds on recent research on system-thinking development via experiential learning methodologies. First, a conceptual framework model was developed. This conceptual model captures a methodology to construct a system-thinking apprenticeship program suitable for system engineers. Second, a survey of system engineering professionals was conducted in order to assess and refine the proposed conceptual model. This dissertation captures the findings of the conceptual model and the implications of the study for enterprises and for system engineering organizations.

  4. Thermal sensation prediction by soft computing methodology.

    PubMed

    Jović, Srđan; Arsić, Nebojša; Vilimonović, Jovana; Petković, Dalibor

    2016-12-01

    Thermal comfort in open urban areas is a very important factor from an environmental point of view. Demands for suitable thermal comfort therefore need to be met during urban planning and design. Thermal comfort can be modeled based on climatic parameters and other factors. These factors are variable and change throughout the year and the day, so there is a need to establish an algorithm for thermal comfort prediction from the input variables. The prediction results could be used to plan the times at which urban areas are used. Since this is a highly nonlinear task, a soft computing methodology was applied in this investigation to predict thermal comfort. The main goal was to apply an extreme learning machine (ELM) for forecasting physiological equivalent temperature (PET) values. Temperature, pressure, wind speed and irradiance were used as inputs. The prediction results were compared with some benchmark models. Based on the results, ELM can be used effectively in forecasting PET. Copyright © 2016 Elsevier Ltd. All rights reserved.
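An extreme learning machine of the kind named above can be sketched in a few lines: a random fixed hidden layer followed by a least-squares output layer. The data here are synthetic, not the study's PET measurements:

```python
import numpy as np

# Minimal ELM regressor sketch: random fixed input weights, tanh hidden layer,
# output weights solved in closed form by pseudoinverse.
rng = np.random.default_rng(42)

def elm_fit(X, y, n_hidden=50):
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (fixed)
    b = rng.normal(size=n_hidden)                  # random biases (fixed)
    H = np.tanh(X @ W + b)                         # hidden-layer activations
    beta = np.linalg.pinv(H) @ y                   # least-squares output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Synthetic stand-in for (temperature, pressure, ...) -> PET mapping.
X = rng.uniform(-1, 1, size=(200, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]               # smooth target function
W, b, beta = elm_fit(X, y)
rmse = float(np.sqrt(np.mean((elm_predict(X, W, b, beta) - y) ** 2)))
```

Because only the output layer is trained, fitting is a single linear solve, which is what makes ELM attractive for fast forecasting tasks like this one.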

  5. Widefield quantitative multiplex surface enhanced Raman scattering imaging in vivo

    NASA Astrophysics Data System (ADS)

    McVeigh, Patrick Z.; Mallia, Rupananda J.; Veilleux, Israel; Wilson, Brian C.

    2013-04-01

    In recent years numerous studies have shown the potential advantages of molecular imaging in vitro and in vivo using contrast agents based on surface enhanced Raman scattering (SERS); however, the low throughput of traditional point-scanned imaging methodologies has limited their use in biological imaging. In this work we demonstrate that direct widefield Raman imaging based on a tunable filter is capable of quantitative multiplex SERS imaging in vivo, and that this imaging is possible with acquisition times that are orders of magnitude lower than achievable with comparable point-scanned methodologies. The system, designed for small animal imaging, has a linear response (0.01 to 100 pM), acquires typical in vivo images in <10 s, and with suitable SERS reporter molecules is capable of multiplex imaging without compensation for spectral overlap. To demonstrate the utility of widefield Raman imaging in biological applications, we show quantitative imaging of four simultaneous SERS reporter molecules in vivo, with probe quantification in excellent agreement with known quantities (R2>0.98).

  6. Hafnium transistor design for neural interfacing.

    PubMed

    Parent, David W; Basham, Eric J

    2008-01-01

    A design methodology is presented that uses the EKV model and the g(m)/I(D) biasing technique to design hafnium oxide field effect transistors that are suitable for neural recording circuitry. The DC gain of a common source amplifier is correlated to the structural properties of a Field Effect Transistor (FET) and a Metal Insulator Semiconductor (MIS) capacitor. This approach allows a transistor designer to use a design flow that starts with simple and intuitive 1-D equations for gain that can be verified in 1-D MIS capacitor TCAD simulations, before final TCAD process verification of transistor properties. The DC gain of a common source amplifier is optimized by using fast 1-D simulations and using slower, complex 2-D simulations only for verification. The 1-D equations are used to show that the increased dielectric constant of hafnium oxide allows a higher DC gain for a given oxide thickness. An additional benefit is that the MIS capacitor can be employed to test additional performance parameters important to an open gate transistor such as dielectric stability and ionic penetration.

  7. Regional Design Approach in Designing Climatic Responsive Administrative Building in the 21st Century

    NASA Astrophysics Data System (ADS)

    Haja Bava Mohidin, Hazrina Binti; Ismail, Alice Sabrina

    2015-01-01

    The objective of this paper is to explicate the study of modern administrative buildings in Malaysia that portray a regional design approach conforming to the local context and climate, by reviewing two case studies: Perdana Putra (1999) and the former Prime Minister's Office (1967). This paper is significant because the country's stature and political statement were symbolized by the administrative building as a national icon. In other words, it is also viewed as a cultural object that is closely tied to a particular social context and historical moment of the nation. Administrative buildings, therefore, may exhibit various meanings. This paper uses a structuralism paradigm and semiotic principles as a methodological approach. It is of importance for practicing architects and society in the future as it offers new knowledge and understanding in identifying the climatic considerations that may reflect a regionalist design approach in modern administrative buildings. These elements may then be adopted in designing future public buildings with regional values that are important for expressing national culture, symbolizing the identity of place and society, and responding to climate change.

  8. Commentary on Reconstituting Fibrinogen Concentrate to Maintain Blinding in a Double-blind, Randomized Trial in an Emergency Setting.

    PubMed

    Bruynseels, Daniel; Solomon, Cristina; Hallam, Angela; Collins, Peter W; Collis, Rachel E; Hamlyn, Vincent; Hall, Judith E

    2016-01-01

    The gold standard of trial design is the double-blind, placebo-controlled, randomized trial. Intravenous medication, which needs reconstitution by the attending clinician in an emergency situation, can be challenging to incorporate into a suitably blinded study. We have developed a method of blindly reconstituting and administering fibrinogen concentrate (presented as a lyophilized powder), where the placebo is normal saline. Fibrinogen concentrate is increasingly being used early in the treatment of major hemorrhage. Our methodology was designed for a multicenter study investigating the role of fibrinogen concentrate in the treatment of the coagulopathy associated with major obstetric hemorrhage. The method has been verified by a stand-alone pharmaceutical manufacturing unit with an investigational medicinal products license, and to date has successfully been applied 45 times in four study centers. There have been no difficulties in reconstitution and no related adverse events reported. We feel our method is simple to perform and maintains blinding throughout, making it potentially suitable for use in other trials conducted in psychologically high-pressure environments. Although fibrinogen concentrate was the focus of our study, it is likely that the method is applicable to other lyophilized medication with limited shelf life (e.g., antibiotics). Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  9. Dynamic conversion of solar generated heat to electricity

    NASA Technical Reports Server (NTRS)

    Powell, J. C.; Fourakis, E.; Hammer, J. M.; Smith, G. A.; Grosskreutz, J. C.; Mcbride, E.

    1974-01-01

    The effort undertaken during this program led to the selection of the water-superheated steam (850 psig/900 F) crescent central receiver as the preferred concept from among 11 candidate systems across the technological spectrum of the dynamic conversion of solar generated heat to electricity. The solar power plant designs were investigated in the range of plant capacities from 100 to 1000 MW(e). The investigations considered the impacts of plant size, collector design, feed-water temperature ratio, heat rejection equipment, ground cover, and location on solar power technical and economic feasibility. For the distributed receiver systems, the optimization studies showed that plant capacities less than 100 MW(e) may be best. Although the size of central receiver concepts was not parametrically investigated, all indications are that the optimal plant capacity for central receiver systems will be in the range from 50 to 200 MW(e). Solar thermal power plant site selection criteria and methodology were also established and used to evaluate potentially suitable sites. The result of this effort was to identify a site south of Inyokern, California, as typically suitable for a solar thermal power plant. The criteria used in the selection process included insolation and climatological characteristics, topography, and seismic history as well as water availability.

  10. Artificial muscles with adjustable stiffness

    NASA Astrophysics Data System (ADS)

    Mutlu, Rahim; Alici, Gursel

    2010-04-01

    This paper reports on a stiffness enhancement methodology based on using a suitably designed contact surface with which cantilevered-type conducting polymer bending actuators are in contact during operation. The contact surface constrains the bending behaviour of the actuators. Depending on the topology of the contact surface, the resistance of the polymer actuators to deformation, i.e. their stiffness, is varied. As opposed to their predecessors, these polymer actuators operate in air. Finite element analysis and modelling are used to quantify the effect of the contact surface on the effective stiffness of a trilayer cantilevered beam, which represents a polypyrrole (PPy) conducting polymer actuator fixed at one end and free at the other, under a uniformly distributed load. After demonstrating the feasibility of the adjustable stiffness concept, experiments were conducted to determine the stiffness of bending-type conducting polymer actuators in contact with a range (20-40 mm in radius) of circular contact surfaces. The numerical and experimental results presented demonstrate that the stiffness of the actuators can be varied using a suitably profiled contact surface: the larger the radius of the contact surface, the higher the stiffness of the polymer actuators. The outcomes of this study suggest that, although the stiffness of the artificial muscles considered here is constant for a given geometric size and given electrical and chemical operating conditions, it can be changed in a nonlinear fashion to suit the stiffness requirement of a considered application. The stiffness enhancement methodology can be extended to other ionic-type conducting polymer actuators.

  11. A method for performance comparison of polycentric knees and its application to the design of a knee for developing countries.

    PubMed

    Anand, T S; Sujatha, S

    2017-08-01

    Polycentric knees for transfemoral prostheses have a variety of geometries, but a survey of the literature shows that there are few ways of comparing their performance. Our objective was to present a method for performance comparison of polycentric knee geometries and to design a new geometry. In this work, we define parameters to compare various commercially available prosthetic knees in terms of their stability, toe clearance, maximum flexion, and other criteria, and we optimize these parameters to obtain a new knee design. The new geometry provides the greater stability and toe clearance necessary to navigate the uneven terrain typically encountered in developing countries. Several commercial knees were compared based on the defined parameters to determine their suitability for uneven terrain, and a new knee was designed by optimizing these parameters. Preliminary user testing indicates that the new knee is very stable and easy to use. The methodology can be used for better knee selection and for the design of more customized knee geometries. Clinical relevance: The method provides a tool to aid in the selection and design of polycentric knees for transfemoral prostheses.

  12. Continuously-stirred anaerobic digester to convert organic wastes into biogas: system setup and basic operation.

    PubMed

    Usack, Joseph G; Spirito, Catherine M; Angenent, Largus T

    2012-07-13

    Anaerobic digestion (AD) is a bioprocess that is commonly used to convert complex organic wastes into a useful biogas with methane as the energy carrier. Increasingly, AD is being used in industrial, agricultural, and municipal waste(water) treatment applications. The use of AD technology allows plant operators to reduce waste disposal costs and offset energy utility expenses. In addition to treating organic wastes, energy crops are being converted into the energy carrier methane. As the application of AD technology broadens for the treatment of new substrates and co-substrate mixtures, so does the demand for a reliable testing methodology at the pilot- and laboratory-scale. Anaerobic digestion systems have a variety of configurations, including the continuously stirred tank reactor (CSTR), plug flow (PF), and anaerobic sequencing batch reactor (ASBR) configurations. The CSTR is frequently used in research due to its simplicity in design and operation, but also for its advantages in experimentation. Compared to other configurations, the CSTR provides greater uniformity of system parameters, such as temperature, mixing, chemical concentration, and substrate concentration. Ultimately, when designing a full-scale reactor, the optimum reactor configuration will depend on the character of a given substrate among many other nontechnical considerations. However, all configurations share fundamental design features and operating parameters that render the CSTR appropriate for most preliminary assessments. If researchers and engineers use an influent stream with relatively high concentrations of solids, then lab-scale bioreactor configurations cannot be fed continuously due to plugging problems of lab-scale pumps with solids or settling of solids in tubing. For that scenario with continuous mixing requirements, lab-scale bioreactors are fed periodically and we refer to such configurations as continuously stirred anaerobic digesters (CSADs). 
This article presents a general methodology for constructing, inoculating, operating, and monitoring a CSAD system for the purpose of testing the suitability of a given organic substrate for long-term anaerobic digestion. The construction section of this article will cover building the lab-scale reactor system. The inoculation section will explain how to create an anaerobic environment suitable for seeding with an active methanogenic inoculum. The operating section will cover operation, maintenance, and troubleshooting. The monitoring section will introduce testing protocols using standard analyses. The use of these measures is necessary for reliable experimental assessments of substrate suitability for AD. This protocol should provide greater protection against a common mistake made in AD studies, which is to conclude that reactor failure was caused by the substrate in use, when really it was improper user operation.

  13. Directions for new developments on statistical design and analysis of small population group trials.

    PubMed

    Hilgers, Ralf-Dieter; Roes, Kit; Stallard, Nigel

    2016-06-14

    Most statistical design and analysis methods for clinical trials have been developed and evaluated in settings where at least several hundred patients could be recruited. These methods may not be suitable for evaluating therapies if the sample size is unavoidably small, a setting usually termed small populations. The specific sample size cut-off at which the standard methods fail needs to be investigated. In this paper, the authors present their view on new developments for the design and analysis of clinical trials in small population groups, where conventional statistical methods may be inappropriate, e.g., because of lack of power or poor adherence to asymptotic approximations due to sample size restrictions. Following the EMA/CHMP guideline on clinical trials in small populations, we consider directions for new developments in the area of statistical methodology for the design and analysis of small population clinical trials. We relate the findings to the research activities of three projects, Asterix, IDeAl, and InSPiRe, which have received funding since 2013 within the FP7-HEALTH-2013-INNOVATION-1 framework of the EU. As not all aspects of the wide research area of small population clinical trials can be addressed, we focus on areas where we feel advances are needed and feasible. The general framework of the EMA/CHMP guideline on small population clinical trials stimulates a number of research areas. These serve as the basis for the three projects, Asterix, IDeAl, and InSPiRe, which use various approaches to develop new statistical methodology for the design and analysis of small population clinical trials. Small population clinical trials refer to trials with a limited number of patients. Small populations may result from rare diseases or specific subtypes of more common diseases. New statistical methodology needs to be tailored to these specific situations. 
The main results from the three projects will constitute a useful toolbox for improved design and analysis of small population clinical trials. They address various challenges presented by the EMA/CHMP guideline as well as recent discussions about extrapolation. There is a need for involvement of the patients' perspective in the planning and conduct of small population clinical trials for a successful therapy evaluation.

  14. Reusable Rocket Engine Operability Modeling and Analysis

    NASA Technical Reports Server (NTRS)

    Christenson, R. L.; Komar, D. R.

    1998-01-01

    This paper describes the methodology, model, input data, and analysis results of a reusable launch vehicle engine operability study conducted with the goal of supporting design from an operations perspective. Paralleling performance analyses in schedule and method, this requires the use of metrics in a validated operations model useful for design, sensitivity, and trade studies. Operations analysis in this view is one of several design functions. An operations concept was developed for a given engine concept, and the predicted operations and maintenance processes were incorporated into simulation models. Historical operations data at a level of detail suitable to the model objectives were collected, analyzed, and formatted for use with the models; the simulations were then run and the results collected and presented. The input data included scheduled and unscheduled timeline and resource information collected into a Space Transportation System (STS) Space Shuttle Main Engine (SSME) historical launch operations database. The results reflect the importance not only of reliable hardware but also of improvements to operations and corrective maintenance processes.

  15. Optimization of deep eutectic solvent-based ultrasound-assisted extraction of polysaccharides from Dioscorea opposita Thunb.

    PubMed

    Zhang, Lijin; Wang, Maoshan

    2017-02-01

    In this study, deep eutectic solvents were proposed for the ultrasound-assisted extraction of polysaccharides from Dioscorea opposita Thunb. Several deep eutectic solvents were prepared for the extraction of polysaccharides, among which the solvent composed of choline chloride and 1,4-butanediol proved to be the most suitable. Based on screening by single-factor and orthogonal experiment designs, three experimental factors were optimized using a Box-Behnken experimental design combined with response surface methodology, which gave the optimal extraction conditions: water content of 32.89% (v/v), extraction temperature of 94.00°C, and extraction time of 44.74 min. These optimal conditions gave a higher extraction yield than hot water extraction and water-based ultrasound-assisted extraction. Deep eutectic solvents are therefore an excellent alternative solvent for the extraction of polysaccharides from sample matrices. Copyright © 2016 Elsevier B.V. All rights reserved.
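As a rough illustration of the Box-Behnken/RSM step, the following Python sketch builds the 15-run coded design for three factors and fits the full second-order model by least squares. The response values are synthetic stand-ins generated from an assumed quadratic surface, not the extraction-yield data from this study.

```python
import itertools
import numpy as np

def box_behnken_3():
    """12 edge-midpoint runs of a 3-factor Box-Behnken design, plus 3 center points."""
    runs = []
    for i, j in itertools.combinations(range(3), 2):
        for a, b in itertools.product((-1, 1), repeat=2):
            pt = [0, 0, 0]
            pt[i], pt[j] = a, b
            runs.append(pt)
    runs += [[0, 0, 0]] * 3
    return np.array(runs, dtype=float)

def quadratic_terms(X):
    """Expand coded factors into the full second-order RSM model terms."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1 * x2, x1 * x3, x2 * x3,
                            x1**2, x2**2, x3**2])

X = box_behnken_3()
# Synthetic response standing in for a measured extraction yield (illustrative only).
true_beta = np.array([5.0, 0.8, -0.5, 0.3, 0.1, 0.0, 0.0, -0.6, -0.4, -0.2])
y = quadratic_terms(X) @ true_beta

beta, *_ = np.linalg.lstsq(quadratic_terms(X), y, rcond=None)
print(np.round(beta, 3))  # recovers the coefficients of the quadratic surface
```

The Box-Behnken design is popular for exactly this use because its 15 runs suffice to estimate all 10 coefficients of the three-factor quadratic model without ever combining extreme levels of two factors at once.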

  16. Conceptual design of cost-effective and environmentally-friendly configurations for fuel ethanol production from sugarcane by knowledge-based process synthesis.

    PubMed

    Sánchez, Óscar J; Cardona, Carlos A

    2012-01-01

    In this work, the hierarchical decomposition methodology was used to conceptually design the production of fuel ethanol from sugarcane. The process was decomposed into six levels of analysis. Several technological configuration options were assessed at each level considering economic and environmental criteria; the most promising alternatives were chosen and those with the least favorable performance were rejected. Aspen Plus was employed for simulation of each of the technological configurations studied, Aspen Icarus was used for their economic evaluation, and the WAR algorithm was utilized for calculation of the environmental criterion. The results showed that the most suitable synthesized flowsheet involves the continuous cultivation of Zymomonas mobilis with cane juice as substrate, including cell recycling, and ethanol dehydration by molecular sieves. The proposed strategy proved to be a powerful tool for the conceptual design of biotechnological processes considering both techno-economic and environmental indicators. Copyright © 2011 Elsevier Ltd. All rights reserved.

  17. Technological process and optimum design of organic materials vacuum pyrolysis and indium chlorinated separation from waste liquid crystal display panels.

    PubMed

    Ma, En; Xu, Zhenming

    2013-12-15

    In this study, a technological process including vacuum pyrolysis and vacuum chlorinated separation was proposed to convert waste liquid crystal display (LCD) panels into useful resources using self-designed apparatuses. The suitable pyrolysis temperature and pressure were first determined to be 300°C and 50 Pa. The organic parts of the panels were converted to oil (79.10 wt%) and gas (2.93 wt%). The technology for separating indium was then optimized by central composite design (CCD) under response surface methodology (RSM). The results indicated an indium recovery ratio of 99.97% when the particle size was less than 0.16 mm, the weight percentage of NH4Cl to glass powder was 50 wt%, and the temperature was 450°C. These results show that the organic materials, indium, and glass of LCD panels can be recovered efficiently and in an eco-friendly manner during the recovery process. Copyright © 2013 Elsevier B.V. All rights reserved.

  18. Ultra compact spectrometer using linear variable filters

    NASA Astrophysics Data System (ADS)

    Dami, M.; De Vidi, R.; Aroldi, G.; Belli, F.; Chicarella, L.; Piegari, A.; Sytchkova, A.; Bulir, J.; Lemarquis, F.; Lequime, M.; Abel Tibérini, L.; Harnisch, B.

    2017-11-01

    Linearly variable filters (LVFs) are complex optical devices that, when integrated onto a CCD, can realize a "single-chip spectrometer". In the framework of an ESA study, a team of industries and institutes led by SELEX-Galileo explored the design principles and manufacturing techniques, realizing and characterizing LVF samples based on both all-dielectric (AD) and metal-dielectric (MD) coating structures in the VNIR and SWIR spectral ranges. In particular, the performance achieved in spectral gradient, transmission bandwidth, and spectral attenuation (SA) is presented and critically discussed, and potential improvements are highlighted. In addition, the results of a feasibility study of a SWIR linearly variable filter are presented, comparing design predictions with measured performance. Finally, criticalities related to the filter-CCD packaging are discussed. The main achievements of these activities have been: to evaluate, through the design, manufacture, and testing of LVF samples, the achievable performance compared with target requirements; to evaluate the reliability of the processes by analyzing their repeatability; and to define suitable measurement methodologies.

  19. Critical considerations when planning experimental in vivo studies in dental traumatology.

    PubMed

    Andreasen, Jens O; Andersson, Lars

    2011-08-01

    In vivo studies are sometimes needed to understand healing processes after trauma. For several reasons, not the least ethical, such studies have to be carefully planned and important considerations have to be taken into account about suitability of the experimental model, sample size and optimizing the accuracy of the analysis. Several manuscripts of in vivo studies are submitted for publication to Dental Traumatology and rejected because of inadequate design, methodology or insufficient documentation of the results. The authors have substantial experience in experimental in vivo studies of tissue healing in dental traumatology and share their knowledge regarding critical considerations when planning experimental in vivo studies. © 2011 John Wiley & Sons A/S.

  20. FPGA implementation of current-sharing strategy for parallel-connected SEPICs

    NASA Astrophysics Data System (ADS)

    Ezhilarasi, A.; Ramaswamy, M.

    2016-01-01

    This work develops an equal current-sharing algorithm for a number of single-ended primary inductance converters (SEPICs) connected in parallel. The methodology involves the development of a state-space model to predict the conditions for the existence of a stable equilibrium portrait. A variable structure controller guides the trajectory, with a view to circumventing the circuit non-linearities and arriving at stable performance throughout the preferred operating range. The design elicits acceptable servo and regulatory characteristics and the desired time response, and it ensures regulation of the load voltage. The simulation results, validated through a field programmable gate array-based prototype, illustrate its suitability for present-day applications.

  1. Intrinsic Nano-Ductility of Glasses: The Critical Role of Composition

    NASA Astrophysics Data System (ADS)

    Wang, Bu; Yu, Yingtian; Lee, Young; Bauchy, Mathieu

    2015-02-01

    Understanding, predicting, and eventually improving the fracture resistance of silicate materials is of primary importance for designing tougher new glasses suitable for advanced applications. However, the fracture mechanism at the atomic level in amorphous silicate materials is still a topic of debate. In particular, there is some controversy about the existence of ductility at the nanoscale during crack propagation. Here, we present molecular dynamics simulations of fracture in three archetypical silicate glasses. The simulations clearly show that, depending on their composition, silicate glasses can exhibit different degrees of ductility at the nanoscale. Additionally, we show that the methodology used in the present work can provide realistic predictions of fracture energy and toughness.

  2. Comparing electronic probes for volumetric water content of low-density feathermoss

    USGS Publications Warehouse

    Overduin, P.P.; Yoshikawa, K.; Kane, D.L.; Harden, J.W.

    2005-01-01

    Purpose - Feathermoss is ubiquitous in the boreal forest and across various land-cover types of the arctic and subarctic. A variety of affordable commercial sensors for soil moisture content measurement have recently become available and are in use in such regions, often in conjunction with fire-susceptibility or ecological studies. Few come supplied with calibrations suitable or suggested for soils high in organics. This paper aims to test seven of these sensors for use in feathermoss, seeking calibrations between sensor output and volumetric water content. Design/methodology/approach - Measurements from seven sensors installed in live, dead, and burned feathermoss samples, drying in a controlled manner, were compared to moisture content measurements, and empirical calibrations of sensor output to water content were determined. Findings - Almost all of the sensors tested were suitable for measuring the moss sample water content, and a unique calibration for each sensor for this material is presented. Differences in sensor design lead to changes in sensitivity as a function of volumetric water content, affecting the spatial averaging over the soil measurement volume. Research limitations/implications - The wide range of electromagnetic sensors available includes frequency- and time-domain designs with variations in waveguide and sensor geometry, the location of sensor electronics, and operating frequency. Practical implications - This study provides information for extending the use of electromagnetic sensors to feathermoss. Originality/value - A comparison of volumetric water content sensor mechanics and design is of general interest to researchers measuring soil water content. In particular, researchers working in wetlands, boreal forests, and tundra regions will be able to apply these results. © Emerald Group Publishing Limited.
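The empirical calibration step described under Design/methodology/approach can be sketched as a simple polynomial fit of sensor output against independently measured water content. The paired readings below are hypothetical values for illustration, not the feathermoss data from the study.

```python
import numpy as np

# Hypothetical paired readings: sensor output (mV) vs. independently
# determined volumetric water content (m^3/m^3). Values are illustrative only.
sensor_mv = np.array([150, 220, 310, 420, 540, 660])
theta     = np.array([0.02, 0.08, 0.16, 0.27, 0.38, 0.49])

# Second-order polynomial calibration; organic soils often need a
# material-specific curve rather than the factory mineral-soil calibration.
coeffs = np.polyfit(sensor_mv, theta, deg=2)
calibrate = np.poly1d(coeffs)

print(f"theta(400 mV) = {calibrate(400):.3f}")
```

In practice one such curve would be fitted per sensor and per moss condition (live, dead, burned), since the paper reports a unique calibration for each sensor.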

  3. Dental implant customization using numerical optimization design and 3-dimensional printing fabrication of zirconia ceramic.

    PubMed

    Cheng, Yung-Chang; Lin, Deng-Huei; Jiang, Cho-Pei; Lin, Yuan-Min

    2017-05-01

    This study proposes a new methodology for dental implant customization consisting of numerical geometric optimization and 3-dimensional printing fabrication of zirconia ceramic. In the numerical modeling, exogenous factors for implant shape include the thread pitch, thread depth, maximal diameter of the implant neck, and body size. Endogenous factors are bone density, cortical bone thickness, and non-osseointegration. An integration procedure, including the uniform design method, Kriging interpolation, and a genetic algorithm, is applied to optimize the geometry of dental implants. The threshold of minimal micromotion for the optimization evaluation was 100 μm. The optimized model is imported into a 3-dimensional slurry printer to fabricate the zirconia green body (powder weakly bonded by polymer) of the implant. The sintered implant is obtained using a 2-stage sintering process. Twelve models were constructed according to the uniform design method, and their micromotion behavior was simulated using finite element modeling. The uniform design models yielded a set of exogenous factors providing the minimal micromotion (30.61 μm), taken as the suitable model. Kriging interpolation and the genetic algorithm then modified the exogenous factors of this model, resulting in 27.11 μm for the optimized model. Experimental results show that the 3-dimensional slurry printer successfully fabricated the green body of the optimized model, but the accuracy of the sintered part still needs to be improved. In addition, scanning electron microscopy shows a stabilized t-phase microstructure, and the average compressive strength of the sintered part is 632.1 MPa. Copyright © 2016 John Wiley & Sons, Ltd.
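The surrogate-assisted refinement loop (Kriging interpolation plus a genetic algorithm) can be caricatured in a few lines: fit an interpolating surrogate to a handful of simulated micromotion values, then let a small evolutionary search minimize the surrogate. In this sketch a Gaussian RBF interpolant stands in for ordinary Kriging and a simple (mu+lambda) random search stands in for the GA; the pitch/micromotion numbers are invented for illustration (only the 30.61 μm figure echoes the abstract).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical FE results: micromotion (um) at sampled thread pitches (mm).
pitch = np.array([0.6, 0.8, 1.0, 1.2, 1.4])
micro = np.array([42.0, 34.0, 30.6, 33.0, 40.0])

def kriging_like(x, eps=2.0):
    """Gaussian RBF interpolant -- a simple stand-in for ordinary Kriging."""
    K = np.exp(-eps * (pitch[:, None] - pitch[None, :]) ** 2)
    w = np.linalg.solve(K, micro)
    return float(np.exp(-eps * (x - pitch) ** 2) @ w)

# Tiny (mu+lambda)-style evolutionary search standing in for the GA step.
pop = rng.uniform(0.6, 1.4, size=20)
for _ in range(40):
    children = np.clip(pop + rng.normal(0, 0.05, size=pop.size), 0.6, 1.4)
    both = np.concatenate([pop, children])
    pop = both[np.argsort([kriging_like(x) for x in both])][:20]

best = float(pop[0])
print(f"pitch ~ {best:.2f} mm, predicted micromotion {kriging_like(best):.1f} um")
```

The cheap surrogate is what makes the refinement affordable: the expensive finite element model is evaluated only at the design points, while the optimizer queries the interpolant thousands of times.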

  4. A Web Resource for Standardized Benchmark Datasets, Metrics, and Rosetta Protocols for Macromolecular Modeling and Design.

    PubMed

    Ó Conchúir, Shane; Barlow, Kyle A; Pache, Roland A; Ollikainen, Noah; Kundert, Kale; O'Meara, Matthew J; Smith, Colin A; Kortemme, Tanja

    2015-01-01

    The development and validation of computational macromolecular modeling and design methods depend on suitable benchmark datasets and informative metrics for comparing protocols. In addition, if a method is intended to be adopted broadly in diverse biological applications, there needs to be information on appropriate parameters for each protocol, as well as metrics describing the expected accuracy compared to experimental data. In certain disciplines, there exist established benchmarks and public resources where experts in a particular methodology are encouraged to supply their most efficient implementation of each particular benchmark. We aim to provide such a resource for protocols in macromolecular modeling and design. We present a freely accessible web resource (https://kortemmelab.ucsf.edu/benchmarks) to guide the development of protocols for protein modeling and design. The site provides benchmark datasets and metrics to compare the performance of a variety of modeling protocols using different computational sampling methods and energy functions, providing a "best practice" set of parameters for each method. Each benchmark has an associated downloadable benchmark capture archive containing the input files, analysis scripts, and tutorials for running the benchmark. The captures may be run with any suitable modeling method; we supply command lines for running the benchmarks using the Rosetta software suite. We have compiled initial benchmarks for the resource spanning three key areas: prediction of energetic effects of mutations, protein design, and protein structure prediction, each with associated state-of-the-art modeling protocols. With the help of the wider macromolecular modeling community, we hope to expand the variety of benchmarks included on the website and continue to evaluate new iterations of current methods as they become available.

  5. Formulation and optimization of mucoadhesive buccal patches of losartan potassium by using response surface methodology

    PubMed Central

    Ikram, Md.; Gilhotra, Neeraj; Gilhotra, Ritu Mehra

    2015-01-01

    Background: This study was undertaken with the aim of systematically designing a model of factors that would yield an optimized sustained-release dosage form of an anti-hypertensive agent, losartan potassium, using response surface methodology (RSM) employing a 3² full factorial design. Materials and Methods: Mucoadhesive buccal patches were prepared by the solvent casting method using different grades of hydroxypropyl methylcellulose (HPMC) (K4M and K100M) and polyvinylpyrrolidone-K30. The amounts of the release-retardant polymers HPMC K4M (X1) and HPMC K100M (X2) were taken as independent variables. The dependent variables were the burst release in 30 min (Y1), the cumulative percentage of drug released after 8 h (Y2), and the swelling index (Y3) of the patches. In vitro release and swelling studies were carried out and the data were fitted to kinetic equations. Results: The physicochemical, bioadhesive, and swelling properties of the patches were found to vary significantly depending on the viscosity of the polymers and their combination. Patches showed an initial burst release preceding a more gradual sustained-release phase following a non-Fickian diffusion process. Discussion: The results indicate that suitable bioadhesive buccal patches with the desired permeability could be prepared, facilitated by RSM. PMID:26682205
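A 3² full factorial design simply crosses three coded levels of each of the two polymer factors, giving nine formulations to prepare and test. A minimal sketch (the level labels are illustrative, not the actual polymer quantities from the study):

```python
import itertools

# Coded levels for the two release-retardant polymer factors (X1, X2).
levels = {-1: "low", 0: "mid", 1: "high"}
runs = list(itertools.product((-1, 0, 1), repeat=2))  # 3^2 = 9 formulations

for i, (x1, x2) in enumerate(runs, 1):
    print(f"run {i}: HPMC K4M={levels[x1]:>4}, HPMC K100M={levels[x2]:>4}")
```

The responses Y1-Y3 measured on these nine runs are then regressed on X1 and X2 to build the response surfaces used for optimization.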

  6. Experimental design based response surface methodology optimization of ultrasonic assisted adsorption of safranin O by tin sulfide nanoparticle loaded on activated carbon

    NASA Astrophysics Data System (ADS)

    Roosta, M.; Ghaedi, M.; Daneshfar, A.; Sahraei, R.

    2014-03-01

    In this research, the adsorption rate of safranine O (SO) onto tin sulfide nanoparticles loaded on activated carbon (SnS-NP-AC) was accelerated by ultrasound. SnS-NP-AC was characterized by different techniques such as SEM, XRD, and UV-Vis measurements. The present results confirm that the ultrasound-assisted adsorption method has a remarkable ability to improve adsorption efficiency. The influence of parameters such as the sonication time, adsorbent dosage, pH, and initial SO concentration was examined and evaluated by central composite design (CCD) combined with response surface methodology (RSM) and a desirability function (DF). Conducting adsorption experiments at the optimal conditions (4 min of sonication time, 0.024 g of adsorbent, pH 7, and 18 mg L-1 SO) achieved a high removal percentage (98%) and a high adsorption capacity (50.25 mg g-1). Good agreement between experimental and predicted data was observed in this study. Fitting the experimental equilibrium data to the Langmuir, Freundlich, Tempkin, and Dubinin-Radushkevich models shows that the Langmuir model is the most suitable description of the actual adsorption behavior. Kinetic evaluation of the experimental data showed that the adsorption processes followed pseudo-second-order and intraparticle diffusion models well.
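Fitting equilibrium data to the Langmuir model is commonly done via its linearized form, Ce/qe = 1/(qm*b) + Ce/qm, so that a straight-line fit yields the capacity qm and affinity b. The sketch below recovers both from synthetic data generated with the capacity reported in the abstract; the b value and the Ce points are assumptions for illustration.

```python
import numpy as np

# Synthetic equilibrium data: Ce (mg/L) and qe (mg/g); illustrative only.
Ce = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
qm_true, b_true = 50.25, 1.2          # qm from the abstract; b assumed
qe = qm_true * b_true * Ce / (1 + b_true * Ce)  # Langmuir isotherm

# Linearized Langmuir: Ce/qe = 1/(qm*b) + Ce/qm  ->  straight-line fit.
slope, intercept = np.polyfit(Ce, Ce / qe, deg=1)
qm_fit, b_fit = 1 / slope, slope / intercept

print(f"qm = {qm_fit:.2f} mg/g, b = {b_fit:.2f} L/mg")
```

With real (noisy) data, nonlinear least squares on the original isotherm is usually preferred over the linearized fit, which distorts the error structure; the linear form is shown here because it makes the parameter recovery transparent.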

  7. Rational Design Methodology.

    DTIC Science & Technology

    1978-09-01

    This report describes an effort to specify a software design methodology applicable to the Air Force software environment. Available methodologies...of techniques for proof of correctness, design specification, and performance assessment of static designs. The rational methodology selected is a

  8. Environmental Risk Assessment of dredging processes - application to Marin harbour (NW Spain)

    NASA Astrophysics Data System (ADS)

    Gómez, A. G.; García Alba, J.; Puente, A.; Juanes, J. A.

    2014-04-01

    A methodological procedure to estimate the environmental risk of dredging operations in aquatic systems has been developed. Environmental risk estimates are based on numerical model results, which provide an appropriate spatio-temporal framework of analysis to guarantee an effective decision-making process. The methodological procedure has been applied to a real dredging operation in the port of Marin (NW Spain). Results from Marin harbour confirmed the suitability of the developed methodology and its conceptual approaches as a comprehensive and practical management tool.

  9. Environmental impact reduction through ecological planning at Bahia Magdalena, Mexico.

    PubMed

    Malagrino, Giovanni; Lagunas, Magdalena; Rubio, Alfredo Ortega

    2008-03-01

    By analyzing basic marine and coastal characteristics, we selected potential sites where shrimp culture could be developed in a large coastal zone, Bahia Magdalena, Baja California Sur, Mexico. Based on our analysis, 6 sites were preselected, and field work was then carried out to assess the precise suitability of each site for the proposed aquaculture activities. By ranking suitability, we were able to recommend the most appropriate places to develop shrimp culture in this region. Also, knowing the exact biological, physico-chemical and social environment, we determined the best species to cultivate, the recommended total area, and the methodology to be used to lessen the environmental impact and obtain the maximum profitability. Our methodology could be used not only to select appropriate sites for shrimp culture in other coastal lagoons, but also to assess, in a quick and accurate way, the suitability of any other production activity in coastal zones.

  10. Optimal design of isotope labeling experiments.

    PubMed

    Yang, Hong; Mandy, Dominic E; Libourel, Igor G L

    2014-01-01

    Stable isotope labeling experiments (ILE) constitute a powerful methodology for estimating metabolic fluxes. An optimal label design for such an experiment is necessary to maximize the precision with which fluxes can be determined. But often, precision gained in the determination of one flux comes at the expense of the precision of other fluxes, and an appropriate label design therefore foremost depends on the question the investigator wants to address. One could liken ILE to shadows that metabolism casts on products. Optimal label design is the placement of the lamp; creating clear shadows for some parts of metabolism and obscuring others. An optimal isotope label design is influenced by: (1) the network structure; (2) the true flux values; (3) the available label measurements; and, (4) commercially available substrates. The first two aspects are dictated by nature and constrain any optimal design. The second two aspects are suitable design parameters. To create an optimal label design, an explicit optimization criterion needs to be formulated. This usually is a property of the flux covariance matrix, which can be augmented by weighting label substrate cost. An optimal design is found by using such a criterion as an objective function for an optimizer. This chapter uses a simple elementary metabolite units (EMU) representation of the TCA cycle to illustrate the process of experimental design of isotope labeled substrates.
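
    The covariance-based criterion mentioned above can be made concrete with a toy sketch: approximate the flux covariance from a measurement Jacobian and prefer the label design that minimizes its determinant (D-optimality). Both Jacobians below are hypothetical stand-ins for two candidate labeled-substrate choices, not values derived from an actual EMU model:

```python
import numpy as np

# Flux covariance approximated as (J^T W J)^-1, where J is the sensitivity
# of the label measurements to the free fluxes and W the inverse
# measurement variance. Jacobians are hypothetical illustrations.
def flux_covariance(J, sigma=0.01):
    W = np.eye(J.shape[0]) / sigma**2
    return np.linalg.inv(J.T @ W @ J)

J_design_A = np.array([[0.9, 0.1], [0.2, 0.8], [0.5, 0.5]])   # informative
J_design_B = np.array([[0.6, 0.5], [0.55, 0.45], [0.5, 0.5]]) # nearly collinear

# D-optimality: prefer the design with the smaller covariance determinant.
dA = np.linalg.det(flux_covariance(J_design_A))
dB = np.linalg.det(flux_covariance(J_design_B))
print("design A" if dA < dB else "design B", "is D-preferred")
```

    Weighting substrate cost, as the chapter suggests, would simply add a cost term to this objective before handing it to the optimizer.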

  11. The Study of the Relationship between Probabilistic Design and Axiomatic Design Methodology. Volume 1

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chinyere; Onyebueke, Landon

    1996-01-01

    This program report is the final report covering all the work done on this project. The goal of the project is technology transfer of methodologies to improve the design process. The specific objectives are: 1. To learn and understand probabilistic design analysis using NESSUS. 2. To assign design projects on the application of NESSUS to either undergraduate or graduate students. 3. To integrate the application of NESSUS into selected senior-level courses in the Civil and Mechanical Engineering curricula. 4. To develop courseware in probabilistic design methodology to be included in a graduate-level design methodology course. 5. To study the relationship between the probabilistic design methodology and the axiomatic design methodology.

  12. SINERGIA laparoscopic virtual reality simulator: didactic design and technical development.

    PubMed

    Lamata, Pablo; Gómez, Enrique J; Sánchez-Margallo, Francisco M; López, Oscar; Monserrat, Carlos; García, Verónica; Alberola, Carlos; Florido, Miguel Angel Rodríguez; Ruiz, Juan; Usón, Jesús

    2007-03-01

    VR laparoscopic simulators have demonstrated their validity in recent studies, and research should now be directed towards high training effectiveness and efficacy. To this end, an insight into simulators' didactic design and technical development is provided by describing the methodology followed in building the SINERGIA simulator. The methodology starts from a clear analysis of training needs driven by a surgical training curriculum. Existing solutions and validation studies are an important reference for the definition of specifications, which are addressed with a suitable use of simulation technologies. Five new didactic exercises are proposed to train some of the basic laparoscopic skills. Simulator construction has required existing algorithms and the development of a particle-based biomechanical model, called PARSYS, and a collision-handling solution based on a multi-point strategy. The resulting VR laparoscopic simulator includes new exercises and enhanced simulation technologies, and is finding very good acceptance among surgeons.

  13. Employing a Qualitative Description Approach in Health Care Research.

    PubMed

    Bradshaw, Carmel; Atkinson, Sandra; Doody, Owen

    2017-01-01

    A qualitative description design is particularly relevant where information is required directly from those experiencing the phenomenon under investigation and where time and resources are limited. Nurses and midwives often have clinical questions suitable to a qualitative approach but little time to develop an exhaustive comprehension of qualitative methodological approaches. Qualitative description research is sometimes considered a less sophisticated approach for epistemological reasons. Another challenge when considering qualitative description design is differentiating qualitative description from other qualitative approaches. This article provides a systematic and robust journey through the philosophical, ontological, and epistemological perspectives that underpin qualitative description research. Methods and rigor issues in qualitative description research are also appraised to provide the researcher with a systematic approach to conducting research utilizing this approach. The key attributes and value of qualitative description research in the health care professions are highlighted with the aim of extending its usage.

  14. Employing a Qualitative Description Approach in Health Care Research

    PubMed Central

    Bradshaw, Carmel; Atkinson, Sandra; Doody, Owen

    2017-01-01

    A qualitative description design is particularly relevant where information is required directly from those experiencing the phenomenon under investigation and where time and resources are limited. Nurses and midwives often have clinical questions suitable to a qualitative approach but little time to develop an exhaustive comprehension of qualitative methodological approaches. Qualitative description research is sometimes considered a less sophisticated approach for epistemological reasons. Another challenge when considering qualitative description design is differentiating qualitative description from other qualitative approaches. This article provides a systematic and robust journey through the philosophical, ontological, and epistemological perspectives that underpin qualitative description research. Methods and rigor issues in qualitative description research are also appraised to provide the researcher with a systematic approach to conducting research utilizing this approach. The key attributes and value of qualitative description research in the health care professions are highlighted with the aim of extending its usage. PMID:29204457

  15. Handheld ultrasound array imaging device

    NASA Astrophysics Data System (ADS)

    Hwang, Juin-Jet; Quistgaard, Jens

    1999-06-01

    A handheld ultrasound imaging device, one that weighs less than five pounds, has been developed for diagnosing trauma on the battlefield as well as for a variety of commercial mobile diagnostic applications. The handheld device consists of four component ASICs, each designed using state-of-the-art microelectronics technologies. These ASICs are integrated with a convex array transducer to allow high-quality imaging of soft tissues and blood flow in real time. The device is designed to be battery driven or AC powered, with built-in image storage and cine-loop playback capability. Design methodologies for a handheld device are fundamentally different from those for a cart-based system. The system architecture, the signal- and image-processing algorithms, and the image control circuitry and software are all designed to suit large-scale integration, while the imaging performance is kept adequate for the intended applications. To extend battery life, low-power design rules and power management circuits are incorporated in the design of each component ASIC. The performance of the prototype device is currently being evaluated for various applications, such as primary image screening, fetal imaging in obstetrics, and foreign-object detection and wound assessment for emergency care.

  16. Round versus rectangular: Does the plot shape matter?

    NASA Astrophysics Data System (ADS)

    Iserloh, Thomas; Bäthke, Lars; Ries, Johannes B.

    2016-04-01

    Field rainfall simulators are designed to study soil erosion processes and provide urgently needed data for various geomorphological, hydrological and pedological issues. Due to the different conditions and technologies applied, several methodological aspects are under review by the scientific community, particularly concerning the design, procedures and measurement conditions for infiltration, runoff and soil erosion. Extensive discussions at the Rainfall Simulator Workshop 2011 in Trier and the Splinter Meeting at EGU 2013, "Rainfall simulation: Big steps forward!", led to the opinion that the rectangular plot is more suitable than the round plot: a horizontally edging Gerlach trough can be installed for sample collection without forming the unnatural necks found on round or triangular plots. Since most research groups have worked, and currently work, with round plots at the point scale (<1 m²), a precise analysis of the differences between the output of round and square plots is necessary. Our hypotheses are: - Round plot shapes disturb surface runoff; unnatural fluvial dynamics for the given plot size, such as pool development, occur, especially directly at the plot's outlet. - A square plot shape prevents these problems. A first comparison between round and rectangular plots (Iserloh et al., 2015) indicates that the rectangular plot could indeed be more suitable, but the rather ambiguous results make a more elaborate test setup necessary. The laboratory test setup includes the two plot shapes (round, square), a standardised silty substrate and three inclinations (2°, 6°, 12°). The analysis of the laboratory tests provides results on the best performance concerning undisturbed surface runoff and soil/water sampling at the plot's outlet. The analysis of the plot shape's influence on runoff and erosion shows that clear methodological standards are necessary in order to make rainfall simulation experiments comparable.
Reference: Iserloh, T., Pegoraro, D., Schlösser, A., Thesing, H., Seeger, M., Ries, J.B. (2015): Rainfall simulation experiments: Influence of water temperature, water quality and plot design on soil erosion and runoff. Geophysical Research Abstracts, Vol. 17, EGU2015-5817.

  17. Spatial Designation of Critical Habitats for Endangered and Threatened Species in the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tuttle, Mark A; Singh, Nagendra; Sabesan, Aarthy

    Establishing biological reserves or "hot spots" for endangered and threatened species is critical to support real-world species regulatory and management problems. Geographic data on the distribution of endangered and threatened species can be used to improve ongoing efforts for species conservation in the United States. At present, no spatial database exists that maps out the locations of endangered species for the US. Spatial descriptions do exist for the habitat associated with all endangered species, but in a form not readily suitable for use in a geographic information system (GIS). In our study, the principal challenge was extracting spatial data describing these critical habitats for 472 species from over 1000 pages of the Federal Register. In addition, an appropriate database schema was designed to accommodate the different tiers of information associated with the species, along with the confidence of designation; the interpreted location data was geo-referenced to the county enumeration unit, producing a spatial database of endangered species for the whole of the US. The significance of these critical habitat designations, the database schema and the methodologies will be discussed.

  18. Reliability based design optimization: Formulations and methodologies

    NASA Astrophysics Data System (ADS)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation-based design plays an important role in designing almost any kind of automotive, aerospace, and consumer product under these competitive conditions. Single-discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability-based design optimization in a simulation-based design environment. The original contributions of this research are a novel, efficient and robust unilevel methodology for reliability-based design optimization, an innovative decoupled reliability-based design optimization methodology, the application of homotopy techniques in the unilevel methodology, and a new framework for reliability-based design optimization under epistemic uncertainty. The unilevel methodology is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% compared to the nested approach. The decoupled methodology is an approximate technique for obtaining consistent reliable designs at lower computational expense; test problems show that it is computationally efficient compared to the nested approach.
A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.
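
    The nested (double-loop) formulation that the unilevel and decoupled methods improve upon can be illustrated with a minimal sketch: an outer design search wrapped around an inner Monte Carlo reliability analysis. The limit state, distribution and cost below are hypothetical, not taken from the thesis:

```python
import numpy as np

rng = np.random.default_rng(0)

# Inner loop: Monte Carlo estimate of the failure probability for the
# limit state g(X) = X - 1.0 < 0, with X ~ N(d, 0.1). Both the limit
# state and the distribution are illustrative assumptions.
def failure_prob(d, n=200_000):
    x = rng.normal(d, 0.1, n)
    return np.mean(x < 1.0)

# Outer loop: minimize cost (here simply the design value d itself)
# subject to P(failure) <= target, via a brute-force candidate sweep.
target = 0.01
candidates = np.arange(1.0, 1.6, 0.01)
feasible = [d for d in candidates if failure_prob(d) <= target]
d_opt = min(feasible)   # cheapest feasible design
print(f"optimal d ~ {d_opt:.2f} (analytic value is about 1.23)")
```

    Each outer iteration here repeats a full reliability analysis, which is exactly the cost that the unilevel and decoupled formulations described above are designed to avoid.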

  19. Optimization of antifungal production by an alkaliphilic and halotolerant actinomycete, Streptomyces sp. SY-BS5, using response surface methodology.

    PubMed

    Souagui, Y; Tritsch, D; Grosdemange-Billiard, C; Kecha, M

    2015-06-01

    Optimization of medium components and physicochemical parameters for antifungal production by an alkaliphilic and salt-tolerant actinomycete designated Streptomyces sp. SY-BS5, isolated from an arid region in the south of Algeria. The strain showed broad-spectrum activity against pathogenic and toxinogenic fungi. Identification of the actinomycete strain was carried out on the basis of 16S rRNA gene sequencing. Antifungal production was optimized following one-factor-at-a-time (OFAT) and response surface methodology (RSM) approaches. The most suitable medium for growth and antifungal production was found using the one-factor-at-a-time methodology. The individual and interaction effects of three nutritional variables, carbon source (glucose), nitrogen source (yeast extract) and sodium chloride (NaCl), were optimized by Box-Behnken design. Finally, the culture conditions for antifungal production, pH and temperature, were studied and determined. Analysis of the 16S rRNA gene sequence (1454 nucleotides) assigned this strain to the genus Streptomyces, with 99% similarity to Streptomyces cyaneofuscatus JCM4364(T), the most closely related species. The results of the optimization study show that concentrations of 3.476 g/L of glucose, 3.876 g/L of yeast extract and 41.140 g/L of NaCl are responsible for the enhancement of antifungal production by Streptomyces sp. SY-BS5. The preferred culture conditions for antifungal production were pH 10 and a temperature of 30°C for 9 days. This study proved that RSM is a useful and powerful tool for the optimization of antifungal production from actinomycetes. Copyright © 2015 Elsevier Masson SAS. All rights reserved.
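
    A coded three-factor Box-Behnken design like the one used for the three nutritional variables can be generated in a few lines. The decoding ranges for glucose, yeast extract and NaCl are illustrative assumptions, not necessarily those used in the study:

```python
import itertools
import numpy as np

# Coded three-factor Box-Behnken design: all +/-1 combinations on each
# pair of factors with the third held at 0, plus centre-point replicates.
def box_behnken_3(centre_reps=3):
    runs = []
    for i, j in itertools.combinations(range(3), 2):
        for a, b in itertools.product((-1, 1), repeat=2):
            row = [0, 0, 0]
            row[i], row[j] = a, b
            runs.append(row)
    runs += [[0, 0, 0]] * centre_reps
    return np.array(runs)

design = box_behnken_3()
print(design.shape)  # (15, 3): 12 edge runs + 3 centre points

# Decode to natural units: glucose 1-6 g/L, yeast extract 1-6 g/L,
# NaCl 0-80 g/L (hypothetical ranges chosen for illustration).
lows = np.array([1.0, 1.0, 0.0])
highs = np.array([6.0, 6.0, 80.0])
natural = (design + 1) / 2 * (highs - lows) + lows
```

    A quadratic response surface fitted to the measured antifungal activity at these 15 runs then yields the optimal concentrations reported above.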

  20. A methodology to link national and local information for spatial targeting of ammonia mitigation efforts

    NASA Astrophysics Data System (ADS)

    Carnell, E. J.; Misselbrook, T. H.; Dore, A. J.; Sutton, M. A.; Dragosits, U.

    2017-09-01

    The effects of atmospheric nitrogen (N) deposition are evident in terrestrial ecosystems worldwide, with eutrophication and acidification leading to significant changes in species composition. Substantial reductions in N deposition from nitrogen oxides emissions have been achieved in recent decades. By contrast, ammonia (NH3) emissions from agriculture have not decreased substantially and are typically highly spatially variable, making efficient mitigation challenging. One solution is to target NH3 mitigation measures spatially in source landscapes to maximize the benefits for nature conservation. The paper develops an approach to link national scale data and detailed local data to help identify suitable measures for spatial targeting of local sources near designated Special Areas of Conservation (SACs). The methodology combines high-resolution national data on emissions, deposition and source attribution with local data on agricultural management and site conditions. Application of the methodology for the full set of 240 SACs in England found that agriculture contributes ∼45 % of total N deposition. Activities associated with cattle farming represented 54 % of agricultural NH3 emissions within 2 km of the SACs, making them a major contributor to local N deposition, followed by mineral fertiliser application (21 %). Incorporation of local information on agricultural management practices at seven example SACs provided the means to correct outcomes compared with national-scale emission factors. The outcomes show how national scale datasets can provide information on N deposition threats at landscape to national scales, while local-scale information helps to understand the feasibility of mitigation measures, including the impact of detailed spatial targeting on N deposition rates to designated sites.

  1. NESSUS (Numerical Evaluation of Stochastic Structures Under Stress)/EXPERT: Bridging the gap between artificial intelligence and FORTRAN

    NASA Technical Reports Server (NTRS)

    Fink, Pamela K.; Palmer, Karol K.

    1988-01-01

    The development of a probabilistic structural analysis methodology (PSAM) is described. In the near term, the methodology will be applied to designing critical components of the next-generation space shuttle main engine. In the long term, PSAM will be applied very broadly, providing designers with a new technology for more effective design of structures whose character and performance are significantly affected by random variables. The software under development to implement the ideas developed in PSAM resembles, in many ways, conventional deterministic structural analysis code. However, several additional capabilities related to the probabilistic analysis make the input data requirements and the resulting output even more complex. As a result, an intelligent front- and back-end to the code is being developed to assist the design engineer in providing the input data in a correct and appropriate manner. The type of knowledge this entails is, in general, heuristic, allowing the fairly well-understood technology of production rules to apply with little difficulty. However, the PSAM code, called NESSUS, is written in FORTRAN-77 and runs on a DEC VAX. Thus, the associated expert system, called NESSUS/EXPERT, must run on a DEC VAX as well, and integrate effectively and efficiently with the existing FORTRAN code. This paper discusses the process undergone to select a suitable tool, to identify an appropriate division between the functions that should be performed in FORTRAN and those that should be performed by production rules, and to achieve integration of the conventional and AI technologies.

  2. Susceptibility of Legionella Strains to the Chlorinated Biocide, Monochloramine

    PubMed Central

    Jakubek, Delphine; Guillaume, Carole; Binet, Marie; Leblon, Gérard; DuBow, Michael; Le Brun, Matthieu

    2013-01-01

    Members of the Legionella genus find suitable conditions for their growth and survival in nuclear power plant cooling circuits. To limit the proliferation of Legionella pathogenic bacteria in nuclear power plant cooling circuits, and ensure that levels remain below regulatory thresholds, monochloramine treatment can be used. Although the treatment is highly effective, i.e. it reduces Legionella numbers by over 99%, Legionella bacteria can still be detected at low concentrations and rapid re-colonisation of circuits can occur after the treatment has ceased. The aim of this study was to develop an in vitro methodology for determining the intrinsic susceptibility of L. pneumophila strains, collected from various nuclear power plant cooling circuits subjected to different treatment conditions. The methodology was developed by using an original approach based on response surface methodology (RSM) combined with a multifactorial experimental design. The susceptibility was evaluated by the Ct factor. The susceptibility of environmental strains varies widely and is, for some strains, greater than that of known tolerant species; however, strain susceptibility was not related to treatment conditions. Selection pressure induced by monochloramine use did not result in the selection of more tolerant Legionella strains and did not explain the detection of Legionella during treatment or the rapid re-colonisation of cooling circuits after disinfection has ceased. PMID:24005820

  3. Synergistic Effects of Chinese Herbal Medicine: A Comprehensive Review of Methodology and Current Research

    PubMed Central

    Zhou, Xian; Seto, Sai Wang; Chang, Dennis; Kiat, Hosen; Razmovski-Naumovski, Valentina; Chan, Kelvin; Bensoussan, Alan

    2016-01-01

    Traditional Chinese medicine (TCM) is an important part of primary health care in Asian countries that has utilized complex herbal formulations (consisting of two or more medicinal herbs) for treating diseases over thousands of years. There seems to be a general assumption that the synergistic therapeutic effects of Chinese herbal medicine (CHM) derive from the complex interactions between the multiple bioactive components within the herbs and/or herbal formulations. However, evidence to support these synergistic effects remains weak and controversial for several reasons, including the very complex nature of CHM, misconceptions about synergy and methodological challenges in study design. In this review, we clarify the definition of synergy, identify common errors in synergy research and describe current methodological approaches to test for synergistic interaction. We discuss the strengths and weaknesses of these models in the context of CHM and summarize the current status of synergy research in CHM. Despite the availability of some scientific data to support the synergistic effects of multi-herbal and/or herb-drug combinations, the level of evidence remains low, and the clinical relevance of most of these findings is undetermined. There remain significant challenges in the development of suitable methods for synergistic studies of complex herbal combinations. PMID:27462269

  4. Susceptibility of Legionella strains to the chlorinated biocide, monochloramine.

    PubMed

    Jakubek, Delphine; Guillaume, Carole; Binet, Marie; Leblon, Gérard; DuBow, Michael; Le Brun, Matthieu

    2013-01-01

    Members of the Legionella genus find suitable conditions for their growth and survival in nuclear power plant cooling circuits. To limit the proliferation of Legionella pathogenic bacteria in nuclear power plant cooling circuits, and ensure that levels remain below regulatory thresholds, monochloramine treatment can be used. Although the treatment is highly effective, i.e. it reduces Legionella numbers by over 99%, Legionella bacteria can still be detected at low concentrations and rapid re-colonisation of circuits can occur after the treatment has ceased. The aim of this study was to develop an in vitro methodology for determining the intrinsic susceptibility of L. pneumophila strains, collected from various nuclear power plant cooling circuits subjected to different treatment conditions. The methodology was developed by using an original approach based on response surface methodology (RSM) combined with a multifactorial experimental design. The susceptibility was evaluated by the Ct factor. The susceptibility of environmental strains varies widely and is, for some strains, greater than that of known tolerant species; however, strain susceptibility was not related to treatment conditions. Selection pressure induced by monochloramine use did not result in the selection of more tolerant Legionella strains and did not explain the detection of Legionella during treatment or the rapid re-colonisation of cooling circuits after disinfection has ceased.
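
    The Ct factor used above to score strain susceptibility is the time-integrated disinfectant exposure, the integral of C(t) dt. A minimal sketch under an assumed first-order monochloramine decay (all numbers are illustrative, not the study's conditions):

```python
import math

# With first-order decay C(t) = C0 * exp(-k t), the Ct exposure
# integral from 0 to t has a closed form. Parameter values below are
# hypothetical illustrations.
def ct_factor(c0_mg_l, k_per_min, t_min):
    """Integral of C0*exp(-k*t) from 0 to t_min, in mg·min/L."""
    return c0_mg_l * (1 - math.exp(-k_per_min * t_min)) / k_per_min

ct = ct_factor(c0_mg_l=2.0, k_per_min=0.01, t_min=60)
print(f"Ct ~ {ct:.1f} mg·min/L")  # ~ 90.2 for these numbers
```

    Comparing the Ct needed for a fixed log-reduction across strains is what allows their intrinsic susceptibilities to be ranked.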

  5. Streamlined design and self reliant hardware for active control of precision space structures

    NASA Technical Reports Server (NTRS)

    Hyland, David C.; King, James A.; Phillips, Douglas J.

    1994-01-01

    Precision space structures may require active vibration control to satisfy critical performance requirements relating to line-of-sight pointing accuracy and the maintenance of precise internal alignments. In order for vibration control concepts to become operational, their benefits must be practically demonstrated in large-scale ground-based experiments. A unique opportunity to carry out such demonstrations on a wide variety of experimental testbeds was provided by the NASA Control-Structure Integration (CSI) Guest Investigator (GI) Program. This report surveys the experimental results achieved by the Harris Corporation GI team on both Phases 1 and 2 of the program and provides a detailed description of the Phase 2 activities. The Phase 1 results illustrated the effectiveness of active vibration control for space structures and demonstrated a systematic methodology for control design, implementation, and test. In Phase 2, this methodology was significantly streamlined to yield an on-site, single-session design/test capability. Moreover, the Phase 2 research on adaptive neural control techniques made significant progress toward fully automated, self-reliant space structure control systems. As a further thrust toward productized, self-contained vibration control systems, the Harris Phase 2 activity concluded with an experimental demonstration of new vibration isolation hardware suitable for a wide range of space-flight and ground-based commercial applications. The CSI GI Program Phase 1 activity was conducted under contract NASA1-18872, and the Phase 2 activity was conducted under NASA1-19372.

  6. Conceptual design and multidisciplinary optimization of in-plane morphing wing structures

    NASA Astrophysics Data System (ADS)

    Inoyama, Daisaku; Sanders, Brian P.; Joo, James J.

    2006-03-01

    In this paper, a topology optimization methodology for the synthesis of a distributed actuation system, with specific application to morphing air vehicles, is discussed. The main emphasis is placed on the topology optimization problem formulations and the development of computational modeling concepts. For demonstration purposes, an in-plane morphing wing model is presented. The analysis model is developed to meet several important criteria: it must allow large rigid-body displacements, as well as variation in planform area, with minimum strain on structural members, while retaining acceptable numerical stability for finite element analysis. Preliminary work has indicated that the proposed modeling concept meets these criteria and may be suitable for the purpose. Topology optimization is performed on the ground structure based on this modeling concept, with design variables that control the system configuration. In other words, the state of each element in the model is a design variable, to be determined through the optimization process. In effect, the optimization process assigns morphing members as 'soft' elements, non-morphing load-bearing members as 'stiff' elements, and non-existent members as 'voids.' In addition, the optimization process determines the locations and relative force intensities of the distributed actuators, represented computationally as equal and opposite nodal forces with soft axial stiffness. Several different optimization problem formulations are investigated to understand their potential benefits for solution quality, as well as the meaningfulness of the formulations themselves. Sample in-plane morphing problems are solved to demonstrate the potential capability of the methodology introduced in this paper.

  7. Using aerial photography to estimate wood suitable for charcoal in managed oak forests

    NASA Astrophysics Data System (ADS)

    Ramírez-Mejía, D.; Gómez-Tagle, A.; Ghilardi, A.

    2018-02-01

    Mexican oak forests (genus Quercus) are frequently used for traditional charcoal production. Appropriate management programs are needed to ensure their long-term use, while conserving the biodiversity and ecosystem services, and associated benefits. A key variable needed to design these programs is the spatial distribution of standing woody biomass. A state-of-the-art methodology using small format aerial photographs was developed to estimate the total aboveground biomass (AGB) and aboveground woody biomass suitable for charcoal making (WSC) in intensively managed oak forests. We used tree crown area (CAap) measurements from very high-resolution (30 cm) orthorectified small format digital aerial photographs as the predictive variable. The CAap accuracy was validated using field measurements of the crown area (CAf). Allometric relationships between: (a) CAap versus AGB, and (b) CAap versus WSC had a high significance level (R 2 > 0.91, p < 0.0001). This approach shows that it is possible to obtain sound biomass estimates as a function of the crown area derived from digital small format aerial photographs.
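
    Allometric relationships such as CAap versus AGB are typically fitted as a power law, i.e. a straight line in log-log space. A minimal sketch with synthetic data; the coefficients 3.5 and 1.2 and the noise level are assumptions for illustration, not the study's fitted values:

```python
import numpy as np

# Allometric model AGB = a * CA^b, fitted as a line in log-log space.
# Crown areas (m^2) and biomasses (kg) are synthetic, not field data.
rng = np.random.default_rng(1)
ca = rng.uniform(2, 40, 50)                    # photo-derived crown areas
agb = 3.5 * ca**1.2 * rng.lognormal(0, 0.1, 50)  # multiplicative noise

b, log_a = np.polyfit(np.log(ca), np.log(agb), 1)
a = np.exp(log_a)

# Coefficient of determination of the log-log fit
resid = np.log(agb) - (log_a + b * np.log(ca))
r2 = 1 - resid.var() / np.log(agb).var()
print(f"AGB ~ {a:.2f} * CA^{b:.2f}, R2 = {r2:.3f}")
```

    The same regression applied separately to the wood fraction usable for charcoal gives the CAap-versus-WSC relationship reported in the abstract.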

  8. The chloride induced localised corrosion of aluminium and beryllium: A study by electron and X-ray spectroscopies

    NASA Astrophysics Data System (ADS)

    Mallinson, Christopher F.

    Beryllium is an important metal in the nuclear industry for which there are no suitable replacements. It undergoes localised corrosion at the sites of heterogeneities in the metal surface, and corrosion pits are associated with a range of second-phase particles. To investigate the role of these particles in corrosion, a safe experimental protocol was established using an aluminium alloy as a corrosion material analogue. The 7075-T6 alloy had not previously been investigated using the experimental methodology used in this thesis. This work led to the development of the experimental methodology and safe working practices for handling beryllium. The range and composition of the second-phase particles present in S-65 beryllium billet were identified using a combination of SEM, AES, EDX and WDX. Following the identification of a range of particles with various compositions, including the AlFeBe4 precipitate previously associated with corrosion, the locations of the particles were marked to enable their repeated study. Attention was focused on the microchemistry in the vicinity of second-phase particles as a function of immersion time in pH 7, 0.1 M NaCl solution. The corrosion process associated with different particles was followed by repeatedly relocating the particles and analysing them by means of SEM, AES and EDX. The use of traditional chlorinated vapour degreasing solvents on beryllium was investigated and compared to two modern, commercially available cleaning solutions designed as drop-in replacements. This work expanded the range of solvents suitable for cleaning beryllium and validated the conclusions of previous thermodynamic modelling. Additionally, a new experimental methodology has been developed that enables the acquisition of chemical state information from the surface of micron-scale features. This was applied to sub-micron copper and iron particles, as well as a copper intermetallic.

  9. Human-Centered Design Study: Enhancing the Usability of a Mobile Phone App in an Integrated Falls Risk Detection System for Use by Older Adult Users

    PubMed Central

    Harte, Richard; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul MA; Scharf, Thomas; ÓLaighin, Gearóid

    2017-01-01

    Background Design processes such as human-centered design (HCD), which involve the end user throughout the product development and testing process, can be crucial in ensuring that the product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of HCD can often conflict with the necessary rapid product development life-cycles associated with the competitive connected health industry. Objective The aim of this study was to apply a structured HCD methodology to the development of a smartphone app that was to be used within a connected health fall risk detection system. Our methodology utilizes so-called discount usability engineering techniques to minimize the burden on resources during development and maintain a rapid pace of development. This study provides prospective designers with a detailed description of the application of an HCD methodology. Methods A 3-phase methodology was applied. In the first phase, a descriptive “use case” was developed by the system designers and analyzed by both expert stakeholders and end users. The use case described the use of the app and how various actors would interact with it and in what context. A working app prototype and a user manual were then developed based on this feedback and were subjected to a rigorous usability inspection. Further changes were made both to the interface and support documentation. The now advanced prototype was exposed to user testing by end users where further design recommendations were made. Results Combined expert and end-user analysis of a comprehensive use case originally identified 21 problems with the system interface; only 3 of these problems were observed in user testing, implying that 18 problems were eliminated between phases 1 and 3. 
Satisfactory ratings were obtained during validation testing by both experts and end users, and final testing by users shows the system requires low mental, physical, and temporal demands according to the NASA Task Load Index (NASA-TLX). Conclusions From our observation of older adults’ interactions with smartphone interfaces, there were some recurring themes. Clear and relevant feedback as the user attempts to complete a task is critical. Feedback should include pop-ups, sound tones, color or texture changes, or icon changes to indicate that a function has been completed successfully, such as for the connection sequence. For text feedback, clear and unambiguous language should be used so as not to create anxiety, particularly when it comes to saving data. Warning tones or symbols, such as caution symbols or shrill tones, should only be used if absolutely necessary. Our HCD methodology, designed and implemented based on the principles of the International Organization for Standardization (ISO) 9241-210 standard, produced a functional app interface within a short production cycle, which is now suitable for use by older adults in long-term clinical trials. PMID:28559227
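
    The NASA-TLX assessment mentioned above follows a standard scoring procedure: six subscale ratings are combined with weights obtained from 15 pairwise comparisons. A minimal sketch of that calculation (not code from the study; the example numbers are invented):

```python
def tlx_score(ratings, weights):
    """Weighted NASA-TLX workload score.

    ratings: six subscale ratings on a 0-100 scale (mental demand,
    physical demand, temporal demand, performance, effort, frustration).
    weights: tallies from the 15 pairwise comparisons (must sum to 15).
    """
    assert len(ratings) == len(weights) == 6 and sum(weights) == 15
    return sum(r * w for r, w in zip(ratings, weights)) / 15.0

# Invented example: low mental/physical/temporal demand ratings
print(tlx_score([20, 10, 15, 25, 30, 10], [4, 1, 2, 3, 4, 1]))
```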

  10. A high-efficiency low-voltage class-E PA for IoT applications in sub-1 GHz frequency range

    NASA Astrophysics Data System (ADS)

    Zhou, Chenyi; Lu, Zhenghao; Gu, Jiangmin; Yu, Xiaopeng

    2017-10-01

    We propose a complete, iterative integrated-circuit and electromagnetic (EM) co-design methodology and procedure for a low-voltage sub-1 GHz class-E PA. The presented class-E PA consists of the on-chip power transistor, the on-chip gate driving circuits, the off-chip tunable LC load network and the off-chip LC ladder low pass filter. The design methodology includes an explicit design-equation-based analysis and numerical derivation of circuit component values, output-power-targeted transistor sizing and low pass filter design, and power-efficiency-oriented design optimization. The proposed design procedure includes the power-efficiency-oriented LC network tuning and a detailed circuit/EM co-simulation plan at the integrated circuit, package and PCB levels to ensure an accurate simulation-to-measurement match and first-pass design success. The proposed PA is targeted to achieve more than 15 dBm output power delivery and 40% power efficiency in the 433 MHz frequency band with a 1.5 V low-voltage supply. The LC load network is designed to be off-chip for the purpose of easy tuning and optimization. The same circuit can be extended to all sub-1 GHz applications with the same tuning and optimization of the load network at different frequencies. The amplifier is implemented in 0.13 μm CMOS technology with a core area occupation of 400 μm by 300 μm. Measurement results showed that it provided power delivery of 16.42 dBm at the antenna with an efficiency of 40.6%. A harmonics suppression of 44 dBc is achieved, making it suitable for massive deployment of IoT devices. Project supported by the National Natural Science Foundation of China (No. 61574125) and the Industry Innovation Project of Suzhou City of China (No. SYG201641).
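
    The explicit design equations themselves are not reproduced in the abstract; as an illustration of the style of first-cut analysis involved, Sokal's classic ideal-switch class-E equations give starting component values for the stated 1.5 V, 15 dBm, 433 MHz target (a sketch only, not the authors' equations; switch losses and tuning shift these values in practice):

```python
import math

def class_e_load(vdd, p_out_dbm, f_hz):
    """First-cut class-E values from Sokal's ideal 50%-duty-cycle
    equations: optimum load R, shunt capacitance, excess series inductance."""
    p_out = 1e-3 * 10 ** (p_out_dbm / 10.0)   # dBm -> W
    w = 2.0 * math.pi * f_hz
    r_load = 0.5768 * vdd ** 2 / p_out        # P_out = 0.5768 * Vdd^2 / R
    c_shunt = 0.1836 / (w * r_load)           # C1 = 1 / (5.447 * w * R)
    l_excess = 1.1525 * r_load / w            # series excess reactance X = 1.1525 * R
    return r_load, c_shunt, l_excess

# The paper's 1.5 V supply, 15 dBm, 433 MHz operating point:
r, c, l = class_e_load(1.5, 15.0, 433e6)
print(f"R = {r:.1f} ohm, C_shunt = {c * 1e12:.2f} pF, L_excess = {l * 1e9:.1f} nH")
```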

  11. A geometric projection method for designing three-dimensional open lattices with inverse homogenization

    DOE PAGES

    Watts, Seth; Tortorelli, Daniel A.

    2017-04-13

    Topology optimization is a methodology for assigning material or void to each point in a design domain in a way that extremizes some objective function, such as the compliance of a structure under given loads, subject to various imposed constraints, such as an upper bound on the mass of the structure. Geometry projection is a means to parameterize the topology optimization problem, by describing the design in a way that is independent of the mesh used for analysis of the design's performance; it results in many fewer design parameters, necessarily resolves the ill-posed nature of the topology optimization problem, and provides sharp descriptions of the material interfaces. We extend previous geometric projection work to 3 dimensions and design unit cells for lattice materials using inverse homogenization. We perform a sensitivity analysis of the geometric projection and show it has smooth derivatives, making it suitable for use with gradient-based optimization algorithms. The technique is demonstrated by designing unit cells comprised of a single constituent material plus void space to obtain light, stiff materials with cubic and isotropic material symmetry. Here, we also design a single-constituent isotropic material with negative Poisson's ratio and a light, stiff material comprised of 2 constituent solids plus void space.
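
    The geometric projection idea can be sketched in two dimensions: a bar primitive is mapped to nodal densities through a smoothed distance function, so the density field varies differentiably with the bar's endpoints and radius, which is what enables gradient-based optimization. A simplified illustration (the smoothstep window and all parameters here are assumptions, not the authors' formulation):

```python
import numpy as np

def project_bar(grid_x, grid_y, p0, p1, radius, h):
    """Project one bar primitive onto a density grid via a smoothed
    distance function, as in geometry-projection topology optimization."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    seg = p1 - p0
    pts = np.stack([grid_x, grid_y], axis=-1) - p0
    # closest point on the segment (clamped parametric coordinate)
    t = np.clip((pts @ seg) / (seg @ seg), 0.0, 1.0)
    d = np.linalg.norm(pts - t[..., None] * seg, axis=-1)
    # smooth ramp of half-width h around the bar surface
    s = np.clip((radius - d) / (2 * h) + 0.5, 0.0, 1.0)
    return 3 * s**2 - 2 * s**3   # smoothstep: C1-continuous in d

x, y = np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50))
rho = project_bar(x, y, (0.1, 0.5), (0.9, 0.5), 0.1, 0.02)
print(rho.min(), rho.max())  # densities stay within [0, 1]
```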

  12. A geometric projection method for designing three-dimensional open lattices with inverse homogenization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watts, Seth; Tortorelli, Daniel A.

    Topology optimization is a methodology for assigning material or void to each point in a design domain in a way that extremizes some objective function, such as the compliance of a structure under given loads, subject to various imposed constraints, such as an upper bound on the mass of the structure. Geometry projection is a means to parameterize the topology optimization problem, by describing the design in a way that is independent of the mesh used for analysis of the design's performance; it results in many fewer design parameters, necessarily resolves the ill-posed nature of the topology optimization problem, and provides sharp descriptions of the material interfaces. We extend previous geometric projection work to 3 dimensions and design unit cells for lattice materials using inverse homogenization. We perform a sensitivity analysis of the geometric projection and show it has smooth derivatives, making it suitable for use with gradient-based optimization algorithms. The technique is demonstrated by designing unit cells comprised of a single constituent material plus void space to obtain light, stiff materials with cubic and isotropic material symmetry. Here, we also design a single-constituent isotropic material with negative Poisson's ratio and a light, stiff material comprised of 2 constituent solids plus void space.

  13. Towards a Methodology for the Design of Multimedia Public Access Interfaces.

    ERIC Educational Resources Information Center

    Rowley, Jennifer

    1998-01-01

    Discussion of information systems methodologies that can contribute to interface design for public access systems covers: the systems life cycle; advantages of adopting information systems methodologies; soft systems methodologies; task-oriented approaches to user interface design; holistic design, the Star model, and prototyping; the…

  14. Exploratory High-Fidelity Aerostructural Optimization Using an Efficient Monolithic Solution Method

    NASA Astrophysics Data System (ADS)

    Zhang, Jenmy Zimi

    This thesis is motivated by the desire to discover fuel efficient aircraft concepts through exploratory design. An optimization methodology based on tightly integrated high-fidelity aerostructural analysis is proposed, which has the flexibility, robustness, and efficiency to contribute to this goal. The present aerostructural optimization methodology uses an integrated geometry parameterization and mesh movement strategy, which was initially proposed for aerodynamic shape optimization. This integrated approach provides the optimizer with a large amount of geometric freedom for conducting exploratory design, while allowing for efficient and robust mesh movement in the presence of substantial shape changes. In extending this approach to aerostructural optimization, this thesis has addressed a number of important challenges. A structural mesh deformation strategy has been introduced to translate consistently the shape changes described by the geometry parameterization to the structural model. A three-field formulation of the discrete steady aerostructural residual couples the mesh movement equations with the three-dimensional Euler equations and a linear structural analysis. Gradients needed for optimization are computed with a three-field coupled adjoint approach. A number of investigations have been conducted to demonstrate the suitability and accuracy of the present methodology for use in aerostructural optimization involving substantial shape changes. Robustness and efficiency in the coupled solution algorithms is crucial to the success of an exploratory optimization. This thesis therefore also focuses on the design of an effective monolithic solution algorithm for the proposed methodology. This involves using a Newton-Krylov method for the aerostructural analysis and a preconditioned Krylov subspace method for the coupled adjoint solution. Several aspects of the monolithic solution method have been investigated. 
These include appropriate strategies for scaling and matrix-vector product evaluation, as well as block preconditioning techniques that preserve the modularity between subproblems. The monolithic solution method is applied to problems with varying degrees of fluid-structural coupling, as well as a wing span optimization study. The monolithic solution algorithm typically requires 20%-70% less computing time than its partitioned counterpart. This advantage increases with increasing wing flexibility. The performance of the monolithic solution method is also much less sensitive to the choice of solution parameters.
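
    The monolithic idea of driving all coupled fields to zero through a single residual can be illustrated on a toy two-field system; the fields and coupling below are invented stand-ins solved with SciPy's generic Jacobian-free newton_krylov, not the thesis's aerostructural solver:

```python
import numpy as np
from scipy.optimize import newton_krylov

def residual(x):
    """Toy two-field coupled residual (an invented stand-in for the
    aerostructural system): 'aero' state u is driven by 'structural'
    state v, and v is loaded in turn by u."""
    u, v = x[:2], x[2:]
    r_aero = u - np.tanh(0.3 * v)
    r_struct = v - 0.5 * u - 1.0
    return np.concatenate([r_aero, r_struct])

# Monolithic solve: both fields converge together in one Newton-Krylov loop
sol = newton_krylov(residual, np.zeros(4), f_tol=1e-9)
print(np.abs(residual(sol)).max())
```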

  15. Wide-angle Marine Seismic Refraction Imaging of Vertical Faults: Pre-Stack Turning Wave Migrations of Synthetic Data and Implications for Survey Design

    NASA Astrophysics Data System (ADS)

    Miller, N. C.; Lizarralde, D.; McGuire, J.; Hole, J. A.

    2006-12-01

    We consider methodologies, including survey design and processing algorithms, which are best suited to imaging vertical reflectors in oceanic crust using marine seismic techniques. The ability to image the reflectivity structure of transform faults as a function of depth, for example, may provide new insights into what controls seismicity along these plate boundaries. Turning-wave migration has been used with success to image vertical faults on land. With synthetic datasets we find that this approach has unique difficulties in the deep ocean. The fault-reflected crustal refraction phase (Pg-r) typically used in pre-stack migrations is difficult to isolate in marine seismic data. An "imagable" Pg-r is only observed in a time window between the first arrivals and arrivals from the sediments and the thick, slow water layer at offsets beyond ~25 km. Ocean-bottom seismometers (OBSs), as opposed to a long surface streamer, must be used to acquire data suitable for crustal-scale vertical imaging. The critical distance for Moho reflections (PmP) in oceanic crust is also ~25 km, thus Pg-r and PmP-r are observed with very little separation, and the fault-reflected mantle refraction (Pn-r) arrives prior to Pg-r as the observation window opens with increased OBS-to-fault distance. This situation presents difficulties for "first-arrival" based Kirchhoff migration approaches and suggests that wave-equation approaches, which in theory can image all three phases simultaneously, may be more suitable for vertical imaging in oceanic crust. We will present a comparison of these approaches as applied to a synthetic dataset generated from realistic, stochastic velocity models. We will assess their suitability, the migration artifacts unique to the deep ocean, and the ideal instrument layout for such an experiment.

  16. Conceptual Chemical Process Design for Sustainability. ...

    EPA Pesticide Factsheets

    This chapter examines the sustainable design of chemical processes, with a focus on conceptual design, hierarchical and short-cut methods, and analyses of process sustainability for alternatives. The chapter describes a methodology for incorporating process sustainability analyses throughout the conceptual design. Hierarchical and short-cut decision-making methods will be used to approach sustainability. An example showing a sustainability-based evaluation of chlor-alkali production processes is presented with economic analysis and five pollutants described as emissions. These emissions are analyzed according to their human toxicity potential by ingestion using the Waste Reduction Algorithm and a method based on US Environmental Protection Agency reference doses, with the addition of biodegradation for suitable components. Among the emissions, mercury as an element will not biodegrade, and results show the importance of this pollutant to the potential toxicity results and therefore the sustainability of the process design. The dominance of mercury in determining the long-term toxicity results when energy use is included suggests that all process system evaluations should (re)consider the role of mercury and other non-/slow-degrading pollutants in sustainability analyses. The cycling of nondegrading pollutants through the biosphere suggests the need for a complete analysis based on the economic, environmental, and social aspects of sustainability.

  17. Exploring the possibility of modeling a genetic counseling guideline using agile methodology.

    PubMed

    Choi, Jeeyae

    2013-01-01

    Increased demand for genetic counseling services has heightened the necessity of a computerized genetic counseling decision support system. In order to develop an effective and efficient computerized system, modeling of a genetic counseling guideline is an essential step. In this pilot study, Agile methodology with Unified Modeling Language (UML) was utilized to model a guideline. Thirteen tasks and 14 associated elements were extracted. Successfully constructed conceptual class and activity diagrams revealed that Agile methodology with UML was a suitable tool for modeling a genetic counseling guideline.

  18. Fusion of MultiSpectral and Panchromatic Images Based on Morphological Operators.

    PubMed

    Restaino, Rocco; Vivone, Gemine; Dalla Mura, Mauro; Chanussot, Jocelyn

    2016-04-20

    Nonlinear decomposition schemes constitute an alternative to classical approaches for facing the problem of data fusion. In this paper we discuss the application of this methodology to a popular remote sensing application called pansharpening, which consists of the fusion of a low-resolution multispectral image and a high-resolution panchromatic image. We design a complete pansharpening scheme based on the use of morphological half-gradient operators and demonstrate the suitability of this algorithm through comparison with state-of-the-art approaches. Four datasets acquired by the Pleiades, Worldview-2, Ikonos and Geoeye-1 satellites are employed for the performance assessment, testifying to the effectiveness of the proposed approach in producing top-class images with a setting independent of the specific sensor.
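
    Morphological half-gradients can be sketched with standard grayscale morphology; the toy detail-injection step below is a simplified stand-in for the paper's full pansharpening scheme (the window size, gain, and injection rule are assumptions, not the authors' algorithm):

```python
import numpy as np
from scipy.ndimage import grey_dilation, grey_erosion

def half_gradients(img, size=3):
    """Morphological half-gradients of a grayscale image:
    external (dilation - img) and internal (img - erosion)."""
    ext = grey_dilation(img, size=(size, size)) - img
    inte = img - grey_erosion(img, size=(size, size))
    return ext, inte

def pansharpen_band(ms_band, pan, gain=1.0, size=3):
    """Toy detail injection: add a signed PAN detail layer, built from
    the two half-gradients, to an (already upsampled) MS band."""
    ext, inte = half_gradients(pan, size)
    return ms_band + gain * 0.5 * (ext - inte)

pan = np.zeros((8, 8)); pan[:, 4:] = 1.0   # synthetic PAN with a step edge
ms = np.full((8, 8), 0.5)                  # flat low-resolution band
sharp = pansharpen_band(ms, pan)
print(sharp.shape)
```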

  19. Dynamic peptide libraries for the discovery of supramolecular nanomaterials

    NASA Astrophysics Data System (ADS)

    Pappas, Charalampos G.; Shafi, Ramim; Sasselli, Ivan R.; Siccardi, Henry; Wang, Tong; Narang, Vishal; Abzalimov, Rinat; Wijerathne, Nadeesha; Ulijn, Rein V.

    2016-11-01

    Sequence-specific polymers, such as oligonucleotides and peptides, can be used as building blocks for functional supramolecular nanomaterials. The design and selection of suitable self-assembling sequences is, however, challenging because of the vast combinatorial space available. Here we report a methodology that allows the peptide sequence space to be searched for self-assembling structures. In this approach, unprotected homo- and heterodipeptides (including aromatic, aliphatic, polar and charged amino acids) are subjected to continuous enzymatic condensation, hydrolysis and sequence exchange to create a dynamic combinatorial peptide library. The free-energy change associated with the assembly process itself gives rise to selective amplification of self-assembling candidates. By changing the environmental conditions during the selection process, different sequences and consequent nanoscale morphologies are selected.

  20. Topical, Biological and Clinical Challenges in the Management of Patients with Acne Vulgaris

    PubMed Central

    Al-Hammadi, Anwar; Al-Ismaily, Abla; Al-Ali, Sameer; Ramadurai, Rajesh; Jain, Rishi; McKinley-Grant, Lynn; Mughal, Tariq I.

    2016-01-01

    Acne vulgaris is one of the most common chronic inflammatory skin disorders among adolescents and young adults. It is associated with substantial morbidity and, rarely, with mortality. The exact worldwide incidence and prevalence are currently unknown. Current challenges involve improving understanding of the underlying pathophysiology of acne vulgaris and developing a practical treatment consensus. Expert panel discussions were held in 2013 and 2014 among a group of scientists and clinicians from the Omani and United Arab Emirate Dermatology Societies to ascertain the current optimal management of acne vulgaris, identify clinically relevant end-points and construct suitable methodology for future clinical trial designs. This article reviews the discussions of these sessions and recent literature on this topic. PMID:27226905

  1. Dynamic peptide libraries for the discovery of supramolecular nanomaterials.

    PubMed

    Pappas, Charalampos G; Shafi, Ramim; Sasselli, Ivan R; Siccardi, Henry; Wang, Tong; Narang, Vishal; Abzalimov, Rinat; Wijerathne, Nadeesha; Ulijn, Rein V

    2016-11-01

    Sequence-specific polymers, such as oligonucleotides and peptides, can be used as building blocks for functional supramolecular nanomaterials. The design and selection of suitable self-assembling sequences is, however, challenging because of the vast combinatorial space available. Here we report a methodology that allows the peptide sequence space to be searched for self-assembling structures. In this approach, unprotected homo- and heterodipeptides (including aromatic, aliphatic, polar and charged amino acids) are subjected to continuous enzymatic condensation, hydrolysis and sequence exchange to create a dynamic combinatorial peptide library. The free-energy change associated with the assembly process itself gives rise to selective amplification of self-assembling candidates. By changing the environmental conditions during the selection process, different sequences and consequent nanoscale morphologies are selected.

  2. A qualitative multiresolution model for counterterrorism

    NASA Astrophysics Data System (ADS)

    Davis, Paul K.

    2006-05-01

    This paper describes a prototype model for exploring counterterrorism issues related to the recruiting effectiveness of organizations such as al Qaeda. The prototype demonstrates how a model can be built using qualitative input variables appropriate to representation of social-science knowledge, and how a multiresolution design can allow a user to think and operate at several levels - such as first conducting low-resolution exploratory analysis and then zooming into several layers of detail. The prototype also motivates and introduces a variety of nonlinear mathematical methods for representing how certain influences combine. This has value for, e.g., representing collapse phenomena underlying some theories of victory, and for explanations of historical results. The methodology is believed to be suitable for more extensive system modeling of terrorism and counterterrorism.

  3. Heliostat cost optimization study

    NASA Astrophysics Data System (ADS)

    von Reeken, Finn; Weinrebe, Gerhard; Keck, Thomas; Balz, Markus

    2016-05-01

    This paper presents a methodology for a heliostat cost optimization study. First different variants of small, medium sized and large heliostats are designed. Then the respective costs, tracking and optical quality are determined. For the calculation of optical quality a structural model of the heliostat is programmed and analyzed using finite element software. The costs are determined based on inquiries and from experience with similar structures. Eventually the levelised electricity costs for a reference power tower plant are calculated. Before each annual simulation run the heliostat field is optimized. Calculated LCOEs are then used to identify the most suitable option(s). Finally, the conclusions and findings of this extensive cost study are used to define the concept of a new cost-efficient heliostat called `Stellio'.
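
    The levelised cost of electricity used to rank heliostat options follows the standard annuity form, LCOE = (CRF x CAPEX + annual O&M) / annual energy, with CRF the capital recovery factor. A minimal sketch with illustrative numbers (none of them from the study):

```python
def crf(rate, years):
    """Capital recovery factor for discount rate `rate` over `years`."""
    q = (1 + rate) ** years
    return rate * q / (q - 1)

def lcoe(capex, om_per_year, annual_kwh, rate=0.07, years=25):
    """Levelised cost of electricity in $/kWh (all inputs are assumptions)."""
    return (crf(rate, years) * capex + om_per_year) / annual_kwh

# Illustrative plant: $100M capital, $2M/yr O&M, 400 GWh/yr output
print(f"{lcoe(100e6, 2e6, 400e6):.4f} $/kWh")
```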

  4. Micronised bran-enriched fresh egg tagliatelle: Significance of gums addition on pasta technological features.

    PubMed

    Martín-Esparza, M E; Raga, A; González-Martínez, C; Albors, A

    2018-06-01

    The aim of the work was to produce fibre-enriched fresh pasta based on micronised wheat bran and durum wheat semolina with appropriate techno-functional properties. Wheat semolina was replaced with fine particle size (50% below 75 µm) wheat bran - up to 11.54% (w/w). A Box-Behnken design with randomised response surface methodology was used to determine a suitable combination of carboxymethylcellulose, xanthan gum and locust bean gum to improve pasta attributes: minimum cooking loss, maximum values for water gain and swelling index, as well as better colour and texture characteristics before and after cooking. The proximate chemical composition of wheat semolina and bran was determined and the microstructure of uncooked pasta was observed as well. From the response surface methodology analysis, it is recommended to use: (i) xanthan gum over 0.6% w/w as it led to bran-enriched pasta with a better developed structure and superior cooking behaviour, (ii) a combination of xanthan gum (0.8% w/w) and carboxymethylcellulose (over 0.6% w/w) to enhance uncooked pasta yellowness.
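
    A coded Box-Behnken design like the one used here is straightforward to construct: every pair of factors is set to +/-1 with the remaining factors at their centre level, plus replicated centre runs. A sketch (the three-factor case matches the three gums studied, but the mapping of coded levels to actual concentrations is not given in the abstract):

```python
import itertools
import numpy as np

def box_behnken(k, center_runs=3):
    """Coded Box-Behnken design for k factors: +/-1 on every pair of
    factors with the rest at 0, plus replicated centre runs."""
    rows = []
    for i, j in itertools.combinations(range(k), 2):
        for a, b in itertools.product((-1, 1), repeat=2):
            row = [0] * k
            row[i], row[j] = a, b
            rows.append(row)
    rows += [[0] * k for _ in range(center_runs)]
    return np.array(rows)

# Three coded factors (e.g. CMC, xanthan, locust bean gum levels)
design = box_behnken(3)
print(design.shape)   # 12 edge runs + 3 centre runs
```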

  5. [Food consumption and the nutritional status of schoolchildren of the Community of Madrid (CAENPE): general methodology and overall food consumption. Consumo de alimentos y estado nutricional de la población escolar].

    PubMed

    Vázquez, C; de Cos, A I; Martínez, P; Jaunsolo, M A; Román, E; Gómez, C; López, T; Hernáez, I; Seijas, V; Ramos, V

    1995-01-01

    The CAENPE study (Food Consumption and Nutritional State of the School Population) was a cross-sectional observational study funded and promoted by the Directorate-General of Food Hygiene in the Ministry of Health, implemented in 1991-93, with the main aim of quantifying food consumption in the school population (6-14 years of age) in the Regional Community of Madrid, together with an anthropometric study and nutritional analysis of that population. This paper sets out the general methodology for the study, paying particular attention to the sampling design, to ensure that the sample is representative of the community, and reports the results of the overall consumption of food and its comparison with recommended diets and other population studies. Quantification shows a high and rising consumption of meat, meat products, sweets, snacks and prepared dishes, suitable consumption of eggs, legumes and fruit and a notable lack of greens, vegetables and potatoes. The basic results underline the need to introduce educational measures with practical effect on home and school menus.

  6. Spatial Ecology of Estuarine Crocodile (Crocodylus porosus) Nesting in a Fragmented Landscape.

    PubMed

    Evans, Luke J; Jones, T Hefin; Pang, Keeyen; Saimin, Silvester; Goossens, Benoit

    2016-09-19

    The role that oil palm plays in the Lower Kinabatangan region of Eastern Sabah is of considerable scientific and conservation interest, providing a model habitat for many tropical regions as they become increasingly fragmented. Crocodilians, as apex predators widely distributed throughout the tropics, are ideal indicator species for ecosystem health. Drones (or unmanned aerial vehicles (UAVs)) were used to identify crocodile nests in a fragmented landscape. Flights were targeted through the use of fuzzy overlay models and nests located primarily in areas indicated as suitable habitat. Nests displayed a number of similarities in terms of habitat characteristics allowing for refined modelling of survey locations. As well as being more cost-effective compared to traditional methods of nesting survey, the use of drones also enabled a larger survey area to be completed albeit with a limited number of flights. The study provides a methodology for targeted nest surveying, as well as a low-cost repeatable flight methodology. This approach has potential for widespread applicability across a range of species and for a variety of study designs.

  7. Antiproliferative activity of Curcuma phaeocaulis Valeton extract using ultrasonic assistance and response surface methodology.

    PubMed

    Wang, Xiaoqin; Jiang, Ying; Hu, Daode

    2017-01-02

    The objective of the study was to optimize the ultrasonic-assisted extraction of curdione, furanodienone, curcumol, and germacrone from Curcuma phaeocaulis Valeton (Val.) and investigate the antiproliferative activity of the extract. Under suitable high-performance liquid chromatography conditions, the calibration curves for these four tested compounds showed high levels of linearity and the recoveries of these four compounds were between 97.9 and 104.3%. Response surface methodology (RSM) combining central composite design and a desirability function (DF) was used to define optimal extraction parameters. The results of RSM and DF revealed that the optimum conditions were 8 mL g-1 liquid-solid ratio, 70% ethanol concentration, and 20 min of ultrasonic time. It was found that the surface structures of the sonicated herbal materials were fluffy and irregular. The C. phaeocaulis Val. extract significantly inhibited the proliferation of RKO and HT-29 cells in vitro. The results reveal that RSM can be effectively used for optimizing the ultrasonic-assisted extraction of bioactive components from C. phaeocaulis Val. for antiproliferative activity.
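
    The desirability function (DF) step combines several responses into a single score to optimize; the standard Derringer-Suich larger-is-better form is sketched below (the response names and limits are invented for illustration, not taken from the study):

```python
import math

def desirability_max(y, lo, hi, s=1.0):
    """Larger-is-better Derringer-Suich desirability: 0 below `lo`,
    1 above `hi`, a power ramp in between."""
    if y <= lo:
        return 0.0
    if y >= hi:
        return 1.0
    return ((y - lo) / (hi - lo)) ** s

def overall_desirability(ds):
    """Composite desirability: geometric mean of the individual values."""
    return math.prod(ds) ** (1.0 / len(ds))

# Invented responses: extraction yield and recovery mapped to [0, 1] and combined
print(overall_desirability([desirability_max(98, 90, 105),
                            desirability_max(70, 60, 80)]))
```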

  8. Spatial Ecology of Estuarine Crocodile (Crocodylus porosus) Nesting in a Fragmented Landscape

    PubMed Central

    Evans, Luke J.; Jones, T. Hefin; Pang, Keeyen; Saimin, Silvester; Goossens, Benoit

    2016-01-01

    The role that oil palm plays in the Lower Kinabatangan region of Eastern Sabah is of considerable scientific and conservation interest, providing a model habitat for many tropical regions as they become increasingly fragmented. Crocodilians, as apex predators widely distributed throughout the tropics, are ideal indicator species for ecosystem health. Drones (or unmanned aerial vehicles (UAVs)) were used to identify crocodile nests in a fragmented landscape. Flights were targeted through the use of fuzzy overlay models and nests located primarily in areas indicated as suitable habitat. Nests displayed a number of similarities in terms of habitat characteristics allowing for refined modelling of survey locations. As well as being more cost-effective compared to traditional methods of nesting survey, the use of drones also enabled a larger survey area to be completed albeit with a limited number of flights. The study provides a methodology for targeted nest surveying, as well as a low-cost repeatable flight methodology. This approach has potential for widespread applicability across a range of species and for a variety of study designs. PMID:27657065

  9. A prototype computerized synthesis methodology for generic space access vehicle (SAV) conceptual design

    NASA Astrophysics Data System (ADS)

    Huang, Xiao

    2006-04-01

    Today's and especially tomorrow's competitive launch vehicle design environment requires the development of a dedicated generic Space Access Vehicle (SAV) design methodology. A total of 115 industrial, research, and academic aircraft, helicopter, missile, and launch vehicle design synthesis methodologies have been evaluated. As the survey indicates, each synthesis methodology tends to focus on a specific flight vehicle configuration, thus precluding the key capability to systematically compare flight vehicle design alternatives. The aim of the research investigation is to provide decision-making bodies and the practicing engineer a design process and tool box for robust modeling and simulation of flight vehicles where the ultimate performance characteristics may hinge on numerical subtleties. This will enable the designer of a SAV for the first time to consistently compare different classes of SAV configurations on an impartial basis. This dissertation presents the development steps required towards a generic (configuration independent) hands-on flight vehicle conceptual design synthesis methodology. This process is developed such that it can be applied to any flight vehicle class if desired. In the present context, the methodology has been put into operation for the conceptual design of a tourist Space Access Vehicle. The case study illustrates elements of the design methodology & algorithm for the class of Horizontal Takeoff and Horizontal Landing (HTHL) SAVs. The HTHL SAV design application clearly outlines how the conceptual design process can be centrally organized, executed and documented with focus on design transparency, physical understanding and the capability to reproduce results. This approach offers the project lead and creative design team a management process and tool which iteratively refines the individual design logic chosen, leading to mature design methods and algorithms. 
As illustrated, the HTHL SAV hands-on design methodology offers growth potential in that the same methodology can be continually updated and extended to other SAV configuration concepts, such as the Vertical Takeoff and Vertical Landing (VTVL) SAV class. Having developed, validated and calibrated the methodology for HTHL designs in the 'hands-on' mode, the report provides an outlook how the methodology will be integrated into a prototype computerized design synthesis software AVDS-PrADOSAV in a follow-on step.

  10. [Development of an optimized formulation of damask marmalade with low energy level using Taguchi methodology].

    PubMed

    Villarroel, Mario; Castro, Ruth; Junod, Julio

    2003-06-01

    The goal of the present study was to develop an optimized formula for damask marmalade low in calories, applying Taguchi methodology to improve the quality of this product. This methodology was selected because, under real-life conditions, the result of an experiment frequently depends on the influence of several variables; one expedient way to address this is to use factorial designs. The influence of acid, thickener, sweetener and aroma additives, as well as cooking time, and possible interactions among some of them, were studied in search of the combination of these factors that optimizes the sensory quality of an experimental formulation of dietetic damask marmalade. An L8 (2(7)) orthogonal array was applied, and level average analysis was carried out according to Taguchi methodology to determine suitable working levels for the previously chosen design factors in order to achieve the desired product quality. A trained sensory panel analyzed the marmalade samples using a composite scoring test with a descriptive quantitative scale ranging from 1 = Bad to 5 = Good. It was demonstrated that the design factors sugar/aspartame, pectin and damask aroma had a significant effect (p < 0.05) on the sensory quality of the marmalade, with an 82% contribution to the response. The optimal combination turned out to be: citric acid 0.2%; pectin 1%; 30 g sugar/16 mg aspartame/100 g; damask aroma 0.5 ml/100 g; cooking time 5 minutes. Regarding chemical composition, the most important results were the decrease in carbohydrate content compared with traditional marmalade, with a 56% reduction in caloric value, and an amount of dietary fiber greater than that of similar commercial products. Storage stability assays were carried out on marmalade samples held in plastic bags of different densities and submitted to different temperatures. No perceptible sensory, microbiological or chemical changes were detected after 90 days of storage under controlled conditions.
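
    The level-average analysis described in this record can be sketched numerically. The snippet below is a minimal illustration, not the study's code: the sensory scores are hypothetical, and only the standard L8(2^7) array and the bigger-is-better level-average rule come from the abstract.

```python
import numpy as np

# Standard L8(2^7) orthogonal array: 8 runs, 7 two-level factors (levels 1, 2).
L8 = np.array([
    [1, 1, 1, 1, 1, 1, 1],
    [1, 1, 1, 2, 2, 2, 2],
    [1, 2, 2, 1, 1, 2, 2],
    [1, 2, 2, 2, 2, 1, 1],
    [2, 1, 2, 1, 2, 1, 2],
    [2, 1, 2, 2, 1, 2, 1],
    [2, 2, 1, 1, 2, 2, 1],
    [2, 2, 1, 2, 1, 1, 2],
])

# Hypothetical mean panel scores (1 = Bad .. 5 = Good) for the 8 runs.
scores = np.array([3.1, 3.8, 4.2, 3.5, 2.9, 4.4, 3.7, 4.0])

def level_averages(design, response):
    """Mean response at each level of each factor (rows: levels 1 and 2)."""
    return np.array([[response[design[:, f] == lvl].mean()
                      for f in range(design.shape[1])]
                     for lvl in (1, 2)])

avg = level_averages(L8, scores)
best_levels = avg.argmax(axis=0) + 1   # bigger-is-better: keep the higher mean
print(best_levels)
```

    Because the array is balanced (each level appears in four runs per factor), the two level means of any factor average back to the overall mean, which is a quick sanity check on the folding.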

  11. 40 CFR 80.1645 - Sample retention requirements for producers and importers of denaturant designated as suitable...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... producers and importers of denaturant designated as suitable for the manufacture of denatured fuel ethanol... suitable for the manufacture of denatured fuel ethanol meeting federal quality requirements. Beginning January 1, 2017, or on the first day that any producer or importer of ethanol denaturant designates a...

  12. Optimal design of experiments applied to headspace solid phase microextraction for the quantification of vicinal diketones in beer through gas chromatography-mass spectrometric detection.

    PubMed

    Leça, João M; Pereira, Ana C; Vieira, Ana C; Reis, Marco S; Marques, José C

    2015-08-05

    Vicinal diketones (VDK), namely diacetyl and 2,3-pentanedione, are compounds naturally found in beer that play a key role in the definition of its aroma. In lager beer, they are responsible for off-flavors (buttery flavor) and therefore their presence and quantification are of paramount importance to beer producers. Aiming at developing an accurate quantitative monitoring scheme to follow these off-flavor compounds during beer production and in the final product, the headspace solid-phase microextraction (HS-SPME) analytical procedure was tuned through experiments planned in an optimal way, and the final settings were fully validated. Optimal design of experiments (O-DOE) is a computational, statistically oriented approach for designing experiments that are most informative according to a well-defined criterion. This methodology was applied for HS-SPME optimization, leading to the following optimal extraction conditions for the quantification of VDK: a CAR/PDMS fiber, 5 mL of sample in a 20 mL vial, 5 min of pre-incubation time followed by 25 min of extraction at 30 °C, with agitation. The validation of the final analytical methodology was performed using a matrix-matched calibration, in order to minimize matrix effects. The following key features were obtained: linearity (R(2) > 0.999, both for diacetyl and 2,3-pentanedione), high sensitivity (LOD of 0.92 μg L(-1) and 2.80 μg L(-1), and LOQ of 3.30 μg L(-1) and 10.01 μg L(-1), for diacetyl and 2,3-pentanedione, respectively), recoveries of approximately 100% and suitable precision (repeatability and reproducibility lower than 3% and 7.5%, respectively). The applicability of the methodology was fully confirmed through an independent analysis of several beer samples, with analyte concentrations ranging from 4 to 200 g L(-1). Copyright © 2015 Elsevier B.V. All rights reserved.
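
    The LOD/LOQ figures of merit reported above are commonly estimated from the residual standard deviation and slope of the calibration line (a standard practice, assumed here; the paper does not state its exact formula). A minimal sketch with hypothetical calibration data, not the paper's measurements:

```python
import numpy as np

# Hypothetical matrix-matched calibration: spiked diacetyl levels (ug/L)
# versus instrument response (arbitrary units).
conc = np.array([0.0, 5.0, 10.0, 25.0, 50.0, 100.0])
resp = np.array([0.8, 12.1, 23.0, 56.9, 112.8, 224.6])

slope, intercept = np.polyfit(conc, resp, 1)
pred = slope * conc + intercept
# Residual standard deviation of the regression (n - 2 degrees of freedom).
s_res = np.sqrt(((resp - pred) ** 2).sum() / (len(conc) - 2))

# Common IUPAC-style estimates based on the calibration curve.
lod = 3.3 * s_res / slope
loq = 10.0 * s_res / slope
print(f"LOD = {lod:.2f} ug/L, LOQ = {loq:.2f} ug/L")
```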

  13. Fuzzy attitude control for a nanosatellite in leo orbit

    NASA Astrophysics Data System (ADS)

    Calvo, Daniel; Laverón-Simavilla, Ana; Lapuerta, Victoria; Aviles, Taisir

    Fuzzy logic controllers are flexible and simple, and therefore suitable for small-satellite Attitude Determination and Control Subsystems (ADCS). In this work, a tailored fuzzy controller is designed for a nanosatellite and is compared with a traditional Proportional Integral Derivative (PID) controller. Both control methodologies are compared within the same specific mission. The orbit height varies along the mission from injection at around 380 km down to a 200 km orbit, and the mission requires pointing accuracy throughout. Due to both the requirements imposed by such a low orbit and the limitations in the power available for attitude control, a robust and efficient ADCS is required. For these reasons a fuzzy logic controller is implemented as the brain of the ADCS, and its performance and efficiency are compared to those of a traditional PID. The fuzzy controller is designed as three separate controllers, each one acting on one of the Euler angles of the satellite in an orbital frame. The fuzzy memberships are constructed taking into account the mission requirements, the physical properties of the satellite and the expected performance. Both methodologies, fuzzy and PID, are fine-tuned using an automated procedure to obtain maximum efficiency at fixed performance. Finally, both methods are tested in different environments to assess their characteristics. The simulations show that the fuzzy controller is much more efficient (up to 65% less power required) in single maneuvers, achieving similar, or even better, precision than the PID. The accuracy and efficiency improvement of the fuzzy controller increases with orbit height because the environmental disturbances decrease, approaching the ideal scenario. A brief mission description is given, as well as the design process of both ADCS controllers. Finally, the validation process and the results obtained during the simulations are described. Those results show that the fuzzy logic methodology is valid for small-satellite missions, benefiting from well-developed artificial intelligence theory.
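
    A per-axis fuzzy controller of the kind described, with memberships over attitude error and weighted-average defuzzification, can be sketched as follows. The triangular membership breakpoints, the rule base and the torque limit are illustrative assumptions, not the mission's values.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c], peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def fuzzy_torque(err_deg, t_max=1e-3):
    """Single-axis fuzzy rule base mapping attitude error (deg) to torque (N*m).
    Three rules: Negative error -> +torque, Zero -> 0, Positive -> -torque."""
    mu_neg = tri(err_deg, -20.0, -10.0, 0.0)
    mu_zero = tri(err_deg, -10.0, 0.0, 10.0)
    mu_pos = tri(err_deg, 0.0, 10.0, 20.0)
    weights = np.array([mu_neg, mu_zero, mu_pos])
    outputs = np.array([+t_max, 0.0, -t_max])
    if weights.sum() == 0.0:            # outside the universe: saturate
        return t_max if err_deg < 0 else -t_max
    # Weighted-average (centroid-of-singletons) defuzzification.
    return float(weights @ outputs / weights.sum())
```

    One such controller per Euler angle, as in the abstract, gives three independent single-input single-output loops.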

  14. Clarity of objectives and working principles enhances the success of biomimetic programs.

    PubMed

    Wolff, Jonas O; Wells, David; Reid, Chris R; Blamires, Sean J

    2017-09-26

    Biomimetics, the transfer of functional principles from living systems into product designs, is increasingly being utilized by engineers. Nevertheless, recurring problems must be overcome if it is to avoid becoming a short-lived fad. Here we assess the efficiency and suitability of methods typically employed by examining three flagship examples of biomimetic design approaches from different disciplines: (1) the creation of gecko-inspired adhesives; (2) the synthesis of spider silk, and (3) the derivation of computer algorithms from natural self-organizing systems. We find that identification of the elemental working principles is the most crucial step in the biomimetic design process. It bears the highest risk of failure (e.g. losing the target function) due to false assumptions about the working principle. Common problems that hamper successful implementation are: (i) a discrepancy between biological functions and the desired properties of the product, (ii) uncertainty about objectives and applications, (iii) inherent limits in methodologies, and (iv) false assumptions about the biology of the models. Projects that aim for multi-functional products are particularly challenging to accomplish. We suggest a simplification, modularisation and specification of objectives, and a critical assessment of the suitability of the model. Comparative analyses, experimental manipulation, and numerical simulations followed by tests of artificial models have led to the successful extraction of working principles. A searchable database of biological systems would optimize the choice of a model system in top-down approaches that start at an engineering problem. Only when biomimetic projects become more predictable will there be wider acceptance of biomimetics as an innovative problem-solving tool among engineers and industry.

  15. Design of shape memory alloy actuated intelligent parabolic antenna for space applications

    NASA Astrophysics Data System (ADS)

    Kalra, Sahil; Bhattacharya, Bishakh; Munjal, B. S.

    2017-09-01

    The deployment of large flexible antennas is becoming critical for space applications today. Such antenna systems can be reconfigured in space for variable antenna footprint, and hence can be utilized for signal transmission to different geographic locations. Due to quasi-static shape change requirements, coupled with the demand of large deflection, shape memory alloy (SMA) based actuators are uniquely suitable for this system. In this paper, we discuss the design and development of a reconfigurable parabolic antenna structure. The reflector skin of the antenna is vacuum formed using a metalized polycarbonate shell. Two different strategies are chosen for the antenna actuation. Initially, an SMA wire based offset network is formed on the back side of the reflector. A computational model is developed using equivalent coefficient of thermal expansion (ECTE) for the SMA wire. Subsequently, the interaction between the antenna and SMA wire is modeled as a constrained recovery system, using a 1D modified Brinson model. Joule effect based SMA phase transformation is considered for the relationship between input voltage and temperature at the SMA wire. The antenna is modeled using ABAQUS based finite element methodology. The deflection found through the computational model is compared with that measured in experiment. Subsequently, a point-wise actuation system is developed for higher deflection. For power-minimization, an auto-locking device is developed. The performance of the new configuration is compared with the offset-network configuration. It is envisaged that the study will provide a comprehensive procedure for the design of intelligent flexible structures especially suitable for space applications.

  16. Just-in-time adaptive classifiers-part II: designing the classifier.

    PubMed

    Alippi, Cesare; Roveri, Manuel

    2008-12-01

    Aging effects, environmental changes, thermal drifts, and soft and hard faults affect physical systems by changing their nature and behavior over time. To cope with a process's evolution, adaptive solutions must be envisaged to track its dynamics; in this direction, adaptive classifiers are generally designed by assuming the stationarity hypothesis for the process generating the data, with very few results addressing nonstationary environments. This paper proposes a methodology based on k-nearest neighbor (k-NN) classifiers for designing adaptive classification systems able to react to changing conditions just-in-time (JIT), i.e., exactly when needed. k-NN classifiers have been selected for their computation-free training phase and the possibility of easily estimating the model complexity k and of keeping the computational complexity of the classifier under control through suitable data-reduction mechanisms. A JIT classifier requires temporal detection of a (possible) process deviation (an aspect tackled in a companion paper) followed by adaptive management of the knowledge base (KB) of the classifier to cope with the process change. The novelty of the proposed approach resides in the general framework supporting the real-time update of the KB of the classification system in response to novel information coming from the process, both in stationary conditions (accuracy improvement) and in nonstationary ones (process tracking), and in providing a suitable estimate of k. It is shown that the classification system grants consistency once the change brings the process generating the data to a new stationary state, as is the case in many real applications.
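
    The KB-management idea can be illustrated with a minimal k-NN classifier whose training is computation-free (samples are simply stored) and whose knowledge base is updated both with new samples and when a change is detected. This is a sketch of the general scheme under assumed interfaces, not the authors' implementation.

```python
from collections import deque, Counter
import math

class JITKnnClassifier:
    """k-NN with an updatable knowledge base (KB): new supervised samples
    are appended in stationary conditions (accuracy improvement); when a
    change-detection test fires, obsolete samples are discarded so the KB
    tracks the new process state."""

    def __init__(self, k=3, max_kb=200):
        self.k = k
        self.kb = deque(maxlen=max_kb)

    def add_sample(self, x, y):
        self.kb.append((x, y))            # "training" is computation-free

    def predict(self, x):
        nearest = sorted(self.kb, key=lambda s: math.dist(x, s[0]))[:self.k]
        votes = Counter(y for _, y in nearest)
        return votes.most_common(1)[0][0]

    def react_to_change(self, keep=20):
        """Drop the oldest knowledge, keeping only the most recent samples."""
        while len(self.kb) > keep:
            self.kb.popleft()
```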

  17. Designing the microturbine engine for waste-derived fuels.

    PubMed

    Seljak, Tine; Katrašnik, Tomaž

    2016-01-01

    Presented paper deals with adaptation procedure of a microturbine (MGT) for exploitation of refuse derived fuels (RDF). RDF often possess significantly different properties than conventional fuels and usually require at least some adaptations of internal combustion systems to obtain full functionality. With the methodology, developed in the paper it is possible to evaluate the extent of required adaptations by performing a thorough analysis of fuel combustion properties in a dedicated experimental rig suitable for testing of wide-variety of waste and biomass derived fuels. In the first part key turbine components are analyzed followed by cause and effect analysis of interaction between different fuel properties and design parameters of the components. The data are then used to build a dedicated test system where two fuels with diametric physical and chemical properties are tested - liquefied biomass waste (LW) and waste tire pyrolysis oil (TPO). The analysis suggests that exploitation of LW requires higher complexity of target MGT system as stable combustion can be achieved only with regenerative thermodynamic cycle, high fuel preheat temperatures and optimized fuel injection nozzle. Contrary, TPO requires less complex MGT design and sufficient operational stability is achieved already with simple cycle MGT and conventional fuel system. The presented approach of testing can significantly reduce the extent and cost of required adaptations of commercial system as pre-selection procedure of suitable MGT is done in developed test system. The obtained data can at the same time serve as an input for fine-tuning the processes for RDF production. Copyright © 2015. Published by Elsevier Ltd.

  18. Assessment of optimum threshold and particle shape parameter for the image analysis of aggregate size distribution of concrete sections

    NASA Astrophysics Data System (ADS)

    Ozen, Murat; Guler, Murat

    2014-02-01

    Aggregate gradation is one of the key design parameters affecting the workability and strength properties of concrete mixtures. Estimating aggregate gradation from hardened concrete samples can offer valuable insights into the quality of mixtures in terms of the degree of segregation and the amount of deviation from the specified gradation limits. In this study, a methodology is introduced to determine the particle size distribution of aggregates from 2D cross-sectional images of concrete samples. The samples used in the study were fabricated from six mix designs by varying the aggregate gradation, aggregate source and maximum aggregate size, with five replicates of each design combination. Each sample was cut into three pieces using a diamond saw and then scanned with a desktop flatbed scanner to obtain the cross-sectional images. An algorithm is proposed to determine the optimum threshold for the image analysis of the cross sections. A procedure is also suggested to determine a suitable particle shape parameter to be used in the analysis of aggregate size distribution within each cross section. Results of the analyses indicated that the optimum threshold, and hence the pixel distribution functions, may differ even between cross sections of an identical concrete sample. Moreover, the maximum Feret diameter is the most suitable shape parameter for estimating the size distribution of aggregates when computed based on the diagonal sieve opening. The outcome of this study can be of practical value for practitioners evaluating concrete in terms of the degree of segregation and the bounds of the mixture's gradation achieved during manufacturing.
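
    Both ingredients of the record's methodology admit compact sketches. The code below uses Otsu's between-class-variance criterion as a stand-in for the paper's own threshold algorithm (an assumption; the paper proposes its own procedure) and a brute-force caliper over boundary points for the maximum Feret diameter.

```python
import math
import numpy as np
from itertools import combinations

def otsu_threshold(pixels, levels=256):
    """Histogram-based optimum threshold: maximize between-class variance."""
    hist = np.bincount(pixels, minlength=levels).astype(float)
    p = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, levels):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (np.arange(t) * p[:t]).sum() / w0
        m1 = (np.arange(t, levels) * p[t:]).sum() / w1
        var = w0 * w1 * (m0 - m1) ** 2
        if var > best_var:
            best_t, best_var = t, var
    return best_t

def max_feret_diameter(points):
    """Maximum Feret (caliper) diameter of a particle: the largest distance
    between any two boundary points (O(n^2) sketch)."""
    return max(math.dist(p, q) for p, q in combinations(points, 2))
```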

  19. HRT-UML: a design method for hard real-time systems based on the UML notation

    NASA Astrophysics Data System (ADS)

    D'Alessandro, Massimo; Mazzini, Silvia; di Natale, Marco; Lipari, Giuseppe

    2002-07-01

    The Hard Real-Time Unified Modelling Language (HRT-UML) method aims at providing a comprehensive solution to the modeling of hard real-time systems. Experience shows that the design of hard real-time systems needs methodologies suitable for modeling and analyzing aspects related to time, schedulability and performance. In the European Aerospace community, a reference design method is Hierarchical Object Oriented Design (HOOD) and, in particular, its extension for the modeling of hard real-time systems, Hard Real-Time Hierarchical Object Oriented Design (HRT-HOOD), recommended by the European Space Agency (ESA) for the development of on-board systems. On the other hand, in recent years the Unified Modelling Language (UML) has gained very wide acceptance across a large range of domains all over the world, becoming a de-facto international standard, and tool vendors are very active in this potentially big market. In the Aerospace domain, the common opinion is that UML, as a general notation, is not suitable for hard real-time systems, even if its importance is recognized as a standard and as a technological trend for the near future. These considerations suggest the possibility of replacing the HRT-HOOD method with a customized version of UML that incorporates the advantages of both standards and compensates for their weak points. This approach has the clear advantage of making HRT-HOOD converge on a more powerful and expressive modeling notation. The paper identifies a mapping of the HRT-HOOD semantics onto the UML one and proposes a UML extension profile, which we call HRT-UML, based on the standard UML extension mechanisms, to fully represent HRT-HOOD design concepts. Finally, it discusses the relationships between our profile and the UML profile for schedulability, performance and time adopted by OMG in November 2001.

  20. Vertically aligned carbon nanotubes for microelectrode arrays applications.

    PubMed

    Castro Smirnov, J R; Jover, Eric; Amade, Roger; Gabriel, Gemma; Villa, Rosa; Bertran, Enric

    2012-09-01

    In this work a methodology to fabricate carbon nanotube based electrodes using plasma enhanced chemical vapour deposition has been explored and defined. The final integrated microelectrode based devices should present specific properties that make them suitable for microelectrode arrays applications. The methodology studied has been focused on the preparation of highly regular and dense vertically aligned carbon nanotube (VACNT) mat compatible with the standard lithography used for microelectrode arrays technology.

  1. The application of the rapid assessment and response methodology for cannabis prevention research among youth in the Netherlands.

    PubMed

    Dupont, Hans B; Kaplan, Charles D; Braam, Richard V; Verbraeck, Hans T; de Vries, Nanne K

    2015-08-01

    Drug prevention methods tailored to specific target groups have become increasingly important, and there is a growing need for ways to rapidly assess and situate target groups in their particular contexts. This need is associated with the implementation of evidence-based interventions (EBIs) for these specific target groups. This article describes the application of Rapid Assessment and Response (RAR) as a necessary first step in designing and implementing a prevention intervention plan for problematic cannabis use among "loitering" youth in the south of the Netherlands. Seven RAR studies were conducted using an innovative stepwise model in which the prevention field worker is central. The normative structure for the use of cannabis was found to vary across the neighborhoods of the RAR studies and emerged as the focal point in designing a suitable response. The RAR studies also identified the need in the prevention toolbox for a tailored, low-threshold, effective, individual brief intervention for problematic cannabis use among youth. The RAR was found to provide a powerful methodology for detecting target groups and generating contextual and normative data that enable the prevention field worker to select and adapt from the spectrum of existing EBIs or to develop the most promising model for implementation with the specific target group. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.

  2. Application of response surface methodology and semi-mechanistic model to optimize fluoride removal using crushed concrete in a fixed-bed column.

    PubMed

    Gu, Bon-Wun; Lee, Chang-Gu; Park, Seong-Jik

    2018-03-01

    The aim of this study was to investigate the removal of fluoride from aqueous solutions by using crushed concrete fines as a filter medium under varying conditions of pH 3-7, flow rate of 0.3-0.7 mL/min, and filter depth of 10-20 cm. The performance of fixed-bed columns was evaluated on the basis of the removal ratio (Re), uptake capacity (qe), degree of sorbent used (DoSU), and sorbent usage rate (SUR) obtained from breakthrough curves (BTCs). Three widely used semi-mechanistic models, that is, Bohart-Adams, Thomas, and Yoon-Nelson models, were applied to simulate the BTCs and to derive the design parameters. The Box-Behnken design of response surface methodology (RSM) was used to elucidate the individual and interactive effects of the three operational parameters on the column performance and to optimize these parameters. The results demonstrated that pH is the most important factor in the performance of fluoride removal by a fixed-bed column. The flow rate had a significant negative influence on Re and DoSU, and the effect of filter depth was observed only in the regression model for DoSU. Statistical analysis indicated that the model attained from the RSM study is suitable for describing the semi-mechanistic model parameters.
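
    Of the semi-mechanistic models named above, the Thomas model has a simple closed form for the breakthrough curve that is easy to sketch. The parameter values used in the demonstration below are illustrative placeholders, not the study's fitted values.

```python
import numpy as np

def thomas_btc(t_min, k_th, q0, m, c0, Q):
    """Thomas-model breakthrough curve C/C0 versus time.
    k_th: rate constant (L/(mg*min)), q0: uptake capacity (mg/g),
    m: sorbent mass (g), c0: influent concentration (mg/L), Q: flow (L/min).
    C/C0 = 1 / (1 + exp(k_th/Q * (q0*m - c0*Q*t)))."""
    return 1.0 / (1.0 + np.exp(k_th / Q * (q0 * m - c0 * Q * t_min)))

# Illustrative evaluation: breakthrough reaches C/C0 = 0.5 when the fed
# mass of solute (c0*Q*t) equals the total uptake capacity (q0*m).
ts = np.array([0.0, 2000.0, 4000.0])
frac = thomas_btc(ts, k_th=0.002, q0=2.0, m=5.0, c0=10.0, Q=0.0005)
print(frac)
```

    Fitting k_th and q0 to measured breakthrough data (e.g. by nonlinear least squares) then yields the design parameters discussed in the abstract.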

  3. Measurement of Flow Pattern Within a Rotating Stall Cell in an Axial Compressor

    NASA Technical Reports Server (NTRS)

    Lepicovsky, Jan; Braunscheidel, Edward P.

    2006-01-01

    Effective active control of rotating stall in axial compressors requires a detailed understanding of the flow instabilities associated with this compressor regime. Newly designed miniature high-frequency-response total and static pressure probes, as well as commercial thermoanemometric probes, are suitable tools for this task. However, during the rotating stall cycle the probes are subjected to flow direction changes far larger than the range of probe incidence acceptance, and therefore probe data without a proper correction would misrepresent the unsteady variations of flow parameters. A methodology based on ensemble averaging is proposed to circumvent this problem. In this approach, the ensemble-averaged signals acquired for various probe setting angles are segmented, and only the sections for which the probe setting angle is close to the actual flow angle are used for signal recombination. The methodology was verified by excellent agreement between velocity distributions obtained from pressure probe data and data measured with thermoanemometric probes. Vector plots of unsteady flow behavior during the rotating stall regime indicate reversed flow within the rotating stall cell that spreads over to adjacent rotor blade channels. Results of this study confirmed that the NASA Low Speed Axial Compressor (LSAC), while in a rotating stall regime at the rotor design speed, exhibits one stall cell that rotates at a speed equal to 50.6 percent of the rotor shaft speed.
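
    The core ensemble-averaging step can be sketched as folding a phase-locked (once-per-stall-cycle-triggered) signal into cycles and averaging sample-by-sample; the segmentation and recombination over probe setting angles described above is omitted here for brevity.

```python
import numpy as np

def ensemble_average(signal, samples_per_cycle):
    """Phase-locked ensemble average: fold a cycle-triggered signal into
    cycles and average sample-by-sample, suppressing uncorrelated noise
    while preserving the periodic (rotating-stall) pattern."""
    n_cycles = len(signal) // samples_per_cycle
    folded = np.reshape(signal[:n_cycles * samples_per_cycle],
                        (n_cycles, samples_per_cycle))
    return folded.mean(axis=0)
```

    With N cycles, uncorrelated fluctuations are attenuated by roughly 1/sqrt(N), which is why the averaged signals are suitable for the angle-based segmentation step.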

  4. Magnetic fluid control for viscous loss reduction of high-speed MRF brakes and clutches with well-defined fail-safe behavior

    NASA Astrophysics Data System (ADS)

    Güth, Dirk; Schamoni, Markus; Maas, Jürgen

    2013-09-01

    No-load losses within brakes and clutches based on magnetorheological fluids are unavoidable and represent a major barrier towards their wide-spread commercial adoption. Completely torque free rotation is not yet possible due to persistent fluid contact within the shear gap. In this paper, a novel concept is presented that facilitates the controlled movement of the magnetorheological fluid from an active, torque-transmitting region into an inactive region of the shear gap. This concept enables complete decoupling of the fluid engaging surfaces such that viscous drag torque can be eliminated. In order to achieve the desired effect, motion in the magnetorheological fluid is induced by magnetic forces acting on the fluid, which requires an appropriate magnetic circuit design. In this investigation, we propose a methodology to determine suitable magnetic circuit designs with well-defined fail-safe behavior. The magnetically induced motion of magnetorheological fluids is modeled by the use of the Kelvin body force, and a multi-physics domain simulation is performed to elucidate various transitions between an engaged and disengaged operating mode. The modeling approach is validated by captured high-speed video frames which show the induced motion of the magnetorheological fluid due to the magnetic field. Finally, measurements performed with a prototype actuator prove that the induced viscous drag torque can be reduced significantly by the proposed magnetic fluid control methodology.

  5. Experimental design based response surface methodology optimization of ultrasonic assisted adsorption of safranin O by tin sulfide nanoparticle loaded on activated carbon.

    PubMed

    Roosta, M; Ghaedi, M; Daneshfar, A; Sahraei, R

    2014-03-25

    In this research, the adsorption rate of safranine O (SO) onto tin sulfide nanoparticles loaded on activated carbon (SnS-NP-AC) was accelerated by ultrasound. SnS-NP-AC was characterized by different techniques such as SEM, XRD and UV-Vis measurements. The present results confirm that the ultrasound-assisted adsorption method has a remarkable ability to improve the adsorption efficiency. The influence of parameters such as sonication time, adsorbent dosage, pH and initial SO concentration was examined and evaluated by central composite design (CCD) combined with response surface methodology (RSM) and a desirability function (DF). Conducting adsorption experiments at the optimal conditions (4 min of sonication time, 0.024 g of adsorbent, pH 7 and 18 mg L(-1) SO) made it possible to achieve a high removal percentage (98%) and a high adsorption capacity (50.25 mg g(-1)). Good agreement between experimental and predicted data was observed in this study. Fitting the experimental equilibrium data to the Langmuir, Freundlich, Tempkin and Dubinin-Radushkevich models shows that the Langmuir model is a good and suitable model for evaluating the actual adsorption behavior. Kinetic evaluation of the experimental data showed that the adsorption processes followed the pseudo-second-order and intraparticle diffusion models well. Copyright © 2013. Published by Elsevier B.V.
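
    The Langmuir evaluation mentioned above is often done via the linearized form Ce/qe = Ce/qm + 1/(KL*qm). A minimal sketch with synthetic data (not the study's measurements), showing how the slope and intercept recover the isotherm parameters:

```python
import numpy as np

# Synthetic equilibrium data: Ce (mg/L) generated from known "true" values.
Ce = np.array([2.0, 5.0, 10.0, 20.0, 40.0])
qm_true, KL_true = 50.0, 0.3
qe = qm_true * KL_true * Ce / (1.0 + KL_true * Ce)   # Langmuir isotherm (mg/g)

# Linearized Langmuir: Ce/qe = Ce/qm + 1/(KL*qm)
slope, intercept = np.polyfit(Ce, Ce / qe, 1)
qm_fit = 1.0 / slope          # monolayer capacity (mg/g)
KL_fit = slope / intercept    # Langmuir constant (L/mg)
print(qm_fit, KL_fit)
```

    With real data, the quality of this straight-line fit (R^2) is what supports the claim that the Langmuir model describes the adsorption behavior.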

  6. A comprehensive and sustainable approach to the design of the retrofitting and enlargement of the National Etruscan museum `Pompeo Aria' in Marzabotto, Italy

    NASA Astrophysics Data System (ADS)

    Mingozzi, Angelo; Bottiglioni, Sergio

    2006-12-01

    The retrofitting and enlargement of the National Etruscan Museum `Pompeo Aria' in Marzabotto, Italy, is one of the demonstration projects of the `MUSEUMS' EC FPV Project, which concerns the retrofitting and construction of eight European museums in accordance with sustainability principles. The museum was originally housed in buildings unsuitable for hosting exhibitions. As a wrong indoor microclimate could affect the objects and accelerate their deterioration, it was essential to define suitable conditions under which the pieces should be exhibited, and the structures had to be upgraded accordingly to improve their performance. The project aims to balance active systems with bioclimatic strategies and passive solar techniques in order to assure the best conditions for people's comfort and preventive conservation of the exhibits, together with energy savings. To succeed in such an attempt, it is essential to define a methodology that helps in managing the complexity; for instance, this has allowed the cross-disciplinary team to focus on common priorities and to speak a common language. In effect, each retrofitting and extension phase followed a preset design method: starting from the site analysis, general and specific targets were defined and continuously verified. The methodologies and technical solutions developed and tested during this experience have a great chance of becoming a knowledge platform and being replicated in future interventions.

  7. LCA of greywater management within a water circular economy restorative thinking framework.

    PubMed

    Dominguez, Sara; Laso, Jara; Margallo, María; Aldaco, Rubén; Rivero, Maria J; Irabien, Ángel; Ortiz, Inmaculada

    2018-04-15

    Greywater reuse is an attractive option for the sustainable management of water under water-scarcity circumstances, within a water circular economy restorative thinking framework. Its successful deployment relies on the availability of low-cost and environmentally friendly technologies. The life cycle assessment (LCA) approach provides the appropriate methodological tool for the evaluation of alternative treatments based on environmental decision criteria and, therefore, is highly useful during process conceptual design. This methodology should be employed in the early design phase to select the technologies with the lowest environmental impact. This work reports a comparative LCA of three scenarios for greywater reuse: photocatalysis, photovoltaic solar-driven photocatalysis and a membrane biological reactor, in order to help select the most environmentally friendly technology. The study focused on the removal of the surfactant sodium dodecylbenzenesulfonate, which is used in the formulation of detergents and personal care products and is thus widely present in greywater. LCA was applied using the Environmental Sustainability Assessment methodology to obtain two main environmental indicators and thereby simplify the decision-making process: natural resources and environmental burdens. Energy consumption is the main contributor to both indicators, owing to the high energy demand of the light source for the photocatalytic greywater treatment. To reduce its environmental burdens, the most desirable scenario would be the use of solar light for the photocatalytic transformation. However, until the technological challenge of the direct use of solar light is met, the environmental suitability of photovoltaic solar-driven photocatalysis for greywater reuse has been demonstrated, as it involves the smallest environmental impact among the three studied alternatives. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Defining the "proven technology" technical criterion in the reactor technology assessment for Malaysia's nuclear power program

    NASA Astrophysics Data System (ADS)

    Anuar, Nuraslinda; Kahar, Wan Shakirah Wan Abdul; Manan, Jamal Abdul Nasir Abd

    2015-04-01

    Developing countries that are considering the deployment of nuclear power plants (NPPs) in the near future need to perform a reactor technology assessment (RTA) in order to select the most suitable reactor design. The International Atomic Energy Agency (IAEA) reported in the Common User Considerations (CUC) document that "proven technology" is one of the most important technical criteria for newcomer countries performing an RTA. The qualitative description of five desired features for "proven technology" is relatively broad and only provides a general guideline to its characterization. This paper proposes a methodology to define the term "proven technology" according to a specific country's requirements using a three-stage evaluation process. The first evaluation stage screens the technologies available in the market against a predefined minimum Technology Readiness Level (TRL) derived from national needs and policy objectives. The result is a list of technology options, which are then assessed in the second evaluation stage against quantitative definitions of the CUC desired features for proven technology. The potential technology candidates produced by this evaluation are further narrowed down to a list of proven technology candidates by assessing them against selected risk criteria and an established maximum allowable total score using a scoring matrix. The outcome of this methodology is a set of proven technology candidates selected using a precise definition of "proven technology" that reflects the assessing country's policy objectives, national needs, risk tolerance and country-specific CUC desired features. A simplified assessment for Malaysia is carried out to demonstrate the use of the proposed methodology. In this exercise, the ABWR, AP1000, APR1400 and EPR designs emerged as the top-ranked proven technology candidates according to Malaysia's definition of "proven technology".
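The three-stage evaluation described above can be sketched as a simple filter-and-rank pipeline. This is a minimal illustration only; the TRL threshold, feature scores and risk scores below are invented placeholders, not values from the paper:

```python
# Hedged sketch of the three-stage screening described above. The TRL
# threshold, feature scores, and risk cap are illustrative assumptions.

MIN_TRL = 8          # assumed national minimum Technology Readiness Level
MAX_RISK_SCORE = 10  # assumed maximum allowable total risk score

designs = {
    # name: (TRL, CUC-feature score, risk score) -- all values hypothetical
    "ABWR":    (9, 8.5, 6),
    "AP1000":  (8, 8.0, 7),
    "APR1400": (8, 8.2, 5),
    "EPR":     (8, 7.8, 8),
    "DesignX": (6, 9.0, 3),   # screened out: TRL below threshold
}

# Stage 1: screen against the minimum TRL
options = {n: v for n, v in designs.items() if v[0] >= MIN_TRL}

# Stage 2: rank remaining options by quantified CUC desired-feature score
candidates = sorted(options, key=lambda n: options[n][1], reverse=True)

# Stage 3: keep only candidates whose total risk score is acceptable
proven = [n for n in candidates if options[n][2] <= MAX_RISK_SCORE]

print(proven)
```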

  10. The physical vulnerability of elements at risk: a methodology based on fluid and classical mechanics

    NASA Astrophysics Data System (ADS)

    Mazzorana, B.; Fuchs, S.; Levaggi, L.

    2012-04-01

    The impacts of the flood events that occurred in autumn 2011 in the Italian regions of Liguria and Tuscany revived the engagement of public decision makers in enhancing, in synergy, flood control and land use planning. In this context, the design of efficient flood risk mitigation strategies and their subsequent implementation critically relies on a careful vulnerability analysis of both the immobile and the mobile elements at risk potentially exposed to flood hazards. Based on notions from fluid and classical mechanics, we developed computation schemes enabling a dynamic vulnerability and risk analysis for a broad typological variety of elements at risk. The methodological skeleton consists of (1) hydrodynamic computation of the time-varying flood intensities, resulting for each element at risk in a succession of loading configurations; (2) modelling the mechanical response of the impacted elements through static, elasto-static and dynamic analyses; (3) characterising the mechanical response through proper structural damage variables; and (4) economic valuation of the expected losses as a function of the quantified damage variables. From a computational perspective we coupled the description of the hydrodynamic flow behaviour with the induced structural modifications of the exposed elements at risk. Valuation methods suitable to support a correct mapping from the value domains of the physical damage variables to economic loss values are discussed. In this way we aim to complement, from a methodological perspective, the existing, mainly empirical, vulnerability and risk assessment approaches and to refine the conceptual framework of cost-benefit analysis. Moreover, we aim to support the design of effective flood risk mitigation strategies by diminishing the main criticalities within the systems prone to flood risk.
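Step (1) of the methodological skeleton above turns a flood intensity into a loading configuration. A minimal sketch of that step for a wall-like element, combining a hydrostatic pressure prism with hydrodynamic drag; the drag coefficient, geometry and hydrograph values are illustrative assumptions, not the paper's:

```python
# Sketch of step (1): translating a flood intensity (depth h, velocity v)
# into a static loading configuration on a wall-like element at risk.

RHO = 1000.0   # water density, kg/m^3
G = 9.81       # gravitational acceleration, m/s^2
CD = 2.0       # assumed drag coefficient for a flat wall

def flood_load(h, v, width):
    """Total horizontal force (N): hydrostatic + hydrodynamic drag."""
    hydrostatic = 0.5 * RHO * G * h**2 * width       # triangular pressure prism
    hydrodynamic = 0.5 * CD * RHO * v**2 * h * width  # drag on wetted face
    return hydrostatic + hydrodynamic

# A succession of loading configurations from a time-varying hydrograph:
hydrograph = [(0.5, 1.0), (1.2, 2.5), (0.8, 1.5)]   # (depth m, velocity m/s)
loads = [flood_load(h, v, width=4.0) for h, v in hydrograph]
```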

  11. Design And Ground Testing For The Expert PL4/PL5 'Natural And Roughness Induced Transition'

    NASA Astrophysics Data System (ADS)

    Masutti, Davie; Chazot, Olivier; Donelli, Raffaele; de Rosa, Donato

    2011-05-01

    Unpredicted boundary layer transition can dramatically impact the stability of the vehicle and its aerodynamic coefficients, and can reduce the efficiency of the thermal protection system. In this context, ESA started the EXPERT (European eXPErimental Reentry Testbed) program to provide and perform in-flight experiments in order to obtain aerothermodynamic data for the validation of numerical models and of ground-to-flight extrapolation methodologies. For the boundary layer transition investigation, the EXPERT vehicle is equipped with two specific payloads, PL4 and PL5, concerning respectively the study of natural and roughness-induced transition. This paper surveys the design process of these two in-flight experiments and covers the major analyses and findings encountered during the development of the payloads. A large number of transition criteria have been investigated and used to estimate either the criticality of the height of the distributed roughness arising from nose erosion, or the effectiveness of the height of the isolated roughness element in forcing boundary layer transition. Supporting the PL4 design, linear stability computations and CFD analyses have been performed by CIRA on the EXPERT flight vehicle to determine the amplification factor of the boundary layer instabilities at different points of the re-entry trajectory. Ground test experiments for PL5 were carried out in the Mach 6 VKI H3 Hypersonic Wind Tunnel at Reynolds numbers ranging from 18E6/m to 26E6/m. Infrared measurements (Stanton number) and flow visualization on a 1/16-scale model of the EXPERT vehicle and a flat plate are used to validate the Potter and Whitfield criterion as a suitable methodology for ground-to-flight extrapolation and for the payload design.

  12. Sampling design for spatially distributed hydrogeologic and environmental processes

    USGS Publications Warehouse

    Christakos, G.; Olea, R.A.

    1992-01-01

    A methodology for the design of sampling networks over space is proposed. The methodology is based on spatial random field representations of nonhomogeneous natural processes, and on optimal spatial estimation techniques. One of the most important results of random field theory for the physical sciences is its rationalization of correlations in the spatial variability of natural processes. This correlation is extremely important both for interpreting spatially distributed observations and for predictive performance. The extent of site sampling and the types of data to be collected will depend on the relationship of subsurface variability to predictive uncertainty. While hypothesis formulation and initial identification of spatial variability characteristics are based on scientific understanding (such as knowledge of the physics of the underlying phenomena, geological interpretations, intuition and experience), the support offered by field data is statistically modelled. This model is not limited by the geometric nature of sampling and covers a wide range of subsurface uncertainties. A factorization scheme of the sampling error variance is derived, which possesses certain attractive properties allowing significant savings in computation. By means of this scheme, a practical sampling design procedure providing suitable indices of the sampling error variance is established. These indices can be used by way of multiobjective decision criteria to obtain the best sampling strategy. Neither the actual implementation of the in-situ sampling nor the solution of the large spatial estimation systems of equations is necessary. The required values of the accuracy parameters involved in the network design are derived using reference charts (readily available for various combinations of data configurations and spatial variability parameters) and certain simple yet accurate analytical formulas. Insight is gained by applying the proposed sampling procedure to realistic examples related to sampling problems in two dimensions. © 1992.

  13. FINDING A METHOD FOR THE MADNESS: A COMPARATIVE ANALYSIS OF STRATEGIC DESIGN METHODOLOGIES

    DTIC Science & Technology

    2017-06-01

    This thesis by Amanda Donnelly develops a comparative model for strategic design methodologies, focusing on the primary elements of vision, time, process, communication and collaboration, and risk assessment. The analysis dissects and compares three potential design methodologies including net assessment, scenarios and…

  14. Optimization of supercoiled HPV-16 E6/E7 plasmid DNA purification with arginine monolith using design of experiments.

    PubMed

    Almeida, A M; Queiroz, J A; Sousa, F; Sousa, A

    2015-01-26

    The progress of DNA vaccines is dependent on the development of suitable chromatographic procedures to successfully purify genetic vectors, such as plasmid DNA. Human Papillomavirus is associated with the development of tumours due to the oncogenic power of the E6 and E7 proteins produced by this virus. The supercoiled HPV-16 E6/E7 plasmid-based vaccine was recently purified with the arginine monolith with 100% purity, but only 39% recovery was achieved. Therefore, the present study describes the application of design-of-experiments tools, a methodology newly explored in preparative chromatography, to improve the recovery of supercoiled plasmid DNA with the arginine monolith while maintaining the high degree of purity. In addition, the importance and influence of pH on pDNA retention by the arginine ligand was also demonstrated. The Central Composite Face design was validated, and the recovery of the target molecule was successfully improved from 39% to 83.5%, more than doubling it, while maintaining 100% purity. Copyright © 2014 Elsevier B.V. All rights reserved.
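A face-centred central composite design of the kind validated above can be enumerated in a few lines. The factor names and ranges below are hypothetical illustrations, not the conditions of the study:

```python
# Sketch of a two-factor face-centred central composite design (CCF):
# 4 factorial corners + 4 face-centre (axial, alpha = 1) runs + centre runs.
from itertools import product

def ccf_design(lo, hi, center_runs=3):
    """CCF runs for two factors given (low, high) natural levels."""
    mid = [(l + h) / 2 for l, h in zip(lo, hi)]
    factorial = list(product(*zip(lo, hi)))            # 4 corner runs
    axial = [(lo[0], mid[1]), (hi[0], mid[1]),
             (mid[0], lo[1]), (mid[0], hi[1])]         # 4 face-centre runs
    centre = [tuple(mid)] * center_runs                # replicated centre
    return factorial + axial + centre

# e.g. factor 1 = pH, factor 2 = NaCl concentration (M) -- hypothetical ranges
runs = ccf_design(lo=(5.0, 0.1), hi=(7.0, 0.5))
print(len(runs))  # 11 runs: 4 factorial + 4 axial + 3 centre
```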

  15. Prediction of thermal behaviors of an air-cooled lithium-ion battery system for hybrid electric vehicles

    NASA Astrophysics Data System (ADS)

    Choi, Yong Seok; Kang, Dal Mo

    2014-12-01

    Thermal management has been one of the major issues in developing a lithium-ion (Li-ion) hybrid electric vehicle (HEV) battery system, since the Li-ion battery is vulnerable to excessive heat load under abnormal or severe operational conditions. In this work, in order to design a suitable thermal management system, a simple modeling methodology describing the thermal behavior of an air-cooled Li-ion battery system was proposed from a vehicle component designer's point of view. The proposed mathematical model was constructed based on the battery's electrical and mechanical properties. Validation test results for the Li-ion battery system are also presented. A pulse current duty and an adjusted US06 current cycle for a two-mode HEV system were used to validate the accuracy of the model prediction. Results showed that the present model gives good estimates when simulating convective heat transfer cooling during battery operation. The developed thermal model is useful for structuring the flow system and determining the appropriate cooling capacity for a specified design prerequisite of the battery system.
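A simple thermal model of the kind described above is often a lumped-parameter energy balance: ohmic heat generation against convective air cooling. A hedged sketch under that assumption; all parameter values and the pulse duty cycle below are invented, not the paper's:

```python
# Lumped-parameter cell thermal model: m*cp*dT/dt = I^2*R - h*A*(T - T_air),
# integrated with an explicit Euler step. All parameters are illustrative.

def simulate_cell_temp(current_profile, dt=1.0, T0=25.0, T_air=25.0,
                       R=0.002, h=30.0, A=0.05, m=1.0, cp=900.0):
    """Return the cell temperature history (deg C) over a current profile (A)."""
    T = T0
    history = [T]
    for I in current_profile:
        q_gen = I**2 * R                 # ohmic heating, W
        q_cool = h * A * (T - T_air)     # convective cooling, W
        T += dt * (q_gen - q_cool) / (m * cp)
        history.append(T)
    return history

# A crude pulse duty: 60 s pulses of 100 A separated by 60 s rests
profile = ([100.0] * 60 + [0.0] * 60) * 5
temps = simulate_cell_temp(profile)
```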

  16. Development of analytical methodologies to assess recalcitrant pesticide bioremediation in biobeds at laboratory scale.

    PubMed

    Rivero, Anisleidy; Niell, Silvina; Cerdeiras, M Pía; Heinzen, Horacio; Cesio, María Verónica

    2016-06-01

    To assess recalcitrant pesticide bioremediation it is necessary to gradually increase the complexity of the biological system used in order to design an effective biobed assembly. Each step towards this effective biobed design needs a suitable, validated analytical methodology that allows a correct evaluation of dissipation and bioconversion. Methods with low recovery yields could give a false impression of a successful biodegradation process. To address this situation, different methods were developed and validated for the simultaneous determination of endosulfan, its three main metabolites, and chlorpyrifos in increasingly complex matrices in which the bioconvertor basidiomycete Abortiporus biennis could grow. The matrices were culture media, bran, and finally a laboratory biomix composed of bran, peat and soil. The methodology for the analysis of the first matrix has already been reported; the methodologies developed for the other two systems are presented in this work. The targeted analytes were extracted from fungi growing over bran in semisolid YNB (Yeast Nitrogen Base) medium with acetonitrile using shaker-assisted extraction. The salting-out step was performed with MgSO4 and NaCl, and the extracts were analyzed by GC-ECD. The best methodology was fully validated for all the evaluated analytes at 1 and 25mgkg(-1), yielding recoveries between 72% and 109% and RSDs <11% in all cases. The application of this methodology proved that A. biennis is able to dissipate 94% of endosulfan and 87% of chlorpyrifos after 90 days. Having assessed that A. biennis growing over bran can metabolize the studied pesticides, the next step was the development and validation of an analytical procedure to evaluate the analytes in a laboratory-scale biobed composed of 50% bran, 25% peat and 25% soil together with fungal mycelium.
From the different procedures assayed, only ultrasound assisted extraction with ethyl acetate allowed recoveries between 80% and 110% with RSDs <18%. Linearity, recovery, precision, matrix effect and LODs/LOQs of each method were studied for all the analytes: endosulfan isomers (α & β) and its metabolites (endosulfan sulfate, ether and diol) as well as for chlorpyrifos. In the first laboratory evaluation of these biobeds endosulfan was bioconverted up to 87% and chlorpyrifos more than 79% after 27 days. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. A FPGA implementation for linearly unmixing a hyperspectral image using OpenCL

    NASA Astrophysics Data System (ADS)

    Guerra, Raúl; López, Sebastián; Sarmiento, Roberto

    2017-10-01

    Hyperspectral imaging systems provide images in which single pixels contain information from across the electromagnetic spectrum of the scene under analysis. These systems divide the spectrum into many contiguous channels, which may lie even outside the visible part of the spectrum. The main advantage of hyperspectral imaging technology is that certain objects leave unique fingerprints in the electromagnetic spectrum, known as spectral signatures, which make it possible to distinguish between different materials that may look the same in a traditional RGB image. Accordingly, the most important hyperspectral imaging applications are related to distinguishing or identifying materials in a particular scene. In hyperspectral imaging applications under real-time constraints, the huge amount of information provided by hyperspectral sensors has to be rapidly processed and analysed. For this purpose, parallel hardware devices such as Field Programmable Gate Arrays (FPGAs) are typically used. However, developing hardware applications typically requires expertise in the specific targeted device, as well as in the tools and methodologies that can be used to implement the desired algorithms on that device. In this scenario, the Open Computing Language (OpenCL) emerges as a very interesting solution, in which a single high-level synthesis design language can be used to efficiently develop applications on multiple, different hardware devices. In this work, the Fast Algorithm for Linearly Unmixing Hyperspectral Images (FUN) has been implemented on a Bitware Stratix V Altera FPGA using OpenCL. The obtained results demonstrate the suitability of OpenCL as a viable design methodology for quickly creating efficient FPGA designs for real-time hyperspectral imaging applications.
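The FPGA kernel itself is beyond a short sketch, but the linear unmixing it accelerates reduces to a per-pixel least-squares problem: find abundances a minimizing ||Ea − p|| for an endmember matrix E and pixel spectrum p. A sketch with synthetic spectra (the actual FUN algorithm adds structure not shown here, so this is only the core linear model):

```python
# Per-pixel linear unmixing as unconstrained least squares, on synthetic data.
import numpy as np

bands, n_endmembers = 50, 3
rng = np.random.default_rng(0)
E = rng.random((bands, n_endmembers))   # endmember signatures (one per column)

true_a = np.array([0.6, 0.3, 0.1])      # ground-truth abundances
pixel = E @ true_a                      # noise-free linearly mixed pixel

# Solve min ||E a - pixel|| for the abundance vector a
a_hat, *_ = np.linalg.lstsq(E, pixel, rcond=None)
print(np.round(a_hat, 3))  # recovers the abundances on noise-free data
```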

  18. Experimental Design of a UCAV-Based High-Energy Laser Weapon

    DTIC Science & Technology

    2016-12-01

    …propagation. The Design of Experiments (DOE) methodology is then applied to determine the significance of the UCAV-HEL design parameters and their effect on the…

  19. 10 CFR 2.110 - Filing and administrative action on submittals for standard design approval or early review of...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... standard design approval or early review of site suitability issues. 2.110 Section 2.110 Energy NUCLEAR... or early review of site suitability issues. (a)(1) A submittal for a standard design approval under... provisions of appendix Q to parts 50 of this chapter, a submittal for early review of site suitability issues...

  20. 10 CFR 2.110 - Filing and administrative action on submittals for standard design approval or early review of...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... standard design approval or early review of site suitability issues. 2.110 Section 2.110 Energy NUCLEAR... or early review of site suitability issues. (a)(1) A submittal for a standard design approval under... provisions of appendix Q to parts 50 of this chapter, a submittal for early review of site suitability issues...

  1. Development and evaluation of habitat suitability criteria for use in the instream flow incremental methodology

    USGS Publications Warehouse

    Bovee, Ken D.

    1986-01-01

    The Instream Flow Incremental Methodology (IFIM) is a habitat-based tool used to evaluate the environmental consequences of various water and land use practices. As such, knowledge about the conditions that provide favorable habitat for a species, and those that do not, is necessary for successful implementation of the methodology. In the context of IFIM, this knowledge is defined as habitat suitability criteria: characteristic behavioral traits of a species that are established as standards for comparison in the decision-making process. Habitat suitability criteria may be expressed in a variety of types and formats. The type, or category, refers to the procedure used to develop the criteria. Category I criteria are based on professional judgment, with little or no empirical data. Category II criteria have as their source microhabitat data collected at locations where target organisms are observed or collected. These are called "utilization" functions because they are based on observed locations that were used by the target organism. These functions tend to be biased by the environmental conditions that were available to the fish or invertebrates at the time they were observed. Correcting the utilization function for environmental availability creates category III, or "preference," criteria, which tend to be much less site-specific than category II criteria. There are also several ways to express habitat suitability in graphical form. The binary format establishes a suitable range for each variable as it pertains to a life stage of interest, and is presented graphically as a step function. The quality rating for a variable is 1.0 if it falls within the range of the criteria, and 0.0 if it falls outside the range. The univariate curve format establishes both the usable range and the optimum range for each variable, with conditions of intermediate usability expressed along the portion between the tails and the peak of the curve. Multivariate probability density functions, which can be used to compute suitability for several variables simultaneously, are conveyed as three-dimensional figures with suitability on the z-axis and the two independent variables on the x-y plane. These functions are useful for incorporating interactive terms between two or more variables. Such interactions can also be represented using conditional criteria, which are stratified by cover type or substrate size. Conditional criteria may be of any category or format, but are distinguished by having two or more sets of functional relationships for each life stage.
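The binary and univariate curve formats described above can be written down directly. A sketch with invented breakpoints, not published criteria for any species:

```python
# Two simple habitat suitability criteria formats: a binary step function
# and a piecewise-linear univariate curve with an optimum range.

def binary_suitability(x, usable_lo, usable_hi):
    """Binary format: 1.0 inside the usable range, 0.0 outside."""
    return 1.0 if usable_lo <= x <= usable_hi else 0.0

def univariate_suitability(x, usable_lo, opt_lo, opt_hi, usable_hi):
    """Univariate curve: 0 at the tails, 1 over the optimum range."""
    if x <= usable_lo or x >= usable_hi:
        return 0.0
    if opt_lo <= x <= opt_hi:
        return 1.0
    if x < opt_lo:                                   # rising limb
        return (x - usable_lo) / (opt_lo - usable_lo)
    return (usable_hi - x) / (usable_hi - opt_hi)    # falling limb

# hypothetical depth (m) criteria for one life stage
print(binary_suitability(0.4, 0.2, 1.0))                  # 1.0
print(univariate_suitability(0.3, 0.2, 0.5, 0.8, 1.0))    # about 0.33
```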

  2. REE radiation fault model: a tool for organizing and communicating radiation test data and constructing COTS-based spaceborne computing systems

    NASA Technical Reports Server (NTRS)

    Ferraro, R.; Some, R.

    2002-01-01

    The growth in data rates of instruments on future NASA spacecraft continues to outstrip the improvement in communications bandwidth and processing capabilities of radiation-hardened computers. Sophisticated autonomous operations strategies will further increase the processing workload. Given the reductions in spacecraft size and available power, standard radiation-hardened computing systems alone will not be able to address the requirements of future missions. The REE project was intended to overcome this obstacle by developing a COTS-based supercomputer suitable for use as a science and autonomy data processor in most space environments. This development required a detailed knowledge of system behavior in the presence of Single Event Effect (SEE) induced faults so that mitigation strategies could be designed to recover system-level reliability while maintaining the COTS throughput advantage. The REE project has developed a suite of tools and a methodology for predicting SEU-induced transient fault rates in a range of natural space environments from ground-based radiation testing of component parts. In this paper we provide an overview of this methodology and tool set, with a concentration on the radiation fault model and its use in the REE system development methodology. Using test data reported elsewhere in this and other conferences, we predict upset rates for a particular COTS single-board computer configuration in several space environments.
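A prediction of the kind described above ultimately rests on simple rate arithmetic: an upset rate is the product of the environment's particle flux, the ground-measured per-bit cross-section, and the number of bits. A sketch with invented numbers (the flux and cross-section below are placeholders, not REE data):

```python
# Standard first-order SEU rate estimate from a ground-measured cross-section.

def seu_rate_per_day(flux, cross_section_per_bit, n_bits):
    """Upsets/day = flux (particles/cm^2/day) * sigma (cm^2/bit) * bits."""
    return flux * cross_section_per_bit * n_bits

# e.g. a 128 MB memory in an assumed orbital environment
rate = seu_rate_per_day(flux=1.0e5,                  # particles/cm^2/day
                        cross_section_per_bit=1e-14, # cm^2/bit, from testing
                        n_bits=128 * 2**20 * 8)
print(round(rate, 4))  # -> 1.0737 upsets/day
```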

  3. Evaluating perceptual integration: uniting response-time- and accuracy-based methodologies.

    PubMed

    Eidels, Ami; Townsend, James T; Hughes, Howard C; Perry, Lacey A

    2015-02-01

    This investigation brings together a response-time system identification methodology (e.g., Townsend & Wenger Psychonomic Bulletin & Review 11, 391-418, 2004a) and an accuracy methodology, intended to assess models of integration across stimulus dimensions (features, modalities, etc.) that were proposed by Shaw and colleagues (e.g., Mulligan & Shaw Perception & Psychophysics 28, 471-478, 1980). The goal was to theoretically examine these separate strategies and to apply them conjointly to the same set of participants. The empirical phases were carried out within an extension of an established experimental design called the double factorial paradigm (e.g., Townsend & Nozawa Journal of Mathematical Psychology 39, 321-359, 1995). That paradigm, based on response times, permits assessments of architecture (parallel vs. serial processing), stopping rule (exhaustive vs. minimum time), and workload capacity, all within the same blocks of trials. The paradigm introduced by Shaw and colleagues uses a statistic formally analogous to that of the double factorial paradigm, but based on accuracy rather than response times. We demonstrate that the accuracy measure cannot discriminate between parallel and serial processing. Nonetheless, the class of models supported by the accuracy data possesses a suitable interpretation within the same set of models supported by the response-time data. The supported model, consistent across individuals, is parallel and has limited capacity, with the participants employing the appropriate stopping rule for the experimental setting.
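The workload capacity assessment mentioned above is commonly computed from response-time survivor functions; for an OR (minimum-time) design, Townsend & Nozawa (1995) define C(t) = log S_AB(t) / [log S_A(t) + log S_B(t)]. A sketch on synthetic response times (the data below are invented, chosen only so the statistic is well defined at the probe time):

```python
# Workload capacity coefficient for an OR design, on synthetic RT data.
import math

def survivor(rts, t):
    """Empirical survivor function P(RT > t)."""
    return sum(rt > t for rt in rts) / len(rts)

def capacity_or(rts_ab, rts_a, rts_b, t):
    """C(t) = log S_AB(t) / [log S_A(t) + log S_B(t)]."""
    s_ab = survivor(rts_ab, t)
    s_a = survivor(rts_a, t)
    s_b = survivor(rts_b, t)
    return math.log(s_ab) / (math.log(s_a) + math.log(s_b))

# synthetic RTs (ms); redundant-target (AB) trials faster than single targets
rts_a  = [420, 450, 480, 510, 540]
rts_b  = [430, 460, 490, 520, 550]
rts_ab = [380, 420, 455, 480, 500]
c = capacity_or(rts_ab, rts_a, rts_b, t=470)
# c < 1 indicates limited capacity, consistent with the model supported above
```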

  4. Human-Centered Design Study: Enhancing the Usability of a Mobile Phone App in an Integrated Falls Risk Detection System for Use by Older Adult Users.

    PubMed

    Harte, Richard; Quinlan, Leo R; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul Ma; Scharf, Thomas; ÓLaighin, Gearóid

    2017-05-30

    Design processes such as human-centered design (HCD), which involve the end user throughout the product development and testing process, can be crucial in ensuring that the product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of HCD can often conflict with the rapid product development life cycles required in the competitive connected health industry. The aim of this study was to apply a structured HCD methodology to the development of a smartphone app to be used within a connected health fall risk detection system. Our methodology utilizes so-called discount usability engineering techniques to minimize the burden on resources during development and maintain a rapid pace of development. This study provides prospective designers with a detailed description of the application of an HCD methodology. A 3-phase methodology was applied. In the first phase, a descriptive "use case" was developed by the system designers and analyzed by both expert stakeholders and end users. The use case described the use of the app, how various actors would interact with it, and in what context. A working app prototype and a user manual were then developed based on this feedback and were subjected to a rigorous usability inspection. Further changes were made both to the interface and to the support documentation. The now advanced prototype was exposed to testing by end users, who made further design recommendations. Combined expert and end-user analysis of the comprehensive use case originally identified 21 problems with the system interface; only 3 of these were observed in user testing, implying that 18 problems were eliminated between phases 1 and 3.
    Satisfactory ratings were obtained during validation testing by both experts and end users, and final testing by users showed that the system requires low mental, physical, and temporal demands according to the NASA Task Load Index (NASA-TLX). From our observation of older adults' interactions with smartphone interfaces, some recurring themes emerged. Clear and relevant feedback as the user attempts to complete a task is critical. Feedback should include pop-ups, sound tones, color or texture changes, or icon changes to indicate that a function has been completed successfully, such as for the connection sequence. For text feedback, clear and unambiguous language should be used so as not to create anxiety, particularly when it comes to saving data. Warning tones or symbols, such as caution symbols or shrill tones, should only be used if absolutely necessary. Our HCD methodology, designed and implemented based on the principles of the International Organization for Standardization (ISO) 9241-210 standard, produced a functional app interface within a short production cycle, which is now suitable for use by older adults in long-term clinical trials. ©Richard Harte, Leo R Quinlan, Liam Glynn, Alejandro Rodríguez-Molinero, Paul MA Baker, Thomas Scharf, Gearóid ÓLaighin. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 30.05.2017.

  5. Critical Race Design: An Emerging Methodological Approach to Anti-Racist Design and Implementation Research

    ERIC Educational Resources Information Center

    Khalil, Deena; Kier, Meredith

    2017-01-01

    This article is about introducing Critical Race Design (CRD), a research methodology that centers race and equity at the nucleus of educational opportunities by design. First, the authors define design-based implementation research (DBIR; Penuel, Fishman, Cheng, & Sabelli, 2011) as an equity-oriented education research methodology where…

  6. Assessment of coagulation pretreatment of leachate by response surface methodology.

    PubMed

    Lessoued, Ridha; Souahi, Fatiha; Castrillon Pelaez, Leonor

    2017-11-01

    Coagulation-flocculation is a relatively simple technique that can be used successfully for the treatment of old leachate with poly-aluminum chloride (PAC). The main objectives of this study are to design the experiments, build models and optimize the operating parameters, dosage m and pH, using a central composite design and response surface methodology. Developed for the chemical oxygen demand (COD) and turbidity responses, the quadratic polynomial model is suitable for prediction within the range of the studied variables: the optimum conditions were a dosage m of 5.55 g/L at pH 7.05, with determination coefficients R² of 99.33% and 99.92% and adjusted R² of 98.85% and 99.86% for COD and turbidity, respectively. We confirm that the initial pH and PAC dosage have significant effects on COD and turbidity removal. The experimental data and model predictions agreed well, and the removal efficiencies of COD, turbidity, Fe, Pb and Cu reached 61%, 96.4%, 97.1%, 99% and 100%, respectively.
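The quadratic polynomial model referred to above has the standard second-order form y = b0 + b1·m + b2·pH + b12·m·pH + b11·m² + b22·pH². A sketch that fits it by least squares and locates the stationary point of the fitted surface; the run layout and response values are synthetic, so the resulting optimum is illustrative, not the study's:

```python
# Second-order response-surface fit and stationary-point optimum, on
# synthetic data generated from an invented quadratic surface.
import numpy as np

def quad_features(m, ph):
    return [1.0, m, ph, m * ph, m**2, ph**2]

# central-composite-style runs: (dosage g/L, pH)
runs = [(4.0, 6.0), (7.0, 6.0), (4.0, 8.0), (7.0, 8.0),
        (4.0, 7.0), (7.0, 7.0), (5.5, 6.0), (5.5, 8.0), (5.5, 7.0)]
true_b = np.array([-300.0, 80.0, 40.0, 0.5, -7.5, -3.0])  # invented surface
X = np.array([quad_features(m, ph) for m, ph in runs])
y = X @ true_b                                            # synthetic responses

b, *_ = np.linalg.lstsq(X, y, rcond=None)                 # fitted coefficients

# Stationary point (candidate optimum): solve grad(y) = 0, i.e.
# [2*b11, b12; b12, 2*b22] [m, ph]^T = [-b1, -b2]
H = np.array([[2 * b[4], b[3]], [b[3], 2 * b[5]]])
m_opt, ph_opt = np.linalg.solve(H, [-b[1], -b[2]])
```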

  7. A Hitchhiker's Guide to Functional Magnetic Resonance Imaging

    PubMed Central

    Soares, José M.; Magalhães, Ricardo; Moreira, Pedro S.; Sousa, Alexandre; Ganz, Edward; Sampaio, Adriana; Alves, Victor; Marques, Paulo; Sousa, Nuno

    2016-01-01

    Functional Magnetic Resonance Imaging (fMRI) studies have become increasingly popular both with clinicians and researchers as they are capable of providing unique insights into brain functions. However, multiple technical considerations (ranging from specifics of paradigm design to imaging artifacts, complex protocol definition, a multitude of processing and analysis methods, and intrinsic methodological limitations) must be considered and addressed in order to optimize fMRI analysis and to arrive at the most accurate and grounded interpretation of the data. In practice, the researcher/clinician must choose, from many available options, the most suitable software tool for each stage of the fMRI analysis pipeline. Herein we provide a straightforward guide designed to address, for each of the major stages, the techniques and tools involved in the process. We have developed this guide both to help those new to the technique overcome the most critical difficulties in its use, and to serve as a resource for the neuroimaging community. PMID:27891073

  8. Utility of Army Design Methodology in U.S. Coast Guard Counter Narcotic Interdiction Strategy

    DTIC Science & Technology

    2017-06-09

    Utility of Army Design Methodology in U.S. Coast Guard Counter Narcotic Interdiction Strategy. Thesis; dates covered Aug 2016 - Jun 2017; distribution is unlimited. Abstract: This study investigates the utility of using Army Design Methodology (ADM) to

  9. Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques

    DTIC Science & Technology

    2017-11-01

    ARL-TR-8225, Nov 2017, US Army Research Laboratory. Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques.

  10. [Integration of new psychosocial facilities into the health care system: considerations on a social ecological evaluation concept exemplified by ambulatory crisis care].

    PubMed

    Leferink, K; Bergold, J B

    1996-11-01

    With respect to the methodological problems concerning the outcome evaluation of crisis intervention centers, the outlines of a social-ecological research approach are developed. It is suggested that this approach is better suited to taking into account the role of the network of mental health services. The data come from a research project designed to explain the historical and social aspects of the process of integrating a crisis intervention service. The results indicate that the practice of the service, on the one hand, strongly depends on what other services do and, on the other hand, influences them. The social integration of an institution into the network of other services is discussed as an alternative criterion of evaluation.

  11. Experimental determination of thermodynamic equilibrium in biocatalytic transamination.

    PubMed

    Tufvesson, Pär; Jensen, Jacob S; Kroutil, Wolfgang; Woodley, John M

    2012-08-01

    The equilibrium constant is a critical parameter for making rational design choices in biocatalytic transamination for the synthesis of chiral amines. However, very few reports are available in the scientific literature determining the equilibrium constant (K) for the transamination of ketones. Various methods for determining (or estimating) the equilibrium constant have previously been suggested, both experimental and computational (based on group contribution methods). However, none of these were found suitable for determining the equilibrium constant for the transamination of ketones. Therefore, in this communication we suggest a simple experimental methodology that we hope will stimulate more accurate determination of thermodynamic equilibria when reporting the results of transaminase-catalyzed reactions, in order to increase understanding of how substrate and product molecular structure affect reaction thermodynamics. Copyright © 2012 Wiley Periodicals, Inc.
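
    As a minimal illustration of the kind of determination the authors advocate, the apparent equilibrium constant of a transamination A + B ⇌ P + Q follows directly from measured equilibrium concentrations. The concentration values below are hypothetical, not taken from the paper:

```python
def equilibrium_constant(conc):
    """Apparent K for A + B <-> P + Q from equilibrium concentrations (mol/L)."""
    return (conc["P"] * conc["Q"]) / (conc["A"] * conc["B"])

# Hypothetical equilibrium concentrations for a transaminase-catalyzed reaction:
# ketone substrate A, amine donor B, chiral amine P, co-product ketone Q
measured = {"A": 0.020, "B": 0.020, "P": 0.080, "Q": 0.080}
K = equilibrium_constant(measured)  # -> 16.0 for these assumed values
```

In practice the experimental challenge the paper addresses is reaching and verifying equilibrium from both directions, not the arithmetic itself.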

  12. Feminist-informed participatory action research: a methodology of choice for examining critical nursing issues.

    PubMed

    Corbett, Andrea M; Francis, Karen; Chapman, Ysanne

    2007-04-01

    Identifying a methodology to guide a study that aims to enhance service delivery can be challenging. Participatory action research offers a solution to this challenge, as it both informs and is informed by critical social theory. In addition, a feminist lens helps establish this approach as a suitable methodology for changing practice. This methodology embraces empowerment, self-determination and the facilitation of agreed change as central tenets that guide the research process. Encouraged by the work of Foucault, Freire, Habermas, and Maguire, this paper explicates the philosophical assumptions underpinning critical social theory and outlines how feminist influences are complementary in exploring the processes and applications of nursing research that seeks to embrace change.

  13. A Synergy between the Technological Process and a Methodology for Web Design: Implications for Technological Problem Solving and Design

    ERIC Educational Resources Information Center

    Jakovljevic, Maria; Ankiewicz, Piet; De swardt, Estelle; Gross, Elna

    2004-01-01

    Traditional instructional methodology in the Information System Design (ISD) environment lacks explicit strategies for promoting the cognitive skills of prospective system designers. This contributes to the fragmented knowledge and low motivational and creative involvement of learners in system design tasks. In addition, present ISD methodologies,…

  14. Control of maglev vehicles with aerodynamic and guideway disturbances

    NASA Technical Reports Server (NTRS)

    Flueckiger, Karl; Mark, Steve; Caswell, Ruth; Mccallum, Duncan

    1994-01-01

    A modeling, analysis, and control design methodology is presented for maglev vehicle ride quality performance improvement as measured by the Pepler Index. Ride quality enhancement is considered through active control of secondary suspension elements and active aerodynamic surfaces mounted on the train. To analyze and quantify the benefits of active control, the authors have developed a five degree-of-freedom lumped parameter model suitable for describing a large class of maglev vehicles, including both channel and box-beam guideway configurations. Elements of this modeling capability have been recently employed in studies sponsored by the U.S. Department of Transportation (DOT). A perturbation analysis about an operating point, defined by vehicle and average crosswind velocities, yields a suitable linearized state space model for multivariable control system analysis and synthesis. Neglecting passenger compartment noise, the ride quality as quantified by the Pepler Index is readily computed from the system states. A statistical analysis is performed by modeling the crosswind disturbances and guideway variations as filtered white noise, whereby the Pepler Index is established in closed form through the solution to a matrix Lyapunov equation. Data is presented which indicates the anticipated ride quality achieved through various closed-loop control arrangements.
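
    The closed-form statistical step described above, obtaining stationary state statistics of a linear model driven by filtered white noise from a matrix Lyapunov equation, can be sketched as follows. The two-state system, noise intensity, and output row are hypothetical placeholders, not the authors' five-degree-of-freedom maglev model:

```python
import numpy as np

def lyap_continuous(A, Q):
    """Solve A P + P A^T + Q = 0 for P via Kronecker vectorization (small n only)."""
    n = A.shape[0]
    I = np.eye(n)
    K = np.kron(I, A) + np.kron(A, I)          # vec(AP + PA^T) = K vec(P)
    vecP = np.linalg.solve(K, -Q.flatten(order="F"))
    return vecP.reshape((n, n), order="F")

# Hypothetical stable 2-state vehicle model with white-noise disturbance
A = np.array([[-1.0, 0.0], [0.0, -2.0]])       # state matrix (assumed)
Q = np.eye(2)                                  # disturbance intensity (assumed)
P = lyap_continuous(A, Q)                      # stationary state covariance
c = np.array([1.0, 0.5])                       # hypothetical acceleration output row
ride_rms = float(np.sqrt(c @ P @ c))           # RMS of the ride-quality output
```

A Pepler-Index-style figure of merit is then a weighted combination of such RMS outputs; for a real model one would use a dedicated Lyapunov solver rather than the Kronecker construction, which scales poorly with state dimension.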

  15. Propulsion integration of hypersonic air-breathing vehicles utilizing a top-down design methodology

    NASA Astrophysics Data System (ADS)

    Kirkpatrick, Brad Kenneth

    In recent years, a focus of aerospace engineering design has been the development of advanced design methodologies and frameworks to account for increasingly complex and integrated vehicles. Techniques such as parametric modeling, global vehicle analyses, and interdisciplinary data sharing have been employed in an attempt to improve the design process. The purpose of this study is to introduce a new approach to integrated vehicle design known as the top-down design methodology. In the top-down design methodology, the main idea is to relate design changes on the vehicle system and sub-system level to a set of over-arching performance and customer requirements. Rather than focusing on the performance of an individual system, the system is analyzed in terms of the net effect it has on the overall vehicle and other vehicle systems. This detailed level of analysis can only be accomplished through the use of high fidelity computational tools such as Computational Fluid Dynamics (CFD) or Finite Element Analysis (FEA). The utility of the top-down design methodology is investigated through its application to the conceptual and preliminary design of a long-range hypersonic air-breathing vehicle for a hypothetical next generation hypersonic vehicle (NHRV) program. System-level design is demonstrated through the development of the nozzle section of the propulsion system. From this demonstration of the methodology, conclusions are made about the benefits, drawbacks, and cost of using the methodology.

  16. A performance-oriented power transformer design methodology using multi-objective evolutionary optimization

    PubMed Central

    Adly, Amr A.; Abd-El-Hafiz, Salwa K.

    2014-01-01

    Transformers are regarded as crucial components in power systems. Due to market globalization, power transformer manufacturers are facing an increasingly competitive environment that mandates the adoption of design strategies yielding better performance at lower costs. In this paper, a power transformer design methodology using multi-objective evolutionary optimization is proposed. Using this methodology, which is tailored to be target performance design-oriented, quick rough estimation of transformer design specifics may be inferred. Testing of the suggested approach revealed significant qualitative and quantitative match with measured design and performance values. Details of the proposed methodology as well as sample design results are reported in the paper. PMID:26257939
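
    A core building block of any multi-objective evolutionary optimizer like the one described is extracting the non-dominated (Pareto) set of candidate designs. A minimal sketch, with hypothetical (cost, loss) pairs standing in for real transformer design evaluations:

```python
def pareto_front(points):
    """Return indices of non-dominated points when minimizing every objective."""
    front = []
    for i, p in enumerate(points):
        dominated = any(
            all(q[k] <= p[k] for k in range(len(p))) and
            any(q[k] < p[k] for k in range(len(p)))
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            front.append(i)
    return front

# Hypothetical (material cost, load loss) trade-offs for candidate designs
designs = [(10.0, 5.0), (8.0, 7.0), (12.0, 4.0), (9.0, 9.0), (11.0, 6.0)]
front = pareto_front(designs)   # indices of designs worth presenting to the user
```

An evolutionary algorithm repeats this dominance test each generation to rank and select candidates; the quadratic cost of the naive test above is why production implementations use fast non-dominated sorting.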

  17. A performance-oriented power transformer design methodology using multi-objective evolutionary optimization.

    PubMed

    Adly, Amr A; Abd-El-Hafiz, Salwa K

    2015-05-01

    Transformers are regarded as crucial components in power systems. Due to market globalization, power transformer manufacturers are facing an increasingly competitive environment that mandates the adoption of design strategies yielding better performance at lower costs. In this paper, a power transformer design methodology using multi-objective evolutionary optimization is proposed. Using this methodology, which is tailored to be target performance design-oriented, quick rough estimation of transformer design specifics may be inferred. Testing of the suggested approach revealed significant qualitative and quantitative match with measured design and performance values. Details of the proposed methodology as well as sample design results are reported in the paper.

  18. Applications of mixed-methods methodology in clinical pharmacy research.

    PubMed

    Hadi, Muhammad Abdul; Closs, S José

    2016-06-01

    Introduction: Mixed-methods methodology, as the name suggests, refers to the mixing of elements of both qualitative and quantitative methodologies in a single study. In the past decade, mixed-methods methodology has gained popularity among healthcare researchers, as it promises to bring together the strengths of both qualitative and quantitative approaches. Methodology: A number of mixed-methods designs are available in the literature; the four most commonly used in healthcare research are the convergent parallel design, the embedded design, the exploratory design, and the explanatory design. Each has its own unique advantages, challenges and procedures, and selection of a particular design should be guided by the research question. Guidance on designing, conducting and reporting mixed-methods research is available in the literature, and it is advisable to adhere to it to ensure methodological rigour. When to use: Mixed methods are best suited when the research questions require triangulating findings from different methodologies to explain a single phenomenon; clarifying the results of one method using another method; informing the design of one method based on the findings of another; developing a scale or questionnaire; or answering different research questions within a single study. Two case studies are presented to illustrate possible applications of mixed-methods methodology. Limitations: Possessing the necessary knowledge and skills to undertake qualitative and quantitative data collection, analysis, interpretation and integration remains the biggest challenge for researchers conducting mixed-methods studies. Sequential study designs, being in two (or more) phases, are often time consuming, whereas concurrent study designs may require more than one data collector to collect both qualitative and quantitative data at the same time.

  19. Methodological proposal for the remediation of a site affected by phosphogypsum deposits

    NASA Astrophysics Data System (ADS)

    Martínez-Sanchez, M. J.; Perez-Sirvent, C.; Bolivar, J. P.; Garcia-Tenorio, R.

    2012-04-01

    The accumulation of phosphogypsum (PY) produces well-known environmental problems. Proposals for the remediation of these sites require multidisciplinary and very specific studies. Since such sites cover large areas, a sampling design specifically outlined for each case is necessary so that the contaminants, transfer pathways and particular processes can be correctly identified. In addition to suitable sampling of the soil, aquatic medium and biota, appropriate studies of the spatio-temporal variations by means of control samples are required. Two different stages should be considered. 1. Diagnostic stage: this stage includes preliminary studies, identification of possible sources of radioisotopes, design of the appropriate sampling plan, a hydrogeological study, characterization and study of the spatio-temporal variability of radioisotopes and other contaminants, as well as the risk assessment for health and ecosystems, which depends on the future use of the site. 2. Remediation proposal stage: this comprises the evaluation and comparison of the different procedures for decontamination/remediation, including model experiments in the laboratory. In this respect, the preparation and detailed study of a small-scale pilot project is a task of particular relevance; in this way the suitability of the remediation technology can be checked and its performance optimized. These two stages allow a technically well-founded proposal to be presented to the organisms or institutions in charge of the problem and facilitate decision-making. Both stages should be accompanied by a social communication campaign so that the final proposal is accepted by stakeholders.

  20. Green Remediation Best Management Practices: Overview of EPA's Methodology to Address the Environmental Footprint of Site Cleanup

    EPA Pesticide Factsheets

    Contaminated site cleanups involving complex activities may benefit from a detailed environmental footprint analysis to inform decision-making about application of suitable best management practices for greener cleanups.

  1. Data mining in soft computing framework: a survey.

    PubMed

    Mitra, S; Pal, S K; Mitra, P

    2002-01-01

    The present article provides a survey of the available literature on data mining using soft computing. A categorization has been provided based on the different soft computing tools and their hybridizations used, the data mining function implemented, and the preference criterion selected by the model. The utility of the different soft computing methodologies is highlighted. Generally fuzzy sets are suitable for handling the issues related to understandability of patterns, incomplete/noisy data, mixed media information and human interaction, and can provide approximate solutions faster. Neural networks are nonparametric, robust, and exhibit good learning and generalization capabilities in data-rich environments. Genetic algorithms provide efficient search algorithms to select a model, from mixed media data, based on some preference criterion/objective function. Rough sets are suitable for handling different types of uncertainty in data. Some challenges to data mining and the application of soft computing methodologies are indicated. An extensive bibliography is also included.

  2. Identification of Novel "Inks" for 3D Printing Using High-Throughput Screening: Bioresorbable Photocurable Polymers for Controlled Drug Delivery.

    PubMed

    Louzao, Iria; Koch, Britta; Taresco, Vincenzo; Ruiz-Cantu, Laura; Irvine, Derek J; Roberts, Clive J; Tuck, Christopher; Alexander, Cameron; Hague, Richard; Wildman, Ricky; Alexander, Morgan R

    2018-02-28

    A robust methodology is presented to identify novel biomaterials suitable for three-dimensional (3D) printing. Currently, the application of additive manufacturing is limited by the availability of functional inks, especially in the area of biomaterials; this is the first time this method has been used to tackle the problem, allowing hundreds of formulations to be readily assessed. Several functional properties, including the release of an antidepressant drug (paroxetine), cytotoxicity, and printability, as well as mechanical properties, are screened for 253 new ink formulations in a high-throughput format. The selected candidates with the desirable properties are successfully scaled up using 3D printing into a range of object architectures. A full drug release study and degradability and tensile modulus experiments are presented on a simple architecture to validate the suitability of this methodology for identifying printable inks for 3D-printed devices with bespoke properties.

  3. A Model-Based Systems Engineering Methodology for Employing Architecture In System Analysis: Developing Simulation Models Using Systems Modeling Language Products to Link Architecture and Analysis

    DTIC Science & Technology

    2016-06-01

    characteristics, experimental design techniques, and analysis methodologies that distinguish each phase of the MBSE MEASA. To ensure consistency...methodology. Experimental design selection, simulation analysis, and trade space analysis support the final two stages. Figure 27 segments the MBSE MEASA...rounding has the potential to increase the correlation between columns of the experimental design matrix. The design methodology presented in Vieira

  4. Total System Design (TSD) Methodology Assessment.

    DTIC Science & Technology

    1983-01-01

    hardware implementation. Author: Martin Marietta Aerospace. Title: Total System Design Methodology. Source: Martin Marietta Technical Report MCR-79-646...systematic, rational approach to computer systems design is needed. Martin Marietta has produced a Total System Design Methodology to support such design...gathering and ordering. The purpose of the paper is to document the existing TSD methodology at Martin Marietta, describe the supporting tools, and

  5. Software Requirements Engineering Methodology (Development)

    DTIC Science & Technology

    1979-06-01

    Higher Order Software [20]; and the Michael Jackson Design Methodology [21]. Although structured programming constructs have proven to be more useful...reviewed here. Similarly, the manual techniques for software design (e.g., HIPO Diagrams, Nassi-Shneiderman charts, Top-Down Design, the Michael Jackson Design Methodology, Yourdon's Structured Design) are not addressed. 6.1.3 Research Programs There are a number of research programs underway

  6. Advanced applications of numerical modelling techniques for clay extruder design

    NASA Astrophysics Data System (ADS)

    Kandasamy, Saravanakumar

    Ceramic materials play a vital role in our day-to-day life. Recent advances in research, manufacturing and processing techniques and production methodologies have broadened the scope of ceramic products such as bricks, pipes and tiles, especially in the construction industry. These are mainly manufactured using an extrusion process in auger extruders. During their long history of application in the ceramic industry, most design developments of extruder systems have resulted from expensive laboratory-based experimental work and field-based trial-and-error runs. In spite of these design developments, auger extruders continue to be energy-intensive devices with high operating costs. Limited understanding of the physical processes involved and the cost and time requirements of lab-based experiments were found to be the major obstacles to the further development of auger extruders. An attempt has been made herein to use Computational Fluid Dynamics (CFD) and Finite Element Analysis (FEA) based numerical modelling techniques to reduce the costs and time associated with research into design improvement by experimental trials. These two techniques, although used widely in other engineering applications, have rarely been applied to auger extruder development, for a number of reasons including technical limitations of previously available CFD tools. Modern CFD and FEA software packages have much enhanced capabilities and allow the modelling of the flow of complex fluids such as clay. This research work presents a methodology that uses a Herschel-Bulkley fluid-flow-based CFD model to simulate and assess the flow of a clay-water mixture through the extruder and the die of a vacuum de-airing clay extrusion unit used in ceramic extrusion. The extruder design and the operating parameters were varied to study their influence on the power consumption and the extrusion pressure. The model results were then validated against experimental trials on a scaled extruder and found to be in reasonable agreement. The modelling methodology was then extended to full-scale industrial extruders. The technical and commercial suitability of using lightweight materials to manufacture extruder components was also investigated. The stress and deformation induced on the components by the extrusion pressure were analysed using FEA, and suitable alternative materials were identified. A cost comparison was then made for different extruder materials. The results show the potential for significant technical and commercial benefits to the ceramic industry.
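
    The Herschel-Bulkley constitutive law underlying such a CFD model is simple to evaluate directly. The yield stress, consistency and flow index below are assumed illustrative values for a clay-water paste, not parameters from the thesis:

```python
def herschel_bulkley_stress(gamma_dot, tau0, K, n):
    """Shear stress of a Herschel-Bulkley fluid: tau = tau0 + K * gamma_dot**n."""
    return tau0 + K * gamma_dot ** n

def apparent_viscosity(gamma_dot, tau0, K, n):
    """Apparent viscosity tau/gamma_dot, the quantity a CFD solver feeds back
    into the momentum equations for a regularized Herschel-Bulkley flow."""
    return herschel_bulkley_stress(gamma_dot, tau0, K, n) / gamma_dot

# Hypothetical clay-water parameters (illustrative only)
tau0, K, n = 20e3, 5e3, 0.35        # yield stress [Pa], consistency [Pa*s^n], index [-]
rates = [0.1, 1.0, 10.0]            # shear rates [1/s]
stresses = [herschel_bulkley_stress(g, tau0, K, n) for g in rates]
```

The shear-thinning index n < 1 means apparent viscosity falls sharply with shear rate, which is why extrusion pressure and power consumption depend so strongly on auger geometry and speed.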

  7. Design, science and naturalism

    NASA Astrophysics Data System (ADS)

    Deming, David

    2008-09-01

    The Design Argument is the proposition that the presence of order in the universe is evidence for the existence of God. The Argument dates at least to the pre-Socratic Greek philosophers, and is largely based on analogical reasoning. Following the appearance of Aquinas' Summa Theologica in the 13th century, the Christian Church in Europe embraced a Natural Theology based on observation and reason that allowed it to dominate the entire world of knowledge. Science in turn advanced itself by demonstrating that it could be of service to theology, the recognized queen of the sciences. During the heyday of British Natural Theology in the 17th and 18th centuries, the watchmaker, shipbuilder, and architect analogies were invoked reflexively by philosophers, theologians, and scientists. The Design Argument was not systematically and analytically criticized until David Hume wrote Dialogues Concerning Natural Religion in the 1750s. After Darwin published Origin of Species in 1859, Design withered on the vine. But in recent years, the Argument has been resurrected under the appellation "intelligent design," and has been the subject of political and legal controversy in the United States. Design advocates have argued that intelligent design can be formulated as a scientific hypothesis, that new scientific discoveries validate a design inference, and that naturalism must be removed as a methodological requirement in science. If science is defined by a model of concentric epistemological zonation, design cannot be construed as a scientific hypothesis because it is inconsistent with the core aspects of scientific methodology: naturalism, uniformity, induction, and efficient causation. An analytical examination of claims by design advocates finds no evidence of any type to support either scientific or philosophical claims that design can be unambiguously inferred from nature. The apparent irreducible complexity of biological mechanisms may be explained by exaptation or scaffolding. The argument that design is indicated by the fine-tuning of the universe as a habitat suitable for life rests on the intellectual fallacy of assigning probability to a unique event. Construing the Design Argument as an "inference to the best explanation," rather than analogical reasoning, is essentially an equivocation fallacy that does not rescue the Argument from Hume's criticisms. The intelligent design movement is a threat to the unity of science, as its confessed goal is to restore Christian theology as the queen of the sciences.

  8. Railroad classification yard design methodology study Elkhart Yard Rehabilitation : a case study

    DOT National Transportation Integrated Search

    1980-02-01

    This interim report documents the application of a railroad classification yard design methodology to CONRAIL's Elkhart Yard Rehabilitation. This case study effort represents Phase 2 of a larger effort to develop a yard design methodology, and ...

  9. On the suitability of the copula types for the joint modelling of flood peaks and volumes along the Danube River

    NASA Astrophysics Data System (ADS)

    Kohnová, Silvia; Papaioannou, George; Bacigál, Tomáš; Szolgay, Ján; Hlavčová, Kamila; Loukas, Athanasios; Výleta, Roman

    2017-04-01

    Flood frequency analysis is often performed as a univariate analysis of flood peaks using a suitable theoretical probability distribution of the annual maximum flood peaks or peak-over-threshold values. However, other flood attributes, such as flood volume and duration, are also often necessary for the design of hydrotechnical structures and projects. In this study, the suitability of various copula families for a bivariate analysis of peak discharges and flood volumes has been tested on streamflow data from gauging stations along the whole Danube River. Kendall's rank correlation coefficient (tau) quantifies the dependence between flood peak discharge and flood volume. The methodology is tested on two different data samples: 1) annual maximum flood (AMF) peaks with corresponding flood volumes, which is a typical choice for engineering studies, and 2) AMF peaks combined with annual maximum flow volumes of fixed durations of 5, 10, 15, 20, 25, 30 and 60 days, which can be regarded as a regime analysis of the dependence between the extremes of both variables in a given year. The bivariate modelling of the peak discharge - flood volume couples is achieved with the following copulas: Ali-Mikhail-Haq (AMH), Clayton, Frank, Joe, Gumbel, Hüsler-Reiss, Galambos, Tawn, Normal, Plackett and FGM. Scatterplots of the observed and simulated peak discharge - flood volume pairs and goodness-of-fit tests have been used to assess the overall applicability of the copulas, as well as to observe any changes in suitable models along the Danube River. The results indicate that almost all of the considered Archimedean-class copulas (e.g. Frank, Clayton and Ali-Mikhail-Haq) perform better than the other copula families selected for this study, and that for the second data sample mostly the upper-tail-flat copulas were suitable.
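
    The dependence measure at the heart of this methodology, Kendall's tau, also fixes the parameter of several Archimedean copulas in closed form; for the Gumbel copula, tau = 1 - 1/theta. A small sketch with hypothetical peak-volume pairs (not Danube data):

```python
from itertools import combinations

def kendall_tau(x, y):
    """Kendall's rank correlation; assumes no ties among the values."""
    n = len(x)
    concordant_minus_discordant = sum(
        1 if (x[i] - x[j]) * (y[i] - y[j]) > 0 else -1
        for i, j in combinations(range(n), 2)
    )
    return concordant_minus_discordant / (n * (n - 1) / 2)

# Hypothetical flood peaks [m3/s] and volumes [hm3] for six annual events
peaks = [310, 450, 290, 520, 400, 370]
vols  = [42, 61, 39, 70, 46, 55]
tau = kendall_tau(peaks, vols)
theta = 1 / (1 - tau)   # Gumbel copula parameter from tau = 1 - 1/theta
```

With theta estimated, synthetic peak-volume pairs can be simulated from the fitted copula and compared against the observed scatter, which is exactly the goodness-of-fit step the abstract describes.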

  10. Implementation of a generic SFC-MS method for the quality control of potentially counterfeited medicinal cannabis with synthetic cannabinoids.

    PubMed

    Jambo, Hugues; Dispas, Amandine; Avohou, Hermane T; André, Sébastien; Hubert, Cédric; Lebrun, Pierre; Ziemons, Éric; Hubert, Philippe

    2018-06-05

    In this study, we describe the development of an SFC-MS method for the quality control of cannabis plants that could be potentially adulterated with synthetic cannabinoids. Considering the high number of already available synthetic cannabinoids and the high rate of development of novel structures, we aimed to develop a generic method suitable for the analysis of a large panel of substances, using seventeen synthetic cannabinoids from multiple classes as model compounds. Firstly, a suitable column was chosen after a screening phase. Secondly, optimal operating conditions were obtained following a robust optimization strategy based on a design of experiments and design space methodology (DoE-DS). Finally, the quantitative performance of the method was assessed with a validation according to the total error approach. The developed method has a run time of 9.4 min. It uses a simple modifier composition of methanol with 2% H2O and requires minimal sample preparation. It can chromatographically separate natural cannabinoids (except THC-A and CBD-A) from the synthetics assessed, and the use of mass spectrometry provides sensitivity and specificity. Moreover, this quality by design (QbD) approach permits the tuning of the method (within the DS) during routine analysis to achieve a desirable separation, since the compounds to be analyzed in the future may be unknown. The method was validated for the quantitation of a selected synthetic cannabinoid in fiber-type cannabis matrix over the range of 2.5% - 7.5% (w/w), with an LOD value as low as 14.4 ng/mL. This generic method should be easy to implement in customs or QC laboratories in the context of counterfeit drug tracking. Copyright © 2018 Elsevier B.V. All rights reserved.
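
    The screening phase of a DoE-based development like this one typically starts from a factorial grid of candidate operating conditions. A sketch with hypothetical SFC factors and levels, not the ones actually used in the paper:

```python
from itertools import product

def full_factorial(levels):
    """Enumerate all run combinations of a full factorial screening design."""
    names = list(levels)
    return [dict(zip(names, combo)) for combo in product(*levels.values())]

# Hypothetical SFC screening factors (illustrative values only)
factors = {
    "co_solvent_pct": [5, 10, 15],    # % methanol modifier
    "temperature_C": [30, 40],        # column temperature
    "backpressure_bar": [120, 150],   # back-pressure regulator setting
}
runs = full_factorial(factors)        # 3 * 2 * 2 = 12 experimental runs
```

A design-space (DS) study then models a response such as resolution over these runs and identifies the region of factor settings where the separation criteria are met with high probability, which is what allows the method to be re-tuned within the DS during routine use.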

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swamy, S.A.; Mandava, P.R.; Bhowmick, D.C.

    The leak-before-break (LBB) methodology is accepted as a technically justifiable approach for eliminating postulation of Double-Ended Guillotine Breaks (DEGB) in high energy piping systems. This is the result of extensive research, development, and rigorous evaluations by the NRC and the commercial nuclear power industry since the early 1970s. The DEGB postulation is responsible for the many hundreds of pipe whip restraints and jet shields found in commercial nuclear plants. These restraints and jet shields not only cost many millions of dollars, but also cause plant congestion leading to reduced reliability in inservice inspection and increased man-rem exposure. While use of leak-before-break technology saved hundreds of millions of dollars in backfit costs to many operating Westinghouse plants, value-impacts resulting from the application of this technology to future plants are greater on a per-plant basis. These benefits will be highlighted in this paper. The LBB technology has been applied extensively to high energy piping systems in operating plants. However, there are differences between the application of LBB technology to an operating plant and to a new plant design. In this paper an approach is proposed which is suitable for application of LBB to a new plant design such as the Westinghouse AP600. The approach is based on generating Bounding Analysis Curves (BAC) for the candidate piping systems. The general methodology and criteria used for developing the BACs are based on modified GDC-4 and Standard Review Plan (SRP) 3.6.3. The BAC allows advance evaluation of the piping system from the LBB standpoint, thereby assuring LBB conformance for the piping system. The piping designer can use the results of the BACs to determine the acceptability of design loads and make modifications (in terms of piping layout and support configurations) as necessary at the design stage to assure LBB for the piping systems under consideration.
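
    At its simplest, the bounding analysis curve (BAC) idea reduces to checking candidate design loads against a precomputed allowable envelope. The piecewise-linear curve and load values below are invented for illustration and have no connection to actual AP600 data:

```python
def within_bounding_curve(force, moment, curve):
    """Check a (force, moment) load point against a piecewise-linear BAC limit.

    curve: (force, allowable_moment) pairs sorted by force; linear interpolation
    is used between adjacent points. Returns False outside the tabulated range.
    """
    for (f1, m1), (f2, m2) in zip(curve, curve[1:]):
        if f1 <= force <= f2:
            allowable = m1 + (m2 - m1) * (force - f1) / (f2 - f1)
            return moment <= allowable
    return False

# Hypothetical BAC for a candidate piping system: axial force [kN] vs moment [kN*m]
bac = [(0, 500), (200, 420), (400, 300), (600, 120)]
ok = within_bounding_curve(250, 350, bac)   # design load inside the envelope
```

A designer iterating on piping layout and supports would re-evaluate each load case against such an envelope at the design stage, rather than waiting for a full plant-specific LBB evaluation.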

  12. The design and evaluation of an antimicrobial resistance surveillance system for neonatal intensive care units in Iran.

    PubMed

    Rezaei-Hachesu, Peyman; Samad-Soltani, Taha; Yaghoubi, Sajad; GhaziSaeedi, Marjan; Mirnia, Kayvan; Masoumi-Asl, Hossein; Safdari, Reza

    2018-07-01

    Neonatal intensive care units (NICUs) have complex patients in terms of their diagnoses and required treatments. Antimicrobial treatment is a common therapy for patients in NICUs. To solve problems pertaining to empirical therapy, antimicrobial stewardship programs have recently been introduced. Despite the success of these programs in terms of data collection, there is still inefficiency in analyzing and reporting the data. Thus, to successfully implement these stewardship programs, the design of antimicrobial resistance (AMR) surveillance systems is recommended as a first step. As a result, this study aimed to design an AMR surveillance system for use in the NICUs in northwestern Iranian hospitals to cover these information gaps. The recommended system is compatible with the World Health Organization (WHO) guidelines. The business intelligence (BI) requirements were extracted in an interview with a product owner (PO) using a valid and reliable checklist. Following this, an AMR surveillance system was designed and evaluated in relation to user experiences via a user experience questionnaire (UEQ). Finally, an association analysis was performed on the database, and the results were reported by identifying the important multidrug resistances in the database. A customized software development methodology was proposed. The three major modules of the AMR surveillance system are the data registry, dashboard, and decision support modules. The data registry module was implemented based on a three-tier architecture, and the Clinical Decision Support System (CDSS) and dashboard modules were designed based on the BI requirements of the Scrum product owner. The mean values of the UEQ measures were in a good range, indicating that the AMR surveillance system has suitable usability. Applying efficient software development methodologies allows such systems to remain compatible with users' opinions and requirements.
In addition, the construction of interdisciplinary communication models for research and software engineering allows for research and development concepts to be used in operational environments. Copyright © 2018 Elsevier B.V. All rights reserved.
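The association analysis step described above can be illustrated with a minimal support/confidence computation over resistance records. The isolate data, antibiotic names, and thresholds below are hypothetical and are not taken from the study.

```python
from itertools import combinations

# Hypothetical isolate records: each set lists antibiotics a NICU isolate resisted.
isolates = [
    {"ampicillin", "gentamicin"},
    {"ampicillin", "gentamicin", "cefotaxime"},
    {"ampicillin"},
    {"gentamicin"},
]

def support(itemset, records):
    """Fraction of records containing every item in the itemset."""
    return sum(itemset <= r for r in records) / len(records)

def confidence(antecedent, consequent, records):
    """Estimated P(consequent | antecedent) over the records."""
    return support(antecedent | consequent, records) / support(antecedent, records)

# Flag co-resistance pairs whose support meets an (arbitrary) 0.5 threshold
items = sorted(set().union(*isolates))
pairs = [(a, b) for a, b in combinations(items, 2)
         if support({a, b}, isolates) >= 0.5]
print(pairs)  # [('ampicillin', 'gentamicin')]
```

In a real surveillance database the same support/confidence screening would run over many more isolates and antibiotic columns, typically via an Apriori-style implementation.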

  13. Selection of suitable alternatives to reduce the environmental impact of road traffic noise using a fuzzy multi-criteria decision model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruiz-Padillo, Alejandro, E-mail: aruizp@correo.ugr.es; Civil Engineering Department, University of Granada, Av. Fuentenueva s/n, 18071 Granada; Ruiz, Diego P., E-mail: druiz@ugr.es

    Road traffic noise is one of the most significant environmental impacts generated by transport systems. To this regard, the recent implementation of the European Environmental Noise Directive by Public Administrations of the European Union member countries has led to various noise action plans (NAPs) for reducing the noise exposure of EU inhabitants. Every country or administration is responsible for applying criteria based on their own experience or expert knowledge, but there is no regulated process for the prioritization of technical measures within these plans. This paper proposes a multi-criteria decision methodology for the selection of suitable alternatives against traffic noise in each of the road stretches included in the NAPs. The methodology first defines the main criteria and alternatives to be considered. Secondly, it determines the relative weights for the criteria and sub-criteria using the fuzzy extended analytical hierarchy process as applied to the results from an expert panel, thereby allowing expert knowledge to be captured in an automated way. A final step comprises the use of discrete multi-criteria analysis methods such as weighted sum, ELECTRE and TOPSIS, to rank the alternatives by suitability. To illustrate an application of the proposed methodology, this paper describes its implementation in a complex real case study: the selection of optimal technical solutions against traffic noise in the top priority road stretch included in the revision of the NAP of the regional road network in the province of Almeria (Spain).
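As an illustration of the final ranking step, the sketch below applies the weighted-sum method to a small decision matrix. The alternatives, criterion values, and weights are invented for demonstration; the paper derives its weights with fuzzy AHP and also applies ELECTRE and TOPSIS, which are not shown here.

```python
import numpy as np

# Hypothetical decision matrix: rows = noise-abatement alternatives, columns =
# criteria (cost [lower is better], noise reduction in dB [higher is better],
# visual impact [lower is better]). All values are illustrative only.
alternatives = ["barrier", "porous asphalt", "speed limit"]
scores = np.array([
    [120.0, 9.0, 4.0],
    [ 80.0, 5.0, 1.0],
    [ 10.0, 3.0, 0.0],
])
benefit = np.array([False, True, False])  # which criteria are "higher is better"
weights = np.array([0.5, 0.3, 0.2])       # e.g. output of an expert-panel weighting

# Min-max normalise each criterion to [0, 1], flipping cost-type criteria
lo, hi = scores.min(axis=0), scores.max(axis=0)
norm = (scores - lo) / (hi - lo)
norm[:, ~benefit] = 1.0 - norm[:, ~benefit]

# Weighted-sum score and ranking, best first
ranking = sorted(zip(alternatives, norm @ weights), key=lambda t: -t[1])
for name, s in ranking:
    print(f"{name}: {s:.3f}")
```

With these invented numbers the cheap, low-impact alternative ranks first; changing the weight vector (the role of the fuzzy AHP step) reorders the list.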

  14. External gear pumps operating with non-Newtonian fluids: Modelling and experimental validation

    NASA Astrophysics Data System (ADS)

    Rituraj, Fnu; Vacca, Andrea

    2018-06-01

    External gear pumps are used in various industries to pump non-Newtonian viscoelastic fluids like plastics, paints, inks, etc. For both design and analysis purposes, it is often a matter of interest to understand the features of the displacing action realized by the meshing of the gears and the behavior of the leakages in this kind of pump. However, very limited work can be found in the literature about methodologies suitable to model such phenomena. This article describes a technique for modelling external gear pumps that operate with non-Newtonian fluids. In particular, it explains how the displacing action of the unit can be modelled using a lumped parameter approach, which involves dividing the fluid domain into several control volumes and internal flow connections. This work is built upon the HYGESim simulation tool, conceived by the authors' research team in the last decade, which is for the first time extended to the simulation of non-Newtonian fluids. The article also describes several comparisons between simulation results and experimental data obtained from numerous experiments performed for validation of the presented methodology. Finally, the operation of external gear pumps with fluids having different viscosity characteristics is discussed.

  15. Rediscovery of Good-Turing estimators via Bayesian nonparametrics.

    PubMed

    Favaro, Stefano; Nipoti, Bernardo; Teh, Yee Whye

    2016-03-01

    The problem of estimating discovery probabilities originated in the context of statistical ecology, and in recent years it has become popular due to its frequent appearance in challenging applications arising in genetics, bioinformatics, linguistics, design of experiments, machine learning, etc. A full range of statistical approaches, parametric and nonparametric as well as frequentist and Bayesian, has been proposed for estimating discovery probabilities. In this article, we investigate the relationships between the celebrated Good-Turing approach, which is a frequentist nonparametric approach developed in the 1940s, and a Bayesian nonparametric approach recently introduced in the literature. Specifically, under the assumption of a two-parameter Poisson-Dirichlet prior, we show that Bayesian nonparametric estimators of discovery probabilities are asymptotically equivalent, for a large sample size, to suitably smoothed Good-Turing estimators. As a by-product of this result, we introduce and investigate a methodology for deriving exact and asymptotic credible intervals to be associated with the Bayesian nonparametric estimators of discovery probabilities. The proposed methodology is illustrated through a comprehensive simulation study and the analysis of Expressed Sequence Tags data generated by sequencing a benchmark complementary DNA library. © 2015, The International Biometric Society.
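The frequentist side of this comparison is easy to state concretely: the basic Good-Turing estimate of the discovery probability (the chance that the next observation is a previously unseen species) is n1/n, where n1 is the number of species observed exactly once. A minimal sketch with made-up sample data:

```python
from collections import Counter

# Hypothetical sample of "species" labels (e.g. gene tags from an EST library).
sample = list("aaabbcddddefg")
n = len(sample)

freqs = Counter(sample)                    # species -> observed count
counts_of_counts = Counter(freqs.values())  # r -> number of species seen r times

# Good-Turing discovery probability: P(next draw is a new species) = n1 / n,
# where n1 = counts_of_counts[1] is the number of singleton species.
p_new = counts_of_counts[1] / n
print(p_new)
```

Here four species (c, e, f, g) are singletons among 13 draws, so the estimate is 4/13. The Bayesian nonparametric estimators discussed in the article are, asymptotically, smoothed versions of such counts-of-counts ratios.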

  16. Analysis of Flowfields over Four-Engine DC-X Rockets

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Cornelison, Joni

    1996-01-01

    The objective of this study is to validate a computational methodology for the aerodynamic performance of an advanced conical launch vehicle configuration. The computational methodology is based on a three-dimensional, viscous flow, pressure-based computational fluid dynamics formulation. Both wind-tunnel and ascent flight-test data are used for validation. Emphasis is placed on multiple-engine power-on effects. Computational characterization of the base drag in the critical subsonic regime is the focus of the validation effort; until recently, almost no multiple-engine data existed for a conical launch vehicle configuration. Parametric studies using high-order difference schemes are performed for the cold-flow tests, whereas grid studies are conducted for the flight tests. The computed vehicle axial force coefficients and forebody, aftbody, and base surface pressures compare favorably with the test data. The results demonstrate that with adequate grid density and proper distribution, a high-order difference scheme, finite-rate afterburning kinetics to model the plume chemistry, and a suitable turbulence model to describe separated flows, plume/air mixing, and boundary layers, computational fluid dynamics is a tool that can be used to predict the low-speed aerodynamic performance for rocket design and operations.

  17. Creating a spatial multi-criteria decision support system for energy related integrated environmental impact assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wanderer, Thomas, E-mail: thomas.wanderer@dlr.de; Herle, Stefan, E-mail: stefan.herle@rwth-aachen.de

    2015-04-15

    Because of their spatially distributed nature, the profitability and impacts of renewable energy resources are highly correlated with the geographic locations of power plant deployments. A web-based Spatial Decision Support System (SDSS) based on a Multi-Criteria Decision Analysis (MCDA) approach has been implemented for identifying preferable locations for solar power plants based on user preferences. The designated areas found serve as input for scenario development in a subsequent integrated Environmental Impact Assessment. The capabilities of the SDSS service are showcased for Concentrated Solar Power (CSP) plants in the region of Andalusia, Spain. The resulting spatial patterns of possible power plant sites are an important input to the procedural chain of assessing impacts of renewable energies in an integrated effort. The applied methodology and the implemented SDSS are applicable to other renewable technologies as well. - Highlights: • The proposed tool facilitates well-founded CSP plant siting decisions. • Spatial MCDA methods are implemented in a WebGIS environment. • GIS-based SDSS can contribute to a modern integrated impact assessment workflow. • The conducted case study proves the suitability of the methodology.

  18. Improvement of biodiesel production by lipozyme TL IM-catalyzed methanolysis using response surface methodology and acyl migration enhancer.

    PubMed

    Wang, Y; Wu, H; Zong, M H

    2008-10-01

    The process of biodiesel production from corn oil catalyzed by Lipozyme TL IM, an inexpensive 1,3-position-specific lipase from Thermomyces lanuginosus, was optimized by response surface methodology (RSM). A central composite rotatable design (CCRD) was used to study the effects of enzyme dosage, ratio of t-butanol to oil (v/v) and ratio of methanol to oil (mol/mol) on the methyl ester (ME) yield of the methanolysis. The optimum combination for the reaction was 25.9 U/g oil of enzyme, a 0.58 volume ratio of t-butanol to oil, and 0.5, 0.5, and 2.8 molar equivalents of methanol to oil added at reaction times of 0, 2, and 4 h, respectively, by which a ME yield of 85.6%, very close to the predicted value of 85.0%, could be obtained after reaction for 12 h. Waste oil was found to be a more suitable feedstock, and could give a 93.7% ME yield under the optimum conditions described above. Adding triethylamine (TEA), an acyl migration enhancer, could efficiently improve the ME yield of the methanolysis of corn oil, giving a ME yield of 92.0%.
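The RSM/CCRD workflow in this abstract amounts to fitting a second-order polynomial to designed experimental data and locating its stationary point. The sketch below shows that pattern for two factors; the dosage, ratio, and yield numbers are invented for illustration and are not the paper's data.

```python
import numpy as np

# Hypothetical CCRD-style data: enzyme dosage (U/g oil) and t-butanol:oil
# volume ratio vs. methyl ester yield (%). Values are illustrative only:
# 4 factorial points, 4 axial points, 2 replicated centre points.
X = np.array([
    [15, 0.4], [15, 0.8], [35, 0.4], [35, 0.8],
    [11, 0.6], [39, 0.6], [25, 0.3], [25, 0.9],
    [25, 0.6], [25, 0.6],
])
y = np.array([70, 72, 74, 73, 68, 75, 71, 74, 85, 84])

# Second-order model: y = b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
x1, x2 = X[:, 0], X[:, 1]
A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)

# Stationary point of the fitted surface (the candidate optimum): solve
# grad = 0, i.e. H @ x = -[b1, b2] with H the Hessian of the quadratic.
H = np.array([[2 * beta[3], beta[5]], [beta[5], 2 * beta[4]]])
opt = np.linalg.solve(H, -np.array([beta[1], beta[2]]))
print("fitted optimum (dosage, ratio):", opt)
```

Because the invented centre-point yields exceed the edge yields, the fitted surface is concave and the stationary point lands near the centre of the design, mimicking how an RSM study reads off its optimum settings.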

  19. Caffeine, sleep and wakefulness: implications of new understanding about withdrawal reversal.

    PubMed

    James, Jack E; Keane, Michael A

    2007-12-01

    The broad aim of this review is to critically examine the implications of new understanding concerning caffeine withdrawal and withdrawal reversal in the context of research concerned with the effects of caffeine on sleep and wakefulness. A comprehensive search was conducted for relevant experimental studies in the PubMED and PsycINFO databases. Studies were assessed with particular reference to methodological adequacy for controlling against confounding due to caffeine withdrawal and withdrawal reversal. This assessment was used to clarify evidence of effects, highlight areas of ambiguity and derive recommendations for future research. It was found that researchers have generally failed to take account of the fact that habitual use of caffeine, even at moderate levels, leads to physical dependence evidenced by physiological, behavioural and subjective withdrawal effects during periods of abstinence. Consequently, there has been near-complete absence of adequate methodological controls against confounding due to reversal of withdrawal effects when caffeine is experimentally administered. The findings of what has been a substantial research effort to elucidate the effects of caffeine on sleep and wakefulness, undertaken over a period spanning decades, are ambiguous. Current shortcomings can be redressed by incorporating suitable controls in new experimental designs.

  20. The readability and suitability of sexual health promotion leaflets.

    PubMed

    Corcoran, Nova; Ahmad, Fatuma

    2016-02-01

    To investigate the readability and suitability of sexual health promotion leaflets. Application of SMOG, FRY and SAM tests to assess the readability and suitability of a selection of sexual health leaflets. SMOG and FRY scores illustrate an average reading level of grade 9. SAM scores indicate that 59% of leaflets are superior in design and 41% are average in design. Leaflets generally perform well in the categories of content, literacy demand, typography and layout. They perform poorly in use of graphics, learning stimulation/motivation and cultural appropriateness. Sexual health leaflets have a reading level that is too high. Leaflets perform well on the suitability scores, indicating they are reasonably suitable. There are a number of areas where sexual health leaflets could improve their design. Numerous practical techniques are suggested for improving the readability and suitability of sexual health leaflets. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
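The SMOG score applied in such studies follows a published formula, grade = 1.0430 × sqrt(polysyllables × 30 / sentences) + 3.1291. The sketch below implements it with a deliberately crude vowel-group syllable counter, so its output is only an approximation; the sample text is invented, and full SMOG scoring normally samples 30 sentences.

```python
import math
import re

def count_syllables(word):
    """Very rough heuristic: count groups of consecutive vowels (incl. y)."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def smog_grade(text):
    """SMOG grade = 1.0430 * sqrt(polysyllables * 30 / sentences) + 3.1291."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    polysyllables = sum(1 for w in words if count_syllables(w) >= 3)
    return 1.0430 * math.sqrt(polysyllables * 30 / len(sentences)) + 3.1291

text = ("Use a condom every time. Clinics offer confidential testing. "
        "Ask about contraception options.")
print(round(smog_grade(text), 1))
```

A leaflet aiming at the commonly recommended grade 6 reading level would need far fewer words like "confidential" and "contraception" per sentence than this invented sample contains.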

  1. A design methodology for nonlinear systems containing parameter uncertainty: Application to nonlinear controller design

    NASA Technical Reports Server (NTRS)

    Young, G.

    1982-01-01

    A design methodology capable of dealing with nonlinear systems, such as a controlled ecological life support system (CELSS), containing parameter uncertainty is discussed. The methodology was applied to the design of discrete time nonlinear controllers. The nonlinear controllers can be used to control either linear or nonlinear systems. Several controller strategies are presented to illustrate the design procedure.

  2. Multirate Flutter Suppression System Design for the Benchmark Active Controls Technology Wing. Part 1; Theory and Design Procedure

    NASA Technical Reports Server (NTRS)

    Mason, Gregory S.; Berg, Martin C.; Mukhopadhyay, Vivek

    2002-01-01

    To study the effectiveness of various control system design methodologies, the NASA Langley Research Center initiated the Benchmark Active Controls Project. In this project, the various methodologies were applied to design a flutter suppression system for the Benchmark Active Controls Technology (BACT) Wing. This report describes a project at the University of Washington to design a multirate suppression system for the BACT wing. The objective of the project was twofold: first, to develop a methodology for designing robust multirate compensators, and second, to demonstrate the methodology by applying it to the design of a multirate flutter suppression system for the BACT wing.

  3. Force 2025 and Beyond Strategic Force Design Analytic Model

    DTIC Science & Technology

    2017-01-12

    depiction of the core ideas of our force design model. Figure 1: Description of Force Design Model. Figure 2 shows an overview of our methodology ... the F2025B Force Design Analytic Model research conducted by TRAC-MTRY and the Naval Postgraduate School. Our research develops a methodology for ... designs. We describe a data development methodology that characterizes the data required to construct a force design model using our approach. We

  4. 30 CFR 780.21 - Hydrologic information.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... contain information on water availability and alternative water sources, including the suitability of...) flooding or streamflow alteration; (D) ground water and surface water availability; and (E) other... Hydrologic information. (a) Sampling and analysis methodology. All water-quality analyses performed to meet...

  5. Memristor-Based Computing Architecture: Design Methodologies and Circuit Techniques

    DTIC Science & Technology

    2013-03-01

    MEMRISTOR-BASED COMPUTING ARCHITECTURE: DESIGN METHODOLOGIES AND CIRCUIT TECHNIQUES. POLYTECHNIC INSTITUTE OF NEW YORK UNIVERSITY ... TECHNICAL REPORT. DATES COVERED (From - To): OCT 2010 – OCT 2012. TITLE AND SUBTITLE: MEMRISTOR-BASED COMPUTING ARCHITECTURE: DESIGN METHODOLOGIES ... schemes for a memristor-based reconfigurable architecture design have not been fully explored yet. Therefore, in this project, we investigated

  6. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nachtigal, Noel M.; Fruetel, Julia A.; Gleason, Nathaniel J.

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  7. Novel optoelectronic methodology for testing of MOEMS

    NASA Astrophysics Data System (ADS)

    Pryputniewicz, Ryszard J.; Furlong, Cosme

    2003-01-01

    Continued demands for delivery of high-performance micro-optoelectromechanical systems (MOEMS) place unprecedented requirements on the methods used in their development and operation. Metrology is a major and inseparable part of these methods, and optoelectronic methodology is an essential field of metrology. Due to its scalability, optoelectronic methodology is particularly suitable for testing of MOEMS, where measurements must be made with ever increasing accuracy and precision. This has been particularly evident during the last few years, characterized by miniaturization of devices, as requirements for measurements rapidly increased while emerging technologies introduced new products, especially optical MEMS. In this paper, a novel optoelectronic methodology for testing of MOEMS is described and its applications are illustrated with representative examples. These examples demonstrate the capability to measure submicron deformations of various components of the micromirror device, under operating conditions, and show the viability of the optoelectronic methodology for testing of MOEMS.

  8. Sound in ecclesiastical spaces in Cordoba. Architectural projects incorporating acoustic methodology (El sonido del espacio eclesial en Cordoba. El proyecto arquitectonico como procedimiento acustico)

    NASA Astrophysics Data System (ADS)

    Suarez, Rafael

    2003-11-01

    This thesis is concerned with the acoustic analysis of ecclesiastical spaces, and the subsequent implementation of acoustic design methodology in architectural renovations. One begins with an adequate architectural design of specific elements (shape, materials, and textures), with the intention of eliminating the acoustic deficiencies common in such spaces: those that impair good speech intelligibility and good musical audibility. The investigation is limited to churches in the province of Cordoba, and to churches built after the reconquest of Spain (1236) and up until the 18th century. Selected churches are those that have undergone architectural renovations to adapt them to new uses or to make them more suitable for liturgical use. The thesis summarizes the acoustic analyses and the acoustical solutions that have been implemented. The results are presented in a manner that should be useful for the adoption of a model for the functional renovation of ecclesiastical spaces. Such a model would allow those involved in architectural projects to specify the nature of the sound, even though somewhat intangible, within the ecclesiastical space. Thesis advisors: Jaime Navarro and Juan J. Sendra. Copies of this thesis, written in Spanish, may be obtained by contacting the advisor, Jaime Navarro, E.T.S. de Arquitectura de Sevilla, Dpto. de Construcciones Arquitectonicas I, Av. Reina Mercedes, 2, 41012 Sevilla, Spain. E-mail address: jnavarro@us.es

  9. An Experimental Design Approach for Impurity Profiling of Valacyclovir-Related Products by RP-HPLC

    PubMed Central

    Katakam, Prakash; Dey, Baishakhi; Hwisa, Nagiat T; Assaleh, Fathi H; Chandu, Babu R; Singla, Rajeev K; Mitra, Analava

    2014-01-01

    Impurity profiling has become an important phase of pharmaceutical research where both spectroscopic and chromatographic methods find applications. The analytical methodology needs to be very sensitive, specific, and precise, and must separate and determine the impurity of interest at the 0.1% level. The current research reports a validated RP-HPLC method to detect and separate valacyclovir-related impurities (Imp-E and Imp-G) using the Box-Behnken design approach of response surface methodology. A gradient mobile phase (buffer: acetonitrile as mobile phase A and acetonitrile: methanol as mobile phase B) was used. Linearity was found in the concentration range of 50–150 μg/mL. The mean recoveries of the impurities were 99.9% and 103.2%, respectively. The %RSD values for the peak areas of Imp-E and Imp-G were 0.9 and 0.1, respectively. The absence of blank interference at the retention times of the impurities indicates the specificity of the method. The LOD values were 0.0024 μg/mL for Imp-E and 0.04 μg/mL for Imp-G, and the LOQ values were 0.0082 μg/mL and 0.136 μg/mL, respectively. The S/N ratios in both cases were within the specification limits. Proper peak shapes and satisfactory resolution with good retention times suggested the suitability of the method for impurity profiling of valacyclovir-related drug substances. PMID:25853072
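The precision and sensitivity figures quoted in such validation reports come from standard calculations. A short sketch using invented replicate areas and calibration values, and the ICH-style limits LOD = 3.3σ/S and LOQ = 10σ/S (none of the numbers below are from this paper):

```python
import statistics

# Hypothetical replicate peak areas for one impurity (illustrative numbers only)
areas = [10250, 10310, 10280, 10190, 10330, 10270]

mean = statistics.mean(areas)
sd = statistics.stdev(areas)          # sample standard deviation
rsd_percent = 100 * sd / mean         # relative standard deviation, %
print(f"%RSD = {rsd_percent:.2f}")

# ICH Q2-style sensitivity limits from a calibration curve:
# sigma = standard deviation of the response, S = slope (both hypothetical)
sigma, slope = 12.5, 5100.0
lod = 3.3 * sigma / slope
loq = 10 * sigma / slope
print(f"LOD = {lod:.4f} ug/mL, LOQ = {loq:.4f} ug/mL")
```

A %RSD below 1–2% for replicate injections is the usual system-suitability expectation, which is what the reported values of 0.9 and 0.1 satisfy.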

  10. Testing for genetically modified organisms (GMOs): Past, present and future perspectives.

    PubMed

    Holst-Jensen, Arne

    2009-01-01

    This paper presents an overview of GMO testing methodologies and how these have evolved and may evolve in the next decade. Challenges and limitations for the application of the test methods as well as to the interpretation of results produced with the methods are highlighted and discussed, bearing in mind the various interests and competences of the involved stakeholders. To better understand the suitability and limitations of detection methodologies the evolution of transformation processes for creation of GMOs is briefly reviewed.

  11. Transportation Energy Conservation Data Book: A Selected Bibliography. Edition 3,

    DTIC Science & Technology

    1978-11-01

    Charlottesville, VA 22901. TITLE: Computer-Based Resource Accounting Model for Automobile Technology Impact ... TITLE: Methodology for the Design of Urban Transportation ... Evaluation System (PIES) Documentation, Volume 6 ... ACCOUNTING; INDUSTRIAL SECTOR; ENERGY CONSUMPTION; PERFORMANCE; DESIGN; WASTE HEAT ... Methodology for the Design of Urban Transportation 000172 Energy Flows in the U.S., 1973 and 1974. Volume 1: Methodology ... Update to the National Energy

  12. A Method for Co-Designing Theory-Based Behaviour Change Systems for Health Promotion.

    PubMed

    Janols, Rebecka; Lindgren, Helena

    2017-01-01

    A methodology was defined and developed for designing theory-based behaviour change systems for health promotion that can be tailored to the individual. Theories from two research fields were combined with a participatory action research methodology, and two case studies applying the methodology were conducted. During and between group sessions, the participants created material and designs following the behaviour change strategy themes, which were discussed, analysed and transformed into a design of a behaviour change system. Theories in behaviour change and persuasive technology guided the data collection, data analyses, and the design of a behaviour change system. The methodology places strong emphasis on the target group's participation in the design process. The different aspects brought forward relate to behaviour change strategies defined in the literature on persuasive technology, and their dynamics are associated with needs and motivation defined in the literature on behaviour change. It was concluded that the methodology aids the integration of theories into a participatory action research design process, and aids the analysis and motivation of design choices.

  13. Control design for future agile fighters

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick C.; Davidson, John B.

    1991-01-01

    The CRAFT control design methodology is presented. CRAFT stands for the design objectives addressed, namely, Control power, Robustness, Agility, and Flying Qualities Tradeoffs. The approach combines eigenspace assignment, which allows for direct specification of eigenvalues and eigenvectors, with a graphical approach for representing control design metrics that captures numerous design goals in one composite illustration. The methodology makes use of control design metrics from four design objective areas: control power, robustness, agility, and flying qualities. An example of the CRAFT methodology, as well as associated design issues, is presented.

  14. OPUS: Optimal Projection for Uncertain Systems. Volume 1

    DTIC Science & Technology

    1991-09-01

    unified control-design methodology that directly addresses these technology issues. In particular, optimal projection theory addresses the need for ... effects, and limited identification accuracy in a 1-g environment. The principal contribution of OPUS is a unified design methodology that ... characterizing solutions to constrained control-design problems. Transforming OPUS into a practical design methodology requires the development of

  15. Is the Spatial Distribution of Mankind's Most Basic Economic Traits Determined by Climate and Soil Alone?

    PubMed Central

    Beck, Jan; Sieber, Andrea

    2010-01-01

    Background: Several authors, most prominently Jared Diamond (1997, Guns, Germs and Steel), have investigated biogeographic determinants of human history and civilization. The timing of the transition to an agricultural lifestyle, associated with steep population growth and consequent societal change, has been suggested to be affected by the availability of suitable organisms for domestication. These factors were shown to quantitatively explain some of the current global inequalities of economy and political power. Here, we advance this approach one step further by looking at climate and soil as sole determining factors. Methodology/Principal Findings: As a simplistic ‘null model’, we assume that only climate and soil conditions affect the suitability of four basic landuse types – agriculture, sedentary animal husbandry, nomadic pastoralism and hunting-and-gathering. Using ecological niche modelling (ENM), we derive spatial predictions of the suitability for these four landuse traits and apply these to the Old World and Australia. We explore two aspects of the properties of these predictions: conflict potential and population density. In a calculation of overlap of landuse suitability, we map regions of potential conflict between landuse types. Results are congruent with a number of real, present or historical, regions of conflict between ethnic groups associated with different landuse traditions. Furthermore, we found that our model of agricultural suitability explains a considerable portion of population density variability. We mapped residuals from this correlation, finding geographically highly structured deviations that invite further investigation. We also found that ENM of agricultural suitability correlates with a metric of local wealth generation (Gross Domestic Product, Purchasing Power Parity). 
Conclusions/Significance: From simplified assumptions on the links between climate, soil and landuse, we are able to provide good predictions of complex features of human geography. The spatial distribution of deviations from ENM predictions identifies those regions requiring further investigation of potential explanations. Our findings and methodological approaches may be of applied interest, e.g., in the context of climate change. PMID:20463959

  16. Systemic Operational Design: Improving Operational Planning for the Netherlands Armed Forces

    DTIC Science & Technology

    2006-05-25

    This methodology is called Soft Systems Methodology. His methodology is a structured way of thinking in which not only a perceived problematic ... Many similarities exist between Systemic Operational Design and Soft Systems Methodology; their epistemology is related. Furthermore, they both have ... Systems Thinking: Managing Chaos and Complexity. Boston: Butterworth Heinemann, 1999. Checkland, Peter, and Jim Scholes. Soft Systems Methodology in

  17. Multidisciplinary Concurrent Design Optimization via the Internet

    NASA Technical Reports Server (NTRS)

    Woodard, Stanley E.; Kelkar, Atul G.; Koganti, Gopichand

    2001-01-01

    A methodology is presented which uses commercial design and analysis software and the Internet to perform concurrent multidisciplinary optimization. The methodology provides a means to develop multidisciplinary designs without requiring that all software be accessible from the same local network. The procedures are amenable to design and development teams whose members, expertise and respective software are not geographically co-located. This methodology facilitates multidisciplinary teams working concurrently on a design problem of common interest. Partitioning the design software across different machines allows each constituent software package to be used on the machine that provides the most economy and efficiency. The methodology is demonstrated on the concurrent design of a spacecraft structure and attitude control system. Results are compared to those derived from performing the design with an autonomous FORTRAN program.

  18. Three-Dimensional Finite Element Ablative Thermal Response and Thermostructural Design of Thermal Protection Systems

    NASA Technical Reports Server (NTRS)

    Dec, John A.; Braun, Robert D.

    2011-01-01

    A finite element ablation and thermal response program is presented for simulation of three-dimensional transient thermostructural analysis. The three-dimensional governing differential equations and finite element formulation are summarized. A novel probabilistic design methodology for thermal protection systems is presented. The design methodology is an eight-step process beginning with a parameter sensitivity study, followed by a deterministic analysis whereby an optimum design can be determined. The design process concludes with a Monte Carlo simulation in which the probabilities of exceeding design specifications are estimated. The design methodology is demonstrated by applying it to the carbon phenolic compression pads of the Crew Exploration Vehicle. The maximum allowed values of bondline temperature and tensile stress are used as the design specifications in this study.
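The concluding Monte Carlo step, estimating the probability of exceeding a design specification under input uncertainty, can be sketched as follows. The surrogate temperature model, input distributions, and specification value are invented for illustration and are unrelated to the actual compression-pad analysis.

```python
import random

random.seed(1)

# Hypothetical linear surrogate: bondline temperature (deg C) as a function of
# two uncertain inputs; coefficients are illustrative only.
def bondline_temp(heat_flux, conductivity):
    return 150.0 + 0.8 * heat_flux - 40.0 * conductivity

SPEC = 250.0   # maximum allowed bondline temperature (design specification)
N = 100_000    # number of Monte Carlo samples

exceed = 0
for _ in range(N):
    q = random.gauss(120.0, 15.0)   # heat-flux uncertainty (mean, std dev)
    k = random.gauss(0.5, 0.05)     # conductivity uncertainty
    if bondline_temp(q, k) > SPEC:
        exceed += 1

print(f"P(exceed spec) ~ {exceed / N:.4f}")
```

In a real thermal protection system study each sample would invoke the finite element ablation solver (or a response surface fitted to it) rather than a one-line surrogate, but the exceedance-counting logic is the same.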

  19. De/signing Research in Education: Patchwork(ing) Methodologies with Theory

    ERIC Educational Resources Information Center

    Higgins, Marc; Madden, Brooke; Berard, Marie-France; Lenz Kothe, Elsa; Nordstrom, Susan

    2017-01-01

    Four education scholars extend the methodological space inspired by Jackson and Mazzei's "Thinking with Theory" through focusing on research design. The notion of de/sign is presented and employed to counter prescriptive method/ology that often sutures over pedagogical possibilities in research and educational settings. Key…

  20. Establishing Equivalence: Methodological Progress in Group-Matching Design and Analysis

    ERIC Educational Resources Information Center

    Kover, Sara T.; Atwood, Amy K.

    2013-01-01

    This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs used in behavioral research on cognition and…

  1. 77 FR 50514 - Post-Approval Studies 2012 Workshop: Design, Methodology, and Role in Evidence Appraisal...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-21

    ...] Post-Approval Studies 2012 Workshop: Design, Methodology, and Role in Evidence Appraisal Throughout the... Administration (FDA) is announcing the following public workshop entitled ``Post-Approval Studies 2012 Workshop: Design, Methodology, and Role in Evidence Appraisal Throughout the Total Product Life Cycle.'' The topics...

  2. Multirate flutter suppression system design for the Benchmark Active Controls Technology Wing

    NASA Technical Reports Server (NTRS)

    Berg, Martin C.; Mason, Gregory S.

    1994-01-01

    To study the effectiveness of various control system design methodologies, the NASA Langley Research Center initiated the Benchmark Active Controls Project. In this project, the various methodologies will be applied to design a flutter suppression system for the Benchmark Active Controls Technology (BACT) Wing (also called the PAPA wing). Eventually, the designs will be implemented in hardware and tested on the BACT wing in a wind tunnel. This report describes a project at the University of Washington to design a multirate flutter suppression system for the BACT wing. The objective of the project was twofold: first, to develop a methodology for designing robust multirate compensators, and second, to demonstrate the methodology by applying it to the design of a multirate flutter suppression system for the BACT wing. The contributions of this project are (1) development of an algorithm for synthesizing robust low order multirate control laws (the algorithm is capable of synthesizing a single compensator which stabilizes both the nominal plant and multiple plant perturbations); (2) development of a multirate design methodology, and supporting software, for modeling, analyzing and synthesizing multirate compensators; and (3) design of a multirate flutter suppression system for NASA's BACT wing which satisfies the specified design criteria. This report describes each of these contributions in detail. Section 2.0 discusses our design methodology. Section 3.0 details the results of our multirate flutter suppression system design for the BACT wing. Finally, Section 4.0 presents our conclusions and suggestions for future research. The body of the report focuses primarily on the results. The associated theoretical background appears in the three technical papers that are included as Attachments 1-3. Attachment 4 is a user's manual for the software that is key to our design methodology.

  3. Evaluation of MARC for the analysis of rotating composite blades

    NASA Technical Reports Server (NTRS)

    Bartos, Karen F.; Ernst, Michael A.

    1993-01-01

    The suitability of the MARC code for the analysis of rotating composite blades was evaluated using a four-task process. A nonlinear displacement analysis and subsequent eigenvalue analysis were performed on a rotating spring mass system to ensure that displacement-dependent centrifugal forces were accounted for in the eigenvalue analysis. Normal modes analyses were conducted on isotropic plates with various degrees of twist to evaluate MARC's ability to handle blade twist. Normal modes analyses were conducted on flat composite plates to validate the newly developed coupled COBSTRAN-MARC methodology. Finally, normal modes analyses were conducted on four composite propfan blades that were designed, analyzed, and fabricated at NASA Lewis Research Center. Results were compared with experimental data. The research documented herein presents MARC as a viable tool for the analysis of rotating composite blades.

  4. Evaluation of MARC for the analysis of rotating composite blades

    NASA Astrophysics Data System (ADS)

    Bartos, Karen F.; Ernst, Michael A.

    1993-03-01

    The suitability of the MARC code for the analysis of rotating composite blades was evaluated using a four-task process. A nonlinear displacement analysis and subsequent eigenvalue analysis were performed on a rotating spring mass system to ensure that displacement-dependent centrifugal forces were accounted for in the eigenvalue analysis. Normal modes analyses were conducted on isotropic plates with various degrees of twist to evaluate MARC's ability to handle blade twist. Normal modes analyses were conducted on flat composite plates to validate the newly developed coupled COBSTRAN-MARC methodology. Finally, normal modes analyses were conducted on four composite propfan blades that were designed, analyzed, and fabricated at NASA Lewis Research Center. Results were compared with experimental data. The research documented herein presents MARC as a viable tool for the analysis of rotating composite blades.

  5. Characterization of mechanical properties of pseudoelastic shape memory alloys under harmonic excitation

    NASA Astrophysics Data System (ADS)

    Böttcher, J.; Jahn, M.; Tatzko, S.

    2017-12-01

    Pseudoelastic shape memory alloys exhibit a stress-induced phase transformation which leads to high strains during deformation of the material. The stress-strain characteristic during this thermomechanical process is hysteretic and results in the conversion of mechanical energy into thermal energy. This energy conversion allows for the use of shape memory alloys in vibration reduction. For the application of shape memory alloys as vibration damping devices, dynamic modeling of the material behavior is necessary. In this context, experimentally determined material parameters that accurately represent the material behavior are essential for a reliable material model. This publication identifies suitable material parameters for pseudoelastic shape memory alloys and presents a methodology for determining them from experimental investigations. The test rig used was specifically designed for the characterization of pseudoelastic shape memory alloys.

  6. Soft Actuators for Small-Scale Robotics.

    PubMed

    Hines, Lindsey; Petersen, Kirstin; Lum, Guo Zhan; Sitti, Metin

    2017-04-01

    This review comprises a detailed survey of ongoing methodologies for soft actuators, highlighting approaches suitable for nanometer- to centimeter-scale robotic applications. Soft robots present a special design challenge in that their actuation and sensing mechanisms are often highly integrated with the robot body and overall functionality. When less than a centimeter, they belong to an even more special subcategory of robots or devices, in that they often lack on-board power, sensing, computation, and control. Soft, active materials are particularly well suited for this task, with a wide range of stimulants and a number of impressive examples, demonstrating large deformations, high motion complexities, and varied multifunctionality. Recent research includes both the development of new materials and composites, as well as novel implementations leveraging the unique properties of soft materials.

  7. Design and analysis of sustainable computer mouse using design for disassembly methodology

    NASA Astrophysics Data System (ADS)

    Roni Sahroni, Taufik; Fitri Sukarman, Ahmad; Agung Mahardini, Karunia

    2017-12-01

    This paper presents the design and analysis of a computer mouse using the Design for Disassembly methodology. The existing computer mouse model consists of a number of unnecessary parts that increase assembly and disassembly time in production. The objective of this project is to design a new computer mouse based on the Design for Disassembly (DFD) methodology. The methodology proceeded from sketch generation through concept selection and concept scoring. Based on the design screening, design concept B was selected for further analysis. A new computer mouse design using a fastening system is proposed. Furthermore, three materials, ABS, polycarbonate, and high-density PE, were evaluated to determine their environmental impact. Sustainability analysis was conducted using SolidWorks software. As a result, high-density PE gives the lowest environmental impact while providing a high maximum stress value.

  8. Comparative analysis of numerical models of pipe handling equipment used in offshore drilling applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pawlus, Witold, E-mail: witold.p.pawlus@ieee.org; Ebbesen, Morten K.; Hansen, Michael R.

    Design of offshore drilling equipment is a task that involves not only analysis of strict machine specifications and safety requirements but also consideration of changeable weather conditions and harsh environment. These challenges call for a multidisciplinary approach and make the design process complex. Various modeling software products are currently available to aid design engineers in their effort to test and redesign equipment before it is manufactured. However, given the number of available modeling tools and methods, the choice of the proper modeling methodology becomes not obvious and – in some cases – troublesome. Therefore, we present a comparative analysis of two popular approaches used in modeling and simulation of mechanical systems: multibody and analytical modeling. A gripper arm of the offshore vertical pipe handling machine is selected as a case study for which both models are created. In contrast to some other works, the current paper shows verification of both systems by benchmarking their simulation results against each other. Such criteria as modeling effort and results accuracy are evaluated to assess which modeling strategy is the most suitable given its eventual application.

  9. Geohydrology of the Antelope Valley Area, California and design for a ground-water-quality monitoring network

    USGS Publications Warehouse

    Duell, L.F.

    1987-01-01

    A basinwide ideal network and an actual network were designed to identify ambient groundwater quality, trends in groundwater quality, and degree of threat from potential pollution sources in Antelope Valley, California. In general, throughout the valley groundwater quality has remained unchanged, and no specific trends are apparent. The main source of groundwater for the valley is generally suitable for domestic, irrigation, and most industrial uses. Water quality data for selected constituents of some network wells and surface-water sites are presented. The ideal network of 77 sites was selected on the basis of site-specific criteria, geohydrology, and current land use (agricultural, residential, and industrial). These sites were used as a guide in the design of the actual network, consisting of 44 existing wells. Because of budgetary constraints, wells already being monitored were selected whenever possible. Of the remaining ideal sites, 20 have existing wells not part of a current water quality network, and 13 are locations where no wells exist. The methodology used for the selection of sites, constituents monitored, and frequency of analysis will enable network users to make appropriate future changes to the monitoring network. (USGS)

  10. Optimization of the combined ultrasonic assisted/adsorption method for the removal of malachite green by gold nanoparticles loaded on activated carbon: Experimental design

    NASA Astrophysics Data System (ADS)

    Roosta, M.; Ghaedi, M.; Shokri, N.; Daneshfar, A.; Sahraei, R.; Asghari, A.

    2014-01-01

    The present study applied experimental design optimization to the removal of malachite green (MG) from aqueous solution by ultrasound-assisted adsorption onto gold nanoparticles loaded on activated carbon (Au-NP-AC). This nanomaterial was characterized using techniques such as FESEM, TEM, BET, and UV-vis measurements. The effects of variables such as pH, initial dye concentration, adsorbent dosage (g), temperature, and sonication time on MG removal were studied using central composite design (CCD), and the optimum experimental conditions were found with a desirability function (DF) combined with response surface methodology (RSM). Fitting the experimental equilibrium data to isotherm models such as the Langmuir, Freundlich, Tempkin, and Dubinin-Radushkevich models shows the suitability and applicability of the Langmuir model. The applicability of kinetic models such as the pseudo-first-order, pseudo-second-order, Elovich, and intraparticle diffusion models was tested against the experimental data; the pseudo-second-order equation and the intraparticle diffusion model control the kinetics of the adsorption process. A small amount of the proposed adsorbent (0.015 g) achieves successful removal of MG (RE > 99%) in a short time (4.4 min) with high adsorption capacity (140-172 mg g-1).
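    The Langmuir suitability check mentioned above is commonly performed by fitting the linearized form Ce/qe = Ce/qmax + 1/(KL·qmax) with ordinary least squares. A minimal sketch on synthetic data follows; the numbers are illustrative, not the study's measurements.

```python
# Fit the linearized Langmuir isotherm: y = a*x + b with x = Ce, y = Ce/qe,
# so that qmax = 1/a (monolayer capacity) and KL = a/b (Langmuir constant).
def fit_langmuir(Ce, qe):
    x = Ce
    y = [c / q for c, q in zip(Ce, qe)]
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(u * v for u, v in zip(x, y))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return 1.0 / a, a / b  # (qmax, KL)

# Synthetic equilibrium data generated from qmax = 160 mg/g, KL = 0.5 L/mg
Ce = [2.0, 5.0, 10.0, 20.0, 40.0]
qe = [160 * 0.5 * c / (1 + 0.5 * c) for c in Ce]
print(fit_langmuir(Ce, qe))  # recovers qmax ≈ 160, KL ≈ 0.5
```

    With real data, the quality of the straight-line fit (its R²) is what indicates whether the Langmuir model is suitable.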

  11. Choice of design and outcomes in trials among children with moderate acute malnutrition.

    PubMed

    Friis, Henrik; Michaelsen, Kim F; Wells, Jonathan C

    2015-03-01

    There is a need for trials on the effects of food aid products for children with moderate acute malnutrition, to identify how best to restore body tissues and function. The choice of control intervention is a major challenge, with both ethical and scientific implications. While randomized trials are needed, special designs, such as cluster-randomized, stepped-wedged or factorial designs may offer advantages. Anthropometry is widely used as the primary outcome in such trials, but anthropometric traits do not refer directly to specific organs, tissues, or functions. Thus, it is difficult to understand what components of health might be impacted by public health programs, or the underlying mechanisms whereby improved nutritional status might benefit short- and long-term health. Measurement of body composition, specific growth markers and functional outcomes may provide greater insight into the nature and implications of growth failure and recovery. There are now several methodologies suitable for application in infants and young children, e.g., measuring body composition with deuterium dilution, physical activity with accelerometers and linear growth with knemometers. To evaluate the generalizability of the findings from nutrition trials, it is important to collect data on baseline nutritional status.

  12. Assuring data transparency through design methodologies

    NASA Technical Reports Server (NTRS)

    Williams, Allen

    1990-01-01

    This paper addresses the role of design methodologies and practices in the assurance of technology transparency. The development of several subsystems on large, long life cycle government programs was analyzed to glean those characteristics in the design, development, test, and evaluation that precluded or enabled the insertion of new technology. The programs examined were Minuteman, DSP, B1-B, and space shuttle. All these were long life cycle, technology-intensive programs. The design methodologies (or lack thereof) and design practices for each were analyzed in terms of the success or failure in incorporating evolving technology. Common elements contributing to the success or failure were extracted and compared to current methodologies being proposed by the Department of Defense and NASA. The relevance of these practices to the design and deployment of Space Station Freedom were evaluated. In particular, appropriate methodologies now being used on the core development contract were examined.

  13. Efficient testing methodologies for microcameras in a gigapixel imaging system

    NASA Astrophysics Data System (ADS)

    Youn, Seo Ho; Marks, Daniel L.; McLaughlin, Paul O.; Brady, David J.; Kim, Jungsang

    2013-04-01

    Multiscale parallel imaging--based on a monocentric optical design--promises revolutionary advances in diverse imaging applications by enabling high resolution, real-time image capture over a wide field-of-view (FOV), including sport broadcast, wide-field microscopy, astronomy, and security surveillance. Recently demonstrated AWARE-2 is a gigapixel camera consisting of an objective lens and 98 microcameras spherically arranged to capture an image over FOV of 120° by 50°, using computational image processing to form a composite image of 0.96 gigapixels. Since microcameras are capable of individually adjusting exposure, gain, and focus, true parallel imaging is achieved with a high dynamic range. From the integration perspective, manufacturing and verifying consistent quality of microcameras is a key to successful realization of AWARE cameras. We have developed an efficient testing methodology that utilizes a precisely fabricated dot grid chart as a calibration target to extract critical optical properties such as optical distortion, veiling glare index, and modulation transfer function to validate imaging performance of microcameras. This approach utilizes an AWARE objective lens simulator which mimics the actual objective lens but operates with a short object distance, suitable for a laboratory environment. Here we describe the principles of the methodologies developed for AWARE microcameras and discuss the experimental results with our prototype microcameras. Reference Brady, D. J., Gehm, M. E., Stack, R. A., Marks, D. L., Kittle, D. S., Golish, D. R., Vera, E. M., and Feller, S. D., "Multiscale gigapixel photography," Nature 486, 386--389 (2012).

  14. Development of Phaleria macrocarpa (Scheff.) Boerl Fruits Using Response Surface Methodology Focused on Phenolics, Flavonoids and Antioxidant Properties.

    PubMed

    Mohamed Mahzir, Khurul Ain; Abd Gani, Siti Salwa; Hasanah Zaidan, Uswatun; Halmi, Mohd Izuan Effendi

    2018-03-22

    In this study, the optimal conditions for the extraction of antioxidants from the Buah Mahkota Dewa fruit (Phaleria macrocarpa) were determined by using Response Surface Methodology (RSM). The optimisation was applied using a Central Composite Design (CCD) to investigate the effect of three independent variables, namely extraction temperature (°C), extraction time (minutes), and solvent-to-feed ratio (% v/v), on four responses: free radical scavenging activity (DPPH), ferric ion reducing power assay (FRAP), total phenolic content (TPC) and total flavonoid content (TFC). The optimal conditions for the antioxidant extraction were found to be 64 °C extraction temperature, 66 min extraction time and 75% v/v solvent-to-feed ratio, giving the highest percentage yields of DPPH, FRAP, TPC and TFC of 86.85%, 7.47%, 292.86 mg/g and 3.22 mg/g, respectively. Moreover, the data were subjected to Response Surface Methodology (RSM) and the results showed that the polynomial equations for all models were significant, did not show lack of fit, and presented adjusted determination coefficients (R²) above 99%, proving that the yields of phenolics, flavonoids and antioxidant activities obtained experimentally were close to the predicted values and confirming the suitability of the model employed in RSM to optimise the extraction conditions. Hence, in this study, the fruit of P. macrocarpa could be considered to have strong antioxidant ability and can be used in various cosmeceutical or medicinal applications.
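    A central composite design like the one used above consists of two-level factorial points, axial (star) points at ±α, and replicated center points. A minimal sketch of generating the coded design points for k factors is shown below; the rotatable α = (2^k)^(1/4) and the number of center runs are generic textbook choices, not the study's exact plan.

```python
import itertools

def ccd_points(k, n_center=6):
    """Coded central composite design points for k factors."""
    alpha = (2 ** k) ** 0.25                      # rotatable star distance
    factorial = [list(p) for p in itertools.product([-1.0, 1.0], repeat=k)]
    axial = []
    for i in range(k):                            # one ± pair per factor axis
        for s in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = s
            axial.append(pt)
    center = [[0.0] * k for _ in range(n_center)] # replicated center runs
    return factorial + axial + center

design = ccd_points(3)
print(len(design))  # 8 factorial + 6 axial + 6 center = 20 runs
```

    Each coded point is then mapped linearly onto the physical ranges of the factors (here temperature, time, and solvent-to-feed ratio) before running the experiments.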

  15. Design description of the Tangaye Village photovoltaic power system

    NASA Astrophysics Data System (ADS)

    Martz, J. E.; Ratajczak, A. F.

    1982-06-01

    The engineering design of a stand alone photovoltaic (PV) powered grain mill and water pump for the village of Tangaye, Upper Volta is described. The socioeconomic effects of reducing the time required by women in rural areas for drawing water and grinding grain were studied. The suitability of photovoltaic technology for use in rural areas by people of limited technical training was demonstrated. The PV system consists of a 1.8-kW (peak) solar cell array, 540 ampere hours of battery storage, instrumentation, automatic controls, and a data collection and storage system. The PV system is situated near an improved village well and supplies d.c. power to a grain mill and a water pump. The array is located in a fenced area and the mill, battery, instruments, controls, and data system are in a mill building. A water storage tank is located near the well. The system employs automatic controls which provide battery charge regulation and system over and under voltage protection. This report includes descriptions of the engineering design of the system and of the load that it serves; a discussion of PV array and battery sizing methodology; descriptions of the mechanical and electrical designs including the array, battery, controls, and instrumentation; and a discussion of the safety features. The system became operational on March 1, 1979.

  16. Design description of the Tangaye Village photovoltaic power system

    NASA Technical Reports Server (NTRS)

    Martz, J. E.; Ratajczak, A. F.

    1982-01-01

    The engineering design of a stand alone photovoltaic (PV) powered grain mill and water pump for the village of Tangaye, Upper Volta is described. The socioeconomic effects of reducing the time required by women in rural areas for drawing water and grinding grain were studied. The suitability of photovoltaic technology for use in rural areas by people of limited technical training was demonstrated. The PV system consists of a 1.8-kW (peak) solar cell array, 540 ampere hours of battery storage, instrumentation, automatic controls, and a data collection and storage system. The PV system is situated near an improved village well and supplies d.c. power to a grain mill and a water pump. The array is located in a fenced area and the mill, battery, instruments, controls, and data system are in a mill building. A water storage tank is located near the well. The system employs automatic controls which provide battery charge regulation and system over and under voltage protection. This report includes descriptions of the engineering design of the system and of the load that it serves; a discussion of PV array and battery sizing methodology; descriptions of the mechanical and electrical designs including the array, battery, controls, and instrumentation; and a discussion of the safety features. The system became operational on March 1, 1979.

  17. A Modified Dynamic Evolving Neural-Fuzzy Approach to Modeling Customer Satisfaction for Affective Design

    PubMed Central

    Kwong, C. K.; Fung, K. Y.; Jiang, Huimin; Chan, K. Y.

    2013-01-01

    Affective design is an important aspect of product development to achieve a competitive edge in the marketplace. A neural-fuzzy network approach has been attempted recently to model customer satisfaction for affective design and it has been proved to be an effective one to deal with the fuzziness and non-linearity of the modeling as well as generate explicit customer satisfaction models. However, such an approach to modeling customer satisfaction has two limitations. First, it is not suitable for the modeling problems which involve a large number of inputs. Second, it cannot adapt to new data sets, given that its structure is fixed once it has been developed. In this paper, a modified dynamic evolving neural-fuzzy approach is proposed to address the above mentioned limitations. A case study on the affective design of mobile phones was conducted to illustrate the effectiveness of the proposed methodology. Validation tests were conducted and the test results indicated that: (1) the conventional Adaptive Neuro-Fuzzy Inference System (ANFIS) failed to run due to a large number of inputs; (2) the proposed dynamic neural-fuzzy model outperforms the subtractive clustering-based ANFIS model and fuzzy c-means clustering-based ANFIS model in terms of their modeling accuracy and computational effort. PMID:24385884

  18. Intervention studies to foster resilience - A systematic review and proposal for a resilience framework in future intervention studies.

    PubMed

    Chmitorz, A; Kunzler, A; Helmreich, I; Tüscher, O; Kalisch, R; Kubiak, T; Wessa, M; Lieb, K

    2018-02-01

    Psychological resilience refers to the phenomenon that many people are able to adapt to the challenges of life and maintain mental health despite exposure to adversity. This has stimulated research on training programs to foster psychological resilience. We evaluated concepts, methods and designs of 43 randomized controlled trials published between 1979 and 2014 which assessed the efficacy of such training programs and propose standards for future intervention research based on recent developments in the field. We found that concepts, methods and designs in current resilience intervention studies are of limited use to properly assess efficacy of interventions to foster resilience. Major problems are the use of definitions of resilience as trait or a composite of resilience factors, the use of unsuited assessment instruments, and inappropriate study designs. To overcome these challenges, we propose 1) an outcome-oriented definition of resilience, 2) an outcome-oriented assessment of resilience as change in mental health in relation to stressor load, and 3) methodological standards for suitable study designs of future intervention studies. Our proposals may contribute to an improved quality of resilience intervention studies and may stimulate further progress in this growing research field.

  19. A modified dynamic evolving neural-fuzzy approach to modeling customer satisfaction for affective design.

    PubMed

    Kwong, C K; Fung, K Y; Jiang, Huimin; Chan, K Y; Siu, Kin Wai Michael

    2013-01-01

    Affective design is an important aspect of product development to achieve a competitive edge in the marketplace. A neural-fuzzy network approach has been attempted recently to model customer satisfaction for affective design and it has been proved to be an effective one to deal with the fuzziness and non-linearity of the modeling as well as generate explicit customer satisfaction models. However, such an approach to modeling customer satisfaction has two limitations. First, it is not suitable for the modeling problems which involve a large number of inputs. Second, it cannot adapt to new data sets, given that its structure is fixed once it has been developed. In this paper, a modified dynamic evolving neural-fuzzy approach is proposed to address the above mentioned limitations. A case study on the affective design of mobile phones was conducted to illustrate the effectiveness of the proposed methodology. Validation tests were conducted and the test results indicated that: (1) the conventional Adaptive Neuro-Fuzzy Inference System (ANFIS) failed to run due to a large number of inputs; (2) the proposed dynamic neural-fuzzy model outperforms the subtractive clustering-based ANFIS model and fuzzy c-means clustering-based ANFIS model in terms of their modeling accuracy and computational effort.

  20. Evolving spiking neural networks: a novel growth algorithm exhibits unintelligent design

    NASA Astrophysics Data System (ADS)

    Schaffer, J. David

    2015-06-01

    Spiking neural networks (SNNs) have drawn considerable excitement because of their computational properties, believed to be superior to conventional von Neumann machines, and sharing properties with living brains. Yet progress building these systems has been limited because we lack a design methodology. We present a gene-driven network growth algorithm that enables a genetic algorithm (evolutionary computation) to generate and test SNNs. The genome for this algorithm grows O(n) where n is the number of neurons; n is also evolved. The genome not only specifies the network topology, but all its parameters as well. Experiments show the algorithm producing SNNs that effectively produce a robust spike bursting behavior given tonic inputs, an application suitable for central pattern generators. Even though evolution did not include perturbations of the input spike trains, the evolved networks showed remarkable robustness to such perturbations. In addition, the output spike patterns retain evidence of the specific perturbation of the inputs, a feature that could be exploited by network additions that could use this information for refined decision making if required. On a second task, a sequence detector, a discriminating design was found that might be considered an example of "unintelligent design"; extra non-functional neurons were included that, while inefficient, did not hamper its proper functioning.

  1. State reference design and saturated control of doubly-fed induction generators under voltage dips

    NASA Astrophysics Data System (ADS)

    Tilli, Andrea; Conficoni, Christian; Hashemi, Ahmad

    2017-04-01

    In this paper, the stator/rotor current control problem of the doubly-fed induction generator under faulty line voltage is addressed. Common grid faults cause a steep decline in the line voltage profile, commonly denoted as a voltage dip. This point is critical for such machines, whose stator windings are directly connected to the grid. In this respect, solid methodological nonlinear control theory arguments are exploited and applied to design a novel controller whose main goal is to improve the system behaviour during voltage dips, endowing it with low voltage ride through capability, a fundamental feature required by modern Grid Codes. The proposed solution exploits both feedforward and feedback actions. The feedforward part relies on suitable reference trajectories for the system internal dynamics, which are designed to prevent large oscillations in the rotor currents and command voltages, excited by line perturbations. The feedback part uses state measurements and is designed according to Linear Matrix Inequality (LMI) based saturated control techniques to further reduce oscillations, while explicitly accounting for the system constraints. Numerical simulations verify the benefits of the internal dynamics trajectory planning and the saturated state feedback action in crucially improving the doubly-fed induction machine response under severe grid faults.

  2. Give Design a Chance: A Case for a Human Centered Approach to Operational Art

    DTIC Science & Technology

    2017-03-30

    strategy development and operational art. This demands fuller integration of the Army Design Methodology (ADM) and the Military Decision Making Process (MDMP). This monograph proposes a way of thinking and planning that goes beyond current Army doctrinal methodologies to address the changing...between conceptual and detailed planning. Subject terms: Design; Army Design Methodology (ADM); Human Centered; Strategy; Operational Art

  3. Development of tf coil support concepts by design methodology in the case of a Bitter-type magnet. [Bitter-type magnets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brossmann, U.B.

    1981-01-01

    The application of methodological design is demonstrated for the development of support concepts in the case of a Bitter-type magnet designed for a compact tokamak experiment aiming at ignition of a DT plasma. With this methodology, all boundary conditions and design criteria are more easily satisfied in a technically and economically sound way.

  4. On the suitability and development of layout templates for analog layout reuse and layout-aware synthesis

    NASA Astrophysics Data System (ADS)

    Castro-Lopez, Rafael; Fernandez, Francisco V.; Rodriguez Vazquez, Angel

    2005-06-01

    Accelerating the synthesis of increasingly complex analog integrated circuits is key to bridge the widening gap between what we can integrate and what we can design while meeting ever-tightening time-to-market constraints. It is a well-known fact in the semiconductor industry that such a goal can only be attained by means of adequate CAD methodologies, techniques, and accompanying tools. This is particularly important in analog physical synthesis (a.k.a. layout generation), where large sensitivities of the circuit performances to the many subtle details of layout implementation (device matching, loading and coupling effects, reliability, and area are of utmost importance to analog designers) render complete automation a truly challenging task. To approach the problem, two directions have been traditionally considered, knowledge-based and optimization-based, both with their own pros and cons. Besides, recently reported solutions oriented to speed up the overall design flow by means of reuse-based practices or by cutting off time-consuming, error-prone spins between electrical and layout synthesis (a technique known as layout-aware synthesis) rely on an outstandingly rapid yet efficient layout generation method. This paper analyses the suitability of procedural layout generation based on templates (a knowledge-based approach) by examining the requirements that both layout reuse and layout-aware solutions impose, and how layout templates face them. The ability to capture the know-how of experienced layout designers and the turnaround times for layout instancing are considered main comparative aspects in relation to other layout generation approaches. A discussion on the benefit-cost trade-off of using layout templates is also included.
In addition to this analysis, the paper delves deeper into systematic techniques to develop fully reusable layout templates for analog circuits, either for a change of the circuit sizing (i.e., layout retargeting) or a change of the fabrication process (i.e., layout migration). Several examples implemented with the Cadence's Virtuoso tool suite are provided as demonstration of the paper's contributions.
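
    The template idea can be illustrated with a minimal sketch (hypothetical, not the authors' Cadence-based flow): a procedural generator encodes a designer's placement rules once, and re-running it with new device sizes performs the retargeting step automatically.

```python
def current_mirror_template(w, l, spacing=2):
    """A procedural layout template (hypothetical): two matched devices
    placed at a fixed pitch; returns name -> (x, y, width, height)."""
    return {
        "M1": (0, 0, w, l),
        "M2": (w + spacing, 0, w, l),  # matching rule: same row, fixed gap
    }

# Retargeting: rerun the same rules with a new sizing; no manual rework.
small = current_mirror_template(w=10, l=4)
large = current_mirror_template(w=24, l=6)
print(small["M2"], large["M2"])
```

    The know-how (the relative-placement rule) lives in the template body, so every instancing run is fast and consistent, which is the property layout-aware synthesis depends on.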

  5. The Study of the Relationship between Probabilistic Design and Axiomatic Design Methodology. Volume 2

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chin-Yere; Onyebueke, Landon

    1996-01-01

    The structural design, or the design of machine elements, has traditionally been based on deterministic design methodology. The deterministic method considers all design parameters to be known with certainty. This methodology is, therefore, inadequate for the design of complex structures that are subjected to a variety of complex, severe loading conditions. A nonlinear behavior that is dependent on stress, stress rate, temperature, number of load cycles, and time is observed in all components subjected to such complex conditions. These conditions introduce uncertainties; hence, the actual margin of safety remains unknown. In the deterministic methodology, the contingency of failure is discounted by using a high factor of safety, and the approach is most useful in situations where the structures being designed are simple. The probabilistic method is concerned with the probability of non-failure performance of structures or machine elements. It is much more useful in situations where the design is characterized by complex geometry, the possibility of catastrophic failure, or sensitive loads and material properties. Also included: Comparative Study of the Use of AGMA Geometry Factors and Probabilistic Design Methodology in the Design of Compact Spur Gear Set.
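
    The contrast between the two methodologies can be sketched in a few lines (all numbers illustrative): a deterministic check against a fixed factor of safety says nothing about the chance of failure, while a Monte Carlo estimate under assumed normal scatter makes that probability explicit.

```python
import random

def deterministic_margin(strength, load, safety_factor=2.0):
    """Deterministic check: pass/fail against a fixed factor of safety."""
    return strength / safety_factor >= load

def probability_of_failure(mean_s, sd_s, mean_l, sd_l, n=100_000, seed=1):
    """Monte Carlo estimate of P(strength < load), both assumed normal."""
    rng = random.Random(seed)
    failures = sum(
        1 for _ in range(n) if rng.gauss(mean_s, sd_s) < rng.gauss(mean_l, sd_l)
    )
    return failures / n

# A member that passes the deterministic check still carries a small but
# nonzero failure probability once scatter in strength and load is modeled.
print(deterministic_margin(400.0, 180.0))
print(probability_of_failure(400.0, 60.0, 180.0, 40.0))
```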

  6. Electronic Design Automation: Integrating the Design and Manufacturing Functions

    NASA Technical Reports Server (NTRS)

    Bachnak, Rafic; Salkowski, Charles

    1997-01-01

    As the complexity of electronic systems grows, the traditional design practice, a sequential process, is replaced by concurrent design methodologies. A major advantage of concurrent design is that the feedback from software and manufacturing engineers can be easily incorporated into the design. The implementation of concurrent engineering methodologies is greatly facilitated by employing the latest Electronic Design Automation (EDA) tools. These tools offer integrated simulation of the electrical, mechanical, and manufacturing functions and support virtual prototyping, rapid prototyping, and hardware-software co-design. This report presents recommendations for enhancing the electronic design and manufacturing capabilities and procedures at JSC based on a concurrent design methodology that employs EDA tools.

  7. Exploring consensus in 21st century projections of climatically suitable areas for African vertebrates

    PubMed Central

    Garcia, Raquel A; Burgess, Neil D; Cabeza, Mar; Rahbek, Carsten; Araújo, Miguel B

    2012-01-01

    Africa is predicted to be highly vulnerable to 21st century climatic changes. Assessing the impacts of these changes on Africa's biodiversity is, however, plagued by uncertainties, and markedly different results can be obtained from alternative bioclimatic envelope models or future climate projections. Using an ensemble forecasting framework, we examine projections of future shifts in climatic suitability, and their methodological uncertainties, for over 2500 species of mammals, birds, amphibians and snakes in sub-Saharan Africa. To summarize a priori the variability in the ensemble of 17 general circulation models, we introduce a consensus methodology that combines co-varying models. Thus, we quantify and map the relative contribution to uncertainty of seven bioclimatic envelope models, three multi-model climate projections and three emissions scenarios, and explore the resulting variability in species turnover estimates. We show that bioclimatic envelope models contribute most to variability, particularly in projected novel climatic conditions over Sahelian and southern Saharan Africa. To summarize agreements among projections from the bioclimatic envelope models we compare five consensus methodologies, which generally increase or retain projection accuracy and provide consistent estimates of species turnover. Variability from emissions scenarios increases towards late-century and affects southern regions of high species turnover centred in arid Namibia. Twofold differences in median species turnover across the study area emerge among alternative climate projections and emissions scenarios. Our ensemble of projections underscores the potential bias when using a single algorithm or climate projection for Africa, and provides a cautious first approximation of the potential exposure of sub-Saharan African vertebrates to climatic changes. 
The future use and further development of bioclimatic envelope modelling will hinge on the interpretation of results in the light of methodological as well as biological uncertainties. Here, we provide a framework to address methodological uncertainties and contextualize results.
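
    The partitioning of uncertainty across factors can be sketched as follows (species-turnover values and factor labels are hypothetical; the study uses seven envelope models and multiple climate projections):

```python
from statistics import median

# Hypothetical species-turnover projections for one grid cell, indexed by
# (bioclimatic envelope model, climate projection); values are assumed.
projections = {
    ("envA", "gcm1"): 0.42, ("envA", "gcm2"): 0.48,
    ("envB", "gcm1"): 0.30, ("envB", "gcm2"): 0.35,
    ("envC", "gcm1"): 0.55, ("envC", "gcm2"): 0.60,
}

# Consensus: a central estimate, plus total spread as a crude uncertainty.
vals = sorted(projections.values())
center, spread = median(vals), vals[-1] - vals[0]

def within_level_spread(factor_index):
    """Max spread among projections sharing one level of a factor; a large
    value means the *other* factor is driving the variability."""
    groups = {}
    for key, v in projections.items():
        groups.setdefault(key[factor_index], []).append(v)
    return max(max(g) - min(g) for g in groups.values())

# Holding the climate projection fixed exposes the envelope-model spread:
print(center, spread)
print(within_level_spread(1) > within_level_spread(0))
```

    In this toy cell the spread across envelope models dominates the spread across climate projections, mirroring the paper's finding that the choice of bioclimatic envelope model contributes most to variability.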

  8. Validation of On-Orbit Methodology for the Assessment of Cardiac Function and Changes in the Circulating Volume Using Ultrasound and "Braslet-M" Occlusion Cuffs

    NASA Technical Reports Server (NTRS)

    Bogomolov, V. V.; Duncan, J. M.; Alferova, I. V.; Dulchavsky, S. A.; Ebert, D.; Hamilton, D. R.; Matveev, V. P.; Sargsyan, A. E.

    2008-01-01

    Recent advances in remotely guided imaging techniques on the ISS allow the acquisition of high quality ultrasound data by crewmember operators with no medical background and minimal training. However, ongoing efforts are required to develop and validate methodology for complex imaging protocols to ensure their repeatability, efficiency, and suitability for use aboard the ISS. This Station Developmental Test Objective (SDTO) tests a cardiovascular evaluation methodology that takes advantage of the ISS ultrasound capability, the Braslet-M device, and modified respiratory maneuvers (Valsalva and Mueller) to broaden the spectrum of anatomical and functional information on the human cardiovascular system during long-duration space missions. The proposed methodology optimizes and combines new and previously demonstrated methods, and is expected to benefit medically indicated assessments, operational research protocols, and data collections for science. Braslet-M is a current Russian operational countermeasure that compresses the upper thigh to impede venous return from the lower extremities. The goal of the SDTO is to establish and validate a repeatable ultrasound-based methodology for the assessment of a number of cardiovascular criteria in microgravity. The Braslet-M device is used as a means to acutely alter volume distribution while focused ultrasound measurements are performed. Modified respiratory maneuvers are performed during the volume manipulations to record commensurate changes in anatomical and functional parameters. The overall cardiovascular effects of the Braslet-M device are not completely understood, and although not a primary objective of this SDTO, this effort will provide pilot data regarding the suitability of Braslet-M for its intended purpose, its effects, and the indications for its use.

  9. Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria

    NASA Astrophysics Data System (ADS)

    Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong

    2017-08-01

    In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for the reliability and sensitivity analysis of complex components with arbitrary distribution parameters is investigated, using the perturbation method, the response surface method, the Edgeworth series, and a sensitivity analysis approach. The RSBDO methodology is then established by incorporating the sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. Comparison with Monte Carlo simulation demonstrates that the proposed methodology provides an accurate, convergent and computationally efficient method for reliability analysis in finite element based engineering practice.
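
    The perturbation-based reliability and sensitivity step can be sketched with a first-order second-moment calculation (the limit state and numbers below are assumptions, not the authors' CMC model):

```python
import math

def fosm_reliability(g, means, sds, h=1e-6):
    """First-order second-moment sketch: linearize the limit state g(x)
    at the means via finite-difference perturbation, then return the
    reliability index beta and the d(beta)/d(mean_i) sensitivities."""
    g0 = g(means)
    grads = []
    for i in range(len(means)):
        x = list(means)
        x[i] += h
        grads.append((g(x) - g0) / h)
    sigma_g = math.sqrt(sum((gi * si) ** 2 for gi, si in zip(grads, sds)))
    return g0 / sigma_g, [gi / sigma_g for gi in grads]

# Assumed limit state: strength minus stress (g > 0 is safe).
beta, sens = fosm_reliability(lambda x: x[0] - x[1], [400.0, 250.0], [30.0, 40.0])
print(round(beta, 3), [round(s, 4) for s in sens])
```

    The sensitivities show which input means most influence reliability, which is exactly the information an RSBDO loop feeds back into the optimizer.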

  10. Designing for fiber composite structural durability in hygrothermomechanical environment

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1985-01-01

    A methodology is described which can be used to design/analyze fiber composite structures subjected to complex hygrothermomechanical environments. This methodology includes composite mechanics and advanced structural analysis methods (finite element). Select examples are described to illustrate the application of the available methodology. The examples include: (1) composite progressive fracture; (2) composite design for high cycle fatigue combined with hot-wet conditions; and (3) general laminate design.

  11. Navy Community of Practice for Programmers and Developers

    DTIC Science & Technology

    2016-12-01

    execute cyber missions. The methodology employed in this research is human-centered design via a social interaction prototype, which allows us to learn...for Navy programmers and developers. Chapter V details the methodology used to design the proposed CoP. This chapter summarizes the results from...thirty years the term has evolved to incorporate ideas from numerous design methodologies and movements [57]. In the 1980s, revealed design began to

  12. Toward a Formal Model of the Design and Evolution of Software

    DTIC Science & Technology

    1988-12-20

    should have the flexibility to support a variety of design methodologies, be comprehensive enough to encompass the gamut of software lifecycle...the future. It should have the flexibility to support a variety of design methodologies, be comprehensive enough to encompass the gamut of software...variety of design methodologies, be comprehensive enough to encompass the gamut of software lifecycle activities, and be precise enough to provide the

  13. Reduced order modeling and active flow control of an inlet duct

    NASA Astrophysics Data System (ADS)

    Ge, Xiaoqing

    Many aerodynamic applications require the modeling of compressible flows in or around a body, e.g., the design of aircraft, inlet or exhaust ducts, wind turbines, or tall buildings. Traditional methods use wind tunnel experiments and computational fluid dynamics (CFD) to investigate the spatial and temporal distribution of the flows. Although they provide a great deal of insight into the essential characteristics of the flow field, they are not suitable for control analysis and design due to the high physical/computational cost. Many model reduction methods have been studied to reduce the complexity of the flow model. There are two main approaches: linearization-based input/output modeling and proper orthogonal decomposition (POD) based model reduction. The former captures mostly the local behavior near a steady state, which is suitable for modeling laminar flow dynamics. The latter obtains a reduced order model by projecting the governing equation onto an "optimal" subspace and is able to model complex nonlinear flow phenomena. In this research we investigate various model reduction approaches and compare them in flow modeling and control design. We propose an integrated model-based control methodology and apply it to the reduced order modeling and active flow control of compressible flows within a very aggressive (length to exit diameter ratio, L/D, of 1.5) inlet duct and its upstream contraction section. The approach systematically applies reduced order modeling, estimator design, sensor placement and control design to improve the aerodynamic performance. The main contribution of this work is the development of a hybrid model reduction approach that attempts to combine the best features of input/output model identification and the POD method. We first identify a linear input/output model by using a subspace algorithm. We next project the difference between the CFD response and the identified model response onto a set of POD basis functions.
This trajectory is fit to a nonlinear dynamical model to augment the linear input/output model. Thus, the full system is decomposed into a dominant linear subsystem and a low order nonlinear subsystem. The hybrid model is then used for control design and compared with other modeling methods in CFD simulations. Numerical results indicate that the hybrid model accurately predicts the nonlinear behavior of the flow for a 2D diffuser contraction section model. It also performs best in terms of feedback control design and learning control. Since some outputs of interest (e.g., the AIP pressure recovery) are not observable during normal operations, static and dynamic estimators are designed to recreate the information from available sensor measurements. The latter also provides a state estimate for the feedback controller. Based on the reduced order models and estimators, different controllers are designed to improve the aerodynamic performance of the contraction section and inlet duct. The integrated control methodology is evaluated with CFD simulations. Numerical results demonstrate the feasibility and efficacy of active flow control based on reduced order models. Our reduced order models not only generate a good approximation of the nonlinear flow dynamics over a wide input range, but also help to design controllers that significantly improve the flow response. The tools developed for model reduction, estimator and control design can also be applied to wind tunnel experiments.
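
    The POD step of such a hybrid approach can be sketched with the method of snapshots (toy data; the full methodology couples this with subspace identification of the linear part):

```python
import math

def dominant_pod_mode(snapshots, iters=200):
    """Method of snapshots: the leading POD mode is the dominant
    eigenvector of the snapshot correlation matrix (found here by power
    iteration), lifted back to state space and normalized."""
    m = len(snapshots)
    corr = [[sum(a * b for a, b in zip(snapshots[i], snapshots[j]))
             for j in range(m)] for i in range(m)]
    v = [1.0] * m
    for _ in range(iters):
        w = [sum(corr[i][j] * v[j] for j in range(m)) for i in range(m)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    mode = [sum(v[k] * snapshots[k][i] for k in range(m))
            for i in range(len(snapshots[0]))]
    norm = math.sqrt(sum(x * x for x in mode))
    return [x / norm for x in mode]

# Toy snapshots clustered around the direction (1, 0, 1):
mode = dominant_pod_mode([[1.0, 0.0, 1.0], [2.0, 0.1, 2.0], [1.5, -0.1, 1.5]])
print([round(x, 3) for x in mode])
```

    Projecting the residual between the CFD response and the identified linear model onto a few such modes yields the low-order nonlinear subsystem described above.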

  14. Applying CBR to machine tool product configuration design oriented to customer requirements

    NASA Astrophysics Data System (ADS)

    Wang, Pengjia; Gong, Yadong; Xie, Hualong; Liu, Yongxian; Nee, Andrew Yehching

    2017-01-01

    Product customization is a trend in the current market-oriented manufacturing environment. However, the deduction from customer requirements to design results, and the evaluation of design alternatives, are still heavily reliant on the designer's experience and knowledge. To handle the fuzziness and uncertainty of customer requirements in product configuration, an analysis method based on the grey rough model is presented, by which customer requirements can be converted into technical characteristics effectively. In addition, an optimization decision model for product planning is established to help enterprises select, under cost and time constraints, the key technical characteristics that maximize customer satisfaction. A new case retrieval approach that combines the self-organizing map and the fuzzy similarity priority ratio method is proposed for case-based design: the self-organizing map reduces the retrieval range and increases retrieval efficiency, while the fuzzy similarity priority ratio method evaluates the similarity of cases comprehensively. To ensure that the final case has the best overall performance, an evaluation method for similar cases based on grey correlation analysis is proposed to select the most suitable case. Furthermore, a computer-aided system is developed using the MATLAB GUI to assist product configuration design. An actual example on an ETC series machine tool shows that the proposed method is effective, rapid and accurate in the product configuration process. The proposed methodology provides detailed guidance for product configuration design oriented to customer requirements.
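
    The grey correlation evaluation of retrieved cases can be sketched as follows (attribute names, normalization, and numbers are hypothetical; the distinguishing coefficient rho = 0.5 is the conventional default):

```python
def grey_relational_grades(cases, ideal, rho=0.5):
    """Grey correlation analysis sketch: grade each retrieved case against
    the ideal over normalized attributes; a higher grade is a better match."""
    deltas = [[abs(c[k] - ideal[k]) for k in range(len(ideal))] for c in cases]
    dmin = min(min(row) for row in deltas)
    dmax = max(max(row) for row in deltas)
    return [sum((dmin + rho * dmax) / (d + rho * dmax) for d in row) / len(row)
            for row in deltas]

# Hypothetical cases scored on normalized [cost, time, accuracy] attributes:
ideal = [1.0, 1.0, 1.0]
cases = [[0.9, 0.8, 0.95], [0.6, 0.9, 0.7], [0.95, 0.9, 0.9]]
grades = grey_relational_grades(cases, ideal)
best = max(range(len(cases)), key=grades.__getitem__)
print([round(g, 3) for g in grades], best)
```

    The case with the highest grade is carried forward as the configuration starting point, which is the role this evaluation plays after SOM-based retrieval.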

  15. Haptic Technologies for MEMS Design

    NASA Astrophysics Data System (ADS)

    Calis, Mustafa; Desmulliez, Marc P. Y.

    2006-04-01

    This paper presents for the first time a design methodology for MEMS/NEMS based on haptic sensing technologies. The software tool created as a result of this methodology will enable designers to model and interact in real time with their virtual prototypes. One of the main advantages of haptic sensing is the ability to bring unusual microscopic forces back to the designer's world. Other significant benefits of developing such a methodology include gains in productivity and the capability to include manufacturing costs within the design cycle.

  16. Proceedings of the Seminar on the DOD Computer Security Initiative (4th) Held at the National Bureau of Standards, Gaithersburg, Maryland on August 10-12, 1981.

    DTIC Science & Technology

    1981-01-01

    comparison of formal and informal design methodologies will show how we think they are converging. Lastly, I will describe our involvement with the DoD...computer security must begin with the design methodology, with the objective being provability. The idea of a formal evaluation and on-the-shelf... Methodologies ] Here we can compare the formal design methodologies with those used by informal practitioners like Control Data. Obviously, both processes

  17. Designing a Strategic Plan through an Emerging Knowledge Generation Process: The ATM Experience

    ERIC Educational Resources Information Center

    Zanotti, Francesco

    2012-01-01

    Purpose: The aim of this contribution is to describe a new methodology for designing strategic plans and how it was implemented by ATM, a public transportation agency based in Milan, Italy. Design/methodology/approach: This methodology is founded on a new system theory, called "quantum systemics". It is based on models and metaphors both…

  18. TRAC Innovative Visualization Techniques

    DTIC Science & Technology

    2016-11-14

    Therefore, TRAC analysts need a way to analyze the effectiveness of their visualization design choices. Currently, TRAC does not have a methodology ...to analyze visualizations used to support an analysis story. Our research team developed a visualization design methodology to create effective...visualizations that support an analysis story. First, we based our methodology on the latest research on design thinking, cognitive learning, and

  19. Integrated Controls-Structures Design Methodology: Redesign of an Evolutionary Test Structure

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Elliot, Kenny B.; Joshi, Suresh M.

    1997-01-01

    An optimization-based integrated controls-structures design methodology for a class of flexible space structures is described, and the Phase-0 Controls-Structures-Integration (CSI) Evolutionary Model (CEM), a laboratory testbed at NASA Langley, is redesigned using this integrated design methodology. The integrated controls-structures design is posed as a nonlinear programming problem to minimize the control effort required to maintain a specified line-of-sight pointing performance under persistent white noise disturbance. Static and dynamic dissipative control strategies are employed for feedback control, and the parameters of these controllers are treated as the control design variables. The sizes of strut elements in various sections of the CEM are used as the structural design variables. Design guides for the struts are developed and employed in the integrated design process to ensure that the redesigned structure can be effectively fabricated. The superiority of the integrated design methodology over the conventional design approach is demonstrated analytically by a significant reduction in the average control power needed to maintain the specified pointing performance with the integrated approach.
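
    A toy version of the integrated idea (all functional forms and numbers below are assumptions, not the CEM model): choosing structural stiffness and control gain together, rather than sequentially, lets the optimizer trade structural mass against control effort while the pointing constraint stays active.

```python
def pointing_error(k, g):
    """Assumed closed-loop pointing response for stiffness k and gain g."""
    return 1.0 / (k + 4.0 * g)

def design_cost(k, g):
    """Control effort plus a structural-mass proxy (both forms assumed)."""
    return g * g / k + 0.05 * k

REQ = 0.05  # required pointing performance (assumed)

def min_gain(k):
    """Smallest gain that meets the pointing requirement for stiffness k."""
    return max((1.0 / REQ - k) / 4.0, 0.0)

# Integrated design: search structure and controller jointly; for each k
# the minimum-gain controller satisfies pointing_error(k, g) == REQ.
cost, k_opt = min((design_cost(k, min_gain(k)), k) for k in range(1, 20))
print(k_opt, round(cost, 3))
```

    Neither the stiffest structure nor the cheapest controller wins on its own; the joint search lands at an interior trade-off, which is the qualitative point of the integrated methodology.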

  20. Space Engineering Projects in Design Methodology

    NASA Technical Reports Server (NTRS)

    Crawford, R.; Wood, K.; Nichols, S.; Hearn, C.; Corrier, S.; DeKunder, G.; George, S.; Hysinger, C.; Johnson, C.; Kubasta, K.

    1993-01-01

    NASA/USRA is an ongoing sponsor of space design projects in the senior design courses of the Mechanical Engineering Department at The University of Texas at Austin. This paper describes the UT senior design sequence, focusing on the first-semester design methodology course. The philosophical basis and pedagogical structure of this course are summarized. A history of the Department's activities in the Advanced Design Program is then presented. The paper includes a summary of the projects completed during the 1992-93 academic year in the methodology course, and concludes with examples of two projects completed by student design teams.

  1. Preparation of modified semi-coke by microwave heating and adsorption kinetics of methylene blue.

    PubMed

    Wang, Xin; Peng, Jin-Hui; Duan, Xin-Hui; Srinivasakannan, Chandrasekar

    2013-01-01

    Modified semi-coke has been prepared from virgin semi-coke by microwave heating, using phosphoric acid as the modifying agent. Process optimization using a Central Composite Design (CCD) of the Response Surface Methodology (RSM) for the preparation of modified semi-coke is presented in this paper. The optimum conditions for producing modified semi-coke were: phosphoric acid concentration 2.04, heating time 20 minutes and temperature 587 degrees C, giving an optimum iodine number of 862 mg/g and a yield of 47.48%. The textural characteristics of the modified semi-coke were analyzed using scanning electron microscopy (SEM) and the nitrogen adsorption isotherm. The BET surface area of the modified semi-coke was estimated to be 989.60 m2/g, with a pore volume of 0.74 cm3/g and a pore diameter of 3.009 nm, the micro-pore volume contributing 62.44%. The Methylene Blue monolayer adsorption capacity was found to be mg/g at K. The adsorption capacity of the modified semi-coke highlights its suitability for liquid phase adsorption applications, with potential usage in waste water treatment.
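
    The RSM optimization step can be illustrated on a one-factor slice (iodine-number responses below are hypothetical; a real CCD fits a multi-factor quadratic by least squares over all design points):

```python
def quadratic_fit(points):
    """Exact quadratic y = a*x**2 + b*x + c through three design points."""
    (x1, y1), (x2, y2), (x3, y3) = points
    denom = (x1 - x2) * (x1 - x3) * (x2 - x3)
    a = (x3 * (y2 - y1) + x2 * (y1 - y3) + x1 * (y3 - y2)) / denom
    b = (x3 * x3 * (y1 - y2) + x2 * x2 * (y3 - y1)
         + x1 * x1 * (y2 - y3)) / denom
    c = (x2 * x3 * (x2 - x3) * y1 + x3 * x1 * (x3 - x1) * y2
         + x1 * x2 * (x1 - x2) * y3) / denom
    return a, b, c

# Hypothetical iodine-number responses at three heating temperatures (deg C):
a, b, c = quadratic_fit([(500, 800), (587, 862), (650, 830)])
t_opt = -b / (2 * a)  # stationary point; a < 0, so it is a maximum
print(round(t_opt, 1))
```

    Setting the fitted surface's derivative to zero recovers the optimum operating point, the same reasoning that yields the reported optimum temperature.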

  2. Synthetic antimicrobial peptides as agricultural pesticides for plant-disease control.

    PubMed

    Montesinos, Emilio; Bardají, Eduard

    2008-07-01

    There is a need for antimicrobial compounds in agriculture for plant-disease control, with low toxicity and reduced negative environmental impact. Antimicrobial peptides are produced by living organisms and offer strong possibilities in agriculture because new compounds can be developed based on natural structures with improved activity, specificity, biodegradability, and toxicity profiles. Design of new molecules has been achieved using combinatorial-chemistry procedures coupled to high-throughput screening systems and data processing with design-of-experiments (DOE) methodology to obtain QSAR equation models and optimized compounds. Upon selection of the best candidates with low cytotoxicity and moderate stability to protease digestion, anti-infective activity has been evaluated in plant-pathogen model systems. Suitable compounds have been submitted to acute toxicity testing in higher organisms and exhibited a low toxicity profile in a mouse model. Large-scale production can be achieved by solution organic or chemoenzymatic procedures in the case of very small peptides, but, in many cases, production can be performed by biotechnological methods using genetically modified microorganisms (fermentation) or transgenic crops (plant biofactories).
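
    A DOE screening round of the kind described can be sketched with a two-level full factorial (the factor names and levels below are hypothetical peptide descriptors, chosen only to show the mechanics):

```python
from itertools import product

def full_factorial(factors):
    """Two-level full factorial design: one run per combination of levels."""
    names = list(factors)
    return [dict(zip(names, combo))
            for combo in product(*(factors[n] for n in names))]

# Hypothetical peptide-design factors for a screening round:
design = full_factorial({"charge": (2, 5),
                         "length": (8, 11),
                         "hydrophobicity": (0.3, 0.6)})
print(len(design), design[0])
```

    The 2^3 = 8 runs span the design space evenly, and the measured activities from such runs are what a QSAR model is then regressed against.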

  3. Sediment transport measurements: Chapter 5

    USGS Publications Warehouse

    Diplas, P.; Kuhnle, R.; Gray, J.; Glysson, D.; Edwards, T.; García, Marcelo H.

    2008-01-01

    Sediment erosion, transport, and deposition in fluvial systems are complex processes that are treated in detail in other sections of this book. Development of methods suitable for the collection of data that contribute to understanding these processes is a still-evolving science. Sediment and ancillary data are fundamental requirements for the proper management of river systems, including the design of structures, the determination of aspects of stream behavior, ascertaining the probable effect of removing an existing structure, estimation of bulk erosion, transport, and sediment delivery to the oceans, ascertaining the long-term usefulness of reservoirs and other public works, tracking movement of solid-phase contaminants, restoration of degraded or otherwise modified streams, and assistance in the calibration and validation of numerical models. This chapter presents techniques for measuring bed-material properties and suspended and bed-load discharges. Well-established and relatively recent, yet adequately tested, sampling equipment and methodologies, with designs that are guided by sound physical and statistical principles, are described. Where appropriate, the theory behind the development of the equipment and guidelines for its use are presented.

  4. Collaboration across eight research centers: unanticipated benefits and outcomes for project managers.

    PubMed

    Perez, Norma A; Weathers, Benita; Willis, Marilyn; Mendez, Jacqueline

    2013-02-01

    Managers of transdisciplinary collaborative research lack suitable didactic material to support the implementation of research methodologies and to build ongoing partnerships with community representatives and peers, both between and within multiple academic centers. This article will provide insight on the collaborative efforts of project managers involved in multidisciplinary research and their subsequent development of a tool kit for research project managers and/or directors. Project managers from the 8 Centers for Population Health and Health Disparities across the nation participated in monthly teleconferences to share experiences and offer advice on how to achieve high participation rates and maintain community involvement in collaboration with researchers and community leaders to achieve the common goal of decreasing health inequities. In the process, managers recognized and seized the opportunity to produce a tool kit that was designed for future project managers and directors. Project managers in geographically distinct locations maintained a commitment to work together over 4 years and subsequently built upon an existing communications network to design a tool kit that could be disseminated easily to a diverse audience.

  5. Transmission Index Research of Parallel Manipulators Based on Matrix Orthogonal Degree

    NASA Astrophysics Data System (ADS)

    Shao, Zhu-Feng; Mo, Jiao; Tang, Xiao-Qiang; Wang, Li-Ping

    2017-11-01

    A performance index is the standard of performance evaluation, and is the foundation of both performance analysis and optimal design for a parallel manipulator. Finding suitable kinematic indices has long been an important and challenging issue for parallel manipulators. Despite extensive studies in this field, few existing indices meet all the requirements of being simple, intuitive, and universal. To address this problem, the matrix orthogonal degree is adopted, and generalized transmission indices that can evaluate the motion/force transmissibility of fully parallel manipulators are proposed. Transmission performance analysis of typical branches, end effectors, and parallel manipulators is given to illustrate the proposed indices and analysis methodology. Simulation and analysis results reveal that the proposed transmission indices possess significant advantages: they are normalized and finite (ranging from 0 to 1), dimensionally homogeneous, frame-free, intuitive, and easy to calculate. Moreover, the proposed indices indicate the good-transmission region and closeness to singularity with better resolution than the traditional local conditioning index, and provide a novel tool for the kinematic analysis and optimal design of fully parallel manipulators.
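
    One simple notion of a matrix "orthogonal degree", sketched here for a 2x2 matrix, is the Hadamard ratio: |det| divided by the product of row norms. This is an illustration of the kind of normalized, frame-free quantity involved, not necessarily the authors' exact definition.

```python
import math

def orthogonal_degree_2x2(rows):
    """Hadamard ratio of a 2x2 matrix: |det| over the product of row
    norms. It lies in [0, 1]: 1 for mutually orthogonal rows, 0 when
    the matrix is singular."""
    (a, b), (c, d) = rows
    norms = math.hypot(a, b) * math.hypot(c, d)
    return abs(a * d - b * c) / norms if norms else 0.0

print(orthogonal_degree_2x2([(1, 0), (0, 1)]))     # orthogonal rows
print(orthogonal_degree_2x2([(1, 0), (1, 0.1)]))   # nearly parallel rows
```

    Applied to a manipulator Jacobian, the measure drops toward zero as the rows become dependent, i.e., as the mechanism approaches a singularity, which is the behavior a transmission index is meant to resolve.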

  6. Adjoint-Based Mesh Adaptation for the Sonic Boom Signature Loudness

    NASA Technical Reports Server (NTRS)

    Rallabhandi, Sriram K.; Park, Michael A.

    2017-01-01

    The mesh adaptation functionality of FUN3D is utilized to obtain a mesh optimized to calculate sonic boom ground signature loudness. During this process, the coupling between the discrete adjoints of the computational fluid dynamics tool FUN3D and the atmospheric propagation tool sBOOM is exploited to form the error estimate. This new mesh adaptation methodology will allow generation of suitable meshes adapted to reduce the estimated errors in the ground loudness, which is an optimization metric employed in supersonic aircraft design. This new output-based adaptation could allow new insights into meshing for sonic boom analysis and design, and complements existing output-based adaptation techniques such as adaptation to reduce estimated errors in the off-body pressure functional. This effort could also have implications for other coupled multidisciplinary adjoint capabilities (e.g., aeroelasticity) as well as inclusion of propagation-specific parameters such as prevailing winds or non-standard atmospheric conditions. Results are discussed in the context of existing methods and appropriate conclusions are drawn as to the efficacy and efficiency of the developed capability.
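
    The core of output-based adaptation can be sketched as adjoint-weighted residual indicators (per-cell values below are hypothetical; FUN3D's actual estimator is far more involved):

```python
def cells_to_refine(residuals, adjoints, fraction=0.3):
    """Adjoint-weighted error indicators: rank cells by |residual *
    adjoint| and mark the top fraction for refinement."""
    indicators = [abs(r * a) for r, a in zip(residuals, adjoints)]
    k = max(1, int(fraction * len(indicators)))
    ranked = sorted(range(len(indicators)),
                    key=indicators.__getitem__, reverse=True)
    return sorted(ranked[:k])

# Hypothetical per-cell flow residuals and loudness-adjoint magnitudes:
marks = cells_to_refine([0.1, 0.5, 0.02, 0.3], [0.2, 0.1, 3.0, 0.4])
print(marks)
```

    Note that the cell with the largest raw residual is not marked; the adjoint weighting targets the cells whose errors actually influence the output metric, here the ground loudness.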

  7. Using High-Dimensional Image Models to Perform Highly Undetectable Steganography

    NASA Astrophysics Data System (ADS)

    Pevný, Tomáš; Filler, Tomáš; Bas, Patrick

    This paper presents a complete methodology for designing practical and highly undetectable stegosystems for real digital media. The main design principle is to minimize a suitably defined distortion by means of an efficient coding algorithm. The distortion is defined as a weighted difference of extended state-of-the-art feature vectors already used in steganalysis. This allows us to "preserve" the model used by the steganalyst and thus remain undetectable even for large payloads. The framework can be efficiently implemented even when the dimensionality of the feature set used by the embedder is larger than 10^7. The high-dimensional model is necessary to avoid known security weaknesses. Although high-dimensional models can be a problem in steganalysis, we explain why they are acceptable in steganography. As an example, we introduce HUGO, a new embedding algorithm for spatial-domain digital images, and contrast its performance with LSB matching. On the BOWS2 image database, HUGO allows the embedder to hide a message 7× longer than LSB matching at the same level of security.
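
    The LSB-matching baseline that HUGO is compared against is easy to sketch (8-bit grayscale pixel values assumed):

```python
import random

def lsb_match_embed(pixels, bits, seed=7):
    """LSB matching: when a pixel's least-significant bit disagrees with
    the message bit, add or subtract 1 at random (clamped at 0 and 255)
    rather than flipping the LSB directly."""
    rng = random.Random(seed)
    out = list(pixels)
    for i, bit in enumerate(bits):
        if out[i] & 1 != bit:
            step = rng.choice((-1, 1))
            if out[i] == 0:
                step = 1
            elif out[i] == 255:
                step = -1
            out[i] += step
    return out

cover = [12, 255, 0, 101, 78]
message = [1, 0, 1, 1, 0]
stego = lsb_match_embed(cover, message)
print(stego)
print([p & 1 for p in stego] == message)  # receiver simply reads the LSBs
```

    Every change is ±1, which is what makes the scheme hard to see in pixel histograms; HUGO's advance is to choose *where* such changes land by minimizing a feature-space distortion instead of changing pixels independently.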

  8. The normative background of empirical-ethical research: first steps towards a transparent and reasoned approach in the selection of an ethical theory.

    PubMed

    Salloch, Sabine; Wäscher, Sebastian; Vollmann, Jochen; Schildmann, Jan

    2015-04-04

    Empirical-ethical research constitutes a relatively new field which integrates socio-empirical research and normative analysis. As direct inferences from descriptive data to normative conclusions are problematic, an ethical framework is needed to determine the relevance of the empirical data for normative argument. While issues of normative-empirical collaboration and questions of empirical methodology have been widely discussed in the literature, the normative methodology of empirical-ethical research has seldom been addressed. Based on our own research experience, we discuss one aspect of this normative methodology, namely the selection of an ethical theory serving as a background for empirical-ethical research. Whereas criteria for a good ethical theory in philosophical ethics are usually related to inherent aspects, such as the theory's clarity or coherence, additional points have to be considered in the field of empirical-ethical research. Three of these additional criteria will be discussed in the article: (a) the adequacy of the ethical theory for the issue at stake, (b) the theory's suitability for the purposes and design of the empirical-ethical research project, and (c) the interrelation between the ethical theory selected and the theoretical backgrounds of the socio-empirical research. Using the example of our own study on the development of interventions which support clinical decision-making in oncology, we will show how the selection of an ethical theory as a normative background for empirical-ethical research can proceed. We will also discuss the limitations of the procedures chosen in our project. The article stresses that a systematic and reasoned approach towards theory selection in empirical-ethical research should be given priority rather than an accidental or implicit way of choosing the normative framework for one's own research. 
It furthermore shows that the overall design of an empirical-ethical study is a multi-faceted endeavor which has to balance between theoretical and pragmatic considerations.

  9. Synthesis of deoxyribonucleotidyl(3'-5')arabinonucleosides

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gray, S.H.; Ainsworth, C.F.; Bell, C.L.

    Two different synthetic routes using phosphotriester methodology have been utilized to prepare deoxyribonucleotidyl(3'-5')arabinonucleosides containing 9-β-D-arabinofuranosyladenine (ara-A, vidarabine) and 1-β-D-arabinofuranosylcytosine (ara-C, cytarabine) at the 3'-terminus in amounts and purity (greater than 95%) suitable for NMR analysis.

  10. A molecular dynamics description of the conformational flexibility of the L-iduronate ring in glycosaminoglycans.

    PubMed

    Angulo, Jesús; Nieto, Pedro M; Martín-Lomas, Manuel

    2003-07-07

    For a synthetic hexasaccharide model it is shown that the conformational flexibility of the L-iduronate ring in glycosaminoglycans can be adequately described by using the PME methodology together with simulation protocols suitable for highly charged systems.

  11. Recent advances in live cell imaging of hepatoma cells

    PubMed Central

    2014-01-01

    Live cell imaging enables the study of dynamic processes of living cells in real time by use of suitable reporter proteins and the staining of specific cellular structures and/or organelles. With the availability of advanced optical devices and improved cell culture protocols, it has become a rapidly growing research methodology. The success of this technique relies mainly on the selection of suitable reporter proteins, the construction of recombinant plasmids possessing cell-type-specific promoters, as well as reliable methods of gene transfer. This review aims to provide an overview of the recent developments in the field of marker proteins (bioluminescent and fluorescent) and methodologies (fluorescence resonance energy transfer, fluorescence recovery after photobleaching and proximity ligation assay) employed to achieve improved imaging of biological processes in hepatoma cells. Moreover, different expression systems of marker proteins and the modes of gene transfer are discussed with emphasis on the study of lipid droplet formation in hepatocytes as an example. PMID:25005127

  12. Pattern recognition of satellite cloud imagery for improved weather prediction

    NASA Technical Reports Server (NTRS)

    Gautier, Catherine; Somerville, Richard C. J.; Volfson, Leonid B.

    1986-01-01

    The major accomplishment was the successful development of a method for extracting time derivative information from geostationary meteorological satellite imagery. This research is a proof-of-concept study which demonstrates the feasibility of using pattern recognition techniques and a statistical cloud classification method to estimate time rate of change of large-scale meteorological fields from remote sensing data. The cloud classification methodology is based on typical shape function analysis of parameter sets characterizing the cloud fields. The three specific technical objectives, all of which were successfully achieved, are as follows: develop and test a cloud classification technique based on pattern recognition methods, suitable for the analysis of visible and infrared geostationary satellite VISSR imagery; develop and test a methodology for intercomparing successive images using the cloud classification technique, so as to obtain estimates of the time rate of change of meteorological fields; and implement this technique in a testbed system incorporating an interactive graphics terminal to determine the feasibility of extracting time derivative information suitable for comparison with numerical weather prediction products.

  13. Synthesis of water suitable as the MEPC.174(58) G8 influent water for testing ballast water management systems.

    PubMed

    D'Agostino, Fabio; Del Core, Marianna; Cappello, Simone; Mazzola, Salvatore; Sprovieri, Mario

    2015-10-01

    Here, we describe the methodologies adopted to ensure that natural seawater, used as "influent water" for the land test, complies with the requirements that must be fulfilled to demonstrate the efficacy of a new ballast water treatment system (BWTS). The new BWTS was located on the coast of SW Sicily (Italy), and the sampled seawater showed bacteria and plankton concentrations two orders of magnitude lower than required. Integrated approaches for the preparation of massive cultures of bacteria (Alcanivorax borkumensis and Marinobacter hydrocarbonoclasticus), algae (Tetraselmis suecica), rotifers (Brachionus plicatilis), and crustaceans (Artemia salina), suitable to ensure that 200 m³ of water fulfilled the international guidelines of MEPC.174(58) G8, are described here. These methodologies allowed us to prepare the "influent water" in good agreement with the guidelines and without the specific problems arising from natural conditions (seasons, weather, etc.) which significantly affect the concentrations of organisms at sea. This approach also offered the chance to reliably run land tests once every two weeks.

  14. Statistical assessment of dumpsite soil suitability to enhance methane bio-oxidation under interactive influence of substrates and temperature.

    PubMed

    Bajar, Somvir; Singh, Anita; Kaushik, C P; Kaushik, Anubha

    2017-05-01

    Biocovers are considered the most effective and efficient way to treat methane (CH₄) emissions from dumpsites and landfills. Active methanotrophs in the biocovers play a crucial role in reducing emissions through microbiological methane oxidation. Several factors affecting methane bio-oxidation (MOX) have been well documented; however, their interactive effect on the oxidation process needs to be explored. Therefore, the present study was undertaken to investigate the suitability of a dumpsite soil to be employed as biocover under the influence of substrate concentrations (CH₄ and O₂) and temperature at variable incubation periods. A statistical design matrix from Response Surface Methodology (RSM) revealed that a MOX rate of up to 69.58 μg CH₄ g⁻¹ dw h⁻¹ could be achieved under optimum conditions. MOX was found to be more dependent on CH₄ concentration at the higher level (30-40%, v/v) than on O₂ concentration. However, unlike other studies, MOX was found to be directly proportional to temperature within the range of 25-35°C. The results obtained with the dumpsite soil biocover open up a new possibility to provide improved, sustained, and environmentally friendly systems to control even high CH₄ emissions from the waste sector. Copyright © 2017 Elsevier Ltd. All rights reserved.
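    The core RSM computation in studies like this one — fitting a second-order polynomial to a designed experiment and solving for the stationary point — can be sketched as follows. The factor levels, coefficients, and response values below are synthetic illustrations, not the study's data.

    ```python
    import numpy as np

    # A 3x3 full-factorial design in two coded factors (e.g., CH4 level and
    # temperature), with a synthetic quadratic response playing the role of
    # the measured oxidation rate.
    levels = np.array([-1.0, 0.0, 1.0])
    X1, X2 = np.meshgrid(levels, levels)
    x1, x2 = X1.ravel(), X2.ravel()
    y = 60 + 5 * x1 + 3 * x2 - 4 * x1**2 - 2 * x2**2 + 1.5 * x1 * x2

    # Fit the full second-order response-surface model by least squares.
    A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    b0, b1, b2, b11, b22, b12 = np.linalg.lstsq(A, y, rcond=None)[0]

    # Stationary point of the fitted surface: solve the gradient equations.
    H = np.array([[2 * b11, b12], [b12, 2 * b22]])
    optimum = np.linalg.solve(H, [-b1, -b2])
    print(optimum)  # coded factor settings of the predicted optimum
    ```

    Because the negative-definite quadratic terms dominate, the stationary point here is a maximum of the fitted surface, mirroring how an optimum MOX rate is located in coded factor space.
    
    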

  15. A Methodology for the Assessment of Experiential Learning Lean: The Lean Experience Factory Study

    ERIC Educational Resources Information Center

    De Zan, Giovanni; De Toni, Alberto Felice; Fornasier, Andrea; Battistella, Cinzia

    2015-01-01

    Purpose: The purpose of this paper is to present a methodology to assess the experiential learning processes of learning lean in an innovative learning environment: the lean model factories. Design/methodology/approach: A literature review on learning and lean management literatures was carried out to design the methodology. Then, a case study…

  16. Solar energy program evaluation: an introduction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    deLeon, P.

    The Program Evaluation Methodology provides an overview of the practice and methodology of program evaluation and defines more precisely the evaluation techniques and methodologies that would be most appropriate to government organizations which are actively involved in the research, development, and commercialization of solar energy systems. Formal evaluation cannot be treated as a single methodological approach for assessing a program. There are four basic types of evaluation designs: the pre-experimental design; the quasi-experimental design based on time series; the quasi-experimental design based on comparison groups; and the true experimental design. This report is organized to first introduce the role and issues of evaluation. This is to provide a set of issues to organize the subsequent sections detailing the national solar energy programs. Then, these two themes are integrated by examining the evaluation strategies and methodologies tailored to fit the particular needs of the various individual solar energy programs. (MCW)

  17. DESIGN METHODOLOGIES AND TOOLS FOR SINGLE-FLUX QUANTUM LOGIC CIRCUITS

    DTIC Science & Technology

    2017-10-01

    DESIGN METHODOLOGIES AND TOOLS FOR SINGLE-FLUX QUANTUM LOGIC CIRCUITS. University of Southern California, October 2017, final technical report (contract number FA8750-15-C-0203). The objective of this project was to investigate the state-of-the-art in design and optimization of single-flux quantum (SFQ) logic circuits, e.g., RSFQ and ERSFQ

  18. Enhancing the Front-End Phase of Design Methodology

    ERIC Educational Resources Information Center

    Elias, Erasto

    2006-01-01

    Design methodology (DM) is defined by the procedural path, expressed in design models, and techniques or methods used to untangle the various activities within a design model. Design education in universities is mainly based on descriptive design models. Much knowledge and organization have been built into DM to facilitate design teaching.…

  19. Introducing Hurst exponent in pair trading

    NASA Astrophysics Data System (ADS)

    Ramos-Requena, J. P.; Trinidad-Segovia, J. E.; Sánchez-Granero, M. A.

    2017-12-01

    In this paper we introduce a new methodology for pair trading. This new method is based on the calculation of the Hurst exponent of a pair. Our approach is inspired by the classical concepts of co-integration and mean reversion, but joins them under a single strategy. We show how the Hurst approach yields better results than the classical Distance Method and Correlation strategies in different scenarios. The results obtained show that this new methodology is consistent and suitable, reducing trading drawdown relative to the classical strategies and thereby achieving better performance.
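    The Hurst exponent at the heart of this strategy can be estimated in a few lines. The sketch below uses the common variance-of-lagged-differences estimator on synthetic series; the paper's exact estimator and trading rules are not reproduced here.

    ```python
    import numpy as np

    def hurst_exponent(series, max_lag=20):
        """Estimate the Hurst exponent via the variance-of-lagged-differences method.

        H < 0.5 suggests mean reversion, H ~ 0.5 a random walk,
        H > 0.5 a trending (persistent) series.
        """
        series = np.asarray(series, dtype=float)
        lags = np.arange(2, max_lag)
        # The standard deviation of lagged differences grows like lag**H
        tau = [np.std(series[lag:] - series[:-lag]) for lag in lags]
        slope, _intercept = np.polyfit(np.log(lags), np.log(tau), 1)
        return slope

    rng = np.random.default_rng(0)
    white_noise = rng.standard_normal(5000)  # i.i.d. "spread": strongly mean-reverting
    random_walk = np.cumsum(white_noise)     # integrated series: H near 0.5

    print(hurst_exponent(white_noise))  # close to 0
    print(hurst_exponent(random_walk))  # close to 0.5
    ```

    In a pair-trading context the estimator would be applied to the spread between the two assets: the lower the spread's H, the stronger the mean reversion it can be expected to exhibit.
    
    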

  20. Methodology of Numerical Optimization for Orbital Parameters of Binary Systems

    NASA Astrophysics Data System (ADS)

    Araya, I.; Curé, M.

    2010-02-01

    The use of a numerical method of maximization (or minimization) in optimization processes allows us to obtain a great number of solutions. Therefore, we can find a global maximum or minimum of the problem, but this is only possible if we use a suitable methodology. To obtain the global optimum values, we use the genetic algorithm PIKAIA (P. Charbonneau) and four other algorithms implemented in Mathematica. We demonstrate that the derived orbital parameters of binary systems published in some papers, based on radial velocity measurements, are local minima instead of global ones.
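    The local-versus-global distinction the authors highlight is easy to demonstrate. The sketch below pits SciPy's differential_evolution (a population-based global optimizer in the same spirit as the PIKAIA genetic algorithm) against a gradient-based local search on a standard multimodal test function; the objective is an illustrative stand-in, not an actual radial-velocity fit.

    ```python
    import numpy as np
    from scipy.optimize import differential_evolution, minimize

    # A multimodal chi-square-like objective (the Rastrigin function) standing in
    # for a radial-velocity fit: a single narrow global minimum at the origin
    # surrounded by many local minima.
    def objective(x):
        return np.sum(x**2) + 10.0 * np.sum(1.0 - np.cos(2.0 * np.pi * x))

    bounds = [(-5.12, 5.12)] * 2

    # A gradient-based local search started far from the optimum stalls in a
    # nearby local minimum...
    local = minimize(objective, x0=np.array([3.1, -2.9]))

    # ...while a population-based global search (in the spirit of PIKAIA)
    # locates the global one.
    best = differential_evolution(objective, bounds, seed=1)

    print(local.fun, best.fun)  # the local value is much larger than the global one
    ```

    The same failure mode is what produces published orbital parameters that are only locally optimal: a local fit converges to whichever basin of the chi-square surface its starting guess falls into.
    
    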

  1. Methodologies for Root Locus and Loop Shaping Control Design with Comparisons

    NASA Technical Reports Server (NTRS)

    Kopasakis, George

    2017-01-01

    This paper describes the basics of the root locus control design method as well as loop shaping, and establishes approaches to expedite the application of these two design methodologies so that control designs meeting requirements with superior performance can be obtained easily. The two design approaches are compared for their ability to meet control design specifications and for ease of application using control design examples. These approaches are also compared with traditional Proportional Integral Derivative (PID) control in order to demonstrate the limitations of PID control. Robustness of these designs is covered as it pertains to these control methodologies and to the example problems.
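    A minimal time-domain PID simulation of the kind used to evaluate such designs can be sketched as follows; the first-order plant and the gains are illustrative assumptions, not taken from the paper.

    ```python
    # Discrete-time PID loop on a first-order plant (forward Euler integration).
    def simulate_pid(kp, ki, kd, setpoint=1.0, dt=0.01, steps=2000, tau=0.5):
        y, integral, prev_err = 0.0, 0.0, setpoint
        history = []
        for _ in range(steps):
            err = setpoint - y
            integral += err * dt
            deriv = (err - prev_err) / dt
            u = kp * err + ki * integral + kd * deriv  # PID control law
            y += dt * (u - y) / tau  # plant: tau * dy/dt + y = u
            prev_err = err
            history.append(y)
        return history

    response = simulate_pid(kp=2.0, ki=1.0, kd=0.1)
    print(response[-1])  # settles at the setpoint; integral action removes offset
    ```

    Step responses like this one are the raw material for the comparisons the paper makes: specifications on rise time, overshoot, and steady-state error can all be read off `history`.
    
    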

  2. Regional health care planning: a methodology to cluster facilities using community utilization patterns

    PubMed Central

    2013-01-01

    Background Community-based health care planning and regulation necessitates grouping facilities and areal units into regions of similar health care use. Limited research has explored the methodologies used in creating these regions. We offer a new methodology that clusters facilities based on similarities in patient utilization patterns and geographic location. Our case study focused on Hospital Groups in Michigan, the allocation units used for predicting future inpatient hospital bed demand in the state’s Bed Need Methodology. The scientific, practical, and political concerns that were considered throughout the formulation and development of the methodology are detailed. Methods The clustering methodology employs a 2-step K-means + Ward’s clustering algorithm to group hospitals. The final number of clusters is selected using a heuristic that integrates both a statistical-based measure of cluster fit and characteristics of the resulting Hospital Groups. Results Using recent hospital utilization data, the clustering methodology identified 33 Hospital Groups in Michigan. Conclusions Despite being developed within the politically charged climate of Certificate of Need regulation, we have provided an objective, replicable, and sustainable methodology to create Hospital Groups. Because the methodology is built upon theoretically sound principles of clustering analysis and health care service utilization, it is highly transferable across applications and suitable for grouping facilities or areal units. PMID:23964905

  3. Regional health care planning: a methodology to cluster facilities using community utilization patterns.

    PubMed

    Delamater, Paul L; Shortridge, Ashton M; Messina, Joseph P

    2013-08-22

    Community-based health care planning and regulation necessitates grouping facilities and areal units into regions of similar health care use. Limited research has explored the methodologies used in creating these regions. We offer a new methodology that clusters facilities based on similarities in patient utilization patterns and geographic location. Our case study focused on Hospital Groups in Michigan, the allocation units used for predicting future inpatient hospital bed demand in the state's Bed Need Methodology. The scientific, practical, and political concerns that were considered throughout the formulation and development of the methodology are detailed. The clustering methodology employs a 2-step K-means + Ward's clustering algorithm to group hospitals. The final number of clusters is selected using a heuristic that integrates both a statistical-based measure of cluster fit and characteristics of the resulting Hospital Groups. Using recent hospital utilization data, the clustering methodology identified 33 Hospital Groups in Michigan. Despite being developed within the politically charged climate of Certificate of Need regulation, we have provided an objective, replicable, and sustainable methodology to create Hospital Groups. Because the methodology is built upon theoretically sound principles of clustering analysis and health care service utilization, it is highly transferable across applications and suitable for grouping facilities or areal units.
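    The two-step K-means + Ward's procedure described in the abstract can be sketched on toy data as follows. The synthetic "hospital" profiles, the micro-cluster count, and the final group count are illustrative assumptions, not the Michigan data.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage
    from scipy.cluster.vq import kmeans2

    rng = np.random.default_rng(42)
    # 60 synthetic "hospitals", each described by location plus patient-origin shares
    profiles = np.vstack([
        rng.normal(loc=center, scale=0.3, size=(20, 5))
        for center in ([0, 0, 1, 0, 0], [3, 3, 0, 1, 0], [6, 0, 0, 0, 1])
    ])

    # Step 1: K-means deliberately over-clusters into 12 micro-clusters.
    centroids, labels = kmeans2(profiles, k=12, minit="++", seed=42)

    # Step 2: Ward's hierarchical clustering merges the micro-cluster centroids
    # into the final 3 groups.
    merged = fcluster(linkage(centroids, method="ward"), t=3, criterion="maxclust")

    # Each hospital inherits the group of its micro-cluster.
    groups = merged[labels]
    print(len(set(groups)))
    ```

    In the paper the final number of clusters is not fixed in advance but chosen by a heuristic combining a statistical fit measure with properties of the resulting Hospital Groups; here it is simply set to 3 for illustration.
    
    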

  4. The Atomic Intrinsic Integration Approach: A Structured Methodology for the Design of Games for the Conceptual Understanding of Physics

    ERIC Educational Resources Information Center

    Echeverria, Alejandro; Barrios, Enrique; Nussbaum, Miguel; Amestica, Matias; Leclerc, Sandra

    2012-01-01

    Computer simulations combined with games have been successfully used to teach conceptual physics. However, there is no clear methodology for guiding the design of these types of games. To remedy this, we propose a structured methodology for the design of conceptual physics games that explicitly integrates the principles of the intrinsic…

  5. Methodology of Computer-Aided Design of Variable Guide Vanes of Aircraft Engines

    ERIC Educational Resources Information Center

    Falaleev, Sergei V.; Melentjev, Vladimir S.; Gvozdev, Alexander S.

    2016-01-01

    The paper presents a methodology which helps to avoid a great amount of costly experimental research. This methodology includes thermo-gas dynamic design of an engine and its mounts, the profiling of compressor flow path and cascade design of guide vanes. Employing a method elaborated by Howell, we provide a theoretical solution to the task of…

  6. Application of an Integrated Methodology for Propulsion and Airframe Control Design to a STOVL Aircraft

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay; Mattern, Duane

    1994-01-01

    An advanced methodology for integrated flight propulsion control (IFPC) design for future aircraft, which will use propulsion system generated forces and moments for enhanced maneuver capabilities, is briefly described. This methodology has the potential to address in a systematic manner the coupling between the airframe and the propulsion subsystems typical of such enhanced maneuverability aircraft. Application of the methodology to a short take-off vertical landing (STOVL) aircraft in the landing approach to hover transition flight phase is presented with brief description of the various steps in the IFPC design methodology. The details of the individual steps have been described in previous publications and the objective of this paper is to focus on how the components of the control system designed at each step integrate into the overall IFPC system. The full nonlinear IFPC system was evaluated extensively in nonreal-time simulations as well as piloted simulations. Results from the nonreal-time evaluations are presented in this paper. Lessons learned from this application study are summarized in terms of areas of potential improvements in the STOVL IFPC design as well as identification of technology development areas to enhance the applicability of the proposed design methodology.

  7. Analog design optimization methodology for ultralow-power circuits using intuitive inversion-level and saturation-level parameters

    NASA Astrophysics Data System (ADS)

    Eimori, Takahisa; Anami, Kenji; Yoshimatsu, Norifumi; Hasebe, Tetsuya; Murakami, Kazuaki

    2014-01-01

    A comprehensive design optimization methodology using intuitive nondimensional parameters of inversion-level and saturation-level is proposed, especially for ultralow-power, low-voltage, and high-performance analog circuits with mixed strong, moderate, and weak inversion metal-oxide-semiconductor transistor (MOST) operations. This methodology is based on the synthesized charge-based MOST model composed of Enz-Krummenacher-Vittoz (EKV) basic concepts and advanced-compact-model (ACM) physics-based equations. The key concept of this methodology is that all circuit and system characteristics are described as some multivariate functions of inversion-level parameters, where the inversion level is used as an independent variable representative of each MOST. The analog circuit design starts from the first step of inversion-level design using universal characteristics expressed by circuit currents and inversion-level parameters without process-dependent parameters, followed by the second step of foundry-process-dependent design and the last step of verification using saturation-level criteria. This methodology also paves the way to an intuitive and comprehensive design approach for many kinds of analog circuit specifications by optimization using inversion-level log-scale diagrams and saturation-level criteria. In this paper, we introduce an example of our design methodology for a two-stage Miller amplifier.

  8. Methodology capture: discriminating between the "best" and the rest of community practice

    PubMed Central

    Eales, James M; Pinney, John W; Stevens, Robert D; Robertson, David L

    2008-01-01

    Background The methodologies we use both enable and help define our research. However, as experimental complexity has increased the choice of appropriate methodologies has become an increasingly difficult task. This makes it difficult to keep track of available bioinformatics software, let alone the most suitable protocols in a specific research area. To remedy this we present an approach for capturing methodology from literature in order to identify and, thus, define best practice within a field. Results Our approach is to implement data extraction techniques on the full-text of scientific articles to obtain the set of experimental protocols used by an entire scientific discipline, molecular phylogenetics. Our methodology for identifying methodologies could in principle be applied to any scientific discipline, whether or not computer-based. We find a number of issues related to the nature of best practice, as opposed to community practice. We find that there is much heterogeneity in the use of molecular phylogenetic methods and software, some of which is related to poor specification of protocols. We also find that phylogenetic practice exhibits field-specific tendencies that have increased through time, despite the generic nature of the available software. We used the practice of highly published and widely collaborative researchers ("expert" researchers) to analyse the influence of authority on community practice. We find expert authors exhibit patterns of practice common to their field and therefore act as useful field-specific practice indicators. Conclusion We have identified a structured community of phylogenetic researchers performing analyses that are customary in their own local community and significantly different from those in other areas. Best practice information can help to bridge such subtle differences by increasing communication of protocols to a wider audience. 
We propose that the practice of expert authors from the field of evolutionary biology is the closest to contemporary best practice in phylogenetic experimental design. Capturing best practice is, however, a complex task and should also acknowledge the differences between fields such as the specific context of the analysis. PMID:18761740

  9. A methodology to assess the economic impact of power storage technologies.

    PubMed

    El-Ghandour, Laila; Johnson, Timothy C

    2017-08-13

    We present a methodology for assessing the economic impact of power storage technologies. The methodology is founded on classical approaches to the optimal stopping of stochastic processes but involves an innovation that circumvents the need to identify, ex ante, the form of a driving process and works directly on observed data, avoiding model risks. Power storage is regarded as a complement to the intermittent output of renewable energy generators and is therefore important in contributing to the reduction of carbon-intensive power generation. Our aim is to present a methodology suitable for use by policy makers that is simple to maintain, adaptable to different technologies and easy to interpret. The methodology has benefits over current techniques and is able to value, by identifying a viable optimal operational strategy, a conceived storage facility based on compressed air technology operating in the UK. This article is part of the themed issue 'Energy management: flexibility, risk and optimization'. © 2017 The Author(s).

  10. Mechanistic-empirical Pavement Design Guide Implementation

    DOT National Transportation Integrated Search

    2010-06-01

    The recently introduced Mechanistic-Empirical Pavement Design Guide (MEPDG) and associated computer software provides a state-of-practice mechanistic-empirical highway pavement design methodology. The MEPDG methodology is based on pavement responses ...

  11. Systematic review of communication partner training in aphasia: methodological quality.

    PubMed

    Cherney, Leora R; Simmons-Mackie, Nina; Raymer, Anastasia; Armstrong, Elizabeth; Holland, Audrey

    2013-10-01

    Twenty-three studies identified from a previous systematic review examining the effects of communication partner training on persons with aphasia and their communication partners were evaluated for methodological quality. Two reviewers rated the studies on defined methodological quality criteria relevant to each study design. There were 11 group studies, seven single-subject design studies, and five qualitative studies. Quality scores were derived for each study. The mean inter-rater reliability of scores for each study design ranged from 85% to 93%, with Cohen's kappa indicating substantial agreement between raters. Methodological quality of research on communication partner training in aphasia was highly varied. Overall, group studies employed the least rigorous methodology as compared to single-subject and qualitative research. Only two of 11 group studies complied with more than half of the quality criteria. No group studies reported therapist blinding and only one group study reported participant blinding. Across all types of studies, the criterion of treatment fidelity was most commonly omitted. Failure to explicitly report certain methodological quality criteria may account for low ratings. Using methodological rating scales specific to the type of study design may help improve the methodological quality of aphasia treatment studies, including those on communication partner training.
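    Cohen's kappa, the agreement statistic reported here, corrects raw rater agreement for agreement expected by chance. A minimal sketch, with illustrative ratings rather than the review's data:

    ```python
    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Agreement between two raters, corrected for chance agreement."""
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        # Chance agreement from each rater's marginal label frequencies
        freq_a, freq_b = Counter(rater_a), Counter(rater_b)
        expected = sum(freq_a[label] * freq_b[label] for label in freq_a) / n**2
        return (observed - expected) / (1 - expected)

    a = ["met", "met", "unmet", "met", "unmet", "met", "unmet", "met"]
    b = ["met", "met", "unmet", "unmet", "unmet", "met", "met", "met"]
    print(round(cohens_kappa(a, b), 3))  # 0.467: moderate agreement
    ```

    By convention, kappa above roughly 0.6 is read as substantial agreement and 0.4-0.6 as moderate, which is why the review can describe its inter-rater reliability as substantial despite percent agreement in the 85-93% range.
    
    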

  12. A methodology for the design and evaluation of user interfaces for interactive information systems. Ph.D. Thesis Final Report, 1 Jul. 1985 - 31 Dec. 1987

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Farooq, Mohammad U.

    1986-01-01

    The definition of proposed research addressing the development and validation of a methodology for the design and evaluation of user interfaces for interactive information systems is given. The major objectives of this research are: the development of a comprehensive, objective, and generalizable methodology for the design and evaluation of user interfaces for information systems; the development of equations and/or analytical models to characterize user behavior and the performance of a designed interface; the design of a prototype system for the development and administration of user interfaces; and the design and use of controlled experiments to support the research and test/validate the proposed methodology. The proposed design methodology views the user interface as a virtual machine composed of three layers: an interactive layer, a dialogue manager layer, and an application interface layer. A command language model of user system interactions is presented because of its inherent simplicity and structured approach based on interaction events. All interaction events have a common structure based on common generic elements necessary for a successful dialogue. It is shown that, using this model, various types of interfaces could be designed and implemented to accommodate various categories of users. The implementation methodology is discussed in terms of how to store and organize the information.

  13. IMPAC: An Integrated Methodology for Propulsion and Airframe Control

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay; Ouzts, Peter J.; Lorenzo, Carl F.; Mattern, Duane L.

    1991-01-01

    The National Aeronautics and Space Administration is actively involved in the development of enabling technologies that will lead towards aircraft with new/enhanced maneuver capabilities such as Short Take-Off Vertical Landing (STOVL) and high angle of attack performance. Because of the high degree of dynamic coupling between the airframe and propulsion systems of these types of aircraft, one key technology is the integration of the flight and propulsion control. The NASA Lewis Research Center approach to developing Integrated Flight Propulsion Control (IFPC) technologies is an in-house research program referred to as IMPAC (Integrated Methodology for Propulsion and Airframe Control). The goals of IMPAC are to develop a viable alternative to the existing integrated control design methodologies that will allow for improved system performance and simplicity of control law synthesis and implementation, and to demonstrate the applicability of the methodology to a supersonic STOVL fighter aircraft. Based on some preliminary control design studies that included evaluation of the existing methodologies, the IFPC design methodology that is emerging at the Lewis Research Center consists of considering the airframe and propulsion system as one integrated system for an initial centralized controller design and then partitioning the centralized controller into separate airframe and propulsion system subcontrollers to ease implementation and to set meaningful design requirements for detailed subsystem control design and evaluation. An overview of IMPAC is provided and detailed discussion of the various important design and evaluation steps in the methodology are included.

  14. Action Research: Theory and Applications

    ERIC Educational Resources Information Center

    Jefferson, Renée N.

    2014-01-01

    Action research as a methodology is suitable for use within academic library settings. Its theoretical foundations are located in several disciplines and its applications span across many professions. In this article, an overview of the theoretical beginnings and evolution of action research is presented. Approaches generally used in conducting an…

  15. Suitability Assessment of Printed Dietary Guidelines for Pregnant Women and Parents of Infants and Toddlers From 7 European Countries.

    PubMed

    Garnweidner-Holme, Lisa Maria; Dolvik, Stina; Frisvold, Cathrine; Mosdøl, Annhild

    2016-02-01

    To evaluate selected European printed dietary guidelines for pregnant women and parents of infants and toddlers using the suitability assessment of materials (SAM) method. A descriptive study to determine the suitability of 14 printed dietary guidelines from 7 European countries based on deductive quantitative analyses. Materials varied greatly in format and content: 35.7% of materials were rated superior and 64.3% were rated adequate according to the overall SAM score for patient education material. None of the materials were scored not suitable. Among the categories, the highest average scores were for layout and typography and the lowest average scores were for cultural appropriateness and learning stimulation and motivation. Interrater reliability ranged from Cohen's kappa of 0.37 to 0.62 (mean, 0.41), indicating fair to moderate agreement among the 3 investigators. Overall, the suitability of the assessed printed dietary guidelines was adequate. Based on the SAM methodology, printed dietary guidelines may increase in suitability by emphasizing aspects related to health literacy and accommodating the needs of different food cultures within a population. Copyright © 2016 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.

  16. Experimental Validation of an Integrated Controls-Structures Design Methodology

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Elliot, Kenny B.; Walz, Joseph E.

    1996-01-01

    The first experimental validation of an integrated controls-structures design methodology for a class of large order, flexible space structures is described. Integrated redesign of the controls-structures-interaction evolutionary model, a laboratory testbed at NASA Langley, was described earlier. The redesigned structure was fabricated, assembled in the laboratory, and experimentally tested against the original structure. Experimental results indicate that the structure redesigned using the integrated design methodology requires significantly less average control power than the nominal structure with control-optimized designs, while maintaining the required line-of-sight pointing performance. Thus, the superiority of the integrated design methodology over the conventional design approach is experimentally demonstrated. Furthermore, amenability of the integrated design structure to other control strategies is evaluated, both analytically and experimentally. Using Linear-Quadratic-Gaussian optimal dissipative controllers, it is observed that the redesigned structure leads to significantly improved performance with alternate controllers as well.

  17. Experimental validation of an integrated controls-structures design methodology for a class of flexible space structures

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Elliott, Kenny B.; Joshi, Suresh M.; Walz, Joseph E.

    1994-01-01

    This paper describes the first experimental validation of an optimization-based integrated controls-structures design methodology for a class of flexible space structures. The Controls-Structures-Interaction (CSI) Evolutionary Model, a laboratory test bed at Langley, is redesigned based on the integrated design methodology with two different dissipative control strategies. The redesigned structure is fabricated, assembled in the laboratory, and experimentally compared with the original test structure. Design guides are proposed and used in the integrated design process to ensure that the resulting structure can be fabricated. Experimental results indicate that the integrated design requires greater than 60 percent less average control power (by thruster actuators) than the conventional control-optimized design while maintaining the required line-of-sight performance, thereby confirming the analytical findings about the superiority of the integrated design methodology. Amenability of the integrated design structure to other control strategies is considered and evaluated analytically and experimentally. This work also demonstrates the capabilities of the Langley-developed design tool CSI DESIGN which provides a unified environment for structural and control design.

  18. Towards Methodologies for Building Knowledge-Based Instructional Systems.

    ERIC Educational Resources Information Center

    Duchastel, Philippe

    1992-01-01

    Examines the processes involved in building instructional systems that are based on artificial intelligence and hypermedia technologies. Traditional instructional systems design methodology is discussed; design issues including system architecture and learning strategies are addressed; and a new methodology for building knowledge-based…

  19. One Controller at a Time (1-CAT): A mimo design methodology

    NASA Technical Reports Server (NTRS)

    Mitchell, J. R.; Lucas, J. C.

    1987-01-01

    The One Controller at a Time (1-CAT) methodology for designing digital controllers for Large Space Structures (LSSs) is introduced and illustrated. The flexible mode problem is first discussed. Next, desirable features of an LSS control system design methodology are delineated. The 1-CAT approach is presented, along with an analytical technique for carrying out the 1-CAT process. Next, 1-CAT is used to design digital controllers for the proposed Space Based Laser (SBL). Finally, the SBL design is evaluated for dynamical performance, noise rejection, and robustness.

  20. General Methodology for Designing Spacecraft Trajectories

    NASA Technical Reports Server (NTRS)

    Condon, Gerald; Ocampo, Cesar; Mathur, Ravishankar; Morcos, Fady; Senent, Juan; Williams, Jacob; Davis, Elizabeth C.

    2012-01-01

    A methodology for designing spacecraft trajectories in any gravitational environment within the solar system has been developed. The methodology facilitates modeling and optimization for problems ranging from that of a single spacecraft orbiting a single celestial body to that of a mission involving multiple spacecraft and multiple propulsion systems operating in gravitational fields of multiple celestial bodies. The methodology consolidates almost all spacecraft trajectory design and optimization problems into a single conceptual framework requiring solution of either a system of nonlinear equations or a parameter-optimization problem with equality and/or inequality constraints.
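The second formulation mentioned above, trajectory targeting as a system of nonlinear equations, is the classic shooting approach: the design parameters are adjusted until a terminal-miss function vanishes. A toy sketch with Newton's method follows; the two-equation system is a stand-in for real dynamics, not the framework's actual equations:

```python
import numpy as np

# Shooting reduces trajectory targeting to solving f(p) = 0, where p holds
# design parameters (e.g. burn components) and f the terminal miss.
# Illustrative system (not a real dynamics model): hit a circle of
# radius 2 along the line p0 = p1.
def f(p):
    return np.array([p[0] ** 2 + p[1] ** 2 - 4.0, p[0] - p[1]])

def jacobian(p):
    return np.array([[2.0 * p[0], 2.0 * p[1]],
                     [1.0, -1.0]])

def newton_solve(p, tol=1e-12, max_iter=50):
    for _ in range(max_iter):
        residual = f(p)
        if np.linalg.norm(residual) < tol:
            break
        # Newton step: solve J(p) dp = f(p), then p <- p - dp
        p = p - np.linalg.solve(jacobian(p), residual)
    return p

p_star = newton_solve(np.array([1.0, 0.5]))   # converges to [sqrt(2), sqrt(2)]
```

In the optimization formulation, the same residuals would instead appear as equality constraints on a cost-minimizing parameter search.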

  1. DEVELOPMENT OF OPERATIONAL CONCEPTS FOR ADVANCED SMRs: THE ROLE OF COGNITIVE SYSTEMS ENGINEERING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacques Hugo; David Gertman

    Advanced small modular reactors (AdvSMRs) will use advanced digital instrumentation and control systems and make greater use of automation. These advances not only pose technical and operational challenges, but will inevitably affect the operating and maintenance (O&M) cost of new plants. However, there is much uncertainty about the impact of AdvSMR designs on operational and human factors considerations, such as workload, situation awareness, human reliability, staffing levels, and the appropriate allocation of functions between the crew and various automated plant systems. Existing human factors and systems engineering design standards and methodologies are not current in terms of human interaction requirements for dynamic automated systems and are no longer suitable for the analysis of evolving operational concepts. New models and guidance for operational concepts for complex socio-technical systems need to adopt a state-of-the-art approach, such as Cognitive Systems Engineering (CSE), that gives due consideration to the role of personnel. The approach reported on here helps to identify and evaluate human challenges related to non-traditional concepts of operations. A framework defining operational strategies was developed based on the operational analysis of Argonne National Laboratory's Experimental Breeder Reactor-II (EBR-II), a small (20 MWe) sodium-cooled reactor that was successfully operated for thirty years. Insights from the systematic application of the methodology and its utility are reviewed, and arguments for the formal adoption of CSE as a value-added part of the Systems Engineering process are presented.

  2. A Selection Methodology for the RTOS Market

    NASA Astrophysics Data System (ADS)

    Melanson, P.; Tafazoli, S.

    In past years, the market for Operating Systems (OS) has been quite active. One key segment supports embedded real-time applications, in which the OS must guarantee the timeliness as well as the correctness of the processing. Many OS claim to be Real-Time Operating Systems (RTOS), but often it is only by reviewing the OS specifications or detailed information that one can truly identify the OS that enables real-time applications. Designers are faced with an imposing task when selecting an RTOS for their space mission. Whether for historical reasons or due to the rapid evolution of the RTOS market, it appears that RTOS are not evaluated for each mission but rather imposed. Although reasons for imposing this choice can be well justified, at other times one is left to wonder whether the lack of evaluation against mission requirements can lead to increased risks down the road. How does one select the proper RTOS for space missions, one which will a) meet the requirements, b) correspond with the knowledge and expertise of the staff, and c) continue to be a strategic choice for the future? The purpose of this paper is to compare commercially available RTOS that are suitable for space missions requiring hard real-time capabilities. It is our belief that this research identifies the important products for space missions and presents a methodology to select the appropriate RTOS that will meet design requirements and other relevant criteria. Lastly, the paper presents the volatility of the market over the past two years and determines the implications for embedded systems used in space missions.
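A selection methodology of the kind described, meeting requirements, matching staff expertise, and remaining strategically viable, is often reduced to a weighted-scoring matrix. A minimal sketch follows; the criteria weights and candidate scores are hypothetical, not the paper's evaluation data:

```python
# Hypothetical weighted-scoring model for RTOS selection.
CRITERIA_WEIGHTS = {           # weights must sum to 1.0
    "meets_requirements": 0.5,
    "staff_expertise":    0.3,
    "strategic_future":   0.2,
}

# Candidate RTOS scored 0-10 against each criterion (fictional values).
candidates = {
    "RTOS-A": {"meets_requirements": 9, "staff_expertise": 6, "strategic_future": 7},
    "RTOS-B": {"meets_requirements": 7, "staff_expertise": 9, "strategic_future": 5},
}

def weighted_score(scores):
    """Weighted sum of a candidate's criterion scores."""
    return sum(CRITERIA_WEIGHTS[c] * scores[c] for c in CRITERIA_WEIGHTS)

best = max(candidates, key=lambda name: weighted_score(candidates[name]))
# With these fictional numbers, RTOS-A (score 7.7) edges out RTOS-B (7.2).
```

The design choice worth noting is that the weights encode mission priorities explicitly, so the same candidate set can be re-ranked per mission instead of the RTOS being imposed.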

  3. New methodology for preventing pressure ulcers using actimetry and autonomous nervous system recording.

    PubMed

    Meffre, R; Gehin, C; Schmitt, P M; De Oliveira, F; Dittmar, A

    2006-01-01

    Pressure ulcers constitute an important health problem. They affect many people with mobility disorders and are difficult to detect and prevent because the damage begins in the muscle. This paper proposes a new approach to the study of pressure ulcers. We aim at developing a methodology that analyses the probability of a patient developing a pressure ulcer and that can detect risky situations. The idea is to relate the mobility disorder to autonomic nervous system (ANS) dysfunction. More precisely, evaluating the consequence of discomfort on the ANS (stress induced by discomfort) can be relevant for the early detection of pressure ulcers. Mobility is evaluated through movement measurement. This evaluation, at the interface between soft living tissues and any support, has to consider the specificity of the human environment. Soft living tissues have non-linear mechanical properties, making conventional rigid sensors unsuitable for interface parameter measurement. A new actimeter system has been designed to study movements of the human body, whatever its support, while seated. The device is based on elementary active cells, and the number of pressure cells can be easily adapted to the application. The spatial resolution is about 4 cm(2). In this paper, we compare activity measurements of a seated subject with his autonomic nervous system activity, recorded by the E.motion device, which was developed to record six parameters: skin potential, skin resistance, skin temperature, skin blood rate, instantaneous cardiac frequency and instantaneous respiratory frequency. The design, instrumentation, and first results are presented.

  4. A methodology for calibration of hyperspectral and multispectral satellite data in coastal areas

    NASA Astrophysics Data System (ADS)

    Pennucci, Giuliana; Fargion, Giulietta; Alvarez, Alberto; Trees, Charles; Arnone, Robert

    2012-06-01

    The objective of this work is to determine the location(s) in any given oceanic area during different temporal periods where in situ sampling for Calibration/Validation (Cal/Val) provides the best capability to retrieve accurate radiometric and derived product data (lowest uncertainties). We present a method to merge satellite imagery with in situ measurements, to determine the best in situ sampling strategy suitable for satellite Cal/Val and to evaluate the present in situ locations through uncertainty indices. This analysis is required to determine whether the present in situ sites are adequate for assessing uncertainty and where additional sites and ship programs should be located to improve Cal/Val procedures. Our methodology uses satellite acquisitions to build a covariance matrix encoding the spatial-temporal variability of the area of interest. The covariance matrix is used in a Bayesian framework to merge satellite and in situ data, providing a product with lower uncertainty. The best in situ location for Cal/Val is then identified by using a design principle (A-optimum design) that minimizes the estimated variance of the merged products. Satellite products investigated in this study include Ocean Color water leaving radiance, chlorophyll, and inherent and apparent optical properties (retrieved from MODIS and VIIRS). In situ measurements are obtained from systems operated on fixed deployment platforms (e.g., sites of the Ocean Color component of the AErosol RObotic NETwork, AERONET-OC), moorings (e.g., Marine Optical Buoy, MOBY), ships or autonomous vehicles (such as Autonomous Underwater Vehicles and/or Gliders).
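The A-optimum principle above can be sketched in a few lines: among candidate sites, choose the one whose observation most reduces the trace of the posterior covariance after a Bayesian update. The covariance values and noise level below are hypothetical, not derived from actual satellite imagery:

```python
import numpy as np

# A-optimum design sketch: pick the candidate in situ site whose single
# observation (noise variance r) minimizes the trace of the posterior
# covariance. Covariance entries are illustrative only.
def posterior_trace(sigma, i, r):
    col = sigma[:, i]
    # Rank-one Bayesian update for observing component i with noise r.
    posterior = sigma - np.outer(col, col) / (sigma[i, i] + r)
    return np.trace(posterior)

sigma = np.array([[2.0, 0.8, 0.3],
                  [0.8, 1.5, 0.5],
                  [0.3, 0.5, 1.0]])   # spatial covariance of 3 candidate sites
r = 0.1                               # in situ measurement noise variance

traces = [posterior_trace(sigma, i, r) for i in range(3)]
best_site = int(np.argmin(traces))    # site 0: highest variance, strongest correlations
```

Note how the criterion favors the site that is both uncertain and well correlated with the rest of the field, which is exactly why uniform or ad hoc site placement can be suboptimal.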

  5. Applying operational research and data mining to performance based medical personnel motivation system.

    PubMed

    Niaksu, Olegas; Zaptorius, Jonas

    2014-01-01

    This paper presents a methodology suitable for the creation of a performance-related remuneration system in the healthcare sector that would meet requirements for efficiency and sustainable quality of healthcare services. A methodology for performance indicator selection, ranking and a posteriori evaluation is proposed and discussed. The Priority Distribution Method is applied for unbiased performance criteria weighting, and data mining methods are proposed to monitor and evaluate the results of the motivation system. We developed a method for healthcare-specific criteria selection consisting of 8 steps, and proposed and demonstrated the application of the Priority Distribution Method for weighting the selected criteria. Moreover, a set of data mining methods for evaluating the outcomes of the motivational system was proposed. The described methodology for calculating performance-related payment needs practical validation. We plan to develop semi-automated tools for monitoring institutional and personal performance indicators; the final step would be validation of the methodology in a healthcare facility.

  6. Extraction of breathing pattern using temperature sensor based on Arduino board

    NASA Astrophysics Data System (ADS)

    Patel, Rajesh; Sengottuvel, S.; Gireesan, K.; Janawadkar, M. P.; Radhakrishnan, T. S.

    2015-06-01

    Many basic functions of the human body are assessed by measuring different parameters such as temperature, pulse activity and blood pressure. Respiration rate is the number of inhalations a person takes per minute; it needs to be quantitatively assessed because it modulates other measurements, such as SQUID-based magnetocardiography (MCG), by bringing the chest closer to or farther from the sensor array located inside a stationary liquid helium cryostat. The respiration rate is usually measured when a person is at rest and simply involves counting the number of inhalations for one minute. This paper aims at the development of a suitable methodology for the measurement of respiration rate with the help of a temperature sensor that monitors the slight change in temperature near the nostril during inhalation and exhalation. The design and development of the proposed system are presented, along with typical experimental results.
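The counting step this methodology relies on, one breath per temperature peak over a one-minute window, can be sketched as follows. The signal here is a synthetic 0.25 Hz oscillation standing in for real nostril-temperature data; the sampling rate and threshold are assumptions, not values from the paper:

```python
import math

# Respiration-rate extraction sketch: each exhalation warms the sensor,
# so counting temperature peaks over one minute gives breaths per minute.
FS = 10.0                                  # sampling rate, Hz (assumed)
# Synthetic 60 s signal: 0.25 Hz oscillation standing in for sensor data.
signal = [math.sin(2 * math.pi * 0.25 * n / FS) for n in range(600)]

def breaths_per_minute(samples, threshold=0.5):
    peaks = 0
    for i in range(1, len(samples) - 1):
        # A breath is a strict local maximum above the threshold.
        if samples[i - 1] < samples[i] > samples[i + 1] and samples[i] > threshold:
            peaks += 1
    return peaks           # samples span one minute, so peaks == breaths/min

rate = breaths_per_minute(signal)          # 0.25 Hz over 60 s -> 15 breaths/min
```

Real sensor data would need smoothing before peak detection, but the threshold already rejects the small fluctuations a clean oscillation does not have.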

  7. The nanomaterial toolkit for neuroengineering

    NASA Astrophysics Data System (ADS)

    Shah, Shreyas

    2016-10-01

    There is a growing interest in developing effective tools to better probe the central nervous system (CNS), to understand how it works and to treat neural diseases, injuries and cancer. The intrinsic complexity of the CNS has made this a challenging task for decades. Yet, with the extraordinary recent advances in nanotechnology and nanoscience, there is a general consensus on the immense value and potential of nanoscale tools for engineering neural systems. In this review, an overview of specialized nanomaterials which have proven to be the most effective tools in neuroscience is provided. After a brief background on the prominent challenges in the field, a variety of organic and inorganic-based nanomaterials are described, with particular emphasis on the distinctive properties that make them versatile and highly suitable in the context of the CNS. Building on this robust nano-inspired foundation, the rational design and application of nanomaterials can enable the generation of new methodologies to greatly advance the neuroscience frontier.

  8. A Participatory Research Approach to develop an Arabic Symbol Dictionary.

    PubMed

    Draffan, E A; Kadous, Amatullah; Idris, Amal; Banes, David; Zeinoun, Nadine; Wald, Mike; Halabi, Nawar

    2015-01-01

    The purpose of the Arabic Symbol Dictionary research discussed in this paper is to provide a resource of culturally, environmentally and linguistically suitable symbols to aid communication and literacy skills. A participatory approach using online social media and a bespoke symbol management system has been established to enhance the process of matching a user-based Arabic and English core vocabulary with appropriate imagery. Participants, including AAC users, their families, carers, teachers and therapists, have been involved in the research from the outset, collating the vocabularies, debating cultural nuances of symbols and critiquing the design of technologies for selection procedures. The positive reaction of those who have voted on the symbols, with requests for early use, has justified the iterative nature of the methodologies used for this part of the project. However, constant re-evaluation will be necessary, and in-depth analysis of all the data received has yet to be completed.

  9. 30-100-GHz inductors and transformers for millimeter-wave (Bi)CMOS integrated circuits

    NASA Astrophysics Data System (ADS)

    Dickson, T. O.; Lacroix, M.-A.; Boret, S.; Gloria, D.; Beerkens, R.; Voinigescu, S. P.

    2005-01-01

    Silicon planar and three-dimensional inductors and transformers were designed and characterized on-wafer up to 100 GHz. Self-resonance frequencies (SRFs) beyond 100 GHz were obtained, demonstrating for the first time that spiral structures are suitable for applications such as 60-GHz wireless local area network and 77-GHz automotive RADAR. Minimizing area over substrate is critical to achieving high SRF. A stacked transformer is reported with S21 of -2.5 dB at 50 GHz, and which offers improved performance and less area (30 μm × 30 μm) than planar transformers or microstrip couplers. A compact inductor model is described, along with a methodology for extracting model parameters from simulated or measured y-parameters. Millimeter-wave SiGe BiCMOS mixer and voltage-controlled-oscillator circuits employing spiral inductors are presented with better or comparable performance to previously reported transmission-line-based circuits.

  10. Transdermal delivery of biomacromolecules using lipid-like nanoparticles

    NASA Astrophysics Data System (ADS)

    Bello, Evelyn A.

    The transdermal delivery of biomacromolecules, including proteins and nucleic acids, is challenging, owing to their large size and the penetration-resistant nature of the stratum corneum. Thus, an urgent need exists for the development of transdermal delivery methodologies. This research focuses on the use of cationic lipid-like nanoparticles (lipidoids) for the transdermal delivery of proteins, and establishes an in vitro model for the study. The lipidoids used were first combinatorially designed and synthesized; afterwards, they were employed for protein encapsulation in a vesicular system. A skin penetration study demonstrated that lipidoids enhance penetration depth in a pig skin model, overcoming the barrier that the stratum corneum presents. This research has successfully identified active lipidoids capable of efficiently penetrating the skin; therefore, loading proteins into lipidoid nanoparticles will facilitate the transdermal delivery of proteins. Membrane diffusion experiments were used to confirm the results. This research has confirmed that lipidoids are a suitable material for transdermal protein delivery enhancement.

  11. Optimization of the NO photooxidation and the role of relative humidity.

    PubMed

    Ângelo, Joana; Magalhães, Pedro; Andrade, Luísa; Madeira, Luís M; Mendes, Adélio

    2018-05-11

    Photocatalysis is recognised as a suitable process for the photoabatement of atmospheric pollutants. The photooxidation mechanism on TiO2 has been widely studied. However, recent studies demonstrated that the often-assumed photooxidation mediated by the hydroxyl radical cannot explain all the experimental observations. Indeed, this study contributes to a new understanding of NO photooxidation. First, the adsorption equilibrium isotherms of NO, NO2 and H2O on the photocatalyst, Aeroxide® P25 from Evonik Industries, were obtained. The concentration of hydroxyl radicals was also determined by photoluminescence. A comprehensive design of experiments was then followed; NO conversion and selectivity were obtained as a function of the relative humidity, irradiance, NO inlet concentration and residence time, following a response surface methodology (RSM). These results were then used to discuss the photooxidation mechanism of NO. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. Modeling of microjoule and millijoule energy LIDARs with PMT/SiPM/APD detectors: a sensitivity analysis.

    PubMed

    Agishev, Ravil

    2018-05-10

    This paper demonstrates a renewed concept and applications of the generalized methodology for atmospheric light detection and ranging (LIDAR) capability prediction as a continuation of a series of our previous works, where the dimensionless parameterization appeared as a tool for comparing systems of a different scale, design, and applications. The modernized concept applied to microscale and milliscale LIDARs with relatively new silicon photomultiplier detectors and traditional photomultiplier tube and avalanche photodiode detectors allowed prediction of the remote sensing instruments' performance and limitations. Such a generalized, uniform, and objective concept is applied for evaluation of the increasingly popular class of limited-energy LIDARs using the best optical detectors, operating on different targets (back-scatter or topographic, static or dynamic) and under intense sky background conditions. It can be used in the LIDAR community to compare different instruments and select the most suitable and effective ones for specific applications.

  13. Validation of a hydride generation atomic absorption spectrometry methodology for determination of mercury in fish designed for application in the Brazilian national residue control plan.

    PubMed

    Damin, Isabel C F; Santo, Maria A E; Hennigen, Rosmari; Vargas, Denise M

    2013-01-01

    In the present study, a method for the determination of mercury (Hg) in fish was validated according to ISO/IEC 17025, INMETRO (Brazil), and more recent European recommendations (Commission Decisions 2007/333/EC and 2002/657/EC) for implementation in the Brazilian National Residue Control Plan (NRCP) in routine applications. The parameters evaluated in the validation were investigated in detail. The limits of detection and quantification were, respectively, 2.36 and 7.88 μg kg(-1) of Hg, recovery varied between 90 and 96%, and the coefficient of variation for repeatability was 4.06-8.94%. Furthermore, a comparison using an external proficiency testing scheme was performed. The validated method for the determination of mercury in fish by hydride generation atomic absorption spectrometry was considered suitable for implementation in routine analysis.

  14. Stereo particle image velocimetry set up for measurements in the wake of scaled wind turbines

    NASA Astrophysics Data System (ADS)

    Campanardi, Gabriele; Grassi, Donato; Zanotti, Alex; Nanos, Emmanouil M.; Campagnolo, Filippo; Croce, Alessandro; Bottasso, Carlo L.

    2017-08-01

    Stereo particle image velocimetry measurements were carried out in the boundary layer test section of the Politecnico di Milano large wind tunnel to survey the wake of a scaled wind turbine model designed and developed by Technische Universität München. The stereo PIV instrumentation was set up to survey the three velocity components on cross-flow planes at different longitudinal locations. The area of investigation covered the entire extent of the wind turbine's wake, which was scanned using two separate traversing systems for the laser and the cameras. This instrumentation set-up made it possible to rapidly obtain high-quality results suitable for characterising the behaviour of the flow field in the wake of the scaled wind turbine. This would be very useful for evaluating the performance of wind farm control methodologies based on wake redirection and for the validation of CFD tools.

  15. Dynamical tuning for MPC using population games: A water supply network application.

    PubMed

    Barreiro-Gomez, Julian; Ocampo-Martinez, Carlos; Quijano, Nicanor

    2017-07-01

    Model predictive control (MPC) is a suitable strategy for the control of large-scale systems that have multiple design requirements, e.g., multiple physical and operational constraints. Besides, an MPC controller is able to deal with multiple control objectives by considering them within the cost function, which requires determining a proper prioritization for each of the objectives. Furthermore, when the system has time-varying parameters and/or disturbances, the appropriate prioritization might vary over time as well. This situation leads to the need for a dynamical tuning methodology. This paper addresses the dynamical tuning issue by using evolutionary game theory. The advantages of the proposed method are highlighted and tested on a large-scale water supply network with periodic time-varying disturbances. Finally, results are analyzed with respect to a multi-objective MPC controller that uses static tuning. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  16. Optimisation of the storage of natural freshwaters before organotin speciation.

    PubMed

    Bancon-Montigny, C; Lespes, G; Potin-Gautier, M

    2001-01-01

    The speciation of organotin compounds is essential because of their species-dependent toxicity, especially in natural waters. Precautions have to be taken during the sampling and storage of waters in order to prevent degradation and losses. Experimental design methodology has been used to study the conditions of stability of organotins after water sampling in rivers. Modelling of the results allows the determination of optimal conditions of preservation. After acidification to pH 4 with nitric acid, storage in polyethylene containers at 4 degrees C in the dark is suitable to preserve the most degradable trisubstituted (butyl- and phenyl-) species for over 1 month. These conditions of sampling and storage were applied to two different freshwaters. The rate of species decomposition appears to depend only on the nature of the water, whatever the organotin concentrations in the sample. Speciation could thus be preserved for between 1 and 3 months.

  17. Hypersonic Inlet for a Laser Powered Propulsion System

    NASA Astrophysics Data System (ADS)

    Harrland, Alan; Doolan, Con; Wheatley, Vincent; Froning, Dave

    2011-11-01

    Propulsion within the lightcraft concept is produced via laser-induced detonation of an incoming hypersonic air stream. This process requires suitable engine configurations that offer good performance over all flight speeds and angles of attack to ensure the required thrust is maintained. Stream-traced hypersonic inlets have demonstrated the required performance in conventional hydrocarbon-fuelled scramjet engines and have been applied to the laser-powered lightcraft vehicle. This paper outlines the current methodology employed in the inlet design, with a particular focus on the performance of the lightcraft inlet at angles of attack. Fully three-dimensional turbulent computational fluid dynamics simulations have been performed on a variety of inlet configurations, and the performance of the lightcraft inlets has been evaluated at differing angles of attack. An idealized laser detonation simulation has also been performed to verify that the lightcraft inlet does not unstart during the laser-powered propulsion cycle.

  18. Thermodynamic signature of secondary nano-emulsion formation by isothermal titration calorimetry.

    PubMed

    Fotticchia, Iolanda; Fotticchia, Teresa; Mattia, Carlo Andrea; Netti, Paolo Antonio; Vecchione, Raffaele; Giancola, Concetta

    2014-12-09

    The stabilization of oil in water nano-emulsions by means of a polymer coating is extremely important; it prolongs the shelf life of the product and makes it suitable for a variety of applications ranging from nutraceutics to cosmetics and pharmaceutics. To date, an effective methodology to assess the best formulations in terms of thermodynamic stability has yet to be designed. Here, we perform a complete physicochemical characterization based on isothermal titration calorimetry (ITC) compared to conventional dynamic light scattering (DLS) to identify polymer concentration domains that are thermodynamically stable and to define the degree of stability through thermodynamic functions depending upon any relevant parameter affecting the stability itself, such as type of polymer coating, droplet distance, etc. For instance, the method was proven by measuring the energetics in the case of two different biopolymers, chitosan and poly-L-lysine, and for different concentrations of the emulsion coated with poly-L-lysine.

  19. Response surface methodology for the optimization of dispersive liquid-liquid microextraction of chloropropanols in human plasma.

    PubMed

    Gonzalez-Siso, Paula; Lorenzo, Rosa A; Regenjo, María; Fernández, Purificación; Carro, Antonia M

    2015-10-01

    Chloropropanols are processing toxicants with a potential risk to human health due to the increased intake of processed foods. A rapid and efficient method for the determination of three chloropropanols in human plasma was developed using ultrasound-assisted dispersive liquid-liquid microextraction. The method involved derivatization and extraction in one step, followed by gas chromatography with tandem mass spectrometry analysis. Parameters affecting extraction, such as sample pH, ionic strength, and the type and volume of dispersive and extraction solvents, were optimized by response surface methodology using a pentagonal design. The linear range of the method was 5-200 ng/mL for 1,3-dichloro-2-propanol, 10-200 ng/mL for 2,3-dichloro-1-propanol and 10-400 ng/mL for 3-chloropropane-1,2-diol, with determination coefficients between 0.9989 and 0.9997. The limits of detection were in the range of 0.3-3.2 ng/mL. The precision varied from 1.9 to 10% relative standard deviation (n = 9). The recovery of the method was between 91 and 101%. Advantages such as low consumption of organic solvents and short analysis time make the method suitable for the biomonitoring of chloropropanols. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Process optimization via response surface methodology in the treatment of metal working industry wastewater with electrocoagulation.

    PubMed

    Guvenc, Senem Yazici; Okut, Yusuf; Ozak, Mert; Haktanir, Birsu; Bilgili, Mehmet Sinan

    2017-02-01

    In this study, process parameters in chemical oxygen demand (COD) and turbidity removal from metal working industry (MWI) wastewater were optimized by electrocoagulation (EC) using aluminum, iron and steel electrodes. The effects of process variables on COD and turbidity were investigated by developing a mathematical model using the central composite design method, which is one of the response surface methodologies. Variance analysis was conducted to identify the interaction between process variables and model responses and the optimum conditions for COD and turbidity removal. Second-order regression models were developed via the Statgraphics Centurion XVI.I software program to predict COD and turbidity removal efficiencies. Under the optimum conditions, removal efficiencies obtained with aluminum electrodes were 76.72% for COD and 99.97% for turbidity, those obtained with iron electrodes were 76.55% for COD and 99.9% for turbidity, and those obtained with steel electrodes were 65.75% for COD and 99.25% for turbidity. Operational costs at optimum conditions were 4.83, 1.91 and 2.91 €/m³ for aluminum, iron and steel electrodes, respectively. The iron electrode was found to be the most suitable for MWI wastewater treatment in terms of operational cost and treatment efficiency.
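The modelling step shared by the response-surface abstracts above, fitting a second-order (quadratic) model to responses measured on a central composite design, reduces to ordinary least squares on a polynomial model matrix. A minimal sketch follows; the design is a generic two-factor face-centered layout and the coefficients are invented so the fit can be checked, not the study's COD/turbidity data:

```python
import numpy as np

# Second-order response-surface fit on a face-centered central composite
# design with two coded factors (-1, 0, +1).
design = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1],   # factorial points
                   [-1, 0], [1, 0], [0, -1], [0, 1],      # axial (face) points
                   [0, 0]], dtype=float)                  # center point

def model_matrix(X):
    x1, x2 = X[:, 0], X[:, 1]
    # Columns: intercept, linear, quadratic, and interaction terms.
    return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1 * x2])

# Illustrative "true" model, e.g. % removal as a function of coded factors.
true_beta = np.array([80.0, 5.0, -3.0, -6.0, -2.0, 1.5])
y = model_matrix(design) @ true_beta          # noise-free synthetic responses

# Least-squares estimate of the second-order regression coefficients.
beta_hat, *_ = np.linalg.lstsq(model_matrix(design), y, rcond=None)
```

With real (noisy) responses the same fit yields the model whose analysis of variance and stationary point give the optimum operating conditions reported in such studies.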

  1. Methodology of oral formulation selection in the pharmaceutical industry.

    PubMed

    Kuentz, Martin; Holm, René; Elder, David P

    2016-05-25

    Pharmaceutical formulations have to fulfil various requirements with respect to their intended use, either in the development phase or as a commercial product. New drug candidates with their specific properties confront the formulation scientist with industrial challenges for which a strategy is needed to cope with limited resources, stretched timelines as well as regulatory requirements. This paper aims at reviewing different methodologies to select a suitable formulation approach for oral delivery. Exclusively small-molecular drugs are considered and the review is written from an industrial perspective. Specific cases are discussed starting with an emphasis on poorly soluble compounds, then the topics of chemically labile drugs, low-dose compounds, and modified release are reviewed. Due to the broad scope of this work, a primary focus is on explaining basic concepts as well as recent trends. Different strategies are discussed to approach industrial formulation selection, which includes a structured product development. Examples for such structured development aim to provide guidance to formulators and finally, the recent topic of a manufacturing classification system is presented. It can be concluded that the field of oral formulation selection is particularly complex due to both multiple challenges as well as opportunities so that industrial scientists have to employ tailored approaches to design formulations successfully. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. Multi-Response Optimization of Granaticinic Acid Production by Endophytic Streptomyces thermoviolaceus NT1, Using Response Surface Methodology

    PubMed Central

    Roy, Sudipta; Halder, Suman Kumar; Banerjee, Debdulal

    2016-01-01

    Streptomyces thermoviolaceus NT1, an endophytic isolate, was studied for optimization of granaticinic acid production, an antimicrobial metabolite active even against drug-resistant bacteria. Different media, glucose concentration, initial medium pH, incubation temperature, incubation period, and inoculum size were the parameters optimized in a one-variable-at-a-time (OVAT) approach, in which glucose concentration, pH, and temperature were found to play a critical role in antibiotic production by this strain. Finally, a Box–Behnken experimental design (BBD) was employed with these three key factors (selected after the OVAT studies) for response surface methodology (RSM) analysis. RSM analysis revealed glucose 0.38%, pH 7.02, and temperature 36.53 °C as the optimum conditions for maximum antimicrobial yield. Experimental verification of the model led to a 3.30-fold enhancement in granaticinic acid production (61.35 mg/L, compared to 18.64 mg/L under unoptimized conditions) in ISP2 medium with 5% inoculum and an incubation period of 10 days. This conjugated optimization study thus yielded significantly higher antibiotic production from Streptomyces thermoviolaceus NT1, which might be exploited in industrial applications. PMID:28952581
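
    The Box–Behnken step above can be sketched as follows: generate the coded three-factor design, then map coded levels onto actual factor ranges. The ranges used below (glucose 0.2-0.6%, pH 6-8, temperature 30-42 °C) are illustrative assumptions, not the study's:

```python
from itertools import combinations, product

def box_behnken_3(centers=3):
    """Coded 3-factor Box-Behnken design: +/-1 on each pair of factors,
    the remaining factor at 0, plus replicated centre points."""
    runs = []
    for i, j in combinations(range(3), 2):
        for a, b in product((-1, 1), repeat=2):
            run = [0, 0, 0]
            run[i], run[j] = a, b
            runs.append(run)
    runs += [[0, 0, 0]] * centers
    return runs

# Hypothetical actual ranges for the three key factors (illustrative only):
# glucose 0.2-0.6 %, pH 6-8, temperature 30-42 degC
lows, highs = (0.2, 6.0, 30.0), (0.6, 8.0, 42.0)

def decode(run):
    """Map a coded run (-1/0/+1 per factor) onto the actual ranges."""
    return [lo + (c + 1) / 2 * (hi - lo) for c, lo, hi in zip(run, lows, highs)]

design = [decode(r) for r in box_behnken_3()]
```

With three centre replicates this gives the usual 15 runs for three factors; the centre point decodes to the range midpoints (glucose 0.4%, pH 7.0, 36 °C).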

  3. Clinical research of traditional Chinese medicine in big data era.

    PubMed

    Zhang, Junhua; Zhang, Boli

    2014-09-01

    With the advent of the big data era, our thinking, technology and methodology are being transformed. Data-intensive scientific discovery based on big data, named "The Fourth Paradigm," has become a new paradigm of scientific research. Along with the development and application of Internet information technology in healthcare, individual health records, clinical diagnosis and treatment data, and genomic data have accumulated dramatically, generating big data in the medical field for clinical research and assessment. With the support of big data, the defects and weaknesses of the conventional sampling-based methodology of clinical evaluation may be overcome, and the research target shifts from "causality inference" to "correlativity analysis." This not only facilitates the evaluation of individualized treatment, disease prediction, prevention and prognosis, but is also suitable for the practice of preventive healthcare and symptom pattern differentiation for treatment in traditional Chinese medicine (TCM), as well as for the post-marketing evaluation of Chinese patent medicines. To conduct big data clinical studies in the TCM domain, top-level design is needed and should be carried out in an orderly fashion, and fundamental construction and innovation studies should be strengthened in data platform creation, data analysis technology, and the fostering and training of big data professionals.

  4. Investigation of equilibrium and kinetics of Cr(VI) adsorption by dried Bacillus cereus using response surface methodology.

    PubMed

    Yang, Kai; Zhang, Jing; Yang, Tao; Wang, Hongyu

    2016-01-01

    In this study, response surface methodology (RSM) based on a three-variable, five-level central composite rotatable design was used to analyze the combined and individual effects of the operating parameters (biomass dose, initial Cr(VI) concentration and pH) on the Cr(VI) adsorption capacity of dried Bacillus cereus. A quadratic polynomial equation was obtained to predict the adsorbed Cr(VI) amount. Analysis of variance showed that biomass dose was the key factor in the removal of Cr(VI). The maximum adsorbed Cr(VI) amount (30.93 mg g(-1)) was found at an initial Cr(VI) concentration of 165.30 mg L(-1), pH 2.96, and a biosorbent dosage of 3.01 g L(-1). The surface chemical functional groups and microstructure of unloaded and Cr(VI)-loaded dried Bacillus cereus were identified by Fourier transform infrared spectroscopy (FTIR) and scanning electron microscopy (SEM), respectively. In addition, the results indicated that the Langmuir isotherm and the second-order rate expression were suitable for describing the removal of Cr(VI) from wastewater. Overall, RSM proved an effective method for optimizing the biosorption process, and dried Bacillus cereus showed remarkable performance in removing Cr(VI) from wastewater.
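
    The Langmuir-isotherm conclusion above rests on fitting qe = qmax·KL·Ce/(1 + KL·Ce) to equilibrium data. A minimal sketch using the common Ce/qe linearisation follows; the data are synthetic (only qmax echoes the reported 30.93 mg/g, while KL and the Ce values are invented for illustration):

```python
# Langmuir isotherm: qe = qmax * KL * Ce / (1 + KL * Ce).
# The linearisation Ce/qe = Ce/qmax + 1/(KL*qmax) lets qmax and KL be read
# off an ordinary straight-line fit: slope = 1/qmax, intercept = 1/(KL*qmax).
qmax_true, KL_true = 30.93, 0.05          # mg/g, L/mg (illustrative values)
Ce = [10, 25, 50, 100, 165, 250]          # residual Cr(VI), mg/L (synthetic)
qe = [qmax_true * KL_true * c / (1 + KL_true * c) for c in Ce]

# Ordinary least-squares line through the linearised points (Ce, Ce/qe)
x, y = Ce, [c / q for c, q in zip(Ce, qe)]
n = len(x)
mx, my = sum(x) / n, sum(y) / n
slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
intercept = my - slope * mx

qmax_fit = 1 / slope           # maximum adsorption capacity, mg/g
KL_fit = slope / intercept     # Langmuir constant, L/mg
```

With real data, comparing the R² of this fit against a Freundlich fit is the usual way a study decides which isotherm is "suitable".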

  5. Methodology for designing accelerated aging tests for predicting life of photovoltaic arrays

    NASA Technical Reports Server (NTRS)

    Gaines, G. B.; Thomas, R. E.; Derringer, G. C.; Kistler, C. W.; Bigg, D. M.; Carmichael, D. C.

    1977-01-01

    A methodology for designing aging tests in which life prediction was paramount was developed. The methodology builds upon experience with regard to aging behavior in those material classes which are expected to be utilized as encapsulant elements, viz., glasses and polymers, and upon experience with the design of aging tests. The experiences were reviewed, and results are discussed in detail.

  6. CONCEPTUAL DESIGNS FOR A NEW HIGHWAY VEHICLE EMISSIONS ESTIMATION METHODOLOGY

    EPA Science Inventory

    The report discusses six conceptual designs for a new highway vehicle emissions estimation methodology and summarizes the recommendations of each design for improving the emissions and activity factors in the emissions estimation process. The complete design reports are included a...

  7. Comparative Analysis.

    DTIC Science & Technology

    1987-11-01

    differential qualitative (DQ) analysis, which solves the comparative analysis task, providing explanations suitable for use by design systems, automated diagnosis, intelligent tutoring systems, and explanation-based... Many domains use comparative analysis as an important component, and the explanation is used in many different ways; one method of automated design is the principled...

  8. Hydrogeophysics and remote sensing for the design of hydrogeological conceptual models in hard rocks - Sardón catchment (Spain)

    NASA Astrophysics Data System (ADS)

    Francés, Alain P.; Lubczynski, Maciek W.; Roy, Jean; Santos, Fernando A. M.; Mahmoudzadeh Ardekani, Mohammad R.

    2014-11-01

    Hard rock aquifers are highly heterogeneous and hydrogeologically complex. To contribute to the design of hydrogeological conceptual models of hard rock aquifers, we propose a multi-techniques methodology based on a downward approach that combines remote sensing (RS), non-invasive hydrogeophysics and hydrogeological field data acquisition. The proposed methodology is particularly suitable for data scarce areas. It was applied in the pilot research area of Sardón catchment (80 km2) located west of Salamanca (Spain). The area was selected because of hard-rock hydrogeology, semi-arid climate and scarcity of groundwater resources. The proposed methodology consisted of three main steps. First, we detected the main hydrogeological features at the catchment scale by processing: (i) a high resolution digital terrain model to map lineaments and to outline fault zones; and (ii) high-resolution, multispectral satellite QuickBird and WorldView-2 images to map the outcropping granite. Second, we characterized at the local scale the hydrogeological features identified at step one with: i) ground penetrating radar (GPR) to assess groundwater table depth complementing the available monitoring network data; ii) 2D electric resistivity tomography (ERT) and frequency domain electromagnetic (FDEM) to retrieve the hydrostratigraphy along selected survey transects; iii) magnetic resonance soundings (MRS) to retrieve the hydrostratigraphy and aquifer parameters at the selected survey sites. In the third step, we drilled 5 boreholes (25 to 48 m deep) and performed slug tests to verify the hydrogeophysical interpretation and to calibrate the MRS parameters. Finally, we compiled and integrated all acquired data to define the geometry and parameters of the Sardón aquifer at the catchment scale. In line with a general conceptual model of hard rock aquifers, we identified two main hydrostratigraphic layers: a saprolite layer and a fissured layer. 
Both layers were intersected and drained by fault zones that control the hydrogeology of the catchment. The spatial discontinuities of the saprolite layer were well defined by RS techniques, while the subsurface geometry and aquifer parameters were defined by hydrogeophysics. The GPR method was able to detect the shallow water table at depths between 1 and 3 m b.g.s. The hydrostratigraphy and parameterization of the fissured layer remained uncertain because the ERT and FDEM geophysical methods were quantitatively inconclusive, while MRS detectability was restricted by low volumetric water content. The proposed multi-technique methodology, integrating cost-efficient RS, hydrogeophysics and hydrogeological field investigations, allowed us to characterize the Sardón hard rock aquifer system geometrically and parametrically, facilitating the design of the hydrogeological conceptual model of the area.

  9. Multirate Flutter Suppression System Design for the Benchmark Active Controls Technology Wing. Part 2; Methodology Application Software Toolbox

    NASA Technical Reports Server (NTRS)

    Mason, Gregory S.; Berg, Martin C.; Mukhopadhyay, Vivek

    2002-01-01

    To study the effectiveness of various control system design methodologies, the NASA Langley Research Center initiated the Benchmark Active Controls Project. In this project, the various methodologies were applied to design a flutter suppression system for the Benchmark Active Controls Technology (BACT) Wing. This report describes the user's manual and software toolbox developed at the University of Washington to design a multirate flutter suppression control law for the BACT wing.

  10. Architecture, Design, and System; Performance Assessment and Development Methodology for Computer-Based Systems. Volume 1. Methodology Description, Discussion, and Assessment,

    DTIC Science & Technology

    1983-12-30

    OCR fragment of the report documentation page: AD-A146 577, NSWC TR 83-324, Naval Surface Weapons Center, Silver Spring, MD. Title: "Architecture, Design, and System; Performance Assessment and Development Methodology".

  11. A Way to Select Electrical Sheets of the Segment Stator Core Motors.

    NASA Astrophysics Data System (ADS)

    Enomoto, Yuji; Kitamura, Masashi; Sakai, Toshihiko; Ohara, Kouichiro

    The segment stator core, high-density winding coil, and high-energy-product permanent magnet are indispensable technologies in the development of compact and highly efficient motors. The conventional design method for the segment stator core mostly depended on experienced knowledge in selecting a suitable electromagnetic material, far from an optimized design. Therefore, we have developed a novel design method for selecting a suitable electromagnetic material based on the correlation between material characteristics and motor performance. It enables the selection of a suitable electromagnetic material that will meet the motor specification.

  12. Design of a constant tension thermocouple rake suitable for flame studies

    NASA Technical Reports Server (NTRS)

    Ahuja, Sandeep; Miller, David L.

    1993-01-01

    An improved, spring-loaded thermocouple rake, suitable for studying flame structure, has been designed. This design keeps the thermocouple under tension, thereby ensuring that the thermocouple does not droop due to thermal expansion of the sensing wire when inserted in the flame. The present design allows the use of thermocouple wire as small as 0.0508 mm and relative ease in changing the thermocouple wire.

  13. 7 CFR 3600.3 - Functions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    .... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...

  14. 7 CFR 3600.3 - Functions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    .... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...

  15. 7 CFR 3600.3 - Functions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    .... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...

  16. 7 CFR 3600.3 - Functions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    .... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...

  17. 7 CFR 3600.3 - Functions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    .... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...

  18. A methodology for decisionmaking in project evaluation in land management planning

    Treesearch

    A. Weintraub

    1978-01-01

    In order to evaluate alternative plans, wildland management planners must consider many objectives, such as timber production, recreational use, and community stability. The method presented utilizes the type of qualitative and intuitive information widely available to wildland management planners, and structures this information into a format suitable for...

  19. Teaching Methodologies for Population Education: Inquiry/Discovery Approach, Values Clarification.

    ERIC Educational Resources Information Center

    United Nations Educational, Scientific, and Cultural Organization, Bangkok (Thailand). Regional Office for Education in Asia and the Pacific.

    Divided into two sections, this booklet demonstrates how the discovery/inquiry approach and values clarification can be used to teach population education. Each part presents a theoretical discussion of a teaching method including its definition, its relevance to population education, some outstanding characteristics that make it suitable for…

  20. Using a Virtual Population to Authentically Teach Epidemiology and Biostatistics

    ERIC Educational Resources Information Center

    Dunn, Peter K.; Donnison, Sharn; Cole, Rachel; Bulmer, Michael

    2017-01-01

    Epidemiology is the study of the distribution of disease in human populations. This means that authentically teaching primary data collection in epidemiology is difficult as students cannot easily access suitable human populations. Using an action research methodology, this paper studied the use of a virtual human population (called "The…

  1. Rapid deletion plasmid construction methods for protoplast and Agrobacterium based fungal transformation systems

    USDA-ARS?s Scientific Manuscript database

    Increasing availability of genomic data and sophistication of analytical methodology in fungi has elevated the need for functional genomics tools in these organisms. Gene deletion is a critical tool for functional analysis. The targeted deletion of genes requires both a suitable method for the trans...

  2. Applications of a Constrained Mechanics Methodology in Economics

    ERIC Educational Resources Information Center

    Janova, Jitka

    2011-01-01

    This paper presents instructive interdisciplinary applications of constrained mechanics calculus in economics on a level appropriate for undergraduate physics education. The aim of the paper is (i) to meet the demand for illustrative examples suitable for presenting the background of the highly expanding research field of econophysics even at the…

  3. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  4. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  5. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  6. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  7. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  8. A Human Factors Evaluation of a Methodology for Pressurized Crew Module Acceptability for Zero-Gravity Ingress of Spacecraft

    NASA Technical Reports Server (NTRS)

    Sanchez, Merri J.

    2000-01-01

    This project aimed to develop a methodology for evaluating performance and acceptability characteristics of the pressurized crew module volume suitability for zero-gravity (g) ingress of a spacecraft and to evaluate the operational acceptability of the NASA crew return vehicle (CRV) for zero-g ingress of astronaut crew, volume for crew tasks, and general crew module and seat layout. No standard or methodology has been established for evaluating volume acceptability in human spaceflight vehicles. Volume affects astronauts' ability to ingress and egress the vehicle, and to maneuver in and perform critical operational tasks inside the vehicle. Much research has been conducted on aircraft ingress, egress, and rescue in order to establish military and civil aircraft standards. However, due to the extremely limited number of human-rated spacecraft, this topic has been unaddressed. The NASA CRV was used for this study. The prototype vehicle can return a 7-member crew from the International Space Station in an emergency. The vehicle's internal arrangement must be designed to facilitate rapid zero-g ingress, zero-g maneuverability, ease of one-g egress and rescue, and ease of operational tasks in multiple acceleration environments. A full-scale crew module mockup was built and outfitted with representative adjustable seats, crew equipment, and a volumetrically equivalent hatch. Human factors testing was conducted in three acceleration environments using ground-based facilities and the KC-135 aircraft. Performance and acceptability measurements were collected. Data analysis was conducted using analysis of variance and nonparametric techniques.

  9. An evaluation of total starch and starch gelatinization methodologies in pelleted animal feed.

    PubMed

    Zhu, L; Jones, C; Guo, Q; Lewis, L; Stark, C R; Alavi, S

    2016-04-01

    The quantification of total starch content (TS) or degree of starch gelatinization (DG) in animal feed is always challenging because of the potential interference from other ingredients. In this study, the differences in TS or DG measurement in pelleted swine feed due to variations in analytical methodology were quantified. Pelleted swine feed was used to create 6 different diets manufactured with various processing conditions in a 2 × 3 factorial design (2 conditioning temperatures, 77 or 88°C, and 3 conditioning retention times, 15, 30, or 60 s). Samples at each processing stage (cold mash, hot mash, hot pelletized feed, and final cooled pelletized feed) were collected for each of the 6 treatments and analyzed for TS and DG. Two different methodologies were evaluated for TS determination (the AOAC International method 996.11 vs. the modified glucoamylase method) and DG determination (the modified glucoamylase method vs. differential scanning calorimetry [DSC]). For TS determination, the AOAC International method 996.11 measured lower TS values in cold pellets compared with the modified glucoamylase method. The AOAC International method resulted in lower TS in cold mash than cooled pelletized feed, whereas the modified glucoamylase method showed no significant differences in TS content before or after pelleting. For DG, the modified glucoamylase method demonstrated increased DG with each processing step. Furthermore, increasing the conditioning temperature and time resulted in a greater DG when evaluated by the modified glucoamylase method. However, results demonstrated that DSC is not suitable as a quantitative tool for determining DG in multicomponent animal feeds due to interferences from nonstarch transformations, such as protein denaturation.

  10. Suitability of virtual prototypes to support human factors/ergonomics evaluation during the design.

    PubMed

    Aromaa, Susanna; Väänänen, Kaisa

    2016-09-01

    In recent years, the use of virtual prototyping has increased in product development processes, especially in the assessment of complex systems targeted at end-users. The purpose of this study was to evaluate the suitability of virtual prototyping to support human factors/ergonomics (HFE) evaluation during the design phase. Two different virtual prototypes were used: augmented reality (AR) and virtual environment (VE) prototypes of a maintenance platform of a rock crushing machine. Nineteen designers and other stakeholders were asked to assess the suitability of each prototype for HFE evaluation. Results indicate that the system model characteristics and user interface affect the perceived suitability. The VE system was rated as more suitable than the AR system for supporting the assessment of visibility, reach, and the use of tools. The findings of this study can be used as guidance for implementing virtual prototypes in the product development process. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Hearing aid user guides: suitability for older adults.

    PubMed

    Caposecco, Andrea; Hickson, Louise; Meyer, Carly

    2014-02-01

    The aim of this study was to analyse the content, design, and readability of printed hearing aid user guides to determine their suitability for older adults, who are the main users of hearing aids. User guides were assessed using four readability formulae and a standardized tool for assessing content and design (SAM, the Suitability Assessment of Materials). A sample of 36 hearing aid user guides (four from each of nine different hearing aid manufacturers) was analysed. Sixty-nine percent of the user guides were rated 'not suitable' and 31% were rated 'adequate'. Many scored poorly for scope, vocabulary, aspects of layout and typography, and learning stimulation and motivation. The mean reading grade level across all user guides was 9.6, which is too high for older adults. The content, design, and readability of hearing aid user guides are thus not optimal for older adults and may serve as a barrier to successful hearing aid outcomes for this population.
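
    The grade-level finding above comes from readability formulae; the abstract does not name the four formulae used, so as a sketch, here is one widely used formula (Flesch-Kincaid grade level) with a crude syllable-counting heuristic. The heuristic and its threshold behaviour are assumptions for illustration, not the study's method:

```python
import re

def count_syllables(word):
    """Crude heuristic: count vowel groups, dropping a trailing silent 'e'."""
    word = word.lower()
    if word.endswith("e") and not word.endswith(("le", "ee")):
        word = word[:-1]
    return max(1, len(re.findall(r"[aeiouy]+", word)))

def fk_grade(text):
    """Flesch-Kincaid grade level:
    0.39 * (words/sentence) + 11.8 * (syllables/word) - 15.59"""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * len(words) / len(sentences)
            + 11.8 * syllables / len(words) - 15.59)
```

Applied to a user-guide passage, a result near the study's mean of grade 9.6 would flag the text as too demanding for the typical older-adult reader.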

  12. Advanced piloted aircraft flight control system design methodology. Volume 1: Knowledge base

    NASA Technical Reports Server (NTRS)

    Mcruer, Duane T.; Myers, Thomas T.

    1988-01-01

    The development of a comprehensive and eclectic methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. While theory and associated computational means are an important aspect of the design methodology, the lore, knowledge and experience elements which guide and govern applications are critical features. This material is presented as summary tables, outlines, recipes, empirical data, lists, etc., which encapsulate a great deal of expert knowledge. Much of this is presented in topical knowledge summaries which are attached as Supplements. The composite of the supplements and the main body elements constitutes a first cut at a Mark 1 Knowledge Base for manned-aircraft flight control.

  13. Research and development activities in unified control-structure modeling and design

    NASA Technical Reports Server (NTRS)

    Nayak, A. P.

    1985-01-01

    Results of work to develop a unified control/structures modeling and design capability for large space structures modeling are presented. Recent analytical results are presented to demonstrate the significant interdependence between structural and control properties. A new design methodology is suggested in which the structure, material properties, dynamic model and control design are all optimized simultaneously. Parallel research done by other researchers is reviewed. The development of a methodology for global design optimization is recommended as a long-term goal. It is suggested that this methodology should be incorporated into computer aided engineering programs, which eventually will be supplemented by an expert system to aid design optimization.

  14. The methodology of database design in organization management systems

    NASA Astrophysics Data System (ADS)

    Chudinov, I. L.; Osipova, V. V.; Bobrova, Y. V.

    2017-01-01

    The paper describes a unified methodology of database design for management information systems. Designing the conceptual information model for the domain area is the most important and labor-intensive stage in database design. Based on the proposed integrated approach to designing the conceptual information model, the main principles of developing relational databases are provided and users' information needs are considered. According to the methodology, the process of designing the conceptual information model includes three basic stages, which are defined in detail. Finally, the article describes how the results of analyzing users' information needs are applied, and the rationale for the use of classifiers.

  15. Soft robot design methodology for `push-button' manufacturing

    NASA Astrophysics Data System (ADS)

    Paik, Jamie

    2018-06-01

    `Push-button' or fully automated manufacturing would enable the production of robots with zero intervention from human hands. Realizing this utopia requires a fundamental shift from a sequential (design-materials-manufacturing) to a concurrent design methodology.

  16. DESIGNING PROCESSES FOR ENVIRONMENTAL PROBLEMS

    EPA Science Inventory

    Designing for the environment requires consideration of environmental impacts. The Generalized WAR Algorithm is the methodology that allows the user to evaluate the potential environmental impact of the design of a chemical process. In this methodology, chemicals are assigned val...

  17. Philosophical and Methodological Beliefs of Instructional Design Faculty and Professionals

    ERIC Educational Resources Information Center

    Sheehan, Michael D.; Johnson, R. Burke

    2012-01-01

    The purpose of this research was to probe the philosophical beliefs of instructional designers using sound philosophical constructs and quantitative data collection and analysis. We investigated the philosophical and methodological beliefs of instructional designers, including 152 instructional design faculty members and 118 non-faculty…

  18. Environmental analysis in the selection of alternative corridors in a long-distance linear project: a methodological proposal.

    PubMed

    Rescia, Alejandro J; Astrada, Elizabeth N; Bono, Julieta; Blasco, Carlos A; Meli, Paula; Adámoli, Jorge M

    2006-08-01

    A linear engineering project--i.e. a pipeline--has a potential long- and short-term impact on the environment and on the inhabitants therein. We must find better, less expensive, and less time-consuming ways to obtain information on the environment and on any modifications resulting from anthropic activity. We need scientifically sound, rapid and affordable assessment and monitoring methods. Construction companies, industries and the regulating government organisms lack the resources needed to conduct long-term basic studies of the environment. Thus there is a need to make the necessary adjustments and improvements in the environmental data considered useful for this development project. More effective and less costly methods are generally needed. We characterized the landscape of the study area, situated in the center and north-east of Argentina. Little is known of the ecology of this region and substantial research is required in order to develop sustainable uses and, at the same time, to develop methods for reducing impacts, both primary and secondary, resulting from anthropic activity in this area. Furthermore, we made an assessment of the environmental impact of the planned linear project, applying an ad hoc impact index, and we analyzed the different alternatives for a corridor, each one of these involving different sections of the territory. Among the alternative corridors considered, this study locates the most suitable ones in accordance with a selection criterion based on different environmental and conservation aspects. We selected the corridor that we considered to be the most compatible--i.e. with the least potential environmental impact--for the possible construction and operation of the linear project. This information, along with suitable measures for mitigating possible impacts, should be the basis of an environmental management plan for the design process and location of the project. 
We point out the objectivity and efficiency of this methodological approach, along with the possibility of integrating the information so that it can be applied in other studies of this type.

  19. A Control Law Design Method Facilitating Control Power, Robustness, Agility, and Flying Qualities Tradeoffs: CRAFT

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick C.; Davidson, John B.

    1998-01-01

    A multi-input, multi-output control law design methodology, named "CRAFT", is presented. CRAFT stands for the design objectives addressed, namely, Control power, Robustness, Agility, and Flying Qualities Tradeoffs. The methodology makes use of control law design metrics from each of the four design objective areas. It combines eigenspace assignment, which allows for direct specification of eigenvalues and eigenvectors, with a graphical approach for representing the metrics that captures numerous design goals in one composite illustration. Sensitivity of the metrics to eigenspace choice is clearly displayed, enabling the designer to assess the cost of design tradeoffs. This approach enhances the designer's ability to make informed design tradeoffs and to reach effective final designs. An example of the CRAFT methodology applied to an advanced experimental fighter and discussion of associated design issues are provided.
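
Eigenspace assignment, at the core of CRAFT, reduces in its simplest single-input special case to matching a desired closed-loop characteristic polynomial. The sketch below illustrates only that special case, for a system already in controllable canonical form; the open-loop coefficients and desired poles are invented for illustration and are not from the paper.

```python
# Minimal sketch of eigenvalue assignment for a single-input system in
# controllable canonical form. For such a system with open-loop characteristic
# polynomial s^n + a1*s^(n-1) + ... + an, state feedback u = -K x shifts each
# coefficient directly, so K is just the coefficient difference.

def poly_from_roots(roots):
    """Expand prod(s - r) into monic coefficients [1, c1, ..., cn]."""
    coeffs = [1.0]
    for r in roots:
        coeffs = coeffs + [0.0]
        for i in range(len(coeffs) - 1, 0, -1):
            coeffs[i] -= r * coeffs[i - 1]
    return coeffs

def canonical_gains(open_coeffs, desired_roots):
    """Feedback gains [k1, ..., kn] in state order for canonical form."""
    des = poly_from_roots(desired_roots)
    assert len(des) == len(open_coeffs)
    # State x1 pairs with the constant coefficient, xn with the s^(n-1) term.
    return [d - a for d, a in zip(des[1:][::-1], open_coeffs[1:][::-1])]

# Example: open loop s^2 + s (a damped integrator chain), desired poles -3, -4.
K = canonical_gains([1.0, 1.0, 0.0], [-3.0, -4.0])
```

Here `K` comes out as `[12, 6]`, i.e. desired polynomial `s^2 + 7s + 12` minus open-loop `s^2 + s`. Full eigenspace assignment as used in CRAFT additionally shapes eigenvectors in the multi-input case, which this toy does not attempt.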

  20. Land Suitability Assessment in the Catchment Area of Four Southwestern Atlantic Coastal Lagoons: Multicriteria and Optimization Modeling

    NASA Astrophysics Data System (ADS)

    Rodriguez-Gallego, Lorena; Achkar, Marcel; Conde, Daniel

    2012-07-01

In the present study, a land suitability assessment was conducted in the basin of four Uruguayan coastal lagoons (Southwestern Atlantic) to analyze productive development while minimizing eutrophication, biodiversity loss and conflicts among different land uses. Suitable land for the agriculture, forest, livestock ranching, tourism and conservation sectors was initially established based on a multi-attribute model developed using a geographic information system. Experts were consulted to determine the requirements for each land use sector and the incompatibilities among land use types. The current and potential conflicts among incompatible land use sectors were analyzed by overlapping land suitability maps. We subsequently applied a multi-objective model in which land (pixels) with similar suitability was clustered into "land suitability groups", using a two-phase cluster analysis and the Akaike Information Criterion. Finally, a linear programming optimization procedure was applied to allocate land use sectors to land suitability groups, maximizing total suitability and minimizing interference among sectors. Results indicated that current land use overlapped by 4.7% with suitable land of other incompatible sectors. However, the suitable land of incompatible sectors overlapped in 20.3% of the study area, indicating a high potential for future conflict. The highest competition was between agriculture and conservation, followed by forest and agriculture. We explored scenarios in which livestock ranching and tourism intensified, and found that interference with conservation and agriculture notably increased. This methodology allowed us to analyze current and potential land use conflicts and to contribute to the strategic planning of the study area.
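
The final allocation step described above (assigning land use sectors to land suitability groups so that total suitability is maximal) can be illustrated with a toy assignment problem. The paper uses linear programming; for a case this small, brute force over permutations finds the same optimum. All sector names beyond the abstract and all suitability scores are invented for illustration.

```python
# Toy sketch of the allocation step: one sector per land suitability group,
# maximizing total suitability. Scores are hypothetical, on a 0-1 scale.
from itertools import permutations

sectors = ["agriculture", "forest", "ranching", "conservation"]
groups = ["G1", "G2", "G3", "G4"]
suit = {
    "agriculture":  {"G1": 0.9, "G2": 0.4, "G3": 0.2, "G4": 0.1},
    "forest":       {"G1": 0.5, "G2": 0.8, "G3": 0.3, "G4": 0.2},
    "ranching":     {"G1": 0.6, "G2": 0.5, "G3": 0.7, "G4": 0.4},
    "conservation": {"G1": 0.2, "G2": 0.3, "G3": 0.4, "G4": 0.9},
}

def best_allocation():
    """Exhaustively try every sector-to-group assignment, keep the best."""
    best, best_score = None, float("-inf")
    for perm in permutations(groups):
        score = sum(suit[s][g] for s, g in zip(sectors, perm))
        if score > best_score:
            best, best_score = dict(zip(sectors, perm)), score
    return best, best_score
```

With these scores each sector's best group is distinct, so the optimum simply gives every sector its preferred group; an LP formulation becomes necessary once areas, pixel counts and incompatibility constraints enter, as in the actual study.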

  1. Land suitability assessment in the catchment area of four Southwestern Atlantic coastal lagoons: multicriteria and optimization modeling.

    PubMed

    Rodriguez-Gallego, Lorena; Achkar, Marcel; Conde, Daniel

    2012-07-01

In the present study, a land suitability assessment was conducted in the basin of four Uruguayan coastal lagoons (Southwestern Atlantic) to analyze productive development while minimizing eutrophication, biodiversity loss and conflicts among different land uses. Suitable land for the agriculture, forest, livestock ranching, tourism and conservation sectors was initially established based on a multi-attribute model developed using a geographic information system. Experts were consulted to determine the requirements for each land use sector and the incompatibilities among land use types. The current and potential conflicts among incompatible land use sectors were analyzed by overlapping land suitability maps. We subsequently applied a multi-objective model in which land (pixels) with similar suitability was clustered into "land suitability groups", using a two-phase cluster analysis and the Akaike Information Criterion. Finally, a linear programming optimization procedure was applied to allocate land use sectors to land suitability groups, maximizing total suitability and minimizing interference among sectors. Results indicated that current land use overlapped by 4.7% with suitable land of other incompatible sectors. However, the suitable land of incompatible sectors overlapped in 20.3% of the study area, indicating a high potential for future conflict. The highest competition was between agriculture and conservation, followed by forest and agriculture. We explored scenarios in which livestock ranching and tourism intensified, and found that interference with conservation and agriculture notably increased. This methodology allowed us to analyze current and potential land use conflicts and to contribute to the strategic planning of the study area.

  2. Methodological convergence of program evaluation designs.

    PubMed

    Chacón-Moscoso, Salvador; Anguera, M Teresa; Sanduvete-Chaves, Susana; Sánchez-Martín, Milagrosa

    2014-01-01

Nowadays, the dichotomous opposition between experimental/quasi-experimental and non-experimental/ethnographic studies persists but, despite the extensive use of non-experimental/ethnographic studies, the most systematic work on methodological quality has been developed for experimental and quasi-experimental studies. This hinders evaluators' and planners' practice of empirical program evaluation, a sphere in which the distinction between types of study is continually changing and increasingly blurred. Based on the classical validity framework of experimental/quasi-experimental studies, we review the literature in order to analyze the convergence of design elements bearing on methodological quality in primary studies in systematic reviews and in ethnographic research. We specify the relevant design elements that should be taken into account in order to improve validity and generalization across methodologies, from a practical and complementary methodological view, and we recommend ways to improve these design elements so as to enhance validity and generalization in program evaluation practice.

  3. Longitudinal Research with Sexual Assault Survivors: A Methodological Review

    ERIC Educational Resources Information Center

    Campbell, Rebecca; Sprague, Heather Brown; Cottrill, Sara; Sullivan, Cris M.

    2011-01-01

    Longitudinal research designs are relatively rare in the academic literature on rape and sexual assault despite their tremendous methodological rigor and scientific utility. In the interest of promoting wider use of such methods, we conducted a methodological review of projects that have used prospective longitudinal designs to study the…

  4. Solid Waste Management Planning--A Methodology

    ERIC Educational Resources Information Center

    Theisen, Hilary M.; And Others

    1975-01-01

    This article presents a twofold solid waste management plan consisting of a basic design methodology and a decision-making methodology. The former provides a framework for the developing plan while the latter builds flexibility into the design so that there is a model for use during the planning process. (MA)

  5. Advanced piloted aircraft flight control system design methodology. Volume 2: The FCX flight control design expert system

    NASA Technical Reports Server (NTRS)

    Myers, Thomas T.; Mcruer, Duane T.

    1988-01-01

The development of a comprehensive and eclectic methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. The FCX expert system as presently developed is only a limited prototype capable of supporting basic lateral-directional FCS design activities related to the design example used. FCX presently supports design of only one FCS architecture (yaw damper plus roll damper) and the rules are largely focused on Class IV (highly maneuverable) aircraft. Despite this limited scope, the major elements which appear necessary for application of knowledge-based software concepts to flight control design were assembled, and thus FCX represents a prototype which can be tested, critiqued and evolved in an ongoing process of development.

  6. Validation of a Low-Thrust Mission Design Tool Using Operational Navigation Software

    NASA Technical Reports Server (NTRS)

    Englander, Jacob A.; Knittel, Jeremy M.; Williams, Ken; Stanbridge, Dale; Ellison, Donald H.

    2017-01-01

    Design of flight trajectories for missions employing solar electric propulsion requires a suitably high-fidelity design tool. In this work, the Evolutionary Mission Trajectory Generator (EMTG) is presented as a medium-high fidelity design tool that is suitable for mission proposals. EMTG is validated against the high-heritage deep-space navigation tool MIRAGE, demonstrating both the accuracy of EMTG's model and an operational mission design and navigation procedure using both tools. The validation is performed using a benchmark mission to the Jupiter Trojans.

  7. Helicopter-V/STOL dynamic wind and turbulence design methodology

    NASA Technical Reports Server (NTRS)

    Bailey, J. Earl

    1987-01-01

Aircraft and helicopter accidents due to severe dynamic wind and turbulence continue to present challenging design problems. The development of the current set of design analysis tools for aircraft wind and turbulence design began in the 1940s and 1950s. The areas of helicopter dynamic wind and turbulence modeling and vehicle response to severe dynamic wind inputs (microburst-type phenomena) during takeoff and landing remain major unsolved design problems, owing to a lack of both environmental data and computational methodology. The development of helicopter and V/STOL dynamic wind and turbulence response computation methodology is reviewed, the current state of the design art in industry is outlined, and comments on design methodology are made which may serve to improve future flight vehicle design.

  8. Scoping the polymer genome: A roadmap for rational polymer dielectrics design and beyond

    DOE PAGES

    Mannodi-Kanakkithodi, Arun; Chandrasekaran, Anand; Kim, Chiho; ...

    2017-12-19

The Materials Genome Initiative (MGI) has heralded a sea change in the philosophy of materials design. In an increasing number of applications, the successful deployment of novel materials has benefited from the use of computational methodologies, data descriptors, and machine learning. Polymers have long suffered from a lack of data on electronic, mechanical, and dielectric properties across large chemical spaces, causing a stagnation in the set of suitable candidates for various applications. Extensive efforts over the last few years have seen the fruitful application of MGI principles toward the accelerated discovery of attractive polymer dielectrics for capacitive energy storage. Here, we review these efforts, highlighting the importance of computational data generation and screening, targeted synthesis and characterization, polymer fingerprinting and machine-learning prediction models, and the creation of an online knowledgebase to guide ongoing and future polymer discovery and design. We lay special emphasis on the fingerprinting of polymers in terms of their genome or constituent atomic and molecular fragments, an idea that pays homage to the pioneers of the human genome project who identified the basic building blocks of human DNA. As a result, by scoping the polymer genome, we present an essential roadmap for the design of polymer dielectrics, and provide future perspectives and directions for expansions to other polymer subclasses and properties.
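
The fragment-based fingerprinting this review emphasizes can be caricatured in a few lines: represent a polymer repeat unit by normalized counts of its small constituent pieces. The sketch below uses character bigrams of a SMILES-like string purely as a stand-in; real pipelines use chemically meaningful atomic and block-level fragments, and nothing here is taken from the paper.

```python
# Hedged illustration of fragment-count fingerprinting: map a repeat-unit
# string to a normalized histogram of its n-grams. The bigram "fragments"
# are a placeholder for real chemical fragments.
from collections import Counter

def fingerprint(repeat_unit, n=2):
    """Return {fragment: relative frequency} for character n-grams."""
    grams = [repeat_unit[i:i + n] for i in range(len(repeat_unit) - n + 1)]
    counts = Counter(grams)
    total = sum(counts.values())
    return {g: c / total for g, c in counts.items()}
```

Vectors like these, one per candidate polymer, are what a machine-learning property model consumes; the review's point is that the fragment vocabulary plays the role of a "genome" shared across the chemical space.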

  9. Scoping the polymer genome: A roadmap for rational polymer dielectrics design and beyond

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mannodi-Kanakkithodi, Arun; Chandrasekaran, Anand; Kim, Chiho

The Materials Genome Initiative (MGI) has heralded a sea change in the philosophy of materials design. In an increasing number of applications, the successful deployment of novel materials has benefited from the use of computational methodologies, data descriptors, and machine learning. Polymers have long suffered from a lack of data on electronic, mechanical, and dielectric properties across large chemical spaces, causing a stagnation in the set of suitable candidates for various applications. Extensive efforts over the last few years have seen the fruitful application of MGI principles toward the accelerated discovery of attractive polymer dielectrics for capacitive energy storage. Here, we review these efforts, highlighting the importance of computational data generation and screening, targeted synthesis and characterization, polymer fingerprinting and machine-learning prediction models, and the creation of an online knowledgebase to guide ongoing and future polymer discovery and design. We lay special emphasis on the fingerprinting of polymers in terms of their genome or constituent atomic and molecular fragments, an idea that pays homage to the pioneers of the human genome project who identified the basic building blocks of human DNA. As a result, by scoping the polymer genome, we present an essential roadmap for the design of polymer dielectrics, and provide future perspectives and directions for expansions to other polymer subclasses and properties.

  10. Novel chemometric strategy based on the application of artificial neural networks to crossed mixture design for the improvement of recombinant protein production in continuous culture.

    PubMed

    Didier, Caroline; Forno, Guillermina; Etcheverrigaray, Marina; Kratje, Ricardo; Goicoechea, Héctor

    2009-09-21

    The optimal blends of six compounds that should be present in culture media used in recombinant protein production were determined by means of artificial neural networks (ANN) coupled with crossed mixture experimental design. This combination constitutes a novel approach to develop a medium for cultivating genetically engineered mammalian cells. The compounds were collected in two mixtures of three elements each, and the experimental space was determined by a crossed mixture design. Empirical data from 51 experimental units were used in a multiresponse analysis to train artificial neural networks which satisfy different requirements, in order to define two new culture media (Medium 1 and Medium 2) to be used in a continuous biopharmaceutical production process. These media were tested in a bioreactor to produce a recombinant protein in CHO cells. Remarkably, for both predicted media all responses satisfied the predefined goals pursued during the analysis, except in the case of the specific growth rate (mu) observed for Medium 1. ANN analysis proved to be a suitable methodology to be used when dealing with complex experimental designs, as frequently occurs in the optimization of production processes in the biotechnology area. The present work is a new example of the use of ANN for the resolution of a complex, real life system, successfully employed in the context of a biopharmaceutical production process.

  11. Experimental investigation of the structural behavior of equine urethra.

    PubMed

    Natali, Arturo Nicola; Carniel, Emanuele Luigi; Frigo, Alessandro; Fontanella, Chiara Giulia; Rubini, Alessandro; Avital, Yochai; De Benedictis, Giulia Maria

    2017-04-01

An integrated experimental and computational investigation was developed with the aim of providing a methodology for characterizing the structural response of the urethral duct. The investigation provides information that is suitable for the actual comprehension of lower urinary tract mechanical functionality and the optimal design of prosthetic devices. Experimental activity entailed the execution of inflation tests performed on segments of horse penile urethras from both proximal and distal regions. Inflation tests were developed imposing different volumes. Each test was performed according to a two-step procedure. The tubular segment was inflated almost instantaneously during the first step, while volume was held constant for about 300 s to allow the development of relaxation processes during the second step. Tests performed on the same specimen were interspersed by 600 s of rest to allow the recovery of the specimen's mechanical condition. Results from the experimental activities were statistically analyzed and processed by means of a specific mechanical model. This computational model was developed with the purpose of interpreting the general pressure-volume-time response of biologic tubular structures. The model includes parameters that interpret the elastic and viscous behavior of hollow structures, directly correlated with the results from the experimental activities. Post-processing of experimental data provided information about the non-linear elastic and time-dependent behavior of the urethral duct. In detail, statistically representative pressure-volume and pressure relaxation curves were identified, and summarized by structural parameters. Considering elastic properties, initial stiffness ranged between 0.677 ± 0.026 kPa and 0.262 ± 0.006 kPa moving from the proximal to the distal region of the penile urethra. 
Viscous parameters showed values typical of soft biological tissues, with τ1 = 0.153 ± 0.018 s, τ2 = 17.458 ± 1.644 s and τ1 = 0.201 ± 0.085 s, τ2 = 8.514 ± 1.379 s for the proximal and distal regions respectively. A general procedure for the mechanical characterization of the urethral duct has been provided. The proposed methodology allows identifying mechanical parameters that properly express the mechanical behavior of the biological tube. The approach is especially suitable for evaluating the influence of degenerative phenomena on lower urinary tract mechanical functionality. This information is essential for the optimal design of potential surgical procedures and devices. Copyright © 2017 Elsevier B.V. All rights reserved.
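
Two relaxation time constants of this kind suggest a two-term Prony-series pressure response, p(t) = p_inf + a1·exp(-t/τ1) + a2·exp(-t/τ2). The sketch below evaluates such a model using the proximal-urethra time constants from the abstract; the equilibrium pressure p_inf and the amplitudes a1, a2 are invented placeholders, since the abstract does not report them.

```python
# Hedged sketch of a two-term exponential pressure-relaxation model consistent
# with the reported time constants. tau1, tau2 are the proximal values from
# the abstract; p_inf, a1, a2 are illustrative only (arbitrary units).
import math

def relaxation_pressure(t, p_inf=1.0, a1=0.4, a2=0.2, tau1=0.153, tau2=17.458):
    """Pressure at time t after a step inflation, held at constant volume."""
    return p_inf + a1 * math.exp(-t / tau1) + a2 * math.exp(-t / tau2)
```

The fast term (τ1 well under a second) decays almost immediately after inflation, while the slow term (τ2 of order 10 s) governs the tail; by the end of the 300 s hold described in the protocol the response has essentially reached its equilibrium value.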

  12. A novel high-performance self-powered ultraviolet photodetector: Concept, analytical modeling and analysis

    NASA Astrophysics Data System (ADS)

    Ferhati, H.; Djeffal, F.

    2017-12-01

In this paper, a new MSM-UV-photodetector (PD) based on a dual wide-band-gap material (DM) engineering approach is proposed to achieve a high-performance self-powered device. Comprehensive analytical models for the proposed sensor photocurrent and the device properties are developed, incorporating the impact of the DM aspect on the device's photoelectrical behavior. The obtained results are validated against numerical data from commercial TCAD software. Our investigation demonstrates that the adopted design amendment modulates the electric field in the device, which makes it possible to drive the photo-generated carriers without an externally applied voltage. This mechanism achieves the dual role of effective carrier separation and efficient reduction of the dark current. Moreover, a new hybrid approach based on analytical modeling and Particle Swarm Optimization (PSO) is proposed to achieve improved photoelectric behavior at zero bias, ensuring a favorable self-powered MSM-based UV-PD. It is found that the proposed design methodology succeeds in identifying an optimized design that offers a self-powered device with high responsivity (98 mA/W) and a superior ION/IOFF ratio (480 dB). These results make the optimized MSM-UV-DM-PD suitable for providing low-cost self-powered devices for high-performance optical communication and monitoring applications.
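
The PSO step named above can be sketched generically. The toy below minimizes a one-dimensional test function; the real objective (the zero-bias photoelectric figure of merit built from the analytical device model) and all hyperparameters here are placeholders, not values from the paper.

```python
# Hedged sketch of Particle Swarm Optimization: particles track personal and
# global bests and move under inertia (w) plus cognitive (c1) and social (c2)
# pulls. Reduced to one design variable for clarity.
import random

def pso_minimize(f, lo, hi, n_particles=20, iters=100,
                 w=0.7, c1=1.5, c2=1.5, seed=1):
    rng = random.Random(seed)
    xs = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vs = [0.0] * n_particles
    pbest = list(xs)                 # each particle's best-so-far position
    gbest = min(xs, key=f)           # swarm's best-so-far position
    for _ in range(iters):
        for i in range(n_particles):
            vs[i] = (w * vs[i]
                     + c1 * rng.random() * (pbest[i] - xs[i])
                     + c2 * rng.random() * (gbest - xs[i]))
            xs[i] = min(hi, max(lo, xs[i] + vs[i]))  # clamp to bounds
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i]
            if f(xs[i]) < f(gbest):
                gbest = xs[i]
    return gbest
```

In the paper's hybrid approach the evaluation of `f` would call the analytical device model rather than a closed-form test function, which is precisely what makes PSO attractive: it needs only objective values, not gradients.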

  13. Concept and Design of a 3D Printed Support to Assist Hand Scanning for the Realization of Customized Orthosis.

    PubMed

    Baronio, Gabriele; Volonghi, Paola; Signoroni, Alberto

    2017-01-01

In the rehabilitation field, the use of additive manufacturing techniques to realize customized orthoses is increasingly widespread. Obtaining a 3D model for the 3D printing phase can be done following different methodologies. We consider the creation of personalized upper limb orthoses, also including fingers, starting from the acquisition of the hand geometry through accurate 3D scanning. However, the hand scanning procedure presents differences between healthy subjects and patients affected by pathologies that compromise upper limb functionality. In this work, we present the concept and design of a 3D printed support to assist the hand scanning of such patients. The device, realized with FDM additive manufacturing techniques in ABS material, allows palmar acquisitions, and its design and testing are motivated by the following needs: (1) immobilizing the hand of patients during palmar scanning to reduce involuntary movements affecting the scanning quality and (2) keeping hands open and in a correct position, especially to counteract the high degree of hypertonicity of spastic subjects. The resulting device can be used indifferently for the right and the left hand; it is provided in four different sizes and may also be suitable as a palmar support for the acquisition of the dorsal side of the hand.

  14. Analysis of phase II methodologies for single-arm clinical trials with multiple endpoints in rare cancers: An example in Ewing's sarcoma.

    PubMed

    Dutton, P; Love, S B; Billingham, L; Hassan, A B

    2018-05-01

Trials run in either rare diseases, such as rare cancers, or rare sub-populations of common diseases are challenging in terms of identifying, recruiting and treating sufficient patients in a reasonable period. Treatments for rare diseases are often designed for other disease areas and then later proposed as possible treatments for the rare disease after initial phase I testing is complete. To ensure the trial is in the best interests of the patient participants, frequent interim analyses are needed to force the trial to stop promptly if the treatment is futile or toxic. These non-definitive phase II trials should also be stopped for efficacy to accelerate research progress if the treatment proves particularly promising. In this paper, we review frequentist and Bayesian methods that have been adapted to incorporate two binary endpoints and frequent interim analyses. The Eurosarc Trial of Linsitinib in advanced Ewing Sarcoma (LINES) is used as a motivating example and provides a suitable platform to compare these approaches. The Bayesian approach provides greater design flexibility, but does not provide additional value over the frequentist approaches in a single-trial setting when the prior is non-informative. However, Bayesian designs are able to borrow from any previous experience, using prior information to improve efficiency.
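
A Bayesian interim futility rule of the kind compared in the paper can be sketched for a single binary endpoint: with a non-informative Beta(1, 1) prior, stop if the posterior probability that the response rate exceeds a target p0 falls below a threshold. The target rate, threshold and interim counts below are illustrative, not values from LINES, and the real designs reviewed handle two endpoints jointly.

```python
# Hedged sketch: Beta-binomial posterior monitoring for one binary endpoint.
# Posterior after x responses in n patients under Beta(1,1) is Beta(1+x, 1+n-x);
# the tail probability P(rate > p0) is computed by Simpson's rule.
import math

def beta_tail(a, b, p0, steps=10_000):
    """P(X > p0) for X ~ Beta(a, b), via composite Simpson integration."""
    log_norm = math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
    def pdf(x):
        if x <= 0.0 or x >= 1.0:
            return 0.0
        return math.exp(log_norm + (a - 1) * math.log(x)
                        + (b - 1) * math.log(1 - x))
    h = (1.0 - p0) / steps
    total = pdf(p0) + pdf(1.0)
    for i in range(1, steps):
        total += (4 if i % 2 else 2) * pdf(p0 + i * h)
    return total * h / 3

def stop_for_futility(responses, n, p0=0.3, threshold=0.1):
    """Stop if posterior P(response rate > p0) drops below the threshold."""
    post = beta_tail(1 + responses, 1 + n - responses, p0)
    return post < threshold
```

With 1 response in 20 patients the posterior probability of exceeding a 30% target rate is well under 1%, so the rule stops; with 10 responses in 20 it continues. The borrowing the paper highlights corresponds to replacing Beta(1, 1) with an informative prior fitted to earlier data.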

  15. Concept and Design of a 3D Printed Support to Assist Hand Scanning for the Realization of Customized Orthosis

    PubMed Central

    Volonghi, Paola

    2017-01-01

In the rehabilitation field, the use of additive manufacturing techniques to realize customized orthoses is increasingly widespread. Obtaining a 3D model for the 3D printing phase can be done following different methodologies. We consider the creation of personalized upper limb orthoses, also including fingers, starting from the acquisition of the hand geometry through accurate 3D scanning. However, the hand scanning procedure presents differences between healthy subjects and patients affected by pathologies that compromise upper limb functionality. In this work, we present the concept and design of a 3D printed support to assist the hand scanning of such patients. The device, realized with FDM additive manufacturing techniques in ABS material, allows palmar acquisitions, and its design and testing are motivated by the following needs: (1) immobilizing the hand of patients during palmar scanning to reduce involuntary movements affecting the scanning quality and (2) keeping hands open and in a correct position, especially to counteract the high degree of hypertonicity of spastic subjects. The resulting device can be used indifferently for the right and the left hand; it is provided in four different sizes and may also be suitable as a palmar support for the acquisition of the dorsal side of the hand. PMID:29234219

  16. Intermediate band formation in a δ-doped like QW superlattices of GaAs/AlxGa1-xAs for solar cell design

    NASA Astrophysics Data System (ADS)

    Del Río-De Santiago, A.; Martínez-Orozco, J. C.; Rodríguez-Magdaleno, K. A.; Contreras-Solorio, D. A.; Rodríguez-Vargas, I.; Ungan, F.

    2018-03-01

A numerical computation of the local density of states is reported for δ-doped-like quantum well superlattices of AlxGa1-xAs, as a possible heterostructure that, when integrated into a solar cell design, can provide an intermediate band of allowed states to assist the absorption of photons with energies lower than the energy gap of the solar cell's constituent materials. The work was performed using the nearest-neighbors sp3s* tight-binding model including spin. The confining potential caused by the ionized donor impurities in the δ-doped seeding, obtained analytically along the lines of the Thomas-Fermi approximation, was reproduced here by varying the Al concentration x. This potential is treated as an external perturbation in the tight-binding methodology and is included in the diagonal terms of the tight-binding Hamiltonian. Special attention is paid to the width of the intermediate band as a function of the aluminium concentration x, the inter-well distance between the δ-doped-like quantum wells, and their number in the superlattice. In general, we conclude that this kind of superlattice can be suitable for intermediate band formation in a possible intermediate-band solar cell design.

  17. Extending Resolution of Fault Slip With Geodetic Networks Through Optimal Network Design

    NASA Astrophysics Data System (ADS)

    Sathiakumar, Sharadha; Barbot, Sylvain Denis; Agram, Piyush

    2017-12-01

    Geodetic networks consisting of high precision and high rate Global Navigation Satellite Systems (GNSS) stations continuously monitor seismically active regions of the world. These networks measure surface displacements and the amount of geodetic strain accumulated in the region and give insight into the seismic potential. SuGar (Sumatra GPS Array) in Sumatra, GEONET (GNSS Earth Observation Network System) in Japan, and PBO (Plate Boundary Observatory) in California are some examples of established networks around the world that are constantly expanding with the addition of new stations to improve the quality of measurements. However, installing new stations to existing networks is tedious and expensive. Therefore, it is important to choose suitable locations for new stations to increase the precision obtained in measuring the geophysical parameters of interest. Here we describe a methodology to design optimal geodetic networks that augment the existing system and use it to investigate seismo-tectonics at convergent and transform boundaries considering land-based and seafloor geodesy. The proposed network design optimization would be pivotal to better understand seismic and tsunami hazards around the world. Land-based and seafloor networks can monitor fault slip around subduction zones with significant resolution, but transform faults are more challenging to monitor due to their near-vertical geometry.
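
One common formalization of "choosing suitable locations for new stations" is D-optimal design: greedily add the candidate station whose sensitivity row most increases the determinant of the information matrix GᵀG for the slip parameters of interest. The sketch below uses two slip parameters and invented sensitivity rows; it is not the paper's actual method or Green's functions.

```python
# Hedged sketch of greedy D-optimal station selection for two fault-slip
# parameters. Each station contributes one sensitivity row (g1, g2);
# maximizing det(G^T G) sharpens the resolution of both parameters.

def info_matrix(rows):
    """Accumulate the 2x2 information matrix G^T G from sensitivity rows."""
    m = [[0.0, 0.0], [0.0, 0.0]]
    for g1, g2 in rows:
        m[0][0] += g1 * g1
        m[0][1] += g1 * g2
        m[1][0] += g2 * g1
        m[1][1] += g2 * g2
    return m

def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def greedy_add(existing, candidates, n_new):
    """Pick n_new candidate rows, each maximizing det of the augmented matrix."""
    chosen, pool = list(existing), list(candidates)
    for _ in range(n_new):
        best = max(pool, key=lambda r: det2(info_matrix(chosen + [r])))
        chosen.append(best)
        pool.remove(best)
    return chosen[len(existing):]
```

The criterion automatically prefers a station whose sensitivity is complementary to the existing network over one that duplicates it, which is the intuition behind augmenting networks like SuGar or GEONET rather than densifying them uniformly.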

  18. Multi-response optimization of Artemia hatching process using split-split-plot design based response surface methodology

    PubMed Central

    Arun, V. V.; Saharan, Neelam; Ramasubramanian, V.; Babitha Rani, A. M.; Salin, K. R.; Sontakke, Ravindra; Haridas, Harsha; Pazhayamadom, Deepak George

    2017-01-01

A novel method, BBD-SSPD, is proposed by combining the Box-Behnken Design (BBD) and the Split-Split Plot Design (SSPD), ensuring a minimum number of experimental runs and thus economical use of resources in multi-factorial experiments. The brine shrimp Artemia was tested to study the combined effects of photoperiod, temperature and salinity, each with three levels, on the hatching percentage and hatching time of their cysts. The BBD was employed to select 13 treatment combinations out of the 27 possible combinations, which were grouped in an SSPD arrangement. Multiple responses were optimized simultaneously using Derringer's desirability function. Photoperiod and temperature, as well as the temperature-salinity interaction, were found to significantly affect the hatching percentage of Artemia, while the hatching time was significantly influenced by photoperiod and temperature, and their interaction. The optimum conditions were a 23 h photoperiod, 29 °C temperature and 28 ppt salinity, resulting in 96.8% hatching in 18.94 h. In order to verify the results obtained from the BBD-SSPD experiment, the experiment was repeated preserving the same setup. Results of the verification experiment were similar to those of the original experiment. It is expected that this method would be suitable for optimizing the hatching process of animal eggs. PMID:28091611
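
Two ingredients named above have compact textbook forms: the three-factor Box-Behnken design (which yields exactly the 13 runs out of 27 that the abstract mentions) and a Derringer-type desirability combining several responses into one score. The sketch below generates the design in coded units and shows a larger-is-better desirability; the split-split-plot grouping and the actual response models are not reproduced here.

```python
# Hedged sketch: three-factor Box-Behnken design in coded units (-1, 0, +1)
# plus a Derringer-style geometric-mean desirability. Targets are illustrative.
from itertools import combinations, product

def box_behnken_3():
    """13 runs: +-1 on each factor pair with the third at 0, plus one center."""
    runs = []
    for i, j in combinations(range(3), 2):
        for a, b in product((-1, 1), repeat=2):
            point = [0, 0, 0]
            point[i], point[j] = a, b
            runs.append(tuple(point))
    runs.append((0, 0, 0))
    return runs

def desirability_max(y, low, high):
    """Larger-is-better desirability, linear ramp between low and high."""
    if y <= low:
        return 0.0
    if y >= high:
        return 1.0
    return (y - low) / (high - low)

def overall_desirability(ds):
    """Geometric mean of individual desirabilities (Derringer's D)."""
    prod_d = 1.0
    for d in ds:
        prod_d *= d
    return prod_d ** (1.0 / len(ds))
```

In a BBD the factor levels -1, 0, +1 map onto the actual low/center/high settings (here, photoperiod, temperature and salinity); the optimizer then searches the fitted response surfaces for the settings maximizing the overall desirability D.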

  19. Optimization of the combined ultrasonic assisted/adsorption method for the removal of malachite green by gold nanoparticles loaded on activated carbon: experimental design.

    PubMed

    Roosta, M; Ghaedi, M; Shokri, N; Daneshfar, A; Sahraei, R; Asghari, A

    2014-01-24

The present study applied experimental design optimization to the removal of malachite green (MG) from aqueous solution by ultrasound-assisted adsorption onto gold nanoparticles loaded on activated carbon (Au-NP-AC). This nanomaterial was characterized using different techniques such as FESEM, TEM, BET, and UV-vis measurements. The effects of variables such as pH, initial dye concentration, adsorbent dosage (g), temperature and sonication time on MG removal were studied using a central composite design (CCD), and the optimum experimental conditions were found with a desirability function (DF) combined with response surface methodology (RSM). Fitting the experimental equilibrium data to various isotherm models such as the Langmuir, Freundlich, Temkin and Dubinin-Radushkevich models shows the suitability and applicability of the Langmuir model. The applicability of kinetic models such as the pseudo-first-order, pseudo-second-order, Elovich and intraparticle diffusion models was tested on the experimental data; the second-order equation and intraparticle diffusion models control the kinetics of the adsorption process. A small amount of the proposed adsorbent (0.015 g) is applicable for successful removal of MG (RE > 99%) in a short time (4.4 min) with high adsorption capacity (140-172 mg g(-1)). Copyright © 2013. Published by Elsevier B.V.
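
The isotherm-fitting step above amounts to fitting q = qm·K·C / (1 + K·C) to equilibrium data and comparing the fit against the other models. The sketch below fits the Langmuir form by a coarse grid search on synthetic, noise-free data generated with qm = 160 mg/g (inside the capacity range the abstract reports) and an invented K; the concentrations and K are placeholders, not the study's data.

```python
# Hedged sketch: least-squares fit of the Langmuir isotherm by grid search.
# q = qm*K*C / (1 + K*C), with qm the monolayer capacity and K the affinity.

def langmuir(c, qm, k):
    return qm * k * c / (1.0 + k * c)

conc = [0.5, 1.0, 2.0, 5.0, 10.0, 20.0]          # mg/L, synthetic
q_obs = [langmuir(c, 160.0, 0.5) for c in conc]  # noise-free for the sketch

def fit_langmuir(conc, q_obs):
    """Return (qm, K) minimizing the sum of squared residuals on a grid."""
    best, best_sse = None, float("inf")
    for qm in range(100, 201, 5):                # qm from 100 to 200 mg/g
        for k10 in range(1, 21):                 # K from 0.1 to 2.0 L/mg
            k = k10 / 10.0
            sse = sum((langmuir(c, qm, k) - q) ** 2
                      for c, q in zip(conc, q_obs))
            if sse < best_sse:
                best, best_sse = (qm, k), sse
    return best
```

With real (noisy) data one would replace the grid search by nonlinear least squares and compare residuals or R² across the Langmuir, Freundlich, Temkin and Dubinin-Radushkevich fits, which is the comparison the abstract summarizes.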

  20. Methodologies for processing plant material into acceptable food on a small scale

    NASA Technical Reports Server (NTRS)

    Parks, Thomas R.; Bindon, John N.; Bowles, Anthony J. G.; Golbitz, Peter; Lampi, Rauno A.; Marquardt, Robert F.

    1994-01-01

    Based on the Controlled Environment Life Support System (CELSS) production of only four crops, wheat, white potatoes, soybeans, and sweet potatoes; a crew size of twelve; a daily planting/harvesting regimen; and zero-gravity conditions, estimates were made of the quantity of food that would need to be grown to provide adequate nutrition, and of the corresponding amount of biomass that would result. Projections were made of the various types of products that could be made from these crops, the unit operations that would be involved, and what menu capability these products could provide. Equipment requirements to perform these unit operations were screened to identify commercially available units capable of operating (or being modified to operate) under CELSS/zero-gravity conditions. Concept designs were developed for those equipment needs for which no suitable units were commercially available. Prototypes of selected concept designs were constructed and tested on a laboratory scale, as were selected commercially available units. This report discusses the practical considerations taken into account in the various design alternatives, some of the many product/process factors that relate to equipment development, and automation alternatives. Recommendations are made on both general and specific areas in which it was felt additional investigation would benefit CELSS missions.

  1. Box-Behnken study design for optimization of bicalutamide-loaded nanostructured lipid carrier: stability assessment.

    PubMed

    Kudarha, Ritu; Dhas, Namdev L; Pandey, Abhijeet; Belgamwar, Veena S; Ige, Pradum P

    2015-01-01

    Bicalutamide (BCM) is an anti-androgen drug used to treat prostate cancer. In this study, nanostructured lipid carriers (NLCs) were chosen as a carrier for the delivery of BCM, using a Box-Behnken (BB) design to optimize quality attributes such as particle size and entrapment efficiency, which are critical for efficient drug delivery and high therapeutic efficacy. The stability of the formulated NLCs was assessed with respect to storage stability, pH stability, hemolysis, protein stability, serum protein stability and accelerated stability. A hot high-pressure homogenizer was used to formulate the BCM-loaded NLCs. In the BB response surface methodology, total lipid, % liquid lipid and % soya lecithin were selected as independent variables, with particle size and % EE as dependent variables. Scanning electron microscopy (SEM) was used for morphological study of the NLCs, and differential scanning calorimetry and X-ray diffraction were used to study crystalline and amorphous behavior. Analysis of the design space showed that the process was robust, with a particle size of less than 200 nm and EE of up to 78%. Stability studies showed that the carrier was stable under various storage conditions and at different pH values. It can be concluded that NLCs may be a suitable carrier for the delivery of BCM with respect to stability and quality attributes.

  2. Single-Case Experimental Designs: A Systematic Review of Published Research and Current Standards

    ERIC Educational Resources Information Center

    Smith, Justin D.

    2012-01-01

    This article systematically reviews the research design and methodological characteristics of single-case experimental design (SCED) research published in peer-reviewed journals between 2000 and 2010. SCEDs provide researchers with a flexible and viable alternative to group designs with large sample sizes. However, methodological challenges have…

  3. 46 CFR 90.10-38 - Specially suitable for vehicles.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 4 2014-10-01 2014-10-01 false Specially suitable for vehicles. 90.10-38 Section 90.10... GENERAL PROVISIONS Definition of Terms Used in This Subchapter § 90.10-38 Specially suitable for vehicles. A space which is specially suitable for vehicles is one designed for the carriage of automobiles or...

  4. 46 CFR 90.10-38 - Specially suitable for vehicles.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 4 2012-10-01 2012-10-01 false Specially suitable for vehicles. 90.10-38 Section 90.10... GENERAL PROVISIONS Definition of Terms Used in This Subchapter § 90.10-38 Specially suitable for vehicles. A space which is specially suitable for vehicles is one designed for the carriage of automobiles or...

  5. On a biologically inspired topology optimization method

    NASA Astrophysics Data System (ADS)

    Kobayashi, Marcelo H.

    2010-03-01

    This work concerns the development of a biologically inspired methodology for the study of topology optimization in engineering and natural systems. The methodology is based on L-systems and their turtle interpretation for the genotype-phenotype modeling of the topology development. The topology is analyzed using the finite element method, and optimized using an evolutionary algorithm with the genetic encoding of the L-system and its turtle interpretation, as well as body shape and physical characteristics. The test cases considered in this work clearly show the suitability of the proposed method for the study of engineering and natural complex systems.
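    The genotype-phenotype encoding described above can be illustrated with a toy L-system: production rules rewrite an axiom string (the genotype), and a turtle interpretation turns the string into geometry (the phenotype). The rules and symbols below are illustrative, not those of the paper:

```python
# Toy L-system: parallel string rewriting plus a turtle interpretation
# that converts the final string into line segments on a grid.

def rewrite(axiom, rules, iterations):
    """Apply L-system production rules in parallel for n iterations."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

def turtle_segments(commands, step=1.0):
    """Interpret 'F' as a forward move and '+'/'-' as 90-degree turns,
    returning the list of line segments drawn."""
    x, y = 0.0, 0.0
    heading = 0  # 0=east, 1=north, 2=west, 3=south
    moves = {0: (1, 0), 1: (0, 1), 2: (-1, 0), 3: (0, -1)}
    segments = []
    for ch in commands:
        if ch == "F":
            dx, dy = moves[heading]
            nx, ny = x + dx * step, y + dy * step
            segments.append(((x, y), (nx, ny)))
            x, y = nx, ny
        elif ch == "+":
            heading = (heading + 1) % 4
        elif ch == "-":
            heading = (heading - 1) % 4
    return segments

generation = rewrite("F", {"F": "F+F-F"}, 2)  # two rewriting steps
structure = turtle_segments(generation)       # resulting geometry
```

In an evolutionary loop of the kind the record describes, the rule strings would be the evolved genome and the resulting structure would be scored by finite element analysis.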

  6. Global Design Optimization for Aerodynamics and Rocket Propulsion Components

    NASA Technical Reports Server (NTRS)

    Shyy, Wei; Papila, Nilay; Vaidyanathan, Rajkumar; Tucker, Kevin; Turner, James E. (Technical Monitor)

    2000-01-01

    Modern computational and experimental tools for aerodynamics and propulsion applications have matured to a stage where they can provide substantial insight into engineering processes involving fluid flows, and can be fruitfully utilized to help improve the design of practical devices. In particular, rapid and continuous development in aerospace engineering demands that new design concepts be regularly proposed to meet goals for increased performance, robustness and safety while concurrently decreasing cost. To date, the majority of the effort in design optimization of fluid dynamics has relied on gradient-based search algorithms. Global optimization methods can utilize the information collected from various sources and by different tools. These methods offer multi-criterion optimization, handle the existence of multiple design points and trade-offs via insight into the entire design space, can easily perform tasks in parallel, and are often effective in filtering the noise intrinsic to numerical and experimental data. However, a successful application of the global optimization method needs to address issues related to data requirements with an increase in the number of design variables, and methods for predicting the model performance. In this article, we review recent progress made in establishing suitable global optimization techniques employing neural network and polynomial-based response surface methodologies. Issues addressed include techniques for construction of the response surface, design of experiment techniques for supplying information in an economical manner, optimization procedures and multi-level techniques, and assessment of relative performance between polynomials and neural networks. Examples drawn from wing aerodynamics, turbulent diffuser flows, gas-gas injectors, and supersonic turbines are employed to help demonstrate the issues involved in an engineering design context. Both the usefulness of the existing knowledge to aid current design practices and the need for future research are identified.
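    The polynomial-based response surface the review discusses can be sketched as an ordinary least-squares fit of a full quadratic model y = b0 + b1·x1 + b2·x2 + b11·x1² + b22·x2² + b12·x1·x2 to sampled design points. The design points and coefficients below are synthetic:

```python
# Fit a full two-variable quadratic response surface by least squares,
# solving the normal equations (X^T X) beta = X^T y with Gaussian
# elimination. Data are generated from assumed "true" coefficients.

def features(x1, x2):
    return [1.0, x1, x2, x1 * x1, x2 * x2, x1 * x2]

def solve(A, b):
    """Gaussian elimination with partial pivoting for A x = b."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

true_b = [5.0, 1.5, -2.0, 0.5, 0.25, -1.0]   # assumed "true" surface
pts = [(x1, x2) for x1 in (-1, 0, 1) for x2 in (-1, 0, 1)]  # 3x3 design
X = [features(x1, x2) for x1, x2 in pts]
y = [sum(b * f for b, f in zip(true_b, row)) for row in X]

XtX = [[sum(X[r][i] * X[r][j] for r in range(len(X))) for j in range(6)]
       for i in range(6)]
Xty = [sum(X[r][i] * y[r] for r in range(len(X))) for i in range(6)]
beta = solve(XtX, Xty)
```

Once fitted, such a surrogate is cheap to evaluate, which is what lets a global optimizer explore the whole design space instead of following a single gradient path.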

  7. The Necessity of Company-Grade Air Defense Artillery Officers in the Air Defense and Airspace Management Cells Within the Brigade Combat Team

    DTIC Science & Technology

    2014-06-13

    ...the role of ADAM Cell OIC. Utilizing the Army design methodology, the study compares the current training and performance of Air Defense officers to... junior company-grade officers to fulfill the role of ADAM Cell OIC.

  8. Comparison and Validation of Hydrological E-Flow Methods through Hydrodynamic Modelling

    NASA Astrophysics Data System (ADS)

    Kuriqi, Alban; Rivaes, Rui; Sordo-Ward, Alvaro; Pinheiro, António N.; Garrote, Luis

    2017-04-01

    Flow regime determines physical habitat conditions and local biotic configuration. The development of environmental flow guidelines to support river integrity is becoming a major concern in water resources management. In this study, we analysed two sites in the southern part of Portugal, on the Odelouca and Ocreza Rivers, characterised by a Mediterranean climate. Both rivers are almost in pristine condition, not regulated by dams or other diversion structures. This study presents an analysis of the effect of implementing different hydrological e-flow methods on fish habitat suitability. To conduct this study we employed hydrological e-flow methods recommended by the European Small Hydropower Association (ESHA). The river hydrology assessment was based on approximately 30 years of mean daily flow data provided by the Portuguese Water Information System (SNIRH). The biological data, bathymetry, physical and hydraulic features, and the Habitat Suitability Index for fish species were collected through extensive field work. We followed the Instream Flow Incremental Methodology (IFIM) to assess the flow-habitat relationship, taking into account the habitat suitability of different instream flow releases. Initially, we analysed fish habitat suitability under natural conditions and used it as the reference condition for the other scenarios, which considered the chosen hydrological e-flow methods. We accomplished the habitat modelling through hydrodynamic analysis using the River-2D model. The same methodology was applied to each scenario, taking as input the e-flows obtained from each of the hydrological methods employed in this study. This contribution shows the significance of ecohydrological studies in establishing a foundation for water resources management actions. Keywords: ecohydrology, e-flow, Mediterranean rivers, river conservation, fish habitat, River-2D, Hydropower.
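    The flow-habitat relationship at the heart of IFIM-style modelling reduces, per flow scenario, to a weighted usable area (WUA): each cell of the hydrodynamic model contributes its area weighted by a composite suitability index for its simulated depth and velocity. A minimal sketch with invented suitability curves and cell data:

```python
# Weighted usable area from per-cell depth and velocity, using
# piecewise-linear habitat suitability curves. Curves and cell values
# are hypothetical, not the study's field data.

def suitability(value, curve):
    """Piecewise-linear habitat suitability index from (value, index)
    breakpoints sorted by value."""
    if value <= curve[0][0]:
        return curve[0][1]
    if value >= curve[-1][0]:
        return curve[-1][1]
    for (v0, s0), (v1, s1) in zip(curve, curve[1:]):
        if v0 <= value <= v1:
            return s0 + (s1 - s0) * (value - v0) / (v1 - v0)

def weighted_usable_area(cells, depth_curve, vel_curve):
    """WUA = sum over cells of area * HSI(depth) * HSI(velocity)."""
    return sum(area * suitability(d, depth_curve) * suitability(v, vel_curve)
               for area, d, v in cells)

# Hypothetical suitability curves for a fish species: (value, index) pairs.
depth_hsi = [(0.0, 0.0), (0.5, 1.0), (2.0, 0.2)]   # depth in m
vel_hsi = [(0.0, 1.0), (1.0, 0.5), (2.0, 0.0)]     # velocity in m/s

# (area m^2, depth m, velocity m/s) per hydrodynamic model cell.
cells = [(10.0, 0.5, 0.0), (20.0, 2.0, 1.0), (5.0, 0.0, 2.0)]
wua = weighted_usable_area(cells, depth_hsi, vel_hsi)
```

Comparing WUA under each e-flow scenario against the natural-flow reference is the scenario comparison the record describes.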

  9. Analysis and Design of Fuselage Structures Including Residual Strength Prediction Methodology

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.

    1998-01-01

    The goal of this research project is to develop and assess methodologies for the design and analysis of fuselage structures accounting for residual strength. Two primary objectives are included in this research activity: the development of structural analysis methodology for predicting the residual strength of fuselage shell-type structures, and the development of accurate, efficient analysis, design and optimization tools for fuselage shell structures. Assessment of these tools for robustness, efficiency, and usability in a fuselage shell design environment will be integrated with these two primary research objectives.

  10. Combination and selection of traffic safety expert judgments for the prevention of driving risks.

    PubMed

    Cabello, Enrique; Conde, Cristina; de Diego, Isaac Martín; Moguerza, Javier M; Redchuk, Andrés

    2012-11-02

    In this paper, we describe a new framework for combining experts' judgments for the prevention of driving risks in a truck cabin. In addition, the methodology shows how to choose among the experts the one whose predictions best fit the environmental conditions. The methodology is applied to data sets obtained from a highly immersive truck cabin simulator under natural driving conditions. A nonparametric model, based on Nearest Neighbors combined with Restricted Least Squares methods, is developed. Three experts were asked to evaluate the driving risk using a Visual Analog Scale (VAS) in a truck simulator where the vehicle dynamics factors were recorded. Numerical results show that the methodology is suitable for embedding in real-time systems.
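    The expert-selection step can be sketched in simplified form: for a new driving condition, find the k nearest stored conditions and choose the expert whose VAS ratings were locally most accurate against a reference risk score. This is only the selection idea; the paper's full method combines Nearest Neighbors with Restricted Least Squares, and all data below are invented:

```python
# Simplified expert selection: pick the expert with the lowest mean
# squared error on the k nearest neighbours of a query condition.
# Conditions, reference scores and expert VAS ratings are invented.

def k_nearest(query, conditions, k):
    """Indices of the k conditions closest to query (Euclidean)."""
    dist = lambda c: sum((a - b) ** 2 for a, b in zip(query, c)) ** 0.5
    return sorted(range(len(conditions)), key=lambda i: dist(conditions[i]))[:k]

def best_expert(query, conditions, expert_vas, reference, k=3):
    """Index of the expert with the lowest local mean squared error."""
    idx = k_nearest(query, conditions, k)
    def mse(e):
        return sum((expert_vas[e][i] - reference[i]) ** 2 for i in idx) / k
    return min(range(len(expert_vas)), key=mse)

# Stored conditions: (speed factor, lateral accel factor), normalized.
conditions = [(0.1, 0.2), (0.2, 0.1), (0.8, 0.9), (0.9, 0.8)]
reference = [1.0, 1.5, 8.0, 9.0]           # reference risk scores
expert_vas = [
    [1.0, 1.6, 3.0, 3.5],                  # expert 0: accurate at low risk
    [4.0, 4.5, 8.1, 8.9],                  # expert 1: accurate at high risk
]
chosen = best_expert((0.85, 0.85), conditions, expert_vas, reference, k=2)
```

Because selection uses only distance lookups and small sums, it is cheap enough for the real-time embedding the record mentions.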

  11. [Radiotherapy phase I trials' methodology: Features].

    PubMed

    Rivoirard, R; Vallard, A; Langrand-Escure, J; Guy, J-B; Ben Mrad, M; Yaoxiong, X; Diao, P; Méry, B; Pigne, G; Rancoule, C; Magné, N

    2016-12-01

    In clinical research, biostatistical methods allow the rigorous analysis of data collection and should be defined from the trial design stage to obtain the appropriate experimental approach. Thus, since the main purpose of phase I is to determine the dose to use during phase II, the methodology should be finely adjusted to the experimental treatment(s). Today, the methodology for chemotherapy and targeted therapy is well known. For radiotherapy and chemoradiotherapy phase I trials, the primary endpoint must reflect both effectiveness and potential treatment toxicities, and the methodology will probably need to be complex to limit failures in the subsequent phases. However, there are very few data about methodology design in the literature. The present study focuses on these particular trials and their characteristics. It should help to highlight the shortcomings of existing methodological patterns and to propose new, better-suited designs. Copyright © 2016 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.

  12. A novel methodology for building robust design rules by using design based metrology (DBM)

    NASA Astrophysics Data System (ADS)

    Lee, Myeongdong; Choi, Seiryung; Choi, Jinwoo; Kim, Jeahyun; Sung, Hyunju; Yeo, Hyunyoung; Shim, Myoungseob; Jin, Gyoyoung; Chung, Eunseung; Roh, Yonghan

    2013-03-01

    This paper addresses a methodology for building robust design rules by using design based metrology (DBM). The conventional method for building design rules uses a simulation tool and a simple-pattern spider mask. At the early stage of a device, the accuracy of the simulation tool is poor, and the evaluation of the simple-pattern spider mask is rather subjective because it depends on the experiential judgment of an engineer. In this work, we designed a huge number of pattern situations, including various 1D and 2D design structures. To overcome the difficulty of inspecting many types of patterns, we introduced the Design Based Metrology (DBM) of Nano Geometry Research, Inc., with which these mass patterns could be inspected at high speed. We also carried out quantitative analysis of PWQ silicon data to estimate process variability. Our methodology demonstrates high speed and accuracy for building design rules: all of the test patterns were inspected within a few hours, and the mass silicon data were handled not by personal judgment but by statistical processing. From the results, robust design rules were successfully verified and extracted. We conclude that our methodology is appropriate for building robust design rules.

  13. Efficient preliminary floating offshore wind turbine design and testing methodologies and application to a concrete spar design

    PubMed Central

    Matha, Denis; Sandner, Frank; Molins, Climent; Campos, Alexis; Cheng, Po Wen

    2015-01-01

    The current key challenge in the floating offshore wind turbine industry and research is to design economic floating systems that can compete with fixed-bottom offshore turbines in terms of levelized cost of energy. The preliminary platform design, as well as early experimental design assessments, is a critical element in the overall design process. In this contribution, a brief review of current floating offshore wind turbine platform pre-design and scaled testing methodologies is provided, with a focus on their ability to accommodate the coupled dynamic behaviour of floating offshore wind systems. The exemplary design and testing methodology for a monolithic concrete spar platform as performed within the European KIC AFOSP project is presented. Results from the experimental tests compared to numerical simulations are presented and analysed and show very good agreement for relevant basic dynamic platform properties. Extreme and fatigue loads and cost analysis of the AFOSP system confirm the viability of the presented design process. In summary, the exemplary application of the reduced design and testing methodology for AFOSP confirms that it represents a viable procedure during pre-design of floating offshore wind turbine platforms. PMID:25583870

  14. Biomimetics in the design of a robotic exoskeleton for upper limb therapy

    NASA Astrophysics Data System (ADS)

    Baniqued, Paul Dominick E.; Dungao, Jade R.; Manguerra, Michael V.; Baldovino, Renann G.; Abad, Alexander C.; Bugtai, Nilo T.

    2018-02-01

    Current methodologies for designing robotic exoskeletons for upper limb therapy simplify the complex requirements of the human anatomy. As a result, such devices tend to compromise safety and biocompatibility with the intended user. A newer design methodology, however, uses biological analogues as inspiration to address these technical issues. This approach follows biomimetics, a design principle that extracts and transfers useful information from natural morphologies and processes to solve technical design issues. In this study, a biomimetic approach to the design of a 5-degree-of-freedom robotic exoskeleton for upper limb therapy was performed. A review of biomimetics is first discussed, along with its current contribution to the design of rehabilitation robots. With the proposed methodological framework, the design for an upper limb robotic exoskeleton was generated using CATIA software, inspired by the morphology of the bones and the muscle force transmission of the upper limbs. Finally, the full design assembly presented integrates features extracted from the biological analogue. The successful execution of a biomimetic design methodology makes a case for safer and more biocompatible rehabilitation robots.

  15. Experimental Methodology for Measuring Combustion and Injection-Coupled Responses

    NASA Technical Reports Server (NTRS)

    Cavitt, Ryan C.; Frederick, Robert A.; Bazarov, Vladimir G.

    2006-01-01

    A Russian scaling methodology for liquid rocket engines utilizing a single, full scale element is reviewed. The scaling methodology exploits the supercritical phase of the full scale propellants to simplify scaling requirements. Many assumptions are utilized in the derivation of the scaling criteria. A test apparatus design is presented to implement the Russian methodology and consequently verify the assumptions. This test apparatus will allow researchers to assess the usefulness of the scaling procedures and possibly enhance the methodology. A matrix of the apparatus capabilities for a RD-170 injector is also presented. Several methods to enhance the methodology have been generated through the design process.

  16. Using Design-Based Research in Gifted Education

    ERIC Educational Resources Information Center

    Jen, Enyi; Moon, Sidney; Samarapungavan, Ala

    2015-01-01

    Design-based research (DBR) is a new methodological framework that was developed in the context of the learning sciences; however, it has not been used very often in the field of gifted education. Compared with other methodologies, DBR is more process-oriented and context-sensitive. In this methodological brief, the authors introduce DBR and…

  17. Tungsten fiber reinforced superalloy composite high temperature component design considerations

    NASA Technical Reports Server (NTRS)

    Winsa, E. A.

    1982-01-01

    Tungsten fiber reinforced superalloy composites (TFRS) are intended for use in high temperature turbine components. Current turbine component design methodology is based on applying the experience, sometimes semiempirical, gained from over 30 years of superalloy component design. Current composite component design capability is generally limited to the methodology for low temperature resin matrix composites. Often the tendency is to treat TFRS as just another superalloy or low temperature composite. However, TFRS behavior differs significantly from that of superalloys, and the high-temperature environment adds considerations not common in low temperature composite component design. The methodology used for the preliminary design of TFRS components is described, with emphasis on considerations unique to TFRS.

  18. Using an innovative combination of quality-by-design and green analytical chemistry approaches for the development of a stability indicating UHPLC method in pharmaceutical products.

    PubMed

    Boussès, Christine; Ferey, Ludivine; Vedrines, Elodie; Gaudin, Karen

    2015-11-10

    An innovative combination of green chemistry and quality by design (QbD) approaches is presented through the development of a UHPLC method for the analysis of the main degradation products of dextromethorphan hydrobromide. The QbD strategy was integrated with green analytical chemistry to improve method understanding while assuring quality and minimizing environmental impact and analyst exposure. The analytical method was thoroughly evaluated by applying risk assessment and multivariate analysis tools. After a scouting phase aimed at selecting a suitable stationary phase and an organic solvent in accordance with green chemistry principles, quality risk assessment tools were applied to determine the critical process parameters (CPPs). The effects of the CPPs on the critical quality attributes (CQAs), i.e., resolutions, efficiencies, and solvent consumption, were evaluated by means of a screening design. A response surface methodology was then carried out to model the CQAs as functions of the selected CPPs, and the optimal separation conditions were determined through a desirability analysis. The resulting contour plots enabled the design space (DS) (method operable design region), where all CQAs fulfilled the requirements, to be established. Experimental validation of the DS proved that quality within the DS was guaranteed; therefore no further robustness study was required before validation. Finally, this UHPLC method was validated using the concept of total error and was used to analyze a pharmaceutical drug product. Copyright © 2015 Elsevier B.V. All rights reserved.
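    The desirability analysis used to locate the optimum can be sketched with Derringer-style transforms: each CQA is mapped onto [0, 1] (larger-is-better for resolution, smaller-is-better for solvent consumption) and the overall desirability is their geometric mean. The CQA ranges and values below are illustrative, not the paper's:

```python
# Derringer & Suich style desirability transforms and their geometric
# mean. Ranges and candidate values are hypothetical.

def d_max(y, lo, hi):
    """Larger-is-better: 0 at/below lo, 1 at/above hi, linear between."""
    return 0.0 if y <= lo else 1.0 if y >= hi else (y - lo) / (hi - lo)

def d_min(y, lo, hi):
    """Smaller-is-better: 1 at/below lo, 0 at/above hi, linear between."""
    return 1.0 if y <= lo else 0.0 if y >= hi else (hi - y) / (hi - lo)

def overall(ds):
    """Geometric mean of individual desirabilities."""
    p = 1.0
    for d in ds:
        p *= d
    return p ** (1.0 / len(ds))

# Hypothetical CQA values for one candidate separation condition:
resolution = d_max(2.4, 1.5, 3.0)   # critical-pair resolution
solvent = d_min(8.0, 5.0, 20.0)     # solvent use per run (mL)
D = overall([resolution, solvent])
```

Evaluating D over the fitted response surfaces and keeping the region where every CQA meets its requirement is what delimits the design space.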

  19. Distinct profiling of antimicrobial peptide families

    PubMed Central

    Khamis, Abdullah M.; Essack, Magbubah; Gao, Xin; Bajic, Vladimir B.

    2015-01-01

    Motivation: The increased prevalence of multi-drug resistant (MDR) pathogens heightens the need to design new antimicrobial agents. Antimicrobial peptides (AMPs) exhibit broad-spectrum potent activity against MDR pathogens and kill rapidly, leading to AMPs being recognized as a potential substitute for conventional antibiotics. Designing new AMPs using current in-silico approaches is, however, challenging due to the absence of suitable models, the large number of design parameters, testing cycles, production time and cost. To date, AMPs have merely been categorized into families according to their primary sequences, structures and functions. The ability to computationally determine the properties that discriminate AMP families from each other could help in exploring the key characteristics of these families and facilitate the in-silico design of synthetic AMPs. Results: Here we studied 14 AMP families and sub-families. We selected a specific description of the AMP amino acid sequence and identified compositional and physicochemical properties of amino acids that accurately distinguish each AMP family from all other AMPs, with an average sensitivity, specificity and precision of 92.88%, 99.86% and 95.96%, respectively. Many of the identified discriminative properties have been shown in the literature to be compositional or functional characteristics of the corresponding AMP family. We suggest that these properties could serve as guides for in-silico methods in the design of novel synthetic AMPs. The methodology we developed is generic and has the potential to be applied to the characterization of any protein family. Contact: vladimir.bajic@kaust.edu.sa Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25388148
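    The compositional properties underlying such family discrimination can be illustrated by the simplest descriptor of this kind: the fraction of each of the 20 standard amino acids in a sequence, fed to a nearest-centroid assignment. The sequences and "families" below are toy examples, not real AMPs or the paper's classifier:

```python
# Amino-acid composition descriptor plus nearest-centroid family
# assignment. Sequences and family centroids are illustrative only.

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def composition(seq):
    """Fraction of each standard amino acid in the sequence."""
    seq = seq.upper()
    n = len(seq)
    return {aa: seq.count(aa) / n for aa in AMINO_ACIDS}

def nearest_family(seq, centroids):
    """Assign a sequence to the family with the closest mean
    composition (Euclidean distance over the 20 fractions)."""
    comp = composition(seq)
    def dist(cent):
        return sum((comp[aa] - cent[aa]) ** 2 for aa in AMINO_ACIDS) ** 0.5
    return min(centroids, key=lambda fam: dist(centroids[fam]))

# Toy family centroids: one glycine/lysine-rich, one cysteine-rich.
fam_a = composition("GKGKGKGKLL")
fam_b = composition("CCSCCSCCPP")
family = nearest_family("GKGKLKGKGG", {"A": fam_a, "B": fam_b})
```

Real discriminative profiling adds physicochemical properties (charge, hydrophobicity, and so on) and a trained classifier, but the feature-vector idea is the same.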

  20. SSDOnt: An Ontology for Representing Single-Subject Design Studies.

    PubMed

    Berges, Idoia; Bermúdez, Jesus; Illarramendi, Arantza

    2018-02-01

    Single-Subject Design is used in several areas, such as education and biomedicine. However, no suitable formal vocabulary exists for annotating the detailed configuration and results of this type of research study with the granularity needed to search for information about them. The search for those study designs therefore relies heavily on syntactical search of the abstract, keywords or full text of publications about the study, which entails some limitations. Our objective is to present SSDOnt, a specific-purpose ontology for describing and annotating single-subject design studies, so that complex questions can be asked about them afterwards. The ontology was developed following the NeOn methodology. Once the requirements of the ontology were defined, a formal model was described in a Description Logic and later implemented in the ontology language OWL 2 DL. We show how the ontology provides a reference model with a suitable terminology for the annotation and searching of single-subject design studies and their main components, such as the phases, intervention types, outcomes and results. Some mappings to terms of related ontologies have been established. As a proof of concept, we show that classes in the ontology can easily be extended to annotate more precise information about specific interventions and outcomes, such as those related to autism, and we provide examples of the types of queries that can be posed to the ontology. SSDOnt achieves its purpose of covering descriptions of the domain of single-subject research studies. Schattauer GmbH.
