Sample records for methodology application results

  1. Application of Resource Description Framework to Personalise Learning: Systematic Review and Methodology

    ERIC Educational Resources Information Center

    Jevsikova, Tatjana; Berniukevicius, Andrius; Kurilovas, Eugenijus

    2017-01-01

    The paper aims to present a methodology of learning personalisation based on applying the Resource Description Framework (RDF) standard model. Research results are two-fold: first, the results of a systematic literature review on Linked Data, RDF "subject-predicate-object" triples, and Web Ontology Language (OWL) application in education…
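
    A hedged illustration of the RDF model named above (a generic sketch, not the authors' system): a learner profile expressed as "subject-predicate-object" triples with Python's rdflib. The vocabulary and resource names are hypothetical.

      from rdflib import Graph, Literal, Namespace, RDF

      EX = Namespace("http://example.org/learning#")  # hypothetical vocabulary
      g = Graph()
      g.bind("ex", EX)

      learner = EX["learner42"]
      g.add((learner, RDF.type, EX.Learner))                     # subject, predicate, object
      g.add((learner, EX.prefersStyle, Literal("visual")))       # literal-valued property
      g.add((learner, EX.suitableResource, EX["videoLesson7"]))  # link to a learning resource

      print(g.serialize(format="turtle"))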

  2. Fault Modeling of Extreme Scale Applications Using Machine Learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vishnu, Abhinav; Dam, Hubertus van; Tallent, Nathan R.

    Faults are commonplace in large scale systems. These systems experience a variety of faults such as transient, permanent and intermittent. Multi-bit faults are typically not corrected by the hardware, resulting in an error. Here, this paper attempts to answer an important question: Given a multi-bit fault in main memory, will it result in an application error — and hence a recovery algorithm should be invoked — or can it be safely ignored? We propose an application fault modeling methodology to answer this question. Given a fault signature (a set of attributes comprising system and application state), we use machine learning to create a model which predicts whether a multi-bit permanent/transient main memory fault will likely result in an error. We present the design elements such as the fault injection methodology for covering important data structures, the application and system attributes which should be used for learning the model, the supervised learning algorithms (and potentially ensembles), and important metrics. Lastly, we use three applications — NWChem, LULESH and SVM — as examples for demonstrating the effectiveness of the proposed fault modeling methodology.
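
    A minimal sketch of the kind of supervised fault model the abstract describes, not the authors' implementation: hypothetical fault-signature attributes, a synthetic label, and a random-forest classifier (Python/scikit-learn).

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import recall_score

      rng = np.random.default_rng(0)
      n = 2000
      # Hypothetical fault-signature attributes at the fault site.
      X = np.column_stack([
          rng.integers(1, 5, n),       # number of flipped bits
          rng.exponential(1.0, n),     # time since last write (s)
          rng.integers(0, 2, n),       # word inside a "live" data structure? (0/1)
      ])
      # Synthetic ground truth: live, recently written multi-bit faults tend to matter.
      y = ((X[:, 2] == 1) & (X[:, 1] < 1.0) & (X[:, 0] >= 2)).astype(int)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
      # Recall matters most here: a missed error means a skipped recovery.
      print("recall:", recall_score(y_te, model.predict(X_te)))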

  3. Fault Modeling of Extreme Scale Applications Using Machine Learning

    DOE PAGES

    Vishnu, Abhinav; Dam, Hubertus van; Tallent, Nathan R.; ...

    2016-05-01

    Faults are commonplace in large scale systems. These systems experience a variety of faults such as transient, permanent and intermittent. Multi-bit faults are typically not corrected by the hardware, resulting in an error. Here, this paper attempts to answer an important question: Given a multi-bit fault in main memory, will it result in an application error — and hence a recovery algorithm should be invoked — or can it be safely ignored? We propose an application fault modeling methodology to answer this question. Given a fault signature (a set of attributes comprising system and application state), we use machine learning to create a model which predicts whether a multi-bit permanent/transient main memory fault will likely result in an error. We present the design elements such as the fault injection methodology for covering important data structures, the application and system attributes which should be used for learning the model, the supervised learning algorithms (and potentially ensembles), and important metrics. Lastly, we use three applications — NWChem, LULESH and SVM — as examples for demonstrating the effectiveness of the proposed fault modeling methodology.

  4. Grounded theory as a method for research in speech and language therapy.

    PubMed

    Skeat, J; Perry, A

    2008-01-01

    The use of qualitative methodologies in speech and language therapy has grown over the past two decades, and there is now a body of literature both generally describing qualitative research and detailing its applicability to health practice(s). However, there has been only limited profession-specific discussion of qualitative methodologies and their potential application to speech and language therapy. This paper aims to describe the methodology of grounded theory and to explain how it might usefully be applied to areas of speech and language research where theoretical frameworks or models are lacking. Grounded theory as a methodology for inductive theory-building from qualitative data is explained and discussed. Some differences between 'modes' of grounded theory are clarified and areas of controversy within the literature are highlighted. The past application of grounded theory to speech and language therapy, and its potential for informing research and clinical practice, are examined. This paper provides an in-depth critique of a qualitative research methodology, including an overview of the main differences between two major 'modes'. The article supports the application of a theory-building approach in the profession, which is sometimes complex to learn and apply but worthwhile in its results. Grounded theory as a methodology has much to offer speech and language therapists and researchers. Although the majority of research and discussion around this methodology has rested within sociology and nursing, grounded theory can be applied by researchers in any field, including speech and language therapy. The benefit of the grounded theory method to researchers and practitioners lies in its application to social processes and human interactions. The resulting theory may support further research in the speech and language therapy profession.

  5. Turbofan engine control system design using the LQG/LTR methodology

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay

    1989-01-01

    Application of the linear-quadratic-Gaussian with loop-transfer-recovery (LQG/LTR) methodology to the design of a control system for a simplified turbofan engine model is considered. The importance of properly scaling the plant to achieve the desired target feedback loop is emphasized. The steps involved in the application of the methodology are discussed via an example, and evaluation results are presented for a reduced-order compensator. The effect of scaling the plant on the stability robustness evaluation of the closed-loop system is studied in detail.
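
    A hedged sketch of the plant-scaling step emphasized above, followed by the LQR half of an LQG/LTR design, using the python-control package. The two-state model and the scaling ranges are invented for illustration; they are not the turbofan engine model.

      import numpy as np
      import control as ct  # python-control package

      # Unscaled plant x' = Ax + Bu, y = Cx (illustrative numbers only).
      A = np.array([[-2.0, 0.5], [0.0, -1.0]])
      B = np.array([[1.0], [0.5]])
      C = np.array([[100.0, 0.0]])          # output in raw units, e.g. rpm

      Du = np.diag([0.1])                   # expected input range (assumed)
      Dy = np.diag([500.0])                 # expected output range (assumed)

      # Scaled plant: u = Du*u_s, y_s = Dy^-1*y, so singular values become comparable.
      Bs = B @ Du
      Cs = np.linalg.inv(Dy) @ C

      # LQR target design on the scaled plant (first half of an LQG/LTR design).
      K, S, E = ct.lqr(A, Bs, Cs.T @ Cs, np.eye(1))
      print("state-feedback gain:", K)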

  6. Turbofan engine control system design using the LQG/LTR methodology

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay

    1989-01-01

    Application of the Linear-Quadratic-Gaussian with Loop-Transfer-Recovery methodology to the design of a control system for a simplified turbofan engine model is considered. The importance of properly scaling the plant to achieve the desired Target-Feedback-Loop is emphasized. The steps involved in the application of the methodology are discussed via an example, and evaluation results are presented for a reduced-order compensator. The effect of scaling the plant on the stability robustness evaluation of the closed-loop system is studied in detail.

  7. Application of an Integrated Methodology for Propulsion and Airframe Control Design to a STOVL Aircraft

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay; Mattern, Duane

    1994-01-01

    An advanced methodology for integrated flight propulsion control (IFPC) design for future aircraft, which will use propulsion system generated forces and moments for enhanced maneuver capabilities, is briefly described. This methodology has the potential to address in a systematic manner the coupling between the airframe and the propulsion subsystems typical of such enhanced maneuverability aircraft. Application of the methodology to a short take-off and vertical landing (STOVL) aircraft in the landing approach to hover transition flight phase is presented with a brief description of the various steps in the IFPC design methodology. The details of the individual steps have been described in previous publications, and the objective of this paper is to focus on how the components of the control system designed at each step integrate into the overall IFPC system. The full nonlinear IFPC system was evaluated extensively in non-real-time simulations as well as piloted simulations. Results from the non-real-time evaluations are presented in this paper. Lessons learned from this application study are summarized in terms of areas of potential improvements in the STOVL IFPC design as well as identification of technology development areas to enhance the applicability of the proposed design methodology.

  8. The use of concept maps during knowledge elicitation in ontology development processes – the nutrigenomics use case

    PubMed Central

    Castro, Alexander Garcia; Rocca-Serra, Philippe; Stevens, Robert; Taylor, Chris; Nashar, Karim; Ragan, Mark A; Sansone, Susanna-Assunta

    2006-01-01

    Background: Incorporation of ontologies into annotations has enabled 'semantic integration' of complex data, making explicit the knowledge within a certain field. One of the major bottlenecks in developing bio-ontologies is the lack of a unified methodology. Different methodologies have been proposed for different scenarios, but there is no agreed-upon standard methodology for building ontologies. The involvement of geographically distributed domain experts, the need for domain experts to lead the design process, the application of the ontologies and the life cycles of bio-ontologies are amongst the features not considered by previously proposed methodologies. Results: Here, we present a methodology for developing ontologies within the biological domain. We describe our scenario, competency questions, results and milestones for each methodological stage. We introduce the use of concept maps during knowledge acquisition phases as a feasible transition between domain expert and knowledge engineer. Conclusion: The contributions of this paper are the thorough description of the steps we suggest when building an ontology, example use of concept maps, consideration of applicability to the development of lower-level ontologies and application to decentralised environments. We have found that within our scenario conceptual maps played an important role in the development process. PMID:16725019

  9. Assessment of Methodological Quality of Economic Evaluations in Belgian Drug Reimbursement Applications

    PubMed Central

    Simoens, Steven

    2013-01-01

    Objectives: This paper aims to assess the methodological quality of economic evaluations included in Belgian reimbursement applications for Class 1 drugs. Materials and Methods: For 19 reimbursement applications submitted during 2011 and Spring 2012, a descriptive analysis assessed the methodological quality of the economic evaluation, evaluated the assessment of that economic evaluation by the Drug Reimbursement Committee and the response to that assessment by the company. Compliance with methodological guidelines issued by the Belgian Healthcare Knowledge Centre was assessed using a detailed checklist of 23 methodological items. The rate of compliance was calculated based on the number of economic evaluations for which the item was applicable. Results: Economic evaluations tended to comply with guidelines regarding perspective, target population, subgroup analyses, comparator, use of comparative clinical data and final outcome measures, calculation of costs, incremental analysis, discounting and time horizon. However, more attention needs to be paid to the description of limitations of indirect comparisons, the choice of an appropriate analytic technique, the expression of unit costs in values for the current year, the estimation and valuation of outcomes, the presentation of results of sensitivity analyses, and testing the face validity of model inputs and outputs. Also, a large variation was observed in the scope and depth of the quality assessment by the Drug Reimbursement Committee. Conclusions: Although general guidelines exist, pharmaceutical companies and the Drug Reimbursement Committee would benefit from the existence of a more detailed checklist of methodological items that need to be reported in an economic evaluation. PMID:24386474

  10. Case-Crossover Analysis of Air Pollution Health Effects: A Systematic Review of Methodology and Application

    PubMed Central

    Carracedo-Martínez, Eduardo; Taracido, Margarita; Tobias, Aurelio; Saez, Marc; Figueiras, Adolfo

    2010-01-01

    Background: Case-crossover is one of the most used designs for analyzing the health-related effects of air pollution. Nevertheless, no one has reviewed its application and methodology in this context. Objective: We conducted a systematic review of case-crossover (CCO) designs used to study the relationship between air pollution and morbidity and mortality, from the standpoint of methodology and application. Data sources and extraction: A search was made of the MEDLINE and EMBASE databases. Reports were classified as methodologic or applied. From the latter, the following information was extracted: author, study location, year, type of population (general or patients), dependent variable(s), independent variable(s), type of CCO design, and whether effect modification was analyzed for variables at the individual level. Data synthesis: The review covered 105 reports that fulfilled the inclusion criteria. Of these, 24 addressed methodological aspects, and the remainder involved the design’s application. In the methodological reports, the designs that yielded the best results in simulation were symmetric bidirectional CCO and time-stratified CCO. Furthermore, we observed an increase across time in the use of certain CCO designs, mainly symmetric bidirectional and time-stratified CCO. The dependent variables most frequently analyzed were those relating to hospital morbidity; the pollutants most often studied were those linked to particulate matter. Among the CCO-application reports, 13.6% studied effect modification for variables at the individual level. Conclusions: The use of CCO designs has undergone considerable growth; the most widely used designs were those that yielded better results in simulation studies: symmetric bidirectional and time-stratified CCO. However, the advantages of CCO as a method of analysis of variables at the individual level are put to little use. PMID:20356818
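
    A minimal sketch (not taken from the review itself) of the time-stratified referent scheme it found to perform well: for each event day, the control days are the other same-weekday dates in the same calendar month.

      import pandas as pd

      events = pd.to_datetime(["2010-07-14", "2010-07-21"])  # hypothetical event days

      def referents(day):
          # All days in the event's month that share its weekday, excluding itself.
          month = pd.date_range(day.replace(day=1),
                                day + pd.offsets.MonthEnd(0), freq="D")
          return [d for d in month if d.dayofweek == day.dayofweek and d != day]

      for day in events:
          print(day.date(), "->", [str(d.date()) for d in referents(day)])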

  11. Using experts feedback in clinical case resolution and arbitration as accuracy diagnosis methodology.

    PubMed

    Rodríguez-González, Alejandro; Torres-Niño, Javier; Valencia-Garcia, Rafael; Mayer, Miguel A; Alor-Hernandez, Giner

    2013-09-01

    This paper proposes a new methodology for assessing the efficiency of medical diagnostic systems and clinical decision support systems by using the feedback/opinions of medical experts. The methodology behind this work is based on a comparison between the expert feedback that has helped solve different clinical cases and the expert system that has evaluated these same cases. Once the results are returned, an arbitration process is carried out in order to ensure the correctness of the results provided by both methods. Once this process has been completed, the results are analyzed using Precision, Recall, Accuracy, Specificity and Matthews Correlation Coefficient (MCC) (PRAS-M) metrics. When the methodology is applied, the results obtained from a real diagnostic system allow researchers to establish the accuracy of the system based on objective facts. The methodology returns enough information to analyze the system's behavior for each disease in the knowledge base or across the entire knowledge base. It also returns data on the efficiency of the different assessors involved in the evaluation process, analyzing their behavior in the diagnostic process. The proposed work facilitates the evaluation of medical diagnostic systems, providing a reliable process based on objective facts. The methodology presented in this research makes it possible to identify the main characteristics that define a medical diagnostic system and their values, allowing for system improvement. A good example of the results provided by the application of the methodology is shown in this paper. A diagnosis system was evaluated by means of this methodology, yielding positive results (statistically significant) when comparing the system with the assessors that participated in its evaluation process, through metrics such as recall (+27.54%) and MCC (+32.19%). These results demonstrate the real applicability of the methodology used.
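
    A small sketch of the PRAS-M metrics named above, computed from a binary confusion matrix; the counts in the example call are invented.

      import math

      def pras_m(tp, fp, tn, fn):
          precision = tp / (tp + fp)
          recall = tp / (tp + fn)                      # also called sensitivity
          accuracy = (tp + tn) / (tp + fp + tn + fn)
          specificity = tn / (tn + fp)
          mcc = ((tp * tn - fp * fn) /
                 math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)))
          return {"precision": precision, "recall": recall, "accuracy": accuracy,
                  "specificity": specificity, "mcc": mcc}

      print(pras_m(tp=80, fp=10, tn=95, fn=15))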

  12. Development of an Expert Judgement Elicitation and Calibration Methodology for Risk Analysis in Conceptual Vehicle Design

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina

    2004-01-01

    A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to obtain expert judgment opinion for quantifying uncertainties as a probability distribution so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates, and provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results, is presented. A discussion of possible future steps in this research area is given.
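
    A hedged sketch of one common aggregation technique consistent with the abstract (a weighted linear opinion pool over expert probability distributions); the report's ten-phase methodology with calibration is richer, and all numbers below are invented.

      import numpy as np

      def normal_pdf(x, mu, sigma):
          return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

      x = np.linspace(0.0, 10.0, 501)                 # uncertain parameter, e.g. a mass margin (%)
      experts = [(3.0, 1.0), (4.5, 1.5), (5.0, 0.8)]  # (mean, sd) elicited from each expert
      weights = np.array([0.5, 0.2, 0.3])             # calibration-derived weights, sum to 1

      pooled = sum(w * normal_pdf(x, mu, sd) for w, (mu, sd) in zip(weights, experts))
      dx = x[1] - x[0]
      print("pooled density integrates to:", round(np.sum(pooled) * dx, 3))
      print("pooled mean estimate:", round(np.sum(x * pooled) * dx, 2))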

  13. A Comparison of Temporal Dominance of Sensation (TDS) and Quantitative Descriptive Analysis (QDA™) to Identify Flavors in Strawberries.

    PubMed

    Oliver, Penelope; Cicerale, Sara; Pang, Edwin; Keast, Russell

    2018-04-01

    Temporal dominance of sensations (TDS) is a rapid descriptive method that offers a different magnitude of information to traditional descriptive analysis methodologies. This methodology considers the dynamic nature of eating, assessing sensory perception of foods as they change throughout the eating event. Limited research has applied the TDS methodology to strawberries and subsequently validated the results against Quantitative Descriptive Analysis (QDA™). The aim of this research is to compare the TDS methodology using an untrained consumer panel to the results obtained via QDA™ with a trained sensory panel. The trained panelists (n = 12, minimum 60 hr each panelist) were provided with six strawberry samples (three cultivars at two maturation levels) and applied QDA™ techniques to profile each strawberry sample. Untrained consumers (n = 103) were provided with six strawberry samples (three cultivars at two maturation levels) and required to use TDS methodology to assess the dominant sensations for each sample as they change over time. Results revealed moderately comparable product configurations produced via TDS in comparison to QDA™ (RV coefficient = 0.559), as well as similar application of the sweet attribute (correlation coefficient of 0.895 at first bite). The TDS methodology, however, was not in agreement with the QDA™ methodology regarding more complex flavor terms. These findings support the notion that the lack of training on the definition of terms, together with the methodology's requirement to ignore all attributes other than the dominant ones, provides a different magnitude of information than the QDA™ methodology. A comparison of TDS to traditional descriptive analysis indicates that TDS provides additional information to QDA™ regarding the lingering component of eating. The QDA™ results, however, provide more precise detail regarding singular attributes. Therefore, the TDS methodology has an application in industry when it is important to understand the lingering profile of products. However, this methodology should not be employed as a replacement for traditional descriptive analysis methods.
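
    A minimal sketch of the RV coefficient reported above, which compares two product configurations (rows = products, columns = attributes); the matrices here are random stand-ins for the TDS and QDA™ data.

      import numpy as np

      def rv_coefficient(X, Y):
          Xc = X - X.mean(axis=0)          # column-centre both configurations
          Yc = Y - Y.mean(axis=0)
          Sx, Sy = Xc @ Xc.T, Yc @ Yc.T    # product-by-product cross-products
          return np.trace(Sx @ Sy) / np.sqrt(np.trace(Sx @ Sx) * np.trace(Sy @ Sy))

      rng = np.random.default_rng(1)
      tds = rng.random((6, 5))                  # 6 samples x 5 attribute scores
      qda = tds + 0.3 * rng.random((6, 5))      # correlated second configuration
      print("RV:", round(rv_coefficient(tds, qda), 3))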

  14. Investigating transport pathways in the ocean

    NASA Astrophysics Data System (ADS)

    Griffa, Annalisa; Haza, Angelique; Özgökmen, Tamay M.; Molcard, Anne; Taillandier, Vincent; Schroeder, Katrin; Chang, Yeon; Poulain, P.-M.

    2013-01-01

    The ocean is a very complex medium with scales of motion that range from thousands of kilometers to the dissipation scales. Transport by ocean currents plays an important role in many practical applications ranging from climatic problems to coastal management and accident mitigation at sea. Understanding transport is challenging because of the chaotic nature of particle motion. In the last decade, new methods have been put forth to improve our understanding of transport. Powerful tools are provided by dynamical systems theory, which allow the identification of the barriers to transport and their time variability for a given flow. A shortcoming of this approach, though, is that it is based on the assumption that the velocity field is known with good accuracy, which is not always the case in practical applications. Improving model performance in terms of transport can be addressed using another important methodology that has been recently developed, namely the assimilation of Lagrangian data provided by floating buoys. The two methodologies are technically different but in many ways complementary. In this paper, we review examples of applications of both methodologies performed by the authors in the last few years, considering flows at different scales and in various ocean basins. The results are among the very first examples of applications of the methodologies to the real ocean, including testing with Lagrangian in-situ data. The results are discussed in the general framework of the extended fields related to these methodologies, pointing out open questions and potential for improvements, with an outlook toward future strategies.

  15. Environmental Risk Assessment of dredging processes - application to Marin harbour (NW Spain)

    NASA Astrophysics Data System (ADS)

    Gómez, A. G.; García Alba, J.; Puente, A.; Juanes, J. A.

    2014-04-01

    A methodological procedure to estimate the environmental risk of dredging operations in aquatic systems has been developed. Environmental risk estimations are based on numerical model results, which provide an appropriate spatio-temporal analysis framework to guarantee an effective decision-making process. The methodological procedure has been applied to a real dredging operation in the port of Marin (NW Spain). Results from Marin harbour confirmed the suitability of the developed methodology and the conceptual approaches as a comprehensive and practical management tool.

  16. Design consideration of resonance inverters with electro-technological application

    NASA Astrophysics Data System (ADS)

    Hinov, Nikolay

    2017-12-01

    This study presents design considerations for resonance inverters with electro-technological applications. The presented methodology was developed as a result of the author's investigations and analyses of different types and operating regimes of resonance inverters. Schemes of resonant inverters without inverse diodes are considered. The first-harmonic method is used in the analysis and design; for inverters with electro-technological applications this method gives very good accuracy and does not require a complex and heavy mathematical apparatus. The proposed methodology is easy to use and is suitable for training students in power electronics. The achieved results are confirmed by simulation and by research work on physical prototypes.
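
    A hedged numerical illustration of the first-harmonic method mentioned above: a resonant load responds mainly to the fundamental of the inverter's square-wave output (amplitude 4*Vd/pi), so the design equations keep only that component. The supply voltage and load values are invented.

      import math

      Vd = 200.0                      # DC supply voltage, V (assumed)
      V1 = 4 * Vd / math.pi           # amplitude of the square wave's fundamental
      R, L, C = 2.0, 100e-6, 1e-6     # series resonant load (assumed)
      f0 = 1 / (2 * math.pi * math.sqrt(L * C))   # resonant frequency

      # At resonance the reactances cancel, so the load looks purely resistive:
      I1 = V1 / R
      print(f"f0 = {f0:.0f} Hz, fundamental amplitude = {V1:.1f} V, I1 = {I1:.1f} A")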

  17. A Systematic Determination of Skill and Simulator Requirements for Airplane Pilot Certification

    DOT National Transportation Integrated Search

    1985-03-01

    This research report describes: (1) the FAA's ATP airman certification system; (2) needs of the system regarding simulator use; (3) a systematic methodology for meeting these needs; (4) application of the methodology; (5) results of the study; and (6...

  18. Prognostics and health management design for rotary machinery systems—Reviews, methodology and applications

    NASA Astrophysics Data System (ADS)

    Lee, Jay; Wu, Fangji; Zhao, Wenyu; Ghaffari, Masoud; Liao, Linxia; Siegel, David

    2014-01-01

    Much research has been conducted in prognostics and health management (PHM), an emerging field in mechanical engineering that is gaining interest from both academia and industry. Most of these efforts have been in the area of machinery PHM, resulting in the development of many algorithms for this particular application. The majority of these algorithms concentrate on applications involving common rotary machinery components, such as bearings and gears. Knowledge of this prior work is a necessity for any future research efforts to be conducted; however, there has not been a comprehensive overview that details previous and on-going efforts in PHM. In addition, a systematic method for developing and deploying a PHM system has yet to be established. Such a method would enable rapid customization and integration of PHM systems for diverse applications. To address these gaps, this paper provides a comprehensive review of the PHM field, followed by an introduction of a systematic PHM design methodology, 5S methodology, for converting data to prognostics information. This methodology includes procedures for identifying critical components, as well as tools for selecting the most appropriate algorithms for specific applications. Visualization tools are presented for displaying prognostics information in an appropriate fashion for quick and accurate decision making. Industrial case studies are included in this paper to show how this methodology can help in the design of an effective PHM system.

  19. A Computational Tool for Evaluating THz Imaging Performance in Brownout Conditions at Land Sites Throughout the World

    DTIC Science & Technology

    2009-03-01

    Excerpts: …applications relating to this research and the results they have obtained, as well as the background on LEEDR. Chapter 3 will detail the methodology… different in that the snow dissipates faster and it is better to descend slower, at rates of 200-300 ft/min.

  20. An Integrated Low-Speed Performance and Noise Prediction Methodology for Subsonic Aircraft

    NASA Technical Reports Server (NTRS)

    Olson, E. D.; Mavris, D. N.

    2000-01-01

    An integrated methodology has been assembled to compute the engine performance, takeoff and landing trajectories, and community noise levels for a subsonic commercial aircraft. Where feasible, physics-based noise analysis methods have been used to make the results more applicable to newer, revolutionary designs and to allow for a more direct evaluation of new technologies. The methodology is intended to be used with approximation methods and risk analysis techniques to allow for the analysis of a greater number of variable combinations while retaining the advantages of physics-based analysis. Details of the methodology are described and limited results are presented for a representative subsonic commercial aircraft.

  1. Modified Dynamic Inversion to Control Large Flexible Aircraft: What's Going On?

    NASA Technical Reports Server (NTRS)

    Gregory, Irene M.

    1999-01-01

    High performance aircraft of the future will be designed lighter, more maneuverable, and operate over an ever expanding flight envelope. One of the largest differences from the flight control perspective between current and future advanced aircraft is elasticity. Over the last decade, dynamic inversion methodology has gained considerable popularity in application to highly maneuverable fighter aircraft, which were treated as rigid vehicles. This paper explores dynamic inversion application to an advanced highly flexible aircraft. An initial application has been made to a large flexible supersonic aircraft. In the course of controller design for this advanced vehicle, modifications were made to the standard dynamic inversion methodology. The results of this application were deemed rather promising. An analytical study has been undertaken to better understand the nature of the modifications made and to determine their general applicability. This paper presents the results of this initial analytical look at the modifications to dynamic inversion to control large flexible aircraft.
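
    A minimal sketch of the baseline (rigid-vehicle) dynamic inversion law the paper starts from, on a toy scalar system: for x' = f(x) + g(x)*u, the control u = (v - f(x))/g(x) imposes the desired dynamics v. This is not the paper's modified scheme or its aircraft model.

      def f(x):      # plant drift dynamics (toy example)
          return -x + x**3

      def g(x):      # control effectiveness (must stay away from zero)
          return 2.0 + 0.1 * x**2

      def inversion_control(x, x_cmd, k=5.0):
          v = -k * (x - x_cmd)           # desired error dynamics: e' = -k*e
          return (v - f(x)) / g(x)       # cancel f, impose v

      x, x_cmd, dt = 1.5, 0.0, 1e-3
      for _ in range(3000):              # simple Euler simulation
          u = inversion_control(x, x_cmd)
          x += dt * (f(x) + g(x) * u)
      print("final state:", round(x, 4))  # converges toward the command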

  2. Investigation of Weibull statistics in fracture analysis of cast aluminum

    NASA Technical Reports Server (NTRS)

    Holland, Frederic A., Jr.; Zaretsky, Erwin V.

    1989-01-01

    The fracture strengths of two large batches of A357-T6 cast aluminum coupon specimens were compared by using two-parameter Weibull analysis. The minimum number of these specimens necessary to find the fracture strength of the material was determined. The applicability of three-parameter Weibull analysis was also investigated. A design methodology based on the combination of elementary stress analysis and Weibull statistical analysis is advanced and applied to the design of a spherical pressure vessel shell. The results from this design methodology are compared with results from the applicable ASME pressure vessel code.
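
    A hedged sketch of the two-parameter Weibull step described above, fitted with SciPy on synthetic strength data (the study's A357-T6 coupon data are not reproduced here); fixing the location parameter at zero gives the two-parameter form.

      import numpy as np
      from scipy.stats import weibull_min

      rng = np.random.default_rng(2)
      strengths = weibull_min.rvs(c=12.0, scale=300.0, size=50, random_state=rng)

      shape, loc, scale = weibull_min.fit(strengths, floc=0)
      print(f"Weibull modulus (shape) = {shape:.1f}, characteristic strength = {scale:.0f}")
      # Probability of failure below a hypothetical design stress of 250:
      print("P(failure at 250) =", round(weibull_min.cdf(250, shape, loc, scale), 3))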

  3. Methodological approach and tools for systems thinking in health systems research: technical assistants' support of health administration reform in the Democratic Republic of Congo as an application.

    PubMed

    Ribesse, Nathalie; Bossyns, Paul; Marchal, Bruno; Karemere, Hermes; Burman, Christopher J; Macq, Jean

    2017-03-01

    In the field of development cooperation, interest in systems thinking and complex systems theories as a methodological approach is increasingly recognised. The same is true in health systems research, which informs health development aid interventions. However, practical applications remain scarce to date. The objective of this article is to contribute to the body of knowledge by presenting the tools inspired by systems thinking and complexity theories, and the methodological lessons learned from their application. These tools were used in a case study; detailed results of this study are being prepared for publication in additional articles. Applying a complexity 'lens', the subject of the case study is the role of long-term international technical assistance in supporting health administration reform at the provincial level in the Democratic Republic of Congo. The Methods section presents the guiding principles of systems thinking and complex systems, their relevance and implications for the subject under study, and the existing tools associated with those theories which inspired us in the design of the data collection and analysis process. The tools and their application processes are presented in the Results section, and are followed in the Discussion section by a critical analysis of their innovative potential and emergent challenges. The overall methodology provides a coherent whole, with each tool bringing a different and complementary perspective on the system.

  4. Towards more sustainable management of European food waste: Methodological approach and numerical application.

    PubMed

    Manfredi, Simone; Cristobal, Jorge

    2016-09-01

    Trying to respond to the latest policy needs, the work presented in this article aims at developing a life-cycle based framework methodology to quantitatively evaluate the environmental and economic sustainability of European food waste management options. The methodology is structured into six steps aimed at defining boundaries and scope of the evaluation, evaluating environmental and economic impacts and identifying best performing options. The methodology is able to accommodate additional assessment criteria, for example the social dimension of sustainability, thus moving towards a comprehensive sustainability assessment framework. A numerical case study is also developed to provide an example of application of the proposed methodology to an average European context. Different options for food waste treatment are compared, including landfilling, composting, anaerobic digestion and incineration. The environmental dimension is evaluated with the software EASETECH, while the economic assessment is conducted based on different indicators expressing the costs associated with food waste management. Results show that the proposed methodology allows for a straightforward identification of the most sustainable options for food waste, thus can provide factual support to decision/policy making. However, it was also observed that results markedly depend on a number of user-defined assumptions, for example on the choice of the indicators to express the environmental and economic performance.

  5. Assessment of methodological quality of economic evaluations in belgian drug reimbursement applications.

    PubMed

    Simoens, Steven

    2013-01-01

    This paper aims to assess the methodological quality of economic evaluations included in Belgian reimbursement applications for Class 1 drugs. For 19 reimbursement applications submitted during 2011 and Spring 2012, a descriptive analysis assessed the methodological quality of the economic evaluation, evaluated the assessment of that economic evaluation by the Drug Reimbursement Committee and the response to that assessment by the company. Compliance with methodological guidelines issued by the Belgian Healthcare Knowledge Centre was assessed using a detailed checklist of 23 methodological items. The rate of compliance was calculated based on the number of economic evaluations for which the item was applicable. Economic evaluations tended to comply with guidelines regarding perspective, target population, subgroup analyses, comparator, use of comparative clinical data and final outcome measures, calculation of costs, incremental analysis, discounting and time horizon. However, more attention needs to be paid to the description of limitations of indirect comparisons, the choice of an appropriate analytic technique, the expression of unit costs in values for the current year, the estimation and valuation of outcomes, the presentation of results of sensitivity analyses, and testing the face validity of model inputs and outputs. Also, a large variation was observed in the scope and depth of the quality assessment by the Drug Reimbursement Committee. Although general guidelines exist, pharmaceutical companies and the Drug Reimbursement Committee would benefit from the existence of a more detailed checklist of methodological items that need to be reported in an economic evaluation.

  6. Development of a North American paleoclimate pollen-based reconstruction database application

    NASA Astrophysics Data System (ADS)

    Ladd, Matthew; Mosher, Steven; Viau, Andre

    2013-04-01

    Recent efforts in synthesizing paleoclimate records across the globe have prompted an effort to standardize the different paleoclimate archives currently available in order to facilitate data-model comparisons and hence improve our estimates of future climate change. It is often the case that the methodology and programs make it challenging for other researchers to reproduce the results of a reconstruction; there is therefore a need to standardize paleoclimate reconstruction databases in an application specific to proxy data. Here we present a methodology, implemented in the open-source R language and drawing on North American pollen databases (e.g., NAPD, NEOTOMA), whose application can easily be used to perform new reconstructions and to quickly analyze and output/plot the data. The application was developed to easily test methodological and spatial/temporal issues that might affect the reconstruction results, and it allows users to spend more time analyzing and interpreting results instead of on data management and processing. Some of the unique features of this R program are its two menu-driven modules, which put the user at ease with the program; the ability to use different pollen sums; a choice among 70 available climate variables; substitution of an appropriate modern climate dataset; a user-friendly regional target domain; temporal resolution criteria; linear interpolation; and many other features for a thorough exploratory data analysis. The application will be available for North American pollen-based reconstructions and will eventually be made available as a package through the CRAN repository by late 2013.

  7. A new hybrid transfinite element computational methodology for applicability to conduction/convection/radiation heat transfer

    NASA Technical Reports Server (NTRS)

    Tamma, Kumar K.; Railkar, Sudhir B.

    1988-01-01

    This paper describes new and recent advances in the development of a hybrid transfinite element computational methodology for applicability to conduction/convection/radiation heat transfer problems. The transfinite element methodology, while retaining the modeling versatility of contemporary finite element formulations, is based on application of transform techniques in conjunction with classical Galerkin schemes and is a hybrid approach. The purpose of this paper is to provide a viable hybrid computational methodology for applicability to general transient thermal analysis. Highlights and features of the methodology are described and developed via generalized formulations and applications to several test problems. The proposed transfinite element methodology successfully provides a viable computational approach and numerical test problems validate the proposed developments for conduction/convection/radiation thermal analysis.

  8. Determining The Various Perspectives And Consensus Within A Classroom Using Q Methodology

    NASA Astrophysics Data System (ADS)

    Ramlo, Susan E.

    2008-10-01

    Q methodology was developed by the PhD physicist and psychologist William Stephenson 73 years ago as a new way of investigating people's views of any topic. Yet its application has primarily been in the fields of marketing, psychology, and political science. Still, Q offers an opportunity for the physics education research community to determine the perspectives and consensus within a group, such as a classroom, related to topics of interest such as the nature of science and epistemology. This paper presents the basics of using Q methodology with a classroom application as an example, and subsequent comparisons of this example's results to similar studies using qualitative and survey methods.
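
    A minimal sketch of the computation at the core of Q methodology as described: persons, not items, are correlated across their Q-sorts, and the person-by-person matrix is factored; the sorts below are random placeholders for real ranked statements.

      import numpy as np

      rng = np.random.default_rng(3)
      # 8 participants each rank-order 20 statements (one "Q-sort" per column).
      sorts = rng.standard_normal((20, 8))

      corr = np.corrcoef(sorts, rowvar=False)     # person-by-person correlations
      eigvals, eigvecs = np.linalg.eigh(corr)     # principal-component factoring
      order = np.argsort(eigvals)[::-1]
      print("variance explained by first factor:",
            round(eigvals[order][0] / eigvals.sum(), 2))
      # Loadings on the first factor group participants sharing a perspective.
      print("loadings:", np.round(eigvecs[:, order[0]], 2))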

  9. The Expanded Application of Forensic Science and Law Enforcement Methodologies in Army Counterintelligence

    DTIC Science & Technology

    2017-09-01

    A research project (Jul 2017) by CW2 Braden E. Stockham. Excerpts: …forensic science resources, law enforcement methodologies and procedures, and basic investigative training. In order to determine if these changes would…

  10. Calculation and mitigation of isotopic interferences in liquid chromatography-mass spectrometry/mass spectrometry assays and its application in supporting microdose absolute bioavailability studies.

    PubMed

    Gu, Huidong; Wang, Jian; Aubry, Anne-Françoise; Jiang, Hao; Zeng, Jianing; Easter, John; Wang, Jun-sheng; Dockens, Randy; Bifano, Marc; Burrell, Richard; Arnold, Mark E

    2012-06-05

    A methodology for the accurate calculation and mitigation of isotopic interferences in liquid chromatography-mass spectrometry/mass spectrometry (LC-MS/MS) assays and its application in supporting microdose absolute bioavailability studies are reported for the first time. For simplicity, this calculation methodology and the strategy to minimize the isotopic interference are demonstrated using a simple molecule entity, then applied to actual development drugs. The exact isotopic interferences calculated with this methodology were often much less than the traditionally used, overestimated isotopic interferences simply based on the molecular isotope abundance. One application of the methodology is the selection of a stable isotopically labeled internal standard (SIL-IS) for an LC-MS/MS bioanalytical assay. The second application is the selection of an SIL analogue for use in intravenous (i.v.) microdosing for the determination of absolute bioavailability. In the case of microdosing, the traditional approach of calculating isotopic interferences can result in selecting a labeling scheme that overlabels the i.v.-dosed drug or leads to incorrect conclusions on the feasibility of using an SIL drug and analysis by LC-MS/MS. The methodology presented here can guide the synthesis by accurately calculating the isotopic interferences when labeling at different positions, using different selective reaction monitoring (SRM) transitions or adding more labeling positions. This methodology has been successfully applied to the selection of the labeled i.v.-dosed drugs for use in two microdose absolute bioavailability studies, before initiating the chemical synthesis. With this methodology, significant time and cost saving can be achieved in supporting microdose absolute bioavailability studies with stable labeled drugs.
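
    A hedged sketch of the simplest version of such a calculation: the binomial chance that natural 13C alone raises an unlabeled analyte to the mass of an isotopically labeled analogue. The paper's methodology is more exact (it also accounts for other elements and the SRM transitions); the molecule size here is invented.

      from math import comb

      def c13_abundance(n_carbons, k_heavy, p=0.0107):
          # Binomial probability that exactly k of n carbons are 13C.
          return comb(n_carbons, k_heavy) * p**k_heavy * (1 - p)**(n_carbons - k_heavy)

      n_c = 30                        # carbons in a hypothetical analyte
      for k in range(4):
          print(f"M+{k}: {c13_abundance(n_c, k):.5f}")
      # If M+3 is tiny, a 3-Da label may suffice; larger analytes may need
      # more label positions to escape the isotopic envelope.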

  11. Evaluation of the HARDMAN comparability methodology for manpower, personnel and training

    NASA Technical Reports Server (NTRS)

    Zimmerman, W.; Butler, R.; Gray, V.; Rosenberg, L.

    1984-01-01

    The methodology evaluation and recommendations are part of an effort to improve the Hardware versus Manpower (HARDMAN) methodology for projecting manpower, personnel, and training (MPT) to support new acquisitions. Several different validity tests are employed to evaluate the methodology. The methodology conforms fairly well with both the MPT user needs and other accepted manpower modeling techniques. Audits of three completed HARDMAN applications reveal only a small number of potential problem areas compared to the total number of issues investigated. The reliability study results conform well with the problem areas uncovered through the audits. The results of the accuracy studies suggest that the manpower life-cycle cost component is only marginally sensitive to changes in other related cost variables. Even with some minor problems, the methodology seems sound and has good near-term utility to the Army. Recommendations are provided to firm up the problem areas revealed through the evaluation.

  12. Evaluation of Structural Robustness against Column Loss: Methodology and Application to RC Frame Buildings.

    PubMed

    Bao, Yihai; Main, Joseph A; Noh, Sam-Young

    2017-08-01

    A computational methodology is presented for evaluating structural robustness against column loss. The methodology is illustrated through application to reinforced concrete (RC) frame buildings, using a reduced-order modeling approach for three-dimensional RC framing systems that includes the floor slabs. Comparisons with high-fidelity finite-element model results are presented to verify the approach. Pushdown analyses of prototype buildings under column loss scenarios are performed using the reduced-order modeling approach, and an energy-based procedure is employed to account for the dynamic effects associated with sudden column loss. Results obtained using the energy-based approach are found to be in good agreement with results from direct dynamic analysis of sudden column loss. A metric for structural robustness is proposed, calculated by normalizing the ultimate capacities of the structural system under sudden column loss by the applicable service-level gravity loading and by evaluating the minimum value of this normalized ultimate capacity over all column removal scenarios. The procedure is applied to two prototype 10-story RC buildings, one employing intermediate moment frames (IMFs) and the other employing special moment frames (SMFs). The SMF building, with its more stringent seismic design and detailing, is found to have greater robustness.
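
    A small sketch of the proposed robustness metric as described: each scenario's ultimate capacity under sudden column loss is normalized by the service-level gravity loading, and the minimum over all removal scenarios is taken. All numbers are invented.

      capacities = {            # ultimate capacity per column-removal scenario
          "corner": 9.5,
          "edge": 11.2,
          "interior": 13.0,
      }
      service_load = 7.0        # applicable service-level gravity loading, same units

      normalized = {k: v / service_load for k, v in capacities.items()}
      robustness = min(normalized.values())
      print(normalized, "-> robustness index:", round(robustness, 2))
      # robustness > 1 suggests every sudden-column-loss scenario retains margin.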

  13. Product environmental footprint in policy and market decisions: Applicability and impact assessment.

    PubMed

    Lehmann, Annekatrin; Bach, Vanessa; Finkbeiner, Matthias

    2015-07-01

    In April 2013, the European Commission published the Product and Organisation Environmental Footprint (PEF/OEF) methodology--a life cycle-based multicriteria measure of the environmental performance of products, services, and organizations. With its approach of "comparability over flexibility," the PEF/OEF methodology aims at harmonizing existing methods, while decreasing the flexibility provided by the International Organization for Standardization (ISO) standards regarding methodological choices. Currently, a 3-y pilot phase is running, aiming at testing the methodology and developing product category and organization sector rules (PEFCR/OEFSR). Although a harmonized method is in theory a good idea, the PEF/OEF methodology presents challenges, including a risk of confusion and limitations in applicability to practice. The paper discusses the main differences between the PEF and ISO methodologies and highlights challenges regarding PEF applicability, with a focus on impact assessment. Some methodological aspects of the PEF and PEFCR Guides are found to contradict the ISO 14044 (2006) and ISO 14025 (2006). Others, such as prohibition of inventory cutoffs, are impractical. The evaluation of the impact assessment methods proposed in the PEF/OEF Guide showed that the predefined methods for water consumption, land use, and abiotic resources are not adequate because of modeling artefacts, missing inventory data, or incomplete characterization factors. However, the methods for global warming and ozone depletion perform very well. The results of this study are relevant for the PEF (and OEF) pilot phase, which aims at testing the PEF (OEF) methodology (and potentially adapting it) as well as addressing challenges and coping with them.

  14. C3I system modification and EMC (electromagnetic compatibility) methodology, volume 1

    NASA Astrophysics Data System (ADS)

    Wilson, J. L.; Jolly, M. B.

    1984-01-01

    A methodology (i.e., consistent set of procedures) for assessing the electromagnetic compatibility (EMC) of RF subsystem modifications on C3I aircraft was generated during this study (Volume 1). An IEMCAP (Intrasystem Electromagnetic Compatibility Analysis Program) database for the E-3A (AWACS) C3I aircraft RF subsystem was extracted to support the design of the EMC assessment methodology (Volume 2). Mock modifications were performed on the E-3A database to assess, using a preliminary form of the methodology, the resulting EMC impact. Application of the preliminary assessment methodology to modifications in the E-3A database served to fine tune the form of a final assessment methodology. The resulting final assessment methodology is documented in this report in conjunction with the overall study goals, procedures, and database. It is recommended that a similar EMC assessment methodology be developed for the power subsystem within C3I aircraft. It is further recommended that future EMC assessment methodologies be developed around expert systems (i.e., computer intelligent agents) to control both the excruciating detail and user requirement for transparency.

  15. Two-step rating-based 'double-faced applicability' test for sensory analysis of spread products as an alternative to descriptive analysis with trained panel.

    PubMed

    Kim, In-Ah; den-Hollander, Elyn; Lee, Hye-Seong

    2018-03-01

    Descriptive analysis with a trained sensory panel has thus far been the most well-defined methodology to characterize various products. However, in practical terms, the intensive training required for descriptive analysis has been recognized as a serious defect. To overcome this limitation, various novel rapid sensory profiling methodologies have been suggested in the literature. Among these, attribute-based methodologies such as check-all-that-apply (CATA) questions showed results comparable to those of conventional sensory descriptive analysis. Kim, Hopkinson, van Hout, and Lee (2017a, 2017b) have proposed a novel attribute-based methodology termed the two-step rating-based 'double-faced applicability' test with a novel output measure of applicability magnitude (d'A) for measuring consumers' product usage experience throughout various product usage stages. In this paper, the potential of the two-step rating-based 'double-faced applicability' test with d'A was investigated as an alternative to conventional sensory descriptive analysis in terms of sensory characterization and product discrimination. Twelve commercial spread products were evaluated using both conventional sensory descriptive analysis with a trained sensory panel and the two-step rating-based 'double-faced applicability' test with an untrained sensory panel. The results demonstrated that the 'double-faced applicability' test can be used to provide a direct measure of the applicability magnitude of the sensory attributes of the samples tested, in terms of d'A, for sensory characterization of individual samples and multiple sample comparisons. This suggests that when the appropriate list of attributes to be used in the questionnaire is already available, the two-step rating-based 'double-faced applicability' test with d'A can be used as a more efficient alternative to conventional descriptive analysis, without requiring any intensive training process.
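
    A hedged sketch of a signal-detection-style contrast in the spirit of the abstract's d'A measure (the published two-step rating model is more involved): the classic d' formula applied to invented "applicable" response rates for two products.

      from scipy.stats import norm

      def d_prime(hit_rate, false_alarm_rate):
          # Classic signal-detection d': z(hits) minus z(false alarms).
          return norm.ppf(hit_rate) - norm.ppf(false_alarm_rate)

      # Invented example: "creamy" judged applicable in 78% of tastings of
      # product A versus 35% of tastings of product B.
      print("d' =", round(d_prime(0.78, 0.35), 2))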

  16. Applications of Tutoring Systems in Specialized Subject Areas: An Analysis of Skills, Methodologies, and Results.

    ERIC Educational Resources Information Center

    Heron, Timothy E.; Welsch, Richard G.; Goddard, Yvonne L.

    2003-01-01

    This article reviews how tutoring systems have been applied across specialized subject areas (e.g., music, horticulture, health and safety, social interactions). It summarizes findings, provides an analysis of skills learned within each tutoring system, identifies the respective methodologies, and reports relevant findings, implications, and…

  17. Quality of Service in Networks Supporting Cultural Multimedia Applications

    ERIC Educational Resources Information Center

    Kanellopoulos, Dimitris N.

    2011-01-01

    Purpose: This paper aims to provide an overview of representative multimedia applications in the cultural heritage sector, as well as research results on quality of service (QoS) mechanisms in internet protocol (IP) networks that support such applications. Design/methodology/approach: The paper's approach is a literature review. Findings: Cultural…

  18. Optimal spatio-temporal design of water quality monitoring networks for reservoirs: Application of the concept of value of information

    NASA Astrophysics Data System (ADS)

    Maymandi, Nahal; Kerachian, Reza; Nikoo, Mohammad Reza

    2018-03-01

    This paper presents a new methodology for optimizing Water Quality Monitoring (WQM) networks of reservoirs and lakes using the concept of the value of information (VOI) and utilizing results of a calibrated numerical water quality simulation model. In the value of information framework, the water quality at every checkpoint has a prior probability that differs in time. After analyzing water quality samples taken from potential monitoring points, the posterior probabilities are updated using Bayes' theorem, and the VOI of the samples is calculated. In the next step, the stations with maximum VOI are selected as optimal stations. This process is repeated for each sampling interval to obtain optimal monitoring network locations for each interval. The results of the proposed VOI-based methodology are compared with those obtained using an entropy theoretic approach. As the results of the two methodologies would be partially different, in the next step, the results are combined using a weighting method. Finally, the optimal sampling interval and locations of WQM stations are chosen using the Evidential Reasoning (ER) decision making method. The efficiency and applicability of the methodology are evaluated using available water quantity and quality data of the Karkheh Reservoir in the southwestern part of Iran.
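
    A minimal sketch of the Bayes-update and VOI steps described above, for one checkpoint with invented probabilities and losses: VOI is the expected reduction in decision loss from taking the sample.

      def posterior(prior, positive, se=0.9, fp=0.2):
          # Bayes' theorem; se = P(positive | polluted), fp = P(positive | clean).
          like_poll = se if positive else 1.0 - se
          like_clean = fp if positive else 1.0 - fp
          return like_poll * prior / (like_poll * prior + like_clean * (1 - prior))

      def expected_loss(p, loss_miss=100.0, loss_act=20.0):
          # Best action under belief p: remediate (fixed cost) or risk a miss.
          return min(loss_act, p * loss_miss)

      prior = 0.3
      p_pos = 0.9 * prior + 0.2 * (1 - prior)       # chance the sample is positive
      loss_without = expected_loss(prior)
      loss_with = (p_pos * expected_loss(posterior(prior, True))
                   + (1 - p_pos) * expected_loss(posterior(prior, False)))
      print("VOI of sampling this checkpoint:", round(loss_without - loss_with, 2))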

  19. Application of the probabilistic approximate analysis method to a turbopump blade analysis. [for Space Shuttle Main Engine

    NASA Technical Reports Server (NTRS)

    Thacker, B. H.; Mcclung, R. C.; Millwater, H. R.

    1990-01-01

    An eigenvalue analysis of a typical space propulsion system turbopump blade is presented using an approximate probabilistic analysis methodology. The methodology was developed originally to investigate the feasibility of computing probabilistic structural response using closed-form approximate models. This paper extends the methodology to structures for which simple closed-form solutions do not exist. The finite element method will be used for this demonstration, but the concepts apply to any numerical method. The results agree with detailed analysis results and indicate the usefulness of using a probabilistic approximate analysis in determining efficient solution strategies.

  20. Development of computer-assisted instruction application for statistical data analysis android platform as learning resource

    NASA Astrophysics Data System (ADS)

    Hendikawati, P.; Arifudin, R.; Zahid, M. Z.

    2018-03-01

    This study aims to design an Android statistics data analysis application that runs on mobile devices, making it easier for users to access. The application covers various basic statistics topics along with parametric statistical data analysis. Its output is parametric statistical analysis that serves students, lecturers, and other users who need statistical results quickly and in an easily understood form. The Android application is developed in the Java programming language; the server side uses PHP with the CodeIgniter framework, and the database is MySQL. The system development methodology used is the Waterfall methodology, with the stages of analysis, design, coding, testing, implementation, and system maintenance. This statistical data analysis application is expected to support statistics lecturing activities and to make it easier for students to understand statistical analysis on mobile devices.

  1. Virtual-pulse time integral methodology: A new explicit approach for computational dynamics - Theoretical developments for general nonlinear structural dynamics

    NASA Technical Reports Server (NTRS)

    Chen, Xiaoqin; Tamma, Kumar K.; Sha, Desong

    1993-01-01

    The present paper describes a new explicit virtual-pulse time integral methodology for nonlinear structural dynamics problems. The purpose of the paper is to provide the theoretical basis of the methodology and to demonstrate the applicability of the proposed formulations to nonlinear dynamic structures. Different from existing numerical methods such as direct time integration or mode superposition techniques, the proposed methodology offers new perspectives and avenues of development, and possesses several unique and attractive computational characteristics. The methodology is tested and compared with the implicit Newmark method (trapezoidal rule) on nonlinear softening and hardening spring dynamic models. The numerical results indicate that the proposed explicit virtual-pulse time integral methodology is an excellent alternative for solving general nonlinear dynamic problems.

  2. Efficacy of Silk Channel Injections with Insecticides for Management of Lepidoptera Pests of Sweet Corn.

    PubMed

    Sparks, A N; Gadal, L; Ni, X

    2015-08-01

    The primary Lepidoptera pests of sweet corn (Zea mays L. convar. saccharata) in Georgia are the corn earworm, Helicoverpa zea (Boddie), and the fall armyworm, Spodoptera frugiperda (J. E. Smith). Management of these pests typically requires multiple insecticide applications from first silking until harvest, with commercial growers frequently spraying daily. This level of insecticide use presents problems for small growers, particularly for "pick-your-own" operations. Injection of oil into the corn ear silk channel 5-8 days after silking initiation has been used to suppress damage by these insects. Initial work with this technique in Georgia provided poor results. Subsequently, a series of experiments was conducted to evaluate the efficacy of silk channel injections as an application methodology for insecticides. A single application of synthetic insecticide, at greatly reduced per acre rates compared with common foliar applications, provided excellent control of Lepidoptera insects attacking the ear tip and suppressed damage by sap beetles (Nitidulidae). While this methodology is labor-intensive, it requires a single application of insecticide at reduced rates applied ∼2 wk prior to harvest, compared with potential daily applications at full rates up to the day of harvest with foliar insecticide applications. This methodology is not likely to eliminate the need for foliar applications because of other insect pests which do not enter through the silk channel or are not affected by the specific selective insecticide used in the silk channel injection, but would greatly reduce the number of applications required. This methodology may prove particularly useful for small acreage growers.

  3. Error-rate prediction for programmable circuits: methodology, tools and studied cases

    NASA Astrophysics Data System (ADS)

    Velazco, Raoul

    2013-05-01

    This work presents an approach to predict the error rates due to Single Event Upsets (SEU) occurring in programmable circuits as a consequence of the impact of energetic particles present in the environment in which the circuits operate. For a chosen application, the error rate is predicted by combining the results obtained from radiation ground testing with the results of fault-injection campaigns performed off-beam, during which huge numbers of SEUs are injected during the execution of the studied application. The goal of this strategy is to obtain accurate results about different applications' error rates without using particle accelerator facilities, thus significantly reducing the cost of the sensitivity evaluation. As a case study, this methodology was applied to a complex processor, the PowerPC 7448, executing a program issued from a real space application, and to a crypto-processor application implemented in an SRAM-based FPGA and accepted to be embedded in the payload of a scientific satellite of NASA. The accuracy of predicted error rates was confirmed by comparing, for the same circuit and application, predictions with measurements issued from radiation ground testing performed at the Cyclone cyclotron of the Heavy Ion Facility (HIF) at Louvain-la-Neuve (Belgium).
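
    As a hedged illustration of the rate-prediction arithmetic described above, the sketch below combines a beam-measured SEU cross-section with the fraction of injected faults that corrupt the application; the toy application model and every number are placeholders, not data from the paper.

    ```python
    # Hedged sketch: predicted error rate = (cross-section * flux) * P(error|upset),
    # where P(error|upset) comes from an off-beam fault-injection campaign.
    import random

    def injection_campaign(run_app, n_faults=10_000):
        """Fraction of injected single-bit upsets that produce an application error."""
        errors = sum(run_app(bit=random.randrange(2**20)) for _ in range(n_faults))
        return errors / n_faults

    # Toy model: only upsets landing in a 'critical' quarter of state cause errors.
    run_app = lambda bit: bit < 2**18

    p_err_given_upset = injection_campaign(run_app)
    sigma_seu = 1e-8      # cm^2/device from radiation ground testing (placeholder)
    flux = 5.0            # particles/cm^2/s in the target environment (placeholder)

    upset_rate = sigma_seu * flux                  # upsets per second
    error_rate = upset_rate * p_err_given_upset    # predicted application errors/s
    print(f"P(error|upset)={p_err_given_upset:.3f}, error rate={error_rate:.2e}/s")
    ```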

  4. SERVQUAL Application and Adaptation for Educational Service Quality Assessments in Russian Higher Education

    ERIC Educational Resources Information Center

    Galeeva, Railya B.

    2016-01-01

    Purpose: The purpose of this study is to demonstrate an adaptation of the SERVQUAL survey method for measuring the quality of higher educational services in a Russian university context. We use a new analysis and a graphical technique for presentation of results. Design/methodology/approach: The methodology of this research follows the classic…

  5. A negotiation methodology and its application to cogeneration planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, S.M.; Liu, C.C.; Luu, S.

    Power system planning has become a complex process in utilities today. This paper presents a methodology for integrated planning with multiple objectives. The methodology uses a graphical representation (Goal-Decision Network) to capture the planning knowledge. The planning process is viewed as a negotiation process that applies three negotiation operators to search for beneficial decisions in a GDN. Also, the negotiation framework is applied to the problem of planning for cogeneration interconnection. The simulation results are presented to illustrate the cogeneration planning process.

  6. Analysis of the impact of safeguards criteria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mullen, M.F.; Reardon, P.T.

    As part of the US Program of Technical Assistance to IAEA Safeguards, the Pacific Northwest Laboratory (PNL) was asked to assist in developing and demonstrating a model for assessing the impact of setting criteria for the application of IAEA safeguards. This report presents the results of PNL's work on the task. The report is in three parts. The first explains the technical approach and methodology. The second contains an example application of the methodology. The third presents the conclusions of the study. PNL used the model and computer programs developed as part of Task C.5 (Estimation of Inspection Efforts) of the Program of Technical Assistance. The example application of the methodology involves low-enriched uranium conversion and fuel fabrication facilities. The effects of variations in seven parameters are considered: false alarm probability, goal probability of detection, detection goal quantity, the plant operator's measurement capability, the inspector's variables measurement capability, the inspector's attributes measurement capability, and annual plant throughput. Among the key results and conclusions of the analysis are the following: the variables with the greatest impact on the probability of detection are the inspector's measurement capability, the goal quantity, and the throughput; the variables with the greatest impact on inspection costs are the throughput, the goal quantity, and the goal probability of detection; and there are important interactions between variables, that is, the effects of a given variable often depend on the value of some other variable. With the methodology used in this study, these interactions can be quantitatively analyzed, and reasonably good approximate prediction equations can be developed.

  7. Development and application of a methodology for a clean development mechanism to avoid methane emissions in closed landfills.

    PubMed

    Janke, Leandro; Lima, André O S; Millet, Maurice; Radetski, Claudemir M

    2013-01-01

    In Brazil, Solid Waste Disposal Sites have operated without consideration of environmental criteria, these areas being characterized by methane (CH4) emissions during the anaerobic degradation of organic matter. The United Nations has made efforts to control this situation through the United Nations Framework Convention on Climate Change (UNFCCC) and the Kyoto Protocol, whereby projects that seek to reduce the emissions of greenhouse gases (GHG) can be financially rewarded through Certified Emission Reductions (CERs) if they respect the requirements established by the Clean Development Mechanism (CDM), such as the use of methodologies approved by the CDM Executive Board (CDM-EB). Thus, a methodology was developed according to the CDM standards related to the aeration, excavation and composting of closed Municipal Solid Waste (MSW) landfills, which was submitted to the CDM-EB for assessment and, after its approval, applied to a real case study in Maringá City (Brazil) with a view to avoiding negative environmental impacts due to the production of methane and leachates even after a landfill's closure. This paper describes the establishment of this CDM-EB-approved methodology to determine baseline emissions, project emissions and the resultant emission reductions with the application of appropriate aeration, excavation and composting practices at closed MSW landfills. A further result obtained through the application of the methodology in the landfill case study was that it would be possible to achieve an ex-ante emission reduction of 74,013 tCO2 equivalent if the proposed CDM project activity were implemented.
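
    At its simplest, the bookkeeping behind such a methodology reduces to subtracting project emissions from baseline emissions; the minimal sketch below shows that arithmetic with placeholder figures (the paper's own ex-ante estimate for the case study was 74,013 tCO2e).

    ```python
    # Minimal sketch of the emission-reduction bookkeeping the methodology formalises:
    # reductions = baseline emissions - project emissions. All figures are placeholders.
    baseline_emissions_tco2e = 80_000   # CH4 the closed landfill would emit (placeholder)
    project_emissions_tco2e = 6_000     # from aeration/excavation/composting (placeholder)

    emission_reductions = baseline_emissions_tco2e - project_emissions_tco2e
    print(f"Emission reductions: {emission_reductions:,} tCO2e")
    ```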

  8. Assessment of reproductive and developmental effects of DINP, DnHP and DCHP using quantitative weight of evidence.

    PubMed

    Dekant, Wolfgang; Bridges, James

    2016-11-01

    Quantitative weight of evidence (QWoE) methodology utilizes detailed scoring sheets to assess the quality/reliability of each publication on the toxicity of a chemical and gives numerical scores for quality and observed toxicity. This QWoE methodology was applied to the reproductive toxicity data on diisononylphthalate (DINP), di-n-hexylphthalate (DnHP), and dicyclohexylphthalate (DCHP) to determine whether the scientific evidence for adverse effects meets the requirements for classification as reproductive toxicants. The scores for DINP were compared with those obtained when applying the methodology to DCHP and DnHP, which have harmonized classifications. Based on the quality/reliability scores, application of the QWoE shows that the three databases are of similar quality, but the effect scores differ widely. Application of QWoE to the DINP studies resulted in an overall score well below the benchmark required to trigger classification. For DCHP, the QWoE also results in low scores. The high scores from the application of the QWoE methodology to the toxicological data for DnHP represent clear evidence for adverse effects and justify a classification of DnHP as category 1B for both development and fertility. The conclusions on classification based on the QWoE are well supported by a narrative assessment of consistency and biological plausibility. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  9. DB4US: A Decision Support System for Laboratory Information Management

    PubMed Central

    Hortas, Maria Luisa; Baena-García, Manuel; Lana-Linati, Jorge; González, Carlos; Redondo, Maximino; Morales-Bueno, Rafael

    2012-01-01

    Background Until recently, laboratory automation has focused primarily on improving hardware. Future advances are concentrated on intelligent software, since laboratories performing clinical diagnostic testing require improved information systems to address their data processing needs. In this paper, we propose DB4US, an application that automates the management of information related to laboratory quality indicators. Currently, there is a lack of ready-to-use management quality measures. This application addresses this deficiency through the extraction, consolidation, statistical analysis, and visualization of data related to the use of demographics, reagents, and turn-around times. The design and implementation issues, as well as the technologies used for the implementation of this system, are discussed in this paper. Objective To develop a general methodology that integrates the computation of ready-to-use management quality measures and a dashboard to easily analyze the overall performance of a laboratory, as well as automatically detect anomalies or errors. The novelty of our approach lies in the application of integrated web-based dashboards as an information management system in hospital laboratories. Methods We propose a new methodology for laboratory information management based on the extraction, consolidation, statistical analysis, and visualization of data related to demographics, reagents, and turn-around times, offering a dashboard-like user web interface to the laboratory manager. The methodology comprises a unified data warehouse that stores and consolidates multidimensional data from different data sources. The methodology is illustrated through the implementation and validation of DB4US, a novel web application based on this methodology that constructs an interface to obtain ready-to-use indicators, and offers the possibility to drill down from high-level metrics to more detailed summaries. The offered indicators are calculated beforehand so that they are ready to use when the user needs them. The design is based on a set of different parallel processes to precalculate indicators. The application displays information related to tests, requests, samples, and turn-around times. The dashboard is designed to show the set of indicators on a single screen. Results DB4US was deployed for the first time in the Hospital Costa del Sol in 2008. In our evaluation we show the positive impact of this methodology for laboratory professionals, since the use of our application has reduced the time needed for the elaboration of the different statistical indicators and has also provided information that has been used to optimize the usage of laboratory resources by the discovery of anomalies in the indicators. DB4US users benefit from Internet-based communication of results, since this information is available from any computer without having to install any additional software. Conclusions The proposed methodology and the accompanying web application, DB4US, automate the processing of information related to laboratory quality indicators and offer a novel approach for managing laboratory-related information, benefiting from an Internet-based communication mechanism. The application of this methodology has been shown to improve the usage of time, as well as other laboratory resources. PMID:23608745

  10. A Generalizable Methodology for Quantifying User Satisfaction

    NASA Astrophysics Data System (ADS)

    Huang, Te-Yuan; Chen, Kuan-Ta; Huang, Polly; Lei, Chin-Laung

    Quantifying user satisfaction is essential, because the results can help service providers deliver better services. In this work, we propose a generalizable methodology, based on survival analysis, to quantify user satisfaction in terms of session times, i.e., the length of time users stay with an application. Unlike subjective human surveys, our methodology is based solely on passive measurement, which is more cost-efficient and better able to capture subconscious reactions. Furthermore, by using session times rather than a specific performance indicator, such as the level of distortion of voice signals, the effects of other factors, like loudness and sidetone, can also be captured by the developed models. Like survival analysis, our methodology is characterized by low complexity and a simple model-developing process. The feasibility of our methodology is demonstrated through case studies of ShenZhou Online, a commercial MMORPG in Taiwan, and the most prevalent VoIP application in the world, namely Skype. Through the model development process, we can also identify the most significant performance factors and their impacts on user satisfaction, and discuss how they can be exploited to improve user experience and optimize resource allocation.
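
    The survival-analysis core of such a methodology can be illustrated with a hand-rolled Kaplan-Meier estimator over session times, treating sessions still in progress as censored; the data below are invented and the code is a generic sketch, not the authors' implementation.

    ```python
    # Generic Kaplan-Meier sketch over session times; observed=0 marks a session
    # still in progress (censored). Data are invented for illustration.
    import numpy as np

    def kaplan_meier(durations, observed):
        """Return (event times, survival probabilities)."""
        durations = np.asarray(durations, float)
        observed = np.asarray(observed, int)
        # sort by time; at ties, process observed ends before censorings
        order = np.lexsort((1 - observed, durations))
        t, d = durations[order], observed[order]
        times, surv, s, n = [], [], 1.0, len(t)
        for i in range(n):
            at_risk = n - i
            if d[i]:                              # an observed session end
                s *= (at_risk - 1) / at_risk
                times.append(t[i])
                surv.append(s)
        return np.array(times), np.array(surv)

    sessions = [12, 30, 45, 45, 60, 90, 120, 150]   # minutes
    ended    = [1,  1,  1,  0,  1,  0,  1,   1]     # 0 = still online (censored)
    for t, s in zip(*kaplan_meier(sessions, ended)):
        print(f"S({t:>5.0f} min) = {s:.3f}")
    ```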

  11. Evaluation of Structural Robustness against Column Loss: Methodology and Application to RC Frame Buildings

    PubMed Central

    Bao, Yihai; Main, Joseph A.; Noh, Sam-Young

    2017-01-01

    A computational methodology is presented for evaluating structural robustness against column loss. The methodology is illustrated through application to reinforced concrete (RC) frame buildings, using a reduced-order modeling approach for three-dimensional RC framing systems that includes the floor slabs. Comparisons with high-fidelity finite-element model results are presented to verify the approach. Pushdown analyses of prototype buildings under column loss scenarios are performed using the reduced-order modeling approach, and an energy-based procedure is employed to account for the dynamic effects associated with sudden column loss. Results obtained using the energy-based approach are found to be in good agreement with results from direct dynamic analysis of sudden column loss. A metric for structural robustness is proposed, calculated by normalizing the ultimate capacities of the structural system under sudden column loss by the applicable service-level gravity loading and by evaluating the minimum value of this normalized ultimate capacity over all column removal scenarios. The procedure is applied to two prototype 10-story RC buildings, one employing intermediate moment frames (IMFs) and the other employing special moment frames (SMFs). The SMF building, with its more stringent seismic design and detailing, is found to have greater robustness. PMID:28890599
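
    The proposed robustness metric lends itself to a very small worked example: normalise each scenario's ultimate capacity by the service-level gravity load and take the minimum over scenarios. The capacities and load below are placeholders, not values from the paper.

    ```python
    # Worked sketch of the robustness metric: min over column-removal scenarios of
    # (ultimate capacity under sudden column loss / service-level gravity load).
    ultimate_capacity = {            # kN, from energy-based pushdown analyses (placeholders)
        "corner column": 5200.0,
        "edge column": 6100.0,
        "interior column": 7400.0,
    }
    service_gravity_load = 4000.0    # kN, applicable service-level loading (placeholder)

    normalised = {s: c / service_gravity_load for s, c in ultimate_capacity.items()}
    robustness = min(normalised.values())
    worst = min(normalised, key=normalised.get)
    print(f"Robustness index = {robustness:.2f} (governed by {worst} loss)")
    ```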

  12. A hierarchical clustering methodology for the estimation of toxicity.

    PubMed

    Martin, Todd M; Harten, Paul; Venkatapathy, Raghuraman; Das, Shashikala; Young, Douglas M

    2008-01-01

    A quantitative structure-activity relationship (QSAR) methodology based on hierarchical clustering was developed to predict toxicological endpoints. This methodology utilizes Ward's method to divide a training set into a series of structurally similar clusters. The structural similarity is defined in terms of 2-D physicochemical descriptors (such as connectivity and E-state indices). A genetic algorithm-based technique is used to generate statistically valid QSAR models for each cluster (using the pool of descriptors described above). The toxicity for a given query compound is estimated using the weighted average of the predictions from the closest cluster from each step in the hierarchical clustering, assuming that the compound is within the domain of applicability of the cluster. The hierarchical clustering methodology was tested using a Tetrahymena pyriformis acute toxicity data set containing 644 chemicals in the training set and two prediction sets containing 339 and 110 chemicals. The results from the hierarchical clustering methodology were compared to the results from several different QSAR methodologies.
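
    A minimal sketch of the clustering step (Ward's method on molecular descriptors, cut into a fixed number of clusters) is shown below with synthetic data; a cluster-mean prediction stands in for the paper's genetic-algorithm QSAR fitting.

    ```python
    # Sketch of the clustering step only: Ward's method on synthetic descriptors.
    # A cluster-mean "model" replaces the paper's genetic-algorithm QSAR fitting.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(0)
    descriptors = rng.normal(size=(40, 5))           # e.g. connectivity/E-state indices
    toxicity = descriptors[:, 0] * 2.0 + rng.normal(scale=0.1, size=40)

    Z = linkage(descriptors, method="ward")          # Ward's hierarchical clustering
    labels = fcluster(Z, t=4, criterion="maxclust")  # cut the tree into 4 clusters

    for c in np.unique(labels):
        members = labels == c
        print(f"cluster {c}: n={members.sum():2d}, "
              f"mean toxicity={toxicity[members].mean():+.2f}")
    ```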

  13. Report on an Assessment of the Application of EPP Results from the Strain Limit Evaluation Procedure to the Prediction of Cyclic Life Based on the SMT Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jetter, R. I.; Messner, M. C.; Sham, T. -L.

    The goal of the proposed integrated Elastic Perfectly-Plastic (EPP) and Simplified Model Test (SMT) methodology is to incorporate an SMT data based approach for creep-fatigue damage evaluation into the EPP methodology to avoid the separate evaluation of creep and fatigue damage and eliminate the requirement for stress classification in current methods; thus greatly simplifying evaluation of elevated temperature cyclic service. This methodology should minimize over-conservatism while properly accounting for localized defects and stress risers. To support the implementation of the proposed methodology and to verify the applicability of the code rules, analytical studies and evaluation of thermomechanical test results continued in FY17. This report presents the results of those studies. An EPP strain limits methodology assessment was based on recent two-bar thermal ratcheting test results on 316H stainless steel in the temperature range of 405 to 705 °C. Strain range predictions from the EPP evaluation of the two-bar tests were also evaluated and compared with the experimental results. The role of sustained primary loading on cyclic life was assessed using the results of pressurized SMT data from tests on Alloy 617 at 950 °C. A viscoplastic material model was used in an analytic simulation of two-bar tests to compare with EPP strain limits assessments using isochronous stress strain curves that are consistent with the viscoplastic material model. A finite element model of a prior 304H stainless steel Oak Ridge National Laboratory (ORNL) nozzle-to-sphere test was developed and used for EPP strain limits and creep-fatigue code case damage evaluations. A theoretical treatment of a recurring issue with convergence criteria for plastic shakedown illustrated the role of computer machine precision in EPP calculations.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faidy, C.; Gilles, P.

    The objective of the seminar was to present the current state of the art in Leak-Before-Break (LBB) methodology development, validation, and application in an international forum. With particular emphasis on industrial applications and regulatory policies, the seminar provided an opportunity to compare approaches, experiences, and codifications developed by different countries. The seminar was organized into four topic areas: status of LBB applications; technical issues in LBB methodology; complementary requirements (leak detection and inspection); LBB assessment and margins. As a result of this seminar, an improved understanding of LBB, gained through sharing of different viewpoints from different countries, permits consideration of: simplified pipe support design and possible elimination of loss-of-coolant-accident (LOCA) mechanical consequences for specific cases; defense-in-depth type of applications without support modifications; support of safety cases for plants designed without the LOCA hypothesis. In support of these activities, better estimates of the limits to the LBB approach should follow, as well as an improvement in codifying methodologies. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.

  15. The Statistical point of view of Quality: the Lean Six Sigma methodology

    PubMed Central

    Viti, Andrea; Terzi, Alberto

    2015-01-01

    Six Sigma and Lean are two quality improvement methodologies. The Lean Six Sigma methodology is applicable to repetitive procedures; therefore, its use in the health-care arena has focused mainly on areas of business operations, throughput, and case management, and on efficiency outcomes. After a review of the methodology, the paper presents a brief clinical example of the use of Lean Six Sigma as a quality improvement method in the reduction of complications during and after lobectomies. Using the Lean Six Sigma methodology, multidisciplinary teams could identify multiple modifiable points across the surgical process. These process improvements could be applied to different surgical specialties and could result in a measurement, from a statistical point of view, of surgical quality. PMID:25973253

  16. The Statistical point of view of Quality: the Lean Six Sigma methodology.

    PubMed

    Bertolaccini, Luca; Viti, Andrea; Terzi, Alberto

    2015-04-01

    Six Sigma and Lean are two quality improvement methodologies. The Lean Six Sigma methodology is applicable to repetitive procedures; therefore, its use in the health-care arena has focused mainly on areas of business operations, throughput, and case management, and on efficiency outcomes. After a review of the methodology, the paper presents a brief clinical example of the use of Lean Six Sigma as a quality improvement method in the reduction of complications during and after lobectomies. Using the Lean Six Sigma methodology, multidisciplinary teams could identify multiple modifiable points across the surgical process. These process improvements could be applied to different surgical specialties and could result in a measurement, from a statistical point of view, of surgical quality.

  17. Interrogating discourse: the application of Foucault's methodological discussion to specific inquiry.

    PubMed

    Fadyl, Joanna K; Nicholls, David A; McPherson, Kathryn M

    2013-09-01

    Discourse analysis following the work of Michel Foucault has become a valuable methodology in the critical analysis of a broad range of topics relating to health. However, it can be a daunting task, in that there seems to be both a huge number of possible approaches to carrying out this type of project, and an abundance of different, often conflicting, opinions about what counts as 'Foucauldian'. This article takes the position that methodological design should be informed by ongoing discussion and applied as appropriate to a particular area of inquiry. The discussion given offers an interpretation and application of Foucault's methodological principles, integrating a reading of Foucault with applications of his work by other authors, showing how this is then applied to interrogate the practice of vocational rehabilitation. It is intended as a contribution to methodological discussion in this area, offering an interpretation of various methodological elements described by Foucault, alongside specific application of these aspects.

  18. Estimating the return on investment in disease management programs using a pre-post analysis.

    PubMed

    Fetterolf, Donald; Wennberg, David; Devries, Andrea

    2004-01-01

    Disease management programs have become increasingly popular over the past 5-10 years. Recent increases in overall medical costs have precipitated new concerns about the cost-effectiveness of medical management programs, concerns that have extended to the directors of these programs. The initial success of the disease management movement is being challenged on the grounds that reported results have been the product of faulty, if intuitive, methodologies. This paper discusses the use of "pre-post" methodology approaches in the analysis of disease management programs, and areas where application of this approach can result in spurious results and incorrect financial outcome assessments. The paper includes a checklist of these items for use by operational staff working with the programs, and a comprehensive bibliography that addresses many of the issues discussed.

  19. A Visual Programming Methodology for Tactical Aircrew Scheduling and Other Applications

    DTIC Science & Technology

    1991-12-01

    programming methodology and environment of a user-specific application remains with and is delivered as part of the application, then there is another factor...animation is useful, not only for scheduling applications, but as a general programming methodology. Of course, there are a number of improvements...possible using Excel because there is nothing to prevent access to cells. However, it is easy to imagine a spreadsheet which can support the

  20. A Model-based Prognostics Methodology for Electrolytic Capacitors Based on Electrical Overstress Accelerated Aging

    NASA Technical Reports Server (NTRS)

    Celaya, Jose; Kulkarni, Chetan; Biswas, Gautam; Saha, Sankalita; Goebel, Kai

    2011-01-01

    A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology is based on the Kalman filter framework and an empirical degradation model. Electrolytic capacitors are used in several applications ranging from power supplies on critical avionics equipment to power drivers for electro-mechanical actuators. These devices are known for their comparatively low reliability, and given their criticality in electronics subsystems they are a good candidate for component-level prognostics and health management. Prognostics provides a way to assess the remaining useful life of a capacitor based on its current state of health and its anticipated future usage and operational conditions. We also present experimental results of an accelerated aging test under electrical stresses. The data obtained in this test form the basis for a remaining life prediction algorithm in which a model of the degradation process is suggested. This preliminary remaining life prediction algorithm serves as a demonstration of how prognostics methodologies could be used for electrolytic capacitors. In addition, the use of degradation progression data from accelerated aging provides an avenue for validation of applications of the Kalman filter based prognostics methods typically used for remaining useful life predictions in other applications.
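
    A scalar Kalman-filter tracking loop of the general kind described above can be sketched as follows, assuming a linear degradation model for percent capacitance loss and an end-of-life threshold; the model structure and all numbers are illustrative, not the authors'.

    ```python
    # Illustrative scalar Kalman-filter RUL loop: percent capacitance loss evolves
    # under an assumed linear degradation model and is corrected by noisy periodic
    # measurements. Every number below is a placeholder.
    import numpy as np

    dt, drift = 1.0, 0.08          # hours between measurements; assumed loss %/h
    Q, R = 1e-4, 0.05              # process and measurement noise variances
    eol = 20.0                     # end of life at 20 % capacitance loss

    x, P = 0.0, 1.0                # state estimate and its variance
    rng = np.random.default_rng(1)
    true_loss = 0.0
    for k in range(1, 101):
        true_loss += 0.1 * dt                       # hidden "true" degradation
        z = true_loss + rng.normal(scale=R**0.5)    # noisy measurement
        x, P = x + drift * dt, P + Q                # predict
        K = P / (P + R)                             # Kalman gain
        x, P = x + K * (z - x), (1 - K) * P         # update

    rate = max(x / (k * dt), 1e-9)                  # crude average loss rate
    print(f"loss = {x:.2f} %, predicted RUL = {(eol - x) / rate:.0f} h")
    ```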

  1. Towards A Model-Based Prognostics Methodology for Electrolytic Capacitors: A Case Study Based on Electrical Overstress Accelerated Aging

    NASA Technical Reports Server (NTRS)

    Celaya, Jose R.; Kulkarni, Chetan S.; Biswas, Gautam; Goebel, Kai

    2012-01-01

    A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology is based on the Kalman filter framework and an empirical degradation model. Electrolytic capacitors are used in several applications ranging from power supplies on critical avionics equipment to power drivers for electro-mechanical actuators. These devices are known for their comparatively low reliability, and given their criticality in electronics subsystems they are a good candidate for component-level prognostics and health management. Prognostics provides a way to assess the remaining useful life of a capacitor based on its current state of health and its anticipated future usage and operational conditions. We also present experimental results of an accelerated aging test under electrical stresses. The data obtained in this test form the basis for a remaining life prediction algorithm in which a model of the degradation process is suggested. This preliminary remaining life prediction algorithm serves as a demonstration of how prognostics methodologies could be used for electrolytic capacitors. In addition, the use of degradation progression data from accelerated aging provides an avenue for validation of applications of the Kalman filter based prognostics methods typically used for remaining useful life predictions in other applications.

  2. Methodological challenges when doing research that includes ethnic minorities: a scoping review.

    PubMed

    Morville, Anne-Le; Erlandsson, Lena-Karin

    2016-11-01

    There are challenging methodological issues in obtaining valid and reliable results on which to base occupational therapy interventions for ethnic minorities. The aim of this scoping review is to describe the methodological problems within occupational therapy research when ethnic minorities are included. A thorough literature search yielded 21 articles obtained from the scientific databases PubMed, Cinahl, Web of Science and PsychInfo. Analysis followed Arksey and O'Malley's framework for scoping reviews, applying content analysis. The results showed methodological issues concerning the entire research process: from defining and recruiting samples and establishing conceptual understanding, through the lack of appropriate instruments and data collection using interpreters, to analyzing the data. In order to avoid excluding ethnic minorities from adequate occupational therapy research and interventions, development of methods for the entire research process is needed. It is a costly and time-consuming process, but the results will be valid and reliable, and therefore more applicable in clinical practice.

  3. 3D and 4D Simulations for Landscape Reconstruction and Damage Scenarios: GIS Pilot Applications

    ERIC Educational Resources Information Center

    Pesaresi, Cristano; Van Der Schee, Joop; Pavia, Davide

    2017-01-01

    The project "3D and 4D Simulations for Landscape Reconstruction and Damage Scenarios: GIS Pilot Applications" has been devised with the intention to deal with the demand for research, innovation and applicative methodology on the part of the international programme, requiring concrete results to increase the capacity to know, anticipate…

  4. Ground Thermal Diffusivity Calculation by Direct Soil Temperature Measurement. Application to very Low Enthalpy Geothermal Energy Systems.

    PubMed

    Andújar Márquez, José Manuel; Martínez Bohórquez, Miguel Ángel; Gómez Melgar, Sergio

    2016-02-29

    This paper presents a methodology and instrumentation system for the indirect measurement of the thermal diffusivity of a soil at a given depth from measurements of its temperature at that depth. The development has been carried out considering its application to the design and sizing of very low enthalpy geothermal energy (VLEGE) systems, but it has many other applications, for example in construction, agriculture or biology. The methodology is simple and inexpensive because it can take advantage of the prescriptive geotechnical drilling prior to the construction of a house or building to take temperature measurements at the same time, which make it possible to obtain the actual temperature and ground thermal diffusivity down to the depth of interest. The methodology and the developed system have been tested and used in the design of a VLEGE facility for a chalet with a basement on the outskirts of Huelva (a city in the southwest of Spain). Experimental results validate the proposed approach.
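
    The abstract does not give the calculation itself; one classical route (assumed here, not taken from the paper) derives diffusivity from the amplitude damping of a periodic surface temperature wave between two depths, A(z) = A0*exp(-z/d) with damping depth d = sqrt(2*alpha/omega). The sketch below applies that relation to invented readings.

    ```python
    # Diffusivity from amplitude damping between two buried sensors (classical
    # relation, assumed for illustration; sensor depths and amplitudes invented).
    import math

    period_s = 24 * 3600                 # daily temperature wave
    omega = 2 * math.pi / period_s
    z1, z2 = 0.10, 0.30                  # sensor depths, m (placeholders)
    A1, A2 = 6.0, 1.8                    # temperature amplitudes at z1, z2, K

    d = (z2 - z1) / math.log(A1 / A2)    # damping depth from the amplitude ratio
    alpha = 0.5 * omega * d**2           # thermal diffusivity, m^2/s
    print(f"damping depth = {d:.3f} m, diffusivity = {alpha:.2e} m^2/s")
    ```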

  5. Interpreting drinking water quality in the distribution system using Dempster-Shafer theory of evidence.

    PubMed

    Sadiq, Rehan; Rodriguez, Manuel J

    2005-04-01

    Interpreting water quality data routinely generated for control and monitoring purposes in water distribution systems is a complicated task for utility managers. In fact, data for diverse water quality indicators (physico-chemical and microbiological) are generated at different times and at different locations in the distribution system. To simplify and improve the understanding and the interpretation of water quality, methodologies for aggregation and fusion of data must be developed. In this paper, the Dempster-Shafer theory, also called the theory of evidence, is introduced as a potential methodology for interpreting water quality data. The conceptual basis of this methodology and the process for its implementation are presented through two applications. The first application deals with the fusion of spatial water quality data, while the second deals with the development of a water quality index based on key monitored indicators. Based on the results obtained, the authors discuss the potential contribution of the theory of evidence as a decision-making tool for water quality management.
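
    Dempster's rule of combination, the fusion step named above, is compact enough to sketch directly: masses assigned to intersecting focal elements are multiplied and summed, and conflicting mass is renormalised away. The water-quality frame and the mass values below are invented for illustration.

    ```python
    # Dempster's rule of combination over an invented frame {good, marginal, poor}.
    from itertools import product

    def dempster_combine(m1, m2):
        """Combine two basic mass assignments keyed by frozenset focal elements."""
        combined, conflict = {}, 0.0
        for (a, wa), (b, wb) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb          # mass on disjoint elements
        return {k: v / (1.0 - conflict) for k, v in combined.items()}

    G, M, P = frozenset({"good"}), frozenset({"marginal"}), frozenset({"poor"})
    turbidity = {G: 0.6, G | M: 0.3, G | M | P: 0.1}   # evidence from one indicator
    coliforms = {G: 0.4, M: 0.4, G | M | P: 0.2}       # evidence from another

    fused = dempster_combine(turbidity, coliforms)
    for focal, mass in sorted(fused.items(), key=lambda kv: -kv[1]):
        print(sorted(focal), round(mass, 3))
    ```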

  6. Ground Thermal Diffusivity Calculation by Direct Soil Temperature Measurement. Application to very Low Enthalpy Geothermal Energy Systems

    PubMed Central

    Andújar Márquez, José Manuel; Martínez Bohórquez, Miguel Ángel; Gómez Melgar, Sergio

    2016-01-01

    This paper presents a methodology and instrumentation system for the indirect measurement of the thermal diffusivity of a soil at a given depth from measurements of its temperature at that depth. The development has been carried out considering its application to the design and sizing of very low enthalpy geothermal energy (VLEGE) systems, but it has many other applications, for example in construction, agriculture or biology. The methodology is simple and inexpensive because it can take advantage of the prescriptive geotechnical drilling prior to the construction of a house or building to take temperature measurements at the same time, which make it possible to obtain the actual temperature and ground thermal diffusivity down to the depth of interest. The methodology and the developed system have been tested and used in the design of a VLEGE facility for a chalet with a basement on the outskirts of Huelva (a city in the southwest of Spain). Experimental results validate the proposed approach. PMID:26938534

  7. Choosing the Most Effective Pattern Classification Model under Learning-Time Constraint.

    PubMed

    Saito, Priscila T M; Nakamura, Rodrigo Y M; Amorim, Willian P; Papa, João P; de Rezende, Pedro J; Falcão, Alexandre X

    2015-01-01

    Nowadays, large datasets are common and demand faster and more effective pattern analysis techniques. However, methodologies to compare classifiers usually do not take into account the learning-time constraints required by applications. This work presents a methodology to compare classifiers with respect to their ability to learn from classification errors on a large learning set, within a given time limit. Faster techniques may acquire more training samples, but only when they are more effective will they achieve higher performance on unseen testing sets. We demonstrate this result using several techniques, multiple datasets, and typical learning-time limits required by applications.
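
    A sketch of the comparison protocol follows: each classifier keeps acquiring training samples in batches until a shared wall-clock budget expires, after which all are scored on the same held-out set. The classifiers, data, and budget are stand-ins, not those used in the paper.

    ```python
    # Time-budgeted comparison sketch: grow the training set in batches until the
    # budget expires, then score all models on the same held-out set.
    import time
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import SGDClassifier
    from sklearn.neighbors import KNeighborsClassifier

    X, y = make_classification(n_samples=20_000, n_features=20, random_state=0)
    X_tr, y_tr, X_te, y_te = X[:15_000], y[:15_000], X[15_000:], y[15_000:]

    def fit_within_budget(model, budget_s=0.5, batch=1_000):
        used, t0 = 0, time.perf_counter()
        while used + batch <= len(X_tr) and time.perf_counter() - t0 < budget_s:
            used += batch
            model.fit(X_tr[:used], y_tr[:used])   # refit on all samples so far
        return model, used

    for model in (SGDClassifier(random_state=0), KNeighborsClassifier()):
        fitted, used = fit_within_budget(model)
        print(f"{type(model).__name__:22s} samples={used:6d} "
              f"accuracy={fitted.score(X_te, y_te):.3f}")
    ```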

  8. Reliability Prediction Analysis: Airborne System Results and Best Practices

    NASA Astrophysics Data System (ADS)

    Silva, Nuno; Lopes, Rui

    2013-09-01

    This article presents the results of several reliability prediction analyses for aerospace components, performed with both the 217F and the 217Plus methodologies. Supporting and complementary activities are described, as well as the differences concerning the results and the applications of both methodologies, which are summarized in a set of lessons learned that are very useful for RAMS and Safety Prediction practitioners. The effort required for these activities is also an important point of discussion, as are the end results and their interpretation/impact on the system design. The article concludes by positioning these activities and methodologies within an overall process for space and aeronautics equipment/component certification, and by highlighting their advantages. Some good practices have also been summarized and some reuse rules have been laid down.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Janjusic, Tommy; Kartsaklis, Christos

    Memory scalability is an enduring problem and bottleneck that plagues many parallel codes. Parallel codes designed for High Performance Systems are typically designed over the span of several, and in some instances 10+, years. As a result, optimization practices which were appropriate for earlier systems may no longer be valid and thus require careful optimization consideration. Specifically, parallel codes whose memory footprint is a function of their scalability must be carefully considered for future exa-scale systems. In this paper we present a methodology and tool to study the memory scalability of parallel codes. Using our methodology we evaluate an application's memory footprint as a function of scalability, which we coined memory efficiency, and describe our results. In particular, using our in-house tools we can pinpoint the specific application components which contribute to the application's overall memory footprint (application data-structures, libraries, etc.).

  10. Application of atomic force microscopy as a nanotechnology tool in food science.

    PubMed

    Yang, Hongshun; Wang, Yifen; Lai, Shaojuan; An, Hongjie; Li, Yunfei; Chen, Fusheng

    2007-05-01

    Atomic force microscopy (AFM) provides a method for detecting nanoscale structural information. First, this review explains the fundamentals of AFM, including its principles, manipulation, and analysis. Applications of AFM in food science and technology research are then reported, including qualitative macromolecule and polymer imaging, complicated or quantitative structure analysis, molecular interactions, molecular manipulation, surface topography, and nanofood characterization. The results suggest that AFM can provide insightful knowledge of food properties, and that AFM analysis can be used to illustrate some mechanisms of property changes during processing and storage. However, the current difficulty in applying AFM to food research is the lack of appropriate methodology for different food systems. A better understanding of AFM technology and the development of corresponding methodology for complicated food systems would lead to a more in-depth understanding of food properties at the macromolecular level and broaden its applications.

  11. Modern proposal of methodology for retrieval of characteristic synthetic rainfall hyetographs

    NASA Astrophysics Data System (ADS)

    Licznar, Paweł; Burszta-Adamiak, Ewa; Łomotowski, Janusz; Stańczyk, Justyna

    2017-11-01

    The modern engineering practice of designing and modelling complex drainage systems is based on hydrodynamic modelling and has a probabilistic character. Its practical application requires a change in the rainfall models accepted as input. Previously used artificial rainfall models of simplified form, e.g. block precipitation or Euler type II design storms, are no longer sufficient. There is an urgent need to clarify the methodology for standardized rainfall hyetographs that takes into consideration the specifics of local storm rainfall temporal dynamics. The aim of the paper is to present a proposal for an innovative methodology for determining standardized rainfall hyetographs, based on statistical processing of collections of actual local precipitation records. The proposed methodology is based on the classification of standardized rainfall hyetographs with the use of cluster analysis. Its application is presented using the example of selected rain gauges located in Poland. The synthetic rainfall hyetographs obtained as a final result may be used for hydrodynamic modelling of sewerage systems, including probabilistic determination of the necessary capacity of retention reservoirs.
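
    A minimal sketch of such a pipeline follows, with synthetic storms standing in for real pluviograph records: each storm is reduced to a dimensionless cumulative (mass-curve) hyetograph and the curves are grouped by cluster analysis, here plain k-means rather than whatever variant the authors adopted.

    ```python
    # Synthetic stand-in: standardize storms to dimensionless mass curves, cluster.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(2)
    n_storms, n_steps = 60, 20
    # alternate front-loaded (decreasing intensity) and back-loaded storms
    shapes = [np.sort(rng.random(n_steps))[::-1] if i % 2 else np.sort(rng.random(n_steps))
              for i in range(n_storms)]
    cum = np.cumsum(shapes, axis=1)
    standardized = cum / cum[:, -1:]               # mass curves rising to 1.0

    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(standardized)
    for c in range(2):
        centroid = km.cluster_centers_[c]
        print(f"cluster {c}: {np.sum(km.labels_ == c)} storms, half of the rain "
              f"by step {np.searchsorted(centroid, 0.5)} of {n_steps}")
    ```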

  12. A deviation based assessment methodology for multiple machine health patterns classification and fault detection

    NASA Astrophysics Data System (ADS)

    Jia, Xiaodong; Jin, Chao; Buzza, Matt; Di, Yuan; Siegel, David; Lee, Jay

    2018-01-01

    Successful applications of Diffusion Map (DM) in machine failure detection and diagnosis have been reported in several recent studies. DM provides an efficient way to visualize high-dimensional, complex and nonlinear machine data, and thus yields more knowledge about the machine under monitoring. In this paper, a DM based methodology named DM-EVD is proposed for machine degradation assessment, abnormality detection and diagnosis in an online fashion. Several limitations and challenges of using DM for machine health monitoring have been analyzed and addressed. Based on the proposed DM-EVD, a deviation based methodology is then proposed to include more dimension reduction methods. In this work, the incorporation of Laplacian Eigen-map and Principal Component Analysis (PCA) is explored; the latter algorithm is named PCA-Dev and is validated in the case study. To show the successful application of the proposed methodology, case studies from diverse fields are presented and investigated in this work. Improved results are reported by benchmarking with other machine learning algorithms.
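
    In the same deviation-based spirit as the paper's PCA variant, the sketch below fits PCA on healthy baseline data and scores new samples by reconstruction error against an empirical control limit; it is a generic reconstruction, not the authors' DM-EVD or PCA-Dev algorithm.

    ```python
    # Generic deviation-based health sketch: PCA subspace of healthy data,
    # reconstruction error as the deviation score. Data are synthetic.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(3)
    healthy = rng.normal(size=(200, 8))                     # baseline sensor features
    faulty = healthy[:20] + rng.normal(3.0, 1.0, (20, 8))   # shifted "degraded" data

    pca = PCA(n_components=3).fit(healthy)

    def deviation(X):
        """Reconstruction error of X under the healthy-data PCA subspace."""
        recon = pca.inverse_transform(pca.transform(X))
        return np.linalg.norm(X - recon, axis=1)

    threshold = np.percentile(deviation(healthy), 99)       # empirical control limit
    alarms = deviation(faulty) > threshold
    print(f"threshold={threshold:.2f}, flagged {alarms.sum()}/{len(alarms)} samples")
    ```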

  13. Structural Sizing Methodology for the Tendon-Actuated Lightweight In-Space MANipulator (TALISMAN) System

    NASA Technical Reports Server (NTRS)

    Jones, Thomas C.; Dorsey, John T.; Doggett, William R.

    2015-01-01

    The Tendon-Actuated Lightweight In-Space MANipulator (TALISMAN) is a versatile long-reach robotic manipulator that is currently being tested at NASA Langley Research Center. TALISMAN is designed to be highly mass-efficient and multi-mission capable, with applications including asteroid retrieval and manipulation, in-space servicing, and astronaut and payload positioning. The manipulator uses a modular, periodic, tension-compression design that lends itself well to analytical modeling. Given the versatility of application for TALISMAN, a structural sizing methodology was developed that could rapidly assess mass and configuration sensitivities for any specified operating work space, applied loads and mission requirements. This methodology allows the systematic sizing of the key structural members of TALISMAN, which include the truss arm links, the spreaders and the tension elements. This paper summarizes the detailed analytical derivations and methodology that support the structural sizing approach and provides results from some recent TALISMAN designs developed for current and proposed mission architectures.

  14. A Formal Methodology to Design and Deploy Dependable Wireless Sensor Networks

    PubMed Central

    Testa, Alessandro; Cinque, Marcello; Coronato, Antonio; Augusto, Juan Carlos

    2016-01-01

    Wireless Sensor Networks (WSNs) are being increasingly adopted in critical applications, where verifying the correct operation of sensor nodes is a major concern. Undesired events may undermine the mission of the WSNs. Hence, their effects need to be properly assessed before deployment, to obtain a good level of expected performance, and during operation, in order to avoid dangerous unexpected results. In this paper, we propose a methodology that aims at assessing and improving the dependability level of WSNs by means of an event-based formal verification technique. The methodology includes a process to guide designers towards the realization of a dependable WSN and a tool ("ADVISES") to simplify its adoption. The tool is applicable to homogeneous WSNs with static routing topologies. It allows the automatic generation of formal specifications used to check correctness properties and evaluate dependability metrics at design time and at runtime for WSNs where an acceptable percentage of faults can be defined. During runtime, we can check the behavior of the WSN according to the results obtained at design time and we can detect sudden and unexpected failures, in order to trigger recovery procedures. The effectiveness of the methodology is shown in the context of two case studies, as proof-of-concept, aiming to illustrate how the tool is helpful to drive design choices and to check the correctness properties of the WSN at runtime. Although the method scales up to very large WSNs, the applicability of the methodology may be compromised by the state space explosion of the reasoning model, which must be faced by partitioning large topologies into sub-topologies. PMID:28025568

  15. Analysis of Turbulent Boundary-Layer over Rough Surfaces with Application to Projectile Aerodynamics

    DTIC Science & Technology

    1988-12-01

    Table of contents and text excerpts: V. APPLICATION IN COMPONENT BUILD-UP METHODOLOGIES; 1. COMPONENT BUILD-UP IN DRAG. "...dimensional roughness. II. CLASSIFICATION OF PREDICTION METHODS: Prediction methods can be classified into two main approaches: 1) Correlation methodologies ... data are available. ... The new correlation can be used for an engineering..."

  16. Evaluating a collaborative IT based research and development project.

    PubMed

    Khan, Zaheer; Ludlow, David; Caceres, Santiago

    2013-10-01

    In common with all projects, evaluating an Information Technology (IT) based research and development project is necessary in order to discover whether or not the outcomes of the project are successful. However, evaluating large-scale collaborative projects is especially difficult as: (i) stakeholders from different countries are involved who, almost inevitably, have diverse technological and/or application domain backgrounds and objectives; (ii) multiple and sometimes conflicting application-specific and user-defined requirements exist; and (iii) multiple and often conflicting technological research and development objectives are apparent. In this paper, we share our experiences based on the large-scale integrated research project HUMBOLDT, with a project duration of 54 months and contributions from 27 partner organisations plus 4 sub-contractors from 14 different European countries. In the HUMBOLDT project, a specific evaluation methodology was defined and utilised for the user evaluation of the project outcomes. The user evaluation performed on the HUMBOLDT Framework and its associated nine application scenarios from various application domains resulted not only in an evaluation of the integrated project, but also revealed the benefits and disadvantages of the evaluation methodology. This paper presents the evaluation methodology, discusses in detail the process of applying it to the HUMBOLDT project and provides an in-depth analysis of the results, which can be usefully applied to other collaborative research projects in a variety of domains. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. Verification of nonlinear dynamic structural test results by combined image processing and acoustic analysis

    NASA Astrophysics Data System (ADS)

    Tene, Yair; Tene, Noam; Tene, G.

    1993-08-01

    An interactive data fusion methodology combining video, audio, and nonlinear structural dynamic analysis, with potential application in forensic engineering, is presented. The methodology was developed and successfully demonstrated in the analysis of the collapse of a heavy transportable bridge during preparation for testing. Multiple bridge element failures were identified after the collapse, including fracture, cracks and rupture of high-performance structural materials. A videotape recording from a hand-held camcorder was the only source of information about the collapse sequence. The interactive data fusion methodology resulted in extracting relevant information from the videotape and from dynamic nonlinear structural analysis, leading to a full account of the sequence of events during the bridge collapse.

  18. Regional health care planning: a methodology to cluster facilities using community utilization patterns

    PubMed Central

    2013-01-01

    Background Community-based health care planning and regulation necessitates grouping facilities and areal units into regions of similar health care use. Limited research has explored the methodologies used in creating these regions. We offer a new methodology that clusters facilities based on similarities in patient utilization patterns and geographic location. Our case study focused on Hospital Groups in Michigan, the allocation units used for predicting future inpatient hospital bed demand in the state’s Bed Need Methodology. The scientific, practical, and political concerns that were considered throughout the formulation and development of the methodology are detailed. Methods The clustering methodology employs a 2-step K-means + Ward’s clustering algorithm to group hospitals. The final number of clusters is selected using a heuristic that integrates both a statistical-based measure of cluster fit and characteristics of the resulting Hospital Groups. Results Using recent hospital utilization data, the clustering methodology identified 33 Hospital Groups in Michigan. Conclusions Despite being developed within the politically charged climate of Certificate of Need regulation, we have provided an objective, replicable, and sustainable methodology to create Hospital Groups. Because the methodology is built upon theoretically sound principles of clustering analysis and health care service utilization, it is highly transferable across applications and suitable for grouping facilities or areal units. PMID:23964905
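
    A 2-step K-means + Ward procedure of the kind named in the Methods can be sketched as follows: K-means first compresses facilities into many micro-clusters, then Ward's hierarchical clustering merges the micro-cluster centroids into the final groups. The utilization matrix and all cluster counts below are synthetic placeholders.

    ```python
    # Sketch of a 2-step K-means + Ward grouping over a synthetic utilization matrix.
    import numpy as np
    from sklearn.cluster import KMeans
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(4)
    # rows = hospitals, cols = share of patients drawn from each of 12 communities
    utilization = rng.dirichlet(alpha=np.ones(12), size=120)

    micro = KMeans(n_clusters=30, n_init=10, random_state=0).fit(utilization)
    Z = linkage(micro.cluster_centers_, method="ward")
    micro_to_group = fcluster(Z, t=8, criterion="maxclust")   # 8 final groups
    hospital_group = micro_to_group[micro.labels_]            # map back to hospitals

    print("hospitals per final group:", np.bincount(hospital_group)[1:])
    ```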

  19. Data Centric Development Methodology

    ERIC Educational Resources Information Center

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have been mostly adopting a software methodology, such as a waterfall or Rational Unified Process, as the framework for its development. These methodologies could work on structural, procedural, or object oriented based applications, but fails to capture…

  20. Suggestions for Job and Curriculum Ladders in Health Center Ambulatory Care: A Pilot Test of the Health Services Mobility Study Methodology.

    ERIC Educational Resources Information Center

    Gilpatrick, Eleanor

    This report contains the results of a pilot test which represents the first complete field test of methodological work begun in October 1967 under a Federal grant for the purpose of job analysis in the health services. This 4-year Health Services Mobility Study permitted basic research, field testing, practical application, and policy involvement…

  1. Investigating Dynamics of Eccentricity in Turbomachines

    NASA Technical Reports Server (NTRS)

    Baun, Daniel

    2010-01-01

    A methodology (and hardware and software to implement the methodology) has been developed as a means of investigating coupling between certain rotordynamic and hydrodynamic phenomena in turbomachines. Originally, the methodology was intended for application in an investigation of coupled rotordynamic and hydrodynamic effects postulated to have caused high synchronous vibration in the space shuttle's high-pressure oxygen turbopump (HPOTP). The methodology can also be applied in investigating (for the purpose of developing means of suppressing) undesired hydrodynamic rotor/stator interactions in turbomachines in general. The methodology and the types of phenomena that can be investigated by use of the methodology are best summarized by citing the original application as an example. In that application, in consideration of the high synchronous vibration in the space-shuttle main engine (SSME) HPOTP, it was determined to be necessary to perform tests to investigate the influence of inducer eccentricity and/or synchronous whirl motion on inducer hydrodynamic forces under prescribed flow and cavitation conditions. It was believed that manufacturing tolerances of the turbopump resulted in some induced runout of the pump rotor. Such runout, if oriented with an inducer blade, would cause that blade to run with a tip clearance smaller than the tip clearances of the other inducer blades. It was hypothesized that the resulting hydraulic asymmetry, coupled with alternating blade cavitation, could give rise to the observed high synchronous vibration. In tests performed to investigate this hypothesis, prescribed rotor whirl motions have been imposed on a 1/3-scale water-rig version of the SSME LPOTP inducer (a 4-bladed inducer with cavitation dynamics similar to those of the HPOTP) in a magnetic-bearing test facility. The particular magnetic-bearing test facility, through active vibration control, affords a capability to impose, on the rotor, whirl orbits having shapes and whirl rates prescribed by the user, and to simultaneously measure the resulting hydrodynamic forces generated by the impeller. Active control also made it possible to modulate the inducer-blade running tip clearance and consequently effect alternating blade cavitation. The measured hydraulic forces have been compared and correlated with shroud dynamic-pressure measurements.

  2. Recommendations for benefit-risk assessment methodologies and visual representations.

    PubMed

    Hughes, Diana; Waddingham, Ed; Mt-Isa, Shahrul; Goginsky, Alesia; Chan, Edmond; Downey, Gerald F; Hallgreen, Christine E; Hockley, Kimberley S; Juhaeri, Juhaeri; Lieftucht, Alfons; Metcalf, Marilyn A; Noel, Rebecca A; Phillips, Lawrence D; Ashby, Deborah; Micaleff, Alain

    2016-03-01

    The purpose of this study is to draw on the practical experience from the PROTECT BR case studies and make recommendations regarding the application of a number of methodologies and visual representations for benefit-risk assessment. Eight case studies based on the benefit-risk balance of real medicines were used to test various methodologies that had been identified from the literature as having potential applications in benefit-risk assessment. Recommendations were drawn up based on the results of the case studies. A general pathway through the case studies was evident, with various classes of methodologies having roles to play at different stages. Descriptive and quantitative frameworks were widely used throughout to structure problems, with other methods such as metrics, estimation techniques and elicitation techniques providing ways to incorporate technical or numerical data from various sources. Similarly, tree diagrams and effects tables were universally adopted, with other visualisations available to suit specific methodologies or tasks as required. Every assessment was found to follow five broad stages: (i) Planning, (ii) Evidence gathering and data preparation, (iii) Analysis, (iv) Exploration and (v) Conclusion and dissemination. Adopting formal, structured approaches to benefit-risk assessment was feasible in real-world problems and facilitated clear, transparent decision-making. Prior to this work, no extensive practical application and appraisal of methodologies had been conducted using real-world case examples, leaving users with limited knowledge of their usefulness in the real world. The practical guidance provided here takes us one step closer to a harmonised approach to benefit-risk assessment from multiple perspectives. Copyright © 2016 John Wiley & Sons, Ltd.

  3. Application-specific coarse-grained reconfigurable array: architecture and design methodology

    NASA Astrophysics Data System (ADS)

    Zhou, Li; Liu, Dongpei; Zhang, Jianfeng; Liu, Hengzhu

    2015-06-01

    Coarse-grained reconfigurable arrays (CGRAs) have shown potential for application in embedded systems in recent years. Numerous reconfigurable processing elements (PEs) in CGRAs provide flexibility while maintaining high performance by exploring different levels of parallelism. However, an efficiency gap remains between CGRAs and application-specific integrated circuits (ASICs). Some application domains, such as software-defined radios (SDRs), require flexibility even as performance demands increase. More effective CGRA architectures are therefore expected to be developed, and customisation of a CGRA according to its application can improve performance and efficiency. This study proposes an application-specific CGRA architecture template composed of generic PEs (GPEs) and special PEs (SPEs). The hardware of the SPE can be customised to accelerate specific computational patterns. An automatic design methodology that includes pattern identification and application-specific function unit generation is also presented. A mapping algorithm based on ant colony optimisation is provided. Experimental results on the SDR target domain show that, compared with other ordinary and application-specific reconfigurable architectures, the CGRA generated by the proposed method performs more efficiently for the given applications.

  4. Implications for Application of Qualitative Methods to Library and Information Science Research.

    ERIC Educational Resources Information Center

    Grover, Robert; Glazier, Jack

    1985-01-01

    Presents conceptual framework for library and information science research and analyzes research methodology that has application for information science, using as example results of study conducted by authors. Rationale for use of qualitative research methods in theory building is discussed and qualitative and quantitative research methods are…

  5. Independent evaluation of light-vehicle safety applications based on vehicle-to-vehicle communications used in the 2012-2013 safety pilot model deployment

    DOT National Transportation Integrated Search

    2015-12-01

    This report presents the methodology and results of the independent evaluation of safety applications for passenger vehicles in the 2012-2013 Safety Pilot Model Deployment, part of the United States Department of Transportation's Intelligent Transp...

  6. Application of Executable Architecture in Early Concept Evaluation using the DoD Architecture Framework

    DTIC Science & Technology

    2016-09-15

    [Table-of-contents fragment; recoverable structure: a Methodology Overview section and Chapter III (Methodology), comprising an overview of the research methodology and its implementation.]

  7. Novel thermal management system design methodology for power lithium-ion battery

    NASA Astrophysics Data System (ADS)

    Nieto, Nerea; Díaz, Luis; Gastelurrutia, Jon; Blanco, Francisco; Ramos, Juan Carlos; Rivas, Alejandro

    2014-12-01

    Battery packs composed of large-format lithium-ion cells are increasingly being adopted in hybrid and pure electric vehicles in order to use energy more efficiently and improve environmental performance. Safety and cycle life are two of the main concerns regarding this technology, and both are closely related to the cells' operating behavior and to temperature asymmetries in the system. The temperature of the cells in battery packs therefore needs to be controlled by thermal management systems (TMSs). In the present paper an improved design methodology for developing TMSs is proposed. This methodology involves the development of different mathematical models for heat generation, transmission, and dissipation, and their coupling and integration into the battery pack product design process in order to improve overall safety and performance. The methodology is validated by comparing simulation results with laboratory measurements on a single module of the battery pack designed at IK4-IKERLAN for a traction application. The maximum difference between model predictions and experimental temperature data is 2 °C. The models developed have shown potential for use in battery thermal management studies for EV/HEV applications, since they allow for scalability with accuracy and reasonable simulation time.

  8. Cooperative forestry inventory project for Nevada

    NASA Technical Reports Server (NTRS)

    Thornhill, R.

    1981-01-01

    A forest inventory project employing computerized classification of LANDSAT data to inventory vegetation types in western Nevada is described. The methodology and applicability of the resulting survey are summarized.

  9. Application of CFD in Indonesian Research: A review

    NASA Astrophysics Data System (ADS)

    Ambarita, H.; Siregar, M. R.; Kishinami, K.; Daimaruya, M.; Kawai, H.

    2018-04-01

    Computational Fluid Dynamics (CFD) is a numerical method for solving fluid flow and the related governing equations using a computational tool. Studies on CFD, both on its methodology and on its application as a research tool, are increasing. In this study, the application of CFD by Indonesian researchers is briefly reviewed. The main objective is to explore the characteristics of CFD applications by Indonesian researchers. Considering its size and reputation, this study uses the Scopus publication index as its database. All documents in Scopus related to CFD with at least one Indonesian-affiliated author were collected for review. Research topics, CFD methods, and simulation results are reviewed in brief. The results show that 260 documents were found in the Scopus-indexed literature: 125 research articles, 135 conference papers, 1 book and 1 review. Among the research articles, only a few focus on the development of CFD methodology; almost all use CFD as a research tool in a particular application, such as aircraft, wind power and heat exchangers. The topics of the 125 research articles can be divided into 12 specific applications and 1 miscellaneous category. The most popular application is Heating, Ventilating and Air Conditioning, followed by Reactor, Transportation and Heat Exchanger applications. The most popular commercial CFD code is ANSYS Fluent; only a few researchers use CFX.

  10. DB4US: A Decision Support System for Laboratory Information Management.

    PubMed

    Carmona-Cejudo, José M; Hortas, Maria Luisa; Baena-García, Manuel; Lana-Linati, Jorge; González, Carlos; Redondo, Maximino; Morales-Bueno, Rafael

    2012-11-14

    Until recently, laboratory automation has focused primarily on improving hardware. Future advances will concentrate on intelligent software, since laboratories performing clinical diagnostic testing require improved information systems to address their data processing needs. In this paper, we propose DB4US, an application that automates the handling of information related to laboratory quality indicators. Currently, there is a lack of ready-to-use management quality measures. This application addresses that deficiency through the extraction, consolidation, statistical analysis, and visualization of data related to demographics, reagents, and turn-around times. The design and implementation issues, as well as the technologies used for the implementation of this system, are discussed in this paper. Our objective is to develop a general methodology that integrates the computation of ready-to-use management quality measures with a dashboard that makes it easy to analyze the overall performance of a laboratory and to automatically detect anomalies or errors. The novelty of our approach lies in the application of integrated web-based dashboards as an information management system in hospital laboratories. We propose a new methodology for laboratory information management based on the extraction, consolidation, statistical analysis, and visualization of data related to demographics, reagents, and turn-around times, offering a dashboard-like web interface to the laboratory manager. The methodology comprises a unified data warehouse that stores and consolidates multidimensional data from different data sources. The methodology is illustrated through the implementation and validation of DB4US, a novel web application based on this methodology that constructs an interface to obtain ready-to-use indicators and offers the possibility of drilling down from high-level metrics to more detailed summaries. The offered indicators are calculated beforehand so that they are ready to use when the user needs them; the design is based on a set of parallel processes that precalculate the indicators. The application displays information related to tests, requests, samples, and turn-around times, and the dashboard is designed to show the set of indicators on a single screen. DB4US was deployed for the first time in the Hospital Costa del Sol in 2008. Our evaluation shows the positive impact of this methodology for laboratory professionals: the application has reduced the time needed to elaborate the different statistical indicators and has provided information used to optimize the usage of laboratory resources through the discovery of anomalies in the indicators. DB4US users also benefit from Internet-based communication of results, since this information is available from any computer without installing any additional software. In summary, the proposed methodology and the accompanying web application, DB4US, automate the processing of information related to laboratory quality indicators and offer a novel approach for managing laboratory-related information, benefiting from an Internet-based communication mechanism. The application of this methodology has been shown to improve the usage of time, as well as of other laboratory resources.

  11. Design and Customization of Telemedicine Systems

    PubMed Central

    Martínez-Alcalá, Claudia I.; Muñoz, Mirna; Monguet-Fierro, Josep

    2013-01-01

    In recent years, advances in information and communication technology (ICT) have resulted in the development of systems and applications aimed at supporting rehabilitation therapy, which help enrich patients' quality of life. This work focuses on improving telemedicine systems so that therapies can be customized according to the profile and disability of each patient. To this end, as its salient contribution, this work proposes the adoption of the user-centered design (UCD) methodology for the design and development of telemedicine systems supporting the rehabilitation of patients with neurological disorders. Finally, some applications of the UCD methodology in the telemedicine field are presented as a proof of concept. PMID:23762191

  12. A systematic methodology for the robust quantification of energy efficiency at wastewater treatment plants featuring Data Envelopment Analysis.

    PubMed

    Longo, S; Hospido, A; Lema, J M; Mauricio-Iglesias, M

    2018-05-10

    This article examines the potential benefits of using Data Envelopment Analysis (DEA) for conducting energy-efficiency assessments of wastewater treatment plants (WWTPs). WWTPs are characteristically heterogeneous (in size, technology, climate, function …), which limits the correct application of DEA. This paper proposes and describes, stage by stage, the Robust Energy Efficiency DEA (REED), a systematic state-of-the-art methodology aimed at including exogenous variables in nonparametric frontier models and especially designed for WWTP operation. In particular, the methodology systematizes the modelling process by presenting an integrated framework for selecting the correct variables and appropriate models, possibly tackling the effect of exogenous factors. As a result, the application of REED improves the quality of the efficiency estimates and hence the significance of benchmarking. For the reader's convenience, the article is written as a step-by-step guide that takes the user through the determination of WWTP energy efficiency from beginning to end. The application and benefits of the developed methodology are demonstrated by a case study comparing the energy efficiency of a set of 399 WWTPs operating in different countries and under heterogeneous environmental conditions. Copyright © 2018 Elsevier Ltd. All rights reserved.
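
    As a concrete illustration of the nonparametric frontier model at the core of such an analysis, the sketch below solves the classical input-oriented CCR envelopment program for one unit. It is a minimal sketch of plain DEA, not of the full REED procedure (which adds variable selection and the treatment of exogenous factors), and all data are hypothetical.

        import numpy as np
        from scipy.optimize import linprog

        def ccr_efficiency(X, Y, j0):
            """Input-oriented CCR efficiency of unit j0.
            X: (m inputs x n units), Y: (s outputs x n units)."""
            m, n = X.shape
            s = Y.shape[0]
            c = np.zeros(1 + n)
            c[0] = 1.0                                 # minimise theta
            A_in = np.hstack([-X[:, [j0]], X])         # X @ lam <= theta * x0
            A_out = np.hstack([np.zeros((s, 1)), -Y])  # Y @ lam >= y0
            A_ub = np.vstack([A_in, A_out])
            b_ub = np.concatenate([np.zeros(m), -Y[:, j0]])
            res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                          bounds=[(0, None)] * (1 + n), method="highs")
            return res.fun                             # score in (0, 1]

        # Hypothetical data: 2 inputs (energy, staff), 1 output (flow treated)
        X = np.array([[100., 80., 120., 90.],
                      [ 10., 12.,   9., 11.]])
        Y = np.array([[ 50., 45.,  55., 40.]])
        print([round(ccr_efficiency(X, Y, j), 3) for j in range(4)])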

  13. Application of Haddon's matrix in qualitative research methodology: an experience in burns epidemiology.

    PubMed

    Deljavan, Reza; Sadeghi-Bazargani, Homayoun; Fouladi, Nasrin; Arshi, Shahnam; Mohammadi, Reza

    2012-01-01

    Little has been done to investigate the application of injury-specific qualitative research methods in the field of burn injuries. The aim of this study was to use an analytical tool (Haddon's matrix) through qualitative research methods to better understand people's perceptions about burn injuries. This study applied Haddon's matrix as a framework and an analytical tool for a qualitative research methodology in burn research. Both child and adult burn injury victims were enrolled in a qualitative study conducted using focus group discussion. Haddon's matrix was used to develop an interview guide and also through the analysis phase. The main analysis clusters were pre-event level/human (including risky behaviors, belief and cultural factors, and knowledge and education), pre-event level/object, pre-event phase/environment, and the event and post-event phase (including fire control, emergency scald and burn wound management, traditional remedies, medical consultation, and severity indicators). This research gave rise to results that are possibly useful both for future injury research and for designing burn injury prevention plans. Haddon's matrix is applicable in a qualitative research methodology at both the data collection and data analysis phases. The study using Haddon's matrix through a qualitative research methodology yielded substantially rich information regarding burn injuries that may possibly be useful for prevention or future quantitative research.

  14. A Data Analysis Approach to Evaluating Achievement Outcomes of Instruction. Technical Report No. 338. Report from the Project on Conditions of School Learning and Instructional Strategies.

    ERIC Educational Resources Information Center

    Quilling, Mary Rintoul

    The purpose of the present study is to demonstrate the utility of data analysis methodology in evaluative research relating pupil and curriculum variables to pupil achievement. Regression models which account for achievement will result from the application of the methodology to two evaluative problems--one of curriculum comparison and another…

  15. Investigation on application of genetic algorithms to optimal reactive power dispatch of power systems

    NASA Astrophysics Data System (ADS)

    Wu, Q. H.; Ma, J. T.

    1993-09-01

    A preliminary investigation into the application of genetic algorithms to optimal reactive power dispatch and voltage control is presented. The application was demonstrated on the (United Kingdom) National Grid 48-bus network model, using a novel genetic search approach. Simulation results, compared with those obtained using nonlinear programming methods, are included to show the potential of the genetic search methodology in the economical and secure operation of power systems.

  16. Methodology for extracting local constants from petroleum cracking flows

    DOEpatents

    Chang, Shen-Lin; Lottes, Steven A.; Zhou, Chenn Q.

    2000-01-01

    A methodology provides for the extraction of local chemical kinetic model constants for use in a reacting flow computational fluid dynamics (CFD) computer code with chemical kinetic computations, in order to optimize the operating conditions or design of the system, including retrofit design improvements to existing systems. The coupled CFD and kinetics computer code is used in combination with data obtained from a matrix of experimental tests to extract the kinetic constants. Local fluid dynamic effects are implicitly included in the extracted local kinetic constants for each particular application system to which the methodology is applied. The extracted local kinetic model constants work well over a fairly broad range of operating conditions for specific and complex reaction sets in specific and complex reactor systems. While disclosed in terms of use in a Fluid Catalytic Cracking (FCC) riser, the inventive methodology applies to virtually any reaction set, extracting constants for any particular application and reaction set formulation. The methodology includes the steps of: (1) selecting the test data sets for various conditions; (2) establishing the general trend of the parametric effect on the measured product yields; (3) calculating product yields for the selected test conditions using coupled computational fluid dynamics and chemical kinetics; (4) adjusting the local kinetic constants to match calculated product yields with experimental data; and (5) validating the determined set of local kinetic constants by comparing the calculated results with experimental data from additional test runs at different operating conditions.
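
    Step (4) is, at heart, a nonlinear least-squares fit of the local constants to the measured yields. The sketch below illustrates that fitting loop with an invented two-constant surrogate standing in for the coupled CFD/kinetics code (in practice each evaluation would be a full CFD run); the rate law, test matrix and measured yields are all hypothetical.

        import numpy as np
        from scipy.optimize import least_squares

        # Hypothetical surrogate standing in for the coupled CFD/kinetics
        # code: yields predicted for one test condition as a function of
        # the local rate constants k.
        def predicted_yields(k, condition):
            k1, k2 = k
            T, tau = condition                  # temperature (K), residence time (s)
            conv = 1.0 - np.exp(-k1 * np.exp(-8000.0 / T) * tau)
            return np.array([conv, k2 * conv])  # e.g. [gasoline, coke] yields

        def residuals(k, tests, measured):
            return np.concatenate(
                [predicted_yields(k, c) - m for c, m in zip(tests, measured)])

        tests = [(790.0, 2.0), (810.0, 2.0), (790.0, 3.0)]   # test matrix
        measured = [np.array([0.42, 0.05]),                  # invented data
                    np.array([0.51, 0.06]),
                    np.array([0.53, 0.06])]
        fit = least_squares(residuals, x0=[1e3, 0.1], args=(tests, measured),
                            bounds=(0.0, np.inf))
        print(fit.x)                            # extracted local constants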

  17. Proposing integrated Shannon's entropy-inverse data envelopment analysis methods for resource allocation problem under a fuzzy environment

    NASA Astrophysics Data System (ADS)

    Çakır, Süleyman

    2017-10-01

    In this study, a two-phase methodology for resource allocation problems under a fuzzy environment is proposed. In the first phase, the imprecise Shannon's entropy method and the acceptability index are suggested, for the first time in the literature, for selecting the input and output variables to be used in the data envelopment analysis (DEA) application. In the second phase, an interval inverse DEA model is executed for resource allocation in the short run. To exemplify the practicality of the proposed fuzzy model, a real case application has been conducted involving 16 cement firms listed on Borsa Istanbul. The results of the case application indicate that the proposed hybrid model is a viable procedure for handling input-output selection and resource allocation problems under fuzzy conditions. The presented methodology can also lend itself to other applications, such as multi-criteria decision-making problems.
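
    For orientation, the sketch below shows the crisp Shannon's entropy weighting on which the first phase builds (the paper's imprecise extension and the acceptability index are not reproduced here); the decision matrix is hypothetical.

        import numpy as np

        def entropy_weights(X):
            """Shannon entropy weighting of a decision matrix.
            X: (m alternatives x n criteria), non-negative scores. Criteria
            whose values differ more across alternatives carry more
            information and receive larger weights."""
            P = X / X.sum(axis=0)                      # column proportions
            P = np.where(P == 0, 1e-12, P)             # avoid log(0)
            k = 1.0 / np.log(X.shape[0])
            entropy = -k * (P * np.log(P)).sum(axis=0) # e_j in [0, 1]
            diversity = 1.0 - entropy                  # differentiation degree
            return diversity / diversity.sum()

        # Hypothetical example: 5 firms scored on 3 candidate DEA variables.
        X = np.array([[12., 0.8, 230.],
                      [15., 0.6, 180.],
                      [11., 0.9, 240.],
                      [14., 0.7, 200.],
                      [13., 0.8, 220.]])
        print(entropy_weights(X))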

  18. Nonpoint source pollution of urban stormwater runoff: a methodology for source analysis.

    PubMed

    Petrucci, Guido; Gromaire, Marie-Christine; Shorshani, Masoud Fallah; Chebbo, Ghassan

    2014-09-01

    The characterization and control of runoff pollution from nonpoint sources in urban areas are a major issue for the protection of aquatic environments. We propose a methodology to quantify the sources of pollutants in an urban catchment and to analyze the associated uncertainties. After describing the methodology, we illustrate it through an application to the sources of Cu, Pb, Zn, and polycyclic aromatic hydrocarbons (PAH) from a residential catchment (228 ha) in the Paris region. In this application, we suggest several procedures that can be applied to the analysis of other pollutants in different catchments, including an estimation of the total extent of roof accessories (gutters and downspouts, watertight joints and valleys) in a catchment. These accessories turn out to be the major source of Pb, and an important source of Zn, in the example catchment, while activity-related sources (traffic, heating) are dominant for Cu (brake pad wear) and PAH (tire wear, atmospheric deposition).

  19. Forecasting the Economic Impact of Future Space Station Operations

    NASA Technical Reports Server (NTRS)

    Summer, R. A.; Smolensky, S. M.; Muir, A. H.

    1967-01-01

    Recent manned and unmanned Earth-orbital operations have suggested great promise of improved knowledge and of substantial economic and associated benefits to be derived from services offered by a space station. Proposed application areas include agriculture, forestry, hydrology, public health, oceanography, natural disaster warning, and search/rescue operations. The need for reliable estimates of economic and related Earth-oriented benefits to be realized from Earth-orbital operations is discussed and recent work in this area is reviewed. Emphasis is given to those services based on remote sensing. Requirements for a uniform, comprehensive and flexible methodology are discussed, and a brief review of the suggested methodology is presented. This methodology will be exercised through five case studies chosen from a gross inventory of almost 400 user candidates. The relationship of case study results to benefits in broader application areas is discussed, and some management implications of possible future program implementation are included.

  20. A methodology for selecting optimum organizations for space communities

    NASA Technical Reports Server (NTRS)

    Ragusa, J. M.

    1978-01-01

    This paper suggests that a methodology exists for selecting optimum organizations for future space communities of various sizes and purposes. Results of an exploratory study to identify an optimum hypothetical organizational structure for a large earth-orbiting multidisciplinary research and applications (R&A) Space Base manned by a mixed crew of technologists are presented. Since such a facility does not presently exist, in situ empirical testing was not possible. Study activity was, therefore, concerned with the identification of a desired organizational structural model rather than the empirical testing of it. The principal finding of this research was that a four-level project type 'total matrix' model will optimize the effectiveness of Space Base technologists. An overall conclusion which can be reached from the research is that application of this methodology, or portions of it, may provide planning insights for the formal organizations which will be needed during the Space Industrialization Age.

  1. Increasing the applicability of wind power projects via a multi-criteria approach: methodology and case study

    NASA Astrophysics Data System (ADS)

    Polatidis, Heracles; Morales, Jan Borràs

    2016-11-01

    In this paper a methodological framework for increasing the actual applicability of wind farms is developed and applied. The framework is based on multi-criteria decision aid techniques that perform an integrated technical and societal evaluation of a number of potential wind power projects, each a variation of a pre-existing actual proposal that faces implementation difficulties. A number of evaluation criteria are established and assessed via related software or evaluated comparatively against each other on a semi-qualitative basis. The preferences of a diverse audience of pertinent stakeholders can also be incorporated in the overall analysis. The result of the process is the identification of a new project that exhibits increased implementation potential compared with the original proposal. The methodology is tested in a case study of a wind farm in the UK and relevant conclusions are drawn.

  2. An object-oriented approach to risk and reliability analysis : methodology and aviation safety applications.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dandini, Vincent John; Duran, Felicia Angelica; Wyss, Gregory Dane

    2003-09-01

    This article describes how features of event tree analysis and Monte Carlo-based discrete event simulation can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology, with some of the best features of each. The resultant object-based event scenario tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible. Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST methodology is then applied to an aviation safety problem that considers mechanisms by which an aircraft might become involved in a runway incursion incident. The resulting OBEST model demonstrates how a close link between human reliability analysis and probabilistic risk assessment methods can provide important insights into aviation safety phenomenology.

  3. Adjoint-Based Methodology for Time-Dependent Optimal Control (AMTOC)

    NASA Technical Reports Server (NTRS)

    Yamaleev, Nail; Diskin, Boris; Nishikawa, Hiroaki

    2012-01-01

    During the five years of this project, the AMTOC team developed an adjoint-based methodology for design and optimization of complex time-dependent flows, implemented AMTOC in a testbed environment, directly assisted in implementation of this methodology in the state-of-the-art NASA's unstructured CFD code FUN3D, and successfully demonstrated applications of this methodology to large-scale optimization of several supersonic and other aerodynamic systems, such as fighter jet, subsonic aircraft, rotorcraft, high-lift, wind-turbine, and flapping-wing configurations. In the course of this project, the AMTOC team has published 13 refereed journal articles, 21 refereed conference papers, and 2 NIA reports. The AMTOC team presented the results of this research at 36 international and national conferences, meeting and seminars, including International Conference on CFD, and numerous AIAA conferences and meetings. Selected publications that include the major results of the AMTOC project are enclosed in this report.

  4. A Methodology for the Hybridization Based in Active Components: The Case of cGA and Scatter Search.

    PubMed

    Villagra, Andrea; Alba, Enrique; Leguizamón, Guillermo

    2016-01-01

    This work presents the results of a new methodology for hybridizing metaheuristics. By first locating the active components (parts) of one algorithm and then inserting them into a second one, we can build efficient and accurate optimization, search, and learning algorithms. This gives a concrete way of constructing new techniques, in contrast with the widespread ad hoc way of hybridizing. In this paper, the enhanced algorithm is a Cellular Genetic Algorithm (cGA), which has been successfully used in the past to solve hard optimization problems. In order to extend and corroborate the use of active components as an emerging hybridization methodology, we propose the use of active components taken from Scatter Search (SS) to improve a cGA. The results obtained over a varied set of benchmarks are highly satisfactory in efficacy and efficiency when compared with a standard cGA. Moreover, the proposed hybrid approach (i.e., cGA+SS) has shown encouraging results with regard to earlier applications of our methodology.
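
    A minimal sketch of the idea, under broad assumptions about the algorithms involved: a toy cellular GA on a toroidal grid whose recombination step is replaced by a Scatter Search-style structured line combination (the borrowed "active component"). The grid size, operator and test function are illustrative, not the paper's actual configuration.

        import numpy as np

        rng = np.random.default_rng(1)

        def sphere(x):                              # toy minimisation target
            return float((x ** 2).sum())

        G, D = 6, 4                                 # toroidal grid, genome size
        pop = rng.uniform(-5, 5, (G, G, D))         # cGA host population

        def neighbours(i, j):
            return [((i-1) % G, j), ((i+1) % G, j),
                    (i, (j-1) % G), (i, (j+1) % G)]

        def ss_combine(a, b):
            """Active component borrowed from Scatter Search: a structured
            line combination of two reference solutions, not blind crossover."""
            lam = rng.uniform(-0.25, 1.25)
            return a + lam * (b - a)

        for _ in range(200):                        # cGA loop with SS operator
            i, j = rng.integers(G), rng.integers(G)
            nb = min(neighbours(i, j), key=lambda ij: sphere(pop[ij]))
            child = ss_combine(pop[i, j], pop[nb])
            child += rng.normal(0, 0.1, D)          # light mutation
            if sphere(child) < sphere(pop[i, j]):   # local replacement
                pop[i, j] = child

        best = min(((i, j) for i in range(G) for j in range(G)),
                   key=lambda ij: sphere(pop[ij]))
        print(sphere(pop[best]))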

  5. Non-linear forecasting in high-frequency financial time series

    NASA Astrophysics Data System (ADS)

    Strozzi, F.; Zaldívar, J. M.

    2005-08-01

    A new methodology based on state space reconstruction techniques has been developed for trading in financial markets. The methodology has been tested using 18 high-frequency foreign exchange time series. The results are in apparent contradiction with the efficient market hypothesis, which states that no profitable information about future movements can be obtained by studying past price series. In our (off-line) analysis a positive gain may be obtained in all of these series. The trading methodology is quite general and may be adapted to other financial time series. Finally, the steps for its on-line application are discussed.
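
    State space reconstruction conventionally means delay-coordinate (Takens-style) embedding of the scalar price series. The sketch below shows such an embedding together with a simple nearest-neighbour one-step predictor; it is a generic illustration under that assumption, not the authors' trading rule, and all parameters and the demo series are hypothetical.

        import numpy as np

        def delay_embed(x, dim, tau):
            """Delay vectors [x(t), x(t+tau), ..., x(t+(dim-1)*tau)]
            reconstructing a state space from a scalar series."""
            n = len(x) - (dim - 1) * tau
            return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

        def predict_next(x, dim=3, tau=1, k=5):
            """Predict the next observation from the k nearest embedded
            neighbours of the current state (zeroth-order local predictor)."""
            E = delay_embed(x, dim, tau)
            query, history = E[-1], E[:-1]
            d = np.linalg.norm(history - query, axis=1)
            idx = np.argsort(d)[:k]                 # k closest past states
            followers = x[(dim - 1) * tau + idx + 1]  # what followed each
            return followers.mean()

        # e.g. one-step-ahead forecast of a noisy sine, as a stand-in series
        t = np.arange(2000)
        x = np.sin(0.05 * t) + 0.01 * np.random.default_rng(0).normal(size=t.size)
        print(predict_next(x, dim=3, tau=10, k=10))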

  6. Solving rational matrix equations in the state space with applications to computer-aided control-system design

    NASA Technical Reports Server (NTRS)

    Packard, A. K.; Sastry, S. S.

    1986-01-01

    A method of solving a class of linear matrix equations over various rings is proposed, using results from linear geometric control theory. An algorithm, successfully implemented, is presented, along with non-trivial numerical examples. Applications of the method to the algebraic control system design methodology are discussed.

  7. 78 FR 77399 - Basic Health Program: Proposed Federal Funding Methodology for Program Year 2015

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-23

    ... American Indians and Alaska Natives F. Example Application of the BHP Funding Methodology III. Collection... effectively 138 percent due to the application of a required 5 percent income disregard in determining the... correct errors in applying the methodology (such as mathematical errors). Under section 1331(d)(3)(ii) of...

  8. Acceleration-based methodology to assess the blast mitigation performance of explosive ordnance disposal helmets

    NASA Astrophysics Data System (ADS)

    Dionne, J. P.; Levine, J.; Makris, A.

    2018-01-01

    To design the next generation of blast mitigation helmets that offer increasing levels of protection against explosive devices, manufacturers must be able to rely on appropriate test methodologies and human surrogates that will differentiate the performance level of various helmet solutions and ensure user safety. Ideally, such test methodologies and associated injury thresholds should be based on widely accepted injury criteria relevant within the context of blast. Unfortunately, even though significant research has taken place over the last decade in the area of blast neurotrauma, there currently exists no agreement on the injury mechanisms for blast-induced traumatic brain injury. In the absence of such widely accepted test methods and injury criteria, the current study presents a specific blast test methodology focusing on explosive ordnance disposal protective equipment, involving the readily available Hybrid III mannequin initially developed for the automotive industry. The questionable applicability of the associated brain injury criteria (based on both linear and rotational head acceleration) is discussed in the context of blast. Test results encompassing a large number of blast configurations and personal protective equipment are presented, emphasizing the possibility of developing useful correlations between blast parameters, such as the scaled distance, and mannequin engineering measurements (head acceleration). Suggestions are put forward for a practical standardized blast testing methodology that takes into account limitations in the applicability of acceleration-based injury criteria as well as the inherent variability in blast testing results.
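
    The scaled distance mentioned here is conventionally the Hopkinson-Cranz scaling of standoff by the cube root of the charge mass; a one-line sketch with hypothetical numbers follows.

        def scaled_distance(standoff_m, charge_kg_tnt):
            """Hopkinson-Cranz scaled distance Z = R / W**(1/3), in
            m/kg^(1/3): the blast parameter correlated with head
            acceleration in studies of this kind."""
            return standoff_m / charge_kg_tnt ** (1.0 / 3.0)

        # e.g. a 5 kg TNT-equivalent charge at 3 m standoff: Z is about 1.75
        print(scaled_distance(3.0, 5.0))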

  9. Global Artificial Boundary Conditions for Computation of External Flow Problems with Propulsive Jets

    NASA Technical Reports Server (NTRS)

    Tsynkov, Semyon; Abarbanel, Saul; Nordstrom, Jan; Ryabenkii, Viktor; Vatsa, Veer

    1998-01-01

    We propose new global artificial boundary conditions (ABC's) for computation of flows with propulsive jets. The algorithm is based on application of the difference potentials method (DPM). Previously, similar boundary conditions have been implemented for calculation of external compressible viscous flows around finite bodies. The proposed modification substantially extends the applicability range of the DPM-based algorithm. In the paper, we present the general formulation of the problem, describe our numerical methodology, and discuss the corresponding computational results. The particular configuration that we analyze is a slender three-dimensional body with boat-tail geometry and supersonic jet exhaust in a subsonic external flow under zero angle of attack. Similarly to the results obtained earlier for the flows around airfoils and wings, current results for the jet flow case corroborate the superiority of the DPM-based ABC's over standard local methodologies from the standpoints of accuracy, overall numerical performance, and robustness.

  10. Intelligent systems/software engineering methodology - A process to manage cost and risk

    NASA Technical Reports Server (NTRS)

    Friedlander, Carl; Lehrer, Nancy

    1991-01-01

    A systems development methodology is discussed that has been successfully applied to the construction of a number of intelligent systems. This methodology is a refinement of both evolutionary and spiral development methodologies. It is appropriate for development of intelligent systems. The application of advanced engineering methodology to the development of software products and intelligent systems is an important step toward supporting the transition of AI technology into aerospace applications. A description of the methodology and the process model from which it derives is given. Associated documents and tools are described which are used to manage the development process and record and report the emerging design.

  11. Flexible versus common technology to estimate economies of scale and scope in the water and sewerage industry: an application to England and Wales.

    PubMed

    Molinos-Senante, María; Maziotis, Alexandros

    2018-05-01

    The structure of the water industry varies considerably across countries and also within them. Hence, several studies have been conducted to evaluate the presence of economies of scope and scale in the water industry, with inconclusive results. The lack of a common methodology has been identified as an important factor contributing to the divergent conclusions. This paper evaluates, for the first time, the presence of economies of scale and scope in the water industry using a flexible technology approach, integrating operational and exogenous variables of the water companies in the cost functions. The empirical application, carried out for the English and Welsh water industry, showed that the inclusion of exogenous variables accounts for significant differences in economies of scale and scope. Moreover, completely different results were obtained when the economies of scale and scope were estimated using the common and the flexible technology methodological approaches. The findings of this study reveal the importance of using an appropriate methodology to support policy decision-making processes that promote sustainable urban water activities.

  12. Visualizing Forensic Publication Impacts and Collaborations: Presenting at a Scientific Venue Leads to Increased Collaborations between Researchers and Information Professionals

    PubMed Central

    Makar, Susan; Malanowski, Amanda; Rapp, Katie

    2016-01-01

    The Information Services Office (ISO) of the National Institute of Standards and Technology (NIST) proactively sought out an opportunity to present the findings of a study that showed the impact of NIST’s forensic research output to its internal customers and outside researchers. ISO analyzed the impact of NIST’s contributions to the peer-reviewed forensic journal literature through citation analysis and network visualizations. The findings of this study were compiled into a poster that was presented during the Forensics@NIST Symposium in December 2014. ISO’s study informed the forensic research community where NIST has had some of the greatest scholarly impact. This paper describes the methodology used to assess the impact of NIST’s forensic publications and shares the results, outcomes, and impacts of ISO’s study and poster presentation. This methodology is adaptable and applicable to other research fields and to other libraries. It has improved the recognition of ISO’s capabilities within NIST and resulted in application of the methodology to additional scientific disciplines. PMID:27956754

  13. An overview of key technology thrusts at Bell Helicopter Textron

    NASA Technical Reports Server (NTRS)

    Harse, James H.; Yen, Jing G.; Taylor, Rodney S.

    1988-01-01

    Insight is provided into several key technologies at Bell. Specific topics include the results of ongoing research and development in advanced rotors, methodology development, and new configurations. The discussion of advanced rotors highlights developments in the composite bearingless rotor, including the development and testing of full-scale flight hardware as well as some of the design support analyses and verification testing. The discussion of methodology development concentrates on analytical developments in aeromechanics, including correlation studies and design application. The discussion of new configurations presents the results of some advanced configuration studies, including hardware development.

  14. A simple methodology to finance public health initiatives: reimbursement for tuberculosis directly observed therapy services in New York State.

    PubMed

    Klein, S J; Laufer, F N

    1995-01-01

    New York State (NYS) used Medicaid reimbursement to create incentives for health care providers to offer directly observed therapy (DOT) services for active tuberculosis (TB) disease. This resulted in the proliferation of 26 new TB DOT providers and expanded capacity for the New York City (NYC) Department of Health. As a result, over 1,200 individuals now receive DOT in NYC. The reimbursement methodology was also used for other NYS public health initiatives, and it is applicable to public health initiatives elsewhere.

  15. A new methodology for determining dispersion coefficient using ordinary and partial differential transport equations.

    PubMed

    Cho, Kyung Hwa; Lee, Seungwon; Ham, Young Sik; Hwang, Jin Hwan; Cha, Sung Min; Park, Yongeun; Kim, Joon Ha

    2009-01-01

    The present study proposes a methodology for determining the effective dispersion coefficient based on field measurements performed in Gwangju (GJ) Creek in South Korea, a stream environmentally degraded by artificial interferences such as weirs and culverts. Many previous approaches to determining the dispersion coefficient were limited in application due to the complexity of, and artificial interferences in, natural streams. Therefore, the sequential combination of an N-Tank-In-Series (NTIS) model and an Advection-Dispersion-Reaction (ADR) model is proposed for evaluating the dispersion process in complex stream channels. A series of water quality data was intensively monitored in the field to determine the effective dispersion coefficient of E. coli under rainy-day conditions. As a result, the suggested methodology reasonably estimated the effective dispersion coefficient for GJ Creek as 1.25 m²/s. The sequential combined method also provided Number of tanks-Velocity-Dispersion coefficient (NVD) curves for convenient evaluation of the dispersion coefficients of other rivers or streams. Compared with previous studies, the present methodology is quite general and simple, and the determined effective dispersion coefficients are applicable to other rivers and streams.
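
    As a sketch of the ADR side of the combined model: a minimal explicit finite-difference update for one-dimensional advection-dispersion-reaction, assuming constant velocity and first-order decay. The grid and coefficients (apart from D = 1.25 m²/s, taken from the study's result) are illustrative.

        import numpy as np

        def adr_step(C, U, D, k, dx, dt):
            """One explicit step of dC/dt = -U dC/dx + D d2C/dx2 - k C
            (upwind advection, U > 0; boundaries held fixed).
            Stable if dt <= min(dx/U, dx**2 / (2*D))."""
            adv = -U * (C[1:-1] - C[:-2]) / dx
            disp = D * (C[2:] - 2 * C[1:-1] + C[:-2]) / dx ** 2
            Cn = C.copy()
            Cn[1:-1] += dt * (adv + disp - k * C[1:-1])
            return Cn

        # e.g. route a pulse with D = 1.25 m2/s, U = 0.3 m/s, k = 1e-4 1/s
        C = np.zeros(200)
        C[20] = 100.0
        for _ in range(600):
            C = adr_step(C, U=0.3, D=1.25, k=1e-4, dx=5.0, dt=5.0)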

  16. Classification of samples into two or more ordered populations with application to a cancer trial.

    PubMed

    Conde, D; Fernández, M A; Rueda, C; Salvador, B

    2012-12-10

    In many applications, especially in cancer treatment and diagnosis, investigators are interested in classifying patients into various diagnosis groups on the basis of molecular data such as gene expression or proteomic data. Often, some of the diagnosis groups are known to be related to higher or lower values of some of the predictors. The standard methods of classifying patients into various groups do not take into account the underlying order. This could potentially result in high misclassification rates, especially when the number of groups is larger than two. In this article, we develop classification procedures that exploit the underlying order among the mean values of the predictor variables and the diagnostic groups by using ideas from order-restricted inference. We generalize the existing methodology on discrimination under restrictions and provide empirical evidence to demonstrate that the proposed methodology improves over the existing unrestricted methodology. The proposed methodology is applied to a bladder cancer data set where the researchers are interested in classifying patients into various groups. Copyright © 2012 John Wiley & Sons, Ltd.
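
    The core order-restricted ingredient in this line of work is the pool-adjacent-violators algorithm (PAVA), which replaces raw group means with means obeying the known ordering. The sketch below shows PAVA and a toy nearest-restricted-mean classification rule for a single predictor; it is a schematic illustration of the idea, not the paper's actual procedure, and the numbers are invented.

        import numpy as np

        def pava(y, w=None):
            """Pool-adjacent-violators: the nondecreasing sequence closest
            to y in (weighted) least squares."""
            y = list(map(float, y))
            w = [1.0] * len(y) if w is None else list(map(float, w))
            vals, wts, sizes = [], [], []
            for yi, wi in zip(y, w):
                vals.append(yi); wts.append(wi); sizes.append(1)
                # pool blocks while the monotonicity constraint is violated
                while len(vals) > 1 and vals[-2] > vals[-1]:
                    v = (wts[-2] * vals[-2] + wts[-1] * vals[-1]) / (wts[-2] + wts[-1])
                    wts[-2] += wts[-1]; sizes[-2] += sizes[-1]
                    vals[-2] = v
                    vals.pop(); wts.pop(); sizes.pop()
            out = []
            for v, s in zip(vals, sizes):
                out.extend([v] * s)
            return np.array(out)

        # Hypothetical use: sample means of one biomarker across 3 ordered
        # diagnosis groups; classify a new patient to the nearest
        # order-restricted group mean.
        group_means = pava([2.1, 1.8, 3.5], w=[30, 25, 20])  # enforce order
        x_new = 2.4
        print(int(np.argmin(np.abs(group_means - x_new))))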

  17. Floating-to-Fixed-Point Conversion for Digital Signal Processors

    NASA Astrophysics Data System (ADS)

    Menard, Daniel; Chillet, Daniel; Sentieys, Olivier

    2006-12-01

    Digital signal processing applications are specified with floating-point data types but are usually implemented in embedded systems with fixed-point arithmetic to minimise cost and power consumption. Thus, methodologies which automatically establish the fixed-point specification are required to reduce the application time-to-market. In this paper, a new methodology for floating-to-fixed-point conversion is proposed for software implementations. The aim of our approach is to determine the fixed-point specification which minimises the code execution time for a given accuracy constraint. Compared to previous methodologies, our approach takes into account the DSP architecture to optimise the fixed-point formats, and the floating-to-fixed-point conversion process is coupled with the code generation process. The fixed-point data types and the positions of the scaling operations are optimised to reduce the code execution time. To evaluate the fixed-point computation accuracy, an analytical approach is used to reduce the optimisation time compared to existing simulation-based methods. The stages of the methodology are described and several experimental results are presented to underline the efficiency of this approach.
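
    For readers unfamiliar with fixed-point specifications, the sketch below shows the basic Qm.n quantisation that such a conversion assigns to each variable; the word length, rounding and saturation policy are illustrative choices, not the paper's.

        def to_fixed(x, frac_bits, word_bits=16):
            """Quantise a float to two's-complement Qm.n fixed point with
            saturation; returns the integer code and the value it represents."""
            scale = 1 << frac_bits
            code = int(round(x * scale))
            lo, hi = -(1 << (word_bits - 1)), (1 << (word_bits - 1)) - 1
            code = max(lo, min(hi, code))          # saturate on overflow
            return code, code / scale

        # e.g. 0.7071 in Q1.15 -> code 23170, representing about 0.70709
        print(to_fixed(0.7071, 15))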

  18. Applications of mixed-methods methodology in clinical pharmacy research.

    PubMed

    Hadi, Muhammad Abdul; Closs, S José

    2016-06-01

    Introduction: Mixed-methods methodology, as the name suggests, refers to the mixing of elements of both qualitative and quantitative methodologies in a single study. In the past decade, mixed-methods methodology has gained popularity among healthcare researchers as it promises to bring together the strengths of both qualitative and quantitative approaches. Methodology: A number of mixed-methods designs are available in the literature; the four most commonly used in healthcare research are the convergent parallel design, the embedded design, the exploratory design, and the explanatory design. Each has its own advantages, challenges and procedures, and the selection of a particular design should be guided by the research question. Guidance on designing, conducting and reporting mixed-methods research is available in the literature, and it is advisable to adhere to it to ensure methodological rigour. When to use: Mixed methods are best suited when the research questions require triangulating findings from different methodologies to explain a single phenomenon; clarifying the results of one method using another; informing the design of one method based on the findings of another; developing a scale or questionnaire; or answering different research questions within a single study. Two case studies are presented to illustrate possible applications of mixed-methods methodology. Limitations: Possessing the necessary knowledge and skills to undertake qualitative and quantitative data collection, analysis, interpretation and integration remains the biggest challenge for researchers conducting mixed-methods studies. Sequential study designs are often time consuming, being in two (or more) phases, whereas concurrent study designs may require more than one data collector to collect both qualitative and quantitative data at the same time.

  19. Efficient Analysis of Complex Structures

    NASA Technical Reports Server (NTRS)

    Kapania, Rakesh K.

    2000-01-01

    The various accomplishments achieved during this project are: (1) a survey of Neural Network (NN) applications in structural engineering, especially equivalent continuum models, using the MATLAB NN Toolbox (Appendix A); (2) application of NNs and GAs to simulate and synthesize substructures: 1-D and 2-D beam problems (Appendix B); (3) development of an equivalent plate-model analysis method (EPA) for static and vibration analysis of general trapezoidal built-up wing structures composed of skins, spars and ribs, with calculation of a variety of test cases and comparison with measurements or FEA results (Appendix C); (4) basic work on using second-order sensitivities to simulate wing modal response, a discussion of sensitivity evaluation approaches, and some results (Appendix D); (5) establishment of a general methodology for simulating modal responses by direct application of NNs and by sensitivity techniques, in a design space composed of a number of design points, with the two methods compared through examples (Appendix E); (6) establishment of a general methodology for efficient analysis of complex wing structures by indirect application of NNs, the NN-aided Equivalent Plate Analysis, with the Neural Networks trained for several design spaces, making the approach applicable to the actual design of complex wings (Appendix F).

  20. A systematic and critical review of the evolving methods and applications of value of information in academia and practice.

    PubMed

    Steuten, Lotte; van de Wetering, Gijs; Groothuis-Oudshoorn, Karin; Retèl, Valesca

    2013-01-01

    This article provides a systematic and critical review of the evolving methods and applications of value of information (VOI) in academia and practice and discusses where future research needs to be directed. Published VOI studies were identified by conducting a computerized search on Scopus and ISI Web of Science from 1980 until December 2011 using pre-specified search terms. Only full-text papers that outlined and discussed VOI methods for medical decision making, and studies that applied VOI and explicitly discussed the results with a view to informing healthcare decision makers, were included. The included papers were divided into methodological and applied papers, based on the aim of the study. A total of 118 papers were included, of which 50 % (n = 59) are methodological. A rapidly accumulating literature base on VOI is observed from 1999 onwards for methodological papers and from 2005 onwards for applied papers. Expected value of sample information (EVSI) is the preferred VOI method for informing decisions about specific future studies, but real-life applications of EVSI remain scarce. Methodological challenges to VOI are numerous and include the high computational demands, dealing with non-linear models and interdependency between parameters, estimation of effective time horizons and patient populations, and structural uncertainties. VOI analysis receives increasing attention in both the methodological and the applied literature, but challenges to applying VOI in real-life decision making remain. For many of the technical and methodological challenges to VOI, analytic solutions have been proposed in the literature, including leaner methods for VOI. Further research should also focus on the needs of decision makers regarding VOI.
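
    The simplest member of the VOI family, the expected value of perfect information (EVPI), is the gap between deciding after uncertainty is resolved and deciding on current expectations; EVSI replaces the perfect-information term with a preposterior analysis of a proposed study. A minimal Monte Carlo sketch of EVPI follows, with an invented two-strategy net-benefit model (all numbers are illustrative).

        import numpy as np

        rng = np.random.default_rng(0)
        n = 10_000

        # Hypothetical model: uncertain effectiveness of two strategies.
        theta = rng.normal(loc=[0.60, 0.55], scale=[0.10, 0.05], size=(n, 2))
        wtp = 20_000                               # willingness to pay per QALY
        cost = np.array([4_000.0, 1_000.0])
        nb = wtp * theta - cost                    # net benefit per draw

        ev_current = nb.mean(axis=0).max()         # choose now, on averages
        ev_perfect = nb.max(axis=1).mean()         # choose per realisation
        evpi = ev_perfect - ev_current             # value of perfect info
        print(round(evpi, 2))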

  1. Decision support and disease management: a logic engineering approach.

    PubMed

    Fox, J; Thomson, R

    1998-12-01

    This paper describes the development and application of PROforma, a unified technology for clinical decision support and disease management. Work leading to the implementation of PROforma has been carried out in a series of projects funded by European agencies over the past 13 years. The work has been based on logic engineering, a distinct design and development methodology that combines concepts from knowledge engineering, logic programming, and software engineering. Several of the projects have used the approach to demonstrate a wide range of applications in primary and specialist care and clinical research. Concurrent academic research projects have provided a sound theoretical basis for the safety-critical elements of the methodology. The principal technical results of the work are the PROforma logic language for defining clinical processes and an associated suite of software tools for delivering applications, such as decision support and disease management procedures. The language supports four standard objects (decisions, plans, actions, and enquiries), each of which has an intuitive meaning with well-understood logical semantics. The development toolset includes a powerful visual programming environment for composing applications from these standard components, for verifying consistency and completeness of the resulting specification and for delivering stand-alone or embeddable applications. Tools and applications that have resulted from the work are described and illustrated, with examples from specialist cancer care and primary care. The results of a number of evaluation activities are included to illustrate the utility of the technology.

  2. PIA and REWIND: Two New Methodologies for Cross Section Adjustment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmiotti, G.; Salvatores, M.

    2017-02-01

    This paper presents two new cross section adjustment methodologies intended to cope with the problem of compensations. The first one, PIA (Progressive Incremental Adjustment), gives priority to the utilization of experiments of elemental type (those sensitive to a specific cross section), following a definite hierarchy on which type of experiment to use. Once the adjustment is performed, both the new adjusted data and the new covariance matrix are kept. The second methodology is called REWIND (Ranking Experiments by Weighting for Improved Nuclear Data). This proposed approach tries to establish a methodology for ranking experiments by looking at the potential gain they can produce in an adjustment. Practical applications for different adjustments illustrate the results of the two methodologies against the current one and show the potential improvement for reducing uncertainties in target reactors.

  3. Ensemble modeling of stochastic unsteady open-channel flow in terms of its time-space evolutionary probability distribution - Part 2: numerical application

    NASA Astrophysics Data System (ADS)

    Dib, Alain; Kavvas, M. Levent

    2018-03-01

    The characteristic form of the Saint-Venant equations is solved in a stochastic setting by using a newly proposed Fokker-Planck Equation (FPE) methodology. This methodology computes the ensemble behavior and variability of the unsteady flow in open channels by directly solving for the flow variables' time-space evolutionary probability distribution. The new methodology is tested on a stochastic unsteady open-channel flow problem, with an uncertainty arising from the channel's roughness coefficient. The computed statistical descriptions of the flow variables are compared to the results obtained through Monte Carlo (MC) simulations in order to evaluate the performance of the FPE methodology. The comparisons show that the proposed methodology can adequately predict the results of the considered stochastic flow problem, including the ensemble averages, variances, and probability density functions in time and space. Unlike the large number of simulations performed by the MC approach, only one simulation is required by the FPE methodology. Moreover, the total computational time of the FPE methodology is smaller than that of the MC approach, which could prove to be a particularly crucial advantage in systems with a large number of uncertain parameters. As such, the results obtained in this study indicate that the proposed FPE methodology is a powerful and time-efficient approach for predicting the ensemble average and variance behavior, in both space and time, for an open-channel flow process under an uncertain roughness coefficient.
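
    For orientation, the MC side of such a comparison propagates draws of the uncertain roughness coefficient through the flow model and collects ensemble statistics. The sketch below does this for a deliberately simplified steady, wide-channel Manning relation rather than the full Saint-Venant equations; the distribution and channel numbers are hypothetical.

        import numpy as np

        rng = np.random.default_rng(42)
        n_draws = 5_000

        # Uncertain Manning roughness (truncated-normal stand-in)
        n_manning = rng.normal(0.030, 0.005, n_draws).clip(0.015, 0.060)

        S0, h = 0.001, 2.0                         # bed slope, flow depth (m)
        # Manning's formula for a wide rectangular channel (R ~ h)
        V = (1.0 / n_manning) * h ** (2.0 / 3.0) * np.sqrt(S0)

        print(V.mean(), V.std())                   # ensemble mean and spread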

  4. Eigenvalue Contributon Estimator for Sensitivity Calculations with TSUNAMI-3D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T; Williams, Mark L

    2007-01-01

    Since the release of the Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) codes in SCALE [1], the use of sensitivity and uncertainty analysis techniques for criticality safety applications has greatly increased within the user community. In general, sensitivity and uncertainty analysis is transitioning from a technique used only by specialists to a practical tool in routine use. With the desire to use the tool more routinely comes the need to improve the solution methodology to reduce the input and computational burden on the user. This paper reviews the current solution methodology of the Monte Carlo eigenvalue sensitivity analysismore » sequence TSUNAMI-3D, describes an alternative approach, and presents results from both methodologies.« less

  5. Improving ED specimen TAT using Lean Six Sigma.

    PubMed

    Sanders, Janet H; Karr, Tedd

    2015-01-01

    Lean and Six Sigma are continuous improvement methodologies that have garnered international fame for improving manufacturing and service processes. Increasingly, these methodologies are demonstrating their power to improve healthcare processes as well. The purpose of this paper is to discuss a case study applying Lean and Six Sigma tools to the reduction of turnaround time (TAT) for Emergency Department (ED) specimens. This application of the scientific methodologies uncovered opportunities to improve the entire ED-to-lab system for the specimens. The case study details the completion of a Lean Six Sigma project in a 1,000-bed tertiary care teaching hospital. Six Sigma's Define, Measure, Analyze, Improve, and Control methodology is very similar to good medical practice: first, relevant information is obtained and assembled; second, a careful and thorough diagnosis is completed; third, a treatment is proposed and implemented; and fourth, checks are made to determine if the treatment was effective. Lean's primary goal is to do more with less work and waste; the Lean methodology was used to identify and eliminate waste through rapid implementation of change. The initial focus of this project was the reduction of turnaround times for ED specimens, but the results led to better processes for both the internal and external customers of this and other processes. The project results included: a 50 percent decrease in vials used for testing, a 50 percent decrease in unused or extra specimens, a 90 percent decrease in ED specimens without orders, a 30 percent decrease in complete blood count analysis (CBCA) median TAT, a 50 percent decrease in CBCA TAT variation, a 10 percent decrease in Troponin TAT variation, an 18.2 percent decrease in URPN TAT variation, and a 2-5 minute decrease in ED registered nurses' rainbow draw time. This case study demonstrated how the quantitative power of Six Sigma and the speed of Lean worked in harmony to improve the blood draw process for a 1,000-bed tertiary care teaching hospital. The blood draw process is a standard process used in hospitals to collect blood chemistry and hematology information for clinicians. The methods used in this case study demonstrated valuable and practical applications of process improvement methodologies that can be used for any hospital process and/or service environment. While this is not the first case study demonstrating the use of continuous process improvement methodologies to improve a hospital process, it is unique in the way it utilizes the strength of a project-focused approach that adheres more to the structure and rigor of Six Sigma and relies less on the speed of Lean. Additionally, the application of these methodologies in healthcare is emerging research.

  6. Designing application software in wide area network settings

    NASA Technical Reports Server (NTRS)

    Makpangou, Mesaac; Birman, Ken

    1990-01-01

    Progress in methodologies for developing robust local area network software has not been matched by similar results for wide area settings. The design of application software spanning multiple local area environments is examined. For important classes of applications, simple design techniques are presented that yield fault tolerant wide area programs. An implementation of these techniques as a set of tools for use within the ISIS system is described.

  7. Systematic design of membership functions for fuzzy-logic control: A case study on one-stage partial nitritation/anammox treatment systems.

    PubMed

    Boiocchi, Riccardo; Gernaey, Krist V; Sin, Gürkan

    2016-10-01

    A methodology is developed to systematically design the membership functions of fuzzy-logic controllers for multivariable systems. The methodology consists of a systematic derivation of the critical points of the membership functions as functions of predefined control objectives. Several constrained optimization problems, corresponding to different qualitative operation states of the system, are defined and solved to identify, in a consistent manner, the critical points of the membership functions for the input variables. The consistently identified critical points, together with the linguistic rules, determine the long-term reachability of the control objectives by the fuzzy-logic controller. The methodology is illustrated using a single-stage side-stream partial nitritation/Anammox reactor as a case study. As a result, a new fuzzy-logic controller for high and stable total nitrogen removal efficiency is designed. Rigorous simulations are carried out to evaluate and benchmark the performance of the controller. The results demonstrate that the novel control strategy is capable of rejecting long-term influent disturbances and can achieve a stable and high TN removal efficiency. Additionally, the controller was tested against measurement noise levels typical of wastewater sensors and showed robustness. A feedforward-feedback configuration using the present controller would give even better performance. In comparison, a previously developed fuzzy-logic controller based merely on expert and intuitive knowledge performed worse, which demonstrates the importance of using a systematic methodology to derive the membership functions for multivariable systems. These results are promising for future applications of the controller in real full-scale plants. Furthermore, the methodology can be used as a tool to help systematically design fuzzy-logic control applications for other biological processes. Copyright © 2016 Elsevier Ltd. All rights reserved.
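
    Once the critical points are available, each input's fuzzy sets follow directly from them; a minimal sketch with triangular membership functions and invented critical points for an ammonium-like input is shown below.

        import numpy as np

        def trimf(x, a, b, c):
            """Triangular membership function with critical points a <= b <= c."""
            return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

        # Hypothetical critical points for an ammonium-like input (mg N/L),
        # as they might come out of the constrained-optimisation step.
        x = np.linspace(0.0, 10.0, 201)
        low = trimf(x, -1.0, 0.0, 2.0)        # peak at 0, support up to 2
        medium = trimf(x, 1.0, 3.0, 5.0)
        high = trimf(x, 4.0, 7.0, 10.0)
        print(low[0], medium[60], high[140])  # membership degrees at 0, 3, 7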

  8. [Nursing work and ergonomics].

    PubMed

    Marziale, M H; Robazzi, M L

    2000-12-01

    This text articulates empirical evidence from scientific studies in order to offer a reflection on the application of ergonomics as a methodological instrument to support improvement of the labor conditions of nursing personnel in hospitals.

  9. The Study of the Relationship between Probabilistic Design and Axiomatic Design Methodology. Volume 1

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chinyere; Onyebueke, Landon

    1996-01-01

    This program report is the final report covering all the work done on this project. The goal of this project is technology transfer of methodologies to improve the design process. The specific objectives are: 1. To learn and understand probabilistic design analysis using NESSUS. 2. To assign design projects on the application of NESSUS to undergraduate or graduate students. 3. To integrate the application of NESSUS into selected senior-level courses in the Civil and Mechanical Engineering curricula. 4. To develop courseware in probabilistic design methodology to be included in a graduate-level design methodology course. 5. To study the relationship between the probabilistic design methodology and the axiomatic design methodology.

  10. Encouraging the learning of hydraulic engineering subjects in agricultural engineering schools

    NASA Astrophysics Data System (ADS)

    Rodríguez Sinobas, Leonor; Sánchez Calvo, Raúl

    2014-09-01

    Several methodological approaches to improve the understanding and motivation of students in Hydraulic Engineering courses have been adopted in the Agricultural Engineering School at the Technical University of Madrid. Over three years, students' progress and satisfaction have been assessed by continuous monitoring and the use of online and web tools in two undergraduate courses. Results from their application to encourage learning and communication skills in Hydraulic Engineering subjects are analysed and compared with the initial situation. Students' academic performance has improved since their introduction, but surveys among students showed that not all the methodological proposals were perceived as beneficial: participation in the online, classroom and reading activities was low, even though the activities themselves were rated positively.

  11. Topology optimization and laser additive manufacturing in design process of efficiency lightweight aerospace parts

    NASA Astrophysics Data System (ADS)

    Fetisov, K. V.; Maksimov, P. V.

    2018-05-01

    The paper presents the application of topology optimization and laser additive manufacturing in the design of lightweight aerospace parts. First, a brief overview is given of the SIMP topology optimization algorithm, one of the most commonly used algorithms in FEA software. After that, a methodology for designing parts using topology optimization is discussed, along with issues related to designing for additive manufacturing. In conclusion, the practical application of the proposed methodologies is presented using the example of one complex assembly unit. As a result of the new design approach, the mass of the product was reduced by a factor of five, and twenty parts were replaced by one.
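    For readers unfamiliar with SIMP, the sketch below shows its two core ingredients, the penalized stiffness interpolation and an optimality-criteria density update, on a toy one-dimensional design field; the sensitivities are invented placeholders rather than outputs of a finite element analysis.

    ```python
    import numpy as np

    # SIMP interpolation: element Young's modulus as a penalized function of
    # the density design variable x in [0, 1].
    E0, Emin, p = 1.0, 1e-9, 3.0      # solid modulus, void modulus, penalty power

    def simp_modulus(x):
        return Emin + x**p * (E0 - Emin)

    # Optimality-criteria style density update (sketch): material is shifted
    # toward elements with large (negative) compliance sensitivities dc, while
    # a bisection on the Lagrange multiplier enforces the volume fraction.
    def oc_update(x, dc, volfrac, move=0.2):
        lo, hi = 1e-9, 1e9
        while (hi - lo) / (hi + lo) > 1e-4:
            lmid = 0.5 * (lo + hi)
            xnew = np.clip(x * np.sqrt(-dc / lmid), x - move, x + move)
            xnew = np.clip(xnew, 0.0, 1.0)
            if xnew.mean() > volfrac:
                lo = lmid          # volume too high: raise the multiplier
            else:
                hi = lmid
        return xnew

    x = np.full(100, 0.5)               # toy 1-D design field
    dc = -np.linspace(1.0, 2.0, 100)    # placeholder sensitivities (<= 0)
    x = oc_update(x, dc, volfrac=0.5)
    print(x.mean(), simp_modulus(x).min())   # volume constraint ~active
    ```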

  12. Characteristics of a semi-custom library development system

    NASA Technical Reports Server (NTRS)

    Yancey, M.; Cannon, R.

    1990-01-01

    Standard cell and gate array macro libraries are in common use with workstation computer aided design (CAD) tools for application specific integrated circuit (ASIC) semi-custom application and have resulted in significant improvements in the overall design efficiencies as contrasted with custom design methodologies. Similar design methodology enhancements in providing for the efficient development of the library cells is an important factor in responding to the need for continuous technology improvement. The characteristics of a library development system that provides design flexibility and productivity enhancements for the library development engineer as he provides libraries in the state-of-the-art process technologies are presented. An overview of Gould's library development system ('Accolade') is also presented.

  13. Application of a multivariate normal distribution methodology to the dissociation of doubly ionized molecules: The DMDS (CH3-SS-CH3) case.

    PubMed

    Varas, Lautaro R; Pontes, F C; Santos, A C F; Coutinho, L H; de Souza, G G B

    2015-09-15

    The ion-ion-coincidence mass spectroscopy technique brings useful information about the fragmentation dynamics of doubly and multiply charged ionic species. We advocate the use of a matrix-parameter methodology to represent and interpret the entire ion-ion spectra associated with the ionic dissociation of doubly charged molecules. This method makes it possible, among other things, to infer fragmentation processes and to extract information about overlapped ion-ion coincidences, information that is difficult to obtain from previously described methodologies. A Wiley-McLaren time-of-flight mass spectrometer was used to discriminate the positively charged fragment ions resulting from ionization of the sample by a pulsed 800 eV electron beam. We exemplify the application of this methodology by analyzing the fragmentation and ionic dissociation of the dimethyl disulfide (DMDS) molecule as induced by fast electrons. The doubly charged dissociation was analyzed using the Multivariate Normal Distribution. The ion-ion spectrum of the DMDS molecule was obtained at an incident electron energy of 800 eV and was represented in matrix form using Multivariate Normal Distribution theory. The proposed methodology allows information about the [CHnSHn]+/[CH3]+ (n = 1-3) fragment ions to be distinguished in overlapped ion-ion coincidence data. Using the momenta balance methodology for the inferred parameters, a secondary decay mechanism is proposed for the [CHS]+ ion formation. As an additional check on the methodology, previously published data on the SiF4 molecule were re-analyzed with the present methodology and the results were shown to be statistically equivalent. The use of a Multivariate Normal Distribution allows the whole ion-ion mass spectrum of doubly or multiply ionized molecules to be represented as a combination of parameters and information to be extracted from overlapped data. We have successfully applied this methodology to the analysis of the fragmentation of the DMDS molecule. Copyright © 2015 John Wiley & Sons, Ltd.
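    An accessible analogue of fitting overlapped coincidence islands with multivariate normals is a two-component Gaussian mixture; the sketch below uses scikit-learn on synthetic flight-time pairs and is not the authors' exact matrix-parameter formulation.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    # Synthetic (t1, t2) flight-time pairs for two overlapping coincidence
    # islands; real data would come from the ion-ion coincidence spectrum.
    # The anticorrelated covariance mimics momentum conservation between ions.
    island_a = rng.multivariate_normal([100.0, 180.0], [[4, -2], [-2, 4]], 500)
    island_b = rng.multivariate_normal([104.0, 176.0], [[4, -2], [-2, 4]], 500)
    pairs = np.vstack([island_a, island_b])

    gm = GaussianMixture(n_components=2, covariance_type="full").fit(pairs)
    print(gm.means_)    # fitted island centres
    print(gm.weights_)  # relative intensities of the overlapped channels
    ```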

  14. Using a Mobile Application to Support Children's Writing Motivation

    ERIC Educational Resources Information Center

    Kanala, Sari; Nousiainen, Tuula; Kankaanranta, Marja

    2013-01-01

    Purpose: The purpose of this paper is to explore the use of the prototype of a mobile application for the enhancement of children's motivation for writing. The results are explored from students' and experts' perspectives. Design/methodology/approach: This study is based on a field trial and expert evaluations of a prototype of a mobile…

  15. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 1: Methodology and applications

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
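    The core statistical idea, propagating parameter uncertainty through an analytical failure model to obtain a failure probability, can be sketched with a simple Monte Carlo loop; the Basquin-type fatigue model, distributions and numbers below are illustrative assumptions, not values from the report.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N = 100_000

    # Hypothetical uncertain inputs of a deterministic fatigue-life model
    # (names and distributions are invented for illustration).
    stress = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=N)   # MPa
    coeff = rng.lognormal(mean=np.log(1e14), sigma=0.30, size=N)     # Basquin coefficient
    m = 4.0                                                          # Basquin exponent

    cycles_to_failure = coeff / stress**m      # analytical failure model
    service_cycles = 5_000.0

    # Failure probability for the mission: fraction of sampled lives shorter
    # than the required service life.
    print("P(failure) ~", np.mean(cycles_to_failure < service_cycles))
    ```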

  16. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  17. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples, volume 1

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  18. Rational Design Methodology.

    DTIC Science & Technology

    1978-09-01

    This report describes an effort to specify a software design methodology applicable to the Air Force software environment. Available methodologies ... of techniques for proof of correctness, design specification, and performance assessment of static designs. The rational methodology selected is a

  19. End State: The Fallacy of Modern Military Planning

    DTIC Science & Technology

    2017-04-06

    operational planning for non-linear, complex scenarios requires application of non-linear, advanced planning techniques such as design methodology ... cannot be approached in a linear, mechanistic manner by a universal planning methodology. Theater/global campaign plans and theater strategies offer no ... strategic environments, and instead prescribes a universal linear methodology that pays no mind to strategic complexity. This universal application

  20. A multicriteria-based methodology for site prioritisation in sediment management.

    PubMed

    Alvarez-Guerra, Manuel; Viguri, Javier R; Voulvoulis, Nikolaos

    2009-08-01

    Decision-making for sediment management is a complex task that incorporates the selection of areas for remediation and the assessment of options for any mitigation required. The application of Multicriteria Analysis (MCA) to rank different areas according to their need for sediment management provides a great opportunity for prioritisation, the first step in an integrated methodology that ultimately aims to assess and select suitable alternatives for managing the identified priority sites. This paper develops a methodology that starts with the delimitation of management units within areas of study, followed by the application of MCA methods that allow ranking of these management units according to their need for remediation. The proposed process considers not only scientific evidence on sediment quality, but also other relevant aspects such as social and economic criteria associated with such decisions. This methodology is illustrated with its application to the case study area of the Bay of Santander, in northern Spain, highlighting some of the implications of utilising different MCA methods in the process. It also uses site-specific data to assess the subjectivity in the decision-making process, mainly reflected through the assignment of the criteria weights and uncertainties in the criteria scores. Analysis of the sensitivity of the results to these factors is used as a way to assess the stability and robustness of the ranking as a first step of the sediment management decision-making process.
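    A minimal weighted-sum MCA ranking, with the kind of weight-perturbation sensitivity check the paper advocates, might look like the following; the management units, criteria scores and weights are invented for illustration, not taken from the Santander case study.

    ```python
    # Hypothetical management units scored on three criteria (all in [0, 1],
    # higher = greater need for remediation on that criterion).
    units = {
        # unit: (sediment quality concern, social impact, economic factor)
        "MU-1": (0.9, 0.6, 0.4),
        "MU-2": (0.5, 0.8, 0.7),
        "MU-3": (0.7, 0.3, 0.9),
    }
    weights = (0.5, 0.3, 0.2)   # elicited weights; assumed to sum to 1

    def score(criteria):
        return sum(w * c for w, c in zip(weights, criteria))

    ranking = sorted(units, key=lambda u: score(units[u]), reverse=True)
    print(ranking)  # higher score = higher priority for remediation

    # Sensitivity check in the spirit of the paper: perturb the first weight
    # and observe whether the top-ranked unit changes.
    for w0 in (0.4, 0.5, 0.6):
        weights = (w0, (1 - w0) * 0.6, (1 - w0) * 0.4)
        print(w0, sorted(units, key=lambda u: score(units[u]), reverse=True)[0])
    ```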

  1. IMSF: Infinite Methodology Set Framework

    NASA Astrophysics Data System (ADS)

    Ota, Martin; Jelínek, Ivan

    Software development is usually an integration task in an enterprise environment - few software applications work autonomously now. It is usually a collaboration of heterogeneous and unstable teams. One serious problem is lack of resources, a popular result being outsourcing, ‘body shopping’, and indirectly team and team member fluctuation. Outsourced sub-deliveries easily become black boxes with no clear development method used, which has a negative impact on supportability. Such environments then often face problems of quality assurance and enterprise know-how management. The methodology used is one of the key factors. Each methodology was created as a generalization of a number of solved projects, and each methodology is thus more or less tied to a set of task types. When the methodology does not suit the task type, problems arise that usually result in an undocumented ad hoc solution. This was the motivation behind formalizing a simple process for collaborative software engineering. The Infinite Methodology Set Framework (IMSF) defines the ICT business process of adaptive use of methods for classified types of tasks. The article introduces IMSF and briefly comments on its meta-model.

  2. Fractal Risk Assessment of ISS Propulsion Module in Meteoroid and Orbital Debris Environments

    NASA Technical Reports Server (NTRS)

    Mog, Robert A.

    2001-01-01

    A unique and innovative risk assessment of the International Space Station (ISS) Propulsion Module is conducted using fractal modeling of the Module's response to the meteoroid and orbital debris environments. Both the environment models and structural failure modes due to the resultant hypervelocity impact phenomenology, as well as Module geometry, are investigated for fractal applicability. The fractal risk assessment methodology could produce a greatly simplified alternative to current methodologies, such as BUMPER analyses, while maintaining or increasing the number of complex scenarios that can be assessed. As a minimum, this innovative fractal approach will provide an independent assessment of existing methodologies in a unique way.

  3. Recent studies of measures to improve basamid soil disinfestation.

    PubMed

    Van Wambeke, E

    2011-01-01

    Basamid micro-granule is used worldwide as a broad-spectrum soil fumigant generator and has replaced methyl bromide for many applications. Much has been known for decades about the factors determining the success of the application, from soil preparation and conditions to the application and soil sealing or tarping, as well as the operations and hygienic measures after the fumigant contact time. This paper describes the last six years of studies on improving application methods, both from the viewpoint of homogeneous incorporation of the granule throughout the soil profile to be treated and from the viewpoint of limiting premature loss of the gaseous active methyl isothiocyanate (MITC) by using improved tarping materials. Both result in lower environmental exposure and better biological performance of the application. In that respect, product incorporation in soil was studied in France and in Italy with recent commercially available Basamid application machinery, and 29 plastic films were compared for their MITC barrier properties with an in-house method. Film testing allowed clear categorization into standard (monolayer) films, V.I.F. (Virtually Impermeable Film) and T.I.F. (Totally Impermeable Film). The paper presents the methodology and results of the granule incorporation trials with two specific Basamid application machines compared with a classic rotovator, the methodology and comparison of plastic film barrier property testing, and directives to minimize exposure and maximize performance.

  4. Application of the Delphi technique in healthcare maintenance.

    PubMed

    Njuangang, Stanley; Liyanage, Champika; Akintoye, Akintola

    2017-10-09

    Purpose The purpose of this paper is to examine the research design, issues and considerations in the application of the Delphi technique to identify, refine and rate the critical success factors and performance measures in maintenance-associated infections. Design/methodology/approach An in-depth literature review, using open and axial coding, was applied to formulate the interview and research questions. These were used to conduct an exploratory case study of two healthcare maintenance managers, randomly selected from two National Health Service Foundation Trusts in England. The results of the exploratory case study provided the rationale for the application of the Delphi technique in this research. The different processes in the application of the Delphi technique in healthcare research are examined thoroughly. Findings This research demonstrates the need to apply and integrate different research methods to enhance the validity of the Delphi technique. The rationale for the application of the Delphi technique in this research is that some healthcare maintenance managers lack knowledge about basic infection control (IC) principles needed to make hospitals safe for patient care. The results of the first round of the Delphi exercise are a useful contribution in their own right: they identified a number of salient issues and differences in the opinions of the Delphi participants, noticeably between healthcare maintenance managers and members of the infection control team, and they yielded useful suggestions and comments to improve the quality and presentation of the second- and third-round Delphi instruments. Practical implications This research provides a research methodology that can be adopted by researchers investigating new and emerging issues in the healthcare sector. As this research demonstrates, the Delphi technique is relevant for soliciting expert knowledge and opinion to identify performance measures to control maintenance-associated infections in hospitals. The methodology provided here could be applied by other researchers elsewhere to probe, investigate and generate rich information about new and emerging healthcare research topics. Originality/value The authors demonstrate how different research methods can be integrated to enhance the validity of the Delphi technique. For example, the results of an exploratory case study provided the rationale for the application of the Delphi technique investigating the key performance measures in maintenance-associated infections. The different processes involved in the application of the Delphi technique are also carefully explored and discussed in depth.

  5. Integration of infrared thermography into various maintenance methodologies

    NASA Astrophysics Data System (ADS)

    Morgan, William T.

    1993-04-01

    Maintenance methodologies are in developmental stages throughout the world as global competitiveness drives all industries to improve operational efficiencies. Rapid technical advancement has placed additional strain on maintenance organizations to change progressively. Accompanying the needs for advanced training and documentation is the demand for various analytical instruments and quantitative methods. Infrared thermography is one of the primary elements of engineered approaches to maintenance. Current maintenance methodologies can be divided into six categories: Routine ('Breakdown'), Preventive, Predictive, Proactive, Reliability-Based, and Total Productive (TPM) maintenance. Each of these methodologies has a distinctive approach to achieving improved operational efficiencies. The popular view is that infrared thermography is a Predictive maintenance tool. While this is true, it can also be effectively integrated into each of the other maintenance methodologies to achieve the desired results. The six maintenance strategies are defined, and infrared applications integrated into each are presented in tabular form.

  6. Measuring salt retention : [summary].

    DOT National Transportation Integrated Search

    2013-03-01

    This project involves measuring and reporting the retention of salt and brine on the roadway as a result of using different salt spreaders, application speeds, and brine quantities. The research develops an evaluation methodology, directs the field c...

  7. 7 CFR 205.670 - Inspection and testing of agricultural product to be sold or labeled “organic.”

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... current applicable validated methodology determining the presence of contaminants in agricultural products... an ongoing compliance investigation. (e) If test results indicate a specific agricultural product...

  8. Application of Low-Cost Methodologies for Mobile Phone App Development

    PubMed Central

    Ng, Beng Yeong; Ho, Roger; Cheok, Christopher Cheng Soon

    2014-01-01

    Background The usage of mobile phones and mobile phone apps has become more prevalent in the past decade. Previous research has highlighted a method of using just an Internet browser and a text editor to create an app, but this does not eliminate the challenges faced by clinicians. More recently, two methodologies of app development have been shared, but there have been no disclosures pertaining to the costs involved, and limitations such as the distribution and dissemination of the apps have not been addressed. Objective The aims of this research article are to: (1) highlight a low-cost methodology that clinicians without technical knowledge could use to develop educational apps; (2) clarify the respective costs involved in the process of development; (3) illustrate how limitations pertaining to dissemination could be addressed; and (4) report initial utilization data of the apps and share initial users' self-rated perceptions of the apps. Methods We present two techniques for creating a mobile app using two well-established online mobile app building websites. The costs of development are specified and the methodology for dissemination of the apps is shared. The application of the low-cost methodologies in the creation of the "Mastering Psychiatry" app for undergraduates and the "Déjà vu" app for postgraduates is discussed. A questionnaire survey was administered to undergraduate students collating their perceptions of the app. Results For the Mastering Psychiatry app, a cumulative total of 722 users have used the mobile app since inception, based on our analytics. For the Déjà vu app, there has been a cumulative total of 154 downloads since inception. The utilization data demonstrated the receptiveness towards these apps, and this is reinforced by the positive perceptions undergraduate students (n=185) had of the low-cost self-developed apps. Conclusions This is one of the few studies demonstrating low-cost methodologies of app development, as well as student and trainee receptivity toward self-created Web-based mobile phone apps. The results demonstrate that these Web-based low-cost apps are applicable in real life, and suggest that the methodologies shared in this research paper may benefit other specialties and disciplines. PMID:25491323

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onoufriou, T.; Simpson, R.J.; Protopapas, M.

    This paper presents the development and application of reliability-based inspection planning techniques for floaters. Based on previous experience from jacket structure applications, optimized inspection planning (OIP) techniques for floaters are developed. The differences between floaters and jacket structures in relation to fatigue damage, redundancy levels and inspection practice are examined and reflected in the proposed methodology. The application and benefits of these techniques are demonstrated through representative analyses, and important trends are highlighted through the results of a parametric sensitivity study.

  10. Medically related activities of application team program

    NASA Technical Reports Server (NTRS)

    1971-01-01

    Application team methodology identifies and specifies problems in technology transfer programs to biomedical areas through direct contact with users of aerospace technology. The availability of reengineering sources increases impact of the program on the medical community and results in broad scale application of some bioinstrumentation systems. Examples are given that include devices adapted to the rehabilitation of neuromuscular disorders, power sources for artificial organs, and automated monitoring and detection equipment in clinical medicine.

  11. Guidelines for the Design and Conduct of Clinical Studies in Knee Articular Cartilage Repair

    PubMed Central

    Mithoefer, Kai; Saris, Daniel B.F.; Farr, Jack; Kon, Elizaveta; Zaslav, Kenneth; Cole, Brian J.; Ranstam, Jonas; Yao, Jian; Shive, Matthew; Levine, David; Dalemans, Wilfried; Brittberg, Mats

    2011-01-01

    Objective: To summarize current clinical research practice and develop methodological standards for objective scientific evaluation of knee cartilage repair procedures and products. Design: A comprehensive literature review was performed of high-level original studies providing information relevant for the design of clinical studies on articular cartilage repair in the knee. Analysis of cartilage repair publications and synopses of ongoing trials were used to identify important criteria for the design, reporting, and interpretation of studies in this field. Results: Current literature reflects the methodological limitations of the scientific evidence available for articular cartilage repair. However, clinical trial databases of ongoing trials document a trend suggesting improved study designs and clinical evaluation methodology. Based on the current scientific information and standards of clinical care, detailed methodological recommendations were developed for the statistical study design, patient recruitment, control group considerations, study endpoint definition, documentation of results, use of validated patient-reported outcome instruments, and inclusion and exclusion criteria for the design and conduct of scientifically sound cartilage repair study protocols. A consensus statement among the International Cartilage Repair Society (ICRS) and contributing authors experienced in clinical trial design and implementation was achieved. Conclusions: High-quality clinical research methodology is critical for the optimal evaluation of current and new cartilage repair technologies. In addition to generally applicable principles for orthopedic study design, specific criteria and considerations apply to cartilage repair studies. Systematic application of these criteria and considerations can facilitate study designs that are scientifically rigorous, ethical, practical, and appropriate for the question(s) being addressed in any given cartilage repair research project. PMID:26069574

  12. Coalescence computations for large samples drawn from populations of time-varying sizes

    PubMed Central

    Polanski, Andrzej; Szczesna, Agnieszka; Garbulowski, Mateusz; Kimmel, Marek

    2017-01-01

    We present new results concerning probability distributions of times in the coalescence tree and expected allele frequencies for the coalescent with large sample sizes. The obtained results are based on computational methodologies which combine coalescence time scale changes with techniques of integral transformations and analytical formulae for infinite products. We show applications of the proposed methodologies for computing probability distributions of times in the coalescence tree and their limits, for evaluating the accuracy of approximate expressions for times in the coalescence tree and expected allele frequencies, and for the analysis of a large human mitochondrial DNA dataset. PMID:28170404
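    For the constant-size baseline, the expected times between coalescence events have a closed form, E[T_k] = 1/C(k,2) in units of 2N generations; the sketch below computes them exactly, whereas the paper's methods extend such computations to time-varying population sizes via time-scale changes.

    ```python
    from fractions import Fraction

    def expected_coalescence_times(n):
        """E[T_k] for k = n..2 lineages, in units of 2N generations, under a
        constant-size population: while k lineages remain, coalescence occurs
        at rate C(k, 2), so E[T_k] = 2 / (k * (k - 1))."""
        return {k: Fraction(2, k * (k - 1)) for k in range(n, 1, -1)}

    times = expected_coalescence_times(10)
    # Expected time to the most recent common ancestor is the sum over
    # levels, 2 * (1 - 1/n), approaching 2 (i.e., 4N generations) as n grows.
    print(float(sum(times.values())))   # 1.8 for n = 10
    ```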

  13. Methodology and Results of Mathematical Modelling of Complex Technological Processes

    NASA Astrophysics Data System (ADS)

    Mokrova, Nataliya V.

    2018-03-01

    The methodology of system analysis allows us to derive a mathematical model of a complex technological process. A mathematical description of the plasma-chemical process is proposed. The importance of the quenching rate and of the initial temperature decrease time for producing the maximum amount of the target product was confirmed. The results of numerical integration of the system of differential equations can be used to describe reagent concentrations, plasma jet rate and temperature in order to achieve the optimal mode of hardening. Such models are applicable both for solving control problems and for predicting future states of sophisticated technological systems.

  14. Conjugate gradient based projection - A new explicit methodology for frictional contact

    NASA Technical Reports Server (NTRS)

    Tamma, Kumar K.; Li, Maocheng; Sha, Desong

    1993-01-01

    With special attention towards applicability to parallel computation and vectorization, a new and effective explicit approach for linear complementarity formulations, involving a conjugate gradient based projection methodology, is proposed in this study for contact problems with Coulomb friction. The overall objective is to provide an explicit methodology of computation for the complete contact problem with friction. The primary idea for solving the linear complementarity formulations stems from an established search direction which is projected onto a feasible region determined by the non-negativity constraint; this direction is then applied within the Fletcher-Reeves conjugate gradient method, resulting in a powerful explicit methodology which possesses high accuracy, excellent convergence characteristics and fast computational speed, and is relatively simple to implement for contact problems involving Coulomb friction.
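    A minimal sketch of the idea, a Fletcher-Reeves conjugate gradient iteration whose iterates are projected onto the non-negative feasible region of a linear complementarity problem, is given below; the restart heuristic, tolerances and test matrix are illustrative choices, not the paper's exact scheme.

    ```python
    import numpy as np

    def projected_fr(A, b, x0, iters=200):
        """Fletcher-Reeves CG with projection onto x >= 0, sketched for the
        quadratic program min 0.5 x'Ax - b'x, x >= 0, whose KKT conditions
        form the LCP: x >= 0, Ax - b >= 0, x'(Ax - b) = 0."""
        x = np.maximum(x0, 0.0)
        g = A @ x - b                        # gradient of the quadratic
        d = -g
        for _ in range(iters):
            dAd = d @ A @ d
            if dAd <= 1e-14:
                break
            alpha = -(g @ d) / dAd           # exact line search on the quadratic
            x_trial = x + alpha * d
            x = np.maximum(x_trial, 0.0)     # project onto the feasible region
            g_new = A @ x - b
            if np.any(x != x_trial):         # active set changed: restart
                d = -g_new
            else:
                beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
                d = -g_new + beta * d
            g = g_new
            if np.linalg.norm(np.minimum(x, g)) < 1e-10:   # LCP residual
                break
        return x

    A = np.array([[4.0, 1.0], [1.0, 3.0]])   # SPD toy operator
    b = np.array([1.0, -2.0])
    x = projected_fr(A, b, np.zeros(2))
    print(x, A @ x - b)   # complementarity: each x_i = 0 or residual_i ~ 0
    ```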

  15. Prototype integration of the joint munitions assessment and planning model with the OSD threat methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lynn, R.Y.S.; Bolmarcich, J.J.

    The purpose of this Memorandum is to propose a prototype procedure which the Office of Munitions might employ to exercise, in a supportive joint fashion, two of its High Level Conventional Munitions Models, namely, the OSD Threat Methodology and the Joint Munitions Assessment and Planning (JMAP) model. The joint application of JMAP and the OSD Threat Methodology provides a tool to optimize munitions stockpiles. The remainder of this Memorandum comprises five parts. The first is a description of the structure and use of the OSD Threat Methodology. The second is a description of JMAP and its use. The third discusses the concept of the joint application of JMAP and OSD Threat Methodology. The fourth displays sample output of the joint application. The fifth is a summary and epilogue. Finally, three appendices contain details of the formulation, data, and computer code.

  16. Basic principles, methodology, and applications of remote sensing in agriculture

    NASA Technical Reports Server (NTRS)

    Moreira, M. A. (Principal Investigator); Deassuncao, G. V.

    1984-01-01

    The basic principles of remote sensing applied to agriculture and the methods used in data analysis are described. Emphasis is placed on the importance of developing a methodology to support crop forecasting, on the basic concepts of spectral signatures of vegetation, on the methodology of LANDSAT data utilization in agriculture, and on the agricultural applications of the remote sensing program of INPE (Institute for Space Research).

  17. Materials Selection Criteria for Nuclear Power Applications: A Decision Algorithm

    NASA Astrophysics Data System (ADS)

    Rodríguez-Prieto, Álvaro; Camacho, Ana María; Sebastián, Miguel Ángel

    2016-02-01

    An innovative methodology based on stringency levels is proposed in this paper to improve the current selection method for structural materials used in demanding industrial applications. This paper describes a new approach for quantifying the stringency of materials requirements, based on a novel deterministic algorithm, to prevent potential failures. We have applied the new methodology to different standardized specifications used in pressure vessel design, such as SA-533 Grade B Cl.1 and SA-508 Cl.3 (issued by the American Society of Mechanical Engineers), DIN 20MnMoNi55 (issued by the German Institute of Standardization) and 16MND5 (issued by the French Nuclear Commission), and determine the influence of design code selection. This study is based on key scientific publications on the influence of chemical composition on the mechanical behavior of materials, which were not considered when the technological requirements were established in the aforementioned specifications. For this purpose, a new method to quantify the efficacy of each standard has been developed using a deterministic algorithm. Relative weights were assigned by consulting a panel of experts in materials selection for reactor pressure vessels, providing a more objective methodology and greatly simplifying the resulting mathematical calculations for quantitative analysis. The final results show that DIN 20MnMoNi55 steel is the best material option. Additionally, the more recently developed specifications DIN 20MnMoNi55, 16MND5 and SA-508 Cl.3 impose mechanical requirements more stringent than SA-533 Grade B Cl.1. The methodology presented in this paper can be used as a decision tool in the selection of materials for a wide range of applications.
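    The flavour of a stringency-based decision score can be sketched as a weighted sum of per-requirement stringency levels; the requirements, levels and weights below are invented placeholders, whereas the paper elicits its weights from a panel of reactor pressure vessel experts.

    ```python
    # Hypothetical requirements and expert weights (must be treated as
    # placeholders; they are not the paper's values).
    requirements = ("P content", "S content", "impact energy", "yield strength")
    weights = (0.3, 0.3, 0.25, 0.15)

    # Assumed stringency level of each requirement per specification
    # (higher = stricter requirement).
    specs = {
        "SA-533 Grade B Cl.1": (2, 2, 2, 3),
        "SA-508 Cl.3":         (3, 3, 3, 3),
        "DIN 20MnMoNi55":      (4, 4, 3, 3),
        "16MND5":              (3, 4, 3, 3),
    }

    def stringency(levels):
        return sum(w * l for w, l in zip(weights, levels))

    # Rank specifications from most to least stringent overall.
    for name in sorted(specs, key=lambda s: stringency(specs[s]), reverse=True):
        print(f"{name:22s} {stringency(specs[name]):.2f}")
    ```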

  18. A literature review of applied adaptive design methodology within the field of oncology in randomised controlled trials and a proposed extension to the CONSORT guidelines.

    PubMed

    Mistry, Pankaj; Dunn, Janet A; Marshall, Andrea

    2017-07-18

    The application of adaptive design methodology within a clinical trial setting is becoming increasingly popular. However, the application of these methods is often not reported explicitly as an adaptive design, making it more difficult to capture their emerging use. Within this review, we aim to understand how adaptive design methodology is being reported, whether these methods are explicitly stated as an 'adaptive design' or have to be inferred, and whether these methods are applied prospectively or concurrently. Three databases, Embase, Ovid and PubMed, were chosen to conduct the literature search. The inclusion criteria for the review were phase II, phase III and phase II/III randomised controlled trials within the field of oncology that published trial results in 2015. A variety of search terms related to adaptive designs were used. A total of 734 results were identified, of which 54 were eligible after screening. Adaptive designs were more commonly applied in phase III confirmatory trials. The majority of the papers performed an interim analysis, which included some form of stopping criteria. Only two papers explicitly stated the term 'adaptive design'; for the remainder, it had to be inferred that adaptive methods were applied. Sixty-five applications of adaptive design methods were identified, of which the most common was the use of group sequential methods. This review indicates that the reporting of adaptive design methodology within clinical trials needs improving. The proposed extension to the current CONSORT 2010 guidelines could help capture adaptive design methods and would provide an essential aid to those involved with clinical trials.

  19. Reliability Centered Maintenance - Methodologies

    NASA Technical Reports Server (NTRS)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  20. A Methodology for the Hybridization Based in Active Components: The Case of cGA and Scatter Search

    PubMed Central

    Alba, Enrique; Leguizamón, Guillermo

    2016-01-01

    This work presents the results of a new methodology for hybridizing metaheuristics. By first locating the active components (parts) of one algorithm and then inserting them into a second one, we can build efficient and accurate optimization, search, and learning algorithms. This gives a concrete way of constructing new techniques that contrasts with the widespread ad hoc way of hybridizing. In this paper, the enhanced algorithm is a Cellular Genetic Algorithm (cGA), which has been successfully used in the past to solve hard optimization problems. In order to extend and corroborate the use of active components as an emerging hybridization methodology, we propose here the use of active components taken from Scatter Search (SS) to improve the cGA. The results obtained over a varied set of benchmarks are highly satisfactory in efficacy and efficiency when compared with a standard cGA. Moreover, the proposed hybrid approach (i.e., cGA+SS) shows encouraging results with regard to earlier applications of our methodology. PMID:27403153
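    For orientation, a plain cGA skeleton on a toroidal grid is sketched below with a toy OneMax objective; in the paper's hybrid, active components taken from Scatter Search, such as its solution-combination method, would replace parts of this variation loop. All sizes and operators here are illustrative choices.

    ```python
    import random
    random.seed(0)

    L, N = 20, 5          # bitstring length, N x N toroidal grid
    grid = [[[random.randint(0, 1) for _ in range(L)] for _ in range(N)]
            for _ in range(N)]
    fit = lambda ind: sum(ind)    # OneMax toy objective

    def neighbours(i, j):
        # von Neumann neighbourhood on a torus: the defining cGA structure.
        return [grid[(i - 1) % N][j], grid[(i + 1) % N][j],
                grid[i][(j - 1) % N], grid[i][(j + 1) % N]]

    for _ in range(50):                  # asynchronous cGA sweeps (sketch)
        for i in range(N):
            for j in range(N):
                mate = max(neighbours(i, j), key=fit)       # local selection
                cut = random.randrange(1, L)
                child = grid[i][j][:cut] + mate[cut:]       # one-point crossover
                k = random.randrange(L); child[k] ^= 1      # bit-flip mutation
                if fit(child) >= fit(grid[i][j]):           # replace if better
                    grid[i][j] = child

    print(max(fit(grid[i][j]) for i in range(N) for j in range(N)))
    ```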

  1. Report on FY17 testing in support of integrated EPP-SMT design methods development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yanli .; Jetter, Robert I.; Sham, T. -L.

    The goal of the proposed integrated Elastic Perfectly-Plastic (EPP) and Simplified Model Test (SMT) methodology is to incorporate an SMT data-based approach for creep-fatigue damage evaluation into the EPP methodology, to avoid the separate evaluation of creep and fatigue damage and to eliminate the requirement for stress classification in current methods, thus greatly simplifying the evaluation of elevated temperature cyclic service. The purpose of this methodology is to minimize over-conservatism while properly accounting for localized defects and stress risers. To support the implementation of the proposed methodology and to verify the applicability of the code rules, thermomechanical tests continued in FY17. This report presents the recent test results for Type 1 SMT specimens on Alloy 617 with long hold times, pressurization SMT on Alloy 617, and two-bar thermal ratcheting test results on SS316H in the temperature range of 405 °C to 705 °C. Preliminary EPP strain range analyses of the two-bar tests are critically evaluated and compared with the experimental results.

  2. A Novel Methodology for Improving Plant Pest Surveillance in Vineyards and Crops Using UAV-Based Hyperspectral and Spatial Data.

    PubMed

    Vanegas, Fernando; Bratanov, Dmitry; Powell, Kevin; Weiss, John; Gonzalez, Felipe

    2018-01-17

    Recent advances in remotely sensed imagery and geospatial image processing using unmanned aerial vehicles (UAVs) have enabled the rapid and ongoing development of monitoring tools for crop management and the detection/surveillance of insect pests. This paper describes a UAV remote sensing-based methodology to increase the efficiency of existing surveillance practices (human inspectors and insect traps) for detecting pest infestations (e.g., grape phylloxera in vineyards). The methodology uses a UAV integrated with advanced digital hyperspectral, multispectral, and RGB sensors. We implemented the methodology for the development of a predictive model for phylloxera detection, exploring the combination of airborne RGB, multispectral, and hyperspectral imagery with ground-based data at two separate time periods and under different levels of phylloxera infestation. We describe the technology used (the sensors, the UAV, and the flight operations), the processing workflow of the datasets from each imagery type, and the methods for combining multiple airborne and ground-based datasets. Finally, we present relevant results on the correlation between the different processed datasets. The objective of this research is to develop a novel methodology for collecting, processing, analysing and integrating multispectral, hyperspectral, ground and spatial data to remotely sense different variables in different applications, such as, in this case, plant pest surveillance. Such a methodology would provide researchers, agronomists, and UAV practitioners with reliable data collection protocols and methods to achieve faster processing techniques and to integrate multiple sources of data in diverse remote sensing applications.
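    One elementary instance of correlating airborne and ground datasets is comparing an image-derived vegetation index against inspector ratings per block; the values and band names below are synthetic placeholders, not the study's data.

    ```python
    import numpy as np

    # Hypothetical per-block means extracted from the airborne imagery.
    nir = np.array([0.52, 0.48, 0.41, 0.35, 0.30])   # near-infrared reflectance
    red = np.array([0.08, 0.10, 0.13, 0.16, 0.18])   # red reflectance
    ground = np.array([0, 1, 2, 3, 3])               # inspector infestation rating

    ndvi = (nir - red) / (nir + red)                 # vegetation index per block
    r = np.corrcoef(ndvi, ground)[0, 1]
    print(f"NDVI vs ground rating: r = {r:.2f}")     # stressed vines -> lower NDVI
    ```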

  3. Methodology and issues of integral experiments selection for nuclear data validation

    NASA Astrophysics Data System (ADS)

    Tatiana, Ivanova; Ivanov, Evgeny; Hill, Ian

    2017-09-01

    Nuclear data validation involves a large suite of Integral Experiments (IEs) for criticality, reactor physics and dosimetry applications [1]. Benchmarks are often taken from international handbooks [2, 3]. Depending on the application, IEs have different degrees of usefulness in validation, and the use of a single benchmark is usually not advised; indeed, it may lead to erroneous interpretation and results [1]. This work aims at quantifying the importance of benchmarks used in application-dependent cross section validation. The approach is based on the well-known Generalized Linear Least Squares Method (GLLSM), extended to establish biases and uncertainties for given cross sections within a given energy interval. The statistical treatment results in a vector of weighting factors for the integral benchmarks. These factors characterize the value added by a benchmark to nuclear data validation for the given application. The methodology is illustrated by one example: selecting benchmarks for 239Pu cross section validation. The studies were performed in the framework of Subgroup 39 (Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files) established at the Working Party on International Nuclear Data Evaluation Cooperation (WPEC) of the Nuclear Science Committee under the Nuclear Energy Agency (NEA/OECD).
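    A generic GLLSM-style adjustment, from which a per-benchmark influence can be read off the gain matrix, can be sketched as follows; the matrices, numbers and the influence proxy are illustrative assumptions, not Subgroup 39's exact formulation.

    ```python
    import numpy as np

    # Generic linear least-squares adjustment: x = prior parameters (e.g.,
    # normalized cross sections), C = their covariance, S = benchmark
    # sensitivities, d = (measured - calculated) discrepancies, V = benchmark
    # experimental covariance. All values below are invented.
    x = np.array([1.00, 1.00])
    C = np.diag([0.04**2, 0.06**2])
    S = np.array([[0.8, 0.1],        # three benchmarks, two parameters
                  [0.2, 0.9],
                  [0.5, 0.5]])
    d = np.array([0.01, -0.02, 0.005])
    V = np.diag([0.01**2, 0.015**2, 0.02**2])

    G = C @ S.T @ np.linalg.inv(S @ C @ S.T + V)   # gain matrix
    x_post = x + G @ d                              # adjusted parameters
    C_post = C - G @ S @ C                          # reduced covariance
    # Column norms of the gain matrix indicate how strongly each benchmark
    # pulls the adjustment -- one crude proxy for a "weighting factor".
    print(x_post, np.linalg.norm(G, axis=0))
    ```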

  4. Toward Green Acylation of (Hetero)arenes: Palladium-Catalyzed Carbonylation of Olefins to Ketones

    PubMed Central

    2017-01-01

    Green Friedel–Crafts acylation reactions belong to the most desired transformations in organic chemistry. The resulting ketones constitute important intermediates, building blocks, and functional molecules in organic synthesis as well as for the chemical industry. Over the past 60 years, advances in this topic have focused on how to make this reaction more economically and environmentally friendly by using green acylating conditions, such as stoichiometric acylations and catalytic homogeneous and heterogeneous acylations. However, currently well-established methodologies for their synthesis either produce significant amounts of waste or proceed under harsh conditions, limiting applications. Here, we present a new protocol for the straightforward and selective introduction of acyl groups into (hetero)arenes without directing groups by using available olefins with inexpensive CO. In the presence of commercial palladium catalysts, inter- and intramolecular carbonylative C–H functionalizations take place with good regio- and chemoselectivity. Compared to classical Friedel–Crafts chemistry, this novel methodology proceeds under mild reaction conditions. The general applicability of this methodology is demonstrated by the direct carbonylation of industrial feedstocks (ethylene and diisobutene) as well as of natural products (eugenol and safrole). Furthermore, synthetic applications to drug molecules are showcased. PMID:29392174

  5. A methodological framework to support the initiation, design and institutionalization of participatory modeling processes in water resources management

    NASA Astrophysics Data System (ADS)

    Halbe, Johannes; Pahl-Wostl, Claudia; Adamowski, Jan

    2018-01-01

    Multiple barriers constrain the widespread application of participatory methods in water management, including the more technical focus of most water agencies, additional cost and time requirements for stakeholder involvement, as well as institutional structures that impede collaborative management. This paper presents a stepwise methodological framework that addresses the challenges of context-sensitive initiation, design and institutionalization of participatory modeling processes. The methodological framework consists of five successive stages: (1) problem framing and stakeholder analysis, (2) process design, (3) individual modeling, (4) group model building, and (5) institutionalized participatory modeling. The Management and Transition Framework is used for problem diagnosis (Stage One), context-sensitive process design (Stage Two) and analysis of requirements for the institutionalization of participatory water management (Stage Five). Conceptual modeling is used to initiate participatory modeling processes (Stage Three) and ensure a high compatibility with quantitative modeling approaches (Stage Four). This paper describes the proposed participatory model building (PMB) framework and provides a case study of its application in Québec, Canada. The results of the Québec study demonstrate the applicability of the PMB framework for initiating and designing participatory model building processes and analyzing barriers towards institutionalization.

  6. Methodological framework for heart rate variability analysis during exercise: application to running and cycling stress testing.

    PubMed

    Hernando, David; Hernando, Alberto; Casajús, Jose A; Laguna, Pablo; Garatachea, Nuria; Bailón, Raquel

    2018-05-01

    Standard methodologies for heart rate variability (HRV) analysis and its physiological interpretation as a marker of autonomic nervous system condition have been published largely for rest conditions, much less for exercise. A methodological framework for HRV analysis during exercise is proposed, which deals with the non-stationary nature of HRV during exercise, includes respiratory information, and identifies and corrects spectral components related to cardiolocomotor coupling (CC). It is applied to 23 male subjects who underwent different tests, maximal and submaximal, running and cycling, during which the ECG, respiratory frequency and oxygen consumption were simultaneously recorded. High-frequency (HF) power estimates change markedly from those computed with the standard fixed band to those obtained with the proposed methodology: for medium and high levels of exercise and recovery, HF power increases by 20 to 40%. When cycling, HF power increases by around 40% with respect to running, while CC power is around 20% stronger in running.
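    A simplified version of the band-adaptation idea, computing HF power over a band centred on the measured respiratory frequency rather than the fixed 0.15-0.4 Hz band, is sketched below; the synthetic RR series, resampling rate and band half-width are assumptions, and the non-stationarity handling and CC correction of the paper are omitted.

    ```python
    import numpy as np
    from scipy.signal import welch

    # Synthetic exercise-like RR intervals (seconds); real input would be the
    # beat series detected from the ECG.
    rr = np.random.default_rng(2).normal(0.35, 0.01, 600)
    t = np.cumsum(rr)

    fs = 4.0                                   # resampling rate (Hz), assumed
    grid = np.arange(t[0], t[-1], 1 / fs)
    tachogram = np.interp(grid, t, rr)         # evenly sampled RR series

    f, pxx = welch(tachogram - tachogram.mean(), fs=fs, nperseg=256)

    f_resp = 0.7                               # measured breaths/s (assumed)
    half_width = 0.125                         # band half-width (assumed)
    band = (f >= f_resp - half_width) & (f <= f_resp + half_width)
    hf_power = np.sum(pxx[band]) * (f[1] - f[0])   # integrate PSD over band
    print(hf_power)
    ```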

  7. A goal programming approach for a joint design of macroeconomic and environmental policies: a methodological proposal and an application to the Spanish economy.

    PubMed

    André, Francisco J; Cardenete, M Alejandro; Romero, Carlos

    2009-05-01

    Economic policy needs to pay increasing attention to environmental issues, which requires the development of methodologies able to incorporate environmental, as well as macroeconomic, goals in the design of public policies. Starting from this observation, this article proposes a methodology, based upon a Simonian satisficing logic made operational with the help of goal programming (GP) models, to address the joint design of macroeconomic and environmental policies. The methodology is applied to the Spanish economy, where a joint policy is elicited taking into consideration macroeconomic goals (economic growth, inflation, unemployment, public deficit) and environmental goals (CO2, NOx and SOx emissions) within the context of a computable general equilibrium model. The results show how the government can "fine-tune" its policy according to different criteria using GP models. The resulting policies aggregate the environmental and the economic goals in different ways: maximum aggregate performance, maximum balance and a lexicographic hierarchy of the goals.
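    The weighted goal-programming core can be written as a small linear program with deviation variables; the instruments, goal equations, targets and weights below are invented for illustration and do not reflect the Spanish CGE model.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Two policy instruments x; three linear goal equations a_i . x + n_i - p_i
    # = g_i, minimizing the weighted deviations n_i + p_i (all numbers assumed).
    A = np.array([[1.0, 2.0],      # growth response
                  [2.0, 1.0],      # inflation response
                  [1.0, -1.0]])    # emissions response
    g = np.array([4.0, 5.0, 1.0])  # target levels
    w = np.array([1.0, 1.0, 2.0])  # emissions goal weighted twice as heavily

    m, k = A.shape
    # Decision vector layout: [x (k), n (m), p (m)].
    c = np.concatenate([np.zeros(k), w, w])          # cost on deviations only
    A_eq = np.hstack([A, np.eye(m), -np.eye(m)])     # goal equations
    bounds = [(None, None)] * k + [(0, None)] * (2 * m)
    res = linprog(c, A_eq=A_eq, b_eq=g, bounds=bounds)

    x, n, p = res.x[:k], res.x[k:k + m], res.x[k + m:]
    print("instruments:", x, "deviations:", n + p)
    ```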

  8. UWB Tracking Algorithms: AOA and TDOA

    NASA Technical Reports Server (NTRS)

    Ni, Jianjun David; Arndt, D.; Ngo, P.; Gross, J.; Refford, Melinda

    2006-01-01

    Ultra-Wideband (UWB) tracking prototype systems are currently under development at NASA Johnson Space Center for various applications in space exploration. For long-range applications, a two-cluster Angle of Arrival (AOA) tracking method is employed; for close-in applications, a Time Difference of Arrival (TDOA) positioning methodology is exploited. Both AOA and TDOA are chosen to utilize the fine time resolution achievable with UWB signals. This talk presents a brief introduction to the AOA and TDOA methodologies. Theoretical analysis of these two algorithms reveals how the relevant parameters affect tracking resolution. For the AOA algorithm, simulations show that a tracking resolution of less than 0.5% of the range can be achieved with the currently achievable time resolution of UWB signals. For the TDOA algorithm used in close-in applications, simulations show that sub-inch tracking resolution is achieved with the chosen tracking baseline configuration. The analytical and simulated results provide insightful guidance for the UWB tracking system design.
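    A noise-free TDOA positioning sketch, solving the hyperbolic equations by nonlinear least squares, is given below; the receiver geometry and units are assumptions, and a real UWB system would additionally handle noise, clock synchronization and bias.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    c = 0.299792458   # propagation speed in m/ns

    rx = np.array([[0.0, 0.0], [10.0, 0.0],
                   [0.0, 10.0], [10.0, 10.0]])   # receiver positions (m), assumed
    tag = np.array([3.0, 4.0])                   # true tag position (for the demo)

    toa = np.linalg.norm(rx - tag, axis=1) / c   # times of arrival (ns)
    tdoa = toa[1:] - toa[0]                      # differences vs reference receiver

    def residuals(p):
        # Mismatch between predicted and measured time differences of arrival.
        d = np.linalg.norm(rx - p, axis=1)
        return (d[1:] - d[0]) / c - tdoa

    est = least_squares(residuals, x0=np.array([5.0, 5.0])).x
    print(est)   # converges to the true position in this noise-free sketch
    ```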

  9. Comparison of 250 MHz electron spin echo and continuous wave oxygen EPR imaging methods for in vivo applications

    PubMed Central

    Epel, Boris; Sundramoorthy, Subramanian V.; Barth, Eugene D.; Mailer, Colin; Halpern, Howard J.

    2011-01-01

    Purpose: The authors compare two electron paramagnetic resonance imaging modalities at 250 MHz to determine the advantages and disadvantages of those modalities for in vivo oxygen imaging. Methods: Electron spin echo (ESE) and continuous wave (CW) methodologies were used to obtain three-dimensional images of a narrow-linewidth, water-soluble, nontoxic oxygen-sensitive trityl molecule, OX063, in vitro and in vivo. The authors also examined sequential images obtained from the same animal injected intravenously with the trityl spin probe to determine the temporal stability of the methodologies. Results: A study of phantoms with different oxygen concentrations revealed a threefold advantage of the ESE methodology in terms of reduced imaging time and more precise oxygen resolution for samples with less than 70 torr oxygen partial pressure. Above ~100 torr, CW performed better. The images produced by both methodologies showed pO2 distributions with similar mean values. However, ESE images demonstrated superior performance in low pO2 regions while missing voxels in high pO2 regions. Conclusions: ESE and CW have different areas of applicability. ESE is superior for hypoxia studies in tumors. PMID:21626937

  10. Efficient preliminary floating offshore wind turbine design and testing methodologies and application to a concrete spar design

    PubMed Central

    Matha, Denis; Sandner, Frank; Molins, Climent; Campos, Alexis; Cheng, Po Wen

    2015-01-01

    The current key challenge in the floating offshore wind turbine industry and research is designing economical floating systems that can compete with fixed-bottom offshore turbines in terms of levelized cost of energy. The preliminary platform design, as well as early experimental design assessments, are critical elements in the overall design process. In this contribution, a brief review of current floating offshore wind turbine platform pre-design and scaled testing methodologies is provided, with a focus on their ability to accommodate the coupled dynamic behaviour of floating offshore wind systems. The exemplary design and testing methodology for a monolithic concrete spar platform as performed within the European KIC AFOSP project is presented. Results from the experimental tests compared to numerical simulations are presented and analysed and show very good agreement for relevant basic dynamic platform properties. Extreme and fatigue loads and cost analysis of the AFOSP system confirm the viability of the presented design process. In summary, the exemplary application of the reduced design and testing methodology for AFOSP confirms that it represents a viable procedure during pre-design of floating offshore wind turbine platforms. PMID:25583870

  11. Meta-Study as Diagnostic: Toward Content Over Form in Qualitative Synthesis.

    PubMed

    Frost, Julia; Garside, Ruth; Cooper, Chris; Britten, Nicky

    2016-02-01

    Having previously conducted qualitative syntheses of the diabetes literature, we wanted to explore the changes in theoretical approaches, methodological practices, and the construction of substantive knowledge which have recently been presented in the qualitative diabetes literature. The aim of this research was to explore the feasibility of synthesizing existing qualitative syntheses of patient perspectives of diabetes using meta-study methodology. A systematic review of qualitative literature, published between 2000 and 2013, was conducted. Six articles were identified as qualitative syntheses. The meta-study methodology was used to compare the theoretical, methodological, analytic, and synthetic processes across the six studies, exploring the potential for an overarching synthesis. We identified that while research questions have increasingly concentrated on specific aspects of diabetes, the focus on systematic review processes has led to the neglect of qualitative theory and methods. This can inhibit the production of compelling results with meaningful clinical applications. Although unable to produce a synthesis of syntheses, we recommend that researchers who conduct qualitative syntheses pay equal attention to qualitative traditions and systematic review processes, to produce research products that are both credible and applicable. © The Author(s) 2015.

  12. Independent evaluation of the transit retrofit package safety applications : final report.

    DOT National Transportation Integrated Search

    2015-02-01

    This report presents the methodology and results of the independent evaluation of retrofit safety packages installed on transit vehicles in the Safety Pilot Model Deployment, part of the United States Department of Transportation's Intelligent T...

  13. Diagnostic radiograph based 3D bone reconstruction framework: application to the femur.

    PubMed

    Gamage, P; Xie, S Q; Delmas, P; Xu, W L

    2011-09-01

    Three dimensional (3D) visualization of anatomy plays an important role in image guided orthopedic surgery and ultimately motivates minimally invasive procedures. However, direct 3D imaging modalities such as Computed Tomography (CT) are restricted to a minority of complex orthopedic procedures. Thus the diagnostics and planning of many interventions still rely on two dimensional (2D) radiographic images, where the surgeon has to mentally visualize the anatomy of interest. The purpose of this paper is to apply and validate a bi-planar 3D reconstruction methodology driven by prominent bony anatomy edges and contours identified on orthogonal radiographs. The results obtained through the proposed methodology are benchmarked against 3D CT scan data to assess the accuracy of reconstruction. The human femur has been used as the anatomy of interest throughout the paper. The novelty of this methodology is that it not only involves the outer contours of the bony anatomy in the reconstruction but also several key interior edges identifiable on radiographic images. Hence, this framework is not simply limited to long bones, but is generally applicable to a multitude of other bony anatomies as illustrated in the results section. Copyright © 2010 Elsevier Ltd. All rights reserved.

  14. Application of tolerance limits to the characterization of image registration performance.

    PubMed

    Fedorov, Andriy; Wells, William M; Kikinis, Ron; Tempany, Clare M; Vangel, Mark G

    2014-07-01

    Deformable image registration is used increasingly in image-guided interventions and other applications. However, validation and characterization of registration performance remain areas that require further study. We propose an analysis methodology for deriving tolerance limits on the initial conditions for deformable registration that reliably lead to a successful registration. This approach results in a concise summary of the probability of registration failure, while accounting for the variability in the test data. The (β, γ) tolerance limit can be interpreted as a value of the input parameter that leads to a successful registration outcome in at least 100β% of cases with 100γ% confidence. The utility of the methodology is illustrated by summarizing the performance of a deformable registration algorithm evaluated in three different experimental setups of increasing complexity. Our examples are based on clinical data collected during MRI-guided prostate biopsy, registered using a publicly available deformable registration tool. The results indicate that the proposed methodology can be used to generate concise graphical summaries of the experiments, as well as a probabilistic estimate of the registration outcome for a future sample. Its use may facilitate improved objective assessment, comparison, and retrospective stress-testing of deformable registration.
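
    A minimal sketch of the (β, γ) idea in nonparametric form is given below: for a sample of n registration trials, the smallest order-statistic rank r whose coverage exceeds β with confidence γ follows from the Beta(r, n - r + 1) distribution of F(x_(r)). The data and units are synthetic, and the paper's actual derivation may differ.

```python
# Nonparametric (beta, gamma) upper tolerance limit sketch, assuming
# larger initial misalignment is worse; F(x_(r)) ~ Beta(r, n - r + 1).
import numpy as np
from scipy import stats

def tolerance_rank(n, beta_content, gamma_conf):
    # Smallest order-statistic rank r such that x_(r) covers at least
    # beta_content of the distribution with confidence gamma_conf.
    for r in range(1, n + 1):
        if stats.beta.sf(beta_content, r, n - r + 1) >= gamma_conf:
            return r
    return None  # sample too small for the requested beta/gamma

rng = np.random.default_rng(0)
errors = np.sort(rng.gamma(2.0, 1.5, size=100))  # synthetic errors (mm)
r = tolerance_rank(len(errors), beta_content=0.90, gamma_conf=0.95)
if r is not None:
    print(f"(0.90, 0.95) upper tolerance limit: {errors[r - 1]:.2f} mm")
```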

  15. Exploring the bases for a mixed reality stroke rehabilitation system, Part I: A unified approach for representing action, quantitative evaluation, and interactive feedback

    PubMed Central

    2011-01-01

    Background Although principles based in motor learning, rehabilitation, and human-computer interfaces can guide the design of effective interactive systems for rehabilitation, a unified approach that connects these key principles into an integrated design, and can form a methodology that can be generalized to interactive stroke rehabilitation, is presently unavailable. Results This paper integrates phenomenological approaches to interaction and embodied knowledge with rehabilitation practices and theories to achieve the basis for a methodology that can support effective adaptive, interactive rehabilitation. Our resulting methodology provides guidelines for the development of an action representation, quantification of action, and the design of interactive feedback. As Part I of a two-part series, this paper presents key principles of the unified approach. Part II then describes the application of this approach within the implementation of the Adaptive Mixed Reality Rehabilitation (AMRR) system for stroke rehabilitation. Conclusions The accompanying principles for composing novel mixed reality environments for stroke rehabilitation can advance the design and implementation of effective mixed reality systems for the clinical setting, and ultimately be adapted for home-based application. They furthermore can be applied to other rehabilitation needs beyond stroke. PMID:21875441

  16. [Introduction to Exploratory Factor Analysis (EFA)].

    PubMed

    Martínez, Carolina Méndez; Sepúlveda, Martín Alonso Rondón

    2012-03-01

    Exploratory Factor Analysis (EFA) has become one of the most frequently used statistical techniques, especially in the medical and social sciences. Given its popularity, it is essential to understand the basic concepts necessary for its proper application and to take into consideration the main strengths and weaknesses of this technique. The aims are to present in a clear and concise manner the main applications of this technique, to determine the basic requirements for its use by providing a step-by-step description of its methodology, and to establish the elements that must be taken into account during its preparation in order to avoid erroneous results and interpretations. Narrative review. This review identifies the basic concepts and briefly describes the objectives, design, assumptions, and methodology to achieve factor derivation, global adjustment evaluation, and adequate interpretation of results. Copyright © 2012 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.
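
    By way of illustration, a minimal EFA run is sketched below using scikit-learn's FactorAnalysis (the rotation option requires a recent scikit-learn); the items, loadings, and the choice of three factors are invented for the example.

```python
# Minimal exploratory factor analysis sketch; synthetic survey items.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
# 200 respondents x 9 observed items driven by 3 latent factors.
latent = rng.normal(size=(200, 3))
loadings = rng.normal(scale=0.8, size=(3, 9))
items = latent @ loadings + rng.normal(scale=0.5, size=(200, 9))

X = StandardScaler().fit_transform(items)      # standardize items first
fa = FactorAnalysis(n_components=3, rotation="varimax").fit(X)
print(np.round(fa.components_.T, 2))           # item-by-factor loadings
```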

  17. A methodology for accident analysis of fusion breeder blankets and its application to helium-cooled lead–lithium blanket

    DOE PAGES

    Panayotov, Dobromir; Poitevin, Yves; Grief, Andrew; ...

    2016-09-23

    'Fusion for Energy' (F4E) is designing, developing, and implementing the European Helium-Cooled Lead-Lithium (HCLL) and Helium-Cooled Pebble-Bed (HCPB) Test Blanket Systems (TBSs) for ITER (Nuclear Facility INB-174). Safety demonstration is an essential element for the integration of these TBSs into ITER, and accident analysis is one of its critical components. A systematic approach to accident analysis has been developed under the F4E contract on TBS safety analyses. F4E technical requirements, together with Amec Foster Wheeler and INL efforts, have resulted in a comprehensive methodology for fusion breeding blanket accident analysis that addresses the specificity of the breeding blanket designs, materials, and phenomena while remaining consistent with the approach already applied to ITER accident analyses. Furthermore, the methodology phases are illustrated in the paper by its application to the EU HCLL TBS using both MELCOR and RELAP5 codes.

  18. Evaluation of the instream flow incremental methodology by U.S. Fish and Wildlife Service field users

    USGS Publications Warehouse

    Armour, Carl L.; Taylor, Jonathan G.

    1991-01-01

    This paper summarizes results of a survey conducted in 1988 of 57 U.S. Fish and Wildlife Service field offices. The purpose was to document opinions of biologists experienced in applying the Instream Flow Incremental Methodology (IFIM). Responses were received from 35 offices where 616 IFIM applications were reported. The existence of six monitoring studies designed to evaluate the adequacy of flows provided at sites was confirmed. The two principal categories reported as stumbling blocks to the successful application of IFIM were beliefs that the methodology is technically too simplistic or that it is too complex to apply. Recommendations receiving the highest scores for future initiatives to enhance IFIM use were (1) training and workshops for field biologists; and (2) improving suitability index (SI) curves and computer models, and evaluating the relationship of weighted useable area (WUA) to fish responses. The authors concur that emphasis for research should be on addressing technical concerns about SI curves and WUA.

  19. A methodology for extending domain coverage in SemRep.

    PubMed

    Rosemblat, Graciela; Shin, Dongwook; Kilicoglu, Halil; Sneiderman, Charles; Rindflesch, Thomas C

    2013-12-01

    We describe a domain-independent methodology to extend SemRep coverage beyond the biomedical domain. SemRep, a natural language processing application originally designed for biomedical texts, uses the knowledge sources provided by the Unified Medical Language System (UMLS©). Ontological and terminological extensions to the system are needed in order to support other areas of knowledge. We extended SemRep's application by developing a semantic representation of a previously unsupported domain. This was achieved by adapting well-known ontology engineering phases and integrating them with the UMLS knowledge sources on which SemRep crucially depends. While the process to extend SemRep coverage has been successfully applied in earlier projects, this paper presents in detail the step-wise approach we followed and the mechanisms implemented. A case study in the field of medical informatics illustrates how the ontology engineering phases have been adapted for optimal integration with the UMLS. We provide qualitative and quantitative results, which indicate the validity and usefulness of our methodology. Published by Elsevier Inc.

  20. Applications of Quantum Cascade Laser Spectroscopy in the Analysis of Pharmaceutical Formulations.

    PubMed

    Galán-Freyle, Nataly J; Pacheco-Londoño, Leonardo C; Román-Ospino, Andrés D; Hernandez-Rivera, Samuel P

    2016-09-01

    Quantum cascade laser spectroscopy was used to quantify active pharmaceutical ingredient content in a model formulation. The analyses were conducted in non-contact mode by mid-infrared diffuse reflectance. Measurements were carried out at a distance of 15 cm, covering the spectral range 1000-1600 cm⁻¹. Calibrations were generated by applying multivariate analysis using partial least squares models. Among the figures of merit of the proposed methodology are high analytical sensitivity, equivalent to 0.05% active pharmaceutical ingredient in the formulation, high repeatability (2.7%), high reproducibility (5.4%), and a low limit of detection (1%). The relatively high power of the quantum-cascade-laser-based spectroscopic system resulted in the design of detection and quantification methodologies for pharmaceutical applications with high accuracy and precision that are comparable to those of methodologies based on near-infrared spectroscopy, attenuated total reflection mid-infrared Fourier transform infrared spectroscopy, and Raman spectroscopy. © The Author(s) 2016.
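
    The calibration step described above can be sketched as follows with a partial least squares model; the spectra, API levels, and number of latent variables are synthetic stand-ins, not the paper's measurements.

```python
# Hedged sketch of a PLS calibration of API content from spectra.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n_samples, n_wavenumbers = 60, 300          # spectra over 1000-1600 cm-1
api = rng.uniform(0.0, 10.0, n_samples)     # % API in the formulation
# One synthetic absorption band whose intensity scales with API content.
pure = np.exp(-0.5 * ((np.linspace(0, 1, n_wavenumbers) - 0.4) / 0.05) ** 2)
spectra = np.outer(api, pure) + rng.normal(scale=0.05,
                                           size=(n_samples, n_wavenumbers))

pls = PLSRegression(n_components=5)
r2 = cross_val_score(pls, spectra, api, cv=5, scoring="r2")
print(f"cross-validated R^2: {r2.mean():.3f}")
```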

  1. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 3: Structure and listing of programs

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating the failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
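
    To make the flavor of such an assessment concrete, the sketch below propagates parameter uncertainty through a simple stress-versus-strength limit state by Monte Carlo; the distributions and values are invented and far simpler than the documented PFA procedure.

```python
# Illustrative Monte Carlo failure-probability sketch: propagate
# parameter and model-accuracy uncertainty through a limit state.
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
strength = rng.lognormal(mean=np.log(500.0), sigma=0.08, size=n)  # MPa
stress = rng.normal(loc=380.0, scale=35.0, size=n)                # MPa
model_error = rng.normal(loc=1.0, scale=0.05, size=n)             # accuracy

failures = stress * model_error > strength
print(f"estimated failure probability: {failures.mean():.2e}")
```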

  2. Compressed learning and its applications to subcellular localization.

    PubMed

    Zheng, Zhong-Long; Guo, Li; Jia, Jiong; Xie, Chen-Mao; Zeng, Wen-Cai; Yang, Jie

    2011-09-01

    One of the main challenges in biological applications is to accurately predict protein subcellular localization in an automatic fashion. To achieve this, a wide variety of machine learning methods have been proposed in recent years. Most of them focus on finding the optimal classification scheme, and fewer take into account simplifying the complexity of biological systems. Traditionally, such bio-data are analyzed by first performing feature selection before classification. Motivated by CS (Compressed Sensing) theory, we propose a methodology which performs compressed learning with a sparseness criterion such that feature selection and dimension reduction are merged into one analysis. The proposed methodology decreases the complexity of the biological system while increasing protein subcellular localization accuracy. Experimental results are quite encouraging, indicating that the aforementioned sparse methods are quite promising in dealing with complicated biological problems, such as predicting the subcellular localization of Gram-negative bacterial proteins.
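
    A minimal sketch of the compressed-learning idea follows, with a sparse random projection standing in for the CS measurement step ahead of a classifier; the data are random stand-ins for protein features, and the authors' actual sparseness criterion may differ.

```python
# Compressed learning sketch: random projection merges dimension
# reduction into the pipeline before classification.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.random_projection import SparseRandomProjection
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
X = rng.normal(size=(400, 5000))              # high-dimensional features
w = np.zeros(5000); w[:25] = 2.0              # few informative dimensions
y = (X @ w + rng.normal(size=400) > 0).astype(int)

clf = make_pipeline(SparseRandomProjection(n_components=300, random_state=0),
                    LogisticRegression(max_iter=1000))
print(cross_val_score(clf, X, y, cv=5).mean())
```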

  3. Information Gain Based Dimensionality Selection for Classifying Text Documents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dumidu Wijayasekara; Milos Manic; Miles McQueen

    2013-06-01

    Selecting the optimal dimensions for various knowledge extraction applications is an essential component of data mining. Dimensionality selection techniques are utilized in classification applications to increase the classification accuracy and reduce the computational complexity. In text classification, where the dimensionality of the dataset is extremely high, dimensionality selection is even more important. This paper presents a novel, genetic algorithm based methodology for dimensionality selection in text mining applications that utilizes information gain. The presented methodology uses the information gain of each dimension to change the mutation probability of chromosomes dynamically. Since the information gain is calculated a priori, the computational complexity is not affected. The presented method was tested on a specific text classification problem and compared with conventional genetic algorithm based dimensionality selection. The results show an improvement of 3% in the true positives and 1.6% in the true negatives over conventional dimensionality selection methods.
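
    The a priori information-gain computation at the heart of the method can be sketched as below, using mutual information between terms and class labels on a toy corpus; mapping the scores to GA mutation probabilities is shown only schematically.

```python
# Information gain (mutual information) per text dimension, computed
# once, a priori; corpus and labels are toy stand-ins.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.feature_selection import mutual_info_classif

docs = ["grid fault detected", "fault in relay", "routine maintenance log",
        "scheduled maintenance ok", "relay fault alarm", "log entry ok"]
labels = [1, 1, 0, 0, 1, 0]

vec = CountVectorizer().fit(docs)
gain = mutual_info_classif(vec.transform(docs), labels,
                           discrete_features=True)
# Schematic GA step: normalize gains into per-dimension mutation probs.
probs = gain / gain.sum()
print(dict(zip(vec.get_feature_names_out(), probs.round(3))))
```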

  4. An Overview of R in Health Decision Sciences.

    PubMed

    Jalal, Hawre; Pechlivanoglou, Petros; Krijkamp, Eline; Alarid-Escudero, Fernando; Enns, Eva; Hunink, M G Myriam

    2017-10-01

    As the complexity of health decision science applications increases, high-level programming languages are increasingly adopted for statistical analyses and numerical computations. These programming languages facilitate sophisticated modeling, model documentation, and analysis reproducibility. Among the high-level programming languages, the statistical programming framework R is gaining increased recognition. R is freely available, cross-platform compatible, and open source. It is supported by a large community of users who have generated an extensive collection of well-documented packages and functions. These functions facilitate applications of health decision science methodology as well as the visualization and communication of results. Although R's popularity is increasing among health decision scientists, methodological extensions of R in the field of decision analysis remain isolated. The purpose of this article is to provide an overview of existing R functionality that is applicable to the various stages of decision analysis, including model design, input parameter estimation, and analysis of model outputs.

  5. Fast underdetermined BSS architecture design methodology for real time applications.

    PubMed

    Mopuri, Suresh; Reddy, P Sreenivasa; Acharyya, Amit; Naik, Ganesh R

    2015-01-01

    In this paper, we propose a high speed architecture design methodology for the Under-determined Blind Source Separation (UBSS) algorithm using our recently proposed high speed Discrete Hilbert Transform (DHT), targeting real time applications. In the UBSS algorithm, unlike typical BSS, the number of sensors is less than the number of sources, which is of more interest in real time applications. The DHT architecture has been implemented based on a sub-matrix multiplication method to compute the M-point DHT, which uses the N-point architecture recursively, where M is an integer multiple of N. The DHT architecture and the state-of-the-art architecture are coded in VHDL for a 16-bit word length, and ASIC implementation is carried out using UMC 90-nm technology at VDD = 1 V and a 1 MHz clock frequency. The implementation and experimental comparison results show that the proposed DHT design is two times faster than the state-of-the-art architecture.
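
    For reference, a software version of the discrete Hilbert transform that such hardware implements can be sketched in a few lines via the frequency domain; this is a numerical reference, not the VHDL architecture.

```python
# FFT-based discrete Hilbert transform: H(x) = IFFT(-j*sign(f)*FFT(x)).
import numpy as np

def dht(x):
    X = np.fft.fft(x)
    sign = np.sign(np.fft.fftfreq(len(x)))
    return np.real(np.fft.ifft(-1j * sign * X))

# Sanity check: the Hilbert transform of cos is sin.
t = np.linspace(0, 1, 256, endpoint=False)
x = np.cos(2 * np.pi * 8 * t)
print(np.allclose(dht(x), np.sin(2 * np.pi * 8 * t), atol=1e-9))  # True
```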

  6. Towards application of rule learning to the meta-analysis of clinical data: an example of the metabolic syndrome.

    PubMed

    Wojtusiak, Janusz; Michalski, Ryszard S; Simanivanh, Thipkesone; Baranova, Ancha V

    2009-12-01

    Systematic reviews and meta-analyses of published clinical datasets are an important part of medical research. By combining the results of multiple studies, meta-analysis is able to increase confidence in its conclusions, validate particular study results, and sometimes lead to new findings. Extensive theory has been built on how to aggregate results from multiple studies and arrive at statistically valid conclusions. Surprisingly, very little has been done to adopt advanced machine learning methods to support meta-analysis. In this paper we describe a novel machine learning methodology that is capable of inducing accurate and easy-to-understand attributional rules from aggregated data. Thus, the methodology can be used to support traditional meta-analysis in systematic reviews. Most machine learning applications give primary attention to the predictive accuracy of the learned knowledge, and lesser attention to its understandability. Here we employed attributional rules, a special form of rules that are relatively easy to interpret for medical experts who are not necessarily trained in statistics and meta-analysis. The methodology has been implemented and initially tested on a set of publicly available clinical data describing patients with metabolic syndrome (MS). The objective of this application was to determine rules describing combinations of clinical parameters used for metabolic syndrome diagnosis, and to develop rules for predicting whether particular patients are likely to develop secondary complications of MS. The aggregated clinical data were retrieved from 20 separate hospital cohorts that included 12 groups of patients with present liver disease symptoms and 8 control groups of healthy subjects. A total of 152 attributes were used, most of which, however, were measured in different studies. The twenty most common attributes were selected for the rule learning process. By applying the developed rule learning methodology we arrived at several different possible rulesets that can be used to predict three considered complications of MS, namely nonalcoholic fatty liver disease (NAFLD), simple steatosis (SS), and nonalcoholic steatohepatitis (NASH).

  7. Analysis of SBIR phase I and phase II review results at the National Institutes of Health.

    PubMed

    Vener, K J; Calkins, B M

    1991-09-01

    A cohort of phase I and phase II summary statements for SBIR grant applications was evaluated to determine the strengths and weaknesses in approved and disapproved applications. Outcome variables (disapproval or unfunded status) were examined with respect to exposure variables (strengths or shortcomings). Logistic regression models were developed for comparisons to measure the predictive value of shortcomings and strengths for the outcomes. Disapproved phase I results were compared with an earlier 1985 study. Although the magnitude of the frequencies of shortcomings was greater in the present study, the relative rankings within shortcoming class were more alike than different. Also, the frequencies of shortcomings were, with one exception, not significantly different in the two studies. Differences in the summary statement review may have accounted for some differences observed between the 1985 data and the results of the present study. Comparisons of Approved/Disapproved and Approved-Unfunded/Funded yielded the following observations. For phase I applicants, the lack of a clearly stated, testable hypothesis, a poorly qualified or described investigative team, and inadequate methodological approaches contributed significantly (in that order) to a rating of disapproval. A critical flaw for phase II proposals was failure to accomplish the objectives of the phase I study. Methodological issues also dominate the distinctions in both comparison groups. A clear result of the data presented here and that published previously is that SBIR applicants need continuing assistance to improve their chances of success. These results should serve as a guide to assist NIH staff as they provide information to prospective applicants focusing on key elements of the application. A continuing review of the SBIR program would be helpful to evaluate the quality of the submitted science.

  8. Optical diagnosis of cervical cancer by higher order spectra and boosting

    NASA Astrophysics Data System (ADS)

    Pratiher, Sawon; Mukhopadhyay, Sabyasachi; Barman, Ritwik; Pratiher, Souvik; Pradhan, Asima; Ghosh, Nirmalya; Panigrahi, Prasanta K.

    2017-03-01

    In this contribution, we report the application of higher order statistical moments with decision tree and ensemble based learning methodology to the development of diagnostic algorithms for the optical diagnosis of cancer. The classification results were compared to those obtained with independent feature extractors such as linear discriminant analysis (LDA). The methodology, which uses higher order statistics with a boosted classifier, achieves higher specificity and sensitivity while being much faster than other time-frequency domain based methods.
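
    A hedged sketch of this classification stage appears below: higher-order moments (variance, skewness, kurtosis) serve as features for a boosted decision-tree ensemble; the signals are synthetic, and the authors' feature set and tuning are certainly richer.

```python
# Higher-order moments as features for a boosted classifier; synthetic
# spectra stand in for the optical measurements.
import numpy as np
from scipy import stats
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
signals = rng.normal(size=(200, 512))
signals[100:] += rng.gamma(1.0, 0.3, size=(100, 512))  # second class
y = np.repeat([0, 1], 100)

# Features: higher-order statistical moments per signal.
X = np.column_stack([signals.var(axis=1),
                     stats.skew(signals, axis=1),
                     stats.kurtosis(signals, axis=1)])

clf = AdaBoostClassifier(n_estimators=100, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())
```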

  9. Testing for genetically modified organisms (GMOs): Past, present and future perspectives.

    PubMed

    Holst-Jensen, Arne

    2009-01-01

    This paper presents an overview of GMO testing methodologies and how these have evolved and may evolve in the next decade. Challenges and limitations for the application of the test methods, as well as for the interpretation of results produced with the methods, are highlighted and discussed, bearing in mind the various interests and competences of the involved stakeholders. To better understand the suitability and limitations of detection methodologies, the evolution of transformation processes for the creation of GMOs is briefly reviewed.

  10. The Methodology for Developing Mobile Agent Application for Ubiquitous Environment

    NASA Astrophysics Data System (ADS)

    Matsuzaki, Kazutaka; Yoshioka, Nobukazu; Honiden, Shinichi

    This study provides a methodology that enables flexible and reusable development of mobile agent applications for a mobility-aware indoor environment. The methodology, named the Workflow-Awareness (WFA) model, is based on the concept of a pair of mobile agents cooperating to perform a given task. A monolithic mobile agent application with numerous concerns in a mobility-aware setting is divided into a master agent (MA) and a shadow agent (SA) according to the type of task. The MA executes the main application logic, which includes monitoring a user's physical movement and coordinating various services. The SA performs additional tasks depending on the environment to aid the MA in achieving efficient execution without losing application logic. Workflow-awareness means that the SA knows the MA's execution state transitions, so that the SA can provide the proper task at the proper time. A prototype implementation of the methodology has been carried out with practical use of AspectJ, which is used to automate WFA by weaving communication modules into both the MA and the SA. The usefulness of this methodology is analyzed with respect to efficiency and software engineering aspects. As for efficiency, the overhead of WFA is relatively small compared to the whole execution time. From the software engineering point of view, WFA can provide a mechanism to deploy one application in various situations.

  11. Application of ion chromatography in pharmaceutical and drug analysis.

    PubMed

    Jenke, Dennis

    2011-08-01

    Ion chromatography (IC) has developed and matured into an important analytical methodology in a number of diverse applications and industries, including pharmaceuticals. This manuscript provides a review of IC applications for the determination of active and inactive ingredients, excipients, degradation products, and impurities relevant to pharmaceutical analyses, and thus serves as a resource for investigators looking for insights into the use of the IC methodology in this field of application.

  12. Efficient Robust Optimization of Metal Forming Processes using a Sequential Metamodel Based Strategy

    NASA Astrophysics Data System (ADS)

    Wiebenga, J. H.; Klaseboer, G.; van den Boogaard, A. H.

    2011-08-01

    The coupling of Finite Element (FE) simulations to mathematical optimization techniques has contributed significantly to product improvements and cost reductions in the metal forming industries. The next challenge is to bridge the gap between deterministic optimization techniques and the industrial need for robustness. This paper introduces a new and generally applicable structured methodology for modeling and solving robust optimization problems. Stochastic design variables or noise variables are taken into account explicitly in the optimization procedure. The metamodel-based strategy is combined with a sequential improvement algorithm to efficiently increase the accuracy of the objective function prediction. This is only done at regions of interest containing the optimal robust design. Application of the methodology to an industrial V-bending process resulted in valuable process insights and an improved robust process design. Moreover, a significant improvement of the robustness (>2σ) was obtained by minimizing the deteriorating effects of several noise variables. The robust optimization results demonstrate the general applicability of the robust optimization strategy and underline the importance of including uncertainty and robustness explicitly in the numerical optimization procedure.
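
    The metamodel idea can be sketched as follows: a Gaussian-process surrogate is fitted to simulation samples over both design and noise variables, and the robust optimum minimizes the predicted mean plus a spread penalty over the noise space; the "simulation" and all settings are invented stand-ins, without the paper's sequential improvement step.

```python
# Conceptual metamodel-based robust optimization sketch.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def simulate(design, noise):           # cheap stand-in for an FE run
    return (design - 0.3) ** 2 + 0.5 * noise * design

rng = np.random.default_rng(11)
d = rng.uniform(0, 1, 40); z = rng.normal(0, 1, 40)   # DOE samples
gp = GaussianProcessRegressor(normalize_y=True).fit(
    np.column_stack([d, z]), simulate(d, z))

designs = np.linspace(0, 1, 101)
noise_mc = rng.normal(0, 1, 200)
robust = []
for x in designs:                      # mean + 2*std over the noise space
    pred = gp.predict(np.column_stack([np.full(200, x), noise_mc]))
    robust.append(pred.mean() + 2 * pred.std())
print(f"robust optimum near design = {designs[int(np.argmin(robust))]:.2f}")
```

    Note how the robustness penalty shifts the selected design away from the deterministic minimum (here at 0.3), which is the qualitative effect the paper exploits.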

  13. Methodology of management of dredging operations II. Applications.

    PubMed

    Junqua, G; Abriak, N E; Gregoire, P; Dubois, V; Mac Farlane, F; Damidot, D

    2006-04-01

    This paper presents the new methodology of management of dredging operations. Derived partly from existing methodologies (OECD, PNUE, AIPCN), it aims to be more comprehensive, combining the qualities and complementarities of the previous methodologies. The methodology has been applied to the site of the Port of Dunkirk (France). A characterization of the sediments of this site has allowed a zoning of the Port to be established into zones of probable sediment homogeneity. Moreover, sources of pollution have been identified, with an aim of prevention. Ways of improving the reuse of dredged waste have also been developed to answer regional needs, from the point of view of competitive and territorial intelligence. Their development has required a mutualisation of resources between professionals, research centres, and local communities, according to principles of industrial ecology. Lastly, a MultiCriteria Decision-Making Aid (MCDMA) tool has been used to determine the most relevant scenario (or alternative, or action) for a dredging operation intended by the Port of Dunkirk. These applications have confirmed the relevance of this methodology for the management of dredging operations.

  14. Analysis of Software Development Methodologies to Build Safety Software Applications for the SATEX-II: A Mexican Experimental Satellite

    NASA Astrophysics Data System (ADS)

    Aguilar Cisneros, Jorge; Vargas Martinez, Hector; Pedroza Melendez, Alejandro; Alonso Arevalo, Miguel

    2013-09-01

    Mexico is a country that is only beginning to build experience in software for satellite applications. This is a delicate situation because in the near future we will need to develop software for the SATEX-II (Mexican Experimental Satellite). SATEX-II is a project of SOMECyTA (the Mexican Society of Aerospace Science and Technology). We have experience applying software development methodologies, such as TSP (Team Software Process) and SCRUM, in other areas. We analyzed these methodologies and concluded that they can be applied to develop software for the SATEX-II; we also supported these methodologies with the ESA PSS-05-0 standard, in particular ESA PSS-05-11. Our analysis focused on the main characteristics of each methodology and how these methodologies could be used together with the ESA PSS-05-0 standards. Our outcomes, in general, may be used by teams who need to build small satellites; in particular, they will be used when we build the on-board software applications for the SATEX-II.

  15. Four applications of a software data collection and analysis methodology

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Selby, Richard W., Jr.

    1985-01-01

    The evaluation of software technologies suffers because of the lack of quantitative assessment of their effect on software development and modification. A seven-step data collection and analysis methodology couples software technology evaluation with software measurement. Four in-depth applications of the methodology are presented. The four studies represent each of the general categories of analyses on the software product and development process: blocked subject-project studies, replicated project studies, multi-project variation studies, and single project strategies. The four applications are in the areas of, respectively, software testing, cleanroom software development, characteristic software metric sets, and software error analysis.

  16. An Approach for Preoperative Planning and Performance of MR-guided Interventions Demonstrated With a Manual Manipulator in a 1.5T MRI Scanner

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seimenis, Ioannis; Tsekos, Nikolaos V.; Keroglou, Christoforos

    2012-04-15

    Purpose: The aim of this work was to develop and test a general methodology for the planning and performance of robot-assisted, MR-guided interventions. This methodology also includes the employment of software tools with appropriately tailored routines to effectively exploit the capabilities of MRI and address the relevant spatial limitations. Methods: The described methodology consists of: (1) patient-customized feasibility study that focuses on the geometric limitations imposed by the gantry, the robotic hardware, and interventional tools, as well as the patient; (2) stereotactic preoperative planning for initial positioning of the manipulator and alignment of its end-effector with a selected target; and (3) real-time, intraoperative tool tracking and monitoring of the actual intervention execution. Testing was performed inside a standard 1.5T MRI scanner in which the MR-compatible manipulator is deployed to provide the required access. Results: A volunteer imaging study demonstrates the application of the feasibility stage. A phantom study on needle targeting is also presented, demonstrating the applicability and effectiveness of the proposed preoperative and intraoperative stages of the methodology. For this purpose, a manually actuated, MR-compatible robotic manipulation system was used to accurately acquire a prescribed target through alternative approaching paths. Conclusions: The methodology presented and experimentally examined allows the effective performance of MR-guided interventions. It is suitable for, but not restricted to, needle-targeting applications assisted by a robotic manipulation system, which can be deployed inside a cylindrical scanner to provide the required access to the patient facilitating real-time guidance and monitoring.

  17. Studies to determine the effectiveness of longitudinal channelizing devices in work zones.

    DOT National Transportation Integrated Search

    2011-01-01

    This report describes the methodology and results of analyses performed to determine whether the following longitudinal channelizing device (LCD) applications improve the traffic safety and operations of work zones relative to the use of standard ...

  18. 48 CFR 917.601 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    Section 917.601, Federal Acquisition Regulations System, DEPARTMENT OF ENERGY CONTRACTING METHODS AND..., performance-based contracting concepts and methodologies through the application of results-oriented...

  19. 48 CFR 917.601 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    Section 917.601, Federal Acquisition Regulations System, DEPARTMENT OF ENERGY CONTRACTING METHODS AND..., performance-based contracting concepts and methodologies through the application of results-oriented...

  20. Using continuous process improvement methodology to standardize nursing handoff communication.

    PubMed

    Klee, Kristi; Latta, Linda; Davis-Kirsch, Sallie; Pecchia, Maria

    2012-04-01

    The purpose of this article was to describe the use of continuous performance improvement (CPI) methodology to standardize nurse shift-to-shift handoff communication. The goals of the process were to standardize the content and process of shift handoff, improve patient safety, increase patient and family involvement in the handoff process, and decrease end-of-shift overtime. This article describes the process changes made over a 4-year period as a result of applying the plan-do-check-act procedure, which is an integral part of the CPI methodology, and discusses further work needed to continue to refine this critical nursing care process. Copyright © 2012 Elsevier Inc. All rights reserved.

  1. New methodology for fast prediction of wheel wear evolution

    NASA Astrophysics Data System (ADS)

    Apezetxea, I. S.; Perez, X.; Casanueva, C.; Alonso, A.

    2017-07-01

    In railway applications, wear prediction at the wheel-rail interface is a fundamental matter for studying problems such as wheel lifespan and the evolution of vehicle dynamic characteristics with time. However, one of the principal drawbacks of the existing methodologies for calculating the wear evolution is the computational cost. This paper proposes a new wear prediction methodology with a reduced computational cost. The methodology is based on two main steps: the first is the substitution of calculations over the whole network by the calculation of the contact conditions at certain characteristic points, from whose results the wheel wear evolution can be inferred; the second is the substitution of the dynamic calculation (time-integration calculations) by the quasi-static calculation (the solution of the quasi-static situation of a vehicle at a certain point, which is equivalent to neglecting the acceleration terms in the dynamic equations). These simplifications allow a significant reduction of the computational cost while maintaining an acceptable level of accuracy (errors of the order of 5-10%). Several case studies are analysed in the paper with the objective of assessing the proposed methodology. The results obtained in the case studies show that the proposed methodology is valid for an arbitrary vehicle running through an arbitrary track layout.
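
    A schematic of the fast wear loop might look like the sketch below, where a quasi-static contact evaluation at a few characteristic points feeds an Archard-type wear update; the contact model, constants, and units are invented placeholders, not the paper's formulation.

```python
# Schematic fast wear loop with an Archard-type law; all values are
# illustrative placeholders.
K_WEAR = 1e-4          # wear coefficient (assumed, bundled units)
KM_PER_STEP = 1000.0   # running distance represented by one update (km)

def quasi_static_contact(load_factor, wheel_radius):
    # Stand-in for the quasi-static solution at one characteristic
    # point: returns normal force (N) and sliding per km (assumed).
    return 80e3 * load_factor, 0.02 / wheel_radius

wheel_radius = 0.46                       # m
points = [1.0, 1.3, 0.8]                  # load factors at characteristic points
for _ in range(50):                       # 50 x 1000 km of running
    dw = 0.0
    for lf in points:
        n_force, slide = quasi_static_contact(lf, wheel_radius)
        dw += K_WEAR * n_force * slide * KM_PER_STEP / len(points)
    wheel_radius -= dw * 1e-6             # treat dw as micrometres (assumed)
print(f"worn radius after 50,000 km: {wheel_radius:.4f} m")
```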

  2. Using Modern Methodologies with Maintenance Software

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara A.; Francis, Laurie K.; Smith, Benjamin D.

    2014-01-01

    Jet Propulsion Laboratory uses multi-mission software produced by the Mission Planning and Sequencing (MPS) team to process, simulate, translate, and package the commands that are sent to a spacecraft. MPS works under the auspices of the Multi-Mission Ground Systems and Services (MGSS). This software consists of nineteen applications that are in maintenance. The MPS software is classified as either class B (mission critical) or class C (mission important). The scheduling of tasks is difficult because mission needs must be addressed before any other tasks, and those needs often spring up unexpectedly. Keeping track of the tasks that everyone is working on is also difficult because each person is working on a different software component. Recently the group adopted the Scrum methodology for planning and scheduling tasks. Scrum is one of the newer methodologies typically used in agile development. In the Scrum development environment, teams pick the tasks that are to be completed within a sprint based on priority. The team specifies the sprint length, usually a month or less. Scrum is typically used for new development of one application. The Scrum methodology includes a scrum master, a facilitator who tries to make sure that everything moves smoothly; a product owner, who represents the user(s) of the software; and the team. MPS is not the traditional environment for the Scrum methodology: MPS has many software applications in maintenance, team members who are working on disparate applications, many users, and is interruptible based on mission needs, issues, and requirements. In order to use Scrum, the methodology needed adaptation to MPS. Scrum was chosen because it is adaptable. This paper is about the development of the process for using Scrum, a new development methodology, with a team that works on disparate, interruptible tasks on multiple software applications.

  3. Global-local methodologies and their application to nonlinear analysis

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1989-01-01

    An assessment is made of the potential of different global-local analysis strategies for predicting the nonlinear and postbuckling responses of structures. Two postbuckling problems of composite panels are used as benchmarks and the application of different global-local methodologies to these benchmarks is outlined. The key elements of each of the global-local strategies are discussed and future research areas needed to realize the full potential of global-local methodologies are identified.

  4. Application of Fuzzy Logic to Matrix FMECA

    NASA Astrophysics Data System (ADS)

    Shankar, N. Ravi; Prabhu, B. S.

    2001-04-01

    A methodology combining the benefits of Fuzzy Logic and Matrix FMEA is presented in this paper. The presented methodology extends the risk prioritization beyond the conventional Risk Priority Number (RPN) method. Fuzzy logic is used to calculate the criticality rank. Also the matrix approach is improved further to develop a pictorial representation retaining all relevant qualitative and quantitative information of several FMEA elements relationships. The methodology presented is demonstrated by application to an illustrative example.
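
    A toy version of the fuzzy criticality computation is sketched below, with triangular memberships and a two-rule base defuzzified by a weighted centroid; the membership shapes, rules, and ranks are illustrative assumptions, not the paper's rule base.

```python
# Toy fuzzy-criticality sketch for FMECA-style inputs on a 0-10 scale.
def tri(x, a, b, c):
    # Triangular membership on [a, c] peaking at b (shoulders allowed).
    left = 1.0 if b == a else (x - a) / (b - a)
    right = 1.0 if c == b else (c - x) / (c - b)
    return max(0.0, min(left, right))

def fuzzy_criticality(severity, occurrence, detection):
    low = {k: tri(v, 0, 0, 5) for k, v in
           (("s", severity), ("o", occurrence), ("d", detection))}
    high = {k: tri(v, 5, 10, 10) for k, v in
            (("s", severity), ("o", occurrence), ("d", detection))}
    # Two illustrative rules: all-high -> criticality 9; all-low -> 2.
    rules = [(min(high["s"], high["o"], high["d"]), 9.0),
             (min(low["s"], low["o"], low["d"]), 2.0)]
    total = sum(w for w, _ in rules)
    return sum(w * r for w, r in rules) / total if total else None

print(fuzzy_criticality(severity=8, occurrence=7, detection=6))  # -> 9.0
```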

  5. Regional health care planning: a methodology to cluster facilities using community utilization patterns.

    PubMed

    Delamater, Paul L; Shortridge, Ashton M; Messina, Joseph P

    2013-08-22

    Community-based health care planning and regulation necessitates grouping facilities and areal units into regions of similar health care use. Limited research has explored the methodologies used in creating these regions. We offer a new methodology that clusters facilities based on similarities in patient utilization patterns and geographic location. Our case study focused on Hospital Groups in Michigan, the allocation units used for predicting future inpatient hospital bed demand in the state's Bed Need Methodology. The scientific, practical, and political concerns that were considered throughout the formulation and development of the methodology are detailed. The clustering methodology employs a 2-step K-means + Ward's clustering algorithm to group hospitals. The final number of clusters is selected using a heuristic that integrates both a statistical-based measure of cluster fit and characteristics of the resulting Hospital Groups. Using recent hospital utilization data, the clustering methodology identified 33 Hospital Groups in Michigan. Despite being developed within the politically charged climate of Certificate of Need regulation, we have provided an objective, replicable, and sustainable methodology to create Hospital Groups. Because the methodology is built upon theoretically sound principles of clustering analysis and health care service utilization, it is highly transferable across applications and suitable for grouping facilities or areal units.
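
    The 2-step idea can be sketched as follows: K-means first compresses utilization profiles into many small clusters, then Ward's hierarchical clustering merges the centroids into the final groups; the data, cluster counts, and the final number of groups are stand-ins for the paper's heuristic selection.

```python
# 2-step K-means + Ward grouping sketch on synthetic utilization data.
import numpy as np
from sklearn.cluster import KMeans
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(8)
profiles = rng.random((120, 20))          # 120 hospitals x 20 flow features

km = KMeans(n_clusters=40, n_init=10, random_state=0).fit(profiles)
tree = linkage(km.cluster_centers_, method="ward")
centroid_groups = fcluster(tree, t=33, criterion="maxclust")
hospital_groups = centroid_groups[km.labels_]   # map back to hospitals
print(np.bincount(hospital_groups)[1:])         # group sizes
```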

  6. One common way - The strategic and methodological influence on environmental planning across Europe

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiricka, Alexandra, E-mail: alexandra.jiricka@boku.ac.a; Proebstl, Ulrike, E-mail: ulrike.proebstl@boku.ac.a

    In the last decades the European Union exerted influence on precautionary environmental planning by the establishment of several Directives. The most relevant were the Habitat-Directive, the EIA-Directive, the SEA-Directive and the Water Framework Directive. Comparing these EU policies in the area of environmental precaution it becomes obvious that there is a lot of common ground. Thus, the conclusion seems likely that the European Union, in doing so, has intended to establish general planning concepts through introducing several methodological steps indicated by the regulations. The goal of this article is firstly to point out, which are the common planning principles, convertedmore » by methodological elements and secondly examine the consideration of these planning concepts by the implementation and application in the member states. In this context it is analysed whether the connections and divergences between the directives lead to significant differences in the implementation process. To this aim the directives are shortly introduced and significant steps of the processes regulated by them are outlined. In the second steps the national legal implementation in the Alpine states and its consequences for the practical application are discussed. The results show a heterogeneous application of the EU principles. Within the comparative view on the four directives influence and causalities between the national implementation and the practical application were identified, which can be simplified as four types. Since a coherent strategic and methodological concept for improving environmental precaution planning from part of the EU is noticeable, more unity and comparability within the implementation is desirable, particularly in areas with comparable habitats such as the alpine space. Beyond this the trade-off between the directives poses an important task for the future.« less

  7. Disease Risk Score (DRS) as a Confounder Summary Method: Systematic Review and Recommendations

    PubMed Central

    Tadrous, Mina; Gagne, Joshua J.; Stürmer, Til; Cadarette, Suzanne M.

    2013-01-01

    Purpose: To systematically examine trends and applications of the disease risk score (DRS) as a confounder summary method. Methods: We completed a systematic search of MEDLINE and Web of Science® to identify all English language articles that applied DRS methods. We tabulated the number of publications by year and type (empirical application, methodological contribution, or review paper) and summarized the methods used in empirical applications overall and by publication year (<2000, ≥2000). Results: Of 714 unique articles identified, 97 examined DRS methods and 86 were empirical applications. We observed a bimodal distribution in the number of publications over time, with a peak in 1979-1980 and a resurgence since 2000. The majority of applications with methodological detail derived the DRS using logistic regression (47%), used the DRS as a categorical variable in regression (93%), and applied the DRS in a non-experimental cohort (47%) or case-control (42%) study. Few studies examined effect modification by outcome risk (23%). Conclusion: Use of DRS methods has increased yet remains low. Comparative effectiveness research may benefit from more DRS applications, particularly to examine effect modification by outcome risk. Standardized terminology may facilitate identification, application, and comprehension of DRS methods. More research is needed to support the application of DRS methods, particularly in case-control studies. PMID:23172692
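
    A minimal DRS workflow is sketched below: the outcome model is fitted among the unexposed, everyone is scored, and the exposure-outcome association is examined within DRS quintiles; the simulated data and model choices are illustrative only.

```python
# Disease risk score sketch: fit among unexposed, score all, stratify.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 5000
confounders = rng.normal(size=(n, 4))
exposure = rng.binomial(1, 1 / (1 + np.exp(-confounders[:, 0])))
logit = -2 + 0.5 * confounders.sum(axis=1) + 0.4 * exposure
outcome = rng.binomial(1, 1 / (1 + np.exp(-logit)))

unexposed = exposure == 0
drs_model = LogisticRegression().fit(confounders[unexposed],
                                     outcome[unexposed])
drs = drs_model.predict_proba(confounders)[:, 1]

df = pd.DataFrame(dict(drs5=pd.qcut(drs, 5, labels=False),
                       exposure=exposure, outcome=outcome))
# Outcome rates by DRS quintile and exposure status.
print(df.groupby(["drs5", "exposure"])["outcome"].mean().unstack())
```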

  8. Willingness to use mobile application for smartphone for improving road safety.

    PubMed

    Cardamone, Angelo Stephen; Eboli, Laura; Forciniti, Carmen; Mazzulla, Gabriella

    2016-01-01

    In the last few years mobile devices have reached a large proportion of consumers in both developed and high-growth world economies. In 2013, 97% of the Italian population owned a mobile phone, and 62% owned a smartphone. Application software for mobile devices is widely offered to consumers, and several mobile applications have been oriented toward improving road safety and reducing road accident risk. In this paper, we describe the results of a survey designed to investigate, in advance, the willingness to receive and/or to give information about road conditions by means of mobile devices. Road users were informed about the characteristics of a mobile application, and then they were invited to complete a questionnaire. The experimental data were used to capture road user attitudes toward the use of the smartphone to improve road safety, and to establish preferences for the different features of the proposed mobile application. To this end, we chose the ordered probit model methodology. We demonstrate that the adopted methodology accounts for the differential impacts of the willingness to receive and/or to give information about road conditions on the overall willingness to receive and/or to give information through an application software for mobile devices.
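
    An ordered probit fit of the kind used in the paper can be sketched with statsmodels (OrderedModel is available from roughly version 0.12 onward); the covariates and the 3-point willingness scale below are simulated, not the survey's variables.

```python
# Ordered probit sketch on a simulated 3-level willingness response.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(4)
n = 800
age = rng.uniform(18, 75, n)
daily_km = rng.exponential(20, n)
latent = 0.02 * daily_km - 0.01 * age + rng.normal(size=n)
codes = pd.cut(latent, [-np.inf, -0.3, 0.7, np.inf], labels=False)
willingness = pd.Series(pd.Categorical(codes, categories=[0, 1, 2],
                                       ordered=True))

X = pd.DataFrame({"age": age, "daily_km": daily_km})
res = OrderedModel(willingness, X, distr="probit").fit(method="bfgs",
                                                       disp=False)
print(res.summary())
```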

  9. TAPping into argumentation: Developments in the application of Toulmin's Argument Pattern for studying science discourse

    NASA Astrophysics Data System (ADS)

    Erduran, Sibel; Simon, Shirley; Osborne, Jonathan

    2004-11-01

    This paper reports some methodological approaches to the analysis of argumentation discourse developed as part of the two-and-a-half year project titled "Enhancing the Quality of Argument in School Science", supported by the Economic and Social Research Council in the United Kingdom. In this project researchers collaborated with middle-school science teachers to develop models of instructional activities in an effort to make argumentation a component of instruction. We begin the paper with a brief theoretical justification for why we consider argumentation to be of significance to science education. We then contextualize the use of Toulmin's Argument Pattern in the study of argumentation discourse and provide a justification for the methodological outcomes our approach generates. We illustrate how our work refines and develops research methodologies in argumentation analysis. In particular, we present two methodological approaches to the analysis of argumentation resulting from whole-class as well as small-group student discussions. For each approach, we illustrate our coding scheme and some results, as well as how our methodological approach has enabled our inquiry into the quality of argumentation in the classroom. We conclude with some implications for future research in argumentation in science education.

  10. Improved FTA methodology and application to subsea pipeline reliability design.

    PubMed

    Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan

    2014-01-01

    An innovative logic tree, Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different thinking approach for risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, the rather subjective nature of FTA node discovery is significantly reduced and the resulting mathematical calculations for quantitative analysis are greatly simplified. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. Resulting improvements are summarized in comparison table form.

  11. Improved FTA Methodology and Application to Subsea Pipeline Reliability Design

    PubMed Central

    Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan

    2014-01-01

    An innovative logic tree, Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different thinking approach for risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, the rather subjective nature of FTA node discovery is significantly reduced and the resulting mathematical calculations for quantitative analysis are greatly simplified. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. Resulting improvements are summarized in comparison table form. PMID:24667681

  12. 42 CFR 436.601 - Application of financial eligibility methodologies.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    VIRGIN ISLANDS: General Financial Eligibility Requirements and Options. § 436.601 Application of financial... methodologies in determining financial eligibility of the following groups: (i) Qualified pregnant women and children under the mandatory categorically needy group under § 436.120; (ii) Low-income pregnant women...

  13. A Novel Methodology for Improving Plant Pest Surveillance in Vineyards and Crops Using UAV-Based Hyperspectral and Spatial Data

    PubMed Central

    Vanegas, Fernando; Weiss, John; Gonzalez, Felipe

    2018-01-01

    Recent advances in remotely sensed imagery and geospatial image processing using unmanned aerial vehicles (UAVs) have enabled the rapid and ongoing development of monitoring tools for crop management and the detection/surveillance of insect pests. This paper describes a UAV remote sensing-based methodology to increase the efficiency of existing surveillance practices (human inspectors and insect traps) for detecting pest infestations (e.g., grape phylloxera in vineyards). The methodology uses a UAV integrated with advanced digital hyperspectral, multispectral, and RGB sensors. We implemented the methodology for the development of a predictive model for phylloxera detection. In this method, we explore the combination of airborne RGB, multispectral, and hyperspectral imagery with ground-based data at two separate time periods and under different levels of phylloxera infestation. We describe the technology used—the sensors, the UAV, and the flight operations—the processing workflow of the datasets from each imagery type, and the methods for combining multiple airborne and ground-based datasets. Finally, we present relevant results on the correlation between the different processed datasets. The objective of this research is to develop a novel methodology for collecting, processing, analysing and integrating multispectral, hyperspectral, ground and spatial data to remotely sense different variables in different applications, such as, in this case, plant pest surveillance. The development of such a methodology would provide researchers, agronomists, and UAV practitioners with reliable data collection protocols and methods to achieve faster processing techniques and integrate multiple sources of data in diverse remote sensing applications. PMID:29342101
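
    One representative step of such a processing workflow, computing a vegetation index from multispectral bands for later correlation with ground data, is sketched below; the band arrays and the flagging threshold are synthetic assumptions.

```python
# NDVI computation from red and near-infrared reflectance bands.
import numpy as np

rng = np.random.default_rng(6)
red = rng.uniform(0.02, 0.30, size=(100, 100))   # red reflectance
nir = rng.uniform(0.20, 0.60, size=(100, 100))   # near-infrared reflectance

ndvi = (nir - red) / (nir + red + 1e-9)          # guard against 0 division
suspect = ndvi < 0.4                             # illustrative stress threshold
print(f"mean NDVI: {ndvi.mean():.2f}; flagged pixels: {suspect.mean():.1%}")
```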

  14. Global-local methodologies and their application to nonlinear analysis. [for structural postbuckling study

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1986-01-01

    An assessment is made of the potential of different global-local analysis strategies for predicting the nonlinear and postbuckling responses of structures. Two postbuckling problems of composite panels are used as benchmarks and the application of different global-local methodologies to these benchmarks is outlined. The key elements of each of the global-local strategies are discussed and future research areas needed to realize the full potential of global-local methodologies are identified.

  15. A methodology for collecting valid software engineering data

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Weiss, David M.

    1983-01-01

    An effective data collection method for evaluating software development methodologies and for studying the software development process is described. The method uses goal-directed data collection to evaluate methodologies with respect to the claims made for them. Such claims are used as a basis for defining the goals of the data collection, establishing a list of questions of interest to be answered by data analysis, defining a set of data categorization schemes, and designing a data collection form. The data to be collected are based on the changes made to the software during development, and are obtained when the changes are made. To ensure accuracy of the data, validation is performed concurrently with software development and data collection. Validation is based on interviews with the people supplying the data. Results from using the methodology show that data validation is a necessary part of change data collection. Without it, as much as 50% of the data may be erroneous. The feasibility of the data collection methodology was demonstrated by applying it to five different projects in two different environments. The application showed that the methodology was both feasible and useful.

  16. Thermophysics modeling of an infrared detector cryochamber for transient operational scenario

    NASA Astrophysics Data System (ADS)

    Singhal, Mayank; Singhal, Gaurav; Verma, Avinash C.; Kumar, Sushil; Singh, Manmohan

    2016-05-01

    An infrared (IR) detector is essentially a transducer that converts radiant energy in the infrared regime into a measurable form. The benefit of infrared radiation is that it facilitates viewing objects in dark or obscured conditions by detecting the infrared energy they emit. One of the most significant applications of IR detector systems is target acquisition and tracking for projectile systems; IR detectors also find widespread application in industrial and commercial markets. The performance of an infrared detector is sensitive to temperature: the detector performs best when cooled to cryogenic temperatures of around 120 K. However, the necessity to operate in such cryogenic regimes increases the complexity of IR detector applications. This creates a need for detailed thermophysics analysis to determine the actual cooling load specific to the application, including its interaction with the environment, which in turn enables the design of cooling methodologies appropriate to specific scenarios. The focus of the present work is to develop a robust thermophysical numerical methodology for predicting IR cryochamber behavior under transient conditions, the most critical scenario, taking into account all relevant heat loads including radiation in its original form. The advantage of the developed code over existing commercial software (COMSOL, ANSYS, etc.) is that it handles gas conduction together with radiation terms effectively, employing ubiquitous software such as MATLAB; it also requires much smaller computational resources and is significantly less time intensive. It provides physically correct results enabling thermal characterization of the cryochamber geometry in conjunction with an appropriate cooling methodology. The code has been validated experimentally: the observed cooling characteristics are in close agreement with the results predicted by the developed model, proving its efficacy.
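    A minimal lumped-parameter sketch of the transient cool-down problem described above, assuming a single thermal node for the detector cold mass with radiative and residual-gas conduction loads; all parameter values are illustrative, not taken from the paper.

    ```python
    # Transient cool-down of one thermal node (detector cold mass).
    # Loads: wall radiation + residual-gas conduction; sink: cryocooler lift.
    SIGMA = 5.670e-8           # Stefan-Boltzmann constant, W m^-2 K^-4
    heat_cap = 0.05 * 900.0    # mass (kg) * specific heat (J kg^-1 K^-1) -> J/K
    eps_area = 0.05 * 0.01     # emissivity * radiating area, m^2
    g_gas = 0.002              # residual-gas conduction coefficient, W/K
    q_lift = 3.0               # assumed constant cryocooler lift, W
    T_wall, T = 300.0, 300.0   # chamber wall and detector temperatures, K

    dt, t = 0.1, 0.0
    while T > 120.0 and t < 7200.0:
        q_rad = eps_area * SIGMA * (T_wall**4 - T**4)   # radiative load
        q_cond = g_gas * (T_wall - T)                   # gas conduction load
        T += (q_rad + q_cond - q_lift) * dt / heat_cap  # explicit Euler step
        t += dt
    print(f"reached {T:.1f} K after {t/60:.1f} min")
    ```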

  17. A New Vegetation Segmentation Approach for Cropped Fields Based on Threshold Detection from Hue Histograms

    PubMed Central

    Hassanein, Mohamed; El-Sheimy, Naser

    2018-01-01

    Over the last decade, the use of unmanned aerial vehicle (UAV) technology has evolved significantly in different applications, as it provides a platform capable of combining the benefits of terrestrial and aerial remote sensing. Such technology has therefore been established as an important source of data collection for different precision agriculture (PA) applications such as crop health monitoring and weed management. Generally, these PA applications depend on performing a vegetation segmentation process as an initial step, which aims to detect the vegetation objects in collected agricultural field images. The main result of the vegetation segmentation process is a binary image, where vegetation is presented in white and the remaining objects in black. Such a process can easily be performed using different vegetation indices derived from multispectral imagery. Recently, to expand the use of UAV imagery systems for PA applications, it has become important to reduce the cost of such systems through the use of low-cost RGB cameras. Thus, developing vegetation segmentation techniques for RGB images is a challenging problem. This paper introduces a new vegetation segmentation methodology for low-cost UAV RGB images that relies on the hue color channel. The proposed methodology follows the assumption that the colors in any agricultural field image can be divided into vegetation and non-vegetation colors. Therefore, four main steps are developed to detect five different threshold values in the hue histogram of the RGB image; these thresholds are capable of discriminating the dominant color, either vegetation or non-vegetation, within the agricultural field image. The achieved results show that the proposed methodology generates accurate and stable vegetation segmentation, with a mean accuracy of 87.29% and a standard deviation of 12.5%. PMID:29670055
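    To make the hue-histogram idea concrete, the sketch below builds a hue histogram with OpenCV and applies one fixed threshold pair around the green band. The paper derives five thresholds automatically from the histogram; the hard-coded band and file names here are illustrative assumptions.

    ```python
    # Rough sketch of hue-based vegetation masking for a UAV RGB frame.
    import cv2
    import numpy as np

    img = cv2.imread("field.jpg")                 # hypothetical input image
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    hue = hsv[:, :, 0]                            # OpenCV hue range: 0..179

    hist = cv2.calcHist([hsv], [0], None, [180], [0, 180]).ravel()
    dominant = int(np.argmax(hist))               # dominant color in the scene
    print("Dominant hue bin:", dominant)

    # One fixed threshold pair around the green band (illustrative stand-in
    # for the paper's five automatically detected thresholds)
    lo, hi = 35, 85
    mask = ((hue >= lo) & (hue <= hi)).astype(np.uint8) * 255  # white = vegetation
    cv2.imwrite("vegetation_mask.png", mask)
    ```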

  18. Transportation Systems Evaluation

    NASA Technical Reports Server (NTRS)

    Fanning, M. L.; Michelson, R. A.

    1972-01-01

    A methodology for the analysis of transportation systems consisting of five major interacting elements is reported. The analysis begins with the causes of travel demand: geographic, economic, and demographic characteristics as well as attitudes toward travel. Through the analysis, the interaction of these factors with the physical and economic characteristics of the transportation system is determined. The result is an evaluation of the system from the points of view of both passenger and operator. The methodology is applicable to intraurban transit systems as well as major airlines. Applications of the technique to the analysis of a PRT system and a study of intraurban air travel are given. Several unique models or techniques are mentioned in the discussion, e.g., passenger preference modeling, an integrated intraurban transit model, and a series of models for airline analysis.

  19. Stacked Autoencoders for Outlier Detection in Over-the-Horizon Radar Signals

    PubMed Central

    Protopapadakis, Eftychios; Doulamis, Anastasios; Doulamis, Nikolaos; Dres, Dimitrios; Bimpas, Matthaios

    2017-01-01

    Detection of outliers in radar signals is a considerable challenge in maritime surveillance applications. High-Frequency Surface-Wave (HFSW) radars have attracted significant interest as potential tools for long-range target identification and outlier detection at over-the-horizon (OTH) distances. However, a number of disadvantages, such as their low spatial resolution and the presence of clutter, have a negative impact on their accuracy. In this paper, we explore the applicability of deep learning techniques for detecting deviations from the norm in the behavioral patterns of vessels (outliers) as they are tracked from an OTH radar. The proposed methodology exploits the nonlinear mapping capabilities of deep stacked autoencoders in combination with density-based clustering. A comparative experimental evaluation shows promising results for the proposed methodology's performance. PMID:29312449
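    The sketch below illustrates the general pattern (not the authors' network or data): an autoencoder-style model flags tracks with high reconstruction error, and DBSCAN independently marks low-density points. The per-track feature vectors are synthetic placeholders.

    ```python
    # Conceptual sketch: reconstruction-error outliers combined with DBSCAN.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler
    from sklearn.cluster import DBSCAN

    rng = np.random.default_rng(0)
    tracks = rng.normal(0, 1, size=(500, 8))   # 500 vessel tracks, 8 features
    tracks[:5] += 6.0                          # inject a few anomalies

    X = StandardScaler().fit_transform(tracks)
    ae = MLPRegressor(hidden_layer_sizes=(4, 2, 4),  # bottleneck mimics stacking
                      max_iter=2000, random_state=0).fit(X, X)
    err = np.mean((ae.predict(X) - X) ** 2, axis=1)  # reconstruction error
    ae_outliers = err > np.percentile(err, 99)

    db = DBSCAN(eps=3.0, min_samples=5).fit(X)
    db_outliers = db.labels_ == -1                   # DBSCAN noise points

    print("flagged by both:", np.where(ae_outliers & db_outliers)[0])
    ```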

  20. [Work-related musculo-skeletal disorders in apiculture: a biomechanical approach to the risk assessment].

    PubMed

    Maina, G; Sorasio, D; Rossi, F; Zito, D; Perrelli, E; Baracco, A

    2012-01-01

    Risk assessment in apiculture raises methodological problems due to the discontinuity and variability of exposure. This study analyzes a comprehensive set of potential determinants influencing biomechanical risks in apiarists, using recognized technical standards to ensure technical-scientific accuracy; it offers a simplified methodological toolkit for use in the risk assessment process and provides a user-friendly computer application. The toolkit asks the beekeeper to specify, for each month, the total number of hours worked and their distribution among different tasks. As a result, the application calculates the average risk index and the peak risk index. The evidence indicates that some tasks in this occupational area carry biomechanical risks that persist even when exposure time is reduced.

  1. ARAMIS project: a comprehensive methodology for the identification of reference accident scenarios in process industries.

    PubMed

    Delvosalle, Christian; Fievez, Cécile; Pipart, Aurore; Debray, Bruno

    2006-03-31

    In the frame of the Accidental Risk Assessment Methodology for Industries (ARAMIS) project, this paper presents the work carried out in the part of the project devoted to the definition of accident scenarios. This topic is a key point in risk assessment and serves as the basis for the whole risk quantification. The first result of the work is a methodology for the identification of major accident hazards (MIMAH), which is built on generic fault and event trees based on a typology of equipment and substances. The term "major accidents" must be understood as the worst accidents likely to occur on the equipment, assuming that no safety systems are installed. A second methodology, called the methodology for the identification of reference accident scenarios (MIRAS), takes into account the influence of safety systems on both the frequencies and the possible consequences of accidents, leading to more realistic accident scenarios. The reference accident scenarios are chosen with the help of a tool called the "risk matrix", which crosses the frequency and the consequences of accidents. This paper presents both methodologies and an application to an ethylene oxide storage facility.
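    The risk-matrix selection step can be pictured with a toy lookup of the kind sketched below; the class labels and the retention rule are invented for illustration and do not reproduce the ARAMIS matrix.

    ```python
    # Toy "risk matrix": scenarios ranked by crossing a frequency class
    # with a consequence class. Labels and the cutoff are invented.
    FREQ_CLASSES = ["very rare", "rare", "possible", "frequent"]
    CONS_CLASSES = ["minor", "serious", "major", "catastrophic"]

    def risk_rank(freq_idx: int, cons_idx: int) -> str:
        score = freq_idx + cons_idx
        if score >= 5:
            return "reference scenario (retain for detailed study)"
        return "screened out"

    for f, fname in enumerate(FREQ_CLASSES):
        for c, cname in enumerate(CONS_CLASSES):
            print(f"{fname} / {cname} -> {risk_rank(f, c)}")
    ```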

  2. A framework for assessing the adequacy and effectiveness of software development methodologies

    NASA Technical Reports Server (NTRS)

    Arthur, James D.; Nance, Richard E.

    1990-01-01

    Tools, techniques, environments, and methodologies dominate the software engineering literature, but relatively little research in the evaluation of methodologies is evident. This work reports an initial attempt to develop a procedural approach to evaluating software development methodologies. Prominent in this approach are: (1) an explication of the role of a methodology in the software development process; (2) the development of a procedure based on linkages among objectives, principles, and attributes; and (3) the establishment of a basis for reduction of the subjective nature of the evaluation through the introduction of properties. An application of the evaluation procedure to two Navy methodologies has provided consistent results that demonstrate the utility and versatility of the evaluation procedure. Current research efforts focus on the continued refinement of the evaluation procedure through the identification and integration of product quality indicators reflective of attribute presence, and the validation of metrics supporting the measure of those indicators. The consequent refinement of the evaluation procedure offers promise of a flexible approach that admits to change as the field of knowledge matures. In conclusion, the procedural approach presented in this paper represents a promising path toward the end goal of objectively evaluating software engineering methodologies.

  3. Hazmat transport: a methodological framework for the risk analysis of marshalling yards.

    PubMed

    Cozzani, Valerio; Bonvicini, Sarah; Spadoni, Gigliola; Zanelli, Severino

    2007-08-17

    A methodological framework was outlined for the comprehensive risk assessment of marshalling yards in the context of quantified area risk analysis. Three accident typologies were considered for yards: (i) "in-transit-accident-induced" releases; (ii) "shunting-accident-induced" spills; and (iii) "non-accident-induced" leaks. A specific methodology was developed for the assessment of expected release frequencies and equivalent release diameters, based on the application of HazOp and Fault Tree techniques to reference schemes defined for the more common types of railcar vessels used for "hazmat" transportation. The approach was applied to an extended case study. The results showed that "non-accident-induced" leaks in marshalling yards make an important contribution to the overall risk associated with these zones. Furthermore, the results confirmed the considerable contribution of these fixed installations to the overall risk associated with "hazmat" transportation.
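    As a pocket illustration of how fault-tree techniques yield release frequencies, the sketch below combines independent basic events through OR/AND gates; the event probabilities are placeholders, not values from the study.

    ```python
    # Minimal fault-tree arithmetic assuming independent basic events.
    def or_gate(*p):
        """P(A or B or ...) for independent events."""
        q = 1.0
        for pi in p:
            q *= (1.0 - pi)
        return 1.0 - q

    def and_gate(*p):
        """P(A and B and ...) for independent events."""
        q = 1.0
        for pi in p:
            q *= pi
        return q

    valve_leak, gasket_fail = 1e-3, 5e-4    # "non-accident-induced" causes
    derail, puncture = 1e-4, 0.1            # shunting accident, then puncture

    top = or_gate(or_gate(valve_leak, gasket_fail),  # spontaneous leak path
                  and_gate(derail, puncture))        # accident-induced path
    print(f"Top-event probability per railcar visit: {top:.2e}")
    ```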

  4. A reliability evaluation methodology for memory chips for space applications when sample size is small

    NASA Technical Reports Server (NTRS)

    Chen, Y.; Nguyen, D.; Guertin, S.; Berstein, J.; White, M.; Menke, R.; Kayali, S.

    2003-01-01

    This paper presents a reliability evaluation methodology to obtain the statistical reliability information of memory chips for space applications when the test sample size needs to be kept small because of the high cost of radiation-hardened memories.

  5. SOME POSSIBLE APPLICATIONS OF PROJECT OUTCOMES RESEARCH METHODOLOGY

    DTIC Science & Technology

    Section I refers to the possibility of applying the theory and methodology of Project Outcomes to problems of strategic information. It is felt that...purposes of assessing present and future organizational effectiveness. Section IV refers to the applications that our study may have for problems of

  6. Methodology of citrate-based biomaterial development and application

    NASA Astrophysics Data System (ADS)

    Tran, M. Richard

    Biomaterials play central roles in modern strategies of regenerative medicine and tissue engineering. Attempts to find tissue-engineered solutions to cure various injuries or diseases have led to an enormous increase in the number of polymeric biomaterials over the past decade. The breadth of new materials arises from the multiplicity of anatomical locations, cell types, and modes of application, which all place application-specific requirements on the biomaterial. Unfortunately, many of the currently available biodegradable polymers are limited in their versatility to meet the wide range of requirements for tissue engineering. Therefore, a methodology of biomaterial development that is able to address a broad spectrum of requirements would be beneficial to the biomaterial field. This work presents a methodology of citrate-based biomaterial design and application to meet the multifaceted needs of tissue engineering. We hypothesize that (1) citric acid, a non-toxic metabolic product of the body (Krebs Cycle), can be exploited as a universal multifunctional monomer and reacted with various diols to produce a new class of soft biodegradable elastomers with the flexibility to tune the material properties of the resulting material to meet a wide range of requirements; (2) the newly developed citrate-based polymers can be used as platform biomaterials for the design of novel tissue engineering scaffolding; and (3) microengineering approaches in the form of thin scaffold sheets, microchannels, and a new porogen design can be used to generate complex cell-cell and cell-microenvironment interactions to mimic tissue complexity and architecture. To test these hypotheses, we first developed a methodology of citrate-based biomaterial development through the synthesis and characterization of a family of in situ crosslinkable and urethane-doped elastomers, which are synthesized using simple, cost-effective strategies and offer a variety of methods to tailor the material properties to meet the needs of a particular application. Next, we introduced a new porogen generation technique, and showed the potential application of the newly developed materials through the fabrication and characterization of scaffold sheets, multiphasic small-diameter vascular grafts, and multichanneled nerve guides. Finally, the in vivo applications of citrate-based materials are exemplified through the evaluation of peripheral nerve regeneration using multichanneled guides and the ability to assist in injection-based endoscopic mucosal resection therapy. The results presented in this work show that citric acid can be utilized as a cornerstone in the development of novel biodegradable materials, and combined with microengineering approaches to produce the next generation of tissue engineering scaffolding. These enabling new biomaterials and scaffolding strategies should address many of the existing challenges in tissue engineering and advance the field as a whole.

  7. IFC BIM-Based Methodology for Semi-Automated Building Energy Performance Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bazjanac, Vladimir

    2008-07-01

    Building energy performance (BEP) simulation is still rarely used in building design, commissioning and operations. The process is too costly and too labor intensive, and it takes too long to deliver results. Its quantitative results are not reproducible due to arbitrary decisions and assumptions made in simulation model definition, and can be trusted only under special circumstances. A methodology to semi-automate BEP simulation preparation and execution makes this process much more effective. It incorporates principles of information science and aims to eliminate inappropriate human intervention that results in subjective and arbitrary decisions. This is achieved by automating every part of the BEP modeling and simulation process that can be automated, by relying on data from original sources, and by making any necessary data transformation rule-based and automated. This paper describes the new methodology and its relationship to IFC-based BIM and software interoperability. It identifies five steps that are critical to its implementation, and shows what part of the methodology can be applied today. The paper concludes with a discussion of application to simulation with EnergyPlus, and describes data transformation rules embedded in the new Geometry Simplification Tool (GST).

  8. Robotic application of a dynamic resultant force vector using real-time load-control: simulation of an ideal follower load on Cadaveric L4-L5 segments.

    PubMed

    Bennett, Charles R; Kelly, Brian P

    2013-08-09

    Standard in-vitro spine testing methods have focused on application of isolated and/or constant load components, while the in-vivo spine is subject to multiple components that can be resolved into resultant dynamic load vectors. To advance towards more in-vivo-like simulations, the objective of the current study was to develop a methodology to apply robotically controlled, non-zero, real-time dynamic resultant forces during flexion-extension on human lumbar motion segment units (MSUs), with initial application towards simulation of an ideal follower load (FL) force vector. A proportional-integral-derivative (PID) controller with custom algorithms coordinated the motion of a Cartesian serial manipulator comprised of six axes, each capable of position or load control. Six lumbar MSUs (L4-L5) were tested with continuously increasing sagittal plane bending to 8 Nm while force components were dynamically programmed to deliver a resultant 400 N FL that remained normal to the moving midline of the intervertebral disc. Mean absolute load-control tracking errors (TEs) between commanded and experimental loads were computed. Global spinal ranges of motion and sagittal plane inter-body translations were compared to previously published values for non-robotic applications. Mean TEs for zero-commanded force and moment axes were 0.7 ± 0.4 N and 0.03 ± 0.02 Nm, respectively. For non-zero force axes, mean TEs were 0.8 ± 0.8 N, 1.3 ± 1.6 N, and 1.3 ± 1.6 N for Fx, Fz, and the resolved ideal follower load vector FL(R), respectively. Mean extension and flexion ranges of motion were 2.6° ± 1.2° and 5.0° ± 1.7°, respectively. Relative vertebral body translations and rotations were very comparable to data collected with non-robotic systems in the literature. The robotically coordinated Cartesian load-controlled testing system demonstrated robust real-time load control that permitted application of a real-time dynamic non-zero load vector during flexion-extension. For single-MSU investigations, the methodology has the potential to overcome conventional follower load limitations, most notably via application outside the sagittal plane. This methodology holds promise for future work aimed at reducing the gap between current in-vitro testing and in-vivo circumstances. Copyright © 2013 Elsevier Ltd. All rights reserved.
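    A bare-bones sketch of the load-control idea follows: a PID loop drives one actuator axis so that the measured force tracks the commanded 400 N resultant. The gains, the elastic "specimen" stand-in, and the 1 kHz rate are illustrative assumptions, not the authors' controller.

    ```python
    # One-axis force control: the PID output is an incremental displacement
    # command; the specimen is modeled as a simple spring.
    class PID:
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_err = 0.0

        def step(self, command, measured):
            err = command - measured
            self.integral += err * self.dt
            deriv = (err - self.prev_err) / self.dt
            self.prev_err = err
            return self.kp * err + self.ki * self.integral + self.kd * deriv

    pid = PID(kp=2e-6, ki=1e-6, kd=0.0, dt=0.001)  # meters of travel per N of error
    stiffness = 1.0e5                              # elastic "specimen", N/m
    pos, force = 0.0, 0.0
    for _ in range(5000):                          # 5 s at a 1 kHz control rate
        pos += pid.step(400.0, force)              # incremental displacement command
        force = stiffness * pos                    # measured axial force
    print(f"steady-state force: {force:.1f} N")
    ```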

  9. An Effective Modal Approach to the Dynamic Evaluation of Fracture Toughness of Quasi-Brittle Materials

    NASA Astrophysics Data System (ADS)

    Ferreira, L. E. T.; Vareda, L. V.; Hanai, J. B.; Sousa, J. L. A. O.; Silva, A. I.

    2017-05-01

    A modal dynamic analysis is used as the tool to evaluate the fracture toughness of concrete from the results of notched-through beam tests. The dimensionless functions describing the relation between the frequencies and the specimen geometry, used for identifying the variation in natural frequency as a function of crack depth, are first determined for a 150 × 150 × 500-mm notched-through specimen. The frequency decrease resulting from the propagating crack is modeled through a modal/fracture mechanics approach, leading to the determination of an effective crack length. This length, obtained numerically, is used to evaluate the fracture toughness of concrete, the critical crack mouth opening displacements, and the proposed brittleness index. The methodology is applied to tests performed on high-strength concrete specimens. The frequency response of each specimen is evaluated before and after each crack propagation step. The methodology is then validated by comparison with results from the application of other methodologies described in the literature and suggested by RILEM.

  10. NEW SAMPLING THEORY FOR MEASURING ECOSYSTEM STRUCTURE

    EPA Science Inventory

    This research considered the application of systems analysis to the study of laboratory ecosystems. The work concerned the development of a methodology which was shown to be useful in the design of laboratory experiments, the processing and interpretation of the results of these ...

  11. RE-EVALUATION OF APPLICABILITY OF AGENCY SAMPLE HOLDING TIMES

    EPA Science Inventory

    Holding times are the length of time a sample can be stored after collection and prior to analysis without significantly affecting the analytical results. Holding times vary with the analyte, sample matrix, and the analytical methodology used to quantify the analyte's concentration. ...

  12. Multimedia Sampling During The Application Of Biosolids On A Land Test Site

    EPA Science Inventory

    This report documents the approach, methodologies, results, and interpretation of a collaborative research study conducted by the National Risk Management Research Center (NRMRL) of the U.S. Environmental Protection Agency's (U.S. EPA's) Office of Research and Development (ORD); ...

  13. Generating social impact scenarios: A key step in making technology assessment studies

    NASA Technical Reports Server (NTRS)

    Jones, M. V.

    1975-01-01

    The MITRE methodological studies were conducted to define relevant questions in relation to the concept of total impact analysis and to provide a procedure for integrating diverse checklists of questions which trace the initial and secondary impacts of any major technological application or of society's attempts to respond to or redirect that application. Some of the results of that study are presented in tabular form.

  14. A new methodology to integrate planetary quarantine requirements into mission planning, with application to a Jupiter orbiter

    NASA Technical Reports Server (NTRS)

    Howard, R. A.; North, D. W.; Pezier, J. P.

    1975-01-01

    A new methodology is proposed for integrating planetary quarantine objectives into space exploration planning. This methodology is designed to remedy the major weaknesses inherent in the current formulation of planetary quarantine requirements. Application of the methodology is illustrated by a tutorial analysis of a proposed Jupiter Orbiter mission. The proposed methodology reformulates planetary quarantine planning as a sequential decision problem. Rather than concentrating on a nominal plan, all decision alternatives and possible consequences are laid out in a decision tree. Probabilities and values are associated with the outcomes, including the outcome of contamination. The process of allocating probabilities, which could not be made perfectly unambiguous and systematic, is replaced by decomposition and optimization techniques based on principles of dynamic programming. Thus, the new methodology provides logical integration of all available information and allows selection of the best strategy consistent with quarantine and other space exploration goals.
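    The sequential-decision formulation can be illustrated with a tiny fold-back computation over a decision tree, as sketched below; the probabilities, science values, and the contamination penalty are invented numbers, not those of the Jupiter Orbiter analysis.

    ```python
    # Fold back a small decision tree by expected value; contamination
    # carries a large negative utility.
    def expected_value(node):
        kind = node["kind"]
        if kind == "outcome":
            return node["value"]
        if kind == "chance":                 # probability-weighted average
            return sum(p * expected_value(child)
                       for p, child in node["branches"])
        if kind == "decision":               # choose the best alternative
            return max(expected_value(child) for child in node["options"])

    tree = {"kind": "decision", "options": [
        {"kind": "chance", "branches": [     # launch with extra sterilization
            (0.999, {"kind": "outcome", "value": 100.0}),   # mission value
            (0.001, {"kind": "outcome", "value": -1e4}),    # contamination
        ]},
        {"kind": "chance", "branches": [     # launch without extra measures
            (0.99, {"kind": "outcome", "value": 120.0}),
            (0.01, {"kind": "outcome", "value": -1e4}),
        ]},
    ]}
    print("best expected value:", expected_value(tree))
    ```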

  15. Efficient preliminary floating offshore wind turbine design and testing methodologies and application to a concrete spar design.

    PubMed

    Matha, Denis; Sandner, Frank; Molins, Climent; Campos, Alexis; Cheng, Po Wen

    2015-02-28

    The current key challenge in the floating offshore wind turbine industry and research is to design economic floating systems that can compete with fixed-bottom offshore turbines in terms of levelized cost of energy. The preliminary platform design, as well as early experimental design assessments, are critical elements in the overall design process. In this contribution, a brief review of current floating offshore wind turbine platform pre-design and scaled testing methodologies is provided, with a focus on their ability to accommodate the coupled dynamic behaviour of floating offshore wind systems. The exemplary design and testing methodology for a monolithic concrete spar platform, as performed within the European KIC AFOSP project, is presented. Results from the experimental tests are compared to numerical simulations and show very good agreement for the relevant basic dynamic platform properties. Extreme and fatigue loads and cost analysis of the AFOSP system confirm the viability of the presented design process. In summary, the exemplary application of the reduced design and testing methodology for AFOSP confirms that it represents a viable procedure during pre-design of floating offshore wind turbine platforms. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  16. Sharing methodology: a worked example of theoretical integration with qualitative data to clarify practical understanding of learning and generate new theoretical development.

    PubMed

    Yardley, Sarah; Brosnan, Caragh; Richardson, Jane

    2013-01-01

    Theoretical integration is a necessary element of study design if clarification of experiential learning is to be achieved. There are few published examples demonstrating how this can be achieved. This methodological article provides a worked example of research methodology that achieved clarification of authentic early experiences (AEEs) through a bi-directional approach to theory and data. Bi-directional refers to our simultaneous use of theory to guide and interrogate empirical data and the use of empirical data to refine theory. We explain the five steps of our methodological approach: (1) understanding the context; (2) critique on existing applications of socio-cultural models to inform study design; (3) data generation; (4) analysis and interpretation and (5) theoretical development through a novel application of Metis. These steps resulted in understanding of how and why different outcomes arose from students participating in AEE. Our approach offers a mechanism for clarification without which evidence-based effective ways to maximise constructive learning cannot be developed. In our example it also contributed to greater theoretical understanding of the influence of social interactions. By sharing this example of research undertaken to develop both theory and educational practice we hope to assist others seeking to conduct similar research.

  17. System learning approach to assess sustainability and ...

    EPA Pesticide Factsheets

    This paper presents a methodology that combines the power of an Artificial Neural Network and Information Theory to forecast variables describing the condition of a regional system. The novelty and strength of this approach lie in the application of Fisher information, a key method in Information Theory, to preserve trends in the historical data and prevent overfitting in projections. The methodology was applied to demographic, environmental, food and energy consumption, and agricultural production in the San Luis Basin regional system in Colorado, U.S.A. These variables are important for tracking conditions in human and natural systems. However, available data are often so far out of date that they limit the ability to manage these systems. Results indicate that the approaches developed provide viable tools for forecasting outcomes with the aim of assisting management toward sustainable trends. This methodology is also applicable for modeling different scenarios in other dynamic systems.

  18. HSTDEK: Developing a methodology for construction of large-scale, multi-use knowledge bases

    NASA Technical Reports Server (NTRS)

    Freeman, Michael S.

    1987-01-01

    The primary research objective of the Hubble Space Telescope Design/Engineering Knowledgebase (HSTDEK) is to develop a methodology for constructing and maintaining large-scale knowledge bases which can be used to support multiple applications. To ensure the validity of its results, this research is being pursued in the context of a real-world system, the Hubble Space Telescope. The HSTDEK objectives are described in detail. The history and motivation of the project are briefly described. The technical challenges faced by the project are outlined.

  19. Application of CFE/POST2 for Simulation of Launch Vehicle Stage Separation

    NASA Technical Reports Server (NTRS)

    Pamadi, Bandu N.; Tartabini, Paul V.; Toniolo, Matthew D.; Roithmayr, Carlos M.; Karlgaard, Christopher D.; Samareh, Jamshid A.

    2009-01-01

    The constraint force equation (CFE) methodology provides a framework for modeling constraint forces and moments acting at joints that connect multiple vehicles. With implementation in the Program to Optimize Simulated Trajectories II (POST2), the CFE provides a capability to simulate end-to-end trajectories of launch vehicles, including stage separation. In this paper, the CFE/POST2 methodology is applied to the Shuttle-SRB separation problem as a test and validation case. The CFE/POST2 results are compared with STS-1 flight test data.

  20. A New Finite-Time Observer for Nonlinear Systems: Applications to Synchronization of Lorenz-Like Systems.

    PubMed

    Aguilar-López, Ricardo; Mata-Machuca, Juan L

    2016-01-01

    This paper proposes a synchronization methodology for two chaotic oscillators under the framework of identical synchronization and master-slave configuration. The proposed methodology is based on state observer design within the framework of control theory; the observer structure provides finite-time synchronization convergence by cancelling the upper bounds of the main nonlinearities of the chaotic oscillator. The above is shown via an analysis of the dynamics of the so-called synchronization error. Numerical experiments corroborate the satisfactory results of the proposed scheme.

  1. A New Finite-Time Observer for Nonlinear Systems: Applications to Synchronization of Lorenz-Like Systems

    PubMed Central

    Aguilar-López, Ricardo

    2016-01-01

    This paper proposes a synchronization methodology for two chaotic oscillators under the framework of identical synchronization and master-slave configuration. The proposed methodology is based on state observer design within the framework of control theory; the observer structure provides finite-time synchronization convergence by cancelling the upper bounds of the main nonlinearities of the chaotic oscillator. The above is shown via an analysis of the dynamics of the so-called synchronization error. Numerical experiments corroborate the satisfactory results of the proposed scheme. PMID:27738651
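    As a rough illustration of master-slave synchronization on Lorenz-like systems, the sketch below corrects a slave copy with the measured x-state of the master. A simple constant high-gain injection stands in for the paper's finite-time observer; the gain and initial conditions are arbitrary.

    ```python
    # Master-slave synchronization of two Lorenz systems via output injection.
    import numpy as np

    def lorenz(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        x, y, z = s
        return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

    dt, steps, k = 1e-3, 20000, 50.0
    m = np.array([1.0, 1.0, 1.0])      # master state
    s = np.array([8.0, -5.0, 20.0])    # slave starts far away
    for _ in range(steps):
        m = m + dt * lorenz(m)
        innovation = m[0] - s[0]       # only x is assumed measured
        s = s + dt * (lorenz(s) + k * np.array([innovation, 0.0, 0.0]))
    print("sync error:", np.abs(m - s))
    ```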

  2. [Methodological deficits in neuroethics: do we need theoretical neuroethics?].

    PubMed

    Northoff, G

    2013-10-01

    Current neuroethics can be characterized best as empirical neuroethics: it is strongly empirically oriented in that it not only includes empirical findings from neuroscience but also searches for applications within neuroscience. This, however, neglects the social and political contexts which could be subject to a future social neuroethics. In addition, methodological issues need to be considered as in theoretical neuroethics. The focus in this article is on two such methodological issues: (1) the analysis of the different levels and their inferences among each other which is exemplified by the inference of consciousness from the otherwise purely neuronal data in patients with vegetative state and (2) the problem of linking descriptive and normative concepts in a non-reductive and non-inferential way for which I suggest the mutual contextualization between both concepts. This results in a methodological strategy that can be described as contextual fact-norm iterativity.

  3. Multiphysics Simulation of Welding-Arc and Nozzle-Arc System: Mathematical-Model, Solution-Methodology and Validation

    NASA Astrophysics Data System (ADS)

    Pawar, Sumedh; Sharma, Atul

    2018-01-01

    This work presents a mathematical model and solution methodology for a multiphysics engineering problem on arc formation during welding and inside a nozzle. A general-purpose commercial CFD solver, ANSYS FLUENT 13.0.0, is used in this work. Arc formation involves strongly coupled gas dynamics and electrodynamics, simulated by solution of the coupled Navier-Stokes equations, Maxwell's equations and the radiation heat-transfer equation. Validation of the present numerical methodology is demonstrated by excellent agreement with published results. The developed mathematical model and the user-defined functions (UDFs) are independent of the geometry and are applicable to any system that involves arc formation in a 2D axisymmetric coordinate system. The high-pressure flow of SF6 gas in the nozzle-arc system resembles the arc chamber of an SF6 gas circuit breaker; thus, this methodology can be extended to simulate the arcing phenomenon during current interruption.

  4. Applying operational research and data mining to performance based medical personnel motivation system.

    PubMed

    Niaksu, Olegas; Zaptorius, Jonas

    2014-01-01

    This paper presents a methodology suitable for the creation of a performance-related remuneration system in the healthcare sector that would meet requirements for efficiency and sustainable quality of healthcare services. A methodology for performance indicator selection, ranking and a posteriori evaluation is proposed and discussed. The Priority Distribution Method is applied for unbiased performance criteria weighting, and data mining methods are proposed to monitor and evaluate the results of the motivation system. We developed a method for healthcare-specific criteria selection consisting of 8 steps, and proposed and demonstrated the application of the Priority Distribution Method for weighting the selected criteria. Moreover, a set of data mining methods for evaluation of the motivational system outcomes was proposed. The described methodology for calculating performance-related payment requires practical validation. We plan to develop semi-automated tools for monitoring institutional and personal performance indicators. The final step will be validation of the methodology in a healthcare facility.

  5. Beyond annual streamflow reconstructions for the Upper Colorado River Basin: a paleo-water-balance approach

    USGS Publications Warehouse

    Gangopadhyay, Subhrendu; McCabe, Gregory J.; Woodhouse, Connie A.

    2015-01-01

    In this paper, we present a methodology to use annual tree-ring chronologies and a monthly water balance model to generate annual reconstructions of water balance variables (e.g., potential evapotranspiration (PET), actual evapotranspiration (AET), snow water equivalent (SWE), soil moisture storage (SMS), and runoff (R)). The method involves resampling monthly temperature and precipitation from the instrumental record, directed by the variability indicated by the paleoclimate record. The generated time series of monthly temperature and precipitation are subsequently used as inputs to a monthly water balance model. The methodology is applied to the Upper Colorado River Basin, and results indicate that it reliably simulates water-year runoff, maximum snow water equivalent, and seasonal soil moisture storage for the instrumental period. As a final application, the methodology is used to produce time series of PET, AET, SWE, SMS, and R for the 1404-1905 period for the Upper Colorado River Basin.
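    A skeleton of the kind of monthly water-balance step such a model drives is sketched below: precipitation first meets PET, surplus recharges soil storage, and overflow becomes runoff. The capacity, runoff factor and inputs are illustrative assumptions, not the model's calibrated values.

    ```python
    # One month of a simple bucket-style water balance (all values in mm).
    def monthly_step(precip, pet, storage, capacity=150.0, runoff_factor=0.5):
        supply = precip - pet
        if supply >= 0.0:                       # wet month: recharge then runoff
            new_storage = storage + supply
            surplus = max(new_storage - capacity, 0.0)
            storage = min(new_storage, capacity)
            aet = pet
        else:                                    # dry month: draw down storage
            draw = min(-supply, storage)
            storage -= draw
            aet = precip + draw
            surplus = 0.0
        runoff = runoff_factor * surplus
        return aet, storage, runoff

    storage = 100.0
    for p, pet in [(80, 30), (20, 90), (120, 40)]:   # (precip, PET) per month
        aet, storage, r = monthly_step(p, pet, storage)
        print(f"AET={aet:5.1f}  storage={storage:6.1f}  runoff={r:5.1f}")
    ```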

  6. SCAP: a new methodology for safety management based on feedback from credible accident-probabilistic fault tree analysis system.

    PubMed

    Khan, F I; Iqbal, A; Ramesh, N; Abbasi, S A

    2001-10-12

    As conventionally done, strategies for incorporating accident-prevention measures in any hazardous chemical process industry are developed on the basis of input from risk assessment. However, the two steps, risk assessment and hazard reduction (or safety) measures, are not linked interactively in existing methodologies. This prevents a quantitative assessment of the impacts of safety measures on risk control. We have developed a methodology in which the risk assessment steps are interactively linked with the implementation of safety measures. The resultant system indicates the extent of risk reduction achieved by each successive safety measure. It also indicates, based on maximum credible accident analysis (MCAA) and probabilistic fault tree analysis (PFTA), whether a given unit can ever be made 'safe'. The application of the methodology is illustrated with a case study.

  7. Application of Steinberg vibration fatigue model for structural verification of space instruments

    NASA Astrophysics Data System (ADS)

    García, Andrés; Sorribes-Palmer, Félix; Alonso, Gustavo

    2018-01-01

    Electronic components in spacecraft are subjected to vibration loads during the ascent phase of the launcher, and it is important to verify by tests and analysis that all parts can survive the most severe load cases. The purpose of this paper is to present the methodology and results of the application of Steinberg's fatigue model to estimate the life of electronic components of the EPT-HET instrument for the Solar Orbiter space mission. A Nastran finite element model (FEM) of the EPT-HET instrument was created and used for the structural analysis. The methodology is based on using the FEM of the entire instrument to calculate the relative displacement spectral density (RDSD) and RMS values of the PCBs from random vibration analysis. These values are used to estimate the fatigue life of the most susceptible electronic components with Steinberg's fatigue damage equation and Miner's cumulative fatigue index. The estimates are calculated for two different configurations of the instrument and three different inputs in order to support the redesign process. Finally, these analytical results are contrasted with the inspections and functional tests made after the vibration tests, concluding that this methodology can adequately predict the fatigue damage or survival of the electronic components.
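    The sketch below shows the arithmetic pattern of a Steinberg three-band estimate combined with Miner's rule: time at the 1-, 2- and 3-sigma stress levels follows the Gaussian fractions, and cycles-to-failure follows a Basquin-type S-N curve. All numerical inputs are invented, not EPT-HET values.

    ```python
    # Steinberg three-band fatigue estimate with Miner's cumulative damage.
    fn = 120.0     # board natural frequency, Hz
    T = 180.0      # random-vibration exposure duration, s
    s1 = 40.0      # 1-sigma stress at the critical component, MPa
    b = 6.4        # S-N curve exponent (illustrative)
    C = 1e18       # S-N constant: N = C / S**b

    FRACTIONS = (0.683, 0.271, 0.0433)   # time share at 1, 2, 3 sigma
    damage = 0.0
    for level, frac in enumerate(FRACTIONS, start=1):
        stress = level * s1              # 1-, 2-, 3-sigma stress amplitude
        cycles_applied = fn * T * frac
        cycles_to_fail = C / stress ** b
        damage += cycles_applied / cycles_to_fail

    print(f"Miner's cumulative damage index: {damage:.3f} (fail if >= 1)")
    ```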

  8. A comparison of results of empirical studies of supplementary search techniques and recommendations in review methodology handbooks: a methodological review.

    PubMed

    Cooper, Chris; Booth, Andrew; Britten, Nicky; Garside, Ruth

    2017-11-28

    The purpose and contribution of supplementary search methods in systematic reviews are increasingly acknowledged. Numerous studies have demonstrated their potential in identifying studies or study data that would have been missed by bibliographic database searching alone. What is less certain is how supplementary search methods actually work, how they are applied, and the consequent advantages, disadvantages and resource implications of each search method. The aim of this study is to compare current practice in using supplementary search methods with methodological guidance. Four methodological handbooks informing systematic review practice in the UK were read and audited to establish current methodological guidance. Studies evaluating the use of supplementary search methods were identified by searching five bibliographic databases. Studies were included if they (1) reported practical application of a supplementary search method (descriptive), (2) examined the utility of a supplementary search method (analytical), or (3) identified or explored factors that impact the utility of a supplementary method when applied in practice. Thirty-five studies were included in this review in addition to the four methodological handbooks. The studies were published between 1989 and 2016, and the publication dates of the handbooks ranged from 1994 to 2014. Five supplementary search methods were reviewed: contacting study authors, citation chasing, handsearching, searching trial registers and web searching. There is reasonable consistency between recommended best practice (handbooks) and current practice (methodological studies) as it relates to the application of supplementary search methods. The methodological studies provide useful information on the effectiveness of the supplementary search methods, often seeking to evaluate aspects of each method to improve its effectiveness or efficiency; in this way, the studies advance the understanding of the supplementary search methods. Further research is required, however, so that a rational choice can be made about which supplementary search strategies should be used, and when.

  9. A normative price for energy from an electricity generation system: An Owner-dependent Methodology for Energy Generation (system) Assessment (OMEGA). Volume 1: Summary

    NASA Technical Reports Server (NTRS)

    Chamberlain, R. G.; Mcmaster, K. M.

    1981-01-01

    The utility-owned solar electric system methodology is generalized and updated. The net present value of the system is determined by consideration of all financial benefits and costs (including a specified return on investment). Life cycle costs, life cycle revenues, and residual system values are obtained. Break-even values of system parameters are estimated by setting the net present value to zero. While the model was designed for photovoltaic generators with a possible thermal energy byproduct, its applicability is not limited to such systems. The resulting owner-dependent methodology for energy generation system assessment consists of a few equations that can be evaluated without the aid of a high-speed computer.
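    A tiny sketch of the net-present-value logic follows: yearly net revenues are discounted, and a break-even energy price is found by driving the NPV to zero with bisection. The cash-flow numbers are illustrative, not OMEGA model values.

    ```python
    # Discounted cash flow and break-even price by bisection.
    def npv(price, energy_kwh, cost_per_year, capital, rate, years):
        total = -capital
        for t in range(1, years + 1):
            total += (price * energy_kwh - cost_per_year) / (1 + rate) ** t
        return total

    def break_even_price(lo=0.0, hi=10.0, **kw):
        for _ in range(60):              # bisection on NPV(price) = 0
            mid = 0.5 * (lo + hi)
            if npv(mid, **kw) < 0.0:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    params = dict(energy_kwh=2.0e6, cost_per_year=25_000.0,
                  capital=1.5e6, rate=0.08, years=20)
    print(f"break-even price: {break_even_price(**params):.4f} $/kWh")
    ```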

  10. Development and testing of controller performance evaluation methodology for multi-input/multi-output digital control systems

    NASA Technical Reports Server (NTRS)

    Pototzky, Anthony; Wieseman, Carol; Hoadley, Sherwood Tiffany; Mukhopadhyay, Vivek

    1991-01-01

    Described here are the development and implementation of an on-line, near-real-time controller performance evaluation (CPE) capability. Briefly discussed are the structure of data flow, the signal processing methods used to process the data, and the software developed to generate the transfer functions. This methodology is generic in nature and can be used in any type of multi-input/multi-output (MIMO) digital controller application, including digital flight control systems, digitally controlled spacecraft structures, and actively controlled wind tunnel models. Results of applying the CPE methodology to evaluate, in near real time, MIMO digital flutter suppression systems being tested on the Rockwell Active Flexible Wing (AFW) wind tunnel model are presented to demonstrate the CPE capability.
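    One representative CPE computation is transfer-function estimation from excitation/response records; the sketch below applies the standard H1 estimator (cross-spectrum over input auto-spectrum) to synthetic signals. It stands in for, but is not, the paper's signal-processing chain.

    ```python
    # H1 transfer-function estimate from input/output time histories.
    import numpy as np
    from scipy import signal

    fs = 200.0
    t = np.arange(0, 60, 1 / fs)
    u = np.random.default_rng(1).normal(size=t.size)   # broadband excitation

    # Stand-in plant: second-order low-pass dynamics plus measurement noise
    b, a = signal.butter(2, 10.0, fs=fs)
    y = signal.lfilter(b, a, u) + 0.05 * np.random.default_rng(2).normal(size=t.size)

    f, Puy = signal.csd(u, y, fs=fs, nperseg=1024)     # cross-spectral density
    _, Puu = signal.welch(u, fs=fs, nperseg=1024)      # input auto-spectrum
    H1 = Puy / Puu                                     # H1 transfer function
    print("gain near 10 Hz:", np.abs(H1[np.argmin(np.abs(f - 10.0))]))
    ```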

  11. A combined stochastic feedforward and feedback control design methodology with application to autoland design

    NASA Technical Reports Server (NTRS)

    Halyo, Nesim

    1987-01-01

    A combined stochastic feedforward and feedback control design methodology was developed. The objective of the feedforward control law is to track the commanded trajectory, whereas the feedback control law tries to maintain the plant state near the desired trajectory in the presence of disturbances and uncertainties about the plant. The feedforward control law design is formulated as a stochastic optimization problem and is embedded into the stochastic output feedback problem, where the plant contains unstable and uncontrollable modes. An algorithm to compute the optimal feedforward is developed. In this approach, the use of error integral feedback, dynamic compensation, and control rate command structures is an integral part of the methodology. An incremental implementation is recommended. Results on the eigenvalues of the implemented versus designed control laws are presented. The stochastic feedforward/feedback control methodology is used to design a digital automatic landing system for the ATOPS Research Vehicle, a Boeing 737-100 aircraft. The system control modes include localizer and glideslope capture and track, and flare to touchdown. Results of a detailed nonlinear simulation of the digital control laws, actuator systems, and aircraft aerodynamics are presented.

  12. A biased review of biases in Twitter studies on political collective action

    NASA Astrophysics Data System (ADS)

    Cihon, Peter; Yasseri, Taha

    2016-08-01

    In recent years researchers have gravitated to Twitter and other social media platforms as fertile ground for empirical analysis of social phenomena. Social media provides researchers access to trace data of interactions and discourse that once went unrecorded in the offline world. Researchers have sought to use these data to explain social phenomena both particular to social media and applicable to the broader social world. This paper offers a minireview of Twitter-based research on political crowd behaviour. This literature offers insight into particular social phenomena on Twitter, but often fails to use standardized methods that permit interpretation beyond individual studies. Moreover, the literature fails to ground methodologies and results in social or political theory, divorcing empirical research from the theory needed to interpret it. Rather, investigations focus primarily on methodological innovations for social media analyses, but these too often fail to sufficiently demonstrate the validity of such methodologies. This minireview considers a small number of selected papers; we analyse their (often lack of) theoretical approaches, review their methodological innovations, and offer suggestions as to the relevance of their results for political scientists and sociologists.

  13. Development of Risk Assessment Methodology for Land Application and Distribution and Marketing of Municipal Sludge

    EPA Science Inventory

    This is one of a series of reports that present methodologies for assessing the potential risks to humans or other organisms from the disposal or reuse of municipal sludge. The sludge management practices addressed by this series include land application practices, distribution a...

  14. Multilevel Modeling: A Review of Methodological Issues and Applications

    ERIC Educational Resources Information Center

    Dedrick, Robert F.; Ferron, John M.; Hess, Melinda R.; Hogarty, Kristine Y.; Kromrey, Jeffrey D.; Lang, Thomas R.; Niles, John D.; Lee, Reginald S.

    2009-01-01

    This study analyzed the reporting of multilevel modeling applications of a sample of 99 articles from 13 peer-reviewed journals in education and the social sciences. A checklist, derived from the methodological literature on multilevel modeling and focusing on the issues of model development and specification, data considerations, estimation, and…

  15. Validation of a physical anthropology methodology using mandibles for gender estimation in a Brazilian population

    PubMed Central

    CARVALHO, Suzana Papile Maciel; BRITO, Liz Magalhães; de PAIVA, Luiz Airton Saavedra; BICUDO, Lucilene Arilho Ribeiro; CROSATO, Edgard Michel; de OLIVEIRA, Rogério Nogueira

    2013-01-01

    Validation studies of physical anthropology methods in different population groups are extremely important, especially in cases in which population variations may cause problems in the identification of a native individual when norms developed for different communities are applied. Objective: This study aimed to estimate the gender of skeletons by application of the method of Oliveira et al. (1995), previously used in a population sample from Northeast Brazil. Material and Methods: The accuracy of this method was assessed for a population from Southeast Brazil and validated by statistical tests. The method used two mandibular measurements, namely the bigonial distance and the mandibular ramus height. The sample was composed of 66 skulls and the method was applied by two examiners. The results were statistically analyzed by the paired t test, logistic discriminant analysis and logistic regression. Results: The results demonstrated that the application of the method of Oliveira et al. (1995) in this population achieved very different outcomes between genders, with 100% accuracy for females and only 11% for males, which may be explained by ethnic differences. However, statistical adjustment of the measurement data for the population analyzed allowed an accuracy of 76.47% for males and 78.13% for females, with the creation of a new discriminant formula. Conclusion: It was concluded that methods involving physical anthropology present a high rate of accuracy for human identification, easy application, low cost and simplicity; however, the methodologies must be validated for different populations due to differences in ethnic patterns, which are directly related to phenotypic aspects. In this specific case, the method of Oliveira et al. (1995) presented good accuracy and may be used for gender estimation in two geographic regions of Brazil, namely the Northeast and Southeast; for other regions of the country (North, Central West and South), prior methodological adjustment is recommended, as demonstrated in this study. PMID:24037076

  16. Benefits of clean development mechanism application on the life cycle assessment perspective: a case study in the palm oil industry.

    PubMed

    Chuen, Onn Chiu; Yusoff, Sumiani

    2012-03-01

    This study assessed the benefits of applying the Clean Development Mechanism (CDM) to the waste treatment system of a local palm oil mill in Malaysia. A life cycle assessment (LCA) was conducted to assess the environmental impacts of the greenhouse gas (GHG) reduction from the CDM application. Calculations of the emission reduction used the methodology based on AM0022 (Avoided Wastewater and On-site Energy Use Emissions in the Industrial Sector) Version 4, published by the United Nations Framework Convention on Climate Change (UNFCCC). The results showed that introducing CDM in the palm oil mill, through conversion of the biogas captured from palm oil mill effluent (POME) treatment into power generation, was able to reduce emissions by approximately 0.12 tonnes of CO2 equivalent (tCO2e) and generate 30 kWh of power per tonne of fresh fruit bunch processed. Thus, the application of the CDM methodology to palm oil mill wastewater treatment was able to reduce up to a quarter of the overall environmental impact generated by the palm oil mill.

  17. Developing a stochastic conflict resolution model for urban runoff quality management: Application of info-gap and bargaining theories

    NASA Astrophysics Data System (ADS)

    Ghodsi, Seyed Hamed; Kerachian, Reza; Estalaki, Siamak Malakpour; Nikoo, Mohammad Reza; Zahmatkesh, Zahra

    2016-02-01

    In this paper, two multilateral, multi-issue, non-cooperative bargaining methodologies, one deterministic and one stochastic, are proposed for urban runoff quality management. In the proposed methodologies, a calibrated Storm Water Management Model (SWMM) is used to simulate stormwater runoff quantity and quality for different urban stormwater runoff management scenarios, which are defined in terms of several Low Impact Development (LID) techniques. In the deterministic methodology, the best management scenario, representing the location and area of LID controls, is identified using the bargaining model. In the stochastic methodology, uncertainties in some key parameters of SWMM are analyzed using info-gap theory. For each water quality management scenario, robustness and opportuneness criteria are determined based on the utility functions of the different stakeholders. Then, to find the best solution, the bargaining model is run considering a combination of the robustness and opportuneness criteria for each scenario, based on the utility function of each stakeholder. The results of applying the proposed methodology in the Velenjak urban watershed, located in the northeastern part of Tehran, the capital of Iran, illustrate its practical utility for conflict resolution in urban water quantity and quality management. It is shown that the solution obtained using the deterministic model cannot outperform the result of the stochastic model when the robustness and opportuneness criteria are considered. Therefore, it can be concluded that the stochastic model, which incorporates the main uncertainties, provides more reliable results.

  18. Additive Manufacturing in Production: A Study Case Applying Technical Requirements

    NASA Astrophysics Data System (ADS)

    Ituarte, Iñigo Flores; Coatanea, Eric; Salmi, Mika; Tuomi, Jukka; Partanen, Jouni

    Additive manufacturing (AM) is expanding manufacturing capabilities. However, the quality of AM-produced parts depends on a number of machine, geometry and process parameters, and the variability of these parameters affects manufacturing drastically. Standardized processes and harmonized methodologies therefore need to be developed to characterize the technology for end-use applications and enable the technology for manufacturing. This research proposes a composite methodology integrating Taguchi Design of Experiments, multi-objective optimization and statistical process control to optimize the manufacturing process and fulfil multiple requirements imposed on an arbitrary geometry. The proposed methodology aims to characterize AM technology as a function of the manufacturing process variables, and to perform a comparative assessment of three AM technologies (Selective Laser Sintering, Laser Stereolithography and PolyJet). Results indicate that only one machine, the laser-based stereolithography system, could simultaneously fulfil the macro- and micro-level geometrical requirements, but its mechanical properties were not at the required level. Future research will study a single AM system at a time to characterize the machine's technical capabilities and stimulate pre-normative initiatives for the technology in end-use applications.
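    The run-scoring step of a Taguchi study can be illustrated with the signal-to-noise ratio below ("smaller is better", as for dimensional error); the replicate measurements per machine are invented for the example.

    ```python
    # Taguchi "smaller is better" signal-to-noise ratio per experimental run.
    import numpy as np

    def sn_smaller_is_better(y):
        """S/N = -10 log10(mean(y^2)), e.g., for dimensional error in mm."""
        y = np.asarray(y, dtype=float)
        return -10.0 * np.log10(np.mean(y ** 2))

    runs = {                               # run label -> replicate errors (mm)
        "SLS  (layer 0.10)": [0.12, 0.15, 0.11],
        "SLA  (layer 0.05)": [0.05, 0.04, 0.06],
        "PolyJet (0.03)":    [0.07, 0.09, 0.08],
    }
    for label, y in runs.items():          # higher S/N = less deviation
        print(f"{label}: S/N = {sn_smaller_is_better(y):6.2f} dB")
    ```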

  19. Methodology for assessing laser-based equipment

    NASA Astrophysics Data System (ADS)

    Pelegrina-Bonilla, Gabriel; Hermsdorf, Jörg; Thombansen, Ulrich; Abels, Peter; Kaierle, Stefan; Neumann, Jörg

    2017-10-01

    Methodologies for assessing a technology's maturity are widely used in industry and research. Probably the best known are technology readiness levels (TRLs), initially pioneered by the National Aeronautics and Space Administration (NASA). At the beginning, only descriptively defined TRLs existed, but over time automated assessment techniques in the form of questionnaires emerged to determine TRLs. TRLs originally targeted equipment for space applications, but the demands on industrially relevant equipment are partly different, for example in terms of overall costs, product quantities, or the presence of competitors. We therefore present a generally valid methodology for assessing laser-based equipment for industrial use. The assessment is carried out with the help of a questionnaire, which provides a user-friendly and easily accessible way to monitor progress from the lab-proven state to the application-ready product throughout the complete development period. The assessment result is presented in a multidimensional metric in order to reveal the current strengths and weaknesses of the equipment development process, which can be used to steer the remaining development of the equipment in the right direction.

  20. Application of an integrated flight/propulsion control design methodology to a STOVL aircraft

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay; Mattern, Duane L.

    1991-01-01

    Results are presented from the application of an emerging Integrated Flight/Propulsion Control (IFPC) design methodology to a Short Take Off and Vertical Landing (STOVL) aircraft in transition flight. A previously designed centralized controller is first validated for the integrated airframe/engine plant used; this integrated plant is derived from a different model of the engine subsystem than the one used for the centralized controller design. The centralized controller is then partitioned in a decentralized, hierarchical structure comprising airframe lateral and longitudinal subcontrollers and an engine subcontroller. Command shaping prefilters from the pilot control effector inputs are then designed to provide the overall desired response to pilot command inputs, and time histories of the closed-loop IFPC system response to simulated pilot commands are compared with desired responses based on handling qualities requirements. Finally, the propulsion system safety and nonlinear limit protection logic is wrapped around the engine subcontroller, and the response of the closed-loop integrated system is evaluated for transients that encounter the propulsion surge margin limit.

  1. Implementation of an innovative teaching project in a Chemical Process Design course at the University of Cantabria, Spain

    NASA Astrophysics Data System (ADS)

    Galan, Berta; Muñoz, Iciar; Viguri, Javier R.

    2016-09-01

    This paper presents the planning, the teaching activities and the evaluation of the learning and teaching process implemented in the Chemical Process Design course at the University of Cantabria, Spain. Educational methods that address the knowledge, skills and attitudes that students completing the course are expected to acquire are proposed and discussed. Undergraduate and graduate engineers' perceptions of the methodology are evaluated by means of a questionnaire. The results of the teaching activities and the strengths and weaknesses of the proposed case study are discussed in relation to the course characteristics. The empirical evaluation shows that the most negative aspects were the excessive time students had to dedicate to the case-study project and having to deal with limited information, whereas the most positive aspects were an increase in students' self-confidence and the practical application of the methodology. Finally, improvements are discussed with a view to extending the methodology to other courses in the chemical engineering degree.

  2. Kansei, surfaces and perception engineering

    NASA Astrophysics Data System (ADS)

    Rosen, B.-G.; Eriksson, L.; Bergman, M.

    2016-09-01

    The aesthetic and pleasing properties of a product are important and add significantly to its meaning and relevance. Customer sensation and perception are largely about psychological factors. There has been strong industrial and academic interest in methods and tools that quantify and link product properties to the human response, but a lack of studies of the impact of surfaces. In this study, affective surface engineering is used to illustrate and model the link between customer expectations and perception and controllable product surface properties. The results highlight the use of the soft metrology concept for linking the physical and human factors that contribute to the perception of products. Examples of surface applications of the Kansei methodology from sauna bath, health care, architectural and hygiene tissue application areas are presented to illustrate, discuss and confirm the strength of the methodology. The study concludes by proposing future research in soft metrology to allow product perception and sensation to be understood and modelled, combined with further development of the Kansei surface engineering methodology and its software tools.

  3. Solution-adaptive finite element method in computational fracture mechanics

    NASA Technical Reports Server (NTRS)

    Min, J. B.; Bass, J. M.; Spradley, L. W.

    1993-01-01

    Some recent results obtained using a solution-adaptive finite element method on linear elastic, two-dimensional fracture mechanics problems are presented. The focus is on the basic issues of the adaptive finite element method, validating the application of the new methodology to fracture mechanics problems by computing demonstration problems and comparing the resulting stress intensity factors with analytical results.

  4. An Application of the Methodology for Assessment of the Sustainability of Air Transport System

    NASA Technical Reports Server (NTRS)

    Janic, Milan

    2003-01-01

    Assessing and operationalizing the concept of a sustainable air transport system is recognized as an important but complex research, operational and policy task. As part of the academic efforts to address the problem properly, this paper aims to assess the sustainability of the air transport system. In particular, the paper describes a methodology for assessing sustainability and its potential application. The methodology consists of indicator systems relating to the operational, economic, social and environmental dimensions of the air transport system's performance. The particular indicator systems are relevant to particular actors such as users (air travellers), air transport operators, aerospace manufacturers, local communities, governmental authorities at different levels (local, national, international), international air transport associations, pressure groups and the public. In applying the methodology, specific cases are selected to estimate the particular indicators and thus to assess the system's sustainability under given conditions.

  5. Application of gamma spectrometry in the Kola peninsula (in Russian)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Golovin, I.V.; Kolesnik, N.I.; Antipov, V.S.

    1973-01-01

    The methodology used and results obtained in gamma spectrometric studies of pre-Cambrian formations of some nickel-bearing regions of the Kola Peninsula are described. The radioactive element contents of typical metamorphic and magmatic complexes and sulfide ores are presented.

  6. 34 CFR 462.11 - What must an application contain?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the methodology and procedures used to measure the reliability of the test. (h) Construct validity... previous test, and results from validity, reliability, and equating or standard-setting studies undertaken... NRS educational functioning levels (content validity). Documentation of the extent to which the items...

  7. An Evolutionary Method for Financial Forecasting in Microscopic High-Speed Trading Environment.

    PubMed

    Huang, Chien-Feng; Li, Hsu-Chih

    2017-01-01

    The advancement of information technology in financial applications has led to fast, market-driven events that prompt flash decision-making and actions issued by computer algorithms. As a result, today's markets experience intense activity in a highly dynamic environment where trading systems respond to one another far faster than before. This new breed of technology involves the implementation of high-speed trading strategies that generate a significant portion of the activity in financial markets and present researchers with a wealth of information not available in traditional low-speed trading environments. In this study, we aim to develop feasible computational intelligence methodologies, particularly genetic algorithms (GA), to shed light on high-speed trading research using stock price data at the microscopic level. Our empirical results show that the proposed GA-based system significantly improves the accuracy of price movement prediction, and we expect this GA-based methodology to advance the current state of research in high-speed trading and other relevant financial applications.
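
    The following sketch shows the general shape of such a GA on synthetic data (the data generator, features and operators are invented for illustration; the paper's actual design is not reproduced here):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-in for microscopic price data: lagged returns as
    # features, next-tick direction as the label (weakly predictable).
    T, LAGS = 2000, 5
    r = rng.normal(0, 1, T)
    X = np.stack([r[i:T - LAGS + i] for i in range(LAGS)], axis=1)
    w_true = rng.normal(0, 1, LAGS)
    y = ((X @ w_true + rng.normal(0, 0.5, len(X))) > 0).astype(int)

    def fitness(w):
        # Directional accuracy of a linear threshold predictor.
        return ((X @ w > 0).astype(int) == y).mean()

    # Plain generational GA over real-valued weight vectors.
    POP, GEN, SIGMA = 40, 60, 0.1
    pop = rng.normal(0, 1, (POP, LAGS))
    for _ in range(GEN):
        fit = np.array([fitness(w) for w in pop])
        parents = pop[np.argsort(fit)[-POP // 2:]]          # truncation selection
        cuts = rng.integers(1, LAGS, POP // 2)
        kids = np.array([np.concatenate([a[:c], b[c:]])     # one-point crossover
                         for a, b, c in zip(parents,
                                            np.roll(parents, 1, axis=0), cuts)])
        kids += rng.normal(0, SIGMA, kids.shape)            # Gaussian mutation
        pop = np.vstack([parents, kids])

    print(f"best directional accuracy: {max(fitness(w) for w in pop):.3f}")
    ```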

  8. Indexing NASA programs for technology transfer methods development and feasibility

    NASA Technical Reports Server (NTRS)

    Clingman, W. H.

    1972-01-01

    This project was undertaken to evaluate the application of a previously developed indexing methodology to ongoing NASA programs. These programs are described by the NASA Program Approval Documents (PADs), each of which contains a technical plan for the area it covers. It was proposed that these could be used to generate an index to the complete NASA program. To test this hypothesis, two PADs were selected by the NASA Technology Utilization Office for trial indexing, and twenty-five individuals indexed them using NASA Thesaurus terms. The results demonstrated the feasibility of indexing ongoing NASA programs using PADs as the source of information; the same indexing methodology could be applied to other documents containing a brief description of a technical plan. The results also showed that over 85% of the concepts in the technology would be covered by the indexing, and that over 85% of the descriptors chosen would be accurate. This completeness and accuracy is considered satisfactory for application in technology transfer.

  9. An object-oriented approach for harmonization of multimedia markup languages

    NASA Astrophysics Data System (ADS)

    Chen, Yih-Feng; Kuo, May-Chen; Sun, Xiaoming; Kuo, C.-C. Jay

    2003-12-01

    An object-oriented methodology is proposed in this research to harmonize several different markup languages. First, we adopt the Unified Modelling Language (UML) as the data model to formalize the concept and process of harmonization between eXtensible Markup Language (XML) applications. We then design the Harmonization eXtensible Markup Language (HXML) based on the data model and formalize the transformation between the Document Type Definitions (DTDs) of the original XML applications and HXML. The transformation between instances is also discussed. We use the harmonization of SMIL and X3D as an example to demonstrate the proposed methodology, which can be generalized to various application domains.

  10. Applications of decision analysis and related techniques to industrial engineering problems at KSC

    NASA Technical Reports Server (NTRS)

    Evans, Gerald W.

    1995-01-01

    This report provides: (1) a discussion of the origination of decision analysis problems (well-structured problems) from ill-structured problems; (2) a review of the various methodologies and software packages for decision analysis and related problem areas; (3) a discussion of how the characteristics of a decision analysis problem affect the choice of modeling methodologies, thus providing a guide as to when to choose a particular methodology; and (4) examples of applications of decision analysis to particular problems encountered by the IE Group at KSC. With respect to the specific applications at KSC, particular emphasis is placed on the use of the Demos software package (Lumina Decision Systems, 1993).

  11. Candidate substances for space bioprocessing methodology and data specification for benefit evaluation

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Analytical and quantitative economic techniques are applied to the evaluation of the economic benefits of a wide range of substances for space bioprocessing. On the basis of expected clinical applications, as well as the size of the patient population that could be affected by those applications, eight substances are recommended for further benefit evaluation. Results show that a transitional probability methodology can be used to model at least one clinical application for each of these substances. In each recommended case, the disease and its therapy are sufficiently well understood and documented, and the statistical data are available to operate the model and produce estimates of the impact of new therapy systems on the cost of treatment, morbidity, and mortality. Using the morbidity and mortality information produced by the model, a standard economic technique called the Value of Human Capital is applied to estimate the social welfare benefits that could be attributable to the new therapy systems.

  12. Application of Bayesian and cost benefit risk analysis in water resources management

    NASA Astrophysics Data System (ADS)

    Varouchakis, E. A.; Palogos, I.; Karatzas, G. P.

    2016-03-01

    Decision making is a significant tool in water resources management applications. This technical note approaches a decision dilemma that has not previously been considered in the water resources management of a watershed. A common cost-benefit analysis approach, novel in the risk analysis of hydrologic/hydraulic applications, and a Bayesian decision analysis are applied to aid the decision on whether or not to construct a water reservoir for irrigation purposes. The alternative option examined applies a scaled parabolic fine that varies with over-pumping violations, in contrast to common practices that usually consider short-term fines. The methodological steps are presented analytically together with originally developed code; such an application, in this level of detail, is new. The results indicate that probability uncertainty is the driving issue determining the optimal decision under each methodology, and depending on how the unknown probability is handled, each methodology may lead to a different optimal decision. The proposed tool can thus help decision makers examine and compare different scenarios using two different approaches before making a decision, considering the cost of a hydrologic/hydraulic project and the varied economic charges that water table limit violations can cause inside an audit interval. In contrast to practices that assess the effect of each proposed action separately, considering only current knowledge of the issue under examination, this tool aids decision making by considering prior information and the sampling distribution of future successful audits.
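
    A stripped-down contrast of the two routes might look like the following (all monetary figures and distributions are invented; the note's own formulation and code are more elaborate):

    ```python
    import numpy as np

    # Hypothetical two-action decision: build the irrigation reservoir or not.
    # The uncertain event is a water-table limit violation inside an audit
    # interval; theta is its unknown probability.
    COST_BUILD = 5.0e6              # capital cost of the reservoir (made up)
    FINE = 2.0e7                    # economic charge if a violation occurs

    # Bayesian route: average the loss over a prior belief about theta.
    rng = np.random.default_rng(1)
    theta = rng.beta(3, 7, 100_000)         # prior with mean 0.3
    exp_loss_build = COST_BUILD             # reservoir assumed to prevent violations
    exp_loss_skip = (FINE * theta).mean()
    print("Bayesian decision:",
          "build" if exp_loss_build < exp_loss_skip else "skip")

    # Cost-benefit route with a single point estimate of theta, for contrast;
    # a different handling of the unknown probability can flip the decision.
    theta_hat = 0.2
    print("cost-benefit decision:",
          "build" if COST_BUILD < FINE * theta_hat else "skip")
    ```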

  13. Crossing trend analysis methodology and application for Turkish rainfall records

    NASA Astrophysics Data System (ADS)

    Şen, Zekâi

    2018-01-01

    Trend analyses are necessary tools for detecting a general increase or decrease in a given time series. There are many trend identification methodologies, such as the Mann-Kendall trend test, Spearman's tau, Sen's slope, the regression line, and Şen's innovative trend analysis, and the literature contains many papers on their use, pros and cons, and comparisons. In this paper, a completely new approach is proposed based on the crossing properties of a time series. It is suggested that the suitable trend through the centroid of the given time series is the one with the maximum number of crossings (the total number of up-crossings or down-crossings). This approach is applicable whether the time series has a dependent or independent structure, and it does not depend on the type of the probability distribution function. The validity of the method is demonstrated through an extensive Monte Carlo simulation and through comparison with existing trend identification methodologies. The methodology is applied to a set of annual daily extreme rainfall time series from different parts of Turkey, which have a physically independent structure.
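
    A minimal sketch of the crossing idea (synthetic data; the slope grid and centring are my own choices, not necessarily Şen's exact formulation):

    ```python
    import numpy as np

    def crossing_trend_slope(x, slopes=None):
        # Pass candidate trend lines through the series centroid and keep
        # the slope whose line the series crosses the most times.
        t = np.arange(len(x))
        tc, xc = t.mean(), x.mean()                 # centroid of the series
        if slopes is None:
            slopes = np.linspace(-1, 1, 401) * x.std()
        best_slope, best_crossings = 0.0, -1
        for s in slopes:
            resid = x - (xc + s * (t - tc))         # series minus candidate trend
            crossings = np.count_nonzero(np.diff(np.sign(resid)))
            if crossings > best_crossings:
                best_slope, best_crossings = s, crossings
        return best_slope, best_crossings

    rng = np.random.default_rng(42)
    series = 0.05 * np.arange(200) + rng.normal(0, 1, 200)  # synthetic rainfall proxy
    slope, n_cross = crossing_trend_slope(series)
    print(f"estimated trend slope: {slope:.3f} (crossings: {n_cross})")
    ```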

  14. Using an Android application to assess registration strategies in open hepatic procedures: a planning and simulation tool

    NASA Astrophysics Data System (ADS)

    Doss, Derek J.; Heiselman, Jon S.; Collins, Jarrod A.; Weis, Jared A.; Clements, Logan W.; Geevarghese, Sunil K.; Miga, Michael I.

    2017-03-01

    Sparse surface digitization with an optically tracked stylus, used in an organ surface-based image-to-physical registration, is an established approach in image-guided open liver surgery. However, variability in sparse data collections during open hepatic procedures can produce disparity in registration alignments. In part, this variability arises from inconsistencies in the patterns and fidelity of the collected intraoperative data: the liver lacks distinct landmarks, experiences considerable soft tissue deformation, and data coverage of the organ is often incomplete or unevenly distributed. While more robust feature-based registration methodologies have been developed for image-guided liver surgery, it is still unclear how variation in sparse intraoperative data affects registration. In this work, we have developed an application that allows surgeons to study the effect of surface digitization patterns on registration. Given the intrinsic nature of soft tissue, we incorporate realistic organ deformation when assessing the fidelity of a rigid registration methodology. We report the construction of the application and preliminary registration results from four participants. Our preliminary results indicate that registration quality improves as users acquire more experience selecting patterns of sparse intraoperative surface data.

  15. Risk assessment for construction projects of transport infrastructure objects

    NASA Astrophysics Data System (ADS)

    Titarenko, Boris

    2017-10-01

    The paper analyzes and compares different methods of risk assessment for construction projects of transport objects. Managing projects of this type demands special probabilistic methods because of the large level of uncertainty in their implementation; risk management in such projects requires probabilistic and statistical methods. The aim of the work is to develop a methodology for using traditional methods in combination with robust methods that yield reliable risk assessments. The robust approach is based on the principle of maximum likelihood and, in assessing risk, allows the researcher to obtain reliable results in situations of great uncertainty. Applying robust procedures makes it possible to quantify the main risk indicators of projects when solving the tasks of managing innovation-investment projects. Any competent specialist can calculate the damage from the onset of a risky event, but assessing the probability of a risky event occurring requires special probabilistic methods based on the proposed robust approaches. Practice shows the effectiveness and reliability of the results. The methodology developed in the article can be used to create information technologies and to apply them in automated control systems for complex projects.

  16. Railroad classification yard design methodology study Elkhart Yard Rehabilitation : a case study

    DOT National Transportation Integrated Search

    1980-02-01

    This interim report documents the application of a railroad classification yard design methodology to CONRAIL's Elkhart Yard Rehabilitation. This case study effort represents Phase 2 of a larger effort to develop a yard design methodology, and ...

  17. Experimental uncertainty and drag measurements in the national transonic facility

    NASA Technical Reports Server (NTRS)

    Batill, Stephen M.

    1994-01-01

    This report documents the results of a study conducted to establish a framework for the quantitative description of the uncertainty in measurements made in the National Transonic Facility (NTF). The importance of uncertainty analysis in both experiment planning and the reporting of results has grown significantly in the past few years. Various methodologies have been proposed, and the engineering community appears to be converging on certain accepted practices. The practical application of these methods to the complex wind tunnel testing environment at the NASA Langley Research Center was based on the terminology and methods established in the American National Standards Institute (ANSI) and American Society of Mechanical Engineers (ASME) standards. The report gives an overview of this methodology.
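
    For reference, the ANSI/ASME-style combination of bias and precision contributions that such a framework typically rests on takes the form below (generic notation assumed here, not necessarily the report's own symbols):

    ```latex
    % Combined uncertainty from a bias limit B and precision index S,
    % with t_{95} the Student-t coverage factor:
    U_{RSS} = \sqrt{B^{2} + \left(t_{95}\,S\right)^{2}}
    \qquad \text{or} \qquad
    U_{ADD} = B + t_{95}\,S
    ```

    where B is the bias (systematic) limit, S the precision index of the measurement, and t_{95} the Student-t factor for 95% coverage.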

  18. The cost of energy from utility-owned solar electric systems. A required revenue methodology for ERDA/EPRI evaluations

    NASA Technical Reports Server (NTRS)

    1976-01-01

    This methodology calculates the busbar cost of electric energy from a utility-owned solar electric system. The approach is applicable to both publicly and privately owned utilities. The busbar cost represents the minimum price per unit of energy consistent with producing system-resultant revenues equal to the sum of system-resultant costs. This equality is expressed in present value terms, where the discount rate used reflects the rate of return required on invested capital. The major input variables describe the output capabilities and capital cost of the energy system, the cash flows required for system operation and maintenance, and the financial structure and tax environment of the utility.
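
    In other words, the busbar cost is the levelized price c at which discounted revenues balance discounted costs; a sketch of the defining relation (generic notation inferred from the abstract, not quoted from the report):

    ```latex
    % Revenues equal costs in present-value terms, which yields c:
    \sum_{t=1}^{N} \frac{c\,E_{t}}{(1+r)^{t}} = \sum_{t=1}^{N} \frac{C_{t}}{(1+r)^{t}}
    \quad\Longrightarrow\quad
    c = \frac{\sum_{t=1}^{N} C_{t}\,(1+r)^{-t}}{\sum_{t=1}^{N} E_{t}\,(1+r)^{-t}}
    ```

    where E_t is the energy delivered in year t, C_t the system-resultant costs in year t (capital recovery, operation and maintenance, taxes), and r the required rate of return on invested capital.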

  19. Are research papers reporting results from nutrigenetics clinical research a potential source of biohype?

    PubMed

    Stenne, R; Hurlimann, T; Godard, Béatrice

    2012-01-01

    Nutrigenetics is a promising field, but the achievability of its expected benefits is challenged by the methodological limitations associated with clinical research in the field. The mere existence of these limitations suggests that promises about potential outcomes may be premature, and benefits claimed in scientific journal articles that do not acknowledge these limitations might stimulate biohype. This article examines whether nutrigenetics clinical research articles are a potential source of biohype. Of the 173 articles identified, 16 contained claims in which clinical applications were extrapolated from study results. Because their methodological limitations were incompletely acknowledged, these articles could potentially be a source of biohype.

  20. An evaluation of the quality of statistical design and analysis of published medical research: results from a systematic survey of general orthopaedic journals.

    PubMed

    Parsons, Nick R; Price, Charlotte L; Hiskens, Richard; Achten, Juul; Costa, Matthew L

    2012-04-25

    The application of statistics in reported research in trauma and orthopaedic surgery has become ever more important and complex. Despite the extensive use of statistical analysis, it is still a subject which is often not conceptually well understood, resulting in clear methodological flaws and inadequate reporting in many papers. A detailed statistical survey sampled 100 representative orthopaedic papers using a validated questionnaire that assessed the quality of the trial design and statistical analysis methods. The survey found evidence of failings in study design, statistical methodology and presentation of the results. Overall, in 17% (95% confidence interval; 10-26%) of the studies investigated the conclusions were not clearly justified by the results, in 39% (30-49%) of studies a different analysis should have been undertaken and in 17% (10-26%) a different analysis could have made a difference to the overall conclusions. It is only by an improved dialogue between statistician, clinician, reviewer and journal editor that the failings in design methodology and analysis highlighted by this survey can be addressed.

  1. Security Quality Requirements Engineering (SQUARE) Methodology

    DTIC Science & Technology

    2005-11-01

    such as Joint Application Development and the Accelerated Requirements Method [Wood 89, Hubbard 99]; Soft Systems Methodology [Checkland 89] ... investigated were misuse cases [Jacobson 92], Soft Systems Methodology (SSM) [Checkland 89], Quality Function Deployment (QFD) [QFD 05], Controlled ... html (2005). [Checkland 89] Checkland, Peter. Soft Systems Methodology. Rational Analysis for a Problematic World. New York, NY: John Wiley & Sons

  2. Radioactive waste disposal fees-Methodology for calculation

    NASA Astrophysics Data System (ADS)

    Bemš, Július; Králík, Tomáš; Kubančák, Ján; Vašíček, Jiří; Starý, Oldřich

    2014-11-01

    This paper summarizes the methodological approach used to calculate the fee for low- and intermediate-level radioactive waste disposal and for spent fuel disposal. The methodology is based on simulation of the cash flows related to the operation of the waste disposal system. The paper includes a demonstration of the methodology applied to the conditions of the Czech Republic.

  3. Alternative Models for Individualized Armor Training. Part I. Interim Report: Review and Analysis of the Literature

    DTIC Science & Technology

    1980-01-01

    ... for an individualized instructional context is provided by Giordono (1975), in his discussion of the design of a "non-lockstep" educational system ... state of ATI research, summarized the methodological and theoretical problems that may have inhibited the application of ATI findings to the design ... years. In contrast, systematic modifications based on results obtained through the application of appropriate experimental designs are desired and ...

  4. Application of activity-based costing (ABC) for a Peruvian NGO healthcare provider.

    PubMed

    Waters, H; Abdallah, H; Santillán, D

    2001-01-01

    This article describes the application of activity-based costing (ABC) to calculate the unit costs of services for a health care provider in Peru. While traditional costing allocates overhead and indirect costs in proportion to production volume or to direct costs, ABC assigns costs through activities within an organization. ABC uses personnel interviews to determine the principal activities and the distribution of individuals' time among them. Indirect costs are linked to services through time allocation and other tracing methods, and the result is a more accurate estimate of unit costs. The study concludes that applying ABC in a developing country setting is feasible, yielding results that are directly applicable to pricing and management. ABC determines costs for individual clinics, departments and services according to the activities that give rise to those costs, showing where an organization spends its money. With this information, it is possible to identify services that generate extra revenue and those operating at a loss, and to calculate cross-subsidies across services. ABC also highlights areas in the health care process where efficiency improvements are possible. Conclusions about the ultimate impact of the methodology are not drawn here, since the study was not repeated, and changes in utilization patterns and the addition of new clinics affected the applicability of the results. A potential constraint to implementing ABC is the availability and organization of cost information: applying ABC efficiently requires information to be readily available by cost category and department, since the greatest benefits of ABC come from frequent, systematic application of the methodology to monitor efficiency and provide feedback for management. The article concludes with a discussion of the potential applications of ABC in the health sector in developing countries.
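
    A toy version of the ABC tracing step (all figures invented) shows how interview-derived time shares drive unit costs:

    ```python
    # Minimal activity-based costing sketch (illustrative numbers only).
    # Indirect cost pools are traced to services via the share of staff
    # time spent on each service, as estimated from personnel interviews.
    indirect_costs = {"admin": 50_000.0, "facilities": 30_000.0}

    # Fraction of each cost pool consumed by each service (from interviews).
    time_shares = {
        "consultation": {"admin": 0.4, "facilities": 0.5},
        "lab_test":     {"admin": 0.6, "facilities": 0.5},
    }

    direct_costs = {"consultation": 80_000.0, "lab_test": 40_000.0}
    volumes = {"consultation": 10_000, "lab_test": 4_000}   # services delivered

    for service in direct_costs:
        traced = sum(indirect_costs[pool] * share
                     for pool, share in time_shares[service].items())
        unit_cost = (direct_costs[service] + traced) / volumes[service]
        print(f"{service}: unit cost = {unit_cost:.2f}")
    ```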

  5. Web-4D-QSAR: A web-based application to generate 4D-QSAR descriptors.

    PubMed

    Ataide Martins, João Paulo; Rougeth de Oliveira, Marco Antônio; Oliveira de Queiroz, Mário Sérgio

    2018-06-05

    A web-based application is developed to generate 4D-QSAR descriptors using the LQTA-QSAR methodology, based on molecular dynamics (MD) trajectories and topology information retrieved from the GROMACS package. The LQTAGrid module calculates the intermolecular interaction energies at each grid point, considering probes and all aligned conformations resulting from the MD simulations. These interaction energies are the independent variables, or descriptors, employed in a QSAR analysis. A friendly front-end web interface, built using the Django framework and the Python programming language, integrates all steps of the LQTA-QSAR methodology in a way that is transparent to the user; in the backend, GROMACS and LQTAGrid are executed to generate the 4D-QSAR descriptors used later in QSAR model building. © 2018 Wiley Periodicals, Inc.
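
    Conceptually, the grid step can be sketched as below (toy coordinates, charges and Lennard-Jones parameters; real LQTAGrid descriptors come from GROMACS force-field parameters and many aligned conformations):

    ```python
    import numpy as np

    # Toy version of the grid-probe idea: evaluate Lennard-Jones plus
    # Coulomb interaction energies between a probe and one aligned
    # conformation at every node of a rectangular grid.  All parameters
    # are illustrative, not force-field values.
    atoms_xyz = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])  # one conformation (nm)
    atom_q = np.array([-0.4, 0.4])                            # partial charges (e)
    probe_q, eps, sigma = 1.0, 0.2, 0.3

    axis = np.linspace(-2.0, 3.5, 12)
    grid = np.array([[x, y, z] for x in axis for y in axis for z in axis])

    d = np.linalg.norm(grid[:, None, :] - atoms_xyz[None, :, :], axis=2)
    d = np.clip(d, 0.1, None)               # avoid singularities at atom centres
    coulomb = (138.935 * probe_q * atom_q / d).sum(axis=1)  # GROMACS-style constant
    lj = (4 * eps * ((sigma / d) ** 12 - (sigma / d) ** 6)).sum(axis=1)

    descriptors = np.concatenate([coulomb, lj])   # one row of the 4D-QSAR matrix
    print(descriptors.shape)                      # (2 * 12**3,) descriptor values
    ```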

  6. Robust nonlinear variable selective control for networked systems

    NASA Astrophysics Data System (ADS)

    Rahmani, Behrooz

    2016-10-01

    This paper is concerned with the networked control of a class of uncertain nonlinear systems. Takagi-Sugeno (T-S) fuzzy modelling is used to extend the previously proposed variable selective control (VSC) methodology to nonlinear systems. This extension is based on decomposing the nonlinear system into a set of fuzzy-blended, locally linearised subsystems and applying the VSC methodology to each subsystem. To increase the applicability of the T-S approach to uncertain nonlinear networked control systems, this study considers asynchronous premise variables in the plant and the controller, and introduces a robust stability analysis and control synthesis. The resulting optimal switching-fuzzy controller provides a minimum guaranteed cost on an H2 performance index. Simulation studies on three nonlinear benchmark problems demonstrate the effectiveness of the proposed method.

  7. Segmentation of epidermal tissue with histopathological damage in images of haematoxylin and eosin stained human skin

    PubMed Central

    2014-01-01

    Background: Digital image analysis has the potential to address issues surrounding traditional histological techniques, including a lack of objectivity and high variability, through the application of quantitative analysis. A key initial step in image analysis is the identification of regions of interest, and a widely applied methodology is segmentation. This paper proposes the application of image analysis techniques to segment skin tissue with varying degrees of histopathological damage. The segmentation of human tissue is challenging as a consequence of the complexity of the tissue structures and inconsistencies in tissue preparation, hence there is a need for a new robust method capable of handling the additional challenges arising from histopathological damage. Methods: A new algorithm has been developed which combines enhanced colour information, created following a transformation to the L*a*b* colourspace, with general image intensity information. A colour normalisation step is included to enhance the algorithm's robustness to variations in the lighting and staining of the input images. The resulting optimised image is subjected to thresholding, and the segmentation is fine-tuned using a combination of morphological processing and object classification rules. The segmentation algorithm was tested on 40 digital images of haematoxylin and eosin (H&E) stained skin biopsies; accuracy, sensitivity and specificity were assessed by comparing the proposed methodology against manual methods. Results: Experimental results show the proposed fully automated methodology segments the epidermis with a mean specificity of 97.7%, a mean sensitivity of 89.4% and a mean accuracy of 96.5%. When a simple user interaction step is included, the specificity increases to 98.0%, the sensitivity to 91.0% and the accuracy to 96.8%. The algorithm segments effectively for different severities of tissue damage. Conclusions: Epidermal segmentation is a crucial first step in a range of applications, including melanoma detection and the assessment of histopathological damage in skin. The proposed methodology is able to segment the epidermis with different levels of histological damage, and the basic method framework could be applied to the segmentation of other epithelial tissues. PMID:24521154
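
    A condensed sketch of such a pipeline using scikit-image (it follows the stages named above but is not the authors' exact algorithm; the feature blend and thresholds are illustrative):

    ```python
    import numpy as np
    from skimage import color, filters, morphology

    def segment_epidermis(rgb):
        # Colour normalisation, L*a*b* enhancement, thresholding, then
        # morphological clean-up; a sketch under assumed parameters.
        img = rgb.astype(float) / 255.0
        img = (img - img.mean()) / (img.std() + 1e-8)   # crude stain/lighting normalisation
        img = (img - img.min()) / (img.max() - img.min())
        lab = color.rgb2lab(img)
        # Blend colour (a* channel) with inverted lightness as the feature.
        feature = lab[..., 1] + 0.5 * (1 - lab[..., 0] / 100.0)
        mask = feature > filters.threshold_otsu(feature)
        mask = morphology.remove_small_objects(mask, min_size=500)
        mask = morphology.binary_closing(mask, morphology.disk(3))
        return mask

    # usage: mask = segment_epidermis(skimage.io.imread("h_and_e_biopsy.png"))
    ```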

  8. Fostering Effective Leadership in Foreign Contexts through Study of Cultural Values

    ERIC Educational Resources Information Center

    Schenck, Andrew D.

    2016-01-01

    While leadership styles have been extensively examined, cultural biases implicit within research methodologies often preclude application of results in foreign contexts. To more holistically comprehend the impact of culture on leadership, belief systems were empirically correlated to both transactional and transformational tendencies in public…

  9. Design guidelines for an umbilical cord blood stem cell therapy quality assessment model

    NASA Astrophysics Data System (ADS)

    Januszewski, Witold S.; Michałek, Krzysztof; Yagensky, Oleksandr; Wardzińska, Marta

    The paper sets out the pivotal guidelines for producing an empirical umbilical cord blood stem cell therapy quality assessment model. The methodology adopted was a single-equation linear model with domain knowledge derived from the MEDAFAR classification. The resulting model is ready for therapeutic application.

  10. APPLICATION OF POLLUTION PREVENTION TECHNIQUES TO REDUCE INDOOR AIR EMISSIONS FROM AEROSOL CONSUMER PRODUCTS

    EPA Science Inventory

    The report gives results of a research project to develop tools and methodologies to measure aerosol chemical and particle dispersion through space. These tools can be used to devise pollution prevention strategies that could reduce occupant chemical exposures and guide manufactu...

  11. Evaluating Rigor in Qualitative Methodology and Research Dissemination

    ERIC Educational Resources Information Center

    Trainor, Audrey A.; Graue, Elizabeth

    2014-01-01

    Despite previous and successful attempts to outline general criteria for rigor, researchers in special education have debated the application of rigor criteria, the significance or importance of small n research, the purpose of interpretivist approaches, and the generalizability of qualitative empirical results. Adding to these complications, the…

  12. Application of the Hardman methodology to the Single Channel Ground-Airborne Radio System (SINCGARS)

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The HARDMAN methodology was applied to the various employment configurations of an emerging Army multipurpose communications system. The methodology was used to analyze the manpower, personnel and training (MPT) requirements, and associated costs, of the system concepts responsive to the Army's requirement for the Single Channel Ground-Airborne Radio System (SINCGARS). The scope of the application includes the analysis of two conceptual designs (Cincinnati Electronics and ITT Aerospace/Optical Division) for operating and maintenance support, addressed through the general support maintenance echelon.

  13. Digital Methodology to implement the ECOUTER engagement process.

    PubMed

    Wilson, Rebecca C; Butters, Oliver W; Clark, Tom; Minion, Joel; Turner, Andrew; Murtagh, Madeleine J

    2016-01-01

    ECOUTER (Employing COnceptUal schema for policy and Translation Engagement in Research; French for 'to listen') is a new stakeholder engagement method incorporating existing evidence to help participants draw upon their own knowledge of cognate issues and interact on a topic of shared concern. The results of an ECOUTER can form the basis of recommendations for research, governance, practice and/or policy. This paper describes the development of a digital methodology for the ECOUTER engagement process based on currently available mind-mapping freeware. The implementation of an ECOUTER process tailored to applications within health studies is outlined for both online and face-to-face scenarios. Limitations of the present digital methodology are discussed, highlighting the requirement for purpose-built software for ECOUTER research purposes.

  14. Ethics and Epistemology in Big Data Research.

    PubMed

    Lipworth, Wendy; Mason, Paul H; Kerridge, Ian; Ioannidis, John P A

    2017-12-01

    Biomedical innovation and translation are increasingly emphasizing research using "big data." The hope is that big data methods will both speed up research and make its results more applicable to "real-world" patients and health services. While big data research has been embraced by scientists, politicians, industry, and the public, numerous ethical, organizational, and technical/methodological concerns have also been raised. With respect to technical and methodological concerns, there is a view that these will be resolved through sophisticated information technologies, predictive algorithms, and data analysis techniques. While such advances will likely go some way towards resolving technical and methodological issues, we believe that the epistemological issues raised by big data research have important ethical implications and raise questions about the very possibility of big data research achieving its goals.

  15. Modeling, Analyzing, and Mitigating Dissonance Between Alerting Systems

    NASA Technical Reports Server (NTRS)

    Song, Lixia; Kuchar, James K.

    2003-01-01

    Alerting systems are becoming pervasive in process operations, which creates the potential for dissonance, or conflict, between alerting systems whose information suggests different threat levels and/or different actions to resolve hazards. Little is currently available to help predict or resolve this dissonance. This thesis presents a methodology for modeling and analyzing dissonance between alerting systems, providing both a theoretical foundation for understanding dissonance and a practical basis from which specific problems can be addressed. A generalized state-space representation of multiple-alerting-system operation is developed that can be tailored to a variety of applications. Based on this representation, two major causes of dissonance are identified: logic differences and sensor error. Several possible types of dissonance are also identified. A mathematical analysis method is developed to identify the conditions for dissonance originating from logic differences, and a probabilistic analysis methodology is developed to estimate the probability of dissonance originating from sensor error and to compare its relative contribution to dissonance against that of logic differences. A hybrid model describing the dynamic behavior of the process with multiple alerting systems is developed to identify the dangerous dissonance space from which the process can lead to disaster. Methodologies to avoid or mitigate dissonance are outlined. Two examples demonstrate the application of the methodology. First, in a conceptual in-trail spacing example, the methodology is applied to identify the conditions for possible dissonance, the relative contributions of logic difference and sensor error, and the dangerous dissonance space; several proposed mitigation methods are demonstrated. In the second example, the methodology is applied to the dissonance problem between two air traffic alert and avoidance systems: the existing Traffic Alert and Collision Avoidance System (TCAS) and the proposed Airborne Conflict Management system (ACM). Conditions on ACM resolution maneuvers are identified that avoid dynamic dissonance between TCAS and ACM. Also included in this report is an appendix, written by Lee Winder, about recent and continuing work on alerting system design; it discusses the application of Markov Decision Process (MDP) theory to complex alerting problems and illustrates it with an abstract example system.
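
    The flavor of the logic-difference versus sensor-error analysis can be seen in a toy one-dimensional example (thresholds and noise levels invented; the thesis' TCAS/ACM logics are far richer):

    ```python
    import numpy as np

    # Two alerting systems threshold the same hazard state x, but with
    # different logics and independent noisy sensors.
    rng = np.random.default_rng(7)

    def alert_a(x_meas):            # system A: lower alert threshold
        return x_meas > 1.0

    def alert_b(x_meas):            # system B: stricter threshold
        return x_meas > 1.2

    # Logic difference alone: dissonance wherever the true state falls
    # between the two thresholds (here, 1.0 < x <= 1.2).
    x_true = np.linspace(0, 2, 10_000)
    logic_dissonance = alert_a(x_true) != alert_b(x_true)
    print(f"dissonance region from logic alone: {logic_dissonance.mean():.1%} of states")

    # Sensor error contribution: each system measures x through its own
    # noisy sensor, so they can disagree even outside the logic-gap region.
    x0 = 0.9
    meas_a = x0 + rng.normal(0, 0.15, 100_000)
    meas_b = x0 + rng.normal(0, 0.15, 100_000)
    p = (alert_a(meas_a) != alert_b(meas_b)).mean()
    print(f"P(dissonance | x = {x0}, sensor noise) ~= {p:.3f}")
    ```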

  16. FY16 Progress Report on Test Results In Support Of Integrated EPP and SMT Design Methods Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yanli; Jetter, Robert I.; Sham, T. -L.

    2016-08-08

    The proposed integrated Elastic Perfectly-Plastic (EPP) and Simplified Model Test (SMT) methodology consists of incorporating an SMT data-based approach for creep-fatigue damage evaluation into the EPP methodology to avoid using the creep-fatigue interaction diagram (the D diagram) and to minimize over-conservatism while properly accounting for localized defects and stress risers. To support the implementation of the proposed code rules and to verify their applicability, a series of thermomechanical tests have been initiated. This report presents the recent test results for Type 2 SMT specimens on Alloy 617, pressurization SMT on Alloy 617, Type 1 SMT on Gr. 91, and two-bar thermal ratcheting tests on Alloy 617 with a new thermal loading profile.

  17. Robust detection-isolation-accommodation for sensor failures

    NASA Technical Reports Server (NTRS)

    Weiss, J. L.; Pattipati, K. R.; Willsky, A. S.; Eterno, J. S.; Crawford, J. T.

    1985-01-01

    The results of a one-year study are presented, with four aims: (1) develop a theory for robust failure detection and identification (FDI) in the presence of model uncertainty; (2) develop a design methodology which utilizes the robust FDI theory; (3) apply the methodology to a sensor FDI problem for the F-100 jet engine; and (4) demonstrate the application of the theory to the evaluation of alternative FDI schemes. Theoretical results in statistical discrimination are used to evaluate the robustness of residual signals (or parity relations) in terms of their usefulness for FDI, and optimally robust parity relations are derived through the optimization of robustness metrics. The result is viewed as a decentralization of the FDI process. A general structure for decentralized FDI is proposed, and robustness metrics are used to determine various parameters of the algorithm.

  18. The Effectiveness of Educational Technology Applications for Enhancing Mathematics Achievement in K-12 Classrooms: A Meta-Analysis

    ERIC Educational Resources Information Center

    Cheung, Alan C. K.; Slavin, Robert E.

    2013-01-01

    The present review examines research on the effects of educational technology applications on mathematics achievement in K-12 classrooms. Unlike previous reviews, this review applies consistent inclusion standards to focus on studies that met high methodological standards. In addition, methodological and substantive features of the studies are…

  19. Developing Applications of Artificial Intelligence Technology To Provide Consultative Support in the Use of Research Methodology by Practitioners.

    ERIC Educational Resources Information Center

    Vitale, Michael R.; Romance, Nancy

    Adopting perspectives based on applications of artificial intelligence proven in industry, this paper discusses methodological strategies and issues that underlie the development of such software environments. The general concept of an expert system is discussed in the context of its relevance to the problem of increasing the accessibility of…

  20. Classification of Computer-Aided Design-Computer-Aided Manufacturing Applications for the Reconstruction of Cranio-Maxillo-Facial Defects.

    PubMed

    Wauters, Lauri D J; Miguel-Moragas, Joan San; Mommaerts, Maurice Y

    2015-11-01

    To gain insight into the methodology of different computer-aided design-computer-aided manufacturing (CAD-CAM) applications for the reconstruction of cranio-maxillo-facial (CMF) defects, we reviewed and analyzed the available literature on CAD-CAM in CMF reconstruction. We proposed a classification system for the techniques used to design and manufacture implants and cutting, drilling, and/or guiding templates. The system consists of 4 classes (I-IV) that combine the techniques used for both the implant and the template so as to describe the methodology most accurately. Our classification system can be widely applied and should facilitate communication and immediate understanding of the methodology of CAD-CAM applications for the reconstruction of CMF defects.

  1. Improving the Method of Roof Fall Susceptibility Assessment based on Fuzzy Approach

    NASA Astrophysics Data System (ADS)

    Ghasemi, Ebrahim; Ataei, Mohammad; Shahriar, Kourosh

    2017-03-01

    Retreat mining is always accompanied by a great number of accidents, most of them due to roof falls. The development of methodologies to evaluate roof fall susceptibility (RFS) therefore seems essential. Ghasemi et al. (2012) proposed a systematic methodology for assessing roof fall risk during retreat mining based on the classic risk assessment approach. The main defect of this method is that it ignores subjective uncertainties arising from the linguistic input values of some factors, low resolution, fixed weighting, sharp class boundaries, etc. To remedy this defect and improve the method, this paper presents a novel methodology for assessing RFS using a fuzzy approach, which provides an effective tool for handling subjective uncertainties. Furthermore, the fuzzy analytical hierarchy process (AHP) is used to structure and prioritize the various risk factors and sub-factors in developing the method. The methodology is applied to identify the susceptibility to roof fall in the main panel of Tabas Central Mine (TCM), Iran; the results indicate that it is effective and efficient in assessing RFS.
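
    As an illustration of the kind of fuzzy aggregation involved (the factors, ratings and weights below are hypothetical, not TCM values), a triangular-fuzzy weighted score might be computed as:

    ```python
    # Sketch of a fuzzy roof-fall-susceptibility score: factor ratings and
    # AHP-style weights are triangular fuzzy numbers (l, m, u); the weighted
    # sum is defuzzified by its centroid.
    def tfn_mul(a, b):
        return tuple(x * y for x, y in zip(a, b))

    def tfn_add(a, b):
        return tuple(x + y for x, y in zip(a, b))

    def centroid(t):
        return sum(t) / 3.0

    # Hypothetical factors: (rating, weight) as triangular fuzzy numbers.
    factors = {
        "roof_quality":    ((0.6, 0.7, 0.8), (0.30, 0.35, 0.40)),
        "depth_of_cover":  ((0.4, 0.5, 0.6), (0.20, 0.25, 0.30)),
        "support_density": ((0.2, 0.3, 0.4), (0.30, 0.40, 0.50)),
    }

    score = (0.0, 0.0, 0.0)
    for rating, weight in factors.values():
        score = tfn_add(score, tfn_mul(rating, weight))

    print(f"fuzzy RFS = {score}, crisp RFS = {centroid(score):.3f}")
    ```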

  2. Integration of topological modification within the modeling of multi-physics systems: Application to a Pogo-stick

    NASA Astrophysics Data System (ADS)

    Abdeljabbar Kharrat, Nourhene; Plateaux, Régis; Miladi Chaabane, Mariem; Choley, Jean-Yves; Karra, Chafik; Haddar, Mohamed

    2018-05-01

    The present work tackles the modeling of multi-physics systems using a topological approach, proceeding with a new methodology based on a topological modification to the structure of the system, and compares it with Magos' methodology; their common ground is the use of connectivity within systems. The comparison and analysis of the different types of modeling show the importance of the topological methodology through the integration of the topological modification into the topological structure of a multi-physics system. To validate the methodology, the case of a pogo stick is studied. The first step consists of generating a topological graph of the system; the connectivity step then takes the contact with the ground into account. In the last step of this research, the MGS language (Modeling of General System) is used to model the system through equations. Finally, the results are compared with those obtained with MODELICA. The proposed methodology may therefore be generalized to model multi-physics systems that can be considered as a set of local elements.

  3. Stability Result For Dynamic Inversion Devised to Control Large Flexible Aircraft

    NASA Technical Reports Server (NTRS)

    Gregory, Irene M.

    2001-01-01

    High-performance aircraft of the future will be designed to be lighter and more maneuverable, and to operate over an ever-expanding flight envelope. From the flight control perspective, one of the largest differences between current and future advanced aircraft is elasticity. Over the last decade, the dynamic inversion methodology has gained considerable popularity in application to highly maneuverable fighter aircraft, which were treated as rigid vehicles. This paper is an initial attempt to establish global stability results for the dynamic inversion methodology as applied to a large, flexible aircraft. The work builds on a previous result for rigid fighter aircraft and adds a new level of complexity: the flexible aircraft dynamics, which cannot be ignored even in the most basic flight control. The results arise from observations of the control laws designed for a new generation of the High-Speed Civil Transport aircraft.

  4. Reverse Engineering and Security Evaluation of Commercial Tags for RFID-Based IoT Applications.

    PubMed

    Fernández-Caramés, Tiago M; Fraga-Lamas, Paula; Suárez-Albela, Manuel; Castedo, Luis

    2016-12-24

    The Internet of Things (IoT) is a distributed system of physical objects that requires the seamless integration of hardware (e.g., sensors, actuators, electronics) and network communications in order to collect and exchange data. IoT smart objects need to be somehow identified to determine the origin of the data and to automatically detect the elements around us. One of the best positioned technologies for performing identification is RFID (Radio Frequency Identification), which in recent years has gained considerable popularity in applications such as access control, payment cards and logistics. Despite its popularity, RFID security has not been properly handled in numerous applications. To foster security in such applications, this article includes three main contributions. First, in order to establish the basics, a detailed review of the most common flaws found in RFID-based IoT systems is provided, including the latest attacks described in the literature. Second, a novel methodology that eases the detection and mitigation of such flaws is presented. Third, the latest RFID security tools are analyzed, and the proposed methodology is applied through one of them (Proxmark 3) to validate it. The methodology is thus tested in different scenarios where tags are commonly used for identification; in such systems it was possible to clone transponders, extract information, and even emulate both tags and readers. It is therefore shown that the proposed methodology is useful for auditing security and reverse engineering RFID communications in IoT applications. It must be noted that, although this paper is aimed at fostering RFID communications security in IoT applications, the methodology can be applied to any RFID communications protocol.

  5. Reverse Engineering and Security Evaluation of Commercial Tags for RFID-Based IoT Applications

    PubMed Central

    Fernández-Caramés, Tiago M.; Fraga-Lamas, Paula; Suárez-Albela, Manuel; Castedo, Luis

    2016-01-01

    The Internet of Things (IoT) is a distributed system of physical objects that requires the seamless integration of hardware (e.g., sensors, actuators, electronics) and network communications in order to collect and exchange data. IoT smart objects need to be somehow identified to determine the origin of the data and to automatically detect the elements around us. One of the best positioned technologies for performing identification is RFID (Radio Frequency Identification), which in recent years has gained considerable popularity in applications such as access control, payment cards and logistics. Despite its popularity, RFID security has not been properly handled in numerous applications. To foster security in such applications, this article includes three main contributions. First, in order to establish the basics, a detailed review of the most common flaws found in RFID-based IoT systems is provided, including the latest attacks described in the literature. Second, a novel methodology that eases the detection and mitigation of such flaws is presented. Third, the latest RFID security tools are analyzed, and the proposed methodology is applied through one of them (Proxmark 3) to validate it. The methodology is thus tested in different scenarios where tags are commonly used for identification; in such systems it was possible to clone transponders, extract information, and even emulate both tags and readers. It is therefore shown that the proposed methodology is useful for auditing security and reverse engineering RFID communications in IoT applications. It must be noted that, although this paper is aimed at fostering RFID communications security in IoT applications, the methodology can be applied to any RFID communications protocol. PMID:28029119

  6. Development of flight experiment task requirements. Volume 1: Summary

    NASA Technical Reports Server (NTRS)

    Hatterick, G. R.

    1972-01-01

    A study was conducted to develop the means to identify skills required of scientist passengers on advanced missions related to the space shuttle and RAM programs. The scope of the study was defined to include only the activities of on-orbit personnel which are directly related to, or required by, on-orbit experimentation and scientific investigations conducted on or supported by the shuttle orbiter. A program summary is presented which provides a description of the methodology developed, an overview of the activities performed during the study, and the results obtained through application of the methodology.

  7. Injector element characterization methodology

    NASA Technical Reports Server (NTRS)

    Cox, George B., Jr.

    1988-01-01

    Characterization of liquid rocket engine injector elements is an important part of the development process for rocket engine combustion devices. Modern nonintrusive instrumentation for flow velocity and spray droplet size measurement, and automated, computer-controlled test facilities allow rapid, low-cost evaluation of injector element performance and behavior. Application of these methods in rocket engine development, paralleling their use in gas turbine engine development, will reduce rocket engine development cost and risk. The Alternate Turbopump (ATP) Hot Gas Systems (HGS) preburner injector elements were characterized using such methods, and the methodology and some of the results obtained will be shown.

  8. Constrained Stochastic Extended Redundancy Analysis.

    PubMed

    DeSarbo, Wayne S; Hwang, Heungsun; Stadler Blank, Ashley; Kappe, Eelco

    2015-06-01

    We devise a new statistical methodology called constrained stochastic extended redundancy analysis (CSERA) to examine the comparative impact of various conceptual factors, or drivers, as well as the specific predictor variables that contribute to each driver, on designated dependent variable(s). The technical details of the proposed methodology, the maximum likelihood estimation algorithm, and model selection heuristics are discussed. A sports marketing consumer psychology application is provided in a Major League Baseball (MLB) context, where the effects of six conceptual drivers of game attendance and their defining predictor variables are estimated. Results compare favorably to those obtained using traditional extended redundancy analysis (ERA).

  9. Application of low-cost methodologies for mobile phone app development.

    PubMed

    Zhang, Melvyn; Cheow, Enquan; Ho, Cyrus Sh; Ng, Beng Yeong; Ho, Roger; Cheok, Christopher Cheng Soon

    2014-12-09

    The usage of mobile phones and mobile phone apps has become markedly more prevalent in the past decade. Previous research has highlighted a method of using just an Internet browser and a text editor to create an app, but this does not eliminate the challenges faced by clinicians. More recently, two methodologies of app development have been shared, but there have been no disclosures about the costs involved, and limitations such as the distribution and dissemination of the apps have not been addressed. The aims of this article are: (1) to highlight a low-cost methodology that clinicians without technical knowledge can use to develop educational apps; (2) to clarify the costs involved in the development process; (3) to illustrate how limitations pertaining to dissemination can be addressed; and (4) to report initial utilization data for the apps and to share users' initial self-rated perceptions of them. We present two techniques for creating a mobile app using two well-established online mobile app building websites. The costs of development are specified, and the methodology for disseminating the apps is shared. The application of the low-cost methodologies in the creation of the "Mastering Psychiatry" app for undergraduates and the "Déjà vu" app for postgraduates is discussed. A questionnaire survey was administered to undergraduate students collating their perceptions of the app. Based on our analytics, the Mastering Psychiatry app has had a cumulative total of 722 users since inception, and the Déjà vu app a cumulative total of 154 downloads. The utilization data demonstrate the receptiveness towards these apps, reinforced by the positive perceptions undergraduate students (n=185) had of the low-cost self-developed apps. This is one of the few studies to demonstrate low-cost methodologies of app development, as well as student and trainee receptivity toward self-created web-based mobile phone apps. The results demonstrate that these web-based low-cost apps are applicable in real life, and suggest that the methodologies shared in this paper might benefit other specialities and disciplines.

  10. Application of Lean Healthcare methodology in a urology department of a tertiary hospital as a tool for improving efficiency.

    PubMed

    Boronat, F; Budia, A; Broseta, E; Ruiz-Cerdá, J L; Vivas-Consuelo, D

    To describe the application of the Lean methodology as a method for continuously improving the efficiency of a urology department in a tertiary hospital. The implementation of the Lean Healthcare methodology in a urology department was conducted in 3 phases: 1) team training and improvement of feedback among the practitioners, 2) management by process and superspecialisation, and 3) improvement of indicators (continuous improvement). The indicators were obtained from the hospital's information systems. The main source of information was the Balanced Scorecard for health systems management (CUIDISS). The comparison with other regional and national urology departments was performed through the same platform with the help of the hospital's records department (IASIST). A baseline was established with the indicators obtained in 2011 for the comparative analysis of the results after implementing the Lean Healthcare methodology. The implementation of this methodology translated into high practitioner satisfaction and improved quality indicators, reaching a risk-adjusted complication index (RACI) of 0.59 and a risk-adjusted mortality rate (RAMR) of 0.24 in 4 years. A value of 0.61 was reached with the efficiency indicator (risk-adjusted length of stay [RALOS] index), with a savings of 2869 stays compared with national benchmarking (IASIST). The risk-adjusted readmissions index (RARI) was the only indicator above the standard, with a value of 1.36, though it showed progressive annual improvement. The Lean methodology can be effectively applied to a urology department of a tertiary hospital to improve efficiency, obtaining significant and continuous improvements in all its indicators, as well as practitioner satisfaction. Team training, management by process, continuous improvement and delegation of responsibilities have been shown to be the fundamental pillars of this methodology.

  11. Large-Scale Networked Virtual Environments: Architecture and Applications

    ERIC Educational Resources Information Center

    Lamotte, Wim; Quax, Peter; Flerackers, Eddy

    2008-01-01

    Purpose: Scalability is an important research topic in the context of networked virtual environments (NVEs). This paper aims to describe the ALVIC (Architecture for Large-scale Virtual Interactive Communities) approach to NVE scalability. Design/methodology/approach: The setup and results from two case studies are shown: a 3-D learning environment…

  12. What Is the Probability You Are a Bayesian?

    ERIC Educational Resources Information Center

    Wulff, Shaun S.; Robinson, Timothy J.

    2014-01-01

    Bayesian methodology continues to be widely used in statistical applications. As a result, it is increasingly important to introduce students to Bayesian thinking at early stages in their mathematics and statistics education. While many students in upper level probability courses can recite the differences in the Frequentist and Bayesian…

  13. The Application of Operations Research Techniques to the Evaluation of Military Management Information Systems.

    DTIC Science & Technology

    systems such as management information systems. To provide a methodology yielding quantitative results which may assist a commander and his staff in...this analysis, it is proposed that management information systems be evaluated as a whole by a technique defined as the semantic differential. Each

  14. Assessing Sustainability When Data Availability Limits Real-Time Estimates: Using Near-Time Indicators to Extend Sustainability Metrics

    EPA Science Inventory

    We produced a scientifically defensible methodology to assess whether a regional system is on a sustainable path. The approach required readily available data, metrics applicable to the relevant scale, and results useful to decision makers. We initiated a pilot project to test ...

  15. Independent Evaluation of Heavy-Truck Safety Applications Based on Vehicle-to-Vehicle and Vehicle-to-Infrastructure Communications Used in the Safety Pilot Model Deployment

    DOT National Transportation Integrated Search

    2016-01-01

    This report presents the methodology and results of the independent evaluation of heavy trucks (HTs) in the Safety Pilot Model Deployment (SPMD); part of the United States Department of Transportation's Intelligent Transportation Systems research p...

  16. A framework for an alternatives assessment dashboard for evaluating chemical alternatives applied to flame retardants for electronic applications

    EPA Science Inventory

    The goal of alternatives assessment (AA) is to facilitate a comparison of alternatives to a chemical of concern, resulting in the identification of safer alternatives. A two-stage methodology for comparing chemical alternatives was developed. In the first stage, alternatives are ...

  17. Calibration Modeling Methodology to Optimize Performance for Low Range Applications

    NASA Technical Reports Server (NTRS)

    McCollum, Raymond A.; Commo, Sean A.; Parker, Peter A.

    2010-01-01

    Calibration is a vital process in characterizing the performance of an instrument in an application environment and seeks to obtain acceptable accuracy over the entire design range. Often, project requirements specify a maximum total measurement uncertainty, expressed as a percent of full scale. However, in some applications enhanced performance is sought at the low end of the range, so expressing the accuracy as a percent of reading should be considered as a modeling strategy. For example, it is common to desire to use a force balance in multiple facilities or regimes, often well below its designed full-scale capacity. This paper presents a general statistical methodology for optimizing calibration mathematical models based on a percent-of-reading accuracy requirement, which has broad application in all types of transducer applications where low-range performance is required. A case study illustrates the proposed methodology for the Mars Entry Atmospheric Data System, which employs seven strain-gage based pressure transducers mounted on the heatshield of the Mars Science Laboratory mission.
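
    One way to encode a percent-of-reading accuracy target in a calibration fit is to weight each calibration point by the inverse square of its reading, so the regression minimizes relative rather than absolute residuals. The sketch below illustrates that idea on synthetic data; it is a generic weighted-least-squares illustration under assumed values, not the paper's specific statistical methodology.

    ```python
    import numpy as np

    # Hypothetical calibration data: applied loads (truth) and raw output,
    # both expressed as a percent of full scale.
    applied = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0, 50.0, 100.0])
    raw     = np.array([0.52, 1.03, 2.02, 5.10, 10.1, 20.3, 50.4, 101.2])

    # Ordinary least squares minimizes absolute residuals, which favors the
    # high end of the range.  Weighting each point by 1/reading^2 instead
    # minimizes *relative* (percent-of-reading) residuals, boosting low-range fit.
    weights = 1.0 / applied**2

    # Fit a quadratic calibration model y = b0 + b1*x + b2*x^2 by weighted LS.
    X = np.vander(raw, 3, increasing=True)      # columns: 1, x, x^2
    W = np.diag(weights)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ applied)

    pred = X @ beta
    print("percent-of-reading residuals:", 100 * (pred - applied) / applied)
    ```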

  18. Application Of The Iberdrola Licensing Methodology To The Cofrentes BWR-6 110% Extended Power Up-rate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mata, Pedro; Fuente, Rafael de la; Iglesias, Javier

    Iberdrola (Spanish utility) and Iberdrola Ingenieria (engineering branch) have been developing the 110% Extended Power Up-rate Project (EPU 110%) for Cofrentes BWR-6 during the last two years. Iberdrola has an in-house design and licensing reload methodology that has been approved by the Spanish Nuclear Regulatory Authority. This methodology has already been used to perform the nuclear design and the reload licensing analysis for Cofrentes cycles 12 to 14. The methodology has also been applied to develop a significant number of safety analyses of the Cofrentes Extended Power Up-rate, including: Reactor Heat Balance, Core and Fuel Performance, Thermal Hydraulic Stability, ECCS LOCA Evaluation, Transient Analysis, Anticipated Transient Without Scram (ATWS) and Station Blackout (SBO). Since the scope of the licensing process of the Cofrentes Extended Power Up-rate exceeds the range of analysis included in the Cofrentes generic reload licensing process, it has been necessary to extend the applicability of the Cofrentes licensing methodology to the analysis of new transients. This is the case of the Total Loss of Feedwater (TLFW) transient. This paper shows the benefits of having an in-house design and licensing methodology, and describes the process of extending the applicability of the methodology to the analysis of new transients. The analysis of Total Loss of Feedwater with the Cofrentes Retran model is included as an example of this process.

  19. Time-resolved methods in biophysics. 9. Laser temperature-jump methods for investigating biomolecular dynamics.

    PubMed

    Kubelka, Jan

    2009-04-01

    Many important biochemical processes occur on the time-scales of nanoseconds and microseconds. The introduction of the laser temperature-jump (T-jump) to biophysics more than a decade ago opened these previously inaccessible time regimes up to direct experimental observation. Since then, laser T-jump methodology has evolved into one of the most versatile and generally applicable methods for studying fast biomolecular kinetics. This perspective is a review of the principles and applications of the laser T-jump technique in biophysics. A brief overview of the T-jump relaxation kinetics and the historical development of laser T-jump methodology is presented. The physical principles and practical experimental considerations that are important for the design of the laser T-jump experiments are summarized. These include the Raman conversion for generating heating pulses, considerations of size, duration and uniformity of the temperature jump, as well as potential adverse effects due to photo-acoustic waves, cavitation and thermal lensing, and their elimination. The laser T-jump apparatus developed at the NIH Laboratory of Chemical Physics is described in detail along with a brief survey of other laser T-jump designs in use today. Finally, applications of the laser T-jump in biophysics are reviewed, with an emphasis on the broad range of problems where the laser T-jump methodology has provided important new results and insights into the dynamics of the biomolecular processes.

  20. [Nursing methodology applicated in patients with pressure ulcers. Clinical report].

    PubMed

    Galvez Romero, Carmen

    2014-05-01

    The application of functional patterns lets us make a systematic and premeditated nursing assessment, through which we obtain a large amount of relevant patient data in an organized way, making it easier to analyse. In our case, we used Marjory Gordon's functional health patterns and the NANDA (North American Nursing Diagnosis Association), NOC (Nursing Outcomes Classification) and NIC (Nursing Intervention Classification) taxonomies. The overall objective of this paper is to present the experience of implementation and development of nursing methodology in the care of patients with pressure ulcers. This article reports the case of a 52-year-old woman who presented with necrosis of the phalanges in the upper and lower limbs and underwent amputations after being hospitalized in an Intensive Care Unit. She was discharged with pressure ulcers on both heels. GENERAL ASSESSMENT: The nursing theory known as "Gordon's functional health patterns" was implemented and the affected patterns were identified. The Second Pattern (Nutritional-Metabolic) was considered the reference, since this was the pattern which altered the rest. EVOLUTION OF THE PATIENT: The patient had a favourable evolution, improving in all the altered patterns. The infection symptoms disappeared and the pressure ulcers on both heels healed completely. The application of nursing methodology to the care of patients with pressure ulcers, using clinical practice guidelines, standardized procedures and rating scales of assessment, improves the evaluation of results and the performance of nurses.

  1. Automation Applications in an Advanced Air Traffic Management System : Volume 3. Methodology for Man-Machine Task Allocation

    DOT National Transportation Integrated Search

    1974-08-01

    Volume 3 describes the methodology for man-machine task allocation. It contains a description of man and machine performance capabilities and an explanation of the methodology employed to allocate tasks to human or automated resources. It also presen...

  2. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  3. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  4. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  5. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  6. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  7. A methodology for analyzing general categorical data with misclassification errors with an application in studying seat belt effectiveness

    DOT National Transportation Integrated Search

    1977-06-01

    Author's abstract: In this report, a methodology for analyzing general categorical data with misclassification errors is developed and applied to the study of seat belt effectiveness. The methodology assumes the availability of an original large samp...

  8. Predictive aging results in radiation environments

    NASA Astrophysics Data System (ADS)

    Gillen, Kenneth T.; Clough, Roger L.

    1993-06-01

    We have previously derived a time-temperature-dose rate superposition methodology, which, when applicable, can be used to predict polymer degradation versus dose rate, temperature and exposure time. This methodology results in predictive capabilities at the low dose rates and long time periods appropriate, for instance, to ambient nuclear power plant environments. The methodology was successfully applied to several polymeric cable materials and then verified for two of the materials by comparisons of the model predictions with 12-year, low-dose-rate aging data on these materials from a nuclear environment. In this paper, we provide a more detailed discussion of the methodology and apply it to data obtained on a number of additional nuclear power plant cable insulation (a hypalon, a silicone rubber and two ethylene-tetrafluoroethylenes) and jacket (a hypalon) materials. We then show that the predicted, low-dose-rate results for our materials are in excellent agreement with long-term (7-9 year) low-dose-rate results recently obtained for the same material types actually aged under nuclear power plant conditions. Based on a combination of the modelling and long-term results, we find indications of reasonably similar degradation responses among several different commercial formulations for each of the following "generic" materials: hypalon, ethylene-tetrafluoroethylene, silicone rubber and PVC. If such "generic" behavior can be further substantiated through modelling and long-term results on additional formulations, predictions of cable life for other commercial materials of the same generic types would be greatly facilitated.
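
    The superposition idea can be made concrete with the usual Arrhenius shift factor: degradation curves measured at different temperatures collapse onto a reference master curve after the time axis is scaled. The sketch below is a generic illustration with an assumed activation energy and temperatures, not the paper's fitted values.

    ```python
    import numpy as np

    R = 8.314        # gas constant, J/(mol K)
    Ea = 9.0e4       # activation energy -- an assumed value, not from the paper
    T_ref = 323.15   # reference temperature (50 C), K

    def shift_factor(T):
        """Arrhenius time-temperature shift factor a_T relative to T_ref."""
        return np.exp(Ea / R * (1.0 / T_ref - 1.0 / T))

    # 100 h of accelerated aging at 90 C corresponds to a_T times as many
    # hours of equivalent exposure at the 50 C reference condition.
    t_accel = 100.0
    print("equivalent hours at T_ref:", round(t_accel * shift_factor(363.15), 1))
    ```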

  9. Analysis of experts' perception of the effectiveness of teaching methods

    NASA Astrophysics Data System (ADS)

    Kindra, Gurprit S.

    1984-03-01

    The present study attempts to shed light on the perceptions of business educators regarding the effectiveness of six methodologies in achieving Gagné's five learning outcomes. Results of this study empirically confirm the oft-stated contention that no one method is globally effective for the attainment of all objectives. Specifically, business games, traditional lecture, and case study methods are perceived to be most effective for the learning of application, knowledge acquisition, and analysis and application, respectively.

  10. Optimally Robust Redundancy Relations for Failure Detection in Uncertain Systems,

    DTIC Science & Technology

    1983-04-01

    particular applications. While the general methods provide the basis for what in principle should be a widely applicable failure detection methodology...modifications to this result which overcome them at no fundamental increase in complexity. 4.1 Scaling: A critical problem with the criteria of the preceding...criterion which takes scaling into account. As in (38), we can multiply the C_i by positive scalars to take into account unequal weightings on

  11. Analytical group decision making in natural resources: Methodology and application

    USGS Publications Warehouse

    Schmoldt, D.L.; Peterson, D.L.

    2000-01-01

    Group decision making is becoming increasingly important in natural resource management and associated scientific applications, because multiple values are treated coincidentally in time and space, multiple resource specialists are needed, and multiple stakeholders must be included in the decision process. Decades of social science research on decision making in groups have provided insights into the impediments to effective group processes and on techniques that can be applied in a group context. Nevertheless, little integration and few applications of these results have occurred in resource management decision processes, where formal groups are integral, either directly or indirectly. A group decision-making methodology is introduced as an effective approach for temporary, formal groups (e.g., workshops). It combines the following three components: (1) brainstorming to generate ideas; (2) the analytic hierarchy process to produce judgments, manage conflict, enable consensus, and plan for implementation; and (3) a discussion template (straw document). Resulting numerical assessments of alternative decision priorities can be analyzed statistically to indicate where group member agreement occurs and where priority values are significantly different. An application of this group process to fire research program development in a workshop setting indicates that the process helps focus group deliberations; mitigates groupthink, nondecision, and social loafing pitfalls; encourages individual interaction; identifies irrational judgments; and provides a large amount of useful quantitative information about group preferences. This approach can help facilitate scientific assessments and other decision-making processes in resource management.
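
    As a sketch of the analytic hierarchy process step in such a workshop, the snippet below derives priority weights from a pairwise comparison matrix via the principal eigenvector and computes Saaty's consistency ratio, which can flag the irrational judgments the abstract mentions. The matrix entries are hypothetical.

    ```python
    import numpy as np

    # Hypothetical pairwise comparison matrix for three decision alternatives,
    # on Saaty's 1-9 scale: A[i, j] = how strongly alternative i is preferred to j.
    A = np.array([
        [1.0, 3.0, 5.0],
        [1/3, 1.0, 2.0],
        [1/5, 1/2, 1.0],
    ])

    # AHP priorities: the principal right eigenvector of A, normalized to sum to 1.
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()

    # Consistency index CI = (lambda_max - n)/(n - 1), divided by a random index.
    n = A.shape[0]
    CI = (eigvals[k].real - n) / (n - 1)
    RI = 0.58                      # Saaty's random index for n = 3
    print("priorities:", w.round(3), " CR =", round(CI / RI, 3))
    ```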

  12. Structural Technology and Analysis Program (STAP) Delivery Order 0004: Durability Patch

    NASA Astrophysics Data System (ADS)

    Ikegami, Roy; Haugse, Eric; Trego, Angela; Rogers, Lynn; Maly, Joe

    2001-06-01

    Structural cracks in secondary structure, resulting from a high-cycle fatigue (HCF) environment, are often referred to as nuisance cracks. This type of damage can result in costly inspections and repair. The repairs often do not last long because the repaired structure continues to respond in a resonant fashion to the environment. Although the use of materials for passive damping applications is well understood, there are few applications to high-cycle fatigue problems. This is because the design information (characterization temperature, resonant response frequency, and strain levels) is difficult to determine. The Durability Patch and Damage Dosimeter Program addressed these problems by: (1) developing a damped repair design process which includes a methodology for designing the material and application characteristics required to optimally damp the repair; and (2) designing and developing a rugged, small, and lightweight data acquisition unit called the damage dosimeter. This is a battery-operated, single-board computer capable of collecting three channels of strain and one channel of temperature, processing this data with user-developed algorithms written in the C programming language, and storing the processed data in resident memory. The dosimeter is used to provide the flight data needed to characterize the vibration environment. The vibration environment is then used to design the damping material characteristics and repair. The repair design methodology and dosimeter were demonstrated on B-52, C-130, and F-15 aircraft applications.

  13. UAV-Based Photogrammetry and Integrated Technologies for Architectural Applications--Methodological Strategies for the After-Quake Survey of Vertical Structures in Mantua (Italy).

    PubMed

    Achille, Cristiana; Adami, Andrea; Chiarini, Silvia; Cremonesi, Stefano; Fassi, Francesco; Fregonese, Luigi; Taffurelli, Laura

    2015-06-30

    This paper examines the survey of tall buildings in an emergency context, such as post-seismic events. The after-earthquake survey has to guarantee time savings, high precision and security during the operational stages. The main goal is to optimize the application of methodologies based on the acquisition and automatic elaboration of photogrammetric data, including the use of Unmanned Aerial Vehicle (UAV) systems, in order to provide fast and low-cost operations. The suggested methods integrate new technologies with commonly used technologies like TLS and topographic acquisition. The value of the photogrammetric application is demonstrated by a test case based on a comparison of acquisition, calibration and 3D modeling results when using a laser scanner, a metric camera and an amateur reflex camera. The test helps demonstrate the efficiency of image-based methods in the acquisition of complex architecture. The case study is the Santa Barbara bell tower in Mantua. The applied survey solution allows a complete 3D database of the complex architectural structure to be obtained for the extraction of all the information needed for significant intervention. This demonstrates the applicability of photogrammetry using UAVs for the survey of vertical structures, complex buildings and architectural parts that are difficult to access, providing high-precision results.

  14. A web based health technology assessment in tele-echocardiography: the experience within an Italian project.

    PubMed

    Giansanti, Daniele; Morelli, Sandra; Maccioni, Giovanni; Guerriero, Lorenzo; Bedini, Remo; Pepe, Gennaro; Colombo, Cesare; Borghi, Gabriella; Macellari, Velio

    2009-01-01

    Due to major advances in information technology, telemedicine applications are ready for widespread use. Nonetheless, to allow their diffusion in National Health Care Systems (NHCSs), specific methodologies of health technology assessment (HTA) should be used to assess standardization, overall quality, interoperability, and legal, economic and cost-benefit aspects. One of the limits to the diffusion of digital tele-echocardiography (T-E) applications in the NHCS is the lack of a specific methodology for HTA. In the present study, a solution offering a structured HTA of T-E products was designed. The methodology also assured the definition of standardized quality levels for the application. The first level represents the minimum level of acceptance; the other levels are accessory levels useful for a more accurate assessment of the product. The methodology proved useful in rationalizing the process of standardization and received a high degree of acceptance from the subjects involved in the study.

  15. Empirical Distributional Semantics: Methods and Biomedical Applications

    PubMed Central

    Cohen, Trevor; Widdows, Dominic

    2009-01-01

    Over the past fifteen years, a range of methods have been developed that are able to learn human-like estimates of the semantic relatedness between terms from the way in which these terms are distributed in a corpus of unannotated natural language text. These methods have also been evaluated in a number of applications in the cognitive science, computational linguistics and information retrieval literatures. In this paper, we review the available methodologies for deriving semantic relatedness from free text, as well as their evaluation in a variety of biomedical and other applications. Recent methodological developments, and their applicability to several existing applications, are also discussed. PMID:19232399
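
    A minimal sketch of the distributional idea: build a term co-occurrence matrix from unannotated text with a sliding window, then estimate semantic relatedness as the cosine between term vectors. The toy corpus and window size are illustrative assumptions; real systems use large corpora and refinements such as dimensionality reduction.

    ```python
    import numpy as np

    # Toy corpus; in practice these methods use large unannotated collections.
    corpus = ("the patient received insulin . the patient received aspirin . "
              "insulin lowers glucose . aspirin lowers fever").split()

    vocab = sorted(set(corpus) - {"."})
    index = {w: i for i, w in enumerate(vocab)}

    # Term-by-term co-occurrence counts within a +/-2 word sliding window.
    M = np.zeros((len(vocab), len(vocab)))
    for i, w in enumerate(corpus):
        if w == ".":
            continue
        for j in range(max(0, i - 2), min(len(corpus), i + 3)):
            c = corpus[j]
            if j != i and c != ".":
                M[index[w], index[c]] += 1

    def relatedness(a, b):
        """Cosine of the angle between two distributional vectors."""
        u, v = M[index[a]], M[index[b]]
        return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

    print(round(relatedness("insulin", "aspirin"), 3))
    ```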

  16. Strategic Analysis Overview

    NASA Technical Reports Server (NTRS)

    Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe

    2008-01-01

    NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks between all three components. Strategic analysis supports strategic decision making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high-level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper presents an overview of the strategic analysis methodology and sample results from its application to the Constellation Program lunar architecture.

  17. Development and application of a hybrid transport methodology for active interrogation systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Royston, K.; Walters, W.; Haghighat, A.

    A hybrid Monte Carlo and deterministic methodology has been developed for application to active interrogation systems. The methodology consists of four steps: i) neutron flux distribution due to neutron source transport and subcritical multiplication; ii) generation of a gamma source distribution from (n, γ) interactions; iii) determination of the gamma current at a detector window; iv) detection of gammas by the detector. This paper discusses the theory and results of the first three steps for the case of a cargo container with a sphere of HEU in third-density water cargo. To complete the first step, a response-function formulation has been developed to calculate the subcritical multiplication and neutron flux distribution. Response coefficients are pre-calculated using the MCNP5 Monte Carlo code. The second step uses the calculated neutron flux distribution and Bugle-96 (n, γ) cross sections to find the resulting gamma source distribution. In the third step the gamma source distribution is coupled with a pre-calculated adjoint function to determine the gamma current at a detector window. The AIMS (Active Interrogation for Monitoring Special-Nuclear-Materials) software has been written to output the gamma current for a source-detector assembly scanning across a cargo container using the pre-calculated values, taking significantly less time than a reference MCNP5 calculation.
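
    The economy of step iii comes from the adjoint formulation: once an adjoint (importance) function has been pre-calculated for a detector position, the gamma current at the detector window reduces to an inner product with whatever gamma source distribution step ii produces, so a scan across the container reuses precomputed values. A minimal sketch with hypothetical arrays, not the AIMS implementation:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical discretized cargo volume: gamma source strength per voxel
    # (step ii would supply this as an (n, gamma) reaction-rate map) ...
    gamma_src = rng.random((20, 20, 20))

    # ... and a pre-calculated adjoint ("importance") function for one
    # source-detector position: contribution of a unit source in each voxel
    # to the gamma current at the detector window (decaying with depth here).
    adjoint = np.exp(-0.3 * np.arange(20))[None, None, :] * np.ones((20, 20, 20))

    # Step iii reduces to an inner product, so scanning the assembly across
    # the container is cheap once the adjoint is precomputed.
    current = np.sum(gamma_src * adjoint)
    print("gamma current at detector window:", current)
    ```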

  18. Incorporating social network effects into cost-effectiveness analysis: a methodological contribution with application to obesity prevention

    PubMed Central

    Konchak, Chad; Prasad, Kislaya

    2012-01-01

    Objectives: To develop a methodology for integrating social networks into traditional cost-effectiveness analysis (CEA) studies. This will facilitate the economic evaluation of treatment policies in settings where health outcomes are subject to social influence. Design: This is a simulation study based on a Markov model. The lifetime health histories of a cohort are simulated, and health outcomes compared, under alternative treatment policies. Transition probabilities depend on the health of others with whom there are shared social ties. Setting: The methodology developed is shown to be applicable in any healthcare setting where social ties affect health outcomes. The example of obesity prevention is used for illustration under the assumption that weight changes are subject to social influence. Main outcome measures: Incremental cost-effectiveness ratio (ICER). Results: When social influence increases, treatment policies become more cost effective (have lower ICERs). The policy of only treating individuals who span multiple networks can be more cost effective than the policy of treating everyone. This occurs when the network is more fragmented. Conclusions: (1) When network effects are accounted for, they result in very different values of incremental cost-effectiveness ratios (ICERs). (2) Treatment policies can be devised to take network structure into account. The integration makes it feasible to conduct a cost-benefit evaluation of such policies. PMID:23117559
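
    A minimal sketch of the integration: a cohort simulation in which each person's transition probability into the unhealthy state grows with the fraction of unhealthy social ties, run once per treatment policy to form an ICER. The ring network, transition parameters, costs, and utilities are hypothetical placeholders, not the paper's calibrated model.

    ```python
    import numpy as np

    N, YEARS = 200, 20
    # Hypothetical ring network: each person's ties are the two adjacent ids.
    ties = [((i - 1) % N, (i + 1) % N) for i in range(N)]

    def simulate(treated, influence=0.10, base_p=0.05, treat_effect=0.5):
        """Markov-style cohort; state True = unhealthy (e.g., obese).
        Transition probability grows with the fraction of unhealthy ties."""
        rng = np.random.default_rng(1)              # common random numbers
        state = rng.random(N) < 0.2
        cost = qalys = 0.0
        for _ in range(YEARS):
            frac = np.array([state[list(t)].mean() for t in ties])
            p = base_p + influence * frac
            p = np.where(treated, p * treat_effect, p)  # treatment damps transitions
            state = state | (rng.random(N) < p)
            qalys += (~state).sum() * 1.0 + state.sum() * 0.8
            cost += treated.sum() * 100.0 + state.sum() * 500.0
        return cost, qalys

    c0, q0 = simulate(np.zeros(N, bool))            # treat no one
    c1, q1 = simulate(np.ones(N, bool))             # treat everyone
    print("ICER:", round((c1 - c0) / (q1 - q0), 1), "per QALY gained")
    ```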

  19. Assessing reservoir operations risk under climate change

    USGS Publications Warehouse

    Brekke, L.D.; Maurer, E.P.; Anderson, J.D.; Dettinger, M.D.; Townsley, E.S.; Harrison, A.; Pruitt, T.

    2009-01-01

    Risk-based planning offers a robust way to identify strategies that permit adaptive water resources management under climate change. This paper presents a flexible methodology for conducting climate change risk assessments involving reservoir operations. Decision makers can apply this methodology to their systems by selecting future periods and risk metrics relevant to their planning questions and by collectively evaluating system impacts relative to an ensemble of climate projection scenarios (weighted or not). This paper shows multiple applications of this methodology in a case study involving California's Central Valley Project and State Water Project systems. Multiple applications were conducted to show how choices made in conducting the risk assessment, choices known as analytical design decisions, can affect assessed risk. Specifically, risk was reanalyzed for every choice combination of two design decisions: (1) whether to assume climate change will influence flood-control constraints on water supply operations (and how), and (2) whether to weight climate change scenarios (and how). Results show that assessed risk would motivate different planning pathways depending on decision-maker attitudes toward risk (e.g., risk neutral versus risk averse). Results also show that assessed risk at a given risk attitude is sensitive to the analytical design choices listed above, with the choice of whether to adjust flood-control rules under climate change having considerably more influence than the choice on whether to weight climate scenarios.
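
    The core risk-metric computation is simple once an ensemble of climate-projection outcomes is in hand: a (possibly weighted) exceedance probability for a decision-relevant threshold. A sketch with hypothetical numbers, not the case study's data:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Hypothetical end-of-September storage (% of capacity) for one future
    # period, one value per member of a climate-projection ensemble.
    storage = rng.normal(60.0, 15.0, size=112)

    # Scenario weights (e.g., reflecting projection credibility); a uniform
    # vector reproduces the unweighted assessment.
    weights = np.full(storage.size, 1.0 / storage.size)

    # Risk metric: weighted probability that storage falls below a
    # decision-relevant 40%-of-capacity threshold.
    risk = float(np.sum(weights * (storage < 40.0)))
    print("assessed shortfall risk:", round(risk, 3))
    ```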

  20. Validated analytical methodology for the simultaneous determination of a wide range of pesticides in human blood using GC-MS/MS and LC-ESI/MS/MS and its application in two poisoning cases.

    PubMed

    Luzardo, Octavio P; Almeida-González, Maira; Ruiz-Suárez, Norberto; Zumbado, Manuel; Henríquez-Hernández, Luis A; Meilán, María José; Camacho, María; Boada, Luis D

    2015-09-01

    Pesticides are frequently responsible for human poisoning, and the information on the substance involved is often lacking. The great variety of pesticides that could be responsible for intoxication makes it necessary to develop powerful and versatile analytical methodologies that allow the identification of the unknown toxic substance. Here we developed a methodology for the simultaneous identification and quantification in human blood of 109 highly toxic pesticides. The application of this analytical scheme helps minimize the cost of this type of chemical identification while maximizing the chances of identifying the pesticide involved. In the methodology we present here, we use a liquid-liquid extraction, followed by a single purification step, and quantitation of analytes by a combination of liquid and gas chromatography, both coupled to triple quadrupole mass spectrometry operated in multiple reaction monitoring mode. The methodology has been fully validated, and its applicability has been demonstrated in two recent cases involving one self-poisoning fatality and one non-fatal homicidal attempt.

  1. Methodology of development and students' perceptions of a psychiatry educational smartphone application.

    PubMed

    Zhang, Melvyn W B; Ho, Cyrus S H; Ho, Roger C M

    2014-01-01

    The usage of smartphones and smartphone applications in the recent decade has become more prevalent. Previous research has highlighted a lack of critical appraisal of new applications, and has highlighted a method of using just the Internet browser and a text editor to create an application, but this does not eliminate the challenges faced by clinicians. In addition, even though there has been a high rate of smartphone application usage and acceptance, it remains costly for clinicians and their centers to develop smartphone applications catered to their needs and helpful for their daily educational requirements. The objectives of the current research are thus to highlight a cost-effective methodology for developing interactive educational smartphone applications, and to determine whether medical students are receptive towards having smartphone applications and their perspectives on the contents within. In this study, we elaborate how the Mastering Psychiatry online portal and Web-based mobile application were developed using HTML5 as the core programming language. The online portal and Web-based application were launched in July 2012 and usage data were obtained. Subsequently, a native application was developed, funded by an educational grant, and students were recruited after their end-of-posting clinical examination to fill in a survey questionnaire on their perspectives. Our initial analytics showed that, since inception, the online portal has received a total of 15,803 views, with a total of 2,109 copies of the online textbook downloaded. As for the online videos, 5,895 viewers have watched the training videos from start to finish, and 722 users have accessed the mobile textbook application. A total of 185 students participated in the perspectives survey, with the majority having positive perspectives on the implementation of a smartphone application in psychiatry. This is one of the few studies to describe how an educational application can be developed using a simple and cost-effective methodology, and it also demonstrates students' perspectives towards smartphones in psychiatric education. Our methods might apply to future research involving the use of technology in education.

  2. The application of Lean Six Sigma methodology to reduce the risk of healthcare-associated infections in surgery departments.

    PubMed

    Montella, Emma; Di Cicco, Maria Vincenza; Ferraro, Anna; Centobelli, Piera; Raiola, Eliana; Triassi, Maria; Improta, Giovanni

    2017-06-01

    Nowadays, the monitoring and prevention of healthcare-associated infections (HAIs) is a priority for the healthcare sector. In this article, we report on the application of the Lean Six Sigma (LSS) methodology to reduce the number of patients affected by sentinel bacterial infections who are at risk of HAI. The LSS methodology was applied in the general surgery department by a multidisciplinary team of both physicians and academics. Data on more than 20,000 patients who underwent a wide range of surgical procedures between January 2011 and December 2014 were collected to conduct the study using the departmental information system. The most prevalent sentinel bacteria were determined among the infected patients. The preintervention (January 2011 to December 2012) and postintervention (January 2013 to December 2014) phases were compared to analyze the effects of the methodology implemented. The methodology allowed the identification of variables that influenced the risk of HAIs and the implementation of corrective actions to improve the care process, thereby reducing the percentage of infected patients. The improved process resulted in a 20% reduction in the average number of hospitalization days between the preintervention and control phases; the mean (SD) number of hospitalization days decreased to 36 (15.68), with the data distributed around 3σ. LSS is a helpful strategy that ensures a significant decrease in the number of HAIs in patients undergoing surgical interventions. The implementation of this intervention in the general surgery department resulted in a significant reduction in both the number of hospitalization days and the number of patients affected by HAIs. This approach, together with other tools for reducing the risk of infection (surveillance, epidemiological guidelines, and training of healthcare personnel), could be applied to redesign and improve a wide range of healthcare processes.

  3. Application of the Spanish methodological approach for biosphere assessment to a generic high-level waste disposal site.

    PubMed

    Agüero, A; Pinedo, P; Simón, I; Cancio, D; Moraleda, M; Trueba, C; Pérez-Sánchez, D

    2008-09-15

    A methodological approach, including conceptual developments, methodological aspects and software tools, has been developed in the Spanish context, based on the BIOMASS "Reference Biospheres Methodology". Biosphere assessments have to be undertaken with the aim of demonstrating compliance with the principles and regulations established to limit the possible radiological impact of radioactive waste disposals on human health and on the environment, and to ensure that future generations will not be exposed to higher radiation levels than those that would be acceptable today. The biosphere in the context of high-level waste disposal is defined as the collection of radionuclide transfer pathways that may result in releases into the surface environment, transport within and between the biosphere receptors, exposure of humans and biota, and the doses/risks associated with such exposures. The assessments need to take into account the complexity of the biosphere, the nature of the radionuclides released and the long timescales considered. It is also necessary to make assumptions related to the habits and lifestyle of the exposed population, human activities in the long term and possible modifications of the biosphere. A summary of the Spanish methodological approach for biosphere assessment is presented here, as well as its application to a Spanish generic case study. A reference scenario has been developed based on current conditions at a site located in Central-West Spain, to indicate the potential impact on the actual population. In addition, environmental change has been considered qualitatively through the use of interaction matrices and transition diagrams. Unit source terms of (36)Cl, (79)Se, (99)Tc, (129)I, (135)Cs, (226)Ra, (231)Pa, (238)U, (237)Np and (239)Pu have been assumed. Two exposure groups, of infants and adults, have been chosen for dose calculations. Results are presented and their robustness is evaluated through the use of uncertainty and sensitivity analyses.

  4. Sorbent, Sublimation, and Icing Modeling Methods: Experimental Validation and Application to an Integrated MTSA Subassembly Thermal Model

    NASA Technical Reports Server (NTRS)

    Bower, Chad; Padilla, Sebastian; Iacomini, Christie; Paul, Heather L.

    2010-01-01

    This paper details the validation of modeling methods for the three core components of a Metabolic heat regenerated Temperature Swing Adsorption (MTSA) subassembly, developed for use in a Portable Life Support System (PLSS). The first core component in the subassembly is a sorbent bed, used to capture and reject metabolically produced carbon dioxide (CO2). The sorbent bed performance can be augmented with a temperature swing driven by a liquid CO2 (LCO2) sublimation heat exchanger (SHX) for cooling the sorbent bed, and a condensing, icing heat exchanger (CIHX) for warming the sorbent bed. As part of the overall MTSA effort, scaled design validation test articles for each of these three components have been independently tested in laboratory conditions. Previously described modeling methodologies developed for implementation in Thermal Desktop and SINDA/FLUINT are reviewed and updated, their application in test article models outlined, and the results of those model correlations relayed. Assessment of the applicability of each modeling methodology to the challenge of simulating the response of the test articles and their extensibility to a full-scale integrated subassembly model is given. The independently verified and validated modeling methods are applied to the development of a MTSA subassembly prototype model, and predictions of the subassembly performance are given. These models and modeling methodologies capture simulation of several challenging and novel physical phenomena in the Thermal Desktop and SINDA/FLUINT software suite. Novel methodologies include CO2 adsorption front tracking and associated thermal response in the sorbent bed, heat transfer associated with sublimation of entrained solid CO2 in the SHX, and water mass transfer in the form of ice as low as 210 K in the CIHX.

  5. Ensemble modeling of stochastic unsteady open-channel flow in terms of its time-space evolutionary probability distribution - Part 1: theoretical development

    NASA Astrophysics Data System (ADS)

    Dib, Alain; Kavvas, M. Levent

    2018-03-01

    The Saint-Venant equations are commonly used as the governing equations for modeling spatially varied unsteady flow in open channels. The presence of uncertainties in the channel or flow parameters renders these equations stochastic, thus requiring their solution in a stochastic framework in order to quantify the ensemble behavior and the variability of the process. While the Monte Carlo approach can be used for such a solution, its computational expense and the large number of simulations required are disadvantages. This study proposes, explains, and derives a new methodology for solving the stochastic Saint-Venant equations in a single shot, without the need for a large number of simulations. The proposed methodology is derived by developing the nonlocal Lagrangian-Eulerian Fokker-Planck equation of the characteristic form of the stochastic Saint-Venant equations for an open-channel flow process with an uncertain roughness coefficient. A numerical method for its solution is subsequently devised. The application and validation of this methodology are provided in a companion paper, in which the statistical results computed by the proposed methodology are compared against the results obtained by the Monte Carlo approach.
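
    For orientation, a generic (local) Fokker-Planck equation for the evolving ensemble probability density of a state vector is shown below. The paper's actual equation is a nonlocal Lagrangian-Eulerian form derived from the characteristic form of the stochastic Saint-Venant equations; the drift and diffusion coefficients here are generic placeholders, not the paper's derived kernels.

    ```latex
    % Generic Fokker-Planck equation for the ensemble density p(u, t); in this
    % setting the coefficients would be induced by the uncertain roughness
    % coefficient entering the Saint-Venant dynamics.
    \frac{\partial p(\mathbf{u},t)}{\partial t}
      = -\sum_{i} \frac{\partial}{\partial u_i}
          \left[ D^{(1)}_{i}(\mathbf{u},t)\, p(\mathbf{u},t) \right]
      + \sum_{i,j} \frac{\partial^2}{\partial u_i \, \partial u_j}
          \left[ D^{(2)}_{ij}(\mathbf{u},t)\, p(\mathbf{u},t) \right]
    ```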

  6. Leishmania Surveillance and Diagnostic Capability in Support of the Joint Biological Agent Identification and Diagnostic System (JBAIDS) and Leishmania Vector Surveillance

    DTIC Science & Technology

    2013-02-07

    specific biosurveillance activities as well as clinical applications and alternative versions preformatted and categorized as 'high-tech' and 'low-tech' and...methodologies. Application for patent protection of this DoD intellectual property is underway. Leishmaniasis...LHL assay and the need to develop novel and unique sample preparation methodologies.

  7. [Theoretical and methodological bases for the formation of future doctors' readiness to apply physical rehabilitation technologies].

    PubMed

    Yemets, Anatoliy V; Donchenko, Viktoriya I; Scrinick, Eugenia O

    2018-01-01

    Introduction: The experimental work is aimed at introducing theoretical and methodological foundations into the professional training of the future doctor. The aim: To identify the dynamics of quantitative and qualitative indicators of the readiness of a specialist in medicine. Materials and methods: The article presents the course and results of experimental work on the conditions for forming the readiness of future specialists in medicine. Results: Methodical bases for studying the disciplines of the general-practice and specialized professional stages of the experimental training of future physicians have been worked out. Conclusions: The materials were developed taking into account the peculiarities of future physician training at the various stages of experimental implementation in the educational process of higher medical educational institutions.

  8. The Relevancy of Large-Scale, Quantitative Methodologies in Middle Grades Education Research

    ERIC Educational Resources Information Center

    Mertens, Steven B.

    2006-01-01

    This article examines the relevancy of large-scale, quantitative methodologies in middle grades education research. Based on recommendations from national advocacy organizations, the need for more large-scale, quantitative research, combined with the application of more rigorous methodologies, is presented. Subsequent sections describe and discuss…

  9. 48 CFR 1552.215-72 - Instructions for the Preparation of Proposals.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... used. If escalation is included, state the degree (percent) and methodology. The methodology shall.... If so, state the number required, the professional or technical level and the methodology used to... for which the salary is applicable; (C) List of other research Projects or proposals for which...

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sandor, Debra; Chung, Donald; Keyser, David

    This report documents the CEMAC methodologies for developing and reporting annual global clean energy manufacturing benchmarks. The report reviews previously published manufacturing benchmark reports and foundational data, establishes a framework for benchmarking clean energy technologies, describes the CEMAC benchmark analysis methodologies, and describes the application of the methodologies to the manufacturing of four specific clean energy technologies.

  11. Ancient DNA studies: new perspectives on old samples

    PubMed Central

    2012-01-01

    In spite of past controversies, the field of ancient DNA is now a reliable research area due to recent methodological improvements. A series of recent large-scale studies have revealed the true potential of ancient DNA samples to study the processes of evolution and to test models and assumptions commonly used to reconstruct patterns of evolution and to analyze population genetics and palaeoecological changes. Recent advances in DNA technologies, such as next-generation sequencing, make it possible to recover DNA information from archaeological and paleontological remains, allowing us to go back in time and study the genetic relationships between extinct organisms and their contemporary relatives. With next-generation sequencing methodologies, DNA sequences can be retrieved even from samples (for example, human remains) for which the technical pitfalls of classical methodologies required stringent criteria to guarantee the reliability of the results. In this paper, we review the methodologies applied to ancient DNA analysis and the perspectives that next-generation sequencing applications provide in this field. PMID:22697611

  12. Making sense of grounded theory in medical education.

    PubMed

    Kennedy, Tara J T; Lingard, Lorelei A

    2006-02-01

    Grounded theory is a research methodology designed to develop, through collection and analysis of data that is primarily (but not exclusively) qualitative, a well-integrated set of concepts that provide a theoretical explanation of a social phenomenon. This paper aims to provide an introduction to key features of grounded theory methodology within the context of medical education research. In this paper we include a discussion of the origins of grounded theory, a description of key methodological processes, a comment on pitfalls encountered commonly in the application of grounded theory research, and a summary of the strengths of grounded theory methodology with illustrations from the medical education domain. The significant strengths of grounded theory that have resulted in its enduring prominence in qualitative research include its clearly articulated analytical process and its emphasis on the generation of pragmatic theory that is grounded in the data of experience. When applied properly and thoughtfully, grounded theory can address research questions of significant relevance to the domain of medical education.

  13. Partitioning an object-oriented terminology schema.

    PubMed

    Gu, H; Perl, Y; Halper, M; Geller, J; Kuo, F; Cimino, J J

    2001-07-01

    Controlled medical terminologies are increasingly becoming strategic components of various healthcare enterprises. However, the typical medical terminology can be difficult to exploit due to its extensive size and high density. The schema of a medical terminology offered by an object-oriented representation is a valuable tool in providing an abstract view of the terminology, enhancing comprehensibility and making it more usable. However, schemas themselves can be large and unwieldy. We present a methodology for partitioning a medical terminology schema into manageably sized fragments that promote increased comprehension. Our methodology has a refinement process for the subclass hierarchy of the terminology schema. The methodology is carried out by a medical domain expert in conjunction with a computer. The expert is guided by a set of three modeling rules, which guarantee that the resulting partitioned schema consists of a forest of trees. This makes it easier to understand and consequently use the medical terminology. The application of our methodology to the schema of the Medical Entities Dictionary (MED) is presented.
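
    The invariant the modeling rules guarantee, that the partitioned schema is a forest of trees, is easy to state in code: every class keeps at most one parent and no ancestor walk revisits a node. A minimal sketch of such a check (the three modeling rules themselves are the domain expert's job and are not reproduced here); the class names are hypothetical:

    ```python
    def is_forest(parent):
        """parent maps each class to its single superclass; the dict shape
        already enforces 'at most one parent', so only acyclicity is checked."""
        for node in parent:
            trail = set()
            while node in parent:          # walk up toward a root
                if node in trail:          # revisiting a node on this walk = cycle
                    return False
                trail.add(node)
                node = parent[node]
        return True

    schema = {"AcuteDisease": "Disease", "ChronicDisease": "Disease",
              "Disease": "MedicalEntity", "Drug": "MedicalEntity"}
    print(is_forest(schema))               # True: one tree rooted at MedicalEntity
    ```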

  14. Investigating the application of AOP methodology in development of Financial Accounting Software using Eclipse-AJDT Environment

    NASA Astrophysics Data System (ADS)

    Sharma, Amita; Sarangdevot, S. S.

    2010-11-01

    Aspect-Oriented Programming (AOP) methodology has been investigated in the development of real-world business application software: Financial Accounting Software. The Eclipse-AJDT environment has been used as open-source enhanced IDE support for programming in the AOP language AspectJ. Crosscutting concerns have been identified and modularized as aspects. This reduces the complexity of the design considerably, due to the elimination of code scattering and tangling. Improvement in modularity, quality and performance is achieved. The study concludes that AOP methodology in the Eclipse-AJDT environment offers powerful support for the modular design and implementation of real-world quality business software.
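
    The paper's aspects are written in AspectJ; to keep a single illustration language in this document, the sketch below shows the same modularization idea, weaving a crosscutting timing/logging concern around business methods, using a Python decorator. The class and method names are hypothetical, and this is an analogue of AOP advice rather than the paper's AspectJ code.

    ```python
    import functools
    import time

    def audit(func):
        """Crosscutting 'aspect': timing/logging advice woven around any
        business method, so the concern is not scattered through the code."""
        @functools.wraps(func)
        def advice(*args, **kwargs):
            t0 = time.perf_counter()
            result = func(*args, **kwargs)
            print(f"{func.__name__} took {time.perf_counter() - t0:.6f}s")
            return result
        return advice

    class Ledger:
        def __init__(self):
            self.balance = 0.0

        @audit                      # join point: every call to post() is advised
        def post(self, amount):
            self.balance += amount
            return self.balance

    Ledger().post(150.0)
    ```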

  15. Predicting Great Lakes fish yields: tools and constraints

    USGS Publications Warehouse

    Lewis, C.A.; Schupp, D.H.; Taylor, W.W.; Collins, J.J.; Hatch, Richard W.

    1987-01-01

    Prediction of yield is a critical component of fisheries management. The development of sound yield prediction methodology and the application of the results of yield prediction are central to the evolution of strategies to achieve stated goals for Great Lakes fisheries and to the measurement of progress toward those goals. Despite general availability of species yield models, yield prediction for many Great Lakes fisheries has been poor due to the instability of the fish communities and the inadequacy of available data. A host of biological, institutional, and societal factors constrain both the development of sound predictions and their application to management. Improved predictive capability requires increased stability of Great Lakes fisheries through rehabilitation of well-integrated communities, improvement of data collection, data standardization and information-sharing mechanisms, and further development of the methodology for yield prediction. Most important is the creation of a better-informed public that will in turn establish the political will to do what is required.

  16. Application of enhanced modern structured analysis techniques to Space Station Freedom electric power system requirements

    NASA Technical Reports Server (NTRS)

    Biernacki, John; Juhasz, John; Sadler, Gerald

    1991-01-01

    A team of Space Station Freedom (SSF) system engineers is conducting an extensive analysis of the SSF requirements, particularly those pertaining to the electrical power system (EPS). The objective of this analysis is the development of a comprehensive, computer-based requirements model, using an enhanced modern structured analysis methodology (EMSA). Such a model provides a detailed and consistent representation of the system's requirements. The process outlined in the EMSA methodology is unique in that it allows the graphical modeling of real-time system state transitions, as well as functional requirements and data relationships, to be implemented using modern computer-based tools. These tools permit flexible updating and continuous maintenance of the models. Initial findings resulting from the application of EMSA to the EPS have benefited the space station program by linking requirements to design, providing traceability of requirements, identifying discrepancies, and fostering an understanding of the EPS.

  17. Advancement in Electrospun Nanofibrous Membranes Modification and Their Application in Water Treatment

    PubMed Central

    Nasreen, Shaik Anwar Ahamed Nabeela; Sundarrajan, Subramanian; Nizar, Syed Abdulrahim Syed; Balamurugan, Ramalingam; Ramakrishna, Seeram

    2013-01-01

    Water, among the most valuable natural resources available on earth, is under serious threat as a result of undesirable human activities: for example, marine dumping, atmospheric deposition, and domestic, industrial and agricultural practices. Optimizing current methodologies and developing new and effective techniques to remove contaminants from water is the current focus of interest, in order to renew the available water resources. Materials such as nanoparticles, polymers, simple organic compounds and inorganic clay materials, in the form of thin films, membranes or powders, have been employed for water treatment. Among these materials, membrane technology plays a vital role in the removal of contaminants due to its easy handling and high efficiency. Though many materials are under investigation, nanofiber-based membranes are particularly valuable and reliable. Synthetic methodologies applied to membrane modification and their applications in water treatment are reviewed in this article. PMID:24957057

  18. Examination of Short- and Long-Range Atomic Order Nanocrystalline SiC and Diamond by Powder Diffraction Methods

    NASA Technical Reports Server (NTRS)

    Palosz, B.; Grzanka, E.; Stelmakh, S.; Gierlotka, S.; Weber, H.-P.; Proffen, T.; Palosz, W.

    2002-01-01

    The real atomic structure of nanocrystals determines unique, key properties of the materials. Determination of the structure presents a challenge due to inherent limitations of standard powder diffraction techniques when applied to nanocrystals. An alternative methodology for the structural analysis of nanocrystals (several nanometers in size), based on Bragg-like scattering and called the "apparent lattice parameter" (alp) method, is proposed. Application of the alp methodology to the examination of the core-shell model of nanocrystals will be presented. The results of application of the alp method to the structural analysis of several nanopowders were complemented by those obtained by determination of the atomic Pair Distribution Function (PDF). Based on synchrotron and neutron diffraction data measured over a large diffraction-vector range of up to Q = 25 Å⁻¹, the surface stresses in nanocrystalline diamond and SiC were evaluated.
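
    As a rough illustration of the alp idea (assuming a cubic lattice, with an invented wavelength and invented peak positions, not the paper's data), each Bragg reflection can be converted into its own lattice constant via Bragg's law; the spread of these per-reflection values is what probes the core-shell structure.

      import numpy as np

      # Apparent lattice parameter for a cubic lattice: each (h, k, l) peak
      # observed at angle two_theta yields its own lattice constant through
      # Bragg's law, lambda = 2 d sin(theta), with d = a / sqrt(h^2+k^2+l^2).
      wavelength = 0.7                              # Angstroms (hypothetical)
      peaks = {(1, 1, 1): 17.0, (2, 2, 0): 28.2}    # (hkl): two-theta, degrees

      for (h, k, l), two_theta in peaks.items():
          theta = np.radians(two_theta / 2.0)
          d = wavelength / (2.0 * np.sin(theta))
          print((h, k, l), round(d * np.sqrt(h**2 + k**2 + l**2), 4))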

  19. [Research on the Application of Fuzzy Logic to Systems Analysis and Control

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Research conducted with the support of NASA Grant NCC2-275 has been focused mainly on the development of fuzzy logic and soft computing methodologies and their applications to systems analysis and control, with emphasis on problem areas which are of relevance to NASA's missions. One of the principal results of our research has been the development of a new methodology called Computing with Words (CW). Basically, in CW words drawn from a natural language are employed in place of numbers for computing and reasoning. There are two major imperatives for computing with words. First, computing with words is a necessity when the available information is too imprecise to justify the use of numbers; and second, it is useful when there is a tolerance for imprecision which can be exploited to achieve tractability, robustness, low solution cost, and better rapport with reality. Exploitation of the tolerance for imprecision is an issue of central importance in CW.
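
    A minimal sketch of the fuzzy-set machinery underlying CW (the word "warm" and all numbers are invented for illustration): a word acts as a fuzzy constraint on a numeric variable, represented by a membership function.

      import numpy as np

      # A word such as "warm" constrains a numeric variable fuzzily; a
      # triangular membership function maps each temperature to a degree
      # of membership in [0, 1].
      def triangular(x, a, b, c):
          return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

      temps = np.array([15.0, 22.0, 30.0])
      print(triangular(temps, 18.0, 24.0, 30.0))   # degree each temp is "warm"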

  20. Selection of phage-displayed accessible recombinant targeted antibodies (SPARTA): methodology and applications.

    PubMed

    D'Angelo, Sara; Staquicini, Fernanda I; Ferrara, Fortunato; Staquicini, Daniela I; Sharma, Geetanjali; Tarleton, Christy A; Nguyen, Huynh; Naranjo, Leslie A; Sidman, Richard L; Arap, Wadih; Bradbury, Andrew Rm; Pasqualini, Renata

    2018-05-03

    We developed a potentially novel and robust antibody discovery methodology, termed selection of phage-displayed accessible recombinant targeted antibodies (SPARTA). This combines an in vitro screening step of a naive human antibody library against known tumor targets with in vivo selections based on the tumor-homing capabilities of a preenriched antibody pool. This unique approach overcomes several rate-limiting challenges to generate human antibodies amenable to rapid translation into medical applications. As a proof of concept, we evaluated SPARTA on 2 well-established tumor cell surface targets, EphA5 and GRP78. Antibodies that showed tumor-targeting selectivity were evaluated as a representative panel of antibody-drug conjugates (ADCs) and were highly efficacious. Our results validate a discovery platform to identify and validate monoclonal antibodies with favorable tumor-targeting attributes. This approach may also extend to other diseases with known cell surface targets and affected tissues easily isolated for in vivo selection.

  1. Evaluating the trade-off between mechanical and electrochemical performance of separators for lithium-ion batteries: Methodology and application

    NASA Astrophysics Data System (ADS)

    Plaimer, Martin; Breitfuß, Christoph; Sinz, Wolfgang; Heindl, Simon F.; Ellersdorfer, Christian; Steffan, Hermann; Wilkening, Martin; Hennige, Volker; Tatschl, Reinhard; Geier, Alexander; Schramm, Christian; Freunberger, Stefan A.

    2016-02-01

    Lithium-ion batteries are in widespread use in electric and hybrid vehicles. Besides features like energy density, cost, lifetime, and recyclability, the safety of a battery system is of prime importance. The separator material impacts all these properties and therefore requires an informed selection. The interplay between the mechanical and electrochemical properties as key selection criteria is investigated. Mechanical properties were investigated using tensile and puncture penetration tests at abuse-relevant conditions. To investigate the electrochemical performance in terms of effective conductivity, a method based on impedance spectroscopy was introduced. This methodology is applied to evaluate ten commercial separators, which allows for a trade-off analysis of mechanical versus electrochemical performance. Based on the results, and in combination with other factors, this offers an effective approach to select suitable separators for automotive applications.
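
    A common way to turn such an impedance measurement into an effective conductivity is sketched below, assuming the bulk resistance of the electrolyte-soaked separator is read from the high-frequency intercept of the Nyquist plot; all numbers are illustrative, not the paper's, and the authors' exact procedure may differ.

      # Effective ionic conductivity of a soaked separator from its bulk
      # resistance R_b measured by impedance spectroscopy.
      thickness_cm = 25e-4        # separator thickness: 25 micrometers
      area_cm2 = 2.0              # electrode area
      r_bulk_ohm = 1.8            # high-frequency intercept, Nyquist plot

      sigma_eff = thickness_cm / (r_bulk_ohm * area_cm2)    # S/cm
      sigma_electrolyte = 1.0e-2                            # neat electrolyte
      macmullin = sigma_electrolyte / sigma_eff             # dimensionless
      print(sigma_eff, macmullin)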

  2. Using ontological inference and hierarchical matchmaking to overcome semantic heterogeneity in remote sensing-based biodiversity monitoring

    NASA Astrophysics Data System (ADS)

    Nieland, Simon; Kleinschmit, Birgit; Förster, Michael

    2015-05-01

    Ontology-based applications hold promise in improving spatial data interoperability. In this work we use remote sensing-based biodiversity information and apply semantic formalisation and ontological inference to show improvements in data interoperability/comparability. The proposed methodology includes an observation-based, "bottom-up" engineering approach for remote sensing applications and gives a practical example of semantic mediation of geospatial products. We apply the methodology to three different nomenclatures used for remote sensing-based classification of two heathland nature conservation areas in Belgium and Germany. We analysed sensor nomenclatures with respect to their semantic formalisation and their bio-geographical differences. The results indicate that a hierarchical and transparent nomenclature is far more important for transferability than the sensor or study area. The inclusion of additional information, not necessarily belonging to a vegetation class description, is a key factor for the future success of using semantics for interoperability in remote sensing.
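
    As a toy illustration of hierarchical matchmaking (class names and hierarchy are invented, not the actual nomenclatures), two class labels can be compared through a shared super-class hierarchy:

      # Hierarchical matchmaking over a child -> parent class hierarchy.
      hierarchy = {
          "dry_heath": "heathland", "wet_heath": "heathland",
          "heathland": "shrub_vegetation", "shrub_vegetation": "vegetation",
      }

      def ancestors(cls):
          chain = [cls]
          while cls in hierarchy:
              cls = hierarchy[cls]
              chain.append(cls)
          return chain

      def match(cls_a, cls_b):
          # Exact match > subsumption (one is an ancestor of the other) > shared ancestor.
          if cls_a == cls_b:
              return "exact"
          if cls_b in ancestors(cls_a) or cls_a in ancestors(cls_b):
              return "subsumes"
          common = set(ancestors(cls_a)) & set(ancestors(cls_b))
          return "sibling" if common else "disjoint"

      print(match("dry_heath", "heathland"))   # subsumes
      print(match("dry_heath", "wet_heath"))   # sibling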

  3. Application of electrical geophysics to the release of water resources, case of Ain Leuh (Morocco)

    NASA Astrophysics Data System (ADS)

    Zitouni, A.; Boukdir, A.; El Fjiji, H.; Baite, W.; Ekouele Mbaki, V. R.; Ben Said, H.; Echakraoui, Z.; Elissami, A.; El Maslouhi, M. R.

    2018-05-01

    Given the growing water needs of our country for domestic, industrial, and agricultural purposes, prospecting for groundwater by classical geological and hydrogeological methods remains inapplicable in regions where reconnaissance boreholes or soundings are not available in sufficient number. In such cases, geophysical prospecting methods such as nuclear magnetic resonance (NMR) and ground-penetrating radar are the most commonly used, because they have produced, worldwide, very decisive results in projects for the prospecting and evaluation of groundwater resources. The present work, which concerns only the electrical resistivity methodology, discusses the adopted methodological approach and a case study of its application on the Ajdir Ain Leuh plateau.

  4. Common Criteria Related Security Design Patterns for Intelligent Sensors—Knowledge Engineering-Based Implementation

    PubMed Central

    Bialas, Andrzej

    2011-01-01

    Intelligent sensors experience security problems very similar to those inherent to other kinds of IT products or systems. Assurance methodologies for the creation of these products or systems, like Common Criteria (ISO/IEC 15408), can be used to improve the robustness of sensor systems in high-risk environments. The paper presents the background and results of previous research on pattern-based security specifications and introduces a new ontological approach. The elaborated ontology and knowledge base were validated on the IT security development process using the sensor example. The contribution of the paper concerns the application of the knowledge engineering methodology to the previously developed Common Criteria compliant and pattern-based method for intelligent sensor security development. The approach presented in the paper has broader significance, in that it can address information security problems in many application domains. PMID:22164064

  5. Common criteria related security design patterns for intelligent sensors--knowledge engineering-based implementation.

    PubMed

    Bialas, Andrzej

    2011-01-01

    Intelligent sensors experience security problems very similar to those inherent to other kinds of IT products or systems. Assurance methodologies for the creation of these products or systems, like Common Criteria (ISO/IEC 15408), can be used to improve the robustness of sensor systems in high-risk environments. The paper presents the background and results of previous research on pattern-based security specifications and introduces a new ontological approach. The elaborated ontology and knowledge base were validated on the IT security development process using the sensor example. The contribution of the paper concerns the application of the knowledge engineering methodology to the previously developed Common Criteria compliant and pattern-based method for intelligent sensor security development. The approach presented in the paper has broader significance, in that it can address information security problems in many application domains.

  6. A framework for an alternatives assessment dashboard for evaluating chemical alternatives applied to flame retardants for electronic applications.

    PubMed

    Martin, Todd M

    2017-05-01

    The goal of alternatives assessment (AA) is to facilitate a comparison of alternatives to a chemical of concern, resulting in the identification of safer alternatives. A two-stage methodology for comparing chemical alternatives was developed. In the first stage, alternatives are compared using a variety of human health effects, ecotoxicity, and physicochemical properties. Hazard profiles are completed using a variety of online sources and quantitative structure-activity relationship models. In the second stage, alternatives are evaluated using an exposure/risk assessment over the entire life cycle. Exposure values are calculated using screening-level near-field and far-field exposure models. The second stage allows one to more accurately compare potential exposure to each alternative and consider additional factors that may not be obvious from separate binned persistence, bioaccumulation, and toxicity scores. The methodology was utilized to compare phosphate-based alternatives for decabromodiphenyl ether (decaBDE) in electronics applications.

  7. RLV Turbine Performance Optimization

    NASA Technical Reports Server (NTRS)

    Griffin, Lisa W.; Dorney, Daniel J.

    2001-01-01

    A task was developed at NASA/Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. There are four major objectives of this task: 1) to develop, enhance, and integrate advanced turbine aerodynamic design and analysis tools; 2) to develop the methodology for application of the analytical techniques; 3) to demonstrate the benefits of the advanced turbine design procedure through its application to a relevant turbine design point; and 4) to verify the optimized design and analysis with testing. Final results of the preliminary design and the results of the two-dimensional (2D) detailed design of the first-stage vane of a supersonic turbine suitable for a reusable launch vehicle (RLV) are presented. Analytical techniques for obtaining the results are also discussed.

  8. Evaluation of stormwater harvesting sites using multi criteria decision methodology

    NASA Astrophysics Data System (ADS)

    Inamdar, P. M.; Sharma, A. K.; Cook, Stephen; Perera, B. J. C.

    2018-07-01

    Selection of suitable urban stormwater harvesting sites and associated project planning are often complex due to spatial, temporal, economic, environmental and social factors, and various other related variables. This paper is aimed at developing a comprehensive methodological framework for evaluating stormwater harvesting (SWH) sites in urban areas using Multi Criteria Decision Analysis (MCDA). In the first phase, the framework selects potential SWH sites using spatial characteristics in a GIS environment. In the second phase, an MCDA methodology is used for evaluating and ranking SWH sites in a multi-objective and multi-stakeholder environment. The paper briefly describes the first phase of the framework and focuses chiefly on the second. The application of the methodology is demonstrated through a case study comprising the local government area of the City of Melbourne (CoM), Australia, for the benefit of the wider community of water professionals engaged in this area. Nine performance measures (PMs) were identified to characterise the objectives and system performance related to the eight alternative SWH sites used to demonstrate the application of the developed methodology. To reflect stakeholder interests in the current study, four stakeholder participant groups were identified, namely, water authorities (WA), academics (AC), consultants (CS), and councils (CL). The decision analysis methodology broadly consisted of deriving PROMETHEE II rankings of the eight alternative SWH sites in the CoM case study under two distinct group decision-making scenarios. The major innovation of this work is the development and application of a comprehensive methodological framework that assists in the selection of potential SWH sites and facilitates their ranking in a multi-objective and multi-stakeholder environment. It is expected that the proposed methodology will provide water professionals and managers with better knowledge, reducing the subjectivity in the selection and evaluation of SWH sites.
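
    For readers unfamiliar with PROMETHEE II, the sketch below implements its net-flow ranking with the simple "usual" preference function; the scores and weights are invented, not the CoM case-study values.

      import numpy as np

      # PROMETHEE II: alternatives (rows) scored on criteria (columns), all
      # to be maximized; weights sum to 1.
      scores = np.array([[7.0, 3.0, 5.0],
                         [6.0, 4.0, 6.0],
                         [8.0, 2.0, 4.0]])
      weights = np.array([0.5, 0.3, 0.2])

      n = scores.shape[0]
      pi = np.zeros((n, n))                       # aggregated preference indices
      for a in range(n):
          for b in range(n):
              if a != b:
                  pref = (scores[a] > scores[b]).astype(float)   # "usual" function
                  pi[a, b] = np.dot(weights, pref)

      phi_plus = pi.sum(axis=1) / (n - 1)         # positive outranking flow
      phi_minus = pi.sum(axis=0) / (n - 1)        # negative outranking flow
      net_flow = phi_plus - phi_minus
      print(np.argsort(-net_flow))                # PROMETHEE II complete ranking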

  9. Knowledge-based system verification and validation

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.

    1990-01-01

    The objective of this task is to develop and evaluate a methodology for verification and validation (V&V) of knowledge-based systems (KBS) for space station applications with high reliability requirements. The approach consists of three interrelated tasks. The first task is to evaluate the effectiveness of various validation methods for space station applications. The second task is to recommend requirements for KBS V&V for Space Station Freedom (SSF). The third task is to recommend modifications to the SSF to support the development of KBS using effective software engineering and validation techniques. To accomplish the first task, three complementary techniques will be evaluated: (1) Sensitivity Analysis (Worcester Polytechnic Institute); (2) Formal Verification of Safety Properties (SRI International); and (3) Consistency and Completeness Checking (Lockheed AI Center). During FY89 and FY90, each contractor will independently demonstrate the use of his technique on the fault detection, isolation, and reconfiguration (FDIR) KBS for the manned maneuvering unit (MMU), a rule-based system implemented in LISP. During FY91, the application of each of the techniques to other knowledge representations and KBS architectures will be addressed. After evaluation of the results of the first task and examination of Space Station Freedom V&V requirements for conventional software, a comprehensive KBS V&V methodology will be developed and documented. Development of highly reliable KBSs cannot be accomplished without effective software engineering methods. Using the results of current in-house research to develop and assess software engineering methods for KBSs, as well as assessment of techniques being developed elsewhere, an effective software engineering methodology for space station KBSs will be developed, and modification of the SSF to support these tools and methods will be addressed.

  10. Validation of a physical anthropology methodology using mandibles for gender estimation in a Brazilian population.

    PubMed

    Carvalho, Suzana Papile Maciel; Brito, Liz Magalhães; Paiva, Luiz Airton Saavedra de; Bicudo, Lucilene Arilho Ribeiro; Crosato, Edgard Michel; Oliveira, Rogério Nogueira de

    2013-01-01

    Validation studies of physical anthropology methods in different population groups are extremely important, especially in cases in which population variations may cause problems in the identification of a native individual by the application of norms developed for different communities. This study aimed to estimate the gender of skeletons by application of the method of Oliveira, et al. (1995), previously used in a population sample from Northeast Brazil. The accuracy of this method was assessed for a population from Southeast Brazil and validated by statistical tests. The method used two mandibular measurements, namely the bigonial distance and the mandibular ramus height. The sample was composed of 66 skulls and the method was applied by two examiners. The results were statistically analyzed by the paired t test, logistic discriminant analysis and logistic regression. The results demonstrated that the application of the method of Oliveira, et al. (1995) in this population achieved very different outcomes between genders, with 100% for females and only 11% for males, which may be explained by ethnic differences. However, statistical adjustment of measurement data for the population analyzed allowed an accuracy of 76.47% for males and 78.13% for females, with the creation of a new discriminant formula. It was concluded that methods involving physical anthropology present a high rate of accuracy for human identification, easy application, low cost and simplicity; however, the methodologies must be validated for the different populations due to differences in ethnic patterns, which are directly related to the phenotypic aspects. In this specific case, the method of Oliveira, et al. (1995) presented good accuracy and may be used for gender estimation in Brazil in two geographic regions, namely Northeast and Southeast; however, for other regions of the country (North, Central West and South), previous methodological adjustment is recommended as demonstrated in this study.
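
    A hypothetical illustration of such a two-measurement logistic discriminant follows; the data are synthetic and the coefficients learned here are not the formula derived in the paper.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # Two-measurement discriminant: bigonial distance and mandibular ramus
      # height (mm). All sample values are invented for illustration.
      rng = np.random.default_rng(0)
      males = rng.normal([100.0, 65.0], [5.0, 4.0], size=(30, 2))
      females = rng.normal([92.0, 58.0], [5.0, 4.0], size=(30, 2))
      X = np.vstack([males, females])
      y = np.array([1] * 30 + [0] * 30)            # 1 = male, 0 = female

      model = LogisticRegression().fit(X, y)
      print(model.coef_, model.intercept_)         # discriminant coefficients
      print(model.predict([[97.0, 62.0]]))         # classify a new mandible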

  11. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating the failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
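
    The core probabilistic step can be pictured as a stress-strength Monte Carlo calculation; the sketch below uses invented distributions and omits the PFA machinery for updating with test and flight experience.

      import numpy as np

      # Propagate parameter uncertainty through an analytical failure model
      # by Monte Carlo to estimate a failure probability.
      rng = np.random.default_rng(1)
      n = 200_000
      stress = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=n)    # MPa
      strength = rng.lognormal(mean=np.log(420.0), sigma=0.08, size=n)  # MPa

      p_fail = np.mean(stress >= strength)
      print(f"estimated failure probability: {p_fail:.2e}")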

  12. 78 FR 68449 - Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-14

    ... RFA-HS14-002, Addressing Methodological Challenges in Research for Patients With Multiple Chronic... applications for the ``AHRQ RFA-HS14-002, Addressing Methodological Challenges in Research for Patients With...

  13. Technology transfer methodology

    NASA Technical Reports Server (NTRS)

    Labotz, Rich

    1991-01-01

    Information on technology transfer methodology is given in viewgraph form. Topics covered include problems in economics, technology drivers, inhibitors to using improved technology in development, technology application opportunities, and co-sponsorship of technology.

  14. Enhanced methods for determining operational capabilities and support costs of proposed space systems

    NASA Technical Reports Server (NTRS)

    Ebeling, Charles

    1993-01-01

    This report documents the work accomplished during the first two years of research to provide support to NASA in predicting operational and support parameters and costs of proposed space systems. The first year's research developed a methodology for deriving reliability and maintainability (R & M) parameters based upon the use of regression analysis to establish empirical relationships between performance and design specifications and corresponding mean times of failure and repair. The second year focused on enhancements to the methodology, increased scope of the model, and software improvements. This follow-on effort expands the prediction of R & M parameters and their effect on the operations and support of space transportation vehicles to include other system components such as booster rockets and external fuel tanks. It also increases the scope of the methodology and the capabilities of the model as implemented by the software. The focus is on the failure and repair of major subsystems and their impact on vehicle reliability, turn times, maintenance manpower, and repairable spares requirements. The report documents the data utilized in this study, outlines the general methodology for estimating and relating R&M parameters, presents the analyses and results of application to the initial data base, and describes the implementation of the methodology through the use of a computer model. The report concludes with a discussion on validation and a summary of the research findings and results.
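
    The regression step can be sketched as an ordinary least-squares fit of log mean-time-between-failure against design variables; the data below are synthetic placeholders, not the study's database.

      import numpy as np

      # Regress log(MTBF) on design/performance variables with OLS.
      rng = np.random.default_rng(2)
      weight = rng.uniform(100.0, 1000.0, 40)       # subsystem weight
      power = rng.uniform(1.0, 20.0, 40)            # power draw
      log_mtbf = 8.0 - 0.002 * weight - 0.05 * power + rng.normal(0, 0.2, 40)

      X = np.column_stack([np.ones_like(weight), weight, power])
      coef, *_ = np.linalg.lstsq(X, log_mtbf, rcond=None)
      print(coef)                                   # intercept and sensitivities
      print(np.exp(X @ coef)[:3])                   # predicted MTBF, hours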

  15. An applicational process for dynamic balancing of turbomachinery shafting

    NASA Technical Reports Server (NTRS)

    Verhoff, Vincent G.

    1990-01-01

    The NASA Lewis Research Center has developed and implemented a time-efficient methodology for dynamically balancing turbomachinery shafting. This methodology minimizes costly facility downtime by using a balancing arbor (mandrel) that simulates the turbomachinery (rig) shafting. The need for precision dynamic balancing of turbomachinery shafting and for a dynamic balancing methodology is discussed in detail. Additionally, the inherent problems (and their causes and effects) associated with unbalanced turbomachinery shafting as a function of increasing shaft rotational speeds are discussed. Included are the design criteria concerning rotor weight differentials for rotors made of different materials that have similar parameters and shafting. The balancing methodology for applications where rotor replaceability is a requirement is also covered. This report is intended for use as a reference when designing, fabricating, and troubleshooting turbomachinery shafting.

  16. A methodological approach for designing a usable ontology-based GUI in healthcare.

    PubMed

    Lasierra, N; Kushniruk, A; Alesanco, A; Borycki, E; García, J

    2013-01-01

    This paper presents a methodological approach to the design and evaluation of an interface for an ontology-based system used for designing care plans for monitoring patients at home. In order to define the care plans, physicians need a tool for creating instances of the ontology and configuring some rules. Our purpose is to develop an interface to allow clinicians to interact with the ontology. Although ontology-driven applications do not necessarily present the ontology in the user interface, it is our hypothesis that showing selected parts of the ontology in a "usable" way could enhance clinicians' understanding and make the definition of care plans easier. Based on prototyping and iterative testing, this methodology combines visualization techniques and usability methods. Preliminary results obtained after a formative evaluation indicate the effectiveness of the suggested combination.

  17. Methodology for building confidence measures

    NASA Astrophysics Data System (ADS)

    Bramson, Aaron L.

    2004-04-01

    This paper presents a generalized methodology for propagating known or estimated levels of individual source document truth reliability to determine the confidence level of a combined output. Initial document certainty levels are augmented by (i) combining the reliability measures of multiple sources, (ii) incorporating the truth reinforcement of related elements, and (iii) incorporating the importance of the individual elements for determining the probability of truth for the whole. The result is a measure of confidence in system output based on establishing links among the truth values of inputs. This methodology was developed for application to a multi-component situation awareness tool under development at the Air Force Research Laboratory in Rome, New York. Determining how improvements in data quality and the variety of documents collected affect the probability of a correct situational detection helps optimize the performance of the tool overall.
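
    Step (i) can be illustrated with the standard independent-sources combination rule; this is a sketch under an independence assumption, and the paper's exact formulas may differ.

      # If sources err independently, the chance that all of them are wrong
      # is the product of their individual error probabilities.
      def combine(reliabilities):
          p_all_wrong = 1.0
          for r in reliabilities:
              p_all_wrong *= (1.0 - r)
          return 1.0 - p_all_wrong

      print(combine([0.7, 0.6]))   # 0.88: modest sources reinforce each other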

  18. Distributed computing methodology for training neural networks in an image-guided diagnostic application.

    PubMed

    Plagianakos, V P; Magoulas, G D; Vrahatis, M N

    2006-03-01

    Distributed computing is a process through which a set of computers connected by a network is used collectively to solve a single problem. In this paper, we propose a distributed computing methodology for training neural networks for the detection of lesions in colonoscopy. Our approach is based on partitioning the training set across multiple processors using a parallel virtual machine. In this way, interconnected computers of varied architectures can be used for the distributed evaluation of the error function and gradient values, and, thus, training neural networks utilizing various learning methods. The proposed methodology has large granularity and low synchronization, and has been implemented and tested. Our results indicate that the parallel virtual machine implementation of the training algorithms developed leads to considerable speedup, especially when large network architectures and training sets are used.
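
    A modern data-parallel sketch of the same idea follows; the paper used a parallel virtual machine, whereas here Python multiprocessing stands in, with a toy linear model and squared error rather than the colonoscopy networks.

      import numpy as np
      from multiprocessing import Pool

      # Partition the training set, evaluate partial error gradients in
      # parallel, then sum them for one synchronized weight update.
      def partial_gradient(args):
          w, X, y = args
          err = X @ w - y                        # linear unit, squared error
          return X.T @ err, 0.5 * float(err @ err)

      if __name__ == "__main__":
          rng = np.random.default_rng(3)
          X, y = rng.normal(size=(1000, 5)), rng.normal(size=1000)
          w = np.zeros(5)
          chunks = [(w, X[i::4], y[i::4]) for i in range(4)]   # 4 "processors"
          with Pool(4) as pool:
              parts = pool.map(partial_gradient, chunks)
          grad = sum(g for g, _ in parts)        # global gradient
          total_error = sum(e for _, e in parts)
          w -= 0.001 * grad                      # gradient-descent step
          print(total_error, w)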

  19. Modal interactions due to friction in the nonlinear vibration response of the "Harmony" test structure: Experiments and simulations

    NASA Astrophysics Data System (ADS)

    Claeys, M.; Sinou, J.-J.; Lambelin, J.-P.; Todeschini, R.

    2016-08-01

    The nonlinear vibration response of an assembly with friction joints - named "Harmony" - is studied both experimentally and numerically. The experimental results exhibit a softening effect and an increase of dissipation with excitation level. Modal interactions due to friction are also evidenced. The numerical methodology proposed groups together well-known structural dynamic methods, including finite elements, substructuring, Harmonic Balance and continuation methods. On the one hand, the application of this methodology proves its capacity to treat a complex system where several friction movements occur at the same time. On the other hand, the main contribution of this paper is the experimental and numerical study of evidence of modal interactions due to friction. The simulation methodology succeeds in reproducing complex form of dynamic behavior such as these modal interactions.

  20. Reliability based design optimization: Formulations and methodologies

    NASA Astrophysics Data System (ADS)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.

  1. Measuring solids concentration in stormwater runoff: comparison of analytical methods.

    PubMed

    Clark, Shirley E; Siu, Christina Y S

    2008-01-15

    Stormwater suspended solids typically are quantified using one of two methods: aliquot/subsample analysis (total suspended solids [TSS]) or whole-sample analysis (suspended solids concentration [SSC]). Interproject comparisons are difficult because of inconsistencies in the methods and in their application. To address this concern, the suspended solids content has been measured using both methodologies in many current projects, but the question remains about how to compare these values with historical water-quality data where the analytical methodology is unknown. This research was undertaken to determine the effect of analytical methodology on the relationship between these two methods of determination of the suspended solids concentration, including the effect of aliquot selection/collection method and of particle size distribution (PSD). The results showed that SSC was best able to represent the known sample concentration and that the results were independent of the sample's PSD. Correlations between the results and the known sample concentration could be established for TSS samples, but they were highly dependent on the sample's PSD and on the aliquot collection technique. These results emphasize the need to report not only the analytical method but also the particle size information on the solids in stormwater runoff.

  2. P-Soccer: Soccer Games Application using Kinect

    NASA Astrophysics Data System (ADS)

    Nasir, Mohamad Fahim Mohamed; Suparjoh, Suriawati; Razali, Nazim; Mustapha, Aida

    2018-05-01

    This paper presents a soccer game application called P-Soccer that uses Kinect as the interaction medium between users and the game characters. P-Soccer focuses on training penalty kicks, with one character taking the kick. The game was developed based on the Game Development Life Cycle (GDLC) methodology. Results of alpha and beta testing showed that the target users are satisfied with the overall game design and theme, as well as the interactivity with the main character in the game.

  3. NASA biomedical applications team. Applications of aerospace technology in biology and medicine

    NASA Technical Reports Server (NTRS)

    Rouse, D. J.; Beadles, R.; Beall, H. C.; Brown, J. N., Jr.; Clingman, W. H.; Courtney, M. W.; Mccartney, M.; Scearce, R. W.; Wilson, B.

    1979-01-01

    The use of a bipolar donor-recipient model of medical technology transfer is presented. The methodology is designed to: (1) identify medical problems and aerospace technology that in combination constitute opportunities for successful medical products; (2) obtain the early participation of industry in the transfer process; and (3) obtain acceptance by the medical community of new medical products based on aerospace technology. Problem descriptions, activity reports, and the results of a market study on the tissue freezing device are presented.

  4. One-Pot Isomerization–Cross Metathesis–Reduction (ICMR) Synthesis of Lipophilic Tetrapeptides

    PubMed Central

    2015-01-01

    An efficient, versatile and rapid method toward homologue series of lipophilic tetrapeptide derivatives (herein, the opioid peptides H-TIPP-OH and H-DIPP-OH) is reported. High atom economy and a minimal number of synthetic steps resulted from a one-pot tandem isomerization-cross metathesis-reduction (ICMR) sequence, applicable in both solution-phase and solid-phase methodology. The broadly applicable synthesis proceeds with short reaction times and simple work-up, as illustrated in this work for alkylated opioid tetrapeptides. PMID:24906051

  5. Parameter estimation in a structural acoustic system with fully nonlinear coupling conditions

    NASA Technical Reports Server (NTRS)

    Banks, H. T.; Smith, Ralph C.

    1994-01-01

    A methodology for estimating physical parameters in a class of structural acoustic systems is presented. The general model under consideration consists of an interior cavity which is separated from an exterior noise source by an enclosing elastic structure. Piezoceramic patches are bonded to or embedded in the structure; these can be used both as actuators and sensors in applications ranging from the control of interior noise levels to the determination of structural flaws through nondestructive evaluation techniques. The presence and excitation of patches, however, change the geometry and material properties of the structure and involve unknown patch parameters, thus necessitating the development of parameter estimation techniques which are applicable in this coupled setting. In developing a framework for approximation, parameter estimation and implementation, strong consideration is given to the fact that the input operator is unbounded due to the discrete nature of the patches. Moreover, the model is weakly nonlinear as a result of the coupling mechanism between the structural vibrations and the interior acoustic dynamics. Within this context, an illustrative model is given, well-posedness and approximation results are discussed, and an applicable parameter estimation methodology is presented. The scheme is then illustrated through several numerical examples with simulations modeling a variety of commonly used structural acoustic techniques for system excitation and data collection.

  6. Imaging screening of catastrophic neurological events using a software tool: preliminary results.

    PubMed

    Fernandes, A P; Gomes, A; Veiga, J; Ermida, D; Vardasca, T

    2015-05-01

    In Portugal, as in most countries, the most frequent organ donors are brain-dead donors. To answer the increasing need for transplants, donation programs have been implemented. The goal is to recognize virtually all the possible and potential brain-dead donors admitted to hospitals. The aim of this work was to describe preliminary results of a software application designed to identify victims of devastating neurological injury who may progress to brain death and may be possible organ donors. This was an observational, longitudinal study with retrospective data collection. The software application is an automatic algorithm, based on natural language processing of selected keywords/expressions present in cranio-encephalic computerized tomography (CE CT) scan reports, that identifies catastrophic neurological situations and sends an e-mail notification to the Transplant Coordinator (TC). The first 7 months of this application were analyzed and compared with the standard clinical evaluation methodology. The imaging identification tool showed a sensitivity of 77% and a specificity of 66%; the positive predictive value (PPV) was 0.8 and the negative predictive value (NPV) was 0.7 for the identification of catastrophic neurological events. The methodology proposed in this work seems promising in improving the screening efficiency of critical neurological events. Copyright © 2015 Elsevier Inc. All rights reserved.
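
    A toy version of such a keyword-based screening rule is sketched below; the keywords, report text, and notification action are all invented for illustration.

      import re

      # Flag a CT report if it contains any catastrophic-injury expression,
      # then notify the coordinator.
      KEYWORDS = [r"massive (intracranial|cerebral) ha?emorrhage",
                  r"diffuse cerebral o?edema",
                  r"brain herniation"]
      PATTERN = re.compile("|".join(KEYWORDS), re.IGNORECASE)

      report = "Findings: massive intracranial haemorrhage with midline shift."
      if PATTERN.search(report):
          print("notify transplant coordinator by e-mail")   # placeholder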

  7. A simplified approach to determine the carbon footprint of a region: Key learning points from a Galician study.

    PubMed

    Roibás, Laura; Loiseau, Eléonore; Hospido, Almudena

    2018-07-01

    In a previous study, the carbon footprint (CF) of all production and consumption activities of Galicia, an Autonomous Community located in the north-west of Spain, was determined, and the results were used to devise strategies aimed at the reduction and mitigation of greenhouse gas (GHG) emissions. The territorial LCA methodology was used there to perform the calculations. However, that methodology was initially designed to compute the emissions of all types of polluting substances to the environment (several thousands of substances considered in the life cycle inventories), with the aim of performing complete LCA studies. This requirement implies the use of specific modelling approaches and databases that in turn raised some difficulties, i.e., the need for large amounts of data (which increased gathering times), low temporal, geographical and technological representativeness of the study, lack of data, and the presence of double counting issues when trying to combine the sectorial CF results into those of the total economy. In view of these difficulties, and considering the need to focus only on GHG emissions, it seemed important to improve the robustness of the CF computation while proposing a simplified methodology. This study is the result of those efforts to improve the aforementioned methodology. In addition to the territorial LCA approach, several Input-Output (IO) based alternatives have been used here to compute the direct and indirect GHG emissions of all Galician production and consumption activities. The results of the different alternatives were compared and evaluated under a multi-criteria approach considering reliability, completeness, temporal and geographical correlation, applicability and consistency. On that basis, an improved and simplified methodology was proposed to determine the CF of Galician consumption and production activities from a total responsibility perspective. This methodology adequately reflects the current characteristics of the Galician economy, thus increasing the representativeness of the results, and can be applied to any region for which IO tables and environmental vectors are available. It could thus provide useful information in decision-making processes to reduce and prevent GHG emissions. Copyright © 2018 Elsevier Ltd. All rights reserved.
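
    The IO-based computation rests on the standard environmentally extended input-output formula: total (direct plus indirect) emissions driven by a final demand y are e(I - A)^{-1}y, with A the technical-coefficient matrix and e the sectoral emission intensities. A minimal sketch with a toy two-sector economy:

      import numpy as np

      A = np.array([[0.10, 0.20],      # toy 2-sector technical coefficients
                    [0.30, 0.05]])
      e = np.array([0.8, 0.3])         # kg CO2e per unit of sectoral output
      y = np.array([100.0, 50.0])      # final demand

      L = np.linalg.inv(np.eye(2) - A)     # Leontief inverse
      footprint = e @ L @ y
      print(footprint)                     # total kg CO2e embodied in y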

  8. Hyperbolic reformulation of a 1D viscoelastic blood flow model and ADER finite volume schemes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Montecinos, Gino I.; Müller, Lucas O.; Toro, Eleuterio F.

    2014-06-01

    The applicability of ADER finite volume methods to solve hyperbolic balance laws with stiff source terms in the context of well-balanced and non-conservative schemes is extended to solve a one-dimensional blood flow model for viscoelastic vessels, reformulated as a hyperbolic system via a relaxation time. A criterion for selecting relaxation times is found and an empirical convergence rate assessment is carried out to support this result. The proposed methodology is validated by applying it to a network of viscoelastic vessels for which experimental and numerical results are available. The agreement between the results obtained in the present paper and those available in the literature is satisfactory. Key features of the present formulation and numerical methodologies, such as accuracy, efficiency and robustness, are fully discussed in the paper.

  9. IDENTIFICATION AND CHARACTERIZATION OF FIVE NON- TRADITIONAL SOURCE CATEGORIES: CATASTROPHIC/ACCIDENTAL RELEASES, VEHICLE REPAIR FACILITIES, RECYCLING, PESTICIDE APPLICATION, AND AGRICULTURAL OPERATIONS

    EPA Science Inventory

    The report gives results of work that is part of EPA's program to identify and characterize emissions sources not currently accounted for by either the existing Aerometric Information Retrieval System (AIRS) or State Implementation Plan (SIP) area source methodologies and to deve...

  10. IMPROVEMENT OF EXPOSURE-DOSE MODELS: APPLICATION OF CONTINUOUS BREATH SAMPLING TO DETERMINE VOC DOSE AND BODY BURDEN

    EPA Science Inventory

    This is a continuation of an Internal Grant research project with the focus on completing the research due to initial funding delays and then analyzing and reporting the research results. This project will employ a new continuous breath sampling methodology to investigate dose a...

  11. The development of a host potential index and its postharvest application to the spotted wing drosophila, Drosophila suzukii (Diptera: Drosophilidae)

    USDA-ARS?s Scientific Manuscript database

    Novel methodology is presented for indexing the relative potential of hosts to function as resources. Results from studies examining host selection, utilization, and physiological development of the organism resourcing the host were combined and quantitatively related via a Host Potential Index (HPI...

  12. Logistic Achievement Test Scaling and Equating with Fixed versus Estimated Lower Asymptotes.

    ERIC Educational Resources Information Center

    Phillips, S. E.

    This study compared the lower asymptotes estimated by the maximum likelihood procedures of the LOGIST computer program with those obtained via application of the Norton methodology. The study also compared the equating results from the three-parameter logistic model with those obtained from the equipercentile, Rasch, and conditional…

  13. A Methodology in the Teaching Process of Calculus and Its Motivation.

    ERIC Educational Resources Information Center

    Vasquez-Martinez, Claudio-Rafael

    Because the development of calculus and science is permanent and didactic, it demands, on the one hand, an analytical, deductive study and, on the other, the application of methods, rhochrematics, and resources within calculus, which allows knowledge to be dialectically shaped in its different phases and the results to be tested. For the purpose of this study, the motivation…

  14. Application of LSP Texts in Translator Training

    ERIC Educational Resources Information Center

    Ilynska, Larisa; Smirnova, Tatjana; Platonova, Marina

    2017-01-01

    The paper presents discussion of the results of extensive empirical research into efficient methods of educating and training translators of LSP (language for special purposes) texts. The methodology is based on using popular LSP texts in the respective fields as one of the main media for translator training. The aim of the paper is to investigate…

  15. Nanorobot Hardware Architecture for Medical Defense.

    PubMed

    Cavalcanti, Adriano; Shirinzadeh, Bijan; Zhang, Mingjun; Kretly, Luiz C

    2008-05-06

    This work presents a new approach, with details on the integrated platform and hardware architecture, for nanorobot application in epidemic control, which should enable real-time in vivo prognosis of biohazard infection. The recent developments in the field of nanoelectronics, with transducers progressively shrinking down to smaller sizes through nanotechnology and carbon nanotubes, are expected to result in innovative biomedical instrumentation possibilities, with new therapies and efficient diagnosis methodologies. The use of integrated systems, smart biosensors, and programmable nanodevices is advancing nanoelectronics, enabling the progressive research and development of molecular machines. It should provide high-precision pervasive biomedical monitoring with real-time data transmission. The use of nanobioelectronics as embedded systems is the natural pathway towards a manufacturing methodology that moves nanorobot applications out of the laboratory as soon as possible. To demonstrate the practical application of medical nanorobotics, a 3D simulation based on clinical data addresses how to integrate communication with nanorobots using RFID, mobile phones, and satellites, applied to long-distance ubiquitous surveillance and health monitoring for troops in conflict zones. Therefore, the current model can also be used to prevent and protect a population against a targeted epidemic disease.

  16. A comprehensive review on the quasi-induced exposure technique.

    PubMed

    Jiang, Xinguo; Lyles, Richard W; Guo, Runhua

    2014-04-01

    The goal is to comprehensively examine the state-of-the-art applications and methodological development of quasi-induced exposure and consequently pinpoint future research directions in terms of implementation guidelines, limitations, and validity tests. The paper conducts a comprehensive review of approximately 45 published papers relevant to quasi-induced exposure regarding four key topics of interest: applications, responsibility assignment, validation of assumptions, and methodological development. Specific findings include that: (1) there is no systematic data screening procedure in place, and how the eliminated crash data impact the responsibility assignment is generally unknown; (2) there is a lack of necessary effort to assess the validity of the assumptions prior to application, and validation efforts are mostly restricted to aggregated levels due to the limited availability of exposure ground truth; and (3) there is a deficiency of quantitative analyses evaluating the magnitude and direction of bias resulting from differences in injury risk and crash avoidance ability. The paper points out future research directions and insights in terms of validity tests and implementation guidelines. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. Optimal Tuner Selection for Kalman Filter-Based Aircraft Engine Performance Estimation

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Garg, Sanjay

    2010-01-01

    A linear point design methodology for minimizing the error in on-line Kalman filter-based aircraft engine performance estimation applications is presented. This technique specifically addresses the underdetermined estimation problem, where there are more unknown parameters than available sensor measurements. A systematic approach is applied to produce a model tuning parameter vector of appropriate dimension to enable estimation by a Kalman filter, while minimizing the estimation error in the parameters of interest. Tuning parameter selection is performed using a multi-variable iterative search routine which seeks to minimize the theoretical mean-squared estimation error. This paper derives theoretical Kalman filter estimation error bias and variance values at steady-state operating conditions, and presents the tuner selection routine applied to minimize these values. Results from the application of the technique to an aircraft engine simulation are presented and compared to the conventional approach of tuner selection. Experimental simulation results are found to be in agreement with theoretical predictions. The new methodology is shown to yield a significant improvement in on-line engine performance estimation accuracy.
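
    The theoretical mean-squared estimation error that such a tuner search minimizes can be computed from the steady-state Riccati equation; a minimal sketch with toy matrices (not the engine model) follows.

      import numpy as np
      from scipy.linalg import solve_discrete_are

      # Steady-state Kalman error covariance for x' = A x + w, y = C x + v:
      # solving the dual Riccati equation gives the a-priori covariance P,
      # and trace(P) is the theoretical mean-squared estimation error.
      A = np.array([[0.9, 0.1],
                    [0.0, 0.8]])
      C = np.array([[1.0, 0.0]])
      Q = 0.01 * np.eye(2)                  # process noise covariance
      R = np.array([[0.1]])                 # measurement noise covariance

      P = solve_discrete_are(A.T, C.T, Q, R)
      print(np.trace(P))                    # theoretical MSE metric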

  18. 2Loud?: Community mapping of exposure to traffic noise with mobile phones.

    PubMed

    Leao, Simone; Ong, Kok-Leong; Krezel, Adam

    2014-10-01

    Despite ample medical evidence of the adverse impacts of traffic noise on health, most policies for traffic noise management are arbitrary or incomplete, resulting in serious social and economic impacts. Surprisingly, there is limited information about citizens' exposure to traffic noise worldwide. This paper presents the 2Loud? mobile phone application, developed and tested as a methodology to monitor, assess and map the level of exposure of citizens to traffic noise, with focus on the night period and indoor locations, since sleep disturbance is one of the major triggers for ill health related to traffic noise. Based on a community participation experiment using the 2Loud? mobile phone application in a region close to freeways in Australia, the results of this research indicate a good level of accuracy for noise monitoring by mobile phones and also demonstrate significant levels of indoor night exposure to traffic noise in the study area. The proposed methodology, through the data produced and the participatory process involved, can potentially assist in planning and management towards healthier urban environments.
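
    One detail worth noting when aggregating such measurements: sound levels in dB must be energy-averaged, not arithmetically averaged. A minimal sketch with invented night-time readings:

      import numpy as np

      # Equivalent continuous level: Leq = 10 log10( mean( 10^(L_i/10) ) ).
      samples_db = np.array([52.0, 55.0, 61.0, 58.0])   # night-time readings
      leq = 10.0 * np.log10(np.mean(10.0 ** (samples_db / 10.0)))
      print(round(leq, 1))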

  19. An Evolutionary Method for Financial Forecasting in Microscopic High-Speed Trading Environment

    PubMed Central

    Li, Hsu-Chih

    2017-01-01

    The advancement of information technology in financial applications nowadays has led to fast market-driven events that prompt flash decision-making and actions issued by computer algorithms. As a result, today's markets experience intense activity in the highly dynamic environment where trading systems respond to others at a much faster pace than before. This new breed of technology involves the implementation of high-speed trading strategies which generate a significant portion of activity in the financial markets and present researchers with a wealth of information not available in traditional low-speed trading environments. In this study, we aim at developing feasible computational intelligence methodologies, particularly genetic algorithms (GA), to shed light on high-speed trading research using price data of stocks on the microscopic level. Our empirical results show that the proposed GA-based system is able to improve the accuracy of the prediction significantly for price movement, and we expect this GA-based methodology to advance the current state of research for high-speed trading and other relevant financial applications. PMID:28316618
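
    A minimal genetic algorithm in the spirit of the paper is sketched below, with synthetic price data and a toy linear direction predictor; the authors' actual features and fitness function are their own.

      import numpy as np

      rng = np.random.default_rng(4)
      prices = 100.0 + np.cumsum(rng.normal(size=500))     # synthetic ticks
      returns = np.diff(prices)
      X = np.column_stack([returns[2:-1], returns[1:-2], returns[:-3]])
      y = np.sign(returns[3:])                             # next-tick direction

      def fitness(w):                                      # directional accuracy
          return np.mean(np.sign(X @ w) == y)

      pop = rng.normal(size=(40, 3))                       # initial population
      for _ in range(30):
          scores = np.array([fitness(w) for w in pop])
          parents = pop[np.argsort(-scores)[:10]]          # keep the best 10
          children = []
          for _ in range(30):
              a, b = parents[rng.integers(10, size=2)]
              child = np.where(rng.random(3) < 0.5, a, b)  # uniform crossover
              children.append(child + rng.normal(0.0, 0.1, 3))  # mutation
          pop = np.vstack([parents, np.array(children)])
      print(max(fitness(w) for w in pop))                  # best accuracy found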

  20. Quality assessment of urban environment

    NASA Astrophysics Data System (ADS)

    Ovsiannikova, T. Y.; Nikolaenko, M. N.

    2015-01-01

    This paper is dedicated to the problem of applying quality management principles to construction products. It is proposed to expand the boundaries of quality management in construction, transferring its principles to urban systems as economic systems of a higher level, whose qualitative characteristics are substantially defined by the quality of construction products. Buildings and structures form the spatial-material basis of cities and the most important component of the life sphere - the urban environment. The authors justify the need for the assessment of urban environment quality as an important factor of social welfare and life quality in urban areas, and suggest a definition of the term "urban environment". The methodology of quality assessment of the urban environment is based on an integrated approach which includes the system analysis of all factors and the application of both quantitative methods of assessment (calculation of particular and integrated indicators) and qualitative methods (expert estimates and surveys). The authors propose a system of indicators characterizing the quality of the urban environment. These indicators fall into four classes, and the methodology for their determination is shown. The paper presents results of quality assessment of the urban environment for several Siberian regions and a comparative analysis of these results.

  1. T-4G Methodology: Undergraduate Pilot Training T-37 Phase.

    ERIC Educational Resources Information Center

    Woodruff, Robert R.; And Others

    The report's brief introduction describes the application of T-4G methodology to the T-37 instrument phase of undergraduate pilot training. The methodology is characterized by instruction in trainers, proficiency advancement, a highly structured syllabus, the training manager concept, early exposure to instrument training, and hands-on training.…

  2. An Information Theoretic Investigation Of Complex Adaptive Supply Networks With Organizational Topologies

    DTIC Science & Technology

    2016-12-22

    assumptions of behavior. This research proposes an information theoretic methodology to discover such complex network structures and dynamics while overcoming the difficulties historically associated with their study. Indeed, this was the first application of an information theoretic methodology as a tool...

  3. Methodological Limitations of the Application of Expert Systems Methodology in Reading.

    ERIC Educational Resources Information Center

    Willson, Victor L.

    Methodological deficiencies inherent in expert-novice reading research make it impossible to draw inferences about curriculum change. First, comparisons of intact groups are often used as a basis for making causal inferences about how observed characteristics affect behaviors. While comparing different groups is not by itself a useless activity,…

  4. Applications of Mass Spectrometry for Cellular Lipid Analysis

    PubMed Central

    Wang, Chunyan; Wang, Miao; Han, Xianlin

    2015-01-01

    Mass spectrometric analysis of cellular lipids is an enabling technology for lipidomics, which is a rapidly-developing research field. In this review, we briefly discuss the principles, advantages, and possible limitations of electrospray ionization (ESI) and matrix assisted laser desorption/ionization (MALDI) mass spectrometry-based methodologies for the analysis of lipid species. The applications of these methodologies to lipidomic research are also summarized. PMID:25598407

  5. Recent developments and applications of immobilized laccase.

    PubMed

    Fernández-Fernández, María; Sanromán, M Ángeles; Moldes, Diego

    2013-12-01

    Laccase is a promising biocatalyst with many possible applications, including bioremediation, chemical synthesis, biobleaching of paper pulp, biosensing, textile finishing and wine stabilization. The immobilization of enzymes offers several improvements for enzyme applications because the storage and operational stabilities are frequently enhanced. Moreover, the reusability of immobilized enzymes represents a great advantage compared with free enzymes. In this work, we discuss the different methodologies of enzyme immobilization that have been reported for laccases, such as adsorption, entrapment, encapsulation, covalent binding and self-immobilization. The applications of laccase immobilized by the aforementioned methodologies are presented, paying special attention to recent approaches regarding environmental applications and electrobiochemistry. Copyright © 2012 Elsevier Inc. All rights reserved.

  6. Development of Bioethics and Clinical Ethics in Bulgaria.

    PubMed

    Aleksandrova-Yankulovska, Silviya S

    2017-03-01

    Bioethics and clinical ethics emerged from classical medical ethics in the 1970s. Both fields are new to the Bulgarian academic tradition. The aims of this paper were to demarcate the subject fields of medical ethics, bioethics, and clinical ethics, to present the developments in the field of medical ethics in Bulgaria, to delineate the obstacles to effective ethics education of medical professionals, and to present the results of the application of an adapted bottom-up methodology for clinical ethics consultation in several clinical units in Bulgaria. The methods comprised an extended literature review and the application of an adapted METAP methodology for clinical ethics consultation in six clinical units in northern Bulgaria between May 2013 and December 2014. The teaching of medical ethics in Bulgaria was introduced in the 1990s and still consists mainly of theoretical instruction without sufficient dilemma training in clinical settings. Earlier studies revealed a need for clinical ethics consultation services in our country. The METAP methodology was applied in 69 ethics meetings: in 31.9% of them, non-medical considerations affected the choice of treatment, and 34.8% resulted in reaching consensus between the team and the patient. Participants' opinion of the meetings was highly positive, with 87.7% overall satisfaction. The development of bioethics in Bulgaria follows recent worldwide trends, and several ideas could be applied towards increasing the effectiveness of ethics education. The results of the ethics meetings lead to the conclusion that this is a successful and well-accepted approach to clinical ethics consultation, with potential for wider introduction into our medical practice.

  7. The PHM-Ethics methodology: interdisciplinary technology assessment of personal health monitoring.

    PubMed

    Schmidt, Silke; Verweij, Marcel

    2013-01-01

    The contribution briefly introduces the PHM-Ethics project and the PHM methodology. Within the PHM-Ethics project, a set of tools and modules has been developed that may assist in the evaluation and assessment of new technologies for personal health monitoring, referred to as the "PHM methodology" or "PHM toolbox". An overview of this interdisciplinary methodology and its component modules is provided, and areas of application and intended target groups are indicated.

  8. A finite element program for postbuckling calculations (PSTBKL)

    NASA Technical Reports Server (NTRS)

    Simitses, G. T.; Carlson, R. L.; Riff, R.

    1991-01-01

    The object of the research reported herein was to develop a general mathematical model and solution methodologies for analyzing the structural response of thin, metallic shell structures under large transient, cyclic, or static thermomechanical loads. This report describes the computer program resulting from the research. Among the system responses associated with these loads and conditions are thermal buckling, creep buckling, and ratcheting. Thus, geometric and material nonlinearities (of high order) have been anticipated and are considered in developing the mathematical model. The methodology is demonstrated through problems of extension, shear, and planar curved beams. Moreover, the importance of including large strains is clearly demonstrated through the chosen applications.

  9. A new zero-inflated negative binomial methodology for latent category identification.

    PubMed

    Blanchard, Simon J; DeSarbo, Wayne S

    2013-04-01

    We introduce a new statistical procedure for the identification of unobserved categories that vary between individuals and in which objects may span multiple categories. This procedure can be used to analyze data from a proposed sorting task in which individuals may simultaneously assign objects to multiple piles. The results of a synthetic example and a consumer psychology study involving categories of restaurant brands illustrate how the application of the proposed methodology to the new sorting task can account for a variety of categorization phenomena including multiple category memberships and for heterogeneity through individual differences in the saliency of latent category structures.
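    The abstract does not give the model equations; as a hedged illustration, one common zero-inflated negative binomial (ZINB) parameterization mixes a point mass at zero with a negative binomial count distribution, which the following Python sketch implements (the parameter values are invented):

```python
# Minimal sketch of a zero-inflated negative binomial (ZINB) pmf,
# a common parameterization assumed for illustration (not the authors' exact model).
import numpy as np
from scipy import stats

def zinb_pmf(k, pi, n, p):
    """P(K=k) for a ZINB: with probability pi the count is a structural zero,
    otherwise it follows a negative binomial NB(n, p)."""
    k = np.asarray(k)
    nb = stats.nbinom.pmf(k, n, p)
    return np.where(k == 0, pi + (1.0 - pi) * nb, (1.0 - pi) * nb)

# Example: probability of an object receiving 0, 1, or 2 pile assignments
print(zinb_pmf([0, 1, 2], pi=0.3, n=2.0, p=0.5))
print(zinb_pmf(np.arange(50), 0.3, 2.0, 0.5).sum())  # ~1.0, sanity check
```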

  10. [Customer and patient satisfaction. An appropriate management tool in hospitals?].

    PubMed

    Pawils, S; Trojan, A; Nickel, S; Bleich, C

    2012-09-01

    Recently, the concept of patient satisfaction has been established as an essential part of the quality management of hospitals. Despite the concept's lack of theoretical and methodological foundations, patient surveys on subjective hospital experiences contribute immensely to the improvement of hospitals. What needs to be considered critically in this context is the concept of customer satisfaction for patients, the theoretical integration of empirical results, the reduction of false satisfaction indications and the application of risk-adjusted versus naïve benchmarking of data. This paper aims to contribute to the theoretical discussion of the topic and to build a basis for planning methodologically sound patient surveys.

  11. Set-membership fault detection under noisy environment with application to the detection of abnormal aircraft control surface positions

    NASA Astrophysics Data System (ADS)

    El Houda Thabet, Rihab; Combastel, Christophe; Raïssi, Tarek; Zolghadri, Ali

    2015-09-01

    The paper develops a set-membership detection methodology which is applied to the detection of abnormal positions of aircraft control surfaces. Robust and early detection of such abnormal positions is an important issue for early system reconfiguration and overall optimisation of aircraft design. In order to improve fault sensitivity while ensuring a high level of robustness, the method combines a data-driven characterisation of noise and a model-driven approach based on interval prediction. The efficiency of the proposed methodology is illustrated through simulation results obtained from data recorded in several flight scenarios of a highly representative aircraft benchmark.
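    As a rough sketch of the interval-prediction idea (the model, bounds, and data below are invented for illustration and are not the paper's), a set-membership test flags a fault only when a measurement leaves the predicted envelope inflated by the noise characterisation:

```python
# Hedged sketch of an interval-based (set-membership) fault test:
# flag a fault when the measured surface position leaves the predicted envelope.
import numpy as np

def interval_fault_test(y_meas, y_pred, model_bound, noise_bound):
    """Return True where the measurement is inconsistent with the
    prediction interval [y_pred - r, y_pred + r], r = model + noise bounds."""
    r = model_bound + noise_bound
    return np.abs(y_meas - y_pred) > r

y_pred = np.sin(np.linspace(0, 2 * np.pi, 100))      # predicted position
y_meas = y_pred + 0.01 * np.random.randn(100)        # noisy measurement
y_meas[60:] += 0.2                                   # injected abnormal offset
faults = interval_fault_test(y_meas, y_pred, model_bound=0.02, noise_bound=0.05)
print("first flagged sample:", int(np.argmax(faults)))
```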

  12. From intuition to statistics in building subsurface structural models

    USGS Publications Warehouse

    Brandenburg, J.P.; Alpak, F.O.; Naruk, S.; Solum, J.

    2011-01-01

    Experts associated with the oil and gas exploration industry suggest that combining forward trishear models with stochastic global optimization algorithms allows a quantitative assessment of the uncertainty associated with a given structural model. The methodology is applied to incompletely imaged structures related to deepwater hydrocarbon reservoirs, and results are compared to prior manual palinspastic restorations and borehole data. This methodology is also useful for extending structural interpretations into other areas of limited resolution, such as subsalt, and for extrapolating existing data into seismic data gaps. The technique can be used for rapid reservoir appraisal and potentially has other applications in seismic processing, well planning, and borehole stability analysis.

  13. Methodology of shell structure reinforcement layout optimization

    NASA Astrophysics Data System (ADS)

    Szafrański, Tomasz; Małachowski, Jerzy; Damaziak, Krzysztof

    2018-01-01

    This paper presents an optimization process for a reinforced shell diffuser intended for a small wind turbine (rated power of 3 kW). The diffuser structure consists of multiple reinforcements and a metal skin. This kind of structure is suitable for optimization in terms of the selection of reinforcement density, stringer cross sections, sheet thickness, etc. The optimization approach aims to reduce the amount of work to be done between the optimization process and the final product design. The proposed optimization methodology is based on the application of a genetic algorithm to generate the optimal reinforcement layout. The obtained results are the basis for modifying the existing Small Wind Turbine (SWT) design.

  14. On the generalized VIP time integral methodology for transient thermal problems

    NASA Technical Reports Server (NTRS)

    Mei, Youping; Chen, Xiaoqin; Tamma, Kumar K.; Sha, Desong

    1993-01-01

    The paper describes the development and applicability of a generalized VIrtual-Pulse (VIP) time integral method of computation for thermal problems. Unlike past approaches for general heat transfer computations, and given the advent of high-speed computing technology and the importance of parallel computations for the efficient use of computing environments, a major motivation for the developments described in this paper is the need for explicit computational procedures with improved accuracy and stability characteristics. As a consequence, a new and effective VIP methodology is described which inherits these improved characteristics. Illustrative numerical examples are provided to demonstrate the developments and validate the results obtained for thermal problems.

  15. Streakline-based closed-loop control of a bluff body flow

    NASA Astrophysics Data System (ADS)

    Roca, Pablo; Cammilleri, Ada; Duriez, Thomas; Mathelin, Lionel; Artana, Guillermo

    2014-04-01

    A novel closed-loop control methodology is introduced to stabilize a cylinder wake flow based on images of streaklines. Passive scalar tracers are injected upstream of the cylinder and their concentration is monitored downstream at certain image sectors of the wake. An AutoRegressive with eXogenous inputs (ARX) mathematical model is built from these images, and a Generalized Predictive Control algorithm is used to compute the actuation required to stabilize the wake by adding momentum tangentially to the cylinder wall through plasma actuators. The methodology, which is applicable to real-world configurations, is demonstrated on a numerical simulation, and the results show that good performance is achieved.
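    For illustration, the ARX identification step at the core of such a loop can be sketched as a least-squares fit; the model orders and data below are invented, not taken from the paper:

```python
# Minimal sketch of identifying an ARX model y[t] = a1*y[t-1] + b1*u[t-1] + e[t]
# by least squares, as one building block of such a predictive-control loop.
import numpy as np

rng = np.random.default_rng(0)
u = rng.standard_normal(500)                  # actuation input
y = np.zeros(500)
for t in range(1, 500):                       # "true" system generating the data
    y[t] = 0.8 * y[t - 1] + 0.5 * u[t - 1] + 0.01 * rng.standard_normal()

# Regressor matrix [y[t-1], u[t-1]] -> y[t]
Phi = np.column_stack([y[:-1], u[:-1]])
theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
print("estimated (a1, b1):", theta)           # close to (0.8, 0.5)
```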

  16. Statistical Model Applied to NetFlow for Network Intrusion Detection

    NASA Astrophysics Data System (ADS)

    Proto, André; Alexandre, Leandro A.; Batista, Maira L.; Oliveira, Isabela L.; Cansian, Adriano M.

    Computers and network services have become a guaranteed presence in many places. As a result, illicit events have grown, and computer and network security has become an essential point in any computing environment. Many methodologies have been created to identify these events; however, with the increase of users and services on the Internet, monitoring a large network environment is difficult. This paper proposes a methodology for event detection in large-scale networks. The proposal approaches anomaly detection using the NetFlow protocol, statistical methods, and monitoring of the environment in a time frame suitable for the application.
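    As a minimal sketch of the statistical flavor of such detection (the feature, baseline, and threshold below are assumptions, not the authors' model), one can flag intervals whose NetFlow-derived flow counts deviate strongly from a learned baseline:

```python
# Hedged sketch of flow-volume anomaly detection in the spirit described:
# learn a per-interval baseline from NetFlow counts and flag statistical outliers.
import numpy as np

baseline = np.array([1200, 1150, 1300, 1250, 1180, 1220, 1275])  # flows/min, training
mu, sigma = baseline.mean(), baseline.std(ddof=1)

def is_anomalous(flows_per_min, z_thresh=3.0):
    """Flag an interval whose flow count deviates more than z_thresh sigmas."""
    return abs(flows_per_min - mu) / sigma > z_thresh

print(is_anomalous(1230))   # False: normal traffic
print(is_anomalous(5400))   # True: e.g., a scan or DDoS burst
```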

  17. Feasibility of spirography features for objective assessment of motor function in Parkinson's disease.

    PubMed

    Sadikov, Aleksander; Groznik, Vida; Možina, Martin; Žabkar, Jure; Nyholm, Dag; Memedi, Mevludin; Bratko, Ivan; Georgiev, Dejan

    2017-09-01

    Parkinson's disease (PD) is currently incurable; however, proper treatment can ease the symptoms and significantly improve patients' quality of life. Since PD is a chronic disease, its efficient monitoring and management is very important. The objective of this paper was to investigate the feasibility of using the features and methodology of a spirography application, originally designed to detect early PD motor symptoms, for automatically assessing motor symptoms of advanced PD patients experiencing motor fluctuations. More specifically, the aim was to objectively assess motor symptoms related to bradykinesia (slowness of movement occurring as a result of under-medication) and dyskinesia (involuntary movements occurring as a result of over-medication). This work combined spirography data and clinical assessments from a longitudinal clinical study in Sweden with the features and pre-processing methodology of a Slovenian spirography application. The study involved 65 advanced PD patients and over 30,000 spiral-drawing measurements over the course of three years. Machine learning methods were used to learn to predict the "cause" (bradykinesia or dyskinesia) of upper-limb motor dysfunctions as assessed by a clinician who observed animated spirals in a web interface. The classification model was also tested for comprehensibility: a visualisation technique was used to present visual clues to clinicians as to which parts of the spiral drawing (or its animation) are important for the given classification. Using the machine learning methods with the feature descriptions and pre-processing from the Slovenian application resulted in 86% classification accuracy and over 0.90 AUC. The clinicians also rated the computer's visual explanations of its classifications as at least meaningful, if not necessarily helpful, in over 90% of the cases. The relatively high classification accuracy and AUC demonstrate the usefulness of this approach for objective monitoring of PD patients, and the positive evaluation of the computer's explanations suggests the potential use of this methodology in a decision support setting.
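    A hedged sketch of the evaluation pipeline described, with synthetic features standing in for the real spirography data and a generic classifier rather than the authors' exact method:

```python
# Sketch of the evaluation loop described: predict the cause of motor dysfunction
# (bradykinesia vs. dyskinesia) from spiral-drawing features and report the AUC.
# Synthetic features stand in for the real spirography data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.standard_normal((400, 10))            # spiral features (synthetic)
y = (X[:, 0] + 0.5 * X[:, 1] + 0.3 * rng.standard_normal(400)) > 0  # 1 = dyskinesia

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```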

  18. Suggested criteria for evaluating systems engineering methodologies

    NASA Technical Reports Server (NTRS)

    Gates, Audrey; Paul, Arthur S.; Gill, Tepper L.

    1989-01-01

    Systems engineering is the application of mathematical and scientific principles to practical ends in the life-cycle of a system. A methodology for systems engineering is a carefully developed, relatively complex procedure or process for applying these mathematical and scientific principles. There are many systems engineering methodologies (or possibly many versions of a few methodologies) currently in use in government and industry. These methodologies are usually designed to meet the needs of a particular organization. It has been observed, however, that many technical and non-technical problems arise when inadequate systems engineering methodologies are applied by organizations to their systems development projects. Various criteria for evaluating systems engineering methodologies are discussed. Such criteria are developed to assist methodology-users in identifying and selecting methodologies that best fit the needs of the organization.

  19. 42 CFR 436.601 - Application of financial eligibility methodologies.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... methodologies in determining financial eligibility of the following groups: (i) Qualified pregnant women and children under the mandatory categorically needy group under § 436.120; (ii) Low-income pregnant women...

  20. 42 CFR 436.601 - Application of financial eligibility methodologies.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... methodologies in determining financial eligibility of the following groups: (i) Qualified pregnant women and children under the mandatory categorically needy group under § 436.120; (ii) Low-income pregnant women...

  1. 42 CFR 436.601 - Application of financial eligibility methodologies.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... methodologies in determining financial eligibility of the following groups: (i) Qualified pregnant women and children under the mandatory categorically needy group under § 436.120; (ii) Low-income pregnant women...

  2. 42 CFR 436.601 - Application of financial eligibility methodologies.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... methodologies in determining financial eligibility of the following groups: (i) Qualified pregnant women and children under the mandatory categorically needy group under § 436.120; (ii) Low-income pregnant women...

  3. GPS system simulation methodology

    NASA Technical Reports Server (NTRS)

    Ewing, Thomas F.

    1993-01-01

    The following topics are presented: background; Global Positioning System (GPS) methodology overview; the graphical user interface (GUI); current models; application to space nuclear power/propulsion; and interfacing requirements. The discussion is presented in vugraph form.

  4. Methodologies for Root Locus and Loop Shaping Control Design with Comparisons

    NASA Technical Reports Server (NTRS)

    Kopasakis, George

    2017-01-01

    This paper describes some basics of the root locus control design method as well as of loop shaping, and establishes approaches to expedite the application of these two design methodologies so as to easily obtain control designs that meet requirements with superior performance. The two design approaches are compared for their ability to meet control design specifications and for ease of application using control design examples. These approaches are also compared with traditional Proportional Integral Derivative (PID) control in order to demonstrate the limitations of PID control. The robustness of these designs is covered as it pertains to these control methodologies and to the example problems.
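    As a small illustration of the PID baseline being compared against (the plant and gains below are invented for the example), closing the loop with PID compensation can be checked with a step response:

```python
# Illustrative check of a plant under PID compensation, the kind of baseline
# the paper contrasts with root locus / loop shaping. Plant and gains invented.
import numpy as np
from scipy import signal

# Plant G(s) = 1 / (s^2 + 2s + 1); PID C(s) = (Kd s^2 + Kp s + Ki) / s
Kp, Ki, Kd = 8.0, 4.0, 1.0
num_ol = np.array([Kd, Kp, Ki])                   # C*G numerator
den_ol = np.polymul([1.0, 0.0], [1.0, 2.0, 1.0])  # C*G denominator
num_cl = num_ol                                   # closed loop: CG / (1 + CG)
den_cl = np.polyadd(den_ol, num_ol)

t, y = signal.step(signal.TransferFunction(num_cl, den_cl))
print("final value ~", y[-1])  # near 1.0: integral action removes steady-state error
```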

  5. Modeling and Characterization of Near-Crack-Tip Plasticity from Micro- to Nano-Scales

    NASA Technical Reports Server (NTRS)

    Glaessgen, Edward H.; Saether, Erik; Hochhalter, Jacob; Smith, Stephen W.; Ransom, Jonathan B.; Yamakov, Vesselin; Gupta, Vipul

    2010-01-01

    Methodologies for understanding the plastic deformation mechanisms related to crack propagation at the nano-, meso- and micro-length scales are being developed. These efforts include the development and application of several computational methods including atomistic simulation, discrete dislocation plasticity, strain gradient plasticity and crystal plasticity; and experimental methods including electron backscattered diffraction and video image correlation. Additionally, methodologies for multi-scale modeling and characterization that can be used to bridge the relevant length scales from nanometers to millimeters are being developed. The paper focuses on the discussion of newly developed methodologies in these areas and their application to understanding damage processes in aluminum and its alloys.

  6. Modeling and Characterization of Near-Crack-Tip Plasticity from Micro- to Nano-Scales

    NASA Technical Reports Server (NTRS)

    Glaessgen, Edward H.; Saether, Erik; Hochhalter, Jacob; Smith, Stephen W.; Ransom, Jonathan B.; Yamakov, Vesselin; Gupta, Vipul

    2011-01-01

    Methodologies for understanding the plastic deformation mechanisms related to crack propagation at the nano-, meso- and micro-length scales are being developed. These efforts include the development and application of several computational methods including atomistic simulation, discrete dislocation plasticity, strain gradient plasticity and crystal plasticity; and experimental methods including electron backscattered diffraction and video image correlation. Additionally, methodologies for multi-scale modeling and characterization that can be used to bridge the relevant length scales from nanometers to millimeters are being developed. The paper focuses on the discussion of newly developed methodologies in these areas and their application to understanding damage processes in aluminum and its alloys.

  7. A System Engineering Approach to Strategic Partnership Development: A pilot study with NASA's Orbiting Carbon Observatory-2 (OCO-2) and the National Laboratory for Agriculture and the Environment (NLAE)

    NASA Astrophysics Data System (ADS)

    Yuen, K.; Chang, G.; Basilio, R. R.; Hatfield, J.; Cox, E. L.

    2017-12-01

    The prevalence and availability of NASA remote sensing data over the last 40+ years have produced many opportunities for the development of science-derived data applications. However, extending and systematically integrating the applications into decision support models and tools have been sporadic and incomplete. Despite efforts among the research communities and external partners, implementation challenges remain to be addressed. In order to close the systemic gap between the research and applications communities, steps must be taken to bridge it: specific goals, a clear plan, and a concerted and diligent effort are needed to produce the desired results. The Orbiting Carbon Observatory-2 (OCO-2) mission sponsored a pilot effort on science data applications with the specific intent of building strategic partnerships, so that organizations and individuals could effectively use OCO-2 data products for application development. The successful partnership with the USDA/ARS National Laboratory for Agriculture and the Environment (NLAE) has laid the foundation for: 1) requirements and lessons for establishing a strategic partnership for application development, 2) building opportunities and growing partnerships for new missions such as OCO-3, and 3) the development of a methodology and approach for integrating application development into a mission life cycle. This presentation provides an overview of the OCO-2 pilot effort, its deliverables, the methodology, implementation, and best practices.

  8. Effect of Methodological and Ecological Approaches on Heterogeneity of Nest-Site Selection of a Long-Lived Vulture

    PubMed Central

    Moreno-Opo, Rubén; Fernández-Olalla, Mariana; Margalida, Antoni; Arredondo, Ángel; Guil, Francisco

    2012-01-01

    The application of science-based conservation measures requires that sampling methodologies in studies modelling similar ecological aspects produce comparable results, making their interpretation easier. We aimed to show how the choice of different methodological and ecological approaches can affect conclusions in nest-site selection studies across different Palearctic meta-populations of an indicator species. First, a multivariate analysis of the variables affecting nest-site selection in a breeding colony of cinereous vulture (Aegypius monachus) in central Spain was performed. Then, a meta-analysis was applied to establish how methodological and habitat-type factors determine differences and similarities in the results obtained by previous studies that have modelled the forest breeding habitat of the species. Our results revealed patterns in nesting-habitat modelling by the cinereous vulture throughout its whole range: steep and south-facing slopes, a great cover of large trees and distance to human activities were generally selected. The ratio and situation of the studied plots (nests/random), the use of plots vs. polygons as sampling units and the number of years in the data set determined the variability explained by the model. Moreover, a greater size of the breeding colony implied that ecological and geomorphological variables at the landscape level were more influential. Additionally, human activities affected colonies situated in Mediterranean forests to a greater extent. For the first time, a meta-analysis of the factors determining nest-site selection heterogeneity for a single species at a broad scale was achieved. It is essential to homogenize and coordinate experimental design in modelling the selection of species' ecological requirements, so that differences in results among studies are not due to methodological heterogeneity. This would optimize best conservation and management practices for habitats and species in a global context. PMID:22413023

  9. A new methodology for evaluating the damage to the skin barrier caused by repeated application and removal of adhesive dressings.

    PubMed

    Waring, Mike; Bielfeldt, Stephan; Mätzold, Katja; Wilhelm, Klaus-Peter

    2013-02-01

    Chronic wounds require frequent dressing changes. Adhesive dressings used for this indication can be damaging to the stratum corneum, particularly in the elderly where the skin tends to be thinner. Understanding the level of damage caused by dressing removal can aid dressing selection. This study used a novel methodology that applied a stain to the skin and measured the intensity of that stain after repeated application and removal of a series of different adhesive types. Additionally, a traditional method of measuring skin barrier damage (transepidermal water loss) was also undertaken and compared with the staining methodology. The staining methodology and measurement of transepidermal water loss differentiated the adhesive dressings, showing that silicone adhesives caused least trauma to the skin. The staining methodology was shown to be as effective as transepidermal water loss in detecting damage to the stratum corneum and was shown to detect disruption of the barrier earlier than the traditional technique.

  10. Application of new methodologies based on design of experiments, independent component analysis and design space for robust optimization in liquid chromatography.

    PubMed

    Debrus, Benjamin; Lebrun, Pierre; Ceccato, Attilio; Caliaro, Gabriel; Rozet, Eric; Nistor, Iolanda; Oprean, Radu; Rupérez, Francisco J; Barbas, Coral; Boulanger, Bruno; Hubert, Philippe

    2011-04-08

    HPLC separations of an unknown sample mixture and a pharmaceutical formulation were optimized using a recently developed chemometric methodology proposed by W. Dewé et al. in 2004 and improved by P. Lebrun et al. in 2008. This methodology is based on experimental designs used to model the retention times of the compounds of interest. The prediction accuracy and the robustness of the optimal separation, including an uncertainty study, were then evaluated. Finally, the design space (ICH Q8(R1) guideline) was computed as the probability for a criterion to lie in a selected range of acceptance. Furthermore, the chromatograms were read automatically: peak detection and peak matching were carried out with a previously developed methodology based on independent component analysis, published by B. Debrus et al. in 2009. These successful applications strengthen the high potential of these methodologies for the automated development of chromatographic methods.
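    A hedged sketch of the design-space idea (computing, under parameter uncertainty, the probability that a separation criterion lies in its acceptance range); the model and numbers below are illustrative assumptions, not the paper's:

```python
# Monte Carlo sketch of a "design space" probability: P(criterion in acceptance
# range) at a given operating condition, under model-parameter uncertainty.
import numpy as np

rng = np.random.default_rng(2)

def simulate_resolution(ph, n_draws=10_000):
    """Predicted critical-pair resolution at mobile-phase pH `ph`,
    with Monte Carlo draws representing parameter uncertainty."""
    slope = rng.normal(0.8, 0.1, n_draws)      # uncertain model coefficients
    intercept = rng.normal(-1.0, 0.2, n_draws)
    return intercept + slope * ph

def p_acceptance(ph, rs_min=1.5):
    rs = simulate_resolution(ph)
    return np.mean(rs >= rs_min)               # P(criterion in acceptance range)

for ph in (2.5, 3.5, 4.5):
    print(f"pH {ph}: P(Rs >= 1.5) = {p_acceptance(ph):.2f}")
```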

  11. Cost allocation methodology applicable to the temporary assistance for needy families program. Final rule.

    PubMed

    2008-07-23

    This final rule applies to the Temporary Assistance for Needy Families (TANF) program and requires States, the District of Columbia and the Territories (hereinafter referred to as the "States") to use the "benefiting program" cost allocation methodology in U.S. Office of Management and Budget (OMB) Circular A-87 (2 CFR part 225). It is the judgment and determination of HHS/ACF that the "benefiting program" cost allocation methodology is the appropriate methodology for the proper use of Federal TANF funds. The Personal Responsibility and Work Opportunity Reconciliation Act (PRWORA) of 1996 gave federally-recognized Tribes the opportunity to operate their own Tribal TANF programs. Federally-recognized Indian tribes operating approved Tribal TANF programs have always followed the "benefiting program" cost allocation methodology in accordance with OMB Circular A-87 (2 CFR part 225) and the applicable regulatory provisions at 45 CFR 286.45(c) and (d). This final rule contains no substantive changes to the proposed rule published on September 27, 2006.

  12. Development and Application of a Clinical Microsystem Simulation Methodology for Human Factors-Based Research of Alarm Fatigue.

    PubMed

    Kobayashi, Leo; Gosbee, John W; Merck, Derek L

    2017-07-01

    (1) To develop a clinical microsystem simulation methodology for alarm fatigue research with a human factors engineering (HFE) assessment framework and (2) to explore its application to the comparative examination of different approaches to patient monitoring and provider notification. Problems with the design, implementation, and real-world use of patient monitoring systems result in alarm fatigue. A multidisciplinary team is developing an open-source tool kit to promote bedside informatics research and mitigate alarm fatigue. Simulation, HFE, and computer science experts created a novel simulation methodology to study alarm fatigue. Featuring multiple interconnected simulated patient scenarios with scripted timeline, "distractor" patient care tasks, and triggered true and false alarms, the methodology incorporated objective metrics to assess provider and system performance. Developed materials were implemented during institutional review board-approved study sessions that assessed and compared an experimental multiparametric alerting system with a standard monitor telemetry system for subject response, use characteristics, and end-user feedback. A four-patient simulation setup featuring objective metrics for participant task-related performance and response to alarms was developed along with accompanying structured HFE assessment (questionnaire and interview) for monitor systems use testing. Two pilot and four study sessions with individual nurse subjects elicited true alarm and false alarm responses (including diversion from assigned tasks) as well as nonresponses to true alarms. In-simulation observation and subject questionnaires were used to test the experimental system's approach to suppressing false alarms and alerting providers. A novel investigative methodology applied simulation and HFE techniques to replicate and study alarm fatigue in controlled settings for systems assessment and experimental research purposes.

  13. Warehouses information system design and development

    NASA Astrophysics Data System (ADS)

    Darajatun, R. A.; Sukanta

    2017-12-01

    The materials/goods handling function is fundamental for companies to ensure the smooth running of their warehouses, and efficiency and organization within every aspect of the business are essential to gain a competitive advantage. The purpose of this research is the design and development of a Kanban-based inventory storage and delivery system. The application aims to make inventory stock checks more efficient and effective: users can easily input finished goods from the production department, the warehouse, customers, and suppliers. The master data are designed to be as complete as possible so that the application can be used in a variety of warehouse logistics processes. The author uses the Java programming language to develop the application, which is built as a Java Web application, while the database used is MySQL. The system development methodology used is the Waterfall methodology, with stages of Analysis, System Design, Implementation, Integration, and Operation and Maintenance. Data were collected through observation, interviews, and a literature review.

  14. Risk assessment for physical and cyber attacks on critical infrastructures.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Bryan J.; Sholander, Peter E.; Phelan, James M.

    2005-08-01

    Assessing the risk of malevolent attacks against large-scale critical infrastructures requires modifications to existing methodologies. Existing risk assessment methodologies consider physical security and cyber security separately. As such, they do not accurately model attacks that involve defeating both physical protection and cyber protection elements (e.g., hackers turning off alarm systems prior to forced entry). This paper presents a risk assessment methodology that accounts for both physical and cyber security. It also preserves the traditional security paradigm of detect, delay and respond, while accounting for the possibility that a facility may be able to recover from or mitigate the results of a successful attack before serious consequences occur. The methodology provides a means for ranking those assets most at risk from malevolent attacks. Because the methodology is automated, the analyst can also play 'what if' with mitigation measures to gain a better understanding of how to best expend resources towards securing the facilities. It is simple enough to be applied to large infrastructure facilities without developing highly complicated models. Finally, it is applicable to facilities with extensive security as well as those that are less well-protected.
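    As a schematic illustration of the kind of ranking such a methodology produces (the risk expression and all numbers below are invented for the example, not the report's model):

```python
# Minimal sketch of asset ranking: risk = P(attack) * P(attack succeeds,
# combining physical and cyber layers) * consequence, scaled by a mitigation
# factor for recovery. All values invented for illustration.
attack_scenarios = [
    # (asset, P_attack, P_defeat_physical, P_defeat_cyber, consequence, mitigation)
    ("control room", 0.10, 0.20, 0.60, 9.0, 0.5),
    ("substation",   0.30, 0.50, 0.10, 6.0, 0.8),
    ("data center",  0.20, 0.30, 0.70, 8.0, 0.6),
]

def risk(p_attack, p_phys, p_cyber, consequence, mitigation):
    # A blended attack must defeat both protection layers; mitigation scales
    # the consequence that actually materializes before recovery.
    return p_attack * (p_phys * p_cyber) * consequence * mitigation

ranked = sorted(attack_scenarios, key=lambda s: risk(*s[1:]), reverse=True)
for asset, *params in ranked:
    print(f"{asset:12s} risk = {risk(*params):.3f}")
```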

  15. Development and application of stir bar sorptive extraction with polyurethane foams for the determination of testosterone and methenolone in urine matrices.

    PubMed

    Sequeiros, R C P; Neng, N R; Portugal, F C M; Pinto, M L; Pires, J; Nogueira, J M F

    2011-04-01

    This work describes the development, validation, and application of a novel methodology for the determination of testosterone and methenolone in urine matrices by stir bar sorptive extraction using polyurethane foams [SBSE(PU)], followed by liquid desorption and high-performance liquid chromatography with diode array detection. The methodology was optimized in terms of extraction time, agitation speed, pH, ionic strength and organic modifier, as well as back-extraction solvent and desorption time. Under optimized experimental conditions, convenient accuracy was achieved, with average recoveries of 49.7 ± 8.6% for testosterone and 54.2 ± 4.7% for methenolone. Additionally, the methodology showed good precision (<9%), excellent linearity over the dynamic ranges (determination coefficients > 0.9963) and convenient detection limits (0.2-0.3 μg/L). When comparing the efficiency obtained by SBSE(PU) with that of the conventional polydimethylsiloxane phase [SBSE(PDMS)], yields up to four-fold higher are attained for the former under the same experimental conditions. The application of the proposed methodology to the analysis of testosterone and methenolone in urine matrices showed negligible matrix effects and good analytical performance.
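    Two of the routine computations behind validation figures like these can be sketched as follows; the LOD convention (3.3·s/slope) is one common choice, and all data below are invented:

```python
# Sketch of two routine validation computations: percent recovery from spiked
# samples and a calibration-based detection limit (LOD = 3.3 * s_blank / slope).
import numpy as np

# Recovery: measured vs. spiked concentration (ug/L)
spiked, measured = 10.0, np.array([4.8, 5.2, 4.9, 5.0])
recovery = measured.mean() / spiked * 100
rsd = measured.std(ddof=1) / measured.mean() * 100
print(f"mean recovery: {recovery:.1f}% (RSD {rsd:.1f}%)")

# Detection limit from a linear calibration
conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])         # standards, ug/L
area = np.array([12.0, 25.0, 49.0, 124.0, 251.0])   # detector response
slope, intercept = np.polyfit(conc, area, 1)
s_blank = 1.5                                       # sd of blank responses
print(f"LOD ~ {3.3 * s_blank / slope:.2f} ug/L")
```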

  16. Methodological Quality of Consensus Guidelines in Implant Dentistry

    PubMed Central

    Faggion, Clovis Mariano; Apaza, Karol; Ariza-Fritas, Tania; Málaga, Lilian; Giannakopoulos, Nikolaos Nikitas; Alarcón, Marco Antonio

    2017-01-01

    Background. Consensus guidelines are useful for improving clinical decision making, so the methodological evaluation of these guidelines is of paramount importance: low-quality information may lead to inadequate or harmful clinical decisions. Objective. To evaluate the methodological quality of consensus guidelines published in implant dentistry using a validated methodological instrument. Methods. The six implant dentistry journals with impact factors were scrutinised for consensus guidelines related to implant dentistry. Two assessors independently selected consensus guidelines, and four assessors independently evaluated their methodological quality using the Appraisal of Guidelines for Research & Evaluation (AGREE) II instrument. Disagreements in the selection and evaluation of guidelines were resolved by consensus. First, the consensus guidelines were analysed alone; then, systematic reviews conducted to support the guidelines were included in the analysis. Non-parametric statistics for dependent variables (the Wilcoxon signed rank test) were used to compare both groups. Results. Of 258 initially retrieved articles, 27 consensus guidelines were selected. Median scores in four domains (applicability, rigour of development, stakeholder involvement, and editorial independence), expressed as percentages of the maximum possible domain scores, were below 50% (medians of 26%, 30.7%, 41.7%, and 41.7%, respectively). The consensus guidelines and consensus guidelines + systematic reviews data sets could be compared for 19 guidelines, and the results showed significant improvements in all domain scores (p < 0.05). Conclusions. Methodological improvement of consensus guidelines published in major implant dentistry journals is needed. The findings of the present study may help researchers to better develop consensus guidelines in implant dentistry, which will improve the quality and trustworthiness of the information needed to make proper clinical decisions. PMID:28107405

  17. Nuclear power plant digital system PRA pilot study with the dynamic flow-graph methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yau, M.; Motamed, M.; Guarro, S.

    2006-07-01

    Current Probabilistic Risk Assessment (PRA) methodology is well established in analyzing hardware and some of the key human interactions. However, processes for analyzing the software functions of digital systems within a plant PRA framework, and for accounting for the digital system contribution to the overall risk, are not generally available, nor are they well understood and established. A recent study reviewed a number of methodologies that have potential applicability to modeling and analyzing digital systems within a PRA framework. This study identified the Dynamic Flow-graph Methodology (DFM) and the Markov methodology as the most promising tools. As a result of this study, a task was defined under the framework of a collaborative agreement between the U.S. Nuclear Regulatory Commission (NRC) and the Ohio State Univ. (OSU). The objective of this task is to set up benchmark systems representative of digital systems used in nuclear power plants and to evaluate DFM and the Markov methodology with these benchmark systems. The first benchmark system is a typical Pressurized Water Reactor (PWR) Steam Generator (SG) Feedwater System (FWS) level control system based on an earlier ASCA work for the U.S. NRC, upgraded with modern control laws. ASCA, Inc. is currently under contract to OSU to apply DFM to this benchmark system. The goal is to investigate the feasibility of using DFM to analyze and quantify digital system risk, and to integrate the DFM analytical results back into the plant event tree/fault tree PRA model. (authors)

  18. Spatially resolved hazard and exposure assessments: an example of lead in soil at Lavrion, Greece.

    PubMed

    Tristán, E; Demetriades, A; Ramsey, M H; Rosenbaum, M S; Stavrakis, P; Thornton, I; Vassiliades, E; Vergou, K

    2000-01-01

    Spatially resolved hazard assessment (SRHA) and spatially resolved exposure assessment (SREA) are methodologies that have been devised for assessing child exposure to soil containing environmental pollutants. These are based on either a quantitative or a semiquantitative approach. The feasibility of the methodologies has been demonstrated in a study assessing child exposure to Pb accessible in soil at the town of Lavrion in Greece. Using a quantitative approach, both measured and kriged concentrations of Pb in soil are compared with an "established" statutory threshold value. The probabilistic approach gives a refined classification of the contaminated land, since it takes into consideration the uncertainty in both the actual measurement and estimated kriged values. Two exposure assessment models (i.e., IEUBK and HESP) are used as the basis of the quantitative SREA methodologies. The significant correlation between the blood-Pb predictions, using the IEUBK model, and measured concentrations provides a partial validation of the method, because it allows for the uncertainty in the measurements and the lack of some site-specific measurements. The semiquantitative applications of SRHA and SREA incorporate both qualitative information (e.g., land use and dustiness of waste) and quantitative information (e.g., distance from wastes and distance from industry). The significant correlation between the results of these assessments and the measured blood-Pb levels confirms the robust nature of this approach. Successful application of these methodologies could reduce the cost of the assessment and allow areas to be prioritized for further investigation, remediation, or risk management.

  19. Selected analytical challenges in the determination of pharmaceuticals in drinking/marine waters and soil/sediment samples.

    PubMed

    Białk-Bielińska, Anna; Kumirska, Jolanta; Borecka, Marta; Caban, Magda; Paszkiewicz, Monika; Pazdro, Ksenia; Stepnowski, Piotr

    2016-03-20

    Recent developments and improvements in advanced instruments and analytical methodologies have made the detection of pharmaceuticals at low concentration levels in different environmental matrices possible. As a result of these advances, over the last 15 years residues of these compounds and their metabolites have been detected in different environmental compartments, and pharmaceuticals have now become recognized as so-called 'emerging' contaminants. To date, many papers have been published presenting the development of analytical methodologies for the determination of pharmaceuticals in aqueous and solid environmental samples. Many papers have also been published on the application of the new methodologies, mainly to the assessment of the environmental fate of pharmaceuticals. Although impressive improvements have undoubtedly been made, in order to fully understand the behavior of these chemicals in the environment, there are still numerous methodological challenges to be overcome. The aim of this paper, therefore, is to present a review of selected recent improvements and challenges in the determination of pharmaceuticals in environmental samples. Special attention has been paid to the strategies used and the current challenges (also in terms of Green Analytical Chemistry) that exist in the analysis of these chemicals in soils, marine environments and drinking waters. There is a particular focus on the applicability of modern sorbents such as carbon nanotubes (CNTs) in sample preparation techniques, to overcome some of the problems that exist in the analysis of pharmaceuticals in different environmental samples.

  20. A manual for conducting environmental impact studies.

    DOT National Transportation Integrated Search

    1971-01-01

    This report suggests methodologies which should enable an interdisciplinary team to assess community values. The methodologies are applicable to the conceptual, location, and design phases of highway planning, respectively. The approach employs a wei...

  1. Application of cooperative and non-cooperative games in large-scale water quantity and quality management: a case study.

    PubMed

    Mahjouri, Najmeh; Ardestani, Mojtaba

    2011-01-01

    In this paper, two cooperative and non-cooperative methodologies are developed for a large-scale water allocation problem in Southern Iran. The water shares of the water users and their net benefits are determined using optimization models having economic objectives with respect to the physical and environmental constraints of the system. The results of the two methodologies are compared based on the total obtained economic benefit, and the role of cooperation in utilizing a shared water resource is demonstrated. In both cases, the water quality in rivers satisfies the standards. Comparing the results of the two mentioned approaches shows the importance of acting cooperatively to achieve maximum revenue in utilizing a surface water resource while the river water quantity and quality issues are addressed.

  2. Application of cokriging techniques for the estimation of hail size

    NASA Astrophysics Data System (ADS)

    Farnell, Carme; Rigo, Tomeu; Martin-Vide, Javier

    2018-01-01

    There are primarily two ways of estimating hail size: the first is the direct interpolation of point observations, and the second is the transformation of remote sensing fields into measurements of hail properties. Both techniques have advantages and limitations as regards generating the resultant map of hail damage. This paper presents a new methodology that combines the above mentioned techniques in an attempt to minimise the limitations and take advantage of the benefits of interpolation and the use of remote sensing data. The methodology was tested for several episodes with good results being obtained for the estimation of hail size at practically all the points analysed. The study area presents a large database of hail episodes, and for this reason, it constitutes an optimal test bench.

  3. Effective World Modeling: Multisensor Data Fusion Methodology for Automated Driving

    PubMed Central

    Elfring, Jos; Appeldoorn, Rein; van den Dries, Sjoerd; Kwakkernaat, Maurice

    2016-01-01

    The number of perception sensors on automated vehicles increases due to the increasing number of advanced driver assistance system functions and their increasing complexity. Furthermore, fail-safe systems require redundancy, thereby increasing the number of sensors even further. A one-size-fits-all multisensor data fusion architecture is not realistic due to the enormous diversity in vehicles, sensors and applications. As an alternative, this work presents a methodology that can be used to effectively come up with an implementation to build a consistent model of a vehicle’s surroundings. The methodology is accompanied by a software architecture. This combination minimizes the effort required to update the multisensor data fusion system whenever sensors or applications are added or replaced. A series of real-world experiments involving different sensors and algorithms demonstrates the methodology and the software architecture. PMID:27727171
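    As a minimal illustration of one fusion step such a world model performs (a scalar inverse-variance combination, i.e., the static Kalman update; the sensor values below are invented):

```python
# Hedged sketch of fusing two sensors' estimates of the same tracked object
# by inverse-variance weighting, the scalar static case of a Kalman update.
def fuse(est_a, var_a, est_b, var_b):
    """Optimal linear fusion of two independent estimates."""
    w = var_b / (var_a + var_b)          # weight toward the less noisy sensor
    est = w * est_a + (1.0 - w) * est_b
    var = (var_a * var_b) / (var_a + var_b)
    return est, var

radar = (12.4, 0.9)    # object range [m] from radar, and its variance
camera = (11.8, 0.3)   # same object from vision, and its variance
est, var = fuse(*radar, *camera)
print(f"fused range: {est:.2f} m (variance {var:.2f})")
```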

  4. Application of validity theory and methodology to patient-reported outcome measures (PROMs): building an argument for validity.

    PubMed

    Hawkins, Melanie; Elsworth, Gerald R; Osborne, Richard H

    2018-07-01

    Data from subjective patient-reported outcome measures (PROMs) are now being used in the health sector to make or support decisions about individuals, groups and populations. Contemporary validity theorists define validity not as a statistical property of the test but as the extent to which empirical evidence supports the interpretation of test scores for an intended use. However, validity testing theory and methodology are rarely evident in the PROM validation literature. Application of this theory and methodology would provide structure for comprehensive validation planning to support improved PROM development and sound arguments for the validity of PROM score interpretation and use in each new context. This paper proposes the application of contemporary validity theory and methodology to PROM validity testing. The validity testing principles will be applied to a hypothetical case study with a focus on the interpretation and use of scores from a translated PROM that measures health literacy (the Health Literacy Questionnaire or HLQ). Although robust psychometric properties of a PROM are a pre-condition to its use, a PROM's validity lies in the sound argument that a network of empirical evidence supports the intended interpretation and use of PROM scores for decision making in a particular context. The health sector is yet to apply contemporary theory and methodology to PROM development and validation. The theoretical and methodological processes in this paper are offered as an advancement of the theory and practice of PROM validity testing in the health sector.

  5. Dynamic Decision Making under Uncertainty and Partial Information

    DTIC Science & Technology

    2017-01-30

    order to address these problems, we investigated efficient computational methodologies for dynamic decision making under uncertainty and partial...information. In the course of this research, we developed and studied efficient simulation-based methodologies for dynamic decision making under...uncertainty and partial information; (ii) studied the application of these decision making models and methodologies to practical problems, such as those

  6. Methodological and Pedagogical Potential of Reflection in Development of Contemporary Didactics

    ERIC Educational Resources Information Center

    Chupina, Valentina A.; Pleshakova, Anastasiia Yu.; Konovalova, Maria E.

    2016-01-01

    The relevance of the issue under research is preconditioned by the need of practical pedagogy to expand the methodological and methodical tools of contemporary didactics. The purpose of the article is to identify the methodological core of reflection as a form of thinking and to provide insight into it on the basis of the systematic attributes of the…

  7. Determining Faculty and Student Views: Applications of Q Methodology in Higher Education

    ERIC Educational Resources Information Center

    Ramlo, Susan

    2012-01-01

    William Stephenson specifically developed Q methodology, or Q, as a means of measuring subjectivity. Q has been used to determine perspectives/views in a wide variety of fields from marketing research to political science but less frequently in education. In higher education, the author has used Q methodology to determine views about a variety of…

  8. A theoretical and experimental investigation of propeller performance methodologies

    NASA Technical Reports Server (NTRS)

    Korkan, K. D.; Gregorek, G. M.; Mikkelson, D. C.

    1980-01-01

    This paper briefly covers aspects related to propeller performance by means of a review of propeller methodologies; a presentation of wind tunnel propeller performance data taken in the NASA Lewis Research Center 10 x 10 wind tunnel; a discussion of the predominant limitations of existing propeller performance methodologies; and a brief review of airfoil developments appropriate for propeller applications.

  9. Vertically aligned carbon nanotubes for microelectrode arrays applications.

    PubMed

    Castro Smirnov, J R; Jover, Eric; Amade, Roger; Gabriel, Gemma; Villa, Rosa; Bertran, Enric

    2012-09-01

    In this work, a methodology to fabricate carbon nanotube based electrodes using plasma enhanced chemical vapour deposition has been explored and defined. The final integrated microelectrode based devices should present specific properties that make them suitable for microelectrode array applications. The methodology studied has been focused on the preparation of highly regular and dense vertically aligned carbon nanotube (VACNT) mats compatible with the standard lithography used in microelectrode array technology.

  10. Benefit-cost methodology study with example application of the use of wind generators

    NASA Technical Reports Server (NTRS)

    Zimmer, R. P.; Justus, C. G.; Mason, R. M.; Robinette, S. L.; Sassone, P. G.; Schaffer, W. A.

    1975-01-01

    An example application of cost-benefit methodology is presented for the use of wind generators. The approach adopted for the example application consisted of the following activities: (1) surveying the available wind data and wind power system information, (2) developing models which quantitatively described wind distributions, wind power systems, and cost-benefit differences between conventional systems and wind power systems, and (3) applying the cost-benefit methodology to compare a conventional electrical energy generation system with systems which included wind power generators. Wind speed distribution data were obtained from sites throughout the contiguous United States and were used to compute plant factor contours on an annual and seasonal basis. Plant factor values (the ratio of average output power to rated power) are found to be as high as 0.6 (on an annual average basis) in portions of the central U.S. and in sections of the New England coastal area. Two types of wind power systems were selected for the application of the cost-benefit methodology. A cost-benefit model was designed and implemented on a computer to establish a practical tool for studying the relative costs and benefits of wind power systems under a variety of conditions and to efficiently and effectively perform the associated sensitivity analyses.
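    The plant factor defined in the abstract (average output power over rated power) can be sketched by integrating an assumed turbine power curve against an assumed Weibull wind-speed distribution; all parameters below are illustrative:

```python
# Sketch of a plant-factor computation: mean power under a Weibull wind model,
# divided by rated power. Power curve and Weibull parameters are invented.
import numpy as np

k, c = 2.0, 8.0                   # Weibull shape and scale (m/s) for a site
v_in, v_rated, v_out = 3.0, 12.0, 25.0
P_rated = 1.0                     # work in units of rated power

def power(v):
    """Idealized power curve: cubic ramp between cut-in and rated speed."""
    p = np.where((v >= v_in) & (v < v_rated),
                 P_rated * (v**3 - v_in**3) / (v_rated**3 - v_in**3), 0.0)
    return np.where((v >= v_rated) & (v <= v_out), P_rated, p)

v = np.linspace(0.0, 30.0, 3001)
pdf = (k / c) * (v / c) ** (k - 1) * np.exp(-((v / c) ** k))  # Weibull pdf
plant_factor = np.sum(power(v) * pdf) * (v[1] - v[0])         # numeric integral
print(f"plant factor ~ {plant_factor / P_rated:.2f}")
```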

  11. Methodology for Computational Fluid Dynamic Validation for Medical Use: Application to Intracranial Aneurysm.

    PubMed

    Paliwal, Nikhil; Damiano, Robert J; Varble, Nicole A; Tutino, Vincent M; Dou, Zhongwang; Siddiqui, Adnan H; Meng, Hui

    2017-12-01

    Computational fluid dynamics (CFD) is a promising tool to aid in clinical diagnoses of cardiovascular diseases. However, it uses assumptions that simplify the complexities of the real cardiovascular flow. Due to the high stakes in the clinical setting, it is critical to calculate the effect of these assumptions on the CFD simulation results. However, existing CFD validation approaches do not quantify the error in the simulation results due to the CFD solver's modeling assumptions; instead, they directly compare CFD simulation results against validation data. Thus, to quantify the accuracy of a CFD solver, we developed a validation methodology that calculates the CFD model error (arising from modeling assumptions). Our methodology identifies independent error sources in CFD and validation experiments, and calculates the model error by parsing out other sources of error inherent in simulation and experiments. To demonstrate the method, we simulated the flow field of a patient-specific intracranial aneurysm (IA) in the commercial CFD software star-ccm+. Particle image velocimetry (PIV) provided validation datasets for the flow field on two orthogonal planes. The average model error in the star-ccm+ solver was 5.63 ± 5.49% along the intersecting validation line of the orthogonal planes. Furthermore, we demonstrated that our validation method is superior to existing validation approaches by applying three representative existing validation techniques to our CFD and experimental dataset, and comparing the validation results. Our validation methodology offers a streamlined workflow to extract the "true" accuracy of a CFD solver.
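    A hedged sketch of parsing a model error out of a CFD-vs-PIV comparison; the formalization below follows the common ASME V&V 20 pattern (comparison error E = S - D judged against the combined numerical, input, and experimental uncertainties) and is not necessarily the authors' exact procedure; all numbers are invented:

```python
# Sketch of attributing a CFD-vs-experiment discrepancy to the model only
# beyond the combined numerical/input/experimental uncertainties (V&V 20 style).
import math

S, D = 0.254, 0.241          # simulated vs. measured velocity at a point (m/s)
u_num, u_input, u_exp = 0.004, 0.006, 0.008   # standard uncertainties

E = S - D                                      # comparison error
u_val = math.sqrt(u_num**2 + u_input**2 + u_exp**2)
print(f"comparison error E = {E:.3f} m/s, validation uncertainty = {u_val:.3f}")
print(f"model error lies in [{E - u_val:.3f}, {E + u_val:.3f}] m/s (approx.)")
```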

  12. High-rate RTK and PPP multi-GNSS positioning for small-scale dynamic displacements monitoring

    NASA Astrophysics Data System (ADS)

    Paziewski, Jacek; Sieradzki, Rafał; Baryła, Radosław; Wielgosz, Pawel

    2017-04-01

    The monitoring of dynamic displacements and deformations of engineering structures such as buildings, towers and bridges is of great interest for several practical and theoretical reasons, the most important being to provide the information required for the safe maintenance of the structures. The high temporal resolution and precision of GNSS observations predestine this technology for the most demanding applications in terms of accuracy, availability and reliability. The GNSS technique, supported by an appropriate processing methodology, may meet the specific demands and requirements of ground and structure monitoring. Thus, high-rate multi-GNSS signals may be used as a reliable source of information on the dynamic displacements of ground and engineering structures, also in real-time applications. In this study we present initial results of the application of precise relative GNSS positioning for the detection of small-scale (cm-level), high-temporal-resolution dynamic displacements. The methodology and algorithms applied in self-developed software allowing relative positioning using high-rate dual-frequency phase and pseudorange GPS+Galileo observations are also given. Additionally, an attempt was made to use the Precise Point Positioning technique for this application. The experiment used observations obtained from high-rate (20 Hz) geodetic receivers, with the dynamic displacements simulated using a specially constructed device moving a GNSS antenna with a dedicated amplitude and frequency. The obtained results indicate the possibility of detecting dynamic displacements of the GNSS antenna even at the level of a few millimetres, using both relative and Precise Point Positioning techniques, after suitable signal processing.

  13. Definition of a methodology for the management of geological heritage. An application to the Azores archipelago (Portugal)

    NASA Astrophysics Data System (ADS)

    Lima, Eva; Nunes, João; Brilha, José; Calado, Helena

    2013-04-01

    The conservation of the geological heritage requires the support of appropriate policies, which should result from the integration of nature conservation, environmental and land-use planning, and environmental education perspectives. There are several papers about inventory methodologies for geological heritage and its scientific, educational and tourism uses (e.g. Cendrero, 2000; Lago et al., 2000; Brilha, 2005; Carcavilla et al., 2007). However, management methodologies for geological heritage are still poorly developed. They should be included in environmental and land-use planning and nature conservation policies in order to support a holistic approach to natural heritage. This gap is explained by the fact that geoconservation is a young geoscience still in need of more basic scientific research, like any other geoscience (Henriques et al., 2011). It is necessary to establish protocols and mechanisms for the conservation and management of geological heritage. This management is complex because it needs to address not only the fragile natural features to be preserved but also legal, economic, cultural, educational and recreational aspects. In addition, a management methodology should ensure geosite conservation, local development and the dissemination of the geological heritage (Carcavilla et al., 2007). This work is part of a PhD project aiming to contribute to filling this gap in the geoconservation domain, specifically by establishing an appropriate methodology for the management of geological heritage, taking into account the natural diversity of geosites and the variety of natural and anthropic threats. The proposed methodology will be applied to the geological heritage of the Azores archipelago, whose management acquires particular importance and urgency after the decision of the Regional Government to create the Azores Geopark and its application to the European and Global Geoparks Networks. Acknowledgment: This work is part of a PhD research project funded by the Regional Fund for Science and Technology of the Azores Regional Government (PhD scholarship M3.1.2/F/033/201).

  14. Symmetrical windowing for quantum states in quasi-classical trajectory simulations: Application to electronically non-adiabatic processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cotton, Stephen J.; Miller, William H., E-mail: millerwh@berkeley.edu

    A recently described symmetrical windowing methodology [S. J. Cotton and W. H. Miller, J. Phys. Chem. A 117, 7190 (2013)] for quasi-classical trajectory simulations is applied here to the Meyer-Miller [H.-D. Meyer and W. H. Miller, J. Chem. Phys. 70, 3214 (1979)] model for the electronic degrees of freedom in electronically non-adiabatic dynamics. Results generated using this classical approach are observed to be in very good agreement with accurate quantum mechanical results for a variety of test applications, including problems where coherence effects are significant such as the challenging asymmetric spin-boson system.
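
    An illustrative toy of the symmetrical-windowing idea only, not the Meyer-Miller dynamics themselves: final electronic action variables are assigned to a quantum state when they fall inside a symmetric window around the integer quantum values. The window half-width gamma = 0.366 follows the cited 2013 paper; the two-state setup and the uniformly sampled "final actions" are hypothetical placeholders for real trajectory output.

```python
import numpy as np

gamma = 0.366                  # window half-width from Cotton & Miller (2013)
rng = np.random.default_rng(1)

# Stand-in for the electronic actions (n1, n2) of a two-state model at the
# end of an ensemble of classical trajectories.
n_final = rng.uniform(-0.5, 1.5, size=(100_000, 2))

def in_window(n, N):
    """Symmetric window: action n counts as quantum state N iff |n - N| < gamma."""
    return np.abs(n - N) < gamma

# State 1 corresponds to actions (1, 0), state 2 to (0, 1); both components
# must land inside their windows for the trajectory to contribute.
pop1 = (in_window(n_final[:, 0], 1) & in_window(n_final[:, 1], 0)).sum()
pop2 = (in_window(n_final[:, 0], 0) & in_window(n_final[:, 1], 1)).sum()

# Unassigned trajectories are discarded; populations are renormalized.
total = pop1 + pop2
print(f"P(state 1) = {pop1 / total:.3f}, P(state 2) = {pop2 / total:.3f}")
```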

  15. Ultracompact vibrometry measurement with nanometric accuracy using optical feedback

    NASA Astrophysics Data System (ADS)

    Jha, Ajit; Azcona, Francisco; Royo, Santiago

    2015-05-01

    The nonlinear dynamics of a semiconductor laser with optical feedback (OF), combined with direct current modulation of the laser, is demonstrated to suffice for the measurement of subwavelength changes in the position of a vibrating object. So far, classical Optical Feedback Interferometry (OFI) has been used to measure the vibration of an object provided its amplitude is greater than half the emission wavelength, with the resolution of the measurement limited to some tenths of the wavelength after processing. We present here a methodology which takes advantage of the combination of two different phenomena: continuous wave frequency modulation (CWFM), induced by direct modulation of the laser, and the nonlinear dynamics of the laser cavity subject to optical self-injection (OSI). The proposed methodology shows how to detect vibration amplitudes smaller than half the emission wavelength with resolution well below λ/2, extending the typical performance of OFI setups to very small amplitudes. A detailed mathematical model and simulation results are presented to support the proposed methodology, showing its ability to measure displacements at frequencies up to the MHz range, depending upon the modulation frequency. This approach makes the technique a suitable candidate for, among other applications, economical laser-based ultrasound measurement, with uses in nondestructive testing of materials (thickness, flaws, density, stresses). The simulation results confirm these figures of merit: detection of vibration amplitudes below λ/2 with resolution in the nanometre range.
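
    A hedged numerical sketch of the classical self-mixing (optical feedback) phase equation that underlies this family of techniques, phi_0 = phi_F + C sin(phi_F + arctan(alpha)); the feedback parameter C, linewidth enhancement factor alpha, wavelength and vibration below are illustrative values, not those of the paper.

```python
import numpy as np
from scipy.optimize import brentq

C, alpha = 0.7, 4.0            # feedback level and linewidth enhancement (assumed)
lam = 785e-9                   # emission wavelength, m (assumed)

t = np.linspace(0, 1e-3, 2000)
L_ext = 0.10 + 50e-9 * np.sin(2 * np.pi * 5e3 * t)   # 50 nm, 5 kHz vibration
phi0 = 4 * np.pi * L_ext / lam                       # free-running phase

def solve_phase(p0):
    # Excess-phase equation; for C < pi the bracket [p0-pi, p0+pi] changes sign.
    f = lambda pf: pf + C * np.sin(pf + np.arctan(alpha)) - p0
    return brentq(f, p0 - np.pi, p0 + np.pi)

phiF = np.array([solve_phase(p) for p in phi0])
power_mod = np.cos(phiF)       # normalized self-mixing power modulation
print(f"peak-to-peak power modulation: {power_mod.max() - power_mod.min():.3f}")
```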

  16. A methodology for the characterization and diagnosis of cognitive impairments-Application to specific language impairment.

    PubMed

    Oliva, Jesús; Serrano, J Ignacio; del Castillo, M Dolores; Iglesias, Angel

    2014-06-01

    The diagnosis of mental disorders is in most cases very difficult because of the high heterogeneity and overlap between the associated cognitive impairments. Furthermore, early and individualized diagnosis is crucial. In this paper, we propose a methodology to support the individualized characterization and diagnosis of cognitive impairments. The methodology can also be used as a test platform for existing theories on the causes of the impairments. We use computational cognitive modeling to gather information on the cognitive mechanisms underlying normal and impaired behavior. We then use this information to feed machine-learning algorithms to individually characterize the impairment and to differentiate between normal and impaired behavior. We apply the methodology to the particular case of specific language impairment (SLI) in Spanish-speaking children. The proposed methodology begins by defining a task in which unimpaired and impaired individuals exhibit behavioral differences. Next, we build a computational cognitive model of that task and individualize it: we build a cognitive model for each participant and optimize its parameter values to fit that participant's behavior. Finally, we use the optimized parameter values to feed different machine-learning algorithms. The methodology was applied to an existing database of 48 Spanish-speaking children (24 typically developing and 24 children with SLI), using clustering techniques for the characterization and different classifier techniques for the diagnosis. The characterization results show three well-differentiated groups that can be associated with the three main theories on SLI. Using a leave-one-subject-out testing methodology, all the classifiers except the decision tree (DT) produced sensitivity, specificity and area-under-curve values above 90%, reaching 100% in some cases. The results show that our methodology is able to find relevant information on the underlying cognitive mechanisms and to use it appropriately to provide better diagnosis than existing techniques. It is also worth noting that the individualized characterization obtained using our methodology could be extremely helpful in designing individualized therapies. Moreover, the proposed methodology could easily be extended to other languages and even to other cognitive impairments not necessarily related to language. Copyright © 2014 Elsevier B.V. All rights reserved.
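
    A hypothetical sketch of the leave-one-subject-out evaluation scheme described above: each row of X stands for one participant's optimized cognitive-model parameters and y marks SLI (1) versus typical development (0). The data, the four-parameter model and the logistic-regression classifier are placeholders, not the study's learners.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0.0, 1.0, (24, 4)),   # 24 typically developing children
               rng.normal(1.0, 1.0, (24, 4))])  # 24 SLI children (shifted params)
y = np.array([0] * 24 + [1] * 24)

# Leave-one-subject-out: each child is classified by a model trained on the rest.
clf = LogisticRegression(max_iter=1000)
pred = cross_val_predict(clf, X, y, cv=LeaveOneOut())

tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
print(f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")
```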

  17. Decision-theoretic methodology for reliability and risk allocation in nuclear power plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cho, N.Z.; Papazoglou, I.A.; Bari, R.A.

    1985-01-01

    This paper describes a methodology for allocating reliability and risk to various reactor systems, subsystems, components, operations, and structures in a consistent manner, based on a set of global safety criteria which are not rigid. The problem is formulated as a multiattribute decision analysis paradigm; the multiobjective optimization, which is performed on a PRA model and reliability cost functions, serves as the guiding principle for reliability and risk allocation. The concept of noninferiority is used in the multiobjective optimization problem. Finding the noninferior solution set is the main theme of the current approach. The assessment of the decision maker's preferences could then be performed more easily on the noninferior solution set. Some results of the methodology applications to a nontrivial risk model are provided and several outstanding issues such as generic allocation and preference assessment are discussed.
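
    A compact sketch of the noninferiority screening step named above: a candidate allocation is kept only if no other candidate is at least as good in every objective and strictly better in at least one. The two objectives (risk and cost, both minimized) and the random candidates are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
candidates = rng.random((200, 2))          # columns: [risk, cost], to minimize

def noninferior(points):
    """Indices of the noninferior (Pareto-optimal) points under minimization."""
    keep = []
    for i, p in enumerate(points):
        dominated = np.any(np.all(points <= p, axis=1) &
                           np.any(points < p, axis=1))
        if not dominated:
            keep.append(i)
    return np.array(keep)

front = noninferior(candidates)
print(f"{front.size} noninferior allocations out of {len(candidates)}")
```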

  18. Response-Guided Community Detection: Application to Climate Index Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bello, Gonzalo; Angus, Michael; Pedemane, Navya

    Discovering climate indices (time series that summarize spatiotemporal climate patterns) is a key task in the climate science domain. In this work, we approach this task as a problem of response-guided community detection; that is, identifying communities in a graph associated with a response variable of interest. To this end, we propose a general strategy for response-guided community detection that explicitly incorporates information on the response variable during the community detection process, and introduce a graph representation of spatiotemporal data that leverages information from multiple variables. We apply our proposed methodology to the discovery of climate indices associated with seasonal rainfall variability. Our results suggest that our methodology is able to capture the underlying patterns known to be associated with the response variable of interest and to improve its predictability compared to existing methodologies for data-driven climate index discovery and official forecasts.
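
    A hedged sketch of the general idea, not the paper's construction: build a graph from correlations between grid-cell time series, detect communities, and rank each community by how well its mean series (a candidate climate index) correlates with the response. The data, edge threshold and greedy-modularity detector are all assumptions.

```python
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

rng = np.random.default_rng(4)
series = rng.normal(size=(60, 120))        # 60 grid cells, 120 time steps
response = series[:10].mean(axis=0) + rng.normal(0, 0.3, 120)  # toy rainfall

# Graph over grid cells, edges weighted by absolute correlation.
corr = np.corrcoef(series)
G = nx.Graph()
G.add_nodes_from(range(60))
ii, jj = np.triu_indices(60, k=1)
for i, j in zip(ii, jj):
    if abs(corr[i, j]) > 0.3:              # arbitrary edge threshold
        G.add_edge(i, j, weight=abs(corr[i, j]))

# Score each detected community by its index's correlation with the response.
for comm in greedy_modularity_communities(G, weight="weight"):
    index = series[sorted(comm)].mean(axis=0)
    score = abs(np.corrcoef(index, response)[0, 1])
    print(f"community of {len(comm):2d} cells: |corr with response| = {score:.2f}")
```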

  19. The role of structural characteristics in video-game play motivation: a Q-methodology study.

    PubMed

    Westwood, Dave; Griffiths, Mark D

    2010-10-01

    Until recently, there has been very little naturalistic study of what gaming experiences are like and how gaming fits into people's lives. Using a recently developed structural characteristic taxonomy of video games, this study examined the psycho-structural elements of computer games that motivate gamers to play them. Using Q-sort methodology, 40 gamers participated in an online Q-sort task. Results identified six distinct types of gamers based on the factors generated: (a) story-driven solo gamers; (b) social gamers; (c) solo limited gamers; (d) hardcore online gamers; (e) solo control/identity gamers; and (f) casual gamers. These gaming types are discussed, and a brief evaluation of similar and unique elements of the different types of gamer is also offered. The current study shows Q-methodology to be a relevant and applicable method in the psychological research of gaming.

  20. The maximum specific hydrogen-producing activity of anaerobic mixed cultures: definition and determination

    PubMed Central

    Mu, Yang; Yang, Hou-Yun; Wang, Ya-Zhou; He, Chuan-Shu; Zhao, Quan-Bao; Wang, Yi; Yu, Han-Qing

    2014-01-01

    Fermentative hydrogen production from wastes has many advantages compared to various chemical methods. A methodology for characterizing the hydrogen-producing activity of anaerobic mixed cultures is essential for monitoring reactor operation in fermentative hydrogen production; however, such standardized methodologies are lacking. In the present study, a new index, the maximum specific hydrogen-producing activity (SHAm) of anaerobic mixed cultures, was proposed, and a reliable and simple method, named the SHAm test, was developed to determine it. Furthermore, the influences of various parameters on the determination of the SHAm value of anaerobic mixed cultures were evaluated. Additionally, the SHAm assay was tested for different types of substrates and bacterial inocula. Our results demonstrate that this novel SHAm assay is a rapid, accurate and simple methodology for determining the hydrogen-producing activity of anaerobic mixed cultures. Thus, application of this approach is beneficial to establishing a stable anaerobic hydrogen-producing system. PMID:24912488
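
    A hypothetical numeric sketch of one plausible way to obtain such an activity index: take the steepest slope of the cumulative H2 curve (the maximum production rate) and normalize it by the biomass in the assay. The logistic production curve, biomass value and units are illustrative assumptions, not the paper's protocol.

```python
import numpy as np

t = np.linspace(0, 24, 97)                      # incubation time, h
H2 = 200 / (1 + np.exp(-(t - 8) / 1.5))         # cumulative H2, mL (toy logistic)
biomass = 1.2                                   # g VSS in the bottle (assumed)

rate = np.gradient(H2, t)                       # instantaneous rate, mL H2 / h
sham = rate.max() / biomass                     # specific activity
print(f"SHAm ~ {sham:.1f} mL H2 per g VSS per h, "
      f"reached at t = {t[rate.argmax()]:.1f} h")
```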

  1. Stabilized Finite Elements in FUN3D

    NASA Technical Reports Server (NTRS)

    Anderson, W. Kyle; Newman, James C.; Karman, Steve L.

    2017-01-01

    A Streamline Upwind Petrov-Galerkin (SUPG) stabilized finite-element discretization has been implemented as a library in the FUN3D unstructured-grid flow solver. Motivation for the selection of this methodology is given, details of the implementation are provided, and the discretization for the interior scheme is verified for linear and quadratic elements by using the method of manufactured solutions. A methodology is also described for capturing shocks, and simulation results are compared with the finite-volume formulation that is currently the primary method employed for routine engineering applications. The finite-element methodology is demonstrated to be more accurate than the finite-volume scheme, particularly on tetrahedral meshes, where the solutions obtained using the finite-volume scheme can suffer from adverse effects caused by bias in the grid. Although no effort has been made to date to optimize computational efficiency, the finite-element scheme is competitive with the finite-volume scheme in terms of computer time to reach convergence.
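
    A sketch of the verification idea mentioned above, applied for brevity to a 1-D Poisson problem rather than to FUN3D's SUPG discretization: manufacture u(x) = sin(pi x), derive the forcing f = pi^2 sin(pi x), solve -u'' = f on two grids, and confirm the observed order of accuracy of the scheme.

```python
import numpy as np

def solve(n):
    """Second-order FD solve of -u'' = f on (0,1), u(0)=u(1)=0; returns max error."""
    x = np.linspace(0.0, 1.0, n + 1)
    h = 1.0 / n
    f = np.pi**2 * np.sin(np.pi * x[1:-1])     # manufactured forcing term
    A = (np.diag(2.0 * np.ones(n - 1))
         - np.diag(np.ones(n - 2), 1)
         - np.diag(np.ones(n - 2), -1)) / h**2
    u = np.linalg.solve(A, f)
    return np.max(np.abs(u - np.sin(np.pi * x[1:-1])))

e_coarse, e_fine = solve(32), solve(64)
print(f"observed order of accuracy ~ {np.log2(e_coarse / e_fine):.2f}")  # expect ~2
```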

  2. Contribution to the application of two-colour imaging to diesel combustion

    NASA Astrophysics Data System (ADS)

    Payri, F.; Pastor, J. V.; García, J. M.; Pastor, J. M.

    2007-08-01

    The two-colour method (2C) is a well-known methodology for the estimation of flame temperature and the soot-related KL factor. A 2C imaging system has been built with a single charge-coupled device (CCD) camera for visualization of the diesel flame in a single-cylinder 2-stroke engine with optical accesses. The work presented here focuses on methodological aspects. In that sense, the influence of calibration uncertainties on the measured temperature and KL factor has been analysed. In addition, a theoretical study is presented that links the true flame temperature and soot distributions with those derived from the 2C images. Finally, an experimental study has been carried out in order to show the influence of injection pressure, air density and air temperature on the 2C-derived parameters. Comparison with the expected results has shown the limitations of this methodology for diesel flame analysis.
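
    A hedged numerical sketch of the two-colour principle: with the Hottel-Broughton soot emissivity eps(lam) = 1 - exp(-KL / lam^alpha) (lam in micrometres, alpha of about 1.38 is a common assumption), the radiances measured at two wavelengths give two equations in the two unknowns T and KL. The wavelengths and the synthetic "measurement" below are placeholders, not the paper's calibration.

```python
import numpy as np
from scipy.optimize import fsolve

C1 = 1.19104e8      # first radiation constant for lam in um (W um^4 m^-2 sr^-1)
C2 = 1.43877e4      # second radiation constant, um K
ALPHA = 1.38        # Hottel-Broughton dispersion exponent (assumed)

def planck(lam_um, T):
    return C1 / (lam_um**5 * (np.exp(C2 / (lam_um * T)) - 1.0))

def radiance(lam_um, T, KL):
    return (1.0 - np.exp(-KL / lam_um**ALPHA)) * planck(lam_um, T)

lam = np.array([0.55, 0.70])                 # two colour channels, um (assumed)
I_meas = radiance(lam, 2100.0, 1.3)          # synthetic measurement: T = 2100 K

def residual(p):
    T, KL = p
    return radiance(lam, T, KL) - I_meas

T_est, KL_est = fsolve(residual, x0=[1800.0, 1.0])
print(f"recovered T = {T_est:.0f} K, KL = {KL_est:.2f}")
```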

  3. Fuzzy logic controllers for electrotechnical devices - On-site tuning approach

    NASA Astrophysics Data System (ADS)

    Hissel, D.; Maussion, P.; Faucher, J.

    2001-12-01

    Fuzzy logic nowadays offers an interesting alternative for the design of nonlinear control laws for electrical or electromechanical systems. However, due to the large number of tuning parameters, this kind of control is used in only a few industrial applications. This paper proposes a new, very simple, on-site tuning strategy for a PID-like fuzzy logic controller. Using the design-of-experiments methodology, we propose sets of optimized pre-established settings for this kind of fuzzy controller. The proposed parameters depend on only a single on-site open-loop identification test. In this sense, the on-site tuning methodology is analogous to the Ziegler-Nichols approach for conventional controllers. Experimental results (on a permanent-magnet synchronous motor and on a DC/DC converter) demonstrate the effectiveness of this tuning methodology. Finally, the field of validity of the proposed pre-established settings is given.

  4. The maximum specific hydrogen-producing activity of anaerobic mixed cultures: definition and determination

    NASA Astrophysics Data System (ADS)

    Mu, Yang; Yang, Hou-Yun; Wang, Ya-Zhou; He, Chuan-Shu; Zhao, Quan-Bao; Wang, Yi; Yu, Han-Qing

    2014-06-01

    Fermentative hydrogen production from wastes has many advantages compared to various chemical methods. A methodology for characterizing the hydrogen-producing activity of anaerobic mixed cultures is essential for monitoring reactor operation in fermentative hydrogen production; however, such standardized methodologies are lacking. In the present study, a new index, the maximum specific hydrogen-producing activity (SHAm) of anaerobic mixed cultures, was proposed, and a reliable and simple method, named the SHAm test, was developed to determine it. Furthermore, the influences of various parameters on the determination of the SHAm value of anaerobic mixed cultures were evaluated. Additionally, the SHAm assay was tested for different types of substrates and bacterial inocula. Our results demonstrate that this novel SHAm assay is a rapid, accurate and simple methodology for determining the hydrogen-producing activity of anaerobic mixed cultures. Thus, application of this approach is beneficial to establishing a stable anaerobic hydrogen-producing system.

  5. Analysis of torque transmitting behavior and wheel slip prevention control during regenerative braking for high speed EMU trains

    NASA Astrophysics Data System (ADS)

    Xu, Kun; Xu, Guo-Qing; Zheng, Chun-Hua

    2016-04-01

    The wheel-rail adhesion control for regenerative braking systems of high-speed electric multiple unit trains is crucial to maintaining stability, improving adhesion utilization, and achieving deep energy recovery. Technical challenges remain, mainly because of the nonlinear, uncertain, and varying features of wheel-rail contact conditions. This research analyzes the torque transmitting behavior during regenerative braking and proposes a novel methodology to detect wheel-rail adhesion stability. Applications to wheel slip prevention during braking are then investigated, and an optimal slip ratio control scheme is proposed, based on a novel optimal reference generation of the slip ratio and a robust sliding mode control. The proposed methodology achieves optimal braking performance without wheel-rail contact information. Numerical simulation results for uncertain slippery rails verify the effectiveness of the proposed methodology.
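
    A minimal quarter-wheel sketch of slip-ratio regulation with a sliding-mode switching law, in the spirit of the scheme described; the vehicle and wheel parameters, adhesion-slip curve and gains are illustrative stand-ins, not the paper's EMU model. Braking slip is defined here as s = (v - w*r) / v.

```python
import numpy as np

m, J, r = 1000.0, 80.0, 0.46     # mass per wheel (kg), wheel inertia, radius (assumed)
dt, s_ref = 1e-3, 0.12           # time step (s) and target slip ratio

def mu(s):
    """Toy adhesion-slip curve peaking near s = 0.15."""
    return 0.25 * (2.0 * 0.15 * s) / (0.15**2 + s**2)

v, w, Tb = 30.0, 30.0 / r, 0.0   # initial speed (m/s), wheel speed, brake torque
for _ in range(5000):
    s = max((v - w * r) / v, 0.0)
    sigma = s - s_ref            # sliding surface
    Tb = max(Tb - 50.0 * np.sign(sigma), 0.0)   # switching (reaching) law, N m
    Fx = mu(s) * m * 9.81        # adhesion force at the contact patch
    v += (-Fx / m) * dt
    w = max(w + ((Fx * r - Tb) / J) * dt, 0.0)
    if v < 1.0:
        break

print(f"final slip ratio = {s:.3f} (target {s_ref})")
```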

  6. Semi-Empirical Prediction of Aircraft Low-Speed Aerodynamic Characteristics

    NASA Technical Reports Server (NTRS)

    Olson, Erik D.

    2015-01-01

    This paper lays out a comprehensive methodology for computing a low-speed, high-lift polar, without requiring additional details about the aircraft design beyond what is typically available at the conceptual design stage. Introducing low-order, physics-based aerodynamic analyses allows the methodology to be more applicable to unconventional aircraft concepts than traditional, fully-empirical methods. The methodology uses empirical relationships for flap lift effectiveness, chord extension, drag-coefficient increment and maximum lift coefficient of various types of flap systems as a function of flap deflection, and combines these increments with the characteristics of the unflapped airfoils. Once the aerodynamic characteristics of the flapped sections are known, a vortex-lattice analysis calculates the three-dimensional lift, drag and moment coefficients of the whole aircraft configuration. This paper details the results of two validation cases: a supercritical airfoil model with several types of flaps; and a 12-foot, full-span aircraft model with slats and double-slotted flaps.

  7. Rapid Prototyping a Collections-Based Mobile Wayfinding Application

    ERIC Educational Resources Information Center

    Hahn, Jim; Morales, Alaina

    2011-01-01

    This research presents the results of a project that investigated how students use a library developed mobile app to locate books in the library. The study employed a methodology of formative evaluation so that the development of the mobile app would be informed by user preferences for next generation wayfinding systems. A key finding is the…

  8. A Methodology in the Teaching Process of the Derivative and Its Motivation.

    ERIC Educational Resources Information Center

    Vasquez-Martinez, Claudio-Rafael

    The development of the derivative, being part of calculus, which is in permanent dialectic, demands on the one hand an analytical, deductive study, and on the other the application of rochrematic methods and sources of resources within the calculus of the derivative, which allows knowledge to be confronted dialectically in its different phases and the results to be tested.…

  9. Behavioral Self-Monitoring of Safety and Productivity in the Workplace: A Methodological Primer and Quantitative Literature Review

    ERIC Educational Resources Information Center

    Olson, Ryan; Winchester, Jamey

    2008-01-01

    Workplace applications of behavioral self-monitoring (BSM) methods have been studied periodically for over 35 years, yet the literature has never been systematically reviewed. Recent occupational safety interventions including BSM resulted in relatively large behavior changes. Moreover, BSM methods are functional for addressing a broad range of…

  10. Adapting Evidence-Based Mental Health Treatments in Community Settings: Preliminary Results from a Partnership Approach

    ERIC Educational Resources Information Center

    Southam-Gerow, Michael A.; Hourigan, Shannon E.; Allin, Robert B., Jr.

    2009-01-01

    This article describes the application of a university-community partnership model to the problem of adapting evidence-based treatment approaches in a community mental health setting. Background on partnership research is presented, with consideration of methodological and practical issues related to this kind of research. Then, a rationale for…

  11. Effects of Kindergarten Retention on Children's Social-Emotional Development: An Application of Propensity Score Method to Multivariate, Multilevel Data

    ERIC Educational Resources Information Center

    Hong, Guanglei; Yu, Bing

    2008-01-01

    This study examines the effects of kindergarten retention on children's social-emotional development in the early, middle, and late elementary years. Previous studies have generated mixed results partly due to some major methodological challenges, including selection bias, measurement error, and divergent perceptions of multiple respondents in…

  12. The Development and Application of Distance Learning Courses on the Internet.

    ERIC Educational Resources Information Center

    Fuks, Hugo; Gerosa, Marco Aurelio; Lucena, Carlos Jose Pereira de

    2002-01-01

    Presents the methodology, results, and difficulties encountered in the development and delivery of a course through the Internet at a university in Rio de Janeiro. Provides a model for group work, including group discussions; and shows how a Web-based environment can be used to provide support and to facilitate cooperative learning. (Author/LRW)

  13. Methodology for estimating human perception to tremors in high-rise buildings

    NASA Astrophysics Data System (ADS)

    Du, Wenqi; Goh, Key Seng; Pan, Tso-Chien

    2017-07-01

    Human perception of tremors during earthquakes in high-rise buildings is usually associated with psychological discomfort such as fear and anxiety. This paper presents a methodology for estimating the level of perception of tremors for occupants living in high-rise buildings subjected to ground motion excitations. Unlike other approaches based on empirical or historical data, the proposed methodology performs a regression analysis using the analytical results of two generic models of 15 and 30 stories. Ground motions recorded in Singapore are collected and modified for structural response analyses. Simple predictive models are then developed to estimate the perception level of tremors based on a proposed ground motion intensity parameter: the average response spectrum intensity in the period range between 0.1 and 2.0 s. These models can be used to predict the percentage of occupants in high-rise buildings who may perceive tremors at a given ground motion intensity. Furthermore, the models are validated against two recent tremor events reportedly felt in Singapore. The estimated results are found to match reasonably well with the reports in the local newspapers and from the authorities. The proposed methodology is applicable to urban regions where people living in high-rise buildings might feel tremors during earthquakes.
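
    A small sketch of the intensity parameter as defined above: the mean of the response spectrum over the 0.1-2.0 s period band. The spectral shape used here is a synthetic placeholder; in practice the spectrum would be computed from a recorded or simulated ground motion.

```python
import numpy as np

T = np.linspace(0.02, 4.0, 400)        # spectral periods, s (uniformly spaced)
# Toy pseudo-acceleration spectrum (g), log-normal bump centred near 0.5 s.
Sa = 0.08 * np.exp(-((np.log(T) - np.log(0.5))**2) / 0.8)

# Average spectral intensity: mean ordinate over the 0.1-2.0 s band.
mask = (T >= 0.1) & (T <= 2.0)
avg_si = Sa[mask].mean()
print(f"average response spectrum intensity (0.1-2.0 s): {avg_si:.4f} g")
```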

  14. Computational simulation of probabilistic lifetime strength for aerospace materials subjected to high temperature, mechanical fatigue, creep and thermal fatigue

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Bast, Callie C.; Trimble, Greg A.

    1992-01-01

    This report presents the results of the fourth-year effort of a research program conducted for NASA-LeRC by the University of Texas at San Antonio (UTSA). The research included ongoing development of methodology that provides probabilistic lifetime strength of aerospace materials via computational simulation. A probabilistic material strength degradation model, in the form of a randomized multifactor interaction equation, is postulated for strength degradation of structural components of aerospace propulsion systems subject to a number of effects or primitive variables. These primitive variables may include high temperature, fatigue or creep. In most cases, strength is reduced as a result of the action of a variable. This multifactor interaction strength degradation equation has been randomized and is included in the computer program PROMISS. Also included in the research is the development of methodology to calibrate the above-described constitutive equation using actual experimental materials data together with regression analysis of those data, thereby predicting values for the empirical material constants for each effect or primitive variable. This regression methodology is included in the computer program PROMISC. Actual experimental materials data were obtained from industry and the open literature for materials typical of applications in aerospace propulsion system components. Material data for Inconel 718 have been analyzed using the developed methodology.
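
    A hedged sketch of a randomized multifactor interaction strength-degradation equation of the general form used in this line of work, S/S0 = prod_i ((A_i,U - A_i)/(A_i,U - A_i,0))^a_i, where A_i is the current value of primitive variable i, A_i,U its ultimate value, A_i,0 a reference value and a_i an empirical exponent. All numbers and distributions below are illustrative, not PROMISS inputs.

```python
import numpy as np

rng = np.random.default_rng(5)
S0 = 1280.0   # reference strength, MPa (illustrative)

# (current, ultimate, reference, mean exponent) for two primitive variables:
# temperature (K) and fatigue cycles.
factors = [(900.0, 1350.0, 294.0, 0.50),
           (1.0e5, 1.0e7,  1.0e2, 0.25)]

def strength(exponents):
    s = S0
    for (A, AU, A0, _), a in zip(factors, exponents):
        s *= ((AU - A) / (AU - A0)) ** a
    return s

# "Randomized": treat the empirical exponents as normally distributed.
mean_exps = np.array([a for *_, a in factors])
samples = np.array([strength(rng.normal(mean_exps, 0.05))
                    for _ in range(10_000)])
print(f"mean strength = {samples.mean():.0f} MPa, "
      f"5th percentile = {np.percentile(samples, 5):.0f} MPa")
```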

  15. Computational simulation of probabilistic lifetime strength for aerospace materials subjected to high temperature, mechanical fatigue, creep, and thermal fatigue

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Bast, Callie C.; Trimble, Greg A.

    1992-01-01

    The results of the fourth-year effort of a research program conducted for NASA-LeRC by The University of Texas at San Antonio (UTSA) are presented. The research included ongoing development of methodology that provides probabilistic lifetime strength of aerospace materials via computational simulation. A probabilistic material strength degradation model, in the form of a randomized multifactor interaction equation, is postulated for strength degradation of structural components of aerospace propulsion systems subjected to a number of effects or primitive variables. These primitive variables may include high temperature, fatigue, or creep. In most cases, strength is reduced as a result of the action of a variable. This multifactor interaction strength degradation equation was randomized and is included in the computer program PROMISS. Also included in the research is the development of methodology to calibrate the above-described constitutive equation using actual experimental materials data together with regression analysis of those data, thereby predicting values for the empirical material constants for each effect or primitive variable. This regression methodology is included in the computer program PROMISC. Actual experimental materials data were obtained from industry and the open literature for materials typical of applications in aerospace propulsion system components. Material data for Inconel 718 were analyzed using the developed methodology.

  16. Definition and applications of a versatile chemical pollution footprint methodology.

    PubMed

    Zijp, Michiel C; Posthuma, Leo; van de Meent, Dik

    2014-09-16

    Because of the great variety in the behavior and modes of action of chemicals, impact assessment of multiple substances is complex, as is the communication of its results. Given the calls for cumulative impact assessments, we developed a methodology that expresses the expected cumulative impacts of mixtures of chemicals on the aquatic ecosystems of a region and subsequently allows these results to be presented as a chemical pollution footprint, in short: a chemical footprint. Setting and using a boundary for chemical pollution is part of the methodology. Two case studies were executed to test and illustrate the methodology. The first case illustrates that the production and use of organic substances in Europe, judged against the European water volume, stay within the currently set policy boundaries for chemical pollution. The second case shows that the use of pesticides in Northwestern Europe, judged against the regional water volume, has exceeded the set boundaries, while showing a declining trend over time. The impact of mixtures of substances in the environment could thus be expressed as a chemical footprint, and the relative contribution of individual substances to that footprint could be evaluated. These features are a novel type of information to support risk management, by helping to prioritize management among chemicals and environmental compartments.

  17. 78 FR 66681 - Census Advisory Committees

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-06

    ..., filing of petitions and applications and agency statements of organization and functions are examples... policies, research and methodology, tests, operations, communications/messaging and other activities to..., socioeconomic, linguistic, technological, methodological, geographic, behavioral and operational variables...

  18. Application of a statewide intermodal freight planning methodology.

    DOT National Transportation Integrated Search

    2001-08-01

    Anticipating the need for Virginia to comply with the new freight planning requirements mandated by ISTEA and TEA-21, the Virginia Transportation Research Council in 1998 developed a Statewide Intermodal Freight Transportation Planning Methodology, w...

  19. Proposed Risk-Informed Seismic Hazard Periodic Reevaluation Methodology for Complying with DOE Order 420.1C

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kammerer, Annie

    Department of Energy (DOE) nuclear facilities must comply with DOE Order 420.1C Facility Safety, which requires that all such facilities review their natural phenomena hazards (NPH) assessments no less frequently than every ten years. The Order points the reader to Standard DOE-STD-1020-2012. In addition to providing a discussion of the applicable evaluation criteria, the Standard references other documents, including ANSI/ANS-2.29-2008 and NUREG-2117. These documents provide supporting criteria and approaches for evaluating the need to update an existing probabilistic seismic hazard analysis (PSHA). All of the documents are consistent at a high level regarding the general conceptual criteria that should be considered. However, none of the documents provides step-by-step detailed guidance on the required or recommended approach for evaluating the significance of new information and determining whether or not an existing PSHA should be updated. Further, all of the conceptual approaches and criteria given in these documents deal with changes that may have occurred in the knowledge base that might impact the inputs to the PSHA, the calculated hazard itself, or the technical basis for the hazard inputs. Given that the DOE Order is aimed at achieving and assuring the safety of nuclear facilities, which is a function not only of the level of the seismic hazard but also of the capacity of the facility to withstand vibratory ground motions, the inclusion of risk information in the evaluation process would appear to be both prudent and in line with the objectives of the Order. The purpose of this white paper is to describe a risk-informed methodology for evaluating the need for an update of an existing PSHA consistent with the DOE Order. While the development of the proposed methodology was undertaken as a result of assessments for specific SDC-3 facilities at Idaho National Laboratory (INL), and it is expected that the application at INL will provide a demonstration of the methodology, there is potential for general applicability to other facilities across the DOE complex. As such, both a general methodology and a specific approach intended for INL are described in this document. The general methodology proposed in this white paper is referred to as the “seismic hazard periodic review methodology,” or SHPRM. It presents a graded approach for SDC-3, SDC-4 and SDC-5 facilities that can be applied in any risk-informed regulatory environment once risk-objectives appropriate for the framework are developed. While the methodology was developed for seismic hazard considerations, it can also be directly applied to other types of natural hazards.

  20. Proposed Risk-Informed Seismic Hazard Periodic Reevaluation Methodology for Complying with DOE Order 420.1C

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kammerer, Annie

    Department of Energy (DOE) nuclear facilities must comply with DOE Order 420.1C Facility Safety, which requires that all such facilities review their natural phenomena hazards (NPH) assessments no less frequently than every ten years. The Order points the reader to Standard DOE-STD-1020-2012. In addition to providing a discussion of the applicable evaluation criteria, the Standard references other documents, including ANSI/ANS-2.29-2008 and NUREG-2117. These documents provide supporting criteria and approaches for evaluating the need to update an existing probabilistic seismic hazard analysis (PSHA). All of the documents are consistent at a high level regarding the general conceptual criteria that should be considered. However, none of the documents provides step-by-step detailed guidance on the required or recommended approach for evaluating the significance of new information and determining whether or not an existing PSHA should be updated. Further, all of the conceptual approaches and criteria given in these documents deal with changes that may have occurred in the knowledge base that might impact the inputs to the PSHA, the calculated hazard itself, or the technical basis for the hazard inputs. Given that the DOE Order is aimed at achieving and assuring the safety of nuclear facilities, which is a function not only of the level of the seismic hazard but also of the capacity of the facility to withstand vibratory ground motions, the inclusion of risk information in the evaluation process would appear to be both prudent and in line with the objectives of the Order. The purpose of this white paper is to describe a risk-informed methodology for evaluating the need for an update of an existing PSHA consistent with the DOE Order. While the development of the proposed methodology was undertaken as a result of assessments for specific SDC-3 facilities at Idaho National Laboratory (INL), and it is expected that the application at INL will provide a demonstration of the methodology, there is potential for general applicability to other facilities across the DOE complex. As such, both a general methodology and a specific approach intended for INL are described in this document. The general methodology proposed in this white paper is referred to as the “seismic hazard periodic review methodology,” or SHPRM. It presents a graded approach for SDC-3, SDC-4 and SDC-5 facilities that can be applied in any risk-informed regulatory environment once risk-objectives appropriate for the framework are developed. While the methodology was developed for seismic hazard considerations, it can also be directly applied to other types of natural hazards.

  1. Seventh NASTRAN User's Colloquium

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The general application of finite element methodology and the specific application of NASTRAN to a wide variety of static and dynamic structural problems are described. Topics include: fluids and thermal applications, NASTRAN programming, substructuring methods, unique new applications, general auxiliary programs, specific applications, and new capabilities.

  2. Validations of Coupled CSD/CFD and Particle Vortex Transport Method for Rotorcraft Applications: Hover, Transition, and High Speed Flights

    NASA Technical Reports Server (NTRS)

    Anusonti-Inthra, Phuriwat

    2010-01-01

    This paper presents validations of a novel rotorcraft analysis that couples Computational Fluid Dynamics (CFD), Computational Structural Dynamics (CSD), and the Particle Vortex Transport Method (PVTM). The CSD with associated vehicle trim analysis is used to calculate blade deformations and trim parameters. The near-body CFD analysis is employed to provide detailed near-body flow field information, which is used to obtain high-fidelity blade aerodynamic loadings. The far-field, wake-dominated region is simulated using the PVTM analysis, which provides accurate prediction of the evolution of the rotor wake released from the near-body CFD domains. A loose coupling methodology between the CSD and CFD/PVTM modules is used, with appropriate information exchange amongst the CSD/CFD/PVTM modules. The coupled CSD/CFD/PVTM methodology is used to simulate various rotorcraft flight conditions (i.e., hover, transition, and high-speed flight), and the results are compared with several sets of experimental data. For the hover condition, the results are compared with hover data for the HART II rotor tested at the DLR Institute of Flight Systems, Germany. For the forward flight conditions, the results are validated with the UH-60A flight test data.

  3. Statistical investigation of Kluyveromyces lactis cells permeabilization with ethanol by response surface methodology.

    PubMed

    de Faria, Janaína T; Rocha, Pollyana F; Converti, Attilio; Passos, Flávia M L; Minim, Luis A; Sampaio, Fábio C

    2013-12-01

    The aim of our study was to select the optimal operating conditions for permeabilizing Kluyveromyces lactis cells using ethanol as a solvent, as an alternative to cell disruption and extraction. Cell permeabilization was carried out by a non-mechanical method consisting of chemical treatment with ethanol, and the results were expressed as β-galactosidase activity. Experiments were conducted under different conditions of ethanol concentration, treatment time and temperature according to a central composite rotatable design (CCRD), and the collected data were then analyzed by response surface methodology (RSM). Cell permeabilization was improved by an increase in ethanol concentration and simultaneous decreases in incubation temperature and treatment time. Such an approach allowed us to identify an optimal range of the independent variables within which the β-galactosidase activity was maximized. A maximum permeabilization of 2,816 mmol L(-1) oNP min(-1) g(-1) was obtained by treating cells with 75.0% v/v ethanol at 20.0 °C for 15.0 min. The proposed methodology proved effective and well suited for K. lactis cell permeabilization at lab scale and promises to be of interest for future applications, mainly in the food industry.
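
    A generic sketch of the RSM step named above: fit a full second-order model to coded design points by ordinary least squares. The design points and responses are synthetic (x1 = ethanol concentration, x2 = temperature, x3 = time, in coded units), not the study's measurements.

```python
import numpy as np

rng = np.random.default_rng(6)
X = rng.uniform(-1, 1, (20, 3))                      # coded factor settings
y = (2500 + 300 * X[:, 0] - 150 * X[:, 1] - 100 * X[:, 2]
     - 200 * X[:, 0]**2 + rng.normal(0, 50, 20))     # toy activity response

def design_matrix(X):
    """Intercept, linear, quadratic and two-way interaction terms."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1**2, x2**2, x3**2,
                            x1 * x2, x1 * x3, x2 * x3])

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
print("fitted second-order coefficients:", np.round(beta, 1))
```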

  4. Investigation of Super Learner Methodology on HIV-1 Small Sample: Application on Jaguar Trial Data.

    PubMed

    Houssaïni, Allal; Assoumou, Lambert; Marcelin, Anne Geneviève; Molina, Jean Michel; Calvez, Vincent; Flandre, Philippe

    2012-01-01

    Background. Many statistical models have been tested to predict phenotypic or virological response from genotypic data. A statistical framework called Super Learner has been introduced either to compare different methods/learners (discrete Super Learner) or to combine them into a Super Learner prediction method. Methods. The Jaguar trial is used to apply the Super Learner framework. The Jaguar study is an "add-on" trial comparing the efficacy of adding didanosine to an ongoing failing regimen. Our aim was also to investigate the impact of different cross-validation strategies and different loss functions. Four different splits between training and validation sets were tested with two loss functions. Six statistical methods were compared. We assess performance by evaluating R² values and accuracy by calculating the rates of patients correctly classified. Results. Our results indicate that the more recent Super Learner methodology of building a new predictor based on a weighted combination of different methods/learners provided good performance. A simple linear model provided results similar to those of this new predictor. Slight discrepancies arise between the two loss functions investigated, and also between results based on cross-validated risks and results from the full dataset. The Super Learner methodology and the linear model correctly classified around 80% of patients; the difference between the lowest and highest rates is around 10 percent. The number of mutations retained by the different learners also varies from 1 to 41. Conclusions. The more recent Super Learner methodology, combining the predictions of many learners, provided good performance on our small dataset.
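
    A compact sketch of the Super Learner combination step: out-of-fold predictions from several base learners are combined with non-negative, normalized weights chosen to minimize cross-validated squared error. The synthetic data and the three base learners are placeholders for the trial's genotypic predictors and six methods.

```python
import numpy as np
from scipy.optimize import nnls
from sklearn.model_selection import KFold, cross_val_predict
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(7)
X = rng.integers(0, 2, (100, 10)).astype(float)   # e.g. mutation indicators
y = X[:, 0] - 0.5 * X[:, 1] + rng.normal(0, 0.5, 100)

learners = [LinearRegression(),
            DecisionTreeRegressor(max_depth=3, random_state=0),
            KNeighborsRegressor(n_neighbors=5)]
cv = KFold(n_splits=10, shuffle=True, random_state=0)

# Level-one matrix Z: column j holds learner j's out-of-fold predictions.
Z = np.column_stack([cross_val_predict(m, X, y, cv=cv) for m in learners])

w, _ = nnls(Z, y)          # non-negative least-squares weights
w /= w.sum()               # normalize to a convex combination
cv_mse = ((Z - y[:, None])**2).mean(axis=0)
print("Super Learner weights:", np.round(w, 2))
print("discrete Super Learner picks learner", int(np.argmin(cv_mse)))
```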

  5. Psycho-informatics: Big Data shaping modern psychometrics.

    PubMed

    Markowetz, Alexander; Błaszkiewicz, Konrad; Montag, Christian; Switala, Christina; Schlaepfer, Thomas E

    2014-04-01

    For the first time in history, it is possible to study human behavior on a great scale and in fine detail simultaneously. Online services and ubiquitous computational devices, such as smartphones and modern cars, record our everyday activity. The resulting Big Data offers unprecedented opportunities for tracking and analyzing behavior. This paper hypothesizes the applicability and impact of Big Data technologies in the context of psychometrics, both for research and clinical applications. It first outlines the state of the art, including the severe shortcomings with respect to the quality and quantity of the resulting data. It then presents a technological vision comprising (i) numerous data sources such as mobile devices and sensors, (ii) a central data store, and (iii) an analytical platform employing techniques from data mining and machine learning. To further illustrate the dramatic benefits of the proposed methodologies, the paper then outlines two current projects logging and analyzing smartphone usage. One such study attempts to quantify the severity of major depression dynamically; the other investigates (mobile) Internet addiction. Finally, the paper addresses some of the ethical issues inherent to Big Data technologies. In summary, the proposed approach is about to induce the single biggest methodological shift since the beginning of psychology or psychiatry. The resulting range of applications will dramatically shape the daily routines of researchers and medical practitioners alike. Indeed, transferring techniques from computer science to psychiatry and psychology is about to establish Psycho-Informatics, an entire research direction of its own. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Evaluating Multi-Input/Multi-Output Digital Control Systems

    NASA Technical Reports Server (NTRS)

    Pototzky, Anthony S.; Wieseman, Carol D.; Hoadley, Sherwood T.; Mukhopadhyay, Vivek

    1994-01-01

    Controller-performance-evaluation (CPE) methodology for multi-input/multi-output (MIMO) digital control systems developed. Procedures identify potentially destabilizing controllers and confirm satisfactory performance of stabilizing ones. Methodology generic and used in many types of multi-loop digital-controller applications, including digital flight-control systems, digitally controlled spacecraft structures, and actively controlled wind-tunnel models. Also applicable to other complex, highly dynamic digital controllers, such as those in high-performance robot systems.

  7. Application of Design Methodologies for Feedback Compensation Associated with Linear Systems

    NASA Technical Reports Server (NTRS)

    Smith, Monty J.

    1996-01-01

    The work that follows is concerned with the application of design methodologies for feedback compensation associated with linear systems. In general, the intent is to provide a well-behaved closed-loop system in terms of stability and robustness (internal signals remain bounded under a certain amount of uncertainty) and simultaneously achieve an acceptable level of performance. The approach here has been to convert the closed-loop system and control synthesis problem into the interpolation setting. The interpolation formulation then serves as our mathematical representation of the design process. Lifting techniques have been used to solve the corresponding interpolation and control synthesis problems. Several applications using this multiobjective design methodology have been included to show the effectiveness of these techniques. In particular, the mixed H2/H∞ performance criterion and algorithm have been used on several examples, including an F-18 HARV (High Angle of Attack Research Vehicle), for sensitivity performance.

  8. Reactor Dosimetry Applications Using RAPTOR-M3G:. a New Parallel 3-D Radiation Transport Code

    NASA Astrophysics Data System (ADS)

    Longoni, Gianluca; Anderson, Stanwood L.

    2009-08-01

    The numerical solution of the Linearized Boltzmann Equation (LBE) via the Discrete Ordinates method (SN) requires extensive computational resources for large 3-D neutron and gamma transport applications, due to the concurrent discretization of the angular, spatial, and energy domains. This paper discusses the development of RAPTOR-M3G (RApid Parallel Transport Of Radiation - Multiple 3D Geometries), a new 3-D parallel radiation transport code, and its application to the calculation of ex-vessel neutron dosimetry responses in the cavity of a commercial 2-loop Pressurized Water Reactor (PWR). RAPTOR-M3G is based on domain decomposition algorithms, where the spatial and angular domains are allocated and processed on multi-processor computer architectures. As compared to traditional single-processor applications, this approach reduces the computational load as well as the memory requirement per processor, yielding an efficient solution methodology for large 3-D problems. Measured neutron dosimetry responses in the reactor cavity air gap are compared to the RAPTOR-M3G predictions. This paper is organized as follows: Section 1 discusses the RAPTOR-M3G methodology; Section 2 describes the 2-loop PWR model and the numerical results obtained; Section 3 addresses the parallel performance of the code; and Section 4 concludes the paper with final remarks and future work.

  9. On the merging of optical and SAR satellite imagery for surface water mapping applications

    NASA Astrophysics Data System (ADS)

    Markert, Kel N.; Chishtie, Farrukh; Anderson, Eric R.; Saah, David; Griffin, Robert E.

    2018-06-01

    Optical and Synthetic Aperture Radar (SAR) imagery from satellite platforms provide a means to discretely map surface water; however, the application of the two data sources in tandem has been inhibited by inconsistent data availability, the distinct physical properties that optical and SAR instruments sense, and dissimilar data delivery platforms. In this paper, we describe a preliminary methodology for merging optical and SAR data into a common data space. We apply our approach over a portion of the Mekong Basin, a region with highly variable surface water cover and persistent cloud cover, for surface water applications requiring dense time series analysis. The methods include the derivation of a representative index from both sensors that transforms the data from disparate physical units (reflectance and backscatter) into a comparable dimensionless space, and the application of a consistent water extraction approach to both datasets. The merging of optical and SAR data allows for increased observations in cloud-prone regions that can be used to gain additional insight into surface water dynamics or flood mapping applications. This preliminary methodology shows promise for a common optical-SAR water extraction; however, data ranges and thresholding values can vary depending on the data source, yielding classification errors in the resulting surface water maps. We discuss some potential future approaches to address these inconsistencies.
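
    A hedged sketch of one way to place an optical water index and SAR backscatter in a common dimensionless space: standardize each to z-scores, orient both so that water is positive, then apply a single threshold. The arrays, water signature and threshold are placeholders, not the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(8)
mndwi = rng.normal(-0.3, 0.15, (100, 100))     # optical water index (toy scene)
sigma0 = rng.normal(-8.0, 2.0, (100, 100))     # SAR backscatter, dB (toy scene)
mndwi[40:60, 40:60] += 0.8                     # synthetic water body: bright index
sigma0[40:60, 40:60] -= 10.0                   # water is dark (smooth) in SAR

def zscore(a):
    return (a - a.mean()) / a.std()

opt_z = zscore(mndwi)          # water -> high index  -> positive z
sar_z = -zscore(sigma0)        # water -> low backscatter -> flip sign

water_opt = opt_z > 2.0        # one threshold in the shared dimensionless space
water_sar = sar_z > 2.0
agreement = (water_opt == water_sar).mean()
print(f"pixelwise agreement between sensors: {agreement:.1%}")
```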

  10. On-line capillary electrophoresis-based enzymatic methodology for the study of polymer-drug conjugates.

    PubMed

    Coussot, G; Ladner, Y; Bayart, C; Faye, C; Vigier, V; Perrin, C

    2015-01-09

    This work studies the potential of an on-line capillary electrophoresis (CE)-based digestion methodology for evaluating the degradability of polymer-drug conjugates in the presence of free trypsin (in-solution digestion). A sandwich plug injection scheme with the transverse diffusion of laminar flow profiles (TDLFP) mode was used to achieve on-line digestion. Electrophoretic separation conditions were established using poly-L-lysine (PLL) as the reference substrate. A comparison with off-line digestion was carried out to demonstrate the feasibility of the proposed methodology. The applicability of the on-line CE-based digestion methodology was evaluated for two PLL-drug conjugates and for the first four generations of dendrigraft of lysine (DGL). Different electrophoretic profiles, showing the formation of di-, tri-, and tetralysine, were observed for PLL-drug and DGL. These findings are in good agreement with the nature of the linker used to attach the drug to the PLL structure and with the predicted degradability of DGL. The applicability of the present on-line methodology was also successfully demonstrated for the hydrolysis of protein conjugates. In summary, the described methodology provides a powerful tool for the rapid study of biodegradable polymers. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. Results from Alloy 600 And Alloy 690 Caustic SCC Model Boiler Tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Frederick D.; Thomas, Larry E.

    2009-08-03

    A versatile model boiler test methodology was developed and used to compare caustic stress corrosion cracking (SCC) of mill-annealed Alloy 600 and thermally treated Alloy 690. The model boiler included simulated crevice devices that efficiently and consistently concentrated Na2CO3, resulting in volatilization of CO2 with the steam and concentration of NaOH at the tube surfaces. The test methodology also included variation in tube stress, either produced by the primary-to-secondary side pressure differential or by a novel method that reproducibly yields a higher stress condition on the tube. The significant effect of residual stress on tube SCC was also considered. SCC of both Alloy 600 and Alloy 690 was evaluated as a function of temperature and stress. Analytical transmission electron microscopy (ATEM) evaluations of the cracks and the grain boundaries ahead of the cracks were performed, providing insight into the SCC mechanism. This model boiler test methodology may be applicable to a range of bulkwater secondary chemistries that concentrate to produce aggressive crevice environments.

  12. [Methodological approach to the use of artificial neural networks for predicting results in medicine].

    PubMed

    Trujillano, Javier; March, Jaume; Sorribas, Albert

    2004-01-01

    In clinical practice, there is increasing interest in obtaining adequate prediction models. Among the available alternatives, artificial neural networks (ANN) are used more and more. In this review we first introduce the ANN methodology, describing the most common type of ANN, the Multilayer Perceptron (MLP) trained with the backpropagation algorithm. We then compare the MLP with Logistic Regression (LR). Finally, we show a practical scheme for building an ANN-based application by means of an example with real data. The main advantage of ANN is their capacity to incorporate nonlinear effects and interactions between the variables of the model without the need to include them a priori. Their main disadvantages are the difficult interpretation of their parameters and the considerable empiricism involved in their construction and training. ANN are useful for computing the probability of a given outcome based on a set of predictor variables. Furthermore, in some cases they obtain better results than LR. The two methodologies, ANN and LR, are complementary and help us to obtain more valid models.
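
    A brief sketch of the MLP-versus-LR comparison described above, on synthetic data whose outcome is driven by an interaction effect that a plain logistic regression cannot capture without an explicit product term; the data, network size and cross-validation setup are illustrative.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(9)
X = rng.normal(size=(600, 4))
# Outcome driven mainly by an interaction (product) of the first two
# predictors, plus a weak linear effect that LR can pick up.
logit = 2.0 * X[:, 0] * X[:, 1] + 0.5 * X[:, 2]
y = (1.0 / (1.0 + np.exp(-logit)) > rng.random(600)).astype(int)

models = [("LR",  LogisticRegression(max_iter=2000)),
          ("MLP", MLPClassifier(hidden_layer_sizes=(16,), max_iter=3000,
                                random_state=0))]

for name, model in models:
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"{name}: cross-validated AUC = {auc.mean():.3f}")
```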

  13. How to Select a Questionnaire with a Good Methodological Quality?

    PubMed

    Paiva, Saul Martins; Perazzo, Matheus de França; Ortiz, Fernanda Ruffo; Pordeus, Isabela Almeida; Martins-Júnior, Paulo Antônio

    2018-01-01

    In recent decades, several instruments have been used to evaluate the impact of oral health problems on the oral health-related quality of life (OHRQoL) of individuals. However, some instruments lack thorough methodological validation or present conceptual differences that hinder comparisons between instruments. Thus, it can be difficult for clinicians and researchers to select a questionnaire that accurately reflects what is really meaningful to individuals. This short communication discusses the importance of using an appropriate checklist to select an instrument of good methodological quality. The COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist was developed to provide tools for evidence-based instrument selection. The COSMIN checklist comprises ten boxes that evaluate whether a study meets the standard for good methodological quality, plus two additional boxes for studies that use the Item Response Theory method and for general requirements for the generalization of results, resulting in four steps to be followed. Wide-ranging use of this checklist therefore requires at least some expertise in psychometrics or clinimetrics. The applications of COSMIN include ensuring the standardization of cross-cultural adaptations, enabling safer comparisons between measurement studies, and evaluating the methodological quality of systematic reviews of measurement properties. It can also be used by students when learning about measurement properties, and by editors and reviewers when revising manuscripts on this topic. The popularization of the COSMIN checklist is therefore necessary to improve the selection and evaluation of health measurement instruments.

  14. A high-throughput virus-induced gene silencing protocol identifies genes involved in multi-stress tolerance

    PubMed Central

    2013-01-01

    Background Understanding the function of a particular gene under various stresses is important for engineering plants for broad-spectrum stress tolerance. Although virus-induced gene silencing (VIGS) has been used to characterize genes involved in abiotic stress tolerance, currently available gene silencing and stress imposition methodology at the whole plant level is not suitable for high-throughput functional analyses of genes. This demands a robust and reliable methodology for characterizing genes involved in abiotic and multi-stress tolerance. Results Our methodology employs VIGS-based gene silencing in leaf disks combined with simple stress imposition and effect quantification methodologies for easy and faster characterization of genes involved in abiotic and multi-stress tolerance. By subjecting leaf disks from gene-silenced plants to various abiotic stresses and inoculating silenced plants with various pathogens, we show the involvement of several genes for multi-stress tolerance. In addition, we demonstrate that VIGS can be used to characterize genes involved in thermotolerance. Our results also showed the functional relevance of NtEDS1 in abiotic stress, NbRBX1 and NbCTR1 in oxidative stress; NtRAR1 and NtNPR1 in salinity stress; NbSOS1 and NbHSP101 in biotic stress; and NtEDS1, NbETR1, NbWRKY2 and NbMYC2 in thermotolerance. Conclusions In addition to widening the application of VIGS, we developed a robust, easy and high-throughput methodology for functional characterization of genes involved in multi-stress tolerance. PMID:24289810

  15. An extensible six-step methodology to automatically generate fuzzy DSSs for diagnostic applications

    PubMed Central

    2013-01-01

    Background: The diagnosis of many diseases can often be formulated as a decision problem; uncertainty affects these problems, so many computerized Diagnostic Decision Support Systems (DDSSs) have been developed to aid the physician in interpreting clinical data and thus improve the quality of the whole process. Fuzzy logic, a well-established attempt at the formalization and mechanization of human capabilities for reasoning and deciding with noisy information, can be profitably used here. Recently, we informally proposed a general methodology to automatically build DDSSs on top of fuzzy knowledge extracted from data. Methods: We carefully refine and formalize our methodology, which comprises six stages, where the first three stages work with crisp rules and the last three operate on fuzzy models. Its strength lies in its generality and modularity, since it supports the integration of alternative techniques in each of its stages. Results: The methodology is designed and implemented in the form of a modular and portable software architecture according to a component-based approach. The architecture is described in depth, and the main components are summarized in UML diagrams. A first implementation of the architecture has then been realized in Java following the object-oriented paradigm and used to instantiate an example DDSS aimed at accurately diagnosing breast masses, as a proof of concept. Conclusions: The results prove the feasibility of the whole methodology implemented in terms of the proposed architecture. PMID:23368970

  16. Automated quantification of neurite outgrowth orientation distributions on patterned surfaces

    NASA Astrophysics Data System (ADS)

    Payne, Matthew; Wang, Dadong; Sinclair, Catriona M.; Kapsa, Robert M. I.; Quigley, Anita F.; Wallace, Gordon G.; Razal, Joselito M.; Baughman, Ray H.; Münch, Gerald; Vallotton, Pascal

    2014-08-01

    Objective. We have developed an image analysis methodology for quantifying the anisotropy of neuronal projections on patterned substrates. Approach. Our method is based on fitting smoothing splines to the digital traces produced using a non-maximum suppression technique. This enables precise estimates of the local tangents uniformly along the neurite length and leads to unbiased orientation distributions suitable for objectively assessing the anisotropy induced by tailored surfaces. Main results. In our application, we demonstrate that carbon nanotubes arrayed in parallel bundles over gold surfaces induce considerable neurite anisotropy, a result relevant for regenerative medicine. Significance. Our pipeline is generally applicable to the study of fibrous materials on 2D surfaces and should also find applications in the study of DNA, microtubules, and other polymeric materials.
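
    The core computation described here (spline fitting followed by tangent sampling) can be sketched with SciPy. This is a minimal illustration on a synthetic trace, not the authors' pipeline; the trace, smoothing factor, and bin count are invented for the example:

```python
import numpy as np
from scipy.interpolate import splprep, splev

# Minimal sketch: fit a smoothing spline to one noisy neurite trace and
# histogram the local tangent orientations along its length.

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 40)
x = 100.0 * t + 2.0 * rng.normal(size=40)        # noisy traced coordinates
y = 20.0 * np.sin(3.0 * t) + 2.0 * rng.normal(size=40)

tck, _ = splprep([x, y], s=40.0)   # smoothing spline through the trace
u = np.linspace(0.0, 1.0, 200)     # uniform in spline parameter, which
dx, dy = splev(u, tck, der=1)      # approximates arc length for this trace
angles = np.degrees(np.arctan2(dy, dx)) % 180.0  # axial orientation, 0-180 deg

hist, edges = np.histogram(angles, bins=18, range=(0.0, 180.0))
print(hist)  # a peaked histogram indicates anisotropic outgrowth
```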

  17. Methodology for estimating helicopter performance and weights using limited data

    NASA Technical Reports Server (NTRS)

    Baserga, Claudio; Ingalls, Charles; Lee, Henry; Peyran, Richard

    1990-01-01

    Methodology is developed and described for estimating the flight performance and weights of a helicopter for which limited data are available. The methodology is based on assumptions that couple knowledge of the technology of the helicopter under study with detailed data from well-documented helicopters thought to be of similar technology. The approach, analysis assumptions, technology modeling, and the use of reference-helicopter data are discussed. Application of the methodology is illustrated with an investigation of the Agusta A129 Mangusta.

  18. Application of Response Surface Methodology for Modeling of Postweld Heat Treatment Process in a Pressure Vessel Steel ASTM A516 Grade 70.

    PubMed

    Peasura, Prachya

    2015-01-01

    This research applied response surface methodology (RSM) with a central composite design (CCD) experiment to mathematically model and optimize postweld heat treatment (PWHT). The material studied is a pressure vessel steel, ASTM A516 grade 70, joined by gas metal arc welding. The PWHT parameters examined in this study were PWHT temperature and time. The welded specimens were evaluated through the CCD experiment and RSM via tensile strength testing, and were observed with optical microscopy and scanning electron microscopy. The experimental results yield the full quadratic model YTS = −285.521 + 15.706·X1 + 2.514·X2 − 0.004·X1² − 0.001·X2² − 0.029·X1·X2. The optimal PWHT parameters for tensile strength were a PWHT time of 5.00 h and a PWHT temperature of 645.75°C. The results show that PWHT time is the dominant factor modifying tensile strength, compared with PWHT temperature. This phenomenon can be explained by pearlite formation, which contributes to higher material tensile strength. The research described here can be used as material data on PWHT parameters for an ASTM A516 grade 70 weld.
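
    For illustration, the published quadratic model can be evaluated directly. A minimal sketch follows, assuming X1 is the PWHT time in hours and X2 the temperature in °C; note the paper may use coded units, and its rounded coefficients need not reproduce the reported optimum exactly:

```python
import numpy as np

# Evaluate the published full quadratic response surface (assumed variable
# assignment: x1 = PWHT time in h, x2 = PWHT temperature in deg C).

def yts(x1, x2):
    return (-285.521 + 15.706 * x1 + 2.514 * x2
            - 0.004 * x1**2 - 0.001 * x2**2 - 0.029 * x1 * x2)

print("YTS at the reported optimum (5.00 h, 645.75 C):", yts(5.0, 645.75))

# explore the surface over a plausible experimental window
for t in np.linspace(2.0, 6.0, 5):
    row = [round(yts(t, T), 1) for T in np.linspace(580.0, 680.0, 6)]
    print(f"time {t:.1f} h:", row)
```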

  19. Self-Contained Automated Methodology for Optimal Flow Control

    NASA Technical Reports Server (NTRS)

    Joslin, Ronald D.; Gunzburger, Max D.; Nicolaides, Roy A.; Erlebacher, Gordon; Hussaini, M. Yousuff

    1997-01-01

    This paper describes a self-contained, automated methodology for active flow control which couples the time-dependent Navier-Stokes system with an adjoint Navier-Stokes system and optimality conditions from which optimal states, i.e., unsteady flow fields and controls (e.g., actuators), may be determined. The problem of boundary layer instability suppression through wave cancellation is used as the initial validation case to test the methodology. Here, the objective of control is to match the stress vector along a portion of the boundary to a given vector; instability suppression is achieved by choosing the given vector to be that of a steady base flow. Control is effected through the injection or suction of fluid through a single orifice on the boundary. The results demonstrate that instability suppression can be achieved without any a priori knowledge of the disturbance, which is significant because other control techniques have required some knowledge of the flow unsteadiness such as frequencies, instability type, etc. The present methodology has been extended to three dimensions and may potentially be applied to separation control, re-laminarization, and turbulence control applications using one to many sensors and actuators.
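
    The optimization loop at the heart of such a methodology (forward solve, backward adjoint solve, gradient update of the control) can be illustrated on a toy problem. The sketch below substitutes a discrete linear system for the Navier-Stokes equations, and all dynamics, weights, and step sizes are invented for the example:

```python
import numpy as np

# Toy adjoint-based optimal control (invented linear dynamics, not the
# paper's Navier-Stokes system): choose controls u_k minimizing
#   J = 0.5 * sum_k ||x_k - x_target||^2 + 0.5 * r * sum_k ||u_k||^2
# subject to x_{k+1} = A x_k + B u_k, via forward/adjoint sweeps.

A = np.array([[0.8, 0.1],
              [0.0, 0.85]])    # assumed stable dynamics
B = np.array([[0.0],
              [1.0]])          # control acts on the second state
x0 = np.array([1.0, -1.0])
x_target = np.zeros(2)
N, r, lr = 30, 1e-2, 0.01

u = np.zeros((N, 1))
for _ in range(1000):
    # forward sweep: simulate the state trajectory under the current control
    x = [x0]
    for k in range(N):
        x.append(A @ x[-1] + B @ u[k])
    # backward (adjoint) sweep: propagate sensitivities and form dJ/du_k
    lam = np.zeros(2)
    grad = np.zeros_like(u)
    for k in reversed(range(N)):
        lam = A.T @ lam + (x[k + 1] - x_target)
        grad[k] = B.T @ lam + r * u[k]
    u -= lr * grad               # steepest-descent update of the control

print("terminal state after optimization:", x[-1])
```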

  20. Inter-provider comparison of patient-reported outcomes: developing an adjustment to account for differences in patient case mix.

    PubMed

    Nuttall, David; Parkin, David; Devlin, Nancy

    2015-01-01

    This paper describes the development of a methodology for the case-mix adjustment of patient-reported outcome measures (PROMs) data, permitting the comparison of outcomes between providers on a like-for-like basis. Statistical models that take account of provider-specific effects form the basis of the proposed case-mix adjustment methodology. Indirect standardisation provides a transparent means of case-mix adjusting the PROMs data, which are updated on a monthly basis. Recently published PROMs data for patients undergoing unilateral knee replacement are used to estimate empirical models and to demonstrate the application of the proposed case-mix adjustment methodology in practice. The results are illustrative and highlight a number of theoretical and empirical issues that warrant further exploration. For example, because of differences between PROMs instruments, case-mix adjustment methodologies may require instrument-specific approaches. A number of key assumptions are made in estimating the empirical models, which could be open to challenge. The covariates of post-operative health status could be expanded, and alternative econometric methods could be employed. © 2013 Crown copyright.
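
    The indirect-standardisation step can be sketched concisely: fit a pooled patient-level model on case-mix covariates, derive each provider's expected mean outcome from it, and compare observed with expected against the national mean. The data, covariates, and linear model below are hypothetical, not the paper's specification:

```python
import numpy as np

# Sketch of indirect standardisation with invented data:
# adjusted = national mean + (observed - expected), where "expected"
# comes from a pooled case-mix model applied to each provider's patients.

rng = np.random.default_rng(0)
n = 1000
provider = rng.integers(0, 5, n)     # five hypothetical providers
age = rng.normal(70.0, 8.0, n)       # assumed case-mix covariates
preop = rng.normal(20.0, 5.0, n)     # e.g. a pre-operative health score
postop = 10.0 + 0.8 * preop - 0.05 * age + rng.normal(0.0, 3.0, n)

X = np.column_stack([np.ones(n), preop, age])
beta, *_ = np.linalg.lstsq(X, postop, rcond=None)  # pooled OLS fit
expected = X @ beta                                # case-mix prediction

national_mean = postop.mean()
for p in range(5):
    m = provider == p
    observed = postop[m].mean()
    adjusted = national_mean + observed - expected[m].mean()
    print(f"provider {p}: observed {observed:.2f}, adjusted {adjusted:.2f}")
```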
