Failure mode effect analysis and fault tree analysis as a combined methodology in risk management
NASA Astrophysics Data System (ADS)
Wessiani, N. A.; Yoshio, F.
2018-04-01
Many studies have reported the implementation of Failure Mode Effect Analysis (FMEA) and Fault Tree Analysis (FTA) as methods in risk management. However, most studies choose only one of the two methods for their risk management methodology, even though combining them reduces the drawbacks each method has when implemented separately. This paper aims to combine the FMEA and FTA methodologies in assessing risk. A case study in a metal company illustrates how the combined methodology can be implemented; there, it assesses the internal risks that occur in the production process, and those internal risks are then mitigated according to their risk levels.
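A minimal sketch of how such a combination could work in practice, with invented failure modes, ratings, and probabilities (none of these values come from the paper): FMEA ranks failure modes by Risk Priority Number, and FTA then decomposes the top-ranked mode into basic-event probabilities.

```python
# Hypothetical FMEA ratings: (severity, occurrence, detection), each on a 1-10 scale.
failure_modes = {
    "porosity_in_casting": (8, 6, 5),
    "dimension_out_of_spec": (6, 4, 3),
}

def rpn(s, o, d):
    """FMEA Risk Priority Number."""
    return s * o * d

worst = max(failure_modes, key=lambda m: rpn(*failure_modes[m]))
print(f"highest-risk mode: {worst}, RPN = {rpn(*failure_modes[worst])}")

# FTA for the top-ranked mode: the top event combines two assumed independent
# basic events; AND/OR gates combine their probabilities.
p_bad_melt_temp, p_worn_mold = 0.02, 0.05
p_top_or = 1 - (1 - p_bad_melt_temp) * (1 - p_worn_mold)   # OR gate
p_top_and = p_bad_melt_temp * p_worn_mold                  # AND gate, for contrast
print(f"P(top | OR) = {p_top_or:.4f}, P(top | AND) = {p_top_and:.6f}")
```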
Analysis of a Proposal to Implement the Readiness Based Sparing Process in the Brazilian Navy
2017-06-01
determine inventory levels. This research investigates whether implementing the U.S. DOD readiness-based sparing (RBS) methodology could provide the Brazilian Navy with greater... It is suggested to apply the methodology first for determining initial provisioning of reparable spares. SUBJECT TERMS: reparable, system-approach...
KSC management training system project
NASA Technical Reports Server (NTRS)
Sepulveda, Jose A.
1993-01-01
The stated objectives for the summer of 1993 were: to review the Individual Development Plan Surveys for 1994 in order to automate the analysis of the Needs Assessment effort; and to develop and implement evaluation methodologies to perform ongoing program-wide course-to-course assessment. This includes the following: to propose a methodology to develop and implement objective, performance-based assessment instruments for each training effort; to mechanize course evaluation forms and develop software to facilitate the data gathering, analysis, and reporting processes; and to implement the methodology, forms, and software in at least one training course or seminar selected among those normally offered in the summer at KSC. Section two of this report addresses the work done in regard to the Individual Development Plan Surveys for 1994. Section three presents the methodology proposed to develop and implement objective, performance-based assessment instruments for each training course offered at KSC.
D'Onza, Giuseppe; Greco, Giulio; Allegrini, Marco
2016-02-01
Recycling implies additional costs for separated municipal solid waste (MSW) collection. The aim of the present study is to propose and implement a management tool - the full cost accounting (FCA) method - to calculate the full collection costs of different types of waste. Our analysis aims for a better understanding of the difficulties of putting FCA into practice in the MSW sector. We propose an FCA methodology that uses standard costs and actual quantities to calculate the collection costs of separate and undifferentiated waste. Our methodology allows cost-efficiency analysis and benchmarking, overcoming problems related to firm-specific accounting choices, earnings management policies, and purchase policies. It also supports variance analysis that can be used to identify the causes of off-standard performance and guide managers to deploy resources more efficiently, and it can be implemented by companies lacking a sophisticated management accounting system. Copyright © 2015 Elsevier Ltd. All rights reserved.
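The core computation the abstract describes (standard unit costs applied to actual quantities, enabling variance analysis) can be illustrated with a small sketch; the cost rates and tonnages below are invented for illustration, not values from the study.

```python
# Standard-cost FCA sketch: cost = standard unit cost x actual quantity, so
# differences across operators reflect quantities and efficiency rather than
# firm-specific accounting choices. All figures are illustrative assumptions.

STANDARD_COST_PER_TONNE = {"separate": 180.0, "undifferentiated": 95.0}  # EUR/t, assumed

def collection_cost(tonnes: dict) -> float:
    return sum(STANDARD_COST_PER_TONNE[w] * q for w, q in tonnes.items())

budgeted = {"separate": 1200.0, "undifferentiated": 3000.0}  # tonnes, assumed
actual = {"separate": 1350.0, "undifferentiated": 2800.0}

variance = collection_cost(actual) - collection_cost(budgeted)
print(f"total cost: {collection_cost(actual):,.0f} EUR, "
      f"variance vs budget: {variance:+,.0f} EUR")
```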
Constraint Force Equation Methodology for Modeling Multi-Body Stage Separation Dynamics
NASA Technical Reports Server (NTRS)
Toniolo, Matthew D.; Tartabini, Paul V.; Pamadi, Bandu N.; Hotchko, Nathaniel
2008-01-01
This paper discusses a generalized approach to the multi-body separation problems in a launch vehicle staging environment based on constraint force methodology and its implementation into the Program to Optimize Simulated Trajectories II (POST2), a widely used trajectory design and optimization tool. This development facilitates the inclusion of stage separation analysis into POST2 for seamless end-to-end simulations of launch vehicle trajectories, thus simplifying the overall implementation and providing a range of modeling and optimization capabilities that are standard features in POST2. Analysis and results are presented for two test cases that validate the constraint force equation methodology in a stand-alone mode and its implementation in POST2.
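The abstract does not reproduce the equations, but multibody separation under kinematic constraints is conventionally posed as a differential-algebraic system; the following is a standard reference form of a constraint force formulation, not text taken from the paper:

```latex
M(q)\,\ddot{q} = F(q,\dot{q},t) + \Phi_q^{\mathsf{T}}\lambda, \qquad \Phi(q,t) = 0
```

Here \Phi collects the kinematic constraints tying the stages together, \Phi_q is its Jacobian, and the multipliers \lambda are the constraint forces. Differentiating the constraint twice yields a linear system \Phi_q M^{-1}\Phi_q^{\mathsf{T}}\lambda = \gamma - \Phi_q M^{-1}F for \lambda at each integration step (with \gamma collecting lower-order derivative terms), and releasing a constraint at staging amounts to deleting its rows from \Phi.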
ERIC Educational Resources Information Center
Ross, Linda
2003-01-01
Recent work with automotive e-commerce clients led to the development of a performance analysis methodology called the Seven Performance Drivers, including: standards, incentives, capacity, knowledge and skill, measurement, feedback, and analysis. This methodology has been highly effective in introducing and implementing performance improvement.…
Stochastic response surface methodology: A study in the human health area
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oliveira, Teresa A., E-mail: teresa.oliveira@uab.pt; Oliveira, Amílcar, E-mail: amilcar.oliveira@uab.pt; Centro de Estatística e Aplicações, Universidade de Lisboa
2015-03-10
In this paper we review Stochastic Response Surface Methodology as a tool for modeling uncertainty in the context of Risk Analysis. An application in the survival analysis in the breast cancer context is implemented with R software.
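The application itself was implemented with R; purely to illustrate the response-surface idea (here in Python, with an invented stand-in model), a low-order polynomial surrogate in a standard-normal input can propagate uncertainty cheaply:

```python
# Stochastic response surface sketch: fit a 2nd-order polynomial surrogate to
# an expensive model at a few collocation points, then run cheap Monte Carlo
# on the surrogate. The model and sample sizes are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def expensive_model(x):
    # Stand-in for the real risk/survival model (assumed form).
    return np.exp(0.3 * x) + 0.1 * x**2

xi = rng.standard_normal(50)                         # collocation samples
coeffs = np.polyfit(xi, expensive_model(xi), deg=2)  # response surface fit

xi_mc = rng.standard_normal(100_000)                 # cheap MC on the surrogate
y = np.polyval(coeffs, xi_mc)
print(f"mean = {y.mean():.3f}, 95th percentile = {np.quantile(y, 0.95):.3f}")
```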
Situational Analysis for Complex Systems: Methodological Development in Public Health Research.
Martin, Wanda; Pauly, Bernie; MacDonald, Marjorie
2016-01-01
Public health systems have suffered infrastructure losses worldwide. Strengthening public health systems requires not only good policies and programs, but also the development of new research methodologies to support public health systems renewal. Our research team considers public health systems to be complex adaptive systems; as such, new methods are necessary to generate knowledge about the process of implementing public health programs and services. Within our program of research, we have employed situational analysis as a method for studying complex adaptive systems in four distinct research studies on public health program implementation. The purpose of this paper is to demonstrate the use of situational analysis as a method for studying complex systems and to highlight the need for further methodological development.
Optimized planning methodologies of ASON implementation
NASA Astrophysics Data System (ADS)
Zhou, Michael M.; Tamil, Lakshman S.
2005-02-01
Advanced network planning concerns effective network-resource allocation for a dynamic and open business environment. Planning methodologies for ASON implementation based on qualitative analysis and mathematical modeling are presented in this paper. The methodology includes methods for rationalizing technology and architecture, building network and nodal models, and developing dynamic programming for multi-period deployment. The multi-layered nodal architecture proposed here can accommodate various nodal configurations for a multi-plane optical network, and the network modeling presented here computes the required network elements for optimizing resource allocation.
ERIC Educational Resources Information Center
Iborra Urios, Montserrat; Ramírez Rangel, Eliana; Badia Córcoles, Jordi Hug; Bringué Tomàs, Roger; Tejero Salvador, Javier
2017-01-01
This work is focused on the implementation, development, documentation, analysis, and assessment of the flipped classroom methodology, by means of the just-in-time teaching strategy, for a pilot group (1 out of 6) in the subject "Applied Computing" of both the Chemical and Materials Engineering Undergraduate Degrees of the University of…
Transitioning Domain Analysis: An Industry Experience.
1996-06-01
An academic and industry partnership took feature-oriented domain analysis (FODA) from a methodology that was still being defined to a well-documented... The effort was to pilot the use of the Software Engineering Institute (SEI) domain analysis methodology known as feature-oriented domain analysis (FODA). Supported...
ERIC Educational Resources Information Center
Dyehouse, Jeremiah
2007-01-01
Researchers studying technology development often examine how rhetorical activity contributes to technologies' design, implementation, and stabilization. This article offers a possible methodology for studying one role of rhetorical activity in technology development: knowledge consolidation analysis. Applying this method to an exemplar case, the…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-12
Title: Environmental Science Formative Research Methodology Studies for the National Children's Study. SUMMARY: In compliance with the requirement... methodological studies conducted during the Vanguard phase will inform the implementation and analysis plan for...
NASA Technical Reports Server (NTRS)
Walters, Robert; Summers, Geoffrey P.; Warner, Jeffrey H.; Messenger, Scott; Lorentzen, Justin R.; Morton, Thomas; Taylor, Stephen J.; Evans, Hugh; Heynderickx, Daniel; Lei, Fan
2007-01-01
This paper presents a method for using the SPENVIS on-line computational suite to implement the displacement damage dose (D(sub d)) methodology for calculating end-of-life (EOL) solar cell performance for a specific space mission. This paper builds on our previous work that has validated the D(sub d) methodology against both measured space data [1,2] and calculations performed using the equivalent fluence methodology developed by NASA JPL [3]. For several years, the space solar community has considered general implementation of the D(sub d) method, but no computer program exists to enable this implementation. In a collaborative effort, NRL, NASA and OAI have produced the Solar Array Verification and Analysis Tool (SAVANT) under NASA funding, but this program has not progressed beyond the beta stage [4]. The SPENVIS suite with the Multi Layered Shielding Simulation Software (MULASSIS) contains all of the necessary components to implement the D(sub d) methodology in a format complementary to that of SAVANT [5]. NRL is currently working with ESA and BIRA to include the D(sub d) method of solar cell EOL calculations as an integral part of SPENVIS. This paper describes how this can be accomplished.
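For context, a commonly published form of the approach (a reference sketch, not quoted from this abstract) collapses the particle environment to a single dose variable and fits the degradation of a cell parameter P against it:

```latex
D_d = \int \mathrm{NIEL}(E)\,\phi(E)\,dE, \qquad
\frac{P(D_d)}{P_0} = 1 - C\,\log_{10}\!\left(1 + \frac{D_d}{D_x}\right)
```

where \phi(E) is the differential particle fluence, NIEL(E) is the nonionizing energy loss, and C and D_x are fit parameters characteristic of the cell technology.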
A Data Warehouse Architecture for DoD Healthcare Performance Measurements.
1999-09-01
With the DoD healthcare... framework, this thesis defines a methodology to design, develop, implement, and apply statistical analysis and data mining tools to a Data Warehouse of healthcare metrics.
Analysis and methodology for aeronautical systems technology program planning
NASA Technical Reports Server (NTRS)
White, M. J.; Gershkoff, I.; Lamkin, S.
1983-01-01
A structured methodology was developed that allows the generation, analysis, and rank-ordering of system concepts by their benefits and costs, indicating the preferred order of implementation. The methodology is supported by a base of data on civil transport aircraft fleet growth projections and data on aircraft performance relating the contribution of each element of the aircraft to overall performance. The performance data are used to assess the benefits of proposed concepts. The methodology includes a computer program for performing the calculations needed to rank-order the concepts and compute their cumulative benefit-to-cost ratio. The use of the methodology and supporting data is illustrated through the analysis of actual system concepts from various sources.
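As a toy illustration of the rank-ordering step described above (concept names, benefits, and costs are invented, not from the study), concepts are sorted by benefit-to-cost ratio and the cumulative ratio is tracked to indicate a preferred order of implementation:

```python
# Rank-order candidate system concepts by benefit-to-cost ratio and report
# the cumulative B/C as concepts are adopted in that order. Illustrative data.

concepts = [  # (name, benefit, cost) in assumed consistent units
    ("active_controls", 40.0, 10.0),
    ("composite_wing", 90.0, 45.0),
    ("laminar_flow", 25.0, 20.0),
]

ranked = sorted(concepts, key=lambda c: c[1] / c[2], reverse=True)
cum_benefit = cum_cost = 0.0
for name, benefit, cost in ranked:
    cum_benefit += benefit
    cum_cost += cost
    print(f"{name:16s} B/C={benefit/cost:4.2f}  cumulative B/C={cum_benefit/cum_cost:4.2f}")
```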
Wightman, Jade; Julio, Flávia; Virués-Ortega, Javier
2014-05-01
Experimental functional analysis is an assessment methodology to identify the environmental factors that maintain problem behavior in individuals with developmental disabilities and in other populations. Functional analysis provides the basis for the development of reinforcement-based approaches to treatment. This article reviews the procedures, validity, and clinical implementation of the methodological variations of functional analysis and function-based interventions. We present six variations of functional analysis methodology in addition to the typical functional analysis: brief functional analysis, single-function tests, latency-based functional analysis, functional analysis of precursors, and trial-based functional analysis. We also present the three general categories of function-based interventions: extinction, antecedent manipulation, and differential reinforcement. Functional analysis methodology is a valid and efficient approach to the assessment of problem behavior and the selection of treatment strategies.
Aircraft optimization by a system approach: Achievements and trends
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, Jaroslaw
1992-01-01
Recently emerging methodology for optimal design of aircraft treated as a system of interacting physical phenomena and parts is examined. The methodology is found to coalesce into methods for hierarchic, non-hierarchic, and hybrid systems all dependent on sensitivity analysis. A separate category of methods has also evolved independent of sensitivity analysis, hence suitable for discrete problems. References and numerical applications are cited. Massively parallel computer processing is seen as enabling technology for practical implementation of the methodology.
Teaching Instrumentation and Methodology in Human Motion Analysis
2001-10-25
TEACHING INSTRUMENTATION AND METHODOLOGY IN HUMAN MOTION ANALYSIS. V. Medved, Faculty of Physical Education, University of Zagreb, Zagreb, Croatia. ...the introduction of teaching curricula to impart the appropriate knowledge. Problems of educating professionals and disseminating... are discussed. At the University of Zagreb, undergraduate teaching of locomotion biomechanics is provided only at the Faculty of Physical Education. Following a need to teach...
An Approach for Implementation of Project Management Information Systems
NASA Astrophysics Data System (ADS)
Běrziša, Solvita; Grabis, Jānis
Project management is governed by project management methodologies, standards, and other regulatory requirements. This chapter proposes an approach for implementing and configuring project management information systems according to requirements defined by these methodologies. The approach uses a project management specification framework to describe project management methodologies in a standardized manner. This specification is used to automatically configure the project management information system by applying appropriate transformation mechanisms. Development of the standardized framework is based on analysis of typical project management concepts and process and existing XML-based representations of project management. A demonstration example of project management information system's configuration is provided.
Development of economic consequence methodology for process risk analysis.
Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed
2015-04-01
A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies. © 2014 Society for Risk Analysis.
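A small sketch of the loss-function idea named above, contrasting a quadratic (Taguchi-type) loss with a bounded inverted-normal loss; the target, constants, and deviations are illustrative assumptions rather than values from the paper.

```python
# Map a process deviation y from target T to an economic loss, using the two
# loss-function shapes named in the abstract. Constants are assumed.
import math

def taguchi_loss(y, target, k=1000.0):
    """Quadratic loss: grows without bound as the deviation increases."""
    return k * (y - target) ** 2

def inverted_normal_loss(y, target, max_loss=50_000.0, gamma=2.0):
    """Bounded loss that saturates at max_loss for large deviations."""
    return max_loss * (1.0 - math.exp(-((y - target) ** 2) / (2 * gamma**2)))

for dev in (0.5, 1.0, 3.0):
    print(f"dev={dev}: quadratic={taguchi_loss(10 + dev, 10):8.0f}  "
          f"inverted-normal={inverted_normal_loss(10 + dev, 10):8.0f}")
```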
Netlist Oriented Sensitivity Evaluation (NOSE)
2017-03-01
The goal of the Netlist-Oriented Sensitivity Evaluation (NOSE) project was to develop methodologies to assess sensitivities of alternative chip design netlist implementations. The research is somewhat foundational in that such... analysis to devise a methodology for scoring the sensitivity of circuit nodes in a netlist and thus providing the raw data for any meaningful...
Methodology for object-oriented real-time systems analysis and design: Software engineering
NASA Technical Reports Server (NTRS)
Schoeffler, James D.
1991-01-01
Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structurings of the system, so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation and the original specification, and perhaps the high-level design, is non-object-oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects where the operations or methods of the objects correspond to processes in the data flow diagrams, and then design in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time-behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly connected models that progress from the object-oriented real-time systems analysis logical models through the physical architectural models and the high-level design stages. The methodology is appropriate to the overall specification including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.
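As a toy illustration of the second approach, an entity modeled by its own states and state-transition rules (the class and events below are hypothetical, not from the paper) can carry its analysis-time behavior unchanged into design and implementation:

```python
# A 'systems-analysis object' sketch: behavior is a state set plus
# transition rules, queried by events. Names and states are invented.

class ValveController:
    TRANSITIONS = {  # (state, event) -> next state
        ("closed", "open_cmd"): "opening",
        ("opening", "limit_hit"): "open",
        ("open", "close_cmd"): "closing",
        ("closing", "limit_hit"): "closed",
    }

    def __init__(self):
        self.state = "closed"

    def handle(self, event: str) -> str:
        # Unknown (state, event) pairs leave the state unchanged.
        self.state = self.TRANSITIONS.get((self.state, event), self.state)
        return self.state

v = ValveController()
for e in ("open_cmd", "limit_hit", "close_cmd", "limit_hit"):
    print(e, "->", v.handle(e))
```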
Risk analysis within environmental impact assessment of proposed construction activity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zeleňáková, Martina; Zvijáková, Lenka
Environmental impact assessment is an important process, prior to approval of the investment plan, providing a detailed examination of the likely and foreseeable impacts of proposed construction activity on the environment. The objective of this paper is to develop a specific methodology for the analysis and evaluation of environmental impacts of selected constructions – flood protection structures – using risk analysis methods. The application of the methodology designed for the process of environmental impact assessment will develop assumptions for further improvements or more effective implementation and performance of this process. The main objective of the paper is to improve the implementation of the environmental impact assessment process. Through the use of risk analysis methods in the environmental impact assessment process, the set objective has been achieved. Highlights: This paper is informed by an effort to develop research with the aim of: • Improving existing qualitative and quantitative methods for assessing the impacts • A better understanding of relations between probabilities and consequences • Methodology for the EIA of flood protection constructions based on risk analysis • Creative approaches in the search for environmentally friendly proposed activities.
DOT National Transportation Integrated Search
2017-02-01
As part of the Federal Highway Administration (FHWA) Traffic Analysis Toolbox (Volume XIII), this guide was designed to help corridor stakeholders implement the Integrated Corridor Management (ICM) Analysis, Modeling, and Simulation (AMS) methodology...
Evans-Agnew, Robin A; Johnson, Susan; Liu, Fuqin; Boutain, Doris M
2016-08-01
Critical discourse analysis (CDA) is a promising methodology for policy research in nursing. As a critical theoretical methodology, researchers use CDA to analyze social practices and language use in policies to examine whether such policies may promote or impede social transformation. Despite the widespread use of CDA in other disciplines such as education and sociology, nursing policy research employing CDA methodology is sparse. To advance CDA use in nursing science, it is important to outline the overall research strategies and describe the steps of CDA in policy research. This article describes, using exemplar case studies, how nursing and health policy researchers can employ CDA as a methodology. Three case studies are provided to discuss the application of CDA research methodologies in nursing policy research: (a) implementation of preconception care policies in the Zhejiang province of China, (b) formation and enactment of statewide asthma policy in Washington state of the United States, and (c) organizational implementation of employee antibullying policies in hospital systems in the Pacific Northwest of the United States. Each exemplar details how CDA guided the examination of policy within specific contexts and social practices. The variations of the CDA approaches in the three exemplars demonstrate the flexibility and potential of conducting policy research grounded in CDA. CDA provides novel insights for nurse researchers examining health policy formation, enactment, and implementation. © The Author(s) 2016.
Exploring Ways to Implement the Health Services Mobility Study: A Feasibility Study.
ERIC Educational Resources Information Center
Lavine, Eileen M.; Moore, Audrey
A feasibility study was aimed at developing a strategy for implementing and utilizing the job analysis methodology which resulted from the Health Services Mobility Study (HSMS), particularly as it can be applied to the field of diagnostic radiology. (The HSMS method of job analysis starts with task descriptions analyzing the tasks that make up a…
Project management practices in engineering university
NASA Astrophysics Data System (ADS)
Sirazitdinova, Y.; Dulzon, A.; Mueller, B.
2015-10-01
The article presents an analysis of the use of project management methodology at Tomsk Polytechnic University, in particular the experience with the course 'Project Management', which started 15 years ago. The article discusses the advantages of project management methodology for engineering education and for administration of the university in general, as well as the problems impeding extensive implementation of this methodology in teaching, research, and management at the university.
Megacity analysis: a clustering approach to classification
2017-06-01
...is interested in these megacity networks and their implications for potential kinetic or non-kinetic urban operations. We develop and implement a methodology to classify megacities into groups. Using 33 variables, we construct a...
Lewis, Cara C; Scott, Kelli; Marriott, Brigid R
2018-05-16
Tailored implementation approaches are touted as more likely to support the integration of evidence-based practices. However, to our knowledge, few methodologies for tailoring implementations exist. This manuscript applies a model-driven, mixed methods approach to a needs assessment to identify the determinants of practice, and pilots a modified conjoint analysis method to generate an implementation blueprint, using a case example of a cognitive behavioral therapy (CBT) implementation in a youth residential center. Our proposed methodology contains five steps to address two goals: (1) identify the determinants of practice and (2) select and match implementation strategies to address the identified determinants (focusing on barriers). Participants in the case example included mental health therapists and operations staff in two programs of Wolverine Human Services. For step 1, the needs assessment, they completed surveys (clinician N = 10; operations staff N = 58; other N = 7) and participated in focus groups (clinician N = 15; operations staff N = 38) guided by the domains of the Framework for Diffusion [1]. For step 2, the research team conducted mixed methods analyses following the QUAN + QUAL structure for the purpose of convergence and expansion in a connecting process, revealing 76 unique barriers. Step 3 consisted of a modified conjoint analysis. For step 3a, agency administrators prioritized the identified barriers according to feasibility and importance. For step 3b, strategies were selected from a published compilation and rated for feasibility and likelihood of impacting CBT fidelity. For step 4, sociometric surveys informed implementation team member selection, and a meeting was held to identify officers and clarify goals and responsibilities. For step 5, blueprints for each of the pre-implementation, implementation, and sustainment phases were generated. Forty-five unique strategies were prioritized across the five years and three phases, representing all nine categories. Our novel methodology offers a relatively low-burden, collaborative approach to generating a plan for implementation that leverages advances in implementation science, including measurement, models, strategy compilations, and methods from other fields.
Information security system quality assessment through the intelligent tools
NASA Astrophysics Data System (ADS)
Trapeznikov, E. V.
2018-04-01
The development of technology has shown the necessity of comprehensive information security analysis for automated systems. An analysis of the subject area indicates the relevance of this study. The research objective is to develop a methodology for assessing the quality of an information security system based on intelligent tools. The basis of the methodology is a model that assesses information security in the information system through a neural network. The paper presents the security assessment model and its algorithm. The results of the practical implementation of the methodology are presented in the form of a software flow diagram. The practical significance of the model being developed is noted in the conclusions.
Eigenvalue Contributon Estimator for Sensitivity Calculations with TSUNAMI-3D
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T; Williams, Mark L
2007-01-01
Since the release of the Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) codes in SCALE [1], the use of sensitivity and uncertainty analysis techniques for criticality safety applications has greatly increased within the user community. In general, sensitivity and uncertainty analysis is transitioning from a technique used only by specialists to a practical tool in routine use. With the desire to use the tool more routinely comes the need to improve the solution methodology to reduce the input and computational burden on the user. This paper reviews the current solution methodology of the Monte Carlo eigenvalue sensitivity analysis sequence TSUNAMI-3D, describes an alternative approach, and presents results from both methodologies.
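For readers unfamiliar with the quantity involved, the relative sensitivity coefficient produced by eigenvalue sensitivity analysis has a standard definition (a reference form, not taken from this abstract): the fractional change in k-effective per fractional change in a cross section \Sigma,

```latex
S_{k,\Sigma} \;=\; \frac{\partial k / k}{\partial \Sigma / \Sigma} \;=\; \frac{\Sigma}{k}\,\frac{\partial k}{\partial \Sigma}
```

which Monte Carlo codes typically estimate with adjoint-weighted tallies.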
NASA Technical Reports Server (NTRS)
Olivas, J. D.; Melroy, P.; McDanels, S.; Wallace, T.; Zapata, M. C.
2006-01-01
In connection with the accident investigation of the space shuttle Columbia, an analysis methodology utilizing well-established microscopic and spectroscopic techniques was implemented for evaluating the environment to which the exterior fused silica glass was exposed. Through the implementation of optical microscopy, scanning electron microscopy, energy dispersive spectroscopy, transmission electron microscopy, and electron diffraction, details emerged regarding the manner in which a charred metallic deposited layer formed on top of the exposed glass. Due to the nature of the substrate and the materials deposited, the methodology allowed a more detailed analysis of the vehicle breakup. By contrast, similar analytical methodologies on metallic substrates have proven challenging due to the strong potential for error resulting from substrate contamination. This information proved valuable not only to those involved in investigating the breakup of Columbia, but also provides a potential guide for investigating future high-altitude and high-energy accidents.
Note: Methodology for the analysis of Bluetooth gateways in an implemented scatternet.
Etxaniz, J; Monje, P M; Aranguren, G
2014-03-01
This Note introduces a novel methodology to analyze the time performance of Bluetooth gateways in multi-hop networks, known as scatternets. The methodology is focused on distinguishing between the processing time and the time that each communication between nodes takes along an implemented scatternet. This technique is not only valid for Bluetooth networks but also for other wireless networks that offer access to their middleware in order to include beacons in the operation of the nodes. We show in this Note the results of the tests carried out on a Bluetooth scatternet in order to highlight the reliability and effectiveness of the methodology. The results also validate this technique showing convergence in the results when subtracting the time for the beacons from the delay measurements.
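A toy decomposition consistent with the description above (the hop count, timestamps, and units are invented for illustration): with beacon timestamps taken inside each node's middleware, end-to-end delay splits into per-node processing time and link (communication) time.

```python
# Per-hop beacon timestamps: (t_rx, t_tx) = packet arrival at a node and its
# forwarding time, in ms. Subtracting summed processing time from the total
# end-to-end delay leaves the communication (link) time. Illustrative values.

hops = [
    (0.0, 4.1),
    (9.8, 13.7),
    (20.2, 24.0),
]
arrival_at_sink_ms = 30.5

processing = sum(tx - rx for rx, tx in hops)
total = arrival_at_sink_ms - hops[0][0]
print(f"processing = {processing:.1f} ms, link time = {total - processing:.1f} ms")
```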
[Methodological problems in the scientific research on HIV /AIDS in Bolivia].
Hita, Susana Ramírez
2013-05-01
This paper discusses the methodological problems in the scientific research on HIV/AIDS in Bolivia, both in the areas of epidemiology and social sciences. Studies associated with this research served as the basis for the implementation of health programs run by The Global Fund, The Pan-American Health Organization, international cooperation agencies, non-governmental organizations, and the Bolivian Ministry of Health and Sports. An analysis of the methodological contradictions and weaknesses was made by reviewing the bibliography of the studies and by conducting qualitative methodological research that focused on the quality of health care available to people living with HIV/AIDS in public hospitals and health centers, and looked at how programs targeted at this sector of the population are designed and delivered. In this manner, it was possible to observe the shortcomings of the methodological design in the epidemiological and social science studies which serve as the basis for the implementation of these health programs.
Payne, Suzette J.; Coppersmith, Kevin J.; Coppersmith, Ryan; ...
2017-08-23
A key decision for nuclear facilities is evaluating the need for an update of an existing seismic hazard analysis in light of new data and information that has become available since the time that the analysis was completed. We introduce the newly developed risk-informed Seismic Hazard Periodic Review Methodology (referred to as the SHPRM) and present how a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 probabilistic seismic hazard analysis (PSHA) was performed in an implementation of this new methodology. The SHPRM offers a defensible and documented approach that considers both the changes in seismic hazard and the engineering-based risk information of an existing nuclear facility to assess the need for an update of an existing PSHA. The SHPRM has seven evaluation criteria that are employed at specific analysis, decision, and comparison points which are applied to seismic design categories established for nuclear facilities in the United States. The SHPRM is implemented using a SSHAC Level 1 study performed for the Idaho National Laboratory, USA. The implementation focuses on the first six of the seven evaluation criteria of the SHPRM, which are all provided from the SSHAC Level 1 PSHA. Finally, to illustrate outcomes of the SHPRM that do not lead to the need for an update and those that do, example implementations of the SHPRM are performed for nuclear facilities that have target performance goals expressed as a mean annual frequency of unacceptable performance of 1x10^-4, 4x10^-5 and 1x10^-5.
Development of Flight Safety Prediction Methodology for U. S. Naval Safety Center. Revision 1
1970-02-01
Safety Center. The methodology developed encompassed functional analysis of the F-4J aircraft, assessment of the importance of safety-sensitive... Model implementation covered functional analysis, major function sensitivity assignment, link dependency assignment, and a computer program for sensitivity...
Payload training methodology study
NASA Technical Reports Server (NTRS)
1990-01-01
The results of the Payload Training Methodology Study (PTMS) are documented. Methods and procedures are defined for the development of payload training programs to be conducted at the Marshall Space Flight Center Payload Training Complex (PTC) for the Space Station Freedom program. The study outlines the overall training program concept as well as the six methodologies associated with the program implementation. The program concept outlines the entire payload training program from initial identification of training requirements to the development of detailed design specifications for simulators and instructional material. The following six methodologies are defined: (1) the Training and Simulation Needs Assessment Methodology; (2) the Simulation Approach Methodology; (3) the Simulation Definition Analysis Methodology; (4) the Simulator Requirements Standardization Methodology; (5) the Simulator Development Verification Methodology; and (6) the Simulator Validation Methodology.
Evaluation of a Progressive Failure Analysis Methodology for Laminated Composite Structures
NASA Technical Reports Server (NTRS)
Sleight, David W.; Knight, Norman F., Jr.; Wang, John T.
1997-01-01
A progressive failure analysis methodology has been developed for predicting the nonlinear response and failure of laminated composite structures. The progressive failure analysis uses C1 plate and shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms. The progressive failure analysis model is implemented into a general-purpose finite element code and can predict the damage and response of laminated composite structures from initial loading to final failure.
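A hedged sketch of the ply-failure check inside such a progressive failure loop: a maximum-strain test and a Hashin-type fiber-tension test are shown, with allowables that are illustrative assumptions rather than values from the paper.

```python
# Ply failure checks used inside a progressive failure iteration. Plies that
# fail a criterion have their stiffnesses degraded and the load step is
# re-solved until final failure. Allowables below are assumed, not from the paper.

def max_strain_fails(eps11, eps22, allow11=0.011, allow22=0.006):
    """Maximum strain criterion on fiber and transverse strains."""
    return abs(eps11) > allow11 or abs(eps22) > allow22

def hashin_fiber_tension_fails(s11, t12, Xt=1500.0, S=90.0):  # MPa, assumed
    """Hashin-type fiber tension criterion: quadratic stress interaction."""
    return s11 > 0 and (s11 / Xt) ** 2 + (t12 / S) ** 2 >= 1.0

print(max_strain_fails(0.012, 0.001))          # True: fiber strain exceeded
print(hashin_fiber_tension_fails(1400.0, 40.0))  # True: interaction >= 1
```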
Performance-cost evaluation methodology for ITS equipment deployment
DOT National Transportation Integrated Search
2000-09-01
Although extensive Intelligent Transportation Systems (ITS) technology is being deployed in the field, little analysis is being performed to evaluate the benefits of implementation schemes. Benefit analysis is particularly needed for one popular ITS...
Object-oriented analysis and design: a methodology for modeling the computer-based patient record.
Egyhazy, C J; Eyestone, S M; Martino, J; Hodgson, C L
1998-08-01
The article highlights the importance of an object-oriented analysis and design (OOAD) methodology for the computer-based patient record (CPR) in the military environment. Many OOAD methodologies do not adequately scale up, allow for efficient reuse of their products, or accommodate legacy systems. A methodology that addresses these issues is formulated and used to demonstrate its applicability in a large-scale health care service system. During a period of 6 months, a team of object modelers and domain experts formulated an OOAD methodology tailored to the Department of Defense Military Health System and used it to produce components of an object model for simple order processing. This methodology and the lessons learned during its implementation are described. This approach is necessary to achieve broad interoperability among heterogeneous automated information systems.
Brown, C. Hendricks; Kellam, Sheppard G.; Kaupert, Sheila; Muthén, Bengt O.; Wang, Wei; Muthén, Linda K.; Chamberlain, Patricia; PoVey, Craig L.; Cady, Rick; Valente, Thomas W.; Ogihara, Mitsunori; Prado, Guillermo J.; Pantin, Hilda M.; Gallo, Carlos G.; Szapocznik, José; Czaja, Sara J.; McManus, John W.
2012-01-01
What progress prevention research has made comes through strategic partnerships with communities and institutions that host this research, as well as professional and practice networks that facilitate the diffusion of knowledge about prevention. We discuss partnership issues related to the design, analysis, and implementation of prevention research and especially how rigorous designs, including random assignment, get resolved through a partnership between community stakeholders, institutions, and researchers. These partnerships shape not only study design, but they determine the data that can be collected and how results and new methods are disseminated. We also examine a second type of partnership to improve the implementation of effective prevention programs into practice. We draw on social networks to study partnership formation and function. The experience of the Prevention Science and Methodology Group, which itself is a networked partnership between scientists and methodologists, is highlighted. PMID:22160786
Functional Analysis and Treatment of Nail Biting
ERIC Educational Resources Information Center
Dufrene, Brad A.; Watson, T. Steuart; Kazmerski, Jennifer S.
2008-01-01
This study applied functional analysis methodology to nail biting exhibited by a 24-year-old female graduate student. Results from the brief functional analysis indicated variability in nail biting across assessment conditions. Functional analysis data were then used to guide treatment development and implementation. Treatment included a…
Geomatics for Maritime Parks and Preserved Areas
NASA Astrophysics Data System (ADS)
Lo Tauro, Agata
2009-11-01
The aim of this research is to use hyperspectral MIVIS data for the protection of sensitive cultural and natural resources, nature reserves, and maritime parks. Knowledge of the distribution of submerged vegetation is useful to monitor the health of ecosystems in coastal areas. The objective of this project was to develop a new methodology within a geomatic environment to facilitate analysis and application by local institutions that are not familiar with spatial analysis software, in order to implement new research activities in this field of study. Field controls may be carried out with the support of accurate and novel in situ analysis in order to determine the training sites for the novel tested classification. The methodology applied demonstrates that the combination of hyperspectral sensors and ESA Remote Sensing (RS) data can be used to analyse thematic cartography of submerged vegetation and land use analysis for Sustainable Development. This project will be implemented for Innovative Educational and Research Programmes.
A Framework for Implementing TQM in Higher Education Programs
ERIC Educational Resources Information Center
Venkatraman, Sitalakshmi
2007-01-01
Purpose: This paper aims to provide a TQM framework that stresses continuous improvements in teaching as a plausible means of TQM implementation in higher education programs. Design/methodology/approach: The literature survey of the TQM philosophies and the comparative analysis of TQM adoption in industry versus higher education provide the…
Implementing meta-analysis from genome-wide association studies for pork quality traits
USDA-ARS?s Scientific Manuscript database
Pork quality plays an important role in the meat processing industry, thus different methodologies have been implemented to elucidate the genetic architecture of traits affecting meat quality. One of the most common and widely used approaches is to perform genome-wide association (GWA) studies. Howe...
Evolution of Ada technology in the flight dynamics area: Implementation/testing phase analysis
NASA Technical Reports Server (NTRS)
Quimby, Kelvin L.; Esker, Linda; Miller, John; Smith, Laurie; Stark, Mike; Mcgarry, Frank
1989-01-01
An analysis is presented of the software engineering issues related to the use of Ada for the implementation and system testing phases of four Ada projects developed in the flight dynamics area. These projects reflect an evolving understanding of more effective use of Ada features. In addition, the testing methodology used on these projects has changed substantially from that used on previous FORTRAN projects.
Development of Management Methodology for Engineering Production Quality
NASA Astrophysics Data System (ADS)
Gorlenko, O.; Miroshnikov, V.; Borbatc, N.
2016-04-01
The authors of the paper propose four directions for developing a quality management methodology for engineering products that implements the requirements of the new international standard ISO 9001:2015: the analysis of the organizational context, taking stakeholders into account; the use of risk management; the management of in-house knowledge; and the assessment of enterprise activity according to effectiveness criteria.
Okariz, Ana; Guraya, Teresa; Iturrondobeitia, Maider; Ibarretxe, Julen
2017-02-01
The SIRT (Simultaneous Iterative Reconstruction Technique) algorithm is commonly used in Electron Tomography to calculate the original volume of the sample from noisy images, but the results provided by this iterative procedure are strongly dependent on the specific implementation of the algorithm, as well as on the number of iterations employed for the reconstruction. In this work, a methodology for selecting the iteration number of the SIRT reconstruction that provides the most accurate segmentation is proposed. The methodology is based on the statistical analysis of the intensity profiles at the edge of the objects in the reconstructed volume. A phantom which resembles a carbon black aggregate has been created to validate the methodology, and the SIRT implementations of two free software packages (TOMOJ and TOMO3D) have been used. Copyright © 2016 Elsevier B.V. All rights reserved.
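To make the role of the iteration number concrete, here is a minimal SIRT iteration on a toy linear system (numpy only, all data synthetic); real electron tomography reconstructions apply the same update with projection operators instead of an explicit matrix.

```python
# SIRT update: x <- x + C * A^T * R * (b - A x), where R and C are the inverse
# row and column sums of A. The residual typically drops quickly, then noise
# amplification sets in, which is why the stopping iteration matters.
import numpy as np

rng = np.random.default_rng(1)
A = rng.random((40, 20))                         # toy projection matrix
x_true = rng.random(20)
b = A @ x_true + 0.05 * rng.standard_normal(40)  # noisy projections

R = 1.0 / A.sum(axis=1)  # inverse row sums
C = 1.0 / A.sum(axis=0)  # inverse column sums

x = np.zeros(20)
for it in range(1, 101):
    x += C * (A.T @ (R * (b - A @ x)))           # SIRT update
    if it in (1, 10, 100):
        print(f"iteration {it:3d}: error = {np.linalg.norm(x - x_true):.4f}")
```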
NASA Technical Reports Server (NTRS)
Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe
2008-01-01
NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks between all three components. Strategic analysis supports strategic decision making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high-level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper presents an overview of the strategic analysis methodology and sample results from its application to the Constellation Program lunar architecture.
NASA Astrophysics Data System (ADS)
Khavekar, Rajendra; Vasudevan, Hari, Dr.; Modi, Bhavik
2017-08-01
Two well-known Design of Experiments (DoE) methodologies, Taguchi Methods (TM) and Shainin Systems (SS), are compared and analyzed in this study through their implementation in a plastic injection molding unit. Experiments were performed at a company manufacturing perfume bottle caps (made of acrylic) using TM and SS to find the root cause of defects and to optimize the process parameters for minimum rejection. The experiments brought the rejection rate down from approximately 40% during trial runs to 8.57%, which is quite low, representing successful implementation of these DoE methods. The comparison showed that both methodologies identified the same set of variables as critical for defect reduction, but with a change in their order of significance. Also, Taguchi Methods require more experiments and consume more time compared to the Shainin System. The Shainin System is less complicated and easy to implement, whereas Taguchi Methods are statistically more reliable for optimization of process parameters. Finally, the experiments showed that DoE methods are robust and reliable in implementation as organizations attempt to improve quality through optimization.
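A brief sketch of the smaller-the-better signal-to-noise ratio that Taguchi analysis uses to pick parameter levels; the run labels and replicate rejection rates below are invented for illustration, not data from the study.

```python
# Smaller-the-better S/N ratio: SN = -10 * log10(mean(y^2)). Higher S/N marks
# a more robust level combination. Replicate data are assumed.
import math

runs = {  # run id -> rejection rates (%) observed in replicates
    "A1B1C1": [9.0, 8.5, 8.2],
    "A2B1C2": [14.0, 13.1, 15.2],
}

for run, ys in runs.items():
    sn = -10.0 * math.log10(sum(y * y for y in ys) / len(ys))
    print(f"{run}: S/N = {sn:.2f} dB")
```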
CERT tribal internship program. Final intern report: David Conrad, 1993
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-09-01
The intern's report contains a Master's thesis entitled "An implementation analysis of the US Department of Energy's American Indian policy as part of its environmental restoration and waste management mission." This thesis examines the implementation of a working relationship between the Nez Perce Tribe and the US Department of Energy's Office of Environmental Restoration and Waste Management at the Hanford reservation. It examines the relationship using a qualitative methodology and three generations of policy analysis literature to gain a clear understanding of the potential for successful implementation.
Wavelet and Multiresolution Analysis for Finite Element Networking Paradigms
NASA Technical Reports Server (NTRS)
Kurdila, Andrew J.; Sharpley, Robert C.
1999-01-01
This paper presents a final report on Wavelet and Multiresolution Analysis for Finite Element Networking Paradigms. The focus of this research is to derive and implement: (1) wavelet-based methodologies for the compression, transmission, decoding, and visualization of three-dimensional finite element geometry and simulation data in a network environment; (2) methodologies for interactive algorithm monitoring and tracking in computational mechanics; and (3) methodologies for interactive algorithm steering for the acceleration of large-scale finite element simulations. Also included in this report are appendices describing the derivation of wavelet-based Particle Image Velocimetry algorithms and reduced-order input-output models for nonlinear systems obtained by utilizing wavelet approximations.
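A toy illustration of the compression idea in item (1): a one-level Haar transform followed by thresholding of small detail coefficients. Real geometry and field compression is multi-level and three-dimensional; this numpy-only sketch (with an invented test signal and threshold) only shows the mechanism.

```python
# One-level Haar wavelet compression sketch: split a signal into averages and
# details, zero out near-zero details, then reconstruct exactly from the rest.
import numpy as np

signal = (np.sin(np.linspace(0, 4 * np.pi, 64))
          + 0.02 * np.random.default_rng(2).standard_normal(64))

avg = (signal[0::2] + signal[1::2]) / 2.0   # Haar approximation coefficients
det = (signal[0::2] - signal[1::2]) / 2.0   # Haar detail coefficients

det[np.abs(det) < 0.05] = 0.0               # drop small details (compression)
kept = np.count_nonzero(det) + avg.size
print(f"coefficients kept: {kept}/{signal.size}")

recon = np.empty_like(signal)               # inverse Haar transform
recon[0::2], recon[1::2] = avg + det, avg - det
print(f"max reconstruction error: {np.max(np.abs(recon - signal)):.4f}")
```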
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-20
..., Risk Management and Analysis (RAM). ACTION: Notice of request for public comments. SUMMARY: The... of 1995. Title of Information Collection: Risk Analysis and Management. OMB Control Number: None... Methodology: The State Department is implementing a Risk Analysis and Management Program to vet potential...
Developing and Implementing an Online Doctoral Programme
ERIC Educational Resources Information Center
Combe, Colin
2005-01-01
Purpose: This article is a critical reflection of the development and implementation of one of the first online doctoral programs in the UK set up at the University of Northumbria, Newcastle in 2000. Design/methodology/approach: The method adopted for analysis takes the form of a case study. Findings: Effective market research has to be undertaken…
ERIC Educational Resources Information Center
Wills, Frances G.
Strategies utilized by district superintendents to implement school improvement plans in response to state-mandated change are examined in this report. Methodology involved document analysis of written plans and interviews with 30 Maine superintendents and assistant superintendents who were identified as successful developers of school improvement…
Change Forces: Implementing Change in a Secondary School for the Common Good
ERIC Educational Resources Information Center
Melville, Wayne; Bartley, Anthony; Weinburgh, Molly
2012-01-01
In this article, we investigate the change forces that act on administrators, subject department chairpersons and teachers as they seek to implement a change in a Canadian secondary school. Using a case study methodology, our analysis of the data uses Sergiovanni's (1998) six change forces: bureaucratic, personal, market, professional, cultural,…
Representing Embodiment and the Policy Implementing Principal Using Photovoice
ERIC Educational Resources Information Center
Werts, Amanda B.; Brewer, Curtis A.; Mathews, Sarah A.
2012-01-01
Purpose: The purpose of this paper is to contribute to the literature on the many dimensions of the principal's positionality by using a unique research approach to link the experiences of the policy implementing principal to embodiment. Design/methodology/approach: The researchers employed a form of critical policy analysis that utilized…
NASA Technical Reports Server (NTRS)
1979-01-01
Information to identify viable coal gasification and utilization technologies is presented. Analysis capabilities required to support design and implementation of coal based synthetic fuels complexes are identified. The potential market in the Southeast United States for coal based synthetic fuels is investigated. A requirements analysis to identify the types of modeling and analysis capabilities required to conduct and monitor coal gasification project designs is discussed. Models and methodologies to satisfy these requirements are identified and evaluated, and recommendations are developed. Requirements for development of technology and data needed to improve gasification feasibility and economies are examined.
NASA Technical Reports Server (NTRS)
Allen, Gregory; Edmonds, Larry D.; Swift, Gary; Carmichael, Carl; Tseng, Chen Wei; Heldt, Kevin; Anderson, Scott Arlo; Coe, Michael
2010-01-01
We present a test methodology for estimating system error rates of Field Programmable Gate Arrays (FPGAs) mitigated with Triple Modular Redundancy (TMR). The test methodology is founded in a mathematical model, which is also presented. Accelerator data from a 90 nm Xilinx Military/Aerospace grade FPGA are shown to fit the model. Fault injection (FI) results are discussed and related to the test data. Design implementation and the corresponding impact of multiple bit upset (MBU) are also discussed.
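As back-of-the-envelope context for the modeling goal above (this is the textbook TMR relation, not the paper's model, and the upset probability is an assumed value): a triplicated module fails only when two or more copies are upset, so for a per-module upset probability p the mitigated probability is approximately 3p^2.

```python
# Probability that a 2-of-3 voter fails: at least two of three modules upset.
p = 1e-5                                   # assumed per-module upset probability
p_tmr = 3 * p**2 * (1 - p) + p**3          # exact 2-or-3-of-3 failure probability
print(f"unmitigated: {p:.1e}, TMR-mitigated: {p_tmr:.2e}")
```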
Texture analysis of Napoleonic War Era copper bolts
NASA Astrophysics Data System (ADS)
Malamud, Florencia; Northover, Shirley; James, Jon; Northover, Peter; Kelleher, Joe
2016-04-01
Neutron diffraction techniques are suitable for volume texture analysis due to the high penetration of thermal neutrons in most materials. We have implemented a new data analysis methodology that employs the spatial resolution achievable by a time-of-flight neutron strain scanner to non-destructively determine the crystallographic texture at selected locations within a macroscopic sample. The method is based on defining the orientation distribution function of the crystallites from several incomplete pole figures, and it has been implemented on ENGIN-X, a neutron strain scanner at the ISIS Facility in the UK. Here, we demonstrate the application of this new texture analysis methodology in determining the crystallographic texture at selected locations within museum-quality archaeological objects up to 1 m in length. The results were verified using samples of similar, but less valuable, objects by comparing the results of applying this method with those obtained using both electron backscatter diffraction and X-ray diffraction on their cross sections.
ERIC Educational Resources Information Center
Jovanovic, Aleksandar; Jankovic, Anita; Jovanovic, Snezana Markovic; Peric, Vladan; Vitosevic, Biljana; Pavlovic, Milos
2015-01-01
The paper describes the delivery of the courses in the framework of the project implementation and presents the effect the change in the methodology had on student performance as measured by final grade. Methodology: University of Pristina piloted blended courses in 2013 under the framework of the Tempus BLATT project. The blended learning…
NASA Technical Reports Server (NTRS)
Giles, G. L.; Rogers, J. L., Jr.
1982-01-01
The methodology used to implement structural sensitivity calculations into a major, general-purpose finite-element analysis system (SPAR) is described. This implementation includes a generalized method for specifying element cross-sectional dimensions as design variables that can be used in analytically calculating derivatives of output quantities from static stress, vibration, and buckling analyses for both membrane and bending elements. Limited sample results for static displacements and stresses are presented to indicate the advantages of analytically calculating response derivatives compared to finite difference methods. Continuing developments to implement these procedures into an enhanced version of SPAR are also discussed.
Khan, F I; Iqbal, A; Ramesh, N; Abbasi, S A
2001-10-12
As it is conventionally done, strategies for incorporating accident-prevention measures in any hazardous chemical process industry are developed on the basis of input from risk assessment. However, the two steps, risk assessment and hazard reduction (or safety) measures, are not linked interactively in the existing methodologies. This prevents a quantitative assessment of the impacts of safety measures on risk control. We have made an attempt to develop a methodology in which risk assessment steps are interactively linked with the implementation of safety measures. The resultant system tells us the extent of risk reduction achieved by each successive safety measure. It also indicates, based on sophisticated maximum credible accident analysis (MCAA) and probabilistic fault tree analysis (PFTA), whether a given unit can ever be made 'safe'. The application of the methodology has been illustrated with a case study.
An Analysis of Hardware-Assisted Virtual Machine Based Rootkits
2014-06-01
certain aspects of TPM implementation, just to name a few. HyperWall is an architecture proposed by Szefer and Lee to protect guest VMs from... The use of virtual machine (VM) technology has expanded rapidly since AMD and Intel implemented... Intel VT-x implementations of Blue Pill to identify commonalities in the respective versions' attack methodologies, from both a functional and technical...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-19
... DEPARTMENT OF COMMERCE International Trade Administration Methodological Change for Implementation..., the Department of Commerce ("the Department") will implement a methodological change to reduce... administrative reviews involving merchandise from the PRC and Vietnam. Methodological Change In antidumping duty...
Stream habitat analysis using the instream flow incremental methodology
Bovee, Ken D.; Lamb, Berton L.; Bartholow, John M.; Stalnaker, Clair B.; Taylor, Jonathan; Henriksen, Jim
1998-01-01
This document describes the Instream Flow Incremental Methodology (IFIM) in its entirety. It is also intended to serve as a comprehensive introductory textbook on IFIM for training courses, as it contains the most complete and comprehensive description of IFIM in existence today. It should likewise serve as an official published guide to IFIM, counteracting the misconceptions about the methodology that have pervaded the professional literature since the mid-1980s, since it describes IFIM as envisioned by its developers. The document is aimed at decision makers in the management and allocation of natural resources, to provide them with an overview, and at those who design and implement studies to inform those decision makers. There is enough background on model concepts, data requirements, calibration techniques, and quality assurance to help the technical user design and implement a cost-effective application of IFIM that will provide policy-relevant information. The chapters cover the basic organization of IFIM and the procedural sequence of applying it, starting with problem identification, then study planning and implementation, and finally problem resolution.
Regional Shelter Analysis Methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dillon, Michael B.; Dennison, Deborah; Kane, Jave
2015-08-01
The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country-specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
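The core combination step, building protection factors weighted by where people are, can be illustrated in a few lines; all occupancy fractions, protection factors and doses below are hypothetical:

```python
# Illustrative sketch of the core Regional Shelter Analysis idea: combine
# building protection factors with where people are. Numbers are hypothetical.

# Fraction of the population in each location and the fallout protection
# factor (PF) that location provides (outdoor dose / indoor dose).
occupancy = {"outdoors": 0.05, "wood_frame_home": 0.60,
             "masonry_office": 0.25, "basement": 0.10}
protection_factor = {"outdoors": 1.0, "wood_frame_home": 3.0,
                     "masonry_office": 10.0, "basement": 40.0}

# Population-averaged dose reduction: the mean of 1/PF weighted by occupancy.
avg_transmission = sum(f / protection_factor[loc]
                       for loc, f in occupancy.items())
print(f"population-averaged dose fraction: {avg_transmission:.3f}")

# Combine with an outdoor dose prediction to estimate the mean received dose.
outdoor_dose_gy = 2.0  # hypothetical fallout prediction at this grid cell
print(f"mean received dose: {outdoor_dose_gy * avg_transmission:.2f} Gy")
```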
DOE Office of Scientific and Technical Information (OSTI.GOV)
PECH, S.H.
This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.
NASA Astrophysics Data System (ADS)
Aguila, Alexander; Wilson, Jorge
2017-07-01
This paper develops a methodology to assess a group of electrical improvement measures in distribution systems, combining technical and economic criteria. To address the problem of energy losses in distribution systems, a technical and economic analysis was performed based on a mathematical model that establishes a direct relationship between the energy saved through minimized losses and the costs of implementing the proposed measures. The paper analyses the feasibility of reducing energy losses in distribution systems by replacing existing network conductors with larger cross-section conductors and by raising the distribution voltage to higher levels. This methodology provides a highly efficient mathematical tool for analysing the feasibility of implementing improvement projects based on their costs, which is very useful for distribution companies and can serve as a starting point for the analysis of this type of project in distribution systems.
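The underlying cost/benefit trade-off can be sketched with simple I²R loss arithmetic; the sketch below uses invented feeder data and is not the paper's mathematical model:

```python
# Minimal sketch of the savings-vs-cost relation for reconductoring: energy
# saved from reduced I^2 R losses against implementation cost. All figures
# are hypothetical.

I = 120.0                    # feeder current, A
r_old, r_new = 0.60, 0.35    # conductor resistance, ohm/km
length_km = 8.0
hours_per_year = 8760
loss_factor = 0.4            # accounts for load variation over the year
price_kwh = 0.10             # USD/kWh

# Annual energy saved by the larger cross-section conductor (3-phase line).
dP_kw = 3 * I**2 * (r_old - r_new) * length_km / 1000.0
energy_saved_kwh = dP_kw * hours_per_year * loss_factor
savings = energy_saved_kwh * price_kwh

capex = 150_000.0            # hypothetical reconductoring cost, USD
print(f"annual savings: ${savings:,.0f}; simple payback: {capex/savings:.1f} y")
```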
The 'Direct Attack' Strategy for Poverty Removal: Implementation Methodology.
ERIC Educational Resources Information Center
Sinha, Sanjay
1981-01-01
Discusses elements of an implementation methodology for the removal of poverty in India. Includes background, methodology, aggregation of demands, economics of the strategy, complementary activities and infrastructure, mechanics of implementation, and monitoring. (CT)
Cost benefits of advanced software: A review of methodology used at Kennedy Space Center
NASA Technical Reports Server (NTRS)
Joglekar, Prafulla N.
1993-01-01
To assist rational investments in advanced software, a formal, explicit, and multi-perspective cost-benefit analysis methodology is proposed. The methodology can be implemented through a six-stage process which is described and explained. The current practice of cost-benefit analysis at KSC is reviewed in the light of this methodology. The review finds that there is a vicious circle operating. Unsound methods lead to unreliable cost-benefit estimates. Unreliable estimates convince management that cost-benefit studies should not be taken seriously. Then, given external demands for cost-benefit estimates, management encourages software engineers to somehow come up with the numbers for their projects. Lacking the expertise needed to do a proper study, courageous software engineers with vested interests use ad hoc and unsound methods to generate some estimates. In turn, these estimates are unreliable, and the cycle continues. The proposed methodology should help KSC to break out of this cycle.
Climate Model Diagnostic Analyzer Web Service System
NASA Astrophysics Data System (ADS)
Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Li, J.; Zhang, J.; Wang, W.
2015-12-01
Both the National Research Council Decadal Survey and the latest Intergovernmental Panel on Climate Change Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with the synergistic use of global satellite observations in order to improve our weather and climate simulation and prediction capabilities. The abundance of satellite observations for fundamental climate parameters and the availability of coordinated model outputs from CMIP5 for the same parameters offer a great opportunity to understand and diagnose model biases in climate models. In addition, the Obs4MIPs efforts have created several key global observational datasets that are readily usable for model evaluations. However, a model diagnostic evaluation process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. In response, we have developed a novel methodology to diagnose model biases in contemporary climate models and have implemented it as a web-service-based, cloud-enabled, provenance-supported climate-model evaluation system. The evaluation system is named Climate Model Diagnostic Analyzer (CMDA), which is the product of the research and technology development investments of several current and past NASA ROSES programs. The current technologies and infrastructure of CMDA are designed and selected to address several technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. In particular, we have three key technology components: (1) diagnostic analysis methodology; (2) web-service based, cloud-enabled technology; (3) provenance-supported technology. The diagnostic analysis methodology includes random forest feature importance ranking, conditional probability distribution function, conditional sampling, and time-lagged correlation map. We have implemented the new methodology as web services and incorporated the system into the cloud. We have also developed a provenance management system for CMDA where CMDA service semantics modeling, service search and recommendation, and service execution history management are designed and implemented.
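For instance, one of the listed diagnostics, the time-lagged correlation map, reduces to computing Pearson correlations over shifted series. A minimal sketch, with synthetic data standing in for the satellite and model time series:

```python
# Sketch of the time-lagged correlation diagnostic listed above. The data
# here are synthetic stand-ins for the observational/model series CMDA
# actually ingests.
import numpy as np

def lagged_correlation(x, y, max_lag):
    """Pearson correlation of x against y shifted by each lag."""
    out = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag < 0:
            a, b = x[:lag], y[-lag:]
        elif lag > 0:
            a, b = x[lag:], y[:-lag]
        else:
            a, b = x, y
        out[lag] = np.corrcoef(a, b)[0, 1]
    return out

t = np.arange(240)                       # 20 years of monthly data
x = np.sin(2 * np.pi * t / 12) + 0.3 * np.random.randn(t.size)
y = np.roll(x, 3) + 0.3 * np.random.randn(t.size)    # y lags x by 3 months

corr = lagged_correlation(x, y, max_lag=6)
best = max(corr, key=lambda k: abs(corr[k]))
print(f"strongest correlation at lag {best}: {corr[best]:.2f}")
```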
Structural Optimization Methodology for Rotating Disks of Aircraft Engines
NASA Technical Reports Server (NTRS)
Armand, Sasan C.
1995-01-01
In support of the preliminary evaluation of various engine technologies, a methodology has been developed for structurally designing the rotating disks of an aircraft engine. The structural design methodology, along with a previously derived methodology for predicting low-cycle fatigue life, was implemented in a computer program. An interface computer program was also developed that gathers the required data from a flowpath analysis program (WATE) being used at NASA Lewis. The computer program developed for this study requires minimum interaction with the user, thus allowing engineers with varying backgrounds in aeropropulsion to successfully execute it. The stress analysis portion of the methodology and the computer program were verified by employing the finite element analysis method. The 10th-stage, high-pressure-compressor disk of the Energy Efficient Engine Program (E3) engine was used to verify the stress analysis; the differences between the stresses and displacements obtained from the computer program developed for this study and from the finite element analysis were all below 3 percent for the problem solved. The computer program developed for this study was employed to structurally optimize the rotating disks of the E3 high-pressure compressor. The rotating disks designed by the computer program in this study were approximately 26 percent lighter than calculated from the E3 drawings. The methodology is presented herein.
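Preliminary-design disk sizing of this kind can start from classical closed-form rotating-disk stresses before finite element verification. The sketch below uses the standard uniform-thickness solid-disk formulas with illustrative values (these are not E3 data):

```python
# Classical (Timoshenko) stresses in a solid rotating disk of uniform
# thickness, the kind of closed-form estimate a preliminary-design tool can
# use before FEA verification. All values are illustrative.
import numpy as np

rho = 8220.0                     # density, kg/m^3 (Ni-superalloy, approx.)
nu = 0.30                        # Poisson's ratio
omega = 2 * np.pi * 12000 / 60   # 12,000 rpm in rad/s
b = 0.25                         # outer radius, m

r = np.linspace(0.0, b, 50)
sigma_r = (3 + nu) / 8 * rho * omega**2 * (b**2 - r**2)
sigma_t = rho * omega**2 / 8 * ((3 + nu) * b**2 - (1 + 3 * nu) * r**2)

print(f"max radial stress {sigma_r.max()/1e6:7.1f} MPa (at center)")
print(f"max hoop stress   {sigma_t.max()/1e6:7.1f} MPa (at center)")
```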
Efficient Process Migration for Parallel Processing on Non-Dedicated Networks of Workstations
NASA Technical Reports Server (NTRS)
Chanchio, Kasidit; Sun, Xian-He
1996-01-01
This paper presents the design and preliminary implementation of MpPVM, a software system that supports process migration for PVM application programs in a non-dedicated heterogeneous computing environment. New concepts of migration point as well as migration point analysis and necessary data analysis are introduced. In MpPVM, process migrations occur only at previously inserted migration points. Migration point analysis determines appropriate locations to insert migration points; necessary data analysis provides a minimum set of variables to be transferred at each migration point. A new methodology to perform reliable point-to-point data communications in a migration environment is also discussed. Finally, a preliminary implementation of MpPVM and its experimental results are presented, showing the correctness and promising performance of our process migration mechanism in a scalable non-dedicated heterogeneous computing environment. While MpPVM is developed on top of PVM, the process migration methodology introduced in this study is general and can be applied to any distributed software environment.
Robinson, Claire H; Annis, Ann M; Forman, Jane; Krein, Sarah L; Yankey, Nicholas; Duffy, Sonia A; Taylor, Beth; Sales, Anne E
2016-08-01
To assess implementation of the Veterans Health Administration staffing methodology directive. In 2010, the Veterans Health Administration promulgated a staffing methodology directive for inpatient nursing units to address staffing and budget forecasting. A qualitative multi-case evaluation approach was used to assess staffing methodology implementation. Semi-structured telephone interviews were conducted from March to June 2014 with Nurse Executives and their teams at 21 facilities. Interviews focused on the budgeting process, implementation experiences, use of data, leadership support, and training. An implementation score was created for each facility using a 4-point rating scale. The scores were used to select three facilities (low, medium and high implementation) for more detailed case studies. After analysing interview summaries, the evaluation team developed a four-domain scoring structure: (1) integration of staffing methodology into budget development; (2) implementation of the Directive elements; (3) engagement of leadership and staff; and (4) use of data to support the staffing methodology process. The high-implementation facility had leadership understanding and endorsement of staffing methodology, confidence in and ability to work with data, and integration of staffing methodology results into the budgeting process. The low-implementation facility reported poor leadership engagement and little understanding of data sources and interpretation. Implementation varies widely across facilities. Implementing staffing methodology in facilities with complex and changing staffing needs requires substantial commitment at all organizational levels, especially for facilities that have traditionally relied on historical levels to budget for staffing. Published 2016. This article is a U.S. Government work and is in the public domain in the USA.
Transmission line relay mis-operation detection based on time-synchronized field data
Esmaeilian, Ahad; Popovic, Tomo; Kezunovic, Mladen
2015-05-04
In this paper, a real-time tool to detect transmission line relay mis-operation is implemented. The tool uses time-synchronized measurements obtained from both ends of the line during disturbances. The proposed fault analysis tool comes into the picture only after the protective device has operated and tripped the line. The proposed methodology is able not only to detect, classify, and locate transmission line faults, but also to accurately confirm whether the line was tripped due to a mis-operation of protective relays. The analysis report includes either a detailed description of the fault type and location or detection of relay mis-operation. As such, it can be a source of very useful information to support system restoration. The focus of the paper is on the implementation requirements that allow practical application of the methodology, which is illustrated using field data obtained from a real power system. Testing and validation are done using field data recorded by digital fault recorders and protective relays. The test data included several hundred event records corresponding to both relay mis-operations and actual faults. The discussion of results addresses various challenges encountered during the implementation and validation of the presented methodology.
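A common two-end synchronized fault-location principle (stated here as a hedged sketch, not necessarily the paper's exact algorithm) equates the fault-point voltage computed from each line end:

```python
# Hedged sketch of a standard two-end synchronized fault-location principle:
# with phasors from both ends, the voltage computed at the fault point from
# each end must agree. Short-line model; shunt capacitance neglected.
import numpy as np

Z_line = complex(2.0, 18.0)   # total series impedance of the line, ohms

def locate_fault(Vs, Is, Vr, Ir):
    """Solve Vs - m*Z*Is = Vr - (1-m)*Z*Ir for per-unit distance m."""
    m = (Vs - Vr + Z_line * Ir) / (Z_line * (Is + Ir))
    return m.real

# Synthetic phasors consistent with a fault at m = 0.3 from the sending end.
m_true, Vf, If_s, If_r = 0.3, 30e3 + 0j, 900 - 300j, 500 - 150j
Vs = Vf + m_true * Z_line * If_s
Vr = Vf + (1 - m_true) * Z_line * If_r
print(f"estimated fault location: {locate_fault(Vs, If_s, Vr, If_r):.3f} p.u.")
```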
Implementing a Social-Ecological Model of Health in Wales
ERIC Educational Resources Information Center
Rothwell, Heather; Shepherd, Michael; Murphy, Simon; Burgess, Stephen; Townsend, Nick; Pimm, Claire
2010-01-01
Purpose: The purpose of this paper is to assess the implementation of the Welsh Network of Healthy School Schemes (WNHSS) at national, local and school levels, using a systems approach drawing on the Ottawa Charter. Design/methodology/approach: The approach takes the form of a single-case study using data from a documentary analysis, interviews…
Towards adaptive and integrated management paradigms to meet the challenges of water governance.
Halbe, J; Pahl-Wostl, C; Sendzimir, J; Adamowski, J
2013-01-01
Integrated Water Resource Management (IWRM) aims at finding practical and sustainable solutions to water resource issues. Research and practice have shown that innovative methods and tools are not sufficient to implement IWRM - the concept needs to also be integrated in prevailing management paradigms and institutions. Water governance science addresses this human dimension by focusing on the analysis of regulatory processes that influence the behavior of actors in water management systems. This paper proposes a new methodology for the integrated analysis of water resources management and governance systems in order to elicit and analyze case-specific management paradigms. It builds on the Management and Transition Framework (MTF) that allows for the examination of structures and processes underlying water management and governance. The new methodology presented in this paper combines participatory modeling and analysis of the governance system by using the MTF to investigate case-specific management paradigms. The linking of participatory modeling and research on complex management and governance systems allows for the transfer of knowledge between scientific, policy, engineering and local communities. In this way, the proposed methodology facilitates assessment and implementation of transformation processes towards IWRM that require also the adoption of adaptive management principles. A case study on flood management in the Tisza River Basin in Hungary is provided to illustrate the application of the proposed methodology.
Multiphysics Analysis of a Solid-Core Nuclear Thermal Engine Thrust Chamber
NASA Technical Reports Server (NTRS)
Wang, Ten-See; Canabal, Francisco; Cheng, Gary; Chen, Yen-Sen
2006-01-01
The objective of this effort is to develop an efficient and accurate thermo-fluid computational methodology to predict environments for a hypothetical solid-core, nuclear thermal engine thrust chamber. The computational methodology is based on an unstructured-grid, pressure-based computational fluid dynamics methodology. Formulations for heat transfer in solids and porous media were implemented and anchored. A two-pronged approach was employed in this effort: A detailed thermo-fluid analysis on a multi-channel flow element for mid-section corrosion investigation; and a global modeling of the thrust chamber to understand the effect of hydrogen dissociation and recombination on heat transfer and thrust performance. The formulations and preliminary results on both aspects are presented.
Predicting the Reliability of Ceramics Under Transient Loads and Temperatures With CARES/Life
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Jadaan, Osama M.; Palfi, Tamas; Baker, Eric H.
2003-01-01
A methodology is shown for predicting the time-dependent reliability of ceramic components against catastrophic rupture when subjected to transient thermomechanical loads (including cyclic loads). The methodology takes into account the changes in material response that can occur with temperature or time (i.e., changing fatigue and Weibull parameters with temperature or time). This capability has been added to the NASA CARES/Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code. The code has been modified to have the ability to interface with commercially available finite element analysis (FEA) codes executed for transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.
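A deliberately crude sketch of temperature-dependent Weibull reliability over a transient history is given below; CARES/Life itself does far more (e.g., slow-crack-growth and multiaxial treatment), and every parameter here is hypothetical:

```python
# Simplified illustration in the spirit of the methodology above: Weibull
# parameters interpolated with temperature, each load step treated as an
# independent weakest-link trial. All parameters are hypothetical.
import numpy as np

# Temperature-dependent Weibull parameters (modulus m, scale s0 in MPa).
T_tab = np.array([20.0, 800.0, 1200.0])
m_tab = np.array([12.0, 10.0, 8.0])
s0_tab = np.array([500.0, 420.0, 350.0])

def survival(stress_mpa, temp_c):
    m = np.interp(temp_c, T_tab, m_tab)
    s0 = np.interp(temp_c, T_tab, s0_tab)
    return np.exp(-(stress_mpa / s0) ** m)

# Transient history: (stress MPa, temperature C) per step of a thermal cycle.
history = [(150.0, 20.0), (260.0, 600.0), (300.0, 1100.0), (180.0, 400.0)]
R = 1.0
for stress, temp in history:
    R *= survival(stress, temp)
print(f"probability of failure over the cycle: {1 - R:.2e}")
```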
Romanelli, Asunción; Massone, Héctor E; Escalante, Alicia H
2011-09-01
This article gives an account of the implementation of a stakeholder analysis framework at La Brava Wetland Basin, Argentina, in a common-pool resource (CPR) management context. Firstly, the context in which the stakeholder framework was implemented is described. Secondly, a four-step methodology is applied: (1) stakeholder identification, (2) stakeholder differentiation and categorization, (3) investigation of stakeholders' relationships, and (4) analysis of social-biophysical interdependencies. This methodology classifies stakeholders according to their level of influence on the system and their potential in the conservation of natural resources. The main influential stakeholders are La Brava Village residents and tourism-related entrepreneurs, who are empowered to make the most important decisions within the planning process of the ecosystem. While these key players are seen as facilitators of change, there are other groups (residents of the inner basin and fishermen) that are seen mainly as key blockers. The applied methodology for the stakeholder analysis and the evaluation of social-biophysical interdependencies carried out in this article can be seen as an encouraging example for other experts in the natural sciences to learn and use these methods developed in the social sciences. Major difficulties of, and some recommendations for, applying this method in practice by non-experts are discussed.
de Paiva, Anderson Paulo
2018-01-01
This research evaluates the influence of the Brazilian accreditation methodology on the sustainability of organizations. Critical factors for implementing accreditation were also examined, including measurement of the relationships established between these factors and organizational sustainability. The study was developed based on a survey methodology applied in organizations accredited by ONA (National Accreditation Organization); 288 responses were received from top-level managers. Quantitative data from the measurement models were analysed using principal components factor analysis. The final model was evaluated using confirmatory factor analysis and structural equation modeling techniques. The results of the research are vital for the definition of factors that interfere in accreditation processes, providing a better understanding for accredited organizations and for Brazilian accreditation.
Analysis of CrIs/ATMS Using AIRS Version-7 Retrieval and QC Methodology
NASA Technical Reports Server (NTRS)
Susskind, Joel; Kouvaris, Louis; Blaisdell, John M.; Iredell, Lena
2017-01-01
The objective of this research is to develop and implement an algorithm to analyze a long-term data record of CrIS/ATMS observations so as to produce monthly mean gridded Level-3 products which are consistent with, and will serve as a seamless follow-on to, those of AIRS Version-7. We feel the best way to achieve this result is to analyze CrIS/ATMS data using retrieval and Quality Control (QC) methodologies which are scientifically equivalent to those used in AIRS Version-7. We developed and implemented a single retrieval program that uses as input either AIRS/AMSU or CrIS/ATMS radiance observations, and has appropriate switches that take into account the spectral and radiometric differences between CrIS and AIRS. Our methodology is called CHART (Climate Heritage AIRS Retrieval Technique).
H-P adaptive methods for finite element analysis of aerothermal loads in high-speed flows
NASA Technical Reports Server (NTRS)
Chang, H. J.; Bass, J. M.; Tworzydlo, W.; Oden, J. T.
1993-01-01
The commitment to develop the National Aerospace Plane and Maneuvering Reentry Vehicles has generated resurgent interest in the technology required to design structures for hypersonic flight. The principal objective of this research and development effort has been to formulate and implement a new class of computational methodologies for accurately predicting fine-scale phenomena associated with this class of problems. The initial focus of this effort was to develop optimal h-refinement and p-enrichment adaptive finite element methods which utilize a posteriori estimates of the local errors to drive the adaptive methodology. Over the past year this work has specifically focused on two issues related to the overall performance of a flow solver. These issues include the formulation and implementation (in two dimensions) of an implicit/explicit flow solver compatible with the hp-adaptive methodology, and the design and implementation of a computational algorithm for automatically selecting optimal directions in which to enrich the mesh. These concepts and algorithms have been implemented in a two-dimensional finite element code and used to solve three hypersonic flow benchmark problems (Holden Mach 14.1, Edney shock-on-shock interaction Mach 8.03, and the viscous backstep Mach 4.08).
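The error-driven refinement loop at the heart of such h-adaptive methods can be sketched in a few lines; the error indicator below is a stand-in for a true a posteriori estimate from the flow solution:

```python
# Minimal sketch of an error-driven h-refinement loop: estimate a local error
# per element and bisect where it is large. The "estimator" is a stand-in for
# a true a posteriori estimate.
import numpy as np

def refine(elements, estimate, theta=0.5, rounds=3):
    """Bisect every element whose error indicator exceeds theta * max."""
    for _ in range(rounds):
        eta = np.array([estimate(a, b) for a, b in elements])
        marked = eta > theta * eta.max()
        new = []
        for (a, b), flag in zip(elements, marked):
            if flag:
                mid = 0.5 * (a + b)
                new += [(a, mid), (mid, b)]   # h-refinement by bisection
            else:
                new.append((a, b))
        elements = new
    return elements

# Stand-in indicator: variation of a steep, shock-like profile per element.
f = lambda x: np.tanh(50 * (x - 0.6))
est = lambda a, b: abs(f(b) - f(a)) * (b - a)

mesh = refine([(i / 10, (i + 1) / 10) for i in range(10)], est)
print(f"{len(mesh)} elements; smallest h = {min(b - a for a, b in mesh):.4f}")
```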
A normative price for a manufactured product: The SAMICS methodology. Volume 2: Analysis
NASA Technical Reports Server (NTRS)
Chamberlain, R. G.
1979-01-01
The Solar Array Manufacturing Industry Costing Standards provide standard formats, data, assumptions, and procedures for determining the price a hypothetical solar array manufacturer would have to be able to obtain in the market to realize a specified after-tax rate of return on equity for a specified level of production. The methodology and its theoretical background are presented. The model is sufficiently general to be used in any production-line manufacturing environment. Implementation of this methodology by the Solar Array Manufacturing Industry Simulation computer program is discussed.
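In greatly simplified form, a normative price is the unit price that just recovers annualized capital, at the required rate of return, plus operating cost. The sketch below omits the tax and inflation accounts SAMICS actually models; all figures are hypothetical:

```python
# Much-simplified sketch of the normative-price idea: the price per unit that
# recovers annualized capital (at the required return) plus operating cost.
# SAMICS itself handles taxes, inflation and many more accounts.

def capital_recovery_factor(rate, years):
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

capex = 5_000_000.0          # plant investment, USD (hypothetical)
opex_per_year = 800_000.0    # labor, materials, utilities, USD/yr
rate, lifetime = 0.15, 20    # required return and plant life
units_per_year = 250_000.0   # annual production volume

annualized = capex * capital_recovery_factor(rate, lifetime) + opex_per_year
print(f"normative price: ${annualized / units_per_year:.2f} per unit")
```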
Implementing a Quality Management Framework in a Higher Education Organisation: A Case Study
ERIC Educational Resources Information Center
O'Mahony, Kim; Garavan, Thomas N.
2012-01-01
Purpose: This paper aims to report and analyse the lessons learned from a case study on the implementation of a quality management system within an IT Division in a higher education (HE) organisation. Design/methodology/approach: The paper is based on a review of the relevant literatures and the use of primary sources such as document analysis,…
ERIC Educational Resources Information Center
Štofková, Katarína; Strícek, Ivan; Štofková, Jana
2014-01-01
The paper aims to evaluate the possibility of applying new methods and tools for more effective educational processes, with an emphasis on increasing their quality, aimed especially at educational processes in secondary schools and universities. There are some contributions from practice for the effective implementation of time management, such…
ERIC Educational Resources Information Center
Bradford, Deborah J.
2010-01-01
The purpose of the study was to understand and appreciate the methodologies and procedures used in determining the extent to which an information technology (IT) organization within the eleven member State University Systems (SUS) of Florida planned, implemented, and diffused emerging educational technologies. Key findings found how critical it…
NASA Technical Reports Server (NTRS)
Bogert, Philip B.; Satyanarayana, Arunkumar; Chunchu, Prasad B.
2006-01-01
Splitting, ultimate failure load and the damage path in center-notched composite specimens subjected to in-plane tension loading are predicted using a progressive failure analysis methodology. A 2-D Hashin-Rotem failure criterion is used in determining intra-laminar fiber and matrix failures. This progressive failure methodology has been implemented in the Abaqus/Explicit and Abaqus/Standard finite element codes through the user-written subroutines "VUMAT" and "USDFLD", respectively. A 2-D finite element model is used for predicting the intra-laminar damage. Analysis results obtained from the Abaqus/Explicit and Abaqus/Standard codes show good agreement with experimental results. The importance of modeling delamination in the progressive failure analysis methodology is recognized for future studies. The use of an explicit integration dynamics code for simple specimen geometry and static loading establishes a foundation for future analyses, where complex loading and nonlinear dynamic interactions of damage and structure will necessitate it.
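The 2-D Hashin-Rotem check itself is compact, which is why it fits naturally inside a VUMAT/USDFLD user subroutine. A sketch with illustrative strength values:

```python
# Sketch of the 2-D Hashin-Rotem check for intra-laminar damage: separate
# fiber and matrix criteria per ply, as one would code inside a user
# subroutine. Strength values are illustrative, not the paper's material data.

XT, XC = 2280.0, 1440.0   # fiber-direction strengths, MPa
YT, YC = 57.0, 228.0      # transverse strengths, MPa
S12 = 71.0                # in-plane shear strength, MPa

def hashin_rotem(s11, s22, t12):
    """Return (fiber_failed, matrix_failed) for one ply stress state (MPa)."""
    fiber = (s11 / XT) ** 2 >= 1.0 if s11 >= 0 else (s11 / XC) ** 2 >= 1.0
    if s22 >= 0:
        matrix = (s22 / YT) ** 2 + (t12 / S12) ** 2 >= 1.0
    else:
        matrix = (s22 / YC) ** 2 + (t12 / S12) ** 2 >= 1.0
    return fiber, matrix

print(hashin_rotem(1500.0, 45.0, 45.0))   # -> (False, True): matrix cracking
```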
A strategy for developing a launch vehicle system for orbit insertion: Methodological aspects
NASA Astrophysics Data System (ADS)
Klyushnikov, V. Yu.; Kuznetsov, I. I.; Osadchenko, A. S.
2014-12-01
The article addresses methodological aspects of a development strategy to design a launch vehicle system for orbit insertion. The development and implementation of the strategy are broadly outlined. An analysis is provided of the criterial base and input data needed to define the main requirements for the launch vehicle system. Approaches are suggested for solving individual problems in working out the launch vehicle system development strategy.
Development and exploration of a new methodology for the fitting and analysis of XAS data.
Delgado-Jaime, Mario Ulises; Kennepohl, Pierre
2010-01-01
A new data analysis methodology for X-ray absorption near-edge spectroscopy (XANES) is introduced and tested using several examples. The methodology has been implemented within the context of a new Matlab-based program discussed in a companion article [Delgado-Jaime et al. (2010), J. Synchrotron Rad. 17, 132-137]. The approach makes use of a Monte Carlo search method to seek appropriate starting points for a fit model, allowing for the generation of a large number of independent fits with minimal user-induced bias. The applicability of this methodology is tested using various Cl K-edge XAS data sets for tetragonal CuCl₄²⁻, a common reference compound used for calibration and covalency estimation in M-Cl bonds. A new background model function that effectively blends together background profiles with spectral features is an important component of the discussed methodology. The development of a robust evaluation function to fit multiple-edge data is discussed, and the implications regarding standard approaches to data analysis are discussed and explored within these examples.
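The Monte Carlo multi-start idea can be sketched as follows: draw many random starting points within bounds, run a bounded local least-squares fit from each, and keep the whole population of converged fits for inspection. The toy edge model below (arctan step plus a Gaussian peak) and all numbers are assumptions for illustration, not the article's model:

```python
# Sketch of Monte Carlo multi-start fitting: random starting points, a local
# least-squares fit from each, and the population of fits retained so the
# user can inspect it rather than trust a single hand-tuned result.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)
x = np.linspace(2815, 2835, 200)            # energy axis in eV (illustrative)

def model(p, x):
    step, e0, width, amp, ec, sig = p
    return (step * (0.5 + np.arctan((x - e0) / width) / np.pi)
            + amp * np.exp(-0.5 * ((x - ec) / sig) ** 2))

p_true = [1.0, 2826.0, 1.0, 1.5, 2822.5, 0.8]
y = model(p_true, x) + 0.02 * rng.standard_normal(x.size)

lo = [0.5, 2820.0, 0.2, 0.1, 2818.0, 0.2]
hi = [2.0, 2832.0, 3.0, 3.0, 2826.0, 2.0]
fits = []
for _ in range(50):                          # 50 Monte Carlo starting points
    p0 = rng.uniform(lo, hi)
    res = least_squares(lambda p: model(p, x) - y, p0, bounds=(lo, hi))
    fits.append((res.cost, res.x))

best_cost, best_p = min(fits, key=lambda f: f[0])
print("best-fit parameters:", np.round(best_p, 2))
```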
NASA Astrophysics Data System (ADS)
Camacho-Navarro, Jhonatan; Ruiz, Magda; Villamizar, Rodolfo; Mujica, Luis; Moreno-Beltrán, Gustavo; Quiroga, Jabid
2017-05-01
Continuous monitoring for damage detection in structural assessment calls for low-cost equipment and efficient algorithms. This work describes the stages involved in the design of a methodology with high feasibility for use in continuous damage assessment. Specifically, an algorithm based on a data-driven approach using principal component analysis, with acquired signals pre-processed by means of cross-correlation functions, is discussed. A carbon steel pipe section and a laboratory tower were used as test structures to demonstrate the feasibility of the methodology to detect abrupt changes in the structural response when damage occurs. Two damage cases are studied: a crack and a leak, for each structure respectively. Experimental results show that the methodology is promising for the continuous monitoring of real structures.
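A minimal sketch of this pipeline, assuming synthetic signals in place of the measured ones: cross-correlation features, a PCA baseline trained on healthy records, and a residual (Q) statistic to flag damage:

```python
# Sketch of the pipeline with synthetic signals standing in for measured
# ones: cross-correlation features, a PCA baseline fit on "healthy" records,
# and a residual Q statistic to flag damage. All parameters are invented.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

def features(a, b, n_lags=32):
    """Central part of the normalized cross-correlation as a feature vector."""
    c = np.correlate(a - a.mean(), b - b.mean(), mode="full")
    c /= np.abs(c).max()
    mid = c.size // 2                       # zero-lag index
    return c[mid - n_lags: mid + n_lags]

def record(damaged=False):
    t = np.linspace(0, 1, 512)
    a = np.sin(2 * np.pi * 40 * t) + 0.1 * rng.standard_normal(t.size)
    shift = 8 if damaged else 3             # damage alters the wave path
    b = np.roll(a, shift) + 0.1 * rng.standard_normal(t.size)
    return features(a, b)

baseline = np.array([record() for _ in range(40)])
pca = PCA(n_components=5).fit(baseline)

def q_stat(x):
    recon = pca.inverse_transform(pca.transform(x[None, :]))[0]
    return float(np.sum((x - recon) ** 2))

threshold = 1.2 * max(q_stat(x) for x in baseline)
print(f"healthy Q = {q_stat(record()):.3f}  (threshold {threshold:.3f})")
print(f"damaged Q = {q_stat(record(damaged=True)):.3f}")
```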
Mantzoukas, Stefanos
2009-04-01
Evidence-based practice has become an imperative for efficient, effective and safe practice. Furthermore, evidence emerging from published research is considered a valid knowledge source for guiding practice. The aim of this paper is to review all research articles published in the top 10 general nursing journals for the years 2000-2006 to identify the methodologies used, the types of evidence these studies produced and the issues they addressed. Quantitative content analysis was implemented to study all published research papers of the top 10 general nursing journals for the years 2000-2006. The abstracts of all research articles were analysed with regard to the methodologies of enquiry, the types of evidence produced and the issues studied. Percentages were developed to enable conclusions to be drawn. For the category of methodologies used, the results were: 7% experimental, 6% quasi-experimental, 39% non-experimental, 2% ethnographic studies, 7% phenomenological, 4% grounded theory, 1% action research, 1% case study, 15% unspecified, 5.5% other, 0.5% meta-synthesis, 2% meta-analysis, 5% literature reviews and 3% secondary analysis. For the category of types of evidence: 4% hypothesis/theory testing, 11% evaluative, 5% comparative, 2% correlational, 46% descriptive, 5% interpretative and 27% exploratory. For the category of issues studied: 45% practice/clinical, 8% educational, 11% professional, 3% spiritual/ethical/metaphysical, 26% health promotion and 7% managerial/policy. Published studies can provide adequate evidence for practice if nursing journals conceptualise evidence emerging from non-experimental and qualitative studies as relevant types of evidence for practice and develop appropriate mechanisms for assessing their validity. Nursing journals also need to increase and encourage the publication of studies that implement RCT methodology, systematic reviews, meta-synthesis and meta-analysis methodologies. Finally, nursing journals need to encourage more high-quality research evidence deriving from interpretative, theory-testing and evaluative types of studies that are practice relevant.
Tchepel, Oxana; Dias, Daniela
2011-06-01
This study is focused on the assessment of the potential health benefits of meeting the air quality limit values (2008/50/CE) for short-term PM₁₀ exposure. For this purpose, the WHO methodology for Health Impact Assessment and the APHEIS guidelines for data collection were applied to the Porto Metropolitan Area, Portugal. Additionally, an improved methodology using population mobility data is proposed in this work to analyse the number of persons exposed. In order to obtain representative background concentrations, an innovative approach to processing air quality time series was implemented. The results provide the number of attributable cases prevented annually by reducing PM₁₀ concentration. An intercomparison of two approaches to processing input data for the health risk analysis provides information on the sensitivity of the applied methodology. The findings highlight the importance of taking into account the spatial variability of air pollution levels and population mobility in health impact assessment.
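The standard WHO/APHEIS attributable-cases arithmetic the study applies can be written in a few lines; the coefficient and counts below are illustrative, not the Porto values:

```python
# The log-linear concentration-response step used in WHO/APHEIS-style health
# impact assessment. beta, delta_c and the baseline count are illustrative.
import math

beta = 0.0008           # concentration-response coefficient per ug/m3
delta_c = 7.0           # PM10 reduction needed to meet the limit, ug/m3
baseline_cases = 1200   # annual baseline events in the exposed population

rr = math.exp(beta * delta_c)                # relative risk of the excess
attributable_fraction = (rr - 1.0) / rr
prevented = attributable_fraction * baseline_cases
print(f"RR = {rr:.4f}; ~{prevented:.0f} attributable cases prevented per year")
```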
Discrete Adjoint-Based Design for Unsteady Turbulent Flows On Dynamic Overset Unstructured Grids
NASA Technical Reports Server (NTRS)
Nielsen, Eric J.; Diskin, Boris
2012-01-01
A discrete adjoint-based design methodology for unsteady turbulent flows on three-dimensional dynamic overset unstructured grids is formulated, implemented, and verified. The methodology supports both compressible and incompressible flows and is amenable to massively parallel computing environments. The approach provides a general framework for performing highly efficient and discretely consistent sensitivity analysis for problems involving arbitrary combinations of overset unstructured grids which may be static, undergoing rigid or deforming motions, or any combination thereof. General parent-child motions are also accommodated, and the accuracy of the implementation is established using an independent verification based on a complex-variable approach. The methodology is used to demonstrate aerodynamic optimizations of a wind turbine geometry, a biologically-inspired flapping wing, and a complex helicopter configuration subject to trimming constraints. The objective function for each problem is successfully reduced and all specified constraints are satisfied.
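The complex-variable verification mentioned above rests on the complex-step derivative, which avoids subtractive cancellation so the step size can be made tiny. A self-contained sketch:

```python
# Complex-step derivative check: f'(x) ~ Im(f(x + ih)) / h is free of
# subtractive cancellation, so h can be tiny. The function here is a
# stand-in for an objective whose adjoint sensitivity is being verified.
import cmath

def f(x):
    return x * cmath.exp(x) / cmath.sqrt(x * x + 1)

x0, h = 1.3, 1e-30
complex_step = f(complex(x0, h)).imag / h
central_diff = (f(x0 + 1e-6).real - f(x0 - 1e-6).real) / 2e-6
print(f"complex step : {complex_step:.15f}")
print(f"central diff : {central_diff:.15f}")  # matches to ~9 digits only
```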
NASA Astrophysics Data System (ADS)
Polatidis, Heracles; Morales, Jan Borràs
2016-11-01
In this paper a methodological framework for increasing the actual implementation potential of wind farm proposals is developed and applied. The framework is based on multi-criteria decision aid techniques that perform an integrated technical and societal evaluation of a number of potential wind power projects, each a variation of a pre-existing actual proposal that faces implementation difficulties. A number of evaluation criteria are established and assessed via particular related software, or are comparatively evaluated among each other on a semi-qualitative basis. The preferences of a diverse audience of pertinent stakeholders can also be incorporated in the overall analysis. The result of the process is the identification of a new project that exhibits increased actual implementation potential compared with the original proposal. The methodology is tested in a case study of a wind farm in the UK and relevant conclusions are drawn.
Introducing a new bond reactivity index: Philicities for natural bond orbitals.
Sánchez-Márquez, Jesús; Zorrilla, David; García, Víctor; Fernández, Manuel
2017-12-22
In the present work, a new methodology for obtaining reactivity indices (philicities) is proposed. It is based on reactivity functions such as the Fukui function or the dual descriptor, and makes it possible to project the information from reactivity functions onto molecular orbitals, instead of onto the atoms of the molecule (atomic reactivity indices). The methodology focuses on the molecules' natural bond orbitals (bond reactivity indices) because these orbitals have the advantage of being localized, allowing the reaction site of an electrophile or nucleophile to be determined within a very precise molecular region. This methodology provides a "philicity" index for every NBO, and a representative set of molecules has been used to test the new definition. A new methodology has also been developed to compare the "finite difference" and "frontier molecular orbital" approximations. To facilitate their use, the proposed methodology, as well as the calculation of the new indices, has been implemented in a new version of the UCA-FUKUI software. In addition, condensation schemes based on atomic populations from the "atoms in molecules" theory, the Hirshfeld population analysis, the Mulliken approximation (with a minimal basis set) and electrostatic potential-derived charges have also been implemented, including the calculation of "bond reactivity indices" defined in previous studies.
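For comparison, the "finite difference" condensation mentioned above reduces to differences of atomic charges between the N-1, N and N+1 electron systems. A sketch with made-up charges:

```python
# Finite-difference condensation of Fukui functions onto atoms: differences
# of atomic charges between the N-1, N and N+1 electron systems. The charges
# below are made-up placeholders; in practice they come from AIM, Hirshfeld,
# Mulliken or ESP-derived population analyses.
import numpy as np

q_cation  = np.array([+0.65, +0.30, +0.05])   # q(N-1), hypothetical
q_neutral = np.array([+0.40, +0.10, -0.50])   # q(N)
q_anion   = np.array([+0.25, -0.20, -1.05])   # q(N+1)

f_plus  = q_neutral - q_anion     # susceptibility to nucleophilic attack
f_minus = q_cation - q_neutral    # susceptibility to electrophilic attack
dual    = f_plus - f_minus        # condensed dual descriptor

for i, (fp, fm, d) in enumerate(zip(f_plus, f_minus, dual)):
    print(f"atom {i}: f+ = {fp:+.2f}  f- = {fm:+.2f}  dual = {d:+.2f}")
```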
An evaluation of the directed flow graph methodology
NASA Technical Reports Server (NTRS)
Snyder, W. E.; Rajala, S. A.
1984-01-01
The applicability of the Directed Graph Methodology (DGM) to the design and analysis of special-purpose image and signal processing hardware was evaluated. A special-purpose image processing system was designed and described using DGM. The design, suitable for very large scale integration (VLSI), implements a region labeling technique. Two computer chips were designed, both using metal-nitride-oxide-silicon (MNOS) technology, as well as a functional system utilizing those chips to perform real-time region labeling. The system is described in terms of DGM primitives. As currently implemented, DGM is inappropriate for describing synchronous, tightly coupled, special-purpose systems. The nature of the DGM formalism lends itself more readily to modeling networks of general-purpose processors.
STEM_CELL: a software tool for electron microscopy: part 2--analysis of crystalline materials.
Grillo, Vincenzo; Rossi, Francesca
2013-02-01
A new graphical software package (STEM_CELL) for the analysis of HRTEM and STEM-HAADF images is introduced here in detail. The advantage of the software, beyond its graphical interface, is that it brings together different analysis algorithms and simulation (described in an associated article) to produce novel analysis methodologies. Several implementations of, and improvements to, state-of-the-art approaches are reported for image analysis, filtering, normalization and background subtraction. In particular, two important methodological results are highlighted: (i) the definition of a procedure for atomic-scale quantitative analysis of HAADF images, and (ii) the extension of geometric phase analysis to large regions, up to potentially 1 μm, through the use of undersampled images with aliasing effects. Copyright © 2012 Elsevier B.V. All rights reserved.
Corridor-based forecasts of work-zone impacts for freeways.
DOT National Transportation Integrated Search
2011-08-09
This project developed an analysis methodology and associated software implementation for the evaluation of : significant work zone impacts on freeways in North Carolina. The FREEVAL-WZ software tool allows the analyst : to predict the operational im...
Gil, Modesta Inmaculada; Benrimoj, Shalom Isaac; Martínez-Martínez, Fernando; Cardero, Manuel; Gastelurrutia, Miguel Ángel
2013-01-01
To prioritize facilitators, previously identified in Spain, for the implementation of new pharmaceutical services, so as to allow the design of strategies for the implementation of the Medication Review with follow-up (MRFup) service. Exploratory factor analysis (EFA). A draft questionnaire was prepared based on a previous literature review and following the RAND/UCLA methodology. An expert panel worked with it and generated a definitive questionnaire which, after piloting, was administered to a representative sample of pharmacists, owners or staff members working in community pharmacy, using computer-assisted telephone interviewing (CATI) methodology. To understand the underlying constructs in the questionnaire, an EFA was performed. Different approaches were tested, such as principal components factor analysis and the principal axis factoring method. The best interpretability was achieved using principal axis factoring with direct oblimin rotation, which explained 40.0% of the total variance. This produced four factors, defined as: «Incentives», «External campaigns», «Expert in MRFup» and «Professionalism of the pharmacist». It can be stated that for the implementation and sustainability of the MRFup service, payment is necessary; the service must also be explained to health professionals and society in general. Practice of the MRFup service demands that pharmacists receive a more clinical education and assume more responsibilities as health professionals. Copyright © 2012 Elsevier España, S.L. All rights reserved.
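A sketch of this EFA pipeline using the third-party factor_analyzer package (an assumed tool choice; any tool offering principal-axis extraction with oblimin rotation would serve), with simulated stand-ins for the questionnaire data:

```python
# Sketch of the reported EFA pipeline using the third-party factor_analyzer
# package (an assumed tool choice). The simulated responses below stand in
# for the 288 CATI questionnaire records; the loadings are invented.
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer

rng = np.random.default_rng(42)
loadings = np.zeros((12, 4))
for f in range(4):                       # each factor drives 3 of 12 items
    loadings[3 * f:3 * f + 3, f] = rng.uniform(0.6, 0.9, 3)
latent = rng.standard_normal((288, 4))
items = latent @ loadings.T + 0.5 * rng.standard_normal((288, 12))
df = pd.DataFrame(items, columns=[f"item{i + 1}" for i in range(12)])

# Principal axis factoring with direct oblimin rotation, four factors.
fa = FactorAnalyzer(n_factors=4, method="principal", rotation="oblimin")
fa.fit(df)
print(pd.DataFrame(fa.loadings_, index=df.columns).round(2))
print("cumulative variance explained:",
      fa.get_factor_variance()[2][-1].round(3))
```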
ERIC Educational Resources Information Center
Beach, Mary G.
2014-01-01
In this research study, the principal investigator used the methodology of auto-ethnography, interviews, and a critical perspective to analyze the effects of the implementation of high-stakes testing on educators and the classroom environment. In accessing reflective journals and interviews, the researcher gains insight into the dynamics of…
Parallel processing in a host plus multiple array processor system for radar
NASA Technical Reports Server (NTRS)
Barkan, B. Z.
1983-01-01
A host plus multiple array processor architecture is demonstrated to yield a modular, fast, and cost-effective system for radar processing. A software methodology for programming such a system is developed. Parallel processing with pipelined data flow among the host, array processors, and discs is implemented. A theoretical analysis of performance is made and experimentally verified. The broad class of problems to which the architecture and methodology can be applied is indicated.
ERIC Educational Resources Information Center
Diesel, Vivien; Miná Dias, Marcelo
2016-01-01
Purpose: To analyze the Brazilian experience in designing and implementing a recent extension policy reform based on agroecology, and reflect on its wider theoretical implications for extension reform literature. Design/methodology/approach: Using a critical public analysis we characterize the evolution of Brazilian federal extension policy…
Transient loads analysis for space flight applications
NASA Technical Reports Server (NTRS)
Thampi, S. K.; Vidyasagar, N. S.; Ganesan, N.
1992-01-01
A significant part of the flight readiness verification process involves transient analysis of the coupled Shuttle-payload system to determine the low frequency transient loads. This paper describes a methodology for transient loads analysis and its implementation for the Spacelab Life Sciences Mission. The analysis is carried out using two major software tools - NASTRAN and an external FORTRAN code called EZTRAN. This approach is adopted to overcome some of the limitations of NASTRAN's standard transient analysis capabilities. The method uses Data Recovery Matrices (DRM) to improve computational efficiency. The mode acceleration method is fully implemented in the DRM formulation to recover accurate displacements, stresses, and forces. The advantages of the method are demonstrated through a numerical example.
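The benefit of the mode acceleration method used in the DRM formulation can be seen in a small undamped example: augmenting a truncated modal solution with the static response recovers accurate displacements with few modes. This is the general principle only, not the Spacelab implementation, and all matrices and loads below are invented:

```python
# Small undamped 3-DOF illustration of the mode acceleration method with
# mass-normalized modes; matrices and loads are invented, not mission data.
import numpy as np

K = np.array([[2., -1., 0.], [-1., 2., -1.], [0., -1., 1.]]) * 1e5
M = np.eye(3) * 10.0
w2, Phi = np.linalg.eigh(np.linalg.solve(M, K))  # M is 10*I, so M^-1 K is
Phi /= np.sqrt(10.0)                             # symmetric; mass-normalize

F = np.array([0.0, 0.0, 100.0])                  # harmonic load amplitude
Omega = 0.5 * np.sqrt(w2[0])                     # excite below mode 1
q = (Phi.T @ F) / (w2 - Omega**2)                # steady-state modal amplitudes

kept = [0]                                       # truncate to one mode
u_exact = Phi @ q
u_md = Phi[:, kept] @ q[kept]                    # mode displacement method
u_ma = (np.linalg.solve(K, F)                    # static response plus a
        + Phi[:, kept] @ (q[kept]                # dynamic correction from
                          - (Phi[:, kept].T @ F) / w2[kept]))  # kept modes

for name, u in [("exact", u_exact), ("mode displacement", u_md),
                ("mode acceleration", u_ma)]:
    print(f"{name:>17}: {np.round(u * 1e3, 4)} mm")
```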
Spanish methodological approach for biosphere assessment of radioactive waste disposal.
Agüero, A; Pinedo, P; Cancio, D; Simón, I; Moraleda, M; Pérez-Sánchez, D; Trueba, C
2007-10-01
The development of radioactive waste disposal facilities requires implementation of measures that will afford protection of human health and the environment over a specific temporal frame that depends on the characteristics of the wastes. The repository design is based on a multi-barrier system: (i) the near-field or engineered barrier, (ii) the far-field or geological barrier and (iii) the biosphere system. Here, the focus is on the analysis of this last system, the biosphere. A description is provided of the conceptual developments, methodological aspects and software tools used to develop the Biosphere Assessment Methodology in the context of high-level waste (HLW) disposal facilities in Spain. This methodology is based on the BIOMASS "Reference Biospheres Methodology" and provides a logical and systematic approach with supplementary documentation that helps to support the decisions necessary for model development. It follows a five-stage approach, such that a coherent biosphere system description and the corresponding conceptual, mathematical and numerical models can be built. A discussion of the improvements implemented through application of the methodology to case studies in international and national projects is included. Some facets of this methodological approach still require further consideration, principally an enhanced integration of climatology, geography and ecology into models considering the evolution of the environment, some aspects of the interface between the geosphere and the biosphere, and an accurate quantification of environmental change processes and rates.
Analysis of effects of impurities intentionally incorporated into silicon
NASA Technical Reports Server (NTRS)
Uno, F.
1977-01-01
A methodology was developed and implemented to allow silicon samples containing intentionally incorporated impurities to be fabricated into finished solar cells under carefully controlled conditions. The electrical and spectral properties were then measured for each group processed.
Integrated corridor management modeling results report : Dallas, Minneapolis, and San Diego.
DOT National Transportation Integrated Search
2012-02-01
This executive summary documents the analysis methodologies, tools, and performance measures used to analyze Integrated Corridor Management (ICM) strategies; and presents high-level results for the successful implementation of ICM at three Stage 2 Pi...
Population-Level Cost-Effectiveness of Implementing Evidence-Based Practices into Routine Care
Fortney, John C; Pyne, Jeffrey M; Burgess, James F
2014-01-01
Objective: The objective of this research was to apply a new methodology (population-level cost-effectiveness analysis) to determine the value of implementing an evidence-based practice in routine care. Data Sources/Study Setting: Data are from sequentially conducted studies: a randomized controlled trial and an implementation trial of collaborative care for depression. Both trials were conducted in the same practice setting and population (primary care patients prescribed antidepressants). Study Design: The study combined results from a randomized controlled trial and a pre-post quasi-experimental implementation trial. Data Collection/Extraction Methods: The randomized controlled trial collected quality-adjusted life years (QALYs) from survey and medication possession ratios (MPRs) from administrative data. The implementation trial collected MPRs and intervention costs from administrative data and implementation costs from survey. Principal Findings: In the randomized controlled trial, MPRs were significantly correlated with QALYs (p = .03). In the implementation trial, patients at implementation sites had significantly higher MPRs (p = .01) than patients at control sites, and by extrapolation higher QALYs (0.00188). Total costs (implementation, intervention) were nonsignificantly higher ($63.76) at implementation sites. The incremental population-level cost-effectiveness ratio was $33,905.92/QALY (bootstrap interquartile range −$45,343.10/QALY to $99,260.90/QALY). Conclusions: The methodology was feasible to operationalize and gave reasonable estimates of implementation value.
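The headline ratio is simple arithmetic on the incremental cost and QALY estimates; the sketch below reproduces it (from rounded inputs, so it differs slightly from the reported $33,905.92/QALY) and illustrates a bootstrap interquartile range with invented per-patient data:

```python
# Reproduces the headline arithmetic and illustrates the bootstrap step.
# The per-patient arrays are invented; only delta_cost and delta_qaly come
# from the abstract (rounded, hence the small difference from $33,905.92).
import numpy as np

delta_cost, delta_qaly = 63.76, 0.00188
print(f"point ICER: ${delta_cost / delta_qaly:,.2f}/QALY")

rng = np.random.default_rng(7)
costs = rng.normal(delta_cost, 40.0, 500)    # hypothetical incremental costs
qalys = rng.normal(delta_qaly, 0.004, 500)   # hypothetical incremental QALYs

icers = []
for _ in range(2000):                        # nonparametric bootstrap
    idx = rng.integers(0, 500, 500)
    icers.append(costs[idx].mean() / qalys[idx].mean())
lo_q, hi_q = np.percentile(icers, [25, 75])
print(f"bootstrap IQR: ${lo_q:,.0f}/QALY to ${hi_q:,.0f}/QALY")
```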
NASA Astrophysics Data System (ADS)
Dadashzadeh, N.; Duzgun, H. S. B.; Yesiloglu-Gultekin, N.
2017-08-01
While advanced numerical techniques in slope stability analysis are successfully used in deterministic studies, they have so far found limited use in probabilistic analyses due to their high computational cost. The first-order reliability method (FORM) is one of the most efficient probabilistic techniques for performing probabilistic stability analysis while considering the associated uncertainties in the analysis parameters. However, it is not possible to use FORM directly in numerical slope stability evaluations, as it requires the definition of a limit state performance function. In this study, an integrated methodology for probabilistic numerical modeling of rock slope stability is proposed. The methodology is based on the response surface method, whereby an explicit performance function is developed from the results of numerical simulations for use with FORM. The implementation of the proposed methodology is demonstrated by considering a large potential rock wedge at Sumela Monastery, Turkey. The accuracy of the developed performance function in truly representing the limit state surface is evaluated by monitoring the slope behavior. The calculated probability of failure is compared with the Monte Carlo simulation (MCS) method. The proposed methodology is found to be 72% more efficient than MCS, at the cost of a 24% error in accuracy.
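A compact sketch of the proposed chain, with a stand-in function in place of the numerical slope model and two standard-normal inputs assumed for brevity: fit a quadratic response surface to factor-of-safety samples, define g = FS - 1, and run the Hasofer-Lind iteration:

```python
# Sketch of the chain with a stand-in "numerical model": fit a quadratic
# response surface to factor-of-safety (FS) samples, set the performance
# function g = FS - 1, and find the FORM reliability index with the
# Hasofer-Lind/Rackwitz-Fiessler iteration. All coefficients are invented.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)

def run_model(u):
    """Stand-in for a numerical slope-stability run returning FS."""
    x1, x2 = u
    return 1.6 + 0.25 * x1 - 0.35 * x2 + 0.05 * x1 * x2 - 0.04 * x2**2

# 1) Sample the model and fit a quadratic response surface.
U = rng.uniform(-3, 3, (60, 2))
FS = np.array([run_model(u) for u in U])
basis = lambda x1, x2: [1, x1, x2, x1**2, x2**2, x1 * x2]
A = np.array([basis(x1, x2) for x1, x2 in U])
coef, *_ = np.linalg.lstsq(A, FS, rcond=None)

def g(u):                    # explicit performance function: fail if FS < 1
    return coef @ np.array(basis(*u)) - 1.0

def grad_g(u, h=1e-6):
    return np.array([(g(u + h * e) - g(u - h * e)) / (2 * h)
                     for e in np.eye(2)])

# 2) HL-RF iteration in standard normal space.
u = np.zeros(2)
for _ in range(100):
    gr = grad_g(u)
    u_new = (gr @ u - g(u)) * gr / (gr @ gr)
    if np.linalg.norm(u_new - u) < 1e-9:
        u = u_new
        break
    u = u_new

beta = np.linalg.norm(u)
print(f"beta = {beta:.3f}, Pf ~ {norm.cdf(-beta):.2e}")
```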
User Evaluation of the NASA Technical Report Server Recommendation Service
NASA Technical Reports Server (NTRS)
Nelson, Michael L.; Bollen, Johan; Calhoun, JoAnne R.; Mackey, Calvin E.
2004-01-01
We present the user evaluation of two recommendation server methodologies implemented for the NASA Technical Report Server (NTRS). One methodology for generating recommendations uses log analysis to identify co-retrieval events on full-text documents. For comparison, we used the Vector Space Model (VSM) as the second methodology. We calculated cosine similarities and used the top 10 most similar documents (based on metadata) as 'recommendations'. We then ran an experiment with NASA Langley Research Center (LaRC) staff members to gather their feedback on which method produced the highest-quality recommendations. We found that in most cases VSM outperformed log analysis of co-retrievals. However, analyzing the data revealed the evaluations may have been structurally biased in favor of the VSM generated recommendations. We explore some possible methods for combining log analysis and VSM generated recommendations and suggest areas of future work.
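The VSM side of the comparison is straightforward to sketch with TF-IDF vectors and cosine similarity; the five toy "metadata" records below are invented stand-ins for NTRS entries (top 3 shown rather than the top 10 used in the study):

```python
# Sketch of the VSM methodology described above: TF-IDF vectors over document
# metadata and cosine similarity to pick the top matches. The toy records
# stand in for NTRS abstracts.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "adaptive finite element analysis of hypersonic aerothermal loads",
    "finite element structural sensitivity analysis with SPAR",
    "log analysis of co-retrieval events in a digital library",
    "recommendation service evaluation for a technical report server",
    "probabilistic structural analysis of turbine disks",
]

tfidf = TfidfVectorizer(stop_words="english").fit_transform(docs)
sims = cosine_similarity(tfidf)

query = 1                                    # recommend for the second doc
ranked = sims[query].argsort()[::-1][1:4]    # skip the document itself
for idx in ranked:
    print(f"{sims[query, idx]:.2f}  {docs[idx]}")
```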
Konstantinidis, Georgios; Anastassopoulos, George C; Karakos, Alexandros S; Anagnostou, Emmanouil; Danielides, Vasileios
2012-04-01
The aim of this study is to present our perspectives on healthcare analysis and design and the lessons learned from our experience with the development of a distributed, object-oriented Clinical Information System (CIS). In order to overcome known issues regarding the development, implementation and, finally, acceptance of a CIS by physicians, we decided to develop a novel object-oriented methodology by integrating usability principles and techniques into a simplified version of a well-established software engineering process (SEP), the Unified Process (UP). A multilayer architecture has been defined and implemented with the use of a vendor application framework. Our first experiences from a pilot implementation of our CIS are positive. This approach allowed us to gain a socio-technical understanding of the domain and enabled us to identify all the important factors that define both the structure and the behavior of a Health Information System.
Development of Probabilistic Structural Analysis Integrated with Manufacturing Processes
NASA Technical Reports Server (NTRS)
Pai, Shantaram S.; Nagpal, Vinod K.
2007-01-01
An effort has been initiated to integrate manufacturing process simulations with probabilistic structural analyses in order to capture the important impacts of manufacturing uncertainties on component stress levels and life. Two physics-based manufacturing process models (one for powdered metal forging and the other for annular deformation resistance welding) have been linked to the NESSUS structural analysis code. This paper describes the methodology developed to perform this integration, including several examples. Although this effort is still underway, particularly for full integration of a probabilistic analysis, the progress to date has been encouraging, and a software interface that implements the methodology has been developed. The purpose of this paper is to report this preliminary development.
MacFarlane, Anne; O'Donnell, Catherine; Mair, Frances; O'Reilly-de Brún, Mary; de Brún, Tomas; Spiegel, Wolfgang; van den Muijsenbergh, Maria; van Weel-Baumgarten, Evelyn; Lionis, Christos; Burns, Nicola; Gravenhorst, Katja; Princz, Christine; Teunissen, Erik; van den Driessen Mareeuw, Francine; Saridaki, Aristoula; Papadakaki, Maria; Vlahadi, Maria; Dowrick, Christopher
2012-11-20
The implementation of guidelines and training initiatives to support communication in cross-cultural primary care consultations is ad hoc across a range of international settings, with negative consequences particularly for migrants. This situation reflects a well-documented translational gap between evidence and practice and is part of the wider problem of implementing guidelines, and the broader range of professional educational and quality interventions, in routine practice. In this paper, we describe our use of a contemporary social theory, Normalization Process Theory, and a participatory research methodology, Participatory Learning and Action (PLA), to investigate and support implementation of such guidelines and training initiatives in routine practice. This is a qualitative case study, using multiple primary care sites across Europe. Purposive and maximum variation sampling approaches will be used to identify and recruit stakeholders: migrant service users, general practitioners, primary care nurses, practice managers and administrative staff, interpreters, cultural mediators, service planners, and policy makers. We are conducting a mapping exercise to identify relevant guidelines and training initiatives. We will then initiate a PLA-brokered dialogue with stakeholders around Normalization Process Theory's four constructs: coherence, cognitive participation, collective action, and reflexive monitoring. Through this, we will enable stakeholders in each setting to select a single guideline or training initiative for implementation in their local setting. We will prospectively investigate and support the implementation journeys for the five selected interventions. Data will be generated using a Participatory Learning and Action approach to interviews and focus groups. Data analysis will follow the principles of thematic analysis, will occur in iterative cycles throughout the project, and will involve participatory co-analysis with key stakeholders to enhance the authenticity and veracity of findings. This research employs a unique combination of Normalization Process Theory and Participatory Learning and Action, which will provide a novel approach to the analysis of implementation journeys. The findings will advance knowledge in the field of implementation science because we are using and testing theoretical and methodological approaches so that we can critically appraise their scope to mediate barriers and improve the implementation processes.
Implementing CDIO project-based learning in training of Heat and Power engineers
NASA Astrophysics Data System (ADS)
Boiko, E. A.; Shishmarev, P. V.; Karabarin, D. I.; Yanov, S. R.; Pikalova, A. A.
2017-11-01
This paper presents the experience and current results of CDIO standards implementation in the training of bachelors in Heat and Power Engineering at the Thermal Power Stations academic department of Siberian Federal University. It provides information on the methodology for modernizing educational programs, curricula and programs of disciplines in the transition to CDIO project-based learning technology. A preliminary assessment, an analysis of lessons learned, and scaling perspectives are given.
NASA Technical Reports Server (NTRS)
Leininger, G.; Jutila, S.; King, J.; Muraco, W.; Hansell, J.; Lindeen, J.; Franckowiak, E.; Flaschner, A.
1975-01-01
A methodology is described for the evaluation of societal impacts associated with the implementation of a new technology. Theoretical foundations for the methodology, called the total assessment profile, are established from both the economic and social science perspectives. The procedure provides for the accountability of nonquantifiable factors and measures through the use of a comparative value matrix, which assesses the impacts of the technology on the society's value system.
DOT National Transportation Integrated Search
2012-11-01
As part of the ongoing evolution towards integrated highway asset management, the Indiana Department of Transportation (INDOT), : through SPR studies in 2004 and 2010, sponsored research that developed an overall framework for asset management. This ...
NASA Technical Reports Server (NTRS)
Funk, Christie J.
2013-01-01
A software program and associated methodology to study gust loading on aircraft exists for a class of geometrically simplified flexible configurations. This program consists of a simple aircraft response model with two rigid and three flexible symmetric degrees of freedom and allows for the calculation of various airplane responses due to a discrete one-minus-cosine gust as well as continuous turbulence. Simplifications, assumptions, and opportunities for potential improvement in the existing software program are first identified; a revised version of the original software tool is then developed with improved methodology to include more complex geometries, additional excitation cases, and richer output data, so as to provide a more useful and accurate tool for gust load analysis. Revisions are made in the categories of aircraft geometry, computation of aerodynamic forces and moments, and implementation of horizontal tail mode shapes. To enhance the usefulness of the original program, a wing control surface and a horizontal tail control surface are added, an extended application of the discrete one-minus-cosine gust input is employed, a supplemental continuous turbulence spectrum is implemented, and a capability to animate the total vehicle deformation response to gust inputs is included. These revisions and enhancements are implemented, and an analysis of the results is used to validate the modifications.
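For reference, the discrete one-minus-cosine gust named above has a standard closed form; the short Python sketch below evaluates it, with the gradient distance and peak velocity chosen purely for illustration.

```python
import numpy as np

def one_minus_cosine_gust(s, H, U_ds):
    """Discrete '1-cos' gust velocity: peaks at s = H, returns to zero at s = 2H.

    s    : penetration distance into the gust
    H    : gust gradient distance (same units as s)
    U_ds : design (peak) gust velocity
    """
    s = np.asarray(s, dtype=float)
    u = 0.5 * U_ds * (1.0 - np.cos(np.pi * s / H))
    return np.where((s >= 0) & (s <= 2 * H), u, 0.0)

# Illustrative values: 110 m gradient distance, 15 m/s peak gust
s = np.linspace(0.0, 300.0, 7)
print(one_minus_cosine_gust(s, H=110.0, U_ds=15.0))
```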
Gupta, Renu; Sharma, Sangeeta; Saxena, Sonal
2018-01-01
Healthcare-associated infections (HAI) are preventable in up to 30% of patients with evidence-based infection prevention and control (IPC) activities. IPC activities require effective surveillance to generate data on HAI rates, define priority areas, identify processes amenable to improvement, and institute interventions to improve patient safety. However, only a uniform, accurate and standardised surveillance methodology using objective definitions can generate meaningful data for effective execution of IPC activities. The highly exhaustive, complex and ever-evolving infection surveillance methodology poses a challenge for effective data capture, analysis and interpretation by ground-level personnel. The present review addresses the gaps in knowledge and the day-to-day challenges in surveillance faced by the infection control team in the effective implementation of IPC activities.
Panayotov, Dobromir; Poitevin, Yves; Grief, Andrew; ...
2016-09-23
'Fusion for Energy' (F4E) is designing, developing, and implementing the European Helium-Cooled Lead-Lithium (HCLL) and Helium-Cooled Pebble-Bed (HCPB) Test Blanket Systems (TBSs) for ITER (Nuclear Facility INB-174). Safety demonstration is an essential element for the integration of these TBSs into ITER, and accident analysis is one of its critical components. A systematic approach to accident analysis has been developed under the F4E contract on TBS safety analyses. F4E technical requirements, together with Amec Foster Wheeler and INL efforts, have resulted in a comprehensive methodology for fusion breeding blanket accident analysis that addresses the specificity of the breeding blanket designs, materials, and phenomena while remaining consistent with the approach already applied to ITER accident analyses. The methodology phases are illustrated in the paper by their application to the EU HCLL TBS using both the MELCOR and RELAP5 codes.
Methodology issues in implementation science.
Newhouse, Robin; Bobay, Kathleen; Dykes, Patricia C; Stevens, Kathleen R; Titler, Marita
2013-04-01
Putting evidence into practice at the point of care delivery requires an understanding of implementation strategies that work, in what context and how. To identify methodological issues in implementation science using 4 studies as cases and make recommendations for further methods development. Four cases are presented and methodological issues identified. For each issue raised, evidence on the state of the science is described. Issues in implementation science identified include diverse conceptual frameworks, potential weaknesses in pragmatic study designs, and the paucity of standard concepts and measurement. Recommendations to advance methods in implementation include developing a core set of implementation concepts and metrics, generating standards for implementation methods including pragmatic trials, mixed methods designs, complex interventions and measurement, and endorsing reporting standards for implementation studies.
DOT National Transportation Integrated Search
2008-12-01
Shortly after the 1994 Northridge Earthquake, Caltrans geotechnical engineers charged with developing site-specific : response spectra for high priority California bridges initiated a research project aimed at broadening their perspective : from simp...
NASA Astrophysics Data System (ADS)
Hendikawati, P.; Arifudin, R.; Zahid, M. Z.
2018-03-01
This study aims to design an Android statistics data analysis application that can be accessed through mobile devices, making it easier for users to access statistical tools. The application covers various topics in basic statistics along with parametric statistical data analysis. The output of the application is parametric statistical analysis that can be used by students, lecturers, and other users who need statistical results quickly and in an easily understood form. The Android application is developed in the Java programming language; the server side uses PHP with the CodeIgniter framework, and the database is MySQL. The system development methodology is the Waterfall methodology, with the stages of analysis, design, coding, testing, implementation, and system maintenance. This statistical data analysis application is expected to support statistics lectures and make it easier for students to understand statistical analysis on mobile devices.
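To make "parametric statistics data analysis" concrete, the app's outputs are of the kind sketched below (shown in Python with SciPy purely for illustration; the app itself is implemented in Java/PHP).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
group_a = rng.normal(72, 8, size=30)   # simulated exam scores, group A
group_b = rng.normal(68, 8, size=30)   # simulated exam scores, group B

# Two-sample t-test: a typical parametric result such an app would return.
t, p = stats.ttest_ind(group_a, group_b)
print(f"t = {t:.2f}, p = {p:.4f}")
```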
[Application of root cause analysis in healthcare].
Hsu, Tsung-Fu
2007-12-01
The main purpose of this study was to explore various aspects of root cause analysis (RCA), including its definition, underlying rationale, main objective, implementation procedures, most common analysis methodology (fault tree analysis, FTA), and its advantages and methodologic limitations in healthcare. Several adverse events that occurred at a certain hospital were also analyzed by the author using FTA as part of this study. RCA is a process employed to identify basic and contributing causal factors underlying performance variations associated with adverse events. The rationale of RCA offers a systemic approach to improving patient safety that does not assign blame or liability to individuals. The four-step process involved in conducting an RCA includes: RCA preparation, proximate cause identification, root cause identification, and recommendation generation and implementation. FTA is a logical, structured process that can help identify potential causes of system failure before actual failures occur. Some advantages and significant methodologic limitations of RCA are discussed. Finally, we emphasize that errors stem principally from faults in system design, practice guidelines, work conditions, and other human factors, which lead health professionals into negligence or mistakes in healthcare. We must explore the root causes of medical errors to eliminate potential system failure factors, and a systemic approach is needed to resolve medical errors and move beyond a culture centered on assigning fault to individuals. By constructing a genuinely patient-centered safety environment in healthcare, we can encourage clients to accept state-of-the-art healthcare services.
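A minimal sketch of the FTA calculation referred to above: the top-event probability is composed from independent basic events through AND/OR gates. The event structure and probabilities are invented for illustration, not drawn from the hospital cases.

```python
# Minimal fault tree evaluation with AND/OR gates over independent basic events.
def gate_and(*p):              # all inputs must fail
    out = 1.0
    for x in p:
        out *= x
    return out

def gate_or(*p):               # at least one input fails
    out = 1.0
    for x in p:
        out *= (1.0 - x)
    return 1.0 - out

p_mislabel   = 0.002           # basic event: specimen mislabeled (illustrative)
p_no_check   = 0.05            # basic event: verification step skipped
p_understaff = 0.10            # basic event: shift understaffed

# Adverse event requires a mislabeling AND a failed barrier, where the
# barrier fails if the check is skipped OR the shift is understaffed.
p_top = gate_and(p_mislabel, gate_or(p_no_check, p_understaff))
print(f"P(top event) = {p_top:.5f}")
```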
NASA Astrophysics Data System (ADS)
Ravanelli, R.; Nascetti, A.; Cirigliano, R. V.; Di Rico, C.; Monti, P.; Crespi, M.
2018-04-01
The aim of this work is to exploit the large-scale analysis capabilities of the innovative Google Earth Engine platform in order to investigate the temporal variations of the Urban Heat Island (UHI) phenomenon as a whole. An intuitive methodology implementing a large-scale correlation analysis between Land Surface Temperature (LST) and land cover alterations was thus developed. The results obtained for the Phoenix MA are promising and show how urbanization heavily affects the magnitude of the UHI effects, with significant increases in LST. The proposed methodology is therefore able to efficiently monitor the UHI phenomenon.
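The core statistic here is a per-pixel correlation between LST change and land-cover change. The sketch below illustrates it with synthetic NumPy rasters standing in for the GEE-derived layers; the simulated relationship is invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical co-registered rasters: change in land surface temperature (K)
# and change in built-up fraction over the same period.
d_builtup = rng.uniform(0.0, 1.0, size=(100, 100))
d_lst = 2.5 * d_builtup + rng.normal(0.0, 0.5, size=(100, 100))

r = np.corrcoef(d_lst.ravel(), d_builtup.ravel())[0, 1]
print(f"Pearson r between LST change and built-up change: {r:.2f}")
```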
Karayiannis, Nikos Ch.; Kröger, Martin
2009-01-01
We review the methodology, algorithmic implementation and performance characteristics of a hierarchical modeling scheme for the generation, equilibration and topological analysis of polymer systems at various levels of molecular description: from atomistic polyethylene samples to random packings of freely-jointed chains of tangent hard spheres of uniform size. Our analysis focuses on hitherto less discussed algorithmic details of the implementation of both the Monte Carlo (MC) procedure for the system generation and equilibration, and a postprocessing step, where we identify the underlying topological structure of the simulated systems in the form of primitive paths. In order to demonstrate our arguments, we study how molecular length and packing density (volume fraction) affect the performance of the MC scheme built around chain-connectivity-altering moves. In parallel, we quantify the effect of finite system size, of polydispersity, and of the definition of the number of entanglements (and related entanglement molecular weight) on the results for the primitive path network. Along these lines we confirm main concepts previously proposed in the literature. PMID:20087477
[The implementation of strategy of medicinal support in multi-type hospital].
Ludupova, E Yu
2016-01-01
The article presents a brief review of the implementation of the strategy of medicinal support of the population of the Russian Federation and the experience of its application at the level of a regional hospital. The necessity and importance of introducing into hospital practice a methodology of pharmaco-economic management of medicinal care using modern techniques of XYZ-, ABC- and VEN-analysis is demonstrated. The stages of development and implementation of the process of medicinal support in a multi-field hospital, applying the principles of a quality management system (process and systemic approaches, risk management) on the basis of the ISO 9001 standards, are described. The significance of monitoring the results of the medicinal support process on the basis of implementation of priority target programs (prevention of venous thrombo-embolic complications, a system of control of anti-bacterial therapy) is demonstrated for a multi-field hospital using the ATC/DDD-analysis technique for evaluating indices of effectiveness and efficiency.
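Of the inventory techniques named, ABC analysis is the most mechanical: rank items by annual spend and classify by cumulative share. A minimal sketch, with invented formulary data and the conventional 80%/95% cut-offs:

```python
# Minimal ABC classification of a drug formulary by annual spend.
# Drug names and costs are invented for illustration.
drugs = {"enoxaparin": 120000, "meropenem": 90000, "omeprazole": 15000,
         "paracetamol": 8000, "metformin": 5000, "loratadine": 2000}

total = sum(drugs.values())
cumulative = 0.0
for name, cost in sorted(drugs.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += cost
    share = cumulative / total
    cls = "A" if share <= 0.80 else ("B" if share <= 0.95 else "C")
    print(f"{name:12s} {cost:>8d}  cum {share:5.1%}  class {cls}")
```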
OPTIGOV - A new methodology for evaluating Clinical Governance implementation by health providers
2010-01-01
Background The aim of Clinical Governance (CG) is the pursuit of quality in health care through the integration of all the activities impacting on the patient into a single strategy. OPTIGOV (Optimizing Health Care Governance) is a methodology for the assessment of the level of implementation of CG within healthcare organizations. The aim of this paper is to explain the process underlying the development of OPTIGOV, and describe its characteristics and steps. Methods OPTIGOV was developed in 2006 by the Institute of Hygiene of the Catholic University of the Sacred Heart and Eurogroup Consulting Alliance. The main steps of the process were: choice of areas for analysis and questionnaire development, based on a review of the scientific literature; assignment of scores and weights to individual questions and areas; implementation of a software application interfaceable with Microsoft Office. Results OPTIGOV consists of: a) a hospital audit with a structured approach; b) development of an operational improvement plan. A questionnaire divided into 13 areas of analysis is used. For each area there is a form with a variable number of questions and "closed" answers. A score is assigned to each answer, area of analysis, healthcare department and unit. The individual scores can be aggregated for the organization as a whole. The software application allows for collation of data, calculation of scores and development of benchmarks to allow comparisons between healthcare organizations. Implementation consists of three stages: the preparation phase includes a kick-off meeting, selection of interviewees and development of a survey plan. The registration phase includes hospital audits, review of hospital documentation, data collection and score processing. Lastly, results are processed, inserted into a final report, and discussed in a meeting with the Hospital Board and in a final workshop. Conclusions The OPTIGOV methodology for the evaluation of CG implementation was developed with an evidence-based approach. The ongoing adoption of OPTIGOV in several projects will put to the test its potential to realistically represent the organization's status, pinpoint criticalities and transferable best practices, provide a plan for improvement, and contribute to triggering changes and the pursuit of quality in health care. PMID:20565967
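The scoring scheme lends itself to a compact sketch: question scores roll up to area scores, which are combined with weights into an organization-level score. Areas, weights, and answers below are invented placeholders, not the actual OPTIGOV instrument.

```python
# OPTIGOV-style weighted scoring: each area of analysis gets a 0-100 score
# from its questions; areas are then combined with weights.
areas = {
    "clinical audit":      {"weight": 0.15, "answers": [3, 4, 2, 4]},
    "risk management":     {"weight": 0.25, "answers": [2, 2, 3]},
    "training":            {"weight": 0.10, "answers": [4, 4, 4]},
    "patient involvement": {"weight": 0.50, "answers": [1, 3, 2, 2]},
}
MAX_PER_QUESTION = 4

def area_score(answers):
    return 100.0 * sum(answers) / (MAX_PER_QUESTION * len(answers))

overall = sum(a["weight"] * area_score(a["answers"]) for a in areas.values())
for name, a in areas.items():
    print(f"{name:20s} {area_score(a['answers']):5.1f}")
print(f"overall CG implementation score: {overall:.1f}")
```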
System cost/performance analysis (study 2.3). Volume 1: Executive summary
NASA Technical Reports Server (NTRS)
Kazangey, T.
1973-01-01
The relationships between performance, safety, cost, and schedule parameters were identified and quantified in support of an overall effort to generate program models and methodology that provide insight into a total space vehicle program. A specific space vehicle system, the attitude control system (ACS), was used, and a modeling methodology was selected that develops a consistent set of quantitative relationships among performance, safety, cost, and schedule, based on the characteristics of the components utilized in candidate mechanisms. These descriptive equations were developed for a three-axis, earth-pointing, mass expulsion ACS. A data base describing typical candidate ACS components was implemented, along with a computer program to perform sample calculations. This approach, implemented on a computer, is capable of determining the effect of a change in functional requirements to the ACS mechanization and the resulting cost and schedule. By a simple extension of this modeling methodology to the other systems in a space vehicle, a complete space vehicle model can be developed. Study results and recommendations are presented.
Schaafsma, Joanna D; van der Graaf, Yolanda; Rinkel, Gabriel J E; Buskens, Erik
2009-12-01
The lack of a standard methodology in diagnostic research impedes adequate evaluation before implementation of constantly developing diagnostic techniques. We discuss the methodology of diagnostic research and underscore the relevance of decision analysis in the process of evaluation of diagnostic tests. Overview and conceptual discussion. Diagnostic research requires a stepwise approach comprising assessment of test characteristics followed by evaluation of added value, clinical outcome, and cost-effectiveness. These multiple goals are generally incompatible with a randomized design. Decision-analytic models provide an important alternative through integration of the best available evidence. Thus, critical assessment of clinical value and efficient use of resources can be achieved. Decision-analytic models should be considered part of the standard methodology in diagnostic research. They can serve as a valid alternative to diagnostic randomized clinical trials (RCTs).
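As an illustration of the decision-analytic alternative the authors advocate, the sketch below compares two hypothetical diagnostic strategies by expected cost per patient; all probabilities and costs are invented.

```python
# Minimal decision-analytic comparison of two diagnostic strategies.
prevalence = 0.10

strategies = {
    #          sensitivity, specificity, test cost
    "test A": (0.90,        0.95,        100.0),
    "test B": (0.80,        0.99,         40.0),
}
COST_MISSED_CASE = 5000.0      # downstream cost of a false negative
COST_FALSE_ALARM = 800.0       # downstream cost of a false positive

for name, (sens, spec, cost) in strategies.items():
    p_fn = prevalence * (1 - sens)            # missed cases
    p_fp = (1 - prevalence) * (1 - spec)      # false alarms
    expected_cost = cost + p_fn * COST_MISSED_CASE + p_fp * COST_FALSE_ALARM
    print(f"{name}: expected cost per patient = {expected_cost:7.2f}")
```

Extending the tree with utilities (e.g., QALYs) instead of costs gives the cost-effectiveness comparison described in the abstract.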
A system management methodology for building successful resource management systems
NASA Technical Reports Server (NTRS)
Hornstein, Rhoda Shaller; Willoughby, John K.
1989-01-01
This paper presents a system management methodology for building successful resource management systems that possess lifecycle effectiveness. This methodology is based on an analysis of the traditional practice of Systems Engineering Management as it applies to the development of resource management systems. The analysis produced fifteen significant findings presented as recommended adaptations to the traditional practice of Systems Engineering Management to accommodate system development when the requirements are incomplete, unquantifiable, ambiguous and dynamic. Ten recommended adaptations to achieve operational effectiveness when requirements are incomplete, unquantifiable or ambiguous are presented and discussed. Five recommended adaptations to achieve system extensibility when requirements are dynamic are also presented and discussed. The authors conclude that the recommended adaptations to the traditional practice of Systems Engineering Management should be implemented for future resource management systems and that the technology exists to build these systems extensibly.
Mixed time integration methods for transient thermal analysis of structures, appendix 5
NASA Technical Reports Server (NTRS)
Liu, W. K.
1982-01-01
Mixed time integration methods for transient thermal analysis of structures are studied. An efficient solution procedure for predicting the thermal behavior of aerospace vehicle structures was developed. A 2D finite element computer program incorporating these methodologies is being implemented. The performance of these mixed time finite element algorithms can then be evaluated employing the proposed example problem.
NASA Astrophysics Data System (ADS)
Sokolov, M. A.
This handbook treats the design and analysis of pulsed radar receivers, with emphasis on elements (especially IC elements) that implement optimal and suboptimal algorithms. The design methodology is developed from the viewpoint of statistical communications theory. Particular consideration is given to the synthesis of single-channel and multichannel detectors, the design of analog and digital signal-processing devices, and the analysis of IF amplifiers.
Montowska, Magdalena; Alexander, Morgan R; Tucker, Gregory A; Barrett, David A
2014-10-21
In this article, our previously developed ambient LESA-MS methodology is implemented to analyze five types of thermally treated meat species, namely beef, pork, horse, chicken, and turkey meat, to select and identify heat-stable and species-specific peptide markers. In-solution tryptic digests of cooked meats were deposited onto a polymer surface, followed by LESA-MS analysis and evaluation using multivariate data analysis and tandem electrospray MS. The five types of cooked meat were clearly discriminated using principal component analysis and orthogonal partial least-squares discriminant analysis. Twenty-three heat-stable peptide markers unique to species and muscle protein were identified following data-dependent tandem LESA-MS analysis. Surface extraction and direct ambient MS analysis of mixtures of cooked meat species was performed for the first time and enabled detection of 10% (w/w) of pork, horse, and turkey meat and 5% (w/w) of chicken meat in beef, using the developed LESA-MS/MS analysis. The study shows, for the first time, that ambient LESA-MS methodology displays specificity sufficient to be implemented effectively for the analysis of processed and complex peptide digests. The proposed approach is much faster and simpler than other measurement tools for meat speciation; it has potential for application in other areas of meat science or food production.
NASA Technical Reports Server (NTRS)
Wang, Ten-See; Canabal, Francisco; Chen, Yen-Sen; Cheng, Gary; Ito, Yasushi
2013-01-01
Nuclear thermal propulsion is a leading candidate for in-space propulsion for human Mars missions. This chapter describes a thermal hydraulics design and analysis methodology developed at the NASA Marshall Space Flight Center in support of the nuclear thermal propulsion development effort. The objective of this campaign is to bridge the design methods of the Rover/NERVA era with a modern computational fluid dynamics and heat transfer methodology, to predict the thermal, fluid, and hydrogen environments of a hypothetical solid-core nuclear thermal engine, the Small Engine, designed in the 1960s. The computational methodology is based on an unstructured-grid, pressure-based, all-speeds, chemically reacting, computational fluid dynamics and heat transfer platform, while formulations of flow and heat transfer through porous and solid media were implemented to describe those of the hydrogen flow channels inside the solid core. Design analyses of a single flow element and of the entire solid-core thrust chamber of the Small Engine were performed, and the results are presented herein.
NASA Astrophysics Data System (ADS)
Sierra-Pérez, Julián; Torres-Arredondo, M.-A.; Alvarez-Montoya, Joham
2018-01-01
Structural health monitoring consists of using sensors integrated within structures, together with algorithms, to perform load monitoring, damage detection, damage location, damage size and severity assessment, and prognosis. One possibility is to use strain sensors to infer structural integrity by comparing patterns in the strain field between the pristine and damaged conditions. In previous works, the authors have demonstrated that it is possible to detect small defects based on strain field pattern recognition by using robust machine learning techniques. They have focused on methodologies based on principal component analysis (PCA) and on the development of several unfolding and standardization techniques, which allow dealing with multiple load conditions. However, before a real implementation of this approach in engineering structures, changes in the strain field due to conditions other than damage occurrence need to be isolated. Since load conditions may vary in most engineering structures and promote significant changes in the strain field, it is necessary to implement novel techniques for uncoupling such changes from those produced by damage occurrence. A damage detection methodology based on optimal baseline selection (OBS) by means of clustering techniques is presented. The methodology includes the use of hierarchical nonlinear PCA as a nonlinear modeling technique in conjunction with the Q and nonlinear-T2 damage indices. The methodology is experimentally validated using strain measurements obtained by 32 fiber Bragg grating sensors bonded to an aluminum beam under dynamic bending loads and simultaneously subjected to variations in its pitch angle. The results demonstrate the capability of the methodology for clustering data according to 13 different load conditions (pitch angles), performing the OBS and detecting six different damages induced in a cumulative way. The proposed methodology showed a true positive rate of 100% and a false positive rate of 1.28% at a 99% confidence level.
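A minimal sketch of the PCA-based Q (squared prediction error) index used in this family of methods: fit PCA on baseline strain snapshots, then score new snapshots by their residual off the retained subspace. The data below are synthetic stand-ins for the FBG measurements.

```python
import numpy as np

rng = np.random.default_rng(0)

# Baseline strain field: 200 samples from 32 correlated channels, standing in
# for pristine-condition FBG measurements.
latent = rng.normal(size=(200, 3))
mixing = rng.normal(size=(3, 32))
baseline = latent @ mixing + 0.05 * rng.normal(size=(200, 32))

# Fit PCA on the baseline (mean-center, keep 3 components).
mean = baseline.mean(axis=0)
_, _, Vt = np.linalg.svd(baseline - mean, full_matrices=False)
P = Vt[:3].T                                   # retained loadings (32 x 3)

def q_index(x):
    """Q (squared prediction error) of one strain snapshot."""
    r = (x - mean) - P @ (P.T @ (x - mean))    # residual off the PCA subspace
    return float(r @ r)

healthy = latent[0] @ mixing + 0.05 * rng.normal(size=32)
damaged = healthy.copy()
damaged[10:14] += 0.5                          # local strain anomaly near a 'defect'
print(f"Q healthy: {q_index(healthy):.3f}   Q damaged: {q_index(damaged):.3f}")
```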
DOT National Transportation Integrated Search
2012-05-05
As part of the Federal Highway Administration (FHWA) Traffic Analysis Toolbox (Volume XIII), this guide was designed to help corridor stakeholders implement the ICM AMS methodology successfully and effectively. It provides a step-by-step approach to ...
Multimodal Narrative Inquiry: Six Teacher Candidates Respond
ERIC Educational Resources Information Center
Morawski, Cynthia M.; Rottmann, Jennifer
2016-01-01
In this paper we present findings of a study on the implementation of a multimodal teacher narrative inquiry component, theoretically grounded by Rosenblatt's theory of transaction analysis, methodologically supported by action research and practically enacted by narrative inquiry and multimodal learning. In particular, the component offered…
Boronat, F; Budia, A; Broseta, E; Ruiz-Cerdá, J L; Vivas-Consuelo, D
To describe the application of the Lean methodology as a method for continuously improving the efficiency of a urology department in a tertiary hospital. The implementation of the Lean Healthcare methodology in a urology department was conducted in 3 phases: 1) team training and improvement of feedback among the practitioners, 2) management by process and superspecialisation and 3) improvement of indicators (continuous improvement). The indicators were obtained from the hospital's information systems. The main source of information was the Balanced Scorecard for health systems management (CUIDISS). The comparison with other regional and national urology departments was performed through the same platform with the help of the hospital's records department (IASIST). A baseline was established with the indicators obtained in 2011 for the comparative analysis of the results after implementing the Lean Healthcare methodology. The implementation of this methodology translated into high practitioner satisfaction and improved quality indicators, reaching a risk-adjusted complication index (RACI) of 0.59 and a risk-adjusted mortality rate (RAMR) of 0.24 in 4 years. A value of 0.61 was reached with the efficiency indicator (risk-adjusted length of stay [RALOS] index), with a savings of 2869 stays compared with national benchmarking (IASIST). The risk-adjusted readmissions index (RARI) was the only indicator above the standard, with a value of 1.36, but with progressive annual improvement. The Lean methodology can be effectively applied to a urology department of a tertiary hospital to improve efficiency, obtaining significant and continuous improvements in all its indicators, as well as practitioner satisfaction. Team training, management by process, continuous improvement and delegation of responsibilities have been shown to be the fundamental pillars of this methodology. Copyright © 2017 AEU. Published by Elsevier España, S.L.U. All rights reserved.
ERIC Educational Resources Information Center
Kessler, Seth A.; Horton, Karissa D.; Gottlieb, Nell H.; Atwood, Robin
2012-01-01
Purpose: The purpose of this study is to describe preceptors' implementation experiences after implementing a workplace learning program in Texas WIC (women, infant, and children) agencies and identify implementation best practices. Design/methodology/approach: This research used qualitative description methodology. Data collection consisted of 11…
Creating an enabling environment for WR&R implementation.
Stathatou, P-M; Kampragou, E; Grigoropoulou, H; Assimacopoulos, D; Karavitis, C; Gironás, J
2017-09-01
Reclaimed water is receiving growing attention worldwide as an effective solution for alleviating the growing water scarcity in many areas. Despite the various benefits associated with reclaimed water, water recycling and reuse (WR&R) practices are not widely applied around the world. This is mostly due to complex and inadequate local legal and institutional frameworks and socio-economic structures, which pose barriers to wider WR&R implementation. An integrated approach is therefore needed while planning the implementation of WR&R schemes, considering all the potential barriers, and aiming to develop favourable conditions for enhancing reclaimed water use. This paper proposes a comprehensive methodology supporting the development of an enabling environment for WR&R implementation. The political, economic, social, technical, legal and institutional factors that may influence positively (drivers) or negatively (barriers) WR&R implementation in the regional water systems are identified, through the mapping of local stakeholder perceptions. The identified barriers are further analysed, following a Cross-Impact/System analysis, to recognize the most significant barriers inhibiting system transition, and to prioritize the enabling instruments and arrangements that are needed to boost WR&R implementation. The proposed methodology was applied in the Copiapó River Basin in Chile, which faces severe water scarcity. Through the analysis, it was observed that barriers outweigh drivers for the implementation of WR&R schemes in the Copiapó River Basin, while the key barriers which could be useful for policy formulation towards an enabling environment in the area concern the unclear legal framework regarding the ownership of treated wastewater, the lack of environmental policies focusing on pollution control, the limited integration of reclaimed water use in current land use and development policies, the limited public awareness on WR&R, and the limited availability of governmental funding sources for WR&R.
An engineering methodology for implementing and testing VLSI (Very Large Scale Integrated) circuits
NASA Astrophysics Data System (ADS)
Corliss, Walter F., II
1989-03-01
The engineering methodology for producing a fully tested VLSI chip from a design layout is presented. A 16-bit correlator, NPS CORN88, that was previously designed was used as a vehicle to demonstrate this methodology. The study of the design and simulation tools, MAGIC and MOSSIM II, was the focus of the design and validation process. The design was then implemented and the chip was fabricated by MOSIS. The fabricated chip was then used to develop a testing methodology for the digital test facilities at NPS. NPS CORN88 was the first full-custom VLSI chip designed at NPS to be tested with the NPS digital analysis system, the Tektronix DAS 9100 series tester. The capabilities and limitations of these test facilities are examined. NPS CORN88 test results are included to demonstrate the capabilities of the digital test system. A translator, MOS2DAS, was developed to convert the MOSSIM II simulation program to the input files required by the DAS 9100 device verification software, 91DVS. Finally, a tutorial for using the digital test facilities, including the DAS 9100 and associated support equipment, is included as an appendix.
Novel horizontal and vertical integrated bioethics curriculum for medical courses.
D'Souza, Russell F; Mathew, Mary; D'Souza, Derek S J; Palatty, Princy
2018-02-28
Studies conducted by the University of Haifa, Israel in 2001, evaluating the effectiveness of bioethics teaching in medical colleges, suggested that there was a significant lack of translation into clinical care. The analysis also revealed ineffectiveness of the teaching methodology used, a lack of longitudinal integration of bioethics into the undergraduate medical curriculum, and limited exposure to the technology of decision making when confronting ethical dilemmas. A modern, novel bioethics curriculum and an innovative methodology for teaching bioethics in the medical course were developed by the UNESCO Chair in Bioethics, Haifa. The horizontal (subject-wise) curriculum was seamlessly integrated vertically through the entire course. An innovative bioethics teaching methodology was employed to implement the curriculum. This new curriculum was piloted in a few medical colleges in India from 2011 to 2015 and the outcomes were evaluated. The evaluation confirmed gains over the earlier identified translation gap, with high student acceptability and satisfaction. This integrated curriculum is now formally implemented in Indian Health Science Universities, which are affiliated with over 200 medical schools in India. This article offers insights from the evaluated novel integrated bioethics curriculum and the innovative bioethics teaching methodology used in the pilot program.
Integrating Kano’s Model into Quality Function Deployment for Product Design: A Comprehensive Review
NASA Astrophysics Data System (ADS)
Ginting, Rosnani; Hidayati, Juliza; Siregar, Ikhsan
2018-03-01
Many methods and techniques are adopted by companies to improve competitiveness through the fulfillment of customer satisfaction by enhancing and improving product design quality. Over the past few years, several researchers have studied extensively the combination of Quality Function Deployment (QFD) and Kano's model as design techniques, focusing on translating consumer desires into a product design. This paper presents a review and analysis of several literatures associated with the integration of Kano's model into the QFD process. Various international journal articles were selected, collected and analyzed from a number of relevant scientific publications. In-depth analysis was performed, focusing on the results, advantages and drawbacks of each methodology. In addition, this paper provides analysis related to the further development of the methodology. It is hoped that this paper can serve as a reference for other researchers and manufacturing companies implementing the integrated QFD-Kano method for product design.
Identification of Good Practices in the Implementation of Innovative Learning Methodologies
ERIC Educational Resources Information Center
Lincaru, Cristina; Ciuca, Vasilica; Grecu, Liliana; Atanasiu, Draga; Dragoiu, Codruta
2011-01-01
We intend to present the partial issues resulting from the development of the European Project DeInTRA "cooperation for innovative training methodologies deployment in the European Labour Market"--Stage 4: Identification of good practices in the implementation of innovative learning methodologies. This project is included into the…
Gulliford, Martin C; van Staa, Tjeerd P; McDermott, Lisa; McCann, Gerard; Charlton, Judith; Dregan, Alex
2014-06-11
There is growing interest in conducting clinical and cluster randomized trials through electronic health records. This paper reports on the methodological issues identified during the implementation of two cluster randomized trials using the electronic health records of the Clinical Practice Research Datalink (CPRD). Two trials were completed in primary care: one aimed to reduce inappropriate antibiotic prescribing for acute respiratory infection; the other aimed to increase physician adherence with secondary prevention interventions after first stroke. The paper draws on documentary records and trial datasets to report on the methodological experience with respect to research ethics and research governance approval, general practice recruitment and allocation, sample size calculation and power, intervention implementation, and trial analysis. We obtained research governance approvals from more than 150 primary care organizations in England, Wales, and Scotland. There were 104 CPRD general practices recruited to the antibiotic trial and 106 to the stroke trial, with the target number of practices being recruited within six months. Interventions were installed into practice information systems remotely over the internet. The mean number of participants per practice was 5,588 in the antibiotic trial and 110 in the stroke trial, with the coefficient of variation of practice sizes being 0.53 and 0.56 respectively. Outcome measures showed substantial correlations between the 12 months before and after intervention, with coefficients ranging from 0.42 for diastolic blood pressure to 0.91 for the proportion of consultations with antibiotics prescribed. Defining practice and participant eligibility for analysis requires careful consideration. Cluster randomized trials may be performed efficiently in large samples from UK general practices using the electronic health records of a primary care database. The geographical dispersal of trial sites presents a difficulty for research governance approval and intervention implementation. Pretrial data analyses should inform trial design and analysis plans. Current Controlled Trials ISRCTN 47558792 and ISRCTN 35701810 (both registered on 17 March 2010).
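One way the reported cluster-size variability feeds into design is through the design effect; a commonly used approximation for unequal cluster sizes is DEFF = 1 + ((cv^2 + 1) * m_bar - 1) * ICC. The sketch below plugs in the antibiotic trial's reported mean practice size and coefficient of variation, with an illustrative (assumed) intracluster correlation.

```python
# Design effect for a cluster trial with unequal cluster sizes.
m_bar = 5588        # mean cluster (practice) size, from the antibiotic trial
cv = 0.53           # coefficient of variation of cluster sizes, as reported
icc = 0.05          # assumed intracluster correlation (illustrative only)

deff = 1 + ((cv**2 + 1) * m_bar - 1) * icc
print(f"design effect ~ {deff:.0f}")   # inflation over individual randomization
print(f"effective sample size for 104 practices ~ {104 * m_bar / deff:.0f}")
```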
Iribarren, Diego; Vázquez-Rowe, Ian; Moreira, María Teresa; Feijoo, Gumersindo
2010-10-15
The combined application of Life Cycle Assessment (LCA) and Data Envelopment Analysis (DEA) has recently been proposed as a tool for the comprehensive assessment of the environmental and operational performance of multiple similar entities. Among the acknowledged advantages of LCA+DEA methodology, eco-efficiency verification and avoidance of average inventories are usually highlighted. However, given the novelty of LCA+DEA methods, a large number of additional potentials remain unexplored. In this sense, some features are worth detailing given their potential to enhance LCA performance. Emphasis is laid on the improved interpretation of LCA results through the complementary use of DEA with respect to: (i) super-efficiency analysis to facilitate the selection of reference performers, (ii) inter- and intra-assessments of multiple data sets within any specific sector for benchmarking and trend analysis purposes, (iii) integration of an economic dimension in order to enrich sustainability assessments, and (iv) window analysis to evaluate environmental impact efficiency over a certain period of time. Furthermore, the capability of LCA+DEA methodology to be implemented in a wide range of scenarios is discussed. These further potentials are explained and demonstrated via brief case studies based on real data sets. Copyright © 2010 Elsevier B.V. All rights reserved.
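For readers unfamiliar with the DEA side, the sketch below solves the standard input-oriented CCR efficiency model as a linear program; the DMU data are invented placeholders, not taken from the paper's case studies.

```python
import numpy as np
from scipy.optimize import linprog

# Input-oriented CCR DEA efficiency for each decision-making unit (DMU).
# Rows are DMUs; X holds inputs, Y holds outputs (invented placeholder data).
X = np.array([[20.0, 300.0],
              [25.0, 280.0],
              [18.0, 350.0],
              [30.0, 200.0]])          # inputs (e.g. fuel, effort hours)
Y = np.array([[500.0], [450.0], [520.0], [480.0]])   # outputs (e.g. landings)

def ccr_efficiency(k):
    n, m = X.shape
    c = np.r_[1.0, np.zeros(n)]                     # minimize theta
    A_in = np.c_[-X[k][:, None], X.T]               # sum_j lam_j x_ij <= theta x_ik
    A_out = np.c_[np.zeros((Y.shape[1], 1)), -Y.T]  # sum_j lam_j y_rj >= y_rk
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[k]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

for k in range(X.shape[0]):
    print(f"DMU {k}: CCR efficiency = {ccr_efficiency(k):.3f}")
```

An efficiency of 1.0 marks a DMU on the best-practice frontier; values below 1.0 quantify the proportional input reduction needed to reach it.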
DOE Office of Scientific and Technical Information (OSTI.GOV)
J.D. Sanders
Under the U.S.-Russian Material Protection, Control and Accounting (MPC&A) Program, the Material Control and Accounting Measurements (MCAM) Project has supported a joint U.S.-Russian effort to coordinate improvements of the Russian MC&A measurement system. These efforts have resulted in the development of a MC&A Equipment and Methodological Support (MEMS) Strategic Plan (SP), developed by the Russian MEM Working Group. The MEMS SP covers implementation of MC&A measurement equipment, as well as the development, attestation and implementation of measurement methodologies and reference materials at the facility and industry levels. This paper provides an overview of the activities conducted under the MEMS SP, as well as a status on current efforts to develop reference materials, implement destructive and nondestructive assay measurement methodologies, and implement sample exchange, scrap and holdup measurement programs across Russian nuclear facilities.
Research Project Evaluation-Learnings from the PATHWAYS Project Experience.
Galas, Aleksander; Pilat, Aleksandra; Leonardi, Matilde; Tobiasz-Adamczyk, Beata
2018-05-25
Every research project faces challenges regarding how to achieve its goals in a timely and effective manner. The purpose of this paper is to present the project evaluation methodology gathered during the implementation of the Participation to Healthy Workplaces and Inclusive Strategies in the Work Sector (the EU PATHWAYS Project). The PATHWAYS project involved multiple countries and multi-cultural aspects of re/integrating chronically ill patients into labor markets in different countries. This paper describes the key project evaluation issues, including: (1) purposes, (2) advisability, (3) tools, (4) implementation, and (5) possible benefits, and presents the advantages of continuous monitoring. A project evaluation tool was used to assess structure and resources, process, management and communication, achievements, and outcomes. The project used a mixed evaluation approach that included Strengths (S), Weaknesses (W), Opportunities (O), and Threats (T) (SWOT) analysis. A methodology for the evaluation of longitudinal EU projects is described. The evaluation process allowed us to highlight strengths and weaknesses, and indicated good coordination and communication between project partners as well as some key issues such as: the need for a shared glossary covering areas investigated by the project, problematic issues related to the involvement of stakeholders from outside the project, and issues with timing. Numerical SWOT analysis showed improvement in project performance over time. The proportion of project partners participating in the evaluation varied from 83.3% to 100%. There is a need for the implementation of a structured evaluation process in multidisciplinary projects involving different stakeholders in diverse socio-environmental and political conditions. Based on the PATHWAYS experience, a clear monitoring methodology is suggested as essential in every multidisciplinary research project.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boddu, S; Morrow, A; Krishnamurthy, N
Purpose: Our goal is to implement lean methodology to make our current process of CT simulation to treatment more efficient. Methods: In this study, we implemented lean methodology and tools and employed a flowchart in Excel for process-mapping. We formed a group of physicians, physicists, dosimetrists, therapists and a clinical physics assistant and huddled bi-weekly to map current value streams. We performed GEMBA walks and observed current processes from the scheduling of patient CT simulations to treatment plan approval. From this, the entire workflow was categorized into processes, sub-processes, and tasks. For each process we gathered data on touch time, first-time quality, undesirable effects (UDEs), and wait times from relevant members of each task. UDEs were binned per frequency of their occurrence. We huddled to map the future state and to find solutions to high-frequency UDEs. We implemented visual controls, hard stops, and documented issues found during chart checks prior to treatment plan approval. Results: We identified approximately 64 UDEs in our current workflow that could cause delays, re-work, compromise the quality and safety of patient treatments, or cause wait times between 1 and 6 days. While some UDEs are unavoidable, such as re-planning due to patient weight loss, eliminating avoidable UDEs is our goal. In 2015, we found 399 issues with patient treatment plans, of which 261, 95 and 43 were of low, medium and high severity, respectively. We also mapped patient-specific QA processes for IMRT/RapidArc and SRS/SBRT, involving 10 and 18 steps, respectively. From these, 13 UDEs were found and 5 were addressed, solving 20% of the issues. Conclusion: We have successfully implemented lean methodology and tools. We are further mapping treatment-site-specific workflows to identify bottlenecks, potential breakdowns and personnel allocation, and will employ tools like failure mode effects analysis to mitigate risk factors and make this process efficient.
Snijder, Mieke; Shakeshaft, Anthony; Wagemakers, Annemarie; Stephens, Anne; Calabria, Bianca
2015-11-21
Community development is a health promotion approach identified as having great potential to improve Indigenous health, because of its potential for extensive community participation. There has been no systematic examination of the extent of community participation in community development projects and little analysis of their effectiveness. This systematic review aims to identify the extent of community participation in community development projects implemented in Australian Indigenous communities, critically appraise the qualitative and quantitative methods used in their evaluation, and summarise their outcomes. Ten electronic peer-reviewed databases and two electronic grey literature databases were searched for relevant studies published between 1990 and 2015. The level of community participation and the methodological quality of the qualitative and quantitative components of the studies were assessed against standardised criteria. Thirty-one evaluation studies of community development projects were identified. Community participation varied between different phases of project development, generally high during project implementation but low during the evaluation phase. For the majority of studies, methodological quality was low and the methods were poorly described. Although positive qualitative or quantitative outcomes were reported in all studies, only two studies reported statistically significant outcomes. Partnerships between researchers, community members and service providers have great potential to improve methodological quality and community participation when research skills and community knowledge are integrated to design, implement and evaluate community development projects. The methodological quality of studies evaluating Australian Indigenous community development projects is currently too weak to confidently determine the cost-effectiveness of community development projects in improving the health and wellbeing of Indigenous Australians. Higher quality studies evaluating community development projects would strengthen the evidence base.
Implementing Service Excellence in Higher Education
ERIC Educational Resources Information Center
Khan, Hina; Matlay, Harry
2009-01-01
Purpose: The purpose of this paper is to provide a critical analysis of the importance of service excellence in higher education. Design/methodology/approach: The research upon which this paper is based employed a phenomenological approach. This method was selected for its focus on respondent perceptions and experiences. Both structured and…
Physician Sensemaking and Readiness for Electronic Medical Records
ERIC Educational Resources Information Center
Riesenmy, Kelly Rouse
2010-01-01
Purpose: The purpose of this paper is to explore physician sensemaking and readiness to implement electronic medical records (EMR) as a first step to finding strategies that enhance EMR adoption behaviors. Design/methodology/approach: The case study approach provides a detailed analysis of individuals within an organizational unit. Using a…
ERIC Educational Resources Information Center
Hughes, Carolyn; Agran, Martin
1993-01-01
This literature review examines the effects of self-instructional programs on increasing independence of persons with moderate/severe mental retardation in integrated environments. The article discusses methodological issues, research needs, and recommendations for program implementation. The feasibility of using self-instruction to promote…
Depression Prevention Research: Design, Implementation, and Analysis of Randomized Trials.
ERIC Educational Resources Information Center
Munoz, Ricardo F.; And Others
This document contains three papers concerned with prevention intervention research, a new area of depression research which has shown great promise for contributing new knowledge to the understanding of depression. The first paper, "Clinical Trials vs. Prevention Trials: Methodological Issues in Depression Research" (Ricardo F. Munoz), emphasizes…
School Uniforms: A Qualitative Analysis of Aims and Accomplishments at Two Christian Schools
ERIC Educational Resources Information Center
Firmin, Michael; Smith, Suzanne; Perry, Lynsey
2006-01-01
Employing rigorous qualitative research methodology, we studied the implementation of two schools' uniform policies. Their primary intents were to eliminate competition, teach young people to dress appropriately, decrease nonacademic distractions, and lower the parental clothing costs. The young people differed with adults regarding whether or not…
Administrator Preparation: Looking Backwards and Forwards
ERIC Educational Resources Information Center
Bridges, Edwin
2012-01-01
Purpose: The purpose of this paper was to conduct a critical analysis of the origins and implementation of problem-based learning in educational administration as a window into the limitations of this approach and more generally administrator preparation. Design/methodology/approach: The author reviewed the published work of the originator from…
Teachers Implementing Entrepreneurship Education: Classroom Practices
ERIC Educational Resources Information Center
Ruskovaara, Elena; Pihkala, Timo
2013-01-01
Purpose: This study aims to highlight the entrepreneurship education practices teachers use in their work. Another target is to analyze how these practices differ based on a number of background factors. Design/methodology/approach: This article presents a quantitative analysis of 521 teachers and other entrepreneurship education actors. The paper…
ERIC Educational Resources Information Center
San Antonio, Diosdado M.; Gamage, David T.
2007-01-01
Purpose: The paper aims to examine the effect of implementing participatory school administration, leadership and management (PSALM) on the levels of empowerment among the educational stakeholders. Design/methodology/approach: A mixed method approach, combining the experimental design with empirical surveys, interviews and documentary analysis,…
AI-Based Chatterbots and Spoken English Teaching: A Critical Analysis
ERIC Educational Resources Information Center
Sha, Guoquan
2009-01-01
The aim of various approaches implemented, whether the classical "three Ps" (presentation, practice, and production) or communicative language teaching (CLT), is to achieve communicative competence. Although a lot of software developed for teaching spoken English is dressed up to raise interaction, its methodology is largely rooted in tradition.…
ERIC Educational Resources Information Center
Khalil, Deena; Kier, Meredith
2017-01-01
This article is about introducing Critical Race Design (CRD), a research methodology that centers race and equity at the nucleus of educational opportunities by design. First, the authors define design-based implementation research (DBIR; Penuel, Fishman, Cheng, & Sabelli, 2011) as an equity-oriented education research methodology where…
Evaluation Methodologies for Estimating the Likelihood of Program Implementation Failure
ERIC Educational Resources Information Center
Durand, Roger; Decker, Phillip J.; Kirkman, Dorothy M.
2014-01-01
Despite our best efforts as evaluators, program implementation failures abound. A wide variety of valuable methodologies have been adopted to explain and evaluate the "why" of these failures. Yet, typically these methodologies have been employed either concurrently with program activities (e.g., project monitoring) or for their post-hoc assessment.…
ERIC Educational Resources Information Center
Ndirangu, Caroline
2017-01-01
This study aims to evaluate teachers' attitude towards implementation of learner-centered methodology in science education in Kenya. The study used a survey design methodology, adopting the purposive, stratified random and simple random sampling procedures and hypothesised that there was no significant relationship between the head teachers'…
Multivariate Methods for Meta-Analysis of Genetic Association Studies.
Dimou, Niki L; Pantavou, Katerina G; Braliou, Georgia G; Bagos, Pantelis G
2018-01-01
Multivariate meta-analysis of genetic association studies and genome-wide association studies has received remarkable attention as it improves the precision of the analysis. Here, we review, summarize and present in a unified framework methods for multivariate meta-analysis of genetic association studies and genome-wide association studies. Starting with the statistical methods used for robust analysis and genetic model selection, we briefly present univariate methods for meta-analysis and then scrutinize multivariate methodologies. Multivariate models of meta-analysis for single gene-disease association studies, including models for haplotype association studies, multiple linked polymorphisms and multiple outcomes, are discussed. The popular Mendelian randomization approach and special cases of meta-analysis addressing issues such as the assumption of the mode of inheritance, deviation from Hardy-Weinberg equilibrium and gene-environment interactions are also presented. All available methods are illustrated with practical applications, and methodologies that could be developed in the future are discussed. Links to all available software implementing multivariate meta-analysis methods are also provided.
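As a point of reference for the univariate building block that the review starts from, the sketch below implements simple inverse-variance (fixed-effect) pooling of per-study effect estimates in Python. The effect sizes and variances are synthetic and purely illustrative; the multivariate models surveyed in the paper generalize this weighting to vectors of correlated outcomes.

```python
import numpy as np

# Synthetic per-study log odds ratios and their variances (illustrative only).
log_or = np.array([0.25, 0.40, 0.10, 0.31])
var = np.array([0.04, 0.09, 0.02, 0.05])

# Fixed-effect inverse-variance pooling: weight each study by w_i = 1 / v_i.
w = 1.0 / var
pooled = np.sum(w * log_or) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))

print(f"pooled log OR = {pooled:.3f}, "
      f"95% CI = ({pooled - 1.96 * se:.3f}, {pooled + 1.96 * se:.3f})")
```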
A methodology based on reduced complexity algorithm for system applications using microprocessors
NASA Technical Reports Server (NTRS)
Yan, T. Y.; Yao, K.
1988-01-01
The paper presents a methodology for the analysis and design of a minimum mean-square-error (MMSE) linear system incorporating a tapped delay line (TDL) in which all the full-precision multiplications in the TDL are constrained to be powers of two. A linear equalizer for a dispersive, additive-noise channel is presented. This microprocessor implementation with optimized power-of-two TDL coefficients achieves system performance comparable to optimum linear equalization with full-precision multiplications for an input data rate of 300 baud.
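A minimal sketch of the core constraint, assuming nothing about the original equalizer design beyond what the abstract states: each full-precision TDL tap is replaced by the geometrically nearest signed power of two, so that every multiplication can be realized as a bit shift on a microprocessor. The tap values here are invented for illustration.

```python
import numpy as np

def power_of_two_quantize(c):
    """Map a tap weight to the nearest signed power of two in the log domain
    (so c * x can be computed with a single arithmetic shift); zero stays zero."""
    if c == 0.0:
        return 0.0
    return np.sign(c) * 2.0 ** np.round(np.log2(abs(c)))

# Hypothetical full-precision MMSE equalizer taps.
taps = np.array([0.013, -0.094, 0.52, -0.11, 0.027])
q_taps = np.array([power_of_two_quantize(c) for c in taps])

print("full precision:", taps)
print("power of two:  ", q_taps)   # e.g. 0.52 -> 0.5 = 2**-1
```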
Development of a Practical Methodology for Elastic-Plastic and Fully Plastic Fatigue Crack Growth
NASA Technical Reports Server (NTRS)
McClung, R. C.; Chell, G. G.; Lee, Y. -D.; Russell, D. A.; Orient, G. E.
1999-01-01
A practical engineering methodology has been developed to analyze and predict fatigue crack growth rates under elastic-plastic and fully plastic conditions. The methodology employs the closure-corrected effective range of the J-integral, delta J(sub eff) as the governing parameter. The methodology contains original and literature J and delta J solutions for specific geometries, along with general methods for estimating J for other geometries and other loading conditions, including combined mechanical loading and combined primary and secondary loading. The methodology also contains specific practical algorithms that translate a J solution into a prediction of fatigue crack growth rate or life, including methods for determining crack opening levels, crack instability conditions, and material properties. A critical core subset of the J solutions and the practical algorithms has been implemented into independent elastic-plastic NASGRO modules. All components of the entire methodology, including the NASGRO modules, have been verified through analysis and experiment, and limits of applicability have been identified.
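The growth-rate prediction step can be pictured with a generic Paris-type power law in the effective J range. The sketch below integrates da/dN = C (ΔJ_eff)^m cycle by cycle; the constants and the stand-in ΔJ solution are hypothetical and are not the correlations or geometry solutions contained in the NASGRO modules.

```python
# Hypothetical Paris-type crack growth in terms of the effective J range:
# da/dN = C * (delta_J_eff)**m. Constants and the Delta-J "solution" below
# are illustrative stand-ins, not NASGRO data.
C, m = 1.0e-6, 1.5

def delta_j_eff(a, load_range=200.0):
    """Placeholder for a geometry-specific closure-corrected Delta-J solution."""
    return load_range * a

a = 0.001        # initial crack depth [m]
a_crit = 0.01    # assumed instability size [m]
cycles = 0
while a < a_crit and cycles < 10_000_000:
    a += C * delta_j_eff(a) ** m   # one load cycle of growth
    cycles += 1

print(f"predicted life from 1 mm to 10 mm: {cycles} cycles")
```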
Carayon, Pascale; Li, Yaqiong; Kelly, Michelle M.; DuBenske, Lori L.; Xie, Anping; McCabe, Brenna; Orne, Jason; Cox, Elizabeth D.
2014-01-01
Human factors and ergonomics methods are needed to redesign healthcare processes and support patient-centered care, in particular for vulnerable patients such as hospitalized children. We implemented and evaluated a stimulated recall methodology for collective confrontation in the context of family-centered rounds. Five parents and five healthcare team members reviewed video records of their bedside rounds and were then interviewed using the stimulated recall methodology to identify work system barriers and facilitators in family-centered rounds. The evaluation of the methodology was based on a survey of the participants and a qualitative analysis of interview data in light of the work system model of Smith and Carayon (1989; 2000). The participants provided positive survey feedback. The stimulated recall methodology identified barriers and facilitators in all work system elements. Participatory ergonomics methods such as the stimulated recall methodology allow a range of participants, including parents and children, to participate in healthcare process improvement. PMID:24894378
Reviewing the methodology of an integrative review.
Hopia, Hanna; Latvala, Eila; Liimatainen, Leena
2016-12-01
Whittemore and Knafl's updated description of the methodological approach for integrative reviews was published in 2005. Since then, the five stages of the approach have been regularly used as the basic conceptual structure of integrative reviews conducted by nursing researchers. However, this methodological approach is seldom examined from the perspective of how systematically and rigorously the stages are implemented in published integrative reviews. To appraise the selected integrative reviews on the basis of the methodological approach according to the five stages published by Whittemore and Knafl in 2005, a literature review was used in this study. CINAHL (Cumulative Index to Nursing and Allied Health), PubMed, OVID (Journals@Ovid) and the Cochrane Library databases were searched for integrative reviews published between 2002 and 2014. Papers were included if they used the methodological approach described by Whittemore and Knafl, were published in English and were focused on nursing education or nursing expertise. A total of 259 integrative review publications were identified for potential inclusion. Ten integrative reviews fulfilled the inclusion criteria. Findings from the studies were extracted and critically examined according to the five methodological stages. The reviews assessed followed the guidelines of the stated methodological approach to varying extents. The stages of literature search, data evaluation and data analysis were fairly poorly formulated and only partially implemented in the studies included in the sample. The other two stages, problem identification and presentation, followed those described in the methodological approach quite well. Increasing use of research in clinical practice is inevitable, and therefore integrative reviews can play a greater role in developing evidence-based nursing practices. Because of this, nurse researchers should pay more attention to sound integrative nursing research to systematise the review process and make it more rigorous. © 2016 Nordic College of Caring Science.
Peirlinck, Mathias; De Beule, Matthieu; Segers, Patrick; Rebelo, Nuno
2018-05-28
Patient-specific biomechanical modeling of the cardiovascular system is complicated by the presence of a physiological pressure load, given that the imaged tissue is in a pre-stressed and pre-strained state. Neglecting this prestressed state in solid tissue mechanics models leads to erroneous metrics (e.g. wall deformation, peak stress, wall shear stress), which in turn are used for device design choices, risk assessment (e.g. procedure, rupture) and surgery planning. It is thus of utmost importance to incorporate this deformed and loaded tissue state into the computational models, which implies solving an inverse problem (calculating an undeformed geometry given the load and the deformed geometry). Methodologies to solve this inverse problem can be categorized into iterative and direct methodologies, both having their inherent advantages and disadvantages. Direct methodologies are typically based on the inverse elastostatics (IE) approach and offer a computationally efficient single-shot methodology to compute the in vivo stress state. However, cumbersome and problem-specific derivations of the formulations, and non-trivial access to the finite element analysis (FEA) code, especially for commercial products, hinder a broad implementation of these methodologies. For that reason, we developed a novel, modular IE approach and implemented this methodology in a commercial FEA solver with minor user subroutine interventions. The accuracy of this methodology was demonstrated in an arterial tube and a porcine biventricular myocardium model. The computational power and efficiency of the methodology were shown by computing the in vivo stress and strain state, and the corresponding unloaded geometry, for two models containing multiple interacting incompressible, anisotropic (fiber-embedded) and hyperelastic material behaviors: a patient-specific abdominal aortic aneurysm and a full 4-chamber heart model. Copyright © 2018 Elsevier Ltd. All rights reserved.
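To make the inverse problem concrete: the iterative family of methods mentioned in the abstract can be reduced to a fixed-point update in which the current guess for the unloaded geometry is corrected by the mismatch between its reloaded shape and the imaged shape. The sketch below shows that idea on a toy one-dimensional "forward solver"; it is not the authors' modular IE implementation, which solves the problem directly instead.

```python
import numpy as np

def forward_deform(X, pressure=1.0):
    """Toy stand-in for a forward FE solve: maps unloaded coords X to loaded coords."""
    return X + pressure * 0.05 * np.sin(X)   # illustrative nonlinear deformation

x_imaged = np.array([0.10, 0.55, 1.20])      # "imaged" (loaded) geometry
X = x_imaged.copy()                          # initial guess: unloaded = imaged

for _ in range(100):
    residual = forward_deform(X) - x_imaged  # where the reloaded guess lands vs. target
    X -= residual                            # fixed-point update
    if np.max(np.abs(residual)) < 1e-12:
        break

print("recovered unloaded geometry:", X)
print("check, reloaded:", forward_deform(X))  # should reproduce x_imaged
```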
Improving Reports Turnaround Time: An Essential Healthcare Quality Dimension.
Khan, Mustafa; Khalid, Parwaiz; Al-Said, Youssef; Cupler, Edward; Almorsy, Lamia; Khalifa, Mohamed
2016-01-01
Turnaround time is one of the most important healthcare performance indicators. King Faisal Specialist Hospital and Research Center in Jeddah, Saudi Arabia worked on reducing the report turnaround time of the neurophysiology lab from more than two weeks to only five working days for 90% of cases. The main quality improvement methodology used was FOCUS PDCA. Using root cause analysis, Pareto analysis and qualitative survey methods, the main factors contributing to the delay in turnaround time and the suggested improvement strategies were identified and implemented: restructuring transcriptionists' daily tasks, rescheduling physicians' time and alerting them to new reports, engaging consultants, coordinating consistently and prioritizing critical reports. After implementation, 92% of reports were verified within 5 days, compared with only 6% before implementation; 7% of reports were verified in 5 days to 2 weeks, and only 1% of reports needed more than 2 weeks, compared with 76% before implementation.
NASA Technical Reports Server (NTRS)
Ebeling, Charles
1993-01-01
This report documents the work accomplished during the first two years of research to provide support to NASA in predicting operational and support parameters and costs of proposed space systems. The first year's research developed a methodology for deriving reliability and maintainability (R & M) parameters based upon the use of regression analysis to establish empirical relationships between performance and design specifications and corresponding mean times of failure and repair. The second year focused on enhancements to the methodology, increased scope of the model, and software improvements. This follow-on effort expands the prediction of R & M parameters and their effect on the operations and support of space transportation vehicles to include other system components such as booster rockets and external fuel tanks. It also increases the scope of the methodology and the capabilities of the model as implemented by the software. The focus is on the failure and repair of major subsystems and their impact on vehicle reliability, turn times, maintenance manpower, and repairable spares requirements. The report documents the data utilized in this study, outlines the general methodology for estimating and relating R&M parameters, presents the analyses and results of application to the initial data base, and describes the implementation of the methodology through the use of a computer model. The report concludes with a discussion on validation and a summary of the research findings and results.
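The regression step described here can be illustrated with a simple log-linear fit relating a design variable to mean time between failures. The data and the choice of a single predictor are invented for the sketch; the report's actual model relates several performance and design specifications to both failure and repair times.

```python
import numpy as np

# Synthetic subsystem data (illustrative): dry weight [kg] vs. observed MTBF [h].
weight = np.array([120.0, 250.0, 430.0, 610.0, 900.0])
mtbf = np.array([5200.0, 3100.0, 2200.0, 1700.0, 1250.0])

# Fit log(MTBF) = a + b * log(weight) by ordinary least squares.
A = np.vstack([np.ones_like(weight), np.log(weight)]).T
(a, b), *_ = np.linalg.lstsq(A, np.log(mtbf), rcond=None)

predict_mtbf = lambda w: np.exp(a + b * np.log(w))
print(f"predicted MTBF for a 500 kg subsystem: {predict_mtbf(500.0):.0f} h")
```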
NASA Technical Reports Server (NTRS)
Funk, Christie J.; Perry, Boyd, III; Silva, Walter A.; Newman, Brett
2014-01-01
A software program and associated methodology to study gust loading on aircraft exists for a class of geometrically simplified flexible configurations. This program consists of a simple aircraft response model with two rigid and three flexible symmetric degrees of freedom and allows for the calculation of various airplane responses due to a discrete one-minus-cosine gust as well as continuous turbulence. Simplifications, assumptions, and opportunities for potential improvements pertaining to the existing software program are first identified; then a revised version of the original software tool is developed with improved methodology to include more complex geometries, additional excitation cases, and additional output data, so as to provide a more useful and precise tool for gust load analysis. To enhance the usefulness of the original program, wing and horizontal tail control surfaces are added, an extended application of the discrete one-minus-cosine gust input is employed, a supplemental continuous turbulence spectrum is implemented, and a capability to animate the total vehicle deformation response to gust inputs is included. These revisions and enhancements are implemented, and an analysis of the results is used to validate the modifications.
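For reference, the discrete one-minus-cosine gust mentioned here has a standard closed form. The sketch below uses one common convention (peak gust velocity U_ds reached at the gradient distance H); the program's exact parameterization may differ.

```python
import numpy as np

def gust_velocity(s, U_ds, H):
    """One-minus-cosine discrete gust, one common convention:
    U(s) = (U_ds / 2) * (1 - cos(pi * s / H)) for 0 <= s <= 2H, zero outside.
    s is distance penetrated into the gust, H the gust gradient distance."""
    s = np.asarray(s, dtype=float)
    u = 0.5 * U_ds * (1.0 - np.cos(np.pi * s / H))
    return np.where((s >= 0.0) & (s <= 2.0 * H), u, 0.0)

s = np.linspace(0.0, 250.0, 6)
print(gust_velocity(s, U_ds=15.0, H=100.0))  # peaks at U_ds when s = H
```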
SCALE Continuous-Energy Eigenvalue Sensitivity Coefficient Calculations
Perfetti, Christopher M.; Rearden, Bradley T.; Martin, William R.
2016-02-25
Sensitivity coefficients describe the fractional change in a system response that is induced by changes to system parameters and nuclear data. The Tools for Sensitivity and UNcertainty Analysis Methodology Implementation (TSUNAMI) code within the SCALE code system makes use of eigenvalue sensitivity coefficients for an extensive number of criticality safety applications, including quantifying the data-induced uncertainty in the eigenvalue of critical systems, assessing the neutronic similarity between different critical systems, and guiding nuclear data adjustment studies. The need to model geometrically complex systems with improved fidelity and the desire to extend TSUNAMI analysis to advanced applications has motivated the development of a methodology for calculating sensitivity coefficients in continuous-energy (CE) Monte Carlo applications. The Contributon-Linked eigenvalue sensitivity/Uncertainty estimation via Tracklength importance CHaracterization (CLUTCH) and Iterated Fission Probability (IFP) eigenvalue sensitivity methods were recently implemented in the CE-KENO framework of the SCALE code system to enable TSUNAMI-3D to perform eigenvalue sensitivity calculations using continuous-energy Monte Carlo methods. This work provides a detailed description of the theory behind the CLUTCH method and describes its implementation in detail. This work explores the improvements in eigenvalue sensitivity coefficient accuracy that can be gained through the use of continuous-energy sensitivity methods and also compares several sensitivity methods in terms of computational efficiency and memory requirements.
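For readers unfamiliar with the quantity being computed: an eigenvalue sensitivity coefficient is conventionally defined as the relative change in the multiplication factor k per relative change in a nuclear data parameter Σ (for example, a reaction cross section):

```latex
S_{k,\Sigma} \;=\; \frac{\delta k / k}{\delta \Sigma / \Sigma}
```

So S = 0.1 means a 1% increase in Σ raises k by roughly 0.1%. CLUTCH and IFP are two Monte Carlo estimators of this same quantity.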
Calhoun, Aaron W; Rider, Elizabeth A; Peterson, Eleanor; Meyer, Elaine C
2010-09-01
Multi-rater assessment with gap analysis is a powerful method for assessing communication skills and self-insight, and for enhancing self-reflection. We demonstrate the use of this methodology. The Program for the Approach to Complex Encounters (PACE) is an interdisciplinary simulation-based communication skills program. Encounters are assessed using an expanded Kalamazoo Consensus Statement Essential Elements Checklist adapted for multi-rater feedback and gap analysis. Data from a representative conversation were analyzed. Likert and forced-choice data with gap analysis are used to assess performance. Participants were strong in Demonstrating Empathy and Providing Closure, and needed to improve Relationship Building, Gathering Information, and understanding the Patient's/Family's Perspective. Participants under-appraised their abilities in Relationship Building, Providing Closure, and Demonstrating Empathy, as well as their overall performance. The conversion of these results into verbal feedback is discussed. We describe an evaluation methodology using multi-rater assessment with gap analysis to assess communication skills and self-insight. This methodology enables faculty to identify undervalued skills and perceptual blind spots, provide comprehensive, data-driven feedback, and encourage reflection. Implementation of graphical feedback forms coupled with one-on-one discussion using the above methodology has the potential to enhance trainee self-awareness and reflection, improving the impact of educational programs. Copyright (c) 2010 Elsevier Ireland Ltd. All rights reserved.
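The gap-analysis arithmetic is simple enough to sketch: for each checklist element, the gap is the trainee's self-rating minus the mean of the other raters' scores, with negative gaps flagging under-appraised skills. The ratings below are fabricated for illustration and do not come from the PACE data.

```python
import numpy as np

elements = ["Relationship Building", "Gathering Information", "Demonstrating Empathy"]
self_scores = np.array([4.0, 4.0, 3.0])          # trainee's self-ratings (1-5 Likert)
other_raters = np.array([[3.0, 3.5, 4.5],        # rater 1
                         [3.5, 3.0, 4.0],        # rater 2
                         [3.0, 3.5, 4.5]])       # rater 3

gaps = self_scores - other_raters.mean(axis=0)   # >0 over-appraisal, <0 under-appraisal
for name, gap in zip(elements, gaps):
    label = "over-appraised" if gap > 0 else "under-appraised"
    print(f"{name}: gap {gap:+.2f} ({label})")
```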
System-wide lean implementation in health care: A multiple case study.
Centauri, Federica; Mazzocato, Pamela; Villa, Stefano; Marsilio, Marta
2018-05-01
Background: Lean practices have been widely used by health care organizations to meet efficiency, performance and quality improvement needs. The lean health care literature shows that the effective implementation of lean requires a holistic system-wide approach. However, there is still limited evidence on what drives effective system-wide lean implementation in health care. The existing literature suggests that a deeper understanding of how lean interventions interact with the organizational context is necessary to identify the critical variables for successfully sustaining system-wide lean strategies. Purpose and methodology: A multiple case study of three Italian hospitals was conducted with the aim of exploring the organizational conditions that are relevant for an effective system-wide lean implementation. A conceptual framework, built on socio-technical system schemas, is used to guide data collection and analysis. The analysis points out the importance of supporting lean implementation with an integrated and coordinated strategy involving the social, technical, and external components of the overall hospital system.
Harries, Bruce; Filiatrault, Lyne; Abu-Laban, Riyad B
2018-05-30
Quality improvement (QI) analytic methodology is rarely encountered in the emergency medicine literature. We sought to comparatively apply QI design and analysis techniques to an existing data set, and discuss these techniques as an alternative to standard research methodology for evaluating a change in a process of care. We used data from a previously published randomized controlled trial on triage-nurse initiated radiography using the Ottawa ankle rules (OAR). QI analytic tools were applied to the data set from this study and evaluated comparatively against the original standard research methodology. The original study concluded that triage nurse-initiated radiographs led to a statistically significant decrease in mean emergency department length of stay. Using QI analytic methodology, we applied control charts and interpreted the results using established methods that preserved the time sequence of the data. This analysis found a compelling signal of a positive treatment effect that would have been identified after the enrolment of 58% of the original study sample, and in the 6th month of this 11-month study. Our comparative analysis demonstrates some of the potential benefits of QI analytic methodology. We found that had this approach been used in the original study, insights regarding the benefits of nurse-initiated radiography using the OAR would have been achieved earlier, and thus potentially at a lower cost. In situations where the overarching aim is to accelerate implementation of practice improvement to benefit future patients, we believe that increased consideration should be given to the use of QI analytic methodology.
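The control-chart technique referred to can be sketched with an individuals (XmR) chart: the centre line is the mean, and the limits sit 2.66 average moving ranges either side of it. The monthly length-of-stay values below are synthetic, chosen only to show how a step change stands out; they are not the study's data.

```python
import numpy as np

# Synthetic monthly mean ED length of stay [min]; a step change occurs at month 5.
x = np.array([182.0, 175.0, 178.0, 181.0, 140.0, 138.0,
              142.0, 136.0, 139.0, 141.0, 137.0])

baseline = x[:4]                        # limits computed from pre-change months only
mr = np.abs(np.diff(baseline))          # moving ranges between consecutive points
cl, mr_bar = baseline.mean(), mr.mean()
ucl, lcl = cl + 2.66 * mr_bar, cl - 2.66 * mr_bar   # XmR individuals-chart limits

below = np.where(x < lcl)[0]
print(f"CL={cl:.1f}, LCL={lcl:.1f}; months signalling improvement: {below}")
```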
Dickerson, Jane A; Schmeling, Michael; Hoofnagle, Andrew N; Hoffman, Noah G
2013-01-16
Mass spectrometry provides a powerful platform for performing quantitative, multiplexed assays in the clinical laboratory, but at the cost of increased complexity of analysis and quality assurance calculations compared to other methodologies. Here we describe the design and implementation of a software application that performs quality control calculations for a complex, multiplexed, mass spectrometric analysis of opioids and opioid metabolites. The development and implementation of this application improved our data analysis and quality assurance processes in several ways. First, use of the software significantly improved the procedural consistency for performing quality control calculations. Second, it reduced the amount of time technologists spent preparing and reviewing the data, saving on average over four hours per run, and in some cases improving turnaround time by a day. Third, it provides a mechanism for coupling procedural and software changes with the results of each analysis. We describe several key details of the implementation including the use of version control software and automated unit tests. These generally useful software engineering principles should be considered for any software development project in the clinical lab. Copyright © 2012 Elsevier B.V. All rights reserved.
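The paper's point about automated unit tests can be illustrated with a toy QC rule. The ion-ratio check below is a hypothetical example of the kind of calculation that benefits from test coverage; the function name, tolerance, and rule are inventions, not the software described in the paper.

```python
import unittest

def ion_ratio_ok(qual_area, quant_area, expected_ratio, tol=0.3):
    """Accept a result only if the qualifier/quantifier ion ratio is within
    a fractional tolerance of the expected ratio (hypothetical QC rule)."""
    if quant_area <= 0:
        return False
    return abs(qual_area / quant_area - expected_ratio) <= tol * expected_ratio

class TestIonRatio(unittest.TestCase):
    def test_within_tolerance(self):
        self.assertTrue(ion_ratio_ok(55.0, 100.0, expected_ratio=0.5))

    def test_outside_tolerance(self):
        self.assertFalse(ion_ratio_ok(90.0, 100.0, expected_ratio=0.5))

    def test_zero_quantifier_area(self):
        self.assertFalse(ion_ratio_ok(10.0, 0.0, expected_ratio=0.5))

if __name__ == "__main__":
    unittest.main()
```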
Using lean methodology to improve productivity in a hospital oncology pharmacy.
Sullivan, Peter; Soefje, Scott; Reinhart, David; McGeary, Catherine; Cabie, Eric D
2014-09-01
Quality improvements achieved by a hospital pharmacy through the use of lean methodology to guide i.v. compounding workflow changes are described. The outpatient oncology pharmacy of Yale-New Haven Hospital conducted a quality-improvement initiative to identify and implement workflow changes to support a major expansion of chemotherapy services. Applying concepts of lean methodology (i.e., elimination of non-value-added steps and waste in the production process), the pharmacy team performed a failure mode and effects analysis, workflow mapping, and impact analysis; staff pharmacists and pharmacy technicians identified 38 opportunities to decrease waste and increase efficiency. Three workflow processes (order verification, compounding, and delivery) accounted for 24 of 38 recommendations and were targeted for lean process improvements. The workflow was decreased to 14 steps, eliminating 6 non-value-added steps, and pharmacy staff resources and schedules were realigned with the streamlined workflow. The time required for pharmacist verification of patient-specific oncology orders was decreased by 33%; the time required for product verification was decreased by 52%. The average medication delivery time was decreased by 47%. The results of baseline and postimplementation time trials indicated a decrease in overall turnaround time to about 70 minutes, compared with a baseline time of about 90 minutes. The use of lean methodology to identify non-value-added steps in oncology order processing and the implementation of staff-recommended workflow changes resulted in an overall reduction in the turnaround time per dose. Copyright © 2014 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Pamadi, Bandu N.; Toniolo, Matthew D.; Tartabini, Paul V.; Roithmayr, Carlos M.; Albertson, Cindy W.; Karlgaard, Christopher D.
2016-01-01
The objective of this report is to develop and implement a physics-based method for the analysis and simulation of multi-body dynamics, including launch vehicle stage separation. The constraint force equation (CFE) methodology discussed in this report provides such a framework for modeling constraint forces and moments acting at joints while the vehicles are still connected. Several stand-alone test cases involving various types of joints were developed to validate the CFE methodology. The results were compared with ADAMS(Registered Trademark) and Autolev, two industry-standard benchmark codes for multi-body dynamic analysis and simulation. However, these two codes are not designed for aerospace flight trajectory simulations. After this validation exercise, the CFE algorithm was implemented in the Program to Optimize Simulated Trajectories II (POST2) to provide a capability to simulate end-to-end trajectories of launch vehicles including stage separation. The POST2/CFE methodology was applied to the STS-1 Space Shuttle solid rocket booster (SRB) separation and the Hyper-X Research Vehicle (HXRV) separation from the Pegasus booster as a further test and validation of its application to launch vehicle stage separation problems. Finally, to demonstrate end-to-end simulation capability, POST2/CFE was applied to the ascent, orbit insertion, and booster return of a reusable two-stage-to-orbit (TSTO) vehicle concept. With these validation exercises, POST2/CFE software can be used for performing conceptual-level end-to-end simulations, including launch vehicle stage separation, for problems similar to those discussed in this report.
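The general idea of solving for joint constraint forces while bodies remain connected can be sketched with a standard Lagrange-multiplier formulation: accelerations a and multipliers lambda satisfy M a + J^T lambda = F together with the constraint equation J a = rhs. This is a textbook construction, not the specific CFE equations; the two-mass example is invented.

```python
import numpy as np

def constrained_accel(M, F, J, rhs):
    """Solve the saddle-point system [[M, J^T], [J, 0]] [a; lam] = [F; rhs]
    for accelerations a and constraint-force multipliers lam."""
    n, m = M.shape[0], J.shape[0]
    K = np.block([[M, J.T], [J, np.zeros((m, m))]])
    sol = np.linalg.solve(K, np.concatenate([F, rhs]))
    return sol[:n], sol[n:]

# Two unit masses rigidly joined in 1-D (constraint: a1 - a2 = 0),
# with a 10 N force applied to body 1 only.
M = np.eye(2)
F = np.array([10.0, 0.0])
J = np.array([[1.0, -1.0]])
a, lam = constrained_accel(M, F, J, rhs=np.array([0.0]))
print("accelerations:", a, "joint force multiplier:", lam)  # both masses at 5 m/s^2
```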
Ward, Marie; McAuliffe, Eilish; Wakai, Abel; Geary, Una; Browne, John; Deasy, Conor; Schull, Michael; Boland, Fiona; McDaid, Fiona; Coughlan, Eoin; O'Sullivan, Ronan
2017-01-23
Early detection of patient deterioration is a key element of patient safety, as it allows timely clinical intervention and potential rescue, thus reducing the risks of serious patient safety incidents. Longitudinal patient monitoring systems have been widely recommended for use to detect clinical deterioration. However, there is conflicting evidence on whether they improve patient outcomes. This may in part be related to variation in the rigour with which they are implemented and evaluated. This study aims to evaluate the implementation and effectiveness of a longitudinal patient monitoring system designed for adult patients in the unique environment of the Emergency Department (ED). A novel participatory action research (PAR) approach is taken in which socio-technical systems (STS) theory and analysis inform the implementation through the improvement methodology of 'Plan Do Study Act' (PDSA) cycles. We hypothesise that conducting an STS analysis of the ED before beginning the PDSA cycles will provide a much richer understanding of the current situation and possible challenges to implementing the ED-specific longitudinal patient monitoring system. This methodology will enable both a process and an outcome evaluation of implementing the ED-specific longitudinal patient monitoring system. Process evaluations can help distinguish between interventions that have inherent faults and those that are badly executed. Over 1.2 million patients attend EDs annually in Ireland; the successful implementation of an ED-specific longitudinal patient monitoring system has the potential to affect the care of a significant number of such patients. To the best of our knowledge, this is the first study combining PAR, STS and multiple PDSA cycles to evaluate the implementation of an ED-specific longitudinal patient monitoring system and to determine (through process and outcome evaluation) whether this system can significantly improve patient outcomes by early detection and appropriate intervention for patients at risk of clinical deterioration.
Shankardass, Ketan; Renahy, Emilie; Muntaner, Carles; O'Campo, Patricia
2015-05-01
To address macro-social and economic determinants of health and equity, there has been growing use of intersectoral action by governments around the world. Health in All Policies (HiAP) initiatives are a special case where governments use cross-sectoral structures and relationships to systematically address health in policymaking by targeting broad health determinants rather than health services alone. Although many examples of HiAP have emerged in recent decades, the reasons for their successful implementation--and for implementation failures--have not been systematically studied. Consequently, rigorous evidence based on systematic research of the social mechanisms that have regularly enabled or hindered implementation in different jurisdictions is sparse. We describe a novel methodology for explanatory case studies that use a scientific realist perspective to study the implementation of HiAP. Our methodology begins with the formulation of a conceptual framework to describe contexts, social mechanisms and outcomes of relevance to the sustainable implementation of HiAP. We then describe the process of systematically explaining phenomena of interest using evidence from literature and key informant interviews, and looking for patterns and themes. Finally, we present a comparative example of how Health Impact Assessment tools have been utilized in Sweden and Quebec to illustrate how this methodology uses evidence to first describe successful practices for implementation of HiAP and then refine the initial framework. The methodology that we describe helps researchers to identify and triangulate rich evidence describing social mechanisms and salient contextual factors that characterize successful practices in implementing HiAP in specific jurisdictions. This methodology can be applied to study the implementation of HiAP and other forms of intersectoral action to reduce health inequities involving multiple geographic levels of government in diverse settings. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine © The Author 2014; all rights reserved.
NASA Astrophysics Data System (ADS)
Nemeth, Noel N.; Jadaan, Osama M.; Palfi, Tamas; Baker, Eric H.
Brittle materials today are being used, or considered, for a wide variety of high-tech applications that operate in harsh environments, including static and rotating turbine parts, thermal protection systems, dental prosthetics, fuel cells, oxygen transport membranes, radomes, and MEMS. Designing brittle material components to sustain repeated load without fracturing while using the minimum amount of material requires the use of a probabilistic design methodology. The NASA CARES/Life (Ceramic Analysis and Reliability Evaluation of Structure/Life) code provides a general-purpose analysis tool that predicts the probability of failure of a ceramic component as a function of its time in service. This capability includes predicting the time-dependent failure probability of ceramic components against catastrophic rupture when subjected to transient thermomechanical loads (including cyclic loads). The developed methodology allows for changes in material response that can occur with temperature or time (i.e. changing fatigue and Weibull parameters with temperature or time). This article gives an overview of the transient reliability methodology and describes how it is extended to account for proof testing. The CARES/Life code has been modified to interface with commercially available finite element analysis (FEA) codes executed for transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.
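The probabilistic kernel of such a design methodology is Weibull failure statistics. The sketch below evaluates the simplest special case, a two-parameter Weibull probability of failure for a uniformly stressed component; the strength parameters are illustrative, and the full CARES/Life formulation integrates over the stress field and evolves the parameters in time.

```python
import numpy as np

def weibull_pof(sigma, sigma_0, m):
    """Two-parameter Weibull probability of failure for a uniformly
    stressed component: P_f = 1 - exp(-(sigma / sigma_0)**m)."""
    return 1.0 - np.exp(-(np.asarray(sigma) / sigma_0) ** m)

# Illustrative ceramic: characteristic strength 400 MPa, Weibull modulus 10.
for s in (200.0, 300.0, 400.0):
    print(f"sigma = {s:.0f} MPa -> P_f = {weibull_pof(s, 400.0, 10.0):.4f}")
```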
[A functional analysis of healthcare auditors' skills in Venezuela, 2008].
Chirinos-Muñoz, Mónica S
2010-10-01
Functional analysis was used to identify the basic, working, specific and generic skills and values which a health service auditor must have. The functional analysis technique was implemented with 10 experts, identifying specific, basic and generic skills and values by means of deductive logic. A functional map was obtained which started by establishing a key purpose based on improving healthcare and service quality, from which three key functions emerged. The main functions and skill units were then broken down into the competence elements defining what a health service auditor is able to do. This functional map (following functional analysis methodology) shows in detail the simple and complex tasks which a healthcare auditor should apply in the workplace, adopting a forward management approach for improving healthcare and health service quality. This methodology, expressing logical-deductive awareness raising, provides expert consensual information validating each element regarding overall skills.
NASA Technical Reports Server (NTRS)
Biernacki, John; Juhasz, John; Sadler, Gerald
1991-01-01
A team of Space Station Freedom (SSF) system engineers is conducting an extensive analysis of the SSF requirements, particularly those pertaining to the electrical power system (EPS). The objective of this analysis is the development of a comprehensive, computer-based requirements model using an enhanced modern structured analysis (EMSA) methodology. Such a model provides a detailed and consistent representation of the system's requirements. The process outlined in the EMSA methodology is unique in that it allows the graphical modeling of real-time system state transitions, as well as functional requirements and data relationships, to be implemented using modern computer-based tools. These tools permit flexible updating and continuous maintenance of the models. Initial findings resulting from the application of EMSA to the EPS have benefited the space station program by linking requirements to design, providing traceability of requirements, identifying discrepancies, and fostering an understanding of the EPS.
Applied Swarm-based medicine: collecting decision trees for patterns of algorithms analysis.
Panje, Cédric M; Glatzer, Markus; von Rappard, Joscha; Rothermundt, Christian; Hundsberger, Thomas; Zumstein, Valentin; Plasswilm, Ludwig; Putora, Paul Martin
2017-08-16
The objective consensus methodology has recently been applied to consensus finding in several studies of medical decision-making among clinical experts or guidelines. The main advantages of this method are an automated analysis and comparison of the treatment algorithms of the participating centers, which can be performed anonymously. Based on the experience from completed consensus analyses, the main steps for the successful implementation of the objective consensus methodology were identified and discussed among the main investigators. The following steps for the successful collection and conversion of decision trees were identified and defined in detail: problem definition, population selection, draft input collection, tree conversion, criteria adaptation, problem re-evaluation, results distribution and refinement, tree finalisation, and analysis. This manuscript provides information on the main steps for successful collection of decision trees and summarizes important aspects at each point of the analysis.
Saturno-Hernández, Pedro J; Gutiérrez-Reyes, Juan Pablo; Vieyra-Romero, Waldo Ivan; Romero-Martínez, Martín; O'Shea-Cuevas, Gabriel Jaime; Lozano-Herrera, Javier; Tavera-Martínez, Sonia; Hernández-Ávila, Mauricio
2016-01-01
To describe the conceptual framework and methods for implementation and analysis of the satisfaction survey of the Mexican System for Social Protection in Health, we analyze the methodological elements of the 2013, 2014 and 2015 surveys, including the instrument, sampling method and study design, conceptual framework, and characteristics and indicators of the analysis. The survey captures information on perceived quality and satisfaction. The sample is representative at the national and state levels. Simple and composite indicators (an index of satisfaction and a rate of reported quality problems) are built and described. The analysis is completed using Pareto diagrams, correlations between indicators and associations with satisfaction by means of multivariate models. The measurement of satisfaction and perceived quality is a complex but necessary process for complying with regulations and identifying strategies for improvement. The described survey presents a rigorous design and analysis focused on its utility for improvement.
NASA Astrophysics Data System (ADS)
Bruno, N.; Roncella, R.
2018-05-01
The need to safeguard and preserve Cultural Heritage (CH) is increasing, and especially in Italy, where the number of historical buildings is considerable, efficient and standardized processes for CH management and conservation become strategic. At present, there are no tools capable of fulfilling all the specific functions required by Cultural Heritage documentation and, due to the complexity of historical assets, no solutions as flexible and customizable as CH-specific needs require. Nevertheless, BIM methodology can represent the most effective solution, on condition that proper methodologies, tools and functions are made available. The paper describes ongoing research on the implementation of a Historical BIM (HBIM) system for the Parma cathedral, aimed at maintenance, conservation and restoration. Its main goal is to give a concrete answer to the lack of specific tools required by Cultural Heritage documentation: organized and coordinated storage and management of historical data, easy analysis and query, time management, 3D modelling of irregular shapes, flexibility, user-friendliness, etc. The paper describes the project and the implemented methodology, focusing mainly on the survey and modelling phases. In describing the methodology, critical issues in the creation of a HBIM are highlighted, in an attempt to outline a workflow applicable in other similar contexts.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-27
... has obtained an OMB generic clearance to conduct survey and instrument design and administration... conduct the detailed preparation needed for a study of this size and complexity, the NCS was designed to... methodological studies conducted during the Vanguard phase will inform the implementation and analysis plan for...
Evaluating a Policing Strategy Intended to Disrupt an Illicit Street-Level Drug Market
ERIC Educational Resources Information Center
Corsaro, Nicholas; Brunson, Rod K.; McGarrell, Edmund F.
2010-01-01
The authors examined a strategic policing initiative that was implemented in a high crime Nashville, Tennessee neighborhood by utilizing a mixed-methodological evaluation approach in order to provide (a) a descriptive process assessment of program fidelity; (b) an interrupted time-series analysis relying upon generalized linear models; (c)…
Environmental Management and Sustainability in Higher Education: The Case of Spanish Universities
ERIC Educational Resources Information Center
Leon-Fernandez, Yolanda; Domínguez-Vilches, Eugenio
2015-01-01
Purpose: This paper aims to analyse trends in implementing the main initiatives in the field of environmental management and sustainability in Spanish universities, taking as a reference point the guidelines adopted by a number of universities in countries most committed to sustainable development. Design/methodology/approach: An analysis of…
A Summary of Project Open Horizons, Phase I: Implementation and Data Analysis.
ERIC Educational Resources Information Center
Grantham, Robert J.; Gordon, Myra
Project "Open Horizons," in Buffalo and Niagara Falls, New York, was born out of a recognition that minority adolescents in disadvantaged communities face serious social and personal problems in the area of career development. The originators of the project were seeking an effective methodology for exposing disadvantaged youth to a…
ERIC Educational Resources Information Center
Cheng, Yin Cheong; Yuen, Timothy W. W.
2017-01-01
Purpose: The purpose of this paper is to contribute to the worldwide discussion of conceptualization, multiple functions and management of national education in an era of globalisation by proposing a new comprehensive framework for research, policy analysis and practical implementation. Design/Methodology/Approach: Based on a review of the…
Dynamical Geometry: Analysis of Mistakes in Student Constructions
ERIC Educational Resources Information Center
Vanicek, Jiri
2007-01-01
In the early stages of working with dynamical geometry environments, students make many more mistakes than if they thought out and implemented the same constructions on paper. Most Czech teachers have very little experience of doing geometry using computers. A methodology which could help them to teach students to avoid mistakes dependent on the…
Principles of Assessment for Project and Research Based Learning
ERIC Educational Resources Information Center
Hunaiti, Ziad; Grimaldi, Silvia; Goven, Dharmendra; Mootanah, Rajshree; Martin, Louise
2010-01-01
Purpose: The purpose of this paper is to provide assessment guidelines which help to implement research-based education in science and technology areas, which would benefit from the quality of this type of education within this subject area. Design/methodology/approach: This paper is a reflection on, and analysis of, different aspects of…
Getting State Education Data Right: What We Can Learn from Tennessee
ERIC Educational Resources Information Center
Jones, Joseph; Southern, Kyle
2011-01-01
Federal education policy in recent years has encouraged state and local education agencies to embrace data use and analysis in decision-making, ranging from policy development and implementation to performance evaluation. The capacity of these agencies to make effective and methodologically sound use of collected data for these purposes remains an…
ERIC Educational Resources Information Center
Jung, Steven M.; And Others
Survey activities are reported which were designed to provide the foundation for a national evaluation of the effectiveness of programs assisted under the Career Education Incentive Act of 1977 (PL 95-207). The methodology described, called "program evaluability assessment," focuses on detailed analysis of program assumptions in order to…
Important Revelations about School Reform: Looking at and beyond Reading First
ERIC Educational Resources Information Center
Barone, Diane
2013-01-01
This article reports on Nevada's Reading First program and positions it as a source of reflection for future worldwide literacy reform efforts. Qualitative methodology was used for this analysis. Students' literacy achievement improved throughout the program until the last year of implementation. Students who remained at a Reading First school…
2014-04-01
(Recovered figure caption: "Figure 4: Example cognitive map".) …map, aligning planning efforts throughout the government. Even after strategy implementation, SDI calls for continuing, iterative learning and…the design before total commitment to it. Capturing this analysis on a cognitive map allows strategists to articulate a design to government…
Propensity Score Estimation with Data Mining Techniques: Alternatives to Logistic Regression
ERIC Educational Resources Information Center
Keller, Bryan S. B.; Kim, Jee-Seon; Steiner, Peter M.
2013-01-01
Propensity score analysis (PSA) is a methodological technique which may correct for selection bias in a quasi-experiment by modeling the selection process using observed covariates. Because logistic regression is well understood by researchers in a variety of fields and easy to implement in a number of popular software packages, it has…
What Do Police Academy Instructors and STEM Teachers Have in Common? The "Mission Paradox"
ERIC Educational Resources Information Center
Even Zahav, Anat; Shahar, Sigalit; Hazzan, Orit
2016-01-01
This article presents the "Mission Paradox," shared by two public sector organizations in Israel: the police training system and the post-primary STEM education system. The "Mission Paradox" was identified in data analysis of two doctoral studies, which implemented a qualitative methodology. The study's purpose was to analyze…
2012-08-01
(Recovered table-of-contents fragments: "Normalized Difference Vegetation Index (NDVI)"; "Methodology"; "Atmospheric Compensation".) …anomaly detection algorithms are contrasted and implemented, and the use of the Normalized Difference Vegetation Index (NDVI) in post…
Introduction on performance analysis and profiling methodologies for KVM on ARM virtualization
NASA Astrophysics Data System (ADS)
Motakis, Antonios; Spyridakis, Alexander; Raho, Daniel
2013-05-01
The introduction of hardware virtualization extensions on ARM Cortex-A15 processors has enabled the implementation of full virtualization solutions for this architecture, such as KVM on ARM. This trend motivates the need to quantify and understand the performance impact that emerges from the application of this technology. In this work we examine some key performance metrics of KVM on ARM processors, which can provide useful insight that may lead to potential improvements in the future. This includes measurements such as interrupt latency and guest exit cost, performed on ARM Versatile Express and Samsung Exynos 5250 hardware platforms. Furthermore, we discuss additional methodologies that can provide a deeper understanding of the performance footprint of KVM in the future. We identify some of the most interesting approaches in this field and perform a tentative analysis of how these may be implemented in the KVM on ARM port. These take into consideration hardware- and software-based counters for profiling, and issues related to the limitations of the simulators that are often used, such as the ARM Fast Models platform.
Hover, Alexander R; Sistrunk, William W; Cavagnol, Robert M; Scarrow, Alan; Finley, Phillip J; Kroencke, Audrey D; Walker, Judith L
2014-01-01
Mercy Hospital Springfield is a tertiary care facility with 32 000 discharges and 15 000 inpatient surgeries in 2011. From June 2009 through January 2011, a stable inpatient elective neurosurgery infection rate of 2.15% was observed. The failure mode and effects analysis (FMEA) methodology was utilized to reduce inpatient neurosurgery infections. Following FMEA implementation, the overall elective neurosurgery infection rate was reduced to 1.51% and sustained through May 2012. Compared with baseline, the post-FMEA deep-space and organ infection rate was reduced by 41% (P = .052). Overall hospital inpatient clean surgery infection rates for the same time frame did not decrease to the same extent, suggesting a specific effect of the FMEA. The study team believes that the FMEA interventions resulted in 14 fewer expected infections, $270 270 in savings, a 168-day reduction in expected length of stay, and 22 fewer readmissions. Given the serious morbidity and cost of health care-associated infections, the study team concludes that FMEA implementation was clinically cost-effective. © 2013 by the American College of Medical Quality.
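A core FMEA calculation is the risk priority number, RPN = severity x occurrence x detection, used to rank failure modes for intervention. The failure modes and scores below are hypothetical illustrations, not the items from the Mercy Hospital analysis.

```python
# Hypothetical infection-related failure modes scored 1-10 on
# severity (S), occurrence (O) and detection (D); RPN = S * O * D.
failure_modes = [
    ("Inconsistent pre-op skin preparation", 8, 5, 4),
    ("Break in sterile technique at draping", 9, 3, 5),
    ("Delayed prophylactic antibiotic dose",  7, 4, 2),
]

ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
for name, s, o, d in ranked:
    print(f"RPN {s * o * d:3d}  {name}")   # highest RPN = first target for redesign
```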
NASA Astrophysics Data System (ADS)
Rasmeni, Zelda; Pan, Xiaowei
2017-07-01
The Quick-E-Scan methodology is a simple and rapid method used to achieve operational energy efficiency, as opposed to detailed energy audits, and therefore offers no-cost or low-cost solutions for energy management programs with a limited budget. The Quick-E-Scan methodology was used to assess a steel foundry plant based in Benoni by dividing the foundry into production sections. This entailed a review of the current processes and patterns of energy usage within the plant and a detailed analysis of the options available for improvement, identifying profitable areas in which energy-saving measures may be implemented to increase energy efficiency, which can be presented to the management of the company.
Strategic implementation of integrated water resources management in Mozambique: An A’WOT analysis
NASA Astrophysics Data System (ADS)
Gallego-Ayala, Jordi; Juízo, Dinis
The Integrated Water Resources Management (IWRM) paradigm has become an important framework in the development and management of water resources. Many countries in the Southern Africa region have begun water sector reforms to align the sector with IWRM concepts. In 2007 the Mozambican Government started to update the policy and legal framework of the water sector to foster the application of the IWRM concept as a basis for achieving sustainable development. However, the steps towards the implementation of this national framework are still in preparation. This research aims to identify and establish a priority ranking of the fundamental factors likely to affect the outcome of the IWRM reforms in Mozambique. This study uses the hybrid multi-criteria decision method A'WOT, a methodology coined by Kurttila et al. (2000). This method combines the Strengths, Weaknesses, Opportunities, and Threats (SWOT) technique and the Analytic Hierarchy Process (AHP) technique. Using this procedure it is possible to identify and rank the factors affecting the functioning of a system. The key factors affecting the implementation of IWRM analysed in this study were identified through an expert group discussion and grouped into the different SWOT categories. Subsequently, the AHP methodology was applied to obtain the relative importance of each factor captured in the SWOT analysis; to this end the authors interviewed a panel of water resources management experts and practitioners. As a result of this study and the application of the A'WOT methodology, the research identified and ranked the fundamental factors for the success of the IWRM strategy in Mozambique. The results suggest that a planning strategy for the implementation of IWRM in Mozambique should be guided mainly by a combination of interventions on factors falling under the opportunity and weakness SWOT groups.
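The AHP step of A'WOT can be sketched directly: factor priorities are the normalized principal eigenvector of a reciprocal pairwise-comparison matrix, and the principal eigenvalue yields a consistency index. The three-factor matrix below is invented for illustration.

```python
import numpy as np

# Pairwise comparisons of three hypothetical SWOT factors (Saaty 1-9 scale).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # principal eigenvalue index
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                # normalized priority vector

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)        # consistency index (0 = perfectly consistent)
print("priorities:", np.round(w, 3), "consistency index:", round(ci, 4))
```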
Harbison, K; Kelly, J; Burnell, L; Silva, J
1995-01-01
The Scenario-based Engineering Process (SEP) is a user-focused methodology for large and complex system design. This process supports new application development from requirements analysis with domain models to component selection, design and modification, implementation, integration, and archival placement. It is built upon object-oriented methodologies, domain modeling strategies, and scenario-based techniques to provide an analysis process for mapping application requirements to available components. We are using SEP in the health care applications that we are developing. The process has already achieved success in the manufacturing and military domains and is being adopted by many organizations. SEP should prove viable in any domain containing scenarios that can be decomposed into tasks.
Data analysis environment (DASH2000) for the Subaru telescope
NASA Astrophysics Data System (ADS)
Mizumoto, Yoshihiko; Yagi, Masafumi; Chikada, Yoshihiro; Ogasawara, Ryusuke; Kosugi, George; Takata, Tadafumi; Yoshida, Michitoshi; Ishihara, Yasuhide; Yanaka, Hiroshi; Yamamoto, Tadahiro; Morita, Yasuhiro; Nakamoto, Hiroyuki
2000-06-01
A new data analysis framework (DASH) has been developed for the SUBARU Telescope. It is designed using object-oriented methodology and adopts a restaurant model. DASH shares the CPU and I/O load among distributed heterogeneous computers. The distributed object environment of the system is implemented with Java and CORBA. DASH has been evaluated through several prototypes. DASH2000 is the latest version, which will be released as the beta version of the data analysis system for the SUBARU Telescope.
von Groote, Per Maximilian; Giustini, Alessandro; Bickenbach, Jerome Edmond
2014-01-01
A long-standing scientific discourse on the use of health research evidence to inform policy has come to produce multiple implementation theories, frameworks, models, and strategies. It is from this extensive body of research that the authors extract and present essential components of an implementation process in the health domain, gaining valuable guidance on how to successfully meet the challenges of implementation. Furthermore, this article describes how implementation content can be analyzed and reorganized, with a special focus on implementation at different policy, systems and services, and individual levels using existing frameworks and tools. In doing so, the authors aim to contribute to the establishment and testing of an implementation framework for reports such as the World Health Organization World Report on Disability, the World Health Organization International Perspectives on Spinal Cord Injury, and other health policy reports or technical health guidelines.
Martinez, Elizabeth A; Chavez-Valdez, Raul; Holt, Natalie F; Grogan, Kelly L; Khalifeh, Katherine W; Slater, Tammy; Winner, Laura E; Moyer, Jennifer; Lehmann, Christoph U
2011-01-01
Although the evidence strongly supports perioperative glycemic control among cardiac surgical patients, there is scant literature to describe the practical application of such a protocol in the complex ICU environment. This paper describes the use of the Lean Six Sigma methodology to implement a perioperative insulin protocol in a cardiac surgical intensive care unit (CSICU) in a large academic hospital. A preintervention chart audit revealed that fewer than 10% of patients were admitted to the CSICU with glucose <200 mg/dL, prompting the initiation of the quality improvement project. Following protocol implementation, more than 90% of patients were admitted with a glucose <200 mg/dL. Key elements to success include barrier analysis and intervention, provider education, and broadening the project scope to address the intraoperative period.
A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis
Guardia, Gabriela D. A.; Pires, Luís Ferreira; Vêncio, Ricardo Z. N.; Malmegrim, Kelen C. R.; de Farias, Cléver R. G.
2015-01-01
Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNASeq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis. PMID:26207740
Optimization of coupled multiphysics methodology for safety analysis of pebble bed modular reactor
NASA Astrophysics Data System (ADS)
Mkhabela, Peter Tshepo
The research conducted within the framework of this PhD thesis is devoted to the high-fidelity multi-physics (based on neutronics/thermal-hydraulics coupling) analysis of the Pebble Bed Modular Reactor (PBMR), which is a High Temperature Reactor (HTR). The Next Generation Nuclear Plant (NGNP) will be a HTR design. The core design and safety analysis methods are considerably less developed and mature for HTR analysis than those currently used for Light Water Reactors (LWRs). Compared to LWRs, HTR transient analysis is more demanding since it requires proper treatment of both slower and much longer transients (of time scale in hours and days) and fast and short transients (of time scale in minutes and seconds). There is limited operational and experimental data available for HTRs for validation of coupled multi-physics methodologies. This PhD work developed and verified reliable high-fidelity coupled multi-physics models subsequently implemented in robust, efficient, and accurate computational tools to analyse the neutronics and thermal-hydraulic behaviour for design optimization and safety evaluation of the PBMR concept. The study provided a contribution to greater accuracy of neutronics calculations by including the feedback from the thermal-hydraulics-driven temperature calculation and the various multi-physics effects that can influence it. The feedback due to the influence of leakage was taken into account by the development and implementation of improved buckling feedback models. Modifications were made in the calculation procedure to ensure that the xenon depletion models were accurate for proper interpolation from cross-section tables. To achieve this, the NEM/THERMIX coupled code system was developed to create a system that is efficient and stable over the duration of transient calculations that last several tens of hours. Another achievement of the PhD thesis was the development and demonstration of a full-physics, three-dimensional safety analysis methodology for the PBMR to provide reference solutions. Different aspects of the coupled methodology were investigated and an efficient kinetics treatment for the PBMR was developed, which accounts for all feedback phenomena in an efficient manner. The OECD/NEA PBMR-400 coupled code benchmark was used as a test matrix for the proposed investigations. The integrated thermal-hydraulics and neutronics (multi-physics) methods were extended to enable modeling of a wider range of transients pertinent to the PBMR. First, the effect of the spatial mapping schemes (spatial coupling) was studied and quantified for different types of transients, which resulted in the implementation of an improved mapping methodology based on user-defined criteria. The second aspect that was studied and optimized is the temporal coupling and meshing schemes between the neutronics and thermal-hydraulics time-step selection algorithms. Convergence of the coupled codes was achieved, supplemented by the application of acceleration methods. Finally, the modeling of all feedback phenomena in PBMRs was investigated and a novel treatment of cross-section dependencies was introduced to improve the representation of cross-section variations. An added benefit was that, in the process of studying and improving the coupled multi-physics methodology, more insight was gained into the physics and dynamics of the PBMR, which will also help to optimize the PBMR design and improve its safety.
One unique contribution of the PhD research is the investigation of the importance of the correct representation of the three-dimensional (3-D) effects in the PBMR analysis. The performed studies demonstrated that explicit 3-D modeling of control rod movement is superior and removes the errors associated with the grey curtain (2-D homogenized) approximation.
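To make the neutronics/thermal-hydraulics coupling concrete, the following is a minimal, hypothetical sketch of the operator-split fixed-point (Picard) iteration that coupled code systems of this kind perform at each state point; the one-zone physics functions and every numerical value are invented placeholders, not the NEM/THERMIX models.

```python
# Hedged sketch of Picard (fixed-point) coupling between a neutronics solve
# and a thermal-hydraulics solve.  All physics and numbers are toy stand-ins.

def solve_neutronics(fuel_temp_k):
    """Toy neutronics solve: power density falls as fuel temperature rises
    (negative Doppler-type feedback), mimicking temperature-dependent
    cross sections."""
    return 5.0e6 * (1.0 - 2.0e-5 * (fuel_temp_k - 900.0))   # W/m^3

def solve_thermal_hydraulics(power_w_m3):
    """Toy thermal solve: steady fuel temperature rises linearly with power."""
    return 500.0 + 8.0e-5 * power_w_m3                      # K

def coupled_steady_state(temp=700.0, tol=1e-6, max_iter=100, relax=0.7):
    """Iterate the two single-physics solves; under-relaxation stabilises
    the fixed-point iteration."""
    for it in range(max_iter):
        power = solve_neutronics(temp)
        temp_next = (1.0 - relax) * temp + relax * solve_thermal_hydraulics(power)
        if abs(temp_next - temp) < tol:                     # converged
            return temp_next, power, it + 1
        temp = temp_next
    raise RuntimeError("coupled iteration did not converge")

temp, power, iters = coupled_steady_state()
print(f"converged in {iters} iterations: T = {temp:.2f} K, q''' = {power:.3e} W/m^3")
```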
A design and implementation methodology for diagnostic systems
NASA Technical Reports Server (NTRS)
Williams, Linda J. F.
1988-01-01
A methodology for the design and implementation of diagnostic systems is presented. Also discussed are the advantages of embedding a diagnostic system in a host system environment. The methodology utilizes an architecture for diagnostic system development that is hierarchical and makes use of object-oriented representation techniques. Additionally, qualitative models are used to describe the host system components and their behavior. The methodology architecture includes a diagnostic engine that utilizes heuristic knowledge to control the sequence of diagnostic reasoning. The methodology provides an integrated approach to the development of diagnostic system requirements that is more rigorous than standard systems engineering techniques. The advantages of using this methodology during various life cycle phases of the host system (e.g., the National Aerospace Plane (NASP)) include: the capability to analyze diagnostic instrumentation requirements during the host system design phase, a ready software architecture for implementation of diagnostics in the host system, and the opportunity to analyze instrumentation for failure coverage in safety-critical host system operations.
Verification of a Constraint Force Equation Methodology for Modeling Multi-Body Stage Separation
NASA Technical Reports Server (NTRS)
Tartabini, Paul V.; Roithmayr, Carlos; Toniolo, Matthew D.; Karlgaard, Christopher; Pamadi, Bandu N.
2008-01-01
This paper discusses the verification of the Constraint Force Equation (CFE) methodology and its implementation in the Program to Optimize Simulated Trajectories II (POST2) for multibody separation problems using three specially designed test cases. The first test case involves two rigid bodies connected by a fixed joint; the second case involves two rigid bodies connected with a universal joint; and the third test case is that of Mach 7 separation of the Hyper-X vehicle. For the first two cases, the POST2/CFE solutions compared well with those obtained using industry standard benchmark codes, namely AUTOLEV and ADAMS. For the Hyper-X case, the POST2/CFE solutions were in reasonable agreement with the flight test data. The CFE implementation in POST2 facilitates the analysis and simulation of stage separation as an integral part of POST2 for seamless end-to-end simulations of launch vehicle trajectories.
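As a rough illustration of the constraint-force idea underlying a CFE-type formulation (not the actual POST2/CFE implementation), the sketch below solves for the Lagrange multiplier and the joint force tying two bodies together along a single axis; the masses and applied forces are invented.

```python
import numpy as np

# For a holonomic constraint phi(q) = 0 with Jacobian J, the multiplier
# lambda satisfies  (J M^-1 J^T) lambda = -(J M^-1 F + dJ/dt qdot),
# and the constraint force on the bodies is J^T lambda.
# Example: two bodies on one axis, rigidly joined (phi = x2 - x1 - L).

m1, m2 = 1000.0, 250.0             # body masses (kg), illustrative values
M_inv = np.diag([1.0 / m1, 1.0 / m2])
J = np.array([[-1.0, 1.0]])        # Jacobian of phi = x2 - x1 - L
Jdot_qdot = np.zeros(1)            # constraint is linear, so dJ/dt qdot = 0
F = np.array([2000.0, -500.0])     # external forces on the two bodies (N)

# Solve for the multiplier and recover the joint constraint forces.
A = J @ M_inv @ J.T
b = -(J @ M_inv @ F + Jdot_qdot)
lam = np.linalg.solve(A, b)
F_constraint = J.T @ lam

print("lambda =", lam)             # internal joint load
print("constraint forces =", F_constraint)

# Sanity check: with constraint forces added, both bodies share one acceleration.
acc = M_inv @ (F + F_constraint)
assert np.allclose(acc[0], acc[1])
```

Codes such as ADAMS solve the same kind of linear system inside full 3-D multibody dynamics; the sketch keeps a single scalar constraint so the algebra stays visible.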
Soldatini, Cecilia; Albores-Barajas, Yuri Vladimir; Lovato, Tomas; Andreon, Adriano; Torricelli, Patrizia; Montemaggiori, Alessandro; Corsa, Cosimo; Georgalas, Vyron
2011-01-01
The presence of wildlife in airport areas poses substantial hazards to aviation. Wildlife aircraft collisions (hereafter wildlife strikes) cause losses in terms of human lives and direct monetary losses for the aviation industry. In recent years, wildlife strikes have increased in parallel with air traffic increase and species habituation to anthropic areas. In this paper, we used an ecological approach to wildlife strike risk assessment to eight Italian international airports. The main achievement is a site-specific analysis that avoids flattening wildlife strike events on a large scale while maintaining comparable airport risk assessments. This second version of the Birdstrike Risk Index (BRI2) is a sensitive tool that provides different time scale results allowing appropriate management planning. The methodology applied has been developed in accordance with the Italian Civil Aviation Authority, which recognizes it as a national standard implemented in the advisory circular ENAC APT-01B. PMID:22194950
Uncertainty Analysis Framework - Hanford Site-Wide Groundwater Flow and Transport Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cole, Charles R.; Bergeron, Marcel P.; Murray, Christopher J.
2001-11-09
Pacific Northwest National Laboratory (PNNL) embarked on a new initiative to strengthen the technical defensibility of the predictions being made with a site-wide groundwater flow and transport model at the U.S. Department of Energy Hanford Site in southeastern Washington State. In FY 2000, the focus of the initiative was on the characterization of major uncertainties in the current conceptual model that would affect model predictions. The long-term goals of the initiative are the development and implementation of an uncertainty estimation methodology in future assessments and analyses using the site-wide model. This report focuses on the development and implementation of an uncertainty analysis framework.
NASA Technical Reports Server (NTRS)
Haley, D. C.; Almand, B. J.; Thomas, M. M.; Krauze, L. D.; Gremban, K. D.; Sanborn, J. C.; Kelly, J. H.; Depkovich, T. M.
1984-01-01
A generic computer simulation for manipulator systems (ROBSIM) was implemented and the specific technologies necessary to increase the role of automation in various missions were developed. The specific items developed are: (1) capability for definition of a manipulator system consisting of multiple arms, load objects, and an environment; (2) capability for kinematic analysis, requirements analysis, and response simulation of manipulator motion; (3) postprocessing options such as graphic replay of simulated motion and manipulator parameter plotting; (4) investigation and simulation of various control methods including manual, force/torque, and active compliance control; (5) evaluation and implementation of three obstacle avoidance methods; (6) video simulation and edge detection; and (7) software simulation validation.
The ethics of placebo-controlled trials: methodological justifications.
Millum, Joseph; Grady, Christine
2013-11-01
The use of placebo controls in clinical trials remains controversial. Ethical analysis and international ethical guidance permit the use of placebo controls in randomized trials when scientifically indicated in four cases: (1) when there is no proven effective treatment for the condition under study; (2) when withholding treatment poses negligible risks to participants; (3) when there are compelling methodological reasons for using placebo, and withholding treatment does not pose a risk of serious harm to participants; and, more controversially, (4) when there are compelling methodological reasons for using placebo, and the research is intended to develop interventions that can be implemented in the population from which trial participants are drawn, and the trial does not require participants to forgo treatment they would otherwise receive. The concept of methodological reasons is essential to assessing the ethics of placebo controls in the latter two, controversial cases. This article sets out key considerations for assessing whether methodological reasons for a placebo control are compelling. © 2013.
NASA Technical Reports Server (NTRS)
Baker, T. C. (Principal Investigator)
1982-01-01
A general methodology is presented for estimating a stratum's at-harvest crop acreage proportion for a given crop year (the target year) from the crop's estimated acreage proportion for sample segments within the stratum. Sample segments from crop years other than the target year are (usually) required for use in conjunction with those from the target year. In addition, the stratum's (identifiable) crop acreage proportion may be estimated for times other than at-harvest in some situations. A by-product of the procedure is a methodology for estimating the change in the stratum's at-harvest crop acreage proportion from crop year to crop year. An implementation of the proposed procedure as a Statistical Analysis System (SAS) routine using the system's matrix language module, PROC MATRIX, is described and documented. Three examples illustrating use of the methodology and algorithm are provided.
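The report's actual estimator lives in the PROC MATRIX routine and is not reproduced in the abstract; as a purely illustrative stand-in for the general idea of borrowing strength from other crop years, here is a hypothetical difference estimator in Python. The function name, the segment data, and the prior-year stratum value are all invented.

```python
import numpy as np

def at_harvest_proportion(p_target, p_prior, P_prior_stratum):
    """Estimate the stratum's at-harvest crop proportion for the target year.

    p_target        : crop proportions of sample segments, target year
    p_prior         : proportions of the same segments in a prior crop year
    P_prior_stratum : known (or well-estimated) stratum proportion, prior year
    """
    # Adjust the target-year sample mean by the prior-year sampling error.
    return np.mean(p_target) + (P_prior_stratum - np.mean(p_prior))

p_target_yr = np.array([0.31, 0.28, 0.35, 0.30])   # illustrative segment data
p_prior_yr = np.array([0.27, 0.24, 0.33, 0.26])
P_prior = 0.29

P_hat = at_harvest_proportion(p_target_yr, p_prior_yr, P_prior)
change_hat = P_hat - P_prior        # year-to-year change, the stated by-product
print(f"estimated target-year proportion: {P_hat:.3f}, change: {change_hat:+.3f}")
```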
Predicting Failure Progression and Failure Loads in Composite Open-Hole Tension Coupons
NASA Technical Reports Server (NTRS)
Arunkumar, Satyanarayana; Przekop, Adam
2010-01-01
Failure types and failure loads in carbon-epoxy [45n/90n/-45n/0n]ms laminate coupons with central circular holes subjected to tensile load are simulated using a progressive failure analysis (PFA) methodology. The progressive failure methodology is implemented using a VUMAT subroutine within the ABAQUS/Explicit nonlinear finite element code. The degradation model adopted in the present PFA methodology uses an instantaneous complete stress reduction (COSTR) approach to simulate damage at a material point when failure occurs. In-plane modeling parameters such as element size and shape are held constant in the finite element models, irrespective of laminate thickness and hole size, to predict failure loads and failure progression. Comparison to published test data indicates that this methodology accurately simulates brittle, pull-out, and delamination failure types. The sensitivity of the failure progression and the failure load to analytical loading rates and solver precision is demonstrated.
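A highly simplified, hypothetical illustration of the COSTR degradation idea follows: when a max-stress check fails at a material point, the associated stiffness is knocked down instantaneously rather than gradually. The real model lives in an ABAQUS/Explicit VUMAT with full 3-D stress states; the properties and criteria below are invented.

```python
import numpy as np

# Illustrative ply properties and allowables (invented, not from the paper)
E1, E2, G12 = 128e9, 10e9, 5e9        # moduli (Pa)
XT, YT, S = 2.0e9, 60e6, 90e6         # fibre / matrix / shear allowables (Pa)

def update_material_point(strain, failed):
    """One COSTR-style update: trial stress, max-stress checks, instantaneous
    knock-down of the stiffness associated with any failed mode."""
    stiff = np.array([E1, E2, G12])
    trial = stiff * np.asarray(strain, dtype=float)   # undegraded trial stress
    if trial[0] > XT:
        failed["fibre"] = True
    if trial[1] > YT:
        failed["matrix"] = True
    if abs(trial[2]) > S:
        failed["shear"] = True
    knock = np.array([1e-6 if failed[m] else 1.0
                      for m in ("fibre", "matrix", "shear")])
    return trial * knock, failed                      # complete stress reduction

flags = {"fibre": False, "matrix": False, "shear": False}
stress, flags = update_material_point((0.004, 0.007, 0.001), flags)
print(np.round(stress / 1e6, 3), flags)   # matrix stress collapses to ~0 MPa
```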
NASA Astrophysics Data System (ADS)
Mateos-Espejel, Enrique
The objective of this thesis is to develop, validate, and apply a unified methodology for the energy efficiency improvement of a Kraft process that addresses globally the interactions of the various process systems that affect its energy performance. An implementation strategy is the final result. An operating Kraft pulping mill situated in Eastern Canada with a production of 700 adt/d of high-grade bleached pulp was the case study. The Pulp and Paper industry is Canada's premier industry. It is characterized by large thermal energy and water consumption. Rising energy costs and more stringent environmental regulations have led the industry to refocus its efforts toward identifying ways to improve energy and water conservation. Energy and water aspects are usually analyzed independently, but in reality they are strongly interconnected. Therefore, there is a need for an integrated methodology, which considers energy and water aspects, as well as the optimal utilization and production of the utilities. The methodology consists of four successive stages. The first stage is the base case definition. The development of a focused, reliable and representative model of an operating process is a prerequisite to the optimization and fine tuning of its energy performance. A four-pronged procedure has been developed: data gathering, master diagram, utilities systems analysis, and simulation. The computer simulation has been focused on the energy and water systems. The second stage corresponds to the benchmarking analysis. The benchmarking of the base case has the objectives of identifying the process inefficiencies and to establish guidelines for the development of effective enhancement measures. The studied process is evaluated by a comparison of its efficiency to the current practice of the industry and by the application of new energy and exergy content indicators. The minimum energy and water requirements of the process are also determined in this step. The third stage is the core of the methodology; it represents the formulation of technically feasible energy enhancing options. Several techniques are applied in an iterative procedure to cast light on their synergies and counter-actions. The objective is to develop a path for improving the process so as to maximize steam savings while minimizing the investment required. The fourth stage is the implementation strategy. As the existing process configuration and operating conditions vary from process to process it is important to develop a strategy for the implementation of energy enhancement programs in the most advantageous way for each case. A three-phase strategy was selected for the specific case study in the context of its management strategic plan: the elimination of fossil fuel, the production of power and the liberation of steam capacity. A post-benchmarking analysis is done to quantify the improvement of the energy efficiency. The performance indicators are computed after all energy enhancing measures have been implemented. The improvement of the process by applying the unified methodology results in substantially more steam savings than by applying individually the typical techniques that it comprises: energy savings of 5.6 GJ/adt (27% of the current requirement), water savings of 32 m3/adt (34% of the current requirement) and an electricity production potential of 44.5MW. 
As a result of applying the unified methodology the process becomes eco-friendly as it does not require fossil fuel for producing steam; its water and steam consumptions are below the Canadian average and it produces large revenues from the production of green electricity.
Evolutionary Computing Methods for Spectral Retrieval
NASA Technical Reports Server (NTRS)
Terrile, Richard; Fink, Wolfgang; Huntsberger, Terrance; Lee, Seugwon; Tisdale, Edwin; VonAllmen, Paul; Tinetti, Geivanna
2009-01-01
A methodology for processing spectral images to retrieve information on underlying physical, chemical, and/or biological phenomena is based on evolutionary and related computational methods implemented in software. In a typical case, the solution (the information that one seeks to retrieve) consists of parameters of a mathematical model that represents one or more of the phenomena of interest. The methodology was developed for the initial purpose of retrieving the desired information from spectral image data acquired by remote-sensing instruments aimed at planets (including the Earth). Examples of information desired in such applications include trace gas concentrations, temperature profiles, surface types, day/night fractions, cloud/aerosol fractions, seasons, and viewing angles. The methodology is also potentially useful for retrieving information on chemical and/or biological hazards in terrestrial settings. In this methodology, one utilizes an iterative process that minimizes a fitness function indicative of the degree of dissimilarity between observed and synthetic spectral and angular data. The evolutionary computing methods that lie at the heart of this process yield a population of solutions (sets of the desired parameters) within an accuracy represented by a fitness-function value specified by the user. The evolutionary computing methods (ECM) used in this methodology are Genetic Algorithms and Simulated Annealing, both of which are well-established optimization techniques and have also been described in previous NASA Tech Briefs articles. These are embedded in a conceptual framework, represented in the architecture of the implementing software, that enables automatic retrieval of spectral and angular data and analysis of the retrieved solutions for uniqueness.
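As a hedged sketch of the retrieval loop described above, the following toy genetic algorithm evolves the parameters of an invented one-line forward model until the synthetic spectrum matches an "observed" one. The forward model, population settings, and parameter bounds are stand-ins, not the article's instrument or atmosphere models.

```python
import numpy as np

rng = np.random.default_rng(0)
wav = np.linspace(1.0, 2.0, 200)                       # wavelength grid (um)

def forward_model(params):
    """Toy synthetic spectrum: a single Gaussian absorption line."""
    depth, centre, width = params
    return 1.0 - depth * np.exp(-0.5 * ((wav - centre) / width) ** 2)

true_params = np.array([0.30, 1.40, 0.05])
observed = forward_model(true_params) + rng.normal(0, 0.005, wav.size)

def fitness(params):
    """Dissimilarity between observed and synthetic spectra (lower is better)."""
    return np.mean((forward_model(params) - observed) ** 2)

# Simple generational GA: tournament selection plus Gaussian mutation.
lo, hi = [0.0, 1.0, 0.01], [1.0, 2.0, 0.2]
pop = rng.uniform(lo, hi, size=(60, 3))
for gen in range(100):
    scores = np.array([fitness(p) for p in pop])
    idx = rng.integers(0, len(pop), (len(pop), 2))      # tournament pairs
    parents = pop[np.where(scores[idx[:, 0]] < scores[idx[:, 1]],
                           idx[:, 0], idx[:, 1])]
    pop = np.clip(parents + rng.normal(0, 0.01, parents.shape), lo, hi)

best = pop[np.argmin([fitness(p) for p in pop])]
print("retrieved parameters:", np.round(best, 3))       # near true_params
```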
Quantifying construction and demolition waste: An analytical review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Zezhou; Yu, Ann T.W., E-mail: bsannyu@polyu.edu.hk; Shen, Liyin
2014-09-15
Highlights: • Prevailing C&D waste quantification methodologies are identified and compared. • One specific methodology cannot fulfill all waste quantification scenarios. • A relevance tree for appropriate quantification methodology selection is proposed. • More attention should be paid to civil and infrastructural works. • Classified information is suggested for making an effective waste management plan. - Abstract: Quantifying construction and demolition (C&D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In the literature, various methods have been employed to quantify C&D waste generation at both the regional and project levels. However, an integrated review that systemically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review is conducted. Fifty-seven papers are retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria - waste generation activity, estimation level and quantification methodology. Six categories of existing C&D waste quantification methodologies are identified, including the site visit method, waste generation rate method, lifetime analysis method, classification system accumulation method, variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies and recommendations of potential future research directions are further suggested.
NASA Astrophysics Data System (ADS)
Vazquez Rascon, Maria de Lourdes
This thesis focuses on the implementation of a participatory and transparent decision-making tool for wind farm projects. The tool is based on an argumentative framework that reflects the value systems of the stakeholders involved in these projects, and it employs two multicriteria methods: multicriteria decision aiding and participatory geographical information systems, making it possible to represent these value systems by criteria and indicators to be evaluated. The stakeholders' value systems allow the inclusion of environmental, economic and socio-cultural aspects of wind energy projects and, thus, a vision of sustainable wind project development. This vision is analyzed using the 16 sustainability principles included in Quebec's Sustainable Development Act. Four specific objectives were set out to ensure a logical progression of the work and the development of a successful tool: designing a methodology to couple MCDA and participatory GIS, testing the developed methodology in a case study, performing a robustness analysis to address strategic issues, and analyzing the strengths, weaknesses, opportunities and threats of the developed methodology. Achieving the first objective produced a decision-making tool called Territorial Intelligence Modeling for Energy Development (the TIMED approach). The TIMED approach is visually represented by a figure expressing the idea of a co-constructed decision, with all stakeholders at the centre of the methodology. TIMED is composed of four modules: multicriteria decision analysis, participatory geographic information systems, active involvement of the stakeholders, and scientific knowledge/local knowledge. The integration of these four modules allows for the analysis of different wind turbine implementation scenarios in order to choose the best one through a participatory and transparent decision-making process that takes into account stakeholders' concerns. The second objective enabled the testing of TIMED in an ex-post case study of a wind farm in operation since 2006. In this test, 11 people participated, representing four categories of stakeholders: the private sector, the public sector, experts, and civil society. This test allowed us to analyze the context in which wind projects are currently developed in Quebec. The concerns of some stakeholders regarding situations not considered in the current context were explored through the third objective, which allowed us to run simulations taking into account strategic-level assumptions; examples of the strategic level are the communication tools used to approach the host community and the type of park ownership. Finally, the fourth objective, a SWOT analysis with the participation of eight experts, allowed us to verify the extent to which the TIMED approach succeeded in constructing four spaces for participatory decision-making: physical, intellectual, emotional and procedural. From this analysis, 116 strengths, 28 weaknesses, 32 constraints and 54 opportunities were identified.
Contributions, applications, limitations and extensions of this research include: providing a participatory decision-making methodology that takes into account socio-cultural, environmental and economic variables; holding reflection sessions on a wind farm in operation; the acquisition of MCDA knowledge by participants involved in testing the proposed methodology; taking into account the physical, intellectual, emotional and procedural spaces needed to articulate a participatory decision; the use of the proposed methodology for renewable energy sources other than wind; the need for an interdisciplinary team to apply the methodology; access to quality data; access to information technologies; the right to public participation; the neutrality of experts; the relationships between experts and non-experts; cultural constraints; the improvement of the designed indicators; the implementation of a Web platform for participatory decision-making; and the writing of a manual on the use of the developed methodology. Keywords: wind farm, multicriteria decision, geographic information systems, TIMED approach, sustainable wind energy project development, renewable energy, social participation, robustness concern, SWOT analysis.
Automated Verification of Specifications with Typestates and Access Permissions
NASA Technical Reports Server (NTRS)
Siminiceanu, Radu I.; Catano, Nestor
2011-01-01
We propose an approach to formally verify Plural specifications based on access permissions and typestates, by model-checking automatically generated abstract state-machines. Our exhaustive approach captures all the possible behaviors of abstract concurrent programs implementing the specification. We describe the formal methodology employed by our technique and provide an example as proof of concept for the state-machine construction rules. The implementation of a fully automated algorithm to generate and verify models, currently underway, provides model checking support for the Plural tool, which currently supports only program verification via data flow analysis (DFA).
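For readers unfamiliar with typestates, the sketch below replays a method trace against a hand-written protocol automaton, which is the kind of property the generated abstract state machines encode; the file-like protocol is an invented example, not Plural's specification language, and a model checker would explore all traces and interleavings rather than a single one.

```python
# A typestate protocol as a transition table: (state, method) -> next state.
# Any (state, method) pair absent from the table is a protocol violation.
TYPESTATES = {
    ("closed", "open"): "open",
    ("open", "read"): "open",
    ("open", "close"): "closed",
}

def check_trace(trace, start="closed"):
    """Replay a method trace on the automaton; return (ok, state_or_error)."""
    state = start
    for method in trace:
        nxt = TYPESTATES.get((state, method))
        if nxt is None:
            return False, f"'{method}' is illegal in state '{state}'"
        state = nxt
    return True, state

print(check_trace(["open", "read", "read", "close"]))  # (True, 'closed')
print(check_trace(["open", "close", "read"]))          # read-after-close error
```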
Integrated assessment of urban drainage system under the framework of uncertainty analysis.
Dong, X; Chen, J; Zeng, S; Zhao, D
2008-01-01
Due to rapid urbanization as well as the presence of a large number of aging urban infrastructures in China, the urban drainage system is facing a dual pressure of construction and renovation nationwide. This leads to the need for an integrated assessment when an urban drainage system is being planned or re-designed. In this paper, an integrated assessment methodology is proposed based upon the approaches of the analytic hierarchy process (AHP), uncertainty analysis, mathematical simulation of the urban drainage system, and fuzzy assessment. To illustrate this methodology, a case study in Shenzhen City of south China has been implemented to evaluate and compare two different urban drainage system renovation plans, i.e., the distributed plan and the centralized plan. By comparing their water quality impacts, ecological impacts, technological feasibility and economic costs, the integrated performance of the distributed plan is found to be both better and more robust. The proposed methodology is also found to be both effective and practical. (c) IWA Publishing 2008.
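As an illustration of one AHP building block used in assessments of this kind, the sketch below derives criterion weights from a pairwise comparison matrix via the principal eigenvector and computes Saaty's consistency ratio; the criteria and judgements are invented, not the paper's.

```python
import numpy as np

# Criteria (illustrative): water quality, ecology, technical feasibility, cost.
# A[i, j] is the judged importance of criterion i relative to criterion j.
A = np.array([
    [1.0, 2.0, 3.0, 2.0],
    [1/2, 1.0, 2.0, 1.0],
    [1/3, 1/2, 1.0, 1/2],
    [1/2, 1.0, 2.0, 1.0],
])

# Priority weights = normalised principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Saaty's consistency ratio (random index RI = 0.90 for n = 4).
n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)
CR = CI / 0.90

print("weights:", np.round(weights, 3))
print("consistency ratio:", round(CR, 3))   # < 0.1 is conventionally acceptable
```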
Recent advances in CE-MS coupling: Instrumentation, methodology, and applications.
Týčová, Anna; Ledvina, Vojtěch; Klepárník, Karel
2017-01-01
This review focuses on the latest development of microseparation electromigration methods in capillaries and microfluidic devices coupled with MS for detection and identification of important analytes. It is a continuation of the review article on the same topic by Kleparnik (Electrophoresis 2015, 36, 159-178). A wide selection of 161 relevant articles covers the literature published from June 2014 till May 2016. New improvements in the instrumentation and methodology of MS interfaced with capillary or microfluidic versions of zone electrophoresis, isotachophoresis, and isoelectric focusing are described in detail. The most frequently implemented MS ionization methods include electrospray ionization, matrix-assisted desorption/ionization and inductively coupled plasma ionization. Although the main attention is paid to the development of instrumentation and methodology, representative examples illustrate also applications in the proteomics, glycomics, metabolomics, biomarker research, forensics, pharmacology, food analysis, and single-cell analysis. The combinations of MS with capillary versions of electrochromatography, and micellar electrokinetic chromatography are not included. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Perception of Young Adults on Online Games: Implications for Higher Education
ERIC Educational Resources Information Center
Chen, Liwen; Chen, Tung-Liang; Liu, Hsu-Kuan Jonathan
2010-01-01
The purpose of this study is to identify and categorize the perceptions of young adults before we allocate the resources to design, develop, and implement digital game-based learning in higher education institutions in Taiwan. Q-methodology was conducted for this study because it is a quantitative analysis of subjective data. Thirty young adults…
ERIC Educational Resources Information Center
Jain, Suresh; Pant, Pallavi
2010-01-01
Purpose: The purpose of this paper is to put forth a model for implementation of an environmental management system (EMS) in institutes of higher education in India. Design/methodology/approach: The authors carried out initial environmental review (IER) and strengths, weaknesses, opportunities and threats (SWOT) analysis to identify the major…
ERIC Educational Resources Information Center
Navarro, Manuel
2014-01-01
This paper presents a model of how children generate concrete concepts from perception through processes of differentiation and integration. The model informs the design of a novel methodology ("evolutionary maps" or "emaps"), whose implementation on certain domains unfolds the web of itineraries that children may follow in the…
The Rescue911 Emergency Response Information System (ERIS): A Systems Development Project Case
ERIC Educational Resources Information Center
Cohen, Jason F.; Thiel, Franz H.
2010-01-01
This teaching case presents a systems development project useful for courses in object-oriented analysis and design. The case has a strong focus on the business, methodology, modeling and implementation aspects of systems development. The case is centered on a fictitious ambulance and emergency services company (Rescue911). The case describes that…
ERIC Educational Resources Information Center
Fielke, Simon J.; Botha, Neels; Reid, Janet; Gray, David; Blackett, Paula; Park, Nicola; Williams, Tracy
2018-01-01
Purpose: This paper highlights important lessons for co-innovation drawn from three ex-post case study innovation projects implemented within three sub-sectors of the primary industry sector in New Zealand. Design/methodology/approach: The characteristics that fostered co-innovation in each innovation project case study were identified from…
Web-Based OPACs in Indian Academic Libraries: A Functional Comparison
ERIC Educational Resources Information Center
Kapoor, Kanta; Goyal, O. P.
2007-01-01
Purpose: The paper seeks to provide a comparative analysis of the functionality of five web-based OPACs available in Indian academic libraries. Design/methodology/approach: Same-topic searches were carried out by three researchers on the web-based OPACs of Libsys, VTLS's iPortal, NewGenLib, Troodon, and Alice for Windows, implemented in five…
ERIC Educational Resources Information Center
Usta, Mehmet Emin
2018-01-01
In this study, the 1926 official legislation document was analyzed, taking into account the roles and duties of inspectors at that time. These roles and duties were explained in terms of the authorization, investigation, and interrogation methodology, and the employment of inspectors. This study was carried out by implementing documentary research methods. Like other…
Courtwright, Suzanne E; Mastro, Kari A; Preuster, Christa; Dardashti, Navid; McGill, Sandra; Madelon, Myrlene; Johnson, Donna
2017-10-01
This review focuses on identifying (1) evidence of the effectiveness of care bundle methodology to reduce hospital-acquired pressure ulcers (HAPUs) in pediatric and neonatal patients receiving extracorporeal membrane oxygenation (ECMO) therapy and (2) barriers to implementing HAPU care bundles in this at-risk population. An integrative review was conducted and reported following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. A search of the scientific literature was performed. Studies included were published between January 2011 and February 2016. A total of seven articles met inclusion criteria. Data were extracted from each published article and analyzed to identify common themes, specifically bundle methodology and barriers to implementing HAPU bundles, in this population. There is limited research on effectiveness of care bundle methodology in reducing HAPUs in children, and no research specific to its effectiveness in pediatric or neonatal ECMO patients. No research was identified studying barriers to implementation of HAPU care bundles in this population. Nurses are well poised to test innovative strategies to prevent HAPUs. Nurses should consider implementing and testing bundle methodology to reduce HAPU in this at-risk population, and conduct research to identify any barriers to implementing this strategy. There is literature to support the use of nurses as unit-based skin care champions to facilitate teamwork and reliable use of the bundle, both critical components to the success of bundle methodology. © 2017 Wiley Periodicals, Inc.
A methodology for Manufacturing Execution Systems (MES) implementation
NASA Astrophysics Data System (ADS)
Govindaraju, Rajesri; Putra, Krisna
2016-02-01
A manufacturing execution system (MES) is an information systems (IS) application that bridges the gap between IS at the top level, namely enterprise resource planning (ERP), and IS at the lower levels, namely the automation systems. MES provides a medium for optimizing the manufacturing process as a whole on a real-time basis. Through the use of MES in combination with the implementation of ERP and other automation systems, a manufacturing company is expected to achieve high competitiveness. In implementing MES, functional integration - making all the components of the manufacturing system work well together - is the most difficult challenge. For this, an industry standard exists that specifies the sub-systems of a manufacturing execution system and defines the boundaries between ERP systems, MES, and other automation systems. The standard is known as ISA-95. Although the advantages of using MES have been stated in some studies, not much research has been done on how to implement MES effectively. The purpose of this study is to develop a methodology describing how an MES implementation project should be managed, utilising the support of the ISA-95 reference model in the system development process. A proposed methodology was developed based on a general IS development methodology, and then revisited in light of the specific characteristics of MES implementation projects found in an implementation case at an Indonesian steel manufacturing company. The case study highlighted the importance of applying an effective requirements elicitation method during the initial system assessment, managing system interfaces and labor division in the design process, and performing a pilot deployment before putting the whole system into operation.
Santos, Melissa Costa; Tesser, Charles Dalcanale
2012-11-01
The rendering of integrated and complementary practices in the Brazilian Unified Health System is fostered to increase the comprehensiveness of care and access to it, though incorporating these practices into the services is a challenge. Our objective is to provide a simple method for the implementation of such practices in Primary Healthcare, derived from the analysis of experiences in municipalities, using partial results of a master's thesis that employed action-research methodology. The method involves four stages: 1 - definition of a nucleus responsible for implementation and consolidation; 2 - situational analysis, with identification of the existing competent professionals; 3 - regulation, organization of access and legitimation; and 4 - implementation cycle: local plans, mentoring and ongoing education in health. The phases are described, justified and briefly discussed. The method encourages the development of rational and sustainable actions, fosters participatory management, the creation of comprehensiveness and the broadening of care provided in Primary Healthcare by offering progressive and sustainable integrated and complementary practices.
Multi-criteria analysis for PM10 planning
NASA Astrophysics Data System (ADS)
Pisoni, Enrico; Carnevale, Claudio; Volta, Marialuisa
To implement sound air quality policies, Regulatory Agencies require tools to evaluate the outcomes and costs associated with different emission reduction strategies. These tools are even more useful for atmospheric PM10 concentrations, due to the complex nonlinear processes that affect the production and accumulation of the secondary fraction of this pollutant. The approaches presented in the literature (Integrated Assessment Modeling) are mainly cost-benefit and cost-effectiveness analyses. In this work, the formulation of a multi-objective problem to control particulate matter is proposed. The methodology defines: (a) the control objectives (the air quality indicator and the emission reduction cost functions); (b) the decision variables (precursor emission reductions); and (c) the problem constraints (maximum feasible technology reductions). The cause-effect relations between air quality indicators and decision variables are identified by tuning nonlinear source-receptor models. The solution of the multi-objective problem provides the decision maker with a set of non-dominated scenarios representing the efficient trade-off between the air quality benefit and the internal costs (emission reduction technology costs). The methodology has been implemented for Northern Italy, which is often affected by high long-term exposure to PM10. The source-receptor models used in the multi-objective analysis are identified by processing long-term simulations of the GAMES multiphase modeling system, performed in the framework of the CAFE-Citydelta project.
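A minimal sketch of the bi-objective trade-off follows: the decision variables are precursor emission reductions, one objective is a toy nonlinear source-receptor estimate of PM10, the other an invented convex abatement-cost curve, and the non-dominated (Pareto) scenarios are extracted by a sweep. None of the coefficients come from the GAMES-identified models.

```python
import numpy as np

grid = np.linspace(0.0, 0.8, 81)                 # feasible reduction fractions
nox, voc = np.meshgrid(grid, grid)               # two precursor decision variables

# Toy nonlinear source-receptor model for mean PM10 (ug/m^3)
pm10 = 45.0 - 18.0 * nox - 10.0 * voc + 9.0 * nox * voc
# Convex abatement cost (Meuro/yr): marginal cost grows with reduction
cost = 120.0 * nox**2 + 80.0 * voc**2

pts = np.column_stack([pm10.ravel(), cost.ravel()])

# Non-dominated filter: sweep by increasing PM10, keep strictly cheaper points.
order = np.argsort(pts[:, 0])
pareto, best_cost = [], np.inf
for i in order:
    if pts[i, 1] < best_cost:
        pareto.append(pts[i])
        best_cost = pts[i, 1]
pareto = np.array(pareto)

print(f"{len(pareto)} efficient scenarios, e.g.:")
print(np.round(pareto[:3], 1))                   # (PM10, cost) pairs
```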
Concerns related to Safety Management of Engineered Nanomaterials in research environment
NASA Astrophysics Data System (ADS)
Groso, A.; Meyer, Th
2013-04-01
Since the rise of occupational safety and health research on nanomaterials, a lot of progress has been made in generating health-effects and exposure data. However, when detailed quantitative risk analysis is in question, more research is needed, especially quantitative measures of workers' exposure and standards to categorize toxicity/hazardousness data. In the absence of dose-response relationships and quantitative exposure measurements, control banding (CB) has been widely adopted by the OHS community as a pragmatic tool for implementing a risk management strategy based on a precautionary approach. Being in charge of health and safety in a Swiss university where nanomaterials are widely used and produced, we are also faced with the challenges related to nanomaterials' occupational safety. In this work, we discuss the field application of an in-house risk management methodology similar to CB, as well as some other methodologies. The challenges and issues related to the process will be discussed. Since exact data on nanomaterials' hazardousness are missing for most situations, we deduce that the outcome of the analysis for a particular process is essentially the same with a simple methodology that determines only exposure potential as with one that also takes into account the hazardousness of ENPs. It is evident that when reliable data on hazardousness factors (such as surface chemistry, solubility, carcinogenicity, toxicity, etc.) become available, more differentiation will be possible in determining the risk for different materials. On the protective measures side, all CB methodologies lean toward overprotection; some of them suggest comprehensive protective/preventive measures, while others remain at the level of basic advice. The implementation and control of protective measures in a research environment will also be discussed.
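As a hedged sketch of the control-banding idea discussed above, the example below combines qualitative hazard and exposure bands into a control level via a lookup matrix; the bands, levels and matrix are generic illustrations, not the in-house methodology described in the work.

```python
# Generic control-banding lookup: hazard band x exposure band -> control level.
HAZARD_BANDS = ["low", "medium", "high"]     # e.g. derived from toxicity data
EXPOSURE_BANDS = ["low", "medium", "high"]   # e.g. from quantity and dustiness

# Control levels (illustrative): 1 = general ventilation,
# 2 = engineering controls, 3 = containment, 4 = seek specialist advice
CONTROL_MATRIX = [
    [1, 1, 2],   # low hazard
    [1, 2, 3],   # medium hazard
    [2, 3, 4],   # high hazard
]

def control_band(hazard: str, exposure: str) -> int:
    """Return the recommended control level for a nanomaterial handling task."""
    i = HAZARD_BANDS.index(hazard)
    j = EXPOSURE_BANDS.index(exposure)
    return CONTROL_MATRIX[i][j]

# Precautionary approach: with unknown hazard data, assume the "high" band.
print(control_band("high", "medium"))   # -> 3: containment recommended
```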
Zareski, Rubin; Kapedanovska Nestorovska, A; Grozdanova, A; Dimitrova, B; Suturkova, L J; Sterjev, Z
2016-09-01
The introduction of a new methodology for the pricing of drugs by the Agency of Medicines of the Republic of Macedonia for the period 2012 to 2015 resulted in price reductions for 1386 drugs. This pioneering study evaluated the effects of the price changes during this 4-year period and the consequent effects on the sales quantities for the segmented Anatomical Therapeutic Chemical groups. The drugs were grouped by the size of the reductions, by generic names, and by the Anatomical Therapeutic Chemical classification, in which the quantities are grouped by generic names and the prices are calculated as average values over a period of 1 year. Analysis of the relations between price changes and quantities sold showed that since the introduction of the new methodology the decrease in prices has pushed down the sales of the drugs. This article presents not only the market developments but also projects the tendencies, concluding clearly that focusing only on the price reduction of drugs, and not on the implementation of pharmacoeconomic studies, distorts the supply of drugs on the market and affects their quality. The trends indicate that patients are using older-generation drugs and packaging forms that do not fully meet market demand, under policies that significantly affect the suppliers. The presented analysis confirms that if the new methodology is only partially implemented, without full consideration of pharmacoeconomic studies, negative consequences will also have an impact on regional pharmaceutical markets that benchmark drug prices against the Macedonian market. Copyright © 2016. Published by Elsevier Inc.
The SAMCO Web-platform for resilience assessment in mountainous valleys impacted by landslide risks.
NASA Astrophysics Data System (ADS)
Grandjean, Gilles; Thomas, Loic; Bernardie, Severine
2016-04-01
The ANR-SAMCO project aims to develop a proactive resilience framework enhancing the overall resilience of societies to the impacts of mountain risks. The project elaborates methodological tools to characterize and measure ecosystem and societal resilience from an operative perspective on three representative mountain case studies. To achieve this objective, the methodology is split into several parts: (1) the definition of the potential impacts of global environmental changes (climate system, ecosystem, e.g. land use, socio-economic system) on landslide hazards; (2) the analysis of these consequences in terms of vulnerability (e.g. changes in the location and characteristics of the impacted areas and the level of their perturbation); and (3) the implementation of a methodology for quantitatively investigating and mapping indicators of mountain slope vulnerability exposed to several hazard types, and the development of a GIS-based demonstration platform available on the web. The strength and originality of the SAMCO project lie in the combination of different techniques, methodologies and models (multi-hazard assessment, risk evolution in time, vulnerability functional analysis, and governance strategies) implemented in a user-oriented web platform, currently under development. We present the first results of this development task, the architecture and functions of the web tools, and the case-study database showing the multi-hazard maps and the stakes at risk. Risk assessments over several areas of interest in Alpine and Pyrenean valleys are still in progress, but the first analyses are presented for current and future periods, for which climate change and land-use (economic, geographical and social) scenarios are taken into account. This tool, dedicated to stakeholders, should ultimately be used to evaluate the resilience of mountainous regions, since multiple scenarios can be tested and compared.
Churilov, Leonid; Liu, Daniel; Ma, Henry; Christensen, Soren; Nagakane, Yoshinari; Campbell, Bruce; Parsons, Mark W; Levi, Christopher R; Davis, Stephen M; Donnan, Geoffrey A
2013-04-01
The appropriateness of a software platform for rapid MRI assessment of the amount of salvageable brain tissue after stroke is critical both for the validity of the Extending the Time for Thrombolysis in Emergency Neurological Deficits (EXTEND) clinical trial of stroke thrombolysis beyond 4.5 hours and for stroke patient care outcomes. The objective of this research is to develop and implement a methodology for selecting the acute stroke imaging software platform most appropriate for the setting of a multi-centre clinical trial. A multi-disciplinary decision-making panel formulated the set of preferentially independent evaluation attributes. Alternative Multi-Attribute Value Measurement methods were used to identify the best imaging software platform, followed by sensitivity analysis to ensure the validity and robustness of the proposed solution. Four alternative imaging software platforms were identified. RApid processing of PerfusIon and Diffusion (RAPID) software was selected as the most appropriate for the needs of the EXTEND trial. A theoretically grounded, generic multi-attribute selection methodology for imaging software was developed and implemented. The developed methodology assured both a high-quality decision outcome and a rational and transparent decision process. This development contributes to the stroke literature in the area of comprehensive evaluation of MRI clinical software. At the time of evaluation, RAPID software presented the most appropriate imaging software platform for use in the EXTEND clinical trial. The proposed multi-attribute imaging software evaluation methodology is based on sound theoretical foundations of multiple-criteria decision analysis and can be successfully used for choosing the most appropriate imaging software while ensuring both robust decision processes and outcomes. © 2012 The Authors. International Journal of Stroke © 2012 World Stroke Organization.
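A hedged sketch of a weighted-additive value aggregation with a simple weight-sensitivity sweep is shown below; the platforms, attribute scores and weights are invented for illustration and are not the EXTEND panel's actual attributes or data.

```python
import numpy as np

platforms = ["A", "B", "C", "D"]
# Rows: platforms; columns: attributes already rescaled to 0..1 value scores
# (hypothetically: processing speed, output quality, workflow fit, support).
V = np.array([
    [0.9, 0.8, 0.9, 0.7],
    [0.7, 0.9, 0.6, 0.8],
    [0.8, 0.6, 0.7, 0.9],
    [0.5, 0.7, 0.8, 0.6],
])
w = np.array([0.35, 0.30, 0.20, 0.15])   # attribute weights, sum to 1

scores = V @ w                            # weighted-additive value per platform
best = platforms[int(np.argmax(scores))]
print(dict(zip(platforms, np.round(scores, 3))), "-> best:", best)

# Sensitivity: perturb each weight by +/-20% (renormalised); does the winner hold?
stable = True
for j in range(len(w)):
    for f in (0.8, 1.2):
        w2 = w.copy()
        w2[j] *= f
        w2 /= w2.sum()
        if platforms[int(np.argmax(V @ w2))] != best:
            stable = False
print("ranking robust to +/-20% weight changes:", stable)
```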
Reference values of elements in human hair: a systematic review.
Mikulewicz, Marcin; Chojnacka, Katarzyna; Gedrange, Thomas; Górecki, Henryk
2013-11-01
Background: There has been no systematic review of reference values of elements in human hair that takes the methodological approach into consideration. The absence of worldwide accepted and implemented universal reference ranges means that hair mineral analysis has not yet become a reliable and useful method of assessing the nutritional status and exposure of individuals. Objective: A systematic review of reference values of elements in human hair. Data sources: PubMed, ISI Web of Knowledge, Scopus. Search terms: humans, hair mineral analysis, elements or minerals, reference values, original studies. Results: The number of studies screened and assessed for eligibility was 52; eventually, 5 papers were included in the review. The studies report reference ranges for the content of elements in hair: macroelements, microelements, toxic elements and other elements. Reference ranges were elaborated for different populations in the years 2000-2012. The analytical methodology differed, in particular in sample preparation, digestion and analysis (ICP-AES, ICP-MS). Consequently, the levels of hair minerals reported as reference values varied. Conclusions: It is necessary to elaborate standard procedures, further validate hair mineral analysis and deliver a detailed methodology. Only then will it be possible to provide meaningful reference ranges and take advantage of the potential that lies in Hair Mineral Analysis as a medical diagnostic technique. Copyright © 2013 Elsevier B.V. All rights reserved.
Capabilities, methodologies, and use of the cambio file-translation application.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lasche, George P.
2007-03-01
This report describes the capabilities, methodologies, and uses of the Cambio computer application, designed to automatically read and display nuclear spectral data files of any known format in the world and to convert spectral data to one of several commonly used analysis formats. To further assist responders, Cambio incorporates an analysis method based on non-linear fitting techniques found in the open literature and implemented in openly published source code in the late 1980s. A brief description is provided of how Cambio works, of what basic formats it can currently read, and how it can be used. Cambio was developed at Sandia National Laboratories and is provided as a free service to assist nuclear emergency response analysts anywhere in the world in the fight against nuclear terrorism.
1987-01-01
GRASP is the first program implementing a new methodology for the dynamic analysis of structures, parts of which may undergo large discrete motions. A natural coordinatization of components forms the basis for this methodology, which incorporates body flexibility together with the large discrete motions. Constraints, such as requiring that two nodes behave identically, are in GRASP entirely invisible from the user's point of view.
Developing and Implementing the Data Mining Algorithms in RAVEN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sen, Ramazan Sonat; Maljovec, Daniel Patrick; Alfonsi, Andrea
The RAVEN code is becoming a comprehensive tool to perform probabilistic risk assessment, uncertainty quantification, and verification and validation. The RAVEN code is being developed to support many programs and to provide a set of methodologies and algorithms for advanced analysis. Scientific computer codes can generate enormous amounts of data; to post-process and analyze such data might, in some cases, take longer than the initial software runtime. Data mining algorithms/methods help in recognizing and understanding patterns in the data, and thus discover knowledge in databases. The methodologies used in dynamic probabilistic risk assessment or in uncertainty and error quantification analysis couple system/physics codes with simulation controller codes, such as RAVEN. RAVEN introduces both deterministic and stochastic elements into the simulation while the system/physics codes model the dynamics deterministically. A typical analysis is performed by sampling values of a set of parameters. A major challenge in using dynamic probabilistic risk assessment or uncertainty and error quantification analysis for a complex system is to analyze the large number of scenarios generated. Data mining techniques are typically used to better organize and understand the data, i.e. to recognize patterns in it. This report focuses on the development and implementation of Application Programming Interfaces (APIs) for different data mining algorithms, and the application of these algorithms to different databases.
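As a hedged sketch of the kind of scenario post-processing such APIs expose, the example below clusters synthetic transient outcomes with scikit-learn's KMeans so that a few representative groups can be inspected instead of thousands of runs; the outcome features, distributions, and cluster count are invented, and RAVEN's own interfaces differ.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Synthetic stand-ins for per-scenario outputs of a system/physics code:
# peak temperature (K) and time to peak (s), drawn from three regimes.
n = 3000
peak_temp = np.concatenate([rng.normal(900, 25, n // 3),
                            rng.normal(1050, 40, n // 3),
                            rng.normal(1250, 30, n // 3)])
time_to_peak = np.concatenate([rng.normal(120, 15, n // 3),
                               rng.normal(300, 40, n // 3),
                               rng.normal(80, 10, n // 3)])
X = np.column_stack([peak_temp, time_to_peak])

# Standardise features so both contribute comparably to the distance metric.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(Xs)
for c in range(3):
    members = X[km.labels_ == c]
    print(f"cluster {c}: {len(members):4d} runs, "
          f"mean peak T = {members[:, 0].mean():7.1f} K, "
          f"mean time = {members[:, 1].mean():6.1f} s")
```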
DOE Office of Scientific and Technical Information (OSTI.GOV)
Panayotov, Dobromir; Poitevin, Yves; Grief, Andrew
'Fusion for Energy' (F4E) is designing, developing, and implementing the European Helium-Cooled Lead-Lithium (HCLL) and Helium-Cooled Pebble-Bed (HCPB) Test Blanket Systems (TBSs) for ITER (Nuclear Facility INB-174). Safety demonstration is an essential element for the integration of these TBSs into ITER and accident analysis is one of its critical components. A systematic approach to accident analysis has been developed under the F4E contract on TBS safety analyses. F4E technical requirements, together with Amec Foster Wheeler and INL efforts, have resulted in a comprehensive methodology for fusion breeding blanket accident analysis that addresses the specificity of the breeding blanket designs, materials, and phenomena while remaining consistent with the approach already applied to ITER accident analyses. Furthermore, the methodology phases are illustrated in the paper by its application to the EU HCLL TBS using both MELCOR and RELAP5 codes.
Islam, Nadia Shilpi; Khan, Suhaila; Kwon, Simona; Jang, Deeana; Ro, Marguerite; Trinh-Shevrin, Chau
2011-01-01
There are close to 15 million Asian Americans living in the United States, and they are among the fastest growing populations in the country. By the year 2050, there will be an estimated 33.4 million Asian Americans living in the country. However, their health needs remain poorly understood and there is a critical lack of data disaggregated by Asian American ethnic subgroup, primary language, and geography. This paper examines methodological issues, challenges, and potential solutions to addressing the collection, analysis, and reporting of disaggregated (or granular) data on Asian Americans. The article explores emerging efforts to increase granular data through the use of innovative study design and analysis techniques. Concerted efforts to implement these techniques will be critical to the future development of sound research, health programs, and policy efforts targeting this and other minority populations. PMID:21099084
Items Supporting the Hanford Internal Dosimetry Program Implementation of the IMBA Computer Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carbaugh, Eugene H.; Bihl, Donald E.
2008-01-07
The Hanford Internal Dosimetry Program has adopted the computer code IMBA (Integrated Modules for Bioassay Analysis) as its primary code for bioassay data evaluation and dose assessment using the methodologies of ICRP Publications 60, 66, 67, 68, and 78. The adoption of this code was part of the implementation plan for the June 8, 2007 amendments to 10 CFR 835. This information release includes action items unique to IMBA that were required by PNNL quality assurance standards for implementation of safety software. Copies of the IMBA software verification test plan and the outline of the briefing given to new users are also included.
Di Guardo, Andrea; Finizio, Antonio
2018-01-01
In recent decades, several monitoring programs have been established as a result of EU Directives addressing the quality of water resources (drinking water, groundwater and surface water). Plant Protection Products (PPPs) are an obvious target of monitoring activities, since they are directly released into the environment. One of the challenges in managing the risk of pesticides at the territorial scale is identifying the locations in water bodies needing implementation of risk mitigation measures. Here, the national pesticide monitoring plans could be very helpful. However, monitoring of pesticides is a challenging task because of the high number of registered pesticides, the cost of analyses, and the periodicity of sampling related to pesticide application and use. Extensive high-quality data-sets are consequently often missing. More generally, the information that can be obtained from monitoring studies is frequently undervalued by risk managers. In this study, we propose a new methodology providing indications about the need to implement mitigation measures in stretches of surface water bodies on a territory by combining historical series of monitoring data and GIS. The methodology is articulated in two distinct phases: a) acquisition of monitoring data and setting-up of informative layers of georeferenced data (phase 1) and b) statistical and expert analysis for the identification of areas where implementation of limitation or mitigation measures is suggested (phase 2). Our methodology identifies potentially vulnerable water bodies, considering temporal contamination trends and relative risk levels at selected monitoring stations. A case study is presented considering glyphosate monitoring data in the Lombardy Region (Northern Italy) for the 2008-2014 period. Copyright © 2017 Elsevier B.V. All rights reserved.
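A minimal sketch of the phase-2 screening idea, flagging stations whose exceedance rate and temporal trend both point to a need for mitigation; the column names and the threshold are illustrative assumptions, not values from the paper:

```python
# Minimal sketch of the statistical screening step (phase 2), assuming a table
# of georeferenced monitoring records; names and the 0.1 ug/L threshold are
# illustrative placeholders.
import pandas as pd

records = pd.DataFrame({
    "station": ["A", "A", "A", "B", "B", "B"],
    "year":    [2008, 2011, 2014, 2008, 2011, 2014],
    "glyphosate_ug_L": [0.05, 0.12, 0.30, 0.02, 0.02, 0.01],
})

THRESHOLD = 0.1  # illustrative regulatory limit

def screen(group: pd.DataFrame) -> pd.Series:
    exceed_rate = (group["glyphosate_ug_L"] > THRESHOLD).mean()
    trend = group.sort_values("year")["glyphosate_ug_L"].diff().mean()
    return pd.Series({"exceed_rate": exceed_rate,
                      "rising": trend > 0,
                      "needs_mitigation": exceed_rate > 0.5 and trend > 0})

print(records.groupby("station").apply(screen))
```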
Software Testing and Verification in Climate Model Development
NASA Technical Reports Server (NTRS)
Clune, Thomas L.; Rood, RIchard B.
2011-01-01
Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to complex multi-disciplinary systems. Computer infrastructure over that period has gone from punch-card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in terms of the types of defects that can be detected, isolated and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance for systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
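A minimal example of the fine-grained testing style discussed, using pytest to check one numerical kernel against reference values with tolerances; the Tetens saturation-pressure approximation here is a common textbook formula standing in for a model subroutine:

```python
# Sketch of a fine-grained "unit" test: checking a single numerical kernel
# against analytic reference values instead of a full climate run.
import math
import pytest

def saturation_vapor_pressure(t_celsius: float) -> float:
    """Tetens approximation, in hPa."""
    return 6.1078 * math.exp(17.27 * t_celsius / (t_celsius + 237.3))

@pytest.mark.parametrize("t, expected", [(0.0, 6.1078), (20.0, 23.4)])
def test_saturation_vapor_pressure(t, expected):
    # tolerances keep the test robust to compiler/platform differences
    assert saturation_vapor_pressure(t) == pytest.approx(expected, rel=1e-2)
```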
Integrating interface slicing into software engineering processes
NASA Technical Reports Server (NTRS)
Beck, Jon
1993-01-01
Interface slicing is a tool which was developed to facilitate software engineering. As previously presented, it was described in terms of its techniques and mechanisms. The integration of interface slicing into specific software engineering activities is considered here by discussing a number of potential applications of interface slicing. The applications discussed specifically address the problems, issues, or concerns raised in a previous project. Because a complete interface slicer is still under development, these applications must be phrased in the future tense. Nonetheless, the interface slicing techniques which were presented can be implemented using current compiler and static analysis technology. Whether implemented as a standalone tool or as a module in an integrated development or reverse engineering environment, they require analysis no more complex than that required for current system development environments. By contrast, conventional slicing is a methodology which, while showing much promise and intuitive appeal, has yet to be fully implemented in a production language environment despite 12 years of development.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garrett, Richard L.; Niemi, Belinda J.; Paik, Ingle K.
2013-11-07
A Comparative Evaluation was conducted for the One System Integrated Project Team by an Expert Review Team to compare the safety bases for the Hanford Waste Treatment and Immobilization Plant Project (WTP) and Tank Operations Contract (TOC) (i.e., Tank Farms). The evaluation had an overarching purpose to facilitate effective integration between the WTP and TOC safety bases. It was to provide One System management with an objective evaluation of identified differences in safety basis process requirements, guidance, direction, procedures, and products (including safety controls, key safety basis inputs and assumptions, and consequence calculation methodologies) between WTP and TOC. The evaluation identified 25 recommendations (Opportunities for Integration). The resolution of these recommendations resulted in 16 implementation plans. The completion of these implementation plans will help ensure consistent safety bases for WTP and TOC along with consistent safety basis processes, procedures, and analyses, and should increase the likelihood of a successful startup of the WTP. This early integration will result in long-term cost savings and significant operational improvements. In addition, the implementation plans lead to the development of eight new safety analysis methodologies that can be used at other U.S. Department of Energy (US DOE) complex sites where URS Corporation is involved.
Lewis, Cara C; Stanick, Cameo F; Martinez, Ruben G; Weiner, Bryan J; Kim, Mimi; Barwick, Melanie; Comtois, Katherine A
2015-01-08
Identification of psychometrically strong instruments for the field of implementation science is a high priority underscored in a recent National Institutes of Health working meeting (October 2013). Existing instrument reviews are limited in scope, methods, and findings. The Society for Implementation Research Collaboration Instrument Review Project's objectives address these limitations by identifying and applying a unique methodology to conduct a systematic and comprehensive review of quantitative instruments assessing constructs delineated in two of the field's most widely used frameworks, adopt a systematic search process (using standard search strings), and engage an international team of experts to assess the full range of psychometric criteria (reliability, construct and criterion validity). Although this work focuses on implementation of psychosocial interventions in mental health and health-care settings, the methodology and results will likely be useful across a broad spectrum of settings. This effort has culminated in a centralized online open-access repository of instruments depicting graphical head-to-head comparisons of their psychometric properties. This article describes the methodology and preliminary outcomes. The seven stages of the review, synthesis, and evaluation methodology include (1) setting the scope for the review, (2) identifying frameworks to organize and complete the review, (3) generating a search protocol for the literature review of constructs, (4) literature review of specific instruments, (5) development of an evidence-based assessment rating criteria, (6) data extraction and rating instrument quality by a task force of implementation experts to inform knowledge synthesis, and (7) the creation of a website repository. To date, this multi-faceted and collaborative search and synthesis methodology has identified over 420 instruments related to 34 constructs (total 48 including subconstructs) that are relevant to implementation science. Despite numerous constructs having greater than 20 available instruments, which implies saturation, preliminary results suggest that few instruments stem from gold standard development procedures. We anticipate identifying few high-quality, psychometrically sound instruments once our evidence-based assessment rating criteria have been applied. The results of this methodology may enhance the rigor of implementation science evaluations by systematically facilitating access to psychometrically validated instruments and identifying where further instrument development is needed.
Roberts, Renée J; Wilson, Ashley E; Quezado, Zenaide
2017-03-01
Six Sigma and Lean methodologies are effective quality improvement tools in many health care settings. We applied the DMAIC methodology (define, measure, analyze, improve, control) to address deficiencies in our pediatric anesthesia supply chain. We defined supply chain problems by mapping existing processes and soliciting comments from those involved. We used daily distance walked by anesthesia technicians and number of callouts for missing supplies as measurements that we analyzed before and after implementing improvements (anesthesia cart redesign). We showed improvement in the metrics after those interventions were implemented, and those improvements were sustained and thus controlled 1 year after implementation.
1988-03-01
Report contents include industry analysis and findings; equipment/machinery alternatives; MIS requirements and improvements; and a cost-benefit analysis methodology, including an expenditure schedule and cash flows. Benefits associated with the implementation of ITM include testing, streamlined work flow, and the installation of ergonomically designed work cells/work centers.
Dadich, Ann; Doloswala, Navin
2018-05-10
Despite the relative abundance of frameworks and models to guide implementation science, the explicit use of theory is limited. Bringing together two seemingly disparate fields of research, this article asks, what can organisational theory offer implementation science? This is examined by applying a theoretical lens that incorporates agency, institutional, and situated change theories to understand the implementation of healthcare knowledge into practice. Interviews were conducted with 20 general practitioners (GPs) before and after using a resource to facilitate evidence-based sexual healthcare. Research material was analysed using two approaches - researcher-driven thematic coding and lexical analysis, which was relatively less researcher-driven. The theoretical lens elucidated the complex pathways of knowledge translation. More specifically, agency theory revealed tensions between the GP as agent and their organisations and patients as principals. Institutional theory highlighted the importance of GP-embeddedness within their chosen specialty of general practice; their medical profession; and the practice in which they worked. Situated change theory exposed the role of localised adaptations over time - a metamorphosis. This study has theoretical, methodological, and practical implications. Theoretically, it is the first to examine knowledge translation using a lens premised on agency, institutional, and situated change theories. Methodologically, the study highlights the complementary value of researcher-driven and researcher-guided analysis of qualitative research material. Practically, this study signposts opportunities to facilitate knowledge translation - more specifically, it suggests that efforts to shape clinician practices should accommodate the interrelated influence of the agent and the institution, and recognise that change can be ever so subtle.
A methodology for building culture and gender norms into intervention: An example from Mumbai, India
Kostick, Kristin M; Schensul, Stephen L; Singh, Rajendra; Pelto, Pertti; Saggurti, Niranjan
2011-05-01
This paper responds to the call for culturally-relevant intervention research by introducing a methodology for identifying community norms and resources in order to more effectively implement sustainable intervention strategies. Results of an analysis of community norms, specifically attitudes toward gender equity, are presented from an HIV/STI research and intervention project in a low-income community in Mumbai, India (2008-2012). Community gender norms were explored because of their relevance to sexual risk in settings characterized by high levels of gender inequity. This paper recommends approaches that interventionists and social scientists can take to incorporate cultural insights into formative assessments and project implementation. These approaches include how to (1) examine modal beliefs and norms and any patterned variation within the community; (2) identify and assess variation in cultural beliefs and norms among community members (including leaders, social workers, members of civil society and the religious sector); and (3) identify differential needs among sectors of the community and key types of individuals best suited to help formulate and disseminate culturally-relevant intervention messages. Using a multi-method approach that includes the progressive translation of qualitative interviews into a quantitative survey of cultural norms, along with an analysis of community consensus, we outline a means for measuring variation in cultural expectations and beliefs about gender relations in an urban community in Mumbai. Results illustrate how intervention strategies and implementation can benefit from an organic (versus a priori and/or stereotypical) approach to cultural characteristics and analysis of community resources and vulnerabilities. Copyright © 2011 Elsevier Ltd. All rights reserved.
Systematic Analysis Of Ocean Colour Uncertainties
NASA Astrophysics Data System (ADS)
Lavender, Samantha
2013-12-01
This paper reviews current research into the estimation of uncertainties as a pixel-based measure to aid non-specialist users of remote sensing products. An example MERIS image, captured on 28 March 2012, was processed with an above-water atmospheric correction code. This was initially based on both the Antoine & Morel standard atmospheric correction, with its Bright Pixel correction component, and the Doerffer Neural Network coastal waters approach. It is shown that analysis of the atmospheric by-products yields important information about the separation of the atmospheric and in-water signals, helping to signpost possible uncertainties in the atmospheric correction results. Further analysis has concentrated on implementing a 'simplistic' atmospheric correction so that the impact of changing the input auxiliary data can be analysed; the influence of changing surface pressure is demonstrated. Future work will focus on automating the analysis, so that the methodology can be implemented within an operational system.
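The surface-pressure sensitivity mentioned above can be illustrated with the standard Hansen and Travis (1974) fit for Rayleigh optical thickness, which scales linearly with surface pressure; this sketch is purely illustrative and is not the MERIS processor code:

```python
# Sketch of the surface-pressure sensitivity check: Rayleigh optical thickness
# scales linearly with surface pressure, so an error in the auxiliary pressure
# field propagates directly into the atmospheric correction.
def rayleigh_optical_thickness(wavelength_um: float, pressure_hpa: float) -> float:
    # Hansen & Travis (1974) wavelength fit, scaled by the pressure ratio
    p_ratio = pressure_hpa / 1013.25
    lam = wavelength_um
    return 0.008569 * lam**-4 * (1 + 0.0113 * lam**-2 + 0.00013 * lam**-4) * p_ratio

for p in (990.0, 1013.25, 1030.0):
    tau = rayleigh_optical_thickness(0.443, p)  # 443 nm, a typical MERIS band
    print(f"P = {p:7.2f} hPa -> tau_R(443 nm) = {tau:.4f}")
```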
Improvement of laboratory turnaround time using lean methodology.
Gupta, Shradha; Kapil, Sahil; Sharma, Monica
2018-05-14
Purpose The purpose of this paper is to discuss the implementation of lean methodology to reduce the turnaround time (TAT) of a clinical laboratory in a super speciality hospital. Delays in report delivery lead to delayed diagnosis, increased waiting time and decreased customer satisfaction. The reduction in TAT will lead to increased patient satisfaction, quality of care, employee satisfaction and, ultimately, the hospital's revenue. Design/methodology/approach The generic causes resulting in increased TAT of clinical laboratories were identified using lean tools and techniques such as value stream mapping (VSM), Gemba, Pareto analysis and root cause analysis. VSM was used as a tool to analyze the current state of the process, and a future-state VSM was then designed with suggestions for process improvements. Findings This study identified 12 major non-value-added factors for the hematology laboratory and 5 major non-value-added factors for the biochemistry laboratory which were acting as bottlenecks limiting throughput. A four-month research study by the authors together with the hospital quality department and laboratory staff members led to a reduction of the average TAT from 180 to 95 minutes in the hematology laboratory and from 268 to 208 minutes in the biochemistry laboratory. Practical implications Very few improvement initiatives in Indian healthcare are based on industrial engineering tools and techniques, which might be due to a lack of interaction between healthcare and engineering. The study provides a positive outcome in terms of improving the efficiency of services in hospitals and identifies a scope for lean in the Indian healthcare sector. Social implications Applying lean in the Indian healthcare sector offers a potential solution to the wide gap between lean accessibility and lean implementation. Lean helped in changing the mindset of an organization toward providing the highest quality of services with faster delivery at an optimal cost. Originality/value This paper is an effort to reduce the gap between healthcare and industrial engineering and to enhance the use of lean practices in Indian healthcare. The study is motivated toward implementing lean methodology successfully in services.
[Clinical practice guidelines in Peru: evaluation of its quality using the AGREE II instrument].
Canelo-Aybar, Carlos; Balbin, Graciela; Perez-Gomez, Ángela; Florez, Iván D
2016-01-01
To evaluate the methodological quality of clinical practice guidelines (CPGs) put into practice by the Peruvian Ministry of Health (MINSA), 17 CPGs from the ministry, published between 2009 and 2014, were independently evaluated by three methodological experts using the AGREE II instrument. The scores of the AGREE II domains were low and very low in all CPGs: scope and purpose (median 44%), clarity of presentation (median 47%), participation of decision-makers (median 8%), methodological rigor (median 5%), applicability (median 5%), and editorial independence (median 8%). In conclusion, the methodological quality of the CPGs implemented by the MINSA is low; consequently, their use cannot be recommended. The implementation of the methodology for the development of CPGs described in the recently published CPG methodological preparation manual in Peru is a pressing need.
ERIC Educational Resources Information Center
Watkins, Arthur Noel
The purpose of this study was to identify and describe the decision-making processes in senior high schools that were implementing programs of individualized schooling. Field methodology, including interviews, observations, and analysis of documents, was used to gather data in six senior high schools of varying size located throughout the country,…
1992-07-01
Topics covered include software methodologies; software performance analysis; software testing; concurrent languages; and efforts in algorithms. These codes provide a powerful research tool for testing new concepts and designs prior to experimental implementation. DoE's laser program has also contributed development work and specially designed production facilities. World leadership in both non-fluorinated and fluorinated materials resides in the U.S., but Japan…
Validation of vision-based range estimation algorithms using helicopter flight data
NASA Technical Reports Server (NTRS)
Smith, Phillip N.
1993-01-01
The objective of this research was to demonstrate the effectiveness of an optic flow method for passive range estimation using a Kalman-filter implementation with helicopter flight data. This paper is divided into the following areas: (1) ranging algorithm; (2) flight experiment; (3) analysis methodology; (4) results; and (5) concluding remarks. The discussion is presented in viewgraph format.
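A minimal sketch, not the flight implementation, of the underlying idea: with a known vehicle speed, the optic-flow rate of a feature is approximately speed divided by range, so a scalar extended Kalman filter can track range; all parameter values below are assumptions:

```python
# Toy extended Kalman filter for range-from-optic-flow: the measurement model
# is h(x) = V / x, where x is range and V is known forward speed. Every value
# here (speed, noise levels, initial guesses) is an illustrative assumption.
import numpy as np

V, dt = 30.0, 0.1          # vehicle speed (m/s) and sample period (s)
R_true = 300.0             # true initial range to the feature (m)
x, P = 200.0, 100.0**2     # initial range estimate and its variance
Q, Rn = 1.0, 0.02**2       # process and measurement noise (tuning assumptions)

rng = np.random.default_rng(0)
for _ in range(50):
    R_true -= V * dt                         # feature gets closer
    z = V / R_true + rng.normal(0.0, 0.02)   # noisy optic-flow measurement (rad/s)

    x, P = x - V * dt, P + Q                 # predict: range shrinks by V*dt
    H = -V / x**2                            # Jacobian of h(x) = V / x
    S = H * P * H + Rn
    K = P * H / S                            # scalar Kalman gain
    x, P = x + K * (z - V / x), (1.0 - K * H) * P

print(f"estimated range: {x:.1f} m (true {R_true:.1f} m)")
```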
Contamination Control for Thermal Engineers
NASA Technical Reports Server (NTRS)
Rivera, Rachel B.
2015-01-01
The presentation will be given at the 26th Annual Thermal Fluids Analysis Workshop (TFAWS 2015) hosted by the Goddard Space Flight Center (GSFC) Thermal Engineering Branch (Code 545). This course will cover the basics of Contamination Control, including contamination-control-related failures, the effects of contamination on Flight Hardware, what contamination requirements translate to, design methodology, and implementing contamination control into Integration, Testing and Launch.
ERIC Educational Resources Information Center
Singh, Oma B.
2009-01-01
This study used a design-based research (DBR) methodology to examine how an Instructional Systems Design (ISD) process such as ADDIE (Analysis, Design, Development, Implementation, Evaluation) can be employed to develop a web-based module to teach metacognitive learning strategies to students in higher education. The goal of the study was…
ERIC Educational Resources Information Center
Exarchou, Evi; Klonari, Aikaterini; Lambrinos, Nikos; Vaitis, Michalis
2017-01-01
This study focused on the analysis of Grade-12 (Senior) students' sociocultural constructivist interactions using Web 2.0 applications during a geographical research process. In the context of the study methodology, a transdisciplinary case study (TdCS) with ethnographic and action research data was designed, implemented and analyzed in real teaching…
Analysis of the Effects of the Implementation of Cooperative Learning in Physical Education
ERIC Educational Resources Information Center
Callado, Carlos Velázquez
2012-01-01
Our research aimed to test the effects of a structured program of cooperative learning in Physical Education classes with students in grades 5 and 6 of primary school, with and without previous experience of this methodology. In a second phase, we sought to determine how students perceived the classes they had received some time later. We analysed…
ERIC Educational Resources Information Center
Kutina, Kenneth L.; And Others
A simulation model of an academic medical center that was developed to aid in strategic planning and policy analysis is described. The model, designated MCM for Medical Center Model, was implemented at the School of Medicine, University Hospitals of Cleveland, and the private practices of the faculty in the clinical departments at University…
ERIC Educational Resources Information Center
Egorov, Sergey B.; Kapitanov, Alexey V.; Mitrofanov, Vladimir G.; Shvartsburg, Leonid E.; Ivanova, Natalia A.; Ryabov, Sergey A.
2016-01-01
The aim of the article is to develop a unified assessment methodology applicable to various technological processes and the actual conditions of their implementation. The energy efficiency analysis of the technological processes is carried out through comparison of the established power and the power consumed by the actual technological…
Dorofeev, S B; Babenko, A I
2017-01-01
The article analyses national and international publications concerning methodological aspects of elaborating a systematic approach to a healthy life-style of the population. This scope of inquiry plays a key role in the development of human capital. The costs related to a healthy life-style are to be considered as personal investment into future income due to the physical incrementation of human capital. The definitions of healthy life-style, its categories and supportive factors are to be considered in the process of developing strategies and programs of healthy life-style. The implementation of particular strategies entails the application of comprehensive information and educational programs meant for various categories of the population; therefore, different motivation techniques are to be considered for children, adolescents, the able-bodied population, and the elderly. This approach should result in establishing particular responsibilities for the national government, territorial administrations, health care administrations, employers and the population itself. The necessity of complex legislative measures is emphasized. Recent social hygienic studies were focused mostly on particular aspects of the development of a healthy life-style of the population. Hence, there is a demand for long-term exploration of the development of organizational and functional models implementing medical preventive measures on the basis of comprehensive information analysis using statistical, sociological and professional expertise.
OFF-Stagnation point testing in plasma facility
NASA Astrophysics Data System (ADS)
Viladegut, A.; Chazot, O.
2015-06-01
Reentry space vehicles face extreme conditions of heat flux when interacting with the atmosphere at hypersonic velocities. The stagnation point heat flux is normally used as a reference for Thermal Protection System (TPS) design; however, many critical phenomena also occur at off-stagnation points. This paper addresses the implementation of an off-stagnation point methodology able to duplicate in a ground facility the hypersonic boundary layer over a flat plate model. A first analysis using two-dimensional (2D) computational fluid dynamics (CFD) simulations is carried out to understand the limitations of this methodology when applying it in a plasma wind tunnel. The results from the testing campaign at the VKI Plasmatron are also presented.
A computer simulator for development of engineering system design methodologies
NASA Technical Reports Server (NTRS)
Padula, S. L.; Sobieszczanski-Sobieski, J.
1987-01-01
A computer program designed to simulate and improve engineering system design methodology is described. The simulator mimics the qualitative behavior and data couplings occurring among the subsystems of a complex engineering system. It eliminates the engineering analyses in the subsystems by replacing them with judiciously chosen analytical functions. With the cost of analysis eliminated, the simulator is used for experimentation with a large variety of candidate algorithms for multilevel design optimization to choose the best ones for the actual application. Thus, the simulator serves as a development tool for multilevel design optimization strategy. The simulator concept, implementation, and status are described and illustrated with examples.
Making the Hubble Space Telescope servicing mission safe
NASA Technical Reports Server (NTRS)
Bahr, N. J.; Depalo, S. V.
1992-01-01
The implementation of the HST system safety program is detailed. Numerous safety analyses are conducted through various phases of design, test, and fabrication, and results are presented to NASA management for discussion during dedicated safety reviews. Attention is given to the system safety assessment and risk analysis methodologies used, i.e., hazard analysis, fault tree analysis, and failure modes and effects analysis, and to how they are coupled with engineering and test analysis for a 'synergistic picture' of the system. Some preliminary safety analysis results, showing the relationship between hazard identification, control or abatement, and finally control verification, are presented as examples of this safety process.
Acceptance testing for PACS: from methodology to design to implementation
NASA Astrophysics Data System (ADS)
Liu, Brent J.; Huang, H. K.
2004-04-01
Acceptance Testing (AT) is a crucial step in the implementation process of a PACS within a clinical environment. AT determines whether the PACS is ready for clinical use and marks the official sign-off of the PACS product. Most PACS vendors have AT plans; however, these plans do not provide a complete and robust evaluation of the full system. In addition, different sites will have special requirements that vendor AT plans do not cover. The purpose of this paper is to introduce a protocol for AT design and present case studies of AT performed on clinical PACS. A methodology is presented that includes identifying testing components within PACS, quality assurance for both functionality and performance, and technical testing focusing on key single points of failure within the PACS product. Tools and resources that provide assistance in performing AT are discussed. In addition, implementation of the AT within the clinical environment and the overall implementation timeline of the PACS process are presented. Finally, case studies of actual AT of clinical PACS performed in the healthcare environment are reviewed. The methodology for designing and implementing a robust AT plan for PACS was documented and has been used in PACS acceptance tests at several sites. This methodology can be applied to any PACS and can be used as a validation for the PACS product being acquired by radiology departments and hospitals. A methodology for AT design and implementation was thus presented that can be applied to future PACS installations. A robust AT plan for a PACS installation can increase both the utilization and satisfaction of a successful implementation of a PACS product that benefits both vendor and customer.
DB4US: A Decision Support System for Laboratory Information Management.
Carmona-Cejudo, José M; Hortas, Maria Luisa; Baena-García, Manuel; Lana-Linati, Jorge; González, Carlos; Redondo, Maximino; Morales-Bueno, Rafael
2012-11-14
Until recently, laboratory automation has focused primarily on improving hardware. Future advances are concentrated on intelligent software, since laboratories performing clinical diagnostic testing require improved information systems to address their data processing needs. In this paper, we propose DB4US, an application that automates the management of information related to laboratory quality indicators. Currently, there is a lack of ready-to-use management quality measures. This application addresses this deficiency through the extraction, consolidation, statistical analysis, and visualization of data related to the use of demographics, reagents, and turn-around times. The design and implementation issues, as well as the technologies used for the implementation of this system, are discussed in this paper. The aim is to develop a general methodology that integrates the computation of ready-to-use management quality measures and a dashboard to easily analyze the overall performance of a laboratory, as well as automatically detect anomalies or errors. The novelty of our approach lies in the application of integrated web-based dashboards as an information management system in hospital laboratories. We propose a new methodology for laboratory information management based on the extraction, consolidation, statistical analysis, and visualization of data related to demographics, reagents, and turn-around times, offering a dashboard-like user web interface to the laboratory manager. The methodology comprises a unified data warehouse that stores and consolidates multidimensional data from different data sources. The methodology is illustrated through the implementation and validation of DB4US, a novel web application based on this methodology that constructs an interface to obtain ready-to-use indicators, and offers the possibility to drill down from high-level metrics to more detailed summaries. The offered indicators are calculated beforehand so that they are ready to use when the user needs them. The design is based on a set of different parallel processes to precalculate indicators. The application displays information related to tests, requests, samples, and turn-around times. The dashboard is designed to show the set of indicators on a single screen. DB4US was deployed for the first time in the Hospital Costa del Sol in 2008. In our evaluation we show the positive impact of this methodology for laboratory professionals, since the use of our application has reduced the time needed for the elaboration of the different statistical indicators and has also provided information that has been used to optimize the usage of laboratory resources by the discovery of anomalies in the indicators. DB4US users benefit from Internet-based communication of results, since this information is available from any computer without having to install any additional software. The proposed methodology and the accompanying web application, DB4US, automate the processing of information related to laboratory quality indicators and offer a novel approach for managing laboratory-related information, benefiting from an Internet-based communication mechanism. The application of this methodology has been shown to improve the usage of time, as well as other laboratory resources.
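As a rough sketch of the kind of precalculated, ready-to-use indicator behind such a dashboard, the following computes turn-around-time summaries per test with pandas (the table and column names are illustrative, not DB4US's actual schema):

```python
# Sketch of a precomputed quality indicator: turn-around time (TAT) summaries
# per test type, calculated ahead of time and then served to a dashboard.
import pandas as pd

samples = pd.DataFrame({
    "test": ["glucose", "glucose", "troponin", "troponin", "troponin"],
    "received":  pd.to_datetime(["2024-05-01 08:00", "2024-05-01 09:10",
                                 "2024-05-01 08:05", "2024-05-01 08:40",
                                 "2024-05-01 10:15"]),
    "validated": pd.to_datetime(["2024-05-01 09:05", "2024-05-01 10:40",
                                 "2024-05-01 08:50", "2024-05-01 09:20",
                                 "2024-05-01 12:30"]),
})

samples["tat_min"] = (samples["validated"] - samples["received"]).dt.total_seconds() / 60
indicators = samples.groupby("test")["tat_min"].agg(
    n="count", median="median", p90=lambda s: s.quantile(0.9))
print(indicators)  # precomputed, then displayed on the single-screen dashboard
```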
A spatially constrained ecological classification: rationale, methodology and implementation
Franz Mora; Louis Iverson; Louis Iverson
2002-01-01
The theory, methodology and implementation for an ecological and spatially constrained classification are presented. Ecological and spatial relationships among several landscape variables are analyzed in order to define a new approach for a landscape classification. Using ecological and geostatistical analyses, several ecological and spatial weights are derived to...
Report #13-P-0430, September 24, 2013. The two Region 8 program offices that jointly implement the Lead Renovation, Repair and Painting Program do not have a methodology or agreement for sharing SEE funding, which has led to confusion.
Best evidence on the educational effects of undergraduate portfolios.
Buckley, Sharon; Coleman, Jamie; Khan, Khalid
2010-09-01
The great variety of portfolio types and schemes used in the education of health professionals is reflected in the extensive and diverse educational literature relating to portfolio use. We have recently completed a Best Evidence Medical Education (BEME) systematic review of the literature relating to the use of portfolios in the undergraduate setting that offers clinical teachers insights into both their effects on learning and issues to consider in portfolio implementation. Using a methodology based on BEME recommendations, we searched the literature relating to a range of health professions, identifying evidence for the effects of portfolios on undergraduate student learning, and assessing the methodological quality of each study. The higher quality studies in our review report that, when implemented appropriately, portfolios can improve students' ability to integrate theory with practice, can encourage their self-awareness and reflection, and can offer support for students facing difficult emotional situations. Portfolios can also enhance student-tutor relationships and prepare students for the rigours of postgraduate training. However, the time required to complete a portfolio may detract from students' clinical learning. An analysis of methodological quality against year of publication suggests that, across a range of health professions, the quality of the literature relating to the educational effects of portfolios is improving. However, further work is still required to build the evidence base for the educational effects of portfolios, particularly comparative studies that assess effects on learning directly. Our findings have implications for the design and implementation of portfolios in the undergraduate setting. © Blackwell Publishing Ltd 2010.
Analysis of central enterprise architecture elements in models of six eHealth projects.
Virkanen, Hannu; Mykkänen, Juha
2014-01-01
Large-scale initiatives for eHealth services have been established in many countries at the regional or national level. The use of Enterprise Architecture has been suggested as a methodology to govern and support the initiation, specification and implementation of large-scale initiatives, including the governance of business changes as well as information technology. This study reports an analysis of six health IT projects in relation to Enterprise Architecture elements, focusing on central EA elements and viewpoints in different projects.
Meijster, Tim; van Duuren-Stuurman, Birgit; Heederik, Dick; Houba, Remko; Koningsveld, Ernst; Warren, Nicholas; Tielemans, Erik
2011-10-01
Use of cost-benefit analysis in occupational health increases insight into the intervention strategy that maximises the cost-benefit ratio. This study presents a methodological framework identifying the most important elements of a cost-benefit analysis for occupational health settings. One of the main aims of the methodology is to evaluate cost-benefit ratios for different stakeholders (employers, employees and society). The developed methodology was applied to two intervention strategies focused on reducing respiratory diseases. A cost-benefit framework was developed and used to set up a calculation spreadsheet containing the inputs and algorithms required to calculate the costs and benefits for all cost elements. Inputs from a large variety of sources were used to calculate total costs, total benefits, net costs and the benefit-to-cost ratio for both intervention scenarios. Implementation of a covenant intervention program resulted in a net benefit of €16 848 546 over 20 years for a population of 10 000 workers. Implementation was cost-effective for all stakeholders. For a health surveillance scenario, total benefits resulting from a decreased disease burden were estimated to be €44 659 352; the costs of the interventions could not be calculated. This study provides important insights for developing effective intervention strategies in the field of occupational medicine. Use of a model-based approach enables investigation of those parameters most likely to impact the effectiveness and costs of interventions for work-related diseases. Our case study highlights the importance of considering different perspectives (of employers, society and employees) in assessing and sharing the costs and benefits of interventions.
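A toy sketch of the per-stakeholder bookkeeping that such a framework formalizes; the figures below are invented, not the study's inputs:

```python
# Per-stakeholder cost-benefit bookkeeping: net benefit and benefit-to-cost
# ratio for each perspective. All figures are illustrative placeholders.
costs = {"employer": 2_000_000, "employee": 150_000, "society": 500_000}
benefits = {"employer": 6_500_000, "employee": 4_000_000, "society": 9_000_000}

for who in costs:
    net = benefits[who] - costs[who]
    bcr = benefits[who] / costs[who]
    print(f"{who:8s} net benefit = EUR {net:>12,}   benefit-to-cost ratio = {bcr:.1f}")
```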
Urquhart, Robin; Porter, Geoffrey A; Sargeant, Joan; Jackson, Lois; Grunfeld, Eva
2014-09-16
The implementation of innovations (i.e., new tools and practices) in healthcare organizations remains a significant challenge. The objective of this study was to examine the key interpersonal, organizational, and system level factors that influenced the implementation and use of synoptic reporting tools in three specific areas of cancer care. Using case study methodology, we studied three cases in Nova Scotia, Canada, wherein synoptic reporting tools were implemented within clinical departments/programs. Synoptic reporting tools capture and present information about a medical or surgical procedure in a structured, checklist-like format and typically report only items critical for understanding the disease and subsequent impacts on patient care. Data were collected through semi-structured interviews with key informants, document analysis, nonparticipant observation, and tool use/examination. Analysis involved production of case histories, in-depth analysis of each case, and a cross-case analysis. Numerous techniques were used during the research design, data collection, and data analysis stages to increase the rigour of this study. The analysis revealed five common factors that were particularly influential to implementation and use of synoptic reporting tools across the three cases: stakeholder involvement, managing the change process (e.g., building demand, communication, training and support), champions and respected colleagues, administrative and managerial support, and innovation attributes (e.g., complexity, compatibility with interests and values). The direction of influence (facilitating or impeding) of each of these factors differed across and within cases. The findings demonstrate the importance of a multi-level contextual analysis to gaining both breadth and depth to our understanding of innovation implementation and use in health care. They also provide new insights into several important issues under-reported in the literature on moving innovations into healthcare practice, including the role of middle managers in implementation efforts and the importance of attending to the interpersonal aspects of implementation.
Moreno-Conde, Alberto; Moner, David; Cruz, Wellington Dimas da; Santos, Marcelo R; Maldonado, José Alberto; Robles, Montserrat; Kalra, Dipak
2015-07-01
This systematic review aims to identify and compare the existing processes and methodologies that have been published in the literature for defining clinical information models (CIMs) that support the semantic interoperability of electronic health record (EHR) systems. Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) methodology, the authors reviewed papers published between 2000 and 2013 that covered the semantic interoperability of EHRs, found by searching the PubMed, IEEE Xplore, and ScienceDirect databases. Additionally, after selection of a final group of articles, an inductive content analysis was done to summarize the steps and methodologies followed in order to build the CIMs described in those articles. Three hundred and seventy-eight articles were screened and thirty-six were selected for full review. The articles selected for full review were analyzed to extract relevant information for the analysis and characterized according to the steps the authors had followed for clinical information modeling. Most of the reviewed papers lack a detailed description of the modeling methodologies used to create CIMs. A representative example is the lack of description related to the definition of terminology bindings and the publication of the generated models. However, this systematic review confirms that most clinical information modeling activities follow very similar steps for the definition of CIMs. Having a robust and shared methodology could improve their correctness, reliability, and quality. Independently of implementation technologies and standards, it is possible to find common patterns in methods for developing CIMs, suggesting the viability of defining a unified good practice methodology to be used by any clinical information modeler. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
GSuite HyperBrowser: integrative analysis of dataset collections across the genome and epigenome.
Simovski, Boris; Vodák, Daniel; Gundersen, Sveinung; Domanska, Diana; Azab, Abdulrahman; Holden, Lars; Holden, Marit; Grytten, Ivar; Rand, Knut; Drabløs, Finn; Johansen, Morten; Mora, Antonio; Lund-Andersen, Christin; Fromm, Bastian; Eskeland, Ragnhild; Gabrielsen, Odd Stokke; Ferkingstad, Egil; Nakken, Sigve; Bengtsen, Mads; Nederbragt, Alexander Johan; Thorarensen, Hildur Sif; Akse, Johannes Andreas; Glad, Ingrid; Hovig, Eivind; Sandve, Geir Kjetil
2017-07-01
Recent large-scale undertakings such as ENCODE and Roadmap Epigenomics have generated experimental data mapped to the human reference genome (as genomic tracks) representing a variety of functional elements across a large number of cell types. Despite the high potential value of these publicly available data for a broad variety of investigations, little attention has been given to the analytical methodology necessary for their widespread utilisation. We here present a first principled treatment of the analysis of collections of genomic tracks. We have developed novel computational and statistical methodology to permit comparative and confirmatory analyses across multiple and disparate data sources. We delineate a set of generic questions that are useful across a broad range of investigations and discuss the implications of choosing different statistical measures and null models. Examples include contrasting analyses across different tissues or diseases. The methodology has been implemented in a comprehensive open-source software system, the GSuite HyperBrowser. To make the functionality accessible to biologists, and to facilitate reproducible analysis, we have also developed a web-based interface providing an expertly guided and customizable way of utilizing the methodology. With this system, many novel biological questions can flexibly be posed and rapidly answered. Through a combination of streamlined data acquisition, interoperable representation of dataset collections, and customizable statistical analysis with guided setup and interpretation, the GSuite HyperBrowser represents a first comprehensive solution for integrative analysis of track collections across the genome and epigenome. The software is available at: https://hyperbrowser.uio.no. © The Author 2017. Published by Oxford University Press.
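One of the generic track questions, whether two tracks overlap more than expected under a null model, can be sketched as a Monte Carlo permutation test; this mimics the style of analysis, not the HyperBrowser's own implementation:

```python
# Conceptual sketch: do two genomic tracks (sets of fixed-length intervals)
# overlap more than a null model that shuffles interval positions predicts?
import numpy as np

rng = np.random.default_rng(1)
GENOME = 1_000_000
LEN = 500  # fixed interval length, for simplicity
track_a = np.sort(rng.integers(0, GENOME - LEN, size=200))  # interval starts
track_b = np.sort(rng.integers(0, GENOME - LEN, size=200))

def overlap_count(a_starts, b_starts, length=LEN):
    hits = 0
    for s in a_starts:
        i = np.searchsorted(b_starts, s - length)  # first B start >= s - length
        hits += bool(i < len(b_starts) and b_starts[i] < s + length)
    return hits

observed = overlap_count(track_a, track_b)
null = [overlap_count(np.sort(rng.integers(0, GENOME - LEN, 200)), track_b)
        for _ in range(1000)]
p = (sum(n >= observed for n in null) + 1) / (len(null) + 1)
print(f"observed overlapping intervals: {observed}, permutation p-value: {p:.3f}")
```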
Full-Envelope Launch Abort System Performance Analysis Methodology
NASA Technical Reports Server (NTRS)
Aubuchon, Vanessa V.
2014-01-01
The implementation of a new dispersion methodology is described, which disperses abort initiation altitude or time along with all other Launch Abort System (LAS) parameters during Monte Carlo simulations. In contrast, the standard methodology assumes that an abort initiation condition is held constant (e.g., aborts initiated at the altitude for Mach 1, the altitude for maximum dynamic pressure, etc.) while dispersing other LAS parameters. The standard method results in large gaps in performance information due to the discrete nature of the initiation conditions, while the full-envelope dispersion method provides a significantly more comprehensive assessment of LAS abort performance for the full launch vehicle ascent flight envelope and identifies performance "pinch-points" that may occur at flight conditions outside of those contained in the discrete set. The new method has significantly increased the fidelity of LAS abort simulations and confidence in the results.
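The difference between the two approaches can be sketched as follows; the performance model here is a made-up stand-in for the LAS simulation, with an artificial "pinch-point", and all distributions are assumptions:

```python
# Sketch of full-envelope dispersion: instead of fixing the abort time at a few
# discrete conditions, draw it uniformly across the ascent window in each Monte
# Carlo sample, so the whole envelope is exercised and pinch-points show up.
import math
import random

def abort_performance(t_abort: float, wind_bias: float) -> float:
    # toy metric with a deliberate "pinch-point" near t = 70 s
    return 1.0 - 0.4 * math.exp(-((t_abort - 70.0) / 8.0) ** 2) + wind_bias

random.seed(0)
results = []
for _ in range(10_000):
    t_abort = random.uniform(0.0, 120.0)   # dispersed across the ascent window
    wind = random.gauss(0.0, 0.05)         # one of the other dispersed parameters
    results.append((t_abort, abort_performance(t_abort, wind)))

worst = min(results, key=lambda r: r[1])
print(f"worst-case metric {worst[1]:.3f} at abort time {worst[0]:.1f} s")
```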
NASA Technical Reports Server (NTRS)
Campbell, B. H.
1974-01-01
A methodology developed for the balanced design of spacecraft subsystems, interrelating cost, performance, safety, and schedule considerations, was refined. The methodology consists of a two-step process: the first step is one of selecting all hardware designs which satisfy the given performance and safety requirements; the second step is one of estimating the cost and schedule required to design, build, and operate each spacecraft design. Using this methodology to develop a systems cost/performance model allows the user of such a model to establish specific designs and the related costs and schedule. The user is able to determine the sensitivity of design, costs, and schedules to changes in requirements. The resulting systems cost/performance model is described and implemented as a digital computer program.
NASA Technical Reports Server (NTRS)
Padovan, J.; Adams, M.; Lam, P.; Fertis, D.; Zeid, I.
1982-01-01
Second-year efforts within a three-year study to develop and extend finite element (FE) methodology to efficiently handle the transient/steady state response of rotor-bearing-stator structures associated with gas turbine engines are outlined. The two main areas aim at (1) implementing the squeeze film damper element into a general purpose FE code for testing and evaluation; and (2) determining the numerical characteristics of the FE-generated rotor-bearing-stator simulation scheme. The governing FE field equations are set out and the solution methodology is presented. The choice of ADINA as the general-purpose FE code is explained, and the numerical operational characteristics of the direct integration approach to FE-generated rotor-bearing-stator simulations are determined, including benchmarking, comparison of explicit vs. implicit direct integration methodologies, and demonstration problems.
pyAudioAnalysis: An Open-Source Python Library for Audio Signal Analysis.
Giannakopoulos, Theodoros
2015-01-01
Audio information plays a rather important role in the increasing digital content that is available today, resulting in a need for methodologies that automatically analyze such content: audio event recognition for home automations and surveillance systems, speech recognition, music information retrieval, multimodal analysis (e.g. audio-visual analysis of online videos for content-based recommendation), etc. This paper presents pyAudioAnalysis, an open-source Python library that provides a wide range of audio analysis procedures including: feature extraction, classification of audio signals, supervised and unsupervised segmentation and content visualization. pyAudioAnalysis is licensed under the Apache License and is available at GitHub (https://github.com/tyiannak/pyAudioAnalysis/). Here we present the theoretical background behind the wide range of the implemented methodologies, along with evaluation metrics for some of the methods. pyAudioAnalysis has been already used in several audio analysis research applications: smart-home functionalities through audio event detection, speech emotion recognition, depression classification based on audio-visual features, music segmentation, multimodal content-based movie recommendation and health applications (e.g. monitoring eating habits). The feedback provided from all these particular audio applications has led to practical enhancement of the library.
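A short usage sketch of the library's short-term feature extraction entry point. The module and function names below follow recent releases of pyAudioAnalysis (older versions exposed audioFeatureExtraction.stFeatureExtraction instead), and "sample.wav" is a placeholder file:

```python
# Extract short-term features from an audio file with pyAudioAnalysis.
# Names follow recent releases of the library; "sample.wav" is a placeholder.
from pyAudioAnalysis import audioBasicIO, ShortTermFeatures

fs, x = audioBasicIO.read_audio_file("sample.wav")  # sampling rate, samples
x = audioBasicIO.stereo_to_mono(x)                  # analysis expects a mono signal
# 50 ms windows advanced in 25 ms steps -> one feature vector per frame
features, feature_names = ShortTermFeatures.feature_extraction(
    x, fs, int(0.050 * fs), int(0.025 * fs))
print(features.shape)     # (num_features, num_frames)
print(feature_names[:3])  # e.g. zero-crossing rate, energy, entropy of energy
```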
Avaliani, S L; Novikov, S M; Shashina, T A; Dodina, N S; Kislitsin, V A; Mishina, A L
2014-01-01
The lack of an adequate legislative and regulatory framework for ensuring minimization of health risks in the field of environmental protection is an obstacle to the application of the risk analysis methodology as a leading tool for administrative activity in Russia. The "Principles of the state policy in the sphere of ensuring chemical and biological safety of the Russian Federation for the period up to 2025 and beyond", approved by the President of the Russian Federation on 01 November 2013, No. Pr-2573, are aimed at legal support for the health risk analysis methodology. The article proposes the main stages of operational control of environmental quality, which lead to the reduction of health risks to an acceptable level. Further improvement of the health risk analysis methodology in Russia should contribute to the implementation of the state policy in the sphere of chemical and biological safety through the introduction of complex measures for the neutralization of chemical and biological threats to human health and the environment, as well as evaluation of the economic effectiveness of these measures. The primary step should be the legislative securing of a quantitative value for the term "acceptable risk".
Methodology for evaluation of railroad technology research projects
DOT National Transportation Integrated Search
1981-04-01
This Project memorandum presents a methodology for evaluating railroad research projects. The methodology includes consideration of industry and societal benefits, with special attention given to technical risks, implementation considerations, and po...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wanderer, Thomas, E-mail: thomas.wanderer@dlr.de; Herle, Stefan, E-mail: stefan.herle@rwth-aachen.de
2015-04-15
Because of their spatially distributed nature, the profitability and impacts of renewable energy resources are highly correlated with the geographic locations of power plant deployments. A web-based Spatial Decision Support System (SDSS) based on a Multi-Criteria Decision Analysis (MCDA) approach has been implemented for identifying preferable locations for solar power plants based on user preferences. The designated areas found serve as input for scenario development in a subsequent integrated Environmental Impact Assessment. The capabilities of the SDSS service are showcased for Concentrated Solar Power (CSP) plants in the region of Andalusia, Spain. The resulting spatial patterns of possible power plant sites are an important input to the procedural chain of assessing impacts of renewable energies in an integrated effort. The applied methodology and the implemented SDSS are applicable to other renewable technologies as well. - Highlights: • The proposed tool facilitates well-founded CSP plant siting decisions. • Spatial MCDA methods are implemented in a WebGIS environment. • GIS-based SDSS can contribute to a modern integrated impact assessment workflow. • The conducted case study proves the suitability of the methodology.
Solution to the indexing problem of frequency domain simulation experiments
NASA Technical Reports Server (NTRS)
Mitra, Mousumi; Park, Stephen K.
1991-01-01
A frequency domain simulation experiment is one in which selected system parameters are oscillated sinusoidally to induce oscillations in one or more system statistics of interest. A spectral (Fourier) analysis of these induced oscillations is then performed. To perform this spectral analysis, all oscillation frequencies must be referenced to a common, independent variable - an oscillation index. In a discrete-event simulation, the global simulation clock is the most natural choice for the oscillation index. However, past efforts to reference all frequencies to the simulation clock generally yielded unsatisfactory results. The reason for these unsatisfactory results is explained in this paper and a new methodology which uses the simulation clock as the oscillation index is presented. Techniques for implementing this new methodology are demonstrated by performing a frequency domain simulation experiment for a network of queues.
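The indexing idea in this abstract — drive a parameter sinusoidally against the simulation clock and look for the induced frequency in a statistic's spectrum — can be sketched with a toy model. The response function and all numbers below are illustrative stand-ins, not the queueing network from the paper.

```python
import numpy as np

# Toy frequency domain simulation experiment: the simulation clock t is the
# common oscillation index for both the driven parameter and the response.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 200.0, 4096)        # simulation-clock sample times
f_osc = 0.05                             # parameter oscillation frequency
service_rate = 1.0 + 0.2 * np.sin(2 * np.pi * f_osc * t)  # oscillated parameter

# Hypothetical statistic that responds to the parameter, plus noise.
response = 1.0 / service_rate + 0.05 * rng.standard_normal(t.size)

# Spectral (Fourier) analysis referenced to the clock-based index.
spectrum = np.abs(np.fft.rfft(response - response.mean()))
freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])
print("peak response frequency:", freqs[spectrum.argmax()])  # close to f_osc
```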
Measurement and analysis of operating system fault tolerance
NASA Technical Reports Server (NTRS)
Lee, I.; Tang, D.; Iyer, R. K.
1992-01-01
This paper demonstrates a methodology to model and evaluate the fault tolerance characteristics of operational software. The methodology is illustrated through case studies on three different operating systems: the Tandem GUARDIAN fault-tolerant system, the VAX/VMS distributed system, and the IBM/MVS system. Measurements are made on these systems for substantial periods to collect software error and recovery data. In addition to investigating basic dependability characteristics such as major software problems and error distributions, we develop two levels of models to describe error and recovery processes inside an operating system and on multiple instances of an operating system running in a distributed environment. Based on the models, reward analysis is conducted to evaluate the loss of service due to software errors and the effect of the fault-tolerance techniques implemented in the systems. Software error correlation in multicomputer systems is also investigated.
Peláez-Coca, M. D.; Orini, M.; Lázaro, J.; Bailón, R.; Gil, E.
2013-01-01
A methodology that combines information from several nonstationary biological signals is presented. This methodology is based on time-frequency coherence, that quantifies the similarity of two signals in the time-frequency domain. A cross time-frequency analysis method, based on quadratic time-frequency distribution, has been used for combining information of several nonstationary biomedical signals. In order to evaluate this methodology, the respiratory rate from the photoplethysmographic (PPG) signal is estimated. The respiration provokes simultaneous changes in the pulse interval, amplitude, and width of the PPG signal. This suggests that the combination of information from these sources will improve the accuracy of the estimation of the respiratory rate. Another target of this paper is to implement an algorithm which provides a robust estimation. Therefore, respiratory rate was estimated only in those intervals where the features extracted from the PPG signals are linearly coupled. In 38 spontaneous breathing subjects, among which 7 were characterized by a respiratory rate lower than 0.15 Hz, this methodology provided accurate estimates, with the median error {0.00; 0.98} mHz ({0.00; 0.31}%) and the interquartile range error {4.88; 6.59} mHz ({1.60; 1.92}%). The estimation error of the presented methodology was largely lower than the estimation error obtained without combining different PPG features related to respiration. PMID:24363777
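The paper's estimator rests on quadratic time-frequency distributions; as a rough stationary stand-in, the same principle — a shared respiratory component shows up as strong coupling between two PPG-derived series — can be seen with scipy's magnitude-squared coherence. The synthetic series below are assumptions for illustration only.

```python
import numpy as np
from scipy.signal import coherence

# Synthetic stand-ins for two PPG-derived series (e.g. pulse amplitude and
# pulse width) sharing a respiratory component at 0.25 Hz.
fs = 4.0                                  # feature series sampled at 4 Hz
t = np.arange(0, 300, 1 / fs)
rng = np.random.default_rng(1)
resp = np.sin(2 * np.pi * 0.25 * t)
x = resp + 0.5 * rng.standard_normal(t.size)
y = 0.8 * resp + 0.5 * rng.standard_normal(t.size)

# Estimate the respiratory rate where the two series are most coherent.
f, Cxy = coherence(x, y, fs=fs, nperseg=256)
print("respiratory rate estimate (Hz):", f[Cxy.argmax()])  # close to 0.25
```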
Meyer, Pablo; Hoeng, Julia; Rice, J. Jeremy; Norel, Raquel; Sprengel, Jörg; Stolle, Katrin; Bonk, Thomas; Corthesy, Stephanie; Royyuru, Ajay; Peitsch, Manuel C.; Stolovitzky, Gustavo
2012-01-01
Motivation: Analyses and algorithmic predictions based on high-throughput data are essential for the success of systems biology in academic and industrial settings. Organizations, such as companies and academic consortia, conduct large multi-year scientific studies that entail the collection and analysis of thousands of individual experiments, often over many physical sites and with internal and outsourced components. To extract maximum value, the interested parties need to verify the accuracy and reproducibility of data and methods before the initiation of such large multi-year studies. However, systematic and well-established verification procedures do not exist for automated collection and analysis workflows in systems biology, which could lead to inaccurate conclusions. Results: We present here a review of the current state of systems biology verification and a detailed methodology to address its shortcomings. This methodology, named 'Industrial Methodology for Process Verification in Research' or IMPROVER, consists of evaluating a research program by dividing a workflow into smaller building blocks that are individually verified. The verification of each building block can be done internally by members of the research program or externally by 'crowd-sourcing' to an interested community (www.sbvimprover.com). Implementation: This methodology could become the preferred choice to verify systems biology research workflows that are becoming increasingly complex and sophisticated in industrial and academic settings. Contact: gustavo@us.ibm.com PMID:22423044
Lombard, Catherine B; Harrison, Cheryce L; Kozica, Samantha L; Zoungas, Sophia; Keating, Catherine; Teede, Helena J
2014-06-16
To impact on the obesity epidemic, interventions that prevent weight gain across populations are urgently needed. However, even the most efficacious interventions will have little impact on obesity prevention unless they are successfully implemented in diverse populations and settings. Implementation research takes isolated efficacy studies into practice and policy and is particularly important in obesity prevention, where there is an urgent need to accelerate the evidence to practice cycle. Despite the recognised need, few obesity prevention interventions have been implemented in real-life settings and, to our knowledge, they rarely target rural communities. Here we describe the rationale, design and implementation of a Healthy Lifestyle Program for women living in small rural communities (HeLP-her Rural). The primary goal of HeLP-her Rural is to prevent weight gain using a low intensity, self-management intervention. Six hundred women from 42 small rural communities in Australia will be randomised as clusters (n = 21 control towns and n = 21 intervention towns). A pragmatic randomised controlled trial methodology will test efficacy, and a comprehensive mixed methods community evaluation and cost analysis will inform effectiveness and implementation of this novel prevention program. Implementing population interventions to prevent obesity is complex, costly and challenging. To address these barriers, evidence based interventions need to move beyond isolated efficacy trials and report outcomes related to effectiveness and implementation. Large pragmatic trials provide an opportunity to inform both effectiveness and implementation, leading to potential for greater impact at the population level. Pragmatic trials should incorporate both effectiveness and implementation outcomes and a multidimensional methodology to inform scale-up to population level. The learnings from this trial will impact on the design and implementation of population obesity prevention strategies nationally and internationally. ANZ clinical trial registry ACTRN12612000115831. Date of registration 24/01/2012.
An experimental methodology for a fuzzy set preference model
NASA Technical Reports Server (NTRS)
Turksen, I. B.; Willson, Ian A.
1992-01-01
A flexible fuzzy set preference model first requires approximate methodologies for implementation. Fuzzy sets must be defined for each individual consumer using computer software, requiring a minimum of time and expertise on the part of the consumer. The amount of information needed in defining sets must also be established. The model itself must adapt fully to the subject's choice of attributes (vague or precise), attribute levels, and importance weights. The resulting individual-level model should be fully adapted to each consumer. The methodologies needed to develop this model will be equally useful in a new generation of intelligent systems which interact with ordinary consumers, controlling electronic devices through fuzzy expert systems or making recommendations based on a variety of inputs. The power of personal computers and their acceptance by consumers has yet to be fully utilized to create interactive knowledge systems that fully adapt their function to the user. Understanding individual consumer preferences is critical to the design of new products and the estimation of demand (market share) for existing products, which in turn is an input to management systems concerned with production and distribution. The question of what to make, for whom to make it and how much to make requires an understanding of the customer's preferences and the trade-offs that exist between alternatives. Conjoint analysis is a widely used methodology which decomposes an overall preference for an object into a combination of preferences for its constituent parts (attributes such as taste and price), which are combined using an appropriate combination function. Preferences are often expressed using linguistic terms which cannot be represented in conjoint models. Current models are also not implemented at an individual level, making it difficult to reach meaningful conclusions about the cause of an individual's behavior from an aggregate model. The combination of complex aggregate models and vague linguistic preferences has greatly limited the usefulness and predictive validity of existing preference models. A fuzzy set preference model that uses linguistic variables and a fully interactive implementation should be able to simultaneously address these issues and substantially improve the accuracy of demand estimates. The parallel implementation of crisp and fuzzy conjoint models using identical data not only validates the fuzzy set model but also provides an opportunity to assess the impact of fuzzy set definitions and individual attribute choices implemented in the interactive methodology developed in this research. The generalized experimental tools needed for conjoint models can also be applied to many other types of intelligent systems.
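To make the fuzzy-conjoint idea concrete, here is a small sketch using triangular fuzzy numbers for linguistic part-worths and a weighted combination function; the attributes, membership triples and weights are invented for illustration and are not from the study.

```python
import numpy as np

# Triangular fuzzy numbers (a, b, c) as linguistic part-worths.
taste_good = np.array([0.6, 0.8, 1.0])   # "good taste"
price_fair = np.array([0.4, 0.5, 0.7])   # "fair price"

weights = {"taste": 0.7, "price": 0.3}   # importance weights, summing to 1

# A weighted sum of triangular fuzzy numbers is again triangular.
overall = weights["taste"] * taste_good + weights["price"] * price_fair

# Defuzzify by the centroid of the triangle to obtain a crisp preference.
crisp = overall.sum() / 3.0
print("fuzzy overall preference:", overall, "crisp score:", round(crisp, 3))
```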
Thompson-Bean, E; Das, R; McDaid, A
2016-10-31
We present a novel methodology for the design and manufacture of complex biologically inspired soft robotic fluidic actuators. The methodology is applied to the design and manufacture of a prosthetic for the hand. Real human hands are scanned to produce a 3D model of a finger, and pneumatic networks are implemented within it to produce a biomimetic bending motion. The finger is then partitioned into material sections, and a genetic algorithm-based optimization, using finite element analysis, is employed to discover the optimal material for each section. This is based on two biomimetic performance criteria. Two sets of optimizations using two material sets are performed. Promising optimized material arrangements are fabricated using two techniques to validate the optimization routine, and the fabricated and simulated results are compared. We find that the optimization is successful in producing biomimetic soft robotic fingers and that fabrication of the fingers is possible. Limitations and paths for development are discussed. This methodology can be applied to other fluidic soft robotic devices.
NASA Technical Reports Server (NTRS)
Hopkins, Dale A.; Patnaik, Surya N.
2000-01-01
A preliminary aircraft engine design methodology is being developed that utilizes a cascade optimization strategy together with neural network and regression approximation methods. The cascade strategy employs different optimization algorithms in a specified sequence. The neural network and regression methods are used to approximate solutions obtained from the NASA Engine Performance Program (NEPP), which implements engine thermodynamic cycle and performance analysis models. The new methodology is proving to be more robust and computationally efficient than the conventional optimization approach of using a single optimization algorithm with direct reanalysis. The methodology has been demonstrated on a preliminary design problem for a novel subsonic turbofan engine concept that incorporates a wave rotor as a cycle-topping device. Computations of maximum thrust were obtained for a specific design point in the engine mission profile. The results (depicted in the figure) show a significant improvement in the maximum thrust obtained using the new methodology in comparison to benchmark solutions obtained using NEPP in a manual design mode.
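The approximation-assisted optimization pattern the abstract describes — sample an expensive analysis, fit a cheap regression surrogate, then search the surrogate — can be sketched generically. The quadratic "thrust" function below is a made-up stand-in for an NEPP evaluation, not the actual engine model.

```python
import numpy as np

# Made-up stand-in for an expensive engine-cycle analysis; in practice this
# would be a call into a thermodynamic cycle code at sampled design points.
def expensive_thrust(bypass_ratio):
    return -(bypass_ratio - 6.5) ** 2 + 100.0

# Step 1: sample the expensive model at a handful of design points.
samples = np.linspace(4.0, 9.0, 7)
thrust = np.array([expensive_thrust(b) for b in samples])

# Step 2: fit a cheap regression approximation (quadratic response surface).
surrogate = np.poly1d(np.polyfit(samples, thrust, deg=2))

# Step 3: optimize over the surrogate instead of the expensive model.
grid = np.linspace(4.0, 9.0, 1001)
best = grid[surrogate(grid).argmax()]
print("surrogate-optimal bypass ratio:", round(best, 3))  # near 6.5
```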
Current-mode subthreshold MOS implementation of the Herault-Jutten autoadaptive network
NASA Astrophysics Data System (ADS)
Cohen, Marc H.; Andreou, Andreas G.
1992-05-01
The translinear circuits in subthreshold MOS technology and current-mode design techniques for the implementation of neuromorphic analog network processing are investigated. The architecture, also known as the Herault-Jutten network, performs an independent component analysis and is essentially a continuous-time recursive linear adaptive filter. Analog I/O interface, weight coefficients, and adaptation blocks are all integrated on the chip. A small network with six neurons and 30 synapses was fabricated in a 2-micron n-well double-polysilicon, double-metal CMOS process. Circuit designs at the transistor level yield area-efficient implementations for neurons, synapses, and the adaptation blocks. The design methodology and constraints as well as test results from the fabricated chips are discussed.
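The chip implements the Herault-Jutten adaptation in analog hardware; a discrete-time software sketch of the same rule, with the commonly used odd nonlinearities f(y) = y^3 and g(y) = arctan(y), is shown below. The mixing matrix and source signals are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20000
sources = np.vstack([np.sign(np.sin(0.05 * np.arange(n))),  # two independent
                     rng.uniform(-1, 1, n)])                 # source signals
A = np.array([[1.0, 0.6],
              [0.5, 1.0]])                                   # unknown mixing
x = A @ sources                                              # observed mixtures

C = np.zeros((2, 2))          # adaptive cross-coupling weights
mu = 1e-4                     # adaptation rate
for k in range(n):
    y = np.linalg.solve(np.eye(2) + C, x[:, k])  # recursive network output
    # Herault-Jutten rule: odd nonlinearities, off-diagonal updates only.
    dC = mu * np.outer(y ** 3, np.arctan(y))
    np.fill_diagonal(dC, 0.0)
    C += dC

# Off-diagonal couplings should approach those of A (up to scale/permutation).
print("learned coupling matrix:\n", C)
```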
Schooley, Janine; Morales, Linda
2007-01-01
The "traditional" use of the Positive Deviance approach to behavior change involves studying children who thrive despite adversity, identifying uncommon model behaviors among Positive Deviant families, and then designing and implementing an intervention to replicate these behaviors among mothers of malnourished children. This article presents the results of a literature review designed to gather information on the role of the Positive Deviance/Hearth methodology in social and behavior change. Examples of how the methodology has been applied beyond infant and child malnutrition to address other health areas, such as improving pregnancy outcomes, are explored. An analysis of Positive Deviance programming being carried out by Project Concern International in Guatemala and Indonesia is conducted. The role of cultural context in the design and implementation of Positive Deviance/Hearth, as well as the role of Positive Deviance in affecting social and behavior change, require further exploration. The issues related to cultural context and the challenges for monitoring and evaluation of program outcomes are presented.
A Big Data Analytics Methodology Program in the Health Sector
ERIC Educational Resources Information Center
Lawler, James; Joseph, Anthony; Howell-Barber, H.
2016-01-01
The benefits of Big Data Analytics are cited frequently in the literature. However, the difficulties of implementing Big Data Analytics can limit the number of organizational projects. In this study, the authors evaluate business, procedural and technical factors in the implementation of Big Data Analytics, applying a methodology program. Focusing…
NASA Astrophysics Data System (ADS)
Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.
2016-12-01
Sensitivity analysis has been an important tool in groundwater modeling to identify the influential parameters. Among various sensitivity analysis methods, the variance-based global sensitivity analysis has gained popularity for its model independence characteristic and capability of providing accurate sensitivity measurements. However, the conventional variance-based method only considers uncertainty contribution of single model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework to allow flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model and parametric. Furthermore, each layer of uncertainty source is capable of containing multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using Bayesian network. Different uncertainty components are represented as uncertain nodes in this network. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility of using different grouping strategies for uncertainty components. The variance-based sensitivity analysis thus is improved to be able to investigate the importance of an extended range of uncertainty sources: scenario, model, and other different combinations of uncertainty components which can represent certain key model system processes (e.g., groundwater recharge process, flow reactive transport process). For test and demonstration purposes, the developed methodology was implemented into a test case of real-world groundwater reactive transport modeling with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measurements for any uncertainty sources which were formed by different combinations of uncertainty components. The new methodology can provide useful information for environmental management and decision-makers to formulate policies and strategies.
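As a minimal concrete instance of the variance-based analysis the abstract builds on, the sketch below estimates first-order Sobol indices with the pick-freeze (Saltelli) Monte Carlo estimator. The two-parameter model is an invented stand-in, not the groundwater reactive transport model from the study.

```python
import numpy as np

def model(x):
    # Toy response: strongly driven by x1, weakly by x2.
    return 4.0 * x[:, 0] + 0.5 * np.sin(2 * np.pi * x[:, 1])

rng = np.random.default_rng(3)
N, d = 100_000, 2
A, B = rng.uniform(size=(N, d)), rng.uniform(size=(N, d))
yA, yB = model(A), model(B)
var_y = np.var(np.concatenate([yA, yB]))

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                 # "pick-freeze" column swap
    S_i = np.mean(yB * (model(ABi) - yA)) / var_y  # Saltelli (2010) estimator
    print(f"first-order Sobol index S_{i + 1} ~ {S_i:.3f}")
```

For this toy model the analytic values are S_1 ≈ 0.91 and S_2 ≈ 0.09, so nearly all output variance is attributable to the first parameter.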
Hubble, Lee J; Cooper, James S; Sosa-Pintos, Andrea; Kiiveri, Harri; Chow, Edith; Webster, Melissa S; Wieczorek, Lech; Raguse, Burkhard
2015-02-09
Chemiresistor sensor arrays are a promising technology to replace current laboratory-based analysis instrumentation, with the advantage of facile integration into portable, low-cost devices for in-field use. To increase the performance of chemiresistor sensor arrays a high-throughput fabrication and screening methodology was developed to assess different organothiol-functionalized gold nanoparticle chemiresistors. This high-throughput fabrication and testing methodology was implemented to screen a library consisting of 132 different organothiol compounds as capping agents for functionalized gold nanoparticle chemiresistor sensors. The methodology utilized an automated liquid handling workstation for the in situ functionalization of gold nanoparticle films and subsequent automated analyte testing of sensor arrays using a flow-injection analysis system. To test the methodology we focused on the discrimination and quantitation of benzene, toluene, ethylbenzene, p-xylene, and naphthalene (BTEXN) mixtures in water at low microgram per liter concentration levels. The high-throughput methodology identified a sensor array configuration consisting of a subset of organothiol-functionalized chemiresistors which in combination with random forests analysis was able to predict individual analyte concentrations with overall root-mean-square errors ranging between 8-17 μg/L for mixtures of BTEXN in water at the 100 μg/L concentration. The ability to use a simple sensor array system to quantitate BTEXN mixtures in water at the low μg/L concentration range has direct and significant implications to future environmental monitoring and reporting strategies. In addition, these results demonstrate the advantages of high-throughput screening to improve the performance of gold nanoparticle based chemiresistors for both new and existing applications.
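The final regression step — predicting analyte concentrations from an array of sensor responses with random forests — can be sketched with scikit-learn on synthetic data. The linear mixing model, noise level, and array size below are assumptions standing in for real chemiresistor responses.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Synthetic calibration set: 64 sensor responses as noisy mixes of five
# analyte concentrations (µg/L), loosely mimicking a BTEXN experiment.
rng = np.random.default_rng(4)
n_samples, n_sensors, n_analytes = 400, 64, 5
conc = rng.uniform(0, 100, size=(n_samples, n_analytes))
mixing = rng.uniform(0.1, 1.0, size=(n_analytes, n_sensors))
responses = conc @ mixing + rng.normal(0, 1.0, size=(n_samples, n_sensors))

# Random forests handle multi-output regression natively.
X_tr, X_te, y_tr, y_te = train_test_split(responses, conc, random_state=0)
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
rmse = np.sqrt(mean_squared_error(y_te, rf.predict(X_te)))
print(f"overall RMSE: {rmse:.1f} µg/L")
```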
NASA Astrophysics Data System (ADS)
Nogueira, Juan Manuel; Romero, David; Espadas, Javier; Molina, Arturo
2013-02-01
With the emergence of new enterprise models, such as technology-based enterprises, and the large quantity of information generated through technological advances, the Zachman framework continues to represent a modelling tool of great utility and value to construct an enterprise architecture (EA) that can integrate and align the IT infrastructure and business goals. Nevertheless, implementing an EA requires an important effort within an enterprise. Small technology-based enterprises and start-ups can take advantage of EAs and frameworks but, because these enterprises have limited resources to allocate for this task, an enterprise framework implementation is not feasible in most cases. This article proposes a new methodology based on action-research for the implementation of the business, system and technology models of the Zachman framework to assist and facilitate its implementation. Following an explanation of the cycles of the proposed methodology, a case study is presented to illustrate the results of implementing the Zachman framework in a technology-based enterprise, PyME CREATIVA, using an action-research approach.
A neural network based methodology to predict site-specific spectral acceleration values
NASA Astrophysics Data System (ADS)
Kamatchi, P.; Rajasankar, J.; Ramana, G. V.; Nagpal, A. K.
2010-12-01
A general neural network based methodology that has the potential to replace the computationally-intensive site-specific seismic analysis of structures is proposed in this paper. The basic framework of the methodology consists of a feed forward back propagation neural network algorithm with one hidden layer to represent the seismic potential of a region and soil amplification effects. The methodology is implemented and verified with parameters corresponding to Delhi city in India. For this purpose, strong ground motions are generated at bedrock level for a chosen site in Delhi due to earthquakes considered to originate from the central seismic gap of the Himalayan belt using necessary geological as well as geotechnical data. Surface level ground motions and corresponding site-specific response spectra are obtained by using a one-dimensional equivalent linear wave propagation model. Spectral acceleration values are considered as a target parameter to verify the performance of the methodology. Numerical studies carried out to validate the proposed methodology show that the errors in predicted spectral acceleration values are within acceptable limits for design purposes. The methodology is general in the sense that it can be applied to other seismically vulnerable regions and also can be updated by including more parameters depending on the state-of-the-art in the subject.
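A small stand-in for the paper's network — one hidden layer, feed-forward, trained to map seismic inputs to (log) spectral acceleration — is sketched below with scikit-learn. The input features and the attenuation-style formula generating the targets are invented; the actual study trains on simulated site-specific ground motions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Invented training data: (magnitude, distance km, site period s) -> log-SA.
rng = np.random.default_rng(5)
X = np.column_stack([rng.uniform(5, 8, 2000),
                     rng.uniform(10, 300, 2000),
                     rng.uniform(0.1, 2.0, 2000)])
y = 0.9 * X[:, 0] - np.log(X[:, 1] + 10.0) - X[:, 2]  # made-up log-SA relation

Xs = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize features for the MLP

# One hidden layer of feed-forward units, echoing the paper's architecture.
net = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000,
                   random_state=0).fit(Xs[:1500], y[:1500])
print("held-out R^2:", round(net.score(Xs[1500:], y[1500:]), 3))
```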
New insights into liquid chromatography for more eco-friendly analysis of pharmaceuticals.
Shaaban, Heba
2016-10-01
Greening the analytical methods used for analysis of pharmaceuticals has been receiving great interest aimed at eliminating or minimizing the amount of organic solvents consumed daily worldwide without loss in chromatographic performance. Traditional analytical LC techniques employed in pharmaceutical analysis consume tremendous amounts of hazardous solvents and consequently generate large amounts of waste. The monetary and ecological impact of using large amounts of solvents and waste disposal motivated the analytical community to search for alternatives to replace polluting analytical methodologies with clean ones. In this context, implementing the principles of green analytical chemistry (GAC) in analytical laboratories is highly desired. This review gives a comprehensive overview on different green LC pathways for implementing GAC principles in analytical laboratories and focuses on evaluating the greenness of LC analytical procedures. This review presents green LC approaches for eco-friendly analysis of pharmaceuticals in industrial, biological, and environmental matrices.
ERIC Educational Resources Information Center
Hollowell, Gail P.; Osler, James E.; Hester, April L.
2015-01-01
This paper provides an applied research rationale for a longitudinal investigation that involved teaching a "Technology Engineered Science Education Course" via an Interactive Laboratory Based Genomics Curriculum. The Technology Engineering [TE] methodology was first introduced at the SAPES: South Atlantic Philosophy of Education…
Code of Federal Regulations, 2010 CFR
2010-10-01
... December 31, 2015. In the case of future cessation of local service, the expectation may be documented by... anticipate future requests for service not in keeping with prior service patterns. (See § 236.1005(b)(3)). (2... procedures and using the same methodology as required for safety and security route analysis under 49 CFR 172...
ERIC Educational Resources Information Center
Putnik, Goran; Costa, Eric; Alves, Cátia; Castro, Hélio; Varela, Leonilde; Shah, Vaibhav
2016-01-01
Social network-based engineering education (SNEE) is designed and implemented as a model of Education 3.0 paradigm. SNEE represents a new learning methodology, which is based on the concept of social networks and represents an extended model of project-led education. The concept of social networks was applied in the real-life experiment,…
ERIC Educational Resources Information Center
Naito-Billen, Yuka
2012-01-01
Recently, the significant role that pronunciation and prosody plays in processing spoken language has been widely recognized and a variety of teaching methodologies of pronunciation/prosody has been implemented in teaching foreign languages. Thus, an analysis of how similarly or differently native and L2 learners of a language use…
Improta, Giovanni; Russo, Mario Alessandro; Triassi, Maria; Converso, Giuseppe; Murino, Teresa; Santillo, Liberatina Carmela
2018-05-01
Health technology assessments (HTAs) are often difficult to conduct because of the decisive procedures of the HTA algorithm, which are often complex and not easy to apply. Thus, their use is not always convenient or possible for the assessment of technical requests requiring a multidisciplinary approach. This paper aims to address this issue through a multi-criteria analysis focusing on the analytic hierarchy process (AHP). This methodology allows the decision maker to analyse and evaluate different alternatives and monitor their impact on different actors during the decision-making process. However, the multi-criteria analysis is implemented through a simulation model to overcome the limitations of the AHP methodology. Simulations help decision-makers to make an appropriate decision and avoid unnecessary and costly attempts. Finally, a decision problem regarding the evaluation of two health technologies, namely, the evaluation of two biological prostheses for incisional infected hernias, will be analysed to assess the effectiveness of the model. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
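A minimal AHP step of the kind the abstract invokes — derive criterion weights from a pairwise comparison matrix and check consistency — looks as follows. The matrix values and the three-criteria setup are illustrative, not taken from the prosthesis evaluation.

```python
import numpy as np

# Pairwise comparison matrix on Saaty's 1-9 scale (illustrative values).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Priorities = principal right eigenvector, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = eigvals.real.argmax()
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Saaty consistency ratio (random index RI = 0.58 for n = 3); CR < 0.1 is
# conventionally taken as acceptably consistent judgments.
n = A.shape[0]
CI = (eigvals.real.max() - n) / (n - 1)
print("weights:", np.round(w, 3), " CR:", round(CI / 0.58, 3))
```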
Kelly, Elizabeth W; Kelly, Jonathan D; Hiestand, Brian; Wells-Kiser, Kathy; Starling, Stephanie; Hoekstra, James W
2010-01-01
Rapid reperfusion in patients with ST-elevation myocardial infarction (STEMI) is associated with lower mortality. Reduction in door-to-balloon (D2B) time for percutaneous coronary intervention requires multidisciplinary cooperation, process analysis, and quality improvement methodology. Six Sigma methodology was used to reduce D2B times in STEMI patients presenting to a tertiary care center. Specific steps in STEMI care were determined, time goals were established, and processes were changed to reduce each step's duration. Outcomes were tracked, and timely feedback was given to providers. After process analysis and implementation of improvements, mean D2B times decreased from 128 to 90 minutes. Improvement has been sustained; as of June 2010, the mean D2B was 56 minutes, with 100% of patients meeting the 90-minute window for the year. Six Sigma methodology and immediate provider feedback result in significant reductions in D2B times. The lessons learned may be extrapolated to other primary percutaneous coronary intervention centers. Copyright © 2010 Elsevier Inc. All rights reserved.
Huang, Chiu-Mieh; Hung, Wei-Shu; Lai, Jung-Nien; Kao, Yu-Hsiu; Wang, Ching-Ling; Guo, Jong-Long
2016-06-01
To explore the resource demands of implementing the Baby-Friendly Hospital Initiative among maternity staff. Implementing the Baby-Friendly Hospital Initiative is the most recognized global strategy for ensuring that hospital routines support breastfeeding. The maternity services of Baby-Friendly Hospital Initiative accredited hospitals are evaluated according to the Ten Steps to Successful Breastfeeding. Q methodology was applied to investigate the perspectives of 60 maternity staff in Northern Taiwan. Data were collected from May - December 2014. An online Q-sort platform was designed for the participants to perform sorting. The Q-sorts were subjected to factor analysis by using PQ Method software. Factors were extracted using principal component analysis with a varimax rotation. A combination of eigenvalues and a scree plot were employed to determine the number of retained factors. Four factors retained in the final model accounted for 56% of the total variance: (1) emphasis on implementing an institutional policy; (2) emphasis on providing supportive practices for breastfeeding mothers; (3) emphasis on establishing continual breastfeeding support; and (4) emphasis on managing breastfeeding supportive practices concerning a designated time period. The participants that were associated with Factors 1 and 3 emphasized the necessity of allocating resources to Steps 1, 2 and 10 of the Ten Steps. The participants associated with Factors 2 and 4 emphasized allocating resources to Steps 2-5 and 7. This study revealed the various perspectives of maternity staff regarding the resource demands of implementing the Baby-Friendly Hospital Initiative. These perspectives may serve as a reference for decision-makers in prioritizing resource allocation. © 2016 John Wiley & Sons Ltd.
Varsos, Constantinos; Patkos, Theodore; Oulas, Anastasis; Pavloudi, Christina; Gougousis, Alexandros; Ijaz, Umer Zeeshan; Filiopoulou, Irene; Pattakos, Nikolaos; Vanden Berghe, Edward; Fernández-Guerra, Antonio; Faulwetter, Sarah; Chatzinikolaou, Eva; Pafilis, Evangelos; Bekiari, Chryssoula; Doerr, Martin; Arvanitidis, Christos
2016-01-01
Parallel data manipulation using R has previously been addressed by members of the R community; however, most of these studies produce ad hoc solutions that are not readily available to the average R user. Our targeted users, ranging from the expert ecologist/microbiologists to computational biologists, often experience difficulties in finding optimal ways to exploit the full capacity of their computational resources. In addition, improving performance of commonly used R scripts becomes increasingly difficult, especially with large datasets. Furthermore, the implementations described here can be of significant interest to expert bioinformaticians or R developers. Therefore, our goals can be summarized as: (i) description of a complete methodology for the analysis of large datasets by combining capabilities of diverse R packages, (ii) presentation of their application through a virtual R laboratory (RvLab) that makes execution of complex functions and visualization of results easy and readily available to the end-user. In this paper, the novelty stems from implementations of parallel methodologies which rely on the processing of data on different levels of abstraction and the availability of these processes through an integrated portal. Parallel implementation R packages, such as the pbdMPI (Programming with Big Data - Interface to MPI) package, are used to implement Single Program Multiple Data (SPMD) parallelization on primitive mathematical operations, allowing for interplay with functions of the vegan package. The dplyr and RPostgreSQL R packages are further integrated, offering connections to dataframe-like objects (databases) as secondary storage solutions whenever memory demands exceed available RAM resources. The RvLab is running on a PC cluster, using version 3.1.2 (2014-10-31) on a x86_64-pc-linux-gnu (64-bit) platform, and offers an intuitive virtual environment interface enabling users to perform analysis of ecological and microbial communities based on optimized vegan functions. A beta version of the RvLab is available after registration at: https://portal.lifewatchgreece.eu/.
Evaluation of grid generation technologies from an applied perspective
NASA Technical Reports Server (NTRS)
Hufford, Gary S.; Harrand, Vincent J.; Patel, Bhavin C.; Mitchell, Curtis R.
1995-01-01
An analysis of the grid generation process from the point of view of an applied CFD engineer is given. Issues addressed include geometric modeling, structured grid generation, unstructured grid generation, hybrid grid generation and use of virtual parts libraries in large parametric analysis projects. The analysis is geared towards comparing the effective turn around time for specific grid generation and CFD projects. The conclusion was made that a single grid generation methodology is not universally suited for all CFD applications due to both limitations in grid generation and flow solver technology. A new geometric modeling and grid generation tool, CFD-GEOM, is introduced to effectively integrate the geometric modeling process to the various grid generation methodologies including structured, unstructured, and hybrid procedures. The full integration of the geometric modeling and grid generation allows implementation of extremely efficient updating procedures, a necessary requirement for large parametric analysis projects. The concept of using virtual parts libraries in conjunction with hybrid grids for large parametric analysis projects is also introduced to improve the efficiency of the applied CFD engineer.
Nadal, Ana; Pons, Oriol; Cuerva, Eva; Rieradevall, Joan; Josa, Alejandro
2018-06-01
Today, urban agriculture is one of the most widely used sustainability strategies to improve the metabolism of a city. Schools can play an important role in the implementation of sustainability master plans, due to their socio-educational activities and their cohesive links with families, all key elements in the development of urban agriculture. Thus, the main objective of this research is to develop a procedure, in compact cities, to assess the potential installation of rooftop greenhouses (RTGs) in schools. The generation of a dynamic assessment tool capable of identifying and prioritizing schools with a high potential for RTGs and their eventual implementation would also represent a significant factor in the environmental, social, and nutritional education of younger generations. The methodology has four stages (Pre-selection criteria; Selection of necessities; Sustainability analysis; and Sensitivity analysis and selection of the best alternative) in which economic, environmental, social and governance aspects are all considered. It makes use of Multi-Attribute Utility Theory and Multi-Criteria Decision Making, through the Integrated Value Model for Sustainability Assessments and the participation of two panels of multidisciplinary specialists, for the preparation of a unified sustainability index that guarantees the objectivity of the selection process. This methodology has been applied and validated in a case study of 11 schools in Barcelona (Spain). The social perspective of the proposed methodology favored the school in the case study with the most staff and the largest parent-teacher association (social and governance indicators), which obtained the highest sustainability index (S11), at a considerable distance (45%) from the worst case (S3) with fewer school staff and less parental support. Finally, objective decisions may be taken with the assistance of this appropriate, adaptable, and reliable Multi-Criteria Decision-Making tool on the vertical integration and implementation of urban agriculture in schools, in support of the goals of sustainable development and the circular economy. Copyright © 2018 Elsevier B.V. All rights reserved.
A topological multilayer model of the human body.
Barbeito, Antonio; Painho, Marco; Cabral, Pedro; O'Neill, João
2015-11-04
Geographical information systems deal with spatial databases in which topological models are described with alphanumeric information. Their graphical interfaces implement the multilayer concept and provide powerful interaction tools. In this study, we apply these concepts to the human body, creating a representation that allows an interactive, precise, and detailed anatomical study. A vector surface component of the human body is built using a three-dimensional (3-D) reconstruction methodology. The multilayer concept is implemented by associating raster components with the corresponding vector surfaces, which include neighbourhood topology enabling spatial analysis. A root mean square error of 0.18 mm validated the three-dimensional reconstruction technique for internal anatomical structures. An expanded identification capability and a new neighbourhood analysis function are the tools provided by this model.
Di Tecco, Cristina; Ronchetti, Matteo; Ghelli, Monica; Russo, Simone; Persechino, Benedetta; Iavicoli, Sergio
2015-01-01
Studies on Intervention Process Evaluation are attracting growing attention in the literature on interventions linked to stress and the wellbeing of workers. There is evidence that some elements relating to the process and content of an intervention may have a decisive role in implementing it by facilitating or hindering the effectiveness of the results. This study aimed to provide a process evaluation of interventions to assess and manage risks related to work-related stress using a methodological path offered by INAIL. The final sample is composed of 124 companies participating in an interview on aspects relating to each phase of the INAIL methodological path put in place to implement the intervention. The INAIL methodology was found to be useful in the process of assessing and managing the risks related to work-related stress. Some factors related to the process (e.g., implementation of a preliminary phase, workers' involvement, and use of external consultants) showed a role in significant differences that emerged in the levels of risk, particularly in relation to findings from the preliminary assessment. The main findings provide information on the key aspects of process and content that are useful in implementing an intervention for assessing and managing risks related to work-related stress.
Analysis of the production and transaction costs of forest carbon offset projects in the USA.
Galik, Christopher S; Cooley, David M; Baker, Justin S
2012-12-15
Forest carbon offset project implementation costs, comprised of both production and transaction costs, could present an important barrier to private landowner participation in carbon offset markets. These costs likewise represent a largely undocumented component of forest carbon offset potential. Using a custom spreadsheet model and accounting tool, this study examines the implementation costs of different forest offset project types operating in different forest types under different accounting and sampling methodologies. Sensitivity results are summarized concisely through response surface regression analysis to illustrate the relative effect of project-specific variables on total implementation costs. Results suggest that transaction costs may represent a relatively small percentage of total project implementation costs - generally less than 25% of the total. Results also show that carbon accounting methods, specifically the method used to establish project baseline, may be among the most important factors in driving implementation costs on a per-ton-of-carbon-sequestered basis, dramatically increasing variability in both transaction and production costs. This suggests that accounting could be a large driver in the financial viability of forest offset projects, with transaction costs likely being of largest concern to those projects at the margin. Copyright © 2012 Elsevier Ltd. All rights reserved.
Rink, Elizabeth
2016-01-01
Newly emerging research suggests that the actual physical location of a study and the geographic context in which a study is implemented influences the types of research methods most appropriate to use in a study as well as the study's research outcomes. This article presents a reflection on the extent to which place influenced the use of community-based participatory research (CBPR) as a research methodology in the implementation of an intervention to address sexually transmitted infections in Greenland. An evaluation of the interaction between place and CBPR suggests that the physicality of place influenced the intervention's successes and challenges. Future research that uses CBPR as a research methodology in sexual and reproductive health research in the Arctic warrants situating the research design, implementation and outcomes within the context of place.
NASA Astrophysics Data System (ADS)
Leu, Jun-Der; Lee, Larry Jung-Hsing
2017-09-01
Enterprise resource planning (ERP) is a software solution that integrates the operational processes of the business functions of an enterprise. However, implementing ERP systems is a complex process. In addition to the technical issues, companies must address problems associated with business process re-engineering, time and budget control, and organisational change. Numerous industrial studies have shown that the failure rate of ERP implementation is high, even for well-designed systems. Thus, ERP projects typically require a clear methodology to support the project execution and effectiveness. In this study, we propose a theoretical model for ERP implementation. The value engineering (VE) method forms the basis of the proposed framework, which integrates Six Sigma tools. The proposed framework encompasses five phases: knowledge generation, analysis, creation, development and execution. In the VE method, potential ERP problems related to software, hardware, consultation and organisation are analysed in a group-decision manner and in relation to value, and Six Sigma tools are applied to avoid any project defects. We validate the feasibility of the proposed model by applying it to an international manufacturing enterprise in Taiwan. The results show improvements in customer response time and operational efficiency in terms of work-in-process and turnover of materials. Based on the evidence from the case study, the theoretical framework is discussed together with the study's limitations and suggestions for future research.
Rivera, José; Carrillo, Mariano; Chacón, Mario; Herrera, Gilberto; Bojorquez, Gilberto
2007-01-01
The development of smart sensors involves the design of reconfigurable systems capable of working with different input sensors. Reconfigurable systems ideally should spend the least possible amount of time in their calibration. An autocalibration algorithm for intelligent sensors should be able to fix major problems such as offset, variation of gain and lack of linearity, as accurately as possible. This paper describes a new autocalibration methodology for nonlinear intelligent sensors based on artificial neural networks (ANN). The methodology involves analysis of several network topologies and training algorithms. The proposed method was compared against the piecewise and polynomial linearization methods. Method comparison was carried out using different numbers of calibration points and several nonlinearity levels of the input signal. The proposed method turned out to have better overall accuracy than the other two methods. In addition to the experimental results and analysis of the complete study, the paper describes the implementation of the ANN in a microcontroller unit (MCU). In order to illustrate the method's capability to build autocalibration and reconfigurable systems, a temperature measurement system was designed and tested. The proposed method is an improvement over classic autocalibration methodologies, because it impacts the design process of intelligent sensors, autocalibration methodologies and their associated factors, like time and cost.
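A toy version of the ANN calibration idea — learn the inverse mapping from a distorted sensor reading back to the true quantity — can be sketched with a small feed-forward network. The sensor distortion model (offset, gain error, cubic term) and all numbers are assumptions for illustration; the paper's MCU implementation is a separate step not shown here.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic nonlinear sensor: offset, gain error and a cubic nonlinearity.
true_temp = np.linspace(0, 100, 201).reshape(-1, 1)
raw = 0.3 + 0.9 * true_temp + 2e-5 * true_temp ** 3  # distorted sensor output

# A small feed-forward network learns the inverse (calibration) mapping.
X = raw / raw.max()                     # scale inputs for stable training
net = MLPRegressor(hidden_layer_sizes=(10,), solver="lbfgs",
                   max_iter=5000, random_state=0).fit(X, true_temp.ravel())

test_raw = np.array([[10.0], [50.0], [90.0]])
print("calibrated estimates:", np.round(net.predict(test_raw / raw.max()), 2))
```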
ERIC Educational Resources Information Center
Raj, Saravanan
2013-01-01
Purpose: This case study deals with the implementation methodology, innovations and lessons of the ICT initiative in providing agricultural extension services to the rural tribal farming community of North-East India. Methodology: This study documents the ICT project implementation challenges, impact among farmers and briefly indicates lessons of…
A Socio-Technical Exploration for Reducing & Mitigating the Risk of Retained Foreign Objects
Corrigan, Siobhán; Kay, Alison; O’Byrne, Katie; Slattery, Dubhfeasa; Sheehan, Sharon; McDonald, Nick; Smyth, David; Mealy, Ken; Cromie, Sam
2018-01-01
A Retained Foreign Object (RFO) is a fairly infrequent but serious adverse event. An accurate rate of RFOs is difficult to establish due to underreporting, but it has been estimated that incidences range between 1/1000 and 1/19,000 procedures. The cost of an RFO incident may be substantial and three-fold: (i) the cost to the patient of physical and/or psychological harm; (ii) the reputational cost to an institution and/or healthcare provider; and (iii) the financial cost to the taxpayer in the event of a legal claim. This Health Research Board-funded project aims to analyse and understand the problem of RFOs in surgical and maternity settings in Ireland and develop hospital-specific foreign object management processes and implementation roadmaps. This project will deploy an integrated evidence-based assessment methodology for socio-technical modelling (Supply, Context, Organising, Process & Effects / SCOPE Analysis Cube) and bow tie methodologies that focus on managing the risks in effectively implementing and sustaining change. It comprises a multi-phase research approach that involves active and ongoing collaboration with clinical and other healthcare staff through each phase of the research. The specific objective of this paper is to present the methodological approach and outline the potential to produce generalisable results which could be applied to other health-related issues. PMID:29642646
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gatsonis, Nikolaos A.; Spirkin, Anton
2009-06-01
The mathematical formulation and computational implementation of a three-dimensional particle-in-cell methodology on unstructured Delaunay-Voronoi tetrahedral grids is presented. The method allows simulation of plasmas in complex domains and incorporates the duality of the Delaunay-Voronoi in all aspects of the particle-in-cell cycle. Charge assignment and field interpolation weighting schemes of zero- and first-order are formulated based on the theory of long-range constraints. Electric potential and fields are derived from a finite-volume formulation of Gauss' law using the Voronoi-Delaunay dual. Boundary conditions and the algorithms for injection, particle loading, particle motion, and particle tracking are implemented for unstructured Delaunay grids. Error and sensitivity analysis examines the effects of particles/cell, grid scaling, and timestep on the numerical heating, the slowing-down time, and the deflection times. The problem of current collection by cylindrical Langmuir probes in collisionless plasmas is used for validation. Numerical results compare favorably with previous numerical and analytical solutions for a wide range of probe radius to Debye length ratios, probe potentials, and electron to ion temperature ratios. The versatility of the methodology is demonstrated with the simulation of a complex plasma microsensor, a directional micro-retarding potential analyzer that includes a low transparency micro-grid.
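To make the weighting idea concrete, here is a first-order (cloud-in-cell) charge-assignment sketch on a uniform periodic 1-D grid. The paper performs this on unstructured Delaunay-Voronoi meshes, where the weights come from the dual-cell geometry rather than linear distances, so this is only the structured-grid analogue of the scheme.

```python
import numpy as np

# First-order (cloud-in-cell) charge assignment on a uniform periodic grid.
L, n_cells = 1.0, 16
dx = L / n_cells
rng = np.random.default_rng(6)
positions = rng.uniform(0, L, 1000)     # macro-particle positions
q = 1.0 / positions.size                # equal charge per macro-particle

rho = np.zeros(n_cells)                 # charge density on grid nodes
cell = (positions / dx).astype(int) % n_cells
frac = positions / dx - cell            # fractional position within the cell
np.add.at(rho, cell, q * (1 - frac) / dx)            # weight to left node
np.add.at(rho, (cell + 1) % n_cells, q * frac / dx)  # weight to right node

print("total deposited charge:", rho.sum() * dx)     # conserved, equals 1.0
```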
MoPCoM Methodology: Focus on Models of Computation
NASA Astrophysics Data System (ADS)
Koudri, Ali; Champeau, Joël; Le Lann, Jean-Christophe; Leilde, Vincent
Today, developments of Real Time Embedded Systems have to face new challenges. On the one hand, Time-To-Market constraints require a reliable development process allowing quick design space exploration. On the other hand, rapidly developing technology, as stated by Moore's law, requires techniques to handle the resulting productivity gap. In a previous paper, we have presented our Model Based Engineering methodology addressing those issues. In this paper, we make a focus on Models of Computation design and analysis. We illustrate our approach on a Cognitive Radio System development implemented on an FPGA. This work is part of the MoPCoM research project gathering academic and industrial organizations (http://www.mopcom.fr).
BIM Methodology Approach to Infrastructure Design: Case Study of Paniga Tunnel
NASA Astrophysics Data System (ADS)
Osello, Anna; Rapetti, Niccolò; Semeraro, Francesco
2017-10-01
Nowadays, the implementation of Building Information Modelling (BIM) in civil design represents a new challenge for the AECO (Architecture, Engineering, Construction, Owner and Operator) world, one that will attract the interest of many researchers in the coming years. This is driven by the incentives of Public Administrations and European Directives that aim to improve efficiency and enhance the management of the complexity of infrastructure projects. For these reasons, the goal of this research is to propose a methodology for the use of BIM in a tunnel project, analysing the definition of a correct level of detail (LOD) and the possibility of sharing information via interoperability for FEM analysis.
A multicriteria decision making model for assessment and selection of an ERP in a logistics context
NASA Astrophysics Data System (ADS)
Pereira, Teresa; Ferreira, Fernanda A.
2017-07-01
The aim of this work is to apply a decision support methodology based on a multicriteria decision analysis (MCDA) model that allows the assessment and selection of an Enterprise Resource Planning (ERP) system in a Portuguese logistics company by a Group Decision Maker (GDM). A Decision Support System (DSS) that implements an MCDA model - the Multicriteria Methodology for the Assessment and Selection of Information Systems / Information Technologies (MMASSI/IT) - is used, based on its features and its facility to change and adapt the model to a given scope. Using this DSS, the information system best suited to the decision context was obtained, and this result was evaluated through a sensitivity and robustness analysis.
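Although MMASSI/IT defines its own scales and aggregation, the core of any such MCDA ranking is a normalized, weighted scoring of alternatives against criteria. The sketch below is a generic weighted-sum illustration with hypothetical candidates, criteria, scores and weights, not the MMASSI/IT procedure itself.

```python
import numpy as np

# Hypothetical decision matrix: three ERP candidates, four benefit criteria
# (functional fit, vendor support, usability, adaptability), scored 1-10.
scores = np.array([[7, 9, 6, 8],    # ERP A
                   [8, 6, 9, 5],    # ERP B
                   [6, 7, 7, 9]])   # ERP C
weights = np.array([0.40, 0.25, 0.20, 0.15])  # importance, summing to 1

# Normalize each criterion column, then rank by the weighted sum.
norm = scores / scores.max(axis=0)
overall = norm @ weights
names = ["ERP A", "ERP B", "ERP C"]
for name, s in zip(names, overall):
    print(name, round(float(s), 3))
print("selected:", names[overall.argmax()])
```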
Making the Case for Reusable Booster Systems: The Operations Perspective
NASA Technical Reports Server (NTRS)
Zapata, Edgar
2012-01-01
Presentation to the Aeronautics Space Engineering Board National Research Council Reusable Booster System: Review and Assessment Committee. Addresses: the criteria and assumptions used in the formulation of current RBS plans; the methodologies used in the current cost estimates for RBS; the modeling methodology used to frame the business case for an RBS capability including: the data used in the analysis, the models' robustness if new data become available, and the impact of unclassified government data that was previously unavailable and which will be supplied by the USAF; the technical maturity of key elements critical to RBS implementation and the ability of current technology development plans to meet technical readiness milestones.
NASA Technical Reports Server (NTRS)
Noll, Thomas E.; Perry, Boyd, III; Tiffany, Sherwood H.; Cole, Stanley R.; Buttrill, Carey S.; Adams, William M., Jr.; Houck, Jacob A.; Srinathkumar, S.; Mukhopadhyay, Vivek; Pototzky, Anthony S.
1989-01-01
The status of the joint NASA/Rockwell Active Flexible Wing Wind-Tunnel Test Program is described. The objectives are to develop and validate the analysis, design, and test methodologies required to apply multifunction active control technology for improving aircraft performance and stability. Major tasks include designing digital multi-input/multi-output flutter-suppression and rolling-maneuver-load alleviation concepts for a flexible full-span wind-tunnel model, obtaining an experimental data base for the basic model and each control concept and providing comparisons between experimental and analytical results to validate the methodologies. The opportunity is provided to improve real-time simulation techniques and to gain practical experience with digital control law implementation procedures.
Complex multidisciplinary system composition for aerospace vehicle conceptual design
NASA Astrophysics Data System (ADS)
Gonzalez, Lex
Although there exists a vast amount of work concerning the analysis, design, and integration of aerospace vehicle systems, there is no standard for how this data and knowledge should be combined in order to create a synthesis system. Each institution creating a synthesis system has in-house vehicle and hardware components it is attempting to model and proprietary methods with which to model them. As a result, synthesis systems begin as one-off creations meant to answer a specific problem. As the scope of a synthesis system grows to encompass more and more problems, so do its size and complexity; for a single synthesis system to answer multiple questions, the number of methods and method interfaces must increase. As a means to curtail the requirement that increasing an aircraft synthesis system's capability leads to an increase in its size and complexity, this research effort focuses on the idea that each problem in aerospace requires its own analysis framework. By centering the methodology on matching an analysis framework to the problem being solved, the complexity of the analysis framework is decoupled from the complexity of the system that creates it. The derived methodology allows for the composition of complex multi-disciplinary systems (CMDS) through the automatic creation and implementation of system and disciplinary method interfaces. The CMDS Composition process follows a four-step methodology meant to take a problem definition and progress towards the creation of an analysis framework meant to answer said problem. The implementation of the CMDS Composition process takes user-selected disciplinary analysis methods and automatically integrates them to create a syntactically composable analysis framework. As a means of assessing the validity of the CMDS Composition process, a prototype system (AVD-DBMS) has been developed. AVD-DBMS has been used to model the Generic Hypersonic Vehicle (GHV), an open-source family of hypersonic vehicles originating from the Air Force Research Laboratory. AVD-DBMS has been applied in three different ways in order to assess its validity: verification using GHV disciplinary data, validation using selected disciplinary analysis methods, and application of the CMDS Composition process to assess the design solution space for the GHV hardware. The research demonstrates the holistic effect that the selection of individual disciplinary analysis methods has on the structure and integration of the analysis framework.
Code System for Performance Assessment Ground-water Analysis for Low-level Nuclear Waste.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kozak, Matthew W.
1994-02-09
Version 00 The PAGAN code system is part of the performance assessment methodology developed for use by the U.S. Nuclear Regulatory Commission in evaluating license applications for low-level waste disposal facilities. In this methodology, PAGAN is used as one candidate approach for analysis of the ground-water pathway. PAGAN, Version 1.1, has the capability to model the source term, vadose-zone transport, and aquifer transport of radionuclides from a waste disposal unit. It combines the two codes SURFACE and DISPERSE, which implement semi-analytical solutions to the convection-dispersion equation. The system uses menu-driven input/output for implementing a simple ground-water transport analysis and incorporates statistical uncertainty functions for handling data uncertainties. The output from PAGAN includes a time- and location-dependent radionuclide concentration at a well in the aquifer, or a time- and location-dependent radionuclide flux into a surface-water body.
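PAGAN's own source-term and transport formulation is not reproduced here, but codes of this family typically build on semi-analytical solutions such as the classic Ogata-Banks result for the one-dimensional convection-dispersion equation with a continuous source. A minimal sketch, with purely illustrative parameter values, is:

    import numpy as np
    from scipy.special import erfc

    def ogata_banks(x, t, v, D, c0=1.0):
        """Ogata-Banks solution of dC/dt = D d2C/dx2 - v dC/dx on a
        semi-infinite column with constant inlet concentration c0."""
        s = 2.0 * np.sqrt(D * t)
        return 0.5 * c0 * (erfc((x - v * t) / s)
                           + np.exp(v * x / D) * erfc((x + v * t) / s))

    # illustrative aquifer parameters: v = 2 m/yr seepage velocity,
    # D = 10 m^2/yr dispersion; concentration at a well 50 m downgradient
    conc = ogata_banks(x=50.0, t=np.linspace(1.0, 100.0, 100), v=2.0, D=10.0)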
Implementation of efficient trajectories for an ultrasonic scanner using chaotic maps
NASA Astrophysics Data System (ADS)
Almeda, A.; Baltazar, A.; Treesatayapun, C.; Mijarez, R.
2012-05-01
Typical ultrasonic methodologies for nondestructive scanning evaluation use systematic scanning paths. In many cases, this approach is time-inefficient and consumes unnecessary energy and computational power. Here, a methodology for the scanning of defects using an ultrasonic echo-pulse technique combined with chaotic trajectory generation is proposed. It is implemented in a Cartesian-coordinate robotic system developed in our lab. To cover the entire search area, a chaotic function and a proposed mirror mapping were incorporated. To improve detection probability, the proposed scanning methodology is complemented with a probabilistic approach to discontinuity detection. The developed methodology was found to be more efficient than traditional ones in localizing and characterizing hidden flaws.
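The chaotic function and mirror mapping used by the authors are not specified in the abstract; a minimal sketch of the general idea, assuming one logistic map per axis and a triangular fold as a stand-in for the mirror mapping, might look like this.

    import numpy as np

    def mirror(v):
        """Triangular fold of the unit interval; a stand-in 'mirror mapping'
        that spreads the iterates more evenly over the scan area."""
        return np.where(v < 0.5, 2.0 * v, 2.0 * (1.0 - v))

    def chaotic_scan(n_points, width, height, x0=0.31, y0=0.72, r=3.99):
        """Probe positions from two logistic maps, chaotic for r near 4."""
        xs, ys = np.empty(n_points), np.empty(n_points)
        x, y = x0, y0
        for i in range(n_points):
            x = r * x * (1.0 - x)
            y = r * y * (1.0 - y)
            xs[i], ys[i] = x, y
        return mirror(xs) * width, mirror(ys) * height

    px, py = chaotic_scan(500, width=200.0, height=120.0)  # positions in mm

In practice each visited position would trigger an echo-pulse measurement, and the probabilistic detection step would decide where to densify the search.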
Resting-State Functional Connectivity in the Infant Brain: Methods, Pitfalls, and Potentiality.
Mongerson, Chandler R L; Jennings, Russell W; Borsook, David; Becerra, Lino; Bajic, Dusica
2017-01-01
Early brain development is characterized by rapid growth and perpetual reconfiguration, driven by a dynamic milieu of heterogeneous processes. Postnatal brain plasticity is associated with increased vulnerability to environmental stimuli. However, little is known regarding the ontogeny and temporal manifestations of inter- and intra-regional functional connectivity that comprise functional brain networks. Resting-state functional magnetic resonance imaging (rs-fMRI) has emerged as a promising non-invasive neuroinvestigative tool, measuring spontaneous fluctuations in blood oxygen level dependent (BOLD) signal at rest that reflect baseline neuronal activity. Over the past decade, its application has expanded to infant populations providing unprecedented insight into functional organization of the developing brain, as well as early biomarkers of abnormal states. However, many methodological issues of rs-fMRI analysis need to be resolved prior to standardization of the technique to infant populations. As a primary goal, this methodological manuscript will (1) present a robust methodological protocol to extract and assess resting-state networks in early infancy using independent component analysis (ICA), such that investigators without previous knowledge in the field can implement the analysis and reliably obtain viable results consistent with previous literature; (2) review the current methodological challenges and ethical considerations associated with emerging field of infant rs-fMRI analysis; and (3) discuss the significance of rs-fMRI application in infants for future investigations of neurodevelopment in the context of early life stressors and pathological processes. The overarching goal is to catalyze efforts toward development of robust, infant-specific acquisition, and preprocessing pipelines, as well as promote greater transparency by researchers regarding methods used.
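As a schematic of the ICA step described above (not the authors' full infant-specific pipeline, which also depends on acquisition-specific preprocessing), spatial components can be extracted from a preprocessed 4-D dataset reshaped to time x voxels; here scikit-learn's FastICA stands in for dedicated tools such as MELODIC, and the data are random placeholders.

    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    data = rng.standard_normal((150, 5000))   # stand-in: 150 volumes x 5000 voxels

    ica = FastICA(n_components=20, random_state=0, max_iter=500)
    time_courses = ica.fit_transform(data)    # (150, 20) component time series
    spatial_maps = ica.components_            # (20, 5000) voxel-wise maps

    # z-score each map so components can be thresholded and visually inspected
    z_maps = (spatial_maps - spatial_maps.mean(axis=1, keepdims=True)) \
             / spatial_maps.std(axis=1, keepdims=True)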
Characterizing Aeroallergens by Infrared Spectroscopy of Fungal Spores and Pollen
Zimmermann, Boris; Tkalčec, Zdenko; Mešić, Armin; Kohler, Achim
2015-01-01
Background Fungal spores and plant pollen cause respiratory diseases in susceptible individuals, such as asthma, allergic rhinitis and hypersensitivity pneumonitis. Aeroallergen monitoring networks are an important part of treatment strategies, but unfortunately traditional analysis is time consuming and expensive. We have explored the use of infrared spectroscopy of pollen and spores for an inexpensive and rapid characterization of aeroallergens. Methodology The study is based on measurement of spore and pollen samples by single reflectance attenuated total reflectance Fourier transform infrared spectroscopy (SR-ATR FTIR). The experimental set includes 71 spore (Basidiomycota) and 121 pollen (Pinales, Fagales and Poales) samples. Along with fresh basidiospores, the study has been conducted on the archived samples collected within the last 50 years. Results The spectroscopic-based methodology enables clear spectral differentiation between pollen and spores, as well as the separation of confamiliar and congeneric species. In addition, the analysis of the scattering signals inherent in the infrared spectra indicates that the FTIR methodology offers indirect estimation of morphology of pollen and spores. The analysis of fresh and archived spores shows that chemical composition of spores is well preserved even after decades of storage, including the characteristic taxonomy-related signals. Therefore, biochemical analysis of fungal spores by FTIR could provide economical, reliable and timely methodologies for improving fungal taxonomy, as well as for fungal identification and monitoring. This proof of principle study shows the potential for using FTIR as a rapid tool in aeroallergen studies. In addition, the presented method is ready to be immediately implemented in biological and ecological studies for direct measurement of pollen and spores from flowers and sporocarps. PMID:25867755
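The chemometric model is not detailed in the abstract; a generic sketch of the kind of spectral classification it implies (dimension reduction followed by a linear classifier) is shown below, with synthetic stand-in spectra in place of the measured SR-ATR FTIR data.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(1)
    X = rng.standard_normal((192, 900))   # stand-in: 192 spectra x 900 channels
    y = rng.integers(0, 2, 192)           # stand-in labels: 0 = pollen, 1 = spores

    # PCA compresses the collinear absorbance channels; LDA separates the taxa
    clf = make_pipeline(PCA(n_components=15), LinearDiscriminantAnalysis())
    print(cross_val_score(clf, X, y, cv=5).mean())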
Sørensen, Ellen Westh; Haugbølle, Lotte Stig
2008-12-01
Action research (AR) is a common research-based methodology useful for development and organizational changes in health care when participant involvement is key. However, AR is not widely used for research in the development of pharmaceutical care services in pharmacy practice. To disseminate the experience from using AR methodology to develop cognitive services in pharmacies by describing how the AR process was conducted in a specific study, and to describe the outcome for participants. The study was conducted over a 3-year period and run by a steering group of researchers, pharmacy students, and preceptors. The study design was based on AR methodology. The following data production methods were used to describe and evaluate the AR model: documentary analysis, qualitative interviews, and questionnaires. Experiences from using AR methodology and the outcome for participants are described. A set of principles was followed while the study, called the Pharmacy-University study, was being conducted. These principles are considered useful for designing future AR studies. Outcome for participating pharmacies was registered for staff-oriented and patient-oriented activities. Outcome for students was practice as project leaders and enhancement of clinical pharmacy-based skills. Outcome for researchers and the steering group conducting the study was in-depth knowledge of the status of pharmacies in giving advice to patient groups, and effective learning methods for students. Developing and implementing cognitive pharmaceutical services (CPS) involves wide-reaching changes that require the willingness of pharmacy and staff as well as external partners. The use of AR methodology creates a platform that supports raising the awareness and the possible inclusion of these partners. During this study, a set of tools was developed for use in implementing CPS as part of AR.
NASA Astrophysics Data System (ADS)
Guzman, Diego; Mohor, Guilherme; Câmara, Clarissa; Mendiondo, Eduardo
2017-04-01
Studies from around the world relate global environmental change to an increase in vulnerability to extreme events, such as heavy or scarce precipitation - floods and droughts. Hydrological disasters have caused increasing losses in recent years. Thus, risk transfer mechanisms, such as insurance, are being implemented to mitigate impacts, finance the recovery of the affected population, and promote the reduction of hydrological risks. However, the main problems in implementing these strategies are: first, knowledge of natural and anthropogenic climate change, in terms of intensity and frequency, is only partial; second, efficient risk-reduction policies require accurate risk assessment, with careful consideration of costs; third, there is uncertainty associated with the numerical models and input data used. The objective of this document is to introduce and discuss the feasibility of applying Hydrological Risk Transfer Models (HRTMs) as a strategy of adaptation to global climate change. The article presents the development of a methodology for collective and multi-sectoral vulnerability management, facing long-term hydrological risk, through an insurance fund simulator. The methodology estimates the optimized premium as a function of willingness to pay (WTP) and the potential direct loss derived from hydrological risk. The proposed methodology structures the watershed insurance scheme in three analysis modules. First, the hazard module characterizes the hydrologic threat from recorded input series or series modelled under IPCC / RCM-generated scenarios. Second, the vulnerability module calculates the potential economic loss for each evaluated sector as a function of the return period "TR". Finally, the finance module determines the value of the optimal aggregate premium by evaluating equiprobable scenarios of water vulnerability, taking into account variables such as the maximum limit of coverage, deductible, reinsurance schemes, and incentives for risk reduction. The methodology, tested by members of the Integrated Nucleus of River Basins (NIBH) at the University of Sao Paulo (USP) School of Engineering of São Carlos (EESC), Brazil, presents an alternative for the analysis and planning of insurance funds, aiming to mitigate the impacts of hydrological droughts and stream flash floods. The presented procedure is especially important when information relevant to the development and implementation of insurance funds is difficult to access and complex to evaluate. A sequence of academic applications has been made in Brazil under the South American context, where the market for hydrological insurance has low penetration compared to developed economies with more established insurance markets, such as the United States and Europe, producing relevant information and demonstrating the potential of the methodology under development.
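The fund simulator itself is not reproduced in the abstract; as a much-simplified sketch of the finance module's core arithmetic, an aggregate premium can be formed from scenario losses, a deductible, a coverage limit, and a loading factor. All figures are hypothetical, and the WTP optimization the authors describe is omitted.

    import numpy as np

    def aggregate_premium(losses, annual_probs, loading=0.25,
                          deductible=0.0, cover_limit=np.inf):
        """Premium = (1 + loading) x expected annual insured loss."""
        losses = np.asarray(losses, dtype=float)
        p = np.asarray(annual_probs, dtype=float)
        payouts = np.clip(losses - deductible, 0.0, cover_limit)  # insurer share
        return (1.0 + loading) * np.sum(p * payouts)

    # hypothetical hydrological scenarios with return periods 5, 20 and 100 years
    premium = aggregate_premium(losses=[1.0e6, 5.0e6, 2.0e7],
                                annual_probs=[1 / 5, 1 / 20, 1 / 100],
                                deductible=2.0e5, cover_limit=1.5e7)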
Shaikh, Faiq; Franc, Benjamin; Allen, Erastus; Sala, Evis; Awan, Omer; Hendrata, Kenneth; Halabi, Safwan; Mohiuddin, Sohaib; Malik, Sana; Hadley, Dexter; Shrestha, Rasu
2018-03-01
Enterprise imaging has channeled various technological innovations to the field of clinical radiology, ranging from advanced imaging equipment and postacquisition iterative reconstruction tools to image analysis and computer-aided detection tools. More recently, the advancement in the field of quantitative image analysis coupled with machine learning-based data analytics, classification, and integration has ushered in the era of radiomics, a paradigm shift that holds tremendous potential in clinical decision support as well as drug discovery. However, there are important issues to consider to incorporate radiomics into a clinically applicable system and a commercially viable solution. In this two-part series, we offer insights into the development of the translational pipeline for radiomics from methodology to clinical implementation (Part 1) and from that point to enterprise development (Part 2). In Part 2 of this two-part series, we study the components of the strategy pipeline, from clinical implementation to building enterprise solutions. Copyright © 2017 American College of Radiology. Published by Elsevier Inc. All rights reserved.
Laitinen, Heleena; Kaunonen, Marja; Astedt-Kurki, Päivi
2014-11-01
To give clarity to the analysis of participant observation in nursing when implementing the grounded theory method. Participant observation (PO) is a method of collecting data that reveals the reality of daily life in a specific context. In grounded theory, interviews are the primary method of collecting data, but PO gives a distinctive insight, revealing what people are really doing instead of what they say they are doing. However, more focus is needed on the analysis of PO. An observational study was carried out to gain awareness of nursing care and its electronic documentation in four acute care wards in hospitals in Finland. Discussion of using the grounded theory method and PO as a data collection tool. The following methodological tools are discussed: an observational protocol, jotting of notes, microanalysis, the use of questioning, constant comparison, and writing and illustrating. Each tool has specific significance in collecting and analysing data, working in constant interaction. Grounded theory and participant observation supplied rich data and revealed the complexity of the daily reality of acute care. In this study, the methodological tools provided a base for the study at the research sites and outside. The process as a whole was challenging. It was time-consuming and required rigorous and simultaneous data collection and analysis, including reflective writing. Using these methodological tools helped the researcher stay focused from data collection and analysis to building theory. Using PO as a data collection method in qualitative nursing research provides insights that cannot be seen or revealed by other data collection methods. It is not commonly discussed in nursing research, and therefore this paper can provide a useful tool for those who intend to use PO and grounded theory in their nursing research.
Liang, Geng
2015-01-01
In this paper, improving the control performance of a networked control system by reducing DTD from a different perspective was investigated. Two different network architectures for system implementation were presented. Analysis and improvements dealing with DTD for the experimental control system were expounded. Effects of control scheme configuration on DTD in the form of function blocks (FB) were investigated, and corresponding improvements by reallocation of FB and rearrangement of the schedule table are proposed. Issues of DTD in a hybrid network were investigated, and corresponding approaches to improve performance, including (1) reducing DTD in the PLC or PAC by way of IEC 61499 and (2) cascade Smith predictive control with BPNN-based identification, were proposed and investigated. Control effects under the proposed methodologies are also given. Experimental and field practice validated these methodologies. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
Ogirala, Ajay; Stachel, Joshua R; Mickle, Marlin H
2011-11-01
The increasing density of wireless communication, and the development of radio frequency identification (RFID) technology in particular, have increased the susceptibility of patients equipped with cardiac rhythmic monitoring devices (CRMD) to environmental electromagnetic interference (EMI). Several organizations have reported observing CRMD EMI from different sources. This paper focuses on mathematically analyzing the energy as perceived by the implanted device, i.e., voltage. Radio frequency (RF) energy transmitted by RFID interrogators is considered as an example. A simplified front-end equivalent circuit of the CRMD sensing circuitry is proposed for the analysis, following extensive black-box testing of several commercial pacemakers and implantable defibrillators. After a careful study of the mechanics of CRMD signal processing in identifying the QRS complex of the heartbeat, a mitigation technique is proposed. The mitigation methodology introduced in this paper is logical in approach, simple to implement, and therefore applicable to all wireless communication protocols.
Norlyk, Annelise; Harder, Ingegerd
2010-03-01
This article contributes to the debate about phenomenology as a research approach in nursing by providing a systematic review of what nurse researchers hold as phenomenology in published empirical studies. Based on the assumption that presentations of phenomenological approaches in peer-reviewed journals have consequences for the quality of future research, the aim was to analyze articles presenting phenomenological studies and, in light of the findings, raise a discussion about addressing scientific criteria. The analysis revealed considerable variations, ranging from brief to detailed descriptions of the stated phenomenological approach, and from inconsistencies to methodological clarity and rigor. Variations, apparent inconsistencies, and omissions made it unclear what makes a phenomenological study phenomenological. There is a need for clarifying how the principles of the phenomenological philosophy are implemented in a particular study before publishing. This should include an articulation of methodological keywords of the investigated phenomenon, and how an open attitude was adopted.
High resolution earth observation from geostationary orbit by optical aperture synthesis
NASA Astrophysics Data System (ADS)
Mesrine, M.; Thomas, E.; Garin, S.; Blanc, P.; Alis, C.; Cassaing, F.; Laubier, D.
2017-11-01
In this paper, we describe Optical Aperture Synthesis (OAS) imaging instrument concepts studied by Alcatel Alenia Space under a CNES R&T contract in terms of technical feasibility. First, the methodology to select the aperture configuration is proposed, based on the definition and quantification of image quality criteria adapted to an OAS instrument for direct imaging of extended objects. The following section presents, for each interferometer type (Michelson and Fizeau), the corresponding optical configurations compatible with a large field of view from GEO orbit. These optical concepts take into account the constraints imposed by the foreseen resolution and the implementation of the co-phasing functions. The fourth section is dedicated to the analysis of the co-phasing methodologies, from configuration deployment to fine stabilization during observation. Finally, we present a trade-off analysis allowing selection of the concept with respect to mission specifications and constraints related to instrument accommodation under the launcher shroud and in-orbit deployment.
FAME, a microprocessor based front-end analysis and modeling environment
NASA Technical Reports Server (NTRS)
Rosenbaum, J. D.; Kutin, E. B.
1980-01-01
Higher order software (HOS) is a methodology for the specification and verification of large scale, complex, real time systems. The HOS methodology was implemented as FAME (front end analysis and modeling environment), a microprocessor based system for interactively developing, analyzing, and displaying system models in a low cost user-friendly environment. The nature of the model is such that when completed it can be the basis for projection to a variety of forms such as structured design diagrams, Petri-nets, data flow diagrams, and PSL/PSA source code. The user's interface with the analyzer is easily recognized by any current user of a structured modeling approach; therefore extensive training is unnecessary. Furthermore, when all the system capabilities are used one can check on proper usage of data types, functions, and control structures thereby adding a new dimension to the design process that will lead to better and more easily verified software designs.
Patterson, Stephanie Y; Smith, Veronica; Mirenda, Pat
2012-09-01
The purpose of this systematic review was to examine research utilizing single subject research designs (SSRD) to explore the effectiveness of interventions designed to increase parents' ability to support communication and social development in children with autism spectrum disorders (ASDs). Included studies were systematically assessed for methodological quality (Logan et al., 2008; Smith et al., 2007) and intervention effects. Data examining participant characteristics, study methodology, outcomes, and analysis were systematically extracted. Eleven SSRD parent-training intervention studies examining 44 participants with ASD were included. Overall, the studies were of moderate quality and reported increases in parent skills and child language and communication outcomes. The results supported by improvement rate difference (IRD) analysis indicated several interventions demonstrated positive effects for both parent and child outcomes. However, limited generalization and follow-up data suggested only one intervention demonstrated parents' accurate and ongoing intervention implementation beyond training.
DB4US: A Decision Support System for Laboratory Information Management
Hortas, Maria Luisa; Baena-García, Manuel; Lana-Linati, Jorge; González, Carlos; Redondo, Maximino; Morales-Bueno, Rafael
2012-01-01
Background Until recently, laboratory automation has focused primarily on improving hardware. Future advances are concentrated on intelligent software, since laboratories performing clinical diagnostic testing require improved information systems to address their data processing needs. In this paper, we propose DB4US, an application that automates the management of laboratory quality indicator information. Currently, there is a lack of ready-to-use management quality measures. This application addresses this deficiency through the extraction, consolidation, statistical analysis, and visualization of data related to the use of demographics, reagents, and turn-around times. The design and implementation issues, as well as the technologies used for the implementation of this system, are discussed in this paper. Objective To develop a general methodology that integrates the computation of ready-to-use management quality measures and a dashboard to easily analyze the overall performance of a laboratory, as well as automatically detect anomalies or errors. The novelty of our approach lies in the application of integrated web-based dashboards as an information management system in hospital laboratories. Methods We propose a new methodology for laboratory information management based on the extraction, consolidation, statistical analysis, and visualization of data related to demographics, reagents, and turn-around times, offering a dashboard-like web interface to the laboratory manager. The methodology comprises a unified data warehouse that stores and consolidates multidimensional data from different data sources. The methodology is illustrated through the implementation and validation of DB4US, a novel web application based on this methodology that constructs an interface to obtain ready-to-use indicators and offers the possibility to drill down from high-level metrics to more detailed summaries. The offered indicators are calculated beforehand so that they are ready to use when the user needs them. The design is based on a set of parallel processes that precalculate indicators. The application displays information related to tests, requests, samples, and turn-around times. The dashboard is designed to show the set of indicators on a single screen. Results DB4US was deployed for the first time in the Hospital Costa del Sol in 2008. Our evaluation shows the positive impact of this methodology for laboratory professionals: the use of the application has reduced the time needed to elaborate the different statistical indicators and has provided information that has been used to optimize the usage of laboratory resources through the discovery of anomalies in the indicators. DB4US users benefit from Internet-based communication of results, since this information is available from any computer without having to install any additional software. Conclusions The proposed methodology and the accompanying web application, DB4US, automate the processing of information related to laboratory quality indicators and offer a novel approach for managing laboratory-related information, benefiting from an Internet-based communication mechanism. The application of this methodology has been shown to improve the usage of time, as well as other laboratory resources. PMID:23608745
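DB4US's data warehouse and parallel precalculation processes are beyond a short example, but the core of one indicator family, turn-around time from sample reception to report, can be sketched with pandas; all column names and values below are invented.

    import pandas as pd

    # stand-in extract from a laboratory information system
    df = pd.DataFrame({
        "sample_id": [1, 2, 3, 4],
        "test":      ["HbA1c", "HbA1c", "TSH", "TSH"],
        "received":  pd.to_datetime(["2012-03-01 08:00", "2012-03-01 09:10",
                                     "2012-03-01 08:30", "2012-03-01 11:00"]),
        "reported":  pd.to_datetime(["2012-03-01 10:30", "2012-03-01 12:40",
                                     "2012-03-01 09:45", "2012-03-02 08:00"]),
    })

    df["tat_hours"] = (df["reported"] - df["received"]).dt.total_seconds() / 3600.0
    indicators = df.groupby("test")["tat_hours"].agg(["median", "mean", "max"])
    print(indicators)   # ready-to-use turn-around-time indicators per test

A dashboard layer would precalculate such tables on a schedule and flag values that drift outside control limits.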
Cummings, Amanda; Lund, Susi; Campling, Natasha; May, Carl; Richardson, Alison; Myall, Michelle
2017-01-01
Objectives To identify the factors that promote and inhibit the implementation of interventions that improve communication and decision-making directed at goals of care in the event of acute clinical deterioration. Design and methods A scoping review was undertaken based on the methodological framework of Arksey and O’Malley for conducting this type of review. Searches were carried out in Medline and Cumulative Index to Nursing and Allied Health Literature (CINAHL) to identify peer-reviewed papers and in Google to identify grey literature. Searches were limited to those published in the English language from 2000 onwards. Inclusion and exclusion criteria were applied, and only papers that had a specific focus on implementation in practice were selected. Data extracted were treated as qualitative and subjected to directed content analysis. A theory-informed coding framework using Normalisation Process Theory (NPT) was applied to characterise and explain implementation processes. Results Searches identified 2619 citations, 43 of which met the inclusion criteria. Analysis generated six themes fundamental to successful implementation of goals of care interventions: (1) input into development; (2) key clinical proponents; (3) training and education; (4) intervention workability and functionality; (5) setting and context; and (6) perceived value and appraisal. Conclusions A broad and diverse literature focusing on implementation of goals of care interventions was identified. Our review recognised these interventions as both complex and contentious in nature, making their incorporation into routine clinical practice dependent on a number of factors. Implementing such interventions presents challenges at individual, organisational and systems levels, which make them difficult to introduce and embed. We have identified a series of factors that influence successful implementation and our analysis has distilled key learning points, conceptualised as a set of propositions, we consider relevant to implementing other complex and contentious interventions. PMID:28988176
Implementing case study methodology in critical care nursing: a discourse analysis.
Henning, John E; Nielsen, Lynn E; Hauschildt, James A
2006-01-01
The authors provide a description of the classroom interactions as one nursing education professor transformed his teaching from a lecture format to a case study approach. This description serves as a road map for nursing educators who are interested in making the transition to a case study approach by showing how, when, and to what degree they can maximize both student participation and content acquisition.
User Guide for GoldSim Model to Calculate PA/CA Doses and Limits
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, F.
2016-10-31
A model to calculate doses for solid waste disposal at the Savannah River Site (SRS) and corresponding disposal limits has been developed using the GoldSim commercial software. The model implements the dose calculations documented in SRNL-STI-2015-00056, Rev. 0 “Dose Calculation Methodology and Data for Solid Waste Performance Assessment (PA) and Composite Analysis (CA) at the Savannah River Site”.
ERIC Educational Resources Information Center
Fedotova, Olga; Ermakov, Pavel; Latun, Vladimir; Hovhannisyan, Haykaz; Avanesyan, Grant
2017-01-01
The article analyzes the transformation of the methodological toolkit for teaching humanities and sciences in the Russian Federation. The case study method, which is widespread in modern higher education research, is used as an example to illustrate the attempts to implement the best practices of foreign educational technology into tertiary…
A System Approach to Navy Medical Education and Training. Appendix 15. Biotronics Technicians.
1974-08-31
curricula based upon job analysis was implemented to a level of methodology determination. These methods and curriculum materials constituted a third...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ward, Lee H.; Laros, James H., III
This paper describes a methodology for implementing disk-less cluster systems using the Network File System (NFS) that scales to thousands of nodes. This method has been successfully deployed and is currently in use on several production systems at Sandia National Labs. This paper will outline our methodology and implementation, discuss hardware and software considerations in detail and present cluster configurations with performance numbers for various management operations like booting.
ERIC Educational Resources Information Center
Gammage, David T.
2008-01-01
Purpose: The purpose of this paper is to explore how the process of implementation of school-based management (SBM) has worked within the public school systems in the Australian Capital Territory (ACT) and Victoria in Australia. The period covered was 1976-2006. Design/methodology/approach: The approach adopted was the mixed methodology which…
Space station operating system study
NASA Technical Reports Server (NTRS)
Horn, Albert E.; Harwell, Morris C.
1988-01-01
The current phase of the Space Station Operating System study is based on the analysis, evaluation, and comparison of the operating systems implemented on the computer systems and workstations in the software development laboratory. Primary emphasis has been placed on the DEC MicroVMS operating system as implemented on the MicroVax II computer, with comparative analysis of the SUN UNIX system on the SUN 3/260 workstation computer, and to a limited extent, the IBM PC/AT microcomputer running PC-DOS. Some benchmark development and testing was also done for the Motorola MC68010 (VM03 system) before the system was taken from the laboratory. These systems were studied with the objective of determining their capability to support Space Station software development requirements, specifically for multi-tasking and real-time applications. The methodology utilized consisted of development, execution, and analysis of benchmark programs and test software, and the experimentation and analysis of specific features of the system or compilers in the study.
Environmental sustainability in European public healthcare.
Chiarini, Andrea; Vagnoni, Emidia
2016-01-01
Purpose - The purpose of this paper is to enlarge the debate concerning the influence of leadership on environmental sustainability implementation in European public healthcare organisations. Design/methodology/approach - This paper is a viewpoint. It is based on preliminary analysis of European standards dedicated to environmental sustainability and their spread across Europe in public healthcare organisations. Viewpoints concerning leadership are then discussed and asserted. Findings - This paper found a limited implementation of standards such as Green Public Procurement criteria, Eco-Management and Audit Scheme and ISO 14001 in public healthcare. Some clues indicate that the lack of implementation is related to leadership and management commitment. Originality/value - For the first time, this paper investigates relationships between leadership and environmental sustainability in European public healthcare opening further avenues of research on the subject.
Load leveling on industrial refrigeration systems
NASA Astrophysics Data System (ADS)
Bierenbaum, H. S.; Kraus, A. D.
1982-01-01
A computer model was constructed of a brewery with a 2000-horsepower compressor/refrigeration system. The various conservation and load-management options were simulated using the validated model, and the savings available from implementing the most promising options were verified by trials in the brewery. Results show that an optimized methodology for implementing load leveling and energy conservation consists of: (1) adjusting (tuning) refrigeration system controller variables to minimize unnecessary compressor starts; (2) carefully controlling (modulating) the primary refrigeration system operating parameters, compressor suction pressure and discharge pressure, to satisfy product quality constraints as well as in-process material cooling rates and temperature levels; (3) evaluating the energy cost savings associated with reject heat recovery; and (4) deciding whether to implement the reject heat recovery system based on a cost/benefit analysis.
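Step (4) is a plain cost/benefit screen; a minimal simple-payback sketch, with invented figures rather than the brewery's actual costs, is:

    def simple_payback(capital_cost, annual_savings, annual_om=0.0):
        """Years to recover the reject-heat-recovery investment."""
        net = annual_savings - annual_om
        return float("inf") if net <= 0 else capital_cost / net

    # hypothetical figures for a 2000-horsepower compressor plant
    years = simple_payback(capital_cost=250_000, annual_savings=90_000,
                           annual_om=10_000)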
Ezenwa, Miriam O; Suarez, Marie L; Carrasco, Jesus D; Hipp, Theresa; Gill, Anayza; Miller, Jacob; Shea, Robert; Shuey, David; Zhao, Zhongsheng; Angulo, Veronica; McCurry, Timothy; Martin, Joanna; Yao, Yingwei; Molokie, Robert E; Wang, Zaijie Jim; Wilkie, Diana J
2017-07-01
The purpose of this article is to describe how we adhere to the Patient-Centered Outcomes Research Institute's (PCORI) methodology standards relevant to the design and implementation of our PCORI-funded study, the PAIN RelieveIt Trial. We present details of the PAIN RelieveIt Trial organized by the PCORI methodology standards and components that are relevant to our study. The PAIN RelieveIt Trial adheres to four PCORI standards and 21 subsumed components. The four standards include standards for formulating research questions, standards associated with patient centeredness, standards for data integrity and rigorous analyses, and standards for preventing and handling missing data. In the past 24 months, we screened 2,837 cancer patients and their caregivers; 874 dyads were eligible; 223.5 dyads consented and provided baseline data. Only 55 patients were lost to follow-up, a 25% attrition rate. The design and implementation of the PAIN RelieveIt Trial adhered to PCORI's methodology standards for research rigor.
Pillai, Goonaseelan Colin; Mentré, France; Steimer, Jean-Louis
2005-04-01
Few scientific contributions have made significant impact unless there was a champion who had the vision to see the potential for their use in seemingly disparate areas, and who then drove active implementation. In this paper, we present a historical summary of the development of non-linear mixed effects (NLME) modeling, up to the more recent extensions of this statistical methodology. The paper places strong emphasis on the pivotal role played by Lewis B. Sheiner (1940-2004), who used this statistical methodology to elucidate solutions to real problems identified in clinical practice and in medical research, and on how he drove implementation of the proposed solutions. A succinct overview of the evolution of the NLME modeling methodology is presented, as well as ideas on how its expansion helped to provide guidance for a more scientific view of (model-based) drug development that reduces empiricism in favor of critical quantitative thinking and decision making.
NASA Astrophysics Data System (ADS)
Dutta, Sekhar Chandra; Chakroborty, Suvonkar; Raychaudhuri, Anusrita
Vibration transmitted to a structure during an earthquake may vary in magnitude over a wide range. Design methodology should, therefore, enumerate steps so that structures are able to survive even severe ground motion. However, for economic reasons, strength can be provided to structures in such a way that the structure remains in the elastic range in low-to-moderate earthquakes and is allowed to undergo inelastic deformation in a severe earthquake without collapse. Implementing this design philosophy requires a rigorous nonlinear dynamic analysis to estimate the inelastic demands, which is time-consuming and requires expertise to judge the results. In this context, the present paper discusses and demonstrates an alternative simple method, known as the pushover method, which can be easily used by practicing engineers, bypassing intricate nonlinear dynamic analysis, and which can be thought of as a substitute for the latter. This method is still under development and is increasingly popular for its simplicity. The objective of this paper is to emphasize and demonstrate the basic concept, strength, and ease of this state-of-the-art methodology for regular use in design offices for performance-based seismic design of structures.
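The essence of the pushover method is the capacity curve: base shear versus roof displacement under a monotonically increasing lateral load. A minimal sketch for an idealized bilinear single-degree-of-freedom system follows; the stiffness and strength values are illustrative, not taken from the paper.

    import numpy as np

    def pushover_curve(k_elastic, f_yield, hardening_ratio, d_max, n=200):
        """Base shear vs displacement for an idealized bilinear system."""
        d = np.linspace(0.0, d_max, n)
        d_y = f_yield / k_elastic                 # yield displacement
        k2 = hardening_ratio * k_elastic          # post-yield stiffness
        v = np.where(d <= d_y, k_elastic * d, f_yield + k2 * (d - d_y))
        return d, v

    # k = 2e7 N/m, Vy = 4e5 N -> yields at 20 mm, pushed to 300 mm
    d, v = pushover_curve(k_elastic=2e7, f_yield=4e5,
                          hardening_ratio=0.05, d_max=0.3)

The performance point is then found by intersecting this capacity curve with a demand spectrum for the design earthquake.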
NASA Technical Reports Server (NTRS)
Scott, Elaine P.
1996-01-01
A thermal stress analysis is an important aspect in the design of aerospace structures and vehicles such as the High Speed Civil Transport (HSCT) at the National Aeronautics and Space Administration Langley Research Center (NASA-LaRC). These structures are complex and are often composed of numerous components fabricated from a variety of different materials. The thermal loads on these structures induce temperature variations within the structure, which in turn result in the development of thermal stresses. Therefore, a thermal stress analysis requires knowledge of the temperature distributions within the structures which consequently necessitates the need for accurate knowledge of the thermal properties, boundary conditions and thermal interface conditions associated with the structural materials. The goal of this proposed multi-year research effort was to develop estimation methodologies for the determination of the thermal properties and interface conditions associated with aerospace vehicles. Specific objectives focused on the development and implementation of optimal experimental design strategies and methodologies for the estimation of thermal properties associated with simple composite and honeycomb structures. The strategy used in this multi-year research effort was to first develop methodologies for relatively simple systems and then systematically modify these methodologies to analyze complex structures. This can be thought of as a building block approach. This strategy was intended to promote maximum usability of the resulting estimation procedure by NASA-LARC researchers through the design of in-house experimentation procedures and through the use of an existing general purpose finite element software.
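The estimation methodologies developed in this effort couple optimal experiment design with detailed thermal models; as a toy analog of the parameter-estimation step, the time constant of a lumped-capacitance cooling model can be recovered from noisy temperature data by nonlinear least squares. All values are synthetic.

    import numpy as np
    from scipy.optimize import curve_fit

    def cooling_model(t, tau, t_inf, t0):
        """Lumped-capacitance transient: T(t) = T_inf + (T0 - T_inf) exp(-t/tau)."""
        return t_inf + (t0 - t_inf) * np.exp(-t / tau)

    t = np.linspace(0.0, 600.0, 61)                       # seconds
    rng = np.random.default_rng(2)
    temp = cooling_model(t, 180.0, 25.0, 95.0) + rng.normal(0.0, 0.3, t.size)

    popt, pcov = curve_fit(cooling_model, t, temp, p0=(100.0, 20.0, 90.0))
    tau_hat = popt[0]   # tau = rho*c_p*V/(h*A); h follows if geometry is known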
Detection and Processing Techniques of FECG Signal for Fetal Monitoring
2009-01-01
Fetal electrocardiogram (FECG) signals contain potentially precise information that could assist clinicians in making more appropriate and timely decisions during labor. The ultimate reason for the interest in FECG signal analysis is its use in clinical diagnosis and biomedical applications. The extraction and detection of the FECG signal from composite abdominal signals with powerful and advanced methodologies are becoming very important requirements in fetal monitoring. The purpose of this review paper is to illustrate the various methodologies and developed algorithms for FECG signal detection and analysis, to provide efficient and effective ways of understanding the FECG signal and its nature for fetal monitoring. A comparative study has been carried out to show the performance and accuracy of various methods of FECG signal analysis for fetal monitoring. Finally, this paper surveys some of the hardware implementations that use electrical signals for monitoring the fetal heart rate. This paper opens a path for researchers, physicians, and end users toward an excellent understanding of the FECG signal and its analysis procedures for fetal heart rate monitoring systems. PMID:19495912
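Among the extraction methodologies such a review covers, adaptive noise cancellation is one of the simplest: a thoracic (maternal) reference channel is adaptively filtered and subtracted from the abdominal signal, leaving a fetal-dominated residual. A minimal LMS sketch, with illustrative tap count and step size, is:

    import numpy as np

    def lms_cancel(abdominal, maternal_ref, n_taps=16, mu=0.01):
        """Adaptively subtract the maternal ECG; the residual approximates FECG."""
        w = np.zeros(n_taps)
        out = np.zeros_like(abdominal)
        for n in range(n_taps, len(abdominal)):
            x = maternal_ref[n - n_taps:n][::-1]   # most recent samples first
            y = w @ x                              # estimate of maternal component
            e = abdominal[n] - y                   # error = fetal + noise residual
            w += 2.0 * mu * e * x                  # LMS weight update
            out[n] = e
        return out

Convergence depends on the step size mu relative to the reference signal power, which is why practical systems often use normalized LMS or RLS variants.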
Generalized causal mediation and path analysis: Extensions and practical considerations.
Albert, Jeffrey M; Cho, Jang Ik; Liu, Yiying; Nelson, Suchitra
2018-01-01
Causal mediation analysis seeks to decompose the effect of a treatment or exposure among multiple possible paths and provide causally interpretable path-specific effect estimates. Recent advances have extended causal mediation analysis to situations with a sequence of mediators or multiple contemporaneous mediators. However, available methods still have limitations, and computational and other challenges remain. The present paper provides an extended causal mediation and path analysis methodology. The new method, implemented in the new R package gmediation (described in a companion paper), accommodates both a sequence (two stages) of mediators and multiple mediators at each stage, and allows for multiple types of outcomes following generalized linear models. The methodology can also handle unsaturated models and clustered data. Addressing other practical issues, we provide new guidelines for the choice of a decomposition, and for the choice of a reference group multiplier for the reduction of Monte Carlo error in mediation formula computations. The new method is applied to data from a cohort study to illuminate the contribution of alternative biological and behavioral paths in the effect of socioeconomic status on dental caries in adolescence.
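The gmediation package implements the general multi-stage, GLM-based formulation; the familiar single-mediator linear special case conveys the idea of a path-specific decomposition and can be sketched with ordinary least squares on simulated data (all coefficients are invented).

    import numpy as np

    rng = np.random.default_rng(3)
    n = 500
    x = rng.normal(size=n)                    # exposure, e.g. socioeconomic status
    m = 0.5 * x + rng.normal(size=n)          # mediator, e.g. a behavioral path
    y = 0.3 * m + 0.2 * x + rng.normal(size=n)

    def ols(design, response):
        return np.linalg.lstsq(design, response, rcond=None)[0]

    ones = np.ones(n)
    a = ols(np.column_stack([ones, x]), m)[1]               # x -> m
    b, c_prime = ols(np.column_stack([ones, m, x]), y)[1:]  # m -> y and direct

    indirect = a * b           # mediated (path-specific) effect, linear case
    total = indirect + c_prime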
Landslide hazard assessment : LIFE+IMAGINE project methodology and Liguria region use case
NASA Astrophysics Data System (ADS)
Spizzichino, Daniele; Campo, Valentina; Congi, Maria Pia; Cipolloni, Carlo; Delmonaco, Giuseppe; Guerrieri, Luca; Iadanza, Carla; Leoni, Gabriele; Trigila, Alessandro
2015-04-01
The scope of this work is to present a methodology developed for the analysis of potential impacts in areas prone to landslide hazard in the framework of the EC project LIFE+IMAGINE. The project aims to implement a web-services-based infrastructure for environmental analysis that integrates, in its own architecture, specifications and results from INSPIRE, SEIS and GMES. Existing web services have been customized to provide functionalities for supporting integrated environmental management. The implemented infrastructure has been applied to landslide risk scenarios, developed in selected pilot areas, aiming at: i) application of standard procedures to implement a landslide risk analysis; ii) definition of a procedure for assessment of potential environmental impacts, based on a set of indicators to estimate the different exposed elements with their specific vulnerability in the pilot area. The landslide pilot and related scenario are focused on providing a simplified Landslide Risk Assessment (LRA) through: 1) a landslide inventory derived from available historical and recent databases and maps; 2) landslide susceptibility and hazard maps; 3) assessment of exposure and vulnerability for selected typologies of elements at risk; 4) implementation of a landslide risk scenario for different sets of exposed elements; 5) development of a use case; 6) definition of guidelines and best practices and production of thematic maps. The LRA has been implemented in Liguria region, Italy, in two different catchment areas located in the Cinque Terre National Park, characterized by high landslide susceptibility and low resilience. The landslide risk impact analysis has been calibrated taking into account the socio-economic damage caused by landslides triggered by the October 2011 meteorological event. During this event, over 600 landslides were triggered in the selected pilot area. Most of the landslides affected the diffuse system of anthropogenic terraces, causing the direct disruption of the walls as well as the transport of a large amount of loose sediment along the slopes and channels as an induced consequence of the event. A spatial analysis detected ca. 400 critical points along the road network, with an average length of about 200 m. Over 1,000 buildings were affected and damaged by the event. The exposed population in the area involved by the event has been estimated at ca. 2,600 inhabitants. In the pilot area, 19 different typologies of Cultural Heritage were affected by landslide phenomena or located in zones classified as high landslide hazard. The final scope of the landslide scenario is to improve awareness of hazard, exposure, vulnerability and landslide risk in the Cinque Terre National Park to the benefit of local authorities and population. In addition, the results of the application will be used for i) updating the land planning process in order to improve the resilience of local communities, ii) implementing cost-benefit analysis aimed at the definition of guidelines for sustainable landslide risk mitigation strategies, and iii) suggesting a general road map for the implementation of a local adaptation plan.
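Underlying such a scenario is the standard decomposition of risk into hazard, vulnerability, and exposure; a schematic raster version, with toy 2x2 grids and invented values, is:

    import numpy as np

    # stand-in raster layers over the pilot catchment (values in [0, 1])
    hazard        = np.array([[0.8, 0.2], [0.5, 0.9]])  # susceptibility x trigger
    vulnerability = np.array([[0.6, 0.6], [0.3, 1.0]])  # element fragility
    exposure      = np.array([[2.0, 0.0], [1.0, 5.0]])  # value at stake (M EUR)

    risk = hazard * vulnerability * exposure            # expected loss per cell
    print(risk.sum())                                   # aggregate scenario loss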
Biological Parametric Mapping: A Statistical Toolbox for Multi-Modality Brain Image Analysis
Casanova, Ramon; Ryali, Srikanth; Baer, Aaron; Laurienti, Paul J.; Burdette, Jonathan H.; Hayasaka, Satoru; Flowers, Lynn; Wood, Frank; Maldjian, Joseph A.
2006-01-01
In recent years multiple brain MR imaging modalities have emerged; however, analysis methodologies have mainly remained modality specific. In addition, when comparing across imaging modalities, most researchers have been forced to rely on simple region-of-interest type analyses, which do not allow the voxel-by-voxel comparisons necessary to answer more sophisticated neuroscience questions. To overcome these limitations, we developed a toolbox for multimodal image analysis called biological parametric mapping (BPM), based on a voxel-wise use of the general linear model. The BPM toolbox incorporates information obtained from other modalities as regressors in a voxel-wise analysis, thereby permitting investigation of more sophisticated hypotheses. The BPM toolbox has been developed in MATLAB with a user friendly interface for performing analyses, including voxel-wise multimodal correlation, ANCOVA, and multiple regression. It has a high degree of integration with the SPM (statistical parametric mapping) software relying on it for visualization and statistical inference. Furthermore, statistical inference for a correlation field, rather than a widely-used T-field, has been implemented in the correlation analysis for more accurate results. An example with in-vivo data is presented demonstrating the potential of the BPM methodology as a tool for multimodal image analysis. PMID:17070709
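The heart of BPM is fitting a model at every voxel with another modality's values as regressors; a stripped-down numpy sketch of the voxel-wise correlation variant, with random stand-in data and no spatial inference, is shown below.

    import numpy as np

    rng = np.random.default_rng(4)
    n_subj, n_vox = 30, 1000
    modality_a = rng.standard_normal((n_subj, n_vox))  # e.g. fMRI contrast maps
    modality_b = rng.standard_normal((n_subj, n_vox))  # e.g. co-registered VBM maps

    # voxel-wise Pearson correlation between modalities across subjects
    za = (modality_a - modality_a.mean(0)) / modality_a.std(0)
    zb = (modality_b - modality_b.mean(0)) / modality_b.std(0)
    r = (za * zb).mean(0)                              # one r value per voxel

    # convert to t statistics for inference (df = n_subj - 2)
    t = r * np.sqrt((n_subj - 2) / (1.0 - r ** 2))

BPM's contribution is precisely that inference is then carried out on the correlation field itself rather than on an approximating T-field.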
The use of geospatial web services for exchanging utilities data
NASA Astrophysics Data System (ADS)
Kuczyńska, Joanna
2013-04-01
Geographic information technologies and related geo-information systems currently play an important role in the management of public administration in Poland. One of these tasks is to maintain and update the Geodetic Evidence of Public Utilities (GESUT), part of the National Geodetic and Cartographic Resource, which contains information on technical infrastructure that is important to many institutions. This requires an active exchange of data between the Geodesy and Cartography Documentation Centers and the institutions that administer transmission lines. The administrator of public utilities is legally obliged to provide information about utilities to GESUT. The aim of the research work was to develop a universal data exchange methodology that can be implemented on a variety of hardware and software platforms. This methodology uses the Unified Modeling Language (UML), eXtensible Markup Language (XML), and Geography Markup Language (GML). The proposed methodology is based on two different strategies: Model Driven Architecture (MDA) and Service Oriented Architecture (SOA). The solutions used are consistent with the INSPIRE Directive and the ISO 19100 series of standards for geographic information. On the basis of an analysis of the input data structures, conceptual models were built for both databases and written in UML. A combined model that defines a common data structure was also built. This model was transformed into GML, the standard developed for the exchange of geographic information. The structure of the document describing the data that may be exchanged is defined in an .xsd file. Network services were selected and implemented in a system designed for data exchange based on open-source tools. The methodology was implemented and tested. Data in the agreed data structure, together with metadata, were set up on the server. Data access was provided by geospatial network services: data discovery through the Catalogue Service for the Web (CSW) and data collection through the Web Feature Service (WFS). WFS also provides operations for modifying data, for example updates by the utility administrator. The proposed solution significantly increases the efficiency of data exchange and facilitates maintenance of the National Geodetic and Cartographic Resource.
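For the data-collection service, a WFS GetFeature request uses the standard OGC key-value parameters; the endpoint and feature type below are hypothetical placeholders, not the actual GESUT service.

    import requests

    WFS_URL = "https://example.gov.pl/geoserver/wfs"   # hypothetical endpoint

    params = {
        "service": "WFS",
        "version": "1.1.0",
        "request": "GetFeature",
        "typeName": "gesut:utility_line",              # hypothetical feature type
        "outputFormat": "text/xml; subtype=gml/3.1.1",
        "maxFeatures": 100,
    }
    response = requests.get(WFS_URL, params=params, timeout=30)
    gml = response.text                                # GML payload for parsing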
Chen, Gang; Adleman, Nancy E; Saad, Ziad S; Leibenluft, Ellen; Cox, Robert W
2014-10-01
All neuroimaging packages can handle group analysis with t-tests or general linear modeling (GLM). However, they are quite hamstrung when there are multiple within-subject factors or when quantitative covariates are involved in the presence of a within-subject factor. In addition, sphericity is typically assumed for the variance-covariance structure when there are more than two levels in a within-subject factor. To overcome such limitations in the traditional AN(C)OVA and GLM, we adopt a multivariate modeling (MVM) approach to analyzing neuroimaging data at the group level with the following advantages: a) there is no limit on the number of factors as long as sample sizes are deemed appropriate; b) quantitative covariates can be analyzed together with within-subject factors; c) when a within-subject factor is involved, three testing methodologies are provided: traditional univariate testing (UVT) with sphericity assumption (UVT-UC) and with correction when the assumption is violated (UVT-SC), and within-subject multivariate testing (MVT-WS); d) to correct for sphericity violation at the voxel level, we propose a hybrid testing (HT) approach that achieves equal or higher power via combining traditional sphericity correction methods (Greenhouse-Geisser and Huynh-Feldt) with MVT-WS. To validate the MVM methodology, we performed simulations to assess the controllability for false positives and power achievement. A real FMRI dataset was analyzed to demonstrate the capability of the MVM approach. The methodology has been implemented into an open source program 3dMVM in AFNI, and all the statistical tests can be performed through symbolic coding with variable names instead of the tedious process of dummy coding. Our data indicates that the severity of sphericity violation varies substantially across brain regions. The differences among various modeling methodologies were addressed through direct comparisons between the MVM approach and some of the GLM implementations in the field, and the following two issues were raised: a) the improper formulation of test statistics in some univariate GLM implementations when a within-subject factor is involved in a data structure with two or more factors, and b) the unjustified presumption of uniform sphericity violation and the practice of estimating the variance-covariance structure through pooling across brain regions. Published by Elsevier Inc.
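Of the correction machinery discussed, the Greenhouse-Geisser epsilon is compact enough to sketch: it is computed from the covariance of orthonormalized contrasts of the within-subject levels, equals 1 under exact sphericity, and scales the ANOVA degrees of freedom downward as the violation grows.

    import numpy as np

    def gg_epsilon(data):
        """Greenhouse-Geisser epsilon for one within-subject factor.
        data: (n_subjects, k_levels) array of repeated measures."""
        k = data.shape[1]
        S = np.cov(data, rowvar=False)                  # k x k covariance
        # orthonormal basis of the subspace orthogonal to the unit vector
        C = np.linalg.qr(np.eye(k) - 1.0 / k)[0][:, :k - 1]
        V = C.T @ S @ C                                 # (k-1) x (k-1)
        return np.trace(V) ** 2 / ((k - 1) * np.trace(V @ V))

    rng = np.random.default_rng(5)
    eps = gg_epsilon(rng.standard_normal((24, 4)))  # near 1: spherical by design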
Ciraj-Bjelac, Olivera; Faj, Dario; Stimac, Damir; Kosutic, Dusko; Arandjic, Danijela; Brkic, Hrvoje
2011-04-01
The purpose of this study is to investigate the need for, and the possible achievements of, a comprehensive QA programme, and to examine the effects of simple corrective actions on image quality in Croatia and Serbia. The paper focuses on activities related to the technical and radiological aspects of QA. The methodology consisted of two phases. The aim of the first phase was an initial assessment of mammography practice in terms of image quality, patient dose and equipment performance in a selected number of mammography units in Croatia and Serbia. Subsequently, corrective actions were suggested and implemented, and the same parameters were then re-assessed. Most of the suggested corrective actions were simple, low-cost and possible to implement immediately, as they related to working habits in the mammography units, such as film processing and darkroom conditions. It has been demonstrated how a simple quantitative assessment of image quality can be used for optimisation purposes. Analysis of image quality parameters such as OD, gradient and contrast demonstrated general similarities between mammography practices in Croatia and Serbia. The applied methodology should be expanded to a larger number of hospitals and applied on a regular basis. Copyright © 2009 Elsevier Ireland Ltd. All rights reserved.
78 FR 57128 - Census Advisory Committees; Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-17
... Other Populations (NAC). The Committee will address census policies, research and methodology, tests..., methodological, geographic, behavioral and operational variables affecting the cost, accuracy and implementation...
NASA Technical Reports Server (NTRS)
Haley, D. C.; Almand, B. J.; Thomas, M. M.; Krauze, L. D.; Gremban, K. D.; Sanborn, J. C.; Kelly, J. H.; Depkovich, T. M.
1984-01-01
A generic computer simulation for manipulator systems (ROBSIM) was implemented and the specific technologies necessary to increase the role of automation in various missions were developed. The specific items developed were: (1) Capability for definition of a manipulator system consisting of multiple arms, load objects, and an environment; (2) Capability for kinematic analysis, requirements analysis, and response simulation of manipulator motion; (3) Postprocessing options such as graphic replay of simulated motion and manipulator parameter plotting; (4) Investigation and simulation of various control methods including manual force/torque and active compliance control; (5) Evaluation and implementation of three obstacle avoidance methods; (6) Video simulation and edge detection; and (7) Software simulation validation. This appendix is the user's guide and includes examples of program runs and outputs as well as instructions for program use.
Arvanitoyannis, Ioannis S; Traikou, Athina
2005-01-01
The production of flour and semolina and their ensuing products, such as bread, cake, spaghetti, noodles, and corn flakes, is of major importance, because these products constitute some of the main ingredients of the human diet. The Hazard Analysis Critical Control Point (HACCP) system aims at ensuring the safety of these products. HACCP has been implemented within the frame of this study on various products of both Asian and European origin; the hazards, critical control limits (CCLs), observation practices, and corrective actions have been summarized in comprehensive tables. Furthermore, the various production steps, packaging included, were thoroughly analyzed, and reference was made to both the traditional and new methodologies in an attempt to pinpoint the occurring differences (advantages and disadvantages) per process.
Lifetime Reliability Prediction of Ceramic Structures Under Transient Thermomechanical Loads
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Jadaan, Osama J.; Gyekenyesi, John P.
2005-01-01
An analytical methodology is developed to predict the probability of survival (reliability) of ceramic components subjected to harsh thermomechanical loads that can vary with time (transient reliability analysis). This capability enables more accurate prediction of ceramic component integrity against fracture in situations such as turbine startup and shutdown, operational vibrations, atmospheric reentry, or other rapid heating or cooling situations (thermal shock). The transient reliability analysis methodology developed herein incorporates the following features: fast-fracture transient analysis (reliability analysis without slow crack growth, SCG); transient analysis with SCG (reliability analysis with time-dependent damage due to SCG); a computationally efficient algorithm to compute the reliability for components subjected to repeated transient loading (block loading); cyclic fatigue modeling using a combined SCG and Walker fatigue law; proof testing for transient loads; and Weibull and fatigue parameters that are allowed to vary with temperature or time. Component-to-component variation in strength (stochastic strength response) is accounted for with the Weibull distribution, and either the principle of independent action or the Batdorf theory is used to predict the effect of multiaxial stresses on reliability. The reliability analysis can be performed either as a function of the component surface (for surface-distributed flaws) or component volume (for volume-distributed flaws). The transient reliability analysis capability has been added to the NASA CARES/Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code. CARES/Life was also updated to interface with commercially available finite element analysis software, such as ANSYS, when used to model the effects of transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.
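As a toy numerical aside (not the CARES/Life algorithm itself), the two-parameter Weibull model mentioned above gives, for a uniformly stressed unit volume, a survival probability P_s = exp(-(sigma/sigma_0)^m); the parameter values below are invented purely for illustration.

```r
# Survival probability under a two-parameter Weibull strength model,
# uniform stress, unit volume; parameters are hypothetical.
m      <- 10      # Weibull modulus (scatter of strength)
sigma0 <- 400     # characteristic strength, MPa
sigma  <- seq(100, 500, by = 50)   # applied stress levels, MPa

Ps <- exp(-(sigma / sigma0)^m)     # probability of survival at each stress
round(data.frame(sigma_MPa = sigma, P_survival = Ps), 4)
```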
Mestdagh, Inge; Bonicelli, Bernard; Laplana, Ramon; Roettele, Manfred
2009-01-01
Based on the results and lessons learned from the TOPPS project (Training the Operators to prevent Pollution from Point Sources), a proposal for a sustainable strategy to avoid point-source pollution from Plant Protection Products (PPPs) was made. Within the TOPPS project (2005-2008), stakeholders were interviewed and research and analysis were carried out in 6 pilot catchment areas (BE, FR, DE, DK, IT, PL). A repeated survey of operators' perceptions and opinions measured changes resulting from TOPPS activities, and good and bad practices were defined based on the Best Management Practices (risk analysis). The aim of the proposal is to suggest a strategy, taking into account the differences between countries, that can be implemented at Member State level in order to avoid PPP pollution of water through point sources. The methodology used for the up-scaling proposal consists of an analysis of the current situation, a gap analysis, a consistency analysis and organisational structures for implementation. The up-scaling proposal focuses on the behaviour of operators and on the equipment and infrastructure available to them. The proposal defines implementation structures to support correct behaviour through the development and updating of Best Management Practices (BMPs) and through the transfer and implementation of these BMPs. It also defines requirements for the improvement of equipment and infrastructure based on the identified key factors related to point-source pollution, and contains cost estimates for the technical and infrastructure upgrades needed to comply with BMPs.
34 CFR 75.210 - General selection criteria.
Code of Federal Regulations, 2011 CFR
2011-07-01
... for research activities, and the use of appropriate theoretical and methodological tools, including... project implementation, and the use of appropriate methodological tools to ensure successful achievement...
34 CFR 75.210 - General selection criteria.
Code of Federal Regulations, 2014 CFR
2014-07-01
... for research activities, and the use of appropriate theoretical and methodological tools, including... project implementation, and the use of appropriate methodological tools to ensure successful achievement...
34 CFR 75.210 - General selection criteria.
Code of Federal Regulations, 2012 CFR
2012-07-01
... for research activities, and the use of appropriate theoretical and methodological tools, including... project implementation, and the use of appropriate methodological tools to ensure successful achievement...
34 CFR 75.210 - General selection criteria.
Code of Federal Regulations, 2010 CFR
2010-07-01
... for research activities, and the use of appropriate theoretical and methodological tools, including... project implementation, and the use of appropriate methodological tools to ensure successful achievement...
34 CFR 75.210 - General selection criteria.
Code of Federal Regulations, 2013 CFR
2013-07-01
... for research activities, and the use of appropriate theoretical and methodological tools, including... project implementation, and the use of appropriate methodological tools to ensure successful achievement...
Chadeau-Hyam, Marc; Campanella, Gianluca; Jombart, Thibaut; Bottolo, Leonardo; Portengen, Lutzen; Vineis, Paolo; Liquet, Benoit; Vermeulen, Roel C H
2013-08-01
Recent technological advances in molecular biology have given rise to numerous large-scale datasets whose analysis imposes serious methodological challenges mainly relating to the size and complex structure of the data. Considerable experience in analyzing such data has been gained over the past decade, mainly in genetics, from the Genome-Wide Association Study era, and more recently in transcriptomics and metabolomics. Building upon the corresponding literature, we provide here a nontechnical overview of well-established methods used to analyze OMICS data within three main types of regression-based approaches: univariate models including multiple testing correction strategies, dimension reduction techniques, and variable selection models. Our methodological description focuses on methods for which ready-to-use implementations are available. We describe the main underlying assumptions, the main features, and advantages and limitations of each of the models. This descriptive summary constitutes a useful tool for driving methodological choices while analyzing OMICS data, especially in environmental epidemiology, where the emergence of the exposome concept clearly calls for unified methods to analyze marginally and jointly complex exposure and OMICS datasets. Copyright © 2013 Wiley Periodicals, Inc.
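A compact sketch of the three families of approaches described above, on simulated OMICS-like data; glmnet stands in here for the variable-selection class, and all numbers are synthetic.

```r
# 1000 features, 100 samples; outcome driven by feature 1 only.
library(glmnet)

set.seed(42)
n <- 100; p <- 1000
X <- matrix(rnorm(n * p), n, p)
y <- 0.8 * X[, 1] + rnorm(n)

# 1) Univariate models with multiple-testing correction (Benjamini-Hochberg)
pvals <- apply(X, 2, function(x) summary(lm(y ~ x))$coefficients[2, 4])
hits  <- which(p.adjust(pvals, method = "BH") < 0.05)

# 2) Dimension reduction: regress the outcome on leading principal components
pcs     <- prcomp(X, scale. = TRUE)$x[, 1:5]
fit_pcr <- lm(y ~ pcs)

# 3) Variable selection: lasso with cross-validated penalty
fit_lasso <- cv.glmnet(X, y, alpha = 1)
selected  <- which(coef(fit_lasso, s = "lambda.min")[-1] != 0)
```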
Report on FY17 testing in support of integrated EPP-SMT design methods development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Yanli .; Jetter, Robert I.; Sham, T. -L.
The goal of the proposed integrated Elastic Perfectly-Plastic (EPP) and Simplified Model Test (SMT) methodology is to incorporate an SMT data-based approach for creep-fatigue damage evaluation into the EPP methodology, to avoid the separate evaluation of creep and fatigue damage and eliminate the requirement for stress classification in current methods, thus greatly simplifying the evaluation of elevated-temperature cyclic service. The purpose of this methodology is to minimize over-conservatism while properly accounting for localized defects and stress risers. To support the implementation of the proposed methodology and to verify the applicability of the code rules, thermomechanical tests continued in FY17. This report presents the recent test results for Type 1 SMT specimens on Alloy 617 with long hold times, pressurization SMT on Alloy 617, and two-bar thermal ratcheting tests on SS316H over the temperature range of 405 °C to 705 °C. Preliminary EPP strain range analyses of the two-bar tests are critically evaluated and compared with the experimental results.
Salmos, Janaina; Gerbi, Marleny E M M; Braz, Rodivan; Andrade, Emanuel S S; Vasconcelos, Belmiro C E; Bessa-Nogueira, Ricardo V
2010-01-01
The purpose of this study was to identify systematic reviews (SRs) that compared laser with other dental restorative procedures and to evaluate their methodological quality. A search strategy was developed and implemented for MEDLINE, the Cochrane Library, LILACS, and the Brazilian Dentistry Bibliography (1966-2007). Inclusion criteria were: the article had to be an SR (+/- meta-analysis); primary focus was the use of laser in restorative dentistry; published in English, Spanish, Portuguese, Italian, or German. Two investigators independently selected and evaluated the SRs. The overview quality assessment questionnaire (OQAQ) was used to evaluate methodological quality, and the results were averaged. There were 145 references identified, of which seven were SRs that met the inclusion criteria (kappa=0.81). Of the SRs, 71.4% appraised lasers in dental caries diagnosis. The mean overall OQAQ score was 4.4 [95% confidence interval (CI) 2.4-6.5]. Of the SRs, 57.1% had major flaws, scoring ≤ 4. SR methodological quality is low; therefore, clinicians should critically appraise them prior to considering their recommendations to guide patient care.
NASA Technical Reports Server (NTRS)
Allen, Cheryl L.
1991-01-01
Enhanced engineering tools can be obtained through the integration of expert system methodologies and existing design software. The application of these methodologies to the spacecraft design and cost model (SDCM) software provides an improved technique for the selection of hardware for unmanned spacecraft subsystem design. The knowledge engineering system (KES) expert system development tool was used to implement a smarter equipment selection algorithm than is currently achievable through the use of a standard database system. The guidance, navigation, and control subsystem of the SDCM software was chosen as the initial subsystem for implementation. The portions of the SDCM code which compute the selection criteria and constraints remain intact, and the expert system equipment selection algorithm is embedded within this existing code. The architecture of this new methodology is described and its implementation is reported. The project background and a brief overview of the expert system are described, and once the details of the design are characterized, an example of its implementation is demonstrated.
Henrard, S; Speybroeck, N; Hermans, C
2015-11-01
Haemophilia is a rare genetic haemorrhagic disease characterized by partial or complete deficiency of coagulation factor VIII (haemophilia A) or IX (haemophilia B). As in any other medical research domain, the field of haemophilia research is increasingly concerned with finding factors associated with binary or continuous outcomes through multivariable models. Traditional models include multiple logistic regression, for binary outcomes, and multiple linear regression for continuous outcomes. Yet these regression models are at times difficult to implement, especially for non-statisticians, and can be difficult to interpret. The present paper sought to didactically explain how, why, and when to use classification and regression tree (CART) analysis for haemophilia research. The CART method, developed by Breiman in 1984, is non-parametric and non-linear, and is based on the repeated partitioning of a sample into subgroups according to a certain criterion. Classification trees (CTs) are used to analyse categorical outcomes and regression trees (RTs) continuous ones. The CART methodology has become increasingly popular in the medical field, yet only a few studies using it specifically in haemophilia have been published to date. Two previously published examples using CART analysis in this field are explained in detail. There is increasing interest in using CART analysis in the health domain, primarily owing to its ease of implementation, use, and interpretation, which facilitates medical decision-making. This method should be promoted for analysing continuous or categorical outcomes in haemophilia, when applicable. © 2015 John Wiley & Sons Ltd.
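For readers who want to try CART directly, the sketch below uses rpart, a standard R implementation of the method; the variables (age and a factor VIII level) and the effect sizes are invented for illustration, not taken from the studies discussed above.

```r
# Classification tree on simulated haemophilia-like data.
library(rpart)

set.seed(7)
age   <- rnorm(200, 40, 12)
fviii <- runif(200, 0, 50)                       # hypothetical factor VIII level (%)
p     <- plogis(-3 + 0.09 * age - 0.08 * fviii)  # risk rises with age, falls with FVIII
bleed <- factor(rbinom(200, 1, p), labels = c("no", "yes"))
d     <- data.frame(bleed, age, fviii)

# Repeated binary partitioning of the sample, as described above
ct <- rpart(bleed ~ age + fviii, data = d, method = "class")
printcp(ct)   # complexity-parameter table used for pruning
print(ct)     # text rendering of the fitted tree
```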
Energy efficiency analysis and implementation of AES on an FPGA
NASA Astrophysics Data System (ADS)
Kenney, David
The Advanced Encryption Standard (AES) was developed by Joan Daemen and Vincent Rijmen and endorsed by the National Institute of Standards and Technology in 2001. It was designed to replace the aging Data Encryption Standard (DES) and be useful for a wide range of applications with varying throughput, area, power dissipation and energy consumption requirements. Field Programmable Gate Arrays (FPGAs) are flexible and reconfigurable integrated circuits that are useful for many different applications including the implementation of AES. Though they are highly flexible, FPGAs are often less efficient than Application Specific Integrated Circuits (ASICs); they tend to operate slower, take up more space and dissipate more power. There have been many FPGA AES implementations that focus on obtaining high throughput or low area usage, but very little research has been done in the area of low-power or energy-efficient FPGA-based AES; in fact, it is rare for estimates on power dissipation to be made at all. This thesis presents a methodology to evaluate the energy efficiency of FPGA-based AES designs and proposes a novel FPGA AES implementation which is highly flexible and energy efficient. The proposed methodology is implemented as part of a novel scripting tool, the AES Energy Analyzer, which is able to fully characterize the power dissipation and energy efficiency of FPGA-based AES designs. Additionally, this thesis introduces a new FPGA power reduction technique called Opportunistic Combinational Operand Gating (OCOG) which is used in the proposed energy-efficient implementation. The AES Energy Analyzer was able to estimate the power dissipation and energy efficiency of the proposed AES design during its most commonly performed operations. It was found that the proposed implementation consumes less energy per operation than any previous FPGA-based AES implementation that included power estimations. Finally, the use of Opportunistic Combinational Operand Gating on an AES cipher was found to reduce its dynamic power consumption by up to 17% when compared to an identical design that did not employ the technique.
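The energy-efficiency metric such an analysis characterises reduces to simple arithmetic; the sketch below computes energy per encrypted block from a power estimate and throughput, with all figures invented rather than taken from the thesis.

```r
# Energy per AES block = power / throughput; all numbers are hypothetical.
power_w          <- 0.35     # estimated dynamic power, watts
clock_hz         <- 100e6    # clock frequency
cycles_per_block <- 11       # assumed cycles per 128-bit block

blocks_per_sec <- clock_hz / cycles_per_block
energy_nj      <- power_w / blocks_per_sec * 1e9   # nanojoules per block
energy_nj                                          # ~38.5 nJ for these inputs
```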
Parker, R David; Regier, Michael; Brown, Zachary; Davis, Stephen
2015-02-01
Homelessness is a primary concern for community health. The scientific literature on homelessness is wide-ranging and diverse. One opportunity to add to the existing literature is the development and testing of affordable, easily implemented methods for measuring the impact of homelessness on the healthcare system. Such methodological approaches rely on the strengths of a multidisciplinary approach that includes healthcare and homeless-service providers and applied clinical researchers. This paper is a proof of concept for a methodology that is easily adaptable nationwide, given the mandated implementation of homeless management information systems in the United States and other countries, the medical billing systems used by hospitals, and established research methods. Adaptation is independent of geographic region, budget constraints, specific agency skill sets, and many other factors that affect the application of a consistent, methodological, science-based approach to assessing and addressing homelessness. We conducted a secondary data analysis merging homeless service utilization data with hospital case-based data. These data detailed care utilization among homeless persons in a small Appalachian city in the United States. In our sample of 269 persons who received at least one hospital-based service and one homeless service between July 1, 2012 and June 30, 2013, the total billed costs were $5,979,463, with 10 people accounting for more than one-third ($1,957,469) of the total. Those persons were primarily men, living in an emergency shelter, with pre-existing disabling conditions. We theorize that targeted services, including Housing First, would be an effective intervention; this is proposed for a future study.
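The core of such a secondary analysis is an identifier-level join between the two administrative datasets; the sketch below shows the shape of that merge in R, with hypothetical field names and values.

```r
# Toy homeless-service and hospital-billing records sharing a person identifier.
homeless <- data.frame(
  person_id = c(101, 102, 103),
  service   = c("emergency shelter", "outreach", "emergency shelter")
)
hospital <- data.frame(
  person_id = c(101, 101, 103),
  billed    = c(1250.00, 980.50, 402.75)
)

# Inner join keeps only people with at least one record in each system,
# the analogue of the 269-person sample described above.
merged <- merge(homeless, hospital, by = "person_id")
costs  <- aggregate(billed ~ person_id, data = merged, sum)
costs[order(-costs$billed), ]   # identify the high-cost tail
```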
Towards an Effective Management Strategy for Passive RFID Implementation
2004-08-01
After a decade of critical analysis, Operation Desert Storm has proven to be one of the swiftest... failure of information technology projects. This analysis will focus on published research and scholarly texts. Methodology: This research will...
Chen, Li-Ding; Lu, Yi-He; Tian, Hui-Ying; Shi, Qian
2007-03-01
Global ecological security becomes increasingly important with intensive human activities. The function of ecological security is influenced by human activities and, in return, the efficiency of human activities is affected by the patterns of regional ecological security. Since the 1990s, China has initiated the construction of key projects such as the Yangtze Three Gorges Dam, the Qinghai-Tibet Railway, the West-to-East Gas Pipeline, West-to-East Electricity Transmission, and the South-to-North Water Transfer. The interaction between these projects and regional ecological security has particularly attracted the attention of the Chinese government. It is important, not only for regional environmental protection but also for the smooth implementation of the various projects, to develop an ecological rehabilitation system and to design a regional ecological security pattern. This paper makes a systematic analysis of the types and characteristics of key project construction and their effects on the environment, and on this basis puts forward basic principles and a methodology for ecological rehabilitation and security pattern design in such construction. It is argued that the following issues should be addressed in the implementation of a key project: 1) analysis and evaluation of the current regional ecological environment; 2) evaluation of anthropogenic disturbances and their ecological risk; 3) regional ecological rehabilitation and security pattern design; 4) scenario analysis of the environmental benefits of the regional ecological security pattern; 5) re-optimization of the regional ecological system framework; and 6) establishment of a regional ecosystem management plan.
NASA Technical Reports Server (NTRS)
Zyla, L. V.
1979-01-01
The modifications necessary to give the Houston Operations Predictor/Estimator (HOPE) program the capability to solve for or consider vent forces in orbit determination are described. The model implemented for solving for vent forces is described, along with the integrator problems encountered. A summary derivation of the mathematical principles applicable to the solve/consider methodology is provided.
2011-09-20
optimal portfolio point on the efficient frontier, for example, Portfolio B on the chart in Figure A1. Then, by subsequently changing some of the ... optimized portfolio controlling for risk using the IRM methodology and tool suite. Results indicate that both rapid and incremental implementation...Results of the KVA and SD scenario analysis provided the financial information required to forecast an optimized
Heuristic decomposition for non-hierarchic systems
NASA Technical Reports Server (NTRS)
Bloebaum, Christina L.; Hajela, P.
1991-01-01
Design and optimization is substantially more complex in multidisciplinary and large-scale engineering applications due to the existing inherently coupled interactions. The paper introduces a quasi-procedural methodology for multidisciplinary optimization that is applicable for nonhierarchic systems. The necessary decision-making support for the design process is provided by means of an embedded expert systems capability. The method employs a decomposition approach whose modularity allows for implementation of specialized methods for analysis and optimization within disciplines.
Van Deun, Jan; Hendrix, An
2017-01-01
The EV-TRACK knowledgebase is developed to cope with the need for transparency and rigour to increase reproducibility and facilitate standardization of extracellular vesicle (EV) research. The knowledgebase includes a checklist for authors and editors intended to improve the transparency of methodological aspects of EV experiments, allows queries and meta-analysis of EV experiments and keeps track of the current state of the art. Widespread implementation by the EV research community is key to its success.
ERIC Educational Resources Information Center
Queiroz, Fernanda Cristina Barbosa Pereira; Samohyl, Robert Wayne; Queiroz, Jamerson Viegas; Lima, Nilton Cesar; de Souza, Gustavo Henrique Silva
2014-01-01
This paper aims to develop and implement a method to identify the causes of course choice and the reasons for student dropout (evasion) in higher education. We sought to identify the factors that influence students' choice of the higher education institution analysed, as well as the factors influencing their dropout. The methodology employed was…
2005-03-01
ethnography, grounded theory, phenomenological, case study, and content analysis. As ethnography is based upon a longitudinal study in...a qualitative methodology consisting of a case study strategy is warranted for this research project. Yin (2003) lists five components of research ...systems. Journal of End User Computing, 12(3), 14. Yin, R. K. (2003). Case Study Research: Design and
Development of an Aerothermoelastic-Acoustics Simulation Capability for Flight Vehicles
NASA Technical Reports Server (NTRS)
Gupta, K. K.; Choi, S. B.; Ibrahim, A.
2010-01-01
A novel numerical, finite element based analysis methodology is presented in this paper suitable for accurate and efficient simulation of practical, complex flight vehicles. An associated computer code, developed in this connection, is also described in some detail. Thermal effects of high speed flow obtained from a heat conduction analysis are incorporated in the modal analysis which in turn affects the unsteady flow arising out of interaction of elastic structures with the air. Numerical examples pertaining to representative problems are given in much detail testifying to the efficacy of the advocated techniques. This is a unique implementation of temperature effects in a finite element CFD based multidisciplinary simulation analysis capability involving large scale computations.
Methodological reporting of randomized trials in five leading Chinese nursing journals.
Shi, Chunhu; Tian, Jinhui; Ren, Dan; Wei, Hongli; Zhang, Lihuan; Wang, Quan; Yang, Kehu
2014-01-01
Randomized controlled trials (RCTs) are not always well reported, especially in terms of their methodological descriptions. This study aimed to investigate the adherence of methodological reporting complying with CONSORT and explore associated trial level variables in the Chinese nursing care field. In June 2012, we identified RCTs published in five leading Chinese nursing journals and included trials with details of randomized methods. The quality of methodological reporting was measured through the methods section of the CONSORT checklist and the overall CONSORT methodological items score was calculated and expressed as a percentage. Meanwhile, we hypothesized that some general and methodological characteristics were associated with reporting quality and conducted a regression with these data to explore the correlation. The descriptive and regression statistics were calculated via SPSS 13.0. In total, 680 RCTs were included. The overall CONSORT methodological items score was 6.34 ± 0.97 (Mean ± SD). No RCT reported descriptions and changes in "trial design," changes in "outcomes" and "implementation," or descriptions of the similarity of interventions for "blinding." Poor reporting was found in detailing the "settings of participants" (13.1%), "type of randomization sequence generation" (1.8%), calculation methods of "sample size" (0.4%), explanation of any interim analyses and stopping guidelines for "sample size" (0.3%), "allocation concealment mechanism" (0.3%), additional analyses in "statistical methods" (2.1%), and targeted subjects and methods of "blinding" (5.9%). More than 50% of trials described randomization sequence generation, the eligibility criteria of "participants," "interventions," and definitions of the "outcomes" and "statistical methods." The regression analysis found that publication year and ITT analysis were weakly associated with CONSORT score. The completeness of methodological reporting of RCTs in the Chinese nursing care field is poor, especially with regard to the reporting of trial design, changes in outcomes, sample size calculation, allocation concealment, blinding, and statistical methods.
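The scoring-and-regression step can be sketched in a few lines; the data below are simulated only to match the reported mean and SD, and the count of 11 applicable checklist items is our assumption, made solely to show the percentage conversion.

```r
# Simulated trial-level data mirroring the reported summary statistics.
set.seed(11)
trials <- data.frame(
  year  = sample(2005:2012, 680, replace = TRUE),
  itt   = rbinom(680, 1, 0.2),                      # reported an ITT analysis?
  score = pmin(pmax(rnorm(680, 6.34, 0.97), 0), 11)
)
trials$pct <- 100 * trials$score / 11               # assumed 11 applicable items

# Regression of reporting quality on trial-level characteristics
fit <- lm(pct ~ year + itt, data = trials)
summary(fit)   # weak associations appear as small, borderline coefficients
```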
A Novel Approach to Rotorcraft Damage Tolerance
NASA Technical Reports Server (NTRS)
Forth, Scott C.; Everett, Richard A.; Newman, John A.
2002-01-01
Damage-tolerance methodology is positioned to replace safe-life methodologies for designing rotorcraft structures. The argument for implementing a damage-tolerance method comes from the fundamental fact that rotorcraft structures typically fail by fatigue cracking. Therefore, if technology permits prediction of fatigue-crack growth in structures, a damage-tolerance method should deliver the most accurate prediction of component life. Implementing damage-tolerance (DT) into high-cycle-fatigue (HCF) components will require a shift from traditional DT methods that rely on detecting an initial flaw with nondestructive inspection (NDI) methods. The rapid accumulation of cycles in a HCF component will result in a design based on a traditional DT method that is either impractical because of frequent inspections, or because the design will be too heavy to operate efficiently. Furthermore, once a HCF component develops a detectable propagating crack, the remaining fatigue life is short, sometimes less than one flight hour, which does not leave sufficient time for inspection. Therefore, designing a HCF component will require basing the life analysis on an initial flaw that is undetectable with current NDI technology.
López-Bolaños, Lizbeth; Campos-Rivera, Marisol; Villanueva-Borbolla, María Ángeles
2018-01-01
Objective. To reflect on the process of committing to participation in the implementation of a strategic health plan, using Participative Systematization of Social Experiences as a tool. Our study was a qualitative research-intervention study based on the Dialectical Methodological Conception approach. We designed and implemented a two-day workshop, six hours daily, using the Systematization methodology with a Community Work Group (CWG). During the workshop, women systematized their experience, with commitment as the axis of the process. Using Grounded Theory techniques, we applied micro-analysis to the data in order to identify and strengthen categories that emerged during the systematization process. We completed open and axial coding. The CWG identified that commitment and participation are influenced by group dynamics and structural determinants. They also reconsidered the way they understood and exercised commitment and participation, and generated knowledge, empowering them to improve their future practice. Commitment and participation were determined by group dynamics and structural factors such as socioeconomic conditions and gender roles. These determinants must be made visible and understood in order to generate proposals aimed at strengthening the participation and organization of groups.
Macedonia, Christian R; Johnson, Clark T; Rajapakse, Indika
2017-02-01
Technical advances in science have had broad implications in reproductive and women's health care. Recent innovations in population-level data collection and storage have made available an unprecedented amount of data for analysis while computational technology has evolved to permit processing of data previously thought too dense to study. "Big data" is a term used to describe data that are a combination of dramatically greater volume, complexity, and scale. The number of variables in typical big data research can readily be in the thousands, challenging the limits of traditional research methodologies. Regardless of what it is called, advanced data methods, predictive analytics, or big data, this unprecedented revolution in scientific exploration has the potential to dramatically assist research in obstetrics and gynecology broadly across subject matter. Before implementation of big data research methodologies, however, potential researchers and reviewers should be aware of strengths, strategies, study design methods, and potential pitfalls. Examination of big data research examples contained in this article provides insight into the potential and the limitations of this data science revolution and practical pathways for its useful implementation.
Repetitive deliberate fires: Development and validation of a methodology to detect series.
Bruenisholz, Eva; Delémont, Olivier; Ribaux, Olivier; Wilson-Wilde, Linzi
2017-08-01
The detection of repetitive deliberate fire events is challenging and still often ineffective due to a case-by-case approach. A previous study provided a critical review of the situation and analysis of the main challenges. This study suggested that the intelligence process, integrating forensic data, could be a valid framework to provide a follow-up and systematic analysis provided it is adapted to the specificities of repetitive deliberate fires. In this current manuscript, a specific methodology to detect deliberate fires series, i.e. set by the same perpetrators, is presented and validated. It is based on case profiles relying on specific elements previously identified. The method was validated using a dataset of approximately 8000 deliberate fire events collected over 12 years in a Swiss state. Twenty possible series were detected, including 6 of 9 known series. These results are very promising and lead the way to a systematic implementation of this methodology in an intelligence framework, whilst demonstrating the need and benefit of increasing the collection of forensic specific information to strengthen the value of links between cases. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Shaykhian, Gholam Ali; Baggs, Rhoda
2007-01-01
In the early problem-solution era of software programming, functional decomposition was mainly used to design and implement software solutions. In functional decomposition, functions and data are introduced as two separate entities during the design phase and are treated as such in the implementation phase. Functional decomposition makes use of refactoring by optimizing algorithms, grouping similar functionalities into common reusable functions, and using abstract representations of data where possible; all of this is done during the implementation phase. This paper advocates the use of object-oriented methodologies and design patterns as the centerpieces of refactoring software solutions. Refactoring software is a method of changing software design while explicitly preserving its external functionality. The combined use of object-oriented methodologies and design patterns to refactor should also benefit the overall software life-cycle cost through improved software.
Indications and Warning Methodology for Strategic Intelligence
2017-12-01
Indications and Warning Methodology for Strategic Intelligence, by Susann Kimmelman, December 2017. Thesis Co-Advisors: Robert Simeral and James Wirtz. The research found that, for homeland security, implementing a human-centric indications and warning methodology that focuses on the actor as...
NASA Astrophysics Data System (ADS)
Manzo, Ciro; Braga, Federica; Zaggia, Luca; Brando, Vittorio Ernesto; Giardino, Claudia; Bresciani, Mariano; Bassani, Cristiana
2018-04-01
This paper describes a procedure to perform spatio-temporal analysis of river plume dispersion in prodelta areas by multi-temporal Landsat-8-derived products for identifying zones sensitive to water discharge and for providing geostatistical patterns of turbidity linked to different meteo-marine forcings. In particular, we characterized the temporal and spatial variability of turbidity and sea surface temperature (SST) in the Po River prodelta (Northern Adriatic Sea, Italy) during the period 2013-2016. To perform this analysis, a two-pronged processing methodology was implemented and the resulting outputs were analysed through a series of statistical tools. A pixel-based spatial correlation analysis was carried out by comparing temporal curves of turbidity and SST hypercubes with in situ time series of wind speed and water discharge, providing correlation coefficient maps. A geostatistical analysis was performed to determine the spatial dependency of the turbidity datasets per each satellite image, providing maps of correlation and variograms. The results show a linear correlation between water discharge and turbidity variations in the points more affected by the buoyant plumes and along the southern coast of Po River delta. Better inverse correlation was found between turbidity and SST during floods rather than other periods. The correlation maps of wind speed with turbidity show different spatial patterns depending on local or basin-scale wind effects. Variogram maps identify different spatial anisotropy structures of turbidity in response to ambient conditions (i.e. strong Bora or Scirocco winds, floods). Since the implemented processing methodology is based on open source software and free satellite data, it represents a promising tool for the monitoring of maritime ecosystems and to address water quality analyses and the investigations of sediment dynamics in estuarine and coastal waters.
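The pixel-based correlation step lends itself to a compact sketch: each pixel's turbidity time series is correlated with the in situ discharge series to yield a correlation-coefficient map. The cube below is random noise standing in for the Landsat-8-derived turbidity products; dimensions and series are invented.

```r
# Pixel-wise correlation of a turbidity data cube with a discharge series.
set.seed(3)
nt   <- 24                                          # satellite acquisitions
cube <- array(rnorm(50 * 50 * nt), c(50, 50, nt))   # rows x cols x time
discharge <- rnorm(nt)                              # in situ water discharge

corr_map <- apply(cube, c(1, 2), function(ts) cor(ts, discharge))
image(corr_map)                                     # map of correlation coefficients
```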
NASA Astrophysics Data System (ADS)
Yang, Shuangming; Wei, Xile; Deng, Bin; Liu, Chen; Li, Huiyan; Wang, Jiang
2018-03-01
Balance between the biological plausibility of dynamical activities and computational efficiency is one of the challenging problems in computational neuroscience and neural system engineering. This paper proposes a set of efficient methods for the hardware realization of the conductance-based neuron model with relevant dynamics, with the aim of reproducing biological behaviors with a low-cost implementation on a digital programmable platform; the methods are applicable to a wide range of conductance-based neuron models. Modified GP neuron models for efficient hardware implementation are presented that reproduce reliable pallidal dynamics, which decode the information of the basal ganglia and regulate movement-disorder-related voluntary activities. Implementation results on a field-programmable gate array (FPGA) demonstrate that the proposed techniques and models can reduce the resource cost significantly and reproduce the biological dynamics accurately. In addition, biological behaviors under weak network coupling are explored on the proposed platform, and a theoretical analysis of the biological characteristics of the structured pallidal oscillator and network is provided. The implementation techniques provide an essential step towards large-scale neural networks for exploring dynamical mechanisms in real time. Furthermore, the proposed methodology makes the FPGA-based system a powerful platform for the investigation of neurodegenerative diseases and the real-time control of bio-inspired neuro-robotics.
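Digital implementations of conductance-based models typically discretise the membrane equation with a fixed-step update; the sketch below shows that update for a generic leaky membrane in R, purely as a numerical illustration and not the paper's modified GP model or its FPGA arithmetic.

```r
# Forward-Euler integration of a generic leaky membrane equation.
dt <- 0.01; steps <- 5000
C <- 1; g_leak <- 0.1; E_leak <- -65; I_ext <- 1.5   # hypothetical parameters
v <- numeric(steps); v[1] <- -65

for (t in 1:(steps - 1)) {
  dv <- (-g_leak * (v[t] - E_leak) + I_ext) / C      # membrane current balance
  v[t + 1] <- v[t] + dt * dv                         # fixed-step Euler update
}
plot(seq_len(steps) * dt, v, type = "l", xlab = "time (ms)", ylab = "V (mV)")
```

On hardware, the same update would be realised in fixed-point arithmetic, which is where the resource-saving techniques discussed above come in.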
Meta-analysis and The Cochrane Collaboration: 20 years of the Cochrane Statistical Methods Group
2013-01-01
The Statistical Methods Group has played a pivotal role in The Cochrane Collaboration over the past 20 years. The Statistical Methods Group has determined the direction of statistical methods used within Cochrane reviews, developed guidance for these methods, provided training, and continued to discuss and consider new and controversial issues in meta-analysis. The contribution of Statistical Methods Group members to the meta-analysis literature has been extensive and has helped to shape the wider meta-analysis landscape. In this paper, marking the 20th anniversary of The Cochrane Collaboration, we reflect on the history of the Statistical Methods Group, beginning in 1993 with the identification of aspects of statistical synthesis for which consensus was lacking about the best approach. We highlight some landmark methodological developments that Statistical Methods Group members have contributed to in the field of meta-analysis. We discuss how the Group implements and disseminates statistical methods within The Cochrane Collaboration. Finally, we consider the importance of robust statistical methodology for Cochrane systematic reviews, note research gaps, and reflect on the challenges that the Statistical Methods Group faces in its future direction. PMID:24280020
NASA Astrophysics Data System (ADS)
Alexakis, Dimitrios D.; Sarris, Apostolos; Papadopoulos, Nikos; Soupios, Pantelis; Doula, Maria; Cavvadias, Victor
2014-08-01
The olive-oil industry is one of the most important sectors of agricultural production in Greece, the third-largest olive-oil-producing country worldwide. Olive oil mill wastes (OOMW) are a major pollution factor in olive-growing regions and an important problem to be solved for the agricultural industry. Olive-oil mill wastes are normally deposited in tanks, directly in the soil, or even in adjacent torrents, rivers and lakes, posing a high risk to the environment and to community health. The GEODIAMETRIS project aspires to develop integrated geoinformatic methodologies for monitoring land pollution from the disposal of OOMW on the island of Crete, Greece. These methodologies integrate GPS surveys, satellite remote sensing and risk assessment analysis in a GIS environment, the application of in situ and laboratory geophysical methodologies, and soil and water physico-chemical analysis. As regards the project's preliminary results, all operating OOMW sites located in Crete have already been registered through extensive GPS field campaigns. Their spatial and attribute information has been stored in an integrated GIS database, and an overall OOMW spectral signature database has been constructed through the analysis of multi-temporal Landsat-8 OLI satellite images. In addition, a specific OOMW area located in Alikianos village (Chania, Crete) has been selected as one of the main case study areas. Various geophysical methodologies, such as Electrical Resistivity Tomography, Induced Polarization, multifrequency electromagnetic, Self Potential measurements and Ground Penetrating Radar, have already been implemented. Soil as well as liquid samples have been collected for physico-chemical analysis. The preliminary results have already contributed to the gradual development of an integrated environmental monitoring tool for studying and understanding environmental degradation from the disposal of OOMW.
Network Meta-Analysis Using R: A Review of Currently Available Automated Packages
Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph
2014-01-01
Network meta-analysis (NMA) – a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously – has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting a NMA. PMID:25541687
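As a brief taste of one of the packages reviewed, the sketch below runs a frequentist NMA with netmeta on a toy three-study contrast-level dataset; the effect sizes and standard errors are invented.

```r
# Toy network: treatments A, B, C connected by three two-arm studies.
library(netmeta)

d <- data.frame(
  TE      = c(-0.5, -0.2, 0.3),     # treatment effects (e.g., log odds ratios)
  seTE    = c(0.20, 0.25, 0.30),    # standard errors
  treat1  = c("A", "A", "B"),
  treat2  = c("B", "C", "C"),
  studlab = c("study1", "study2", "study3")
)

nma <- netmeta(TE, seTE, treat1, treat2, studlab, data = d, sm = "OR")
summary(nma)     # pooled estimates for all pairwise comparisons
netgraph(nma)    # network plot of the evidence structure
```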
NASA Astrophysics Data System (ADS)
Lamouroux, Julien; Testut, Charles-Emmanuel; Lellouche, Jean-Michel; Perruche, Coralie; Paul, Julien
2017-04-01
The operational production of data-assimilated biogeochemical state of the ocean is one of the challenging core projects of the Copernicus Marine Environment Monitoring Service. In that framework - and with the April 2018 CMEMS V4 release as a target - Mercator Ocean is in charge of improving the realism of its global ¼° BIOMER coupled physical-biogeochemical (NEMO/PISCES) simulations, analyses and re-analyses, and to develop an effective capacity to routinely estimate the biogeochemical state of the ocean, through the implementation of biogeochemical data assimilation. Primary objectives are to enhance the time representation of the seasonal cycle in the real time and reanalysis systems, and to provide a better control of the production in the equatorial regions. The assimilation of BGC data will rely on a simplified version of the SEEK filter, where the error statistics do not evolve with the model dynamics. The associated forecast error covariances are based on the statistics of a collection of 3D ocean state anomalies. The anomalies are computed from a multi-year numerical experiment (free run without assimilation) with respect to a running mean in order to estimate the 7-day scale error on the ocean state at a given period of the year. These forecast error covariances rely thus on a fixed-basis seasonally variable ensemble of anomalies. This methodology, which is currently implemented in the "blue" component of the CMEMS operational forecast system, is now under adaptation to be applied to the biogeochemical part of the operational system. Regarding observations - and as a first step - the system shall rely on the CMEMS GlobColour Global Ocean surface chlorophyll concentration products, delivered in NRT. The objective of this poster is to provide a detailed overview of the implementation of the aforementioned data assimilation methodology in the CMEMS BIOMER forecasting system. Focus shall be put on (1) the assessment of the capabilities of this data assimilation methodology to provide satisfying statistics of the model variability errors (through space-time analysis of dedicated representers of satellite surface Chla observations), (2) the dedicated features of the data assimilation configuration that have been implemented so far (e.g. log-transformation of the analysis state, multivariate Chlorophyll-Nutrient control vector, etc.) and (3) the assessment of the performances of this future operational data assimilation configuration.
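The fixed-basis anomaly construction described above can be sketched for a single grid point: anomalies are deviations of a multi-year free-run series from its running mean, pooled around a given time of year. The 61-day window, 31-day pooling band, and toy series below are our assumptions, not the operational settings.

```r
# State anomalies relative to a running mean, one grid point, toy data.
set.seed(5)
state    <- cumsum(rnorm(3 * 365))                            # 3 years of daily state
run_mean <- stats::filter(state, rep(1 / 61, 61), sides = 2)  # centred 61-day mean
anomaly  <- as.numeric(state - run_mean)                      # NA at series edges

# Seasonally variable basis: pool anomalies near a given calendar date
doy   <- rep(1:365, times = 3)
basis <- anomaly[abs(doy - 100) <= 15 & !is.na(anomaly)]      # around day 100
sd(basis)                                                     # crude error amplitude
```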
NASA Astrophysics Data System (ADS)
Vázquez-Suñé, Enric; Ángel Marazuela, Miguel; Velasco, Violeta; Diviu, Marc; Pérez-Estaún, Andrés; Álvarez-Marrón, Joaquina
2016-09-01
The overdevelopment of cities since the industrial revolution has shown the need to incorporate sound geological knowledge in the management of required subsurface infrastructures and in the assessment of increasingly needed groundwater resources. Additionally, the scarcity of outcrops and the technical difficulty of conducting underground exploration in urban areas highlight the importance of implementing efficient management plans that deal with the legacy of heterogeneous subsurface information. To address these difficulties, a methodology has been proposed to integrate all the available spatio-temporal data into a comprehensive spatial database, together with a set of tools that facilitates the analysis and processing of existing and newly added data, for the city of Barcelona (NE Spain). Here we present the resulting subsurface 3-D geological model, which incorporates and articulates all the information stored in the database. The methodology applied to Barcelona benefited from good collaboration between administrative bodies and researchers, which enabled the realization of a comprehensive geological database despite logistic difficulties. Currently, both the public administration and the private sector benefit from the geological understanding acquired in the city of Barcelona, for example when preparing the hydrogeological models used in groundwater assessment plans. The methodology further facilitates the continuous incorporation of new data in the implementation and sustainable management of urban groundwater, and also contributes to significantly reducing the costs of new infrastructures.
Impact of agile methodologies on team capacity in automotive radio-navigation projects
NASA Astrophysics Data System (ADS)
Prostean, G.; Hutanu, A.; Volker, S.
2017-01-01
The development processes used in automotive radio-navigation projects are constantly under adaptation pressure. While software development models are based on automotive production processes, the integration of peripheral components into an automotive system triggers a high number of requirement modifications. The use of traditional development models in the automotive industry pushes a team's development capacity to its limits. The root cause lies in the inflexibility of current processes and their limited adaptability. This paper addresses a new project management approach for the development of radio-navigation projects. An understanding of the weaknesses of currently used models helped us in the development and integration of agile methodologies into the traditional development model structure. In the first part we focus on change management methods to reduce the inflow of change requests. Established change-management risk analysis processes enable project management to judge the impact of a requirement change and also give the project time to implement some changes. However, in large automotive radio-navigation projects the time saved is not enough to implement the large number of changes submitted to the project. In the second part of this paper we focus on increasing team capacity by integrating agile methodologies into the traditional model at critical project phases. The overall objective of this paper is to demonstrate the need for process adaptation in order to resolve project team capacity bottlenecks.
Intermountain Health Care, Inc.: Standard Costing System Methodology and Implementation
Rosqvist, W.V.
1984-01-01
Intermountain Health Care, Inc. (IHC), a not-for-profit hospital chain with 22 hospitals in the intermountain area and corporate offices located in Salt Lake City, Utah, has developed a Standard Costing System to provide hospital management with a tool for confronting increased cost pressures in the health care environment. This document describes the methodology used in developing the standard costing system and outlines the implementation process.
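The mechanics of a standard costing system reduce to comparing actual costs against a flexible budget built from standard unit costs and actual volumes; the sketch below illustrates that variance calculation with hypothetical hospital cost centres, not IHC's actual figures.

```r
# Flexible budget = standard unit cost x actual volume; variance = actual - budget.
std_cost    <- c(lab = 12.50, imaging = 85.00)   # hypothetical standard unit costs
actual_qty  <- c(lab = 430,   imaging = 120)     # actual volumes
actual_cost <- c(lab = 5810,  imaging = 9900)    # actual costs incurred

budget   <- std_cost * actual_qty
variance <- actual_cost - budget                 # positive = spending over standard
round(variance, 2)
```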
Does team training work? Principles for health care.
Salas, Eduardo; DiazGranados, Deborah; Weaver, Sallie J; King, Heidi
2008-11-01
Teamwork is integral to a working environment conducive to patient safety and care. Team training is one methodology designed to equip team members with the competencies necessary for optimizing teamwork. There is evidence of team training's effectiveness in highly complex and dynamic work environments, such as aviation and health care. However, most quantitative evaluations of training do not offer any insight into the actual reasons why, how, and when team training is effective. To address this gap in understanding, and to provide guidance for members of the health care community interested in implementing team training programs, this article presents both quantitative results and a specific qualitative review and content analysis of team training implemented in health care. Based on this review, we offer eight evidence-based principles for effective planning, implementation, and evaluation of team training programs specific to health care.
NASA Astrophysics Data System (ADS)
Jadhav, J. R.; Mantha, S. S.; Rane, S. B.
2015-09-01
`Survival of the fittest' is the reality of modern global competition. Organizations around the globe are adopting, or are willing to embrace, just-in-time (JIT) production to reinforce their competitiveness. Even though JIT is one of the most powerful inventory management methodologies, it is not free of barriers. Barriers derail the implementation of a JIT production system. One of the most significant tasks of top management is to identify and understand the relationships between the barriers to JIT production in order to alleviate their adverse effects. The aims of this paper are to study the barriers hampering the successful implementation of JIT production and to analyse the interactions among the barriers using the interpretive structural modelling technique. Twelve barriers have been identified from a review of the literature. This paper offers a roadmap for preparing an action plan to tackle the barriers to successful implementation of JIT production.
Gifford, Wendy; Graham, Ian D; Ehrhart, Mark G; Davies, Barbara L; Aarons, Gregory A
2017-01-01
Leadership in health care is instrumental to creating a supportive organizational environment and positive staff attitudes for implementing evidence-based practices to improve patient care and outcomes. The purpose of this study is to demonstrate the alignment of the Ottawa Model of Implementation Leadership (O-MILe), a theoretical model for developing implementation leadership, with the Implementation Leadership Scale (ILS), an empirically validated tool for measuring implementation leadership. A secondary objective is to describe the methodological process for aligning concepts of a theoretical model with an independently established measurement tool for evaluating theory-based interventions. Modified template analysis was conducted to deductively map items of the ILS onto concepts of the O-MILe. An iterative process was used in which the model and scale developers (n=5) appraised the relevance, conceptual clarity, and fit of each ILS items with the O-MILe concepts through individual feedback and group discussions until consensus was reached. All 12 items of the ILS correspond to at least one O-MILe concept, demonstrating compatibility of the ILS as a measurement tool for the O-MILe theoretical constructs. The O-MILe provides a theoretical basis for developing implementation leadership, and the ILS is a compatible tool for measuring leadership based on the O-MILe. Used together, the O-MILe and ILS provide an evidence- and theory-based approach for developing and measuring leadership for implementing evidence-based practices in health care. Template analysis offers a convenient approach for determining the compatibility of independently developed evaluation tools to test theoretical models.
The impact of Lean bundles on hospital performance: does size matter?
Al-Hyari, Khalil; Abu Hammour, Sewar; Abu Zaid, Mohammad Khair Saleem; Haffar, Mohamed
2016-10-10
Purpose The purpose of this paper is to study the effect of the implementation of Lean bundles on hospital performance in private hospitals in Jordan and to evaluate how the size of the organization affects the relationship between Lean bundles implementation and hospital performance. Design/methodology/approach The research follows a quantitative method (descriptive and hypothesis testing). Three statistical techniques were adopted to analyse the data: structural equation modeling and multi-group analysis were used to examine the research hypotheses and perform the required statistical analysis of the survey data, while reliability analysis and confirmatory factor analysis were used to test construct validity, reliability and measurement loadings. Findings Lean bundles have been identified as an effective approach that can dramatically improve the organizational performance of private hospitals in Jordan. The main Lean bundles - just-in-time, human resource management, and total quality management - are applicable to large, small and medium hospitals without significant size-dependent differences in benefits. Originality/value To the researchers' best knowledge, this is the first research that studies the impact of Lean bundles implementation in the healthcare sector in Jordan. This research also makes a significant contribution for decision makers in healthcare by increasing their awareness of Lean bundles.
A top-down design methodology and its implementation for VCSEL-based optical links design
NASA Astrophysics Data System (ADS)
Li, Jiguang; Cao, Mingcui; Cai, Zilong
2005-01-01
In order to find the optimal design for a given specification of an optical communication link, an integrated simulation of the electronic, optoelectronic, and optical components of a complete system is required. It is very important to be able to simulate at both the system level and the detailed model level. This kind of model is feasible thanks to the high potential of the Verilog-AMS language. In this paper, we propose an effective top-down design methodology and employ it in the development of a complete VCSEL-based optical link simulation. The principle of the top-down methodology is that development proceeds from the system level to the device level. To design a hierarchical model for VCSEL-based optical links, the design framework is organized in three levels of hierarchy. The models are developed and implemented in Verilog-AMS, and the model parameters are fitted to measured data. A sample transient simulation demonstrates the functioning of our implementation. Suggestions for future directions in the top-down methodology applied to optoelectronic systems technology are also presented.
Wojtusiak, Janusz; Michalski, Ryszard S; Simanivanh, Thipkesone; Baranova, Ancha V
2009-12-01
Systematic reviews and meta-analysis of published clinical datasets are an important part of medical research. By combining results of multiple studies, meta-analysis is able to increase confidence in its conclusions, validate particular study results, and sometimes lead to new findings. Extensive theory has been built on how to aggregate results from multiple studies and arrive at statistically valid conclusions. Surprisingly, very little has been done to adopt advanced machine learning methods to support meta-analysis. In this paper we describe a novel machine learning methodology that is capable of inducing accurate and easy-to-understand attributional rules from aggregated data. Thus, the methodology can be used to support traditional meta-analysis in systematic reviews. Most machine learning applications give primary attention to the predictive accuracy of the learned knowledge and lesser attention to its understandability. Here we employed attributional rules, a special form of rules that are relatively easy to interpret for medical experts who are not necessarily trained in statistics and meta-analysis. The methodology has been implemented and initially tested on a set of publicly available clinical data describing patients with metabolic syndrome (MS). The objective of this application was to determine rules describing combinations of clinical parameters used for metabolic syndrome diagnosis, and to develop rules for predicting whether particular patients are likely to develop secondary complications of MS. The aggregated clinical data were retrieved from 20 separate hospital cohorts that included 12 groups of patients with liver disease symptoms and 8 control groups of healthy subjects. A total of 152 attributes were used, although most were measured in different studies. The twenty most common attributes were selected for the rule learning process. By applying the developed rule learning methodology we arrived at several possible rulesets that can be used to predict three complications of MS, namely nonalcoholic fatty liver disease (NAFLD), simple steatosis (SS), and nonalcoholic steatohepatitis (NASH).
Ergonomics for enhancing detection of machine abnormalities.
Illankoon, Prasanna; Abeysekera, John; Singh, Sarbjeet
2016-10-17
Detecting abnormal machine conditions is of great importance in an autonomous maintenance environment. Ergonomic aspects can be invaluable when detection of machine abnormalities using human senses is examined. This research outlines the ergonomic issues involved in detecting machine abnormalities and suggests how ergonomics can improve such detection. Cognitive Task Analysis was performed in a plant in Sri Lanka that is implementing Total Productive Maintenance, in order to identify the sensory types used to detect machine abnormalities and the relevant ergonomic characteristics. As the outcome of this research, a methodology comprising an Ergonomic Gap Analysis Matrix for machine abnormality detection is presented.
A Test Generation Framework for Distributed Fault-Tolerant Algorithms
NASA Technical Reports Server (NTRS)
Goodloe, Alwyn; Bushnell, David; Miner, Paul; Pasareanu, Corina S.
2009-01-01
Heavyweight formal methods such as theorem proving have been successfully applied to the analysis of safety critical fault-tolerant systems. Typically, the models and proofs performed during such analysis do not inform the testing process of actual implementations. We propose a framework for generating test vectors from specifications written in the Prototype Verification System (PVS). The methodology uses a translator to produce a Java prototype from a PVS specification. Symbolic (Java) PathFinder is then employed to generate a collection of test cases. A small example is employed to illustrate how the framework can be used in practice.
Non-Grey Radiation Modeling using Thermal Desktop/Sindaworks TFAWS06-1009
NASA Technical Reports Server (NTRS)
Anderson, Kevin R.; Paine, Chris
2006-01-01
This paper provides an overview of the non-grey radiation modeling capabilities of Cullimore and Ring's Thermal Desktop(Registered TradeMark) Version 4.8 SindaWorks software. The non-grey radiation analysis theory implemented by SindaWorks and the methodology used by the software are outlined. Representative results from a parametric trade study of a radiation shield composed of a series of V-groove-shaped deployable panels are used to illustrate the capabilities of the SindaWorks non-grey radiation thermal analysis software, using emissivities with temperature and wavelength dependency modeled via a Hagen-Rubens relationship.
A Proven Methodology for Developing Secure Software and Applying It to Ground Systems
NASA Technical Reports Server (NTRS)
Bailey, Brandon
2016-01-01
Part Two expands upon Part One in an attempt to translate the methodology for ground system personnel. The goal is to build upon the methodology presented in Part One by showing examples and details on how to implement the methodology. Section 1: Ground Systems Overview; Section 2: Secure Software Development; Section 3: Defense in Depth for Ground Systems; Section 4: What Now?
A simple landslide susceptibility analysis for hazard and risk assessment in developing countries
NASA Astrophysics Data System (ADS)
Guinau, M.; Vilaplana, J. M.
2003-04-01
In recent years, a number of techniques and methodologies have been developed for mitigating natural disasters. The complexity of these methodologies and the scarcity of material and data series justify the need for simple methodologies to obtain the information necessary for minimising the effects of catastrophic natural phenomena. Work with polygonal maps in a GIS allowed us to develop a simple methodology, applied to an area of 473 km2 in the Departamento de Chinandega (NW Nicaragua). This area was severely affected by a large number of landslides (mainly debris flows) triggered by the Hurricane Mitch rainfalls in October 1998. With the aid of aerial photography interpretation at 1:40,000 scale, enlarged to 1:20,000, and detailed field work, a landslide map at 1:10,000 scale was constructed. The failure zones of the landslides were digitized in order to obtain a digital failure zone map. A digital terrain unit map, representing a series of physical-environmental terrain factors, was also used. Dividing the study area into two zones (A and B) with homogeneous physical and environmental characteristics allowed us to develop the proposed methodology and to validate it. In zone A, the digital failure zone map was superimposed onto the digital terrain unit map to establish the relationship between the different terrain factors and the failure zones. The numerical expression of this relationship enabled us to classify the terrain by its landslide susceptibility. In zone B, this numerical relationship was employed to obtain a landslide susceptibility map, obviating the need for a failure zone map. The validity of the methodology was tested in this zone by measuring the degree of superposition between the susceptibility map and the failure zone map. Implementing the methodology in tropical countries with physical and environmental characteristics similar to those of the study area makes it possible to carry out landslide susceptibility analysis in areas where landslide records do not exist. This analysis is essential to landslide hazard and risk assessment, which is necessary to determine actions for mitigating landslide effects, e.g. land planning and emergency aid actions.
Agile methodology selection criteria: IT start-up case study
NASA Astrophysics Data System (ADS)
Micic, Lj
2017-05-01
Project management in modern IT companies is often based on agile methodologies, which have several advantages over traditional methodologies such as waterfall. Given that clients sometimes change a project during development, it is crucial for an IT company to choose carefully which methodology it is going to implement and whether it will rely mostly on one methodology or on a combination of several. Among the many modern methodologies in use, Scrum, Kanban and Extreme Programming (XP) are the most common. Sometimes companies mostly use the tools and procedures of one methodology, but quite often they use a combination of several. Since these methodologies are just frameworks, they allow companies to adapt them to their specific projects as well as to other constraints. Agile methodologies are still in limited use in Bosnia, but more and more IT companies are starting to adopt them, not only because they are common practice among clients abroad, but also because they are increasingly the only way to deliver a quality product on time. However, it is always challenging to decide which methodology, or combination of methodologies, a company should implement and how to connect it to the company's own projects, organizational framework and HR management. This paper presents a case study based on a local IT start-up and delivers a solution based on the theoretical framework and the practical limitations of the case company.
Methodology for Designing Fault-Protection Software
NASA Technical Reports Server (NTRS)
Barltrop, Kevin; Levison, Jeffrey; Kan, Edwin
2006-01-01
A document describes a methodology for designing fault-protection (FP) software for autonomous spacecraft. The methodology embodies and extends established engineering practices in the technical discipline of Fault Detection, Diagnosis, Mitigation, and Recovery, and has been successfully implemented on the Deep Impact spacecraft, a NASA Discovery mission. Based on established concepts of Fault Monitors and Responses, this FP methodology extends the notions of Opinion, Symptom, Alarm (aka Fault), and Response with numerous new notions, sub-notions, software constructs, and logic and timing gates. For example, a Monitor generates a RawOpinion, which graduates into an Opinion, categorized as no-opinion, acceptable, or unacceptable. RaiseSymptom, ForceSymptom, and ClearSymptom govern the establishment of a Symptom and its mapping to an Alarm (aka Fault). Local Response is distinguished from FP System Response. A 1-to-n and n-to-1 mapping is established among Monitors, Symptoms, and Responses. Responses are categorized by device versus by function. Responses operate in tiers, where the early tiers attempt to resolve the Fault in a localized, step-by-step fashion, relegating more system-level responses to later tiers. Recovery actions are gated by epoch recovery timing, enabling strategy, urgency, a MaxRetry gate, hardware availability, hazardous versus ordinary fault, and many other priority gates. The methodology is systematic and logical, and uses multiple linked tables, parameter files, and recovery command sequences. The credibility of the FP design is proven via a fault-tree analysis ("top-down") approach and a functional failure-modes-and-effects analysis ("bottom-up") approach. Via this process, the mitigation and recovery strategies for each Fault Containment Region scope (in width versus depth) the FP architecture.
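As a rough illustration of the tiered Monitor-Symptom-Alarm-Response flow described above, consider the following sketch. The class and field names echo the abstract's vocabulary, but the structure, events, and values are invented for illustration; this is not the Deep Impact design.

```python
from dataclasses import dataclass, field

@dataclass
class Monitor:
    name: str
    threshold: float
    symptom: str
    def check(self, raw):
        """RawOpinion graduates into an Opinion; an unacceptable Opinion raises a Symptom."""
        if raw is None:
            return None                      # "no-opinion"
        return self.symptom if raw > self.threshold else None

@dataclass
class FaultProtection:
    symptom_to_alarm: dict = field(default_factory=dict)    # n-to-1 mapping
    alarm_to_responses: dict = field(default_factory=dict)  # tiered responses
    def respond(self, symptoms):
        alarms = {self.symptom_to_alarm[s] for s in symptoms
                  if s in self.symptom_to_alarm}
        # early tiers are localized fixes; later tiers are system-level
        return [(a, tier, r) for a in alarms
                for tier, r in enumerate(self.alarm_to_responses[a])]

monitors = [Monitor("battery_temp", threshold=70.0, symptom="battery_temp_high"),
            Monitor("bus_voltage_drop", threshold=2.0, symptom="low_bus_voltage")]
raw = {"battery_temp": 82.0, "bus_voltage_drop": 0.4}
symptoms = {m.check(raw[m.name]) for m in monitors} - {None}

fp = FaultProtection(
    symptom_to_alarm={"low_bus_voltage": "POWER_FAULT",
                      "battery_temp_high": "POWER_FAULT"},
    alarm_to_responses={"POWER_FAULT": ["switch_to_backup_bus",   # tier 0
                                        "enter_safe_mode"]})      # tier 1
print(fp.respond(symptoms))
```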
Mechanistic-empirical Pavement Design Guide Implementation
DOT National Transportation Integrated Search
2010-06-01
The recently introduced Mechanistic-Empirical Pavement Design Guide (MEPDG) and associated computer software provide a state-of-practice mechanistic-empirical highway pavement design methodology. The MEPDG methodology is based on pavement responses ...
Levay, Adrienne V; Chapman, Gwen E; Seed, Barbara; Wittman, Hannah
2018-04-30
School food environments are the target of nutrition interventions and evaluations across the globe. Yet little work to date has articulated the importance of developing a theory of change upon which to base evaluation of both implementation and outcomes. This paper takes an interpretive approach to develop a retrospective theory of change for an implementation evaluation of British Columbia's school food and beverage sales Guidelines. This study contributes broadly to a nuanced conceptualization of this type of public health intervention and provides a methodological contribution on how to develop a retrospective theory of change, with implications for effective evaluation. Data collection strategies included document analysis, semi-structured interviews with key stakeholders, and participant observation. Developing the logic model revealed that, despite the broad population health aims of the intervention, the main focus of implementation is to change the behaviors of the adults who create school food environments. Derived from the analysis and interpretation of the data, the emergent program theory centers on the assumption that if adults are responsibilized through information and education campaigns and provided with implementation tools, they will be 'convinced' to implement changes to school food environments that foster broader public health goals. These findings highlight the importance of assessing individual-level implementation indicators as well as the more often evaluated measures of food and beverage availability. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ning, S. A.; Hayman, G.; Damiani, R.
Blade element momentum methods, though conceptually simple, are highly useful for analyzing wind turbine aerodynamics and are widely used in many design and analysis applications. A new version of AeroDyn is being developed to take advantage of new robust solution methodologies, conform to a new modularization framework for the National Renewable Energy Laboratory's FAST, utilize advanced skewed-wake analysis methods, fix limitations of previous implementations, and enable modeling of highly flexible and nonstraight blades. This paper reviews blade element momentum theory and several of the options available for analyzing skewed inflow. AeroDyn implementation details are described for the benefit of users and developers. These new options are compared to solutions from the previous version of AeroDyn and to experimental data. Finally, recommendations are given on how one might select from the various available solution approaches.
INTERIM ANALYSIS OF THE CONTRIBUTION OF HIGH-LEVEL EVIDENCE FOR DENGUE VECTOR CONTROL.
Horstick, Olaf; Ranzinger, Silvia Runge
2015-01-01
This interim analysis reviews the available systematic literature on dengue vector control at three levels: 1) single and combined vector control methods, with existing work on peridomestic space spraying and on Bacillus thuringiensis israelensis, and further work soon to be available on the use of Temephos, copepods and larvivorous fish; 2) vector control for a specific purpose, such as outbreak control; and 3) the strategic level, for example decentralization versus centralization, with a systematic review on vector control organization. Clear best-practice guidelines for the methodology of entomological studies are needed, as is the inclusion of dengue transmission data. The following recommendations emerge: although vector control can be effective, implementation remains an issue; single interventions are probably not useful; combinations of interventions have mixed results; careful implementation of vector control measures may be most important; outbreak interventions are often applied with questionable effectiveness.
Reiner, Bruce I
2017-10-01
Conventional peer review practice is compromised by a number of well-documented biases, which in turn limit standard-of-care analysis, a determination fundamental to medical malpractice. In addition to these intrinsic biases, current peer review suffers from other deficiencies, including a lack of standardization, objectivity, and automation, and its retrospective character. An alternative model to address these deficiencies would be one that is completely blinded to the peer reviewer, requires independent reporting from both parties, utilizes automated data mining techniques for neutral and objective report analysis, and provides data reconciliation for resolution of finding-specific report differences. If properly implemented, this peer review model could result in the creation of a standardized, referenceable peer review database, which could further assist in customizable education, technology refinement, and implementation of real-time context- and user-specific decision support.
da Cruz, Andrea de Mello Pereira; Almeida, Miriam de Abreu
2010-12-01
This is a qualitative, exploratory and descriptive study whose general objective was to learn, from the perspective of nursing technicians working in teaching hospitals, the competencies developed during their education to implement the Nursing Care Systematization (NCS). Data were collected through a focus group with nursing technicians and analyzed using content analysis. Two thematic categories emerged: the participation of the nursing technician in the NCS, and the competencies in the education of the nursing technician. Each was divided into two subcategories: conception of the NCS and (de)valuation of the NCS; and technical-scientific competency and competency in interpersonal relationships, respectively. It was observed that the NCS must be shared, discussed and made public among nursing professionals, so that they may recognize themselves as the leading actors of this methodology and be aware that their practices determine its results.
NASA Technical Reports Server (NTRS)
1991-01-01
The technical effort and computer code enhancements performed during the sixth year of the Probabilistic Structural Analysis Methods program are summarized. Various capabilities are described to probabilistically combine structural response and structural resistance to compute component reliability. A library of structural resistance models is implemented in the Numerical Evaluations of Stochastic Structures Under Stress (NESSUS) code that included fatigue, fracture, creep, multi-factor interaction, and other important effects. In addition, a user interface was developed for user-defined resistance models. An accurate and efficient reliability method was developed and was successfully implemented in the NESSUS code to compute component reliability based on user-selected response and resistance models. A risk module was developed to compute component risk with respect to cost, performance, or user-defined criteria. The new component risk assessment capabilities were validated and demonstrated using several examples. Various supporting methodologies were also developed in support of component risk assessment.
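To make the response/resistance reliability computation concrete, here is a minimal Monte Carlo sketch. The distributions and numbers are invented for illustration, and NESSUS itself uses more efficient reliability methods than brute-force sampling, as the abstract notes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical distributions for structural response (stress) and
# resistance (strength); the real models come from the NESSUS libraries.
n = 1_000_000
response   = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=n)   # MPa
resistance = rng.lognormal(mean=np.log(450.0), sigma=0.08, size=n)   # MPa

# Component fails whenever resistance falls below response.
p_failure = np.mean(resistance < response)
print(f"estimated component failure probability: {p_failure:.2e}")
```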
The Aeronautical Data Link: Decision Framework for Architecture Analysis
NASA Technical Reports Server (NTRS)
Morris, A. Terry; Goode, Plesent W.
2003-01-01
A decision analytic approach that develops optimal data link architecture configuration and behavior to meet multiple conflicting objectives of concurrent and different airspace operations functions has previously been developed. The approach, premised on a formal taxonomic classification that correlates data link performance with operations requirements, information requirements, and implementing technologies, provides a coherent methodology for data link architectural analysis from top-down and bottom-up perspectives. This paper follows the previous research by providing more specific approaches for mapping and transitioning between the lower levels of the decision framework. The goal of the architectural analysis methodology is to assess the impact of specific architecture configurations and behaviors on the efficiency, capacity, and safety of operations. This necessarily involves understanding the various capabilities, system level performance issues and performance and interface concepts related to the conceptual purpose of the architecture and to the underlying data link technologies. Efficient and goal-directed data link architectural network configuration is conditioned on quantifying the risks and uncertainties associated with complex structural interface decisions. Deterministic and stochastic optimal design approaches will be discussed that maximize the effectiveness of architectural designs.
Sum-of-Squares-Based Region of Attraction Analysis for Gain-Scheduled Three-Loop Autopilot
NASA Astrophysics Data System (ADS)
Seo, Min-Won; Kwon, Hyuck-Hoon; Choi, Han-Lim
2018-04-01
A conventional method of designing a missile autopilot is to linearize the original nonlinear dynamics at several trim points, determine linear controllers for each linearized model, and finally implement a gain-scheduling technique. The validation of such a controller is often based on linear system analysis of the linear closed-loop system at the trim conditions. Although this type of gain-scheduled linear autopilot works well in practice, validation based solely on linear analysis may not be sufficient to fully characterize the closed-loop system, especially when the aerodynamic coefficients exhibit substantial nonlinearity with respect to the flight condition. The purpose of this paper is to present a methodology for analyzing the stability of a gain-scheduled controller in a setting close to the original nonlinear one. The method is based on sum-of-squares (SOS) optimization, which can be used to characterize the region of attraction of a polynomial system by solving convex optimization problems. The applicability of the proposed SOS-based methodology is verified on a short-period autopilot of a skid-to-turn missile.
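For orientation, a generic statement of the SOS region-of-attraction problem that methods of this kind solve (our notation, not necessarily the paper's): given polynomial closed-loop dynamics \( \dot{x} = f(x) \) and a Lyapunov candidate \( V(x) \), one solves

\[
\begin{aligned}
\max_{\rho,\; s(x)} \quad & \rho \\
\text{s.t.} \quad & s(x) \in \Sigma[x], \\
& -\big(\nabla V(x)^{\top} f(x) + \epsilon\, x^{\top} x\big) + s(x)\,\big(V(x) - \rho\big) \in \Sigma[x],
\end{aligned}
\]

where \( \Sigma[x] \) is the cone of sum-of-squares polynomials; any feasible \( \rho \) certifies that the sublevel set \( \{x : V(x) \le \rho\} \) is an inner estimate of the region of attraction, since on that set \( \dot{V} \le -\epsilon\, x^{\top} x < 0 \) away from the origin.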
Assessment of Infrared Sounder Radiometric Noise from Analysis of Spectral Residuals
NASA Astrophysics Data System (ADS)
Dufour, E.; Klonecki, A.; Standfuss, C.; Tournier, B.; Serio, C.; Masiello, G.; Tjemkes, S.; Stuhlmann, R.
2016-08-01
For the preparation and performance monitoring of the future generation of hyperspectral infrared sounders dedicated to precise vertical profiling of the atmospheric state, such as the Meteosat Third Generation hyperspectral InfraRed Sounder (MTG-IRS), a reliable assessment of the instrument radiometric error covariance matrix is needed. Ideally, an in-flight estimation of the radiometric noise is recommended, as certain sources of noise can be driven by the spectral signature of the observed Earth/atmosphere radiance. Also, unknown correlated noise sources, generally related to incomplete knowledge of the instrument state, can be present, so a characterisation of the noise spectral correlation is also needed. A methodology relying on the analysis of post-retrieval spectral residuals is designed and implemented to derive the covariance matrix in flight on the basis of Earth scene measurements. This methodology is successfully demonstrated using IASI observations as MTG-IRS proxy data, and makes it possible to highlight anticipated correlation structures explained by apodization and micro-vibration effects (ghost). This analysis is corroborated by a parallel estimation based on an IASI black body measurement dataset and the results of an independent micro-vibration model.
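A minimal sketch of the covariance estimation step, assuming a matrix of post-retrieval spectral residuals is already available (one fitted spectrum per row); the function name and interface are ours, not the paper's:

```python
import numpy as np

def residual_covariance(residuals):
    """Estimate radiometric noise covariance from post-retrieval spectral residuals.

    residuals: (n_obs, n_channels) array, one retrieval residual spectrum per row.
    Returns the channel covariance and correlation matrices; off-diagonal
    correlation structure is what reveals apodization/micro-vibration effects.
    """
    R = residuals - residuals.mean(axis=0)       # remove any systematic bias
    cov = (R.T @ R) / (len(R) - 1)               # sample covariance
    std = np.sqrt(np.diag(cov))
    corr = cov / np.outer(std, std)              # spectral correlation
    return cov, corr
```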
Single-Case Experimental Designs: A Systematic Review of Published Research and Current Standards
Smith, Justin D.
2013-01-01
This article systematically reviews the research design and methodological characteristics of single-case experimental design (SCED) research published in peer-reviewed journals between 2000 and 2010. SCEDs provide researchers with a flexible and viable alternative to group designs with large sample sizes. However, methodological challenges have precluded widespread implementation and acceptance of the SCED as a viable complementary methodology to the predominant group design. This article includes a description of the research design, measurement, and analysis domains distinctive to the SCED; a discussion of the results within the framework of contemporary standards and guidelines in the field; and a presentation of updated benchmarks for key characteristics (e.g., baseline sampling, method of analysis); overall, it provides researchers and reviewers with a resource for conducting and evaluating SCED research. The results of the systematic review of 409 studies suggest that recently published SCED research is largely in accordance with contemporary criteria for experimental quality. Analytic method emerged as an area of discord. Comparison of the findings of this review with historical estimates of the use of statistical analysis indicates an upward trend, but visual analysis remains the most common analytic method and also garners the most support amongst those entities providing SCED standards. Although consensus exists along key dimensions of single-case research design and researchers appear to be practicing within these parameters, there remains a need for further evaluation of assessment and sampling techniques and data analytic methods.
Van Deun, Jan; Hendrix, An
2017-01-01
The EV-TRACK knowledgebase is developed to cope with the need for transparency and rigour to increase reproducibility and facilitate standardization of extracellular vesicle (EV) research. The knowledgebase includes a checklist for authors and editors intended to improve the transparency of methodological aspects of EV experiments, allows queries and meta-analysis of EV experiments and keeps track of the current state of the art. Widespread implementation by the EV research community is key to its success.
Forecasting daily passenger traffic volumes in the Moscow metro
NASA Astrophysics Data System (ADS)
Ivanov, V. V.; Osetrov, E. S.
2018-01-01
In this paper we have developed a methodology for the medium-term prediction of daily passenger traffic volumes in the Moscow metro. It includes three forecast options: (1) artificial neural networks (ANNs); (2) singular spectrum analysis, as implemented in the Caterpillar-SSA package; and (3) a combination of the ANN and Caterpillar-SSA approaches. The methods and algorithms allow medium-term forecasting of passenger traffic flows in the Moscow metro with reasonable accuracy.
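As a sketch of the SSA ingredient, the following is a generic 'Caterpillar'-style SSA reconstruction (embedding, SVD, grouping, diagonal averaging), not the paper's actual implementation:

```python
import numpy as np

def ssa_reconstruct(series, window, n_components):
    """Reconstruct a smoothed series from its leading SSA components."""
    N, L = len(series), window
    K = N - L + 1
    # 1. Embedding: stack lagged windows into the trajectory (Hankel) matrix.
    X = np.column_stack([series[i:i + L] for i in range(K)])
    # 2. SVD of the trajectory matrix.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    # 3. Grouping: keep the leading components.
    X_r = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components]
    # 4. Diagonal averaging back to a one-dimensional series.
    recon, counts = np.zeros(N), np.zeros(N)
    for j in range(K):
        recon[j:j + L] += X_r[:, j]
        counts[j:j + L] += 1
    return recon / counts

# e.g. extract the trend + weekly cycle of a daily ridership series:
# smooth = ssa_reconstruct(daily_volumes, window=28, n_components=4)
```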
The Effects of Work on Family Life: A Review and Analysis of the Literature
1988-07-01
reviewed and approved this report, indicated that these data will be useful in the development of new avenues of research on the interface...this area. However, several conceptual and methodological weaknesses are apparent in this relatively new field of study. Several suggestions for...implement studies in the Army Families Research Program. It suggests new avenues of study and will serve as one of several literature reviews that will
Eggenreich, Britta; Rajamanickam, Vignesh; Wurm, David Johannes; Fricke, Jens; Herwig, Christoph; Spadiut, Oliver
2017-08-01
Cell disruption is a key unit operation to make valuable intracellular target products accessible for further downstream unit operations. Independent of the applied cell disruption method, each cell disruption process must be evaluated with respect to disruption efficiency and potential product loss. Current state-of-the-art methods, like measuring the total amount of released protein and plating-out assays, are usually time-delayed and involve manual intervention, making them error-prone. An automated method to monitor cell disruption efficiency at-line is not available to date. In the current study we implemented a methodology, which we had originally developed to monitor E. coli cell integrity during bioreactor cultivations, to automatically monitor and evaluate cell disruption of a recombinant E. coli strain by high-pressure homogenization. We compared our tool with a library of state-of-the-art methods, analyzed the effect of freezing the biomass before high-pressure homogenization, and finally investigated this unit operation in more detail via a multivariate approach. The combination of HPLC and automated data analysis constitutes a valuable, novel tool to monitor and evaluate cell disruption processes. Our methodology, which can be used in both upstream (USP) and downstream processing (DSP), is a valuable means of evaluating cell disruption processes, as it can be implemented at-line, gives results within minutes of sampling, and needs no manual intervention.
Off-site training of laparoscopic skills, a scoping review using a thematic analysis.
Thinggaard, Ebbe; Kleif, Jakob; Bjerrum, Flemming; Strandbygaard, Jeanett; Gögenur, Ismail; Matthew Ritter, E; Konge, Lars
2016-11-01
The focus of research in simulation-based laparoscopic training has changed from examining whether simulation training works to examining how best to implement it. In laparoscopic skills training, portable and affordable box trainers allow for off-site training. Training outside simulation centers and hospitals can increase access to training, but also poses new challenges to implementation. This review aims to guide implementation of off-site training of laparoscopic skills by critically reviewing the existing literature. An iterative systematic search was carried out in MEDLINE, EMBASE, ERIC, Scopus, and PsycINFO, following a scoping review methodology. The included literature was analyzed iteratively using a thematic analysis approach. The study was reported in accordance with the STructured apprOach to the Reporting In healthcare education of Evidence Synthesis statement. From the search, 22 records were identified and included for analysis. A thematic analysis revealed the themes: access to training, protected training time, distribution of training, goal setting and testing, task design, and unsupervised training. The identified themes were based on learning theories including proficiency-based learning, deliberate practice, and self-regulated learning. Methods of instructional design vary widely in off-site training of laparoscopic skills. Implementation can be facilitated by organizing courses and training curricula following sound educational theories such as proficiency-based learning and deliberate practice. Directed self-regulated learning has the potential to improve off-site laparoscopic skills training; however, further studies are needed to demonstrate the effect of this type of instructional design.
NASA Astrophysics Data System (ADS)
Mozgovoy, Dmitry k.; Hnatushenko, Volodymyr V.; Vasyliev, Volodymyr V.
2018-04-01
Vegetation and water bodies are fundamental elements of urban ecosystems, and their mapping is critical for urban and landscape planning and management. A methodology is proposed for the automated recognition of vegetation and water bodies in megacities from satellite images of sub-meter spatial resolution in the visible and IR bands. By processing multispectral images from the SuperView-1A satellite, vector layers of recognized vegetation and water objects were obtained. Analysis of the image processing results showed sufficiently high accuracy in delineating the boundaries of recognized objects and good separation of classes. The developed methodology provides a significant increase in the efficiency and reliability of updating maps of large cities while reducing financial costs. Owing to its high degree of automation, the proposed methodology can be implemented as a geo-information web service serving a wide range of public services and commercial institutions.
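The abstract does not spell out the decision rules; a common baseline for this kind of visible/IR classification uses spectral indices with thresholds, sketched below (the band names, index choices, and thresholds are assumptions, not taken from the paper):

```python
import numpy as np

def classify_vegetation_water(green, red, nir, veg_thresh=0.3, water_thresh=0.2):
    """Classify vegetation and water pixels from multispectral reflectance bands.

    Illustrative only: spectral-index thresholding is a standard baseline,
    not necessarily the paper's method. Inputs are arrays of equal shape.
    """
    eps = 1e-9
    ndvi = (nir - red) / (nir + red + eps)        # vegetation index
    ndwi = (green - nir) / (green + nir + eps)    # water index (McFeeters)
    vegetation = ndvi > veg_thresh
    water = ndwi > water_thresh
    return vegetation, water
```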
Assessing Similarity Among Individual Tumor Size Lesion Dynamics: The CICIL Methodology
Girard, Pascal; Ioannou, Konstantinos; Klinkhardt, Ute; Munafo, Alain
2018-01-01
Mathematical models of tumor dynamics generally omit information on individual target lesions (iTLs), and consider the most important variable to be the sum of tumor sizes (TS). However, differences in lesion dynamics might be predictive of tumor progression. To exploit this information, we have developed a novel and flexible approach for the non-parametric analysis of iTLs, which integrates knowledge from signal processing and machine learning. We called this new methodology ClassIfication Clustering of Individual Lesions (CICIL). We used CICIL to assess similarities among the TS dynamics of 3,223 iTLs measured in 1,056 patients with metastatic colorectal cancer treated with cetuximab combined with irinotecan in two phase II studies. We mainly observed similar dynamics among lesions within the same tumor site classification. In contrast, lesions in anatomic locations with different features showed different dynamics in about 35% of patients. The CICIL methodology has also been implemented in a user-friendly and efficient Java-based framework.
Warehouses information system design and development
NASA Astrophysics Data System (ADS)
Darajatun, R. A.; Sukanta
2017-12-01
Materials/goods handling is fundamental for companies seeking to ensure the smooth running of their warehouses. Efficiency and organization within every aspect of the business is essential in order to gain a competitive advantage. The purpose of this research is the design and development of a Kanban-based inventory storage and delivery system. The application aims to make inventory stock checks more efficient and effective. Users can easily record finished goods from the production department, warehouse, customers, and suppliers. The master data are designed to be as complete as possible so that the application can be used across a variety of warehouse logistics processes. The application is developed in Java as a Java Web application, with MySQL as the database. The system was developed using the Waterfall methodology, which proceeds through the stages of Analysis, System Design, Implementation, Integration, and Operation and Maintenance. Data were collected through observation, interviews, and a literature review.
NASA Astrophysics Data System (ADS)
Bruzzone, Agostino G.; Revetria, Roberto; Simeoni, Simone; Viazzo, Simone; Orsoni, Alessandra
2004-08-01
In logistics and industrial production, managers must deal with the impact of stochastic events to improve performance and reduce costs. In fact, production and logistics systems are generally designed treating some parameters as deterministic. While this assumption is mostly used for preliminary prototyping, it is sometimes also retained during the final design stage, especially for estimated parameters (i.e. market demand). The proposed methodology can determine the impact of stochastic events on the system by evaluating the chaotic threshold level. Such an approach, based on the application of an innovative methodology, can be implemented to find the conditions under which chaos makes the system uncontrollable. Starting from problem identification and risk assessment, several classification techniques are used to carry out an effect analysis and contingency plan estimation. In this paper the authors illustrate the methodology with a real industrial case: a production problem related to the logistics of distributed chemical processing.
Application of Six Sigma methodology to a diagnostic imaging process.
Taner, Mehmet Tolga; Sezen, Bulent; Atwat, Kamal M
2012-01-01
This paper aims to apply the Six Sigma methodology to improve workflow by eliminating the causes of failure in the medical imaging department of a private Turkish hospital. Implementation of the define, measure, analyse, improve and control (DMAIC) improvement cycle, workflow charts, fishbone diagrams and Pareto charts was employed, together with rigorous data collection in the department. The identification of root causes of repeat sessions and delays was followed by failure mode and effects analysis, hazard analysis and decision tree analysis. The most frequent causes of failure were malfunction of the RIS/PACS system and improper positioning of patients. Subsequent to extensive training of professionals, the sigma level was increased from 3.5 to 4.2. The data were collected over only four months. Six Sigma's data measurement and process improvement methodology is the impetus for health care organisations to rethink their workflow and reduce malpractice. It involves measuring, recording and reporting data on a regular basis, which enables the administration to monitor workflow continuously. The improvements in the workflow under study, made by determining the failures and potential risks associated with radiologic care, will have a positive impact on society in terms of patient safety. By eliminating repeat examinations, the risk of exposure to additional radiation was also minimised. This paper supports the need to apply Six Sigma and presents an evaluation of the process in an imaging department.
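For readers unfamiliar with sigma levels such as the 3.5-to-4.2 improvement reported above, here is a minimal sketch of the conventional defects-per-million-opportunities (DPMO) to sigma conversion with the customary 1.5-sigma shift; the defect counts are invented, and this is not code from the paper:

```python
from scipy.stats import norm

def sigma_level(defects, opportunities):
    """Convert an observed defect rate to a long-term Six Sigma level."""
    dpmo = defects / opportunities * 1_000_000     # defects per million
    # Inverse-normal of the yield, plus the conventional 1.5-sigma shift.
    return norm.ppf(1 - dpmo / 1_000_000) + 1.5

# Roughly 22,600 DPMO corresponds to a sigma level of about 3.5:
print(f"{sigma_level(defects=226, opportunities=10_000):.2f}")
```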
NASA Technical Reports Server (NTRS)
Taylor, Arthur C., III; Hou, Gene W.
1996-01-01
An incremental iterative formulation, together with the well-known spatially split approximate-factorization algorithm, is presented for solving the large, sparse systems of linear equations that are associated with aerodynamic sensitivity analysis. This formulation is also known as the 'delta' or 'correction' form. For smaller two-dimensional problems, a direct method can be applied to solve these linear equations in either the standard or the incremental form, in which case the two are equivalent. However, iterative methods are needed for larger two-dimensional and three-dimensional applications because direct methods require more computer memory than is currently available. Iterative methods for solving these equations in the standard form are generally unsatisfactory due to an ill-conditioned coefficient matrix; this problem is overcome when the equations are cast in the incremental form. The methodology is successfully implemented and tested using an upwind cell-centered finite-volume formulation applied in two dimensions to the thin-layer Navier-Stokes equations for external flow over an airfoil. In three dimensions this methodology is demonstrated with a marching-solution algorithm for the Euler equations to calculate supersonic flow over the High-Speed Civil Transport configuration (HSCT 24E). The sensitivity derivatives obtained with the incremental iterative method from a marching Euler code are used in a design-improvement study of the HSCT configuration that involves thickness, camber, and planform design variables.
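As a generic illustration of the incremental ('delta') form, in our notation rather than the paper's: instead of solving the sensitivity system \( A x = b \) directly with the exact, ill-conditioned operator \( A \), one iterates with an approximately factored operator \( P \approx A \),

\[
P\, \Delta x^{(n)} = b - A\, x^{(n)}, \qquad x^{(n+1)} = x^{(n)} + \Delta x^{(n)},
\]

so that at convergence \( \Delta x^{(n)} \to 0 \) and \( x \) satisfies the exact system regardless of how crude \( P \) is; the approximation affects only the convergence rate, while only the residual on the right-hand side must be evaluated with the exact operator.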
O'Dwyer, Gisele; Machado, Cristiani Vieira; Alves, Renan Paes; Salvador, Fernanda Gonçalves
2016-06-01
Mobile prehospital care is a key component of emergency care. The aim of this study was to analyze the implementation of the State of Rio de Janeiro's Mobile Emergency Medical Service (SAMU, acronym in Portuguese). The methodology employed included document analysis, visits to six SAMU emergency call centers, and semistructured interviews conducted with 12 local and state emergency care coordinators. The study's conceptual framework was based on Giddens' theory of structuration. Intergovernmental conflicts were observed between the state and municipal governments, and between municipal governments. Despite the shortage of hospital beds, the SAMUs in peripheral regions were better integrated with the emergency care network than the metropolitan SAMUs. The steering committees were not very active, and weaknesses were observed relating to the limited role played by the state government in funding, management, and monitoring. It was concluded that the SAMU implementation process in the state was marked by political tensions and by management and coordination weaknesses. As a result, serious drawbacks remain in the coordination of the SAMU with other health services and in the regionalization of emergency care in the state.
Patorno, Elisabetta; Patrick, Amanda R; Garry, Elizabeth M; Schneeweiss, Sebastian; Gillet, Victoria G; Bartels, Dorothee B; Masso-Gonzalez, Elvira; Seeger, John D
2014-11-01
Recent years have witnessed a growing body of observational literature on the association between glucose-lowering treatments and cardiovascular disease. However, many of the studies are based on designs or analyses that inadequately address the methodological challenges involved. We reviewed recent observational literature on the association between glucose-lowering medications and cardiovascular outcomes and assessed the design and analysis methods used, with a focus on their ability to address specific methodological challenges. We describe and illustrate these methodological issues and their impact on observed associations, providing examples from the reviewed literature. We suggest approaches that may be employed to manage these methodological challenges. From the evaluation of 81 publications of observational investigations assessing the association between glucose-lowering treatments and cardiovascular outcomes, we identified the following methodological challenges: 1) handling of temporality in administrative databases; 2) handling of risks that vary with time and treatment duration; 3) definitions of the exposure risk window; 4) handling of exposures that change over time; and 5) handling of confounding by indication. Most of these methodological challenges may be suitably addressed through application of appropriate methods. Observational research plays an increasingly important role in the evaluation of the clinical effects of diabetes treatment. Implementation of appropriate research methods holds the promise of reducing the potential for spurious findings and the risk that the spurious findings will mislead the medical community about risks and benefits of diabetes medications.
Implementing the 4D cycle of appreciative inquiry in health care: a methodological review.
Trajkovski, Suza; Schmied, Virginia; Vickers, Margaret; Jackson, Debra
2013-06-01
To examine and critique how the phases of the 4D cycle (Discovery, Dream, Design, and Destiny) of appreciative inquiry are implemented in a healthcare context. Appreciative inquiry is a theoretical research perspective, an emerging research methodology and a world view that builds on action research, organizational learning, and organizational change. An increasing number of published articles provide insights into its theoretical and philosophical underpinnings. Many articles describe appreciative inquiry and the outcomes of their studies; however, there is a gap in the literature examining the approaches commonly used to implement the 4D cycle in a healthcare context. A methodological review following systematic principles was undertaken, including articles from the inception of appreciative inquiry in 1986 to the time of writing this review in November 2011. Key database searches included CINAHL, Emerald, MEDLINE, PubMed, PsycINFO, and Scopus. Studies were included if they described in detail the methods used to implement the 4D cycle of appreciative inquiry in a healthcare context. Nine qualitative studies met the inclusion criteria. Results highlighted that the application of appreciative inquiry was unique and varied between studies. The 4D phases were not rigid steps and were adapted to the setting and participants. Overall, participant enthusiasm and commitment were highlighted, suggesting appreciative inquiry was mostly positively perceived by participants. Appreciative inquiry provides a positive way forward, shifting from problems to solutions, and offering a new way of practicing in health care and health research. © 2012 Blackwell Publishing Ltd.
NASA Astrophysics Data System (ADS)
Verardo, E.; Atteia, O.; Rouvreau, L.
2015-12-01
In-situ bioremediation is a commonly used remediation technology for cleaning up the subsurface of petroleum-contaminated sites. Forecasting remedial performance (in terms of flux and mass reduction) is a challenge due to the uncertainties associated with source properties and with the contribution and efficiency of concentration-reducing mechanisms. In this study, predictive uncertainty analysis of bioremediation system efficiency is carried out with the null-space Monte Carlo (NSMC) method, which combines the calibration solution-space parameters with an ensemble of null-space parameters, creating sets of calibration-constrained parameters used as input for follow-on remedial efficiency predictions. The first step in the NSMC methodology for uncertainty analysis is model calibration. The model was calibrated by matching simulated BTEX concentrations to a total of 48 observations from historical data collected before implementation of treatment. Two different bioremediation designs were then implemented in the calibrated model: the first consists of pumping/injection wells, the second of a permeable barrier coupled with infiltration across slotted piping. The NSMC method was used to calculate 1000 calibration-constrained parameter sets for the two models. Several variants of the method were implemented to investigate their effect on its efficiency. The first variant is based on a single calibrated model. In the second variant, models were calibrated from different initial parameter sets, and NSMC calibration-constrained parameter sets were sampled from these different calibrated models. We demonstrate that, in the context of a nonlinear model, the second variant avoids underestimating parameter uncertainty, which would otherwise lead to a poor quantification of predictive uncertainty. Application of the proposed approach to manage bioremediation of groundwater at a real site shows that it effectively supports management of in-situ bioremediation systems. Moreover, this study demonstrates that the NSMC method provides a computationally efficient and practical way of applying model predictive uncertainty methods in environmental management.
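A minimal sketch of the null-space projection at the heart of NSMC, under simplifying assumptions (a linearized, weighted Jacobian and Gaussian perturbations; real applications typically add a re-calibration step, cf. the second variant discussed above). The function and its interface are ours, for illustration only:

```python
import numpy as np

def nsmc_parameter_sets(p_cal, J, n_sets, n_solution_dims, sigma=1.0, seed=0):
    """Generate calibration-constrained parameter sets via null-space Monte Carlo.

    p_cal:           calibrated parameter vector, shape (n,)
    J:               weighted Jacobian of observations w.r.t. parameters, (m, n)
    n_solution_dims: number of singular vectors spanning the solution space
    """
    rng = np.random.default_rng(seed)
    # SVD splits parameter space into a data-constrained solution space
    # and an unconstrained null space.
    _, _, Vt = np.linalg.svd(J, full_matrices=True)
    V_null = Vt[n_solution_dims:].T              # (n, n - n_solution_dims)
    sets = []
    for _ in range(n_sets):
        dp = sigma * rng.standard_normal(len(p_cal))  # random perturbation
        dp_null = V_null @ (V_null.T @ dp)            # project onto null space
        sets.append(p_cal + dp_null)                  # stays (nearly) calibrated
    return np.array(sets)
```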
1987-03-01
contends his soft systems methodology is such an approach. [Ref. 2: pp. 105-107] Overview of this Methodology is meant for addressing fuzzy, ill...could form the basis of office systems development: Checkland's (1981) soft systems methodology, Pava's (1983) sociotechnical design, and Mumford and
Marin-Garcia, Ignacio; Chavez-Burbano, Patricia; Guerra, Victor; Rabadan, Jose; Perez-Jimenez, Rafael
2017-01-01
Visible Light Communications (VLC) is a cutting-edge data communication technology that is being considered for implementation in a wide range of applications, such as inter-vehicle communication and Local Area Network (LAN) communication. As a novel technology, some aspects of the implementation of VLC have not been deeply considered or tested. Among these aspects, security and its implementation may become an obstacle to VLC's broad usage. In this article, we have used the well-known Risk Matrix methodology to determine the relative risk that several common attacks pose in a VLC network. Four examples, a War Driving, a Queensland-like Denial of Service, a Pre-shared Key Cracking, and an Evil Twin attack, illustrate the application of the methodology to a VLC implementation. The chosen attacks also cover the different areas delimited by the attack taxonomy used in this work. By defining and determining which attacks present a greater risk, the results of this work indicate which areas should be invested in to increase the security of VLC networks.
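A minimal sketch of how a risk matrix assigns a qualitative level to an attack; the 1-5 scales, cut-offs, and example scores below are assumptions for illustration, not the scales used in the article:

```python
def risk_level(likelihood, severity):
    """Map 1-5 likelihood/severity scores to a qualitative risk level."""
    score = likelihood * severity          # simple multiplicative risk index
    if score >= 15:
        return "extreme"
    if score >= 8:
        return "high"
    if score >= 4:
        return "moderate"
    return "low"

# e.g. an attack judged likely (4) with major impact (4):
print(risk_level(4, 4))   # -> "extreme"
```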
System Analysis Applied to Autonomy: Application to Human-Rated Lunar/Mars Landers
NASA Technical Reports Server (NTRS)
Young, Larry A.
2006-01-01
System analysis is an essential technical discipline for the modern design of spacecraft and their associated missions. Specifically, system analysis is a powerful aid in identifying and prioritizing the required technologies needed for mission and/or vehicle development efforts. Maturation of intelligent systems technologies, and their incorporation into spacecraft systems, are dictating the development of new analysis tools, and incorporation of such tools into existing system analysis methodologies, in order to fully capture the trade-offs of autonomy on vehicle and mission success. A "system analysis of autonomy" methodology will be outlined and applied to a set of notional human-rated lunar/Mars lander missions toward answering these questions: 1. what is the optimum level of vehicle autonomy and intelligence required? and 2. what are the specific attributes of an autonomous system implementation essential for a given surface lander mission/application in order to maximize mission success? Future human-rated lunar/Mars landers, though nominally under the control of their crew, will, nonetheless, be highly automated systems. These automated systems will range from mission/flight control functions, to vehicle health monitoring and prognostication, to life-support and other "housekeeping" functions. The optimum degree of autonomy afforded to these spacecraft systems/functions has profound implications from an exploration system architecture standpoint.
Thow, A M; Snowdon, W; Schultz, J T; Leeder, S; Vivili, P; Swinburn, B A
2011-11-01
There is global interest in using multisectoral policy approaches to improve diets, and reduce obesity and non-communicable disease. However, there has been ad hoc implementation, which in some sectors such as the economic sector has been very limited, because of the lack of quality evidence on potential costs and impacts, and the inherent challenges associated with cross-sectoral policy development and implementation. The Pacific Obesity Prevention in Communities food policy project aimed to inform relevant policy development and implementation in Pacific Island countries. The project developed an innovative participatory approach to identifying and assessing potential policy options in terms of their effectiveness and feasibility. It also used policy analysis methodology to assess three policy initiatives to reduce fatty meat availability and four soft drink taxes in the region, in order to identify strategies for supporting effective policy implementation. © 2011 The Authors. obesity reviews © 2011 International Association for the Study of Obesity.
Toroody, Ahmad Bahoo; Abaei, Mohammad Mahdy; Gholamnia, Reza
2016-12-01
Risk assessment can be classified into two broad categories: traditional and modern. This paper aims to contrast the functional resonance analysis method (FRAM), as a modern approach, with fault tree analysis (FTA), as a traditional method, in assessing the risks of a complex system. The methodology by which the risk assessment is carried out is presented for each approach. Also, the FRAM network is executed with regard to the nonlinear interaction of the human and organizational levels to assess the safety of technological systems. The methodology is implemented for the lifting of structures in deep offshore operations. The main finding of this paper is that the combined application of FTA and FRAM during risk assessment can provide complementary perspectives and may contribute to a more comprehensive understanding of an incident. Finally, it is shown that coupling a FRAM network with a suitable quantitative method will result in a plausible outcome for a predefined accident scenario.
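For contrast with FRAM's qualitative network view, here is a minimal sketch of the quantitative side of FTA: computing a top-event probability from independent basic events combined through AND/OR gates. The events and probabilities are invented, not the paper's offshore lifting model:

```python
def and_gate(*probs):
    """All inputs must fail for the gate output to fail."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    """The gate output fails if any input fails (independence assumed)."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical basic-event probabilities per lifting operation:
crane_fault, operator_error, sensor_fault = 1e-3, 5e-3, 2e-3
top_event = or_gate(crane_fault, and_gate(operator_error, sensor_fault))
print(f"P(top event) = {top_event:.3e}")
```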
Pattern recognition of satellite cloud imagery for improved weather prediction
NASA Technical Reports Server (NTRS)
Gautier, Catherine; Somerville, Richard C. J.; Volfson, Leonid B.
1986-01-01
The major accomplishment was the successful development of a method for extracting time derivative information from geostationary meteorological satellite imagery. This research is a proof-of-concept study which demonstrates the feasibility of using pattern recognition techniques and a statistical cloud classification method to estimate time rate of change of large-scale meteorological fields from remote sensing data. The cloud classification methodology is based on typical shape function analysis of parameter sets characterizing the cloud fields. The three specific technical objectives, all of which were successfully achieved, are as follows: develop and test a cloud classification technique based on pattern recognition methods, suitable for the analysis of visible and infrared geostationary satellite VISSR imagery; develop and test a methodology for intercomparing successive images using the cloud classification technique, so as to obtain estimates of the time rate of change of meteorological fields; and implement this technique in a testbed system incorporating an interactive graphics terminal to determine the feasibility of extracting time derivative information suitable for comparison with numerical weather prediction products.
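As a hedged stand-in for the image-intercomparison step (the study's typical shape function classifier is not reproduced here), the sketch below classifies two synthetic two-channel images with a generic clustering method and differences the class coverage fractions to estimate a time rate of change; all data and the half-hour interval are illustrative assumptions.

```python
# Stand-in for the intercomparison idea: cluster two successive synthetic
# two-channel (visible/infrared) images, then finite-difference the class
# coverage fractions to estimate a time derivative. Not the study's method.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)

def fake_image(shift):
    vis = rng.random((64, 64)) + shift   # visible channel
    ir = rng.random((64, 64)) - shift    # infrared channel
    return np.stack([vis.ravel(), ir.ravel()], axis=1)

img_t0, img_t1 = fake_image(0.0), fake_image(0.3)
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(img_t0)

dt_hours = 0.5                           # assumed geostationary image interval
for k in range(3):
    f0 = np.mean(km.predict(img_t0) == k)   # coverage fraction at t0
    f1 = np.mean(km.predict(img_t1) == k)   # coverage fraction at t1
    print(f"class {k}: d(coverage)/dt = {(f1 - f0) / dt_hours:+.3f} per hour")
```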
Richard, Joshua; Galloway, Jack; Fensin, Michael; ...
2015-04-04
A novel object-oriented modular mapping methodology for externally coupled neutronics–thermal hydraulics multiphysics simulations was developed. The Simulator using MCNP with Integrated Thermal-Hydraulics for Exploratory Reactor Studies (SMITHERS) code performs on-the-fly mapping of material-wise power distribution tallies implemented by MCNP-based neutron transport/depletion solvers for use in estimating coolant temperature and density distributions with a separate thermal-hydraulic solver. The key development of SMITHERS is that it reconstructs the hierarchical geometry structure of the material-wise power generation tallies from the depletion solver automatically, with only a modicum of additional information required from the user. In addition, it performs the basis mapping from the combinatorial geometry of the depletion solver to the required geometry of the thermal-hydraulic solver in a generalizable manner, such that it can transparently accommodate varying levels of thermal-hydraulic solver geometric fidelity, from the nodal geometry of multi-channel analysis solvers to the pin-cell level of discretization for sub-channel analysis solvers.
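A minimal sketch of the mapping idea, not the SMITHERS code itself: material-wise power tallies are rolled up onto coarser thermal-hydraulic channels through a correspondence table. All names and numbers below are hypothetical, and a real mapping would handle cells split fractionally across channels.

```python
# Illustrative sketch of mapping fine-grained power tallies onto coarser
# thermal-hydraulic nodes. This is NOT the SMITHERS code; the tally values
# and the cell-to-channel table are hypothetical.
from collections import defaultdict

# material cell -> (power in W, volume in cm^3)
tallies = {"fuel_101": (1.2e4, 50.0), "fuel_102": (1.1e4, 50.0),
           "fuel_201": (0.9e4, 50.0), "fuel_202": (0.8e4, 50.0)}

# material cell -> thermal-hydraulic channel (one-to-one here; in general a
# cell may map to several channels with fractional volume weights)
cell_to_channel = {"fuel_101": "ch_1", "fuel_102": "ch_1",
                   "fuel_201": "ch_2", "fuel_202": "ch_2"}

channel_power = defaultdict(float)
for cell, (power, _volume) in tallies.items():
    channel_power[cell_to_channel[cell]] += power

for ch, p in sorted(channel_power.items()):
    print(f"{ch}: {p / 1e3:.1f} kW")
```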
NASA Technical Reports Server (NTRS)
Yau, M.; Guarro, S.; Apostolakis, G.
1993-01-01
Dynamic Flowgraph Methodology (DFM) is a new approach developed to integrate the modeling and analysis of the hardware and software components of an embedded system. The objective is to complement the traditional approaches which generally follow the philosophy of separating out the hardware and software portions of the assurance analysis. In this paper, the DFM approach is demonstrated using the Titan 2 Space Launch Vehicle Digital Flight Control System. The hardware and software portions of this embedded system are modeled in an integrated framework. In addition, the time dependent behavior and the switching logic can be captured by this DFM model. In the modeling process, it is found that constructing decision tables for software subroutines is very time consuming. A possible solution is suggested. This approach makes use of a well-known numerical method, the Newton-Raphson method, to solve the equations implemented in the subroutines in reverse. Convergence can be achieved in a few steps.
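The reverse-solving suggestion can be illustrated with a toy example: given a subroutine computing y = f(x), Newton-Raphson is applied to g(x) = f(x) - y* to recover the input that produces a target output. The subroutine below is a hypothetical monotone stand-in, not part of the Titan flight software.

```python
# Hedged sketch of solving a subroutine "in reverse" with Newton-Raphson.
# The subroutine f below is a hypothetical stand-in.

def f(x):
    return x**3 + 2.0 * x - 5.0  # stand-in for a monotone subroutine

def solve_inverse(f, y_target, x0=1.0, tol=1e-10, max_iter=50, h=1e-6):
    """Find x such that f(x) = y_target via Newton-Raphson on f(x) - y_target."""
    x = x0
    for _ in range(max_iter):
        g = f(x) - y_target
        if abs(g) < tol:
            return x
        dg = (f(x + h) - f(x - h)) / (2.0 * h)  # central-difference derivative
        x -= g / dg
    return x

x_star = solve_inverse(f, y_target=10.0)
print(x_star, f(x_star))  # converges in a few steps, as the abstract notes
```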
NASA Astrophysics Data System (ADS)
de Vito, Rossella; Portoghese, Ivan; Pagano, Alessandro; Fratino, Umberto; Vurro, Michele
2017-12-01
Increasing pressure affects water resources, especially in the agricultural sector, with cascading impacts on energy consumption. This is particularly relevant in the Mediterranean area, which shows significant water scarcity problems, further exacerbated by the crucial economic role of agricultural production. Assessing the sustainability of water resource use is thus essential to preserving ecosystems and maintaining high levels of agricultural productivity. This paper proposes an integrated methodology based on the Water-Energy-Food Nexus to evaluate the multi-dimensional implications of irrigation practices. Three different indices are introduced, based on an analysis of the most influential factors. The methodology is then implemented in a catchment located in Puglia (Italy) and a comparative analysis of the three indices is presented. The results mainly highlight that economic land productivity is a key driver of irrigated agriculture, and that groundwater is highly affordable compared to surface water, and is thus often dangerously perceived as freely available.
Development of an Aeroelastic Modeling Capability for Transient Nozzle Side Load Analysis
NASA Technical Reports Server (NTRS)
Wang, Ten-See; Zhao, Xiang; Zhang, Sijun; Chen, Yen-Sen
2013-01-01
Lateral nozzle forces are known to cause severe structural damage to any new rocket engine in development during testing. While three-dimensional, transient, turbulent, chemically reacting computational fluid dynamics methodology has been demonstrated to capture major side load physics with rigid nozzles, hot-fire tests often show nozzle structural deformation during major side load events, leading to structural damage if strengthening measures are not taken. The modeling picture is incomplete without the capability to address the two-way responses between the structure and the fluid. The objective of this study is to develop a coupled aeroelastic modeling capability by implementing the necessary structural dynamics component into an anchored computational fluid dynamics methodology. The computational fluid dynamics component is based on an unstructured-grid, pressure-based formulation, while the computational structural dynamics component is developed in the framework of modal analysis. Transient aeroelastic nozzle startup analyses of the Block I Space Shuttle Main Engine at sea level were performed. The computed results from the aeroelastic nozzle modeling are presented.
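A minimal sketch of the modal-analysis idea underlying such a structural dynamics component, assuming a single mode and a hypothetical forcing that stands in for the projected CFD wall pressure; it is not the anchored methodology itself, and the frequency, damping, and load values are invented.

```python
# Modal structural-dynamics step of the kind used in CFD/CSD coupling:
# each mode obeys q'' + 2*zeta*w*q' + w^2*q = Q(t), with the generalized
# force Q obtained by projecting wall pressure onto the mode shape.
# All numbers and the forcing function here are hypothetical.
import numpy as np

def modal_step(q, qdot, Q, omega, zeta, dt):
    """Advance one mode by dt with semi-implicit Euler."""
    qddot = Q - 2.0 * zeta * omega * qdot - omega**2 * q
    qdot_new = qdot + dt * qddot
    q_new = q + dt * qdot_new
    return q_new, qdot_new

omega, zeta, dt = 2.0 * np.pi * 15.0, 0.01, 1.0e-4  # assumed 15 Hz bending mode
q, qdot = 0.0, 0.0
for n in range(1000):
    Q = 1.0e3 * np.sin(2.0 * np.pi * 12.0 * n * dt)  # stand-in for the CFD load
    q, qdot = modal_step(q, qdot, Q, omega, zeta, dt)
print(f"modal amplitude after 0.1 s: {q:.4e}")
```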
Design and Analysis of Cognitive Interviews for Comparative Multinational Testing
Fitzgerald, Rory; Padilla, José-Luis; Willson, Stephanie; Widdop, Sally; Caspar, Rachel; Dimov, Martin; Gray, Michelle; Nunes, Cátia; Prüfer, Peter; Schöbi, Nicole; Schoua-Glusberg, Alisú
2011-01-01
This article summarizes the work of the Comparative Cognitive Testing Workgroup, an international coalition of survey methodologists interested in developing an evidence-based methodology for examining the comparability of survey questions within cross-cultural or multinational contexts. To meet this objective, it was necessary to ensure that the cognitive interviewing (CI) method itself did not introduce method bias. Therefore, the workgroup first identified specific characteristics inherent in CI methodology that could undermine the comparability of CI evidence. The group then developed and implemented a protocol addressing those issues. In total, 135 cognitive interviews were conducted by participating countries. Through the process, the group identified various interpretive patterns resulting from sociocultural and language-related differences among countries as well as other patterns of error that would impede comparability of survey data.
Adapting Western research methods to indigenous ways of knowing.
Simonds, Vanessa W; Christopher, Suzanne
2013-12-01
Indigenous communities have long experienced exploitation by researchers and increasingly require participatory and decolonizing research processes. We present a case study of an intervention research project to exemplify a clash between Western research methodologies and Indigenous methodologies and how we attempted reconciliation. We then provide implications for future research based on lessons learned from Native American community partners who voiced concern over methods of Western deductive qualitative analysis. Decolonizing research requires constant reflective attention and action, and there is an absence of published guidance for this process. Continued exploration is needed for implementing Indigenous methods alone or in conjunction with appropriate Western methods when conducting research in Indigenous communities. Currently, examples of Indigenous methods and theories are not widely available in academic texts or published articles, and are often not perceived as valid.
Population forecasts for Bangladesh, using a Bayesian methodology.
Mahsin, Md; Hossain, Syed Shahadat
2012-12-01
Population projection for many developing countries can be quite a challenging task for demographers, mostly due to the lack of sufficient reliable data. The objective of this paper is to present an overview of the existing methods for population forecasting and to propose an alternative based on Bayesian statistics, which combines the formality of statistical inference with expert judgement. The analysis has been made using the Markov Chain Monte Carlo (MCMC) technique for Bayesian methodology, available with the software WinBUGS. Convergence diagnostic techniques available with the WinBUGS software have been applied to ensure the convergence of the chains necessary for the implementation of MCMC. The Bayesian approach allows for the use of observed data and expert judgements by means of appropriate priors, and more realistic population forecasts, along with their associated uncertainty, have been possible.
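A hedged sketch of the Bayesian forecasting idea, using a hand-rolled Metropolis-Hastings sampler rather than the paper's WinBUGS model; the census counts, the Poisson growth likelihood, and the expert prior below are illustrative assumptions only.

```python
# Toy posterior sampling of an exponential growth rate r from census-like
# counts y_t with intensity P0*exp(r*t), and a Normal prior on r encoding
# expert judgement. Data, likelihood, and prior are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(6)                              # decades, hypothetical
y = np.array([76, 89, 106, 124, 144, 165])    # counts, illustrative only
P0 = y[0]

def log_post(r):
    lam = P0 * np.exp(r * t)
    loglik = np.sum(y * np.log(lam) - lam)        # Poisson kernel
    logprior = -0.5 * ((r - 0.15) / 0.05) ** 2    # assumed expert prior on r
    return loglik + logprior

r, chain = 0.1, []
for _ in range(20000):
    prop = r + rng.normal(0, 0.005)               # random-walk proposal
    if np.log(rng.random()) < log_post(prop) - log_post(r):
        r = prop
    chain.append(r)

post = np.array(chain[5000:])                     # drop burn-in
lo, hi = np.quantile(post, [0.025, 0.975])
print(f"posterior mean r = {post.mean():.4f}, 95% interval ({lo:.4f}, {hi:.4f})")
```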
Modeling Multibody Stage Separation Dynamics Using Constraint Force Equation Methodology
NASA Technical Reports Server (NTRS)
Tartabini, Paul V.; Roithmayr, Carlos M.; Toniolo, Matthew D.; Karlgaard, Christopher D.; Pamadi, Bandu N.
2011-01-01
This paper discusses the application of the constraint force equation methodology and its implementation for multibody separation problems using three specially designed test cases. The first test case involves two rigid bodies connected by a fixed joint, the second case involves two rigid bodies connected with a universal joint, and the third test case is that of Mach 7 separation of the X-43A vehicle. For the first two cases, the solutions obtained using the constraint force equation method compare well with those obtained using industry-standard benchmark codes. For the X-43A case, the constraint force equation solutions show reasonable agreement with the flight-test data. Use of the constraint force equation method facilitates the analysis of stage separation in end-to-end simulations of launch vehicle trajectories.
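The core computation can be sketched as follows: for constraints phi(q) = 0 with Jacobian J, the multipliers lambda in M*qdd = f + J^T*lambda follow from a small linear solve using the twice-differentiated constraint. The two-body fixed-joint example below uses hypothetical masses and forces, not the paper's test cases.

```python
# Constraint-force sketch for two planar point masses joined so that their
# positions coincide: solve J Minv J^T lam = -(J Minv f + Jdot qd), then
# recover accelerations. Masses and applied forces are hypothetical.
import numpy as np

M = np.diag([10.0, 10.0, 2.0, 2.0])       # [x1, y1, x2, y2] masses
f = np.array([5.0, -98.1, 0.0, -19.62])   # gravity + lateral thrust on body 1
J = np.array([[1.0, 0.0, -1.0, 0.0],      # fixed joint: x1 - x2 = 0
              [0.0, 1.0, 0.0, -1.0]])     #              y1 - y2 = 0
Jdot_qd = np.zeros(2)                     # J is constant here

Minv = np.linalg.inv(M)
lam = np.linalg.solve(J @ Minv @ J.T, -(J @ Minv @ f + Jdot_qd))
qdd = Minv @ (f + J.T @ lam)
print("constraint forces:", lam)          # forces holding the joint together
print("accelerations:", qdd)              # both bodies accelerate identically
```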
Fracture Mechanics for Composites: State of the Art and Challenges
NASA Technical Reports Server (NTRS)
Krueger, Ronald
2006-01-01
Interlaminar fracture mechanics has proven useful for characterizing the onset of delaminations in composites and has been used, with limited success, primarily to investigate onset in fracture toughness specimens and laboratory-size coupon-type specimens. Future acceptance of the methodology by industry and certification authorities, however, requires the successful demonstration of the methodology on the structural level. In this paper, the state of the art in fracture toughness characterization and interlaminar fracture mechanics analysis tools are described. To demonstrate the application on the structural level, a stringer-reinforced panel was selected. Full implementation of interlaminar fracture mechanics in design, however, remains a challenge and requires a continuing development effort of codes to calculate energy release rates and advancements in delamination onset and growth criteria under mixed-mode conditions.
Interrupted time series regression for the evaluation of public health interventions: a tutorial.
Bernal, James Lopez; Cummins, Steven; Gasparrini, Antonio
2017-02-01
Interrupted time series (ITS) analysis is a valuable study design for evaluating the effectiveness of population-level health interventions that have been implemented at a clearly defined point in time. It is increasingly being used to evaluate the effectiveness of interventions ranging from clinical therapy to national public health legislation. Whereas the design shares many properties of regression-based approaches in other epidemiological studies, there are a range of unique features of time series data that require additional methodological considerations. In this tutorial we use a worked example to demonstrate a robust approach to ITS analysis using segmented regression. We begin by describing the design and considering when ITS is an appropriate design choice. We then discuss the essential, yet often omitted, step of proposing the impact model a priori. Subsequently, we demonstrate the approach to statistical analysis including the main segmented regression model. Finally we describe the main methodological issues associated with ITS analysis: over-dispersion of time series data, autocorrelation, adjusting for seasonal trends and controlling for time-varying confounders, and we also outline some of the more complex design adaptations that can be used to strengthen the basic ITS design.
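A minimal illustration of the segmented regression model the tutorial describes, fitted to simulated monthly data rather than the tutorial's worked example, with level- and slope-change terms and a Newey-West (HAC) correction as one simple way to acknowledge autocorrelation; the data-generating numbers are invented.

```python
# Segmented regression for an ITS design on simulated data: a level-change
# dummy and a slope-change term around a known intervention point.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n, t0 = 48, 24                                   # 48 months, intervention at 24
df = pd.DataFrame({"time": np.arange(n)})
df["post"] = (df.time >= t0).astype(int)         # level change
df["time_since"] = np.maximum(0, df.time - t0)   # slope change
df["rate"] = (50 + 0.2 * df.time - 6 * df.post
              - 0.3 * df.time_since + rng.normal(0, 2, n))

fit = smf.ols("rate ~ time + post + time_since", data=df).fit(
    cov_type="HAC", cov_kwds={"maxlags": 3})     # autocorrelation-robust SEs
print(fit.params)                                # 'post' ~ immediate level change
```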
Bayesian network meta-analysis for cluster randomized trials with binary outcomes.
Uhlmann, Lorenz; Jensen, Katrin; Kieser, Meinhard
2017-06-01
Network meta-analysis is becoming a common approach to combine direct and indirect comparisons of several treatment arms. In recent research, there have been various developments and extensions of the standard methodology. Simultaneously, cluster randomized trials are experiencing an increased popularity, especially in the field of health services research, where, for example, medical practices are the units of randomization but the outcome is measured at the patient level. Combination of the results of cluster randomized trials is challenging. In this tutorial, we examine and compare different approaches for the incorporation of cluster randomized trials in a (network) meta-analysis. Furthermore, we provide practical insight on the implementation of the models. In simulation studies, it is shown that some of the examined approaches lead to unsatisfying results. However, there are alternatives which are suitable to combine cluster randomized trials in a network meta-analysis as they are unbiased and reach accurate coverage rates. In conclusion, the methodology can be extended in such a way that an adequate inclusion of the results obtained in cluster randomized trials becomes feasible. Copyright © 2016 John Wiley & Sons, Ltd.
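One common, simple device for incorporating a cluster randomized trial into a meta-analysis (hedged: the tutorial compares several approaches, and this sketch shows only the design-effect correction) is to deflate each arm by DE = 1 + (m - 1) * ICC before computing the effect size; the trial data below are hypothetical.

```python
# Design-effect adjustment: shrink events and sample sizes by DE so the
# cluster trial contributes its effective information to the pooled analysis.
import math

def log_or_cluster_adjusted(events1, n1, events2, n2, mean_cluster_size, icc):
    de = 1.0 + (mean_cluster_size - 1.0) * icc   # design effect
    e1, m1 = events1 / de, n1 / de               # effective counts, arm 1
    e2, m2 = events2 / de, n2 / de               # effective counts, arm 2
    log_or = math.log((e1 / (m1 - e1)) / (e2 / (m2 - e2)))
    var = 1.0 / e1 + 1.0 / (m1 - e1) + 1.0 / e2 + 1.0 / (m2 - e2)
    return log_or, var

log_or, var = log_or_cluster_adjusted(120, 400, 90, 400,
                                      mean_cluster_size=20, icc=0.05)
print(f"log OR = {log_or:.3f}, SE = {math.sqrt(var):.3f}")
```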
Analysis of in vitro fertilization data with multiple outcomes using discrete time-to-event analysis
Maity, Arnab; Williams, Paige; Ryan, Louise; Missmer, Stacey; Coull, Brent; Hauser, Russ
2014-01-01
In vitro fertilization (IVF) is an increasingly common method of assisted reproductive technology. Because of the careful observation and follow-up required as part of the procedure, IVF studies provide an ideal opportunity to identify and assess clinical and demographic factors, along with environmental exposures, that may impact successful reproduction. A major challenge in analyzing data from IVF studies is handling the complexity and multiplicity of outcomes, resulting from multiple opportunities for pregnancy loss within a single IVF cycle as well as from multiple IVF cycles. To date, most evaluations of IVF studies do not make full use of the data due to its complex structure. In this paper, we develop statistical methodology for analysis of IVF data with multiple cycles and possibly multiple failure types observed for each individual. We develop a general analysis framework based on a generalized linear modeling formulation that allows implementation of various types of models, including shared frailty models, failure-specific frailty models, and transitional models, using standard software. We apply our methodology to data from an IVF study conducted at the Brigham and Women’s Hospital, Massachusetts. We also summarize the performance of our proposed methods based on a simulation study.
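A hedged sketch of the discrete time-to-event formulation via a generalized linear model: expand each cycle into one row per at-risk stage (person-period format) and fit a logistic regression for the stage-specific hazard with standard software. The data below are simulated and the covariate is hypothetical; the authors' frailty and transitional extensions are not shown.

```python
# Discrete-time survival via a person-period logistic GLM: stage dummies
# give the baseline hazard, coefficients are on the log-odds scale.
# All data and the age35 covariate are simulated/hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 300
pp = pd.DataFrame({
    "stage": rng.choice(["fertilization", "implantation", "clinical"], n),
    "age35": rng.integers(0, 2, n),           # hypothetical covariate
})
base = {"fertilization": -1.5, "implantation": -1.0, "clinical": -0.5}
logit_p = pp["stage"].map(base) + 0.6 * pp["age35"]
pp["fail"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))).astype(int)

fit = smf.logit("fail ~ C(stage) + age35", data=pp).fit(disp=0)
print(fit.params)                             # stage-specific hazard + covariate
```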
Gotovac, Sandra; Espinet, Stacey; Naqvi, Reza; Lingard, Lorelei; Steele, Margaret
2018-04-01
The need for child/adolescent mental health care in Canada is growing. Primary care can play a key role in filling this gap, yet most providers feel they do not have adequate training. This paper reviews the Canadian literature on capacity building programs in child and adolescent psychiatry for primary care providers, to examine how these programs are being implemented and evaluated to contribute to evidence-based initiatives. A systematic literature review was conducted of peer-reviewed published articles on capacity building initiatives in child/adolescent mental health care for primary care practitioners that have been implemented in Canada. Sixteen articles were identified that met inclusion criteria. Analysis revealed that capacity building initiatives in Canada are varied, but rigorous evaluation methodology is lacking. Primary care providers welcome efforts to increase mental health care capacity and were satisfied with the implementation of most programs. Drawing objective conclusions regarding the effectiveness of these programs at increasing mental health care capacity is challenging given the evaluation methodology of these studies. Rigorous evaluation methods are needed to make evidence-based decisions on ways forward to build child/adolescent mental health care capacity in primary care. Outcome measures need to move beyond self-report to more objective measures, and should expand the measurement of patient outcomes to ensure that these initiatives are indeed leading to improved care for families.
A structured multi-stakeholder learning process for Sustainable Land Management.
Schwilch, Gudrun; Bachmann, Felicitas; Valente, Sandra; Coelho, Celeste; Moreira, Jorge; Laouina, Abdellah; Chaker, Miloud; Aderghal, Mohamed; Santos, Patricia; Reed, Mark S
2012-09-30
There are many, often competing, options for Sustainable Land Management (SLM). Each must be assessed - and sometimes negotiated - prior to implementation. Participatory, multi-stakeholder approaches to identification and selection of SLM options are increasingly popular, often motivated by social learning and empowerment goals. Yet there are few practical tools for facilitating processes in which land managers may share, select, and decide on the most appropriate SLM options. The research presented here aims to close the gap between the theory and the practice of stakeholder participation/learning in SLM decision-making processes. The paper describes a three-part participatory methodology for selecting SLM options that was tested in 14 desertification-prone study sites within the EU-DESIRE project. Cross-site analysis and in-depth evaluation of the Moroccan and Portuguese sites were used to evaluate how well the proposed process facilitated stakeholder learning and selection of appropriate SLM options for local implementation. The structured nature of the process - starting with SLM goal setting - was found to facilitate mutual understanding and collaboration between stakeholders. The deliberation process led to a high degree of consensus over the outcome and, though not an initial aim, it fostered social learning in many cases. This solution-oriented methodology is applicable in a wide range of contexts and may be implemented with limited time and resources. Copyright © 2012 Elsevier Ltd. All rights reserved.
Identifying Rodent Resting-State Brain Networks with Independent Component Analysis
Bajic, Dusica; Craig, Michael M.; Mongerson, Chandler R. L.; Borsook, David; Becerra, Lino
2017-01-01
Rodent models have opened the door to a better understanding of the neurobiology of brain disorders and increased our ability to evaluate novel treatments. Resting-state functional magnetic resonance imaging (rs-fMRI) allows for in vivo exploration of large-scale brain networks with high spatial resolution. Its application in rodents affords researchers a powerful translational tool to directly assess/explore the effects of various pharmacological, lesion, and/or disease states on known neural circuits within highly controlled settings. Integration of animal and human research at the molecular, systems, and behavioral levels using diverse neuroimaging techniques empowers more robust interrogations of abnormal/pathological processes, critical for evolving our understanding of neuroscience. We present a comprehensive protocol to evaluate resting-state brain networks using Independent Component Analysis (ICA) in a rodent model. Specifically, we begin with a brief review of the physiological basis for the rs-fMRI technique and an overview of rs-fMRI studies in rodents to date, following which we provide a robust step-by-step approach for rs-fMRI investigation, including data collection, computational preprocessing, and brain network analysis. Pipelines are interwoven with the underlying theory behind each step and summarized methodological considerations, such as alternative methods available and the current consensus in the literature for optimal results. The presented protocol is designed in such a way that investigators without previous knowledge in the field can implement the analysis and obtain viable results that reliably detect significant differences in functional connectivity between experimental groups. Our goal is to empower researchers to implement rs-fMRI in their respective fields by incorporating technical considerations to date into a workable methodological framework.
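A minimal sketch of the ICA step on synthetic rs-fMRI-like data: decompose a time-by-voxel matrix into spatially independent maps and their time courses. This collapses the protocol's preprocessing and group-analysis detail into a single illustrative call, and the "networks" below are fabricated for the demonstration.

```python
# Spatial ICA on a synthetic time-by-voxel matrix with scikit-learn.
# Two planted "networks" (voxel maps driven by distinct time courses)
# should be recovered as the independent components.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)
n_time, n_vox = 200, 500
maps = np.zeros((2, n_vox))
maps[0, :100] = 1.0
maps[1, 250:350] = 1.0
tcs = np.stack([np.sin(np.linspace(0, 20, n_time)),
                np.sign(np.sin(np.linspace(0, 7, n_time)))])
X = tcs.T @ maps + 0.2 * rng.standard_normal((n_time, n_vox))  # time x voxels

# treat voxels as samples so the recovered sources are spatial maps
ica = FastICA(n_components=2, random_state=0, max_iter=1000)
spatial_maps = ica.fit_transform(X.T)    # (voxels, components)
timecourses = ica.mixing_                # (time, components)
print(spatial_maps.shape, timecourses.shape)
```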
Cummings, Amanda; Lund, Susi; Campling, Natasha; May, Carl R; Richardson, Alison; Myall, Michelle
2017-10-06
To identify the factors that promote and inhibit the implementation of interventions that improve communication and decision-making directed at goals of care in the event of acute clinical deterioration. A scoping review was undertaken based on the methodological framework of Arksey and O'Malley for conducting this type of review. Searches were carried out in Medline and Cumulative Index to Nursing and Allied Health Literature (CINAHL) to identify peer-reviewed papers and in Google to identify grey literature. Searches were limited to those published in the English language from 2000 onwards. Inclusion and exclusion criteria were applied, and only papers that had a specific focus on implementation in practice were selected. Data extracted were treated as qualitative and subjected to directed content analysis. A theory-informed coding framework using Normalisation Process Theory (NPT) was applied to characterise and explain implementation processes. Searches identified 2619 citations, 43 of which met the inclusion criteria. Analysis generated six themes fundamental to successful implementation of goals of care interventions: (1) input into development; (2) key clinical proponents; (3) training and education; (4) intervention workability and functionality; (5) setting and context; and (6) perceived value and appraisal. A broad and diverse literature focusing on implementation of goals of care interventions was identified. Our review recognised these interventions as both complex and contentious in nature, making their incorporation into routine clinical practice dependent on a number of factors. Implementing such interventions presents challenges at individual, organisational and systems levels, which make them difficult to introduce and embed. We have identified a series of factors that influence successful implementation and our analysis has distilled key learning points, conceptualised as a set of propositions, we consider relevant to implementing other complex and contentious interventions. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Cottrell, Erika K; Hall, Jennifer D; Kautz, Glenn; Angier, Heather; Likumahuwa-Ackman, Sonja; Sisulak, Laura; Keller, Sara; Cameron, David C; DeVoe, Jennifer E; Cohen, Deborah J
Alternative payment models have been proposed as a way to facilitate patient-centered medical home model implementation, yet little is known about how payment reform translates into changes in care delivery. We conducted site visits, observed operations, and conducted interviews within 3 Federally Qualified Health Center organizations that were part of Oregon's Alternative Payment Methodology demonstration project. Data were analyzed using an immersion-crystallization approach. We identified several care delivery changes during the early stages of implementation, as well as challenges associated with this new model of payment. Future research is needed to further understand the implications of these changes.
NASA Astrophysics Data System (ADS)
Zein-Sabatto, Saleh; Mikhail, Maged; Bodruzzaman, Mohammad; DeSimio, Martin; Derriso, Mark; Behbahani, Alireza
2012-06-01
It has been widely accepted that data fusion and information fusion methods can improve the accuracy and robustness of decision-making in structural health monitoring systems. It is arguably true, nonetheless, that decision-level fusion is equally beneficial when applied to integrated health monitoring systems. Several decisions at low levels of abstraction may be produced by different decision-makers; however, decision-level fusion is required at the final stage of the process to provide an accurate assessment of the health of the monitored system as a whole. An example of such integrated systems with complex decision-making scenarios is the integrated health monitoring of aircraft. A thorough understanding of the characteristics of decision-fusion methodologies is a crucial step for successful implementation of such decision-fusion systems. In this paper, we first present the major information fusion methodologies reported in the literature, i.e., probabilistic, evidential, and artificial-intelligence-based methods. The theoretical basis and characteristics of these methodologies are explained and their performances are analyzed. Second, candidate methods from the above fusion methodologies, i.e., Bayesian, Dempster-Shafer, and fuzzy logic algorithms, are selected and their applications are extended to decision fusion. Finally, fusion algorithms are developed based on the selected fusion methods and their performance is tested on decisions generated from synthetic data and from experimental data. Also in this paper, a modeling methodology, i.e., the cloud model, for generating synthetic decisions is presented and used. Using the cloud model, both types of uncertainty involved in real decision-making, randomness and fuzziness, are modeled. Synthetic decisions are generated with an unbiased process and varying interaction complexities among decisions to provide for a fair performance comparison of the selected decision-fusion algorithms. For verification purposes, implementation results of the developed fusion algorithms on structural health monitoring data collected from experimental tests are reported.
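Of the candidate methods named above, Dempster's rule of combination is easy to show in miniature; the frame of discernment and the mass assignments below are hypothetical sensor-level decisions, not the paper's data.

```python
# Dempster's rule of combination for two basic mass assignments over the
# frame {healthy, damaged}. Mass values are hypothetical.
from itertools import product

def dempster(m1, m2):
    """Combine two mass functions given as dicts frozenset -> mass."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb                 # mass assigned to empty set
    k = 1.0 - conflict                          # normalization constant
    return {s: v / k for s, v in combined.items()}

H, D = frozenset({"healthy"}), frozenset({"damaged"})
theta = H | D                                   # ignorance: the whole frame
m1 = {H: 0.6, D: 0.1, theta: 0.3}               # decision-maker 1
m2 = {H: 0.5, D: 0.3, theta: 0.2}               # decision-maker 2
print(dempster(m1, m2))
```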
Moullin, Joanna C; Sabater-Hernández, Daniel; Benrimoj, Shalom I
2016-08-25
Multiple studies have explored the implementation process and its influences; however, it appears there is no study investigating these influences across the stages of implementation. Community pharmacy is attempting to implement professional services (pharmaceutical care and other health services). The use of implementation theory may assist the achievement of widespread provision, support and integration. The objective was to investigate professional service implementation in community pharmacy to contextualise and advance the concepts of a generic implementation framework previously published. Purposeful sampling was used to investigate implementation across a range of levels of implementation in community pharmacies in Australia. Twenty-five semi-structured interviews were conducted and analysed using a framework methodology. Data were charted using implementation stages as overarching themes and each stage was thematically analysed to investigate the implementation process, the influences and their relationships. Secondary analyses were performed of the factors (barriers and facilitators), using an adapted version of the Consolidated Framework for Implementation Research (CFIR), and of the implementation strategies and interventions, using the Expert Recommendations for Implementing Change (ERIC) discrete implementation strategy compilation. Six stages emerged, labelled as development or discovery, exploration, preparation, testing, operation and sustainability. Within the stages, a range of implementation activities/steps and five overarching influences (the pharmacy's direction and impetus, internal communication, staffing, community fit and support) were identified. The stages and activities were not applied strictly in a linear fashion. There was a trend that the greater the number of activities considered, the greater the apparent integration into the pharmacy organization. Implementation factors varied over the implementation stages, and additional factors were added to the CFIR list and definitions modified/contextualised for pharmacy. Implementation strategies employed by pharmacies varied widely. Evaluations were lacking. The process of implementation and five overarching influences of professional service implementation in community pharmacy have been outlined. Framework analysis revealed that, outside of the five overarching influences, factors influencing implementation varied across the implementation stages. It is proposed that at each stage, for each domain, the factors, strategies and evaluations should be considered. The Framework for the Implementation of Services in Pharmacy incorporates the contextualisation of implementation science for pharmacy.
Application of hybrid methodology to rotors in steady and maneuvering flight
NASA Astrophysics Data System (ADS)
Rajmohan, Nischint
Helicopters are versatile flying machines with capabilities that are unparalleled by fixed-wing aircraft, such as operating in hover and performing vertical takeoff and landing on unprepared sites. This makes their use especially desirable in military and search-and-rescue operations. However, modern helicopters still suffer from high levels of noise and vibration caused by the physical phenomena occurring in the vicinity of the rotor blades. Improvement in rotorcraft design to reduce noise and vibration levels therefore requires understanding of the underlying physical phenomena and accurate prediction capabilities for the resulting rotorcraft aeromechanics. The goal of this research is to study the aeromechanics of rotors in steady and maneuvering flight using a hybrid Computational Fluid Dynamics (CFD) methodology. The hybrid CFD methodology uses the Navier-Stokes equations to solve the flow near the blade surface, but the effect of the far wake is computed through a wake model. The hybrid CFD methodology is computationally efficient and its wake modeling approach is nondissipative, making it an attractive tool for studying rotorcraft aeromechanics. Several enhancements were made to the CFD methodology, and it was coupled to a Computational Structural Dynamics (CSD) methodology to perform a trimmed aeroelastic analysis of a rotor in forward flight. The coupling analyses, both loose and tight, were used to identify the key physical phenomena that affect rotors in different steady flight regimes. The modeling enhancements improved the airloads predictions for a variety of flight conditions. It was found that the tightly coupled method did not impact the loads significantly for steady flight conditions compared to the loosely coupled method. The coupling methodology was extended to maneuvering flight analysis by enhancing the computational and structural models to handle non-periodic flight conditions and vehicle motions in time-accurate mode. The flight-test control angles were employed to enable the maneuvering flight analysis. The fully coupled model predicted the presence of three dynamic stall cycles on the rotor in the maneuver. It is important to mention that analysis of maneuvering flight requires knowledge of the pilot's control pitch settings and the vehicle states. As a result, these computational tools cannot be used for analysis of loads in a maneuver that has not been duplicated in a real flight. This is a significant limitation if these tools are to be used during the design phase of a helicopter, where its handling qualities are evaluated in different trajectories. Therefore, a methodology was developed to couple the CFD/CSD simulation with an inverse flight mechanics simulation to perform the maneuver analysis without using the flight-test control input. The methodology showed reasonable convergence in the steady flight regime, and control angle predictions compared fairly well with test data. In the maneuvering flight regions, the convergence was slower due to the relaxation techniques used for numerical stability. The subsequently computed control angles for the maneuvering flight regions compared well with test data. Further, enhancement of the rotor inflow computations in the inverse simulation, through implementation of a Lagrangian wake model, improved the convergence of the coupling methodology.
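A cartoon of the loose (per-revolution) delta-coupling loop described above, with linear stand-ins for the CFD and CSD solvers; everything here (models, gains, relaxation factor) is hypothetical and intended only to show the fixed-point structure and the relaxation used for stability.

```python
# Loose CFD/CSD delta-coupling cartoon: the CSD trim uses its internal
# aerodynamics plus a delta correction from CFD; iterate until the delta
# stops changing. All models and numbers are hypothetical stand-ins.
import math

def csd_trim(delta):
    """Stand-in comprehensive-code trim: collective and elastic twist (deg)."""
    theta0 = 8.0 + 0.05 * delta
    twist = -0.02 * delta
    return theta0, twist

def cfd_airloads(theta0, twist):
    """Stand-in CFD: scalar airload measure for the given blade state."""
    return 10.0 * math.sin(math.radians(theta0 + twist))

delta, lifting_line = 0.0, 1.2                  # delta; internal aero estimate
for it in range(50):
    theta0, twist = csd_trim(delta)
    new_delta = cfd_airloads(theta0, twist) - lifting_line
    if abs(new_delta - delta) < 1e-8:
        break
    delta = 0.5 * delta + 0.5 * new_delta       # relaxation for stability
print(f"converged in {it + 1} iterations; collective = {theta0:.3f} deg")
```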
Model-based object classification using unification grammars and abstract representations
NASA Astrophysics Data System (ADS)
Liburdy, Kathleen A.; Schalkoff, Robert J.
1993-04-01
The design and implementation of a high level computer vision system which performs object classification is described. General object labelling and functional analysis require models of classes which display a wide range of geometric variations. A large representational gap exists between abstract criteria such as `graspable' and current geometric image descriptions. The vision system developed and described in this work addresses this problem and implements solutions based on a fusion of semantics, unification, and formal language theory. Object models are represented using unification grammars, which provide a framework for the integration of structure and semantics. A methodology for the derivation of symbolic image descriptions capable of interacting with the grammar-based models is described and implemented. A unification-based parser developed for this system achieves object classification by determining if the symbolic image description can be unified with the abstract criteria of an object model. Future research directions are indicated.
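A minimal sketch of the unification operation at the heart of such grammars: two feature structures merge recursively, and classification succeeds only if the symbolic image description unifies with the model's abstract criteria. The feature structures below are hypothetical, not the system's actual object models.

```python
# Recursive unification of feature structures (dicts with atomic values).
# Returns the merged structure, or None on a value conflict.
def unify(f1, f2):
    if f1 == f2:
        return f1
    if isinstance(f1, dict) and isinstance(f2, dict):
        out = dict(f1)
        for key, val in f2.items():
            if key in out:
                merged = unify(out[key], val)
                if merged is None:
                    return None          # conflict: unification fails
                out[key] = merged
            else:
                out[key] = val           # new feature: just add it
        return out
    return None                          # conflicting atomic values

model = {"category": "container", "shape": {"top": "open"}}
image = {"shape": {"top": "open", "profile": "concave"}, "size": "small"}
print(unify(model, image))                  # succeeds: description fits model
print(unify(model, {"category": "tool"}))   # fails -> None
```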
Stakeholder management for conservation projects: a case study of Ream National Park, Cambodia.
De Lopez, T T
2001-07-01
The paper gives an account of the development and implementation of a stakeholder management framework at Ream National Park, Cambodia. Firstly, the concept of stakeholder is reviewed in management and in conservation literatures. Secondly, the context in which the stakeholder framework was implemented is described. Thirdly, a five-step methodological framework is suggested: (1) stakeholder analysis, (2) stakeholder mapping, (3) development of generic strategies and workplan, (4) presentation of the workplan to stakeholders, and (5) implementation of the workplan. This framework classifies stakeholders according to their level of influence on the project and their potential for the conservation of natural resources. In a situation characterized by conflicting claims on natural resources, park authorities were able to successfully develop specific strategies for the management of stakeholders. The conclusion discusses the implications of the Ream experience and the generalization of the framework to other protected areas.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, La Tonya; Malczynski, Leonard
A Powersim Studio implementation of the system dynamics’ ‘Molecules of Structure’. The original implementation was in Ventana’s Vensim language by James Hines. The molecules are fundamental constructs of the system dynamics simulation methodology.
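As a hedged illustration of what a "molecule" is, here is the classic first-order SMOOTH written as plain Euler-integrated stock-and-flow code rather than in Powersim Studio or Vensim syntax; the input series and adjustment time are arbitrary.

```python
# First-order SMOOTH molecule: a stock whose single inflow closes the gap
# to the input over an adjustment time tau.
def smooth(input_series, tau, dt=1.0, initial=None):
    stock = input_series[0] if initial is None else initial
    out = []
    for u in input_series:
        stock += dt * (u - stock) / tau   # flow = gap / adjustment time
        out.append(stock)
    return out

step_input = [0.0] * 5 + [10.0] * 15      # step change at t = 5
print([round(x, 2) for x in smooth(step_input, tau=4.0)])
```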
Designing a Strategic Plan through an Emerging Knowledge Generation Process: The ATM Experience
ERIC Educational Resources Information Center
Zanotti, Francesco
2012-01-01
Purpose: The aim of this contribution is to describe a new methodology for designing strategic plans and how it was implemented by ATM, a public transportation agency based in Milan, Italy. Design/methodology/approach: This methodology is founded on a new system theory, called "quantum systemics". It is based on models and metaphors both…
Scrum Methodology in Higher Education: Innovation in Teaching, Learning and Assessment
ERIC Educational Resources Information Center
Jurado-Navas, Antonio; Munoz-Luna, Rosa
2017-01-01
The present paper aims to detail the experience developed in a classroom of English Studies from the Spanish University of Málaga, where an alternative project-based learning methodology has been implemented. Such methodology is inspired by scrum sessions widely extended in technological companies where staff members work in teams and are assigned…