FINDING A METHOD FOR THE MADNESS: A COMPARATIVE ANALYSIS OF STRATEGIC DESIGN METHODOLOGIES
2017-06-01
Donnelly, Amanda
...work develops a comparative model for strategic design methodologies, focusing on the primary elements of vision, time, process, communication and collaboration, and risk assessment. My analysis dissects and compares three potential design methodologies, including net assessment, scenarios and...
NASA Technical Reports Server (NTRS)
Starnes, James H., Jr.; Newman, James C., Jr.; Harris, Charles E.; Piascik, Robert S.; Young, Richard D.; Rose, Cheryl A.
2003-01-01
Analysis methodologies for predicting fatigue-crack growth from rivet holes in panels subjected to cyclic loads and for predicting the residual strength of aluminum fuselage structures with cracks and subjected to combined internal pressure and mechanical loads are described. The fatigue-crack growth analysis methodology is based on small-crack theory and a plasticity induced crack-closure model, and the effect of a corrosive environment on crack-growth rate is included. The residual strength analysis methodology is based on the critical crack-tip-opening-angle fracture criterion that characterizes the fracture behavior of a material of interest, and a geometric and material nonlinear finite element shell analysis code that performs the structural analysis of the fuselage structure of interest. The methodologies have been verified experimentally for structures ranging from laboratory coupons to full-scale structural components. Analytical and experimental results based on these methodologies are described and compared for laboratory coupons and flat panels, small-scale pressurized shells, and full-scale curved stiffened panels. The residual strength analysis methodology is sufficiently general to include the effects of multiple-site damage on structural behavior.
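The crack-growth methodology above is model-specific (small-crack theory with plasticity-induced closure), but the core idea of any fatigue-crack-growth analysis is integrating a growth law over load cycles until the crack reaches a critical size. The sketch below uses the classic Paris law instead; the constants, geometry factor Y, and function name are hypothetical illustrations, not values from the paper.

```python
import math

def paris_crack_growth(a0, a_crit, C, m, delta_sigma, Y=1.12, da=1e-5):
    """Integrate the Paris law da/dN = C * (dK)^m from initial crack length
    a0 to critical length a_crit, returning predicted cycles to failure.
    dK = Y * delta_sigma * sqrt(pi * a).  Units: MPa, metres."""
    a, N = a0, 0.0
    while a < a_crit:
        dK = Y * delta_sigma * math.sqrt(math.pi * a)  # stress-intensity range
        dadN = C * dK ** m                             # crack growth per cycle
        N += da / dadN                                 # cycles spent growing by da
        a += da
    return N

# Hypothetical aluminium-like constants (illustrative only)
cycles = paris_crack_growth(a0=1e-3, a_crit=10e-3, C=1e-11, m=3.0, delta_sigma=100.0)
```

The step size `da` trades accuracy for speed; a production analysis would use an adaptive integrator and a crack-closure-corrected effective dK.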
Analysis and Design of Fuselage Structures Including Residual Strength Prediction Methodology
NASA Technical Reports Server (NTRS)
Knight, Norman F.
1998-01-01
The goal of this research project is to develop and assess methodologies for the design and analysis of fuselage structures accounting for residual strength. Two primary objectives are included in this research activity: development of a structural analysis methodology for predicting the residual strength of fuselage shell-type structures, and development of accurate, efficient analysis, design, and optimization tools for fuselage shell structures. Assessment of these tools for robustness, efficiency, and usability in a fuselage shell design environment will be integrated with these two primary research objectives.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mata, Pedro; Fuente, Rafael de la; Iglesias, Javier
Iberdrola (the Spanish utility) and Iberdrola Ingenieria (its engineering branch) have been developing during the last two years the 110% Extended Power Up-rate Project (EPU 110%) for Cofrentes BWR-6. Iberdrola has an in-house design and licensing reload methodology that has been approved by the Spanish Nuclear Regulatory Authority. This methodology has already been used to perform the nuclear design and the reload licensing analysis for Cofrentes cycles 12 to 14. The methodology has also been applied to develop a significant number of safety analyses for the Cofrentes Extended Power Up-rate, including: Reactor Heat Balance, Core and Fuel Performance, Thermal Hydraulic Stability, ECCS LOCA Evaluation, Transient Analysis, Anticipated Transient Without Scram (ATWS), and Station Blackout (SBO). Since the scope of the licensing process of the Cofrentes Extended Power Up-rate exceeds the range of analysis included in the Cofrentes generic reload licensing process, it was necessary to extend the applicability of the Cofrentes licensing methodology to the analysis of new transients. This is the case of the TLFW (Total Loss of Feedwater) transient. This paper shows the benefits of having an in-house design and licensing methodology, and describes the process of extending the applicability of the methodology to the analysis of new transients. The analysis of Total Loss of Feedwater with the Cofrentes RETRAN model is included as an example of this process. (authors)
Benefit-Cost Analysis of Integrated Paratransit Systems : Volume 6. Technical Appendices.
DOT National Transportation Integrated Search
1979-09-01
This last volume includes five technical appendices which document the methodologies used in the benefit-cost analysis. They are the following: Scenario analysis methodology; Impact estimation; Example of impact estimation; Sensitivity analysis; Agg...
Global/local methods research using a common structural analysis framework
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Ransom, Jonathan B.; Griffin, O. H., Jr.; Thompson, Danniella M.
1991-01-01
Methodologies for global/local stress analysis are described including both two- and three-dimensional analysis methods. These methods are being developed within a common structural analysis framework. Representative structural analysis problems are presented to demonstrate the global/local methodologies being developed.
Methodologies for Evaluating the Impact of Contraceptive Social Marketing Programs.
ERIC Educational Resources Information Center
Bertrand, Jane T.; And Others
1989-01-01
An overview of the evaluation issues associated with contraceptive social marketing programs is provided. Methodologies covered include survey techniques, cost-effectiveness analyses, retail audits of sales data, time series analysis, nested logit analysis, and discriminant analysis. (TJH)
Rat sperm motility analysis: methodologic considerations
The objective of these studies was to optimize conditions for computer-assisted sperm analysis (CASA) of rat epididymal spermatozoa. Methodologic issues addressed include sample collection technique, sampling region within the epididymis, type of diluent medium used, and sample c...
A New Methodology for Systematic Exploitation of Technology Databases.
ERIC Educational Resources Information Center
Bedecarrax, Chantal; Huot, Charles
1994-01-01
Presents the theoretical aspects of a data analysis methodology that can help transform sequential raw data from a database into useful information, using the statistical analysis of patents as an example. Topics discussed include relational analysis and a technology watch approach. (Contains 17 references.) (LRW)
How equity is addressed in clinical practice guidelines: a content analysis
Shi, Chunhu; Tian, Jinhui; Wang, Quan; Petkovic, Jennifer; Ren, Dan; Yang, Kehu; Yang, Yang
2014-01-01
Objectives Incorporating equity into guidelines presents methodological challenges. This study aims to qualitatively synthesise the methods for incorporating equity in clinical practice guidelines (CPGs). Setting Content analysis of methodological publications. Eligibility criteria for selecting studies Methodological publications were included if they provided checklists/frameworks on when, how and to what extent equity should be incorporated in CPGs. Data sources We electronically searched MEDLINE, retrieved references, and browsed guideline development organisation websites from inception to January 2013. After study selection by two authors, general characteristics and checklist items/framework components were extracted from the included studies. Based on the questions or items from checklists/frameworks (the unit of analysis), content analysis was conducted to identify themes, and questions/items were grouped into these themes. Primary outcomes The primary outcomes were methodological themes and processes on how to address equity issues in guideline development. Results 8 studies with 10 publications were included from 3405 citations. In total, a list of 87 questions/items was generated from 17 checklists/frameworks. After content analysis, questions were grouped into eight themes (‘scoping questions’, ‘searching relevant evidence’, ‘appraising evidence and recommendations’, ‘formulating recommendations’, ‘monitoring implementation’, ‘providing a flow chart to include equity in CPGs’, and ‘others: reporting of guidelines and comments from stakeholders’ for CPG developers, and ‘assessing the quality of CPGs’ for CPG users). Four included studies covered more than five of these themes. We also summarised the process of guideline development based on the themes mentioned above.
Conclusions For disadvantaged population-specific CPGs, eight important methodological issues identified in this review should be considered when including equity in CPGs under the guidance of a scientific guideline development manual. PMID:25479795
Speed Accuracy Tradeoffs in Human Speech Production
2017-05-01
...for considering Fitts’ law in the domain of speech production is elucidated. Methodological challenges in applying Fitts-style analysis are addressed... in order to assess whether articulatory kinematics conform to Fitts’ law. A second, associated goal is to address the methodological challenges inherent in performing Fitts-style analysis on rtMRI data of speech production. Methodological challenges include segmenting continuous speech into specific motor...
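Fitts’ law itself is compact enough to state in code: movement time is modeled as MT = a + b * ID, with index of difficulty ID = log2(2D/W) for a movement of distance D to a target of width W. The sketch below fits the law to hypothetical movement data by ordinary least squares; the data values and function names are illustrative assumptions, not from the thesis.

```python
import math

def fitts_index_of_difficulty(D, W):
    """Fitts' index of difficulty, ID = log2(2D / W), in bits."""
    return math.log2(2.0 * D / W)

def fit_fitts_law(movements):
    """Ordinary least-squares fit of MT = a + b * ID.
    `movements` is a list of (distance, target_width, movement_time) tuples."""
    xs = [fitts_index_of_difficulty(D, W) for D, W, _ in movements]
    ys = [mt for _, _, mt in movements]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Hypothetical reach data (m, m, s): movement time rises linearly with ID
data = [(0.10, 0.05, 0.35), (0.20, 0.05, 0.48), (0.40, 0.05, 0.61), (0.40, 0.025, 0.74)]
a, b = fit_fitts_law(data)
```

With the synthetic data above, ID takes the values 2, 3, 4, 5 bits and the fit recovers the intercept and slope exactly.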
ERIC Educational Resources Information Center
Ross, Linda
2003-01-01
Recent work with automotive e-commerce clients led to the development of a performance analysis methodology called the Seven Performance Drivers, including: standards, incentives, capacity, knowledge and skill, measurement, feedback, and analysis. This methodology has been highly effective in introducing and implementing performance improvement.…
Analysis and methodology for aeronautical systems technology program planning
NASA Technical Reports Server (NTRS)
White, M. J.; Gershkoff, I.; Lamkin, S.
1983-01-01
A structured methodology was developed that allows the generation, analysis, and rank-ordering of system concepts by their benefits and costs, indicating the preferred order of implementation. The methodology is supported by a base of data on civil transport aircraft fleet growth projections and data on aircraft performance relating the contribution of each element of the aircraft to overall performance. The performance data are used to assess the benefits of proposed concepts. The methodology includes a computer program for performing the calculations needed to rank-order the concepts and compute their cumulative benefit-to-cost ratio. The use of the methodology and supporting data is illustrated through the analysis of actual system concepts from various sources.
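The rank-ordering step described above can be sketched directly: sort concepts by benefit-to-cost ratio, then accumulate the cumulative benefit-to-cost ratio as each concept is added in preferred order. The concept names and numbers below are hypothetical, not from the report.

```python
def rank_concepts(concepts):
    """Rank system concepts by benefit-to-cost ratio (descending) and compute
    the cumulative benefit / cumulative cost after each concept is added.
    `concepts` maps name -> (benefit, cost); returns a list of
    (name, benefit/cost, cumulative benefit/cost) rows in preferred order."""
    ordered = sorted(concepts.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True)
    cum_b = cum_c = 0.0
    table = []
    for name, (benefit, cost) in ordered:
        cum_b += benefit
        cum_c += cost
        table.append((name, benefit / cost, cum_b / cum_c))
    return table

# Hypothetical concepts: (benefit score, cost in $M)
ranked = rank_concepts({"winglets": (8.0, 2.0),
                        "new avionics": (9.0, 9.0),
                        "engine retrofit": (12.0, 4.0)})
```

The cumulative ratio column shows the diminishing return of implementing lower-ranked concepts, which is the decision signal the methodology produces.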
Recent Methodology in Ginseng Analysis
Baek, Seung-Hoon; Bae, Ok-Nam; Park, Jeong Hill
2012-01-01
In line with its popularity in herbal prescriptions and remedies, ginseng has become the focus of research in many scientific fields. Analytical methodologies for ginseng, referred to as ginseng analysis hereafter, have been developed for bioactive component discovery, phytochemical profiling, quality control, and pharmacokinetic studies. This review summarizes the most recent advances in ginseng analysis in the past half-decade, including emerging techniques and analytical trends. Ginseng analysis includes all of the leading analytical tools and serves as a representative model for the analytical research of herbal medicines. PMID:23717112
Method and system for dynamic probabilistic risk assessment
NASA Technical Reports Server (NTRS)
Dugan, Joanne Bechta (Inventor); Xu, Hong (Inventor)
2013-01-01
The DEFT methodology, system, and computer-readable medium extend the applicability of the PRA (Probabilistic Risk Assessment) methodology to computer-based systems by allowing DFT (Dynamic Fault Tree) nodes as pivot nodes in the Event Tree (ET) model. DEFT includes a mathematical model and solution algorithm, and supports all common PRA analysis functions, including cutsets. Additional capabilities enabled by the DFT include modularization, phased-mission analysis, sequence dependencies, and imperfect coverage.
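A greatly simplified sketch of the pivot-node idea: walk every branch of an event tree, where the failure probability of one pivot node comes from a fault-tree expression rather than a point estimate. Real DFT gates capture sequence dependence, spares, and coverage, which this static sketch omits; all probabilities and names are hypothetical.

```python
from itertools import product

def ft_or(*ps):
    """Probability that at least one of several independent basic events occurs."""
    q = 1.0
    for p in ps:
        q *= (1.0 - p)
    return 1.0 - q

def ft_and(*ps):
    """Probability that all of several independent basic events occur."""
    q = 1.0
    for p in ps:
        q *= p
    return q

def event_tree(init_freq, pivots, consequence):
    """Enumerate all branches of an event tree.  `pivots` is an ordered list of
    (name, p_fail) pairs; `consequence(outcomes)` maps a dict of pivot outcomes
    (True = failed) to an end-state label.  Returns end-state frequencies."""
    freqs = {}
    for branch in product([False, True], repeat=len(pivots)):
        p, outcome = init_freq, {}
        for (name, p_fail), failed in zip(pivots, branch):
            p *= p_fail if failed else (1.0 - p_fail)
            outcome[name] = failed
        state = consequence(outcome)
        freqs[state] = freqs.get(state, 0.0) + p
    return freqs

# Hypothetical numbers; the "software" pivot is itself a small (static) fault
# tree: (CPU fault AND failover fails) OR latent bug.  DEFT generalises this
# pivot to a full dynamic fault tree.
p_software = ft_or(ft_and(1e-2, 1e-1), 5e-4)
pivots = [("power", 1e-3), ("software", p_software)]
freqs = event_tree(1.0, pivots,
                   lambda o: "loss" if (o["power"] or o["software"]) else "ok")
```

Because every branch is enumerated, the end-state frequencies sum to the initiating-event frequency, which is a useful sanity check on any ET solver.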
NASA Technical Reports Server (NTRS)
Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina
2004-01-01
A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to obtain expert judgment opinion for quantifying uncertainties as a probability distribution so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates, and provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results is presented. A discussion of possible future steps in this research area is given.
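One common way to aggregate multiple expert judgments into a single probability distribution (not necessarily the report's ten-phase procedure) is a linear opinion pool over the experts' triangular low/mode/high estimates. The sketch below samples such a mixture; the estimates, weights, and function name are hypothetical.

```python
import random

def triangular_pool(expert_estimates, weights=None, n=100_000, seed=0):
    """Linear opinion pool: sample a weighted mixture of the experts'
    triangular (low, mode, high) distributions.  Returns the pooled
    (mean, 5th percentile, 95th percentile)."""
    rng = random.Random(seed)
    k = len(expert_estimates)
    weights = weights or [1.0 / k] * k
    samples = []
    for _ in range(n):
        lo, mode, hi = rng.choices(expert_estimates, weights=weights)[0]
        samples.append(rng.triangular(lo, hi, mode))
    samples.sort()
    return sum(samples) / n, samples[int(0.05 * n)], samples[int(0.95 * n)]

# Three hypothetical experts' estimates of a mass margin (kg)
mean, p5, p95 = triangular_pool([(120, 150, 200), (100, 140, 190), (130, 160, 230)])
```

Calibration-weighted pooling (as in the report) would simply supply non-uniform `weights` derived from each expert's performance on seed questions.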
Beyond Needs Analysis: Soft Systems Methodology for Meaningful Collaboration in EAP Course Design
ERIC Educational Resources Information Center
Tajino, Akira; James, Robert; Kijima, Kyoichi
2005-01-01
Designing an EAP course requires collaboration among various concerned stakeholders, including students, subject teachers, institutional administrators and EAP teachers themselves. While needs analysis is often considered fundamental to EAP, alternative research methodologies may be required to facilitate meaningful collaboration between these…
Designing for fiber composite structural durability in hygrothermomechanical environment
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1985-01-01
A methodology is described which can be used to design/analyze fiber composite structures subjected to complex hygrothermomechanical environments. This methodology includes composite mechanics and advanced structural analysis methods (finite element). Select examples are described to illustrate the application of the available methodology. The examples include: (1) composite progressive fracture; (2) composite design for high cycle fatigue combined with hot-wet conditions; and (3) general laminate design.
Demonstration of a Safety Analysis on a Complex System
NASA Technical Reports Server (NTRS)
Leveson, Nancy; Alfaro, Liliana; Alvarado, Christine; Brown, Molly; Hunt, Earl B.; Jaffe, Matt; Joslyn, Susan; Pinnell, Denise; Reese, Jon; Samarziya, Jeffrey;
1997-01-01
For the past 17 years, Professor Leveson and her graduate students have been developing a theoretical foundation for safety in complex systems and building a methodology upon that foundation. The methodology includes special management structures and procedures, system hazard analyses, software hazard analysis, requirements modeling and analysis for completeness and safety, special software design techniques including the design of human-machine interaction, verification, operational feedback, and change analysis. The Safeware methodology is based on system safety techniques that are extended to deal with software and human error. Automation is used to enhance our ability to cope with complex systems. Identification, classification, and evaluation of hazards are done using modeling and analysis. To be effective, the models and analysis tools must consider the hardware, software, and human components in these systems. They also need to include a variety of analysis techniques and orthogonal approaches: there exists no single safety analysis or evaluation technique that can handle all aspects of complex systems. Applying only one or two may make us feel satisfied, but will produce limited results. We report here on a demonstration, performed as part of a contract with NASA Langley Research Center, of the Safeware methodology on the Center-TRACON Automation System (CTAS) portion of the air traffic control (ATC) system and procedures currently employed at the Dallas/Fort Worth (DFW) TRACON (Terminal Radar Approach CONtrol). CTAS is an automated system to assist controllers in handling arrival traffic in the DFW area. Safety is a system property, not a component property, so our safety analysis considers the entire system and not simply the automated components. Because safety analysis of a complex system is an interdisciplinary effort, our team included system engineers, software engineers, human factors experts, and cognitive psychologists.
Pantex Falling Man - Independent Review Panel Report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bertolini, Louis; Brannon, Nathan; Olson, Jared
2014-11-01
Consolidated Nuclear Security (CNS) Pantex took the initiative to organize a review panel of subject matter experts to independently assess the adequacy of the Pantex Tripping Man Analysis methodology. The purpose of this report is to capture the details of the assessment, including the scope, approach, results, and detailed appendices. Along with the assessment of the analysis methodology, the panel evaluated the adequacy with which the methodology was applied, as well as congruence with Department of Energy (DOE) standards 3009 and 3016. The approach included the review of relevant documentation, interactive discussion with Pantex staff, and the iterative process of evaluating critical lines of inquiry.
An Ontology for State Analysis: Formalizing the Mapping to SysML
NASA Technical Reports Server (NTRS)
Wagner, David A.; Bennett, Matthew B.; Karban, Robert; Rouquette, Nicolas; Jenkins, Steven; Ingham, Michel
2012-01-01
State Analysis is a methodology developed over the last decade for architecting, designing and documenting complex control systems. Although it was originally conceived for designing robotic spacecraft, recent applications include the design of control systems for large ground-based telescopes. The European Southern Observatory (ESO) began a project to design the European Extremely Large Telescope (E-ELT), which will require coordinated control of over a thousand articulated mirror segments. The designers are using State Analysis as a methodology and the Systems Modeling Language (SysML) as a modeling and documentation language in this task. To effectively apply the State Analysis methodology in this context it became necessary to provide ontological definitions of the concepts and relations in State Analysis and greater flexibility through a mapping of State Analysis into a practical extension of SysML. The ontology provides the formal basis for verifying compliance with State Analysis semantics including architectural constraints. The SysML extension provides the practical basis for applying the State Analysis methodology with SysML tools. This paper will discuss the method used to develop these formalisms (the ontology), the formalisms themselves, the mapping to SysML and approach to using these formalisms to specify a control system and enforce architectural constraints in a SysML model.
2017-05-25
Research Design... the research employed a mixed research methodology – quantitative with descriptive statistical analysis and qualitative with a thematic analysis approach – using interviews to collect the data. The interviews included demographic and open-ended...
Faggion, Clovis M; Huda, Fahd; Wasiak, Jason
2014-06-01
To evaluate the methodological approaches used to assess the quality of studies included in systematic reviews (SRs) in periodontology and implant dentistry. Two electronic databases (PubMed and Cochrane Database of Systematic Reviews) were searched independently to identify SRs examining interventions published through 2 September 2013. The reference lists of included SRs and records of 10 specialty dental journals were searched manually. Methodological approaches were assessed using seven criteria based on the Cochrane Handbook for Systematic Reviews of Interventions. Temporal trends in methodological quality were also explored. Of the 159 SRs with meta-analyses included in the analysis, 44 (28%) reported the use of domain-based tools, 15 (9%) reported the use of checklists and 7 (4%) reported the use of scales. Forty-two (26%) SRs reported use of more than one tool. Criteria were met heterogeneously; authors of 15 (9%) publications incorporated the quality of evidence of primary studies into SRs, whereas 69% of SRs reported methodological approaches in the Materials/Methods section. Reporting of four criteria was significantly better in recent (2010-2013) than in previous publications. The analysis identified several methodological limitations of approaches used to assess evidence in studies included in SRs in periodontology and implant dentistry. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
A critical methodological review of discourse and conversation analysis studies of family therapy.
Tseliou, Eleftheria
2013-12-01
Discourse (DA) and conversation (CA) analysis, two qualitative research methods, have been recently suggested as potentially promising for the study of family therapy due to common epistemological adherences and their potential for an in situ study of therapeutic dialog. However, to date, there is no systematic methodological review of the few existing DA and CA studies of family therapy. This study aims at addressing this lack by critically reviewing published DA and CA studies of family therapy on methodological grounds. Twenty-eight articles in total are reviewed in relation to certain methodological axes identified in the relevant literature. These include choice of method, framing of research question(s), data/sampling, type of analysis, epistemological perspective, content/type of knowledge claims, and attendance to criteria for good quality practice. It is argued that the reviewed studies show "glimpses" of the methods' potential for family therapy research despite the identification of certain "shortcomings" regarding their methodological rigor. These include unclearly framed research questions and the predominance of case study designs. They also include inconsistencies between choice of method, stated or unstated epistemological orientations and knowledge claims, and limited attendance to criteria for good quality practice. In conclusion, it is argued that DA and CA can add to the existing quantitative and qualitative methods for family therapy research. They can both offer unique ways for a detailed study of the actual therapeutic dialog, provided that future attempts strive for a methodologically rigorous practice and against their uncritical deployment. © FPI, Inc.
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design of failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
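The PFA idea of propagating parameter and model-accuracy uncertainty through an engineering analysis model can be illustrated with a toy Monte Carlo fatigue example: sample the uncertain inputs, push each sample through a simple life model, and count the fraction of sampled lives falling short of the service requirement. The life model, distributions, and constants below are illustrative assumptions, not the documented PFA models.

```python
import random

def pfa_failure_probability(service_cycles, n=200_000, seed=1):
    """Monte Carlo sketch of a PFA-style calculation: propagate parameter and
    model-accuracy uncertainty through a fatigue life model N = A * S**(-m)
    and estimate P(life < service_cycles)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        S = rng.gauss(300.0, 15.0)                # stress amplitude, MPa (uncertain)
        logA = rng.gauss(12.0, 0.3)               # log10 of material capacity (uncertain)
        m = 3.0                                   # fatigue exponent, held fixed here
        model_err = rng.lognormvariate(0.0, 0.2)  # model-accuracy multiplier
        life = model_err * (10 ** logA) * S ** (-m)
        if life < service_cycles:
            failures += 1
    return failures / n

p_fail = pfa_failure_probability(20_000)
```

In the full PFA methodology this prior failure distribution would then be updated with any available test or flight experience; that Bayesian step is omitted here.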
NASA Technical Reports Server (NTRS)
Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe
2008-01-01
NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for Lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks between all three components. Strategic analysis supports strategic decision-making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high-level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology that is described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper will present an overview of the strategic analysis methodology and will present sample results from its application to the Constellation Program lunar architecture.
Documentation of indigenous Pacific agroforestry systems: a review of methodologies
Bill Raynor
1993-01-01
Recent interest in indigenous agroforestry has led to a need for documentation of these systems. However, previous work is very limited, and few methodologies are well-known or widely accepted. This paper outlines various methodologies (including sampling methods, data to be collected, and considerations in analysis) for documenting structure and productivity of...
NASA Technical Reports Server (NTRS)
Anderson, B. H.
1983-01-01
A broad program to develop advanced, reliable, and user oriented three-dimensional viscous design techniques for supersonic inlet systems, and encourage their transfer into the general user community is discussed. Features of the program include: (1) develop effective methods of computing three-dimensional flows within a zonal modeling methodology; (2) ensure reasonable agreement between said analysis and selective sets of benchmark validation data; (3) develop user orientation into said analysis; and (4) explore and develop advanced numerical methodology.
Health economic evaluation: important principles and methodology.
Rudmik, Luke; Drummond, Michael
2013-06-01
To discuss health economic evaluation and improve the understanding of common methodology. This article discusses the methodology for the following types of economic evaluations: cost-minimization, cost-effectiveness, cost-utility, cost-benefit, and economic modeling. Topics include health-state utility measures, the quality-adjusted life year (QALY), uncertainty analysis, discounting, decision tree analysis, and Markov modeling. Economic evaluation is the comparative analysis of alternative courses of action in terms of both their costs and consequences. With increasing health care expenditure and limited resources, it is important for physicians to consider the economic impact of their interventions. Understanding common methodology involved in health economic evaluation will improve critical appraisal of the literature and optimize future economic evaluations. Copyright © 2012 The American Laryngological, Rhinological and Otological Society, Inc.
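Several of the techniques listed (QALYs, discounting, Markov modeling) compose naturally into a small cohort model. The sketch below is a generic three-state Markov model with discounted QALYs and costs; the states, transition probabilities, utilities, and costs are hypothetical and chosen only to show the mechanics.

```python
def markov_cohort(transition, utilities, cost_per_cycle, cycles, discount=0.035):
    """Discrete-time Markov cohort model.  `transition[i][j]` is the per-cycle
    probability of moving from state i to state j (rows sum to 1); returns
    total discounted QALYs and costs per patient (cycle length = 1 year)."""
    n = len(transition)
    dist = [1.0] + [0.0] * (n - 1)          # whole cohort starts in state 0 ("well")
    qalys = costs = 0.0
    for t in range(cycles):
        d = 1.0 / (1.0 + discount) ** t     # discount factor for year t
        qalys += d * sum(p * u for p, u in zip(dist, utilities))
        costs += d * sum(p * c for p, c in zip(dist, cost_per_cycle))
        # advance the cohort one cycle: new_j = sum_i dist_i * T[i][j]
        dist = [sum(dist[i] * transition[i][j] for i in range(n)) for j in range(n)]
    return qalys, costs

# Hypothetical 3-state model: well, sick, dead (absorbing)
T = [[0.90, 0.07, 0.03],
     [0.00, 0.85, 0.15],
     [0.00, 0.00, 1.00]]
q, c = markov_cohort(T, utilities=[1.0, 0.6, 0.0],
                     cost_per_cycle=[500.0, 4000.0, 0.0], cycles=20)
```

A cost-utility comparison would run the model once per strategy and report the incremental cost-effectiveness ratio, ICER = (c1 - c0) / (q1 - q0).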
A micromechanics-based strength prediction methodology for notched metal matrix composites
NASA Technical Reports Server (NTRS)
Bigelow, C. A.
1992-01-01
An analytical micromechanics-based strength prediction methodology was developed to predict failure of notched metal matrix composites. The stress-strain behavior and notched strength of two metal matrix composites, boron/aluminum (B/Al) and silicon-carbide/titanium (SCS-6/Ti-15-3), were predicted. The prediction methodology combines analytical techniques ranging from a three-dimensional finite element analysis of a notched specimen to a micromechanical model of a single fiber. In the B/Al laminates, a fiber failure criterion based on the axial and shear stresses in the fiber accurately predicted laminate failure for a variety of layups and notch-length to specimen-width ratios with both circular holes and sharp notches when matrix plasticity was included in the analysis. For the SCS-6/Ti-15-3 laminates, a fiber failure criterion based on the axial stress in the fiber correlated well with experimental results for static and post-fatigue residual strengths when fiber-matrix debonding and matrix cracking were included in the analysis. The micromechanics-based strength prediction methodology offers a direct approach to strength prediction by modeling behavior and damage on a constituent level, thus explicitly including matrix nonlinearity, fiber-matrix debonding, and matrix cracking.
A micromechanics-based strength prediction methodology for notched metal-matrix composites
NASA Technical Reports Server (NTRS)
Bigelow, C. A.
1993-01-01
An analytical micromechanics-based strength prediction methodology was developed to predict failure of notched metal matrix composites. The stress-strain behavior and notched strength of two metal matrix composites, boron/aluminum (B/Al) and silicon-carbide/titanium (SCS-6/Ti-15-3), were predicted. The prediction methodology combines analytical techniques ranging from a three-dimensional finite element analysis of a notched specimen to a micromechanical model of a single fiber. In the B/Al laminates, a fiber failure criterion based on the axial and shear stresses in the fiber accurately predicted laminate failure for a variety of layups and notch-length to specimen-width ratios with both circular holes and sharp notches when matrix plasticity was included in the analysis. For the SCS-6/Ti-15-3 laminates, a fiber failure criterion based on the axial stress in the fiber correlated well with experimental results for static and postfatigue residual strengths when fiber-matrix debonding and matrix cracking were included in the analysis. The micromechanics-based strength prediction methodology offers a direct approach to strength prediction by modeling behavior and damage on a constituent level, thus explicitly including matrix nonlinearity, fiber-matrix debonding, and matrix cracking.
NASA Astrophysics Data System (ADS)
Nemeth, Noel N.; Jadaan, Osama M.; Palfi, Tamas; Baker, Eric H.
Brittle materials today are being used, or considered, for a wide variety of high-tech applications that operate in harsh environments, including static and rotating turbine parts, thermal protection systems, dental prosthetics, fuel cells, oxygen transport membranes, radomes, and MEMS. Designing brittle material components to sustain repeated load without fracturing while using the minimum amount of material requires the use of a probabilistic design methodology. The NASA CARES/Life (Ceramic Analysis and Reliability Evaluation of Structure/Life) code provides a general-purpose analysis tool that predicts the probability of failure of a ceramic component as a function of its time in service. This capability includes predicting the time-dependent failure probability of ceramic components against catastrophic rupture when subjected to transient thermomechanical loads (including cyclic loads). The developed methodology allows for changes in material response that can occur with temperature or time (i.e., changing fatigue and Weibull parameters with temperature or time). This article gives an overview of the transient reliability methodology and describes how it is extended to account for proof testing. The CARES/Life code has been modified to interface with commercially available finite element analysis (FEA) codes executed for transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.
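The probabilistic core of ceramic reliability codes of this kind is Weibull strength statistics. A minimal uniaxial sketch (the real CARES/Life analysis integrates over stress fields, time, and multiaxial criteria, none of which is reproduced here):

```python
import math

def weibull_failure_probability(sigma, sigma_0, m):
    """Two-parameter Weibull probability of failure for a uniaxial
    stress sigma, characteristic strength sigma_0, and Weibull
    modulus m.  A minimal sketch of the statistics underlying
    probabilistic ceramic design, not the CARES/Life algorithm."""
    return 1.0 - math.exp(-((sigma / sigma_0) ** m))

# At sigma == sigma_0 the failure probability is 1 - 1/e, about 0.632,
# regardless of the Weibull modulus.
print(round(weibull_failure_probability(300.0, 300.0, 10.0), 3))  # 0.632
```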
Sabharwal, Sanjeeve; Carter, Alexander; Darzi, Lord Ara; Reilly, Peter; Gupte, Chinmay M
2015-06-01
Approximately 76,000 people a year sustain a hip fracture in the UK, and the estimated cost to the NHS is £1.4 billion a year. Health economic evaluations (HEEs) are one of the methods employed by decision makers to deliver healthcare policy supported by clinical and economic evidence. The objective of this study was to (1) identify and characterize HEEs for the management of patients with hip fractures, and (2) examine their methodological quality. A literature search was performed in MEDLINE, EMBASE and the NHS Economic Evaluation Database. Studies that met the specified definition for an HEE and evaluated hip fracture management were included. Methodological quality was assessed using the Consensus on Health Economic Criteria (CHEC). Twenty-seven publications met the inclusion criteria of this study and were included in our descriptive and methodological analysis. Domains of methodology that performed poorly included use of an appropriate time horizon (66.7% of studies), incremental analysis of costs and outcomes (63%), future discounting (44.4%), sensitivity analysis (40.7%), declaration of conflicts of interest (37%) and discussion of ethical considerations (29.6%). HEEs for patients with hip fractures have been published increasingly in recent years. Most of these studies fail to adopt a societal perspective, and key aspects of their methodology are poor. The development of future HEEs in this field must adhere to established principles of methodology, so that better quality research can be used to inform health policy on the management of patients with a hip fracture. Copyright © 2014 Royal College of Surgeons of Edinburgh (Scottish charity number SC005317) and Royal College of Surgeons in Ireland. Published by Elsevier Ltd. All rights reserved.
Methodological challenges when doing research that includes ethnic minorities: a scoping review.
Morville, Anne-Le; Erlandsson, Lena-Karin
2016-11-01
There are challenging methodological issues in obtaining valid and reliable results on which to base occupational therapy interventions for ethnic minorities. The aim of this scoping review is to describe the methodological problems within occupational therapy research when ethnic minorities are included. A thorough literature search yielded 21 articles obtained from the scientific databases PubMed, Cinahl, Web of Science and PsychInfo. Analysis followed Arksey and O'Malley's framework for scoping reviews, applying content analysis. The results showed methodological issues concerning the entire research process: defining and recruiting samples, conceptual understanding, the lack of appropriate instruments, data collection using interpreters, and analyzing data. In order to avoid excluding ethnic minorities from adequate occupational therapy research and interventions, development of methods for the entire research process is needed. It is a costly and time-consuming process, but the results will be valid and reliable, and therefore more applicable in clinical practice.
Enviroplan—a summary methodology for comprehensive environmental planning and design
Robert Allen Jr.; George Nez; Fred Nicholson; Larry Sutphin
1979-01-01
This paper will discuss a comprehensive environmental assessment methodology that includes a numerical method for visual management and analysis. This methodology employs resource and human activity units as a means to produce a visual form unit which is the fundamental unit of the perceptual environment. The resource unit is based on the ecosystem as the fundamental...
On Improving the Experiment Methodology in Pedagogical Research
ERIC Educational Resources Information Center
Horakova, Tereza; Houska, Milan
2014-01-01
The paper shows how the methodology for a pedagogical experiment can be improved by including a pre-research stage. If the experiment has the form of a test procedure, an improvement of methodology can be achieved using, for example, the methods of statistical and didactic analysis of tests which are traditionally used in other areas, i.e.…
NASA Astrophysics Data System (ADS)
Tene, Yair; Tene, Noam; Tene, G.
1993-08-01
An interactive data fusion methodology of video, audio, and nonlinear structural dynamic analysis for potential application in forensic engineering is presented. The methodology was developed and successfully demonstrated in the analysis of a heavy transportable bridge collapse during preparation for testing. Multiple bridge element failures were identified after the collapse, including fracture, cracks and rupture of high performance structural materials. Videotape recording by a hand-held camcorder was the only source of information about the collapse sequence. The interactive data fusion methodology resulted in extracting relevant information from the videotape and from dynamic nonlinear structural analysis, leading to a full account of the sequence of events during the bridge collapse.
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
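The PFA idea of using engineering models conjointly with parameter uncertainty can be pictured with a Monte Carlo stress-versus-strength sketch; the model and the normal distributions below are illustrative assumptions, not the methodology's prescribed statistical structure:

```python
import random

def monte_carlo_failure_probability(n_samples=100_000, seed=1):
    """Propagate uncertain parameters through a simple engineering
    model (applied stress vs. material strength) and count failures.
    Both distributions and their parameters are hypothetical."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        stress = rng.gauss(400.0, 40.0)    # applied stress, MPa
        strength = rng.gauss(600.0, 50.0)  # material strength, MPa
        if stress >= strength:
            failures += 1
    return failures / n_samples

p = monte_carlo_failure_probability()
print(0.0 < p < 0.01)  # a small but nonzero failure probability
```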
2010-04-01
available [11]. Additionally, Table 3 is a guide for the DMAIC methodology, including 29 different methods [12]. ...Table 3: DMAIC Methodology (5-Phase Methodology): Define, Measure, Analyze, Improve, Control; Project Charter; Prioritization Matrix; 5 Whys Analysis...Methodology Scope [13]: DMAIC, PDCA. Develop performance priorities: this is a preliminary stage that precedes specific improvement projects, and the aim
RISMC Toolkit and Methodology Research and Development Plan for External Hazards Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coleman, Justin Leigh
This report includes the description and development plan for a Risk Informed Safety Margins Characterization (RISMC) toolkit and methodology that will evaluate multihazard risk in an integrated manner to support the operating nuclear fleet.
Probabilistic Based Modeling and Simulation Assessment
2010-06-01
different crash and blast scenarios. With the integration of the high fidelity neck and head model, a methodology to calculate the probability of injury...variability, correlation, and multiple (often competing) failure metrics. Important scenarios include vehicular collisions, blast/fragment impact, and...first area of focus is to develop a methodology to integrate probabilistic analysis into finite element analysis of vehicle collisions and blast. The
Development of a Practical Methodology for Elastic-Plastic and Fully Plastic Fatigue Crack Growth
NASA Technical Reports Server (NTRS)
McClung, R. C.; Chell, G. G.; Lee, Y. -D.; Russell, D. A.; Orient, G. E.
1999-01-01
A practical engineering methodology has been developed to analyze and predict fatigue crack growth rates under elastic-plastic and fully plastic conditions. The methodology employs the closure-corrected effective range of the J-integral, (Delta)J(sub eff), as the governing parameter. The methodology contains original and literature J and (Delta)J solutions for specific geometries, along with general methods for estimating J for other geometries and other loading conditions, including combined mechanical loading and combined primary and secondary loading. The methodology also contains specific practical algorithms that translate a J solution into a prediction of fatigue crack growth rate or life, including methods for determining crack opening levels, crack instability conditions, and material properties. A critical core subset of the J solutions and the practical algorithms has been implemented into independent elastic-plastic NASGRO modules. All components of the entire methodology, including the NASGRO modules, have been verified through analysis and experiment, and limits of applicability have been identified.
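A common way to use a (Delta)J(sub eff)-type governing parameter is a Paris-type power law integrated cycle by cycle; the power-law form, the constants, and the (Delta)J solution below are hypothetical, not the report's actual algorithms:

```python
def grow_crack(a0, a_final, delta_j_eff, C=1.0e-6, m=1.5):
    """Integrate da/dN = C * dJeff**m from crack length a0 to a_final,
    where delta_j_eff(a) returns the closure-corrected effective
    J-integral range at crack length a.  C and m are illustrative."""
    a, cycles = a0, 0
    while a < a_final:
        a += C * delta_j_eff(a) ** m  # crack growth this cycle
        cycles += 1
    return cycles

# Hypothetical case where dJeff grows linearly with crack length.
cycles_to_grow = grow_crack(1.0, 2.0, lambda a: 10.0 * a)
print(cycles_to_grow)  # on the order of 20,000 cycles for these constants
```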
Speed-Accuracy Tradeoffs in Speech Production
2017-06-01
imaging data of speech production. A theoretical framework for considering Fitts’ law in the domain of speech production is elucidated. Methodological ...articulatory kinematics conform to Fitts’ law. A second, associated goal is to address the methodological challenges inherent in performing Fitts-style...analysis on rtMRI data of speech production. Methodological challenges include segmenting continuous speech into specific motor tasks, defining key
Johnson, Blair T; Low, Robert E; MacDonald, Hayley V
2015-01-01
Systematic reviews now routinely assess methodological quality to gauge the validity of the included studies and of the synthesis as a whole. Although trends from higher quality studies should be clearer, it is uncertain how often meta-analyses incorporate methodological quality in models of study results either as predictors or, more interestingly, in interactions with theoretical moderators. We survey 200 meta-analyses in three health promotion domains to examine when and how meta-analyses incorporate methodological quality. Although methodological quality assessments commonly appear in contemporary meta-analyses (usually as scales), they are rarely incorporated in analyses, and still more rarely analysed in interaction with theoretical determinants of the success of health promotions. The few meta-analyses (2.5%) that did include such an interaction analysis showed that moderator results remained significant in higher quality studies or were present only among higher quality studies. We describe how to model quality interactively with theoretically derived moderators and discuss strengths and weaknesses of this approach and in relation to current meta-analytic practice. In large literatures exhibiting heterogeneous effects, meta-analyses can incorporate methodological quality and generate conclusions that enable greater confidence not only about the substantive phenomenon but also about the role that methodological quality itself plays.
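Modeling methodological quality interactively with a theoretical moderator amounts to an inverse-variance-weighted meta-regression with an interaction term; a sketch with fabricated numbers, shown only for the model structure:

```python
import numpy as np

def meta_regression(effects, variances, quality, moderator):
    """Weighted least-squares meta-regression with a quality x
    moderator interaction term.  Inverse-variance weights; the data
    passed in below are fabricated solely to show the structure."""
    X = np.column_stack([
        np.ones_like(effects), quality, moderator, quality * moderator,
    ])
    W = np.diag(1.0 / variances)
    # Solve the weighted normal equations for the coefficients.
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ effects)
    return beta  # intercept, quality, moderator, interaction

effects = np.array([0.2, 0.3, 0.5, 0.8, 0.1, 0.4])      # study effect sizes
variances = np.array([0.04, 0.05, 0.04, 0.03, 0.06, 0.05])
quality = np.array([0, 0, 1, 1, 0, 1])                   # low vs high quality
moderator = np.array([0, 1, 0, 1, 1, 0])                 # theoretical moderator
beta = meta_regression(effects, variances, quality, moderator)
print(beta.shape)  # (4,)
```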
Toumi, Mondher; Motrunich, Anastasiia; Millier, Aurélie; Rémuzat, Cécile; Chouaid, Christos; Falissard, Bruno; Aballéa, Samuel
2017-01-01
Background: Despite the guidelines for Economic and Public Health Assessment Committee (CEESP) submission having been available for nearly six years, the dossiers submitted continue to deviate from them, potentially impacting product prices. Objective: To review the reports published by CEESP, analyse deviations from the guidelines, and discuss their implications for the pricing and reimbursement process. Study design: CEESP reports published until January 2017 were reviewed, and deviations from the guidelines were extracted. The frequency of deviations was described by type of methodological concern (minor, important or major). Results: In 19 reports, we identified 243 methodological concerns, most often concerning modelling, measurement and valuation of health states, and results presentation and sensitivity analyses; nearly 63% were minor, 33% were important and 4.5% were major. All reports included minor methodological concerns, and 17 (89%) included at least one important and/or major methodological concern. Global major methodological concerns completely invalidated the analysis in seven dossiers (37%). Conclusion: The CEESP submission dossiers fail to adhere to the guidelines, potentially invalidating the health economics analysis and resulting in pricing negotiations. As these negotiations tend to be unfavourable for the manufacturer, the industry should strive to improve the quality of the analyses submitted to CEESP. PMID:28804600
NASA Technical Reports Server (NTRS)
Francois, J.
1981-01-01
The focus of the investigation is centered around two main themes: an analysis of the effects of aircraft noise on the psychological and physiological equilibrium of airport residents; and an analysis of the sources of variability of sensitivity to noise. The methodology used is presented. Nine statistical tables are included, along with a set of conclusions.
Regional Shelter Analysis Methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dillon, Michael B.; Dennison, Deborah; Kane, Jave
2015-08-01
The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country-specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
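The combination step, folding the protection that buildings provide into the population distribution, can be sketched as a population-weighted average of dose transmission; the building mix and protection factors below are hypothetical, not LLNL data:

```python
def regional_protection_factor(building_mix):
    """Population-weighted effective protection against fallout
    radiation.  building_mix maps a building/location type to
    (fraction_of_population, protection_factor).  A minimal sketch of
    the combination step described in the report."""
    total = sum(frac for frac, _ in building_mix.values())
    assert abs(total - 1.0) < 1e-9, "population fractions must sum to 1"
    # Average the dose *transmission* (1/PF) over the population,
    # then invert to get an effective regional protection factor.
    transmission = sum(frac / pf for frac, pf in building_mix.values())
    return 1.0 / transmission

mix = {
    "wood frame house": (0.6, 3.0),
    "concrete office, interior": (0.3, 50.0),
    "basement": (0.1, 100.0),
}
print(round(regional_protection_factor(mix), 2))  # 4.83
```

Averaging transmission rather than the protection factors themselves matters: the population in lightly protected buildings dominates the regional exposure.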
SOCIOECONOMIC ANALYSIS OF HAZARDOUS WASTE MANAGEMENT ALTERNATIVES: METHODOLOGY AND DEMONSTRATION
A methodology for analyzing economic and social effects of alternatives in hazardous waste management is presented and demonstrated. The approach includes the use of environmental threat scenarios and evaluation of effects on and responses by parties-at-interest. The methodology ...
Performer-centric Interface Design.
ERIC Educational Resources Information Center
McGraw, Karen L.
1995-01-01
Describes performer-centric interface design and explains a model-based approach for conducting performer-centric analysis and design. Highlights include design methodology, including cognitive task analysis; creating task scenarios; creating the presentation model; creating storyboards; proof of concept screens; object models and icons;…
Prabakaran, Rema; Seymour, Shiri; Moles, David R; Cunningham, Susan J
2012-08-01
Motivation and cooperation are vital components of orthodontic treatment if a good outcome is to be achieved. In this study, we used Q-methodology to investigate motivating factors among adolescents seeking orthodontic treatment and parents wanting their children to undergo orthodontic treatment. This technique asks participants to rank a series of statements, and the analysis of this ranking then provides insight into the participants' opinions. Each of these complementary studies was divided into 2 phases: interviews to generate a list of reasons for seeking orthodontic treatment and the use of Q-methodology to assess and categorize the relative importance of these reasons for the groups of participants. In the patient study, 32 items were generated from the interviews and placed in order of importance on a Q-methodology grid by 60 patients who were about to commence orthodontic treatment. The rankings were subjected to factor analysis, which categorized the patients' views into groups of shared opinions. The same methodology was used with the parent group, and a Q-methodology grid was designed to accommodate 35 items that were then ranked by the 60 parents. The rankings were subjected to factor analysis as for the patient group. For the patients, factor analysis identified 3 factors, all of which included esthetics as important. The remaining respondents had more individual viewpoints and did not map to any of the 3 factors. For the parents, factor analysis identified 4 factors, all of which included treatment in adolescence to prevent future problems as important. This study showed that Q-methodology is a novel and efficient tool that can be used in dental research with few difficulties. It might prove useful for the aspects of care for which subjective views or opinions play an important role. Copyright © 2012 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
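The analysis step of Q-methodology, correlating participants' rankings and factor-analyzing the person-by-person correlation matrix, can be sketched with unrotated principal components; real Q studies usually add factor rotation, and the Q-sorts below are fabricated:

```python
import numpy as np

def q_factors(sorts, n_factors=2):
    """Correlate participants' Q-sorts (rows = participants,
    columns = statements) and extract principal components of the
    person-by-person correlation matrix.  Factor rotation, standard
    in Q studies, is omitted from this sketch."""
    corr = np.corrcoef(sorts)               # participant x participant
    eigvals, eigvecs = np.linalg.eigh(corr)
    order = np.argsort(eigvals)[::-1]       # largest eigenvalues first
    top = order[:n_factors]
    # Scale eigenvectors to factor loadings.
    return eigvecs[:, top] * np.sqrt(eigvals[top])

# Fabricated Q-sorts: participants 0-1 share one viewpoint,
# participants 2-3 roughly the opposite one.
sorts = np.array([
    [3, 2, 1, 0, -1, -2],
    [3, 1, 2, 0, -2, -1],
    [-2, -1, 0, 1, 2, 3],
    [-1, -2, 0, 2, 1, 3],
])
loadings = q_factors(sorts)
print(loadings.shape)  # one row of factor loadings per participant
```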
A Radial Basis Function Approach to Financial Time Series Analysis
1993-12-01
including efficient methods for parameter estimation and pruning, a pointwise prediction error estimator, and a methodology for controlling the "data...collection of practical techniques to address these issues for a modeling methodology. Radial Basis Function networks. These techniques include efficient...methodology often then amounts to a careful consideration of the interplay between model complexity and reliability. These will be recurrent themes
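The model family studied in the report can be illustrated with a minimal radial basis function network, a weighted sum of Gaussian bumps; its parameter-estimation and pruning techniques are not reproduced here:

```python
import numpy as np

def rbf_predict(x, centers, widths, weights):
    """Evaluate a radial basis function network: each output is a
    weighted sum of Gaussian basis functions centered at `centers`
    with standard deviations `widths`.  Parameters here are set by
    hand; the report's estimation methods are not reproduced."""
    phi = np.exp(-((x[:, None] - centers[None, :]) ** 2)
                 / (2.0 * widths[None, :] ** 2))
    return phi @ weights

centers = np.array([0.0, 1.0])
widths = np.array([0.5, 0.5])
weights = np.array([1.0, -1.0])
x = np.array([0.0, 0.5, 1.0])
# Antisymmetric by construction: positive near 0, zero midway,
# negative near 1.
print(np.round(rbf_predict(x, centers, widths, weights), 3))
```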
Conceptual and Preliminary Design of a Low-Cost Precision Aerial Delivery System
2016-06-01
test results. It includes an analysis of the failure modes encountered during flight experimentation, methodology used for conducting coordinate...and experimentation. Additionally, the current and desired end state of the research is addressed. Finally, this chapter outlines the methodology...preliminary design phases are utilized to investigate and develop a potentially low-cost alternative to existing systems. Using an Agile methodology
A human factors methodology for real-time support applications
NASA Technical Reports Server (NTRS)
Murphy, E. D.; Vanbalen, P. M.; Mitchell, C. M.
1983-01-01
A general approach to the human factors (HF) analysis of new or existing projects at NASA/Goddard is delineated. Because the methodology evolved from HF evaluations of the Mission Planning Terminal (MPT) and the Earth Radiation Budget Satellite Mission Operations Room (ERBS MOR), it is directed specifically to the HF analysis of real-time support applications. Major topics included for discussion are the process of establishing a working relationship between the Human Factors Group (HFG) and the project, orientation of HF analysts to the project, human factors analysis and review, and coordination with major cycles of system development. Sub-topics include specific areas for analysis and appropriate HF tools. Management support functions are outlined. References provide a guide to sources of further information.
Review of Recent Methodological Developments in Group-Randomized Trials: Part 2-Analysis.
Turner, Elizabeth L; Prague, Melanie; Gallis, John A; Li, Fan; Murray, David M
2017-07-01
In 2004, Murray et al. reviewed methodological developments in the design and analysis of group-randomized trials (GRTs). We have updated that review with developments in analysis of the past 13 years, with a companion article to focus on developments in design. We discuss developments in the topics of the earlier review (e.g., methods for parallel-arm GRTs, individually randomized group-treatment trials, and missing data) and in new topics, including methods to account for multiple-level clustering and alternative estimation methods (e.g., augmented generalized estimating equations, targeted maximum likelihood, and quadratic inference functions). In addition, we describe developments in analysis of alternative group designs (including stepped-wedge GRTs, network-randomized trials, and pseudocluster randomized trials), which require clustering to be accounted for in their design and analysis.
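One quantity underlying all of these analysis methods is the variance inflation caused by clustering; the classical design effect (a standard result, not specific to this review) with illustrative numbers:

```python
def design_effect(cluster_size, icc):
    """Classical variance inflation factor for group-randomized
    trials: DEFF = 1 + (m - 1) * ICC, where m is the average cluster
    size and ICC the intraclass correlation coefficient."""
    return 1.0 + (cluster_size - 1) * icc

# Even a small ICC inflates variance substantially when clusters are
# large: 100 members per group with ICC = 0.02 nearly triples it.
print(design_effect(100, 0.02))  # 2.98
```

This is why GRT analyses must account for clustering: ignoring a design effect of ~3 understates standard errors by a factor of about sqrt(3).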
Evaluation of a Progressive Failure Analysis Methodology for Laminated Composite Structures
NASA Technical Reports Server (NTRS)
Sleight, David W.; Knight, Norman F., Jr.; Wang, John T.
1997-01-01
A progressive failure analysis methodology has been developed for predicting the nonlinear response and failure of laminated composite structures. The progressive failure analysis uses C1 plate and shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms. The progressive failure analysis model is implemented into a general-purpose finite element code and can predict the damage and response of laminated composite structures from initial loading to final failure.
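As a concrete instance of one criterion named above, a maximum strain check is straightforward to state; the strain components and allowables below are illustrative values, not from the paper:

```python
def max_strain_failure(strains, allowables):
    """Maximum strain criterion: a ply is predicted to fail when any
    strain component reaches its allowable.  Components are taken as
    (eps_1, eps_2, gamma_12) in the ply material axes."""
    return any(abs(e) >= abs(a) for e, a in zip(strains, allowables))

# Below all allowables: no predicted ply failure.
print(max_strain_failure((0.004, 0.002, 0.006), (0.010, 0.005, 0.015)))  # False
# Fiber-direction strain exceeds its allowable: predicted failure.
print(max_strain_failure((0.011, 0.002, 0.006), (0.010, 0.005, 0.015)))  # True
```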
Development of Methodologies Evaluating Emissions from Metal-Containing Explosives and Propellants
Experiments were performed to develop methodologies that will allow determination of pollutant emission factors for gases and particles produced by...micrometer, 16 by weight). Although not included here, the analysis methods described will be directly applicable to the study of pyrotechnics.
Representation of scientific methodology in secondary science textbooks
NASA Astrophysics Data System (ADS)
Binns, Ian C.
The purpose of this investigation was to assess the representation of scientific methodology in secondary science textbooks. More specifically, this study looked at how textbooks introduced scientific methodology and to what degree the examples from the rest of the textbook, the investigations, and the images were consistent with the text's description of scientific methodology, if at all. The sample included eight secondary science textbooks from two publishers, McGraw-Hill/Glencoe and Harcourt/Holt, Rinehart & Winston. Data consisted of all student text and teacher text that referred to scientific methodology. Second, all investigations in the textbooks were analyzed. Finally, any images that depicted scientists working were also collected and analyzed. The text analysis and activity analysis used the ethnographic content analysis approach developed by Altheide (1996). The rubrics used for the text analysis and activity analysis were initially guided by the Benchmarks (AAAS, 1993), the NSES (NRC, 1996), and the nature of science literature. Preliminary analyses helped to refine each of the rubrics and grounded them in the data. Image analysis used stereotypes identified in the DAST literature. Findings indicated that all eight textbooks presented mixed views of scientific methodology in their initial descriptions. Five textbooks placed more emphasis on the traditional view and three placed more emphasis on the broad view. Results also revealed that the initial descriptions, examples, investigations, and images all emphasized the broad view for Glencoe Biology and the traditional view for Chemistry: Matter and Change. The initial descriptions, examples, investigations, and images in the other six textbooks were not consistent. Overall, the textbook with the most appropriate depiction of scientific methodology was Glencoe Biology and the textbook with the least appropriate depiction of scientific methodology was Physics: Principles and Problems. 
These findings suggest that compared to earlier investigations, textbooks have begun to improve in how they represent scientific methodology. However, there is still much room for improvement. Future research needs to consider how textbooks impact teachers' and students' understandings of scientific methodology.
Philosophical and Methodological Beliefs of Instructional Design Faculty and Professionals
ERIC Educational Resources Information Center
Sheehan, Michael D.; Johnson, R. Burke
2012-01-01
The purpose of this research was to probe the philosophical beliefs of instructional designers using sound philosophical constructs and quantitative data collection and analysis. We investigated the philosophical and methodological beliefs of instructional designers, including 152 instructional design faculty members and 118 non-faculty…
Methodological quality of behavioural weight loss studies: a systematic review
Lemon, S. C.; Wang, M. L.; Haughton, C. F.; Estabrook, D. P.; Frisard, C. F.; Pagoto, S. L.
2018-01-01
This systematic review assessed the methodological quality of behavioural weight loss intervention studies conducted among adults and associations between quality and statistically significant weight loss outcome, strength of intervention effectiveness and sample size. Searches for trials published between January, 2009 and December, 2014 were conducted using PUBMED, MEDLINE and PSYCINFO and identified ninety studies. Methodological quality indicators included study design, anthropometric measurement approach, sample size calculations, intent-to-treat (ITT) analysis, loss to follow-up rate, missing data strategy, sampling strategy, report of treatment receipt and report of intervention fidelity (mean = 6.3). Indicators most commonly utilized included randomized design (100%), objectively measured anthropometrics (96.7%), ITT analysis (86.7%) and reporting treatment adherence (76.7%). Most studies (62.2%) had a follow-up rate >75% and reported a loss to follow-up analytic strategy or minimal missing data (69.9%). Describing intervention fidelity (34.4%) and sampling from a known population (41.1%) were least common. Methodological quality was not associated with reporting a statistically significant result, effect size or sample size. This review found the published literature of behavioural weight loss trials to be of high quality for specific indicators, including study design and measurement. Areas identified for improvement include the use of more rigorous statistical approaches to loss to follow-up and better fidelity reporting. PMID:27071775
NASA Astrophysics Data System (ADS)
Ortiz-Jaramillo, B.; Fandiño Toro, H. A.; Benitez-Restrepo, H. D.; Orjuela-Vargas, S. A.; Castellanos-Domínguez, G.; Philips, W.
2012-03-01
Infrared Non-Destructive Testing (INDT) is known as an effective and rapid method for nondestructive inspection. It can detect a broad range of near-surface structural flaws in metallic and composite components. Those flaws are modeled as a smooth contour centered at peaks of stored thermal energy, termed Regions of Interest (ROI), and dedicated methodologies must detect the presence of those ROIs. In this paper, we present a methodology for ROI extraction in INDT tasks that deals with the difficulties due to non-uniform heating, which affects low spatial frequencies and hinders the detection of relevant points in the image. The proposed methodology uses multi-resolution analysis and is robust to low ROI contrast and non-uniform heating. It includes local correlation, Gaussian scale analysis and local edge detection. Local correlation between the image and a Gaussian window provides interest points related to ROIs; a Gaussian window is used because thermal behavior is well modeled by smooth Gaussian contours. The Gaussian scale is used to analyze details in the image through multi-resolution analysis, avoiding low contrast, non-uniform heating and the selection of the Gaussian window size. Finally, local edge detection provides a good estimation of the boundaries of the ROI. Thus, we provide a methodology for ROI extraction based on multi-resolution analysis that performs as well as or better than the other dedicated algorithms proposed in the state of the art.
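A simplified sketch of the described pipeline (Gaussian smoothing, suppression of the slowly varying background caused by non-uniform heating, then edge detection), assuming SciPy is available; the parameter values and the synthetic frame are assumptions, not the paper's exact algorithm:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, sobel

def extract_roi(frame, sigma=2.0, threshold=0.5):
    """Smooth at the ROI scale, subtract a much coarser smoothing as
    the non-uniform-heating background, threshold what remains, and
    mark the boundary of the hot region."""
    background = gaussian_filter(frame, sigma=8 * sigma)  # low frequencies
    detail = gaussian_filter(frame, sigma=sigma) - background
    mask = detail > threshold * detail.max()
    grad = np.hypot(sobel(mask.astype(float), axis=0),
                    sobel(mask.astype(float), axis=1))
    return mask, grad > 0

frame = np.zeros((64, 64))
frame[28:36, 28:36] = 1.0  # hypothetical hot spot on a cold plate
mask, boundary = extract_roi(frame)
print(mask.any(), boundary.any())  # True True
```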
A methodology for producing reliable software, volume 1
NASA Technical Reports Server (NTRS)
Stucki, L. G.; Moranda, P. B.; Foshee, G.; Kirchoff, M.; Omre, R.
1976-01-01
An investigation into the areas having an impact on producing reliable software, including automated verification tools, software modeling, testing techniques, structured programming, and management techniques, is presented. This final report contains the results of this investigation, an analysis of each technique, and the definition of a methodology for producing reliable software.
MacDonnell, Judith Ann
2014-01-01
The aim of this analysis is to contribute to an understanding of emancipatory nursing in the context of higher education. Engagement with formative studies that used critical feminist methodologies led to my research focus on lesbian, gay, bisexual, and transgender (LGBT) health in my academic research program. Dimensions of emancipatory nursing include reflexivity, transformative learning, interdisciplinarity, praxis, and situated privilege. Several critical feminist methodologies are addressed: feminist ethnography, community-based participatory action research (CBPAR), and comparative life history. Commonalities across methodologies illustrate the potential for emancipatory outcomes/goals.
NASA Technical Reports Server (NTRS)
Onwubiko, Chinyere; Onyebueke, Landon
1996-01-01
This is the final report covering all the work done on this project. The goal of the project is the transfer of methodologies that improve the design process. The specific objectives are: 1. To learn and understand probabilistic design analysis using NESSUS. 2. To assign design projects on the application of NESSUS to undergraduate or graduate students. 3. To integrate the application of NESSUS into selected senior-level courses in the Civil and Mechanical Engineering curricula. 4. To develop courseware in probabilistic design methodology to be included in a graduate-level design methodology course. 5. To study the relationship between probabilistic design methodology and axiomatic design methodology.
Wavelet and Multiresolution Analysis for Finite Element Networking Paradigms
NASA Technical Reports Server (NTRS)
Kurdila, Andrew J.; Sharpley, Robert C.
1999-01-01
This paper presents a final report on Wavelet and Multiresolution Analysis for Finite Element Networking Paradigms. The focus of this research is to derive and implement: 1) Wavelet based methodologies for the compression, transmission, decoding, and visualization of three dimensional finite element geometry and simulation data in a network environment; 2) methodologies for interactive algorithm monitoring and tracking in computational mechanics; and 3) Methodologies for interactive algorithm steering for the acceleration of large scale finite element simulations. Also included in this report are appendices describing the derivation of wavelet based Particle Image Velocity algorithms and reduced order input-output models for nonlinear systems by utilizing wavelet approximations.
Global/local methods research using the CSM testbed
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Ransom, Jonathan B.; Griffin, O. Hayden, Jr.; Thompson, Danniella M.
1990-01-01
Research activities in global/local stress analysis are described including both two- and three-dimensional analysis methods. These methods are being developed within a common structural analysis framework. Representative structural analysis problems are presented to demonstrate the global/local methodologies being developed.
Lunar mission safety and rescue: Hazards analysis and safety requirements
NASA Technical Reports Server (NTRS)
1971-01-01
The results of the hazards analysis, which was concerned only with hazards to personnel and not with loss of equipment or property, are presented. Hazards characterization includes the definition of a hazard, the hazard levels, and the hazard groups. The analysis methodology is described in detail. The methodology was used to prepare the top-level functional flow diagrams, to perform the first-level hazards assessment, and to develop a list of conditions and situations requiring individual hazard studies. The results of all 39 individual hazard studies are presented.
Tao, Huan; Zhang, Yueyuan; Li, Qian; Chen, Jin
2017-11-01
To assess the methodological quality of systematic reviews (SRs) and meta-analyses concerning the predictive value of ERCC1 in platinum-based chemotherapy for non-small cell lung cancer (NSCLC), we searched the PubMed, EMbase, Cochrane Library, international prospective register of systematic reviews, Chinese BioMedical Literature Database, China National Knowledge Infrastructure (CNKI), Wan Fang, and VIP databases for SRs and meta-analyses. The methodological quality of the included literature was evaluated with the Risk of Bias in Systematic Reviews (ROBIS) tool. Nineteen eligible SRs/meta-analyses were included. The most frequently searched databases were EMbase (74%), PubMed, Medline, and CNKI. Fifteen SRs performed additional manual retrieval, but none searched a registration platform. Forty-seven percent described a two-reviewer model for screening eligible original articles, and seven SRs described two reviewers extracting data. In the methodological quality assessment, inter-rater reliability (kappa) between the two reviewers was 0.87. In phase 1, the research questions of all SRs were well formulated and the eligibility criteria were suitable, rated as 'low' risk of bias. However, 'high' risk of bias was present in all SRs regarding the methods used to identify and/or select studies and regarding data collection and study appraisal. More than two-thirds of the SRs/meta-analyses showed high risk of bias in the synthesis and findings and in the final phase. The study demonstrated poor methodological quality of SRs/meta-analyses assessing the predictive value of ERCC1 in chemotherapy among NSCLC patients, especially high performance bias. Registration or publication of the protocol is recommended in future research.
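The reported inter-rater reliability (kappa = 0.87) is a chance-corrected agreement statistic. Assuming Cohen's kappa was the statistic used (the abstract does not say), a minimal sketch with invented ratings:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / (n * n)
    return (observed - expected) / (1.0 - expected)

# Invented risk-of-bias ratings from two hypothetical reviewers
a = ["low", "low", "high", "high", "low", "high"]
b = ["low", "low", "high", "low", "low", "high"]
print(round(cohens_kappa(a, b), 3))  # → 0.667
```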
Kids'Cam: An Objective Methodology to Study the World in Which Children Live.
Signal, Louise N; Smith, Moira B; Barr, Michelle; Stanley, James; Chambers, Tim J; Zhou, Jiang; Duane, Aaron; Jenkin, Gabrielle L S; Pearson, Amber L; Gurrin, Cathal; Smeaton, Alan F; Hoek, Janet; Ni Mhurchu, Cliona
2017-09-01
This paper reports on a new methodology to objectively study the world in which children live. The primary research study (Kids'Cam Food Marketing) illustrates the method; numerous ancillary studies include exploration of children's exposure to alcohol, smoking, "blue" space and gambling, and their use of "green" space, transport, and sun protection. One hundred sixty-eight randomly selected children (aged 11-13 years), recruited from 16 randomly selected schools in Wellington, New Zealand, used wearable cameras and GPS units for 4 days, recording imagery every 7 seconds and longitude/latitude locations every 5 seconds. Data were collected from July 2014 to June 2015. Analysis commenced in 2015 and is ongoing. Bespoke software was used to manually code images for variables of interest, including setting, marketing media, and product category, to produce variables for statistical analysis. GPS data were extracted and cleaned in ArcGIS, version 10.3, for spatial analysis of exposure. Approximately 1.4 million images and 2.2 million GPS coordinates were generated (most were usable) from many settings, including the difficult-to-measure exposures in the home, at school, and during leisure time. The method is ethical, legal, and acceptable to children and the wider community. This methodology enabled objective analysis of the world in which children live. The main arm examined the frequency and nature of children's exposure to food and beverage marketing and provided data on difficult-to-measure settings. The methodology will likely generate robust evidence facilitating more effective policymaking to address numerous public health concerns. Copyright © 2017. Published by Elsevier Inc.
Involving People in the Analysis: Listening, Reflecting, Discounting Nothing.
ERIC Educational Resources Information Center
Richardson, Malcolm
2002-01-01
This article explores methodological issues arising from a research project that involved six people with learning difficulties in researching aspects of their own lives. These included how participants were included in data analysis and the researcher's role. It stresses the importance of the researcher listening to participants, taking time to…
Pedagogy in Counselor Education: A 10-Year Content Analysis of Journals
ERIC Educational Resources Information Center
Barrio Minton, Casey A.; Wachter Morris, Carrie A.; Yaites, LaToya D.
2014-01-01
This content analysis includes 230 peer-reviewed articles regarding teaching and learning published in journals of the American Counseling Association and its divisions between January 2001 and December 2010. Results include examination of focus, pedagogical foundations, and the methodologies used. Implications for the scholarship of teaching and…
Computational methods for global/local analysis
NASA Technical Reports Server (NTRS)
Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.
1992-01-01
Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.
Environmental exposure effects on composite materials for commercial aircraft
NASA Technical Reports Server (NTRS)
Hoffman, D. J.
1978-01-01
Activities reported include completion of the program design tasks, resolution of a high-fiber-volume problem and resumption of specimen fabrication, fixture fabrication, and progress on the analysis methodology and definition of the typical aircraft environment. Program design activities, including test specimens, specimen holding fixtures, flap-track fairing tailcones, and ground exposure racks, were completed. The problem of obtaining acceptable fiber volume fraction results on two of the selected graphite-epoxy material systems was resolved by altering the bagging procedure called out in BAC 5562; the revised procedure, involving fewer bleeder plies, produces acceptable results. All laminates required for the contract have now been laid up and cured. Progress in the area of analysis methodology has centered on defining the environment that a commercial transport aircraft undergoes. The selected methodology is analogous to fatigue life assessment.
Klepárník, Karel
2015-01-01
This review focuses on the latest developments in microseparation electromigration methods in capillaries and microfluidic devices with MS detection and identification. A selection of 183 relevant articles covers the literature published from June 2012 until May 2014, continuing the review article on the same topic by Kleparnik [Electrophoresis 2013, 34, 70-86]. Special attention is paid to new improvements in the theory, instrumentation, and methodology of MS interfacing with capillary versions of zone electrophoresis, ITP, and IEF. Ionization methods in MS include ESI, MALDI, and ICP. Although the main attention is paid to the development of instrumentation and methodology, representative examples also illustrate applications in proteomics, glycomics, metabolomics, biomarker research, forensics, pharmacology, food analysis, and single-cell analysis. Combinations of MS with capillary versions of electrochromatography and micellar electrokinetic chromatography are not included. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
A Methodology for Teaching Afro-American Literature.
ERIC Educational Resources Information Center
Kittrell, Jean
This paper outlines a system of methods for teaching Afro-American Literature at the secondary and college level. Seven goals of the methodology are presented for the course, including making the students familiar with various definitions of black literature, helping the students use the tools of literary analysis in the discussion of black…
ERIC Educational Resources Information Center
Braguglia, Kay H.; Jackson, Kanata A.
2012-01-01
This article presents a reflective analysis of teaching research methodology through a three course sequence using a project-based approach. The authors reflect critically on their experiences in teaching research methods courses in an undergraduate business management program. The introduction of a range of specific techniques including student…
Radiation Assurance for the Space Environment
NASA Technical Reports Server (NTRS)
Barth, Janet L.; LaBel, Kenneth A.; Poivey, Christian
2004-01-01
The space radiation environment can lead to extremely harsh operating conditions for spacecraft electronic systems. A hardness assurance methodology must be followed to assure that the space radiation environment does not compromise the functionality and performance of space-based systems during the mission lifetime. The methodology includes a definition of the radiation environment, assessment of the radiation sensitivity of parts, worst-case analysis of the impact of radiation effects, and part acceptance decisions which are likely to include mitigation measures.
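The part-acceptance step described above can be reduced to a one-line screen. The sketch below assumes a common radiation hardness assurance convention (a radiation design margin applied to the mission total dose); the margin, part tolerance, and dose numbers are invented for illustration, not taken from the abstract.

```python
def part_acceptance(part_tdr_krad, mission_dose_krad, design_margin=2.0):
    """Worst-case total-dose screen: accept a part when its dose tolerance exceeds
    the mission dose multiplied by a radiation design margin (hypothetical values)."""
    return part_tdr_krad >= design_margin * mission_dose_krad

# A 100 krad(Si)-tolerant part against two hypothetical mission environments
print(part_acceptance(100.0, 30.0))  # → True (part accepted)
print(part_acceptance(100.0, 60.0))  # → False (mitigation needed, e.g. spot shielding)
```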
de Paiva, Anderson Paulo
2018-01-01
This research evaluates the influence of the Brazilian accreditation methodology on the sustainability of organizations. Critical factors for implementing accreditation were also examined, including the relationships between these factors and organizational sustainability. The study was developed using a survey methodology applied to organizations accredited by the ONA (National Accreditation Organization); 288 responses were received from top-level managers. Quantitative data from the measurement models were analyzed with principal component factor analysis, and the final model was evaluated using confirmatory factor analysis and structural equation modeling. The results are important for defining the factors that affect accreditation processes, providing a better understanding for accredited organizations and for Brazilian accreditation. PMID:29599939
Deljavan, Reza; Sadeghi-Bazargani, Homayoun; Fouladi, Nasrin; Arshi, Shahnam; Mohammadi, Reza
2012-01-01
Little has been done to investigate the application of injury specific qualitative research methods in the field of burn injuries. The aim of this study was to use an analytical tool (Haddon's matrix) through qualitative research methods to better understand people's perceptions about burn injuries. This study applied Haddon's matrix as a framework and an analytical tool for a qualitative research methodology in burn research. Both child and adult burn injury victims were enrolled into a qualitative study conducted using focus group discussion. Haddon's matrix was used to develop an interview guide and also through the analysis phase. The main analysis clusters were pre-event level/human (including risky behaviors, belief and cultural factors, and knowledge and education), pre-event level/object, pre-event phase/environment and event and post-event phase (including fire control, emergency scald and burn wound management, traditional remedies, medical consultation, and severity indicators). This research gave rise to results that are possibly useful both for future injury research and for designing burn injury prevention plans. Haddon's matrix is applicable in a qualitative research methodology both at data collection and data analysis phases. The study using Haddon's matrix through a qualitative research methodology yielded substantially rich information regarding burn injuries that may possibly be useful for prevention or future quantitative research.
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating the failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when the information on which to base an assessment of failure risk, including test experience and knowledge of the parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of the statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
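The core PFA computation, propagating parameter uncertainty through an analytical failure model to a failure probability, can be sketched as a Monte Carlo loop. The S-N style life model, the lognormal uncertainties, and every numeric value below are hypothetical placeholders for the documented engineering models, not the actual PFA software.

```python
import math
import random

def fatigue_life_model(stress, strength_coeff, exponent):
    """Hypothetical S-N style life model: cycles to failure = C * stress^-m."""
    return strength_coeff * stress ** (-exponent)

def failure_probability(service_cycles, n_samples=20000, seed=1):
    """Monte Carlo over uncertain parameters: P(predicted life < required service life)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        stress = rng.lognormvariate(math.log(100.0), 0.10)  # uncertain load, MPa
        coeff = rng.lognormvariate(math.log(1e12), 0.30)    # uncertain material constant
        life = fatigue_life_model(stress, coeff, exponent=3.0)
        if life < service_cycles:
            failures += 1
    return failures / n_samples

# Median predicted life is ~1e6 cycles under these assumptions, so a 5e5-cycle
# service requirement yields a small but non-negligible failure probability.
print(failure_probability(service_cycles=5e5))
```

In the full methodology these prior distributions would additionally be updated with test and flight experience; that Bayesian step is omitted here.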
Functional Analysis and Treatment of Nail Biting
ERIC Educational Resources Information Center
Dufrene, Brad A.; Watson, T. Steuart; Kazmerski, Jennifer S.
2008-01-01
This study applied functional analysis methodology to nail biting exhibited by a 24-year-old female graduate student. Results from the brief functional analysis indicated variability in nail biting across assessment conditions. Functional analysis data were then used to guide treatment development and implementation. Treatment included a…
Hernando, David; Hernando, Alberto; Casajús, Jose A; Laguna, Pablo; Garatachea, Nuria; Bailón, Raquel
2018-05-01
Standard methodologies for heart rate variability (HRV) analysis, and its physiological interpretation as a marker of autonomic nervous system condition, have been published largely for resting conditions, but much less so for exercise. A methodological framework for HRV analysis during exercise is proposed that deals with the non-stationary nature of HRV during exercise, includes respiratory information, and identifies and corrects spectral components related to cardiolocomotor coupling (CC). It is applied to 23 male subjects who underwent different tests (maximal and submaximal, running and cycling) in which the ECG, respiratory frequency, and oxygen consumption were simultaneously recorded. High-frequency (HF) power estimates differ substantially between the standard fixed band and the proposed methodology: for medium and high exercise levels and during recovery, HF power increases by 20 to 40%. When cycling, HF power is around 40% higher than when running, while CC power is around 20% stronger in running.
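Band-limited spectral power, the HF measure discussed above, can be computed with a direct periodogram. This sketch uses a synthetic stationary series with components placed exactly on DFT bins; it deliberately ignores the non-stationarity and respiratory-band tracking that the proposed framework actually addresses, and all frequencies and amplitudes are invented.

```python
import math

def band_power(signal, fs, f_lo, f_hi):
    """One-sided power of `signal` in [f_lo, f_hi] Hz via a direct DFT periodogram."""
    n = len(signal)
    mean = sum(signal) / n
    x = [v - mean for v in signal]
    power = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if f_lo <= f <= f_hi:
            re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            power += (re * re + im * im) / (n * n)
    return power

# Synthetic HRV-like series: LF component at 0.125 Hz plus a smaller
# HF (respiratory-band) component at 0.3125 Hz, sampled at 4 Hz
fs = 4.0
sig = [0.05 * math.sin(2 * math.pi * 0.125 * t / fs)
       + 0.03 * math.sin(2 * math.pi * 0.3125 * t / fs) for t in range(256)]
lf = band_power(sig, fs, 0.04, 0.15)
hf = band_power(sig, fs, 0.15, 0.40)
print(lf > hf)  # → True
```

For a sinusoid of amplitude A landing on a single bin, the one-sided band power is A²/4, which the test below checks.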
Methodology for national risk analysis and prioritization of toxic industrial chemicals.
Taxell, Piia; Engström, Kerstin; Tuovila, Juha; Söderström, Martin; Kiljunen, Harri; Vanninen, Paula; Santonen, Tiina
2013-01-01
The identification of chemicals that pose the greatest threat to human health from incidental releases is a cornerstone in public health preparedness for chemical threats. The present study developed and applied a methodology for the risk analysis and prioritization of industrial chemicals to identify the most significant chemicals that pose a threat to public health in Finland. The prioritization criteria included acute and chronic health hazards, physicochemical and environmental hazards, national production and use quantities, the physicochemical properties of the substances, and the history of substance-related incidents. The presented methodology enabled a systematic review and prioritization of industrial chemicals for the purpose of national public health preparedness for chemical incidents.
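A prioritization of the kind described typically reduces to a weighted multi-criteria score. The weights, criterion names, chemicals, and scores below are invented; they only illustrate the ranking mechanics, not the Finnish study's actual criteria or results.

```python
def priority_score(chemical, weights):
    """Weighted sum of normalized criterion scores (each in 0-1); hypothetical weighting."""
    return sum(weights[c] * chemical[c] for c in weights)

weights = {"acute_health": 0.35, "chronic_health": 0.25,
           "physchem_hazard": 0.15, "quantity": 0.15, "incident_history": 0.10}

chemicals = {
    "chlorine": {"acute_health": 0.9, "chronic_health": 0.4,
                 "physchem_hazard": 0.6, "quantity": 0.8, "incident_history": 0.7},
    "toluene":  {"acute_health": 0.4, "chronic_health": 0.5,
                 "physchem_hazard": 0.7, "quantity": 0.9, "incident_history": 0.3},
}

ranked = sorted(chemicals, key=lambda c: priority_score(chemicals[c], weights),
                reverse=True)
print(ranked)  # → ['chlorine', 'toluene']
```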
Predicting the Reliability of Ceramics Under Transient Loads and Temperatures With CARES/Life
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Jadaan, Osama M.; Palfi, Tamas; Baker, Eric H.
2003-01-01
A methodology is shown for predicting the time-dependent reliability of ceramic components against catastrophic rupture when subjected to transient thermomechanical loads (including cyclic loads). The methodology takes into account the changes in material response that can occur with temperature or time (i.e., changing fatigue and Weibull parameters with temperature or time). This capability has been added to the NASA CARES/Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code. The code has been modified to have the ability to interface with commercially available finite element analysis (FEA) codes executed for transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.
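A minimal sketch of the idea, under the usual two-parameter Weibull strength model: per-step rupture probabilities are computed with temperature-dependent Weibull parameters and multiplied together as independent survival terms. The parameter values and the independence assumption are illustrative simplifications, not the CARES/Life implementation.

```python
import math

def rupture_probability(stress, scale, modulus):
    """Two-parameter Weibull probability of rupture at a given stress."""
    return 1.0 - math.exp(-((stress / scale) ** modulus))

def transient_reliability(load_history, params_by_temp):
    """Survival over a transient load/temperature history, assuming independent
    steps and Weibull parameters that change with temperature."""
    survival = 1.0
    for stress, temp in load_history:
        scale, modulus = params_by_temp[temp]
        survival *= 1.0 - rupture_probability(stress, scale, modulus)
    return survival

# Hypothetical material data: (Weibull scale in MPa, Weibull modulus) per temperature
params = {"20C": (500.0, 10.0), "800C": (350.0, 7.0)}
# Hypothetical transient: heat-up cycle with higher stress at temperature
history = [(200.0, "20C"), (260.0, "800C"), (180.0, "20C")]
print(round(transient_reliability(history, params), 4))
```

The hot step dominates here: the lower Weibull scale and modulus at 800 C make it by far the riskiest part of the transient.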
Review of Recent Methodological Developments in Group-Randomized Trials: Part 1—Design
Turner, Elizabeth L.; Li, Fan; Gallis, John A.; Prague, Melanie; Murray, David M.
2017-01-01
In 2004, Murray et al. reviewed methodological developments in the design and analysis of group-randomized trials (GRTs). We have highlighted the developments of the past 13 years in design with a companion article to focus on developments in analysis. As a pair, these articles update the 2004 review. We have discussed developments in the topics of the earlier review (e.g., clustering, matching, and individually randomized group-treatment trials) and in new topics, including constrained randomization and a range of randomized designs that are alternatives to the standard parallel-arm GRT. These include the stepped-wedge GRT, the pseudocluster randomized trial, and the network-randomized GRT, which, like the parallel-arm GRT, require clustering to be accounted for in both their design and analysis. PMID:28426295
On the Application of Syntactic Methodologies in Automatic Text Analysis.
ERIC Educational Resources Information Center
Salton, Gerard; And Others
1990-01-01
Summarizes various linguistic approaches proposed for document analysis in information retrieval environments. Topics discussed include syntactic analysis; use of machine-readable dictionary information; knowledge base construction; the PLNLP English Grammar (PEG) system; phrase normalization; and statistical and syntactic phrase evaluation used…
Methodology for object-oriented real-time systems analysis and design: Software engineering
NASA Technical Reports Server (NTRS)
Schoeffler, James D.
1991-01-01
Successful application of software engineering methodologies requires an integrated analysis and design life cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Different analysis methodologies often lead to different structurings of the system, so the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation but the original specification, and perhaps the high-level design, is not object oriented. Two approaches to real-time systems analysis that can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects whose operations or methods correspond to processes in the data flow diagrams, and then designing in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly connected models that progress from object-oriented real-time systems analysis logical models through physical architectural models to the high-level design stages. The methodology is appropriate to the overall specification, including hardware and software modules; within software modules, the systems-analysis objects are transformed into software objects.
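The building block of the second approach, an entity whose time behavior is a set of states plus state-transition rules, can be sketched directly. The valve example, its states, and its event names are all hypothetical.

```python
class SystemsAnalysisObject:
    """Minimal sketch of a 'real-time systems-analysis object': a named entity
    whose time behavior is defined by states and transition rules (event -> next state)."""

    def __init__(self, name, initial, transitions):
        self.name = name
        self.state = initial
        self.transitions = transitions  # {(state, event): next_state}

    def handle(self, event):
        """Apply a transition rule if one matches; unknown events leave the state unchanged."""
        key = (self.state, event)
        if key in self.transitions:
            self.state = self.transitions[key]
        return self.state

# Hypothetical valve controller carried unchanged from analysis into design
valve = SystemsAnalysisObject(
    "valve", "closed",
    {("closed", "open_cmd"): "opening",
     ("opening", "limit_switch"): "open",
     ("open", "close_cmd"): "closed"})
valve.handle("open_cmd")
print(valve.handle("limit_switch"))  # → open
```

The point of the methodology is that this same object survives from the analysis model through high-level design to the implementation class, with no awkward re-structuring at the analysis/design boundary.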
A design methodology for nonlinear systems containing parameter uncertainty
NASA Technical Reports Server (NTRS)
Young, G. E.; Auslander, D. M.
1983-01-01
In the present design methodology for nonlinear systems containing parameter uncertainty, a generalized sensitivity analysis is incorporated which employs parameter space sampling and statistical inference. For the case of a system with j adjustable and k nonadjustable parameters, this methodology (which includes an adaptive random search strategy) is used to determine the combination of j adjustable parameter values which maximize the probability of those performance indices which simultaneously satisfy design criteria in spite of the uncertainty due to k nonadjustable parameters.
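The combination of parameter-space sampling and adaptive random search can be sketched as two nested loops: an inner Monte Carlo estimate of P(criteria satisfied) over the k nonadjustable parameters, and an outer shrinking-neighborhood search over the j adjustable ones. The control-gain model, parameter ranges, and acceptance band below are all invented for illustration.

```python
import random

def performance_ok(adjustable, nonadjustable):
    """Hypothetical design criterion: a closed-loop response within tolerance."""
    kp, ki = adjustable            # j = 2 adjustable parameters
    drift, load = nonadjustable    # k = 2 nonadjustable parameters
    response = kp * (1.0 - drift) + ki / (1.0 + load)
    return 0.9 <= response <= 1.1

def success_probability(adjustable, rng, n=500):
    """Estimate P(criterion met) by sampling the nonadjustable-parameter space."""
    hits = sum(performance_ok(adjustable,
                              (rng.uniform(0.0, 0.2), rng.uniform(0.0, 0.5)))
               for _ in range(n))
    return hits / n

def adaptive_random_search(rng, iters=200):
    """Shrinking-neighborhood random search for the most robust adjustable settings."""
    best, best_p, radius = (0.5, 0.5), -1.0, 0.5
    for _ in range(iters):
        cand = tuple(max(0.0, b + rng.uniform(-radius, radius)) for b in best)
        p = success_probability(cand, rng)
        if p > best_p:
            best, best_p = cand, p
            radius = 0.5                        # widen again after an improvement
        else:
            radius = max(0.05, radius * 0.99)   # slowly narrow the neighborhood
    return best, best_p

rng = random.Random(42)
settings, prob = adaptive_random_search(rng)
print(round(prob, 2))
```

The search maximizes an estimated probability rather than a deterministic performance index, which is exactly the robustness-under-uncertainty framing of the abstract.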
Optimized planning methodologies of ASON implementation
NASA Astrophysics Data System (ADS)
Zhou, Michael M.; Tamil, Lakshman S.
2005-02-01
Advanced network planning concerns effective network-resource allocation for a dynamic and open business environment. Planning methodologies for ASON implementation based on qualitative analysis and mathematical modeling are presented in this paper. The methodology includes methods for rationalizing technology and architecture, building network and nodal models, and developing dynamic programming for multi-period deployment. The multi-layered nodal architecture proposed here can accommodate various nodal configurations for a multi-plane optical network, and the network model presented here computes the network elements required to optimize resource allocation.
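The multi-period deployment step can be illustrated with a toy capacity-expansion dynamic program: in each period, decide whether to build ahead of demand (one big install) or incrementally (several installs). The fixed/per-unit cost model is an invented stand-in for real ASON equipment economics.

```python
from functools import lru_cache

def capacity_plan(demands, fixed_cost=10.0, unit_cost=1.0):
    """Minimum-cost multi-period capacity expansion by dynamic programming.
    In each period, capacity may be raised to any demand level seen in the plan,
    paying a fixed install cost plus a per-unit cost (hypothetical cost model)."""
    levels = sorted(set(demands))

    @lru_cache(maxsize=None)
    def best(period, capacity):
        if period == len(demands):
            return 0.0
        options = []
        for target in [c for c in levels if c >= max(capacity, demands[period])]:
            added = target - capacity
            cost = (fixed_cost + unit_cost * added) if added > 0 else 0.0
            options.append(cost + best(period + 1, target))
        return min(options)

    return best(0, 0)

# Demands over four periods: a single build-ahead install (cost 10 + 10 units)
# beats paying the fixed cost three separate times.
print(capacity_plan([4, 6, 6, 10]))  # → 20.0
```

The state is simply (period, installed capacity), which is the standard shape of multi-period deployment dynamic programs.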
A method for the design of transonic flexible wings
NASA Technical Reports Server (NTRS)
Smith, Leigh Ann; Campbell, Richard L.
1990-01-01
Methodology was developed for designing airfoils and wings at transonic speeds which includes a technique that can account for static aeroelastic deflections. This procedure is capable of designing either supercritical or more conventional airfoil sections. Methods for including viscous effects are also illustrated and are shown to give accurate results. The methodology developed is an interactive system containing three major parts. A design module was developed which modifies airfoil sections to achieve a desired pressure distribution. This design module works in conjunction with an aerodynamic analysis module, which for this study is a small perturbation transonic flow code. Additionally, an aeroelastic module is included which determines the wing deformation due to the calculated aerodynamic loads. Because of the modular nature of the method, it can be easily coupled with any aerodynamic analysis code.
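The coupling between the design module and the analysis module amounts to a residual-driven fixed-point iteration: analyze the current shape, compare the pressure distribution to the target, and nudge the geometry. In the sketch below, a linear toy `analyze` function stands in for the transonic small-perturbation code, and the proportional update rule is an illustrative assumption, not the actual design algorithm.

```python
def design_iteration(shape, target_cp, analyze, relax=0.3, tol=1e-4, max_iter=200):
    """Iteratively modify a shape until the analysis module's pressure
    distribution matches the target (sketch of the design/analysis coupling)."""
    for _ in range(max_iter):
        cp = analyze(shape)
        residual = [t - c for t, c in zip(target_cp, cp)]
        if max(abs(r) for r in residual) < tol:
            break
        # push each surface point in proportion to the local pressure mismatch
        shape = [s + relax * r for s, r in zip(shape, residual)]
    return shape

# Toy analysis: pressure responds linearly to local surface height
analyze = lambda shape: [2.0 * s for s in shape]
target = [0.4, 0.8, 0.6, 0.2]
final = design_iteration([0.0] * 4, target, analyze)
print([round(2.0 * s, 3) for s in final])  # → [0.4, 0.8, 0.6, 0.2]
```

In the real system the aeroelastic module would additionally re-deflect the wing after each aerodynamic load update, adding a second loop around this one.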
ERIC Educational Resources Information Center
Raz, Aviad E.
2007-01-01
Purpose: The purpose of this paper is to describe and analyse the formation of CoPs (communities of practice) in three call centres of cellular communication operating companies in Israel. Design/methodology/approach: This study is based on a qualitative methodology including observations, interviews and textual analysis. Findings: In all three…
Theoretical Significance in Q Methodology: A Qualitative Approach to a Mixed Method
ERIC Educational Resources Information Center
Ramlo, Susan
2015-01-01
Q methodology (Q) has offered researchers a unique scientific measure of subjectivity since William Stephenson's first article in 1935. Q's focus on subjectivity includes self-referential meaning and interpretation. Q is most often identified with its technique (Q-sort) and its method (factor analysis to group people); yet, it consists of a…
Reference values for muscle strength: a systematic review with a descriptive meta-analysis.
Benfica, Poliana do Amaral; Aguiar, Larissa Tavares; Brito, Sherindan Ayessa Ferreira de; Bernardino, Luane Helena Nunes; Teixeira-Salmela, Luci Fuscaldi; Faria, Christina Danielli Coelho de Morais
2018-05-03
Muscle strength is an important component of health. The aims were to describe and evaluate the studies that have established reference values for muscle strength in healthy individuals and to synthesize these values with a descriptive meta-analysis approach. A systematic review was performed in the MEDLINE, LILACS, and SciELO databases. Studies that investigated reference values for the muscle strength of two or more appendicular/axial muscle groups in healthy individuals were included. Methodological quality, including risk of bias, was assessed with the QUADAS-2. Data extracted included: country of the study, sample size, population characteristics, equipment/method used, and muscle groups evaluated. Of the 414 studies identified, 46 were included, most with adequate methodological quality. The included studies evaluated: appendicular (80.4%) and axial (36.9%) muscles; adults (78.3%), elderly (58.7%), adolescents (43.5%), and children (23.9%); and isometric (91.3%) and isokinetic (17.4%) strength. Six studies (13%) with similar procedures were synthesized with meta-analysis. The coefficient of variation values resulting from the meta-analysis generally ranged from 20.1% to 30% and were similar to those reported by the original studies. The meta-analysis provided reference values for the isometric strength of 14 appendicular muscle groups of the dominant/non-dominant sides of the upper/lower limbs of men/women and adults/elderly from developed countries, measured with dynamometers/myometers. These data may be used to interpret evaluation results and establish appropriate treatment goals. Copyright © 2018 Associação Brasileira de Pesquisa e Pós-Graduação em Fisioterapia. Published by Elsevier Editora Ltda. All rights reserved.
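The coefficient of variation used above to summarize variability across studies is straightforward to compute; the grip-strength values below are invented for illustration.

```python
import statistics

def coefficient_of_variation(values):
    """CV (%) = 100 * SD / mean, used to compare strength variability across studies."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical grip-strength means (kg) reported by five studies for the same group
study_means = [32.5, 28.0, 35.1, 30.2, 33.8]
print(round(coefficient_of_variation(study_means), 1))  # → 8.9
```

A CV in the 20-30% range, as the meta-analysis reports for within-population strength, indicates much larger spread than this between-study example.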
Development of Probabilistic Life Prediction Methodologies and Testing Strategies for MEMS and CMC's
NASA Technical Reports Server (NTRS)
Jadaan, Osama
2003-01-01
This effort investigates probabilistic life prediction methodologies for ceramic matrix composites (CMCs) and MicroElectroMechanical Systems (MEMS), and analyzes designs that determine the stochastic properties of MEMS. For CMCs this includes a brief literature survey of lifing methodologies. Also of interest for MEMS is the design of a proper test for the Weibull size effect in thin-film (bulge test) specimens. The Weibull size effect is a consequence of the stochastic strength response predicted by the Weibull distribution; confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS, similar to the GRC-developed CARES/Life program for bulk ceramics. A main objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures/Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads. A second set of objectives is to determine the applicability/suitability of the CARES/Life methodology for CMC analysis, what changes would be needed to the methodology and software, and, if feasible, to run a demonstration problem. Also important is an evaluation of CARES/Life coupled to the ANSYS Probabilistic Design System (PDS) and the potential of coupling transient reliability analysis to the ANSYS PDS.
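The Weibull size effect that the bulge test is meant to confirm has a simple closed form: characteristic strength scales with the stressed area (or volume) to the power -1/m, so larger specimens are statistically weaker. The film properties below are hypothetical.

```python
def size_scaled_strength(strength_ref, area_ref, area_new, modulus):
    """Weibull size effect: characteristic strength scales as (A_ref/A_new)^(1/m)."""
    return strength_ref * (area_ref / area_new) ** (1.0 / modulus)

# Hypothetical polysilicon film: 1000 MPa characteristic strength on a
# 100 um^2 gauge area, Weibull modulus m = 10
m = 10.0
larger = size_scaled_strength(1000.0, 100.0, 1000.0, m)  # 10x the stressed area
print(round(larger, 1))  # → 794.3 (MPa: larger specimens are weaker)
```

Measuring strength at several specimen sizes and checking this scaling against the fitted modulus is exactly how such a test would confirm Weibull-controlled strength.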
Debrus, Benjamin; Lebrun, Pierre; Ceccato, Attilio; Caliaro, Gabriel; Rozet, Eric; Nistor, Iolanda; Oprean, Radu; Rupérez, Francisco J; Barbas, Coral; Boulanger, Bruno; Hubert, Philippe
2011-04-08
HPLC separations of an unknown sample mixture and a pharmaceutical formulation have been optimized using a recently developed chemometric methodology proposed by W. Dewé et al. in 2004 and improved by P. Lebrun et al. in 2008. This methodology is based on experimental designs which are used to model retention times of compounds of interest. Then, the prediction accuracy and the optimal separation robustness, including the uncertainty study, were evaluated. Finally, the design space (ICH Q8(R1) guideline) was computed as the probability for a criterion to lie in a selected range of acceptance. Furthermore, the chromatograms were automatically read. Peak detection and peak matching were carried out with a previously developed methodology using independent component analysis published by B. Debrus et al. in 2009. The present successful applications underscore the high potential of these methodologies for the automated development of chromatographic methods. Copyright © 2011 Elsevier B.V. All rights reserved.
A methodology for creating greenways through multidisciplinary sustainable landscape planning.
Pena, Selma Beatriz; Abreu, Maria Manuela; Teles, Rui; Espírito-Santo, Maria Dalila
2010-01-01
This research proposes a methodology for defining greenways via sustainable planning. This approach includes the analysis and discussion of culture and natural processes that occur in the landscape. The proposed methodology is structured in three phases: eco-cultural analysis; synthesis and diagnosis; and proposal. An interdisciplinary approach provides an assessment of the relationships between landscape structure and landscape dynamics, which are essential to any landscape management or land use. The landscape eco-cultural analysis provides a biophysical, dynamic (geomorphologic rate), vegetation (habitats from directive 92/43/EEC) and cultural characterisation. The knowledge obtained by this analysis then supports the definition of priority actions to stabilise the landscape and the management measures for the habitats. After the analysis and diagnosis phases, a proposal for the development of sustainable greenways can be achieved. This methodology was applied to a study area of the Azambuja Municipality in the Lisbon Metropolitan Area (Portugal). The application of the proposed methodology to the study area shows that landscape stability is crucial for greenway users in order to appreciate the landscape and its natural and cultural elements in a sustainable and healthy way, both by cycling or by foot. A balanced landscape will increase the value of greenways and in return, they can develop socio-economic activities with benefits for rural communities. Copyright 2009 Elsevier Ltd. All rights reserved.
RAT SPERM MOTILITY ANALYSIS: METHODOLOGICAL CONSIDERATIONS
The objective of these studies was to optimize conditions for computer assisted sperm analysis (CASA) of rat epididymal spermatozoa. Methodological issues addressed include sample collection technique, sampling region within the epididymis, type of diluent medium used, and sample ...
Zhu, Xiaoyan; Zhou, Xiaobin; Zhang, Yuan; Sun, Xiao; Liu, Haihua; Zhang, Yingying
2017-01-01
Abstract Survival analysis methods have gained widespread use in the field of oncology. For achievement of reliable results, the methodological process and reporting quality are crucial. This review provides the first examination of the methodological characteristics and reporting quality of survival analysis in articles published in leading Chinese oncology journals. Our aims were to examine the methodological and reporting quality of survival analysis, to identify common deficiencies, to suggest desirable precautions in the analysis, and to offer related advice for authors, readers, and editors. A total of 242 survival analysis articles were evaluated from the 1492 articles published in 4 leading Chinese oncology journals in 2013. Articles were evaluated according to 16 established items for proper use and reporting of survival analysis. The application rates of Kaplan–Meier, life table, log-rank test, Breslow test, and Cox proportional hazards model (Cox model) were 91.74%, 3.72%, 78.51%, 0.41%, and 46.28%, respectively; no article used a parametric method for survival analysis. A multivariate Cox model was conducted in 112 articles (46.28%). Follow-up rates were mentioned in 155 articles (64.05%), of which 4 articles were under 80% (the lowest was 75.25%) and 55 articles were 100%. The report rates of all types of survival endpoint were lower than 10%. Eleven of the 100 articles that reported a loss to follow-up stated how it was treated in the analysis. One hundred thirty articles (53.72%) did not perform multivariate analysis. One hundred thirty-nine articles (57.44%) did not define the survival time. Violations and omissions of methodological guidelines included no mention of pertinent checks for the proportional hazards assumption; no report of testing for interactions and collinearity between independent variables; and no report of the sample size calculation method. Thirty-six articles (32.74%) reported the methods of independent variable selection.
The above defects could make the reported results potentially inaccurate, misleading, or difficult to interpret. There are gaps in the conduct and reporting of survival analysis in studies published in Chinese oncology journals, and severe deficiencies were noted. More endorsement by journals of reporting guidelines for survival analysis may improve article quality and the dissemination of reliable evidence to oncology clinicians. We recommend that authors, readers, reviewers, and editors consider survival analysis more carefully and cooperate more closely with statisticians and epidemiologists. PMID:29390340
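Several of the items audited above (Kaplan–Meier curves, handling of losses to follow-up as censored observations) can be made concrete with a from-scratch Kaplan–Meier product-limit estimate. This is a generic sketch with invented follow-up data, not the reviewed articles' code:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.

    times:  follow-up time for each subject
    events: 1 if the event (e.g. death) occurred, 0 if censored
    Returns a list of (time, S(t)) pairs at each distinct event time.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = censored = 0
        # Group all subjects tied at time t; they are all still at risk.
        while i < len(data) and data[i][0] == t:
            if data[i][1]:
                deaths += 1
            else:
                censored += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= deaths + censored
    return curve

# Six subjects: times in months, 0 marks a loss to follow-up (censored).
curve = kaplan_meier([2, 3, 3, 5, 8, 8], [1, 1, 0, 1, 0, 1])
```

Dropping censored subjects instead of carrying them in the risk set, one of the errors the review flags, would visibly distort this curve.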
A systematic review of grounded theory studies in physiotherapy.
Ali, Nancy; May, Stephen; Grafton, Kate
2018-05-23
This systematic review aimed at appraising the methodological rigor of grounded theory research published in the field of physiotherapy to assess how the methodology is understood and applied. A secondary aim was to provide research implications drawn from the findings to guide future grounded theory methodology (GTM) research. A systematic search was conducted in MEDLINE, CINHAL, SPORT Discus, Science Direct, PubMed, Scopus, and Web of Science to identify studies in the field of physiotherapy that reported using GTM and/or methods in the study title and/or abstract. The descriptive characteristics and methodological quality of eligible studies were examined using grounded theory methodology assessment guidelines. The review included 68 studies conducted between 1998 and 2017. The findings showed that GTM is becoming increasingly used by physiotherapy researchers. Thirty-six studies (53%) demonstrated a good understanding and appropriate application of GTM. Thirty-two studies (47%) presented descriptive findings and were considered to be of poor methodological quality. There are several key tenets of GTM that are integral to the iterative process of qualitative theorizing and need to be applied throughout all research practices including sampling, data collection, and analysis.
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
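The core PFA move, propagating parameter uncertainty through an analytical failure model to obtain a failure probability, can be sketched as a generic Monte Carlo loop. The lognormal fatigue-life model and its numbers below are illustrative assumptions, not the PFA software or its data:

```python
import math
import random

def failure_probability(service_life, n_samples=200_000, seed=1):
    """Estimate P(fatigue life < service_life) when the analytical model
    predicts a median life of 10,000 cycles with lognormal scatter
    (sigma of ln-life = 0.5) -- illustrative numbers only."""
    rng = random.Random(seed)
    median_ln, sigma = math.log(10_000.0), 0.5
    failures = sum(
        1 for _ in range(n_samples)
        if rng.lognormvariate(median_ln, sigma) < service_life
    )
    return failures / n_samples

# Probability of fatigue failure before 5,000 cycles of service.
p = failure_probability(service_life=5_000.0)
```

In the actual methodology the sampled distribution would be updated with test and flight experience rather than fixed a priori; this sketch shows only the forward uncertainty-propagation step.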
Methodological quality and reporting of systematic reviews in hand and wrist pathology.
Wasiak, J; Shen, A Y; Ware, R; O'Donohoe, T J; Faggion, C M
2017-10-01
The objective of this study was to assess methodological and reporting quality of systematic reviews in hand and wrist pathology. MEDLINE, EMBASE and Cochrane Library were searched from inception to November 2016 for relevant studies. Reporting quality was evaluated using Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) and methodological quality using a measurement tool to assess systematic reviews, the Assessment of Multiple Systematic Reviews (AMSTAR). Descriptive statistics and linear regression were used to identify features associated with improved methodological quality. A total of 91 studies were included in the analysis. Most reviews inadequately reported PRISMA items regarding study protocol, search strategy and bias and AMSTAR items regarding protocol, publication bias and funding. Systematic reviews published in a plastics journal, or which included more authors, were associated with higher AMSTAR scores. A large proportion of systematic reviews within hand and wrist pathology literature score poorly with validated methodological assessment tools, which may affect the reliability of their conclusions.
[Why evidence-based medicine? 20 years of meta-analysis].
Ceballos, C; Valdizán, J R; Artal, A; Almárcegui, C; Allepuz, C; García Campayo, J; Fernández Liesa, R; Giraldo, P; Puértolas, T
2000-10-01
Meta-analysis, described within evidence-based medicine, has become a frequent issue in recent medical literature. An exhaustive search of reported meta-analyses from all medical specialties is described: a search of papers included in Medline or Embase between 1973 and 1998. A study of intra- and inter-reviewer reliability in selection and classification was performed. A descriptive analysis of the reported papers (frequency tables and graphics) is described, including differences in the mean number of reported meta-analysis papers by medical specialty and year. 1,518 papers were selected and classified. The most frequent categories (45.91% in total) were: methodology (15.7%), psychiatry (11.79%), cardiology (10.01%) and oncology (8.36%). Inter-personal agreement was 0.93 in selecting papers and 0.72 in classifying them. Between 1977 and 1987 the overall mean number of reported meta-analysis studies (1.67 ± 4.10) was significantly lower than in 1988-1998 (49.54 ± 56.55) (p < 0.001). The global number of meta-analyses was positively correlated (p < 0.05) with the number of studies about fundamentals and methodology during the study period. The method used to identify meta-analysis reports can be considered adequate; however, the agreement in classifying them by medical specialty was lower. A progressive increase in the number of reported meta-analyses since 1977 can be demonstrated. The specialties with the greatest number of meta-analyses published in the literature were psychiatry, oncology and cardiology. Diffusion of knowledge about the fundamentals and methodology of meta-analysis seems to have driven an increase in performing and reporting this kind of analysis.
Satellite services system analysis study. Volume 2: Satellite and services user model
NASA Technical Reports Server (NTRS)
1981-01-01
Satellite services needs are analyzed. Topics include methodology: a satellite user model; representative servicing scenarios; potential service needs; manned, remote, and automated involvement; and inactive satellites/debris. Satellite and services user model development is considered. Groundrules and assumptions, servicing, events, and sensitivity analysis are included. Selection of reference satellites is also discussed.
KSC management training system project
NASA Technical Reports Server (NTRS)
Sepulveda, Jose A.
1993-01-01
The stated objectives for the summer of 1993 were: to review the Individual Development Plan Surveys for 1994 in order to automate the analysis of the Needs Assessment effort; and to develop and implement evaluation methodologies to perform ongoing program-wide course-to-course assessment. This includes the following: to propose a methodology to develop and implement objective, performance-based assessment instruments for each training effort; to mechanize course evaluation forms and develop software to facilitate the data gathering, analysis, and reporting processes; and to implement the methodology, forms, and software in at least one training course or seminar selected among those normally offered in the summer at KSC. Section two of this report addresses the work done in regard to the Individual Development Plan Surveys for 1994. Section three presents the methodology proposed to develop and implement objective, performance-based assessment instruments for each training course offered at KSC.
NASA Technical Reports Server (NTRS)
1974-01-01
A methodology for the display and analysis of postulated energy futures for the United States is presented. A systems approach methodology including the methodology of technology assessment is used to examine three energy scenarios--the Westinghouse Nuclear Electric Economy, the Ford Technical Fix Base Case and a MEGASTAR-generated Alternate to the Ford Technical Fix Base Case. The three scenarios represent different paths of energy consumption from the present to the year 2000. Associated with these paths are various mixes of fuels, conversion, distribution, conservation and end-use technologies. MEGASTAR presents the estimated times and unit requirements to supply the fuels, conversion and distribution systems for the postulated end uses for the three scenarios and then estimates the aggregate manpower, materials, and capital requirements needed to develop the energy system described by the particular scenario.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew
GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of PRA methodologies to conduct a mechanistic source term (MST) analysis for event sequences that could result in the release of radionuclides. The MST analysis seeks to realistically model and assess the transport, retention, and release of radionuclides from the reactor to the environment. The MST methods developed during this project seek to satisfy the requirements of the Mechanistic Source Term element of the ASME/ANS Non-LWR PRA standard. The MST methodology consists of separate analysis approaches for risk-significant and non-risk-significant event sequences that may result in the release of radionuclides from the reactor. For risk-significant event sequences, the methodology focuses on a detailed assessment, using mechanistic models, of radionuclide release from the fuel, transport through and release from the primary system, transport in the containment, and finally release to the environment. The analysis approach for non-risk-significant event sequences examines the possibility of large radionuclide releases due to events such as re-criticality or the complete loss of radionuclide barriers. This paper provides details on the MST methodology, including the interface between the MST analysis and other elements of the PRA, and provides a simplified example MST calculation for a sodium fast reactor.
Establishing equivalence: methodological progress in group-matching design and analysis.
Kover, Sara T; Atwood, Amy K
2013-01-01
This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs used in behavioral research on cognition and language in neurodevelopmental disorders, including autism spectrum disorder, Fragile X syndrome, Down syndrome, and Williams syndrome. The limitations of relying on p values to establish group equivalence are discussed in the context of other existing methods: equivalence tests, propensity scores, and regression-based analyses. Our primary recommendation for advancing research on intellectual and developmental disabilities is the use of descriptive indices of adequate group matching: effect sizes (i.e., standardized mean differences) and variance ratios.
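The matching indices the review recommends are straightforward to compute. A minimal Python sketch follows; the group values are invented for illustration, not data from the review:

```python
import statistics

def standardized_mean_difference(group_a, group_b):
    """Cohen's d using the pooled (sample) standard deviation."""
    na, nb = len(group_a), len(group_b)
    ma, mb = statistics.mean(group_a), statistics.mean(group_b)
    va, vb = statistics.variance(group_a), statistics.variance(group_b)
    pooled_sd = (((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)) ** 0.5
    return (ma - mb) / pooled_sd

def variance_ratio(group_a, group_b):
    """Ratio of sample variances; values near 1 indicate similar spread."""
    return statistics.variance(group_a) / statistics.variance(group_b)

# Illustrative matching variable (e.g. nonverbal mental age in months).
asd_group = [48, 52, 50, 47, 53, 49]
td_group  = [50, 51, 49, 52, 48, 50]
d  = standardized_mean_difference(asd_group, td_group)
vr = variance_ratio(asd_group, td_group)
```

Reporting d and vr directly, rather than a nonsignificant p value, is the descriptive practice the review argues for: it shows how closely matched the groups actually are.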
A method for the analysis of the benefits and costs for aeronautical research and technology
NASA Technical Reports Server (NTRS)
Williams, L. J.; Hoy, H. H.; Anderson, J. L.
1978-01-01
A relatively simple, consistent, and reasonable methodology for performing cost-benefit analyses which can be used to guide, justify, and explain investments in aeronautical research and technology is presented. The elements of this methodology (labeled ABC-ART for the Analysis of the Benefits and Costs of Aeronautical Research and Technology) include estimation of aircraft markets; manufacturer costs and return on investment versus aircraft price; airline costs and return on investment versus aircraft price and passenger yield; and potential system benefits--fuel savings, cost savings, and noise reduction. The application of this methodology is explained using the introduction of an advanced turboprop powered transport aircraft in the medium range market in 1978 as an example.
Recent advances in CE-MS coupling: Instrumentation, methodology, and applications.
Týčová, Anna; Ledvina, Vojtěch; Klepárník, Karel
2017-01-01
This review focuses on the latest development of microseparation electromigration methods in capillaries and microfluidic devices coupled with MS for detection and identification of important analytes. It is a continuation of the review article on the same topic by Kleparnik (Electrophoresis 2015, 36, 159-178). A wide selection of 161 relevant articles covers the literature published from June 2014 till May 2016. New improvements in the instrumentation and methodology of MS interfaced with capillary or microfluidic versions of zone electrophoresis, isotachophoresis, and isoelectric focusing are described in detail. The most frequently implemented MS ionization methods include electrospray ionization, matrix-assisted laser desorption/ionization and inductively coupled plasma ionization. Although the main attention is paid to the development of instrumentation and methodology, representative examples also illustrate applications in proteomics, glycomics, metabolomics, biomarker research, forensics, pharmacology, food analysis, and single-cell analysis. The combinations of MS with capillary versions of electrochromatography and micellar electrokinetic chromatography are not included. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Coupled structural/thermal/electromagnetic analysis/tailoring of graded composite structures
NASA Technical Reports Server (NTRS)
Mcknight, R. L.; Huang, H.; Hartle, M.
1992-01-01
Accomplishments are described for the third year's effort of a 5-year program to develop a methodology for coupled structural/thermal/electromagnetic analysis/tailoring of graded composite structures. These accomplishments include: (1) structural analysis capability specialized for graded composite structures including large deformation and deformation position eigenanalysis technologies; (2) a thermal analyzer specialized for graded composite structures; (3) absorption of electromagnetic waves by graded composite structures; and (4) coupled structural/thermal/electromagnetic analysis of graded composite structures.
LIFE CYCLE DESIGN OF AIR INTAKE MANIFOLDS; PHASE I: 2.0 L FORD CONTOUR AIR INTAKE MANIFOLD
The project team applied the life cycle design methodology to the design analysis of three alternative air intake manifolds: a sand cast aluminum, brazed aluminum tubular, and nylon composite. The design analysis included a life cycle inventory analysis, environmental regulatory...
Lonjon, Guillaume; Porcher, Raphael; Ergina, Patrick; Fouet, Mathilde; Boutron, Isabelle
2017-05-01
To describe the evolution of the use and reporting of propensity score (PS) analysis in observational studies assessing a surgical procedure. Assessing surgery in randomized controlled trials raises several challenges. Observational studies with PS analysis are a robust alternative for comparative effectiveness research. In this methodological systematic review, we identified all PubMed reports of observational studies with PS analysis that evaluated a surgical procedure and described the evolution of their use over time. Then, we selected a sample of articles published from August 2013 to July 2014 and systematically appraised the quality of reporting and potential bias of the PS analysis used. We selected 652 reports of observational studies with PS analysis. The publications increased over time, from 1 report in 1987 to 198 in 2013. Among the 129 reports assessed, 20% (n = 24) did not detail the covariates included in the PS and 77% (n = 100) did not report a justification for including these covariates in the PS. The rate of missing data for potential covariates was reported in 9% of articles. When a crossover by conversion was possible, only 14% of reports (n = 12) mentioned this issue. For matched analysis, 10% of articles reported all 4 key elements that allow for reproducibility of a PS-matched analysis (matching ratio, method to choose the nearest neighbors, replacement and method for statistical analysis). Observational studies with PS analysis in surgery are increasing in frequency, but specific methodological issues and weaknesses in reporting exist.
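The "key elements" the review says a reproducible PS-matched analysis must report (matching ratio, nearest-neighbour rule, replacement) can be made concrete with a greedy 1:1 nearest-neighbour matcher over precomputed propensity scores. This is a generic sketch with invented scores, not the review's software:

```python
def match_one_to_one(treated, control, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching without replacement.

    treated/control: dicts of {subject_id: propensity_score}.
    Returns [(treated_id, control_id)] pairs within the caliper.
    """
    available = dict(control)
    pairs = []
    # Match treated subjects in descending score order (a common choice).
    for t_id, t_score in sorted(treated.items(), key=lambda kv: -kv[1]):
        if not available:
            break
        c_id = min(available, key=lambda c: abs(available[c] - t_score))
        if abs(available[c_id] - t_score) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]  # without replacement
    return pairs

treated = {"t1": 0.62, "t2": 0.35, "t3": 0.80}
control = {"c1": 0.60, "c2": 0.33, "c3": 0.90, "c4": 0.78}
pairs = match_one_to_one(treated, control)
```

Each choice here (1:1 ratio, greedy nearest neighbour, no replacement, 0.05 caliper) is exactly the kind of detail the review found missing in most published PS analyses.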
Haegele, Justin A; Hodge, Samuel Russell
2015-10-01
There are basic philosophical and paradigmatic assumptions that guide scholarly research endeavors, including the methods used and the types of questions asked. Through this article, kinesiology faculty and students with interests in adapted physical activity are encouraged to understand the basic assumptions of applied behavior analysis (ABA) methodology for conducting, analyzing, and presenting research of high quality in this paradigm. The purposes of this viewpoint paper are to present information fundamental to understanding the assumptions undergirding research methodology in ABA, describe key aspects of single-subject research designs, and discuss common research designs and data-analysis strategies used in single-subject studies.
Application of the Hardman methodology to the Single Channel Ground-Airborne Radio System (SINCGARS)
NASA Technical Reports Server (NTRS)
1984-01-01
The HARDMAN methodology was applied to the various configurations of employment for an emerging Army multipurpose communications system. The methodology was used to analyze the manpower, personnel and training (MPT) requirements, and associated costs, of the system concepts responsive to the Army's requirement for the Single Channel Ground-Airborne Radio System (SINCGARS). The scope of the application includes the analysis of two conceptual designs (Cincinnati Electronics and ITT Aerospace/Optical Division) for operation and maintenance support addressed through the general support maintenance echelon.
NASA Technical Reports Server (NTRS)
Nakajima, Yukio; Padovan, Joe
1987-01-01
In a three-part series of papers, a generalized finite element methodology is formulated to handle traveling load problems involving large deformation fields in structures composed of viscoelastic media. The main thrust of this paper is to develop an overall finite element methodology and associated solution algorithms to handle the transient aspects of moving-load problems involving contact-impact-type loading fields. Based on the methodology and algorithms formulated, several numerical experiments are considered. These include the rolling/sliding impact of tires with road obstructions.
NASA Technical Reports Server (NTRS)
1982-01-01
An effective data collection methodology for evaluating software development methodologies was applied to four different software development projects. Goals of the data collection included characterizing changes and errors, characterizing projects and programmers, identifying effective error detection and correction techniques, and investigating ripple effects. The data collected consisted of changes (including error corrections) made to the software after code was written and baselined, but before testing began. Data collection and validation were concurrent with software development. Changes reported were verified by interviews with programmers.
Muskett, Tom; Body, Richard
2013-01-01
Conversation analysis (CA) continues to accrue interest within clinical linguistics as a methodology that can enable elucidation of structural and sequential orderliness in interactions involving participants who produce ostensibly disordered communication behaviours. However, it can be challenging to apply CA to re-examine clinical phenomena that have initially been defined in terms of linguistics, as a logical starting point for analysis may be to focus primarily on the organisation of language ("talk") in such interactions. In this article, we argue that CA's methodological power can only be fully exploited in this research context when a multimodal analytic orientation is adopted, where due consideration is given to participants' co-ordinated use of multiple semiotic resources including, but not limited to, talk (e.g., gaze, embodied action, object use and so forth). To evidence this argument, a two-layered analysis of unusual question-answer sequences in a play episode involving a child with autism is presented. It is thereby demonstrated that only when the scope of enquiry is broadened to include gaze and other embodied action can an account be generated of orderliness within these sequences. This finding has important implications for CA's application as a research methodology within clinical linguistics.
USDA-ARS?s Scientific Manuscript database
The development of genomic selection methodology, with accompanying substantial gains in reliability for low-heritability traits, may dramatically improve the feasibility of genetic improvement of dairy cow health. Many methods for genomic analysis have now been developed, including the “Bayesian Al...
The HIV Cure Research Agenda: The Role of Mathematical Modelling and Cost-Effectiveness Analysis.
Freedberg, Kenneth A; Possas, Cristina; Deeks, Steven; Ross, Anna Laura; Rosettie, Katherine L; Di Mascio, Michele; Collins, Chris; Walensky, Rochelle P; Yazdanpanah, Yazdan
The research agenda towards an HIV cure is building rapidly. In this article, we discuss the reasons for and methodological approach to using mathematical modeling and cost-effectiveness analysis in this agenda. We provide a brief description of the proof of concept for cure and the current directions of cure research. We then review the types of clinical economic evaluations, including cost analysis, cost-benefit analysis, and cost-effectiveness analysis. We describe the use of mathematical modeling and cost-effectiveness analysis early in the HIV epidemic as well as in the era of combination antiretroviral therapy. We then highlight the novel methodology of Value of Information analysis and its potential role in the planning of clinical trials. We close with recommendations for modeling and cost-effectiveness analysis in the HIV cure agenda.
Ge, Long; Tian, Jin-hui; Li, Xiu-xia; Song, Fujian; Li, Lun; Zhang, Jun; Li, Ge; Pei, Gai-qin; Qiu, Xia; Yang, Ke-hu
2016-01-01
Because of the methodological complexity of network meta-analyses (NMAs), NMAs may be more vulnerable to methodological risks than conventional pair-wise meta-analyses. Our study aims to investigate the epidemiological characteristics, conduct of the literature search, methodological quality and reporting of the statistical analysis process in the field of cancer, based on the PRISMA extension statement and a modified AMSTAR checklist. We identified and included 102 NMAs in the field of cancer. 61 NMAs were conducted using a Bayesian framework. Of them, more than half did not report assessment of convergence (60.66%). Inconsistency was assessed in 27.87% of NMAs. Assessment of heterogeneity in traditional meta-analyses was more common (42.62%) than in NMAs (6.56%). Most NMAs did not report assessment of similarity (86.89%) and did not use the GRADE tool to assess quality of evidence (95.08%). 43 NMAs were adjusted indirect comparisons; the methods used were described in 53.49% of these NMAs. Only 4.65% of NMAs described the details of handling multi-group trials and 6.98% described the methods of similarity assessment. The median total AMSTAR score was 8.00 (IQR: 6.00–8.25). Methodological quality and reporting of statistical analysis did not substantially differ by selected general characteristics. Overall, the quality of NMAs in the field of cancer was generally acceptable. PMID:27848997
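One of the under-reported items above, heterogeneity assessment, reduces under an inverse-variance fixed-effect model to Cochran's Q and the I² statistic. A generic sketch with invented effect estimates, not data from the surveyed NMAs:

```python
def cochran_q_and_i2(effects, variances):
    """Cochran's Q and I² (%) for study effect estimates with
    inverse-variance weights."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
    df = len(effects) - 1
    # I² is the share of total variability beyond chance, floored at 0.
    i2 = max(0.0, 100.0 * (q - df) / q) if q > 0 else 0.0
    return q, i2

# Illustrative log hazard ratios and their variances from four trials.
effects   = [-0.50, -0.05, -0.60, 0.20]
variances = [0.02, 0.03, 0.025, 0.04]
q, i2 = cochran_q_and_i2(effects, variances)
```

A high I² such as this one signals substantial between-trial heterogeneity, which is precisely what an NMA should examine and report before pooling direct and indirect evidence.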
Integrated Aero-Propulsion CFD Methodology for the Hyper-X Flight Experiment
NASA Technical Reports Server (NTRS)
Cockrell, Charles E., Jr.; Engelund, Walter C.; Bittner, Robert D.; Dilley, Arthur D.; Jentink, Tom N.; Frendi, Abdelkader
2000-01-01
Computational fluid dynamics (CFD) tools have been used extensively in the analysis and development of the X-43A Hyper-X Research Vehicle (HXRV). A significant element of this analysis is the prediction of integrated vehicle aero-propulsive performance, which includes an integration of aerodynamic and propulsion flow fields. This paper describes analysis tools used and the methodology for obtaining pre-flight predictions of longitudinal performance increments. The use of higher-fidelity methods to examine flow-field characteristics and scramjet flowpath component performance is also discussed. Limited comparisons with available ground test data are shown to illustrate the approach used to calibrate methods and assess solution accuracy. Inviscid calculations to evaluate lateral-directional stability characteristics are discussed. The methodology behind 3D tip-to-tail calculations is described and the impact of 3D exhaust plume expansion in the afterbody region is illustrated. Finally, future technology development needs in the area of hypersonic propulsion-airframe integration analysis are discussed.
GIS-Based Accessibility Analysis of Urban Emergency Shelters: The Case of Adana City
NASA Astrophysics Data System (ADS)
Unal, M.; Uslu, C.
2016-10-01
Accessibility analysis of urban emergency shelters can support urban disaster prevention planning. Pre-disaster emergency evacuation zoning has become a significant topic in disaster prevention and mitigation research. In this study, we assessed the serviceability of urban emergency shelters in terms of maximum capacity, usability, sufficiency and a walking-time limit by employing the spatial analysis techniques of the GIS Network Analyst. The methodology included the following aspects: distribution analysis of emergency evacuation demand, calculation of shelter space accessibility, and optimization of evacuation destinations. The methodology was applied to Adana, a city in Turkey located within the Alpine-Himalayan orogenic system, the second major earthquake belt after the Pacific Belt. The proposed methodology was found to be useful for understanding the spatial distribution of urban emergency shelters more accurately and for establishing effective future urban disaster prevention planning. Additionally, this research provides a feasible way to support emergency management in terms of shelter construction, pre-disaster evacuation drills and rescue operations.
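The network-based accessibility computation described above can be sketched as a shortest-path search over a walking network. A minimal illustration follows; the street graph, node names, and the 10-minute budget are hypothetical, not taken from the Adana study, and a real GIS Network Analyst run would operate on the full street dataset:

```python
import heapq

def reachable_within(graph, source, budget_min):
    """Dijkstra from a shelter node; return nodes reachable within the budget."""
    dist = {source: 0.0}
    pq = [(0.0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return {n for n, d in dist.items() if d <= budget_min}

# Hypothetical street network: edge weights are walking minutes.
streets = {
    "shelter": [("a", 3), ("b", 6)],
    "a": [("c", 4)],
    "b": [("c", 2), ("d", 9)],
    "c": [("d", 5)],
}
print(reachable_within(streets, "shelter", 10))  # node "d" (12 min) is excluded
```

Running the search from every shelter and overlaying the reachable sets is one simple way to expose coverage gaps in the evacuation demand distribution.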
Wu, Xin Yin; Lam, Victor C K; Yu, Yue Feng; Ho, Robin S T; Feng, Ye; Wong, Charlene H L; Yip, Benjamin H K; Tsoi, Kelvin K F; Wong, Samuel Y S; Chung, Vincent C H
2016-11-01
Well-conducted meta-analyses (MAs) are considered one of the best sources of clinical evidence for treatment decisions. MAs with methodological flaws may introduce bias and mislead evidence users. The aim of this study is to investigate the characteristics and methodological quality of MAs on diabetes mellitus (DM) treatments. Systematic review. The Cochrane Database of Systematic Reviews and the Database of Abstracts of Reviews of Effects were searched for relevant MAs. The Assessing the Methodological Quality of Systematic Reviews (AMSTAR) tool was used to evaluate the methodological quality of the included MAs. Logistic regression analysis was used to identify associations between MA characteristics and AMSTAR results. A total of 252 MAs, covering 4999 primary studies and 13,577,025 patients, were included. Over half of the MAs (65.1%) included only type 2 DM patients, and 160 MAs (63.5%) focused on pharmacological treatments. About 89.7% of MAs performed a comprehensive literature search and 89.3% provided the characteristics of included studies. The included MAs generally performed poorly on the remaining AMSTAR items, especially assessing publication bias (39.3%), providing lists of studies (19.0%) and declaring sources of support comprehensively (7.5%). Only 62.7% of MAs mentioned harms of interventions. MAs with a corresponding author from Asia performed less well in providing an MA protocol than those from Europe. The methodological quality of MAs on DM treatments was unsatisfactory. There is considerable room for improvement, especially in assessing publication bias, providing lists of studies, and declaring sources of support comprehensively. There is also an urgent need for MA authors to report treatment harms comprehensively. © 2016 European Society of Endocrinology.
Automatic Inference of Cryptographic Key Length Based on Analysis of Proof Tightness
2016-06-01
within an attack tree structure, then expand attack tree methodology to include cryptographic reductions. We then provide the algorithms for...maintaining and automatically reasoning about these expanded attack trees. We provide a software tool that utilizes machine-readable proof and attack metadata...and the attack tree methodology to provide rapid and precise answers regarding security parameters and effective security. This eliminates the need
Evaluative methodology for comprehensive water quality management planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dyer, H. L.
Computer-based evaluative methodologies have been developed to provide for the analysis of coupled phenomena associated with natural resource comprehensive planning requirements. Provisions for planner/computer interaction have been included. Each of the simulation models developed is described in terms of its coded procedures. An application of the models for water quality management planning is presented; and the data requirements for each of the models are noted.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1988-12-01
This document contains twelve papers on various aspects of low-level radioactive waste management. Topics of this volume include: performance assessment methodology; remedial action alternatives; site selection and site characterization procedures; intruder scenarios; sensitivity analysis procedures; mathematical models for mixed waste environmental transport; and risk assessment methodology. Individual papers were processed separately for the database. (TEM)
Frosini, Francesco; Miniati, Roberto; Grillone, Saverio; Dori, Fabrizio; Gentili, Guido Biffi; Belardinelli, Andrea
2016-11-14
This study proposes and tests an integrated methodology combining Health Technology Assessment (HTA) and Failure Modes, Effects and Criticality Analysis (FMECA) for the assessment of specific safety, process and technology aspects of robotic surgery. The integrated methodology applies techniques from HTA together with typical reliability-engineering models such as FMEA/FMECA. The study also included on-site data collection and interviews with medical personnel. The total number of robotic procedures included in the analysis was 44: 28 for urology and 16 for general surgery. The main outcomes refer to the comparative evaluation of robotic, laparoscopic and open surgery. Risk analysis and mitigation interventions come from the FMECA application. The small sample size available for the study represents an important bias, especially for the reliability of the clinical outcomes. Despite this, the study seems to confirm the more favorable trend in surgical times for robotics compared with the open technique, as well as the clinical benefits of robotics in urology. A more complex situation is observed for general surgery, where the only directly measured clinical benefit of robotics is a lower blood-transfusion rate.
NASA Technical Reports Server (NTRS)
Rasmussen, Robert; Bennett, Matthew
2006-01-01
The State Analysis Database Tool software establishes a productive environment for collaboration among software and system engineers engaged in the development of complex interacting systems. The tool embodies State Analysis, a model-based system engineering methodology founded on a state-based control architecture (see figure). A state represents a momentary condition of an evolving system, and a model may describe how a state evolves and is affected by other states. The State Analysis methodology is a process for capturing system and software requirements in the form of explicit models and states, and defining goal-based operational plans consistent with the models. Requirements, models, and operational concerns have traditionally been documented in a variety of system engineering artifacts that address different aspects of a mission's lifecycle. In State Analysis, requirements, models, and operations information are State Analysis artifacts that are consistent and stored in a State Analysis Database. The tool includes a back-end database, a multi-platform front-end client, and Web-based administrative functions. The tool is structured to prompt an engineer to follow the State Analysis methodology, to encourage state discovery and model description, and to make software requirements and operations plans consistent with model descriptions.
Use of Computer Simulation for the Analysis of Railroad Operations in the St. Louis Terminal Area
DOT National Transportation Integrated Search
1977-11-01
This report discusses the computer simulation methodology, its uses and limitations, and its applicability to the analysis of alternative railroad terminal restructuring plans. Included is a detailed discussion of the AAR Simulation System, an overvi...
Comparative Lifecycle Energy Analysis: Theory and Practice.
ERIC Educational Resources Information Center
Morris, Jeffrey; Canzoneri, Diana
1992-01-01
Explores the position that more energy is conserved through recycling secondary materials than is generated from municipal solid waste incineration. Discusses one component of a lifecycle analysis--a comparison of energy requirements for manufacturing competing products. Includes methodological issues, energy cost estimates, and difficulties…
Houston Cole Library Collection Assessment.
ERIC Educational Resources Information Center
Henderson, William Abbot, Ed.; McAbee, Sonja L., Ed.
This document reports on an assessment of the Jacksonville State University's Houston Cole Library collection that employed a variety of methodologies and tools, including list-checking, direct collection examination, shelflist measurement and analysis, WLN (Washington Library Network) conspectus sheets, analysis of OCLC/AMIGOS Collection Analysis…
Morales, Daniel R.; Pacurariu, Alexandra; Kurz, Xavier
2017-01-01
Aims: Evaluating the public health impact of regulatory interventions is important, but there is currently no common methodological approach to guide this evaluation. This systematic review provides a descriptive overview of the analytical methods for impact research. Methods: We searched MEDLINE and EMBASE for articles with an empirical analysis evaluating the impact of European Union or non‐European Union regulatory actions to safeguard public health published until March 2017. References from systematic reviews and articles from other known sources were added. Regulatory interventions, data sources, outcomes of interest, methodology and key findings were extracted. Results: From 1246 screened articles, 229 were eligible for full‐text review and 153 articles in English were included in the descriptive analysis. Over a third of the articles studied analgesics and antidepressants. The interventions most frequently evaluated were regulatory safety communications (28.8%), black box warnings (23.5%) and direct healthcare professional communications (10.5%); 55% of studies measured changes in drug utilization patterns, 27% evaluated health outcomes, and 18% targeted knowledge, behaviour or changes in clinical practice. Unintended consequences such as switching of therapies or spill‐over effects were rarely evaluated. Two‐thirds used before–after time series designs and 15.7% used before–after cross‐sectional designs. Various analytical approaches were applied, including interrupted time series regression (31.4%), simple descriptive analysis (28.8%) and descriptive analysis with significance tests (23.5%). Conclusion: While impact evaluation of pharmacovigilance and product‐specific regulatory interventions is increasing, the marked heterogeneity in study conduct and reporting highlights the need for scientific guidance to ensure that robust methodologies are applied and results are disseminated systematically. PMID:29105853
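Interrupted time series regression, the most frequently applied analytical approach in the review above, estimates the level and slope change in an outcome at the date of a regulatory intervention. A minimal sketch follows; the monthly prescribing figures and intervention month are invented for illustration, and a real analysis would fit both segments in one model and report standard errors:

```python
def linfit(ts, ys):
    """Ordinary least-squares fit of y = a + b*t; returns (a, b)."""
    n = len(ts)
    mt, my = sum(ts) / n, sum(ys) / n
    b = sum((t - mt) * (y - my) for t, y in zip(ts, ys)) / \
        sum((t - mt) ** 2 for t in ts)
    return my - b * mt, b

def its_effects(t, y, t0):
    """Segmented regression: fit pre/post separately, compare at time t0."""
    pre = [(ti, yi) for ti, yi in zip(t, y) if ti < t0]
    post = [(ti, yi) for ti, yi in zip(t, y) if ti >= t0]
    a1, b1 = linfit(*zip(*pre))
    a2, b2 = linfit(*zip(*post))
    level_change = (a2 + b2 * t0) - (a1 + b1 * t0)  # jump at the interruption
    slope_change = b2 - b1                          # change in trend
    return level_change, slope_change

months = list(range(12))
t0 = 6  # hypothetical month of a safety communication
rates = [10, 10.5, 11, 11.5, 12, 12.5,   # rising pre-intervention trend
         8, 7.8, 7.6, 7.4, 7.2, 7.0]     # drop and decline afterwards
level, slope = its_effects(months, rates, t0)
print(level, slope)  # an immediate drop of about 5 units and a -0.7/month trend change
```

The same fitted segments also give the counterfactual ("what utilization would have been without the warning"), which is the quantity most impact evaluations actually report.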
Characterizing Aeroallergens by Infrared Spectroscopy of Fungal Spores and Pollen
Zimmermann, Boris; Tkalčec, Zdenko; Mešić, Armin; Kohler, Achim
2015-01-01
Background: Fungal spores and plant pollen cause respiratory diseases in susceptible individuals, such as asthma, allergic rhinitis and hypersensitivity pneumonitis. Aeroallergen monitoring networks are an important part of treatment strategies, but unfortunately traditional analysis is time consuming and expensive. We have explored the use of infrared spectroscopy of pollen and spores for inexpensive and rapid characterization of aeroallergens. Methodology: The study is based on measurement of spore and pollen samples by single-reflectance attenuated total reflectance Fourier transform infrared spectroscopy (SR-ATR FTIR). The experimental set includes 71 spore (Basidiomycota) and 121 pollen (Pinales, Fagales and Poales) samples. Along with fresh basidiospores, the study was conducted on archived samples collected within the last 50 years. Results: The spectroscopy-based methodology enables clear spectral differentiation between pollen and spores, as well as the separation of confamiliar and congeneric species. In addition, analysis of the scattering signals inherent in the infrared spectra indicates that the FTIR methodology offers an indirect estimate of the morphology of pollen and spores. The analysis of fresh and archived spores shows that the chemical composition of spores is well preserved even after decades of storage, including the characteristic taxonomy-related signals. Biochemical analysis of fungal spores by FTIR could therefore provide economical, reliable and timely methodologies for improving fungal taxonomy, as well as for fungal identification and monitoring. This proof-of-principle study shows the potential of FTIR as a rapid tool in aeroallergen studies. In addition, the presented method is ready to be implemented immediately in biological and ecological studies for direct measurement of pollen and spores from flowers and sporocarps. PMID:25867755
Application of atomic force microscopy as a nanotechnology tool in food science.
Yang, Hongshun; Wang, Yifen; Lai, Shaojuan; An, Hongjie; Li, Yunfei; Chen, Fusheng
2007-05-01
Atomic force microscopy (AFM) provides a method for detecting nanoscale structural information. This review first explains the fundamentals of AFM, including its principles, operation, and analysis. Applications of AFM in food science and technology research are then reported, including qualitative macromolecule and polymer imaging, complicated or quantitative structure analysis, molecular interactions, molecular manipulation, surface topography, and nanofood characterization. The results suggest that AFM can provide insightful knowledge of food properties, and that AFM analysis can be used to illustrate some mechanisms of property changes during processing and storage. However, the current difficulty in applying AFM to food research is the lack of appropriate methodology for different food systems. A better understanding of AFM technology and the development of corresponding methodology for complicated food systems would lead to a more in-depth understanding of food properties at the macromolecular level and broaden its applications. Such results could greatly improve food processing and storage technologies.
Emerging and recurrent issues in drug development.
Anello, C
This paper reviews several emerging and recurrent issues relating to the drug development process. These emerging issues include changes to the FDA regulatory environment, internationalization of drug development, advances in computer technology and visualization tools, and efforts to incorporate meta-analysis methodology. Recurrent issues include: renewed interest in statistical methods for handling subgroups in the design and analysis of clinical trials; renewed interest in alternatives to the 'intention-to-treat' analysis in the presence of non-compliance in randomized clinical trials; renewed interest in methodology to address the multiplicities resulting from a variety of sources inherent in the drug development process, and renewed interest in methods to assure data integrity. These emerging and recurrent issues provide a continuing challenge to the international community of statisticians involved in drug development. Moreover, the involvement of statisticians with different perspectives continues to enrich the field and contributes to improvement in the public health.
Qualitative research methods in renal medicine: an introduction.
Bristowe, Katherine; Selman, Lucy; Murtagh, Fliss E M
2015-09-01
Qualitative methodologies are becoming increasingly widely used in health research. However, within some specialties, including renal medicine, qualitative approaches remain under-represented in the high-impact factor journals. Qualitative research can be undertaken: (i) as a stand-alone research method, addressing specific research questions; (ii) as part of a mixed methods approach alongside quantitative approaches or (iii) embedded in clinical trials, or during the development of complex interventions. The aim of this paper is to introduce qualitative research, including the rationale for choosing qualitative approaches, and guidance for ensuring quality when undertaking and reporting qualitative research. In addition, we introduce types of qualitative data (observation, interviews and focus groups) as well as some of the most commonly encountered methodological approaches (case studies, ethnography, phenomenology, grounded theory, thematic analysis, framework analysis and content analysis). © The Author 2015. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
Towards a Methodology for Identifying Program Constraints During Requirements Analysis
NASA Technical Reports Server (NTRS)
Romo, Lilly; Gates, Ann Q.; Della-Piana, Connie Kubo
1997-01-01
Requirements analysis is the activity of determining the needs of the customer, identifying the services that the software system should provide, and understanding the constraints on the solution. The result of this activity is a natural-language document, typically referred to as the requirements definition document. The problems in defining requirements for large-scale software projects include synthesizing knowledge from various domain experts and communicating this information across multiple levels of personnel. One approach that addresses part of this problem, called context monitoring, involves identifying the properties of, and relationships between, objects that the system will manipulate. This paper examines several software development methodologies, discusses the support each provides for eliciting such information from experts and specifying it, and suggests refinements to these methodologies.
Portell, Mariona; Anguera, M Teresa; Hernández-Mendo, Antonio; Jonsson, Gudberg K
2015-01-01
Contextual factors are crucial for evaluative research in psychology, as they provide insights into what works, for whom, in what circumstances, in what respects, and why. Studying behavior in context, however, poses numerous methodological challenges. Although a comprehensive framework for classifying methods seeking to quantify biopsychosocial aspects in everyday contexts was recently proposed, this framework does not contemplate contributions from observational methodology. The aim of this paper is to justify and propose a more general framework that includes observational methodology approaches. Our analysis is rooted in two general concepts: ecological validity and methodological complementarity. We performed a narrative review of the literature on research methods and techniques for studying daily life and describe their shared properties and requirements (collection of data in real time, on repeated occasions, and in natural settings) and classification criteria (eg, variables of interest and level of participant involvement in the data collection process). We provide several examples that illustrate why, despite their higher costs, studies of behavior and experience in everyday contexts offer insights that complement findings provided by other methodological approaches. We urge that observational methodology be included in classifications of research methods and techniques for studying everyday behavior and advocate a renewed commitment to prioritizing ecological validity in behavioral research seeking to quantify biopsychosocial aspects. PMID:26089708
Baxter, Siyan; Sanderson, Kristy; Venn, Alison J; Blizzard, C Leigh; Palmer, Andrew J
2014-01-01
To determine the relationship between return on investment (ROI) and quality of study methodology in workplace health promotion programs, data were obtained through a systematic literature search of the National Health Service Economic Evaluation Database (NHS EED), the Database of Abstracts of Reviews of Effects (DARE), the Health Technology Assessment database (HTA), the Cost-Effectiveness Analysis (CEA) Registry, EconLit, PubMed, Embase, Wiley, and Scopus. Included were articles written in English or German reporting cost(s) and benefit(s) of single- or multicomponent health promotion programs for working adults. Return-to-work and workplace injury prevention studies were excluded. Methodological quality was graded using the British Medical Journal Economic Evaluation Working Party checklist. Economic outcomes were presented as ROI, calculated as ROI = (benefits - costs of program)/costs of program. Results were weighted by study size and combined using meta-analysis techniques. Sensitivity analysis was performed using two additional methodological quality checklists. The influences of quality score and important study characteristics on ROI were explored. Fifty-one studies (61 intervention arms) published between 1984 and 2012 included 261,901 participants and 122,242 controls from nine industry types across 12 countries. Methodological quality scores were highly correlated between checklists (r = .84-.93). Methodological quality improved over time. The overall weighted ROI [mean ± standard deviation (confidence interval)] was 1.38 ± 1.97 (1.38-1.39), indicating a 138% return on investment. When accounting for methodological quality, an inverse relationship with ROI was found. High-quality studies (n = 18) had a smaller mean ROI, 0.26 ± 1.74 (.23-.30), compared with moderate-quality (n = 16), 0.90 ± 1.25 (.90-.91), and low-quality (n = 27), 2.32 ± 2.14 (2.30-2.33), studies. Randomized controlled trials (RCTs) (n = 12) exhibited a negative ROI, -0.22 ± 2.41 (-.27 to -.16). Financial returns became increasingly positive across quasi-experimental, nonexperimental, and modeled studies: 1.12 ± 2.16 (1.11-1.14), 1.61 ± 0.91 (1.56-1.65), and 2.05 ± 0.88 (2.04-2.06), respectively. Overall, the mean weighted ROI in workplace health promotion was positive. Higher-quality studies provided evidence of smaller financial returns. Methodological quality and study design are important determinants.
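The ROI definition quoted in the abstract above is simple to compute, and the weighting by study size amounts to a participant-weighted mean. A minimal sketch follows; the dollar figures and study sizes are invented for illustration, not drawn from the review:

```python
def roi(benefits, costs):
    """ROI as defined in the review: (benefits - costs of program) / costs of program."""
    return (benefits - costs) / costs

def weighted_mean_roi(studies):
    """Pool per-study ROI values, weighting each by its number of participants."""
    total_n = sum(n for _, n in studies)
    return sum(r * n for r, n in studies) / total_n

# Hypothetical program: $250k in benefits against $120k of program costs.
r = roi(250_000, 120_000)
print(round(r, 2))  # about 1.08, i.e. a 108% return

# Hypothetical pooling of two studies of different sizes.
pooled = weighted_mean_roi([(1.0, 100), (2.0, 300)])
print(pooled)  # the larger study dominates the weighted mean
```

Weighting by size rather than averaging raw ROI values is what lets a few very large studies dominate the pooled estimate, which is worth keeping in mind when interpreting the 1.38 figure reported above.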
NASA Technical Reports Server (NTRS)
Pamadi, Bandu N.; Toniolo, Matthew D.; Tartabini, Paul V.; Roithmayr, Carlos M.; Albertson, Cindy W.; Karlgaard, Christopher D.
2016-01-01
The objective of this report is to develop and implement a physics based method for analysis and simulation of multi-body dynamics including launch vehicle stage separation. The constraint force equation (CFE) methodology discussed in this report provides such a framework for modeling constraint forces and moments acting at joints when the vehicles are still connected. Several stand-alone test cases involving various types of joints were developed to validate the CFE methodology. The results were compared with ADAMS(Registered Trademark) and Autolev, two different industry standard benchmark codes for multi-body dynamic analysis and simulations. However, these two codes are not designed for aerospace flight trajectory simulations. After this validation exercise, the CFE algorithm was implemented in Program to Optimize Simulated Trajectories II (POST2) to provide a capability to simulate end-to-end trajectories of launch vehicles including stage separation. The POST2/CFE methodology was applied to the STS-1 Space Shuttle solid rocket booster (SRB) separation and Hyper-X Research Vehicle (HXRV) separation from the Pegasus booster as a further test and validation for its application to launch vehicle stage separation problems. Finally, to demonstrate end-to-end simulation capability, POST2/CFE was applied to the ascent, orbit insertion, and booster return of a reusable two-stage-to-orbit (TSTO) vehicle concept. With these validation exercises, POST2/CFE software can be used for performing conceptual level end-to-end simulations, including launch vehicle stage separation, for problems similar to those discussed in this report.
Rao, Goutham; Lopez-Jimenez, Francisco; Boyd, Jack; D'Amico, Frank; Durant, Nefertiti H; Hlatky, Mark A; Howard, George; Kirley, Katherine; Masi, Christopher; Powell-Wiley, Tiffany M; Solomonides, Anthony E; West, Colin P; Wessel, Jennifer
2017-09-05
Meta-analyses are becoming increasingly popular, especially in the fields of cardiovascular disease prevention and treatment. They are often considered to be a reliable source of evidence for making healthcare decisions. Unfortunately, problems among meta-analyses such as the misapplication and misinterpretation of statistical methods and tests are long-standing and widespread. The purposes of this statement are to review key steps in the development of a meta-analysis and to provide recommendations that will be useful for carrying out meta-analyses and for readers and journal editors, who must interpret the findings and gauge methodological quality. To make the statement practical and accessible, detailed descriptions of statistical methods have been omitted. Based on a survey of cardiovascular meta-analyses, published literature on methodology, expert consultation, and consensus among the writing group, key recommendations are provided. Recommendations reinforce several current practices, including protocol registration; comprehensive search strategies; methods for data extraction and abstraction; methods for identifying, measuring, and dealing with heterogeneity; and statistical methods for pooling results. Other practices should be discontinued, including the use of levels of evidence and evidence hierarchies to gauge the value and impact of different study designs (including meta-analyses) and the use of structured tools to assess the quality of studies to be included in a meta-analysis. We also recommend choosing a pooling model for conventional meta-analyses (fixed effect or random effects) on the basis of clinical and methodological similarities among studies to be included, rather than the results of a test for statistical heterogeneity. © 2017 American Heart Association, Inc.
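The statement above recommends choosing between fixed-effect and random-effects pooling on clinical and methodological grounds rather than on a heterogeneity test, but the mechanics of the conventional inverse-variance fixed-effect computation are easy to show. The log-odds-ratio estimates below are hypothetical, for illustration only:

```python
import math

def fixed_effect_pool(estimates, ses):
    """Inverse-variance fixed-effect pooling: weight each study by 1/SE^2."""
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    return pooled, math.sqrt(1.0 / sum(weights))

def cochran_q(estimates, ses):
    """Cochran's Q statistic, a common measure of statistical heterogeneity."""
    pooled, _ = fixed_effect_pool(estimates, ses)
    return sum(((e - pooled) / se) ** 2 for e, se in zip(estimates, ses))

log_ors = [0.20, 0.35, 0.10]   # hypothetical per-study log odds ratios
ses = [0.10, 0.20, 0.15]       # hypothetical standard errors
pooled, se = fixed_effect_pool(log_ors, ses)
print(pooled, se, cochran_q(log_ors, ses))
```

Note that the pooled estimate always lies between the smallest and largest study estimates and has a smaller standard error than any single study; a random-effects model would add a between-study variance term to each weight.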
Probabilistic structural analysis methods of hot engine structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Hopkins, D. A.
1989-01-01
Development of probabilistic structural analysis methods for hot engine structures at Lewis Research Center is presented. Three elements of the research program are: (1) composite load spectra methodology; (2) probabilistic structural analysis methodology; and (3) probabilistic structural analysis application. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) turbine blade temperature, pressure, and torque of the space shuttle main engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; and (3) evaluation of the failure probability. Collectively, the results demonstrate that the structural durability of hot engine structural components can be effectively evaluated in a formal probabilistic/reliability framework.
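Evaluating a cumulative distribution function for a structural response under uncertain primitive variables, as described above, can be sketched with brute-force Monte Carlo sampling. The load and section-property distributions and the allowable-stress limit below are invented; production probabilistic structural codes use far more efficient reliability methods than plain sampling:

```python
import bisect
import random

def empirical_cdf(samples):
    """Return an empirical CDF function built from Monte Carlo samples."""
    xs = sorted(samples)
    def cdf(x):
        return bisect.bisect_right(xs, x) / len(xs)
    return cdf

random.seed(42)
# Hypothetical primitive variables: applied load (N) and a section property (mm^2),
# both normally distributed; response = load / section property.
stress = [random.gauss(1000.0, 50.0) / random.gauss(2.0, 0.1)
          for _ in range(10_000)]

cdf = empirical_cdf(stress)
limit = 600.0  # assumed allowable stress for this illustration
print("estimated failure probability:", 1.0 - cdf(limit))
```

The failure probability is simply one minus the CDF evaluated at the allowable limit; tightening the input uncertainties visibly narrows the response distribution, which is the kind of sensitivity result the program above quantifies for SSME turbopump blades.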
Methodological quality of meta-analyses of single-case experimental studies.
Jamshidi, Laleh; Heyvaert, Mieke; Declercq, Lies; Fernández-Castilla, Belén; Ferron, John M; Moeyaert, Mariola; Beretvas, S Natasha; Onghena, Patrick; Van den Noortgate, Wim
2017-12-28
Methodological rigor is a fundamental factor in the validity and credibility of the results of a meta-analysis. Following an increasing interest in single-case experimental design (SCED) meta-analyses, the current study investigates the methodological quality of SCED meta-analyses. We assessed the methodological quality of 178 SCED meta-analyses published between 1985 and 2015 through the modified Revised-Assessment of Multiple Systematic Reviews (R-AMSTAR) checklist. The main finding of the current review is that the methodological quality of the SCED meta-analyses has increased over time, but is still low according to the R-AMSTAR checklist. A remarkable percentage of the studies (93.80% of the included SCED meta-analyses) did not even reach the midpoint score (22, on a scale of 0-44). The mean and median methodological quality scores were 15.57 and 16, respectively. Relatively high scores were observed for "providing the characteristics of the included studies" and "doing comprehensive literature search". The key areas of deficiency were "reporting an assessment of the likelihood of publication bias" and "using the methods appropriately to combine the findings of studies". Although the results of the current review reveal that the methodological quality of the SCED meta-analyses has increased over time, still more efforts are needed to improve their methodological quality. Copyright © 2017 Elsevier Ltd. All rights reserved.
Methodological reporting of randomized clinical trials in respiratory research in 2010.
Lu, Yi; Yao, Qiuju; Gu, Jie; Shen, Ce
2013-09-01
Although randomized controlled trials (RCTs) are considered the highest level of evidence, they are also subject to bias due to inadequately reported randomization, so reporting should be as explicit as possible for readers to determine the significance of the contents. We evaluated the methodological quality of RCTs in respiratory research published in 2010 in high-ranking clinical journals. We assessed methodological quality, including generation of the allocation sequence, allocation concealment, double-blinding, sample-size calculation, intention-to-treat analysis, flow diagrams, number of medical centers involved, diseases, funding sources, types of interventions, trial registration, number of times the papers have been cited, journal impact factor, journal type, and journal endorsement of the CONSORT (Consolidated Standards of Reporting Trials) rules, in RCTs published in 12 top-ranking clinical respiratory journals and 5 top-ranking general medical journals. We included 176 trials, of which 93 (53%) reported adequate generation of the allocation sequence, 66 (38%) reported adequate allocation concealment, 79 (45%) were double-blind, 123 (70%) reported an adequate sample-size calculation, 88 (50%) reported intention-to-treat analysis, and 122 (69%) included a flow diagram. Multivariate logistic regression analysis revealed that a journal impact factor ≥ 5 was the only variable that significantly influenced adequate allocation sequence generation. Trial registration and a journal impact factor ≥ 5 significantly influenced adequate allocation concealment. Medical interventions, trial registration, and journal endorsement of the CONSORT statement influenced adequate double-blinding. Publication in one of the general medical journals influenced adequate sample-size calculation. The methodological quality of RCTs in respiratory research needs improvement. Stricter enforcement of the CONSORT statement should enhance the quality of RCTs.
The Dispositions for Culturally Responsive Pedagogy Scale
ERIC Educational Resources Information Center
Whitaker, Manya C.; Valtierra, Kristina Marie
2018-01-01
Purpose: The purpose of this study is to develop and validate the dispositions for culturally responsive pedagogy scale (DCRPS). Design/methodology/approach: Scale development consisted of a six-step process including item development, expert review, exploratory factor analysis, factor interpretation, confirmatory factor analysis and convergent…
NASA Technical Reports Server (NTRS)
Berg, M. D.; Kim, H. S.; Friendlich, M. A.; Perez, C. E.; Seidlick, C. M.; LaBel, K. A.
2011-01-01
We present SEU test results and analysis for the Microsemi ProASIC3 FPGA. SEU probability models are incorporated for device evaluation. A comparison with the RTAXS FPGA is included, illustrating the effectiveness of the overall testing methodology.
A Phenomenological Exploration of Adoption
ERIC Educational Resources Information Center
Baltimore, Diana L.; Crase, Sedahlia Jasper
2009-01-01
This qualitative analysis explored children's and adults' experiences with adoption. We used phenomenological methodology and individually interviewed 25 participants, including adoptive mothers and fathers and their children, each adopted before 18 months of age. Two research questions guided the data analysis: (a) What are children's and…
Zhu, Xiaoyan; Zhou, Xiaobin; Zhang, Yuan; Sun, Xiao; Liu, Haihua; Zhang, Yingying
2017-12-01
Survival analysis methods have gained widespread use in the field of oncology. For reliable results, the methodological process and reporting quality are crucial. This review provides the first examination of the methodological characteristics and reporting quality of survival analysis in articles published in leading Chinese oncology journals. The aims were to examine the methodological and reporting quality of survival analysis, to identify common deficiencies, to suggest desirable precautions in the analysis, and to offer advice for authors, readers, and editors. A total of 242 survival analysis articles were included for evaluation from 1492 articles published in 4 leading Chinese oncology journals in 2013. Articles were evaluated according to 16 established items for proper use and reporting of survival analysis. The application rates of Kaplan-Meier, life table, log-rank test, Breslow test, and Cox proportional hazards model (Cox model) were 91.74%, 3.72%, 78.51%, 0.41%, and 46.28%, respectively; no article used a parametric method for survival analysis. A multivariate Cox model was conducted in 112 articles (46.28%). Follow-up rates were mentioned in 155 articles (64.05%), of which 4 were under 80% (the lowest was 75.25%) and 55 were 100%. The report rates of all types of survival endpoint were lower than 10%. Eleven of the 100 articles that reported a loss to follow-up stated how it was treated in the analysis. One hundred thirty articles (53.72%) did not perform multivariate analysis. One hundred thirty-nine articles (57.44%) did not define the survival time. Violations and omissions of methodological guidelines included no mention of pertinent checks for the proportional hazards assumption, no report of testing for interactions and collinearity between independent variables, and no report of the calculation method of sample size. Thirty-six articles (32.74%) reported the methods of independent variable selection.
These defects could make the reported results inaccurate or misleading, or make them difficult to interpret. There are gaps in the conduct and reporting of survival analysis in studies published in Chinese oncology journals, and severe deficiencies were noted. Greater endorsement by journals of reporting guidelines for survival analysis may improve article quality and the dissemination of reliable evidence to oncology clinicians. We recommend that authors, readers, reviewers, and editors consider survival analysis more carefully and cooperate more closely with statisticians and epidemiologists. Copyright © 2017 The Authors. Published by Wolters Kluwer Health, Inc. All rights reserved.
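The Kaplan-Meier estimator that dominates these articles (91.74% application rate) steps the survival probability down at each observed event time, while censored subjects simply leave the risk set. A self-contained sketch with made-up follow-up data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times: follow-up time per subject; events: 1 = event observed, 0 = censored.
    Returns (time, survival probability) at each time with at least one event."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = at_this_time = 0
        while i < len(data) and data[i][0] == t:   # group tied times
            at_this_time += 1
            deaths += data[i][1]
            i += 1
        if deaths:
            s *= 1.0 - deaths / n_at_risk          # step the curve down
            curve.append((t, s))
        n_at_risk -= at_this_time                  # censored subjects leave too
    return curve

# Small worked example: 6 patients, events[i] == 0 marks censored follow-up
times  = [3, 5, 5, 8, 10, 12]
events = [1, 1, 0, 1, 0, 1]
for t, s in kaplan_meier(times, events):
    print(f"t={t:>2}  S(t)={s:.3f}")
```

Note how the censored subject at t=5 lowers the later denominators without producing a step of its own, which is exactly the distinction a proper follow-up report (loss to follow-up, censoring handling) makes auditable.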
NASA Technical Reports Server (NTRS)
Howard, R. A.; North, D. W.; Pezier, J. P.
1975-01-01
A new methodology is proposed for integrating planetary quarantine objectives into space exploration planning. This methodology is designed to remedy the major weaknesses inherent in the current formulation of planetary quarantine requirements. Application of the methodology is illustrated by a tutorial analysis of a proposed Jupiter Orbiter mission. The proposed methodology reformulates planetary quarantine planning as a sequential decision problem. Rather than concentrating on a nominal plan, all decision alternatives and possible consequences are laid out in a decision tree. Probabilities and values are associated with the outcomes, including the outcome of contamination. The process of allocating probabilities, which could not be made perfectly unambiguous and systematic, is replaced by decomposition and optimization techniques based on principles of dynamic programming. Thus, the new methodology provides logical integration of all available information and allows selection of the best strategy consistent with quarantine and other space exploration goals.
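The backward-induction ("rollback") step at the core of such a sequential decision-tree formulation averages over chance nodes and maximizes over decision nodes. A toy sketch with purely illustrative probabilities and utilities (not values from the mission analysis):

```python
def rollback(node):
    """Return the expected value of a decision tree by backward induction."""
    kind = node["type"]
    if kind == "leaf":
        return node["value"]
    if kind == "chance":      # probability-weighted average over outcomes
        return sum(p * rollback(child) for p, child in node["branches"])
    if kind == "decision":    # choose the best available option
        return max(rollback(child) for _, child in node["branches"])
    raise ValueError(kind)

# Hypothetical mini-tree: sterilize the spacecraft (costly, lower contamination
# risk) or fly as-is (cheaper, higher risk).  Utilities are illustrative only.
tree = {
    "type": "decision",
    "branches": [
        ("sterilize", {"type": "chance", "branches": [
            (0.999, {"type": "leaf", "value":  80}),    # mission succeeds
            (0.001, {"type": "leaf", "value": -1000}),  # contamination
        ]}),
        ("fly as-is", {"type": "chance", "branches": [
            (0.97, {"type": "leaf", "value": 100}),
            (0.03, {"type": "leaf", "value": -1000}),
        ]}),
    ],
}
print(rollback(tree))
```

Laying out all alternatives and consequences this way, rather than analyzing a single nominal plan, is what lets the optimization select the best strategy consistent with quarantine constraints.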
DOE Office of Scientific and Technical Information (OSTI.GOV)
Szilard, Ronaldo Henriques
A Risk Informed Safety Margin Characterization (RISMC) toolkit and methodology are proposed for investigating nuclear power plant core and fuel design and safety analysis, including postulated Loss-of-Coolant Accident (LOCA) analysis. This toolkit, developed under an integrated evaluation model framework, is named the LOCA toolkit for the US (LOTUS). The demonstration includes coupled analysis of core design, fuel design, thermal hydraulics, and systems analysis, using advanced risk analysis tools and methods to investigate a wide range of results.
NASA Technical Reports Server (NTRS)
Boyce, Lola; Bast, Callie C.; Trimble, Greg A.
1992-01-01
This report presents the results of a fourth-year effort of a research program, conducted for NASA-LeRC by the University of Texas at San Antonio (UTSA). The research included on-going development of methodology that provides probabilistic lifetime strength of aerospace materials via computational simulation. A probabilistic material strength degradation model, in the form of a randomized multifactor interaction equation, is postulated for strength degradation of structural components of aerospace propulsion systems subject to a number of effects or primitive variables. These primitive variables may include high temperature, fatigue, or creep. In most cases, strength is reduced as a result of the action of a variable. This multifactor interaction strength degradation equation has been randomized and is included in the computer program, PROMISS. Also included in the research is the development of methodology to calibrate the above-described constitutive equation using actual experimental materials data together with regression analysis of that data, thereby predicting values for the empirical material constants for each effect or primitive variable. This regression methodology is included in the computer program, PROMISC. Actual experimental materials data were obtained from industry and the open literature for materials typically used in aerospace propulsion system components. Material data for Inconel 718 have been analyzed using the developed methodology.
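The deterministic core of a multifactor interaction strength degradation equation of this kind multiplies one term per primitive variable, each driving strength toward zero as that variable approaches its ultimate value. A sketch assuming the commonly cited form [(A_F - A)/(A_F - A_0)]^a, with invented material values; this is an illustration of the functional form, not PROMISS itself:

```python
def strength_ratio(effects):
    """Multifactor interaction strength degradation (deterministic form).
    Each effect is (current, reference, ultimate, exponent): the term
    [(A_F - A) / (A_F - A_0)] ** a equals 1 at the reference value A_0 and
    falls to 0 as A approaches the ultimate (failure) value A_F."""
    ratio = 1.0
    for current, reference, ultimate, exponent in effects:
        ratio *= ((ultimate - current) / (ultimate - reference)) ** exponent
    return ratio

# Illustrative values only (temperature in K, stress amplitude in MPa):
effects = [
    (900.0,  300.0, 1200.0, 0.5),   # temperature effect
    (400.0,    0.0,  800.0, 0.25),  # fatigue stress effect
]
print(f"S/S0 = {strength_ratio(effects):.3f}")
```

Randomizing this equation, as the report describes, amounts to treating the current values, ultimate values, and exponents as random variables and propagating them through this product.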
Teresa E. Jordan
2015-10-22
The files included in this submission contain all data pertinent to the methods and results of this task’s output, which is a cohesive multi-state map of all known potential geothermal reservoirs in our region, ranked by their potential favorability. Favorability is quantified using a new metric, Reservoir Productivity Index, as explained in the Reservoirs Methodology Memo (included in zip file). Shapefile and images of the Reservoir Productivity and Reservoir Uncertainty are included as well.
Methodological quality of systematic reviews on influenza vaccination.
Remschmidt, Cornelius; Wichmann, Ole; Harder, Thomas
2014-03-26
There is a growing body of evidence on the risks and benefits of influenza vaccination in various target groups. Systematic reviews are of particular importance for policy decisions. However, their methodological quality can vary considerably. To investigate the methodological quality of systematic reviews on influenza vaccination (efficacy, effectiveness, safety) and to identify influencing factors. A systematic literature search on systematic reviews on influenza vaccination was performed, using MEDLINE, EMBASE and three additional databases (1990-2013). Review characteristics were extracted and the methodological quality of the reviews was evaluated using the assessment of multiple systematic reviews (AMSTAR) tool. U-test, Kruskal-Wallis test, chi-square test, and multivariable linear regression analysis were used to assess the influence of review characteristics on AMSTAR-score. Forty-six systematic reviews fulfilled the inclusion criteria. Average methodological quality was high (median AMSTAR-score: 8), but variability was large (AMSTAR range: 0-11). Quality did not differ significantly according to vaccination target group. Cochrane reviews had higher methodological quality than non-Cochrane reviews (p=0.001). Detailed analysis showed that this was due to better study selection and data extraction, inclusion of unpublished studies, and better reporting of study characteristics (all p<0.05). In the adjusted analysis, no other factor, including industry sponsorship or journal impact factor, had an influence on AMSTAR score. Systematic reviews on influenza vaccination showed large differences regarding their methodological quality. Reviews conducted by the Cochrane collaboration were of higher quality than others. When using systematic reviews to guide the development of vaccination recommendations, the methodological quality of a review in addition to its content should be considered. Copyright © 2014 Elsevier Ltd. All rights reserved.
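The U-test used here to compare AMSTAR scores between review groups can be sketched as follows. The implementation uses midranks for ties and a normal approximation without tie correction (a simplification), and the scores below are invented for illustration:

```python
import math

def mann_whitney_u(a, b):
    """Mann-Whitney U statistic with a two-sided normal-approximation p-value.
    Midranks handle ties; no tie correction is applied (sketch only)."""
    combined = sorted((v, grp) for grp, vals in ((0, a), (1, b)) for v in vals)
    n = len(combined)
    ranks = [0.0] * n
    i = 0
    while i < n:                     # assign midranks to tied blocks
        j = i
        while j < n and combined[j][0] == combined[i][0]:
            j += 1
        midrank = (i + 1 + j) / 2.0  # average of ranks i+1 .. j
        for k in range(i, j):
            ranks[k] = midrank
        i = j
    r_a = sum(r for r, (_, grp) in zip(ranks, combined) if grp == 0)
    n1, n2 = len(a), len(b)
    u = r_a - n1 * (n1 + 1) / 2.0
    mu = n1 * n2 / 2.0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (u - mu) / sigma
    p = math.erfc(abs(z) / math.sqrt(2))   # two-sided tail probability
    return u, p

# Hypothetical AMSTAR scores (0-11) for two groups of reviews:
cochrane     = [9, 10, 11, 8, 10, 9]
non_cochrane = [5, 7, 6, 8, 4, 6, 7, 5]
u, p = mann_whitney_u(cochrane, non_cochrane)
print(f"U = {u}, two-sided p = {p:.4f}")
```

For the small group sizes typical of such comparisons, an exact test (as in SciPy's `mannwhitneyu`) would normally be preferred over the normal approximation.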
ERIC Educational Resources Information Center
Coventry, D. R.; Poswal, R. S.; Yadav, Ashok; Zhou, Yi; Riar, Amritbir; Kumar, Anuj; Sharma, R. K.; Chhokar, R. S.; Gupta, R. K.; Mehta, A. K.; Chand, Ramesh; Denton, M. D.; Cummins, J. A.
2018-01-01
Purpose: The purpose of this study is to develop a conceptual framework with related analysis methodologies that identifies the influence of social environment on an established cropping system. Design/Methodology/Approach: A stratified survey including 103 villages and 823 farmers was conducted in all districts of Haryana (India). Firstly,…
Application of ion chromatography in pharmaceutical and drug analysis.
Jenke, Dennis
2011-08-01
Ion chromatography (IC) has developed and matured into an important analytical methodology in a number of diverse applications and industries, including pharmaceuticals. This manuscript provides a review of IC applications for the determinations of active and inactive ingredients, excipients, degradation products, and impurities relevant to pharmaceutical analyses and thus serves as a resource for investigators looking for insights into the use of the IC methodology in this field of application.
Api, A M; Belsito, D; Bruze, M; Cadby, P; Calow, P; Dagli, M L; Dekant, W; Ellis, G; Fryer, A D; Fukayama, M; Griem, P; Hickey, C; Kromidas, L; Lalko, J F; Liebler, D C; Miyachi, Y; Politano, V T; Renskers, K; Ritacco, G; Salvito, D; Schultz, T W; Sipes, I G; Smith, B; Vitale, D; Wilcox, D K
2015-08-01
The Research Institute for Fragrance Materials, Inc. (RIFM) has been engaged in the generation and evaluation of safety data for fragrance materials since its inception over 45 years ago. Over time, RIFM's approach to gathering data, estimating exposure and assessing safety has evolved as the tools for risk assessment evolved. This publication is designed to update the RIFM safety assessment process, which follows a series of decision trees, reflecting advances in approaches in risk assessment and new and classical toxicological methodologies employed by RIFM over the past ten years. These changes include incorporating 1) new scientific information including a framework for choosing structural analogs, 2) consideration of the Threshold of Toxicological Concern (TTC), 3) the Quantitative Risk Assessment (QRA) for dermal sensitization, 4) the respiratory route of exposure, 5) aggregate exposure assessment methodology, 6) the latest methodology and approaches to risk assessments, 7) the latest alternatives to animal testing methodology and 8) environmental risk assessment. The assessment begins with a thorough analysis of existing data followed by in silico analysis, identification of 'read across' analogs, generation of additional data through in vitro testing as well as consideration of the TTC approach. If necessary, risk management may be considered. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Hendikawati, P.; Arifudin, R.; Zahid, M. Z.
2018-03-01
This study aims to design an Android statistics data analysis application that can be accessed through mobile devices, making it easier for users to access. The application covers various basic statistics topics along with parametric statistical data analysis. The output of the application is parametric statistical data analysis that can be used by students, lecturers, and other users who need the results of statistical calculations quickly and in an easily understood form. The Android application is developed using the Java programming language. The server-side programming language is PHP with the CodeIgniter framework, and the database is MySQL. The system development methodology is the Waterfall methodology, with stages of analysis, design, coding, testing, implementation, and system maintenance. This statistical data analysis application is expected to support statistics lectures and make it easier for students to perform statistical analysis on mobile devices.
A Method for Evaluating the Safety Impacts of Air Traffic Automation
NASA Technical Reports Server (NTRS)
Kostiuk, Peter; Shapiro, Gerald; Hanson, Dave; Kolitz, Stephan; Leong, Frank; Rosch, Gene; Bonesteel, Charles
1998-01-01
This report describes a methodology for analyzing the safety and operational impacts of emerging air traffic technologies. The approach integrates traditional reliability models of the system infrastructure with models that analyze the environment within which the system operates, and models of how the system responds to different scenarios. Products of the analysis include safety measures such as predicted incident rates, predicted accident statistics, and false alarm rates; and operational availability data. The report demonstrates the methodology with an analysis of the operation of the Center-TRACON Automation System at Dallas-Fort Worth International Airport.
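Two of the simplest quantities such a safety and operations analysis produces are steady-state availability from reliability-model parameters and an expected incident count from a constant-rate (Poisson) model. A simplified sketch with illustrative numbers, not figures from the DFW analysis:

```python
def availability(mtbf_hours, mttr_hours):
    """Steady-state availability: the long-run uptime fraction of a
    repairable system, MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def expected_incidents(rate_per_op, operations):
    """Expected incident count under a Poisson model with a constant
    per-operation incident rate."""
    return rate_per_op * operations

# Illustrative numbers only (not from the report):
a = availability(mtbf_hours=2000.0, mttr_hours=4.0)
n = expected_incidents(rate_per_op=1e-7, operations=850_000)
print(f"availability = {a:.5f}, expected incidents/year = {n:.3f}")
```

A full analysis of the kind described would layer scenario and environment models on top of these infrastructure-level figures, but the availability and rate calculations remain the building blocks.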
Some guidelines for conducting research in applied behavioral pharmacology.
van Haaren, Frans; Weeden, Marc
2013-01-01
The Journal of Applied Behavior Analysis (JABA) has published a number of articles on the behavioral effects of psychomotor stimulant drugs in individuals with attention deficit hyperactivity disorder. Some additional JABA publications have included investigations of the behavioral effects of other drugs. However, a review of these articles revealed many methodological differences among studies, which make it difficult to evaluate the relative contribution of each research effort to the overall database. In this context, we offer some guidelines to solidify the methodological rigor of behavioral pharmacology research published in JABA. © Society for the Experimental Analysis of Behavior.
Methodological issues in medical workforce analysis: implications for regional Australia.
Hays, R B; Veitch, P C; Franklin, L; Crossland, L
1998-02-01
Medical workforce data have a profound impact on health policy formulation, but derived doctor population ratios (DPR) are often more relevant to plotting national trends than to providing a detailed regional or local workforce perspective. Regional workforce data may be more useful if national approaches are augmented by local information. In developing a detailed workforce analysis for one region of Australia, the authors encountered several challenging methodological issues, including the accuracy of medical workforce databases, clarity of definition of community boundaries, interpretation of workforce definitions, and the difficulty of accounting for local community needs. This paper discusses the implications for regional workforce research.
Development of Probabilistic Structural Analysis Integrated with Manufacturing Processes
NASA Technical Reports Server (NTRS)
Pai, Shantaram S.; Nagpal, Vinod K.
2007-01-01
An effort has been initiated to integrate manufacturing process simulations with probabilistic structural analyses in order to capture the important impacts of manufacturing uncertainties on component stress levels and life. Two physics-based manufacturing process models (one for powdered metal forging and the other for annular deformation resistance welding) have been linked to the NESSUS structural analysis code. This paper describes the methodology developed to perform this integration including several examples. Although this effort is still underway, particularly for full integration of a probabilistic analysis, the progress to date has been encouraging and a software interface that implements the methodology has been developed. The purpose of this paper is to report this preliminary development.
Methodological reporting of randomized trials in five leading Chinese nursing journals.
Shi, Chunhu; Tian, Jinhui; Ren, Dan; Wei, Hongli; Zhang, Lihuan; Wang, Quan; Yang, Kehu
2014-01-01
Randomized controlled trials (RCTs) are not always well reported, especially in terms of their methodological descriptions. This study aimed to investigate the adherence of methodological reporting complying with CONSORT and explore associated trial level variables in the Chinese nursing care field. In June 2012, we identified RCTs published in five leading Chinese nursing journals and included trials with details of randomized methods. The quality of methodological reporting was measured through the methods section of the CONSORT checklist and the overall CONSORT methodological items score was calculated and expressed as a percentage. Meanwhile, we hypothesized that some general and methodological characteristics were associated with reporting quality and conducted a regression with these data to explore the correlation. The descriptive and regression statistics were calculated via SPSS 13.0. In total, 680 RCTs were included. The overall CONSORT methodological items score was 6.34 ± 0.97 (Mean ± SD). No RCT reported descriptions and changes in "trial design," changes in "outcomes" and "implementation," or descriptions of the similarity of interventions for "blinding." Poor reporting was found in detailing the "settings of participants" (13.1%), "type of randomization sequence generation" (1.8%), calculation methods of "sample size" (0.4%), explanation of any interim analyses and stopping guidelines for "sample size" (0.3%), "allocation concealment mechanism" (0.3%), additional analyses in "statistical methods" (2.1%), and targeted subjects and methods of "blinding" (5.9%). More than 50% of trials described randomization sequence generation, the eligibility criteria of "participants," "interventions," and definitions of the "outcomes" and "statistical methods." The regression analysis found that publication year and ITT analysis were weakly associated with CONSORT score. 
The completeness of methodological reporting of RCTs in the Chinese nursing care field is poor, especially with regard to the reporting of trial design, changes in outcomes, sample size calculation, allocation concealment, blinding, and statistical methods.
Probabilistic sizing of laminates with uncertainties
NASA Technical Reports Server (NTRS)
Shah, A. R.; Liaw, D. G.; Chamis, C. C.
1993-01-01
A reliability-based design methodology for laminate sizing and configuration for a special case of composite structures is described. The methodology combines probabilistic composite mechanics with probabilistic structural analysis. The uncertainties of constituent materials (fiber and matrix) to predict macroscopic behavior are simulated using probabilistic theory. Uncertainties in the degradation of composite material properties are included in this design methodology. A multi-factor interaction equation is used to evaluate load- and environment-dependent degradation of the composite material properties at the micromechanics level. The methodology is integrated into a computer code IPACS (Integrated Probabilistic Assessment of Composite Structures). Versatility of this design approach is demonstrated by performing a multi-level probabilistic analysis to size the laminates for the design structural reliability of random-type structures. The results show that laminate configurations can be selected to improve the structural reliability from three failures in 1000 to no failures in one million. Results also show that the laminates with the highest reliability are the least sensitive to the loading conditions.
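The probabilistic sizing idea (simulate uncertain strengths and loads, count failures) can be sketched with a crude Monte Carlo model. The normal distributions and the numbers below are simplifying assumptions for illustration, not IPACS itself:

```python
import random

def failure_probability(strength_mean, strength_cov, load_mean, load_cov,
                        n_samples=100_000, seed=42):
    """Monte Carlo estimate of P(load > strength) with normally distributed
    strength and load; cov is the coefficient of variation (sd / mean)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        strength = rng.gauss(strength_mean, strength_cov * strength_mean)
        load = rng.gauss(load_mean, load_cov * load_mean)
        if load > strength:
            failures += 1
    return failures / n_samples

# Illustrative laminate-level numbers (MPa): upsizing drives reliability up
p_baseline = failure_probability(600.0, 0.10, 400.0, 0.15)
p_upsized  = failure_probability(800.0, 0.10, 400.0, 0.15)
print(f"baseline P(failure) = {p_baseline:.4f}, upsized = {p_upsized:.6f}")
```

Reaching reliabilities like "no failures in one million" with plain Monte Carlo would require far more samples; methods such as importance sampling or first-order reliability approximations are the usual remedies.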
Incremental Upgrade of Legacy Systems (IULS)
2001-04-01
analysis task employed SEI's Feature-Oriented Domain Analysis methodology (see FODA reference) and included several phases: • Context Analysis • Establish...Legacy, new Host and upgrade system and software. The Feature-Oriented Domain Analysis approach (FODA, see SUM References) was used for this step...Feature-Oriented Domain Analysis (FODA) Feasibility Study (CMU/SEI-90-TR-21, ESD-90-TR-222); Software Engineering Institute, Carnegie Mellon University
Steuten, Lotte; van de Wetering, Gijs; Groothuis-Oudshoorn, Karin; Retèl, Valesca
2013-01-01
This article provides a systematic and critical review of the evolving methods and applications of value of information (VOI) in academia and practice and discusses where future research needs to be directed. Published VOI studies were identified by conducting a computerized search on Scopus and ISI Web of Science from 1980 until December 2011 using pre-specified search terms. Only full-text papers that outlined and discussed VOI methods for medical decision making, and studies that applied VOI and explicitly discussed the results with a view to informing healthcare decision makers, were included. The included papers were divided into methodological and applied papers, based on the aim of the study. A total of 118 papers were included, of which 50% (n = 59) are methodological. A rapidly accumulating literature base on VOI from 1999 onwards for methodological papers and from 2005 onwards for applied papers is observed. Expected value of sample information (EVSI) is the preferred method of VOI to inform decision making regarding specific future studies, but real-life applications of EVSI remain scarce. Methodological challenges to VOI are numerous and include the high computational demands, dealing with non-linear models and interdependency between parameters, estimations of effective time horizons and patient populations, and structural uncertainties. VOI analysis receives increasing attention in both the methodological and the applied literature bases, but challenges to applying VOI in real-life decision making remain. For many of the technical and methodological challenges to VOI, analytic solutions have been proposed in the literature, including leaner methods for VOI. Further research should also focus on the needs of decision makers regarding VOI.
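The expected value of perfect information (EVPI), the simplest VOI quantity in this literature, is the gap between the payoff of deciding after uncertainty is resolved and the payoff of deciding now. A Monte Carlo sketch with hypothetical net-benefit models (the parameter distribution and payoffs are invented for illustration):

```python
import random

def evpi(outcomes, n_samples=50_000, seed=7):
    """Expected value of perfect information by Monte Carlo.
    outcomes: list of functions mapping one draw of the uncertain parameter to
    that action's net benefit; every action is evaluated on the same draw so
    the comparison is consistent."""
    rng = random.Random(seed)
    totals = [0.0] * len(outcomes)
    perfect = 0.0
    for _ in range(n_samples):
        theta = rng.gauss(0.5, 0.2)          # uncertain effectiveness parameter
        nb = [f(theta) for f in outcomes]
        for i, v in enumerate(nb):
            totals[i] += v
        perfect += max(nb)                   # best action given this draw
    best_current = max(t / n_samples for t in totals)  # best action on average
    return perfect / n_samples - best_current

# Hypothetical net-benefit models (monetary units; purely illustrative):
treat    = lambda th: 1000.0 * th - 300.0   # benefit scales with effectiveness
no_treat = lambda th: 0.0
print(f"EVPI = {evpi([treat, no_treat]):.1f}")
```

EVSI, the quantity the review identifies as preferred for planning specific studies, extends this by simulating a study's data at each draw and updating the parameter before re-optimizing, which is where the heavy computational demands arise.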
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew
GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of a reliability database (RDB) methodology to determine applicable reliability data for inclusion in the quantification of the PRA. The RDB method developed during this project seeks to satisfy the requirements of the Data Analysis element of the ASME/ANS Non-LWR PRA standard. The RDB methodology utilizes a relevancy test to examine reliability data and determine whether it is appropriate to include as part of the reliability database for the PRA. The relevancy test compares three component properties to establish the level of similarity to components examined as part of the PRA. These properties include the component function, the component failure modes, and the environment/boundary conditions of the component. The relevancy test is used to gauge the quality of data found in a variety of sources, such as advanced reactor-specific databases, non-advanced reactor nuclear databases, and non-nuclear databases. The RDB also establishes the integration of expert judgment or separate reliability analysis with past reliability data. This paper provides details on the RDB methodology, and includes an example application of the RDB methodology for determining the reliability of the intermediate heat exchanger of a sodium fast reactor. The example explores a variety of reliability data sources, and assesses their applicability for the PRA of interest through the use of the relevancy test.
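A relevancy test of this shape can be caricatured as a weighted similarity score over the three component properties the abstract names. The property encodings, weights, and scoring rule below are illustrative stand-ins, not the actual RDB methodology:

```python
def relevancy_score(candidate, reference, weights=(0.4, 0.4, 0.2)):
    """Sketch of a data-relevancy test: compare a data source's component to
    the PRA component on function, failure modes, and environment/boundary
    conditions.  Returns a score in [0, 1]; weights are illustrative."""
    w_func, w_modes, w_env = weights
    score = 0.0
    if candidate["function"] == reference["function"]:
        score += w_func
    # Fraction of the PRA component's failure modes the data source covers:
    modes_c = set(candidate["failure_modes"])
    modes_r = set(reference["failure_modes"])
    if modes_r:
        score += w_modes * len(modes_c & modes_r) / len(modes_r)
    if candidate["environment"] == reference["environment"]:
        score += w_env
    return score

# Illustrative example: sodium-cooled intermediate heat exchanger in the PRA,
# candidate data from a water-cooled heat exchanger database.
pra_component = {"function": "heat transfer", "environment": "sodium",
                 "failure_modes": ["tube rupture", "fouling", "leak"]}
candidate = {"function": "heat transfer", "environment": "water",
             "failure_modes": ["tube rupture", "leak"]}
print(f"relevancy = {relevancy_score(candidate, pra_component):.3f}")
```

In practice such a score would feed a qualitative judgment (include, adjust, or exclude the data source) rather than a hard numeric cutoff, and would be combined with expert judgment as the abstract describes.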
Płaszewski, Maciej; Bettany-Saltikov, Josette
2014-01-01
Background Non-surgical interventions for adolescents with idiopathic scoliosis remain highly controversial. Despite the publication of numerous reviews no explicit methodological evaluation of papers labeled as, or having a layout of, a systematic review, addressing this subject matter, is available. Objectives Analysis and comparison of the content, methodology, and evidence-base from systematic reviews regarding non-surgical interventions for adolescents with idiopathic scoliosis. Design Systematic overview of systematic reviews. Methods Articles meeting the minimal criteria for a systematic review, regarding any non-surgical intervention for adolescent idiopathic scoliosis, with any outcomes measured, were included. Multiple general and systematic review specific databases, guideline registries, reference lists and websites of institutions were searched. The AMSTAR tool was used to critically appraise the methodology, and the Oxford Centre for Evidence Based Medicine and the Joanna Briggs Institute’s hierarchies were applied to analyze the levels of evidence from included reviews. Results From 469 citations, twenty-one papers were included for analysis. Five reviews assessed the effectiveness of scoliosis-specific exercise treatments, four assessed manual therapies, five evaluated bracing, four assessed different combinations of interventions, and one evaluated usual physical activity. Two reviews addressed the adverse effects of bracing. Two papers were high-quality Cochrane reviews, three were of moderate quality, and the remaining sixteen were of low or very low methodological quality. The level of evidence of these reviews ranged from 1 or 1+ to 4, and in some reviews, due to their low methodological quality and/or poor reporting, this could not be established. Conclusions Higher quality reviews indicate that generally there is insufficient evidence to make a judgment on whether non-surgical interventions in adolescent idiopathic scoliosis are effective. 
Papers labeled as systematic reviews need to be considered in terms of their methodological rigor; otherwise they may be mistakenly regarded as high quality sources of evidence. Protocol registry number CRD42013003538, PROSPERO PMID:25353954
Guglielminotti, Jean; Dechartres, Agnès; Mentré, France; Montravers, Philippe; Longrois, Dan; Laouénan, Cedric
2015-10-01
Prognostic research studies in anesthesiology aim to identify risk factors for an outcome (explanatory studies) or calculate the risk of this outcome on the basis of patients' risk factors (predictive studies). Multivariable models express the relationship between predictors and an outcome and are used in both explanatory and predictive studies. Model development demands a strict methodology and a clear reporting to assess its reliability. In this methodological descriptive review, we critically assessed the reporting and methodology of multivariable analysis used in observational prognostic studies published in anesthesiology journals. A systematic search was conducted on Medline through Web of Knowledge, PubMed, and journal websites to identify observational prognostic studies with multivariable analysis published in Anesthesiology, Anesthesia & Analgesia, British Journal of Anaesthesia, and Anaesthesia in 2010 and 2011. Data were extracted by 2 independent readers. First, studies were analyzed with respect to reporting of outcomes, design, size, methods of analysis, model performance (discrimination and calibration), model validation, clinical usefulness, and STROBE (i.e., Strengthening the Reporting of Observational Studies in Epidemiology) checklist. A reporting rate was calculated on the basis of 21 items of the aforementioned points. Second, they were analyzed with respect to some predefined methodological points. Eighty-six studies were included: 87.2% were explanatory and 80.2% investigated a postoperative event. The reporting was fairly good, with a median reporting rate of 79% (75% in explanatory studies and 100% in predictive studies). 
Six items had a reporting rate <36% (i.e., the 25th percentile), with some of them not identified in the STROBE checklist: blinded evaluation of the outcome (11.9%), reason for sample size (15.1%), handling of missing data (36.0%), assessment of collinearity (17.4%), assessment of interactions (13.9%), and calibration (34.9%). When reported, several methodological shortcomings were observed in both explanatory and predictive studies, such as an insufficient number of events of the outcome (44.6%), exclusion of cases with missing data (93.6%), or categorization of continuous variables (65.1%). The reporting of multivariable analysis was fairly good and could be further improved by consulting reporting guidelines and the EQUATOR Network website. Limiting the number of candidate variables, including cases with missing data, and not arbitrarily categorizing continuous variables should be encouraged.
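The 21-item reporting rate used in the review above is a simple proportion of checklist items reported per study; a minimal sketch (the item names below are hypothetical placeholders, not the review's actual checklist):

```python
# Reporting rate: share of checklist items a study reports, as a percentage.
# The 21 items below are illustrative placeholders, not the review's exact list.
CHECKLIST = [
    "outcome_definition", "blinded_outcome_evaluation", "sample_size_rationale",
    "missing_data_handling", "collinearity_assessment", "interaction_assessment",
    "calibration", "discrimination", "model_validation",
] + [f"item_{i}" for i in range(10, 22)]  # pad to 21 items total

def reporting_rate(reported_items):
    """Percentage of the 21 checklist items present in `reported_items`."""
    reported = sum(1 for item in CHECKLIST if item in reported_items)
    return round(100.0 * reported / len(CHECKLIST), 1)
```

A study reporting 16 of the 21 items would score `reporting_rate(set(CHECKLIST[:16]))`, i.e. 76.2%.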
Poverty in Latin America: A Critical Analysis of Three Studies.
ERIC Educational Resources Information Center
Boltvinik, Julio
1996-01-01
Critically evaluates the methodologies used in three recent studies on poverty in Latin America. Maintains that some studies measure the relative nature of nutritional poverty while others record the absolute nature of nutritional poverty (physical survival). Includes a comparative analysis of the studies' results. (MJP)
An Analysis of Methods Used to Examine Gender Differences in Computer-Related Behavior.
ERIC Educational Resources Information Center
Kay, Robin
1992-01-01
Review of research investigating gender differences in computer-related behavior examines statistical and methodological flaws. Issues addressed include sample selection, sample size, scale development, scale quality, the use of univariate and multivariate analyses, regression analysis, construct definition, construct testing, and the…
Carlsson, Ing-Marie; Blomqvist, Marjut; Jormfeldt, Henrika
2017-01-01
Undertaking research studies in the field of mental health is essential in mental health nursing. Qualitative research methodologies enable human experiences to become visible and recognize the importance of lived experiences. This paper argues that involving people with schizophrenia in research is critical to promote their health and well-being. The quality of qualitative research needs scrutinizing according to methodological issues such as trustworthiness and ethical standards that are a fundamental part of qualitative research and nursing curricula. The aim of this study was to critically review recent qualitative studies involving people with severe and persistent mental illness such as schizophrenia and other psychotic conditions, regarding descriptions of ethical and methodological issues in data collection and analysis. A search for relevant papers was conducted in three electronic databases, in December 2016. Fifteen qualitative interview studies were included and reviewed regarding methodological issues related to ethics, and data collection and analysis. The results revealed insufficient descriptions of methodology regarding ethical considerations and issues related to recruitment and sampling in qualitative interview studies with individuals with severe mental illness, putting trustworthiness at risk despite detailed descriptions of data analysis. Knowledge from the perspective of individuals with their own experience of mental illness is essential. Issues regarding sampling and trustworthiness in qualitative studies involving people with severe mental illness are vital to counteract the stigmatization of mental illness.
Aeroelastic optimization methodology for viscous and turbulent flows
NASA Astrophysics Data System (ADS)
Barcelos Junior, Manuel Nascimento Dias
2007-12-01
In recent years, the development of faster computers and parallel processing allowed the application of high-fidelity analysis methods to the aeroelastic design of aircraft. However, these methods are restricted to the final design verification, mainly due to the computational cost involved in iterative design processes. Therefore, this work is concerned with the creation of a robust and efficient aeroelastic optimization methodology for inviscid, viscous and turbulent flows by using high-fidelity analysis and sensitivity analysis techniques. Most of the research in aeroelastic optimization, for practical reasons, treats the aeroelastic system as a quasi-static inviscid problem. In this work, as a first step toward the creation of a more complete aeroelastic optimization methodology for realistic problems, an analytical sensitivity computation technique was developed and tested for quasi-static aeroelastic viscous and turbulent flow configurations. Viscous and turbulent effects are included by using an averaged discretization of the Navier-Stokes equations, coupled with an eddy viscosity turbulence model. For quasi-static aeroelastic problems, the traditional staggered solution strategy has unsatisfactory performance when applied to cases where there is a strong fluid-structure coupling. Consequently, this work also proposes a solution methodology for aeroelastic and sensitivity analyses of quasi-static problems, which is based on the fixed point of an iterative nonlinear block Gauss-Seidel scheme. The methodology can also be interpreted as the solution of the Schur complement of the aeroelastic and sensitivity analyses linearized systems of equations. The methodologies developed in this work are tested and verified by using realistic aeroelastic systems.
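The block Gauss-Seidel fixed-point idea described in this abstract can be illustrated on a scalar quasi-static aeroelastic analogy (a textbook typical-section model with invented coefficients, not the Navier-Stokes coupling of the thesis): the aerodynamic load is recomputed with the structure frozen, then the structure is re-solved with the load frozen, until the coupled fixed point is reached.

```python
# Toy fluid-structure coupling solved by nonlinear block Gauss-Seidel iteration.
# Structure: K * theta = e * L              (twist from lift moment)
# "Fluid":   L = q * S * (a0 + a1 * theta)  (lift grows with twist)
# All coefficients are illustrative; convergence needs e*q*S*a1 / K < 1.
K, e = 500.0, 0.1          # stiffness, moment arm
q, S = 30.0, 2.0           # dynamic pressure, area
a0, a1 = 0.05, 4.0         # lift-curve coefficients

theta = 0.0                # initial twist guess
for _ in range(60):
    lift = q * S * (a0 + a1 * theta)   # fluid "solve" with frozen structure
    theta = e * lift / K               # structural "solve" with frozen load

# Analytical fixed point of the coupled system, for comparison:
theta_exact = (e * q * S * a0 / K) / (1.0 - e * q * S * a1 / K)
```

The iteration contracts by the coupling factor e·q·S·a1/K per sweep, which is why the staggered strategy degrades as fluid-structure coupling strengthens.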
Leavesley, Silas J; Sweat, Brenner; Abbott, Caitlyn; Favreau, Peter; Rich, Thomas C
2018-01-01
Spectral imaging technologies have been used for many years by the remote sensing community. More recently, these approaches have been applied to biomedical problems, where they have shown great promise. However, biomedical spectral imaging has been complicated by the high variance of biological data and the reduced ability to construct test scenarios with fixed ground truths. Hence, it has been difficult to objectively assess and compare biomedical spectral imaging assays and technologies. Here, we present a standardized methodology that allows assessment of the performance of biomedical spectral imaging equipment, assays, and analysis algorithms. This methodology incorporates real experimental data and a theoretical sensitivity analysis, preserving the variability present in biomedical image data. We demonstrate that this approach can be applied in several ways: to compare the effectiveness of spectral analysis algorithms, to compare the response of different imaging platforms, and to assess the level of target signature required to achieve a desired performance. Results indicate that it is possible to compare even very different hardware platforms using this methodology. Future applications could include a range of optimization tasks, such as maximizing detection sensitivity or acquisition speed, providing high utility for investigators ranging from design engineers to biomedical scientists. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
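One spectral analysis algorithm that a performance-assessment framework like the one above might score against a known ground truth is linear unmixing; a minimal sketch with invented endmember spectra (not data from the paper):

```python
import numpy as np

# Columns of E are endmember spectra sampled at four wavelengths (invented values).
E = np.array([
    [1.00, 0.20],
    [0.60, 0.50],
    [0.10, 0.90],
    [0.05, 0.40],
])
abundances_true = np.array([0.7, 0.3])   # ground-truth mixture fractions
measured = E @ abundances_true           # noiseless measured pixel spectrum

# Linear unmixing: least-squares estimate of abundances from the measurement.
abundances_est, *_ = np.linalg.lstsq(E, measured, rcond=None)
```

With experimental spectra one would add noise drawn from the measured variance and report recovery error as the performance metric.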
Simulation of Attacks for Security in Wireless Sensor Network.
Diaz, Alvaro; Sanchez, Pablo
2016-11-18
The increasing complexity and low-power constraints of current Wireless Sensor Networks (WSN) require efficient methodologies for network simulation and embedded software performance analysis of nodes. In addition, security is also a very important feature that has to be addressed in most WSNs, since they may work with sensitive data and operate in hostile unattended environments. In this paper, a methodology for security analysis of Wireless Sensor Networks is presented. The methodology allows designing attack-aware embedded software/firmware or attack countermeasures to provide security in WSNs. The proposed methodology includes attacker modeling and attack simulation with performance analysis (node's software execution time and power consumption estimation). After an analysis of different WSN attack types, an attacker model is proposed. This model defines three different types of attackers that can emulate most WSN attacks. In addition, this paper presents a virtual platform that is able to model the node hardware, embedded software and basic wireless channel features. This virtual simulation analyzes the embedded software behavior and node power consumption while it takes into account the network deployment and topology. Additionally, this simulator integrates the previously mentioned attacker model. Thus, the impact of attacks on power consumption and software behavior/execution-time can be analyzed. This provides developers with essential information about the effects that one or multiple attacks could have on the network, helping them to develop more secure WSN systems. This WSN attack simulator is an essential element of the attack-aware embedded software development methodology that is also introduced in this work.
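The kind of output such a simulator produces, the impact of an attack on node power consumption and lifetime, can be mimicked with a toy battery model (all energy figures below are invented, and this is far simpler than the virtual platform described):

```python
# Toy WSN node energy model: a jamming-style attacker forces retransmissions,
# draining the battery faster. All numbers are illustrative only.
def cycles_until_depleted(battery_mj, tx_cost_mj, retransmissions_per_cycle):
    cycles = 0
    cost = tx_cost_mj * (1 + retransmissions_per_cycle)  # energy per duty cycle
    while battery_mj >= cost:
        battery_mj -= cost
        cycles += 1
    return cycles

normal = cycles_until_depleted(1000.0, 2.0, 0)    # no attack
attacked = cycles_until_depleted(1000.0, 2.0, 3)  # attacker triggers 3 resends
```

Comparing the two lifetimes quantifies the attack's energy impact, the same question the paper's simulator answers with full software-execution and channel models.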
Making sense of grounded theory in medical education.
Kennedy, Tara J T; Lingard, Lorelei A
2006-02-01
Grounded theory is a research methodology designed to develop, through collection and analysis of data that is primarily (but not exclusively) qualitative, a well-integrated set of concepts that provide a theoretical explanation of a social phenomenon. This paper aims to provide an introduction to key features of grounded theory methodology within the context of medical education research. In this paper we include a discussion of the origins of grounded theory, a description of key methodological processes, a comment on pitfalls encountered commonly in the application of grounded theory research, and a summary of the strengths of grounded theory methodology with illustrations from the medical education domain. The significant strengths of grounded theory that have resulted in its enduring prominence in qualitative research include its clearly articulated analytical process and its emphasis on the generation of pragmatic theory that is grounded in the data of experience. When applied properly and thoughtfully, grounded theory can address research questions of significant relevance to the domain of medical education.
User-Centered Iterative Design of a Collaborative Virtual Environment
2001-03-01
cognitive task analysis methods to study land navigators. This study was intended to validate the use of user-centered design methodologies for the design of...have explored the cognitive aspects of collaborative human way finding and design for collaborative virtual environments. Further investigation of design paradigms should include cognitive task analysis and behavioral task analysis.
ERIC Educational Resources Information Center
Shintani, Natsuko; Li, Shaofeng; Ellis, Rod
2013-01-01
This article reports a meta-analysis of studies that investigated the relative effectiveness of comprehension-based instruction (CBI) and production-based instruction (PBI). The meta-analysis only included studies that featured a direct comparison of CBI and PBI in order to ensure methodological and statistical robustness. A total of 35 research…
Evaluation of grid generation technologies from an applied perspective
NASA Technical Reports Server (NTRS)
Hufford, Gary S.; Harrand, Vincent J.; Patel, Bhavin C.; Mitchell, Curtis R.
1995-01-01
An analysis of the grid generation process from the point of view of an applied CFD engineer is given. Issues addressed include geometric modeling, structured grid generation, unstructured grid generation, hybrid grid generation and use of virtual parts libraries in large parametric analysis projects. The analysis is geared towards comparing the effective turn around time for specific grid generation and CFD projects. The conclusion was made that a single grid generation methodology is not universally suited for all CFD applications due to both limitations in grid generation and flow solver technology. A new geometric modeling and grid generation tool, CFD-GEOM, is introduced to effectively integrate the geometric modeling process to the various grid generation methodologies including structured, unstructured, and hybrid procedures. The full integration of the geometric modeling and grid generation allows implementation of extremely efficient updating procedures, a necessary requirement for large parametric analysis projects. The concept of using virtual parts libraries in conjunction with hybrid grids for large parametric analysis projects is also introduced to improve the efficiency of the applied CFD engineer.
NASA Astrophysics Data System (ADS)
Gamzina, Diana
2016-03-01
A methodology for performing thermo-mechanical design and analysis of high frequency and high average power vacuum electron devices is presented. This methodology results in a "first-pass" engineering design directly ready for manufacturing. The methodology includes establishment of thermal and mechanical boundary conditions, evaluation of convective film heat transfer coefficients, identification of material options, evaluation of temperature and stress field distributions, assessment of microscale effects on the stress state of the material, and fatigue analysis. The feature size of vacuum electron devices operating in the high frequency regime of 100 GHz to 1 THz is comparable to the microstructure of the materials employed for their fabrication. As a result, the thermo-mechanical performance of a device is affected by the local material microstructure. Such multiscale effects on the stress state are considered in the range of scales from about 10 microns up to a few millimeters. The design and analysis methodology is demonstrated on three separate microwave devices: a 95 GHz 10 kW cw sheet beam klystron, a 263 GHz 50 W long pulse wide-bandwidth sheet beam travelling wave tube, and a 346 GHz 1 W cw backward wave oscillator.
Catastrophic incidents can generate a large number of samples with analytically diverse types including forensic, clinical, environmental, food, and others. Environmental samples include water, wastewater, soil, air, urban building and infrastructure materials, and surface resid...
Access and Ownership in the Academic Environment: One Library's Progress Report.
ERIC Educational Resources Information Center
Brin, Beth; Cochran, Elissa
1994-01-01
Describes the methodology used at the University of Arizona Library to address the issue of access versus ownership of library materials. Topics discussed include participatory management; data collection, including focus groups, interlibrary loan statistics, and graduate research citation analysis; and resulting recommendations, including…
2014-01-01
Background: Digital image analysis has the potential to address issues surrounding traditional histological techniques including a lack of objectivity and high variability, through the application of quantitative analysis. A key initial step in image analysis is the identification of regions of interest. A widely applied methodology is that of segmentation. This paper proposes the application of image analysis techniques to segment skin tissue with varying degrees of histopathological damage. The segmentation of human tissue is challenging as a consequence of the complexity of the tissue structures and inconsistencies in tissue preparation, hence there is a need for a new robust method with the capability to handle the additional challenges materialising from histopathological damage.
Methods: A new algorithm has been developed which combines enhanced colour information, created following a transformation to the L*a*b* colourspace, with general image intensity information. A colour normalisation step is included to enhance the algorithm's robustness to variations in the lighting and staining of the input images. The resulting optimised image is subjected to thresholding and the segmentation is fine-tuned using a combination of morphological processing and object classification rules. The segmentation algorithm was tested on 40 digital images of haematoxylin & eosin (H&E) stained skin biopsies. Accuracy, sensitivity and specificity of the algorithmic procedure were assessed through the comparison of the proposed methodology against manual methods.
Results: Experimental results show the proposed fully automated methodology segments the epidermis with a mean specificity of 97.7%, a mean sensitivity of 89.4% and a mean accuracy of 96.5%. When a simple user interaction step is included, the specificity increases to 98.0%, the sensitivity to 91.0% and the accuracy to 96.8%. The algorithm segments effectively for different severities of tissue damage.
Conclusions: Epidermal segmentation is a crucial first step in a range of applications including melanoma detection and the assessment of histopathological damage in skin. The proposed methodology is able to segment the epidermis with different levels of histological damage. The basic method framework could be applied to segmentation of other epithelial tissues. PMID:24521154
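The thresholding step in a pipeline like this is often implemented with Otsu's method (a standard choice for illustration; the paper's exact optimisation and colourspace handling are not reproduced here). A stdlib-only sketch on 8-bit intensities:

```python
# Otsu's method: pick the threshold maximising between-class variance of an
# 8-bit intensity histogram. Illustrative, not the paper's exact algorithm.
def otsu_threshold(pixels):
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    sum_b, w_b = 0.0, 0
    best_t, best_var = 0, -1.0
    for t in range(256):
        w_b += hist[t]                 # background weight up to t
        if w_b == 0:
            continue
        w_f = total - w_b              # foreground weight
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b              # background mean
        m_f = (sum_all - sum_b) / w_f  # foreground mean
        var_between = w_b * w_f * (m_b - m_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Synthetic bimodal image: dark "dermis" and bright "epidermis" clusters.
pixels = [30] * 500 + [35] * 300 + [180] * 400 + [190] * 200
threshold = otsu_threshold(pixels)
```

The returned threshold falls between the two intensity clusters, splitting the synthetic image into its two tissue classes.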
Treves-Kagan, Sarah; Naidoo, Evasen; Gilvydis, Jennifer M; Raphela, Elsie; Barnhart, Scott; Lippman, Sheri A
2017-09-01
Successful HIV prevention programming requires engaging communities in the planning process and responding to the social environmental factors that shape health and behaviour in a specific local context. We conducted two community-based situational analyses to inform a large, comprehensive HIV prevention programme in two rural districts of North West Province, South Africa in 2012. The methodology includes: initial partnership building, goal setting and background research; one week of fieldwork; in-field and subsequent data analysis; and community dissemination and programmatic incorporation of results. We describe the methodology and a case study of the approach in rural South Africa; assess if the methodology generated data with sufficient saturation, breadth and utility for programming purposes; and evaluate if this process successfully engaged the community. Between the two sites, 87 men and 105 women consented to in-depth interviews; 17 focus groups were conducted; and 13 health facilities and 7 NGOs were assessed. The methodology succeeded in quickly collecting high-quality data relevant to tailoring a comprehensive HIV programme and created a strong foundation for community engagement and integration with local health services. This methodology can be an accessible tool in guiding community engagement and tailoring future combination HIV prevention and care programmes.
Goedecke, Thomas; Morales, Daniel R; Pacurariu, Alexandra; Kurz, Xavier
2018-03-01
Evaluating the public health impact of regulatory interventions is important but there is currently no common methodological approach to guide this evaluation. This systematic review provides a descriptive overview of the analytical methods for impact research. We searched MEDLINE and EMBASE for articles with an empirical analysis evaluating the impact of European Union or non-European Union regulatory actions to safeguard public health published until March 2017. References from systematic reviews and articles from other known sources were added. Regulatory interventions, data sources, outcomes of interest, methodology and key findings were extracted. From 1246 screened articles, 229 were eligible for full-text review and 153 articles in English language were included in the descriptive analysis. Over a third of articles studied analgesics and antidepressants. Interventions most frequently evaluated are regulatory safety communications (28.8%), black box warnings (23.5%) and direct healthcare professional communications (10.5%); 55% of studies measured changes in drug utilization patterns, 27% evaluated health outcomes, and 18% targeted knowledge, behaviour or changes in clinical practice. Unintended consequences like switching therapies or spill-over effects were rarely evaluated. Two-thirds used before-after time series and 15.7% before-after cross-sectional study designs. Various analytical approaches were applied including interrupted time series regression (31.4%), simple descriptive analysis (28.8%) and descriptive analysis with significance tests (23.5%). Whilst impact evaluation of pharmacovigilance and product-specific regulatory interventions is increasing, the marked heterogeneity in study conduct and reporting highlights the need for scientific guidance to ensure robust methodologies are applied and systematic dissemination of results occurs. © 2017 The Authors. 
British Journal of Clinical Pharmacology published by John Wiley & Sons Ltd on behalf of British Pharmacological Society.
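Interrupted time series regression, the most frequent analytical approach reported in the review above, fits a pre-intervention level and trend plus a post-intervention level change and trend change; a minimal noiseless sketch with an invented drug-utilisation series:

```python
import numpy as np

n, t0 = 24, 12                       # 24 monthly points, intervention at month 12
t = np.arange(n, dtype=float)
post = (t >= t0).astype(float)       # indicator: after the intervention
t_since = np.where(t >= t0, t - t0, 0.0)

# Simulated utilisation: baseline 100, pre-trend +0.5/month,
# immediate level drop of 10 at the intervention, unchanged slope.
y = 100.0 + 0.5 * t - 10.0 * post + 0.0 * t_since

# Segmented regression via ordinary least squares.
X = np.column_stack([np.ones(n), t, post, t_since])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
# beta = [baseline, pre-trend, level change, trend change]
```

With real data one would additionally model autocorrelation and seasonality; the design matrix above captures only the segmented structure itself.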
Methodology for cost analysis of film-based and filmless portable chest systems
NASA Astrophysics Data System (ADS)
Melson, David L.; Gauvain, Karen M.; Beardslee, Brian M.; Kraitsik, Michael J.; Burton, Larry; Blaine, G. James; Brink, Gary S.
1996-05-01
Many studies analyzing the costs of film-based and filmless radiology have focused on multi-modality, hospital-wide solutions. Yet due to the enormous cost of converting an entire large radiology department or hospital to a filmless environment all at once, institutions often choose to eliminate film one area at a time. Narrowing the focus of cost-analysis may be useful in making such decisions. This presentation will outline a methodology for analyzing the cost per exam of film-based and filmless solutions for providing portable chest exams to Intensive Care Units (ICUs). The methodology, unlike most in the literature, is based on parallel data collection from existing filmless and film-based ICUs, and is currently being utilized at our institution. Direct costs, taken from the perspective of the hospital, for portable computed radiography chest exams in one filmless and two film-based ICUs are identified. The major cost components are labor, equipment, materials, and storage. Methods for gathering and analyzing each of the cost components are discussed, including FTE-based and time-based labor analysis, incorporation of equipment depreciation, lease, and maintenance costs, and estimation of materials costs. Extrapolation of data from three ICUs to model hypothetical, hospital-wide film-based and filmless ICU imaging systems is described. Performance of sensitivity analysis on the filmless model to assess the impact of anticipated reductions in specific labor, equipment, and archiving costs is detailed. A number of indirect costs, which are not explicitly included in the analysis, are identified and discussed.
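The per-exam cost decomposition described (labor, equipment depreciation and maintenance, materials, storage) reduces to simple arithmetic; a sketch with invented figures, not data from the study:

```python
# Cost per portable chest exam = annual fixed costs / volume + variable costs.
# Every number below is a made-up placeholder, not data from the study.
def cost_per_exam(annual_labor, equipment_price, equipment_life_years,
                  annual_maintenance, materials_per_exam, storage_per_exam,
                  annual_exam_volume):
    depreciation = equipment_price / equipment_life_years  # straight-line
    fixed = annual_labor + depreciation + annual_maintenance
    return fixed / annual_exam_volume + materials_per_exam + storage_per_exam

filmless = cost_per_exam(annual_labor=120_000.0, equipment_price=200_000.0,
                         equipment_life_years=5, annual_maintenance=10_000.0,
                         materials_per_exam=0.50, storage_per_exam=0.40,
                         annual_exam_volume=20_000)
```

Sensitivity analysis of the kind the paper describes amounts to re-evaluating this function while varying one input (e.g. labor or archiving cost) at a time.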
A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis
Guardia, Gabriela D. A.; Pires, Luís Ferreira; Vêncio, Ricardo Z. N.; Malmegrim, Kelen C. R.; de Farias, Cléver R. G.
2015-01-01
Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNASeq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis. PMID:26207740
The Stanford Prison Experiment in Introductory Psychology Textbooks: A Content Analysis
ERIC Educational Resources Information Center
Bartels, Jared M.
2015-01-01
The present content analysis examines the coverage of theoretical and methodological problems with the Stanford prison experiment (SPE) in a sample of introductory psychology textbooks. Categories included the interpretation and replication of the study, variance in guard behavior, participant selection bias, the presence of demand characteristics…
Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Eguía, Pablo; Collazo, Joaquín
2010-01-01
The objective of this study was to develop a methodology for the determination of the maximum sampling error and confidence intervals of thermal properties obtained from thermogravimetric analysis (TG), including moisture, volatile matter, fixed carbon and ash content. The sampling procedure of the TG analysis was of particular interest and was conducted with care. The results of the present study were compared to those of a prompt analysis, and a correlation between the mean values and maximum sampling errors of the methods were not observed. In general, low and acceptable levels of uncertainty and error were obtained, demonstrating that the properties evaluated by TG analysis were representative of the overall fuel composition. The accurate determination of the thermal properties of biomass with precise confidence intervals is of particular interest in energetic biomass applications. PMID:20717532
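The confidence interval of a mean thermal property from replicate TG runs follows the usual t-based formula; a stdlib sketch with invented ash-content replicates (the t critical value for 4 degrees of freedom at 95% two-sided is 2.776):

```python
import math
import statistics

# Five replicate ash-content measurements in wt% (invented for illustration).
ash = [4.8, 5.1, 5.0, 4.9, 5.2]

mean = statistics.mean(ash)
s = statistics.stdev(ash)       # sample standard deviation (n - 1 denominator)
t_crit = 2.776                  # t(0.975, df = 4)
half_width = t_crit * s / math.sqrt(len(ash))
# 95% CI for the mean ash content: (mean - half_width, mean + half_width)
```

The half-width here plays the role of the maximum sampling error the study quantifies: it bounds how far the replicate mean can plausibly sit from the true property value.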
LC-MS based analysis of endogenous steroid hormones in human hair.
Gao, Wei; Kirschbaum, Clemens; Grass, Juliane; Stalder, Tobias
2016-09-01
The quantification of endogenous steroid hormone concentrations in hair is increasingly used as a method for obtaining retrospective information on long-term integrated hormone exposure. Several different analytical procedures have been employed for hair steroid analysis, with liquid chromatography-mass spectrometry (LC-MS) being recognized as a particularly powerful analytical tool. Several methodological aspects affect the performance of LC-MS systems for hair steroid analysis, including sample preparation and pretreatment, steroid extraction, post-incubation purification, LC methodology, ionization techniques and MS specifications. Here, we critically review the differential value of such protocol variants for hair steroid hormones analysis, focusing on both analytical quality and practical feasibility issues. Our results show that, when methodological challenges are adequately addressed, LC-MS protocols can not only yield excellent sensitivity and specificity but are also characterized by relatively simple sample processing and short run times. This makes LC-MS based hair steroid protocols particularly suitable as a high-quality option for routine application in research contexts requiring the processing of larger numbers of samples. Copyright © 2016 Elsevier Ltd. All rights reserved.
Coupled structural/thermal/electromagnetic analysis/tailoring of graded composite structures
NASA Technical Reports Server (NTRS)
Mcknight, R. L.; Chen, P. C.; Dame, L. T.; Huang, H.
1992-01-01
Accomplishments are described for the first year effort of a 5-year program to develop a methodology for coupled structural/thermal/electromagnetic analysis/tailoring of graded composite structures. These accomplishments include: (1) the results of the selective literature survey; (2) 8-, 16-, and 20-noded isoparametric plate and shell elements; (3) large deformation structural analysis; (4) eigenanalysis; (5) anisotropic heat transfer analysis; and (6) anisotropic electromagnetic analysis.
NASA Astrophysics Data System (ADS)
McKinney, D. C.; Cuellar, A. D.
2015-12-01
Climate change has accelerated glacial retreat in high altitude glaciated regions of Nepal leading to the growth and formation of glacier lakes. Glacial lake outburst floods (GLOF) are sudden events triggered by an earthquake, moraine failure or other shock that causes a sudden outflow of water. These floods are catastrophic because of their sudden onset, the difficulty of predicting them, and the enormous quantity of water and debris rapidly flooding downstream areas. Imja Lake in the Himalaya of Nepal has experienced accelerated growth since it first appeared in the 1960s. Communities threatened by a flood from Imja Lake have advocated for projects to adapt to the increasing threat of a GLOF. Nonetheless, discussions surrounding projects for Imja have not included a rigorous analysis of the potential consequences of a flood, probability of an event, or costs of mitigation projects in part because this information is unknown or uncertain. This work presents a demonstration of a decision making methodology developed to rationally analyze the risks posed by Imja Lake and the various adaptation projects proposed using available information. In this work the authors use decision analysis, data envelopment analysis (DEA), and sensitivity analysis to assess proposed adaptation measures that would mitigate damage in downstream communities from a GLOF. We use an existing hydrodynamic model of the at-risk area to determine how adaptation projects will affect downstream flooding and estimate fatalities using an empirical method developed for dam failures. The DEA methodology allows us to estimate the value of a statistical life implied by each project given the cost of the project and number of lives saved to determine which project is the most efficient. In contrast, the decision analysis methodology requires fatalities to be assigned a cost but allows the inclusion of uncertainty in the decision making process.
We compare the output of these two methodologies and determine the sensitivity of the conclusions to changes in uncertain input parameters including project cost, value of a statistical life, and time to a GLOF event.
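The two approaches contrasted above can be illustrated with a minimal sketch: ranking projects by the implied value of a statistical life (cost per life saved, the DEA-style efficiency view) versus an expected-cost calculation under an uncertain GLOF probability (the decision analysis view). All project names, costs, fatality counts, and probabilities below are hypothetical, not figures from the study.

```python
# Illustrative sketch (not the authors' model). Two ways to compare
# hypothetical GLOF adaptation projects:
#   (a) implied value of a statistical life: cost per life saved
#   (b) expected total cost given a GLOF probability and a monetized VSL

projects = {
    "lake_lowering": {"cost": 3.0e6, "lives_saved": 120},
    "early_warning": {"cost": 0.5e6, "lives_saved": 40},
}

def implied_vsl(cost, lives_saved):
    """Cost per statistical life saved; lower means more efficient."""
    return cost / lives_saved

def expected_cost(cost, lives_saved, p_glof, vsl, baseline_fatalities=150):
    """Project cost plus expected monetized residual fatalities."""
    residual = baseline_fatalities - lives_saved
    return cost + p_glof * residual * vsl

# Efficiency ranking: project with the lowest cost per life saved.
best = min(projects, key=lambda k: implied_vsl(**projects[k]))
```

The efficiency view needs no monetized fatalities but ignores event probability; the expected-cost view requires both a VSL and a GLOF probability, which is where sensitivity analysis on uncertain inputs enters.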
Optimal use of human and machine resources for Space Station assembly operations
NASA Technical Reports Server (NTRS)
Parrish, Joseph C.
1988-01-01
This paper investigates the issues involved in determining the best mix of human and machine resources for assembly of the Space Station. It presents the current Station assembly sequence, along with descriptions of the available assembly resources. A number of methodologies for optimizing the human/machine tradeoff problem have been developed, but Space Station assembly presents some unique issues that have not yet been addressed, including a strong constraint on available EVA time for early flights and a phased deployment of assembly resources over time. A methodology for applying the previously developed decision methods to the special case of the Space Station is presented. This methodology emphasizes an application of multiple qualitative and quantitative techniques, including simulation and decision analysis, for producing an objective, robust solution to the tradeoff problem.
ERIC Educational Resources Information Center
Shire, Stephanie Yoshiko; Kasari, Connie
2014-01-01
This systematic review examines train the trainer (TTT) effectiveness trials of behavioral interventions for individuals with autism spectrum disorder (ASD). Published methodological quality scales were used to assess studies including participant description, research design, intervention, outcomes, and analysis. Twelve studies including 9 weak…
How to conduct a qualitative meta-analysis: Tailoring methods to enhance methodological integrity.
Levitt, Heidi M
2018-05-01
Although qualitative research has long been of interest in the field of psychology, meta-analyses of qualitative literatures (sometimes called meta-syntheses) are still quite rare. Like quantitative meta-analyses, these methods function to aggregate findings and identify patterns across primary studies, but their aims, procedures, and methodological considerations may vary. This paper explains the function of qualitative meta-analyses and their methodological development. Recommendations have broad relevance but are framed with an eye toward their use in psychotherapy research. Rather than arguing for the adoption of any single meta-method, this paper advocates for considering how procedures can best be selected and adapted to enhance a meta-study's methodological integrity. Throughout the paper, recommendations are provided to help researchers identify procedures that can best serve their studies' specific goals. Meta-analysts are encouraged to consider the methodological integrity of their studies in relation to central research processes, including identifying a set of primary research studies, transforming primary findings into initial units of data for a meta-analysis, developing categories or themes, and communicating findings. The paper provides guidance for researchers who desire to tailor meta-analytic methods to meet their particular goals while enhancing the rigor of their research.
An engineering methodology for implementing and testing VLSI (Very Large Scale Integrated) circuits
NASA Astrophysics Data System (ADS)
Corliss, Walter F., II
1989-03-01
The engineering methodology for producing a fully tested VLSI chip from a design layout is presented. A 16-bit correlator, NPS CORN88, that was previously designed, was used as a vehicle to demonstrate this methodology. The study of the design and simulation tools, MAGIC and MOSSIM II, was the focus of the design and validation process. The design was then implemented and the chip was fabricated by MOSIS. This fabricated chip was then used to develop a testing methodology for using the digital test facilities at NPS. NPS CORN88 was the first full custom VLSI chip, designed at NPS, to be tested with the NPS digital analysis system, Tektronix DAS 9100 series tester. The capabilities and limitations of these test facilities are examined. NPS CORN88 test results are included to demonstrate the capabilities of the digital test system. A translator, MOS2DAS, was developed to convert the MOSSIM II simulation program to the input files required by the DAS 9100 device verification software, 91DVS. Finally, a tutorial for using the digital test facilities, including the DAS 9100 and associated support equipment, is included as an appendix.
System architectures for telerobotic research
NASA Technical Reports Server (NTRS)
Harrison, F. Wallace
1989-01-01
Several activities are performed related to the definition and creation of telerobotic systems. The effort and investment required to create architectures for these complex systems can be enormous; however, the magnitude of the process can be reduced if structured design techniques are applied. A number of informal methodologies supporting certain aspects of the design process are available. More recently, prototypes of integrated tools supporting all phases of system design, from requirements analysis to code generation and hardware layout, have begun to appear. Activities related to system architecture of telerobots are described, including current activities which are designed to provide a methodology for the comparison and quantitative analysis of alternative system architectures.
ERIC Educational Resources Information Center
Sajavaara, Kari, Ed.; Lehtonen, Jaakko, Ed.
The following papers and reports are included: (1) "Prisoners of Code-Centred Privacy: Reflections on Contrastive Analysis and Related Disciplines" by Kari Sajavaara and Jaakko Lehtonen; (2) "The Methodology and Practice of Contrastive Discourse Analysis" by Sajavaara, Lehtonen, and Liisa Korpimies; (3) "Interactional Activities in Discourse…
Oxlade, Olivia; Pinto, Marcia; Trajman, Anete; Menzies, Dick
2013-01-01
Introduction Cost effectiveness analyses (CEA) can provide useful information on how to invest limited funds, however they are less useful if different analyses of the same intervention provide unclear or contradictory results. The objective of our study was to conduct a systematic review of methodologic aspects of CEA that evaluate Interferon Gamma Release Assays (IGRA) for the detection of Latent Tuberculosis Infection (LTBI), in order to understand how differences affect study results. Methods A systematic review of studies was conducted with particular focus on study quality and the variability in inputs used in models used to assess cost-effectiveness. A common decision analysis model of the IGRA versus Tuberculin Skin Test (TST) screening strategy was developed and used to quantify the impact on predicted results of observed differences of model inputs taken from the studies identified. Results Thirteen studies were ultimately included in the review. Several specific methodologic issues were identified across studies, including how study inputs were selected, inconsistencies in the costing approach, the utility of the QALY (Quality Adjusted Life Year) as the effectiveness outcome, and how authors choose to present and interpret study results. When the IGRA versus TST test strategies were compared using our common decision analysis model, predicted effectiveness largely overlapped. Implications Many methodologic issues that contribute to inconsistent results and reduced study quality were identified in studies that assessed the cost-effectiveness of the IGRA test. More specific and relevant guidelines are needed in order to help authors standardize modelling approaches, inputs, assumptions and how results are presented and interpreted. PMID:23505412
Factors and competitiveness analysis in rare earth mining, new methodology: case study from Brazil.
Silva, Gustavo A; Petter, Carlos O; Albuquerque, Nelson R
2018-03-01
Rare earths are increasingly being applied in high-tech industries, such as green energy (e.g. wind power), hybrid cars, electric cars, permanent high-performance magnets, superconductors, luminophores and many other industrial sectors involved in modern technologies. Given that China dominates this market and imposes restrictions on production and exports whenever opportunities arise, it is becoming more and more challenging to develop business ventures in this sector. Several initiatives were taken to prospect new resources and develop the production chain, including the mining of these mineral assets around the world, but some factors of uncertainty, including current low prices, increased the challenge of transforming the current resources into deposits or productive mines. Thus, analyzing the competitiveness of advanced projects becomes indispensable. This work has the objective of introducing a new methodology of competitiveness analysis, where some variables are considered as main factors that can strongly contribute to making a rare earth element (REE) mining enterprise unfeasible. With this methodology, which is quite practical and reproducible, it was possible to verify some real facts, such as: the Lynas Mount Weld CLD (AUS) Project is resilient to the uncertainties of the RE sector, while the Molycorp Project is facing major financial difficulties (under judicial reorganization). It was also possible to verify that the Araxá Project of CBMM in Brazil is one of the most competitive in this country. Thus, we contribute to the existing literature, providing a new methodology for competitiveness analysis in rare earth mining.
Mollison, Daisy; Sellar, Robin; Bastin, Mark; Mollison, Denis; Chandran, Siddharthan; Wardlaw, Joanna; Connick, Peter
2017-01-01
Moderate correlation exists between the imaging quantification of brain white matter lesions and cognitive performance in people with multiple sclerosis (MS). This may reflect the greater importance of other features, including subvisible pathology, or methodological limitations of the primary literature. To summarise the cognitive clinico-radiological paradox and explore the potential methodological factors that could influence the assessment of this relationship. Systematic review and meta-analysis of primary research relating cognitive function to white matter lesion burden. Fifty papers met eligibility criteria for review, and meta-analysis of overall results was possible in thirty-two (2050 participants). Aggregate correlation between cognition and T2 lesion burden was r = -0.30 (95% confidence interval: -0.34, -0.26). Wide methodological variability was seen, particularly related to key factors in the cognitive data capture and image analysis techniques. Resolving the persistent clinico-radiological paradox will likely require simultaneous evaluation of multiple components of the complex pathology using optimum measurement techniques for both cognitive and MRI feature quantification. We recommend a consensus initiative to support common standards for image analysis in MS, enabling benchmarking while also supporting ongoing innovation.
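An aggregate correlation such as the r = -0.30 reported above is typically obtained by pooling per-study correlations on Fisher's z scale with inverse-variance weights, then back-transforming. The sketch below shows that standard fixed-effect pooling step under hypothetical study values; the review's own meta-analytic model (e.g. random-effects) may differ.

```python
import math

# Illustrative sketch: pooling correlation coefficients across studies
# via Fisher's z-transform with inverse-variance weights (n - 3 per study).
# The (r, n) pairs below are hypothetical, not the review's data.

studies = [(-0.25, 60), (-0.35, 120), (-0.30, 90)]

def fisher_z(r):
    """Fisher's variance-stabilizing transform of a correlation."""
    return 0.5 * math.log((1 + r) / (1 - r))

def pooled_r(studies):
    """Inverse-variance weighted mean on the z scale, back-transformed."""
    num = sum(fisher_z(r) * (n - 3) for r, n in studies)
    den = sum(n - 3 for r, n in studies)
    return math.tanh(num / den)
```

A random-effects version would add a between-study variance component to each weight, which widens the confidence interval when studies are heterogeneous, as the wide methodological variability noted above suggests they are.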
Explosion/Blast Dynamics for Constellation Launch Vehicles Assessment
NASA Technical Reports Server (NTRS)
Baer, Mel; Crawford, Dave; Hickox, Charles; Kipp, Marlin; Hertel, Gene; Morgan, Hal; Ratzel, Arthur; Cragg, Clinton H.
2009-01-01
An assessment methodology is developed to guide quantitative predictions of adverse physical environments and the subsequent effects on the Ares-1 crew launch vehicle associated with the loss of containment of cryogenic liquid propellants from the upper stage during ascent. Development of the methodology is led by a team at Sandia National Laboratories (SNL) with guidance and support from a number of National Aeronautics and Space Administration (NASA) personnel. The methodology is based on the current Ares-1 design and feasible accident scenarios. These scenarios address containment failure from debris impact or structural response to pressure or blast loading from an external source. Once containment is breached, the envisioned assessment methodology includes predictions for the sequence of physical processes stemming from cryogenic tank failure. The investigative techniques, analysis paths, and numerical simulations that comprise the proposed methodology are summarized and appropriate simulation software is identified in this report.
A Methodology for Anatomic Ultrasound Image Diagnostic Quality Assessment.
Hemmsen, Martin Christian; Lange, Theis; Brandt, Andreas Hjelm; Nielsen, Michael Bachmann; Jensen, Jorgen Arendt
2017-01-01
This paper discusses the methods for the assessment of ultrasound image quality based on our experiences with evaluating new methods for anatomic imaging. It presents a methodology to ensure a fair assessment between competing imaging methods using clinically relevant evaluations. The methodology is valuable in the continuing process of method optimization and guided development of new imaging methods. It includes a three-phased study plan covering initial prototype development through clinical assessment. Recommendations for the clinical assessment protocol, software, and statistical analysis are presented. Earlier uses of the methodology have shown that it ensures validity of the assessment, as it separates the influences between developer, investigator, and assessor once a research protocol has been established. This separation reduces confounding influences on the result from the developer to properly reveal the clinical value. This paper exemplifies the methodology using recent studies of synthetic aperture sequential beamforming tissue harmonic imaging.
Multivariate Methods for Meta-Analysis of Genetic Association Studies.
Dimou, Niki L; Pantavou, Katerina G; Braliou, Georgia G; Bagos, Pantelis G
2018-01-01
Multivariate meta-analysis of genetic association studies and genome-wide association studies has received remarkable attention as it improves the precision of the analysis. Here, we review, summarize and present in a unified framework methods for multivariate meta-analysis of genetic association studies and genome-wide association studies. Starting with the statistical methods used for robust analysis and genetic model selection, we present in brief univariate methods for meta-analysis and we then scrutinize multivariate methodologies. Multivariate models of meta-analysis for single gene-disease association studies, including models for haplotype association studies, multiple linked polymorphisms and multiple outcomes, are discussed. The popular Mendelian randomization approach and special cases of meta-analysis addressing issues such as the assumption of the mode of inheritance, deviation from Hardy-Weinberg Equilibrium and gene-environment interactions are also presented. All available methods are enriched with practical applications, and methodologies that could be developed in the future are discussed. Links for all available software implementing multivariate meta-analysis methods are also provided.
Saltaji, Humam; Armijo-Olivo, Susan; Cummings, Greta G; Amin, Maryam; Flores-Mir, Carlos
2014-01-01
Introduction It is fundamental that randomised controlled trials (RCTs) are properly conducted in order to reach well-supported conclusions. However, there is emerging evidence that RCTs are subject to biases which can overestimate or underestimate the true treatment effect, due to flaws in the study design characteristics of such trials. The extent to which this holds true in oral health RCTs, which have some unique design characteristics compared to RCTs in other health fields, is unclear. As such, we aim to examine the empirical evidence quantifying the extent of bias associated with methodological and non-methodological characteristics in oral health RCTs. Methods and analysis We plan to perform a meta-epidemiological study, where a sample size of 60 meta-analyses (MAs) including approximately 600 RCTs will be selected. The MAs will be randomly obtained from the Oral Health Database of Systematic Reviews using a random number table; and will be considered for inclusion if they include a minimum of five RCTs, and examine a therapeutic intervention related to one of the recognised dental specialties. RCTs identified in selected MAs will be subsequently included if their study design includes a comparison between an intervention group and a placebo group or another intervention group. Data will be extracted from selected trials included in MAs based on a number of methodological and non-methodological characteristics. Moreover, the risk of bias will be assessed using the Cochrane Risk of Bias tool. Effect size estimates and measures of variability for the main outcome will be extracted from each RCT included in selected MAs, and a two-level analysis will be conducted using a meta-meta-analytic approach with a random effects model to allow for intra-MA and inter-MA heterogeneity. Ethics and dissemination The intended audiences of the findings will include dental clinicians, oral health researchers, policymakers and graduate students. 
The aforementioned will be introduced to the findings through workshops, seminars, round table discussions and targeted individual meetings. Other opportunities for knowledge transfer will be pursued such as key dental conferences. Finally, the results will be published as a scientific report in a dental peer-reviewed journal. PMID:24568962
NASA Technical Reports Server (NTRS)
Leininger, G.; Jutila, S.; King, J.; Muraco, W.; Hansell, J.; Lindeen, J.; Franckowiak, E.; Flaschner, A.
1975-01-01
A methodology is described for the evaluation of societal impacts associated with the implementation of a new technology. Theoretical foundations for the methodology, called the total assessment profile, are established from both the economic and social science perspectives. The procedure provides for accountability of nonquantifiable factors and measures through the use of a comparative value matrix by assessing the impacts of the technology on the value system of the society.
Carayon, Pascale; Li, Yaqiong; Kelly, Michelle M.; DuBenske, Lori L.; Xie, Anping; McCabe, Brenna; Orne, Jason; Cox, Elizabeth D.
2014-01-01
Human factors and ergonomics methods are needed to redesign healthcare processes and support patient-centered care, in particular for vulnerable patients such as hospitalized children. We implemented and evaluated a stimulated recall methodology for collective confrontation in the context of family-centered rounds. Five parents and five healthcare team members reviewed video records of their bedside rounds, and were then interviewed using the stimulated recall methodology to identify work system barriers and facilitators in family-centered rounds. The evaluation of the methodology was based on a survey of the participants, and a qualitative analysis of interview data in light of the work system model of Smith and Carayon (1989; 2000). Positive survey feedback from the participants was received. The stimulated recall methodology identified barriers and facilitators in all work system elements. Participatory ergonomics methods such as the stimulated recall methodology allow a range of participants, including parents and children, to participate in healthcare process improvement. PMID:24894378
Using discrete choice experiments within a cost-benefit analysis framework: some considerations.
McIntosh, Emma
2006-01-01
A great advantage of the stated preference discrete choice experiment (SPDCE) approach to economic evaluation methodology is its immense flexibility within applied cost-benefit analyses (CBAs). However, while the use of SPDCEs in healthcare has increased markedly in recent years there has been a distinct lack of equivalent CBAs in healthcare using such SPDCE-derived valuations. This article outlines specific issues and some practical suggestions for consideration relevant to the development of CBAs using SPDCE-derived benefits. The article shows that SPDCE-derived CBA can adopt recent developments in cost-effectiveness methodology including the cost-effectiveness plane, appropriate consideration of uncertainty, the net-benefit framework and probabilistic sensitivity analysis methods, while maintaining the theoretical advantage of the SPDCE approach. The concept of a cost-benefit plane is no different in principle to the cost-effectiveness plane and can be a useful tool for reporting and presenting the results of CBAs. However, there are many challenging issues to address for the advancement of CBA methodology using SPDCEs within healthcare. Particular areas for development include the importance of accounting for uncertainty in SPDCE-derived willingness-to-pay values, the methodology of SPDCEs in clinical trial settings and economic models, measurement issues pertinent to using SPDCEs specifically in healthcare, and the importance of issues such as consideration of the dynamic nature of healthcare and the resulting impact this has on the validity of attribute definitions and context.
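The combination of the net-benefit framework with probabilistic sensitivity analysis mentioned above can be sketched simply: treat the SPDCE-derived willingness-to-pay value as an uncertain quantity, draw from its distribution, and report the probability that net benefit (WTP minus cost) is positive. The distribution, WTP mean, standard deviation, and cost below are hypothetical illustrations, not values from the article.

```python
import random

# Illustrative sketch: probabilistic sensitivity analysis on a net-benefit
# calculation where the SPDCE-derived WTP value is treated as uncertain
# (here, normally distributed). All numbers are hypothetical.

random.seed(0)  # reproducible draws

def prob_positive_net_benefit(wtp_mean, wtp_sd, cost, n=10_000):
    """Monte Carlo estimate of P(net benefit > 0)."""
    positives = sum(
        1 for _ in range(n) if random.gauss(wtp_mean, wtp_sd) - cost > 0
    )
    return positives / n

p = prob_positive_net_benefit(wtp_mean=120.0, wtp_sd=30.0, cost=100.0)
```

Plotting the draws of incremental cost against incremental benefit gives the cost-benefit plane analogue of the cost-effectiveness plane that the article describes.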
Methodological quality of systematic reviews on treatments for depression: a cross-sectional study.
Chung, V C H; Wu, X Y; Feng, Y; Ho, R S T; Wong, S Y S; Threapleton, D
2017-05-02
Depression is one of the most common mental disorders and identifying effective treatment strategies is crucial for the control of depression. Well-conducted systematic reviews (SRs) and meta-analyses can provide the best evidence for supporting treatment decision-making. Nevertheless, the trustworthiness of conclusions can be limited by lack of methodological rigour. This study aims to assess the methodological quality of a representative sample of SRs on depression treatments. A cross-sectional study on the bibliographical and methodological characteristics of SRs published on depression treatments trials was conducted. Two electronic databases (the Cochrane Database of Systematic Reviews and the Database of Abstracts of Reviews of Effects) were searched for potential SRs. SRs with at least one meta-analysis on the effects of depression treatments were considered eligible. The methodological quality of included SRs was assessed using the validated AMSTAR (Assessing the Methodological Quality of Systematic Reviews) tool. The associations between bibliographical characteristics and scoring on AMSTAR items were analysed using logistic regression analysis. A total of 358 SRs were included and appraised. Over half of included SRs (n = 195) focused on non-pharmacological treatments and harms were reported in 45.5% (n = 163) of all studies. Studies varied in methods and reporting practices: only 112 (31.3%) took the risk of bias among primary studies into account when formulating conclusions; 245 (68.4%) did not fully declare conflict of interests; 93 (26.0%) reported an 'a priori' design and 104 (29.1%) provided lists of both included and excluded studies. 
Results from regression analyses showed: more recent publications were more likely to report 'a priori' designs [adjusted odds ratio (AOR) 1.31, 95% confidence interval (CI) 1.09-1.57], to describe study characteristics fully (AOR 1.16, 95% CI 1.06-1.28), and to assess presence of publication bias (AOR 1.13, 95% CI 1.06-1.19), but were less likely to list both included and excluded studies (AOR 0.86, 95% CI 0.81-0.92). SRs published in journals with higher impact factor (AOR 1.14, 95% CI 1.04-1.25), completed by more review authors (AOR 1.12, 95% CI 1.01-1.24) and SRs on non-pharmacological treatments (AOR 1.62, 95% CI 1.01-2.59) were associated with better performance in publication bias assessment. The methodological quality of included SRs is disappointing. Future SRs should strive to improve rigour by considering risk of bias when formulating conclusions, declaring conflicts of interest, and explicitly describing harms. SR authors should also use appropriate methods to combine the results, prevent language and publication biases, and ensure timely updates.
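The adjusted odds ratios reported above come from multivariable logistic regression; the sketch below shows the simpler unadjusted building block, an odds ratio with a Wald 95% confidence interval from a 2x2 table (e.g. AMSTAR item reported vs not, by high vs low journal impact factor). The counts are hypothetical, and unlike the study's AORs this version adjusts for nothing.

```python
import math

# Illustrative sketch: unadjusted odds ratio and Wald 95% CI from a
# 2x2 table. Rows: exposed (a yes, b no); unexposed (c yes, d no).
# Counts below are hypothetical, not the study's data.

def odds_ratio_ci(a, b, c, d):
    """Return (odds ratio, CI lower bound, CI upper bound)."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se_log_or)
    hi = math.exp(math.log(or_) + 1.96 * se_log_or)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(20, 30, 10, 40)
```

A CI that excludes 1.0, as in several of the AORs above, is what licenses the "more likely / less likely" language in the abstract.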
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-18
... during conference calls and via email discussions. Member duties include prioritizing topics, designing... their expertise in methodological issues such as meta-analysis, analytic modeling or clinical...
Research: Research in Language Arts Education: Notes on How It Works.
ERIC Educational Resources Information Center
Dilworth, Collett B., Jr.
1980-01-01
Provides an overview of different types of educational research in language arts, including the quasi-experiment, the controlled methodological experiment, the controlled descriptive experiment, the non-controlled description, and textual analysis. (RL)
Social Impact Studies: An Expository Analysis
ERIC Educational Resources Information Center
Shields, Mark A.
1975-01-01
Analyzed are some selected studies on the social impact of resources development and construction projects including dams, highways, nuclear power plants and strip mines. The analytical and methodological problem of assessing differential impacts is stressed. (BT)
Ninth NASTRAN (R) Users' Colloquium
NASA Technical Reports Server (NTRS)
1980-01-01
The general application of finite element methodology and the specific application of NASTRAN to a wide variety of static and dynamic structural problems is addressed. Comparisons with other approaches and new methods of analysis with NASTRAN are included.
Nonpoint source pollution of urban stormwater runoff: a methodology for source analysis.
Petrucci, Guido; Gromaire, Marie-Christine; Shorshani, Masoud Fallah; Chebbo, Ghassan
2014-09-01
The characterization and control of runoff pollution from nonpoint sources in urban areas are a major issue for the protection of aquatic environments. We propose a methodology to quantify the sources of pollutants in an urban catchment and to analyze the associated uncertainties. After describing the methodology, we illustrate it through an application to the sources of Cu, Pb, Zn, and polycyclic aromatic hydrocarbons (PAH) from a residential catchment (228 ha) in the Paris region. In this application, we suggest several procedures that can be applied for the analysis of other pollutants in different catchments, including an estimation of the total extent of roof accessories (gutters and downspouts, watertight joints and valleys) in a catchment. These accessories emerge as the major source of Pb and as an important source of Zn in the example catchment, while activity-related sources (traffic, heating) are dominant for Cu (brake pad wear) and PAH (tire wear, atmospheric deposition).
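A common building block for this kind of source quantification is an emission-factor mass balance: each source's annual load is its surface extent times a per-unit emission factor, and shares of the total follow. The sketch below illustrates that arithmetic only; the source names, areas, and emission factors are hypothetical and are not the catchment data from the study.

```python
# Illustrative sketch: emission-factor mass balance for one pollutant
# (e.g. Pb) over a catchment. All areas and factors are hypothetical.

sources = {
    "roof_accessories": {"area_m2": 5_000, "ef_mg_per_m2_yr": 400},
    "roads":            {"area_m2": 80_000, "ef_mg_per_m2_yr": 10},
}

# Annual load per source (mg/yr) and fractional share of the total.
loads = {name: s["area_m2"] * s["ef_mg_per_m2_yr"] for name, s in sources.items()}
total = sum(loads.values())
shares = {name: load / total for name, load in loads.items()}
```

Propagating uncertainty, as the methodology above does, would amount to treating the areas and emission factors as distributions rather than point values.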
North, Carol S
2005-01-01
Several methodological issues may affect the findings of studies of the mental health effects of disasters over time. These issues include analysis of the course of individual disorders over time that may be lost when they are presented embedded in general summary statistics, consideration of assessment of psychiatric disorders versus symptoms, adherence to established criteria in assigning psychiatric diagnoses, and orientation of mental health issues to the type of disaster exposure of the sample. This report will explore these methodological issues in a review of disaster literature and in data obtained from study of survivors of the Oklahoma City bombing. Clinical implications of the data obtained from the Oklahoma City bombing study of survivors of the direct bomb blast are presented in the context of these methodological concerns.
Note: Methodology for the analysis of Bluetooth gateways in an implemented scatternet.
Etxaniz, J; Monje, P M; Aranguren, G
2014-03-01
This Note introduces a novel methodology to analyze the time performance of Bluetooth gateways in multi-hop networks, known as scatternets. The methodology is focused on distinguishing between the processing time and the time that each communication between nodes takes along an implemented scatternet. This technique is not only valid for Bluetooth networks but also for other wireless networks that offer access to their middleware in order to include beacons in the operation of the nodes. We show in this Note the results of the tests carried out on a Bluetooth scatternet in order to highlight the reliability and effectiveness of the methodology. The results also validate this technique showing convergence in the results when subtracting the time for the beacons from the delay measurements.
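The separation described above, distinguishing node processing time from link time, reduces to simple timestamp arithmetic once each intermediate node stamps a beacon on receive and on transmit. The sketch below shows that subtraction with hypothetical timestamps; it is an illustration of the idea, not the Note's implementation.

```python
# Illustrative sketch: separating per-node processing time from link
# (communication) time in a multi-hop path, using beacon timestamps
# recorded at each intermediate node. All times (ms) are hypothetical.

# (t_beacon_rx, t_beacon_tx) at each intermediate gateway node
hops = [
    (100.0, 112.0),
    (130.0, 139.0),
]
t_send, t_recv = 95.0, 150.0  # end-to-end timestamps at source and sink

processing = sum(tx - rx for rx, tx in hops)  # time spent inside nodes
total = t_recv - t_send                        # end-to-end delay
link_time = total - processing                 # what remains: on-air time
```

This assumes the beacon timestamps share a common clock (or are drift-corrected); in a real scatternet, clock synchronization between nodes is the hard part.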
A Comparative Analysis of Method Books for Class Jazz Instruction
ERIC Educational Resources Information Center
Watson, Kevin E.
2017-01-01
The purpose of this study was to analyze and compare instructional topics and teaching approaches included in selected class method books for jazz pedagogy through content analysis methodology. Frequency counts for the number of pages devoted to each defined instructional content category were compiled and percentages of pages allotted to each…
Gender Issues in Psychology: A Content Analysis of Introductory Psychology Textbooks.
ERIC Educational Resources Information Center
Connor-Greene, Patti; And Others
This paper assesses the attention given to gender issues in 17 psychology textbooks published between 1985 and 1987 and used in college undergraduate introductory courses. The methodology used was the analysis of content and research citations. Specific issues that were examined included: (1) the explanation of the distinction between gender and…
ERIC Educational Resources Information Center
Moeyaert, Mariola; Ugille, Maaike; Ferron, John M.; Onghena, Patrick; Heyvaert, Mieke; Beretvas, S. Natasha; Van den Noortgate, Wim
2015-01-01
The purpose of this study is to illustrate the multilevel meta-analysis of results from single-subject experimental designs of different types, including AB phase designs, multiple-baseline designs, ABAB reversal designs, and alternating treatment designs. Current methodological work on the meta-analysis of single-subject experimental designs…
Public and Private School Costs. A Local Analysis, 1994.
ERIC Educational Resources Information Center
Public Policy Forum, Inc., Milwaukee, WI.
This document presents findings of a study that identified key factors of cost-per-pupil differences between public and private school spending among selected Milwaukee area public and private schools. The analysis was limited to cost factors only, specifically, to per-pupil spending. Methodology included a review of the school budgets of 7 public…
ERIC Educational Resources Information Center
Jones, Earl I., Ed.
This five-section symposium report includes 22 papers assessing the state-of-the-art in occupational research. Section 1, Occupational Analysis, Structure, and Methods, contains four papers that discuss: the Air Force Occupational Research project, methodologies in job analysis, evaluation, structures and requirements, career development,…
ERIC Educational Resources Information Center
Evan-Wong, Sue; de Freitas, Claudette
1995-01-01
Presents a methodology for marketing an information service which focuses on including information users in the strategic marketing planning process. Identifies the following stages of a marketing planning process: analysis of the environment, information audit, information needs assessment, market opportunity analysis, tactical marketing program,…
Effectiveness of Occupational Health and Safety Training: A Systematic Review with Meta-Analysis
ERIC Educational Resources Information Center
Ricci, Federico; Chiesi, Andrea; Bisio, Carlo; Panari, Chiara; Pelosi, Annalisa
2016-01-01
Purpose: This meta-analysis aims to verify the efficacy of occupational health and safety (OHS) training in terms of knowledge, attitude and beliefs, behavior and health. Design/methodology/approach: The authors included studies published in English (2007-2014) selected from ten databases. Eligibility criteria were studies concerned with the…
Use of Analog Functional Analysis in Assessing the Function of Mealtime Behavior Problems.
ERIC Educational Resources Information Center
Girolami, Peter A.; Scotti, Joseph R.
2001-01-01
This study applied the methodology of an analog experimental (functional) analysis of behavior to the specific interaction between parents and three children with mental retardation exhibiting food refusal and related mealtime problems. Analog results were highly consistent with other forms of functional assessment data, including interviews,…
Educational Leadership Effectiveness: A Rasch Analysis
ERIC Educational Resources Information Center
Sinnema, Claire; Ludlow, Larry; Robinson, Viviane
2016-01-01
Purpose: The purposes of this paper are, first, to establish the psychometric properties of the ELP tool, and, second, to test, using a Rasch item response theory analysis, the hypothesized progression of challenge presented by the items included in the tool. Design/ Methodology/ Approach: Data were collected at two time points through a survey of…
Critical Protection Item classification for a waste processing facility at Savannah River Site
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ades, M.J.; Garrett, R.J.
1993-10-01
This paper describes the methodology for Critical Protection Item (CPI) classification and its application to the Structures, Systems and Components (SSC) of a waste processing facility at the Savannah River Site (SRS). The WSRC methodology for CPI classification includes the evaluation of the radiological and non-radiological consequences resulting from postulated accidents at the waste processing facility and comparison of these consequences with allowable limits. The types of accidents considered include explosions and fire in the facility and postulated accidents due to natural phenomena, including earthquakes, tornadoes, and high-velocity straight winds. The radiological analysis results indicate that CPIs are not required at the waste processing facility to mitigate the consequences of radiological release. The non-radiological analysis, however, shows that the Waste Storage Tank (WST) and the dike spill containment structures around the formic acid tanks in the cold chemical feed area and waste treatment area of the facility should be identified as CPIs. Accident mitigation options are provided and discussed.
Wong, John B.; Coates, Paul M.; Russell, Robert M.; Dwyer, Johanna T.; Schuttinga, James A.; Bowman, Barbara A.; Peterson, Sarah A.
2011-01-01
Increased interest in the potential societal benefit of incorporating health economics as a part of clinical translational science, particularly nutrition interventions, led the Office of Dietary Supplements at the National Institutes of Health to sponsor a conference to address key questions about economic analysis of nutrition interventions to enhance communication among health economic methodologists, researchers, reimbursement policy makers, and regulators. Issues discussed included the state of the science, such as what health economic methods are currently used to judge the burden of illness, interventions, or health care policies, and what new research methodologies are available or needed to address knowledge and methodological gaps or barriers. Research applications included existing evidence-based health economic research activities in nutrition that are ongoing or planned at federal agencies. International and U.S. regulatory, policy and clinical practice perspectives included a discussion of how research results can help regulators and policy makers within government make nutrition policy decisions, and how economics affects clinical guideline development. PMID:21884133
Simulation of Attacks for Security in Wireless Sensor Network
Diaz, Alvaro; Sanchez, Pablo
2016-01-01
The increasing complexity and low-power constraints of current Wireless Sensor Networks (WSN) require efficient methodologies for network simulation and embedded software performance analysis of nodes. In addition, security is also a very important feature that has to be addressed in most WSNs, since they may work with sensitive data and operate in hostile unattended environments. In this paper, a methodology for security analysis of Wireless Sensor Networks is presented. The methodology allows designing attack-aware embedded software/firmware or attack countermeasures to provide security in WSNs. The proposed methodology includes attacker modeling and attack simulation with performance analysis (node’s software execution time and power consumption estimation). After an analysis of different WSN attack types, an attacker model is proposed. This model defines three different types of attackers that can emulate most WSN attacks. In addition, this paper presents a virtual platform that is able to model the node hardware, embedded software and basic wireless channel features. This virtual simulation analyzes the embedded software behavior and node power consumption while it takes into account the network deployment and topology. Additionally, this simulator integrates the previously mentioned attacker model. Thus, the impact of attacks on power consumption and software behavior/execution-time can be analyzed. This provides developers with essential information about the effects that one or multiple attacks could have on the network, helping them to develop more secure WSN systems. This WSN attack simulator is an essential element of the attack-aware embedded software development methodology that is also introduced in this work. PMID:27869710
Thermal Design, Analysis, and Testing of the Quench Module Insert Bread Board
NASA Technical Reports Server (NTRS)
Breeding, Shawn; Khodabandeh, Julia
2002-01-01
Contents include the following: Quench Module Insert (QMI) science requirements. QMI interfaces. QMI design layout. QMI thermal analysis and design methodology. QMI bread board testing and instrumentation approach. QMI thermal probe design parameters. Design features for gradient measurement. Design features for heated zone measurements. Thermal gradient analysis results. Heated zone analysis results. Bread board thermal probe layout. QMI bread board correlation and performance. Summary and conclusions.
Low-Temperature Hydrothermal Resource Potential
Katherine Young
2016-06-30
Compilation of data (spreadsheet and shapefiles) for several low-temperature resource types, including isolated springs and wells, delineated area convection systems, sedimentary basins and coastal plains sedimentary systems. For each system, we include estimates of the accessible resource base, mean extractable resource and beneficial heat. Data compiled from USGS and other sources. The paper (submitted to GRC 2016) describing the methodology and analysis is also included.
Satellite services system analysis study. Volume 3: Service equipment requirements
NASA Technical Reports Server (NTRS)
1981-01-01
Service equipment mission requirements are discussed. On-orbit operations, satellite classes, and reference missions are included. Service equipment usage and requirements are considered. Equipment identification methodology is discussed. Service equipment usage is analyzed, including initial launch, revisit, Earth return, and orbital storage. A summary of service requirements and equipment is presented, including service equipment status, event interaction, satellite features, and observations.
Reiner, Bruce I
2018-04-01
Uncertainty in text-based medical reports has long been recognized as problematic, frequently resulting in misunderstanding and miscommunication. One strategy for addressing the negative clinical ramifications of report uncertainty would be the creation of a standardized methodology for characterizing and quantifying uncertainty language, which could provide both the report author and the reader with context related to the perceived level of diagnostic confidence and accuracy. A number of computerized strategies could be employed in this analysis, including string search, natural language processing and understanding, histogram analysis, topic modeling, and machine learning. The derived uncertainty data offer the potential to objectively analyze report uncertainty in real time and to correlate it with outcomes analysis for the purpose of context- and user-specific decision support at the point of care, where intervention would have the greatest clinical impact.
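The simplest of the computerized strategies listed above, string search, can be sketched as a crude hedging-term counter. The term list and the per-word normalization are illustrative assumptions, not a validated lexicon.

```python
import re

# Hypothetical hedging phrases; a real system would use a curated lexicon.
HEDGE_TERMS = ["possible", "probable", "cannot exclude",
               "suspicious for", "may represent"]

def uncertainty_score(report_text):
    """Return hedge-term hits per word as a crude uncertainty measure."""
    text = report_text.lower()
    hits = sum(len(re.findall(re.escape(term), text)) for term in HEDGE_TERMS)
    words = max(len(text.split()), 1)
    return hits / words

report = "Findings are suspicious for pneumonia; a small effusion may represent atelectasis."
score = uncertainty_score(report)
```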
Methodological aspects of fuel performance system analysis at raw hydrocarbon processing plants
NASA Astrophysics Data System (ADS)
Kulbjakina, A. V.; Dolotovskij, I. V.
2018-01-01
The article discusses the methodological aspects of fuel performance system analysis at raw hydrocarbon (RH) processing plants. Modern RH processing facilities are the major consumers of energy resources (ER) for their own needs. To reduce ER, including fuel consumption, and to develop rational fuel system structure are complex and relevant scientific tasks that can only be done using system analysis and complex system synthesis. In accordance with the principles of system analysis, the hierarchical structure of the fuel system, the block scheme for the synthesis of the most efficient alternative of the fuel system using mathematical models and the set of performance criteria have been developed on the main stages of the study. The results from the introduction of specific engineering solutions to develop their own energy supply sources for RH processing facilities have been provided.
The Future Impact of Wind on BPA Power System Load Following and Regulation Requirements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Makarov, Yuri V.; Lu, Shuai; McManus, Bart
Wind power is growing at a very fast pace as an alternative generating resource. As the ratio of wind power to total system capacity increases, the impact of wind on various system aspects becomes significant. This paper presents a methodology to study the future impact of wind on BPA power system load following and regulation requirements. Existing methodologies for similar analysis include dispatch model simulation and standard deviation evaluation of load and wind data. The methodology proposed in this paper uses historical data and stochastic processes to simulate the load balancing processes in the BPA power system. It mimics the actual power system operations, so the results are close to reality, yet studies based on this methodology are convenient to perform. The capacity, ramp rate, and ramp duration characteristics are extracted from the simulation results. System load following and regulation capacity requirements are calculated accordingly. The ramp rate and ramp duration data obtained from the analysis can be used to evaluate generator response or maneuverability requirements and regulating units' energy requirements, respectively.
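Extracting ramp-rate and capacity characteristics from a simulated balancing signal can be sketched as below. The toy series and the 95th-percentile coverage choice are illustrative assumptions, not BPA's actual procedure.

```python
def ramp_rates(series):
    """First differences of a balancing signal: MW change per interval."""
    return [b - a for a, b in zip(series, series[1:])]

def capacity_requirement(series, coverage=0.95):
    """Capacity needed to cover a given fraction of absolute deviations."""
    magnitudes = sorted(abs(x) for x in series)
    idx = min(int(coverage * len(magnitudes)), len(magnitudes) - 1)
    return magnitudes[idx]

# Hypothetical load-following deviations (MW) from a simulation run.
signal = [10, -5, 20, 15, -30, 25, 5, -10]
requirement = capacity_requirement(signal)
```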
Methodological Status and Trends in Expository Text Structure Instruction Efficacy Research
ERIC Educational Resources Information Center
Bohaty, Janet J.; Hebert, Michael A.; Nelson, J. Ron; Brown, Jessica A.
2015-01-01
This systematic descriptive historical review was conducted to examine the status and trends in expository text structure instruction efficacy research for first through twelfth grade students. The analysis included sixty studies, which spanned the years 1978 to 2014. Descriptive dimensions of the research included study type, research design,…
System engineering toolbox for design-oriented engineers
NASA Technical Reports Server (NTRS)
Goldberg, B. E.; Everhart, K.; Stevens, R.; Babbitt, N., III; Clemens, P.; Stout, L.
1994-01-01
This system engineering toolbox is designed to provide tools and methodologies to the design-oriented systems engineer. A tool is defined as a set of procedures to accomplish a specific function. A methodology is defined as a collection of tools, rules, and postulates to accomplish a purpose. For each concept addressed in the toolbox, the following information is provided: (1) description, (2) application, (3) procedures, (4) examples, if practical, (5) advantages, (6) limitations, and (7) bibliography and/or references. The scope of the document includes concept development tools, system safety and reliability tools, design-related analytical tools, graphical data interpretation tools, a brief description of common statistical tools and methodologies, so-called total quality management tools, and trend analysis tools. Both relationship to project phase and primary functional usage of the tools are also delineated. The toolbox also includes a case study for illustrative purposes. Fifty-five tools are delineated in the text.
Buldt, Andrew K; Allan, Jamie J; Landorf, Karl B; Menz, Hylton B
2018-02-23
Foot posture is a risk factor for some lower limb injuries; however, the underlying mechanism is not well understood. Plantar pressure analysis is one technique for investigating the interaction between foot posture and biomechanical function of the lower limb. The aim of this review was to investigate the relationship between foot posture and plantar pressure during walking. A systematic database search was conducted using MEDLINE, CINAHL, SPORTDiscus and Embase to identify studies that have assessed the relationship between foot posture and plantar pressure during walking. Included studies were assessed for methodological quality. Meta-analysis was not conducted due to heterogeneity between studies; inconsistencies included foot posture classification techniques, gait analysis protocols, selection of plantar pressure parameters, and statistical analysis approaches. Of the 4213 citations identified for title and abstract review, sixteen studies were included and underwent quality assessment; all were of moderate methodological quality. There was some evidence that planus feet display higher peak pressure, pressure-time integral, maximum force, force-time integral and contact area predominantly in the medial arch, central forefoot and hallux, while these variables are lower in the lateral and medial forefoot. In contrast, cavus feet display higher peak pressure and pressure-time integral in the heel and lateral forefoot, while pressure-time integral, maximum force, force-time integral and contact area are lower for the midfoot and hallux. Centre of pressure was more laterally deviated in cavus feet and more medially deviated in planus feet. Overall, effect sizes were moderate, but regression models could only explain a small amount of variance in plantar pressure variables.
Despite these significant findings, future research would benefit from greater methodological rigour, particularly in relation to the use of valid foot posture measurement techniques, gait analysis protocols, and standardised approaches for analysis and reporting of plantar pressure variables. Copyright © 2018 Elsevier B.V. All rights reserved.
[Application of root cause analysis in healthcare].
Hsu, Tsung-Fu
2007-12-01
The main purpose of this study was to explore various aspects of root cause analysis (RCA), including its definition, underlying rationale, main objective, implementation procedures, most common analysis methodology (fault tree analysis, FTA), and advantages and methodologic limitations with regard to healthcare. Several adverse events that occurred at a certain hospital were also analyzed by the author using FTA as part of this study. RCA is a process employed to identify basic and contributing causal factors underlying performance variations associated with adverse events. The rationale of RCA offers a systemic approach to improving patient safety that does not assign blame or liability to individuals. The four-step process involved in conducting an RCA includes: RCA preparation, proximate cause identification, root cause identification, and recommendation generation and implementation. FTA is a logical, structured process that can help identify potential causes of system failure before actual failures occur. Some advantages and significant methodologic limitations of RCA were discussed. Finally, we emphasized that errors stem principally from faults attributable to system design, practice guidelines, work conditions, and other human factors, which lead health professionals to commit negligence or make mistakes in healthcare. We must explore the root causes of medical errors to eliminate potential system failure factors. A systemic approach is also needed to resolve medical errors and to move beyond a current culture centered on assigning fault to individuals. By constructing a real environment of patient-centered, safe healthcare, we can encourage clients to accept state-of-the-art healthcare services.
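The FTA evaluation mentioned above combines basic-event probabilities through AND and OR gates. A minimal sketch, assuming independent basic events and using entirely hypothetical event names and probabilities (not from the hospital cases analyzed):

```python
def p_or(probs):
    """OR gate: top event occurs if any input does (independence assumed)."""
    complement = 1.0
    for q in probs:
        complement *= (1.0 - q)
    return 1.0 - complement

def p_and(probs):
    """AND gate: top event occurs only if all inputs do."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Hypothetical tree: "wrong drug administered" =
# (ordering error OR transcription error) AND verification step fails.
top_event = p_and([p_or([0.01, 0.005]), 0.1])
```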
Willke, Richard J; Zheng, Zhiyuan; Subedi, Prasun; Althin, Rikard; Mullins, C Daniel
2012-12-13
Implicit in the growing interest in patient-centered outcomes research is a growing need for better evidence regarding how responses to a given intervention or treatment may vary across patients, referred to as heterogeneity of treatment effect (HTE). A variety of methods are available for exploring HTE, each associated with unique strengths and limitations. This paper reviews a selected set of methodological approaches to understanding HTE, focusing largely but not exclusively on their uses with randomized trial data. It is oriented for the "intermediate" outcomes researcher, who may already be familiar with some methods, but would value a systematic overview of both more and less familiar methods with attention to when and why they may be used. Drawing from the biomedical, statistical, epidemiological and econometrics literature, we describe the steps involved in choosing an HTE approach, focusing on whether the intent of the analysis is for exploratory, initial testing, or confirmatory testing purposes. We also map HTE methodological approaches to data considerations as well as the strengths and limitations of each approach. Methods reviewed include formal subgroup analysis, meta-analysis and meta-regression, various types of predictive risk modeling including classification and regression tree analysis, series of n-of-1 trials, latent growth and growth mixture models, quantile regression, and selected non-parametric methods. In addition to an overview of each HTE method, examples and references are provided for further reading.By guiding the selection of the methods and analysis, this review is meant to better enable outcomes researchers to understand and explore aspects of HTE in the context of patient-centered outcomes research.
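Formal subgroup analysis, the first HTE method listed above, can be sketched as comparing treatment-control mean differences within subgroups. The data below are toy values; a real analysis would test the treatment-by-subgroup interaction term in a regression model rather than eyeball the differences.

```python
def mean(xs):
    return sum(xs) / len(xs)

def subgroup_effects(records):
    """records: list of (subgroup, treated_flag, outcome) tuples.
    Returns the treated-minus-control mean difference per subgroup."""
    effects = {}
    for g in {s for s, _, _ in records}:
        treated = [y for s, t, y in records if s == g and t]
        control = [y for s, t, y in records if s == g and not t]
        effects[g] = mean(treated) - mean(control)
    return effects

# Hypothetical trial data: (subgroup, treated, outcome).
data = [("young", 1, 5.0), ("young", 1, 6.0), ("young", 0, 4.0), ("young", 0, 4.0),
        ("old", 1, 3.0), ("old", 1, 3.0), ("old", 0, 3.0), ("old", 0, 2.0)]
effects = subgroup_effects(data)
```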
Zeng, Xiantao; Zhang, Yonggang; Kwong, Joey S W; Zhang, Chao; Li, Sheng; Sun, Feng; Niu, Yuming; Du, Liang
2015-02-01
To systematically review the methodological assessment tools for pre-clinical and clinical studies, systematic review and meta-analysis, and clinical practice guideline. We searched PubMed, the Cochrane Handbook for Systematic Reviews of Interventions, Joanna Briggs Institute (JBI) Reviewers Manual, Centre for Reviews and Dissemination, Critical Appraisal Skills Programme (CASP), Scottish Intercollegiate Guidelines Network (SIGN), and the National Institute for Clinical Excellence (NICE) up to May 20th, 2014. Two authors selected studies and extracted data; quantitative analysis was performed to summarize the characteristics of included tools. We included a total of 21 assessment tools for analysis. A number of tools were developed by academic organizations, and some were developed by only a small group of researchers. The JBI developed the highest number of methodological assessment tools, with CASP coming second. Tools for assessing the methodological quality of randomized controlled studies were most abundant. The Cochrane Collaboration's tool for assessing risk of bias is the best available tool for assessing RCTs. For cohort and case-control studies, we recommend the use of the Newcastle-Ottawa Scale. The Methodological Index for Non-Randomized Studies (MINORS) is an excellent tool for assessing non-randomized interventional studies, and the Agency for Healthcare Research and Quality (ARHQ) methodology checklist is applicable for cross-sectional studies. 
For diagnostic accuracy test studies, the Quality Assessment of Diagnostic Accuracy Studies-2 (QUADAS-2) tool is recommended; the SYstematic Review Centre for Laboratory animal Experimentation (SYRCLE) risk of bias tool is available for assessing animal studies; Assessment of Multiple Systematic Reviews (AMSTAR) is a measurement tool for systematic reviews/meta-analyses; an 18-item tool has been developed for appraising case series studies, and the Appraisal of Guidelines, Research and Evaluation (AGREE)-II instrument is widely used to evaluate clinical practice guidelines. We have successfully identified a variety of methodological assessment tools for different types of study design. However, further efforts in the development of critical appraisal tools are warranted since there is currently a lack of such tools for other fields, e.g. genetic studies, and some existing tools (nested case-control studies and case reports, for example) are in need of updating to be in line with current research practice and rigor. In addition, it is very important that all critical appraisal tools remain objective and that performance bias is effectively avoided. © 2015 Chinese Cochrane Center, West China Hospital of Sichuan University and Wiley Publishing Asia Pty Ltd.
Methodological Reporting of Randomized Trials in Five Leading Chinese Nursing Journals
Shi, Chunhu; Tian, Jinhui; Ren, Dan; Wei, Hongli; Zhang, Lihuan; Wang, Quan; Yang, Kehu
2014-01-01
Background Randomized controlled trials (RCTs) are not always well reported, especially in terms of their methodological descriptions. This study aimed to investigate the adherence of methodological reporting complying with CONSORT and explore associated trial level variables in the Chinese nursing care field. Methods In June 2012, we identified RCTs published in five leading Chinese nursing journals and included trials with details of randomized methods. The quality of methodological reporting was measured through the methods section of the CONSORT checklist and the overall CONSORT methodological items score was calculated and expressed as a percentage. Meanwhile, we hypothesized that some general and methodological characteristics were associated with reporting quality and conducted a regression with these data to explore the correlation. The descriptive and regression statistics were calculated via SPSS 13.0. Results In total, 680 RCTs were included. The overall CONSORT methodological items score was 6.34±0.97 (Mean ± SD). No RCT reported descriptions and changes in “trial design,” changes in “outcomes” and “implementation,” or descriptions of the similarity of interventions for “blinding.” Poor reporting was found in detailing the “settings of participants” (13.1%), “type of randomization sequence generation” (1.8%), calculation methods of “sample size” (0.4%), explanation of any interim analyses and stopping guidelines for “sample size” (0.3%), “allocation concealment mechanism” (0.3%), additional analyses in “statistical methods” (2.1%), and targeted subjects and methods of “blinding” (5.9%). More than 50% of trials described randomization sequence generation, the eligibility criteria of “participants,” “interventions,” and definitions of the “outcomes” and “statistical methods.” The regression analysis found that publication year and ITT analysis were weakly associated with CONSORT score. 
Conclusions The completeness of methodological reporting of RCTs in the Chinese nursing care field is poor, especially with regard to the reporting of trial design, changes in outcomes, sample size calculation, allocation concealment, blinding, and statistical methods. PMID:25415382
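The overall CONSORT methodological items score reported above is a checklist sum expressed against the maximum possible. A minimal sketch of that scoring, with hypothetical item names and adherence flags:

```python
def consort_score(item_flags):
    """item_flags: dict mapping a CONSORT methods-section item to
    1 if the trial report adheres to it, else 0. Returns a percentage."""
    return 100.0 * sum(item_flags.values()) / len(item_flags)

# One hypothetical trial's adherence profile (illustrative items only).
trial = {"sequence_generation": 1, "allocation_concealment": 0,
         "blinding": 0, "sample_size": 0, "statistical_methods": 1,
         "eligibility_criteria": 1, "outcomes_defined": 1, "interventions": 1}
score = consort_score(trial)
```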
Thermal Control System for a Small, Extended Duration Lunar Surface Science Platform
NASA Technical Reports Server (NTRS)
Bugby, D.; Farmer, J.; OConnor, B.; Wirzburger, M.; Abel, E.; Stouffer, C.
2010-01-01
The presentation slides include: Introduction: lunar mission definition, Problem: requirements/methodology, Concept: thermal switching options, Analysis: system evaluation, Plans: dual-radiator LHP (loop heat pipe) test bed, and Conclusions: from this study.
Payload/orbiter contamination control requirement study: Spacelab configuration contamination study
NASA Technical Reports Server (NTRS)
Bareiss, L. E.; Hetrick, M. A.; Ress, E. B.; Strange, D. A.
1976-01-01
The assessment of the Spacelab carrier induced contaminant environment was continued, and the ability of Spacelab to meet established contamination control criteria for the space transportation system program was determined. The primary areas considered included: (1) updating, refining, and improving the Spacelab contamination computer model and contamination analysis methodology, (2) establishing the resulting adjusted induced environment predictions for comparison with the applicable criteria, (3) determining the Spacelab design and operational requirements necessary to meet the criteria, (4) conducting mission feasibility analyses of the combined Spacelab/Orbiter contaminant environment for specific proposed mission and payload mixes, and (5) establishing a preliminary Spacelab mission support plan as well as model interface requirements. A summary of the activities conducted to date with respect to the modelling, analysis, and predictions of the induced environment, including any modifications in the approach or methodology utilized in the contamination assessment of the Spacelab carrier, was presented.
[Neurological sciences based on evidence].
Ceballos, C; Almárcegui, C; Artal, A; García-Campayo, J; Valdizán, J R
An exhaustive search of reported meta-analyses from every medical speciality is described: papers included in MEDLINE or EMBASE between 1973 and 1998. A descriptive analysis of the reported papers (frequency tables and graphics) is given, including differences in the mean number of reported meta-analysis papers by medical speciality and year. A total of 1,514 papers were selected and classified. Between 1977 and 1987, the overall mean number of reported neurologic meta-analyses (1.20 +/- 1.10) was significantly lower than in 1988-1998 (11.20 +/- 7.85) (p < 0.001). The global number of neurologic meta-analyses was positively correlated (p < 0.05) with the number of studies on the fundamentals and methodology of meta-analysis during the study period. A progressive increase in the number of reported neurologic meta-analyses since 1977 can be demonstrated. Diffusion of knowledge about the fundamentals and methodology of meta-analysis seems to have driven an increase in performing and reporting this kind of analysis.
Patterson, Stephanie Y; Smith, Veronica; Mirenda, Pat
2012-09-01
The purpose of this systematic review was to examine research utilizing single subject research designs (SSRD) to explore the effectiveness of interventions designed to increase parents' ability to support communication and social development in children with autism spectrum disorders (ASDs). Included studies were systematically assessed for methodological quality (Logan et al., 2008; Smith et al., 2007) and intervention effects. Data examining participant characteristics, study methodology, outcomes, and analysis were systematically extracted. Eleven SSRD parent-training intervention studies examining 44 participants with ASD were included. Overall, the studies were of moderate quality and reported increases in parent skills and child language and communication outcomes. The results supported by improvement rate difference (IRD) analysis indicated several interventions demonstrated positive effects for both parent and child outcomes. However, limited generalization and follow-up data suggested only one intervention demonstrated parents' accurate and ongoing intervention implementation beyond training.
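The improvement rate difference (IRD) analysis used in the review above contrasts the proportion of improved data points between phases. A simplified criterion-based sketch (the standard IRD is computed from data-point overlap between phases; the fixed criterion and the toy series here are assumptions for illustration):

```python
def improvement_rate_difference(baseline, treatment, criterion):
    """Proportion of treatment-phase points exceeding the criterion
    minus the proportion of baseline points exceeding it."""
    ir_baseline = sum(1 for x in baseline if x > criterion) / len(baseline)
    ir_treatment = sum(1 for x in treatment if x > criterion) / len(treatment)
    return ir_treatment - ir_baseline

# Hypothetical single-subject data: parent support behaviors per session.
baseline = [2, 3, 2, 4]
treatment = [6, 7, 5, 8]
ird = improvement_rate_difference(baseline, treatment, criterion=4)
```

An IRD near 1.0 indicates a strong phase effect; values near 0 indicate substantial overlap.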
Harrigan, George G; Harrison, Jay M
2012-01-01
New transgenic (GM) crops are subjected to extensive safety assessments that include compositional comparisons with conventional counterparts as a cornerstone of the process. The influence of germplasm, location, environment, and agronomic treatments on compositional variability is, however, often obscured in these pair-wise comparisons. Furthermore, classical statistical significance testing can often provide an incomplete and over-simplified summary of highly responsive variables such as crop composition. In order to more clearly describe the influence of the numerous sources of compositional variation we present an introduction to two alternative but complementary approaches to data analysis and interpretation. These include i) exploratory data analysis (EDA) with its emphasis on visualization and graphics-based approaches and ii) Bayesian statistical methodology that provides easily interpretable and meaningful evaluations of data in terms of probability distributions. The EDA case-studies include analyses of herbicide-tolerant GM soybean and insect-protected GM maize and soybean. Bayesian approaches are presented in an analysis of herbicide-tolerant GM soybean. Advantages of these approaches over classical frequentist significance testing include the more direct interpretation of results in terms of probabilities pertaining to quantities of interest and no confusion over the application of corrections for multiple comparisons. It is concluded that a standardized framework for these methodologies could provide specific advantages through enhanced clarity of presentation and interpretation in comparative assessments of crop composition.
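The Bayesian methodology described above yields posterior probability distributions for quantities of interest rather than p-values. A hedged sketch using a conjugate normal-mean update for an analyte level; the prior, data values, and known variance are illustrative assumptions, not taken from the soybean or maize studies.

```python
def posterior_normal_mean(prior_mu, prior_var, data, data_var):
    """Conjugate update for a normal mean with known data variance.
    Returns the posterior mean and variance."""
    n = len(data)
    xbar = sum(data) / n
    post_var = 1.0 / (1.0 / prior_var + n / data_var)
    post_mu = post_var * (prior_mu / prior_var + n * xbar / data_var)
    return post_mu, post_var

# Hypothetical protein levels (% dry weight) in a GM soybean comparison.
mu, var = posterior_normal_mean(prior_mu=40.0, prior_var=4.0,
                                data=[39.5, 40.5, 41.0], data_var=1.0)
```

The posterior can then be read directly as, e.g., the probability that the analyte mean lies within the conventional-counterpart range, avoiding multiple-comparison corrections.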
[Quantity versus quality: a review on current methodological dispute in health services research].
Sikorski, Claudia; Glaesmer, Heide; Bramesfeld, Anke
2010-10-01
The aim of this study was to determine the percentage of qualitative and quantitative research papers on health services research in two German journals. All publications of the two journals were reviewed; only empirical research papers were included. It was then assessed whether they dealt with health services research and what methodology was used to collect and analyse the data. About half of all published empirical papers dealt with health services research. Of those, slightly over 20% used qualitative methods at least partially. Ordered by topic, qualitative data collection and analysis are especially common in the fields of phenomenology, treatment determinants and treatment outcome. Qualitative methodology alone is still used rather seldom in health services research. Attempts to include both quantitative and qualitative approaches are limited to sequential designs, lowering the independent value of both approaches. The concept of triangulation offers the possibility of overcoming paradigm-based dichotomies. However, the choice of methodology ought to be based primarily on the research question. © Georg Thieme Verlag KG Stuttgart · New York.
A methodology to event reconstruction from trace images.
Milliet, Quentin; Delémont, Olivier; Sapin, Eric; Margot, Pierre
2015-03-01
The widespread use of digital imaging devices for surveillance (CCTV) and entertainment (e.g., mobile phones, compact cameras) has increased the number of images recorded and the opportunities to consider images as traces or documentation of criminal activity. The forensic science literature focuses almost exclusively on technical issues and evidence assessment [1]. Earlier steps in the investigation phase have been neglected and must be considered. This article is the first comprehensive description of a methodology for event reconstruction using images. This formal methodology was conceptualised from practical experience and applied to different contexts and case studies to test and refine it. Based on this practical analysis, we propose a systematic approach that includes a preliminary analysis followed by four main steps. These steps form a sequence in which the results of each step rely on the previous step. The methodology is not linear, however; it is a cyclic, iterative progression for obtaining knowledge about an event. The preliminary analysis is a pre-evaluation phase, wherein the potential relevance of images is assessed. In the first step, images are detected and collected as pertinent trace material; the second step involves organising them and assessing their quality and informative potential. The third step includes reconstruction using clues about space, time and actions. Finally, in the fourth step, the images are evaluated and selected as evidence. These steps are described and illustrated using practical examples. The paper outlines how images elicit information about persons, objects, space, time and actions throughout the investigation process to reconstruct an event step by step. We emphasise the hypothetico-deductive reasoning framework, which demonstrates the contribution of images to generating, refining or eliminating propositions or hypotheses.
This methodology provides a sound basis for extending image use as evidence and, more generally, as clues in investigation and crime reconstruction processes. Copyright © 2015 Forensic Science Society. Published by Elsevier Ireland Ltd. All rights reserved.
Quintana, B; Pedrosa, M C; Vázquez-Canelas, L; Santamaría, R; Sanjuán, M A; Puertas, F
2018-04-01
A methodology including software tools for analysing NORM building materials and residues by low-level gamma-ray spectrometry has been developed. It comprises deconvolution of gamma-ray spectra using the software GALEA with focus on the natural radionuclides and Monte Carlo simulations for efficiency and true coincidence summing corrections. The methodology has been tested on a range of building materials and validated against reference materials. Copyright © 2017 Elsevier Ltd. All rights reserved.
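The efficiency and summing corrections mentioned above feed into the standard activity calculation of gamma-ray spectrometry, sketched below. The peak, counting time, and correction values are hypothetical and are not taken from the GALEA software or the reference materials described in the abstract.

```python
# Hedged sketch of the standard activity calculation: convert a net
# full-energy-peak area into activity using a (Monte Carlo-derived)
# peak efficiency and a true-coincidence-summing correction factor.
# All numeric values below are hypothetical illustrations.
net_counts     = 12_500    # net area of the 1460.8 keV K-40 peak
live_time_s    = 86_400    # counting live time (s)
efficiency     = 0.012     # full-energy-peak efficiency (MC simulated)
emission_prob  = 0.1066    # gamma emission probability of K-40
tcs_correction = 1.00      # true-coincidence-summing factor (negligible for K-40)

activity_bq = net_counts * tcs_correction / (efficiency * emission_prob * live_time_s)
print(f"activity ≈ {activity_bq:.1f} Bq in the counted sample")
```

For cascade emitters such as Co-60 or Ra-226 progeny, the summing factor deviates from unity, which is where the Monte Carlo corrections the abstract mentions become essential.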
Computational fluid dynamics combustion analysis evaluation
NASA Technical Reports Server (NTRS)
Kim, Y. M.; Shang, H. M.; Chen, C. P.; Ziebarth, J. P.
1992-01-01
This study involves the development of numerical modelling in spray combustion. These modelling efforts are mainly motivated by the need to improve the computational efficiency of the stochastic particle tracking method and to incorporate the physical submodels of turbulence, combustion, vaporization, and dense spray effects. The present mathematical formulation and numerical methodologies can be cast into any time-marching pressure correction methodology (PCM), such as the FDNS and MAST codes. A sequence of validation cases involving steady burning sprays and transient evaporating sprays is included.
Hydrogen Hazards Assessment Protocol (HHAP): Approach and Methodology
NASA Technical Reports Server (NTRS)
Woods, Stephen
2009-01-01
This viewgraph presentation reviews the approach and methodology used to develop an assessment protocol for hydrogen hazards. Included in the presentation are the reasons to perform hazard assessments, the types of hazard assessments that exist, an analysis of hydrogen hazards, and specific information about the Hydrogen Hazards Assessment Protocol (HHAP). The assessment is specifically tailored to hydrogen behavior. The end product of the assessment is a compilation of hazards, mitigations, and associated factors to facilitate decision making and achieve best practice.
NASA Technical Reports Server (NTRS)
Taylor, Arthur C., III; Hou, Gene W.
1992-01-01
Fundamental equations of aerodynamic sensitivity analysis and approximate analysis for the two dimensional thin layer Navier-Stokes equations are reviewed, and special boundary condition considerations necessary to apply these equations to isolated lifting airfoils on 'C' and 'O' meshes are discussed in detail. An efficient strategy which is based on the finite element method and an elastic membrane representation of the computational domain is successfully tested, which circumvents the costly 'brute force' method of obtaining grid sensitivity derivatives, and is also useful in mesh regeneration. The issue of turbulence modeling is addressed in a preliminary study. Aerodynamic shape sensitivity derivatives are efficiently calculated, and their accuracy is validated on two viscous test problems, including: (1) internal flow through a double throat nozzle, and (2) external flow over a NACA 4-digit airfoil. An automated aerodynamic design optimization strategy is outlined which includes the use of a design optimization program, an aerodynamic flow analysis code, an aerodynamic sensitivity and approximate analysis code, and a mesh regeneration and grid sensitivity analysis code. Application of the optimization methodology to the two test problems in each case resulted in a new design having a significantly improved performance in the aerodynamic response of interest.
Schueler, Sabine; Walther, Stefan; Schuetz, Georg M; Schlattmann, Peter; Dewey, Marc
2013-06-01
To evaluate the methodological quality of diagnostic accuracy studies on coronary computed tomography (CT) angiography using the QUADAS (Quality Assessment of Diagnostic Accuracy Studies included in systematic reviews) tool. Each QUADAS item was individually defined to adapt it to the special requirements of studies on coronary CT angiography. Two independent investigators analysed 118 studies using 12 QUADAS items. Meta-regression and pooled analyses were performed to identify possible effects of methodological quality items on estimates of diagnostic accuracy. The overall methodological quality of coronary CT studies was merely moderate. They fulfilled a median of 7.5 out of 12 items. Only 9 of the 118 studies fulfilled more than 75 % of possible QUADAS items. One QUADAS item ("Uninterpretable Results") showed a significant influence (P = 0.02) on estimates of diagnostic accuracy with "no fulfilment" increasing specificity from 86 to 90 %. Furthermore, pooled analysis revealed that each QUADAS item that is not fulfilled has the potential to change estimates of diagnostic accuracy. The methodological quality of studies investigating the diagnostic accuracy of non-invasive coronary CT is only moderate and was found to affect the sensitivity and specificity. An improvement is highly desirable because good methodology is crucial for adequately assessing imaging technologies. • Good methodological quality is a basic requirement in diagnostic accuracy studies. • Most coronary CT angiography studies have only been of moderate design quality. • Weak methodological quality will affect the sensitivity and specificity. • No improvement in methodological quality was observed over time. • Authors should consider the QUADAS checklist when undertaking accuracy studies.
Response Time Analysis and Test of Protection System Instrument Channels for APR1400 and OPR1000
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Chang Jae; Han, Seung; Yun, Jae Hee
2015-07-01
Safety limits are required to maintain the integrity of physical barriers designed to prevent the uncontrolled release of radioactive materials in nuclear power plants. The safety analysis establishes two critical constraints: an analytical limit in terms of a measured or calculated variable, and a specific time after the analytical limit is reached at which protective action must begin. In keeping with the nuclear regulations and industry standards, satisfying these two requirements will ensure that the safety limit is not exceeded during the design basis event, either an anticipated operational occurrence or a postulated accident. Various studies on the setpoint determination methodology for safety-related instrumentation have been actively performed to ensure that the requirement of the analytical limit is satisfied. In particular, the protection setpoint methodology for the advanced power reactor 1400 (APR1400) and the optimized power reactor 1000 (OPR1000) has recently been developed to cover both design basis events and beyond-design-basis events. The developed setpoint methodology has also been quantitatively validated using specific computer programs and setpoint calculations. However, the safety of nuclear power plants cannot be fully guaranteed by satisfying the requirement of the analytical limit alone. In spite of the response time verification requirements of nuclear regulations and industry standards, studies on a systematically integrated methodology for response time evaluation are hard to find. In the cases of APR1400 and OPR1000, the response time analysis for the plant protection system is partially included in the setpoint calculation, and the response time test is performed separately via a specific plant procedure. The test technique has the drawback that it is difficult to demonstrate the completeness of the timing test. The analysis technique has the demerit of yielding extreme times that are not actually possible.
Thus, the establishment of the systematic response time evaluation methodology is needed to justify the conformance to the response time requirement used in the safety analysis. This paper proposes the response time evaluation methodology for APR1400 and OPR1000 using the combined analysis and test technique to confirm that the plant protection system can meet the analytical response time assumed in the safety analysis. In addition, the results of the quantitative evaluation performed for APR1400 and OPR1000 are presented in this paper. The proposed response time analysis technique consists of defining the response time requirement, determining the critical signal path for the trip parameter, allocating individual response time to each component on the signal path, and analyzing the total response time for the trip parameter, and demonstrates that the total analyzed response time does not exceed the response time requirement. The proposed response time test technique is composed of defining the response time requirement, determining the critical signal path for the trip parameter, determining the test method for each component on the signal path, performing the response time test, and demonstrates that the total test result does not exceed the response time requirement. The total response time should be tested in a single test that covers from the sensor to the final actuation device on the instrument channel. When the total channel is not tested in a single test, separate tests on groups of components or single components including the total instrument channel shall be combined to verify the total channel response. For APR1400 and OPR1000, the ramp test technique is used for the pressure and differential pressure transmitters and the step function testing technique is applied to the signal processing equipment and final actuation device. As a result, it can be demonstrated that the response time requirement is satisfied by the combined analysis and test technique. 
Therefore, the methodology proposed in this paper plays a crucial role in guaranteeing the safety of nuclear power plants by systematically satisfying one of the two critical requirements from the safety analysis. (authors)
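The analysis technique described above, allocating a response time to each component on the critical signal path and checking the sum against the requirement, can be sketched in a few lines. The component names, allocated times, and the requirement below are hypothetical illustrations, not APR1400/OPR1000 design values.

```python
# Hedged sketch of a response time analysis: sum the response times
# allocated to each component on the critical signal path and compare
# the total against the analytical requirement from the safety analysis.
# All names and times are hypothetical.
REQUIREMENT_S = 1.2  # hypothetical analytical response time limit (s)

signal_path = {                    # component -> allocated response time (s)
    "pressure transmitter":  0.50,
    "signal conditioning":   0.15,
    "bistable processor":    0.20,
    "coincidence logic":     0.10,
    "final actuation relay": 0.15,
}

total = sum(signal_path.values())
margin = REQUIREMENT_S - total
verdict = "OK" if total <= REQUIREMENT_S else "EXCEEDS LIMIT"
print(f"total analyzed response time = {total:.2f} s ({verdict}, margin {margin:.2f} s)")
```

The complementary test technique would replace the allocated values with measured ones (ramp tests for transmitters, step-function tests for the signal processing and actuation equipment) and combine the group results the same way.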
Does Exercise Improve Cognitive Performance? A Conservative Message from Lord's Paradox.
Liu, Sicong; Lebeau, Jean-Charles; Tenenbaum, Gershon
2016-01-01
Although extant meta-analyses support the notion that exercise results in cognitive performance enhancement, methodological shortcomings are noted among the primary evidence. The present study examined relevant randomized controlled trials (RCTs) published in the past 20 years (1996-2015) for methodological concerns arising from Lord's paradox. Our analysis revealed that RCTs supporting the positive effect of exercise on cognition are likely to include Type I error(s). This result can be attributed to the use of gain score analysis on pretest-posttest data, as well as the presence of control group superiority over the exercise group on baseline cognitive measures. To improve the accuracy of causal inferences in this area, analysis of covariance on pretest-posttest data is recommended under the assumption of group equivalence. Important experimental procedures for maintaining group equivalence are discussed.
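The statistical issue the abstract raises can be demonstrated on simulated data: when groups differ at baseline, a gain-score comparison and an ANCOVA-style adjustment can disagree (Lord's paradox). The data below are simulated with no true exercise effect; parameter values are arbitrary illustrations.

```python
# Hedged sketch of Lord's paradox on pretest-posttest data.
# No true treatment effect is simulated, yet the gain-score analysis
# shows a spurious "exercise benefit" because the control group starts
# higher at baseline and both groups regress toward the mean.
import numpy as np

rng = np.random.default_rng(1)
n = 200
pre_ctrl = rng.normal(105, 10, n)   # control superior at baseline
pre_exer = rng.normal(95, 10, n)
post = lambda pre: 50 + 0.5 * pre + rng.normal(0, 5, len(pre))  # no effect
post_ctrl, post_exer = post(pre_ctrl), post(pre_exer)

# 1) Gain-score comparison: difference of mean gains.
gain_effect = (post_exer - pre_exer).mean() - (post_ctrl - pre_ctrl).mean()

# 2) ANCOVA-style estimate: adjust posttest for pretest via a pooled slope.
pre = np.concatenate([pre_ctrl, pre_exer])
pst = np.concatenate([post_ctrl, post_exer])
grp = np.concatenate([np.zeros(n), np.ones(n)])        # 1 = exercise group
X = np.column_stack([np.ones(2 * n), pre, grp])
beta = np.linalg.lstsq(X, pst, rcond=None)[0]
ancova_effect = beta[2]

print(f"gain-score 'effect': {gain_effect:+.2f}")   # spuriously favors exercise
print(f"ANCOVA estimate    : {ancova_effect:+.2f}") # close to the true zero
```

This is exactly the Type I error mechanism the study attributes to gain-score analysis combined with baseline control-group superiority; ANCOVA remains the recommended analysis only when group equivalence is plausible.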
NASA Technical Reports Server (NTRS)
Sperry, S. L.
1982-01-01
The planning process for a statewide reclamation plan of Ohio abandoned minelands in response to the Federal Surface Mining Control and Reclamation Act of 1977 included: (1) the development of a screening and ranking methodology; (2) the establishment of a statewide review of major watersheds affected by mining; (3) the development of an immediate action process; and (4) a prototypical study of a priority watershed demonstrating the data collection, analysis, display and evaluation to be used for the remaining state watersheds. Historical methods for satisfying map information analysis and evaluation, as well as current methodologies being used were discussed. Various computer mapping and analysis programs were examined for their usability in evaluating the priority reclamation sites. Hand methods were chosen over automated procedures; intuitive evaluation was the primary reason.
Chen, Bi-Cang; Wu, Qiu-Ying; Xiang, Cheng-Bin; Zhou, Yi; Guo, Ling-Xiang; Zhao, Neng-Jiang; Yang, Shu-Yu
2006-01-01
To evaluate the quality of reports published in the recent 10 years in China about quantitative analysis of syndrome differentiation for diabetes mellitus (DM), in order to explore the methodological problems in these reports and find possible solutions. The main medical literature databases in China were searched. Thirty-one articles were included and evaluated by the principles of clinical epidemiology. There were many mistakes and deficiencies in these articles, concerning clinical trial design, diagnostic criteria for DM, standards of syndrome differentiation of DM, case inclusion and exclusion criteria, sample size estimation, data comparability and statistical methods. It is necessary and important to improve the quality of reports concerning quantitative analysis of syndrome differentiation of DM in light of the principles of clinical epidemiology.
Methodological issues in studies of air pollution and reproductive health.
Woodruff, Tracey J; Parker, Jennifer D; Darrow, Lyndsey A; Slama, Rémy; Bell, Michelle L; Choi, Hyunok; Glinianaia, Svetlana; Hoggatt, Katherine J; Karr, Catherine J; Lobdell, Danelle T; Wilhelm, Michelle
2009-04-01
In the past decade there have been an increasing number of scientific studies describing possible effects of air pollution on perinatal health. These papers have mostly focused on commonly monitored air pollutants, primarily ozone (O(3)), particulate matter (PM), sulfur dioxide (SO(2)), carbon monoxide (CO), and nitrogen dioxide (NO(2)), and various indices of perinatal health, including fetal growth, pregnancy duration, and infant mortality. While most published studies have found some marker of air pollution related to some types of perinatal outcomes, variability exists in the nature of the pollutants and outcomes associated. Synthesis of the findings has been difficult for various reasons, including differences in study design and analysis. A workshop was held in September 2007 to discuss methodological differences in the published studies as a basis for understanding differences in study findings and to identify priorities for future research, including novel approaches for existing data. Four broad topic areas were considered: confounding and effect modification, spatial and temporal exposure variations, vulnerable windows of exposure, and multiple pollutants. Here we present a synopsis of the methodological issues and challenges in each area and make recommendations for future study. Two key recommendations include: (1) parallel analyses of existing data sets using a standardized methodological approach to disentangle true differences in associations from methodological differences among studies; and (2) identification of animal studies to inform important mechanistic research gaps. This work is of critical public health importance because of widespread exposure and because perinatal outcomes are important markers of future child and adult health.
ERIC Educational Resources Information Center
Qin, Jian; Jurisica, Igor; Liddy, Elizabeth D.; Jansen, Bernard J; Spink, Amanda; Priss, Uta; Norton, Melanie J.
2000-01-01
These six articles discuss knowledge discovery in databases (KDD). Topics include data mining; knowledge management systems; applications of knowledge discovery; text and Web mining; text mining and information retrieval; user search patterns through Web log analysis; concept analysis; data collection; and data structure inconsistency. (LRW)
Liu, Jian-ping; Xia, Yun
2007-04-01
To critically assess the quality of literature about systematic reviews or meta-analyses on traditional Chinese medicine (TCM) published in Chinese journals. Electronic searches in the CNKI, VIP and Wanfang databases were conducted to retrieve systematic review or meta-analysis reports on TCM, including herbal medicine, needling, acupuncture and moxibustion, as well as integrative medicine. These were identified and extracted according to the 18 items of the QUOROM (quality of reporting of meta-analyses) statement and related information. The appraisal used indexes mainly including objectives, source of data, methods of data extraction, quality assessment of the included studies, measurement data synthesis, etc. Eighty-two systematic reviews were identified; 6 reviews were excluded because they were published repeatedly or did not comply with the inclusion criteria, leaving 76 reviews concerning 51 kinds of diseases for appraisal. Among them, 70 reviews evaluated the efficacy of TCM, mainly Chinese herbs, and 9 evaluated acupuncture and moxibustion. In the majority of the reviews, randomised controlled trials were included and the data sources were described, but in 26 reviews only Chinese databases were searched, and the descriptions of data extraction and analysis methods were too simple; 70% of reviews assessed the quality of the included studies; none used a flow chart to show the process of selection, inclusion and exclusion of studies. Few reviews or meta-analysis reports reached the international standard, and the methodology for conducting the systematic reviews is insufficiently described, so they can hardly be reproduced. The authors suggest that advanced methodological training is necessary for reviewers.
ERIC Educational Resources Information Center
Kim, Kyung Hi
2014-01-01
This research, based on a case study of vulnerable children in Korea, used a mixed methods transformative approach to explore strategies to support and help disadvantaged children. The methodological approach includes three phases: a mixed methods contextual analysis, a qualitative dominant analysis based on Sen's capability approach and critical…
ALS rocket engine combustion devices design and demonstration
NASA Technical Reports Server (NTRS)
Arreguin, Steve
1989-01-01
Work performed during Phase One is summarized, and the significant technical and programmatic accomplishments occurring during this period are documented. Besides a summary of the results, methodologies, trade studies, design, fabrication, and hardware conditions, the following are included: the evolving Maintainability Plan, the Reliability Program Plan, the Failure Summary and Analysis Report, and the Failure Mode and Effect Analysis.
Operation Team Spirit: Program Review and Analysis
2009-06-01
research was strictly qualitative in nature. The specific method being used was exploratory case study analysis... study, method is used for the specific acts of conducting research, while methodology refers to the qualitative nature of research performed. While...the researcher. According to Leedy and Ormrod (2005), some of the types of methods and their respective purposes include: • Case study:
Near ground level sensing for spatial analysis of vegetation
NASA Technical Reports Server (NTRS)
Sauer, Tom; Rasure, John; Gage, Charlie
1991-01-01
Measured changes in vegetation indicate the dynamics of ecological processes and can identify the impacts from disturbances. Traditional methods of vegetation analysis tend to be slow because they are labor intensive; as a result, these methods are often confined to small local area measurements. Scientists need new algorithms and instruments that will allow them to efficiently study environmental dynamics across a range of different spatial scales. A new methodology that addresses this problem is presented. This methodology includes the acquisition, processing, and presentation of near ground level image data and its corresponding spatial characteristics. The systematic approach taken encompasses a feature extraction process, a supervised and unsupervised classification process, and a region labeling process yielding spatial information.
On Designing Multicore-Aware Simulators for Systems Biology Endowed with OnLine Statistics
Calcagno, Cristina; Coppo, Mario
2014-01-01
This paper discusses enabling methodologies for the design of a fully parallel, online, interactive tool aiming to support bioinformatics scientists. In particular, the features of these methodologies, supported by the FastFlow parallel programming framework, are shown on a simulation tool that performs the modeling, the tuning, and the sensitivity analysis of stochastic biological models. A stochastic simulation needs thousands of independent simulation trajectories, which turn into big data that should be analysed with statistical and data mining tools. In the considered approach the two stages are pipelined in such a way that the simulation stage streams out the partial results of all simulation trajectories to the analysis stage, which immediately produces a partial result. The simulation-analysis workflow is validated for performance and for the effectiveness of the online analysis in capturing biological systems' behavior, on a multicore platform and on representative proof-of-concept biological systems. The exploited methodologies include pattern-based parallel programming and data streaming, which provide key features to software designers such as performance portability and efficient in-memory (big) data management and movement. Two paradigmatic classes of biological systems, exhibiting multistable and oscillatory behavior, are used as a testbed. PMID:25050327
NASA Technical Reports Server (NTRS)
Morel, T.; Kerlbar, R.; Fort, E. F.; Blumberg, P. N.
1985-01-01
This report describes work done during Phase 2 of a 3-year program aimed at developing a comprehensive heat transfer and thermal analysis methodology for the design analysis of insulated diesel engines. The overall program addresses all the key heat transfer issues: (1) spatially and time-resolved convective and radiative in-cylinder heat transfer, (2) steady-state conduction in the overall structure, and (3) cyclical and load/speed temperature transients in the engine structure. During Phase 2, a radiation heat transfer model was developed that accounts for soot formation and burn-up. A methodology was developed for carrying out the multi-dimensional finite-element heat conduction calculations within the framework of thermodynamic cycle codes. Studies were carried out using the integrated methodology to address key issues in low heat rejection engines. A wide-ranging design analysis matrix was covered, including a variety of insulation strategies, recovery devices and base engine configurations. A single-cylinder Cummins engine was installed at Purdue University and brought to full operational status. The development of instrumentation continued, concentrating on a radiation heat flux detector, a total heat flux probe, and accurate pressure-crank angle data acquisition.
On designing multicore-aware simulators for systems biology endowed with OnLine statistics.
Aldinucci, Marco; Calcagno, Cristina; Coppo, Mario; Damiani, Ferruccio; Drocco, Maurizio; Sciacca, Eva; Spinella, Salvatore; Torquati, Massimo; Troina, Angelo
2014-01-01
This paper discusses enabling methodologies for the design of a fully parallel, online, interactive tool aiming to support bioinformatics scientists. In particular, the features of these methodologies, supported by the FastFlow parallel programming framework, are shown on a simulation tool that performs the modeling, the tuning, and the sensitivity analysis of stochastic biological models. A stochastic simulation needs thousands of independent simulation trajectories, which turn into big data that should be analysed with statistical and data mining tools. In the considered approach the two stages are pipelined in such a way that the simulation stage streams out the partial results of all simulation trajectories to the analysis stage, which immediately produces a partial result. The simulation-analysis workflow is validated for performance and for the effectiveness of the online analysis in capturing biological systems' behavior, on a multicore platform and on representative proof-of-concept biological systems. The exploited methodologies include pattern-based parallel programming and data streaming, which provide key features to software designers such as performance portability and efficient in-memory (big) data management and movement. Two paradigmatic classes of biological systems, exhibiting multistable and oscillatory behavior, are used as a testbed.
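The pipelined simulation-analysis scheme described above can be sketched with plain Python threads: the simulation stage streams trajectory points to an analysis stage that keeps running (online) statistics rather than storing the full data. This is a toy illustration, not the FastFlow framework or the authors' simulator.

```python
# Hedged sketch of a two-stage pipeline: a simulation stage streams
# points of toy stochastic trajectories through a bounded queue, and
# an analysis stage consumes them with Welford's online mean/variance.
import math
import queue
import random
import threading

q: queue.Queue = queue.Queue(maxsize=64)
STOP = object()                      # sentinel ending the stream

def simulate(n_traj=50, n_steps=100):
    rnd = random.Random(42)
    for _ in range(n_traj):
        x = 0.0
        for t in range(n_steps):
            x += rnd.gauss(0, 1)     # toy stochastic trajectory step
            q.put((t, x))            # stream the partial result downstream
    q.put(STOP)

class OnlineStats:                   # Welford's running mean/variance
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0
    def push(self, v):
        self.n += 1
        d = v - self.mean
        self.mean += d / self.n
        self.m2 += d * (v - self.mean)
    @property
    def std(self):
        return math.sqrt(self.m2 / self.n) if self.n else 0.0

stats = OnlineStats()
def analyse():
    while (item := q.get()) is not STOP:
        stats.push(item[1])          # partial statistics, updated per point

t1 = threading.Thread(target=simulate)
t2 = threading.Thread(target=analyse)
t1.start(); t2.start(); t1.join(); t2.join()
print(f"streamed {stats.n} points: mean={stats.mean:.2f}, std={stats.std:.2f}")
```

The bounded queue gives backpressure between the stages, and the online statistics keep memory constant regardless of how many trajectories are streamed, which is the in-memory big-data management point the abstract makes.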
Low-Temperature Hydrothermal Resource Potential Estimate
Katherine Young
2016-06-30
Compilation of data (spreadsheet and shapefiles) for several low-temperature resource types, including isolated springs and wells, delineated area convection systems, sedimentary basins and coastal plains sedimentary systems. For each system, we include estimates of the accessible resource base, mean extractable resource and beneficial heat. Data compiled from USGS and other sources. The paper (submitted to GRC 2016) describing the methodology and analysis is also included.
Flap-lag-torsional dynamics of helicopter rotor blades in forward flight
NASA Technical Reports Server (NTRS)
Crespodasilva, M. R. M.
1986-01-01
A perturbation/numerical methodology for analyzing the flap-lead/lag motion of a centrally hinged, spring-restrained rotor blade, valid both for hover and for forward flight, was developed. The derivation of the nonlinear differential equations of motion and the analysis of the stability of the steady-state response of the blade were conducted entirely on a Symbolics 3670 machine, using MACSYMA to perform all the lengthy symbolic manipulations, including generation of the Fortran codes and plots of the results. Floquet theory was also applied to the differential equations of motion in order to compare results with those obtained from the perturbation analysis. The results obtained from the perturbation methodology and from Floquet theory were found to be very close to each other, which demonstrates the usefulness of the perturbation methodology. Another problem under study consisted of the analysis of the influence of higher-order terms on the response and stability of a flexible rotor blade in forward flight, using computerized symbolic manipulation and a perturbation technique to bypass Floquet theory. The derivation of the partial differential equations of motion is presented.
Methodology for Uncertainty Analysis of Dynamic Computational Toxicology Models
The task of quantifying the uncertainty in both parameter estimates and model predictions has become more important with the increased use of dynamic computational toxicology models by the EPA. Dynamic toxicological models include physiologically-based pharmacokinetic (PBPK) mode...
30 CFR 780.21 - Hydrologic information.
Code of Federal Regulations, 2010 CFR
2010-07-01
... contain information on water availability and alternative water sources, including the suitability of...) flooding or streamflow alteration; (D) ground water and surface water availability; and (E) other... Hydrologic information. (a) Sampling and analysis methodology. All water-quality analyses performed to meet...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Price, S.M.
1997-04-30
This chapter provides information on the physical, chemical, and biological characteristics of the waste stored at the 616 NRDWSF. A waste analysis plan is included that describes the methodology used for determining waste types.
2016-06-01
characteristics, experimental design techniques, and analysis methodologies that distinguish each phase of the MBSE MEASA. To ensure consistency... methodology. Experimental design selection, simulation analysis, and trade space analysis support the final two stages. Figure 27 segments the MBSE MEASA...rounding has the potential to increase the correlation between columns of the experimental design matrix. The design methodology presented in Vieira
Roetzheim, Richard G.; Freund, Karen M.; Corle, Don K.; Murray, David M.; Snyder, Frederick R.; Kronman, Andrea C.; Jean-Pierre, Pascal; Raich, Peter C.; Holden, Alan E. C.; Darnell, Julie S.; Warren-Mears, Victoria; Patierno, Steven; PNRP Design and Analysis Committee
2013-01-01
Background: The Patient Navigation Research Program (PNRP) is a cooperative effort of nine research projects, each employing its own unique study design. To evaluate projects such as PNRP, it is desirable to perform a pooled analysis to increase power relative to the individual projects. There is no agreed-upon prospective methodology, however, for analyzing combined data arising from different study designs. Expert opinions were thus solicited from members of the PNRP Design and Analysis Committee. Purpose: To review possible methodologies for analyzing combined data arising from heterogeneous study designs. Methods: The Design and Analysis Committee critically reviewed the pros and cons of five potential methods for analyzing combined PNRP project data. Conclusions were based on simple consensus. The five approaches reviewed included: 1) analyzing and reporting each project separately; 2) combining data from all projects and performing an individual-level analysis; 3) pooling data from projects having similar study designs; 4) analyzing pooled data using a prospective meta-analytic technique; 5) analyzing pooled data utilizing a novel simulated group-randomized design. Results: Methodologies varied in their ability to incorporate data from all PNRP projects, to appropriately account for differing study designs, and in their impact from differing project sample sizes. Limitations: The conclusions reached were based on expert opinion and not derived from actual analyses performed. Conclusions: The ability to analyze pooled data arising from differing study designs may provide pertinent information to inform programmatic, budgetary, and policy perspectives. Multi-site community-based research may not lend itself well to the more stringent explanatory and pragmatic standards of a randomized controlled trial design.
Given our growing interest in community-based population research, the challenges inherent in the analysis of heterogeneous study design are likely to become more salient. Discussion of the analytic issues faced by the PNRP and the methodological approaches we considered may be of value to other prospective community-based research programs. PMID:22273587
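Of the five approaches reviewed, the prospective meta-analytic technique (approach 4) lends itself to a compact illustration. The sketch below shows inverse-variance fixed-effect pooling of per-project effect estimates; the numbers are hypothetical and are not PNRP data.

```python
import math

def pool_fixed_effect(estimates, std_errors):
    """Inverse-variance (fixed-effect) pooling of per-project effect estimates."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * est for w, est in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Hypothetical per-project effect estimates (e.g. log odds ratios) and SEs
estimates = [0.42, 0.30, 0.55]
std_errors = [0.15, 0.20, 0.25]
pooled, se = pool_fixed_effect(estimates, std_errors)
print(f"pooled estimate = {pooled:.3f} (SE {se:.3f})")
```

Note how the pooled standard error is smaller than any single project's, which is the power gain that motivates pooling in the first place.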
Lindberg, Elisabeth; Österberg, Sofia A; Hörberg, Ulrica
2016-01-01
Phenomena in caring science are often complex and laden with meanings. Empirical research with the aim of capturing lived experiences is one way of revealing the complexity. Sometimes, however, results from empirical research need to be further discussed. One way is to further abstract the result and/or philosophically examine it. This has previously been performed and presented in scientific journals and doctoral theses, contributing to a greater understanding of phenomena in caring science. Although the intentions in many of these publications are laudable, the lack of methodological descriptions as well as a theoretical and systematic foundation can contribute to an ambiguity concerning how the results have emerged during the analysis. The aim of this paper is to describe methodological support for the further abstraction and/or philosophical examination of empirical findings. When trying to systematize the support procedures, we have used a reflective lifeworld research (RLR) approach. Based on the assumptions in RLR, this article will present methodological support for a theoretical examination that can include two stages. In the first stage, data from several (two or more) empirical results on an essential level are synthesized into a general structure. Sometimes the analysis ends with the general structure, but sometimes there is a need to proceed further. The second stage can then be a philosophical examination, in which the general structure is discussed in relation to a philosophical text, theory, or concept. It is important that the theories are brought in as the final stage after the completion of the analysis. Core dimensions of the described methodological support are, in accordance with RLR, openness, bridling, and reflection. The methodological support cannot be understood as fixed stages, but rather as a guiding light in the search for further meanings.
"Women Made It a Home": Representations of Women in Social Studies
ERIC Educational Resources Information Center
Schmeichel, Mardi
2014-01-01
This article explores recently published P-12 social studies lesson plans that include women to examine how attending to women is "getting done" in the field and how the lessons represent women and women's experiences. Using discourse analysis methodologies, the author demonstrates that women have been included as topics in ways that do…
Sprenkle, Douglas H
2012-01-01
This article serves as an introduction to this third version of research reviews of couple and family therapy (CFT) that have appeared in this journal beginning in 1995. It also presents a methodological and substantive overview of research in couple and family therapy from about 2001/2002 to 2010/2011 (the period covered in this issue), while also making connections with previous research. The article introduces quantitative research reviews of family-based intervention research that appear in this issue on 10 substantive areas including conduct disorder/delinquency, drug abuse, childhood and adolescent disorders (not including the aforementioned), family psycho-education for major mental illness, alcoholism, couple distress, relationship education, affective disorders, interpersonal violence, and chronic illness. The paper also introduces the first qualitative research paper in this series, as well as a paper that highlights current methodologies in meta-analysis. The first part of this article rates the 10 content areas on 12 dimensions of methodological strength for quantitative research and makes generalizations about the state of quantitative methodology in CFT. The latter part of the paper summarizes and comments on the substantive findings in the 12 papers in this issue, as well as on the field as a whole. © 2012 American Association for Marriage and Family Therapy.
The instrumental genesis process in future primary teachers using Dynamic Geometry Software
NASA Astrophysics Data System (ADS)
Ruiz-López, Natalia
2018-05-01
This paper, which describes a study undertaken with pairs of future primary teachers using GeoGebra software to solve geometry problems, includes a brief literature review, the theoretical framework and methodology used. An analysis of the instrumental genesis process for a pair participating in the case study is also provided. This analysis addresses the techniques and types of dragging used, the obstacles to learning encountered, a description of the interaction between the pair and their interaction with the teacher, and the type of language used. Based on this analysis, possibilities and limitations of the instrumental genesis process are identified for the development of geometric competencies such as conjecture creation, property checking and problem researching. It is also suggested that the methodology used in the analysis of the problem solving process may be useful for those teachers and researchers who want to integrate Dynamic Geometry Software (DGS) in their classrooms.
Saturno-Hernández, Pedro J; Gutiérrez-Reyes, Juan Pablo; Vieyra-Romero, Waldo Ivan; Romero-Martínez, Martín; O'Shea-Cuevas, Gabriel Jaime; Lozano-Herrera, Javier; Tavera-Martínez, Sonia; Hernández-Ávila, Mauricio
2016-01-01
To describe the conceptual framework and methods for the implementation and analysis of the satisfaction survey of the Mexican System for Social Protection in Health. We analyze the methodological elements of the 2013, 2014 and 2015 surveys, including the instrument, sampling method and study design, conceptual framework, and characteristics and indicators of the analysis. The survey captures information on perceived quality and satisfaction. Sampling is representative at the national and state levels. Simple and composite indicators (index of satisfaction and rate of reported quality problems) are built and described. The analysis is completed using Pareto diagrams, correlation between indicators, and association with satisfaction by means of multivariate models. The measurement of satisfaction and perceived quality is a complex but necessary process to comply with regulations and to identify strategies for improvement. The described survey presents a rigorous design and analysis focused on its utility for quality improvement.
An Overview of R in Health Decision Sciences.
Jalal, Hawre; Pechlivanoglou, Petros; Krijkamp, Eline; Alarid-Escudero, Fernando; Enns, Eva; Hunink, M G Myriam
2017-10-01
As the complexity of health decision science applications increases, high-level programming languages are increasingly adopted for statistical analyses and numerical computations. These programming languages facilitate sophisticated modeling, model documentation, and analysis reproducibility. Among the high-level programming languages, the statistical programming framework R is gaining increased recognition. R is freely available, cross-platform compatible, and open source. It is supported by a large community of users who have generated an extensive collection of well-documented packages and functions. These functions facilitate applications of health decision science methodology as well as the visualization and communication of results. Although R's popularity is increasing among health decision scientists, methodological extensions of R in the field of decision analysis remain isolated. The purpose of this article is to provide an overview of existing R functionality that is applicable to the various stages of decision analysis, including model design, input parameter estimation, and analysis of model outputs.
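To give a flavor of the kind of model design such packages support, here is a minimal discrete-time Markov cohort model, a workhorse of health decision analysis, sketched in Python for illustration (the same logic translates directly to R); the states, transition probabilities, and utilities are hypothetical.

```python
def run_cohort(trans, utilities, cycles, start):
    """Propagate a cohort through a discrete-time Markov model, accruing QALYs.

    trans: row-stochastic transition matrix; utilities: per-cycle state utilities.
    """
    state = list(start)
    total_qalys = 0.0
    for _ in range(cycles):
        total_qalys += sum(p * u for p, u in zip(state, utilities))
        state = [sum(state[i] * trans[i][j] for i in range(len(state)))
                 for j in range(len(state))]
    return total_qalys, state

# Hypothetical 3-state model: well, sick, dead
trans = [[0.85, 0.10, 0.05],
         [0.00, 0.80, 0.20],
         [0.00, 0.00, 1.00]]
utilities = [1.0, 0.6, 0.0]
qalys, final = run_cohort(trans, utilities, cycles=10, start=[1.0, 0.0, 0.0])
print(f"expected QALYs over 10 cycles: {qalys:.2f}")
```

In practice one would add discounting, costs, and probabilistic sensitivity analysis, which is exactly where the package ecosystem the article surveys comes in.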
NASA Technical Reports Server (NTRS)
Melcher, Kevin J.; Cruz, Jose A.; Johnson, Stephen B.; Lo, Yunnhon
2015-01-01
This paper describes a quantitative methodology for bounding the false positive (FP) and false negative (FN) probabilities associated with a human-rated launch vehicle abort trigger (AT) that includes sensor data qualification (SDQ). In this context, an AT is a hardware and software mechanism designed to detect the existence of a specific abort condition. Also, SDQ is an algorithmic approach used to identify sensor data suspected of being corrupt so that suspect data does not adversely affect an AT's detection capability. The FP and FN methodologies presented here were developed to support estimation of the probabilities of loss of crew and loss of mission for the Space Launch System (SLS) which is being developed by the National Aeronautics and Space Administration (NASA). The paper provides a brief overview of system health management as being an extension of control theory; and describes how ATs and the calculation of FP and FN probabilities relate to this theory. The discussion leads to a detailed presentation of the FP and FN methodology and an example showing how the FP and FN calculations are performed. This detailed presentation includes a methodology for calculating the change in FP and FN probabilities that result from including SDQ in the AT architecture. To avoid proprietary and sensitive data issues, the example incorporates a mixture of open literature and fictitious reliability data. Results presented in the paper demonstrate the effectiveness of the approach in providing quantitative estimates that bound the probability of a FP or FN abort determination.
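The paper's specific FP/FN formulation is not reproduced here, but the flavor of such a bound can be illustrated with a simple redundancy calculation: the false-positive probability of an m-of-n voted abort trigger versus a single sensor, assuming independent sensors with a hypothetical per-interval false-positive rate (not SLS reliability data).

```python
from math import comb

def voting_prob(p, n, m):
    """P(at least m of n independent sensors report a condition),
    given each reports it with probability p (binomial tail)."""
    return sum(comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(m, n + 1))

# Hypothetical per-interval single-sensor false-positive probability
p_fp_sensor = 1e-3
fp_single = voting_prob(p_fp_sensor, 1, 1)  # single-sensor abort trigger
fp_voted = voting_prob(p_fp_sensor, 3, 2)   # 2-of-3 voted abort trigger
print(f"single-sensor FP: {fp_single:.1e}, 2-of-3 voted FP: {fp_voted:.1e}")
```

Voting drives the false-positive probability down by orders of magnitude; the trade-off, which the paper's FN methodology captures, is that voting can also mask a genuinely present abort condition.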
Ghimire, Santosh R; Johnston, John M
2017-09-01
We propose a modified eco-efficiency (EE) framework and novel sustainability analysis methodology for green infrastructure (GI) practices used in water resource management. Green infrastructure practices such as rainwater harvesting (RWH), rain gardens, porous pavements, and green roofs are emerging as viable strategies for climate change adaptation. The modified framework includes 4 economic, 11 environmental, and 3 social indicators. Using 6 indicators from the framework, at least 1 from each dimension of sustainability, we demonstrate the methodology to analyze RWH designs. We use life cycle assessment and life cycle cost assessment to calculate the sustainability indicators of 20 design configurations as Decision Management Objectives (DMOs). Five DMOs emerged as relatively more sustainable along the EE analysis Tradeoff Line, and we used Data Envelopment Analysis (DEA), a widely applied statistical approach, to quantify the modified EE measures as DMO sustainability scores. We also addressed the subjectivity and sensitivity analysis requirements of sustainability analysis, and we evaluated the performance of 10 weighting schemes that included classical DEA, equal weights, National Institute of Standards and Technology's stakeholder panel, Eco-Indicator 99, Sustainable Society Foundation's Sustainable Society Index, and 5 derived schemes. We improved upon classical DEA by applying the weighting schemes to identify sustainability scores that ranged from 0.18 to 1.0, avoiding the nonuniqueness problem and revealing the least to most sustainable DMOs. Our methodology provides a more comprehensive view of water resource management and is generally applicable to GI and industrial, environmental, and engineered systems to explore the sustainability space of alternative design configurations. Integr Environ Assess Manag 2017;13:821-831. Published 2017. This article is a US Government work and is in the public domain in the USA. 
Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of Society of Environmental Toxicology & Chemistry (SETAC).
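The authors' DEA formulation is more involved, but the role a weighting scheme plays in ranking design configurations can be sketched with a simplified weighted-sum score over normalized indicators; the DMO names, indicator values, and equal-weights scheme below are hypothetical.

```python
def sustainability_scores(indicators, weights):
    """Weighted-sum sustainability score per design configuration (DMO).

    indicators: {dmo: {indicator: normalized value in [0, 1], higher = better}}
    weights:    {indicator: weight}, assumed to sum to 1
    """
    return {dmo: sum(weights[k] * v for k, v in vals.items())
            for dmo, vals in indicators.items()}

# Hypothetical normalized indicators for three RWH design configurations
indicators = {
    "DMO-1": {"cost": 0.8, "energy": 0.6, "social": 0.5},
    "DMO-2": {"cost": 0.5, "energy": 0.9, "social": 0.7},
    "DMO-3": {"cost": 0.4, "energy": 0.5, "social": 0.9},
}
equal = {"cost": 1 / 3, "energy": 1 / 3, "social": 1 / 3}
scores = sustainability_scores(indicators, equal)
best = max(scores, key=scores.get)
print(best, round(scores[best], 3))
```

Re-running the scoring under several weighting schemes, as the paper does with ten, is what exposes how sensitive the "most sustainable" ranking is to the weights chosen.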
Reference values of elements in human hair: a systematic review.
Mikulewicz, Marcin; Chojnacka, Katarzyna; Gedrange, Thomas; Górecki, Henryk
2013-11-01
There is a lack of systematic review on reference values of elements in human hair that considers the methodological approach. The absence of universally accepted and implemented reference ranges means that hair mineral analysis has not yet become a reliable and useful method for assessing the nutritional status and exposure of individuals. Systematic review of reference values of elements in human hair. PubMed, ISI Web of Knowledge, Scopus. Humans, hair mineral analysis, elements or minerals, reference values, original studies. The number of studies screened and assessed for eligibility was 52. Eventually, 5 papers were included in the review. The studies report reference ranges for the content of elements in hair: macroelements, microelements, toxic elements and other elements. Reference ranges were elaborated for different populations in the years 2000-2012. The analytical methodology differed, in particular in sample preparation, digestion and analysis (ICP-AES, ICP-MS). Consequently, the levels of hair minerals reported as reference values varied. It is necessary to elaborate standard procedures, further validate hair mineral analysis, and deliver a detailed methodology. Only then would it be possible to provide meaningful reference ranges and take advantage of the potential that lies in hair mineral analysis as a medical diagnostic technique. Copyright © 2013 Elsevier B.V. All rights reserved.
2008-12-01
...on Investment (ROI) of the Zephyr system. This is achieved by ( 1 ) Developing a model to carry out Business Case Analysis (BCA) of JCTDs, including
The coordinate-based meta-analysis of neuroimaging data.
Samartsidis, Pantelis; Montagna, Silvia; Nichols, Thomas E; Johnson, Timothy D
2017-01-01
Neuroimaging meta-analysis is an area of growing interest in statistics. The special characteristics of neuroimaging data render classical meta-analysis methods inapplicable and therefore new methods have been developed. We review existing methodologies, explaining the benefits and drawbacks of each. A demonstration on a real dataset of emotion studies is included. We discuss some still-open problems in the field to highlight the need for future research.
1990-01-01
Various techniques were used to decipher the sedimentation history of Site 765, including Markov chain analysis of facies transitions, XRD analysis of clay and other minerals, and multivariate analysis of smear-slide data, in addition to the standard descriptive procedures employed by the shipboard sedimentologist. This chapter presents brief summaries of methodology and major findings of these three techniques, a summary of the sedimentation history, and a discussion of trends in sedimentation through time.
Ozone data and mission sampling analysis
NASA Technical Reports Server (NTRS)
Robbins, J. L.
1980-01-01
A methodology was developed to analyze discrete data obtained from the global distribution of ozone. Statistical analysis techniques were applied to describe the distribution of data variance in terms of empirical orthogonal functions and components of spherical harmonic models. The effects of uneven data distribution and missing data were considered. Data fill based on the autocorrelation structure of the data is described. Computer coding of the analysis techniques is included.
A methodological review of qualitative case study methodology in midwifery research.
Atchan, Marjorie; Davis, Deborah; Foureur, Maralyn
2016-10-01
To explore the use and application of case study research in midwifery. Case study research provides rich data for the analysis of complex issues and interventions in the healthcare disciplines; however, a gap in the midwifery research literature was identified. A methodological review of midwifery case study research using recognized templates, frameworks and reporting guidelines facilitated comprehensive analysis. An electronic database search using the date range January 2005-December 2014: Maternal and Infant Care, CINAHL Plus, Academic Search Complete, Web of Knowledge, SCOPUS, Medline, Health Collection (Informit), Cochrane Library Health Source: Nursing/Academic Edition, Wiley Online and ProQuest Central. Narrative evaluation was undertaken. Clearly worded questions reflected the problem and purpose. The application, strengths and limitations of case study methods were identified through a quality appraisal process. The review identified both case study research's applicability to midwifery and its low uptake, especially in clinical studies. Many papers included the necessary criteria to achieve rigour. The included measures of authenticity and methodology varied. A high standard of authenticity was observed, suggesting authors considered these elements to be routine inclusions. Technical aspects were lacking in many papers, namely a lack of reflexivity and incomplete transparency of processes. This review raises the profile of case study research in midwifery. Midwives will be encouraged to explore whether case study research is suitable for their investigation. The raised profile will demonstrate further applicability and encourage support and wider adoption in the midwifery setting. © 2016 John Wiley & Sons Ltd.
Lee, Andy H; Zhou, Xu; Kang, Deying; Luo, Yanan; Liu, Jiali; Sun, Xin
2018-01-01
Objective: To assess risk of bias and to investigate methodological issues concerning the design, conduct and analysis of randomised controlled trials (RCTs) testing acupuncture for knee osteoarthritis (KOA). Methods: PubMed, EMBASE, the Cochrane Central Register of Controlled Trials and four major Chinese databases were searched for RCTs that investigated the effect of acupuncture for KOA. The Cochrane tool was used to examine the risk of bias of eligible RCTs. Their methodological details were examined using a standardised and pilot-tested questionnaire of 48 items, together with the association between four predefined factors and important methodological quality indicators. Results: A total of 248 RCTs were eligible, of which 39 (15.7%) used a computer-generated randomisation sequence. Of the 31 (12.5%) trials that stated the allocation concealment, only one used central randomisation. Twenty-five (10.1%) trials mentioned that their acupuncture procedures were standardised, but only 18 (7.3%) specified how the standardisation was achieved. The great majority of trials (n=233, 94%) stated that blinding was in place, but 204 (87.6%) did not clarify who was blinded. Only 27 (10.9%) trials specified the primary outcome, of which 7 used intention-to-treat analysis. Only 17 (6.9%) trials included details on sample size calculation; none preplanned an interim analysis and associated stopping rule. In total, 46 (18.5%) trials explicitly stated that loss to follow-up occurred, but only 6 (2.4%) provided some information on how the issue was handled. No trials prespecified, conducted or reported any subgroup or adjusted analysis for the primary outcome. Conclusion: The overall risk of bias was high among published RCTs testing acupuncture for KOA. Methodological limitations were present in many important aspects of design, conduct and analyses. These findings inform the development of evidence-based methodological guidance for future trials assessing the effect of acupuncture for KOA. 
PMID:29511016
Penn, Alexandra S.; Knight, Christopher J. K.; Lloyd, David J. B.; Avitabile, Daniele; Kok, Kasper; Schiller, Frank; Woodward, Amy; Druckman, Angela; Basson, Lauren
2013-01-01
Fuzzy Cognitive Mapping (FCM) is a widely used participatory modelling methodology in which stakeholders collaboratively develop a ‘cognitive map’ (a weighted, directed graph), representing the perceived causal structure of their system. This can be directly transformed by a workshop facilitator into simple mathematical models to be interrogated by participants by the end of the session. Such simple models provide thinking tools which can be used for discussion and exploration of complex issues, as well as sense checking the implications of suggested causal links. They increase stakeholder motivation and understanding of whole systems approaches, but cannot be separated from an intersubjective participatory context. Standard FCM methodologies make simplifying assumptions, which may strongly influence results, presenting particular challenges and opportunities. We report on a participatory process, involving local companies and organisations, focussing on the development of a bio-based economy in the Humber region. The initial cognitive map generated consisted of factors considered key for the development of the regional bio-based economy and their directional, weighted, causal interconnections. A verification and scenario generation procedure, to check the structure of the map and suggest modifications, was carried out with a second session. Participants agreed on updates to the original map and described two alternate potential causal structures. In a novel analysis all map structures were tested using two standard methodologies usually used independently: linear and sigmoidal FCMs, demonstrating some significantly different results alongside some broad similarities. We suggest a development of FCM methodology involving a sensitivity analysis with different mappings and discuss the use of this technique in the context of our case study. 
Using the results and analysis of our process, we discuss the limitations and benefits of the FCM methodology in this case and in general. We conclude by proposing an extended FCM methodology, including multiple functional mappings within one participant-constructed graph. PMID:24244303
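The contrast between the two standard FCM mappings tested in the study can be sketched directly: the same weighted digraph is iterated under a sigmoidal squashing function and under a clipped linear one. The 3-factor map and its weights below are hypothetical, not the Humber workshop map.

```python
import math

def fcm_step(state, weights, squash):
    """One synchronous update of a fuzzy cognitive map.

    weights[i][j] is the causal weight of factor i on factor j.
    """
    n = len(state)
    return [squash(sum(state[i] * weights[i][j] for i in range(n)))
            for j in range(n)]

def iterate(state, weights, squash, steps=50):
    for _ in range(steps):
        state = fcm_step(state, weights, squash)
    return state

sigmoid = lambda x: 1.0 / (1.0 + math.exp(-x))
linear = lambda x: max(0.0, min(1.0, x))  # clipped linear mapping

# Hypothetical 3-factor map (e.g. feedstock supply, investment, growth)
W = [[0.0, 0.4, 0.6],
     [0.3, 0.0, 0.5],
     [0.2, 0.3, 0.0]]
start = [0.5, 0.5, 0.5]
print("sigmoidal:", [round(v, 3) for v in iterate(start, W, sigmoid)])
print("linear:   ", [round(v, 3) for v in iterate(start, W, linear)])
```

The two mappings can settle on quite different fixed points for the same map, which is exactly the sensitivity the proposed extended methodology is meant to surface.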
Application of damage tolerance methodology in certification of the Piaggio P-180 Avanti
NASA Technical Reports Server (NTRS)
Johnson, Jerry
1992-01-01
The Piaggio P-180 Avanti, a twin pusher-prop engine nine-passenger business aircraft was certified in 1990, to the requirements of FAR Part 23 and Associated Special Conditions for Composite Structure. Certification included the application of a damage tolerant methodology to the design of the composite forward wing and empennage (vertical fin, horizontal stabilizer, tailcone, and rudder) structure. This methodology included an extensive analytical evaluation coupled with sub-component and full-scale testing of the structure. The work from the Damage Tolerance Analysis Assessment was incorporated into the full-scale testing. Damage representing hazards such as dropped tools, ground equipment, handling, and runway debris, was applied to the test articles. Additional substantiation included allowing manufacturing discrepancies to exist unrepaired on the full-scale articles and simulated bondline failures in critical elements. The importance of full-scale testing in the critical environmental conditions and the application of critical damage are addressed. The implication of damage tolerance on static and fatigue testing is discussed. Good correlation between finite element solutions and experimental test data was observed.
Pluye, Pierre; Gagnon, Marie-Pierre; Griffiths, Frances; Johnson-Lafleur, Janique
2009-04-01
A new form of literature review has emerged, the Mixed Studies Review (MSR). These reviews include qualitative, quantitative and mixed methods studies. In the present paper, we examine MSRs in health sciences and provide guidance on processes that should be included and reported. However, there are no valid and usable criteria for concomitantly appraising the methodological quality of qualitative, quantitative and mixed methods studies. The objective is to propose criteria for concomitantly appraising the methodological quality of qualitative, quantitative and mixed methods studies or study components. A three-step critical review was conducted. In total, 2,322 references were identified in MEDLINE, and their titles and abstracts were screened; 149 potentially relevant references were selected and the full-text papers were examined; 59 MSRs were retained and scrutinized using a deductive-inductive qualitative thematic data analysis. This revealed three types of MSR: convenience, reproducible, and systematic. Guided by a proposal, we conducted a qualitative thematic data analysis of the quality appraisal procedures used in the 17 systematic MSRs (SMSRs). Of the 17 SMSRs, 12 showed clear quality appraisal procedures with explicit criteria, but no SMSR used valid checklists to concomitantly appraise qualitative, quantitative and mixed methods studies. In two SMSRs, criteria were developed following a specific procedure. Checklists usually contained more criteria than needed. In four SMSRs, a reliability assessment was described or mentioned. While criteria for quality appraisal were usually based on descriptors that require specific methodological expertise (e.g., appropriateness), no SMSR described the fit between reviewers' expertise and the appraised studies. Quality appraisal usually resulted in studies being ranked by methodological quality. A scoring system is proposed for concomitantly appraising the methodological quality of qualitative, quantitative and mixed methods studies for SMSRs. 
This scoring system may also be used to appraise the methodological quality of qualitative, quantitative and mixed methods components of mixed methods research.
GSuite HyperBrowser: integrative analysis of dataset collections across the genome and epigenome.
Simovski, Boris; Vodák, Daniel; Gundersen, Sveinung; Domanska, Diana; Azab, Abdulrahman; Holden, Lars; Holden, Marit; Grytten, Ivar; Rand, Knut; Drabløs, Finn; Johansen, Morten; Mora, Antonio; Lund-Andersen, Christin; Fromm, Bastian; Eskeland, Ragnhild; Gabrielsen, Odd Stokke; Ferkingstad, Egil; Nakken, Sigve; Bengtsen, Mads; Nederbragt, Alexander Johan; Thorarensen, Hildur Sif; Akse, Johannes Andreas; Glad, Ingrid; Hovig, Eivind; Sandve, Geir Kjetil
2017-07-01
Recent large-scale undertakings such as ENCODE and Roadmap Epigenomics have generated experimental data mapped to the human reference genome (as genomic tracks) representing a variety of functional elements across a large number of cell types. Despite the high potential value of these publicly available data for a broad variety of investigations, little attention has been given to the analytical methodology necessary for their widespread utilisation. We here present a first principled treatment of the analysis of collections of genomic tracks. We have developed novel computational and statistical methodology to permit comparative and confirmatory analyses across multiple and disparate data sources. We delineate a set of generic questions that are useful across a broad range of investigations and discuss the implications of choosing different statistical measures and null models. Examples include contrasting analyses across different tissues or diseases. The methodology has been implemented in a comprehensive open-source software system, the GSuite HyperBrowser. To make the functionality accessible to biologists, and to facilitate reproducible analysis, we have also developed a web-based interface providing an expertly guided and customizable way of utilizing the methodology. With this system, many novel biological questions can flexibly be posed and rapidly answered. Through a combination of streamlined data acquisition, interoperable representation of dataset collections, and customizable statistical analysis with guided setup and interpretation, the GSuite HyperBrowser represents a first comprehensive solution for integrative analysis of track collections across the genome and epigenome. The software is available at: https://hyperbrowser.uio.no. © The Author 2017. Published by Oxford University Press.
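The GSuite HyperBrowser's statistical machinery is far richer, but the core idea of comparing observed overlap between genomic tracks against a randomized null model can be sketched as a simple Monte Carlo permutation test; the intervals, genome length, and relocation null model below are illustrative assumptions, not the system's actual null models.

```python
import random

def overlap_count(a, b):
    """Number of intervals in track a that overlap any interval in track b."""
    return sum(any(s1 < e2 and s2 < e1 for s2, e2 in b) for s1, e1 in a)

def permutation_test(a, b, genome_len, n_perm=200, seed=0):
    """Monte Carlo null: relocate track a uniformly at random, recount overlaps."""
    rng = random.Random(seed)
    observed = overlap_count(a, b)
    exceed = 0
    for _ in range(n_perm):
        shuffled = []
        for s, e in a:
            length = e - s
            ns = rng.randrange(genome_len - length)
            shuffled.append((ns, ns + length))
        if overlap_count(shuffled, b) >= observed:
            exceed += 1
    return observed, (exceed + 1) / (n_perm + 1)

# Hypothetical query and reference tracks on a 1 Mb genome
a = [(100, 200), (300, 400), (500, 600)]
b = [(150, 250), (350, 450), (550, 650)]
obs, p = permutation_test(a, b, genome_len=1_000_000)
print(f"observed overlaps: {obs}, empirical p = {p:.3f}")
```

Choosing the null model (preserve interval lengths? inter-interval gaps? chromosome structure?) is precisely the methodological question the paper's "different statistical measures and null models" discussion addresses.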
Östlund, Ulrika; Kidd, Lisa; Wengström, Yvonne; Rowa-Dewar, Neneh
2011-03-01
It has been argued that mixed methods research can be useful in nursing and health science because of the complexity of the phenomena studied. However, the integration of qualitative and quantitative approaches continues to be the subject of much debate, and there is a need for a rigorous framework for designing and interpreting mixed methods research. This paper explores the analytical approaches (i.e. parallel, concurrent or sequential) used in mixed methods studies within healthcare and exemplifies the use of triangulation as a methodological metaphor for drawing inferences from qualitative and quantitative findings originating from such analyses. This review of the literature used systematic principles in searching CINAHL, Medline and PsycINFO for healthcare research studies which employed a mixed methods approach and were published in the English language between January 1999 and September 2009. In total, 168 studies were included in the results. Most studies originated in the United States of America (USA), the United Kingdom (UK) and Canada. The analytic approach most widely used was parallel data analysis. A number of studies used sequential data analysis; far fewer studies employed concurrent data analysis. Very few of these studies clearly articulated the purpose for using a mixed methods design. The use of the methodological metaphor of triangulation on convergent, complementary, and divergent results from mixed methods studies is exemplified, and an example of developing theory from such data is provided. A trend for conducting parallel data analysis on quantitative and qualitative data in mixed methods healthcare research has been identified in the studies included in this review. Using triangulation as a methodological metaphor can facilitate the integration of qualitative and quantitative findings and help researchers to clarify their theoretical propositions and the basis of their results. 
This can offer a better understanding of the links between theory and empirical findings, challenge theoretical assumptions and develop new theory. Copyright © 2010 Elsevier Ltd. All rights reserved.
Rodríguez, M T Torres; Andrade, L Cristóbal; Bugallo, P M Bello; Long, J J Casares
2011-09-15
Life cycle thinking (LCT) is one of the philosophies that has recently appeared in the context of sustainable development. Some of the already existing tools and methods, as well as some of the recently emerged ones, which seek to understand, interpret and design the life of a product, can be included within the scope of the LCT philosophy. That is the case of material and energy flow analysis (MEFA), a tool derived from the industrial metabolism concept. This paper proposes a methodology combining MEFA with another technique derived from sustainable development which also fits the LCT philosophy, the BAT (best available techniques) analysis. This methodology, applied to an industrial process, seeks to identify so-called improvable flows by MEFA, so that the appropriate candidate BAT can be selected by BAT analysis. Material and energy inputs, outputs and internal flows are quantified, and sustainable solutions are provided on the basis of industrial metabolism. The methodology has been applied to an exemplary roof tile manufacturing plant for validation. Fourteen improvable flows were identified, and 7 candidate BAT were proposed with the aim of reducing these flows. The proposed methodology provides a way to detect improvable material or energy flows in a process and to select the most sustainable options for enhancing them. Solutions are proposed for the detected improvable flows, taking into account their effectiveness in improving such flows. Copyright © 2011 Elsevier B.V. All rights reserved.
DOT National Transportation Integrated Search
2013-12-01
In 1992, an Applicants Guide and a Reviewers Guide to Traffic Impact Analyses were developed for the Indiana Department of Transportation (INDOT) to standardize the methodologies for conducting traffic impact analyses (TIAs) in Indiana. The m...
DOT National Transportation Integrated Search
1996-04-01
This report also describes the procedures for direct estimation of intersection capacity with simulation, including a set of rigorous statistical tests for simulation parameter calibration from field data.
The Ohio River Basin energy facility siting model. Volume 1: Methodology
NASA Astrophysics Data System (ADS)
Fowler, G. L.; Bailey, R. E.; Gordon, S. I.; Jansen, S. D.; Randolph, J. C.; Jones, W. W.
1981-04-01
The siting model developed for ORBES is specifically designed for regional policy analysis. The region includes 423 counties in an area that consists of all of Kentucky and substantial portions of Illinois, Indiana, Ohio, Pennsylvania, and West Virginia.
Aeroelastic analysis for propellers - mathematical formulations and program user's manual
NASA Technical Reports Server (NTRS)
Bielawa, R. L.; Johnson, S. A.; Chi, R. M.; Gangwani, S. T.
1983-01-01
Mathematical development is presented for a specialized propeller-dedicated version of the G400 rotor aeroelastic analysis. The G400PROP analysis simulates aeroelastic characteristics particular to propellers, such as structural sweep, aerodynamic sweep and high subsonic unsteady airloads (both stalled and unstalled). Formulations are presented for these expanded propeller-related methodologies. Results of limited application of the analysis to realistic blade configurations and operating conditions, including stable and unstable stall flutter test conditions, are given. To enhance user efficiency and expand utilization, sections are included that describe: (1) the structuring of the G400PROP FORTRAN coding; (2) the required input data; and (3) the output results. General information to facilitate operation and improve efficiency is also provided.
Reviewing the methodology of an integrative review.
Hopia, Hanna; Latvala, Eila; Liimatainen, Leena
2016-12-01
Whittemore and Knafl's updated description of the methodological approach for integrative reviews was published in 2005. Since then, the five stages of the approach have been regularly used as a basic conceptual structure for the integrative reviews conducted by nursing researchers. However, this methodological approach is seldom examined from the perspective of how systematically and rigorously the stages are implemented in published integrative reviews. The aim here was to appraise selected integrative reviews on the basis of the methodological approach according to the five stages published by Whittemore and Knafl in 2005. A literature review was used in this study. CINAHL (Cumulative Index to Nursing and Allied Health), PubMed, OVID (Journals@Ovid) and the Cochrane Library databases were searched for integrative reviews published between 2002 and 2014. Papers were included if they used the methodological approach described by Whittemore and Knafl, were published in English and were focused on nursing education or nursing expertise. A total of 259 integrative review publications were identified for potential inclusion. Ten integrative reviews fulfilled the inclusion criteria. Findings from the studies were extracted and critically examined according to the five methodological stages. The reviews assessed followed the guidelines of the stated methodological approach to varying extents. The stages of literature search, data evaluation and data analysis were fairly poorly formulated and only partially implemented in the studies included in the sample. The other two stages, problem identification and presentation, followed those described in the methodological approach quite well. Increasing use of research in clinical practice is inevitable, and therefore, integrative reviews can play a greater role in developing evidence-based nursing practices. 
Because of this, nurse researchers should pay more attention to sound integrative nursing research to systematise the review process and make it more rigorous. © 2016 Nordic College of Caring Science.
A Screening Method for Assessing Cumulative Impacts
Alexeeff, George V.; Faust, John B.; August, Laura Meehan; Milanes, Carmen; Randles, Karen; Zeise, Lauren; Denton, Joan
2012-01-01
The California Environmental Protection Agency (Cal/EPA) Environmental Justice Action Plan calls for guidelines for evaluating “cumulative impacts.” As a first step toward such guidelines, a screening methodology for assessing cumulative impacts in communities was developed. The method, presented here, is based on the working definition of cumulative impacts adopted by Cal/EPA [1]: “Cumulative impacts means exposures, public health or environmental effects from the combined emissions and discharges in a geographic area, including environmental pollution from all sources, whether single or multi-media, routinely, accidentally, or otherwise released. Impacts will take into account sensitive populations and socio-economic factors, where applicable and to the extent data are available.” The screening methodology is built on this definition as well as current scientific understanding of environmental pollution and its adverse impacts on health, including the influence of both intrinsic biological factors and non-intrinsic socioeconomic factors in mediating the effects of pollutant exposures. It addresses disparities in the distribution of pollution and health outcomes. The methodology provides a science-based tool to screen places for relative cumulative impacts, incorporating both the pollution burden on a community (including exposures to pollutants and their public health and environmental effects) and community characteristics, specifically sensitivity and socioeconomic factors. The screening methodology provides relative rankings to distinguish more highly impacted communities from less impacted ones. It may also help identify which factors are the greatest contributors to a community’s cumulative impact. It is not designed to provide quantitative estimates of community-level health impacts. A pilot screening analysis is presented here to illustrate the application of this methodology. 
Once guidelines are adopted, the methodology can serve as a screening tool to help Cal/EPA programs prioritize their activities and target those communities with the greatest cumulative impacts. PMID:22470315
Simoens, Steven
2013-01-01
Objectives: This paper aims to assess the methodological quality of economic evaluations included in Belgian reimbursement applications for Class 1 drugs. Materials and Methods: For 19 reimbursement applications submitted during 2011 and Spring 2012, a descriptive analysis assessed the methodological quality of the economic evaluation, evaluated the assessment of that economic evaluation by the Drug Reimbursement Committee and the response to that assessment by the company. Compliance with methodological guidelines issued by the Belgian Healthcare Knowledge Centre was assessed using a detailed checklist of 23 methodological items. The rate of compliance was calculated based on the number of economic evaluations for which the item was applicable. Results: Economic evaluations tended to comply with guidelines regarding perspective, target population, subgroup analyses, comparator, use of comparative clinical data and final outcome measures, calculation of costs, incremental analysis, discounting and time horizon. However, more attention needs to be paid to the description of limitations of indirect comparisons, the choice of an appropriate analytic technique, the expression of unit costs in values for the current year, the estimation and valuation of outcomes, the presentation of results of sensitivity analyses, and testing the face validity of model inputs and outputs. Also, a large variation was observed in the scope and depth of the quality assessment by the Drug Reimbursement Committee. Conclusions: Although general guidelines exist, pharmaceutical companies and the Drug Reimbursement Committee would benefit from the existence of a more detailed checklist of methodological items that need to be reported in an economic evaluation. PMID:24386474
A Methodology for Loading the Advanced Test Reactor Driver Core for Experiment Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cowherd, Wilson M.; Nielsen, Joseph W.; Choe, Dong O.
In support of experiments in the ATR, a new methodology was devised for loading the ATR Driver Core. This methodology will replace the existing one used by the INL Neutronic Analysis group to analyze experiments. This paper presents the as-run analysis for ATR Cycle 152B, specifically comparing measured lobe powers and eigenvalue calculations.
Sikirzhytski, Vitali; Sikirzhytskaya, Aliaksandra; Lednev, Igor K
2012-10-10
Conventional confirmatory biochemical tests used in the forensic analysis of body fluid traces found at a crime scene are destructive and not universal. Recently, we reported on the application of near-infrared (NIR) Raman microspectroscopy for non-destructive confirmatory identification of pure blood, saliva, semen, vaginal fluid and sweat. Here we expand the method to include dry mixtures of semen and blood. A classification algorithm was developed for differentiating pure body fluids and their mixtures. The classification methodology is based on an effective combination of Support Vector Machine (SVM) regression (data selection) and SVM Discriminant Analysis of preprocessed experimental Raman spectra collected using automatic mapping of the sample. Extensive cross-validation of the obtained results demonstrated that the detection limit of the minor contributor is as low as a few percent. The developed methodology can be further expanded to any binary mixture of complex solutions, including but not limited to mixtures of other body fluids. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Studies of the Effect of Formative Assessment on Student Achievement: So Much More Is Needed
ERIC Educational Resources Information Center
McMillan, James H.; Venable, Jessica C.; Varier, Divya
2013-01-01
Kingston and Nash (2011) recently presented a meta-analysis of studies showing that the effect of formative assessment on K-12 student achievement may not be as robust as widely believed. This investigation analyzes the methodology used in the Kingston and Nash meta-analysis and provides further analyses of the studies it included. These…
Impact of Physical Activity Interventions on Blood Pressure in Brazilian Populations
Bento, Vivian Freitas Rezende; Albino, Flávia Barbizan; de Moura, Karen Fernandes; Maftum, Gustavo Jorge; dos Santos, Mauro de Castro; Guarita-Souza, Luiz César; Faria Neto, José Rocha; Baena, Cristina Pellegrino
2015-01-01
Background: High blood pressure is associated with cardiovascular disease, which is the leading cause of mortality in the Brazilian population. Lifestyle changes, including physical activity, are important for lowering blood pressure levels and decreasing the costs associated with outcomes. Objective: To assess the impact of physical activity interventions on blood pressure in Brazilian individuals. Methods: Meta-analysis and systematic review of studies published until May 2014, retrieved from several health sciences databases. Seven studies with 493 participants were included. The analysis included parallel studies of physical activity interventions in adult populations in Brazil with a description of blood pressure (mmHg) before and after the intervention in the control and intervention groups. Results: Of 390 retrieved studies, eight matched the proposed inclusion criteria for the systematic review and seven randomized clinical trials were included in the meta-analysis. Physical activity interventions included aerobic and resistance exercises. There was a reduction of -10.09 mmHg (95% CI: -18.76 to -1.43) in systolic and -7.47 mmHg (95% CI: -11.30 to -3.63) in diastolic blood pressure. Conclusions: Available evidence on the effects of physical activity on blood pressure in the Brazilian population shows a homogeneous and significant effect on both systolic and diastolic blood pressure. However, the strength of the included studies was low, and their methodological quality was low and/or fair. Larger studies with more rigorous methodology are needed to build robust evidence. PMID:26016783
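As a rough illustration of how a meta-analysis such as the one above pools per-study blood-pressure changes into a single estimate with a confidence interval, the sketch below applies inverse-variance fixed-effect pooling. The effect sizes, variances, and function name are invented for illustration; the published review pooled seven real trials and may have used a random-effects model instead.

```python
import math

def pool_fixed_effect(effects, variances):
    """Inverse-variance fixed-effect pooling: weight each study by 1/variance."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    # 95% confidence interval under a normal approximation
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se

# Hypothetical systolic BP changes (mmHg) and variances from three trials
effects = [-12.0, -8.5, -10.2]
variances = [9.0, 16.0, 12.25]

pooled, lo, hi = pool_fixed_effect(effects, variances)
print(round(pooled, 2), round(lo, 2), round(hi, 2))
```

Precise studies (small variance) dominate the pooled estimate, which is why the result sits closer to the first and third hypothetical trials than to the second.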
2012-01-01
Implicit in the growing interest in patient-centered outcomes research is a growing need for better evidence regarding how responses to a given intervention or treatment may vary across patients, referred to as heterogeneity of treatment effect (HTE). A variety of methods are available for exploring HTE, each associated with unique strengths and limitations. This paper reviews a selected set of methodological approaches to understanding HTE, focusing largely but not exclusively on their uses with randomized trial data. It is oriented for the “intermediate” outcomes researcher, who may already be familiar with some methods, but would value a systematic overview of both more and less familiar methods with attention to when and why they may be used. Drawing from the biomedical, statistical, epidemiological and econometrics literature, we describe the steps involved in choosing an HTE approach, focusing on whether the intent of the analysis is for exploratory, initial testing, or confirmatory testing purposes. We also map HTE methodological approaches to data considerations as well as the strengths and limitations of each approach. Methods reviewed include formal subgroup analysis, meta-analysis and meta-regression, various types of predictive risk modeling including classification and regression tree analysis, series of n-of-1 trials, latent growth and growth mixture models, quantile regression, and selected non-parametric methods. In addition to an overview of each HTE method, examples and references are provided for further reading. By guiding the selection of the methods and analysis, this review is meant to better enable outcomes researchers to understand and explore aspects of HTE in the context of patient-centered outcomes research. PMID:23234603
Does Exercise Improve Cognitive Performance? A Conservative Message from Lord's Paradox
Liu, Sicong; Lebeau, Jean-Charles; Tenenbaum, Gershon
2016-01-01
Although extant meta-analyses support the notion that exercise results in cognitive performance enhancement, methodological shortcomings are noted in the primary evidence. The present study examined relevant randomized controlled trials (RCTs) published in the past 20 years (1996–2015) for methodological concerns arising from Lord's paradox. Our analysis revealed that RCTs supporting the positive effect of exercise on cognition are likely to include Type I error(s). This result can be attributed to the use of gain score analysis on pretest-posttest data, as well as to the presence of control group superiority over the exercise group on baseline cognitive measures. To improve the accuracy of causal inferences in this area, analysis of covariance on pretest-posttest data is recommended under the assumption of group equivalence. Important experimental procedures are discussed to maintain group equivalence. PMID:27493637
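Lord's paradox, as described above, can be demonstrated with synthetic data: when groups differ at baseline, a gain-score analysis and an ANCOVA-style analysis of the same pretest-posttest data can disagree. Everything below (group means, the 0.5 slope, sample size) is invented purely to show the mechanism; no true treatment effect is built into the data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50

# Hypothetical cognition scores: the exercise group starts lower at baseline.
pre_ex = rng.normal(90, 5, n)
pre_ct = rng.normal(110, 5, n)

# Posttest depends on pretest identically in both groups: no true treatment effect.
post_ex = 0.5 * pre_ex + 50 + rng.normal(0, 1, n)
post_ct = 0.5 * pre_ct + 50 + rng.normal(0, 1, n)

# Gain-score analysis: difference in mean change suggests a spurious benefit.
gain_diff = (post_ex - pre_ex).mean() - (post_ct - pre_ct).mean()

# ANCOVA-style analysis: regress posttest on pretest and a group indicator.
pre = np.concatenate([pre_ex, pre_ct])
post = np.concatenate([post_ex, post_ct])
group = np.concatenate([np.ones(n), np.zeros(n)])
X = np.column_stack([np.ones(2 * n), pre, group])
coef, *_ = np.linalg.lstsq(X, post, rcond=None)

print(round(gain_diff, 2))   # large apparent "effect" of exercise
print(round(coef[2], 2))     # baseline-adjusted group effect, near zero
```

The gain-score contrast reports a sizeable advantage for the (lower-baseline) exercise group even though none exists, which is the Type I error mechanism the paper warns about; the covariate-adjusted group coefficient stays near zero.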
Discourse Analysis and the Study of Educational Leadership
ERIC Educational Resources Information Center
Anderson, Gary; Mungal, Angus Shiva
2015-01-01
Purpose: The purpose of this paper is to provide an overview of current and past work using discourse analysis in the field of educational administration, and of discourse analysis as a methodology. Design/Methodology/Approach: The authors reviewed research in educational leadership that uses discourse analysis as a methodology. Findings: While…
76 FR 30139 - Federal Need Analysis Methodology for the 2012-2013 Award Year
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-24
... DEPARTMENT OF EDUCATION Federal Need Analysis Methodology for the 2012-2013 Award Year AGENCY: Federal Student Aid, Department of Education. ACTION: Notice of revision of the Federal Need Analysis...; 84.268; 84.379]. Federal Need Analysis Methodology for the 2012-2013 award year; Federal Pell Grant...
System data communication structures for active-control transport aircraft, volume 1
NASA Technical Reports Server (NTRS)
Hopkins, A. L.; Martin, J. H.; Brock, L. D.; Jansson, D. G.; Serben, S.; Smith, T. B.; Hanley, L. D.
1981-01-01
Candidate data communication techniques are identified, including dedicated links, local buses, broadcast buses, multiplex buses, and mesh networks. The design methodology for mesh networks is then discussed, including network topology and node architecture. Several concepts of power distribution are reviewed, including current limiting and mesh networks for power. The technology issues of packaging, transmission media, and lightning are addressed, and, finally, the analysis tools developed to aid in the communication design process are described. There are special tools to analyze the reliability and connectivity of networks and more general reliability analysis tools for all types of systems.
Analysis of crack initiation and growth in the high level vibration test at Tadotsu
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kassir, M.K.; Park, Y.J.; Hofmayer, C.H.
1993-08-01
The High Level Vibration Test data are used to assess the accuracy and usefulness of current engineering methodologies for predicting crack initiation and growth in a cast stainless steel pipe elbow under complex, large amplitude loading. The data were obtained by testing at room temperature a large scale modified model of one loop of a PWR primary coolant system at the Tadotsu Engineering Laboratory in Japan. Fatigue crack initiation time is reasonably predicted by applying a modified local strain approach (Coffin-Mason-Goodman equation) in conjunction with Miner's rule of cumulative damage. Three fracture mechanics methodologies are applied to investigate the crack growth behavior observed in the hot leg of the model. These are: the ΔK methodology (Paris law), ΔJ concepts and a recently developed limit load stress-range criterion. The report includes a discussion on the pros and cons of the analysis involved in each of the methods, the role played by the key parameters influencing the formulation and a comparison of the results with the actual crack growth behavior observed in the vibration test program. Some conclusions and recommendations for improvement of the methodologies are also provided.
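The ΔK methodology mentioned above rests on the Paris law, da/dN = C·(ΔK)^m, which relates crack growth per cycle to the stress-intensity-factor range. The numerical sketch below integrates it for an idealized center crack with ΔK = Δσ·√(πa); the constants C, m and the geometry are generic aluminium-like illustrations, not the values used in the HLVT analysis.

```python
import math

def cycles_to_grow(a0, af, dsigma, C=1e-11, m=3.0, steps=20000):
    """Numerically integrate the Paris law da/dN = C * (dK)^m for an
    idealized center crack, where dK = dsigma * sqrt(pi * a).
    Units: stress in MPa, crack length in metres."""
    n_cycles = 0.0
    da = (af - a0) / steps
    a = a0
    for _ in range(steps):
        # midpoint estimate of the stress-intensity range over this increment
        dk = dsigma * math.sqrt(math.pi * (a + 0.5 * da))
        n_cycles += da / (C * dk ** m)
        a += da
    return n_cycles

# Hypothetical values: grow a crack from 1 mm to 10 mm under a 100 MPa stress range
n = cycles_to_grow(1e-3, 1e-2, 100.0)
print(int(n))
```

Because da/dN grows with a^(m/2), most of the life is consumed while the crack is still short, which is why small-crack behavior dominates fatigue-life predictions.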
District Heating Systems Performance Analyses. Heat Energy Tariff
NASA Astrophysics Data System (ADS)
Ziemele, Jelena; Vigants, Girts; Vitolins, Valdis; Blumberga, Dagnija; Veidenbergs, Ivars
2014-12-01
The paper addresses an important element of the European energy sector: the evaluation of district heating (DH) system operations from the standpoint of increasing energy efficiency and increasing the use of renewable energy resources. This has been done by developing a new methodology for the evaluation of the heat tariff. The paper presents an algorithm of this methodology, which includes not only a database and calculation equation systems, but also an integrated multi-criteria analysis module using MADM/MCDM (Multi-Attribute Decision Making / Multi-Criteria Decision Making) based on TOPSIS (Technique for Order Preference by Similarity to Ideal Solution). The results of the multi-criteria analysis are used to set the tariff benchmarks. The evaluation methodology has been tested on Latvian heat tariffs, and the obtained results show that only half of the heating companies reach a benchmark value of 0.5 for the closeness-to-ideal-solution indicator. This means that the proposed evaluation methodology would not only allow companies to determine how they perform with regard to the proposed benchmark, but also to identify their need to restructure so that they may reach the level of a low-carbon business.
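The TOPSIS step at the heart of this methodology can be sketched in a few lines: normalise the decision matrix, weight it, and score each alternative by its relative closeness to the ideal solution. The criteria (efficiency as a benefit, CO2 intensity as a cost), weights, and company scores below are invented for illustration and are not the paper's actual indicator set.

```python
import math

def topsis(matrix, weights, benefit):
    """TOPSIS: rank alternatives by closeness to an ideal solution.
    matrix[i][j] = score of alternative i on criterion j;
    benefit[j] is True if larger values of criterion j are better."""
    n_alt, n_crit = len(matrix), len(matrix[0])
    # Vector-normalise each criterion column, then apply weights
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(n_alt))) for j in range(n_crit)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n_crit)] for i in range(n_alt)]
    ideal = [max(v[i][j] for i in range(n_alt)) if benefit[j] else min(v[i][j] for i in range(n_alt))
             for j in range(n_crit)]
    worst = [min(v[i][j] for i in range(n_alt)) if benefit[j] else max(v[i][j] for i in range(n_alt))
             for j in range(n_crit)]
    scores = []
    for i in range(n_alt):
        d_pos = math.sqrt(sum((v[i][j] - ideal[j]) ** 2 for j in range(n_crit)))
        d_neg = math.sqrt(sum((v[i][j] - worst[j]) ** 2 for j in range(n_crit)))
        scores.append(d_neg / (d_pos + d_neg))  # closeness to the ideal, in [0, 1]
    return scores

# Hypothetical DH companies scored on efficiency (benefit) and CO2 intensity (cost)
matrix = [[0.85, 220.0], [0.70, 300.0], [0.92, 180.0]]
scores = topsis(matrix, weights=[0.6, 0.4], benefit=[True, False])
print([round(s, 3) for s in scores])
```

A closeness score of 0.5, as used for the benchmark in the paper, marks the point where an alternative is equidistant from the ideal and the anti-ideal solution.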
Evaluating the statistical methodology of randomized trials on dentin hypersensitivity management.
Matranga, Domenica; Matera, Federico; Pizzo, Giuseppe
2017-12-27
The present study aimed to evaluate the characteristics and quality of the statistical methodology used in clinical studies on dentin hypersensitivity management. An electronic search was performed for data published from 2009 to 2014 by using PubMed, Ovid/MEDLINE, and Cochrane Library databases. The primary search terms were used in combination. Eligibility criteria included randomized clinical trials that evaluated the efficacy of desensitizing agents in terms of reducing dentin hypersensitivity. A total of 40 studies were considered eligible for assessment of the quality of their statistical methodology. The four main concerns identified were i) use of nonparametric tests in the presence of large samples, coupled with lack of information about normality and equality of variances of the response; ii) lack of P-value adjustment for multiple comparisons; iii) failure to account for interactions between treatment and follow-up time; and iv) no information about the number of teeth examined per patient and the consequent lack of a cluster-specific approach in data analysis. Owing to these concerns, the statistical methodology was judged inappropriate in 77.1% of the 35 studies that used parametric methods. Additional studies with appropriate statistical analysis are required to obtain an appropriate assessment of the efficacy of desensitizing agents.
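Concern ii) above, missing P-value adjustment for multiple comparisons, is fixable with standard procedures. The sketch below implements Holm's step-down adjustment (one common choice, not necessarily what these particular trials should have used) on hypothetical raw p-values from pairwise agent-vs-placebo comparisons.

```python
def holm_adjust(pvalues):
    """Holm step-down adjustment for multiple comparisons: sort the p-values,
    multiply the k-th smallest (0-based rank) by (m - k), cap at 1, and
    enforce monotonicity so adjusted values never decrease with rank."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    adjusted = [0.0] * m
    running_max = 0.0
    for rank, i in enumerate(order):
        running_max = max(running_max, min(1.0, (m - rank) * pvalues[i]))
        adjusted[i] = running_max
    return adjusted

# Hypothetical raw p-values from four pairwise comparisons in one trial
raw = [0.011, 0.020, 0.040, 0.300]
adjusted = holm_adjust(raw)
print(adjusted)
```

Note that a raw p of 0.040, nominally significant at 0.05, survives as 0.080 after adjustment: exactly the kind of reversal that un-adjusted analyses in the reviewed trials could miss.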
European Healthy Cities evaluation: conceptual framework and methodology.
de Leeuw, Evelyne; Green, Geoff; Dyakova, Mariana; Spanswick, Lucy; Palmer, Nicola
2015-06-01
This paper presents the methodology, programme logic and conceptual framework that drove the evaluation of the Fifth Phase of the WHO European Healthy Cities Network. In total, 99 cities were designated progressively through the life of the phase (2009-14). The paper establishes the values, systems and aspirations that these cities sign up to, as foundations for the selection of methodology. We assert that a realist synthesis methodology, driven by a wide range of qualitative and quantitative methods, is the most appropriate perspective from which to address the wide geopolitical, demographic, population and health diversities of these cities. The paper outlines the rationale for a structured multiple case study approach, the deployment of a comprehensive questionnaire, data mining through existing databases including Eurostat, and analysis of management information generation tools used throughout the period. Response rates were considered extremely high for this type of research. Non-response analyses are described, which show that the data are representative of cities across the spectrum of diversity. This paper provides a foundation for further analysis of specific areas of interest presented in this supplement. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Sailaukhanuly, Yerbolat; Zhakupbekova, Arai; Amutova, Farida; Carlsen, Lars
2013-01-01
Knowledge of the environmental behavior of chemicals is a fundamental part of the risk assessment process. The present paper discusses various methods of ranking a series of persistent organic pollutants (POPs) according to their persistence, bioaccumulation and toxicity (PBT) characteristics. Traditionally, ranking has been done as an absolute (total) ranking applying various multicriteria data analysis methods, such as simple additive ranking (SAR) or rankings based on various utility functions (UFs). An attractive alternative to these ranking methodologies appears to be partial order ranking (POR). The present paper compares different ranking methods, including SAR, UF and POR. Significant discrepancies between the rankings are noted, and it is concluded that partial order ranking, as a method without any pre-assumptions concerning possible relations between the single parameters, appears to be the most attractive ranking methodology. In addition to the initial ranking, partial order methodology offers a wide variety of analytical tools to elucidate the interplay between the objects to be ranked and the ranking parameters. The present study also includes an analysis of the relative importance of the single P, B and T parameters. Copyright © 2012 Elsevier Ltd. All rights reserved.
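The partial-order idea can be sketched minimally: one compound is placed above another only if it is at least as hazardous on every PBT criterion and strictly worse on at least one, with no weighting assumed between criteria. The sketch below computes the cover relations (the edges of a Hasse diagram) for four compounds whose names and normalised (P, B, T) scores are entirely hypothetical.

```python
def dominates(a, b):
    """a dominates b if a is at least as hazardous on every criterion
    and strictly more hazardous on at least one (higher = worse here)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def hasse_relations(objects):
    """Cover relations of the partial order: dominance pairs with no
    intermediate object between them (the edges of a Hasse diagram)."""
    keys = list(objects)
    edges = []
    for a in keys:
        for b in keys:
            if dominates(objects[a], objects[b]):
                if not any(dominates(objects[a], objects[c]) and dominates(objects[c], objects[b])
                           for c in keys if c not in (a, b)):
                    edges.append((a, b))
    return edges

# Hypothetical normalised (P, B, T) scores for four POP-like compounds
pops = {"cmpd_A": (0.9, 0.8, 0.7), "cmpd_B": (0.6, 0.9, 0.5),
        "cmpd_C": (0.5, 0.4, 0.3), "cmpd_D": (0.4, 0.3, 0.2)}
edges = sorted(hasse_relations(pops))
print(edges)
```

Compounds A and B end up incomparable (each is worse on a different criterion), which is precisely the information that a total ranking such as SAR or a utility function would erase.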
Ó Céilleachair, Alan J; Hanly, Paul; Skally, Máiréad; O'Neill, Ciaran; Fitzpatrick, Patricia; Kapur, Kanika; Staines, Anthony; Sharp, Linda
2013-04-01
Colorectal cancer (CRC) is the third most common cancer worldwide with over 1 million new cases diagnosed each year. Advances in treatment and survival are likely to have increased lifetime costs of managing the disease. Cost-of-illness (COI) studies are key building blocks in economic evaluations of interventions and comparative effectiveness research. We systematically reviewed and critiqued the COI literature on CRC. We searched several databases for CRC COI studies published in English, between January 2000 and February 2011. Information was abstracted on: setting, patient population, top-down/bottom-up costing, incident/prevalent approach, payer perspective, time horizon, costs included, cost source, and per-person costs. We developed a framework to compare study methodologies and assess homogeneity/heterogeneity. A total of 26 papers met the inclusion criteria. There was extensive methodological heterogeneity. Studies included case-control studies based on claims/reimbursement data (10), examinations of patient charts (5), and analysis of claims data (4). Epidemiological approaches varied (prevalent, 6; incident, 8; mixed, 10; unclear, 4). Time horizons ranged from 1 year postdiagnosis to lifetime. Seventeen studies used top-down costing. Twenty-five studies included healthcare-payer direct medical costs; 2 included indirect costs; 1 considered patient costs. There was broad agreement in how studies accounted for time, but few studies described costs in sufficient detail to allow replication. In general, costs were not comparable between studies. Methodological heterogeneity and lack of transparency made it almost impossible to compare CRC costs between studies or over time. For COI studies to be more useful and robust, there is a need for clear and rigorous guidelines around methodological and reporting "best practice."
[Methodological deficits in neuroethics: do we need theoretical neuroethics?].
Northoff, G
2013-10-01
Current neuroethics can best be characterized as empirical neuroethics: it is strongly empirically oriented in that it not only includes empirical findings from neuroscience but also searches for applications within neuroscience. This, however, neglects the social and political contexts, which could be the subject of a future social neuroethics. In addition, methodological issues need to be considered, as in theoretical neuroethics. The focus in this article is on two such methodological issues: (1) the analysis of the different levels and the inferences among them, exemplified by the inference of consciousness from otherwise purely neuronal data in patients in a vegetative state; and (2) the problem of linking descriptive and normative concepts in a non-reductive and non-inferential way, for which I suggest the mutual contextualization of both concepts. This results in a methodological strategy that can be described as contextual fact-norm iterativity.
Future generations, environmental ethics, and global environmental change
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tonn, B.E.
1994-12-31
The elements of a methodology to be employed by the global community to investigate the consequences of global environmental change upon future generations and global ecosystems are outlined in this paper. The methodology is comprised of two major components: a possible future worlds model; and a formal, citizen-oriented process to judge whether the possible future worlds potentially inheritable by future generations meet obligational standards. A broad array of descriptors of future worlds can be encompassed within this framework, including survival of ecosystems and other species and satisfaction of human concerns. The methodology expresses fundamental psychological motivations and human myths (journey, renewal, mother earth, and being-in-nature) and incorporates several viewpoints on obligations to future generations (maintaining options, fairness, humility, and the cause of humanity). The methodology overcomes several severe drawbacks of the economic-based methods most commonly used for global environmental policy analysis.
Developing and validating a nutrition knowledge questionnaire: key methods and considerations.
Trakman, Gina Louise; Forsyth, Adrienne; Hoye, Russell; Belski, Regina
2017-10-01
To outline key statistical considerations and detailed methodologies for the development and evaluation of a valid and reliable nutrition knowledge questionnaire. Literature on questionnaire development in a range of fields was reviewed, and a set of evidence-based guidelines specific to the creation of a nutrition knowledge questionnaire has been developed. The recommendations describe key qualitative methods and statistical considerations, and include relevant examples from previous papers and existing nutrition knowledge questionnaires. Where details have been omitted for the sake of brevity, the reader has been directed to suitable references. We recommend an eight-step methodology for nutrition knowledge questionnaire development as follows: (i) definition of the construct and development of a test plan; (ii) generation of the item pool; (iii) choice of the scoring system and response format; (iv) assessment of content validity; (v) assessment of face validity; (vi) purification of the scale using item analysis, including item characteristics, difficulty and discrimination; (vii) evaluation of the scale including its factor structure and internal reliability, or Rasch analysis, including assessment of dimensionality and internal reliability; and (viii) gathering of data to re-examine the questionnaire's properties, assess temporal stability and confirm construct validity. Several of these methods have previously been overlooked. The measurement of nutrition knowledge is an important consideration for individuals working in the nutrition field. Improved methods in the development of nutrition knowledge questionnaires, such as the use of factor analysis or Rasch analysis, will enable more confidence in reported measures of nutrition knowledge.
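Step (vi) above, scale purification through item analysis, rests on two classical statistics: item difficulty (the proportion answering correctly) and item discrimination (how well the item separates high from low scorers). The sketch below uses a simple top-third/bottom-third discrimination index on invented responses; the point-biserial correlation is a common alternative.

```python
def item_statistics(responses):
    """Classical item analysis for dichotomous (0/1) answers.
    responses[i][j] = 1 if respondent i answered item j correctly.
    Difficulty = proportion correct; discrimination = difference in
    proportion correct between the top and bottom thirds by total score."""
    n = len(responses)
    totals = [sum(row) for row in responses]
    order = sorted(range(n), key=lambda i: totals[i])
    third = max(1, n // 3)
    bottom, top = order[:third], order[-third:]
    stats = []
    for j in range(len(responses[0])):
        difficulty = sum(row[j] for row in responses) / n
        disc = (sum(responses[i][j] for i in top) -
                sum(responses[i][j] for i in bottom)) / third
        stats.append((round(difficulty, 2), round(disc, 2)))
    return stats

# Hypothetical answers from six respondents to three knowledge items
answers = [[1, 1, 0], [1, 0, 0], [1, 1, 1], [0, 0, 0], [1, 1, 1], [0, 0, 1]]
stats = item_statistics(answers)
print(stats)
```

Items with difficulty near 0 or 1, or with low discrimination, are candidates for removal during purification because they contribute little information about respondents' knowledge.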
Catanuto, Giuseppe; Taher, Wafa; Rocco, Nicola; Catalano, Francesca; Allegra, Dario; Milotta, Filippo Luigi Maria; Stanco, Filippo; Gallo, Giovanni; Nava, Maurizio Bruno
2018-03-20
Breast shape is defined utilizing mainly qualitative assessment (full, flat, ptotic) or estimates, such as volume or distances between reference points, that cannot describe it reliably. We will quantitatively describe breast shape with two parameters derived from a statistical methodology known as principal component analysis (PCA). We created a heterogeneous dataset of breast shapes acquired with a commercial infrared 3-dimensional scanner on which PCA was performed. We plotted on a Cartesian plane the two highest values of PCA for each breast (principal components 1 and 2). The methodology was tested on a preoperative and postoperative surgical case, and test-retest evaluation was performed by two operators. The first two principal components derived from PCA are able to characterize the shape of the breasts included in the dataset. The test-retest demonstrated that different operators are able to obtain very similar values of PCA. The system is also able to identify major changes in the preoperative and postoperative stages of a two-stage reconstruction. Even minor changes were correctly detected by the system. This methodology can reliably describe the shape of a breast. An expert operator and a newly trained operator can reach similar results in a test/re-testing validation. Once developed and after further validation, this methodology could be employed as a good tool for outcome evaluation, auditing, and benchmarking.
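The core computation described (projecting each shape onto its first two principal components and plotting the pair on a Cartesian plane) can be sketched with plain NumPy. The dataset here is random stand-in data, not scanner output, and the flattened-feature representation is an assumption for illustration:

```python
import numpy as np

# Hypothetical dataset: each row is one breast surface flattened to a
# fixed-length feature vector (e.g. sampled 3-D scanner coordinates)
rng = np.random.default_rng(1)
shapes = rng.normal(size=(50, 300))

# Principal component analysis via SVD of the mean-centred data
centred = shapes - shapes.mean(axis=0)
U, S, Vt = np.linalg.svd(centred, full_matrices=False)

# Scores on the first two principal components: one (PC1, PC2) point
# per breast, which can then be plotted on a Cartesian plane
scores = centred @ Vt[:2].T
print(scores.shape)  # (50, 2)
```

The singular values in `S` are sorted in decreasing order, so `Vt[:2]` always captures the two directions of greatest shape variation in the dataset.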
Lees-Haley, Paul R; Greiffenstein, M Frank; Larrabee, Glenn J; Manning, Edward L
2004-08-01
Recently, Kaiser (2003) raised concerns over the increase in brain damage claims reportedly due to exposure to welding fumes. In the present article, we discuss methodological problems in conducting neuropsychological research on the effects of welding exposure, using a recent paper by Bowler et al. (2003) as an example to illustrate problems common in the neurotoxicity literature. Our analysis highlights difficulties in conducting such quasi-experimental investigations, including subject selection bias, litigation effects on symptom report and neuropsychological test performance, response bias, and scientifically inadequate causal reasoning.
[The methods of assessment of health risk from exposure to radon and radon daughters].
Demin, V F; Zhukovskiy, M V; Kiselev, S M
2014-01-01
A critical analysis of existing dose-effect relationship (RDE) models for the effects of radon exposure on human health has been performed, and a conclusion about the necessity and possibility of improving these models has been drawn. A new, improved version of the RDE has been developed. A technique for assessing the human health risk of exposure to radon is described, including a method for estimating radon exposure doses, the improved RDE model, and the risk assessment methodology proper. The methodology is proposed for use in the territory of Russia.
Infrared Algorithm Development for Ocean Observations with EOS/MODIS
NASA Technical Reports Server (NTRS)
Brown, Otis B.
1997-01-01
Efforts continue under this contract to develop algorithms for the computation of sea surface temperature (SST) from MODIS infrared measurements. This effort includes radiative transfer modeling, comparison of in situ and satellite observations, development and evaluation of processing and networking methodologies for algorithm computation and data accession, evaluation of surface validation approaches for IR radiances, development of experimental instrumentation, and participation in MODIS (project) related activities. Activities in this contract period have focused on radiative transfer modeling, evaluation of atmospheric correction methodologies, the undertaking of field campaigns, analysis of field data, and participation in MODIS meetings.
NASA Technical Reports Server (NTRS)
Dejesusparada, N. (Principal Investigator); Crepani, E.; Martini, P. R.
1980-01-01
A methodology is proposed for international geological correlation studies based on LANDSAT-MSS imagery, Bullard's model of continental fit, and compatible structural trends between Northeast Brazil and the West African counterpart. Six extensive lineaments in the Brazilian study area are mapped and discussed according to their regional behavior and in relation to the adjacent continental margin. Among the first conclusions, correlations were found between the Sobral Pedro II Lineament and the megafaults that surround the West African craton, and between the Pernambuco Lineament and the Ngaurandere Lineament in Cameroon. Ongoing research to complete the methodological stages includes the mapping of the West African structural framework, reconstruction of the pre-drift puzzle, and an analysis of the counterpart correlations.
Effects of MicroCAD on Learning Fundamental Engineering Graphical Concepts: A Qualitative Study.
ERIC Educational Resources Information Center
Leach, James A.; Gull, Randall L.
1990-01-01
Students' reactions and performances were examined when taught engineering geometry concepts using a standard microcomputer-aided drafting software package. Two sample groups were compared based on their computer experience. Included are the methodology, data analysis, and conclusions. (KR)
Extended cooperative control synthesis
NASA Technical Reports Server (NTRS)
Davidson, John B.; Schmidt, David K.
1994-01-01
This paper reports on research for extending the Cooperative Control Synthesis methodology to include a more accurate modeling of the pilot's controller dynamics. Cooperative Control Synthesis (CCS) is a methodology that addresses the problem of how to design control laws for piloted, high-order, multivariate systems and/or non-conventional dynamic configurations in the absence of flying qualities specifications. This is accomplished by emphasizing the parallel structure inherent in any pilot-controlled, augmented vehicle. The original CCS methodology is extended to include the Modified Optimal Control Model (MOCM), which is based upon the optimal control model of the human operator developed by Kleinman, Baron, and Levison in 1970. This model provides a modeling of the pilot's compensation dynamics that is more accurate than the simplified pilot dynamic representation currently in the CCS methodology. Inclusion of the MOCM into the CCS also enables the modeling of pilot-observation perception thresholds and pilot-observation attention allocation effects. This Extended Cooperative Control Synthesis (ECCS) allows for the direct calculation of pilot and system open- and closed-loop transfer functions in pole/zero form and is readily implemented in current software capable of analysis and design for dynamic systems. Example results based upon synthesizing an augmentation control law for an acceleration command system in a compensatory tracking task using the ECCS are compared with a similar synthesis performed by using the original CCS methodology. The ECCS is shown to provide augmentation control laws that yield more favorable, predicted closed-loop flying qualities and tracking performance than those synthesized using the original CCS methodology.
Anti-tobacco mass media and socially disadvantaged groups: a systematic and methodological review.
Guillaumier, Ashleigh; Bonevski, Billie; Paul, Chris
2012-07-01
Only a limited amount of research has been conducted to explore whether there are socioeconomic status differences in responses to mass media. However, the methodological quality of this evidence has not been assessed, limiting confidence in conclusions that can be drawn regarding study outcomes. A systematic review of the effectiveness of anti-tobacco mass media campaigns with socially disadvantaged groups was conducted, and the methodological quality of included studies was assessed. Medline, The Cochrane Library, PsycInfo, Embase and Web of Science were searched using MeSH and keywords for quantitative studies conducted in Western countries prior to March 2012. A methodological quality assessment and narrative analysis of included studies was undertaken. Seventeen relevant studies (reported in 18 papers) were identified; however, weak study designs and selection bias were common characteristics, limiting strong conclusions about effectiveness. Using predominantly non-cessation-related outcome measures, the reviewed papers indicated mixed results for mass media tobacco control campaign effectiveness among various social groups. Most studies assessed mass media impact on low socioeconomic status groups rather than highly socially disadvantaged groups. Methodological rigour of evaluations in this field must be improved to aid understanding regarding the effectiveness of mass media campaigns in driving cessation among disadvantaged groups. The results of this review indicate a gap in methodologically rigorous research into the effectiveness of mass media campaigns among socially disadvantaged groups, particularly the highly disadvantaged. © 2012 Australasian Professional Society on Alcohol and other Drugs.
Risk-benefit analysis and public policy: a bibliography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, E.M.; Van Horn, A.J.
1976-11-01
Risk-benefit analysis has been implicitly practiced whenever decision-makers are confronted with decisions involving risks to life, health, or to the environment. Various methodologies have been developed to evaluate relevant criteria and to aid in assessing the impacts of alternative projects. Among these has been cost-benefit analysis, which has been widely used for project evaluation. However, in many cases it has been difficult to assign dollar costs to those criteria involving risks and benefits which are not now assigned explicit monetary values in our economic system. Hence, risk-benefit analysis has evolved to become more than merely an extension of cost-benefit analysis, and many methods have been applied to examine the trade-offs between risks and benefits. In addition, new scientific and statistical techniques have been developed for assessing current and future risks. The 950 references included in this bibliography are meant to suggest the breadth of those methodologies which have been applied to decisions involving risk.
Outcomes associated with virtual reality in psychological interventions: where are we now?
Turner, Wesley A; Casey, Leanne M
2014-12-01
The impending commercial release of affordable VR systems is likely to accelerate both the opportunity and demand for VR applications that specifically target psychological conditions. The aim of this study was to conduct a meta-analysis of outcomes associated with VR psychological interventions and to examine the methodological rigour used in these interventions. Literature search was conducted via Ovid, ProQuest Psychology Journals and ScienceDirect (Psychology) databases. Interventions were required to: be published between 1980 and 2014; use a randomised controlled trial design; be published in a scholarly journal; focus primarily on psychological/behavioural intervention; include validated measures; include reported means and standard deviations of outcome measures; and include one group with clinical/subclinical disorders, syndromes or distressing behaviours. Thirty eligible studies were identified. Random effects meta-analysis found an overall moderate effect size for VR interventions. Individual meta-analyses found an overall large effect size against non-intervention wait-lists and an overall moderate effect size against active interventions. No correlation was found between treatment outcomes and methodological rigour. Limitations may include limited study numbers, the use of a single coder, a need for more in-depth analyses of variation in the form of VR interventions, and omission of presence as a moderating factor. The current review supports VR interventions as efficacious, promising forms of psychological treatment. Use of reporting guidelines such as the CONSORT and CONSORT-EHEALTH statements should promote greater emphasis on methodological rigour, providing a firm foundation for the further development of clinical VR applications. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Wolf, M.
1979-01-01
To facilitate the task of objectively comparing competing process options, a methodology was needed for the quantitative evaluation of their relative cost effectiveness. Such a methodology was developed and is described, together with three examples for its application. The criterion for the evaluation is the cost of the energy produced by the system. The method permits the evaluation of competing design options for subsystems, based on the differences in cost and efficiency of the subsystems, assuming comparable reliability and service life, or of competing manufacturing process options for such subsystems, which include solar cells or modules. This process option analysis is based on differences in cost, yield, and conversion efficiency contribution of the process steps considered.
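The evaluation criterion described (the cost of the energy produced by the system) can be illustrated with a deliberately simplified sketch. The function, its parameters and the numbers below are hypothetical stand-ins, not the paper's actual cost-effectiveness model:

```python
def energy_cost(annual_cost_usd, efficiency, insolation_kwh_per_m2, area_m2):
    """Cost per kWh delivered by a module option (simplified annualized model)."""
    annual_energy_kwh = efficiency * insolation_kwh_per_m2 * area_m2
    return annual_cost_usd / annual_energy_kwh

# Two competing subsystem options: B costs more to make but converts
# sunlight more efficiently, so it may still win on cost per kWh
option_a = energy_cost(annual_cost_usd=120.0, efficiency=0.14,
                       insolation_kwh_per_m2=1700.0, area_m2=5.0)
option_b = energy_cost(annual_cost_usd=135.0, efficiency=0.17,
                       insolation_kwh_per_m2=1700.0, area_m2=5.0)
```

With these made-up figures the higher-efficiency option B yields the lower cost per kWh, illustrating how the criterion trades cost against conversion efficiency.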
Systematic reviews addressing microsurgical head and neck reconstruction.
Momeni, Arash; Jacobson, Joshua Y; Lee, Gordon K
2015-01-01
Systematic reviews frequently form the basis for clinical decision making and guideline development. Yet, the quality of systematic reviews has been variable, thus raising concerns about the validity of their conclusions. In the current study, a quality analysis of systematic reviews was performed, addressing microsurgical head and neck reconstruction. A PubMed search was performed to identify all systematic reviews published up to and including December 2012 in 12 surgical journals. Two authors independently reviewed the literature and extracted data from the included reviews. Discrepancies were resolved by consensus. Quality assessment was performed using AMSTAR. The initial search retrieved 1020 articles. After screening titles and abstracts, 987 articles were excluded. Full-text review of the remaining 33 articles resulted in further exclusion of 18 articles, leaving 15 systematic reviews for final analysis. A marked increase in the number of published systematic reviews over time was noted (P = 0.07). The median AMSTAR score was 5, thus reflecting a "fair" quality. No evidence for improvement in methodological quality over time was noted. The trend to publish more systematic reviews in microsurgical head and neck reconstruction is encouraging. However, efforts are indicated to improve the methodological quality of systematic reviews. Familiarity with criteria of methodological quality is critical to ensure future improvements in the quality of systematic reviews conducted in microsurgery.
Methodology to model the energy and greenhouse gas emissions of electronic software distributions.
Williams, Daniel R; Tang, Yinshan
2012-01-17
A new electronic software distribution (ESD) life cycle analysis (LCA) methodology and model structure were constructed to calculate energy consumption and greenhouse gas (GHG) emissions. In order to counteract the use of high level, top-down modeling efforts, and to increase result accuracy, a focus upon device details and data routes was taken. In order to compare ESD to a relevant physical distribution alternative, physical model boundaries and variables were described. The methodology was compiled from the analysis and operational data of a major online store which provides ESD and physical distribution options. The ESD method included the calculation of power consumption of data center server and networking devices. An in-depth method to calculate server efficiency and utilization was also included to account for virtualization and server efficiency features. Internet transfer power consumption was analyzed taking into account the number of data hops and networking devices used. The power consumed by online browsing and downloading was also factored into the model. The embedded CO(2)e of server and networking devices was proportioned to each ESD process. Three U.K.-based ESD scenarios were analyzed using the model which revealed potential CO(2)e savings of 83% when ESD was used over physical distribution. Results also highlighted the importance of server efficiency and utilization methods.
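The model components described (data-centre server share, per-hop network transfer, client-side download) can be caricatured in a few lines. Every coefficient and the function itself are illustrative assumptions for exposition, not the authors' published figures:

```python
def esd_ghg_kg(file_gb, hops, kwh_per_gb_per_device=0.002,
               server_kwh_per_gb=0.01, client_kwh=0.05,
               grid_kg_co2e_per_kwh=0.5):
    """Rough GHG estimate for one electronic software download (toy model)."""
    network_kwh = hops * kwh_per_gb_per_device * file_gb  # routing devices en route
    server_kwh = server_kwh_per_gb * file_gb              # data-centre share
    total_kwh = server_kwh + network_kwh + client_kwh
    return total_kwh * grid_kg_co2e_per_kwh
```

Because network energy scales with the number of data hops, the sketch reflects the paper's point that counting hops and networking devices matters when comparing ESD against physical distribution.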
Transmission line relay mis-operation detection based on time-synchronized field data
Esmaeilian, Ahad; Popovic, Tomo; Kezunovic, Mladen
2015-05-04
In this paper, a real-time tool to detect transmission line relay mis-operation is implemented. The tool uses time-synchronized measurements obtained from both ends of the line during disturbances. The proposed fault analysis tool comes into the picture only after the protective device has operated and tripped the line. The proposed methodology is able not only to detect, classify, and locate transmission line faults, but also to accurately confirm whether the line was tripped due to a mis-operation of protective relays. The analysis report includes either a detailed description of the fault type and location or detection of relay mis-operation. As such, it can be a source of very useful information to support the system restoration. The focus of the paper is on the implementation requirements that allow practical application of the methodology, which is illustrated using field data obtained from a real power system. Testing and validation is done using the field data recorded by digital fault recorders and protective relays. The test data included several hundreds of event records corresponding to both relay mis-operations and actual faults. The discussion of results addresses various challenges encountered during the implementation and validation of the presented methodology.
[Bibliometric analysis of publications by the Mexican Social Security Institute staff].
Valdez-Martínez, E; Garduño-Espinosa, J; Gómez-Delgado, A; Dante Amato-Martínez, J; Morales-Mori, L; Blanco-Favela, F; Muñoz-Hernández, O
2000-01-01
To describe and analyze the general characteristics and methodology of indexed publications by the health staff of the Mexican Social Security Institute in 1997. Original articles were evaluated. The primary sources included Index Medicus, Current Contents and the Mexican National Council of Science and Technology (CONACYT) index. The following information was gathered for each article: affiliation and chief activity of the first author; impact factor of the journal; research type; field of study; topic of study, and methodological conduction. This latter point included congruence between design and objective, reproducibility of methods, applicability of the analysis, and pertinence of the conclusions. A total of 300 original articles was published of which 212 (71%) were available for the present study: full-time investigators (FTI) generated 109 articles and investigators with clinical activities (CAI) wrote 103 articles. The median impact factor of the journals in which FTI published was 1.337 (0.341 to 37.297) and for CAI publications, 0.707 (0.400 to 4.237). Biomedical research predominated in the first group (41%) and clinical investigation in the second (66%). Statistically significant differences were identified for the methodological conduction between groups of investigators. Descriptive studies and publications in journals without impact factor predominated. The FTI group had the highest bibliographic production of original articles in indexed journals with an impact factor.
Stubbs, Brendon; Stubbs, Jean; Gnanaraj, Solomon Donald; Soundy, Andrew
2016-01-01
Depressive symptomology is now widely recognized as a key risk factor for falls. The evidence regarding the impact of major depressive disorder (MDD) on falls is unclear. A systematic review and exploratory meta-analysis was undertaken to explore the relationship between MDD and falls. Major electronic databases were searched from inception until April 2015. Studies that defined MDD and measured falls prospectively in older adults (≥60 years) were included. Studies relying on depressive symptomology alone were excluded. The methodological quality of included articles was assessed and study findings were synthesized using an exploratory meta-analysis. From a potential 415 articles, only three studies met the inclusion criteria. This included 976 unique older adults with mean ages ranging from ≥65 to 83 years. The methodological quality of included studies was satisfactory. None of the included studies' primary aim was to investigate the relationship between MDD and falls. The exploratory meta-analysis demonstrated older adults with MDD are at increased risk of falling compared to non-depressed older adults (odds ratio (OR) 4.0, 95% CI 2.0-8.1, I(2) = 60%, n = 976). There is a paucity of research considering falls in older adults with MDD. Our results demonstrate that the odds of falling appear to be greater among people with MDD (OR 4.0) than in previous meta-analyses that have only considered subthreshold depressive symptoms. Given the distinct nature and challenges with MDD, more research is required to better understand the falls risk in this group.
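The pooled estimate reported above (OR 4.0, 95% CI 2.0-8.1) is the kind of figure produced by inverse-variance combination of study odds ratios on the log-odds scale. The sketch below shows the mechanics on made-up per-study estimates, not the review's actual data, and uses simple fixed-effect pooling rather than the random-effects model a real meta-analysis would report alongside I²:

```python
import math

# Hypothetical per-study odds ratios with 95% CIs: (OR, lower, upper)
studies = [(3.2, 1.5, 6.8), (5.1, 2.0, 13.0), (3.8, 1.4, 10.3)]

# Inverse-variance fixed-effect pooling on the log-odds scale
weights, log_ors = [], []
for or_, lo, hi in studies:
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE recovered from CI width
    weights.append(1 / se**2)                        # weight = 1 / variance
    log_ors.append(math.log(or_))

pooled_log = sum(w * y for w, y in zip(weights, log_ors)) / sum(weights)
pooled_or = math.exp(pooled_log)
```

Pooling on the log scale keeps the estimate symmetric around the null (OR = 1) and lets each study's weight reflect the precision implied by its confidence interval.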
Mantzoukas, Stefanos
2009-04-01
Evidence-based practice has become an imperative for efficient, effective and safe practice, and evidence emerging from published research is considered a valid knowledge source for guiding practice. The aim of this paper is to review all research articles published in the top 10 general nursing journals for the years 2000-2006 to identify the methodologies used, the types of evidence these studies produced and the issues they addressed. Quantitative content analysis was implemented to study all published research papers of the top 10 general nursing journals for the years 2000-2006. The abstracts of all research articles were analysed with regard to the methodologies of enquiry, the types of evidence produced and the issues studied. Percentages were calculated to enable conclusions to be drawn. The results for the category of methodologies used were: 7% experimental, 6% quasi-experimental, 39% non-experimental, 2% ethnographic, 7% phenomenological, 4% grounded theory, 1% action research, 1% case study, 15% unspecified, 5.5% other, 0.5% meta-synthesis, 2% meta-analysis, 5% literature reviews and 3% secondary analysis. For the category of types of evidence: 4% hypothesis/theory testing, 11% evaluative, 5% comparative, 2% correlational, 46% descriptive, 5% interpretative and 27% exploratory. For the category of issues studied: 45% practice/clinical, 8% educational, 11% professional, 3% spiritual/ethical/metaphysical, 26% health promotion and 7% managerial/policy. Published studies can provide adequate evidence for practice if nursing journals conceptualise evidence emerging from non-experimental and qualitative studies as relevant for practice and develop appropriate mechanisms for assessing its validity.
Also, nursing journals need to increase and encourage the publication of studies that implement RCT methodology, systematic reviews, meta-synthesis and meta-analysis methodologies. Finally, nursing journals need to encourage more high-quality research evidence deriving from interpretative, theory-testing and evaluative types of studies that are practice relevant.
Study protocol for a scoping review on rehabilitation scoping reviews.
Colquhoun, Heather L; Jesus, Tiago S; O'Brien, Kelly K; Tricco, Andrea C; Chui, Adora; Zarin, Wasifa; Lillie, Erin; Hitzig, Sander L; Straus, Sharon
2017-09-01
Scoping reviews are increasingly popular in rehabilitation. However, significant variability in scoping review conduct and reporting currently exists, limiting potential for the methodology to advance rehabilitation research, practice and policy. Our aim is to conduct a scoping review of rehabilitation scoping reviews in order to examine the current volume, yearly distribution, proportion, scope and methodological practices involved in the conduct of scoping reviews in rehabilitation. Key areas of methodological improvement will be described. Methods and analysis: We will undertake the review using the Arksey and O'Malley scoping review methodology. Our search will involve two phases. The first will combine a previously conducted scoping review of scoping reviews (not specific to rehabilitation, with data current to July 2014) together with a rehabilitation keyword search in PubMed. Articles found in the first phase search will undergo a full text review. The second phase will include an update of the previously conducted scoping review of scoping reviews (July 2014 to current). This update will include the search of nine electronic databases, followed by title and abstract screening as well as a full text review. All screening and extraction will be performed independently by two authors. Articles will be included if they are scoping reviews within the field of rehabilitation. A consultation exercise with key targets will inform plans to improve rehabilitation scoping reviews. Ethics and dissemination: Ethics approval will be required for the consultation phase of our scoping review. Dissemination will include peer-reviewed publication and conferences in rehabilitation-specific contexts.
Frempong, Samuel N; Sutton, Andrew J; Davenport, Clare; Barton, Pelham
2018-02-01
There is little specific guidance on the implementation of cost-effectiveness modelling at the early stage of test development. The aim of this study was to review the literature in this field to examine the methodologies and tools that have been employed to date. Areas covered: A systematic review was conducted to identify relevant studies in established literature databases. Five studies were identified and included for narrative synthesis. These studies revealed that there is no consistent approach in this growing field. The perspective of patients and the potential for value of information (VOI) analysis to provide information on the value of future research is often overlooked. Test accuracy is an essential consideration, with most studies having described and included all possible test results in their analysis, and conducted extensive sensitivity analyses on important parameters. Headroom analysis was considered in some instances but at the early development stage (not the concept stage). Expert commentary: The techniques available to modellers that can demonstrate the value of conducting further research and product development (i.e. VOI analysis, headroom analysis) should be better utilized. There is a need for concerted efforts to develop rigorous methodology in this growing field to maximize the value and quality of such analysis.
Kemp, Candace L.; Ball, Mary M.; Morgan, Jennifer Craft; Doyle, Patrick J.; Burgess, Elisabeth O.; Dillard, Joy A.; Barmon, Christina E.; Fitzroy, Andrea F.; Helmly, Victoria E.; Avent, Elizabeth S.; Perkins, Molly M.
2018-01-01
In this article, we analyze the research experiences associated with a longitudinal qualitative study of residents’ care networks in assisted living. Using data from researcher meetings, field notes, and memos, we critically examine our design and decision making and accompanying methodological implications. We focus on one complete wave of data collection involving 28 residents and 114 care network members in four diverse settings followed for 2 years. We identify study features that make our research innovative, but that also represent significant challenges. They include the focus and topic; settings and participants; scope and design complexity; nature, modes, frequency, and duration of data collection; and analytic approach. Each feature has methodological implications, including benefits and challenges pertaining to recruitment, retention, data collection, quality, and management, research team work, researcher roles, ethics, and dissemination. Our analysis demonstrates the value of our approach and of reflecting on and sharing methodological processes for cumulative knowledge building. PMID:27651072
Benchmarking for Bayesian Reinforcement Learning
Ernst, Damien; Couëtoux, Adrien
2016-01-01
In the Bayesian Reinforcement Learning (BRL) setting, agents try to maximise the collected rewards while interacting with their environment, using some prior knowledge that is accessed beforehand. Many BRL algorithms have already been proposed, but the benchmarks used to compare them are only relevant for specific cases. The paper addresses this problem, and provides a new BRL comparison methodology along with the corresponding open source library. In this methodology, a comparison criterion that measures the performance of algorithms on large sets of Markov Decision Processes (MDPs) drawn from some probability distributions is defined. In order to enable the comparison of non-anytime algorithms, our methodology also includes a detailed analysis of the computation time requirement of each algorithm. Our library is released with all source code and documentation: it includes three test problems, each of which has two different prior distributions, and seven state-of-the-art RL algorithms. Finally, our library is illustrated by comparing all the available algorithms and the results are discussed. PMID:27304891
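The comparison criterion described (mean performance over many MDPs drawn from a prior distribution) can be sketched with a toy two-armed bandit standing in for the MDP class. The bandit, the random agent and the function names are illustrative stand-ins, not the library's actual API:

```python
import random

def draw_mdp(rng):
    """Draw a two-armed bandit 'MDP': each arm's success probability ~ U(0, 1)."""
    return [rng.random(), rng.random()]

def random_agent(rng, _history):
    """A baseline agent that ignores its history and picks an arm uniformly."""
    return rng.randrange(2)

def benchmark(agent, n_mdps=200, horizon=20, gamma=0.95, seed=0):
    """Mean discounted return of `agent` over `n_mdps` draws from the prior."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_mdps):
        arms = draw_mdp(rng)
        ret, history = 0.0, []
        for t in range(horizon):
            a = agent(rng, history)
            r = 1.0 if rng.random() < arms[a] else 0.0
            history.append((a, r))
            ret += gamma**t * r
        total += ret
    return total / n_mdps

score = benchmark(random_agent)
```

Averaging over many draws from the prior is what distinguishes this Bayesian criterion from evaluating an algorithm on a single hand-picked MDP; a learning agent would be scored by the same `benchmark` call for a like-for-like comparison.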
Medical technology as a key driver of rising health expenditure: disentangling the relationship
Sorenson, Corinna; Drummond, Michael; Bhuiyan Khan, Beena
2013-01-01
Health care spending has risen steadily in most countries, becoming a concern for decision-makers worldwide. Commentators often point to new medical technology as the key driver for burgeoning expenditures. This paper critically appraises this conjecture, based on an analysis of the existing literature, with the aim of offering a more detailed and considered analysis of this relationship. Several databases were searched to identify relevant literature. Various categories of studies (e.g., multivariate and cost-effectiveness analyses) were included to cover different perspectives, methodological approaches, and issues regarding the link between medical technology and costs. Selected articles were reviewed and relevant information was extracted into a standardized template and analyzed for key cross-cutting themes, i.e., impact of technology on costs, factors influencing this relationship, and methodological challenges in measuring such linkages. A total of 86 studies were reviewed. The analysis suggests that the relationship between medical technology and spending is complex and often conflicting. Findings were frequently contingent on varying factors, such as the availability of other interventions, patient population, and the methodological approach employed. Moreover, the impact of technology on costs differed across technologies, in that some (e.g., cancer drugs, invasive medical devices) had significant financial implications, while others were cost-neutral or cost-saving. In light of these issues, we argue that decision-makers and other commentators should extend their focus beyond costs solely to include consideration of whether medical technology results in better value in health care and broader socioeconomic benefits. PMID:23807855
Hasan, Haroon; Muhammed, Taaha; Yu, Jennifer; Taguchi, Kelsi; Samargandi, Osama A; Howard, A Fuchsia; Lo, Andrea C; Olson, Robert; Goddard, Karen
2017-10-01
The objective of our study was to evaluate the methodological quality of systematic reviews and meta-analyses in Radiation Oncology. A systematic literature search was conducted for all eligible systematic reviews and meta-analyses in Radiation Oncology from 1966 to 2015. Methodological characteristics were abstracted from all works that satisfied the inclusion criteria, and quality was assessed using the critical appraisal tool AMSTAR. Regression analyses were performed to determine factors associated with a higher quality score. Following exclusion based on a priori criteria, 410 studies (157 systematic reviews and 253 meta-analyses) satisfied the inclusion criteria. Meta-analyses were found to be of fair to good quality, while systematic reviews were found to be of less than fair quality. Factors associated with higher quality scores in the multivariable analysis were the inclusion of primary studies consisting of randomized controlled trials, the performance of a meta-analysis, and the application of a recommended guideline for establishing a systematic review protocol and/or reporting. Based on AMSTAR, systematic reviews and meta-analyses may introduce a high risk of bias if applied to inform decision-making. We recommend that decision-makers in Radiation Oncology scrutinize the methodological quality of systematic reviews and meta-analyses before assessing their utility to inform evidence-based medicine, and that researchers adhere to methodological standards outlined in validated guidelines when embarking on a systematic review.
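Regressing a quality score on study features, as in the analysis above, reduces to ordinary least squares; with a binary predictor (e.g. "performed a meta-analysis": 0/1) the fitted slope is simply the difference in mean scores between the two groups. A minimal stdlib-only Python sketch; the variable names and toy numbers are illustrative assumptions, not the study's data.

```python
def simple_ols(x, y):
    """Intercept and slope of y ~ a + b*x by ordinary least squares."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

# Toy illustration: AMSTAR-like scores regressed on a 0/1 indicator of
# whether a meta-analysis was performed (made-up numbers, not the study's).
scores = [4, 5, 3, 8, 9, 7]
did_meta = [0, 0, 0, 1, 1, 1]
intercept, slope = simple_ols(did_meta, scores)
```

Here the intercept is the mean score of the reviews without a meta-analysis and the slope is the mean-score gap between the two groups, which is exactly how the reported regression coefficients can be read for binary features.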
Crovelli, R.A.
1988-01-01
The geologic appraisal model that is selected for a petroleum resource assessment depends upon the purpose of the assessment, basic geologic assumptions of the area, type of available data, time available before deadlines, available human and financial resources, available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. Also, more than one geologic model might be needed in a single project for assessing different regions of the study area or for cross-checking resource estimates of the area. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis. The corresponding quantitative methodologies of these analyses usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) includes a variety of geologic models, (2) uses an analytic methodology instead of Monte Carlo simulation, (3) possesses the capacity to aggregate estimates from many areas that have been assessed by different geologic models, and (4) runs quickly on a microcomputer. Geologic models consist of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the U.S. Geological Survey are discussed. © 1988 International Association for Mathematical Geology.
2013-01-01
Background: Systematic reviews provide clinical practice recommendations that are based on evaluation of primary evidence. When systematic reviews with the same aims have different conclusions, it is difficult to ascertain which review reported the most credible and robust findings. Methods: This study examined five systematic reviews that have investigated the effectiveness of Pilates exercise in people with chronic low back pain. A four-stage process was used to interpret findings of the reviews. This process included comparison of research questions, included primary studies, and the level and quality of evidence of systematic reviews. Two independent reviewers assessed the level of evidence and the methodological quality of systematic reviews, using the National Health and Medical Research Council hierarchy of evidence and the Revised Assessment of Multiple Systematic Reviews, respectively. Any disagreements were resolved by a third researcher. Results: A high level of consensus was achieved between the reviewers. Conflicting findings were reported by the five systematic reviews regarding the effectiveness of Pilates in reducing pain and disability in people with chronic low back pain. Authors of the systematic reviews included primary studies that did not match their questions in relation to treatment or population characteristics. A total of ten primary studies were identified across the five systematic reviews. Only two of the primary studies were included in all of the reviews, due to different inclusion criteria relating to publication date and status, definition of Pilates, and methodological quality. The level of evidence of the reviews was low due to the methodological design of the primary studies. The methodological quality of the reviews varied; those which conducted a meta-analysis obtained higher scores. Conclusion: There is inconclusive evidence that Pilates is effective in reducing pain and disability in people with chronic low back pain. This is due to the small number and poor methodological quality of primary studies. The Revised Assessment of Multiple Systematic Reviews provides a useful method of appraising the methodological quality of systematic reviews. Individual item scores, however, should be examined in addition to total scores, so that significant methodological flaws of systematic reviews are not missed and results are interpreted appropriately. PMID:23331384
Wells, Cherie; Kolt, Gregory S; Marshall, Paul; Hill, Bridget; Bialocerkowski, Andrea
2013-01-19
Ferrer, Imma; Thurman, E Michael
2012-10-12
A straightforward methodology for the chromatographic separation and accurate mass identification of 100 pharmaceuticals, including some of their degradation products, was developed using liquid chromatography/quadrupole time-of-flight mass spectrometry (LC/Q-TOF-MS). A table compiling the protonated or deprotonated exact masses for all compounds, as well as the exact masses of several fragment ions obtained by MS-MS, is included. Excellent chromatographic separation was achieved by using 3.5 μm particle size columns and a slow, generic 30-min gradient. Isobaric and isomeric compounds (same nominal mass and same exact mass, respectively) were distinguished by various methods, including chromatographic separation, MS-MS fragmentation, and isotopic signal identification. Method reporting limits of detection ranged from 1 to 1000 ng/L after solid-phase extraction of 100 mL aqueous samples. The methodology was successfully applied to the analysis of surface water impacted by wastewater effluent by identifying many of the pharmaceuticals and metabolites included in the list. Examples are given for some of the most unusual findings in environmental samples. This paper is meant to serve as a guide for those doing analysis of pharmaceuticals in environmental samples, by providing exact mass measurements of several well-known, as well as newly identified and environmentally relevant, pharmaceuticals in water samples.
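An exact-mass table of the kind the authors compile rests on simple arithmetic over monoisotopic atomic masses: the [M+H]+ ion is the neutral monoisotopic mass plus the mass of a proton, and [M-H]- is minus a proton. A stdlib-only Python sketch; the element coverage and function names are illustrative assumptions, not the paper's actual table.

```python
# Monoisotopic atomic masses (u) for elements common in pharmaceuticals,
# taken from standard isotope tables.
MONO = {"C": 12.0, "H": 1.0078250319, "N": 14.0030740052,
        "O": 15.9949146221, "S": 31.97207069, "Cl": 34.96885271}
PROTON = 1.00727646688  # mass of a proton (u)

def exact_mass(formula):
    """Neutral monoisotopic mass from a formula dict, e.g. {"C": 8, "H": 10, ...}."""
    return sum(MONO[el] * n for el, n in formula.items())

def mz_protonated(formula):
    """m/z of the [M+H]+ ion observed in positive electrospray."""
    return exact_mass(formula) + PROTON

def mz_deprotonated(formula):
    """m/z of the [M-H]- ion observed in negative electrospray."""
    return exact_mass(formula) - PROTON
```

For caffeine (C8H10N4O2), for example, this yields m/z ≈ 195.0877 for [M+H]+, which is the level of mass accuracy the Q-TOF identifications rely on.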
Poor methodological quality and reporting standards of systematic reviews in burn care management.
Wasiak, Jason; Tyack, Zephanie; Ware, Robert; Goodwin, Nicholas; Faggion, Clovis M
2017-10-01
The methodological and reporting quality of burn-specific systematic reviews has not been established. The aim of this study was to evaluate the methodological quality of systematic reviews in burn care management. Computerised searches were performed in Ovid MEDLINE, Ovid EMBASE and The Cochrane Library through to February 2016 for systematic reviews relevant to burn care using medical subject and free-text terms such as 'burn', 'systematic review' or 'meta-analysis'. Additional studies were identified by hand-searching five discipline-specific journals. Two authors independently screened papers, extracted and evaluated methodological quality using the 11-item A Measurement Tool to Assess Systematic Reviews (AMSTAR) tool and reporting quality using the 27-item Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) checklist. Characteristics of systematic reviews associated with methodological and reporting quality were identified. Descriptive statistics and linear regression identified features associated with improved methodological quality. A total of 60 systematic reviews met the inclusion criteria. Six of the 11 AMSTAR items reporting on 'a priori' design, duplicate study selection, grey literature, included/excluded studies, publication bias and conflict of interest were reported in less than 50% of the systematic reviews. Of the 27 items listed for PRISMA, 13 items reporting on introduction, methods, results and the discussion were addressed in less than 50% of systematic reviews. 
Multivariable analyses showed that systematic reviews with higher methodological or reporting quality incorporated a meta-analysis (AMSTAR regression coefficient 2.1; 95% CI: 1.1, 3.1; PRISMA regression coefficient 6.3; 95% CI: 3.8, 8.7), were published in the Cochrane Library (AMSTAR regression coefficient 2.9; 95% CI: 1.6, 4.2; PRISMA regression coefficient 6.1; 95% CI: 3.1, 9.2), and included a randomised controlled trial (AMSTAR regression coefficient 1.4; 95% CI: 0.4, 2.4; PRISMA regression coefficient 3.4; 95% CI: 0.9, 5.8). The methodological and reporting quality of systematic reviews in burn care requires further improvement, with stricter adherence by authors to the PRISMA checklist and AMSTAR tool.
da Silva, Vinicius Zacarias Maldaner; Durigan, João Luiz Quaglioti; Arena, Ross; de Noronha, Marcos; Gurney, Burke; Cipriano, Gerson
2015-01-01
Neuromuscular electrical stimulation (NMES) is widely utilized to enhance muscle performance. However, the optimal NMES waveform with respect to treatment effect has not been established. To investigate the effects of kilohertz-frequency alternating current (KFAC) and low-frequency pulsed current (PC) on quadriceps evoked torque and self-reported discomfort. PubMed, The Cochrane Library, EMBASE, MEDLINE, Physiotherapy Evidence Database (PEDro), SinoMed, ISI Web of Knowledge, and CINAHL were searched for randomized controlled trials (RCTs) and quasi-randomized controlled trials (QRCTs). Two reviewers independently selected potential studies according to the inclusion criteria, extracted data, and assessed methodological quality. Studies were eligible if they compared KFAC versus PC interventions. Studies that included outcome measures for percentage of maximal isometric voluntary contraction (%MIVC) torque and self-reported discomfort level were eligible for evaluation. Seven studies involving 127 individuals were included. The methodological quality of eligible trials was moderate, with a mean of 5 on the 10-point PEDro scale. Overall, PC was no better than KFAC in terms of evoked torque and there was no difference in self-reported discomfort level. KFAC and PC have similar effects on quadriceps evoked torque and self-reported discomfort level in healthy individuals. The small number and overall methodological quality of currently available studies included in this meta-analysis indicate that new RCTs are needed to better determine optimal NMES treatment parameters.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giannantonio, T.; et al.
Optical imaging surveys measure both the galaxy density and the gravitational lensing-induced shear fields across the sky. Recently, the Dark Energy Survey (DES) collaboration used a joint fit to two-point correlations between these observables to place tight constraints on cosmology (DES Collaboration et al. 2017). In this work, we develop the methodology to extend the DES Collaboration et al. (2017) analysis to include cross-correlations of the optical survey observables with gravitational lensing of the cosmic microwave background (CMB) as measured by the South Pole Telescope (SPT) and Planck. Using simulated analyses, we show how the resulting set of five two-point functions increases the robustness of the cosmological constraints to systematic errors in galaxy lensing shear calibration. Additionally, we show that contamination of the SPT+Planck CMB lensing map by the thermal Sunyaev-Zel'dovich effect is a potentially large source of systematic error for two-point function analyses, but that it can be reduced to acceptable levels in our analysis by masking clusters of galaxies and imposing angular scale cuts on the two-point functions. The methodology developed here will be applied to the analysis of data from the DES, the SPT, and Planck in a companion work.
Space station needs, attributes and architectural options: Mission requirements
NASA Technical Reports Server (NTRS)
1983-01-01
Various mission requirements for the proposed space station are examined. Subjects include modelling methodology, science applications, commercial opportunities, operations analysis, integrated mission requirements, and the role of man in space station functions and activities. The information is presented through the use of graphs.
How to Conduct Ethnographic Research
ERIC Educational Resources Information Center
Sangasubana, Nisaratana
2011-01-01
The purpose of this paper is to describe the process of conducting ethnographic research. Methodology definition and key characteristics are given. The stages of the research process are described including preparation, data gathering and recording, and analysis. Important issues such as reliability and validity are also discussed.
The Future Impact of Wind on BPA Power System Ancillary Services
DOE Office of Scientific and Technical Information (OSTI.GOV)
Makarov, Yuri V.; Lu, Shuai; McManus, Bart
Wind power is growing at a very fast pace as an alternative generating resource. As the ratio of wind power to total system capacity increases, the impact of wind on various system aspects becomes significant. This paper presents a methodology to study the future impact of wind on BPA power system ancillary services, including load following and regulation. Existing approaches for similar analysis include dispatch model simulation and standard deviation evaluation. The methodology proposed in this paper uses historical data and stochastic processes to simulate the load balancing processes in the BPA power system. Capacity, ramp rate, and ramp duration characteristics are then extracted from the simulation results, and load following and regulation requirements are calculated accordingly. It mimics actual power system operations, so the results can be more realistic, yet the approach is convenient to perform. Further, the ramp rate and ramp duration data obtained from the analysis can be used to evaluate generator response or maneuverability and energy requirements, respectively, in addition to the capacity requirement.
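Extracting capacity, ramp rate, and ramp duration from a simulated balancing signal, as described above, amounts to scanning the series and its first differences. A stdlib-only Python sketch; the function name and per-step units are illustrative assumptions, not BPA's actual tooling.

```python
def ramp_requirements(balancing_signal):
    """Capacity, maximum ramp rate, and longest same-direction ramp
    duration from a simulated balancing-requirement time series."""
    capacity = max(abs(x) for x in balancing_signal)          # e.g. MW
    ramps = [b - a for a, b in zip(balancing_signal, balancing_signal[1:])]
    ramp_rate = max(abs(r) for r in ramps)                    # e.g. MW per step
    # Longest run of consecutive ramps in the same direction.
    longest = run = prev = 0
    for r in ramps:
        sign = (r > 0) - (r < 0)
        run = run + 1 if sign and sign == prev else (1 if sign else 0)
        prev = sign
        longest = max(longest, run)                           # in time steps
    return capacity, ramp_rate, longest
```

For the toy series [0, 1, 3, 2, -1, -2] this yields a capacity of 3, a maximum ramp rate of 3 per step, and a longest sustained ramp of 3 steps, the three quantities from which generator capacity, maneuverability, and energy requirements are then derived.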
Cost benefit analysis of space communications technology: Volume 1: Executive summary
NASA Technical Reports Server (NTRS)
Holland, L. D.; Sassone, P. G.; Gallagher, J. J.; Robinette, S. L.; Vogler, F. H.; Zimmer, R. P.
1976-01-01
The questions of (1) whether or not NASA should support the further development of space communications technology and, if so, (2) which technologies' support should be given the highest priority are addressed. Insofar as the issues deal principally with resource allocation, an economics perspective is adopted. The resultant cost-benefit methodology utilizes the net present value concept in three distinct analysis stages to evaluate and rank those technologies which pass a qualification test based upon probable (private-sector) market failure. User-preference and technology state-of-the-art surveys were conducted (in 1975) to form a data base for the technology evaluation. The program encompassed near-future technologies in space communications earth stations and satellites, including the noncommunication subsystems of the satellite (station keeping, electrical power system, etc.). Results of the research program include confirmation of the applicability of the methodology as well as a list of space communications technologies ranked according to the estimated net present value of their support (development) by NASA.
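The ranking step rests on the standard net-present-value calculation: each technology's projected stream of net benefits is discounted back to the present, and candidates are ordered by the result. A stdlib-only Python sketch; the discount rate, cashflows, and function names are illustrative assumptions, not the study's figures.

```python
def npv(rate, cashflows):
    """Net present value of a cashflow stream; cashflows[t] occurs at year t."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def rank_by_npv(candidates, rate):
    """Rank (name, cashflows) candidates by descending NPV of net benefits."""
    return sorted(candidates, key=lambda item: npv(rate, item[1]), reverse=True)
```

Note how the discount rate drives the ranking: at 10%, a technology returning 120 in year 1 outranks one returning 60 in each of years 1 and 2 for the same 100 upfront cost, even though the undiscounted totals favour the latter.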
Le Quintrec, Jean-Laurent; Bussy, Caroline; Golmard, Jean-Louis; Hervé, Christian; Baulon, Alain; Piette, François
2005-03-01
Very elderly subjects (VES; aged 80 years or older) constitute a special population as they frequently present multiple diseases (polypathology). Results from trials on general adult populations therefore cannot be extrapolated to VES. We performed a census of randomized controlled trials (RCT) on VES published between 1990 and 2002, and carried out a descriptive and methodological analysis of these RCT/VES, comparing them with matched RCT on general adult populations (control RCT, RCT/C). We searched for RCT/VES in two international databases (EMBASE and MEDLINE) and then manually. RCT/C were matched to RCT/VES for disease area and year of publication. The methodological quality of each RCT was assessed with Chalmers' scale. We identified 84 RCT/VES, 63 of which were conclusive and 21, inconclusive. Subjects were institutionalized in 48 RCT, and community dwelling in 11 RCT (unspecified in 25 RCT). Efficacy was the main criterion in 75 RCT; tolerance in 9 RCT. Twenty-six RCT were published by geriatrics journals, and 58 by general medical journals. The RCT/VES covered most of the disease areas of geriatrics. The 84 RCT/VES had a mean methodological quality score of 0.578 +/- 0.157. The matched 84 RCT/C had a mean methodological quality score of 0.592 +/- 0.116 (p = .466). The methodological quality score of RCT/VES increased with the number of included subjects (p = .004) and the year of publication (p = .001). The methodological quality of RCT/VES is equivalent to that of RCT in general adult populations. Nevertheless, RCT/VES remain very scarce, and neglect certain diseases. RCT/VES and the inclusion of very elderly subjects in RCT on adults should be strongly encouraged.
Development/Modernization of an Advanced Non-Light Water Reactor Probabilistic Risk Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henneke, Dennis W.; Robinson, James
In 2015, GE Hitachi Nuclear Energy (GEH) teamed with Argonne National Laboratory (Argonne) to perform Research and Development (R&D) of next-generation Probabilistic Risk Assessment (PRA) methodologies for the modernization of an advanced non-Light Water Reactor (non-LWR) PRA. This effort built upon a PRA developed in the early 1990s for GEH's Power Reactor Inherently Safe Module (PRISM) Sodium Fast Reactor (SFR). The work had four main tasks: internal events development, modeling the risk from the reactor for hazards occurring at-power internal to the plant; an all hazards scoping review, to analyze the risk at a high level from external hazards such as earthquakes and high winds; an all modes scoping review, to understand the risk at a high level from operating modes other than at-power; and risk insights, to integrate the results from each of the three phases above. To achieve these objectives, GEH and Argonne used and adapted proven PRA methodologies and techniques to build a modern non-LWR all hazards/all modes PRA. The teams also advanced non-LWR PRA methodologies, which is an important outcome from this work. This report summarizes the project outcomes in two major phases. The first phase presents the methodologies developed for non-LWR PRAs. The methodologies are grouped by scope, from Internal Events At-Power (IEAP) to hazards analysis to modes analysis. The second phase presents details of the PRISM PRA model, which was developed as a validation of the non-LWR methodologies. The PRISM PRA was performed in detail for IEAP, and at a broader level for hazards and modes. In addition to contributing methodologies, this project developed risk insights applicable to non-LWR PRA, including focus areas for future R&D and conclusions about the PRISM design.
Quantitation and detection of vanadium in biologic and pollution materials
NASA Technical Reports Server (NTRS)
Gordon, W. A.
1974-01-01
A review is presented of special considerations and methodology for determining vanadium in biological and air pollution materials. In addition to descriptions of specific analysis procedures, general sections are included on quantitation of analysis procedures, sample preparation, blanks, and methods of detection of vanadium. Most of the information presented is applicable to the determination of other trace elements in addition to vanadium.
Prison Radicalization: The New Extremist Training Grounds?
2007-09-01
distributing and collecting survey data, and the data analysis. The analytical methodology includes descriptive and inferential statistical methods, in... statistical analysis of the responses to identify significant correlations and relationships. B. SURVEY DATA COLLECTION To effectively access a...Q18, Q19, Q20, and Q21. Due to the exploratory nature of this small survey, data analyses were confined mostly to descriptive statistics and
ERIC Educational Resources Information Center
Center for Research and Reform in Education, 2012
2012-01-01
This review examines research on the effects of technology use on reading achievement in K-12 classrooms. It applies consistent inclusion standards to focus on studies that met high methodological standards. A total of 84 qualified studies based on over 60,000 K-12 participants were included in the final analysis. Four major categories of…
Coupled structural/thermal/electromagnetic analysis/tailoring of graded composite structures
NASA Technical Reports Server (NTRS)
Mcknight, R. L.; Huang, H.; Hartle, M.
1992-01-01
Accomplishments are described for the fourth year's effort of a 5-year program to develop a methodology for coupled structural/thermal/electromagnetic analysis/tailoring of graded component structures. These accomplishments include: (1) demonstration of coupled solution capability; (2) alternate CSTEM electromagnetic technology; (3) CSTEM acoustic capability; (4) CSTEM tailoring; (5) CSTEM composite micromechanics using ICAN; and (6) multiple layer elements in CSTEM.
ERIC Educational Resources Information Center
Bashaw, W. L., Ed.; Findley, Warren G., Ed.
This volume contains the five major addresses and subsequent discussion from the Symposium on the General Linear Models Approach to the Analysis of Experimental Data in Educational Research, which was held in 1967 in Athens, Georgia. The symposium was designed to produce systematic information, including new methodology, for dissemination to the…
NASA Astrophysics Data System (ADS)
Satyanarayana, S.; Indrakanti, S.; Kim, J.; Kim, C.; Pamidi, S.
2017-12-01
Benefits of an integrated high temperature superconducting (HTS) power system and the associated cryogenic systems on board an electric ship or aircraft are discussed. A versatile modelling methodology, developed to assess the cryogenic thermal behavior of an integrated system with multiple HTS devices in various potential configurations, is introduced. The utility and effectiveness of the methodology are demonstrated using a case study involving a hypothetical system including an HTS propulsion motor, an HTS generator, and an HTS power cable cooled by an integrated cryogenic helium circulation system. Using the methodology, multiple configurations are studied. The required total cooling power and the ability to maintain each HTS device at the required operating temperatures are considered, and the trade-offs are discussed for each configuration. Transient analysis of temperature evolution in the cryogenic helium circulation loop in case of a system failure is carried out to arrive at the required critical response time. The analysis was also performed for a similar liquid nitrogen circulation system under isobaric conditions, and the cooling capacity ratio is used to compare the relative merits of the two cryogens.
Ryan, Kath; Bissell, Paul; Morecroft, Charles
2007-08-01
Part 2 of this paper aims to provide a methodological framework for the study of medication narratives, including a semi-structured interview guide and a suggested method of analysis, in an attempt to aid the development of narrative scholarship within pharmacy practice research. Examples of medication narratives are provided to illustrate their diversity and usefulness. The framework is derived from the work of other researchers and adapted for our specific purpose. It comes from social psychology, narrative psychology, narrative anthropology, sociology, and critical theory, and fits within the social constructionist paradigm. The suggested methods of analysis could broadly be described as narrative analysis and discourse analysis. Examples of medication narratives are chosen from a variety of sources and brief interpretations are presented by way of illustration. Narrative analysis, a neglected area of research in pharmacy practice, has the potential to provide new understanding about how people relate to their medicines, how pharmacists are engaged in producing narratives, and the importance of narrative in the education of students. IMPACT OF THE ARTICLE: This article aims to have the following impact on pharmacy practice research: an innovative approach to researching and conceptualising the use of medicines; the introduction of a new theoretical perspective and methodology; the incorporation of social science research methods into pharmacy practice research; and the development of narrative scholarship within pharmacy.
Single Case Method in Psychology: How to Improve as a Possible Methodology in Quantitative Research.
Krause-Kjær, Elisa; Nedergaard, Jensine I
2015-09-01
Including the Single-Case Method (SCM) as a possible methodology in quantitative research in the field of psychology has been argued to be useful, e.g., by Hurtado-Parrado and López-López (IPBS: Integrative Psychological & Behavioral Science, 49:2, 2015). Their article introduces a historical and conceptual analysis of SCMs and proposes changing the often prevailing tendency to neglect SCM as an alternative to Null Hypothesis Significance Testing (NHST). This article contributes by putting new light on SCM as an equally important methodology in psychology. The intention of the present article is to elaborate this point of view further by discussing one of the most fundamental requirements and main characteristics of SCM, namely temporality: "…performance is assessed continuously over time and under different conditions…" (Hurtado-Parrado and López-López, IPBS: Integrative Psychological & Behavioral Science, 49:2, 2015). When defining principles for particular units of analysis, both synchronic (spatial) and diachronic (temporal) elements should be incorporated. In this article, misunderstandings of the SCM are addressed, and temporality is described in order to propose how the SCM could have wider usability in psychological research. It is further discussed how to implement SCM in psychological methodology. One solution might be to reconsider the notion of time in psychological research to cover more than a variable of control, and in this respect also to include the notion of time as an irreversible unity within life.
Systematic reviews and meta-analyses on treatment of asthma: critical evaluation
Jadad, Alejandro R; Moher, Michael; Browman, George P; Booker, Lynda; Sigouin, Christopher; Fuentes, Mario; Stevens, Robert
2000-01-01
Objective To evaluate the clinical, methodological, and reporting aspects of systematic reviews and meta-analyses on the treatment of asthma and to compare those published by the Cochrane Collaboration with those published in paper based journals. Design Analysis of studies identified from Medline, CINAHL, HealthSTAR, EMBASE, Cochrane Library, personal collections, and reference lists. Studies Articles describing a systematic review or a meta-analysis of the treatment of asthma that were published as a full report, in any language or format, in a peer reviewed journal or the Cochrane Library. Main outcome measures General characteristics of studies reviewed and methodological characteristics (sources of articles; language restrictions; format, design, and publication status of studies included; type of data synthesis; and methodological quality). Results 50 systematic reviews and meta-analyses were included. More than half were published in the past two years. Twelve reviews were published in the Cochrane Library and 38 were published in 22 peer reviewed journals. Forced expiratory volume in one second was the most frequently used outcome, but few reviews evaluated the effect of treatment on costs or patient preferences. Forty reviews were judged to have serious or extensive flaws. All six reviews associated with industry were in this group. Seven of the 10 most rigorous reviews were published in the Cochrane Library. Conclusions Most reviews published in peer reviewed journals or funded by industry have serious methodological flaws that limit their value to guide decisions. Cochrane reviews are more rigorous and better reported than those published in peer reviewed journals. PMID:10688558
Laitinen, Heleena; Kaunonen, Marja; Astedt-Kurki, Päivi
2014-11-01
To give clarity to the analysis of participant observation in nursing when implementing the grounded theory method. Participant observation (PO) is a method of collecting data that reveals the reality of daily life in a specific context. In grounded theory, interviews are the primary method of collecting data, but PO gives a distinctive insight, revealing what people are really doing instead of what they say they are doing. However, more focus is needed on the analysis of PO. An observational study was carried out to gain awareness of nursing care and its electronic documentation in four acute care wards in hospitals in Finland. The use of the grounded theory method and PO as a data collection tool is discussed. The following methodological tools are discussed: an observational protocol, jotting of notes, microanalysis, the use of questioning, constant comparison, and writing and illustrating. Each tool has specific significance in collecting and analysing data, working in constant interaction. Grounded theory and participant observation supplied rich data and revealed the complexity of the daily reality of acute care. In this study, the methodological tools provided a base for the study at the research sites and outside. The process as a whole was challenging. It was time-consuming and required rigorous and simultaneous data collection and analysis, including reflective writing. Using these methodological tools helped the researcher stay focused from data collection and analysis to building theory. Using PO as a data collection method in qualitative nursing research provides insights that cannot be seen or revealed by other data collection methods, yet it is not commonly discussed in nursing research. Therefore, this paper can serve as a useful tool for those who intend to use PO and grounded theory in their nursing research.
Plancoulaine, Benoit; Laurinaviciene, Aida; Herlin, Paulette; Besusparis, Justinas; Meskauskas, Raimundas; Baltrusaityte, Indra; Iqbal, Yasir; Laurinavicius, Arvydas
2015-10-19
Digital image analysis (DIA) enables higher accuracy, reproducibility, and capacity to enumerate cell populations by immunohistochemistry; however, the most unique benefits may be obtained by evaluating the spatial distribution and intra-tissue variance of markers. The proliferative activity of breast cancer tissue, estimated by the Ki67 labeling index (Ki67 LI), is a prognostic and predictive biomarker requiring robust measurement methodologies. We performed DIA on whole-slide images (WSI) of 302 surgically removed Ki67-stained breast cancer specimens; the tumour classifier algorithm was used to automatically detect tumour tissue but was not trained to distinguish between invasive and non-invasive carcinoma cells. The WSI DIA-generated data were subsampled by hexagonal tiling (HexT). Distribution and texture parameters were compared to conventional WSI DIA and pathology report data. Factor analysis of the data set, including total numbers of tumor cells, the Ki67 LI and Ki67 distribution, and texture indicators, extracted 4 factors, identified as entropy, proliferation, bimodality, and cellularity. The factor scores were further utilized in cluster analysis, outlining subcategories of heterogeneous tumors with predominant entropy, bimodality, or both at different levels of proliferative activity. The methodology also allowed the visualization of Ki67 LI heterogeneity in tumors and the automated detection and quantitative evaluation of Ki67 hotspots, based on the upper quintile of the HexT data, conceptualized as the "Pareto hotspot". We conclude that systematic subsampling of DIA-generated data into HexT enables comprehensive Ki67 LI analysis that reflects aspects of intra-tumor heterogeneity and may serve as a methodology to improve digital immunohistochemistry in general.
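The "Pareto hotspot" idea, defined above as the upper quintile of the hexagonally tiled data, can be sketched in plain Python. The tile values below are hypothetical, and the quantile estimator is an assumption (the study's exact convention may differ):

```python
import statistics

def pareto_hotspot_tiles(tile_li):
    """Flag tiles whose Ki67 labeling index falls in the upper quintile.

    tile_li: per-tile Ki67 LI values (percent positive nuclei).
    Returns (threshold, indices of hotspot tiles).
    """
    # statistics.quantiles with n=5 returns the 4 quintile cut points;
    # the last one is the lower bound of the upper quintile.
    threshold = statistics.quantiles(tile_li, n=5)[-1]
    hotspots = [i for i, li in enumerate(tile_li) if li >= threshold]
    return threshold, hotspots

# Hypothetical hexagonal-tile data: mostly low LI, a few proliferative tiles.
tiles = [5, 7, 6, 8, 30, 35, 9, 6, 7, 40]
thr, hot = pareto_hotspot_tiles(tiles)
```

Applied to whole-slide data, the flagged tile indices would mark candidate Ki67 hotspots for visualization or further analysis.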
Coupled structural/thermal/electromagnetic analysis/tailoring of graded composite structures
NASA Technical Reports Server (NTRS)
Hartle, M. S.; Mcknight, R. L.; Huang, H.; Holt, R.
1992-01-01
Described here are the accomplishments of a 5-year program to develop a methodology for coupled structural, thermal, electromagnetic analysis tailoring of graded component structures. The capabilities developed over the course of the program are the analyzer module and the tailoring module for the modeling of graded materials. Highlighted accomplishments for the past year include the addition of a buckling analysis capability, the addition of mode shape slope calculation for flutter analysis, verification of the analysis modules using simulated components, and verification of the tailoring module.
Stakeholder analysis methodologies resource book
DOE Office of Scientific and Technical Information (OSTI.GOV)
Babiuch, W.M.; Farhar, B.C.
1994-03-01
Stakeholder analysis allows analysts to identify how parties might be affected by government projects. This process involves identifying the likely impacts of a proposed action and stakeholder groups affected by that action. Additionally, the process involves assessing how these groups might be affected and suggesting measures to mitigate any adverse effects. Evidence suggests that the efficiency and effectiveness of government actions can be increased and adverse social impacts mitigated when officials understand how a proposed action might affect stakeholders. This report discusses how to conduct useful stakeholder analyses for government officials making decisions on energy-efficiency and renewable-energy technologies and their commercialization. It discusses methodological issues that may affect the validity and reliability of findings, including sampling, generalizability, validity, "uncooperative" stakeholder groups, using social indicators, and the effect of government regulations. The Appendix contains resource directories and a list of specialists in stakeholder analysis and involvement.
Integrating protein structural dynamics and evolutionary analysis with Bio3D.
Skjærven, Lars; Yao, Xin-Qiu; Scarabelli, Guido; Grant, Barry J
2014-12-10
Popular bioinformatics approaches for studying protein functional dynamics include comparisons of crystallographic structures, molecular dynamics simulations and normal mode analysis. However, determining how observed displacements and predicted motions from these traditionally separate analyses relate to each other, as well as to the evolution of sequence, structure and function within large protein families, remains a considerable challenge. This is in part due to the general lack of tools that integrate information of molecular structure, dynamics and evolution. Here, we describe the integration of new methodologies for evolutionary sequence, structure and simulation analysis into the Bio3D package. This major update includes unique high-throughput normal mode analysis for examining and contrasting the dynamics of related proteins with non-identical sequences and structures, as well as new methods for quantifying dynamical couplings and their residue-wise dissection from correlation network analysis. These new methodologies are integrated with major biomolecular databases as well as established methods for evolutionary sequence and comparative structural analysis. New functionality for directly comparing results derived from normal modes, molecular dynamics and principal component analysis of heterogeneous experimental structure distributions is also included. We demonstrate these integrated capabilities with example applications to dihydrofolate reductase and heterotrimeric G-protein families along with a discussion of the mechanistic insight provided in each case. The integration of structural dynamics and evolutionary analysis in Bio3D enables researchers to go beyond a prediction of single protein dynamics to investigate dynamical features across large protein families. The Bio3D package is distributed with full source code and extensive documentation as a platform independent R package under a GPL2 license from http://thegrantlab.org/bio3d/ .
pyAudioAnalysis: An Open-Source Python Library for Audio Signal Analysis.
Giannakopoulos, Theodoros
2015-01-01
Audio information plays a rather important role in the increasing digital content that is available today, resulting in a need for methodologies that automatically analyze such content: audio event recognition for home automations and surveillance systems, speech recognition, music information retrieval, multimodal analysis (e.g. audio-visual analysis of online videos for content-based recommendation), etc. This paper presents pyAudioAnalysis, an open-source Python library that provides a wide range of audio analysis procedures including: feature extraction, classification of audio signals, supervised and unsupervised segmentation and content visualization. pyAudioAnalysis is licensed under the Apache License and is available at GitHub (https://github.com/tyiannak/pyAudioAnalysis/). Here we present the theoretical background behind the wide range of the implemented methodologies, along with evaluation metrics for some of the methods. pyAudioAnalysis has been already used in several audio analysis research applications: smart-home functionalities through audio event detection, speech emotion recognition, depression classification based on audio-visual features, music segmentation, multimodal content-based movie recommendation and health applications (e.g. monitoring eating habits). The feedback provided from all these particular audio applications has led to practical enhancement of the library.
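The short-term feature extraction such a library performs can be illustrated with a stdlib-only sketch of the general technique (frame the signal, then compute per-frame energy and zero-crossing rate); this is not pyAudioAnalysis's actual API:

```python
def short_term_features(signal, frame_len, step):
    """Frame a signal and compute two classic short-term features,
    energy and zero-crossing rate, as used in audio segmentation."""
    features = []
    for start in range(0, len(signal) - frame_len + 1, step):
        frame = signal[start:start + frame_len]
        energy = sum(s * s for s in frame) / frame_len
        # Zero-crossing rate: fraction of consecutive-sample sign changes.
        zcr = sum(
            1 for a, b in zip(frame, frame[1:]) if (a >= 0) != (b >= 0)
        ) / (frame_len - 1)
        features.append((energy, zcr))
    return features

# Toy alternating signal: maximal zero-crossing rate, constant energy.
sig = [1.0, -1.0] * 8
feats = short_term_features(sig, frame_len=8, step=4)
```

Feature vectors like these are what downstream classifiers and segmenters consume.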
Using a Realist Research Methodology in Policy Analysis
ERIC Educational Resources Information Center
Lourie, Megan; Rata, Elizabeth
2017-01-01
The article describes the usefulness of a realist methodology in linking sociological theory to empirically obtained data through the development of a methodological device. Three layers of analysis were integrated: 1. the findings from a case study about Maori language education in New Zealand; 2. the identification and analysis of contradictions…
E Pluribus Analysis: Applying a Superforecasting Methodology to the Detection of Homegrown Violence
2018-03-01
…actor violence and a set of predefined decision-making protocols. This research included running four simulations using the Monte Carlo technique. [Table-of-contents fragment: Predicting Randomness: Using a "Runs Test" to Determine a Temporal Pattern in Lone…]
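The "runs test" referenced in this abstract checks whether a binary time series alternates more (or less) often than chance predicts; a minimal Wald-Wolfowitz sketch, using hypothetical data rather than the thesis's actual inputs:

```python
import math

def runs_test(sequence):
    """Wald-Wolfowitz runs test on a binary sequence (e.g. event /
    no-event per time period). Returns (runs, z), where z compares the
    observed number of runs with the count expected under randomness."""
    n1 = sum(1 for x in sequence if x)
    n2 = len(sequence) - n1
    runs = 1 + sum(1 for a, b in zip(sequence, sequence[1:]) if a != b)
    expected = 2 * n1 * n2 / (n1 + n2) + 1
    variance = (2 * n1 * n2 * (2 * n1 * n2 - n1 - n2)) / (
        (n1 + n2) ** 2 * (n1 + n2 - 1)
    )
    z = (runs - expected) / math.sqrt(variance)
    return runs, z

# Perfectly alternating periods: far more runs than chance predicts.
runs, z = runs_test([1, 0] * 10)
```

A large positive z indicates over-alternation; a large negative z indicates clustering in time.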
COTS Ceramic Chip Capacitors: An Evaluation of the Parts and Assurance Methodologies
NASA Technical Reports Server (NTRS)
Sampson, Michael J.
2004-01-01
This viewgraph presentation profiles an experiment to evaluate the suitability of commercial off-the-shelf (COTS) ceramic chip capacitors for NASA spaceflight applications. The experiment included: 1) Voltage Conditioning ('Burn-In'); 2) Highly Accelerated Life Test (HALT); 3) Destructive Physical Analysis (DPA); 4) Ultimate Voltage Breakdown Strength. The presentation includes results for each of the capacitors used in the experiment.
Precision-Guided Munitions Effects Representation
2017-01-03
…for the Center for Army Analysis (CAA) by the TRADOC Analysis Center, Monterey (TRAC-MTRY). The focus of the research is to improve the current methodology. [Table-of-contents fragments: Methodology; Timeline; MATLAB Code; Damage…]
Cluster Randomised Trials in Cochrane Reviews: Evaluation of Methodological and Reporting Practice.
Richardson, Marty; Garner, Paul; Donegan, Sarah
2016-01-01
Systematic reviews can include cluster-randomised controlled trials (C-RCTs), which require different analysis compared with standard individual-randomised controlled trials. However, it is not known whether review authors follow the methodological and reporting guidance when including these trials. The aim of this study was to assess the methodological and reporting practice of Cochrane reviews that included C-RCTs against criteria developed from existing guidance. Criteria were developed, based on methodological literature and personal experience supervising review production and quality. Criteria were grouped into four themes: identifying, reporting, assessing risk of bias, and analysing C-RCTs. The Cochrane Database of Systematic Reviews was searched (2nd December 2013), and the 50 most recent reviews that included C-RCTs were retrieved. Each review was then assessed using the criteria. The 50 reviews we identified were published by 26 Cochrane Review Groups between June 2013 and November 2013. For identifying C-RCTs, only 56% identified that C-RCTs were eligible for inclusion in the review in the eligibility criteria. For reporting C-RCTs, only eight (24%) of the 33 reviews reported the method of cluster adjustment for their included C-RCTs. For assessing risk of bias, only one review assessed all five C-RCT-specific risk-of-bias criteria. For analysing C-RCTs, of the 27 reviews that presented unadjusted data, only nine (33%) provided a warning that confidence intervals may be artificially narrow. Of the 34 reviews that reported data from unadjusted C-RCTs, only 13 (38%) excluded the unadjusted results from the meta-analyses. The methodological and reporting practices in Cochrane reviews incorporating C-RCTs could be greatly improved, particularly with regard to analyses. Criteria developed as part of the current study could be used by review authors or editors to identify errors and improve the quality of published systematic reviews incorporating C-RCTs.
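The "cluster adjustment" the review checks for is commonly expressed through the design effect, DEFF = 1 + (m - 1) * ICC. A sketch with hypothetical values (the included reviews' actual adjustment methods may differ):

```python
import math

def design_effect(mean_cluster_size, icc):
    """Design effect (DEFF) for a cluster-randomised trial: the factor
    by which clustering shrinks the effective sample size."""
    return 1 + (mean_cluster_size - 1) * icc

def adjust_se(unadjusted_se, mean_cluster_size, icc):
    """Inflate a naively computed standard error for clustering;
    unadjusted analyses yield artificially narrow confidence intervals."""
    return unadjusted_se * math.sqrt(design_effect(mean_cluster_size, icc))

# Hypothetical trial: 30 patients per cluster, intracluster correlation 0.05.
deff = design_effect(mean_cluster_size=30, icc=0.05)
se = adjust_se(unadjusted_se=1.0, mean_cluster_size=30, icc=0.05)
```

Dividing the effective sample size by DEFF (or inflating standard errors by its square root) is what prevents the artificially narrow intervals the review warns about.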
Introducing a new bond reactivity index: Philicities for natural bond orbitals.
Sánchez-Márquez, Jesús; Zorrilla, David; García, Víctor; Fernández, Manuel
2017-12-22
In the present work, a new methodology for obtaining reactivity indices (philicities) is proposed. It is based on reactivity functions such as the Fukui function or the dual descriptor, and makes it possible to project the information from reactivity functions onto molecular orbitals, instead of onto the atoms of the molecule (atomic reactivity indices). The methodology focuses on the molecules' natural bond orbitals (bond reactivity indices) because these orbitals have the advantage of being localized, allowing the reaction site of an electrophile or nucleophile to be determined within a very precise molecular region. This methodology provides a "philicity" index for every NBO, and a representative set of molecules has been used to test the new definition. A new methodology has also been developed to compare the "finite difference" and the "frontier molecular orbital" approximations. To facilitate their use, the proposed methodology as well as the possibility of calculating the new indices have been implemented in a new version of the UCA-FUKUI software. In addition, condensation schemes based on atomic populations of the "atoms in molecules" theory, the Hirshfeld population analysis, the approximation of Mulliken (with a minimal basis set) and electrostatic potential-derived charges have also been implemented, including the calculation of "bond reactivity indices" defined in previous studies. Graphical abstract: A new methodology for obtaining bond reactivity indices (philicities) is proposed, making it possible to project the information from reactivity functions onto molecular orbitals. The proposed methodology and the new indices have been implemented in a new version of the UCA-FUKUI software, which can also use new atomic condensation schemes and includes other new utilities.
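The underlying reactivity functions take the familiar finite-difference forms (a schematic summary of standard conceptual-DFT definitions; the paper's exact NBO projection is not reproduced here):

```latex
f^{+}(\mathbf{r}) \approx \rho_{N+1}(\mathbf{r}) - \rho_{N}(\mathbf{r}),
\qquad
f^{-}(\mathbf{r}) \approx \rho_{N}(\mathbf{r}) - \rho_{N-1}(\mathbf{r}),
\qquad
\Delta f(\mathbf{r}) = f^{+}(\mathbf{r}) - f^{-}(\mathbf{r}).
```

A condensed index for a localized orbital can then be written schematically as

```latex
f_{k}^{\pm} = \int w_{k}(\mathbf{r})\, f^{\pm}(\mathbf{r})\, d\mathbf{r},
```

where \(w_{k}\) is a weight associated with orbital (here, NBO) \(k\); the specific weighting used to obtain the paper's philicities is defined in the work itself.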
Zhan, Jie; Pan, Ruihuan; Zhou, Mingchao; Tan, Feng; Huang, Zhen; Dong, Jing; Wen, Zehuai
2018-01-01
Objectives To assess the effectiveness and safety of electroacupuncture (EA) combined with rehabilitation therapy (RT) and/or conventional drugs (CD) for improving poststroke motor dysfunction (PSMD). Design Systematic review and meta-analysis. Methods The China National Knowledge Infrastructure, Chinese Biological Medicine Database, Chinese Scientific Journal Database, Cochrane Library, Medline and Embase were electronically searched from inception to December 2016. The methodological quality of the included trials was assessed using the Cochrane risk of bias assessment tool. Statistical analyses were performed by RevMan V.5.3 and Stata SE V.11.0. Results Nineteen trials with 1434 participants were included for qualitative synthesis and meta-analysis. The methodological quality of the included trials was generally poor. The meta-analysis indicated that the EA group may benefit more than the non-EA group in terms of the changes in the Fugl-Meyer Assessment Scale (FMA) (weighted mean difference (WMD): 10.79, 95% CI 6.39 to 15.20, P<0.001), FMA for lower extremity (WMD: 5.16, 95% CI 3.78 to 6.54, P<0.001) and activities of daily living (standardised mean difference: 1.37, 95% CI 0.79 to 1.96, P<0.001). However, there was no difference between the EA and non-EA groups in terms of the effective rate (relative risk: 1.13, 95% CI 1.00 to 1.27, P=0.050). Moreover, there were no reports of side effects due to EA combined with RT and/or CD in the included trials. Conclusions This review provides new evidence for the effectiveness and safety of EA combined with RT and/or CD for PSMD. However, the results should be interpreted cautiously because of methodological weaknesses and publication bias. Further clinical trials with a rigorous design and large sample sizes are warranted. PROSPERO registration number CRD42016037597. PMID:29371267
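Pooled weighted mean differences like those above come from inverse-variance combination; a minimal fixed-effect sketch with hypothetical study data (not the review's actual numbers, which were pooled in RevMan):

```python
import math

def fixed_effect_meta(estimates, ses):
    """Inverse-variance fixed-effect meta-analysis.

    estimates: per-study mean differences (e.g. FMA score changes).
    ses: corresponding standard errors.
    Returns (pooled estimate, 95% CI lower bound, upper bound).
    """
    weights = [1 / se ** 2 for se in ses]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    return pooled, pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se

# Two hypothetical trials of equal precision.
est, lo, hi = fixed_effect_meta([10.0, 12.0], [2.0, 2.0])
```

A random-effects model, as often used when trial quality and populations vary, would additionally widen the interval by the between-study variance.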
Wightman, Jade; Julio, Flávia; Virués-Ortega, Javier
2014-05-01
Experimental functional analysis is an assessment methodology to identify the environmental factors that maintain problem behavior in individuals with developmental disabilities and in other populations. Functional analysis provides the basis for the development of reinforcement-based approaches to treatment. This article reviews the procedures, validity, and clinical implementation of the methodological variations of functional analysis and function-based interventions. We present six variations of functional analysis methodology in addition to the typical functional analysis: brief functional analysis, single-function tests, latency-based functional analysis, functional analysis of precursors, and trial-based functional analysis. We also present the three general categories of function-based interventions: extinction, antecedent manipulation, and differential reinforcement. Functional analysis methodology is a valid and efficient approach to the assessment of problem behavior and the selection of treatment strategies.
Probabilistic structural analysis methods for space propulsion system components
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1986-01-01
The development of a three-dimensional inelastic analysis methodology for the Space Shuttle main engine (SSME) structural components is described. The methodology is composed of: (1) composite load spectra, (2) probabilistic structural analysis methods, (3) the probabilistic finite element theory, and (4) probabilistic structural analysis. The methodology has led to significant technical progress in several important aspects of probabilistic structural analysis. The program and accomplishments to date are summarized.
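The flavor of probabilistic structural analysis can be sketched with a stress-strength Monte Carlo estimate of failure probability; the distributions below are illustrative assumptions, not SSME component data:

```python
import random

def failure_probability(n_samples, seed=0):
    """Monte Carlo stress-strength reliability sketch: the component
    fails on a sample where the drawn load exceeds the drawn strength."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        load = rng.gauss(100.0, 10.0)      # applied load, mean 100 units
        strength = rng.gauss(140.0, 10.0)  # capacity, mean 140 units
        if load > strength:
            failures += 1
    return failures / n_samples

p = failure_probability(20000)
```

Production methodologies replace these toy distributions with composite load spectra and probabilistic finite element models, but the estimator P(load > strength) is the same.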
A Review of Citation Analysis Methodologies for Collection Management
ERIC Educational Resources Information Center
Hoffmann, Kristin; Doucette, Lise
2012-01-01
While there is a considerable body of literature that presents the results of citation analysis studies, most researchers do not provide enough detail in their methodology to reproduce the study, nor do they provide rationale for methodological decisions. In this paper, we review the methodologies used in 34 recent articles that present a…
Systematic risk assessment methodology for critical infrastructure elements - Oil and Gas subsectors
NASA Astrophysics Data System (ADS)
Gheorghiu, A.-D.; Ozunu, A.
2012-04-01
The concern for the protection of critical infrastructure has been rapidly growing in the last few years in Europe. The level of knowledge and preparedness in this field is beginning to develop in a lawfully organized manner, for the identification and designation of critical infrastructure elements of national and European interest. Oil and gas production, refining, treatment, storage and transmission by pipelines facilities, are considered European critical infrastructure sectors, as per Annex I of the Council Directive 2008/114/EC of 8 December 2008 on the identification and designation of European critical infrastructures and the assessment of the need to improve their protection. Besides identifying European and national critical infrastructure elements, member states also need to perform a risk analysis for these infrastructure items, as stated in Annex II of the above mentioned Directive. In the field of risk assessment, there are a series of acknowledged and successfully used methods in the world, but not all hazard identification and assessment methods and techniques are suitable for a given site, situation, or type of hazard. As Theoharidou, M. et al. noted (Theoharidou, M., P. Kotzanikolaou, and D. Gritzalis 2009. Risk-Based Criticality Analysis. In Critical Infrastructure Protection III. Proceedings. Third Annual IFIP WG 11.10 International Conference on Critical Infrastructure Protection. Hanover, New Hampshire, USA, March 23-25, 2009: revised selected papers, edited by C. Palmer and S. Shenoi, 35-49. Berlin: Springer.), despite the wealth of knowledge already created, there is a need for simple, feasible, and standardized criticality analyses. The proposed systematic risk assessment methodology includes three basic steps: the first step (preliminary analysis) includes the identification of hazards (including possible natural hazards) for each installation/section within a given site, followed by a criterial analysis and then a detailed analysis step. 
The criterial evaluation is used as a ranking system to establish priorities for the detailed risk assessment. This criterial analysis stage is necessary because the total number of installations and sections on a site can be quite large. As not all installations and sections on a site contribute significantly to the risk of a major accident occurring, it is not efficient to include all of them in the detailed risk assessment, which can be time- and resource-consuming. The selected installations are then taken into consideration in the detailed risk assessment, which is the third step of the systematic risk assessment methodology. Following this step, conclusions can be drawn about the overall risk characteristics of the site. The proposed methodology can thus be successfully applied to the assessment of risk related to critical infrastructure elements falling under the energy sector of critical infrastructure, mainly the oil and gas sub-sectors. Key words: Systematic risk assessment, criterial analysis, energy sector critical infrastructure elements
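The criterial screening step can be sketched as scoring each installation and keeping only the top candidates for detailed assessment; the likelihood-times-severity score and the inventory below are illustrative assumptions, not the methodology's actual criteria:

```python
def rank_installations(installations, top_n):
    """Criterial screening sketch: score each installation by
    likelihood x severity and keep the top_n for detailed assessment."""
    scored = sorted(
        installations,
        key=lambda inst: inst["likelihood"] * inst["severity"],
        reverse=True,
    )
    return [inst["name"] for inst in scored[:top_n]]

# Hypothetical site inventory, scored on 1-5 ordinal scales.
site = [
    {"name": "storage tank farm", "likelihood": 3, "severity": 5},
    {"name": "office block", "likelihood": 1, "severity": 1},
    {"name": "pipeline manifold", "likelihood": 4, "severity": 4},
    {"name": "flare stack", "likelihood": 2, "severity": 3},
]
selected = rank_installations(site, top_n=2)
```

Only the selected installations would then proceed to the detailed (and costly) third step.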
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klein, Adam
2015-01-01
This thesis presents work on advancements and applications of methodology for the analysis of biological samples using mass spectrometry. Included in this work are improvements to chemical cross-linking mass spectrometry (CXMS) for the study of protein structures and mass spectrometry imaging and quantitative analysis to study plant metabolites. Applications include using matrix-assisted laser desorption/ionization-mass spectrometry imaging (MALDI-MSI) to further explore metabolic heterogeneity in plant tissues and chemical interactions at the interface between plants and pests. Additional work was focused on developing liquid chromatography-mass spectrometry (LC-MS) methods to investigate metabolites associated with plant-pest interactions.
Methodological considerations in cost of illness studies on Alzheimer disease
2012-01-01
Cost-of-illness (COI) studies can identify and measure all the costs of a particular disease, including the direct, indirect and intangible dimensions. They are intended to provide estimates of the economic impact of costly diseases. Alzheimer disease (AD) is a relevant example for reviewing cost-of-illness studies because of its costliness. The aim of this study was to review relevant published cost studies of AD, to analyse the methods used and to identify which dimensions need to be improved from a methodological perspective. First, we described the key points of cost study methodology. Secondly, cost studies relating to AD were systematically reviewed, focussing on an analysis of the different methods used. The methodological choices of the studies were analysed using an analytical grid which contains the main methodological items of COI studies. Seventeen articles were retained. Depending on the study, annual total costs per patient vary from $2,935 to $52,954. The methods, data sources, and estimated cost categories in each study varied widely. The review showed that cost studies adopted different approaches to estimating the costs of AD, reflecting a lack of consensus on the methodology of cost studies. To increase credibility, closer agreement among researchers on the methodological principles of cost studies would be desirable. PMID:22963680
A neural network based methodology to predict site-specific spectral acceleration values
NASA Astrophysics Data System (ADS)
Kamatchi, P.; Rajasankar, J.; Ramana, G. V.; Nagpal, A. K.
2010-12-01
A general neural network based methodology that has the potential to replace the computationally-intensive site-specific seismic analysis of structures is proposed in this paper. The basic framework of the methodology consists of a feed forward back propagation neural network algorithm with one hidden layer to represent the seismic potential of a region and soil amplification effects. The methodology is implemented and verified with parameters corresponding to Delhi city in India. For this purpose, strong ground motions are generated at bedrock level for a chosen site in Delhi due to earthquakes considered to originate from the central seismic gap of the Himalayan belt using necessary geological as well as geotechnical data. Surface level ground motions and corresponding site-specific response spectra are obtained by using a one-dimensional equivalent linear wave propagation model. Spectral acceleration values are considered as a target parameter to verify the performance of the methodology. Numerical studies carried out to validate the proposed methodology show that the errors in predicted spectral acceleration values are within acceptable limits for design purposes. The methodology is general in the sense that it can be applied to other seismically vulnerable regions and also can be updated by including more parameters depending on the state-of-the-art in the subject.
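The forward pass of the one-hidden-layer feed-forward architecture described above can be sketched in plain Python; the toy weights and two inputs (e.g. magnitude and distance) are illustrative assumptions, not the trained Delhi model:

```python
import math

def mlp_forward(x, w_hidden, b_hidden, w_out, b_out):
    """Forward pass of a one-hidden-layer feed-forward network mapping
    input parameters to a single output (e.g. spectral acceleration)."""
    def sigmoid(z):
        return 1.0 / (1.0 + math.exp(-z))

    # Hidden layer: weighted sum of inputs plus bias, then sigmoid.
    hidden = [
        sigmoid(sum(wi * xi for wi, xi in zip(w_row, x)) + b)
        for w_row, b in zip(w_hidden, b_hidden)
    ]
    # Linear output unit.
    return sum(wo * h for wo, h in zip(w_out, hidden)) + b_out

# Toy network: 2 inputs, 2 hidden units, identity-like weights.
y = mlp_forward(
    x=[0.5, -0.2],
    w_hidden=[[1.0, 0.0], [0.0, 1.0]],
    b_hidden=[0.0, 0.0],
    w_out=[1.0, 1.0],
    b_out=0.0,
)
```

Training by back-propagation, as in the paper, would iteratively adjust these weights to minimize the error against site-specific response-spectrum targets.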
Qualitative case study methodology in nursing research: an integrative review.
Anthony, Susan; Jack, Susan
2009-06-01
This paper is a report of an integrative review conducted to critically analyse the contemporary use of qualitative case study methodology in nursing research. Increasing complexity in health care and increasing use of case study in nursing research support the need for current examination of this methodology. In 2007, a search for case study research (published 2005-2007) indexed in the CINAHL, MEDLINE, EMBASE, PsychINFO, Sociological Abstracts and SCOPUS databases was conducted. A sample of 42 case study research papers met the inclusion criteria. Whittemore and Knafl's integrative review method guided the analysis. Confusion exists about the name, nature and use of case study. This methodology, including terminology and concepts, is often invisible in qualitative study titles and abstracts. Case study is an exclusive methodology and an adjunct to exploring particular aspects of phenomena under investigation in larger or mixed-methods studies. A high quality of case study exists in nursing research. Judicious selection and diligent application of literature review methods promote the development of nursing science. Case study is becoming entrenched in the nursing research lexicon as a well-accepted methodology for studying phenomena in health and social care, and its growing use warrants continued appraisal to promote nursing knowledge development. Attention to all case study elements, process and publication is important in promoting authenticity, methodological quality and visibility.
Assessing the validity of discourse analysis: transdisciplinary convergence
NASA Astrophysics Data System (ADS)
Jaipal-Jamani, Kamini
2014-12-01
Research studies using discourse analysis approaches make claims about phenomena or issues based on interpretation of written or spoken text, which includes images and gestures. How are findings/interpretations from discourse analysis validated? This paper proposes transdisciplinary convergence as a way to validate discourse analysis approaches to research. The argument is made that discourse analysis explicitly grounded in semiotics, systemic functional linguistics, and critical theory, offers a credible research methodology. The underlying assumptions, constructs, and techniques of analysis of these three theoretical disciplines can be drawn on to show convergence of data at multiple levels, validating interpretations from text analysis.
ORGANIC COMPOUNDS IN SURFACE SEDIMENTS AND OYSTER TISSUES FROM THE CHESAPEAKE BAY. APPENDICES
Detailed in the first part of this report is a development and discussion of the methodology used to extract and analyze sediment and oyster tissue samples from Chesapeake Bay for organic compounds. The method includes extraction, fractionation, and subsequent analysis using glas...
Current and Emerging Forces Impacting Special Education.
ERIC Educational Resources Information Center
Yates, James R.
Using the methodology of force field analysis, the paper develops possible futures for special education based on current trends. Demographic forces impacting special education include age changes, ethnicity changes, the needs of emerging language minorities, specific change in the youth population, environmental factors and the incidence of…
Addressing Gender Equity in Nonfaculty Salaries.
ERIC Educational Resources Information Center
Toutkoushian, Robert K.
2000-01-01
Discusses methodology of gender equity studies on noninstructional employees of colleges and universities, including variable selection in the multiple regression model and alternative approaches for measuring wage gaps. Analysis of staff data at one institution finds that experience and market differences account for 80 percent of gender pay…
A Systems Analysis Role Play Case: We Sell Stuff, Inc.
ERIC Educational Resources Information Center
Mitri, Michel; Cole, Carey
2007-01-01
Most systems development projects incorporate some sort of life cycle approach in their development. Whether the development methodology involves a traditional life cycle, prototyping, rapid application development, or some other approach, the first step usually involves a system investigation, which includes problem identification, feasibility…
NASA Technical Reports Server (NTRS)
Bopp, Genie; Somers, Jeff; Granderson, Brad; Gernhardt, Mike; Currie, Nancy; Lawrence, Chuck
2010-01-01
Topics include occupant protection overview with a focus on crew protection during dynamic phases of flight; occupant protection collaboration; modeling occupant protection; occupant protection considerations; project approach encompassing analysis tools, injury criteria, and testing program development; injury criteria update methodology, unique effects of pressure suits and other factors; and a summary.
NMR characterization of polymers: Review and update
USDA-ARS?s Scientific Manuscript database
NMR spectroscopy is a major technique for the characterization and analysis of polymers. A large number of methodologies have been developed in both the liquid and the solid state, and the literature has grown considerably (1-5). The field now covers a broad spectrum of activities, including polym...
40 CFR 85.2120 - Maintenance and submittal of records.
Code of Federal Regulations, 2010 CFR
2010-07-01
... testing program, including all production part sampling techniques used to verify compliance of the... subsequent analyses of that data; (7) A description of all the methodology, analysis, testing and/or sampling techniques used to ascertain the emission critical parameter specifications of the original equipment part...
Mixed Methods Research Designs in Counseling Psychology
ERIC Educational Resources Information Center
Hanson, William E.; Creswell, John W.; Clark, Vicki L. Plano; Petska, Kelly S.; Creswell, David J.
2005-01-01
With the increased popularity of qualitative research, researchers in counseling psychology are expanding their methodologies to include mixed methods designs. These designs involve the collection, analysis, and integration of quantitative and qualitative data in a single or multiphase study. This article presents an overview of mixed methods…
Religions and the History of Education: A Historiography
ERIC Educational Resources Information Center
Raftery, Deirdre
2012-01-01
This article provides a study of scholarship on religions and education, published over the past forty years, in "History of Education". It also includes reference to other publications, attempting a thematic analysis that scrutinises work on missionaries, churchmen, convents, charitable societies, denominations and education. Methodologies and…
Transportation networks : data, analysis, methodology development and visualization.
DOT National Transportation Integrated Search
2007-12-29
This project provides data compilation, analysis methodology and visualization methodology for the current network data assets of the Alabama Department of Transportation (ALDOT). This study finds that ALDOT is faced with a considerable number of...
Lunar Exploration Architecture Level Key Drivers and Sensitivities
NASA Technical Reports Server (NTRS)
Goodliff, Kandyce; Cirillo, William; Earle, Kevin; Reeves, J. D.; Shyface, Hilary; Andraschko, Mark; Merrill, R. Gabe; Stromgren, Chel; Cirillo, Christopher
2009-01-01
Strategic level analysis of the integrated behavior of lunar transportation and lunar surface systems architecture options is performed to assess the benefit, viability, affordability, and robustness of system design choices. This analysis employs both deterministic and probabilistic modeling techniques so that the extent of potential future uncertainties associated with each option is properly characterized. The results of these analyses are summarized in a predefined set of high-level Figures of Merit (FOMs) so as to provide senior NASA Constellation Program (CxP) and Exploration Systems Mission Directorate (ESMD) management with pertinent information to better inform strategic level decision making. The strategic level exploration architecture model is designed to perform analysis at as high a level as possible but still capture those details that have major impacts on system performance. The strategic analysis methodology focuses on integrated performance, affordability, and risk analysis, and captures the linkages and feedbacks between these three areas. Each of these results leads into the determination of the high-level FOMs. This strategic level analysis methodology has been previously applied to Space Shuttle and International Space Station assessments and is now being applied to the development of the Constellation Program point-of-departure lunar architecture. This paper provides an overview of the strategic analysis methodology and the lunar exploration architecture analyses to date. In studying these analysis results, the strategic analysis team has identified and characterized key drivers affecting the integrated architecture behavior. These key drivers include inclusion of a cargo lander, mission rate, mission location, fixed-versus-variable costs/return on investment, and the requirement for probabilistic analysis. Results of sensitivity analysis performed on lunar exploration architecture scenarios are also presented.
Saltaji, Humam; Armijo-Olivo, Susan; Cummings, Greta G; Amin, Maryam; Flores-Mir, Carlos
2014-02-25
It is fundamental that randomised controlled trials (RCTs) are properly conducted in order to reach well-supported conclusions. However, there is emerging evidence that RCTs are subject to biases which can overestimate or underestimate the true treatment effect, due to flaws in the study design characteristics of such trials. The extent to which this holds true in oral health RCTs, which have some unique design characteristics compared to RCTs in other health fields, is unclear. As such, we aim to examine the empirical evidence quantifying the extent of bias associated with methodological and non-methodological characteristics in oral health RCTs. We plan to perform a meta-epidemiological study in which a sample of 60 meta-analyses (MAs), including approximately 600 RCTs, will be selected. The MAs will be randomly obtained from the Oral Health Database of Systematic Reviews using a random number table, and will be considered for inclusion if they include a minimum of five RCTs and examine a therapeutic intervention related to one of the recognised dental specialties. RCTs identified in selected MAs will be subsequently included if their study design includes a comparison between an intervention group and a placebo group or another intervention group. Data will be extracted from selected trials included in MAs based on a number of methodological and non-methodological characteristics. Moreover, the risk of bias will be assessed using the Cochrane Risk of Bias tool. Effect size estimates and measures of variability for the main outcome will be extracted from each RCT included in selected MAs, and a two-level analysis will be conducted using a meta-meta-analytic approach with a random effects model to allow for intra-MA and inter-MA heterogeneity. The intended audiences of the findings will include dental clinicians, oral health researchers, policymakers and graduate students.
These audiences will be introduced to the findings through workshops, seminars, round-table discussions and targeted individual meetings. Other opportunities for knowledge transfer, such as key dental conferences, will also be pursued. Finally, the results will be published as a scientific report in a dental peer-reviewed journal.
Comparative quantification of health risks: Conceptual framework and methodological issues
Murray, Christopher JL; Ezzati, Majid; Lopez, Alan D; Rodgers, Anthony; Vander Hoorn, Stephen
2003-01-01
Reliable and comparable analysis of risks to health is key to preventing disease and injury. Causal attribution of morbidity and mortality to risk factors has traditionally been conducted in the context of methodological traditions of individual risk factors, often in a limited number of settings, restricting comparability. In this paper, we discuss the conceptual and methodological issues for quantifying the population health effects of individual or groups of risk factors at various levels of causality using knowledge from different scientific disciplines. The issues include: comparing the burden of disease due to the observed exposure distribution in a population with the burden from a hypothetical distribution or series of distributions, rather than a single reference level such as non-exposed; considering the multiple stages in the causal network of interactions among risk factor(s) and disease outcome to allow making inferences about some combinations of risk factors for which epidemiological studies have not been conducted, including the joint effects of multiple risk factors; calculating the health loss due to risk factor(s) as a time-indexed "stream" of disease burden due to a time-indexed "stream" of exposure, including consideration of discounting; and the sources of uncertainty. PMID:12780936
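The comparison of an observed exposure distribution against a hypothetical (counterfactual) distribution described above is commonly summarised as a potential impact fraction. A minimal sketch, with invented prevalence and relative-risk values for a three-level exposure:

```python
def impact_fraction(prev_obs, prev_cf, rr):
    # Potential impact fraction: proportional change in disease burden when
    # moving from the observed to a counterfactual exposure distribution.
    burden_obs = sum(p * r for p, r in zip(prev_obs, rr))
    burden_cf = sum(p * r for p, r in zip(prev_cf, rr))
    return (burden_obs - burden_cf) / burden_obs

# Hypothetical three-level exposure (none / moderate / high); all numbers invented.
prev_obs = [0.5, 0.3, 0.2]     # observed prevalence by exposure level
prev_cf = [0.9, 0.08, 0.02]    # counterfactual exposure distribution
rr = [1.0, 1.5, 3.0]           # relative risk at each level

pif = impact_fraction(prev_obs, prev_cf, rr)  # fraction of burden avoidable
```

With a single "non-exposed" reference level, the same formula reduces to the familiar population attributable fraction; the paper's point is that the counterfactual can instead be any distribution or series of distributions.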
Jiang, Chenghui; Whitehill, Tara L
2014-04-01
Speech errors associated with cleft palate are well established for English and several other Indo-European languages. Few articles describing the speech of Putonghua (standard Mandarin Chinese) speakers with cleft palate have been published in English language journals. Although methodological guidelines have been published for the perceptual speech evaluation of individuals with cleft palate, there has been no critical review of methodological issues in studies of Putonghua speakers with cleft palate. A literature search was conducted to identify relevant studies published over the past 30 years in Chinese language journals. Only studies incorporating perceptual analysis of speech were included. Thirty-seven articles which met inclusion criteria were analyzed and coded on a number of methodological variables. Reliability was established by having all variables recoded for all studies. This critical review identified many methodological issues. These design flaws make it difficult to draw reliable conclusions about characteristic speech errors in this group of speakers. Specific recommendations are made to improve the reliability and validity of future studies, as well as to facilitate cross-center comparisons.
[Evaluative designs in public health: methodological considerations].
López, Ma José; Marí-Dell'Olmo, Marc; Pérez-Giménez, Anna; Nebot, Manel
2011-06-01
Evaluation of public health interventions poses numerous methodological challenges. Randomization of individuals is not always feasible and interventions are usually composed of multiple factors. To face these challenges, certain elements, such as the selection of the most appropriate design and the use of a statistical analysis that includes potential confounders, are essential. The objective of this article was to describe the most frequently used designs in the evaluation of public health interventions (policies, programs or campaigns). The characteristics, strengths and weaknesses of each of these evaluative designs are described. Additionally, a brief explanation of the most commonly used statistical analysis in each of these designs is provided. Copyright © 2011 Sociedad Española de Salud Pública y Administración Sanitaria. Published by Elsevier Espana. All rights reserved.
[Preliminarily application of content analysis to qualitative nursing data].
Liang, Shu-Yuan; Chuang, Yeu-Hui; Wu, Shu-Fang
2012-10-01
Content analysis is a methodology for objectively and systematically studying the content of communication in various formats. Content analysis in nursing research and nursing education is called qualitative content analysis. Qualitative content analysis is frequently applied to nursing research, as it allows researchers to determine categories inductively and deductively. This article examines qualitative content analysis in nursing research from theoretical and practical perspectives. We first describe how content analysis concepts such as unit of analysis, meaning unit, code, category, and theme are used. Next, we describe the basic steps involved in using content analysis, including data preparation, data familiarization, analysis unit identification, creating tentative coding categories, category refinement, and establishing category integrity. Finally, this paper introduces the concept of content analysis rigor, including dependability, confirmability, credibility, and transferability. This article elucidates the content analysis method in order to help professionals conduct systematic research that generates data that are informative and useful in practical application.
Methodology and reporting of meta-analyses in the neurosurgical literature.
Klimo, Paul; Thompson, Clinton J; Ragel, Brian T; Boop, Frederick A
2014-04-01
Neurosurgeons are inundated with vast amounts of new clinical research on a daily basis, making it difficult and time-consuming to keep up with the latest literature. Meta-analysis is an extension of a systematic review that employs statistical techniques to pool the data from the literature in order to calculate a cumulative effect size. This is done to answer a clearly defined a priori question. Despite their increasing popularity in the neurosurgery literature, meta-analyses have not been scrutinized in terms of reporting and methodology. The authors performed a literature search using PubMed/MEDLINE to locate all meta-analyses that have been published in the JNS Publishing Group journals (Journal of Neurosurgery, Journal of Neurosurgery: Pediatrics, Journal of Neurosurgery: Spine, and Neurosurgical Focus) or Neurosurgery. Accepted checklists for reporting (PRISMA) and methodology (AMSTAR) were applied to each meta-analysis, and the number of items within each checklist that were satisfactorily fulfilled was recorded. The authors sought to answer 4 specific questions: Are meta-analyses improving 1) with time; 2) when the study met their definition of a meta-analysis; 3) when clinicians collaborated with a potential expert in meta-analysis; and 4) when the meta-analysis was the only focus of the paper? Seventy-two meta-analyses were published in the JNS Publishing Group journals and Neurosurgery between 1990 and 2012. The number of published meta-analyses has increased dramatically in the last several years. The most common topics were vascular, and most were based on observational studies. Only 11 papers were prepared using an established checklist. The average AMSTAR and PRISMA scores (proportion of items satisfactorily fulfilled divided by the total number of eligible items in the respective instrument) were 31% and 55%, respectively. 
Major deficiencies were identified, including the lack of a comprehensive search strategy, study selection and data extraction, assessment of heterogeneity, publication bias, and study quality. Almost one-third of the papers did not meet our basic definition of a meta-analysis. The quality of reporting and methodology was better 1) when the study met our definition of a meta-analysis; 2) when one or more of the authors had experience or expertise in conducting a meta-analysis; 3) when the meta-analysis was not conducted alongside an evaluation of the authors' own data; and 4) in more recent studies. Reporting and methodology of meta-analyses in the neurosurgery literature is excessively variable and overall poor. As these papers are being published with increasing frequency, neurosurgical journals need to adopt a clear definition of a meta-analysis and insist that they be created using checklists for both reporting and methodology. Standardization will ensure high-quality publications.
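The AMSTAR and PRISMA scores quoted above are simple proportions: items satisfactorily fulfilled divided by the items eligible for that paper. A trivial sketch with hypothetical item counts (PRISMA and AMSTAR themselves have 27 and 11 items respectively):

```python
def checklist_score(fulfilled, total_items, not_applicable=0):
    # Proportion of items satisfactorily fulfilled out of eligible items.
    eligible = total_items - not_applicable
    return fulfilled / eligible

# Hypothetical meta-analysis; the fulfilled counts are invented.
prisma = checklist_score(fulfilled=15, total_items=27)  # PRISMA: 27 items
amstar = checklist_score(fulfilled=3, total_items=11)   # AMSTAR: 11 items
```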
The methodology of semantic analysis for extracting physical effects
NASA Astrophysics Data System (ADS)
Fomenkova, M. A.; Kamaev, V. A.; Korobkin, D. M.; Fomenkov, S. A.
2017-01-01
The paper presents a new methodology of semantic analysis for extracting physical effects. This methodology is based on the Tuzov ontology, which formally describes the Russian language. In this paper, semantic patterns are described that extract structural physical information in the form of physical effects. A new text analysis algorithm is also described.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-20
... DEPARTMENT OF EDUCATION Federal Need Analysis Methodology for the 2014-15 Award Year-- Federal Pell Grant, Federal Perkins Loan, Federal Work-Study, Federal Supplemental Educational Opportunity... announces the annual updates to the tables used in the statutory Federal Need Analysis Methodology that...
Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria
NASA Astrophysics Data System (ADS)
Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong
2017-08-01
In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for reliability analysis and sensitivity analysis of complex components with arbitrary distribution parameters is investigated by using the perturbation method, the response surface method, the Edgeworth series and the sensitivity analysis approach. The RSBDO methodology is then established by incorporating the sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. By comparing with Monte Carlo simulation, the numerical results demonstrate that the proposed methodology provides an accurate, convergent and computationally efficient method for reliability-analysis-based finite element modeling in engineering practice.
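As a hedged illustration of the Monte Carlo comparison mentioned above (not the paper's perturbation-based formulation), the sketch below estimates a failure probability for a simple strength-minus-stress limit state and a crude finite-difference reliability sensitivity; all distribution parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 200_000  # Monte Carlo sample size

def failure_prob(mean_strength):
    # Limit state g = strength - stress; failure occurs when g < 0.
    strength = rng.normal(mean_strength, 5.0, N)
    stress = rng.normal(30.0, 4.0, N)
    return np.mean(strength - stress < 0.0)

pf = failure_prob(50.0)
# Reliability sensitivity: finite-difference derivative of Pf with respect
# to the mean strength (a crude stand-in for a perturbation analysis).
dpf = (failure_prob(50.5) - failure_prob(49.5)) / 1.0
```

Raising the mean strength lowers the failure probability, so the estimated sensitivity is negative; a perturbation or response-surface approach would obtain the same quantity far more cheaply than brute-force sampling.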
Development of economic consequence methodology for process risk analysis.
Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed
2015-04-01
A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies. © 2014 Society for Risk Analysis.
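A minimal sketch of the loss-function step: an inverted normal loss function maps a process deviation to an economic loss, and per-deviation losses are then aggregated. The target value, maximum loss and shape parameter below are invented, and the simple summation stands in for the paper's integration of multiple loss functions.

```python
import math

def inverted_normal_loss(y, target, max_loss, shape):
    # Inverted normal loss: zero at the target operating point, rising
    # toward max_loss as the process variable deviates from it.
    return max_loss * (1.0 - math.exp(-((y - target) ** 2) / (2.0 * shape ** 2)))

# Invented scenario: observed values of a process pressure versus its target.
observations = (100.0, 110.0, 130.0)
losses = [inverted_normal_loss(p, target=100.0, max_loss=5e5, shape=10.0)
          for p in observations]
total_loss = sum(losses)  # stand-in for integrating the individual losses
```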
Event-scale power law recession analysis: quantifying methodological uncertainty
NASA Astrophysics Data System (ADS)
Dralle, David N.; Karst, Nathaniel J.; Charalampous, Kyriakos; Veenstra, Andrew; Thompson, Sally E.
2017-01-01
The study of single streamflow recession events is receiving increasing attention following the presentation of novel theoretical explanations for the emergence of power law forms of the recession relationship, and drivers of its variability. Individually characterizing streamflow recessions often involves describing the similarities and differences between model parameters fitted to each recession time series. Significant methodological sensitivity has been identified in the fitting and parameterization of models that describe populations of many recessions, but the dependence of estimated model parameters on methodological choices has not been evaluated for event-by-event forms of analysis. Here, we use daily streamflow data from 16 catchments in northern California and southern Oregon to investigate how combinations of commonly used streamflow recession definitions and fitting techniques impact parameter estimates of a widely used power law recession model. Results are relevant to watersheds that are relatively steep, forested, and rain-dominated. The highly seasonal mediterranean climate of northern California and southern Oregon ensures study catchments explore a wide range of recession behaviors and wetness states, ideal for a sensitivity analysis. 
In such catchments, we show the following: (i) methodological decisions, including ones that have received little attention in the literature, can impact parameter value estimates and model goodness of fit; (ii) the central tendencies of event-scale recession parameter probability distributions are largely robust to methodological choices, in the sense that differing methods rank catchments similarly according to the medians of these distributions; (iii) recession parameter distributions are method-dependent, but roughly catchment-independent, such that changing the choices made about a particular method affects a given parameter in similar ways across most catchments; and (iv) the observed correlative relationship between the power-law recession scale parameter and catchment antecedent wetness varies depending on recession definition and fitting choices. Considering study results, we recommend a combination of four key methodological decisions to maximize the quality of fitted recession curves, and to minimize bias in the related populations of fitted recession parameters.
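A minimal sketch of the event-scale fitting task discussed above: estimate the parameters of the power-law recession model -dQ/dt = aQ^b for one synthetic event by ordinary least squares in log space. The recession definition used here (strictly receding daily steps, midpoint discharge) is just one of the method combinations the study compares.

```python
import numpy as np

def fit_recession(q):
    # Fit -dQ/dt = a * Q**b for one recession event by least squares in
    # log space: log(-dQ/dt) = log(a) + b * log(Q).
    dqdt = np.diff(q)                  # daily streamflow differences
    q_mid = 0.5 * (q[1:] + q[:-1])     # midpoint discharge for each step
    mask = dqdt < 0                    # keep strictly receding steps
    x = np.log(q_mid[mask])
    y = np.log(-dqdt[mask])
    b, log_a = np.polyfit(x, y, 1)     # slope = b, intercept = log(a)
    return np.exp(log_a), b

# Synthetic recession generated with known parameters a = 0.1, b = 1.5,
# using the closed-form solution of dQ/dt = -a Q^b for b != 1:
# Q(t) = (Q0**(1-b) + a*(b-1)*t) ** (1/(1-b)).
t = np.arange(0.0, 20.0)
a_true, b_true, q0 = 0.1, 1.5, 10.0
q = (q0 ** (1 - b_true) + a_true * (b_true - 1) * t) ** (1 / (1 - b_true))

a_est, b_est = fit_recession(q)
```

On this noise-free synthetic event the recovered parameters land close to the true values; on real daily data, the sensitivity the paper documents comes precisely from how the recession steps are selected and how the fit is performed.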
Lindberg, Elisabeth; Österberg, Sofia A.; Hörberg, Ulrica
2016-01-01
Phenomena in caring science are often complex and laden with meanings. Empirical research with the aim of capturing lived experiences is one way of revealing the complexity. Sometimes, however, results from empirical research need to be further discussed. One way is to further abstract the result and/or philosophically examine it. This has previously been performed and presented in scientific journals and doctoral theses, contributing to a greater understanding of phenomena in caring science. Although the intentions in many of these publications are laudable, the lack of methodological descriptions as well as a theoretical and systematic foundation can contribute to an ambiguity concerning how the results have emerged during the analysis. The aim of this paper is to describe the methodological support for the further abstraction of and/or philosophical examination of empirical findings. When trying to systematize the support procedures, we have used a reflective lifeworld research (RLR) approach. Based on the assumptions in RLR, this article will present methodological support for a theoretical examination that can include two stages. In the first stage, data from several (two or more) empirical results on an essential level are synthesized into a general structure. Sometimes the analysis ends with the general structure, but sometimes there is a need to proceed further. The second stage can then be a philosophical examination, in which the general structure is discussed in relation to a philosophical text, theory, or concept. It is important that the theories are brought in as the final stage after the completion of the analysis. Core dimensions of the described methodological support are, in accordance with RLR, openness, bridling, and reflection. The methodological support cannot be understood as fixed stages, but rather as a guiding light in the search for further meanings. PMID:26925926
Predicting operator workload during system design
NASA Technical Reports Server (NTRS)
Aldrich, Theodore B.; Szabo, Sandra M.
1988-01-01
A workload prediction methodology was developed in response to the need to measure workloads associated with operation of advanced aircraft. The application of the methodology will involve: (1) conducting mission/task analyses of critical mission segments and assigning estimates of workload for the sensory, cognitive, and psychomotor workload components of each task identified; (2) developing computer-based workload prediction models using the task analysis data; and (3) exercising the computer models to produce predictions of crew workload under varying automation and/or crew configurations. Critical issues include reliability and validity of workload predictors and selection of appropriate criterion measures.
Single event test methodology for integrated optoelectronics
NASA Technical Reports Server (NTRS)
Label, Kenneth A.; Cooley, James A.; Stassinopoulos, E. G.; Marshall, Paul; Crabtree, Christina
1993-01-01
A single event upset (SEU), defined as a transient or glitch on the output of a device, and its applicability to integrated optoelectronics are discussed in the context of spacecraft design and the need for more than a bit error rate viewpoint for testing and analysis. A methodology for testing integrated optoelectronic receivers and transmitters for SEUs is presented, focusing on the actual test requirements and system schemes needed for integrated optoelectronic devices. Two main causes of single event effects in the space environment, including protons and galactic cosmic rays, are considered along with ground test facilities for simulating the space environment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reckinger, Scott James; Livescu, Daniel; Vasilyev, Oleg V.
A comprehensive numerical methodology has been developed that handles the challenges introduced by considering the compressible nature of Rayleigh-Taylor instability (RTI) systems, which include sharp interfacial density gradients on strongly stratified background states, acoustic wave generation and removal at computational boundaries, and stratification-dependent vorticity production. The computational framework is used to simulate two-dimensional single-mode RTI to extreme late-times for a wide range of flow compressibility and variable density effects. The results show that flow compressibility acts to reduce the growth of RTI for low Atwood numbers, as predicted from linear stability analysis.
Report to the President of the United States on Sexual Assault Prevention and Response
2014-11-01
established research history based on laboratory-tested principles of memory retrieval, knowledge representation, and communication. AFOSI has been using CI... analysis methods, including scientific research, data analysis, focus groups, and on-site assessments to evaluate the Department’s SAPR program... DMDC’s focus group methodology employs a standard qualitative research approach to
Network Data: Statistical Theory and New Models
2016-02-17
During this period of review, Bin Yu worked on many thrusts of high-dimensional statistical theory and methodologies. Her research covered a wide range of topics in statistics, including analysis and methods for spectral clustering for sparse and structured networks [2,7,8,21], sparse modeling (e.g. Lasso) [4,10,11,17,18,19], statistical guarantees for the EM algorithm [3], and statistical analysis of algorithmic leveraging
Wang, Weihao; Xing, Zhihua
2014-01-01
Objective. Xingnaojing injection (XNJ) is a well-known traditional Chinese patent medicine (TCPM) for stroke. The aim of this study is to assess the efficacy of XNJ for stroke including ischemic stroke, intracerebral hemorrhage (ICH), and subarachnoid hemorrhage (SAH). Methods. An extensive search was performed using eight databases up to November 2013. Randomized controlled trials (RCTs) on XNJ for treatment of stroke were collected. Study selection, data extraction, quality assessment, and meta-analysis were conducted according to the Cochrane standards, and RevMan 5.0 was used for meta-analysis. Results. This review included 13 RCTs and a total of 1,514 subjects. The overall methodological quality was poor. The meta-analysis showed that XNJ combined with conventional treatment was more effective for total efficacy, neurological deficit improvement, and reduction of TNF-α levels compared with those of conventional treatment alone. Three trials reported adverse events; of these, one trial reported mild impairment of kidney and liver function, whereas the other two studies failed to report specific adverse events. Conclusion. Despite the limitations of this review, we suggest that XNJ in combination with conventional medicines might be beneficial for the treatment of stroke. Currently there are various methodological problems in the studies. Therefore, high-quality, large-scale RCTs are urgently needed. PMID:24707306
Li, Yuanqing; Zhu, Xiaoshu; Bensussan, Alan; Li, Pingping; Moylan, Eugene; Delaney, Geoff; McPherson, Luke
2016-01-01
Objective. This systematic review was conducted to evaluate the clinical effectiveness and safety of herbal medicine (HM) as an alternative management for hot flushes induced by endocrine therapy in breast cancer patients. Methods. Key English and Chinese language databases were searched from inception to July 2015. Randomized Controlled Trials (RCTs) evaluating the effects of HM on hot flushes induced by endocrine therapy in women with breast cancer were retrieved. We conducted data collection and analysis in accordance with the Cochrane Handbook for Systematic Reviews of Interventions. Statistical analysis was performed with the software (Review Manager 5.3). Results. 19 articles were selected from the articles retrieved, and 5 articles met the inclusion criteria for analysis. Some of the included studies showed that HM can relieve hot flushes as well as other menopausal symptoms induced by endocrine therapy among women with breast cancer and improve the quality of life. There are minor side effects related to HM, which are well tolerated. Conclusion. Given the small number of included studies and relatively poor methodological quality, there is insufficient evidence to draw positive conclusions regarding the objective benefit of HM. Additional high quality studies with a more rigorous methodological approach are needed to answer this question.
Jia, Pengli; Tang, Li; Yu, Jiajie; Lee, Andy H; Zhou, Xu; Kang, Deying; Luo, Yanan; Liu, Jiali; Sun, Xin
2018-03-06
To assess risk of bias and to investigate methodological issues concerning the design, conduct and analysis of randomised controlled trials (RCTs) testing acupuncture for knee osteoarthritis (KOA). PubMed, EMBASE, the Cochrane Central Register of Controlled Trials and four major Chinese databases were searched for RCTs that investigated the effect of acupuncture for KOA. The Cochrane tool was used to examine the risk of bias of eligible RCTs. Their methodological details were examined using a standardised, pilot-tested 48-item questionnaire, and the association between four predefined factors and important methodological quality indicators was investigated. A total of 248 RCTs were eligible, of which 39 (15.7%) used a computer-generated randomisation sequence. Of the 31 (12.5%) trials that stated the allocation concealment, only one used central randomisation. Twenty-five (10.1%) trials mentioned that their acupuncture procedures were standardised, but only 18 (7.3%) specified how the standardisation was achieved. The great majority of trials (n=233, 94%) stated that blinding was in place, but 204 (87.6%) did not clarify who was blinded. Only 27 (10.9%) trials specified the primary outcome, for which 7 used intention-to-treat analysis. Only 17 (6.9%) trials included details on sample size calculation; none preplanned an interim analysis and associated stopping rule. In total, 46 (18.5%) trials explicitly stated that loss to follow-up occurred, but only 6 (2.4%) provided some information on how the issue was handled. No trials prespecified, conducted or reported any subgroup or adjusted analysis for the primary outcome. The overall risk of bias was high among published RCTs testing acupuncture for KOA. Methodological limitations were present in many important aspects of design, conduct and analysis. These findings inform the development of evidence-based methodological guidance for future trials assessing the effect of acupuncture for KOA.
Methodological challenges in cross-language qualitative research: a research review.
Squires, Allison
2009-02-01
Cross-language qualitative research occurs when a language barrier is present between researchers and participants. The language barrier is frequently mediated through the use of a translator or interpreter. The purpose of this analysis of cross-language qualitative research was threefold: (1) review the methods literature addressing cross-language research; (2) synthesize the methodological recommendations from the literature into a list of criteria that could evaluate how researchers methodologically managed translators and interpreters in their qualitative studies; (3) test these criteria on published cross-language qualitative studies. The sample comprised 40 purposively selected cross-language qualitative studies from nursing and health sciences journals. The synthesis of the cross-language methods literature produced 14 criteria to evaluate how qualitative researchers managed the language barrier between themselves and their study participants. To test the criteria, the researcher conducted a summative content analysis, framed by discourse analysis techniques, of the 40 cross-language studies. The evaluation showed that only 6 of the 40 studies met all the criteria recommended by the cross-language methods literature for the production of trustworthy results in cross-language qualitative studies. Multiple inconsistencies, reflecting disadvantageous methodological choices by cross-language researchers, appeared in the remaining 33 studies. These included, among others, rendering the translator or interpreter an invisible part of the research process, failure to pilot test interview questions in the participant's language, no description of translator or interpreter credentials, failure to acknowledge translation as a limitation of the study, and inappropriate methodological frameworks for cross-language research.
The finding about researchers making the role of the translator or interpreter invisible during the research process supports studies completed by other authors examining this issue. The analysis demonstrated that the criteria produced by this study may provide useful guidelines for evaluating cross-language research and for novice cross-language researchers designing their first studies. Finally, the study also indicates that researchers attempting cross-language studies need to address the methodological issues surrounding language barriers between researchers and participants more systematically.
Kallio, Hanna; Pietilä, Anna-Maija; Johnson, Martin; Kangasniemi, Mari
2016-12-01
To produce a framework for the development of a qualitative semi-structured interview guide. Rigorous data collection procedures fundamentally influence the results of studies. The semi-structured interview is a common data collection method, but methodological research on the development of a semi-structured interview guide is sparse. Systematic methodological review. We searched PubMed, CINAHL, Scopus and Web of Science for methodological papers on semi-structured interview guides from October 2004-September 2014. Having examined 2,703 titles and abstracts and 21 full texts, we finally selected 10 papers. We analysed the data using the qualitative content analysis method. Our analysis resulted in new synthesized knowledge on the development of a semi-structured interview guide, including five phases: (1) identifying the prerequisites for using semi-structured interviews; (2) retrieving and using previous knowledge; (3) formulating the preliminary semi-structured interview guide; (4) pilot testing the guide; and (5) presenting the complete semi-structured interview guide. Rigorous development of a qualitative semi-structured interview guide contributes to the objectivity and trustworthiness of studies and makes the results more plausible. Researchers should consider using this five-step process to develop a semi-structured interview guide and justify the decisions made during it. © 2016 John Wiley & Sons Ltd.
Longo, S; Hospido, A; Lema, J M; Mauricio-Iglesias, M
2018-05-10
This article examines the potential benefits of using Data Envelopment Analysis (DEA) for conducting energy-efficiency assessments of wastewater treatment plants (WWTPs). WWTPs are characteristically heterogeneous (in size, technology, climate, function …), which limits the correct application of DEA. This paper proposes and describes, stage by stage, the Robust Energy Efficiency DEA (REED), a systematic state-of-the-art methodology aimed at including exogenous variables in nonparametric frontier models and especially designed for WWTP operation. In particular, the methodology systematizes the modelling process by presenting an integrated framework for selecting the correct variables and appropriate models, possibly tackling the effect of exogenous factors. As a result, the application of REED improves the quality of the efficiency estimates and hence the significance of benchmarking. For the reader's convenience, this article is presented as a step-by-step guideline for determining WWTP energy efficiency from beginning to end. The application and benefits of the developed methodology are demonstrated by a case study comparing the energy efficiency of a set of 399 WWTPs operating in different countries and under heterogeneous environmental conditions. Copyright © 2018 Elsevier Ltd. All rights reserved.
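As a concrete illustration of the kind of nonparametric frontier model that such DEA-based benchmarking builds on, the classical input-oriented CCR model can be solved as one small linear program per plant. This is a minimal generic sketch with hypothetical data, not the REED methodology itself:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR efficiency scores (constant returns to scale).

    X: (n_plants, n_inputs), e.g. energy use; Y: (n_plants, n_outputs),
    e.g. pollutant load removed. Returns theta in (0, 1] per plant."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # decision variables: [theta, lambda_1, ..., lambda_n]
        c = np.r_[1.0, np.zeros(n)]
        # inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
        a_in = np.hstack([-X[o].reshape(m, 1), X.T])
        # outputs: -sum_j lambda_j * y_rj <= -y_ro
        a_out = np.hstack([np.zeros((s, 1)), -Y.T])
        res = linprog(c,
                      A_ub=np.vstack([a_in, a_out]),
                      b_ub=np.r_[np.zeros(m), -Y[o]],
                      bounds=[(None, None)] + [(0, None)] * n)
        scores.append(res.fun)
    return np.array(scores)
```

For example, with a single input and a single output, a plant producing the same output as a peer while using twice the input scores 0.5.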
Supervised classification of continental shelf sediment off western Donegal, Ireland
NASA Astrophysics Data System (ADS)
Monteys, X.; Craven, K.; McCarron, S. G.
2017-12-01
Managing human impacts on marine ecosystems requires natural regions to be identified and mapped over a range of hierarchically nested scales. In recent years (2000-present) the Irish National Seabed Survey (INSS) and the Integrated Mapping for the Sustainable Development of Ireland's Marine Resources programme (INFOMAR) (collaborations of Geological Survey Ireland and the Marine Institute) have provided unprecedented quantities of high-quality data on Ireland's offshore territories. The increasing availability of large, detailed digital representations of these environments requires the application of objective and quantitative analyses. This study presents results of a new approach to seafloor sediment mapping based on an integrated analysis of INFOMAR multibeam bathymetric data (including the derivatives of slope and relative position), backscatter data (including derivatives of angular response analysis) and sediment ground-truthing over the continental shelf west of Donegal. It applies a Geographic Object-Based Image Analysis software package to provide a supervised classification of the surface sediment. This approach can provide a statistically robust, high-resolution classification of the seafloor. Initial results display a differentiation of sediment classes and a reduction in artefacts compared with previously applied methodologies. These results indicate a methodology that could be used during physical habitat mapping and classification of marine environments.
NASA Technical Reports Server (NTRS)
Funk, Christie J.; Perry, Boyd, III; Silva, Walter A.; Newman, Brett
2014-01-01
A software program and associated methodology to study gust loading on aircraft exists for a class of geometrically simplified flexible configurations. This program consists of a simple aircraft response model with two rigid and three flexible symmetric degrees of freedom and allows for the calculation of various airplane responses due to a discrete one-minus-cosine gust as well as continuous turbulence. Simplifications, assumptions, and opportunities for potential improvements pertaining to the existing software program are first identified; then a revised version of the original software tool is developed with improved methodology to include more complex geometries, additional excitation cases, and additional output data, so as to provide a more useful and precise tool for gust load analysis. To enhance the usefulness of the original software program, a wing control surface and a horizontal tail control surface are added, an extended application of the discrete one-minus-cosine gust input is employed, a supplemental continuous turbulence spectrum is implemented, and a capability to animate the total vehicle deformation response to gust inputs is included. These revisions and enhancements are implemented, and an analysis of the results is used to validate the modifications.
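The discrete one-minus-cosine gust used as an excitation above has a standard analytic shape: the gust velocity rises from zero to the design gust velocity and back to zero over twice the gust gradient distance. A minimal sketch of that profile (argument names are illustrative, not the program's API):

```python
import math

def one_minus_cosine_gust(u_ds, h, s):
    """Velocity of a discrete one-minus-cosine gust.

    u_ds: design gust velocity; h: gust gradient distance;
    s: distance penetrated into the gust. Zero outside [0, 2h]."""
    if 0.0 <= s <= 2.0 * h:
        return 0.5 * u_ds * (1.0 - math.cos(math.pi * s / h))
    return 0.0
```

The profile peaks at u_ds when the aircraft has penetrated a distance h into the gust.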
The Impacts of Individualization on Equity Educational Policies
ERIC Educational Resources Information Center
Francia, Guadalupe
2013-01-01
The present article aims to illustrate and discuss the impacts of individualization strategies on equity educational policies through the analysis of individualized teaching strategies applied within the framework of educational priority policies in Sweden. The methodology used in our research work includes: (a) the study of research…
Evaluation of methodology for detecting/predicting migration of forest species
Dale S. Solomon; William B. Leak
1996-01-01
Available methods for analyzing migration of forest species are evaluated, including simulation models, remeasured plots, resurveys, pollen/vegetation analysis, and age/distance trends. Simulation models have provided some of the most drastic estimates of species changes due to predicted changes in global climate. However, these models require additional testing...
Parenting Stress in Families of Children with ADHD: A Meta-Analysis
ERIC Educational Resources Information Center
Theule, Jennifer; Wiener, Judith; Tannock, Rosemary; Jenkins, Jennifer M.
2013-01-01
Meta-analyses were conducted to examine findings on the association between parenting stress and ADHD. Predictors comprising child, parent, and contextual factors, and methodological and demographic moderators of the relationship between parenting stress and ADHD, were examined. Findings from 22 published and 22 unpublished studies were included.…
ERIC Educational Resources Information Center
DeHart, Dana
2010-01-01
This report describes process and outcome evaluation of an innovative program based in a women's maximum-security correctional facility. Methodology included review of program materials, unobtrusive observation of group process, participant evaluation forms, focus groups, and individual interviews with current and former program participants.…
Leveraging Our Expertise To Inform International RE Roadmaps
energy targets to support Mexico's renewable energy goal. NREL and its Mexico partners developed a roadmap identifying the steps institutions need to take to determine how the electricity infrastructure and systems must change to accommodate high levels of renewables. The roadmap focuses on analysis methodologies, including grid expansion
In Vitro Evaluation of a Program for Machine-Aided Indexing.
ERIC Educational Resources Information Center
Jacquemin, Christian; Daille, Beatrice; Royaute, Jean; Polanco, Xavier
2002-01-01
Presents the human evaluation of ILIAD, a program for machine-aided indexing that was designed to assist expert librarians in computer-aided indexing and document analysis. Topics include controlled indexing and free indexing; natural language and concept-based information retrieval; evaluation methodology; syntactic variations; and a comparison…
An Emergent Phenomenon of American Indian Secondary Students' Career Development Process
ERIC Educational Resources Information Center
Flynn, Stephen V.; Duncan, Kelly J.; Evenson, Lori L.
2013-01-01
Nine single-race American Indian secondary students' career development experiences were examined through a phenomenological methodology. All 9 participants were in the transition period starting in late secondary school (age 18). Data sources included individual interviews and journal analysis. The phenomenon of American Indian secondary…
A Critical Analysis of Research on the Overjustification Effect.
ERIC Educational Resources Information Center
Berkay, Paul James
This document examined studies of the overjustification effect. Many studies examining the phenomenon were conducted during the 1970s; findings appeared to be accepted without qualification. It is unclear whether researchers conducted the studies with the proper methodologies and interpreted results correctly. Such factors included the type of…
Claiming Unclaimed Spaces: Virtual Spaces for Learning
ERIC Educational Resources Information Center
Miller, Nicole C.
2016-01-01
The purpose of this study was to describe and examine the environments used by teacher candidates in multi-user virtual environments. Secondary data analysis of a case study methodology was employed. Multiple data sources including interviews, surveys, observations, snapshots, course artifacts, and the researcher's journal were used in the initial…
(The Androgyny Dimension: A Comment on Stokes, Childs, and Fuehrer: And a Response.)
ERIC Educational Resources Information Center
Lubinski, David; Stokes, Joseph
1983-01-01
Suggests a critical methodological flaw in a study done about the relationship between the Bem Sex-Role Inventory and certain indices of self-disclosure (Stokes, et al.). Notes that multiple regression analysis was not performed in appropriate hierarchical fashion. Includes Stokes reply to the critique. (PAS)
Decolonizing Education: A Critical Discourse Analysis of Post-Secondary Humanities Textbooks
ERIC Educational Resources Information Center
Harper, Kimberly C.
2012-01-01
This dissertation examines nine post-secondary humanities textbooks published between 2001 and 2011 using an approach that includes both qualitative and quantitative methodology to analyze the written and visual content of humanities textbooks. This dissertation engages in current debates that address bias in humanities textbooks and contributes…
NASA Technical Reports Server (NTRS)
Rodgers, T. E.; Johnson, J. F.
1977-01-01
The logic and methodology for a preliminary grouping of Spacelab and mixed-cargo payloads is proposed in a form that can be readily coded into a computer program by NASA. The logic developed for this preliminary cargo grouping analysis is summarized. Principal input data include the NASA Payload Model, payload descriptive data, Orbiter and Spacelab capabilities, and NASA guidelines and constraints. The first step in the process is a launch interval selection in which the time interval for payload grouping is identified. Logic flow steps are then taken to group payloads and define flight configurations based on criteria that include dedication, volume, area, orbital parameters, pointing, g-level, mass, center of gravity, energy, power, and crew time.
NASA Astrophysics Data System (ADS)
Martino, P.
1980-12-01
A general methodology is presented for conducting an analysis of the various aspects of the hazards associated with the storage and transportation of liquefied natural gas (LNG) which should be considered during the planning stages of a typical LNG ship terminal. The procedure includes the performance of a hazards and system analysis of the proposed site, a probability analysis of accident scenarios and safety impacts, an analysis of the consequences of credible accidents such as tanker accidents, spills and fires, the assessment of risks and the design and evaluation of risk mitigation measures.
Analysis of methods. [information systems evolution environment
NASA Technical Reports Server (NTRS)
Mayer, Richard J. (Editor); Ackley, Keith A.; Wells, M. Sue; Mayer, Paula S. D.; Blinn, Thomas M.; Decker, Louis P.; Toland, Joel A.; Crump, J. Wesley; Menzel, Christopher P.; Bodenmiller, Charles A.
1991-01-01
Information is one of an organization's most important assets. For this reason the development and maintenance of an integrated information system environment is one of the most important functions within a large organization. The Integrated Information Systems Evolution Environment (IISEE) project has as one of its primary goals a computerized solution to the difficulties involved in the development of integrated information systems. To develop such an environment a thorough understanding of the enterprise's information needs and requirements is of paramount importance. This document is the current release of the research performed by the Integrated Development Support Environment (IDSE) Research Team in support of the IISEE project. Research indicates that an integral part of any information system environment would be multiple modeling methods to support the management of the organization's information. Automated tool support for these methods is necessary to facilitate their use in an integrated environment. An integrated environment makes it necessary to maintain an integrated database which contains the different kinds of models developed under the various methodologies. In addition, to speed the process of model development, a procedure or technique is needed to allow automatic translation from one methodology's representation to another while maintaining the integrity of both. The purpose of the analysis of the modeling methods included in this document is to examine these methods with the goal of including them in an integrated development support environment. To accomplish this, and to develop a method for allowing intra-methodology and inter-methodology model element reuse, a thorough understanding of multiple modeling methodologies is necessary.
Currently the IDSE Research Team is investigating the family of Integrated Computer Aided Manufacturing (ICAM) DEFinition (IDEF) languages IDEF(0), IDEF(1), and IDEF(1x), as well as ENALIM, Entity Relationship, Data Flow Diagrams, and Structure Charts, for inclusion in an integrated development support environment.
Pesch, Megan H; Lumeng, Julie C
2017-12-15
Behavioral coding of videotaped eating and feeding interactions can provide researchers with rich observational data and unique insights into eating behaviors, food intake, food selection, as well as the interpersonal and mealtime dynamics of children and their families. Unlike self-report measures of eating and feeding practices, the coding of videotaped eating and feeding behaviors allows for quantitative and qualitative examination of behaviors and practices that participants may not self-report. While this methodology is increasingly common, behavioral coding protocols and methodology are not widely shared in the literature. This has important implications for the validity and reliability of coding schemes across settings. Additional guidance on how to design, implement, code and analyze videotaped eating and feeding behaviors could contribute to advancing the science of behavioral nutrition. The objectives of this narrative review are to review methodology for the design, operationalization, and coding of videotaped behavioral eating and feeding data in children and their families, and to highlight best practices. When capturing eating and feeding behaviors through analysis of videotapes, it is important for the study and coding to be hypothesis-driven. Study design considerations include how best to capture the target behaviors through selection of a controlled experimental laboratory environment versus a home mealtime, the duration of video recording, the number of observations needed to achieve reliability across eating episodes, as well as technical issues in video recording and sound quality. Study design must also take into account plans for coding the target behaviors, which may include behavior frequency, duration, categorization or qualitative descriptors. Coding scheme creation and refinement occur through an iterative process. Reliability between coders can be challenging to achieve but is paramount to the scientific rigor of the methodology.
The analysis approach depends on how the data were coded and collapsed. Behavioral coding of videotaped eating and feeding behaviors can capture rich data "in vivo" that are otherwise unobtainable from self-report measures. While data collection and coding are time-intensive, the data yielded can be extremely valuable. Additional sharing of methodology and coding schemes around eating and feeding behaviors could advance the science and field.
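On inter-coder reliability: the review above does not prescribe a particular statistic, but a common chance-corrected measure of agreement between two coders is Cohen's kappa. An illustrative sketch only:

```python
def cohens_kappa(codes_a, codes_b):
    """Chance-corrected agreement between two coders over the same events."""
    assert len(codes_a) == len(codes_b) and codes_a
    n = len(codes_a)
    # observed proportion of agreement
    po = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # agreement expected by chance, from each coder's marginal frequencies
    cats = set(codes_a) | set(codes_b)
    pe = sum((codes_a.count(c) / n) * (codes_b.count(c) / n) for c in cats)
    return (po - pe) / (1 - pe)
```

Kappa is 1 for perfect agreement and 0 when agreement is no better than chance.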
Developing and validating risk prediction models in an individual participant data meta-analysis
2014-01-01
Background Risk prediction models estimate the risk of developing future outcomes for individuals based on one or more underlying characteristics (predictors). We review how researchers develop and validate risk prediction models within an individual participant data (IPD) meta-analysis, in order to assess the feasibility and conduct of the approach. Methods A qualitative review of the aims, methodology, and reporting in 15 articles that developed a risk prediction model using IPD from multiple studies. Results The IPD approach offers many opportunities but methodological challenges exist, including: unavailability of requested IPD, missing patient data and predictors, and between-study heterogeneity in methods of measurement, outcome definitions and predictor effects. Most articles develop their model using IPD from all available studies and perform only an internal validation (on the same set of data). Ten of the 15 articles did not allow for any study differences in baseline risk (intercepts), potentially limiting their model’s applicability and performance in some populations. Only two articles used external validation (on different data), including a novel method which develops the model on all but one of the IPD studies, tests performance in the excluded study, and repeats by rotating the omitted study. Conclusions An IPD meta-analysis offers unique opportunities for risk prediction research. Researchers can make more of this by allowing separate model intercept terms for each study (population) to improve generalisability, and by using ‘internal-external cross-validation’ to simultaneously develop and validate their model. Methodological challenges can be reduced by prospectively planned collaborations that share IPD for risk prediction. PMID:24397587
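The "internal-external cross-validation" idea described above (develop the model on all but one IPD study, test performance in the omitted study, and rotate) can be sketched in a few lines. The logistic model fit, the c-statistic, and the data layout below are simplifying assumptions for illustration, not the reviewed articles' methods:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, iters=500):
    # minimal logistic regression (with intercept) via gradient descent
    xb = np.c_[np.ones(len(X)), X]
    w = np.zeros(xb.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-xb @ w))
        w -= lr * xb.T @ (p - y) / len(y)
    return w

def c_statistic(y, p):
    # probability that a random event is ranked above a random non-event
    pos, neg = p[y == 1], p[y == 0]
    diff = pos[:, None] - neg[None, :]
    return ((diff > 0).sum() + 0.5 * (diff == 0).sum()) / diff.size

def internal_external_cv(studies):
    """studies: list of (X, y) pairs, one per IPD study.

    Rotates through the studies: develop on all but one, validate on it."""
    perf = []
    for k in range(len(studies)):
        dev = [s for i, s in enumerate(studies) if i != k]
        x_dev = np.vstack([x for x, _ in dev])
        y_dev = np.concatenate([y for _, y in dev])
        w = fit_logistic(x_dev, y_dev)
        x_val, y_val = studies[k]
        p = 1.0 / (1.0 + np.exp(-np.c_[np.ones(len(x_val)), x_val] @ w))
        perf.append(c_statistic(y_val, p))
    return perf
```

Each rotation yields a validation performance in a study that contributed nothing to that model's development, which is what distinguishes this approach from purely internal validation.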
Tepid massage for febrile children: A systematic review and meta-analysis.
Lim, Junghee; Kim, Juyoung; Moon, Bora; Kim, Gaeun
2018-05-10
This study aimed to examine the effect of tepid massage in febrile children compared with other fever management strategies. Experimental studies published in English were included; quasi-experimental studies were also included, given the scarcity of experimental studies in Korean. The search strategy sought to identify published research reports in the English language and covered all major databases up to 2016. The methodological quality of each study was assessed by 2 independent reviewers using the Scottish Intercollegiate Guidelines Network's Methodology Checklist. Means and standard deviations were used for continuous variables, and the standardized mean difference was used for variables on different scales. Heterogeneity was assessed using the I² statistic after visual review of forest plots. This study reviewed mainly the effect of tepid massage on temperature compared with the use of antipyretics, along with adverse effects related to fever management. The results revealed no significant effect of tepid massage on temperature in febrile children. In addition, incidence rates of adverse effects including chills, goose pimples, and discomfort were higher in the tepid massage groups. This meta-analysis showed the need for re-verification of commonly used practices, including tepid massage and proper body temperature measurement. © 2018 John Wiley & Sons Australia, Ltd.
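For reference, the two quantities named in this abstract, the standardized mean difference and the I² heterogeneity statistic, have standard textbook forms. A generic sketch (not the authors' code; Hedges' small-sample correction is shown as one common SMD variant):

```python
import numpy as np

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    # standardized mean difference with Hedges' small-sample correction
    sp = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    j = 1.0 - 3.0 / (4.0 * (n1 + n2) - 9.0)  # correction factor
    return j * d

def fixed_effect_pool(effects, variances):
    # inverse-variance pooled estimate, plus I^2 (percent) from Cochran's Q
    w = 1.0 / np.asarray(variances)
    e = np.asarray(effects)
    pooled = (w * e).sum() / w.sum()
    q = (w * (e - pooled) ** 2).sum()
    df = len(e) - 1
    i2 = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0
    return pooled, i2
```

I² near 0% suggests the study effects are consistent; large values flag heterogeneity worth inspecting on the forest plot.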
The added value of thorough economic evaluation of telemedicine networks.
Le Goff-Pronost, Myriam; Sicotte, Claude
2010-02-01
This paper proposes a thorough framework for the economic evaluation of telemedicine networks. A standard cost analysis methodology was used as the initial base, similar to the evaluation method currently being applied to telemedicine, and to which we suggest adding subsequent stages that enhance the scope and sophistication of the analytical methodology. We completed the methodology with a longitudinal and stakeholder analysis, followed by the calculation of a break-even threshold, a calculation of the economic outcome based on net present value (NPV), an estimate of the social gain through external effects, and an assessment of the probability of social benefits. In order to illustrate the advantages, constraints and limitations of the proposed framework, we tested it in a paediatric cardiology tele-expertise network. The results demonstrate that the project threshold was not reached after the 4 years of the study. Also, the calculation of the project's NPV remained negative. However, the additional analytical steps of the proposed framework allowed us to highlight alternatives that can make this service economically viable. These included: use over an extended period of time, extending the network to other telemedicine specialties, or including it in the services offered by other community hospitals. In sum, the results presented here demonstrate the usefulness of an economic evaluation framework as a way of offering decision makers the tools they need to make comprehensive evaluations of telemedicine networks.
Moss, Robert; Grosse, Thibault; Marchant, Ivanny; Lassau, Nathalie; Gueyffier, François; Thomas, S. Randall
2012-01-01
Mathematical models that integrate multi-scale physiological data can offer insight into physiological and pathophysiological function, and may eventually assist in individualized predictive medicine. We present a methodology for performing systematic analyses of multi-parameter interactions in such complex, multi-scale models. Human physiology models are often based on or inspired by Arthur Guyton's whole-body circulatory regulation model. Despite the significance of this model, it has not been the subject of a systematic and comprehensive sensitivity study. Therefore, we use this model as a case study for our methodology. Our analysis of the Guyton model reveals how the multitude of model parameters combine to affect the model dynamics, and how interesting combinations of parameters may be identified. It also includes a “virtual population” from which “virtual individuals” can be chosen, on the basis of exhibiting conditions similar to those of a real-world patient. This lays the groundwork for using the Guyton model for in silico exploration of pathophysiological states and treatment strategies. The results presented here illustrate several potential uses for the entire dataset of sensitivity results and the “virtual individuals” that we have generated, which are included in the supplementary material. More generally, the presented methodology is applicable to modern, more complex multi-scale physiological models. PMID:22761561
Chung, Ka-Fai; Chan, Man-Sum; Lam, Ying-Yin; Lai, Cindy Sin-Yee; Yeung, Wing-Fai
2017-06-01
Insufficient sleep among students is a major school health problem. School-based sleep education programs tailored to reach large numbers of students may be one of the solutions. A systematic review and meta-analysis was conducted to summarize the programs' effectiveness and current status. Electronic databases were searched up until May 2015. Randomized controlled trials of school-based sleep intervention among 10- to 19-year-old students with an outcome of total sleep duration were included. The methodological quality of the studies was assessed using the Cochrane risk of bias assessment. Seven studies were included, involving 1876 students receiving sleep education programs and 2483 attending classes as usual. Four weekly 50-minute sleep education classes were most commonly provided. Methodological quality was only moderate, with a high or an uncertain risk of bias in several domains. Compared to classes as usual, sleep education programs produced significantly longer weekday and weekend total sleep time and better mood among students at immediate post-treatment, but the improvements were not maintained at follow-up. Limited by the small number of studies and their methodological limitations, the preliminary data showed that school-based sleep education programs produced short-term benefits. Future studies should explore integrating sleep education with delayed school start times or other more effective approaches. © 2017, American School Health Association.
Dent, Andrew W; Asadpour, Ali; Weiland, Tracey J; Paltridge, Debbie
2008-02-01
Fellows of the Australasian College for Emergency Medicine (FACEM) have opportunities to participate in a range of continuing professional development activities. To inform FACEM and assist those involved in planning continuing professional development interventions for FACEM, we undertook a learning needs analysis of emergency physicians. This was an exploratory study using survey methodology. Following questionnaire development by iterative feedback with emergency physicians and researchers, a mailed survey was distributed to all FACEM. The survey comprised eight items on the work and demographic characteristics of FACEM, and 194 items on attitudes to existing learning opportunities, barriers to learning, and perceived learning needs and preferences. Fifty-eight percent (503/854) of all FACEM surveyed responded to the questionnaire, almost half of whom attained their FACEM after the year 2000. The sample comprised mostly males (72.8%), with a mean age of 41.6 years, similar to the ACEM database. Most respondents reported working in ACEM-accredited hospitals (89%) and major referral hospitals (54%), and practised on both children and adults (78%). FACEM reported working on average 26.7 clinical hours per week, with those at private hospitals working a greater proportion of clinical hours than those at other hospital types. As the first of six related reports, this paper documents the methodology used, including questionnaire development, and provides the demographics of responding FACEM, including the clinical and non-clinical hours worked and the type of hospital of principal employment.
Indirect Comparisons: A Review of Reporting and Methodological Quality
Donegan, Sarah; Williamson, Paula; Gamble, Carrol; Tudur-Smith, Catrin
2010-01-01
Background The indirect comparison of two interventions can be valuable in many situations. However, the quality of an indirect comparison will depend on several factors, including the chosen methodology and the validity of the underlying assumptions. Published indirect comparisons are increasingly common in the medical literature, but as yet there are no published recommendations for how they should be reported. Our aim is to systematically review the quality of published indirect comparisons to add to existing empirical data suggesting that improvements can be made when reporting and applying indirect comparisons. Methodology/Findings Reviews applying statistical methods to indirectly compare the clinical effectiveness of two interventions using randomised controlled trials were eligible. We searched (1966–2008) Database of Abstracts and Reviews of Effects, The Cochrane Library, and Medline. Full review publications were assessed for eligibility. Specific criteria to assess quality were developed and applied. Forty-three reviews were included. Adequate methodology was used to calculate the indirect comparison in 41 reviews. Nineteen reviews assessed the similarity assumption using sensitivity analysis, subgroup analysis, or meta-regression. Eleven reviews compared trial-level characteristics. Twenty-four reviews assessed statistical homogeneity. Twelve reviews investigated causes of heterogeneity. Seventeen reviews included direct and indirect evidence for the same comparison; six reviews assessed consistency. One review combined both evidence types. Twenty-five reviews urged caution in interpretation of results, and 24 reviews indicated when results were from indirect evidence by stating this term with the result. Conclusions This review shows that the underlying assumptions are not routinely explored or reported when undertaking indirect comparisons.
We recommend, therefore, that the quality of indirect comparisons should be improved, in particular, by assessing assumptions and reporting the assessment methods applied. We propose that the quality criteria applied in this article may provide a basis to help review authors carry out indirect comparisons and to aid appropriate interpretation. PMID:21085712
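The adjusted indirect comparison most commonly applied in such reviews is the Bucher method: the effect of A versus C is derived via a common comparator B, with the uncertainty of both direct estimates propagated. A minimal Python sketch under that assumption (the review does not state which method each included study used, and the numbers below are invented for illustration):

```python
import math

def bucher_indirect(d_ab, se_ab, d_cb, se_cb):
    """Adjusted indirect comparison of A vs C via common comparator B
    (Bucher method): d_AC = d_AB - d_CB, with the variances summed."""
    d_ac = d_ab - d_cb
    se_ac = math.sqrt(se_ab ** 2 + se_cb ** 2)
    # 95% confidence interval assuming approximate normality
    ci = (d_ac - 1.96 * se_ac, d_ac + 1.96 * se_ac)
    return d_ac, se_ac, ci

# Hypothetical log-odds-ratio estimates from two pairwise meta-analyses
d_ac, se_ac, ci = bucher_indirect(-0.50, 0.10, -0.20, 0.15)
```

Note that the indirect standard error is always larger than either direct one, which is why reviews are urged to flag results obtained this way.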
Xu, Qinguang; Chen, Bei; Wang, Yueyi; Wang, Xuezong; Han, Dapeng; Ding, Daofang; Zheng, Yuxin; Cao, Yuelong; Zhan, Hongsheng; Zhou, Yao
2017-05-01
Knee osteoarthritis (KOA) is the most common form of arthritis, leading to pain and disability in seniors and increased health care utilization. Manual therapy is one widely used physical treatment for KOA. To evaluate the effectiveness and adverse events (AEs) of manual therapy compared to other treatments for relieving pain, stiffness, and physical dysfunction in patients with KOA, we conducted a systematic review and meta-analysis of manual therapy for KOA. We searched PubMed, EMBASE, the Cochrane Library, and Chinese databases for relevant randomized controlled trials (RCTs) of manual therapy for patients with KOA from inception to October 2015 without language restrictions. RCTs comparing manual therapy to placebo or another interventional control, with an appropriate description of randomization, were eligible. Two reviewers independently conducted search result identification, data extraction, and methodological quality assessment. Methodological quality was assessed with the PEDro scale. Pooled data were expressed as standardized mean differences (SMDs) with 95% confidence intervals (CIs) in a random-effects model. Meta-analyses of manual therapy for KOA on pain, stiffness, and physical function were conducted. Fourteen studies involving 841 KOA participants compared to other treatments were included. The methodological quality of most included RCTs was poor; the mean PEDro scale score was 6.6. The meta-analysis results showed that manual therapy had statistically significant effects on relieving pain (SMD = -0.61, 95% CI -0.95 to -0.28, I² = 76%), stiffness (SMD = -0.58, 95% CI -0.95 to -0.21, I² = 81%), improving physical function (SMD = -0.49, 95% CI -0.76 to -0.22, I² = 65%), and total score (SMD = -0.56, 95% CI -0.78 to -0.35, I² = 50%). In the subgroups, however, manual therapy did not show significant improvements in stiffness or physical function when treatment duration was less than 4 weeks.
Long-term data on manual therapy were also insufficient. The limitations of this systematic review include the paucity of literature and inevitable heterogeneity between included studies. The preliminary evidence from our study suggests that manual therapy might be effective and safe for improving pain, stiffness, and physical function in KOA patients and could be offered as a complementary and alternative option. However, the evidence may be limited by potential bias and the poor methodological quality of included studies. High-quality RCTs with long-term follow-up are warranted to confirm our findings. Key words: knee osteoarthritis, manual therapy, systematic review.
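The pooling described above, standardized mean differences combined in a random-effects model, can be sketched with the DerSimonian-Laird estimator. This is a common choice, but the abstract does not specify which estimator was used, and the study-level inputs below are invented:

```python
import math

def dersimonian_laird(effects, variances):
    """Pool study-level SMDs with a DerSimonian-Laird random-effects model.
    Returns the pooled SMD, its 95% CI, and the I^2 heterogeneity statistic."""
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                          # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]           # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2

# Three hypothetical trials: SMD estimates with their variances
pooled, ci, i2 = dersimonian_laird([-0.8, -0.5, -0.3], [0.04, 0.05, 0.06])
```

When tau2 comes out positive, the random-effects weights flatten toward equality, which is why the pooled estimate drifts toward the unweighted mean of the studies.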
Institutions and national development in Latin America: a comparative study
Portes, Alejandro; Smith, Lori D.
2013-01-01
We review the theoretical and empirical literatures on the role of institutions in national development as a prelude to presenting a more rigorous and measurable definition of the concept and a methodology to study this relationship at the national and subnational levels. The existing research literature features conflicting definitions of the concept of “institutions” and empirical tests based mostly on reputational indices, with countries as units of analysis. The present study’s methodology is based on a set of five strategic organizations studied comparatively in five Latin American countries. These include key federal agencies, public administrative organizations, and stock exchanges. Systematic analysis of the results shows a pattern of differences between economically oriented institutions and those entrusted with providing basic services to the general population. Consistent differences in institutional quality also emerge across countries, despite similar levels of economic development. Using the algebraic methods developed by Ragin, we test six hypotheses about factors determining the developmental character of particular institutions. Implications of the results for theory and for the methodological practices of future studies in this field are discussed. PMID:26543407
Røislien, Jo; Winje, Brita
2013-09-20
Clinical studies frequently include repeated measurements of individuals, often for long periods. We present a methodology for extracting common temporal features across a set of individual time series observations. In particular, the methodology explores extreme observations within the time series, such as spikes, as a possible common temporal phenomenon. Wavelet basis functions are attractive in this sense, as they are localized in both time and frequency domains simultaneously, allowing for localized feature extraction from a time-varying signal. We apply wavelet basis function decomposition of individual time series, with corresponding wavelet shrinkage to remove noise. We then extract common temporal features using linear principal component analysis on the wavelet coefficients, before inverse transformation back to the time domain for clinical interpretation. We demonstrate the methodology on a subset of a large fetal activity study aiming to identify temporal patterns in fetal movement (FM) count data in order to explore formal FM counting as a screening tool for identifying fetal compromise and thus preventing adverse birth outcomes. Copyright © 2013 John Wiley & Sons, Ltd.
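The pipeline described, wavelet decomposition of each individual series, shrinkage of the detail coefficients to suppress noise, then linear PCA on the stacked coefficients, can be sketched with a one-level Haar transform. The paper does not specify the wavelet basis or threshold, so the helper names and parameters below are illustrative:

```python
import numpy as np

def haar_decompose(x):
    """One-level Haar wavelet transform: (approximation, detail) coefficients."""
    x = np.asarray(x, dtype=float)
    s2 = np.sqrt(2.0)
    return (x[0::2] + x[1::2]) / s2, (x[0::2] - x[1::2]) / s2

def soft_threshold(d, t):
    """Wavelet shrinkage: pull detail coefficients toward zero to denoise."""
    return np.sign(d) * np.maximum(np.abs(d) - t, 0.0)

def common_features(series, threshold=0.5, n_components=2):
    """Transform each individual series, shrink the detail coefficients,
    then extract shared temporal features with PCA (via SVD) on the
    stacked wavelet coefficients."""
    coeffs = []
    for x in series:
        approx, detail = haar_decompose(x)
        coeffs.append(np.concatenate([approx, soft_threshold(detail, threshold)]))
    m = np.vstack(coeffs)
    m -= m.mean(axis=0)                       # centre before PCA
    _, _, vt = np.linalg.svd(m, full_matrices=False)
    return vt[:n_components]                  # principal feature directions

rng = np.random.default_rng(0)
series = rng.normal(size=(6, 8))              # six individuals, eight time points
features = common_features(series)
```

The returned directions live in coefficient space; inverse-transforming them back to the time domain, as the paper does, would recover interpretable temporal patterns such as spikes.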
Innovation design of medical equipment based on TRIZ.
Gao, Changqing; Guo, Leiming; Gao, Fenglan; Yang, Bo
2015-01-01
Medical equipment is closely related to personal health and safety, and is therefore of concern to its users. Furthermore, competition among medical equipment manufacturers is intense, and innovative design is the key to success for these enterprises. The design of medical equipment usually spans vastly different domains of knowledge, so applying modern design methodology to medical equipment and technology invention is an urgent requirement. TRIZ (a Russian abbreviation usually translated as 'theory of inventive problem solving') originated in Russia and comprises problem-solving methods developed through worldwide patent analysis, including the Conflict Matrix, Substance-Field Analysis, Standard Solutions, and Effects. As an engineering example, an infusion system is analyzed and redesigned with TRIZ, generating an innovative idea that frees the caregiver from continuously watching the infusion bag. The research in this paper demonstrates the process of applying TRIZ to medical device invention and shows that TRIZ is an inventive problem-solving methodology that can be used widely in medical device development.
Medical device procurement in low- and middle-income settings: protocol for a systematic review.
Diaconu, Karin; Chen, Yen-Fu; Manaseki-Holland, Semira; Cummins, Carole; Lilford, Richard
2014-10-21
Medical device procurement processes for low- and middle-income countries (LMICs) are a poorly understood and researched topic. To support LMIC policy formulation in this area, international public health organizations and research institutions issue a large body of predominantly grey literature including guidelines, manuals and recommendations. We propose to undertake a systematic review to identify and explore the medical device procurement methodologies suggested within this and further literature. Procurement facilitators and barriers will be identified, and methodologies for medical device prioritization under resource constraints will be discussed. Searches of both bibliographic and grey literature will be conducted to identify documents relating to the procurement of medical devices in LMICs. Data will be extracted according to protocol on a number of pre-specified issues and variables. First, data relating to the specific settings described within the literature will be noted. Second, information relating to medical device procurement methodologies will be extracted, including prioritization of procurement under resource constraints, the use of evidence (e.g. cost-effectiveness evaluations, burden of disease data) as well as stakeholders participating in procurement processes. Information relating to prioritization methodologies will be extracted in the form of quotes or keywords, and analysis will include qualitative meta-summary. Narrative synthesis will be employed to analyse data otherwise extracted. The PRISMA guidelines for reporting will be followed. The current review will identify recommended medical device procurement methodologies for LMICs. Prioritization methods for medical device acquisition will be explored. Relevant stakeholders, facilitators and barriers will be discussed. The review is aimed at both LMIC decision makers and the international research community and hopes to offer a first holistic conceptualization of this topic.
Overdiagnosis across medical disciplines: a scoping review
de Groot, Joris A H; Reitsma, Johannes B; Moons, Karel G M; Hooft, Lotty; Naaktgeboren, Christiana A
2017-01-01
Objective To provide insight into how and in what clinical fields overdiagnosis is studied and give directions for further applied and methodological research. Design Scoping review. Data sources Medline up to August 2017. Study selection All English studies on humans, in which overdiagnosis was discussed as a dominant theme. Data extraction Studies were assessed on clinical field, study aim (ie, methodological or non-methodological), article type (eg, primary study, review), the type and role of diagnostic test(s) studied and the context in which these studies discussed overdiagnosis. Results From 4896 studies, 1851 were included for analysis. Half of all studies on overdiagnosis were performed in the field of oncology (50%). Other prevalent clinical fields included mental disorders, infectious diseases and cardiovascular diseases accounting for 9%, 8% and 6% of studies, respectively. Overdiagnosis was addressed from a methodological perspective in 20% of studies. Primary studies were the most common article type (58%). The type of diagnostic tests most commonly studied were imaging tests (32%), although these were predominantly seen in oncology and cardiovascular disease (84%). Diagnostic tests were studied in a screening setting in 43% of all studies, but as high as 75% of all oncological studies. The context in which studies addressed overdiagnosis related most frequently to its estimation, accounting for 53%. Methodology on overdiagnosis estimation and definition provided a source for extensive discussion. Other contexts of discussion included definition of disease, overdiagnosis communication, trends in increasing disease prevalence, drivers and consequences of overdiagnosis, incidental findings and genomics. Conclusions Overdiagnosis is discussed across virtually all clinical fields and in different contexts. 
The variability in characteristics between studies and lack of consensus on overdiagnosis definition indicate the need for a uniform typology to improve coherence and comparability of studies on overdiagnosis. PMID:29284720
Huang, Erich P; Wang, Xiao-Feng; Choudhury, Kingshuk Roy; McShane, Lisa M; Gönen, Mithat; Ye, Jingjing; Buckler, Andrew J; Kinahan, Paul E; Reeves, Anthony P; Jackson, Edward F; Guimaraes, Alexander R; Zahlmann, Gudrun
2015-02-01
Medical imaging serves many roles in patient care and the drug approval process, including assessing treatment response and guiding treatment decisions. These roles often involve a quantitative imaging biomarker, an objectively measured characteristic of the underlying anatomic structure or biochemical process derived from medical images. Before a quantitative imaging biomarker is accepted for use in such roles, the imaging procedure to acquire it must undergo evaluation of its technical performance, which entails assessment of performance metrics such as repeatability and reproducibility of the quantitative imaging biomarker. Ideally, this evaluation will involve quantitative summaries of results from multiple studies to overcome limitations due to the typically small sample sizes of technical performance studies and/or to include a broader range of clinical settings and patient populations. This paper is a review of meta-analysis procedures for such an evaluation, including identification of suitable studies, statistical methodology to evaluate and summarize the performance metrics, and complete and transparent reporting of the results. This review addresses challenges typical of meta-analyses of technical performance, particularly small study sizes, which often cause violations of assumptions underlying standard meta-analysis techniques. Alternative approaches to address these difficulties are also presented; simulation studies indicate that they outperform standard techniques when some studies are small. The meta-analysis procedures presented are also applied to actual [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) test-retest repeatability data for illustrative purposes. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
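Repeatability, one of the performance metrics named above, is commonly summarized by the within-subject standard deviation and the repeatability coefficient computed from test-retest pairs. A minimal single-study sketch, not the paper's meta-analytic procedure, with invented measurements:

```python
import math

def repeatability_coefficient(test, retest):
    """Within-subject SD (wSD) and repeatability coefficient
    (RC = 1.96 * sqrt(2) * wSD) from paired test-retest measurements;
    about 95% of repeat differences are expected to fall within +/- RC."""
    diffs = [a - b for a, b in zip(test, retest)]
    wsd = math.sqrt(sum(d * d for d in diffs) / (2 * len(diffs)))
    return wsd, 1.96 * math.sqrt(2.0) * wsd

# Hypothetical test-retest biomarker values for three subjects
wsd, rc = repeatability_coefficient([10.1, 12.0, 11.3], [10.9, 12.2, 10.8])
```

A meta-analysis such as the one reviewed would then pool wSD estimates (or their squares) across studies, weighting by study size.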
MicroRNAs for osteosarcoma in the mouse: a meta-analysis
Chang, Junli; Yao, Min; Li, Yimian; Zhao, Dongfeng; Hu, Shaopu; Cui, Xuejun; Liu, Gang; Shi, Qi; Wang, Yongjun; Yang, Yanping
2016-01-01
Osteosarcoma (OS) is the most common primary malignant bone tumor, with high morbidity, occurring mainly in children and young adults. As key components of gene-regulatory networks, microRNAs (miRNAs) control many critical pathophysiological processes, including the initiation and progression of cancers. The objective of this study is to summarize and evaluate the potential of miRNAs as targets for prevention and treatment of OS in mouse models, and to assess the methodological quality of current studies. We searched PubMed, Web of Science, Embase, Wan Fang Database, VIP Database, China Knowledge Resource Integrated Database, and Chinese BioMedical from their inception to 10 May 2016. Two reviewers separately screened the controlled studies estimating the effects of miRNAs on osteosarcoma in mice, and a pair-wise analysis was performed. Thirty-six studies with adequate randomization were selected and included in the meta-analysis. We found that blocking oncogenic miRNAs or restoring decreased miRNAs in cancer cells could significantly suppress the progression of OS in vivo, as assessed by tumor volume and tumor weight. This meta-analysis suggests that miRNAs are potential therapeutic targets for OS and that correcting the altered expression of miRNAs significantly suppresses the progression of OS in mouse models. However, the overall methodological quality of the included studies was low, and more animal studies with rigorous design must be carried out before a miRNA-based treatment can be translated from animal studies to clinical trials. PMID:27852052
Safety assessment methodology in management of spent sealed sources.
Mahmoud, Narmine Salah
2005-02-14
Radioactive waste can cause environmental hazards after disposal. It is therefore important to develop and establish safety assessment methodologies to study and estimate the possible hazards, and to institute safety measures that prevent those hazards from evolving. Spent sealed sources are a specific type of radioactive waste: according to the IAEA definition, they are sources no longer in use because of activity decay, damage, misuse, loss, or theft. Accidental human exposure from spent sealed sources can occur from the moment they become spent until their disposal. For this reason, safety assessment methodologies were tailored to suit the management of spent sealed sources. To build understanding and confidence in this study, a validation analysis was undertaken by considering an accident that occurred in Egypt in June 2000 (the Meet-Halfa accident involving an iridium-192 source). This work covers the safety assessment approach for spent sealed sources, comprising the assessment context, the processes by which an active source becomes spent, accident scenarios, mathematical models for dose calculations, radiological consequences, and regulatory criteria. It also includes a validation study, carried out by evaluating a theoretical scenario against the real Meet-Halfa accident scenario based on the clinical assessment of affected individuals.
Konz, Tobias; Migliavacca, Eugenia; Dayon, Loïc; Bowman, Gene; Oikonomidi, Aikaterini; Popp, Julius; Rezzi, Serge
2017-05-05
Here we describe the development, validation and application of a quantitative methodology for the simultaneous determination of 29 elements in human serum using state-of-the-art inductively coupled plasma triple quadrupole mass spectrometry (ICP-MS/MS). This new methodology offers high-throughput elemental profiling using simple dilution of a minimal quantity of serum. We report the outcomes of the validation procedure, including limits of detection/quantification, linearity of calibration curves, precision, recovery and measurement uncertainty. ICP-MS/MS-based ionomics was used to analyze the serum of 120 older adults. Following a metabolomic data mining approach, the generated ionome profiles were subjected to principal component analysis, revealing gender- and age-specific differences. The ionome of female individuals was marked by higher levels of calcium, phosphorus, copper and copper-to-zinc ratio, while iron concentration was lower with respect to male subjects. Age was associated with lower concentrations of zinc. These findings were complemented with additional readouts to interpret micronutrient status, including ceruloplasmin, ferritin and inorganic phosphate. Our data support a gender-specific compartmentalization of the ionome that may reflect different bone remodelling in female individuals. Our ICP-MS/MS methodology enriches the panel of validated "omics" approaches to study molecular relationships between the exposome and the ionome in relation to nutrition and health.
Authentic leadership in healthcare: a scoping review.
Malila, Niina; Lunkka, Nina; Suhonen, Marjo
2018-02-05
Purpose The purpose of this paper is to review peer-reviewed original research articles on authentic leadership (AL) in health care to identify potential research gaps and present recommendations for future research. The objectives are to examine and map evidence of the main characteristics, research themes and methodologies in the studies. AL is a leader's non-authoritarian, ethical and transparent behaviour pattern. Design/methodology/approach A scoping review with thematic analysis was conducted. A three-step search strategy was used, combining database and manual searches. The included studies were English-language peer-reviewed original research articles referring to both AL and health care. Findings In total, 29 studies were included. The studies predominantly concerned Canadian nurses in acute care hospitals. AL was understood according to its original definition. The review identified four research themes: well-being at work, patient care quality, work environment and AL promotion. Quantitative research methodology with the Authentic Leadership Questionnaire and cross-sectional designs was prevalent. Research limitations/implications Future research needs more variation in research themes, study populations, settings, organisations, work sectors, geographical origins and theory perspectives. Different research methodologies, such as qualitative and mixed-methods research and longitudinal designs, should be used more. Originality/value This is presumably the first literature review to map the research on AL in health care.
Assessment of methodologies for analysis of the Dungeness B accidental aircraft crash risk.
DOE Office of Scientific and Technical Information (OSTI.GOV)
LaChance, Jeffrey L.; Hansen, Clifford W.
2010-09-01
The Health and Safety Executive (HSE) requested that Sandia National Laboratories (SNL) review the aircraft crash methodology for nuclear facilities being used in the United Kingdom (UK). The scope of the work included a review of one method utilized in the UK for assessing the potential for accidental airplane crashes into nuclear facilities (Task 1) and a comparison of the UK methodology against similar International Atomic Energy Agency (IAEA), United States (US) Department of Energy (DOE), and US Nuclear Regulatory Commission (NRC) methods (Task 2). Based on the conclusions from Tasks 1 and 2, an additional Task 3 would provide an assessment of a site-specific crash frequency for the Dungeness B facility using one of the other methodologies. This report documents the results of Task 2. The comparison of the different methods was performed for the three primary contributors to aircraft crash risk at the Dungeness B site: airfield-related crashes, crashes below airways, and background crashes. The methods and data specified in each methodology were compared for each of these risk contributors, differences in the methodologies were identified, and the importance of these differences was qualitatively and quantitatively assessed. The bases for each of the methods and the data used were considered in this assessment process. A comparison of the treatment of the consequences of the aircraft crashes was not included in this assessment because the frequency of crashes into critical structures is currently low based on the existing Dungeness B assessment. Although the comparison found substantial differences between the UK methodology and the three alternative methodologies (IAEA, NRC, and DOE), this assessment concludes that use of any of the alternatives would not change the conclusions reached for the Dungeness B site. Performance of Task 3 is thus not recommended.
Modern proposal of methodology for retrieval of characteristic synthetic rainfall hyetographs
NASA Astrophysics Data System (ADS)
Licznar, Paweł; Burszta-Adamiak, Ewa; Łomotowski, Janusz; Stańczyk, Justyna
2017-11-01
The modern engineering practice of designing and modelling complex drainage systems is based on hydrodynamic modelling and is probabilistic in character. Its practical application requires a change in the rainfall models accepted as input. Previously used artificial rainfall models of simplified form, such as block precipitation or Euler type II design storms, are no longer sufficient. There is an urgent need to clarify the methodology of standardized rainfall hyetographs so that it takes into account the temporal dynamics of local storm rainfall. The aim of this paper is to propose an innovative methodology for determining standardized rainfall hyetographs, based on statistical processing of collections of actual local precipitation records. The proposed methodology classifies standardized rainfall hyetographs using cluster analysis, and its application is presented for selected rain gauges located in Poland. The resulting synthetic rainfall hyetographs may be used for hydrodynamic modelling of sewerage systems, including probabilistic determination of the necessary capacity of retention reservoirs.
Krauter, Paula; Edwards, Donna; Yang, Lynn; Tucker, Mark
2011-09-01
Decontamination and recovery of a facility or outdoor area after a wide-area biological incident involving a highly persistent agent (eg, Bacillus anthracis spores) is a complex process that requires extensive information and significant resources, which are likely to be limited, particularly if multiple facilities or areas are affected. This article proposes a systematic methodology for evaluating information to select the decontamination or alternative treatments that optimize use of resources if decontamination is required for the facility or area. The methodology covers a wide range of approaches, including volumetric and surface decontamination, monitored natural attenuation, and seal and abandon strategies. A proposed trade-off analysis can help decision makers understand the relative appropriateness, efficacy, and labor, skill, and cost requirements of the various decontamination methods for the particular facility or area needing treatment--whether alone or as part of a larger decontamination effort. Because the state of decontamination knowledge and technology continues to evolve rapidly, the methodology presented here is designed to accommodate new strategies and materials and changing information.
NASA Astrophysics Data System (ADS)
Jia, Xiaodong; Jin, Chao; Buzza, Matt; Di, Yuan; Siegel, David; Lee, Jay
2018-01-01
Successful applications of Diffusion Maps (DM) in machine failure detection and diagnosis have been reported in several recent studies. DM provides an efficient way to visualize high-dimensional, complex and nonlinear machine data, and thus reveals more knowledge about the machine under monitoring. In this paper, a DM-based methodology named DM-EVD is proposed for online machine degradation assessment, abnormality detection and diagnosis. Several limitations and challenges of using DM for machine health monitoring are analyzed and addressed. Based on the proposed DM-EVD, a deviation-based methodology is then proposed to accommodate additional dimension-reduction methods. In this work, the incorporation of Laplacian Eigenmaps and Principal Component Analysis (PCA) is explored; the latter algorithm is named PCA-Dev and is validated in the case study. To demonstrate the proposed methodology, case studies from diverse fields are presented and investigated, with improved results reported by benchmarking against other machine learning algorithms.
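In its basic form, which DM-EVD builds on, a diffusion map converts pairwise Gaussian affinities into a row-normalised Markov transition matrix and embeds the data with its leading non-trivial eigenvectors. A minimal numpy sketch under that textbook formulation (the kernel width is illustrative, and this omits the eigendecomposition-update and deviation steps of the proposed methodology):

```python
import numpy as np

def diffusion_map(x, eps=1.0, n_components=2):
    """Minimal diffusion-map embedding: Gaussian affinities, row-normalised
    into a Markov transition matrix, then the leading non-trivial
    eigenvectors (scaled by their eigenvalues) give the coordinates."""
    d2 = ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1)    # pairwise sq. distances
    k = np.exp(-d2 / eps)                                  # affinity kernel
    p = k / k.sum(axis=1, keepdims=True)                   # transition matrix
    vals, vecs = np.linalg.eig(p)
    order = np.argsort(-vals.real)
    idx = order[1:n_components + 1]                        # skip trivial eigenvector
    return vecs.real[:, idx] * vals.real[idx]

rng = np.random.default_rng(1)
x = rng.normal(size=(10, 3))                               # ten machine-data samples
embedding = diffusion_map(x)
```

Monitoring schemes like the one described then track how new samples deviate from the healthy-baseline embedding.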
NASA standard: Trend analysis techniques
NASA Technical Reports Server (NTRS)
1990-01-01
Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
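The standard's linear and exponential trend models can both be fitted with ordinary least squares; the exponential case reduces to a linear fit of log-transformed data. The sketch below is illustrative (the function names are invented here), not part of the NASA standard itself:

```python
import math

def linfit(xs, ys):
    """Ordinary least-squares fit: ys ~ a + b * xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b  # intercept, slope

def exponential_trend(xs, ys):
    """Fit ys ~ A * exp(b * xs) by regressing log(ys) on xs.

    Requires all ys > 0; returns (A, b)."""
    a, b = linfit(xs, [math.log(y) for y in ys])
    return math.exp(a), b
```

A quadratic model would be fitted the same way by least squares with an added x^2 term; for trend work the fitted coefficients (slope, growth rate) are usually the quantities of interest.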
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-08
... analysis, survey methodology, geospatial analysis, econometrics, cognitive psychology, and computer science... following disciplines: demography, economics, geography, psychology, statistics, survey methodology, social... expertise in such areas as demography, economics, geography, psychology, statistics, survey methodology...
Failure mode effect analysis and fault tree analysis as a combined methodology in risk management
NASA Astrophysics Data System (ADS)
Wessiani, N. A.; Yoshio, F.
2018-04-01
Many studies have reported the implementation of Failure Mode Effect Analysis (FMEA) and Fault Tree Analysis (FTA) as methods in risk management. However, most studies choose only one of these two methods for their risk management methodology. Combining the two methods, on the other hand, reduces the drawbacks each has when implemented separately. This paper aims to combine the methodologies of FMEA and FTA in assessing risk. A case study in a metal company illustrates how this combined methodology can be implemented. In the case study, the combined methodology assesses the internal risks that occur in the production process. Those internal risks should then be mitigated based on their level of risk.
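The quantitative cores of the two combined methods are small: FMEA ranks failure modes by a Risk Priority Number (severity x occurrence x detection, each conventionally rated 1-10), while FTA propagates basic-event probabilities through AND/OR gates. A minimal sketch, assuming independent basic events (the helper names are illustrative, not from the paper):

```python
def rpn(severity, occurrence, detection):
    """FMEA Risk Priority Number; each factor is conventionally rated 1-10."""
    return severity * occurrence * detection

def or_gate(*probs):
    """Fault-tree OR gate: the top event occurs if any independent
    basic event occurs, P = 1 - prod(1 - p_i)."""
    complement = 1.0
    for p in probs:
        complement *= 1.0 - p
    return 1.0 - complement

def and_gate(*probs):
    """Fault-tree AND gate: all independent basic events must occur."""
    product = 1.0
    for p in probs:
        product *= p
    return product
```

In a combined workflow, high-RPN failure modes from the FMEA become candidate basic events or intermediate events in the fault tree, whose gate logic then quantifies the top-level production-process risk.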
Kassianos, Angelos P; Ioannou, Myria; Koutsantoni, Marianna; Charalambous, Haris
2018-01-01
Specialized palliative care (SPC) is currently underutilized or provided late in cancer care. The aim of this systematic review and meta-analysis is to critically evaluate the impact of SPC on patients' health-related quality of life (HRQoL). Five databases were searched through June 2016. Randomized controlled trials (RCTs) and prospective studies using a pre- and post-assessment of HRQoL were included. The PRISMA reporting statement was followed. Criteria from available checklists were used to evaluate the studies' quality. A meta-analysis followed using random-effect models separately for RCTs and non-RCTs. Eleven studies including five RCTs and 2939 cancer patients published between 2001 and 2014 were identified. There was improved HRQoL in patients with cancer following SPC, especially in symptoms like pain, nausea, and fatigue, as well as improvement of physical and psychological functioning. Less or no improvement was observed in the social and spiritual domains. In general, studies of inpatients showed a larger benefit from SPC than studies of outpatients, whereas patients' age and treatment duration did not moderate the impact of SPC. Methodological shortcomings of the included studies include high attrition rates, low precision and power, and poor reporting of control procedures. These methodological problems and publication bias call for higher-quality studies to be designed, funded, and published. However, there is a clear message that SPC is multi-disciplinary and aims at palliation of symptoms and burden in line with current recommendations.
Cuevas, Soledad
Agriculture is a major contributor to greenhouse gas emissions, an important part of which is associated to deforestation and indirect land use change. Appropriate and coherent food policies can play an important role in aligning health, economic and environmental goals. From the point of view of policy analysis, however, this requires multi-sectoral, interdisciplinary approaches which can be highly complex. Important methodological advances in the area are not exempted from limitations and criticism. We argue that there is scope for further developments in integrated quantitative and qualitative policy analysis combining existing methods, including mathematical modelling and stakeholder analysis. We outline methodological trends in the field, briefly characterise integrated mixed methods policy analysis and identify contributions, challenges and opportunities for future research. In particular, this type of approach can help address issues of uncertainty and context-specific validity, incorporate multiple perspectives and help advance meaningful interdisciplinary collaboration in the field. Substantial challenges remain, however, such as the integration of key issues related to non-communicable disease, or the incorporation of a broader range of qualitative approaches that can address important cultural and ethical dimensions of food.
Prevalence of hypertension among adolescents: systematic review and meta-analysis.
Gonçalves, Vivian Siqueira Santos; Galvão, Taís Freire; de Andrade, Keitty Regina Cordeiro; Dutra, Eliane Said; Bertolin, Maria Natacha Toral; de Carvalho, Kenia Mara Baiocchi; Pereira, Mauricio Gomes
2016-01-01
To estimate the prevalence of hypertension among adolescent Brazilian students, a systematic review of school-based cross-sectional studies was conducted. The articles were searched in the databases MEDLINE, Embase, Scopus, LILACS, SciELO, Web of Science, the CAPES thesis database, and Trip Database. In addition, we examined the reference lists of relevant studies to identify potentially eligible articles. No restrictions regarding publication date, language, or status were applied. The studies were selected by two independent evaluators, who also extracted the data and assessed the methodological quality following eight criteria related to sampling, measuring blood pressure, and presenting results. The meta-analysis was calculated using a random effects model, and analyses were performed to investigate heterogeneity. We retrieved 1,577 articles from the search and included 22 in the review. The included articles corresponded to 14,115 adolescents, 51.2% (n = 7,230) female. We observed a variety of techniques, equipment, and references used. The prevalence of hypertension was 8.0% (95%CI 5.0-11.0; I2 = 97.6%): 9.3% (95%CI 5.6-13.6; I2 = 96.4%) in males and 6.5% (95%CI 4.2-9.1; I2 = 94.2%) in females. The meta-regression failed to identify the causes of the heterogeneity among studies. Despite the differences found in the methodologies of the included studies, the results of this systematic review indicate that hypertension is prevalent in the Brazilian adolescent school population. For future investigations, we suggest the standardization of techniques, equipment, and references, aiming at improving the methodological quality of the studies.
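A random-effects meta-analysis of the kind used here is commonly computed with the DerSimonian-Laird estimator: study weights are deflated by an estimated between-study variance tau^2, and I^2 summarizes the share of variability due to heterogeneity. A minimal sketch (not the authors' actual code; inputs are study-level estimates and their variances on a suitable scale):

```python
import math

def dersimonian_laird(estimates, variances):
    """Random-effects pooling (DerSimonian-Laird).

    Returns the pooled estimate, its approximate 95% CI, and the
    I^2 heterogeneity statistic (in percent)."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, estimates)) / sum(w)
    # Cochran's Q and the method-of-moments between-study variance
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, estimates))
    df = len(estimates) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # Re-weight with the between-study variance added in
    wr = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(wr, estimates)) / sum(wr)
    se = math.sqrt(1.0 / sum(wr))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2
```

For prevalence data the estimates would typically first be transformed (e.g. logit or double-arcsine) and the pooled value back-transformed afterwards.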
Assessing quality of reports on randomized clinical trials in nursing journals.
Parent, Nicole; Hanley, James A
2009-01-01
Several surveys have assessed the quality of reports on randomized clinical trials (RCTs) published in general and specialty medical journals. The aim of these surveys was to raise scientific consciousness about methodological aspects pertaining to internal and external validity. These reviews have suggested that the methodological quality could be improved. We conducted a survey of reports on RCTs published in nursing journals to assess their methodological quality. The features we considered included sample size, flow of participants, assessment of baseline comparability, randomization, blinding, and statistical analysis. We collected data from all reports of RCTs published between January 1994 and December 1997 in Applied Nursing Research, Heart & Lung, and Nursing Research. We hand-searched the journals and included all 54 articles in which authors reported that individuals had been randomly allocated to distinct groups. We collected data using a condensed form of the Consolidated Standards of Reporting Trials (CONSORT) statement for structured reporting of RCTs (Begg et al., 1996). Sample size calculations were included in only 22% of the reports. Only 48% of the reports provided information about the type of randomization, and a mere 22% described blinding strategies. Comparisons of baseline characteristics using hypothesis tests were inappropriately performed in more than 76% of the reports. Excessive use and unstructured reporting of significance testing were common (59%), and all reports failed to provide the magnitude of treatment differences with confidence intervals. Better methodological quality in reports of RCTs will help raise the standards of nursing research.
Using Sociograms to Enhance Power and Voice in Focus Groups.
Baiardi, Janet M; Gultekin, Laura; Brush, Barbara L
2015-01-01
To discuss the use of sociograms in our focus groups with homeless sheltered mothers and to assess facilitator influence and the distribution of power among participants. An exploratory, descriptive qualitative design that utilizes both focus groups and sociograms. Two focus groups were conducted in December 2009 (N = 7) and January 2010 (N = 4). Data analysis included a content analysis and a process analysis using sociograms to graphically represent group participant dynamics. Use of the sociogram provided a means to assess the influence of the facilitator as well as to quantify the degree to which group participants' voices were included. Using sociograms provides a viable mechanism to complement content analysis and increase the methodological rigor of focus groups in health care research. © 2015 Wiley Periodicals, Inc.
Evolutionary Computing Methods for Spectral Retrieval
NASA Technical Reports Server (NTRS)
Terrile, Richard; Fink, Wolfgang; Huntsberger, Terrance; Lee, Seugwon; Tisdale, Edwin; VonAllmen, Paul; Tinetti, Geivanna
2009-01-01
A methodology for processing spectral images to retrieve information on underlying physical, chemical, and/or biological phenomena is based on evolutionary and related computational methods implemented in software. In a typical case, the solution (the information that one seeks to retrieve) consists of parameters of a mathematical model that represents one or more of the phenomena of interest. The methodology was developed for the initial purpose of retrieving the desired information from spectral image data acquired by remote-sensing instruments aimed at planets (including the Earth). Examples of information desired in such applications include trace gas concentrations, temperature profiles, surface types, day/night fractions, cloud/aerosol fractions, seasons, and viewing angles. The methodology is also potentially useful for retrieving information on chemical and/or biological hazards in terrestrial settings. In this methodology, one utilizes an iterative process that minimizes a fitness function indicative of the degree of dissimilarity between observed and synthetic spectral and angular data. The evolutionary computing methods that lie at the heart of this process yield a population of solutions (sets of the desired parameters) within an accuracy represented by a fitness-function value specified by the user. The evolutionary computing methods (ECM) used in this methodology are Genetic Algorithms and Simulated Annealing, both of which are well-established optimization techniques and have also been described in previous NASA Tech Briefs articles. These are embedded in a conceptual framework, represented in the architecture of the implementing software, that enables automatic retrieval of spectral and angular data and analysis of the retrieved solutions for uniqueness.
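Of the two ECMs named above, simulated annealing is the simpler to sketch: candidate parameter sets are perturbed at random, and worse candidates are occasionally accepted with a probability that shrinks as a temperature parameter cools, which helps escape local minima of the fitness function. The following is a generic illustration, not the NASA retrieval code; the step size, cooling schedule, and function names are assumptions:

```python
import math
import random

def simulated_annealing(fitness, x0, step=0.5, t0=1.0, cooling=0.95,
                        iters=2000, seed=42):
    """Minimize a scalar fitness function over a vector of parameters.

    Worse moves are accepted with probability exp(-delta/T); the
    temperature T decays geometrically each iteration."""
    rng = random.Random(seed)
    x, fx = list(x0), fitness(x0)
    best, fbest = list(x), fx
    t = t0
    for _ in range(iters):
        # Perturb every parameter by a uniform random step
        cand = [xi + rng.uniform(-step, step) for xi in x]
        fc = fitness(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / max(t, 1e-12)):
            x, fx = cand, fc
        if fx < fbest:                 # track the best solution ever seen
            best, fbest = list(x), fx
        t *= cooling
    return best, fbest
```

In a retrieval setting, `fitness` would measure the dissimilarity between observed and synthetic spectra generated from the candidate model parameters, and the loop would stop once the fitness fell below a user-specified accuracy.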
Roetzheim, Richard G; Freund, Karen M; Corle, Don K; Murray, David M; Snyder, Frederick R; Kronman, Andrea C; Jean-Pierre, Pascal; Raich, Peter C; Holden, Alan Ec; Darnell, Julie S; Warren-Mears, Victoria; Patierno, Steven
2012-04-01
The Patient Navigation Research Program (PNRP) is a cooperative effort of nine research projects, with similar clinical criteria but with different study designs. To evaluate projects such as PNRP, it is desirable to perform a pooled analysis to increase power relative to the individual projects. There is no agreed-upon prospective methodology, however, for analyzing combined data arising from different study designs. Expert opinions were thus solicited from the members of the PNRP Design and Analysis Committee. To review possible methodologies for analyzing combined data arising from heterogeneous study designs. The Design and Analysis Committee critically reviewed the pros and cons of five potential methods for analyzing combined PNRP project data. The conclusions were based on simple consensus. The five approaches reviewed included the following: (1) analyzing and reporting each project separately, (2) combining data from all projects and performing an individual-level analysis, (3) pooling data from projects having similar study designs, (4) analyzing pooled data using a prospective meta-analytic technique, and (5) analyzing pooled data utilizing a novel simulated group-randomized design. Methodologies varied in their ability to incorporate data from all PNRP projects, to appropriately account for differing study designs, and to accommodate differing project sample sizes. The conclusions reached were based on expert opinion and not derived from actual analyses performed. The ability to analyze pooled data arising from differing study designs may provide pertinent information to inform programmatic, budgetary, and policy perspectives. Multisite community-based research may not lend itself well to the more stringent explanatory and pragmatic standards of a randomized controlled trial design. Given our growing interest in community-based population research, the challenges inherent in the analysis of heterogeneous study design are likely to become more salient. 
Discussion of the analytic issues faced by the PNRP and the methodological approaches we considered may be of value to other prospective community-based research programs.
The Fungal Frontier: A Comparative Analysis of Methods Used in the Study of the Human Gut Mycobiome.
Huseyin, Chloe E; Rubio, Raul Cabrera; O'Sullivan, Orla; Cotter, Paul D; Scanlan, Pauline D
2017-01-01
The human gut is host to a diverse range of fungal species, collectively referred to as the gut "mycobiome". The gut mycobiome is emerging as an area of considerable research interest due to the potential roles of these fungi in human health and disease. However, there is no consensus as to which available methodologies are best or most suitable for characterizing the human gut mycobiome. The aim of this study is to provide a comparative analysis of several previously published mycobiome-specific culture-dependent and -independent methodologies, including choice of culture media, incubation conditions (aerobic versus anaerobic), DNA extraction method, primer set, and freezing of fecal samples, to assess their relative merits and suitability for gut mycobiome analysis. There was no significant effect of media type or aeration on culture-dependent results. However, freezing was found to have a significant effect on fungal viability, with significantly lower fungal numbers recovered from frozen samples. DNA extraction method had a significant effect on DNA yield and quality. However, freezing and extraction method did not have any impact on either α or β diversity. There was also considerable variation in the ability of different fungal-specific primer sets to generate PCR products for subsequent sequence analysis. Through this investigation, two DNA extraction methods and one primer set were identified that facilitated the analysis of the mycobiome for all samples in this study. Ultimately, a diverse range of fungal species were recovered using both approaches, with Candida and Saccharomyces identified as the most common fungal species recovered using culture-dependent and culture-independent methods, respectively. As has been apparent from ecological surveys of the bacterial fraction of the gut microbiota, the use of different methodologies can also impact our understanding of gut mycobiome composition and therefore requires careful consideration.
Future research into the gut mycobiome needs to adopt a common strategy to minimize potentially confounding effects of methodological choice and to facilitate comparative analysis of datasets.
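The α- and β-diversity comparisons mentioned above are typically computed from per-sample taxon count profiles, e.g. the Shannon index within a sample and Bray-Curtis dissimilarity between samples. A minimal sketch (illustrative helper names, not the study's actual pipeline):

```python
import math

def shannon_diversity(counts):
    """Alpha diversity: Shannon index H' = -sum(p_i * ln p_i)
    over the taxon counts of one sample."""
    total = sum(counts)
    proportions = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in proportions)

def bray_curtis(a, b):
    """Beta diversity: Bray-Curtis dissimilarity between two samples'
    count profiles; 0 = identical, 1 = no shared taxa."""
    numerator = sum(abs(x - y) for x, y in zip(a, b))
    denominator = sum(x + y for x, y in zip(a, b))
    return numerator / denominator
```

Comparing such indices across extraction methods or frozen-versus-fresh samples is one way the effect of a methodological choice on apparent community composition can be quantified.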
Wojtusiak, Janusz; Michalski, Ryszard S; Simanivanh, Thipkesone; Baranova, Ancha V
2009-12-01
Systematic reviews and meta-analyses of published clinical datasets are an important part of medical research. By combining the results of multiple studies, meta-analysis is able to increase confidence in its conclusions, validate particular study results, and sometimes lead to new findings. Extensive theory has been built on how to aggregate results from multiple studies and arrive at statistically valid conclusions. Surprisingly, very little has been done to adopt advanced machine learning methods to support meta-analysis. In this paper we describe a novel machine learning methodology that is capable of inducing accurate and easy-to-understand attributional rules from aggregated data. Thus, the methodology can be used to support traditional meta-analysis in systematic reviews. Most machine learning applications give primary attention to the predictive accuracy of the learned knowledge, and lesser attention to its understandability. Here we employed attributional rules, a special form of rules that are relatively easy to interpret for medical experts who are not necessarily trained in statistics and meta-analysis. The methodology has been implemented and initially tested on a set of publicly available clinical data describing patients with metabolic syndrome (MS). The objective of this application was to determine rules describing combinations of clinical parameters used for metabolic syndrome diagnosis, and to develop rules for predicting whether particular patients are likely to develop secondary complications of MS. The aggregated clinical data were retrieved from 20 separate hospital cohorts that included 12 groups of patients with present liver disease symptoms and 8 control groups of healthy subjects. A total of 152 attributes were used, most of which, however, were measured in different studies. The twenty most common attributes were selected for the rule learning process.
By applying the developed rule learning methodology we arrived at several different possible rulesets that can be used to predict three considered complications of MS, namely nonalcoholic fatty liver disease (NAFLD), simple steatosis (SS), and nonalcoholic steatohepatitis (NASH).
NASA Technical Reports Server (NTRS)
Young, Larry A.; Yetter, Jeffrey A.; Guynn, Mark D.
2006-01-01
Maturation of intelligent systems technologies and their incorporation into aerial platforms are dictating the development of new analysis tools and incorporation of such tools into existing system analysis methodologies in order to fully capture the trade-offs of autonomy on vehicle and mission success. A first-order "system analysis of autonomy" methodology is outlined in this paper. Further, this analysis methodology is subsequently applied to notional high-altitude long-endurance (HALE) aerial vehicle missions.
Robinson, César Leyton; Caballero, Andrés Díaz
2007-01-01
This article is an experimental methodological reflection on the use of medical images as useful documents for constructing the history of medicine. A method is used that is based on guidelines or analysis topics that include different ways of viewing documents, from aesthetic, technical, social and political theories to historical and medical thinking. Some exercises are also included that enhance the proposal for the reader: rediscovering the worlds in society that harbor these medical photographical archives to obtain a new theoretical approach to the construction of the history of medical science.
Leung, Henry W C; Chan, Agnes L F; Leung, Matthew S H; Lu, Chin-Li
2013-04-01
To systematically review and assess the quality of cost-effectiveness analyses (CEAs) of pharmaceutical therapies for metastatic colorectal cancer (mCRC). The MEDLINE, EMBASE, Cochrane, and EconLit databases were searched for the Medical Subject Headings or text key words quality-adjusted, QALY, life-year gained (LYG), and cost-effectiveness (January 1, 1999-December 31, 2009). Original CEAs of mCRC pharmacotherapy published in English were included. CEAs that measured health effects in units other than quality-adjusted life years or LYG, as well as letters to the editor, case reports, posters, and editorials, were excluded. Each article was independently assessed by 2 trained reviewers according to a quality checklist created by the Panel on Cost-Effectiveness in Health and Medicine. Twenty-four CEA studies pertaining to pharmaceutical therapies for mCRC were identified. All studies showed a wide variation in methodologic approaches, which resulted in a different range of incremental cost-effectiveness ratios reported for each regimen. We found common methodologic flaws in a significant number of CEA studies, including lack of a clear description and critique of data quality; lack of methods for adjusting costs for inflation and for obtaining expert judgment; no results of model validation; wide differences in the types of perspective, time horizon, study design, cost categories, and effect outcomes; and no quality assessment of data (cost and effectiveness) for the interventions evaluated. This study has shown a wide variation in the methodology and quality of cost-effectiveness analyses for mCRC. Improving the quality and harmonization of CEAs for cancer treatment is needed. Further study is suggested to assess the quality of CEA methodology outside the mCRC disease state.
Anonychuk, Andrea M; Tricco, Andrea C; Bauch, Chris T; Pham, Ba'; Gilca, Vladimir; Duval, Bernard; John-Baptiste, Ava; Woo, Gloria; Krahn, Murray
2008-01-01
Hepatitis A vaccines have been available for more than a decade. Because the burden of hepatitis A virus has fallen in developed countries, the appropriate role of vaccination programmes, especially universal vaccination strategies, remains unclear. Cost-effectiveness analysis is a useful method of relating the costs of vaccination to its benefits, and may inform policy. This article systematically reviews the evidence on the cost effectiveness of hepatitis A vaccination in varying populations, and explores the effects of methodological quality and key modelling issues on the cost-effectiveness ratios. Cost-effectiveness/cost-utility studies of hepatitis A vaccine were identified via a series of literature searches (MEDLINE, EMBASE, HSTAR and SSCI). Citations and full-text articles were reviewed independently by two reviewers. Reference searching, author searches and expert consultation ensured literature saturation. Incremental cost-effectiveness ratios (ICERs) were abstracted for base-case analyses, converted to $US, year 2005 values, and categorised to reflect various levels of cost effectiveness. Quality of reporting, methodological issues and key modelling issues were assessed using frameworks published in the literature. Thirty-one cost-effectiveness studies (including 12 cost-utility analyses) were included from full-text article review (n = 58) and citation screening (n = 570). These studies evaluated universal mass vaccination (n = 14), targeted vaccination (n = 17) and vaccination of susceptibles (i.e. individuals initially screened for antibody and, if susceptible, vaccinated) [n = 13]. For universal vaccination, 50% of the ICERs were <$US20 000 per QALY or life-year gained. Analyses evaluating vaccination in children, particularly in high incidence areas, produced the most attractive ICERs.
For targeted vaccination, cost effectiveness was highly dependent on the risk of infection. Incidence, vaccine cost and discount rate were the most influential parameters in sensitivity analyses. Overall, analyses that evaluated the combined hepatitis A/hepatitis B vaccine, adjusted incidence for under-reporting, included societal costs and came from studies of higher methodological quality tended to have more attractive cost-effectiveness ratios. Methodological quality varied across studies. Major methodological flaws included inappropriate model type, comparator, incidence estimate and inclusion/exclusion of costs.
Kinjo, Masataka
2018-01-01
Neurodegenerative diseases, including amyotrophic lateral sclerosis (ALS), Alzheimer’s disease, Parkinson’s disease, and Huntington’s disease, are devastating proteinopathies with misfolded protein aggregates accumulating in neuronal cells. Inclusion bodies of protein aggregates are frequently observed in the neuronal cells of patients. Investigation of the underlying causes of neurodegeneration requires the establishment and selection of appropriate methodologies for detailed investigation of the state and conformation of protein aggregates. In the current review, we present an overview of the principles and application of several methodologies used for the elucidation of protein aggregation, specifically ones based on determination of fluctuations of fluorescence. The discussed methods include fluorescence correlation spectroscopy (FCS), imaging FCS, image correlation spectroscopy (ICS), photobleaching ICS (pbICS), number and brightness (N&B) analysis, super-resolution optical fluctuation imaging (SOFI), and transient state (TRAST) monitoring spectroscopy. Some of these methodologies are classical protein aggregation analyses, while others are not yet widely used. Collectively, the methods presented here should help the future development of research not only into protein aggregation but also neurodegenerative diseases. PMID:29570669
Prieto, M.L.; Cuéllar-Barboza, A.B.; Bobo, W.V.; Roger, V.L.; Bellivier, F.; Leboyer, M.; West, C.P.; Frye, M.A.
2016-01-01
Objective: To review the evidence on and estimate the risk of myocardial infarction and stroke in bipolar disorder. Method: A systematic search using MEDLINE, EMBASE, PsycINFO, Web of Science, Scopus, Cochrane Database of Systematic Reviews, and bibliographies (1946 - May, 2013) was conducted. Case-control and cohort studies of bipolar disorder patients age 15 or older with myocardial infarction or stroke as outcomes were included. Two independent reviewers extracted data and assessed quality. Estimates of effect were summarized using random-effects meta-analysis. Results: Five cohort studies including 13 115 911 participants (27 092 bipolar) were included. Due to the use of registers, different statistical methods, and inconsistent adjustment for confounders, there was significant methodological heterogeneity among studies. The exploratory meta-analysis yielded no evidence for a significant increase in the risk of myocardial infarction [relative risk (RR): 1.09, 95% CI 0.96-1.24, P = 0.20; I2 = 6%]. While there was evidence of significant study heterogeneity, the risk of stroke in bipolar disorder was significantly increased (RR 1.74, 95% CI 1.29-2.35; P = 0.0003; I2 = 83%). Conclusion: There may be a differential risk of myocardial infarction and stroke in patients with bipolar disorder. Confidence in these pooled estimates was limited by the small number of studies, significant heterogeneity and dissimilar methodological features. PMID:24850482
Assessment of environmental impacts part one. Intervention analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hipel, Keith William; Lettenmaier, Dennis P.; McLeod, A. Ian
The use of intervention analysis as a statistical method of gauging the effects of environmental changes is discussed. The Box-Jenkins model serves as the basis for the intervention analysis methodology. Environmental studies of the Aswan Dam, the South Saskatchewan River, and a forest fire near the Pipers Hole River, Canada, are included as case studies in which intervention analysis was employed. Methods of data collection for intervention analysis are found to have a significant impact on model reliability; effective data collection processes for the Box-Jenkins model are provided. (15 graphs, 27 references, 2 tables)
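At its simplest, intervention analysis asks whether a time series shifts level at a known intervention time; the full Box-Jenkins approach additionally models the autocorrelated noise structure. The sketch below estimates only the raw level shift and is a deliberately simplified stand-in for a transfer-function model (the function name is invented here):

```python
def intervention_effect(series, t0):
    """Estimate a step (level-shift) intervention effect at index t0
    as the difference between post- and pre-intervention means.

    A simple stand-in for a full Box-Jenkins transfer-function model,
    which would also account for trend and autocorrelated noise."""
    before = series[:t0]
    after = series[t0:]
    mean_before = sum(before) / len(before)
    mean_after = sum(after) / len(after)
    return mean_after - mean_before
```

Applied to, say, annual river flows before and after dam construction, a large negative value would indicate a sustained drop in the mean level, though judging its significance requires the noise model this sketch omits.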
An Integrated Low-Speed Performance and Noise Prediction Methodology for Subsonic Aircraft
NASA Technical Reports Server (NTRS)
Olson, E. D.; Mavris, D. N.
2000-01-01
An integrated methodology has been assembled to compute the engine performance, takeoff and landing trajectories, and community noise levels for a subsonic commercial aircraft. Where feasible, physics-based noise analysis methods have been used to make the results more applicable to newer, revolutionary designs and to allow for a more direct evaluation of new technologies. The methodology is intended to be used with approximation methods and risk analysis techniques to allow for the analysis of a greater number of variable combinations while retaining the advantages of physics-based analysis. Details of the methodology are described and limited results are presented for a representative subsonic commercial aircraft.
NASA Astrophysics Data System (ADS)
Vugteveen, Pim; van Katwijk, Marieke M.; Rouwette, Etiënne; Hanssen, Lucien
2014-02-01
Integrated Coastal Management cannot operate effectively without reliable information and knowledge on changes in the environment and on the causes of those changes. Monitoring is essential to provide data needed for a real understanding of socio-economic and ecological functioning in multi-user nature areas. We present a web-based and comprehensive assessment methodology to articulate, structure and prioritize information needs and ensuing monitoring needs. We applied this methodology in the Dutch Wadden Sea Region, which includes a designated UNESCO World Heritage nature reserve. The methodology consists of the following steps: i) exploring social-ecological issues of concern and defining the monitoring scope; ii) articulating information needs expressed as tractable questions; iii) elaborating monitoring needs; iv) grounding in scientific models and current monitoring; v) synthesizing assessment findings into target entities, i.e. analysis variables for monitoring. In this paper we focus on the first three steps. As part of our methodology we performed two online surveys amongst a broad range of stakeholders and amongst monitoring professionals. In the case of the Dutch Wadden Sea Region, main monitoring questions were related to biodiversity and food web relations; effects of fisheries and its pressures on the ecosystem; channel and port dredging; spatial planning and multifunctional use; sustainable energy production; and effects of changing storm regimes due to climate change. Subsequently we elaborated these general issues into analysis variables within five themes. The presented methodology enables large scale and unbiased involvement of stakeholders in articulating information needs in a multi-user nature reserve like the Wadden Sea. In addition the methodology facilitates the input and feedback of monitoring professionals by providing a detailed elaboration of monitoring needs.
Human factors evaluation of teletherapy: Function and task analysis. Volume 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaye, R.D.; Henriksen, K.; Jones, R.
1995-07-01
As a treatment methodology, teletherapy selectively destroys cancerous and other tissue by exposure to an external beam of ionizing radiation. Sources of radiation are either a radioactive isotope, typically Cobalt-60 (Co-60), or a linear accelerator. Records maintained by the NRC have identified instances of teletherapy misadministration where the delivered radiation dose has differed from the radiation prescription (e.g., instances where fractions were delivered to the wrong patient, to the wrong body part, or were too great or too little with respect to the defined treatment volume). Both human error and machine malfunction have led to misadministrations. Effective and safe treatment requires a concern for precision and consistency of human-human and human-machine interactions throughout the course of therapy. The present study is the first part of a series of human factors evaluations for identifying the root causes that lead to human error in the teletherapy environment. The human factors evaluations included: (1) a function and task analysis of teletherapy activities, (2) an evaluation of the human-system interfaces, (3) an evaluation of procedures used by teletherapy staff, (4) an evaluation of the training and qualifications of treatment staff (excluding the oncologists), (5) an evaluation of organizational practices and policies, and (6) an identification of problems and alternative approaches for NRC and industry attention. The present report addresses the function and task analysis of teletherapy activities and provides the foundation for the conduct of the subsequent evaluations. The report includes sections on background, methodology, a description of the function and task analysis, and use of the task analysis findings for the subsequent tasks. The function and task analysis database is also included.
Shuttle payload bay dynamic environments: Summary and conclusion report for STS flights 1-5 and 9
NASA Technical Reports Server (NTRS)
Oconnell, M.; Garba, J.; Kern, D.
1984-01-01
The vibration, acoustic, and low-frequency loads data from the first five shuttle flights are presented, together with the engineering analysis of those data. Vibroacoustic data from STS-9 are also presented because they represent the only data taken on a large payload. Payload dynamic environment predictions developed through the participation of various NASA and industrial centers are presented, along with a comparison of analytical loads methodology predictions with flight data, including a brief description of the methodologies employed in developing those predictions for payloads. The review of prediction methodologies illustrates how different centers have approached the problem of developing shuttle dynamic environmental predictions and criteria. Ongoing research activities related to the shuttle dynamic environments are also described, as is analytical software recently developed for the prediction of payload acoustic and vibration environments.
Towards Accelerated Aging Methodologies and Health Management of Power MOSFETs (Technical Brief)
NASA Technical Reports Server (NTRS)
Celaya, Jose R.; Patil, Nishad; Saha, Sankalita; Wysocki, Phil; Goebel, Kai
2009-01-01
Understanding the aging mechanisms of electronic components is extremely important in the aerospace domain, where such components are part of numerous critical subsystems, including avionics. Power MOSFETs are of special interest because they are used in high-voltage switching circuits such as drivers for electrical motors. With the increased use of electronics in aircraft control, it becomes more important to understand how these components degrade in aircraft-specific environments. In this paper, we present an accelerated aging methodology for power MOSFETs that subjects the devices to indirect thermal overstress during high-voltage switching. During this accelerated aging process, two major failure modes were observed: latch-up and die-attach degradation. We present the details of our aging methodology along with the experiments and an analysis of the results.
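The precursor-trend analysis typical of such aging experiments can be sketched in a few lines. The example below is a hypothetical illustration, not the authors' actual analysis: it tracks drift in on-state resistance R_DS(on), a commonly used precursor of die-attach degradation, against an assumed failure threshold. The readings, baseline value, smoothing window, and 15 % threshold are all invented for illustration.

```python
# Hypothetical sketch: detect when smoothed R_DS(on) drift crosses an
# assumed degradation threshold during an accelerated-aging run.

def moving_average(samples, window):
    """Simple trailing-window smoothing to suppress measurement noise."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)
        out.append(sum(samples[lo:i + 1]) / (i + 1 - lo))
    return out

def cycles_to_threshold(r_ds_on, baseline, pct_increase, window=5):
    """Return the first aging-cycle index where smoothed R_DS(on) exceeds
    the healthy baseline by pct_increase percent, or None if it never does."""
    limit = baseline * (1.0 + pct_increase / 100.0)
    for i, r in enumerate(moving_average(r_ds_on, window)):
        if r > limit:
            return i
    return None

# Synthetic measurements: a gradual 0.2 %-per-cycle upward drift (assumed).
baseline = 0.100  # ohms, healthy device (illustrative value)
readings = [baseline * (1.0 + 0.002 * n) for n in range(200)]
print(cycles_to_threshold(readings, baseline, 15.0))
```

In practice the threshold and smoothing would be chosen from device datasheets and noise characteristics rather than fixed constants as here.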
NASA Technical Reports Server (NTRS)
Padovan, J.; Adams, M.; Lam, P.; Fertis, D.; Zeid, I.
1982-01-01
Second-year efforts within a three-year study to develop and extend finite element (FE) methodology to efficiently handle the transient and steady-state response of the rotor-bearing-stator structures associated with gas turbine engines are outlined. The two main areas are: (1) implanting the squeeze-film damper element into a general-purpose FE code for testing and evaluation; and (2) determining the numerical characteristics of the FE-generated rotor-bearing-stator simulation scheme. The governing FE field equations are set out and the solution methodology is presented. The choice of ADINA as the general-purpose FE code is explained, and the numerical operational characteristics of the direct integration approach to FE-generated rotor-bearing-stator simulations are determined, including benchmarking, a comparison of explicit versus implicit direct integration methodologies, and demonstration problems.
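The explicit-versus-implicit trade-off in direct integration that such benchmarking exposes can be illustrated on a single degree of freedom. The sketch below is a generic textbook comparison, not the ADINA implementation: an explicit central-difference scheme is only conditionally stable (dt must stay below 2/omega_n), while the implicit Newmark average-acceleration scheme (gamma = 1/2, beta = 1/4) is unconditionally stable. The oscillator parameters and step sizes are assumed for illustration.

```python
# Explicit vs. implicit direct integration on an undamped 1-DOF
# oscillator, m*x'' + k*x = 0, x(0)=x0, x'(0)=v0.
import math

def central_difference(m, k, x0, v0, dt, steps):
    """Explicit central-difference scheme: conditionally stable
    (requires dt < 2/omega_n, where omega_n = sqrt(k/m))."""
    omega2 = k / m
    x_prev = x0 - dt * v0  # start-up value from a Taylor expansion
    x = x0
    for _ in range(steps):
        x_next = 2.0 * x - x_prev - dt * dt * omega2 * x
        x_prev, x = x, x_next
    return x

def newmark_average_accel(m, k, x0, v0, dt, steps):
    """Implicit Newmark scheme (gamma=1/2, beta=1/4):
    unconditionally stable for linear problems."""
    beta = 0.25
    a = -k * x0 / m  # initial acceleration from the equation of motion
    x, v = x0, v0
    for _ in range(steps):
        # Solve (m/(beta*dt^2) + k) * x_new = effective load.
        keff = m / (beta * dt * dt) + k
        feff = m * (x / (beta * dt * dt) + v / (beta * dt) + a)
        x_new = feff / keff
        a_new = (x_new - x) / (beta * dt * dt) - v / (beta * dt) - a
        v = v + dt * (0.5 * a + 0.5 * a_new)  # gamma = 1/2 update
        x, a = x_new, a_new
    return x

m, k = 1.0, 100.0   # omega_n = 10 rad/s, explicit stability limit dt = 0.2 s
dt_large = 0.5      # deliberately exceeds the explicit stability limit
print(abs(central_difference(m, k, 1.0, 0.0, dt_large, 50)))   # diverges
print(abs(newmark_average_accel(m, k, 1.0, 0.0, dt_large, 50)))  # stays bounded
```

For rotor-bearing-stator models, the same trade-off appears at system scale: implicit integration tolerates the large time steps that stiff bearing and squeeze-film terms would otherwise forbid, at the cost of a matrix solve per step.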
Where does good quality qualitative health care research get published?
Richardson, Jane C; Liddle, Jennifer
2017-09-01
This short report aims to give some insight into current publication patterns for high-quality qualitative health research, using the Research Excellence Framework (REF) 2014 database. We explored patterns of publication by range and type of journal, by date and by methodological focus. We also looked at variations between the publications submitted to different Units of Assessment, focussing particularly on the one most closely aligned with our own research area of primary care. Our brief analysis demonstrates that general medical/health journals with high impact factors are the dominant routes of publication, but that there is variation according to the methodological approach adopted by articles. The number of qualitative health articles submitted to REF 2014 overall was small, and smaller still for articles based on mixed methods research, qualitative methodology or reviews/syntheses that included qualitative articles.
Optical holographic structural analysis of Kevlar rocket motor cases
NASA Astrophysics Data System (ADS)
Harris, W. J.
1981-05-01
The methodology of applying optical holography to evaluation of subscale Kevlar 49 composite pressure vessels is explored. The results and advantages of the holographic technique are discussed. The cases utilized were of similar design, but each had specific design features, the effects of which are reviewed. Burst testing results are presented in conjunction with the holographic fringe patterns obtained during progressive pressurization. Examples of quantitative data extracted by analysis of fringe fields are included.