ERIC Educational Resources Information Center
Gommans, Rob; Cillessen, Antonius H. N.
2015-01-01
Children's peer relationships are frequently assessed with peer nominations. An important methodological issue is whether to collect unlimited or limited nominations. Some researchers have argued that the psychometric differences between the two methods are negligible, while others have claimed that one is superior to the other. The current study…
Jensen, Eric Allen
2017-01-01
With the rapid global proliferation of social media, there has been growing interest in using this existing source of easily accessible 'big data' to develop social science knowledge. However, amidst the big data gold rush, it is important that long-established principles of good social research are not ignored. This article critically evaluates Mitchell et al.'s (2013) study, 'The Geography of Happiness: Connecting Twitter Sentiment and Expression, Demographics, and Objective Characteristics of Place', demonstrating the importance of attending to key methodological issues associated with secondary data analysis.
Considering consumer choice in the economic evaluation of mandatory health programmes: a review.
Parkinson, Bonny; Goodall, Stephen
2011-08-01
Governments are increasing their focus on mandatory public health programmes following positive economic evaluations of their impact. This review aims to examine whether loss of consumer choice should be included in economic evaluations of mandatory health programmes (MHP). A systematic literature review was conducted to identify economic evaluations of MHP, whether they discuss the impact on consumer choice and any methodological limitations. Overall 39 economic evaluations were identified, of which 10 discussed the loss of consumer choice and 6 attempted to place a value on the loss of consumer choice. Methodological limitations included: measuring the marginal cost of compliance, unavailability of price elasticity estimates, the impact of income effects, double counting health impacts, biased willingness-to-pay responses, and "protest" responses. Overall it was found that the inclusion of the loss of consumer choice rarely impacted on the final outcome of the study. The impact of MHP on the loss of consumer choice has largely been ignored in economic evaluations. Its importance remains uncertain due to its infrequent inclusion and significant methodological limitations. Further research regarding which methodology is best for valuing the loss of consumer choice and whether it is important to the final implementation decision is warranted. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Health economic evaluation: important principles and methodology.
Rudmik, Luke; Drummond, Michael
2013-06-01
To discuss health economic evaluation and improve the understanding of common methodology. This article discusses the methodology for the following types of economic evaluations: cost-minimization, cost-effectiveness, cost-utility, cost-benefit, and economic modeling. Topics include health-state utility measures, the quality-adjusted life year (QALY), uncertainty analysis, discounting, decision tree analysis, and Markov modeling. Economic evaluation is the comparative analysis of alternative courses of action in terms of both their costs and consequences. With increasing health care expenditure and limited resources, it is important for physicians to consider the economic impact of their interventions. Understanding common methodology involved in health economic evaluation will improve critical appraisal of the literature and optimize future economic evaluations. Copyright © 2012 The American Laryngological, Rhinological and Otological Society, Inc.
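The discounting and QALY arithmetic named in the abstract above can be illustrated with a brief sketch; all utilities, costs, and the 3% discount rate below are invented for illustration, not drawn from the article.

```python
# Illustrative only: utilities, costs, and the 3% discount rate are invented.

def discounted_qalys(utilities, rate=0.03):
    """Sum quality-adjusted life years, discounting each future year.

    utilities: per-year health-state utility (0 = death, 1 = full health),
    with year 0 first (undiscounted).
    """
    return sum(u / (1.0 + rate) ** t for t, u in enumerate(utilities))

def icer(cost_new, cost_old, qalys_new, qalys_old):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return (cost_new - cost_old) / (qalys_new - qalys_old)

# A hypothetical intervention: utility 0.8 for 5 years vs. 0.7 for 4 years
# under usual care, at an extra cost of 8,000.
q_new = discounted_qalys([0.8] * 5)
q_old = discounted_qalys([0.7] * 4)
ratio = icer(20_000, 12_000, q_new, q_old)
```

Decision-tree and Markov analyses build on the same ingredients, attaching costs and utilities of this kind to the branches or health states of a disease model.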
Cost-effectiveness analyses and their role in improving healthcare strategies.
Rodriguez, Maria I; Caughey, Aaron B
2013-12-01
In this era of healthcare reform, attention is focused on increasing the quality of care and access to services, while simultaneously reducing the cost. Economic evaluations can play an important role in translating research to evidence-based practice and policy. Cost-effectiveness analysis (CEA) and its utility for clinical and policy decision making among U.S. obstetricians and gynecologists is reviewed. Three case examples demonstrating the value of this methodology in decision making are considered. A discussion of the methodologic principles of CEA, the advantages, and the limitations of the methodology are presented. CEA can play an important role in evidence-based decision making, with value for clinicians and policy makers alike. These studies are of particular interest in the field of obstetrics and gynecology, in which uncertainty from epidemiologic or clinical trials exists, or multiple perspectives need to be considered (maternal, neonatal, and societal). As with all research, it is essential that economic evaluations are conducted according to established methodologic standards. Interpretation and application of results should occur with a clear understanding of both the value and the limitations of economic evaluations.
Organisational and Training Factors Affecting Academic Teacher Training Outcomes
ERIC Educational Resources Information Center
Renta-Davids, Ana-Inés; Jiménez-González, José-Miguel; Fandos-Garrido, Manel; González-Soto, Ángel-Pío
2016-01-01
University teacher training has become an important topic in recent years due to the curricular and methodological reforms introduced by the Bologna process. Despite its acknowledged importance, evaluations have been limited to measures of participants' satisfaction, and little is known about its impact on teaching practices. This study seeks to…
ERIC Educational Resources Information Center
Begeny, John C.; Krouse, Hailey E.; Brown, Kristina G.; Mann, Courtney M.
2011-01-01
Teacher judgments about students' academic abilities are important for instructional decision making and potential special education entitlement decisions. However, the small number of studies evaluating teachers' judgments are limited methodologically (e.g., sample size, procedural sophistication) and have yet to answer important questions…
ERIC Educational Resources Information Center
Kascsak, Theresa Marie
2012-01-01
The development of social adjustment during elementary school is of critical importance because early socialization skills are an important predictor of both future social and emotional functioning. However, an examination of current literature reveals there is limited research utilizing sound research methodology and evaluation protocols for…
April Evans; Hans Vogelsong
2008-01-01
Rural tourism is a rapidly expanding industry which holds some promise of improving the economy in small towns and farming regions. However, rural communities have limited funding available for promotional efforts. To understand if limited funds are effective in producing the desired economic impacts, it is important that rural communities evaluate their promotional...
Methodological considerations for designing a community water fluoridation cessation study.
Singhal, Sonica; Farmer, Julie; McLaren, Lindsay
2017-06-01
High-quality, up-to-date research on community water fluoridation (CWF), and especially on the implications of CWF cessation for dental health, is limited. Although CWF cessation studies have been conducted, they are few in number; one of the major reasons is the methodological complexity of conducting such a study. This article draws on a systematic review of existing cessation studies (n=15) to explore methodological considerations of conducting CWF cessation studies in future. We review nine important methodological aspects (study design, comparison community, target population, time frame, sampling strategy, clinical indicators, assessment criteria, covariates and biomarkers) and provide recommendations for planning future CWF cessation studies that examine effects on dental caries. There is no one ideal study design to answer a research question. However, recommendations proposed regarding methodological aspects to conduct an epidemiological study to observe the effects of CWF cessation on dental caries, coupled with our identification of important methodological gaps, will be useful for researchers who are looking to optimize resources to conduct such a study with standards of rigour. © 2017 Her Majesty the Queen in Right of Canada. Community Dentistry and Oral Epidemiology © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Munthe-Kaas, Heather; Bohren, Meghan A; Glenton, Claire; Lewin, Simon; Noyes, Jane; Tunçalp, Özge; Booth, Andrew; Garside, Ruth; Colvin, Christopher J; Wainwright, Megan; Rashidian, Arash; Flottorp, Signe; Carlsen, Benedicte
2018-01-25
The GRADE-CERQual (Confidence in Evidence from Reviews of Qualitative research) approach has been developed by the GRADE (Grading of Recommendations Assessment, Development and Evaluation) Working Group. It was designed to support the use of findings from qualitative evidence syntheses in decision-making, including guideline development and policy formulation. CERQual includes four components for assessing how much confidence to place in findings from reviews of qualitative research (also referred to as qualitative evidence syntheses): (1) methodological limitations, (2) coherence, (3) adequacy of data and (4) relevance. This paper is part of a series providing guidance on how to apply CERQual and focuses on CERQual's methodological limitations component. We developed the methodological limitations component by searching the literature for definitions, gathering feedback from relevant research communities and developing consensus through project group meetings. We tested the CERQual methodological limitations component within several qualitative evidence syntheses before agreeing on the current definition and principles for application. When applying CERQual, we define methodological limitations as the extent to which there are concerns about the design or conduct of the primary studies that contributed evidence to an individual review finding. In this paper, we describe the methodological limitations component and its rationale and offer guidance on how to assess methodological limitations of a review finding as part of the CERQual approach. This guidance outlines the information required to assess the methodological limitations component, the steps that need to be taken to assess methodological limitations of data contributing to a review finding, and examples of methodological limitation assessments.
This paper provides guidance for review authors and others on undertaking an assessment of methodological limitations in the context of the CERQual approach. More work is needed to determine which criteria critical appraisal tools should include when assessing methodological limitations. We currently recommend that whichever tool is used, review authors provide a transparent description of their assessments of methodological limitations in a review finding. We expect the CERQual approach and its individual components to develop further as our experiences with the practical implementation of the approach increase.
SUDOQU, a new dose-assessment methodology for radiological surface contamination.
van Dillen, Teun; van Dijk, Arjan
2018-06-12
A new methodology has been developed for the assessment of the annual effective dose resulting from removable and fixed radiological surface contamination. Entitled SUDOQU (SUrface DOse QUantification), it can, for instance, be used to derive criteria for surface contamination related to the import of non-food consumer goods, containers and conveyances, e.g., limiting values and operational screening levels. SUDOQU imposes mass (activity)-balance equations based on radioactive decay, removal and deposition processes in indoor and outdoor environments. This leads to time-dependent contamination levels that may be of particular importance in exposure scenarios dealing with only one or a few contaminated items (usually public exposure scenarios, therefore referred to as the 'consumer' model). Exposure scenarios with a continuous flow of freshly contaminated goods also fall within the scope of the methodology (typically occupational exposure scenarios, thus referred to as the 'worker' model). In this paper we describe SUDOQU, its applications, and its current limitations. First, we delineate the contamination issue, present the assumptions and explain the concepts. We describe the relevant removal, transfer, and deposition processes, and derive equations for the time evolution of the radiological surface-, air- and skin-contamination levels. These then serve as input for the subsequent evaluation of the annual effective dose, with possible contributions from external gamma radiation, inhalation, secondary ingestion (indirect, from hand to mouth), skin contamination, direct ingestion and skin-contact exposure. The limiting effective surface dose is introduced for issues involving the conservatism of dose calculations. SUDOQU can be used by radiation-protection scientists, experts and policy makers in fields such as emergency preparedness, trade and transport, exemption and clearance, waste management, and nuclear facilities.
Several practical examples are worked out, demonstrating potential applications of the methodology.
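The time-dependent contamination levels described above follow from first-order balance equations. As a generic sketch (the notation here is invented for illustration, not SUDOQU's own), a surface activity subject to decay (constant λ), removal (rate r) and a constant deposition flux φ obeys

```latex
% Generic first-order activity balance (illustrative notation, not SUDOQU's):
\frac{\mathrm{d}A_s}{\mathrm{d}t} = -(\lambda + r)\,A_s(t) + \phi
\qquad\Longrightarrow\qquad
A_s(t) = A_s(0)\,e^{-(\lambda + r)t}
       + \frac{\phi}{\lambda + r}\left(1 - e^{-(\lambda + r)t}\right)
```

The dose assessment then integrates contamination levels of this kind against the exposure pathways listed in the abstract (external gamma, inhalation, secondary ingestion, and so on) over the year.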
Greenhouse gas emissions from reservoir water surfaces: A new global synthesis
Collectively, reservoirs created by dams are thought to be an important source of greenhouse gases (GHGs) to the atmosphere. So far, efforts to quantify, model, and manage these emissions have been limited by data availability and inconsistencies in methodological approach. Here we ...
Greenhouse Gas Emissions from Reservoir Water Surfaces: A New Global Synthesis - journal
Collectively, reservoirs are an important anthropogenic source of greenhouse gases (GHGs) to the atmosphere. Attempts to model reservoir GHG fluxes, however, have been limited by inconsistencies in methodological approaches and data availability. An increase in the number of pu...
Rasheed, Nadia; Amin, Shamsudin H M
2016-01-01
Grounded language acquisition is an important issue, particularly for facilitating human-robot interaction in an intelligent and effective way. Evolutionary and developmental language acquisition are two innovative and important methodologies for the grounding of language in cognitive agents or robots, the aim of which is to address current limitations in robot design. This paper concentrates on these two main modelling methods with the grounding principle for the acquisition of linguistic ability in cognitive agents or robots. This review not only presents a survey of the methodologies and relevant computational cognitive agents or robotic models, but also highlights the advantages and progress of these approaches for the language grounding issue.
Nandi, Anirban; Pan, Sharadwata; Potumarthi, Ravichandra; Danquah, Michael K; Sarethy, Indira P
2014-01-01
Six Sigma methodology has been successfully applied to daily operations by several leading global private firms including GE and Motorola, to leverage their net profits. Comparatively, limited studies have been conducted to find out whether this highly successful methodology can be applied to research and development (R&D). In the current study, we have reviewed and proposed a process for a probable integration of Six Sigma methodology to large-scale production of Penicillin G and its subsequent conversion to 6-aminopenicillanic acid (6-APA). It is anticipated that the important aspects of quality control and quality assurance will highly benefit from the integration of Six Sigma methodology in mass production of Penicillin G and/or its conversion to 6-APA.
The effect of erosion on the fatigue limit of metallic materials for aerospace applications
NASA Astrophysics Data System (ADS)
Kordatos, E. Z.; Exarchos, D. A.; Matikas, T. E.
2018-03-01
This work deals with the study of the fatigue behavior of metallic materials for aerospace applications which have undergone erosion. In particular, an innovative non-destructive methodology based on infrared lock-in thermography was applied to aluminum samples for the rapid determination of their fatigue limit. The effect of erosion on the structural integrity of materials can lead to catastrophic failure, and therefore an efficient assessment of fatigue behavior is of high importance. Infrared thermography (IRT), as a non-destructive, non-contact, real-time and full-field method, can be employed to rapidly determine the fatigue limit. The basic principle of this method is the detection and monitoring of the energy intrinsically dissipated during cyclic fatigue loading. This methodology was successfully applied to both eroded and non-eroded aluminum specimens in order to evaluate the severity of erosion.
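The lock-in thermography approach above infers the fatigue limit from the onset of self-heating under increasing cyclic stress. One common graphical treatment intersects two fitted lines on the stress-amplitude vs. temperature-rise plot; the sketch below uses invented data and a chosen split point, and is not necessarily the study's exact procedure.

```python
# Hypothetical data; the two-line intersection is one common way to read a
# fatigue limit off a stress-amplitude vs. stabilized-temperature-rise plot.
import numpy as np

def fatigue_limit_two_lines(stress, temp_rise, split):
    """Fit one line below `split` and one above; return their intersection.

    Below the fatigue limit, dissipation stays near zero; above it, the
    temperature rise grows roughly linearly with stress amplitude.
    """
    s = np.asarray(stress, float)
    t = np.asarray(temp_rise, float)
    lo, hi = s < split, s >= split
    m1, b1 = np.polyfit(s[lo], t[lo], 1)   # baseline branch
    m2, b2 = np.polyfit(s[hi], t[hi], 1)   # self-heating branch
    return (b2 - b1) / (m1 - m2)           # x where the two lines meet

# Invented measurements: near-flat self-heating below ~150 MPa.
stress = [60, 80, 100, 120, 140, 160, 180, 200, 220]          # MPa
temp = [0.02, 0.03, 0.02, 0.04, 0.03, 0.55, 1.1, 1.6, 2.2]    # K
limit = fatigue_limit_two_lines(stress, temp, split=150.0)
```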
De Ambrogi, Francesco; Ratti, Elisabetta Ceppi
2011-01-01
Today the Italian national debate over the Work-Related Stress Risk Assessment methodology is rather heated. Several methodological proposals and guidelines have been published in recent months, not least those by the "Commissione Consultiva". But despite this wide range of proposals, it appears that there is still a lack of attention to some of the basic methodological issues that must be taken into account in order to correctly implement the above-mentioned guidelines. The aim of this paper is to outline these methodological issues. In order to achieve this, the most authoritative methodological proposals and guidelines have been reviewed. The study focuses in particular on the methodological issues that could lead to important biases if not considered properly. The study leads to some considerations about the methodological validity of a Work-Related Stress Risk Assessment based exclusively on the literal interpretation of the considered proposals. Furthermore, the study provides some hints and working hypotheses on how to overcome these methodological limits. This study should be considered as a starting point for further investigations and debate on the Work-Related Stress Risk Assessment methodology on a national level.
Online interviewing with interpreters in humanitarian contexts
Chiumento, Anna; Rahman, Atif; Frith, Lucy
2018-01-01
Purpose: Recognising that one way to address the logistical and safety considerations of research conducted in humanitarian emergencies is to use internet communication technologies to facilitate interviews online, this article explores some practical and methodological considerations inherent to qualitative online interviewing. Method: Reflections from a case study of a multi-site research project conducted in post-conflict countries are presented. Synchronous online cross-language qualitative interviews were conducted in one country. Although only a small proportion of interviews were conducted online (six out of 35), it remains important to critically consider the impact upon data produced in this way. Results: A range of practical and methodological considerations are discussed, illustrated with examples. Results suggest that whilst online interviewing has methodological and ethical potential and versatility, there are inherent practical challenges in settings with poor internet and electricity infrastructure. Notable methodological limitations include barriers to building rapport due to partial visual and non-visual cues, and difficulties interpreting pauses or silences. Conclusions: Drawing upon experiences in this case study, strategies for managing the practical and methodological limitations of online interviewing are suggested, alongside recommendations for supporting future research practice. These are intended to act as a springboard for further reflection, and operate alongside other conceptual frameworks for online interviewing. PMID:29532739
Online interviewing with interpreters in humanitarian contexts.
Chiumento, Anna; Machin, Laura; Rahman, Atif; Frith, Lucy
2018-12-01
Recognising that one way to address the logistical and safety considerations of research conducted in humanitarian emergencies is to use internet communication technologies to facilitate interviews online, this article explores some practical and methodological considerations inherent to qualitative online interviewing. Reflections from a case study of a multi-site research project conducted in post-conflict countries are presented. Synchronous online cross-language qualitative interviews were conducted in one country. Although only a small proportion of interviews were conducted online (six out of 35), it remains important to critically consider the impact upon data produced in this way. A range of practical and methodological considerations are discussed, illustrated with examples. Results suggest that whilst online interviewing has methodological and ethical potential and versatility, there are inherent practical challenges in settings with poor internet and electricity infrastructure. Notable methodological limitations include barriers to building rapport due to partial visual and non-visual cues, and difficulties interpreting pauses or silences. Drawing upon experiences in this case study, strategies for managing the practical and methodological limitations of online interviewing are suggested, alongside recommendations for supporting future research practice. These are intended to act as a springboard for further reflection, and operate alongside other conceptual frameworks for online interviewing.
Landslide hazard analysis for pipelines: The case of the Simonette river crossing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grivas, D.A.; Schultz, B.C.; O'Neil, G.
1995-12-31
The overall objective of this study is to develop a probabilistic methodology to analyze landslide hazards and their effects on the safety of buried pipelines. The methodology incorporates a range of models that can accommodate differences in the ground movement modes and the amount and type of information available at various site locations. Two movement modes are considered, namely (a) instantaneous (catastrophic) slides, and (b) gradual ground movement which may result in cumulative displacements over the pipeline design life (30--40 years) that are in excess of allowable values. Probabilistic analysis is applied in each case to address the uncertainties associated with important factors that control slope stability. Availability of information ranges from relatively well studied, instrumented installations to cases where data is limited to what can be derived from topographic and geologic maps. The methodology distinguishes between procedures applied where there is little information and those that can be used when relatively extensive data is available. Important aspects of the methodology are illustrated in a case study involving a pipeline located in Northern Alberta, Canada, in the Simonette river valley.
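The probabilistic treatment of slope stability can be sketched generically as a Monte Carlo estimate of failure probability for an infinite-slope factor of safety. All parameter values and distributions below are invented for illustration and are not the study's actual model.

```python
# Monte Carlo on a dry infinite-slope factor of safety (illustrative only).
import numpy as np

def prob_of_failure(n=100_000, seed=0):
    """Estimate P(factor of safety < 1) under uncertain soil strength."""
    rng = np.random.default_rng(seed)
    phi = np.radians(rng.normal(28.0, 3.0, n))      # friction angle (deg)
    c = rng.normal(8.0, 2.0, n).clip(min=0.0)       # cohesion (kPa)
    beta = np.radians(32.0)                         # slope angle
    gamma, z = 19.0, 3.0                            # unit weight (kN/m^3), depth (m)
    driving = gamma * z * np.sin(beta) * np.cos(beta)
    resisting = c + gamma * z * np.cos(beta) ** 2 * np.tan(phi)
    return float(np.mean(resisting / driving < 1.0))

p = prob_of_failure()
```

With richer site data, the same scheme extends naturally: better-constrained distributions for the uncertain inputs simply replace the generic ones assumed here.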
Assessing species saturation: conceptual and methodological challenges.
Olivares, Ingrid; Karger, Dirk N; Kessler, Michael
2018-05-07
Is there a maximum number of species that can coexist? Intuitively, we assume an upper limit to the number of species in a given assemblage, or that a lineage can produce, but defining and testing this limit has proven problematic. Herein, we first outline seven general challenges of studies on species saturation, most of which are independent of the actual method used to assess saturation. Among these are the challenge of defining saturation conceptually and operationally, the importance of setting an appropriate referential system, and the need to discriminate among patterns, processes and mechanisms. Second, we list and discuss the methodological approaches that have been used to study species saturation. These approaches vary in time and spatial scales, and in the variables and assumptions needed to assess saturation. We argue that assessing species saturation is possible, but that many studies conducted to date have conceptual and methodological flaws that prevent us from currently attaining a good idea of the occurrence of species saturation. © 2018 Cambridge Philosophical Society.
Treatment resistance and psychodynamic psychiatry: concepts psychiatry needs from psychoanalysis.
Plakun, Eric
2012-06-01
Over the last 30 years psychiatry and psychoanalysis have moved in substantially divergent directions. Psychiatry has become rich in methodology but conceptually limited, with a drift toward biological reductionism. Psychoanalysis has remained relatively limited in methodology, but conceptually rich. The rich methodology of psychiatry has led to major contributions in discovering gene by environment interactions, the importance of early adversity, and to recognition of the serious problem posed by treatment resistance. However, psychiatry's biologically reductionistic conceptual focus interferes with the development of a nuanced clinical perspective based on emerging knowledge that might help more treatment resistant patients become treatment responders. This article argues that recognition of the problem of treatment resistance in psychiatry creates a need for it to reconnect with the conceptual richness of psychoanalysis in order to improve patient care. Psychodynamic psychiatry is defined as the relevant intersection of psychiatry and psychoanalysis where this reconnection can occur. I will suggest selected aspects of psychoanalysis that are especially relevant to psychiatry in improving outcomes in work with treatment resistant patients.
Complex dynamic in ecological time series
Peter Turchin; Andrew D. Taylor
1992-01-01
Although the possibility of complex dynamical behaviors (limit cycles, quasiperiodic oscillations, and aperiodic chaos) has been recognized theoretically, most ecologists are skeptical of their importance in nature. In this paper we develop a methodology for reconstructing endogenous (or deterministic) dynamics from ecological time series. Our method consists of fitting...
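A stripped-down cousin of this idea is to regress a series on its own lagged values to recover an endogenous (deterministic) skeleton. The linear model and synthetic data below are invented for illustration; the paper itself fits more flexible response surfaces.

```python
# Recovering lag coefficients from a noisy autoregressive series (sketch).
import numpy as np

def fit_lagged_model(x, lags=2):
    """Least-squares fit of x[t] ~ a0 + a1*x[t-1] + ... + a_lags*x[t-lags]."""
    x = np.asarray(x, float)
    X = np.array([np.r_[1.0, x[t - lags:t][::-1]] for t in range(lags, len(x))])
    coef, *_ = np.linalg.lstsq(X, x[lags:], rcond=None)
    return coef  # [a0, a1 (lag 1), ..., a_lags]

# Synthetic series generated by x[t] = 0.5 + 0.6*x[t-1] - 0.3*x[t-2] + noise:
rng = np.random.default_rng(0)
x = [1.0, 1.2]
for _ in range(500):
    x.append(0.5 + 0.6 * x[-1] - 0.3 * x[-2] + rng.normal(0.0, 0.05))
coef = fit_lagged_model(x)
```

Once an endogenous skeleton is fitted, its qualitative behavior (stable point, cycle, or chaos) can be diagnosed from the fitted map rather than from the raw, noise-contaminated series.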
Theory and Scholarly Inquiry Need Not Be Scientific to Be of Value.
ERIC Educational Resources Information Center
Martin, Jack
1995-01-01
Reacts to responses concerning a previous article in this issue, "Against Scientism in Psychological Counselling and Therapy." Reasserts that there are important, undeniable limitations to the application of physical science methodologies and epistemologies to the study of humans and their experiences. (JPS)
Advancing the use of minirhizotrons in wetlands
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iversen, Colleen M; Murphy, Meaghan T.; Allen, Michael F.
Background: Wetlands store a substantial amount of carbon (C) in deep soil organic matter deposits, and play an important role in global fluxes of carbon dioxide and methane. Fine roots (i.e., ephemeral roots that are active in water and nutrient uptake) are recognized as important components of biogeochemical cycles in nutrient-limited wetland ecosystems. However, quantification of fine-root dynamics in wetlands has generally been limited to destructive approaches, possibly because of methodological difficulties associated with the unique environmental, soil, and plant community characteristics of these systems. Non-destructive minirhizotron technology has rarely been used in wetland ecosystems. Scope: Our goal was to develop a consensus on, and a methodological framework for, the appropriate installation and use of minirhizotron technology in wetland ecosystems. Here, we discuss a number of potential solutions for the challenges associated with the deployment of minirhizotron technology in wetlands, including minirhizotron installation and anchorage, capture and analysis of minirhizotron images, and upscaling of minirhizotron data for analysis of biogeochemical pools and parameterization of land surface models. Conclusions: The appropriate use of minirhizotron technology to examine relatively understudied fine-root dynamics in wetlands will advance our knowledge of ecosystem C and nutrient cycling in these globally important ecosystems.
Educational Knowledge Brokerage and Mobilization: The "Marshall Memo" Case
ERIC Educational Resources Information Center
Malin, Joel R.; Paralkar, Vijay Keshaorao
2017-01-01
The importance of intermediation between communities primarily engaged in research production and those primarily engaged in practice is increasingly acknowledged, yet our understanding of the nature and influence of this work in education remains limited. Accordingly, this study utilizes case study methodology and aspires to understand the…
ERIC Educational Resources Information Center
Czocher, Jennifer A.
2016-01-01
This study contributes a methodological tool to reconstruct the cognitive processes and mathematical activities carried out by mathematical modelers. Represented as Modeling Transition Diagrams (MTDs), individual modeling routes were constructed for four engineering undergraduate students. Findings stress the importance and limitations of using…
The Associations of Intergroup Interactions and School Racial Socialization with Academic Motivation
ERIC Educational Resources Information Center
Byrd, Christy M.
2015-01-01
School racial climate is an important aspect of the school environment that can have significant implications for youths' development. However, existing research is limited by conceptual and methodological concerns that restrict the ability of researchers and educators to identify "what" about and "how" the racial climate is…
ERIC Educational Resources Information Center
Etaio, Iñaki; Churruca, Itziar; Rada, Diego; Miranda, Jonatan; Saracibar, Amaia; Sarrionandia, Fernando; Lasa, Arrate; Simón, Edurne; Labayen, Idoia; Martinez, Olaia
2018-01-01
The European Framework for Higher Education has led universities to adapt their teaching schemes. Degrees must train students in competences including specific and cross-curricular skills. Nevertheless, there are important limitations in tracking skill improvement across consecutive academic years. The final-year dissertation (FYD) offers the opportunity…
ERIC Educational Resources Information Center
Bhattacharyya, Ena; Patil, Arun; Sargunan, Rajeswary Appacutty
2010-01-01
Engineering communication studies indicate the importance of oral presentations as an indispensable component of workplace oral communication activities; however, since there is limited literature regarding stakeholder perceptions of effective presentation skills and attributes in technical oral presentations or final year engineering project…
Oliver, Penelope; Cicerale, Sara; Pang, Edwin; Keast, Russell
2018-04-01
Temporal dominance of sensations (TDS) is a rapid descriptive method that offers a different magnitude of information to traditional descriptive analysis methodologies. This methodology considers the dynamic nature of eating, assessing sensory perception of foods as they change throughout the eating event. Limited research has applied the TDS methodology to strawberries and subsequently validated the results against Quantitative Descriptive Analysis (QDA™). The aim of this research is to compare the TDS methodology using an untrained consumer panel to the results obtained via QDA™ with a trained sensory panel. The trained panelists (n = 12, minimum 60 hr each panelist) were provided with six strawberry samples (three cultivars at two maturation levels) and applied QDA™ techniques to profile each strawberry sample. Untrained consumers (n = 103) were provided with six strawberry samples (three cultivars at two maturation levels) and required to use the TDS methodology to assess the dominant sensations for each sample as they changed over time. Results revealed moderately comparable product configurations produced via TDS in comparison to QDA™ (RV coefficient = 0.559), as well as similar application of the sweet attribute (correlation coefficient of 0.895 at first bite). The TDS methodology, however, was not in agreement with the QDA™ methodology regarding more complex flavor terms. These findings support the notion that the lack of training on the definition of terms, together with the methodology's requirement to ignore all attributes other than the dominant one, provides a different magnitude of information than the QDA™ methodology. A comparison of TDS to traditional descriptive analysis indicates that TDS provides additional information to QDA™ regarding the lingering component of eating. The QDA™ results, however, provide more precise detail regarding singular attributes.
Therefore, the TDS methodology has an application in industry when it is important to understand the lingering profile of products. However, this methodology should not be employed as a replacement to traditional descriptive analysis methods. © 2018 Institute of Food Technologists®.
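The product-configuration comparison above rests on the RV coefficient, a matrix correlation between two multivariate configurations of the same samples. As a minimal illustrative sketch (not the authors' code, with random hypothetical data standing in for the strawberry profiles):

```python
import numpy as np

def rv_coefficient(X, Y):
    """RV coefficient between two column-centred configuration matrices
    whose rows are the same products (here, strawberry samples)."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    Sx, Sy = X @ X.T, Y @ Y.T  # product-by-product cross-product matrices
    return np.trace(Sx @ Sy) / np.sqrt(np.trace(Sx @ Sx) * np.trace(Sy @ Sy))

# Hypothetical configurations for 6 samples: 4 QDA attributes vs 3 TDS scores.
rng = np.random.default_rng(0)
qda = rng.normal(size=(6, 4))
tds = qda @ rng.normal(size=(4, 3)) + 0.5 * rng.normal(size=(6, 3))
print(round(float(rv_coefficient(qda, tds)), 3))
```

An RV of 1 means the two configurations are identical up to rotation and scaling; a value near 0.56, as reported, indicates moderate agreement.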
Lexchin, J; Holbrook, A
1994-01-01
OBJECTIVE: To evaluate the methodologic quality and relevance of references in pharmaceutical advertisements in the Canadian Medical Association Journal (CMAJ). DESIGN: Analytic study. DATA SOURCE: All 114 references cited in the first 22 distinct pharmaceutical advertisements in volume 146 of CMAJ. MAIN OUTCOME MEASURES: Mean methodologic quality score (modified from the 6-point scale used to assess articles in the American College of Physicians' Journal Club) and mean relevance score (based on a new 5-point scale) for all references in each advertisement. MAIN RESULTS: Twenty of the 22 companies responded, sending 78 (90%) of the 87 references requested. The mean methodologic quality score was 58% (95% confidence limits [CL] 51% and 65%) and the mean relevance score 76% (95% CL 72% and 80%). The two mean scores were statistically lower than the acceptable score of 80% (p < 0.05), and the methodologic quality score was outside the preset clinically significant difference of 15%. The poor rating for methodologic quality was primarily because of the citation of references to low-quality review articles and "other" sources (i.e., other than reports of clinical trials). Half of the advertisements had a methodologic quality score of less than 65%, but only five had a relevance score of less than 65%. CONCLUSIONS: Although the relevance of most of the references was within minimal acceptable limits, the methodologic quality was often unacceptable. Because advertisements are an important part of pharmaceutical marketing and education, we suggest that companies develop written standards for their advertisements and monitor their advertisements for adherence to these standards. We also suggest that the Pharmaceutical Advertising Advisory Board develop more stringent guidelines for advertising and that it enforce these guidelines in a consistent, rigorous fashion. PMID:8004560
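Confidence limits for a mean score like those reported above can be sketched with a standard mean-and-interval calculation. This is a normal-approximation sketch (not necessarily the authors' exact method), and the per-advertisement scores are hypothetical:

```python
import math
from statistics import NormalDist, mean, stdev

def ci95(scores):
    """Normal-approximation 95% confidence limits for a mean score."""
    n = len(scores)
    m, s = mean(scores), stdev(scores)
    z = NormalDist().inv_cdf(0.975)  # ~1.96 for a two-sided 95% interval
    half = z * s / math.sqrt(n)
    return m - half, m + half

# Hypothetical per-advertisement methodologic quality scores (%):
scores = [45, 52, 58, 60, 63, 55, 70, 48, 66, 62]
lo, hi = ci95(scores)
```

With small samples, a t-based interval (wider than the normal one) would be the more defensible choice.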
LaKind, Judy S; Anthony, Laura G; Goodman, Michael
2017-01-01
Environmental epidemiology data are becoming increasingly important in public health decision making, which commonly incorporates a systematic review of multiple studies. This review addresses two fundamental questions: What is the quality of available reviews on associations between exposure to synthetic organic chemicals and neurodevelopmental outcomes? What is the value (e.g., quality and consistency) of the underlying literature? Published reviews on associations between synthetic organic environmental chemical exposures and neurodevelopmental outcomes in children were systematically evaluated. Seventy-four relevant reviews were identified, and these were evaluated with respect to four methodological characteristics: (1) systematic inclusion/exclusion criteria and reproducible methods for search and retrieval of studies; (2) structured evaluation of underlying data quality; (3) systematic assessment of consistency across specific exposure-outcome associations; and (4) evaluation of reporting/publication bias. None of the 74 reviews fully met the criteria for all four methodological characteristics. Only four reviews met two criteria, and six reviews fulfilled only one criterion. Perhaps more importantly, the higher quality reviews were not able to meet all of the criteria owing to the shortcomings of underlying studies, which lacked comparability in terms of specific research question of interest, overall design, exposure assessment, outcome ascertainment, and analytic methods. Thus, even the most thoughtful and rigorous review may be of limited value if the underlying literature includes investigations that address different hypotheses and are beset by methodological inconsistencies and limitations. Issues identified in this review of reviews illustrate considerable challenges that are facing assessments of epidemiological evidence.
Symptom research on chronic cough: a historical perspective.
Irwin, R S; Madison, J M
2001-05-01
This review provides a perspective on how research on the management of cough has evolved, looks at key methodologic lessons that have been learned from this research and how they may relate to the management of other symptoms, identifies important methodologic challenges that remain to be solved, and lists important questions that still need to be answered. Three important methodologic lessons have been learned. First, cough must be evaluated systematically and according to a neuroanatomic framework. Second, the response to specific therapy must be noted to determine the cause or causes of cough and to characterize the strengths and limitations of diagnostic testing. Third, multiple conditions can simultaneously cause cough. Among the three methodologic challenges that still need to be solved are 1) definitively determining the diagnostic accuracy and reliability of 24-hour esophageal pH monitoring and how best to interpret pH test results, 2) definitively determining the role of nonacid reflux in cough due to gastroesophageal reflux disease, and 3) developing reliable and reproducible subjective and objective methods with which to assess the efficacy of cough therapy. Numerous important clinical questions are still unanswered: What role do empirical therapeutic trials play in diagnosing the cause of chronic cough? What is the most cost-effective approach to the diagnosis and treatment of chronic cough: empirical therapeutic trials or laboratory testing-directed therapeutic trials? How often is environmental air pollution, unrelated to allergies or smoking, responsible for chronic cough?
Background: Soil/dust ingestion rates are important variables in assessing children’s health risks in contaminated environments. Current estimates are based largely on soil tracer methodology, which is limited by analytical uncertainty, small sample size, and short study du...
Democratic Schooling and Citizenship Education: What Does the Research Reveal?
ERIC Educational Resources Information Center
Hepburn, Mary A.
This paper examines four major research studies spanning approximately 45 years which provide solid evidence that democratic schooling is possible and is an extremely important factor in the education of young citizens for a democratic society. The objectives, methodology, limitations, and results of each study are examined. From the studies, the…
How to emerge from the conservatism in clinical research methodology?
Kotecki, Nuria; Penel, Nicolas; Awada, Ahmad
2017-09-01
Despite recent changes in clinical research methodology, many challenges remain in drug development methodology. Advances in molecular biology and cancer treatments have changed the clinical research landscape. Thus, we have moved from empirical clinical oncology to molecular and immunological therapeutic approaches. Along with this move, adapted dose-limiting toxicity definitions, endpoints, and dose-escalation methods have been proposed. Moreover, the classical frontier between phase I, phase II, and phase III has become unclear, in particular for immunological approaches. Investigators are therefore facing major challenges in drug development methodology. We propose to individualize clinical research using innovative approaches to significantly improve patient outcomes, targeting what are considered unmet needs. Integrating a high level of translational research and performing well-designed biomarker studies with great potential for clinical practice are of utmost importance. This could be performed within new models of clinical research networks and by building strong collaboration among academic groups, cooperative groups, on-site investigators, and pharmaceutical companies.
Gyori, Miklos; Stefanik, Krisztina; Kanizsai-Nagy, Ildikó
2015-01-01
A growing body of evidence confirms that mobile digital devices have key potentials as assistive/educational tools for people with autism spectrum disorders. The aim of this paper is to outline key aspects of development and evaluation methodologies that build on, and provide systematic evidence on effects of using such apps. We rely on the results of two R+D projects, both using quantitative and qualitative methods to support development and to evaluate developed apps (n=54 and n=22). Analyzing methodological conclusions from these studies we outline some guidelines for an 'ideal' R+D methodology but we also point to important trade-offs between the need for best systematic evidence and the limitations on development time and costs. We see these trade-offs as a key issue to be resolved in this field.
Experiences of Structured Elicitation for Model-Based Cost-Effectiveness Analyses.
Soares, Marta O; Sharples, Linda; Morton, Alec; Claxton, Karl; Bojke, Laura
2018-06-01
Empirical evidence supporting the cost-effectiveness estimates of particular health care technologies may be limited, or it may even be missing entirely. In these situations, additional information, often in the form of expert judgments, is needed to reach a decision. There are formal methods to quantify experts' beliefs, termed structured expert elicitation (SEE), but only limited research is available in support of methodological choices. Perhaps as a consequence, the use of SEE in the context of cost-effectiveness modelling is limited. This article reviews applications of SEE in cost-effectiveness modelling with the aim of summarizing the basis for methodological choices made in each application and recording the difficulties and challenges reported by the authors in the design, conduct, and analyses. The methods used in each application were extracted along with the criteria used to support methodological and practical choices and any issues or challenges discussed in the text. Issues and challenges were extracted using an open field, and then categorised and grouped for reporting. The review demonstrates considerable heterogeneity in methods used, and authors acknowledge great methodological uncertainty in justifying their choices. Specificities of the context area that emerge as potentially important for further methodological research in elicitation include between-expert variation and its interpretation, the fact that substantive experts in the area may not be trained in quantitative subjects, the need for judgments on various parameter types, the need for some form of assessment of validity, and the need for more integration with behavioural research to devise relevant debiasing strategies. This review of experiences of SEE highlights a number of specificities/constraints that can shape the development of guidance and target future research efforts in this area.
Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Analytical and simulator study of advanced transport
NASA Technical Reports Server (NTRS)
Levison, W. H.; Rickard, W. W.
1982-01-01
An analytic methodology, based on the optimal-control pilot model, was demonstrated for assessing longitudinal-axis handling qualities of transport aircraft in final approach. Calibration of the methodology is largely in terms of closed-loop performance requirements, rather than specific vehicle response characteristics, and is based on a combination of published criteria, pilot preferences, physical limitations, and engineering judgment. Six longitudinal-axis approach configurations were studied covering a range of handling qualities problems, including the presence of flexible aircraft modes. The analytical procedure was used to obtain predictions of Cooper-Harper ratings, a scalar quadratic performance index, and rms excursions of important system variables.
Television's impact on children.
Zuckerman, D M; Zuckerman, B S
1985-02-01
Television has a major impact on children's knowledge, attitudes, and behavior. Research has demonstrated the association between television viewing and four areas: (1) children's aggressive behavior; (2) racial and sex-role stereotypes; (3) decreased interest in reading and school activities; and (4) poorer health habits and attitudes. Methodological limitations make it difficult to draw firm conclusions about a causal relationship between television viewing and children's behavior. Representative studies in these four areas are reviewed, important methodological concerns are pointed out, and conclusions from the research findings are drawn. The implications of the data for pediatricians and other health professionals are discussed.
Casado, Banghwa Lee; Negi, Nalini Junko; Hong, Michin
2012-01-01
Despite the growing number of language minorities, foreign-born individuals with limited English proficiency, this population has been largely left out of social work research, often due to methodological challenges involved in conducting research with this population. Whereas the professional standard calls for cultural competence, a discussion of how to implement strategies for culturally competent research with language minorities is regrettably limited in the social work literature. This article is, to the authors' knowledge, one of the first within the field of social work to tie together unique methodological issues that may arise throughout the research conceptualization, development, and implementation process with this population. Strategies for how to overcome such issues are provided by adapting and expanding on a conceptual framework by Meleis. The incorporation of such research practices with language minorities has the potential to enhance trust and, thus, improve the recruitment and retention of this hard-to-reach population. More important, studies that aim to include such culturally responsive criteria may produce results that have improved validity and, thus, contribute to the advancement of knowledge regarding this population.
System learning approach to assess sustainability and ...
This paper presents a methodology that combines the power of an Artificial Neural Network and Information Theory to forecast variables describing the condition of a regional system. The novelty and strength of this approach is in the application of Fisher information, a key method in Information Theory, to preserve trends in the historical data and prevent overfitting of projections. The methodology was applied to demographic, environmental, food and energy consumption, and agricultural production in the San Luis Basin regional system in Colorado, U.S.A. These variables are important for tracking conditions in human and natural systems. However, available data are often so far out of date that they limit the ability to manage these systems. Results indicate that the approaches developed provide viable tools for forecasting outcomes with the aim of assisting management toward sustainable trends. This methodology is also applicable for modeling different scenarios in other dynamic systems.
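The Fisher information step can be sketched in a simplified discrete form: bin the observed system states, estimate bin probabilities, and sum squared differences of their square roots. This is an illustrative sketch of the general binned estimator used in the sustainability literature, not the paper's implementation:

```python
import numpy as np

def fisher_information(series, bins=8):
    """Discrete Fisher information of an indicator series, via the binned
    form FI ~= 4 * sum((sqrt(p[i+1]) - sqrt(p[i]))**2) over state bins."""
    counts, _ = np.histogram(series, bins=bins)
    p = counts / counts.sum()
    sq = np.sqrt(p)
    return 4.0 * np.sum(np.diff(sq) ** 2)

# A system locked into one state (orderly) scores high; states spread
# evenly across all bins (disorderly) score zero:
orderly = np.full(100, 3.0)               # every observation in one bin
disorderly = np.repeat(np.arange(8), 10)  # equal mass in all 8 bins
print(fisher_information(orderly), fisher_information(disorderly))  # -> 8.0 0.0
```

In practice the statistic is computed over sliding time windows, so that a drop in Fisher information flags a loss of dynamic order in the system.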
European approaches to work-related stress: a critical review on risk evaluation.
Zoni, Silvia; Lucchini, Roberto G
2012-03-01
In recent years, various international organizations have raised awareness regarding psychosocial risks and work-related stress. European stakeholders have also taken action on these issues by producing important documents, such as position papers and government regulations, which are reviewed in this article. In particular, 4 European models that have been developed for the assessment and management of work-related stress are considered here. Although important advances have been made in the understanding of work-related stress, there are still gaps in the translation of this knowledge into effective practice at the enterprise level. There are additional problems regarding the methodology in the evaluation of work-related stress. The European models described in this article are based on holistic, global and participatory approaches, where the active role of and involvement of workers are always emphasized. The limitations of these models are in the lack of clarity on preventive intervention and, for two of them, the lack of instrument standardization for risk evaluation. The comparison among the European models to approach work-related stress, although with limitations and socio-cultural differences, offers the possibility for the development of a social dialogue that is important in defining the correct and practical methodology for work stress evaluation and prevention.
Shi, Chunhu; Zhu, Lin; Wang, Xue; Qin, Chunxia; Xu, Qi; Tian, Jinhui
2014-12-01
The importance of systematic reviews (SRs) of nursing interventions' impact on practice makes their methodological quality and reporting characteristics especially important, as these directly influence their utility for clinicians, patients, and policy makers. The study aims to assess the methodological quality and reporting characteristics of SRs of nursing interventions in Chinese nursing journals. Three Chinese databases were searched for SRs of nursing interventions from inception to October 2011. The Assessment of Multiple Systematic Reviews (AMSTAR) and Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statements were used to assess methodological quality and reporting characteristics. Seventy-four SRs were included. The proportion of SRs complying with AMSTAR checklist items ranged from 0% to 82.4%. No SRs reported an 'a priori' design or conflict of interest. Only four items were found to be reported in more than 50% of the SRs: a list of included and excluded studies, the scientific quality of included studies, the appropriate use of methods to combine findings, and formulating conclusions appropriately. The majority of SRs of nursing interventions in China had major methodological and reporting flaws that limited their value to guide decisions. Chinese authors and journals should adopt and keep up with the AMSTAR and PRISMA statements to improve the quality of SRs in this field. © 2014 Wiley Publishing Asia Pty Ltd.
Cole, Ashley L; Austin, Anna E; Hickson, Ryan P; Dixon, Matthew S; Barber, Emma L
2018-05-11
Randomized trials outside the U.S. have found non-inferior survival for neoadjuvant chemotherapy (NACT) versus primary debulking surgery (PDS) for advanced ovarian cancer (AOC). However, these trials reported lower overall survival and lower rates of optimal debulking than U.S. studies, leading to questions about generalizability to U.S. practice, where aggressive debulking is more common. Consequently, comparative effectiveness in the U.S. remains controversial. We reviewed U.S. comparative effectiveness studies of NACT versus PDS for AOC. Here we describe methodological challenges, compare results to trials outside the U.S., and make suggestions for future research. We identified U.S. studies published in 2010 or later that evaluated the comparative effectiveness of NACT versus PDS on survival in AOC through a PubMed search. Two independent reviewers abstracted data from eligible articles. Nine of 230 articles were eligible for review. Methodological challenges included unmeasured confounders, heterogeneous treatment effects, treatment variations over time, and inconsistent measurement of treatment and survival. Whereas some limitations were unavoidable, several limitations noted across studies were avoidable, including conditioning on mediating factors and immortal time introduced by measuring survival beginning from diagnosis. Without trials in the U.S., non-randomized studies are an important source of evidence for the ideal treatment for AOC. However, several methodological challenges exist when assessing the comparative effectiveness of NACT versus PDS in a non-randomized setting. Future observational studies must ensure that treatment is consistent throughout the study period and that treatment groups are comparable. Rapidly-evolving oncology data networks may allow for identification of treatment intent and other important confounders. Copyright © 2018 Elsevier Ltd. All rights reserved.
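The immortal-time problem noted above can be made concrete with a small simulation. Both arms below share the same true survival distribution; merely classifying patients by a treatment that starts later, and that they must survive long enough to receive, inflates the later arm's apparent survival when time is counted from diagnosis. All numbers are hypothetical:

```python
import random

random.seed(42)

def simulate(n=100_000):
    """Both groups share the same true post-diagnosis survival (mean 24
    months), but the 'delayed' group is only classified as such after
    surviving a 3-month wait before treatment starts -- immortal time."""
    wait = 3.0
    immediate, delayed = [], []
    for _ in range(n):
        t = random.expovariate(1 / 24.0)   # true survival time from diagnosis
        if random.random() < 0.5:
            immediate.append(t)            # treated at diagnosis
        elif t > wait:
            delayed.append(t)              # must survive the wait to be classified
    return sum(immediate) / len(immediate), sum(delayed) / len(delayed)

mean_immediate, mean_delayed = simulate()
```

With an exponential (memoryless) survival time, the delayed arm's mean survival from diagnosis is inflated by roughly the 3-month wait, even though treatment has no effect at all; measuring survival from treatment start removes the artefact.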
ERIC Educational Resources Information Center
Henry, Gary T.; Fortner, C. Kevin; Thompson, Charles L.
2010-01-01
Evaluating the impacts of public school funding on student achievement has been an important objective for informing education policymaking but fraught with data and methodological limitations. Findings from prior research have been mixed at best, leaving policymakers with little advice about the benefits of allocating public resources to schools…
ERIC Educational Resources Information Center
Lange, Linda; Bickel, Robert
This paper examines pregnancy in early adolescence, among West Virginia females aged 10-14, as it relates to local economic and social contexts. Although research on adolescent pregnancy is substantial, it is generally limited to the experiences of older adolescents and premised on assumptions of methodological individualism--that the correlates…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-08
... that tolled the deadlines for all Import Administration cases by seven calendar days due to the Federal...-day period to 300 days if it determines that the case is extraordinarily complicated. See 19 CFR 351... shipper reviews involve extraordinarily complicated methodological issues, including the examination of...
ERIC Educational Resources Information Center
McHugh, R. Kathryn; Behar, Evelyn
2012-01-01
In his commentary on our previously published article "Readability of Self-Report Measures of Depression and Anxiety," J. Schinka (2012) argued for the importance of considering readability of patient materials and highlighted limitations of existing methodologies for this assessment. Schinka's commentary articulately described the weaknesses of…
ERIC Educational Resources Information Center
Genemo, Hussein; Miah, Shah Jahan; McAndrew, Alasdair
2016-01-01
Assessment has been defined as an authentic method that plays an important role in evaluating students' learning attitude in acquiring lifelong knowledge. Traditional methods of assessment including the Computer-Aided Assessment (CAA) for mathematics show limited ability to assess students' full work unless multi-step questions are sub-divided…
NASA Astrophysics Data System (ADS)
Provenzano, G.; Vardy, M. E.; Henstock, T.; Zervos, A.
2017-12-01
A quantitative high-resolution physical model of the top 100 meters of the sub-seabed is of key importance for a wide range of shallow geohazard scenarios: identification of potential shallow landsliding, monitoring of gas storage sites, and assessment of the stability of offshore structures. Currently, engineering-scale sediment characterisation relies heavily on direct sampling of the seabed and in-situ measurements. Such an approach is expensive and time-consuming, as well as liable to alter the sediment properties during the coring process. As opposed to reservoir-scale seismic exploration, ultra-high-frequency (UHF, 0.2-4.0 kHz) multi-channel marine reflection seismic data are most often limited to a semi-quantitative interpretation of the reflection amplitudes and facies geometries, leaving largely unexploited their intrinsic value as a remote characterisation tool. In this work, we develop a seismic inversion methodology to obtain a robust sub-metric resolution elastic model from limited-offset, limited-bandwidth UHF seismic reflection data, with minimal pre-processing and limited a priori information. The Full Waveform Inversion is implemented as a stochastic optimiser based upon a Genetic Algorithm, modified in order to improve the robustness against inaccurate starting model populations. Multiple independent runs are used to create a robust posterior model distribution and quantify the uncertainties on the solution. The methodology has been applied to complex synthetic examples and to real datasets acquired in areas prone to shallow landsliding. The inverted elastic models show a satisfactory match with the ground truths and a good sensitivity to relevant variations in the sediment texture and saturation state. We apply the methodology to a range of synthetic consolidating slopes under different loading conditions and sediment property distributions.
Our work demonstrates that the seismic inversion of UHF data has the potential to become an important practical tool for marine ground model building in spatially heterogeneous areas, reducing the reliance on expensive and time-consuming coring campaigns.
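The stochastic optimiser described above can be illustrated with a toy genetic algorithm: a population of candidate models is scored by data misfit, and elitist selection, crossover, and mutation refine it over generations. The forward model here is a two-parameter damped sinusoid standing in for a seismic trace; this is a hedged sketch of the general technique, not the authors' FWI code:

```python
import math
import random

random.seed(7)

def forward(model):
    """Toy 'forward model': a damped sinusoid standing in for a trace."""
    freq, amp = model
    return [amp * math.exp(-0.1 * t) * math.sin(freq * t) for t in range(50)]

def misfit(model, observed):
    """Sum-of-squares data misfit between modelled and observed traces."""
    return sum((p - o) ** 2 for p, o in zip(forward(model), observed))

def genetic_search(observed, bounds, pop=60, gens=80, mut=0.1):
    """Minimal GA: keep the best half (elitism), uniform crossover,
    Gaussian mutation clipped to the parameter bounds."""
    popn = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop)]
    for _ in range(gens):
        elite = sorted(popn, key=lambda m: misfit(m, observed))[: pop // 2]
        children = []
        while len(children) < pop - len(elite):
            a, b = random.sample(elite, 2)
            child = [x if random.random() < 0.5 else y for x, y in zip(a, b)]
            child = [min(max(c + random.gauss(0, mut * (hi - lo)), lo), hi)
                     for c, (lo, hi) in zip(child, bounds)]
            children.append(child)
        popn = elite + children
    return min(popn, key=lambda m: misfit(m, observed))

true_model = [0.8, 2.0]  # hidden frequency and amplitude to recover
observed = forward(true_model)
best = genetic_search(observed, bounds=[(0.1, 2.0), (0.5, 5.0)])
```

In the paper's setting, multiple independent runs of such a search are pooled to build a posterior model distribution and quantify uncertainty, rather than reporting a single best model.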
Hendriks, Marleen E.; Kundu, Piyali; Boers, Alexander C.; Bolarinwa, Oladimeji A.; te Pas, Mark J.; Akande, Tanimola M.; Agbede, Kayode; Gomez, Gabriella B.; Redekop, William K.; Schultsz, Constance; Tan, Siok Swan
2014-01-01
Background: Disease-specific costing studies can be used as input into cost-effectiveness analyses and provide important information for efficient resource allocation. However, limited data availability and limited expertise constrain such studies in low- and middle-income countries (LMICs). Objective: To describe a step-by-step guideline for conducting disease-specific costing studies in LMICs where data availability is limited and to illustrate how the guideline was applied in a costing study of cardiovascular disease prevention care in rural Nigeria. Design: The step-by-step guideline provides practical recommendations on methods and data requirements for six sequential steps: 1) definition of the study perspective, 2) characterization of the unit of analysis, 3) identification of cost items, 4) measurement of cost items, 5) valuation of cost items, and 6) uncertainty analyses. Results: We discuss the necessary tradeoffs between the accuracy of estimates and data availability constraints at each step and illustrate how a mixed methodology of accurate bottom-up micro-costing and more feasible approaches can be used to make optimal use of all available data. An illustrative example from Nigeria is provided. Conclusions: An innovative, user-friendly guideline for disease-specific costing in LMICs is presented, using a mixed methodology to account for limited data availability. The illustrative example showed that the step-by-step guideline can be used by healthcare professionals in LMICs to conduct feasible and accurate disease-specific cost analyses. PMID:24685170
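Steps 3-6 of such a guideline can be sketched as a tiny bottom-up micro-costing calculation: list the cost items, multiply measured quantities by unit prices, and vary prices for a crude one-way uncertainty check. All item names, quantities, and prices below are hypothetical, purely to illustrate the mechanics:

```python
# Step 3: identify cost items per patient-visit (hypothetical examples);
# each maps to (measured quantity, unit price in US$) -- steps 4 and 5.
cost_items = {
    "staff_time_min": (15, 0.20),
    "bp_measurement": (1, 0.50),
    "medication_day": (30, 0.10),
}

def total_cost(items, price_factor=1.0):
    """Sum quantity * unit price; price_factor rescales all prices for a
    crude one-way sensitivity analysis (step 6)."""
    return sum(q * p * price_factor for q, p in items.values())

base = total_cost(cost_items)                               # point estimate
low, high = total_cost(cost_items, 0.8), total_cost(cost_items, 1.2)  # +/-20% prices
```

A fuller analysis would vary quantities and prices item by item, or sample them in a probabilistic sensitivity analysis, rather than applying one global factor.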
Passenger rail vehicle safety assessment methodology. Volume I, Summary of safe performance limits.
DOT National Transportation Integrated Search
2000-04-01
This report presents a methodology based on computer simulation that assesses the safe dynamic performance limits of commuter passenger vehicles. The methodology consists of determining the critical design parameters and characteristic properties of bo...
ERIC Educational Resources Information Center
Seymour, Sharon
1991-01-01
Review of research methodologies used in studies of online public access catalog (OPAC) users finds that a variety of research methodologies--e.g., surveys, transaction log analysis, interviews--have been used with varying degrees of expertise. It is concluded that poor research methodology resulting from limited training and resources limits the…
Wanner, Samuel Penna; Prímola-Gomes, Thales Nicolau; Pires, Washington; Guimarães, Juliana Bohnen; Hudson, Alexandre Sérvulo Ribeiro; Kunstetter, Ana Cançado; Fonseca, Cletiana Gonçalves; Drummond, Lucas Rios; Damasceno, William Coutinho; Teixeira-Coelho, Francisco
2015-01-01
Rats are used worldwide in experiments that aim to investigate the physiological responses induced by a physical exercise session. Changes in body temperature regulation, which may affect both the performance and the health of exercising rats, are evident among these physiological responses. Despite the universal use of rats in biomedical research involving exercise, investigators often overlook important methodological issues that hamper the accurate measurement of clear thermoregulatory responses. Moreover, much debate exists regarding whether the outcome of rat experiments can be extrapolated to human physiology, including thermal physiology. Herein, we described the impact of different exercise intensities, durations and protocols and environmental conditions on running-induced thermoregulatory changes. We focused on treadmill running because this type of exercise allows for precise control of the exercise intensity and the measurement of autonomic thermoeffectors associated with heat production and loss. Some methodological issues regarding rat experiments, such as the sites for body temperature measurements and the time of day at which experiments are performed, were also discussed. In addition, we analyzed the influence of a high body surface area-to-mass ratio and limited evaporative cooling on the exercise-induced thermoregulatory responses of running rats and then compared these responses in rats to those observed in humans. Collectively, the data presented in this review represent a reference source for investigators interested in studying exercise thermoregulation in rats. In addition, the present data indicate that the thermoregulatory responses of exercising rats can be extrapolated, with some important limitations, to human thermal physiology. PMID:27227066
NASA Technical Reports Server (NTRS)
Sankararaman, Shankar
2016-01-01
This paper presents a computational framework for uncertainty characterization and propagation, and sensitivity analysis in the presence of aleatory and epistemic uncertainty, and develops a rigorous methodology for efficient refinement of epistemic uncertainty by identifying important epistemic variables that significantly affect the overall performance of an engineering system. The proposed methodology is illustrated using the NASA Langley Uncertainty Quantification Challenge (NASA-LUQC) problem that deals with uncertainty analysis of a generic transport model (GTM). First, Bayesian inference is used to infer subsystem-level epistemic quantities using the subsystem-level model and corresponding data. Second, tools of variance-based global sensitivity analysis are used to identify four important epistemic variables (this limitation specified in the NASA-LUQC is reflective of practical engineering situations where not all epistemic variables can be refined due to time/budget constraints) that significantly affect system-level performance. The most significant contribution of this paper is the development of the sequential refinement methodology, where epistemic variables for refinement are not identified all at once. Instead, only one variable is first identified, and then Bayesian inference and global sensitivity calculations are repeated to identify the next important variable. This procedure is continued until all four variables are identified and the refinement in the system-level performance is computed. The advantages of the proposed sequential refinement methodology over the all-at-once uncertainty refinement approach are explained and demonstrated on the NASA Langley Uncertainty Quantification Challenge problem.
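The variance-based ranking step can be illustrated on a toy problem. The sketch below estimates Sobol-style first-order sensitivity indices with a standard pick-freeze Monte Carlo estimator; the two-variable model, sample size, and distributions are assumptions for illustration, not the GTM or the NASA-LUQC setup.

```python
# Minimal sketch of variance-based (Sobol-style) first-order sensitivity
# indices via the pick-freeze estimator, on an illustrative toy model.
import random

random.seed(0)

def model(x1, x2):
    # Toy response: the x2 term carries four times the variance of x1,
    # so analytically S1 = 0.2 and S2 = 0.8.
    return x1 + 2.0 * x2

N = 100_000
A = [(random.random(), random.random()) for _ in range(N)]
B = [(random.random(), random.random()) for _ in range(N)]

yA = [model(*a) for a in A]
mean_yA = sum(yA) / N
var_yA = sum((y - mean_yA) ** 2 for y in yA) / N

def first_order(i):
    """Pick-freeze estimator: keep variable i from sample A, take the rest from B."""
    yAB = [model(a[0] if i == 0 else b[0], a[1] if i == 1 else b[1])
           for a, b in zip(A, B)]
    cov = sum(ya * yab for ya, yab in zip(yA, yAB)) / N - mean_yA * (sum(yAB) / N)
    return cov / var_yA

S1, S2 = first_order(0), first_order(1)
print(f"S1 ~ {S1:.2f}, S2 ~ {S2:.2f}")
```

In a sequential refinement loop, the variable with the largest index would be refined first and the indices then recomputed.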
Brown, Christopher J; O'Connor, Mary I; Poloczanska, Elvira S; Schoeman, David S; Buckley, Lauren B; Burrows, Michael T; Duarte, Carlos M; Halpern, Benjamin S; Pandolfi, John M; Parmesan, Camille; Richardson, Anthony J
2016-04-01
Climate change is shifting species' distribution and phenology. Ecological traits, such as mobility or reproductive mode, explain variation in observed rates of shift for some taxa. However, estimates of relationships between traits and climate responses could be influenced by how responses are measured. We compiled a global data set of 651 published marine species' responses to climate change, from 47 papers on distribution shifts and 32 papers on phenology change. We assessed the relative importance of two classes of predictors of the rate of change: ecological traits of the responding taxa and methodological approaches for quantifying biological responses. Methodological differences explained 22% of the variation in range shifts, more than the 7.8% of the variation explained by ecological traits. For phenology change, methodological approaches accounted for 4% of the variation in measurements, whereas 8% of the variation was explained by ecological traits. Our ability to predict responses from traits was hindered by poor representation of species from the tropics, where temperature isotherms are moving most rapidly. Thus, the mean rate of distribution change may be underestimated by this and other global syntheses. Our analyses indicate that methodological approaches should be explicitly considered when designing, analysing and comparing results among studies. To improve climate impact studies, we recommend that (1) reanalyses of existing time series state how the existing data sets may limit the inferences about possible climate responses; (2) qualitative comparisons of species' responses across different studies be limited to studies with similar methodological approaches; (3) meta-analyses of climate responses include methodological attributes as covariates; and (4) new time series be designed to include the detection of early warnings of change or ecologically relevant change. Greater consideration of methodological attributes will improve the accuracy of analyses that seek to quantify the role of climate change in species' distribution and phenology changes. © 2015 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Dominguez, Caroline; Nascimento, Maria M.; Payan-Carreira, Rita; Cruz, Gonçalo; Silva, Helena; Lopes, José; Morais, Maria da Felicidade A.; Morais, Eva
2015-09-01
Considering the results of research on the benefits and difficulties of peer review, this paper describes how teaching faculty, interested in fostering the acquisition of communication and critical thinking (CT) skills among engineering students, have been implementing a learning methodology based on online peer review activities. When introducing a new methodology, it is important to weigh the advantages found against the conditions that might have restrained the activity outcomes, thereby modulating its overall efficiency. Our results show that several factors are decisive for the success of the methodology: the use of specific and detailed orientation guidelines for CT skills, the students' training in how to deliver meaningful feedback, the opportunity to counter-argue, the selection of good examples of assignments, and the teacher's constant monitoring of the activity. The results also tackle other aspects of the methodology, such as the thinking-skills evaluation tools (grades and tests) that best suit our reality. An improved methodology is proposed taking into account the limitations encountered, thus offering other interested institutions the possibility to use, test and/or improve it.
Cumulative risk and developmental health: an argument for the importance of a family-wide science.
Browne, Dillon T; Plamondon, Andre; Prime, Heather; Puente-Duran, Sofia; Wade, Mark
2015-01-01
A substantial body of research links social disadvantage and developmental health via a cascade running from poverty, to cumulative psychosocial risk, to disrupted family dynamics, to child biological regulatory systems and neurocognitive processing, and finally to morbidity across the lifespan. Most research in this area employs single-dyad or between-family methodology. While informative, there are limitations to this approach. Specifically, it is impossible to determine how risk alters psychosocial environments that are similar for all persons within a household, versus processes that are unique to particular children. This is important in light of literature citing the primacy of child-specific environments in driving developmental health. Methodologically speaking, there are both benefits and challenges to family-wide approaches that differentiate between- and within-family environments. This review describes literature linking cumulative risk and developmental health via family process, while articulating the importance of family-wide approaches. Areas of shortcoming and recommendations for a family-wide science are provided. © 2015 John Wiley & Sons, Ltd.
Measuring systems of hard to get objects: problems with analysis of measurement results
NASA Astrophysics Data System (ADS)
Gilewska, Grazyna
2005-02-01
The metrological parameters of many objects are difficult to access, especially biological objects, whose parameters are often determined by indirect measurement. When access to the measured object is very limited, random components dominate the measurement results. Every measuring process is subject to conditions that limit how it can be improved (e.g. by increasing the number of measurement repetitions to decrease the random limiting error). These may be temporal or financial limitations or, in the case of biological objects, a small sample volume, the influence of the measuring tool and observer on the object, or fatigue effects, e.g. in a patient. Taking these difficulties into consideration, the author developed and verified the practical application of methods for rejecting outlying observations and, subsequently, innovative methods for eliminating measured data with excess variance, in order to decrease the standard deviation of the mean with a limited amount of data at an accepted level of confidence. The elaborated methods were verified on measurements of knee-joint space width obtained from radiographs. Measurements were carried out indirectly on digital images of the radiographs. The results of the examination confirmed the validity of the elaborated methodology and measurement procedures. Such a methodology is especially important when standard approaches do not bring the expected effects.
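A generic form of the outlier-rejection step can be sketched as follows. This is one common rejection rule (discard observations more than k standard deviations from the mean), not necessarily the author's exact procedure, and the joint-space widths below are invented for illustration.

```python
# Generic sketch of outlier rejection for a small measurement sample:
# discard observations farther than k standard deviations from the
# sample mean, then recompute the statistics on the retained data.
import statistics

def reject_outliers(data, k=2.0):
    m = statistics.mean(data)
    s = statistics.stdev(data)
    return [x for x in data if abs(x - m) <= k * s]

# Hypothetical knee-joint space widths (mm) with one gross outlier.
widths = [4.1, 4.3, 4.2, 4.0, 4.2, 6.8, 4.1, 4.3]
kept = reject_outliers(widths)
print(len(kept), round(statistics.stdev(kept), 3))
```

After rejection, the standard deviation of the retained sample (and hence of the mean) is markedly smaller, which is the effect the methodology aims for when more repetitions are not available.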
Source apportionment and sensitivity analysis: two methodologies with two different purposes
NASA Astrophysics Data System (ADS)
Clappier, Alain; Belis, Claudio A.; Pernigotti, Denise; Thunis, Philippe
2017-11-01
This work reviews the existing methodologies for source apportionment and sensitivity analysis to identify key differences and stress their implicit limitations. The emphasis is laid on the differences between source impacts (sensitivity analysis) and contributions (source apportionment) obtained by using four different methodologies: brute-force top-down, brute-force bottom-up, tagged species and the decoupled direct method (DDM). A simple theoretical example is used to compare these approaches, highlighting differences and potential implications for policy. When the relationships between concentration and emissions are linear, impacts and contributions are equivalent concepts. In this case, source apportionment and sensitivity analysis may be used indifferently for both air quality planning purposes and quantifying source contributions. However, this study demonstrates that when the relationship between emissions and concentrations is nonlinear, sensitivity approaches are not suitable to retrieve source contributions and source apportionment methods are not appropriate to evaluate the impact of abatement strategies. A quantification of the potential nonlinearities should therefore be the first step prior to source apportionment or planning applications, to prevent any limitations in their use. When nonlinearity is mild, these limitations may, however, be acceptable in the context of the other uncertainties inherent to complex models. Moreover, when using sensitivity analysis for planning, it is important to note that, under nonlinear circumstances, the calculated impacts will only provide information for the exact conditions (e.g. emission reduction share) that are simulated.
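The linear/nonlinear distinction can be made concrete with a toy example. The functional forms below are assumptions for illustration (they are not the paper's chemistry): under a linear concentration-emission relationship, brute-force impacts sum to the total concentration and therefore double as contributions; under a nonlinear one they do not.

```python
# Toy illustration of why brute-force "impacts" equal "contributions"
# only when concentration depends linearly on emissions.

def linear(e1, e2):
    return e1 + e2          # linear chemistry: c = e1 + e2

def nonlinear(e1, e2):
    return (e1 + e2) ** 2   # nonlinear chemistry: c = (e1 + e2)^2

def brute_force_impacts(model, e1, e2):
    """Impact of a source = concentration change when it is switched off."""
    total = model(e1, e2)
    return total, total - model(0, e2), total - model(e1, 0)

c, i1, i2 = brute_force_impacts(linear, 1.0, 1.0)
print(i1 + i2 == c)    # True: impacts sum to the total concentration

c, i1, i2 = brute_force_impacts(nonlinear, 1.0, 1.0)
print(i1 + i2 == c)    # False: impacts sum to 6 but the total is 4,
                       # so they cannot be read as source contributions
```

A tagged-species method would instead allocate the full total of 4 between the two sources by construction, which is why the two families of methods answer different questions.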
Methodological aspects of clinical trials in tinnitus: A proposal for an international standard
Landgrebe, Michael; Azevedo, Andréia; Baguley, David; Bauer, Carol; Cacace, Anthony; Coelho, Claudia; Dornhoffer, John; Figueiredo, Ricardo; Flor, Herta; Hajak, Goeran; van de Heyning, Paul; Hiller, Wolfgang; Khedr, Eman; Kleinjung, Tobias; Koller, Michael; Lainez, Jose Miguel; Londero, Alain; Martin, William H.; Mennemeier, Mark; Piccirillo, Jay; De Ridder, Dirk; Rupprecht, Rainer; Searchfield, Grant; Vanneste, Sven; Zeman, Florian; Langguth, Berthold
2013-01-01
Chronic tinnitus is a common condition with a high burden of disease. While many different treatments are used in clinical practice, the evidence for the efficacy of these treatments is low and the variance of treatment response between individuals is high. This is most likely due to the great heterogeneity of tinnitus with respect to clinical features as well as underlying pathophysiological mechanisms. There is a clear need to find effective treatment options in tinnitus, however, clinical trials differ substantially with respect to methodological quality and design. Consequently, the conclusions that can be derived from these studies are limited and jeopardize comparison between studies. Here, we discuss our view of the most important aspects of trial design in clinical studies in tinnitus and make suggestions for an international methodological standard in tinnitus trials. We hope that the proposed methodological standard will stimulate scientific discussion and will help to improve the quality of trials in tinnitus. PMID:22789414
Cuevas, Soledad
Agriculture is a major contributor to greenhouse gas emissions, an important part of which is associated with deforestation and indirect land-use change. Appropriate and coherent food policies can play an important role in aligning health, economic and environmental goals. From the point of view of policy analysis, however, this requires multi-sectoral, interdisciplinary approaches which can be highly complex. Important methodological advances in the area are not exempt from limitations and criticism. We argue that there is scope for further developments in integrated quantitative and qualitative policy analysis combining existing methods, including mathematical modelling and stakeholder analysis. We outline methodological trends in the field, briefly characterise integrated mixed-methods policy analysis and identify contributions, challenges and opportunities for future research. In particular, this type of approach can help address issues of uncertainty and context-specific validity, incorporate multiple perspectives and help advance meaningful interdisciplinary collaboration in the field. Substantial challenges remain, however, such as the integration of key issues related to non-communicable disease, or the incorporation of a broader range of qualitative approaches that can address important cultural and ethical dimensions of food.
Comparative Effectiveness Research in Oncology
2013-01-01
Although randomized controlled trials represent the gold standard for comparative effectiveness research (CER), a number of additional methods are available when randomized controlled trials are lacking or inconclusive because of the limitations of such trials. In addition to more relevant, efficient, and generalizable trials, there is a need for additional approaches utilizing rigorous methodology while fully recognizing their inherent limitations. CER is an important construct for defining and summarizing evidence on effectiveness and safety and comparing the value of competing strategies so that patients, providers, and policymakers can be offered appropriate recommendations for optimal patient care. Nevertheless, methodological as well as political and social challenges for CER remain. CER requires constant and sophisticated methodological oversight of study design and analysis similar to that required for randomized trials to reduce the potential for bias. At the same time, if appropriately conducted, CER offers an opportunity to identify the most effective and safe approach to patient care. Despite rising and unsustainable increases in health care costs, an even greater challenge to the implementation of CER arises from the social and political environment questioning the very motives and goals of CER. Oncologists and oncology professional societies are uniquely positioned to provide informed clinical and methodological expertise to steer the appropriate application of CER toward critical discussions related to health care costs, cost-effectiveness, and the comparative value of the available options for appropriate care of patients with cancer. PMID:23697601
Alamian, Golnoush; Hincapié, Ana-Sofía; Pascarella, Annalisa; Thiery, Thomas; Combrisson, Etienne; Saive, Anne-Lise; Martel, Véronique; Althukov, Dmitrii; Haesebaert, Frédéric; Jerbi, Karim
2017-09-01
Neuroimaging studies provide evidence of disturbed resting-state brain networks in Schizophrenia (SZ). However, untangling the neuronal mechanisms that subserve these baseline alterations requires measurement of their electrophysiological underpinnings. This systematic review specifically investigates the contributions of resting-state Magnetoencephalography (MEG) in elucidating abnormal neural organization in SZ patients. A systematic literature review of resting-state MEG studies in SZ was conducted. This literature is discussed in relation to findings from resting-state fMRI and EEG, as well as to task-based MEG research in SZ population. Importantly, methodological limitations are considered and recommendations to overcome current limitations are proposed. Resting-state MEG literature in SZ points towards altered local and long-range oscillatory network dynamics in various frequency bands. Critical methodological challenges with respect to experiment design, and data collection and analysis need to be taken into consideration. Spontaneous MEG data show that local and global neural organization is altered in SZ patients. MEG is a highly promising tool to fill in knowledge gaps about the neurophysiology of SZ. However, to reach its fullest potential, basic methodological challenges need to be overcome. MEG-based resting-state power and connectivity findings could be great assets to clinical and translational research in psychiatry, and SZ in particular. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Scott, C. A.
2014-12-01
This presentation reviews conceptual advances in the emerging field of socio-hydrology that focuses on coupled human and water systems. An important current challenge is how to better couple the bidirectional influences between human and water systems, which lead to emergent dynamics. The interactions among (1) the structure and dynamics of systems with (2) human values and norms lead to (3) outcomes, which in turn influence subsequent interactions. Human influences on hydrological systems are relatively well understood, chiefly resulting from developments in the field of water resources. The ecosystem-service concept of cultural value has expanded understanding of decision-making beyond economic rationality criteria. Hydrological impacts on social processes are less well developed conceptually, but this is changing with growing attention to vulnerability, adaptation, and resilience, particularly in the face of climate change. Methodological limitations, especially in characterizing the range of human responses to hydrological events and drivers, still pose challenges to modeling bidirectional human-water influences. Evidence from multiple case studies, synthesized in more broadly generic syndromes, helps surmount these methodological limitations and offers the potential to improve characterization and quantification of socio-hydrological systems.
Groundwater vulnerability to climate change: A review of the assessment methodology.
Aslam, Rana Ammar; Shrestha, Sangam; Pandey, Vishnu Prasad
2018-01-15
Impacts of climate change on water resources, especially groundwater, can no longer be hidden. These impacts are further exacerbated under the integrated influence of climate variability, climate change and anthropogenic activities. The degree of impact varies according to geographical location and other factors leading systems and regions towards different levels of vulnerability. In the recent past, several attempts have been made in various regions across the globe to quantify the impacts and consequences of climate and non-climate factors in terms of vulnerability to groundwater resources. Firstly, this paper provides a structured review of the available literature, aiming to critically analyse and highlight the limitations and knowledge gaps involved in vulnerability (of groundwater to climate change) assessment methodologies. The effects of indicator choice and the importance of including composite indicators are then emphasised. A new integrated approach for the assessment of groundwater vulnerability to climate change is proposed to successfully address those limitations. This review concludes that the choice of indicator has a significant role in defining the reliability of computed results. The effect of an individual indicator is also apparent but the consideration of a combination (variety) of indicators may give more realistic results. Therefore, in future, depending upon the local conditions and scale of the study, indicators from various groups should be chosen. Furthermore, there are various assumptions involved in previous methodologies, which limit their scope by introducing uncertainty in the calculated results. These limitations can be overcome by implementing the proposed approach. Copyright © 2017 Elsevier B.V. All rights reserved.
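Composite-indicator vulnerability assessments of the kind reviewed typically combine rated indicators through a weighted sum (DRASTIC-style weighting is a common choice). The indicators, ratings, and weights below are assumptions for illustration, not the paper's proposed integrated approach.

```python
# Illustrative composite groundwater-vulnerability index: each indicator
# gets a rating (1-10) and a weight, and the index is the weighted sum.
indicators = {
    # name: (rating 1-10, weight)
    "depth to water table": (7, 5),
    "net recharge": (6, 4),
    "aquifer media": (5, 3),
    "climate exposure": (8, 4),
}

def vulnerability_index(ind):
    return sum(rating * weight for rating, weight in ind.values())

print(vulnerability_index(indicators))  # 7*5 + 6*4 + 5*3 + 8*4 = 106
```

The review's point about indicator choice maps directly onto this arithmetic: adding, dropping, or re-weighting an indicator changes the index, so the selection must be justified for the local conditions and scale of the study.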
Challenges to inferring causality from viral information dispersion in dynamic social networks
NASA Astrophysics Data System (ADS)
Ternovski, John
2014-06-01
Understanding the mechanism behind large-scale information dispersion through complex networks has important implications for a variety of industries ranging from cyber-security to public health. With the unprecedented availability of public data from online social networks (OSNs) and the low cost nature of most OSN outreach, randomized controlled experiments, the "gold standard" of causal inference methodologies, have been used with increasing regularity to study viral information dispersion. And while these studies have dramatically furthered our understanding of how information disseminates through social networks by isolating causal mechanisms, there are still major methodological concerns that need to be addressed in future research. This paper delineates why modern OSNs are markedly different from traditional sociological social networks and why these differences present unique challenges to experimentalists and data scientists. The dynamic nature of OSNs is particularly troublesome for researchers implementing experimental designs, so this paper identifies major sources of bias arising from network mutability and suggests strategies to circumvent and adjust for these biases. This paper also discusses the practical considerations of data quality and collection, which may adversely impact the efficiency of the estimator. The major experimental methodologies used in the current literature on virality are assessed at length, and their strengths and limits identified. Other, as-yet-unsolved threats to the efficiency and unbiasedness of causal estimators--such as missing data--are also discussed. This paper integrates methodologies and learnings from a variety of fields under an experimental and data science framework in order to systematically consolidate and identify current methodological limitations of randomized controlled experiments conducted in OSNs.
40 CFR Appendix B to Part 72 - Methodology for Conversion of Emissions Limits
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Methodology for Conversion of... Conversion of Emissions Limits For the purposes of the Acid Rain Program, all emissions limits must be... conditions. Generic conversions for these limits are based on the assumed average energy contents listed in...
40 CFR Appendix B to Part 72 - Methodology for Conversion of Emissions Limits
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 16 2011-07-01 2011-07-01 false Methodology for Conversion of... Conversion of Emissions Limits For the purposes of the Acid Rain Program, all emissions limits must be... conditions. Generic conversions for these limits are based on the assumed average energy contents listed in...
40 CFR Appendix B to Part 72 - Methodology for Conversion of Emissions Limits
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 17 2014-07-01 2014-07-01 false Methodology for Conversion of Emissions Limits B Appendix B to Part 72 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) PERMITS REGULATION Pt. 72, App. B Appendix B to Part 72—Methodology for...
40 CFR Appendix A to Part 72 - Methodology for Annualization of Emissions Limits
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 17 2013-07-01 2013-07-01 false Methodology for Annualization of Emissions Limits A Appendix A to Part 72 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) PERMITS REGULATION Pt. 72, App. A Appendix A to Part 72—Methodology for...
40 CFR Appendix A to Part 72 - Methodology for Annualization of Emissions Limits
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 17 2014-07-01 2014-07-01 false Methodology for Annualization of Emissions Limits A Appendix A to Part 72 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) PERMITS REGULATION Pt. 72, App. A Appendix A to Part 72—Methodology for...
40 CFR Appendix B to Part 72 - Methodology for Conversion of Emissions Limits
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 17 2013-07-01 2013-07-01 false Methodology for Conversion of Emissions Limits B Appendix B to Part 72 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) PERMITS REGULATION Pt. 72, App. B Appendix B to Part 72—Methodology for...
Morales Guerrero, Josefina C; García Zepeda, Rodrigo A; Flores Ruvalcaba, Edgar; Martínez Michel, Lorelei
2012-09-01
We evaluated the two methods accepted by the Mexican norm for the determination of nitrites in infant meat-based food with vegetables. We determined the content of nitrites in the infant food, in the raw materials, and in products from the intermediate stages of production. A reagent blank and a reference sample were included in each analytical run. In addition, we determined the sensitivity, recovery percentage and accuracy of each methodology. The infant food results indicated an important difference in the nitrite content determined under each methodology, due to the persistent presence of turbidity in the extracts. Different treatments were proposed to eliminate the turbidity, but these only managed to reduce it. The turbidity was attributed to carbohydrates, whose measured concentrations exhibited wide dispersion and were below the quantifiable limit under both methodologies; it is therefore not recommended to apply these techniques to food suspected to contain traces of nitrites.
Tchepel, Oxana; Dias, Daniela
2011-06-01
This study is focused on the assessment of potential health benefits of meeting the air quality limit values (2008/50/EC) for short-term PM₁₀ exposure. For this purpose, the WHO methodology for health impact assessment and the APHEIS guidelines for data collection were applied to the Porto Metropolitan Area, Portugal. Additionally, an improved methodology using population mobility data is proposed in this work to analyse the number of persons exposed. In order to obtain representative background concentrations, an innovative approach to processing air quality time series was implemented. The results provide the number of attributable cases prevented annually by reducing PM₁₀ concentrations. An intercomparison of two approaches to processing input data for the health risk analysis provides information on the sensitivity of the applied methodology. The findings highlight the importance of taking into account the spatial variability of air pollution levels and population mobility in health impact assessment.
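The core WHO health impact assessment arithmetic used in such studies can be sketched as follows. The relative risk, exceedance, baseline rate, and population figures below are illustrative placeholders, not the Porto results.

```python
# Minimal sketch of WHO-style health impact assessment arithmetic:
# cases attributable to PM10 concentrations above a limit value.
# All input values below are illustrative assumptions.

def attributable_cases(rr_per_10, delta_pm10, baseline_rate, population):
    """Attributable cases from reducing PM10 by delta_pm10 (ug/m3)."""
    rr = rr_per_10 ** (delta_pm10 / 10.0)   # scale RR to the actual increment
    af = (rr - 1.0) / rr                    # attributable fraction
    return af * baseline_rate * population

# E.g. RR = 1.06 per 10 ug/m3, a 5 ug/m3 exceedance, a baseline
# mortality rate of 0.009 per person-year, and 1.2 million exposed persons.
cases = attributable_cases(1.06, 5.0, 0.009, 1_200_000)
print(round(cases))
```

The study's methodological point enters through the last argument: using population mobility data rather than residential counts changes the number of persons exposed, and hence the attributable-case estimate, in direct proportion.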
Coarasa, Jorge; Das, Jishnu; Gummerson, Elizabeth; Bitton, Asaf
2017-04-12
Systematic reviews are powerful tools for summarizing vast amounts of data in controversial areas; but their utility is limited by methodological choices and assumptions. Two systematic reviews of literature on the quality of private sector primary care in low and middle income countries (LMIC), published in the same journal within a year, reached conflicting conclusions. The difference in findings reflects different review methodologies, but more importantly, a weak underlying body of literature. A detailed examination of the literature cited in both reviews shows that only one of the underlying studies met the gold standard for methodological robustness. Given the current policy momentum on universal health coverage and primary health care reform across the globe, there is an urgent need for high quality empirical evidence on the quality of private versus public sector primary health care in LMIC.
Conceptual, Methodological, and Ethical Problems in Communicating Uncertainty in Clinical Evidence
Han, Paul K. J.
2014-01-01
The communication of uncertainty in clinical evidence is an important endeavor that poses difficult conceptual, methodological, and ethical problems. Conceptual problems include logical paradoxes in the meaning of probability and “ambiguity”— second-order uncertainty arising from the lack of reliability, credibility, or adequacy of probability information. Methodological problems include questions about optimal methods for representing fundamental uncertainties and for communicating these uncertainties in clinical practice. Ethical problems include questions about whether communicating uncertainty enhances or diminishes patient autonomy and produces net benefits or harms. This article reviews the limited but growing literature on these problems and efforts to address them and identifies key areas of focus for future research. It is argued that the critical need moving forward is for greater conceptual clarity and consistent representational methods that make the meaning of various uncertainties understandable, and for clinical interventions to support patients in coping with uncertainty in decision making. PMID:23132891
Le Quere, C. [University of East Anglia, Norwich (United Kingdom)]; Moriarty, R. [University of East Anglia, Norwich (United Kingdom)]; Andrew, R. M. [Univ. of Oslo (Norway)]; Canadell, J. G. [Commonwealth Scientific and Industrial Research Organization (CSIRO) Oceans and Atmosphere, Canberra ACT (Australia)]; Sitch, S. [University of Exeter, Exeter (United Kingdom)]; Boden, T. A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States), Carbon Dioxide Information Analysis Center (CDIAC)]; et al.
2015-01-01
Accurate assessment of anthropogenic carbon dioxide (CO2) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere is important to better understand the global carbon cycle, support the development of climate policies, and project future climate change. Here we describe data sets and a methodology to quantify all major components of the global carbon budget, including their uncertainties, based on the combination of a range of data, algorithms, statistics, and model estimates and their interpretation by a broad scientific community. We discuss changes compared to previous estimates as well as consistency within and among components, alongside methodology and data limitations.
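The budget accounting described above can be sketched as a simple balance check: fossil-fuel and land-use-change emissions are compared against atmospheric growth plus the ocean and land sinks, and any residual is reported as the budget imbalance. The numbers below are placeholder values for illustration only, not estimates from this data set.

```python
# Sketch of the global carbon budget identity (all terms in GtC/yr).
# A positive imbalance means sources exceed the accounted-for sinks.

def budget_imbalance(e_ff, e_luc, g_atm, s_ocean, s_land):
    """(fossil + land-use emissions) - (atmospheric growth + ocean sink + land sink)."""
    return (e_ff + e_luc) - (g_atm + s_ocean + s_land)

# Hypothetical round-number inputs, not values from the article:
print(budget_imbalance(e_ff=9.8, e_luc=1.1, g_atm=5.1, s_ocean=2.6, s_land=3.0))
```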
Trost, Barry M.; Frontier, Alison J.; Thiel, Oliver R.; Yang, Hanbiao; Dong, Guangbin
2012-01-01
Bryostatins, a family of structurally complicated macrolides, exhibit an exceptional range of biological activities. The limited availability and structural complexity of these molecules make the development of an efficient total synthesis particularly important. This article describes our initial efforts towards the total synthesis of bryostatins, in which chemoselective and atom-economical methods for the stereoselective assembly of the C-ring subunit were developed. A Pd-catalyzed tandem alkyne-alkyne coupling/6-endo-dig cyclization sequence was explored and successfully applied to the synthesis of a dihydropyran ring system. Elaboration of this methodology ultimately led to a concise synthesis of the C-ring subunit of bryostatins. PMID:21793057
Van Pamel, Anton; Brett, Colin R; Lowe, Michael J S
2014-12-01
Improving the ultrasound inspection capability for coarse-grained metals remains of longstanding interest and is expected to become increasingly important for next-generation electricity power plants. Conventional ultrasonic A-, B-, and C-scans have been found to suffer from strong background noise caused by grain scattering, which can severely limit the detection of defects. In recent years, however, array probes and full matrix capture (FMC) imaging algorithms have unlocked exciting possibilities for improvements. To improve and compare these algorithms, we must rely on robust methodologies to quantify their performance. This article proposes such a methodology to evaluate the detection performance of imaging algorithms. For illustration, the methodology is applied to some example data using three FMC imaging algorithms: the total focusing method (TFM), phase-coherent imaging (PCI), and decomposition of the time-reversal operator with multiple scattering filter (DORT MSF). However, it is important to note that this is solely to illustrate the methodology; this article does not attempt the broader investigation of different cases that would be needed to compare the performance of these algorithms in general. The methodology considers the statistics of detection, presenting the detection performance as probability of detection (POD) and probability of false alarm (PFA). A test sample of coarse-grained nickel superalloy, manufactured to represent materials used for future power plant components and containing some simple artificial defects, is used to illustrate the method on the candidate algorithms. The data are captured in pulse-echo mode using 64-element array probes at center frequencies of 1 and 5 MHz. In this particular case, all three algorithms are shown to perform very similarly when comparing their flaw detection capabilities.
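The POD/PFA idea behind such an evaluation can be sketched as a simple threshold exercise: for a given detection threshold, POD is the fraction of known flaw indications exceeding it, and PFA is the fraction of noise-only responses exceeding it. The amplitude values below are made-up illustrations, not data from the article.

```python
# Sketch of threshold-based detection statistics (POD and PFA).

def pod_pfa(flaw_amplitudes, noise_amplitudes, threshold):
    """Fraction of flaw responses above threshold (POD) and fraction of
    noise-only responses above threshold (PFA)."""
    pod = sum(a >= threshold for a in flaw_amplitudes) / len(flaw_amplitudes)
    pfa = sum(a >= threshold for a in noise_amplitudes) / len(noise_amplitudes)
    return pod, pfa

# Hypothetical image amplitudes; sweeping the threshold traces a ROC-like curve.
flaws = [0.9, 0.8, 0.85, 0.6, 0.95]   # responses at known defect locations
noise = [0.3, 0.5, 0.2, 0.4, 0.7]     # grain-noise peaks away from defects
for t in (0.45, 0.65, 0.82):
    print(t, pod_pfa(flaws, noise, t))
```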
Loucka, Martin; Payne, Sheila; Brearley, Sarah
2014-01-01
A number of research projects have been conducted that aim to gather data on the international development of palliative care. These data are important for policy makers and palliative care advocates. The aim of this article was to provide a critical comparative analysis of methodological approaches used to assess the development and status of palliative care services and infrastructure at an international level. A selective literature review that focused on the methodological features of eight identified reports was undertaken. Reviewed reports were found to differ in adopted methodologies and provided uneven amounts of methodological information. Five major methodological limitations were identified (lack of theory, use of experts as source of information, grey literature, difficulties in ranking, and the problematic nature of data on service provision). A set of recommendations on how to deal with these issues in future research is provided. Measuring the international development of palliative care is a difficult and challenging task. The results of this study could be used to improve the validity of future research in this field. Copyright © 2014 U.S. Cancer Pain Relief Committee. Published by Elsevier Inc. All rights reserved.
Effectiveness of clinical pathways for total knee and total hip arthroplasty: literature review.
Kim, Stephen; Losina, Elena; Solomon, Daniel H; Wright, John; Katz, Jeffrey N
2003-01-01
Although many hospitals have implemented clinical pathways to standardize the process of care, the effectiveness of clinical pathways for total hip and knee arthroplasties has not been reviewed critically. We searched for articles comparing outcomes of total hip or knee arthroplasty for patients who were treated using clinical pathways as opposed to patients treated without these pathways. Eleven studies met criteria for inclusion. Ten used historical controls, and 1 was a randomized trial. The studies had important methodological limitations. In general, the articles showed that patients treated using pathways experienced shorter hospital stays and lower costs, with comparable clinical outcomes as compared with patients treated without clinical pathways. We concluded that clinical pathways appear successful in reducing costs and length of stay in the acute care hospital, with no compromise in patient outcomes. However, interpretation of these studies is complicated by substantial methodological limitations, particularly the use of historical controls and failure to account for length of stay in rehabilitation facilities. Copyright 2003, Elsevier Science (USA). All rights reserved.
Technological advancements and their importance for nematode identification
NASA Astrophysics Data System (ADS)
Ahmed, Mohammed; Sapp, Melanie; Prior, Thomas; Karssen, Gerrit; Back, Matthew Alan
2016-06-01
Nematodes represent a species-rich and morphologically diverse group of metazoans known to inhabit both aquatic and terrestrial environments. Their role as biological indicators and as key players in nutrient cycling has been well documented. Some plant-parasitic species are also known to cause significant losses to crop production. In spite of this, there still exists a huge gap in our knowledge of their diversity due to the enormous amount of time and expertise often involved in characterising species using phenotypic features. Molecular methodology provides a useful means of complementing the limited number of reliable diagnostic characters available for morphology-based identification. We discuss herein some of the limitations of traditional taxonomy and how molecular methodologies, especially the use of high-throughput sequencing, have assisted in carrying out large-scale nematode community studies and characterisation of phytonematodes through rapid identification of multiple taxa. We also provide brief descriptions of some of the current and almost-outdated high-throughput sequencing platforms and their applications in both plant nematology and soil ecology.
A critical analysis of studies of state drug reimbursement policies: research in need of discipline.
Soumerai, S B; Ross-Degnan, D; Fortess, E E; Abelson, J
1993-01-01
Concerns over pharmaceutical costs and appropriateness of medication use have led state Medicaid programs to restrict drug reimbursement. This article critically reviews 20 years of research on cost sharing, drug reimbursement limits, and administrative limitations on access to particular drugs via formularies, category exclusions, or prior authorization requirements; evaluates their methodological rigor; summarizes the state of current knowledge; and proposes future research directions. Drug reimbursement caps and modest cost sharing can reduce the use of both essential and less important drugs in Medicaid populations; severe reimbursement caps may precipitate serious unintended effects. Limitations on access to particular drugs can cause both rational and irrational drug substitution effects; it is unclear whether such limits reduce expenditures either for drugs or for overall health care.
NASA Astrophysics Data System (ADS)
Ebrahim, Girma Y.; Villholth, Karen G.
2016-10-01
Groundwater is an important resource for multiple uses in South Africa. Hence, setting limits to its sustainable abstraction while assuring basic human needs is required. Due to prevalent data scarcity related to groundwater replenishment, which is the traditional basis for estimating groundwater availability, the present article presents a novel method for determining allocatable groundwater in quaternary (fourth-order) catchments through information on streamflow. Building on established methodologies for assessing baseflow, recession flow, and instream ecological flow requirements, a combined stepwise methodology was developed to determine the annual available groundwater storage volume using linear reservoir theory, essentially linking low flows proportionally to upstream groundwater storage. The approach was trialled for twenty-one perennial and relatively undisturbed catchments with long-term and reliable streamflow records. Using the Desktop Reserve Model, instream flow requirements necessary to meet the present ecological state of the streams were determined, and baseflows in excess of these flows were converted into conservative estimates of allocatable groundwater storage on an annual basis. Results show that groundwater development potential exists in fourteen of the catchments, with upper limits to allocatable groundwater volumes (including present uses) ranging from 0.02 to 3.54 × 10⁶ m³ a⁻¹ (0.10-11.83 mm a⁻¹) per catchment. With these volumes secured in 75% of years, inter-annual variability is assumed to be manageable. A significant (R² = 0.88) correlation between the baseflow index and the drainage time scale for the catchments underscores the physical basis of the methodology and also allows the procedure to be shortened by one step, omitting the recession flow analysis.
The method serves as an important complementary tool for the assessment of the groundwater part of the Reserve and the Groundwater Resource Directed Measures in South Africa and could be adapted and applied elsewhere.
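The linear-reservoir link between baseflow and groundwater storage used in the stepwise methodology can be sketched as follows. This is a minimal illustration assuming the textbook linear-reservoir relations S = τ·Q and Q(t) = Q₀·exp(−t/τ); all numbers and variable names are hypothetical, not values from the study.

```python
# Sketch: convert baseflow in excess of the ecological flow requirement
# (EFR) into an allocatable groundwater storage via linear reservoir theory.

import math

def storage_from_baseflow(baseflow_m3_per_day, tau_days):
    """Linear reservoir: outflow Q = S / tau, hence storage S = Q * tau."""
    return baseflow_m3_per_day * tau_days

def recession(q0, tau_days, t_days):
    """Baseflow recession Q(t) = Q0 * exp(-t / tau)."""
    return q0 * math.exp(-t_days / tau_days)

def allocatable_volume(baseflow, efr, tau_days):
    """Only baseflow in excess of the EFR is convertible to abstraction."""
    excess = max(baseflow - efr, 0.0)
    return storage_from_baseflow(excess, tau_days)

q0 = 1000.0   # hypothetical baseflow, m3/day
tau = 60.0    # hypothetical drainage time scale, days
efr = 400.0   # hypothetical instream ecological flow requirement, m3/day
print(allocatable_volume(q0, efr, tau))  # allocatable storage, m3
```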
Cross-Sectional And Longitudinal Uncertainty Propagation In Drinking Water Risk Assessment
NASA Astrophysics Data System (ADS)
Tesfamichael, A. A.; Jagath, K. J.
2004-12-01
Pesticide residues in drinking water can vary significantly from day to day. However, drinking water quality monitoring performed under the Safe Drinking Water Act (SDWA) at most community water systems (CWSs) is typically limited to four data points per year over a few years. Due to this limited sampling, likely maximum residues may be underestimated in risk assessment. In this work, a statistical methodology is proposed to study the cross-sectional and longitudinal uncertainties in observed samples and their propagated effect on risk estimates. The methodology will be demonstrated using data from 16 CWSs across the US that have three independent databases of atrazine residue to estimate the uncertainty of risk in infants and children. The results showed that in 85% of the CWSs, chronic risks predicted with the proposed approach may be two- to fourfold higher than those predicted with the current approach, while intermediate risks may be two- to threefold higher in 50% of the CWSs. In 12% of the CWSs, however, the proposed methodology showed a lower intermediate risk. A closed-form solution of propagated uncertainty will be developed to calculate the number of years (seasons) of water quality data and the sampling frequency needed to reduce the uncertainty in risk estimates. In general, this methodology provides good insight into the importance of addressing the uncertainty of observed water quality data and the need to predict likely maximum residues in risk assessment by considering the propagation of uncertainties.
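The core problem of sparse sampling can be illustrated with a simple bootstrap: resampling four quarterly measurements shows how wide the sampling uncertainty around an annual-mean residue estimate really is. This is a generic sketch, not the closed-form propagation developed in the abstract, and the residue values are hypothetical.

```python
# Sketch: percentile-bootstrap interval for the mean of a small sample,
# illustrating the sampling uncertainty from only four data points/year.

import random
import statistics

def bootstrap_mean_interval(samples, n_boot=2000, alpha=0.05, seed=1):
    """Percentile bootstrap confidence interval for the sample mean."""
    rng = random.Random(seed)
    means = sorted(
        statistics.mean(rng.choices(samples, k=len(samples)))
        for _ in range(n_boot)
    )
    lo = means[int(alpha / 2 * n_boot)]
    hi = means[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

quarterly_residue_ppb = [0.4, 1.1, 3.2, 0.7]  # hypothetical residue values
lo, hi = bootstrap_mean_interval(quarterly_residue_ppb)
print(lo, hi)  # a wide interval reflects how little the four samples constrain the mean
```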
Methodology for estimating helicopter performance and weights using limited data
NASA Technical Reports Server (NTRS)
Baserga, Claudio; Ingalls, Charles; Lee, Henry; Peyran, Richard
1990-01-01
Methodology is developed and described for estimating the flight performance and weights of a helicopter for which limited data are available. The methodology is based on assumptions which couple knowledge of the technology of the helicopter under study with detailed data from well documented helicopters thought to be of similar technology. The approach, analysis assumptions, technology modeling, and the use of reference helicopter data are discussed. Application of the methodology is illustrated with an investigation of the Agusta A129 Mangusta.
Matthews, M; Rathleff, M S; Claus, A; McPoil, T; Nee, R; Crossley, K; Vicenzino, B
2017-12-01
Patellofemoral pain (PFP) is a multifactorial and often persistent knee condition. One strategy to enhance patient outcomes is to use clinically assessable patient characteristics to predict the outcome and match a specific treatment to an individual. A systematic review was conducted to determine which baseline patient characteristics were (1) associated with patient outcome (prognosis); or (2) modified patient outcome from a specific treatment (treatment effect modifiers). 6 electronic databases were searched (July 2016) for studies evaluating the association between those with PFP, their characteristics and outcome. All studies were appraised using the Epidemiological Appraisal Instrument. Studies that aimed to identify treatment effect modifiers were additionally assessed with a methodological quality checklist. The 24 included studies evaluated 180 participant characteristics; 12 studies investigated prognosis, and 12 investigated potential treatment effect modifiers. Important methodological limitations were identified. Some prognostic studies used a retrospective design. Studies aiming to identify treatment effect modifiers often analysed too many variables for the limited sample size and typically failed to use a control or comparator treatment group. 16 factors were reported to be associated with a poor outcome, with longer duration of symptoms (>4 months) the most frequently reported. Preliminary evidence suggests increased midfoot mobility may predict those who have a successful outcome with foot orthoses. Current evidence can identify those with increased risk of a poor outcome, but methodological limitations make it difficult to predict the outcome after one specific treatment compared with another. Adequately designed randomised trials are needed to identify treatment effect modifiers. Published by the BMJ Publishing Group Limited.
Rodríguez-Prieto, V; Vicente-Rubiano, M; Sánchez-Matamoros, A; Rubio-Guerri, C; Melero, M; Martínez-López, B; Martínez-Avilés, M; Hoinville, L; Vergne, T; Comin, A; Schauer, B; Dórea, F; Pfeiffer, D U; Sánchez-Vizcaíno, J M
2015-07-01
In this globalized world, the spread of new, exotic and re-emerging diseases has become one of the most important threats to animal production and public health. This systematic review analyses conventional and novel early detection methods applied to surveillance. In all, 125 scientific documents were considered for this study. Exotic (n = 49) and re-emerging (n = 27) diseases constituted the most frequently represented health threats. In addition, the majority of studies were related to zoonoses (n = 66). The approaches found in the review could be divided into surveillance modalities, both active (n = 23) and passive (n = 5), and tools and methodologies that support surveillance activities (n = 57). Combinations of surveillance modalities and tools (n = 40) were also found. Risk-based approaches were very common (n = 60), especially in the papers describing tools and methodologies (n = 50). The main applications, benefits and limitations of each approach were extracted from the papers. This information will be very useful for informing the development of tools to facilitate the design of cost-effective surveillance strategies. Thus, the current literature review provides key information about the advantages, disadvantages, limitations and potential applications of methodologies for the early detection of new, exotic and re-emerging diseases.
Agile methodology selection criteria: IT start-up case study
NASA Astrophysics Data System (ADS)
Micic, Lj
2017-05-01
Project management in modern IT companies is often based on agile methodologies, which have several advantages over traditional methodologies such as waterfall. Given that clients sometimes change a project during development, it is crucial for an IT company to choose carefully which methodology to implement, and whether to rely mostly on one or on a combination of several. Among the many modern methodologies in use, Scrum, Kanban and XP (extreme programming) are the most common. Companies sometimes use the tools and procedures of a single methodology, but quite often they combine elements of several. Because these methodologies are just frameworks, companies can adapt them to their specific projects and other constraints. Agile methodologies are still in limited use in Bosnia, but more and more IT companies are adopting them, not only because they are common practice among clients abroad but also because they are increasingly the only way to deliver a quality product on time. It remains challenging, however, to decide which methodology, or combination of several, a company should implement, and how to connect it to its own projects, organizational framework and HR management. This paper presents a case study of a local IT start-up and delivers a solution based on a theoretical framework and the practical limitations of the case company.
Computational Simulation of Continuous Fiber-Reinforced Ceramic Matrix Composites Behavior
NASA Technical Reports Server (NTRS)
Murthy, Pappu L. N.; Chamis, Christos C.; Mital, Subodh K.
1996-01-01
This report describes a methodology which predicts the behavior of ceramic matrix composites and has been incorporated in the computational tool CEMCAN (CEramic Matrix Composite ANalyzer). The approach combines micromechanics with a unique fiber substructuring concept. In this new concept, the conventional unit cell (the smallest representative volume element of the composite) of the micromechanics approach is modified by substructuring it into several slices and developing the micromechanics-based equations at the slice level. The methodology also takes into account nonlinear ceramic matrix composite (CMC) behavior due to temperature, as well as fracture initiation and progression. Important features of the approach and its effectiveness are described using selected examples. Comparisons of predictions with limited experimental data are also provided.
The Single-Group, Pre- and Posttest Design in Nursing Education Research: It's Time to Move on.
Spurlock, Darrell R
2018-02-01
Studying the effectiveness of educational interventions is centrally important to building the science of nursing education. Yet, the design most commonly used in the study of nursing education interventions, the single-group, pre- and posttest design, provides limited evidence to support claims of intervention effectiveness. In this Methodology Corner installment, the limitations of the single-group, pre- and posttest design are outlined and a review of the requirements for establishing stronger arguments for causality is presented. To overcome the limitations of single-group, pre- and posttest designs, nursing education researchers are encouraged to employ study designs and procedures that can significantly strengthen researchers' claims of intervention effectiveness. [J Nurs Educ. 2018;57(2):69-71.]. Copyright 2018, SLACK Incorporated.
Theofilatos, Konstantinos; Pavlopoulou, Niki; Papasavvas, Christoforos; Likothanassis, Spiros; Dimitrakopoulos, Christos; Georgopoulos, Efstratios; Moschopoulos, Charalampos; Mavroudi, Seferina
2015-03-01
Proteins are considered to be the most important individual components of biological systems and they combine to form physical protein complexes which are responsible for certain molecular functions. Despite the wide availability of protein-protein interaction (PPI) information, not much information is available about protein complexes. Experimental methods are limited in terms of time, efficiency, cost and performance constraints. Existing computational methods have provided encouraging preliminary results, but they face certain disadvantages: they require parameter tuning, some of them cannot handle weighted PPI data, and others do not allow a protein to participate in more than one protein complex. In the present paper, we propose a new fully unsupervised methodology for predicting protein complexes from weighted PPI graphs. The proposed methodology is called evolutionary enhanced Markov clustering (EE-MC) and it is a hybrid combination of an adaptive evolutionary algorithm and a state-of-the-art clustering algorithm named enhanced Markov clustering. EE-MC was compared with state-of-the-art methodologies when applied to datasets from the human and the yeast Saccharomyces cerevisiae organisms. Using publicly available datasets, EE-MC outperformed existing methodologies (in some datasets the separation metric was increased by 10-20%). Moreover, when applied to new human datasets its performance was encouraging in the prediction of protein complexes which consist of proteins with high functional similarity. Specifically, 5737 protein complexes were predicted and 72.58% of them are enriched for at least one gene ontology (GO) function term. EE-MC is by design able to overcome intrinsic limitations of existing methodologies such as their inability to handle weighted PPI networks, their constraint to assign every protein to exactly one cluster, and the difficulties they face concerning parameter tuning.
This fact was experimentally validated and moreover, new potentially true human protein complexes were suggested as candidates for further validation using experimental techniques. Copyright © 2015 Elsevier B.V. All rights reserved.
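The clustering core that EE-MC builds on can be illustrated with plain Markov clustering (MCL) on a small weighted graph. This sketch reproduces only the basic expansion/inflation loop; the evolutionary parameter tuning and the "enhanced" variant of EE-MC are not reproduced here, and the tiny graph is a made-up example.

```python
# Minimal sketch of plain Markov clustering (MCL) on a weighted
# adjacency matrix: alternate matrix expansion (flow spreads) and
# elementwise inflation (strong flows win) until convergence.

import numpy as np

def mcl(adj, expansion=2, inflation=2.0, iters=50):
    m = adj + np.eye(len(adj))        # self-loops stabilise convergence
    m = m / m.sum(axis=0)             # column-stochastic transition matrix
    for _ in range(iters):
        m = np.linalg.matrix_power(m, expansion)  # expansion step
        m = m ** inflation                        # inflation step
        m = m / m.sum(axis=0)                     # renormalise columns
    # Rows retaining mass are attractors; their non-zero columns form clusters.
    clusters = set()
    for row in m:
        members = frozenset(np.nonzero(row > 1e-6)[0].tolist())
        if members:
            clusters.add(members)
    return clusters

# Two weighted triangles joined by one weak edge should split in two.
a = np.zeros((6, 6))
for i, j, w in [(0, 1, 1), (0, 2, 1), (1, 2, 1),
                (3, 4, 1), (3, 5, 1), (4, 5, 1), (2, 3, 0.1)]:
    a[i, j] = a[j, i] = w
print(mcl(a))
```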
Development of a probabilistic analysis methodology for structural reliability estimation
NASA Technical Reports Server (NTRS)
Torng, T. Y.; Wu, Y.-T.
1991-01-01
This paper presents a novel probabilistic analysis method for the assessment of structural reliability that combines fast convolution with an efficient structural reliability analysis. After identifying the most probable point of a limit state, the method establishes a quadratic performance function, transforms the quadratic function into a linear one, and applies fast convolution. The method is applicable to problems requiring computer-intensive structural analysis. Five illustrative examples of the method's application are given.
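The linearised case that such methods reduce to has a familiar closed form: for a linear performance function g = R − S with independent normal capacity R and load S, the reliability index and failure probability follow directly. This is a generic textbook sketch with illustrative values, not the paper's fast-convolution algorithm.

```python
# Sketch: reliability index and failure probability for the linear
# limit state g = R - S with independent normal R and S.

import math

def reliability_index(mu_r, sd_r, mu_s, sd_s):
    """beta = (mu_R - mu_S) / sqrt(sd_R^2 + sd_S^2)."""
    return (mu_r - mu_s) / math.hypot(sd_r, sd_s)

def failure_probability(beta):
    """P[g < 0] = Phi(-beta), via the error function."""
    return 0.5 * (1.0 - math.erf(beta / math.sqrt(2.0)))

# Hypothetical capacity/load statistics (units arbitrary):
beta = reliability_index(mu_r=500.0, sd_r=30.0, mu_s=350.0, sd_s=40.0)
print(beta, failure_probability(beta))
```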
Miften, Moyed; Olch, Arthur; Mihailidis, Dimitris; Moran, Jean; Pawlicki, Todd; Molineu, Andrea; Li, Harold; Wijesooriya, Krishni; Shi, Jie; Xia, Ping; Papanikolaou, Nikos; Low, Daniel A
2018-04-01
Patient-specific IMRT QA measurements are important components of processes designed to identify discrepancies between calculated and delivered radiation doses. Discrepancy tolerance limits are neither well defined nor consistently applied across centers. The AAPM TG-218 report provides a comprehensive review aimed at improving the understanding and consistency of these processes as well as recommendations for methodologies and tolerance limits in patient-specific IMRT QA. The performance of the dose difference/distance-to-agreement (DTA) and γ dose distribution comparison metrics is investigated. Measurement methods are reviewed and followed by a discussion of the pros and cons of each. Methodologies for absolute dose verification are discussed and new IMRT QA verification tools are presented. Literature on the expected or achievable agreement between measurements and calculations for different types of planning and delivery systems are reviewed and analyzed. Tests of vendor implementations of the γ verification algorithm employing benchmark cases are presented. Operational shortcomings that can reduce the γ tool accuracy and subsequent effectiveness for IMRT QA are described. Practical considerations including spatial resolution, normalization, dose threshold, and data interpretation are discussed. Published data on IMRT QA and the clinical experience of the group members are used to develop guidelines and recommendations on tolerance and action limits for IMRT QA. Steps to check failed IMRT QA plans are outlined. Recommendations on delivery methods, data interpretation, dose normalization, the use of γ analysis routines and choice of tolerance limits for IMRT QA are made with focus on detecting differences between calculated and measured doses via the use of robust analysis methods and an in-depth understanding of IMRT verification metrics.
The recommendations are intended to improve the IMRT QA process and establish consistent and comparable IMRT QA criteria among institutions. © 2018 American Association of Physicists in Medicine.
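The γ metric reviewed in the report can be sketched in one dimension: for each reference point, the combined dose-difference / distance-to-agreement quantity is minimised over the evaluated profile, and points with γ ≤ 1 pass. The profiles, grid spacing and the 3%/2 mm criterion below are illustrative assumptions, not TG-218 data.

```python
# Sketch of a 1-D global gamma comparison between a reference ("measured")
# and an evaluated ("calculated") dose profile.

import math

def gamma_1d(ref, eval_, spacing_mm, dose_tol=0.03, dta_mm=2.0):
    """Global gamma: doses normalised to the reference maximum; for each
    reference point take the minimum combined metric over all evaluated points."""
    norm = max(ref)
    gammas = []
    for i, r in enumerate(ref):
        best = float("inf")
        for j, e in enumerate(eval_):
            dd = (e - r) / (norm * dose_tol)          # dose-difference term
            dta = (j - i) * spacing_mm / dta_mm       # distance term
            best = min(best, math.hypot(dd, dta))
        gammas.append(best)
    return gammas

ref = [0.1, 0.5, 1.0, 0.5, 0.1]    # hypothetical measured profile
ev = [0.1, 0.52, 1.0, 0.48, 0.1]   # hypothetical calculated profile
g = gamma_1d(ref, ev, spacing_mm=1.0)
pass_rate = sum(x <= 1.0 for x in g) / len(g)
print(g, pass_rate)
```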
Low Temperature Creep of a Titanium Alloy Ti-6Al-2Cb-1Ta-0.8Mo
NASA Technical Reports Server (NTRS)
Chu, H. P.
1997-01-01
This paper presents a methodology for the analysis of low temperature creep of titanium alloys in order to establish design limitations due to the effect of creep. Creep data for the titanium alloy Ti-6Al-2Cb-1Ta-0.8Mo are used in the analysis. A creep equation is formulated to determine the allowable stresses so that creep at ambient temperatures can be kept within an acceptable limit during the service life of engineering structures or instruments. Microcreep, which is important to the design of precision instruments, is also discussed.
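The kind of design check such a creep equation enables can be sketched with a generic power-law model ε = A·σⁿ·tᵐ, inverted for the stress that just reaches an allowable strain at the end of the service life. The model form and all constants here are hypothetical placeholders, not the fitted Ti-6Al-2Cb-1Ta-0.8Mo values from the paper.

```python
# Sketch: allowable-stress check from a generic power-law creep equation.

def creep_strain(sigma, t_hours, A=1.0e-12, n=3.0, m=0.4):
    """Accumulated creep strain under constant stress sigma (MPa)."""
    return A * sigma ** n * t_hours ** m

def allowable_stress(eps_limit, t_hours, A=1.0e-12, n=3.0, m=0.4):
    """Invert the creep law for the stress reaching eps_limit at time t."""
    return (eps_limit / (A * t_hours ** m)) ** (1.0 / n)

service_life_h = 20.0 * 365.0 * 24.0   # hypothetical 20-year service life
sigma_allow = allowable_stress(1.0e-3, service_life_h)
print(sigma_allow, creep_strain(sigma_allow, service_life_h))
```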
Theory of mind: mechanisms, methods, and new directions
Byom, Lindsey J.; Mutlu, Bilge
2013-01-01
Theory of Mind (ToM) has received significant research attention. Traditional ToM research has provided important understanding of how humans reason about mental states by utilizing shared world knowledge, social cues, and the interpretation of actions; however, many current behavioral paradigms are limited to static, “third-person” protocols. Emerging experimental approaches such as cognitive simulation and simulated social interaction offer opportunities to investigate ToM in interactive, “first-person” and “second-person” scenarios while affording greater experimental control. The advantages and limitations of traditional and emerging ToM methodologies are discussed with the intent of advancing the understanding of ToM in socially mediated situations. PMID:23964218
Patorno, Elisabetta; Patrick, Amanda R; Garry, Elizabeth M; Schneeweiss, Sebastian; Gillet, Victoria G; Bartels, Dorothee B; Masso-Gonzalez, Elvira; Seeger, John D
2014-11-01
Recent years have witnessed a growing body of observational literature on the association between glucose-lowering treatments and cardiovascular disease. However, many of the studies are based on designs or analyses that inadequately address the methodological challenges involved. We reviewed recent observational literature on the association between glucose-lowering medications and cardiovascular outcomes and assessed the design and analysis methods used, with a focus on their ability to address specific methodological challenges. We describe and illustrate these methodological issues and their impact on observed associations, providing examples from the reviewed literature. We suggest approaches that may be employed to manage these methodological challenges. From the evaluation of 81 publications of observational investigations assessing the association between glucose-lowering treatments and cardiovascular outcomes, we identified the following methodological challenges: 1) handling of temporality in administrative databases; 2) handling of risks that vary with time and treatment duration; 3) definitions of the exposure risk window; 4) handling of exposures that change over time; and 5) handling of confounding by indication. Most of these methodological challenges may be suitably addressed through application of appropriate methods. Observational research plays an increasingly important role in the evaluation of the clinical effects of diabetes treatment. Implementation of appropriate research methods holds the promise of reducing the potential for spurious findings and the risk that the spurious findings will mislead the medical community about risks and benefits of diabetes medications.
A new hyperspectral image compression paradigm based on fusion
NASA Astrophysics Data System (ADS)
Guerra, Raúl; Melián, José; López, Sebastián.; Sarmiento, Roberto
2016-10-01
The on-board compression of remotely sensed hyperspectral images is an important task nowadays. One of the main difficulties is that the compression of these images must be performed in the satellite which carries the hyperspectral sensor. Hence, this process must be performed by space-qualified hardware, with area, power and speed limitations. Moreover, it is important to achieve high compression ratios without compromising the quality of the decompressed image. In this manuscript we propose a new methodology for compressing hyperspectral images based on hyperspectral image fusion concepts. The proposed compression process has two independent steps. The first is to spatially degrade the remotely sensed hyperspectral image to obtain a low-resolution hyperspectral image. The second is to spectrally degrade the remotely sensed hyperspectral image to obtain a high-resolution multispectral image. These two degraded images are then sent to the earth surface, where they must be fused using a fusion algorithm for hyperspectral and multispectral images in order to recover the remotely sensed hyperspectral image. The main advantage of the proposed methodology is that the compression process, which must be performed on board, becomes very simple, while the fusion process used to reconstruct the image is the more complex one. An extra advantage is that the compression ratio can be fixed in advance. Many simulations have been performed using different fusion algorithms and different methodologies for degrading the hyperspectral image. The results obtained corroborate the benefits of the proposed methodology.
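The two on-board degradation steps can be sketched with simple block averaging: spatial averaging yields the low-resolution hyperspectral cube, and band averaging yields the high-resolution multispectral cube; the fixed compression ratio follows from the chosen factors. Cube sizes and degradation factors below are arbitrary toy examples, not the paper's configuration.

```python
# Sketch of the two on-board degradation steps of the fusion-based scheme.

import numpy as np

def spatial_degrade(cube, f):
    """Average f x f spatial blocks; cube shape is (rows, cols, bands)."""
    r, c, b = cube.shape
    return cube.reshape(r // f, f, c // f, f, b).mean(axis=(1, 3))

def spectral_degrade(cube, f):
    """Average groups of f adjacent bands into multispectral bands."""
    r, c, b = cube.shape
    return cube.reshape(r, c, b // f, f).mean(axis=3)

cube = np.random.rand(8, 8, 16)           # toy hyperspectral cube
low_spatial = spatial_degrade(cube, 4)    # (2, 2, 16) downlinked cube
low_spectral = spectral_degrade(cube, 8)  # (8, 8, 2) downlinked cube
ratio = cube.size / (low_spatial.size + low_spectral.size)
print(low_spatial.shape, low_spectral.shape, ratio)  # ratio fixed in advance
```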
Monge Pereira, E; Molina Rueda, F; Alguacil Diego, I M; Cano de la Cuerda, R; de Mauro, A; Miangolarra Page, J C
2014-01-01
Children and adolescents with cerebral palsy have significant limitations in performing functional activities. The use of virtual reality systems is a new treatment approach that reinforces task-oriented motor learning. The purpose of this guide is to study the impact of virtual reality systems on the improvement and acquisition of functional skills, and to evaluate the scientific evidence in order to determine the strength of recommendation of such interventions. All available full-text articles, regardless of their methodology, were included. The following databases were consulted: PubMed (Medline), PEDro, EMBASE (OVID-Elsevier), Cochrane Library, Medline (OVID), CINAHL, and ISI Web of Knowledge. Methodological quality, the level of scientific evidence, and the strength of recommendations were assessed using the Critical Review Form - Quantitative Studies, the Guidelines for Critical Review Form - Quantitative Studies, and the U.S. Preventive Services Task Force criteria. Thirteen articles, enrolling a total of 97 participants, were included. Significant improvements were obtained in outcome measures assessing postural control and balance, upper limb function, selective joint control, and gait. The guide has some limitations: the limited number of patients enrolled, the clinical diversity and age range of participants, and the methodological quality of the existing trials. Virtual reality is a promising tool in the treatment of children with cerebral palsy. There is strong scientific evidence supporting an acceptable recommendation for the use of virtual reality systems in the treatment of cerebral palsy. Copyright © 2011 Sociedad Española de Neurología. Published by Elsevier Espana. All rights reserved.
NASA Astrophysics Data System (ADS)
Fraser, R.; Coulaud, M.; Aeschlimann, V.; Lemay, J.; Deschenes, C.
2016-11-01
With the growing proportion of intermittent energy sources such as wind and solar, hydroelectricity is becoming a first-class source of peaking power for regularizing the grid. The resulting increase in start-stop cycles may cause premature ageing of runners, both through a higher number of stress-fluctuation cycles and through higher absolute stress levels. Aiming to support high-quality development on fully homologous scale-model turbines, the Hydraulic Machines Laboratory (LAMH) of Laval University has developed a methodology to operate model-size turbines under transient regimes such as start-up, stop, or load rejection on its test stand. This methodology maintains a constant head while the wicket gates open or close at a model-scale speed representative of what is done on the prototype. This paper first presents the model opening speed derived from dimensionless numbers, then the methodology itself and its application. Finally, its limitations and the first results obtained with a bulb turbine are detailed.
Geng, Hua; Todd, Naomi M; Devlin-Mullin, Aine; Poologasundarampillai, Gowsihan; Kim, Taek Bo; Madi, Kamel; Cartmell, Sarah; Mitchell, Christopher A; Jones, Julian R; Lee, Peter D
2016-06-01
A correlative imaging methodology was developed to accurately quantify bone formation in the complex lattice structure of additive manufactured implants. Micro computed tomography (μCT) and histomorphometry were combined, integrating the best features of both while revealing the limitations of each imaging modality. This semi-automatic methodology registered each modality using a coarse-graining technique to speed the registration of 2D histology sections to high-resolution 3D μCT datasets. Once registered, qualitative and quantitative histomorphometric bone descriptors were directly correlated to 3D quantitative bone descriptors, such as bone ingrowth and bone contact. The correlative imaging allowed the significant volumetric shrinkage of histology sections to be quantified for the first time (~15%). The technique also demonstrated the importance of the location of the histological section, showing that an offset of up to 30% can be introduced. The results were used to quantitatively demonstrate the effectiveness of 3D-printed titanium lattice implants.
Anywaine, Zacchaeus; Abaasa, Andrew; Levin, Jonathan; Kasirye, Ronnie; Kamali, Anatoli; Grosskurth, Heiner; Munderi, Paula; Nunn, Andrew
2015-07-01
Cotrimoxazole (CTX) prophylaxis is recommended by the World Health Organisation for HIV-infected persons. However, once HIV-infected patients have commenced antiretroviral treatment (ART) in resource-limited settings, the benefits of continued CTX prophylaxis are not known. The few studies that investigated the safety of discontinuing CTX prophylaxis in these settings had limitations due to their design. COSTOP is a randomised double-blind placebo-controlled non-inferiority trial among HIV-infected Ugandan adults stabilised on ART. Participants with a CD4 count of 250 or more cells/mm³ are randomised to two arms: the intervention arm, in which CTX is discontinued, and the control arm, in which CTX prophylaxis is continued. The study aims to assess whether the intervention regimen is non-inferior to the control regimen with respect to the incidence of pre-defined CTX-preventable events, and superior with respect to the incidence of haematological adverse events. Studies that previously evaluated the safety of discontinuing CTX prophylaxis among HIV-infected adults in resource-limited settings have provided moderate- to low-quality evidence, owing in part to methodological limitations. COSTOP is designed and conducted with sufficient rigour to answer this question. The results of the trial will assist in guiding policy recommendations. This paper describes the design and methodological considerations important for the conduct of CTX cessation studies. Copyright © 2015. Published by Elsevier Inc.
Methodological considerations for documenting the energy demand of dance activity: a review
Beck, Sarah; Redding, Emma; Wyon, Matthew A.
2015-01-01
Previous research has explored the intensity of dance class, rehearsal, and performance, and has attempted to document the body's physiological adaptation to these activities. Dance activity is frequently described as complex, diverse, non-steady state, intermittent, and of moderate to high intensity, with notable differences between training and performance intensities and durations. Many limitations are noted in the methodologies of previous studies, creating barriers to consensus. The present study therefore examines the previous body of literature and, in doing so, seeks to highlight important methodological considerations that would strengthen the knowledge base of future research in this area. Four recommendations are made for future research. Firstly, research should continue to be dance-genre specific, with detailed accounts of the technical and stylistic elements of the movement vocabulary examined given wherever possible. Secondly, a greater breadth of performance repertoire, within and between genres, needs to be closely examined. Thirdly, a greater focus on threshold measurements is recommended because of the documented complex interplay between aerobic and anaerobic energy systems. Lastly, it is important for research to begin to combine temporal data on work and rest periods with real-time measurement of metabolic data in work and rest, in order to quantify demand more accurately. PMID:25999885
McLinden, Taylor; Sargeant, Jan M; Thomas, M Kate; Papadopoulos, Andrew; Fazil, Aamir
2014-09-01
Nontyphoidal Salmonella spp. are one of the most common causes of bacterial foodborne illness. Variability in cost inventories and study methodologies limits the possibility of meaningfully interpreting and comparing cost-of-illness (COI) estimates, reducing their usefulness. However, little is known about the relative effect these factors have on a cost-of-illness estimate. This is important for comparing existing estimates and when designing new cost-of-illness studies. Cost-of-illness estimates, identified through a scoping review, were used to investigate the association of descriptive, component-cost, methodological, and foodborne illness-related factors (such as chronic sequelae and under-reporting) with the cost of nontyphoidal Salmonella spp. illness. The standardized cost of nontyphoidal Salmonella spp. illness from 30 estimates reported in 29 studies ranged from $0.01568 to $41.22 United States dollars (USD)/person/year (2012). The mean cost of nontyphoidal Salmonella spp. illness was $10.37 USD/person/year (2012). The following factors were significant in multiple linear regression (p≤0.05): the number of direct component cost categories included in an estimate (0-4, particularly long-term care costs) and chronic sequelae costs (inclusion/exclusion), both of which had positive associations with the cost of nontyphoidal Salmonella spp. illness. Factors related to study methodology were not significant. Our findings indicate that study methodology may not be as influential as other factors, such as the number of direct component cost categories included in an estimate and the costs incurred due to chronic sequelae. These may therefore be the most important factors to consider when designing, interpreting, and comparing cost-of-foodborne-illness studies.
Caregiver-Provided Physical Therapy Home Programs for Children with Motor Delay: A Scoping Review.
Gorgon, Edward James R
2018-06-01
Caregiver-provided physical therapy home programs (PTHP) play an important role in enhancing motor outcomes in pediatric patient populations. This scoping review systematically mapped clinical trials of caregiver-provided PTHP aimed at enhancing motor outcomes in children who have, or who are at risk for, motor delay, with the purpose of (1) describing trial characteristics; (2) assessing methodological quality; and (3) examining the reporting of caregiver-related components. The Physiotherapy Evidence Database (PEDro), Cochrane CENTRAL, PubMed, Scopus, ScienceDirect, ProQuest Central, CINAHL, LILACS, and OTseeker were searched up to July 31, 2017. Two reviewers independently assessed study eligibility. Randomized or quasi-randomized controlled trials of PTHP administered by parents, other family members, friends, or informal caregivers to children who had or who were at risk for motor delay were included. Two reviewers independently appraised trial quality on the PEDro scale and extracted data. Twenty-four articles representing 17 individual trials were identified. The populations and interventions investigated were heterogeneous. Most of the trials had important research design limitations and methodological issues that could limit their usefulness in ascertaining the effectiveness of caregiver-provided PTHP. Few trials (4 of 17) indicated involvement of caregivers in PTHP planning, assessed how the caregivers learned from the training or instructions provided, or both. Included studies were heterogeneous, and unpublished data were excluded. Although caregiver-provided PTHP are important in addressing motor outcomes in this population, there is a lack of evidence at the level of clinical trials to guide practice. More research is urgently needed to determine the effectiveness of caregiver-provided PTHP. Future studies should address the many important issues identified in this scoping review to improve the usefulness of trial results.
Fracture mechanics approach to estimate rail wear limits
DOT National Transportation Integrated Search
2009-10-01
This paper describes a systematic methodology to estimate allowable limits for rail head wear in terms of vertical head-height loss, gage-face side wear, and/or the combination of the two. This methodology is based on the principles of engineering fr...
Some consideration for evaluation of structural integrity of aging aircraft
NASA Astrophysics Data System (ADS)
Terada, Hiroyuki; Asada, Hiroo
The objective of this paper is to examine the achievements and limitations of the state of the art in damage tolerant design methodology, and the issues that must be solved for further improvement. The topics discussed are: the basic concept of full-scale fatigue testing, fracture mechanics applications, repair of detected damage, inspection technology and the determination of inspection intervals, reliability assessment for practical application, and the importance of various kinds of data acquisition.
Aging and social expenditures in Italy: some issues associated with population projections.
Terra Abrami, V
1990-01-01
"After describing the main results of the recent Italian population projections, and some possible consequences...aging may have on social expenditures, this paper focuses on attempts to improve the accuracy of development assumptions, with special regard to natural components. Emphasis is placed on the importance of applying specific methodological tools to define self-explanatory assumptions for fertility and mortality and to produce projections which could be considered, with reasonable limitations, as real forecasts." excerpt
Methodological guidelines for developing accident modification functions.
Elvik, Rune
2015-07-01
This paper proposes methodological guidelines for developing accident modification functions. An accident modification function is a mathematical function describing systematic variation in the effects of road safety measures. The paper describes ten guidelines and gives an example of how to use them. The importance of exploratory analysis and of an iterative approach in developing accident modification functions is stressed. The example shows that strict compliance with all the guidelines may be difficult, but they represent a level of stringency that should be striven for. Currently the main limitations in developing accident modification functions are the small number of good evaluation studies and the often huge variation in estimates of effect. It is therefore still not possible to develop accident modification functions for very many road safety measures. Copyright © 2015 Elsevier Ltd. All rights reserved.
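As a hedged illustration of what an accident modification function can look like, the sketch below fits a log-linear function to hypothetical effect estimates by ordinary least squares. The functional form, the dose variable, and every data point are invented for demonstration; Elvik's guidelines concern the development process itself, not this particular model.

```python
import numpy as np

# Illustrative only (not Elvik's procedure): fit AMF(x) = exp(b0 + b1*x),
# where x is the "dose" of a safety measure, by least squares on the log
# of effect estimates reported by individual (fictional) studies.
dose = np.array([0.5, 1.0, 2.0, 3.0, 4.0])        # e.g. amount of the measure
effect = np.array([0.95, 0.88, 0.80, 0.70, 0.64])  # accident modification factors

X = np.column_stack([np.ones_like(dose), dose])
b, *_ = np.linalg.lstsq(X, np.log(effect), rcond=None)

def amf(x):
    """Predicted accident modification factor at dose x."""
    return np.exp(b[0] + b[1] * x)

print(np.round(b, 3), float(amf(2.5)))
```

In practice the guidelines' emphasis on exploratory analysis would mean trying several functional forms and weighting each study's estimate by its precision before settling on one.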
Psychoanalysis and homosexuality: do we need a new theory?
Auchincloss, E L; Vaughan, S C
2001-01-01
No need exists, it is argued, for a new psychoanalytic theory of homosexuality. Certainly psychoanalysis should not be expected to generate such a theory using its own methodology alone. The preoccupation with producing such a theory avoids more important questions about psychoanalytic theory building raised by an examination of the long relationship between psychoanalysis and homosexuality. These questions concern the problems related to using psychoanalytic methodology (1) to construct categories (including the categories normal and abnormal), (2) to construct causal theory (the problems include the limitations of psychoanalytic developmental theory and a long-standing confusion between psychoanalytic developmental theory, psychoanalytic genetic reconstruction, and psychodynamics), and (3) to identify "bedrock." Finally, the question is addressed of what might be needed that is new in the psychoanalytic approach to homosexuality.
Trost, Barry M; Frontier, Alison J; Thiel, Oliver R; Yang, Hanbiao; Dong, Guangbin
2011-08-22
Bryostatins, a family of structurally complicated macrolides, exhibit an exceptional range of biological activities. The limited availability and structural complexity of these molecules make the development of an efficient total synthesis particularly important. This article describes our initial efforts towards the total synthesis of bryostatins, in which chemoselective and atom-economical methods for the stereoselective assembly of the ring C subunit were developed. A Pd-catalyzed tandem alkyne-alkyne coupling/6-endo-dig cyclization sequence was explored and successfully applied in the synthesis of a dihydropyran ring system. Elaboration of this methodology ultimately led to a concise synthesis of the ring C subunit of bryostatins. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Natural Language Processing Methods and Systems for Biomedical Ontology Learning
Liu, Kaihong; Hogan, William R.; Crowley, Rebecca S.
2010-01-01
While the biomedical informatics community widely acknowledges the utility of domain ontologies, there remain many barriers to their effective use. One important requirement of domain ontologies is that they must achieve a high degree of coverage of the domain concepts and concept relationships. However, the development of these ontologies is typically a manual, time-consuming, and often error-prone process. Limited resources result in missing concepts and relationships as well as difficulty in updating the ontology as knowledge changes. Methodologies developed in the fields of natural language processing, information extraction, information retrieval and machine learning provide techniques for automating the enrichment of an ontology from free-text documents. In this article, we review existing methodologies and developed systems, and discuss how existing methods can benefit the development of biomedical ontologies. PMID:20647054
Forbes, A; Wainwright, S P
2001-09-01
The integration of survey data with psycho-social theories is an important and emerging theme within the field of health inequalities research. This paper critically examines this approach, arguing that the models of health inequality these approaches promote, and the related concepts of 'social cohesion' and 'social capital', suffer from serious methodological, theoretical and philosophical flaws. The critique draws particular attention to the limitations of survey-derived data and the dangers of using such data to develop complex social explanations for health inequalities. The paper discusses the wider epistemological issues which emerge from the critique, addressing the fundamental but neglected question of 'what is inequality?'. The paper concludes by introducing a structure for questions regarding health inequalities, emphasising the need for those questions to be attached to real communities.
Kroumova, Vesselina; Gobbato, Elisa; Basso, Elisa; Mucedola, Luca; Giani, Tommaso; Fortina, Giacomo
2011-08-15
Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) has recently been demonstrated to be a powerful tool for the rapid identification of bacteria from growing colonies. In order to speed up identification, several authors have evaluated the usefulness of MALDI-TOF MS technology for the direct and rapid identification of bacteria from positive blood cultures. The results obtained so far have been encouraging but have also shown some limitations, mainly related to bacterial growth and to the presence of interfering substances from the blood cultures. In this paper, we present a new methodological approach developed to overcome these limitations, based mainly on enrichment of the sample in a growth medium before the extraction process, prior to mass spectrometric analysis. The proposed method shows important advantages for the identification of bacterial strains, yielding an increased identification score and thus higher confidence in the results. Copyright © 2011 John Wiley & Sons, Ltd.
Tracer Flux Balance at an Urban Canyon Intersection
NASA Astrophysics Data System (ADS)
Carpentieri, Matteo; Robins, Alan G.
2010-05-01
Despite their importance for pollutant dispersion in urban areas, the special features of dispersion at street intersections are rarely taken into account by operational air quality models. Several previous studies have demonstrated the complex flow patterns that occur at street intersections, even with simple geometry. This study presents results from wind-tunnel experiments on a reduced scale model of a complex but realistic urban intersection, located in central London. Tracer concentration measurements were used to derive three-dimensional maps of the concentration field within the intersection. In combination with a previous study (Carpentieri et al., Boundary-Layer Meteorol 133:277-296, 2009) where the velocity field was measured in the same model, a methodology for the calculation of the mean tracer flux balance at the intersection was developed and applied. The calculation highlighted several limitations of current state-of-the-art canyon dispersion models, arising mainly from the complex geometry of the intersection. Despite its limitations, the proposed methodology could be further developed in order to derive, assess and implement street intersection dispersion models for complex urban areas.
Pagán, Josué; Risco-Martín, José L; Moya, José M; Ayala, José L
2016-08-01
Prediction of symptomatic crises in chronic diseases makes it possible to act before symptoms occur, for example by taking drugs to avoid the symptoms or by activating medical alarms. The prediction horizon is in this case an important parameter, as it must accommodate the pharmacokinetics of medications or the response time of medical services. This paper presents a study of the prediction limits of a chronic disease with symptomatic crises: migraine. For that purpose, this work develops a methodology to build predictive migraine models and to improve these predictions beyond the limits of the initial models. The maximum prediction horizon is analyzed, and its dependency on the selected features is studied. A strategy for model selection is proposed to tackle the trade-off between conservative but robust predictive models and less accurate predictions with longer horizons. The results show a prediction horizon close to 40 min, which is in the time range of the drug pharmacokinetics. Experiments were performed in a realistic scenario in which input data were acquired in an ambulatory clinical study through the deployment of a non-intrusive Wireless Body Sensor Network. Our results provide an effective methodology for selecting the prediction horizon in the development of prediction algorithms for diseases experiencing symptomatic crises. Copyright © 2016 Elsevier Inc. All rights reserved.
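The horizon-selection trade-off described above can be shown with a toy sketch. This is not the paper's ambulatory-data methodology: the synthetic signal, the linear autoregressive predictor, and the error tolerance are all assumptions, used only to illustrate keeping the largest horizon whose prediction error remains acceptable.

```python
import numpy as np

# Hypothetical illustration of horizon selection (not the paper's actual
# migraine models): for each candidate horizon h, fit a simple linear
# autoregressive predictor and keep the largest h whose mean absolute
# error stays under a tolerance.  Signal, lag count and tolerance are made up.
rng = np.random.default_rng(0)
t = np.arange(600)
signal = np.sin(2 * np.pi * t / 100) + 0.02 * rng.standard_normal(t.size)

def ar_error(x, h, p=5):
    """Mean absolute error of a p-lag linear predictor at horizon h."""
    m = len(x) - p - h                       # number of training rows
    X = np.column_stack([np.ones(m)] + [x[k:m + k] for k in range(p)])
    y = x[p + h:]                            # target shifted h steps ahead
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(np.mean(np.abs(X @ w - y)))

tolerance = 0.1
passing = [h for h in (5, 10, 20, 40, 80) if ar_error(signal, h) < tolerance]
best = max(passing) if passing else None     # largest acceptable horizon
print(best, [round(ar_error(signal, h), 3) for h in (5, 10, 20, 40, 80)])
```

A more conservative model would lower the tolerance, shrinking the acceptable horizon; the paper's strategy navigates exactly this trade-off between robustness and horizon length.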
Lee, Andy H; Zhou, Xu; Kang, Deying; Luo, Yanan; Liu, Jiali; Sun, Xin
2018-01-01
Objective: To assess risk of bias and to investigate methodological issues concerning the design, conduct and analysis of randomised controlled trials (RCTs) testing acupuncture for knee osteoarthritis (KOA). Methods: PubMed, EMBASE, the Cochrane Central Register of Controlled Trials and four major Chinese databases were searched for RCTs that investigated the effect of acupuncture for KOA. The Cochrane tool was used to examine the risk of bias of eligible RCTs. Their methodological details were examined using a standardised and pilot-tested questionnaire of 48 items, together with the association between four predefined factors and important methodological quality indicators. Results: A total of 248 RCTs were eligible, of which 39 (15.7%) used a computer-generated randomisation sequence. Of the 31 (12.5%) trials that stated the allocation concealment, only one used central randomisation. Twenty-five (10.1%) trials mentioned that their acupuncture procedures were standardised, but only 18 (7.3%) specified how the standardisation was achieved. The great majority of trials (n=233, 94%) stated that blinding was in place, but 204 (87.6%) did not clarify who was blinded. Only 27 (10.9%) trials specified the primary outcome, of which 7 used intention-to-treat analysis. Only 17 (6.9%) trials included details on sample size calculation; none preplanned an interim analysis and associated stopping rule. In total, 46 (18.5%) trials explicitly stated that loss to follow-up occurred, but only 6 (2.4%) provided some information on how the issue was handled. No trials prespecified, conducted or reported any subgroup or adjusted analysis for the primary outcome. Conclusion: The overall risk of bias was high among published RCTs testing acupuncture for KOA. Methodological limitations were present in many important aspects of design, conduct and analyses. These findings inform the development of evidence-based methodological guidance for future trials assessing the effect of acupuncture for KOA.
PMID:29511016
Bhatti, Shammi; Jha, Gopaljee
2010-11-01
Apple (Malus domestica Borkh.), a widely cultivated fruit crop of economic, nutritive, and medicinal importance, has emerged as a model horticultural crop in the post-genomic era. Apple cultivation is heavily dependent on climatic conditions, and the crop is susceptible to several diseases caused by fungi, bacteria, viruses, and insect pests. Extensive research has been carried out to standardize tissue culture protocols and utilize them in apple improvement. We review the in vitro shoot multiplication, rooting, transformation and regeneration methodologies in apple and tabulate various such protocols for easy reference. The utility and limitations of transgenesis in apple improvement are also summarized. The concepts of marker-free plants, the use of non-antibiotic-resistance selectable markers, and cisgenic and intragenic approaches are highlighted. Furthermore, the limitations, current trends and future prospects of tissue culture-mediated biotechnological interventions in apple improvement are discussed.
Pathophysiology of hypertension in obese children: a systematic review.
Wirix, A J G; Kaspers, P J; Nauta, J; Chinapaw, M J M; Kist-van Holthe, J E
2015-10-01
Hypertension is increasingly common in overweight and obese children. The mechanisms behind the development of hypertension in obesity are complex, and evidence is limited. In order to effectively treat obese children for hypertension, it is important to have a deeper understanding of the pathophysiology of hypertension in obese children. The present review summarizes the main factors associated with hypertension in obese children and discusses their potential role in its pathophysiology. Systematic searches were conducted in PubMed and EMBASE for articles published up to October 2014. In total, 60 relevant studies were included. The methodological quality of the included studies ranged from weak to strong. Several factors important in the development of hypertension in obese children have been suggested, including endocrine determinants, such as corticosteroids and adipokines, sympathetic nervous system activity, disturbed sodium homeostasis, as well as oxidative stress, inflammation and endothelial dysfunction. Understanding the pathophysiology of hypertension in overweight and obese children is important and could have implications for its screening and treatment. Based on solely cross-sectional observational studies, it is impossible to infer causality. Longitudinal studies of high methodological quality are needed to gain more insight into the complex mechanisms behind the development of hypertension in obese children. © 2015 World Obesity.
76 FR 23825 - Study Methodologies for Diagnostics in the Postmarket Setting; Public Workshop
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-28
... community on issues related to the studies and methodological approaches examining diagnostics in the... discuss a large number of methodological concerns at the workshop, including, but not limited to the...
ERIC Educational Resources Information Center
Misra, Anjali; Schloss, Patrick J.
1989-01-01
The critical analysis of 23 studies using respondent techniques for the reduction of excessive emotional reactions in school children focuses on research design, dependent variables, independent variables, component analysis, and demonstrations of generalization and maintenance. Results indicate widespread methodological flaws that limit the…
Social Competence in Late Elementary School: Relationships to Parenting and Neighborhood Context
ERIC Educational Resources Information Center
Caughy, Margaret O'Brien; Franzini, Luisa; Windle, Michael; Dittus, Patricia; Cuccaro, Paula; Elliott, Marc N.; Schuster, Mark A.
2012-01-01
Despite evidence that neighborhoods confer both risk and resilience for youth development, the existing neighborhood research has a number of methodological limitations including lack of diversity in neighborhoods sampled and neighborhood characteristics assessed. The purpose of this study was to address these methodological limitations of…
Evidence on public policy: methodological issues, political issues and examples.
Attanasio, Orazio P
2014-03-01
In this paper I discuss how evidence on public policy is generated, and in particular the evaluation of public policies. In economics, the issue of attribution and the identification of causal links has recently received considerable attention. Important methodological issues have been tackled and new techniques have been proposed and used. Randomized controlled trials have become something of a gold standard. However, they are not exempt from problems and have important limitations: in some cases they cannot be constructed and, more generally, problems of external validity and transferability of results can be important. The paper then moves on to discuss the political economy of policy evaluations. For policy evaluations to have an impact on the conduct of actual policy, it is important that the demand for evaluation comes directly from the policy-making process and is generated endogenously within it. In this sense, it is important that the institutional design of policy making incentivizes policy-making institutions to use rigorous evaluation in the process of designing policies and allocating resources to alternative options. Economists are currently involved in the design and evaluation of many policies, including policies on health, nutrition and education. The role they can play in these fields is not completely obvious. The paper argues that their main contribution is in modelling how individuals react to incentives (including those provided by public policies).
Faces and fitness: attractive evolutionary relationship or ugly hypothesis?
Smoliga, James M; Zavorsky, Gerald S
2015-11-01
In recent years, various studies have attempted to understand human evolution by examining relationships between athletic performance or physical fitness and facial attractiveness. Across five homogeneous groups (n = 327), there is approximately 3% shared variance between facial attractiveness and athletic performance or physical fitness (95% CI = 0.5-8%, p = 0.002). Further, studies relating human performance and attractiveness often have major methodological shortcomings that limit their generalizability. Thus, despite statistical significance, the association between facial attractiveness and human performance has questionable biological importance. Here, we present a critique of these studies and provide recommendations to improve the quality of future research in this realm. © 2015 The Author(s).
Katz, Matthew HG; Marsh, Robert; Herman, Joseph M.; Shi, Qian; Collison, Eric; Venook, Alan; Kindler, Hedy; Alberts, Steven; Philip, Philip; Lowy, Andrew M.; Pisters, Peter; Posner, Mitchell; Berlin, Jordan; Ahmad, Syed A.
2017-01-01
Methodological limitations of prior studies have prevented progress in the treatment of patients with borderline resectable pancreatic adenocarcinoma. Shortcomings have included the absence of staging and treatment standards and pre-existing biases with regard to the use of neoadjuvant therapy and the role of vascular resection at pancreatectomy. In this manuscript, we will review limitations of studies of borderline resectable PDAC reported to date, highlight important controversies related to this disease stage, emphasize the research infrastructure necessary for its future study, and present a recently-approved Intergroup pilot study (Alliance A0201101) that will provide a foundation upon which subsequent well-designed clinical trials can be performed. PMID:23435609
NASA Technical Reports Server (NTRS)
Koch, S. E.; Skillman, W. C.; Kocin, P. J.; Wetzel, P. J.; Brill, K. F.
1985-01-01
The synoptic scale performance characteristics of MASS 2.0 are determined by comparing filtered 12-24 hr model forecasts to same-case forecasts made by the National Meteorological Center's synoptic-scale Limited-area Fine Mesh model. Characteristics of the two systems are contrasted, and the analysis methodology used to determine statistical skill scores and systematic errors is described. The overall relative performance of the two models in the sample is documented, and important systematic errors uncovered are presented.
Pediatric microdose and microtracer studies using 14C in Europe.
Turner, M A; Mooij, M G; Vaes, W H J; Windhorst, A D; Hendrikse, N H; Knibbe, C A J; Kõrgvee, L T; Maruszak, W; Grynkiewicz, G; Garner, R C; Tibboel, D; Park, B K; de Wildt, S N
2015-09-01
Important information gaps remain on the efficacy and safety of drugs in children. Pediatric drug development encounters several ethical, practical, and scientific challenges. One barrier to the evaluation of medicines for children is a lack of innovative methodologies that have been adapted to the needs of children. This article presents our successful experience of pediatric microdose and microtracer studies using 14C-labeled probes in Europe to illustrate the strengths and limitations of these approaches. © 2015 ASCPT.
Intelligent System Development Using a Rough Sets Methodology
NASA Technical Reports Server (NTRS)
Anderson, Gray T.; Shelton, Robert O.
1997-01-01
The purpose of this research was to examine the potential of the rough sets technique for developing intelligent models of complex systems from limited information. Rough sets are a simple but promising technology for extracting easily understood rules from data. The rough set methodology has been shown to perform well when used with a large set of exemplars, but its performance with sparse data sets is less certain. The difficulty is that rules will be developed from just a few examples, each of which might have a large amount of noise associated with it. The question then becomes: what is the probability of a useful rule being developed from such limited information? One nice feature of rough sets is that in unusual situations the technique can give an answer of 'I don't know'. That is, if a case arises that differs from the cases on which the rough set rules were developed, the methodology can recognize this and alert human operators. It can also be trained to do this when the desired action is unknown because conflicting examples apply to the same set of inputs. This summer's project was to combine rough set theory with statistical theory to develop confidence limits for rules developed by rough sets. Often it is important not to make a certain type of mistake (e.g., false positives or false negatives), so the rules must be biased toward preventing a catastrophic error rather than giving the most likely course of action. A method to determine the best course of action in the light of such constraints was examined. The resulting technique was tested with files containing electrical power line 'signatures' from the space shuttle and with decompression sickness data.
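The 'I don't know' behavior described above can be sketched in a few lines; this is a minimal illustration of the idea, not the actual implementation from the research, and the example attribute data are invented.

```python
# Minimal sketch of rough-set-style rules: answer only when the training
# examples matching a case agree, and say "I don't know" for unseen or
# conflicting cases.
from collections import defaultdict

def learn_rules(examples):
    """Map each attribute tuple to the set of labels seen for it."""
    table = defaultdict(set)
    for attrs, label in examples:
        table[attrs].add(label)
    return table

def classify(table, attrs):
    labels = table.get(attrs)
    if labels is None or len(labels) > 1:
        return "I don't know"   # unseen case, or conflicting examples
    return next(iter(labels))

train = [(("high", "hot"), "fault"),
         (("high", "cold"), "ok"),
         (("low", "hot"), "fault"),
         (("low", "hot"), "ok")]          # conflicting pair of examples
rules = learn_rules(train)
print(classify(rules, ("high", "hot")))   # fault
print(classify(rules, ("low", "hot")))    # I don't know (conflict)
print(classify(rules, ("mid", "hot")))    # I don't know (unseen case)
```

Biasing the rules against catastrophic errors, as the project describes, would amount to also returning "I don't know" whenever the statistical confidence in a rule (learned from few, noisy examples) falls below a threshold.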
Morell, Jonathan A
2018-06-01
This article argues that evaluators could better deal with unintended consequences if they improved their methods of systematically combining empirical data collection and model building over the life cycle of an evaluation. This process would be helpful because it can increase the timespan from when the need for a change in methodology is first suspected to the time when the new element of the methodology is operational. The article begins with an explanation of why logic models are so important in evaluation, and why the utility of models is limited if they are not continually revised based on empirical evaluation data. It sets the argument within the larger context of the value and limitations of models in the scientific enterprise. This is followed by a discussion of various issues relevant to model development and revision. What is the relevance of complex system behavior for understanding predictable and unpredictable unintended consequences, and the methods needed to deal with them? How might understanding of unintended consequences be improved with an appreciation of generic patterns of change that are independent of any particular program or change effort? What are the social and organizational dynamics that make it rational and adaptive to design programs around single-outcome solutions to multi-dimensional problems? How does cognitive bias affect our ability to identify likely program outcomes? Why is it hard to discern change as a result of programs being embedded in multi-component, continually fluctuating settings? The last part of the paper outlines a process for actualizing systematic iteration between model and methodology, and concludes with a set of research questions that speak to how the model/data process can be made efficient and effective. Copyright © 2017 Elsevier Ltd. All rights reserved.
Geurtzen, Rosa; Hogeveen, Marije; Rajani, Anand K; Chitkara, Ritu; Antonius, Timothy; van Heijst, Arno; Draaisma, Jos; Halamek, Louis P
2014-06-01
Prenatal counseling at the threshold of viability is a challenging yet critically important activity, and care guidelines differ across cultures. Studying how this task is performed in the actual clinical environment is extremely difficult. In this pilot study, we used simulation as a methodology with 2 aims as follows: first, to explore the use of simulation incorporating a standardized pregnant patient as an investigative methodology and, second, to determine similarities and differences in content and style of prenatal counseling between American and Dutch neonatologists. We compared counseling practice between 11 American and 11 Dutch neonatologists, using a simulation-based investigative methodology. All subjects performed prenatal counseling with a simulated pregnant patient carrying a fetus at the limits of viability. The following elements of scenario design were standardized across all scenarios: layout of the physical environment, details of the maternal and fetal histories, questions and responses of the standardized pregnant patient, and the time allowed for consultation. American subjects typically presented several treatment options without bias, whereas Dutch subjects were more likely to explicitly advise a specific course of treatment (emphasis on partial life support). American subjects offered comfort care more frequently than the Dutch subjects and also discussed options for maximal life support more often than their Dutch colleagues. Simulation is a useful research methodology for studying activities difficult to assess in the actual clinical environment such as prenatal counseling at the limits of viability. Dutch subjects were more directive in their approach than their American counterparts, offering fewer options for care and advocating for less invasive interventions. American subjects were more likely to offer a wider range of therapeutic options without providing a recommendation for any specific option.
A Methodology for Assessing the Seismic Vulnerability of Highway Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cirianni, Francis; Leonardi, Giovanni; Scopelliti, Francesco
2008-07-08
Modern society depends entirely on a complex and articulated infrastructure network of vital importance for the existence of the urban settlements scattered across the territory. These infrastructure systems, usually referred to as lifelines, are entrusted with numerous services and functions indispensable to normal urban and human activity. Lifeline systems represent an essential element in all urbanised areas subject to seismic risk. It is important that, in these zones, they are planned according to appropriate criteria based on two fundamental assumptions: a) determination of the best territorial localization, avoiding, within limits, the places of higher danger; b) application of constructive technologies aimed at reducing vulnerability. It is therefore indispensable that any modern process of seismic risk assessment give due consideration to the study of networks, integrated with the traditional analyses of buildings. The present paper moves in this direction, dedicating particular attention to one kind of lifeline, the highway system, and proposing a methodology of analysis aimed at assessing the seismic vulnerability of the system.
Wind tunnel measurements of pollutant turbulent fluxes in urban intersections
NASA Astrophysics Data System (ADS)
Carpentieri, Matteo; Hayden, Paul; Robins, Alan G.
2012-01-01
Wind tunnel experiments have been carried out at the EnFlo laboratory to measure mean and turbulent tracer fluxes in geometries of real street canyon intersections. The work was part of the major DAPPLE project, focussing on the area surrounding the intersection between Marylebone Road and Gloucester Place in Central London, UK. Understanding flow and dispersion in urban streets is a very important issue for air quality management and planning, and turbulent mass exchange processes are important phenomena that are very often neglected in urban modelling studies. The adopted methodology involved the combined use of laser Doppler anemometry and tracer concentration measurements. This methodology was applied to quantify the mean and turbulent flow and dispersion fields within several street canyon intersections. Vertical profiles of turbulent tracer flux were also measured. The technique, despite a number of limitations, proved reliable and allowed tracer balance calculations to be undertaken in the selected street canyon intersections. The experience gained in this work will enable much more precise studies in the future as issues affecting the accuracy of the experimental technique have been identified and resolved.
Fabrication of Compact Superconducting Lowpass Filters for Ultrasensitive Detectors
NASA Technical Reports Server (NTRS)
Brown, Ari; Chervenak, James; Chuss, David; Mikula, Vilem; Ray, Christopher; Rostem, Karwan; U-Yen, Kongpop; Wassell, Edward; Wollack, Edward
2012-01-01
It is extremely important for current and future far-infrared and sub-millimeter ultrasensitive detectors, which include transition edge sensors (TES) and microwave kinetic inductance detectors, to be adequately filtered from stray electromagnetic radiation in order to achieve their optimal performance. One means of filtering stray radiation is to block leakage associated with electrical connections in the detector environment. Here we discuss a fabrication methodology for realizing non-dissipative planar filters embedded in the wall of the detector enclosure to limit wave propagation modes up to far-infrared frequencies. Our methodology consists of fabricating a boxed stripline transmission line, in which a superconducting (Nb, Mo, or Al) transmission line is encased in a silicon dioxide dielectric insulator coated with a metallic shell. We report on the achieved attenuation and return loss and find that they replicate the simulated data to a high degree.
NASA Astrophysics Data System (ADS)
Chen, Daniel; Chen, Damian; Yen, Ray; Cheng, Mingjen; Lan, Andy; Ghaskadvi, Rajesh
2008-11-01
Identifying hotspots--structures that limit the lithography process window--becomes increasingly important as the industry relies heavily on RET to print sub-wavelength designs. KLA-Tencor's patented Process Window Qualification (PWQ) methodology has been used for this purpose in various fabs. The PWQ methodology has three key advantages: (a) the PWQ layout, for the best sensitivity; (b) design-based binning, for pattern repeater analysis; and (c) intelligent sampling, for the best DOI sampling rate. This paper evaluates two different analysis strategies for SEM review sampling successfully deployed at Inotera Memories, Inc. We propose a new approach combining location repeaters and pattern repeaters. Based on a recent case study, the new sampling flow reduces data analysis and sampling time from 6 hours to 1.5 hours while maintaining the maximum DOI sample rate.
78 FR 5810 - AHRQ Standing Workgroup for Quality Indicator Measure Specification
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-28
... AHRQ Quality Indicators (QIs), their technical specifications, and associated methodological issues.... The time- limited workgroup is more restricted to specific clinical or methodological issues, while..., data enhancements, and methodological advances. The standing workgroup may potentially provide guidance...
78 FR 22883 - AHRQ Standing Workgroup for Quality Indicator Measure Specification
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-17
... Quality Indicators (QIs), their technical specifications, and associated methodological issues. The...-limited workgroup is more restricted to specific clinical or methodological issues, while the standing... enhancements, and methodological advances. The standing workgroup may potentially provide guidance for the...
[Pharmacological Treatment of Apathy in Parkinson's Disease: a Systematic Review of the Literature].
Holguín Lew, Jorge Carlos; Caamaño Jaraba, Jessica; Gómez Alzate, Alejandra; Hidalgo López, Catalina; Marino Mondragón, Daniel Felipe; Restrepo Moreno, Sebastián; Rico Abella, Liz Evelin
2017-10-01
Apathy, defined as a deficit in initiating and maintaining action, is a symptom affecting patients with diverse psychiatric and neuropsychiatric diseases, including dementia, sequelae of traumatic brain injury, schizophrenia, depression, and Parkinson's disease (PD). Apathy negatively affects the function and quality of life of PD patients, and it is an important cause of caregiver distress. The pharmacological treatment of apathy in PD is the focus of this systematic review. A comprehensive search and systematic selection were performed across different databases for original research papers on the treatment of apathy in PD. The results were then consolidated, and a critical analysis of the research papers was made. The results are discussed according to the methodological standards for systematic reviews of the literature. A total of 11 studies were included. Although some studies showed efficacy, all of them had important methodological limitations that hampered the interpretation of results. The results of the examined studies cannot be considered evidence for guiding clinical decisions. So far, no evidence-based recommendations can be offered for the treatment of apathy in PD. More studies with better methodological quality are needed. It is a potentially fruitful area for research and one badly needed by both PD patients and their caregivers. Copyright © 2017 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.
New methodologies for multi-scale time-variant reliability analysis of complex lifeline networks
NASA Astrophysics Data System (ADS)
Kurtz, Nolan Scot
The cost of maintaining existing civil infrastructure is enormous. Since the livelihood of the public depends on such infrastructure, its state must be managed appropriately using quantitative approaches. Practitioners must consider not only which components are most fragile to hazards, e.g. seismicity, storm surge, or hurricane winds, but also how they participate at the network level, using network analysis. Focusing on particularly damaged components does not necessarily increase network functionality, which is what matters most to the people who depend on such infrastructure. Several network analyses, e.g. S-RDA, LP-bounds, and crude-MCS, and performance metrics, e.g. disconnection bounds and component importance, are available for such purposes. Because these networks already exist, their state over time is also important. If networks are close to chloride sources, deterioration may be a major issue. Information from field inspections may also have large impacts on quantitative models. To address such issues, hazard risk analysis methodologies for deteriorating networks subjected to seismicity, i.e. earthquakes, have been developed analytically. A bridge component model has been constructed for these methodologies. The bridge fragilities, which were constructed from data, required a deeper level of analysis as these were relevant for specific structures. Furthermore, the network effects of chloride-induced deterioration were investigated. Depending on how mathematical models incorporate new information, many approaches are available, such as Bayesian model updating. To make such procedures more flexible, an adaptive importance sampling scheme was created for structural reliability problems. This method handles many kinds of system and component problems with singular or multiple important regions of the limit state function. These and previously developed analysis methodologies were found to be strongly sensitive to network size.
Special network topologies may be more or less computationally difficult, and the resolution of the network also has large effects. To take advantage of certain topologies, network hierarchical structures with super-link representation have been used in the literature to increase computational efficiency by analyzing smaller, densely connected networks; however, such structures were based on user input and were at times subjective. To address this, the algorithms must be automated and reliable. These hierarchical structures may reveal the structure of the network itself. This risk analysis methodology has been extended to larger networks using such automated hierarchical structures. Component importance is the most important output of such network analysis; however, it may only indicate which bridges to inspect or repair earliest and little else. High correlations influence component importance measures in a negative manner. Additionally, a regional approach is not appropriately modelled. To take a more regional view, group importance measures based on hierarchical structures have been created. Such structures may also be used to create regional inspection/repair approaches. Using these analytical, quantitative risk approaches, the next generation of decision makers can make both component-based and regional optimal decisions using information from network function as well as the effects of infrastructure deterioration.
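The crude Monte Carlo simulation (crude-MCS) analysis mentioned above can be sketched on a toy network; the graph, failure probabilities, and down-count importance measure below are illustrative assumptions, not the dissertation's models.

```python
# Hedged sketch of crude-MCS network reliability: estimate the probability
# that a source and sink stay connected when links (e.g. bridges) fail
# independently, and count how often each link is down in failure samples
# as a crude component importance indicator.
import random
from collections import deque

def connected(nodes, up_edges, s, t):
    """Breadth-first search over the surviving (undirected) edges."""
    adj = {n: [] for n in nodes}
    for a, b in up_edges:
        adj[a].append(b)
        adj[b].append(a)
    seen, queue = {s}, deque([s])
    while queue:
        n = queue.popleft()
        for m in adj[n]:
            if m not in seen:
                seen.add(m)
                queue.append(m)
    return t in seen

def mcs_reliability(nodes, edges, p_fail, s, t, n_samples=20000, seed=1):
    rng = random.Random(seed)
    failures = 0
    down_count = {e: 0 for e in edges}   # times each edge was down in a failure
    for _ in range(n_samples):
        up = [e for e in edges if rng.random() >= p_fail[e]]
        if not connected(nodes, up, s, t):
            failures += 1
            for e in edges:
                if e not in up:
                    down_count[e] += 1
    return failures / n_samples, down_count

nodes = ["A", "B", "C", "D"]
edges = [("A", "B"), ("B", "D"), ("A", "C"), ("C", "D")]
p_fail = {e: 0.1 for e in edges}
pf, importance = mcs_reliability(nodes, edges, p_fail, "A", "D")
print(pf)  # near the analytical value (1 - 0.9**2)**2 ≈ 0.036
```

Crude MCS needs many samples when failure probabilities are small, which is exactly why the dissertation turns to adaptive importance sampling and hierarchical decompositions for larger networks.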
NASA Astrophysics Data System (ADS)
Nogueira, Juan Manuel; Romero, David; Espadas, Javier; Molina, Arturo
2013-02-01
With the emergence of new enterprise models, such as technology-based enterprises, and the large quantity of information generated through technological advances, the Zachman framework continues to represent a modelling tool of great utility and value to construct an enterprise architecture (EA) that can integrate and align the IT infrastructure and business goals. Nevertheless, implementing an EA requires considerable effort within an enterprise. Small technology-based enterprises and start-ups can take advantage of EAs and frameworks but, because these enterprises have limited resources to allocate for this task, an enterprise framework implementation is not feasible in most cases. This article proposes a new methodology based on action-research for the implementation of the business, system and technology models of the Zachman framework to assist and facilitate its implementation. Following the explanation of the cycles of the proposed methodology, a case study is presented to illustrate the results of implementing the Zachman framework in a technology-based enterprise, PyME CREATIVA, using an action-research approach.
The use of soft system methodology (SSM) in a serviced-focussed study on the personal tutor's role.
Por, Jitna
2008-09-01
Soft system methodology (SSM) is described as a system-based methodology for tackling real-world problems. SSM may be used as a means of articulating complex social processes in a particular way. SSM allows people's viewpoints and assumptions about the world to be brought to light, challenged and tested. This paper reports on the use of SSM in a service-focussed study (SFS) to explore the role of the personal tutor in nurse education. Checkland [Systems Thinking, Systems Practice. John Wiley and Sons, Chichester, 1981] highlighted the importance of considering cultural, social and political systems in the analysis. The seven stages of SSM are discussed in relation to the SFS and some of the findings are expressed through a 'Rich Picture'. SSM encourages commitment, brings diverse interests together and opens up the organizational culture. It also enables feasible and desirable changes to be recommended within the context of limited resources and competing demands on lecturers' time. SSM was an appropriate systematic model for this study and could be potentially useful in nurse education research.
NASA Astrophysics Data System (ADS)
Moomaw, Ronald L.
According to its abstract, this book attempts ‘an assessment of various water conservation measures aimed at reducing residential water usage.’ Its intent is to develop a research program whose ‘ultimate goal is to engender a conservation ethic among water users and managers and develop a predictable array of conservation methodologies. …’ Professor Flack indeed has presented an excellent assessment of conservation methodologies, but I believe that the proposed research program is too limited. Following a brief introductory chapter, chapter II presents an extensive review of the water conservation literature published in the 1970s and earlier. It and chapter III, which describes Flack's systematic comparison of the technical, economic, and political aspects of each conservation methodology, are the heart of the book. Chapter IV is a brief discussion and analysis of conservation programs (with examples) that a water utility might adopt. Chapter V is essentially a pilot study of methods of assessing political and social feasibility. Finally, a set of recommendations is presented in chapter VI. All in all, this book is a nice blend of literature review and original research that deals with an important issue.
Sharma, Prashant; Das, Reena
2016-03-26
Cation-exchange high-performance liquid chromatography (CE-HPLC) is a widely used laboratory test to detect variant hemoglobins as well as to quantify hemoglobins F and A2 for the diagnosis of thalassemia syndromes. Its versatility, speed, reproducibility and convenience have made CE-HPLC the method of choice for the initial screening of hemoglobin disorders. Despite its popularity, several methodological aspects of the technology remain obscure to pathologists, and this may have consequences in specific situations. This paper discusses the basic principles of the technique, the initial quality control steps and the interpretation of the various controls and variables available on the instrument output. Subsequent sections are devoted to methodological considerations that arise during the reporting of cases. For instance, the common problems of misidentified peaks, totals crossing 100%, causes of the total area being above or below acceptable limits, and the importance of pre-integration region peaks are dealt with. Ultimately, CE-HPLC remains an investigation whose reporting combines in-depth knowledge of the biological basics with more than a working knowledge of the technological aspects of the technique.
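The report checks mentioned above (totals crossing 100%, total area outside acceptable limits) can be sketched as a simple validation routine; the thresholds, peak names, and values below are illustrative assumptions, not instrument specifications or real patient data.

```python
# Illustrative sketch of sanity checks on a CE-HPLC-style report: flag runs
# whose peak percentages do not total ~100% or whose summed peak area falls
# outside assumed acceptance limits. All numbers here are invented.
def check_run(peaks, area_low=1_000_000, area_high=3_000_000):
    """peaks: dict mapping peak name -> (percent, area). Returns flag list."""
    total_pct = sum(pct for pct, _ in peaks.values())
    total_area = sum(area for _, area in peaks.values())
    flags = []
    if not 99.0 <= total_pct <= 101.0:
        flags.append(f"total {total_pct:.1f}% outside 99-101%")
    if not area_low <= total_area <= area_high:
        flags.append(f"total area {total_area} outside limits")
    return flags

run = {"HbA0": (85.0, 1_700_000), "HbA2": (2.8, 56_000), "HbF": (0.6, 12_000),
       "P2": (4.5, 90_000), "P3": (5.2, 104_000)}
print(check_run(run))  # ['total 98.1% outside 99-101%']
```

A total well below 100% like this would prompt a look for unintegrated or pre-integration region peaks, one of the reporting considerations the paper discusses.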
Yuen, Hon K; Austin, Sarah L
2014-01-01
We describe the methodological quality of recent studies on instrument development and testing published in the American Journal of Occupational Therapy (AJOT). We conducted a systematic review using the COnsensus-based Standards for the selection of health status Measurement INstruments (COSMIN) checklist to appraise 48 articles on measurement properties of assessments for adults published in AJOT between 2009 and 2013. Most studies had adequate methodological quality in design and statistical analysis. Common methodological limitations included methods for examining internal consistency that were not consistently linked to the theoretical constructs underpinning the assessments; participants in some test-retest reliability studies who were not stable during the interim period; and inadequate sample sizes in several studies of reliability and convergent validity. AJOT's dissemination of psychometric research evidence has made important contributions to moving the profession toward the American Occupational Therapy Association's Centennial Vision. This study's results provide a benchmark by which to evaluate future accomplishments. Copyright © 2014 by the American Occupational Therapy Association, Inc.
Oligomerization of G protein-coupled receptors: computational methods.
Selent, J; Kaczor, A A
2011-01-01
Recent research has unveiled the complexity of mechanisms involved in G protein-coupled receptor (GPCR) functioning, in which receptor dimerization/oligomerization may play an important role. Although the first high-resolution X-ray structure for a likely functional chemokine receptor dimer has been deposited in the Protein Data Bank, the interactions and mechanisms of dimer formation are not yet fully understood. In this respect, computational methods play a key role in predicting accurate GPCR complexes. This review outlines computational approaches focusing on sequence- and structure-based methodologies and discusses their advantages and limitations. Sequence-based approaches that search for possible protein-protein interfaces in GPCR complexes have been applied with success in several studies, but did not always yield consistent results. Structure-based methodologies are a potent complement to sequence-based approaches. For instance, protein-protein docking is a valuable method, especially when guided by experimental constraints. Some disadvantages, such as limited receptor flexibility and non-consideration of the membrane environment, have to be taken into account. Molecular dynamics simulation can overcome these drawbacks, giving a detailed description of conformational changes in a native-like membrane. Successful prediction of GPCR complexes using computational approaches combined with experimental efforts may help to clarify the role of dimeric/oligomeric GPCR complexes in fine-tuning receptor signaling. Moreover, since such GPCR complexes have attracted interest as potential drug targets for diverse diseases, unveiling the molecular determinants of dimerization/oligomerization can provide important implications for drug discovery.
Bradley, Catherine C; Boan, Andrea D; Cohen, Amy P; Charles, Jane M; Carpenter, Laura A
2016-01-01
Previous research on developmental regression in youth with autism spectrum disorders (ASD) has often been limited by the definition, assessment, and methodology used to evaluate and describe regression. This study sought to overcome these limitations by examining the prevalence, timing, and correlates of documented cases of developmental regression in a large, epidemiological sample of youth with ASD. Utilizing a population-based surveillance methodology, this study includes 862 youth with ASD identified through abstraction and clinician record review. Approximately 21% of the sample had developmental regression documented in their medical or educational records, with a mean age of regression of 24.2 ± 14.3 months. Youth with ASD and a history of regression were more likely to have comorbid intellectual disability, a prior community diagnosis of ASD, and eligibility for educational services as a student with autism. Youth with a documented history of regression also had higher rates of restricted, repetitive behaviors, such as stereotyped speech, nonfunctional routines/rituals, and sensory interests. The results suggest that youth with a history of regression are not only more likely to have comorbid intellectual disability but also more likely to have been previously diagnosed with ASD in the community, suggesting that developmental regression may play an important role in identifying children who are at risk for ASD and need evaluation. Higher rates of restricted, repetitive behaviors in youth with a documented history of regression may also provide important insights into the relationship between ASD and developmental regression.
Theory and practice in health communication campaigns: a critical interrogation.
Dutta-Bergman, Mohan J
2005-01-01
In recent reviews of the body of work on health campaigns, communication scholars discussed the importance of reflective thinking about the capacity of campaigns to effect change; this reflective thinking is especially important given the increasing gaps in society between the health rich and the health poor and the increasing marginalization of the poorer sections of society. This article critically reviews 3 central theories of health communication campaigns that represent the dominant cognitive approach: the theory of reasoned action, the health belief model, and the extended parallel process model. After articulating the limitations of these theoretical approaches, the article summarizes new directions in theory, methodology, and application of health communication campaigns targeting marginalized populations.
Destounis, Stamatia; Arieno, Andrea; Morgan, Renee; Roberts, Christina; Chan, Ariane
2017-01-01
Mammographic breast density (MBD) has been proven to be an important risk factor for breast cancer and an important determinant of mammographic screening performance. The measurement of density has changed dramatically since its inception. The initial qualitative measurement methods were found to have limited consistency between readers and limited association with breast cancer risk. Following the introduction of full-field digital mammography, more sophisticated measurement methodology is now possible. Automated computer-based density measurements can provide consistent, reproducible, and objective results. In this review paper, we describe the various methods currently available to assess MBD, and provide a discussion on the clinical utility of such methods for breast cancer screening. PMID:28561776
Equity and health policy in Africa: using concept mapping in Moore (Burkina Faso).
Ridde, Valéry
2008-04-22
This methodological article is based on a health policy research project conducted in Burkina Faso (West Africa). Concept mapping (CM) was used as a research method to understand the local views of equity among stakeholders, who were concerned by the health policy under consideration. While this technique has been used in North America and elsewhere, to our knowledge it has not yet been applied in Africa in any vernacular language. Its application raises many issues and certain methodological limitations. Our objective in this article is to present its use in this particular context, and to share a number of methodological observations on the subject. Two CMs were done among two different groups of local stakeholders following four steps: generating ideas, structuring the ideas, computing maps using multidimensional scaling and cluster analysis methods, and interpreting maps. Fifteen nurses were invited to take part in the study, all of whom had undergone training on health policies. Of these, nine nurses (60%) ultimately attended the two-day meeting, conducted in French. Of 45 members of village health committees who attended training on health policies, only eight were literate in the local language (Moore). Seven of these (88%) came to the meeting. The local perception of equity seems close to the egalitarian model. The actors are not ready to compromise social stability and peace for the benefit of the worst-off. The discussion on the methodological limitations of CM raises the limitations of asking a single question in Moore and the challenge of translating a concept as complex as equity. While the translation of equity into Moore undoubtedly oriented the discussions toward social relations, we believe that, in the context of this study, the open-ended question concerning social justice has a threefold relevance. At the same time, those limitations were transformed into strengths. 
It was essential to resort to a focus group approach to explore in depth a complex subject such as equity, which became, after the two CMs, one of the important topics of the research. Using this technique in a new context was not easy. Nevertheless, contrary to what local organizers thought when we explained this "crazy" idea of applying the technique in Moore with peasant farmers, we believe we have shown that it was feasible, even with persons not literate in French.
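The map-computation step described above (multidimensional scaling followed by cluster analysis of participants' sortings) can be sketched in a few lines. A minimal illustration using classical MDS in NumPy; the six statements and the co-sorting counts are invented for illustration, and a simple sign split stands in for the hierarchical cluster analysis typically used in CM:

```python
import numpy as np

# Hypothetical similarity matrix: how many of 9 participants sorted each
# pair of 6 statements into the same pile (invented data for illustration).
S = np.array([[9, 7, 6, 1, 0, 1],
              [7, 9, 5, 2, 1, 0],
              [6, 5, 9, 1, 2, 1],
              [1, 2, 1, 9, 6, 7],
              [0, 1, 2, 6, 9, 8],
              [1, 0, 1, 7, 8, 9]], dtype=float)

# Classical MDS on the dissimilarities derived from the sortings.
D = 1.0 - S / S.max()                   # similarity -> dissimilarity
n = D.shape[0]
J = np.eye(n) - np.ones((n, n)) / n     # centering matrix
B = -0.5 * J @ (D ** 2) @ J             # double-centered Gram matrix
vals, vecs = np.linalg.eigh(B)
top = np.argsort(vals)[::-1][:2]        # two largest eigenvalues
coords = vecs[:, top] * np.sqrt(np.maximum(vals[top], 0))  # 2-D point map

# Group statements into clusters; here a sign split on the first MDS axis
# stands in for a proper hierarchical cluster analysis.
clusters = (coords[:, 0] > 0).astype(int)
```

With the block structure in the invented data, the first MDS axis separates the two groups of statements, which is exactly the kind of structure interpreted as thematic clusters on a concept map.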
Power system market implementation in a deregulated environment
NASA Astrophysics Data System (ADS)
Silva, Carlos
2000-10-01
The opening of power system markets (also known as deregulation) gives rise to issues never before seen by this industry. One of the most important is the control of information about the cost of generation. Information that used to be common knowledge is now kept private by the new agents of the system (generator companies, distribution companies, etc.). Data such as the generator cost functions are now known only to the owning companies. The result is a new system consisting of a group of independent firms, each seeking to maximize its own profit. There have been many proposals to organize the new market in an economically efficient manner. Nevertheless, the uniqueness of the electric power system has prevented the development of such a market. This thesis evaluates the most common proposals using simulations in an auction setting. In addition, a new methodology is proposed, based on mechanism design, a technique common in economics, that solves some of the practical problems of power system markets (such as the management of limited transmission capacity). In this methodology, when each company acts in its own best interest, the outcome is efficient in spite of the information problem cited above. The new methodology, along with the existing methodologies, is tested using simulation and analyzed to create a clear comparison of benefits and disadvantages.
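The abstract does not spell out the mechanism; as a hedged illustration of the core mechanism-design idea it invokes (achieving efficiency even though costs are private information), here is the simplest instance, a second-price procurement auction. The generator names and offers are invented, and the thesis's actual mechanism also handles transmission limits, which this sketch ignores:

```python
# Minimal sketch of an incentive-compatible procurement auction.
# With the second-price payment rule, bidding one's true cost is a
# dominant strategy, so the cheapest generator wins even though each
# firm's cost function is private.

def vickrey_procurement(bids):
    """Select the cheapest generator; pay it the second-lowest offer."""
    ranked = sorted(bids, key=bids.get)      # generators by ascending offer
    winner, runner_up = ranked[0], ranked[1]
    return winner, bids[runner_up]

# Invented $/MWh offers from three hypothetical generating companies.
bids = {"GenCo A": 42.0, "GenCo B": 38.5, "GenCo C": 47.0}
winner, payment = vickrey_procurement(bids)
# The lowest bidder wins but is paid the second-lowest offer, so it has
# no incentive to overstate its cost.
```

Real power markets complicate this picture considerably (multiple units, network constraints, repeated play), which is precisely the gap the thesis's methodology targets.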
Social interaction in management group meetings: a case study of Finnish hospital.
Laapotti, Tomi; Mikkola, Leena
2016-06-20
Purpose - The purpose of this paper is to understand the role of management group meetings (MGMs) in a hospital organization by examining the social interaction in these meetings. Design/methodology/approach - This case study approaches social interaction from a structuration point of view. Social network analysis and qualitative content analysis are applied. Findings - The findings show that MGMs are mainly forums for information sharing. Meetings are not held for problem solving or decision making, and operational coordinating is limited. Meeting interaction is very much focused on the chair, and most of the discussion takes place between the chair and one other member, not between members. The organizational structures are maintained and reproduced in the meeting interaction, and they appear to limit discussion. Meetings appear to fulfil their goals as a part of the organization's information structure and, to some extent, as an instrument for management. The significance of the relational side of MGMs was recognized. Research limitations/implications - The results of this study provide a basis for future research on hospital MGMs with wider datasets and other methodologies. The relational role of MGMs in particular needs more attention. Practical implications - The goals of MGMs should be reviewed, and MG members should be made aware of meeting interaction structures. Originality/value - The paper provides new knowledge about interaction networks in hospital MGMs and describes the complex role that MGMs play in hospitals.
Reinke, Evelyn; Supriyatiningsih; Haier, Jörg
2017-01-01
Background In 2015, the period proposed for achieving the United Nations Millennium Development Goals (MDGs), which targeted a worldwide reduction in maternal mortality of ~75%, came to an end. 99% of maternal deaths occur in developing and emerging countries, but reports mostly rely on incomplete or unrepresentative data. Using Indonesia as an example, currently available data sets for maternal mortality were systematically reviewed. Methods Besides analysis of international and national data resources, a systematic review was carried out according to Cochrane methodology to identify all data and assessments regarding maternal mortality. Results Overall, primary data on maternal mortality differed significantly and were hardly comparable. For 1990, results varied between 253/100 000 and 446/100 000. In 2013, the data appeared more conclusive (140–199/100 000). An annual reduction rate (ARR) of –2.8% can be calculated. Conclusion The reported data quality for maternal mortality in Indonesia is very limited with regard to comprehensive availability and methodology. This limitation appears to be of general importance for the countries targeted by the MDGs. Primary data are rare, not uniformly obtained, and not evaluated by comparable methods, resulting in very limited comparability. Continuous small data set registration should have high priority for the analysis of maternal health activities. PMID:28400953
van der Meersch, Amélie; Dechartres, Agnès; Ravaud, Philippe
2011-01-01
Background Generic drugs are used by millions of patients for economic reasons, so their evaluation must be highly transparent. Objective To assess the quality of reporting of bioequivalence trials comparing generic to brand-name drugs. Methodology/Principal Findings PubMed was searched for reports of bioequivalence trials comparing generic to brand-name drugs between January 2005 and December 2008. Articles were included if the aim of the study was to assess the bioequivalence of generic and brand-name drugs. We excluded case studies, pharmaco-economic evaluations, and validation dosage assays of drugs. We evaluated whether important information about funding, methodology, location of trials, and participants was reported. We also assessed whether the criteria required by the Food and Drug Administration (FDA) and the European Medicines Agency (EMA) to conclude bioequivalence were reported and whether the conclusions were in agreement with the results. We identified 134 potentially relevant articles but eliminated 55 because the brand-name or generic drug status of the reference drug was unknown. Thus, we evaluated 79 articles. The funding source and location of the trial were reported in 41% and 56% of articles, respectively. The type of statistical analysis was reported in 94% of articles, but the methods to generate the randomization sequence and to conceal allocation were reported in only 15% and 5%, respectively. In total, 65 articles of single-dose trials (89%) concluded bioequivalence. Of these, 20 (31%) did not report the 3 criteria within the limits required by the FDA and 11 (17%) did not report the 2 criteria within the limits required by the EMA. Conclusions/Significance Important information needed to judge the validity and relevance of results is frequently missing in published reports of trials assessing generic drugs. The quality of reporting of such trials is in need of improvement. PMID:21858184
Hand, Rosa K; Perzynski, Adam T
2016-09-01
Retrospective self-reported data have limitations, making it important to evaluate alternative forms of measurement for nutrition behaviors. Ecological momentary assessment (EMA) attempts to overcome the challenges of recalled data with real-time data collection in a subject's natural environment, often leveraging technology. This perspective piece 1) introduces the concepts and terminology of EMA, 2) provides an overview of the methodological and analytical considerations, 3) gives examples of past research using EMA, and 4) suggests new opportunities (including combining assessment and intervention) and limitations (including the need for technology) for the application of EMA to research and practice regarding nutrition behaviors. Copyright © 2016 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.
Fracture Toughness to Understand Stretch-Flangeability and Edge Cracking Resistance in AHSS
NASA Astrophysics Data System (ADS)
Casellas, Daniel; Lara, Antoni; Frómeta, David; Gutiérrez, David; Molas, Sílvia; Pérez, Lluís; Rehrl, Johannes; Suppan, Clemens
2017-01-01
Edge fracture is considered a high risk for automotive parts, especially parts made of advanced high strength steels (AHSS). The limited ductility of AHSS makes them more sensitive to edge damage. Traditional approaches, such as those based on ductility measurements or forming limit diagrams, are unable to predict this type of fracture. Thus, stretch-flangeability has become an important formability parameter in addition to tensile and formability properties. The damage induced in the sheared edges of AHSS parts affects stretch-flangeability, because the generated microcracks propagate from the edge. Accordingly, a fracture mechanics approach may be followed to characterize the crack propagation resistance. With this aim, this work addresses the applicability of fracture toughness as a tool to understand crack-related problems, such as stretch-flangeability and edge cracking, in different AHSS grades. Fracture toughness was determined by following the essential work of fracture methodology, and stretch-flangeability was characterized by means of hole expansion tests. Results show a good correlation between stretch-flangeability and fracture toughness. This allows postulating fracture toughness, measured by the essential work of fracture methodology, as a key material property to rationalize crack propagation phenomena in AHSS.
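The essential work of fracture (EWF) methodology mentioned above rests on a linear partition of the total work of fracture. In standard textbook notation (a general summary, not values from this study):

```latex
% Specific total work of fracture w_f for a specimen with ligament length
% \ell and thickness t, split into an essential (surface-proportional) term
% and a plastic (volume-proportional) term:
w_f = \frac{W_f}{\ell t} = w_e + \beta w_p \, \ell
% Testing specimens at several ligament lengths and extrapolating the
% linear fit to \ell \to 0 isolates the essential work w_e, the quantity
% used as a fracture-toughness measure in this methodology.
```

Here $W_f$ is the measured work to fracture, $w_e$ the essential work per unit ligament area, $w_p$ the plastic work density, and $\beta$ a plastic-zone shape factor.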
Mixed-methods research in pharmacy practice: basics and beyond (part 1).
Hadi, Muhammad Abdul; Alldred, David Phillip; Closs, S José; Briggs, Michelle
2013-10-01
This is the first of two papers which explore the use of mixed-methods research in pharmacy practice. In an era of evidence-based medicine and policy, high-quality research evidence is essential for the development of effective pharmacist-led services. Over the past decade, the use of mixed-methods research has become increasingly common in healthcare, although to date its use has been relatively limited in pharmacy practice research. In this article, the basic concepts of mixed-methods research, including its definition, typologies and advantages in relation to pharmacy practice research, are discussed. Mixed-methods research brings together qualitative and quantitative methodologies within a single study to answer or understand a research problem. There are a number of mixed-methods designs available, but the selection of an appropriate design must always be dictated by the research question. Importantly, mixed-methods research should not be seen as a 'tool' to collect qualitative and quantitative data; rather, there should be some degree of 'integration' between the two data sets. If conducted appropriately, mixed-methods research has the potential to generate quality research evidence by combining strengths and overcoming the respective limitations of qualitative and quantitative methodologies. © 2012 Royal Pharmaceutical Society.
Franconi, Giovanna; Schröder, Sven; Marchetti, Paolo; Robinson, Nicola
2013-01-01
Chemotherapy-induced peripheral neuropathy (CIPN) is a common side effect that can be very disabling and can limit or delay the dose of chemotherapy that can be administered. Acupuncture may be effective for treating peripheral neuropathy. The aim of this study was to review the available literature on the use of acupuncture for CIPN. The systematic literature search was performed using MEDLINE, Google Scholar, Cochrane Database, CINAHL, and ISI Proceedings. Hand searching was conducted, and consensus was reached on all extracted data. Only papers in the English language were included, irrespective of study design. From 3989 retrieved papers, 8 relevant papers were identified. One was an experimental study which showed that electroacupuncture suppressed CIPN pain in rats. In addition, there were 7 very heterogeneous clinical studies: 1 randomized controlled study using auricular acupuncture, 2 randomized controlled studies using somatic acupuncture, and 3 case series/case reports which suggested a positive effect of acupuncture in CIPN. Conclusions. Only one randomized controlled study demonstrated that acupuncture may be beneficial for CIPN. All the clinical studies reviewed had important methodological limitations. Further studies with robust methodology are needed to demonstrate the role of acupuncture in treating CIPN resulting from cancer treatment. PMID:23983788
Landewé, Robert B M; Smolen, Josef S; Weinblatt, Michael E; Emery, Paul; Dougados, Maxime; Fleischmann, Roy; Aletaha, Daniel; Kavanaugh, Arthur; van der Heijde, Désirée
2014-10-01
Investigator-initiated trials, some of which have been referred to as comparative effectiveness trials, pragmatic trials, or strategy trials, are sometimes considered to be of greater clinical importance than industry-driven trials, because they address important but unresolved clinical questions that differ from the questions asked in industry-driven trials. Regulatory authorities have provided methodological guidance for industry-driven trials for the approval of new treatments, but such guidance is less clear for investigator-initiated trials. The European League Against Rheumatism (EULAR) task force for the update of the recommendations for the management of rheumatoid arthritis has critically looked at the methodological quality and conduct of many investigator-initiated trials, and has identified a number of concerns. In this Viewpoint paper, we highlight commonly encountered issues that are discussed using examples of well-known investigator-initiated trials. These issues cover three themes: (1) design choice (superiority vs non-inferiority designs); (2) statistical power and (3) convenience reporting. Since we acknowledge the importance of investigator-initiated research, we also propose a shortlist of points-to-consider when designing, performing and reporting investigator-initiated trials. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
A methodology for stochastic analysis of share prices as Markov chains with finite states.
Mettle, Felix Okoe; Quaye, Enoch Nii Boi; Laryea, Ravenhill Adjetey
2014-01-01
Price volatility makes stock investments risky, leaving investors in a critical position when decisions must be made under uncertainty. To improve investors' confidence in evaluating exchange markets without using time series methodology, we specify equity price changes as a stochastic process assumed to possess Markov dependency, with state transition probability matrices defined over the identified state space (i.e. decrease, stable or increase). We established that the identified states communicate, and that the chains are aperiodic and ergodic, thus possessing limiting distributions. We developed a methodology for determining the expected mean return time for stock price increases, and also established criteria for improving investment decisions based on the highest transition probabilities, lowest mean return time and highest limiting distributions. We further developed an R algorithm for running the methodology introduced. The established methodology is applied to selected equities from Ghana Stock Exchange weekly trading data.
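The core of the methodology can be sketched in a few lines; a minimal NumPy illustration with invented weekly closing prices (a sketch of the idea, not the authors' R algorithm):

```python
import numpy as np

# Hypothetical weekly closing prices for one equity (invented data).
prices = [10.0, 10.2, 10.2, 10.1, 10.4, 10.4, 10.6, 10.5, 10.5, 10.8]
# Classify each weekly change into a state: 0 = decrease, 1 = stable, 2 = increase.
states = [0 if b < a else (1 if b == a else 2)
          for a, b in zip(prices, prices[1:])]

# Estimate the transition probability matrix by counting state transitions.
P = np.zeros((3, 3))
for s, t in zip(states, states[1:]):
    P[s, t] += 1
P = P / P.sum(axis=1, keepdims=True)

# Limiting (stationary) distribution: left eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()

# Expected mean return (recurrence) time for each state is 1 / pi_i;
# the paper's decision criteria compare these across equities.
mean_return = 1.0 / pi
```

For an ergodic chain, the stationary distribution exists and the mean recurrence time of state $i$ is $1/\pi_i$, which is the quantity the authors rank equities by.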
Methodological issues associated with clinical trials in epilepsy.
Ferlazzo, Edoardo; Sueri, Chiara; Gasparini, Sara; Russo, Emilio; Cianci, Vittoria; Ascoli, Michele; De Sarro, Giovambattista; Aguglia, Umberto
2017-10-01
Despite methodological advances in epilepsy clinical trials, the proportion of patients reaching seizure freedom has not substantially changed over the years. We review the main methodological limitations of current trials, possible strategies to overcome these limits, and the issues that need to be addressed in the near future. Area covered: References were identified by PubMed search until March 2017, and unpublished literature was searched on ClinicalTrials.gov. Add-on trials mainly involve subjects with refractory epilepsy, reducing the overall response to the investigational drug. The inclusion of subjects with earlier disease from less developed countries has partially allowed overcoming this limitation, but has introduced more random variability of results. Monotherapy trials raise methodological, economic, and ethical concerns, with different regulatory requirements in the European Union and in the United States of America. Newer trial designs, such as futility trials or the 'time-to-event' design, have been implemented. Moreover, the results of both add-on and monotherapy trials might be affected by patients' ability to recognize and record seizures, and by the randomness of seizure occurrence over time. Possible strategies to achieve more reliable outcomes are detailed. Expert commentary: Clinical trial methodology needs to be optimized to better address regulatory agencies' requirements and to meet both patients' and clinicians' needs.
Hilgers, Ralf-Dieter; Bogdan, Malgorzata; Burman, Carl-Fredrik; Dette, Holger; Karlsson, Mats; König, Franz; Male, Christoph; Mentré, France; Molenberghs, Geert; Senn, Stephen
2018-05-11
IDeAl (Integrated designs and analysis of small population clinical trials) is an EU-funded project developing new statistical design and analysis methodologies for clinical trials in small population groups. Here we provide an overview of IDeAl findings and give recommendations to applied researchers. The description of the findings is broken down by the nine scientific IDeAl work packages and summarizes results from the project's more than 60 publications to date in peer-reviewed journals. In addition, we applied text mining to evaluate the publications and the IDeAl work packages' output in relation to the design and analysis terms derived from the IRDiRC task force report on small population clinical trials. The results are summarized, describing the developments from an applied viewpoint. The main results presented here are 33 practical recommendations drawn from the work, giving researchers comprehensive guidance to the improved methodology. In particular, the findings will help in designing and analysing efficient clinical trials in rare diseases with a limited number of patients available. We developed a network representation relating the hot topics developed by the IRDiRC task force on small population clinical trials to IDeAl's work, as well as relating the important methodologies that, by IDeAl's definition, are necessary to consider in the design and analysis of small-population clinical trials. These network representations establish a new perspective on the design and analysis of small-population clinical trials. IDeAl has provided a large number of options for refining the statistical methodology of small-population clinical trials from various perspectives. A total of 33 recommendations, developed and related to the work packages, help the researcher to design small population clinical trials. The route to improvements is displayed in the IDeAl network, which represents the statistical methodological skills necessary for the design and analysis of small-population clinical trials.
The methods are ready for use.
Bharucha, Nadir; Odermatt, Peter; Preux, Pierre-Marie
2014-01-01
The majority of people with epilepsy (PWE) live in low- and middle-income countries (LMICs). However, they remain largely untreated, and the bulk of resources are used to treat patients in the developed world. This disparity constitutes a challenge for neuroepidemiological studies on a global scale. In the past, several studies have focused on diverse populations in disparate countries at various periods of time and for particular purposes. The specificity of different contexts and circumstances makes it difficult to analyse PWE as a group either qualitatively or quantitatively. Such methodological limitations are further complicated by a lack of logistical support. There is a lack of interest in conducting studies, which results in inadequate funding; in addition, there is the considerable challenge of publishing research reports from LMICs in peer-reviewed international journals. This paper focuses on methodological problems related to studies in LMICs and attempts to give the reasons for their limitations, using epilepsy as an example. Regional conditions and environmental factors must be given careful consideration in the research design because of the importance of understanding the challenges of living in these environments. There are further limitations to the successful implementation of studies. Existing information on epilepsy is often not readily accessible; there is a lack of census data, and migratory patterns into cities make enumeration and sampling even more challenging. As there is usually no well-developed healthcare system, a door-to-door screening process is often the only way to identify those with convulsive epilepsy. The questionnaire and study design should preferably be adapted from standardized protocols, and pre-tested and validated in local conditions.
Systematic reviews and meta-analyses of studies in LMICs can provide data on the burden, risk factors, treatment and outcome of epilepsy only if the primary studies used are properly conducted using uniform and comparable methodology. The use of consistent, replicable neuroepidemiological methods in primary studies and systematic reviews enables reduction of the treatment gap and better epilepsy care. © 2013 S. Karger AG, Basel.
An approach to the parametric design of ion thrusters
NASA Technical Reports Server (NTRS)
Wilbur, Paul J.; Beattie, John R.; Hyman, Jay, Jr.
1988-01-01
A methodology that can be used to determine which of several physical constraints can limit ion thruster power and thrust, under various design and operating conditions, is presented. The methodology is exercised to demonstrate typical limitations imposed by grid system span-to-gap ratio, intragrid electric field, discharge chamber power per unit beam area, screen grid lifetime, and accelerator grid lifetime constraints. Limitations on power and thrust for a thruster defined by typical discharge chamber and grid system parameters when it is operated at maximum thrust-to-power are discussed. It is pointed out that other operational objectives such as optimization of payload fraction or mission duration can be substituted for the thrust-to-power objective and that the methodology can be used as a tool for mission analysis.
Sylvetsky, Allison C.; Blau, Jenny E.; Rother, Kristina I.
2016-01-01
Consumption of foods, beverages, and packets containing low-calorie sweeteners (LCS) has increased markedly across gender, age, race/ethnicity, weight status, and socioeconomic subgroups. However, well-controlled intervention studies rigorously evaluating the health effects of LCS in humans are limited. One of the key questions is whether LCS are indeed a beneficial strategy for weight management and prevention of obesity. The current review discusses several methodological considerations in the design and interpretation of these studies. Specifically, we focus on the selection of study participants, inclusion of an appropriate control, importance of considering habitual LCS exposure, selection of specific LCS, dose and route of LCS administration, choice of study outcomes, and the context and generalizability of the study findings. These critical considerations will guide the design of future studies and thus assist in understanding the health effects of LCS. PMID:26936185
Performance testing of 3D point cloud software
NASA Astrophysics Data System (ADS)
Varela-González, M.; González-Jorge, H.; Riveiro, B.; Arias, P.
2013-10-01
LiDAR systems are being used widely in recent years for many applications in the engineering field: civil engineering, cultural heritage, mining, industry and environmental engineering. One of the most important limitations of this technology is the large computational requirement involved in data processing, especially for large mobile LiDAR datasets. Several software solutions for data management are available in the market, including open source suites; however, users often lack methodologies to verify their performance properly. In this work a methodology for LiDAR software performance testing is presented and four different suites are studied: QT Modeler, VR Mesh, AutoCAD Civil 3D and the Point Cloud Library running in software developed at the University of Vigo (SITEGI). The software based on the Point Cloud Library shows better results in point cloud loading time and CPU usage. However, it is not as strong as the commercial suites in the working set and commit size tests.
Evaluation of massively parallel sequencing for forensic DNA methylation profiling.
Richards, Rebecca; Patel, Jayshree; Stevenson, Kate; Harbison, SallyAnn
2018-05-11
Epigenetics is an emerging area of interest in forensic science. DNA methylation, a type of epigenetic modification, can be applied to chronological age estimation, identical twin differentiation and body fluid identification. However, there is not yet an agreed, established methodology for targeted detection and analysis of DNA methylation markers in forensic research. Recently a massively parallel sequencing-based approach has been suggested. The use of massively parallel sequencing is well established in clinical epigenetics and is emerging as a new technology in the forensic field. This review investigates the potential benefits, limitations and considerations of this technique for the analysis of DNA methylation in a forensic context. The importance of a robust protocol, regardless of the methodology used, that minimises potential sources of bias is highlighted. This article is protected by copyright. All rights reserved.
The effect of music on cognitive performance: insight from neurobiological and animal studies.
Rickard, Nikki S; Toukhsati, Samia R; Field, Simone E
2005-12-01
The past 50 years have seen numerous claims that music exposure enhances human cognitive performance. Critical evaluation of studies across a variety of contexts, however, reveals important methodological weaknesses. The current article argues that an interdisciplinary approach is required to advance this research. A case is made for the use of appropriate animal models to avoid many confounds associated with human music research. Although such research has validity limitations for humans, reductionist methodology enables a more controlled exploration of music's elementary effects. This article also explores candidate mechanisms for this putative effect. A review of neurobiological evidence from human and comparative animal studies confirms that musical stimuli modify autonomic and neurochemical arousal indices, and may also modify synaptic plasticity. It is proposed that understanding how music affects animals provides a valuable conjunct to human research and may be vital in uncovering how music might be used to enhance cognitive performance.
Ma, Nylanda; Roberts, Rachel; Winefield, Helen; Furber, Gareth
2015-02-01
While the importance of looking at the entire family system in the context of child and adolescent mental health is well recognised, siblings of children with mental health problems (MHPs) are often overlooked. The existing literature on the mental health of these siblings needs to be reviewed. A systematic search located publications from 1990 to 2011 in four electronic databases. Thirty-nine relevant studies reported data on the prevalence of psychopathology in siblings of target children with MHPs. Siblings of target children had higher rates of at least one type of psychopathology than comparison children. Risk of psychopathology varied across the type of MHP in the target child. Other covariates included sibling age and gender and parental psychopathology. Significant variations and limitations in methodology were found in the existing literature. Methodological guidelines for future studies are outlined. Implications for clinicians, parents, and for future research are discussed.
Piloting a Deceased Subject Integrated Data Repository and Protecting Privacy of Relatives
Huser, Vojtech; Kayaalp, Mehmet; Dodd, Zeyno A.; Cimino, James J.
2014-01-01
Use of deceased subject Electronic Health Records can be an important piloting platform for informatics or biomedical research. The existing legal framework allows such research under less strict de-identification criteria; however, the privacy of non-decedents must be protected. We report on the creation of the deceased subject Integrated Data Repository (dsIDR) at the National Institutes of Health Clinical Center and a pilot methodology to remove secondary protected health information or identifiable information (secondary PxI; information about persons other than the primary patient). We characterize the available structured coded data in dsIDR and report the estimated frequencies of secondary PxI, ranging from 12.9% (sensitive token presence) to 1.1% (using stricter criteria). Federating decedent EHR data from multiple institutions can address sample size limitations, and our pilot study provides lessons learned and a methodology that can be adopted by other institutions. PMID:25954378
Sciutto, Giorgia; Oliveri, Paolo; Catelli, Emilio; Bonacini, Irene
2017-01-01
In the field of applied research in heritage science, the use of multivariate approaches is still quite limited, and the chemometric results obtained are often underinterpreted. Within this scenario, the present paper is aimed at disseminating the use of suitable multivariate methodologies and proposes a procedural workflow applied to a representative group of case studies of considerable importance for conservation purposes, as a sort of guideline on the processing and interpretation of FTIR data. Initially, principal component analysis (PCA) is performed and the score values are converted into chemical maps. Subsequently, the brushing approach is applied, demonstrating its usefulness for a deep understanding of the relationships between the multivariate map and the PC score space, as well as for the identification of the spectral bands mainly involved in the definition of each area localised within the score maps. PMID:29333162
Simulating Colour Vision Deficiency from a Spectral Image.
Shrestha, Raju
2016-01-01
People with colour vision deficiency (CVD) have difficulty seeing full colour contrast and can miss some of the features in a scene. As part of universal design, researchers have been working on how to modify and enhance the colours of images so that such viewers can see the scene with good contrast. For this, it is important to know how the original colour image is seen by individuals with different types of CVD. This paper proposes a methodology to simulate accurate colour-deficient images from a spectral image using the cone sensitivities of different cases of deficiency. As the method enables the generation of accurate colour-deficient images, it is believed to help better understand the limitations of colour vision deficiency, which in turn can lead to the design and development of more effective imaging technologies for better and wider accessibility in the context of universal design.
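The core projection this abstract relies on, mapping each pixel's spectrum onto cone sensitivities to get an LMS image, can be sketched as below. The Gaussian "cone sensitivities" and the crude L-for-M substitution are illustrative placeholders only; an accurate simulation would use measured cone fundamentals and a proper dichromat projection such as Brettel et al.'s.

```python
import numpy as np

# Illustrative cone sensitivity curves (NOT measured fundamentals).
wavelengths = np.linspace(400, 700, 31)        # nm, 10 nm steps

def gaussian(peak, width):
    return np.exp(-0.5 * ((wavelengths - peak) / width) ** 2)

cones = np.stack([gaussian(565, 40),           # L-cone placeholder
                  gaussian(540, 40),           # M-cone placeholder
                  gaussian(445, 30)])          # S-cone placeholder

rng = np.random.default_rng(1)
spectral_image = rng.random((4, 4, wavelengths.size))   # H x W x bands

# Project each pixel's spectrum onto the cone sensitivities -> LMS image.
lms = spectral_image @ cones.T                 # shape (4, 4, 3)

# Crude protanopia sketch: the missing L response is replaced by M,
# collapsing red-green contrast (a stand-in for a proper projection).
protan = lms.copy()
protan[..., 0] = lms[..., 1]
print(lms.shape, protan.shape)
```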
Oropharyngeal dysphagia in myotonic dystrophy type 1: a systematic review.
Pilz, Walmari; Baijens, Laura W J; Kremer, Bernd
2014-06-01
A systematic review was conducted to investigate the pathophysiology of and diagnostic procedures for oropharyngeal dysphagia in myotonic dystrophy (MD). The electronic databases Embase, PubMed, and The Cochrane Library were used. The search was limited to English, Dutch, French, German, Spanish, and Portuguese publications. Sixteen studies met the inclusion criteria. Two independent reviewers assessed the methodological quality of the included articles. The swallowing assessment tools, corresponding protocols, outcome measurements, and main findings of the studies are summarized and presented. The body of literature on the pathophysiology of swallowing in dysphagic patients with MD type 1 remains scant. The included studies are heterogeneous with respect to design and outcome measures and hence are not directly comparable. More importantly, most studies had methodological problems. These are discussed in detail, and recommendations for further research on diagnostic examinations for swallowing disorders in patients with MD type 1 are provided.
Methodological Review of Intimate Partner Violence Prevention Research
ERIC Educational Resources Information Center
Murray, Christine E.; Graybeal, Jennifer
2007-01-01
The authors present a methodological review of empirical program evaluation research in the area of intimate partner violence prevention. The authors adapted and utilized criterion-based rating forms to standardize the evaluation of the methodological strengths and weaknesses of each study. The findings indicate that the limited amount of…
Qualitative Approaches to Mixed Methods Practice
ERIC Educational Resources Information Center
Hesse-Biber, Sharlene
2010-01-01
This article discusses how methodological practices can shape and limit how mixed methods is practiced and makes visible the current methodological assumptions embedded in mixed methods practice that can shut down a range of social inquiry. The article argues that there is a "methodological orthodoxy" in how mixed methods is practiced…
JEDI Methodology | Jobs and Economic Development Impact Models | NREL
JEDI Methodology: The Jobs and Economic Development Impact (JEDI) models use project inputs (such as costs) to demonstrate the employment and economic impacts that will likely result from a project and to estimate overall economic impacts for specific scenarios. Please see Limitations of JEDI Models for more information.
Conducting Research with LGB People of Color: Methodological Challenges and Strategies
ERIC Educational Resources Information Center
DeBlaere, Cirleen; Brewster, Melanie E.; Sarkees, Anthony; Moradi, Bonnie
2010-01-01
Methodological barriers have been highlighted as a primary reason for the limited research with lesbian, gay, and bisexual (LGB) people of color. Thus, strategies for anticipating and addressing potential methodological barriers are needed. To address this need, this article discusses potential challenges associated with conducting research with…
Investigating the Effectiveness of Special Education: An Analysis of Methodology.
ERIC Educational Resources Information Center
Tindal, Gerald
1985-01-01
The review examines evaluations of the efficacy of special education programs for mildly disabled children. The author suggests that serious methodological flaws make our present knowledge in this area very weak and proposes a methodology to address and overcome many of the limitations of previous research. (Author)
Particle-based modeling of heterogeneous chemical kinetics including mass transfer.
Sengar, A; Kuipers, J A M; van Santen, Rutger A; Padding, J T
2017-08-01
Connecting the macroscopic world of continuous fields to the microscopic world of discrete molecular events is important for understanding several phenomena occurring at physical boundaries of systems. An important example is heterogeneous catalysis, where reactions take place at active surfaces, but the effective reaction rates are determined by transport limitations in the bulk fluid and reaction limitations on the catalyst surface. In this work we study the macro-micro connection in a model heterogeneous catalytic reactor by means of stochastic rotation dynamics. The model is able to resolve the convective and diffusive interplay between participating species, while including adsorption, desorption, and reaction processes on the catalytic surface. Here we apply the simulation methodology to a simple straight microchannel with a catalytic strip. Dimensionless Damköhler numbers are used to comment on the spatial concentration profiles of reactants and products near the catalyst strip and in the bulk. We end the discussion with an outlook on more complicated geometries and increasingly complex reactions.
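The Damköhler numbers mentioned above compare the reaction rate at the catalyst to the transport rate in the fluid. A minimal sketch of the common first-order surface form, Da = kL/D, follows; the numbers are illustrative, not taken from the paper.

```python
# Surface Damköhler number: ratio of the surface reaction rate to the
# diffusive transport rate. All parameter values below are illustrative.

def damkohler(k_surface, length, diffusivity):
    """Da = k*L/D for a first-order surface reaction."""
    return k_surface * length / diffusivity

da = damkohler(k_surface=1e-3,    # m/s, surface rate constant
               length=1e-4,       # m, channel height
               diffusivity=1e-9)  # m^2/s, bulk diffusivity
print(round(da, 6))               # 100.0 -> transport-limited (Da >> 1)
```

Da >> 1 indicates the effective rate is limited by transport to the surface; Da << 1 indicates it is limited by the surface reaction itself.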
Jia, Pengli; Tang, Li; Yu, Jiajie; Lee, Andy H; Zhou, Xu; Kang, Deying; Luo, Yanan; Liu, Jiali; Sun, Xin
2018-03-06
To assess risk of bias and to investigate methodological issues concerning the design, conduct and analysis of randomised controlled trials (RCTs) testing acupuncture for knee osteoarthritis (KOA). PubMed, EMBASE, Cochrane Central Register of Controlled Trials and four major Chinese databases were searched for RCTs that investigated the effect of acupuncture for KOA. The Cochrane tool was used to examine the risk of bias of eligible RCTs. Their methodological details were examined using a standardised and pilot-tested questionnaire of 48 items, together with the association between four predefined factors and important methodological quality indicators. A total of 248 RCTs were eligible, of which 39 (15.7%) used computer-generated randomisation sequence. Of the 31 (12.5%) trials that stated the allocation concealment, only one used central randomisation. Twenty-five (10.1%) trials mentioned that their acupuncture procedures were standardised, but only 18 (7.3%) specified how the standardisation was achieved. The great majority of trials (n=233, 94%) stated that blinding was in place, but 204 (87.6%) did not clarify who was blinded. Only 27 (10.9%) trials specified the primary outcome, for which 7 used intention-to-treat analysis. Only 17 (6.9%) trials included details on sample size calculation; none preplanned an interim analysis and associated stopping rule. In total, 46 (18.5%) trials explicitly stated that loss to follow-up occurred, but only 6 (2.4%) provided some information to deal with the issue. No trials prespecified, conducted or reported any subgroup or adjusted analysis for the primary outcome. The overall risk of bias was high among published RCTs testing acupuncture for KOA. Methodological limitations were present in many important aspects of design, conduct and analyses. These findings inform the development of evidence-based methodological guidance for future trials assessing the effect of acupuncture for KOA. 
© Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
System-on-Chip Data Processing and Data Handling Spaceflight Electronics
NASA Technical Reports Server (NTRS)
Kleyner, I.; Katz, R.; Tiggeler, H.
1999-01-01
This paper presents a methodology and a tool set which implements automated generation of moderate-size blocks of customized intellectual property (IP), thus effectively reusing prior work and minimizing the labor intensive, error-prone parts of the design process. Customization of components allows for optimization for smaller area and lower power consumption, which is an important factor given the limitations of resources available in radiation-hardened devices. The effects of variations in HDL coding style on the efficiency of synthesized code for various commercial synthesis tools are also discussed.
In Vivo Methods for the Assessment of Topical Drug Bioavailability
Herkenne, Christophe; Alberti, Ingo; Naik, Aarti; Kalia, Yogeshvar N.; Mathy, François-Xavier; Préat, Véronique
2007-01-01
This paper reviews some current methods for the in vivo assessment of local cutaneous bioavailability in humans after topical drug application. After an introduction discussing the importance of local drug bioavailability assessment and the limitations of model-based predictions, the focus turns to the relevance of experimental studies. The available techniques are then reviewed in detail, with particular emphasis on the tape stripping and microdialysis methodologies. Other less developed techniques, including the skin biopsy, suction blister, follicle removal and confocal Raman spectroscopy techniques are also described. PMID:17985216
Epidemiological Interpretation of Studies Examining the Effect of Antibiotic Usage on Resistance
Schechner, Vered; Temkin, Elizabeth; Harbarth, Stephan; Carmeli, Yehuda
2013-01-01
Bacterial resistance to antibiotics is a growing clinical problem and public health threat. Antibiotic use is a known risk factor for the emergence of antibiotic resistance, but demonstrating the causal link between antibiotic use and resistance is challenging. This review describes different study designs for assessing the association between antibiotic use and resistance and discusses strengths and limitations of each. Approaches to measuring antibiotic use and antibiotic resistance are presented. Important methodological issues such as confounding, establishing temporality, and control group selection are examined. PMID:23554418
HS-GC-MS method for the analysis of fragrance allergens in complex cosmetic matrices.
Desmedt, B; Canfyn, M; Pype, M; Baudewyns, S; Hanot, V; Courselle, P; De Beer, J O; Rogiers, V; De Paepe, K; Deconinck, E
2015-01-01
Potential allergenic fragrances are covered by the Cosmetic Regulation, with labelling and concentration restrictions. This means that they have to be declared on the ingredients list when their concentration exceeds the labelling limit of 10 ppm for leave-on or 100 ppm for rinse-off cosmetics. Labelling is important for consumer safety: people sensitised to fragrances can select their products based on the ingredients list to prevent elicitation of an allergic reaction. It is therefore important to quantify potential allergenic ingredients in cosmetic products. An easy-to-perform liquid extraction was developed, combined with a new headspace GC-MS method capable of analysing 24 volatile allergenic fragrances in complex cosmetic formulations, such as hydrophilic (O/W) and lipophilic (W/O) creams, lotions and gels. The method was successfully validated using the total error approach. The trueness deviations for all components were smaller than 8%, and the expectation tolerance limits did not exceed the acceptance limits of ±20% at the labelling limit. The current methodology was used to analyse 18 cosmetic samples that had already been identified as illegal on the EU market for containing forbidden skin-whitening substances. Our results showed that these cosmetic products also contained undeclared fragrances above the labelling limit, which imposes an additional health risk for the consumer. Copyright © 2014 Elsevier B.V. All rights reserved.
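The labelling rule stated in this abstract (10 ppm for leave-on, 100 ppm for rinse-off products) reduces to a simple threshold check. A minimal sketch follows; the function and dictionary names are ours, not from the paper or the Regulation.

```python
# Labelling thresholds as stated in the abstract above (ppm).
LABELLING_LIMIT_PPM = {"leave-on": 10.0, "rinse-off": 100.0}

def must_be_labelled(concentration_ppm, product_type):
    """True if the allergen concentration exceeds the labelling limit."""
    return concentration_ppm > LABELLING_LIMIT_PPM[product_type]

# A 25 ppm allergen must be declared in a leave-on cream,
# but not in a rinse-off gel.
print(must_be_labelled(25.0, "leave-on"))   # True
print(must_be_labelled(25.0, "rinse-off"))  # False
```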
Practical Loop-Shaping Design of Feedback Control Systems
NASA Technical Reports Server (NTRS)
Kopasakis, George
2010-01-01
An improved methodology for designing feedback control systems has been developed, based on systematically shaping the loop gain of the system to meet performance requirements such as stability margins, disturbance attenuation, and transient response, while taking into account actuation system limitations such as actuation rates and range. Loop-shaping for controls design is not new, but past techniques do not directly address how to systematically design the controller to maximize its performance. As a result, classical feedback control systems are designed predominantly using ad hoc approaches such as proportional-integral-derivative (PID) control, with designers normally satisfied when a workable solution is achieved, without a good understanding of how to maximize the effectiveness of the control design in terms of competing performance requirements, in relation to the limitations of the plant design. The conception of this improved methodology was motivated by challenges in designing control systems of the types needed for supersonic propulsion, but the methodology is generally applicable to any classical control-system design where the transfer function of the plant is known or can be evaluated. In the case of a supersonic aerospace vehicle, a major challenge is to design the system to attenuate anticipated external and internal disturbances, using such actuators as fuel injectors and valves, bypass doors, and ramps, all of which are subject to limitations in actuator response, rates, and ranges. Also, for supersonic vehicles with long, slim structures, coupling between the engine and the structural dynamics can produce undesirable effects that could adversely affect vehicle stability and ride quality.
In order to design distributed controls that can suppress these potential adverse effects, within the full capabilities of the actuation system, it is important to employ a systematic control design methodology such as this that can maximize the effectiveness of the control design in a methodical and quantifiable way. The emphasis is on generating simple but rather powerful design techniques that will allow even designers with a layman's knowledge in controls to develop effective feedback control designs. Unlike conventional ad hoc methodologies of feedback control design, in this approach actuator rates are incorporated into the design right from the start: the relation between actuator speeds and the desired control bandwidth of the system is established explicitly. The technique developed is demonstrated via design examples in a step-by-step tutorial way. Given the actuation system rates and range limits together with design specifications in terms of stability margins, disturbance rejection, and transient response, the procedure involves designing the feedback loop gain to meet the requirements and maximizing the control system effectiveness, without exceeding the actuation system limits and saturating the controller. Then, knowing the plant transfer function, the procedure involves designing the controller so that the controller transfer function together with the plant transfer function equate to the designed loop gain. The technique also shows what the limitations of the controller design are and how to trade competing design requirements such as stability margins and disturbance rejection. Finally, the technique is contrasted against other more familiar control design techniques, like PID control, to show its advantages.
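The central loop-shaping step described above, choose a target loop gain L(s), then back out the controller as C(s) = L(s)/P(s), can be sketched numerically. The plant P(s) = 1/(s + 1), the integrator-shaped target loop, and the crossover frequency below are illustrative choices, not the paper's design examples.

```python
import numpy as np

# Frequency-domain sketch of loop-shaping: pick a target loop gain,
# divide by the plant to get the controller, and check the phase margin.
wc = 10.0                              # rad/s, desired gain crossover
w = np.logspace(-2, 3, 2000)           # frequency grid
s = 1j * w

P = 1.0 / (s + 1.0)                    # illustrative plant response
L = wc / s                             # target loop gain: pure integrator
C = L / P                              # controller realising C*P = L

# Phase margin: 180 deg plus the loop phase at the gain-crossover point.
i_xover = np.argmin(np.abs(np.abs(L) - 1.0))
pm = 180.0 + np.degrees(np.angle(L[i_xover]))
print(round(pm, 1))                    # 90.0 (pure integrator loop)
```

A pure-integrator loop gives a 90-degree phase margin by construction; a practical design would shape L further to trade margin against disturbance rejection, as the abstract describes.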
Brown, Matt A; Bishnoi, Ram J; Dholakia, Sara; Velligan, Dawn I
2016-01-20
Recent failures to detect efficacy in clinical trials investigating pharmacological treatments for schizophrenia raise concerns regarding the potential contribution of methodological shortcomings to this research. This review examines two key methodological issues currently suspected of hampering schizophrenia drug development: (1) limitations on the translational utility of preclinical development models, and (2) methodological challenges posed by increased placebo effects. Recommendations for strategies to address these methodological issues are provided.
User Evaluation of Neonatology Ward Design.
Trujillo, Juan Luis Higuera; Aviñó, Antoni Montañana I; Millán, Carmen Llinares
2017-01-01
The object of this article is to identify the set of affective and emotional factors behind users' assessments of a space in a neonatology unit and to propose design guidelines based on these. The importance of the neonatology service and the variety of users place great demands on the space at all levels. Despite these repercussions, the emotional aspects of the environment have received less attention. To capture users' mental schemas without the limitations of a single method, this study uses two complementary methodologies: focus groups and semantic differential. The (qualitative) focus group methodology provides exploratory information and concepts. The (quantitative) semantic differential methodology then uses these concepts to extract the conceptual structures that users employ in their assessment of the space. Of the total of 175 subjects, 31 took part in focus groups and 144 in the semantic differential study. Five independent concepts were identified: privacy, functionality and professional nature, spaciousness, lighting, and cleanliness. The perception of privacy and sensations of dominance and pleasure are fundamental to the overall positive assessment of the space. Six relevant design aspects were also identified: provide spacious surroundings; facilitate sufficient separation between the different posts or cots; use different colors from those usually found in health-care centers, as some aversion was found to white and especially green; design areas with childhood themes; use warm artificial light; and choose user-friendly equipment. The results provide design recommendations of interest and show the possibilities offered by combining both methods to analyze user response.
Kennedy, Gordon J; Afeworki, Mobae; Calabro, David C; Chase, Clarence E; Smiley, Randolph J
2004-06-01
Distinct hydrogen species are present in important inorganic solids such as zeolites, silicoaluminophosphates (SAPOs), mesoporous materials, amorphous silicas, and aluminas. These H species include hydrogens associated with acidic sites such as Al(OH)Si, non-framework aluminum sites, silanols, and surface functionalities. Direct and quantitative methodologies to identify, measure, and monitor these hydrogen species are key to monitoring catalyst activity, optimizing synthesis conditions, tracking post-synthesis structural modifications, and preparing novel catalytic materials. Several techniques have been developed to address these issues, including 1H MAS NMR (magic-angle spinning nuclear magnetic resonance). 1H MAS NMR offers many potential advantages over other techniques, but care is needed in recognizing experimental limitations and in developing sample handling and NMR methodology to obtain quantitatively reliable data. A simplified approach is described that permits vacuum dehydration of multiple samples simultaneously and directly in the MAS rotor without the need for epoxy, flame sealing, or extensive glovebox use. We have found that careful optimization of important NMR conditions, such as magnetic field homogeneity and magic angle setting, is necessary to acquire quantitative, high-resolution spectra that accurately measure the concentrations of the different hydrogen species present. Details of this 1H MAS NMR methodology, with representative applications to zeolites, SAPOs, M41S, and silicas as a function of synthesis conditions and post-synthesis treatments (i.e., steaming, thermal dehydroxylation, and functionalization), are presented.
McHugh, R Kathryn; Behar, Evelyn
2012-12-01
In his commentary on our previously published article "Readability of Self-Report Measures of Depression and Anxiety," J. Schinka (2012) argued for the importance of considering readability of patient materials and highlighted limitations of existing methodologies for this assessment. Schinka's commentary articulately described the weaknesses of readability assessment and emphasized the importance of the development of improved strategies for assessing readability to maximize the validity of self-report measures in applied settings. In our reply, we support and extend Schinka's argument, highlighting the importance of consideration of the range of factors (e.g., use of reverse-scored items) that may increase respondent difficulty with comprehension. Consideration of the readability of self-report symptom measures is critical to the validity of these measures in both clinical practice and research settings.
Micronutrients and Leptospirosis: A Review of the Current Evidence
Herman, Heather S.; Mehta, Saurabh; Cárdenas, Washington B.; Stewart-Ibarra, Anna M.
2016-01-01
Background: Leptospirosis is one of the most widespread zoonoses and represents a major threat to human health. Due to the high burden of disease, limitations in diagnostics, and limited coverage and availability of effective human and veterinary vaccines, leptospirosis remains an important neglected zoonotic disease. Improved surveillance and identification of modifiable risk factors for leptospirosis are urgently needed to inform preventive interventions and reduce the risk and severity of Leptospira infection. Methodology/Principal Findings: This review was conducted to examine the evidence that links micronutrient status and Leptospira infection. A total of 56 studies were included in this review: 28 in vitro, 17 animal, and 11 observational human studies. Findings indicated that Leptospira infection is associated with higher iron and calcium concentrations and hypomagnesemia. Conclusions/Significance: Few prospective studies and no randomized trials have been conducted to date to examine the potential role of micronutrients in Leptospira infection. The limited literature in this area constrains our ability to make specific recommendations; however, the roles of iron, calcium, and magnesium in leptospirosis represent important areas for future research. The role of micronutrients in leptospirosis risk and severity needs to be elucidated in larger prospective human studies to inform public health interventions. PMID:27387046
Documentation of indigenous Pacific agroforestry systems: a review of methodologies
Bill Raynor
1993-01-01
Recent interest in indigenous agroforestry has led to a need for documentation of these systems. However, previous work is very limited, and few methodologies are well-known or widely accepted. This paper outlines various methodologies (including sampling methods, data to be collected, and considerations in analysis) for documenting structure and productivity of...
Methodological Limitations of the Application of Expert Systems Methodology in Reading.
ERIC Educational Resources Information Center
Willson, Victor L.
Methodological deficiencies inherent in expert-novice reading research make it impossible to draw inferences about curriculum change. First, comparisons of intact groups are often used as a basis for making causal inferences about how observed characteristics affect behaviors. While comparing different groups is not by itself a useless activity,…
78 FR 60715 - Sedaxane; Pesticide Tolerances
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-02
... 28-day dermal study did not show systemic toxicity at the limit dose of 1,000 milligrams/kilogram/day... Enforcement Methodology Adequate enforcement methodology is available to enforce the tolerance expression. A...
Proactive patient rounding to increase customer service and satisfaction on an orthopaedic unit.
Tea, Christine; Ellison, Michael; Feghali, Fadia
2008-01-01
Customer service and patient satisfaction have become increasingly important in the healthcare industry. Given limited resources and a myriad of choices, on which facets of patient satisfaction should healthcare providers focus? An analysis of 40,000 observations across 4 hospitals found 1 important intervention: timely staff responsiveness. Using the Plan-Do-Check-Act (PDCA) quality methodology, the goal was set to improve staff responsiveness to orthopaedic patient needs and requests, thus improving patient satisfaction. A model to improve staff responsiveness was systematically developed and implemented. The I Care Rounding model places the emphasis on proactively meeting patient needs through hourly rounding, rather than caregivers providing care in a reactionary mode. After full implementation, positive improvement was demonstrated.
ERIC Educational Resources Information Center
Reed, Vicki A.; Brammall, Helen
2006-01-01
This article describes the systematic and detailed processes undertaken to modify a research methodology for use with language-impaired adolescents. The original methodology had been used previously with normally achieving adolescents and speech pathologists to obtain their opinions about the relative importance of selected communication skills…
Probabilistic assessment of dynamic system performance. Part 3
DOE Office of Scientific and Technical Information (OSTI.GOV)
Belhadj, Mohamed
1993-01-01
Accurate prediction of dynamic system failure behavior can be important for the reliability and risk analyses of nuclear power plants, as well as for their backfitting to satisfy given constraints on overall system reliability, or optimization of system performance. Global analysis of dynamic systems, through investigating the variations in the structure of the attractors of the system and the domains of attraction of these attractors as a function of the system parameters, is also important for nuclear technology in order to understand the fault tolerance as well as the safety margins of the system under consideration and to ensure safe operation of nuclear reactors. Such a global analysis would be particularly relevant to future reactors with inherent or passive safety features that are expected to rely on natural phenomena rather than active components to achieve and maintain safe shutdown. Conventionally, failure and global analysis of dynamic systems necessitate the utilization of different methodologies, which have computational limitations on the system size that can be handled. Using a Chapman-Kolmogorov interpretation of system dynamics, a theoretical basis is developed that unifies these methodologies as special cases and can be used for a comprehensive safety and reliability analysis of dynamic systems.
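The Chapman-Kolmogorov interpretation invoked above treats the evolution of state probabilities as composition of transition kernels; in the simplest discrete form, p(t+1) = p(t)T. A minimal sketch with a made-up three-state nominal/degraded/failed chain (transition numbers are illustrative, not from the report):

```python
import numpy as np

# Illustrative three-state failure chain: nominal, degraded, failed.
# Rows are current states, columns next states; rows sum to 1.
T = np.array([[0.90, 0.08, 0.02],   # nominal
              [0.00, 0.85, 0.15],   # degraded
              [0.00, 0.00, 1.00]])  # failed (absorbing)

p = np.array([1.0, 0.0, 0.0])       # start in the nominal state
for _ in range(10):                 # Chapman-Kolmogorov composition
    p = p @ T                       # p(t+1) = p(t) @ T

# After 10 steps, probability mass has flowed toward the failed state,
# while total probability is conserved.
print(round(p.sum(), 9))
```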
Searching for grey literature for systematic reviews: challenges and benefits.
Mahood, Quenby; Van Eerd, Dwayne; Irvin, Emma
2014-09-01
There is ongoing interest in including grey literature in systematic reviews. Including grey literature can broaden the scope to more relevant studies, thereby providing a more complete view of available evidence. Searching for grey literature can be challenging despite greater access through the Internet, search engines and online bibliographic databases. There are a number of publications that list sources for finding grey literature in systematic reviews. However, there is scant information about how searches for grey literature are executed and how it is included in the review process. This level of detail is important to ensure that reviews follow explicit methodology to be systematic, transparent and reproducible. The purpose of this paper is to provide a detailed account of one systematic review team's experience in searching for grey literature and including it throughout the review. We provide a brief overview of grey literature before describing our search and review approach. We also discuss the benefits and challenges of including grey literature in our systematic review, as well as the strengths and limitations to our approach. Detailed information about incorporating grey literature in reviews is important in advancing methodology as review teams adapt and build upon the approaches described. Copyright © 2013 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Lee, Hyunki; Kim, Min Young; Moon, Jeon Il
2017-12-01
Phase measuring profilometry and moiré methodology have been widely applied to the three-dimensional shape measurement of target objects because of their high measuring speed and accuracy. However, these methods suffer from an inherent limitation known as the correspondence, or 2π-ambiguity, problem. Although a sensing method that combines well-known stereo vision with the phase measuring profilometry (PMP) technique has been developed to overcome this problem, it still requires definite improvements in sensing speed and measurement accuracy. We propose a dynamic programming-based stereo PMP method to acquire more reliable depth information in a relatively short time. The proposed method efficiently fuses information from the two stereo sensors in terms of phase and intensity simultaneously, based on a newly defined cost function for dynamic programming. In addition, the important parameters are analyzed from the viewpoint of the 2π-ambiguity problem and measurement accuracy. To analyze the influence of important hardware and software parameters related to measurement performance, and to verify the method's efficiency, accuracy, and sensing speed, a series of experimental tests was performed with various objects and sensor configurations.
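The idea of a dynamic-programming cost that fuses phase and intensity can be sketched as a DP pass along one scanline over a small disparity range. This is a toy sketch only: the matching cost, fusion weight, smoothness penalty, and random data are all illustrative and not the paper's cost function.

```python
import numpy as np

# Toy scanline stereo matcher whose cost fuses phase and intensity terms.
rng = np.random.default_rng(2)
n, max_d = 16, 4                       # scanline length, disparity range
phase_l, phase_r = rng.random(n), rng.random(n + max_d)
int_l, int_r = rng.random(n), rng.random(n + max_d)
w_int, smooth = 0.5, 0.1               # fusion weight, smoothness penalty

# Matching cost for pixel x at disparity d: phase term + weighted intensity.
cost = np.empty((n, max_d))
for x in range(n):
    for d in range(max_d):
        cost[x, d] = (abs(phase_l[x] - phase_r[x + d])
                      + w_int * abs(int_l[x] - int_r[x + d]))

# DP accumulation with a penalty for disparity jumps between neighbours.
acc = cost.copy()
for x in range(1, n):
    for d in range(max_d):
        jumps = acc[x - 1] + smooth * np.abs(np.arange(max_d) - d)
        acc[x, d] += jumps.min()

disparity = acc.argmin(axis=1)         # best disparity per pixel
print(disparity.shape)                 # (16,)
```

The smoothness term is what lets phase information disambiguate the 2π-periodic matches that a per-pixel cost alone cannot resolve.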
Greenhouse gas emissions from reservoir water surfaces: A ...
Collectively, reservoirs created by dams are thought to be an important source of greenhouse gases (GHGs) to the atmosphere. So far, efforts to quantify, model, and manage these emissions have been limited by data availability and inconsistencies in methodological approach. Here we synthesize worldwide reservoir methane, carbon dioxide, and nitrous oxide emission data with three main objectives: (1) to generate a global estimate of GHG emissions from reservoirs, (2) to identify the best predictors of these emissions, and (3) to consider the effect of methodology on emission estimates. We estimate that GHG emissions from reservoir water surfaces account for 0.8 (0.5-1.2) Pg CO2-equivalents per year, equal to ~1.3% of all anthropogenic GHG emissions, with the majority (79%) of this forcing due to methane. We also discuss the potential for several alternative pathways such as dam degassing and downstream emissions to contribute significantly to overall GHG emissions. Although prior studies have linked reservoir GHG emissions to system age and latitude, we find that factors related to reservoir productivity are better predictors of emission. Finally, as methane contributed the most to total reservoir GHG emissions, it is important that future monitoring campaigns incorporate methane emission pathways, especially ebullition.
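The headline figures in this abstract can be cross-checked with back-of-envelope arithmetic: 0.8 Pg CO2-eq/yr at ~1.3% of anthropogenic emissions implies a global total of roughly 61.5 Pg CO2-eq/yr, and 79% of the reservoir forcing from methane is about 0.63 Pg CO2-eq/yr.

```python
# Back-of-envelope check of the figures quoted in the abstract above.
reservoir = 0.8                    # Pg CO2-eq per year from reservoirs
share = 0.013                      # ~1.3% of anthropogenic emissions
methane_fraction = 0.79            # majority of forcing due to methane

implied_total = reservoir / share  # implied global anthropogenic total
methane_part = methane_fraction * reservoir

print(round(implied_total, 1))     # 61.5 Pg CO2-eq/yr
print(round(methane_part, 2))      # 0.63 Pg CO2-eq/yr
```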
Fan, Victoria Y; Iyer, Smriti; Kapur, Avani; Mahbub, Rifaiyat; Mukherjee, Anit
2018-01-01
There is limited empirical evidence about the efficacy of fiscal transfers for a specific purpose, including for health, which represents an important source of funds for the delivery of public services, especially in large populous countries such as India. To examine two distinct methodologies for allocating specific-purpose centre-to-state transfers, one using an input-based formula focused on equity and the other using an outcome-based formula focused on performance. We examine the Twelfth Finance Commission (12FC)'s use of Equalization Grants for Health (EGH) as an input-based formula and the Thirteenth Finance Commission (13FC)'s use of Incentive Grants for Health (IGH) as an outcome-based formula. We simulate and replicate the allocation of these two transfer methodologies and examine the consequences of these fiscal transfer mechanisms. The EGH placed conditions on releasing funds, but states varied in their ability to meet those conditions, and hence their allocations varied; e.g., Madhya Pradesh received 100% and Odisha 67% of its expected allocation. Due to the design of the IGH formula, IGH allocations were unequally distributed and highly concentrated in 4 states (Manipur, Sikkim, Tamil Nadu, Nagaland), which received over half the national IGH allocation. The EGH had limited impact in achieving equalization, whereas the IGH rewards were concentrated in states which were already doing better. Greater transparency and accountability of centre-to-state allocations, and specifically their methodologies, are needed to ensure that allocation objectives are aligned to performance. © 2017 The Authors. The International Journal of Health Planning and Management published by John Wiley & Sons Ltd. PMID:28857284
NASA Astrophysics Data System (ADS)
Alexandre, J.; Azevedo, A. R. G.; Theophilo, M. M. D.; Xavier, C. G.; Paes, A. L. C.; Monteiro, S. N.; Margem, F. M.; Azeredo, N. G.
The use of soil-cement bricks is proving to be an important constructive methodology due to the low environmental impact of the production process of these blocks compared with conventional bricks, which are fired, besides being easy to produce. However, during the production of these compressed bricks, knowledge of the properties of the soil used is critical to the quality and durability of the blocks. The objective of this work is to evaluate the feasibility of using soil from the municipality of Goytacazes for the production of soil-cement bricks. Assays were performed for compaction, liquid limit, plastic limit, particle size analysis, EDX and X-ray diffraction; blocks were then pressed and analyzed for compressive strength and water absorption.
Personality and chronic fatigue syndrome: methodological and conceptual issues.
van Geelen, Stefan M; Sinnema, Gerben; Hermans, Hubert J M; Kuis, Wietse
2007-12-01
Among clinical psychologists, consulting physicians, scientific researchers and society in general an image has emerged of patients with chronic fatigue syndrome (CFS) as perfectionist, conscientious, hardworking, somewhat neurotic and introverted individuals with high personal standards, a great desire to be socially accepted and with a history of continuously pushing themselves past their limits. The aim of this article is to (a) give a concise review of the main recent studies on personality and CFS, (b) address the major methodological problems in the study of personality in CFS and (c) discuss some of the conceptual assumptions that seem to limit the research on personality and CFS. The results of the reviewed studies range from no evidence of major differences between the personalities of patients with CFS and controls, to evidence of severe psychopathology and personality disorder in patients with CFS. Although personality seems to play a role in CFS, it is difficult to draw general conclusions on the relation between personality and CFS. It is argued that this is partially due to the diversity and heterogeneity in study methods, patient populations, control groups and CFS case definitions. Personality should be regarded as an important factor to be studied in CFS. However, additional studies are needed, not focusing exclusively on personality disorder, or personality considered on a general trait level. In recent developments in personality research, the continually evolving life narrative that makes sense of, and gives direction to, an individual's life is also regarded as an important aspect of personality. New insights into personality and CFS might be gained by systematically studying the self-narratives of patients with the syndrome.
Reed, Vicki A; Brammall, Helen
2006-01-01
This article describes the systematic and detailed processes undertaken to modify a research methodology for use with language-impaired adolescents. The original methodology had been used previously with normally achieving adolescents and speech pathologists to obtain their opinions about the relative importance of selected communication skills for adolescents' positive peer relationships. Modifications attempted to address language-impaired adolescents' characteristic metalinguistic, literacy, cognitive, and information processing weaknesses. Revising the original wording of the communication skills, reducing the reading level of the skills from grade 10 to 4.6, using a Q-sort approach to ranking the importance of the skills, and revising the instructions and administration procedures led to what pilot testing results indicated was a valid methodology for use with language-impaired adolescents. Results of a preliminary study using the revised methodology suggested that language-impaired adolescents may perceive the relative importance of some communication skills differently from their normally achieving peers.
Methodology to Estimate the Quantity, Composition, and ...
This report, Methodology to Estimate the Quantity, Composition and Management of Construction and Demolition Debris in the US, was developed to expand access to data on CDD in the US and to support research on CDD and sustainable materials management. Since past US EPA CDD estimates have been limited to building-related CDD, a goal in the development of this methodology was to use data originating from CDD facilities and contractors to better capture the current picture of total CDD management, including materials from roads, bridges and infrastructure.
Jin, Ying-Hui; Wang, Guo-Hao; Sun, Yi-Rong; Li, Qi; Zhao, Chen; Li, Ge; Si, Jin-Hua; Li, Yan; Lu, Cui; Shang, Hong-Cai
2016-11-14
To assess the methodology and quality of evidence of systematic reviews and meta-analyses of traditional Chinese medical nursing (TCMN) interventions in Chinese journals. These interventions include acupressure, massage, Tai Chi, Qi Gong, electroacupuncture and the use of Chinese herbal medicines, for example in enemas, foot massage and compressing the umbilicus. A systematic literature search for systematic reviews and meta-analyses of TCMN interventions was performed. Review characteristics were extracted. The methodological quality and the quality of the evidence were evaluated using the Assessment of Multiple Systematic Reviews (AMSTAR) and Grading of Recommendations Assessment, Development and Evaluation (GRADE) approaches. We included 20 systematic reviews and meta-analyses, and a total of 11 TCMN interventions were assessed in the 20 reviews. Compliance with AMSTAR checklist items ranged from 4.5 to 8, and the systematic reviews/meta-analyses were, on average, of medium methodological quality. The quality of the evidence we assessed ranged from very low to moderate; no high-quality evidence was found. The top two causes for downrating confidence in effect estimates among the 31 bodies of evidence assessed were risk of bias and inconsistency. There is room for improvement in the methodological quality of systematic reviews/meta-analyses of TCMN interventions published in Chinese journals. Greater efforts should be devoted to ensuring a more comprehensive search strategy, clearer specification of the interventions of interest in the eligibility criteria and identification of meaningful outcomes for clinicians and patients (consumers). The overall quality of evidence among reviews remains suboptimal, which raises concerns about their role in influencing clinical practice. Thus, the conclusions in the reviews we assessed must be treated with caution and their role in influencing clinical practice should be limited.
A critical appraisal of systematic reviews/meta-analyses of TCMN interventions is particularly important to provide sound guidance for TCMN. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Anti-tobacco mass media and socially disadvantaged groups: a systematic and methodological review.
Guillaumier, Ashleigh; Bonevski, Billie; Paul, Chris
2012-07-01
Only a limited amount of research has been conducted to explore whether there are socioeconomic status differences in responses to mass media. However, the methodological quality of this evidence has not been assessed, limiting confidence in conclusions that can be drawn regarding study outcomes. A systematic review of the effectiveness of anti-tobacco mass media campaigns with socially disadvantaged groups was conducted, and the methodological quality of included studies was assessed. Medline, The Cochrane Library, PsycInfo, Embase and Web of Science were searched using MeSH and keywords for quantitative studies conducted in Western countries prior to March 2012. A methodological quality assessment and narrative analysis of included studies were undertaken. Seventeen relevant studies (reported in 18 papers) were identified; however, weak study designs and selection bias were common characteristics, limiting strong conclusions about effectiveness. The reviewed papers, which used predominantly non-cessation-related outcome measures, indicated mixed results for mass media tobacco control campaign effectiveness among various social groups. Most studies assessed mass media impact on low socioeconomic status groups rather than highly socially disadvantaged groups. Methodological rigour of evaluations in this field must be improved to aid understanding regarding the effectiveness of mass media campaigns in driving cessation among disadvantaged groups. The results of this review indicate a gap in methodologically rigorous research into the effectiveness of mass media campaigns among socially disadvantaged groups, particularly the highly disadvantaged. © 2012 Australasian Professional Society on Alcohol and other Drugs.
Rating of Dynamic Coefficient for Simple Beam Bridge Design on High-Speed Railways
NASA Astrophysics Data System (ADS)
Diachenko, Leonid; Benin, Andrey; Smirnov, Vladimir; Diachenko, Anastasia
2018-06-01
The aim of the work is to improve the methodology for the dynamic computation of simple beam spans under the impact of high-speed trains. Mathematical simulation utilizing numerical and analytical methods of structural mechanics is used in the research. The article analyses parameters of the effect of high-speed trains on simple beam spanning bridge structures and suggests a technique for determining the dynamic index for the live load. Reliability of the proposed methodology is confirmed by results of numerical simulation of high-speed train passage over spans at different speeds. The proposed algorithm of dynamic computation is based on a connection between the maximum acceleration of the span in the resonance mode of vibrations and the main factors of the stress-strain state. The methodology allows determining maximum as well as minimum values of the main internal forces in the construction, which makes it possible to perform endurance tests. It is noted that the dynamic additions for the components of the stress-strain state (bending moments, transverse force and vertical deflections) are different. This condition determines the necessity of a differentiated approach to the evaluation of dynamic coefficients when performing design verification for limit states of groups I and II. Practical importance: the methodology for determining the dynamic coefficients allows performing dynamic calculation and determining the main internal forces in simple beam spans without numerical simulation and direct dynamic analysis, which significantly reduces the labour costs of design.
Toward Green Acylation of (Hetero)arenes: Palladium-Catalyzed Carbonylation of Olefins to Ketones
2017-01-01
Green Friedel–Crafts acylation reactions belong to the most desired transformations in organic chemistry. The resulting ketones constitute important intermediates, building blocks, and functional molecules in organic synthesis as well as for the chemical industry. Over the past 60 years, advances in this topic have focused on how to make this reaction more economically and environmentally friendly by using green acylating conditions, such as stoichiometric acylations and catalytic homogeneous and heterogeneous acylations. However, currently well-established methodologies for their synthesis either produce significant amounts of waste or proceed under harsh conditions, limiting applications. Here, we present a new protocol for the straightforward and selective introduction of acyl groups into (hetero)arenes without directing groups by using available olefins with inexpensive CO. In the presence of commercial palladium catalysts, inter- and intramolecular carbonylative C–H functionalizations take place with good regio- and chemoselectivity. Compared to classical Friedel–Crafts chemistry, this novel methodology proceeds under mild reaction conditions. The general applicability of this methodology is demonstrated by the direct carbonylation of industrial feedstocks (ethylene and diisobutene) as well as of natural products (eugenol and safrole). Furthermore, synthetic applications to drug molecules are showcased. PMID:29392174
Diagnostic radiograph based 3D bone reconstruction framework: application to the femur.
Gamage, P; Xie, S Q; Delmas, P; Xu, W L
2011-09-01
Three dimensional (3D) visualization of anatomy plays an important role in image guided orthopedic surgery and ultimately motivates minimally invasive procedures. However, direct 3D imaging modalities such as Computed Tomography (CT) are restricted to a minority of complex orthopedic procedures. Thus the diagnostics and planning of many interventions still rely on two dimensional (2D) radiographic images, where the surgeon has to mentally visualize the anatomy of interest. The purpose of this paper is to apply and validate a bi-planar 3D reconstruction methodology driven by prominent bony anatomy edges and contours identified on orthogonal radiographs. The results obtained through the proposed methodology are benchmarked against 3D CT scan data to assess the accuracy of reconstruction. The human femur has been used as the anatomy of interest throughout the paper. The novelty of this methodology is that it not only involves the outer contours of the bony anatomy in the reconstruction but also several key interior edges identifiable on radiographic images. Hence, this framework is not simply limited to long bones, but is generally applicable to a multitude of other bony anatomies as illustrated in the results section. Copyright © 2010 Elsevier Ltd. All rights reserved.
Ramaraju, Bhargavi; McFeeters, Hana; Vogler, Bernhard; McFeeters, Robert L.
2016-01-01
Nuclear magnetic resonance spectroscopy studies of ever larger systems have benefited from many different forms of isotope labeling, in particular, site specific isotopic labeling. Site specific 13C labeling of methyl groups has become an established means of probing systems not amenable to traditional methodology. However useful, methyl reporter sites can be limited in number and/or location. Therefore, new complementary site specific isotope labeling strategies are valuable. Aromatic amino acids make excellent probes since they are often found at important interaction interfaces and play significant structural roles. Aromatic side chains have many of the same advantages as methyl containing amino acids including distinct 13C chemical shifts and multiple magnetically equivalent 1H positions. Herein we report economical bacterial production and one-step purification of phenylalanine with 13C incorporation at the Cα, Cγ and Cε positions, resulting in two isolated 1H-13C spin systems. We also present methodology to maximize incorporation of phenylalanine into recombinantly overexpressed proteins in bacteria and demonstrate compatibility with ILV-methyl labeling. Inexpensive, site specific isotope labeled phenylalanine adds another dimension to biomolecular NMR, opening new avenues of study. PMID:28028744
Efficient free energy calculations of quantum systems through computer simulations
NASA Astrophysics Data System (ADS)
Antonelli, Alex; Ramirez, Rafael; Herrero, Carlos; Hernandez, Eduardo
2009-03-01
In general, the classical limit is assumed in computer simulation calculations of free energy. This approximation, however, is not justifiable for a class of systems in which quantum contributions to the free energy cannot be neglected. The inclusion of quantum effects is important for the determination of reliable phase diagrams of these systems. In this work, we present a new methodology to compute the free energy of many-body quantum systems [1]. This methodology results from the combination of the path integral formulation of statistical mechanics and efficient non-equilibrium methods to estimate free energy, namely, the adiabatic switching and reversible scaling methods. A quantum Einstein crystal is used as a model to show the accuracy and reliability of the methodology. This new method is applied to the calculation of solid-liquid coexistence properties of neon. Our findings indicate that quantum contributions to properties such as melting point, latent heat of fusion, entropy of fusion, and slope of the melting line can be up to 10% of the values calculated using the classical approximation. [1] R. M. Ramirez, C. P. Herrero, A. Antonelli, and E. R. Hernández, Journal of Chemical Physics 129, 064110 (2008)
Mollison, Daisy; Sellar, Robin; Bastin, Mark; Mollison, Denis; Chandran, Siddharthan; Wardlaw, Joanna; Connick, Peter
2017-01-01
Moderate correlation exists between the imaging quantification of brain white matter lesions and cognitive performance in people with multiple sclerosis (MS). This may reflect the greater importance of other features, including subvisible pathology, or methodological limitations of the primary literature. To summarise the cognitive clinico-radiological paradox and explore the potential methodological factors that could influence the assessment of this relationship. Systematic review and meta-analysis of primary research relating cognitive function to white matter lesion burden. Fifty papers met eligibility criteria for review, and meta-analysis of overall results was possible in thirty-two (2050 participants). Aggregate correlation between cognition and T2 lesion burden was r = -0.30 (95% confidence interval: -0.34, -0.26). Wide methodological variability was seen, particularly related to key factors in the cognitive data capture and image analysis techniques. Resolving the persistent clinico-radiological paradox will likely require simultaneous evaluation of multiple components of the complex pathology using optimum measurement techniques for both cognitive and MRI feature quantification. We recommend a consensus initiative to support common standards for image analysis in MS, enabling benchmarking while also supporting ongoing innovation.
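The abstract above reports an aggregate correlation of r = -0.30 (95% CI: -0.34, -0.26) across 32 studies. A standard way to pool Pearson correlations in a meta-analysis is Fisher's z transform with fixed-effect weights of n − 3; the sketch below illustrates that approach on hypothetical study data, without claiming it is the exact model used by the review.

```python
import math

def pool_correlations(studies):
    """Fixed-effect pooling of Pearson correlations via Fisher's z.

    `studies` is a list of (r, n) pairs. Each r is mapped to
    z = atanh(r), averaged with weights n - 3, and the mean z
    (with a 95% CI) is mapped back through tanh.
    """
    weights = [n - 3 for _, n in studies]
    zs = [math.atanh(r) for r, _ in studies]
    w_total = sum(weights)
    z_bar = sum(w * z for w, z in zip(weights, zs)) / w_total
    half_width = 1.96 / math.sqrt(w_total)
    return math.tanh(z_bar), (math.tanh(z_bar - half_width),
                              math.tanh(z_bar + half_width))

# Hypothetical example: three studies with correlations near the
# aggregate value reported in the abstract above.
r_pooled, (lo, hi) = pool_correlations([(-0.25, 120), (-0.30, 400), (-0.35, 80)])
```

With roughly 2050 participants spread over 32 studies, such weights produce a fairly narrow confidence interval, consistent in spirit with the (-0.34, -0.26) interval the review reports.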
A systematic review on the affordability of a healthful diet for families in the United States.
Horning, Melissa L; Fulkerson, Jayne A
2015-01-01
As obesity rates remain alarmingly high, the importance of healthful diets is emphasized; however, affordability of such diets is disputed. Market basket surveys (MBSs) investigate the affordability of diets for families that meet minimum daily dietary requirements using actual food prices from grocery stores. This review paper describes the methods of MBSs, summarizes methodology, price and affordability findings, limitations, and suggests related policy and practice implications. This is a systematic review of 16 MBSs performed in the United States from 1985 to 2012. A comprehensive multidisciplinary database search strategy was used to identify articles meeting inclusion criteria. Results indicated MBS methodology varied across studies and price data indicated healthful diets for families are likely unaffordable when purchased from small- to medium-sized stores and may be unaffordable in larger stores when compared to the Thrifty Food Plan. Using a social ecological approach, public health nurses and all public health professionals are prime advocates for increased affordability of healthful foods. This study includes policy advocacy, particularly in support of Supplemental Nutrition Assistance Program benefits for low-income families. Future research implications are provided, including methodological recommendations for consistency and quality of forthcoming MBS research. © 2014 Wiley Periodicals, Inc.
Speciation of adsorbates on surface of solids by infrared spectroscopy and chemometrics.
Vilmin, Franck; Bazin, Philippe; Thibault-Starzyk, Frédéric; Travert, Arnaud
2015-09-03
Speciation, i.e. identification and quantification, of surface species on heterogeneous surfaces by infrared spectroscopy is important in many fields but remains a challenging task when facing strongly overlapped spectra of multiple adspecies. Here, we propose a new methodology, combining state of the art instrumental developments for quantitative infrared spectroscopy of adspecies and chemometrics tools, mainly a novel data processing algorithm, called SORB-MCR (SOft modeling by Recursive Based-Multivariate Curve Resolution) and multivariate calibration. After formal transposition of the general linear mixture model to adsorption spectral data, the main issues, i.e. validity of Beer-Lambert law and rank deficiency problems, are theoretically discussed. Then, the methodology is exposed through application to two case studies, each of them characterized by a specific type of rank deficiency: (i) speciation of physisorbed water species over a hydrated silica surface, and (ii) speciation (chemisorption and physisorption) of a silane probe molecule over a dehydrated silica surface. In both cases, we demonstrate the relevance of this approach which leads to a thorough surface speciation based on comprehensive and fully interpretable multivariate quantitative models. Limitations and drawbacks of the methodology are also underlined. Copyright © 2015 Elsevier B.V. All rights reserved.
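The linear mixture model underlying the approach described above (the absorbance of a mixture is a concentration-weighted sum of pure-component spectra, per the Beer-Lambert law) can be illustrated without the authors' SORB-MCR algorithm, which is not reproduced here. The sketch below simply unmixes a synthetic two-component spectrum by ordinary least squares; all band shapes and concentrations are hypothetical.

```python
import math

def gaussian_band(center, width, n=200):
    """Synthetic absorbance band on an arbitrary wavenumber grid."""
    return [math.exp(-((i - center) / width) ** 2) for i in range(n)]

# Hypothetical pure-component spectra of two strongly overlapped adspecies.
s1 = gaussian_band(center=90, width=25)
s2 = gaussian_band(center=110, width=25)

# Beer-Lambert linear mixture: d = c1*s1 + c2*s2 (unit path length assumed).
c_true = (0.7, 0.3)
d = [c_true[0] * a + c_true[1] * b for a, b in zip(s1, s2)]

# Least-squares unmixing via the 2x2 normal equations G c = b.
def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

g11, g12, g22 = dot(s1, s1), dot(s1, s2), dot(s2, s2)
b1, b2 = dot(s1, d), dot(s2, d)
det = g11 * g22 - g12 * g12
c1 = (g22 * b1 - g12 * b2) / det
c2 = (g11 * b2 - g12 * b1) / det
```

The rank-deficiency problems the paper discusses arise precisely when pure spectra like `s1` and `s2` are unknown or nearly collinear, which is what motivates curve-resolution methods rather than this direct regression.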
NASA Astrophysics Data System (ADS)
Kaya, Ebru
2017-11-01
In this review essay I respond to issues raised in Mijung Kim and Wolff-Michael Roth's paper titled "Dialogical argumentation in elementary science classrooms", which presents a study dealing with dialogical argumentation in early elementary school classrooms. Since there is very limited research on lower primary school students' argumentation in school science, their paper makes a contribution to research on children's argumentation skills. In this response, I focus on two main issues to extend the discussion in Kim and Roth's paper: (a) methodological issues, including conducting a quantitative study on children's argumentation levels and focusing on children's written argumentation in addition to their dialogical argumentation, and (b) investigating children's conceptual understanding along with their argumentation levels. Kim and Roth emphasize the difficulty of determining the level of children's argumentation through Toulmin's Argument Pattern and the lack of high-level arguments by children due to their difficulties in writing texts. Regarding these methodological issues, I suggest designing quantitative research on coding children's argument levels because such research could potentially provide important findings on children's argumentation. Furthermore, I discuss alternative written products including posters, figures, or pictures generated by children in order to trace children's arguments, and finally articulating argumentation and conceptual understanding of children.
A theoretical and experimental investigation of propeller performance methodologies
NASA Technical Reports Server (NTRS)
Korkan, K. D.; Gregorek, G. M.; Mikkelson, D. C.
1980-01-01
This paper briefly covers aspects related to propeller performance by means of a review of propeller methodologies; presentation of wind tunnel propeller performance data taken in the NASA Lewis Research Center 10 x 10 wind tunnel; discussion of the predominant limitations of existing propeller performance methodologies; and a brief review of airfoil developments appropriate for propeller applications.
KNOW ESSENTIALS: a tool for informed decisions in the absence of formal HTA systems.
Mathew, Joseph L
2011-04-01
Most developing countries and resource-limited settings lack robust health technology assessment (HTA) systems. Because the development of locally relevant HTA is not immediately viable, and the extrapolation of external HTA is inappropriate, a new model for evaluating health technologies is required. The aim of this study was to describe the development and application of KNOW ESSENTIALS, a tool facilitating evidence-based decisions on health technologies by stakeholders in settings lacking formal HTA systems. Current HTA methodology was examined through literature search. Additional issues relevant to resource-limited settings, but not adequately addressed in current methodology, were identified through further literature search, appraisal of contextually relevant issues, discussion with healthcare professionals familiar with the local context, and personal experience. A set of thirteen elements important for evidence-based decisions was identified, selected and combined into a tool with the mnemonic KNOW ESSENTIALS. Detailed definitions for each element, coding for the elements, and a system to evaluate a given health technology using the tool were developed. Developing countries and resource-limited settings face several challenges to informed decision making. Models that are relevant and applicable in high-income countries are unlikely in such settings. KNOW ESSENTIALS is an alternative that facilitates evidence-based decision making by stakeholders without formal expertise in HTA. The tool could be particularly useful, as an interim measure, in healthcare systems that are developing HTA capacity. It could also be useful anywhere when rapid evidence-based decisions on health technologies are required.
NASA Astrophysics Data System (ADS)
Wright, Mark Mba
There are significant technological and systemic challenges faced by today's advanced biofuel industry. These challenges stem from the current state-of-technology and from the system (consumer market, infrastructure, environment...) in which this emerging industry is being developed. The state-of-technology will improve with continued efforts in technology development, but novel approaches are required to investigate the systemic challenges that limit the adoption of advanced biofuels. The motivation of this dissertation is to address the question of how to find cost-effective, sustainable, and environmentally responsible pathways for the production of biofuels. Economic competitiveness, long-term viability, and benign environmental impact are key for biofuels to be embraced by industry, government, and consumers. Techno-economic, location, and carbon emission analysis are research methodologies that help address each of these issues. The research approach presented in this dissertation is to combine these three methodologies into a holistic study of advanced biofuel technologies. The value of techno-economic, location, and carbon emission analysis is limited when conducted in isolation because of current public perception towards energy technologies. Energy technologies are evaluated based on multiple criteria with a significant emphasis on the three areas investigated in this study. There are important aspects within each of these fields that could significantly limit the value of advances in other fields of study. Therefore, it is necessary that future research in advanced biofuels always consider the systemic challenges faced by novel developments.
Refuting phylogenetic relationships
Bucknam, James; Boucher, Yan; Bapteste, Eric
2006-01-01
Background Phylogenetic methods are philosophically grounded, and so can be philosophically biased in ways that limit explanatory power. This constitutes an important methodologic dimension not often taken into account. Here we address this dimension in the context of concatenation approaches to phylogeny. Results We discuss some of the limits of a methodology restricted to verificationism, the philosophy on which gene concatenation practices generally rely. As an alternative, we describe a software which identifies and focuses on impossible or refuted relationships, through a simple analysis of bootstrap bipartitions, followed by multivariate statistical analyses. We show how refuting phylogenetic relationships could in principle facilitate systematics. We also apply our method to the study of two complex phylogenies: the phylogeny of the archaea and the phylogeny of the core of genes shared by all life forms. While many groups are rejected, our results left open a possible proximity of N. equitans and the Methanopyrales, of the Archaea and the Cyanobacteria, as well as the possible grouping of the Methanobacteriales/Methanococcales and Thermoplasmatales, of the Spirochaetes and the Actinobacteria, and of the Proteobacteria and Firmicutes. Conclusion It is sometimes easier (and preferable) to decide which species do not group together than which ones do. When possible topologies are limited, identifying local relationships that are rejected may be a useful alternative to classical concatenation approaches aiming to find a globally resolved tree on the basis of weak phylogenetic markers. Reviewers This article was reviewed by Mark Ragan, Eugene V Koonin and J Peter Gogarten. PMID:16956399
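The core idea in the abstract above, identifying relationships that bootstrap replicates consistently reject rather than those they support, can be illustrated with a toy tally over bipartitions. The data, the 5% rejection threshold, and the encoding below are all hypothetical; the authors' actual software and multivariate follow-up analyses are not reproduced.

```python
from collections import Counter

# Hypothetical bootstrap sample: each replicate tree is reduced to the
# set of bipartitions (splits) it contains, encoded as frozensets of
# the taxa on one side of the split.
replicates = [
    [frozenset({"A", "B"}), frozenset({"C", "D"})],
    [frozenset({"A", "B"}), frozenset({"B", "C"})],
    [frozenset({"A", "B"}), frozenset({"C", "D"})],
    [frozenset({"A", "C"}), frozenset({"C", "D"})],
]

support = Counter(split for rep in replicates for split in rep)
n = len(replicates)

def is_refuted(split, threshold=0.05):
    """Treat a split as refuted when it appears in at most 5% of replicates."""
    return support[split] / n <= threshold

candidates = [frozenset({"A", "D"}), frozenset({"A", "B"})]
rejected = [s for s in candidates if is_refuted(s)]
```

Here the split {A, D} never occurs in the bootstrap sample and is flagged as refuted, while {A, B} appears in 3 of 4 replicates and survives, mirroring the paper's point that exclusions can be more decisive than positive groupings.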
2013-01-01
Background Systematic reviews and meta-analyses of home telemonitoring interventions for patients with chronic diseases have increased over the past decade and become increasingly important to a wide range of clinicians, policy makers, and other health care stakeholders. While a few criticisms about their methodological rigor and synthesis approaches have recently appeared, no formal appraisal of their quality has been conducted yet. Objective The primary aim of this critical review was to evaluate the methodology, quality, and reporting characteristics of prior reviews that have investigated the effects of home telemonitoring interventions in the context of chronic diseases. Methods Ovid MEDLINE, the Database of Abstracts of Reviews of Effects (DARE), and the Health Technology Assessment Database (HTA) of the Cochrane Library were electronically searched to find relevant systematic reviews published between January 1966 and December 2012. Potential reviews were screened and assessed for inclusion independently by three reviewers. Data pertaining to the methods used were extracted from each included review and examined for accuracy by two reviewers. A validated quality assessment instrument, R-AMSTAR, was used as a framework to guide the assessment process. Results Twenty-four reviews, nine of which were meta-analyses, were identified from more than 200 citations. The bibliographic search revealed that the number of published reviews has increased substantially over the years in this area, and although most reviews focus on studying the effects of home telemonitoring on patients with congestive heart failure, researcher interest has extended to other chronic diseases as well, such as diabetes, hypertension, chronic obstructive pulmonary disease, and asthma. Nevertheless, a substantial number of these reviews appear to lack optimal scientific rigor due to intrinsic methodological issues. Also, the overall quality of reviews does not appear to have improved over time.
While several criteria were met satisfactorily by either all or nearly all reviews, such as the establishment of an a priori design with inclusion and exclusion criteria, use of electronic searches on multiple databases, and reporting of study characteristics, there were other important areas that needed improvement. Duplicate data extraction, manual searches of highly relevant journals, inclusion of gray and non-English literature, and assessment of the methodological quality of included studies and of the quality of evidence were key methodological procedures that were performed infrequently. Furthermore, certain methodological limitations identified in the synthesis of study results have affected the results and conclusions of some reviews. Conclusions Despite the availability of methodological guidelines that can be utilized to guide the proper conduct of systematic reviews and meta-analyses and eliminate potential risks of bias, this knowledge has not yet been fully integrated in the area of home telemonitoring. Further efforts should be made to improve the design, conduct, reporting, and publication of systematic reviews and meta-analyses in this area. PMID:23880072
Stenne, R; Hurlimann, T; Godard, Béatrice
2012-01-01
Nutrigenetics is a promising field, but the achievability of expected benefits is challenged by the methodological limitations associated with clinical research in that field. The mere existence of these limitations suggests that promises about potential outcomes may be premature. Thus, benefits claimed in scientific journal articles in which these limitations are not acknowledged might stimulate biohype. This article aims to examine whether nutrigenetics clinical research articles are a potential source of biohype. Of the 173 articles identified, 16 contained claims in which clinical applications were extrapolated from study results. Because their methodological limitations were incompletely acknowledged, these articles could potentially be a source of biohype.
Methodologies and Tools for Tuning Parallel Programs: 80% Art, 20% Science, and 10% Luck
NASA Technical Reports Server (NTRS)
Yan, Jerry C.; Bailey, David (Technical Monitor)
1996-01-01
The need for computing power has forced a migration from serial computation on a single processor to parallel processing on multiprocessors. However, without effective means to monitor (and analyze) program execution, tuning the performance of parallel programs becomes exponentially difficult as program complexity and machine size increase. In the past few years, the ubiquitous introduction of performance tuning tools from various supercomputer vendors (Intel's ParAide, TMC's PRISM, CRI's Apprentice, and Convex's CXtrace) seems to indicate the maturity of performance instrumentation/monitor/tuning technologies and vendors'/customers' recognition of their importance. However, a few important questions remain: What kind of performance bottlenecks can these tools detect (or correct)? How time consuming is the performance tuning process? What are some important technical issues that remain to be tackled in this area? This workshop reviews the fundamental concepts involved in analyzing and improving the performance of parallel and heterogeneous message-passing programs. Several alternative strategies will be contrasted, and for each we will describe how currently available tuning tools (e.g. AIMS, ParAide, PRISM, Apprentice, CXtrace, ATExpert, Pablo, IPS-2) can be used to facilitate the process. We will characterize the effectiveness of the tools and methodologies based on actual user experiences at NASA Ames Research Center. Finally, we will discuss their limitations and outline recent approaches taken by vendors and the research community to address them.
Choices Behind Numbers: a Review of the Major Air Pollution Health Impact Assessments in Europe.
Malmqvist, E; Oudin, A; Pascal, M; Medina, S
2018-03-01
The aim of this review is to identify the key contextual and methodological differences in health impact assessments (HIA) of ambient air pollution performed for Europe. We limited our review to multi-country reviews. An additional aim is to quantify some of these differences by applying them in a HIA template in three European cities. Several HIAs of ambient air pollution have been performed for Europe, and their key results have been largely disseminated. Different studies have, however, come up with substantial differences in attributed health effects. It is of importance to review the background contributing to these differences and to quantify their importance for decision makers who will use them. We identified several methodological differences that could explain the discrepancy behind the number of attributable deaths or years of life lost. The main differences are due to the exposure-response functions chosen, the ways of assessing air pollution levels, the air pollution scenarios and the study population. In the quantification part, we found that using risk estimates from the European Study of Cohorts for Air Pollution Effects (ESCAPE) instead of the American Cancer Society (ACS) study could nearly double the attributable burden of ambient air pollution. This study provides some insights into the differential results in previously published HIAs on air pollution in Europe. These results are important for stakeholders in order to make informed decisions.
Young, Rhea; Camic, Paul M; Tischler, Victoria
2016-01-01
Dementia is a progressive condition, affecting increasing numbers of people, characterised by cognitive decline. The current systematic review aimed to evaluate research pertaining to the impact of arts and health interventions on cognition in people with dementia. A literature search was conducted utilising PsychInfo, Cochrane Reviews, Web of Science, Medline and British Humanities Index databases. Seventeen studies were included in the review, including those related to literary, performing and visual arts. The review highlighted this as an emerging area of research, with the literature consisting largely of small-scale studies with methodological limitations including lack of control groups and often poorly defined samples. All the studies suggested, however, that arts-based activities had a positive impact on cognitive processes, in particular on attention, stimulation of memories, enhanced communication and engagement with creative activities. The existing literature suggests that arts activities are helpful interventions within dementia care. A consensus has yet to emerge, however, about the direction for future research, including the challenge of measurement and the importance of methodological flexibility. It is suggested that further research address some of these limitations by examining whether the impact of interventions varies depending on cognitive ability, and by continuing to assess how arts interventions can be of use across the stages of dementia.
Vergani, Stefano; Korsunsky, Ilya; Mazzarello, Andrea Nicola; Ferrer, Gerardo; Chiorazzi, Nicholas; Bagnara, Davide
2017-01-01
Efficient and accurate high-throughput DNA sequencing of the adaptive immune receptor repertoire (AIRR) is necessary to study immune diversity in healthy subjects and in disease-related conditions. The high complexity and diversity of the AIRR, coupled with the limited amount of starting material, which can compromise identification of the full biological diversity, make such sequencing particularly challenging. AIRR sequencing protocols often fail to fully capture the sampled AIRR diversity, especially for samples containing restricted numbers of B lymphocytes. Here, we describe a library preparation method for immunoglobulin sequencing that results in an exhaustive full-length repertoire in which virtually every sampled B cell is sequenced. This maximizes the likelihood of identifying and quantifying the entire IGHV-D-J repertoire of a sample, including the detection of rearrangements present in only one cell in the starting population. The methodology establishes the importance of circumventing genetic material dilution in the preamplification phases and incorporates three key concepts: (1) balancing the amount of starting material against the depth of sequencing, (2) avoiding IGHV gene-specific amplification, and (3) using Unique Molecular Identifiers (UMIs). Together, this methodology is highly efficient, in particular for detecting rare rearrangements in the sampled population and when only a limited amount of starting material is available.
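The UMI concept can be illustrated with a minimal sketch: reads sharing the same UMI and sequence are presumed PCR duplicates of one starting molecule, so unique-molecule counts come from distinct (UMI, sequence) pairs. This is hypothetical code, not the authors' pipeline; the reads and identifiers below are invented:

```python
def count_unique_molecules(reads):
    """reads: iterable of (umi, sequence) tuples; return the number of
    distinct starting molecules inferred after UMI collapsing."""
    return len(set(reads))

reads = [
    ("AACG", "CARDRGYW"),  # molecule 1
    ("AACG", "CARDRGYW"),  # PCR duplicate of molecule 1 (same UMI + sequence)
    ("TTGC", "CARDRGYW"),  # same rearrangement, but a different molecule
    ("GGAT", "CARSSGW"),   # rare rearrangement observed once
]
assert count_unique_molecules(reads) == 3
```

This is why UMIs help detect rearrangements present in a single cell: one observed molecule is distinguishable from one molecule amplified many times.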
Valentine, Jeffrey C; Cooper, Harris
2008-06-01
Assessments of studies meant to evaluate the effectiveness of interventions, programs, and policies can serve an important role in the interpretation of research results. However, evidence suggests that available quality assessment tools have poor measurement characteristics and can lead to opposing conclusions when applied to the same body of studies. These tools tend to (a) be insufficiently operational, (b) rely on arbitrary post-hoc decision rules, and (c) result in a single number to represent a multidimensional construct. In response to these limitations, a multilevel and hierarchical instrument was developed in consultation with a wide range of methodological and statistical experts. The instrument focuses on the operational details of studies and results in a profile of scores instead of a single score to represent study quality. A pilot test suggested that satisfactory between-judge agreement can be obtained using well-trained raters working in naturalistic conditions. Limitations of the instrument are discussed, but these are inherent in making decisions about study quality given incomplete reporting and in the absence of strong, contextually based information about the effects of design flaws on study outcomes.
Quantitative mass spectrometry of unconventional human biological matrices
NASA Astrophysics Data System (ADS)
Dutkiewicz, Ewelina P.; Urban, Pawel L.
2016-10-01
The development of sensitive and versatile mass spectrometric methodology has fuelled interest in the analysis of metabolites and drugs in unconventional biological specimens. Here, we discuss the analysis of eight human matrices (hair, nail, breath, saliva, tears, meibum, nasal mucus and skin excretions, including sweat) by mass spectrometry (MS). The use of such specimens brings a number of advantages, the most important being non-invasive sampling, the limited risk of adulteration and the ability to obtain information that complements blood and urine tests. The most often studied matrices are hair, breath and saliva. This review primarily focuses on endogenous (e.g. potential biomarkers, hormones) and exogenous (e.g. drugs, environmental contaminants) small molecules. The majority of analytical methods used chromatographic separation prior to MS; however, such a hyphenated methodology greatly limits analytical throughput. On the other hand, the mass spectrometric methods that exclude chromatographic separation are fast but suffer from matrix interferences. To enable development of quantitative assays for unconventional matrices, it is desirable to standardize the protocols for the analysis of each specimen and create appropriate certified reference materials. Overcoming these challenges will make analysis of unconventional human biological matrices more common in a clinical setting. This article is part of the themed issue 'Quantitative mass spectrometry'.
Chesson, Harrell W; Patel, Chirag G; Gift, Thomas L; Bernstein, Kyle T; Aral, Sevgi O
2017-09-01
Racial disparities in the burden of sexually transmitted diseases (STDs) have been documented and described for decades. Similarly, methodological issues and limitations in the use of disparity measures to quantify disparities in health have also been well documented. The purpose of this study was to use historic STD surveillance data to illustrate four of the most well-known methodological issues associated with the use of disparity measures. We manually searched STD surveillance reports to find examples of racial/ethnic distributions of reported STDs that illustrate key methodological issues in the use of disparity measures. The disparity measures we calculated included the black-white rate ratio, the Index of Disparity (weighted and unweighted by subgroup population), and the Gini coefficient. The 4 examples we developed included illustrations of potential differences in relative and absolute disparity measures, potential differences in weighted and nonweighted disparity measures, the importance of the reference point when calculating disparities, and differences in disparity measures in the assessment of trends in disparities over time. For example, the gonorrhea rate increased for all minority groups (relative to whites) from 1992 to 1993, yet the Index of Disparity suggested that racial/ethnic disparities had decreased. Although imperfect, disparity measures can be useful to quantify racial/ethnic disparities in STDs, to assess trends in these disparities, and to inform interventions to reduce these disparities. Our study uses reported STD rates to illustrate potential methodological issues with these disparity measures and highlights key considerations when selecting disparity measures for quantifying disparities in STDs.
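The disparity measures named in the abstract can be sketched in code. This is a hypothetical illustration with invented rates and populations, not the study's code or surveillance data; the unweighted Index of Disparity and a Lorenz-curve Gini are one common formulation among several in the literature:

```python
def rate_ratio(rate_a, rate_b):
    """Relative disparity between two groups, e.g. a black-white rate ratio."""
    return rate_a / rate_b

def index_of_disparity(rates, reference):
    """Mean absolute deviation of subgroup rates from a reference rate,
    expressed as a percentage of that reference (unweighted form)."""
    mean_dev = sum(abs(r - reference) for r in rates) / len(rates)
    return 100.0 * mean_dev / reference

def gini_coefficient(rates, populations):
    """Gini coefficient of the distribution of cases across subgroups,
    from the Lorenz curve of cumulative case share vs. population share."""
    cases = [r * p for r, p in zip(rates, populations)]
    total_pop, total_cases = sum(populations), sum(cases)
    order = sorted(range(len(rates)), key=lambda i: rates[i])  # ascending rate
    cum_cases = area = 0.0
    for i in order:
        p_share = populations[i] / total_pop
        c_share = cases[i] / total_cases
        area += p_share * (cum_cases + c_share / 2.0)  # trapezoid slice
        cum_cases += c_share
    return 1.0 - 2.0 * area

# Invented rates (cases per 100,000) and population sizes for three groups.
rates, pops = [300.0, 150.0, 100.0], [40000, 35000, 25000]
assert rate_ratio(rates[0], rates[2]) == 3.0
assert abs(gini_coefficient([100.0, 100.0], [1000, 2000])) < 1e-12  # equal rates -> 0
```

Note how the choice of `reference` in `index_of_disparity` changes the result, which is exactly the reference-point issue the study illustrates.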
Disma, Nicola; Mondardini, Maria C; Terrando, Niccolò; Absalom, Anthony R; Bilotta, Federico
2016-01-01
Preclinical evidence suggests that anesthetic agents harm the developing brain, thereby causing long-term neurocognitive impairments. It is not clear if these findings apply to humans, and retrospective epidemiological studies thus far have failed to show definitive evidence that anesthetic agents are harmful to the developing human brain. The aim of this systematic review was to summarize the preclinical studies published over the past decade, with a focus on methodological issues, to facilitate the comparison between different preclinical studies and inform better design of future trials. The literature search identified 941 articles related to the topic of neurotoxicity. As the primary aim of this systematic review was to compare methodologies applied in animal studies to inform future trials, we excluded a priori all articles focused on putative mechanisms of neurotoxicity and on neuroprotective agents. Forty-seven preclinical studies were finally included in this review. Methods used in these studies were highly heterogeneous: animals were exposed to anesthetic agents at different developmental stages, in various doses and in various combinations with other drugs, and overall showed diverse toxicity profiles. Physiological monitoring and maintenance of physiological homeostasis was variable, and the use of cognitive tests was generally limited to assessment of specific brain areas, with restricted translational relevance to humans. Comparison between studies is thus complicated by this heterogeneous methodology, and the relevance of the combined body of literature to humans remains uncertain. Future preclinical studies should use better standardized methodologies to facilitate transferability of findings from preclinical into clinical science. © 2015 John Wiley & Sons Ltd.
Guidelines for the Design and Conduct of Clinical Studies in Knee Articular Cartilage Repair
Mithoefer, Kai; Saris, Daniel B.F.; Farr, Jack; Kon, Elizaveta; Zaslav, Kenneth; Cole, Brian J.; Ranstam, Jonas; Yao, Jian; Shive, Matthew; Levine, David; Dalemans, Wilfried; Brittberg, Mats
2011-01-01
Objective: To summarize current clinical research practice and develop methodological standards for objective scientific evaluation of knee cartilage repair procedures and products. Design: A comprehensive literature review was performed of high-level original studies providing information relevant for the design of clinical studies on articular cartilage repair in the knee. Analysis of cartilage repair publications and synopses of ongoing trials were used to identify important criteria for the design, reporting, and interpretation of studies in this field. Results: Current literature reflects the methodological limitations of the scientific evidence available for articular cartilage repair. However, clinical trial databases of ongoing trials document a trend suggesting improved study designs and clinical evaluation methodology. Based on the current scientific information and standards of clinical care, detailed methodological recommendations were developed for the statistical study design, patient recruitment, control group considerations, study endpoint definition, documentation of results, use of validated patient-reported outcome instruments, and inclusion and exclusion criteria for the design and conduct of scientifically sound cartilage repair study protocols. A consensus statement among the International Cartilage Repair Society (ICRS) and contributing authors experienced in clinical trial design and implementation was achieved. Conclusions: High-quality clinical research methodology is critical for the optimal evaluation of current and new cartilage repair technologies. In addition to generally applicable principles for orthopedic study design, specific criteria and considerations apply to cartilage repair studies. Systematic application of these criteria and considerations can facilitate study designs that are scientifically rigorous, ethical, practical, and appropriate for the question(s) being addressed in any given cartilage repair research project. 
PMID:26069574
A methodology for testing fault-tolerant software
NASA Technical Reports Server (NTRS)
Andrews, D. M.; Mahmood, A.; Mccluskey, E. J.
1985-01-01
A methodology for testing fault-tolerant software is presented. Testing fault-tolerant software is problematic because many errors are masked or corrected by voters, limiters, or automatic channel synchronization. This methodology illustrates how the same strategies used for testing fault-tolerant hardware can be applied to testing fault-tolerant software. For example, one strategy used in testing fault-tolerant hardware is to disable the redundancy during testing. A similar testing strategy is proposed for software: namely, to shift the major emphasis of testing earlier in the development cycle (before the redundancy is in place), thus reducing the possibility that undetected errors will be masked when limiters and voters are added.
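The masking problem described above can be demonstrated with a minimal sketch (hypothetical code, not from the paper): a majority voter hides a single faulty channel, which is why a version-level fault can survive system-level testing once redundancy is in place.

```python
from collections import Counter

def majority_vote(outputs):
    """Return the most common output among redundant channels."""
    return Counter(outputs).most_common(1)[0][0]

def correct_version(x):
    return x * x

def faulty_version(x):
    # Invented latent fault: wrong answer only for large inputs.
    return x * x + 1 if x > 100 else x * x

# System-level test with redundancy in place: the voter masks the fault.
x = 200
outputs = [correct_version(x), correct_version(x), faulty_version(x)]
assert majority_vote(outputs) == correct_version(x)  # fault is invisible here

# Testing the version in isolation (redundancy disabled) exposes it.
assert faulty_version(200) != correct_version(200)
```

This is the software analogue of disabling hardware redundancy during test: exercise each channel before the voter is added, so masked errors cannot accumulate undetected.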
Continuous flow nitration in miniaturized devices
2014-01-01
Summary This review highlights the state of the art in the field of continuous flow nitration with miniaturized devices. Although nitration has been one of the oldest and most important unit reactions, the advent of miniaturized devices has paved the way for new opportunities to reconsider the conventional approach for exothermic and selectivity-sensitive nitration reactions. Four different approaches to flow nitration with microreactors are presented herein and discussed in view of their advantages, limitations, and the applicability of the information to scale-up. Selected recent patents that disclose scale-up methodologies for continuous flow nitration are also briefly reviewed. PMID:24605161
The Role of Formative Evaluation in Implementation Research and the QUERI Experience
Stetler, Cheryl B; Legro, Marcia W; Wallace, Carolyn M; Bowman, Candice; Guihan, Marylou; Hagedorn, Hildi; Kimmel, Barbara; Sharp, Nancy D; Smith, Jeffrey L
2006-01-01
This article describes the importance and role of 4 stages of formative evaluation in our growing understanding of how to implement research findings into practice in order to improve the quality of clinical care. It reviews limitations of traditional approaches to implementation research and presents a rationale for new thinking and use of new methods. Developmental, implementation-focused, progress-focused, and interpretive evaluations are then defined and illustrated with examples from Veterans Health Administration Quality Enhancement Research Initiative projects. This article also provides methodologic details and highlights challenges encountered in actualizing formative evaluation within implementation research. PMID:16637954
Hayes, Eileen P; Jolly, Robert A; Faria, Ellen C; Barle, Ester Lovsin; Bercu, Joel P; Molnar, Lance R; Naumann, Bruce D; Olson, Michael J; Pecquet, Alison M; Sandhu, Reena; Shipp, Bryan K; Sussman, Robert G; Weideman, Patricia A
2016-08-01
A European Union (EU) regulatory guideline came into effect for all new pharmaceutical products on June 1st, 2015, and for all existing pharmaceutical products on December 1st, 2015. This guideline centers around the use of the Acceptable Daily Exposure (ADE) [synonymous with the Permitted Daily Exposure (PDE)] and operational considerations associated with implementation are outlined here. The EU guidance states that all active pharmaceutical ingredients (API) require an ADE; however, other substances such as starting materials, process intermediates, and cleaning agents may benefit from an ADE. Problems in setting ADEs for these additional substances typically relate to toxicological data limitations precluding the ability to establish a formal ADE. Established methodologies such as occupational exposure limits or bands (OELs or OEBs) and the threshold of toxicological concern (TTC) can be used or adjusted for use as interim ADEs when only limited data are available and until a more formal ADE can be established. Once formal ADEs are derived, it is important that the documents are routinely updated and that these updates are communicated to appropriate stakeholders. Another key operational consideration related to data-poor substances includes the use of maximum daily dose (MDD) in setting cross-contamination limits. The MDD is an important part of the maximum allowable/safe concentration (MAC/MSC) calculation and there are important considerations for its use and definition. Finally, other considerations discussed include operational aspects of setting ADEs for pediatrics, considerations for large molecules, and risk management in shared facilities. Copyright © 2016 Elsevier Inc. All rights reserved.
OPUS: Optimal Projection for Uncertain Systems. Volume 1
1991-09-01
unified control-design methodology that directly addresses these technology issues. In particular, optimal projection theory addresses the need for... effects, and limited identification accuracy in a 1-g environment. The principal contribution of OPUS is a unified design methodology that... characterizing solutions to constrained control-design problems. Transforming OPUS into a practical design methodology requires the development of
ERIC Educational Resources Information Center
Lauckner, Heidi; Paterson, Margo; Krupa, Terry
2012-01-01
Often, research projects are presented as final products with the methodologies cleanly outlined and little attention paid to the decision-making processes that led to the chosen approach. Limited attention paid to these decision-making processes perpetuates a sense of mystery about qualitative approaches, particularly for new researchers who will…
Alternative Methods of Base Level Demand Forecasting for Economic Order Quantity Items
1975-12-01
Adaptive Single Exponential Smoothing ... Choosing the Smoothing Constant... methodology used in the study, an analysis of results, and a detailed summary. Chapter I, Methodology, contains a description of the data, a... Chapter IV, Detailed Summary, presents a detailed summary of the findings, lists the limitations inherent in the research methodology, and
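Single exponential smoothing, the forecasting technique named in the contents fragment above, can be sketched briefly. This is an illustrative example only; the study's actual data and smoothing constants are not reproduced here:

```python
def single_exponential_smoothing(demand, alpha, initial):
    """Return one-step-ahead forecasts using F[t+1] = alpha*D[t] + (1-alpha)*F[t].

    demand:  observed demand per period
    alpha:   smoothing constant in (0, 1]; larger values react faster
    initial: forecast for the first period
    """
    forecasts = [initial]
    for d in demand:
        forecasts.append(alpha * d + (1 - alpha) * forecasts[-1])
    return forecasts

# With alpha = 0.5 and an initial forecast of 100:
f = single_exponential_smoothing([100, 120, 110], alpha=0.5, initial=100.0)
assert f == [100.0, 100.0, 110.0, 110.0]
```

Choosing the smoothing constant (one of the chapter topics above) is a trade-off: a large alpha tracks demand shifts quickly but amplifies noise, while a small alpha smooths noise but lags real changes.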
NASA Astrophysics Data System (ADS)
Bán, Zoltán; Győri, Erzsébet; János Katona, Tamás; Tóth, László
2015-04-01
Preparedness of nuclear power plants for beyond design basis external effects became highly important after the Great Tohoku Earthquake of 11 March 2011. At some nuclear power plants constructed on soft soil sites, liquefaction should be considered a beyond design basis hazard. The consequences of liquefaction have to be analysed with the aim of defining the post-event plant condition, identifying plant vulnerabilities, and planning the necessary measures for accident management. In this paper, the methodology for analysing liquefaction effects at nuclear power plants is outlined. The Paks Nuclear Power Plant in Hungary is used as an example to demonstrate the practical importance of the presented results and considerations. In contrast to design, the conservatism of the methodology for evaluating beyond design basis liquefaction effects at an operating plant has to be limited to a reasonable level. Consequently, the applicability of all existing methods has to be considered for the best estimate. The adequacy and conclusiveness of the results is mainly limited by the epistemic uncertainty of the methods used for defining the liquefaction hazard and the engineering parameters characterizing the consequences of liquefaction. The methods have to comply with conflicting requirements: they have to be consistent and widely accepted and used in practice; they have to be based on comprehensive databases; they have to provide a basis for evaluating the dominating engineering parameters that control the post-liquefaction response of the plant structures; and they have to be based on probabilistic seismic hazard assessment and allow integration into a logic-tree procedure. Experience from the Kashiwazaki-Kariwa plant, hit by the Niigata-ken Chuetsu-oki earthquake of 16 July 2007, and analysis of site conditions and plant layout at the Paks plant have shown that differential settlement is the dominating effect in the case considered.
Earlier studies have shown that the potentially liquefiable layer at the Paks Nuclear Power Plant is situated at relatively large depth; therefore, the applicability and adequacy of the methods at high overburden pressure is important. For existing facilities, the geotechnical data gained before construction are not sufficient for a comprehensive liquefaction analysis, and the scope for performing new geotechnical surveys is limited. Consequently, data availability has to be taken into account when selecting the analysis methods. Considerations have to be made for dealing with the aleatory uncertainty related to knowledge of the soil conditions. As shown in the paper, a careful comparison and analysis of the results obtained with different methodologies provides the basis for selecting practicable methods for the safety analysis of a nuclear power plant against beyond design basis liquefaction hazard.
NASA Astrophysics Data System (ADS)
Baldasano, José M.; Gonçalves, María; Soret, Albert; Jiménez-Guerrero, Pedro
2010-08-01
Assessing the effects of air quality management strategies in urban areas is a major concern worldwide because of the large health impacts caused by exposure to air pollution. In this sense, this work analyses the changes in urban air quality due to the introduction of a maximum speed limit of 80 km h-1 on motorways in a large city, using a novel methodology that combines traffic assimilation data and modelling systems implemented in a supercomputing facility. Although the methodology was developed generically and can be extrapolated to any large city or megacity, the case study of Barcelona is presented here. Hourly simulations cover the entire year 2008 (when the 80 km h-1 limit was introduced) versus the traffic conditions of 2007. The data were assimilated in an emission model that considers hourly variable speeds and hourly traffic intensity in the affected area, taken from long-term measurement campaigns for the aforementioned years; this also makes it possible to account for the effect of traffic congestion. Overall, emissions are reduced by up to 4%; however, the local effects of this reduction are substantial in the area adjacent to the roadways, reaching 11%. The assessed effects of the speed limitation represent improvements of 5-7% in air quality levels of primary pollutants over the area, directly improving the welfare of 1.35 million inhabitants (over 41% of the population of the Metropolitan Area) and affecting 3.29 million dwellers who potentially benefit from this strategy for air quality management (reducing mortality rates in the area by 0.6%).
Izewska, Joanna; Wesolowska, Paulina; Azangwe, Godfrey; Followill, David S.; Thwaites, David I.; Arib, Mehenna; Stefanic, Amalia; Viegas, Claudio; Suming, Luo; Ekendahl, Daniela; Bulski, Wojciech; Georg, Dietmar
2016-01-01
The International Atomic Energy Agency (IAEA) has a long tradition of supporting the development of methodologies for national networks providing quality audits in radiotherapy. A series of co-ordinated research projects (CRPs) has been conducted by the IAEA since 1995, assisting national external audit groups in developing national audit programs. The CRP 'Development of Quality Audits for Radiotherapy Dosimetry for Complex Treatment Techniques' was conducted in 2009-2012 as an extension of previously developed audit programs. Material and methods. The CRP work described in this paper focused on developing and testing two steps of dosimetry audit: verification of heterogeneity corrections, and treatment planning system (TPS) modeling of small MLC fields, which are important for the initial stages of complex radiation treatments such as IMRT. The project involved the development of a new solid slab phantom with heterogeneities, containing special measurement inserts for thermoluminescent dosimeters (TLD) and radiochromic films. The phantom and the audit methodology have been developed at the IAEA and tested in multi-center studies involving the CRP participants. Results. The results of multi-center testing of the methodology for two steps of dosimetry audit show that the design of the audit procedures is adequate and the methodology is feasible for meeting the audit objectives. A total of 97% of TLD results in heterogeneity situations obtained in the study were within 3%, and all results within 5%, agreement with the TPS-predicted doses. In contrast, only 64% of small beam profiles were within 3 mm agreement between the TPS-calculated and film-measured doses. Film dosimetry results have highlighted some limitations in TPS modeling of small beam profiles in the direction of MLC leaf movements. Discussion. Through multi-center testing, any challenges or difficulties in the proposed audit methodology were identified, and the methodology improved.
Using the experience of these studies, the participants could incorporate the auditing procedures in their national programs. PMID:26934916
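The audit's pass criteria (TLD doses within 3% or 5% of the TPS prediction, measured-dose agreement quoted as a fraction of results) amount to a simple relative-deviation check. A minimal sketch, with invented dose values rather than data from the study:

```python
def agreement_fraction(measured, predicted, tolerance):
    """Fraction of measured doses within a relative tolerance of the predicted doses."""
    pairs = list(zip(measured, predicted))
    within = sum(1 for m, p in pairs if abs(m - p) / p <= tolerance)
    return within / len(pairs)

tld = [2.01, 1.95, 2.08, 2.04]   # measured doses (Gy), illustrative only
tps = [2.00, 2.00, 2.00, 2.00]   # TPS-predicted doses (Gy), illustrative only

print(agreement_fraction(tld, tps, 0.03))  # fraction within 3% agreement
print(agreement_fraction(tld, tps, 0.05))  # fraction within 5% agreement
```

With these invented numbers, one of the four results falls outside the 3% criterion but all four meet the 5% criterion, mirroring the shape of the 97%/100% figures reported above.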
Thoughts on an Indigenous Research Methodology.
ERIC Educational Resources Information Center
Steinhauer, Evelyn
2002-01-01
Reviews writings of Indigenous scholars concerning the need for and nature of an Indigenous research methodology. Discusses why an Indigenous research methodology is needed; the importance of relational accountability in such a methodology; why Indigenous people must conduct Indigenous research; Indigenous knowledge and ways of knowing (including…
NASA Astrophysics Data System (ADS)
Dadashzadeh, N.; Duzgun, H. S. B.; Yesiloglu-Gultekin, N.
2017-08-01
While advanced numerical techniques in slope stability analysis are successfully used in deterministic studies, they have so far found limited use in probabilistic analyses due to their high computational cost. The first-order reliability method (FORM) is one of the most efficient probabilistic techniques for performing probabilistic stability analyses that account for the uncertainties in the analysis parameters. However, FORM cannot be used directly in numerical slope stability evaluations, as it requires the definition of an explicit limit state performance function. In this study, an integrated methodology for probabilistic numerical modeling of rock slope stability is proposed. The methodology is based on the response surface method, which is used to develop an explicit performance function from the results of numerical simulations so that FORM can be applied. The implementation of the proposed methodology is demonstrated on a large potential rock wedge in Sumela Monastery, Turkey. The accuracy with which the developed performance function represents the true limit state surface is evaluated by monitoring the slope behavior. The calculated probability of failure is compared with that of the Monte Carlo simulation (MCS) method. The proposed methodology is found to be 72% more efficient than MCS, while its accuracy is reduced, with an error of 24%.
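The FORM-versus-MCS comparison described above can be illustrated on a deliberately simple case. For a linear limit state g = R - S with independent normal capacity R and demand S, FORM is exact and the reliability index reduces to beta = (muR - muS)/sqrt(sR^2 + sS^2), with Pf = Phi(-beta). The sketch below checks this against a crude Monte Carlo estimate; all parameter values are invented, not taken from the Sumela Monastery study:

```python
import math
import random

# Linear limit state g = R - S (capacity minus demand) with independent
# normal R and S. For this case FORM is exact:
#   beta = (muR - muS) / sqrt(sR**2 + sS**2),  Pf = Phi(-beta).
muR, sR = 10.0, 1.5   # e.g. resisting moment of a wedge (assumed values)
muS, sS = 7.0, 1.0    # e.g. driving moment (assumed values)

beta = (muR - muS) / math.hypot(sR, sS)          # reliability index
pf_form = 0.5 * math.erfc(beta / math.sqrt(2))   # Pf = Phi(-beta)

# Crude Monte Carlo check of the same limit state
random.seed(0)
n = 200_000
fails = sum(random.gauss(muR, sR) - random.gauss(muS, sS) < 0 for _ in range(n))
pf_mcs = fails / n

print(f"beta = {beta:.3f}, Pf(FORM) = {pf_form:.5f}, Pf(MCS) = {pf_mcs:.5f}")
```

For a nonlinear implicit g, as in the study, the response surface step fits an explicit approximation to g from numerical-model runs before this FORM calculation can be made; the efficiency gain over MCS comes from avoiding the hundreds of thousands of model evaluations the sampling loop represents.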
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jetter, R. I.; Messner, M. C.; Sham, T. -L.
The goal of the proposed integrated Elastic Perfectly-Plastic (EPP) and Simplified Model Test (SMT) methodology is to incorporate an SMT data-based approach for creep-fatigue damage evaluation into the EPP methodology to avoid the separate evaluation of creep and fatigue damage and eliminate the requirement for stress classification in current methods; thus greatly simplifying evaluation of elevated temperature cyclic service. This methodology should minimize over-conservatism while properly accounting for localized defects and stress risers. To support the implementation of the proposed methodology and to verify the applicability of the code rules, analytical studies and evaluation of thermomechanical test results continued in FY17. This report presents the results of those studies. An EPP strain limits methodology assessment was based on recent two-bar thermal ratcheting test results on 316H stainless steel in the temperature range of 405 to 705°C. Strain range predictions from the EPP evaluation of the two-bar tests were also evaluated and compared with the experimental results. The role of sustained primary loading on cyclic life was assessed using the results of pressurized SMT data from tests on Alloy 617 at 950°C. A viscoplastic material model was used in an analytic simulation of two-bar tests to compare with EPP strain limits assessments using isochronous stress strain curves that are consistent with the viscoplastic material model. A finite element model of a prior 304H stainless steel Oak Ridge National Laboratory (ORNL) nozzle-to-sphere test was developed and used for EPP strain limits and creep-fatigue code case damage evaluations. A theoretical treatment of a recurring issue with convergence criteria for plastic shakedown illustrated the role of computer machine precision in EPP calculations.
Discrete choice experiments in pharmacy: a review of the literature.
Naik-Panvelkar, Pradnya; Armour, Carol; Saini, Bandana
2013-02-01
Discrete choice experiments (DCEs) have been widely used to elicit patient preferences for various healthcare services and interventions. The aim of our study was to conduct an in-depth scoping review of the literature and provide a current overview of the progressive application of DCEs within the field of pharmacy. Electronic databases (MEDLINE, EMBASE, SCOPUS, ECONLIT) were searched (January 1990-August 2011) to identify published English language studies using DCEs within the pharmacy context. Data were abstracted with respect to DCE methodology and application to pharmacy. Our search identified 12 studies. The DCE methodology was utilised to elicit preferences for different aspects of pharmacy products, therapy or services. Preferences were elicited from either patients or pharmacists, with just two studies incorporating the views of both. Most reviewed studies examined preferences for process-related or provider-related aspects with a lesser focus on health outcomes. Monetary attributes were considered to be important by most patients and pharmacists in the studies reviewed. Logit, probit or multinomial logit models were most commonly employed for estimation. Our study showed that the pharmacy profession has adopted the DCE methodology in a manner consistent with general health DCEs, although the number of studies is quite limited. Future studies need to examine the preferences of both patients and providers for particular products or disease-state management services. Incorporation of health outcome attributes in the design, testing for external validity, and the incorporation of DCE results into an economic evaluation framework to inform pharmacy policy remain important areas for future research. © 2012 The Authors. IJPP © 2012 Royal Pharmaceutical Society.
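The logit-family models the review identifies turn estimated attribute utilities into choice probabilities. A minimal multinomial-logit sketch; the attribute names, coefficients, and alternatives are invented for illustration, not estimates from any reviewed study:

```python
import math

# Invented utility coefficients: negative on cost, positive on consultation
# time with the pharmacist and on a private consultation area.
beta = {"cost": -0.04, "pharmacist_time": 0.10, "privacy": 0.55}

def utility(alt):
    """Linear utility V = sum of coefficient * attribute level."""
    return sum(beta[k] * v for k, v in alt.items())

# Two hypothetical pharmacy service profiles a respondent chooses between.
alternatives = {
    "service_A": {"cost": 20.0, "pharmacist_time": 10.0, "privacy": 1.0},
    "service_B": {"cost": 10.0, "pharmacist_time": 5.0, "privacy": 0.0},
}

v = {name: utility(a) for name, a in alternatives.items()}
denom = sum(math.exp(x) for x in v.values())
probs = {name: math.exp(x) / denom for name, x in v.items()}
print(probs)
```

Estimation in the reviewed studies runs this logic in reverse, finding the coefficients that best explain observed choices; the probabilities here simply show how a fitted model would predict uptake of one service profile over another.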
Clinical and Public Health Considerations in Urine Drug Testing to Identify and Treat Substance Use.
Barthwell, Andrea G
2016-05-11
To expand appropriate use of substance use testing, practitioners must increase their knowledge of the appropriate methodology, scope, and frequency. Yet, there is a current lack of accepted guidelines on clinical testing to identify and treat substance use. This article (1) conveys the importance of substance use testing as a clinical and public health response to trends of prescription drug abuse and increased access to medical and commercialized marijuana; (2) summarizes central features of the rapidly evolving science and the practice of patient-centered substance use testing in a clinical setting; and (3) provides recommendations that balance costs and benefits and serve as a starting point for appropriate testing to prevent, identify, and treat substance use disorders. The author conducted a search of peer-reviewed and government-supported articles and books in electronic databases and used her own knowledge and clinical experience. The author makes recommendations for determining the methodology, scope, and frequency of testing in each stage of care based on clinical considerations and methodological factors. Conclusion/Importance: Integrating sensible substance use testing broadly into clinical health care to identify substance use, diagnose substance use disorders, and guide patients into treatment can improve health outcomes and reduce the costs of substance use and addiction. No single testing regimen is suitable for all clinical scenarios; rather, a multitude of options, as discussed herein, can be adapted to meet a patient's unique needs. Ultimately, the practitioner must combine patient-specific information with knowledge of test technologies, capabilities, limitations, and costs.
Staff Acceptance of Tele-ICU Coverage
Chan, Paul S.; Cram, Peter
2011-01-01
Background: Remote coverage of ICUs is increasing, but staff acceptance of this new technology is incompletely characterized. We conducted a systematic review to summarize existing research on acceptance of tele-ICU coverage among ICU staff. Methods: We searched for published articles pertaining to critical care telemedicine systems (aka, tele-ICU) between January 1950 and March 2010 using PubMed, Cumulative Index to Nursing and Allied Health Literature, Global Health, Web of Science, and the Cochrane Library and abstracts and presentations delivered at national conferences. Studies were included if they provided original qualitative or quantitative data on staff perceptions of tele-ICU coverage. Studies were imported into content analysis software and coded by tele-ICU configuration, methodology, participants, and findings (eg, positive and negative staff evaluations). Results: Review of 3,086 citations yielded 23 eligible studies. Findings were grouped into four categories of staff evaluation: overall acceptance level of tele-ICU coverage (measured in 70% of studies), impact on patient care (measured in 96%), impact on staff (measured in 100%), and organizational impact (measured in 48%). Overall acceptance was high, despite initial ambivalence. Favorable impact on patient care was perceived by > 82% of participants. Staff impact referenced enhanced collaboration, autonomy, and training, although scrutiny, malfunctions, and contradictory advice were cited as potential barriers. Staff perceived the organizational impact to vary. An important limitation of available studies was a lack of rigorous methodology and validated survey instruments in many studies. Conclusions: Initial reports suggest high levels of staff acceptance of tele-ICU coverage, but more rigorous methodologic study is required. PMID:21051386
Groundwater availability as constrained by hydrogeology and environmental flows
Watson, Katelyn A.; Mayer, Alex S.; Reeves, Howard W.
2014-01-01
Groundwater pumping from aquifers in hydraulic connection with nearby streams has the potential to cause adverse impacts by decreasing flows to levels below those necessary to maintain aquatic ecosystems. The recent passage of the Great Lakes-St. Lawrence River Basin Water Resources Compact has brought attention to this issue in the Great Lakes region. In particular, the legislation requires the Great Lakes states to enact measures for limiting water withdrawals that can cause adverse ecosystem impacts. This study explores how both hydrogeologic and environmental flow limitations may constrain groundwater availability in the Great Lakes Basin. A methodology for calculating maximum allowable pumping rates is presented. Groundwater availability across the basin may be constrained by a combination of hydrogeologic yield and environmental flow limitations varying over both local and regional scales. The results are sensitive to factors such as pumping time, regional and local hydrogeology, streambed conductance, and streamflow depletion limits. Understanding how these restrictions constrain groundwater usage and which hydrogeologic characteristics and spatial variables have the most influence on potential streamflow depletions has important water resources policy and management implications.
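The study itself uses regional numerical models, but the kind of constraint it describes can be sketched with a classic analytical stand-in: the Glover-Balmer solution gives the fraction of a well's pumping rate captured from a nearby stream after time t, and an environmental streamflow-depletion limit then caps the allowable pumping rate. All parameter values below are invented for illustration:

```python
import math

def depletion_fraction(d, T, S, t):
    """Glover-Balmer streamflow-depletion fraction for an idealized aquifer.
    d: well-to-stream distance (m), T: transmissivity (m^2/day),
    S: storativity (-), t: pumping time (days)."""
    return math.erfc(d * math.sqrt(S / (4.0 * T * t)))

d, T, S = 500.0, 300.0, 0.05   # assumed well/aquifer geometry and properties
t = 365.0                      # one year of continuous pumping
allowed_depletion = 1000.0     # assumed streamflow-depletion limit (m^3/day)

f = depletion_fraction(d, T, S, t)
q_max = allowed_depletion / f  # maximum allowable pumping rate
print(f"depletion fraction = {f:.3f}, Q_max = {q_max:.0f} m^3/day")
```

The sensitivities the abstract lists fall out directly: longer pumping times and higher streambed connection push the depletion fraction toward 1, shrinking the maximum allowable rate, while the hydrogeologic-yield limit would cap the same rate independently.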
Mechanical system reliability for long life space systems
NASA Technical Reports Server (NTRS)
Kowal, Michael T.
1994-01-01
The creation of a compendium of mechanical limit states was undertaken in order to provide a reference base for the application of first-order reliability methods to mechanical systems in the context of the development of a system level design methodology. The compendium was conceived as a reference source specific to the problem of developing the noted design methodology, and not an exhaustive or exclusive compilation of mechanical limit states. The compendium is not intended to be a handbook of mechanical limit states for general use. The compendium provides a diverse set of limit-state relationships for use in demonstrating the application of probabilistic reliability methods to mechanical systems. The compendium is to be used in the reliability analysis of moderately complex mechanical systems.
Fall 2016 Solicitation Projects Website Info
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diachin, L.
Spark-ignition engines are the backbone behind people transportation around the world. The efficiency of spark-ignition engines is limited in practice by variations between engine cycles and cylinders within an engine that result from the manufacturing processes/tolerances. These variations impact knock limits and dilution tolerance, which results in more conservative settings for design and calibration settings, such as compression ratio, valve timing, and exhaust gas recirculation rates. Engine variations also have a significant impact on emissions generation, which can have a secondary impact on efficiency. A deeper understanding of the relative importance of these variations and their interactions on the charge preparation process can guide future decisions on machining tolerances and control strategies. This project will develop simulation tools and methodology to include the effects of some key manufacturing tolerances and their impact on engine performance and emissions.
2011-01-01
Background: Comprehensive "Total Pain" assessments of patients' end-of-life needs are critical for providing improved patient-clinician communication, assessing needs, and offering high-quality palliative care. However, patients' needs-based research methodologies and findings remain highly diverse, and their lack of consensus prevents optimum needs assessments and care planning. Mixed methods is an underused yet robust "patient-based" approach that draws on reported lived experiences to map both the incidence and prevalence of what patients perceive as important end-of-life needs. Methods: Findings often include methodological artifacts and their own selection bias; moving beyond diverse findings therefore requires revisiting methodological choices. A cross-sectional mixed-methods research design is therefore used to reduce the limitations inherent in both qualitative and quantitative methodologies. Audio-taped phenomenological "thinking aloud" interviews of a purposive sample of 30 hospice patients are used to identify their vocabulary for communicating perceptions of end-of-life needs. Grounded theory procedures assisted by QSR-NVivo software are then used to discover domains of needs embedded in the interview narratives. Summary findings are translated into quantified format for presentation and analytical purposes. Results: Findings from this mixed-methods feasibility study indicate that patients' narratives represent 7 core domains of end-of-life needs. These are (1) time, (2) social, (3) physiological, (4) death and dying, (5) safety, (6) spirituality, and (7) change & adaptation. The prevalence, rather than just the occurrence, of patients' reported needs provides further insight into their relative importance. Conclusion: Patients' perceptions of end-of-life needs are multidimensional, often ambiguous and uncertain.
Mixed methodology appears to hold considerable promise for unpacking both the occurrence and prevalence of the cognitive structures, represented by verbal encoding, that constitute patients' narratives. Communication is a key currency for delivering optimal palliative care. Therefore, understanding the domains of needs that emerge from patient-based vocabularies indicates potential for: (1) developing more comprehensive clinical-patient needs assessment tools; (2) improved patient-clinician communication; and (3) moving toward a theoretical model of human needs that can emerge at the end of life. PMID:21272318
The canine and feline skin microbiome in health and disease.
Weese, J Scott
2013-02-01
The skin harbours a diverse and abundant, yet inadequately investigated, microbial population. The population is believed to play an important role in both the pathophysiology and the prevention of disease, through a variety of poorly explored mechanisms. Early studies of the skin microbiota in dogs and cats reported a minimally diverse microbial composition of low overall abundance, most probably as a reflection of the limitations of testing methodology. Despite these limitations, it was clear that the bacterial population of the skin plays an important role in disease and in changes in response to both infectious and noninfectious diseases. Recent advances in technology are challenging some previous assumptions about the canine and feline skin microbiota and, with preliminary application of next-generation sequenced-based methods, it is apparent that the diversity and complexity of the canine skin microbiome has been greatly underestimated. A better understanding of this complex microbial population is critical for elucidation of the pathophysiology of various dermatological (and perhaps systemic) diseases and to develop novel ways to manipulate this microbial population to prevent or treat disease. © 2013 The Author. Veterinary Dermatology © 2013 ESVD and ACVD.
Spanking children: the controversies, findings, and new directions.
Benjet, Corina; Kazdin, Alan E
2003-03-01
The use of spanking as a discipline technique is quite prevalent, even though whether or not to spank children is controversial among lay and professional audiences alike. Considerable research on the topic has been analyzed in several reviews of the literature that often reach different and sometimes opposite conclusions. Opposing conclusions are not inherently problematic as research develops in an area. However, we propose that both methodological limitations of the research to date as well as the limited focus of the research questions have prevented a better understanding of the impact of parental spanking on child development. The purpose of this article is to convey the basis for limited progress to date and, more importantly, to reformulate the research agenda. The goal is to move toward a resolution of the most relevant questions to parents, professionals, and policymakers. We propose an expanded research agenda that addresses the goals of parental discipline, the direct and concomitant effects of spanking, the influences that foster and maintain the use of spanking, and the processes through which spanking operates.
Introducing a methodology for estimating duration of surgery in health services research.
Redelmeier, Donald A; Thiruchelvam, Deva; Daneman, Nick
2008-09-01
The duration of surgery is an indicator for the quality, risks, and efficiency of surgical procedures. We introduce a new methodology for assessing the duration of surgery based on anesthesiology billing records, along with reviewing its fundamental logic and limitations. The validity of the methodology was assessed through a population-based cohort of patients (n=480,986) undergoing elective operations in 246 Ontario hospitals with 1,084 anesthesiologists between April 1, 1992 and March 31, 2002 (10 years). The weaknesses of the methodology relate to missing data, self-serving exaggerations by providers, imprecisions from clinical diversity, upper limits due to accounting regulations, fluctuations from updates over the years, national differences in reimbursement schedules, and the general failings of claims-based analyses. The strengths of the methodology are in providing data that match clinical experiences, correspond to chart review, are consistent over time, can detect differences where differences would be anticipated, and might have implications for examining patient outcomes after long surgical times. We suggest that an understanding and application of large studies of surgical duration may help scientists explore selected questions concerning postoperative complications.
Sweeney, Sedona; Vassall, Anna; Foster, Nicola; Simms, Victoria; Ilboudo, Patrick; Kimaro, Godfather; Mudzengi, Don; Guinness, Lorna
2016-02-01
Out-of-pocket spending is increasingly recognized as an important barrier to accessing health care, particularly in low-income and middle-income countries (LMICs) where a large portion of health expenditure comes from out-of-pocket payments. Emerging universal healthcare policies prioritize reduction of poverty impact such as catastrophic and impoverishing healthcare expenditure. Poverty impact is therefore increasingly evaluated alongside and within economic evaluations to estimate the impact of specific health interventions on poverty. However, data collection for these metrics can be challenging in intervention-based contexts in LMICs because of study design and practical limitations. Using a set of case studies, this letter identifies methodological challenges in collecting patient cost data in LMIC contexts. These components are presented in a framework to encourage researchers to consider the implications of differing approaches in data collection and to report their approach in a standardized and transparent way. © 2016 The Authors. Health Economics published by John Wiley & Sons Ltd.
Microbiological safety of drinking water: United States and global perspectives.
Ford, T E
1999-01-01
Waterborne disease statistics only begin to estimate the global burden of infectious diseases from contaminated drinking water. Diarrheal disease is dramatically underreported and etiologies seldom diagnosed. This review examines available data on waterborne disease incidence both in the United States and globally together with its limitations. The waterborne route of transmission is examined for bacterial, protozoal, and viral pathogens that either are frequently associated with drinking water (e.g., Shigella spp.), or for which there is strong evidence implicating the waterborne route of transmission (e.g., Leptospira spp.). In addition, crucial areas of research are discussed, including risks from selection of treatment-resistant pathogens, importance of environmental reservoirs, and new methodologies for pathogen-specific monitoring. To accurately assess risks from waterborne disease, it is necessary to understand pathogen distribution and survival strategies within water distribution systems and to apply methodologies that can detect not only the presence, but also the viability and infectivity of the pathogen. PMID:10229718
The Incident Command System: a literature review.
Jensen, Jessica; Thompson, Steven
2016-01-01
Given the foundational and the fundamental role that the Incident Command System (ICS) is intended to play in on-scene response efforts across the United States, it is important to determine what is known about the system and how this is known. Accordingly, this study addresses the following research question: 'How has research explored the ICS?'. To probe this question, a methodological review of the scant, but widening, pool of research literature directly related to the ICS was conducted. This paper reports on the findings of the analysis related to the focus, theoretical frameworks, population and sampling, methods, results, and conclusions of the existing research literature. While undertaken using different methodological approaches, the ICS research suggests that the system may be limited in its usefulness. In addition, the paper discusses the implications of the research for the state of knowledge of the system and for the direction of future research. © 2016 The Author(s). Disasters © Overseas Development Institute, 2016.
Affinity chromatography: A versatile technique for antibody purification.
Arora, Sushrut; Saxena, Vikas; Ayyar, B Vijayalakshmi
2017-03-01
Antibodies continue to be widely utilized in myriad applications including basic research, imaging, targeted delivery, chromatography, diagnostics, and therapeutics. At the production stage, antibodies are generally present in complex matrices, and most of their intended applications necessitate purification. Antibody purification has always been a major bottleneck in downstream processing of antibodies, due to the need for high-quality products and the associated high costs. Over the years, extensive research has focused on finding better purification methodologies to overcome this holdup. Among a plethora of different techniques, affinity chromatography is one of the most selective, rapid, and easy methods for antibody purification. This review aims to provide a detailed overview of affinity chromatography and the components involved in purification. An array of support matrices and various classes of affinity ligands are discussed, detailing their underlying working principles together with the advantages and limitations of each system in purifying different types of antibodies, along with recent developments and important practical methodological considerations to optimize the purification procedure. Copyright © 2016 Elsevier Inc. All rights reserved.
Young-Wolff, Kelly C.; Enoch, Mary-Anne; Prescott, Carol A.
2011-01-01
Since 2005, a rapidly expanding literature has evaluated whether environmental factors such as socio-cultural context and environmental adversity interact with genetic influences on drinking behaviors. This article critically reviews empirical research on alcohol-related genotype-environment interactions (GxE) and provides a contextual framework for understanding how genetic factors combine with (or are shaped by) environmental influences to influence the development of drinking behaviors and alcohol use disorders. Collectively, evidence from twin, adoption, and molecular genetic studies indicates that the degree of importance of genetic influences on risk for drinking outcomes can vary in different populations and under different environmental circumstances. However, methodological limitations and lack of consistent replications in this literature make it difficult to draw firm conclusions regarding the nature and effect size of alcohol-related GxE. On the basis of this review, we describe several methodological challenges as they relate to current research on GxE in drinking behaviors and provide recommendations to aid future research. PMID:21530476
O'Brien, Mary R; Clark, David
2012-02-01
Stories about illness have proven invaluable in helping health professionals understand illness experiences. Such narratives have traditionally been solicited by researchers through interviews and the collection of personal writings, including diaries. These approaches are, however, researcher driven; the impetus for the creation of the story comes from the researcher and not the narrator. In recent years there has been exponential growth in illness narratives created by individuals, of their own volition, and made available for others to read in print or as Internet accounts. We sought to determine whether it was possible to identify such material for use as research data to explore the subject of living with the terminal illness amyotrophic lateral sclerosis/motor neuron disease--the contention being that these accounts are narrator driven and therefore focus on issues of greatest importance to the affected person. We encountered and sought to overcome a number of methodological and ethical challenges, which is our focus here.
Teaching and learning based on peer review: a realistic approach in forensic sciences.
Dinis-Oliveira, Ricardo Jorge; Magalhães, Teresa
2016-01-01
Teaching and learning methods need a continuous upgrade in higher education. However, it is also true that some of the modern methodologies do not reduce or prevent school failure. Perhaps the real limitation is the inability to identify the true reasons that may explain it, or the tendency to ignore or undervalue the problem. In our opinion, one of the current constraints of the teaching/learning process is the excessive and inadequate bibliography recommended by teachers, which results in continuous student difficulties and wasted time searching for and selecting useful information. The need to change the paradigm of the teaching/learning process also comes from employers, who demand forensic experts equipped with useful knowledge to face professional life. It is therefore mandatory to identify the new needs and opportunities regarding pedagogical methodologies. This article reflects on the recent importance of peer review in teaching/learning forensic sciences, based on the last 10 years of pedagogical experience inseparable from scientific activity.
Studies and methodologies on vaginal drug permeation.
Machado, Rita Monteiro; Palmeira-de-Oliveira, Ana; Gaspar, Carlos; Martinez-de-Oliveira, José; Palmeira-de-Oliveira, Rita
2015-09-15
The vagina stands as an important alternative to the oral route for those systemic drugs that are poorly absorbed orally or are rapidly metabolized by the liver. Drug permeation through the vaginal tissue can be estimated by using in vitro, ex vivo and in vivo models. The latter, although more realistic, entail ethical and biological limitations due to animal handling. Therefore, in vitro and ex vivo models have been developed to predict drug absorption through the vagina while allowing for simultaneous toxicity and pathogenesis studies. This review focuses on available methodologies to study vaginal drug permeation, discussing their advantages and drawbacks. The technical complexity, costs and ethical issues of an available model, along with its accuracy and reproducibility, will determine whether it is valid and applicable. Every model shall therefore be evaluated, validated and standardized in order to allow for extrapolation and presumption of results, thereby improving vaginal drug research and stressing its benefits. Copyright © 2015 Elsevier B.V. All rights reserved.
de Oliveira, Tatiane Milão; Augusto Peres, Jayme; Lurdes Felsner, Maria; Cristiane Justi, Karin
2017-08-15
Milk is an important food in the human diet due to its physico-chemical composition; therefore, it is necessary to monitor contamination by toxic metals such as Pb. Milk sample slurries were prepared using Triton X-100 and nitric acid for direct analysis of Pb using graphite furnace atomic absorption spectrometry (GF AAS). After dilution of the slurries, 10.00 µL were directly introduced into the pyrolytic graphite tube without use of a chemical modifier, which is an advantage for this type of matrix. The limits of detection and quantification were 0.64 and 2.14 µg L⁻¹, respectively. The figures of merit studied showed that the proposed methodology, without pretreatment of the raw milk sample and using external standard calibration, is suitable. The methodology was applied to milk samples from the Guarapuava region, in Paraná State (Brazil), and Pb concentrations ranged from 2.12 to 37.36 µg L⁻¹. Copyright © 2017 Elsevier Ltd. All rights reserved.
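The detection and quantification limits reported above are conventionally estimated from the calibration slope and the dispersion of blank replicates (LOD = 3.3·s/m, LOQ = 10·s/m). The sketch below illustrates that convention with invented calibration data, not the actual GF AAS measurements from this study.

```python
# Illustrative LOD/LOQ estimation from an external-standard calibration.
# All calibration and blank values below are invented for illustration.

def linear_fit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

def lod_loq(blank_signals, slope):
    """LOD = 3.3*s/m and LOQ = 10*s/m from blank replicates and slope."""
    n = len(blank_signals)
    mean = sum(blank_signals) / n
    sd = (sum((s - mean) ** 2 for s in blank_signals) / (n - 1)) ** 0.5
    return 3.3 * sd / slope, 10 * sd / slope

# Invented calibration: absorbance vs Pb concentration (µg/L)
conc = [0.0, 5.0, 10.0, 20.0, 40.0]
absorb = [0.001, 0.051, 0.099, 0.202, 0.401]
slope, intercept = linear_fit(conc, absorb)
lod, loq = lod_loq([0.001, 0.002, 0.001, 0.003, 0.002], slope)
print(f"slope={slope:.4f}, LOD={lod:.2f} µg/L, LOQ={loq:.2f} µg/L")
```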
Teaching and learning based on peer review: a realistic approach in forensic sciences
Dinis-Oliveira, Ricardo Jorge; Magalhães, Teresa
2016-01-01
Teaching and learning methods need continuous upgrading in higher education. However, it is also true that some modern methodologies do not reduce or prevent school failure. Perhaps the real limitation is the inability to identify the true reasons behind it, or a tendency to ignore or undervalue the problem. In our opinion, one of the current constraints of the teaching/learning process is the excessive and often inadequate bibliography recommended by teachers, which results in continuous student difficulties and wasted time in searching for and selecting useful information. The need to change the paradigm of the teaching/learning process also comes from employers, who demand forensic experts equipped with useful knowledge to face professional life. It is therefore mandatory to identify new needs and opportunities regarding pedagogical methodologies. This article reflects on the recent importance of peer review in teaching/learning forensic sciences, based on the last 10 years of pedagogical experience carried out inseparably from scientific activity. PMID:27547377
Magness, Scott T.; Puthoff, Brent J.; Crissey, Mary Ann; Dunn, James; Henning, Susan J.; Houchen, Courtney; Kaddis, John S.; Kuo, Calvin J.; Li, Linheng; Lynch, John; Martin, Martin G.; May, Randal; Niland, Joyce C.; Olack, Barbara; Qian, Dajun; Stelzner, Matthias; Swain, John R.; Wang, Fengchao; Wang, Jiafang; Wang, Xinwei; Yan, Kelley; Yu, Jian
2013-01-01
Fluorescence-activated cell sorting (FACS) is an essential tool for studies requiring isolation of distinct intestinal epithelial cell populations. Inconsistent or absent reporting of the critical parameters associated with FACS methodologies has complicated interpretation, comparison, and reproduction of important findings. To address this problem, a comprehensive multicenter study was designed to develop guidelines that limit experimental and data reporting variability and provide a foundation for accurate comparison of data between studies. Common methodologies and data reporting protocols for tissue dissociation, cell yield, cell viability, FACS, and postsort purity were established. Seven centers tested the standardized methods by FACS-isolating a specific crypt-based epithelial population (EpCAM+/CD44+) from murine small intestine. Genetic biomarkers for stem/progenitor (Lgr5 and Atoh1) and differentiated cell lineages (lysozyme, mucin2, chromogranin A, and sucrase isomaltase) were interrogated in target and control populations to assess intra- and intercenter variability. Wilcoxon's rank sum test on gene expression levels showed limited intracenter variability between biological replicates. Principal component analysis demonstrated significant intercenter reproducibility among four centers. Analysis of data collected by standardized cell isolation methods and data reporting requirements readily identified methodological problems, indicating that standard reporting parameters facilitate post hoc error identification. These results indicate that the complexity of FACS isolation of target intestinal epithelial populations can be highly reproducible between biological replicates and different institutions by adherence to common cell isolation methods and FACS gating strategies.
This study can be considered a foundation for continued method development and a starting point for investigators that are developing cell isolation expertise to study physiology and pathophysiology of the intestinal epithelium. PMID:23928185
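The intracenter comparison above relies on Wilcoxon's rank sum test between biological replicates. A minimal sketch of that test follows, using a large-sample normal approximation and invented expression values rather than the study's data.

```python
# Sketch of a Wilcoxon rank-sum comparison between two biological replicates.
# Expression values are invented; p-value uses the normal approximation.
import math

def rank_sum_test(a, b):
    """Two-sided Wilcoxon rank-sum test via normal approximation."""
    pooled = sorted((v, i) for i, v in enumerate(a + b))
    ranks = [0.0] * len(pooled)
    i = 0
    while i < len(pooled):                 # average ranks over ties
        j = i
        while j + 1 < len(pooled) and pooled[j + 1][0] == pooled[i][0]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[pooled[k][1]] = avg
        i = j + 1
    n1, n2 = len(a), len(b)
    w = sum(ranks[:n1])                    # rank sum of the first sample
    mu = n1 * (n1 + n2 + 1) / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (w - mu) / sigma
    p = math.erfc(abs(z) / math.sqrt(2))   # two-sided p-value
    return z, p

# Invented Lgr5 expression (arbitrary units) for two replicate sorts
rep1 = [5.1, 4.8, 5.3, 5.0, 4.9]
rep2 = [5.0, 5.2, 4.7, 5.1, 4.8]
z, p = rank_sum_test(rep1, rep2)
print(f"z={z:.2f}, p={p:.3f}")
```

A large p-value here is consistent with the limited intracenter variability the study reports between replicates.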
A methodology for determining rural public transportation needs in Virginia.
DOT National Transportation Integrated Search
1974-01-01
The need for rural public transportation is coming into focus, although its magnitude is unknown because of limited data. This study was initiated to develop an efficient and economical methodology for determining the transportation needs of the rura...
DOT National Transportation Integrated Search
2000-04-01
This report presents detailed analytic tools and results on dynamic response which are used to develop the safe dynamic performance limits of commuter passenger vehicles. The methodology consists of determining the critical parameters and characteris...
Particle Count Limits Recommendation for Aviation Fuel
2015-10-05
Particle Counter Methodology: particle counts are taken utilizing calibration methodologies and standardized cleanliness code ratings (ISO 11171, ISO ...). [Recovered from a flattened slide table of recommended particle count limits by specification and sampling point (receipt, vehicle fuel tank, fuel injector). Aviation Fuel: DEF (AUST) 5695B 18/16/13; Parker 18/16/13 and 14/10/7; Pamas/Parker/Particle Solutions 19/17/12; U.S. DOD 19/17/14/13*. Diesel Fuel: World Wide Fuel Charter 5th 18/16/13; DEF (AUST) 5695B 18/16/13; Caterpillar 18/16/13; Detroit Diesel 18/16/13; MTU.]
Testing for genetically modified organisms (GMOs): Past, present and future perspectives.
Holst-Jensen, Arne
2009-01-01
This paper presents an overview of GMO testing methodologies and how these have evolved and may evolve in the next decade. Challenges and limitations for the application of the test methods, as well as for the interpretation of results produced with them, are highlighted and discussed, bearing in mind the various interests and competences of the stakeholders involved. To better understand the suitability and limitations of detection methodologies, the evolution of transformation processes for the creation of GMOs is briefly reviewed.
Kitsiou, Spyros; Paré, Guy; Jaana, Mirou; Gerber, Ben
2017-01-01
Diabetes is a common chronic disease that places an unprecedented strain on health care systems worldwide. Mobile health technologies such as smartphones, mobile applications, and wearable devices, known as mHealth, offer significant and innovative opportunities for improving patient-to-provider communication and self-management of diabetes. The purpose of this overview is to critically appraise and consolidate evidence from multiple systematic reviews on the effectiveness of mHealth interventions for patients with diabetes to inform policy makers, practitioners, and researchers. A comprehensive search on multiple databases was performed to identify relevant systematic reviews published between January 1996 and December 2015. Two authors independently selected reviews, extracted data, and assessed the methodological quality of included reviews using AMSTAR. Fifteen systematic reviews published between 2008 and 2014 were eligible for inclusion. The quality of the reviews varied considerably and most of them had important methodological limitations. Focusing on systematic reviews that offered the most direct evidence, this overview demonstrates that on average, mHealth interventions improve glycemic control (HbA1c) compared to standard care or other non-mHealth approaches by as much as 0.8% for patients with type 2 diabetes and 0.3% for patients with type 1 diabetes, at least in the short term (≤12 months). However, limitations in the overall quality of evidence suggest that further research will likely have an important impact on these estimates of effect. Findings are consistent with clinically relevant improvements, particularly with respect to patients with type 2 diabetes. Similar to home telemonitoring, mHealth interventions represent a promising approach for self-management of diabetes.
van Mourik, Maaike S M; van Duijn, Pleun Joppe; Moons, Karel G M; Bonten, Marc J M; Lee, Grace M
2015-01-01
Objective: Measuring the incidence of healthcare-associated infections (HAI) is of increasing importance in current healthcare delivery systems. Administrative data algorithms, including (combinations of) diagnosis codes, are commonly used to determine the occurrence of HAI, either to support within-hospital surveillance programmes or as free-standing quality indicators. We conducted a systematic review evaluating the diagnostic accuracy of administrative data for the detection of HAI. Methods: Systematic search of Medline, Embase, CINAHL and Cochrane for relevant studies (1995–2013). Methodological quality assessment was performed using QUADAS-2 criteria; diagnostic accuracy estimates were stratified by HAI type and key study characteristics. Results: 57 studies were included, the majority aiming to detect surgical site or bloodstream infections. Study designs were very diverse regarding the specification of their administrative data algorithm (code selections, follow-up) and definitions of HAI presence. One-third of studies had important methodological limitations including differential or incomplete HAI ascertainment or lack of blinding of assessors. Observed sensitivity and positive predictive values of administrative data algorithms for HAI detection were very heterogeneous and generally modest at best, both for within-hospital algorithms and for formal quality indicators; accuracy was particularly poor for the identification of device-associated HAI such as central line associated bloodstream infections. The large heterogeneity in study designs across the included studies precluded formal calculation of summary diagnostic accuracy estimates in most instances. Conclusions: Administrative data had limited and highly variable accuracy for the detection of HAI, and their judicious use for internal surveillance efforts and external quality assessment is recommended.
If hospitals and policymakers choose to rely on administrative data for HAI surveillance, continued improvements to existing algorithms and their robust validation are imperative. PMID:26316651
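The accuracy measures this review pools are computed from a 2x2 table of algorithm flags against a reference-standard HAI determination. A minimal sketch with invented cell counts:

```python
# Diagnostic-accuracy measures from a hypothetical 2x2 table of
# administrative-data flags vs. reference-standard chart review.

def diagnostic_accuracy(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from 2x2 cell counts."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Invented counts: 40 true HAI flagged, 60 false alarms, 20 missed, 880 true negatives
m = diagnostic_accuracy(tp=40, fp=60, fn=20, tn=880)
print({k: round(v, 2) for k, v in m.items()})
```

The invented numbers mimic the pattern the review describes: acceptable specificity but modest sensitivity and a low positive predictive value.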
How to study deep roots—and why it matters
Maeght, Jean-Luc; Rewald, Boris; Pierret, Alain
2013-01-01
The drivers underlying the development of deep root systems, whether genetic or environmental, are poorly understood, but evidence has accumulated that deep rooting could be a more widespread and important trait among plants than commonly anticipated from their share of root biomass. Even though a distinct classification of "deep roots" is missing to date, deep roots provide important functions for individual plants such as nutrient and water uptake, but can also shape plant communities by hydraulic lift (HL). Subterranean fauna and microbial communities are highly influenced by resources provided in the deep rhizosphere, and deep roots can influence soil pedogenesis and carbon storage. Despite recent technological advances, the study of deep roots and their rhizosphere remains inherently time-consuming, technically demanding and costly, which explains why deep roots have yet to be given the attention they deserve. While state-of-the-art technologies are promising for laboratory studies involving relatively small soil volumes, they remain of limited use for the in situ observation of deep roots. Thus, basic techniques such as destructive sampling or observations at transparent interfaces with the soil (e.g., root windows), which have been known and used for decades to observe roots near the soil surface, must be adapted to the specific requirements of deep root observation. In this review, we successively address major physical, biogeochemical and ecological functions of deep roots to emphasize their significance and to illustrate how limited current knowledge remains. In the second part we describe the main methodological options to observe and measure deep roots, providing researchers interested in the field of deep root/rhizosphere studies with a comprehensive overview. The methodologies addressed are: excavations, trenches and soil coring approaches, minirhizotrons (MR), access shafts, caves and mines, and indirect approaches such as tracer-based techniques.
PMID:23964281
Force-controlled absorption in a fully-nonlinear numerical wave tank
NASA Astrophysics Data System (ADS)
Spinneken, Johannes; Christou, Marios; Swan, Chris
2014-09-01
An active control methodology for the absorption of water waves in a numerical wave tank is introduced. This methodology is based upon a force-feedback technique which has previously been shown to be very effective in physical wave tanks. Unlike other methods, a priori knowledge of the wave conditions in the tank is not required; the absorption controller is designed to respond automatically to a wide range of wave conditions. In comparison to numerical sponge layers, effective wave absorption is achieved on the boundary, thereby minimising the spatial extent of the numerical wave tank. In contrast to the imposition of radiation conditions, the scheme is inherently capable of absorbing irregular waves. Most importantly, simultaneous generation and absorption can be achieved. This is an important advance when considering inclusion of reflective bodies within the numerical wave tank. In designing the absorption controller, an infinite impulse response filter is adopted, thereby eliminating the problem of non-causality in the controller optimisation. Two alternative controllers are considered, both implemented in a fully-nonlinear wave tank based on a multiple-flux boundary element scheme. To simplify the problem under consideration, the present analysis is limited to water waves propagating in a two-dimensional domain. The paper presents an extensive numerical validation which demonstrates the success of the method for a wide range of wave conditions including regular, focused and random waves. The numerical investigation also highlights some of the limitations of the method, particularly in simultaneously generating and absorbing large amplitude or highly-nonlinear waves. The findings of the present numerical study are directly applicable to related fields where optimum absorption is sought; these include physical wavemaking, wave power absorption and a wide range of numerical wave tank schemes.
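The controller above is an infinite impulse response (IIR) filter acting on the measured force signal. The generic recursive filter below, with an arbitrary first-order low-pass as the example, illustrates the filtering operation only; it is not the controller transfer function from the paper.

```python
# Generic causal IIR filtering of a sampled signal, as used conceptually in
# a force-feedback absorption controller. Coefficients here are an invented
# first-order low-pass, for illustration only.

def iir_filter(b, a, x):
    """Causal IIR filter: a[0]*y[n] = sum_k b[k]*x[n-k] - sum_{k>=1} a[k]*y[n-k]."""
    y = []
    for n in range(len(x)):
        acc = sum(b[k] * x[n - k] for k in range(len(b)) if n - k >= 0)
        acc -= sum(a[k] * y[n - k] for k in range(1, len(a)) if n - k >= 0)
        y.append(acc / a[0])
    return y

# First-order low-pass: y[n] = 0.1*x[n] + 0.9*y[n-1]; the step response
# settles toward the DC gain 0.1 / (1 - 0.9) = 1.0
b, a = [0.1], [1.0, -0.9]
out = iir_filter(b, a, [1.0] * 50)
print(round(out[0], 3), round(out[-1], 3))
```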
Bala, Malgorzata M; Akl, Elie A; Sun, Xin; Bassler, Dirk; Mertz, Dominik; Mejza, Filip; Vandvik, Per Olav; Malaga, German; Johnston, Bradley C; Dahm, Philipp; Alonso-Coello, Pablo; Diaz-Granados, Natalia; Srinathan, Sadeesh K; Hassouneh, Basil; Briel, Matthias; Busse, Jason W; You, John J; Walter, Stephen D; Altman, Douglas G; Guyatt, Gordon H
2013-03-01
To compare methodological characteristics of randomized controlled trials (RCTs) published in higher vs. lower impact Core Clinical Journals. We searched MEDLINE for RCTs published in 2007 in Core Clinical Journals. We randomly sampled 1,140 study reports in a 1:1 ratio in higher (five general medicine journals with the highest total citations in 2007) and lower impact journals. Four hundred sixty-nine RCTs proved eligible: 219 in higher and 250 in lower impact journals. RCTs in higher vs. lower impact journals had larger sample sizes (median, 285 vs. 39), were more likely to receive industry funding (53% vs. 28%), declare concealment of allocation (66% vs. 36%), declare blinding of health care providers (53% vs. 41%) and outcome adjudicators (72% vs. 54%), report a patient-important primary outcome (69% vs. 50%), report subgroup analyses (64% vs. 26%), prespecify subgroup hypotheses (42% vs. 20%), and report a test for interaction (54% vs. 27%); P < 0.05 for all differences. RCTs published in higher impact journals were more likely to report methodological safeguards against bias and patient-important outcomes than those published in lower impact journals. However, sufficient limitations remain such that publication in a higher impact journal does not ensure low risk of bias. Copyright © 2013 Elsevier Inc. All rights reserved.
Byrne, Fiona; Grace, Rebekah; Tredoux, Jaimie; Kemp, Lynn
2016-06-01
Objective: The aims of the present paper were to: (1) review the research literature that contributes to an understanding of the role of volunteer home visiting programs in supporting the health and well being of families with young children; and (2) propose a conceptual model outlining service pathways for families in need of additional support. Methods: An integrative literature review method was used, with a mix of electronic and manual search methods for the period January 1980-January 2014. Forty-five studies were identified that met the inclusion criteria for review and were coded according to themes developed a priori. Results: There is little formal research that has examined the effectiveness of volunteer home visiting programs for supporting family health and well being. The available research suggests that volunteer home visiting programs provide socioemotional support through structured social relationships; however, there is limited empirical evidence to explicate the factors that contribute to these outcomes. Conclusion: In recognition of the importance of peer support for new parents, the not-for-profit sector has been involved in providing volunteer home visiting services to families for decades. However, the body of research to support this work is characterised by methodological limitations, and rigorous evidence is limited. What is clear anecdotally and qualitatively from the existing research is that parents who are in need of additional support value engagement with a community volunteer. These structured social relationships appear to fulfil a service need within the community, helping build bridges to support social networks, and thus complementing professional services and relationships. Overall, structured social relationships in the form of volunteer home visiting programs appear to provide an important pathway to support family health and well being.
Findings from the existing research are mixed and often characterised by methodological limitations, pointing to a need for further rigorous research. What is known about the topic? Volunteer family support programs have been an important part of the service landscape for vulnerable families, both nationally and internationally, for many years. Anecdotal reports suggest that this is a valued form of support that increases a sense of community connectedness and breaks down barriers for families in accessing other community support services. What does this paper add? This paper proposes a model identifying broad service pathways impacting on family health and well being that takes into account the importance of structured social relationships and social connectedness. What are the implications for practitioners? The proposed model may encourage discussion by practitioners and organisations interested in models of support for families who are socially isolated and/or in need of assistance to access and engage with services within the community.
Villeval, M; Carayol, M; Lamy, S; Lepage, B; Lang, T
2016-12-01
In the field of health, evidence-based medicine and associated methods like randomised controlled trials (RCTs) have become widely used. RCT has become the gold standard for evaluating causal links between interventions and health results. Originating in pharmacology, this method has been progressively expanded to medical devices, non-pharmacological individual interventions, as well as collective public health interventions. Its use in these domains has led to the formulation of several limits, and it has been called into question as an undisputed gold standard. Some of those limits (e.g. confounding biases and external validity) are common to these four different domains, while others are more specific. This paper describes the different limits, as well as several research avenues. Some are methodological reflections aiming at adapting RCT to the complexity of the tested interventions, and at overcoming some of its limits. Others are alternative methods. The objective is not to remove RCT from the range of evaluation methodologies, but to resituate it within this range. The aim is to encourage choosing between different methods according to the features and the level of the intervention to evaluate, thereby calling for methodological pluralism. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
Retinal image registration for eye movement estimation.
Kolar, Radim; Tornow, Ralf P; Odstrcilik, Jan
2015-01-01
This paper describes a novel methodology for eye fixation measurement using a unique video-ophthalmoscope setup and an advanced image registration approach. The representation of eye movements via the Poincaré plot is also introduced. The properties, limitations and perspectives of this methodology are discussed.
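The Poincaré plot mentioned above plots each sample of a signal against the next (x[n+1] vs x[n]) and summarizes the resulting cloud with the SD1/SD2 descriptors. A minimal sketch with an invented fixation trace, not data from this study:

```python
# Poincaré-plot descriptors of a 1-D eye-position trace: SD1 captures
# sample-to-sample (short-term) variability, SD2 the longer-term spread.
# The trace below is invented for illustration.
import math

def poincare_descriptors(x):
    """SD1 and SD2 widths of the Poincaré cloud of x[n+1] vs x[n]."""
    d1 = [(x[i + 1] - x[i]) / math.sqrt(2) for i in range(len(x) - 1)]
    d2 = [(x[i + 1] + x[i]) / math.sqrt(2) for i in range(len(x) - 1)]
    def sd(v):
        m = sum(v) / len(v)
        return math.sqrt(sum((u - m) ** 2 for u in v) / (len(v) - 1))
    return sd(d1), sd(d2)

trace = [0.0, 0.1, -0.05, 0.2, 0.15, -0.1, 0.05, 0.0, 0.1, -0.05]  # degrees
sd1, sd2 = poincare_descriptors(trace)
print(f"SD1={sd1:.3f}, SD2={sd2:.3f}")
```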
Evaluation Methods Sourcebook.
ERIC Educational Resources Information Center
Love, Arnold J., Ed.
The chapters commissioned for this book describe key aspects of evaluation methodology as they are practiced in a Canadian context, providing representative illustrations of recent developments in evaluation methodology as it is currently applied. The following chapters are included: (1) "Program Evaluation with Limited Fiscal and Human…
Using quality assessment tools to critically appraise ageing research: a guide for clinicians.
Harrison, Jennifer Kirsty; Reid, James; Quinn, Terry J; Shenkin, Susan Deborah
2017-05-01
Evidence-based medicine tells us that we should not accept published research at face value. Even research from established teams published in the highest impact journals can have methodological flaws, biases and limited generalisability. The critical appraisal of research studies can seem daunting, but tools are available to make the process easier for the non-specialist. Understanding the language and process of quality assessment is essential when considering or conducting research, and is also valuable for all clinicians who use published research to inform their clinical practice. We present a review written specifically for the practising geriatrician. This considers how quality is defined in relation to the methodological conduct and reporting of research. Having established why quality assessment is important, we present and critique tools which are available to standardise quality assessment. We consider five study designs: RCTs, non-randomised studies, observational studies, systematic reviews and diagnostic test accuracy studies. Quality assessment for each of these study designs is illustrated with an example of published cognitive research. The practical applications of the tools are highlighted, with guidance on their strengths and limitations. We signpost educational resources and offer specific advice for use of these tools. We hope that all geriatricians become comfortable with critical appraisal of published research and that use of the tools described in this review, along with awareness of their strengths and limitations, becomes a part of teaching, journal clubs and practice. © The Author 2016. Published by Oxford University Press on behalf of the British Geriatrics Society.
Integrated Response Time Evaluation Methodology for the Nuclear Safety Instrumentation System
NASA Astrophysics Data System (ADS)
Lee, Chang Jae; Yun, Jae Hee
2017-06-01
Safety analysis for a nuclear power plant establishes not only an analytical limit (AL) in terms of a measured or calculated variable but also an analytical response time (ART) required to complete protective action after the AL is reached. If the two constraints are met, the safety limit selected to maintain the integrity of physical barriers used for preventing uncontrolled radioactivity release will not be exceeded during anticipated operational occurrences and postulated accidents. Setpoint determination methodologies have actively been developed to ensure that the protective action is initiated before the process conditions reach the AL. However, regarding the ART for a nuclear safety instrumentation system, an integrated evaluation methodology considering the whole design process has not been systematically studied. In order to assure the safety of nuclear power plants, this paper proposes a systematic and integrated response time evaluation methodology that covers safety analyses, system designs, response time analyses, and response time tests. This methodology is applied to safety instrumentation systems for the advanced power reactor 1400 and the optimized power reactor 1000 nuclear power plants in South Korea. The quantitative evaluation results are provided herein. The evaluation results using the proposed methodology demonstrate that the nuclear safety instrumentation systems fully satisfy corresponding requirements of the ART.
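The core check in a response time evaluation of this kind is that the summed delays of the instrumentation chain stay within the analytical response time (ART) from the safety analysis. The sketch below illustrates that budget check; the component names and all numbers are invented, not values from the APR1400 or OPR1000 evaluations.

```python
# Hypothetical response-time budget check: total delay through the safety
# instrumentation chain must not exceed the analytical response time (ART).
# Component breakdown and all values are invented for illustration.

def response_time_margin(component_delays_ms, art_ms):
    """Total chain response time and remaining margin against the ART."""
    total = sum(component_delays_ms.values())
    return total, art_ms - total

delays = {"sensor": 150, "signal_processing": 250, "trip_logic": 100,
          "actuation": 400}   # milliseconds, hypothetical
total, margin = response_time_margin(delays, art_ms=1100)
print(f"total={total} ms, margin={margin} ms, ok={margin >= 0}")
```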
Concomitant prediction of function and fold at the domain level with GO-based profiles.
Lopez, Daniel; Pazos, Florencio
2013-01-01
Predicting the function of newly sequenced proteins is crucial due to the pace at which these raw sequences are being obtained. Almost all resources for predicting protein function assign functional terms to whole chains and do not distinguish which particular domain is responsible for the allocated function. This is not a limitation of the methodologies themselves; rather, in the databases of functional annotations these methods use for transferring functional terms to new proteins, the annotations are made on a whole-chain basis. Nevertheless, domains are the basic evolutionary, and often functional, units of proteins. In many cases, the domains of a protein chain have distinct molecular functions, independent from each other. For that reason, resources with functional annotations at the domain level, as well as methodologies for predicting function for individual domains adapted to these resources, are required. We present a methodology for predicting the molecular function of individual domains, based on a previously developed database of functional annotations at the domain level. The approach, which we show outperforms a standard method based on sequence searches in assigning function, concomitantly predicts the structural fold of the domains and can give hints on the functionally important residues associated with the predicted function.
Major challenges for correlational ecological niche model projections to future climate conditions.
Peterson, A Townsend; Cobos, Marlon E; Jiménez-García, Daniel
2018-06-20
Species-level forecasts of distributional potential and likely distributional shifts, in the face of changing climates, have become popular in the literature in the past 20 years. Many refinements have been made to the methodology over the years, and the result has been an approach that considers multiple sources of variation in geographic predictions, and how that variation translates into both specific predictions and uncertainty in those predictions. Although numerous previous reviews and overviews of this field have pointed out a series of assumptions and caveats associated with the methodology, three aspects of the methodology have important impacts but have not been treated previously in detail. Here, we assess those three aspects: (1) effects of niche truncation on model transfers to future climate conditions, (2) effects of model selection procedures on future-climate transfers of ecological niche models, and (3) relative contributions of several factors (replicate samples of point data, general circulation models, representative concentration pathways, and alternative model parameterizations) to overall variance in model outcomes. Overall, the view is one of caution: although resulting predictions are fascinating and attractive, this paradigm has pitfalls that may bias and limit confidence in niche model outputs as regards the implications of climate change for species' geographic distributions. © 2018 New York Academy of Sciences.
Mogasale, Vittal; Mogasale, Vijayalaxmi V; Ramani, Enusa; Lee, Jung Seok; Park, Ju Yeon; Lee, Kang Sung; Wierzba, Thomas F
2016-01-29
Because the control of typhoid fever is an important public health concern in low and middle income countries, improving typhoid surveillance will help in planning and implementing typhoid control activities such as deployment of new generation Vi conjugate typhoid vaccines. We conducted a systematic literature review of longitudinal population-based blood culture-confirmed typhoid fever studies from low and middle income countries published from 1 January 1990 to 31 December 2013. We quantitatively summarized typhoid fever incidence rates and qualitatively reviewed study methodology that could have influenced rate estimates. We used a meta-analysis approach based on a random-effects model in summarizing the hospitalization rates. Twenty-two papers presented longitudinal population-based and blood culture-confirmed typhoid fever incidence estimates from 20 distinct sites in low and middle income countries. The reported incidence and hospitalization rates were heterogeneous, as was the study methodology across the sites. We elucidated how the incidence rates were underestimated in published studies. We summarized six categories of underestimation biases observed in these studies and presented potential solutions. Published longitudinal typhoid fever studies in low and middle income countries are geographically clustered and the methodology employed has a potential for underestimation. Future studies should account for these limitations.
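The random-effects pooling mentioned above is commonly done with the DerSimonian-Laird estimator, which inflates each study's variance by an estimated between-study variance before averaging. A minimal sketch with invented per-site values, not the rates from the reviewed studies:

```python
# DerSimonian-Laird random-effects pooling of invented per-site estimates
# (log incidence rates and their variances); illustration only.
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate, its standard error, and tau^2."""
    w = [1 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                 # between-study variance
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return pooled, se, tau2

# Invented log incidence rates (per 100,000 person-years) from five sites
log_rates = [4.2, 3.8, 4.6, 4.0, 4.4]
variances = [0.04, 0.06, 0.05, 0.03, 0.08]
pooled, se, tau2 = dersimonian_laird(log_rates, variances)
print(f"pooled log-rate={pooled:.2f} (SE {se:.2f}), tau^2={tau2:.3f}")
```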
An approach to accidents modeling based on compounds road environments.
Fernandes, Ana; Neves, Jose
2013-04-01
The most common approach to studying the influence of certain road features on accidents has been the consideration of uniform road segments characterized by a unique feature. However, when an accident is related to the road infrastructure, its cause is usually not a single characteristic but rather a complex combination of several characteristics. The main objective of this paper is to describe a methodology developed in order to consider the road as a complete environment by using compound road environments, overcoming the limitations inherent in considering only uniform road segments. The methodology consists of: dividing a sample of roads into segments; grouping them into fairly homogeneous road environments using cluster analysis; and identifying the influence of skid resistance and texture depth on road accidents in each environment by using generalized linear models. The application of this methodology is demonstrated for eight roads. Based on real data from accidents and road characteristics, three compound road environments were established in which the pavement surface properties significantly influence the occurrence of accidents. Results clearly showed that road environments where braking maneuvers are more common, or those with small radii of curvature and high speeds, require higher skid resistance and texture depth as an important contribution to accident prevention. Copyright © 2013 Elsevier Ltd. All rights reserved.
Muskett, Tom; Body, Richard
2013-01-01
Conversation analysis (CA) continues to accrue interest within clinical linguistics as a methodology that can enable elucidation of structural and sequential orderliness in interactions involving participants who produce ostensibly disordered communication behaviours. However, it can be challenging to apply CA to re-examine clinical phenomena that have initially been defined in terms of linguistics, as a logical starting point for analysis may be to focus primarily on the organisation of language ("talk") in such interactions. In this article, we argue that CA's methodological power can only be fully exploited in this research context when a multimodal analytic orientation is adopted, where due consideration is given to participants' co-ordinated use of multiple semiotic resources including, but not limited to, talk (e.g., gaze, embodied action, object use and so forth). To evidence this argument, a two-layered analysis of unusual question-answer sequences in a play episode involving a child with autism is presented. It is thereby demonstrated that only when the scope of enquiry is broadened to include gaze and other embodied action can an account be generated of orderliness within these sequences. This finding has important implications for CA's application as a research methodology within clinical linguistics.
Tracking of childhood overweight into adulthood: a systematic review of the literature.
Singh, A S; Mulder, C; Twisk, J W R; van Mechelen, W; Chinapaw, M J M
2008-09-01
Overweight and obesity in youth are important public health concerns and are of particular interest because of possible long-term associations with adult weight status and morbidity. The aim of this study was to systematically review the literature and update the evidence concerning the persistence of childhood overweight. A computerized bibliographical search--restricted to studies with a prospective or retrospective longitudinal design--was conducted. Two authors independently extracted data and assessed the methodological quality of the included studies on four dimensions: (i) study population and participation rate; (ii) study attrition; (iii) data collection; and (iv) data analysis. Conclusions were based on a rating system of three levels of evidence. A total of 25 publications were selected for inclusion in this review. According to the methodological quality assessment, 13 studies were considered to be of high quality. The majority of these high-quality studies were published after 2001, indicating that recently published data, in particular, provide us with reliable information. All included studies consistently report an increased risk of overweight and obese youth becoming overweight adults, suggesting that the likelihood of persistence of overweight into adulthood is moderate for overweight and obese youth. However, predictive values varied considerably. Limiting aspects with respect to generalizability and methodological issues are discussed.
Nadal, Ana; Alamús, Ramón; Pipia, Luca; Ruiz, Antonio; Corbera, Jordi; Cuerva, Eva; Rieradevall, Joan; Josa, Alejandro
2017-12-01
The integration of rooftop greenhouses (RTGs) in urban buildings is a practice that is becoming increasingly important worldwide for its contribution to food security and sustainable development. However, the supply of tools and procedures to facilitate their implementation at the city scale is limited and laborious. This work aims to develop a specific and automated methodology for assessing the feasibility of implementing rooftop greenhouses in non-residential urban areas, using airborne sensors. The use of Light Detection and Ranging (LIDAR) and Long Wave Infrared (LWIR) data from the Leica ALS50-II and TASI-600 sensors allows for the identification of building roof parameters (area, slope, materials, and solar radiation) to determine the potential for constructing an RTG. This development represents an improvement in time and accuracy with respect to the previous methodology, in which all the relevant information had to be acquired manually. The methodology has been applied and validated in a case study corresponding to a non-residential urban area in the industrial municipality of Rubí, Barcelona (Spain). Based on this practical application, an area of 36,312 m² out of a total roof area of 1,243,540 m² was identified as having ideal characteristics for the construction of RTGs. This area can produce approximately 600 tons of tomatoes per year, which represents the average yearly consumption of about 50% of Rubí's total population. The use of this methodology also facilitates the decision-making process in urban agriculture, allowing quick identification of optimal surfaces for the future implementation of urban agriculture in housing. It also opens new avenues for the use of airborne technology in environmental topics in cities. Copyright © 2017 Elsevier B.V. All rights reserved.
Data sources and methods for ascertaining human exposure to drugs.
Jones, J K; Kennedy, D L
Estimates of population exposure based on drug use data are critical elements in the post-marketing surveillance of drugs and provide a context for assessing the various risks and benefits associated with drug treatment. Such information is important in predicting morbidity and in planning public health protection strategies, in-depth studies, and regulatory actions. Knowledge that a population of one thousand instead of one million may potentially be exposed to a drug can help determine how a particular regulatory problem will be handled and would obviously be a major determinant in designing a case-control or cohort study. National estimates of drug use give an overview of the most commonly used drug therapies in current practice. They also furnish valuable comparison data for specific studies of drug use limited to one group of drugs, one geographic region, or one medical care setting. The FDA has access to several different national drug use databases, each measuring a different point in the drug distribution channels. None covers the entire spectrum of drug exposures. The major "holes" in this patchwork of databases are the inability to measure OTC drug use with any accuracy and the lack of qualitative information on drug use in hospitals. In addition, there is no patient linkage within the data. The data can only show trends in drug use. They impart no sense of the longitudinal use of drugs for individual patients. There is no direct connection between the different databases, all of which have their own sampling frames and their own projection methodologies. The market research companies have complete control over these methodologies, and they are subject to periodic changes, a situation not entirely satisfactory for epidemiologic research. Sometimes it is a struggle to keep up with these changes. Over the past two years, every one of these databases has undergone some type of sampling or projection methodology change.
One important limitation to the use of all of these databases is that they are subscription databases; that is, the FDA purchases the data under contract to the marketing research companies and by doing so assumes certain contract obligations. Anytime the FDA wants to release any data outside of the Agency, it must first notify the company in sufficient time for review and approval. Subscribing to these databases is costly, but the subscription cost is insignificant compared to the estimated cost of duplicating these services. In spite of all of the limitations of these systems, there are obvious advantages. (ABSTRACT TRUNCATED AT 400 WORDS)
2017-06-01
More importantly, it examines the methodology used to build the class IX block embarked on ship prior to deployment. The class IX block is defined as a repository...compared to historical data to evaluate model and simulation outputs. This thesis provides recommendations on improving the methodology implemented in...improving the level of organic support available to deployed units.
Caird, Jeff K; Johnston, Katherine A; Willness, Chelsea R; Asbridge, Mark
2014-06-01
Three important and inter-related topics are addressed in this paper. First, the importance of meta-analysis and research synthesis methods for combining studies on traffic safety in general, and on driver distraction in particular, is briefly reviewed. Second, naturalistic, epidemiologic, and driving simulation studies on driver distraction are used to illustrate convergent and divergent results that have accumulated thus far in this domain of research. In particular, meta-analyses and research syntheses of naturalistic studies of mobile phone conversation, passenger presence, and text messaging illustrate important patterns of results that are in need of more in-depth study. Third, a number of driver distraction study limitations, such as poorly defined dependent variables, lack of methodological detail, and omission of statistical information, prevent the integration of many studies into meta-analyses. In addition, the overall quality of road safety studies suffers from these same limitations, and suggestions for improvement are made to guide researchers and reviewers. Practical applications: research synthesis and meta-analysis provide comprehensive estimates of the impact of distractions on driving performance, which can be used to guide public policy and future research. Copyright © 2014 National Safety Council and Elsevier Ltd. All rights reserved.
Monge, Susana; Ronda, Elena; Pons-Vigués, Mariona; Vives Cases, Carmen; Malmusi, Davide; Gil-González, Diana
2015-01-01
Our objective was to describe the methodological limitations and recommendations identified by authors of original articles on immigration and health in Spain. A literature review was conducted of original articles published in Spanish or English between 1998 and 2012 combining keywords on immigration and health. A total of 311 articles were included; of these, 176 (56.6%) mentioned limitations, and 15 (4.8%) made recommendations. The most frequently mentioned limitations included the following: reduced sample sizes; internal validity and sample representativeness issues, with under- or overrepresentation of specific groups; problems of validity of the collected information and missing data mostly related to measurement tools; and absence of key variables for adjustment or stratification. Based on these results, a series of recommendations are proposed to minimise common limitations and advance the quality of scientific production on immigration and health in our setting. Copyright © 2015 SESPAS. Published by Elsevier Espana. All rights reserved.
Golaszewski, T
2001-01-01
To examine the literature from the past 20 years and identify those studies that support the economic merit of health promotion. A panel of experts was used to identify the top studies supporting the purpose of this article. Studies were chosen based on the following criteria: the study (1) examined the relationship between health risks and financial outcomes, or health promotion programs and financial outcomes; (2) provided strong and compelling financial data supporting the worth of health promotion; (3) had a high-quality methodology; (4) answered an important question or replicated important findings with superior methodology; and (5) represented U.S.-based initiatives published since 1980. After initially nominating a group of studies for consideration, panelists rated each on a scale from 1 to 3 representing their opinion of importance. Studies rating the highest were included for this discussion. Studies were analyzed by population characteristics, design, statistical tests, limitations, and results. This information was summarized for each identified article. A relationship between modifiable health risk factors and health care costs is supported by research. Health promotion interventions appear to provide positive financial returns, most notably for health care costs and absenteeism reduction. Private sector initiatives seem to be driving economic-based research. Overall, health promotion shows promising results for providing financial advantages for its sponsors; however, if this discipline is to show its true worth, considerable funding is needed from government or philanthropic sources to cover the substantial costs of quality research.
On the Evolving Nature of Exposure Therapy
ERIC Educational Resources Information Center
Schare, Mitchell L.; Wyatt, Kristin P.
2013-01-01
Four articles examining methodological applications of exposure therapy and its limited dissemination were briefly reviewed. Methodological articles included those by Abramowitz et al., Gryczkowski et al., and Weiner and McKay, which addressed couple treatment of obsessive-compulsive disorder (OCD), modification of evidence-based anxiety…
Patrick, Kevin; Wolszon, Laura; Basen-Engquist, Karen M; Demark-Wahnefried, Wendy; Prokhorov, Alex V; Barrera, Stephanie; Baru, Chaitan; Farcas, Emilia; Krueger, Ingolf; Palmer, Doug; Raab, Fred; Rios, Phil; Ziftci, Celal; Peterson, Susan
2011-03-01
Improved approaches and methodologies are needed to conduct comparative effectiveness research (CER) in oncology. While cancer therapies continue to emerge at a rapid pace, the review, synthesis, and dissemination of evidence-based interventions across clinical trials lag in comparison. Rigorous and systematic testing of competing therapies has been clouded by age-old problems: poor patient adherence, inability to objectively measure the environmental influences on health, lack of knowledge about patients' lifestyle behaviors that may affect cancer's progression and recurrence, and limited ability to compile and interpret the wide range of variables that must be considered in cancer treatment. This lack of data integration limits the potential for patients and clinicians to engage in fully informed decision-making regarding cancer prevention, treatment, and survivorship care, and the translation of research results into mainstream medical care. Particularly important, as noted in a 2009 report on CER to the President and Congress, the limited focus on health behavior-change interventions was a major hindrance in this research landscape (DHHS 2009). This paper describes an initiative to improve CER for cancer by addressing several of these limitations. The Cyberinfrastructure for Comparative Effectiveness Research (CYCORE) project, informed by the National Science Foundation's 2007 report "Cyberinfrastructure Vision for 21st Century Discovery", has, as its central aim, the creation of a prototype for a user-friendly, open-source cyberinfrastructure (CI) that supports acquisition, storage, visualization, analysis, and sharing of data important for cancer-related CER.
Although still under development, the process of gathering requirements for CYCORE has revealed new ways in which CI design can significantly improve the collection and analysis of a wide variety of data types, and has resulted in new and important partnerships among cancer researchers engaged in advancing health-related CI.
Key challenges for nanotechnology: Standardization of ecotoxicity testing.
Cerrillo, Cristina; Barandika, Gotzone; Igartua, Amaya; Areitioaurtena, Olatz; Mendoza, Gemma
2017-04-03
Nanotechnology is expected to contribute to the protection of the environment, but many uncertainties exist regarding the environmental and human implications of manufactured nanomaterials (MNMs). Contradictory results have been reported for their ecotoxicity to aquatic organisms, which constitute one of the most important pathways for their entry into and transfer throughout the food web. The present review is focused on the international strategies that are laying the foundations of the ecotoxicological assessment of MNMs. Specific advice is provided on the preparation of MNM dispersions in the culture media of the organisms, which is considered a key factor in overcoming the limitations in the standardization of the test methodologies.
Design and analysis of post-marketing research.
Zhou, Xiao-Hua Andrew; Yang, Wei
2013-07-01
A post-marketing study is an integral part of research that helps to ensure a favorable risk-benefit profile for approved drugs used in the market. Because most post-marketing studies use observational designs, which are liable to confounding, estimation of the causal effect of a drug versus a comparator is very challenging. This article focuses on methodological issues of importance in designing and analyzing studies to evaluate the safety of marketed drugs, especially marketed traditional Chinese medicine (TCM) products. Advantages and limitations of the current designs and analytic methods for post-marketing studies are discussed, and recommendations are given for improving the validity of post-marketing studies of TCM products.
Did the American Academy of Orthopaedic Surgeons osteoarthritis guidelines miss the mark?
Bannuru, Raveendhara R; Vaysbrot, Elizaveta E; McIntyre, Louis F
2014-01-01
The American Academy of Orthopaedic Surgeons (AAOS) 2013 guidelines for knee osteoarthritis recommended against the use of viscosupplementation for failing to meet the criterion of minimum clinically important improvement (MCII). However, the AAOS's methodology contained numerous flaws in obtaining, displaying, and interpreting MCII-based results. The current state of research on MCII allows it to be used only as a supplementary instrument, not a basis for clinical decision making. The AAOS guidelines should reflect this consideration in their recommendations to avoid condemning potentially viable treatments in the context of limited available alternatives. Copyright © 2014 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.
Gomez-Ramirez, Jaime; Sanz, Ricardo
2013-09-01
One of the most important scientific challenges today is the quantitative and predictive understanding of biological function. Classical mathematical and computational approaches have been enormously successful in modeling inert matter, but they may be inadequate to address inherent features of biological systems. We address the conceptual and methodological obstacles that lie in the inverse problem in biological systems modeling. We introduce a full Bayesian approach (FBA), a theoretical framework to study biological function, in which probability distributions are conditional on biophysical information that physically resides in the biological system that is studied by the scientist. Copyright © 2013 Elsevier Ltd. All rights reserved.
Tree physiology research in a changing world.
Kaufmann, Merrill R.; Linder, Sune
1996-01-01
Changes in issues and advances in methodology have contributed to substantial progress in tree physiology research during the last several decades. Current research focuses on process interactions in complex systems and the integration of processes across multiple spatial and temporal scales. An increasingly important challenge for future research is assuring sustainability of production systems and forested ecosystems in the face of increased demands for natural resources and human disturbance of forests. Meeting this challenge requires significant shifts in research approach, including the study of limitations of productivity that may accompany achievement of system sustainability, and a focus on the biological capabilities of complex land bases altered by human activity.
[Enzymatic analysis of the quality of foodstuffs].
Kolesnov, A Iu
1997-01-01
Enzymatic analysis is a distinct branch of enzymology and analytical chemistry and has become one of the most important methodologies used in food analysis. It allows the quick, reliable determination of many food ingredients. Often these components cannot be determined by conventional methods, or, where methods are available, only with limited accuracy. Today, methods of enzymatic analysis are increasingly used in the investigation of foodstuffs. Enzymatic measurement techniques are used in industrial, scientific, and food inspection laboratories for quality analysis. This article describes the requirements of an optimal analytical method: specificity, sample preparation, assay performance, precision, sensitivity, time requirement, analysis cost, and safety of reagents.
NASA Astrophysics Data System (ADS)
Mirotta, S.; Guillot, J.; Chevalier, V.; Biard, B.
2018-01-01
The study of Reactivity Initiated Accidents (RIA) is important to determine the limits up to which nuclear fuels can withstand such accidents without clad failure. The CABRI International Program (CIP), conducted by IRSN under an OECD/NEA agreement, has been launched to perform representative RIA Integral Effect Tests (IET) on real irradiated fuel rods under prototypical Pressurized Water Reactor (PWR) conditions. For this purpose, the CABRI experimental pulse reactor, operated by CEA in Cadarache, France, has been extensively renovated and equipped with a pressurized water loop. The behavior of the test rod, located in that loop at the center of the driver core, is followed in real time during the power transients thanks to the hodoscope, a unique online fuel motion monitoring system and one of the major distinctive features of CABRI. The hodoscope measures the fast neutrons emitted by the tested rod during the power pulse with a complete set of 153 Fission Chambers and 153 Proton Recoil Counters. During the CABRI facility renovation, the electronic chain of these detectors was upgraded. In this paper, the performance of the new system is presented, describing the gain calibration methodology used to obtain the maximal signal-to-noise ratio for the amplification modules, the threshold tuning methodology for the discrimination modules (old and new), and the linear detector response limit versus reactor power for the whole electronic chain.
Pano-Farias, Norma S; Ceballos-Magaña, Silvia G; Gonzalez, Jorge; Jurado, José M; Muñiz-Valencia, Roberto
2015-04-01
To improve the analysis of pesticides in complex food matrices of economic importance, alternative chromatographic techniques, such as supercritical fluid chromatography, can be used. Supercritical fluid chromatography has barely been applied to pesticide analysis in food matrices. In this paper, an analytical method using supercritical fluid chromatography coupled to photodiode array detection has been established for the first time for the quantification of pesticides in papaya and avocado. The extraction of methyl parathion, atrazine, ametryn, carbofuran, and carbaryl was performed using the quick, easy, cheap, effective, rugged, and safe (QuEChERS) methodology. The method was validated using papaya and avocado samples. For papaya, the correlation coefficient values were higher than 0.99; limits of detection and quantification ranged from 130 to 380 and from 220 to 640 μg/kg, respectively; recovery values ranged from 72.8 to 94.6%; precision was lower than 3%. For avocado, limit of detection values were <450 μg/kg; precision was lower than 11%; recoveries ranged from 50.0 to 94.2%. Method feasibility was tested on lime, banana, mango, and melon samples. Our results demonstrate that the proposed method is applicable to methyl parathion, atrazine, ametryn, and carbaryl, toxic pesticides used worldwide. The methodology presented in this work could be applicable to other fruits. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
[The new German general threshold limit value for dust--pro and contra the adoption in Austria].
Godnic-Cvar, Jasminka; Ponocny, Ivo
2004-01-01
Since it has been realised that inhalation of inert dust is one of the important confounding variables for the development of chronic bronchitis, the threshold values for occupational exposure to these dusts need to be further decreased. The German Commission for the Investigation of Health Hazards of Chemical Compounds in the Work Area (MAK-Commission) set a new threshold (MAK-Value) for inert dusts (4 mg/m3 for inhalable dust, 1.5 mg/m3 for respirable dust) in 1997. This value is much lower than the threshold values currently used world-wide. The aim of the present article is to assess the scientific plausibility of the methodology (databases and statistics) used to set these new German MAK-Values, regarding their adoption in Austria. Although we believe that it is essential to lower the MAK-Value for inert dust in order to prevent the development of chronic bronchitis as a consequence of occupational exposure to inert dusts, the methodology applied by the German MAK-Commission in 1997 to set the new MAK-Values does not justify the reduction of the threshold limit value. A carefully designed study to establish an appropriate scientific basis for setting a new threshold value for inert dusts in the workplace should be carried out. Meanwhile, at least the currently internationally applied threshold values should be adopted in Austria.
Super-Sensitive and Robust Biosensors from Supported Polymer Bilayers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paxton, Walter F.
2015-09-01
Biological organisms are potentially the most sensitive and selective detection systems known, yet we are currently severely limited in our ability to exploit biological interactions in sensory devices, due in part to the limited stability of biological systems and derived materials. This proposal addresses an important aspect of integrating biological sensory materials into a solid-state device. If successful, such technology could enable entirely new classes of robust biosensors that could be miniaturized and deployed in the field. The critical aims of the proposed work were: 1) the calibration of a more versatile approach to measuring pH; 2) the use of this method to monitor pH changes caused by the light-induced pumping of protons across vesicles with bacteriorhodopsin integrated into the membranes (either polymer or lipid); 3) the preparation of bilayer assemblies on platinum surfaces; and 4) the enhanced detection of light-induced pH changes driven by bR-loaded supported bilayers. I have developed a methodology that may enable such measurements at interfaces, and a methodology to characterize the functionality of bilayer membranes with reconstituted membrane proteins. The integrity of the supported bilayer films, however, must be optimized before the work originally envisioned in the proposal can be fully realized. Nevertheless, the work performed on this project, and the encouraging results it has produced, has demonstrated that these goals are challenging yet within reach.
Engineering Large Animal Species to Model Human Diseases.
Rogers, Christopher S
2016-07-01
Animal models are an important resource for studying human diseases. Genetically engineered mice are the most commonly used species and have made significant contributions to our understanding of basic biology, disease mechanisms, and drug development. However, they often fail to recreate important aspects of human diseases and thus can have limited utility as translational research tools. Developing disease models in species more similar to humans may provide a better setting in which to study disease pathogenesis and test new treatments. This unit provides an overview of the history of genetically engineered large animals and the techniques that have made their development possible. Factors to consider when planning a large animal model, including choice of species, type of modification and methodology, characterization, production methods, and regulatory compliance, are also covered. Copyright © 2016 John Wiley & Sons, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhattarai, Bishnu; Kouzelis, Konstantinos; Mendaza, Iker
The gradual penetration of active loads in low voltage distribution grids is expected to challenge their network capacity in the near future. Distribution system operators should for this reason resort either to costly grid reinforcements or to demand side management mechanisms. Since demand side management is usually cheaper to implement, it is also the favorable solution. To this end, this article presents a framework for handling grid limit violations, both voltage and current, to ensure secure and qualitative operation of the distribution grid. This framework consists of two steps, namely a proactive centralized and subsequently a reactive decentralized control scheme. The former is employed to balance the one-hour-ahead load, while the latter aims at regulating consumption in real time. In both cases, the importance of fair use of electricity demand flexibility is emphasized. Thus, it is demonstrated that this methodology aids in keeping the grid status within preset limits while utilizing flexibility from all participants.
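As a toy illustration of the "fair use of flexibility" idea emphasized above, the reactive step might spread a required real-time reduction across participants in proportion to the flexibility each offers. This is a sketch under our own assumptions, not the control scheme from the article; the function name and the proportional-sharing rule are hypothetical.

```python
def fair_curtailment(flex_offers_kw, required_reduction_kw):
    """Split a required load reduction across flexible participants in
    proportion to the flexibility each offers, so that no single
    participant bears a disproportionate share of the curtailment.
    Returns the per-participant reductions in kW."""
    total = sum(flex_offers_kw)
    if total == 0:
        return [0.0] * len(flex_offers_kw)
    # Never curtail more than the total flexibility on offer
    share = min(required_reduction_kw, total) / total
    return [offer * share for offer in flex_offers_kw]

# Example: a 50 kW limit violation shared among three participants
print(fair_curtailment([10.0, 30.0, 60.0], 50.0))  # [5.0, 15.0, 30.0]
```

A real scheme would also respect per-device constraints and comfort limits; the point here is only that proportional sharing keeps every participant's relative contribution equal.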
Uninterrupted and reusable source for the controlled growth of nanowires
Sugavaneshwar, R. P.; Nanda, Karuna Kar
2013-01-01
Generally, the length of oxide nanowires grown by vapor phase transport is limited by the degradation of the source materials. Furthermore, the source material is typically used only once for nanowire growth. By exploiting the Si-Zn phase diagram, we have developed a simple methodology for the non-catalytic growth of ultralong ZnO nanowires over a large area with controllable aspect ratio and branched structures. The insolubility of Zn in Si and the use of a Si cap on the Zn source to prevent local oxidation of the Zn source (i.e., preventing degradation of the source) are the keys to growing longer nanowires without limitations. It has been shown that the aspect ratio can be controlled thermodynamically (temperature) and, more importantly, kinetically (vapor flux). One of the interesting findings is that the same source material can be used for several depositions of oxide nanostructured materials. PMID:23412010
Noah, Aggie J.
2015-01-01
Neighborhood is an important context in which individuals and families are embedded. Yet family studies researchers have been relatively slow to incorporate spatial approaches into family science. Although limited theoretical and methodological attention has been devoted to families in neighborhood-effects research, family scholars can contribute greatly to theories about neighborhood effects, and neighborhood-effects research can help move the field of family studies forward. This article reviews the theories, applications, and limitations of research on neighborhood effects and discusses how family studies can benefit from incorporating a spatial perspective from neighborhood-effects research. I then present an innovative methodology—referred to as activity spaces—emerging in neighborhood-effects research, and I discuss how this approach can be used to better understand the complexity and heterogeneity of families. Last, I highlight ways to incorporate space into family studies by “putting families into place.” PMID:26681979
Approaches for Assessing Olfaction in Children with Autism Spectrum Disorder.
Kumazaki, Hirokazu; Okamoto, Masako; Kanzaki, Sho; Okada, Ken-Ichi; Mimura, Masaru; Minabe, Yoshio; Kikuchi, Mitsuru
2018-01-01
Olfactory traits in individuals with autism spectrum disorder (ASD) are considered the strongest predictors of social impairment. Compared to other sensory abnormalities, olfactory abnormalities in individuals with ASD are poorly understood. In this chapter, we provide an overview of the current assessment of olfaction in individuals with ASD. Several confounding factors have to be considered when conducting research on olfaction in individuals with ASD. Qualitative measures of olfaction contain only limited information about the olfactory stimuli. In addition, little systematic information is available about individuals' actual use of olfaction in daily life. Only a limited number of experimental studies have performed quantitative measurements of olfactory abnormalities in ASD. Therefore, clarifying the relationship between olfactory traits and the influence of real-life situations in a laboratory setting is very difficult. Some new methodologies for measuring olfactory traits are gradually becoming available. New methods that reveal important links between ASD and olfactory traits should be developed in the future.
Manktelow, Bradley N.; Seaton, Sarah E.
2012-01-01
Background Emphasis is increasingly being placed on the monitoring and comparison of clinical outcomes between healthcare providers. Funnel plots have become a standard graphical methodology to identify outliers and comprise plotting an outcome summary statistic from each provider against a specified ‘target’ together with upper and lower control limits. With discrete probability distributions it is not possible to specify the exact probability that an observation from an ‘in-control’ provider will fall outside the control limits. However, general probability characteristics can be set and specified using interpolation methods. Guidelines recommend that providers falling outside such control limits should be investigated, potentially with significant consequences, so it is important that the properties of the limits are understood. Methods Control limits for funnel plots for the Standardised Mortality Ratio (SMR) based on the Poisson distribution were calculated using three proposed interpolation methods and the probability calculated of an ‘in-control’ provider falling outside of the limits. Examples using published data were shown to demonstrate the potential differences in the identification of outliers. Results The first interpolation method ensured that the probability of an observation of an ‘in control’ provider falling outside either limit was always less than a specified nominal probability (p). The second method resulted in such an observation falling outside either limit with a probability that could be either greater or less than p, depending on the expected number of events. The third method led to a probability that was always greater than, or equal to, p. Conclusion The use of different interpolation methods can lead to differences in the identification of outliers. This is particularly important when the expected number of events is small. 
We recommend that users of these methods be aware of the differences, and specify which interpolation method is to be used prior to any analysis. PMID:23029202
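The discreteness issue the authors describe can be illustrated directly: for counts from an 'in-control' provider with expected events E, control limits built from the exact Poisson distribution can only bound the tail probability, not hit the nominal p exactly. A minimal Python sketch (the function names and the conservative, non-interpolated construction are illustrative, not the paper's exact methods):

```python
import math

def poisson_cdf(k, mu):
    """P(X <= k) for X ~ Poisson(mu), by direct summation of the pmf."""
    term = math.exp(-mu)
    total = 0.0
    for i in range(int(k) + 1):
        total += term
        term *= mu / (i + 1)
    return total

def funnel_limits(expected, p=0.025):
    """Conservative limits on the observed count given `expected` events:
    observations strictly below k_lo or strictly above k_hi are flagged.
    Dividing each limit by `expected` gives the SMR funnel limits."""
    k_lo = 0
    while poisson_cdf(k_lo, expected) <= p:
        k_lo += 1
    k_hi = int(expected)
    while 1.0 - poisson_cdf(k_hi, expected) > p:
        k_hi += 1
    return k_lo, k_hi

def false_alarm_prob(expected, p=0.025):
    """Exact probability that an in-control provider falls outside."""
    k_lo, k_hi = funnel_limits(expected, p)
    below = poisson_cdf(k_lo - 1, expected) if k_lo > 0 else 0.0
    return below + 1.0 - poisson_cdf(k_hi, expected)
```

Because the count is discrete, the exact false-alarm probability of this construction is always at most the nominal two-sided level; interpolating between adjacent integer limits smooths the funnel but, as the Results note, can push the exact probability either above or below p depending on the expected number of events.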
42 CFR 495.212 - Limitation on review.
Code of Federal Regulations, 2013 CFR
2013-10-01
... PROGRAM Requirements Specific to Medicare Advantage (MA) Organizations § 495.212 Limitation on review. (a... methodology and standards for determining payment amounts and payment adjustments under the MA EHR EP... related to the fixed schedule for application of limitation on incentive payments for all qualifying MA...
The Differential Effect of Attentional Condition on Subsequent Vocabulary Development
ERIC Educational Resources Information Center
Mohammed, Halah Abdulelah; Majid, Norazman Abdul; Abdullah, Tina
2016-01-01
This study examined the effect of attentional condition on subsequent vocabulary development from a different perspective, addressing several potential methodological issues of previous research that has been based on the psycholinguistic notion of the second language learner as a limited-capacity processor. The…
Predicting Dissertation Methodology Choice among Doctoral Candidates at a Faith-Based University
ERIC Educational Resources Information Center
Lunde, Rebecca
2017-01-01
Limited research has investigated dissertation methodology choice and the factors that contribute to this choice. Quantitative research is based in mathematics and scientific positivism, and qualitative research is based in constructivism. These underlying philosophical differences raise the question of whether certain factors predict dissertation…
2014-01-01
Background mRNA translation involves simultaneous movement of multiple ribosomes on the mRNA and is also subject to regulatory mechanisms at different stages. Translation can be described by various codon-based models, including ODE, TASEP, and Petri net models. Although such models have been extensively used, the overlap and differences between these models and the implications of the assumptions of each model have not been systematically elucidated. The selection of the most appropriate modelling framework, and the most appropriate way to develop coarse-grained/fine-grained models in different contexts, is not clear. Results We systematically analyze and compare how different modelling methodologies can be used to describe translation. We define various statistically equivalent codon-based simulation algorithms and analyze the importance of the update rule in determining the steady state, an aspect often neglected. Then a novel probabilistic Boolean network (PBN) model is proposed for modelling translation, which enjoys an exact numerical solution. This solution matches those of numerical simulation from other methods and acts as a complementary tool to analytical approximations and simulations. The advantages and limitations of various codon-based models are compared, and illustrated by examples with real biological complexities such as slow codons, premature termination and feedback regulation. Our studies reveal that while different models give broadly similar trends in many cases, important differences also arise and can be clearly seen in the dependence of the translation rate on different parameters. Furthermore, the update rule affects the steady state solution. Conclusions The codon-based models are based on different levels of abstraction. 
Our analysis suggests that a multiple model approach to understanding translation allows one to ascertain which aspects of the conclusions are robust with respect to the choice of modelling methodology, and when (and why) important differences may arise. This approach also allows for an optimal use of analysis tools, which is especially important when additional complexities or regulatory mechanisms are included. This approach can provide a robust platform for dissecting translation, and results in an improved predictive framework for applications in systems and synthetic biology. PMID:24576337
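Of the codon-based frameworks compared above, the TASEP is the simplest to sketch. The following minimal Python simulation uses a random-sequential update rule with uniform hop rates (all parameters are illustrative; a realistic translation model would use codon-specific rates and extended ribosome footprints):

```python
import random

random.seed(0)

def tasep_current(length=50, alpha=0.3, beta=0.3, sweeps=20000):
    """Random-sequential TASEP: particles (ribosomes) enter site 0 with
    probability `alpha`, hop right when the next site is empty, and leave
    the last site with probability `beta`.  Returns the protein production
    rate (completed terminations per sweep)."""
    lattice = [0] * length
    completed = 0
    for _ in range(sweeps):
        for _ in range(length + 1):
            i = random.randrange(-1, length)  # -1 encodes an initiation attempt
            if i == -1:
                if lattice[0] == 0 and random.random() < alpha:
                    lattice[0] = 1
            elif i == length - 1:
                if lattice[-1] == 1 and random.random() < beta:
                    lattice[-1] = 0
                    completed += 1
            elif lattice[i] == 1 and lattice[i + 1] == 0:
                lattice[i], lattice[i + 1] = 0, 1
    return completed / sweeps

# Low-density phase (alpha < 1/2): current should sit near alpha * (1 - alpha).
J = tasep_current()
```

Replacing the random-sequential loop with a parallel (whole-lattice) update changes the measured steady-state current, which is exactly the update-rule sensitivity the abstract highlights.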
DOT National Transportation Integrated Search
2013-09-01
Recent advances in multivariate methodology provide an opportunity to further the assessment of service offerings in public transportation for work commuting. We offer methodologies that are alternative to direct rating scale and have advantages in t...
Sáiz, Jorge; García-Roa, Roberto; Martín, José; Gómara, Belén
2017-09-08
Chemical signaling is a widespread mode of communication among living organisms that is used to establish social organization, territoriality and/or for mate choice. In lizards, femoral and precloacal glands are important sources of chemical signals. These glands produce chemical secretions used to mark territories and to provide valuable information about the bearer to other individuals. Ecologists have studied these chemical secretions for decades in order to increase knowledge of chemical communication in lizards. Although several studies have focused on the chemical analysis of these secretions, there is a lack of faster, more sensitive and more selective analytical methodologies for their study. In this work, a new methodology based on GC coupled to triple quadrupole tandem MS (GC-QqQ (MS/MS)) is developed and proposed for the targeted study of 12 relevant compounds often found in lizard secretions (i.e. 1-hexadecanol, palmitic acid, 1-octadecanol, oleic acid, stearic acid, 1-tetracosanol, squalene, cholesta-3,5-diene, α-tocopherol, cholesterol, ergosterol and campesterol). The method baseline-separated the analytes in less than 7 min, with instrumental limits of detection ranging from 0.04 to 6.0 ng/mL. It was possible to identify differences in the composition of the samples from the lizards analyzed, which depended on the species, the habitat occupied and the diet of the individuals. Moreover, α-tocopherol was determined for the first time in a lizard species previously thought not to express it in chemical secretions. Globally, the methodology has been proven to be a valuable alternative to other published methods, with important improvements in terms of analysis time, sensitivity, and selectivity. Copyright © 2017 Elsevier B.V. All rights reserved.
Fragility Analysis of Concrete Gravity Dams
NASA Astrophysics Data System (ADS)
Tekie, Paulos B.; Ellingwood, Bruce R.
2002-09-01
Concrete gravity dams are an important part of the nation's infrastructure. Many dams have been in service for over 50 years, during which time important advances in the methodologies for evaluation of natural phenomena hazards have caused the design-basis events to be revised upwards, in some cases significantly. Many existing dams fail to meet these revised safety criteria, and structural rehabilitation to meet newly revised criteria may be costly and difficult. A probabilistic safety analysis (PSA) provides a rational safety assessment and decision-making tool for managing the various sources of uncertainty that may impact dam performance. Fragility analysis, which depicts the uncertainty in the safety margin above specified hazard levels, is a fundamental tool in a PSA. This study presents a methodology for developing fragilities of concrete gravity dams to assess their performance against hydrologic and seismic hazards. Models of varying degrees of complexity and sophistication were considered and compared. The methodology is illustrated using the Bluestone Dam on the New River in West Virginia, which was designed in the late 1930s. The hydrologic fragilities showed that the Bluestone Dam is unlikely to become unstable at the revised probable maximum flood (PMF), but it is likely that there will be significant cracking at the heel of the dam. On the other hand, the seismic fragility analysis indicated that sliding is likely if the dam were to be subjected to a maximum credible earthquake (MCE). Moreover, there will likely be tensile cracking at the neck of the dam at this level of seismic excitation. Probabilities of relatively severe limit states appear to be only marginally affected by extremely rare events (e.g. the PMF and MCE). Moreover, the risks posed by extreme floods and earthquakes were not balanced for the Bluestone Dam, with seismic hazard posing a relatively higher risk.
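Fragility curves of the kind used in such a PSA are commonly modelled as lognormal distributions of capacity. A minimal sketch (the median capacity and dispersion below are purely illustrative numbers, not values from the Bluestone analysis):

```python
import math

def fragility(demand, median_capacity, beta):
    """P(limit state exceeded | demand), modelled as a lognormal CDF with
    median capacity `median_capacity` and logarithmic standard deviation
    `beta` (a common parameterisation in probabilistic safety analysis)."""
    return 0.5 * (1.0 + math.erf(math.log(demand / median_capacity)
                                 / (beta * math.sqrt(2.0))))

# Illustrative: probability of a sliding limit state at increasing peak
# ground accelerations, for a hypothetical median capacity of 0.4 g.
curve = [fragility(pga, 0.4, 0.5) for pga in (0.2, 0.4, 0.8)]
```

By construction the curve passes through probability 0.5 at the median capacity and steepens as the dispersion `beta` shrinks.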
NASA Astrophysics Data System (ADS)
O'Keeffe, Jimmy; Buytaert, Wouter; Mijic, Ana; Brozovic, Nicholas
2015-04-01
To build an accurate, robust understanding of the environment, it is important to collect information describing not only its physical characteristics but also the drivers which influence it. As environmental change, from increasing CO2 levels to decreasing water levels, is often heavily influenced by human activity, gathering information on anthropogenic as well as environmental variables is extremely important. This can mean collecting qualitative as well as quantitative information. In reality, studies are often bound by financial and time constraints, limiting the depth and detail of the research. It is up to the researcher to determine the methodology best suited to answering the research questions. Here we present a methodology for collecting qualitative and quantitative information in tandem for hydrological studies through the use of semi-structured interviews. This is applied to a case study in two districts of Uttar Pradesh, North India, one of the most intensely irrigated areas of the world. Here, decreasing water levels, exacerbated by unchecked water abstraction, an expanding population, and government subsidies, have put the long-term resilience of the farming population in doubt. Through random selection of study locations, combined with convenience sampling of the participants therein, we show how the data collected can provide valuable insight into the drivers which have led to the current water scenario. We also show how reliable quantitative information can, using the same methodology, be effectively and efficiently extracted for modelling purposes, which, along with developing an understanding of the characteristics of the environment, is vital in arriving at realistic and sustainable solutions for water resource management in the future.
Sleep disturbances as an evidence-based suicide risk factor.
Bernert, Rebecca A; Kim, Joanne S; Iwata, Naomi G; Perlis, Michael L
2015-03-01
Increasing research indicates that sleep disturbances may confer increased risk for suicidal behaviors, including suicidal ideation, suicide attempts, and death by suicide. Despite increased investigation, a number of methodological problems present important limitations to the validity and generalizability of findings in this area, which warrant additional focus. To evaluate and delineate sleep disturbances as an evidence-based suicide risk factor, a systematic review of the extant literature was conducted with methodological considerations as a central focus. The following methodological criteria were required for inclusion: the report (1) evaluated an index of sleep disturbance; (2) examined an outcome measure for suicidal behavior; (3) adjusted for presence of a depression diagnosis or depression severity as a covariate; and (4) represented an original investigation as opposed to a chart review. Reports meeting inclusion criteria were further classified and reviewed according to: study design and timeframe; sample type and size; sleep disturbance, suicide risk, and depression covariate assessment measure(s); and presence of positive versus negative findings. Keyword searches were conducted in PubMed and PsycINFO. These searches generated N = 82 articles representing original investigations focused on sleep disturbances and suicide outcomes. Of these, N = 18 met inclusion criteria for review based on systematic analysis. Of the reports identified, N = 18 evaluated insomnia or poor sleep quality symptoms, whereas N = 8 assessed nightmares in association with suicide risk. Despite considerable differences in study designs, samples, and assessment techniques, the comparison of such reports indicates preliminary, converging evidence for sleep disturbances as an empirical risk factor for suicidal behaviors, while highlighting important future directions for increased investigation.
Using Q Methodology in Quality Improvement Projects.
Tiernon, Paige; Hensel, Desiree; Roy-Ehri, Leah
Q methodology consists of a philosophical framework and procedures to identify subjective viewpoints that may not be well understood, but its use in nursing is still quite limited. We describe how Q methodology can be used in quality improvement projects to better understand local viewpoints that act as facilitators or barriers to the implementation of evidence-based practice. We describe the use of Q methodology to identify nurses' attitudes about the provision of skin-to-skin care after cesarean birth. Copyright © 2017 AWHONN, the Association of Women's Health, Obstetric and Neonatal Nurses. Published by Elsevier Inc. All rights reserved.
Roadmap for Navy Family Research.
1980-08-01
of methodological limitations, including: small, often non-representative or narrowly defined samples; inadequate statistical controls, inadequate... ...the Office of Naval Research by the Westinghouse Public Applied Systems Division, and is designed to provide the Navy with a systematic framework for
Feminist Research Methodology Groups: Origins, Forms, Functions.
ERIC Educational Resources Information Center
Reinharz, Shulamit
Feminist Research Methodology Groups (FRMGs) have developed as a specific type of women's group in which feminist academics can find supportive audiences for their work while contributing to a feminist redefinition of research methods. An analysis of two FRMGs reveals common characteristics, dynamics, and outcomes. Both were limited to small…
ERIC Educational Resources Information Center
Casado, Banghwa Lee; Negi, Nalini Junko; Hong, Michin
2012-01-01
Despite the growing number of language minorities, foreign-born individuals with limited English proficiency, this population has been largely left out of social work research, often due to methodological challenges involved in conducting research with this population. Whereas the professional standard calls for cultural competence, a discussion…
NASA Astrophysics Data System (ADS)
Yee, Eugene
2007-04-01
Although a great deal of research effort has been focused on the forward prediction of the dispersion of contaminants (e.g., chemical and biological warfare agents) released into the turbulent atmosphere, much less work has been directed toward the inverse prediction of agent source location and strength from the measured concentration, even though the importance of this problem for a number of practical applications is obvious. In general, the inverse problem of source reconstruction is ill-posed and unsolvable without additional information. It is demonstrated that a Bayesian probabilistic inferential framework provides a natural and logically consistent method for source reconstruction from a limited number of noisy concentration data. In particular, the Bayesian approach permits one to incorporate prior knowledge about the source as well as additional information regarding both model and data errors. The latter enables a rigorous determination of the uncertainty in the inference of the source parameters (e.g., spatial location, emission rate, release time, etc.), hence extending the potential of the methodology as a tool for quantitative source reconstruction. A model (or, source-receptor relationship) that relates the source distribution to the concentration data measured by a number of sensors is formulated, and Bayesian probability theory is used to derive the posterior probability density function of the source parameters. A computationally efficient methodology for determination of the likelihood function for the problem, based on an adjoint representation of the source-receptor relationship, is described. Furthermore, we describe the application of efficient stochastic algorithms based on Markov chain Monte Carlo (MCMC) for sampling from the posterior distribution of the source parameters, the latter of which is required to undertake the Bayesian computation. 
The Bayesian inferential methodology for source reconstruction is validated against real dispersion data for two cases involving contaminant dispersion in highly disturbed flows over urban and complex environments where the idealizations of horizontal homogeneity and/or temporal stationarity in the flow cannot be applied to simplify the problem. Furthermore, the methodology is applied to the case of reconstruction of multiple sources.
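The Bayesian machinery described above can be illustrated on a toy problem: a one-dimensional source-receptor model with a Gaussian kernel standing in for the adjoint-based relationship, and a random-walk Metropolis sampler standing in for the paper's MCMC algorithms. Everything here (geometry, noise level, priors) is an invented example, not the author's setup:

```python
import math
import random

random.seed(42)

# Toy source-receptor model: a source at location xs with strength q
# produces concentration q * exp(-(x - xs)^2 / (2 * L^2)) at a sensor at x.
L = 1.0
SENSORS = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
TRUE_XS, TRUE_Q, SIGMA = 1.2, 5.0, 0.1

def forward(xs, q):
    return [q * math.exp(-((x - xs) ** 2) / (2 * L ** 2)) for x in SENSORS]

# Synthetic noisy concentration data from the "true" source.
data = [c + random.gauss(0.0, SIGMA) for c in forward(TRUE_XS, TRUE_Q)]

def log_post(xs, q):
    """Log-posterior: Gaussian likelihood, flat priors on [0,3] x (0,20]."""
    if not (0.0 <= xs <= 3.0 and 0.0 < q <= 20.0):
        return -math.inf
    return -sum((d - m) ** 2
                for d, m in zip(data, forward(xs, q))) / (2 * SIGMA ** 2)

def metropolis(n=20000, step=0.05):
    """Random-walk Metropolis over (xs, q); second half kept as samples."""
    xs, q = 1.5, 1.0
    lp = log_post(xs, q)
    samples = []
    for i in range(n):
        xs2, q2 = xs + random.gauss(0.0, step), q + random.gauss(0.0, step)
        lp2 = log_post(xs2, q2)
        if random.random() < math.exp(min(0.0, lp2 - lp)):
            xs, q, lp = xs2, q2, lp2
        if i >= n // 2:
            samples.append((xs, q))
    return samples

samples = metropolis()
xs_mean = sum(s[0] for s in samples) / len(samples)
q_mean = sum(s[1] for s in samples) / len(samples)
```

The posterior spread of the samples, not just the means, is the point of the approach: it quantifies the uncertainty in the inferred location and emission rate that the abstract emphasises.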
Döring, Nora; Mayer, Susanne; Rasmussen, Finn; Sonntag, Diana
2016-09-13
Despite methodological advances in the field of economic evaluations of interventions, economic evaluations of obesity prevention programmes in early childhood are seldom conducted. The aim of the present study was to explore existing methods and applications of economic evaluations, examining their limitations and making recommendations for future cost-effectiveness assessments. A systematic literature search was conducted using PubMed, Cochrane Library, the British National Health Service Economic Evaluation Databases and EconLit. Eligible studies included trial-based or simulation-based cost-effectiveness analyses of obesity prevention programmes targeting preschool children and/or their parents. The quality of included studies was assessed. Of the six studies included, five were intervention studies and one was based on a simulation approach conducted on secondary data. We identified three main conceptual and methodological limitations of their economic evaluations: Insufficient conceptual approach considering the complexity of childhood obesity, inadequate measurement of effects of interventions, and lack of valid instruments to measure child-related quality of life and costs. Despite the need for economic evaluations of obesity prevention programmes in early childhood, only a few studies of varying quality have been conducted. Moreover, due to methodological and conceptual weaknesses, they offer only limited information for policy makers and intervention providers. We elaborate reasons for the limitations of these studies and offer guidance for designing better economic evaluations of early obesity prevention.
Prabakaran, Rema; Seymour, Shiri; Moles, David R; Cunningham, Susan J
2012-08-01
Motivation and cooperation are vital components of orthodontic treatment if a good outcome is to be achieved. In this study, we used Q-methodology to investigate motivating factors among adolescents seeking orthodontic treatment and parents wanting their children to undergo orthodontic treatment. This technique asks participants to rank a series of statements, and the analysis of this ranking then provides insight into the participants' opinions. Each of these complementary studies was divided into 2 phases: interviews to generate a list of reasons for seeking orthodontic treatment and the use of Q-methodology to assess and categorize the relative importance of these reasons for the groups of participants. In the patient study, 32 items were generated from the interviews and placed in order of importance on a Q-methodology grid by 60 patients who were about to commence orthodontic treatment. The rankings were subjected to factor analysis, which categorized the patients' views into groups of shared opinions. The same methodology was used with the parent group, and a Q-methodology grid was designed to accommodate 35 items that were then ranked by the 60 parents. The rankings were subjected to factor analysis as for the patient group. For the patients, factor analysis identified 3 factors, all of which included esthetics as important. The remaining respondents had more individual viewpoints and did not map to any of the 3 factors. For the parents, factor analysis identified 4 factors, all of which included treatment in adolescence to prevent future problems as important. This study showed that Q-methodology is a novel and efficient tool that can be used in dental research with few difficulties. It might prove useful for the aspects of care for which subjective views or opinions play an important role. Copyright © 2012 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
Cordes, Joseph J
2017-10-01
Since the early 2000s there has been growing interest in using the Social Return on Investment (SROI) as a measure for assessing the performance of social enterprises. By analogy with its business counterpart, the Return on Investment (ROI), the SROI is a metric that compares the monetized social costs of a program with the monetized social benefits of achieving an outcome (or set of outcomes). For example, calculating the SROI of a nonprofit half-way house for drug addicts might involve estimating the reduced social costs attributable to successful rehabilitation of addicts, and comparing this to the social costs of operating the half-way house. Alternatively, the total return of a for-profit social enterprise providing affordable housing might consist of both the traditional private return on investment and the economic value of meeting the housing needs of lower income households. Early descriptions of the methodology for calculating the SROI suggest that the approach initially evolved from standard methodologies found in the business finance literature for evaluating investments, with the important twist that nonprofit sector returns/payoffs are defined in broader social terms (Thornley, Anderson, & Dixon, 2016). Yet, someone who is familiar with the economic literature on cost benefit analysis (CBA) as it is applied to the evaluation of public programs cannot help but be struck by the similarity between the outcomes that CBA is intended to measure, and those that are the object of efforts to calculate the SROI. One implication is that the literature on the theory and practice of cost benefit analysis offers useful lessons about how to measure the social return on investment, as well as about potential caveats and limitations that need to be confronted when attempting to undertake an analysis of the SROI. 
The paper discusses the potential uses and limitations of CBA and SROI as tools that governments, private donor/investors, and foundations can use to help set funding priorities and evaluate performance. It summarizes (1) the conceptual foundations of CBA and its application to SROI analysis and (2) issues raised in the implementation of CBA and SROI in practice, and discusses (3) when CBA and/or SROI approaches are a useful lens for setting priorities and/or evaluating performance, as well as important limitations of such methods. Copyright © 2016 Elsevier Ltd. All rights reserved.
Deveau, M; Chen, C-P; Johanson, G; Krewski, D; Maier, A; Niven, K J; Ripple, S; Schulte, P A; Silk, J; Urbanus, J H; Zalk, D M; Niemeier, R W
2015-01-01
Occupational exposure limits (OELs) serve as health-based benchmarks against which measured or estimated workplace exposures can be compared. In the years since the introduction of OELs to public health practice, both developed and developing countries have established processes for deriving, setting, and using OELs to protect workers exposed to hazardous chemicals. These processes vary widely, however, and have thus resulted in a confusing international landscape for identifying and applying such limits in workplaces. The occupational hygienist will encounter significant overlap in coverage among organizations for many chemicals, while other important chemicals have OELs developed by few, if any, organizations. Where multiple organizations have published an OEL, the derived value often varies considerably, reflecting differences in both risk policy and risk assessment methodology as well as access to available pertinent data. This article explores the underlying reasons for variability in OELs, and recommends the harmonization of risk-based methods used by OEL-deriving organizations. A framework is also proposed for the identification and systematic evaluation of OEL resources, which occupational hygienists can use to support risk characterization and risk management decisions in situations where multiple potentially relevant OELs exist.
24 CFR 904.205 - Training methodology.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 24 Housing and Urban Development 4 2011-04-01 2011-04-01 false Training methodology. 904.205... DEVELOPMENT LOW RENT HOUSING HOMEOWNERSHIP OPPORTUNITIES Homeownership Counseling and Training § 904.205 Training methodology. Equal in importance to the content of the pre- and post-occupancy training is the...
Data Centric Development Methodology
ERIC Educational Resources Information Center
Khoury, Fadi E.
2012-01-01
Data centric applications, an important software development effort in large organizations, have mostly adopted a software methodology, such as waterfall or the Rational Unified Process, as the framework for development. These methodologies can work for structural, procedural, or object-oriented applications, but fail to capture…
NASA Astrophysics Data System (ADS)
Shafer, J. M.; Varljen, M. D.
1990-08-01
A fundamental requirement for geostatistical analyses of spatially correlated environmental data is the estimation of the sample semivariogram to characterize spatial correlation. Selecting an underlying theoretical semivariogram based on the sample semivariogram is an extremely important and difficult task that is subject to a great deal of uncertainty. Current standard practice does not involve consideration of the confidence associated with semivariogram estimates, largely because classical statistical theory does not provide the capability to construct confidence limits from single realizations of correlated data, and multiple realizations of environmental fields are not found in nature. The jackknife method is a nonparametric statistical technique for parameter estimation that may be used to estimate the semivariogram. When used in connection with standard confidence procedures, it allows for the calculation of closely approximate confidence limits on the semivariogram from single realizations of spatially correlated data. The accuracy and validity of this technique was verified using a Monte Carlo simulation approach which enabled confidence limits about the semivariogram estimate to be calculated from many synthetically generated realizations of a random field with a known correlation structure. The synthetically derived confidence limits were then compared to jackknife estimates from single realizations with favorable results. Finally, the methodology for applying the jackknife method to a real-world problem and an example of the utility of semivariogram confidence limits were demonstrated by constructing confidence limits on seasonal sample variograms of nitrate-nitrogen concentrations in shallow groundwater in an approximately 12-mi² (~30 km²) region in northern Illinois. In this application, the confidence limits on sample semivariograms from different time periods were used to evaluate the significance of temporal change in spatial correlation. 
This capability is quite important as it can indicate when a spatially optimized monitoring network would need to be reevaluated and thus lead to more robust monitoring strategies.
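The delete-one jackknife described above can be sketched as follows. This is a minimal illustration, not the original implementation: the function names, the method-of-moments semivariogram estimator, and the normal-theory 95% limits are all assumptions.

```python
import numpy as np

def semivariogram(values, coords, lag, tol):
    """Method-of-moments semivariogram estimate at one lag class."""
    n = len(values)
    sq = []
    for i in range(n):
        for j in range(i + 1, n):
            d = np.linalg.norm(coords[i] - coords[j])
            if abs(d - lag) <= tol:
                sq.append(0.5 * (values[i] - values[j]) ** 2)
    return np.mean(sq) if sq else np.nan

def jackknife_ci(values, coords, lag, tol):
    """Delete-one jackknife confidence limits on the semivariogram
    at a single lag, from one realization of a spatial field."""
    n = len(values)
    full = semivariogram(values, coords, lag, tol)
    # Pseudovalues built from delete-one re-estimates
    pseudo = []
    for k in range(n):
        mask = np.arange(n) != k
        g_k = semivariogram(values[mask], coords[mask], lag, tol)
        pseudo.append(n * full - (n - 1) * g_k)
    pseudo = np.array(pseudo)
    est = pseudo.mean()
    se = pseudo.std(ddof=1) / np.sqrt(n)
    z = 1.96  # approximate 95% confidence limits
    return est, (est - z * se, est + z * se)
```

In practice the same computation would be repeated over several lag classes to put limits on the whole sample semivariogram, which is what allows the temporal comparison described in the abstract.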
Barnes, Brian B.; Wilson, Michael B.; Carr, Peter W.; Vitha, Mark F.; Broeckling, Corey D.; Heuberger, Adam L.; Prenni, Jessica; Janis, Gregory C.; Corcoran, Henry; Snow, Nicholas H.; Chopra, Shilpi; Dhandapani, Ramkumar; Tawfall, Amanda; Sumner, Lloyd W.; Boswell, Paul G.
2014-01-01
Gas chromatography-mass spectrometry (GC-MS) is a primary tool used to identify compounds in complex samples. Both mass spectra and GC retention times are matched to those of standards, but it is often impractical to have standards on hand for every compound of interest, so we must rely on shared databases of MS data and GC retention information. Unfortunately, retention databases (e.g. linear retention index libraries) are experimentally restrictive, notoriously unreliable, and strongly instrument dependent, relegating GC retention information to a minor, often negligible role in compound identification despite its potential power. A new methodology called “retention projection” has great potential to overcome the limitations of shared chromatographic databases. In this work, we tested the reliability of the methodology in five independent laboratories. We found that even when each lab ran nominally the same method, the methodology was 3-fold more accurate than retention indexing because it properly accounted for unintentional differences between the GC-MS systems. When the labs used different methods of their own choosing, retention projections were 4- to 165-fold more accurate. More importantly, the distribution of error in the retention projections was predictable across different methods and labs, thus enabling automatic calculation of retention time tolerance windows. Tolerance windows at 99% confidence were generally narrower than those widely used even when physical standards are on hand to measure their retention. With its high accuracy and reliability, the new retention projection methodology makes GC retention a reliable, precise tool for compound identification, even when standards are not available to the user. PMID:24205931
A Framework for Determining the Return on Investment of Simulation-Based Training in Health Care
Bukhari, Hatim; Andreatta, Pamela; Goldiez, Brian; Rabelo, Luis
2017-01-01
This article describes a framework developed to monetize the real value of simulation-based training in health care. Significant consideration has been given to incorporating the intangible and qualitative benefits, not only the tangible and quantitative benefits, of simulation-based training. The framework builds on three works: the value measurement methodology (VMM) used by several departments of the US Government; a methodology documented in several books by Dr Jack Phillips for monetizing various training approaches; and a traditional return-on-investment methodology put forth by Frost & Sullivan and Immersion Medical. All three source materials were adapted to create an integrated methodology that can be readily implemented. This article presents details on each of these methods, describes how they are integrated into a single framework, and explains the concept and application of the developed framework. As a test of its applicability, a real case study was used to demonstrate the framework. This case study provides real data on the correlation between pediatric patient cardiopulmonary arrest (CPA) survival rates and simulation-based mock codes at the University of Michigan tertiary care academic medical center. The proposed framework offers the capability to consider a wide range of benefits and values, but it also has several limitations that are discussed and need to be taken into consideration. PMID:28133988
Methodologies for measuring travelers' risk perception of infectious diseases: A systematic review.
Sridhar, Shruti; Régner, Isabelle; Brouqui, Philippe; Gautret, Philippe
2016-01-01
Numerous studies in the past have stressed the importance of travelers' psychology and perception in the implementation of preventive measures. The aim of this systematic review was to identify the methodologies used in studies reporting on travelers' risk perception of infectious diseases. A systematic search for relevant literature was conducted according to Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. There were 39 studies identified. In 35 of 39 studies, the methodology used was that of a knowledge, attitude and practice (KAP) survey based on questionnaires. One study used a combination of questionnaires and a visual psychometric measuring instrument called the "pictorial representation of illness and self-measurement" (PRISM). One study used a self-representation model (SRM) method. Two studies measured psychosocial factors. Valuable information was obtained from KAP surveys showing an overall lack of knowledge among travelers about the most frequent travel-associated infections and associated preventive measures. This methodological approach however, is mainly descriptive, addressing knowledge, attitudes, and practices separately and lacking an examination of the interrelationships between these three components. Another limitation of the KAP method is underestimating psychosocial variables that have proved influential in health related behaviors, including perceived benefits and costs of preventive measures, perceived social pressure, perceived personal control, unrealistic optimism and risk propensity. Future risk perception studies in travel medicine should consider psychosocial variables with inferential and multivariate statistical analyses. The use of implicit measurements of attitudes could also provide new insights in the field of travelers' risk perception of travel-associated infectious diseases. Copyright © 2016 Elsevier Ltd. All rights reserved.
Increased Reliability of Gas Turbine Components by Robust Coatings Manufacturing
NASA Astrophysics Data System (ADS)
Sharma, A.; Dudykevych, T.; Sansom, D.; Subramanian, R.
2017-08-01
The expanding operational windows of advanced gas turbine components demand increasing performance from protective coating systems. This demand has led in recent years to the development of novel multi-functional, multi-material coating system architectures. In addition, the increasing dependence of components exposed to extreme environments on protective coatings results in more severe penalties in case of a coating system failure. This emphasizes that the reliability and consistency of protective coating systems are as important as their superior performance. By means of examples, this paper describes the effects of scatter in material properties resulting from manufacturing variations on coating life predictions. A strong foundation in process-property-performance correlations, as well as regular monitoring and control of the coating process, is essential for a robust and well-controlled coating process. Proprietary and/or commercially available diagnostic tools can help in achieving these goals, but their usage in industrial settings is still limited. Various key contributors to process variability are briefly discussed, along with the limitations of existing process and product control methods. Other aspects important for product reliability and consistency in serial manufacturing, as well as advanced testing methodologies to simplify and enhance product inspection and improve objectivity, are briefly described.
The patient experience: measuring the quality of care in the Defence Medical Services.
Piper, Neale; Lamb, D
2014-06-01
Healthcare provided by the Defence Medical Services (DMS) is acknowledged to be of a high standard, but patients' experiences of it have not been measured and collated in a consistent and meaningful way, which has limited strategic quality improvement initiatives. Responsibility for implementing and delivering a programme of healthcare governance and assurance for the DMS rests with the Inspector General (IG). An important aspect of this role is to nurture a culture of continuous improvement in the DMS, and under this leadership the IG team has prioritised a number of projects to address this. The project to improve patient experience data capture was prioritised in the work schedule as it incorporated initiatives that would lead to improved quality in DMS healthcare, information exploitation and ultimately patient safety. This is the first in a series of articles that will document this important work and describe the methodological considerations associated with the initial questionnaire design, collaboration with NHS partners, the pilot study and progress towards the introduction of the definitive DMS tool later this year. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Biotechnological advances in the diagnosis of little-known parasitoses of pets.
Traversa, Donato; Otranto, Domenico
2009-01-01
Dogs, cats, and horses are popular pets in many countries of the world and have lived in close proximity with human beings for thousands of years. The effect of pet ownership on human health is well known, and there is significant merit in preserving the health and welfare of these animals. Some infections caused by parasitic nematodes and arthropods of dogs, cats, and horses are now spreading in several areas of the world. This is the case for canine spirocercosis, feline aelurostrongylosis, and equine gastrointestinal and nasal nematode and botfly infections. These diseases affect animal health and welfare and may be life-threatening. Although these infections cause illnesses of major importance in clinical practice and are spreading to new geographical foci, they remain little known and underestimated, partly as an effect of the difficulties of traditional diagnostics. Importantly, the limited reliability of conventional methodologies has also limited our knowledge of the epidemiology, ecology, and biology of these parasitoses. This article reviews the DNA-based assays that have been recently developed for diagnosing these neglected pet parasitic diseases, focusing on the advantages they have over classical techniques. Moreover, the opportunities for further epidemiological, ecological, and biological investigations are discussed.
DE LA Vega, G J; Schilman, P E
2018-03-01
In order to assess how triatomines (Hemiptera, Reduviidae), Chagas disease vectors, are distributed through Latin America, we analysed the relationship between the ecological niche and the limits of the physiological thermal niche in seven species of triatomines. We combined two methodological approaches: species distribution models, and physiological tolerances. First, we modelled the ecological niche and identified the most important abiotic factor for their distribution. Then, thermal tolerance limits were analysed by measuring maximum and minimum critical temperatures, upper lethal temperature, and 'chill-coma recovery time'. Finally, we used phylogenetic independent contrasts to analyse the link between limiting factors and the thermal tolerance range for the assessment of ecological hypotheses that provide a different outlook for the geo-epidemiology of Chagas disease. In triatomines, the thermo-tolerance range increases with increasing latitude, mainly due to better cold tolerance, suggesting an effect of thermal selection. In turn, physiological analyses show that species reaching southernmost areas have a higher thermo-tolerance than those with tropical distributions, denoting that thermo-tolerance limits the southern distribution. Understanding the latitudinal range of disease vectors, along with their physiological limits, may prove useful to test ecological hypotheses and to improve the strategies and efficiency of vector control at the local and regional levels. © 2017 The Royal Entomological Society.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronold, K.O.; Nielsen, N.J.R.; Tura, F.
This paper demonstrates how a structural reliability method can be applied as a rational means to analyze free spans of submarine pipelines with respect to failure in ultimate loading, and to establish partial safety factors for design of such free spans against this failure mode. It is important to note that the described procedure shall be considered as an illustration of a structural reliability methodology, and that the results do not represent a set of final design recommendations. A scope of design cases, consisting of a number of available site-specific pipeline spans, is established and is assumed representative for the future occurrence of submarine pipeline spans. Probabilistic models for the wave and current loading and its transfer to stresses in the pipe wall of a pipeline span are established, together with a stochastic representation of the material resistance. The event of failure in ultimate loading is considered as based on a limit state which is reached when the maximum stress over the design life of the pipeline exceeds the yield strength of the pipe material. The yielding limit state is considered an ultimate limit state (ULS).
Lee, Kee Hyuck; Yoo, Sooyoung; Shin, HoGyun; Baek, Rong-Min; Chung, Chin Youb; Hwang, Hee
2013-01-01
Digital dashboard systems in hospitals are reported to provide a user interface (UI) that can centrally manage and retrieve various patient-related information in a single screen, support the real-time decision-making of medical professionals by integrating scattered medical information systems and core work flows, enhance the competence and decision-making ability of medical professionals, and reduce the probability of misdiagnosis. However, the hospital digital dashboard systems reported to date have some limitations when medical professionals use them for the general treatment of inpatients, because they were used only for the work processes of certain departments or were developed to improve specific disease-related indicators. Seoul National University Bundang Hospital developed a new concept EMR system to overcome such limitations. The system allows medical professionals to easily access all information on inpatients and to effectively retrieve important information from any part of the hospital by displaying inpatient information in the form of a digital dashboard. In this study, we introduce the structure, development methodology, and usage of this new system.
Occurrence of veterinary pharmaceuticals in the aquatic environment in Flanders
NASA Astrophysics Data System (ADS)
Servaes, K.; Vanermen, G.; Seuntjens, P.
2009-04-01
There is a growing interest in the occurrence of pharmaceuticals in the aquatic environment. Pharmaceuticals are classified as so-called 'emerging pollutants'. 'Emerging pollutants' are not necessarily new chemical compounds; often these compounds have been present in the environment for a long time, but their occurrence and especially their impact on the environment have only recently become clear. Consequently, data on their occurrence are rather scarce. In this study, we focus on the occurrence of veterinary pharmaceuticals in surface water in Flanders. We have only considered active substances administered to cattle, pigs and poultry. Based on the literature and information concerning the use in Belgium, a selection of 25 veterinary pharmaceuticals has been made. This selection consists of the most important antibiotics and antiparasitic substances applied in veterinary medicine in Belgium. We develop an analytical methodology based on UPLC-MS/MS for the detection of these veterinary pharmaceuticals in surface water. To this end, the mass characteristics as well as the optimum LC conditions will be determined. To obtain limits of detection as low as possible, the samples are concentrated prior to analysis using solid phase extraction (SPE). Different SPE cartridges will be tested during the method development. At first, this SPE sample pre-treatment is performed off-line; in a next step, online SPE is optimized for this purpose. The analytical procedure will be subject to an in-house validation study, thereby determining recovery, repeatability (% RSD), limits of detection and limits of quantification. Finally, the developed methodology will be applied for monitoring the occurrence of veterinary pharmaceuticals in surface water and groundwater in Flanders. These water samples will be taken in areas characterized by intensive cattle breeding. Moreover, the samples will be collected during springtime, the season in which farmers apply manure, stored during winter, onto the fields.
Barazzetti Barbieri, Cristina; de Souza Sarkis, Jorge Eduardo
2018-07-01
The forensic interpretation of environmental analytical data is usually challenging due to the high geospatial variability of these data. The measurement uncertainty includes contributions from the sampling and from the sample handling and preparation processes. These contributions are often disregarded in the quality assurance of analytical results. A pollution crime investigation case was used to develop a methodology able to address these uncertainties in two different environmental compartments: freshwater sediments and landfill leachate. The methodology used to estimate the uncertainty was the duplicate method (which replicates predefined steps of the measurement procedure in order to assess its precision), and the parameters used to investigate the pollution were metals (Cr, Cu, Ni, and Zn) in the leachate (the suspect source) and in the sediment (the possible sink). The metal analysis results were compared to statutory limits, and it was demonstrated that Cr and Ni concentrations in sediment samples exceeded the threshold levels at all sites downstream of the pollution sources, considering the expanded uncertainty U of the measurements and a probability of contamination >0.975 at most sites. Cu and Zn concentrations were above the statutory limits at two sites, but the classification was inconclusive considering the uncertainties of the measurements. Metal analyses in leachate revealed that Cr concentrations were above the statutory limits with a probability of contamination >0.975 in all leachate ponds, while the Cu, Ni and Zn probability of contamination was below 0.025. The results demonstrated that the estimation of the sampling uncertainty, which was the dominant component of the combined uncertainty, is required for a comprehensive interpretation of environmental analysis results, particularly in forensic cases. Copyright © 2018 Elsevier B.V. All rights reserved.
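The duplicate method mentioned above can be sketched with a classical balanced-ANOVA calculation: two samples per target, each analysed twice. This is a hedged illustration only; the function name, the balanced design, and the coverage factor k = 2 are assumptions, not details taken from the study.

```python
import numpy as np

def duplicate_method(s1a, s1b, s2a, s2b):
    """Classical ANOVA estimate of sampling and analytical standard
    deviations from a balanced duplicate design: two samples (S1, S2)
    per target, each analysed twice (a, b). Inputs are 1-D arrays with
    one entry per sampling target."""
    s1a, s1b, s2a, s2b = map(np.asarray, (s1a, s1b, s2a, s2b))
    # Analytical variance from within-sample analytical duplicates:
    # Var(a - b) = 2 * var_anal
    d_anal = np.concatenate([s1a - s1b, s2a - s2b])
    var_anal = np.mean(d_anal ** 2) / 2.0
    # Between-sample (within-target) spread of the analytical means:
    # Var(m1 - m2) = 2 * (var_samp + var_anal / 2)
    m1 = (s1a + s1b) / 2.0
    m2 = (s2a + s2b) / 2.0
    var_between = np.mean((m1 - m2) ** 2) / 2.0
    var_samp = max(var_between - var_anal / 2.0, 0.0)
    # Expanded measurement uncertainty with coverage factor k = 2
    u_meas = np.sqrt(var_samp + var_anal)
    return np.sqrt(var_samp), np.sqrt(var_anal), 2.0 * u_meas
```

Comparing the expanded uncertainty U against the distance between a measured concentration and its statutory limit is what supports probability statements such as "contaminated with probability >0.975" in the abstract.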
Larivée, N; Suissa, S; Khosrow-Khavar, F; Tagalakis, V; Filion, K B
2017-09-01
The effects of fourth-generation drospirenone-containing combined oral contraceptives (COCs) on the risk of venous thromboembolism (VTE) are controversial. To assess the methodological strengths and limitations of the evidence on the VTE risk of these COCs. We searched CINAHL, the Cochrane Library, EMBASE, HealthStar, Medline, and the Science Citation Index. Studies were included if they were cohort and case-control studies, reported a venous thrombotic outcome, had a comparator group, reported an effect measure of the association of interest, and were published in English or French. We assessed study quality using the ROBINS-I tool and assessed the presence of four common sources of bias: prevalent user bias, inappropriate choice of comparator, VTE misclassification, and confounding. Our systematic review included 17 studies. The relative risks of VTE associated with drospirenone- versus second-generation levonorgestrel-containing COCs ranged from 1.0 to 3.3. Based on ROBINS-I, three studies had a moderate risk, ten had a serious risk, and four had a critical risk. Nine studies included prevalent users, four included inappropriate comparators, four had VTE misclassification, and five did not account for two or more important confounding factors. The three highest quality studies had relative risks ranging from 1.0 to 1.57. As a result of the methodological limitations of the individual studies, the VTE risk of drospirenone-containing COCs remains unknown. The highest quality studies suggest there are no or slightly increased harmful effects, but their confidence limits do not rule out an almost doubling of the risk. Systematic review of drospirenone: best studies show no or slightly increased VTE risk (versus levonorgestrel). © 2017 Royal College of Obstetricians and Gynaecologists.
Qiu, Li; Wang, Xiao; Zhao, Na; Xu, Shiliang; An, Zengjian; Zhuang, Xuhui; Lan, Zhenggang; Wen, Lirong; Wan, Xiaobo
2014-12-05
A newly developed reductive ring closure methodology to heteroacenes bearing a dihydropyrrolo[3,2-b]pyrrole core was systematically studied for its scope and limitations. The methodology involves (i) the cyclization of an o-aminobenzoic acid ester derivative to give an eight-membered cyclic dilactam, and (ii) the conversion of the dilactam into the corresponding diimidoyl chloride, which undergoes (iii) reductive ring closure to install the dihydropyrrolo[3,2-b]pyrrole core. The first step of the methodology plays the key role due to its substrate limitations, as it suffers from competing oligomerization and hydrolysis. All the dilactams could be successfully converted to the corresponding diimidoyl chlorides, most of which succeeded in giving the dihydropyrrolo[3,2-b]pyrrole core. The influence of the substituents and of the elongation of the conjugated length on the photophysical properties of the obtained heteroacenes was then investigated systematically using UV-vis spectroscopy and cyclic voltammetry. It was found that chlorination and fluorination had quite different effects on the photophysical properties of the heteroacenes, and the ring fusing pattern also had a drastic influence on the band gap. The successful preparation of a series of heteroacenes bearing a dihydropyrrolo[3,2-b]pyrrole core provides a wide variety of candidates for further fabrication of organic field-effect transistor devices.
Conducting Indigenous Research in Western Knowledge Spaces: Aligning Theory and Methodology
ERIC Educational Resources Information Center
Singh, Myra; Major, Jae
2017-01-01
Walking simultaneously in two worlds as an Indigenous researcher, navigating Indigenous and Western epistemologies/methodologies can have its challenges. Indigenous methodologies have become an important element of qualitative research and have been increasingly taken up by both Indigenous and non-Indigenous researchers. Indigenous methodologies…
A methodology for the assessment of manned flight simulator fidelity
NASA Technical Reports Server (NTRS)
Hess, Ronald A.; Malsbury, Terry N.
1989-01-01
A relatively simple analytical methodology for assessing the fidelity of manned flight simulators for specific vehicles and tasks is offered. The methodology is based upon an application of a structural model of the human pilot, including motion cue effects. In particular, predicted pilot/vehicle dynamic characteristics are obtained with and without simulator limitations. A procedure for selecting model parameters can be implemented, given a probable pilot control strategy. In analyzing a pair of piloting tasks for which flight and simulation data are available, the methodology correctly predicted the existence of simulator fidelity problems. The methodology permitted the analytical evaluation of a change in simulator characteristics and indicated that a major source of the fidelity problems was a visual time delay in the simulation.
Methodological Choices in Peer Nomination Research
ERIC Educational Resources Information Center
Cillessen, Antonius H. N.; Marks, Peter E. L.
2017-01-01
Although peer nomination measures have been used by researchers for nearly a century, common methodological practices and rules of thumb (e.g., which variables to measure; use of limited vs. unlimited nomination methods) have continued to develop in recent decades. At the same time, other key aspects of the basic nomination procedure (e.g.,…
ERIC Educational Resources Information Center
Bulfin, Scott; Henderson, Michael; Johnson, Nicola F.; Selwyn, Neil
2014-01-01
The academic study of educational technology is often characterised by critics as methodologically limited. In order to test this assumption, the present paper reports on data collected from a survey of 462 "research active" academic researchers working in the broad areas of educational technology and educational media. The paper…
A Big Data Analytics Methodology Program in the Health Sector
ERIC Educational Resources Information Center
Lawler, James; Joseph, Anthony; Howell-Barber, H.
2016-01-01
The benefits of Big Data Analytics are cited frequently in the literature. However, the difficulties of implementing Big Data Analytics can limit the number of organizational projects. In this study, the authors evaluate business, procedural and technical factors in the implementation of Big Data Analytics, applying a methodology program. Focusing…
College Research Methodology Courses: Revisiting General Instructional Goals and Objectives
ERIC Educational Resources Information Center
Lei, Simon A.
2010-01-01
A number of graduate (masters-level) students from a wide variety of academic disciplines have viewed a required introductory research methodology course negatively. These students often do not retain much of the previously learned material, thus limiting their success of subsequent research and statistics courses. The purpose of this article is…
Postscript: Split Spatial Attention? The Data Remain Difficult to Interpret
ERIC Educational Resources Information Center
Jans, Bert; Peters, Judith C.; De Weerd, Peter
2010-01-01
A growing number of studies claim that spatial attention can be split "on demand" into several, segregated foci of enhanced processing. Intrigued by the theoretical ramifications of this proposal, we analyzed 19 relevant sets of experiments using four methodological criteria. We typically found several methodological limitations in each study that…
Methodologies of Bilingual Instruction in Literacy Education. Project Mobile, 1988-89. OREA Report.
ERIC Educational Resources Information Center
Berney, Tomi D.; Plotkin, Donna
In its second year, Methodologies of Bilingual Instruction in Literacy Education (Project MOBILE) provided 373 students of limited English proficiency, native speakers of Spanish and Haitian Creole, with supplementary English as a Second Language (ESL), native language arts (NLA), and content-area instruction. Project MOBILE stressed the…
How Six Sigma Methodology Improved Doctors' Performance
ERIC Educational Resources Information Center
Zafiropoulos, George
2015-01-01
Six Sigma methodology was used in a District General Hospital to assess the effect of the introduction of an educational programme to limit unnecessary admissions. The performance of the doctors involved in the programme was assessed. Ishikawa Fishbone and 5 S's were initially used and Pareto analysis of their findings was performed. The results…
ERIC Educational Resources Information Center
Clapp, John D.; Holmes, Megan R.; Reed, Mark B.; Shillington, Audrey M.; Freisthler, Bridget; Lange, James E.
2007-01-01
In recent years researchers have paid substantial attention to the issue of college students' alcohol use. One limitation to the current literature is an over reliance on retrospective, self-report survey data. This article presents field methodologies for measuring college students' alcohol consumption in natural drinking environments.…
Major Life Events and Daily Hassles in Predicting Health Status: Methodological Inquiry.
ERIC Educational Resources Information Center
Flannery, Raymond B., Jr.
1986-01-01
Hypothesized that both major life events and daily hassles would be associated with anxiety and depression symptomatology. While the results partially support the hypothesis, the inconsistent findings suggest methodological flaws in each life stress measure. Reviews these limitations and presents the use of the semi-structured interview as one…
Assessing the impact of modeling limits on intelligent systems
NASA Technical Reports Server (NTRS)
Rouse, William B.; Hammer, John M.
1990-01-01
This paper addresses the validation of the knowledge bases underlying intelligent systems. A general conceptual framework is provided for considering the roles in intelligent systems of models of physical, behavioral, and operational phenomena. A methodology is described for identifying limits in particular intelligent systems, and the use of the methodology is illustrated via an experimental evaluation of the pilot-vehicle interface within the Pilot's Associate. The requirements and functionality are outlined for a computer based knowledge engineering environment which would embody the approach advocated and illustrated in earlier discussions. Issues considered include the specific benefits of this functionality, the potential breadth of applicability, and technical feasibility.
Cognitive hypnotherapy with bulimia.
Barabasz, Marianne
2012-04-01
Research on the efficacy of hypnosis in the treatment of bulimia nervosa has produced mixed findings. This is due in part to the interplay between the characteristics of people with bulimia and the wide variety of hypnosis interventions that have been employed. Several authors have noted that methodological limitations in hypnosis research often make evaluation of treatment efficacy difficult. Many of the studies extant provide insufficient information regarding the specifics of participants' hypnotizability, the hypnotic induction, or the hypnotic suggestion(s) employed. Such limitations preclude replication and clinical implementation. This article reviews the literature with replicable methodologies and discusses the implications for evaluating treatment efficacy.
Choosing the Most Effective Pattern Classification Model under Learning-Time Constraint.
Saito, Priscila T M; Nakamura, Rodrigo Y M; Amorim, Willian P; Papa, João P; de Rezende, Pedro J; Falcão, Alexandre X
2015-01-01
Nowadays, large datasets are common and demand faster and more effective pattern analysis techniques. However, methodologies to compare classifiers usually do not take into account the learning-time constraints required by applications. This work presents a methodology to compare classifiers with respect to their ability to learn from classification errors on a large learning set, within a given time limit. Faster techniques may acquire more training samples, but only when they are more effective will they achieve higher performance on unseen testing sets. We demonstrate this result using several techniques, multiple datasets, and typical learning-time limits required by applications.
Del Giudice, G; Padulano, R; Siciliano, D
2016-01-01
The lack of geometrical and hydraulic information about sewer networks often excludes the adoption of in-depth modeling tools to obtain prioritization strategies for funds management. The present paper describes a novel statistical procedure for defining the prioritization scheme for preventive maintenance strategies based on a small sample of failure data collected by the Sewer Office of the Municipality of Naples (IT). Novel features include, among others, treating sewer parameters as continuous statistical variables and accounting for their interdependences. After a statistical analysis of maintenance interventions, the most important available factors affecting the process are selected and their mutual correlations identified. Then, after a Box-Cox transformation of the original variables, a methodology is provided for the evaluation of a vulnerability map of the sewer network by adopting a joint multivariate normal distribution with different parameter sets. The goodness-of-fit is eventually tested for each distribution by means of a multivariate plotting position. The developed methodology is expected to assist municipal engineers in identifying critical sewers, prioritizing sewer inspections in order to fulfill rehabilitation requirements.
Health effects of electric and magnetic fields: Overview of research recommendations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Savitz, D.A.
We developed a series of articles concerning epidemiologic research on potential health effects of electric and magnetic fields. Our goal was to identify methodological issues that have arisen through past studies of cancer, reproduction, and neurobehavioral outcomes in order to suggest strategies to extend knowledge. Following an overview of relevant physics and engineering principles, cancer epidemiology of electric and magnetic fields is discussed separately with a focus on epidemiologic methods and cancer biology, respectively. Reproductive health studies, many of which focus on exposure from video display terminals, are then summarized, followed by an evaluation of the limited literature on neurobehavioral outcomes, including suicide and depression. Methodological issues in exposure assessment are discussed, focusing on the challenges in residential exposure assessment and interpretation of wire configuration codes. An overview offers recommendations for priorities across these topic areas, emphasizing the importance of resolving the question of wire codes and childhood cancer. Collectively, these articles provide an array of observations and suggestions regarding the epidemiologic literature, recognizing the potential benefits to science and public policy. 10 refs.
Strategies for the extraction and analysis of non-extractable polyphenols from plants.
Domínguez-Rodríguez, Gloria; Marina, María Luisa; Plaza, Merichel
2017-09-08
The majority of studies based on phenolic compounds from plants are focused on the extractable fraction derived from an aqueous or aqueous-organic extraction. However, an important fraction of polyphenols is ignored due to the fact that they remain retained in the residue of extraction. They are the so-called non-extractable polyphenols (NEPs), which are high molecular weight polymeric polyphenols or individual low molecular weight phenolics associated with macromolecules. The scarce information available about NEPs shows that these compounds possess interesting biological activities. That is why interest in studying these compounds has increased in recent years. Furthermore, the extraction and characterization of NEPs are considered a challenge because the developed analytical methodologies present some limitations. Thus, the present literature review summarizes current knowledge of NEPs and the different methodologies for the extraction of these compounds, with a particular focus on hydrolysis treatments. Besides, this review provides information on the most recent developments in the purification, separation, identification and quantification of NEPs from plants. Copyright © 2017 Elsevier B.V. All rights reserved.
Gene environment interaction studies in depression and suicidal behavior: An update.
Mandelli, Laura; Serretti, Alessandro
2013-12-01
Increasing evidence supports the involvement of both heritable and environmental risk factors in major depression (MD) and suicidal behavior (SB). Studies investigating gene-environment interaction (G × E) may be useful for elucidating the role of biological mechanisms in the risk for mental disorders. In the present paper, we review the literature regarding the interaction between genes modulating brain functions and stressful life events in the etiology of MD and SB and discuss their potential added benefit compared to genetic studies only. Within the context of G × E investigation, thus far, only a few reliable results have been obtained, although some genes have consistently shown interactive effects with environmental risk in MD and, to a lesser extent, in SB. Further investigation is required to disentangle the direct and mediated effects that are common or specific to MD and SB. Since traditional G × E studies overall suffer from important methodological limitations, further effort is required to develop novel methodological strategies with an interdisciplinary approach. Copyright © 2013 Elsevier Ltd. All rights reserved.
Anthropology and Epidemiology: learning epistemological lessons through a collaborative venture
Béhague, Dominique Pareja; Gonçalves, Helen; Victora, Cesar Gomes
2009-01-01
Collaboration between anthropology and epidemiology has a long and tumultuous history. Based on empirical examples, this paper describes a number of epistemological lessons we have learned through our experience of cross-disciplinary collaboration. Although critical of both mainstream epidemiology and medical anthropology, our analysis focuses on the implications of addressing each discipline’s main epistemological differences, while addressing the goal of adopting a broader social approach to health improvement. We believe it is important to push the boundaries of research collaborations from the more standard forms of “multidisciplinarity,” to the adoption of theoretically imbued “interdisciplinarity.” The more we challenge epistemological limitations and modify ways of knowing, the more we will be able to provide in-depth explanations for the emergence of disease-patterns and thus, to problem-solve. In our experience, both institutional support and the adoption of a relativistic attitude are necessary conditions for sustained theoretical interdisciplinarity. Until researchers acknowledge that methodology is merely a human-designed tool to interpret reality, unnecessary methodological hyper-specialization will continue to alienate one field of knowledge from the other. PMID:18833344
Wu, Liyun; Li, Xiaoming
2013-01-01
Background: This review explores the current community-based psychosocial interventions among people living with HIV/AIDS (PLWHA) across the globe. Methods: Evaluation studies were retrieved and reviewed regarding study location, characteristics of participants, study design, intervention strategies, outcome indicators, and intervention findings. Results: The 28 studies spanned a broad range of intervention strategies, including coping skills, treatment and cure, cultural activities, community involvement, knowledge education, voluntary counseling and testing, peer-group support, three-layered service provision, child-directed group intervention, adult mentoring, and support group interventions. Regardless of study designs, all studies reported positive intervention effects, ranging from a reduction in HIV/AIDS stigma, loneliness, marginalization, distress, depression, anger, and anxiety to an increase in self-esteem, self-efficacy, coping skills, and quality of life. Conclusion: Although the existing studies have limitations with regard to program coverage, intensity, scope, and methodological challenges, they underscore the importance of developing community-based interventions to promote psychosocial well-being among PLWHA. Future studies need to employ more rigorous methodology and integrate contextual and institutional factors when implementing effective interventions. PMID:25264499
Parallelization of fine-scale computation in Agile Multiscale Modelling Methodology
NASA Astrophysics Data System (ADS)
Macioł, Piotr; Michalik, Kazimierz
2016-10-01
Nowadays, multiscale modelling of material behavior is an extensively developed area. An important obstacle to its wide application is its high computational demands. Among other solutions, the parallelization of multiscale computations is promising. Heterogeneous multiscale models are good candidates for parallelization, since communication between sub-models is limited. In this paper, the possibility of parallelizing multiscale models based on the Agile Multiscale Methodology framework is discussed. A sequential, FEM-based macroscopic model has been combined with concurrently computed fine-scale models, employing a MatCalc thermodynamic simulator. The main issues investigated in this work are: (i) the speed-up of multiscale models, with special focus on fine-scale computations, and (ii) the decrease in the quality of computations enforced by parallel execution. Speed-up has been evaluated on the basis of Amdahl's law. The problem of 'delay error', arising from the parallel execution of fine-scale sub-models controlled by the sequential macroscopic sub-model, is discussed. Some technical aspects of combining third-party commercial modelling software with an in-house multiscale framework and an MPI library are also discussed.
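The speed-up evaluation mentioned in this abstract rests on Amdahl's law, which bounds the gain from parallelizing only part of a computation. A minimal sketch of the relation (the 0.9 parallel fraction and worker counts below are illustrative, not figures from the paper):

```python
def amdahl_speedup(parallel_fraction, n_workers):
    """Amdahl's law: ideal speed-up when only parallel_fraction of the
    work is spread across n_workers; the rest stays sequential."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_workers)

# If 90% of a multiscale run is parallelizable fine-scale work,
# 4 workers give at most ~3.08x, and no worker count can exceed 10x.
print(round(amdahl_speedup(0.9, 4), 2))   # 3.08
print(round(1.0 / (1.0 - 0.9), 2))        # 10.0 (limit as n_workers grows)
```

The sequential macroscopic FEM step plays the role of the serial fraction here, which is why it ultimately caps the achievable speed-up.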
du Prel, Jean-Baptist; Röhrig, Bernd; Blettner, Maria
2009-02-01
In the era of evidence-based medicine, one of the most important skills a physician needs is the ability to analyze scientific literature critically. This is necessary to keep medical knowledge up to date and to ensure optimal patient care. The aim of this paper is to present an accessible introduction into critical appraisal of scientific articles. Using a selection of international literature, the reader is introduced to the principles of critical reading of scientific articles in medicine. For the sake of conciseness, detailed description of statistical methods is omitted. Widely accepted principles for critically appraising scientific articles are outlined. Basic knowledge of study design, structuring of an article, the role of different sections, of statistical presentations as well as sources of error and limitation are presented. The reader does not require extensive methodological knowledge. As far as necessary for critical appraisal of scientific articles, differences in research areas like epidemiology, clinical, and basic research are outlined. Further useful references are presented. Basic methodological knowledge is required to select and interpret scientific articles correctly.
Afshar, Soheil; Porter, Melanie; Barton, Belinda; Stormon, Michael
2018-05-09
As survival rates for pediatric liver transplant continue to increase, research attention is turning toward long-term functional consequences, with particular interest in whether medical and transplant-related factors are implicated in neurocognitive outcomes. The relative importance of different factors is unclear, due to a lack of methodological uniformity, inclusion of differing primary diagnoses, varying transplant policies, and organ availability in different jurisdictions. This cross-sectional, single-site study sought to address various methodological limitations in the literature and the paucity of studies conducted outside of North America and Western Europe by examining the intellectual and academic outcomes of Australian pediatric liver transplant recipients (N = 40). Participants displayed significantly poorer intellectual and mathematical abilities compared with the normative population. Greater time on the transplant waitlist was a significant predictor of poorer verbal intelligence, working memory, mathematical abilities, and reading but only when considering the subgroup of children with biliary atresia. These findings support reducing the time children wait for a transplant as a priority. © 2018 The American Society of Transplantation and the American Society of Transplant Surgeons.
Zhang, Liding; Wei, Qiujiang; Han, Qinqin; Chen, Qiang; Tai, Wenlin; Zhang, Jinyang; Song, Yuzhu; Xia, Xueshan
2018-01-01
Shigella is an important food-borne zoonotic bacterial pathogen and can cause clinically severe diarrhea. There is an urgent need to develop a specific, sensitive, and rapid methodology for detection of this pathogen. In this study, loop-mediated isothermal amplification (LAMP) combined with magnetic immunocapture assay (IC-LAMP) was first developed for the detection of Shigella in pure culture, artificial milk, and clinical stool samples. This method exhibited a detection limit of 8.7 CFU/mL. Compared with polymerase chain reaction, IC-LAMP is sensitive, specific, and reliable for monitoring Shigella. Additionally, IC-LAMP is more convenient, efficient, and rapid than ordinary LAMP, as it more efficiently enriches pathogen cells without extraction of genomic DNA. Under isothermal conditions, the amplification curves and the green fluorescence were detected within 30 min in the presence of genomic DNA template. The overall analysis time was approximately 1 h, including the enrichment and lysis of the bacterial cells, a remarkably short detection time. Therefore, the IC-LAMP methodology described here is potentially useful for the efficient detection of Shigella in various samples. PMID:29467730
Khachadourian, Vahe; Armenian, Haroutune; Demirchyan, Anahit; Melkonian, Arthur; Hovanesian, Ashot
2016-07-01
The post-earthquake psychopathological investigation (PEPSI) was designed to probe the short- and long-term effects of the earthquake in northern Armenia on 7 December 1988 on survivors' mental and physical health. Four phases of this study have been conducted to date, and, overall, more than 80 per cent of a sub-sample of 1,773 drawn from an initial cohort of 32,743 was successfully followed during 2012. This paper describes the methodology employed in the evaluation, summarises previous findings, details the current objectives, and examines the general characteristics of the sample based on the most recent follow-up phase outcomes. Despite a significant decrease in psychopathology rates between 1990 and 2012, prevalence rates of post-traumatic stress disorder and depression among study participants in 2012 were greater than 15 and 26 per cent, respectively. The paper also notes the strengths and limitations of the study vis-à-vis future research and highlights the importance and potential practical implications of similar assessments and their outcomes. © 2016 The Author(s). Disasters © Overseas Development Institute, 2016.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.; Rising, Michael Evan; Alwin, Jennifer Louise
Whisper is a statistical analysis package developed in 2014 to support nuclear criticality safety (NCS) validation [1-3]. It uses the sensitivity profile data for an application as computed by MCNP6 [4-6] along with covariance files [7,8] for the nuclear data to determine a baseline upper-subcritical-limit (USL) for the application. Whisper version 1.0 was first developed and used at LANL in 2014 [3]. During 2015-2016, Whisper was updated to version 1.1 [9] and is to be included with the upcoming release of MCNP6.2. This document describes the Whisper-1.1 package that will be included with the MCNP6.2 release during 2017. Specific details are provided on the computer systems supported, the software quality assurance (SQA) procedures, installation, and testing. This document does not address other important topics, such as the importance of sensitivity-uncertainty (SU) methods to NCS validation, the theory underlying SU methodology, tutorials on the usage of MCNP-Whisper, practical approaches to using SU methodology to support and extend traditional validation, etc. There are over 50 documents included with Whisper-1.1 and available in the MCNP Reference Collection on the MCNP website (mcnp.lanl.gov) that address all of those topics and more. In this document, however, a complete bibliography of relevant MCNP-Whisper references is provided.
The utility of human sciences in nursing inquiry.
Pratt, Maria
2012-01-01
This paper targets novice nurse researchers to highlight how the perspectives of human sciences are useful in understanding people's experiences. There is a need to address the utility of human sciences or the humanistic philosophy that values the understanding of subjective experiences in nursing, given that the mainstream development of nursing knowledge is still influenced by the positivist and post-positivist research paradigms. Discussion papers on Heideggerian hermeneutic phenomenology, human sciences, and qualitative research were accessed through the databases Cinahl and Medline over the past 30 years. Seminal works on phenomenology were addressed in this paper. Using Heideggerian hermeneutic phenomenology as a commonly referenced human philosophy and methodology, this paper discusses how Heidegger's (1962) perspective may be used in nursing practice and research. Van Manen's (1990) descriptions of phenomenological science are discussed to address the perspective's value in nursing inquiry and to reveal the biases associated with this humanistic approach. The limitations of human sciences should not deter nurse researchers from using this type of nursing inquiry as it can provide an important framework in nursing research, practice and knowledge development. The author's perspective as a graduate student highlights the importance of human sciences in exploring the experiences of people vital in the delivery of nursing practice. However, researchers wishing to undertake humanistic inquiry should learn the philosophical and methodological underpinnings of their chosen humanistic approach.
Few, Roger; Lake, Iain; Hunter, Paul R; Tran, Pham Gia; Thien, Vu Trong
2009-12-21
Understanding how risks to human health change as a result of seasonal variations in environmental conditions is likely to become of increasing importance in the context of climatic change, especially in lower-income countries. A multi-disciplinary approach can be a useful tool for improving understanding, particularly in situations where existing data resources are limited but the environmental health implications of seasonal hazards may be high. This short article describes a multi-disciplinary approach combining analysis of changes in levels of environmental contamination, seasonal variations in disease incidence and a social scientific analysis of health behaviour. The methodology was field-tested in a peri-urban environment in the Mekong Delta, Vietnam, where poor households face alternate seasonal extremes in the local environment as the water level in the Delta changes from flood to dry season. Low-income households in the research sites rely on river water for domestic uses, including provision of drinking water, and it is commonly perceived that the seasonal changes alter risk from diarrhoeal diseases and other diseases associated with contamination of water. The discussion focuses on the implementation of the methodology in the field, and draws lessons from the research process that can help in refining and developing the approach for application in other locations where seasonal dynamics of disease risk may have important consequences for public health.
Farmworker Exposure to Pesticides: Methodologic Issues for the Collection of Comparable Data
Arcury, Thomas A.; Quandt, Sara A.; Barr, Dana B.; Hoppin, Jane A.; McCauley, Linda; Grzywacz, Joseph G.; Robson, Mark G.
2006-01-01
The exposure of migrant and seasonal farmworkers and their families to agricultural and residential pesticides is a continuing public health concern. Pesticide exposure research has been spurred on by the development of sensitive and reliable laboratory techniques that allow the detection of minute amounts of pesticides or pesticide metabolites. The power of research on farmworker pesticide exposure has been limited because of variability in the collection of exposure data, the predictors of exposure considered, the laboratory procedures used in analyzing the exposure, and the measurement of exposure. The Farmworker Pesticide Exposure Comparable Data Conference assembled 25 scientists from diverse disciplinary and organizational backgrounds to develop methodologic consensus in four areas of farmworker pesticide exposure research: environmental exposure assessment, biomarkers, personal and occupational predictors of exposure, and health outcomes of exposure. In this introduction to this mini-monograph, first, we present the rationale for the conference and its organization. Second, we discuss some of the important challenges in conducting farmworker pesticide research, including the definition and size of the farmworker population, problems in communication and access, and the organization of agricultural work. Third, we summarize major findings from each of the conference’s four foci—environmental exposure assessment, biomonitoring, predictors of exposure, and health outcomes of exposure—as well as important laboratory and statistical analysis issues that cross-cut the four foci. PMID:16759996
DOE Office of Scientific and Technical Information (OSTI.GOV)
Augustine, Chad
Existing methodologies for estimating the electricity generation potential of Enhanced Geothermal Systems (EGS) assume thermal recovery factors of 5% or less, resulting in relatively low volumetric electricity generation potentials for EGS reservoirs. This study proposes and develops a methodology for calculating EGS electricity generation potential based on the Gringarten conceptual model and analytical solution for heat extraction from fractured rock. The electricity generation potential of a cubic kilometer of rock as a function of temperature is calculated assuming limits on the allowed produced water temperature decline and reservoir lifetime based on surface power plant constraints. The resulting estimates of EGS electricity generation potential can be one to nearly two orders of magnitude larger than those from existing methodologies. The flow per unit fracture surface area from the Gringarten solution is found to be a key term in describing the conceptual reservoir behavior. The methodology can be applied to aid in the design of EGS reservoirs by giving minimum reservoir volume, fracture spacing, number of fractures, and flow requirements for a target reservoir power output. Limitations of the idealized model compared to actual reservoir performance and the implications on reservoir design are discussed.
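For contrast with the Gringarten-based approach this abstract describes, the fixed-recovery-factor "heat-in-place" style of estimate it improves upon can be sketched as follows. All numerical values here (rock density, heat capacity, 5% recovery factor, 10% conversion efficiency, 30-year lifetime) are illustrative assumptions, not figures from the study, and the sketch does not implement the analytical heat-extraction solution itself:

```python
def volumetric_potential_mwe(volume_m3, t_rock_c, t_ref_c, recovery,
                             efficiency, lifetime_years,
                             rho=2700.0, cp=1000.0):
    """Rough volumetric EGS estimate: heat in place in a rock volume,
    scaled by a fixed recovery factor and conversion efficiency, spread
    over the plant lifetime to give average electric output in MWe."""
    heat_j = rho * cp * volume_m3 * (t_rock_c - t_ref_c)  # heat in place, J
    electric_j = heat_j * recovery * efficiency           # recoverable, J(e)
    seconds = lifetime_years * 365.25 * 24 * 3600
    return electric_j / seconds / 1e6                     # W -> MW

# One cubic kilometre of 200 degC rock against an 80 degC reference,
# 5% recovery, 10% conversion, 30-year lifetime:
print(round(volumetric_potential_mwe(1e9, 200.0, 80.0, 0.05, 0.10, 30.0), 2))
# ~1.71 MWe
```

Raising the effective recovery factor, as the Gringarten-based methodology does, scales this figure up directly, which is how one-to-two order-of-magnitude larger potentials arise.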
Methodological Considerations in Designing and Evaluating Animal-Assisted Interventions.
Stern, Cindy; Chur-Hansen, Anna
2013-02-27
This paper presents a discussion of the literature on animal-assisted interventions and describes limitations surrounding current methodological quality. Benefits to human physical, psychological and social health cannot be empirically confirmed due to the methodological limitations of the existing body of research, and comparisons cannot validly be made across different studies. Without a solid research base animal-assisted interventions will not receive recognition and acceptance as a credible alternative health care treatment. The paper draws on the work of four systematic reviews conducted over April-May 2009, with no date restrictions, focusing exclusively on the use of canine-assisted interventions for older people residing in long-term care. The reviews revealed a lack of good quality studies. Although the literature base has grown in volume since its inception, it predominantly consists of anecdotal accounts and reports. Experimental studies undertaken are often flawed in aspects of design, conduct and reporting. There are few qualitative studies available leading to the inability to draw definitive conclusions. It is clear that due to the complexities associated with these interventions not all weaknesses can be eliminated. However, there are basic methodological weaknesses that can be addressed in future studies in the area. Checklists for quantitative and qualitative research designs to guide future research are offered to help address methodological rigour.
Sources of self-efficacy in an undergraduate introductory astronomy course for non-science majors
NASA Astrophysics Data System (ADS)
Carter, Brooke L.
The role of the astronomy laboratory in non-science major students' self-efficacy is investigated by combining quantitative and qualitative methodologies. The Astronomy Diagnostic Test 2.0 (ADT 2.0) is distributed to an introductory astronomy laboratory class for non-science majors in the Spring of 2005. The ADT 2.0 is used to draw comparisons between interview subjects and the remaining class. Eight subjects were interviewed three times throughout the semester in order to determine the important contributing factors to the subjects' self-efficacy beliefs. Results of the quantitative data suggest that the interview participants' general science self-efficacy did not significantly increase over the course of the semester. Results of the qualitative data suggest the most important contributor to the subjects' self-efficacy in the laboratory is verbal persuasion. The results of this limited study suggest that the astronomy laboratory experience is a strong contributor to student self-efficacy beliefs.
The Baptist Health Nurse Retention Questionnaire: A Methodological Study, Part 1.
Lengerich, Alexander; Bugajski, Andrew; Marchese, Matthew; Hall, Brittany; Yackzan, Susan; Davies, Claire; Brockopp, Dorothy
2017-05-01
The purposes of this study were to develop and test the Baptist Health Nurse Retention Questionnaire (BHNRQ) and examine the importance of nurse retention factors. Multiple factors, including increasing patient acuity levels, have led to concerns regarding nurse retention. An understanding of current factors related to retention is limited. To establish the psychometric properties of the BHNRQ, data were collected from 279 bedside nurses at a 391-bed, Magnet® redesignated community hospital. A principal component analysis was conducted to determine the subscale structure of the BHNRQ. Additional analyses were conducted related to content validity and test-retest reliability. The results of the principal components analysis revealed 3 subscales: nursing practice, management, and staffing. Analyses demonstrate that the BHNRQ is a reliable and valid instrument for measuring nurse retention factors. The BHNRQ was found to be a clinically useful instrument for measuring important factors related to nurse retention.
Apparatus For Measuring The Concentration Of A Species At A Distance
Rice, Steven F.; Allendorf, Mark D.
2006-04-11
Corrosion of refractory silica brick and air quality issues due to particulate emissions are two important glass manufacturing issues that have been tied to sodium vapor and its transport throughout the melt tank. Knowledge of the relationship between tank operating conditions and tank atmosphere sodium levels is therefore an important consideration in correcting corrosion and air quality issues. However, until recently direct quantitative measurement of sodium levels has been limited to extractive sampling methods followed by laboratory analysis. Excimer laser induced fragmentation (ELIF) fluorescence spectroscopy is a technique that permits the measurement of volatilized NaOH in high temperature environments on a timescale of less than one second. The development of this method and the construction of field-portable instrumentation for glass furnace applications are herein disclosed. The method is shown to be effective in full-scale industrial settings. Characteristics of the method are outlined, including equipment configuration, detection sensitivity, and calibration methodology.
Reed, Darcy A; Fletcher, Kathlyn E; Arora, Vineet M
2010-12-21
The Accreditation Council for Graduate Medical Education's new duty-hour standards limit interns' shifts to 16 hours and night float to 6 consecutive nights. Protected sleep time (that is, "nap") is strongly encouraged. As duty-hour reforms are implemented, examination of the quality and outcomes of the relevant literature is important. To systematically review the literature examining shift length, protected sleep time, and night float. MEDLINE, PREMEDLINE, and EMBASE from January 1989 through May 2010. Studies examined the associations of shift length, protected sleep time, or night float with patient care, resident health, and education outcomes among residents in practice settings. Study quality was measured by using the validated Medical Education Research Study Quality Instrument and the U.S. Preventive Services Task Force criteria. Two investigators independently rated study quality, and interrater agreement was calculated. Sixty-four studies met inclusion criteria. Most studies used single-group cross-sectional (19 studies [29.7%]) or pre-post (41 studies [64.1%]) designs, and 4 (6.3%) were randomized, controlled trials. Five studies (7.8%) were multi-institutional. Twenty-four of 33 (72.7%) studies examining shift length reported that shorter shifts were associated with decreased medical errors, motor vehicle crashes, and percutaneous injuries. Only 2 studies assessed protected sleep time and reported that residents' adherence to naps was poor. Night floats described in 33 studies involved 5 to 7 consecutive nights. Most studies used single-institution, observational designs. Publication bias is likely but difficult to assess in this methodologically weak and heterogeneous body of evidence. For the limited outcomes measured, most studies supported reducing shift length but did not adequately address the optimal shift duration. Studies had numerous methodological limitations and unclear generalizability for most outcomes. 
Specific recommendations about shift length, protected sleep time, and night float should acknowledge the limitations of this evidence. Accreditation Council for Graduate Medical Education.
An Integrated Low-Speed Performance and Noise Prediction Methodology for Subsonic Aircraft
NASA Technical Reports Server (NTRS)
Olson, E. D.; Mavris, D. N.
2000-01-01
An integrated methodology has been assembled to compute the engine performance, takeoff and landing trajectories, and community noise levels for a subsonic commercial aircraft. Where feasible, physics-based noise analysis methods have been used to make the results more applicable to newer, revolutionary designs and to allow for a more direct evaluation of new technologies. The methodology is intended to be used with approximation methods and risk analysis techniques to allow for the analysis of a greater number of variable combinations while retaining the advantages of physics-based analysis. Details of the methodology are described and limited results are presented for a representative subsonic commercial aircraft.
Developing comparative criminology and the case of China: an introduction.
Liu, Jianhong
2007-02-01
Although comparative criminology has developed significantly during the past decade or so, systematic empirical research has emerged on only a few topics, and comparative criminology has never occupied a central position in criminology. This article analyzes the major theoretical and methodological impediments to the development of comparative criminology. It stresses the need for a methodological shift away from the conventional approach that uses the nation as the unit of analysis and toward the in-depth case study as a primary methodological approach. The article maintains that the case study method can overcome the limitations of its descriptive tradition and become a promising methodological approach for comparative criminology.
Critical Communicative Methodology: Informing Real Social Transformation through Research
ERIC Educational Resources Information Center
Gomez, Aitor; Puigvert, Lidia; Flecha, Ramon
2011-01-01
The critical communicative methodology (CCM) is a methodological response to the dialogic turn of societies and sciences that has already had an important impact in transforming situations of inequality and exclusion. Research conducted with the CCM implies continuous and egalitarian dialogue among researchers and the people involved in the…
A new method for determining the optimal lagged ensemble
DelSole, T.; Tippett, M. K.; Pegion, K.
2017-01-01
Abstract We propose a general methodology for determining the lagged ensemble that minimizes the mean square forecast error (MSE). The MSE of a lagged ensemble is shown to depend only on a quantity called the cross-lead error covariance matrix, which can be estimated from a short hindcast data set and parameterized in terms of analytic functions of time. The resulting parameterization allows the skill of forecasts to be evaluated for an arbitrary ensemble size and initialization frequency. Remarkably, the parameterization can also estimate the MSE of a burst ensemble simply by taking the limit of an infinitely small interval between initialization times. This methodology is applied to forecasts of the Madden-Julian Oscillation (MJO) from the Climate Forecast System version 2 (CFSv2). For leads greater than a week, little improvement in MJO forecast skill is found when lagged ensembles span more than 5 days or when initializations occur more than 4 times per day. We find that if initialization is too infrequent, important structures of the lagged error covariance matrix are lost. Lastly, we demonstrate that the forecast error at leads ≥10 days can be reduced by optimally weighting the lagged ensemble members. The weights are shown to depend only on the cross-lead error covariance matrix. While the methodology developed here is applied to CFSv2, the technique can be easily adapted to other forecast systems. PMID:28580050
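The optimal weighting step can be sketched as follows. This is a hedged illustration, not the authors' code: it assumes only the standard minimum-variance result that weights minimizing the MSE of a weighted lagged-ensemble mean, constrained to sum to one, are w = C⁻¹1 / (1ᵀC⁻¹1), where C is the cross-lead error covariance matrix described in the abstract. The toy covariance values are invented.

```python
import numpy as np

def optimal_lag_weights(C):
    """Weights minimizing the MSE of the weighted lagged-ensemble mean,
    subject to the weights summing to 1: w = C^-1 1 / (1' C^-1 1)."""
    ones = np.ones(C.shape[0])
    x = np.linalg.solve(C, ones)
    return x / (ones @ x)

# Toy cross-lead error covariance: older lags (later rows/columns) have
# larger error variance, as expected for forecasts initialized earlier.
C = np.array([[1.0, 0.6, 0.4],
              [0.6, 1.5, 0.7],
              [0.4, 0.7, 2.2]])

w = optimal_lag_weights(C)
mse_opt = w @ C @ w          # MSE of the optimally weighted mean
w_eq = np.full(3, 1.0 / 3.0)
mse_eq = w_eq @ C @ w_eq     # MSE of the plain (equal-weight) lagged mean
```

By construction the optimal weights never do worse than equal weighting, which is why the abstract reports error reductions at long leads.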
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, David; Brunett, Acacia J.; Passerini, Stefano
GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory (Argonne) participated in a two-year collaboration to modernize and update the probabilistic risk assessment (PRA) for the PRISM sodium fast reactor. At a high level, the primary outcome of the project was the development of a next-generation PRA that is intended to enable risk-informed prioritization of safety- and reliability-focused research and development. A central Argonne task during this project was a reliability assessment of passive safety systems, which included the Reactor Vessel Auxiliary Cooling System (RVACS) and the inherent reactivity feedbacks of the metal fuel core. Both systems were examined utilizing a methodology derived from the Reliability Method for Passive Safety Functions (RMPS), with an emphasis on developing success criteria based on mechanistic system modeling while also maintaining consistency with the Fuel Damage Categories (FDCs) of the mechanistic source term assessment. This paper provides an overview of the reliability analyses of both systems, including highlights of the FMEAs, the construction of best-estimate models, uncertain parameter screening and propagation, and the quantification of system failure probability. In particular, special focus is given to the methodologies used to analyze uncertainty propagation and to determine the likelihood of violating FDC limits. Additionally, important lessons learned are reviewed, such as optimal sampling methodologies for the discovery of low-likelihood failure events and strategies for the combined treatment of aleatory and epistemic uncertainties.
Development of an evidence-based review with recommendations using an online iterative process.
Rudmik, Luke; Smith, Timothy L
2011-01-01
The practice of modern medicine is governed by evidence-based principles. Due to the plethora of medical literature, clinicians often rely on systematic reviews and clinical guidelines to summarize the evidence and provide best practices. Implementation of an evidence-based clinical approach can minimize variation in health care delivery and optimize the quality of patient care. This article reports a method for developing an "Evidence-based Review with Recommendations" using an online iterative process. The manuscript describes the following steps involved in this process: clinical topic selection, evidence-based review assignment, literature review and initial manuscript preparation, iterative review process with author selection, and manuscript finalization. The goal of this article is to improve efficiency and increase the production of evidence-based reviews while maintaining the high quality and transparency associated with the rigorous methodology utilized for clinical guideline development. With the rise of evidence-based medicine, most medical and surgical specialties have an abundance of clinical topics which would benefit from a formal evidence-based review. Although clinical guideline development is an important methodology, the associated challenges limit development to only the absolute highest priority clinical topics. As outlined in this article, the online iterative approach to the development of an Evidence-based Review with Recommendations may improve productivity without compromising the quality associated with formal guideline development methodology. Copyright © 2011 American Rhinologic Society-American Academy of Otolaryngic Allergy, LLC.
Methodological challenges to bridge the gap between regional climate and hydrology models
NASA Astrophysics Data System (ADS)
Bozhinova, Denica; José Gómez-Navarro, Juan; Raible, Christoph; Felder, Guido
2017-04-01
The frequency and severity of floods worldwide, together with their impacts, are expected to increase under climate change scenarios. It is therefore very important to gain insight into the physical mechanisms responsible for such events in order to constrain the associated uncertainties. Model simulations of the climate and hydrological processes are important tools that can provide insight into the underlying physical processes and thus enable an accurate assessment of the risks. Coupled together, they can provide a physically consistent picture that allows the phenomenon to be assessed comprehensively. However, climate and hydrological models work at different temporal and spatial scales, so there are a number of methodological challenges that need to be carefully addressed. An important issue pertains to the presence of biases in the simulation of precipitation. Climate models in general, and Regional Climate Models (RCMs) in particular, are affected by a number of systematic biases that limit their reliability. In many studies, most prominently assessments of changes due to climate change, such biases are minimised by applying the so-called delta approach, which focuses on changes and disregards absolute values, which are more affected by biases. However, this approach is not suitable in this scenario, as the absolute value of precipitation, rather than the change, is fed into the hydrological model. The bias therefore has to be removed beforehand, a complex matter for which various methodologies have been proposed. In this study, we apply and discuss the advantages and caveats of two different methodologies that correct the simulated precipitation to minimise differences with respect to an observational dataset: a linear fit (FIT) of the accumulated distributions and Quantile Mapping (QM). The target region is Switzerland, and therefore the observational dataset is provided by MeteoSwiss.
The RCM is the Weather Research and Forecasting model (WRF), driven at the boundaries by the Community Earth System Model (CESM). The raw simulation driven by CESM exhibits prominent biases that stand out in the evolution of the annual cycle and demonstrate that bias correction is mandatory in this type of study, rather than a minor correction that might be neglected. The simulation spans the period 1976-2005, although the correction is applied on a daily basis. Both methods lead to a corrected precipitation field that respects the temporal evolution of the simulated precipitation while mimicking the distribution of precipitation in the observations. Due to the nature of the two methodologies, there are important differences between the products of the corrections, leading to datasets with different properties. FIT is generally more accurate in reproducing the tails of the distribution, i.e. extreme events, whereas the nature of QM renders it a general-purpose correction whose skill is distributed evenly across the full distribution of precipitation, including central values.
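Of the two corrections compared above, quantile mapping lends itself to a compact sketch. The following is illustrative only: the function name and the synthetic gamma-distributed "precipitation" are assumptions, and the study's actual implementation (daily, against MeteoSwiss observations, per region) is not reproduced here.

```python
import numpy as np

def quantile_map(model, obs, values):
    """Empirical quantile mapping: each entry of `values` keeps its
    quantile within the model distribution but takes the magnitude that
    the same quantile has in the observations."""
    q = np.linspace(0.0, 1.0, 101)
    model_q = np.quantile(model, q)  # model quantile function (empirical)
    obs_q = np.quantile(obs, q)      # observed quantile function
    # locate each value on the model quantiles, read off the obs value;
    # np.interp clamps at the endpoints outside the calibration range
    return np.interp(values, model_q, obs_q)

rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 3.0, size=5000)                  # "observed" precipitation
model = 1.5 * rng.gamma(2.0, 3.0, size=5000) + 1.0    # biased simulation
corrected = quantile_map(model, obs, model)
```

After correction, the simulated distribution closely matches the observed one, while the day-to-day ordering (wet vs. dry days) of the simulation is preserved, which is the property the abstract highlights.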
2005-03-01
Spontaneous intracerebral hemorrhage (ICH) is one of the most lethal stroke types. In December 2003, a National Institute of Neurological Disorders and Stroke (NINDS) workshop was convened to develop a consensus on ICH research priorities. The focus was clinical research aimed at acute ICH in patients. Workshop participants were divided into 6 groups: (1) current state of ICH research; (2) basic science; (3) imaging; (4) medical; (5) surgical; and (6) clinical methodology. Each group formulated research priorities before the workshop. At the workshop, these were discussed and refined. Recent progress in management of hemorrhage growth and intraventricular hemorrhage, and limitations in the benefit of open craniotomy, were noted. The workshop identified the importance of developing animal models that reflect human ICH, as well as the phenomenon of rebleeding. More human ICH pathology is needed. Real-time, high-field magnets and 3-dimensional imaging, as well as high-resolution tissue probes, are ICH imaging priorities. Trials of acute blood pressure-lowering in ICH and of coagulopathy reversal are medical priorities. The exact role of edema in human ICH pathology and its treatment requires intensive study. Trials of minimally invasive surgical techniques, including mechanical and chemical surgical adjuncts, are critically important. The methodologic challenges include establishing research networks and a multi-specialty approach. Waiver of consent and standardization of care in trials are important issues. Encouraging young investigators from varied backgrounds to enter the ICH research field is critical. Increasing ICH research is crucial. A collaborative approach is likely to yield therapies for this devastating form of brain injury.
Prabhakar, Ramachandran
2012-01-01
Source to surface distance (SSD) plays a very important role in external beam radiotherapy treatment verification. In this study, a simple technique has been developed to verify the SSD automatically with lasers. The study also suggests a methodology for deriving a respiratory signal with lasers. Two lasers, red and green, are mounted on the collimator head of a Clinac 2300 C/D linac along with a camera to determine the SSD. Software (SSDLas) was developed to estimate the SSD automatically from the images captured by a 12-megapixel camera. To determine the SSD to a patient surface, the external body contour of the central-axis transverse computed tomography (CT) cut is imported into the software. Another important aspect of radiotherapy is the generation of a respiratory signal. Changes in the separation of the lasers as the patient breathes are converted into a respiratory signal. Multiple frames of laser images were acquired from the camera mounted on the collimator head, and each frame was analyzed with SSDLas to generate the respiratory signal. The SSD observed with the ODI on the machine and the SSD measured by SSDLas were found to agree within the tolerance limit. The methodology described for generating respiratory signals will be useful for the treatment of mobile tumors such as those of the lung, liver, breast, and pancreas. The technique described for determining the SSD and generating respiratory signals using lasers is cost-effective and simple to implement. Copyright © 2011 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Geisler, Cheryl
2018-01-01
Coding, the analytic task of assigning codes to nonnumeric data, is foundational to writing research. A rich discussion of methodological pluralism has established the foundational importance of systematicity in the task of coding, but less attention has been paid to the equally important commitment to language complexity. Addressing the interplay…
Problems and Limitations in Studies on Screening for Language Delay
ERIC Educational Resources Information Center
Eriksson, Marten; Westerlund, Monica; Miniscalco, Carmela
2010-01-01
This study discusses six common methodological limitations in screening for language delay (LD) as illustrated in 11 recent studies. The limitations are (1) whether the studies define a target population, (2) whether the recruitment procedure is unbiased, (3) attrition, (4) verification bias, (5) small sample size and (6) inconsistencies in choice…
X-Phi and Carnapian Explication.
Shepherd, Joshua; Justus, James
2015-04-01
The rise of experimental philosophy (x-phi) has placed metaphilosophical questions, particularly those concerning concepts, at the center of philosophical attention. X-phi offers empirically rigorous methods for identifying conceptual content, but what exactly it contributes towards evaluating conceptual content remains unclear. We show how x-phi complements Rudolf Carnap's underappreciated methodology for concept determination, explication. This clarifies and extends x-phi's positive philosophical import, and also exhibits explication's broad appeal. But there is a potential problem: Carnap's account of explication was limited to empirical and logical concepts, but many concepts of interest to philosophers (experimental and otherwise) are essentially normative. With formal epistemology as a case study, we show how x-phi assisted explication can apply to normative domains.
Deanol in minimal brain dysfunction.
Lewis, J A; Lewis, B S
1977-12-01
The literature on minimal brain dysfunction is confused, confusing and controversial. The statements that the condition exists, needs treatment, and that treatment may be pharmacological, are more expressions of faith than accepted fact. We believe they are true (within limits not discussed in the article). Furthermore, there is evidence that some patients with MBD are hypo-aroused, while others are not. The role of deanol in the treatment of MBD is still unclear, because of the complexities of identifying appropriate patients in terms of levels of arousal, as well as identifying appropriate measures of response. There is sufficient support for an effect of deanol in the literature to justify further investigation. Further studies should attend to important methodological problems as discussed.
Chile rural electrification cooperation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flowers, L.
1997-12-01
The author describes a joint program to use renewables for rural electrification projects in Chile. The initial focus was on a limited part of the country, involving wind mapping, pilot project planning, training, and development of methodologies for comparative evaluations of resources. To this point three wind hybrid systems have been installed in one region, as part of the regional private utility, and three additional projects are being designed. Additional resource assessment and training is ongoing. The author points out the difficulties in working with utilities, the importance of signed documentation, and the need to view these programs as long term because of the time involved in introducing such new technologies.
Strengthening Precipitate Morphologies Fully Quantified in Advanced Disk Superalloys
NASA Technical Reports Server (NTRS)
Gabb, Timothy P.
1998-01-01
Advanced aviation gas turbine engines will require disk superalloys that can operate at higher temperatures and stresses than current conditions. Such applications will be limited by the tensile, creep, and fatigue mechanical properties of these alloys. These mechanical properties vary with the size, shape, and quantity of the gamma precipitates that strengthen disk superalloys. It is therefore important to quantify these precipitate parameters and relate them to mechanical properties to improve disk superalloys. Favorable precipitate morphologies and practical processing approaches to achieve them can then be determined. A methodology has been developed at the NASA Lewis Research Center to allow the comprehensive quantification of the size, shape, and quantity of all types of gamma precipitates.
Informed consent in blood transfusion: knowledge and administrative issues in Uganda hospitals.
Kajja, Isaac; Bimenya, Gabriel S; Smit Sibinga, Cees Th
2011-02-01
Blood as a transplant is not free of risks. Clinicians and patients ought to know the parameters of transfusion informed consent. A mixed-methods study was conducted to explore patients' and clinicians' knowledge of transfusion informed consent, opinions on how it is administered, and strategies to improve the process. The clinicians' level of knowledge was limited to the provision of information about, and the right to consent to, a transfusion. They disagreed on administrative issues but offered reasonable opinions on improving the process. Patients perceived the process as an assurance of blood safety. This process is important and should not be omitted. Copyright © 2010 Elsevier Ltd. All rights reserved.
General Principles for Brain Design
NASA Astrophysics Data System (ADS)
Josephson, Brian D.
2006-06-01
The task of understanding how the brain works has met with only limited success since important design concepts are not as yet incorporated in the analysis. Relevant concepts can be uncovered by studying the powerful methodologies that have evolved in the context of computer programming, raising the question of how the concepts involved there can be realised in neural hardware. Insights can be gained in regard to such issues through the study of the role played by models and representation. These insights lead on to an appreciation of the mechanisms underlying subtle capacities such as those concerned with the use of language. A precise, essentially mathematical account of such capacities is in prospect for the future.
Modeling Viral Capsid Assembly
2014-01-01
I present a review of the theoretical and computational methodologies that have been used to model the assembly of viral capsids. I discuss the capabilities and limitations of approaches ranging from equilibrium continuum theories to molecular dynamics simulations, and I give an overview of some of the important conclusions about virus assembly that have resulted from these modeling efforts. Topics include the assembly of empty viral shells, assembly around single-stranded nucleic acids to form viral particles, and assembly around synthetic polymers or charged nanoparticles for nanotechnology or biomedical applications. I present some examples in which modeling efforts have promoted experimental breakthroughs, as well as directions in which the connection between modeling and experiment can be strengthened. PMID:25663722
A Comparative Analysis of Disaster Risk, Vulnerability and Resilience Composite Indicators.
Beccari, Benjamin
2016-03-14
In the past decade significant attention has been given to the development of tools that attempt to measure the vulnerability, risk or resilience of communities to disasters. Particular attention has been given to the development of composite indices to quantify these concepts mirroring their deployment in other fields such as sustainable development. Whilst some authors have published reviews of disaster vulnerability, risk and resilience composite indicator methodologies, these have been of a limited nature. This paper seeks to dramatically expand these efforts by analysing 106 composite indicator methodologies to understand the breadth and depth of practice. An extensive search of the academic and grey literature was undertaken for composite indicator and scorecard methodologies that addressed multiple/all hazards; included social and economic aspects of risk, vulnerability or resilience; were sub-national in scope; explained the method and variables used; focussed on the present-day; and, had been tested or implemented. Information on the index construction, geographic areas of application, variables used and other relevant data was collected and analysed. Substantial variety in construction practices of composite indicators of risk, vulnerability and resilience were found. Five key approaches were identified in the literature, with the use of hierarchical or deductive indices being the most common. Typically variables were chosen by experts, came from existing statistical datasets and were combined by simple addition with equal weights. A minimum of 2 variables and a maximum of 235 were used, although approximately two thirds of methodologies used less than 40 variables. The 106 methodologies used 2298 unique variables, the most frequently used being common statistical variables such as population density and unemployment rate. 
Classification of variables found that on average 34% of the variables used in each methodology related to the social environment, 25% to the disaster environment, 20% to the economic environment, 13% to the built environment, 6% to the natural environment and 3% were other indices. However, variables specifically measuring action to mitigate or prepare for disasters comprised, on average, only 12% of the total number of variables in each index. Only 19% of methodologies employed any sensitivity or uncertainty analysis, and in only a single case was this comprehensive. A number of potential limitations of the present state of practice and how these might impact on decision makers are discussed. In particular, the limited deployment of sensitivity and uncertainty analysis and the low use of direct measures of disaster risk, vulnerability and resilience could significantly limit the quality and reliability of existing methodologies. Recommendations for improvements to indicator development and use are made, as well as suggested future research directions to enhance the theoretical and empirical knowledge base for composite indicator development.
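The most common construction pattern identified in the review (expert-chosen variables, min-max normalization, and simple addition with equal weights in a hierarchical index) can be sketched as follows. All domain names and data here are illustrative; they are not drawn from any of the 106 reviewed methodologies.

```python
import numpy as np

def minmax(x):
    """Normalize a variable to [0, 1] across regions."""
    return (x - x.min()) / (x.max() - x.min())

def composite_index(domains):
    """Hierarchical composite index with equal weights.

    domains: dict mapping domain name -> 2D array (regions x variables).
    Each variable is min-max normalized, averaged within its domain,
    and the domain scores are averaged with equal weights."""
    domain_scores = []
    for name, data in domains.items():
        norm = np.column_stack([minmax(data[:, j]) for j in range(data.shape[1])])
        domain_scores.append(norm.mean(axis=1))
    return np.mean(domain_scores, axis=0)

rng = np.random.default_rng(1)
domains = {
    "social":   rng.random((10, 4)),  # e.g. population density, unemployment
    "economic": rng.random((10, 3)),
    "built":    rng.random((10, 2)),
}
index = composite_index(domains)  # one vulnerability score per region, in [0, 1]
```

The review's criticism applies directly to this pattern: nothing above tests how sensitive the ranking of regions is to the choice of normalization, weights, or aggregation rule.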
THE USE AND LIMITATIONS OF DETECTION AND QUANTITATION LIMITS IN ENVIRONMENTAL ANALYSIS
Site assessment, remediation and compliance monitoring require the routine determination of the concentration of regulated substances in environmental samples. Each measurement methodology providing the concentration determinations is required to specify key data quality elemen...
Possible Improvements of the ACE Diversity Interchange Methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Etingov, Pavel V.; Zhou, Ning; Makarov, Yuri V.
2010-07-26
The North American Electric Reliability Corporation (NERC) grid is operated by about 131 balancing authorities (BAs). Within each BA, operators are responsible for managing the imbalance caused by both load and wind. As wind penetration levels increase, the challenge of managing power variation increases. Working independently, a balancing area with limited regulating/load-following generation and high wind power penetration faces significant challenges. The benefits of BA cooperation and consolidation increase when there is significant wind energy penetration. To explore the benefits of BA cooperation, this paper investigates an ACE sharing approach. A technology called ACE diversity interchange (ADI) is already in use in the western interconnection. A new methodology extending ADI is proposed in this paper; the proposed advanced ADI overcomes some limitations of conventional ADI. Simulations using real statistical data from CAISO and BPA have shown high performance of the proposed advanced ADI methodology.
Beckensteiner, Jennifer; Kaplan, David M.; Potts, Warren M.; Santos, Carmen V.; O’Farrell, Michael R.
2016-01-01
Excessive truncation of a population's size structure is often identified as an important deleterious effect of exploitation, yet the effect of exploitation-driven size-structure truncation on population persistence is often not quantified due to data limitations. In this study, we estimate changes in eggs per recruit (EPR) using annual length-frequency samples over a 9-year period to assess persistence of the two most important recreational fishes in southern Angola: west coast dusky kob (Argyrosomus coronus) and leerfish (Lichia amia). Using a length- and age-structured model, we improve on an existing method to fit this type of model to length-frequency data and estimate EPR. The objectives of the methodological changes are to add flexibility and robustness to the approach for assessing population status in data-limited situations. Results indicate that dusky kob presented very low levels of EPR (5%-10% of the per-recruit reproductive capacity in the absence of fishing) in 2013, whereas large inter-annual variability in leerfish estimates suggests that caution must be applied when drawing conclusions about its exploitation status. Using simulated length-frequency data with known parameter values, we demonstrate that recruitment decline due to overexploitation leads to overestimation of EPR values. Considering the low levels of EPR estimated for the study species, recruitment limitation is not impossible and true EPR values may be even lower than our estimates. It is, therefore, likely that management action, such as the creation of Marine Protected Areas, is needed to reconstitute the west coast dusky kob population. PMID:26829489
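The per-recruit logic behind an EPR estimate can be illustrated with a minimal age-structured sketch. This is a generic textbook-style calculation under invented parameter values, not the paper's length- and age-structured model fitted to length-frequency data.

```python
import numpy as np

def eggs_per_recruit(F, M, selectivity, maturity, fecundity):
    """EPR for fishing mortality F: survivorship to each age, times the
    fraction mature, times fecundity at age, summed over ages."""
    Z = M + F * selectivity                          # total mortality at age
    # survivorship: 1 at the first age, then decaying by accumulated Z
    survive = np.concatenate([[1.0], np.exp(-np.cumsum(Z[:-1]))])
    return np.sum(survive * maturity * fecundity)

ages = np.arange(10)
M = np.full(10, 0.2)                     # natural mortality (invented)
selectivity = (ages >= 3).astype(float)  # fish enter the fishery at age 3
maturity = (ages >= 4).astype(float)     # mature from age 4
fecundity = 1000.0 * ages                # eggs increase with age/size

epr_unfished = eggs_per_recruit(0.0, M, selectivity, maturity, fecundity)
epr_fished = eggs_per_recruit(0.6, M, selectivity, maturity, fecundity)
ratio = epr_fished / epr_unfished  # the "% of unfished EPR" reported above
```

Because fecundity rises with age, truncating the size/age structure through fishing removes the most fecund individuals first, which is why low EPR ratios such as the 5%-10% reported for dusky kob signal a persistence concern.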
The Effect of Job Performance Aids on Quality Assurance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fosshage, Erik
Job performance aids (JPAs) have been studied for many decades in a variety of disciplines and for many different types of tasks, yet this is the first known research experiment using JPAs in a quality assurance (QA) context. The objective of this thesis was to assess whether a JPA has an effect on the performance of a QA observer performing the concurrent dual verification technique for a basic assembly task. The JPA used in this study was a simple checklist, and the design borrows heavily from prior research on task analysis and other human factors principles. The assembly task and QA construct of concurrent dual verification are consistent with those of a high-consequence manufacturing environment. Results showed that the JPA had only a limited effect on QA performance in the context of this experiment. However, there were three important and unexpected findings that may draw interest from a variety of practitioners. First, a novel testing methodology sensitive enough to measure the effects of a JPA on performance was created. Second, the discovery that there are different probabilities of detection for different types of error in a QA context may be the most far-reaching result. Third, these results highlight the limitations of concurrent dual verification as a control against defects. It is hoped that both the methodology and results of this study provide an effective baseline from which to launch future research activities.
The Effects of Breakfast and Breakfast Composition on Cognition in Adults
Spitznagel, Mary Beth
2016-01-01
Extensive literature has addressed the acute cognitive effects of breaking a fast. Recent reviews in this line of work have synthesized available research on the cognitive consequences of fasting compared with nutrient intake and the cognitive effects of macronutrient consumption. These largely have been inconclusive, possibly in part because of selection criteria limiting the scope of studies covered. The purpose of the current review is to integrate the results of the literature examining the cognitive effects of breakfast and breakfast composition in adults with the use of a flexible definition of breakfast, specifically, any caloric intake after a fasting period of ≥8 h. This review includes 38 studies that examine the acute cognitive impact of breakfast and 16 studies that examine the effects of breakfast composition. Results suggest that healthy adults show a small but robust advantage for memory (particularly delayed recall) from consuming breakfast. Largely equivocal results emerge for attention and motor and executive function; there were no effects from breakfast on language. Regarding breakfast composition, a smaller number of studies and widely disparate methodology addressing this question preclude definitive conclusions about its effects on cognition. A subset of this literature examines these questions in the context of glucoregulation; the findings emphasize the importance of considering differences in glucoregulation in research designs, even among healthy cohorts. The limitations of this literature include methodologic differences, such as the use of different tests to measure cognitive constructs, as well as the effects of timing in test administration. PMID:27184286
Microparticle Analysis in Disorders of Hemostasis and Thrombosis
Mooberry, Micah J.; Key, Nigel S.
2015-01-01
Microparticles (MPs) are submicron vesicles released from the plasma membrane of eukaryotic cells in response to activation or apoptosis. MPs are known to be involved in numerous biologic processes, including inflammation, the immune response, cancer metastasis, and angiogenesis. Their earliest recognized and most widely accepted role, however, is the ability to promote and support the process of blood coagulation. Consequently, there is ongoing interest in studying MPs in disorders of hemostasis and thrombosis. Both phosphatidylserine (PS) exposure and the presence of tissue factor (TF) in the MP membrane may account for their procoagulant properties, and elevated numbers of MPs in plasma have been reported in numerous prothrombotic conditions. To date, however, there are few data on true causality linking MPs to the genesis of thrombosis. A variety of methodologies have been employed to characterize and quantify MPs, although detection is challenging due to their submicron size. Flow cytometry (FCM) remains the most frequently utilized strategy for MP detection; however, it is associated with significant technological limitations. Additionally, pre-analytical and analytical variables can influence the detection of MPs by FCM, rendering data interpretation difficult. Lack of methodologic standardization in MP analysis by FCM confounds the issue further, although efforts are currently underway to address this limitation. Moving forward, it will be important to address these technical challenges as a scientific community if we are to better understand the role that MPs play in disorders of hemostasis and thrombosis. PMID:25704723
Subgroup analyses in confirmatory clinical trials: time to be specific about their purposes.
Tanniou, Julien; van der Tweel, Ingeborg; Teerenstra, Steven; Roes, Kit C B
2016-02-18
It is well recognized that treatment effects may not be homogeneous across the study population. Subgroup analyses constitute a fundamental step in the assessment of evidence from confirmatory (Phase III) clinical trials, where conclusions for the overall study population might not hold. Subgroup analyses can have different and distinct purposes, requiring specific design and analysis solutions. It is relevant to evaluate methodological developments in subgroup analyses against these purposes to guide health care professionals and regulators, as well as to identify gaps in current methodology. We defined four purposes for subgroup analyses: (1) Investigate the consistency of treatment effects across subgroups of clinical importance, (2) Explore the treatment effect across different subgroups within an overall non-significant trial, (3) Evaluate safety profiles limited to one or a few subgroup(s), (4) Establish efficacy in the targeted subgroup when included in a confirmatory testing strategy of a single trial. We reviewed the methodology in line with this "purpose-based" framework. The review covered papers published between January 2005 and April 2015 and aimed to classify them into none, one, or more of the aforementioned purposes. In total, 1857 potentially eligible papers were identified. Forty-eight papers were selected and 20 additional relevant papers were identified from their references, leading to 68 papers in total. Nineteen were dedicated to purpose 1, 16 to purpose 4, one to purpose 2, and none to purpose 3. Seven papers were dedicated to more than one purpose; the remaining 25 could not be classified unambiguously. Purposes of the methods were often not specifically indicated, methods for subgroup analysis for safety purposes were almost absent, and a multitude of diverse methods were developed for purpose 1.
It is important that researchers developing methodology for subgroup analysis explicitly clarify the objectives of their methods in terms that can be understood from a patient's, health care provider's and/or regulator's perspective. A clear operational definition for consistency of treatment effects across subgroups is lacking, but is needed to improve the usability of subgroup analyses in this setting. Finally, methods to particularly explore benefit-risk systematically across subgroups need more research.
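For purpose 1, consistency is commonly probed with a treatment-by-subgroup interaction test. A minimal two-subgroup sketch follows; all numbers are hypothetical, and a real analysis would test the interaction term within the fitted model rather than compare published estimates:

```python
# Minimal two-subgroup interaction (consistency) test: compares treatment
# effect estimates b1, b2 (e.g. log hazard ratios) with standard errors.
import math
from statistics import NormalDist

def interaction_z(b1, se1, b2, se2):
    """z-statistic and two-sided p-value for the null hypothesis b1 - b2 = 0."""
    z = (b1 - b2) / math.sqrt(se1**2 + se2**2)
    p = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p

# Hypothetical subgroup estimates: a larger benefit in subgroup 1 than 2.
z, p = interaction_z(b1=-0.40, se1=0.15, b2=-0.10, se2=0.18)
print(f"z = {z:.2f}, p = {p:.3f}")
```

A non-significant interaction is weak evidence of consistency on its own, since such tests are usually underpowered; this is one reason the review calls for a clear operational definition of consistency.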
Assembly line performance and modeling
NASA Astrophysics Data System (ADS)
Rane, Arun B.; Sunnapwar, Vivek K.
2017-09-01
The automobile sector forms the backbone of the manufacturing sector. The vehicle assembly line is an important section of an automobile plant where repetitive tasks are performed one after another at different workstations. In this work, a methodology is proposed to reduce cycle time and time losses due to important factors such as equipment failure, shortage of inventory, absenteeism, set-up, material handling, rejection, and fatigue, to improve output within given cost constraints. Various relationships between these factors, the corresponding costs, and output are established by a scientific approach. The methodology is validated in three different vehicle assembly plants. The proposed methodology may help practitioners optimize the assembly line using lean techniques.
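The abstract does not give the authors' model, but loss factors of the kind listed (equipment failure, set-up, material handling, rejection, fatigue) are often related to line output through an OEE-style calculation. A sketch under that assumption, with all numbers hypothetical:

```python
# OEE-style sketch of assembly-line output losses. This is not the paper's
# model; it is a common way to relate such loss factors to line output.
shift_minutes = 480
cycle_time_min = 1.2          # minutes per vehicle at a bottleneck station

availability = 0.93           # equipment failure, set-up, inventory shortage
performance = 0.95            # fatigue, material-handling slowdowns
quality = 0.98                # rejection and rework

oee = availability * performance * quality
output = (shift_minutes / cycle_time_min) * oee
print(f"OEE = {oee:.1%}, expected good units per shift ≈ {output:.0f}")
```

Because the factors multiply, even modest individual losses compound: three losses of a few percent each remove roughly 13% of theoretical output in this example.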
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeff Sanders
2006-09-01
Development and attestation of gamma-ray non-destructive assay measurement methodologies for use by inspectors of the Russian Federal Service for Environmental, Technological, and Nuclear Oversight (Rostekhnadzor, formerly Gosatomnadzor or GAN), as well as for use by Russian nuclear facilities, has been completed. Specifically, a methodology utilizing the gamma-ray multi-group analysis (MGA) method for determining plutonium isotopic composition has been developed, while existing methodologies for determining uranium enrichment and isotopic composition have been revised to make them more appropriate to the material types and conditions present in nuclear facilities in the Russian Federation. This paper discusses the development and revision of these methodologies, the metrological characteristics of the final methodologies, as well as the limitations and concerns specific to the utilization of these analysis methods in the Russian Federation.
[Radiotherapy phase I trials' methodology: Features].
Rivoirard, R; Vallard, A; Langrand-Escure, J; Guy, J-B; Ben Mrad, M; Yaoxiong, X; Diao, P; Méry, B; Pigne, G; Rancoule, C; Magné, N
2016-12-01
In clinical research, biostatistical methods allow the rigorous analysis of data collection and should be defined from the trial design stage to obtain the appropriate experimental approach. Thus, if the main purpose of phase I is to determine the dose to use during phase II, the methodology should be finely adjusted to the experimental treatment(s). Today, the methodology for chemotherapy and targeted therapy is well known. For radiotherapy and chemoradiotherapy phase I trials, the primary endpoint must reflect both effectiveness and potential treatment toxicities, and the methodology will likely need to be more complex to limit failures in the subsequent phases. However, there are very few data about methodology design in the literature. The present study focuses on these particular trials and their characteristics. It should help to identify shortcomings in existing methodological patterns in order to propose new and better-suited designs. Copyright © 2016 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.
Rogers, J L; Stoms, G B; Phifer, J L
1989-01-01
A systematic "roadmap" through the medical literature that empirically examines the incidence of psychological sequelae of induced abortion is presented. Because outcome incidence rates and methodological profiles vary substantially across studies, selective use of articles from this literature without an accompanying rationale for that selectivity could foster erroneous conclusions. Information compiled here can facilitate a rapid methodological critique of citations in abortion-related materials. Investigations published in English between January 1966 and April 1988 that quantitatively examined psychological sequelae using prospective, retrospective, or comparative methodologies are summarized in tables to produce a synopsis of the demographics, methodological limitations, and gross statistical features of each article. This quantitative guide is designed to facilitate appropriate use of the current literature, provide needed background to assess positions arising from the currently available data, and provide methodological focus for planning better studies in the future.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-03
... Methodology for Boiling Water Reactors, June 2011. To support use of Topical Report ANP-10307PA, Revision 0... the NRC's E-Filing system does not support unlisted software, and the NRC Meta System Help Desk will... Water Reactors with AREVA Topical Report ANP- 10307PA, Revision 0, ``AREVA MCPR Safety Limit Methodology...
A Test Method for Monitoring Modulus Changes during Durability Tests on Building Joint Sealants
Christopher C. White; Donald L. Hunston; Kar Tean Tan; Gregory T. Schueneman
2012-01-01
The durability of building joint sealants is generally assessed using a descriptive methodology involving visual inspection of exposed specimens for defects. It is widely known that this methodology has inherent limitations, including that the results are qualitative. A new test method is proposed that provides more fundamental and quantitative information about...
Staff Beliefs about Why People with Learning Disabilities Self-Harm: A Q-Methodology Study
ERIC Educational Resources Information Center
Dick, Katie; Gleeson, Kate; Johnstone, Lucy; Weston, Clive
2011-01-01
Staff beliefs about self-harm can influence staff responses to the behaviour. Existing research into staff beliefs about self-harm by people with learning disabilities is limited, with qualitative research restricted to forensic services. The aim of this study was to use Q-methodology to explore staff beliefs about why people with learning…
ERIC Educational Resources Information Center
Roman, Elliott M.
The Alternative Learning Methodologies through Academics Project (Project ALMA) was an Elementary and Secondary Education Act Title VII-funded project in its fourth year of operation in two high schools in Queens and the Bronx (New York). The program served 436 Spanish-speaking students, most of whom were of limited English proficiency.…
Education in Management of Data Created by New Technologies for Rapid Product Development in SMEs
ERIC Educational Resources Information Center
Shaw, A.; Aitchison, D.
2003-01-01
This paper presents outcomes from a research programme aimed at developing new tools and methodologies to assist small and medium-sized enterprises (SMEs) in rapid product development (RPD). The authors suggest that current education strategies for the teaching of RPD tools and methodologies may be of limited value unless those strategies also…
What Does Global Migration Network Say about Recent Changes in the World System Structure?
ERIC Educational Resources Information Center
Zinkina, Julia; Korotayev, Andrey
2014-01-01
Purpose: The aim of this paper is to investigate whether the structure of the international migration system has remained stable through the recent turbulent changes in the world system. Design/methodology/approach: The methodology draws on the social network analysis framework--but with some noteworthy limitations stipulated by the specifics of…
ERIC Educational Resources Information Center
Bachore, Zelalem
2012-01-01
Ontology not only is considered to be the backbone of the semantic web but also plays a significant role in distributed and heterogeneous information systems. However, ontology still faces limited application and adoption to date. One of the major problems is that prevailing engineering-oriented methodologies for building ontologies do not…
ERIC Educational Resources Information Center
Pillay, Hitendra; Kelly, Kathy; Tones, Megan
2010-01-01
Purpose: The purpose of this paper is to identify the transitional employment (TE) aspirations and training and development needs of older and younger workers at risk of early retirement due to limited education and/or employment in blue-collar (BC) occupations. Design/methodology/approach: A computer-based methodology is used to evaluate the…
Trends in Methodological Rigor in Intervention Research Published in School Psychology Journals
ERIC Educational Resources Information Center
Burns, Matthew K.; Klingbeil, David A.; Ysseldyke, James E.; Petersen-Brown, Shawna
2012-01-01
Methodological rigor in intervention research is important for documenting evidence-based practices and has been a recent focus in legislation, including the No Child Left Behind Act. The current study examined the methodological rigor of intervention research in four school psychology journals since the 1960s. Intervention research has increased…
Methodology or method? A critical review of qualitative case study reports.
Hyett, Nerida; Kenny, Amanda; Dickson-Swift, Virginia
2014-01-01
Despite on-going debate about credibility, and reported limitations in comparison to other approaches, case study is an increasingly popular approach among qualitative researchers. We critically analysed the methodological descriptions of published case studies. Three high-impact qualitative methods journals were searched to locate case studies published in the past 5 years; 34 were selected for analysis. Articles were categorized as health and health services (n=12), social sciences and anthropology (n=7), or methods (n=15) case studies. The articles were reviewed using an adapted version of established criteria to determine whether adequate methodological justification was present, and if study aims, methods, and reported findings were consistent with a qualitative case study approach. Findings were grouped into five themes outlining key methodological issues: case study methodology or method, case of something particular and case selection, contextually bound case study, researcher and case interactions and triangulation, and study design inconsistent with methodology reported. Improved reporting of case studies by qualitative researchers will advance the methodology for the benefit of researchers and practitioners.
Amy L. Sheaffer; Jay Beaman; Joseph T. O'Leary; Rebecca L. Williams; Doran M. Mason
2001-01-01
Sampling for research in recreation settings is an ongoing challenge. Often, certain groups of users are more likely to be sampled than others. In measuring public support for resource conservation and in understanding the use of natural resources for recreation, it is important to evaluate issues of bias in survey methodologies. Important methodological issues emerged from a statewide...
Current Challenges in Health Economic Modeling of Cancer Therapies: A Research Inquiry
Miller, Jeffrey D.; Foley, Kathleen A.; Russell, Mason W.
2014-01-01
Background: The demand for economic models that evaluate cancer treatments is increasing, as healthcare decision makers struggle for ways to manage their budgets while providing the best care possible to patients with cancer. Yet, after nearly 2 decades of cultivating and refining techniques for modeling the cost-effectiveness and budget impact of cancer therapies, serious methodologic and policy challenges have emerged that question the adequacy of economic modeling as a sound decision-making tool in oncology. Objectives: We sought to explore some of the contentious issues associated with the development and use of oncology economic models as informative tools in current healthcare decision-making. Our objective was to draw attention to these complex pharmacoeconomic concerns and to promote discussion within the oncology and health economics research communities. Methods: Using our combined expertise in health economics research and economic modeling, we structured our inquiry around the following 4 questions: (1) Are economic models adequately addressing questions relevant to oncology decision makers; (2) What are the methodologic limitations of oncology economic models; (3) What guidelines are followed for developing oncology economic models; and (4) Is the evolution of oncology economic modeling keeping pace with treatment innovation? Within the context of each of these questions, we discuss issues related to the technical limitations of oncology modeling, the availability of adequate data for developing models, and the problems with how modeling analyses and results are presented and interpreted. Discussion: There is general acceptance that economic models are good, essential tools for decision-making, but the practice of oncology and its rapidly evolving technologies present unique challenges that make assessing and demonstrating value especially complex. There is wide latitude for improvement in oncology modeling methodologies and how model results are presented and interpreted. Conclusion: Complex technical and data availability issues with oncology economic modeling pose serious concerns that need to be addressed. It is our hope that this article will provide a framework to guide future discourse on this important topic. PMID:24991399
Current challenges in health economic modeling of cancer therapies: a research inquiry.
Miller, Jeffrey D; Foley, Kathleen A; Russell, Mason W
2014-05-01
The demand for economic models that evaluate cancer treatments is increasing, as healthcare decision makers struggle for ways to manage their budgets while providing the best care possible to patients with cancer. Yet, after nearly 2 decades of cultivating and refining techniques for modeling the cost-effectiveness and budget impact of cancer therapies, serious methodologic and policy challenges have emerged that question the adequacy of economic modeling as a sound decision-making tool in oncology. We sought to explore some of the contentious issues associated with the development and use of oncology economic models as informative tools in current healthcare decision-making. Our objective was to draw attention to these complex pharmacoeconomic concerns and to promote discussion within the oncology and health economics research communities. Using our combined expertise in health economics research and economic modeling, we structured our inquiry around the following 4 questions: (1) Are economic models adequately addressing questions relevant to oncology decision makers; (2) What are the methodologic limitations of oncology economic models; (3) What guidelines are followed for developing oncology economic models; and (4) Is the evolution of oncology economic modeling keeping pace with treatment innovation? Within the context of each of these questions, we discuss issues related to the technical limitations of oncology modeling, the availability of adequate data for developing models, and the problems with how modeling analyses and results are presented and interpreted. There is general acceptance that economic models are good, essential tools for decision-making, but the practice of oncology and its rapidly evolving technologies present unique challenges that make assessing and demonstrating value especially complex. There is wide latitude for improvement in oncology modeling methodologies and how model results are presented and interpreted. 
Complex technical and data availability issues with oncology economic modeling pose serious concerns that need to be addressed. It is our hope that this article will provide a framework to guide future discourse on this important topic.
Vaillant, Michel; Lang, Trudie A.; Guérin, Philippe J.; Olliaro, Piero L.
2016-01-01
Background: Schistosomiasis control mainly relies on preventive chemotherapy with praziquantel (PZQ) distributed through mass drug administration. With a target of 260 million treatments yearly, reliably assessing and monitoring efficacy is all-important. Recommendations for treatment and control of schistosomiasis are supported by systematic reviews and meta-analyses of aggregated data, which however also point to limitations due to heterogeneity in trial design, analyses and reporting. Some such limitations could be corrected through access to individual participant-level data (IPD), which facilitates standardised analyses. Methodology: A systematic literature review was conducted to identify antischistosomal drug efficacy studies performed since 2000; including electronic searches of the Cochrane Infectious Diseases Group specialised register and the Cochrane Library, PubMed, CENTRAL and Embase; complemented with a manual search for articles listed in past reviews. Antischistosomal treatment studies with assessment of outcome within 60 days post-treatment were eligible. Meta-data, i.e. study-level characteristics (Schistosoma species, number of patients, drug administered, country, etc.) and efficacy parameters were extracted from published documents to evaluate the scope of an individual-level data sharing platform. Principal findings: Out of 914 documents screened, 90 studies from 26 countries were included, enrolling 20,517 participants infected with Schistosoma spp. and treated with different PZQ regimens or other drugs. Methodologies varied in terms of diagnostic approaches (number of samples and test repeats), time of outcome assessment, and outcome measure (cure rate or egg reduction rate, as an arithmetic or geometric mean), making direct comparison of published data difficult. Conclusions: This review describes the landscape of schistosomiasis clinical research.
The volume of data and the methodological and reporting heterogeneity identified all indicate that there is scope for an individual participant-level database, to allow for standardised analyses. PMID:27347678
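The incomparability of arithmetic- and geometric-mean egg reduction rates (ERRs) is easy to demonstrate with a toy example. The egg counts below are invented, and the log(x+1) convention used for the geometric mean is one common choice for handling zero counts, not necessarily that of any reviewed study:

```python
# Toy illustration of why arithmetic- vs geometric-mean egg reduction
# rates (ERR) are not directly comparable across studies.
import math

def err(pre, post, geometric=False):
    """Egg reduction rate: 1 - mean(post) / mean(pre)."""
    if geometric:
        # geometric mean of (count + 1), minus 1, to accommodate zeros
        mean = lambda xs: math.exp(sum(math.log(x + 1) for x in xs) / len(xs)) - 1
    else:
        mean = lambda xs: sum(xs) / len(xs)
    return 1 - mean(post) / mean(pre)

pre = [100, 400, 2500, 80, 920]     # invented pre-treatment egg counts
post = [0, 10, 600, 0, 40]          # invented post-treatment egg counts
print(f"arithmetic ERR: {err(pre, post):.2%}")
print(f"geometric  ERR: {err(pre, post, geometric=True):.2%}")
```

With skewed counts the two measures diverge by many percentage points, which is why pooling ERRs across studies requires knowing which mean was used; standardised IPD analyses avoid the problem entirely.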
Improved Conceptual Models Methodology (ICoMM) for Validation of Non-Observable Systems
2015-12-01
Sok, Sang M. Dissertation, December 2015. Approved for public release; distribution is unlimited. The improved conceptual model methodology (ICoMM) is developed in support of improving the structure of the conceptual model (CoM) for both face and
Rational decision-making in mental health: the role of systematic reviews.
Gilbody, Simon M.; Petticrew, Mark
1999-09-01
BACKGROUND: "Systematic reviews" have come to be recognized as the most rigorous method of summarizing confusing and often contradictory primary research in a transparent and reproducible manner. Their greatest impact has been in the summarization of epidemiological literature - particularly that relating to clinical effectiveness. Systematic reviews also have a potential to inform rational decision-making in healthcare policy and to form a component of economic evaluation. AIMS OF THE STUDY: This article aims to introduce the rationale behind systematic reviews and, using examples from mental health, to introduce the strengths and limitations of systematic reviews, particularly in informing mental health policy and economic evaluation. METHODS: Examples are selected from recent controversies surrounding the introduction of new psychiatric drugs (anti-depressants and anti-schizophrenia drugs) and methods of delivering psychiatric care in the community (case management and assertive community treatment). The potential for systematic reviews to (i) produce best estimates of clinical efficacy and effectiveness, (ii) aid economic evaluation and policy decision-making and (iii) highlight gaps in the primary research knowledge base are discussed. Lastly examples are selected from outside mental health to show how systematic reviews have a potential to be explicitly used in economic and health policy evaluation. RESULTS: Systematic reviews produce the best estimates of clinical efficacy, which can form an important component of economic evaluation. Importantly, serious methodological flaws and areas of uncertainty in the primary research literature are identified within an explicit framework. Summary indices of clinical effectiveness can be produced, but it is difficult to produce such summary indices of cost effectiveness by pooling economic data from primary studies. Modelling is commonly used in economic and policy evaluation. 
Here, systematic reviews can provide the best estimates of effectiveness and, importantly, highlight areas of uncertainty that can be used in "sensitivity analysis". DISCUSSION: Systematic reviews are an important recent methodological advance, the potential for which has only begun to be realized in mental health. This use of systematic reviews is probably most advanced in producing critical summaries of clinical effectiveness data. Systematic reviews cannot produce valid and believable conclusions when the primary research literature is of poor quality. An important function of systematic reviews will be in highlighting this poor quality research which is of little use in mental health decision making. IMPLICATIONS FOR HEALTH PROVISION: Health care provision should be both clinically and cost effective. Systematic reviews are a key component in ensuring that this goal is achieved. IMPLICATIONS FOR HEALTH POLICIES: Systematic reviews have potential to inform health policy. Examples presented show that health policy is often made without due consideration of the research evidence. Systematic reviews can provide robust and believable answers, which can help inform rational decision-making. Importantly, systematic reviews can highlight the need for important primary research and can inform the design of this research such that it provides answers that will help in forming healthcare policy. IMPLICATIONS FOR FURTHER RESEARCH: Systematic reviews should precede costly (and often unnecessary) primary research. Many areas of health policy and practice have yet to be evaluated using systematic review methodology. Methods for the summarization of economic data are methodologically complex and deserve further research.
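The core arithmetic behind many systematic-review summary estimates of clinical effectiveness is fixed-effect inverse-variance pooling. A minimal sketch with invented trial effects (real reviews would also assess heterogeneity and often prefer random-effects models):

```python
# Minimal fixed-effect inverse-variance pooling of trial effect estimates.
# The effect sizes and standard errors below are invented for illustration.
import math

def pool_fixed(effects, ses):
    """Return the pooled effect and its standard error."""
    weights = [1 / se**2 for se in ses]                 # precision weights
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    return pooled, pooled_se

effects = [-0.30, -0.15, -0.45]     # e.g. log odds ratios from three trials
ses = [0.10, 0.20, 0.25]
est, se = pool_fixed(effects, ses)
print(f"pooled effect {est:.3f} ± {1.96 * se:.3f}")
```

The pooled standard error is smaller than that of any single trial, which is precisely why summary indices of clinical effectiveness from systematic reviews are more precise than individual studies; as the abstract notes, no analogous simple pooling exists for cost-effectiveness data.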
Anttila, Heidi; Autti-Rämö, Ilona; Suoranta, Jutta; Mäkelä, Marjukka; Malmivaara, Antti
2008-01-01
Background: To assess the effectiveness of physical therapy (PT) interventions on functioning in children with cerebral palsy (CP). Methods: A search was made in Medline, Cinahl, PEDro and the Cochrane library for the period 1990 to February 2007. Only randomized controlled trials (RCTs) on PT interventions in children with diagnosed CP were included. Two reviewers independently assessed the methodological quality and extracted the data. The outcomes measured in the trials were classified using the International Classification of Functioning, Disability and Health (ICF). Results: Twenty-two trials were identified. Eight intervention categories were distinguished. Four trials were of high methodological quality. Moderate evidence of effectiveness was established for two intervention categories: effectiveness of upper extremity treatments on attained goals and active supination, and of prehensile hand treatment and neurodevelopmental therapy (NDT) or NDT twice a week on developmental status, and of constraint-induced therapy on amount and quality of hand use. Moderate evidence of ineffectiveness was found of strength training on walking speed and stride length. Conflicting evidence was found for strength training on gross motor function. For the other intervention categories the evidence was limited due to low methodological quality and the statistically insignificant results of the studies. Conclusion: Due to limitations in methodological quality and variations in population, interventions and outcomes, mostly limited evidence on the effectiveness of most PT interventions is available through RCTs. Moderate evidence was found for some effectiveness of upper extremity training. Well-designed trials are needed especially for focused PT interventions. PMID:18435840
NASA Astrophysics Data System (ADS)
Konovodov, V. V.; Valentov, A. V.; Kukhar, I. S.; Retyunskiy, O. Yu; Baraksanov, A. S.
2016-08-01
This work proposes an algorithm for calculating strength under alternating stresses using a developed methodology for constructing the diagram of limiting stresses. The overall safety factor is defined by the suggested formula. Strength calculations of components working under alternating stresses are, in the great majority of cases, conducted as checking calculations. This is primarily explained by the fact that the overall fatigue strength reduction factor (Kσg or Kτg) can only be chosen approximately during component design, as the engineer at this stage of the work has only an approximate idea of the component's size and shape.
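The abstract does not reproduce the suggested formula, but standard textbook treatments compute a safety factor per stress component (Soderberg-type, reduced by the fatigue strength reduction factor) and combine bending and torsion via the Gough-Pollard rule, n = nσ·nτ / √(nσ² + nτ²). A sketch under that assumption, with all stress values hypothetical (MPa):

```python
# Standard textbook fatigue safety-factor relations, not the authors' formula.
# All material properties and stresses below are hypothetical (MPa).
import math

def partial_factor(limit, K, amp, mean, psi):
    """Per-component safety factor: n = s_-1 / (K*s_a + psi*s_m)."""
    return limit / (K * amp + psi * mean)

n_sigma = partial_factor(limit=250, K=1.8, amp=60, mean=40, psi=0.2)  # bending
n_tau = partial_factor(limit=150, K=1.6, amp=30, mean=20, psi=0.1)    # torsion

# Gough-Pollard combination for simultaneous bending and torsion:
n = n_sigma * n_tau / math.sqrt(n_sigma**2 + n_tau**2)
print(f"n_sigma = {n_sigma:.2f}, n_tau = {n_tau:.2f}, overall n = {n:.2f}")
```

Note that the combined factor is always smaller than either component factor, which is why checking calculations against the overall factor, rather than each component alone, matter.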
Helou, A; Ollenschläger, G
1998-06-01
Recently, a German appraisal instrument for clinical guidelines was published that can be used by various parties in the formal evaluation of guidelines. A user's guide to the appraisal instrument was designed that contains a detailed explanation of each question to ensure that the instrument is interpreted consistently. This paper describes the purposes, format and contents of the user's guide, and reviews the key factors influencing the validity of guidelines. Taking international experience into account, the purposes, opportunities and methodological limitations of a prospective assessment of clinical practice guidelines are discussed.
[Use of the Six Sigma methodology for the preparation of parenteral nutrition mixtures].
Silgado Bernal, M F; Basto Benítez, I; Ramírez García, G
2014-04-01
To use the tools of the Six Sigma methodology for statistical control in the preparation of parenteral nutrition mixtures at the critical checkpoint of specific density. Between August 2010 and September 2013, specific density analysis was performed on 100% of the samples, and the data were divided into two groups, adults and neonates. The percentage of acceptance, the trend graphs, and the sigma level were determined. A normality analysis was carried out using the Shapiro-Wilk test, and the total percentage of mixtures within the specification limits was calculated. The specific density data between August 2010 and September 2013 are consistent with normality (W = 0.94) and show improvement in sigma level over time, reaching 6/6 in adults and 3.8/6 in neonates. 100% of the mixtures comply with the specification limits for adults and neonates, always within the control limits during the process. The improvement plans, together with the Six Sigma methodology, allow the process to be controlled and ensure agreement between the medical prescription and the content of the mixture. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
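The abstract reports sigma levels of 6/6 and 3.8/6 without giving the computation. A minimal sketch of how a sigma level can be derived from the fraction of mixtures inside the specification limits, assuming the conventional 1.5-sigma long-term shift (the authors' exact procedure is not stated):

```python
from statistics import NormalDist

def sigma_level(values, lsl, usl, shift=1.5):
    """Process sigma level from the fraction of samples inside the
    specification limits [lsl, usl], using the conventional 1.5-sigma
    long-term shift; capped at 6 when every sample conforms."""
    yield_frac = sum(lsl <= v <= usl for v in values) / len(values)
    if yield_frac >= 1.0:
        return 6.0  # all samples within specification
    return min(6.0, NormalDist().inv_cdf(yield_frac) + shift)
```

For example, a batch with a 99% conforming fraction corresponds to roughly sigma level 3.8, close to the neonate figure reported above.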
Yusuf, Afiqah; Elsabbagh, Mayada
2015-12-15
Identifying biomarkers for autism can improve outcomes for those affected by autism. Engaging the diverse stakeholders in the research process using community-based participatory research (CBPR) can accelerate biomarker discovery into clinical applications. However, there are limited examples of stakeholder involvement in autism research, possibly due to conceptual and practical concerns. We evaluate the applicability of CBPR principles to biomarker discovery in autism and critically review empirical studies adopting these principles. Using a scoping review methodology, we identified and evaluated seven studies using CBPR principles in biomarker discovery. The limited number of studies in biomarker discovery adopting CBPR principles coupled with their methodological limitations suggests that such applications are feasible but challenging. These studies illustrate three CBPR themes: community assessment, setting global priorities, and collaboration in research design. We propose that further research using participatory principles would be useful in accelerating the pace of discovery and the development of clinically meaningful biomarkers. For this goal to be successful we advocate for increased attention to previously identified conceptual and methodological challenges to participatory approaches in health research, including improving scientific rigor and developing long-term partnerships among stakeholders.
Application of tolerance limits to the characterization of image registration performance.
Fedorov, Andriy; Wells, William M; Kikinis, Ron; Tempany, Clare M; Vangel, Mark G
2014-07-01
Deformable image registration is used increasingly in image-guided interventions and other applications. However, validation and characterization of registration performance remain areas that require further study. We propose an analysis methodology for deriving tolerance limits on the initial conditions for deformable registration that reliably lead to a successful registration. This approach results in a concise summary of the probability of registration failure, while accounting for the variability in the test data. The (β, γ) tolerance limit can be interpreted as a value of the input parameter that leads to a successful registration outcome in at least 100β% of cases with 100γ% confidence. The utility of the methodology is illustrated by summarizing the performance of a deformable registration algorithm evaluated in three different experimental setups of increasing complexity. Our examples are based on clinical data collected during MRI-guided prostate biopsy, registered using a publicly available deformable registration tool. The results indicate that the proposed methodology can be used to generate concise graphical summaries of the experiments, as well as a probabilistic estimate of the registration outcome for a future sample. Its use may facilitate improved objective assessment, comparison and retrospective stress-testing of deformable registration.
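The (β, γ) tolerance-limit idea can be illustrated with the standard nonparametric order-statistic construction; this is a generic sketch of how such a limit is chosen from a sample, not necessarily the model the authors used:

```python
from math import comb

def upper_tolerance_index(n, beta, gamma):
    """Smallest order-statistic index k (1-based) such that the k-th
    smallest of n observations is an upper (beta, gamma) tolerance
    limit: it exceeds at least a fraction beta of the population with
    confidence gamma. Returns None if the sample is too small."""
    for k in range(1, n + 1):
        # P(coverage of x_(k) >= beta) = Binomial(n, beta) CDF at k-1
        conf = sum(comb(n, j) * beta**j * (1 - beta)**(n - j)
                   for j in range(k))
        if conf >= gamma:
            return k
    return None
```

This reproduces the classic result that the sample maximum of n = 59 trials is a 95%/95% upper tolerance limit, while n = 58 does not suffice.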
Intelligent systems/software engineering methodology - A process to manage cost and risk
NASA Technical Reports Server (NTRS)
Friedlander, Carl; Lehrer, Nancy
1991-01-01
A systems development methodology is discussed that has been successfully applied to the construction of a number of intelligent systems. This methodology is a refinement of both evolutionary and spiral development methodologies. It is appropriate for development of intelligent systems. The application of advanced engineering methodology to the development of software products and intelligent systems is an important step toward supporting the transition of AI technology into aerospace applications. A description of the methodology and the process model from which it derives is given. Associated documents and tools are described which are used to manage the development process and record and report the emerging design.
Möhler, Christian; Wohlfahrt, Patrick; Richter, Christian; Greilich, Steffen
2017-06-01
Electron density is the most important tissue property influencing photon and ion dose distributions in radiotherapy patients. Dual-energy computed tomography (DECT) enables the determination of electron density by combining the information on photon attenuation obtained at two different effective x-ray energy spectra. Most algorithms suggested so far use the CT numbers provided after image reconstruction as input parameters, i.e., are image-based. To explore the accuracy that can be achieved with these approaches, we quantify the intrinsic methodological and calibration uncertainty of the seemingly simplest approach. In the studied approach, electron density is calculated with a one-parametric linear superposition ('alpha blending') of the two DECT images, which is shown to be equivalent to an affine relation between the photon attenuation cross sections of the two x-ray energy spectra. We propose to use the latter relation for empirical calibration of the spectrum-dependent blending parameter. For a conclusive assessment of the electron density uncertainty, we chose to isolate the purely methodological uncertainty component from CT-related effects such as noise and beam hardening. Analyzing calculated spectrally weighted attenuation coefficients, we find universal applicability of the investigated approach to arbitrary mixtures of human tissue with an upper limit of the methodological uncertainty component of 0.2%, excluding high-Z elements such as iodine. The proposed calibration procedure is bias-free and straightforward to perform using standard equipment. Testing the calibration on five published data sets, we obtain very small differences in the calibration result in spite of different experimental setups and CT protocols used. Employing a general calibration per scanner type and voltage combination is thus conceivable.
Given the high suitability for clinical application of the alpha-blending approach in combination with a very small methodological uncertainty, we conclude that further refinement of image-based DECT-algorithms for electron density assessment is not advisable. © 2017 American Association of Physicists in Medicine.
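The one-parametric linear superposition ('alpha blending') described above can be sketched as follows. The conversion of CT numbers to reduced attenuation (u = HU/1000 + 1, so water = 1) and the single calibrated weight alpha follow the general form of such approaches; the authors' exact parameterization may differ:

```python
def electron_density(hu_low, hu_high, alpha):
    """Relative electron density (water = 1) from a one-parameter
    linear superposition of low- and high-kV DECT images.
    alpha is the spectrum-dependent blending weight, obtained by
    empirical calibration; illustrative sketch of the general form."""
    u_low = hu_low / 1000.0 + 1.0    # reduced attenuation, low-kV image
    u_high = hu_high / 1000.0 + 1.0  # reduced attenuation, high-kV image
    return alpha * u_low + (1.0 - alpha) * u_high
```

By construction, water (0 HU in both images) maps to a relative electron density of 1 for any alpha, which is one reason the calibration is bias-free for water-like tissue.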
Lobato, Ramiro D; Lagares, Alfonso; Villena, Victoria; García Seoane, Jorge; Jiménez-Roldán, Luis; Munarriz, Pablo M; Castaño-Leon, Ana M; Alén, José F
2015-01-01
The design of an appropriate method for the selection of medical graduates for residency posts is extremely important, not only for the efficiency of the method itself (accurate identification of most competent candidates), but also for its influence on the study and teaching methodologies operating in medical schools. Currently, there is a great variation in the criteria used in different countries and there is no definitively appropriate method. The use of isolated or combined criteria, such as the marks obtained by students in medical schools, their performance in tests of theoretical knowledge and evaluations of clinical competence, or personal interviews, have a limited value for identifying those candidates who will perform better during the residency and later on during independent practice. To analyse the variability in the methodologies used for the selection of residents employed in different countries, in particular those used in the United Kingdom and USA, where external agencies and medical schools make systematic analyses of curriculum development. The advantages and disadvantages of national or transnational licensing examinations on the process of convergence and harmonization of medical degrees and residency programmes through Europe are discussed. The present analysis is used to design a new and more efficient multi-criteria methodology for resident selection in Spain, which will be published in the next issue of this journal. Since the multi-criteria methods used in UK and USA appear to be most consistent, these have been employed for designing the new methodology that could be applied in Spain. Although many experts in medical education reject national examinations for awarding medical degrees or ranking candidates for residency posts, it seems that, when appropriately designed, they can be used to verify the level of competence of graduating students without necessarily distorting curriculum implementation or improvement. 
Copyright © 2014 Sociedad Española de Neurocirugía. Published by Elsevier España. All rights reserved.
Jin, Ying-Hui; Wang, Guo-Hao; Sun, Yi-Rong; Li, Qi; Zhao, Chen; Li, Ge; Si, Jin-Hua; Li, Yan; Lu, Cui; Shang, Hong-Cai
2016-01-01
Objective To assess the methodology and quality of evidence of systematic reviews and meta-analyses of traditional Chinese medical nursing (TCMN) interventions in Chinese journals. These interventions include acupressure, massage, Tai Chi, Qi Gong, electroacupuncture and the use of Chinese herbal medicines, for example in enemas, foot massage and compressing the umbilicus. Design A systematic literature search for systematic reviews and meta-analyses of TCMN interventions was performed. Review characteristics were extracted. The methodological quality and the quality of the evidence were evaluated using the Assessment of Multiple Systematic Reviews (AMSTAR) and Grading of Recommendations Assessment, Development and Evaluation (GRADE) approaches. Results We included 20 systematic reviews and meta-analyses; a total of 11 TCMN interventions were assessed in the 20 reviews. Compliance with AMSTAR checklist items ranged from 4.5 to 8, and the systematic reviews/meta-analyses were, on average, of medium methodological quality. The quality of the evidence we assessed ranged from very low to moderate; no high-quality evidence was found. The top two causes for rating down confidence in effect estimates among the 31 bodies of evidence assessed were risk of bias and inconsistency. Conclusions There is room for improvement in the methodological quality of systematic reviews/meta-analyses of TCMN interventions published in Chinese journals. Greater efforts should be devoted to ensuring a more comprehensive search strategy, clearer specification of the interventions of interest in the eligibility criteria and identification of meaningful outcomes for clinicians and patients (consumers). The overall quality of evidence among reviews remains suboptimal, which raises concerns about their role in influencing clinical practice. Thus, the conclusions of the reviews we assessed must be treated with caution, and their role in influencing clinical practice should be limited.
A critical appraisal of systematic reviews/meta-analyses of TCMN interventions is particularly important to provide sound guidance for TCMN. PMID:28186925
Beillas, Philippe; Berthet, Fabien
2017-05-29
Human body models have the potential to better describe the human anatomy and variability than dummies. However, data sets available to verify the human response to impact are typically limited in numbers, and they are not size or gender specific. The objective of this study was to investigate the use of model morphing methodologies within that context. In this study, a simple human model scaling methodology was developed to morph two detailed human models (Global Human Body Model Consortium models 50th male, M50, and 5th female, F05) to the dimensions of post mortem human surrogates (PMHS) used in published literature. The methodology was then successfully applied to 52 PMHS tested in 14 impact conditions loading the abdomen. The corresponding 104 simulations were compared to the responses of the PMHS and to the responses of the baseline models without scaling (28 simulations). The responses were analysed using the CORA method and peak values. The results suggest that model scaling leads to an improvement of the predicted force and deflection but has more marginal effects on the predicted abdominal compressions. M50 and F05 models scaled to the same PMHS were also found to have similar external responses, but large differences were found between the two sets of models for the strain energy densities in the liver and the spleen for mid-abdomen impact simulations. These differences, which were attributed to the anatomical differences in the abdomen of the baseline models, highlight the importance of the selection of the impact condition for simulation studies, especially if the organ location is not known in the test. While the methodology could be further improved, it shows the feasibility of using model scaling methodologies to compare human models of different sizes and to evaluate scaling approaches within the context of human model validation.
Cummings, Elizabeth; Turner, Paul
2010-01-01
Building an evidence base for healthcare interventions has long been advocated as both professionally and ethically desirable. By supporting meaningful comparison amongst different approaches, a good evidence base has been viewed as an important element in optimising clinical decision-making and the safety and quality of care. Unsurprisingly, medical research has put considerable effort into supporting the development of this evidence base, and the randomised controlled trial has become the dominant methodology. Recently, however, a body of research has begun to question not just this methodology per se, but also the extent to which the evidence it produces may marginalise individual patient experiences, priorities and perceptions. Simultaneously, the widespread adoption and utilisation of information systems (IS) in health care has also prompted initiatives to develop a stronger base of evidence about their impacts. These calls have been stimulated both by numerous system failures and by research expressing concerns about the limitations of information systems methodologies in health care environments. Alongside the potential of information systems to produce positive, negative and unintended consequences, many measures of success, impact or benefit appear to have little to do with improvements in care, health outcomes or individual patient experiences. Combined, these methodological concerns suggest the need for more detailed examination. This is particularly the case given the prevalence, within contemporary clinical and IS discourses on health interventions, of calls to put the ‘patient at the centre’ by engaging patients in their own care and/or ‘empowering’ them through the use of information systems. 
This paper aims to contribute to these on-going debates by focusing on the socio-technical processes by which patients’ interests and outcomes are measured, defined and evaluated within health interventions that involve them using web-based information systems. The paper outlines an integrated approach that aims to generate evidence about the impact of these types of health interventions that are meaningful at both individual patient and patient cohort levels. PMID:21594007
Hosia, Aino; Falkenhaug, Tone; Baxter, Emily J.; Pagès, Francesc
2017-01-01
The diversity and distribution of gelatinous zooplankton were investigated along the northern Mid-Atlantic Ridge (MAR) from June to August 2004. Here, we present results from macrozooplankton trawl sampling, as well as comparisons made between five different methodologies that were employed during the MAR-ECO survey. In total, 16 species of hydromedusae, 31 species of siphonophores and four species of scyphozoans were identified to species level from macrozooplankton trawl samples. Additional taxa were identified to higher taxonomic levels and a single ctenophore genus was observed. Samples were collected at 17 stations along the MAR between the Azores and Iceland. A divergence in the species assemblages was observed at the southern limit of the Subpolar Frontal Zone. The catch composition of gelatinous zooplankton is compared between different sampling methodologies, including: a macrozooplankton trawl; a Multinet; a ringnet attached to a bottom trawl; and optical platforms (Underwater Video Profiler (UVP) & Remotely Operated Vehicle (ROV)). Different sampling methodologies are shown to exhibit selectivity towards different groups of gelatinous zooplankton. Only ~21% of taxa caught during the survey were caught by both the macrozooplankton trawl and the Multinet when deployed at the same station. The estimates of gelatinous zooplankton abundance calculated using these two gear types also varied widely (1.4 ± 0.9 individuals 1000 m-3 estimated by the macrozooplankton trawl vs. 468.3 ± 315.4 individuals 1000 m-3 estimated by the Multinet; mean ± s.d.) when used at the same stations (n = 6). While it appears that traditional net sampling can generate useful data on pelagic cnidarians, comparisons with results from the optical platforms suggest that ctenophore diversity and abundance are consistently underestimated, particularly when net sampling is conducted in combination with formalin fixation. 
The results emphasise the importance of considering sampling methodology both when planning surveys, as well as when interpreting existing data. PMID:29095891
Esteves, Sandro C; Chan, Peter
2015-09-01
We systematically identified and reviewed the methods and consistency of recommendations of recently developed clinical practice guidelines (CPG) and best practice statements (BPS) on the evaluation of the infertile male. MEDLINE and related engines as well as guidelines' Web sites were searched for CPG and BPS written in English on the general evaluation of male infertility published between January 2008 and April 2015. Four guidelines were identified, all of which reported to have been recently updated. Systematic review was not consistently used in the BPS despite being reported in the CPG. Only one of them reported having a patient representative in its development team. The CPG issued by the European Association of Urology (EAU) graded some recommendations and related that to levels (but not quality) of evidence. Overall, the BPS issued respectively by the American Urological Association and American Society for Reproductive Medicine concurred with each other, but both differed from the EAU guidelines with regard to methods of collection, extraction and interpretation of data. None of the guidelines incorporated health economics. Important specific limitations of conventional semen analysis results were ignored by all guidelines. Besides variation in the methodological quality, implementation strategies were not reported in two out of four guidelines. While the various panels of experts who contributed to the development of the CPG and BPS reviewed should be commended on their tremendous efforts aiming to establish a clinical standard in both the evaluation and management of male infertility, we recognized inconsistencies in the methodology of their synthesis and in the contents of their final recommendations. These discrepancies pose a barrier in the general implementation of these guidelines and may limit their utility in standardizing clinical practice or improving health-related outcomes. 
Continuous efforts are needed to generate high-quality evidence to allow further development of these important guidelines for the evaluation and management of males suffering from infertility.
Boulesteix, Anne-Laure; Wilson, Rory; Hapfelmeier, Alexander
2017-09-09
The goal of medical research is to develop interventions that are in some sense superior, with respect to patient outcome, to interventions currently in use. Similarly, the goal of research in methodological computational statistics is to develop data analysis tools that are themselves superior to the existing tools. The methodology of the evaluation of medical interventions continues to be discussed extensively in the literature and it is now well accepted that medicine should be at least partly "evidence-based". Although we statisticians are convinced of the importance of unbiased, well-thought-out study designs and evidence-based approaches in the context of clinical research, we tend to ignore these principles when designing our own studies for evaluating statistical methods in the context of our methodological research. In this paper, we draw an analogy between clinical trials and real-data-based benchmarking experiments in methodological statistical science, with datasets playing the role of patients and methods playing the role of medical interventions. Through this analogy, we suggest directions for improvement in the design and interpretation of studies which use real data to evaluate statistical methods, in particular with respect to dataset inclusion criteria and the reduction of various forms of bias. More generally, we discuss the concept of "evidence-based" statistical research, its limitations and its impact on the design and interpretation of real-data-based benchmark experiments. We suggest that benchmark studies (the assessment of statistical methods using real-world datasets) might benefit from adopting (some) concepts from evidence-based medicine towards the goal of more evidence-based statistical research.
Biviano, Marilyn B.; Wagner, Lorie A.; Sullivan, Daniel E.
1999-01-01
Materials consumption estimates, such as apparent consumption of raw materials, can be important indicators of sustainability. Apparent consumption of raw materials does not account for material contained in manufactured products that are imported or exported and may thus under- or over-estimate total consumption of materials in the domestic economy. This report demonstrates a methodology to measure the amount of materials contained in net imports (imports minus exports), using lead as an example. The analysis presents illustrations of differences between apparent and total consumption of lead and distributes these differences into individual lead-consuming sectors.
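The adjustment the report describes, correcting apparent consumption for material embedded in traded manufactured goods, reduces to simple accounting. The sketch below uses hypothetical tonnages and omits refinements such as stock changes, which the full USGS methodology would also consider:

```python
def consumption(production, raw_imports, raw_exports,
                embedded_in_imports, embedded_in_exports):
    """Apparent vs. total consumption of a raw material (e.g. lead).
    Apparent consumption ignores material contained in manufactured
    products; total consumption adds the net embedded flow.
    Illustrative sketch with hypothetical quantities."""
    apparent = production + raw_imports - raw_exports
    total = apparent + (embedded_in_imports - embedded_in_exports)
    return apparent, total
```

With hypothetical figures of 100 units produced, 20 imported and 10 exported as raw material, and 15 vs. 5 units embedded in traded products, apparent consumption (110) under-estimates total consumption (120), illustrating the bias the report measures.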
Seaton, Sarah E; Manktelow, Bradley N
2012-07-16
Emphasis is increasingly being placed on the monitoring of clinical outcomes for health care providers. Funnel plots have become an increasingly popular graphical methodology used to identify potential outliers. It is assumed that a provider only displaying expected random variation (i.e. 'in-control') will fall outside a control limit with a known probability. In reality, the discrete count nature of these data, and the differing methods, can lead to true probabilities quite different from the nominal value. This paper investigates the true probability of an 'in control' provider falling outside control limits for the Standardised Mortality Ratio (SMR). The true probabilities of an 'in control' provider falling outside control limits for the SMR were calculated and compared for three commonly used limits: Wald confidence interval; 'exact' confidence interval; probability-based prediction interval. The probability of falling above the upper limit, or below the lower limit, often varied greatly from the nominal value. This was particularly apparent when there were a small number of expected events: for expected events ≤ 50 the median probability of an 'in-control' provider falling above the upper 95% limit was 0.0301 (Wald), 0.0121 ('exact'), 0.0201 (prediction). It is important to understand the properties and probability of being identified as an outlier by each of these different methods to aid the correct identification of poorly performing health care providers. The limits obtained using probability-based prediction limits have the most intuitive interpretation and their properties can be defined a priori. Funnel plot control limits for the SMR should not be based on confidence intervals.
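The discreteness effect described above, where the true tail probability for an 'in-control' provider differs from the nominal value, can be sketched for a probability-based upper prediction limit with Poisson-distributed observed deaths (a simplified stand-in for the paper's SMR setting; the paper also examines Wald and 'exact' limits):

```python
import math

def poisson_sf(k, mu):
    """P(X >= k) for X ~ Poisson(mu), via direct summation of the CDF."""
    cdf, term = 0.0, math.exp(-mu)  # term = P(X = 0)
    for i in range(k):
        cdf += term
        term *= mu / (i + 1)        # advance to P(X = i + 1)
    return 1.0 - cdf

def true_exceedance(expected, nominal=0.025):
    """True probability that an 'in-control' provider (observed deaths
    ~ Poisson(expected)) falls above a probability-based upper
    prediction limit with nominal tail probability `nominal`."""
    o = 0
    while poisson_sf(o, expected) > nominal:
        o += 1                      # smallest count beyond the limit
    return poisson_sf(o, expected)
```

For 10 expected deaths the true exceedance probability is about 0.014 rather than the nominal 0.025, illustrating why, with few expected events, the attained probabilities can differ markedly from the nominal value.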
NASA Technical Reports Server (NTRS)
Myers, Thomas T.; Mcruer, Duane T.
1988-01-01
The development of a comprehensive and eclectic methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages, starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. The FCX expert system as presently developed is only a limited prototype capable of supporting basic lateral-directional FCS design activities related to the design example used. FCX presently supports design of only one FCS architecture (yaw damper plus roll damper) and the rules are largely focused on Class IV (highly maneuverable) aircraft. Despite this limited scope, the major elements which appear necessary for application of knowledge-based software concepts to flight control design were assembled, and thus FCX represents a prototype which can be tested, critiqued and evolved in an ongoing process of development.
2011-09-01
a quality evaluation with limited data; a model-based assessment; factors that affect system performance; a multistage approach to system validation; and a modeling and experimental methodology for efficiently addressing a wide range
ERIC Educational Resources Information Center
Foisy, Pierre
1994-01-01
Meta analysis of 22 studies testing 1,598 subjects revealed that aging has a great effect on intentional memory for spatial location. However, methodological limits were found: fewer than half of the studies controlled for age differences in visual acuity, and many did not use a test phase of fixed duration. (SK)
USDA-ARS?s Scientific Manuscript database
Vitamin A (VA) stable isotope dilution methodology provides a quantitative estimate of total body VA stores and is the best method currently available for assessing VA status in adults and children. The methodology has also been used to test the efficacy of VA interventions in a number of low-incom...
Teaching Note--An Exploration of Team-Based Learning and Social Work Education: A Natural Fit
ERIC Educational Resources Information Center
Robinson, Michael A.; Robinson, Michelle Bachelor; McCaskill, Gina M.
2013-01-01
The literature on team-based learning (TBL) as a pedagogical methodology in social work education is limited; however, TBL, which was developed as a model for business, has been successfully used as a teaching methodology in nursing, business, engineering, medical school, and many other disciplines in academia. This project examines the use of TBL…
Braubach, Matthias; Tobollik, Myriam; Mudu, Pierpaolo; Hiscock, Rosemary; Chapizanis, Dimitris; Sarigiannis, Denis A.; Keuken, Menno; Perez, Laura; Martuzzi, Marco
2015-01-01
Well-being impact assessments of urban interventions are a difficult challenge, as there is no agreed methodology and scarce evidence on the relationship between environmental conditions and well-being. The European Union (EU) project “Urban Reduction of Greenhouse Gas Emissions in China and Europe” (URGENCHE) explored a methodological approach to assess traffic noise-related well-being impacts of transport interventions in three European cities (Basel, Rotterdam and Thessaloniki) linking modeled traffic noise reduction effects with survey data indicating noise-well-being associations. Local noise models showed a reduction of high traffic noise levels in all cities as a result of different urban interventions. Survey data indicated that perception of high noise levels was associated with lower probability of well-being. Connecting the local noise exposure profiles with the noise-well-being associations suggests that the urban transport interventions may have a marginal but positive effect on population well-being. This paper also provides insight into the methodological challenges of well-being assessments and highlights the range of limitations arising from the current lack of reliable evidence on environmental conditions and well-being. Due to these limitations, the results should be interpreted with caution. PMID:26016437
Almeida, Mariana R; Correa, Deleon N; Zacca, Jorge J; Logrado, Lucio Paulo Lima; Poppi, Ronei J
2015-02-20
The aim of this study was to develop a methodology using Raman hyperspectral imaging and chemometric methods for the identification of pre- and post-blast explosive residues on banknote surfaces. The explosives studied were of military, commercial and propellant use. After acquisition of the hyperspectral images, independent component analysis (ICA) was applied to extract the pure spectra and the distributions of the corresponding image constituents. The performance of the methodology was evaluated by the explained variance and the lack of fit of the models, by comparing the ICA-recovered spectra with the reference spectra using correlation coefficients, and by the presence of rotational ambiguity in the ICA solutions. The methodology was applied to forensic samples to solve an automated teller machine explosion case. Independent component analysis proved to be a suitable curve resolution method, achieving performance equivalent to multivariate curve resolution with alternating least squares (MCR-ALS). At low concentrations, however, MCR-ALS presents some limitations, as it did not provide the correct solution. The detection limit of the methodology presented in this study was 50 μg cm(-2). Copyright © 2014 Elsevier B.V. All rights reserved.
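As a rough illustration of the curve-resolution task benchmarked in the abstract above, the following sketch resolves two hypothetical pure spectra from simulated mixed pixel spectra using a nonnegativity-constrained alternating least squares loop (a simplified MCR-ALS). The band positions, noise level, and pixel count are invented for illustration and are not taken from the study:

```python
import numpy as np

# Simulate a stack of mixed "pixel spectra" from two hypothetical pure bands.
rng = np.random.default_rng(0)
channels = np.linspace(0.0, 1.0, 200)

def band(center, width):
    return np.exp(-((channels - center) ** 2) / (2.0 * width ** 2))

pure = np.vstack([band(0.3, 0.03), band(0.7, 0.05)])   # 2 x 200 reference spectra
conc = rng.uniform(0.1, 1.0, size=(400, 2))            # per-pixel concentrations
D = conc @ pure + 0.005 * rng.random((400, 200))       # 400 noisy mixed spectra

# Alternating least squares with nonnegativity clipping: D ~= C @ S
S = D[[0, 1]].copy()                                   # init from two data rows
for _ in range(100):
    C = np.clip(D @ np.linalg.pinv(S), 0.0, None)      # concentration step
    S = np.clip(np.linalg.pinv(C) @ D, 0.0, None)      # spectral step
    S /= S.max(axis=1, keepdims=True)                  # fix scale ambiguity
C = np.clip(D @ np.linalg.pinv(S), 0.0, None)          # final consistent C

# Match each resolved spectrum to its reference by correlation coefficient,
# mirroring the recovered-vs-reference comparison the abstract describes.
corr = [max(abs(np.corrcoef(p, s)[0, 1]) for s in S) for p in pure]
rel_error = np.linalg.norm(D - C @ S) / np.linalg.norm(D)
```

Because the two simulated bands barely overlap, the nonnegativity constraint leaves little rotational ambiguity here; with strongly overlapping bands, the ambiguity the abstract mentions becomes a real concern.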
Mechanical modulation method for ultrasensitive phase measurements in photonics biosensing.
Patskovsky, S; Maisonneuve, M; Meunier, M; Kabashin, A V
2008-12-22
A novel polarimetry methodology for phase-sensitive measurements in single-reflection geometry is proposed for applications in optical transduction-based biological sensing. The methodology uses alternating step-like chopper-based mechanical phase modulation for the orthogonal s- and p-polarizations of light reflected from the sensing interface, and extracts phase information at different harmonics of the modulation. We show that even with a relatively simple experimental arrangement, the methodology provides a phase-measurement resolution as low as 0.007 deg. We also examine the proposed approach using Total Internal Reflection (TIR) and Surface Plasmon Resonance (SPR) geometries. For TIR geometry, the response appears to be strongly dependent on the prism material, with the best values for high-refractive-index Si. The detection limit for Si-based TIR is estimated as 10(-5) in terms of Refractive Index Units (RIU). SPR geometry offers a much stronger phase response due to a much sharper phase characteristic. With a detection limit of 3.2*10(-7) RIU, the proposed methodology provides one of the best sensitivities among phase-sensitive SPR devices. Advantages of the proposed method include high sensitivity, simplicity of the experimental setup, and noise immunity resulting from high-stability modulation.
Quantitative imaging assay for NF-κB nuclear translocation in primary human macrophages
Noursadeghi, Mahdad; Tsang, Jhen; Haustein, Thomas; Miller, Robert F.; Chain, Benjamin M.; Katz, David R.
2008-01-01
Quantitative measurement of NF-κB nuclear translocation is an important research tool in cellular immunology. Established methodologies have a number of limitations, such as poor sensitivity, high cost or dependence on cell lines. Novel imaging methods to measure nuclear translocation of transcriptionally active components of NF-κB are being used but are also partly limited by the need for specialist imaging equipment or image analysis software. Herein we present a method for quantitative detection of NF-κB rel A nuclear translocation, using immunofluorescence microscopy and the public domain image analysis software ImageJ that can be easily adopted for cellular immunology research without the need for specialist image analysis expertise and at low cost. The method presented here is validated by demonstrating the time course and dose response of NF-κB nuclear translocation in primary human macrophages stimulated with LPS, and by comparison with a commercial NF-κB activation reporter cell line. PMID:18036607
Neighborhoods and health: where are we and where do we go from here?
DIEZ-ROUX, A. V.
2007-01-01
In recent years there has been an explosion of interest in neighborhood health effects. Most existing work has relied on secondary data analyses and has used administrative areas and aggregate census data to characterize neighborhoods. Important questions remain regarding whether the associations reported by these studies reflect causal processes. This paper reviews the major limitations of existing work and discusses areas for future development including (1) definition and measurement of area or ecologic attributes (2) consideration of spatial scale (3) cumulative exposures and lagged effects and (4) the complementary nature of observational, quasi-experimental, and experimental evidence. As is usually the case with complex research questions, consensus regarding the presence and magnitude of neighborhood health effects will emerge from the work of multiple disciplines, often with diverse methodological approaches, each with its strengths and its limitations. Partnership across disciplines, as well as among health researchers, communities, urban planners, and policy experts will be key. PMID:17320330
Media Portrayal of a Landmark Neuroscience Experiment on Free Will.
Racine, Eric; Nguyen, Valentin; Saigle, Victoria; Dubljevic, Veljko
2017-08-01
The concept of free will has been heavily debated in philosophy and the social sciences. Its alleged importance lies in its association with phenomena fundamental to our understandings of self, such as autonomy, freedom, self-control, agency, and moral responsibility. Consequently, when neuroscience research is interpreted as challenging or even invalidating this concept, a number of heated social and ethical debates surface. We undertook a content analysis of media coverage of Libet et al.'s (Brain 106(Pt 3):623-642, 1983) landmark study, which is frequently interpreted as posing a serious challenge to the existence of free will. Media descriptions of Libet et al.'s experiment provided limited details about the original study. Overall, many media articles reported that Libet et al.'s experiments undermined the existence of free will, despite acknowledging that several methodological limitations had been identified in the literature. A propensity to attribute greater credibility than warranted to neurobiological explanations could be at stake.
Smart Grid Constraint Violation Management for Balancing and Regulating Purposes
Bhattarai, Bishnu; Kouzelis, Konstantinos; Mendaza, Iker; ...
2017-03-29
The gradual active load penetration in low voltage distribution grids is expected to challenge their network capacity in the near future. Distribution system operators should for this reason resort to either costly grid reinforcements or to demand side management mechanisms. Since demand side management implementation is usually cheaper, it is also the favorable solution. To this end, this article presents a framework for handling grid limit violations, both voltage and current, to ensure a secure and qualitative operation of the distribution grid. This framework consists of two steps, namely a proactive centralized and subsequently a reactive decentralized control scheme. The former is employed to balance the one hour ahead load while the latter aims at regulating the consumption in real-time. In both cases, the importance of fair use of electricity demand flexibility is emphasized. Thus, it is demonstrated that this methodology aids in keeping the grid status within preset limits while utilizing flexibility from all flexibility participants.
Goerlandt, Floris; Montewka, Jakub
2014-02-15
In risk assessment of maritime transportation, estimation of accidental oil outflow from tankers is important for assessing environmental impacts. However, there typically is limited data concerning the specific structural design and tank arrangement of ships operating in a given area. Moreover, there is uncertainty about the accident scenarios potentially emerging from ship encounters. This paper proposes a Bayesian network (BN) model for reasoning under uncertainty for the assessment of accidental cargo oil outflow in a ship-ship collision where a product tanker is struck. The BN combines a model linking impact scenarios to damage extent with a model for estimating the tank layouts based on limited information regarding the ship. The methodology for constructing the model is presented and output for two accident scenarios is shown. The discussion elaborates on the issue of model validation, both in terms of the BN and in light of the adopted uncertainty/bias-based risk perspective. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.
Toumi, Mondher; Motrunich, Anastasiia; Millier, Aurélie; Rémuzat, Cécile; Chouaid, Christos; Falissard, Bruno; Aballéa, Samuel
2017-01-01
Background: Despite the guidelines for Economic and Public Health Assessment Committee (CEESP) submission having been available for nearly six years, the dossiers submitted continue to deviate from them, potentially impacting product prices. Objective: To review the reports published by CEESP, analyse deviations from the guidelines, and discuss their implications for the pricing and reimbursement process. Study design: CEESP reports published until January 2017 were reviewed, and deviations from the guidelines were extracted. The frequency of deviations was described by type of methodological concern (minor, important or major). Results: In 19 reports, we identified 243 methodological concerns, most often concerning modelling, measurement and valuation of health states and results presentation and sensitivity analyses; nearly 63% were minor, 33% were important and 4.5% were major. All reports included minor methodological concerns, and 17 (89%) included at least one important and/or major methodological concern. Global major methodological concerns completely invalidated the analysis in seven dossiers (37%). Conclusion: The CEESP submission dossiers fail to adhere to the guidelines, potentially invalidating the health economics analysis and resulting in pricing negotiations. As these negotiations tend to be unfavourable for the manufacturer, the industry should strive to improve the quality of the analyses submitted to CEESP. PMID:28804600
Wastewater GHG Accounting Protocols as Compared to the State of GHG Science.
Willis, John L; Yuan, Zhiguo; Murthy, Sudhir
2016-08-01
Greenhouse gas (GHG) accounting protocols have addressed emissions from wastewater conveyance and treatment using a variety of simplifying methodologies. While these methodologies vary to some degree by protocol, within each protocol they provide consistent tools for organizational entities of varying size and scope to report and verify GHG emissions. Much of the science supporting these methodologies is either limited or the protocols have failed to keep abreast of developing GHG research. This state-of-the-art review summarizes the sources of direct GHG emissions (both those covered and not covered in current protocols) from wastewater handling; provides a review of the wastewater-related methodologies in a select group of popular protocols; and discusses where research has out-paced protocol methodologies and other areas where the supporting science is relatively weak and warrants further exploration.
Real-time subsecond voltammetric analysis of Pb in aqueous environmental samples.
Yang, Yuanyuan; Pathirathna, Pavithra; Siriwardhane, Thushani; McElmurry, Shawn P; Hashemi, Parastoo
2013-08-06
Lead (Pb) pollution is an important environmental and public health concern. Rapid Pb transport during stormwater runoff significantly impairs surface water quality. The ability to characterize and model Pb transport during these events is critical to mitigating its impact on the environment. However, Pb analysis is limited by the lack of analytical methods that can afford rapid, sensitive measurements in situ. While electrochemical methods have previously shown promise for rapid Pb analysis, they are currently limited in two ways. First, because of Pb's limited solubility, test solutions that are representative of environmental systems are not typically employed in laboratory characterizations. Second, concerns about traditional Hg electrode toxicity, stability, and low temporal resolution have dampened opportunities for in situ analyses with traditional electrochemical methods. In this paper, we describe two novel methodological advances that bypass these limitations. Using geochemical models, we first create an environmentally relevant test solution that can be used for electrochemical method development and characterization. Second, we develop a fast-scan cyclic voltammetry (FSCV) method for Pb detection on Hg-free carbon fiber microelectrodes. We assess the method's sensitivity and stability, taking into account Pb speciation, and utilize it to characterize rapid Pb fluctuations in real environmental samples. We thus present a novel real-time electrochemical tool for Pb analysis in both model and authentic environmental solutions.
Assessing risk factors for dental caries: a statistical modeling approach.
Trottini, Mario; Bossù, Maurizio; Corridore, Denise; Ierardo, Gaetano; Luzzi, Valeria; Saccucci, Matteo; Polimeni, Antonella
2015-01-01
The problem of identifying potential determinants and predictors of dental caries is of key importance in caries research and has received considerable attention in the scientific literature. On the methodological side, a broad range of statistical models is currently available to analyze dental caries indices (DMFT, dmfs, etc.). These models have been applied in several studies to investigate the impact of different risk factors on the cumulative severity of dental caries experience. However, in most cases (i) these studies focus on a very specific subset of risk factors; and (ii) in the statistical modeling only a few candidate models are considered and model selection is at best only marginally addressed. As a result, our understanding of the robustness of the statistical inferences with respect to the choice of the model is very limited; the richness of the set of statistical models available for analysis is only marginally exploited; and inferences could be biased due to the omission of potentially important confounding variables from the model's specification. In this paper we argue that these limitations can be overcome by considering a general class of candidate models and carefully exploring the model space using standard model selection criteria and measures of global fit and predictive performance of the candidate models. Strengths and limitations of the proposed approach are illustrated with a real data set. In our illustration the model space contains more than 2.6 million models, which requires inferences to be adjusted for 'optimism'.
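The model-space exploration advocated in the abstract above can be sketched on toy data: enumerate candidate predictor subsets and score each with a standard selection criterion. The predictors, sample size, and Gaussian AIC choice here are illustrative assumptions, not the paper's actual models or data:

```python
import itertools
import numpy as np

# Toy data: four hypothetical risk factors, of which only columns 0 and 2
# actually drive the (continuous) outcome.
rng = np.random.default_rng(1)
n = 200
X = rng.standard_normal((n, 4))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.standard_normal(n)

def aic(subset):
    """Gaussian AIC of an OLS fit on the given predictor columns."""
    Xs = np.column_stack([np.ones(n)] + [X[:, j] for j in subset])
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    rss = np.sum((y - Xs @ beta) ** 2)
    k = Xs.shape[1] + 1                  # coefficients + error variance
    return n * np.log(rss / n) + 2 * k

# Exhaustively score every subset of the candidate predictors (2^4 models;
# the paper's real space of 2.6 million models would need smarter search).
candidates = [s for r in range(5) for s in itertools.combinations(range(4), r)]
best = min(candidates, key=aic)
```

With only four predictors the search is trivially exhaustive; for millions of candidate models, stepwise, regularized, or branch-and-bound strategies would replace the brute-force loop, and out-of-sample validation would guard the resulting inferences against optimism.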
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baumgartner, S.; Bieli, R.; Bergmann, U. C.
2012-07-01
An overview is given of existing CPR design criteria and the methods used in BWR reload analysis to evaluate the impact of channel bow on CPR margins. Potential weaknesses in today's methodologies are discussed. Westinghouse in collaboration with KKL and Axpo - operator and owner of the Leibstadt NPP - has developed an optimized CPR methodology based on a new criterion to protect against dryout during normal operation and with a more rigorous treatment of channel bow. The new steady-state criterion is expressed in terms of an upper limit of 0.01 for the dryout failure probability per year. This is considered a meaningful and appropriate criterion that can be directly related to the probabilistic criteria set-up for the analyses of Anticipated Operation Occurrences (AOOs) and accidents. In the Monte Carlo approach a statistical modeling of channel bow and an accurate evaluation of CPR response functions allow the associated CPR penalties to be included directly in the plant SLMCPR and OLMCPR in a best-estimate manner. In this way, the treatment of channel bow is equivalent to all other uncertainties affecting CPR. Emphasis is put on quantifying the statistical distribution of channel bow throughout the core using measurement data. The optimized CPR methodology has been implemented in the Westinghouse Monte Carlo code, McSLAP. The methodology improves the quality of dryout safety assessments by supplying more valuable information and better control of conservatisms in establishing operational limits for CPR. The methodology is demonstrated with application examples from the introduction at KKL. (authors)
Updating the limit efficiency of silicon solar cells
NASA Technical Reports Server (NTRS)
Wolf, M.
1979-01-01
Evaluation of the limit efficiency based on the simplest, most basic mathematical method that is appropriate for the conditions imposed by the cell model is discussed. The methodology, the solar cell structure, and the selection of the material parameters used in the evaluation are described. The results are discussed including a set of design goals derived from the limit efficiency.
Macagnan, Fernanda Teixeira; da Silva, Leila Picolli; Hecktheuer, Luisa Helena
2016-07-01
There is a growing need for a global consensus on the definition of dietary fibre and the use of appropriate methodologies for its determination in different food matrices. Oligosaccharides (prebiotic effect) and bioactive compounds (antioxidant effect) are important constituents of dietary fibre, which enhance its beneficial effects in the body, such as those related to maintaining intestinal health. These dietary components need to be quantified and addressed in conjunction with fibre in nutritional studies due to the close relationship between them and their common destiny in the human body. This review discusses updates to the concept of dietary fibre, with an emphasis on biological and methodological aspects, and highlights the physiological importance of fibre as a carrier of bioactive compounds. Copyright © 2016 Elsevier Ltd. All rights reserved.
Validity assessment and the neurological physical examination.
Zasler, Nathan D
2015-01-01
The assessment of any patient or examinee with neurological impairment, whether acquired or congenital, provides a key set of data points in the context of developing accurate diagnostic impressions and implementing an appropriate neurorehabilitation program. As part of that assessment, the neurological physical exam is an extremely important component of the overall neurological assessment. In the aforementioned context, clinicians often are confounded by unusual, atypical or unexplainable physical exam findings that bring into question the organicity, veracity, and/or underlying cause of the observed clinical presentation. The purpose of this review is to provide readers with general directions and specific caveats regarding validity assessment in the context of the neurological physical exam. It is of utmost importance for health care practitioners to be aware of assessment methodologies that may assist in determining the validity of the neurological physical exam and differentiating organic from non-organic/functional impairments. Perhaps more importantly, the limitations of many commonly used strategies for assessment of non-organicity should be recognized and considered before labeling observed physical findings on neurological exam as non-organic or functional.
van Zutphen, Linda; Siep, Nicolette; Jacob, Gitta A; Goebel, Rainer; Arntz, Arnoud
2015-04-01
Emotional sensitivity, emotion regulation and impulsivity are fundamental topics in research on borderline personality disorder (BPD). The body of fMRI studies examining the neural correlates of these topics is growing and has only begun to elucidate the underlying neural mechanisms in BPD. However, there are strong similarities but also important differences among the results of different studies. It is therefore important to know in more detail what these differences are and how they should be interpreted. The present review sheds a critical light on the fMRI studies examining emotional sensitivity, emotion regulation and impulsivity in BPD patients. First, an outline of the methodology and results of the studies is given. Thereafter, important issues that remain unanswered and ways to improve future research are discussed. Future research should take into account the limited power of previous studies and focus more on BPD specificity with regard to time course responses, different regulation strategies, manipulation of self-regulation, medication use, a wider range of stimuli, gender effects and the inclusion of a clinical control group. Copyright © 2015 Elsevier Ltd. All rights reserved.
RM-DEMATEL: a new methodology to identify the key factors in PM2.5.
Chen, Yafeng; Liu, Jie; Li, Yunpeng; Sadiq, Rehan; Deng, Yong
2015-04-01
The weather system is a relatively complex dynamic system whose factors mutually influence PM2.5 concentration. In this paper, a new method is proposed to quantify the influence of other factors in the weather system on PM2.5 and to identify the most important factors for PM2.5 with limited resources. The relation map (RM) is used to obtain the direct relation matrix of 14 factors in PM2.5. The decision making trial and evaluation laboratory (DEMATEL) is applied to calculate the causal relationships and the extent of mutual influence among the 14 factors in PM2.5. According to the ranking results of the proposed method, the most important key factors are sulfur dioxide (SO2) and nitrogen oxides (NO(X)). In addition, other factors, namely the ambient maximum temperature (T(max)), concentration of PM10, and wind direction (W(dir)), are important factors for PM2.5. The proposed method can also be applied to other environmental management systems to identify key factors.
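The DEMATEL step described above — build a direct relation matrix, derive the total relation matrix, then rank factors by prominence — can be sketched with a small, invented example. The factor set is truncated to four and the influence scores are hypothetical, not the paper's data:

```python
import numpy as np

# Hypothetical factors and 0-4 direct-influence judgments A[i, j]
# (influence of factor i on factor j); invented for illustration.
factors = ["SO2", "NOx", "Tmax", "PM10"]
A = np.array([
    [0, 3, 0, 4],
    [2, 0, 0, 4],
    [1, 1, 0, 2],
    [0, 0, 0, 0],
], dtype=float)

# Normalize by the largest row sum, then form the total relation matrix
# T = N (I - N)^-1, which accumulates direct and all indirect influence paths.
N = A / A.sum(axis=1).max()
T = N @ np.linalg.inv(np.eye(len(A)) - N)

R = T.sum(axis=1)    # total influence exerted by each factor
C = T.sum(axis=0)    # total influence received by each factor
prominence = R + C   # overall importance (R - C would give cause/effect role)
ranking = [factors[i] for i in np.argsort(-prominence)]
```

In this toy matrix, PM2.5-related factors that exert or absorb heavy influence dominate the prominence ranking, while the weakly connected temperature factor falls to the bottom; the paper applies the same machinery to its full 14-factor system.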
Van Cauwenberg, Jelle; De Bourdeaudhuij, Ilse; Clarys, Peter; Nasar, Jack; Salmon, Jo; Goubert, Liesbet; Deforche, Benedicte
2016-01-16
Knowledge about the relationships between micro-scale environmental factors and older adults' walking for transport is limited and inconsistent. This is probably due to methodological limitations, such as absence of an accurate neighborhood definition, lack of environmental heterogeneity, environmental co-variation, and recall bias. Furthermore, most previous studies are observational in nature. We aimed to address these limitations by investigating the effects of manipulating photographs on micro-scale environmental factors on the appeal of a street for older adults' transportation walking. Secondly, we used latent class analysis to examine whether subgroups could be identified that have different environmental preferences for transportation walking. Thirdly, we investigated whether these subgroups differed in socio-demographic, functional and psychosocial characteristics, current level of walking and environmental perceptions of their own street. Data were collected among 1131 Flemish older adults through an online (n = 940) or an interview version of the questionnaire (n = 191). This questionnaire included a choice-based conjoint exercise with manipulated photographs of a street. These manipulated photographs originated from one panoramic photograph of an existing street that was manipulated on nine environmental attributes. Participants chose which of two presented streets they would prefer to walk for transport. In the total sample, sidewalk evenness had by far the greatest appeal for transportation walking. The other environmental attributes were less important. Four subgroups that differed in their environmental preferences for transportation walking were identified. In the two largest subgroups (representing 86% of the sample) sidewalk evenness was the most important environmental attribute. 
In the two smaller subgroups (each comprising 7% of the sample), traffic volume and speed limit were the most important environmental attributes for one, and the presence of vegetation and a bench were the most important environmental attributes for the other. This latter subgroup included a higher percentage of service flat residents than the other subgroups. Our results suggest that the provision of even sidewalks should be considered a priority when developing environmental interventions aiming to stimulate older adults' transportation walking. Natural experiments are needed to confirm whether our findings can be translated to real environments and actual transportation walking behavior.
Hacking, Damian; Cleary, Susan
2016-02-09
Setting priorities is important in health research given the limited resources available for research. Various guidelines exist to assist in the priority setting process; however, priority setting still faces significant challenges such as the clear ranking of identified priorities. The World Health Organization (WHO) proposed a Disability Adjusted Life Year (DALY)-based model to rank priorities by research area (basic, health systems and biomedical) by dividing the DALYs into 'unavertable with existing interventions', 'avertable with improved efficiency' and 'avertable with existing but non-cost-effective interventions', respectively. However, the model has conceptual flaws and no clear methodology for its construction. Therefore, the aim of this paper was to amend the model to address these flaws, and develop a clear methodology by using tuberculosis in South Africa as a worked example. An amended model was constructed to represent total DALYs as the product of DALYs per person and absolute burden of disease. These figures were calculated for all countries from WHO datasets. The lowest figures achieved by any country were assumed to represent 'unavertable with existing interventions' if extrapolated to South Africa. The ratio of 'cost per patient treated' (adjusted for purchasing power and outcome weighted) between South Africa and the best country was used to calculate the 'avertable with improved efficiency section'. Finally, 'avertable with existing but non-cost-effective interventions' was calculated using Disease Control Priorities Project efficacy data, and the ratio between the best intervention and South Africa's current intervention, irrespective of cost. The amended model shows that South Africa has a tuberculosis burden of 1,009,837.3 DALYs; 0.009% of DALYs are unavertable with existing interventions and 96.3% of DALYs could be averted with improvements in efficiency. 
Of the remaining DALYs, a further 56.9% could be averted with existing but non-cost-effective interventions. The amended model was successfully constructed using limited data sources. The generalizability of the data used is the main limitation of the model. More complex formulas are required to deal with such potential confounding variables; however, the results act as starting point for development of a more robust model.
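Using the figures reported in the abstract above, the partition arithmetic of the amended model can be reproduced directly. This is a sketch of the accounting only: the percentages come from the abstract, and rounding in the published figures is ignored:

```python
# Partition South Africa's reported TB burden into the amended model's
# categories, using the percentages stated in the abstract.
total_dalys = 1_009_837.3                    # total TB burden (DALYs)

unavertable = total_dalys * 0.00009          # 0.009% unavertable with existing interventions
avertable_efficiency = total_dalys * 0.963   # 96.3% avertable with improved efficiency
remaining = total_dalys - unavertable - avertable_efficiency
avertable_non_ce = remaining * 0.569         # 56.9% of the remainder: existing but
still_unaverted = remaining - avertable_non_ce  # non-cost-effective interventions

shares = {
    "unavertable": unavertable,
    "efficiency": avertable_efficiency,
    "non_cost_effective": avertable_non_ce,
    "left_over": still_unaverted,
}
# Sanity check: the four slices reconstitute the total burden.
assert abs(sum(shares.values()) - total_dalys) < 1e-6
```

The dominance of the efficiency slice illustrates the paper's headline finding: under this model, almost all of South Africa's TB burden is attributed to avertable inefficiency rather than to missing interventions.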
Selecting cockpit functions for speech I/O technology
NASA Technical Reports Server (NTRS)
Simpson, C. A.
1985-01-01
A general methodology for the initial selection of functions for speech generation and speech recognition technology is discussed. The SCR (Stimulus/Central-Processing/Response) compatibility model of Wickens et al. (1983) is examined, and its application is demonstrated for a particular cockpit display problem. Some limits of the applicability of that model are illustrated in the context of predicting overall pilot-aircraft system performance. A program of system performance measurement is recommended for the evaluation of candidate systems. It is suggested that no one measure of system performance can necessarily be depended upon to the exclusion of others. Systems response time, system accuracy, and pilot ratings are all important measures. Finally, these measures must be collected in the context of the total flight task environment.
Cunningham, Shayna D.; Kerrigan, Deanna L.; McNeely, Clea A.; Ellen, Jonathan M.
2016-01-01
This paper examines the activities of churches in Baltimore, Maryland, concerning the issues of sexuality, whether they potentially stigmatize persons with or at risk for HIV/AIDS, and to what extent individual agency versus institutional forces influence churches in this regard. In-depth interviews were conducted with 20 leaders from 16 churches and analyzed using a grounded theory methodology. Although many churches were involved in HIV/AIDS-related activities, the content of such initiatives was sometimes limited due to organizational constraints. Church leaders varied, however, in the extent to which they responded in accordance with or resisted these constraints, highlighting the importance of individual agency influencing churches’ responses to HIV/AIDS. PMID:19714469
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Myeong H., E-mail: myeong.lee@warwick.ac.uk; Troisi, Alessandro
Vibronic coupling between the electronic and vibrational degrees of freedom has been reported to play an important role in charge and exciton transport in organic photovoltaic materials, molecular aggregates, and light-harvesting complexes. Explicitly accounting for effective vibrational modes rather than treating them as a thermal environment has been shown to be crucial to describe the effect of vibronic coupling. We present a methodology to study dissipative quantum dynamics of vibronically coupled systems based on a surrogate Hamiltonian approach, which is in principle not limited by Markov approximation or weak system-bath interaction, using a vibronic basis. We apply the vibronic surrogate Hamiltonian method to a linear chain system and discuss how different types of relaxation process, intramolecular vibrational relaxation and intermolecular vibronic relaxation, influence population dynamics of dissipative vibronic systems.
Individual and peer group normative beliefs about relational aggression.
Werner, Nicole E; Hill, Laura G
2010-01-01
Studies show that children who use relational aggression process social information in unique ways; however, findings have been inconsistent and limited by methodological weaknesses. This short-term longitudinal study examined developmental changes in 245 (49% female; ages 8-13) 3rd through 8th graders' normative beliefs about relational aggression and tested the hypothesis that individual and classroom-level norms predict relational aggression 1 year later. Results showed that the transition to middle school was marked by increased approval of relational aggression, and individual norms predicted future relational aggression. Importantly, a contextual model showed that students in peer groups highly supportive of relational aggression became increasingly aggressive. Findings extend social information processing theories of relational aggression to focus on the role of peer group cognitions.
MUSIC TEMPO'S EFFECT ON EXERCISE PERFORMANCE: COMMENT ON DYER AND McKUNE.
Nakamura, Priscila Missaki
2015-06-01
Dyer and McKune (2013) stated that music tempo has no influence on performance, physiological, and psychophysical variables in well-trained cyclists during high-intensity endurance tasks. However, there are important limitations in the methodology of the study: the participants' music preferences and changes in tempo were not adequately measured, so it is not possible to affirm that music tempo does not influence athletes' performance. Potential areas of future research include: (a) use of instruments to assess the qualities of music; (b) standardizing music tempo according to exercise type (e.g., running, cycling, etc.); (c) considering the training level of the participants (i.e., athletes and non-athletes); and (d) use of instruments to assess concentration during exercise.
Hydrocarbon degradation in soils and methods for soil biotreatment.
Morgan, P; Watkinson, R J
1989-01-01
The cleanup of soils and groundwater contaminated with hydrocarbons is of particular importance in minimizing the environmental impact of petroleum and petroleum products and in preventing contamination of potable water supplies. Consequently, there is a growing industry involved in the treatment of contaminated topsoils, subsoils, and groundwater. The biotreatment methodologies employed for decontamination are designed to enhance in situ degradation by the supply of oxygen, inorganic nutrients, and/or microbial inocula to the contaminated zone. This review considers the fate and effects of hydrocarbon contaminants in terrestrial environments, with particular reference to the factors that limit biodegradation rates. The potential efficiencies, advantages, and disadvantages of biotreatment techniques are discussed and the future research directions necessary for process development are considered.
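Biodegradation rates of the kind the review discusses are often screened with a first-order decay model. The sketch below uses an entirely hypothetical rate constant and concentrations, not values from the review.

```python
import numpy as np

# First-order screening model for hydrocarbon loss in soil: C(t) = C0 * exp(-k t).
# Rate constant and concentrations are illustrative only.
C0 = 1000.0     # initial total petroleum hydrocarbons (mg/kg soil)
k = 0.015       # first-order biodegradation rate constant (1/day), hypothetical
target = 100.0  # cleanup target (mg/kg soil)

# Time to reach the target: t = ln(C0 / target) / k
t_target = np.log(C0 / target) / k
print(f"time to reach {target:.0f} mg/kg: {t_target:.0f} days")
for t in (0, 30, 90, 180):
    print(f"day {t:3d}: {C0 * np.exp(-k * t):7.1f} mg/kg")
```

In practice, as the review notes, the rate constant is not fixed: oxygen, nutrient supply, and inoculum size limit biodegradation, which is why biotreatment methodologies aim to raise the effective rate rather than assume it.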
Birth Order and health: major issues.
Elliott, B A
1992-08-01
Birth Order has been described as a variable with a complex relationship to child and adult outcomes. A review of the medical literature over the past 5 years identified 20 studies that investigated the relationship between Birth Order and a health outcome. Only one of the studies established a relationship between Birth Order and a health outcome: third- and fourth-born children have a higher incidence of accidents that result in hospitalization. The other demonstrated relationships are each explained by intervening variables or methodological limitations. Although Birth Order is not a strong independent explanatory factor in understanding health outcomes, it is an important marker variable. Statistically significant relationships between Birth Order and health outcomes yield insights into the ways a family influences an individual's health.
Features and characterization needs of rubber composite structures
NASA Technical Reports Server (NTRS)
Tabaddor, Farhad
1989-01-01
Some of the major unique features of rubber composite structures are outlined. The features covered are those related to the material properties, but analytical features are also briefly discussed. It is essential to recognize these features at the planning stage of any long-range analytical, experimental, or application program. The development of a general and comprehensive program that fully accounts for all the important characteristics of tires, under all the relevant modes of operation, may present a prohibitively expensive and impractical task in the near future. There is therefore a need to develop application methodologies that can use less general models, beyond their theoretical limitations and yet with reasonable reliability, through a proper mix of analytical, experimental, and testing activities.
Voga, Gorazd
2010-01-01
Cardiac-related failure of weaning from mechanical ventilation is an important reason for prolonged mechanical ventilation, intensive care unit treatment, and increased morbidity and mortality. When transthoracic echocardiography (TTE) is routinely performed before a weaning trial, patients at high risk of cardiac-related failure can be detected by low left ventricular (LV) ejection fraction, diastolic dysfunction, and elevated LV filling pressure. During the weaning trial, a further increase of LV filling pressure and progression of diastolic failure can be observed by repeated TTE. Owing to certain limitations concerning patients and methodology, TTE cannot be employed in every patient and invasive hemodynamic monitoring is still mandatory in selected patients with repetitive weaning failure. PMID:20619005