Modeling energy/economy interactions for conservation and renewable energy-policy analysis
NASA Astrophysics Data System (ADS)
Groncki, P. J.
Energy policy and its implications for policy analysis and methodological tools are discussed. The evolution of one methodological approach is reported: the component models of a combined modeling system, their evolution in response to changing analytic needs, and the development of the integrated framework. The analyses performed over the past several years are summarized. The current philosophy behind energy policy is discussed and compared with recent history, and implications for current policy analysis and methodological approaches are drawn.
Munthe-Kaas, Heather; Bohren, Meghan A; Glenton, Claire; Lewin, Simon; Noyes, Jane; Tunçalp, Özge; Booth, Andrew; Garside, Ruth; Colvin, Christopher J; Wainwright, Megan; Rashidian, Arash; Flottorp, Signe; Carlsen, Benedicte
2018-01-25
The GRADE-CERQual (Confidence in Evidence from Reviews of Qualitative research) approach has been developed by the GRADE (Grading of Recommendations Assessment, Development and Evaluation) Working Group. The approach has been developed to support the use of findings from qualitative evidence syntheses in decision-making, including guideline development and policy formulation. CERQual includes four components for assessing how much confidence to place in findings from reviews of qualitative research (also referred to as qualitative evidence syntheses): (1) methodological limitations, (2) coherence, (3) adequacy of data and (4) relevance. This paper is part of a series providing guidance on how to apply CERQual and focuses on CERQual's methodological limitations component. We developed the methodological limitations component by searching the literature for definitions, gathering feedback from relevant research communities and developing consensus through project group meetings. We tested the CERQual methodological limitations component within several qualitative evidence syntheses before agreeing on the current definition and principles for application. When applying CERQual, we define methodological limitations as the extent to which there are concerns about the design or conduct of the primary studies that contributed evidence to an individual review finding. In this paper, we describe the methodological limitations component and its rationale and offer guidance on how to assess methodological limitations of a review finding as part of the CERQual approach. This guidance outlines the information required to assess the methodological limitations component, the steps that need to be taken to assess methodological limitations of data contributing to a review finding, and examples of methodological limitations assessments. This paper provides guidance for review authors and others on undertaking an assessment of methodological limitations in the context of the CERQual approach. More work is needed to determine which criteria critical appraisal tools should include when assessing methodological limitations. We currently recommend that whichever tool is used, review authors provide a transparent description of their assessments of methodological limitations in a review finding. We expect the CERQual approach and its individual components to develop further as our experiences with the practical implementation of the approach increase.
Para-Quantitative Methodology: Reclaiming Experimentalism in Educational Research
ERIC Educational Resources Information Center
Shabani Varaki, Bakhtiar; Floden, Robert E.; Javidi Kalatehjafarabadi, Tahereh
2015-01-01
This article focuses on the criticisms of current approaches in educational research methodology. It summarizes rationales for mixed methods and argues that mixing the quantitative and qualitative paradigms is problematic on practical and philosophical grounds. It is also indicated that the current rise of mixed methods work has…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Xiao-Ying; Yao, Juan; He, Hua
2012-01-01
Extensive testing shows that the current version of the Chemical Mixture Methodology (CMM) is meeting its intended mission to provide conservative estimates of the health effects from exposure to airborne chemical mixtures. However, the current version of the CMM could benefit from several enhancements that are designed to improve its application of Health Code Numbers (HCNs) and employ weighting factors to reduce over-conservatism.
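For illustration only, here is a minimal sketch of the general shape of such a calculation: a mixture hazard index in which each component's concentration-to-guideline ratio is scaled by a weighting factor keyed to its Health Code Number. The HCN categories, weights, and guideline values below are invented and are not the CMM's actual tables or algorithm.

```python
# Hedged sketch of the general idea only, not the CMM implementation.
# Each chemical's concentration-to-guideline ratio is scaled by a weighting factor
# keyed to its Health Code Number (HCN) category before summing into a hazard index.
HCN_WEIGHT = {"irritant": 0.3, "neurotoxicant": 1.0, "carcinogen": 1.0}  # illustrative

def mixture_hazard_index(components):
    """components: list of (concentration, guideline_limit, hcn_category) tuples,
    with concentration and limit in the same units (e.g., mg/m^3)."""
    return sum(HCN_WEIGHT[hcn] * (conc / limit) for conc, limit, hcn in components)

hi = mixture_hazard_index([
    (0.5, 2.0, "irritant"),       # weighted contribution 0.3 * 0.25
    (0.1, 0.5, "neurotoxicant"),  # weighted contribution 1.0 * 0.2
])
print(f"hazard index = {hi:.3f}")  # 0.075 + 0.2 = 0.275; values >= 1 would flag concern
```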
Methodological Approaches in Conducting Overviews: Current State in HTA Agencies
ERIC Educational Resources Information Center
Pieper, Dawid; Antoine, Sunya-Lee; Morfeld, Jana-Carina; Mathes, Tim; Eikermann, Michaela
2014-01-01
Objectives: Overviews search for reviews rather than for primary studies. They might have the potential to support decision making within a shorter time frame by reducing production time. We aimed to summarize available instructions for authors intending to conduct overviews as well as the currently applied methodology of overviews in…
The U.S. EPA's current draft ARE methodology offers three different approaches for derivation of health effects values for various chemicals and agents under inhalation exposure scenarios of < 24 hrs. These approaches, the NOAEL, benchmark concentration (BMC), and categorical ...
Progressive failure methodologies for predicting residual strength and life of laminated composites
NASA Technical Reports Server (NTRS)
Harris, Charles E.; Allen, David H.; Obrien, T. Kevin
1991-01-01
Two progressive failure methodologies currently under development by the Mechanics of Materials Branch at NASA Langley Research Center are discussed. The damage tolerance/fail safety methodology developed by O'Brien is an engineering approach to ensuring adequate durability and damage tolerance by treating only delamination onset and the subsequent delamination accumulation through the laminate thickness. The continuum damage model developed by Allen and Harris employs continuum damage laws to predict laminate strength and life. The philosophy, mechanics framework, and current implementation status of each methodology are presented.
Qualitative Approaches to Mixed Methods Practice
ERIC Educational Resources Information Center
Hesse-Biber, Sharlene
2010-01-01
This article discusses how methodological practices can shape and limit how mixed methods is practiced and makes visible the current methodological assumptions embedded in mixed methods practice that can shut down a range of social inquiry. The article argues that there is a "methodological orthodoxy" in how mixed methods is practiced…
Discourse Analysis and the Study of Educational Leadership
ERIC Educational Resources Information Center
Anderson, Gary; Mungal, Angus Shiva
2015-01-01
Purpose: The purpose of this paper is to provide an overview of the current and past work using discourse analysis in the field of educational administration and of discourse analysis as a methodology. Design/Methodology/Approach: Authors reviewed research in educational leadership that uses discourse analysis as a methodology. Findings: While…
Development of a structured approach for decomposition of complex systems on a functional basis
NASA Astrophysics Data System (ADS)
Yildirim, Unal; Felician Campean, I.
2014-07-01
The purpose of this paper is to present the System State Flow Diagram (SSFD) as a structured and coherent methodology to decompose a complex system on a solution-independent functional basis. The paper starts by reviewing common function modelling frameworks in the literature and discussing the practical requirements of the SSFD in the context of the current literature and current approaches in industry. The proposed methodology is illustrated through the analysis of a case study: design analysis of a generic Bread Toasting System (BTS).
Examining emotional expressions in discourse: methodological considerations
NASA Astrophysics Data System (ADS)
Hufnagel, Elizabeth; Kelly, Gregory J.
2017-10-01
This methodological paper presents an approach for examining emotional expressions through discourse analysis and ethnographic methods. Drawing on trends in the current literature in science education, we briefly explain the importance of emotions in science education and examine the current research methodologies used in interactional emotion studies. We put forth and substantiate a methodological approach that attends to the interactional, contextual, intertextual, and consequential aspects of emotional expressions. By examining emotional expressions in the discourse in which they are constructed, emotional expressions are identified through semantics, contextualization, and linguistic features. These features make salient four dimensions of emotional expressions: aboutness, frequency, type, and ownership. Drawing on data from a large empirical study of pre-service elementary teachers' emotional expressions about climate change in a science course, we provide illustrative examples to describe what counts as emotional expressions in situ. In doing so we explain how our approach makes salient the nuanced nature of such expressions as well as the broader discourse in which they are constructed and the implications for researching emotional expressions in science education discourse. We suggest reasons why this discourse-oriented research methodology can contribute to the interactional study of emotions in science education contexts.
Canino-Rodríguez, José M; García-Herrero, Jesús; Besada-Portas, Juan; Ravelo-García, Antonio G; Travieso-González, Carlos; Alonso-Hernández, Jesús B
2015-03-04
The limited efficiency of current air traffic systems will require a next generation of Smart Air Traffic Systems (SATS) that relies on current technological advances. This challenge means a transition toward a new navigation and air-traffic procedures paradigm, where pilots and air traffic controllers perform and coordinate their activities according to new roles and technological supports. The design of new Human-Computer Interactions (HCI) for performing these activities is a key element of SATS. However, efforts to develop such tools need to be informed by a parallel characterization of hypothetical air traffic scenarios compatible with current ones. This paper focuses on airborne HCI within SATS, where cockpit inputs come from aircraft navigation systems, the surrounding traffic situation, controllers' indications, etc. The HCI is thus intended to enhance the pilot's situation awareness and decision-making in the cockpit. This work considers SATS as a large-scale distributed system operating with uncertainty in a dynamic environment; an approach based on multi-agent systems is therefore well suited for modeling such an environment. We demonstrate that current methodologies for designing multi-agent systems are a useful tool to characterize HCI, and we specifically illustrate how the selected methodological approach provides enough guidelines to obtain a cockpit HCI design that complies with future SATS specifications.
Duggleby, Wendy; Williams, Allison
2016-01-01
The purpose of this article is to discuss methodological and epistemological considerations involved in using qualitative inquiry to develop interventions. These considerations include (a) the use of diverse methodological approaches and (b) epistemological issues such as generalization, de-contextualization, and subjective reality. Diverse methodological approaches have the potential to inform different stages of intervention development. Using the development of a psychosocial hope intervention for advanced cancer patients as an example, the authors utilized a thematic study to assess current theories/frameworks and interventions. However, to understand the processes that the intervention needed to target to effect change, grounded theory was used. Epistemological considerations provided a framework to understand and, further, critique the intervention. Using diverse qualitative methodological approaches and examining epistemological considerations were useful in developing an intervention that appears to foster hope in patients with advanced cancer. © The Author(s) 2015.
A framework for assessing the adequacy and effectiveness of software development methodologies
NASA Technical Reports Server (NTRS)
Arthur, James D.; Nance, Richard E.
1990-01-01
Tools, techniques, environments, and methodologies dominate the software engineering literature, but relatively little research in the evaluation of methodologies is evident. This work reports an initial attempt to develop a procedural approach to evaluating software development methodologies. Prominent in this approach are: (1) an explication of the role of a methodology in the software development process; (2) the development of a procedure based on linkages among objectives, principles, and attributes; and (3) the establishment of a basis for reduction of the subjective nature of the evaluation through the introduction of properties. An application of the evaluation procedure to two Navy methodologies has provided consistent results that demonstrate the utility and versatility of the evaluation procedure. Current research efforts focus on the continued refinement of the evaluation procedure through the identification and integration of product quality indicators reflective of attribute presence, and the validation of metrics supporting the measure of those indicators. The consequent refinement of the evaluation procedure offers promise of a flexible approach that admits to change as the field of knowledge matures. In conclusion, the procedural approach presented in this paper represents a promising path toward the end goal of objectively evaluating software engineering methodologies.
Creativity Research: Implications for Teaching, Learning and Thinking.
ERIC Educational Resources Information Center
Petrowski, Mary Jane
2000-01-01
Explores reasons why creativity has only recently gained credibility as a legitimate research field and provides an overview of various disciplinary approaches and methodologies currently in use that are relevant to teaching and learning. Highlights include psychometrics; contextual approaches; experimental approaches; biographical, or…
Benchmarking for the Effective Use of Student Evaluation Data
ERIC Educational Resources Information Center
Smithson, John; Birks, Melanie; Harrison, Glenn; Nair, Chenicheri Sid; Hitchins, Marnie
2015-01-01
Purpose: The purpose of this paper is to examine current approaches to interpretation of student evaluation data and present an innovative approach to developing benchmark targets for the effective and efficient use of these data. Design/Methodology/Approach: This article discusses traditional approaches to gathering and using student feedback…
A More Flexible Approach to Valuing Flexibility
2011-04-01
remaining life of the program? Almost certainly. Next is the cost assessment step. This is executed in the context of whatever design options we...methodology is essentially a modification of the current life cycle model and is premised on the notion that the need for capability changes in a program...valuing the inherent ability of a system or design to accommodate change. The proposed methodology is essentially a modification of the current life
Prioritization Methodology for Chemical Replacement
NASA Technical Reports Server (NTRS)
Cruit, W.; Schutzenhofer, S.; Goldberg, B.; Everhart, K.
1993-01-01
This project serves to define an appropriate methodology for effective prioritization of efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semiquantitative approach derived from quality function deployment techniques (QFD Matrix). This methodology aims to weigh the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives. The results are being implemented as a guideline for consideration for current NASA propulsion systems.
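As a rough illustration of a QFD-style weighted prioritization matrix (not the NASA tool itself), the sketch below computes a weighted score per replacement-technology candidate; the criteria weights, candidate names, and ratings are invented.

```python
# Hedged sketch of a generic weighted-criteria (QFD-style) prioritization matrix.
# Criteria weights and 1/3/9-style ratings are illustrative only.
CRITERIA_WEIGHTS = {"environmental": 5, "cost": 3, "safety": 5, "reliability": 4, "programmatic": 2}

def priority_score(scores):
    """scores: dict mapping criterion -> rating (e.g., 1, 3, or 9 as in common QFD practice)."""
    return sum(CRITERIA_WEIGHTS[c] * scores.get(c, 0) for c in CRITERIA_WEIGHTS)

candidates = {
    "aqueous cleaner A": {"environmental": 9, "cost": 3, "safety": 9, "reliability": 3, "programmatic": 3},
    "solvent substitute B": {"environmental": 3, "cost": 9, "safety": 3, "reliability": 9, "programmatic": 9},
}
# rank candidates by descending weighted score
for name, scores in sorted(candidates.items(), key=lambda kv: -priority_score(kv[1])):
    print(name, priority_score(scores))
```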
ERIC Educational Resources Information Center
Bird, Anne Marie; Ross, Diane
1984-01-01
A brief history of research in sport psychology based on Lander's (1982) analysis is presented. A systematic approach to theory building is offered. Previous methodological inadequacies are identified using examples of observational learning and anxiety. (Author/DF)
Give Design a Chance: A Case for a Human Centered Approach to Operational Art
2017-03-30
strategy development and operational art. This demands fuller integration of the Army Design Methodology (ADM) and the Military Decision Making Process (MDMP). This monograph proposes a way of thinking and planning that goes beyond current Army doctrinal methodologies to address the changing...between conceptual and detailed planning. Subject terms: Design; Army Design Methodology (ADM); Human Centered; Strategy; Operational Art
Fuzzy Current-Mode Control and Stability Analysis
NASA Technical Reports Server (NTRS)
Kopasakis, George
2000-01-01
In this paper a current-mode control (CMC) methodology is developed for a buck converter by using a fuzzy logic controller. Conventional CMC methodologies are based on lead-lag compensation with voltage and inductor current feedback. In this paper the converter lead-lag compensation will be substituted with a fuzzy controller. A small-signal model of the fuzzy controller will also be developed in order to examine the stability properties of this buck converter control system. The paper develops an analytical approach, introducing fuzzy control into the area of CMC.
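A minimal sketch of the flavor of such a controller, not the one developed in the paper: a Mamdani-style fuzzy rule set that maps output-voltage error to an increment of the inductor-current command, with invented membership ranges and gains.

```python
# Hedged sketch: a tiny fuzzy controller for the outer voltage loop of a
# current-mode-controlled buck converter. Three triangular membership functions on
# the voltage error drive a centroid-style weighted average of rule outputs.
def tri(x, a, b, c):
    """Triangular membership with peak at b and support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_current_command_delta(v_error, max_delta=0.5):
    """v_error = v_ref - v_out (volts); returns the increment to the current command (amps)."""
    mu_neg = tri(v_error, -2.0, -1.0, 0.0)   # error negative -> output too high
    mu_zero = tri(v_error, -1.0, 0.0, 1.0)   # error near zero -> hold
    mu_pos = tri(v_error, 0.0, 1.0, 2.0)     # error positive -> output too low
    weights = [mu_neg, mu_zero, mu_pos]
    outputs = [-max_delta, 0.0, +max_delta]  # rule consequents: decrease, hold, increase
    total = sum(weights)
    return sum(w * o for w, o in zip(weights, outputs)) / total if total else 0.0

# example: output voltage 0.4 V below the reference -> raise the current command slightly
print(fuzzy_current_command_delta(0.4))
```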
Renaissance of protein crystallization and precipitation in biopharmaceuticals purification.
Dos Santos, Raquel; Carvalho, Ana Luísa; Roque, A Cecília A
The current chromatographic approaches used in protein purification are not keeping pace with the increasing biopharmaceutical market demand. With the upstream improvements, the bottleneck shifted towards the downstream process. New approaches rely on Anything But Chromatography methodologies and revisiting former techniques with a bioprocess perspective. Protein crystallization and precipitation methods are already implemented in the downstream process of diverse therapeutic biological macromolecules, overcoming the current chromatographic bottlenecks. Promising work is being developed in order to implement crystallization and precipitation in the purification pipeline of high value therapeutic molecules. This review focuses on the role of these two methodologies in current industrial purification processes, and highlights their potential implementation in the purification pipeline of high value therapeutic molecules, overcoming chromatographic holdups. Copyright © 2016 Elsevier Inc. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-05
... Numerical Simulations Risk Management Methodology November 1, 2010. I. Introduction On August 25, 2010, The... Analysis and Numerical Simulations ("STANS") risk management methodology. The rule change alters... collateral within the STANS Monte Carlo simulations. OCC believes the approach currently used to...
Online Tutor 2.0: Methodologies and Case Studies for Successful Learning
ERIC Educational Resources Information Center
García-Peñalvo, Francisco José, Ed.; Seoane-Pardo, Antonio Miguel, Ed.
2014-01-01
After centuries of rethinking education and learning, the current theory is based on technology's approach to, and effect on, the planned interaction between knowledge trainers and trainees. "Online Tutor 2.0: Methodologies and Case Studies for Successful Learning" demonstrates, through the exposure of successful cases in online education…
Language Education and ELT Materials in Turkey from the Path Dependence Perspective
ERIC Educational Resources Information Center
Isik, Ali
2011-01-01
This paper examines the influence of traditional language teaching methodology on current language teaching methodology in Turkey from the perspective of Path Dependence Theory. Path Dependence claims that the past continues shaping the present. Similarly, traditional approaches still shape foreign/second language education. Turkey has inherited a…
USDA-ARS?s Scientific Manuscript database
Current molecular methodologies, specifically DNA-based approaches, provide access to previously hidden soil biodiversity and are routinely employed in environmental studies of microbial ecology. Selection of cell lysis methodology is critical to community analyses due to the inability of any singul...
NASA Technical Reports Server (NTRS)
Cruit, Wendy; Schutzenhofer, Scott; Goldberg, Ben; Everhart, Kurt
1993-01-01
This project served to define an appropriate methodology for effective prioritization of technology efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semiquantitative approach derived from quality function deployment techniques (QFD Matrix). This methodology aims to weight the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives. The results will be implemented as a guideline for consideration for current NASA propulsion systems.
Jeong, Jeong-Won; Shin, Dae C; Do, Synho; Marmarelis, Vasilis Z
2006-08-01
This paper presents a novel segmentation methodology for automated classification and differentiation of soft tissues using multiband data obtained with the newly developed system of high-resolution ultrasonic transmission tomography (HUTT) for imaging biological organs. This methodology extends and combines two existing approaches: the L-level set active contour (AC) segmentation approach and the agglomerative hierarchical kappa-means approach for unsupervised clustering (UC). To prevent the trapping of the current iterative minimization AC algorithm in a local minimum, we introduce a multiresolution approach that applies the level set functions at successively increasing resolutions of the image data. The resulting AC clusters are subsequently rearranged by the UC algorithm that seeks the optimal set of clusters yielding the minimum within-cluster distances in the feature space. The presented results from Monte Carlo simulations and experimental animal-tissue data demonstrate that the proposed methodology outperforms other existing methods without depending on heuristic parameters and provides a reliable means for soft tissue differentiation in HUTT images.
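For illustration of the second stage only (regrouping an initial over-segmentation by clustering in feature space), here is a hedged sketch that merges active-contour segments via agglomerative (Ward) clustering of their mean multiband feature vectors; it is not the HUTT algorithm, and all names and shapes are assumptions.

```python
# Hedged sketch: regroup initial segments by agglomerative clustering on per-segment
# mean feature vectors (e.g., multiband intensities). Not the published HUTT pipeline.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def regroup_segments(label_map, feature_volume, n_clusters=4):
    """label_map: (H, W) int array of initial segment ids.
    feature_volume: (H, W, B) array of multiband features per pixel."""
    ids = np.unique(label_map)
    # one mean feature vector per initial segment
    centroids = np.stack([feature_volume[label_map == i].mean(axis=0) for i in ids])
    # agglomerative (Ward) merging in feature space down to n_clusters groups
    tree = linkage(centroids, method="ward")
    merged = fcluster(tree, t=n_clusters, criterion="maxclust")
    remap = dict(zip(ids, merged))
    return np.vectorize(remap.get)(label_map)

# tiny synthetic example: 4 initial segments regrouped into 2 tissue classes
labels = np.repeat(np.arange(4), 25).reshape(10, 10)
features = np.repeat(np.array([[0.1], [0.12], [0.9], [0.88]]), 25, axis=0).reshape(10, 10, 1)
print(np.unique(regroup_segments(labels, features, n_clusters=2)))
```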
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
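A hedged sketch of the underlying idea — propagating parameter uncertainty through an engineering failure model by Monte Carlo sampling to estimate a failure probability for one failure mode. The life model, distributions, and required-life value below are invented for illustration and are not the PFA software.

```python
# Hedged sketch: Monte Carlo propagation of parameter uncertainty through a simple
# S-N fatigue life model to estimate a failure probability. Illustrative numbers only.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# uncertain analysis parameters (illustrative distributions)
stress_amp = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=N)    # MPa
fatigue_coeff = rng.lognormal(mean=np.log(1.2e3), sigma=0.15, size=N) # MPa
fatigue_exp = rng.normal(-0.09, 0.01, size=N)                         # S-N exponent b

required_cycles = 1.0e6
# simple S-N life model: S = C * N_f^b  =>  N_f = (S / C)^(1 / b)
cycles_to_failure = (stress_amp / fatigue_coeff) ** (1.0 / fatigue_exp)
p_fail = np.mean(cycles_to_failure < required_cycles)
print(f"estimated failure probability for this mode: {p_fail:.4f}")
```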
Kushniruk, Andre W; Borycki, Elizabeth M
2015-01-01
The development of more usable and effective healthcare information systems has become a critical issue. In the software industry, methodologies such as agile and iterative development processes have emerged to produce more effective and usable systems. These approaches focus on user needs and promote iterative and flexible development practices. Evaluation and testing of iterative agile development cycles is considered an important part of the agile methodology and of iterative processes for system design and re-design. However, the question of how to effectively integrate usability testing methods into rapid and flexible agile design cycles has yet to be fully explored. In this paper we describe our application of an approach known as low-cost rapid usability testing as it has been applied within agile system development in healthcare. The advantages of the integrative approach are described, along with current methodological considerations.
Three Approaches to Environmental Resources Analysis.
ERIC Educational Resources Information Center
Harvard Univ., Cambridge, MA. Graduate School of Design.
This booklet, the first of a projected series related to the development of methodologies and techniques for environmental planning and design, examines three approaches that are currently being used to identify, analyze, and evaluate the natural and man-made resources that comprise the physical environment. One approach by G. Angus Hills uses a…
The colloquial approach: An active learning technique
NASA Astrophysics Data System (ADS)
Arce, Pedro
1994-09-01
This paper addresses the very important problem of the effectiveness of teaching methodologies in fundamental engineering courses such as transport phenomena. An active learning strategy, termed the colloquial approach, is proposed in order to increase student involvement in the learning process. This methodology is a considerable departure from traditional methods that use solo lecturing. It is based on guided discussions, and it promotes student understanding of new concepts by directing the student to construct new ideas by building upon the current knowledge and by focusing on key cases that capture the essential aspects of new concepts. The colloquial approach motivates the student to participate in discussions, to develop detailed notes, and to design (or construct) his or her own explanation for a given problem. This paper discusses the main features of the colloquial approach within the framework of other current and previous techniques. Problem-solving strategies and the need for new textbooks and for future investigations based on the colloquial approach are also outlined.
Web-Based Collaborative Writing in L2 Contexts: Methodological Insights from Text Mining
ERIC Educational Resources Information Center
Yim, Soobin; Warschauer, Mark
2017-01-01
The increasingly widespread use of social software (e.g., Wikis, Google Docs) in second language (L2) settings has brought a renewed attention to collaborative writing. Although the current methodological approaches to examining collaborative writing are valuable to understand L2 students' interactional patterns or perceived experiences, they can…
Decoding the Disciplines: An Approach to Scientific Thinking
ERIC Educational Resources Information Center
Pinnow, Eleni
2016-01-01
The Decoding the Disciplines methodology aims to teach students to think like experts in discipline-specific tasks. The central aspect of the methodology is to identify a bottleneck in the course content: a particular topic that a substantial number of students struggle to master. The current study compared the efficacy of standard lecture and…
Integration of infrared thermography into various maintenance methodologies
NASA Astrophysics Data System (ADS)
Morgan, William T.
1993-04-01
Maintenance methodologies are in developmental stages throughout the world as global competitiveness drives all industries to improve operational efficiencies. Rapid progress in technical advancements has added an additional strain on maintenance organizations to change progressively. Accompanying the needs for advanced training and documentation is the demand for the utilization of various analytical instruments and quantitative methods. Infrared thermography is one of the primary elements of engineered approaches to maintenance. Current maintenance methodologies can be divided into six categories: Routine ('Breakdown'), Preventive, Predictive, Proactive, Reliability-Based, and Total Productive (TPM) maintenance. Each of these methodologies has a distinctive approach to achieving improved operational efficiencies. A popular thought is that infrared thermography is a Predictive maintenance tool. While this is true, it is also true that it can be effectively integrated into each of the maintenance methodologies to achieve desired results. The six maintenance strategies will be defined, and infrared applications integrated into each will be presented in tabular form.
Distance Education in China: The Current State of e-Learning
ERIC Educational Resources Information Center
Chen, Li; Chen, Huina; Wang, Nan
2009-01-01
Purpose: The purpose of this paper is to identify the current trends in and future prospects for distance education in primary, secondary, and higher education in China. Design/methodology/approach: This article reviews relevant literature and cases, and explores the current situation of distance education in China. Findings: The use of…
Effectiveness of Social Media for Communicating Health Messages in Ghana
ERIC Educational Resources Information Center
Bannor, Richard; Asare, Anthony Kwame; Bawole, Justice Nyigmah
2017-01-01
Purpose: The purpose of this paper is to develop an in-depth understanding of the effectiveness, evolution and dynamism of the current health communication media used in Ghana. Design/methodology/approach: This paper uses a multi-method approach which utilizes a combination of qualitative and quantitative approaches. In-depth interviews are…
Organizational Approach to the Ergonomic Examination of E-Learning Modules
ERIC Educational Resources Information Center
Lavrov, Evgeniy; Kupenko, Olena; Lavryk, Tetiana; Barchenko, Natalia
2013-01-01
With a significant increase in the number of e-learning resources the issue of quality is of current importance. An analysis of existing scientific and methodological literature shows the variety of approaches, methods and tools to evaluate e-learning materials. This paper proposes an approach based on the procedure for estimating parameters of…
How Organisations Learn from Safety Incidents: A Multifaceted Problem
ERIC Educational Resources Information Center
Lukic, Dane; Margaryan, Anoush; Littlejohn, Allison
2010-01-01
Purpose: This paper seeks to review current approaches to learning from health and safety incidents in the workplace. The aim of the paper is to identify the diversity of approaches and analyse them in terms of learning aspects. Design/methodology/approach: A literature review was conducted searching for terms incident/accident/near…
Working with words: exploring textual analysis in medical education research.
Park, Sophie; Griffin, Ann; Gill, Deborah
2012-04-01
Text is familiar to us all. This paper offers an introduction to, and an exploration of, the range of methodological possibilities open to the education researcher who has chosen to use text as a research data source. It encourages a purposeful deliberation of the different textual sources available as data, the range of methodological approaches possible and the types of interpretation that can be adopted when embarking on an empirical study using textual data. Approaches to interpreting text are varied and utilise a range of analytical and interpretative strategies. To illustrate the theoretical points raised within this paper, two contrasting methods were applied to the same text. Tag cloud analysis and performative narrative analysis (PNA) were employed to analyse Chapter 4 of the UK government's 2010 White Paper Equity and excellence: Liberating the NHS. The adoption of these contrasting methodologies, which are not currently used extensively in medical education research, revealed that some common issues were identified by both tag clouds and PNA, but, in addition, each approach was able to unveil something unique about the text. These two methods highlight the range of affordances, or possibilities, that the choice of textual analysis approach will have on the results. We suggest that medical education researchers should be encouraged to move away from the current dominant and privileged methodologies that seek to provide answers and explore other methods and approaches to textual data that encourage us to question and reflect more deeply. © Blackwell Publishing Ltd 2012.
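As a hedged sketch of what a basic tag-cloud frequency analysis involves (not the authors' exact procedure), the snippet below counts content-word frequencies so that the most frequent terms can be scaled proportionally in a rendered cloud; the stopword list, scaling, and example string are illustrative.

```python
# Hedged sketch of a basic tag-cloud style frequency analysis of a text.
import re
from collections import Counter

STOPWORDS = {"the", "and", "of", "to", "a", "in", "for", "that", "is", "will", "be"}

def tag_weights(text, top_n=30):
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS and len(w) > 2)
    top = counts.most_common(top_n)
    max_count = top[0][1] if top else 1
    # weight in (0, 1] drives the relative font size of each term in the rendered cloud
    return {word: count / max_count for word, count in top}

# example usage on a placeholder string standing in for the chapter text
print(tag_weights("patients will choose their provider and patients will hold providers to account"))
```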
A Review of Self-Report and Alternative Approaches in the Measurement of Student Motivation
ERIC Educational Resources Information Center
Fulmer, Sara M.; Frijters, Jan C.
2009-01-01
Within psychological and educational research, self-report methodology dominates the study of student motivation. The present review argues that the scope of motivation research can be expanded by incorporating a wider range of methodologies and measurement tools. Several authors have suggested that current study of motivation is overly reliant on…
ERIC Educational Resources Information Center
Fernandes, Joana; Costa, Rute; Peres, Paula
2016-01-01
This paper aims at discussing the advantages of a methodology design grounded on a concept-based approach to Terminology applied to the most prominent scenario of current Higher Education: "blended learning." Terminology is a discipline that aims at representing, describing and defining specialized knowledge through language, putting…
Reflexivity as Methodology: An Approach to the Necessarily Political Work of Senior Groups
ERIC Educational Resources Information Center
Warwick, Robert; Board, Douglas
2012-01-01
Research into senior groups and their political nature has serious gaps. We claim that participants in the process are best placed to be both researchers and, with others, the subject of research. Here we illustrate the shortcomings of current methodologies, such as action research, due to the spatial separation and detemporalisation between what…
Seniors Falls Investigative Methodology (SFIM): A Systems Approach to the Study of Falls in Seniors
ERIC Educational Resources Information Center
Zecevic, Aleksandra A.; Salmoni, Alan W.; Lewko, John H.; Vandervoort, Anthony A.
2007-01-01
An in-depth understanding of human factors and human error is lacking in current research on seniors' falls. Additional knowledge is needed to understand why seniors are falling. The purpose of this article is to describe the adapting of the Integrated Safety Investigation Methodology (ISIM) (used for investigating transportation and industrial…
The need for a comprehensive expert system development methodology
NASA Technical Reports Server (NTRS)
Baumert, John; Critchfield, Anna; Leavitt, Karen
1988-01-01
In a traditional software development environment, the introduction of standardized approaches has led to higher-quality, maintainable products on the technical side and greater visibility into the status of the effort on the management side. This study examined expert system development to determine whether it differed enough from traditional systems to warrant a reevaluation of current software development methodologies. Its purpose was to identify areas of similarity with traditional software development and areas requiring tailoring to the unique needs of expert systems. A second purpose was to determine whether existing expert system development methodologies meet the needs of expert system development, management, and maintenance personnel. The study consisted of a literature search and personal interviews. It was determined that existing methodologies and approaches to developing expert systems are neither comprehensive nor easily applied, especially for cradle-to-grave system development. As a result, requirements were derived for an expert system development methodology, and an initial annotated outline was derived for such a methodology.
Decision Support for Renewal of Wastewater Collection and Water Distribution Systems
The objective of this study was to identify the current decision support methodologies, models and approaches being used for determining how to rehabilitate or replace underground utilities; identify the critical gaps of these current models through comparison with case history d...
Leighton, Angela; Weinborn, Michael; Maybery, Murray
2014-10-01
Bigler (2012) and Larrabee (2012) recently addressed the state of the science surrounding performance validity tests (PVTs) in a dialogue highlighting evidence for the valid and increased use of PVTs, but also for unresolved problems. Specifically, Bigler criticized the lack of guidance from neurocognitive processing theory in the PVT literature. For example, individual PVTs have applied the simultaneous forced-choice methodology using a variety of test characteristics (e.g., word vs. picture stimuli) with known neurocognitive processing implications (e.g., the "picture superiority effect"). However, the influence of such variations on classification accuracy has been inadequately evaluated, particularly among cognitively impaired individuals. The current review places the PVT literature in the context of neurocognitive processing theory, and identifies potential methodological factors to account for the significant variability we identified in classification accuracy across current PVTs. We subsequently evaluated the utility of a well-known cognitive manipulation to provide a Clinical Analogue Methodology (CAM), that is, to alter the PVT performance of healthy individuals to be similar to that of a cognitively impaired group. Initial support was found, suggesting the CAM may be useful alongside other approaches (analogue malingering methodology) for the systematic evaluation of PVTs, particularly the influence of specific neurocognitive processing components on performance.
Towards an Airframe Noise Prediction Methodology: Survey of Current Approaches
NASA Technical Reports Server (NTRS)
Farassat, Fereidoun; Casper, Jay H.
2006-01-01
In this paper, we present a critical survey of the current airframe noise (AFN) prediction methodologies. Four methodologies are recognized: the fully analytic method, CFD combined with the acoustic analogy, the semi-empirical method, and the fully numerical method. It is argued that for the immediate needs of the aircraft industry, the semi-empirical method based on a recent high-quality acoustic database is the best available method. The method based on CFD and the Ffowcs Williams-Hawkings (FW-H) equation with a penetrable data surface (FW-Hpds) has advanced considerably, and much experience has been gained in its use. However, more research is needed in the near future, particularly in the area of turbulence simulation. The fully numerical method will take longer to reach maturity. Based on current trends, it is predicted that this method will eventually develop into the method of choice. Both the turbulence simulation and propagation methods need to develop further for this method to become useful. Nonetheless, the authors propose that methods based on a combination of numerical and analytical techniques, e.g., CFD combined with the FW-H equation, should also be pursued. In this effort, current symbolic algebra software will allow more analytical approaches to be incorporated into AFN prediction methods.
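For reference, one common statement of the permeable-data-surface FW-H equation, written here as a sketch of the standard form rather than the specific formulation used in the paper:

$$\left(\frac{1}{c_0^{2}}\frac{\partial^{2}}{\partial t^{2}}-\nabla^{2}\right)\left[p'(\mathbf{x},t)\,H(f)\right]=\frac{\partial}{\partial t}\left[\rho_{0}U_{n}\,\delta(f)\right]-\frac{\partial}{\partial x_{i}}\left[L_{i}\,\delta(f)\right]+\frac{\partial^{2}}{\partial x_{i}\partial x_{j}}\left[T_{ij}\,H(f)\right],$$

where $f=0$ defines the penetrable data surface, $H$ and $\delta$ are the Heaviside and Dirac functions, $U_{n}$ and $L_{i}$ collect the surface mass-flux and loading terms, and $T_{ij}$ is the Lighthill stress tensor.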
The approaches to the didactics of physics in the Czech Republic - Historical development
NASA Astrophysics Data System (ADS)
Žák, Vojtěch
2017-01-01
The aim of this paper is to describe the approaches to the didactics of physics that have appeared in the Czech Republic over the course of its development and to discuss mainly their relationships with other fields. This is potentially beneficial to understanding the current situation of Czech didactics of physics and to forecasting its future development. The main part of the article describes the particular approaches of Czech didactics of physics, namely the methodological, application, integration and communication approaches, in chronological order. Special attention is paid to the relationships between the didactics of physics and physics itself, pedagogy and other fields. The methodological approach is closely connected to physics, while the application approach stems essentially from pedagogy. The integration approach seeks to utilize other scientific fields to develop the didactics of physics. The communication approach proved to be the most elaborate; it is among the concepts that have most strongly influenced current didactic thinking in the Czech Republic, including in other fields (such as the didactics of socio-humanistic disciplines). Despite the importance of the communication approach, the other approaches are, to a certain extent, employed as well and co-exist.
Recovery and purification process development for monoclonal antibody production
Ma, Junfen; Winter, Charles; Bayer, Robert
2010-01-01
Hundreds of therapeutic monoclonal antibodies (mAbs) are currently in development, and many companies have multiple antibodies in their pipelines. Current methodologies used in recovery processes for these molecules are reviewed here. Basic unit operations such as harvest, Protein A affinity chromatography and additional polishing steps are surveyed. Alternative processes such as flocculation, precipitation and membrane chromatography are discussed. We also cover platform approaches to purification methods development, use of high throughput screening methods, and offer a view on future developments in purification methodology as applied to mAbs. PMID:20647768
Uncertainty factors in screening ecological risk assessments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duke, L.D.; Taggart, M.
2000-06-01
The hazard quotient (HQ) method is commonly used in screening ecological risk assessments (ERAs) to estimate risk to wildlife at contaminated sites. Many ERAs use uncertainty factors (UFs) in the HQ calculation to incorporate uncertainty associated with predicting wildlife responses to contaminant exposure using laboratory toxicity data. The overall objective was to evaluate the current UF methodology as applied to screening ERAs in California, USA. Specific objectives included characterizing current UF methodology, evaluating the degree of conservatism in UFs as applied, and identifying limitations to the current approach. Twenty-four of 29 evaluated ERAs used the HQ approach: 23 of these used UFs in the HQ calculation. All 24 made interspecies extrapolations, and 21 compensated for its uncertainty, most using allometric adjustments and some using RFs. Most also incorporated uncertainty for same-species extrapolations. Twenty-one ERAs used UFs extrapolating from lowest observed adverse effect level (LOAEL) to no observed adverse effect level (NOAEL), and 18 used UFs extrapolating from subchronic to chronic exposure. Values and application of all UF types were inconsistent. Maximum cumulative UFs ranged from 10 to 3,000. Results suggest UF methodology is widely used but inconsistently applied and is not uniformly conservative relative to UFs recommended in regulatory guidelines and academic literature. The method is limited by lack of consensus among scientists, regulators, and practitioners about magnitudes, types, and conceptual underpinnings of the UF methodology.
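To make the calculation concrete, here is a hedged, generic sketch of a screening-level hazard quotient with uncertainty factors; the UF types and numerical values are illustrative and do not reproduce the specific California ERA procedures reviewed above.

```python
# Hedged sketch of a generic screening-level hazard quotient with uncertainty factors.
def hazard_quotient(exposure_dose, test_species_noael,
                    uf_interspecies=10.0, uf_loael_to_noael=1.0, uf_subchronic_to_chronic=1.0):
    """exposure_dose and test_species_noael in mg/kg-day.
    The toxicity reference value (TRV) is the test-species NOAEL divided by the
    product of the uncertainty factors; HQ >= 1 flags potential risk."""
    trv = test_species_noael / (uf_interspecies * uf_loael_to_noael * uf_subchronic_to_chronic)
    return exposure_dose / trv

# example: LOAEL-based, subchronic study extrapolated to a different species
hq = hazard_quotient(0.5, 25.0, uf_interspecies=10.0, uf_loael_to_noael=10.0,
                     uf_subchronic_to_chronic=10.0)
print(f"HQ = {hq:.1f}")  # 0.5 / (25 / 1000) = 20 -> exceeds the screening threshold of 1
```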
Cappon, Davide; Jahanshahi, Marjan; Bisiacchi, Patrizia
2016-01-01
Non-invasive brain stimulation techniques, including transcranial direct current stimulation (t-DCS), have been used in the rehabilitation of cognitive function in a spectrum of neurological disorders. The present review outlines methodological commonalities and differences of t-DCS procedures in neurocognitive rehabilitation. We consider the efficacy of tDCS for the management of specific cognitive deficits in four main neurological disorders by providing a critical analysis of recent studies that have used t-DCS to improve cognition in patients with Parkinson's Disease, Alzheimer's Disease, Hemi-spatial Neglect, and Aphasia. The evidence from this innovative approach to cognitive rehabilitation suggests that tDCS can influence cognition. However, the results show a high variability between studies both in terms of the methodological approach adopted and the cognitive functions targeted. The review also focuses both on methodological issues such as technical aspects of the stimulation (electrode position and dimension; current intensity; duration of protocol) and on the inclusion of appropriate assessment tools for cognition. A further aspect considered is the optimal timing for administration of tDCS: before, during or after cognitive rehabilitation. We conclude that more studies using common methodology are needed to gain a better understanding of the efficacy of tDCS as a new tool for rehabilitation of cognitive disorders in a range of neurological disorders. PMID:27147949
Kant on historiography and the use of regulative ideas.
Kleingeld, Pauline
2008-12-01
In this paper, I examine Kant's methodological remarks in the 'Idea for a universal history' against the background of the Critique of pure reason. I argue that Kant's approach to the function of regulative ideas of human history as a whole may still be fruitful. This approach allows for regulative ideas that are grand in scope, but modest and fallibilistic in their epistemic status. Kant's methodological analysis should be distinguished from the specific teleological model of history he developed on its basis, however, because this model can no longer be appropriated for current purposes.
Hospitality Studies: Escaping the Tyranny?
ERIC Educational Resources Information Center
Lashley, Conrad
2015-01-01
Purpose: The purpose of this paper is to explore current strands in hospitality management education and research, and suggest that future programs should reflect a more social science informed content. Design/methodology/approach: The paper reviews current research in hospitality management education and in the study of hospitality and…
Team Learning on the Edge of Chaos
ERIC Educational Resources Information Center
Fisser, Sandra; Browaeys, Marie-Joelle
2010-01-01
Purpose: Organizations as complex networks aim to survive. The purpose of this paper is to provide an alternative perspective to current organizational challenges by considering team learning as key factor for surviving this turbulent environment. Design/methodology/approach: The dominating approach in this paper comes from the complexity…
Emerging technologies for the changing global market
NASA Technical Reports Server (NTRS)
Cruit, Wendy; Schutzenhofer, Scott; Goldberg, Ben; Everhart, Kurt
1993-01-01
This project served to define an appropriate methodology for effective prioritization of technology efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semi-quantitative approach derived from quality function deployment techniques (QFD Matrix). This methodology aims to weight the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives. The results will be implemented as a guideline for consideration for current NASA propulsion systems.
ERIC Educational Resources Information Center
Shilling, Chris
2010-01-01
In this article I identify how developments in consumer culture, waged-work and health policy have informed our current interest in the body, before suggesting that Durkheim's and Mauss's methodological approach towards the external and internal dimensions of "social facts" provides us with a valuable basis on which we can analyse the…
Biomimetics in the design of a robotic exoskeleton for upper limb therapy
NASA Astrophysics Data System (ADS)
Baniqued, Paul Dominick E.; Dungao, Jade R.; Manguerra, Michael V.; Baldovino, Renann G.; Abad, Alexander C.; Bugtai, Nilo T.
2018-02-01
Current methodologies for designing robotic exoskeletons for upper limb therapy simplify the complex requirements of the human anatomy. As a result, such devices tend to compromise safety and biocompatibility with the intended user. However, a new design methodology uses biological analogues as inspiration to address these technical issues. This approach follows that of biomimetics, a design principle that uses the extraction and transfer of useful information from natural morphologies and processes to solve technical design issues. In this study, a biomimetic approach to the design of a 5-degree-of-freedom robotic exoskeleton for upper limb therapy was performed. Biomimetics is first briefly reviewed, along with its current contribution to the design of rehabilitation robots. Using a proposed methodological framework, the design of an upper limb robotic exoskeleton was generated in CATIA software, inspired by the morphology of the bones and the muscle force transmission of the upper limbs. Finally, the full design assembly presented integrates features extracted from the biological analogue. The successful execution of a biomimetic design methodology makes a case for providing safer and more biocompatible robots for rehabilitation.
[Organization of monitoring of electromagnetic radiation in the urban environment].
Savel'ev, S I; Dvoeglazova, S V; Koz'min, V A; Kochkin, D E; Begishev, M R
2008-01-01
The authors describe new current approaches to monitoring the environment, including sources of electromagnetic radiation and noise. Electronic maps of the area under study are constructed by plotting isolines or the actual distribution of the levels of the controlled factors. These current approaches to electromagnetic and acoustic monitoring make it possible to automate the measurement process, to analyze the prevailing situation, and to simplify the risk-control methodology.
Algorithm for evaluating the effectiveness of a high-rise development project based on current yield
NASA Astrophysics Data System (ADS)
Soboleva, Elena
2018-03-01
The article addresses the operational evaluation of development project efficiency in high-rise construction under the current economic conditions in Russia. The author touches on the following issues: problems of implementing development projects, the influence of the quality of operational evaluation of high-rise construction projects on overall efficiency, the influence of the project's external environment on the effectiveness of project activities under crisis conditions, and the quality of project management. The article proposes an algorithm and a methodological approach to quality management of developer project efficiency based on operational evaluation of the current yield. The methodology for calculating the current efficiency of a high-rise construction development project has been updated.
From databases to modelling of functional pathways.
Nasi, Sergio
2004-01-01
This short review comments on current informatics resources and methodologies in the study of functional pathways in cell biology. It highlights recent achievements in unveiling the structural design of protein and gene networks and discusses current approaches to model and simulate the dynamics of regulatory pathways in the cell.
The Current and Future Role of Business Schools
ERIC Educational Resources Information Center
Rayment, John; Smith, Jonathan
2013-01-01
Purpose: Since the global financial crisis, there has been considerable debate concerning the role of business schools. This article aims to outline the authors' research on their role. Design/methodology/approach: The paper begins with an overview of the significant literature highlighting the current debates impacting on business schools and…
ERIC Educational Resources Information Center
Abou-Warda, Sherein Hamed
2016-01-01
Purpose: The overall objective of the current study is to explore how universities can better develop new educational services. The purpose of this paper is to develop a framework for technology entrepreneurship education (TEPE) within universities. Design/Methodology/Approach: Qualitative and quantitative research approaches were employed. This…
A Loud Silence: Working with Research-Based Theatre and A/R/Tography
ERIC Educational Resources Information Center
Lea, Graham W.; Belliveau, George; Wager, Amanda; Beck, Jaime L.
2011-01-01
Arts-based approaches to research have emerged as an integral component of current scholarship in the social sciences, education, health research, and humanities. Integrating arts-based methods and methodologies with research generates possibilities for fresh approaches for creating, translating, and exchanging knowledge (Barone & Eisner, 1997;…
Comparing Pedagogies for Plastic Waste Management at University Level
ERIC Educational Resources Information Center
Yeung, Siu-Kit; So, Wing-Mui Winnie; Cheng, Nga-Yee Irene; Cheung, Tsz-Yan; Chow, Cheuk-Fai
2017-01-01
Purpose: This paper aims to compare the learning outcomes of gaming simulation and guided inquiry in sustainability education on plastic waste management. The current study targets the identification of success factors in these teaching approaches. Design/methodology/approach: This study used a quasi-experimental design with undergraduate…
Prasse, Carsten; Stalter, Daniel; Schulte-Oehlmann, Ulrike; Oehlmann, Jörg; Ternes, Thomas A
2015-12-15
The knowledge we have gained in recent years on the presence and effects of compounds discharged by wastewater treatment plants (WWTPs) brings us to a point where we must question the appropriateness of current water quality evaluation methodologies. An increasing number of anthropogenic chemicals is detected in treated wastewater and there is increasing evidence of adverse environmental effects related to WWTP discharges. It has thus become clear that new strategies are needed to assess overall quality of conventional and advanced treated wastewaters. There is an urgent need for multidisciplinary approaches combining expertise from engineering, analytical and environmental chemistry, (eco)toxicology, and microbiology. This review summarizes the current approaches used to assess treated wastewater quality from the chemical and ecotoxicological perspective. Discussed chemical approaches include target, non-target and suspect analysis, sum parameters, identification and monitoring of transformation products, computational modeling as well as effect directed analysis and toxicity identification evaluation. The discussed ecotoxicological methodologies encompass in vitro testing (cytotoxicity, genotoxicity, mutagenicity, endocrine disruption, adaptive stress response activation, toxicogenomics) and in vivo tests (single and multi species, biomonitoring). We critically discuss the benefits and limitations of the different methodologies reviewed. Additionally, we provide an overview of the current state of research regarding the chemical and ecotoxicological evaluation of conventional as well as the most widely used advanced wastewater treatment technologies, i.e., ozonation, advanced oxidation processes, chlorination, activated carbon, and membrane filtration. In particular, possible directions for future research activities in this area are provided. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Arnold, Steven M.; Goldberg, Robert K.; Lerch, Bradley A.; Saleeb, Atef F.
2009-01-01
Herein a general, multimechanism, physics-based viscoelastoplastic model is presented in the context of an integrated diagnosis and prognosis methodology which is proposed for structural health monitoring, with particular applicability to gas turbine engine structures. In this methodology, diagnostics and prognostics will be linked through state awareness variable(s). Key technologies which comprise the proposed integrated approach include (1) diagnostic/detection methodology, (2) prognosis/lifing methodology, (3) diagnostic/prognosis linkage, (4) experimental validation, and (5) material data information management system. A specific prognosis lifing methodology, experimental characterization and validation and data information management are the focal point of current activities being pursued within this integrated approach. The prognostic lifing methodology is based on an advanced multimechanism viscoelastoplastic model which accounts for both stiffness and/or strength reduction damage variables. Methods to characterize both the reversible and irreversible portions of the model are discussed. Once the multiscale model is validated the intent is to link it to appropriate diagnostic methods to provide a full-featured structural health monitoring system.
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
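As a rough illustration of how an analytical failure model can be combined with parameter uncertainty to produce a failure-probability estimate in the spirit of PFA, the sketch below propagates assumed lognormal uncertainties through a simple power-law fatigue-life model. The model form, parameter values and mission life are illustrative assumptions, not the engineering models documented in the report.

```python
# Illustrative Monte Carlo failure-risk sketch in the spirit of PFA:
# propagate parameter/model uncertainty through an analytical life model.
# The power-law life model and all numbers are assumptions for illustration.
import math
import random

random.seed(1)
N_SAMPLES = 100_000
MISSION_CYCLES = 500            # assumed required service life (cycles)

failures = 0
for _ in range(N_SAMPLES):
    # Uncertain inputs: stress amplitude and model accuracy factor (assumed lognormal)
    stress = random.lognormvariate(math.log(300.0), 0.10)   # MPa
    model_error = random.lognormvariate(0.0, 0.20)          # multiplicative accuracy factor
    # Simple Basquin-type life model N = C * S**-m (C and m assumed)
    cycles_to_failure = model_error * 3e10 * stress ** -3.0
    if cycles_to_failure < MISSION_CYCLES:
        failures += 1

print(f"Estimated failure probability: {failures / N_SAMPLES:.4f}")
```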
Agile rediscovering values: Similarities to continuous improvement strategies
NASA Astrophysics Data System (ADS)
Díaz de Mera, P.; Arenas, J. M.; González, C.
2012-04-01
Research in the late 1980s on technological companies that develop innovative, high-value products with sufficient speed and flexibility to adapt quickly to changing market conditions gave rise to the new set of methodologies known as the Agile Management Approach. In the current changing economic scenario, we considered it very interesting to study the similarities of these Agile Methodologies with other practices whose effectiveness has been amply demonstrated in both the West and Japan. Strategies such as Kaizen, Lean, World Class Manufacturing, Concurrent Engineering, etc., are analyzed to check the values they have in common with the Agile Approach.
ERIC Educational Resources Information Center
Clements, Douglas H.; Agodini, Roberto; Harris, Barbara
2013-01-01
In this Appendix, we provide details about the data used for the current study, the curricula used in the classrooms from which data were collected, and the current study's methodological approach. (Contains 14 tables and 5 footnotes.) [For full report, see ED544189.]
ERIC Educational Resources Information Center
Wright, Hazel A.; Ironside, Joseph E.; Gwynn-Jones, Dylan
2008-01-01
Purpose: This study aims to identify the current barriers to sustainability in the bioscience laboratory setting and to determine which mechanisms are likely to increase sustainable behaviours in this specialised environment. Design/methodology/approach: The study gathers qualitative data from a sample of laboratory researchers presently…
Sources, Developments and Directions of Task-Based Language Teaching
ERIC Educational Resources Information Center
Bygate, Martin
2016-01-01
This paper provides an outline of the origins, the current shape and the potential directions of task-based language teaching (TBLT) as an approach to language pedagogy. It first offers a brief description of TBLT and considers its origins within language teaching methodology and second language acquisition. It then summarises the current position…
Recognising Current Competencies of Volunteers in Emergency Service Organisations
ERIC Educational Resources Information Center
Catts, Ralph; Chamings, Dave
2006-01-01
Purpose: The paper seeks to show that the relationship between organisational structure and flexibility of training has not been well researched. Focusing on the role of recognition of current competencies, this study provides evidence of the effects of the former on the latter. Design/methodology/approach: In this paper evidence was obtained by…
An Exploratory Study of Sustainable Development at Italian Universities
ERIC Educational Resources Information Center
Vagnoni, Emidia; Cavicchi, Caterina
2015-01-01
Purpose: This paper aims to outline the current status of the implementation of sustainability practices in the context of Italian public universities, highlighting the strengths and gaps. Design/methodology/approach: Based on a qualitative approach, an exploratory study design has been outlined using the model of Glavic and Lukman (2007) focusing…
Quality Concerns in Technical Education in India: A Quantifiable Quality Enabled Model
ERIC Educational Resources Information Center
Gambhir, Victor; Wadhwa, N. C.; Grover, Sandeep
2016-01-01
Purpose: The paper aims to discuss current Technical Education scenarios in India. It proposes modelling the factors affecting quality in a technical institute and then applying a suitable technique for assessment, comparison and ranking. Design/methodology/approach: The paper chose a graph-theoretic approach for quantification of quality-enabled…
The Behavioural Approach in Schools: A Time for Caution Revisited
ERIC Educational Resources Information Center
Harrop, Alex; Swinson, Jeremy
2007-01-01
This paper takes as its starting point an examination of the current status of some of the concerns that were raised in the mid-1980s about methodological problems faced by educational researchers using the behavioural approach in schools. These concerns included the measurement of agreement between observers, the interpretation of raw data…
Towards Dynamic and Interdisciplinary Frameworks for School-Based Mental Health Promotion
ERIC Educational Resources Information Center
O'Toole, Catriona
2017-01-01
Purpose: The purpose of this paper is to scrutinise two ostensibly disparate approaches to school-based mental health promotion and offer a conceptual foundation for considering possible synergies between them. Design/methodology/approach: The paper examines current conceptualisations of child and youth mental health and explores how these inform…
Benchmarking in the National Intellectual Capital Measurement: Is It the Best Available Approach?
ERIC Educational Resources Information Center
Januškaite, Virginija; Užiene, Lina
2016-01-01
Sustainable economic development is an aspiration of every nation in today's knowledge economy. For a few decades, scientists have claimed that intellectual capital management is the answer to how to reach this goal. Currently, benchmarking methodology is the most common approach in national intellectual capital measurement, intended to provide…
Multi-Level Alignment Model: Transforming Face-to-Face into E-Instructional Programs
ERIC Educational Resources Information Center
Byers, Celina
2005-01-01
Purpose--To suggest to others in the field an approach equally valid for transforming existing courses into online courses and for creating new online courses. Design/methodology/approach--Using the literature for substantiation, this article discusses the current rapid change within organizations, the role of technology in that change, and the…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-21
...This notice addresses the methodology used by the Department of Commerce ("the Department") to value the cost of labor in non-market economy ("NME") countries. After reviewing all comments received on the Department's interim, industry-specific wage calculation methodology that is currently applied in NME antidumping proceedings, the Department has determined that the single surrogate-country approach is best. In addition, the Department has decided to use International Labor Organization ("ILO") Yearbook Chapter 6A as its primary source of labor cost data in NME antidumping proceedings.
An omnibus test for family-based association studies with multiple SNPs and multiple phenotypes.
Lasky-Su, Jessica; Murphy, Amy; McQueen, Matthew B; Weiss, Scott; Lange, Christoph
2010-06-01
We propose an omnibus family-based association test (MFBAT) that can be applied to multiple markers and multiple phenotypes and that has only one degree of freedom. The proposed test statistic extends current FBAT methodology to incorporate multiple markers as well as multiple phenotypes. Using simulation studies, power estimates for the proposed methodology are compared with the standard methodologies. On the basis of these simulations, we find that MFBAT substantially outperforms other methods, including haplotypic approaches and doing multiple tests with single single-nucleotide polymorphisms (SNPs) and single phenotypes. The practical relevance of the approach is illustrated by an application to asthma in which SNP/phenotype combinations are identified and reach overall significance that would not have been identified using other approaches. This methodology is directly applicable to cases in which there are multiple SNPs, such as candidate gene studies, cases in which there are multiple phenotypes, such as expression data, and cases in which there are multiple phenotypes and genotypes, such as genome-wide association studies that incorporate expression profiles as phenotypes. This program is available in the PBAT analysis package.
Zarit, Steven H.; Liu, Yin; Bangerter, Lauren R.; Rovine, Michael J.
2017-01-01
Objectives: There is growing emphasis on empirical validation of the efficacy of community-based services for older people and their families, but research on services such as respite care faces methodological challenges that have limited the growth of outcome studies. We identify problems associated with the usual research approaches for studying respite care, with the goal of stimulating use of novel and more appropriate research designs that can lead to improved studies of community-based services. Method: Using the concept of research validity, we evaluate the methodological approaches in the current literature on respite services, including adult day services, in-home respite and overnight respite. Results: Although randomized controlled trials (RCTs) are possible in community settings, validity is compromised by practical limitations of randomization and other problems. Quasi-experimental and interrupted time series designs offer comparable validity to RCTs and can be implemented effectively in community settings. Conclusion: An emphasis on RCTs by funders and researchers is not supported by scientific evidence. Alternative designs can lead to development of a valid body of research on community services such as respite. PMID:26729467
Zarit, Steven H; Bangerter, Lauren R; Liu, Yin; Rovine, Michael J
2017-03-01
There is growing emphasis on empirical validation of the efficacy of community-based services for older people and their families, but research on services such as respite care faces methodological challenges that have limited the growth of outcome studies. We identify problems associated with the usual research approaches for studying respite care, with the goal of stimulating use of novel and more appropriate research designs that can lead to improved studies of community-based services. Using the concept of research validity, we evaluate the methodological approaches in the current literature on respite services, including adult day services, in-home respite and overnight respite. Although randomized controlled trials (RCTs) are possible in community settings, validity is compromised by practical limitations of randomization and other problems. Quasi-experimental and interrupted time series designs offer comparable validity to RCTs and can be implemented effectively in community settings. An emphasis on RCTs by funders and researchers is not supported by scientific evidence. Alternative designs can lead to development of a valid body of research on community services such as respite.
Cramer, Robert J.; Johnson, Shara M.; McLaughlin, Jennifer; Rausch, Emilie M.; Conroy, Mary Alice
2014-01-01
Clinical and counseling psychology programs currently lack adequate evidence-based competency goals and training in suicide risk assessment. To begin to address this problem, this article proposes core competencies and an integrated training framework that can form the basis for training and research in this area. First, we evaluate the extent to which current training is effective in preparing trainees for suicide risk assessment. Within this discussion, sample and methodological issues are reviewed. Second, as an extension of these methodological training issues, we integrate empirically- and expert-derived suicide risk assessment competencies from several sources with the goal of streamlining core competencies for training purposes. Finally, a framework for suicide risk assessment training is outlined. The approach employs Objective Structured Clinical Examination (OSCE) methodology, an approach commonly utilized in medical competency training. The training modality also proposes the Suicide Competency Assessment Form (SCAF), a training tool evaluating self- and observer-ratings of trainee core competencies. The training framework and SCAF are ripe for empirical evaluation and potential training implementation. PMID:24672588
Cramer, Robert J; Johnson, Shara M; McLaughlin, Jennifer; Rausch, Emilie M; Conroy, Mary Alice
2013-02-01
Clinical and counseling psychology programs currently lack adequate evidence-based competency goals and training in suicide risk assessment. To begin to address this problem, this article proposes core competencies and an integrated training framework that can form the basis for training and research in this area. First, we evaluate the extent to which current training is effective in preparing trainees for suicide risk assessment. Within this discussion, sample and methodological issues are reviewed. Second, as an extension of these methodological training issues, we integrate empirically- and expert-derived suicide risk assessment competencies from several sources with the goal of streamlining core competencies for training purposes. Finally, a framework for suicide risk assessment training is outlined. The approach employs Objective Structured Clinical Examination (OSCE) methodology, an approach commonly utilized in medical competency training. The training modality also proposes the Suicide Competency Assessment Form (SCAF), a training tool evaluating self- and observer-ratings of trainee core competencies. The training framework and SCAF are ripe for empirical evaluation and potential training implementation.
A life prediction methodology for encapsulated solar cells
NASA Technical Reports Server (NTRS)
Coulbert, C. D.
1978-01-01
This paper presents an approach to the development of a life prediction methodology for encapsulated solar cells which are intended to operate for twenty years or more in a terrestrial environment. Such a methodology, or solar cell life prediction model, requires the development of quantitative intermediate relationships between local environmental stress parameters and the basic chemical mechanisms of encapsulant aging leading to solar cell failures. The use of accelerated/abbreviated testing to develop these intermediate relationships and in revealing failure modes is discussed. Current field and demonstration tests of solar cell arrays and the present laboratory tests to qualify solar module designs provide very little data applicable to predicting the long-term performance of encapsulated solar cells. An approach to enhancing the value of such field tests to provide data for life prediction is described.
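One quantitative relationship commonly used to connect accelerated laboratory stress to field aging is the Arrhenius acceleration factor; the sketch below computes how many hours of elevated-temperature testing would correspond to twenty years at an assumed field temperature. The activation energy and temperatures are illustrative assumptions, not values from the paper.

```python
# Arrhenius acceleration-factor sketch for relating accelerated thermal aging
# to field exposure. Activation energy and temperatures are assumed values.
import math

K_BOLTZMANN_EV = 8.617e-5      # Boltzmann constant, eV/K

def acceleration_factor(ea_ev, t_use_c, t_test_c):
    """AF = exp[(Ea/k) * (1/T_use - 1/T_test)], temperatures converted to kelvin."""
    t_use = t_use_c + 273.15
    t_test = t_test_c + 273.15
    return math.exp((ea_ev / K_BOLTZMANN_EV) * (1.0 / t_use - 1.0 / t_test))

ea = 0.9            # eV, assumed activation energy of the dominant encapsulant aging mechanism
field_temp = 45.0   # deg C, assumed typical module operating temperature
test_temp = 85.0    # deg C, assumed accelerated-test temperature

af = acceleration_factor(ea, field_temp, test_temp)
twenty_years_hours = 20 * 365.25 * 24
print(f"Acceleration factor: {af:.1f}")
print(f"Equivalent test duration: {twenty_years_hours / af:.0f} hours")
```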
Horvath, Karl; Semlitsch, Thomas; Jeitler, Klaus; Abuzahra, Muna E; Posch, Nicole; Domke, Andreas; Siebenhofer, Andrea
2016-01-01
Objectives: Identification of sufficiently trustworthy top 5 list recommendations from the US Choosing Wisely campaign. Setting: Not applicable. Participants: All top 5 list recommendations available from the American Board of Internal Medicine Foundation website. Main outcome measures/interventions: Compilation of US top 5 lists and search for current German highly trustworthy (S3) guidelines. Extraction of guideline recommendations, including grade of recommendation (GoR), for suggestions comparable to top 5 list recommendations. For recommendations without guideline equivalents, the methodological quality of the top 5 list development process was assessed using criteria similar to that used to judge guidelines, and relevant meta-literature was identified in cited references. Judgement of sufficient trustworthiness of top 5 list recommendations was based either on an ‘A’ GoR of guideline equivalents or on high methodological quality and citation of relevant meta-literature. Results: 412 top 5 list recommendations were identified. For 75 (18%), equivalents were found in current German S3 guidelines. 44 of these recommendations were associated with an ‘A’ GoR, or a strong recommendation based on strong evidence, and 26 had a ‘B’ or a ‘C’ GoR. No GoR was provided for 5 recommendations. 337 recommendations had no equivalent in the German S3 guidelines. The methodological quality of the development process was high and relevant meta-literature was cited for 87 top 5 list recommendations. For a further 36, either the methodological quality was high without any meta-literature citations or meta-literature citations existed but the methodological quality was lacking. For the remaining 214 recommendations, either the methodological quality was lacking and no literature was cited or the methodological quality was generally unsatisfactory. Conclusions: 131 of current US top 5 list recommendations were found to be sufficiently trustworthy. For a substantial number of current US top 5 list recommendations, their trustworthiness remains unclear. Methodological requirements for developing top 5 lists are recommended. PMID:27855098
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
Can Improvisation Be "Taught"?: A Call for Free Improvisation in Our Schools
ERIC Educational Resources Information Center
Hickey, Maud
2009-01-01
The purpose of this article is to present the idea that the music education profession's current drive to include improvisation in school music is limited in its approach, and that "teaching" improvisation, in the traditional sense, is not possible. These beliefs are based on an examination of current methodologies and texts in light of the…
ERIC Educational Resources Information Center
Linnenbrink-Garcia, Lisa; Middleton, Michael J.; Ciani, Keith D.; Easter, Matthew A.; O'Keefe, Paul A.; Zusho, Akane
2012-01-01
In current research on achievement goal theory, most researchers differentiate between performance-approach and performance-avoidance goal orientations. Evidence from prior research and from several previously published data sets is used to highlight that the correlation is often rather large, with a number of studies reporting correlations above…
Building Work-Based Learning into the School Curriculum
ERIC Educational Resources Information Center
Asher, Jenny
2005-01-01
Purpose - The purpose of this article is to examine the increasing number of opportunities for pre-16 young people at schools in England to become involved in work-related and work-based programmes and to look at the key drivers of change and their impact. Design/methodology/approach - The approach is descriptive, covering current trends and also…
ERIC Educational Resources Information Center
Sewell, Peter; Pool, Lorraine Dacre
2010-01-01
Purpose: The purpose of this paper is to discuss how the terms "employability", "enterprise" and "entrepreneurship" are currently being used, often interchangeably, within higher education, and to propose how to clarify this issue with the terminology. Design/methodology/approach: The approach taken is to discuss the three terms and some of their…
ERIC Educational Resources Information Center
Beckmann, Jennifer; Weber, Peter
2016-01-01
Purpose: The purpose of this study is to introduce a virtual collaborative learning setting called "Net Economy", which we established as part of an international learning network that currently comprises six universities, and to present our approach to continuously improving the course in each cycle. Design/Methodology/Approach: Using the community of…
[Theoretical and methodological uses of research in Social and Human Sciences in Health].
Deslandes, Suely Ferreira; Iriart, Jorge Alberto Bernstein
2012-12-01
This article aims to map and critically reflect on the current theoretical and methodological uses of research in the subfield of social and human sciences in health. A convenience sample was used to select three Brazilian public health journals. Based on a reading of 1,128 abstracts published from 2009 to 2010, 266 articles were selected that presented the empirical base of research stemming from social and human sciences in health. The sample was classified thematically as "theoretical/methodological reference", "study type/methodological design", "analytical categories", "data production techniques", and "analytical procedures". We analyze the sample's emic categories, drawing on the authors' literal statements. All the classifications and respective variables were tabulated in Excel. Most of the articles were self-described as qualitative and used more than one data production technique. There was a wide variety of theoretical references, in contrast with the almost total predominance of a single type of data analysis (content analysis). In several cases, important gaps were identified in expounding the study methodology and instrumental use of the qualitative research techniques and methods. However, the review did highlight some new objects of study and innovations in theoretical and methodological approaches.
Beyond the Natural Proteome: Nondegenerate Saturation Mutagenesis-Methodologies and Advantages.
Ferreira Amaral, M M; Frigotto, L; Hine, A V
2017-01-01
Beyond the natural proteome, high-throughput mutagenesis offers the protein engineer an opportunity to "tweak" the wild-type activity of a protein to create a recombinant protein with required attributes. Of the various approaches available, saturation mutagenesis is one of the core techniques employed by protein engineers, and in recent times, nondegenerate saturation mutagenesis is emerging as the approach of choice. This review compares the current methodologies available for conducting nondegenerate saturation mutagenesis with traditional degenerate saturation mutagenesis and briefly outlines the options available for screening the resulting libraries, to discover a novel protein with the required activity and/or specificity. © 2017 Elsevier Inc. All rights reserved.
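As a simple numerical illustration of why nondegenerate saturation is attractive, the sketch below compares the DNA-level library sizes implied by conventional NNK degenerate codons (32 codons per randomized position, covering all 20 amino acids plus one stop codon) with a nondegenerate scheme that uses exactly one codon per amino acid. The figures are straightforward combinatorics, not data from the review.

```python
# Library-size combinatorics: degenerate NNK codons vs nondegenerate saturation.
# NNK (N = A/C/G/T, K = G/T) gives 4 * 4 * 2 = 32 codons per randomized position,
# encoding all 20 amino acids with redundancy and one stop codon; a nondegenerate
# scheme uses exactly 20 codons per position. Simple arithmetic, not review data.

def library_sizes(n_positions):
    """Return (NNK DNA variants, distinct protein variants) for n randomized positions."""
    nnk_dna_variants = 32 ** n_positions
    protein_variants = 20 ** n_positions
    return nnk_dna_variants, protein_variants

for positions in (1, 3, 5):
    nnk, proteins = library_sizes(positions)
    print(f"{positions} position(s): NNK DNA variants = {nnk:,}, "
          f"protein variants = {proteins:,}, "
          f"DNA excess factor = {nnk / proteins:.1f}")
```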
Resource Letter MPCVW-1: Modeling Political Conflict, Violence, and Wars: A Survey
NASA Astrophysics Data System (ADS)
Morgenstern, Ana P.; Velásquez, Nicolás; Manrique, Pedro; Hong, Qi; Johnson, Nicholas; Johnson, Neil
2013-11-01
This Resource Letter provides a guide into the literature on modeling and explaining political conflict, violence, and wars. Although this literature is dominated by social scientists, multidisciplinary work is currently being developed in the wake of myriad methodological approaches that have sought to analyze and predict political violence. The works covered herein present an overview of this abundance of methodological approaches. Since there is a variety of possible data sets and theoretical approaches, the level of detail and scope of models can vary quite considerably. The review does not provide a summary of the available data sets, but instead highlights recent works on quantitative or multi-method approaches to modeling different forms of political violence. Journal articles and books are organized in the following topics: social movements, diffusion of social movements, political violence, insurgencies and terrorism, and civil wars.
Griffith, James W; Sumner, Jennifer A; Raes, Filip; Barnhofer, Thorsten; Debeer, Elise; Hermans, Dirk
2012-12-01
Autobiographical memory is a multifaceted construct that is related to psychopathology and other difficulties in functioning. Across many studies, a variety of methods have been used to study autobiographical memory. The relationship between overgeneral autobiographical memory (OGM) and psychopathology has been of particular interest, and many studies of this cognitive phenomenon rely on the Autobiographical Memory Test (AMT) to assess it. In this paper, we examine several methodological approaches to studying autobiographical memory, and focus primarily on methodological and psychometric considerations in OGM research. We pay particular attention to what is known about the reliability, validity, and methodological variations of the AMT. The AMT has adequate psychometric properties, but there is great variability in methodology across studies that use it. Methodological recommendations and suggestions for future studies are presented. Copyright © 2011 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Jones, Thomas C.; Dorsey, John T.; Doggett, William R.
2015-01-01
The Tendon-Actuated Lightweight In-Space MANipulator (TALISMAN) is a versatile long-reach robotic manipulator that is currently being tested at NASA Langley Research Center. TALISMAN is designed to be highly mass-efficient and multi-mission capable, with applications including asteroid retrieval and manipulation, in-space servicing, and astronaut and payload positioning. The manipulator uses a modular, periodic, tension-compression design that lends itself well to analytical modeling. Given the versatility of application for TALISMAN, a structural sizing methodology was developed that could rapidly assess mass and configuration sensitivities for any specified operating work space, applied loads and mission requirements. This methodology allows the systematic sizing of the key structural members of TALISMAN, which include the truss arm links, the spreaders and the tension elements. This paper summarizes the detailed analytical derivations and methodology that support the structural sizing approach and provides results from some recent TALISMAN designs developed for current and proposed mission architectures.
NASA Astrophysics Data System (ADS)
Ghosh, Sukanya; Roy, Souvanic; Sanyal, Manas Kumar
2016-09-01
With the help of a case study, the article explores current practices in the implementation of a governmental affordable housing programme for the urban poor in a slum of India. This work shows that the issues associated with governmental affordable housing programmes have to be addressed with a suitable methodology, because the complexities involve not only quantitative data but qualitative data as well. Hard Systems Methodology (HSM), which is conventionally applied to address such issues, deals with real and known problems that can be solved directly. Since most of the issues of the affordable housing programme found in the case study are subjective and complex in nature, Soft Systems Methodology (SSM) has been tried for a better representation of subjective points of view. The article explores the drawing of a Rich Picture as an SSM approach to better understand and analyse the complex issues and constraints of the affordable housing programme so that further exploration of the issues is possible.
Analysis of Freight Transport Strategies and Methodologies [summary
DOT National Transportation Integrated Search
2017-12-01
Transportation planners constantly examine traffic flows to see if current roadway layouts are serving traffic needs. For freight hauling, this presents one issue on the open road, but a much different issue as these large vehicles approach their des...
Automated Verification of Specifications with Typestates and Access Permissions
NASA Technical Reports Server (NTRS)
Siminiceanu, Radu I.; Catano, Nestor
2011-01-01
We propose an approach to formally verify Plural specifications based on access permissions and typestates, by model-checking automatically generated abstract state-machines. Our exhaustive approach captures all the possible behaviors of abstract concurrent programs implementing the specification. We describe the formal methodology employed by our technique and provide an example as proof of concept for the state-machine construction rules. The implementation of a fully automated algorithm to generate and verify models, currently underway, provides model checking support for the Plural tool, which currently supports only program verification via data flow analysis (DFA).
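For readers unfamiliar with typestate specifications, the following generic sketch shows the kind of abstract state machine such a specification induces: each operation is permitted only in certain states and moves the object to a successor state, so a checker can reject call sequences that violate the protocol. This is a language-neutral illustration only; it does not reproduce Plural's actual annotation syntax, its access-permission system, or the state-machine construction rules of the paper.

```python
# Generic typestate sketch: a stream protocol as an abstract state machine.
# Each operation is permitted only in certain states and yields a successor state.
# Illustration of the typestate idea only, not Plural's specification language.

TRANSITIONS = {
    ("closed", "open"): "open",
    ("open", "read"): "open",
    ("open", "close"): "closed",
}

def check(trace, start="closed"):
    """Return True if the call trace respects the typestate protocol."""
    state = start
    for op in trace:
        nxt = TRANSITIONS.get((state, op))
        if nxt is None:
            print(f"violation: '{op}' not permitted in state '{state}'")
            return False
        state = nxt
    return True

print(check(["open", "read", "read", "close"]))   # True
print(check(["read", "open"]))                    # rejected: 'read' before 'open'
```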
ERIC Educational Resources Information Center
Braganza, Morgan; Akesson, Bree; Rothwell, David
2017-01-01
Grounded theory is a popular methodological approach in social work research, especially by doctoral students conducting qualitative research. The approach, however, is not always used consistently or as originally designed, compromising the quality of the research. The aim of the current study is to assess the quality of recent Canadian social…
Youth, Heroin, Crack: A Review of Recent British Trends
ERIC Educational Resources Information Center
Seddon, Toby
2008-01-01
Purpose: The purpose of this paper is to review the research evidence on recent British trends in the use of heroin and/or crack-cocaine by young people in order to appraise the scale and nature of the contemporary health problem they pose. Design/methodology/approach: The approach consists of a narrative review of the main current data sources on…
De Buck, Emmy; Pauwels, Nele S; Dieltjens, Tessa; Vandekerckhove, Philippe
2014-03-01
As part of its strategy Belgian Red Cross-Flanders underpins all its activities with evidence-based guidelines and systematic reviews. The aim of this publication is to describe in detail the methodology used to achieve this goal within an action-oriented organisation, in a timely and cost-effective way. To demonstrate transparency in our methods, we wrote a methodological charter describing the way in which we develop evidence-based materials to support our activities. Criteria were drawn up for deciding on project priority and the choice of different types of projects (scoping reviews, systematic reviews and evidence-based guidelines). While searching for rigorous and realistically attainable methodological standards, we encountered a wide variety in terminology and methodology used in the field of evidence-based practice. Terminologies currently being used by different organisations and institutions include systematic reviews, systematic literature searches, evidence-based guidelines, rapid reviews, pragmatic systematic reviews, and rapid response service. It is not always clear what the definition and methodology is behind these terms and whether they are used consistently. We therefore describe the terminology and methodology used by Belgian Red Cross-Flanders; criteria for making methodological choices and details on the methodology we use are given. In our search for an appropriate methodology, taking into account time and resource constraints, we encountered an enormous variety of methodological approaches and terminology used for evidence-based materials. In light of this, we recommend that authors of evidence-based guidelines and reviews are transparent and clear about the methodology used. To be transparent about our approach, we developed a methodological charter. This charter may inspire other organisations that want to use evidence-based methodology to support their activities.
Uncertainties in Emissions Inputs for Near-Road Assessments
Emissions, travel demand, and dispersion models are all needed to obtain temporally and spatially resolved pollutant concentrations. Current methodology combines these three models in a bottom-up approach based on hourly traffic and emissions estimates, and hourly dispersion conc...
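A minimal sketch of the bottom-up chain described here: hourly traffic volumes are converted to hourly link emissions via an emission factor, and each hour's emissions are scaled by a dispersion factor to give a receptor concentration. The emission factors, dispersion factors and traffic volumes are placeholder assumptions, not outputs of the travel demand, emissions, or dispersion models themselves.

```python
# Bottom-up hourly near-road concentration sketch:
# traffic volume -> link emissions -> dispersed concentration at a receptor.
# All numbers are placeholder assumptions for illustration.

hourly_traffic = [1200, 1800, 2500, 2100]            # vehicles/hour for four example hours
emission_factor_g_per_veh_km = 0.25                  # assumed fleet-average emission factor
link_length_km = 1.0                                  # assumed roadway link length
dispersion_factor = [4.0e-6, 3.2e-6, 2.5e-6, 3.8e-6]  # assumed (ug/m^3) per (g/h), hour by hour

for hour, (volume, chi_over_q) in enumerate(zip(hourly_traffic, dispersion_factor)):
    emissions_g_per_h = volume * emission_factor_g_per_veh_km * link_length_km
    concentration = emissions_g_per_h * chi_over_q    # ug/m^3 at the receptor
    print(f"hour {hour}: emissions = {emissions_g_per_h:.0f} g/h, "
          f"concentration = {concentration:.2e} ug/m^3")
```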
The comparison of various approach to evaluation erosion risks and design control erosion measures
NASA Astrophysics Data System (ADS)
Kapicka, Jiri
2015-04-01
At present there is one methodology in the Czech Republic for computing and comparing erosion risks. This methodology also contains a method for designing erosion control measures. Its basis is the Universal Soil Loss Equation (USLE), whose result is the long-term average annual rate of erosion (G). This methodology is used by landscape planners. Data and statistics from the database of erosion events in the Czech Republic show that many troubles and damages come from local erosion events. The extent of these events and their impact depend on local precipitation, the current plant phase and soil conditions. These erosion events can cause trouble and damage to agricultural land, municipal property and hydraulic structures, even in locations that are in good condition from the point of view of the long-term average annual rate of erosion. Another way to compute and compare erosion risks is the episode-based approach. This paper presents a comparison of various approaches to computing erosion risks. The comparison was computed for a locality from the database of erosion events on agricultural land in the Czech Republic where two erosion events have been recorded. The study area is a simple piece of agricultural land without any barriers that could strongly influence water flow and soil sediment transport. The computation of erosion risks (for all methodologies) was based on laboratory analysis of soil samples taken from the study area. Results of the USLE and MUSLE methodologies and results from the mathematical model Erosion 3D were compared. Differences in the spatial distribution of the places with the highest soil erosion were compared and discussed. Another part presents differences in the designed erosion control measures, whose design was based on the different methodologies. The results show differences in the erosion risks computed by the different methodologies. These differences can open a discussion about different approaches to computing and evaluating erosion risks in areas of different importance.
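For reference, the USLE mentioned here estimates long-term average annual soil loss as the product of six factors, A = R * K * LS * C * P; the sketch below evaluates it for one illustrative set of factor values. The values are assumptions for demonstration, not data from the study area.

```python
# Universal Soil Loss Equation (USLE): A = R * K * LS * C * P
# A  - long-term average annual soil loss
# R  - rainfall erosivity, K - soil erodibility, LS - slope length/steepness factor,
# C  - cover management factor, P - support practice factor.
# Factor values below are illustrative assumptions, not data from the study area.

def usle(r, k, ls, c, p):
    return r * k * ls * c * p

annual_soil_loss = usle(r=600.0, k=0.035, ls=1.8, c=0.20, p=1.0)
print(f"Estimated average annual soil loss A = {annual_soil_loss:.1f} (per unit area)")
```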
A Practical Methodology for Disaggregating the Drivers of Drug Costs Using Administrative Data.
Lungu, Elena R; Manti, Orlando J; Levine, Mitchell A H; Clark, Douglas A; Potashnik, Tanya M; McKinley, Carol I
2017-09-01
Prescription drug expenditures represent a significant component of health care costs in Canada, with estimates of $28.8 billion spent in 2014. Identifying the major cost drivers and the effect they have on prescription drug expenditures allows policy makers and researchers to interpret current cost pressures and anticipate future expenditure levels. To identify the major drivers of prescription drug costs and to develop a methodology to disaggregate the impact of each of the individual drivers. The methodology proposed in this study uses the Laspeyres approach for cost decomposition. This approach isolates the effect of the change in a specific factor (e.g., price) by holding the other factor(s) (e.g., quantity) constant at the base-period value. The Laspeyres approach is expanded to a multi-factorial framework to isolate and quantify several factors that drive prescription drug cost. Three broad categories of effects are considered: volume, price and drug-mix effects. For each category, important sub-effects are quantified. This study presents a new and comprehensive methodology for decomposing the change in prescription drug costs over time including step-by-step demonstrations of how the formulas were derived. This methodology has practical applications for health policy decision makers and can aid researchers in conducting cost driver analyses. The methodology can be adjusted depending on the purpose and analytical depth of the research and data availability. © 2017 Journal of Population Therapeutics and Clinical Pharmacology. All rights reserved.
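A minimal two-factor sketch of the Laspeyres logic described here: the price effect is computed holding quantities at base-period values, the volume effect is computed holding prices at base-period values, and the remainder is the interaction term (the paper's multi-factorial framework decomposes costs further, e.g., into drug-mix sub-effects). The drug names, prices and quantities are invented for illustration.

```python
# Laspeyres-style decomposition of a change in drug expenditures into
# price, volume (quantity) and residual interaction effects.
# Prices and quantities below are invented for illustration.

base = {"drug_x": {"p": 10.0, "q": 1000}, "drug_y": {"p": 50.0, "q": 200}}
curr = {"drug_x": {"p": 11.0, "q": 1300}, "drug_y": {"p": 48.0, "q": 260}}

def total(period):
    """Total expenditure = sum over drugs of price * quantity."""
    return sum(d["p"] * d["q"] for d in period.values())

price_effect = sum(base[d]["q"] * (curr[d]["p"] - base[d]["p"]) for d in base)
volume_effect = sum(base[d]["p"] * (curr[d]["q"] - base[d]["q"]) for d in base)
total_change = total(curr) - total(base)
interaction = total_change - price_effect - volume_effect

print(f"total change      : {total_change:10.2f}")
print(f"price effect      : {price_effect:10.2f}")
print(f"volume effect     : {volume_effect:10.2f}")
print(f"interaction (rest): {interaction:10.2f}")
```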
NASA Technical Reports Server (NTRS)
Rocco, David A.
1994-01-01
Redefining the approach and philosophy that operations management uses to define, develop, and implement space missions will be a central element in achieving high efficiency mission operations for the future. The goal of a cost effective space operations program cannot be realized if the attitudes and methodologies we currently employ to plan, develop, and manage space missions do not change. A management philosophy that is in synch with the environment in terms of budget, technology, and science objectives must be developed. Changing our basic perception of mission operations will require a shift in the way we view the mission. This requires a transition from current practices of viewing the mission as a unique end product, to a 'mission development concept' built on the visualization of the end-to-end mission. To achieve this change we must define realistic mission success criteria and develop pragmatic approaches to achieve our goals. Custom mission development for all but the largest and most unique programs is not practical in the current budget environment, and we simply do not have the resources to implement all of our planned science programs. We need to shift our management focus to allow us the opportunity make use of methodologies and approaches which are based on common building blocks that can be utilized in the space, ground, and mission unique segments of all missions.
Corroborating evidence-based medicine.
Mebius, Alexander
2014-12-01
Proponents of evidence-based medicine (EBM) have argued convincingly for applying this scientific method to medicine. However, the current methodological framework of the EBM movement has recently been called into question, especially in epidemiology and the philosophy of science. The debate has focused on whether the methodology of randomized controlled trials provides the best evidence available. This paper attempts to shift the focus of the debate by arguing that clinical reasoning involves a patchwork of evidential approaches and that the emphasis on evidence hierarchies of methodology fails to lend credence to the common practice of corroboration in medicine. I argue that the strength of evidence lies in the evidence itself, and not the methodology used to obtain that evidence. Ultimately, when it comes to evaluating the effectiveness of medical interventions, it is the evidence obtained from the methodology rather than the methodology that should establish the strength of the evidence. © 2014 John Wiley & Sons, Ltd.
Designing for Annual Spacelift Performance
NASA Technical Reports Server (NTRS)
McCleskey, Carey M.; Zapata, Edgar
2017-01-01
This paper presents a methodology for approaching space launch system design from a total architectural point of view. This different approach to conceptual design is contrasted with traditional approaches that focus on a single set of metrics for flight system performance, i.e., payload lift per flight, vehicle mass, specific impulse, etc. The approach presented works with a larger set of metrics, including annual system lift, or "spacelift" performance. Spacelift performance is more inclusive of the flight production capability of the total architecture, i.e., the flight and ground systems working together as a whole to produce flights on a repeated basis. In the proposed methodology, spacelift performance becomes an important design-for-support parameter for flight system concepts and truly advanced spaceport architectures of the future. The paper covers examples of existing system spacelift performance as benchmarks, points out specific attributes of space transportation systems that must be greatly improved over these existing designs, and outlines current activity in this area.
Current target acquisition methodology in force on force simulations
NASA Astrophysics Data System (ADS)
Hixson, Jonathan G.; Miller, Brian; Mazz, John P.
2017-05-01
The U.S. Army RDECOM CERDEC NVESD MSD's target acquisition models have been used for many years by the military community in force on force simulations for training, testing, and analysis. There have been significant improvements to these models over the past few years. The significant improvements are the transition of ACQUIRE TTP-TAS (ACQUIRE Targeting Task Performance Target Angular Size) methodology for all imaging sensors and the development of new discrimination criteria for urban environments and humans. This paper is intended to provide an overview of the current target acquisition modeling approach and provide data for the new discrimination tasks. This paper will discuss advances and changes to the models and methodologies used to: (1) design and compare sensors' performance, (2) predict expected target acquisition performance in the field, (3) predict target acquisition performance for combat simulations, and (4) how to conduct model data validation for combat simulations.
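To make the modeled acquisition task concrete: ACQUIRE-family models typically express the probability of accomplishing a discrimination task through a target transfer probability function of the ratio between the achieved metric value V and the value V50 required for 50% task performance, P = (V/V50)^E / (1 + (V/V50)^E). The sketch below evaluates that form; the exponent expression used here is an assumed placeholder, since the calibrated coefficients differ between the Johnson-criteria and TTP formulations.

```python
# Target transfer probability function (TTPF) sketch:
#   P = (V / V50)**E / (1 + (V / V50)**E)
# V   - achieved value of the acquisition metric (e.g., resolved cycles or TTP value)
# V50 - metric value required for 50% probability of performing the task
# E   - empirical exponent; the simple form below is an assumed placeholder,
#       not the calibrated NVESD coefficients.

def task_probability(v, v50):
    ratio = v / v50
    e = 1.5 + 0.5 * ratio          # assumed exponent form for illustration
    return ratio ** e / (1.0 + ratio ** e)

for v in (0.5, 1.0, 2.0, 4.0):
    print(f"V/V50 = {v:4.1f}  ->  P = {task_probability(v, 1.0):.2f}")
```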
Eigenvalue Contributon Estimator for Sensitivity Calculations with TSUNAMI-3D
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T; Williams, Mark L
2007-01-01
Since the release of the Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) codes in SCALE [1], the use of sensitivity and uncertainty analysis techniques for criticality safety applications has greatly increased within the user community. In general, sensitivity and uncertainty analysis is transitioning from a technique used only by specialists to a practical tool in routine use. With the desire to use the tool more routinely comes the need to improve the solution methodology to reduce the input and computational burden on the user. This paper reviews the current solution methodology of the Monte Carlo eigenvalue sensitivity analysis sequence TSUNAMI-3D, describes an alternative approach, and presents results from both methodologies.
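For context on what the sensitivity coefficients produced by such sequences feed into: relative sensitivities S = (dk/k)/(dsigma/sigma) are commonly combined with a relative cross-section covariance matrix C through the so-called sandwich rule, (dk/k)^2 ~ S^T C S, which drives both uncertainty estimates and data adjustment. The three-parameter vector and covariance matrix below are invented solely to show the arithmetic.

```python
# "Sandwich rule" sketch: propagate cross-section covariance through
# sensitivity coefficients to a relative uncertainty on k-effective.
#   (delta k / k)^2 ~= S^T C S
# The 3-parameter sensitivity vector and covariance matrix are invented values.
import numpy as np

S = np.array([0.30, -0.15, 0.05])        # relative sensitivities dk/k per dsigma/sigma

C = np.array([                            # relative covariance of the cross sections
    [0.0025, 0.0005, 0.0000],
    [0.0005, 0.0016, 0.0002],
    [0.0000, 0.0002, 0.0009],
])

rel_var_k = S @ C @ S
print(f"relative uncertainty on k-eff: {np.sqrt(rel_var_k) * 100:.2f} %")
```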
ERIC Educational Resources Information Center
Arthur, Gabriella Colussi
1995-01-01
Considers the most recently published or re-edited North American textbooks for Italian with a view to discussing the current methodological trends they illustrate. Findings indicate that all except one of the seven textbooks reviewed have been upgraded to reflect an integrated or eclectic approach and that they differ mainly in the authors'…
Fractal Risk Assessment of ISS Propulsion Module in Meteoroid and Orbital Debris Environments
NASA Technical Reports Server (NTRS)
Mog, Robert A.
2001-01-01
A unique and innovative risk assessment of the International Space Station (ISS) Propulsion Module is conducted using fractal modeling of the Module's response to the meteoroid and orbital debris environments. Both the environment models and structural failure modes due to the resultant hypervelocity impact phenomenology, as well as Module geometry, are investigated for fractal applicability. The fractal risk assessment methodology could produce a greatly simplified alternative to current methodologies, such as BUMPER analyses, while maintaining or increasing the number of complex scenarios that can be assessed. As a minimum, this innovative fractal approach will provide an independent assessment of existing methodologies in a unique way.
Development and Current Status of Skull-Image Superimposition - Methodology and Instrumentation.
Lan, Y
1992-12-01
This article presents a review of the literature and an evaluation of the development and application of skull-image superimposition technology - both instrumentation and methodology - contributed by a number of scholars since 1935. Along with a comparison of the methodologies involved in the two superimposition techniques - photographic and video - the author characterizes the techniques in action and the recent advances in computer image superimposition processing technology. The major disadvantage of conventional approaches is their reliance on subjective interpretation. Through painstaking comparison and analysis, computer image processing technology can make more conclusive identifications by directly testing and evaluating the various programmed indices. Copyright © 1992 Central Police University.
Current Trends in the Treatment of Phobias in Autistic and Mentally Retarded Persons.
ERIC Educational Resources Information Center
Jackson, Henry J.
1983-01-01
The paper reviews research on phobic disorders of mentally retarded and autistic persons, noting the definitions, incidence and prevalence, etiological explanations, and treatment approaches. Methodological weaknesses are stressed. Behavioral interventions are seen as the treatments of choice. (CL)
Exploring the Experiences of Administrative Interns
ERIC Educational Resources Information Center
Jamison, Kimberly; Clayton, Jennifer
2016-01-01
Purpose: The purpose of this paper is to identify how current administrative interns enrolled in a university administrator preparation program describe and make meaning of their internship experiences. Design/methodology/approach: For this qualitative study, the researchers interviewed administrative interns enrolled in one university preparation…
DEVELOPMENT OF WEIGHTED DISTRIBUTIONS OF REPS FOR DIOXIN-LIKE COMPOUNDS
Potential health risks associated with exposure to mixtures of dioxin-like compounds are currently assessed using a toxic equivalency factor (TEF) approach. Recently, both the WHO and NAS reviewed the TEF methodology and acknowledged the importance of better characterizing varia...
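The TEF approach referenced here reduces a mixture to a single toxic equivalent concentration, TEQ = sum over congeners of (concentration_i x TEF_i); the sketch below applies that sum to an invented mixture. The congener concentrations are assumptions, and the TEF values shown are illustrative and should be taken from the current WHO consensus list in any real assessment.

```python
# Toxic equivalency (TEQ) sketch: TEQ = sum_i (concentration_i * TEF_i).
# Concentrations are invented; TEF values are illustrative only and should be
# taken from the current WHO consensus list in any real assessment.

mixture = {
    # congener: (concentration in pg/g, TEF)
    "2,3,7,8-TCDD": (0.8, 1.0),
    "PCB-126":      (5.0, 0.1),
    "OCDD":         (120.0, 0.0003),
}

teq = sum(conc * tef for conc, tef in mixture.values())
print(f"Mixture TEQ = {teq:.3f} pg TEQ/g")
```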
PIA and REWIND: Two New Methodologies for Cross Section Adjustment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palmiotti, G.; Salvatores, M.
2017-02-01
This paper presents two new cross section adjustment methodologies intended for coping with the problem of compensations. The first one, PIA (Progressive Incremental Adjustment), gives priority to the utilization of experiments of elemental type (those sensitive to a specific cross section), following a definite hierarchy on which type of experiment to use. Once the adjustment is performed, both the new adjusted data and the new covariance matrix are kept. The second methodology is called REWIND (Ranking Experiments by Weighting for Improved Nuclear Data). This new proposed approach tries to establish a methodology for ranking experiments by looking at the potential gain they can produce in an adjustment. Practical applications for different adjustments illustrate the results of the two methodologies against the current one and show the potential improvement for reducing uncertainties in target reactors.
The challenge of risk characterization: current practice and future directions.
Gray, G M; Cohen, J T; Graham, J D
1993-01-01
Risk characterization is perhaps the most important part of risk assessment. As currently practiced, risk characterizations do not convey the degree of uncertainty in a risk estimate to risk managers, Congress, the press, and the public. Here, we use a framework put forth by an ad hoc study group of industry and government scientists and academics to critique the risk characterizations contained in two risk assessments of gasoline vapor. After discussing the strengths and weaknesses of each assessment's risk characterization, we detail an alternative approach that conveys estimates in the form of a probability distribution. The distributional approach can make use of all relevant scientific data and knowledge, including alternative data sets and all plausible mechanistic theories of carcinogenesis. As a result, this approach facilitates better public health decisions than current risk characterization procedures. We discuss methodological issues, as well as strengths and weaknesses of the distributional approach. PMID:8020444
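A minimal sketch of the distributional idea described above, with purely hypothetical exposure and potency distributions (none of the numbers come from the gasoline-vapor assessments): uncertainty in the inputs is propagated by Monte Carlo sampling so that risk is reported as a distribution rather than a single point estimate.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical lognormal distributions for daily exposure (mg/kg-day) and
# carcinogenic potency ((mg/kg-day)^-1); the parameters are illustrative only.
exposure = rng.lognormal(mean=np.log(1e-4), sigma=0.5, size=n)
potency = rng.lognormal(mean=np.log(3e-2), sigma=1.0, size=n)

# Lifetime excess risk for each Monte Carlo draw.
risk = exposure * potency

# Characterize risk as a distribution rather than a single point estimate.
for q in (5, 50, 95):
    print(f"{q}th percentile risk: {np.percentile(risk, q):.2e}")
```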
Loucka, Martin; Payne, Sheila; Brearley, Sarah
2014-01-01
A number of research projects have been conducted that aim to gather data on the international development of palliative care. These data are important for policy makers and palliative care advocates. The aim of this article was to provide a critical comparative analysis of methodological approaches used to assess the development and status of palliative care services and infrastructure at an international level. A selective literature review that focused on the methodological features of eight identified reports was undertaken. Reviewed reports were found to differ in adopted methodologies and provided uneven amounts of methodological information. Five major methodological limitations were identified (lack of theory, use of experts as source of information, grey literature, difficulties in ranking, and the problematic nature of data on service provision). A set of recommendations on how to deal with these issues in future research is provided. Measuring the international development of palliative care is a difficult and challenging task. The results of this study could be used to improve the validity of future research in this field. Copyright © 2014 U.S. Cancer Pain Relief Committee. Published by Elsevier Inc. All rights reserved.
Lores, Marta; Llompart, Maria; Alvarez-Rivera, Gerardo; Guerra, Eugenia; Vila, Marlene; Celeiro, Maria; Lamas, J Pablo; Garcia-Jares, Carmen
2016-04-07
Cosmetic products placed on the market, and their ingredients, must be safe under reasonable conditions of use, in accordance with the current legislation. Therefore, regulated and allowed chemical substances must meet the regulatory criteria to be used as ingredients in cosmetics and personal care products, and adequate analytical methodology is needed to evaluate the degree of compliance. This article reviews the most recent methods (2005-2015) used for the extraction and the analytical determination of the ingredients included in the positive lists of the European Regulation of Cosmetic Products (EC 1223/2009), comprising colorants, preservatives and UV filters. It summarizes the analytical properties of the most relevant methods along with the extent to which they fulfil current regulatory requirements. The cosmetic legislation is frequently updated; consequently, the analytical methodology must be constantly revised and improved to meet safety requirements. The article highlights the most important advances in analytical methodology for cosmetics control, both in relation to sample pretreatment and extraction and to the different instrumental approaches developed to address this challenge. Cosmetics are complex samples, and most of them require a sample pretreatment before analysis. Recent research covering this aspect has tended towards the use of green extraction and microextraction techniques. Analytical methods are generally based on liquid chromatography with UV detection, and on gas and liquid chromatographic techniques hyphenated with single or tandem mass spectrometry; some interesting proposals based on electrophoresis have also been reported, together with some electroanalytical approaches. Regarding the number of ingredients considered for analytical control, single-analyte methods have been proposed, although the most useful ones in real-life cosmetic analysis are the multianalyte approaches. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Ellis, Matthew O. A.; Stamenova, Maria; Sanvito, Stefano
2017-12-01
There exists a significant challenge in developing efficient magnetic tunnel junctions with low write currents for nonvolatile memory devices. With the aim of analyzing potential materials for efficient current-operated magnetic junctions, we have developed a multi-scale methodology combining ab initio calculations of spin-transfer torque with large-scale time-dependent simulations using atomistic spin dynamics. In this work we introduce our multiscale approach, including a discussion on a number of possible schemes for mapping the ab initio spin torques into the spin dynamics. We demonstrate this methodology on a prototype Co/MgO/Co/Cu tunnel junction showing that the spin torques are primarily acting at the interface between the Co free layer and MgO. Using spin dynamics we then calculate the reversal switching times for the free layer and the critical voltages and currents required for such switching. Our work provides an efficient, accurate, and versatile framework for designing novel current-operated magnetic devices, where all the materials details are taken into account.
Hemostatic strategies for traumatic and surgical bleeding
Behrens, Adam M.; Sikorski, Michael J.; Kofinas, Peter
2017-01-01
Wide interest in new hemostatic approaches has stemmed from unmet needs in the hospital and on the battlefield. Many current commercial hemostatic agents fail to fulfill the design requirements of safety, efficacy, cost, and storage. Academic focus has led to the improvement of existing strategies as well as new developments. This review will identify and discuss the three major classes of hemostatic approaches: biologically derived materials, synthetically derived materials, and intravenously administered hemostatic agents. The general class is first discussed, then specific approaches discussed in detail, including the hemostatic mechanisms and the advancement of the method. As hemostatic strategies evolve and synthetic-biologic interactions are more fully understood, current clinical methodologies will be replaced. PMID:24307256
Campbell, J Elliott; Moen, Jeremie C; Ney, Richard A; Schnoor, Jerald L
2008-03-01
Estimates of forest soil organic carbon (SOC) have applications in carbon science, soil quality studies, carbon sequestration technologies, and carbon trading. Forest SOC has been modeled using a regression coefficient methodology that applies mean SOC densities (mass/area) to broad forest regions. A higher resolution model is based on an approach that employs a geographic information system (GIS) with soil databases and satellite-derived landcover images. Despite this advancement, the regression approach remains the basis of current state and federal level greenhouse gas inventories. Both approaches are analyzed in detail for Wisconsin forest soils from 1983 to 2001, applying rigorous error-fixing algorithms to soil databases. Resulting SOC stock estimates are 20% larger when determined using the GIS method rather than the regression approach. Average annual rates of increase in SOC stocks are 3.6 and 1.0 million metric tons of carbon per year for the GIS and regression approaches respectively.
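For illustration only (the areas, densities and cell values below are invented, not the Wisconsin data), a minimal sketch of the two estimation strategies the abstract compares: applying a mean SOC density to broad forest regions versus summing per-cell densities drawn from a GIS soil database.

```python
import numpy as np

# Regression-coefficient approach: one mean SOC density per broad forest region.
region_area_ha = np.array([2.1e6, 1.4e6])     # hypothetical forest areas, ha
mean_density_tC = np.array([95.0, 110.0])     # hypothetical mean SOC, t C/ha
soc_regression = (region_area_ha * mean_density_tC).sum()

# GIS approach: sum SOC over raster cells with per-cell soil-database densities.
rng = np.random.default_rng(1)
cell_area_ha = 25.0
cell_density = rng.normal(120.0, 30.0, size=140_000).clip(min=0)  # t C/ha per cell
soc_gis = (cell_density * cell_area_ha).sum()

print(f"Regression estimate: {soc_regression / 1e6:.1f} Mt C")
print(f"GIS estimate:        {soc_gis / 1e6:.1f} Mt C")
```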
Methodologies for Improving Flight Project Information Capture, Storage, and Dissemination
NASA Technical Reports Server (NTRS)
Equils, Douglas J.
2011-01-01
This paper will discuss the drawbacks and risks of the current documentation paradigm, how Document QuickStart improves on that process, and ultimately how this streamlined approach will reduce risk and costs for the next generation of Flight Projects at JPL.
Farmer Experience of Pluralistic Agricultural Extension, Malawi
ERIC Educational Resources Information Center
Chowa, Clodina; Garforth, Chris; Cardey, Sarah
2013-01-01
Purpose: Malawi's current extension policy supports pluralism and advocates responsiveness to farmer demand. We investigate whether smallholder farmers' experience supports the assumption that access to multiple service providers leads to extension and advisory services that respond to the needs of farmers. Design/methodology/approach: Within a…
Innovative Technologies for Multicultural Education Needs
ERIC Educational Resources Information Center
Ferdig, Richard E.; Coutts, Jade; DiPietro, Joseph; Lok, Benjamin; Davis, Niki
2007-01-01
Purpose: The purpose of this paper is to discuss several technology applications that are being used to address current problems or opportunities related to multicultural education. Design/methodology/approach: Five technology applications or technology-related projects are discussed, including a teacher education literacy tool, social networking…
Current methodological approaches in conditioned pain modulation assessment in pediatrics
Hwang, Philippe S; Ma, My-Linh; Spiegelberg, Nora; Ferland, Catherine E
2017-01-01
Conditioned pain modulation (CPM) paradigms have been used in various studies with healthy and non-healthy adult populations in an attempt to elucidate the mechanisms of pain processing. However, only a few studies so far have applied CPM in pediatric populations. Studies finding associations with chronic pain conditions suggest that deficiencies in underlying descending pain pathways may play an important role in the development and persistence of pain early in life. Twelve studies examining solely pediatric populations were identified through a PubMed search, and these are reviewed with regard to the demographics studied, methodological approaches, and conclusions reached. This review aimed to provide both clinicians and researchers with a brief overview of the current state of research regarding the use of CPM in children and adolescents, both healthy and clinical patients. The implications of CPM in experimental and clinical settings and its potential to aid in refining considerations to individualize treatment of pediatric pain syndromes will be discussed. PMID:29263694
Design of experiments enhanced statistical process control for wind tunnel check standard testing
NASA Astrophysics Data System (ADS)
Phillips, Ben D.
The current wind tunnel check standard testing program at NASA Langley Research Center is focused on increasing data quality, uncertainty quantification, and overall control and improvement of wind tunnel measurement processes. The statistical process control (SPC) methodology employed in the check standard testing program allows for the tracking of variations in measurements over time as well as an overall assessment of facility health. While the SPC approach can and does provide researchers with valuable information, it has certain limitations in the areas of process improvement and uncertainty quantification. It is thought that, by utilizing design of experiments methodology in conjunction with the current SPC practices, one can efficiently and more robustly characterize uncertainties and develop enhanced process improvement procedures. In this research, methodologies were developed to generate regression models for wind tunnel calibration coefficients, balance force coefficients and wind tunnel flow angularities. The coefficients of these regression models were then tracked in statistical process control charts, giving a higher level of understanding of the processes. The methodology outlined is sufficiently generic that this research can be applicable to any wind tunnel check standard testing program.
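A minimal sketch of the combined idea using invented numbers (this is not NASA Langley's actual procedure): a simple calibration regression is fitted for each check-standard run, and one regression coefficient is then tracked on a Shewhart-style control chart with three-sigma limits computed from the run-to-run standard deviation.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical check-standard runs: fit a calibration regression each run,
# then track the fitted slope across runs instead of raw measurements.
coef_history = []
for run in range(30):
    q = np.linspace(5, 50, 20)                           # dynamic pressure settings
    response = 0.002 * q + rng.normal(0, 0.0015, q.size)  # measured coefficient
    slope = np.polyfit(q, response, 1)[0]
    coef_history.append(slope)

coef_history = np.array(coef_history)
center = coef_history.mean()
sigma = coef_history.std(ddof=1)
ucl, lcl = center + 3 * sigma, center - 3 * sigma

# Flag runs whose fitted coefficient falls outside the control limits.
flagged = np.where((coef_history > ucl) | (coef_history < lcl))[0]
print(f"center={center:.5f}, UCL={ucl:.5f}, LCL={lcl:.5f}, flagged runs={flagged}")
```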
van Diessen, E; Numan, T; van Dellen, E; van der Kooi, A W; Boersma, M; Hofman, D; van Lutterveld, R; van Dijk, B W; van Straaten, E C W; Hillebrand, A; Stam, C J
2015-08-01
Electroencephalogram (EEG) and magnetoencephalogram (MEG) recordings during resting state are increasingly used to study functional connectivity and network topology. Moreover, the number of different analysis approaches is expanding along with the rising interest in this research area. The comparison between studies can therefore be challenging, and discussion is needed to underscore methodological opportunities and pitfalls in functional connectivity and network studies. In this overview we discuss methodological considerations throughout the analysis pipeline of recording and analyzing resting state EEG and MEG data, with a focus on functional connectivity and network analysis. We summarize current common practices with their advantages and disadvantages, provide practical tips, and offer suggestions for future research. Finally, we discuss how methodological choices in resting state research can affect the construction of functional networks. By taking advantage of current best practices and avoiding the most obvious pitfalls, functional connectivity and network studies can be improved, enabling a more accurate interpretation and comparison between studies. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
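As a toy illustration of the kind of pipeline discussed above (not a recommended metric choice), the sketch below builds a functional connectivity matrix from synthetic multichannel data using plain correlation as a stand-in for whichever connectivity measure is selected, thresholds it into a binary network, and reports a simple graph property. The channel count, sampling rate and threshold are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical resting-state data: 19 channels x 10 s at 250 Hz.
n_chan, n_samp = 19, 2500
data = rng.standard_normal((n_chan, n_samp))
data[1] += 0.6 * data[0]            # inject one correlated channel pair for illustration

# Functional connectivity as absolute correlation between channel pairs.
conn = np.abs(np.corrcoef(data))
np.fill_diagonal(conn, 0.0)

# Simple network construction: keep the strongest 10% of edges, report mean degree.
upper = conn[np.triu_indices(n_chan, k=1)]
threshold = np.percentile(upper, 90)
adjacency = (conn >= threshold).astype(int)
print("mean node degree:", adjacency.sum(axis=1).mean())
```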
[Quantity versus quality: a review on current methodological dispute in health services research].
Sikorski, Claudia; Glaesmer, Heide; Bramesfeld, Anke
2010-10-01
The aim of this study was to determine the percentage of qualitative and quantitative research papers on health services research in two German journals. All publications of the two journals were reviewed; only empirical research papers were included. It was then assessed whether they dealt with health services research and what methodology was used to collect and analyse data. About half of all published empirical papers dealt with health services research. Of those, slightly over 20% used qualitative methods at least partially. Ordered by topic, qualitative data collection and analysis are especially common in the fields of phenomenology, treatment determinants and treatment outcome. Purely qualitative methodology is still used rather seldom in health services research. Attempts to include quantitative as well as qualitative approaches are limited to sequential designs, lowering the independent value of both approaches. The concept of triangulation offers the possibility of overcoming paradigm-based dichotomies. However, the choice of methodology ought to be based primarily on the research question. © Georg Thieme Verlag KG Stuttgart · New York.
Rogiers, Bart; Mallants, Dirk; Batelaan, Okke; Gedeon, Matej; Huysmans, Marijke; Dassargues, Alain
2017-01-01
Cone penetration testing (CPT) is one of the most efficient and versatile methods currently available for geotechnical, lithostratigraphic and hydrogeological site characterization. Currently available methods for soil behaviour type classification (SBT) of CPT data however have severe limitations, often restricting their application to a local scale. For parameterization of regional groundwater flow or geotechnical models, and delineation of regional hydro- or lithostratigraphy, regional SBT classification would be very useful. This paper investigates the use of model-based clustering for SBT classification, and the influence of different clustering approaches on the properties and spatial distribution of the obtained soil classes. We additionally propose a methodology for automated lithostratigraphic mapping of regionally occurring sedimentary units using SBT classification. The methodology is applied to a large CPT dataset, covering a groundwater basin of ~60 km2 with predominantly unconsolidated sandy sediments in northern Belgium. Results show that the model-based approach is superior in detecting the true lithological classes when compared to more frequently applied unsupervised classification approaches or literature classification diagrams. We demonstrate that automated mapping of lithostratigraphic units using advanced SBT classification techniques can provide a large gain in efficiency, compared to more time-consuming manual approaches and yields at least equally accurate results.
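A minimal sketch of the model-based clustering step on synthetic CPT-like features (the feature values are invented, not the Belgian dataset): a Gaussian mixture is fitted for several candidate numbers of soil classes and the number of classes is selected by BIC, rather than fixed in advance by a literature classification chart.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(4)

# Hypothetical CPT-derived features per depth interval:
# log cone resistance and log friction ratio for two pseudo-lithologies.
sand = np.column_stack([rng.normal(1.2, 0.15, 500), rng.normal(-0.3, 0.1, 500)])
clay = np.column_stack([rng.normal(0.3, 0.15, 300), rng.normal(0.6, 0.1, 300)])
X = np.vstack([sand, clay])

# Model-based clustering: fit mixtures with 1-5 components, pick the best by BIC.
models = {k: GaussianMixture(n_components=k, random_state=0).fit(X) for k in range(1, 6)}
best_k = min(models, key=lambda k: models[k].bic(X))
labels = models[best_k].predict(X)

print("selected number of soil classes:", best_k)
print("class sizes:", np.bincount(labels))
```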
Fazlollahtabar, Hamed
2010-12-01
Consumer expectations for automobile seat comfort continue to rise. Given this, it is evident that the current automobile seat comfort development process, which is only sporadically successful, needs to change. In this context, there has been growing recognition of the need to establish theoretical and methodological foundations for automobile seat comfort. At the same time, seat producers need to know the comfort that customers require in order to produce seats based on their interests. Current research methodologies apply qualitative approaches because of the anthropometric specifications involved; the most significant weakness of these approaches is the imprecision of the inferences they yield. Despite the qualitative nature of consumers' preferences, there are methods to transform the qualitative parameters into numerical values, which could help seat producers improve or enhance their products. This approach would also help the automobile manufacturer to source seats from the best producer with regard to consumers' opinions. In this paper, a heuristic multi-criteria decision-making technique is applied to express consumers' preferences as numerical values. This technique is a combination of the Analytic Hierarchy Process (AHP), the entropy method, and the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS). A case study is conducted to illustrate the applicability and effectiveness of the proposed heuristic approach. Copyright © 2010 Elsevier Ltd. All rights reserved.
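A minimal sketch of the entropy-weighted TOPSIS portion of such a heuristic (the AHP stage is omitted and the decision matrix is invented): criteria weights are derived from the dispersion of the ratings, and seat alternatives are ranked by their relative closeness to the ideal solution.

```python
import numpy as np

# Hypothetical decision matrix: rows = seat design alternatives,
# columns = comfort criteria rated by consumers (all treated as benefit criteria).
X = np.array([
    [7.0, 6.5, 8.0],
    [6.0, 8.0, 7.5],
    [8.5, 7.0, 6.0],
])

# Entropy weighting: criteria with more dispersion across alternatives get more weight.
P = X / X.sum(axis=0)
entropy = -(P * np.log(P)).sum(axis=0) / np.log(X.shape[0])
weights = (1 - entropy) / (1 - entropy).sum()

# TOPSIS: distances to the ideal and anti-ideal solutions on the weighted,
# vector-normalised matrix, then ranking by relative closeness.
V = weights * X / np.linalg.norm(X, axis=0)
ideal, anti = V.max(axis=0), V.min(axis=0)
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)

print("closeness scores:", np.round(closeness, 3))
print("ranking (best alternative first):", np.argsort(-closeness))
```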
Community-based early warning systems for flood risk mitigation in Nepal
NASA Astrophysics Data System (ADS)
Smith, Paul J.; Brown, Sarah; Dugar, Sumit
2017-03-01
This paper focuses on the use of community-based early warning systems for flood resilience in Nepal. The first part of the work outlines the evolution and current status of these community-based systems, highlighting the limited lead times currently available for early warning. The second part of the paper focuses on the development of a robust operational flood forecasting methodology for use by the Nepal Department of Hydrology and Meteorology (DHM) to enhance early warning lead times. The methodology uses data-based, physically interpretable time series models and data assimilation to generate probabilistic forecasts, which are presented in a simple visual tool. The approach is designed to work in situations of limited data availability, with an emphasis on sustainability and appropriate technology. The successful application of the forecast methodology to the flood-prone Karnali River basin in western Nepal is outlined, increasing lead times from 2-3 to 7-8 h. The challenges faced in communicating probabilistic forecasts to the last mile of the existing community-based early warning systems across Nepal are discussed. The paper concludes with an assessment of the applicability of this approach in basins and countries beyond Karnali and Nepal and an overview of key lessons learnt from this initiative.
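A toy sketch of the forecasting idea under invented data and parameters (this is not the DHM operational model): a simple data-based AR(1) water-level model is nudged toward each new gauge observation, and an ensemble is then propagated over the desired lead time to give a probabilistic forecast.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical hourly water levels (m) and a simple AR(1) model whose state
# is nudged toward each new observation (a crude data-assimilation step).
obs = 2.0 + 0.4 * np.sin(np.arange(48) / 6.0) + rng.normal(0, 0.05, 48)
phi, noise_sd, gain = 0.95, 0.08, 0.6

state = obs[0]
for z in obs[1:]:
    forecast = phi * state
    state = forecast + gain * (z - forecast)   # assimilate the latest gauge reading

# Probabilistic forecast: propagate an ensemble over the lead time.
lead_hours, n_ens = 8, 500
ensemble = np.full(n_ens, state)
for _ in range(lead_hours):
    ensemble = phi * ensemble + rng.normal(0, noise_sd, n_ens)

print("8 h ahead: median %.2f m, 90%% interval [%.2f, %.2f] m"
      % (np.median(ensemble), *np.percentile(ensemble, [5, 95])))
```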
Principals' Opinions of Organisational Justice in Elementary Schools in Turkey
ERIC Educational Resources Information Center
Aydin, Inayet; Karaman-Kepenekci, Yasemin
2008-01-01
Purpose--This study aims to present the opinions of public elementary school principals in Turkey about the current organisational justice practices among teachers from the distributive, procedural, interactional, and rectificatory dimensions. Design/methodology/approach--The opinions of 11 public elementary school principals in Ankara about…
Where Are the People? The Human Viewpoint Approach for Architecting and Acquisition
2014-10-01
however, systems engineers currently do not have sufficient tools to quantitatively integrate human considerations into systems development (Hardman & Colombi, 2012). Reference: Hardman, N., & Colombi, J. (2012). An empirical methodology for human integration in the SE technical processes. Journal of Systems Engineering, 13(1), 72–79.
Learning Strategies at Work and Professional Development
ERIC Educational Resources Information Center
Haemer, Hannah Deborah; Borges-Andrade, Jairo Eduardo; Cassiano, Simone Kelli
2017-01-01
Purpose: This paper aims to investigate the prediction of current and evolutionary perceptions of professional development through five learning strategies at work and through training and how individual and job characteristics predict those strategies. Design/methodology/approach: Variables were measured in a cross-sectional survey, with 962…
Strategic Information Systems Planning in Malaysian Public Universities
ERIC Educational Resources Information Center
Ismail, Noor Azizi; Raja Mohd Ali, Raja Haslinda; Mat Saat, Rafeah; Hsbollah, Hafizah Mohamad
2007-01-01
Purpose: The paper's purpose is to investigate the current status, problems and benefits of strategic information systems planning implementation in Malaysian public universities. Design/methodology/approach: The study uses dual but mutually supportive strands of investigation, i.e. a questionnaire survey and interviews. Findings: Malaysian public…
The Status of Entrepreneurship Education in Australian Universities
ERIC Educational Resources Information Center
Maritz, Alex; Jones, Colin; Shwetzer, Claudia
2015-01-01
Purpose: The purpose of this paper is to provide an analytical overview of the current state of entrepreneurship education (EE) in Australia; placing emphasis on programs, curricula and entrepreneurship ecosystems. Design/methodology/approach: The authors performed a contextual review of the literature by delineating entrepreneurship education…
Collaboration in Cultural Heritage Digitisation in East Asia
ERIC Educational Resources Information Center
Lee, Hyuk-Jin
2010-01-01
Purpose: The purpose of this paper is to review the current status of collaboration in cultural heritage preservation in East Asia, including digital projects, and to suggest practical improvements based on a cultural structuralism perspective. Design/methodology/approach: Through exploratory research, the paper addresses aspects for successful…
A General Survey of Qualitative Research Methodology.
ERIC Educational Resources Information Center
Cary, Rick
Current definitions and philosophical foundations of qualitative research are presented; and designs, evaluation methods, and issues in application of qualitative research to education are discussed. The effects of positivism and the post-positivist era on qualitative research are outlined, and naturalist and positivist approaches are contrasted.…
A Social-Medical Approach to Violence in Colombia
Franco, Saul
2003-01-01
Violence is the main public health problem in Colombia. Many theoretical and methodological approaches to solving this problem have been attempted from different disciplines. My past work has focused on homicide violence from the perspective of social medicine. In this article I present the main conceptual and methodological aspects and the chief findings of my research over the past 15 years. Findings include a quantitative description of the current situation and the introduction of the category of explanatory contexts as a contribution to the study of Colombian violence. The complexity and severity of this problem demand greater theoretical discussion, more plans for action and a faster transition between the two. Social medicine may make a growing contribution to this field. PMID:14652328
FY16 Status Report on Development of Integrated EPP and SMT Design Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jetter, R. I.; Sham, T. -L.; Wang, Y.
2016-08-01
The goal of the Elastic-Perfectly Plastic (EPP) combined integrated creep-fatigue damage evaluation approach is to incorporate a Simplified Model Test (SMT) data based approach for creep-fatigue damage evaluation into the EPP methodology, in order to avoid the separate evaluation of creep and fatigue damage and to eliminate the requirement for stress classification in current methods, thus greatly simplifying the evaluation of elevated temperature cyclic service. The EPP methodology is based on the idea that creep damage and strain accumulation can be bounded by a properly chosen "pseudo" yield strength used in an elastic-perfectly plastic analysis, thus avoiding the need for stress classification. The original SMT approach is based on the use of elastic analysis. The experimental data, cycles to failure, are correlated using the elastically calculated strain range in the test specimen, and the corresponding component strain is also calculated elastically. The advantage of this approach is that it is no longer necessary to use the damage interaction, or D-diagram, because the damage due to the combined effects of creep and fatigue is accounted for in the test data by means of a specimen that is designed to replicate or bound the stress and strain redistribution that occurs in actual components when loaded in the creep regime. The reference approach to combining the two methodologies and the corresponding uncertainties and validation plans are presented. Results from recent key feature tests are discussed to illustrate the applicability of the EPP methodology and the behavior of materials at elevated temperature when undergoing stress and strain redistribution due to plasticity and creep.
Gifford, Grace K; Gill, Anthony J; Stevenson, William S
2016-01-01
Molecular classification of diffuse large B-cell lymphoma (DLBCL) is critical. Numerous methodologies have demonstrated that DLBCL is biologically heterogeneous despite morphological similarities. This underlies the disparate outcomes of treatment response or failure in this common non-Hodgkin lymphoma. This review will summarise historical approaches to lymphoma classifications, current diagnosis of DLBCL, molecular techniques that have primarily been used in the research setting to distinguish and subclassify DLBCL, evaluate contemporary diagnostic methodologies that seek to translate lymphoma biology into clinical practice, and introduce novel diagnostic platforms that may overcome current issues. The review concludes with an overview of key molecular lesions currently identified in DLBCL, all of which are potential targets for drug treatments that may improve survival and cure. Copyright © 2015 The Royal College of Pathologists of Australasia. Published by Elsevier B.V. All rights reserved.
The current deconstruction of paradoxes: one sign of the ongoing methodological "revolution".
Porta, Miquel; Vineis, Paolo; Bolúmar, Francisco
2015-10-01
The current deconstruction of paradoxes is one among several signs that a profound renewal of methods for clinical and epidemiological research is taking place, perhaps for some basic life sciences as well. The new methodological approaches have already deconstructed and explained long-puzzling apparent paradoxes, including the (non-existent) benefits of obesity in diabetics, or of smoking in low birth weight. Achievements of the new methods also include the elucidation of the causal structure of long-disputed and highly complex questions, such as Berkson's bias and Simpson's paradox, and the clarification of the reasons for deep controversies, such as those on estrogens and endometrial cancer, or on adverse effects of hormone replacement therapy. These are signs that the new methods can go deeper than, and beyond, the methods in current use. A major example of a highly relevant idea is: when we condition on a common effect of a pair of variables, a spurious association between that pair is likely. The implications of these ideas are potentially vast. A substantial number of apparent paradoxes may simply be the result of collider biases, a source of selection bias that is common not just in epidemiologic research, but in many types of research in the health, life, and social sciences. The new approaches develop a new framework of concepts and methods, such as the collider, instrumental variables, d-separation, the backdoor path and, notably, Directed Acyclic Graphs (DAGs). The current theoretical and methodological renewal, or perhaps "revolution", may be deeply changing how clinical and epidemiological research is conceived and performed, how we assess the validity and relevance of findings, and how causal inferences are made. Clinical and basic researchers, among others, should get acquainted with DAGs and related concepts.
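A small simulation illustrating the collider mechanism described above (variable names echo the low-birth-weight example; all numbers and coefficients are invented): two independent causes become spuriously associated once the analysis conditions on their common effect.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200_000

# Two independent causes (no causal link between them)...
smoking = rng.standard_normal(n)
genetic_factor = rng.standard_normal(n)

# ...sharing a common effect (a collider), here a low-birth-weight indicator.
low_birth_weight = 0.7 * smoking + 0.7 * genetic_factor + rng.standard_normal(n) > 1.0

# Marginally the causes are uncorrelated; restricting the analysis to
# low-birth-weight cases (conditioning on the collider) induces a spurious
# negative association between them.
print("overall correlation:        %.3f" % np.corrcoef(smoking, genetic_factor)[0, 1])
sel = low_birth_weight
print("within-stratum correlation: %.3f" % np.corrcoef(smoking[sel], genetic_factor[sel])[0, 1])
```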
New Methodology for Estimating Fuel Economy by Vehicle Class
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chin, Shih-Miao; Dabbs, Kathryn; Hwang, Ho-Ling
2011-01-01
Office of Highway Policy Information to develop a new methodology to generate annual estimates of average fuel efficiency and number of motor vehicles registered by vehicle class for Table VM-1 of the Highway Statistics annual publication. This paper describes the new methodology developed under this effort and compares the results of the existing manual method and the new systematic approach. The methodology developed under this study takes a two-step approach. First, preliminary fuel efficiency rates are estimated based on vehicle stock models for different classes of vehicles. Then, a reconciliation model is used to adjust the initial fuel consumption rates from the vehicle stock models and match the VMT information for each vehicle class and the reported total fuel consumption. This reconciliation model utilizes a systematic approach that produces documentable and reproducible results. The basic framework utilizes a mathematical programming formulation to minimize the deviations between the fuel economy estimates published in the previous year's Highway Statistics and the results from the vehicle stock models, subject to the constraint that fuel consumption for the different vehicle classes must sum to the total fuel consumption estimate published in Table MF-21 of the current year's Highway Statistics. The results generated from this new approach provide a smoother time series for fuel economies by vehicle class. It also utilizes the most up-to-date and best available data, with sound econometric models, to generate MPG estimates by vehicle class.
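A minimal sketch of the reconciliation step under invented figures (not the actual Highway Statistics data): class-level MPG values are kept as close as possible to the stock-model priors while the implied fuel use, VMT divided by MPG summed over classes, is constrained to match the published total.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical inputs: prior MPG estimates from vehicle stock models,
# annual VMT by vehicle class, and a published total fuel consumption.
mpg_prior = np.array([24.0, 17.5, 6.8])          # cars, light trucks, combination trucks
vmt = np.array([2.2e12, 6.5e11, 1.8e11])         # miles per class
fuel_total = 1.75e11                              # gallons (stand-in for Table MF-21)

# Objective: keep reconciled MPG close to the priors (relative deviations).
def deviation(mpg):
    return np.sum(((mpg - mpg_prior) / mpg_prior) ** 2)

# Constraint: implied fuel use summed over classes must match the published total.
constraint = {"type": "eq", "fun": lambda mpg: np.sum(vmt / mpg) - fuel_total}

result = minimize(deviation, x0=mpg_prior, constraints=[constraint],
                  bounds=[(1.0, None)] * 3, method="SLSQP")
print("reconciled MPG by class:", np.round(result.x, 2))
```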
Practical aspects of protein co-evolution.
Ochoa, David; Pazos, Florencio
2014-01-01
Co-evolution is a fundamental aspect of Evolutionary Theory. At the molecular level, co-evolutionary linkages between protein families have long been used as indicators of protein interactions and functional relationships. Due to the complexity of the problem and the amount of genomic data required for these approaches to achieve good performance, it took a relatively long time from the appearance of the first ideas and concepts to the everyday application of these approaches and their incorporation into the standard toolboxes of bioinformaticians and molecular biologists. Today, these methodologies are mature (both in terms of performance and usability/implementation), and the genomic information that feeds them is large enough to allow their general application. This review summarizes the current landscape of co-evolution-based methodologies, with a strong emphasis on interesting cases where their application to important biological systems, alone or in combination with other computational and experimental approaches, has provided new insight into those systems.
Dhir, Somdutta; Pacurar, Mircea; Franklin, Dino; Gáspári, Zoltán; Kertész-Farkas, Attila; Kocsor, András; Eisenhaber, Frank; Pongor, Sándor
2010-11-01
SBASE is a project initiated to detect known domain types and predict domain architectures using sequence similarity searching (Simon et al., Protein Seq Data Anal, 5: 39-42, 1992; Pongor et al., Nucl. Acids Res. 21: 3111-3115, 1992). The current approach uses a curated collection of domain sequences - the SBASE domain library - and standard similarity search algorithms, followed by postprocessing based on simple statistics of the domain similarity network (http://hydra.icgeb.trieste.it/sbase/). It is especially useful in detecting rare, atypical examples of known domain types that are sometimes missed even by more sophisticated methodologies. This approach does not require multiple alignment or machine learning techniques, and can be a useful complement to other domain detection methodologies. This article gives an overview of the project history as well as of the concepts and principles developed within the project.
E-Learning: Ageing Workforce versus Technology-Savvy Generation
ERIC Educational Resources Information Center
Becker, Karen; Fleming, Julie; Keijsers, Wilhelmina
2012-01-01
Purpose: The purpose of this paper is to provide description and analysis of how a traditional industry is currently using e-learning, and to identify how the potential of e-learning can be realised whilst acknowledging the technological divide between younger and older workers. Design/methodology/approach: An exploratory qualitative methodology…
DOT National Transportation Integrated Search
2012-03-31
This report evaluates the performance of Continuous Risk Profile (CRP) compared with the : Sliding Window Method (SWM) and Peak Searching (PS) methods. These three network : screening methods all require the same inputs: traffic collision data and Sa...
Learning through Work: Emerging Perspectives and New Challenges
ERIC Educational Resources Information Center
Billett, Stephen; Choy, Sarojni
2013-01-01
Purpose: This paper aims to consider and appraise current developments and emerging perspectives on learning in the circumstances of work, to propose how some of the challenges for securing effective workplace learning may be redressed. Design/methodology/approach: First, new challenges and perspectives on learning in the circumstances of work are…
Designing a Pedagogical Model for Web Engineering Education: An Evolutionary Perspective
ERIC Educational Resources Information Center
Hadjerrouit, Said
2005-01-01
In contrast to software engineering, which relies on relatively well established development approaches, there is a lack of a proven methodology that guides Web engineers in building reliable and effective Web-based systems. Currently, Web engineering lacks process models, architectures, suitable techniques and methods, quality assurance, and a…
Guidelines for line-oriented flight training, volume 2
NASA Technical Reports Server (NTRS)
Lauber, J. K.; Foushee, H. C.
1981-01-01
Current approaches to line-oriented flight training used by six American airlines are described. This recurrent training methodology makes use of a full-crew and full-mission simulation to teach and assess resource management skills, but does not necessarily fulfill requirements for the training and manipulation of all skills.
English Teaching in Argentina.
ERIC Educational Resources Information Center
Arazi, Blanca
2002-01-01
Examines the teaching of English in Argentina, a country that has had a myriad of English language teaching activities at all levels for many decades--mostly in British English. Looks at English in binational centers, in schools, and at the university level; methodological approach; language assessment; teacher training; and the current economic…
Examining General Hospitals' Smoke-Free Policies
ERIC Educational Resources Information Center
Whitman, Marilyn V.; Harbison, Phillip Adam
2010-01-01
Purpose: This paper aims to examine the level of smoke-free policies in general hospitals and the barriers faced in implementing restrictive policies banning smoking inside buildings and on surrounding grounds. Design/methodology/approach: A survey was developed to gather data on hospitals' current smoke-free policies, including the challenges…
Ideas and Approaches for Teaching Undergraduate Research Methods in the Health Sciences
ERIC Educational Resources Information Center
Peachey, Andrew A.; Baller, Stephanie L.
2015-01-01
Training in research methodology is becoming more commonly expected within undergraduate curricula designed to prepare students for entry into graduate allied health programs. Little information is currently available about pedagogical strategies to promote undergraduate students' learning of research methods, and less yet is available discussing…
Enterprise Education in Initial Teacher Education in Ireland
ERIC Educational Resources Information Center
Tiernan, Peter
2016-01-01
Purpose: The purpose of this paper is to examine the impact of enterprise education on students' understanding of and attitudes to entrepreneurship and enterprise education in initial teacher education. Design/methodology/approach: This paper builds on current literature by introducing student teachers to the theory and practice of…
"Folk" Understandings of Quality in UK Higher Hospitality Education
ERIC Educational Resources Information Center
Wood, Roy
2015-01-01
Purpose: The purpose of this paper is to provide an overview of the evolution of "folk" understandings of quality in higher hospitality education and the consequent implications of these understandings for current quality concerns in the field. Design/methodology/approach: The paper combines a historical survey of the stated topic…
The Immigrant and Hispanic Paradoxes: A Systematic Review of Their Predictions and Effects
ERIC Educational Resources Information Center
Teruya, Stacey A.; Bazargan-Hejazi, Shahrzad
2013-01-01
A survey of the literature indicates that reported advantages of the Immigrant and Hispanic Paradox are inconsistent and equivocal. The "healthy migrant hypothesis" also suggests that current research approaches consider only "healthy" groups. Other methodological concerns include the simple underreporting of deaths, and that…
Linguistic Hegemony Today: Recommendations for Eradicating Language Discrimination
ERIC Educational Resources Information Center
Scott, Lakia M.; Venegas, Elena M.
2017-01-01
Purpose: The purpose of this paper is to discuss issues of contemporary language conflict in educational contexts. Design/methodology/approach: This is a conceptual paper which examines current educational practices and policies through the lens of linguistic hegemony. Findings: The authors identify three primary areas in which linguistic hegemony…
A Qualitative Approach to Understanding Audience's Perceptions of Creativity in Online Advertising
ERIC Educational Resources Information Center
McStay, Andrew
2010-01-01
In this paper I seek to inquire upon audience's perceptions of creativity in online advertising--a heretofore poorly understood area. This paper initially outlines current academic understanding of creativity in online advertising, mainly derived from quantitative assessments. It then advances a qualitative methodology including diary-interviews…
Four Perspectives on the Status of Child Abuse and Neglect Research.
ERIC Educational Resources Information Center
Friedman, Robert M.; And Others
The current status of child abuse and neglect research is reviewed from the four traditional perspectives of mental health, medicine, law, and social work. In the field of mental health, research methodology; characteristics of victims, perpetrators, families, and the situation; prediction; long-term effects; and theoretical approaches are…
Assessment of Internship Effectiveness in South Italy Universities
ERIC Educational Resources Information Center
della Volpe, Maddalena
2017-01-01
Purpose: The purpose of this paper is to describe and discuss the way internships are currently evaluated in Campania Universities by host institutions. Design/methodology/approach: The author collected and described questionnaires used by the universities of the Regional Observatory of Campania University System. These questionnaires are given by…
An Evaluation of Alert Services: Quantity versus Quality
ERIC Educational Resources Information Center
Zandian, Fatemeh; Riahinia, Nosrat; Azimi, Ali; Poursalehi, Nastaran
2010-01-01
Purpose: Online information vendors currently offer a variety of additional services; among these are alert services which present requested information on recent publications to registered users. This paper aims to investigate a variety of alert services provided by four online information vendors. Design/methodology/approach: A comparison of the…
Job Insecurity and Remuneration in Chinese Family-Owned Business Workers
ERIC Educational Resources Information Center
Hu, Qiao; Schaufeli, Wilmar B.
2011-01-01
Purpose: The purpose of this paper is to study the impact of job insecurity (past job downsizing and anticipated job downsizing) and current remuneration--via wellbeing (burnout and work engagement)--on organizational outcomes (organization commitment and low turnover intention) of Chinese family-owned business. Design/methodology/approach: The…
The New Generation: Characteristics and Motivations of BME Graduate Entrepreneurs
ERIC Educational Resources Information Center
Hussain, Javed G.; Scott, Jonathan M.; Hannon, Paul D.
2008-01-01
Purpose: The purpose of this paper is to profile the characteristics and entrepreneurial motivations of graduate entrepreneurs from black and minority ethnic (BME) communities. Design/methodology/approach: To gather the data, the authors interviewed selected individuals from within the BME community (including current students and graduates from…
Fashion Entrepreneurship Education in the UK and China
ERIC Educational Resources Information Center
Shi, Jiwei Jenny; Chen, Yudong; Gifford, Elena Kate; Jin, Hui
2012-01-01
Purpose: The purpose of this paper is to obtain a shared understanding of entrepreneurship education and to evaluate the effectiveness of employability and enterprise division in current fashion courses and amongst the students between a British and a Chinese university (UClan and SCAU). Design/methodology/approach: It is a three-stage…
What Factors Influence Vietnamese Students' Choice of University?
ERIC Educational Resources Information Center
Dao, Mai Thi Ngoc; Thorpe, Anthony
2015-01-01
Purpose: The purpose of this paper is to report the factors that influence Vietnamese students' choice of university in a little researched context where the effects of globalization and education reform are changing higher education. Design/methodology/approach: A quantitative survey was completed by 1,124 current or recently completed university…
Examining the Planning and Management of Principal Succession
ERIC Educational Resources Information Center
Zepeda, Sally J.; Bengtson, Ed; Parylo, Oksana
2012-01-01
Purpose: The purpose of this study is to examine principal succession planning and management by analyzing current practices of handling school leader succession in four Georgia school systems. Design/methodology/approach: Looking through the lens of organizational leadership succession theory, the practices of school systems as they experienced…
Personal Name Identification in the Practice of Digital Repositories
ERIC Educational Resources Information Center
Xia, Jingfeng
2006-01-01
Purpose: To propose improvements to the identification of authors' names in digital repositories. Design/methodology/approach: Analysis of current name authorities in digital resources, particularly in digital repositories, and analysis of some features of existing repository applications. Findings: This paper finds that the variations of authors'…
Optimizing a Workplace Learning Pattern: A Case Study from Aviation
ERIC Educational Resources Information Center
Mavin, Timothy John; Roth, Wolff-Michael
2015-01-01
Purpose: This study aims to contribute to current research on team learning patterns. It specifically addresses some negative perceptions of the job performance learning pattern. Design/methodology/approach: Over a period of three years, qualitative and quantitative data were gathered on pilot learning in the workplace. The instructional modes…
Decolonizing Education: A Critical Discourse Analysis of Post-Secondary Humanities Textbooks
ERIC Educational Resources Information Center
Harper, Kimberly C.
2012-01-01
This dissertation examines nine post-secondary humanities textbooks published between 2001 and 2011 using an approach that includes both qualitative and quantitative methodology to analyze the written and visual content of humanities textbooks. This dissertation engages in current debates that address bias in humanities textbooks and contributes…
Learning about Teachers' Literacy Instruction from Classroom Observations
ERIC Educational Resources Information Center
Kelcey, Ben; Carlisle, Joanne F.
2013-01-01
The purpose of this study is to contribute to efforts to improve methods for gathering and analyzing data from classroom observations in early literacy. The methodological approach addresses current problems of reliability and validity of classroom observations by taking into account differences in teachers' uses of instructional actions (e.g.,…
Systems Analysis Approach for the NASA Environmentally Responsible Aviation Project
NASA Technical Reports Server (NTRS)
Kimmel, William M.
2011-01-01
This conference paper describes the current systems analysis approach being implemented for the Environmentally Responsible Aviation Project within the Integrated Systems Research Program under the NASA Aeronautics Research Mission Directorate. The scope and purpose of these systems studies are introduced, followed by a methodology overview. The approach involves both top-down and bottom-up components to provide NASA's stakeholders with a rationale for the prioritization and tracking of a portfolio of technologies which enable the future fleet of aircraft to operate with a simultaneous reduction of aviation noise, emissions and fuel-burn impacts to our environment. Examples of key current results and relevant decision support conclusions are presented along with a forecast of the planned analyses to follow.
Methodological approaches in conducting overviews: current state in HTA agencies.
Pieper, Dawid; Antoine, Sunya-Lee; Morfeld, Jana-Carina; Mathes, Tim; Eikermann, Michaela
2014-09-01
Overviews search for reviews rather than for primary studies. They might have the potential to support decision making within a shorter time frame by reducing production time. We aimed to summarize available instructions for authors intending to conduct overviews as well as the currently applied methodology of overviews in international Health Technology Assessment (HTA) agencies. We identified 127 HTA agencies and scanned their websites for methodological handbooks as well as published overviews as HTA reports. Additionally, we contacted HTA agencies by e-mail to retrieve possible unidentified handbooks or other related sources. In total, eight HTA agencies providing methodological support were found. Thirteen HTA agencies were found to have produced overviews since 2007, but only six of them published more than four overviews. Overviews were mostly employed in HTA products related to rapid assessment. Additional searches for primary studies published after the last review are often mentioned in order to update results. Although the interest in overviews is rising, little methodological guidance for the conduct of overviews is provided by HTA agencies. Overviews are of special interest in the context of rapid assessments to support policy-making within a short time frame. Therefore, empirical work on overviews needs to be extended. National strategies and experience should be disclosed and discussed. Copyright © 2013 John Wiley & Sons, Ltd.
Methodology for fault detection in induction motors via sound and vibration signals
NASA Astrophysics Data System (ADS)
Delgado-Arredondo, Paulo Antonio; Morinigo-Sotelo, Daniel; Osornio-Rios, Roque Alfredo; Avina-Cervantes, Juan Gabriel; Rostro-Gonzalez, Horacio; Romero-Troncoso, Rene de Jesus
2017-01-01
Nowadays, timely maintenance of electric motors is vital to keep up the complex processes of industrial production. A variety of methodologies for fault diagnosis currently exist. Usually, the diagnosis is performed by analyzing current signals at steady-state motor operation or during a start-up transient. This method is known as motor current signature analysis, which identifies frequencies associated with faults in the frequency domain or through time-frequency decomposition of the current signals. Fault identification may also be possible by analyzing acoustic sound and vibration signals, which is useful because sometimes this is the only information available. The contribution of this work is a methodology for detecting faults in induction motors in steady-state operation based on the analysis of acoustic sound and vibration signals. The proposed approach uses the Complete Ensemble Empirical Mode Decomposition to decompose the signal into several intrinsic mode functions. Subsequently, the frequency marginal of the Gabor representation is calculated to obtain the spectral content of each IMF in the frequency domain. This proposal provides good fault detectability results compared to other published works, in addition to identifying more frequencies associated with the faults. The faults diagnosed in this work are two broken rotor bars, mechanical unbalance and bearing defects.
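A minimal sketch of the final spectral step, assuming the intrinsic mode functions have already been obtained from a CEEMD implementation, and using a plain FFT in place of the frequency marginal of the Gabor representation; the sampling rate, IMF contents and fault frequency below are invented for illustration.

```python
import numpy as np

fs = 10_000.0                        # sampling rate, Hz (hypothetical)
t = np.arange(0, 1.0, 1.0 / fs)
rng = np.random.default_rng(6)

# Hypothetical IMFs: one dominated by the supply frequency, one by a
# bearing-defect frequency buried in noise.
imfs = np.vstack([
    np.sin(2 * np.pi * 60.0 * t),
    0.3 * np.sin(2 * np.pi * 157.0 * t) + 0.05 * rng.standard_normal(t.size),
])

# Frequency-domain content of each IMF; peaks would be compared against the
# characteristic frequencies of broken bars, unbalance, or bearing defects.
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
for i, imf in enumerate(imfs):
    spectrum = np.abs(np.fft.rfft(imf))
    print(f"IMF {i}: dominant frequency = {freqs[spectrum.argmax()]:.1f} Hz")
```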
Assessing species saturation: conceptual and methodological challenges.
Olivares, Ingrid; Karger, Dirk N; Kessler, Michael
2018-05-07
Is there a maximum number of species that can coexist? Intuitively, we assume an upper limit to the number of species in a given assemblage, or that a lineage can produce, but defining and testing this limit has proven problematic. Herein, we first outline seven general challenges of studies on species saturation, most of which are independent of the actual method used to assess saturation. Among these are the challenge of defining saturation conceptually and operationally, the importance of setting an appropriate referential system, and the need to discriminate among patterns, processes and mechanisms. Second, we list and discuss the methodological approaches that have been used to study species saturation. These approaches vary in time and spatial scales, and in the variables and assumptions needed to assess saturation. We argue that assessing species saturation is possible, but that many studies conducted to date have conceptual and methodological flaws that prevent us from currently attaining a good idea of the occurrence of species saturation. © 2018 Cambridge Philosophical Society.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fogarty, Aoife C., E-mail: fogarty@mpip-mainz.mpg.de; Potestio, Raffaello, E-mail: potestio@mpip-mainz.mpg.de; Kremer, Kurt, E-mail: kremer@mpip-mainz.mpg.de
A fully atomistic modelling of many biophysical and biochemical processes at biologically relevant length- and time scales is beyond our reach with current computational resources, and one approach to overcome this difficulty is the use of multiscale simulation techniques. In such simulations, when system properties necessitate a boundary between resolutions that falls within the solvent region, one can use an approach such as the Adaptive Resolution Scheme (AdResS), in which solvent particles change their resolution on the fly during the simulation. Here, we apply the existing AdResS methodology to biomolecular systems, simulating a fully atomistic protein with an atomistic hydration shell, solvated in a coarse-grained particle reservoir and heat bath. Using as a test case an aqueous solution of the regulatory protein ubiquitin, we first confirm the validity of the AdResS approach for such systems, via an examination of protein and solvent structural and dynamical properties. We then demonstrate how, in addition to providing a computational speedup, such a multiscale AdResS approach can yield otherwise inaccessible physical insights into biomolecular function. We use our methodology to show that protein structure and dynamics can still be correctly modelled using only a few shells of atomistic water molecules. We also discuss aspects of the AdResS methodology peculiar to biomolecular simulations.
Koutkias, Vassilis; Stalidis, George; Chouvarda, Ioanna; Lazou, Katerina; Kilintzis, Vassilis; Maglaveras, Nicos
2009-01-01
Adverse Drug Events (ADEs) are currently considered as a major public health issue, endangering patients' safety and causing significant healthcare costs. Several research efforts are currently concentrating on the reduction of preventable ADEs by employing Information Technology (IT) solutions, which aim to provide healthcare professionals and patients with relevant knowledge and decision support tools. In this context, we present a knowledge engineering approach towards the construction of a Knowledge-based System (KBS) regarded as the core part of a CDSS (Clinical Decision Support System) for ADE prevention, all developed in the context of the EU-funded research project PSIP (Patient Safety through Intelligent Procedures in Medication). In the current paper, we present the knowledge sources considered in PSIP and the implications they pose to knowledge engineering, the methodological approach followed, as well as the components defining the knowledge engineering framework based on relevant state-of-the-art technologies and representation formalisms.
What Synthesis Methodology Should I Use? A Review and Analysis of Approaches to Research Synthesis.
Schick-Makaroff, Kara; MacDonald, Marjorie; Plummer, Marilyn; Burgess, Judy; Neander, Wendy
2016-01-01
When we began this process, we were doctoral students and a faculty member in a research methods course. As students, we were facing a review of the literature for our dissertations. We encountered several different ways of conducting a review but were unable to locate any resources that synthesized all of the various synthesis methodologies. Our purpose is to present a comprehensive overview and assessment of the main approaches to research synthesis. We use 'research synthesis' as a broad overarching term to describe various approaches to combining, integrating, and synthesizing research findings. We conducted an integrative review of the literature to explore the historical, contextual, and evolving nature of research synthesis. We searched five databases, reviewed websites of key organizations, hand-searched several journals, and examined relevant texts from the reference lists of the documents we had already obtained. We identified four broad categories of research synthesis methodology including conventional, quantitative, qualitative, and emerging syntheses. Each of the broad categories was compared to the others on the following: key characteristics, purpose, method, product, context, underlying assumptions, unit of analysis, strengths and limitations, and when to use each approach. The current state of research synthesis reflects significant advancements in emerging synthesis studies that integrate diverse data types and sources. New approaches to research synthesis provide a much broader range of review alternatives available to health and social science students and researchers.
NASA Astrophysics Data System (ADS)
Vazquez Rascon, Maria de Lourdes
This thesis focuses on the implementation of a participatory and transparent decision-making tool for wind farm projects. The tool is based on an argumentative framework that reflects the value systems of the stakeholders involved in these projects, and it employs two multicriteria methods, multicriteria decision aid (MCDA) and participatory geographic information systems (GIS), which make it possible to represent these value systems through criteria and indicators to be evaluated. The stakeholders' value systems allow environmental, economic and socio-cultural aspects of wind energy projects to be included and thus support a sustainable-development vision of wind projects. This vision is analyzed using the 16 sustainability principles of Quebec's Sustainable Development Act. Four specific objectives structure the work and ensure the development of a successful tool: designing a methodology that couples MCDA and participatory GIS, testing the developed methodology through a case study, performing a robustness analysis to address strategic issues, and analyzing the strengths, weaknesses, opportunities and threats of the developed methodology. Achieving the first objective produced a decision-making tool called Territorial Intelligence Modeling for Energy Development (the TIMED approach). The TIMED approach is represented visually by a figure expressing the idea of co-constructed decisions in which all stakeholders are the focus of the methodology. TIMED is composed of four modules: multicriteria decision analysis, participatory geographic information systems, active involvement of the stakeholders, and scientific knowledge/local knowledge. The integration of these four modules allows different wind turbine implementation scenarios to be analyzed in order to choose the best one through a participatory and transparent decision-making process that takes stakeholders' concerns into account. The second objective involved testing TIMED ex post on a wind farm in operation since 2006. Eleven people participated in this test, representing four stakeholder categories: the private sector, the public sector, experts and civil society. The test allowed us to analyze the context in which wind projects are currently developed in Quebec. The concerns of some stakeholders regarding situations not considered in the current context were explored through the third objective, which involved simulations based on assumptions at the strategic level, such as the communication tools used to approach the host community and the type of park ownership. Finally, the fourth objective, a SWOT analysis carried out with eight experts, allowed us to verify the extent to which the TIMED approach succeeded in constructing four spaces for participatory decision-making: physical, intellectual, emotional and procedural. From this, 116 strengths, 28 weaknesses, 32 constraints and 54 opportunities were identified.
The contributions, applications, limitations and extensions of this research include: providing a participatory decision-making methodology that takes socio-cultural, environmental and economic variables into account; holding reflection sessions on a wind farm in operation; the MCDA knowledge acquired by participants involved in testing the proposed methodology; taking the physical, intellectual, emotional and procedural spaces into account to articulate a participatory decision; using the proposed methodology for renewable energy sources other than wind; the need for an interdisciplinary team to apply the methodology; access to quality data; access to information technologies; the right to public participation; the neutrality of experts; the relationships between experts and non-experts; cultural constraints; improvement of the designed indicators; the implementation of a web platform for participatory decision-making; and the writing of a manual on the use of the developed methodology. Keywords: wind farm, multicriteria decision, geographic information systems, TIMED approach, sustainable wind energy project development, renewable energy, social participation, robustness concern, SWOT analysis.
Polcin, Douglas L.
2016-01-01
Abstract Communities throughout the U.S. are struggling to find solutions for serious and persistent homelessness. Alcohol and drug problems can be causes and consequences of homelessness, as well as co-occurring problems that complicate efforts to succeed in finding stable housing. Two prominent service models exist, one known as “Housing First” takes a harm reduction approach and the other known as the “linear” model typically supports a goal of abstinence from alcohol and drugs. Despite their popularity, the research supporting these models suffers from methodological problems and inconsistent findings. One purpose of this paper is to describe systematic reviews of the homelessness services literature, which illustrate weaknesses in research designs and inconsistent conclusions about the effectiveness of current models. Problems among some of the seminal studies on homelessness include poorly defined inclusion and exclusion criteria, inadequate measures of alcohol and drug use, unspecified or poorly implemented comparison conditions, and lack of procedures documenting adherence to service models. Several recent papers have suggested broader based approaches for homeless services that integrate alternatives and respond better to consumer needs. Practical considerations for implementing a broader system of services are described and peer-managed recovery homes are presented as examples of services that address some of the gaps in current approaches. Three issues are identified that need more attention from researchers: (1) improving upon the methodological limitations in current studies, (2) assessing the impact of broader based, integrated services on outcome, and (3) assessing approaches to the service needs of homeless persons involved in the criminal justice system. PMID:27092027
Quantifying Drosophila food intake: comparative analysis of current methodology
Deshpande, Sonali A.; Carvalho, Gil B.; Amador, Ariadna; Phillips, Angela M.; Hoxha, Sany; Lizotte, Keith J.; Ja, William W.
2014-01-01
Food intake is a fundamental parameter in animal studies. Despite the prevalent use of Drosophila in laboratory research, precise measurements of food intake remain challenging in this model organism. Here, we compare several common Drosophila feeding assays: the Capillary Feeder (CAFE), food-labeling with a radioactive tracer or a colorimetric dye, and observations of proboscis extension (PE). We show that the CAFE and radioisotope-labeling provide the most consistent results, have the highest sensitivity, and can resolve differences in feeding that dye-labeling and PE fail to distinguish. We conclude that performing the radiolabeling and CAFE assays in parallel is currently the best approach for quantifying Drosophila food intake. Understanding the strengths and limitations of food intake methodology will greatly advance Drosophila studies of nutrition, behavior, and disease. PMID:24681694
Soller, Jeffrey A; Eftim, Sorina E; Nappier, Sharon P
2018-01-01
Understanding pathogen risks is a critically important consideration in the design of water treatment, particularly for potable reuse projects. As an extension to our published microbial risk assessment methodology to estimate infection risks associated with Direct Potable Reuse (DPR) treatment train unit process combinations, herein, we (1) provide an updated compilation of pathogen density data in raw wastewater and dose-response models; (2) conduct a series of sensitivity analyses to consider potential risk implications using updated data; (3) evaluate the risks associated with log credit allocations in the United States; and (4) identify reference pathogen reductions needed to consistently meet currently applied benchmark risk levels. Sensitivity analyses illustrated changes in cumulative annual risks estimates, the significance of which depends on the pathogen group driving the risk for a given treatment train. For example, updates to norovirus (NoV) raw wastewater values and use of a NoV dose-response approach, capturing the full range of uncertainty, increased risks associated with one of the treatment trains evaluated, but not the other. Additionally, compared to traditional log-credit allocation approaches, our results indicate that the risk methodology provides more nuanced information about how consistently public health benchmarks are achieved. Our results indicate that viruses need to be reduced by 14 logs or more to consistently achieve currently applied benchmark levels of protection associated with DPR. The refined methodology, updated model inputs, and log credit allocation comparisons will be useful to regulators considering DPR projects and design engineers as they consider which unit treatment processes should be employed for particular projects. Published by Elsevier Ltd.
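The annual risk figures referred to above are built from raw-wastewater pathogen densities, treatment log-reduction credits, and dose-response models. The sketch below strings those pieces together for a single reference pathogen; the exponential dose-response form, the numeric inputs, and the 2 L/day consumption figure are illustrative assumptions, not values from the study.

```python
import numpy as np

def daily_infection_risk(raw_density, log_reduction, volume_l, k):
    """Exponential dose-response: P_inf = 1 - exp(-k * dose), with the dose set by
    the raw-wastewater density, the treatment train's log10 reduction credit,
    and the daily volume of finished water consumed."""
    dose = raw_density * 10.0 ** (-log_reduction) * volume_l
    return 1.0 - np.exp(-k * dose)

def annual_risk(daily_risks):
    """Combine independent daily infection risks into an annual probability."""
    return 1.0 - np.prod(1.0 - np.asarray(daily_risks))

# Illustrative placeholders only: 1e5 organisms/L in raw wastewater, 12-log
# treatment credit, 2 L/day consumption, dose-response parameter k = 0.1.
p_day = daily_infection_risk(raw_density=1e5, log_reduction=12, volume_l=2.0, k=0.1)
print(f"annual infection risk: {annual_risk([p_day] * 365):.2e}")
```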
Wada, Yoshinao; Dell, Anne; Haslam, Stuart M; Tissot, Bérangère; Canis, Kévin; Azadi, Parastoo; Bäckström, Malin; Costello, Catherine E; Hansson, Gunnar C; Hiki, Yoshiyuki; Ishihara, Mayumi; Ito, Hiromi; Kakehi, Kazuaki; Karlsson, Niclas; Hayes, Catherine E; Kato, Koichi; Kawasaki, Nana; Khoo, Kay-Hooi; Kobayashi, Kunihiko; Kolarich, Daniel; Kondo, Akihiro; Lebrilla, Carlito; Nakano, Miyako; Narimatsu, Hisashi; Novak, Jan; Novotny, Milos V; Ohno, Erina; Packer, Nicolle H; Palaima, Elizabeth; Renfrow, Matthew B; Tajiri, Michiko; Thomsson, Kristina A; Yagi, Hirokazu; Yu, Shin-Yi; Taniguchi, Naoyuki
2010-04-01
The Human Proteome Organisation Human Disease Glycomics/Proteome Initiative recently coordinated a multi-institutional study that evaluated methodologies that are widely used for defining the N-glycan content in glycoproteins. The study convincingly endorsed mass spectrometry as the technique of choice for glycomic profiling in the discovery phase of diagnostic research. The present study reports the extension of the Human Disease Glycomics/Proteome Initiative's activities to an assessment of the methodologies currently used for O-glycan analysis. Three samples of IgA1 isolated from the serum of patients with multiple myeloma were distributed to 15 laboratories worldwide for O-glycomics analysis. A variety of mass spectrometric and chromatographic procedures representative of current methodologies were used. Similar to the previous N-glycan study, the results convincingly confirmed the pre-eminent performance of MS for O-glycan profiling. Two general strategies were found to give the most reliable data, namely direct MS analysis of mixtures of permethylated reduced glycans in the positive ion mode and analysis of native reduced glycans in the negative ion mode using LC-MS approaches. In addition, mass spectrometric methodologies to analyze O-glycopeptides were also successful.
2011-06-30
things. - Gerald M. Weinberg. Author's Note: This paper is a theoretical exercise that attempts to deliver one possible Army... military and the overarching web of government agencies and international actors could approach Mexico's current issues - however, this is a purely... interact." Raj Kumar, Why Mexico's Violence is America's Problem (CNN Opinion, April 11, 2011, http://www.cnn.com/2011/OPINION/04/11
Xu, Wei
2014-01-01
This paper first discusses the major inefficiencies faced in current human factors and ergonomics (HFE) approaches: (1) delivering an optimal end-to-end user experience (UX) to users of a solution across its solution lifecycle stages; (2) strategically influencing the product business and technology capability roadmaps from a UX perspective and (3) proactively identifying new market opportunities and influencing the platform architecture capabilities on which the UX of end products relies. In response to these challenges, three case studies are presented to demonstrate how enhanced ergonomics design approaches have effectively addressed the challenges faced in current HFE approaches. Then, the enhanced ergonomics design approaches are conceptualised by a user-experience ecosystem (UXE) framework, from a UX ecosystem perspective. Finally, evidence supporting the UXE, the advantage and the formalised process for executing UXE and methodological considerations are discussed. Practitioner Summary: This paper presents enhanced ergonomics approaches to product design via three case studies to effectively address current HFE challenges by leveraging a systematic end-to-end UX approach, UX roadmaps and emerging UX associated with prioritised user needs and usages. Thus, HFE professionals can be more strategic, creative and influential.
ERIC Educational Resources Information Center
Werr, Andreas; Runsten, Philip
2013-01-01
Purpose: The current paper aims at contributing to the understanding of interorganizational knowledge integration by highlighting the role of individuals' understandings of the task and how they shape knowledge integrating behaviours. Design/methodology/approach: The paper presents a framework of knowledge integration as heedful interrelating.…
Multicultural Integration in British and Dutch Societies: Education and Citizenship
ERIC Educational Resources Information Center
Bagley, Christopher Adam; Al-Refai, Nader
2017-01-01
Purpose: The purpose of this paper is to review and synthesize published studies and practice in the "integration" of ethnic and religious minorities in Britain and The Netherlands, 1965-2015, drawing out implications for current policy and practice. Design/methodology/approach: This paper is an evaluative review and report of results of…
A Call for a More Measured Approach to Reporting and Interpreting PISA Results
ERIC Educational Resources Information Center
Rutkowski, Leslie; Rutkowski, David
2016-01-01
In the current article, we consider the influential position of the Programme for International Student Assessment (PISA) and discuss several methodological areas that demonstrate the need for caution when using and interpreting PISA results. We motivate our argument by briefly describing the program's increased influence in educational policy…
A "Career" Work Ethic versus Just a Job
ERIC Educational Resources Information Center
Porter, Gayle
2005-01-01
Purpose: To provide current information on managers' expectations of their employees, toward structuring future research on amount of time and energy devoted to work. Design/methodology/approach: Qualitative data, acquired through focus groups and interviews, provide a sample of the perceptions of 57 managers in the mid-Atlantic region of the USA…
Loose and Tight Coupling in Educational Organizations--An Integrative Literature Review
ERIC Educational Resources Information Center
Hautala, Tanja; Helander, Jaakko; Korhonen, Vesa
2018-01-01
Purpose: The purpose of this paper is to review and synthesize the attributes of loose and tight coupling in educational organizations. In addition, it is aimed to determine whether this phenomenon has value and strategies to offer for the current educational administration and research. Design/methodology/approach: Integrative literature review…
Working against Ourselves: Decision Making in a Small Rural School District
ERIC Educational Resources Information Center
Patterson, Jean A.; Koenigs, Andrew; Mohn, Gordon; Rasmussen, Cheryl
2006-01-01
Purpose: The purpose of this paper is to examine decision making and resource allocation in a small, rural district in a Midwestern state of the USA during a time of economic retrenchment. Design/methodology/approach: Qualitative case study methods were used, including focus groups and personal interviews with current and former district…
Adolescent-Friendly Technologies as Potential Adjuncts for Health Promotion
ERIC Educational Resources Information Center
Dietrich, Janan J.; Coetzee, Jenny; Otwombe, Kennedy; Hornschuh, Stefanie; Mdanda, Sanele; Nkala, Busisiwe; Makongoza, Matamela; Tshabalala, Celokhuhle; Soon, Christine N.; Kaida, Angela; Hogg, Robert; Gray, Glenda E.; Miller, Cari L.
2014-01-01
Purpose: The purpose of this paper is to measure prevalence and predictors of mobile phone access and use among adolescents in Soweto, South Africa. Design/Methodology/Approach: The current study was an interviewer-administered, cross-sectional survey among adolescents 14-19 years living in a hyper-endemic human immunodeficiency virus (HIV)…
Workplace Learning and Higher Education in Finland: Reflections on Current Practice
ERIC Educational Resources Information Center
Virolainen, Maarit
2007-01-01
Purpose: The purpose of this article is to describe the organization of workplace learning in Finnish polytechnics, the models that have been developed for this purpose, and the challenges presented. Design/methodology/approach: First, the models for embedding workplace learning in the curriculum are described and analysed. Second, the conflicting…
Social Justice and Educational Administration: Mutually Exclusive?
ERIC Educational Resources Information Center
Karpinski, Carol F.; Lugg, Catherine A.
2006-01-01
Purpose: The purpose of this article is to explore some of the current tensions within educational administration in the USA and conclude with a few cautions for educators who engage in social justice projects. Design/methodology/approach: Using a selective case, this historical essay examines the issues of social justice and equity as they have…
Promoting Hong Kong's Higher Education to Asian Markets: Market Segmentations and Strategies
ERIC Educational Resources Information Center
Cheung, Alan C. K.; Yuen, Timothy W. W.; Yuen, Celeste Y. M.; Cheng, Yin Cheong
2010-01-01
Purpose: The main purpose of this study is threefold: to analyze the current conditions of higher education services offered in the three target markets; to conduct market segmentation analysis of these markets; and to recommend the most appropriate market entry strategies for Hong Kong's education service providers. Design/methodology/approach:…
Students with Blindness Explore Chemistry at "Camp Can Do"
ERIC Educational Resources Information Center
Supalo, Cary A.; Wohlers, H. David; Humphrey, Jennifer R.
2011-01-01
Students with blindness or low vision are often discouraged from full participation in laboratory science classes due to the inadequacy of current methodological approaches and the lack of sophisticated adaptive technologies. Consequently, these students rarely go on to pursue advanced studies and employment in the sciences. In response to his own…
Trial and Error: Negotiating Manhood and Struggling to Discover True Self
ERIC Educational Resources Information Center
Foste, Zak; Edwards, Keith; Davis, Tracy
2012-01-01
Using a case study approach, this article explores how men become restricted in experiencing a full range of emotions and human potential. After reviewing current literature describing the pressures men face to conform to traditional ideologies of masculinity, the case study methodology is described, results presented, and implications for…
ERIC Educational Resources Information Center
Hill, Faith
2006-01-01
Purpose: To explore the professional interface between health promotion (HP) and complementary and alternative medicine. Design/methodology/approach: A discussion paper, based on qualitative research involving in-depth interviews with 52 participants from either side of the interface. Findings: The current interface is predominantly limited to…
Current methodology used by the US EPA for determining plant species at risk from off-site movement of pesticides has been found to be inadequate for their protection. Ten agricultural, annual, herbaceous plant species are used in the preregistration tests as representative...
Backtrack Programming: A Computer-Based Approach to Group Problem Solving.
ERIC Educational Resources Information Center
Scott, Michael D.; Bodaken, Edward M.
Backtrack problem-solving appears to be a viable alternative to current problem-solving methodologies. It appears to have considerable heuristic potential as a conceptual and operational framework for small group communication research, as well as functional utility for the student group in the small group class or the management team in the…
Inclusion of Radiation Environment Variability in Total Dose Hardness Assurance Methodology
NASA Technical Reports Server (NTRS)
Xapsos, M. A.; Stauffer, C.; Phan, A.; McClure, S. S.; Ladbury, R. L.; Pellish, J. A.; Campola, M. J.; LaBel, K. A.
2015-01-01
Variability of the space radiation environment is investigated with regard to parts categorization for total dose hardness assurance methods. It is shown that it can have a significant impact. A modified approach is developed that uses current environment models more consistently and replaces the design margin concept with one of failure probability.
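The abstract replaces design margins with a failure probability but does not spell out the calculation. One conventional way to frame it, assumed here rather than taken from the paper, is to treat the part failure dose and the mission dose as lognormal and compute the probability that the environment exceeds the part capability.

```python
import numpy as np
from scipy.stats import norm

def failure_probability(median_fail_dose, sigma_fail, median_env_dose, sigma_env):
    """Probability that mission total ionizing dose exceeds the part failure dose,
    with both treated as lognormal (doses given as medians, sigmas in ln units):
    P(fail) = Phi((ln(D_env) - ln(D_fail)) / sqrt(sigma_fail^2 + sigma_env^2))."""
    z = (np.log(median_env_dose) - np.log(median_fail_dose)) / np.hypot(sigma_fail, sigma_env)
    return norm.cdf(z)

# Hypothetical numbers: parts fail around 30 krad(Si), mission median dose 10 krad(Si).
print(f"P(failure) = {failure_probability(30.0, 0.4, 10.0, 0.5):.3f}")
```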
The Homecoming: A Review of Support Practices for Repatriates
ERIC Educational Resources Information Center
Pattie, Marshall; White, Marion M.; Tansky, Judy
2010-01-01
Purpose: The purpose of this paper is to examine the prevalence of repatriate support practices in organizations within the context of the current literature in this field of study. Design/methodology/approach: A total of 42 firms employing 3,234 expatriates were surveyed regarding human resource practices that support repatriation. Analysis…
Engaging with Employers in Work-Based Learning: A Foundation Degree in Applied Technology
ERIC Educational Resources Information Center
Benefer, Richard
2007-01-01
Purpose: This paper aims to describe the work of Staffordshire University in engaging with local employers and local further education colleges in the development of a Foundation Degree in Applied Technology. Design/methodology/approach: Following an outline of current government policy in employer engagement, the paper identifies--from the…
EMERGO: A Methodology and Toolkit for Developing Serious Games in Higher Education
ERIC Educational Resources Information Center
Nadolski, Rob J.; Hummel, Hans G. K.; van den Brink, Henk J.; Hoefakker, Ruud E.; Slootmaker, Aad; Kurvers, Hub J.; Storm, Jeroen
2008-01-01
Societal changes demand educators to apply new pedagogical approaches. Many educational stakeholders feel that serious games could play a key role in fulfilling this demand, and they lick their chops when looking at the booming industry of leisure games. However, current toolkits for developing leisure games show severe shortcomings when applied…
Corporate Universities in China: Processes, Issues and Challenges
ERIC Educational Resources Information Center
Qiao, June Xuejun
2009-01-01
Purpose: This study is intended to investigate the current status of corporate universities in China. It aims to explore the processes and practices of corporate universities in China, and discover the issues and challenges involved in building and running a corporate university in China. Design/methodology/approach: The heads of 11 well-known…
Understanding the Art and Science of Implementation in the SAAF Efficacy Trial
ERIC Educational Resources Information Center
Berkel, Cady; Murry, Velma McBride; Roulston, Kathryn J.; Brody, Gene H.
2013-01-01
Purpose: The purpose of this paper is to demonstrate the importance of considering both fidelity and adaptation in assessing the implementation of evidence-based programs. Design/methodology/approach: The current study employs a multi-method strategy to understand two dimensions of implementation (fidelity and adaptation) in the Strong African…
Exploring the Current Position of ESD in UK Higher Education Institutions
ERIC Educational Resources Information Center
Fiselier, Evelien S.; Longhurst, James W. S.; Gough, Georgina K.
2018-01-01
Purpose: The purpose of this paper is to consider the position of education for sustainable development in the UK Higher Education (HE) sector with respect to the Quality Assurance Agency (QAA) and Higher Education Academy (HEA) Guidance for education for sustainable development (ESD). Design/methodology/approach: By means of a mixed-method…
A Survey of Internship Programs for Management Undergraduates in AACSB-Accredited Institutions
ERIC Educational Resources Information Center
Kim, Eyong B.; Kim, Kijoo; Bzullak, Michael
2012-01-01
Purpose: The purpose of this paper is to survey the current status of internship programs for Management undergraduate students and to introduce a well-established internship program. Design/methodology/approach: A web page analysis was conducted on 473 institutions that have AACSB (the Association to Advance Collegiate Schools of Business)…
A GIS APPROACH FOR IDENTIFYING SPECIES AND LOCATIONS AT RISK FROM OFF-TARGET MOVEMENT OF PESTICIDES
In many countries, numerous tests are required prior to pesticide registration for the protection of human health and the environment from the unintended effects of chemical releases. Current methodology used by the US EPA for determining plant species at risk from off site movem...
ERIC Educational Resources Information Center
Brown, Brendan; Nuberg, Ian; Llewellyn, Rick
2018-01-01
Purpose: The limited uptake of improved agricultural practices in Africa raise questions on the functionality of current agricultural research systems. Our purpose is to explore the capacity for local innovation within the research systems of Ethiopia, Malawi and Mozambique. Design/Methodology/Approach: Using Conservation Agriculture (CA) as a…
Critical Review on Power in Organization: Empowerment in Human Resource Development
ERIC Educational Resources Information Center
Jo, Sung Jun; Park, Sunyoung
2016-01-01
Purpose: This paper aims to analyze current practices, discuss empowerment from the theoretical perspectives on power in organizations and suggest an empowerment model based on the type of organizational culture and the role of human resource development (HRD). Design/methodology/approach: By reviewing the classic viewpoint of power, Lukes'…
The Need for Private Universities in Japan to Be Agents of Change
ERIC Educational Resources Information Center
Zhang, Rong; McCornac, Dennis C.
2013-01-01
Purpose: The purpose of this paper is to examine a number of current innovations made by private higher educational institutions in Japan to counter decreased enrollments and financial constraints. Design/methodology/approach: The design of this study is both descriptive and conceptual, based on the latest data available. Additional information…
Strategically Focused Training in Six Sigma Way: A Case Study
ERIC Educational Resources Information Center
Pandey, Ashish
2007-01-01
Purpose: The purpose of the current study is to examine the utility of Six Sigma interventions as a performance measure and explore its applicability for making the training design and delivery operationally efficient and strategically effective. Design/methodology/approach: This is a single revelatory case study. Data were collected from multiple…
The Implementation of Sustainability Practices in Portuguese Higher Education Institutions
ERIC Educational Resources Information Center
Aleixo, Ana Marta; Azeiteiro, Ulisses; Leal, Susana
2018-01-01
Purpose: The purpose of this work is to analyze the current state of implementation of sustainability development (SD) in Portuguese higher education institutions (HEIs). Design/methodology/approach: A questionnaire was developed to measure the level of implementation of SD practices in HEIs as well as the number of rankings, certifications and…
Methodical Approaches to Teaching of Computer Modeling in Computer Science Course
ERIC Educational Resources Information Center
Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina
2015-01-01
The purpose of this study was to justify of the formation technique of representation of modeling methodology at computer science lessons. The necessity of studying computer modeling is that the current trends of strengthening of general education and worldview functions of computer science define the necessity of additional research of the…
Australian Small Business Participation in Training Activities
ERIC Educational Resources Information Center
Webster, Beverley; Walker, Elizabeth; Brown, Alan
2005-01-01
Purpose: This purpose of this paper is to investigate the use of on-line training by small businesses in Australia. It explores the relationship between the owners acceptance and use of the Internet, and their current participation in training opportunities. Design/Methodology/Approach: A sample of small businesses which had participated in an…
Do "Current" Teaching Methodologies Really Work in Every Context?
ERIC Educational Resources Information Center
Yürekli, Aynur
2017-01-01
This study examines the impact that learners have on the effective implementation of the Communicative Language Teaching Approach (CLT) in monolingual English for Academic Purposes (EAP) classes in a country where English is taught as a foreign rather than a second language. Based on recorded language lessons from four different learner groups, it…
Progressive Pedagogies and Teacher Education: A Review of the Literature
ERIC Educational Resources Information Center
Webber, Geoff; Miller, Dianne
2016-01-01
Few studies take up the question of how to teach pre-service or current teachers to practice integrated, interdisciplinary, and inquiry-based methodologies. In this literature review, scholarly research is explored to examine approaches to teacher education based in progressivism. Place- and community-based education is considered as an important…
Transportation Sustainability on a University Campus
ERIC Educational Resources Information Center
Kaplan, David H.
2015-01-01
Purpose: This paper aims to show the present level of sustainable transportation, mainly walking and bicycling, on a large campus in the US Midwest and then analyzes some of the opportunities and impediments in increasing the modal share. Design/methodology/approach: Three types of analysis are used. First, current level of walking and bicycling…
Macro-economic assessment of flood risk in Italy under current and future climate
NASA Astrophysics Data System (ADS)
Carrera, Lorenzo; Koks, Elco; Mysiak, Jaroslav; Aerts, Jeroen; Standardi, Gabriele
2014-05-01
This paper explores an integrated methodology for assessing direct and indirect costs of fluvial flooding to estimate current and future fluvial flood risk in Italy. Our methodology combines a Geographic Information System spatial approach with a general economic equilibrium approach using a downscaled, modified version of a Computable General Equilibrium model at NUTS2 scale. Given the level of uncertainty in the behavior of disaster-affected economies, the simulation considers a wide range of business recovery periods. We calculate expected annual losses for each NUTS2 region, and exceedance probability curves to determine probable maximum losses. Given a certain acceptable level of risk, we describe the conditions of flood protection and business recovery periods under which losses are contained within this limit. Because of the difference between direct costs, which are an overestimation of stock losses, and indirect costs, which represent the macro-economic effects, our results have different policy meanings. While the former is relevant for post-disaster recovery, the latter is more relevant for public policy issues, particularly for cost-benefit analysis and resilience assessment.
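The expected annual losses and probable maximum losses mentioned above follow from integrating a loss-exceedance curve over probability. The sketch below shows one standard numerical form of that integration over a small set of return-period scenarios; the loss figures are placeholders, not results from the study.

```python
import numpy as np

def expected_annual_loss(return_periods, losses):
    """Integrate a loss-exceedance curve, EAL = integral of L(p) dp, with exceedance
    probability p = 1/return period; contributions beyond the most and least
    extreme scenarios supplied are ignored."""
    p = 1.0 / np.asarray(return_periods, dtype=float)
    loss = np.asarray(losses, dtype=float)
    order = np.argsort(p)                 # integrate over increasing probability
    return np.trapz(loss[order], p[order])

# Hypothetical scenario losses (million EUR) for 10-, 50-, 100- and 500-year floods.
rp = [10, 50, 100, 500]
loss = [120.0, 800.0, 1500.0, 4200.0]
print(f"expected annual loss ≈ {expected_annual_loss(rp, loss):.1f} M EUR/yr")
```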
Enhanced methodology for porting ion chromatography retention data.
Park, Soo Hyun; Shellie, Robert A; Dicinoski, Greg W; Schuster, Georg; Talebi, Mohammad; Haddad, Paul R; Szucs, Roman; Dolan, John W; Pohl, Christopher A
2016-03-04
Porting is a powerful methodology to recalibrate an existing database of ion chromatography (IC) retention times by reflecting the changes of column behavior resulting from either batch-to-batch variability in the production of the column or the manufacture of new versions of a column. This approach has been employed to update extensive databases of retention data of inorganic and organic anions forming part of the "Virtual Column" software marketed by Thermo Fisher Scientific, which is the only available commercial optimization tool for IC separation. The current porting process is accomplished by performing three isocratic separations with two representative analyte ions in order to derive a porting equation which expresses the relationship between old and new data. Although the accuracy of retention prediction is generally enhanced on new columns, errors were observed on some columns. In this work, the porting methodology was modified in order to address this issue, where the porting equation is now derived by using six representative analyte ions (chloride, bromide, iodide, perchlorate, sulfate, and thiosulfate). Additionally, the updated porting methodology has been applied on three Thermo Fisher Scientific columns (AS20, AS19, and AS11HC). The proposed approach showed that the new porting methodology can provide more accurate and robust retention prediction on a wide range of columns, where average errors in retention times for ten test anions under three eluent conditions were less than 1.5%. Moreover, the retention prediction using this new approach provided an acceptable level of accuracy on a used column exhibiting changes in ion-exchange capacity. Crown Copyright © 2016. Published by Elsevier B.V. All rights reserved.
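The exact form of the porting equation is not given in the abstract. As an assumption, the sketch below fits a simple linear mapping between stored and newly measured retention times of the six representative anions and applies it to the rest of the database; the retention times shown are placeholders, not data from the study.

```python
import numpy as np

# Retention times (min) of the six representative anions on the old (database)
# column and the new column being calibrated -- placeholder values.
anions = ["chloride", "bromide", "iodide", "perchlorate", "sulfate", "thiosulfate"]
t_old = np.array([3.2, 4.1, 7.8, 12.5, 5.6, 9.9])
t_new = np.array([3.0, 3.9, 7.2, 11.6, 5.3, 9.2])

# Least-squares fit of an assumed linear porting equation t_new = a * t_old + b.
a, b = np.polyfit(t_old, t_new, deg=1)
print(f"porting equation: t_new = {a:.3f} * t_old + {b:.3f}")

def port(t_database):
    """Recalibrate a stored retention time to the new column."""
    return a * t_database + b

print(f"an anion stored at 6.4 min ports to {port(6.4):.2f} min")
```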
Roibás, Laura; Loiseau, Eléonore; Hospido, Almudena
2018-07-01
In a previous study, the carbon footprint (CF) of all production and consumption activities of Galicia, an Autonomous Community located in the north-west of Spain, was determined and the results were used to devise strategies aimed at the reduction and mitigation of greenhouse gas (GHG) emissions. The territorial LCA methodology was used there to perform the calculations. However, that methodology was initially designed to compute the emissions of all types of polluting substances to the environment (several thousands of substances considered in the life cycle inventories), aimed at performing complete LCA studies. This requirement implies the use of specific modelling approaches and databases that in turn raised some difficulties, i.e., the need for large amounts of data (which increased gathering times), low temporal, geographical and technological representativeness of the study, lack of data, and presence of double counting issues when trying to combine the sectorial CF results into those of the total economy. In view of these difficulties, and considering the need to focus only on GHG emissions, it seems important to improve the robustness of the CF computation while proposing a simplified methodology. This study is the result of those efforts to improve the aforementioned methodology. In addition to the territorial LCA approach, several Input-Output (IO) based alternatives have been used here to compute direct and indirect GHG emissions of all Galician production and consumption activities. The results of the different alternatives were compared and evaluated under a multi-criteria approach considering reliability, completeness, temporal and geographical correlation, applicability and consistency. Based on that, an improved and simplified methodology was proposed to determine the CF of the Galician consumption and production activities from a total responsibility perspective. This methodology adequately reflects the current characteristics of the Galician economy, thus increasing the representativeness of the results, and can be applied to any region in which IO tables and environmental vectors are available. This methodology could thus provide useful information in decision-making processes to reduce and prevent GHG emissions. Copyright © 2018 Elsevier Ltd. All rights reserved.
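The Input-Output alternatives mentioned above rest on the standard environmentally extended IO identity, in which emissions embodied in final demand are obtained through the Leontief inverse. A minimal sketch with a toy two-sector economy (all numbers hypothetical) follows.

```python
import numpy as np

# Toy 2-sector economy (hypothetical): technical coefficients A,
# final demand y (M EUR), and direct emission intensities e (ktCO2e per M EUR).
A = np.array([[0.15, 0.25],
              [0.10, 0.05]])
y = np.array([500.0, 300.0])
e = np.array([0.8, 0.3])

# Leontief inverse: total output x = (I - A)^-1 y
L = np.linalg.inv(np.eye(2) - A)
x = L @ y

# Footprint: direct plus indirect emissions embodied in final demand.
footprint_by_sector = e * x
print(f"total output by sector: {x.round(1)}")
print(f"consumption-based footprint: {footprint_by_sector.sum():.1f} ktCO2e")
```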
Methods for measuring denitrification: Diverse approaches to a difficult problem
Groffman, Peter M; Altabet, Mary A.; Böhlke, J.K.; Butterbach-Bahl, Klaus; David, Mary B.; Firestone, Mary K.; Giblin, Anne E.; Kana, Todd M.; Nielsen , Lars Peter; Voytek, Mary A.
2006-01-01
Denitrification, the reduction of the nitrogen (N) oxides, nitrate (NO3−) and nitrite (NO2−), to the gases nitric oxide (NO), nitrous oxide (N2O), and dinitrogen (N2), is important to primary production, water quality, and the chemistry and physics of the atmosphere at ecosystem, landscape, regional, and global scales. Unfortunately, this process is very difficult to measure, and existing methods are problematic for different reasons in different places at different times. In this paper, we review the major approaches that have been taken to measure denitrification in terrestrial and aquatic environments and discuss the strengths, weaknesses, and future prospects for the different methods. Methodological approaches covered include (1) acetylene-based methods, (2) 15N tracers, (3) direct N2 quantification, (4) N2:Ar ratio quantification, (5) mass balance approaches, (6) stoichiometric approaches, (7) methods based on stable isotopes, (8) in situ gradients with atmospheric environmental tracers, and (9) molecular approaches. Our review makes it clear that the prospects for improved quantification of denitrification vary greatly in different environments and at different scales. While current methodology allows for the production of accurate estimates of denitrification at scales relevant to water and air quality and ecosystem fertility questions in some systems (e.g., aquatic sediments, well-defined aquifers), methodology for other systems, especially upland terrestrial areas, still needs development. Comparison of mass balance and stoichiometric approaches that constrain estimates of denitrification at large scales with point measurements (made using multiple methods), in multiple systems, is likely to propel more improvement in denitrification methods over the next few years.
Approaches to Children’s Exposure Assessment: Case Study with Diethylhexylphthalate (DEHP)
Ginsberg, Gary; Ginsberg, Justine; Foos, Brenda
2016-01-01
Children’s exposure assessment is a key input into epidemiology studies, risk assessment and source apportionment. The goals of this article are to describe a methodology for children’s exposure assessment that can be used for these purposes and to apply the methodology to source apportionment for the case study chemical, diethylhexylphthalate (DEHP). A key feature is the comparison of total (aggregate) exposure calculated via a pathways approach to that derived from a biomonitoring approach. The 4-step methodology and its results for DEHP are: (1) Prioritization of life stages and exposure pathways, with pregnancy, breast-fed infants, and toddlers the focus of the case study and pathways selected that are relevant to these groups; (2) Estimation of pathway-specific exposures by life stage, wherein diet was found to be the largest contributor for pregnant women; breast milk and mouthing behavior for the nursing infant; and diet, house dust, and mouthing for toddlers; (3) Comparison of aggregate exposure by pathways vs biomonitoring-based approaches, wherein good concordance was found for toddlers and pregnant women, providing confidence in the exposure assessment; (4) Source apportionment, in which DEHP presence in foods, children’s products, consumer products and the built environment is discussed with respect to early life mouthing, house dust and dietary exposure. A potential fifth step of the method involves the calculation of exposure doses for risk assessment, which is described but is outside the scope of the current case study. In summary, the methodology has been used to synthesize the available information to identify key sources of early life exposure to DEHP. PMID:27376320
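Steps 2 and 3 of the methodology hinge on summing pathway-specific doses and comparing the aggregate with a biomonitoring-derived estimate. The sketch below shows only the pathway-summing arithmetic for a toddler scenario; the concentrations, intake rates, and body weight are placeholder assumptions, not figures from the case study.

```python
# Pathway-specific daily intakes are summed and normalized by body weight:
# aggregate dose (µg/kg-day) = sum_i(C_i * IR_i) / BW.  All values below are
# hypothetical placeholders for a toddler, not figures from the case study.
body_weight_kg = 12.0

pathways = {
    # pathway: (DEHP concentration, daily intake rate) -> product in µg/day
    "diet":       (0.10, 900.0),   # µg/g food * g food/day
    "house dust": (500.0, 0.06),   # µg/g dust * g dust ingested/day
    "mouthing":   (2.0,   10.0),   # µg/hour of mouthing * hours/day
}

def aggregate_dose(pathways, bw_kg):
    """Aggregate daily dose across exposure pathways (µg/kg-day)."""
    return sum(conc * rate for conc, rate in pathways.values()) / bw_kg

print(f"aggregate DEHP dose ≈ {aggregate_dose(pathways, body_weight_kg):.1f} µg/kg-day")
```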
A methodology to assess the economic impact of power storage technologies.
El-Ghandour, Laila; Johnson, Timothy C
2017-08-13
We present a methodology for assessing the economic impact of power storage technologies. The methodology is founded on classical approaches to the optimal stopping of stochastic processes but involves an innovation that circumvents the need to identify, ex ante, the form of a driving process and works directly on observed data, avoiding model risks. Power storage is regarded as a complement to the intermittent output of renewable energy generators and is therefore important in contributing to the reduction of carbon-intensive power generation. Our aim is to present a methodology suitable for use by policy makers that is simple to maintain, adaptable to different technologies and easy to interpret. The methodology has benefits over current techniques and is able to value, by identifying a viable optimal operational strategy, a conceived storage facility based on compressed air technology operating in the UK. This article is part of the themed issue 'Energy management: flexibility, risk and optimization'. © 2017 The Author(s).
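The optimal-stopping construction at the heart of the methodology is not reproduced here. Purely as a crude, model-free point of reference and not the paper's method, the sketch below values a storage facility with a naive quantile-threshold charge/discharge rule applied directly to an observed price series; the prices, capacity, power, and efficiency are hypothetical.

```python
import numpy as np

def threshold_arbitrage_value(prices, capacity_mwh, power_mw, efficiency,
                              low_q=0.25, high_q=0.75):
    """Naive baseline: charge when the hourly price is below its lower quantile,
    discharge when above the upper quantile, subject to capacity and power limits.
    Returns revenue over the observed price series."""
    prices = np.asarray(prices, dtype=float)
    low, high = np.quantile(prices, [low_q, high_q])
    stored, revenue = 0.0, 0.0
    for p in prices:
        if p <= low and stored < capacity_mwh:
            buy = min(power_mw, capacity_mwh - stored)
            stored += buy * efficiency          # round-trip losses applied on charging
            revenue -= buy * p
        elif p >= high and stored > 0.0:
            sell = min(power_mw, stored)
            stored -= sell
            revenue += sell * p
    return revenue

# Hypothetical day of hourly prices (GBP/MWh) for a 10 MWh / 2 MW facility.
rng = np.random.default_rng(0)
prices = 50 + 20 * np.sin(np.linspace(0, 2 * np.pi, 24)) + rng.normal(0, 5, 24)
print(f"baseline arbitrage revenue: {threshold_arbitrage_value(prices, 10, 2, 0.75):.0f} GBP")
```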
Noctor, Graham; Mhamdi, Amna; Foyer, Christine H
2016-05-01
Oxidative stress and reactive oxygen species (ROS) are common to many fundamental responses of plants. Enormous and ever-growing interest has focused on this research area, leading to an extensive literature that documents the tremendous progress made in recent years. As in other areas of plant biology, advances have been greatly facilitated by developments in genomics-dependent technologies and the application of interdisciplinary techniques that generate information at multiple levels. At the same time, advances in understanding ROS are fundamentally reliant on the use of biochemical and cell biology techniques that are specific to the study of oxidative stress. It is therefore timely to revisit these approaches with the aim of providing a guide to convenient methods and assisting interested researchers in avoiding potential pitfalls. Our critical overview of currently popular methodologies includes a detailed discussion of approaches used to generate oxidative stress, measurements of ROS themselves, determination of major antioxidant metabolites, assays of antioxidative enzymes and marker transcripts for oxidative stress. We consider the applicability of metabolomics, proteomics and transcriptomics approaches and discuss markers such as damage to DNA and RNA. Our discussion of current methodologies is firmly anchored to future technological developments within this popular research field. © 2016 John Wiley & Sons Ltd.
Design Science Methodology Applied to a Chemical Surveillance Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Zhuanyi; Han, Kyungsik; Charles-Smith, Lauren E.
Public health surveillance systems gain significant benefits from integrating existing early incident detection systems, supported by closed data sources, with open source data. However, identifying potential alerting incidents relies on finding accurate, reliable sources and presenting the high volume of data in a way that increases analysts' work efficiency; a challenge for any system that leverages open source data. In this paper, we present the design concept and the applied design science research methodology of ChemVeillance, a chemical analyst surveillance system. Our work portrays a system design and approach that translates theoretical methodology into practice, creating a powerful surveillance system built for specific use cases. Researchers, designers, developers, and related professionals in the health surveillance community can build upon the principles and methodology described here to enhance and broaden current surveillance systems, leading to improved situational awareness based on a robust integrated early warning system.
A methodology aimed at fostering and sustaining the development processes of an IE-based industry
NASA Astrophysics Data System (ADS)
Corallo, Angelo; Errico, Fabrizio; de Maggio, Marco; Giangreco, Enza
In the current competitive scenario, where business relationships are fundamental in building successful business models and inter/intra organizational business processes are progressively digitalized, an end-to-end methodology is required that is capable of guiding business networks through the Internetworked Enterprise (IE) paradigm: a new and innovative organizational model able to leverage Internet technologies to perform real-time coordination of intra and inter-firm activities, to create value by offering innovative and personalized products/services and reduce transaction costs. This chapter presents the TEKNE project Methodology of change that guides business networks, by means of a modular and flexible approach, towards the IE techno-organizational paradigm, taking into account the competitive environment of the network and how this environment influences its strategic, organizational and technological levels. Contingency, the business model, enterprise architecture and performance metrics are the key concepts that form the cornerstone of this methodological framework.
Intelligent Automation Approach for Improving Pilot Situational Awareness
NASA Technical Reports Server (NTRS)
Spirkovska, Lilly
2004-01-01
Automation in the aviation domain has been increasing for the past two decades. Pilot reaction to automation varies from highly favorable to highly critical depending on both the pilot's background and how effectively the automation is implemented. We describe a user-centered approach for automation that considers the pilot's tasks and his needs related to accomplishing those tasks. Further, we augment rather than replace how the pilot currently fulfills his goals, relying on redundant displays that offer the pilot an opportunity to build trust in the automation. Our prototype system automates the interpretation of hydraulic system faults of the UH-60 helicopter. We describe the problem with the current system and our methodology for resolving it.
Asan, Esther; Drenckhahn, Detlev
2008-12-01
Investigations of cell and tissue structure and function using innovative methods and approaches have again yielded numerous exciting findings in recent months and have added important data to current knowledge, inspiring new ideas and hypotheses in various fields of modern life sciences. Topics and contents of comprehensive expert reviews covering different aspects in methodological advances, cell biology, tissue function and morphology, and novel findings reported in original papers are summarized in the present review.
Recent Trends in Quantum Chemical Modeling of Enzymatic Reactions.
Himo, Fahmi
2017-05-24
The quantum chemical cluster approach is a powerful method for investigating enzymatic reactions. Over the past two decades, a large number of highly diverse systems have been studied and a great wealth of mechanistic insight has been developed using this technique. This Perspective reviews the current status of the methodology. The latest technical developments are highlighted, and challenges are discussed. Some recent applications are presented to illustrate the capabilities and progress of this approach, and likely future directions are outlined.
NASA Astrophysics Data System (ADS)
Adhikari, Pashupati Raj
Materials selection is among the most important aspects of product design and development. A knowledge-based system (KBS) and some of the methodologies used in materials selection for the design of aircraft cabin metallic structures are discussed. Overall aircraft weight reduction means substantially less fuel consumption. Part of the solution to this problem is to find a way to reduce the overall weight of the metallic structures inside the cabin. Among the various materials selection methodologies based on Multi-Criteria Decision Making (MCDM) techniques, a few are demonstrated with examples and the results are compared with those obtained using Ashby's approach to materials selection. Pre-defined constraint values, mainly mechanical properties, are employed as the relevant attributes in the process. Aluminum alloys with high strength-to-weight ratios have been second to none in most aircraft parts manufacturing. Magnesium alloys, which are much lighter than the Al-alloys currently used in these structures, are tested as alternatives using the methodologies, and the ranked results are compared. Each material attribute considered in the design is categorized as either a benefit or a non-benefit attribute. Using Ashby's approach, the material indices that must be maximized for optimum performance are determined, and materials are ranked based on the average of the consolidated index rankings. The ranking results are compared for any disparity among the methodologies.
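As a rough illustration of the comparison described above, the sketch below ranks a few candidate alloys with a simple weighted-sum MCDM score over normalized attributes and with an Ashby-type specific-strength index for a light, strong tension member; the property values and weights are placeholders, not those used in the thesis.

```python
import numpy as np

materials = ["Al 7075-T6", "Al 2024-T3", "Mg AZ31B", "Mg ZK60A"]
# Columns: yield strength (MPa, benefit), density (g/cm3, non-benefit),
# elastic modulus (GPa, benefit).  Placeholder property values.
props = np.array([
    [500.0, 2.81, 71.0],
    [345.0, 2.78, 73.0],
    [200.0, 1.77, 45.0],
    [285.0, 1.83, 45.0],
])
benefit = np.array([True, False, True])
weights = np.array([0.5, 0.3, 0.2])

# Linear normalization: benefit attributes scaled by the column maximum,
# non-benefit attributes by minimum/value, so higher is always better.
norm = np.where(benefit, props / props.max(axis=0), props.min(axis=0) / props)
scores = norm @ weights

# Ashby index for a light, strong tension member: sigma_y / rho (to be maximized).
ashby = props[:, 0] / props[:, 1]

for name, s, a in sorted(zip(materials, scores, ashby), key=lambda t: -t[1]):
    print(f"{name:12s}  weighted score {s:.3f}   sigma/rho {a:.0f}")
```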
Current Trends in Modeling Research for Turbulent Aerodynamic Flows
NASA Technical Reports Server (NTRS)
Gatski, Thomas B.; Rumsey, Christopher L.; Manceau, Remi
2007-01-01
The engineering tools of choice for the computation of practical engineering flows have begun to migrate from those based on the traditional Reynolds-averaged Navier-Stokes approach to methodologies capable, in theory if not in practice, of accurately predicting some instantaneous scales of motion in the flow. The migration has largely been driven by both the success of Reynolds-averaged methods over a wide variety of flows as well as the inherent limitations of the method itself. Practitioners, emboldened by their ability to predict a wide-variety of statistically steady, equilibrium turbulent flows, have now turned their attention to flow control and non-equilibrium flows, that is, separation control. This review gives some current priorities in traditional Reynolds-averaged modeling research as well as some methodologies being applied to a new class of turbulent flow control problems.
What Synthesis Methodology Should I Use? A Review and Analysis of Approaches to Research Synthesis
Schick-Makaroff, Kara; MacDonald, Marjorie; Plummer, Marilyn; Burgess, Judy; Neander, Wendy
2016-01-01
Background When we began this process, we were doctoral students and a faculty member in a research methods course. As students, we were facing a review of the literature for our dissertations. We encountered several different ways of conducting a review but were unable to locate any resources that synthesized all of the various synthesis methodologies. Our purpose is to present a comprehensive overview and assessment of the main approaches to research synthesis. We use ‘research synthesis’ as a broad overarching term to describe various approaches to combining, integrating, and synthesizing research findings. Methods We conducted an integrative review of the literature to explore the historical, contextual, and evolving nature of research synthesis. We searched five databases, reviewed websites of key organizations, hand-searched several journals, and examined relevant texts from the reference lists of the documents we had already obtained. Results We identified four broad categories of research synthesis methodology including conventional, quantitative, qualitative, and emerging syntheses. Each of the broad categories was compared to the others on the following: key characteristics, purpose, method, product, context, underlying assumptions, unit of analysis, strengths and limitations, and when to use each approach. Conclusions The current state of research synthesis reflects significant advancements in emerging synthesis studies that integrate diverse data types and sources. New approaches to research synthesis provide a much broader range of review alternatives available to health and social science students and researchers. PMID:29546155
Cognitive training and plasticity: Theoretical perspective and methodological consequences
Willis, Sherry L.; Schaie, K. Warner
2013-01-01
Purpose To provide an overview of cognitive plasticity concepts and findings from a lifespan developmental perspective. Methods After an evaluation of the general concept of cognitive plasticity, the most important approaches to study behavioral and brain plasticity are reviewed. This includes intervention studies, experimental approaches, cognitive trainings, the study of facilitating factors for strategy learning and strategy use, practice, and person-environment interactions. Transfer and durability of training-induced plasticity is discussed. Results The review indicates that methodological and conceptual advances are needed to improve the match between levels of behavioral and brain plasticity targeted in current developmental research and study designs. Conclusions The results suggest that the emphasis of plasticity studies on treatment effectiveness needs to be complemented by a strong commitment to the grounding of the intervention in a conceptual framework. PMID:19847065
NASA Astrophysics Data System (ADS)
García-Santos, Glenda; Madruga de Brito, Mariana; Höllermann, Britta; Taft, Linda; Almoradie, Adrian; Evers, Mariele
2018-06-01
Understanding the interactions between water resources and its social dimensions is crucial for an effective and sustainable water management. The identification of sensitive control variables and feedback loops of a specific human-hydro-scape can enhance the knowledge about the potential factors and/or agents leading to the current water resources and ecosystems situation, which in turn supports the decision-making process of desirable futures. Our study presents the utility of a system dynamics modeling approach for water management and decision-making for the case of a forest ecosystem under risk of wildfires. We use the pluralistic water research concept to explore different scenarios and simulate the emergent behaviour of water interception and net precipitation after a wildfire in a forest ecosystem. Through a case study, we illustrate the applicability of this new methodology.
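As a rough sketch of the kind of stock-and-flow behaviour described, and an assumption about model structure rather than the authors' actual model, the code below steps a simple canopy state through time: leaf area recovers logistically after a wildfire, interception scales with canopy cover, and net precipitation emerges as gross rainfall minus interception.

```python
import numpy as np

def simulate_net_precip(years, rain_mm_yr, lai_pre_fire, recovery_rate, k=0.2):
    """Toy system-dynamics loop: LAI recovers logistically after a fire;
    annual interception scales with canopy cover 1 - exp(-k * LAI)."""
    lai = 0.1 * lai_pre_fire                       # canopy largely lost in the fire
    net = []
    for _ in range(years):
        cover = 1.0 - np.exp(-k * lai)
        interception = 0.25 * rain_mm_yr * cover   # hypothetical max 25% of rainfall
        net.append(rain_mm_yr - interception)
        lai += recovery_rate * lai * (1.0 - lai / lai_pre_fire)  # logistic regrowth
    return np.array(net)

net_precip = simulate_net_precip(years=15, rain_mm_yr=900, lai_pre_fire=5.0,
                                 recovery_rate=0.6)
print(net_precip.round(0))
```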
Cues and concerns by patients in medical consultations: a literature review.
Zimmermann, Christa; Del Piccolo, Lidia; Finset, Arnstein
2007-05-01
The aim of the current article is to review the peer-reviewed research literature on cues and concerns published between 1975 and 2006. To be included, articles had to report observational studies based on patient-physician consultations and report findings on patient expressions of cues and/or concerns. Quantitative and qualitative studies from different medical settings were considered. Fifty-eight original articles based on the analysis of audio- or videotaped medical consultations were tracked down. Definition of cues and concerns and methodological approaches differed widely. Physicians missed most cues and concerns and adopted behaviors that discouraged disclosure. Communication training improved the detection of cues and concerns. Future research progress would require different methodological approaches more appropriate for studying verbal interactions and the complexity of the various levels that influence interactions. (c) 2007 APA, all rights reserved
NASA Astrophysics Data System (ADS)
Fogarty, Aoife C.; Potestio, Raffaello; Kremer, Kurt
2015-05-01
A fully atomistic modelling of many biophysical and biochemical processes at biologically relevant length- and time scales is beyond our reach with current computational resources, and one approach to overcome this difficulty is the use of multiscale simulation techniques. In such simulations, when system properties necessitate a boundary between resolutions that falls within the solvent region, one can use an approach such as the Adaptive Resolution Scheme (AdResS), in which solvent particles change their resolution on the fly during the simulation. Here, we apply the existing AdResS methodology to biomolecular systems, simulating a fully atomistic protein with an atomistic hydration shell, solvated in a coarse-grained particle reservoir and heat bath. Using as a test case an aqueous solution of the regulatory protein ubiquitin, we first confirm the validity of the AdResS approach for such systems, via an examination of protein and solvent structural and dynamical properties. We then demonstrate how, in addition to providing a computational speedup, such a multiscale AdResS approach can yield otherwise inaccessible physical insights into biomolecular function. We use our methodology to show that protein structure and dynamics can still be correctly modelled using only a few shells of atomistic water molecules. We also discuss aspects of the AdResS methodology peculiar to biomolecular simulations.
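As background to the resolution-switching idea, the sketch below shows the kind of smooth weighting function adaptive resolution schemes use to interpolate between atomistic and coarse-grained descriptions. The cos² form is a commonly used choice in AdResS implementations, but the region sizes and the simple radial geometry are illustrative assumptions rather than the setup of this study.

```python
# Illustrative sketch of the smooth resolution-switching weight used in adaptive
# resolution (AdResS-type) schemes: molecules are fully atomistic (w=1) within a
# radius r_at of the protein, fully coarse-grained (w=0) beyond r_at + d_hyb, and
# interpolated in between. Radii below are placeholders, not values from the study.
import math

def resolution_weight(r, r_at=1.2, d_hyb=0.9):
    """Weight w(r) for a solvent molecule at distance r (nm) from the atomistic centre."""
    if r <= r_at:
        return 1.0                                   # atomistic region
    if r >= r_at + d_hyb:
        return 0.0                                   # coarse-grained reservoir
    return math.cos(math.pi * (r - r_at) / (2.0 * d_hyb)) ** 2   # hybrid region

# Pair forces are then blended as F = w_i * w_j * F_atomistic + (1 - w_i * w_j) * F_cg
print([round(resolution_weight(r), 2) for r in (0.5, 1.3, 1.7, 2.0, 2.5)])
```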
The Role of Teacher Imagination in Conceptualising the Child as a Second Language Learner
ERIC Educational Resources Information Center
Guz, Ewa; Tetiurka, Małgorzata
2013-01-01
In order to initiate and maintain meaningful interaction in a young learner L2 classroom, an adult teacher needs to approach children in ways consistent with their developmental profile and adjust teaching methodology so as to accommodate young learners' current skills. This requires the ability to predict the child's possible responses to…
ERIC Educational Resources Information Center
Poza-Lujan, Jose-Luis; Calafate, Carlos T.; Posadas-Yagüe, Juan-Luis; Cano, Juan-Carlos
2016-01-01
Current opinion on undergraduate studies has led to a reformulation of teaching methodologies to base them not just on learning, but also on skills and competencies. In this approach, the teaching/learning process should accomplish both knowledge assimilation and skill development. Previous works demonstrated that a strategy that uses continuous…
ERIC Educational Resources Information Center
Lane, Tonisha B.
2016-01-01
The current study used a case study methodological approach, including document analysis, semistructured interviews, and participant observations, to investigate how a science, technology, engineering, and mathematics (STEM) enrichment program supported retention and degree attainment of underrepresented students at a large, public, predominantly…
The Future of Teaching Research in the Social Sciences
ERIC Educational Resources Information Center
Wagner, C.
2009-01-01
Current literature on teaching research methodology in the social sciences highlights the changing nature of our world in terms of its complexity and diversity, and points to how this affects the way in which we search for answers to related problems (Brew 2003, 3; Tashakkori and Teddlie 2003, 74). New ways of approaching research problems that…
Solar Electricity Generation: Issues of Development and Impact on ICT Implementation in Africa
ERIC Educational Resources Information Center
Damasen, Ikwaba Paul
2013-01-01
Purpose: The purpose of this paper is to examine and discuss, in-depth, how solar electricity can be developed and used to tackle grid electricity-related problems in African countries suffering from unreliable and inadequate grid electricity. Design/methodology/approach: The paper discusses in depth the current status of grid electricity in…
ERIC Educational Resources Information Center
Eseryel, Deniz; Schuver-van Blanken, Marian J.; Spector, J. Michael
ADAPT[IT] (Advanced Design Approach for Personalized Training-Interactive Tools) is a European project coordinated by the Dutch National Aerospace Laboratory. The aim of ADAPT[IT] is to create and validate an effective training design methodology, based on cognitive science and leading to the integration of advanced technologies, so that the…
Principals' Conceptions of Their Current Power Basis Revealed through Phenomenography
ERIC Educational Resources Information Center
Özaslan, Gökhan
2018-01-01
Purpose: The purpose of this paper is to describe the variations in the ways that principals conceptualize their basis of power in schools. Design/methodology/approach: Phenomenography was used as the research method of this study. The interviewees consisted of 16 principals, eight from public schools and eight from private schools. Findings: The…
ERIC Educational Resources Information Center
Wisecup, Allison K.; Grady, Dennis; Roth, Richard A.; Stephens, Julio
2017-01-01
Purpose: The purpose of this study was to determine whether, and how, electricity consumption by students in university residence halls was impacted through three intervention strategies. Design/methodology/approach: The current investigation uses a quasi-experimental design by exposing freshman students in four matched residence halls and the…
Assessment--Enabling Participation in Academic Discourse and the Implications
ERIC Educational Resources Information Center
Bayaga, Anass; Wadesango, Newman
2013-01-01
The current study was an exploration of how to develop assessment resources and processes via in-depth interviews with 30 teachers. The focus was on how teachers use and apply different assessment situations. The methodology, which was a predominately qualitative approach and adopted case study design, sought to use a set of criteria based on…
Getting People Involved: The Benefit of Intellectual Capital Management for Addressing HR Challenges
ERIC Educational Resources Information Center
Pook, Katja
2011-01-01
Purpose: This paper aims to explore the benefits of intellectual capital assessment for facing current challenges of human resources work and organizational development. Design/methodology/approach: The paper takes findings of studies on challenges in HR work and maps them with features of intellectual capital assessment methods. It is thus a…
How Do Management Students Perceive the Quality of Education in Public Institutions?
ERIC Educational Resources Information Center
Narang, Ritu
2012-01-01
Purpose: Keeping in mind the urgent need to deliver quality education in higher education institutes, the current paper seeks to measure the quality perception of management students in India. Design/methodology/approach: Based on an exploratory study a modified version of SERVQUAL was employed as the research instrument. Data were collected from…
The Bologna Process in Higher Education: An Exploratory Case Study in a Russian Context
ERIC Educational Resources Information Center
Esyutina, Maria; Fearon, Colm; Leatherbarrow, Nicky
2013-01-01
Purpose: The aim of the current article is to discuss the role of the Bologna process in enabling quality of educational change, internationalisation and greater mobility using an example case study of a Russian university. Some discussion is provided to offer insights and inform future research and practice. Design/methodology/approach: The…
Supporting the Research Process through Expanded Library Data Services
ERIC Educational Resources Information Center
Wang, Minglu
2013-01-01
Purpose: The purpose of this paper is to describe how the authors gained a better understanding of the variety of library users' data needs, and how gradually some new data services were established based on current capabilities. Design/methodology/approach: This paper uses a case study of the new data services at the John Cotton Dana Library, at…
ERIC Educational Resources Information Center
Cheung, Alan C. K.
2013-01-01
Purpose: The purpose of this paper is to examine language, academic, social-cultural and financial adjustments facing mainland Chinese students in Hong Kong. Design/methodology/approach: The current study employed both quantitative and qualitative methods and included over 300 mainland Chinese students from seven major universities in Hong Kong.…
Rural Schools, Social Capital and the Big Society: A Theoretical and Empirical Exposition
ERIC Educational Resources Information Center
Bagley, Carl; Hillyard, Sam
2014-01-01
The paper commences with a theoretical exposition of the current UK government's policy commitment to the idealised notion of the Big Society and the social capital currency underpinning its formation. The paper positions this debate in relation to the rural and adopts an ethnographically-informed methodological approach to provide an in-depth…
Inclusion of Radiation Environment Variability in Total Dose Hardness Assurance Methodology
NASA Technical Reports Server (NTRS)
Xapsos, M. A.; Stauffer, C.; Phan, A.; McClure, S. S.; Ladbury, R. L.; Pellish, J. A.; Campola, M. J.; LaBel, K. A.
2016-01-01
Variability of the space radiation environment is investigated with regard to parts categorization for total dose hardness assurance methods. It is shown that it can have a significant impact. A modified approach is developed that uses current environment models more consistently and replaces the radiation design margin concept with one of failure probability during a mission.
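To make the failure-probability idea concrete, the sketch below estimates the chance that a part's total-dose failure level is exceeded by the mission dose when both are treated as distributions. The lognormal shapes and all numerical parameters are assumptions for illustration only, not values from the method described above.

```python
# Minimal sketch of the "failure probability during a mission" idea: instead of a
# fixed radiation design margin, estimate P(part failure dose < mission dose) by
# Monte Carlo, treating both the part failure level and the environment as
# distributions. Distribution shapes and parameters are illustrative assumptions.
import math
import random

def failure_probability(n=100_000,
                        fail_median_krad=30.0, fail_sigma_ln=0.4,   # part-to-part spread
                        env_median_krad=10.0, env_sigma_ln=0.6):    # environment variability
    failures = 0
    for _ in range(n):
        fail_dose = math.exp(math.log(fail_median_krad) + random.gauss(0, fail_sigma_ln))
        mission_dose = math.exp(math.log(env_median_krad) + random.gauss(0, env_sigma_ln))
        if mission_dose >= fail_dose:
            failures += 1
    return failures / n

print(f"estimated failure probability: {failure_probability():.4f}")
```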
An Educational Institution's Quest for Service Quality: Customers' Perspective
ERIC Educational Resources Information Center
Joseph, Mathew; Yakhou, Mehenna; Stone, George
2005-01-01
Purpose: The purpose of the current study is to assess some of the self-reported factors that students in the study used as choice criteria in making their school selection. Design/methodology/approach: The results of this study were obtained by conducting a series of focus groups involving incoming freshmen at a small liberal arts university…
Overview of Current Activities in Combustion Instability
2015-10-02
Goal: avoid liquid rocket engine combustion stability problems. Approach: develop a state-of-the-art (SOA) combustion stability software package called Stable Combustion; phase II will invest in multifidelity tools and methodologies, under which CSTD will develop the SOA combustion stability software package.
A Multinomial Logit Approach to Estimating Regional Inventories by Product Class
Lawrence Teeter; Xiaoping Zhou
1998-01-01
Current timber inventory projections generally lack information on inventory by product classes. Most models available for inventory projection and linked to supply analyses are limited to projecting aggregate softwood and hardwood. The objective of this research is to develop a methodology to distribute the volume on each FIA survey plot to product classes and...
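The multinomial logit machinery referred to here amounts to a softmax over class-specific linear predictors, after which a plot's volume can be split according to the predicted shares. The sketch below uses invented attribute names and coefficients purely to show the calculation; it is not the fitted model from this research.

```python
# Sketch of a multinomial logit allocation: the share of a plot's volume assigned to
# each product class is a softmax of class-specific linear predictors built from plot
# attributes. Predictors and coefficients below are hypothetical placeholders.
import math

def product_class_shares(plot, coefs):
    """plot: dict of attributes; coefs: {product_class: {attribute: beta}}."""
    utilities = {cls: sum(beta * plot.get(attr, 0.0) for attr, beta in b.items())
                 for cls, b in coefs.items()}
    m = max(utilities.values())
    expu = {cls: math.exp(u - m) for cls, u in utilities.items()}     # stable softmax
    total = sum(expu.values())
    return {cls: v / total for cls, v in expu.items()}

coefs = {"sawtimber": {"intercept": 0.5, "mean_dbh": 0.15},
         "pulpwood":  {"intercept": 1.0, "mean_dbh": -0.05},
         "residue":   {"intercept": 0.0, "mean_dbh": 0.0}}
print(product_class_shares({"intercept": 1.0, "mean_dbh": 12.0}, coefs))
```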
"I Just Want to Teach": Queensland Independent School Teachers and Their Workload
ERIC Educational Resources Information Center
Timms, Carolyn; Graham, Deborah; Cottrell, David
2007-01-01
Purpose: The present study seeks to elucidate observed mismatches with workload in teacher respondents to a survey exploring aspects of the work environment. Design/methodology/approach: This phase of the study constituted a pen and paper survey of 298 currently serving teachers in independent schools in Queensland, Australia. Measures used in the…
Crisis Management for Secondary Education: A Survey of Secondary Education Directors in Greece
ERIC Educational Resources Information Center
Savelides, Socrates; Mihiotis, Athanassios; Koutsoukis, Nikitas-Spiros
2015-01-01
Purpose: The Greek secondary education system lacks a formal crisis management system. The purpose of this paper is to address this problem as follows: elicit current crisis management practices, outline features for designing a formal crisis management system in Greece. Design/methodology/approach: The research is based on a survey conducted with…
Teaching, Learning and Assessing HRD: Findings from a BMAF/UFHRD Research Project
ERIC Educational Resources Information Center
Sambrook, Sally; Stewart, Jim
2010-01-01
Purpose: This paper seeks to analyse and explore the results of a research project, which aimed to identify recent and current research on TLA within HRD programmes. From that base the project also intended to identify areas for future research and a basis for establishing a Special Interest Group. Design/methodology/approach: A comprehensive…
Views of HR Specialists on Formal Mentoring: Current Situation and Prospects for the Future
ERIC Educational Resources Information Center
Laiho, Maarit; Brandt, Tiina
2012-01-01
Purpose: The article aims to report the findings of quantitative and qualitative analysis of the benefits, drawbacks and future prospects of formal mentoring in medium-sized and large organisations. Design/methodology/approach: The empirical data for the study were collected via an online survey, and consist of responses from 152 human resource…
Career Management in Transition: HRD Themes from the Estonian Civil Service
ERIC Educational Resources Information Center
Rees, Christopher J.; Jarvalt, Jane; Metcalfe, Beverley
2005-01-01
Purpose: To explore, through a case study, some of the key career-related HRD issues that senior managers are currently facing in the Estonian civil service. Design/methodology/approach: Presents primary empirical research into career management in the Estonian civil service since 1991, that is, in the post-Soviet era. The research involved…
Regional Consortia for E-Resources: A Case Study of Deals in the South China Region
ERIC Educational Resources Information Center
Chunrong, Luo; Jingfen, Wang; Zhinong, Zhou
2010-01-01
Purpose: The purpose of this paper is to analyse the current situation and the social and economic benefits from the consortia acquisitions of electronic resources by the China Academic Library and Information System (CALIS) South China Regional Centre and to recommend improvements for consortia acquisitions. Design/methodology/approach: Analyses…
"You Could See It on Their Faces...": The Importance of Provoking Smiles in Schools
ERIC Educational Resources Information Center
Barnes, Jonathan
2005-01-01
Purpose: Current research in both cognitive neuroscience and what has been called "positive psychology" points to the need for wholesale reappraisal of what happens in schools, especially with regard to the wellbeing of children. Seeks to examine this issue. Design/methodology/approach: Reviews and discussion of research by the World…
Current trends in molecular diagnostics of chronic myeloid leukemia.
Vinhas, Raquel; Cordeiro, Milton; Pedrosa, Pedro; Fernandes, Alexandra R; Baptista, Pedro V
2017-08-01
Nearly 1.5 million people worldwide suffer from chronic myeloid leukemia (CML), characterized by the genetic translocation t(9;22)(q34;q11.2), involving the fusion of the Abelson oncogene (ABL1) with the breakpoint cluster region (BCR) gene. Early diagnosis coupled with current therapeutics allows for a treatment success rate of 90%, which has focused research on the development of novel diagnostic approaches. In this review, we present a critical perspective on current strategies for CML diagnostics, comparing them to gold standard methodologies, with an eye on future trends in nanotheranostics.
Developing the DESCARTE Model: The Design of Case Study Research in Health Care.
Carolan, Clare M; Forbat, Liz; Smith, Annetta
2016-04-01
Case study is a long-established research tradition which predates the recent surge in mixed-methods research. Although a myriad of nuanced definitions of case study exist, seminal case study authors agree that the use of multiple data sources typify this research approach. The expansive case study literature demonstrates a lack of clarity and guidance in designing and reporting this approach to research. Informed by two reviews of the current health care literature, we posit that methodological description in case studies principally focuses on description of case study typology, which impedes the construction of methodologically clear and rigorous case studies. We draw from the case study and mixed-methods literature to develop the DESCARTE model as an innovative approach to the design, conduct, and reporting of case studies in health care. We examine how case study fits within the overall enterprise of qualitatively driven mixed-methods research, and the potential strengths of the model are considered. © The Author(s) 2015.
Applications of Landsat data and the data base approach
Lauer, D.T.
1986-01-01
A generalized methodology for applying digital Landsat data to resource inventory and assessment tasks is currently being used by several bureaux and agencies within the US Department of the Interior. The methodology includes definition of project objectives and output, identification of source materials, construction of the digital data base, performance of computer-assisted analyses, and generation of output. The USGS, Bureau of Land Management, US Fish and Wildlife Service, Bureau of Indian Affairs, Bureau of Reclamation, and National Park Service have used this generalized methodology to assemble comprehensive digital data bases for resource management. Advanced information processing techniques have been applied to these data bases for making regional environmental surveys on millions of acres of public lands at costs ranging from $0.01 to $0.08 an acre.
Future in biomolecular computation
NASA Astrophysics Data System (ADS)
Wimmer, E.
1988-01-01
Large-scale computations for biomolecules are dominated by three levels of theory: rigorous quantum mechanical calculations for molecules with up to about 30 atoms, semi-empirical quantum mechanical calculations for systems with up to several hundred atoms, and force-field molecular dynamics studies of biomacromolecules with 10,000 atoms and more including surrounding solvent molecules. It can be anticipated that increased computational power will allow the treatment of larger systems of ever growing complexity. Due to the scaling of the computational requirements with increasing number of atoms, the force-field approaches will benefit the most from increased computational power. On the other hand, progress in methodologies such as density functional theory will enable us to treat larger systems on a fully quantum mechanical level and a combination of molecular dynamics and quantum mechanics can be envisioned. One of the greatest challenges in biomolecular computation is the protein folding problem. It is unclear at this point if an approach with current methodologies will lead to a satisfactory answer or if unconventional, new approaches will be necessary. In any event, due to the complexity of biomolecular systems, a hierarchy of approaches will have to be established and used in order to capture the wide ranges of length-scales and time-scales involved in biological processes. In terms of hardware development, speed and power of computers will increase while the price/performance ratio will become more and more favorable. Parallelism can be anticipated to become an integral architectural feature in a range of computers. It is unclear at this point how fast massively parallel systems will become easy enough to use so that new methodological developments can be pursued on such computers. Current trends show that distributed processing such as the combination of convenient graphics workstations and powerful general-purpose supercomputers will lead to a new style of computing in which the calculations are monitored and manipulated as they proceed. The combination of a numeric approach with artificial-intelligence approaches can be expected to open up entirely new possibilities. Ultimately, the most exciting aspect of the future in biomolecular computing will be the unexpected discoveries.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faidy, C.; Gilles, P.
The objective of the seminar was to present the current state of the art in Leak-Before-Break (LBB) methodology development, validation, and application in an international forum. With particular emphasis on industrial applications and regulatory policies, the seminar provided an opportunity to compare approaches, experiences, and codifications developed by different countries. The seminar was organized into four topic areas: status of LBB applications; technical issues in LBB methodology; complementary requirements (leak detection and inspection); LBB assessment and margins. As a result of this seminar, an improved understanding of LBB gained through sharing of different viewpoints from different countries, permits consideration of: simplified pipe support design and possible elimination of loss-of-coolant-accident (LOCA) mechanical consequences for specific cases; defense-in-depth type of applications without support modifications; support of safety cases for plants designed without the LOCA hypothesis. In support of these activities, better estimates of the limits to the LBB approach should follow, as well as an improvement in codifying methodologies. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.
Goal-oriented rectification of camera-based document images.
Stamatopoulos, Nikolaos; Gatos, Basilis; Pratikakis, Ioannis; Perantonis, Stavros J
2011-04-01
Document digitization with either flatbed scanners or camera-based systems results in document images which often suffer from warping and perspective distortions that deteriorate the performance of current OCR approaches. In this paper, we present a goal-oriented rectification methodology to compensate for undesirable document image distortions aiming to improve the OCR result. Our approach relies upon a coarse-to-fine strategy. First, a coarse rectification is accomplished with the aid of a computationally low-cost transformation which addresses the projection of a curved surface to a 2-D rectangular area. The projection of the curved surface on the plane is guided only by the textual content's appearance in the document image while incorporating a transformation which does not depend on specific model primitives or camera setup parameters. Second, pose normalization is applied on the word level aiming to restore all the local distortions of the document image. Experimental results on various document images with a variety of distortions demonstrate the robustness and effectiveness of the proposed rectification methodology, using a consistent evaluation methodology that takes into account OCR accuracy together with a newly introduced measure based on a semi-automatic procedure.
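For orientation, the snippet below shows the simplest form of rectification: a planar perspective warp that maps four detected page corners to a rectangle. This is only a rough stand-in for the coarse stage described above, which handles curved pages and is driven by the textual content; the file name and corner coordinates are placeholder values.

```python
# Much-simplified illustration of coarse rectification: map four page corners to a
# rectangle with a planar perspective transform. This covers only the flat-page case,
# not the curved-surface model of the paper, and uses hard-coded example points.
import cv2
import numpy as np

img = cv2.imread("page.jpg")                       # hypothetical input image
h, w = 1100, 850                                   # target output size (pixels)
src = np.float32([[120, 95], [780, 60], [815, 1020], [90, 1060]])   # detected corners (example)
dst = np.float32([[0, 0], [w, 0], [w, h], [0, h]])

M = cv2.getPerspectiveTransform(src, dst)
rectified = cv2.warpPerspective(img, M, (w, h))
cv2.imwrite("page_rectified.jpg", rectified)
```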
Zhou, Xian; Seto, Sai Wang; Chang, Dennis; Kiat, Hosen; Razmovski-Naumovski, Valentina; Chan, Kelvin; Bensoussan, Alan
2016-01-01
Traditional Chinese medicine (TCM) is an important part of primary health care in Asian countries that has utilized complex herbal formulations (consisting of 2 or more medicinal herbs) for treating diseases over thousands of years. There seems to be a general assumption that the synergistic therapeutic effects of Chinese herbal medicine (CHM) derive from the complex interactions between the multiple bioactive components within the herbs and/or herbal formulations. However, evidence to support these synergistic effects remains weak and controversial for several reasons, including the very complex nature of CHM, misconceptions about synergy and methodological challenges to study design. In this review, we clarify the definition of synergy, identify common errors in synergy research and describe current methodological approaches to test for synergistic interaction. We discuss the strengths and weaknesses of these models in the context of CHM and summarize the current status of synergy research in CHM. Despite the availability of some scientific data to support the synergistic effects of multi-herbal and/or herb-drug combinations, the level of evidence remains low, and the clinical relevancy of most of these findings is undetermined. There remain significant challenges in the development of suitable methods for synergistic studies of complex herbal combinations. PMID:27462269
Shannon, Gary William; Buker, Carol Marie
2010-01-01
Teledermatology provides a partial solution to the problem of accessibility to dermatology services in underserved areas, yet methodologies to determine the locations and geographic dimensions of these areas and the locational efficiency of remote teledermatology sites have been found wanting. This article illustrates an innovative Geographic Information Systems approach using dermatologists' addresses, U.S. Census population data, and the Topologically Integrated Geographic Encoding and Referencing System. Travel-time-based service areas were calculated and mapped for each dermatologist in the state of Kentucky and for possible locations of several remote teledermatology sites. Populations within the current and possible remote service areas were determined. These populations and associated maps permit assessment of the locational efficiency of the current distribution of dermatologists, location of underserved areas, and the potential contribution of proposed hypothetical teledermatology sites. This approach is a valuable and practical tool for evaluating access to current distributions of dermatologists as well as planning for and implementing teledermatology.
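A toy version of the travel-time service-area calculation is sketched below: nodes reachable within a drive-time cutoff are found with Dijkstra's algorithm over a weighted road graph, and the covered population is summed. The graph, minutes and population counts are invented; the study itself worked from dermatologist addresses, census population data and TIGER road geography rather than this simplified structure.

```python
# Sketch of a travel-time service area: road-network nodes reachable from a
# dermatologist's location within a cutoff, with the covered population summed.
# All nodes, travel times and populations below are toy values for illustration.
import networkx as nx

G = nx.Graph()
G.add_weighted_edges_from([("clinic", "A", 15), ("A", "B", 20), ("B", "C", 40),
                           ("clinic", "D", 55), ("A", "D", 30)], weight="minutes")
population = {"A": 12000, "B": 8000, "C": 15000, "D": 5000}

reach = nx.single_source_dijkstra_path_length(G, "clinic", cutoff=60, weight="minutes")
served = sum(population.get(n, 0) for n in reach if n != "clinic")
print(f"nodes within 60 min: {sorted(reach)}  population served: {served}")
```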
Rodríguez-Prieto, V; Vicente-Rubiano, M; Sánchez-Matamoros, A; Rubio-Guerri, C; Melero, M; Martínez-López, B; Martínez-Avilés, M; Hoinville, L; Vergne, T; Comin, A; Schauer, B; Dórea, F; Pfeiffer, D U; Sánchez-Vizcaíno, J M
2015-07-01
In this globalized world, the spread of new, exotic and re-emerging diseases has become one of the most important threats to animal production and public health. This systematic review analyses conventional and novel early detection methods applied to surveillance. In all, 125 scientific documents were considered for this study. Exotic (n = 49) and re-emerging (n = 27) diseases constituted the most frequently represented health threats. In addition, the majority of studies were related to zoonoses (n = 66). The approaches found in the review could be divided in surveillance modalities, both active (n = 23) and passive (n = 5); and tools and methodologies that support surveillance activities (n = 57). Combinations of surveillance modalities and tools (n = 40) were also found. Risk-based approaches were very common (n = 60), especially in the papers describing tools and methodologies (n = 50). The main applications, benefits and limitations of each approach were extracted from the papers. This information will be very useful for informing the development of tools to facilitate the design of cost-effective surveillance strategies. Thus, the current literature review provides key information about the advantages, disadvantages, limitations and potential application of methodologies for the early detection of new, exotic and re-emerging diseases.
Intercultural exchange: an approach to training from a Franco-Canadian perspective.
Parent, Roger
2007-01-01
The current challenges of cultural diversity necessitate effective methods for training professionals in health, as well as other sectors, to work with the phenomenon of culture. This paper presents an overview of a semiotic-based approach to training in this regard. Recent publications by Anti Randviir in semiotics on the textual nature of cultural phenomena and by Annabel Levesque on the healthcare issues of Western, French-speaking Canadians provide the methodological frame and basic cultural reference for the overview. The anthropological definition of culture as a 'semiotic', or universe of meaning, offers interdisciplinary common ground for designing practical approaches to cultural analysis, intercultural communication and creativity training. This definition is consistent with convergent findings and research practice in social and cognitive psychology, administrative science, philosophy, ethnography, linguistics and semiotics. Cultural performances such as narrative constitute an effective methodological tool for interdisciplinary data gathering and for analysis of all kinds of cultures: organizational, family, ethnic, regional, transborder, etc. When combined with a functionalist and systemic approach to the study of culture, semiotic approaches to narrative analysis provide useful principles for decoding cultural modes of communication and for designing meaningful change based on cultural specificity.
A risk assessment methodology using intuitionistic fuzzy set in FMEA
NASA Astrophysics Data System (ADS)
Chang, Kuei-Hu; Cheng, Ching-Hsue
2010-12-01
Most current risk assessment methods use the risk priority number (RPN) value to evaluate the risk of failure. However, conventional RPN methodology has been criticised as having five main shortcomings as follows: (1) the assumption that the RPN elements are equally weighted leads to over simplification; (2) the RPN scale itself has some non-intuitive statistical properties; (3) the RPN elements have many duplicate numbers; (4) the RPN is derived from only three factors mainly in terms of safety; and (5) the conventional RPN method has not considered indirect relations between components. To address the above issues, an efficient and comprehensive algorithm to evaluate the risk of failure is needed. This article proposes an innovative approach, which integrates the intuitionistic fuzzy set (IFS) and the decision-making trial and evaluation laboratory (DEMATEL) approach on risk assessment. The proposed approach resolves some of the shortcomings of the conventional RPN method. A case study, which assesses the risk of 0.15 µm DRAM etching process, is used to demonstrate the effectiveness of the proposed approach. Finally, the result of the proposed method is compared with the listing approaches of risk assessment methods.
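For reference, the conventional RPN calculation the article criticises is simply the product of severity (S), occurrence (O) and detection (D) ratings on 1-10 scales. The toy failure modes below are invented to illustrate the calculation and the duplicate-value problem; they are not taken from the DRAM etching case study.

```python
# Conventional RPN computation that the paper criticises: RPN = S x O x D on 1-10
# scales, which weights the three factors equally and produces many duplicate values
# (note the two failure modes tied at 84 below). Ratings are invented examples.
failure_modes = {
    "over-etch":  {"S": 7, "O": 4, "D": 3},
    "under-etch": {"S": 6, "O": 2, "D": 7},
    "particle contamination": {"S": 8, "O": 3, "D": 4},
}

for name, r in sorted(failure_modes.items(),
                      key=lambda kv: kv[1]["S"] * kv[1]["O"] * kv[1]["D"],
                      reverse=True):
    rpn = r["S"] * r["O"] * r["D"]
    print(f"{name:25s} RPN = {rpn:3d}")
```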
When is good, good enough? Methodological pragmatism for sustainable guideline development.
Browman, George P; Somerfield, Mark R; Lyman, Gary H; Brouwers, Melissa C
2015-03-06
Continuous escalation in methodological and procedural rigor for evidence-based processes in guideline development is associated with increasing costs and production delays that threaten sustainability. While health research methodologists are appropriately responsible for promoting increasing rigor in guideline development, guideline sponsors are responsible for funding such processes. This paper acknowledges that other stakeholders in addition to methodologists should be more involved in negotiating trade-offs between methodological procedures and efficiency in guideline production to produce guidelines that are 'good enough' to be trustworthy and affordable under specific circumstances. The argument for reasonable methodological compromise to meet practical circumstances is consistent with current implicit methodological practice. This paper proposes a conceptual tool as a framework to be used by different stakeholders in negotiating, and explicitly reporting, reasonable compromises for trustworthy as well as cost-worthy guidelines. The framework helps fill a transparency gap in how methodological choices in guideline development are made. The principle, 'when good is good enough' can serve as a basis for this approach. The conceptual tool 'Efficiency-Validity Methodological Continuum' acknowledges trade-offs between validity and efficiency in evidence-based guideline development and allows for negotiation, guided by methodologists, of reasonable methodological compromises among stakeholders. Collaboration among guideline stakeholders in the development process is necessary if evidence-based guideline development is to be sustainable.
Fryer, Craig S; Seaman, Elizabeth L; Clark, Rachael S; Plano Clark, Vicki L
2017-01-01
Tobacco use among young people is a complex and serious global dilemma that demands innovative and diverse research approaches. The purpose of this methodological review was to examine the current use of mixed methods research in tobacco control with youth and young adult populations and to develop practical recommendations for tobacco control researchers interested in this methodology. Using PubMed, we searched five peer-reviewed journals that publish tobacco control empirical literature for the use of mixed methods research to study young populations, age 12-25 years. Our team analyzed the features of each article in terms of tobacco control topic, population, youth engagement strategies, and several essential elements of mixed methods research. We identified 23 mixed methods studies published by authors from five different countries reported between 2004 and 2015. These 23 articles examined various topics that included tobacco use behavior, tobacco marketing and branding, and cessation among youth and young adults. The most common mixed methods approach was variations of the concurrent design in which the qualitative and quantitative strands were administered at the same time and given equal priority. This review documented several innovative applications of mixed methods research as well as challenges in the reporting of the complex research designs. The use of mixed methods research in tobacco control has great potential for advancing the understanding of complex behavioral and sociocultural issues for all groups, especially youth and young adults.
Seaman, Elizabeth L.; Clark, Rachael S.; Plano Clark, Vicki L.
2017-01-01
Introduction Tobacco use among young people is a complex and serious global dilemma that demands innovative and diverse research approaches. The purpose of this methodological review was to examine the current use of mixed methods research in tobacco control with youth and young adult populations and to develop practical recommendations for tobacco control researchers interested in this methodology. Methods Using PubMed, we searched five peer-reviewed journals that publish tobacco control empirical literature for the use of mixed methods research to study young populations, age 12–25 years. Our team analyzed the features of each article in terms of tobacco control topic, population, youth engagement strategies, and several essential elements of mixed methods research. Results We identified 23 mixed methods studies published by authors from five different countries reported between 2004 and 2015. These 23 articles examined various topics that included tobacco use behavior, tobacco marketing and branding, and cessation among youth and young adults. The most common mixed methods approach was variations of the concurrent design in which the qualitative and quantitative strands were administered at the same time and given equal priority. This review documented several innovative applications of mixed methods research as well as challenges in the reporting of the complex research designs. Conclusions The use of mixed methods research in tobacco control has great potential for advancing the understanding of complex behavioral and sociocultural issues for all groups, especially youth and young adults. PMID:28841689
Representations of Invariant Manifolds for Applications in Three-Body Systems
NASA Technical Reports Server (NTRS)
Howell, K.; Beckman, M.; Patterson, C.; Folta, D.
2004-01-01
The Lunar L1 and L2 libration points have been proposed as gateways granting inexpensive access to interplanetary space. To date, only individual solutions to the transfer between three-body systems have been found. The methodology to solve the problem for arbitrary three-body systems and entire families of orbits is currently being studied. This paper presents an initial approach to solve the general problem for single and multiple impulse transfers. Two different methods of representing and storing the invariant manifold data are presented. Some particular solutions are presented for two types of transfer problems, though the emphasis is on developing the methodology for solving the general problem.
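For context, the sketch below integrates the planar circular restricted three-body problem (CR3BP) equations of motion in the rotating frame, which is the dynamical model in which libration-point orbits and their invariant manifolds live. The initial condition is arbitrary, and manifold generation itself (perturbing a periodic orbit along its monodromy-matrix eigenvectors) is not shown; this is background illustration, not the transfer-design method of the paper.

```python
# Planar CR3BP in the rotating frame, nondimensional units. Only a trajectory is
# integrated here; invariant-manifold construction is not shown. mu is the
# approximate Earth-Moon mass parameter.
import numpy as np
from scipy.integrate import solve_ivp

MU = 0.01215

def cr3bp(t, s, mu=MU):
    x, y, vx, vy = s
    r1 = np.hypot(x + mu, y)        # distance to the larger primary
    r2 = np.hypot(x - 1 + mu, y)    # distance to the smaller primary
    ax = 2 * vy + x - (1 - mu) * (x + mu) / r1**3 - mu * (x - 1 + mu) / r2**3
    ay = -2 * vx + y - (1 - mu) * y / r1**3 - mu * y / r2**3
    return [vx, vy, ax, ay]

state0 = [0.84, 0.0, 0.0, 0.25]     # illustrative initial condition near the Moon
sol = solve_ivp(cr3bp, (0.0, 10.0), state0, rtol=1e-9, atol=1e-9)
print(sol.y[:2, -1])                # final (x, y) in rotating-frame units
```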
Rothman, Jason; Alemán Bañón, José; González Alonso, Jorge
2015-01-01
This article has two main objectives. First, we offer an introduction to the subfield of generative third language (L3) acquisition. Concerned primarily with modeling initial stages transfer of morphosyntax, one goal of this program is to show how initial stages L3 data make significant contributions toward a better understanding of how the mind represents language and how (cognitive) economy constrains acquisition processes more generally. Our second objective is to argue for and demonstrate how this subfield will benefit from a neuro/psycholinguistic methodological approach, such as event-related potential experiments, to complement the claims currently made on the basis of exclusively behavioral experiments. PMID:26300800
Quantifying Ballistic Armor Performance: A Minimally Invasive Approach
NASA Astrophysics Data System (ADS)
Holmes, Gale; Kim, Jaehyun; Blair, William; McDonough, Walter; Snyder, Chad
2006-03-01
Theoretical and non-dimensional analyses suggest a critical link between the performance of ballistic resistant armor and the fundamental mechanical properties of the polymeric materials that comprise them. Therefore, a test methodology that quantifies these properties without compromising an armored vest that is exposed to the industry standard V-50 ballistic performance test is needed. Currently, there is considerable speculation about the impact that competing degradation mechanisms (e.g., mechanical, humidity, ultraviolet) may have on ballistic resistant armor. We report on the use of a new test methodology that quantifies the mechanical properties of ballistic fibers and how each proposed degradation mechanism may impact a vest's ballistic performance.
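One example of the kind of non-dimensional link alluded to here is Cunniff's parameter, which combines fibre strength, failure strain, modulus and density into a characteristic velocity. The formula and the rough aramid-like property values in the sketch below are offered only as an illustration of such an analysis, not as the specific relation used in this work.

```python
# Cunniff's parameter U* = (sigma * eps / (2 * rho)) * sqrt(E / rho); its cube root
# has units of velocity and is often used to rank ballistic fibres. Property values
# below are rough, order-of-magnitude numbers for an aramid-like fibre.
def cunniff_velocity(sigma_pa, eps, modulus_pa, rho_kg_m3):
    u_star = (sigma_pa * eps / (2.0 * rho_kg_m3)) * (modulus_pa / rho_kg_m3) ** 0.5
    return u_star ** (1.0 / 3.0)     # characteristic velocity, m/s

print(f"{cunniff_velocity(3.0e9, 0.035, 1.1e11, 1440):.0f} m/s")
```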
Deterministic Multiaxial Creep and Creep Rupture Enhancements for CARES/Creep Integrated Design Code
NASA Technical Reports Server (NTRS)
Jadaan, Osama M.
1998-01-01
High temperature and long duration applications of monolithic ceramics can place their failure mode in the creep rupture regime. A previous model advanced by the authors described a methodology by which the creep rupture life of a loaded component can be predicted. That model was based on the life fraction damage accumulation rule in association with the modified Monkman-Grant creep rupture criterion. However, that model did not take into account the deteriorating state of the material due to creep damage (e.g., cavitation) as time elapsed. In addition, the material creep parameters used in that life prediction methodology were based on uniaxial creep curves displaying primary and secondary creep behavior, with no tertiary regime. The objective of this paper is to present a creep life prediction methodology based on a modified form of the Kachanov-Rabotnov continuum damage mechanics (CDM) theory. In this theory, the uniaxial creep rate is described in terms of stress, temperature, time, and the current state of material damage. This scalar damage state parameter is basically an abstract measure of the current state of material damage due to creep deformation. The damage rate is assumed to vary with stress, temperature, time, and the current state of damage itself. Multiaxial creep and creep rupture formulations of the CDM approach are presented in this paper. Parameter estimation methodologies based on nonlinear regression analysis are also described for both isothermal constant stress states and anisothermal variable stress conditions. This creep life prediction methodology was preliminarily added to the integrated design code CARES/Creep (Ceramics Analysis and Reliability Evaluation of Structures/Creep), which is a postprocessor program to commercially available finite element analysis (FEA) packages. Two examples, showing comparisons between experimental and predicted creep lives of ceramic specimens, are used to demonstrate the viability of this methodology and the CARES/Creep program.
NASA Technical Reports Server (NTRS)
Jadaan, Osama M.; Powers, Lynn M.; Gyekenyesi, John P.
1998-01-01
High temperature and long duration applications of monolithic ceramics can place their failure mode in the creep rupture regime. A previous model advanced by the authors described a methodology by which the creep rupture life of a loaded component can be predicted. That model was based on the life fraction damage accumulation rule in association with the modified Monkman-Grant creep rupture criterion. However, that model did not take into account the deteriorating state of the material due to creep damage (e.g., cavitation) as time elapsed. In addition, the material creep parameters used in that life prediction methodology were based on uniaxial creep curves displaying primary and secondary creep behavior, with no tertiary regime. The objective of this paper is to present a creep life prediction methodology based on a modified form of the Kachanov-Rabotnov continuum damage mechanics (CDM) theory. In this theory, the uniaxial creep rate is described in terms of stress, temperature, time, and the current state of material damage. This scalar damage state parameter is basically an abstract measure of the current state of material damage due to creep deformation. The damage rate is assumed to vary with stress, temperature, time, and the current state of damage itself. Multiaxial creep and creep rupture formulations of the CDM approach are presented in this paper. Parameter estimation methodologies based on nonlinear regression analysis are also described for both isothermal constant stress states and anisothermal variable stress conditions. This creep life prediction methodology was preliminarily added to the integrated design code CARES/Creep (Ceramics Analysis and Reliability Evaluation of Structures/Creep), which is a postprocessor program to commercially available finite element analysis (FEA) packages. Two examples, showing comparisons between experimental and predicted creep lives of ceramic specimens, are used to demonstrate the viability of this methodology and the CARES/Creep program.
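The classical Kachanov-Rabotnov equations underlying this kind of CDM analysis couple the creep rate to a scalar damage variable; a hedged numerical sketch is given below. The power-law forms shown are the textbook uniaxial version, and all material constants are arbitrary placeholders rather than fitted ceramic parameters or the modified multiaxial formulation used in CARES/Creep.

```python
# Minimal numerical sketch of the classical Kachanov-Rabotnov coupled creep/damage
# equations that CDM-based life prediction builds on:
#   d(eps)/dt = A * (sigma / (1 - w))**n        (creep rate rises as damage w grows)
#   d(w)/dt   = B * sigma**chi / (1 - w)**phi   (damage evolution; rupture as w -> 1)
# Material constants are arbitrary placeholders, not fitted values.
def creep_rupture_life(sigma, A=1e-12, n=5.0, B=1e-10, chi=4.0, phi=3.0, dt=1.0):
    eps, w, t = 0.0, 0.0, 0.0
    while w < 0.99:                               # treat w ~ 1 as rupture
        eps += A * (sigma / (1.0 - w)) ** n * dt
        w += B * sigma ** chi / (1.0 - w) ** phi * dt
        t += dt
    return t, eps

t_rupture, strain = creep_rupture_life(sigma=100.0)
print(f"rupture after ~{t_rupture:.0f} time steps, accumulated creep strain {strain:.3e}")
```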
ERIC Educational Resources Information Center
Morell, Linda; Collier, Tina; Black, Paul; Wilson, Mark
2017-01-01
This paper builds on the current literature base about learning progressions in science to address the question, "What is the nature of the learning progression in the content domain of the structure of matter?" We introduce a learning progression in response to that question and illustrate a methodology, the Construct Modeling (Wilson,…
ERIC Educational Resources Information Center
Emad, Gholamreza; Roth, Wolff Michael
2008-01-01
Purpose: The purpose of this paper is to highlight the contradictions in the current maritime education and training system (MET), which is based on competency-based education, training and assessment, and to theorize the failure to make the training useful. Design/methodology/approach: A case study of education and training in the international…
ERIC Educational Resources Information Center
Paschall, Katherine W.; Mastergeorge, Ann M.
2016-01-01
The concept of bidirectionality represents a process of mutual influence between parent and child, whereby each influences the other as well as the dyadic relationship. Despite the widespread acceptance of bidirectional models of influence, there is still a lack of integration of such models in current research designs. Research on…
ERIC Educational Resources Information Center
Carter, Stephen; Yeo, Amy Chu-May
2016-01-01
Purpose: The purpose of this paper is to investigate two areas of interest: first, to determine business student customer satisfiers that could be contributors to students' current and predicted retention in a higher educational institution (HEI) and second, to use these satisfiers to inform HEI marketing planning. Design/Methodology/Approach: The…
ERIC Educational Resources Information Center
de Kraker, Joop; Dlouhá, Jana; Machackova Henderson, Laura; Kapitulcinová, Dana
2017-01-01
Purpose: The purpose of this paper is to assess the current and potential value of the European Virtual Seminar on Sustainable Development (EVS) as an opportunity for professional development in Education for Sustainable Development (ESD) for teaching staff at university level. Design/methodology/approach: The paper presents and reflects on the…
Francisco Rodríguez y Silva; Armando González-Cabán
2013-01-01
The abandonment of land, the high energy load generated and accumulated by vegetation covers, climate change and interface scenarios in Mediterranean forest ecosystems are demanding serious attention to forest fire conditions. This is particularly true when dealing with the budget requirements for undertaking protection programs related to the state of current and...
ERIC Educational Resources Information Center
Asuga, Gladys; Eacott, Scott; Scevak, Jill
2015-01-01
Purpose: The purpose of this paper is to evaluate the quality of the current provision for school leadership in Kenya, the extent to which they have an impact on student outcomes and the return on school leadership preparation and development investment. Design/Methodology/Approach: The paper draws from educational leadership, management and…
ACSPRI 2014 4th International Social Science Methodology Conference Report
2015-04-01
The conference covered topics including increasing data quality; the Total Survey Error framework; multi-modal on-line surveying; quality frameworks for assessing qualitative research (for example, validity, trustworthiness and rigour); and an overview of current perspectives on causal claims in qualitative research, including approaches to generating plausible causal claims.
ERIC Educational Resources Information Center
Alkraiji, Abdullah; Jackson, Thomas; Murray, Ian
2011-01-01
Purpose: This paper seeks to carry out a critical study of health data standards and adoption process with a focus on Saudi Arabia. Design/methodology/approach: Many developed nations have initiated programs to develop, promote, adopt and customise international health data standards to the local needs. The current status of, and future plans for,…
ERIC Educational Resources Information Center
Dries, Nicky
2011-01-01
Purpose: The purpose of this paper is to examine the extent to which the concept of career success has been subject to reification, and identify potential implications for individuals, organizations, and societies. Design/methodology/approach: The current paper offers an in-depth analysis of the different contextual forces contributing to the…
Open Source, Crowd Source: Harnessing the Power of the People behind Our Libraries
ERIC Educational Resources Information Center
Trainor, Cindi
2009-01-01
Purpose: The purpose of this paper is to provide an insight into the use of Web 2.0 and Library 2.0 technologies so that librarians can combine open source software with user-generated content to create a richer discovery experience for their users. Design/methodology/approach: Following a description of the current state of integrated library…
ERIC Educational Resources Information Center
Ayoubi, Rami M.; Massoud, Hiba
2012-01-01
Purpose: The main aim of the current study is to explore and model the major obstacles that UK universities encounter when developing international partnerships with overseas universities. Design/methodology/approach: Focusing on the obstacles to developing international partnerships, the study results are developed from 24 interviews with senior…
Inclusion of Radiation Environment Variability in Total Dose Hardness Assurance Methodology
Xapsos, M.A.; Stauffer, C.; Phan, A.; McClure, S.S.; Ladbury, R.L.; Pellish, J.A.; Campola, M.J.; LaBel, K.A.
2017-01-01
Variability of the space radiation environment is investigated with regard to parts categorization for total dose hardness assurance methods. It is shown that it can have a significant impact. A modified approach is developed that uses current environment models more consistently and replaces the radiation design margin concept with one of failure probability during a mission. PMID:28804156
ERIC Educational Resources Information Center
Ingwersen, Wesley W.; Curran, Mary Ann; Gonzalez, Michael A.; Hawkins, Troy R.
2012-01-01
Purpose: The purpose of this study is to compare the life cycle environmental impacts of the University of Cincinnati College of Engineering and Applied Sciences' current printed annual report to a version distributed via the internet. Design/methodology/approach: Life cycle environmental impacts of both versions of the report are modeled using…
Armando González-Cabán
2008-01-01
These proceedings summarize the results of a symposium designed to address current issues of agencies with wildland fire protection responsibility at the federal and state levels in the United States as well as agencies in the international community. The topics discussed at the symposium included fire economics, theoretical and methodological approaches to strategic...
Angell, Linda S.
2014-01-01
A variety of methodologies for understanding the prevalence of distracted driving, its risk, and other aspects of driver secondary activity, have been used in the last 15 years. Although the current trend is toward naturalistic driving studies, each methodology contributes certain elements to a better understanding that could emerge from a convergence of these efforts. However, if differing methods are to contribute to a common and robust understanding of driver distraction, it is critical to understand the strengths and limitations of each method. This paper reviews several of the non-naturalistic methods. It suggests that “convergence science” – a more concerted and rigorous effort to bring different approaches together into an integrative whole – may offer benefits for identification and definition of issues and countermeasure development to improve driving safety. PMID:24776226
Medicine and the humanities--theoretical and methodological issues.
Puustinen, Raimo; Leiman, M; Viljanen, A M
2003-12-01
Engel's biopsychosocial model, Cassell's promotion of the concept "person" in medical thinking and Pellegrino's and Thomasma's philosophy of medicine are attempts to widen current biomedical theory of disease and to approach medicine as a form of human activity in pursuit of healing. To develop this approach further we would like to propose activity theory as a possible means for understanding the nature of medical practice. By "activity theory" we refer to developments which have evolved from Vygotsky's research on socially mediated mental functions and processes. Analysing medicine as activity enforces the joint consideration of target and subject: who is doing what to whom. This requires the use of historical, linguistic, anthropological, and semiotic tools. Therefore, if we analyse medicine as an activity, humanities are both theoretically and methodologically "inbound" (or internal) to the analysis itself. On the other hand, literature studies or anthropological writings provide material for analysing the various forms of medical practices.
Advances in computer-aided well-test interpretation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Horne, R.N.
1994-07-01
Despite the feeling expressed several times over the past 40 years that well-test analysis had reached its peak development, an examination of recent advances shows continuous expansion in capability, with future improvement likely. The expansion in interpretation capability over the past decade arose mainly from the development of computer-aided techniques, which, although introduced 20 years ago, have come into use only recently. The broad application of computer-aided interpretation originated with the improvement of the methodologies and continued with the expansion in computer access and capability that accompanied the explosive development of the microcomputer industry. This paper focuses on the different pieces of the methodology that combine to constitute a computer-aided interpretation and attempts to compare some of the approaches currently used. Future directions of the approach are also discussed. The separate areas discussed are deconvolution, pressure derivatives, model recognition, nonlinear regression, and confidence intervals.
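Of the methodology pieces listed, the pressure derivative is the most mechanical to illustrate: a Bourdet-style derivative of pressure change with respect to ln(t) underpins model recognition. The sketch below computes it on synthetic data with a plain finite difference; real interpretation software adds log-spaced smoothing windows, and the data and constants here are invented.

```python
# Sketch of the pressure-derivative step used in computer-aided well-test
# interpretation: the derivative of the pressure change with respect to ln(t).
# The synthetic data follow an idealised semilog (infinite-acting radial flow)
# response, so the derivative stabilises at a constant value; numbers are invented.
import numpy as np

t = np.logspace(-2, 2, 60)            # elapsed time, hours
dp = 10.0 * (np.log(t) + 8.0)         # synthetic pressure drop, psi

deriv = np.gradient(dp, np.log(t))    # d(dp) / d(ln t)
print(deriv[5], deriv[30], deriv[-5]) # ~10 everywhere on this ideal response
```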
McMahon, Camilla M; Lerner, Matthew D; Britton, Noah
2013-01-01
In this paper, we synthesize the current literature on group-based social skills interventions (GSSIs) for adolescents (ages 10–20 years) with higher-functioning autism spectrum disorder and identify key concepts that should be addressed in future research on GSSIs. We consider the research participants, the intervention, the assessment of the intervention, and the research methodology and results to be integral and interconnected components of the GSSI literature, and we review each of these components respectively. Participant characteristics (eg, age, IQ, sex) and intervention characteristics (eg, targeted social skills, teaching strategies, duration and intensity) vary considerably across GSSIs; future research should evaluate whether participant and intervention characteristics mediate/moderate intervention efficacy. Multiple assessments (eg, parent-report, child-report, social cognitive assessments) are used to evaluate the efficacy of GSSIs; future research should be aware of the limitations of current measurement approaches and employ more accurate, sensitive, and comprehensive measurement approaches. Results of GSSIs are largely inconclusive, with few consistent findings across studies (eg, high parent and child satisfaction with the intervention); future research should employ more rigorous methodological standards for evaluating efficacy. A better understanding of these components in the current GSSI literature and a more sophisticated and rigorous analysis of these components in future research will lend clarity to key questions regarding the efficacy of GSSIs for individuals with autism spectrum disorder. PMID:23956616
Where does good quality qualitative health care research get published?
Richardson, Jane C; Liddle, Jennifer
2017-09-01
This short report aims to give some insight into current publication patterns for high-quality qualitative health research, using the Research Excellence Framework (REF) 2014 database. We explored patterns of publication by range and type of journal, by date and by methodological focus. We also looked at variations between the publications submitted to different Units of Assessment, focussing particularly on the one most closely aligned with our own research area of primary care. Our brief analysis demonstrates that general medical/health journals with high impact factors are the dominant routes of publication, but there is variation according to the methodological approach adopted by articles. The number of qualitative health articles submitted to REF 2014 overall was small, and even more so for articles based on mixed methods research, qualitative methodology or reviews/syntheses that included qualitative articles.
Synthesis: Intertwining product and process
NASA Technical Reports Server (NTRS)
Weiss, David M.
1990-01-01
Synthesis is a proposed systematic process for rapidly creating different members of a program family. Family members are described by variations in their requirements. Requirements variations are mapped to variations on a standard design to generate production quality code and documentation. The approach is made feasible by using principles underlying design for change. Synthesis incorporates ideas from rapid prototyping, application generators, and domain analysis. The goals of Synthesis and the Synthesis process are discussed. The technology needed and the feasibility of the approach are also briefly discussed. The status of current efforts to implement Synthesis methodologies is presented.
Shear-wave velocity profiling according to three alternative approaches: A comparative case study
NASA Astrophysics Data System (ADS)
Dal Moro, G.; Keller, L.; Al-Arifi, N. S.; Moustafa, S. S. R.
2016-11-01
This paper compares three different methodologies which can be used to analyze surface-wave propagation and thereby obtain the vertical shear-wave velocity (VS) profile. The three presented methods (currently still quite unconventional) are characterized by different field procedures and data processing. The first methodology is an evolution of the classical Multi-channel Analysis of Surface Waves (MASW), here accomplished by jointly considering Rayleigh and Love waves (analyzed according to the Full Velocity Spectrum approach) and the Horizontal-to-Vertical Spectral Ratio (HVSR). The second method is based on the joint analysis of the HVSR curve together with the Rayleigh-wave dispersion determined via Miniature Array Analysis of Microtremors (MAAM), a passive methodology that relies on a small number (4 to 6) of vertical geophones deployed along a small circle (for common near-surface applications the radius usually ranges from 0.6 to 5 m). Finally, the third considered approach is based on active data acquired by a single 3-component geophone and relies on the joint inversion of the group-velocity spectra of the radial and vertical components of the Rayleigh waves, together with the Radial-to-Vertical Spectral Ratio (RVSR). The results of the analyses performed with these approaches (completely different both in terms of field procedures and data analysis) appear extremely consistent, thus mutually validating their performances. Pros and cons of each approach are summarized both in terms of computational aspects and of practical considerations regarding the specific character of the pertinent field procedures.
Horizon Mission Methodology - A tool for the study of technology innovation and new paradigms
NASA Technical Reports Server (NTRS)
Anderson, John L.
1993-01-01
The Horizon Mission (HM) methodology was developed to provide a means of identifying and evaluating highly innovative, breakthrough technology concepts (BTCs) and for assessing their potential impact on advanced space missions. The methodology is based on identifying new capabilities needed by hypothetical 'horizon' space missions having performance requirements that cannot be met even by extrapolating known space technologies. Normal human evaluation of new ideas such as BTCs appears to be governed (and limited) by 'inner models of reality' defined as paradigms. Thus, new ideas are evaluated by old models. This paper describes the use of the HM Methodology to define possible future paradigms that would provide alternatives to evaluation by current paradigms. The approach is to represent a future paradigm by a set of new BTC-based capabilities - called a paradigm abstract. The paper describes methods of constructing and using the abstracts for evaluating BTCs for space applications and for exploring the concept of paradigms and paradigm shifts as a representation of technology innovation.
Heavy atom labeled nucleotides for measurement of kinetic isotope effects.
Weissman, Benjamin P; Li, Nan-Sheng; York, Darrin; Harris, Michael; Piccirilli, Joseph A
2015-11-01
Experimental analysis of kinetic isotope effects represents an extremely powerful approach for gaining information about the transition state structure of complex reactions not available through other methodologies. The implementation of this approach to the study of nucleic acid chemistry requires the synthesis of nucleobases and nucleotides enriched for heavy isotopes at specific positions. In this review, we highlight current approaches to the synthesis of nucleic acids enriched site specifically for heavy oxygen and nitrogen and their application in heavy atom isotope effect studies. This article is part of a special issue titled: Enzyme Transition States from Theory and Experiment. Copyright © 2015 Elsevier B.V. All rights reserved.
Reflective random indexing for semi-automatic indexing of the biomedical literature.
Vasuki, Vidya; Cohen, Trevor
2010-10-01
The rapid growth of biomedical literature is evident in the increasing size of the MEDLINE research database. Medical Subject Headings (MeSH), a controlled set of keywords, are used to index all the citations contained in the database to facilitate search and retrieval. This volume of citations calls for efficient tools to assist indexers at the US National Library of Medicine (NLM). Currently, the Medical Text Indexer (MTI) system provides assistance by recommending MeSH terms based on the title and abstract of an article using a combination of distributional and vocabulary-based methods. In this paper, we evaluate a novel approach toward indexer assistance by using nearest neighbor classification in combination with Reflective Random Indexing (RRI), a scalable alternative to the established methods of distributional semantics. On a test set provided by the NLM, our approach significantly outperforms the MTI system, suggesting that the RRI approach would make a useful addition to the current methodologies.
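The nearest-neighbour recommendation idea described above can be illustrated with a small sketch. Everything below (the toy citations, MeSH labels, vector dimensionality and the single reflective iteration) is invented for illustration and is not the NLM/MTI data or the authors' code; it only shows how random index vectors, one reflective re-training pass, and k-nearest-neighbour voting fit together.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)
DIM, SEEDS = 300, 10          # vector dimensionality, nonzero seeds per term

def elemental_vector():
    """Sparse ternary random vector assigned to a term."""
    v = np.zeros(DIM)
    idx = rng.choice(DIM, SEEDS, replace=False)
    v[idx] = rng.choice([-1.0, 1.0], SEEDS)
    return v

train_docs = {   # toy training citations: abstract text plus assigned MeSH terms
    "d1": ("asthma airway inflammation treatment", {"Asthma", "Inflammation"}),
    "d2": ("myocardial infarction risk smoking", {"Myocardial Infarction"}),
    "d3": ("airway obstruction asthma children", {"Asthma", "Child"}),
}

term_vecs = {}

def doc_vector(text):
    """Document vector: normalised sum of the vectors of its terms."""
    v = np.zeros(DIM)
    for t in text.split():
        if t not in term_vecs:
            term_vecs[t] = elemental_vector()
        v += term_vecs[t]
    n = np.linalg.norm(v)
    return v / n if n else v

doc_vecs = {d: doc_vector(text) for d, (text, _) in train_docs.items()}

# one "reflective" iteration: rebuild term vectors from the documents that
# contain them, then rebuild the document vectors from those term vectors
for t in list(term_vecs):
    term_vecs[t] = sum(doc_vecs[d] for d, (text, _) in train_docs.items()
                       if t in text.split())
doc_vecs = {d: doc_vector(text) for d, (text, _) in train_docs.items()}

def recommend(text, k=2):
    """Recommend MeSH terms voted by the k nearest training citations."""
    q = doc_vector(text)
    nearest = sorted(doc_vecs, key=lambda d: -float(q @ doc_vecs[d]))[:k]
    votes = Counter(m for d in nearest for m in train_docs[d][1])
    return [m for m, _ in votes.most_common()]

print(recommend("childhood asthma and airway inflammation"))
```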
Constenla, Dagna
2015-03-24
Economic evaluations have routinely understated the net benefits of vaccination by not including the full range of economic benefits that accrue over the lifetime of a vaccinated person. Broader approaches for evaluating benefits of vaccination can be used to more accurately calculate the value of vaccination. This paper reflects on the methodology of one such approach - the health investment life course approach - that looks at the impact of vaccine investment on lifetime returns. The role of this approach on vaccine decision-making will be assessed using the malaria health investment life course model example. We describe a framework that measures the impact of a health policy decision on government accounts over many generations. The methodological issues emerging from this approach are illustrated with an example from a recently completed health investment life course analysis of malaria vaccination in Ghana. Beyond the results, various conceptual and practical challenges of applying this framework to Ghana are discussed in this paper. The current framework seeks to understand how disease and available technologies can impact a range of economic parameters such as labour force participation, education, healthcare consumption, productivity, wages or economic growth, and taxation following their introduction. The framework is unique amongst previous economic models in malaria because it considers future tax revenue for governments. The framework is complementary to cost-effectiveness and budget impact analysis. The intent of this paper is to stimulate discussion on how existing and new methodology can add to knowledge regarding the benefits from investing in new and underutilized vaccines. Copyright © 2015 Elsevier Ltd. All rights reserved.
Koundouri, P; Ker Rault, P; Pergamalis, V; Skianis, V; Souliotis, I
2016-01-01
The development of the Water Framework Directive aimed to establish an integrated framework of water management at the European level. This framework revolves around inland surface waters, transitional waters, coastal waters and ground waters. In the process of achieving the environmental and ecological objectives set by the Directive, the role of economics is placed at the core of water management. An important feature of the Directive is the recovery of the total economic cost of water services from all users; the total cost of water services can be disaggregated into environmental, financial and resource costs. Another important aspect of the Directive is the identification of major drivers and pressures in each River Basin District. We describe a methodology that aims to achieve sustainable environmental and socioeconomic management of freshwater ecosystem services. The Ecosystem Services Approach is at the core of the suggested methodology for the implementation of more sustainable and efficient water management. This approach consists of the following three steps: (i) socio-economic characterization of the River Basin area, (ii) assessment of the current recovery of water use cost, and (iii) identification and suggestion of appropriate programs of measures for sustainable water management over space and time. This methodology is consistent with (a) the economic principles adopted explicitly by the Water Framework Directive (WFD), (b) the three-step WFD implementation approach adopted in the WATECO document, and (c) the Ecosystem Services Approach to valuing freshwater goods and services to humans. Furthermore, we analyze how the effects of multiple stressors and socio-economic development can be quantified in the context of freshwater resources management. We also estimate the value of four ecosystem services using the benefit transfer approach for the Anglian River Basin, which showed the significance of such services. Copyright © 2015. Published by Elsevier B.V.
Johnson, Blair T; Low, Robert E; MacDonald, Hayley V
2015-01-01
Systematic reviews now routinely assess methodological quality to gauge the validity of the included studies and of the synthesis as a whole. Although trends from higher quality studies should be clearer, it is uncertain how often meta-analyses incorporate methodological quality in models of study results either as predictors, or, more interestingly, in interactions with theoretical moderators. We survey 200 meta-analyses in three health promotion domains to examine when and how meta-analyses incorporate methodological quality. Although methodological quality assessments commonly appear in contemporary meta-analyses (usually as scales), they are rarely incorporated in analyses, and still more rarely analysed in interaction with theoretical determinants of the success of health promotions. The few meta-analyses (2.5%) that did include such an interaction analysis showed that moderator results remained significant in higher quality studies or were present only among higher quality studies. We describe how to model quality interactively with theoretically derived moderators and discuss strengths and weaknesses of this approach and in relation to current meta-analytic practice. In large literatures exhibiting heterogeneous effects, meta-analyses can incorporate methodological quality and generate conclusions that enable greater confidence not only about the substantive phenomenon but also about the role that methodological quality itself plays.
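A minimal sketch of what modelling quality interactively with a theoretically derived moderator can look like: a weighted meta-regression in which the effect size is regressed on a moderator, a quality score and their product. The effect sizes, variances and scores below are fabricated purely to show the model form, and plain inverse-variance weighted least squares stands in for whatever estimator a given meta-analysis would actually use.

```python
import numpy as np

# per-study data: effect size d, sampling variance, moderator (e.g. dose),
# and a methodological quality score (all values invented)
d    = np.array([0.30, 0.55, 0.20, 0.65, 0.10, 0.50])
var  = np.array([0.04, 0.03, 0.05, 0.02, 0.06, 0.03])
mod  = np.array([1.0, 2.0, 1.0, 3.0, 1.0, 2.0])
qual = np.array([0.4, 0.8, 0.3, 0.9, 0.2, 0.7])

# design matrix with the quality x moderator interaction term
X = np.column_stack([np.ones_like(d), mod, qual, mod * qual])
W = np.diag(1.0 / var)                      # inverse-variance weights

# weighted least squares: beta = (X'WX)^-1 X'W d
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ d)
labels = ["intercept", "moderator", "quality", "moderator x quality"]
for name, b in zip(labels, beta):
    print(f"{name:>20s}: {b:+.3f}")
```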
ERIC Educational Resources Information Center
O'Dea, Jennifer A.
2005-01-01
Purpose: The purpose of this paper is to review current programmes and major issues surrounding preventive interventions for body image and obesity in schools. Design/methodology/approach: A literature review was carried out by analysing papers cited in major literature databases from the last 50 years. This review describes and summarises…
The Development of Methodology to Support Comprehensive Approach: TMC
2014-05-02
This project, aimed at developing methodology to support a comprehensive approach to complex situations, was part of the DRDC forward-looking Technology Investment Fund (TIF). In an earlier evaluation, participants in the experimental condition outperformed participants in the control group (Lizotte, Bernier, Mokhtari, & Boivin, 2012). The current follow-on effort originated from needs identified in Canadian Forces operations.
ERIC Educational Resources Information Center
Krishnamurthy, M.
2008-01-01
Purpose: The purpose of this paper is to describe the open access and open source movement in the digital library world. Design/methodology/approach: A review of key developments in the open access and open source movement is provided. Findings: Open source software and open access to research findings are of great use to scholars in developing…
ERIC Educational Resources Information Center
Buchanan, F. Robert; Kim, Kong-Hee; Basham, Randall
2007-01-01
Purpose: The purpose of this study is to explore career orientations of business master's degree seekers in comparison with social work degree pursuers in an effort to provide insight for educators and policy makers. Design/methodology/approach: A web-based survey of current master's students from two graduate schools at a large university…
ERIC Educational Resources Information Center
Damian, Radu; Grifoll, Josep; Rigbers, Anke
2015-01-01
In this paper the current national legislations, the quality assurance approaches and the activities of impact analysis of three quality assurance agencies from Romania, Spain and Germany are described from a strategic perspective. The analysis shows that the general methodologies (comprising, for example, self-evaluation reports, peer reviews,…
ERIC Educational Resources Information Center
Winkel, Olaf
2010-01-01
Purpose: The purpose of this paper is to provide information about the current reform of higher education in Germany, which can be described as German reading of the Bologna process, about the problems and deficits occurring in this area, and about ways to correct unwelcome developments. Design/methodology/approach: The paper starts with a review…
ERIC Educational Resources Information Center
Kohlström, Kirsi; Rantatalo, Oscar; Karp, Staffan; Padyab, Mojgan
2017-01-01
Purpose: This study aims to examine how subgroups within a cohort of Swedish police students value different types of curricula content (i.e. new competencies versus enduring ones) in the context of the currently transforming landscape of basic police training. Design/methodology/approach: Drawing on a Swedish national survey (N = 369), the study…
ERIC Educational Resources Information Center
Thom, Marco
2017-01-01
Purpose: The purpose of this paper is to report on the current state of arts entrepreneurship education at higher educational institutions (HEIs) in the UK and Germany. It is based on findings from questionnaire surveys among 210 lecturers in fine art at 89 HEIs in the UK and Germany. Design/methodology/approach: This paper explores issues related…
ERIC Educational Resources Information Center
Thom, Marco
2017-01-01
Purpose: The purpose of this paper is to elucidate the current state of arts entrepreneurship education at higher educational institutions (HEIs) by reviewing the relevant literature and surveying lecturers in Fine Art. Design/methodology/approach: The analysis of fine art students' educational situation at HEIs in the UK and Germany is conducted…
ERIC Educational Resources Information Center
Quader, Sarab Abu-Rabia; Oplatka, Izhar
2008-01-01
Purpose: The current paper aims to tell the stories of six female supervisors who have successfully managed to access this high-level position in the Bedouin educational system, putting forward some implications for understanding and exploring the lives and career of women in patriarchal, minority groups. Design/methodology/approach: Six female…
ERIC Educational Resources Information Center
Holden, Heather; Ozok, Ant; Rada, Roy
2008-01-01
Purpose: The purpose of this study is to explore the current usage and acceptance of classroom technologies by secondary math/science education teachers in one community. Design/methodology/approach: Forty-seven secondary education math and science teachers in one American city responded to a survey about their use and perceptions of technology in…
ERIC Educational Resources Information Center
Purdy, Noel; York, Leanne
2016-01-01
This study aimed to investigate internet usage among post-primary pupils in years 9, 11 and 13 in two contrasting post-primary schools in Northern Ireland, the nature and incidence of cyberbullying among these pupils, and the ways in which their schools are currently addressing the problem. A mixed methodological approach was adopted: a paper…
NASA Astrophysics Data System (ADS)
Ye, Su; Pontius, Robert Gilmore; Rakshit, Rahul
2018-07-01
Object-based image analysis (OBIA) has gained widespread popularity for creating maps from remotely sensed data. Researchers routinely claim that OBIA procedures outperform pixel-based procedures; however, it is not immediately obvious how to evaluate the degree to which an OBIA map compares to reference information in a manner that accounts for the fact that the OBIA map consists of objects that vary in size and shape. Our study reviews 209 journal articles concerning OBIA published between 2003 and 2017. We focus on the three stages of accuracy assessment: (1) sampling design, (2) response design and (3) accuracy analysis. First, we report the literature's overall characteristics concerning OBIA accuracy assessment. Simple random sampling was the most used method among probability sampling strategies, slightly more than stratified sampling. Office interpreted remotely sensed data was the dominant reference source. The literature reported accuracies ranging from 42% to 96%, with an average of 85%. A third of the articles failed to give sufficient information concerning accuracy methodology such as sampling scheme and sample size. We found few studies that focused specifically on the accuracy of the segmentation. Second, we identify a recent increase of OBIA articles in using per-polygon approaches compared to per-pixel approaches for accuracy assessment. We clarify the impacts of the per-pixel versus the per-polygon approaches respectively on sampling, response design and accuracy analysis. Our review defines the technical and methodological needs in the current per-polygon approaches, such as polygon-based sampling, analysis of mixed polygons, matching of mapped with reference polygons and assessment of segmentation accuracy. Our review summarizes and discusses the current issues in object-based accuracy assessment to provide guidance for improved accuracy assessments for OBIA.
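The per-polygon versus per-pixel distinction discussed above can be made concrete with a tiny hedged example: the polygons, labels and areas below are invented, and the two accuracy figures differ only because one counts polygons while the other weights agreement by polygon area.

```python
# toy object-based map: (mapped class, reference class, polygon area in pixels)
polygons = [
    ("forest", "forest", 950),
    ("water",  "water",  300),
    ("urban",  "forest",  40),   # small polygon, wrong label
    ("forest", "urban",  700),   # large polygon, wrong label
]

# per-polygon accuracy: fraction of polygons whose label matches the reference
per_polygon = sum(m == r for m, r, _ in polygons) / len(polygons)

# per-pixel (area-weighted) accuracy: fraction of mapped area correctly labelled
total_area = sum(a for _, _, a in polygons)
per_pixel = sum(a for m, r, a in polygons if m == r) / total_area

print(f"per-polygon accuracy:   {per_polygon:.2f}")   # 0.50
print(f"area-weighted accuracy: {per_pixel:.2f}")     # ~0.63
```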
NASA Astrophysics Data System (ADS)
Qaddus, Muhammad Kamil
The gap between estimated and actual savings in energy efficiency and conservation (EE&C) projects or programs forms the problem statement for the scope of public and government buildings. This gap is analyzed first at the impact level and then at the process level. At the impact level, the methodology categorizes the gap as a 'realization gap' and views that categorization within the context of past and current narratives linked to the realization gap. At the process level, the methodology further analyzes the realization gap on a process evaluation basis. The process evaluation criterion produced from this analysis is then applied to two different programs (DESEU and NYC ACE) linked to the scope of this thesis. Utilizing the synergies of the impact- and process-level analyses, the thesis offers proposals on program development and structure based on the process evaluation criterion. An innovative financing and benefit distribution structure is developed as part of the proposal; restricted stakeholder crowd financing and risk-free incentivized return are the respective products of the proposed financing and benefit distribution structures. These products are then complemented by an alternative approach to estimating EE&C savings, which advocates estimation based on range allocation rather than the currently used unique estimated-savings approach. The Way Ahead section explores the synergy between financial and engineering ranges of energy savings as a multi-discipline approach for future research, and provides the proposed program structure with risk aversion and incentive allocation under uncertainty. This set of new approaches is believed to better close the realization gap between estimated and actual energy efficiency savings.
NASA Astrophysics Data System (ADS)
Orlaineta-Agüero, S.; Del Sol-Fernández, S.; Sánchez-Guzmán, D.; García-Salcedo, R.
2017-01-01
In the present work we show the implementation of a learning sequence based on an active learning methodology for teaching physics. The proposal aims to promote better learning in high school students through the use of a comic book, combined with different low-cost experimental activities for teaching the electrical concepts of current, resistance and voltage. We consider that this kind of strategy can be easily extrapolated to higher education levels, such as engineering at college/university level, and to other disciplines of science. To evaluate the proposal, we used some conceptual questions from the Electric Circuits Concept Evaluation survey developed by Sokoloff; the results were analysed with the normalized conceptual gain proposed by Hake and the concentration factor proposed by Bao and Redish, to identify the effectiveness of the methodology and the models that the students presented before and after instruction, respectively. We found that this methodology was more effective than traditional lectures alone. These results cannot be generalized, but they gave us the opportunity to view many important approaches in physics education; we will continue to apply the same experiment with more students, at the same and higher levels of education, to confirm and validate the effectiveness of this methodological proposal.
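The two measures named above are simple to compute. The sketch below uses the formulas as they are commonly stated in the physics education literature (Hake's normalized gain g = (post - pre) / (100 - pre) on class-average percentage scores, and the Bao-Redish concentration factor for a single multiple-choice item with m options); the class scores and response counts are made up and are not the authors' survey data.

```python
from math import sqrt

def normalized_gain(pre_pct, post_pct):
    """Hake's average normalized gain from class-average pre/post scores (in %)."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

def concentration_factor(counts):
    """Bao-Redish concentration factor in [0, 1] from per-option response counts."""
    m, n = len(counts), sum(counts)
    return (sqrt(m) / (sqrt(m) - 1)) * (sqrt(sum(c * c for c in counts)) / n
                                        - 1 / sqrt(m))

print(f"normalized gain:      {normalized_gain(35.0, 62.0):.2f}")     # ~0.42
print(f"concentration factor: {concentration_factor([28, 3, 2, 1]):.2f}")
```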
Study on the performance of different craniofacial superimposition approaches (I).
Ibáñez, O; Vicente, R; Navega, D S; Wilkinson, C; Jayaprakash, P T; Huete, M I; Briers, T; Hardiman, R; Navarro, F; Ruiz, E; Cavalli, F; Imaizumi, K; Jankauskas, R; Veselovskaya, E; Abramov, A; Lestón, P; Molinero, F; Cardoso, J; Çağdır, A S; Humpire, D; Nakanishi, Y; Zeuner, A; Ross, A H; Gaudio, D; Damas, S
2015-12-01
As part of the scientific tasks coordinated throughout the 'New Methodologies and Protocols of Forensic Identification by Craniofacial Superimposition (MEPROCS)' project, the current study aims to analyse the performance of a diverse set of CFS methodologies and the corresponding technical approaches when dealing with a common dataset of real-world cases. Thus, a multiple-lab study on craniofacial superimposition has been carried out for the first time. In particular, 26 participants from 17 different institutions in 13 countries were asked to deal with 14 identification scenarios, some of them involving the comparison of multiple candidates and unknown skulls: in total, 60 craniofacial superimposition problems divided into two sets, females and males. Each participant followed her/his own methodology and employed her/his particular technological means. For each single case they were asked to report the final identification decision (either positive or negative), along with the rationale supporting the decision and at least one image illustrating the overlay/superimposition outcome. This study is expected to provide important insights to better understand the most convenient characteristics of every method included in this study. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Furedy, John J
2003-11-01
The differential/experimental distinction that Cronbach specified is important because any adequate account of psychological phenomena requires the recognition of the validity of both approaches, and a meaningful melding of the two. This paper suggests that Pavlov's work in psychology, based on earlier traditions of inquiry that can be traced back to the pre-Socratics, provides a potential way of achieving this melding, although such features as systematic rather than anecdotal methods of observation need to be added. Pavlov's methodological behaviorist approach is contrasted with metaphysical behaviorism (as exemplified explicitly in Watson and Skinner, and implicitly in the computer-metaphorical, information-processing explanations employed by current "cognitive" psychology). A common feature of the metaphysical approach is that individual-differences variables like sex are essentially ignored, or relegated to ideological categories such as the treatment of sex as merely a "social construction." Examples of research both before and after the "cognitive revolution" are presented where experimental and differential methods are melded, and individual differences are treated as phenomena worthy of investigation rather than as nuisance factors that merely add to experimental error.
A Comprehensive Validation Approach Using The RAVEN Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alfonsi, Andrea; Rabiti, Cristian; Cogliati, Joshua J
2015-06-01
The RAVEN computer code, developed at the Idaho National Laboratory, is a generic software framework to perform parametric and probabilistic analysis based on the response of complex system codes. RAVEN is a multi-purpose probabilistic and uncertainty quantification platform, capable of communicating with any system code. A natural extension of the RAVEN capabilities is the implementation of an integrated validation methodology, involving several different metrics, that represents an evolution of the methods currently used in the field. State-of-the-art validation approaches use neither exploration of the input space through sampling strategies nor a comprehensive variety of metrics needed to interpret the code responses with respect to experimental data; the RAVEN code allows both of these gaps to be addressed. In the following sections, the employed methodology, and its application to the newly developed thermal-hydraulic code RELAP-7, is reported. The validation approach has been applied to an integral effect experiment representing natural circulation, based on the activities performed by EG&G Idaho. Four different experiment configurations have been considered and nodalized.
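A hedged sketch of the general idea (not RAVEN itself, nor RELAP-7): propagate sampled input uncertainties through a stand-in "system code" and compare the resulting response distribution with experimental data using one simple distributional metric. The toy model, the made-up measurements and the choice of a CDF-area metric are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def system_code(mass_flow, heat_loss):
    """Toy stand-in for a system code's figure of merit (e.g. a peak temperature)."""
    return 600.0 - 40.0 * mass_flow + 25.0 * heat_loss

# explore the uncertain input space by Monte Carlo sampling
mass_flow = rng.normal(1.0, 0.1, 5000)
heat_loss = rng.uniform(0.0, 0.5, 5000)
predicted = system_code(mass_flow, heat_loss)

# experimental measurements of the same figure of merit (made-up values)
measured = rng.normal(567.0, 6.0, 40)

def cdf_area_metric(a, b, grid_size=512):
    """Area between the two empirical CDFs, one simple validation metric."""
    grid = np.linspace(min(a.min(), b.min()), max(a.max(), b.max()), grid_size)
    cdf_a = np.searchsorted(np.sort(a), grid, side="right") / a.size
    cdf_b = np.searchsorted(np.sort(b), grid, side="right") / b.size
    return float(np.sum(np.abs(cdf_a - cdf_b)) * (grid[1] - grid[0]))

print(f"code vs. experiment area metric: {cdf_area_metric(predicted, measured):.2f}")
```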
Stone, Susanna; Johnson, Kate M; Beall, Erica; Meindl, Peter; Smith, Benjamin; Graham, Jesse
2014-07-01
Political psychology is a dynamic field of research that offers a unique blend of approaches and methods in the social and cognitive sciences. Political psychologists explore the interactions between macrolevel political structures and microlevel factors such as decision-making processes, motivations, and perceptions. In this article, we provide a broad overview of the field, beginning with a brief history of political psychology research and a summary of the primary methodological approaches in the field. We then give a more detailed account of research on ideology and social justice, two topics experiencing a resurgence of interest in current political psychology. Finally, we cover research on political persuasion and voting behavior. By summarizing these major areas of political psychology research, we hope to highlight the wide variety of theoretical and methodological approaches of cognitive scientists working at the intersection of psychology and political science. WIREs Cogn Sci 2014, 5:373-385. doi: 10.1002/wcs.1293 For further resources related to this article, please visit the WIREs website. The authors have declared no conflicts of interest for this article. © 2014 John Wiley & Sons, Ltd.
Bámaca-Colbert, Mayra Y; Gayles, Jochebed G
2010-11-01
The overall aim of the current study was to identify the methodological approach and corresponding analytic procedure that best elucidated the associations among Mexican-origin mother-daughter cultural orientation dissonance, family functioning, and adolescent adjustment. To do so, we employed, and compared, two methodological approaches (i.e., variable-centered and person-centered) via four analytic procedures (i.e., difference score, interactive, matched/mismatched grouping, and latent profiles). The sample consisted of 319 girls in the 7th or 10th grade and their mother or mother figure from a large Southwestern, metropolitan area in the US. Family factors were found to be important predictors of adolescent adjustment in all models. Although some findings were similar across all models, overall, findings suggested that the latent profile procedure best elucidated the associations among the variables examined in this study. In addition, associations were present across early and middle adolescents, with a few findings being only present for one group. Implications for using these analytic procedures in studying cultural and family processes are discussed.
Amariti, M L; Restori, M; De Ferrari, F; Paganelli, C; Faglia, R; Legnani, G
1999-06-01
Age determination by tooth examination is one of the main means of personal identification. Current studies have suggested different techniques for determining the age of a subject by means of the analysis of microscopic and macroscopic structural modifications of the tooth with ageing. The histological approach is useful among the various methodologies utilized for this purpose. It is still unclear which technique is best, as almost all authors suggest the use of the approach they themselves have tested. In the present study, age determination by means of microscopic techniques has been based on the quantitative analysis of three parameters, all well recognized in the specialized literature: (1) dentinal tubule density/sclerosis, (2) tooth translucency, and (3) analysis of the cementum thickness. After a description of the three methodologies (with automatic image processing of the dentinal sclerosis utilizing an appropriate computer program developed by the authors), the results obtained on cases using the three different approaches are presented, and the merits and failings of each technique are identified, with the intention of identifying the one offering the least degree of error in age determination.
Butera, R J; Wilson, C G; Delnegro, C A; Smith, J C
2001-12-01
We present a novel approach to implementing the dynamic-clamp protocol (Sharp et al., 1993), commonly used in neurophysiology and cardiac electrophysiology experiments. Our approach is based on real-time extensions to the Linux operating system. Conventional PC-based approaches have typically utilized single-cycle computational rates of 10 kHz or slower; in this paper, we demonstrate reliable cycle-to-cycle rates as fast as 50 kHz. Our system, which we call model reference current injection (MRCI; pronounced 'merci'), is also capable of episodic logging of internal state variables and interactive manipulation of model parameters. The limiting factor in achieving high speeds was not processor speed or model complexity, but cycle jitter inherent in the CPU/motherboard performance. We demonstrate these high speeds and flexibility with two examples: 1) adding action-potential ionic currents to a mammalian neuron under whole-cell patch-clamp and 2) altering a cell's intrinsic dynamics via MRCI while simultaneously coupling it via artificial synapses to an internal computational model cell. These higher rates greatly extend the applicability of this technique to the study of fast electrophysiological currents, such as fast ionic currents and fast excitatory/inhibitory synapses.
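A simplified, simulation-only sketch of what one dynamic-clamp cycle does: read the membrane potential, evaluate a model conductance, and inject I = g*m*(E_rev - V). Real implementations run this loop on real-time hardware at rates like the 10-50 kHz cited above; the toy cell model, the artificial conductance and all parameter values below are illustrative assumptions, not the MRCI code.

```python
import numpy as np

DT = 2e-5              # 20 us per cycle, i.e. a 50 kHz update rate
C_M = 100e-12          # membrane capacitance (F)
G_LEAK, E_LEAK = 5e-9, -65e-3
G_ADD, E_ADD, TAU = 10e-9, -90e-3, 5e-3   # artificial K-like conductance

v, m = -65e-3, 0.0
trace = []
for step in range(int(0.2 / DT)):            # 200 ms of simulated recording
    # "read" the membrane potential (here taken from the simulated cell itself)
    m_inf = 1.0 / (1.0 + np.exp(-(v + 40e-3) / 5e-3))
    m += DT * (m_inf - m) / TAU               # gating variable of the clamp model
    i_inject = G_ADD * m * (E_ADD - v)        # current computed and injected
    i_leak = G_LEAK * (E_LEAK - v)
    i_stim = 150e-12 if step * DT > 0.05 else 0.0   # a step current stimulus
    v += DT * (i_leak + i_inject + i_stim) / C_M
    trace.append(v)

print(f"final membrane potential: {trace[-1] * 1e3:.1f} mV")
```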
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew
GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of PRA methodologies to conduct a mechanistic source term (MST) analysis for event sequences that could result in the release of radionuclides. The MST analysis seeks to realistically model and assess the transport, retention, and release of radionuclides from the reactor to the environment. The MST methods developed during this project seek to satisfy the requirements of the Mechanistic Source Term element of the ASME/ANS Non-LWR PRA standard. The MST methodology consists of separate analysis approaches for risk-significant and non-risk-significant event sequences that may result in the release of radionuclides from the reactor. For risk-significant event sequences, the methodology focuses on a detailed assessment, using mechanistic models, of radionuclide release from the fuel, transport through and release from the primary system, transport in the containment, and finally release to the environment. The analysis approach for non-risk-significant event sequences examines the possibility of large radionuclide releases due to events such as re-criticality or the complete loss of radionuclide barriers. This paper provides details on the MST methodology, including the interface between the MST analysis and other elements of the PRA, and provides a simplified example MST calculation for a sodium fast reactor.
An Agile Course-Delivery Approach
ERIC Educational Resources Information Center
Capellan, Mirkeya
2009-01-01
In the world of software development, agile methodologies have gained popularity thanks to their lightweight methodologies and flexible approach. Many advocates believe that agile methodologies can provide significant benefits if applied in the educational environment as a teaching method. The need for an approach that engages and motivates…
Zhu, Yi; Han, Jianlin; Wang, Jiandong; Shibata, Norio; Sodeoka, Mikiko; Soloshonok, Vadim A; Coelho, Jaime A S; Toste, F Dean
2018-04-11
New methods for preparation of tailor-made fluorine-containing compounds are in extremely high demand in nearly every sector of chemical industry. The asymmetric construction of quaternary C-F stereogenic centers is the most synthetically challenging and, consequently, the least developed area of research. As a reflection of this apparent methodological deficit, pharmaceutical drugs featuring C-F stereogenic centers constitute less than 1% of all fluorine-containing medicines currently on the market or in clinical development. Here we provide a comprehensive review of current research activity in this area, including such general directions as asymmetric electrophilic fluorination via organocatalytic and transition-metal catalyzed reactions, asymmetric elaboration of fluorine-containing substrates via alkylations, Mannich, Michael, and aldol additions, cross-coupling reactions, and biocatalytic approaches.
Computer-aided drug discovery.
Bajorath, Jürgen
2015-01-01
Computational approaches are an integral part of interdisciplinary drug discovery research. Understanding the science behind computational tools, their opportunities, and limitations is essential to make a true impact on drug discovery at different levels. If applied in a scientifically meaningful way, computational methods improve the ability to identify and evaluate potential drug molecules, but there remain weaknesses in the methods that preclude naïve applications. Herein, current trends in computer-aided drug discovery are reviewed, and selected computational areas are discussed. Approaches are highlighted that aid in the identification and optimization of new drug candidates. Emphasis is put on the presentation and discussion of computational concepts and methods, rather than case studies or application examples. As such, this contribution aims to provide an overview of the current methodological spectrum of computational drug discovery for a broad audience.
Hermoso, Maria; Tabacchi, Garden; Iglesia-Altaba, Iris; Bel-Serrat, Silvia; Moreno-Aznar, Luis A; García-Santos, Yurena; García-Luzardo, Ma del Rosario; Santana-Salguero, Beatriz; Peña-Quintana, Luis; Serra-Majem, Lluis; Moran, Victoria Hall; Dykes, Fiona; Decsi, Tamás; Benetou, Vassiliki; Plada, Maria; Trichopoulou, Antonia; Raats, Monique M; Doets, Esmée L; Berti, Cristiana; Cetin, Irene; Koletzko, Berthold
2010-10-01
This paper presents a review of the current knowledge regarding the macro- and micronutrient requirements of infants and discusses issues related to these requirements during the first year of life. The paper also reviews the current reference values used in European countries and the methodological approaches used to derive them by a sample of seven European and international authoritative committees from which background scientific reports are available. Throughout the paper, the main issues contributing to disparities in micronutrient reference values for infants are highlighted. The identification of these issues in relation to the specific physiological aspects of infants is important for informing future initiatives aimed at providing standardized approaches to overcome variability of micronutrient reference values across Europe for this age group. © 2010 Blackwell Publishing Ltd.
Rasheed, Nadia; Amin, Shamsudin H. M.
2016-01-01
Grounded language acquisition is an important issue, particularly to facilitate human-robot interactions in an intelligent and effective way. The evolutionary and developmental language acquisition are two innovative and important methodologies for the grounding of language in cognitive agents or robots, the aim of which is to address current limitations in robot design. This paper concentrates on these two main modelling methods with the grounding principle for the acquisition of linguistic ability in cognitive agents or robots. This review not only presents a survey of the methodologies and relevant computational cognitive agents or robotic models, but also highlights the advantages and progress of these approaches for the language grounding issue. PMID:27069470
Hohmann, Erik; Brand, Jefferson C; Rossi, Michael J; Lubowitz, James H
2018-02-01
Our current trend and focus on evidence-based medicine is biased in favor of randomized controlled trials, which are ranked highest in the hierarchy of evidence while devaluing expert opinion, which is ranked lowest in the hierarchy. However, randomized controlled trials have weaknesses as well as strengths, and no research method is flawless. Moreover, stringent application of scientific research techniques, such as the Delphi Panel methodology, allows survey of experts in a high quality and scientific manner. Level V evidence (expert opinion) remains a necessary component in the armamentarium used to determine the answer to a clinical question. Copyright © 2017 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.
Current algebra, statistical mechanics and quantum models
NASA Astrophysics Data System (ADS)
Vilela Mendes, R.
2017-11-01
Results obtained in the past for free boson systems at zero and nonzero temperatures are revisited to clarify the physical meaning of current algebra reducible functionals which are associated to systems with density fluctuations, leading to observable effects on phase transitions. To use current algebra as a tool for the formulation of quantum statistical mechanics amounts to the construction of unitary representations of diffeomorphism groups. Two mathematical equivalent procedures exist for this purpose. One searches for quasi-invariant measures on configuration spaces, the other for a cyclic vector in Hilbert space. Here, one argues that the second approach is closer to the physical intuition when modelling complex systems. An example of application of the current algebra methodology to the pairing phenomenon in two-dimensional fermion systems is discussed.
Decision-theoretic methodology for reliability and risk allocation in nuclear power plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cho, N.Z.; Papazoglou, I.A.; Bari, R.A.
1985-01-01
This paper describes a methodology for allocating reliability and risk to various reactor systems, subsystems, components, operations, and structures in a consistent manner, based on a set of global safety criteria which are not rigid. The problem is formulated as a multiattribute decision analysis paradigm; the multiobjective optimization, which is performed on a PRA model and reliability cost functions, serves as the guiding principle for reliability and risk allocation. The concept of noninferiority is used in the multiobjective optimization problem, and finding the noninferior solution set is the main theme of the current approach. The assessment of the decision maker's preferences can then be performed more easily on the noninferior solution set. Some results of applying the methodology to a nontrivial risk model are provided, and several outstanding issues such as generic allocation and preference assessment are discussed.
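The noninferior (Pareto) idea can be illustrated with a small sketch: enumerate candidate reliability allocations over a few systems, score each by total cost and residual risk, and keep only the allocations that no other candidate beats on both objectives. The cost and risk models below are invented placeholders, not the PRA model or cost functions of the paper.

```python
from itertools import product
from math import log10

# candidate failure probabilities allowed for each of three systems
choices = [1e-2, 1e-3, 1e-4]
weights = (1.0, 3.0, 2.0)      # relative cost of a tenfold improvement per system

def cost(alloc):
    """Toy cost model: price of the chosen reliability levels."""
    return sum(w * -log10(p) for w, p in zip(weights, alloc))

def risk(alloc):
    """Toy risk model with two minimal cut sets: {1,2} and {2,3}."""
    p1, p2, p3 = alloc
    return p1 * p2 + p2 * p3

candidates = [(cost(a), risk(a), a) for a in product(choices, repeat=3)]

# keep only noninferior allocations: no other candidate is at least as good on
# both objectives and strictly better on one of them
noninferior = [
    (c, r, a) for (c, r, a) in candidates
    if not any(c2 <= c and r2 <= r and (c2 < c or r2 < r)
               for (c2, r2, _) in candidates)
]

for c, r, a in sorted(noninferior):
    print(f"cost={c:4.1f}  risk={r:.1e}  allocation={a}")
```

The decision maker's preference assessment would then operate only on the short list this filter produces, rather than on the full candidate set.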
NASA Astrophysics Data System (ADS)
Kamalraj, Devaraj; Yuvaraj, Selvaraj; Yoganand, Coimbatore Paramasivam; Jaffer, Syed S.
2018-01-01
Here, we propose a new synthetic methodology for silver nanocluster preparation using a double-stranded DNA (ds-DNA) template, which, to our knowledge, has not been reported before. A new calculative method was formulated to determine the size of the nanoclusters and their band gaps by using the steady-state 3D contour fluorescence technique together with the Brus model. Generally, the structure and size of nanoclusters are determined using High Resolution Transmission Electron Microscopy (HR-TEM); before imaging, the samples are subjected to a drying process, which causes aggregation and forms larger polycrystalline particles, and the method is time-consuming and expensive. In the current methodology, we determined the size and band gap of the nanoclusters in liquid form, without any polycrystalline aggregation, using the 3D contour fluorescence technique as an alternative approach to HR-TEM.
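A hedged sketch of extracting a cluster radius from an optical band gap with the Brus effective-mass expression, E(R) = E_bulk + (h^2 / 8R^2)(1/m_e + 1/m_h) - 1.8 e^2 / (4 pi eps0 eps_r R). The material parameters below (bulk gap, effective masses, dielectric constant) are placeholders chosen for illustration, not values from the paper, and the emission wavelength is an assumed reading from a 3D contour fluorescence map.

```python
from math import pi
from scipy.optimize import brentq

H = 6.626e-34          # Planck constant (J s)
Q = 1.602e-19          # elementary charge (C)
M_E = 9.109e-31        # electron rest mass (kg)
EPS0 = 8.854e-12       # vacuum permittivity (F/m)

E_BULK = 1.2 * Q                       # assumed bulk gap (J)
ME_EFF, MH_EFF = 0.3 * M_E, 0.5 * M_E  # assumed effective masses
EPS_R = 6.0                            # assumed relative permittivity

def brus_gap(radius):
    """Brus expression E(R) for a cluster of radius R (in metres)."""
    confinement = H**2 / (8.0 * radius**2) * (1.0 / ME_EFF + 1.0 / MH_EFF)
    coulomb = 1.8 * Q**2 / (4.0 * pi * EPS0 * EPS_R * radius)
    return E_BULK + confinement - coulomb

def radius_from_gap(e_gap_ev):
    """Invert the Brus relation numerically; returns the radius in nm."""
    target = e_gap_ev * Q
    r = brentq(lambda radius: brus_gap(radius) - target, 2e-10, 2e-8)
    return r * 1e9

# band gap read off a fluorescence map, e.g. an emission peak at ~430 nm
e_gap_ev = 1239.8 / 430.0              # photon energy (eV) from wavelength (nm)
print(f"estimated cluster radius: {radius_from_gap(e_gap_ev):.2f} nm")
```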
MRI-guided robotics at the U of Houston: evolving methodologies for interventions and surgeries.
Tsekos, Nikolaos V
2009-01-01
Currently, we are witnessing the rapid evolution of minimally invasive surgeries (MIS) and image-guided interventions (IGI), which offer improved patient management and cost effectiveness. It is well recognized that sustaining and expanding this paradigm shift will require new computational methodology that integrates sensing with multimodal imaging, actively controlled robotic manipulators, the patient and the operator. Such an approach would include (1) assessing in real time tissue deformation secondary to the procedure and physiologic motion, (2) monitoring the tool(s) in 3D, and (3) providing on-the-fly updates of information about the pathophysiology of the targeted tissue. With those capabilities, real-time image guidance may facilitate a paradigm shift and methodological leap from "keyhole" visualization (i.e. endoscopy or laparoscopy) to one that uses a volumetric and informationally rich perception of the Area of Operation (AoO). This capability may eventually enable IGI and MIS of a wider range and level of complexity.
Tsipouras, Markos G; Giannakeas, Nikolaos; Tzallas, Alexandros T; Tsianou, Zoe E; Manousou, Pinelopi; Hall, Andrew; Tsoulos, Ioannis; Tsianos, Epameinondas
2017-03-01
Collagen proportional area (CPA) extraction in liver biopsy images provides the degree of fibrosis expansion in liver tissue, which is the most characteristic histological alteration in hepatitis C virus (HCV). Assessment of the fibrotic tissue is currently based on semiquantitative staging scores such as Ishak and Metavir. Since its introduction as a fibrotic tissue assessment technique, CPA calculation based on image analysis techniques has proven to be more accurate than semiquantitative scores. However, CPA has yet to reach everyday clinical practice, since the lack of standardized and robust methods for computerized image analysis for CPA assessment has proven to be a major limitation. The current work introduces a three-stage fully automated methodology for CPA extraction based on machine learning techniques. Specifically, clustering algorithms have been employed for background-tissue separation, as well as for fibrosis detection in liver tissue regions, in the first and the third stage of the methodology, respectively. Due to the existence of several types of tissue regions in the image (such as blood clots, muscle tissue, structural collagen, etc.), classification algorithms have been employed to identify liver tissue regions and exclude all other non-liver tissue regions from CPA computation. For the evaluation of the methodology, 79 liver biopsy images have been employed, obtaining 1.31% mean absolute CPA error, with 0.923 concordance correlation coefficient. The proposed methodology is designed to (i) avoid manual threshold-based and region selection processes, widely used in similar approaches presented in the literature, and (ii) minimize CPA calculation time. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
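A simplified sketch of the overall CPA idea: cluster pixel colours to separate background from tissue, flag collagen-like pixels inside the tissue, and report CPA as collagen area over total tissue area. A synthetic image and plain k-means stand in for the staged clustering/classification pipeline described above; none of this is the authors' code or data.

```python
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(0)

# synthetic RGB biopsy-like image: bright background, pink tissue, red collagen
h, w = 120, 120
img = np.full((h, w, 3), 240.0)               # background
img[20:100, 20:100] = [200.0, 120.0, 150.0]   # liver tissue region
img[40:60, 30:90] = [190.0, 40.0, 60.0]       # fibrotic (collagen-rich) band
img += rng.normal(0.0, 5.0, img.shape)        # staining/acquisition noise

pixels = img.reshape(-1, 3)

# stage 1: separate background from tissue (two colour clusters)
centroids, labels = kmeans2(pixels, 2, minit="++", seed=0)
tissue_label = int(np.argmin(centroids.sum(axis=1)))   # darker cluster = tissue
tissue_mask = labels == tissue_label

# stage 2: within the tissue, separate collagen-like pixels from other tissue
t_centroids, t_labels = kmeans2(pixels[tissue_mask], 2, minit="++", seed=0)
collagen_label = int(np.argmin(t_centroids[:, 1]))     # lowest green = reddest
collagen_pixels = int(np.sum(t_labels == collagen_label))

cpa = 100.0 * collagen_pixels / int(tissue_mask.sum())
print(f"collagen proportional area: {cpa:.1f}%")
```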
Uher, Jana
2015-12-01
Taxonomic "personality" models are widely used in research and applied fields. This article applies the Transdisciplinary Philosophy-of-Science Paradigm for Research on Individuals (TPS-Paradigm) to scrutinise the three methodological steps that are required for developing comprehensive "personality" taxonomies: 1) the approaches used to select the phenomena and events to be studied, 2) the methods used to generate data about the selected phenomena and events and 3) the reduction principles used to extract the "most important" individual-specific variations for constructing "personality" taxonomies. Analyses of some currently popular taxonomies reveal frequent mismatches between the researchers' explicit and implicit metatheories about "personality" and the abilities of previous methodologies to capture the particular kinds of phenomena toward which they are targeted. Serious deficiencies that preclude scientific quantifications are identified in standardised questionnaires, psychology's established standard method of investigation. These mismatches and deficiencies derive from the lack of an explicit formulation and critical reflection on the philosophical and metatheoretical assumptions being made by scientists and from the established practice of radically matching the methodological tools to researchers' preconceived ideas and to pre-existing statistical theories rather than to the particular phenomena and individuals under study. These findings raise serious doubts about the ability of previous taxonomies to appropriately and comprehensively reflect the phenomena towards which they are targeted and the structures of individual-specificity occurring in them. The article elaborates and illustrates with empirical examples methodological principles that allow researchers to appropriately meet the metatheoretical requirements and that are suitable for comprehensively exploring individuals' "personality".
Mapping human long bone compartmentalisation during ontogeny: a new methodological approach.
Cambra-Moo, Oscar; Nacarino Meneses, Carmen; Rodríguez Barbero, Miguel Ángel; García Gil, Orosia; Rascón Pérez, Josefina; Rello-Varona, Santiago; Campo Martín, Manuel; González Martín, Armando
2012-06-01
Throughout ontogeny, human bones undergo differentiation in terms of shape, size and tissue type; this is a complex scenario in which the variations in the tissue compartmentalisation of the cortical bone are still poorly understood. Currently, compartmentalisation is studied using methodologies that oversimplify the bone tissue complexity. Here, we present a new methodological approach that integrates a histological description and a mineral content analysis to study the compartmentalisation of all mineralised and non-mineralised tissues (i.e., their spatial distribution in long bone sections). This new methodology, based on Geographical Information System (GIS) software, allows us to draw areas of interest (i.e., to trace vectorial shapes which are quantifiable) in raw images extracted from the microscope and to compare them spatially in a semi-automatic and quantitative fashion. As an example of our methodology, we have studied the tibiae of individuals with different ages at death (infant, juvenile and adult). The tibia's cortical bone presents well-formed fibrolamellar bone, in which remodelling is clearly evidenced from early ontogeny, and we discuss the existence of "lines of arrested growth". Concurrent with the histological variation, Raman and FT-IR spectroscopy analyses corroborate that the mineral content in the cortical bone changes differentially. The anterior portion of the tibia remains highly pierced and is less crystalline than the rest of the cortex during growth, which is evidence of more active and continuous remodelling. Finally, while porosity and other "non-mineralised cavities" are largely modified, the mineralised portion and the marrow cavity size remain proportional during ontogeny. Copyright © 2012 Elsevier Inc. All rights reserved.
Setnik, Beatrice; Schoedel, Kerri A; Levy-Cooperman, Naama; Shram, Megan; Pixton, Glenn C; Roland, Carl L
With the development of opioid abuse-deterrent formulations (ADFs), there is a need to conduct well-designed human abuse potential studies to evaluate the effectiveness of their deterrent properties. Although these types of studies have been conducted for many years, largely to evaluate inherent abuse potential of a molecule and inform drug scheduling, methodological approaches have varied across studies. The focus of this review is to describe current "best practices" and methodological adaptations required to assess abuse-deterrent opioid formulations for regulatory submissions. A literature search was conducted in PubMed® to review methodological approaches (study conduct and analysis) used in opioid human abuse potential studies. Search terms included a combination of "opioid," "opiate," "abuse potential," "abuse liability," "liking," AND "pharmacodynamic," and only studies that evaluated single doses of opioids in healthy, nondependent individuals with or without prior opioid experience were included. Seventy-one human abuse potential studies meeting the prespecified criteria were identified, of which 21 studies evaluated a purported opioid ADF. Based on these studies, key methodological considerations were reviewed and summarized according to participant demographics, study prequalification, comparator and dose selection, route of administration and drug manipulation, study blinding, outcome measures and training, safety, and statistical analyses. The authors recommend careful consideration of key elements (eg, a standardized definition of a "nondependent recreational user"), as applicable, and offer key principles and "best practices" when conducting human abuse potential studies for opioid ADFs. Careful selection of appropriate study conditions is dependent on the type of ADF technology being evaluated.
Current Status and Challenges of Atmospheric Data Assimilation
NASA Astrophysics Data System (ADS)
Atlas, R. M.; Gelaro, R.
2016-12-01
The issues of modern atmospheric data assimilation are fairly simple to comprehend but difficult to address, involving the combination of literally billions of model variables and tens of millions of observations daily. In addition to traditional meteorological variables such as wind, temperature, pressure and humidity, model state vectors are being expanded to include explicit representation of precipitation, clouds, aerosols and atmospheric trace gases. At the same time, model resolutions are approaching single-kilometer scales globally, and new observation types have error characteristics that are increasingly non-Gaussian. This talk describes the current status and challenges of atmospheric data assimilation, including an overview of current methodologies, the difficulty of estimating error statistics, and progress toward coupled earth system analyses.
Conservation planning for biodiversity and wilderness: a real-world example.
Ceauşu, Silvia; Gomes, Inês; Pereira, Henrique Miguel
2015-05-01
Several of the most important conservation prioritization approaches select markedly different areas at global and regional scales. They are designed to maximize a certain biodiversity dimension, such as coverage of species in the case of hotspots and complementarity, or composite properties of ecosystems in the case of wilderness. Most comparisons between approaches have ignored the multidimensionality of biodiversity. We analyze here the results of two species-based methodologies (hotspots and complementarity) and an ecosystem-based methodology (wilderness) at local scale. As zoning of protected areas can increase the effectiveness of conservation, we use the data employed for the management plan of the Peneda-Gerês National Park in Portugal. We compare the approaches against four criteria: species representativeness, wilderness coverage, coverage of important areas for megafauna, and coverage of areas important for regulating ecosystem services. Our results suggest that species- and ecosystem-based approaches select significantly different areas at local scale. Our results also show that no approach covers all biodiversity dimensions well. Species-based approaches cover species distribution better, while the ecosystem-based approach favors wilderness, areas important for megafauna, and ecosystem services. Management actions addressing different dimensions of biodiversity have a potential for contradictory effects, social conflict, and ecosystem services trade-offs, especially in the context of current European biodiversity policies. However, biodiversity is multidimensional, and management and zoning at local level should reflect this aspect. The consideration of both species- and ecosystem-based approaches at local scale is necessary to achieve a wider range of conservation goals.
Ride quality research techniques: Section on general techniques
NASA Technical Reports Server (NTRS)
1977-01-01
Information is gathered about the methods currently used for the study of ride quality in a variety of transportation modes by a variety of research organizations, including universities, Federal agencies, contracting firms, and private industries. Detailed descriptions of these techniques, their strengths and weaknesses, and the organizations using them are presented. The specific efforts of the Group's participants, as well as a variety of feasible approaches not currently in use, are presented as methodological alternatives under the three basic factors which must be considered in ride quality studies: research techniques, research environments, and choice of subjects.
Simultaneous confidence sets for several effective doses.
Tompsett, Daniel M; Biedermann, Stefanie; Liu, Wei
2018-04-03
Construction of simultaneous confidence sets for several effective doses currently relies on inverting the Scheffé type simultaneous confidence band, which is known to be conservative. We develop novel methodology to make the simultaneous coverage closer to its nominal level, for both two-sided and one-sided simultaneous confidence sets. Our approach is shown to be considerably less conservative than the current method, and is illustrated with an example on modeling the effect of smoking status and serum triglyceride level on the probability of the recurrence of a myocardial infarction. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
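An illustrative sketch of the conservative baseline described above (inverting a Scheffé-type simultaneous band), not the authors' improved method: fit a logistic dose-response, build the band around the linear predictor, and collect the doses at which the band covers logit(p) for each target response p. The data are synthetic, and statsmodels is assumed for the logistic fit.

```python
import numpy as np
from scipy.stats import chi2
import statsmodels.api as sm

rng = np.random.default_rng(4)
dose = np.repeat(np.linspace(0.5, 5.0, 10), 30)          # synthetic design
p_true = 1.0 / (1.0 + np.exp(-(-3.0 + 1.2 * dose)))      # true dose-response
y = rng.binomial(1, p_true)

X = sm.add_constant(dose)
fit = sm.Logit(y, X).fit(disp=0)
beta, cov = fit.params, fit.cov_params()

crit = np.sqrt(chi2.ppf(0.95, df=2))      # Scheffe constant for two parameters
grid = np.linspace(0.0, 6.0, 601)
Xg = sm.add_constant(grid)
eta = Xg @ beta                            # fitted linear predictor on the grid
half = crit * np.sqrt(np.einsum("ij,jk,ik->i", Xg, cov, Xg))

# invert the band: a dose belongs to the confidence set for ED_p whenever the
# band around the linear predictor covers logit(p) at that dose
for p in (0.5, 0.9):
    target = np.log(p / (1.0 - p))
    inside = grid[(eta - half <= target) & (target <= eta + half)]
    print(f"ED{int(100 * p)} confidence set: [{inside.min():.2f}, {inside.max():.2f}]")
```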
Path planning in uncertain flow fields using ensemble method
NASA Astrophysics Data System (ADS)
Wang, Tong; Le Maître, Olivier P.; Hoteit, Ibrahim; Knio, Omar M.
2016-10-01
An ensemble-based approach is developed to conduct optimal path planning in unsteady ocean currents under uncertainty. We focus our attention on two-dimensional steady and unsteady uncertain flows, and adopt a sampling methodology that is well suited to operational forecasts, where an ensemble of deterministic predictions is used to model and quantify uncertainty. In an operational setting, much about dynamics, topography, and forcing of the ocean environment is uncertain. To address this uncertainty, the flow field is parametrized using a finite number of independent canonical random variables with known densities, and the ensemble is generated by sampling these variables. For each of the resulting realizations of the uncertain current field, we predict the path that minimizes the travel time by solving a boundary value problem (BVP), based on the Pontryagin maximum principle. A family of backward-in-time trajectories starting at the end position is used to generate suitable initial values for the BVP solver. This allows us to examine and analyze the performance of the sampling strategy and to develop insight into extensions dealing with general circulation ocean models. In particular, the ensemble method enables us to perform a statistical analysis of travel times and consequently develop a path planning approach that accounts for these statistics. The proposed methodology is tested for a number of scenarios. We first validate our algorithms by reproducing simple canonical solutions, and then demonstrate our approach in more complex flow fields, including idealized, steady and unsteady double-gyre flows.
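A hedged sketch of the ensemble idea only (not the authors' Pontryagin/BVP solver): sample an uncertain steady flow field, steer a vehicle through each realization with a naive point-at-the-goal policy, and collect travel-time statistics over the ensemble. The flow model, speeds and geometry are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
START, GOAL = np.array([0.0, 0.0]), np.array([10.0, 0.0])
V_REL, DT, T_MAX = 1.0, 0.05, 100.0     # speed through water, time step, cap

def flow(x, strength):
    """Uncertain cross-track jet; strength is the sampled random parameter."""
    u = 0.0
    v = strength * np.exp(-((x[0] - 5.0) ** 2) / 4.0)   # jet centred at x = 5
    return np.array([u, v])

def travel_time(strength):
    """Integrate the vehicle through one flow realization to the goal."""
    x, t = START.copy(), 0.0
    while np.linalg.norm(GOAL - x) > 0.2 and t < T_MAX:
        heading = (GOAL - x) / np.linalg.norm(GOAL - x)  # naive feedback policy
        x = x + DT * (V_REL * heading + flow(x, strength))
        t += DT
    return t

# ensemble of flow realizations standing in for operational forecast members
strengths = rng.normal(0.5, 0.2, 200)
times = np.array([travel_time(s) for s in strengths])
print(f"mean travel time {times.mean():.1f}, "
      f"95th percentile {np.percentile(times, 95):.1f}")
```

A planning approach that accounts for these statistics would then compare candidate routes or policies by such ensemble summaries rather than by a single deterministic travel time.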
NASA Astrophysics Data System (ADS)
Hunka, Frantisek; Matula, Jiri
2017-07-01
A transaction-based approach is utilized in some business process modeling methodologies. Human beings are essential parts of these transactions; the notion of an agent or actor role is usually used for them. The paper uses a particular example to describe the possibilities of the Design Engineering Methodology for Organizations (DEMO) and the Resource-Event-Agent (REA) methodology. Whereas the DEMO methodology can be regarded as a generic methodology having its foundation in the theory of Enterprise Ontology, the REA methodology is regarded as a domain-specific methodology and has its origin in accountancy systems. The result of these approaches is that the DEMO methodology captures everything that happens in reality with good empirical evidence, whereas the REA methodology captures only changes connected with economic events. Economic events represent either a change of the property rights to an economic resource or the consumption or production of economic resources. This results from the essence of economic events and their connection to economic resources.
ERIC Educational Resources Information Center
Blumler, Jay G., Ed.; Katz, Elihu, Ed.
The essays in this volume examine the use of the mass media and explore the findings of the gratifications approach to mass communication research. Part one summarizes the achievements in this area of mass media research and proposes an agenda for discussion of the future direction of this research in terms of a set of theoretical, methodological,…
Urban land use: Remote sensing of ground-basin permeability
NASA Technical Reports Server (NTRS)
Tinney, L. R.; Jensen, J. R.; Estes, J. E.
1975-01-01
A remote sensing analysis of the amount and type of permeable and impermeable surfaces overlying an urban recharge basin is discussed. An effective methodology for accurately generating this data as input to a safe yield study is detailed and compared to more conventional alternative approaches. The amount of area inventoried, approximately 10 sq. miles, should provide a reliable base against which automatic pattern recognition algorithms, currently under investigation for this task, can be evaluated. If successful, such approaches can significantly reduce the time and effort involved in obtaining permeability data, an important aspect of urban hydrology dynamics.
A log-linear model approach to estimation of population size using the line-transect sampling method
Anderson, D.R.; Burnham, K.P.; Crain, B.R.
1978-01-01
The technique of estimating wildlife population size and density using the belt or line-transect sampling method has been used in many past projects, such as the estimation of density of waterfowl nestling sites in marshes, and is being used currently in such areas as the assessment of Pacific porpoise stocks in regions of tuna fishing activity. A mathematical framework for line-transect methodology has only emerged in the last 5 yr. In the present article, we extend this mathematical framework to a line-transect estimator based upon a log-linear model approach.
NASA Technical Reports Server (NTRS)
Pena, Joaquin; Hinchey, Michael G.; Ruiz-Cortes, Antonio
2006-01-01
The field of Software Product Lines (SPL) emphasizes building a core architecture for a family of software products from which concrete products can be derived rapidly. This helps to reduce time-to-market, costs, etc., and can result in improved software quality and safety. Current AOSE methodologies are concerned with developing a single Multiagent System. We propose an initial approach to developing the core architecture of a Multiagent Systems Product Line (MAS-PL), exemplifying our approach with reference to a concept NASA mission based on multiagent technology.
NASA Astrophysics Data System (ADS)
Eimori, Takahisa; Anami, Kenji; Yoshimatsu, Norifumi; Hasebe, Tetsuya; Murakami, Kazuaki
2014-01-01
A comprehensive design optimization methodology using intuitive nondimensional parameters of inversion-level and saturation-level is proposed, especially for ultralow-power, low-voltage, and high-performance analog circuits with mixed strong, moderate, and weak inversion metal-oxide-semiconductor transistor (MOST) operations. This methodology is based on the synthesized charge-based MOST model composed of Enz-Krummenacher-Vittoz (EKV) basic concepts and advanced-compact-model (ACM) physics-based equations. The key concept of this methodology is that all circuit and system characteristics are described as some multivariate functions of inversion-level parameters, where the inversion level is used as an independent variable representative of each MOST. The analog circuit design starts from the first step of inversion-level design using universal characteristics expressed by circuit currents and inversion-level parameters without process-dependent parameters, followed by the second step of foundry-process-dependent design and the last step of verification using saturation-level criteria. This methodology also paves the way to an intuitive and comprehensive design approach for many kinds of analog circuit specifications by optimization using inversion-level log-scale diagrams and saturation-level criteria. In this paper, we introduce an example of our design methodology for a two-stage Miller amplifier.
NASA Technical Reports Server (NTRS)
Fogarty, Jennifer A.; Rando, Cynthia; Baumann, David; Richard, Elizabeth; Davis, Jeffrey
2010-01-01
In an effort to expand routes for open communication and create additional opportunities for public involvement with NASA, Open Innovation Service Provider (OISP) methodologies have been incorporated as a tool in NASA's problem solving strategy. NASA engaged the services of two OISP providers, InnoCentive and Yet2.com, to test this novel approach and its feasibility in solving NASA's space flight challenges. The OISPs were chosen based on multiple factors including: network size and knowledge area span, established process, methodology, experience base, and cost. InnoCentive and Yet2.com each met the desired criteria; however each company's approach to Open Innovation is distinctly different. InnoCentive focuses on posting individual challenges to an established web-based network of approximately 200,000 solvers; viable solutions are sought and granted a financial award if found. Based on a specific technological need, Yet2.com acts as a talent scout providing a broad external network of experts as potential collaborators to NASA. A relationship can be established with these contacts to develop technologies and/or maintained as an established network of future collaborators. The results from the first phase of the pilot study have shown great promise for long term efficacy of utilizing the OISP methodologies. Solution proposals have been received for the challenges posted on InnoCentive and are currently under review for final disposition. In addition, Yet2.com has identified new external partners for NASA and we are in the process of understanding and acting upon these new opportunities. Compared to NASA's traditional routes for external problem solving, the OISP methodologies offered NASA a substantial savings in terms of time and resources invested. In addition, these strategies will help NASA extend beyond its current borders to build an ever expanding network of experts and global solvers.
Life-Cycle Cost/Benefit Assessment of Expedite Departure Path (EDP)
NASA Technical Reports Server (NTRS)
Wang, Jianzhong Jay; Chang, Paul; Datta, Koushik
2005-01-01
This report presents a life-cycle cost/benefit assessment (LCCBA) of Expedite Departure Path (EDP), an air traffic control Decision Support Tool (DST) currently under development at NASA. This assessment is an update of a previous study performed by bd Systems, Inc. (bd) during FY01, with the following revisions: The life-cycle cost assessment methodology developed by bd for the previous study was refined and calibrated using Free Flight Phase 1 (FFP1) cost information for Traffic Management Advisor (TMA, or TMA-SC in the FAA's terminology). Adjustments were also made to the site selection and deployment scheduling methodology to include airspace complexity as a factor. This technique was also applied to the benefit extrapolation methodology to better estimate potential benefits for other years, and at other sites. This study employed a new benefit estimating methodology because bd's previous single-year potential benefit assessment of EDP used unrealistic assumptions that resulted in optimistic estimates. This methodology uses an air traffic simulation approach to reasonably predict the impacts from the implementation of EDP. The results of the costs and benefits analyses were then integrated into a life-cycle cost/benefit assessment.
Agüero, A; Pinedo, P; Simón, I; Cancio, D; Moraleda, M; Trueba, C; Pérez-Sánchez, D
2008-09-15
A methodological approach which includes conceptual developments, methodological aspects and software tools has been developed in the Spanish context, based on the BIOMASS "Reference Biospheres Methodology". The biosphere assessments have to be undertaken with the aim of demonstrating compliance with principles and regulations established to limit the possible radiological impact of radioactive waste disposals on human health and on the environment, and to ensure that future generations will not be exposed to higher radiation levels than those that would be acceptable today. The biosphere in the context of high-level waste disposal is defined as the collection of various radionuclide transfer pathways that may result in releases into the surface environment, transport within and between the biosphere receptors, exposure of humans and biota, and the doses/risks associated with such exposures. The assessments need to take into account the complexity of the biosphere, the nature of the radionuclides released and the long timescales considered. It is also necessary to make assumptions related to the habits and lifestyle of the exposed population, human activities in the long term and possible modifications of the biosphere. A summary of the Spanish methodological approach for biosphere assessment is presented here, as well as its application in a generic Spanish case study. A reference scenario has been developed based on current conditions at a site located in Central-West Spain, to indicate the potential impact to the actual population. In addition, environmental change has been considered qualitatively through the use of interaction matrices and transition diagrams. Unit source terms of (36)Cl, (79)Se, (99)Tc, (129)I, (135)Cs, (226)Ra, (231)Pa, (238)U, (237)Np and (239)Pu have been taken. Two exposure groups of infants and adults have been chosen for dose calculations. Results are presented and their robustness is evaluated through the use of uncertainty and sensitivity analyses.
Theory and Methodology in Researching Emotions in Education
ERIC Educational Resources Information Center
Zembylas, Michalinos
2007-01-01
Differing theoretical approaches to the study of emotions are presented: emotions as private (psychodynamic approaches); emotions as sociocultural phenomena (social constructionist approaches); and a third perspective (interactionist approaches) transcending these two. These approaches have important methodological implications in studying…
Halpern, Scott D.; Randolph, Adrienne G.; Angus, Derek C.
2010-01-01
Objective Randomized clinical trials of novel critical care interventions are currently tested in children only after documenting their safety in adults. Although this practice may protect children from research risks, it may paradoxically threaten children’s well-being by depriving them of evidence to guide their care. We sought to evaluate the ethical, methodologic, and practical arguments for and against studying critical care interventions in adults and children simultaneously rather than sequentially. Data Source Empirical studies and conceptual arguments germane to the objective were reviewed. Data Extraction and Synthesis Children are traditionally viewed as “participants of last resort” due to their vulnerability and decisional incapacity. However, critically ill adults commonly share similar features. Thus, structured risk assessments used by Institutional Review Boards to determine the adequacy of research protections for critically ill adults can also help protect children. From a methodologic perspective, interventions may be tested simultaneously in children and adults by enrolling children as a prespecified subgroup within a larger adult randomized clinical trial or by enrolling children in a separate trial conducted in parallel. Both approaches raise practical and analytical challenges that can frequently be met. For example, investigators might choose outcome measures that are appropriate for both adults and children. Additionally, using Bayesian approaches to link the estimates of treatment effects in children to the values observed in adults may enhance the statistical power to detect pediatric-specific effects. Finally, centralized Institutional Review Boards and data monitoring centers may alleviate practical concerns with conducting trials among adults and children simultaneously. Conclusions The current standard of testing critical care interventions in adults before children rests on tenuous ethical arguments and is entrenched by the methodologic and logistic barriers encountered with alternative approaches. However, these barriers will frequently be surmountable. We therefore propose that the default paradigm be changed such that interventions are examined routinely in critically ill children and adults simultaneously unless unique reasons exist to the contrary. PMID:19602971
Methodological Approaches in MOOC Research: Retracing the Myth of Proteus
ERIC Educational Resources Information Center
Raffaghelli, Juliana Elisa; Cucchiara, Stefania; Persico, Donatella
2015-01-01
This paper explores the methodological approaches most commonly adopted in the scholarly literature on Massive Open Online Courses (MOOCs), published during the period January 2008-May 2014. In order to identify trends, gaps and criticalities related to the methodological approaches of this emerging field of research, we analysed 60 papers…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Seung Jun; Buechler, Cynthia Eileen
The current study aims to predict the steady state power of a generic solution vessel and to develop a corresponding heat transfer coefficient correlation for a Moly99 production facility by conducting a fully coupled multi-physics simulation. A prediction of steady state power for the current application is inherently interconnected between thermal hydraulic characteristics (i.e. multiphase computational fluid dynamics solved by ANSYS-Fluent 17.2) and the corresponding neutronic behavior (i.e. particle transport solved by MCNP6.2) in the solution vessel. Thus, the development of a coupling methodology is vital to understand the system behavior under a variety of system designs and postulated operating scenarios. In this study, we report on the k-effective (keff) calculation for the baseline solution vessel configuration with a selected solution concentration using MCNP K-code modeling. The associated correlations of thermal properties (e.g. density, viscosity, thermal conductivity, specific heat) at the selected solution concentration are developed based on existing experimental measurements in the open literature. The numerical coupling methodology between multiphase CFD and MCNP is successfully demonstrated, and the detailed coupling procedure is documented. In addition, improved coupling methods capturing realistic physics in the solution vessel thermal-neutronic dynamics are proposed and tested further (i.e. dynamic height adjustment, mull-cell approach). As a key outcome of the current study, a multi-physics coupling methodology between MCFD and MCNP is demonstrated and tested for four different operating conditions. Those different operating conditions are determined based on the neutron source strength at a fixed geometry condition. The steady state powers for the generic solution vessel at various operating conditions are reported, and a generalized correlation of the heat transfer coefficient for the current application is discussed. The assessment of the multi-physics methodology and preliminary results from various coupled calculations (power prediction and heat transfer coefficient) can be further utilized for system code validation and generic solution vessel design improvement.
Collins, A.L; Pulley, S.; Foster, I.D.L; Gellis, Allen; Porto, P.; Horowitz, A.J.
2017-01-01
The growing awareness of the environmental significance of fine-grained sediment fluxes through catchment systems continues to underscore the need for reliable information on the principal sources of this material. Source estimates are difficult to obtain using traditional monitoring techniques, but sediment source fingerprinting, or tracing, procedures have emerged as a potentially valuable alternative. Despite the rapidly increasing numbers of studies reporting the use of sediment source fingerprinting, several key challenges and uncertainties continue to hamper consensus among the international scientific community on key components of the existing methodological procedures. Accordingly, this contribution reviews and presents recent developments for several key aspects of fingerprinting, namely: sediment source classification, catchment source and target sediment sampling, tracer selection, grain size issues, tracer conservatism, source apportionment modelling, and assessment of source predictions using artificial mixtures. Finally, a decision-tree representing the current state of knowledge is presented, to guide end-users in applying the fingerprinting approach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liou, M.-L.; Yeh, S.-C.; Yu, Y.-H.
2006-03-15
This paper discusses the current SEA procedures and assessment methodologies, aiming to propose strategies that can lead to effective improvement in a newly industrialized Asian country, Taiwan. Institutional and practical problems with regard to the regulations and tools of SEA in Taiwan are compared to those in other countries. According to the research results, it is suggested that extra evaluation processes should be incorporated into the current assessment procedures to improve their scientific validity and integrity. Moreover, it is also suggested that the sustainability appraisal approaches be included in the SEA framework. In this phase, revised evaluation indicators associated with corresponding targets can be the first attempt for modifying the SEA system. It is believed that these can promote the operability in practice and also lead the whole assessment procedures to a direction closer to sustainable development. The trails that Taiwan has followed can help other countries that are going to adopt SEA to find a more effective and efficient way to follow.
Assessment of xylem phenology: a first attempt to verify its accuracy and precision.
Lupi, C; Rossi, S; Vieira, J; Morin, H; Deslauriers, A
2014-01-01
This manuscript aims to evaluate the precision and accuracy of current methodology for estimating xylem phenology and tracheid production in trees. Through a simple approach, sampling at two positions on the stem of co-dominant black spruce trees in two sites of the boreal forest of Quebec, we were able to quantify variability among sites, between trees and within a tree for different variables. We demonstrated that current methodology is accurate for the estimation of the onset of xylogenesis, while the accuracy for the evaluation of the ending of xylogenesis may be improved by sampling at multiple positions on the stem. The pattern of variability in different phenological variables and cell production allowed us to advance a novel hypothesis on the shift in the importance of various drivers of xylogenesis, from factors mainly varying at the level of site (e.g., climate) at the beginning of the growing season to factors varying at the level of individual trees (e.g., possibly genetic variability) at the end of the growing season.
Electroencephalogram-based methodology for determining unconsciousness during depopulation.
Benson, E R; Alphin, R L; Rankin, M K; Caputo, M P; Johnson, A L
2012-12-01
When an avian influenza or virulent Newcastle disease outbreak occurs within commercial poultry, key steps involved in managing a fast-moving poultry disease can include: education; biosecurity; diagnostics and surveillance; quarantine; elimination of infected poultry through depopulation or culling, disposal, and disinfection; and decreasing host susceptibility. Available mass emergency depopulation procedures include whole-house gassing, partial-house gassing, containerized gassing, and water-based foam. To evaluate potential depopulation methods, it is often necessary to determine the time to the loss of consciousness (LOC) in poultry. Many current approaches to evaluating LOC are qualitative and require visual observation of the birds. This study outlines an electroencephalogram (EEG) frequency domain-based approach for determining the point at which a bird loses consciousness. In this study, commercial broilers were used to develop the methodology, and the methodology was validated with layer hens. In total, 42 data sets from 13 broilers aged 5-10 wk and 12 data sets from four spent hens (age greater than 1 yr) were collected and analyzed. A wireless EEG transmitter was surgically implanted, and each bird was monitored during individual treatment with isoflurane anesthesia. EEG data were evaluated using a frequency-based approach. The alpha/delta (A/D, alpha: 8-12 Hz, delta: 0.5-4 Hz) ratio and loss of posture (LOP) were used to determine the point at which the birds became unconscious. Unconsciousness, regardless of the method of induction, causes suppression in alpha and a rise in the delta frequency component, and this change is used to determine unconsciousness. There was no statistically significant difference between time to unconsciousness as measured by A/D ratio or LOP, and the A/D values were correlated at the times of unconsciousness. The correlation between LOP and A/D ratio indicates that the methodology is appropriate for determining unconsciousness. The A/D ratio approach is suitable for monitoring during anesthesia, during depopulation, and in situations where birds cannot be readily viewed.
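As a rough illustration of the frequency-domain criterion described above, the sketch below estimates alpha (8-12 Hz) and delta (0.5-4 Hz) band power with a Welch periodogram and forms their ratio on synthetic signals. The sampling rate, window length and test signals are assumptions for illustration, not values from the study.

```python
import numpy as np
from scipy.signal import welch

def band_power(f, pxx, lo, hi):
    """Integrate the power spectral density over one frequency band."""
    mask = (f >= lo) & (f <= hi)
    df = f[1] - f[0]
    return pxx[mask].sum() * df

def alpha_delta_ratio(eeg, fs=250.0, win_s=4.0):
    """Alpha (8-12 Hz) to delta (0.5-4 Hz) power ratio of one EEG window."""
    f, pxx = welch(eeg, fs=fs, nperseg=int(fs * win_s))
    return band_power(f, pxx, 8.0, 12.0) / band_power(f, pxx, 0.5, 4.0)

# Synthetic check: an "awake" trace rich in alpha versus an "unconscious" trace
# dominated by delta. Amplitudes and sampling rate are illustrative only.
fs = 250.0
t = np.arange(0.0, 8.0, 1.0 / fs)
awake = np.sin(2 * np.pi * 10.0 * t) + 0.3 * np.sin(2 * np.pi * 2.0 * t)
unconscious = 0.3 * np.sin(2 * np.pi * 10.0 * t) + np.sin(2 * np.pi * 2.0 * t)
print(alpha_delta_ratio(awake, fs), alpha_delta_ratio(unconscious, fs))
```

A sustained drop of this ratio relative to the awake baseline is the kind of quantitative marker the A/D criterion exploits in place of visual observation.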
Monitoring the Impact of Solution Concepts within a Given Problematic
NASA Astrophysics Data System (ADS)
Cavallucci, Denis; Rousselot, François; Zanni, Cecilia
It is acknowledged that one of the most critical issues facing today's organizations concerns the substantial leaps required to methodologically structure innovation. Among other published work, some suggest that a complete rethinking of current practices is required. In this article, we propose a methodology aimed at providing controlled R&D choices based on monitoring the impact that Solution Concepts have on a problematic situation. Initially this problematic situation is modeled in graph form, namely a Problem Graph. The objective is to assist R&D managers in choosing which activities to support and to bring them concrete arguments to defend their choices. We postulate that by improving the robustness of such approaches we help deciders switch from intuitive decisions (mostly built upon their past experiences, fear regarding risks, and awareness of the company's level of acceptance of novelties) to thoroughly constructed inventive problem solving strategies. Our approach is discussed using a computer application that illustrates our hypothesis after being tested in several industrial applications.
A method to identify and analyze biological programs through automated reasoning
Yordanov, Boyan; Dunn, Sara-Jane; Kugler, Hillel; Smith, Austin; Martello, Graziano; Emmott, Stephen
2016-01-01
Predictive biology is elusive because rigorous, data-constrained, mechanistic models of complex biological systems are difficult to derive and validate. Current approaches tend to construct and examine static interaction network models, which are descriptively rich, but often lack explanatory and predictive power, or dynamic models that can be simulated to reproduce known behavior. However, in such approaches implicit assumptions are introduced as typically only one mechanism is considered, and exhaustively investigating all scenarios is impractical using simulation. To address these limitations, we present a methodology based on automated formal reasoning, which permits the synthesis and analysis of the complete set of logical models consistent with experimental observations. We test hypotheses against all candidate models, and remove the need for simulation by characterizing and simultaneously analyzing all mechanistic explanations of observed behavior. Our methodology transforms knowledge of complex biological processes from sets of possible interactions and experimental observations to precise, predictive biological programs governing cell function. PMID:27668090
NASA Astrophysics Data System (ADS)
Diffenbaugh, N. S.
2017-12-01
Severe heat provides one of the most direct, acute, and rapidly changing impacts of climate on people and ecosystems. Theory, historical observations, and climate model simulations all suggest that global warming should increase the probability of hot events that fall outside of our historical experience. Given the acute impacts of extreme heat, quantifying the probability of historically unprecedented hot events at different levels of climate forcing is critical for climate adaptation and mitigation decisions. However, in practice that quantification presents a number of methodological challenges. This presentation will review those methodological challenges, including the limitations of the observational record and of climate model fidelity. The presentation will detail a comprehensive approach to addressing these challenges. It will then demonstrate the application of that approach to quantifying uncertainty in the probability of record-setting hot events in the current climate, as well as periods with lower and higher greenhouse gas concentrations than the present.
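One simple way to frame the quantification problem is sketched below: a generalized extreme value distribution is fitted to a synthetic series of annual maximum temperatures and the exceedance probability of the historical record is estimated, with a bootstrap interval reflecting the sampling uncertainty of a short record. This is an illustrative construction, not the presenter's method, and all numbers are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
annual_max = rng.gumbel(loc=38.0, scale=1.5, size=60)   # synthetic annual maxima, deg C
record = annual_max.max()                               # "historical record"

def exceed_prob(sample, threshold):
    """Probability of exceeding `threshold` under a GEV fitted to annual maxima."""
    shape, loc, scale = stats.genextreme.fit(sample)
    return stats.genextreme.sf(threshold, shape, loc=loc, scale=scale)

boot = [exceed_prob(rng.choice(annual_max, size=annual_max.size, replace=True), record)
        for _ in range(500)]
print(f"P(exceed record) = {exceed_prob(annual_max, record):.3f}, "
      f"bootstrap 95% CI = ({np.percentile(boot, 2.5):.3f}, {np.percentile(boot, 97.5):.3f})")
```

In practice the observational sample would be replaced or augmented by large model ensembles at different forcing levels, which is where the model-fidelity issues mentioned above enter.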
Early Warning Signals of Ecological Transitions: Methods for Spatial Patterns
Brock, William A.; Carpenter, Stephen R.; Ellison, Aaron M.; Livina, Valerie N.; Seekell, David A.; Scheffer, Marten; van Nes, Egbert H.; Dakos, Vasilis
2014-01-01
A number of ecosystems can exhibit abrupt shifts between alternative stable states. Because of their important ecological and economic consequences, recent research has focused on devising early warning signals for anticipating such abrupt ecological transitions. In particular, theoretical studies show that changes in spatial characteristics of the system could provide early warnings of approaching transitions. However, the empirical validation of these indicators lags behind their theoretical development. Here, we summarize a range of currently available spatial early warning signals, suggest potential null models to interpret their trends, and apply them to three simulated spatial data sets of systems undergoing an abrupt transition. In addition to providing a step-by-step methodology for applying these signals to spatial data sets, we propose a statistical toolbox that may be used to help detect approaching transitions in a wide range of spatial data. We hope that our methodology together with the computer codes will stimulate the application and testing of spatial early warning signals on real spatial data. PMID:24658137
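For concreteness, here is a minimal sketch of two spatial indicators of the kind summarized above (spatial variance and Moran's I with a rook neighbourhood), applied to synthetic snapshots. It is not the authors' toolbox; the fields and grid size are invented.

```python
import numpy as np

def spatial_variance(grid):
    """Spatial variance of one snapshot."""
    return float(np.var(grid))

def morans_i(grid):
    """Moran's I of one snapshot with a rook (4-neighbour) weight matrix."""
    z = grid - grid.mean()
    num = (z[:, :-1] * z[:, 1:]).sum() + (z[:-1, :] * z[1:, :]).sum()
    n_pairs = z[:, :-1].size + z[:-1, :].size
    return float((num / n_pairs) / z.var())

# Synthetic snapshots: white noise versus a smoothed (spatially correlated) field,
# the latter mimicking the rise in spatial correlation expected near a transition.
rng = np.random.default_rng(2)
noise = rng.normal(size=(100, 100))
kernel = np.ones((5, 5)) / 25.0
smooth = np.real(np.fft.ifft2(np.fft.fft2(noise) * np.fft.fft2(kernel, s=noise.shape)))
for name, g in [("white noise", noise), ("correlated", smooth)]:
    print(f"{name}: variance = {spatial_variance(g):.3f}, Moran's I = {morans_i(g):.3f}")
```

Tracking such indicators along a sequence of snapshots, against a suitable null model, is the step-by-step use envisaged in the abstract.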
Towards a general object-oriented software development methodology
NASA Technical Reports Server (NTRS)
Seidewitz, ED; Stark, Mike
1986-01-01
An object is an abstract software model of a problem domain entity. Objects are packages of both data and operations on that data (Goldberg 83, Booch 83). The Ada (tm) package construct is representative of this general notion of an object. Object-oriented design is the technique of using objects as the basic unit of modularity in systems design. The Software Engineering Laboratory at the Goddard Space Flight Center is currently involved in a pilot program to develop a flight dynamics simulator in Ada (approximately 40,000 statements) using object-oriented methods. Several authors have applied object-oriented concepts to Ada (e.g., Booch 83, Cherry 85). It was found that these methodologies are limited. As a result, a more general approach was synthesized which allows a designer to apply powerful object-oriented principles to a wide range of applications and at all stages of design. An overview of this approach is provided. Further, how object-oriented design fits into the overall software life-cycle is considered.
Ternik, Robert; Liu, Fang; Bartlett, Jeremy A; Khong, Yuet Mei; Thiam Tan, David Cheng; Dixit, Trupti; Wang, Siri; Galella, Elizabeth A; Gao, Zhihui; Klein, Sandra
2018-02-05
The acceptability of pediatric pharmaceutical products to patients and their caregivers can have a profound impact on the resulting therapeutic outcome. However, existing methodology and approaches used for acceptability assessments for pediatric products is fragmented, making robust and consistent product evaluations difficult. A pediatric formulation development workshop took place in Washington, DC in June 2016 through the University of Maryland's Center of Excellence in Regulatory Science and Innovation (M-CERSI). A session at the workshop was dedicated to acceptability assessments and focused on two major elements that affect the overall acceptability of oral medicines, namely swallowability and palatability. The session started with presentations to provide an overview of literature, background and current state on swallowability and palatability assessments. Five parallel breakout discussions followed the presentations on each element, focusing on three overarching themes, risk-based approaches, methodology and product factors. This article reports the key outcomes of the workshop related to swallowability and palatability assessments. Copyright © 2017 Elsevier B.V. All rights reserved.
Improving scanner wafer alignment performance by target optimization
NASA Astrophysics Data System (ADS)
Leray, Philippe; Jehoul, Christiane; Socha, Robert; Menchtchikov, Boris; Raghunathan, Sudhar; Kent, Eric; Schoonewelle, Hielke; Tinnemans, Patrick; Tuffy, Paul; Belen, Jun; Wise, Rich
2016-03-01
In the process nodes of 10nm and below, the patterning complexity along with the processing and materials required has resulted in a need to optimize alignment targets in order to achieve the required precision, accuracy and throughput performance. Recent industry publications on the metrology target optimization process have shown a move from the expensive and time consuming empirical methodologies, towards a faster computational approach. ASML's Design for Control (D4C) application, which is currently used to optimize YieldStar diffraction based overlay (DBO) metrology targets, has been extended to support the optimization of scanner wafer alignment targets. This allows the necessary process information and design methodology, used for DBO target designs, to be leveraged for the optimization of alignment targets. In this paper, we show how we applied this computational approach to wafer alignment target design. We verify the correlation between predictions and measurements for the key alignment performance metrics and finally show the potential alignment and overlay performance improvements that an optimized alignment target could achieve.
Report on FY17 testing in support of integrated EPP-SMT design methods development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Yanli .; Jetter, Robert I.; Sham, T. -L.
The goal of the proposed integrated Elastic Perfectly-Plastic (EPP) and Simplified Model Test (SMT) methodology is to incorporate an SMT data-based approach for creep-fatigue damage evaluation into the EPP methodology to avoid the separate evaluation of creep and fatigue damage and eliminate the requirement for stress classification in current methods, thus greatly simplifying the evaluation of elevated temperature cyclic service. The purpose of this methodology is to minimize over-conservatism while properly accounting for localized defects and stress risers. To support the implementation of the proposed methodology and to verify the applicability of the code rules, thermomechanical tests continued in FY17. This report presents the recent test results for Type 1 SMT specimens on Alloy 617 with long hold times, pressurization SMT on Alloy 617, and two-bar thermal ratcheting test results on SS316H at the temperature range of 405 °C to 705 °C. Preliminary EPP strain range analyses of the two-bar tests are critically evaluated and compared with the experimental results.
Adaptive Multi-scale Prognostics and Health Management for Smart Manufacturing Systems
Choo, Benjamin Y.; Adams, Stephen C.; Weiss, Brian A.; Marvel, Jeremy A.; Beling, Peter A.
2017-01-01
The Adaptive Multi-scale Prognostics and Health Management (AM-PHM) is a methodology designed to enable PHM in smart manufacturing systems. In application, PHM information is not yet fully utilized in higher-level decision-making in manufacturing systems. AM-PHM leverages and integrates lower-level PHM information such as from a machine or component with hierarchical relationships across the component, machine, work cell, and assembly line levels in a manufacturing system. The AM-PHM methodology enables the creation of actionable prognostic and diagnostic intelligence up and down the manufacturing process hierarchy. Decisions are then made with the knowledge of the current and projected health state of the system at decision points along the nodes of the hierarchical structure. To overcome the issue of exponential explosion of complexity associated with describing a large manufacturing system, the AM-PHM methodology takes a hierarchical Markov Decision Process (MDP) approach into describing the system and solving for an optimized policy. A description of the AM-PHM methodology is followed by a simulated industry-inspired example to demonstrate the effectiveness of AM-PHM. PMID:28736651
Security Events and Vulnerability Data for Cybersecurity Risk Estimation.
Allodi, Luca; Massacci, Fabio
2017-08-01
Current industry standards for estimating cybersecurity risk are based on qualitative risk matrices as opposed to quantitative risk estimates. In contrast, risk assessment in most other industry sectors aims at deriving quantitative risk estimations (e.g., Basel II in Finance). This article presents a model and methodology to leverage the large amount of data available from the IT infrastructure of an organization's security operation center to quantitatively estimate the probability of attack. Our methodology specifically addresses untargeted attacks delivered by automatic tools that make up the vast majority of attacks in the wild against users and organizations. We consider two-stage attacks whereby the attacker first breaches an Internet-facing system, and then escalates the attack to internal systems by exploiting local vulnerabilities in the target. Our methodology factors in the power of the attacker as the number of "weaponized" vulnerabilities he/she can exploit, and can be adjusted to match the risk appetite of the organization. We illustrate our methodology by using data from a large financial institution, and discuss the significant mismatch between traditional qualitative risk assessments and our quantitative approach. © 2017 Society for Risk Analysis.
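A toy version of the two-stage model might look like the sketch below, where the attacker's power is a set of weaponized vulnerability identifiers and a compromise requires a match at both the perimeter and the internal target. All counts and the uniform sampling of vulnerabilities are invented assumptions, not the article's calibrated model.

```python
import numpy as np

rng = np.random.default_rng(3)
n_cve = 10_000                                   # size of the vulnerability universe (invented)
weaponized = set(rng.choice(n_cve, size=300, replace=False).tolist())  # attacker power

def host_vulns(n):
    """Random set of vulnerability IDs present on a host (illustrative only)."""
    return set(rng.choice(n_cve, size=n, replace=False).tolist())

def attack_succeeds(perimeter, internal):
    """Two-stage attack: breach the Internet-facing host, then escalate internally."""
    return bool(weaponized & perimeter) and bool(weaponized & internal)

trials = 20_000
hits = sum(attack_succeeds(host_vulns(40), host_vulns(25)) for _ in range(trials))
print(f"estimated P(two-stage compromise) = {hits / trials:.3f}")
```

In the real methodology the vulnerability sets would come from the security operation center's inventory and the attacker power would be tuned to the organization's risk appetite.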
DOE Office of Scientific and Technical Information (OSTI.GOV)
Castillo, H.
1982-01-01
The Government of Costa Rica has stated the need for a formal procedure for the evaluation and categorization of an environmental program. Methodological studies were prepared as the basis for the development of the general methodology by which each government or institution can adapt and implement the procedure. The methodology was established by using different techniques according to their contribution to the evaluation process, such as: Systemic Approach, Delphi, and Saaty Methods. The methodology consists of two main parts: 1) evaluation of the environmental aspects by using different techniques; 2) categorization of the environmental aspects by applying the methodology to the Costa Rican Environmental affairs using questionnaire answers supplied by experts both inside and outside of the country. The second part of the research includes Appendixes in which is presented general information concerning institutions related to environmental affairs; description of the methods used; results of the current status evaluation and its scale; the final scale of categorization; and the questionnaires and a list of experts. The methodology developed in this research will have a beneficial impact on environmental concerns in Costa Rica. As a result of this research, a Commission Office of Environmental Affairs, providing links between consumers, engineers, scientists, and the Government, is recommended. Also there is significant potential use of this methodology in developed countries for a better balancing of the budgets of major research programs such as cancer, heart, and other research areas.
Usage-Centered Design Approach in Design of Malaysia Sexuality Education (MSE) Courseware
NASA Astrophysics Data System (ADS)
Chan, S. L.; Jaafar, A.
The problems amongst juveniles have increased every year, especially rape cases involving minors. Therefore, the government of Malaysia introduced the National Sexuality Education Guideline in 2005. An early study on the perceptions of teachers and students toward the sexuality education curriculum currently taught in secondary schools was carried out in 2008. The study showed that there are large gaps between the perceptions of teachers and students on several issues in Malaysian sexuality education today. The Malaysia Sexuality Education (MSE) courseware was designed based on several learning theory approaches. MSE was then developed through a comprehensive methodology in which the ADDIE model was integrated with Usage-Centered Design to achieve a highly usable courseware. In conclusion, it is hoped that the effort of developing MSE will contribute to addressing the current problems in Malaysian sexuality education.
González-García, I; García-Arieta, A; Merino-Sanjuan, M; Mangas-Sanjuan, V; Bermejo, M
2018-07-01
Regulatory guidelines recommend that, when a level A IVIVC is established, dissolution specifications should be established using averaged data, and the maximum difference in AUC and Cmax between the reference and test formulations cannot be greater than 20%. However, averaging data entails a loss of information and may bias the results. The objective of the current work is to present a new approach to establish dissolution specifications using a new methodology (individual approach) instead of average data (classical approach). Different scenarios were established based on the relationship between the in vitro and in vivo dissolution rate coefficients using a level A IVIVC of a controlled release formulation. Then, in order to compare this new approach with the classical one, six additional batches were simulated. For each batch, 1000 simulations of a dissolution assay were run. Cmax ratios between the reference formulation and each batch were calculated, showing that the individual approach was more sensitive and able to detect differences between the reference and the batch formulation compared to the classical approach. Additionally, the new methodology yields wider dissolution specification limits than the classical approach while ensuring that no tablet from the new batch would generate in vivo profiles whose AUC or Cmax ratio falls outside the 0.8-1.25 range, taking into account the in vitro and in vivo variability of the new batches developed. Copyright © 2018 Elsevier B.V. All rights reserved.
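The individual approach can be illustrated with the hedged Monte Carlo sketch below: per-tablet in vitro dissolution rate constants are sampled for a simulated batch, mapped to in vivo absorption rates through an assumed linear level A relation, and the resulting individual Cmax values (from a one-compartment model with first-order absorption) are compared with the reference mean. The model, parameter values and acceptance check are illustrative assumptions, not the published batch simulations.

```python
import numpy as np

rng = np.random.default_rng(4)

def cmax_one_compartment(ka, ke=0.1, f_dose_over_v=10.0):
    """Cmax of a one-compartment model with first-order absorption (requires ka > ke)."""
    tmax = np.log(ka / ke) / (ka - ke)
    return f_dose_over_v * ka / (ka - ke) * (np.exp(-ke * tmax) - np.exp(-ka * tmax))

def batch_cmax(kd_mean, kd_cv, n_tablets=1000, ivivc_slope=1.0):
    """Individual Cmax values of one simulated batch via an assumed linear IVIVC."""
    kd = rng.normal(kd_mean, kd_cv * kd_mean, size=n_tablets)  # in vitro rate constants
    ka = ivivc_slope * kd                                      # assumed level A relation
    return cmax_one_compartment(ka)

ref = batch_cmax(kd_mean=0.50, kd_cv=0.05)                     # reference batch
test = batch_cmax(kd_mean=0.45, kd_cv=0.10)                    # slower, more variable batch
ratios = test / ref.mean()
inside = np.mean((ratios > 0.8) & (ratios < 1.25))
print(f"fraction of individual Cmax ratios within 0.80-1.25: {inside:.3f}")
```

Looking at the full distribution of per-tablet ratios, rather than a single batch-average ratio, is what makes the individual approach more sensitive to within-batch variability.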
Sandbox University: estimating influence of institutional action.
Forsman, Jonas; Mann, Richard P; Linder, Cedric; van den Bogaard, Maartje
2014-01-01
The approach presented in this article represents a generalizable and adaptable methodology for identifying complex interactions in educational systems and for investigating how manipulation of these systems may affect educational outcomes of interest. Multilayer Minimum Spanning Tree and Monte-Carlo methods are used. A virtual Sandbox University is created in order to facilitate effective identification of successful and stable initiatives within higher education, which can affect students' credits and student retention - something that has been lacking up until now. The results highlight the importance of teacher feedback and teacher-student rapport, which is congruent with current educational findings, illustrating the methodology's potential to provide a new basis for further empirical studies of issues in higher education from a complex systems perspective.
[The Philosophical Relevance of the Study of Schizophrenia. Methodological and Conceptual Issues].
López-Silva, Pablo
2014-01-01
The study of mental illness involves profound methodological and philosophical debates. This article explores disciplinary complementarity, particularly between philosophy of mind, phenomenology, and empirical studies in psychiatry and psychopathology, in the context of the understanding of schizophrenia. After clarifying the possible role of these disciplines, the article explores the way in which a certain symptom of schizophrenia (thought insertion) challenges the current phenomenological approach to the relationship between consciousness and self-awareness. Finally, it is concluded that philosophy of mind, phenomenology, and empirical studies in psychiatry and psychopathology should, necessarily, regulate their progress jointly in order to reach plausible conclusions about what we call 'schizophrenia'. Crown Copyright © 2014. Published by Elsevier España. All rights reserved.
Current trends in protein crystallization.
Gavira, José A
2016-07-15
Proteins are among the most complex colloidal systems in terms of their physicochemical properties, size and conformational flexibility. This complexity contributes to their great sensitivity to any external change and dictates the uncertainty of crystallization. The need for 3D models to understand their functionality and interaction mechanisms with other neighbouring (macro)molecules has driven the tremendous effort put into the field of crystallography, which has also permeated other fields trying to shed some light on reluctant-to-crystallize proteins. This review surveys protein crystallization from a regular-laboratory point of view. It is also devoted to highlighting the latest developments and achievements to produce, identify and deliver high-quality protein crystals for XFEL, Micro-ED or neutron diffraction. The low likelihood of protein crystallization is rationalized by considering the intrinsic polypeptide nature (folded state, surface charge, etc.), followed by a description of the standard crystallization methods (batch, vapour diffusion and counter-diffusion), including high throughput advances. Other methodologies aimed at determining protein features in solution (NMR, SAS, DLS) or at gathering structural information from single particles such as Cryo-EM are also discussed. Finally, current approaches showing the convergence of different structural biology techniques and the cross-methodology adaptation to tackle the most difficult problems are presented. Current advances in biomacromolecule crystallization, from nano crystals for XFEL and Micro-ED to large crystals for neutron diffraction, are covered with special emphasis on methodologies applicable at laboratory scale. Copyright © 2015 Elsevier Inc. All rights reserved.
Fleischhacker, Sheila; Roberts, Erica; Camplain, Ricky; Evenson, Kelly R; Gittelsohn, Joel
2016-12-01
Promoting physical activity using environmental, policy, and systems approaches could potentially address persistent health disparities faced by American Indian and Alaska Native children and adolescents. To address research gaps and help inform tribally led community changes that promote physical activity, this review examined the methodology and current evidence of physical activity interventions and community-wide initiatives among Native youth. A keyword-guided search was conducted in multiple databases to identify peer-reviewed research articles that reported on physical activity among Native youth. Ultimately, 20 unique interventions (described in 76 articles) and 13 unique community-wide initiatives (described in 16 articles) met the study criteria. Four interventions noted positive changes in knowledge and attitude relating to physical activity but none of the interventions examined reported statistically significant improvements on weight-related outcomes. Only six interventions reported implementing environmental, policy, and system approaches relating to promoting physical activity and generally only shared anecdotal information about the approaches tried. Using community-based participatory research or tribally driven research models strengthened the tribal-research partnerships and improved the cultural and contextual sensitivity of the intervention or community-wide initiative. Few interventions or community-wide initiatives examined multi-level, multi-sector interventions to promote physical activity among Native youth, families, and communities. More research is needed to measure and monitor physical activity within this understudied, high risk group. Future research could also focus on the unique authority and opportunity of tribal leaders and other key stakeholders to use environmental, policy, and systems approaches to raise a healthier generation of Native youth.
Dos Santos Vasconcelos, Crhisllane Rafaele; de Lima Campos, Túlio; Rezende, Antonio Mauro
2018-03-06
Systematic analysis of a parasite interactome is a key approach to understanding different biological processes. It makes it possible to elucidate disease mechanisms, to predict protein functions and to select promising targets for drug development. Currently, several approaches for protein interaction prediction for non-model species incorporate only small fractions of the entire proteomes and their interactions. Based on this perspective, this study presents an integration of computational methodologies, protein network predictions and comparative analyses of the protozoan species Leishmania braziliensis and Leishmania infantum. These parasites cause Leishmaniasis, a neglected disease with worldwide distribution and limited treatment options using currently available drugs. The predicted interactions were obtained from a meta-approach, applying rigid body docking tests and template-based docking on protein structures predicted by different comparative modeling techniques. In addition, we trained a machine-learning algorithm (Gradient Boosting) using docking information computed on a curated set of positive and negative protein interaction data. Our final model obtained an AUC = 0.88, with recall = 0.69, specificity = 0.88 and precision = 0.83. Using this approach, it was possible to confidently predict 681 protein structures and 6198 protein interactions for L. braziliensis, and 708 protein structures and 7391 protein interactions for L. infantum. The predicted networks were integrated with protein interaction data already available, analyzed using several topological features and used to classify proteins as essential for network stability. The present study demonstrates the importance of integrating different interaction prediction methodologies to increase the coverage of the predicted protein interactions for the studied species, and it makes available protein structures and interactions not previously reported.
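The supervised learning step can be sketched on synthetic data as below: docking-derived scores stand in for the real features, and a Gradient Boosting classifier is evaluated with AUC, precision and recall via cross-validation. The feature set, labels and hyperparameters are invented for illustration and are not the study's curated data.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import roc_auc_score, precision_score, recall_score

rng = np.random.default_rng(5)
n = 2000
X = np.column_stack([
    rng.normal(size=n),   # e.g. a rigid-body docking score (invented feature)
    rng.normal(size=n),   # e.g. a template-based docking score (invented feature)
    rng.normal(size=n),   # e.g. a model-quality proxy (invented feature)
])
# Synthetic labels: interaction probability rises with the first two features.
p = 1.0 / (1.0 + np.exp(-(1.5 * X[:, 0] + 1.0 * X[:, 1] - 0.2)))
y = rng.binomial(1, p)

clf = GradientBoostingClassifier(n_estimators=300, learning_rate=0.05, max_depth=3)
proba = cross_val_predict(clf, X, y, cv=5, method="predict_proba")[:, 1]
pred = (proba >= 0.5).astype(int)
print(f"AUC = {roc_auc_score(y, proba):.2f}, "
      f"precision = {precision_score(y, pred):.2f}, recall = {recall_score(y, pred):.2f}")
```

The same evaluation scaffold applies when the features come from the actual rigid-body and template-based docking pipelines described in the abstract.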
Transcranial direct current stimulation for motor recovery of upper limb function after stroke.
Lüdemann-Podubecká, Jitka; Bösl, Kathrin; Rothhardt, Sandra; Verheyden, Geert; Nowak, Dennis Alexander
2014-11-01
Changes in neural processing after stroke have been postulated to impede recovery from stroke. Transcranial direct current stimulation has the potential to alter cortico-spinal excitability and thereby might be beneficial in stroke recovery. We review the pertinent literature prior to 30/09/2013 on transcranial direct current stimulation in promoting motor recovery of the affected upper limb after stroke. We identified 23 trials in total (including 523 participants). All stimulation protocols are based on the interhemispheric imbalance model. In a comparative approach, the methodology and effectiveness of (a) facilitation of the affected hemisphere, (b) inhibition of the unaffected hemisphere and (c) combined application of transcranial direct current stimulation over the affected and unaffected hemispheres to treat impaired hand function after stroke are presented. Transcranial direct current stimulation is associated with improvement of the affected upper limb after stroke, but current evidence does not support its routine use. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Padula, Santo, II
2009-01-01
The ability to sufficiently measure orbiter window defects to allow for window recertification has been an ongoing challenge for the orbiter vehicle program. The recent Columbia accident has forced even tighter constraints on the criteria that must be met in order to recertify windows for flight. As a result, new techniques are being investigated to improve the reliability, accuracy and resolution of the defect detection process. The methodology devised in this work, which is based on the utilization of a vertical scanning interferometric (VSI) tool, shows great promise for meeting the ever increasing requirements for defect detection. This methodology has the potential of a 10-100 fold greater resolution of the true defect depth than can be obtained from the currently employed micrometer based methodology. An added benefit is that it also produces a digital elevation map of the defect, thereby providing information about the defect morphology which can be utilized to ascertain the type of debris that induced the damage. However, in order to successfully implement such a tool, a greater understanding of the resolution capability and measurement repeatability must be obtained. This work focused on assessing the variability of the VSI-based measurement methodology and revealed that the VSI measurement tool was more repeatable and more precise than the current micrometer based approach, even in situations where operator variation could affect the measurement. The analysis also showed that the VSI technique was relatively insensitive to the hardware and software settings employed, making the technique extremely robust and desirable.
Affordable proteomics: the two-hybrid systems.
Gillespie, Marc
2003-06-01
Numerous proteomic methodologies exist, but most require a heavy investment in expertise and technology. This puts these approaches out of reach for many laboratories and small companies, rarely allowing proteomics to be used as a pilot approach for biomarker or target identification. Two proteomic approaches, 2D gel electrophoresis and the two-hybrid systems, are currently available to most researchers. The two-hybrid systems, though accommodating to large-scale experiments, were originally designed as practical screens that, by comparison to current proteomics tools, were small-scale, affordable and technically feasible. The screens rapidly generated data, identifying protein interactions that were previously uncharacterized. The foundation for a two-hybrid proteomic investigation can be purchased as separate kits from a number of companies. The true power of the technique lies not in its affordability, but rather in its portability. The two-hybrid system puts proteomics back into laboratories where the output of the screens can be evaluated by researchers with experience in the particular fields of basic research, cancer biology, toxicology or drug development.
Raghavan, Ramesh
2014-01-01
Federal policymaking in the last decade has dramatically expanded performance measurement within child welfare systems, and states are currently being fiscally penalized for poor performance on defined outcomes. However, in contrast to performance measurement in health settings, current policy holds child welfare systems solely responsible for meeting outcomes, largely without taking into account the effects of factors at the level of the child, and his or her social ecology, that might undermine the performance of child welfare agencies. Appropriate measurement of performance is predicated upon the ability to disentangle individual, as opposed to organizational, determinants of outcomes, which is the goal of risk adjustment methodologies. This review briefly conceptualizes and examines risk adjustment approaches in health and child welfare, suggests approaches to expanding its use to appropriately measure the performance of child welfare agencies, and highlights research gaps that diminish the appropriate use of risk adjustment approaches – and which consequently suggest the need for caution – in policymaking around performance measurement of child welfare agencies. PMID:25253917
Michalek, Lukas; Barner, Leonie; Barner-Kowollik, Christopher
2018-03-07
Well-defined polymer strands covalently tethered onto solid substrates determine the properties of the resulting functional interface. Herein, the current approaches to determine quantitative grafting densities are assessed. Based on a brief introduction to the key theories describing polymer brush regimes, a user's guide is provided for estimating maximum chain coverage and, importantly, for examining the most frequently employed approaches for determining grafting densities, i.e., dry thickness measurements, gravimetric assessment, and swelling experiments. An estimation of the reliability of these determination methods is provided via carefully evaluating their assumptions and assessing the stability of the underpinning equations. A practical access guide for comparatively and quantitatively evaluating the reliability of a given approach is thus provided, enabling the field to critically judge experimentally determined grafting densities and to avoid the reporting of grafting densities that fall outside the physically realistic parameter space. The assessment is concluded with a perspective on the development of advanced approaches for determination of grafting density, in particular, on single-chain methodologies. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Application of Bayesian and cost benefit risk analysis in water resources management
NASA Astrophysics Data System (ADS)
Varouchakis, E. A.; Palogos, I.; Karatzas, G. P.
2016-03-01
Decision making is a significant tool in water resources management applications. This technical note approaches a decision dilemma that has not yet been considered for the water resources management of a watershed. A common cost-benefit analysis approach, which is novel in the risk analysis of hydrologic/hydraulic applications, and a Bayesian decision analysis are applied to aid the decision on whether or not to construct a water reservoir for irrigation purposes. The alternative option examined applies a scaled parabolic fine that varies with the number of over-pumping violations, in contrast to common practices that usually consider short-term fines. The methodological steps are presented analytically, together with originally developed code. Such an application, in this level of detail, is new. The results indicate that probability uncertainty is the driving issue that determines the optimal decision with each methodology, and depending on how the unknown probability is handled, each methodology may lead to a different optimal decision. Thus, the proposed tool can help decision makers to examine and compare different scenarios using two different approaches before making a decision, considering the cost of a hydrologic/hydraulic project and the varied economic charges that water table limit violations can cause inside an audit interval. In contrast to practices that assess the effect of each proposed action separately, considering only current knowledge of the examined issue, this tool aids decision making by considering prior information and the sampling distribution of future successful audits.
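To make the build-or-fine dilemma concrete, the following is a minimal, purely illustrative sketch of a Bayesian decision analysis of this kind. All numbers (costs, the parabolic fine schedule, the prior, and the audit counts) are hypothetical assumptions, not values or code from the study.

```python
# Illustrative sketch only: stylized Bayesian decision analysis of
# "build a reservoir" vs. "rely on over-pumping fines". All inputs are assumed.
import numpy as np

reservoir_cost = 2.0e6                              # capital cost of the reservoir (assumed)

def fine_cost(violations):
    # scaled parabolic fine: grows quadratically with over-pumping violations
    return 5.0e4 * violations ** 2

violation_scenarios = np.array([0, 2, 5, 10])       # possible yearly violation counts
prior = np.array([0.2, 0.3, 0.3, 0.2])              # prior belief over scenarios

# Bayesian update from an audit: k violation-free years observed out of n audited years.
p_clean_year = np.array([0.95, 0.7, 0.4, 0.1])      # P(clean year | scenario), assumed
n_audited, k_clean = 10, 6
likelihood = (p_clean_year ** k_clean) * ((1 - p_clean_year) ** (n_audited - k_clean))
posterior = prior * likelihood
posterior /= posterior.sum()

expected_cost_no_reservoir = np.sum(posterior * fine_cost(violation_scenarios))
expected_cost_reservoir = reservoir_cost            # assume no fines once the reservoir exists

print("posterior over violation scenarios:", np.round(posterior, 3))
print("expected cost without reservoir: %.2e" % expected_cost_no_reservoir)
print("expected cost with reservoir:    %.2e" % expected_cost_reservoir)
print("optimal decision:",
      "build" if expected_cost_reservoir < expected_cost_no_reservoir else "do not build")
```

The point of the sketch is only the structure of the comparison: prior information and audit outcomes update the probability of violations, and the decision follows from expected costs under each option.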
Stochastic approach for radionuclides quantification
NASA Astrophysics Data System (ADS)
Clement, A.; Saurel, N.; Perrin, G.
2018-01-01
Gamma spectrometry is a passive non-destructive assay used to quantify radionuclides present in more or less complex objects. Basic methods that use empirical calibration with a standard to quantify the activity of nuclear materials by determining a calibration coefficient are useless on non-reproducible, complex and unique nuclear objects such as waste packages. Package specifications such as composition or geometry change from one package to another and lead to a high variability of objects. The current quantification process uses numerical modelling of the measured scene with the few available data, such as geometry or composition. These data are density, material, screen, geometric shape, matrix composition, and matrix and source distribution. Some of them depend strongly on knowledge of the package data and on operator background. The French Commissariat à l'Energie Atomique (CEA) is developing a new methodology to quantify nuclear materials in waste packages and waste drums without operator adjustment or knowledge of the internal package configuration. This method combines a global stochastic approach, which uses, among others, surrogate models able to simulate the gamma attenuation behaviour, a Bayesian approach, which considers conditional probability densities of the problem inputs, and Markov Chain Monte Carlo (MCMC) algorithms, which solve the inverse problem, using the gamma-ray emission spectrum of the radionuclides and the outside dimensions of the objects of interest. The methodology is being tested by quantifying actinide activity in standards of different matrix, composition, and source configuration, in terms of actinide masses, locations and distributions. Activity uncertainties are taken into account by this adjustment methodology.
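The inversion idea can be illustrated with a toy Metropolis-Hastings chain. This is not the CEA code: the forward "surrogate" model, the detector efficiency, the attenuation factor, the prior bounds and the measured count rate below are all invented for illustration.

```python
# Minimal sketch: Metropolis-Hastings inference of a source activity from a
# measured gamma count rate through a toy attenuation surrogate model.
import numpy as np

rng = np.random.default_rng(0)

def surrogate_count_rate(activity_bq, attenuation=0.05, efficiency=1e-3):
    # toy forward model: expected counts/s at the detector for a given activity
    return activity_bq * efficiency * np.exp(-attenuation * 10.0)   # 10 cm of matrix (assumed)

measured_rate, sigma = 12.0, 1.0          # counts/s and measurement uncertainty (assumed)

def log_posterior(activity_bq):
    if not (0.0 < activity_bq < 1e6):     # flat prior on a physically plausible range
        return -np.inf
    resid = measured_rate - surrogate_count_rate(activity_bq)
    return -0.5 * (resid / sigma) ** 2    # Gaussian likelihood

samples, current = [], 1e4
current_lp = log_posterior(current)
for _ in range(20000):
    proposal = current + rng.normal(scale=500.0)
    proposal_lp = log_posterior(proposal)
    if np.log(rng.random()) < proposal_lp - current_lp:
        current, current_lp = proposal, proposal_lp
    samples.append(current)

burned = np.array(samples[5000:])         # discard burn-in
print("posterior activity: %.3g Bq +/- %.2g Bq" % (burned.mean(), burned.std()))
```

The real methodology works with full emission spectra and multidimensional inputs (matrix, source distribution, geometry); the sketch only shows how an MCMC chain turns a forward surrogate model plus a prior into an activity estimate with an uncertainty.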
Towards a theory of individual differences in statistical learning
Bogaerts, Louisa; Christiansen, Morten H.; Frost, Ram
2017-01-01
In recent years, statistical learning (SL) research has seen a growing interest in tracking individual performance in SL tasks, mainly as a predictor of linguistic abilities. We review studies from this line of research and outline three presuppositions underlying the experimental approach they employ: (i) that SL is a unified theoretical construct; (ii) that current SL tasks are interchangeable, and equally valid for assessing SL ability; and (iii) that performance in the standard forced-choice test in the task is a good proxy of SL ability. We argue that these three critical presuppositions are subject to a number of theoretical and empirical issues. First, SL shows patterns of modality- and informational-specificity, suggesting that SL cannot be treated as a unified construct. Second, different SL tasks may tap into separate sub-components of SL that are not necessarily interchangeable. Third, the commonly used forced-choice tests in most SL tasks are subject to inherent limitations and confounds. As a first step, we offer a methodological approach that explicitly spells out a potential set of different SL dimensions, allowing for better transparency in choosing a specific SL task as a predictor of a given linguistic outcome. We then offer possible methodological solutions for better tracking and measuring SL ability. Taken together, these discussions provide a novel theoretical and methodological approach for assessing individual differences in SL, with clear testable predictions. This article is part of the themed issue ‘New frontiers for statistical learning in the cognitive sciences’. PMID:27872377
Cross-Sectional And Longitudinal Uncertainty Propagation In Drinking Water Risk Assessment
NASA Astrophysics Data System (ADS)
Tesfamichael, A. A.; Jagath, K. J.
2004-12-01
Pesticide residues in drinking water can vary significantly from day to day. However, drinking water quality monitoring performed under the Safe Drinking Water Act (SDWA) at most community water systems (CWSs) is typically limited to four data points per year over a few years. Due to limited sampling, likely maximum residues may be underestimated in risk assessment. In this work, a statistical methodology is proposed to study the cross-sectional and longitudinal uncertainties in observed samples and their propagated effect on risk estimates. The methodology will be demonstrated using data from 16 CWSs across the US that have three independent databases of atrazine residue to estimate the uncertainty of risk in infants and children. The results showed that in 85% of the CWSs, chronic risks predicted with the proposed approach may be two- to four-fold higher than those predicted with the current approach, while intermediate risks may be two- to three-fold higher in 50% of the CWSs. In 12% of the CWSs, however, the proposed methodology showed a lower intermediate risk. A closed-form solution of propagated uncertainty will be developed to calculate the number of years (seasons) of water quality data and sampling frequency needed to reduce the uncertainty in risk estimates. In general, this methodology provided good insight into the importance of addressing uncertainty of observed water quality data and the need to predict likely maximum residues in risk assessment by considering propagation of uncertainties.
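The core sampling problem (four measurements per year against a strongly seasonal daily series) can be illustrated with a small simulation. The synthetic daily residue series and all parameters below are made up; the study itself uses CWS monitoring databases, not simulated data, and a different statistical treatment.

```python
# Hedged illustration: how badly can 4 samples/year underestimate the annual maximum?
import numpy as np

rng = np.random.default_rng(1)
days = np.arange(365)
# hypothetical daily residue (ug/L): spring pulse plus lognormal noise
daily_residue = 0.5 + 3.0 * np.exp(-((days - 150) / 20.0) ** 2) + rng.lognormal(-1.5, 0.5, 365)

true_annual_max = daily_residue.max()

n_boot, maxima = 5000, []
for _ in range(n_boot):
    quarterly_sample = daily_residue[rng.choice(365, size=4, replace=False)]
    maxima.append(quarterly_sample.max())
maxima = np.array(maxima)

print("true annual maximum:         %.2f" % true_annual_max)
print("median of 4-sample maxima:   %.2f" % np.median(maxima))
print("fraction of resamples missing half the true max: %.2f"
      % np.mean(maxima < 0.5 * true_annual_max))
```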
Automating the packing heuristic design process with genetic programming.
Burke, Edmund K; Hyde, Matthew R; Kendall, Graham; Woodward, John
2012-01-01
The literature shows that one-, two-, and three-dimensional bin packing and knapsack packing are difficult problems in operational research. Many techniques, including exact, heuristic, and metaheuristic approaches, have been investigated to solve these problems and it is often not clear which method to use when presented with a new instance. This paper presents an approach which is motivated by the goal of building computer systems which can design heuristic methods. The overall aim is to explore the possibilities for automating the heuristic design process. We present a genetic programming system to automatically generate a good quality heuristic for each instance. It is not necessary to change the methodology depending on the problem type (one-, two-, or three-dimensional knapsack and bin packing problems), and it therefore has a level of generality unmatched by other systems in the literature. We carry out an extensive suite of experiments and compare with the best human designed heuristics in the literature. Note that our heuristic design methodology uses the same parameters for all the experiments. The contribution of this paper is to present a more general packing methodology than those currently available, and to show that, by using this methodology, it is possible for a computer system to design heuristics which are competitive with the human designed heuristics from the literature. This represents the first packing algorithm in the literature able to claim human competitive results in such a wide variety of packing domains.
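A very small sketch in the spirit of this hyper-heuristic idea follows: evolve an arithmetic scoring expression that decides which open bin receives the next item (highest score wins; a new bin is opened if nothing fits). This toy GP, with a tiny primitive set and mutation only, illustrates the approach but is not the authors' system, primitive set, or parameter settings.

```python
# Toy genetic programming over bin-selection heuristics for 1D bin packing.
import random, operator

random.seed(0)
PRIMS = [(operator.add, 2), (operator.sub, 2), (operator.mul, 2)]
TERMS = ["free", "size", 1.0]                      # bin free space, item size, constant

def random_tree(depth=3):
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMS)
    fn, arity = random.choice(PRIMS)
    return (fn, [random_tree(depth - 1) for _ in range(arity)])

def evaluate(tree, free, size):
    if tree == "free":
        return free
    if tree == "size":
        return size
    if isinstance(tree, float):
        return tree
    fn, children = tree
    return fn(*(evaluate(c, free, size) for c in children))

def pack(items, heuristic, capacity=1.0):
    bins = []
    for item in items:
        feasible = [b for b in bins if capacity - sum(b) >= item]
        if feasible:
            best = max(feasible, key=lambda b: evaluate(heuristic, capacity - sum(b), item))
            best.append(item)
        else:
            bins.append([item])
    return len(bins)                                # fitness: fewer bins is better

def mutate(tree, depth=3):
    if random.random() < 0.2:
        return random_tree(depth)
    if isinstance(tree, tuple):
        fn, children = tree
        return (fn, [mutate(c, depth - 1) for c in children])
    return tree

items = [random.uniform(0.1, 0.7) for _ in range(200)]
population = [random_tree() for _ in range(30)]
for generation in range(20):
    population.sort(key=lambda h: pack(items, h))   # elitist, mutation-only evolution
    elite = population[:10]
    population = elite + [mutate(random.choice(elite)) for _ in range(20)]

population.sort(key=lambda h: pack(items, h))
best_fit = (operator.sub, [1.0, "free"])            # baseline: pick the fullest feasible bin
print("bins used by evolved heuristic:", pack(items, population[0]))
print("bins used by best-fit baseline:", pack(items, best_fit))
```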
Surface chemistry at Swiss Universities of Applied Sciences.
Brodard, Pierre; Pfeifer, Marc E; Adlhart, Christian D; Pieles, Uwe; Shahgaldian, Patrick
2014-01-01
In the Swiss Universities of Applied Sciences, a number of research groups are involved in surface science, with different methodological approaches and a broad range of sophisticated characterization techniques. A snapshot of the current research going on in different groups from the University of Applied Sciences and Arts Western Switzerland (HES-SO), the Zurich University of Applied Sciences (ZHAW) and the University of Applied Sciences and Arts Northwestern Switzerland (FHNW) is given.
A comparison of fatigue life prediction methodologies for rotorcraft
NASA Technical Reports Server (NTRS)
Everett, R. A., Jr.
1990-01-01
Because of the current U.S. Army requirement that all new rotorcraft be designed to a 'six nines' reliability on fatigue life, this study was undertaken to assess the accuracy of the current safe life philosophy using the nominal stress Palmgren-Miner linear cumulative damage rule to predict the fatigue life of rotorcraft dynamic components. It has been shown that this methodology can predict fatigue lives that differ from test lives by more than two orders of magnitude. A further objective of this work was to compare the accuracy of this methodology to another safe life method called the local strain approach as well as to a method which predicts fatigue life based solely on crack growth data. Spectrum fatigue tests were run on notched (k(sub t) = 3.2) specimens made of 4340 steel using the Felix/28 loading spectrum. The local strain approach predicted the lives of these tests fairly well, being slightly on the unconservative side of the test data. The crack growth method, which is based on 'small crack' crack growth data and a crack-closure model, also predicted the fatigue lives very well, with the predicted lives being slightly longer than the mean test lives but within the experimental scatter band. The crack growth model was also able to predict the change in test lives produced by the rainflow reconstructed spectra.
Robinson, Sean M; Sobell, Linda Carter; Sobell, Mark B; Arcidiacono, Steven; Tzall, David
2014-01-01
Several methodological reviews of alcohol treatment outcome studies and one review of drug studies have been published over the past 40 years. Although past reviews demonstrated methodological improvements in alcohol studies, they also found continued deficiencies. The current review allows for an updated evaluation of the methodological rigor of alcohol and drug studies and, by utilizing inclusion criteria similar to previous reviews, it allows for a comparative review over time. In addition, this is the first review that compares the methodology of alcohol and drug treatment outcome studies published during the same time period. The methodology for 25 alcohol and 11 drug treatment outcome studies published from 2005 through 2010 that met the review's inclusion criteria was evaluated. The majority of variables evaluated were used in prior reviews. The current review found that more alcohol and drug treatment outcome studies are now using continuous substance use measures and assessing problem severity. Although there have been methodological improvements over time, the current reviews differed little from their most recent past counterparts. Despite this finding, some areas, particularly the continued low reporting of demographic data, need strengthening. Improvement in the methodological rigor of alcohol and drug treatment outcome studies has occurred over time. The current review found few differences between alcohol and drug study methodologies as well as few differences between the current review and the most recent past alcohol and drug reviews. © 2013 Elsevier Ltd. All rights reserved.
Verifying the UK agricultural N2O emission inventory with tall tower measurements
NASA Astrophysics Data System (ADS)
Carnell, E. J.; Meneguz, E.; Skiba, U. M.; Misselbrook, T. H.; Cardenas, L. M.; Arnold, T.; Manning, A.; Dragosits, U.
2016-12-01
Nitrous oxide (N2O) is a key greenhouse gas (GHG), with a global warming potential 300 times greater than that of CO2. N2O is emitted from a variety of sources, predominantly from agriculture. Annual UK emission estimates are reported, to comply with government commitments under the United Nations Framework Convention on Climate Change (UNFCCC). The UK N2O inventory follows internationally agreed protocols and emission estimates are derived by applying emission factors to estimates of (anthropogenic) emission sources. This approach is useful for comparing anthropogenic emissions from different countries, but does not capture regional differences and inter-annual variability associated with environmental factors (such as climate and soils) and agricultural management. In recent years, the UK inventory approach has been refined to include regional information into its emissions estimates, in an attempt to reduce uncertainty. This study attempts to assess the difference between current published inventory methodology (default IPCC methodology) and an alternative approach, which incorporates the latest thinking, using data from recent work. For 2013, emission estimates made using the alternative approach were 30 % lower than those made using default IPCC methodology, due to the use of lower emission factors suggested by recent projects (Defra projects: AC0116, AC0213 and MinNO). The 2013 emissions estimates were disaggregated on a monthly basis using agricultural management (e.g. sowing dates), climate data and soil properties. The temporally disaggregated emission maps were used as input to the Met Office atmospheric dispersion model NAME, for comparison with measured N2O concentrations, at three observation stations (Tacolneston, E. England; Ridge Hill, W. England; Mace Head, W. Ireland) in the UK DECC network (Deriving Emissions linked to Climate Change). The Mace Head site, situated on the west coast of Ireland, was used to establish baseline concentrations. The trends in the modelled data were found to correspond with the observational data trends, with concentration peaks coinciding with periods of land spreading of manures and fertiliser application. The model run using the default IPCC methodology was found to correspond with the observed data more closely than the alternative approach.
NASA Astrophysics Data System (ADS)
Miola, Apollonia; Ciuffo, Biagio
2011-04-01
Maritime transport plays a central role in the transport sector's sustainability debate. Its contribution to air pollution and greenhouse gases is significant. An effective policy strategy to regulate air emissions requires their robust estimation in terms of quantification and location. This paper provides a critical analysis of the ship emission modelling approaches and data sources available, identifying their limits and constraints. It classifies the main methodologies on the basis of the approach followed (bottom-up or top-down) for the evaluation and geographic characterisation of emissions. The analysis highlights the uncertainty of results from the different methods. This is mainly due to the level of uncertainty connected with the sources of information that are used as inputs to the different studies. This paper describes the sources of the information required for these analyses, paying particular attention to AIS data and to the possible problems associated with their use. One way of reducing the overall uncertainty in the results could be the simultaneous use of different sources of information. This paper presents an alternative methodology based on this approach. As a final remark, it can be expected that new approaches to the problem together with more reliable data sources over the coming years could give more impetus to the debate on the global impact of maritime traffic on the environment that, currently, has only reached agreement via the "consensus" estimates provided by IMO (2009).
Emergy Analysis for the Sustainable Utilization of Biosolids ...
This contribution describes the application of an emergy-based methodology for comparing two management alternatives of biosolids produced in a wastewater treatment plant. The current management practice of using biosolids as soil fertilizers was evaluated and compared to another alternative, the recovery of energy from the biosolid gasification process. This emergy assessment and comparison approach identifies more sustainable processes which achieve economic and social benefits with a minimal environmental impact. In addition, emergy-based sustainability indicators and the GREENSCOPE methodology were used to compare the two biosolid management alternatives. According to the sustainability assessment results, the energy production from biosolid gasification is energetically profitable, economically viable, and environmentally suitable. Furthermore, it was found that the current use of biosolids as soil fertilizer does not generate any considerable environmental stress, has the potential to achieve more economic benefits, and that post-processing of biosolids prior to their use as soil fertilizer improves their sustainability performance. In conclusion, this emergy analysis provides a sustainability assessment of both alternatives of biosolid management and helps decision-makers to identify opportunities for improvement during the current process of biosolid management. This work aims to identify the best option for the use and management of biosolids generated in a wastewater treatment plant.
Gill, Emily L; Koelmel, Jeremy P; Yost, Richard A; Okun, Michael S; Vedam-Mai, Vinata; Garrett, Timothy J
2018-03-06
Parkinson's disease (PD) is a neurodegenerative disorder resulting from the loss of dopaminergic neurons of the substantia nigra as well as degeneration of motor and nonmotor basal ganglia circuitries. Typically known for classical motor deficits (tremor, rigidity, bradykinesia), early stages of the disease are associated with a large nonmotor component (depression, anxiety, apathy, etc.). Currently, there are no definitive biomarkers of PD, and the measurement of dopamine metabolites does not allow for detection of prodromal PD nor does it aid in long-term monitoring of disease progression. Given that PD is increasingly recognized as complex and heterogeneous, involving several neurotransmitters and proteins, it is of importance that we advance interdisciplinary studies to further our knowledge of the molecular and cellular pathways that are affected in PD. This approach will possibly yield useful biomarkers for early diagnosis and may assist in the development of disease-modifying therapies. Here, we discuss preanalytical factors associated with metabolomics studies, summarize current mass spectrometric methodologies used to evaluate the metabolic signature of PD, and provide future perspectives of the rapidly developing field of MS in the context of PD.
Genomic and metagenomic technologies to explore the antibiotic resistance mobilome.
Martínez, José L; Coque, Teresa M; Lanza, Val F; de la Cruz, Fernando; Baquero, Fernando
2017-01-01
Antibiotic resistance is a relevant problem for human health that requires global approaches to establish a deep understanding of the processes of acquisition, stabilization, and spread of resistance among human bacterial pathogens. Since natural (nonclinical) ecosystems are reservoirs of resistance genes, a health-integrated study of the epidemiology of antibiotic resistance requires the exploration of such ecosystems with the aim of determining the role they may play in the selection, evolution, and spread of antibiotic resistance genes, involving the so-called resistance mobilome. High-throughput sequencing techniques allow an unprecedented opportunity to describe the genetic composition of a given microbiome without the need to subculture the organisms present inside. However, bioinformatic methods for analyzing this bulk of data, mainly with respect to binning each resistance gene with the organism hosting it, are still in their infancy. Here, we discuss how current genomic methodologies can serve to analyze the resistance mobilome and its linkage with different bacterial genomes and metagenomes. In addition, we describe the drawbacks of current methodologies for analyzing the resistance mobilome, mainly in cases of complex microbiotas, and discuss the possibility of implementing novel tools to improve our current metagenomic toolbox. © 2016 New York Academy of Sciences.
Verma, Nishant; Beretvas, S Natasha; Pascual, Belen; Masdeu, Joseph C; Markey, Mia K
2015-11-12
As currently used, the Alzheimer's Disease Assessment Scale-Cognitive subscale (ADAS-Cog) has low sensitivity for measuring Alzheimer's disease progression in clinical trials. A major reason behind the low sensitivity is its sub-optimal scoring methodology, which can be improved to obtain better sensitivity. Using item response theory, we developed a new scoring methodology (ADAS-CogIRT) for the ADAS-Cog, which addresses several major limitations of the current scoring methodology. The sensitivity of the ADAS-CogIRT methodology was evaluated using clinical trial simulations as well as a negative clinical trial, which had shown an evidence of a treatment effect. The ADAS-Cog was found to measure impairment in three cognitive domains of memory, language, and praxis. The ADAS-CogIRT methodology required significantly fewer patients and shorter trial durations as compared to the current scoring methodology when both were evaluated in simulated clinical trials. When validated on data from a real clinical trial, the ADAS-CogIRT methodology had higher sensitivity than the current scoring methodology in detecting the treatment effect. The proposed scoring methodology significantly improves the sensitivity of the ADAS-Cog in measuring progression of cognitive impairment in clinical trials focused in the mild-to-moderate Alzheimer's disease stage. This provides a boost to the efficiency of clinical trials requiring fewer patients and shorter durations for investigating disease-modifying treatments.
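The contrast between raw sum scoring and latent-trait scoring can be shown with a deliberately simplified example. The ADAS-CogIRT model itself is multidimensional and uses graded responses; the sketch below is a plain unidimensional 2PL model with invented item parameters and response pattern, included only to illustrate the item response theory idea of estimating ability instead of summing item scores.

```python
# Minimal IRT illustration (2PL): maximum-likelihood latent trait vs. raw sum score.
import numpy as np
from scipy.optimize import minimize_scalar

discrimination = np.array([1.5, 0.8, 2.0, 1.2, 0.6])   # a_i (assumed)
difficulty     = np.array([-1.0, 0.0, 0.5, 1.0, 2.0])  # b_i (assumed)
responses      = np.array([1, 1, 0, 1, 0])             # hypothetical item response pattern

def neg_log_likelihood(theta):
    p = 1.0 / (1.0 + np.exp(-discrimination * (theta - difficulty)))
    return -np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p))

fit = minimize_scalar(neg_log_likelihood, bounds=(-4, 4), method="bounded")
print("raw sum score:             ", responses.sum())
print("IRT latent trait estimate:  %.2f" % fit.x)
```

Because the latent-trait estimate weights items by how informative they are, it can detect smaller changes than an unweighted sum, which is the intuition behind the sensitivity gains reported above.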
Gladysz, Rafaela; Dos Santos, Fabio Mendes; Langenaeker, Wilfried; Thijs, Gert; Augustyns, Koen; De Winter, Hans
2018-03-07
Spectrophores are novel descriptors that are calculated from the three-dimensional atomic properties of molecules. In our current implementation, the atomic properties that were used to calculate spectrophores include atomic partial charges, atomic lipophilicity indices, atomic shape deviations and atomic softness properties. This approach can easily be widened to also include additional atomic properties. Our novel methodology finds its roots in the experimental affinity fingerprinting technology developed in the 1990's by Terrapin Technologies. Here we have translated it into a purely virtual approach using artificial affinity cages and a simplified metric to calculate the interaction between these cages and the atomic properties. A typical spectrophore consists of a vector of 48 real numbers. This makes it highly suitable for the calculation of a wide range of similarity measures for use in virtual screening and for the investigation of quantitative structure-activity relationships in combination with advanced statistical approaches such as self-organizing maps, support vector machines and neural networks. In our present report we demonstrate the applicability of our novel methodology for scaffold hopping as well as virtual screening.
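A fixed-length 48-number descriptor lends itself directly to similarity ranking for virtual screening. The sketch below uses random stand-in vectors; real spectrophores would be computed with a cheminformatics toolkit (e.g., the Open Babel implementation), and the similarity metric shown (cosine) is just one of many that could be used.

```python
# Sketch: rank a screening library by descriptor similarity to a known active.
import numpy as np

rng = np.random.default_rng(42)
library = rng.normal(size=(1000, 48))   # hypothetical spectrophores of 1000 library compounds
query = rng.normal(size=48)             # hypothetical spectrophore of a known active

def cosine_similarity(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

scores = np.array([cosine_similarity(query, v) for v in library])
top10 = np.argsort(scores)[::-1][:10]
print("indices of the 10 most similar library compounds:", top10)
```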
A systematic approach for the location of hand sanitizer dispensers in hospitals.
Cure, Laila; Van Enk, Richard; Tiong, Ewing
2014-09-01
Compliance with hand hygiene practices is directly affected by the accessibility and availability of cleaning agents. Nevertheless, the decision of where to locate these dispensers is often not explicitly or fully addressed in the literature. In this paper, we study the problem of selecting the locations to install alcohol-based hand sanitizer dispensers throughout a hospital unit as an indirect approach to maximize compliance with hand hygiene practices. We investigate the relevant criteria in selecting dispenser locations that promote hand hygiene compliance, propose metrics for the evaluation of various location configurations, and formulate a dispenser location optimization model that systematically incorporates such criteria. A complete methodology to collect data and obtain the model parameters is described. We illustrate the proposed approach using data from a general care unit at a collaborating hospital. A cost analysis was performed to study the trade-offs between usability and cost. The proposed methodology can help in evaluating the current location configuration, determining the need for change, and establishing the best possible configuration. It can be adapted to incorporate alternative metrics, tailored to different institutions and updated as needed with new internal policies or safety regulation.
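One simple way to picture the location problem is as a weighted coverage model: choose k dispenser sites so that as much walking traffic as possible passes within a threshold distance of some dispenser. The coordinates, traffic weights, greedy heuristic and parameters below are all hypothetical; the paper's model incorporates richer usability criteria and a cost analysis rather than this bare coverage objective.

```python
# Hedged sketch: greedy maximum-coverage selection of dispenser sites.
import numpy as np

rng = np.random.default_rng(7)
candidate_sites = rng.uniform(0, 50, size=(30, 2))     # possible wall locations (m), assumed
demand_points   = rng.uniform(0, 50, size=(120, 2))    # hallway/room points with traffic, assumed
traffic         = rng.integers(1, 20, size=120)        # relative passes per hour, assumed
coverage_radius, budget_k = 8.0, 5

# (sites x demand) boolean matrix: does site s cover demand point d?
covered = (np.linalg.norm(candidate_sites[:, None, :] - demand_points[None, :, :], axis=2)
           <= coverage_radius)

chosen, uncovered = [], np.ones(len(demand_points), dtype=bool)
for _ in range(budget_k):
    gains = covered[:, uncovered].astype(int) @ traffic[uncovered]   # extra traffic each site adds
    best = int(np.argmax(gains))
    chosen.append(best)
    uncovered &= ~covered[best]

print("chosen site indices:", chosen)
print("traffic covered: %d of %d" % (traffic[~uncovered].sum(), traffic.sum()))
```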
Storytelling and story testing in domestication.
Gerbault, Pascale; Allaby, Robin G; Boivin, Nicole; Rudzinski, Anna; Grimaldi, Ilaria M; Pires, J Chris; Climer Vigueira, Cynthia; Dobney, Keith; Gremillion, Kristen J; Barton, Loukas; Arroyo-Kalin, Manuel; Purugganan, Michael D; Rubio de Casas, Rafael; Bollongino, Ruth; Burger, Joachim; Fuller, Dorian Q; Bradley, Daniel G; Balding, David J; Richerson, Peter J; Gilbert, M Thomas P; Larson, Greger; Thomas, Mark G
2014-04-29
The domestication of plants and animals marks one of the most significant transitions in human, and indeed global, history. Traditionally, study of the domestication process was the exclusive domain of archaeologists and agricultural scientists; today it is an increasingly multidisciplinary enterprise that has come to involve the skills of evolutionary biologists and geneticists. Although the application of new information sources and methodologies has dramatically transformed our ability to study and understand domestication, it has also generated increasingly large and complex datasets, the interpretation of which is not straightforward. In particular, challenges of equifinality, evolutionary variance, and emergence of unexpected or counter-intuitive patterns all face researchers attempting to infer past processes directly from patterns in data. We argue that explicit modeling approaches, drawing upon emerging methodologies in statistics and population genetics, provide a powerful means of addressing these limitations. Modeling also offers an approach to analyzing datasets that avoids conclusions steered by implicit biases, and makes possible the formal integration of different data types. Here we outline some of the modeling approaches most relevant to current problems in domestication research, and demonstrate the ways in which simulation modeling is beginning to reshape our understanding of the domestication process.
Effectiveness evaluation of the R&D projects in organizations financed by the budget expenses
NASA Astrophysics Data System (ADS)
Yakovlev, D.; Yushkov, E.; Pryakhin, A.; Bogatyreova, M.
2017-01-01
The issues of R&D project performance and prospects are closely connected with knowledge management. In the initial stages of project development, it is the quality of the project evaluation that is crucial for the result and for the generation of future knowledge. Currently there is no common methodology for evaluating new budget-financed R&D. The assessment of scientific and technical (ST) projects varies greatly depending on the type of customer - government or business structures. An extensive methodological groundwork has been formed for orders placed by business structures, including "internal administrative orders" by company management for the results of STA intended for its own ST divisions. Regrettably, this is not the case for state orders in the field of STA, although the issue requires state regulation and official methodological support. The article is devoted to the methodological assessment of the scientific and technical effectiveness of studies performed at the expense of budget funds, and suggests a new concept based on the definition of a cost-effectiveness index. The study shows that it is necessary to extend the previous approach to projects of different levels - micro-, meso- and macro-level projects. The preliminary results of the research show that a common methodological approach must underpin the financing of projects under government contracts within the framework of budget and stock financing. This should be developed as general guidelines as well as recommendations that reflect specific sectors of the public sector, various project levels and forms of financing, and different stages of the project life cycle.
Reliability Modeling Methodology for Independent Approaches on Parallel Runways Safety Analysis
NASA Technical Reports Server (NTRS)
Babcock, P.; Schor, A.; Rosch, G.
1998-01-01
This document is an adjunct to the final report An Integrated Safety Analysis Methodology for Emerging Air Transport Technologies. That report presents the results of our analysis of the problem of simultaneous but independent approaches of two aircraft on parallel runways (independent approaches on parallel runways, or IAPR). This introductory chapter presents a brief overview and perspective of approaches and methodologies for performing safety analyses for complex systems. Ensuing chapters provide the technical details that underlie the approach that we have taken in performing the safety analysis for the IAPR concept.
Study on the integration approaches to CAD/CAPP/FMS in garment CIMS
NASA Astrophysics Data System (ADS)
Wang, Xiankui; Tian, Wensheng; Liu, Chengying; Li, Zhizhong
1995-08-01
Computer integrated manufacturing system (CIMS), as an advanced methodology, has been applied in many industry fields. There is, however, little research on the application of CIMS in the garment industry, especially on the integrated approach to CAD, CAPP, and FMS in garment CIMS. In this paper, the current situations of CAD, CAPP, and FMS in the garment industry are discussed, and information requirements between them as well as the integrated approaches are also investigated. The representation of the garments' product data by the group technology coding is proposed. Based on the group technology, a shared data base as an integration element can be constructed, which leads to the integration of CAD/CAPP/FMS in garment CIMS.
A Kalman-Filter-Based Approach to Combining Independent Earth-Orientation Series
NASA Technical Reports Server (NTRS)
Gross, Richard S.; Eubanks, T. M.; Steppe, J. A.; Freedman, A. P.; Dickey, J. O.; Runge, T. F.
1998-01-01
An approach, based upon the use of a Kalman filter, that is currently employed at the Jet Propulsion Laboratory (JPL) for combining independent measurements of the Earth's orientation, is presented. Since changes in the Earth's orientation can be described as a randomly excited stochastic process, the uncertainty in our knowledge of the Earth's orientation grows rapidly in the absence of measurements. The Kalman-filter methodology allows for an objective accounting of this uncertainty growth, thereby facilitating the intercomparison of measurements taken at different epochs (not necessarily uniformly spaced in time) and with different precision. As an example of this approach to combining Earth-orientation series, a description is given of a combination, SPACE95, that has been generated recently at JPL.
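A toy version of the combination idea is sketched below: a random-walk Kalman filter fusing two noisy measurement series of the same slowly varying quantity, taken at different epochs and with different precisions. The process noise, measurement noises and synthetic "truth" are assumptions for illustration, not the JPL series or filter settings.

```python
# Sketch: scalar random-walk Kalman filter combining a sparse precise series
# with a dense noisy series of the same quantity.
import numpy as np

rng = np.random.default_rng(3)
n_days = 200
truth = np.cumsum(rng.normal(0, 0.1, n_days))           # random-walk "Earth orientation"

# Two series: precise but every 5 days, noisy but daily (both hypothetical).
series = []
for step, noise in [(5, 0.05), (1, 0.5)]:
    epochs = np.arange(0, n_days, step)
    series += [(int(t), truth[t] + rng.normal(0, noise), noise) for t in epochs]
series.sort()                                            # process measurements in time order

q = 0.1 ** 2                                             # random-walk process noise per day
x, P, t_prev = 0.0, 10.0, 0
estimates = []
for t, z, sigma in series:
    P += q * (t - t_prev)                                # uncertainty grows between measurements
    K = P / (P + sigma ** 2)                             # Kalman gain
    x += K * (z - x)
    P *= (1 - K)
    t_prev = t
    estimates.append((t, x))

rms = np.mean([(x - truth[t]) ** 2 for t, x in estimates]) ** 0.5
print("RMS error of combined series: %.3f" % rms)
```

The growth of P between epochs is the "objective accounting of uncertainty growth" mentioned in the abstract: measurements far from the last update get less weight relative to the filter's own prediction.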
Neuroeconomics: cross-currents in research on decision-making.
Sanfey, Alan G; Loewenstein, George; McClure, Samuel M; Cohen, Jonathan D
2006-03-01
Despite substantial advances, the question of how we make decisions and judgments continues to pose important challenges for scientific research. Historically, different disciplines have approached this problem using different techniques and assumptions, with few unifying efforts made. However, the field of neuroeconomics has recently emerged as an inter-disciplinary effort to bridge this gap. Research in neuroscience and psychology has begun to investigate neural bases of decision predictability and value, central parameters in the economic theory of expected utility. Economics, in turn, is being increasingly influenced by a multiple-systems approach to decision-making, a perspective strongly rooted in psychology and neuroscience. The integration of these disparate theoretical approaches and methodologies offers exciting potential for the construction of more accurate models of decision-making.
Coal resources available for development; a methodology and pilot study
Eggleston, Jane R.; Carter, M. Devereux; Cobb, James C.
1990-01-01
Coal accounts for a major portion of our Nation's energy supply in projections for the future. A demonstrated reserve base of more than 475 billion short tons, as the Department of Energy currently estimates, indicates that, on the basis of today's rate of consumption, the United States has enough coal to meet projected energy needs for almost 200 years. However, the traditional procedures used for estimating the demonstrated reserve base do not account for many environmental and technological restrictions placed on coal mining. A new methodology has been developed to determine the quantity of coal that might actually be available for mining under current and foreseeable conditions. This methodology is unique in its approach, because it applies restrictions to the coal resource before it is mined. Previous methodologies incorporated restrictions into the recovery factor (a percentage), which was then globally applied to the reserve (minable coal) tonnage to derive a recoverable coal tonnage. None of the previous methodologies define the restrictions and their area and amount of impact specifically. Because these restrictions and their impacts are defined in this new methodology, it is possible to achieve more accurate and specific assessments of available resources. This methodology has been tested in a cooperative project between the U.S. Geological Survey and the Kentucky Geological Survey on the Matewan 7.5-minute quadrangle in eastern Kentucky. Pertinent geologic, mining, land-use, and technological data were collected, assimilated, and plotted. The National Coal Resources Data System was used as the repository for data, and its geographic information system software was applied to these data to eliminate restricted coal and quantify that which is available for mining. This methodology does not consider recovery factors or the economic factors that would be considered by a company before mining. Results of the pilot study indicate that, of the estimated original 986.5 million short tons of coal resources in Kentucky's Matewan quadrangle, 13 percent has been mined, 2 percent is restricted by land-use considerations, and 23 percent is restricted by technological considerations. This leaves an estimated 62 percent of the original resource, or approximately 612 million short tons available for mining. However, only 44 percent of this available coal (266 million short tons) will meet current Environmental Protection Agency new-source performance standards for sulfur emissions from electric generating plants in the United States. In addition, coal tonnage lost during mining and cleaning would further reduce the amount of coal actually arriving at the market.
Controlled synthesis of different metal oxide nanostructures by direct current arc discharge.
Su, Yanjie; Zhang, Jing; Zhang, Liling; Zhang, Yafei
2013-02-01
The direct current (DC) arc discharge method produces high temperatures in a short time and has been widely used to prepare carbon nanotubes. We use this simple approach to synthesize metal oxide nanostructures (MgO, SnO2) without any catalyst. Different morphologies (nanowires, nanobelts, nanocubes, and nanodisks) of metal oxide nanostructures can be controllably synthesized by changing the content of air in the buffer gas. The growth mechanisms for these nanostructures are discussed in detail. Oxygen partial pressure appears to be one of the most important factors. The methodology might be used to synthesize similar nanostructures of other functional oxide materials and non-oxide materials.
Reviewing the methodology of an integrative review.
Hopia, Hanna; Latvala, Eila; Liimatainen, Leena
2016-12-01
Whittemore and Knafl's updated description of the methodological approach for integrative reviews was published in 2005. Since then, the five stages of the approach have been regularly used as a basic conceptual structure of the integrative reviews conducted by nursing researchers. However, this methodological approach is seldom examined from the perspective of how systematically and rigorously the stages are implemented in the published integrative reviews. The aim was to appraise selected integrative reviews on the basis of the methodological approach according to the five stages published by Whittemore and Knafl in 2005. A literature review was used in this study. CINAHL (Cumulative Index to Nursing and Allied Health), PubMed, OVID (Journals@Ovid) and the Cochrane Library databases were searched for integrative reviews published between 2002 and 2014. Papers were included if they used the methodological approach described by Whittemore and Knafl, were published in English and were focused on nursing education or nursing expertise. A total of 259 integrative review publications were identified for potential inclusion. Ten integrative reviews fulfilled the inclusion criteria. Findings from the studies were extracted and critically examined according to the five methodological stages. The reviews assessed followed the guidelines of the stated methodological approach to different extents. The stages of literature search, data evaluation and data analysis were fairly poorly formulated and only partially implemented in the studies included in the sample. The other two stages, problem identification and presentation, followed those described in the methodological approach quite well. Increasing use of research in clinical practice is inevitable, and therefore, integrative reviews can play a greater role in developing evidence-based nursing practices. Because of this, nurse researchers should pay more attention to sound integrative nursing research to systematise the review process and make it more rigorous. © 2016 Nordic College of Caring Science.
Refinements to the Graves and Pitarka (2010) Broadband Ground Motion Simulation Method
Graves, Robert; Pitarka, Arben
2015-01-01
This brief article describes refinements to the Graves and Pitarka (2010) broadband ground motion simulation methodology (GP2010 hereafter) that have been implemented in version 14.3 of the SCEC Broadband Platform (BBP). The updated version of our method on the current SCEC BBP is referred to as GP14.3. Our simulation technique is a hybrid approach that combines low-frequency and high-frequency motions computed with different methods into a single broadband response. The separate low- and high-frequency components have traditionally been called “deterministic” and “stochastic”, respectively; however, this nomenclature is an oversimplification. In reality, the low-frequency approach includes many stochastic elements, and likewise, the high-frequency approach includes many deterministic elements (e.g., Pulido and Kubo, 2004; Hartzell et al., 2005; Liu et al., 2006; Frankel, 2009; Graves and Pitarka, 2010; Mai et al., 2010). While the traditional terminology will likely remain in use by the broader modeling community, in this paper we will refer to these using the generic terminology “low-frequency” and “high-frequency” approaches. Furthermore, one of the primary goals in refining our methodology is to provide a smoother and more consistent transition between the low- and high-frequency calculations, with the ultimate objective being the development of a single unified modeling approach that can be applied over a broad frequency band. GP2010 was validated by modeling recorded strong motions from four California earthquakes. While the method performed well overall, several issues were identified including the tendency to over-predict the level of longer period (2-5 sec) motions and the effects of rupture directivity. The refinements incorporated in GP14.3 are aimed at addressing these issues with application to the simulation of earthquakes in Western US (WUS). These refinements include the addition of a deep weak zone (details in following section) to the rupture characterization and allowing perturbations in the correlation of rise time and rupture speed with the specified slip distribution. Additionally, we have extended the parameterization of GP14.3 so that it is also applicable for simulating Eastern North America (ENA) earthquakes. This work has been guided by the comprehensive set of validation studies described in Goulet and Abrahamson (2014) and Dreger et al. (2014). The GP14.3 method shows improved performance relative to GP2010, and we direct the interested reader to Dreger et al. (2014) for a detailed assessment of the current methodology. In this paper, we concentrate on describing the modifications in more detail, and also discussing additional refinements that are currently being developed.
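The hybrid merging step can be pictured with a rough sketch: combine a "low-frequency" and a "high-frequency" synthetic trace at a crossover frequency using matched filters. The signals, sampling rate and 1 Hz crossover below are placeholders; the actual GP14.3 procedure involves physics-based and semi-stochastic simulations, site terms, and careful amplitude matching at the transition, none of which is reproduced here.

```python
# Sketch: merge low- and high-frequency traces into one broadband record.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 100.0                                              # sampling rate (Hz), assumed
t = np.arange(0, 40, 1 / fs)                            # 40 s record
rng = np.random.default_rng(5)
low_freq_trace  = np.sin(2 * np.pi * 0.3 * t) * np.exp(-0.05 * t)   # stand-in "deterministic" part
high_freq_trace = rng.normal(0, 0.3, t.size) * np.exp(-0.1 * t)     # stand-in "stochastic" part

f_cross = 1.0                                           # crossover frequency (Hz), assumed
b_lo, a_lo = butter(4, f_cross / (fs / 2), btype="low")
b_hi, a_hi = butter(4, f_cross / (fs / 2), btype="high")

broadband = filtfilt(b_lo, a_lo, low_freq_trace) + filtfilt(b_hi, a_hi, high_freq_trace)
print("peak broadband amplitude: %.3f" % np.abs(broadband).max())
```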
Maximizing ecological and evolutionary insight in bisulfite sequencing data sets
Lea, Amanda J.; Vilgalys, Tauras P.; Durst, Paul A.P.; Tung, Jenny
2017-01-01
Genome-scale bisulfite sequencing approaches have opened the door to ecological and evolutionary studies of DNA methylation in many organisms. These approaches can be powerful. However, they introduce new methodological and statistical considerations, some of which are particularly relevant to non-model systems. Here, we highlight how these considerations influence a study’s power to link methylation variation with a predictor variable of interest. Relative to current practice, we argue that sample sizes will need to increase to provide robust insights. We also provide recommendations for overcoming common challenges and an R Shiny app to aid in study design. PMID:29046582
Woolderink, M; Lynch, F L; van Asselt, A D I; Beecham, J; Evers, S M A A; Paulus, A T G; van Schayck, C P
2015-05-01
Economic evaluations are increasingly used in decision-making. Accurate measurement of service use is critical to economic evaluation. This qualitative study, based on expert interviews, aims to identify best approaches to service use measurement for child mental health conditions, and to identify problems in current methods. Results suggest considerable agreement on the strengths (e.g., availability of accurate instruments to measure service use) and weaknesses (e.g., lack of unit prices for services outside the health sector) of alternative approaches to service use measurement. Experts also identified some unresolved problems, for example the lack of uniform definitions for some mental health services.
Approaches to advance scientific understanding of macrosystems ecology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Levy, Ofir; Ball, Becky; Bond-Lamberty, Benjamin
Macrosystem ecological studies inherently investigate processes that interact across multiple spatial and temporal scales, requiring intensive sampling and massive amounts of data from diverse sources to incorporate complex cross-scale and hierarchical interactions. Inherent challenges associated with these characteristics include high computational demands, data standardization and assimilation, identification of important processes and scales without prior knowledge, and the need for large, cross-disciplinary research teams that conduct long-term studies. Therefore, macrosystem ecology studies must utilize a unique set of approaches that are capable of encompassing these methodological characteristics and associated challenges. Several case studies demonstrate innovative methods used in current macrosystem ecology studies.
A product-service system approach to telehealth application design.
Flores-Vaquero, Paul; Tiwari, Ashutosh; Alcock, Jeffrey; Hutabarat, Windo; Turner, Chris
2016-06-01
A considerable proportion of current point-of-care devices do not offer a wide enough set of capabilities if they are to function in any telehealth system. There is a need for intermediate devices that lie between healthcare devices and service networks. The development of an application that allows a smartphone to take the role of an intermediate device is suggested. This research seeks to identify the telehealth service requirements for long-term condition management using a product-service system approach. The use of a product-service system approach has proven to be a suitable methodology for the design and development of telehealth smartphone applications. © The Author(s) 2014.
Meeting EHR security requirements: SeAAS approach.
Katt, Basel; Trojer, Thomas; Breu, Ruth; Schabetsberger, Thomas; Wozak, Florian
2010-01-01
In the last few years, Electronic Health Record (EHR) systems have received great attention in the literature, as well as in industry. They are expected to lead to health care savings, increase health care quality and reduce medical errors. This interest has been accompanied by the development of different standards and frameworks to meet EHR challenges. One of the most important initiatives developed to solve problems of EHR is IHE (Integrating the Healthcare Enterprise), which adapts the distributed approach to store and manage healthcare data. IHE aims at standardizing the way healthcare systems exchange information in distributed environments. For this purpose it defines several so-called Integration Profiles that specify the interactions and the interfaces (Transactions) between various healthcare systems (Actors) or entities. Security was also considered in a few profiles that tackled the main security requirements, mainly authentication and audit trails. The security profiles of IHE currently suffer from two drawbacks. First, they apply an end-point security methodology, which has recently been shown to be insufficient and cumbersome in distributed and heterogeneous environments. Second, the current security profiles for more complex security requirements are oversimplified, vague and do not consider architectural design. This has recently changed to some extent, e.g., with the introduction of newly published white papers regarding privacy [5] and access control [9]. In order to solve the first problem we utilize results of previous studies conducted in the area of security-aware IHE-based systems and the state-of-the-art Security-as-a-Service approach as a convenient methodology to group domain-wide security needs and overcome the end-point security shortcomings.
Andersen, Melvin E.; Clewell, Harvey J.; Carmichael, Paul L.; Boekelheide, Kim
2013-01-01
The 2007 report “Toxicity Testing in the 21st Century: A Vision and A Strategy” argued for a change in toxicity testing for environmental agents and discussed federal funding mechanisms that could be used to support this transformation within the USA. The new approach would test for in vitro perturbations of toxicity pathways using human cells with high throughput testing platforms. The NRC report proposed a deliberate timeline, spanning about 20 years, to implement a wholesale replacement of current in-life toxicity test approaches focused on apical responses with in vitro assays. One approach to accelerating implementation is to focus on well-studied prototype compounds with known toxicity pathway targets. Through a series of carefully executed case studies with four or five pathway prototypes, the various steps required for implementation of an in vitro toxicity pathway approach to risk assessment could be developed and refined. In this article, we discuss alternative approaches for implementation and also outline advantages of a case study approach and the manner in which the cases studies could be pursued using current methodologies. A case study approach would be complementary to recently proposed efforts to map the human toxome, while representing a significant extension toward more formal risk assessment compared to the profiling and prioritization approaches offered by programs such as the EPA’s ToxCast effort. PMID:21993955
NASA Astrophysics Data System (ADS)
Dodick, Jeff; Argamon, Shlomo; Chase, Paul
2009-08-01
A key focus of current science education reforms involves developing inquiry-based learning materials. However, without an understanding of how working scientists actually do science, such learning materials cannot be properly developed. Until now, research on scientific reasoning has focused on cognitive studies of individual scientific fields. However, the question remains as to whether scientists in different fields fundamentally rely on different methodologies. Although many philosophers and historians of science do indeed assert that there is no single monolithic scientific method, this has never been tested empirically. We therefore approach this problem by analyzing patterns of language used by scientists in their published work. Our results demonstrate systematic variation in language use between types of science that are thought to differ in their characteristic methodologies. The features of language use that were found correspond closely to a proposed distinction between Experimental Sciences (e.g., chemistry) and Historical Sciences (e.g., paleontology); thus, different underlying rhetorical and conceptual mechanisms likely operate for scientific reasoning and communication in different contexts.
The temporal structure of pollution levels in developed cities.
Barrigón Morillas, Juan Miguel; Ortiz-Caraballo, Carmen; Prieto Gajardo, Carlos
2015-06-01
Currently, the need for mobility can cause significant pollution levels in cities, with important effects on health and quality of life. Any approach to the study of urban pollution and its effects requires an analysis of spatial distribution and temporal variability. A crucial challenge is to obtain proven methodologies that improve the quality of predictions while saving resources in spatial and temporal sampling. This work proposes a new analytical methodology for the study of temporal structure. As a result, a model for estimating annual levels of urban traffic noise is proposed. The average errors are less than one decibel for all acoustic indicators. This opens a new working methodology for urban noise studies. Additionally, a general application can be found in the study of the impacts of traffic-related pollution, with implications for urban design and possibly for economic and sociological aspects. Copyright © 2015 Elsevier B.V. All rights reserved.
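As a small worked example of why sampling strategy matters for long-term noise indicators, the snippet below contrasts the arithmetic mean of sound levels with the energetic average (Leq) that underlies annual indicators. The hourly levels are synthetic, and the paper's specific temporal-structure model is not reproduced here.

```python
# Worked example: energetic averaging of sound levels, Leq = 10*log10(mean(10^(L/10))).
import numpy as np

rng = np.random.default_rng(11)
hourly_levels_db = 55 + 10 * np.sin(2 * np.pi * np.arange(24) / 24) + rng.normal(0, 2, 24)

def leq(levels_db):
    return 10 * np.log10(np.mean(10 ** (levels_db / 10)))

print("arithmetic mean of levels: %.1f dB" % hourly_levels_db.mean())
print("energetic average (Leq):   %.1f dB" % leq(hourly_levels_db))
```

Because loud hours dominate the energetic average, missing or over-sampling particular periods of the temporal structure biases an annual estimate far more than a simple arithmetic view would suggest.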
Transferring Codified Knowledge: Socio-Technical versus Top-Down Approaches
ERIC Educational Resources Information Center
Guzman, Gustavo; Trivelato, Luiz F.
2008-01-01
Purpose: This paper aims to analyse and evaluate the transfer process of codified knowledge (CK) performed under two different approaches: the "socio-technical" and the "top-down". It is argued that the socio-technical approach supports the transfer of CK better than the top-down approach. Design/methodology/approach: Case study methodology was…
NASA Astrophysics Data System (ADS)
Escuder-Bueno, I.; Castillo-Rodríguez, J. T.; Zechner, S.; Jöbstl, C.; Perales-Momparler, S.; Petaccia, G.
2012-09-01
Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of applying the methodology herein presented consists in providing a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. It can be of great interest for decision makers as it provides rational and solid information.
Increasing accuracy in the assessment of motion sickness: A construct methodology
NASA Technical Reports Server (NTRS)
Stout, Cynthia S.; Cowings, Patricia S.
1993-01-01
The purpose is to introduce a new methodology that should improve the accuracy of the assessment of motion sickness. This construct methodology uses both subjective reports of motion sickness and objective measures of physiological correlates to assess motion sickness. Current techniques and methods are inadequate when used within the framework of a construct methodology. Current assessment techniques for diagnosing motion sickness and space motion sickness are reviewed, and attention is called to the problems with the current methods. Further, principles of psychophysiology that, when applied, will probably resolve some of these problems are described in detail.
Various approaches in EPR identification of gamma-irradiated plant foodstuffs: A review.
Aleksieva, Katerina I; Yordanov, Nicola D
2018-03-01
Irradiation of food is becoming a preferred method worldwide for sterilization and for extending shelf life. For trade purposes, and with regard to consumer rights, irradiated foodstuffs must be labelled, which requires appropriate methods for the unambiguous identification of radiation treatment. One-third of the current European Union standards for identifying irradiated foods rely on Electron Paramagnetic Resonance (EPR) spectroscopy. On the other hand, the current standards for irradiated foods of plant origin have some weaknesses, which have led to the development of new methodologies for the identification of irradiated food. New approaches for EPR identification of radiation treatment of herbs and spices, in cases where the specific signal is absent or has disappeared after irradiation, are discussed. Direct EPR measurements of dried fruits and vegetables and different pretreatments for fresh samples are reviewed. Copyright © 2017 Elsevier Ltd. All rights reserved.
Exuberant and inhibited children: Person-centered profiles and links to social adjustment.
Dollar, Jessica M; Stifter, Cynthia A; Buss, Kristin A
2017-07-01
The current study aimed to substantiate and extend our understanding of the existence and developmental pathways of 3 distinct temperament profiles, exuberant, inhibited, and average approach, in a sample of 3.5-year-old children (n = 121). The interactions between temperamental styles and specific types of effortful control, inhibitory control and attentional control, were also examined in predicting kindergarten peer acceptance. Latent profile analysis identified 3 temperamental styles: exuberant, inhibited, and average approach. Support was found for the adaptive role of inhibitory control for exuberant children and of attentional control for inhibited children in promoting peer acceptance in kindergarten. These findings add to our current understanding of temperamental profiles by using sophisticated methodology in a slightly older, community sample, and highlight the importance of examining specific types of self-regulation to identify which skills lower risk for children of different temperamental styles. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
User-Defined Data Distributions in High-Level Programming Languages
NASA Technical Reports Server (NTRS)
Diaconescu, Roxana E.; Zima, Hans P.
2006-01-01
One of the characteristic features of today's high performance computing systems is a physically distributed memory. Efficient management of locality is essential for meeting key performance requirements for these architectures. The standard technique for dealing with this issue has involved the extension of traditional sequential programming languages with explicit message passing, in the context of a processor-centric view of parallel computation. This has resulted in complex and error-prone assembly-style codes in which algorithms and communication are inextricably interwoven. This paper presents a high-level approach to the design and implementation of data distributions. Our work is motivated by the need to improve the current parallel programming methodology by introducing a paradigm supporting the development of efficient and reusable parallel code. This approach is currently being implemented in the context of a new programming language called Chapel, which is designed in the HPCS project Cascade.
g-force induced giant efficiency of nanoparticles internalization into living cells
Ocampo, Sandra M.; Rodriguez, Vanessa; de la Cueva, Leonor; Salas, Gorka; Carrascosa, Jose. L.; Josefa Rodríguez, María; García-Romero, Noemí; Luis, Jose; Cuñado, F.; Camarero, Julio; Miranda, Rodolfo; Belda-Iniesta, Cristobal; Ayuso-Sacido, Angel
2015-01-01
Nanotechnology plays an increasingly important role in the biomedical arena. Labelling cells with iron oxide nanoparticles (IONPs) is one of the most promising approaches for a fast and reliable evaluation of grafted cells in both preclinical studies and clinical trials. Current procedures to label living cells with IONPs are based on direct incubation or on physical approaches using magnetic or electrical fields, which always display very low cellular uptake efficiencies. Here we show that centrifugation-mediated internalization (CMI) promotes a high uptake of IONPs in glioblastoma tumour cells, in just a few minutes, via a clathrin-independent endocytosis pathway. CMI results in controllable cellular uptake efficiencies at least three orders of magnitude larger than current procedures. Similar trends are found in human mesenchymal stem cells, thereby demonstrating the general feasibility of the methodology, which is easily transferable to any laboratory and has great potential for the development of improved biomedical applications. PMID:26477718
Information requirements and methodology for development of an EVA crewmember's heads up display
NASA Astrophysics Data System (ADS)
Petrek, J. S.
This paper presents a systematic approach for developing a Heads Up Display (HUD) to be used within the helmet of the Extra Vehicular Activity (EVA) crewmember. The information displayed on the EVA HUD will be analogous to EVA Flight Data File (FDF) information, which is an integral part of NASA's current Space Transportation System. Another objective is to determine information requirements and media techniques ultimately leading to the helmet-mounted HUD presentation technique.
Development and application of optimum sensitivity analysis of structures
NASA Technical Reports Server (NTRS)
Barthelemy, J. F. M.; Hallauer, W. L., Jr.
1984-01-01
The research focused on developing an algorithm applying optimum sensitivity analysis for multilevel optimization. The research efforts have been devoted to assisting NASA Langley's Interdisciplinary Research Office (IRO) in the development of a mature methodology for a multilevel approach to the design of complex (large and multidisciplinary) engineering systems. An effort was undertaken to identify promising multilevel optimization algorithms. In the current reporting period, the computer program generating baseline single level solutions was completed and tested out.
Weber, Durkheim, and the comparative method.
Kapsis, R E
1977-10-01
This essay compares and contrasts the means by which Durkheim and Weber dealt with methodological issues peculiar to the comparative study of societies, what Smelser has called "the problem of sociocultural variability and complexity." More specifically, it examines how Weber and Durkheim chose appropriate comparative units for their empirical studies. The approaches that Weber and Durkheim brought to the problem of cross-cultural comparison have critical implications for more current procedures used in the comparative study of contemporary and historical societies.
Structural adjustment for accurate conditioning in large-scale subsurface systems
NASA Astrophysics Data System (ADS)
Tahmasebi, Pejman
2017-03-01
Most of the current subsurface simulation approaches consider a priority list for honoring the well and any other auxiliary data, and eventually adopt a middle ground between the quality of the model and conditioning it to hard data. However, as the number of datasets increases, such methods often produce undesirable features in the subsurface model. Owing to its high flexibility, subsurface modeling based on training images (TIs) is becoming popular. Providing comprehensive TIs remains, however, an outstanding problem. In addition, identifying a pattern similar to those in the TI that honors the well and other conditioning data is often difficult. Moreover, the current subsurface modeling approaches do not account for small perturbations that may occur in a subsurface system. Such perturbations are active in most depositional systems. In this paper, a new methodology is presented that is based on an irregular gridding scheme that accounts for incomplete TIs and minor offsets. Use of the methodology enables one to use a small or incomplete TI and adaptively change the patterns in the simulation grid in order to simultaneously honor the well data and take into account the effect of the local offsets. Furthermore, the proposed method was applied to various complex process-based models, whose structures were deformed to match the conditioning point data. The accuracy and robustness of the proposed algorithm are successfully demonstrated by applying it to models of several complex examples.
Investigating students' view on STEM in learning about electrical current through STS approach
NASA Astrophysics Data System (ADS)
Tupsai, Jiraporn; Yuenyong, Chokchai
2018-01-01
This study aims to investigate Grade 11 students' views on Science, Technology, Engineering and Mathematics (STEM) through the integration of learning about electrical current based on the Science Technology Society (STS) approach [8]. The participants were 60 Grade 11 students at the Demonstration Secondary School, Khon Kaen University, Khon Kaen Province, Thailand. The methodology is situated within the interpretive paradigm. The teaching and learning about electrical current through the STS approach was carried out over 6 weeks. The electrical current unit was developed based on the framework [8] that consists of five stages: (1) identification of social issues, (2) identification of potential solutions, (3) need for knowledge, (4) decision making, and (5) socialization. To start, the question "What if this world lacked electricity?" was posed to the class in order to move students to consider the problem of how to design electricity generation from clean energy. Students were expected to apply scientific and other knowledge to the design of electricity generation. Students' views on STEM were collected during their learning through participant observation and students' tasks, and were categorized as students applied their knowledge to designing the electricity generation. The findings indicated that students worked cooperatively to solve the problem, applying content knowledge from science and mathematics together with the process skills of technology and engineering. This showed that students integrated science, technology, engineering and mathematics to design possible solutions while learning about electrical current. The paper also discusses implications for science teaching and learning through STS in Thailand.
Product environmental footprint in policy and market decisions: Applicability and impact assessment.
Lehmann, Annekatrin; Bach, Vanessa; Finkbeiner, Matthias
2015-07-01
In April 2013, the European Commission published the Product and Organisation Environmental Footprint (PEF/OEF) methodology--a life cycle-based multicriteria measure of the environmental performance of products, services, and organizations. With its approach of "comparability over flexibility," the PEF/OEF methodology aims at harmonizing existing methods, while decreasing the flexibility provided by the International Organization for Standardization (ISO) standards regarding methodological choices. Currently, a 3-y pilot phase is running, aiming at testing the methodology and developing product category and organization sector rules (PEFCR/OEFSR). Although a harmonized method is in theory a good idea, the PEF/OEF methodology presents challenges, including a risk of confusion and limitations in applicability to practice. The paper discusses the main differences between the PEF and ISO methodologies and highlights challenges regarding PEF applicability, with a focus on impact assessment. Some methodological aspects of the PEF and PEFCR Guides are found to contradict the ISO 14044 (2006) and ISO 14025 (2006). Others, such as prohibition of inventory cutoffs, are impractical. The evaluation of the impact assessment methods proposed in the PEF/OEF Guide showed that the predefined methods for water consumption, land use, and abiotic resources are not adequate because of modeling artefacts, missing inventory data, or incomplete characterization factors. However, the methods for global warming and ozone depletion perform very well. The results of this study are relevant for the PEF (and OEF) pilot phase, which aims at testing the PEF (OEF) methodology (and potentially adapting it) as well as addressing challenges and coping with them. © 2015 SETAC.
Modified teaching approach for an enhanced medical physics graduate education experience
Rutel, IB
2011-01-01
Lecture-based teaching promotes a passive interaction with students. Opportunities to modify this format are available to enhance the overall learning experience for both students and instructors. The description for a discussion-based learning format is presented as it applies to a graduate curriculum with technical (formal mathematical derivation) topics. The presented hybrid method involves several techniques, including problem-based learning, modeling, and online lectures, eliminating didactic lectures. The results from an end-of-course evaluation show that the students appear to prefer the modified format over the more traditional methodology of “lecture only” contact time. These results are motivation for further refinement and continued implementation of the described methodology in the current course and potentially other courses within the department graduate curriculum. PMID:22279505
The Joint Confidence Level Paradox: A History of Denial
NASA Technical Reports Server (NTRS)
Butts, Glenn; Linton, Kent
2009-01-01
This paper is intended to provide a reliable methodology for those tasked with generating price tags on construction (CoF) and research and development (R&D) activities in the NASA performance world. This document consists of a collection of cost-related engineering detail and project fulfillment information from early agency days to the present. Accurate historical detail is the first place to start when determining improved methodologies for future cost and schedule estimating. This paper contains a beneficial proposed cost estimating method for arriving at more reliable numbers for future submissions. When comparing current cost and schedule methods with earlier cost and schedule approaches, it became apparent that NASA's organizational performance paradigm has morphed: mission fulfillment speed has slowed and cost calculating factors have increased in 21st Century space exploration.
Dvornikov, M V; Medenkov, A A
2015-04-01
In the current paper, the authors discuss problems of marine and aerospace medicine and psychophysiology addressed by Georgii Zarakovskii (1925-2014), a prominent Russian expert in the field of military medicine, psychology and ergonomics. The authors focus on the methodological approaches and results of his studies of psychophysiological characteristics and human capabilities, which were taken into account in the design of tools and the organization of the work of flight crews, astronauts and military experts. They note his contribution to the creation of a system integrating the psychophysiological features and characteristics of the person necessary for the development, testing and maintenance of aerospace engineering and the organization of related professional activities. The possibilities of using the methodology of psychophysiological activity analysis to improve the psychophysiological reliability of military specialists are shown.
Methodological guidelines for developing accident modification functions.
Elvik, Rune
2015-07-01
This paper proposes methodological guidelines for developing accident modification functions. An accident modification function is a mathematical function describing systematic variation in the effects of road safety measures. The paper describes ten guidelines. An example is given of how to use the guidelines. The importance of exploratory analysis and an iterative approach in developing accident modification functions is stressed. The example shows that strict compliance with all the guidelines may be difficult, but represents a level of stringency that should be strived for. Currently the main limitations in developing accident modification functions are the small number of good evaluation studies and the often huge variation in estimates of effect. It is therefore still not possible to develop accident modification functions for very many road safety measures. Copyright © 2015 Elsevier Ltd. All rights reserved.
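A minimal sketch of what an accident modification function can look like in practice, fitting an exponential dose-effect form to invented crash modification factors; the functional form and data are assumptions for illustration, not the paper's guidelines.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical data: crash modification factors (CMF) from several evaluation studies,
# plotted against the "dose" of the measure (e.g. percent speed reduction).
dose = np.array([2.0, 5.0, 8.0, 10.0, 15.0])
cmf = np.array([0.97, 0.90, 0.86, 0.80, 0.72])

def amf(d, beta):
    """Exponential accident modification function: effect grows smoothly with dose."""
    return np.exp(-beta * d)

(beta_hat,), _ = curve_fit(amf, dose, cmf, p0=[0.02])
print(f"beta = {beta_hat:.4f}; predicted CMF at dose 12: {amf(12.0, beta_hat):.2f}")
```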
A review on color normalization and color deconvolution methods in histopathology.
Onder, Devrim; Zengin, Selen; Sarioglu, Sulen
2014-01-01
Histopathologists benefit from a wide range of colored dyes that provide useful information about lesions and tissue composition. Despite its advantages, the staining process introduces quite complex variations in staining concentrations and correlations, tissue fixation types, and fixation time periods. Together with improvements in computing power and the development of novel image analysis methods, these imperfections have led to the emergence of several color normalization algorithms. This article is a review of the currently available digital color normalization methods for bright-field histopathology. We describe the proposed color normalization methodologies in detail, together with the lesion and tissue types used in the corresponding experiments. We also present the quantitative validation approaches for each proposed methodology where available.
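A brief sketch of one widely used building block in this literature, Ruifrok-Johnston color deconvolution, using scikit-image's built-in H&E/DAB separation on a bundled sample image; it illustrates the general idea rather than any specific method reviewed.

```python
from skimage import data
from skimage.color import rgb2hed

# Bundled immunohistochemistry sample image (RGB).
ihc_rgb = data.immunohistochemistry()

# Ruifrok-Johnston color deconvolution into Haematoxylin, Eosin and DAB channels.
ihc_hed = rgb2hed(ihc_rgb)
haematoxylin, eosin, dab = ihc_hed[..., 0], ihc_hed[..., 1], ihc_hed[..., 2]
print("channel shapes:", haematoxylin.shape, eosin.shape, dab.shape)
```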
Ethnographic methods for process evaluations of complex health behaviour interventions.
Morgan-Trimmer, Sarah; Wood, Fiona
2016-05-04
This article outlines the contribution that ethnography could make to process evaluations for trials of complex health-behaviour interventions. Process evaluations are increasingly used to examine how health-behaviour interventions operate to produce outcomes and often employ qualitative methods to do this. Ethnography shares commonalities with the qualitative methods currently used in health-behaviour evaluations but has a distinctive approach over and above these methods. It is an overlooked methodology in trials of complex health-behaviour interventions that has much to contribute to the understanding of how interventions work. These benefits are discussed here with respect to three strengths of ethnographic methodology: (1) producing valid data, (2) understanding data within social contexts, and (3) building theory productively. The limitations of ethnography within the context of process evaluations are also discussed.
Navigating the grounded theory terrain. Part 1.
Hunter, Andrew; Murphy, Kathy; Grealish, Annmarie; Casey, Dympna; Keady, John
2011-01-01
The decision to use grounded theory is not an easy one and this article aims to illustrate and explore the methodological complexity and decision-making process. It explores the decision making of one researcher in the first two years of a grounded theory PhD study looking at the psychosocial training needs of nurses and healthcare assistants working with people with dementia in residential care. It aims to map out three different approaches to grounded theory: classic, Straussian and constructivist. In nursing research, grounded theory is often referred to but it is not always well understood. This confusion is due in part to the history of grounded theory methodology, which is one of development and divergent approaches. Common elements across grounded theory approaches are briefly outlined, along with the key differences of the divergent approaches. Methodological literature pertaining to the three chosen grounded theory approaches is considered and presented to illustrate the options and support the choice made. The process of deciding on classical grounded theory as the version best suited to this research is presented. The methodological and personal factors that directed the decision are outlined. The relative strengths of Straussian and constructivist grounded theories are reviewed. All three grounded theory approaches considered offer the researcher a structured, rigorous methodology, but researchers need to understand their choices and make those choices based on a range of methodological and personal factors. In the second article, the final methodological decision will be outlined and its research application described.
An Assessment Methodology to Evaluate In-Flight Engine Health Management Effectiveness
NASA Astrophysics Data System (ADS)
Maggio, Gaspare; Belyeu, Rebecca; Pelaccio, Dennis G.
2002-01-01
This paper addresses the in-flight effectiveness of candidate engine health management system concepts. A next generation engine health management system will be required to be both reliable and robust in terms of anomaly detection capability. The system must be able to operate successfully in the hostile, high-stress engine system environment. This implies that its system components, such as the instrumentation, process and control, and vehicle interface and support subsystems, must be highly reliable. Additionally, the system must be able to address a vast range of possible engine operation anomalies through a host of different types of measurements, supported by a fast algorithm/architecture processing capability that can identify "true" (real) engine operation anomalies. False anomaly condition reports from such a system must be essentially eliminated; the accuracy of identifying only real anomaly conditions has been an issue with the Space Shuttle Main Engine (SSME) in the past, and much improvement in many of the supporting technologies is required. The objectives of this study were to identify and demonstrate a consistent assessment methodology that can evaluate the capability of next generation engine health management system concepts to respond in a correct, timely manner to alleviate an operational engine anomaly condition during flight. Science Applications International Corporation (SAIC), with support from NASA Marshall Space Flight Center, identified a probabilistic modeling approach, based on deterministic anomaly-time event assessment, that can be applied at the engine preliminary design stage to assess engine health management system concept effectiveness. Much of the discussion in this paper focuses on the formulation and application of this assessment, including key modeling assumptions, the overall assessment methodology, and the supporting engine health management system concept design/operation and fault mode information required to utilize this methodology. The paper concludes with a demonstration benchmark study that applied this methodology to the current SSME health management system, a summary of study results and lessons learned, and recommendations for future work.
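A toy Monte Carlo illustration of the anomaly-time-event idea described above: sample a margin between anomaly onset and engine loss, compare it with detection latency plus response time, and estimate the probability of timely mitigation. All distributions and times are invented assumptions, not values from the SAIC/SSME study.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 100_000

# Illustrative assumptions (times in seconds); none of these values come from the paper.
time_to_failure = rng.lognormal(mean=1.0, sigma=0.5, size=n_trials)  # margin from anomaly onset to engine loss
detect_latency = rng.exponential(0.3, n_trials)                      # EHM sensing + processing delay
response_time = 0.5                                                  # commanded shutdown / throttle-down time

# An anomaly is successfully mitigated if detection plus response completes before failure.
mitigated = (detect_latency + response_time) < time_to_failure
print(f"Estimated probability of timely mitigation: {mitigated.mean():.3f}")
```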
Mullins, C Daniel; Wang, Junling; Cooke, Jesse L; Blatt, Lisa; Baquet, Claudia R
2004-01-01
Projecting future breast cancer treatment expenditure is critical for budgeting purposes, medical decision making and the allocation of resources in order to maximise the overall impact on health-related outcomes of care. Currently, both longitudinal and cross-sectional methodologies are used to project the economic burden of cancer. This pilot study examined the differences in estimates that were obtained using these two methods, focusing on Maryland, US Medicaid reimbursement data for chemotherapy and prescription drugs for the years 1999-2000. Two different methodologies for projecting life cycles of cancer expenditure were considered. The first examined expenditure according to chronological time (calendar quarter) for all cancer patients in the database in a given quarter. The second examined only the most recent quarter and constructed a hypothetical expenditure life cycle by taking into consideration the number of quarters since the respective patient had her first claim. We found different average expenditures using the same data and over the same time period. The longitudinal measurement had less extreme peaks and troughs, and yielded average expenditure in the final period that was 60% higher than that produced using the cross-sectional analysis; however, the longitudinal analysis had intermediate periods with significantly lower estimated expenditure than the cross-sectional data. These disparate results signify that each of the methods has merit. The longitudinal method tracks changes over time while the cross-sectional approach reflects more recent data, e.g. current practice patterns. Thus, this study reiterates the importance of considering the methodology when projecting future cancer expenditure.
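A minimal sketch contrasting the two averaging schemes described above on invented quarterly claims data: a cross-sectional average over the most recent calendar quarter versus a longitudinal life-cycle average indexed by quarters since each patient's first claim. The data frame and values are hypothetical.

```python
import pandas as pd

# Hypothetical quarterly chemotherapy expenditure per patient (columns = calendar quarters,
# rows = patients); NaN before a patient's first claim.
claims = pd.DataFrame({
    "2019Q1": [1200, None, None],
    "2019Q2": [900, 1500, None],
    "2019Q3": [700, 1100, 1800],
    "2019Q4": [600, 800, 1300],
}, index=["pat_A", "pat_B", "pat_C"])

# Cross-sectional: average spend of everyone with a claim in the most recent quarter.
cross_sectional = claims["2019Q4"].mean()

# Longitudinal: re-index each patient by quarters since first claim, then average by that index.
def quarters_since_first(row):
    observed = row.dropna()
    return pd.Series(observed.values, index=range(len(observed)))

life_cycle = claims.apply(quarters_since_first, axis=1).mean()

print("cross-sectional average (latest quarter):", cross_sectional)
print("longitudinal average by quarters since first claim:\n", life_cycle)
```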
Braubach, Matthias; Tobollik, Myriam; Mudu, Pierpaolo; Hiscock, Rosemary; Chapizanis, Dimitris; Sarigiannis, Denis A.; Keuken, Menno; Perez, Laura; Martuzzi, Marco
2015-01-01
Well-being impact assessments of urban interventions are a difficult challenge, as there is no agreed methodology and scarce evidence on the relationship between environmental conditions and well-being. The European Union (EU) project “Urban Reduction of Greenhouse Gas Emissions in China and Europe” (URGENCHE) explored a methodological approach to assess traffic noise-related well-being impacts of transport interventions in three European cities (Basel, Rotterdam and Thessaloniki) linking modeled traffic noise reduction effects with survey data indicating noise-well-being associations. Local noise models showed a reduction of high traffic noise levels in all cities as a result of different urban interventions. Survey data indicated that perception of high noise levels was associated with lower probability of well-being. Connecting the local noise exposure profiles with the noise-well-being associations suggests that the urban transport interventions may have a marginal but positive effect on population well-being. This paper also provides insight into the methodological challenges of well-being assessments and highlights the range of limitations arising from the current lack of reliable evidence on environmental conditions and well-being. Due to these limitations, the results should be interpreted with caution. PMID:26016437
Analysis of crack initiation and growth in the high level vibration test at Tadotsu
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kassir, M.K.; Park, Y.J.; Hofmayer, C.H.
1993-08-01
The High Level Vibration Test data are used to assess the accuracy and usefulness of current engineering methodologies for predicting crack initiation and growth in a cast stainless steel pipe elbow under complex, large amplitude loading. The data were obtained by testing at room temperature a large scale modified model of one loop of a PWR primary coolant system at the Tadotsu Engineering Laboratory in Japan. Fatigue crack initiation time is reasonably predicted by applying a modified local strain approach (Coffin-Mason-Goodman equation) in conjunction with Miner's rule of cumulative damage. Three fracture mechanics methodologies are applied to investigate the crack growth behavior observed in the hot leg of the model. These are: the ΔK methodology (Paris law), ΔJ concepts and a recently developed limit load stress-range criterion. The report includes a discussion on the pros and cons of the analysis involved in each of the methods, the role played by the key parameters influencing the formulation and a comparison of the results with the actual crack growth behavior observed in the vibration test program. Some conclusions and recommendations for improvement of the methodologies are also provided.
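As an illustration of the ΔK (Paris law) methodology named above, the sketch below integrates da/dN = C(ΔK)^m cycle by cycle for a simple through-crack under constant-amplitude loading; the constants, stress range and geometry factor are invented, not values from the Tadotsu test.

```python
import numpy as np

# Paris law: da/dN = C * (dK)^m, with dK = Y * dSigma * sqrt(pi * a).
# The constants below are illustrative only, not values from the test program.
C, m = 1.0e-11, 3.0        # Paris constants (consistent with MPa*sqrt(m) and m/cycle)
Y = 1.12                   # geometry factor, assumed constant
delta_sigma = 120.0        # stress range per cycle, MPa
a = 0.002                  # initial crack depth, m
a_final = 0.02             # crack depth defining "failure", m

cycles = 0
while a < a_final:
    delta_k = Y * delta_sigma * np.sqrt(np.pi * a)
    a += C * delta_k ** m  # growth per cycle
    cycles += 1

print(f"Cycles to grow crack from 2 mm to 20 mm: {cycles}")
```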
Morales, Daniel R.; Pacurariu, Alexandra; Kurz, Xavier
2017-01-01
Aims: Evaluating the public health impact of regulatory interventions is important but there is currently no common methodological approach to guide this evaluation. This systematic review provides a descriptive overview of the analytical methods for impact research. Methods: We searched MEDLINE and EMBASE for articles with an empirical analysis evaluating the impact of European Union or non-European Union regulatory actions to safeguard public health published until March 2017. References from systematic reviews and articles from other known sources were added. Regulatory interventions, data sources, outcomes of interest, methodology and key findings were extracted. Results: From 1246 screened articles, 229 were eligible for full-text review and 153 articles in English language were included in the descriptive analysis. Over a third of articles studied analgesics and antidepressants. The interventions most frequently evaluated are regulatory safety communications (28.8%), black box warnings (23.5%) and direct healthcare professional communications (10.5%); 55% of studies measured changes in drug utilization patterns, 27% evaluated health outcomes, and 18% targeted knowledge, behaviour or changes in clinical practice. Unintended consequences like switching therapies or spill-over effects were rarely evaluated. Two-thirds used before-after time series and 15.7% before-after cross-sectional study designs. Various analytical approaches were applied, including interrupted time series regression (31.4%), simple descriptive analysis (28.8%) and descriptive analysis with significance tests (23.5%). Conclusion: Whilst impact evaluation of pharmacovigilance and product-specific regulatory interventions is increasing, the marked heterogeneity in study conduct and reporting highlights the need for scientific guidance to ensure robust methodologies are applied and systematic dissemination of results occurs. PMID:29105853
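A minimal sketch of the segmented (interrupted time-series) regression design most frequently reported in the review, using invented monthly prescribing data; the four coefficients correspond to baseline level, baseline trend, level change at the intervention and trend change after it.

```python
import numpy as np

# Hypothetical monthly prescribing rate, with a regulatory safety communication at month 24.
rng = np.random.default_rng(1)
months = np.arange(48)
intervention = (months >= 24).astype(float)
time_after = np.where(months >= 24, months - 24, 0.0)
true = 100 + 0.5 * months - 8 * intervention - 0.6 * time_after
y = true + rng.normal(0, 2, months.size)

# Segmented regression: baseline level, baseline trend, level change, trend change.
X = np.column_stack([np.ones_like(months, dtype=float), months, intervention, time_after])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, trend, level_change, trend_change = coef
print(f"level change at intervention: {level_change:.1f}; trend change: {trend_change:.2f}/month")
```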
Methodology or method? A critical review of qualitative case study reports.
Hyett, Nerida; Kenny, Amanda; Dickson-Swift, Virginia
2014-01-01
Despite on-going debate about credibility, and reported limitations in comparison to other approaches, case study is an increasingly popular approach among qualitative researchers. We critically analysed the methodological descriptions of published case studies. Three high-impact qualitative methods journals were searched to locate case studies published in the past 5 years; 34 were selected for analysis. Articles were categorized as health and health services (n=12), social sciences and anthropology (n=7), or methods (n=15) case studies. The articles were reviewed using an adapted version of established criteria to determine whether adequate methodological justification was present, and if study aims, methods, and reported findings were consistent with a qualitative case study approach. Findings were grouped into five themes outlining key methodological issues: case study methodology or method, case of something particular and case selection, contextually bound case study, researcher and case interactions and triangulation, and study design inconsistent with methodology reported. Improved reporting of case studies by qualitative researchers will advance the methodology for the benefit of researchers and practitioners.
Valuing Non-CO2 GHG Emission Changes in Benefit-Cost ...
The climate impacts of greenhouse gas (GHG) emissions impose social costs on society. To date, EPA has not had an approach to estimate the economic benefits of reducing emissions of non-CO2 GHGs (or the costs of increasing them) that is consistent with the methodology underlying the U.S. Government’s current estimates of the social cost of carbon (SCC). A recently published paper presents estimates of the social cost of methane that are consistent with the SCC estimates. The Agency is seeking review of the potential application of these new benefit estimates to benefit cost analysis in relation to current practice in this area. The goal of this project is to improve upon the current treatment of non-CO2 GHG emission impacts in benefit-cost analysis.
Tomblin Murphy, Gail; Birch, Stephen; MacKenzie, Adrian; Rigby, Janet
2016-12-12
As part of efforts to inform the development of a global human resources for health (HRH) strategy, a comprehensive methodology for estimating HRH supply and requirements was described in a companion paper. The purpose of this paper is to demonstrate the application of that methodology, using data publicly available online, to simulate the supply of and requirements for midwives, nurses, and physicians in the 32 high-income member countries of the Organisation for Economic Co-operation and Development (OECD) up to 2030. A model combining a stock-and-flow approach to simulate the future supply of each profession in each country-adjusted according to levels of HRH participation and activity-and a needs-based approach to simulate future HRH requirements was used. Most of the data to populate the model were obtained from the OECD's online indicator database. Other data were obtained from targeted internet searches and documents gathered as part of the companion paper. Relevant recent measures for each model parameter were found for at least one of the included countries. In total, 35% of the desired current data elements were found; assumed values were used for the other current data elements. Multiple scenarios were used to demonstrate the sensitivity of the simulations to different assumed future values of model parameters. Depending on the assumed future values of each model parameter, the simulated HRH gaps across the included countries could range from shortfalls of 74 000 midwives, 3.2 million nurses, and 1.2 million physicians to surpluses of 67 000 midwives, 2.9 million nurses, and 1.0 million physicians by 2030. Despite important gaps in the data publicly available online and the short time available to implement it, this paper demonstrates the basic feasibility of a more comprehensive, population needs-based approach to estimating HRH supply and requirements than most of those currently being used. HRH planners in individual countries, working with their respective stakeholder groups, would have more direct access to data on the relevant planning parameters and would thus be in an even better position to implement such an approach.
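A toy stock-and-flow projection for a single profession in a single country, in the spirit of the supply side of the model described above; every number is an illustrative assumption rather than OECD data.

```python
# Minimal stock-and-flow sketch for one profession in one country.
# All numbers are illustrative assumptions, not OECD data.
stock = 300_000          # currently licensed nurses
graduates_per_year = 9_000
exit_rate = 0.03         # retirement, emigration, attrition
participation = 0.85     # share of licensees actually practising
activity = 0.95          # average full-time equivalent per practising nurse

requirement_2030 = 310_000   # needs-based requirement (assumed)

for year in range(2016, 2031):
    stock = stock * (1 - exit_rate) + graduates_per_year

effective_supply = stock * participation * activity
print(f"Projected FTE supply in 2030: {effective_supply:,.0f}")
print(f"Gap vs needs-based requirement: {effective_supply - requirement_2030:,.0f}")
```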
Jordan, Jens; Nilsson, Peter M; Kotsis, Vasilios; Olsen, Michael H; Grassi, Guido; Yumuk, Volkan; Hauner, Hans; Zahorska-Markiewicz, Barbara; Toplak, Hermann; Engeli, Stefan; Finer, Nick
2015-03-01
Current cardiovascular risk scores do not include obesity or fat distribution as independent factors, and may underestimate risk in obese individuals. Assessment of early vascular ageing (EVA) biomarkers including arterial stiffness, central blood pressure, carotid intima-media thickness and flow-mediated vasodilation may help to refine risk assessment in obese individuals in whom traditional cardiovascular risk scores and factors suggest no need for specific medical attention. A number of issues need to be addressed before this approach is ready for translation into routine clinical practice. Methodologies for measurements of vascular markers need to be further standardized and less operator-dependent. The utility of these nontraditional risk factors will also need to be proven in sufficiently large and properly designed interventional studies. Indeed, published studies on vascular markers in obesity and weight loss vary in quality and study design, are sometimes conducted in small populations, use a variety of differing methodologies and study differing vascular beds. Finally, current vascular measurements are still crude and may not be sufficient to cover the different aspects of EVA in obesity.
Collins, A L; Pulley, S; Foster, I D L; Gellis, A; Porto, P; Horowitz, A J
2017-06-01
The growing awareness of the environmental significance of fine-grained sediment fluxes through catchment systems continues to underscore the need for reliable information on the principal sources of this material. Source estimates are difficult to obtain using traditional monitoring techniques, but sediment source fingerprinting or tracing procedures have emerged as a potentially valuable alternative. Despite the rapidly increasing numbers of studies reporting the use of sediment source fingerprinting, several key challenges and uncertainties continue to hamper consensus among the international scientific community on key components of the existing methodological procedures. Accordingly, this contribution reviews and presents recent developments for several key aspects of fingerprinting, namely: sediment source classification, catchment source and target sediment sampling, tracer selection, grain size issues, tracer conservatism, source apportionment modelling, and assessment of source predictions using artificial mixtures. Finally, a decision-tree representing the current state of knowledge is presented, to guide end-users in applying the fingerprinting approach. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
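A minimal sketch of the source apportionment (un-mixing) step mentioned above: solve for non-negative source proportions that both reproduce the tracer signature of the target sediment and sum to one. The tracer values are invented, and real applications typically add measurement uncertainty and Monte Carlo sampling.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical mean tracer concentrations (rows = tracers, columns = sources).
sources = np.array([
    [12.0, 30.0, 22.0],   # tracer 1 (e.g. a fallout radionuclide)
    [5.0, 9.0, 14.0],     # tracer 2 (e.g. total P)
    [40.0, 55.0, 48.0],   # tracer 3 (e.g. magnetic susceptibility)
])
target = np.array([24.0, 10.0, 50.0])   # tracer signature of the downstream sediment sample

# Augment with a heavily weighted sum-to-one row, then solve non-negatively.
w = 100.0
A = np.vstack([sources, w * np.ones((1, sources.shape[1]))])
b = np.append(target, w)
proportions, _ = nnls(A, b)
print("estimated source proportions:", np.round(proportions, 2))
```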
O'Connor, Claire M.; Clemson, Lindy; da Silva, Thaís Bento Lima; Piguet, Olivier; Hodges, John R.; Mioshi, Eneida
2013-01-01
FTD is a unique condition that manifests with a range of behavioural symptoms, marked dysfunction in activities of daily living (ADL), and levels of carer burden higher than those reported by carers in other dementias. No efficacious pharmacological interventions to treat FTD currently exist, and research on pharmacological symptom management is variable. The few studies on non-pharmacological interventions in FTD focus on either the carer or the patients' symptoms, and lack methodological rigour. This paper reviews and discusses current studies utilising non-pharmacological approaches, exposing the clear need for more rigorous methodologies to be applied in this field. Finally, the FTD-specific Tailored Activities Program (TAP) is presented, building on a successful randomised controlled trial that reduced behaviours of concern in dementia by implementing participation in tailored activities. Crucially, this protocol has scope to target both the person with FTD and their carer. This paper highlights that studies in this area would help to elucidate the potential for using activities to reduce characteristic behaviours in FTD, improving quality of life and the caregiving experience in FTD. PMID:29213832
Lee, Ping-Tzu; Dakin, Emily; McLure, Merinda
2016-05-01
Equine-assisted psychotherapy (EAP) is an innovative emerging approach to mental health treatment. This narrative synthesis explores the current state of knowledge and areas for future research in EAP. Specifically reviewed are qualitative and quantitative empirical studies, including both articles published in peer-reviewed journals and research presented in theses and dissertations. We selected 24 studies for final inclusion in this study, dating between 2005 and 2013, and including the first EAP empirical research completed in 2005. Four of these studies are peer-reviewed journal articles, while 20 are master's theses or doctoral dissertations. The reviewed qualitative research provides initial evidence for the value of EAP for enhancing adolescents' communication and relationship skills. The reviewed experimental and quasi-experimental research provides initial evidence for the value of EAP for enhancing children's and adolescents' emotional, social and behavioural functioning. Yet, conclusions about the effectiveness of EAP must still be considered preliminary due to various methodological limitations in the reviewed research. The narrative review describes these methodological limitations and concludes with recommendations for future research. © 2015 John Wiley & Sons Ltd.
A Roadmap for Using Agile Development in a Traditional Environment
NASA Technical Reports Server (NTRS)
Streiffert, Barbara; Starbird, Thomas; Grenander, Sven
2006-01-01
One of the newer classes of software engineering techniques is called 'Agile Development'. In Agile Development software engineers take small implementation steps and, in some cases, they program in pairs. In addition, they develop automatic tests prior to implementing their small functional piece. Agile Development focuses on rapid turnaround, incremental planning, customer involvement and continuous integration. Agile Development is not the traditional waterfall method or even a rapid prototyping method (although this methodology is closer to Agile Development). At the Jet Propulsion Laboratory (JPL) a few groups have begun Agile Development software implementations. The difficulty with this approach becomes apparent when Agile Development is used in an organization that has specific criteria and requirements handed down for how software development is to be performed. The work at the JPL is performed for the National Aeronautics and Space Agency (NASA). Both organizations have specific requirements, rules and processes for developing software. This paper will discuss some of the initial uses of the Agile Development methodology, the spread of this method and the current status of the successful incorporation into the current JPL development policies and processes.
On the validity of language: speaking, knowing and understanding in medical geography.
Scarpaci, J L
1993-09-01
This essay examines methodological problems concerning the conceptualization and operationalization of phenomena central to medical geography. Its main argument is that qualitative research can be strengthened if the differences between instrumental and apparent validity are better understood than the current research in medical geography suggests. Its premise is that our definitions of key terms and concepts must be reinforced throughout the design of research if our knowledge and understanding are to be enhanced. In doing so, the paper aims to move the methodological debate beyond the simple dichotomies of quantitative vs qualitative approaches and logical positivism vs phenomenology. Instead, the argument is couched in a postmodernist hermeneutic sense which questions the validity of one discourse of investigation over another. The paper begins by discussing methods used in conceptualizing and operationalizing variables in quantitative and qualitative research design. Examples derive from concepts central to a geography of health-care behavior and well-being. The latter half of the essay shows the uses and misuses of validity studies in selected health services research and the current debate on national health insurance.
Electrochemical degradation and mineralization of glyphosate herbicide.
Tran, Nam; Drogui, Patrick; Doan, Tuan Linh; Le, Thanh Son; Nguyen, Hoai Chau
2017-12-01
The presence of herbicides is a concern for both human and ecological health. Glyphosate is occasionally detected as a water contaminant in agricultural areas where the herbicide is used extensively. The removal of glyphosate from synthetic solution using an advanced oxidation process is a possible approach for the remediation of contaminated waters. The ability of electrochemical oxidation to degrade and mineralize glyphosate herbicide was investigated using a Ti/PbO2 anode. The current intensity, treatment time, initial concentration and pH of the solution are the parameters that influence the degradation efficiency. An experimental design methodology was applied to determine the optimal condition (in terms of cost/effectiveness) based on response surface methodology. The glyphosate concentration (C0 = 16.9 mg L-1) decreased to 0.6 mg L-1 under the optimal conditions (current intensity of 4.77 A and treatment time of 173 min). The removal efficiencies of glyphosate and total organic carbon were 95 ± 16% and 90.31%, respectively. This work demonstrates that electrochemical oxidation is a promising process for the degradation and mineralization of glyphosate.
2012-01-01
This paper utilizes a statistical approach, the response surface optimization methodology, to determine the optimum conditions for Acid Black 172 dye removal efficiency from aqueous solution by electrocoagulation. The experimental parameters investigated were initial pH: 4-10; initial dye concentration: 0-600 mg/L; applied current: 0.5-3.5 A; and reaction time: 3-15 min. These parameters were varied at five levels according to the central composite design to evaluate their effects on decolorization through analysis of variance. The high R2 value of 94.48% shows a strong correlation between the experimental and predicted values and indicates that the second-order regression model is acceptable for Acid Black 172 dye removal efficiency. It was also found that some interaction and square terms influenced the electrocoagulation performance as well as the selected parameters. An optimum dye removal efficiency of 90.4% was observed experimentally at an initial pH of 7, initial dye concentration of 300 mg/L, applied current of 2 A and reaction time of 9.16 min, which is close to the model-predicted result (90%). PMID:23369574
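A minimal sketch of how a second-order response surface of the kind fitted in central composite designs can be estimated by least squares; the coded factor levels and responses below are invented, not the study's data, and only two of the four factors are shown.

```python
import numpy as np

# Hypothetical runs: coded levels of pH (x1) and applied current (x2) with observed removal (%).
x1 = np.array([-1, -1, 1, 1, 0, 0, 0, -1.414, 1.414, 0, 0])
x2 = np.array([-1, 1, -1, 1, 0, 0, 0, 0, 0, -1.414, 1.414])
y = np.array([70, 78, 75, 88, 90, 91, 89, 65, 80, 72, 84])

# Second-order model: y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ beta
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print("coefficients:", np.round(beta, 2), " R^2 =", round(r2, 3))
```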
Differentiating between descriptive and interpretive phenomenological research approaches.
Matua, Gerald Amandu; Van Der Wal, Dirk Mostert
2015-07-01
To provide insight into how descriptive and interpretive phenomenological research approaches can guide nurse researchers during the generation and application of knowledge. Phenomenology is a discipline that investigates people's experiences to reveal what lies 'hidden' in them. It has become a major philosophy and research method in the humanities, human sciences and arts. Phenomenology has transitioned from descriptive phenomenology, which emphasises the 'pure' description of people's experiences, to the 'interpretation' of such experiences, as in hermeneutic phenomenology. However, nurse researchers are still challenged by the epistemological and methodological tenets of these two methods. The data came from relevant online databases and research books. A review of selected peer-reviewed research and discussion papers published between January 1990 and December 2013 was conducted using CINAHL, Science Direct, PubMed and Google Scholar databases. In addition, selected textbooks that addressed phenomenology as a philosophy and as a research methodology were used. Evidence from the literature indicates that most studies following the 'descriptive approach' to research are used to illuminate poorly understood aspects of experiences. In contrast, the 'interpretive/hermeneutic approach' is used to examine contextual features of an experience in relation to other influences such as culture, gender, employment or wellbeing of people or groups experiencing the phenomenon. This allows investigators to arrive at a deeper understanding of the experience, so that caregivers can derive requisite knowledge needed to address such clients' needs. Novice nurse researchers should endeavour to understand phenomenology both as a philosophy and research method. This is vitally important because in-depth understanding of phenomenology ensures that the most appropriate method is chosen to implement a study and to generate knowledge for nursing practice. This paper adds to the current debate on why it is important for nurse researchers to clearly understand phenomenology as a philosophy and research method before embarking on a study. The paper guides novice researchers on key methodological decisions they need to make when using descriptive or interpretive phenomenological research approaches.
A methodological systematic review of what's wrong with meta-ethnography reporting.
France, Emma F; Ring, Nicola; Thomas, Rebecca; Noyes, Jane; Maxwell, Margaret; Jepson, Ruth
2014-11-19
Syntheses of qualitative studies can inform health policy, services and our understanding of patient experience. Meta-ethnography is a systematic seven-phase interpretive qualitative synthesis approach well-suited to producing new theories and conceptual models. However, there are concerns about the quality of meta-ethnography reporting, particularly the analysis and synthesis processes. Our aim was to investigate the application and reporting of methods in recent meta-ethnography journal papers, focusing on the analysis and synthesis process and output. Methodological systematic review of health-related meta-ethnography journal papers published from 2012-2013. We searched six electronic databases, Google Scholar and Zetoc for papers using key terms including 'meta-ethnography.' Two authors independently screened papers by title and abstract with 100% agreement. We identified 32 relevant papers. Three authors independently extracted data and all authors analysed the application and reporting of methods using content analysis. Meta-ethnography was applied in diverse ways, sometimes inappropriately. In 13% of papers the approach did not suit the research aim. In 66% of papers reviewers did not follow the principles of meta-ethnography. The analytical and synthesis processes were poorly reported overall. In only 31% of papers reviewers clearly described how they analysed conceptual data from primary studies (phase 5, 'translation' of studies) and in only one paper (3%) reviewers explicitly described how they conducted the analytic synthesis process (phase 6). In 38% of papers we could not ascertain if reviewers had achieved any new interpretation of primary studies. In over 30% of papers seminal methodological texts which could have informed methods were not cited. We believe this is the first in-depth methodological systematic review of meta-ethnography conduct and reporting. Meta-ethnography is an evolving approach. Current reporting of methods, analysis and synthesis lacks clarity and comprehensiveness. This is a major barrier to use of meta-ethnography findings that could contribute significantly to the evidence base because it makes judging their rigour and credibility difficult. To realise the high potential value of meta-ethnography for enhancing health care and understanding patient experience requires reporting that clearly conveys the methodology, analysis and findings. Tailored meta-ethnography reporting guidelines, developed through expert consensus, could improve reporting.
Sanchon-Lopez, Beatriz; Everett, Jeremy R
2016-09-02
A new, simple-to-implement and quantitative approach to assessing the confidence in NMR-based identification of known metabolites is introduced. The approach is based on a topological analysis of metabolite identification information available from NMR spectroscopy studies and is a development of the metabolite identification carbon efficiency (MICE) method. New topological metabolite identification indices are introduced, analyzed, and proposed for general use, including topological metabolite identification carbon efficiency (tMICE). Because known metabolite identification is one of the key bottlenecks in either NMR-spectroscopy- or mass spectrometry-based metabonomics/metabolomics studies, and given the fact that there is no current consensus on how to assess metabolite identification confidence, it is hoped that these new approaches and the topological indices will find utility.
An operational approach to high resolution agro-ecological zoning in West-Africa.
Le Page, Y; Vasconcelos, Maria; Palminha, A; Melo, I Q; Pereira, J M C
2017-01-01
The objective of this work is to develop a simple methodology for high resolution crop suitability analysis under current and future climate that is easily applicable and useful in Least Developed Countries. The approach addresses both regional planning in the context of climate change projections and pre-emptive short-term rural extension interventions based on same-year agricultural season forecasts, while being implemented with off-the-shelf resources. The developed tools are applied operationally in a case study developed in three regions of Guinea-Bissau, and the obtained results, as well as the advantages and limitations of the methods applied, are discussed. In this paper we show how a simple approach can easily generate information on climate vulnerability and how it can be used operationally in rural extension services.
ERIC Educational Resources Information Center
Vázquez-Alonso, Ángel; Manassero-Mas, María-Antonia; García-Carmona, Antonio; Montesano de Talavera, Marisa
2016-01-01
This study applies a new quantitative methodological approach to diagnose epistemology conceptions in a large sample. The analyses use seven multiple-rating items on the epistemology of science drawn from the item pool Views on Science-Technology-Society (VOSTS). The bases of the new methodological diagnostic approach are the empirical…
AERIS: An Integrated Domain Information System for Aerospace Science and Technology
ERIC Educational Resources Information Center
Hatua, Sudip Ranjan; Madalli, Devika P.
2011-01-01
Purpose: The purpose of this paper is to discuss the methodology in building an integrated domain information system with illustrations that provide proof of concept. Design/methodology/approach: The present work studies the usual search engine approach to information and its pitfalls. A methodology was adopted for construction of a domain-based…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sudhir, Dass; Bandyopadhyay, M., E-mail: mainak@ter-india.org; Chakraborty, A.
2014-01-15
The impedance matching circuit placed between the radio frequency (RF) generator and the plasma load determines the RF power transfer from the generator to the load. The impedance of the plasma load depends on the plasma parameters through the skin depth and the plasma conductivity or resistivity. Therefore, for long pulse operation of inductively coupled plasmas, particularly at high power (∼100 kW or more) where the plasma load condition may vary for different reasons (e.g., pressure, power, and thermal effects), online tuning of the impedance matching circuit through feedback is necessary. In fusion grade ion source operation, such an online feedback methodology is not present; instead, offline remote tuning by adjusting the matching circuit capacitors and the driving frequency of the RF generator between ion source operation pulses is envisaged. The present model is an approach to a remote impedance tuning methodology for long pulse operation, and a corresponding online impedance matching algorithm based on RF coil antenna current measurement or coil antenna calorimetric measurement may be useful in this regard.
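The abstract does not specify the matching network or the tuning algorithm, so the following is only a generic sketch of the underlying matching calculation: sizing a low-pass L-network (series inductor, shunt capacitor) to match a mostly resistive plasma load to a 50 Ω generator at a given drive frequency. The component values and operating point are hypothetical.

```python
# Generic L-network matching sketch (not the authors' algorithm): match a
# mostly resistive plasma load to a 50-ohm RF generator with a series inductor
# and a shunt capacitor, given the drive frequency.
import math

def l_match(r_source: float, r_load: float, freq_hz: float):
    """Return (series_L_henry, shunt_C_farad) for r_load < r_source."""
    if not r_load < r_source:
        raise ValueError("this low-pass L-network form assumes r_load < r_source")
    w = 2 * math.pi * freq_hz
    q = math.sqrt(r_source / r_load - 1.0)   # loaded Q fixed by the resistance ratio
    x_series = q * r_load                    # series reactance next to the load
    x_shunt = r_source / q                   # shunt reactance on the generator side
    return x_series / w, 1.0 / (w * x_shunt)

# Hypothetical operating point: 1 MHz drive, 50-ohm generator, 5-ohm plasma load.
L, C = l_match(50.0, 5.0, 1.0e6)
print(f"series L = {L * 1e6:.2f} uH, shunt C = {C * 1e9:.2f} nF")
```

If the plasma load drifts during a long pulse (through pressure, power or thermal changes), the same calculation re-run with the updated load resistance gives the new capacitor and frequency targets, which is the kind of remote re-tuning the abstract envisages between pulses.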
Multi-Step Usage of in Vivo Models During Rational Drug Design and Discovery
Williams, Charles H.; Hong, Charles C.
2011-01-01
In this article we propose a systematic development method for rational drug design while reviewing current industry paradigms and emerging techniques and technologies in the field. Although the process of drug development today has been accelerated by the emergence of computational methodologies, it remains a herculean challenge requiring exorbitant resources and often fails to yield clinically viable results. The current paradigm of target-based drug design is often misguided and tends to yield compounds that have poor absorption, distribution, metabolism, excretion, and toxicology (ADMET) properties. Therefore, an in vivo, organism-based approach allowing for a multidisciplinary inquiry into potent and selective molecules is an excellent place to begin rational drug design. We review how organisms like the zebrafish and Caenorhabditis elegans can not only be starting points, but can be used at various steps of the drug development process, from target identification to pre-clinical trial models. This systems-biology-based approach, paired with the power of computational biology, genetics and developmental biology, provides a methodological framework to avoid the pitfalls of traditional target-based drug design. PMID:21731440
NASA Astrophysics Data System (ADS)
Cheyney, S.; Fishwick, S.; Hill, I. A.; Linford, N. T.
2015-08-01
Despite the development of advanced processing and interpretation tools for magnetic data sets in the mineral and hydrocarbon industries, these methods have not achieved similar levels of adoption for archaeological or very near surface surveys. Using a synthetic data set we demonstrate that certain methodologies and assumptions used to successfully invert more regional-scale data can lead to large discrepancies between the true and recovered depths when applied to archaeological-type anomalies. We propose variations to the current approach, analysing the choice of depth-weighting function, mesh design and parameter constraints, to develop an appropriate technique for the 3-D inversion of archaeological-scale data sets. The results show a successful recovery of a synthetic scenario, as well as a case study of a Romano-Celtic temple in the UK. For the case study, the final susceptibility model is compared with two coincident ground penetrating radar surveys, showing a high correlation with the comparative depth slices. The new approach takes interpretation of archaeological data sets beyond a simple 2-D visual interpretation based on pattern recognition.
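For readers unfamiliar with the depth-weighting function mentioned above, 3-D susceptibility inversions commonly use a weight of the form w(z) = (z + z0)^(-beta/2) to counteract the rapid decay of the magnetic kernel with depth. The sketch below evaluates such weights on a shallow mesh; the beta and z0 values are illustrative defaults, not the archaeological-scale choices analysed in the paper.

```python
# Sketch of the depth-weighting function commonly used in 3-D magnetic
# inversion, w(z) = (z + z0) ** (-beta / 2). The beta/z0 values suited to
# archaeological-scale anomalies, as discussed in the paper, are not given here.
import numpy as np

def depth_weights(cell_depths_m: np.ndarray, beta: float = 3.0, z0_m: float = 0.1) -> np.ndarray:
    """Weights that stop the inversion from concentrating all recovered
    susceptibility in the shallowest cells."""
    return (cell_depths_m + z0_m) ** (-beta / 2.0)

# Hypothetical near-surface mesh: cell centres from 0.1 m to 2 m depth.
depths = np.linspace(0.1, 2.0, 5)
print(np.round(depth_weights(depths), 3))
```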
NASA Astrophysics Data System (ADS)
Kozlovská, Mária; Čabala, Jozef; Struková, Zuzana
2014-11-01
Information technology is becoming a strong tool in different industries, including construction. The recent trend in building design is leading to the creation of the most comprehensive possible virtual building model (Building Information Model) so that all problems relating to the project can be solved as early as the design phase. Building information modelling is a new way of approaching the design of building project documentation. Currently, the building site layout, as a part of the building design documents, has very little support in the BIM environment. Recent research on designing construction process conditions has centred on improving general planning practice and on new approaches to construction site layout planning. The state of the art in designing construction process conditions indicates an unexplored problem: connecting a knowledge system with the construction site facilities (CSF) layout through interactive modelling. The goal of the paper is to present a methodology for building a 3D construction site facility allocation model (3D CSF-IAM) based on principles of parametric and interactive modelling.
Shet, Vinayaka B; Palan, Anusha M; Rao, Shama U; Varun, C; Aishwarya, Uday; Raja, Selvaraj; Goveas, Louella Concepta; Vaman Rao, C; Ujwal, P
2018-02-01
In the current investigation, statistical approaches were adopted to hydrolyse non-edible seed cake (NESC) of Pongamia and optimize the hydrolysis process by response surface methodology (RSM). Through the RSM approach, the optimized conditions were found to be 1.17%v/v of HCl concentration at 54.12 min for hydrolysis. Under optimized conditions, the release of reducing sugars was found to be 53.03 g/L. The RSM data were used to train the artificial neural network (ANN) and the predictive ability of both models was compared by calculating various statistical parameters. A three-layered ANN model consisting of 2:12:1 topology was developed; the response of the ANN model indicates that it is precise when compared with the RSM model. The fit of the models was expressed with the regression coefficient R 2 , which was found to be 0.975 and 0.888, respectively, for the ANN and RSM models. This further demonstrated that the performance of ANN was better than that of RSM.
Mediators and moderators in early intervention research.
Breitborde, Nicholas J K; Srihari, Vinod H; Pollard, Jessica M; Addington, Donald N; Woods, Scott W
2010-05-01
The goal of this paper is to provide clarification with regard to the nature of mediator and moderator variables and the statistical methods used to test for the existence of these variables. Particular attention will be devoted to discussing the ways in which the identification of mediator and moderator variables may help to advance the field of early intervention in psychiatry. We completed a literature review of the methodological strategies used to test for mediator and moderator variables. Although several tests for mediator variables are currently available, recent evaluations suggest that tests which directly evaluate the indirect effect are superior. With regard to moderator variables, two approaches ('pick-a-point' and regions of significance) are available, and we provide guidelines with regard to how researchers can determine which approach may be most appropriate to use for their specific study. Finally, we discuss how to evaluate the clinical importance of mediator and moderator relationships as well as the methodology to calculate statistical power for tests of mediation and moderation. Further exploration of mediator and moderator variables may provide valuable information with regard to interventions provided early in the course of a psychiatric illness.
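One widely used way to "directly evaluate the indirect effect," as recommended above, is to bootstrap the product of the a path (predictor to mediator) and the b path (mediator to outcome, controlling for the predictor). The sketch below does this on synthetic data; the variable names, effect sizes and sample size are invented for illustration.

```python
# Sketch of a bootstrap test of the indirect (mediated) effect a*b on
# synthetic data; purely illustrative, not drawn from the paper.
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = rng.normal(size=n)                       # predictor (e.g., early intervention)
m = 0.5 * x + rng.normal(size=n)             # mediator
y = 0.4 * m + 0.2 * x + rng.normal(size=n)   # outcome

def indirect_effect(x, m, y):
    # a path: regress mediator on predictor; b path: outcome on mediator given predictor.
    a = np.linalg.lstsq(np.column_stack([np.ones_like(x), x]), m, rcond=None)[0][1]
    b = np.linalg.lstsq(np.column_stack([np.ones_like(x), x, m]), y, rcond=None)[0][2]
    return a * b

boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    boot.append(indirect_effect(x[idx], m[idx], y[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {indirect_effect(x, m, y):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```

A confidence interval that excludes zero is taken as evidence of mediation; the bootstrap avoids the normality assumption that weakens older product-of-coefficients tests.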
NASA Astrophysics Data System (ADS)
Hodijah, A.; Sundari, S.; Nugraha, A. C.
2018-05-01
As a local government agency that performs public services, the General Government Office already utilizes the Reporting Information System of Local Government Implementation (E-LPPD). However, E-LPPD has upgrade limitations: its integration processes cannot accommodate the General Government Office's needs for achieving Good Government Governance (GGG), while successful e-government implementation ultimately requires good governance practices. Currently, citizens demand public services of the standard provided by the private sector, which calls for service innovation that utilizes the legacy system in a service-based e-government implementation. Service Oriented Architecture (SOA) is used to redefine business processes as a set of IT-enabled services, and Enterprise Architecture from The Open Group Architecture Framework (TOGAF) provides a comprehensive approach to redefining business processes as service innovation towards GGG. This paper takes as a case study the Performance Evaluation of Local Government Implementation (EKPPD) system at the General Government Office. The results show that TOGAF can guide the development of integrated business processes for the EKPPD system that fit good governance practices to attain GGG, with the SOA methodology as the technical approach.
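As a rough illustration of the SOA step, wrapping a legacy evaluation behind a stable service interface might look like the sketch below. The class, field names and scoring logic are entirely hypothetical; the paper does not describe the EKPPD data model or its evaluation rules.

```python
# Hypothetical sketch of exposing a legacy EKPPD-style evaluation as a service
# facade; names, fields and the aggregation rule are invented for illustration.
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class PerformanceReport:
    agency: str
    indicators: Dict[str, float] = field(default_factory=dict)  # indicator -> achieved ratio (0..1)

class EkppdEvaluationService:
    """Service facade hiding the legacy report format behind a stable interface."""
    def evaluate(self, report: PerformanceReport) -> float:
        # Toy aggregation: mean of indicator ratios as the performance score.
        values = list(report.indicators.values())
        return sum(values) / len(values) if values else 0.0

service = EkppdEvaluationService()
report = PerformanceReport("General Government Office",
                           {"service_delivery": 0.8, "transparency": 0.7})
print(service.evaluate(report))
```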
Aguilar-Arredondo, Andrea; Arias, Clorinda; Zepeda, Angélica
2015-01-01
Hippocampal neurogenesis occurs in the adult brain in various species, including humans. A compelling question that arose when neurogenesis was accepted to occur in the adult dentate gyrus (DG) is whether new neurons become functionally relevant over time, which is key for interpreting their potential contributions to synaptic circuitry. The functional state of adult-born neurons has been evaluated using various methodological approaches, which have, in turn, yielded seemingly conflicting results regarding the timing of maturation and functional integration. Here, we review the contributions of different methodological approaches to addressing the maturation process of adult-born neurons and their functional state, discussing the contributions and limitations of each method. We aim to provide a framework for interpreting results based on the approaches currently used in neuroscience for evaluating functional integration. As shown by the experimental evidence, adult-born neurons are prone to respond from early stages, even when they are not yet fully integrated into circuits. The ongoing integration process for the newborn neurons is characterised by different features. However, they may contribute differently to the network depending on their maturation stage. When combined, the strategies used to date convey a comprehensive view of the functional development of newly born neurons while providing a framework for approaching the critical time at which new neurons become functionally integrated and influence brain function.
Rosedale, Mary; Malaspina, Dolores; Malamud, Daniel; Strauss, Shiela M; Horne, Jaclyn D; Abouzied, Salman; Cruciani, Ricardo A; Knotkova, Helena
2012-01-01
This article reports and discusses how quantitative (physiological and behavioral) and qualitative methods are being combined in an open-label pilot feasibility study. The study evaluates safety, tolerability, and acceptability of a protocol to treat depression in HIV-infected individuals, using a 2-week block of transcranial direct current stimulation (tDCS) over the dorsolateral prefrontal cortex. Major depressive disorder (MDD) is the second most prevalent psychiatric disorder after substance abuse among HIV-positive adults, and novel antidepressant treatments are needed for this vulnerable population. The authors describe the challenges and contributions derived from different research perspectives and methodological approaches and provide a philosophical framework for combining quantitative and qualitative measurements for a fuller examination of the disorder. Four methodological points are presented: (1) the value of combining quantitative and qualitative approaches; (2) the need for context-specific measures when studying patients with medical and psychiatric comorbidities; (3) the importance of research designs that integrate physiological, behavioral, and qualitative approaches when evaluating novel treatments; and (4) the need to explore the relationships between biomarkers, clinical symptom assessments, patient self-evaluations, and patient experiences when developing new, patient-centered protocols. The authors conclude that the complexity of studying novel treatments in complex and new patient populations requires complex research designs to capture the richness of data that inform translational research.
Influence of Environmental Factors on Vibrio spp. in Coastal Ecosystems.
Johnson, Crystal N
2015-06-01
Various studies have examined the relationships between vibrios and the environmental conditions surrounding them. However, very few reviews have compiled these studies into cohesive points. This may be because these studies examine different environmental parameters, use different sampling, detection, and enumeration methodologies, and occur in diverse geographic locations. The current article is one approach to compiling these studies into a cohesive work that assesses the influence of environmental determinants on the abundance of vibrios in coastal ecosystems.
2004-03-01
When applying experience to new situations, the process is very similar. Faced with a new situation, a human generally looks for ways in which ... find the best course of action, the human would compare current goals to those faced in previous experiences and choose the path that ...