Sample records for theoretical methodological proposal

  1. An Information Theoretic Investigation Of Complex Adaptive Supply Networks With Organizational Topologies

    DTIC Science & Technology

    2016-12-22

    assumptions of behavior. This research proposes an information theoretic methodology to discover such complex network structures and dynamics while overcoming ... the difficulties historically associated with their study. Indeed, this was the first application of an information theoretic methodology as a tool ...

  2. Virtual-pulse time integral methodology: A new explicit approach for computational dynamics - Theoretical developments for general nonlinear structural dynamics

    NASA Technical Reports Server (NTRS)

    Chen, Xiaoqin; Tamma, Kumar K.; Sha, Desong

    1993-01-01

    The present paper describes a new explicit virtual-pulse time integral methodology for nonlinear structural dynamics problems. The purpose of the paper is to provide the theoretical basis of the methodology and to demonstrate the applicability of the proposed formulations to nonlinear dynamic structures. Unlike existing numerical methods such as direct time integration or mode superposition techniques, the proposed methodology offers new perspectives on method development and possesses several unique and attractive computational characteristics. The methodology is tested and compared with the implicit Newmark method (trapezoidal rule) on nonlinear softening and hardening spring dynamic models. The numerical results indicate that the proposed explicit virtual-pulse time integral methodology is an excellent alternative for solving general nonlinear dynamic problems.
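
    The virtual-pulse formulation itself is not given in this record. As a point of reference for the benchmark described above, the sketch below implements only the implicit Newmark (average acceleration/trapezoidal) baseline on a hardening or softening spring; the spring law, parameter values, and step size are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def newmark_nonlinear_spring(m=1.0, k=1.0, k3=0.5, u0=1.0, v0=0.0,
                             dt=0.01, t_end=20.0, beta=0.25, gamma=0.5,
                             tol=1e-10, max_iter=20):
    """Implicit Newmark (average acceleration) for m*u'' + k*u + k3*u**3 = 0.

    k3 > 0 gives a hardening spring, k3 < 0 a softening spring.
    All parameter values here are illustrative, not from the paper.
    """
    f = lambda u: k * u + k3 * u**3          # nonlinear restoring force
    df = lambda u: k + 3.0 * k3 * u**2       # tangent stiffness

    n_steps = int(round(t_end / dt))
    t = np.linspace(0.0, n_steps * dt, n_steps + 1)
    u = np.empty(n_steps + 1); v = np.empty(n_steps + 1); a = np.empty(n_steps + 1)
    u[0], v[0] = u0, v0
    a[0] = -f(u0) / m                        # initial acceleration from the equation of motion

    for n in range(n_steps):
        u_new = u[n]                         # Newton iteration starts from the last converged state
        for _ in range(max_iter):
            a_new = (u_new - u[n] - dt * v[n]
                     - dt**2 * (0.5 - beta) * a[n]) / (beta * dt**2)
            residual = m * a_new + f(u_new)
            if abs(residual) < tol:
                break
            tangent = m / (beta * dt**2) + df(u_new)
            u_new -= residual / tangent
        a_new = (u_new - u[n] - dt * v[n]
                 - dt**2 * (0.5 - beta) * a[n]) / (beta * dt**2)
        u[n + 1] = u_new
        v[n + 1] = v[n] + dt * ((1.0 - gamma) * a[n] + gamma * a_new)
        a[n + 1] = a_new
    return t, u, v

if __name__ == "__main__":
    t, u, _ = newmark_nonlinear_spring()
    print("displacement at t_end:", u[-1])
```

    An explicit scheme such as the virtual-pulse method would replace the Newton loop with a direct update; the comparison reported in the abstract is against exactly this kind of implicit baseline.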

  3. A Methodology for Instructional Design in Mathematics--With the Generic and Epistemic Student at the Centre

    ERIC Educational Resources Information Center

    Strømskag, Heidi

    2017-01-01

    This theoretical paper presents a methodology for instructional design in mathematics. It is a theoretical analysis of a proposed model for instructional design, where tasks are embedded in situations that preserve meaning with respect to particular pieces of mathematical knowledge. The model is applicable when there is an intention of teaching…

  4. History and theoretical-methodological fundaments of Community Psychology in Ceará.

    PubMed

    Barros, João Paulo Pereira; Ximenes, Verônica Morais

    2016-01-01

    In this article we discuss the historical and theoretical-methodological aspects of the Community Psychology that has been developed in the state of Ceará, in northeastern Brazil, based on the praxis initiated by Professor Cezar Wagner de Lima Góis and further developed by the Community Psychology Nucleus (NUCOM) at the Federal University of Ceará. Important aspects of the beginning of this Community Psychology are presented, highlighting its academic and social perspectives. NUCOM is a space for the development of teaching, research, and outreach activities, which allows the systematization and deepening of this proposal for a different Community Psychology. Community Psychology is constituted by five theoretical-methodological marks: Popular Education, Biodance, Carl Rogers' Humanistic Approach, Cultural-Historical Psychology, and Liberation Psychology. Finally, the article describes the methods comprising this proposal for working in communities, which are sustained by pillars such as participation and problematizing dialogue.

  5. The Construction and Analysis of a Science Story: A Proposed Methodology

    ERIC Educational Resources Information Center

    Klassen, Stephen

    2009-01-01

    Science educators are beginning to establish a theoretical and methodological foundation for constructing and using stories in science teaching. At the same time, it is not clear to what degree science stories that have recently been written adhere to the guidelines that are being proposed. The author has written a story about Louis Slotin, which…

  6. Some New Theoretical Issues in Systems Thinking Relevant for Modelling Corporate Learning

    ERIC Educational Resources Information Center

    Minati, Gianfranco

    2007-01-01

    Purpose: The purpose of this paper is to describe fundamental concepts and theoretical challenges with regard to systems, and to build on these in proposing new theoretical frameworks relevant to learning, for example in so-called learning organizations. Design/methodology/approach: The paper focuses on some crucial fundamental aspects introduced…

  7. Qualitative Assessment of Inquiry-Based Teaching Methods

    ERIC Educational Resources Information Center

    Briggs, Michael; Long, George; Owens, Katrina

    2011-01-01

    A new approach to teaching method assessment using student focused qualitative studies and the theoretical framework of mental models is proposed. The methodology is considered specifically for the advantages it offers when applied to the assessment of inquiry-based teaching methods. The theoretical foundation of mental models is discussed, and…

  8. Postscript: Split Spatial Attention? The Data Remain Difficult to Interpret

    ERIC Educational Resources Information Center

    Jans, Bert; Peters, Judith C.; De Weerd, Peter

    2010-01-01

    A growing number of studies claim that spatial attention can be split "on demand" into several, segregated foci of enhanced processing. Intrigued by the theoretical ramifications of this proposal, we analyzed 19 relevant sets of experiments using four methodological criteria. We typically found several methodological limitations in each study that…

  9. Modeling Single-Event Transient Propagation in a SiGe BiCMOS Direct-Conversion Receiver

    NASA Astrophysics Data System (ADS)

    Ildefonso, Adrian; Song, Ickhyun; Tzintzarov, George N.; Fleetwood, Zachary E.; Lourenco, Nelson E.; Wachter, Mason T.; Cressler, John D.

    2017-08-01

    The propagation of single-event transient (SET) signals in a silicon-germanium direct-conversion receiver carrying modulated data is explored. A theoretical analysis of transient propagation, verified by simulation, is presented. A new methodology to characterize and quantify the impact of SETs in communication systems carrying modulated data is proposed. The proposed methodology uses a pulsed radiation source to induce distortions in the signal constellation. The error vector magnitude due to SETs can then be calculated to quantify errors. Two different modulation schemes were simulated: QPSK and 16-QAM. The distortions in the constellation diagram agree with the presented circuit theory. Furthermore, the proposed methodology was applied to evaluate the improvements in the SET response due to a known radiation-hardening-by-design (RHBD) technique, where the common-base device of the low-noise amplifier was operated in inverse mode. The proposed methodology can be a valid technique to determine the most sensitive parts of a system carrying modulated data.
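
    The error vector magnitude (EVM) figure the abstract relies on is a standard quantity. The sketch below shows how an SET-induced distortion of a QPSK constellation could be scored that way; the transient model (a complex offset decaying over a few symbols) is a placeholder assumption, not the circuit-level SET model used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ideal QPSK constellation points (unit average power).
constellation = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

n_symbols = 2000
symbols = rng.choice(constellation, size=n_symbols)

# Placeholder SET model: a decaying complex disturbance hitting a short run of
# symbols (stands in for the pulsed-radiation-induced transient).
received = symbols.copy()
hit = 1000
decay = np.exp(-np.arange(30) / 8.0)
received[hit:hit + 30] += 0.6 * decay * np.exp(1j * 0.7)

# EVM: RMS error vector normalised by RMS reference power, in percent.
error = received - symbols
evm_rms = np.sqrt(np.mean(np.abs(error) ** 2) / np.mean(np.abs(symbols) ** 2))
print(f"EVM = {100 * evm_rms:.2f} %")
```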

  10. Representations of everyday life: a proposal for capturing social values from the Marxist perspective of knowledge production.

    PubMed

    Soares, Cássia Baldini; Santos, Vilmar Ezequiel Dos; Campos, Célia Maria Sivalli; Lachtim, Sheila Aparecida Ferreira; Campos, Fernanda Cristina

    2011-12-01

    From the Marxist perspective of the construction of knowledge, we propose a theoretical and methodological framework for understanding social values by capturing everyday representations. We assume that scientific research brings together epistemological, theoretical and methodological dimensions that, consistently with one another, propose a set of operating procedures and techniques for capturing and analyzing the reality under study in order to expose the investigated object. The study of values reveals how essential they are to the formation of judgments and choices: some values reflect the dominant ideology, spanning all social classes, while others reflect class interests; the latter are not universal, being formed in relationships and social activities. Based on the Marxist theory of consciousness, representations are discursive formulations of everyday life (opinions or convictions) issued by subjects about their reality, and they constitute a coherent way of understanding and exposing social values: focus groups are shown to be suitable for grasping opinions, while interviews show potential for exposing convictions.

  11. Teaching Probability for Conceptual Change (La Ensenanza de la Probabilidad por Cambio Conceptual).

    ERIC Educational Resources Information Center

    Castro, Cesar Saenz

    1998-01-01

    Presents a theoretical proposal of a methodology for the teaching of probability theory. Discusses the importance of the epistemological approach of Lakatos and the perspective of the conceptual change. Discusses research using a proposed didactic method with Spanish high school students (N=6). Concludes that significant differences on all…

  12. Modelling periodic structure formation on 100Cr6 steel after irradiation with femtosecond-pulsed laser beams

    NASA Astrophysics Data System (ADS)

    Tsibidis, George D.; Mimidis, Alexandros; Skoulas, Evangelos; Kirner, Sabrina V.; Krüger, Jörg; Bonse, Jörn; Stratakis, Emmanuel

    2018-01-01

    We investigate the periodic structure formation upon intense femtosecond pulsed irradiation of chrome steel (100Cr6) for linearly polarised laser beams. The underlying physical mechanism of the laser-induced periodic structures is explored, their spatial frequency is calculated and theoretical results are compared with experimental observations. The proposed theoretical model comprises estimations of electron excitation, heat transfer, relaxation processes, and hydrodynamics-related mass transport. Simulations describe the sequential formation of sub-wavelength ripples and supra-wavelength grooves. In addition, the influence of the laser wavelength on the periodicity of the structures is discussed. The proposed theoretical investigation offers a systematic methodology towards laser processing of steel surfaces with important applications.

  13. Travel into a fairy land: a critique of modern qualitative and mixed methods psychologies.

    PubMed

    Toomela, Aaro

    2011-03-01

    In this article, modern qualitative and mixed methods approaches are criticized from the standpoint of structural-systemic epistemology. It is suggested that modern qualitative methodologies suffer from several fallacies: some are grounded on an inherently contradictory epistemology; others ask scientific questions only after the methods have been chosen; conduct studies inductively, so that not only answers but even questions are supposed to be discovered; do not create artificial situations and constraints on study situations; are adevelopmental by nature; study not external things and phenomena but symbols and representations, so that the object of study often turns out to be the researcher rather than the researched; rely on ambiguous data interpretation methods based to a large degree on feelings and opinions; aim to understand the unique, which is theoretically impossible; or have theoretical problems with sampling. Any one of these fallacies would be sufficient to exclude any possibility of achieving a structural-systemic understanding of the studied things and phenomena. It also turns out that modern qualitative methodologies share several fallacies with the quantitative methodology. Therefore, mixed methods approaches are not able to overcome the fundamental difficulties that characterize the component methods taken separately. It is proposed that the structural-systemic methodology that dominated psychological thought in pre-WWII continental Europe is philosophically and theoretically better grounded than the other methodologies that can be distinguished in psychology today. Future psychology should be based on structural-systemic methodology.

  14. Discourse Markers in Chinese Conversational Narrative

    ERIC Educational Resources Information Center

    Xiao, Yang

    2010-01-01

    This study examines the indexicality of discourse markers (DMs) in Chinese conversational narrative. Drawing upon theoretical and methodological principles related to narrative dimensions (Ochs & Capps, 2001), narrative desires (Ochs, 1997, 2004), and narrative positioning (Bamberg, 1997), this work proposes an integrated analytical framework for…

  15. A temperature compensation methodology for piezoelectric based sensor devices

    NASA Astrophysics Data System (ADS)

    Wang, Dong F.; Lou, Xueqiao; Bao, Aijian; Yang, Xu; Zhao, Ji

    2017-08-01

    A temperature compensation methodology combining a negative temperature coefficient thermistor with the temperature characteristics of a piezoelectric material is proposed to improve the measurement accuracy of piezoelectric-sensing-based devices. The piezoelectric disk is characterized using a disk-shaped structure and is also used to verify the effectiveness of the proposed compensation method. The measured output voltage shows a nearly linear relationship with respect to the applied pressure when the proposed temperature compensation method is introduced, over a temperature range of 25-65 °C. As a result, the maximum measurement accuracy is observed to improve by 40%, and the higher the temperature, the more effective the method. The effective temperature range of the proposed method is theoretically analyzed by introducing the constant coefficient of the thermistor (B), the resistance at the initial temperature (R0), and the paralleled resistance (Rx). The proposed methodology not only eliminates the influence of the piezoelectric material's temperature-dependent characteristics on sensing accuracy but also decreases the power consumption of piezoelectric-sensing-based devices through the simplified sensing structure.
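
    The constants B, R0, and Rx named at the end of the abstract suggest the usual NTC thermistor law. The sketch below shows that law and the paralleled network the abstract points to; the component values and temperature range are illustrative assumptions only, not the paper's design.

```python
import numpy as np

def ntc_resistance(T_celsius, R0=10e3, B=3950.0, T0_celsius=25.0):
    """Standard NTC thermistor law R(T) = R0 * exp(B * (1/T - 1/T0)), T in kelvin."""
    T = T_celsius + 273.15
    T0 = T0_celsius + 273.15
    return R0 * np.exp(B * (1.0 / T - 1.0 / T0))

def compensated_resistance(T_celsius, Rx=10e3, **kwargs):
    """Thermistor paralleled with a fixed resistor Rx, as hinted at in the abstract."""
    Rt = ntc_resistance(T_celsius, **kwargs)
    return Rx * Rt / (Rx + Rt)

# Sweep the temperature range quoted in the abstract (25-65 degrees C).
for T in np.linspace(25.0, 65.0, 9):
    print(f"{T:5.1f} C  R_NTC = {ntc_resistance(T)/1e3:7.2f} kOhm  "
          f"R_parallel = {compensated_resistance(T)/1e3:6.2f} kOhm")
```

    The paralleled resistor limits how steeply the combined resistance falls with temperature, which is the flattening effect a compensation network of this kind exploits.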

  16. International Students Decision-Making Process

    ERIC Educational Resources Information Center

    Cubillo, Jose Maria; Sanchez, Joaquin; Cervino, Julio

    2006-01-01

    Purpose--The purpose of this paper is to propose a theoretical model that integrates the different groups of factors which influence the decision-making process of international students, analysing different dimensions of this process and explaining those factors which determine students' choice. Design/methodology/approach--A hypothetical model…

  17. Following Watery Relations in Early Childhood Pedagogies

    ERIC Educational Resources Information Center

    Pacini-Ketchabaw, Veronica; Clark, Vanessa

    2016-01-01

    Working methodologically and theoretically with the hydro-logics of bodies of water, this article addresses the limitations of humanistic perspectives on water play in early childhood classrooms, and proposes pedagogies of watery relations. The article traces the fluid, murky, surging, creative, unpredictable specificities of bodies of water that…

  18. Electropyroelectric technique: A methodology free of fitting procedures for thermal effusivity determination in liquids.

    PubMed

    Ivanov, R; Marin, E; Villa, J; Gonzalez, E; Rodríguez, C I; Olvera, J E

    2015-06-01

    This paper describes an alternative methodology to determine the thermal effusivity of a liquid sample using the recently proposed electropyroelectric technique, without fitting the experimental data to a theoretical model and without having to know the pyroelectric-sensor-related parameters, as in most previously reported approaches. The method is not absolute, because a reference liquid with known thermal properties is needed. Experiments have been performed that demonstrate the high reliability and accuracy of the method, with measurement uncertainties smaller than 3%.

  19. Evaluation and Communication: Using a Communication Audit to Evaluate Organizational Communication

    ERIC Educational Resources Information Center

    Hogard, Elaine; Ellis, Roger

    2006-01-01

    This article identifies a surprising dearth of studies that explicitly link communication and evaluation at substantive, theoretical, and methodological levels. A three-fold typology of evaluation studies referring to communication is proposed and examples given. The importance of organizational communication in program delivery is stressed and…

  20. Administrative Leadership as Projection, Social Control, and Action.

    ERIC Educational Resources Information Center

    Reed, Donald B.

    Over the past 50 years, theoretical and methodological problems have plagued the study of leadership. This paper, proposing an alternative theory, argues that leadership has three fundamental components: projection and social control, which are linked by action. Projection is the visualization of a project to be completed. Educational…

  1. Composite Indices of Development and Poverty: An Application to MDGs

    ERIC Educational Resources Information Center

    De Muro, Pasquale; Mazziotta, Matteo; Pareto, Adriano

    2011-01-01

    The measurement of development or poverty as multidimensional phenomena is very difficult because there are several theoretical, methodological and empirical problems involved. The literature of composite indicators offers a wide variety of aggregation methods, all with their pros and cons. In this paper, we propose a new, alternative composite…

  2. The Development of a Checklist to Enhance Methodological Quality in Intervention Programs.

    PubMed

    Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana; Sánchez-Martín, Milagrosa

    2016-01-01

    The methodological quality of primary studies is an important issue when performing meta-analyses or systematic reviews. Nevertheless, there are no clear criteria for how methodological quality should be analyzed. Controversies emerge when considering the various theoretical and empirical definitions, especially in relation to three interrelated problems: the lack of representativeness, utility, and feasibility. In this article, we (a) systematize and summarize the available literature about methodological quality in primary studies; (b) propose a specific, parsimonious, 12-item checklist to empirically define the methodological quality of primary studies based on a content validity study; and (c) present an inter-coder reliability study for the resulting 12 items. This paper provides a precise and rigorous description of the development of this checklist, highlighting the clearly specified criteria for the inclusion of items and a substantial inter-coder agreement on the different items. Rather than simply proposing another checklist, however, it then argues that the list constitutes an assessment tool with respect to the representativeness, utility, and feasibility of the most frequent methodological quality items in the literature, one that provides practitioners and researchers with clear criteria for choosing items that may be adequate to their needs. We propose individual methodological features as indicators of quality, arguing that these need to be taken into account when designing, implementing, or evaluating an intervention program. This enhances the methodological quality of intervention programs and fosters cumulative knowledge based on meta-analyses of these interventions. Future development of the checklist is discussed.
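
    The inter-coder reliability study mentioned in (c) is typically summarized with an agreement statistic such as Cohen's kappa. The snippet below is a minimal, generic example of scoring item-level agreement between two coders; the ratings are hypothetical, not the authors' data or analysis script.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical ratings of the 12 checklist items by two coders (1 = present, 0 = absent).
coder_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1, 0, 1]
coder_b = [1, 1, 0, 1, 1, 1, 1, 0, 1, 0, 0, 1]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa across the 12 items: {kappa:.2f}")
```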

  3. The Development of a Checklist to Enhance Methodological Quality in Intervention Programs

    PubMed Central

    Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana; Sánchez-Martín, Milagrosa

    2016-01-01

    The methodological quality of primary studies is an important issue when performing meta-analyses or systematic reviews. Nevertheless, there are no clear criteria for how methodological quality should be analyzed. Controversies emerge when considering the various theoretical and empirical definitions, especially in relation to three interrelated problems: the lack of representativeness, utility, and feasibility. In this article, we (a) systematize and summarize the available literature about methodological quality in primary studies; (b) propose a specific, parsimonious, 12-item checklist to empirically define the methodological quality of primary studies based on a content validity study; and (c) present an inter-coder reliability study for the resulting 12 items. This paper provides a precise and rigorous description of the development of this checklist, highlighting the clearly specified criteria for the inclusion of items and a substantial inter-coder agreement on the different items. Rather than simply proposing another checklist, however, it then argues that the list constitutes an assessment tool with respect to the representativeness, utility, and feasibility of the most frequent methodological quality items in the literature, one that provides practitioners and researchers with clear criteria for choosing items that may be adequate to their needs. We propose individual methodological features as indicators of quality, arguing that these need to be taken into account when designing, implementing, or evaluating an intervention program. This enhances the methodological quality of intervention programs and fosters cumulative knowledge based on meta-analyses of these interventions. Future development of the checklist is discussed. PMID:27917143

  4. A Novel Clustering Methodology Based on Modularity Optimisation for Detecting Authorship Affinities in Shakespearean Era Plays

    PubMed Central

    Craig, Hugh; Berretta, Regina; Moscato, Pablo

    2016-01-01

    In this study we propose a novel, unsupervised clustering methodology for analyzing large datasets. This new, efficient methodology converts the general clustering problem into a community detection problem on a graph by using the Jensen-Shannon distance, a dissimilarity measure originating in Information Theory. Moreover, we use graph theoretic concepts for the generation and analysis of proximity graphs. Our methodology is based on a newly proposed memetic algorithm (iMA-Net) for discovering clusters of data elements by maximizing the modularity function in proximity graphs of literary works. To test the effectiveness of this general methodology, we apply it to a text corpus dataset, which contains frequencies of approximately 55,114 unique words across all 168 plays written in the Shakespearean era (16th and 17th centuries), to analyze and detect clusters of similar plays. Experimental results and comparison with state-of-the-art clustering methods demonstrate the remarkable performance of our new method for identifying high quality clusters which reflect the commonalities in the literary style of the plays. PMID:27571416
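
    The iMA-Net memetic algorithm itself is not described in this record. The sketch below uses off-the-shelf stand-ins (SciPy's Jensen-Shannon distance and NetworkX's greedy modularity optimizer) to illustrate the overall pipeline of word-frequency vectors, proximity graph, and modularity-based communities; the synthetic corpus, edge threshold, and weighting are assumptions.

```python
import numpy as np
import networkx as nx
from scipy.spatial.distance import jensenshannon
from networkx.algorithms.community import greedy_modularity_communities

rng = np.random.default_rng(42)

# Hypothetical corpus: 12 "plays", each a count vector over 500 word types,
# drawn from three underlying stylistic profiles.
profiles = rng.dirichlet(np.ones(500) * 0.2, size=3)
plays = {f"play_{i:02d}": rng.multinomial(5000, profiles[i % 3]) for i in range(12)}

# Proximity graph: connect plays whose Jensen-Shannon distance is below a threshold.
G = nx.Graph()
G.add_nodes_from(plays)
names = list(plays)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        p = plays[a] / plays[a].sum()
        q = plays[b] / plays[b].sum()
        d = jensenshannon(p, q)            # JS distance (square root of JS divergence)
        if d < 0.5:                        # illustrative threshold
            G.add_edge(a, b, weight=1.0 - d)

# Stand-in for iMA-Net: greedy modularity maximisation on the proximity graph.
communities = greedy_modularity_communities(G, weight="weight")
for k, community in enumerate(communities):
    print(f"cluster {k}: {sorted(community)}")
```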

  5. Adaptive optimization as a design and management methodology for coal-mining enterprise in uncertain and volatile market environment - the conceptual framework

    NASA Astrophysics Data System (ADS)

    Mikhalchenko, V. V.; Rubanik, Yu T.

    2016-10-01

    The work is devoted to the problem of cost-effective adaptation of coal mines to the volatile and uncertain market conditions. Conceptually it can be achieved through alignment of the dynamic characteristics of the coal mining system and power spectrum of market demand for coal product. In practical terms, this ensures the viability and competitiveness of coal mines. Transformation of dynamic characteristics is to be done by changing the structure of production system as well as corporate, logistics and management processes. The proposed methods and algorithms of control are aimed at the development of the theoretical foundations of adaptive optimization as basic methodology for coal mine enterprise management in conditions of high variability and uncertainty of economic and natural environment. Implementation of the proposed methodology requires a revision of the basic principles of open coal mining enterprises design.

  6. Ensemble modeling of stochastic unsteady open-channel flow in terms of its time-space evolutionary probability distribution - Part 1: theoretical development

    NASA Astrophysics Data System (ADS)

    Dib, Alain; Kavvas, M. Levent

    2018-03-01

    The Saint-Venant equations are commonly used as the governing equations to solve for modeling the spatially varied unsteady flow in open channels. The presence of uncertainties in the channel or flow parameters renders these equations stochastic, thus requiring their solution in a stochastic framework in order to quantify the ensemble behavior and the variability of the process. While the Monte Carlo approach can be used for such a solution, its computational expense and its large number of simulations act to its disadvantage. This study proposes, explains, and derives a new methodology for solving the stochastic Saint-Venant equations in only one shot, without the need for a large number of simulations. The proposed methodology is derived by developing the nonlocal Lagrangian-Eulerian Fokker-Planck equation of the characteristic form of the stochastic Saint-Venant equations for an open-channel flow process, with an uncertain roughness coefficient. A numerical method for its solution is subsequently devised. The application and validation of this methodology are provided in a companion paper, in which the statistical results computed by the proposed methodology are compared against the results obtained by the Monte Carlo approach.
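
    For readers unfamiliar with the governing equations referred to above, a standard textbook statement of the one-dimensional Saint-Venant system is reproduced below, with Manning's n playing the role of the uncertain roughness coefficient; this is background notation, not an excerpt from the paper.

```latex
\begin{aligned}
&\frac{\partial A}{\partial t} + \frac{\partial Q}{\partial x} = 0
  && \text{(continuity)}\\[4pt]
&\frac{\partial Q}{\partial t}
  + \frac{\partial}{\partial x}\!\left(\frac{Q^{2}}{A}\right)
  + gA\,\frac{\partial h}{\partial x}
  = gA\left(S_{0} - S_{f}\right)
  && \text{(momentum)}\\[4pt]
&S_{f} = \frac{n^{2}\,Q\,|Q|}{A^{2}\,R^{4/3}}
  && \text{(Manning friction slope, uncertain } n\text{)}
\end{aligned}
```

    Here A is the wetted cross-sectional area, Q the discharge, h the flow depth, S0 the bed slope, and R the hydraulic radius; treating n as a random field is what renders the system stochastic.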

  7. Reflection-induced linear polarization rotation and phase modulation between orthogonal waves for refractive index variation measurement.

    PubMed

    Twu, Ruey-Ching; Wang, Jhao-Sheng

    2016-04-01

    An optical phase interrogation is proposed to study reflection-induced linear polarization rotation in a common-path homodyne interferometer. This optical methodology can also be applied to the measurement of the refractive index variation of a liquid solution. The performance of the refractive index sensing structure is discussed theoretically, and the experimental results demonstrate the very good capability of the proposed schemes. Compared with a conventional common-path heterodyne interferometer, the proposed homodyne interferometer with only a single channel reduces the number of optical elements required.

  8. A Framework for Evaluating and Enhancing Alignment in Self-Regulated Learning Research

    ERIC Educational Resources Information Center

    Dent, Amy L.; Hoyle, Rick H.

    2015-01-01

    We discuss the articles of this special issue with reference to an important yet previously only implicit dimension of study quality: alignment across the theoretical and methodological decisions that collectively define an approach to self-regulated learning. Integrating and extending work by leaders in the field, we propose a framework for…

  9. Information Resources Usage in Project Management Digital Learning System

    ERIC Educational Resources Information Center

    Davidovitch, Nitza; Belichenko, Margarita; Kravchenko, Yurii

    2017-01-01

    The article combines a theoretical approach to structuring knowledge that is based on the integrated use of fuzzy semantic network theory predicates, Boolean functions, and the theory of complexity of network structures with some practical aspects to be considered in distance learning at the university. The paper proposes a methodological approach that…

  10. Quality Concerns in Technical Education in India: A Quantifiable Quality Enabled Model

    ERIC Educational Resources Information Center

    Gambhir, Victor; Wadhwa, N. C.; Grover, Sandeep

    2016-01-01

    Purpose: The paper aims to discuss current Technical Education scenarios in India. It proposes modelling the factors affecting quality in a technical institute and then applying a suitable technique for assessment, comparison and ranking. Design/methodology/approach: The paper chose a graph-theoretic approach for quantification of quality-enabled…

  11. Assessment of Effectiveness of Use of Intellectual Potential of a University: A Methodological Approach

    ERIC Educational Resources Information Center

    Stukalova, Irina B.; Stukalova, Anastasia A.; Selyanskaya, Galina N.

    2016-01-01

    This article presents the results of theoretical analysis of existing approaches to the categories of the "intellectual capital" and "intellectual potential" of an organization. The authors identified the specific peculiarities of developing the intellectual potential of a university and propose their own view of its structure.…

  12. Transnational Corporations and Strategic Challenges: An Analysis of Knowledge Flows and Competitive Advantage

    ERIC Educational Resources Information Center

    de Pablos, Patricia Ordonez

    2006-01-01

    Purpose: The purpose of this paper is to analyse knowledge transfers in transnational corporations. Design/methodology/approach: The paper develops a conceptual framework for the analysis of knowledge flow transfers in transnationals. Based on this theoretical framework, the paper proposes research hypotheses and builds a causal model that links…

  13. A Didactic Proposal for EFL in a Public School in Cali

    ERIC Educational Resources Information Center

    Chaves, Orlando; Fernandez, Alejandro

    2016-01-01

    This article reports an action-research project aimed at designing, applying, and assessing a didactic sequence for teaching English as a foreign language in the first grade of a public school in Cali. The article comprises the context, reasons that justified the research, theoretical support, methodology, and results, analyzed through descriptive…

  14. From Instructional Leadership to Leadership Capabilities: Empirical Findings and Methodological Challenges

    ERIC Educational Resources Information Center

    Robinson, Viviane M. J.

    2010-01-01

    While there is considerable evidence about the impact of instructional leadership on student outcomes, there is far less known about the leadership capabilities that are required to confidently engage in the practices involved. This article uses the limited available evidence, combined with relevant theoretical analyses, to propose a tentative…

  15. 75 FR 55619 - Self-Regulatory Organizations; The Options Clearing Corporation; Notice of Filing of Proposed...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-13

    ... Numerical Simulations Risk Management Methodology September 7, 2010. Pursuant to Section 19(b)(1) of the ... for incorporation in the System for Theoretical Analysis and Numerical Simulations ("STANS") risk ... ETFs in the STANS margin calculation process. When OCC began including common stock and ETFs in...

  16. A Social Realist Perspective on Student Learning in Higher Education: The Morphogenesis of Agency

    ERIC Educational Resources Information Center

    Case, Jennifer M.

    2015-01-01

    Contemporary critiques of student learning research call for new theoretical and methodological approaches. This article proposes a social realist approach to this research, using the morphogenetic theory of sociologist Margaret Archer. The applicability of this approach is demonstrated by reference to an empirical study of engineering students at…

  17. Rethinking Teachers' Goal Orientations: Conceptual and Methodological Enhancements

    ERIC Educational Resources Information Center

    Nitsche, Sebastian; Dickhauser, Oliver; Fasching, Michaela S.; Dresel, Markus

    2011-01-01

    The article provides a theoretical extension of the goal orientation approach for teaching by proposing three different competence facets of learning goals and four types of addressees for performance approach and avoidance goals. On the basis of responses from 495 teacher trainees and 224 in-service teachers, the development and validation of an…

  18. Research Vitality as Sustained Excellence: What Keeps the Plates Spinning?

    ERIC Educational Resources Information Center

    Gilstrap, J. Bruce; Harvey, Jaron; Novicevic, Milorad M.; Buckley, M. Ronald

    2011-01-01

    Purpose: Research vitality addresses the perseverance that faculty members in the organization sciences experience in maintaining their research quantity and quality over an extended period of time. The purpose of this paper is to offer a theoretical model of research vitality. Design/methodology/approach: The authors propose a model consisting of…

  19. Critical Studies on the Ideological Structure of Personality.

    ERIC Educational Resources Information Center

    Barratt, Barnaby B.; And Others

    This document contains four papers about the ideological structure of personality. A proposal for a theoretical and methodological reworking of the life-historical inquiry of personality psychology is presented along with a report of some preliminary studies that employ an intensive life-history approach to a distinct topic within the context of…

  20. Communicative Language Teaching: Unity within Diversity

    ERIC Educational Resources Information Center

    Hiep, Pham Hoa

    2007-01-01

    Recent articles in the "ELT Journal" offer interesting debates on CLT. On one side, Bax (2003) proposes that CLT should be abandoned since the methodology fails to take into account the context of language teaching. On the other side, Liao (2004) suggests that CLT is best. However, within the broad theoretical position on which CLT is…

  1. [Discovery-based teaching and learning strategies in health: problematization and problem-based learning].

    PubMed

    Cyrino, Eliana Goldfarb; Toralles-Pereira, Maria Lúcia

    2004-01-01

    Considering the changes in teaching in the health field and the demand for new ways of dealing with knowledge in higher learning, the article discusses two innovative methodological approaches: problem-based learning (PBL) and problematization. Describing the two methods' theoretical roots, the article attempts to identify their main foundations. As distinct proposals, both contribute to a review of the teaching and learning process: problematization, focused on knowledge construction in the context of the formation of a critical awareness; PBL, focused on cognitive aspects in the construction of concepts and appropriation of basic mechanisms in science. Both problematization and PBL lead to breaks with the traditional way of teaching and learning, stimulating participatory management by actors in the experience and reorganization of the relationship between theory and practice. The critique of each proposal's possibilities and limits using the analysis of their theoretical and methodological foundations leads us to conclude that pedagogical experiences based on PBL and/or problematization can represent an innovative trend in the context of health education, fostering breaks and more sweeping changes.

  2. Enterprise resource planning (ERP) implementation using the value engineering methodology and Six Sigma tools

    NASA Astrophysics Data System (ADS)

    Leu, Jun-Der; Lee, Larry Jung-Hsing

    2017-09-01

    Enterprise resource planning (ERP) is a software solution that integrates the operational processes of the business functions of an enterprise. However, implementing ERP systems is a complex process. In addition to the technical issues, companies must address problems associated with business process re-engineering, time and budget control, and organisational change. Numerous industrial studies have shown that the failure rate of ERP implementation is high, even for well-designed systems. Thus, ERP projects typically require a clear methodology to support the project execution and effectiveness. In this study, we propose a theoretical model for ERP implementation. The value engineering (VE) method forms the basis of the proposed framework, which integrates Six Sigma tools. The proposed framework encompasses five phases: knowledge generation, analysis, creation, development and execution. In the VE method, potential ERP problems related to software, hardware, consultation and organisation are analysed in a group-decision manner and in relation to value, and Six Sigma tools are applied to avoid any project defects. We validate the feasibility of the proposed model by applying it to an international manufacturing enterprise in Taiwan. The results show improvements in customer response time and operational efficiency in terms of work-in-process and turnover of materials. Based on the evidence from the case study, the theoretical framework is discussed together with the study's limitations and suggestions for future research.

  3. Whole-Volume Clustering of Time Series Data from Zebrafish Brain Calcium Images via Mixture Modeling.

    PubMed

    Nguyen, Hien D; Ullmann, Jeremy F P; McLachlan, Geoffrey J; Voleti, Venkatakaushik; Li, Wenze; Hillman, Elizabeth M C; Reutens, David C; Janke, Andrew L

    2018-02-01

    Calcium is a ubiquitous messenger in neural signaling events. An increasing number of techniques are enabling visualization of neurological activity in animal models via luminescent proteins that bind to calcium ions. These techniques generate large volumes of spatially correlated time series. A model-based functional data analysis methodology via Gaussian mixtures is proposed for the clustering of data from such visualizations. The methodology is theoretically justified, and a computationally efficient approach to estimation is suggested. An example analysis of a zebrafish imaging experiment is presented.
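
    A minimal illustration of the model-based clustering idea, using scikit-learn's Gaussian mixture on simulated calcium-like traces; the actual pipeline (functional representation, whole-volume registration, efficient estimation) is not reproduced here, and all signal parameters are assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)

# Simulate 300 voxel time series (200 frames) drawn from three response patterns.
t = np.linspace(0, 20, 200)
patterns = np.vstack([np.sin(0.5 * t),
                      np.sin(1.5 * t + 1.0),
                      np.exp(-((t - 10) ** 2) / 4)])
labels_true = rng.integers(0, 3, size=300)
traces = patterns[labels_true] + 0.3 * rng.standard_normal((300, t.size))

# Model-based clustering: fit a Gaussian mixture to the (here, raw) time series vectors.
gmm = GaussianMixture(n_components=3, covariance_type="diag", random_state=0)
labels_hat = gmm.fit_predict(traces)
print("cluster sizes:", np.bincount(labels_hat))
```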

  4. [Ethical considerations about research with women in situations of violence].

    PubMed

    Rafael, Ricardo de Mattos Russo; Soares de Moura, Anna Tereza Miranda

    2013-01-01

    This essay aims to reflect on the ethical and methodological principles involved in research with women in situations of violence. The text raises the discussion of applying the principles of beneficence and non-maleficence in research involving this issue, pointing to recommendations concerning privacy, autonomy and immediate contributions for volunteers. Then, taking the principles of justice and equity as a theoretical reference, the authors propose a debate on the methodological aspects involved in protecting respondents, with a view to improving the quality of the data obtained and the possible social contributions.

  5. Nucleon-nucleon interactions via Lattice QCD: Methodology. HAL QCD approach to extract hadronic interactions in lattice QCD

    NASA Astrophysics Data System (ADS)

    Aoki, Sinya

    2013-07-01

    We review the potential method in lattice QCD, which has recently been proposed to extract nucleon-nucleon interactions via numerical simulations. We focus on the methodology of this approach by emphasizing the strategy of the potential method, the theoretical foundation behind it, and special numerical techniques. We compare the potential method with the standard finite volume method in lattice QCD, in order to make the pros and cons of the approach clear. We also present several numerical results for nucleon-nucleon potentials.

  6. Critical dialogical approach: A methodological direction for occupation-based social transformative work.

    PubMed

    Farias, Lisette; Laliberte Rudman, Debbie; Pollard, Nick; Schiller, Sandra; Serrata Malfitano, Ana Paula; Thomas, Kerry; van Bruggen, Hanneke

    2018-05-03

    Calls for embracing the potential and responsibility of occupational therapy to address socio-political conditions that perpetuate occupational injustices have materialized in the literature. However, to reach beyond traditional frameworks informing practices, this social agenda requires the incorporation of diverse epistemological and methodological approaches to support action commensurate with social transformative goals. Our intent is to present a methodological approach that can help extend the ways of thinking or frameworks used in occupational therapy and science to support the ongoing development of practices with and for individuals and collectives affected by marginalizing conditions. We describe the epistemological and theoretical underpinnings of a methodological approach drawing on Freire's and Bakhtin's work. Integrating our shared experience taking part in an example study, we discuss the unique advantages of co-generating data using two methods aligned with this approach: dialogical interviews and critical reflexivity. Key considerations when employing this approach are presented, based on its proposed epistemological and theoretical stance and our shared experiences engaging in it. A critical dialogical approach offers one way forward in expanding occupational therapy and science scholarship by promoting collaborative knowledge generation and examination of taken-for-granted understandings that shape individuals' assumptions and actions.

  7. Language between Bodies: A Cognitive Approach to Understanding Linguistic Politeness in American Sign Language

    ERIC Educational Resources Information Center

    Roush, Daniel R.

    2011-01-01

    This article proposes an answer to the primary question of how the American Sign Language (ASL) community in the United States conceptualizes (im)politeness and its related notions. It begins with a review of evolving theoretical issues in research on (im)politeness and related methodological problems with studying (im)politeness in natural…

  8. The Missing Link: Deficits of Country-Level Studies. A Review of 22 Articles Explaining Life Satisfaction

    ERIC Educational Resources Information Center

    Nonnenmacher, Alexandra; Friedrichs, Jurgen

    2013-01-01

    To explain country differences in an analytical or structural dependent variable, the application of a macro-micro-model containing contextual hypotheses is necessary. Our methodological study examines whether empirical studies apply such a model. We propose that a theoretical base for country differences is well described in multilevel studies,…

  9. Evidence-Based Administration for Decision Making in the Framework of Knowledge Strategic Management

    ERIC Educational Resources Information Center

    Del Junco, Julio Garcia; Zaballa, Rafael De Reyna; de Perea, Juan Garcia Alvarez

    2010-01-01

    Purpose: This paper seeks to present a model based on evidence-based administration (EBA), which aims to facilitate the creation, transformation and diffusion of knowledge in learning organizations. Design/methodology/approach: A theoretical framework is proposed based on EBA and the case method. Accordingly, an empirical study was carried out in…

  10. A Novel Methodology for Charging Station Deployment

    NASA Astrophysics Data System (ADS)

    Sun, Zhonghao; Zhao, Yunwei; He, Yueying; Li, Mingzhe

    2018-02-01

    Lack of charging stations has been a main obstacle to the promotion of electric vehicles. This paper studies deploying charging stations in traffic networks considering grid constraints to balance the charging demand and grid stability. First, we propose a statistical model for charging demand. Then we combine the charging demand model with power grid constraints and give the formulation of the charging station deployment problem. Finally, we propose a theoretical solution for the problem by transforming it to a Markov Decision Process.
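
    The final MDP transformation mentioned above can be made concrete with a toy value-iteration example: the state is how many stations have been built, "build" trades marginal charging demand served against a grid-loading penalty, and "stop" ends deployment. All numbers and the tiny state space are illustrative assumptions; the paper's actual demand model and grid constraints are not reproduced.

```python
import numpy as np

# Toy deployment MDP: state s = number of stations already built (0..N).
N = 6
marginal_demand = np.array([10.0, 8.0, 6.0, 4.5, 3.0, 1.5])   # diminishing returns
grid_penalty = np.array([1.0, 1.5, 2.5, 3.5, 5.0, 7.0])       # rising grid stress
gamma = 0.95                                                   # discount factor

V = np.zeros(N + 1)          # state N has no "build" action left
for _ in range(200):         # value iteration until (effectively) converged
    V_new = V.copy()
    for s in range(N):
        build = marginal_demand[s] - grid_penalty[s] + gamma * V[s + 1]
        V_new[s] = max(0.0, build)   # "stop" yields 0 from here on
    if np.max(np.abs(V_new - V)) < 1e-9:
        V = V_new
        break
    V = V_new

policy = ["build" if marginal_demand[s] - grid_penalty[s] + gamma * V[s + 1] > 0 else "stop"
          for s in range(N)]
print("optimal number of stations:", policy.index("stop") if "stop" in policy else N)
```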

  11. Inter-provider comparison of patient-reported outcomes: developing an adjustment to account for differences in patient case mix.

    PubMed

    Nuttall, David; Parkin, David; Devlin, Nancy

    2015-01-01

    This paper describes the development of a methodology for the case-mix adjustment of patient-reported outcome measures (PROMs) data permitting the comparison of outcomes between providers on a like-for-like basis. Statistical models that take account of provider-specific effects form the basis of the proposed case-mix adjustment methodology. Indirect standardisation provides a transparent means of case mix adjusting the PROMs data, which are updated on a monthly basis. Recently published PROMs data for patients undergoing unilateral knee replacement are used to estimate empirical models and to demonstrate the application of the proposed case-mix adjustment methodology in practice. The results are illustrative and are used to highlight a number of theoretical and empirical issues that warrant further exploration. For example, because of differences between PROMs instruments, case-mix adjustment methodologies may require instrument-specific approaches. A number of key assumptions are made in estimating the empirical models, which could be open to challenge. The covariates of post-operative health status could be expanded, and alternative econometric methods could be employed. © 2013 Crown copyright.
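
    Indirect standardisation, the mechanism named above, can be sketched in a few lines: fit a case-mix model on all providers pooled, compute each provider's expected outcome from its own patient mix, and rescale observed outcomes by the observed-to-expected ratio. The variables and the simple linear model below are illustrative assumptions, not the published PROMs specification.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)

# Hypothetical patient-level PROMs data: post-operative score, age, baseline score, provider.
n = 3000
df = pd.DataFrame({
    "provider": rng.integers(0, 10, n),
    "age": rng.normal(68, 9, n),
    "pre_score": rng.normal(20, 5, n),
})
provider_effect = rng.normal(0, 1, 10)
df["post_score"] = (30 + 0.6 * df["pre_score"] - 0.1 * (df["age"] - 68)
                    + rng.normal(0, 4, n) + provider_effect[df["provider"]])

# Case-mix model fitted on the pooled data (ordinary least squares on age and baseline score).
X = np.column_stack([np.ones(n), df["age"], df["pre_score"]])
beta, *_ = np.linalg.lstsq(X, df["post_score"], rcond=None)
df["expected"] = X @ beta

# Indirect standardisation: observed / expected per provider, scaled to the national mean.
national_mean = df["post_score"].mean()
summary = df.groupby("provider").agg(observed=("post_score", "mean"),
                                     expected=("expected", "mean"))
summary["adjusted"] = national_mean * summary["observed"] / summary["expected"]
print(summary.round(2))
```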

  12. Multiqubit subradiant states in N -port waveguide devices: ɛ-and-μ-near-zero hubs and nonreciprocal circulators

    NASA Astrophysics Data System (ADS)

    Liberal, Iñigo; Engheta, Nader

    2018-02-01

    Quantum emitters interacting through a waveguide setup have been proposed as a promising platform for basic research on light-matter interactions and quantum information processing. We propose to augment waveguide setups with the use of multiport devices. Specifically, we demonstrate theoretically the possibility of exciting N -qubit subradiant, maximally entangled, states with the use of suitably designed N -port devices. Our general methodology is then applied based on two different devices: an epsilon-and-mu-near-zero waveguide hub and a nonreciprocal circulator. A sensitivity analysis is carried out to assess the robustness of the system against a number of nonidealities. These findings link and merge the designs of devices for quantum state engineering with classical communication network methodologies.

  13. Medicine and the humanities--theoretical and methodological issues.

    PubMed

    Puustinen, Raimo; Leiman, M; Viljanen, A M

    2003-12-01

    Engel's biopsychosocial model, Cassell's promotion of the concept "person" in medical thinking and Pellegrino's and Thomasma's philosophy of medicine are attempts to widen current biomedical theory of disease and to approach medicine as a form of human activity in pursuit of healing. To develop this approach further we would like to propose activity theory as a possible means for understanding the nature of medical practice. By "activity theory" we refer to developments which have evolved from Vygotsky's research on socially mediated mental functions and processes. Analysing medicine as activity enforces the joint consideration of target and subject: who is doing what to whom. This requires the use of historical, linguistic, anthropological, and semiotic tools. Therefore, if we analyse medicine as an activity, humanities are both theoretically and methodologically "inbound" (or internal) to the analysis itself. On the other hand, literature studies or anthropological writings provide material for analysing the various forms of medical practices.

  14. Using information Theory in Optimal Test Point Selection for Health Management in NASA's Exploration Vehicles

    NASA Technical Reports Server (NTRS)

    Mehr, Ali Farhang; Tumer, Irem

    2005-01-01

    In this paper, we will present a new methodology that measures the "worth" of deploying an additional testing instrument (sensor) in terms of the amount of information that can be retrieved from such a measurement. This quantity is obtained using a probabilistic model of RLVs that has been partially developed in the NASA Ames Research Center. A number of correlated attributes are identified and used to obtain the worth of deploying a sensor at a given test point from an information-theoretic viewpoint. Once the information-theoretic worth of sensors is formulated and incorporated into our general model for IHM performance, the problem can be formulated as a constrained optimization problem in which the reliability and operational safety of the system as a whole are considered. Although this research is conducted specifically for RLVs, the proposed methodology in its generic form can be easily extended to other domains of systems health monitoring.
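
    An information-theoretic "worth" of a candidate sensor can be made concrete as the mutual information between the vehicle health state and the sensor's reading under a joint probability model. The toy joint distribution below is a fabricated stand-in for the NASA Ames RLV model, which is not available in this record.

```python
import numpy as np

# Hypothetical joint distribution P(health_state, sensor_reading):
# rows = health states (nominal, degraded, failed), columns = discretised sensor readings.
joint = np.array([
    [0.30, 0.10, 0.02],
    [0.08, 0.20, 0.07],
    [0.01, 0.05, 0.17],
])
joint /= joint.sum()

p_state = joint.sum(axis=1, keepdims=True)
p_reading = joint.sum(axis=0, keepdims=True)

# Mutual information I(state; reading) in bits: the "worth" of deploying this sensor.
mask = joint > 0
mi = np.sum(joint[mask] * np.log2(joint[mask] / (p_state @ p_reading)[mask]))
print(f"I(state; reading) = {mi:.3f} bits")
```

    Ranking candidate test points by this quantity, subject to reliability and safety constraints, is one way the constrained optimization described in the abstract could be posed.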

  15. Trace Analysis and Spatial Reasoning: An Example of Intensive Cognitive Diagnosis and Its Implications for Testing. September 1987. Technical Report.

    ERIC Educational Resources Information Center

    Ohlsson, Stellan

    Recent theoretical developments in cognitive psychology imply both a need and a possibility for methodological development. In particular, the theory of problem solving proposed by Allen Newell and Herbert A. Simon (1972) provides the rationale for a new empirical method for the processing of think-aloud protocols--trace analysis. A detailed…

  16. Characterizing Teaching Effectiveness in the Joint Action Theory in Didactics: An Exploratory Study in Primary School

    ERIC Educational Resources Information Center

    Sensevy, Gérard

    2014-01-01

    This paper presents an exploratory study of two consecutive reading sessions conducted in primary school by two different teachers. Our purpose is twofold. From a theoretical viewpoint, we propose a tentative set of conditions of teaching effectiveness by relying on the Joint Action Theory in Didactics. From a methodological viewpoint, drawing on…

  17. The Economic Value of Breastfeeding (With Results from Research Conducted in Ghana and the Ivory Coast). Cornell International Nutrition Monograph Series Number 6.

    ERIC Educational Resources Information Center

    Greiner, Ted; And Others

    This monograph focuses attention on economic considerations related to infant feeding practices in developing countries. By enlarging on previous methodologies, this paper proposes to improve the accuracy of past estimates of the economic value of human milk, or more specifically, the practice of breastfeeding. The theoretical model employed…

  18. A Cladist is a systematist who seeks a natural classification: some comments on Quinn (2017).

    PubMed

    Williams, David M; Ebach, Malte C

    2018-01-01

    In response to Quinn (Biol Philos, 2017, 10.1007/s10539-017-9577-z), we identify cladistics as being about natural classifications and their discovery, and thereby propose adding an eighth cladistic definition to Quinn's list, namely the systematist who seeks to discover natural classifications, regardless of affiliation or theoretical or methodological justification.

  19. Closed-loop, pilot/vehicle analysis of the approach and landing task

    NASA Technical Reports Server (NTRS)

    Schmidt, D. K.; Anderson, M. R.

    1985-01-01

    Optimal-control-theoretic modeling and frequency-domain analysis is the methodology proposed to evaluate analytically the handling qualities of higher-order manually controlled dynamic systems. Fundamental to the methodology is evaluating the interplay between pilot workload and closed-loop pilot/vehicle performance and stability robustness. The model-based metric for pilot workload is the required pilot phase compensation. Pilot/vehicle performance and loop stability is then evaluated using frequency-domain techniques. When these techniques were applied to the flight-test data for thirty-two highly-augmented fighter configurations, strong correlation was obtained between the analytical and experimental results.

  20. A Framework for Evaluating and Enhancing Alignment in Self-Regulated Learning Research

    PubMed Central

    Dent, Amy L.; Hoyle, Rick H.

    2015-01-01

    We discuss the articles of this special issue with reference to an important yet previously only implicit dimension of study quality: alignment across the theoretical and methodological decisions that collectively define an approach to self-regulated learning. Integrating and extending work by leaders in the field, we propose a framework for evaluating alignment in the way self-regulated learning research is both conducted and reported. Within this framework, the special issue articles provide a springboard for discussing methodological promises and pitfalls of increasingly sophisticated research on the dynamic, contingent, and contextualized features of self-regulated learning. PMID:25825589

  1. Influence analysis for high-dimensional time series with an application to epileptic seizure onset zone detection

    PubMed Central

    Flamm, Christoph; Graef, Andreas; Pirker, Susanne; Baumgartner, Christoph; Deistler, Manfred

    2013-01-01

    Granger causality is a useful concept for studying causal relations in networks. However, numerical problems occur when applying the corresponding methodology to high-dimensional time series showing co-movement, e.g. EEG recordings or economic data. In order to deal with these shortcomings, we propose a novel method for the causal analysis of such multivariate time series based on Granger causality and factor models. We present the theoretical background, successfully assess our methodology with the help of simulated data and show a potential application in EEG analysis of epileptic seizures. PMID:23354014
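
    A minimal illustration of the building blocks named above: a pairwise Granger causality test on simulated series, with a principal component used as a simple stand-in for the factor model that tames dimensionality. The data, lag order, and the PCA substitution are assumptions; the paper's actual factor-model construction is not reproduced.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)

# Simulate a high-dimensional panel driven by one common AR(1) factor plus noise.
T, N = 500, 40
factor = np.zeros(T)
for t in range(1, T):
    factor[t] = 0.8 * factor[t - 1] + rng.standard_normal()
panel = np.outer(factor, rng.uniform(0.5, 1.5, N)) + rng.standard_normal((T, N))

# Stand-in for the factor step: summarise the panel by its first principal component.
common = PCA(n_components=1).fit_transform(panel).ravel()

# One extra channel that lags the common component, so it should be Granger-caused by it.
target = np.zeros(T)
for t in range(1, T):
    target[t] = 0.5 * common[t - 1] + rng.standard_normal()

# grangercausalitytests asks whether column 2 Granger-causes column 1.
data = np.column_stack([target, common])
results = grangercausalitytests(data, maxlag=2, verbose=False)
for lag, res in results.items():
    print(f"lag {lag}: F-test p-value = {res[0]['ssr_ftest'][1]:.4f}")
```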

  2. Philosophy for the rest of cognitive science.

    PubMed

    Stepp, Nigel; Chemero, Anthony; Turvey, Michael T

    2011-04-01

    Cognitive science has always included multiple methodologies and theoretical commitments. The philosophy of cognitive science should embrace, or at least acknowledge, this diversity. Bechtel's (2009a) proposed philosophy of cognitive science, however, applies only to representationalist and mechanist cognitive science, ignoring the substantial minority of dynamically oriented cognitive scientists. As an example of nonrepresentational, dynamical cognitive science, we describe strong anticipation as a model for circadian systems (Stepp & Turvey, 2009). We then propose a philosophy of science appropriate to nonrepresentational, dynamical cognitive science. Copyright © 2011 Cognitive Science Society, Inc.

  3. Multiattribute selection of acute stroke imaging software platform for Extending the Time for Thrombolysis in Emergency Neurological Deficits (EXTEND) clinical trial.

    PubMed

    Churilov, Leonid; Liu, Daniel; Ma, Henry; Christensen, Soren; Nagakane, Yoshinari; Campbell, Bruce; Parsons, Mark W; Levi, Christopher R; Davis, Stephen M; Donnan, Geoffrey A

    2013-04-01

    The appropriateness of a software platform for rapid MRI assessment of the amount of salvageable brain tissue after stroke is critical for both the validity of the Extending the Time for Thrombolysis in Emergency Neurological Deficits (EXTEND) Clinical Trial of stroke thrombolysis beyond 4.5 hours and for stroke patient care outcomes. The objective of this research is to develop and implement a methodology for selecting the acute stroke imaging software platform most appropriate for the setting of a multi-centre clinical trial. A multi-disciplinary decision making panel formulated the set of preferentially independent evaluation attributes. Alternative Multi-Attribute Value Measurement methods were used to identify the best imaging software platform followed by sensitivity analysis to ensure the validity and robustness of the proposed solution. Four alternative imaging software platforms were identified. RApid processing of PerfusIon and Diffusion (RAPID) software was selected as the most appropriate for the needs of the EXTEND trial. A theoretically grounded generic multi-attribute selection methodology for imaging software was developed and implemented. The developed methodology assured both a high quality decision outcome and a rational and transparent decision process. This development contributes to stroke literature in the area of comprehensive evaluation of MRI clinical software. At the time of evaluation, RAPID software presented the most appropriate imaging software platform for use in the EXTEND clinical trial. The proposed multi-attribute imaging software evaluation methodology is based on sound theoretical foundations of multiple criteria decision analysis and can be successfully used for choosing the most appropriate imaging software while ensuring both robust decision process and outcomes. © 2012 The Authors. International Journal of Stroke © 2012 World Stroke Organization.
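
    The multi-attribute value measurement step can be illustrated with a simple weighted additive model: normalise each attribute to a 0-1 value scale, weight, and sum, then check whether the ranking survives weight perturbation. The attributes, scores, and weights below are fabricated for illustration and are not the EXTEND panel's actual data or chosen method variant.

```python
import numpy as np

# Hypothetical platforms scored on four attributes (higher raw score = better).
platforms = ["Platform A", "Platform B", "Platform C", "Platform D"]
attributes = ["processing speed", "accuracy", "usability", "multi-centre support"]
scores = np.array([
    [8.0, 7.5, 6.0, 9.0],
    [6.5, 8.0, 7.0, 6.0],
    [7.0, 6.0, 8.5, 7.0],
    [5.5, 7.0, 6.5, 8.0],
])
weights = np.array([0.35, 0.30, 0.15, 0.20])

# Normalise each attribute to [0, 1] and compute the additive value.
v = (scores - scores.min(axis=0)) / (scores.max(axis=0) - scores.min(axis=0))
value = v @ weights
best = platforms[int(np.argmax(value))]
print(dict(zip(platforms, value.round(3))), "-> best:", best)

# Crude sensitivity analysis: does the winner survive random weight perturbations?
rng = np.random.default_rng(0)
wins = 0
for _ in range(1000):
    w = weights * rng.uniform(0.8, 1.2, size=weights.size)
    w /= w.sum()
    wins += platforms[int(np.argmax(v @ w))] == best
print(f"winner retained in {wins / 10:.1f}% of perturbed weight sets")
```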

  4. What Was out of the Frame? A Dialogic Look at Youth Media Production in a Cultural Diversity and Educational Context in Chile

    ERIC Educational Resources Information Center

    Valdivia, Andrea

    2017-01-01

    This article accounts for an experience of digital storytelling workshops with indigenous adolescents in Chile, and proposes a theoretical and methodological approach to analyze digital creations with a dialogic and ethnographic point of view. Based on this, it discusses the possibilities of digital media production as a strategy for the…

  5. Ecoliteracy and a Place-Based Pedagogy: Expanding Latin@ Students' Critical Understanding of the Reciprocity between Sociocultural Systems and Ecosystems in the US-Mexico Border Region

    ERIC Educational Resources Information Center

    Gutierrez, Kristina A.

    2012-01-01

    This dissertation proposes a place-based theoretical and methodological framework, informed by concepts of ecology, multimodality, and activity systems. I apply this framework of ecoliteracy as it is defined within the interdisciplinary contexts of rhetoric and composition, linguistics, and Chicana/o studies. Ecoliteracy refers to individuals'…

  6. Variation in Research Designs Used to Test the Effectiveness of Dissemination and Implementation Strategies: A Review.

    PubMed

    Mazzucca, Stephanie; Tabak, Rachel G; Pilar, Meagan; Ramsey, Alex T; Baumann, Ana A; Kryzer, Emily; Lewis, Ericka M; Padek, Margaret; Powell, Byron J; Brownson, Ross C

    2018-01-01

    The need for optimal study designs in dissemination and implementation (D&I) research is increasingly recognized. Despite the wide range of study designs available for D&I research, we lack understanding of the types of designs and methodologies that are routinely used in the field. This review assesses the designs and methodologies in recently proposed D&I studies and provides resources to guide design decisions. We reviewed 404 study protocols published in the journal Implementation Science from 2/2006 to 9/2017. Eligible studies tested the efficacy or effectiveness of D&I strategies (i.e., not effectiveness of the underlying clinical or public health intervention); had a comparison by group and/or time; and used ≥1 quantitative measure. Several design elements were extracted: design category (e.g., randomized); design type [e.g., cluster randomized controlled trial (RCT)]; data type (e.g., quantitative); D&I theoretical framework; levels of treatment assignment, intervention, and measurement; and country in which the research was conducted. Each protocol was double-coded, and discrepancies were resolved through discussion. Of the 404 protocols reviewed, 212 (52%) studies tested one or more implementation strategies across 208 manuscripts, therefore meeting inclusion criteria. Of the included studies, 77% utilized randomized designs, primarily cluster RCTs. The use of alternative designs (e.g., stepped wedge) increased over time. Fewer studies were quasi-experimental (17%) or observational (6%). Many study design categories (e.g., controlled pre-post, matched pair cluster design) were represented by only one or two studies. Most articles proposed quantitative and qualitative methods (61%), with the remaining 39% proposing only quantitative. Half of the protocols (52%) reported using a theoretical framework to guide the study. The four most frequently reported frameworks were the Consolidated Framework for Implementation Research and RE-AIM (n = 16 each), followed by Promoting Action on Research Implementation in Health Services and the Theoretical Domains Framework (n = 12 each). While several novel designs for D&I research have been proposed (e.g., stepped wedge, adaptive designs), the majority of the studies in our sample employed RCT designs. Alternative study designs are increasing in use but may be underutilized for a variety of reasons, including preference of funders or lack of awareness of these designs. Promisingly, the prevalent use of quantitative and qualitative methods together reflects methodological innovation in newer D&I research.

  7. Towards a Model of Technology Adoption: A Conceptual Model Proposition

    NASA Astrophysics Data System (ADS)

    Costello, Pat; Moreton, Rob

    A conceptual model for Information Communication Technology (ICT) adoption by Small Medium Enterprises (SMEs) is proposed. The research uses several ICT adoption models as its basis with theoretical underpinning provided by the Diffusion of Innovation theory and the Technology Acceptance Model (TAM). Taking an exploratory research approach the model was investigated amongst 200 SMEs whose core business is ICT. Evidence from this study demonstrates that these SMEs face the same issues as all other industry sectors. This work points out weaknesses in SMEs environments regarding ICT adoption and suggests what they may need to do to increase the success rate of any proposed adoption. The methodology for development of the framework is described and recommendations made for improved Government-led ICT adoption initiatives. Application of the general methodology has resulted in new opportunities to embed the ethos and culture surrounding the issues into the framework of new projects developed as a result of Government intervention. A conceptual model is proposed that may lead to a deeper understanding of the issues under consideration.

  8. Windowed Green function method for the Helmholtz equation in the presence of multiply layered media

    NASA Astrophysics Data System (ADS)

    Bruno, O. P.; Pérez-Arancibia, C.

    2017-06-01

    This paper presents a new methodology for the solution of problems of two- and three-dimensional acoustic scattering (and, in particular, two-dimensional electromagnetic scattering) by obstacles and defects in the presence of an arbitrary number of penetrable layers. Relying on the use of certain slow-rise windowing functions, the proposed windowed Green function approach efficiently evaluates oscillatory integrals over unbounded domains, with high accuracy, without recourse to the highly expensive Sommerfeld integrals that have typically been used to account for the effect of underlying planar multilayer structures. The proposed methodology, whose theoretical basis was presented in the recent contribution (Bruno et al. 2016 SIAM J. Appl. Math. 76, 1871-1898. (doi:10.1137/15M1033782)), is fast, accurate, flexible and easy to implement. Our numerical experiments demonstrate that the numerical errors resulting from the proposed approach decrease faster than any negative power of the window size. In a number of examples considered in this paper, the proposed method is up to thousands of times faster, for a given accuracy, than corresponding methods based on the use of Sommerfeld integrals.
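
    The windowing idea can be illustrated on a one-dimensional toy problem (this is not the authors' layered-media solver): the oscillatory integral over an unbounded domain, int_0^inf cos(x)/(1+x^2) dx = pi/(2e), converges slowly under plain truncation but very rapidly once the integrand is multiplied by a smooth window that equals 1 on the first half of the truncation interval and decays smoothly to 0 at its end.

```python
# Toy illustration of smooth windowing for oscillatory integrals over an
# unbounded domain (not the authors' layered-media solver). The integral
#   I = int_0^inf cos(x) / (1 + x^2) dx = pi / (2 e)
# converges slowly under plain truncation, but much faster once the integrand
# is multiplied by a smooth window that is 1 on [0, A/2] and decays to 0 at A.
import numpy as np

I_exact = np.pi / (2.0 * np.e)

def window(x, A):
    """Smooth window: 1 on [0, A/2], infinitely smooth decay to 0 on [A/2, A]."""
    u = np.clip((x - A / 2.0) / (A / 2.0), 0.0, 1.0)
    w = np.ones_like(x)
    mid = (u > 0.0) & (u < 1.0)
    w[mid] = np.exp(2.0 * np.exp(-1.0 / u[mid]) / (u[mid] - 1.0))
    w[u >= 1.0] = 0.0
    return w

def integrate(A, windowed, n=200_001):
    """Trapezoid rule on [0, A], with or without the smooth window."""
    x = np.linspace(0.0, A, n)
    h = x[1] - x[0]
    f = np.cos(x) / (1.0 + x**2)
    if windowed:
        f = f * window(x, A)
    return h * (0.5 * f[0] + f[1:-1].sum() + 0.5 * f[-1])

for A in (20.0, 40.0, 80.0):
    err_trunc = abs(integrate(A, windowed=False) - I_exact)
    err_win = abs(integrate(A, windowed=True) - I_exact)
    print(f"A = {A:5.0f}   truncation error {err_trunc:.2e}   windowed error {err_win:.2e}")
```

    The windowed errors should fall off far faster with the window size A than the roughly O(1/A^2) plain-truncation errors, mirroring the super-algebraic convergence reported above.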

  9. Windowed Green function method for the Helmholtz equation in the presence of multiply layered media.

    PubMed

    Bruno, O P; Pérez-Arancibia, C

    2017-06-01

    This paper presents a new methodology for the solution of problems of two- and three-dimensional acoustic scattering (and, in particular, two-dimensional electromagnetic scattering) by obstacles and defects in the presence of an arbitrary number of penetrable layers. Relying on the use of certain slow-rise windowing functions, the proposed windowed Green function approach efficiently evaluates oscillatory integrals over unbounded domains, with high accuracy, without recourse to the highly expensive Sommerfeld integrals that have typically been used to account for the effect of underlying planar multilayer structures. The proposed methodology, whose theoretical basis was presented in the recent contribution (Bruno et al. 2016 SIAM J. Appl. Math. 76, 1871-1898. (doi:10.1137/15M1033782)), is fast, accurate, flexible and easy to implement. Our numerical experiments demonstrate that the numerical errors resulting from the proposed approach decrease faster than any negative power of the window size. In a number of examples considered in this paper, the proposed method is up to thousands of times faster, for a given accuracy, than corresponding methods based on the use of Sommerfeld integrals.

  10. External Validity in the Study of Human Development: Theoretical and Methodological Issues

    ERIC Educational Resources Information Center

    Hultsch, David F.; Hickey, Tom

    1978-01-01

    An examination of the concept of external validity from two theoretical perspectives: a traditional mechanistic approach and a dialectical organismic approach. Examines the theoretical and methodological implications of these perspectives. (BD)

  11. Theoretical calculating the thermodynamic properties of solid sorbents for CO2 capture applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duan, Yuhua

    2012-11-02

    Since current technologies for capturing CO2 to fight global climate change are still too energy intensive, there is a critical need for development of new materials that can capture CO2 reversibly with acceptable energy costs. Accordingly, solid sorbents have been proposed to be used for CO2 capture applications through a reversible chemical transformation. By combining thermodynamic database mining with first principles density functional theory and phonon lattice dynamics calculations, a theoretical screening methodology to identify the most promising CO2 sorbent candidates from the vast array of possible solid materials has been proposed and validated. The calculated thermodynamic properties of different classes of solid materials versus temperature and pressure changes were further used to evaluate the equilibrium properties for the CO2 adsorption/desorption cycles. According to the requirements imposed by the pre- and post-combustion technologies and based on our calculated thermodynamic properties for the CO2 capture reactions by the solids of interest, we were able to screen only those solid materials for which lower capture energy costs are expected at the desired pressure and temperature conditions. Only those selected CO2 sorbent candidates were further considered for experimental validation. The ab initio thermodynamic technique has the advantage of identifying thermodynamic properties of CO2 capture reactions without any experimental input beyond crystallographic structural information of the solid phases involved. Such a methodology can not only be used to search for good candidates in existing databases of solid materials, but can also provide guidelines for synthesizing new materials. In this presentation, we first introduce our screening methodology and the results on a testing set of solids with known thermodynamic properties to validate our methodology. Then, by applying our computational method to several different kinds of solid systems, we demonstrate that our methodology can predict useful information to help develop CO2 capture technologies.
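
    A back-of-the-envelope version of the screening criterion can be written down from standard thermodynamics. The sketch below, which is not the authors' DFT/phonon workflow, uses rough literature-style values for CaO + CO2 -> CaCO3 to locate the turnover temperature at which the capture reaction free energy changes sign as a function of CO2 partial pressure.

```python
# Back-of-the-envelope version of the thermodynamic screening step (not the
# authors' DFT/phonon workflow): for a capture reaction MO + CO2(g) -> MCO3,
#   dG(T, P) ~= dH0 - T*dS0 - R*T*ln(P_CO2 / P0),
# and the turnover temperature is where dG = 0. The dH0/dS0 values below are
# rough, temperature-independent numbers of the kind tabulated for
# CaO + CO2 -> CaCO3, used only for illustration.
import math

R = 8.314          # J/(mol K)
dH0 = -178.0e3     # J/mol   (approximate)
dS0 = -160.0       # J/(mol K)

def delta_g(T, p_co2, p0=1.0):
    return dH0 - T * dS0 - R * T * math.log(p_co2 / p0)

def turnover_temperature(p_co2, p0=1.0):
    # Solve dH0 - T*(dS0 + R*ln(p/p0)) = 0 for T.
    return dH0 / (dS0 + R * math.log(p_co2 / p0))

for p in (0.1, 1.0, 10.0):  # bar; e.g. post- vs pre-combustion CO2 partial pressures
    print(f"P_CO2 = {p:5.1f} bar   T_turnover ~ {turnover_temperature(p):6.1f} K   "
          f"dG(600 K) = {delta_g(600.0, p)/1e3:6.1f} kJ/mol")
```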

  12. On the Correct Analysis of the Foundations of Theoretical Physics

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2007-04-01

    The problem of truth in science -- the most urgent problem of our time -- is discussed. The correct theoretical analysis of the foundations of theoretical physics is proposed. The principle of the unity of formal logic and rational dialectics is a methodological basis of the analysis. The main result is as follows: the generally accepted foundations of theoretical physics (i.e. Newtonian mechanics, Maxwell electrodynamics, thermodynamics, statistical physics and physical kinetics, the theory of relativity, quantum mechanics) contain the set of logical errors. These errors are explained by existence of the global cause: the errors are a collateral and inevitable result of the inductive way of cognition of the Nature, i.e. result of movement from formation of separate concepts to formation of the system of concepts. Consequently, theoretical physics enters the greatest crisis. It means that physics as a science of phenomenon leaves the progress stage for a science of essence (information). Acknowledgment: The books ``Surprises in Theoretical Physics'' (1979) and ``More Surprises in Theoretical Physics'' (1991) by Sir Rudolf Peierls stimulated my 25-year work.

  13. Comment on `Magnitude conversion problem using general orthogonal regression' by H. R. Wason, Ranjit Das and M. L. Sharma, (Geophys. J. Int., 190, 1091-1096)

    NASA Astrophysics Data System (ADS)

    Gasperini, Paolo; Lolli, Barbara

    2014-01-01

    Wason et al. argue that the conversion of magnitudes from one scale (e.g. Ms or mb) to another (e.g. Mw), using the coefficients computed by the general orthogonal regression method (Fuller), is biased if the observed values of the predictor (independent) variable are used in the conversion equation, and they suggest a methodology to estimate the supposedly true values of the predictor variable. We show that both this argument and the suggested methodology are wrong for a number of theoretical and empirical reasons. Hence, we advise against the use of such a methodology for magnitude conversions.

  14. Quantifying Ballistic Armor Performance: A Minimally Invasive Approach

    NASA Astrophysics Data System (ADS)

    Holmes, Gale; Kim, Jaehyun; Blair, William; McDonough, Walter; Snyder, Chad

    2006-03-01

    Theoretical and non-dimensional analyses suggest a critical link between the performance of ballistic resistant armor and the fundamental mechanical properties of the polymeric materials that comprise them. Therefore, a test methodology that quantifies these properties without compromising an armored vest that is exposed to the industry standard V-50 ballistic performance test is needed. Currently, there is considerable speculation about the impact that competing degradation mechanisms (e.g., mechanical, humidity, ultraviolet) may have on ballistic resistant armor. We report on the use of a new test methodology that quantifies the mechanical properties of ballistic fibers and how each proposed degradation mechanism may impact a vest's ballistic performance.

  15. Argumentation in Science Education: A Model-based Framework

    NASA Astrophysics Data System (ADS)

    Böttcher, Florian; Meisert, Anke

    2011-02-01

    The goal of this article is threefold: First, the theoretical background for a model-based framework of argumentation to describe and evaluate argumentative processes in science education is presented. Based on the general model-based perspective in cognitive science and the philosophy of science, it is proposed to understand arguments as reasons for the appropriateness of a theoretical model which explains a certain phenomenon. Argumentation is considered to be the process of the critical evaluation of such a model if necessary in relation to alternative models. Secondly, some methodological details are exemplified for the use of a model-based analysis in the concrete classroom context. Third, the application of the approach in comparison with other analytical models will be presented to demonstrate the explicatory power and depth of the model-based perspective. Primarily, the framework of Toulmin to structurally analyse arguments is contrasted with the approach presented here. It will be demonstrated how common methodological and theoretical problems in the context of Toulmin's framework can be overcome through a model-based perspective. Additionally, a second more complex argumentative sequence will also be analysed according to the invented analytical scheme to give a broader impression of its potential in practical use.

  16. The Uses of Mass Communications: Current Perspectives on Gratifications Research. Sage Annual Reviews of Communication Research Volume III.

    ERIC Educational Resources Information Center

    Blumler, Jay G., Ed.; Katz, Elihu, Ed.

    The essays in this volume examine the use of the mass media and explore the findings of the gratifications approach to mass communication research. Part one summarizes the achievements in this area of mass media research and proposes an agenda for discussion of the future direction of this research in terms of a set of theoretical, methodological,…

  17. What Is Different about E-Books? A MINES for Libraries® Analysis of Academic and Health Sciences Research Libraries' E-Book Usage

    ERIC Educational Resources Information Center

    Plum, Terry; Franklin, Brinley

    2015-01-01

    Building on the theoretical proposals of Kevin Guthrie and others concerning the transition from print books to e-books in academic and health sciences libraries, this paper presents data collected using the MINES for Libraries® e-resource survey methodology. Approximately 6,000 e-book uses were analyzed from a sample of e-resource usage at…

  18. A Confidence Paradigm for Classification Systems

    DTIC Science & Technology

    2008-09-01

    methodology to determine how much confidence one should have in a classifier output. This research proposes a framework to determine the level of...theoretical framework that attempts to unite the viewpoints of the classification system developer (or engineer) and the classification system user (or...operating point. An algorithm is developed that minimizes a "confidence" measure called Binned Error in the Posterior (BEP). Then, we prove that training a

  19. Students’ scientific production: a proposal to encourage it.

    PubMed

    Corrales-Reyes, Ibraín Enrique; Dorta-Contreras, Alberto Juan

    2018-01-31

    The scientific production of medical students in Latin America is poor and below their potential. The reasons for this are the low theoretical and practical knowledge of scientific writing, a low margin for new knowledge generation, a heavy academic and clinical load, and the expected profile of the medical school graduate. In the present short communication, we propose teaching courses in research methodology, scientific writing in English and Spanish, and a personalized search for students and mentors with research aptitudes. Also, we propose academic and material stimuli for publishing, rewards for the best papers written by students, and the development and support of scientific student journals. Other proposals are the requirement to publish a paper for graduation, and sharing the most outstanding experiences.

  20. A Neural-Network Clustering-Based Algorithm for Privacy Preserving Data Mining

    NASA Astrophysics Data System (ADS)

    Tsiafoulis, S.; Zorkadis, V. C.; Karras, D. A.

    The increasing use of fast and efficient data mining algorithms in huge collections of personal data, facilitated through the exponential growth of technology, in particular in the field of electronic data storage media and processing power, has raised serious ethical, philosophical and legal issues related to privacy protection. To cope with these concerns, several privacy preserving methodologies have been proposed, classified in two categories: methodologies that aim at protecting the sensitive data and those that aim at protecting the mining results. In our work, we focus on sensitive data protection and compare existing techniques according to their anonymity degree achieved, the information loss suffered and their performance characteristics. The ℓ-diversity principle is combined with k-anonymity concepts, so that background information cannot be exploited to successfully attack the privacy of the data subjects to whom the data refer. Based on Kohonen Self Organizing Feature Maps (SOMs), we first organize data sets in subspaces according to their information theoretical distance to each other, then create the most relevant classes, paying special attention to rare sensitive attribute values, and finally generalize attribute values to the minimum extent required, so that both the data disclosure probability and the information loss are kept negligible. Furthermore, we propose information theoretical measures for assessing the anonymity degree achieved and empirical tests to demonstrate it.
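
    The generalisation step can be illustrated with a deliberately simplified sketch: records are grouped on a single numeric quasi-identifier (a stand-in for the SOM-based subspace organisation described above) and the identifier is coarsened until every group satisfies k-anonymity and distinct ℓ-diversity. The data, bin widths and parameters are invented.

```python
# Simplified sketch of the anonymisation step described above: records are
# grouped on a numeric quasi-identifier (standing in for the SOM-based
# subspace organisation) and the attribute is generalised (coarser bins)
# until every group satisfies k-anonymity and distinct l-diversity.
# Data and parameters are made up for illustration.
from collections import defaultdict

records = [  # (age quasi-identifier, sensitive diagnosis)
    (23, "flu"), (25, "asthma"), (27, "flu"), (31, "diabetes"),
    (33, "flu"), (36, "asthma"), (41, "cancer"), (44, "flu"),
    (47, "diabetes"), (52, "asthma"), (55, "flu"), (58, "cancer"),
]

def generalize(records, bin_width):
    """Map each age to the interval [lo, lo + bin_width) it falls into."""
    groups = defaultdict(list)
    for age, sensitive in records:
        lo = (age // bin_width) * bin_width
        groups[(lo, lo + bin_width)].append(sensitive)
    return groups

def satisfies(groups, k, l):
    return all(len(vals) >= k and len(set(vals)) >= l for vals in groups.values())

def anonymize(records, k=3, l=2, widths=(5, 10, 20, 40, 80)):
    for width in widths:            # generalise to the minimum extent required
        groups = generalize(records, width)
        if satisfies(groups, k, l):
            return width, groups
    raise ValueError("no generalisation level satisfies the constraints")

width, groups = anonymize(records)
print(f"chosen bin width: {width}")
for interval, vals in sorted(groups.items()):
    print(interval, vals)
```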

  1. A Novel Methodology for Measurements of an LED's Heat Dissipation Factor

    NASA Astrophysics Data System (ADS)

    Jou, R.-Y.; Haung, J.-H.

    2015-12-01

    Heat generation is an inevitable byproduct of high-power light-emitting diode (LED) lighting. The increase in junction temperature that accompanies the heat generation sharply degrades the optical output of the LED and has a significant negative influence on the reliability and durability of the LED. For these reasons, the heat dissipation factor, Kh, is an important factor in the modeling and thermal design of LED installations. In this study, a methodology is proposed and experiments are conducted to determine LED heat dissipation factors. Experiments are conducted for two different brands of LED. The average heat dissipation factor of the Edixeon LED is 0.69, and is 0.60 for the OSRAM LED. By using the developed test method and comparing the results to the calculated luminous fluxes using theoretical equations, the interdependence of optical, electrical, and thermal powers can be predicted with reasonable accuracy. The difference between the theoretical and experimental values is less than 9%.
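
    Assuming the heat dissipation factor is defined as the fraction of electrical input power converted to heat, Kh = (P_electrical - P_optical) / P_electrical, the calculation reduces to the sketch below. The drive conditions and radiometric optical powers are invented, chosen only to land in the 0.6-0.7 range reported above.

```python
# Minimal illustration of the heat-dissipation-factor idea, assuming
#   Kh = P_heat / P_electrical = (P_electrical - P_optical) / P_electrical,
# i.e. the fraction of electrical input power converted to heat. The drive
# conditions and optical powers below are invented, merely chosen to give
# values in the 0.6-0.7 range reported above.
measurements = [
    # (forward voltage [V], forward current [A], radiometric optical power [W])
    (3.10, 0.35, 0.36),
    (3.18, 0.50, 0.49),
    (3.25, 0.70, 0.64),
]

k_h_values = []
for v_f, i_f, p_opt in measurements:
    p_elec = v_f * i_f
    k_h = (p_elec - p_opt) / p_elec
    k_h_values.append(k_h)
    print(f"P_elec = {p_elec:5.2f} W   P_opt = {p_opt:4.2f} W   Kh = {k_h:4.2f}")

print(f"average Kh = {sum(k_h_values) / len(k_h_values):.2f}")
```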

  2. Early Warning Signals of Ecological Transitions: Methods for Spatial Patterns

    PubMed Central

    Brock, William A.; Carpenter, Stephen R.; Ellison, Aaron M.; Livina, Valerie N.; Seekell, David A.; Scheffer, Marten; van Nes, Egbert H.; Dakos, Vasilis

    2014-01-01

    A number of ecosystems can exhibit abrupt shifts between alternative stable states. Because of their important ecological and economic consequences, recent research has focused on devising early warning signals for anticipating such abrupt ecological transitions. In particular, theoretical studies show that changes in spatial characteristics of the system could provide early warnings of approaching transitions. However, the empirical validation of these indicators lags behind their theoretical developments. Here, we summarize a range of currently available spatial early warning signals, suggest potential null models to interpret their trends, and apply them to three simulated spatial data sets of systems undergoing an abrupt transition. In addition to providing a step-by-step methodology for applying these signals to spatial data sets, we propose a statistical toolbox that may be used to help detect approaching transitions in a wide range of spatial data. We hope that our methodology, together with the computer codes, will stimulate the application and testing of spatial early warning signals on real spatial data. PMID:24658137
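
    A few of the spatial indicators referred to above can be computed with a handful of lines. The sketch below evaluates spatial variance, spatial skewness and Moran's I (rook contiguity) on synthetic lattice snapshots that merely mimic growing spatial correlation; it is not one of the simulated data sets used in the paper.

```python
# Sketch of three spatial early warning indicators (spatial variance, spatial
# skewness, Moran's I with 4-neighbour weights) evaluated on snapshots of a
# toy lattice variable drifting towards a transition. The "snapshots" are
# synthetic and only illustrate the indicator calculations.
import numpy as np

rng = np.random.default_rng(0)

def morans_i(field):
    """Moran's I on a 2-D grid with rook (4-neighbour) contiguity."""
    x = field - field.mean()
    num = (x[:-1, :] * x[1:, :]).sum() + (x[:, :-1] * x[:, 1:]).sum()
    n_pairs = x[:-1, :].size + x[:, :-1].size   # unordered neighbour pairs
    w_sum = 2 * n_pairs                         # symmetric weight matrix sum
    return (field.size / w_sum) * (2 * num) / (x**2).sum()

def spatial_skewness(field):
    x = field - field.mean()
    return (x**3).mean() / (x**2).mean() ** 1.5

for control in (0.2, 0.5, 0.8):     # proxy for "distance to the transition"
    # Toy snapshot: increasingly strong, smoothed patches as the transition nears.
    noise = rng.normal(size=(100, 100))
    smooth = noise.copy()
    for _ in range(int(10 * control)):          # crude smoothing adds spatial correlation
        smooth = 0.2 * (np.roll(smooth, 1, 0) + np.roll(smooth, -1, 0)
                        + np.roll(smooth, 1, 1) + np.roll(smooth, -1, 1)) + 0.2 * smooth
    field = control * smooth + 0.1 * noise
    print(f"control = {control:.1f}  variance = {field.var():.4f}  "
          f"skewness = {spatial_skewness(field):+.3f}  Moran's I = {morans_i(field):.3f}")
```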

  3. [The medicalization of life: hybrids against the dichotomy Nature/Culture].

    PubMed

    Sy, Anahi

    2018-05-01

    This paper aims to analyze the process of medicalization in current societies, starting from a description of the way in which medicine gradually appropriated various aspects of everyday life that were once simply part of people's life cycle. At the theoretical level, we draw on authors such as Descola and Latour, who problematize the dichotomy between Nature and Culture and propose the need to think from a higher-order episteme. Methodologically, this theoretical proposal enables an analysis of medicalization that can illuminate what is hidden in biomedical discourse and practices: the sociocultural, political and economic processes that are part of these "objects" of Medicine. From this perspective, the presentation of these objects as scientific facts, objectively isolatable and manipulable by medical science, is in crisis. Thus, our analysis, based on the concept of "quasi-objects" or "hybrids", problematizes such objectification, while providing critical tools to reflect on the medicalization of life in today's societies.

  4. Model identification methodology for fluid-based inerters

    NASA Astrophysics Data System (ADS)

    Liu, Xiaofu; Jiang, Jason Zheng; Titurus, Branislav; Harrison, Andrew

    2018-06-01

    The inerter is the mechanical dual of the capacitor via the force-current analogy. It has the property that the force across the terminals is proportional to their relative acceleration. Compared with flywheel-based inerters, fluid-based forms have advantages of improved durability, inherent damping and simplicity of design. In order to improve the understanding of the physical behaviour of this fluid-based device, especially the behaviour caused by the hydraulic resistance and inertial effects in the external tube, this work proposes a comprehensive model identification methodology. Firstly, a modelling procedure is established, which allows the topological arrangement of the mechanical networks to be obtained by mapping the damping, inertance and stiffness effects directly to their respective hydraulic counterparts. Secondly, an experimental sequence is followed, which separates the identification of friction, stiffness and various damping effects. Furthermore, an experimental set-up is introduced, where two pressure gauges are used to accurately measure the pressure drop across the external tube. Theoretical models with improved confidence are obtained using the proposed methodology for a helical-tube fluid inerter prototype. The sources of remaining discrepancies are further analysed.
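
    A generic grey-box identification step of the kind implied above can be sketched as follows, assuming a hypothetical terminal force model F = b*a + c1*v + c2*v|v| (inertance, linear damping and a quadratic tube-damping term). This structure and the synthetic data are for illustration only and do not reproduce the authors' hydraulic network or experimental sequence.

```python
# Generic grey-box identification sketch (not the authors' specific hydraulic
# network): assume the terminal force of a fluid inerter is modelled as
#   F = b * a_rel + c1 * v_rel + c2 * v_rel * |v_rel|
# (inertance, linear damping, quadratic tube damping) and identify b, c1, c2
# by linear least squares from sampled force/velocity/acceleration data.
# The "measurements" are synthesised here with known parameters plus noise.
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 5000)

# Multi-sine relative displacement excitation across the two terminals.
x = 0.01 * np.sin(2 * np.pi * 1.0 * t) + 0.005 * np.sin(2 * np.pi * 3.7 * t)
v = np.gradient(x, t)
a = np.gradient(v, t)

b_true, c1_true, c2_true = 120.0, 35.0, 900.0     # kg, N s/m, N s^2/m^2 (invented)
force = b_true * a + c1_true * v + c2_true * v * np.abs(v)
force += 0.5 * rng.normal(size=t.size)            # measurement noise

# Regressor matrix: each column multiplies one unknown parameter.
Phi = np.column_stack([a, v, v * np.abs(v)])
params, *_ = np.linalg.lstsq(Phi, force, rcond=None)

print("identified [b, c1, c2]:", np.round(params, 1))
print("true       [b, c1, c2]:", [b_true, c1_true, c2_true])
```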

  5. Experimental methodology for turbocompressor in-duct noise evaluation based on beamforming wave decomposition

    NASA Astrophysics Data System (ADS)

    Torregrosa, A. J.; Broatch, A.; Margot, X.; García-Tíscar, J.

    2016-08-01

    An experimental methodology is proposed to assess the noise emission of centrifugal turbocompressors like those of automotive turbochargers. A step-by-step procedure is detailed, starting from the theoretical considerations of sound measurement in flow ducts and examining specific experimental setup guidelines and signal processing routines. Special care is taken regarding some limiting factors that adversely affect the measuring of sound intensity in ducts, namely calibration, sensor placement and frequency ranges and restrictions. In order to provide illustrative examples of the proposed techniques and results, the methodology has been applied to the acoustic evaluation of a small automotive turbocharger in a flow bench. Samples of raw pressure spectra, decomposed pressure waves, calibration results, accurate surge characterization and final compressor noise maps and estimated spectrograms are provided. The analysis of selected frequency bands successfully shows how different, known noise phenomena of particular interest such as mid-frequency "whoosh noise" and low-frequency surge onset are correlated with operating conditions of the turbocharger. Comparison against external inlet orifice intensity measurements shows good correlation and improvement with respect to alternative wave decomposition techniques.
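
    For context, the classical two-sensor plane-wave decomposition at a single frequency, a simpler relative of the beamforming-based decomposition described above, can be written in a few lines. The duct geometry and complex amplitudes below are invented.

```python
# Classical two-sensor plane-wave decomposition at a single frequency, shown
# here only as a simpler reference point for the beamforming-based
# decomposition described above. Complex pressure amplitudes, positions and
# sound speed are invented for the example.
import numpy as np

c0 = 343.0                 # m/s
f = 1500.0                 # Hz
k = 2 * np.pi * f / c0     # wavenumber
x1, x2 = 0.00, 0.03        # sensor positions along the duct [m]

# Synthesise the field from known forward/backward components, then recover them.
P_fwd_true, P_bwd_true = 1.0 + 0.2j, 0.35 - 0.1j
P1 = P_fwd_true * np.exp(-1j * k * x1) + P_bwd_true * np.exp(1j * k * x1)
P2 = P_fwd_true * np.exp(-1j * k * x2) + P_bwd_true * np.exp(1j * k * x2)

# Solve the 2x2 system  P(x) = P+ e^{-jkx} + P- e^{+jkx}  for P+ and P-.
det = 2j * np.sin(k * (x2 - x1))
P_fwd = (P1 * np.exp(1j * k * x2) - P2 * np.exp(1j * k * x1)) / det
P_bwd = (P2 * np.exp(-1j * k * x1) - P1 * np.exp(-1j * k * x2)) / det

print("recovered P+ :", np.round(P_fwd, 6), " true:", P_fwd_true)
print("recovered P- :", np.round(P_bwd, 6), " true:", P_bwd_true)
```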

  6. Dynamic seismic signatures of saturated porous rocks containing two orthogonal sets of fractures: theory versus numerical simulations

    NASA Astrophysics Data System (ADS)

    Guo, Junxin; Rubino, J. Germán; Glubokovskikh, Stanislav; Gurevich, Boris

    2018-05-01

    The dispersion and attenuation of seismic waves are potentially important attributes for the non-invasive detection and characterization of fracture networks. A primary mechanism for these phenomena is wave-induced fluid flow (WIFF), which can take place between fractures and their embedding background (FB-WIFF), as well as within connected fractures (FF-WIFF). In this work, we propose a theoretical approach to quantify seismic dispersion and attenuation related to these two manifestations of WIFF in saturated porous rocks permeated by two orthogonal sets of fractures. The methodology is based on existing theoretical models for rocks with aligned fractures, and we consider three types of fracture geometries, namely, periodic planar fractures, randomly spaced planar fractures and penny-shaped cracks. Synthetic 2-D rock samples with different degrees of fracture intersections are then explored by considering both the proposed theoretical approach and a numerical upscaling procedure that provides the effective seismic properties of generic heterogeneous porous media. The results show that the theoretical predictions are in overall good agreement with the numerical simulations, in terms of both the stiffness coefficients and the anisotropic properties. For the seismic dispersion and attenuation caused by FB-WIFF, the theoretical model for penny-shaped cracks matches the numerical simulations best, whereas for representing the effects due to FF-WIFF the periodic planar fractures model turns out to be the most suitable one. The proposed theoretical approach is easy to apply and is applicable not only to 2-D but also to 3-D fracture systems. Hence, it has the potential to constitute a useful framework for the seismic characterization of fractured reservoirs, especially in the presence of intersecting fractures.

  7. Passive Super-Low Frequency electromagnetic prospecting technique

    NASA Astrophysics Data System (ADS)

    Wang, Nan; Zhao, Shanshan; Hui, Jian; Qin, Qiming

    2017-03-01

    The Super-Low Frequency (SLF) electromagnetic prospecting technique, adopted as a non-imaging remote sensing tool for depth sounding, is systematically proposed for subsurface geological surveys. As a first step, we propose and theoretically illustrate the use of natural-source magnetic amplitudes as SLF responses. In order to directly calculate multi-dimensional theoretical SLF responses, modeling algorithms were developed and evaluated using the finite difference method. The theoretical results of three-dimensional (3-D) models show that the average normalized SLF magnetic amplitude responses are numerically stable and appropriate for practical interpretation. To explore the depth resolution, three-layer models were configured. The modeling results show that the SLF technique is more sensitive to conductive objective layers than to highly resistive ones, with the SLF responses of conductive objective layers showing clearly elevated amplitudes in the low-frequency range. We then propose an improved Frequency-Depth transformation based on Bostick inversion, which realizes the depth sounding by empirically adjusting two parameters. The SLF technique has already been successfully applied in geothermal exploration and coalbed methane (CBM) reservoir interpretation, which demonstrates that the proposed methodology is effective in revealing low-resistivity distributions. Furthermore, it significantly contributes to reservoir identification through electromagnetic radiation anomaly extraction. The SLF interpretation results are also in accordance with the dynamic production status of CBM reservoirs, indicating that the technique could provide an economical, convenient and promising method for exploring and monitoring subsurface geo-objects.

  8. Evaluation and communication: using a communication audit to evaluate organizational communication.

    PubMed

    Hogard, Elaine; Ellis, Roger

    2006-04-01

    This article identifies a surprising dearth of studies that explicitly link communication and evaluation at substantive, theoretical, and methodological levels. A three-fold typology of evaluation studies referring to communication is proposed and examples given. The importance of organizational communication in program delivery is stressed and illustrative studies reviewed. It is proposed that organizational communication should be considered in all program evaluations and that this should be approached through communication audit. Communication audits are described with particular reference to established survey questionnaire instruments. Two case studies exemplify the use of such instruments in the evaluation of educational and social programs.

  9. Research Strategies for Biomedical and Health Informatics. Some Thought-provoking and Critical Proposals to Encourage Scientific Debate on the Nature of Good Research in Medical Informatics.

    PubMed

    Haux, Reinhold; Kulikowski, Casimir A; Bakken, Suzanne; de Lusignan, Simon; Kimura, Michio; Koch, Sabine; Mantas, John; Maojo, Victor; Marschollek, Michael; Martin-Sanchez, Fernando; Moen, Anne; Park, Hyeoun-Ae; Sarkar, Indra N; Leong, Tze Yun; McCray, Alexa T

    2017-01-25

    Medical informatics, or biomedical and health informatics (BMHI), has become an established scientific discipline. In all such disciplines there is a certain inertia to persist in focusing on well-established research areas and to hold on to well-known research methodologies rather than adopting new ones, which may be more appropriate. To search for answers to the following questions: What are research fields in informatics, which are not being currently adequately addressed, and which methodological approaches might be insufficiently used? Do we know about reasons? What could be consequences of change for research and for education? Outstanding informatics scientists were invited to three panel sessions on this topic in leading international conferences (MIE 2015, Medinfo 2015, HEC 2016) in order to get their answers to these questions. A variety of themes emerged in the set of answers provided by the panellists. Some panellists took the theoretical foundations of the field for granted, while several questioned whether the field was actually grounded in a strong theoretical foundation. Panellists proposed a range of suggestions for new or improved approaches, methodologies, and techniques to enhance the BMHI research agenda. The field of BMHI is on the one hand maturing as an academic community and intellectual endeavour. On the other hand vendor-supplied solutions may be too readily and uncritically accepted in health care practice. There is a high chance that BMHI will continue to flourish as an important discipline; its innovative interventions might then reach the original objectives of advancing science and improving health care outcomes.

  10. A Theoretical and Methodological Evaluation of Leadership Research.

    ERIC Educational Resources Information Center

    Lashbrook, Velma J.; Lashbrook, William B.

    This paper isolates some of the strengths and weaknesses of leadership research by evaluating it from both a theoretical and methodological perspective. The seven theories or approaches examined are: great man, trait, situational, style, functional, social influence, and interaction positions. General theoretical, conceptual, and measurement…

  11. [Theoretical and methodological uses of research in Social and Human Sciences in Health].

    PubMed

    Deslandes, Suely Ferreira; Iriart, Jorge Alberto Bernstein

    2012-12-01

    The current article aims to map and critically reflect on the current theoretical and methodological uses of research in the subfield of social and human sciences in health. A convenience sample was used to select three Brazilian public health journals. Based on a reading of 1,128 abstracts published from 2009 to 2010, 266 articles were selected that presented the empirical base of research stemming from social and human sciences in health. The sample was classified thematically as "theoretical/methodological reference", "study type/methodological design", "analytical categories", "data production techniques", and "analytical procedures". We analyze the sample's emic categories, drawing on the authors' literal statements. All the classifications and respective variables were tabulated in Excel. Most of the articles were self-described as qualitative and used more than one data production technique. There was a wide variety of theoretical references, in contrast with the almost total predominance of a single type of data analysis (content analysis). In several cases, important gaps were identified in expounding the study methodology and instrumental use of the qualitative research techniques and methods. However, the review did highlight some new objects of study and innovations in theoretical and methodological approaches.

  12. Diagnosis of cutaneous thermal burn injuries by multispectral imaging analysis

    NASA Technical Reports Server (NTRS)

    Anselmo, V. J.; Zawacki, B. E.

    1978-01-01

    Special photographic or television image analysis is shown to be a potentially useful technique to assist the physician in the early diagnosis of thermal burn injury. A background on the medical and physiological problems of burns is presented. The proposed methodology for burns diagnosis from both the theoretical and clinical points of view is discussed. The television/computer system constructed to accomplish this analysis is described, and the clinical results are discussed.

  13. Verification of Accelerated Testing Methodology for Long-Term Durability of CFRP Laminates for Marine Use

    DTIC Science & Technology

    2012-01-30

    Grant number: N00014-06-1-1139. Principal Investigator: Yasushi Miyano; Co-principal Investigator: Isao Kimpara. The accelerated testing methodology for long-term durability prediction of CFRP laminates, proposed and confirmed experimentally in the previous ONR project of Grant # N000140110949, was verified theoretically and refined...

  14. Identifying the starting point of a spreading process in complex networks.

    PubMed

    Comin, Cesar Henrique; Costa, Luciano da Fontoura

    2011-11-01

    When dealing with the dissemination of epidemics, one important question that can be asked is the location where the contamination began. In this paper, we analyze three spreading schemes and propose and validate an effective methodology for the identification of the source nodes. The method is based on the calculation of the centrality of the nodes on the sampled network, expressed here by degree, betweenness, closeness, and eigenvector centrality. We show that the source node tends to have the highest measurement values. The potential of the methodology is illustrated with respect to three theoretical complex network models as well as a real-world network, the email network of the University Rovira i Virgili.
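
    A minimal illustration of this idea with networkx is sketched below: an SI-type spread is simulated from a known source on a Barabási-Albert graph, and the reached nodes are ranked by the four centrality measures named above. The graph size, infection probability and number of steps are arbitrary.

```python
# Illustration of the idea above using networkx: run a simple SI spread from a
# known source on a Barabasi-Albert graph, take the subgraph induced by the
# reached ("sampled") nodes, and rank them by the centrality measures named in
# the abstract. Parameters are arbitrary.
import random
import networkx as nx

random.seed(3)
G = nx.barabasi_albert_graph(500, 3, seed=3)
source = 42

# Simple SI spread: each infected node infects each susceptible neighbour
# with probability beta per step, for a fixed number of steps.
beta, steps = 0.2, 6
infected = {source}
for _ in range(steps):
    new = set()
    for u in infected:
        for v in G.neighbors(u):
            if v not in infected and random.random() < beta:
                new.add(v)
    infected |= new

H = G.subgraph(infected)
measures = {
    "degree": nx.degree_centrality(H),
    "betweenness": nx.betweenness_centrality(H),
    "closeness": nx.closeness_centrality(H),
    "eigenvector": nx.eigenvector_centrality(H, max_iter=1000),
}
for name, cent in measures.items():
    best = max(cent, key=cent.get)
    rank = sorted(cent, key=cent.get, reverse=True).index(source) + 1
    print(f"{name:12s} top node = {best:4d}   true source rank = {rank}")
```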

  15. Quantitative determination of the clustered silicon concentration in substoichiometric silicon oxide layer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spinella, Corrado; Bongiorno, Corrado; Nicotra, Giuseppe

    2005-07-25

    We present an analytical methodology, based on electron energy loss spectroscopy (EELS) and energy-filtered transmission electron microscopy, which allows us to quantify the clustered silicon concentration in annealed substoichiometric silicon oxide layers, deposited by plasma-enhanced chemical vapor deposition. The clustered Si volume fraction was deduced from a fit to the experimental EELS spectrum using a theoretical description proposed to calculate the dielectric function of a system of spherical particles of equal radii, located at random in a host material. The methodology allowed us to demonstrate that the clustered Si concentration is only one half of the excess Si concentration dissolved in the layer.

  16. The human dark side: evolutionary psychology and original sin.

    PubMed

    Lee, Joseph; Theol, M

    2014-04-01

    Human nature has a dark side, something important to religions. Evolutionary psychology has been used to illuminate the human shadow side, although as a discipline it has attracted criticism. This article seeks to examine evolutionary psychology's understanding of human nature and to propose an unexpected dialog with an enduring account of human evil known as original sin. Two cases are briefly considered: murder and rape. To further the exchange, numerous theoretical and methodological criticisms of evolutionary psychology, and replies to them, are explored jointly with original sin. Evolutionary psychology can partner with original sin since they share some theoretical likenesses and together they offer insights into the nature of what it means to be human.

  17. The integrative review: updated methodology.

    PubMed

    Whittemore, Robin; Knafl, Kathleen

    2005-12-01

    The aim of this paper is to distinguish the integrative review method from other review methods and to propose methodological strategies specific to the integrative review method to enhance the rigour of the process. Recent evidence-based practice initiatives have increased the need for and the production of all types of reviews of the literature (integrative reviews, systematic reviews, meta-analyses, and qualitative reviews). The integrative review method is the only approach that allows for the combination of diverse methodologies (for example, experimental and non-experimental research), and has the potential to play a greater role in evidence-based practice for nursing. With respect to the integrative review method, strategies to enhance data collection and extraction have been developed; however, methods of analysis, synthesis, and conclusion drawing remain poorly formulated. A modified framework for research reviews is presented to address issues specific to the integrative review method. Issues related to specifying the review purpose, searching the literature, evaluating data from primary sources, analysing data, and presenting the results are discussed. Data analysis methods of qualitative research are proposed as strategies that enhance the rigour of combining diverse methodologies as well as empirical and theoretical sources in an integrative review. An updated integrative review method has the potential to allow for diverse primary research methods to become a greater part of evidence-based practice initiatives.

  18. What lies behind crop decisions? Coming to terms with revealing farmers' preferences

    NASA Astrophysics Data System (ADS)

    Gomez, C.; Gutierrez, C.; Pulido-Velazquez, M.; López Nicolás, A.

    2016-12-01

    The paper offers a fully fledged applied revealed-preference methodology to screen and represent farmers' choices as the solution of an optimal program involving trade-offs among the alternative welfare outcomes of crop decisions, such as profits, income security and ease of management. The recursive two-stage method is proposed as an alternative that copes with the methodological problems inherent in commonly practiced positive mathematical programming (PMP) methodologies. Unlike in PMP, in the model proposed in this paper the non-linear costs required for both calibration and smooth adjustment are not at odds with the assumptions of linear Leontief technologies and fixed crop prices and input costs. The method frees the model from ad hoc assumptions about costs, recovers the potential of economic analysis as a means to understand the rationale behind observed and forecasted farmers' decisions, and thereby enhances the potential of the model to support policy making in relevant domains such as agricultural policy, water management, risk management and climate change adaptation. After the introduction, where the methodological drawbacks and challenges are set out, section two presents the theoretical model, section three develops its empirical application and presents its implementation in a Spanish irrigation district, and section four concludes and makes suggestions for further research.
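
    For orientation only, the sketch below shows the kind of first-stage optimisation implied by linear Leontief technologies and fixed prices: a gross-margin-maximising crop allocation under land and water constraints solved with scipy.optimize.linprog. The crops, margins and coefficients are invented, and the revealed-preference and recursive two-stage machinery of the paper is not reproduced.

```python
# Stand-in for the first-stage optimisation under linear Leontief technologies
# and fixed crop prices: maximise gross margin subject to land and water
# constraints with scipy.optimize.linprog. Crops, margins and coefficients are
# invented; the revealed-preference / two-stage machinery is not reproduced.
from scipy.optimize import linprog

crops = ["wheat", "maize", "vegetables"]
gross_margin = [700.0, 900.0, 2500.0]      # EUR per ha
water_use = [1500.0, 6000.0, 8000.0]       # m3 per ha
land_available = 100.0                      # ha
water_available = 400_000.0                 # m3

# linprog minimises, so negate the margins to maximise them.
res = linprog(
    c=[-m for m in gross_margin],
    A_ub=[[1.0, 1.0, 1.0], water_use],
    b_ub=[land_available, water_available],
    bounds=[(0.0, None)] * 3,
    method="highs",
)
for crop, area in zip(crops, res.x):
    print(f"{crop:11s} {area:6.1f} ha")
print(f"gross margin: {-res.fun:,.0f} EUR")
```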

  19. The Military-Industrial-Scientific Complex and the Rise of New Powers: Conceptual, Theoretical and Methodological Contributions and the Brazilian Case

    DTIC Science & Technology

    2017-09-29

    Report: The Military-Industrial-Scientific Complex and the Rise of New Powers: Conceptual, Theoretical and Methodological Contributions and the Brazilian Case. Distribution Statement: Approved for public release.

  20. Theoretical Borderlands: Using Multiple Theoretical Perspectives to Challenge Inequitable Power Structures in Student Development Theory

    ERIC Educational Resources Information Center

    Abes, Elisa S.

    2009-01-01

    This article is an exploration of possibilities and methodological considerations for using multiple theoretical perspectives in research that challenges inequitable power structures in student development theory. Specifically, I explore methodological considerations when partnering queer theory and constructivism in research on lesbian identity…

  1. GLOBALLY ADAPTIVE QUANTILE REGRESSION WITH ULTRA-HIGH DIMENSIONAL DATA

    PubMed Central

    Zheng, Qi; Peng, Limin; He, Xuming

    2015-01-01

    Quantile regression has become a valuable tool to analyze heterogeneous covariate-response associations that are often encountered in practice. The development of quantile regression methodology for high dimensional covariates primarily focuses on examination of model sparsity at a single or multiple quantile levels, which are typically prespecified ad hoc by the users. The resulting models may be sensitive to the specific choices of the quantile levels, leading to difficulties in interpretation and erosion of confidence in the results. In this article, we propose a new penalization framework for quantile regression in the high dimensional setting. We employ adaptive L1 penalties and, more importantly, propose a uniform selector of the tuning parameter for a set of quantile levels to avoid some of the potential problems with model selection at individual quantile levels. Our proposed approach achieves consistent shrinkage of regression quantile estimates across a continuous range of quantile levels, enhancing the flexibility and robustness of the existing penalized quantile regression methods. Our theoretical results include the oracle rate of uniform convergence and weak convergence of the parameter estimators. We also use numerical studies to confirm our theoretical findings and illustrate the practical utility of our proposal. PMID:26604424
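
    A rough analogue of fitting penalised quantile regressions at several quantile levels with a single shared tuning parameter can be put together with scikit-learn's QuantileRegressor (pinball loss plus an L1 penalty); this only illustrates the idea and is not the authors' adaptive-penalty estimator or its theory.

```python
# Rough analogue of penalised quantile regression across several quantile
# levels with one shared tuning parameter, using scikit-learn's
# QuantileRegressor (pinball loss + L1 penalty). This is only an illustration
# of the idea, not the authors' adaptive-penalty estimator or its theory.
import numpy as np
from sklearn.linear_model import QuantileRegressor

rng = np.random.default_rng(0)
n, p = 300, 20
X = rng.normal(size=(n, p))
# Sparse truth: only the first three covariates matter; heteroscedastic noise.
beta = np.zeros(p)
beta[:3] = [1.5, -2.0, 1.0]
y = X @ beta + (1.0 + 0.5 * np.abs(X[:, 0])) * rng.normal(size=n)

taus = [0.25, 0.50, 0.75]
shared_alpha = 0.05          # one penalty level used at every quantile

for tau in taus:
    model = QuantileRegressor(quantile=tau, alpha=shared_alpha,
                              solver="highs").fit(X, y)
    selected = np.flatnonzero(np.abs(model.coef_) > 1e-8)
    print(f"tau = {tau:.2f}   selected covariates: {selected.tolist()}")
```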

  2. A methodology for generating normal and pathological brain perfusion SPECT images for evaluation of MRI/SPECT fusion methods: application in epilepsy

    NASA Astrophysics Data System (ADS)

    Grova, C.; Jannin, P.; Biraben, A.; Buvat, I.; Benali, H.; Bernard, A. M.; Scarabin, J. M.; Gibaud, B.

    2003-12-01

    Quantitative evaluation of brain MRI/SPECT fusion methods for normal and in particular pathological datasets is difficult, due to the frequent lack of relevant ground truth. We propose a methodology to generate MRI and SPECT datasets dedicated to the evaluation of MRI/SPECT fusion methods and illustrate the method when dealing with ictal SPECT. The method consists in generating normal or pathological SPECT data perfectly aligned with a high-resolution 3D T1-weighted MRI using realistic Monte Carlo simulations that closely reproduce the response of a SPECT imaging system. Anatomical input data for the SPECT simulations are obtained from this 3D T1-weighted MRI, while functional input data result from an inter-individual analysis of anatomically standardized SPECT data. The method makes it possible to control the 'brain perfusion' function by proposing a theoretical model of brain perfusion from measurements performed on real SPECT images. Our method provides an absolute gold standard for assessing MRI/SPECT registration method accuracy since, by construction, the SPECT data are perfectly registered with the MRI data. The proposed methodology has been applied to create a theoretical model of normal brain perfusion and ictal brain perfusion characteristic of mesial temporal lobe epilepsy. To approach realistic and unbiased perfusion models, real SPECT data were corrected for uniform attenuation, scatter and partial volume effect. An anatomic standardization was used to account for anatomic variability between subjects. Realistic simulations of normal and ictal SPECT deduced from these perfusion models are presented. The comparison of real and simulated SPECT images showed relative differences in regional activity concentration of less than 20% in most anatomical structures, for both normal and ictal data, suggesting realistic models of perfusion distributions for evaluation purposes. Inter-hemispheric asymmetry coefficients measured on simulated data were found within the range of asymmetry coefficients measured on corresponding real data. The features of the proposed approach are compared with those of other methods previously described to obtain datasets appropriate for the assessment of fusion methods.

  3. Methodological proposal for validation of the disinfecting efficacy of an automated flexible endoscope reprocessor

    PubMed Central

    Graziano, Kazuko Uchikawa; Pereira, Marta Elisa Auler; Koda, Elaine

    2016-01-01

    Objective: To elaborate and apply a method to assess the efficacy of automated flexible endoscope reprocessors at a time when there is no official method or trained laboratory to comply with the requirements described in specific standards for this type of health product in Brazil. Method: The present methodological study was developed based on the following theoretical references: International Organization for Standardization (ISO) standard ISO 15883-4/2008 and Brazilian Health Surveillance Agency (Agência Nacional de Vigilância Sanitária - ANVISA) Collegiate Board Resolution (Resolução de Diretoria Colegiada - RDC) no. 35/2010 and 15/2012. The proposed method was applied to a commercially available device using a high-level 0.2% peracetic acid-based disinfectant. Results: The proposed method of assessment was found to be robust when the recommendations made in the relevant legislation were incorporated with some adjustments to ensure their feasibility. Application of the proposed method provided evidence of the efficacy of the tested equipment for the high-level disinfection of endoscopes. Conclusion: The proposed method may serve as a reference for the assessment of flexible endoscope reprocessors, thereby providing solid ground for the purchase of this category of health products. PMID:27508915

  4. Enacting a Place-Responsive Research Methodology: Walking Interviews with Educators

    ERIC Educational Resources Information Center

    Lynch, Jonathan; Mannion, Greg

    2016-01-01

    Place-based and place-responsive approaches to outdoor learning and education are developing in many countries but there is a dearth of theoretically supported methodologies that take more explicit account of place in research in these areas. In response, this article outlines one theoretical framing for place-responsive methodologies for…

  5. Methodological, Theoretical, Infrastructural, and Design Issues in Conducting Good Outcome Studies

    ERIC Educational Resources Information Center

    Kelly, Michael P.; Moore, Tessa A.

    2011-01-01

    This article outlines a set of methodological, theoretical, and other issues relating to the conduct of good outcome studies. The article begins by considering the contribution of evidence-based medicine to the methodology of outcome research. The lessons which can be applied in outcome studies in nonmedical settings are described. The article…

  6. Didactical suggestion for a Dynamic Hybrid Intelligent e-Learning Environment (DHILE) applying the PENTHA ID Model

    NASA Astrophysics Data System (ADS)

    dall'Acqua, Luisa

    2011-08-01

    The aim of our research is to respond to the call for "innovative, creative teaching" by proposing a methodology to educate creative students in a society characterized by multiple reference points and hyper-dynamic knowledge, continuously subject to reviews and discussions. We apply a multi-prospective Instructional Design Model (PENTHA ID Model), defined and developed by our research group, which adopts a hybrid pedagogical approach, consisting of elements of didactical connectivism intertwined with aspects of social constructivism and enactivism. The contribution proposes an e-course structure and approach, applying the theoretical design principles of the above-mentioned ID Model, describing methods, techniques, technologies and assessment criteria for the definition of lesson modes in an e-course.

  7. Adaptive PID formation control of nonholonomic robots without leader's velocity information.

    PubMed

    Shen, Dongbin; Sun, Weijie; Sun, Zhendong

    2014-03-01

    This paper proposes an adaptive proportional integral derivative (PID) algorithm to solve a formation control problem in the leader-follower framework where the leader robot's velocities are unknown to the follower robots. The main idea is first to design a proper ideal control law for the formation system to obtain the required performance, and then to use the adaptive PID methodology to approximate this ideal controller. As a result, the formation is achieved with substantially enhanced robust formation performance. The stability of the closed-loop system is theoretically proved by the Lyapunov method. Both numerical simulations and physical vehicle experiments are presented to verify the effectiveness of the proposed adaptive PID algorithm. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
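
    As a much simplified illustration of the setting (not the paper's controller), the sketch below keeps a follower at a fixed gap behind a leader in one dimension using a plain, fixed-gain PID acting only on the measured spacing error, so the leader's velocity is never supplied to the follower; the nonholonomic kinematics and the adaptive gain law are omitted and all gains are invented.

```python
# Toy 1-D leader-follower distance keeping with a plain PID controller, where
# the leader's velocity is never given to the follower (only the measured
# spacing error is). The nonholonomic kinematics and the adaptive gain law of
# the paper are not reproduced; gains and scenario are invented.
import numpy as np

dt, T = 0.02, 20.0
steps = int(T / dt)
kp, ki, kd = 2.0, 0.5, 0.8
desired_gap = 1.0            # follower should stay 1 m behind the leader

leader_x, follower_x = 5.0, 0.0
integral, prev_error = 0.0, None

for k in range(steps):
    t = k * dt
    leader_v = 1.0 + 0.5 * np.sin(0.5 * t)          # unknown to the follower
    leader_x += leader_v * dt

    error = (leader_x - follower_x) - desired_gap   # spacing error (measurable)
    integral += error * dt
    derivative = 0.0 if prev_error is None else (error - prev_error) / dt
    prev_error = error

    follower_v = kp * error + ki * integral + kd * derivative  # commanded speed
    follower_x += follower_v * dt

    if k % 250 == 0:
        print(f"t = {t:5.1f} s   spacing error = {error:+.3f} m")
```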

  8. Deterministic nonlinear phase gates induced by a single qubit

    NASA Astrophysics Data System (ADS)

    Park, Kimin; Marek, Petr; Filip, Radim

    2018-05-01

    We propose deterministic realizations of nonlinear phase gates by repeating a finite sequence of non-commuting Rabi interactions between a harmonic oscillator and only a single two-level ancillary qubit. We show explicitly that the key nonclassical features of the ideal cubic phase gate and the quartic phase gate are generated in the harmonic oscillator faithfully by our method. We numerically analyzed the performance of our scheme under realistic imperfections of the oscillator and the two-level system. The methodology is extended further to higher-order nonlinear phase gates. This theoretical proposal completes the set of operations required for continuous-variable quantum computation.
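
    The target operation itself is easy to write down numerically: the sketch below builds the ideal cubic phase gate exp(i*gamma*x^3) in a truncated Fock basis (with hbar = 1) and checks its unitarity and its Heisenberg-picture shift of <p> on the vacuum. The truncation dimension and gamma are arbitrary, and the qubit-mediated Rabi sequence of the paper is not reproduced.

```python
# The ideal cubic phase gate U = exp(i * gamma * x^3) built in a truncated Fock
# basis with numpy/scipy. This is only the target operation that the
# qubit-mediated sequence above is meant to approximate; truncation dimension
# and gamma are arbitrary, and truncation effects appear near the cutoff.
import numpy as np
from scipy.linalg import expm

dim, gamma = 40, 0.1
n = np.arange(1, dim)
a = np.diag(np.sqrt(n), k=1)                 # annihilation operator in the Fock basis
x = (a + a.conj().T) / np.sqrt(2.0)          # quadrature operators (hbar = 1)
p = 1j * (a.conj().T - a) / np.sqrt(2.0)

U = expm(1j * gamma * np.linalg.matrix_power(x, 3))   # ideal cubic phase gate

vac = np.zeros(dim, dtype=complex)
vac[0] = 1.0
psi = U @ vac

print("max |U U^dag - I|  =", np.max(np.abs(U @ U.conj().T - np.eye(dim))))
print("<p> after the gate =", np.real(psi.conj() @ p @ psi),
      " (Heisenberg picture predicts 3*gamma*<x^2> = 0.15 for the vacuum)")
```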

  9. Family-focused autism spectrum disorder research: a review of the utility of family systems approaches.

    PubMed

    Cridland, Elizabeth K; Jones, Sandra C; Magee, Christopher A; Caputi, Peter

    2014-04-01

    A family member with an autism spectrum disorder presents pervasive and bidirectional influences on the entire family system, suggesting a need for family-focused autism spectrum disorder research. While there has been increasing interest in this research area, family-focused autism spectrum disorder research can still be considered relatively recent, and there are limitations to the existing literature. The purpose of this article is to provide theoretical and methodological directions for future family-focused autism spectrum disorder research. In particular, this article proposes Family Systems approaches as a common theoretical framework for future family-focused autism spectrum disorder research by considering theoretical concepts such as Boundaries, Ambiguous Loss, Resilience and Traumatic Growth. We discuss reasons why these concepts are important to researching families living with autism spectrum disorder and provide recommendations for future research. The potential for research grounded in Family Systems approaches to influence clinical support services is also discussed.

  10. Array measurements adapted to the number of available sensors: Theoretical and practical approach for ESAC method

    NASA Astrophysics Data System (ADS)

    Galiana-Merino, J. J.; Rosa-Cintas, S.; Rosa-Herranz, J.; Garrido, J.; Peláez, J. A.; Martino, S.; Delgado, J.

    2016-05-01

    Array measurements of ambient noise have become a useful technique to estimate the surface wave dispersion curves and subsequently the subsurface elastic parameters that characterize the studied soil. One of the logistical handicaps associated with this kind of measurement is the requirement that several stations record at the same time, which limits their applicability in the case of research groups without enough infrastructure resources. In this paper, we describe the theoretical basis of the ESAC method and we deduce how the number of stations needed to implement any array layout can be reduced to only two stations. In this way, we propose a new methodology to implement an N-station array layout by using only M stations (M < N), which record at different positions of the originally prearranged N-station geometry at different times. We also provide some practical guidelines to implement the proposed approach and we show different examples where the obtained results confirm the theoretical foundations. Thus, the study shows that we can use a minimum of 2 stations to deploy any array layout originally designed for a higher number of sensors.
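
    The theoretical relation underlying ESAC can be illustrated directly: the azimuthally averaged coherency at frequency f and inter-station distance r follows J0(2*pi*f*r / c(f)), so the phase velocity can be fitted from coherencies at several distances. The sketch below does this on synthetic data; the reduced-station field procedure proposed in the paper is not shown.

```python
# Illustration of the theoretical relation underlying ESAC: the azimuthally
# averaged coherency at frequency f and inter-station distance r follows
# J0(2*pi*f*r / c(f)), so the Rayleigh-wave phase velocity c(f) can be fitted
# from coherencies measured at several distances. The "data" below are
# synthetic; the reduced-station field procedure of the paper is not shown.
import numpy as np
from scipy.special import j0
from scipy.optimize import curve_fit

rng = np.random.default_rng(2)
f = 4.0                                   # Hz
c_true = 300.0                            # m/s, synthetic phase velocity at f
r = np.array([10.0, 20.0, 30.0, 40.0, 50.0])   # inter-station distances [m]
coherency = j0(2 * np.pi * f * r / c_true) + 0.03 * rng.normal(size=r.size)

def esac_model(r, c):
    return j0(2 * np.pi * f * r / c)

(c_fit,), _ = curve_fit(esac_model, r, coherency, p0=[350.0])
print(f"fitted phase velocity at {f} Hz: {c_fit:.1f} m/s (true {c_true} m/s)")
```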

  11. Practice-Focused Ethnographies of Higher Education: Method/ological Corollaries of a Social Practice Perspective

    ERIC Educational Resources Information Center

    Trowler, Paul Richard

    2014-01-01

    Social practice theory addresses both theoretical and method/ological agendas. To date priority has been given to the former, with writing on the latter tending often to be an afterthought to theoretical expositions or fieldwork accounts. This article gives sustained attention to the method/ological corollaries of a social practice perspective. It…

  12. 76 FR 67515 - Self-Regulatory Organizations; Chicago Mercantile Exchange, Inc.; Notice of Filing and Order...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-01

    ..., determined by the Clearing House using stress test methodology equal to the theoretical two largest IRS Clearing Member losses produced by such stress test or such other methodology determined by the IRS Risk... portion, determined by the Clearing House using stress test methodology equal to the theoretical third and...

  13. Coarse-graining errors and numerical optimization using a relative entropy framework

    NASA Astrophysics Data System (ADS)

    Chaimovich, Aviel; Shell, M. Scott

    2011-03-01

    The ability to generate accurate coarse-grained models from reference fully atomic (or otherwise "first-principles") ones has become an important component in modeling the behavior of complex molecular systems with large length and time scales. We recently proposed a novel coarse-graining approach based upon variational minimization of a configuration-space functional called the relative entropy, Srel, that measures the information lost upon coarse-graining. Here, we develop a broad theoretical framework for this methodology and numerical strategies for its use in practical coarse-graining settings. In particular, we show that the relative entropy offers tight control over the errors due to coarse-graining in arbitrary microscopic properties, and suggests a systematic approach to reducing them. We also describe fundamental connections between this optimization methodology and other coarse-graining strategies like inverse Monte Carlo, force matching, energy matching, and variational mean-field theory. We suggest several new numerical approaches to its minimization that provide new coarse-graining strategies. Finally, we demonstrate the application of these theoretical considerations and algorithms to a simple, instructive system and characterize convergence and errors within the relative entropy framework.
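
    As a toy illustration of the relative entropy functional, the sketch below minimizes S_rel over a single coarse-grained parameter, using a 1D double-well reference distribution matched by a harmonic model under an identity mapping; the 1D setting and identity mapping are deliberate simplifications.

```python
import numpy as np
from scipy.optimize import minimize_scalar

x = np.linspace(-3, 3, 601)
dx = x[1] - x[0]
beta = 1.0

u_ref = (x**2 - 1.0)**2                 # double-well "all-atom" reference potential
p_ref = np.exp(-beta * u_ref)
p_ref /= p_ref.sum() * dx               # normalized reference density

def s_rel(k):
    """Relative entropy of the harmonic CG model U_cg = 0.5*k*x^2 w.r.t. p_ref."""
    q = np.exp(-beta * 0.5 * k * x**2)
    q /= q.sum() * dx
    return np.sum(p_ref * np.log(p_ref / q)) * dx

res = minimize_scalar(s_rel, bounds=(0.05, 20.0), method="bounded")
print(f"optimal spring constant k* = {res.x:.3f},  S_rel = {res.fun:.4f}")
```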

  14. A theoretical approach to artificial intelligence systems in medicine.

    PubMed

    Spyropoulos, B; Papagounos, G

    1995-10-01

    The various theoretical models of disease, the nosology which is accepted by the medical community and the prevalent logic of diagnosis determine both the medical approach and the development of the relevant technology, including the structure and function of the A.I. systems involved. A.I. systems in medicine, in addition to the specific parameters which enable them to reach a diagnostic and/or therapeutic proposal, implicitly entail theoretical assumptions and socio-cultural attitudes which prejudice the orientation and the final outcome of the procedure. The various models - causal, probabilistic, case-based, etc. - are critically examined and their ethical and methodological limitations are brought to light. The lack of a self-consistent theoretical framework in medicine, the multi-faceted character of the human organism as well as the non-explicit nature of the theoretical assumptions involved in A.I. systems restrict them to the role of decision-supporting "instruments" rather than regarding them as decision-making "devices". This supporting role and, especially, the important function which A.I. systems should have in the structure, the methods and the content of medical education underscore the need for further research into the theoretical aspects and the actual development of such systems.

  15. Translational behavioral medicine for population and individual health: gaps, opportunities, and vision for practice-based translational behavior change research.

    PubMed

    Ma, Jun; Lewis, Megan A; Smyth, Joshua M

    2018-04-12

    In this commentary, we propose a vision for "practice-based translational behavior change research," which we define as clinical and public health practice-embedded research on the implementation, optimization, and fundamental mechanisms of behavioral interventions. This vision intends to be inclusive of important research elements for behavioral intervention development, testing, and implementation. We discuss important research gaps and conceptual and methodological advances in three key areas along the discovery (development) to delivery (implementation) continuum of evidence-based interventions to improve behavior and health that could help achieve our vision of practice-based translational behavior change research. We expect our proposed vision to be refined and evolve over time. Through highlighting critical gaps that can be addressed by integrating modern theoretical and methodological approaches across disciplines in behavioral medicine, we hope to inspire the development and funding of innovative research on more potent and implementable behavior change interventions for optimal population and individual health.

  16. Broadening the Study of Participation in the Life Sciences: How Critical Theoretical and Mixed-Methodological Approaches Can Enhance Efforts to Broaden Participation

    ERIC Educational Resources Information Center

    Metcalf, Heather

    2016-01-01

    This research methods Essay details the usefulness of critical theoretical frameworks and critical mixed-methodological approaches for life sciences education research on broadening participation in the life sciences. First, I draw on multidisciplinary research to discuss critical theory and methodologies. Then, I demonstrate the benefits of these…

  17. Is It Necessary to Articulate a Research Methodology When Reporting on Theoretical Research?

    ERIC Educational Resources Information Center

    Smith, Juliana; Small, Rosalie

    2017-01-01

    In this paper the authors share their insights on whether it is necessary to articulate a research methodology when reporting on theoretical research. Initially the authors, one being a supervisor and the other, a PhD student and a colleague, were confronted with the question during supervision and writing of a thesis on theoretical research.…

  18. Environment, genes, and experience: lessons from behavior genetics.

    PubMed

    Barsky, Philipp I

    2010-11-01

    The article reviews the theoretical analysis of the problems inherent in studying the environment within behavior genetics across several periods in the development of environmental studies in behavior genetics and proposes some possible alternatives to traditional approaches to studying the environment in behavior genetics. The first period (from the end of the 1920s to the end of the 1970s), when the environment was not actually studied, is called pre-environmental; during this time, the basic principles and theoretical models of understanding environmental effects in behavior genetics were developed. The second period is characterized by the development of studies on environmental influences within the traditional behavior genetics paradigm; several approaches to studying the environment emerged in behavior genetics during this period, from the beginning of the 1980s until today. At the present time, the field is undergoing paradigmatic changes, concerned with methodology, theory, and mathematical models of genotype-environment interplay; this might be the beginning of a third period of development of environmental studies in behavior genetics. The article also discusses the methodological problems related to environmental studies in behavior genetics. Although the methodology used in differential psychology is applicable for assessment of differences between individuals, it is insufficient to explain the sources of these differences. In addition, we stress that psychoanalytic studies of twins and their experiences, initiated in the 1930s and continued episodically until the 1980s, could bring an interesting methodology and contribute to the explanation of puzzling findings from environmental studies of behavior genetics. Finally, we conclude with implications from the results of environmental studies in behavior genetics, including methodological issues. Copyright © 2010 Elsevier Ltd. All rights reserved.

  19. Robust and efficient estimation with weighted composite quantile regression

    NASA Astrophysics Data System (ADS)

    Jiang, Xuejun; Li, Jingzhi; Xia, Tian; Yan, Wanfeng

    2016-09-01

    In this paper we introduce a weighted composite quantile regression (CQR) estimation approach and study its application in nonlinear models such as exponential models and ARCH-type models. The weighted CQR is augmented by using a data-driven weighting scheme. With the error distribution unspecified, the proposed estimators share robustness from quantile regression and achieve nearly the same efficiency as the oracle maximum likelihood estimator (MLE) for a variety of error distributions including the normal, mixed-normal, Student's t, Cauchy distributions, etc. We also suggest an algorithm for the fast implementation of the proposed methodology. Simulations are carried out to compare the performance of different estimators, and the proposed approach is used to analyze the daily S&P 500 Composite index, which verifies the effectiveness and efficiency of our theoretical results.
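
    A minimal sketch of the composite quantile regression objective for a linear model, with uniform weights standing in for the paper's data-driven weighting scheme: one shared slope and quantile-specific intercepts are fitted by minimizing a weighted sum of pinball losses.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n = 200
x = rng.normal(size=n)
y = 2.0 * x + rng.standard_t(df=3, size=n)      # heavy-tailed errors

taus = np.arange(1, 10) / 10.0                  # composite quantile levels
w = np.ones_like(taus) / len(taus)              # placeholder weights (assumption)

def pinball(u, tau):
    return np.where(u >= 0, tau * u, (tau - 1) * u)

def cqr_loss(params):
    beta, intercepts = params[0], params[1:]    # shared slope, per-quantile intercepts
    loss = 0.0
    for wk, tau, b in zip(w, taus, intercepts):
        loss += wk * np.sum(pinball(y - b - beta * x, tau))
    return loss

x0 = np.concatenate(([0.0], np.quantile(y, taus)))
res = minimize(cqr_loss, x0, method="Nelder-Mead",
               options={"maxiter": 20000, "xatol": 1e-6, "fatol": 1e-6})
print("estimated slope:", res.x[0])
```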

  20. Computerized Design of Low-noise Face-milled Spiral Bevel Gears

    NASA Technical Reports Server (NTRS)

    Litvin, Faydor L.; Zhang, YI; Handschuh, Robert F.

    1994-01-01

    An advanced design methodology is proposed for the face-milled spiral bevel gears with modified tooth surface geometry that provides a reduced level of noise and has a stabilized bearing contact. The approach is based on the local synthesis of the gear drive that provides the 'best' machine-tool settings. The theoretical aspects of the local synthesis approach are based on the application of a predesigned parabolic function for absorption of undesirable transmission errors caused by misalignment and the direct relations between principal curvatures and directions for mating surfaces. The meshing and contact of the gear drive is synthesized and analyzed by a computer program. The generation of gears with the proposed geometry design can be accomplished by application of existing equipment. A numerical example that illustrates the proposed theory is presented.

  1. Computerized design of low-noise face-milled spiral bevel gears

    NASA Astrophysics Data System (ADS)

    Litvin, Faydor L.; Zhang, Yi; Handschuh, Robert F.

    1994-08-01

    An advanced design methodology is proposed for the face-milled spiral bevel gears with modified tooth surface geometry that provides a reduced level of noise and has a stabilized bearing contact. The approach is based on the local synthesis of the gear drive that provides the 'best' machine-tool settings. The theoretical aspects of the local synthesis approach are based on the application of a predesigned parabolic function for absorption of undesirable transmission errors caused by misalignment and the direct relations between principal curvatures and directions for mating surfaces. The meshing and contact of the gear drive is synthesized and analyzed by a computer program. The generation of gears with the proposed geometry design can be accomplished by application of existing equipment. A numerical example that illustrates the proposed theory is presented.

  2. Critical appraisal of rigour in interpretive phenomenological nursing research.

    PubMed

    de Witt, Lorna; Ploeg, Jenny

    2006-07-01

    This paper reports a critical review of published nursing research for expressions of rigour in interpretive phenomenology, and a new framework of rigour specific to this methodology is proposed. The rigour of interpretive phenomenology is an important nursing research methods issue that has direct implications for the legitimacy of nursing science. The use of a generic set of qualitative criteria of rigour for interpretive phenomenological studies is problematic because it is philosophically inconsistent with the methodology and creates obstacles to full expression of rigour in such studies. A critical review was conducted of the published theoretical interpretive phenomenological nursing literature from 1994 to 2004 and the expressions of rigour in this literature identified. We used three sources to inform the derivation of a proposed framework of expressions of rigour for interpretive phenomenology: the phenomenological scholar van Manen, the theoretical interpretive phenomenological nursing literature, and Madison's criteria of rigour for hermeneutic phenomenology. The nursing literature reveals a broad range of criteria for judging the rigour of interpretive phenomenological research. The proposed framework for evaluating rigour in this kind of research contains the following five expressions: balanced integration, openness, concreteness, resonance, and actualization. Balanced integration refers to the intertwining of philosophical concepts in the study methods and findings and a balance between the voices of study participants and the philosophical explanation. Openness is related to a systematic, explicit process of accounting for the multiple decisions made throughout the study process. Concreteness relates to usefulness for practice of study findings. Resonance encompasses the experiential or felt effect of reading study findings upon the reader. Finally, actualization refers to the future realization of the resonance of study findings. Adoption of this or similar frameworks of expressions of rigour could help to preserve the integrity and legitimacy of interpretive phenomenological nursing research.

  3. Restoration of a single superresolution image from several blurred, noisy, and undersampled measured images.

    PubMed

    Elad, M; Feuer, A

    1997-01-01

    The three main tools in the single image restoration theory are the maximum likelihood (ML) estimator, the maximum a posteriori probability (MAP) estimator, and the set theoretic approach using projection onto convex sets (POCS). This paper utilizes the above known tools to propose a unified methodology toward the more complicated problem of superresolution restoration. In the superresolution restoration problem, an improved resolution image is restored from several geometrically warped, blurred, noisy and downsampled measured images. The superresolution restoration problem is modeled and analyzed from the ML, the MAP, and POCS points of view, yielding a generalization of the known superresolution restoration methods. The proposed restoration approach is general but assumes explicit knowledge of the linear space- and time-variant blur, the (additive Gaussian) noise, the different measured resolutions, and the (smooth) motion characteristics. A hybrid method combining the simplicity of the ML and the incorporation of nonellipsoid constraints is presented, giving improved restoration performance, compared with the ML and the POCS approaches. The hybrid method is shown to converge to the unique optimal solution of a new definition of the optimization problem. Superresolution restoration from motionless measurements is also discussed. Simulations demonstrate the power of the proposed methodology.
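
    A hedged sketch of the maximum likelihood (least-squares) view of superresolution, using a 1D signal, integer shifts and a simple assumed blur kernel rather than the general space- and time-variant model of the paper: several warped, blurred and downsampled measurements are fused by gradient descent.

```python
import numpy as np

rng = np.random.default_rng(2)
n_hr = 128                                   # high-resolution length
t = np.linspace(0, 1, n_hr)
x_true = np.sin(2 * np.pi * 3 * t) + (t > 0.5)

def blur(v):                                 # simple 3-tap blur (assumed PSF)
    return np.convolve(v, [0.25, 0.5, 0.25], mode="same")

def forward(v, shift, factor=4):             # warp -> blur -> decimate
    return blur(np.roll(v, shift))[::factor]

def adjoint(r, shift, factor=4):             # transpose of the forward operator
    up = np.zeros(n_hr)
    up[::factor] = r
    return np.roll(blur(up), -shift)         # blur kernel is symmetric

shifts = [0, 1, 2, 3]
ys = [forward(x_true, s) + 0.01 * rng.standard_normal(n_hr // 4) for s in shifts]

x = np.zeros(n_hr)
step = 0.2
for _ in range(500):                         # ML estimate by gradient descent
    grad = sum(adjoint(forward(x, s) - y, s) for s, y in zip(shifts, ys))
    x -= step * grad
print("reconstruction RMSE:", np.sqrt(np.mean((x - x_true) ** 2)))
```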

  4. Calculating complete and exact Pareto front for multiobjective optimization: a new deterministic approach for discrete problems.

    PubMed

    Hu, Xiao-Bing; Wang, Ming; Di Paolo, Ezequiel

    2013-06-01

    Searching the Pareto front for multiobjective optimization problems usually involves the use of a population-based search algorithm or of a deterministic method with a set of different single aggregate objective functions. The results are, in fact, only approximations of the real Pareto front. In this paper, we propose a new deterministic approach capable of fully determining the real Pareto front for those discrete problems for which it is possible to construct optimization algorithms to find the k best solutions to each of the single-objective problems. To this end, two theoretical conditions are given to guarantee the finding of the actual Pareto front rather than its approximation. Then, a general methodology for designing a deterministic search procedure is proposed. A case study is conducted, where by following the general methodology, a ripple-spreading algorithm is designed to calculate the complete exact Pareto front for multiobjective route optimization. When compared with traditional Pareto front search methods, the obvious advantage of the proposed approach is its unique capability of finding the complete Pareto front. This is illustrated by the simulation results in terms of both solution quality and computational efficiency.

  5. On the use of Different Methodologies in Cognitive Neuropsychology: Drink Deep and from Several Sources

    PubMed Central

    Nickels, Lyndsey; Howard, David; Best, Wendy

    2012-01-01

    Cognitive neuropsychology has championed the use of single-case research design. Recently, however, case series designs that employ multiple single cases have been increasingly utilized to address theoretical issues using data from neuropsychological populations. In this paper, we examine these methodologies, focusing on a number of points in particular. First we discuss the use of dissociations and associations, often thought of as a defining feature of cognitive neuropsychology, and argue that they are better viewed as part of a spectrum of methods that aim to explain and predict behaviour. We also raise issues regarding case series design in particular, arguing that selection of an appropriate sample, including controlling degree of homogeneity, is critical and constrains the theoretical claims that can be made on the basis of the data. We discuss the possible interpretation of “outliers” in a case series, suggesting that while they may reflect “noise” caused by variability in performance due to factors that are not of relevance to the theoretical claims, they may also reflect the presence of patterns that are critical to test, refine, and potentially falsify our theories. The role of case series in treatment research is also raised, in light of the fact that, despite their status as gold standard, randomized controlled trials cannot provide answers to many crucial theoretical and clinical questions. Finally, we stress the importance of converging evidence: We propose that it is conclusions informed by multiple sources of evidence that are likely to best inform theory and stand the test of time. PMID:22746689

  6. Robust detection-isolation-accommodation for sensor failures

    NASA Technical Reports Server (NTRS)

    Weiss, J. L.; Pattipati, K. R.; Willsky, A. S.; Eterno, J. S.; Crawford, J. T.

    1985-01-01

    The results of a one-year study to: (1) develop a theory for Robust Failure Detection and Identification (FDI) in the presence of model uncertainty, (2) develop a design methodology which utilizes the robust FDI theory, (3) apply the methodology to a sensor FDI problem for the F-100 jet engine, and (4) demonstrate the application of the theory to the evaluation of alternative FDI schemes are presented. Theoretical results in statistical discrimination are used to evaluate the robustness of residual signals (or parity relations) in terms of their usefulness for FDI. Furthermore, optimally robust parity relations are derived through the optimization of robustness metrics. The result is viewed as decentralization of the FDI process. A general structure for decentralized FDI is proposed and robustness metrics are used for determining various parameters of the algorithm.

  7. Critical Analysis of the Mathematical Formalism of Theoretical Physics. II. Foundations of Vector Calculus

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2014-03-01

    A critical analysis of the foundations of standard vector calculus is proposed. The methodological basis of the analysis is the unity of formal logic and of rational dialectics. It is proved that the vector calculus is an incorrect theory because: (a) it is not based on a correct methodological basis - the unity of formal logic and of rational dialectics; (b) it does not contain the correct definitions of "movement," "direction" and "vector"; (c) it does not take into consideration the dimensions of physical quantities (i.e., number names, denominate numbers, concrete numbers) characterizing the concept of "physical vector," and, therefore, it has no natural-scientific meaning; (d) operations on "physical vectors" and the vector calculus propositions relating to the "physical vectors" are contrary to formal logic.

  8. Job stress and cardiovascular disease: a theoretic critical review.

    PubMed

    Kristensen, T S

    1996-07-01

    During the last 15 years, the research on job stress and cardiovascular diseases has been dominated by the job strain model developed by R. Karasek (1979) and colleagues (R. Karasek & T. Theorell, 1990). In this article the results of this research are briefly summarized, and the theoretical and methodological basis is discussed and criticized. A sociological interpretation of the model emphasizing theories of technological change, qualifications of the workers, and the organization of work is proposed. Furthermore, improvements with regard to measuring the job strain dimensions and to sampling the study base are suggested. Substantial improvements of the job strain research could be achieved if the principle of triangulation were used in the measurements of stressors, stress, and sickness and if occupation-based samples were used instead of large representative samples.

  9. A New Integrated Threshold Selection Methodology for Spatial Forecast Verification of Extreme Events

    NASA Astrophysics Data System (ADS)

    Kholodovsky, V.

    2017-12-01

    Extreme weather and climate events such as heavy precipitation, heat waves and strong winds can cause extensive damage to society in terms of human lives and financial losses. As climate changes, it is important to understand how extreme weather events may change as a result. Climate and statistical models are often independently used to model those phenomena. To better assess the performance of climate models, a variety of spatial forecast verification methods have been developed. However, spatial verification metrics that are widely used in comparing mean states, in most cases, do not have an adequate theoretical justification to benchmark extreme weather events. We propose a new integrated threshold selection methodology for spatial forecast verification of extreme events that couples existing pattern recognition indices with high threshold choices. This integrated approach has three main steps: 1) dimension reduction; 2) geometric domain mapping; and 3) thresholds clustering. We apply this approach to an observed precipitation dataset over CONUS. The results are evaluated by displaying the threshold distribution seasonally, monthly and annually. The method offers the user the flexibility of selecting a high threshold that is linked to desired geometrical properties. The proposed high-threshold methodology could either complement existing spatial verification methods, where threshold selection is arbitrary, or be directly applicable in extreme value theory.

  10. Theory and practice of clinical ethics support services: narrative and hermeneutical perspectives.

    PubMed

    Porz, Rouven; Landeweer, Elleke; Widdershoven, Guy

    2011-09-01

    In this paper we introduce narrative and hermeneutical perspectives to clinical ethics support services (CESS). We propose a threefold consideration of 'theory' and show how it is interwoven with 'practice' as we go along. First, we look at theory in its foundational role: in our case 'narrative ethics' and 'philosophical hermeneutics' provide a theoretical base for clinical ethics by focusing on human identities entangled in stories and on moral understanding as a dialogical process. Second, we consider the role of theoretical notions in helping practitioners to understand their situation in clinical ethics practice, by using notions like 'story', 'responsibility', or 'vulnerability' to make explicit and explain their practical experience. Such theoretical notions help us to interpret clinical situations from an ethical perspective and to foster moral awareness of practitioners. And, thirdly, we examine how new theoretical concepts are developed by interpreting practice, using practice to form and improve our ethical theory. In this paper, we discuss this threefold use of theory in clinical ethics support services by reflecting on our own theoretical assumptions, methodological steps and practical experiences as ethicists, and by providing examples from our daily work. In doing so, we illustrate that theory and practice are interwoven, as theoretical understanding is dependent upon practical experience, and vice-versa. © 2011 Blackwell Publishing Ltd.

  11. Methodology in Bi- and Multilingual Studies: From Simplification to Complexity

    ERIC Educational Resources Information Center

    Aronin, Larissa; Jessner, Ulrike

    2014-01-01

    Research methodology is determined by theoretical approaches. This article discusses methods of multilingualism research in connection with theoretical developments in linguistics, psycholinguistics, sociolinguistics, and education. Taking a brief glance at the past, the article starts with a discussion of an issue underlying the choice of…

  12. Exploring How Globalization Shapes Education: Methodology and Theoretical Framework

    ERIC Educational Resources Information Center

    Pan, Su-Yan

    2010-01-01

    This is a commentary on some major issues raised in Carter and Dediwalage's "Globalisation and science education: The case of "Sustainability by the bay"" (this issue), particularly their methodology and theoretical framework for understanding how globalisation shapes education (including science education). While acknowledging the authors'…

  13. Gaming Space: A Game-Theoretic Methodology for Assessing the Deterrent Value of Space Control Options

    DTIC Science & Technology

    2018-06-07

    Gaming Space: A Game-Theoretic Methodology for Assessing the Deterrent Value of Space Control Options ...in space. Adversaries have already employed non-kinetic OSC capabilities, such as Global Positioning System jammers, in recent conflicts, and they...as part of the project “Assessing the Deterrent Value of Defensive Space Control Options.” The purpose of the project was to develop a methodology

  14. A methodology to design heuristics for model selection based on the characteristics of data: Application to investigate when the Negative Binomial Lindley (NB-L) is preferred over the Negative Binomial (NB).

    PubMed

    Shirazi, Mohammadali; Dhavala, Soma Sekhar; Lord, Dominique; Geedipally, Srinivas Reddy

    2017-10-01

    Safety analysts usually use post-modeling methods, such as the Goodness-of-Fit statistics or the Likelihood Ratio Test, to decide between two or more competitive distributions or models. Such metrics require all competitive distributions to be fitted to the data before any comparisons can be accomplished. Given the continuous growth in introducing new statistical distributions, choosing the best one using such post-modeling methods is not a trivial task, in addition to all theoretical or numerical issues the analyst may face during the analysis. Furthermore, and most importantly, these measures or tests do not provide any intuition into why a specific distribution (or model) is preferred over another (Goodness-of-Logic). This paper addresses these issues by proposing a methodology to design heuristics for Model Selection based on the characteristics of data, in terms of descriptive summary statistics, before fitting the models. The proposed methodology employs two analytic tools: (1) Monte-Carlo Simulations and (2) Machine Learning Classifiers, to design easy heuristics to predict the label of the 'most-likely-true' distribution for analyzing data. The proposed methodology was applied to investigate when the recently introduced Negative Binomial Lindley (NB-L) distribution is preferred over the Negative Binomial (NB) distribution. Heuristics were designed to select the 'most-likely-true' distribution between these two distributions, given a set of prescribed summary statistics of data. The proposed heuristics were successfully compared against classical tests for several real or observed datasets. Not only are these heuristics easy to use and free of any post-modeling inputs, but they also give the analyst useful information about why the NB-L is preferred over the NB - or vice versa - when modeling data. Copyright © 2017 Elsevier Ltd. All rights reserved.
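
    The sketch below illustrates the general Monte Carlo-plus-classifier idea under strong simplifications: the heavier-tailed alternative multiplies the NB mean by a unit-mean Lindley frailty, which is a stand-in and not the exact NB-L parameterization used in the paper, and a shallow decision tree learns a selection rule from summary statistics.

```python
import numpy as np
from scipy import stats
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(3)

def lindley(theta, size):
    """Sample a Lindley(theta) variate as the usual exponential/gamma mixture."""
    use_exp = rng.random(size) < theta / (1.0 + theta)
    return np.where(use_exp, rng.exponential(1/theta, size), rng.gamma(2, 1/theta, size))

def simulate(label, n=200):
    """One synthetic dataset; returns simple summary statistics."""
    mu, phi = rng.uniform(0.5, 5.0), rng.uniform(0.3, 3.0)
    if label == 1:                                   # "NB-L-like" stand-in (assumption)
        frailty = lindley(theta=1.0, size=n)
        mu = mu * frailty / np.mean(frailty)
    y = rng.negative_binomial(phi, phi / (phi + mu), size=n)
    return [y.mean(), y.var(), stats.skew(y), np.mean(y == 0)]

X = np.array([simulate(lab) for lab in (0, 1) * 500])
labels = np.array([0, 1] * 500)

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, labels)
print(export_text(clf, feature_names=["mean", "var", "skew", "pct_zeros"]))
```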

  15. The Three Stages of Critical Policy Methodology: An Example from Curriculum Analysis

    ERIC Educational Resources Information Center

    Rata, Elizabeth

    2014-01-01

    The article identifies and discusses three stages in the critical policy methodology used in the sociology of education. These are: firstly, employing a political economy theoretical framework that identifies causal links between global forces and local developments; secondly, analysing educational policy within that theoretically conceptualised…

  16. [Systemic inflammation: theoretical and methodological approaches to description of general pathological process model. Part 3. Background for nonsyndromic approach].

    PubMed

    Gusev, E Yu; Chereshnev, V A

    2013-01-01

    Theoretical and methodological approaches to the description of systemic inflammation as a general pathological process are discussed. It is shown that a wide range of research types needs to be integrated to develop a model of systemic inflammation.

  17. Centroid and Theoretical Rotation: Justification for Their Use in Q Methodology Research

    ERIC Educational Resources Information Center

    Ramlo, Sue

    2016-01-01

    This manuscript's purpose is to introduce Q as a methodology before providing clarification about the preferred factor analytical choices of centroid and theoretical (hand) rotation. Stephenson, the creator of Q, designated that only these choices allowed for scientific exploration of subjectivity while not violating assumptions associated with…

  18. A Design Pattern for Decentralised Decision Making

    PubMed Central

    Valentini, Gabriele; Fernández-Oto, Cristian; Dorigo, Marco

    2015-01-01

    The engineering of large-scale decentralised systems requires sound methodologies to guarantee the attainment of the desired macroscopic system-level behaviour given the microscopic individual-level implementation. While a general-purpose methodology is currently out of reach, specific solutions can be given to broad classes of problems by means of well-conceived design patterns. We propose a design pattern for collective decision making grounded on experimental/theoretical studies of the nest-site selection behaviour observed in honeybee swarms (Apis mellifera). The way in which honeybee swarms arrive at consensus is fairly well-understood at the macroscopic level. We provide formal guidelines for the microscopic implementation of collective decisions to quantitatively match the macroscopic predictions. We discuss implementation strategies based on both homogeneous and heterogeneous multiagent systems, and we provide means to deal with spatial and topological factors that have a bearing on the micro-macro link. Finally, we exploit the design pattern in two case studies that showcase the viability of the approach. Besides engineering, such a design pattern can prove useful for a deeper understanding of decision making in natural systems thanks to the inclusion of individual heterogeneities and spatial factors, which are often disregarded in theoretical modelling. PMID:26496359

  19. Coarse-graining errors and numerical optimization using a relative entropy framework.

    PubMed

    Chaimovich, Aviel; Shell, M Scott

    2011-03-07

    The ability to generate accurate coarse-grained models from reference fully atomic (or otherwise "first-principles") ones has become an important component in modeling the behavior of complex molecular systems with large length and time scales. We recently proposed a novel coarse-graining approach based upon variational minimization of a configuration-space functional called the relative entropy, S(rel), that measures the information lost upon coarse-graining. Here, we develop a broad theoretical framework for this methodology and numerical strategies for its use in practical coarse-graining settings. In particular, we show that the relative entropy offers tight control over the errors due to coarse-graining in arbitrary microscopic properties, and suggests a systematic approach to reducing them. We also describe fundamental connections between this optimization methodology and other coarse-graining strategies like inverse Monte Carlo, force matching, energy matching, and variational mean-field theory. We suggest several new numerical approaches to its minimization that provide new coarse-graining strategies. Finally, we demonstrate the application of these theoretical considerations and algorithms to a simple, instructive system and characterize convergence and errors within the relative entropy framework. © 2011 American Institute of Physics.

  20. Assessing EEG sleep spindle propagation. Part 1: theory and proposed methodology.

    PubMed

    O'Reilly, Christian; Nielsen, Tore

    2014-01-15

    A convergence of studies has revealed sleep spindles to be associated with sleep-related cognitive processing and even with fundamental waking state capacities such as intelligence. However, some spindle characteristics, such as propagation direction and delay, may play a decisive role but are only infrequently investigated because of technical complexities. A new methodology for assessing sleep spindle propagation over the human scalp using noninvasive electroencephalography (EEG) is described. This approach is based on the alignment of time-frequency representations of spindle activity across recording channels. This first of a two-part series concentrates on framing theoretical considerations related to EEG spindle propagation and on detailing the methodology. A short example application is provided that illustrates the repeatability of results obtained with the new propagation measure in a sample of 32 night recordings. A more comprehensive experimental investigation is presented in part two of the series. Compared to existing methods, this approach is particularly well adapted for studying the propagation of sleep spindles because it estimates time delays rather than phase synchrony and it computes propagation properties for every individual spindle with windows adjusted to the specific spindle duration. The proposed methodology is effective in tracking the propagation of spindles across the scalp and may thus help in elucidating the temporal aspects of sleep spindle dynamics, as well as other transient EEG and MEG events. A software implementation (the Spyndle Python package) is provided as open source software. Copyright © 2013 Elsevier B.V. All rights reserved.
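
    A simplified stand-in for the alignment idea, using synthetic two-channel data: the sigma-band power time courses of the two channels are cross-correlated and the lag of the peak is read off as the propagation delay. This is not the paper's exact time-frequency alignment procedure.

```python
import numpy as np
from scipy.signal import spectrogram

fs = 256.0
t = np.arange(0, 4, 1 / fs)
envelope = np.exp(-0.5 * ((t - 2.0) / 0.25) ** 2)        # ~1 s spindle-like burst
spindle = envelope * np.sin(2 * np.pi * 13 * t)

true_delay = 0.05                                         # 50 ms (assumption)
ch1 = spindle + 0.1 * np.random.default_rng(4).standard_normal(t.size)
ch2 = np.interp(t - true_delay, t, spindle) + 0.1 * np.random.default_rng(5).standard_normal(t.size)

def sigma_power(x):
    """Mean spectrogram power in the 11-16 Hz sigma band."""
    f, tt, S = spectrogram(x, fs=fs, nperseg=64, noverlap=63)
    band = (f >= 11) & (f <= 16)
    return tt, S[band].mean(axis=0)

tt, p1 = sigma_power(ch1)
_, p2 = sigma_power(ch2)

lags = np.arange(-len(p1) + 1, len(p1)) * (tt[1] - tt[0])
xcorr = np.correlate(p2 - p2.mean(), p1 - p1.mean(), mode="full")
print(f"estimated delay = {lags[np.argmax(xcorr)] * 1000:.1f} ms (true: 50 ms)")
```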

  1. Synthesis of discipline-based education research in physics

    NASA Astrophysics Data System (ADS)

    Docktor, Jennifer L.; Mestre, José P.

    2014-12-01

    This paper presents a comprehensive synthesis of physics education research at the undergraduate level. It is based on work originally commissioned by the National Academies. Six topical areas are covered: (1) conceptual understanding, (2) problem solving, (3) curriculum and instruction, (4) assessment, (5) cognitive psychology, and (6) attitudes and beliefs about teaching and learning. Each topical section includes sample research questions, theoretical frameworks, common research methodologies, a summary of key findings, strengths and limitations of the research, and areas for future study. Supplemental material proposes promising future directions in physics education research.

  2. Network representation of protein interactions: Theory of graph description and analysis.

    PubMed

    Kurzbach, Dennis

    2016-09-01

    A methodological framework is presented for the graph theoretical interpretation of NMR data of protein interactions. The proposed analysis generalizes the idea of network representations of protein structures by expanding it to protein interactions. This approach is based on regularization of residue-resolved NMR relaxation times and chemical shift data and subsequent construction of an adjacency matrix that represents the underlying protein interaction as a graph or network. The network nodes represent protein residues. Two nodes are connected if two residues are functionally correlated during the protein interaction event. The analysis of the resulting network enables the quantification of the importance of each amino acid of a protein for its interactions. Furthermore, the determination of the pattern of correlations between residues yields insights into the functional architecture of an interaction. This is of special interest for intrinsically disordered proteins, since the structural (three-dimensional) architecture of these proteins and their complexes is difficult to determine. The power of the proposed methodology is demonstrated using the example of the interaction between the intrinsically disordered protein osteopontin and its natural ligand heparin. © 2016 The Protein Society.
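
    A minimal sketch of the graph construction and analysis step, with random data and an arbitrary correlation threshold standing in for the regularized NMR observables: residues become nodes, thresholded pairwise correlations become edges, and a centrality measure ranks residues by their importance for the interaction.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(8)
n_res = 30
data = rng.normal(size=(50, n_res))                  # 50 "conditions" x residues (synthetic)
data[:, :10] += rng.normal(size=(50, 1))             # make the first 10 residues correlated

corr = np.corrcoef(data, rowvar=False)
adj = (np.abs(corr) > 0.5).astype(int)               # adjacency by thresholding (assumption)
np.fill_diagonal(adj, 0)

G = nx.from_numpy_array(adj)                         # nodes = residues, edges = correlations
centrality = nx.betweenness_centrality(G)
ranked = sorted(centrality, key=centrality.get, reverse=True)[:5]
print("most central residues:", ranked)
```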

  3. A Novel Information-Theoretic Approach for Variable Clustering and Predictive Modeling Using Dirichlet Process Mixtures

    PubMed Central

    Chen, Yun; Yang, Hui

    2016-01-01

    In the era of big data, there is increasing interest in clustering variables to minimize data redundancy and maximize variable relevancy. Existing clustering methods, however, depend on nontrivial assumptions about the data structure. Note that nonlinear interdependence among variables poses significant challenges to the traditional framework of predictive modeling. In the present work, we reformulate the problem of variable clustering from an information theoretic perspective that does not require the assumption of data structure for the identification of nonlinear interdependence among variables. Specifically, we propose the use of mutual information to characterize and measure nonlinear correlation structures among variables. Further, we develop Dirichlet process (DP) models to cluster variables based on the mutual-information measures among variables. Finally, orthonormalized variables in each cluster are integrated with group elastic-net model to improve the performance of predictive modeling. Both simulation and real-world case studies showed that the proposed methodology not only effectively reveals the nonlinear interdependence structures among variables but also outperforms traditional variable clustering algorithms such as hierarchical clustering. PMID:27966581
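
    A sketch of the first stage only (the pairwise mutual-information matrix), estimated here with scikit-learn's nearest-neighbour MI estimator on synthetic variables; the Dirichlet process clustering and group elastic-net stages of the paper are not reproduced.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(7)
n = 500
z1, z2 = rng.normal(size=n), rng.normal(size=n)
X = np.column_stack([z1, np.sin(z1) + 0.1 * rng.normal(size=n),   # nonlinearly related pair
                     z2, 2 * z2 + 0.1 * rng.normal(size=n),        # linearly related pair
                     rng.normal(size=n)])                          # independent variable

p = X.shape[1]
mi = np.zeros((p, p))
for j in range(p):
    mi[:, j] = mutual_info_regression(X, X[:, j], random_state=0)
mi = (mi + mi.T) / 2                      # symmetrize the estimate
print(np.round(mi, 2))
```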

  4. A Novel Information-Theoretic Approach for Variable Clustering and Predictive Modeling Using Dirichlet Process Mixtures.

    PubMed

    Chen, Yun; Yang, Hui

    2016-12-14

    In the era of big data, there is increasing interest in clustering variables to minimize data redundancy and maximize variable relevancy. Existing clustering methods, however, depend on nontrivial assumptions about the data structure. Note that nonlinear interdependence among variables poses significant challenges to the traditional framework of predictive modeling. In the present work, we reformulate the problem of variable clustering from an information theoretic perspective that does not require the assumption of data structure for the identification of nonlinear interdependence among variables. Specifically, we propose the use of mutual information to characterize and measure nonlinear correlation structures among variables. Further, we develop Dirichlet process (DP) models to cluster variables based on the mutual-information measures among variables. Finally, orthonormalized variables in each cluster are integrated with group elastic-net model to improve the performance of predictive modeling. Both simulation and real-world case studies showed that the proposed methodology not only effectively reveals the nonlinear interdependence structures among variables but also outperforms traditional variable clustering algorithms such as hierarchical clustering.

  5. Vibration attenuations induced by periodic arrays of piezoelectric patches connected by enhanced resonant shunting circuits

    NASA Astrophysics Data System (ADS)

    Wang, Gang; Wang, Jianwei; Chen, Shengbing; Wen, Jihong

    2011-12-01

    Periodic arrays of piezoelectric patches connected by enhanced resonant shunting circuits are attached to a slender beam to control the propagation of vibration. Numerical models based on the transfer matrix methodology are constructed to predict the band structure, attenuation factors and the transmission of vibration in the proposed smart structure. The vibration attenuations of the proposed smart structure and that with the passive resonant shunting circuits are compared in order to verify the efficiency of the enhanced resonant shunting circuits. Vibration experiments are conducted in order to validate the theoretical predictions. The specimen with a combination of different types of resonant shunting circuits is also studied in order to gain wider attenuation frequency ranges.
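
    The band-structure logic can be illustrated with a much simpler analogue than the shunted piezoelectric beam: a 1D mass-spring chain with a local resonator in every unit cell, where the transfer-matrix (Bloch) condition |trace(T)/2| <= 1 marks pass bands and arccosh(|trace/2|) gives the attenuation per cell. All parameter values below are assumptions.

```python
import numpy as np

m, k = 1.0, 100.0            # host mass and spring (assumed values)
mr, kr = 0.3, 30.0           # attached local resonator (assumed values)
w_r = np.sqrt(kr / mr)       # resonator tuning frequency

omega = np.linspace(0.1, 30.0, 600)
m_eff = m + mr * w_r**2 / (w_r**2 - omega**2)        # dynamic effective mass
half_trace = 1.0 - m_eff * omega**2 / (2.0 * k)      # = cos(q) inside pass bands

# Attenuation per unit cell; zero wherever |half_trace| <= 1 (propagating band).
attenuation = np.arccosh(np.maximum(np.abs(half_trace), 1.0))

for w, att in zip(omega[::60], attenuation[::60]):
    print(f"omega = {w:5.1f} rad/s   attenuation/cell = {att:.3f}")
```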

  6. Subcritical transition to turbulence: What we can learn from the physics of glasses.

    PubMed

    Dauchot, Olivier; Bertin, Eric

    2012-09-01

    In this note, we discuss possible analogies between the subcritical transition to turbulence in shear flows and the glass transition in supercooled liquids. We briefly review recent experimental and numerical results, as well as theoretical proposals, and compare the difficulties arising in assessing the divergence of the turbulence lifetime in subcritical shear flow with that encountered for the relaxation time in the study of the glass transition. In order to go beyond the purely methodological similarities, we further elaborate on this analogy and propose a simple model for the transition to turbulence, inspired by the random energy model (a standard model for the glass transition), with the aim to possibly foster yet-unexplored directions of research in subcritical shear flows.

  7. Development of a methodology for strategic environmental assessment: application to the assessment of golf course installation policy in Taiwan.

    PubMed

    Chen, Ching-Ho; Wu, Ray-Shyan; Liu, Wei-Lin; Su, Wen-Ray; Chang, Yu-Min

    2009-01-01

    Some countries, including Taiwan, have adopted strategic environmental assessment (SEA) to assess and modify proposed policies, plans, and programs (PPPs) in the planning phase for pursuing sustainable development. However, the Taiwanese system contained only sketchy steps focusing on policy assessment. This study aims to develop a methodology for SEA in Taiwan to enhance the effectiveness associated with PPPs. The proposed methodology comprises an SEA procedure involving PPP management and assessment in various phases, a sustainable assessment framework, and an SEA management system. The SEA procedure is devised based on theoretical considerations from systems thinking and the regulatory requirements in Taiwan. The positive and negative impacts on ecology, society, and economy are simultaneously considered in the planning (including policy generation and evaluation), implementation, and control phases of the procedure. This study used the analytic hierarchy process, Delphi technique, and systems analysis to develop a sustainable assessment framework. An SEA management system was built based on geographic information system software to process spatial, attribute, and satellite image data during the assessment procedure. The proposed methodology was applied in the SEA of golf course installation policy in 2001 as a case study, which was the first SEA in Taiwan. Most of the 82 existing golf courses in 2001 were installed on slope lands and caused a serious ecological impact. Assessment results indicated that 15 future golf courses installed on marginal lands (including buffer zones, remedied lands, and wastelands) were acceptable because the comprehensive environmental (ecological, social, and economic) assessment value was better based on environmental characteristics and management regulations of Taiwan. The SEA procedure in the planning phase for this policy was completed but the implementation phase of this policy did not begin because the related legislative procedure could not be arranged owing to a few senators' resistance. A self-review of the control phase was carried out in 2006 using this methodology. Installation permits for 12 courses on slope lands were terminated after 2001 and then 27 future courses could be installed on marginal lands. The assessment value of this policy using the data on ecological, social, and economic conditions from 2006 was higher than that using the data from 2001. The analytical results illustrate that the proposed methodology can be used to effectively and efficiently assist the related authorities with SEA.
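
    The analytic hierarchy process step of such a framework reduces to an eigenvector computation; the sketch below uses an illustrative 3x3 pairwise comparison matrix over ecological, social, and economic criteria, not judgments from the study.

```python
import numpy as np

A = np.array([[1.0, 3.0, 5.0],       # ecological vs social vs economic (assumed judgments)
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
i = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, i].real)
w /= w.sum()                                  # normalized criterion weights

n = A.shape[0]
ci = (eigvals[i].real - n) / (n - 1)          # consistency index
cr = ci / 0.58                                # Saaty's random index for n = 3 is 0.58
print("weights (ecol, soc, econ):", np.round(w, 3), " consistency ratio:", round(cr, 3))
```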

  8. Defining and Measuring Engagement and Learning in Science: Conceptual, Theoretical, Methodological, and Analytical Issues

    ERIC Educational Resources Information Center

    Azevedo, Roger

    2015-01-01

    Engagement is one of the most widely misused and overgeneralized constructs found in the educational, learning, instructional, and psychological sciences. The articles in this special issue represent a wide range of traditions and highlight several key conceptual, theoretical, methodological, and analytical issues related to defining and measuring…

  9. Researching Education Policy in a Globalized World: Theoretical and Methodological Considerations

    ERIC Educational Resources Information Center

    Lingard, Bob

    2009-01-01

    This paper shows how globalization has given rise to a number of new theoretical and methodological issues for doing education policy analysis linked to globalization's impact within critical social science. Critical policy analysis has always required critical "reflexivity" and awareness of the "positionality" of the policy analyst. However, as…

  10. Speaking Back to the Deficit Discourses: A Theoretical and Methodological Approach

    ERIC Educational Resources Information Center

    Hogarth, Melitta

    2017-01-01

    The educational attainment of Aboriginal and Torres Strait Islander students is often presented within a deficit view. The need for Aboriginal and Torres Strait Islander researchers to challenge the societal norms is necessary to contribute to the struggle for self-determination. This paper presents a theoretical and methodological approach that…

  11. Neuroethics and animals: methods and philosophy.

    PubMed

    Takala, Tuija; Häyry, Matti

    2014-04-01

    This article provides an overview of the six other contributions in the Neuroethics and Animals special section. In addition, it discusses the methodological and theoretical problems of interdisciplinary fields. The article suggests that interdisciplinary approaches without established methodological and theoretical bases are difficult to assess scientifically. This might cause these fields to expand without actually advancing.

  12. A methodology to model causal relationships on offshore safety assessment focusing on human and organizational factors.

    PubMed

    Ren, J; Jenkinson, I; Wang, J; Xu, D L; Yang, J B

    2008-01-01

    Focusing on people and organizations, this paper aims to contribute to offshore safety assessment by proposing a methodology to model causal relationships. The methodology is proposed in a general sense, so that it is capable of accommodating the modeling of multiple risk factors considered in offshore operations and of dealing with different types of data that may come from different sources. Reason's "Swiss cheese" model is used to form a generic offshore safety assessment framework, and Bayesian Network (BN) is tailored to fit into the framework to construct a causal relationship model. The proposed framework uses a five-level-structure model to address latent failures within the causal sequence of events. The five levels include the Root causes level, Trigger events level, Incidents level, Accidents level, and Consequences level. To analyze and model the safety of a specified offshore installation, a BN model was established following the guideline of the proposed five-level framework. A range of events was specified, and the related prior and conditional probabilities regarding the BN model were assigned based on the inherent characteristics of each event. This paper shows that Reason's "Swiss cheese" model and BN can be jointly used in offshore safety assessment. On the one hand, the five-level conceptual model is enhanced by BNs that are capable of providing a graphical demonstration of inter-relationships as well as calculating numerical values of occurrence likelihood for each failure event. The Bayesian inference mechanism also makes it possible to monitor how a safety situation changes as information flows forwards and backwards within the networks. On the other hand, BN modeling relies heavily on experts' personal experiences and is therefore highly domain specific. The "Swiss cheese" model, in contrast, is a theoretical framework based on solid behavioral theory and can therefore be used to provide industry with a roadmap for BN modeling and its implications. A case study of the collision risk between a Floating Production, Storage and Offloading (FPSO) unit and authorized vessels caused by human and organizational factors (HOFs) during operations is used to illustrate an industrial application of the proposed methodology.
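
    A minimal numerical sketch of the five-level causal chain (root cause, trigger event, incident, accident, consequence) with binary nodes and placeholder conditional probability tables, not values from the paper; full BN software would add backward (diagnostic) inference on top of this forward propagation.

```python
import numpy as np

p_root = np.array([0.9, 0.1])                     # P(root cause = no, yes), placeholder

# Conditional probability tables P(child = yes | parent = no, yes); placeholder numbers.
p_trigger_given_root  = np.array([0.05, 0.40])
p_incident_given_trig = np.array([0.02, 0.30])
p_accident_given_inc  = np.array([0.01, 0.25])
p_conseq_given_acc    = np.array([0.00, 0.60])

def propagate(p_parent_yes, cpt):
    """Marginal P(child = yes) from P(parent = yes) and a 2-entry CPT."""
    return (1 - p_parent_yes) * cpt[0] + p_parent_yes * cpt[1]

p = p_root[1]
for cpt in (p_trigger_given_root, p_incident_given_trig,
            p_accident_given_inc, p_conseq_given_acc):
    p = propagate(p, cpt)
print(f"marginal probability of a consequence: {p:.4f}")
```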

  13. A manufacturing error measurement methodology for a rotary vector reducer cycloidal gear based on a gear measuring center

    NASA Astrophysics Data System (ADS)

    Li, Tianxing; Zhou, Junxiang; Deng, Xiaozhong; Li, Jubo; Xing, Chunrong; Su, Jianxin; Wang, Huiliang

    2018-07-01

    A manufacturing error of a cycloidal gear is the key factor affecting the transmission accuracy of a robot rotary vector (RV) reducer. A methodology is proposed to realize the digitized measurement and data processing of the cycloidal gear manufacturing error based on the gear measuring center, which can quickly and accurately measure and evaluate the manufacturing error of the cycloidal gear by using both the whole tooth profile measurement and a single tooth profile measurement. By analyzing the particularity of the cycloidal profile and its effect on the actual meshing characteristics of the RV transmission, the cycloid profile measurement strategy is planned, and the theoretical profile model and error measurement model of cycloid-pin gear transmission are established. Through the digital processing technology, the theoretical trajectory of the probe and the normal vector of the measured point are calculated. By means of precision measurement principle and error compensation theory, a mathematical model for the accurate calculation and data processing of manufacturing error is constructed, and the actual manufacturing error of the cycloidal gear is obtained by the optimization iterative solution. Finally, the measurement experiment of the cycloidal gear tooth profile is carried out on the gear measuring center and the HEXAGON coordinate measuring machine, respectively. The measurement results verify the correctness and validity of the measurement theory and method. This methodology will provide the basis for the accurate evaluation and the effective control of manufacturing precision of the cycloidal gear in a robot RV reducer.
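
    The probe-path step can be sketched for any parametric profile: unit normals follow from the tangent, and the theoretical probe-centre trajectory is the profile offset by the probe radius along the normal. The curve and radius below are illustrative placeholders, not the exact RV cycloid-pin geometry.

```python
import numpy as np

R, r, e = 40.0, 4.0, 1.5          # illustrative geometry parameters [mm] (assumed)
probe_radius = 1.0                # assumed probe tip radius [mm]
t = np.linspace(0, 2 * np.pi, 720)

# Parametric profile (epitrochoid-like placeholder, not the exact RV profile).
x = (R + r) * np.cos(t) - e * np.cos((R + r) / r * t)
y = (R + r) * np.sin(t) - e * np.sin((R + r) / r * t)

dx, dy = np.gradient(x, t), np.gradient(y, t)     # numerical tangent
norm = np.hypot(dx, dy)
nx, ny = dy / norm, -dx / norm                    # unit normal (tangent rotated by -90 deg)

probe_x = x + probe_radius * nx                   # theoretical probe-centre path
probe_y = y + probe_radius * ny
print("first probe-centre point:", probe_x[0], probe_y[0])
```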

  14. Sharing methodology: a worked example of theoretical integration with qualitative data to clarify practical understanding of learning and generate new theoretical development.

    PubMed

    Yardley, Sarah; Brosnan, Caragh; Richardson, Jane

    2013-01-01

    Theoretical integration is a necessary element of study design if clarification of experiential learning is to be achieved. There are few published examples demonstrating how this can be achieved. This methodological article provides a worked example of research methodology that achieved clarification of authentic early experiences (AEEs) through a bi-directional approach to theory and data. Bi-directional refers to our simultaneous use of theory to guide and interrogate empirical data and the use of empirical data to refine theory. We explain the five steps of our methodological approach: (1) understanding the context; (2) critique on existing applications of socio-cultural models to inform study design; (3) data generation; (4) analysis and interpretation and (5) theoretical development through a novel application of Metis. These steps resulted in understanding of how and why different outcomes arose from students participating in AEE. Our approach offers a mechanism for clarification without which evidence-based effective ways to maximise constructive learning cannot be developed. In our example it also contributed to greater theoretical understanding of the influence of social interactions. By sharing this example of research undertaken to develop both theory and educational practice we hope to assist others seeking to conduct similar research.

  15. Review of health information technology usability study methodologies

    PubMed Central

    Bakken, Suzanne

    2011-01-01

    Usability factors are a major obstacle to health information technology (IT) adoption. The purpose of this paper is to review and categorize health IT usability study methods and to provide practical guidance on health IT usability evaluation. 2025 references were initially retrieved from the Medline database from 2003 to 2009 that evaluated health IT used by clinicians. Titles and abstracts were first reviewed for inclusion. Full-text articles were then examined to identify final eligibility studies. 629 studies were categorized into the five stages of an integrated usability specification and evaluation framework that was based on a usability model and the system development life cycle (SDLC)-associated stages of evaluation. Theoretical and methodological aspects of 319 studies were extracted in greater detail and studies that focused on system validation (SDLC stage 2) were not assessed further. The number of studies by stage was: stage 1, task-based or user–task interaction, n=42; stage 2, system–task interaction, n=310; stage 3, user–task–system interaction, n=69; stage 4, user–task–system–environment interaction, n=54; and stage 5, user–task–system–environment interaction in routine use, n=199. The studies applied a variety of quantitative and qualitative approaches. Methodological issues included lack of theoretical framework/model, lack of details regarding qualitative study approaches, single evaluation focus, environmental factors not evaluated in the early stages, and guideline adherence as the primary outcome for decision support system evaluations. Based on the findings, a three-level stratified view of health IT usability evaluation is proposed and methodological guidance is offered based upon the type of interaction that is of primary interest in the evaluation. PMID:21828224

  16. Systematic review of the psychometric properties and theoretical grounding of instruments evaluating self-care in people with type 2 diabetes mellitus.

    PubMed

    Caro-Bautista, Jorge; Martín-Santos, Francisco Javier; Morales-Asencio, Jose Miguel

    2014-06-01

    To determine the psychometric properties and theoretical grounding of instruments that evaluate self-care behaviour or barriers in people with type 2 diabetes. There are many instruments designed to evaluate self-care behaviour or barriers in this population, but knowledge about their psychometric validation processes is lacking. Systematic review. We conducted a search for psychometric or validation studies published between January 1990 and December 2012. We carried out searches in Pubmed, CINAHL, PsycINFO, ProQuolid, BibliPRO and Google SCHOLAR to identify instruments that evaluated self-care behaviours or barriers to diabetes self-care. We conducted a systematic review with the following inclusion criteria: Psychometric or clinimetric validation studies that included patients with type 2 diabetes (exclusively or partially) and which analysed self-care behaviour or barriers to self-care and proxies like self-efficacy or empowerment, from a multidimensional approach. Language: Spanish or English. Two authors independently assessed the quality of the studies and extracted data using Terwee's proposed criteria: psychometric properties, dimensionality, theoretical grounding and population used for validation for each included instrument. Sixteen instruments met the inclusion criteria for the review. We detected important methodological flaws in many of the selected instruments. Only the Self-management Profile for Type 2 Diabetes and Problem Areas in Diabetes Scale met half of Terwee's quality criteria. There are no instruments for identifying self-care behaviours or barriers elaborated with a strong validation process. Further research should be carried out to provide patients, clinicians and researchers with valid and reliable instruments that are methodologically solid and theoretically grounded. © 2013 John Wiley & Sons Ltd.

  17. From Determinism and Probability to Chaos: Chaotic Evolution towards Philosophy and Methodology of Chaotic Optimization

    PubMed Central

    2015-01-01

    We present and discuss the philosophy and methodology of chaotic evolution, which is theoretically supported by chaos theory. We introduce four chaotic systems, namely the logistic map, tent map, Gaussian map, and Hénon map, into a well-designed chaotic evolution algorithm framework to implement several chaotic evolution (CE) algorithms. By comparing our previously proposed CE algorithm with the logistic map against two canonical differential evolution (DE) algorithms, we analyse and discuss the optimization performance of the CE algorithm. An investigation of the relationship between the optimization capability of the CE algorithm and the distribution characteristics of the chaotic system is conducted and analysed. From the evaluation results, we find that the distribution of the chaotic system is an essential factor influencing the optimization performance of the CE algorithm. We propose a new interactive EC (IEC) algorithm, interactive chaotic evolution (ICE), which replaces the fitness function with a real human in the CE algorithm framework. A paired comparison-based mechanism naturally underlies the CE search scheme. A simulation experimental evaluation is conducted with a pseudo-IEC user to evaluate the proposed ICE algorithm. The evaluation results indicate that the ICE algorithm obtains significantly better performance than, or the same performance as, interactive DE. Some open topics on CE, ICE, the fusion of these optimization techniques, algorithmic notation, and other issues are presented and discussed. PMID:25879067

  18. From determinism and probability to chaos: chaotic evolution towards philosophy and methodology of chaotic optimization.

    PubMed

    Pei, Yan

    2015-01-01

    We present and discuss the philosophy and methodology of chaotic evolution, which is theoretically supported by chaos theory. We introduce four chaotic systems, namely the logistic map, tent map, Gaussian map, and Hénon map, into a well-designed chaotic evolution algorithm framework to implement several chaotic evolution (CE) algorithms. By comparing our previously proposed CE algorithm with the logistic map against two canonical differential evolution (DE) algorithms, we analyse and discuss the optimization performance of the CE algorithm. An investigation of the relationship between the optimization capability of the CE algorithm and the distribution characteristics of the chaotic system is conducted and analysed. From the evaluation results, we find that the distribution of the chaotic system is an essential factor influencing the optimization performance of the CE algorithm. We propose a new interactive EC (IEC) algorithm, interactive chaotic evolution (ICE), which replaces the fitness function with a real human in the CE algorithm framework. A paired comparison-based mechanism naturally underlies the CE search scheme. A simulation experimental evaluation is conducted with a pseudo-IEC user to evaluate the proposed ICE algorithm. The evaluation results indicate that the ICE algorithm obtains significantly better performance than, or the same performance as, interactive DE. Some open topics on CE, ICE, the fusion of these optimization techniques, algorithmic notation, and other issues are presented and discussed.
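
    For readers unfamiliar with the maps named in the abstract, the sketch below shows, under stated assumptions, how a chaotic sequence can replace a pseudo-random generator in a simple greedy search. It is only an illustration in Python with a hypothetical objective function (the sphere function); it is not the authors' CE algorithm, which couples the maps to a full evolutionary framework.

        import numpy as np

        def sphere(x):
            """Hypothetical objective for illustration: minimize the sum of squares."""
            return float(np.sum(x**2))

        def logistic_map(z, mu=4.0):
            """One iteration of the logistic map; mu=4 gives fully chaotic behaviour on (0, 1)."""
            return mu * z * (1.0 - z)

        def chaotic_search(dim=5, iterations=2000, step=0.5, seed=0.37):
            """Toy chaotic-evolution-style search: perturb the best solution with a
            chaotic sequence instead of a pseudo-random number generator."""
            best = np.ones(dim)                  # arbitrary starting point
            best_f = sphere(best)
            z = seed                             # chaotic state in (0, 1)
            for _ in range(iterations):
                zs = []
                for _ in range(dim):             # one chaotic value per dimension, mapped to [-1, 1]
                    z = logistic_map(z)
                    zs.append(2.0 * z - 1.0)
                candidate = best + step * np.array(zs)
                f = sphere(candidate)
                if f < best_f:                   # greedy acceptance, as in simple evolution strategies
                    best, best_f = candidate, f
            return best, best_f

        if __name__ == "__main__":
            x, fx = chaotic_search()
            print("best objective found:", fx)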

  19. "It's the Method, Stupid." Interrelations between Methodological and Theoretical Advances: The Example of Comparing Higher Education Systems Internationally

    ERIC Educational Resources Information Center

    Hoelscher, Michael

    2017-01-01

    This article argues that strong interrelations between methodological and theoretical advances exist. Progress in, especially comparative, methods may have important impacts on theory evaluation. By using the example of the "Varieties of Capitalism" approach and an international comparison of higher education systems, it can be shown…

  20. Image denoising in mixed Poisson-Gaussian noise.

    PubMed

    Luisier, Florian; Blu, Thierry; Unser, Michael

    2011-03-01

    We propose a general methodology (PURE-LET) to design and optimize a wide class of transform-domain thresholding algorithms for denoising images corrupted by mixed Poisson-Gaussian noise. We express the denoising process as a linear expansion of thresholds (LET) that we optimize by relying on a purely data-adaptive unbiased estimate of the mean-squared error (MSE), derived in a non-Bayesian framework (PURE: Poisson-Gaussian unbiased risk estimate). We provide a practical approximation of this theoretical MSE estimate for the tractable optimization of arbitrary transform-domain thresholding. We then propose a pointwise estimator for undecimated filterbank transforms, which consists of subband-adaptive thresholding functions with signal-dependent thresholds that are globally optimized in the image domain. We finally demonstrate the potential of the proposed approach through extensive comparisons with state-of-the-art techniques that are specifically tailored to the estimation of Poisson intensities. We also present denoising results obtained on real images of low-count fluorescence microscopy.
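
    PURE-LET itself optimizes a linear expansion of transform-domain thresholds against a Poisson-Gaussian unbiased risk estimate; the sketch below only illustrates the mixed noise model and a plain soft-threshold DCT denoiser on a 1-D toy signal. The function names, signal, and hand-picked threshold are assumptions for illustration, not the authors' estimator.

        import numpy as np
        from scipy.fft import dct, idct

        def add_mixed_noise(signal, gain=1.0, sigma=1.0, rng=None):
            """Simulate mixed Poisson-Gaussian noise: Poisson photon counts scaled by a
            detector gain, plus additive Gaussian read-out noise."""
            rng = np.random.default_rng(rng)
            return gain * rng.poisson(signal / gain) + rng.normal(0.0, sigma, signal.shape)

        def soft_threshold_denoise(noisy, threshold):
            """Naive transform-domain denoiser: soft-threshold DCT coefficients.
            (PURE-LET would instead optimize a linear expansion of such thresholds
            against an unbiased estimate of the MSE.)"""
            coeffs = dct(noisy, norm="ortho")
            shrunk = np.sign(coeffs) * np.maximum(np.abs(coeffs) - threshold, 0.0)
            return idct(shrunk, norm="ortho")

        if __name__ == "__main__":
            t = np.linspace(0, 1, 512)
            clean = 20.0 * (1.0 + np.sin(2 * np.pi * 3 * t))        # non-negative toy intensity
            noisy = add_mixed_noise(clean, gain=1.0, sigma=1.0, rng=0)
            denoised = soft_threshold_denoise(noisy, threshold=15.0)  # threshold chosen by hand
            print("MSE noisy   :", np.mean((noisy - clean) ** 2))
            print("MSE denoised:", np.mean((denoised - clean) ** 2))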

  1. Ensemble learning in fixed expansion layer networks for mitigating catastrophic forgetting.

    PubMed

    Coop, Robert; Mishtal, Aaron; Arel, Itamar

    2013-10-01

    Catastrophic forgetting is a well-studied attribute of most parameterized supervised learning systems. A variation of this phenomenon, in the context of feedforward neural networks, arises when nonstationary inputs lead to loss of previously learned mappings. The majority of the schemes proposed in the literature for mitigating catastrophic forgetting were not data driven and did not scale well. We introduce the fixed expansion layer (FEL) feedforward neural network, which embeds a sparsely encoding hidden layer to help mitigate forgetting of prior learned representations. In addition, we investigate a novel framework for training ensembles of FEL networks, based on exploiting an information-theoretic measure of diversity between FEL learners, to further control undesired plasticity. The proposed methodology is demonstrated on a basic classification task, clearly emphasizing its advantages over existing techniques. The architecture proposed can be enhanced to address a range of computational intelligence tasks, such as regression problems and system control.
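
    A minimal sketch of the general idea of a fixed, sparsely encoding expansion layer is given below, assuming a frozen random projection, a k-winners-take-all nonlinearity, and a normalised-LMS read-out. The class name and all parameter values are hypothetical; this is not the authors' FEL architecture or their ensemble training scheme.

        import numpy as np

        class FixedSparseExpansion:
            """Toy network with a frozen random expansion layer, k-winners-take-all
            sparsification, and a trainable linear read-out. Because each input
            activates only a few hidden units, updates for new data overwrite little
            of what older data stored -- the intuition behind sparse coding as a
            defence against catastrophic forgetting."""

            def __init__(self, n_in, n_hidden=200, k_active=10, n_out=1, lr=0.5, seed=0):
                rng = np.random.default_rng(seed)
                self.W_fixed = rng.normal(size=(n_hidden, n_in))   # expansion layer, never updated
                self.k = k_active
                self.w_out = np.zeros((n_out, n_hidden))           # read-out, trained online
                self.lr = lr

            def encode(self, x):
                h = self.W_fixed @ x
                code = np.zeros_like(h)
                winners = np.argsort(h)[-self.k:]                  # keep only the k largest activations
                code[winners] = h[winners]
                return code

            def predict(self, x):
                return self.w_out @ self.encode(x)

            def train_step(self, x, target):
                code = self.encode(x)
                err = np.atleast_1d(target) - self.w_out @ code
                # normalised LMS update on the read-out weights only
                self.w_out += self.lr * np.outer(err, code) / (code @ code + 1e-8)
                return float(np.sum(err ** 2))

        if __name__ == "__main__":
            net = FixedSparseExpansion(n_in=4)
            rng = np.random.default_rng(1)
            for _ in range(2000):
                x = rng.normal(size=4)
                net.train_step(x, target=np.sum(x))                # toy regression target
            x_test = rng.normal(size=4)
            print("prediction:", net.predict(x_test), "target:", np.sum(x_test))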

  2. Panning for the gold in health research: incorporating studies' methodological quality in meta-analysis.

    PubMed

    Johnson, Blair T; Low, Robert E; MacDonald, Hayley V

    2015-01-01

    Systematic reviews now routinely assess methodological quality to gauge the validity of the included studies and of the synthesis as a whole. Although trends from higher quality studies should be clearer, it is uncertain how often meta-analyses incorporate methodological quality in models of study results either as predictors, or, more interestingly, in interactions with theoretical moderators. We survey 200 meta-analyses in three health promotion domains to examine when and how meta-analyses incorporate methodological quality. Although methodological quality assessments commonly appear in contemporary meta-analyses (usually as scales), they are rarely incorporated in analyses, and still more rarely analysed in interaction with theoretical determinants of the success of health promotions. The few meta-analyses (2.5%) that did include such an interaction analysis showed that moderator results remained significant in higher quality studies or were present only among higher quality studies. We describe how to model quality interactively with theoretically derived moderators and discuss strengths and weaknesses of this approach and in relation to current meta-analytic practice. In large literatures exhibiting heterogeneous effects, meta-analyses can incorporate methodological quality and generate conclusions that enable greater confidence not only about the substantive phenomenon but also about the role that methodological quality itself plays.
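
    One way to model quality interactively with a theoretical moderator, as the authors recommend, is an inverse-variance-weighted meta-regression containing the interaction term. The sketch below uses statsmodels with hypothetical effect sizes, variances, quality scores, and a binary moderator; a fixed-effect weighting is assumed for brevity, whereas real syntheses would often prefer a random-effects model.

        import pandas as pd
        import statsmodels.api as sm

        # Hypothetical meta-analytic data set: one row per study.
        studies = pd.DataFrame({
            "effect":   [0.30, 0.45, 0.10, 0.55, 0.20, 0.40, 0.15, 0.50],  # study effect sizes
            "variance": [0.02, 0.03, 0.01, 0.04, 0.02, 0.03, 0.01, 0.02],  # sampling variances
            "quality":  [0.9, 0.8, 0.4, 0.9, 0.5, 0.7, 0.3, 0.8],          # methodological quality (0-1)
            "tailored": [1, 1, 0, 1, 0, 1, 0, 1],                          # theoretical moderator
        })
        studies["quality_x_tailored"] = studies["quality"] * studies["tailored"]

        # Inverse-variance weighted (fixed-effect) meta-regression with the
        # quality x moderator interaction alongside the main effects.
        X = sm.add_constant(studies[["quality", "tailored", "quality_x_tailored"]])
        fit = sm.WLS(studies["effect"], X, weights=1.0 / studies["variance"]).fit()
        print(fit.params)
        print(fit.pvalues)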

  3. Theoretical framework and methodological development of common subjective health outcome measures in osteoarthritis: a critical review

    PubMed Central

    Pollard, Beth; Johnston, Marie; Dixon, Diane

    2007-01-01

    Subjective measures involving clinician ratings or patient self-assessments have become recognised as an important tool for the assessment of health outcome. The value of a health outcome measure is usually assessed by a psychometric evaluation of its reliability, validity and responsiveness. However, psychometric testing involves an accumulation of evidence and has recognised limitations. It has been suggested that an evaluation of how well a measure has been developed would be a useful additional criterion in assessing the value of a measure. This paper explored the theoretical background and methodological development of subjective health status measures commonly used in osteoarthritis research. Fourteen subjective health outcome measures commonly used in osteoarthritis research were examined. Each measure was explored on the basis of its i) theoretical framework (was there a definition of what was being assessed and was it part of a theoretical model?) and ii) methodological development (what was the scaling strategy, how were the items generated and reduced, what was the response format and what was the scoring method?). Only the AIMS, SF-36 and WHOQOL defined what they were assessing (i.e. the construct of interest) and no measure assessed was part of a theoretical model. None of the clinician report measures appeared to have implemented a scaling procedure or described the rationale for the items selected or scoring system. Of the patient self-report measures, the AIMS, MPQ, OXFORD, SF-36, WHOQOL and WOMAC appeared to follow a standard psychometric scaling method. The DRP and EuroQol used alternative scaling methods. The review highlighted the general lack of theoretical framework for both clinician report and patient self-report measures. This review also drew attention to the wide variation in the methodological development of commonly used measures in OA. While, in general, the patient self-report measures had good methodological development, the clinician report measures appeared less well developed. It would be of value if new measures defined the construct of interest and if that construct were part of a theoretical model. By ensuring measures are both theoretically and empirically valid, improvements in subjective health outcome measures should be possible. PMID:17343739

  4. The Speaker Respoken: Material Rhetoric as Feminist Methodology.

    ERIC Educational Resources Information Center

    Collins, Vicki Tolar

    1999-01-01

    Presents a methodology based on the concept of "material rhetoric" that can help scholars avoid problems as they reclaim women's historical texts. Defines material rhetoric and positions it theoretically in relation to other methodologies, including bibliographical studies, reception theory, and established feminist methodologies. Illustrates…

  5. [Ambivalence--a key concept in gerontology? Elements of heuristics exemplified by identity formation in old age].

    PubMed

    Lüscher, Kurt; Haller, Miriam

    2016-01-01

    Ambivalence is a widely used concept in gerontology, mostly in its common-sense meaning. We propose that an elaborated notion, based on historical and systematic analysis, reveals important theoretical, methodological and practical potentials of the idea of ambivalence for the study of aging. We exemplify this view by proposing a heuristic perspective for the analysis of the processes that constitute and reconstitute identities in old age, using a model based on a multidimensional understanding of ambivalence. Ambivalence is defined as referring to the experiences of vacillating between polar contradictions of feeling, thinking, wanting and social structures in the search for the sense and meaning of social relationships, facts and texts, which are important for unfolding and altering facets of the self and agency.

  6. Assessing the validity of discourse analysis: transdisciplinary convergence

    NASA Astrophysics Data System (ADS)

    Jaipal-Jamani, Kamini

    2014-12-01

    Research studies using discourse analysis approaches make claims about phenomena or issues based on interpretation of written or spoken text, which includes images and gestures. How are findings/interpretations from discourse analysis validated? This paper proposes transdisciplinary convergence as a way to validate discourse analysis approaches to research. The argument is made that discourse analysis explicitly grounded in semiotics, systemic functional linguistics, and critical theory, offers a credible research methodology. The underlying assumptions, constructs, and techniques of analysis of these three theoretical disciplines can be drawn on to show convergence of data at multiple levels, validating interpretations from text analysis.

  7. A review of methodological factors in performance assessments of time-varying aircraft noise effects. [with annotated bibliography

    NASA Technical Reports Server (NTRS)

    Coates, G. D.; Alluisi, E. A.; Adkins, C. J., Jr.

    1977-01-01

    Literature on the effects of general noise on human performance is reviewed in an attempt to identify (1) those characteristics of noise that have been found to affect human performance; (2) those characteristics of performance most likely to be affected by the presence of noise, and (3) those characteristics of the performance situation typically associated with noise effects. Based on the characteristics identified, a theoretical framework is proposed that will permit predictions of possible effects of time-varying aircraft-type noise on complex human performance. An annotated bibliography of 50 articles is included.

  8. A Quantitative Approach to Assessing System Evolvability

    NASA Technical Reports Server (NTRS)

    Christian, John A., III

    2004-01-01

    When selecting a system from multiple candidates, the customer seeks the one that best meets his or her needs. Recently the desire for evolvable systems has become more important and engineers are striving to develop systems that accommodate this need. In response to this search for evolvability, we present a historical perspective on evolvability, propose a refined definition of evolvability, and develop a quantitative method for measuring this property. We address this quantitative methodology from both a theoretical and practical perspective. This quantitative model is then applied to the problem of evolving a lunar mission to a Mars mission as a case study.

  9. On ``Overestimation-free Computational Version of Interval Analysis''

    NASA Astrophysics Data System (ADS)

    Popova, Evgenija D.

    2013-10-01

    The transformation of interval parameters into trigonometric functions, proposed in Int. J. Comput. Meth. Eng. Sci. Mech., vol. 13, pp. 319-328 (2012), is not motivated in comparison to the infinitely many equivalent algebraic transformations. The conclusions about the efficacy of the methodology used are based on incorrect comparisons between solutions of different problems. We show theoretically, and in the examples considered in the commented article, that changing the number of the parameters in a system of linear algebraic equations may change the initial problem, respectively, its solution set. We also correct various misunderstandings and bugs that appear in the article noted above.

  10. Optimal illusion and invisibility of multilayered anisotropic cylinders and spheres.

    PubMed

    Zhang, Lin; Shi, Yan; Liang, Chang-Hong

    2016-10-03

    In this paper, full-wave electromagnetic scattering theory is employed to investigate illusion and invisibility of inhomogeneous anisotropic cylinders and spheres. With the use of a shell designed according to Mie series theory for multiple piecewise anisotropic layers, radar cross section (RCS) of the coated inhomogeneous anisotropic object can be dramatically reduced or disguised as another object in the long-wavelength limit. With the suitable adjustment of the anisotropy parameters of the shell, optimal illusion and invisibility characteristics of the coated inhomogeneous anisotropic object can be achieved. Details of theoretical analysis and numerical examples are presented to validate the proposed methodology.

  11. A multi-level systems perspective for the science of team science.

    PubMed

    Börner, Katy; Contractor, Noshir; Falk-Krzesinski, Holly J; Fiore, Stephen M; Hall, Kara L; Keyton, Joann; Spring, Bonnie; Stokols, Daniel; Trochim, William; Uzzi, Brian

    2010-09-15

    This Commentary describes recent research progress and professional developments in the study of scientific teamwork, an area of inquiry termed the "science of team science" (SciTS, pronounced "sahyts"). It proposes a systems perspective that incorporates a mixed-methods approach to SciTS that is commensurate with the conceptual, methodological, and translational complexities addressed within the SciTS field. The theoretically grounded and practically useful framework is intended to integrate existing and future lines of SciTS research to facilitate the field's evolution as it addresses key challenges spanning macro, meso, and micro levels of analysis.

  12. Specific methodology for capacitance imaging by atomic force microscopy: A breakthrough towards an elimination of parasitic effects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Estevez, Ivan (Concept Scientific Instruments, ZA de Courtaboeuf, 2 rue de la Terre de Feu, 91940 Les Ulis); Chrétien, Pascal

    2014-02-24

    On the basis of a home-made nanoscale impedance measurement device associated with a commercial atomic force microscope, a specific operating process is proposed in order to improve absolute (in the sense of “non-relative”) capacitance imaging by drastically reducing the parasitic effects due to stray capacitance, surface topography, and sample tilt. The method, combining a two-pass image acquisition with the exploitation of approach curves, has been validated on sets of calibration samples consisting of square parallel-plate capacitors for which theoretical capacitance values were numerically calculated.

  13. Deterministic and stochastic methods of calculation of polarization characteristics of radiation in natural environment

    NASA Astrophysics Data System (ADS)

    Strelkov, S. A.; Sushkevich, T. A.; Maksakova, S. V.

    2017-11-01

    We discuss world-class Russian achievements in the theory of radiation transfer that take its polarization in natural media into account, and the scientific potential currently developing in Russia, which provides an adequate methodological basis for theoretical and computational research on radiation processes and radiation fields in natural media using supercomputers and massive parallelism. A new version of the matrix transfer operator is proposed for solving problems of polarized radiation transfer in heterogeneous media by the method of influence functions, in which deterministic and stochastic methods can be combined.

  14. [Photography as analysis document, body and medicine: theory, method and criticism--the experience of Museo Nacional de Medicina Enrique Laval].

    PubMed

    Robinson, César Leyton; Caballero, Andrés Díaz

    2007-01-01

    This article is an experimental methodological reflection on the use of medical images as useful documents for constructing the history of medicine. A method is used that is based on guidelines or analysis topics that include different ways of viewing documents, from aesthetic, technical, social and political theories to historical and medical thinking. Some exercises are also included that enhance the proposal for the reader: rediscovering the worlds in society that harbor these medical photographical archives to obtain a new theoretical approach to the construction of the history of medical science.

  15. A new scenario-based approach to damage detection using operational modal parameter estimates

    NASA Astrophysics Data System (ADS)

    Hansen, J. B.; Brincker, R.; López-Aenlle, M.; Overgaard, C. F.; Kloborg, K.

    2017-09-01

    In this paper, a vibration-based damage localization and quantification method, based on natural frequencies and mode shapes, is presented. The proposed technique is inspired by a damage assessment methodology based solely on the sensitivity of mass-normalized, experimentally determined mode shapes. The present method differs by being based on modal data extracted by means of Operational Modal Analysis (OMA) combined with a reasonable Finite Element (FE) representation of the test structure and implemented in a scenario-based framework. Besides a review of the basic methodology, this paper addresses fundamental theoretical as well as practical considerations which are crucial to the applicability of a given vibration-based damage assessment configuration. Lastly, the technique is demonstrated on an experimental test case using automated OMA. Both the numerical study and the experimental test case presented in this paper are restricted to perturbations concerning mass change.

  16. On synchronisation of a class of complex chaotic systems with complex unknown parameters via integral sliding mode control

    NASA Astrophysics Data System (ADS)

    Tirandaz, Hamed; Karami-Mollaee, Ali

    2018-06-01

    Chaotic systems demonstrate complex behaviour in their state variables and parameters, which generates a number of challenges and consequences. This paper presents a new synchronisation scheme based on the integral sliding mode control (ISMC) method for a class of complex chaotic systems with complex unknown parameters. Synchronisation between corresponding states of a class of complex chaotic systems, and also convergence of the system parameter errors to zero, are studied. The designed feedback control vector and complex unknown parameter vector are derived analytically based on Lyapunov stability theory. Moreover, the effectiveness of the proposed methodology is verified by synchronisation of the Chen complex system and the Lorenz complex system as the leader and follower chaotic systems, respectively. In conclusion, some numerical simulations related to the synchronisation methodology are given to illustrate the effectiveness of the theoretical discussions.

  17. Calibration Testing of Network Tap Devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Popovsky, Barbara; Chee, Brian; Frincke, Deborah A.

    2007-11-14

    Understanding the behavior of network forensic devices is important to support prosecutions of malicious conduct on computer networks as well as legal remedies for false accusations of network management negligence. Individuals who seek to establish the credibility of network forensic data must speak competently about how the data was gathered and the potential for data loss. Unfortunately, manufacturers rarely provide information about the performance of low-layer network devices at a level that will survive legal challenges. This paper proposes a first step toward an independent calibration standard by establishing a validation testing methodology for evaluating forensic taps against manufacturer specifications. The methodology and the theoretical analysis that led to its development are offered as a conceptual framework for developing a standard and to "operationalize" network forensic readiness. This paper also provides details of an exemplar test, testing environment, procedures and results.

  18. The 2002 Sydney Gay Games: re-presenting "lesbian" identities through sporting space.

    PubMed

    Lambert, Karen

    2009-01-01

    In this article poetic representation in qualitative research is explored in relation to researching "lesbian" lives. Set within the context of The 2002 Sydney Gay Games the article considers how poetry can bring to light experiences at the intersection of sexuality, sport, and place. The article details three aspects to this process. First, by asking what queer theory could do for particular research subjects, a robust, malleable, and transportable theoretical concept of "queer" is proposed that is responsive to the participants' lives and experiences. Second, this concept is applied methodologically in order to unsettle more traditional academic modes of representing interview data through the use of poetic forms of representation. Finally, a poem constructed from the Opening Ceremony of The Gay Games is presented and analyzed. Poetic representation is thus offered as a distinct methodology that permits a particular kind of "queer" analysis when researching "lesbian" lives.

  19. Geometry and Dynamics for Markov Chain Monte Carlo

    NASA Astrophysics Data System (ADS)

    Barp, Alessandro; Briol, François-Xavier; Kennedy, Anthony D.; Girolami, Mark

    2018-03-01

    Markov Chain Monte Carlo methods have revolutionised mathematical computation and enabled statistical inference within many previously intractable models. In this context, Hamiltonian dynamics have been proposed as an efficient way of building chains which can explore probability densities efficiently. The method emerges from physics and geometry and these links have been extensively studied by a series of authors through the last thirty years. However, there is currently a gap between the intuitions and knowledge of users of the methodology and our deep understanding of these theoretical foundations. The aim of this review is to provide a comprehensive introduction to the geometric tools used in Hamiltonian Monte Carlo at a level accessible to statisticians, machine learners and other users of the methodology with only a basic understanding of Monte Carlo methods. This will be complemented with some discussion of the most recent advances in the field which we believe will become increasingly relevant to applied scientists.
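
    As a concrete reference point for readers new to the methodology, the sketch below implements the basic leapfrog integration and accept/reject loop of Hamiltonian Monte Carlo with an identity mass matrix, applied to a standard Gaussian target. It omits the geometric extensions (e.g. Riemannian metrics) that the review discusses; step size and trajectory length are illustrative assumptions.

        import numpy as np

        def hmc_sample(log_prob_grad, log_prob, x0, n_samples=1000, eps=0.1, n_leapfrog=20, seed=0):
            """Basic Hamiltonian Monte Carlo with an identity mass matrix.
            log_prob / log_prob_grad define the (unnormalised) target density."""
            rng = np.random.default_rng(seed)
            x = np.asarray(x0, dtype=float)
            samples = []
            for _ in range(n_samples):
                p = rng.normal(size=x.shape)                  # resample momentum
                x_new, p_new = x.copy(), p.copy()
                # leapfrog integration of the Hamiltonian dynamics
                p_new += 0.5 * eps * log_prob_grad(x_new)
                for _ in range(n_leapfrog - 1):
                    x_new += eps * p_new
                    p_new += eps * log_prob_grad(x_new)
                x_new += eps * p_new
                p_new += 0.5 * eps * log_prob_grad(x_new)
                # Metropolis correction using the Hamiltonian (potential + kinetic energy)
                h_old = -log_prob(x) + 0.5 * p @ p
                h_new = -log_prob(x_new) + 0.5 * p_new @ p_new
                if np.log(rng.uniform()) < h_old - h_new:
                    x = x_new
                samples.append(x.copy())
            return np.array(samples)

        if __name__ == "__main__":
            # Standard 2D Gaussian target: log p(x) = -0.5 * |x|^2 (up to a constant).
            draws = hmc_sample(lambda x: -x, lambda x: -0.5 * x @ x, x0=np.zeros(2))
            print("sample mean:", draws.mean(axis=0), "sample var:", draws.var(axis=0))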

  20. Probabilistic self-organizing maps for continuous data.

    PubMed

    Lopez-Rubio, Ezequiel

    2010-10-01

    The original self-organizing feature map did not define any probability distribution on the input space. However, the advantages of introducing probabilistic methodologies into self-organizing map models were soon evident. This has led to a wide range of proposals which reflect the current emergence of probabilistic approaches to computational intelligence. The underlying estimation theories behind them derive from two main lines of thought: the expectation maximization methodology and stochastic approximation methods. Here, we present a comprehensive view of the state of the art, with a unifying perspective of the involved theoretical frameworks. In particular, we examine the most commonly used continuous probability distributions, self-organization mechanisms, and learning schemes. Special emphasis is given to the connections among them and their relative advantages depending on the characteristics of the problem at hand. Furthermore, we evaluate their performance in two typical applications of self-organizing maps: classification and visualization.
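
    For contrast with the probabilistic variants the review surveys, the sketch below implements the classic online self-organizing map update (best-matching unit plus a decaying Gaussian neighbourhood), which defines no probability model on the input space. All hyperparameter values are illustrative assumptions.

        import numpy as np

        def train_som(data, grid=(8, 8), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
            """Classic online self-organizing map update (no probability model), shown
            only as the baseline that the probabilistic variants generalise."""
            rng = np.random.default_rng(seed)
            rows, cols = grid
            weights = rng.uniform(size=(rows, cols, data.shape[1]))
            coords = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)
            n_steps = epochs * len(data)
            step = 0
            for _ in range(epochs):
                for x in rng.permutation(data):
                    lr = lr0 * (1.0 - step / n_steps)                 # decaying learning rate
                    sigma = sigma0 * (1.0 - step / n_steps) + 0.5     # decaying neighbourhood width
                    dists = np.linalg.norm(weights - x, axis=-1)
                    bmu = np.unravel_index(np.argmin(dists), dists.shape)   # best-matching unit
                    grid_dist2 = np.sum((coords - np.array(bmu)) ** 2, axis=-1)
                    h = np.exp(-grid_dist2 / (2.0 * sigma ** 2))      # Gaussian neighbourhood function
                    weights += lr * h[..., None] * (x - weights)
                    step += 1
            return weights

        if __name__ == "__main__":
            data = np.random.default_rng(1).uniform(size=(500, 3))    # e.g. random RGB colours
            som = train_som(data)
            print("trained map shape:", som.shape)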

  1. Acoustic evidence for phonologically mismatched speech errors.

    PubMed

    Gormley, Andrea

    2015-04-01

    Speech errors are generally said to accommodate to their new phonological context. This accommodation has been validated by several transcription studies. The transcription methodology is not the best choice for detecting errors at this level, however, as this type of error can be difficult to perceive. This paper presents an acoustic analysis of speech errors that uncovers non-accommodated or mismatch errors. A mismatch error is a sub-phonemic error that results in an incorrect surface phonology. This type of error could arise during the processing of phonological rules or could be made at the motor level of implementation. The results of this work have important implications for both experimental and theoretical research. For experimentalists, it validates the tools used for error induction and the acoustic determination of errors free of perceptual bias. For theorists, this methodology can be used to test the nature of the processes proposed in language production.

  2. Practical Strategies for Collaboration across Discipline-Based Education Research and the Learning Sciences

    PubMed Central

    Peffer, Melanie; Renken, Maggie

    2016-01-01

    Rather than pursue questions related to learning in biology from separate camps, recent calls highlight the necessity of interdisciplinary research agendas. Interdisciplinary collaborations allow for a complicated and expanded approach to questions about learning within specific science domains, such as biology. Despite its benefits, interdisciplinary work inevitably involves challenges. Some such challenges originate from differences in theoretical and methodological approaches across lines of work. Thus, aims at developing successful interdisciplinary research programs raise important considerations regarding methodologies for studying biology learning, strategies for approaching collaborations, and training of early-career scientists. Our goal here is to describe two fields important to understanding learning in biology, discipline-based education research and the learning sciences. We discuss differences between each discipline’s approach to biology education research and the benefits and challenges associated with incorporating these perspectives in a single research program. We then propose strategies for building productive interdisciplinary collaboration. PMID:27881446

  3. Report on an Assessment of the Application of EPP Results from the Strain Limit Evaluation Procedure to the Prediction of Cyclic Life Based on the SMT Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jetter, R. I.; Messner, M. C.; Sham, T. -L.

    The goal of the proposed integrated Elastic Perfectly-Plastic (EPP) and Simplified Model Test (SMT) methodology is to incorporate an SMT data-based approach for creep-fatigue damage evaluation into the EPP methodology to avoid the separate evaluation of creep and fatigue damage and eliminate the requirement for stress classification in current methods, thus greatly simplifying evaluation of elevated temperature cyclic service. This methodology should minimize over-conservatism while properly accounting for localized defects and stress risers. To support the implementation of the proposed methodology and to verify the applicability of the code rules, analytical studies and evaluation of thermomechanical test results continued in FY17. This report presents the results of those studies. An EPP strain limits methodology assessment was based on recent two-bar thermal ratcheting test results on 316H stainless steel in the temperature range of 405 to 705°C. Strain range predictions from the EPP evaluation of the two-bar tests were also evaluated and compared with the experimental results. The role of sustained primary loading on cyclic life was assessed using the results of pressurized SMT data from tests on Alloy 617 at 950°C. A viscoplastic material model was used in an analytic simulation of two-bar tests to compare with EPP strain limits assessments using isochronous stress-strain curves that are consistent with the viscoplastic material model. A finite element model of a prior 304H stainless steel Oak Ridge National Laboratory (ORNL) nozzle-to-sphere test was developed and used for EPP strain limits and creep-fatigue code case damage evaluations. A theoretical treatment of a recurring issue with convergence criteria for plastic shakedown illustrated the role of computer machine precision in EPP calculations.

  4. Discovering chemistry with an ab initio nanoreactor

    DOE PAGES

    Wang, Lee-Ping; Titov, Alexey; McGibbon, Robert; ...

    2014-11-02

    Chemical understanding is driven by the experimental discovery of new compounds and reactivity, and is supported by theory and computation that provides detailed physical insight. While theoretical and computational studies have generally focused on specific processes or mechanistic hypotheses, recent methodological and computational advances harken the advent of their principal role in discovery. Here we report the development and application of the ab initio nanoreactor – a highly accelerated, first-principles molecular dynamics simulation of chemical reactions that discovers new molecules and mechanisms without preordained reaction coordinates or elementary steps. Using the nanoreactor we show new pathways for glycine synthesis from primitive compounds proposed to exist on the early Earth, providing new insight into the classic Urey-Miller experiment. Ultimately, these results highlight the emergence of theoretical and computational chemistry as a tool for discovery in addition to its traditional role of interpreting experimental findings.

  5. Structural modeling and analysis of an effluent treatment process for electroplating--a graph theoretic approach.

    PubMed

    Kumar, Abhishek; Clement, Shibu; Agrawal, V P

    2010-07-15

    An attempt is made to address a few ecological and environmental issues by developing different structural models of an effluent treatment system for electroplating. The effluent treatment system is defined with the help of different subsystems contributing to waste minimization. A hierarchical tree and a block diagram showing all possible interactions among subsystems are proposed. These non-mathematical diagrams are converted into mathematical models for design improvement, analysis, comparison, storage and retrieval, and commercial off-the-shelf purchase of different subsystems. This is achieved by developing a graph-theoretic model, matrix models and a variable permanent function model. Analysis is carried out by permanent function, hierarchical tree and block diagram methods. Storage and retrieval is done using matrix models. The methodology is illustrated with the help of an example. Benefits to the electroplaters/end user are identified. Copyright © 2010 Elsevier B.V. All rights reserved.
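
    The variable permanent function mentioned above evaluates, in spirit, the permanent of a matrix whose entries encode subsystems and their interconnections. The sketch below computes a matrix permanent by direct permutation expansion for a small hypothetical interaction matrix; it illustrates only the computation, not the authors' specific effluent-treatment model.

        import numpy as np
        from itertools import permutations

        def permanent(matrix):
            """Permanent of a square matrix by direct permutation expansion
            (like the determinant but without alternating signs); adequate for the
            small subsystem-interaction matrices used in graph-theoretic models."""
            a = np.asarray(matrix, dtype=float)
            n = a.shape[0]
            return sum(np.prod(a[range(n), perm]) for perm in permutations(range(n)))

        if __name__ == "__main__":
            # Hypothetical 4-subsystem interaction matrix: diagonal entries represent
            # the subsystems themselves, off-diagonal entries their interconnections.
            system = np.array([
                [3, 1, 0, 1],
                [1, 2, 1, 0],
                [0, 1, 4, 1],
                [1, 0, 1, 3],
            ])
            print("variable permanent function value:", permanent(system))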

  6. Discovering chemistry with an ab initio nanoreactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Lee-Ping; Titov, Alexey; McGibbon, Robert

    Chemical understanding is driven by the experimental discovery of new compounds and reactivity, and is supported by theory and computation that provides detailed physical insight. While theoretical and computational studies have generally focused on specific processes or mechanistic hypotheses, recent methodological and computational advances harken the advent of their principal role in discovery. Here we report the development and application of the ab initio nanoreactor – a highly accelerated, first-principles molecular dynamics simulation of chemical reactions that discovers new molecules and mechanisms without preordained reaction coordinates or elementary steps. Using the nanoreactor we show new pathways for glycine synthesis from primitive compounds proposed to exist on the early Earth, providing new insight into the classic Urey-Miller experiment. Ultimately, these results highlight the emergence of theoretical and computational chemistry as a tool for discovery in addition to its traditional role of interpreting experimental findings.

  7. Sequence information gain based motif analysis.

    PubMed

    Maynou, Joan; Pairó, Erola; Marco, Santiago; Perera, Alexandre

    2015-11-09

    The detection of regulatory regions in candidate sequences is essential for the understanding of the regulation of a particular gene and the mechanisms involved. This paper proposes a novel methodology based on information theoretic metrics for finding regulatory sequences in promoter regions. This methodology (SIGMA) has been tested on genomic sequence data for Homo sapiens and Mus musculus. SIGMA has been compared with different publicly available alternatives for motif detection, such as MEME/MAST, Biostrings (Bioconductor package), MotifRegressor, and previous work such as Qresiduals projections or information-theoretic detectors. Comparative results, in the form of Receiver Operating Characteristic curves, show how, in 70% of the studied Transcription Factor Binding Sites, the SIGMA detector has a better performance and behaves more robustly than the methods compared, while having a similar computational time. The performance of SIGMA can be explained by its parametric simplicity in the modelling of the non-linear co-variability in the binding motif positions. Sequence Information Gain based Motif Analysis is a generalisation of a non-linear, information-theory-based model for the detection of cis-regulatory sequences. This generalisation allows us to detect transcription factor binding sites with maximum performance regardless of the covariability observed in the positions of the training set of sequences. SIGMA is freely available to the public at http://b2slab.upc.edu.
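
    As background for the information-theoretic scoring idea, the sketch below computes the per-position information gain (relative to a uniform background) of a set of aligned, hypothetical binding-site sequences. It is a generic position-weight-matrix calculation, not the SIGMA detector itself, which additionally models co-variability between positions.

        import numpy as np

        def position_information(sites, alphabet="ACGT", pseudocount=0.5):
            """Per-position information content (in bits) of aligned binding-site
            sequences: the Kullback-Leibler divergence of each column's base
            distribution from a uniform background."""
            counts = np.full((len(sites[0]), len(alphabet)), pseudocount)
            for seq in sites:
                for i, base in enumerate(seq):
                    counts[i, alphabet.index(base)] += 1
            probs = counts / counts.sum(axis=1, keepdims=True)
            background = 1.0 / len(alphabet)
            return np.sum(probs * np.log2(probs / background), axis=1)

        if __name__ == "__main__":
            # Hypothetical aligned binding sites (same length, uppercase DNA).
            sites = ["TATAAT", "TATGAT", "TACAAT", "TATAAT", "TATACT"]
            for pos, bits in enumerate(position_information(sites)):
                print(f"position {pos}: {bits:.2f} bits")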

  8. Testing methodologies and systems for semiconductor optical amplifiers

    NASA Astrophysics Data System (ADS)

    Wieckowski, Michael

    Semiconductor optical amplifiers (SOA's) are gaining increased prominence in both optical communication systems and high-speed optical processing systems, due primarily to their unique nonlinear characteristics. This in turn, has raised questions regarding their lifetime performance reliability and has generated a demand for effective testing techniques. This is especially critical for industries utilizing SOA's as components for system-in-package products. It is important to note that very little research to date has been conducted in this area, even though production volume and market demand has continued to increase. In this thesis, the reliability of dilute-mode InP semiconductor optical amplifiers is studied experimentally and theoretically. The aging characteristics of the production level devices are demonstrated and the necessary techniques to accurately characterize them are presented. In addition, this work proposes a new methodology for characterizing the optical performance of these devices using measurements in the electrical domain. It is shown that optical performance degradation, specifically with respect to gain, can be directly qualified through measurements of electrical subthreshold differential resistance. This metric exhibits a linear proportionality to the defect concentration in the active region, and as such, can be used for prescreening devices before employing traditional optical testing methods. A complete theoretical analysis is developed in this work to explain this relationship based upon the device's current-voltage curve and its associated leakage and recombination currents. These results are then extended to realize new techniques for testing semiconductor optical amplifiers and other similarly structured devices. These techniques can be employed after fabrication and during packaged operation through the use of a proposed stand-alone testing system, or using a proposed integrated CMOS self-testing circuit. Both methods are capable of ascertaining SOA performance based solely on the subthreshold differential resistance signature, and are a first step toward the inevitable integration of self-testing circuits into complex optoelectronic systems.
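
    The proposed metric is the subthreshold differential resistance, i.e. the slope dV/dI of the device's I-V curve below threshold. A minimal sketch of extracting that slope from sweep data is shown below; the diode-equation parameters used to generate the example data are assumptions for illustration only, not measured SOA values.

        import numpy as np

        def differential_resistance(voltage, current):
            """Differential resistance dV/dI from a measured I-V sweep using a
            central-difference gradient; the subthreshold portion of this curve is
            the degradation signature discussed above."""
            voltage = np.asarray(voltage, dtype=float)
            current = np.asarray(current, dtype=float)
            return np.gradient(voltage, current)

        if __name__ == "__main__":
            # Hypothetical diode-like subthreshold I-V data: I = I_s * (exp(V / nVt) - 1).
            v = np.linspace(0.0, 0.8, 200)
            i_s, n_vt = 1e-12, 2 * 0.02585        # assumed saturation current and ideality * thermal voltage
            i = i_s * (np.exp(v / n_vt) - 1.0)
            r_diff = differential_resistance(v, i)
            print("differential resistance at V=0.4 V:", r_diff[np.argmin(np.abs(v - 0.4))], "ohm")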

  9. Between hype and hope: What is really at stake with personalized medicine?

    PubMed

    Abettan, Camille

    2016-09-01

    Over the last decade, personalized medicine has become a buzzword, which covers a broad spectrum of meanings and generates many different opinions. The purpose of this article is to achieve a better understanding of the reasons why personalized medicine gives rise to such conflicting opinions. We show that a major issue of personalized medicine is the gap existing between its claims and its reality. We then present and analyze different possible reasons for this gap. We propose a hypothesis inspired by Windelband's distinction between nomothetic and idiographic methodology. We argue that the fuzzy situation of personalized medicine results from a mix between idiographic claims and nomothetic methodological procedures. Hence we suggest that the current quandary about personalized medicine cannot be solved without getting involved in a discussion about the complex epistemological and methodological status of medicine. To conclude, we show that Gadamer's view of medicine as a dialogical process can be fruitfully used and reveals that personalization is not a theoretical task, but a practical one, which takes place within the clinical encounter.

  10. Improved Processes to Remove Naphthenic Acids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aihua Zhang; Qisheng Ma; Kangshi Wang

    2005-12-09

    In the past three years, we followed the work plan suggested in the proposal and made every effort to fulfill the project objectives. Based on a large amount of creative and productive work, covering both experimental and theoretical aspects, we achieved an important technical breakthrough in the naphthenic acid removal process and obtained deep insight into catalytic decarboxylation chemistry. In detail, we established an integrated methodology to serve all of the experimental and theoretical work. Our experimental investigation resulted in the discovery of four types of effective catalysts for the decarboxylation of model carboxylic acid compounds. The adsorption experiments revealed the effectiveness of several solid materials, either natural minerals or synthesized materials, for naphthenic acid adsorption and acidity reduction of crude oil. The tests with crude oil also yielded promising results, which can potentially be developed into a practical process for the oil industry. The theoretical work predicted several possible catalytic decarboxylation mechanisms that would govern the decarboxylation pathways depending on the type of catalyst used. The calculated reaction activation energies were in good agreement with our experimental measurements.

  11. Seamless Learning in the Mobile Age: A Theoretical and Methodological Discussion on Using Cooperative Inquiry to Study Digital Kids On-the-Move

    ERIC Educational Resources Information Center

    Toh, Yancy; So, Hyo-Jeong; Seow, Peter; Chen, Wenli; Looi, Chee-Kit

    2013-01-01

    This paper shares the theoretical and methodological frameworks that are deployed in a 3-year study to examine how Singapore primary school students leverage on mobile technology for seamless learning. This notion of seamless learning refers to the integrated and synergistic effects of learning in both formal and informal settings, which is…

  12. Digital design of scaffold for mandibular defect repair based on tissue engineering*

    PubMed Central

    Liu, Yun-feng; Zhu, Fu-dong; Dong, Xing-tao; Peng, Wei

    2011-01-01

    Mandibular defects have occurred more frequently in recent years, and clinical repair operations via bone transplantation are difficult to improve further due to some intrinsic flaws. Tissue engineering, which is a hot research field of biomedical engineering, provides a new direction for mandibular defect repair. As the basis and key part of tissue engineering, scaffolds have been widely and deeply studied with regard to the basic theory as well as the principles of biomaterials, structure, design, and fabrication methods. However, little research is targeted at tissue regeneration for clinical repair operations. Since mandibular bone has a special structure, rather than the uniform and regular structure assumed in existing studies, a methodology based on tissue engineering is proposed for mandibular defect repair in this paper. Key steps regarding scaffold digital design, such as external shape design and internal microstructure design directly based on triangular meshes, are discussed in detail. By analyzing the theoretical model and the measured data from the test parts fabricated by rapid prototyping, the feasibility and effectiveness of the proposed methodology are properly verified. More work on mechanical and biological improvements needs to be done to promote its clinical application in the future. PMID:21887853

  13. A methodology to investigate the intrinsic effect of the pulsed electric current during the spark plasma sintering of electrically conductive powders

    PubMed Central

    Locci, Antonio Mario; Cincotti, Alberto; Todde, Sara; Orrù, Roberto; Cao, Giacomo

    2010-01-01

    A novel methodology is proposed for investigating the effect of the pulsed electric current during the spark plasma sintering (SPS) of electrically conductive powders without potential misinterpretation of experimental results. First, ensemble configurations (geometry, size and material of the powder sample, die, plunger and spacers) are identified where the electric current is forced to flow only through either the sample or the die, so that the sample is heated either through the Joule effect or by thermal conduction, respectively. These ensemble configurations are selected using a recently proposed mathematical model of an SPS apparatus, which, once suitably modified, makes it possible to carry out detailed electrical and thermal analysis. Next, SPS experiments are conducted using the ensemble configurations theoretically identified. Using aluminum powders as a case study, we find that the temporal profiles of sample shrinkage, which indicate densification behavior, as well as the final density of the sample are clearly different when the electric current flows only through the sample or through the die containing it, whereas the temperature cycle and mechanical load are the same in both cases. PMID:27877354

  14. Digital design of scaffold for mandibular defect repair based on tissue engineering.

    PubMed

    Liu, Yun-feng; Zhu, Fu-dong; Dong, Xing-tao; Peng, Wei

    2011-09-01

    Mandibular defects have occurred more frequently in recent years, and clinical repair operations via bone transplantation are difficult to improve further due to some intrinsic flaws. Tissue engineering, which is a hot research field of biomedical engineering, provides a new direction for mandibular defect repair. As the basis and key part of tissue engineering, scaffolds have been widely and deeply studied with regard to the basic theory as well as the principles of biomaterials, structure, design, and fabrication methods. However, little research is targeted at tissue regeneration for clinical repair operations. Since mandibular bone has a special structure, rather than the uniform and regular structure assumed in existing studies, a methodology based on tissue engineering is proposed for mandibular defect repair in this paper. Key steps regarding scaffold digital design, such as external shape design and internal microstructure design directly based on triangular meshes, are discussed in detail. By analyzing the theoretical model and the measured data from the test parts fabricated by rapid prototyping, the feasibility and effectiveness of the proposed methodology are properly verified. More work on mechanical and biological improvements needs to be done to promote its clinical application in the future.

  15. Incorporating equity considerations in transport infrastructure evaluation: Current practice and a proposed methodology.

    PubMed

    Thomopoulos, N; Grant-Muller, S; Tight, M R

    2009-11-01

    Interest has re-emerged in the issue of how to incorporate equity considerations into the appraisal of transport projects, and of large road infrastructure projects in particular. This paper offers a way forward in addressing some of the theoretical and practical concerns that have presented difficulties to date in incorporating equity concerns in the appraisal of such projects. Initially, an overview of current practice within transport regarding the appraisal of equity considerations in Europe is offered, based on an extensive literature review. Acknowledging the value of a framework approach, research towards introducing a theoretical framework is then presented. The proposed framework is based on the well-established MCA Analytic Hierarchy Process and is also contrasted with the use of a CBA-based approach. The framework outlined here offers an additional support tool to decision makers who will be able to differentiate choices based on their views on specific equity principles and equity types. It also holds the potential to become a valuable tool for evaluators as a result of the option to assess predefined equity perspectives of decision makers against both the project objectives and the estimated project impacts. This framework may also be of further value to evaluators outside transport.
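
    The Analytic Hierarchy Process derives criterion weights from pairwise comparisons via the principal eigenvector of the comparison matrix. A minimal sketch is given below with a hypothetical set of equity criteria and illustrative Saaty-scale judgements; it is not the authors' full appraisal framework.

        import numpy as np

        def ahp_weights(pairwise):
            """Priority weights from an AHP pairwise-comparison matrix via the
            principal eigenvector, plus the consistency index CI = (lambda_max - n) / (n - 1)."""
            a = np.asarray(pairwise, dtype=float)
            eigvals, eigvecs = np.linalg.eig(a)
            k = np.argmax(eigvals.real)                 # principal (Perron) eigenvalue
            w = np.abs(eigvecs[:, k].real)
            w /= w.sum()
            n = a.shape[0]
            ci = (eigvals[k].real - n) / (n - 1)
            return w, ci

        if __name__ == "__main__":
            # Hypothetical comparison of three equity criteria (e.g. spatial, social,
            # intergenerational) on Saaty's 1-9 scale; the entries are illustrative only.
            comparisons = np.array([
                [1.0, 3.0, 5.0],
                [1/3, 1.0, 2.0],
                [1/5, 1/2, 1.0],
            ])
            weights, ci = ahp_weights(comparisons)
            print("criterion weights:", np.round(weights, 3), "consistency index:", round(ci, 3))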

  16. Intersectorial health-related policies: the use of a legal and theoretical framework to propose a typology to a case study in a Brazilian municipality.

    PubMed

    Tess, Beatriz Helena; Aith, Fernando Mussa Abujamra

    2014-11-01

    This article analyzes intersectorial health-related policies (IHRP) based on a case study performed in 2008-2009 that mapped the social policies of the city of Piracicaba, State of Sao Paulo, Brazil. The research strategy comprised quantitative and qualitative methodologies and converging information sources. Legal and theoretical conceptual frameworks were applied to the Piracicaba study results and served as the basis for proposing a typology of IHRP. Three types of IHRP were identified: health policies where the health sector is the coordinator but needs non-health sectors to succeed; policies with a sector other than health as coordinator, but which need health sector collaboration to succeed; and thirdly, genuine intersectorial policies, not led by any one sector but by a specifically-appointed intersectorial coordinator. The authors contend that the political commitment of local authorities alone may not be enough to promote efficient intersectorial social policies. Comprehension of different types of IHRP and their interface mechanisms may contribute to greater efficiency and coverage of social policies that affect health equity and its social determinants positively. In the final analysis, this will lead to more equitable health outcomes.

  17. Research Strategies for Biomedical and Health Informatics

    PubMed Central

    Kulikowski, Casimir A.; Bakken, Suzanne; de Lusignan, Simon; Kimura, Michio; Koch, Sabine; Mantas, John; Maojo, Victor; Marschollek, Michael; Martin-Sanchez, Fernando; Moen, Anne; Park, Hyeoun-Ae; Sarkar, Indra Neil; Leong, Tze Yun; McCray, Alexa T.

    2017-01-01

    Background: Medical informatics, or biomedical and health informatics (BMHI), has become an established scientific discipline. In all such disciplines there is a certain inertia to persist in focusing on well-established research areas and to hold on to well-known research methodologies rather than adopting new ones, which may be more appropriate. Objectives: To search for answers to the following questions: What are research fields in informatics, which are not being currently adequately addressed, and which methodological approaches might be insufficiently used? Do we know about reasons? What could be consequences of change for research and for education? Methods: Outstanding informatics scientists were invited to three panel sessions on this topic in leading international conferences (MIE 2015, Medinfo 2015, HEC 2016) in order to get their answers to these questions. Results: A variety of themes emerged in the set of answers provided by the panellists. Some panellists took the theoretical foundations of the field for granted, while several questioned whether the field was actually grounded in a strong theoretical foundation. Panellists proposed a range of suggestions for new or improved approaches, methodologies, and techniques to enhance the BMHI research agenda. Conclusions: The field of BMHI is on the one hand maturing as an academic community and intellectual endeavour. On the other hand vendor-supplied solutions may be too readily and uncritically accepted in health care practice. There is a high chance that BMHI will continue to flourish as an important discipline; its innovative interventions might then reach the original objectives of advancing science and improving health care outcomes. PMID:28119991

  18. Individual behavioral phenotypes: an integrative meta-theoretical framework. Why "behavioral syndromes" are not analogs of "personality".

    PubMed

    Uher, Jana

    2011-09-01

    Animal researchers are increasingly interested in individual differences in behavior. Their interpretation as meaningful differences in behavioral strategies stable over time and across contexts, adaptive, heritable, and acted upon by natural selection has triggered new theoretical developments. However, the analytical approaches used to explore behavioral data still address population-level phenomena, and statistical methods suitable to analyze individual behavior are rarely applied. I discuss fundamental investigative principles and analytical approaches to explore whether, in what ways, and under which conditions individual behavioral differences are actually meaningful. I elaborate the meta-theoretical ideas underlying common theoretical concepts and integrate them into an overarching meta-theoretical and methodological framework. This unravels commonalities and differences, and shows that assumptions of analogy to concepts of human personality are not always warranted and that some theoretical developments may be based on methodological artifacts. Yet, my results also highlight possible directions for new theoretical developments in animal behavior research. Copyright © 2011 Wiley Periodicals, Inc.

  19. Comparison of Methodologies of Activation Barrier Measurements for Reactions with Deactivation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Zhenhua; Yan, Binhang; Zhang, Li

    In this work, methodologies of activation barrier measurements for reactions with deactivation were theoretically analyzed. Reforming of ethane with CO2 was introduced as an example for reactions with deactivation to experimentally evaluate these methodologies. Both the theoretical and experimental results showed that due to catalyst deactivation, the conventional method would inevitably lead to a much lower activation barrier, compared to the intrinsic value, even though heat and mass transport limitations were excluded. In this work, an optimal method was identified in order to provide a reliable and efficient activation barrier measurement for reactions with deactivation.

  20. Comparison of Methodologies of Activation Barrier Measurements for Reactions with Deactivation

    DOE PAGES

    Xie, Zhenhua; Yan, Binhang; Zhang, Li; ...

    2017-01-25

    In this work, methodologies of activation barrier measurements for reactions with deactivation were theoretically analyzed. Reforming of ethane with CO2 was introduced as an example for reactions with deactivation to experimentally evaluate these methodologies. Both the theoretical and experimental results showed that due to catalyst deactivation, the conventional method would inevitably lead to a much lower activation barrier, compared to the intrinsic value, even though heat and mass transport limitations were excluded. In this work, an optimal method was identified in order to provide a reliable and efficient activation barrier measurement for reactions with deactivation.
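
    The conventional measurement fits ln(rate) against 1/T and reads the apparent barrier from the slope. The sketch below reproduces, with hypothetical rate data, the qualitative effect described above: progressive deactivation during the temperature sequence depresses the later rates and biases the fitted barrier low. The rate values and deactivation factors are assumptions for illustration, not the paper's data.

        import numpy as np

        def apparent_activation_energy(temperatures_K, rates):
            """Apparent activation barrier from an Arrhenius fit: the slope of ln(rate)
            against 1/T equals -Ea/R."""
            R = 8.314  # J/(mol K)
            slope, _ = np.polyfit(1.0 / np.asarray(temperatures_K), np.log(rates), 1)
            return -slope * R

        if __name__ == "__main__":
            # Hypothetical rate data (arbitrary units) following Ea = 100 kJ/mol exactly.
            T = np.array([823.0, 848.0, 873.0, 898.0])
            k_intrinsic = 1e6 * np.exp(-100e3 / (8.314 * T))
            # Same data, but with activity loss accumulating over the later measurements.
            k_deactivated = k_intrinsic * np.array([1.0, 0.85, 0.75, 0.65])
            print("intrinsic Ea [kJ/mol]:", round(apparent_activation_energy(T, k_intrinsic) / 1e3, 1))
            print("apparent Ea  [kJ/mol]:", round(apparent_activation_energy(T, k_deactivated) / 1e3, 1))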

  1. Full-field modal analysis during base motion excitation using high-speed 3D digital image correlation

    NASA Astrophysics Data System (ADS)

    Molina-Viedma, Ángel J.; López-Alba, Elías; Felipe-Sesé, Luis; Díaz, Francisco A.

    2017-10-01

    In recent years, many efforts have been made to exploit full-field measurement optical techniques for modal identification. Three-dimensional digital image correlation using high-speed cameras has been extensively employed for this purpose. Modal identification algorithms are applied to process the frequency response functions (FRF), which relate the displacement response of the structure to the excitation force. However, one of the most common tests for modal analysis involves the base motion excitation of a structural element instead of force excitation. In this case, the relationship between response and excitation is typically based on displacements, which are known as transmissibility functions. In this study, a methodology for experimental modal analysis using high-speed 3D digital image correlation and base motion excitation tests is proposed. In particular, a cantilever beam was excited from its base with a random signal, using a clamped edge join. Full-field transmissibility functions were obtained through the beam and converted into FRF for proper identification, considering a single degree-of-freedom theoretical conversion. Subsequently, modal identification was performed using a circle-fit approach. The proposed methodology facilitates the management of the typically large amounts of data points involved in the DIC measurement during modal identification. Moreover, it was possible to determine the natural frequencies, damping ratios and full-field mode shapes without requiring any additional tests. Finally, the results were experimentally validated by comparing them with those obtained by employing traditional accelerometers, analytical models and finite element method analyses. The comparison was performed by using the quantitative indicator modal assurance criterion. The results showed a high level of correspondence, consolidating the proposed experimental methodology.
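
    The modal assurance criterion used for the comparison is a normalised squared inner product between two mode-shape vectors. A minimal sketch with hypothetical DIC and FE mode shapes is shown below; the numerical values are illustrative assumptions only.

        import numpy as np

        def mac(phi_a, phi_b):
            """Modal assurance criterion between two mode-shape vectors: 1 means the
            shapes are identical up to scaling, 0 means they are orthogonal."""
            phi_a = np.asarray(phi_a, dtype=float)
            phi_b = np.asarray(phi_b, dtype=float)
            return (phi_a @ phi_b) ** 2 / ((phi_a @ phi_a) * (phi_b @ phi_b))

        if __name__ == "__main__":
            # Hypothetical first bending mode of a cantilever sampled at 6 points,
            # once from DIC measurements and once from an FE model.
            dic_mode = np.array([0.02, 0.09, 0.21, 0.38, 0.61, 1.00])
            fe_mode  = np.array([0.03, 0.10, 0.22, 0.40, 0.63, 1.00])
            print("MAC:", round(mac(dic_mode, fe_mode), 4))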

  2. The Deskilling Controversy.

    ERIC Educational Resources Information Center

    Attewell, Paul

    1987-01-01

    Braverman and others argue that capitalism continues to degrade and deskill work. The author presents theoretical, empirical, and methodological criticisms that highlight methodological weaknesses in the deskilling approach. (SK)

  3. Application of validity theory and methodology to patient-reported outcome measures (PROMs): building an argument for validity.

    PubMed

    Hawkins, Melanie; Elsworth, Gerald R; Osborne, Richard H

    2018-07-01

    Data from subjective patient-reported outcome measures (PROMs) are now being used in the health sector to make or support decisions about individuals, groups and populations. Contemporary validity theorists define validity not as a statistical property of the test but as the extent to which empirical evidence supports the interpretation of test scores for an intended use. However, validity testing theory and methodology are rarely evident in the PROM validation literature. Application of this theory and methodology would provide structure for comprehensive validation planning to support improved PROM development and sound arguments for the validity of PROM score interpretation and use in each new context. This paper proposes the application of contemporary validity theory and methodology to PROM validity testing. The validity testing principles will be applied to a hypothetical case study with a focus on the interpretation and use of scores from a translated PROM that measures health literacy (the Health Literacy Questionnaire or HLQ). Although robust psychometric properties of a PROM are a pre-condition to its use, a PROM's validity lies in the sound argument that a network of empirical evidence supports the intended interpretation and use of PROM scores for decision making in a particular context. The health sector is yet to apply contemporary theory and methodology to PROM development and validation. The theoretical and methodological processes in this paper are offered as an advancement of the theory and practice of PROM validity testing in the health sector.

  4. [Participatory education and the development of critical reading in teachers theoretical texts. Multicenter study].

    PubMed

    Leyva-González, Félix Arturo; Leo-Amador, Guillermo Enrique; Viniegra-Velázquez, Leonardo; Degollado-Bardales, Lilia; Zavala-Arenas, Jesús Arturo; González-Cobos, Roberto Palemón; Valencia-Sánchez, Jesús Salvador; Leyva-Salas, César Arturo; Angulo-Bernal, Sonia Elizabeth; Gómez-Arteaga, Gress Marissell

    2010-01-01

    To determine the relationship between the classroom participation of students attending courses at the Educational Research and Teacher Education Centers (CIEFDs) and the development of proficiency in critical reading of theoretical texts in education. Multicenter intervention study of students (medical specialists) at levels 1 and 2 of the Diploma in Teaching Methodology (DMDN 1, n=46; DMDN 2, n=29) from the six CIEFDs (DF Siglo XXI, Mexico City La Raza, Nuevo Leon, Sonora, Puebla and Veracruz), over the period March to August 2007, and students of a Masters in education (n=9, 2007-2008 cohort). Two instruments were constructed to evaluate the participation and critical-reading variables; their conceptual validity, content and reliability were assessed by experts in education research. The educational intervention took the form of seminars (three times a week in DMDN 1 and twice weekly in DMDN 2 and the Masters). Participation was assessed halfway through the course and on completion; critical reading was assessed at the beginning as well as at the end. Statistically significant associations were observed in DMDN 1 (four centers) and the Masters, but not in DMDN 2. In this investigation some of the theoretical proposals of participatory education were revisited, starting from the analysis of our results. In some centers and in the Masters, strengthening participation through this educational intervention was related to the development of critical reading of theoretical texts in education.

  5. 34 CFR 75.210 - General selection criteria.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... for research activities, and the use of appropriate theoretical and methodological tools, including... project implementation, and the use of appropriate methodological tools to ensure successful achievement...

  6. 34 CFR 75.210 - General selection criteria.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... for research activities, and the use of appropriate theoretical and methodological tools, including... project implementation, and the use of appropriate methodological tools to ensure successful achievement...

  7. 34 CFR 75.210 - General selection criteria.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... for research activities, and the use of appropriate theoretical and methodological tools, including... project implementation, and the use of appropriate methodological tools to ensure successful achievement...

  8. 34 CFR 75.210 - General selection criteria.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... for research activities, and the use of appropriate theoretical and methodological tools, including... project implementation, and the use of appropriate methodological tools to ensure successful achievement...

  9. 34 CFR 75.210 - General selection criteria.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... for research activities, and the use of appropriate theoretical and methodological tools, including... project implementation, and the use of appropriate methodological tools to ensure successful achievement...

  10. Optimal spatio-temporal design of water quality monitoring networks for reservoirs: Application of the concept of value of information

    NASA Astrophysics Data System (ADS)

    Maymandi, Nahal; Kerachian, Reza; Nikoo, Mohammad Reza

    2018-03-01

    This paper presents a new methodology for optimizing Water Quality Monitoring (WQM) networks of reservoirs and lakes using the concept of the value of information (VOI) and utilizing the results of a calibrated numerical water quality simulation model. With reference to value of information theory, the water quality of every checkpoint with a specific prior probability differs in time. After analyzing water quality samples taken from potential monitoring points, the posterior probabilities are updated using Bayes' theorem, and the VOI of the samples is calculated. In the next step, the stations with maximum VOI are selected as optimal stations. This process is repeated for each sampling interval to obtain optimal monitoring network locations for each interval. The results of the proposed VOI-based methodology are compared with those obtained using an entropy theoretic approach. As the results of the two methodologies can be partially different, in the next step, the results are combined using a weighting method. Finally, the optimal sampling interval and location of WQM stations are chosen using the Evidential Reasoning (ER) decision making method. The efficiency and applicability of the methodology are evaluated using available water quantity and quality data of the Karkheh Reservoir in the southwestern part of Iran.
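
    To make the VOI step concrete, the sketch below is a generic illustration rather than the authors' model: the states, sensor likelihoods and management losses are invented. It updates a prior over two water-quality states with Bayes' theorem and scores a candidate station by the expected reduction in decision loss.

```python
import numpy as np

# Two hypothetical water-quality states at a checkpoint.
prior = np.array([0.7, 0.3])                 # P(compliant), P(polluted)

# Likelihood that a sample reads "high" in each state (assumed sensor model).
p_high_given_state = np.array([0.1, 0.8])

# Loss of each management action in each state (assumed values).
loss = np.array([[0.0, 10.0],   # action "do nothing"
                 [2.0, 1.0]])   # action "intervene"

def expected_loss_of_best_action(p):
    return min(loss @ p)

# Decision loss with no new information.
loss_prior = expected_loss_of_best_action(prior)

# Expected posterior loss, averaging over the two possible sample outcomes.
p_high = p_high_given_state @ prior
post_high = p_high_given_state * prior / p_high              # Bayes' theorem
post_low = (1 - p_high_given_state) * prior / (1 - p_high)
loss_post = (p_high * expected_loss_of_best_action(post_high)
             + (1 - p_high) * expected_loss_of_best_action(post_low))

voi = loss_prior - loss_post   # value of sampling this station
print(f"VOI = {voi:.3f}")
```

    Stations with the largest VOI would be retained for the interval in question; the actual study repeats this over space and time and fuses the result with an entropy-based ranking.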

  11. [The triad configuration, humanist-existential-personal: a theoretical and methodological approach to psychiatric and mental health nursing].

    PubMed

    Vietta, E P

    1995-01-01

    The author establishes a research line based on a theoretical-methodological referential for the qualitative investigation of psychiatric and mental health nursing. Aspects of humanist and existential philosophies and of personalism were evaluated and integrated into a single perspective. In order to preserve the scientific rigour of research within this referential, the categorization process to be adopted in this kind of investigation is explained.

  12. Predictive Methodology for Delamination Growth in Laminated Composites Part 1: Theoretical Development and Preliminary Experimental Results

    DOT National Transportation Integrated Search

    1998-04-01

    A methodology is presented for the prediction of delamination growth in laminated structures. The methodology is aimed at overcoming computational difficulties in the determination of energy release rate and mode mix. It also addresses the issue that...

  13. A Multi-Level Systems Perspective for the Science of Team Science

    PubMed Central

    Börner, Katy; Contractor, Noshir; Falk-Krzesinski, Holly J.; Fiore, Stephen M.; Hall, Kara L.; Keyton, Joann; Spring, Bonnie; Stokols, Daniel; Trochim, William; Uzzi, Brian

    2012-01-01

    This Commentary describes recent research progress and professional developments in the study of scientific teamwork, an area of inquiry termed the “science of team science” (SciTS, pronounced “sahyts”). It proposes a systems perspective that incorporates a mixed-methods approach to SciTS that is commensurate with the conceptual, methodological, and translational complexities addressed within the SciTS field. The theoretically grounded and practically useful framework is intended to integrate existing and future lines of SciTS research to facilitate the field’s evolution as it addresses key challenges spanning macro, meso, and micro levels of analysis. PMID:20844283

  14. Paper Tools and Periodic Tables: Newlands and Mendeleev Draw Grids.

    PubMed

    Gordin, Michael D

    2018-02-01

    This essay elaborates on Ursula Klein's methodological concept of "paper tools" by drawing on several examples from the history of the periodic table. Moving from John A. R. Newlands's "Law of Octaves," to Dmitrii Mendeleev's first drafts of his periodic system in 1869, to Mendeleev's chemical speculations on the place of the ether within his classification, one sees that the ways in which the scientists presented the balance between empirical data and theoretical manipulation proved crucial for the chemical community's acceptance or rejection of their proposed innovations. This negotiated balance illustrates an underemphasised feature of Klein's conceptualisation of the ways in which a paper tool generates new knowledge.

  15. Historic perspectives from anthropology. Reflections proposed to Transcultural Nursing.

    PubMed

    Rohrbach Viadas, Cecilia

    2015-01-01

    History brings together meanings related to earlier periods, using awareness of the past as a panorama from which to reread the present. Madeleine Leininger presented in 1970 an implicit and respectful message to the nursing profession when introducing Nursing and Anthropology: Two Worlds to Blend; implicitly: nursing, you disregard culture. This article shows the absence of the history of anthropology and of nursing within Transcultural Nursing, and it examines how education has influenced theoretical, methodological, and comparative approaches, giving researchers the responsibility to decide their fundamentals. Berthoud (2001) has inspired the anthropological and historic perspectives of the author; thus universalism, relativism, and comparison are presented.

  16. Patient influences on satisfaction and loyalty for GP services.

    PubMed

    Rundle-Thiele, Sharyn; Russell-Bennett, Rebekah

    2010-04-01

    Little is known about the influence that patients themselves have on their loyalty to a general practitioner (GP). Consequently, a theoretical framework that draws on diverse literature is proposed to suggest that along with satisfaction, patient loyalty is an important outcome for GPs. Comprising 174 Australian patients, this study identified that knowledgeable patients reported lower levels of loyalty while older patients and patients visiting a GP more frequently reported higher levels of loyalty. The results suggest that extending patient-centered care practices to encompass all patients may be warranted in order to improve patient satisfaction and loyalty. Further, future research opportunities abound, with intervention and dyadic research methodologies recommended.

  17. A Selective Review of Group Selection in High-Dimensional Models

    PubMed Central

    Huang, Jian; Breheny, Patrick; Ma, Shuangge

    2013-01-01

    Grouping structures arise naturally in many statistical modeling problems. Several methods have been proposed for variable selection that respect grouping structure in variables. Examples include the group LASSO and several concave group selection methods. In this article, we give a selective review of group selection concerning methodological developments, theoretical properties and computational algorithms. We pay particular attention to group selection methods involving concave penalties. We address both group selection and bi-level selection methods. We describe several applications of these methods in nonparametric additive models, semiparametric regression, seemingly unrelated regressions, genomic data analysis and genome wide association studies. We also highlight some issues that require further study. PMID:24174707
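
    As a small, hedged illustration of the group-penalty machinery the review covers, the sketch below implements the group-LASSO proximal step (blockwise soft-thresholding). It is a generic textbook operator, not code from any of the reviewed papers, and the coefficient vector and grouping are invented.

```python
import numpy as np

def group_soft_threshold(beta, groups, lam):
    """Proximal operator of the group-LASSO penalty lam * sum_g ||beta_g||_2."""
    out = beta.copy()
    for idx in groups:
        norm = np.linalg.norm(beta[idx])
        # Shrink the whole block toward zero; drop it entirely if its norm <= lam.
        out[idx] = 0.0 if norm <= lam else (1.0 - lam / norm) * beta[idx]
    return out

beta = np.array([0.5, -0.2, 3.0, 2.0, 0.1])
groups = [np.array([0, 1]), np.array([2, 3]), np.array([4])]
print(group_soft_threshold(beta, groups, lam=1.0))
# The first and third groups are zeroed out; the second is shrunk but kept,
# which is the "all-in or all-out" behaviour that respects grouping structure.
```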

  18. "Epidemiological criminology": coming full circle.

    PubMed

    Akers, Timothy A; Lanier, Mark M

    2009-03-01

    Members of the public health and criminal justice disciplines often work with marginalized populations: people at high risk of drug use, health problems, incarceration, and other difficulties. As these fields increasingly overlap, distinctions between them are blurred, as numerous research reports and funding trends document. However, explicit theoretical and methodological linkages between the 2 disciplines remain rare. A new paradigm that links methods and statistical models of public health with those of their criminal justice counterparts is needed, as are increased linkages between epidemiological analogies, theories, and models and the corresponding tools of criminology. We outline disciplinary commonalities and distinctions, present policy examples that integrate similarities, and propose "epidemiological criminology" as a bridging framework.

  19. A multi-product green supply chain under government supervision with price and demand uncertainty

    NASA Astrophysics Data System (ADS)

    Hafezalkotob, Ashkan; Zamani, Soma

    2018-05-01

    In this paper, a bi-level game-theoretic model is proposed to investigate the effects of governmental financial intervention on a green supply chain. The problem is formulated as a bi-level program for a green supply chain that produces various products with different environmental pollution levels. The model also accounts for uncertainties in market demand and in the sale prices of raw materials and products. The model is further transformed into a single-level nonlinear programming problem by replacing the lower-level optimization problem with its Karush-Kuhn-Tucker optimality conditions. A genetic algorithm is applied as the solution methodology for the nonlinear programming model. Finally, to investigate the validity of the proposed method, the computational results obtained through the genetic algorithm are compared with the global optimal solutions attained by an enumerative method. Analytical results indicate that the proposed GA offers better solutions in large-size problems. We also conclude that governmental financial intervention, consisting of green taxation and subsidization, is an effective means of stabilizing the performance of green supply chain members.
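
    A minimal sketch of the genetic-algorithm solution step follows. It is purely illustrative: the quadratic objective below is a stand-in, not the single-level KKT reformulation from the paper, and the population size, crossover and mutation settings are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(x):
    # Stand-in nonlinear "profit" function on [0, 10]^2 (assumed, for illustration).
    return -(x[:, 0] - 3.0) ** 2 - (x[:, 1] - 7.0) ** 2 + 50.0

pop = rng.uniform(0.0, 10.0, size=(40, 2))
for _ in range(100):
    f = fitness(pop)
    parents = pop[np.argsort(f)[-20:]]                        # selection of the fittest half
    children = (parents[rng.integers(0, 20, 20)]
                + parents[rng.integers(0, 20, 20)]) / 2.0      # arithmetic crossover
    children += rng.normal(0.0, 0.3, children.shape)           # mutation
    children = np.clip(children, 0.0, 10.0)                    # keep within bounds
    pop = np.vstack([parents, children])

best = pop[np.argmax(fitness(pop))]
print("best decision variables:", best)   # should approach (3, 7)
```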

  20. State-vector formalism and the Legendre polynomial solution for modelling guided waves in anisotropic plates

    NASA Astrophysics Data System (ADS)

    Zheng, Mingfang; He, Cunfu; Lu, Yan; Wu, Bin

    2018-01-01

    We present a numerical method to solve the phase dispersion curves of guided waves in general anisotropic plates. The approach involves an exact solution to the problem in the form of Legendre polynomial expansions of multiple integrals, which we substitute into the state-vector formalism. In order to improve the efficiency of the proposed method, we made a special effort to lay out the analytical methodology. Furthermore, we analyzed the algebraic symmetries of the matrices in the state-vector formalism for anisotropic plates. The basic feature of the proposed method is the expansion of the field quantities in Legendre polynomials. The Legendre polynomial method avoids solving the transcendental dispersion equation, which can only be solved numerically. The state-vector formalism combined with Legendre polynomial expansion distinguishes adjacent dispersion modes clearly, even when the modes are very close. We then illustrate the theoretical solutions of the dispersion curves obtained by this method for isotropic and anisotropic plates. Finally, we compare the proposed method with the global matrix method (GMM), which shows excellent agreement.

  1. How Does the Low-Rank Matrix Decomposition Help Internal and External Learnings for Super-Resolution.

    PubMed

    Wang, Shuang; Yue, Bo; Liang, Xuefeng; Jiao, Licheng

    2018-03-01

    Wisely utilizing internal and external learning methods is a new challenge in the super-resolution problem. To address this issue, we analyze the attributes of the two methodologies and make two observations about the details they recover: 1) they are complementary in both the feature space and the image plane, and 2) they distribute sparsely in the spatial space. These observations inspire us to propose a low-rank solution which effectively integrates the two learning methods and thereby achieves a superior result. To fit this solution, the internal learning method and the external learning method are tailored to produce multiple preliminary results. Our theoretical analysis and experiments prove that the proposed low-rank solution does not require massive inputs to guarantee performance, thereby simplifying the design of the two learning methods for the solution. Intensive experiments show that the proposed solution improves on either single learning method in both qualitative and quantitative assessments. Surprisingly, it shows superior capability on noisy images and outperforms state-of-the-art methods.

  2. [Customer and patient satisfaction. An appropriate management tool in hospitals?].

    PubMed

    Pawils, S; Trojan, A; Nickel, S; Bleich, C

    2012-09-01

    Recently, the concept of patient satisfaction has been established as an essential part of the quality management of hospitals. Despite the concept's lack of theoretical and methodological foundations, patient surveys on subjective hospital experiences contribute immensely to the improvement of hospitals. What needs to be considered critically in this context is the concept of customer satisfaction for patients, the theoretical integration of empirical results, the reduction of false satisfaction indications and the application of risk-adjusted versus naïve benchmarking of data. This paper aims to contribute to the theoretical discussion of the topic and to build a basis for planning methodologically sound patient surveys.

  3. Theoretical study of γ-hexachlorocyclohexane and β-hexachlorocyclohexane isomers interaction with surface groups of activated carbon model.

    PubMed

    Enriquez-Victorero, Carlos; Hernández-Valdés, Daniel; Montero-Alejo, Ana Lilian; Durimel, Axelle; Gaspard, Sarra; Jáuregui-Haza, Ulises

    2014-06-01

    Activated carbon (AC) is employed in drinking water purification with almost no knowledge of the adsorption mechanism of persistent organic pollutants (POPs) onto it. Hexachlorocyclohexane (HCH) is an organochlorinated contaminant present in water and soils of banana crop production zones of the Caribbean. The most relevant isomers of HCH are γ-HCH and β-HCH, both with great environmental persistence. A theoretical study of the influence of AC surface groups (SGs) on HCH adsorption is done in order to help understand the process and may lead to improvements in the AC selection process. A simplified AC model consisting of naphthalene with a functional group was used to assess the influence of SGs on the adsorption process. The Multiple Minima Hypersurface (MMH) methodology was employed to study γ-HCH and β-HCH interactions with different AC SGs (hydroxyl and carboxyl) under different hydration and pH conditions. The results obtained showed that association of HCH with SGs preferentially occurs between the axial protons of HCH and the SG's oxygen atom, with the most favorable interactions occurring with charged SGs. An increase in carboxylic SG content is proposed to enhance HCH adsorption onto AC under neutral pH conditions. Finally, this work presents an inexpensive computer-aided methodology for preselecting activated carbon SG content for the removal of a given compound. Copyright © 2014 Elsevier Inc. All rights reserved.

  4. The Theoretical and Methodological Crisis of the Afrocentric Conception.

    ERIC Educational Resources Information Center

    Banks, W. Curtis

    1992-01-01

    Defines the theory of the Afrocentric conception, and comments on Afrocentric research methodology. The Afrocentric conception is likely to succeed if it constructs a particularist theory in contrast to cross-cultural relativism and because it relies on the methodology of the absolute rather than the comparative. (SLD)

  5. Choosing between Methodologies: An Inquiry into English Learning Processes in a Taiwanese Indigenous School

    ERIC Educational Resources Information Center

    Lin, Wen-Chuan

    2012-01-01

    Traditional, cognitive-oriented theories of English language acquisition tend to employ experimental modes of inquiry and neglect social, cultural and historical contexts. In this paper, I review the theoretical debate over methodology by examining ontological, epistemological and methodological controversies around cognitive-oriented theories. I…

  6. Comparing the Energy Content of Batteries, Fuels, and Materials

    ERIC Educational Resources Information Center

    Balsara, Nitash P.; Newman, John

    2013-01-01

    A methodology for calculating the theoretical and practical specific energies of rechargeable batteries, fuels, and materials is presented. The methodology enables comparison of the energy content of diverse systems such as the lithium-ion battery, hydrocarbons, and ammonia. The methodology is relevant for evaluating the possibility of using…

  7. Speciation of adsorbates on surface of solids by infrared spectroscopy and chemometrics.

    PubMed

    Vilmin, Franck; Bazin, Philippe; Thibault-Starzyk, Frédéric; Travert, Arnaud

    2015-09-03

    Speciation, i.e. identification and quantification, of surface species on heterogeneous surfaces by infrared spectroscopy is important in many fields but remains a challenging task when facing strongly overlapped spectra of multiple adspecies. Here, we propose a new methodology, combining state of the art instrumental developments for quantitative infrared spectroscopy of adspecies and chemometrics tools, mainly a novel data processing algorithm, called SORB-MCR (SOft modeling by Recursive Based-Multivariate Curve Resolution) and multivariate calibration. After formal transposition of the general linear mixture model to adsorption spectral data, the main issues, i.e. validity of Beer-Lambert law and rank deficiency problems, are theoretically discussed. Then, the methodology is exposed through application to two case studies, each of them characterized by a specific type of rank deficiency: (i) speciation of physisorbed water species over a hydrated silica surface, and (ii) speciation (chemisorption and physisorption) of a silane probe molecule over a dehydrated silica surface. In both cases, we demonstrate the relevance of this approach which leads to a thorough surface speciation based on comprehensive and fully interpretable multivariate quantitative models. Limitations and drawbacks of the methodology are also underlined. Copyright © 2015 Elsevier B.V. All rights reserved.
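
    The recursive SORB-MCR algorithm itself is not given in this abstract, but the underlying multivariate-curve-resolution idea can be illustrated with a plain, hedged MCR-ALS loop: generic alternating least squares with a crude non-negativity projection, not the authors' SORB-MCR, applied to synthetic spectra D ≈ C·S of two overlapping adspecies.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: 30 spectra x 200 wavenumbers built from 2 overlapping adspecies.
S_true = np.vstack([np.exp(-0.5 * ((np.arange(200) - mu) / 12.0) ** 2)
                    for mu in (80, 110)])              # (2, 200) pure spectra
C_true = rng.uniform(0.0, 1.0, size=(30, 2))           # (30, 2) concentrations
D = C_true @ S_true + 0.01 * rng.normal(size=(30, 200))

# MCR-ALS: alternate least-squares updates of C and S; clipping negatives
# stands in here for a proper non-negative least-squares solver.
S = rng.uniform(0.0, 1.0, size=(2, 200))               # random initial spectra
for _ in range(200):
    C = np.clip(D @ np.linalg.pinv(S), 0.0, None)      # update concentration profiles
    S = np.clip(np.linalg.pinv(C) @ D, 0.0, None)      # update pure-component spectra

print("relative residual:", np.linalg.norm(D - C @ S) / np.linalg.norm(D))
```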

  8. Laboratory meter-scale seismic monitoring of varying water levels in granular media

    NASA Astrophysics Data System (ADS)

    Pasquet, S.; Bodet, L.; Bergamo, P.; Guérin, R.; Martin, R.; Mourgues, R.; Tournat, V.

    2016-12-01

    Laboratory physical modelling and non-contacting ultrasonic techniques are frequently proposed to tackle theoretical and methodological issues related to geophysical prospecting. Following recent developments illustrating the ability of seismic methods to image spatial and/or temporal variations of water content in the vadose zone, we developed laboratory experiments aimed at testing the sensitivity of seismic measurements (i.e., pressure-wave travel times and surface-wave phase velocities) to water saturation variations. Ultrasonic techniques were used to simulate typical seismic acquisitions on small-scale controlled granular media presenting different water levels. Travel times and phase velocity measurements obtained at the dry state were validated with both theoretical models and numerical simulations and serve as reference datasets. The increasing water level clearly affects the recorded wave field in both its phase and amplitude, but the collected data cannot yet be inverted in the absence of a comprehensive theoretical model for such partially saturated and unconsolidated granular media. The differences in travel time and phase velocity observed between the dry and wet models show patterns that are interestingly coincident with the observed water level and depth of the capillary fringe, thus offering attractive perspectives for studying soil water content variations in the field.

  9. The effect of time synchronization of wireless sensors on the modal analysis of structures

    NASA Astrophysics Data System (ADS)

    Krishnamurthy, V.; Fowler, K.; Sazonov, E.

    2008-10-01

    Driven by the need to reduce the installation cost and maintenance cost of structural health monitoring (SHM) systems, wireless sensor networks (WSNs) are becoming increasingly popular. Perfect time synchronization amongst the wireless sensors is a key factor enabling the use of low-cost, low-power WSNs for structural health monitoring applications based on output-only modal analysis of structures. In this paper we present a theoretical framework for analysis of the impact created by time delays in the measured system response on the reconstruction of mode shapes using the popular frequency domain decomposition (FDD) technique. This methodology directly estimates the change in mode shape values based on sensor synchronicity. We confirm the proposed theoretical model by experimental validation in modal identification experiments performed on an aluminum beam. The experimental validation was performed using a wireless intelligent sensor and actuator network (WISAN) which allows for close time synchronization between sensors (0.6-10 µs in the tested configuration) and guarantees lossless data delivery under normal conditions. The experimental results closely match theoretical predictions and show that even very small delays in output response impact the mode shapes.
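
    A hedged numerical sketch of the core effect (generic, not the WISAN experiment or the authors' beam data): a synchronization delay τ at one sensor multiplies its spectrum by exp(-iωτ), so the first singular vector of the cross-spectral matrix, which is the FDD mode-shape estimate, picks up a phase error that grows with frequency and delay.

```python
import numpy as np

f_mode = 12.0                      # assumed modal frequency, Hz
phi_true = np.array([1.0, 0.8])    # assumed true mode shape at two sensor locations
tau = 5e-3                         # assumed 5 ms synchronization error at sensor 2

# Measured response vector at the modal frequency: sensor 2 lags by tau.
delay = np.array([1.0, np.exp(-1j * 2 * np.pi * f_mode * tau)])
phi_meas = phi_true * delay

# FDD takes the mode shape as the first singular vector of the
# cross-spectral density matrix G(f) ~ phi phi^H at the modal peak.
G = np.outer(phi_meas, phi_meas.conj())
u, s, vh = np.linalg.svd(G)
phi_fdd = u[:, 0]

phase_error_deg = np.degrees(np.angle(phi_fdd[1] / phi_fdd[0]))
print(f"phase error between sensors: {phase_error_deg:.1f} deg")  # about -360*f*tau
```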

  10. Critical Analysis of the Mathematical Formalism of Theoretical Physics. I. Foundations of Differential and Integral Calculus

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2013-04-01

    Critical analysis of the standard foundations of differential and integral calculus -- as the mathematical formalism of theoretical physics -- is proposed. The methodological basis of the analysis is the unity of formal logic and rational dialectics. It is shown that: (a) the foundations (i.e. dy/dx = lim_{Δx→0} Δy/Δx, lim_{Δx→0} Δy/Δx = lim_{Δx→0} [f(x + Δx) - f(x)]/Δx, dx = Δx, dy = Δy, where y = f(x) is a continuous function of one argument x; Δx and Δy are increments; dx and dy are differentials) do not satisfy the formal-logic law of identity; (b) the infinitesimal quantities dx, dy are fictitious quantities: they have neither algebraic nor geometrical meaning, because these quantities do not take numerical values and therefore have no quantitative measure; (c) expressions of the kind x + dx are erroneous because x (a finite quantity) and dx (an infinitely diminished quantity) have different sense, different qualitative determinacy; since x = const under Δx → 0, a derivative does not contain the variable quantity x and depends only on the constant c. Consequently, the standard concepts "infinitesimal quantity (uninterruptedly diminishing quantity)", "derivative" and "derivative as a function of a variable quantity" represent an incorrect basis of mathematics and theoretical physics.

  11. Load-Flow in Multiphase Distribution Networks: Existence, Uniqueness, Non-Singularity, and Linear Models

    DOE PAGES

    Bernstein, Andrey; Wang, Cong; Dall'Anese, Emiliano; ...

    2018-01-01

    This paper considers unbalanced multiphase distribution systems with generic topology and different load models, and extends the Z-bus iterative load-flow algorithm based on a fixed-point interpretation of the AC load-flow equations. Explicit conditions for existence and uniqueness of load-flow solutions are presented. These conditions also guarantee convergence of the load-flow algorithm to the unique solution. The proposed methodology is applicable to generic systems featuring (i) wye connections; (ii) ungrounded delta connections; (iii) a combination of wye-connected and delta-connected sources/loads; and (iv) a combination of line-to-line and line-to-grounded-neutral devices at the secondary of distribution transformers. Further, a sufficient condition for the non-singularity of the load-flow Jacobian is proposed. Finally, linear load-flow models are derived, and their approximation accuracy is analyzed. Theoretical results are corroborated through experiments on IEEE test feeders.
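
    A minimal single-phase sketch of the fixed-point Z-bus iteration the paper builds on. It is illustrative only: a 3-bus radial feeder with made-up impedances and constant-power loads, not one of the IEEE test feeders, and without the multiphase wye/delta handling the paper addresses.

```python
import numpy as np

# Per-unit line impedances of a small radial feeder: slack -> bus1 -> bus2 -> bus3 (assumed).
z_line = np.array([0.01 + 0.03j, 0.02 + 0.05j, 0.02 + 0.04j])

# Bus impedance matrix of the non-slack buses: Z[i, j] is the impedance of the
# path to the slack shared by buses i and j.
Z = np.array([[z_line[0]] * 3,
              [z_line[0], z_line[:2].sum(), z_line[:2].sum()],
              [z_line[0], z_line[:2].sum(), z_line.sum()]])

s_load = np.array([0.5 + 0.2j, 0.3 + 0.1j, 0.4 + 0.15j])  # constant-power loads (p.u.)
w = np.full(3, 1.0 + 0.0j)      # no-load voltage profile set by the slack bus

# Fixed-point iteration: V_{k+1} = w - Z @ conj(S_load / V_k)
v = w.copy()
for _ in range(30):
    v = w - Z @ np.conj(s_load / v)

print("bus voltage magnitudes:", np.abs(v))
```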

  12. Low-complexity piecewise-affine virtual sensors: theory and design

    NASA Astrophysics Data System (ADS)

    Rubagotti, Matteo; Poggi, Tomaso; Oliveri, Alberto; Pascucci, Carlo Alberto; Bemporad, Alberto; Storace, Marco

    2014-03-01

    This paper is focused on the theoretical development and the hardware implementation of low-complexity piecewise-affine direct virtual sensors for the estimation of unmeasured variables of interest of nonlinear systems. The direct virtual sensor is designed directly from measured inputs and outputs of the system and does not require a dynamical model. The proposed approach allows one to design estimators which mitigate the effect of the so-called 'curse of dimensionality' of simplicial piecewise-affine functions, and can be therefore applied to relatively high-order systems, enjoying convergence and optimality properties. An automatic toolchain is also presented to generate the VHDL code describing the digital circuit implementing the virtual sensor, starting from the set of measured input and output data. The proposed methodology is applied to generate an FPGA implementation of the virtual sensor for the estimation of vehicle lateral velocity, using a hardware-in-the-loop setting.

  13. Real-Time UV-Visible Spectroscopy Analysis of Purple Membrane-Polyacrylamide Film Formation Taking into Account Fano Line Shapes and Scattering

    PubMed Central

    Gomariz, María; Blaya, Salvador; Acebal, Pablo; Carretero, Luis

    2014-01-01

    We theoretically and experimentally analyze the formation of thick Purple Membrane (PM) polyacrylamide (PA) films by means of optical spectroscopy by considering the absorption of bacteriorhodopsin and scattering. We have applied semiclassical quantum mechanical techniques for the calculation of absorption spectra by taking into account the Fano effects on the ground state of bacteriorhodopsin. A model of the formation of PM-polyacrylamide films has been proposed based on the growth of polymeric chains around purple membrane. Experimentally, the temporal evolution of the polymerization process of acrylamide has been studied as function of the pH solution, obtaining a good correspondence to the proposed model. Thus, due to the formation of intermediate bacteriorhodopsin-doped nanogel, by controlling the polymerization process, an alternative methodology for the synthesis of bacteriorhodopsin-doped nanogels can be provided. PMID:25329473

  14. Real-time UV-visible spectroscopy analysis of purple membrane-polyacrylamide film formation taking into account Fano line shapes and scattering.

    PubMed

    Gomariz, María; Blaya, Salvador; Acebal, Pablo; Carretero, Luis

    2014-01-01

    We theoretically and experimentally analyze the formation of thick Purple Membrane (PM) polyacrylamide (PA) films by means of optical spectroscopy by considering the absorption of bacteriorhodopsin and scattering. We have applied semiclassical quantum mechanical techniques for the calculation of absorption spectra by taking into account the Fano effects on the ground state of bacteriorhodopsin. A model of the formation of PM-polyacrylamide films has been proposed based on the growth of polymeric chains around purple membrane. Experimentally, the temporal evolution of the polymerization process of acrylamide has been studied as function of the pH solution, obtaining a good correspondence to the proposed model. Thus, due to the formation of intermediate bacteriorhodopsin-doped nanogel, by controlling the polymerization process, an alternative methodology for the synthesis of bacteriorhodopsin-doped nanogels can be provided.

  15. Load-Flow in Multiphase Distribution Networks: Existence, Uniqueness, Non-Singularity, and Linear Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernstein, Andrey; Wang, Cong; Dall'Anese, Emiliano

    This paper considers unbalanced multiphase distribution systems with generic topology and different load models, and extends the Z-bus iterative load-flow algorithm based on a fixed-point interpretation of the AC load-flow equations. Explicit conditions for existence and uniqueness of load-flow solutions are presented. These conditions also guarantee convergence of the load-flow algorithm to the unique solution. The proposed methodology is applicable to generic systems featuring (i) wye connections; (ii) ungrounded delta connections; (iii) a combination of wye-connected and delta-connected sources/loads; and (iv) a combination of line-to-line and line-to-grounded-neutral devices at the secondary of distribution transformers. Further, a sufficient condition for the non-singularity of the load-flow Jacobian is proposed. Finally, linear load-flow models are derived, and their approximation accuracy is analyzed. Theoretical results are corroborated through experiments on IEEE test feeders.

  16. The Topological Field Theory of Data: a program towards a novel strategy for data mining through data language

    NASA Astrophysics Data System (ADS)

    Rasetti, M.; Merelli, E.

    2015-07-01

    This paper aims to challenge the current thinking in IT for the 'Big Data' question, proposing - almost verbatim, with no formulas - a program aiming to construct an innovative methodology to perform data analytics in a way that returns an automaton as a recognizer of the data language: a Field Theory of Data. We suggest to build, directly out of probing data space, a theoretical framework enabling us to extract the manifold hidden relations (patterns) that exist among data, as correlations depending on the semantics generated by the mining context. The program, that is grounded in the recent innovative ways of integrating data into a topological setting, proposes the realization of a Topological Field Theory of Data, transferring and generalizing to the space of data notions inspired by physical (topological) field theories and harnesses the theory of formal languages to define the potential semantics necessary to understand the emerging patterns.

  17. Distinction between added-energy and phase-resetting mechanisms in non-invasively detected somatosensory evoked responses.

    PubMed

    Fedele, T; Scheer, H-J; Burghoff, M; Waterstraat, G; Nikulin, V V; Curio, G

    2013-01-01

    Non-invasively recorded averaged event-related potentials (ERP) represent a convenient opportunity to investigate human brain perceptive and cognitive processes. Nevertheless, generative ERP mechanisms are still debated. Two previous approaches have been contested in the past: the added-energy model in which the response raises independently from the ongoing background activity, and the phase-reset model, based on stimulus-driven synchronization of oscillatory ongoing activity. Many criteria for the distinction of these two models have been proposed, but there is no definitive methodology to disentangle them, owing also to the limited information at the single trial level. Here, we propose a new approach combining low-noise EEG technology and multivariate decomposition techniques. We present theoretical analyses based on simulated data and identify in high-frequency somatosensory evoked responses an optimal target for the distinction between the two mechanisms.

  18. A Mode-Shape-Based Fault Detection Methodology for Cantilever Beams

    NASA Technical Reports Server (NTRS)

    Tejada, Arturo

    2009-01-01

    An important goal of NASA's Internal Vehicle Health Management program (IVHM) is to develop and verify methods and technologies for fault detection in critical airframe structures. A particularly promising new technology under development at NASA Langley Research Center is distributed Bragg fiber optic strain sensors. These sensors can be embedded in, for instance, aircraft wings to continuously monitor surface strain during flight. Strain information can then be used in conjunction with well-known vibrational techniques to detect faults due to changes in the wing's physical parameters or to the presence of incipient cracks. To verify the benefits of this technology, the Formal Methods Group at NASA LaRC has proposed the use of formal verification tools such as PVS. The verification process, however, requires knowledge of the physics and mathematics of the vibrational techniques and a clear understanding of the particular fault detection methodology. This report presents a succinct review of the physical principles behind the modeling of vibrating structures such as cantilever beams (the natural model of a wing). It also reviews two different classes of fault detection techniques and proposes a particular detection method for cracks in wings, which is amenable to formal verification. A prototype implementation of these methods using Matlab scripts is also described and is related to the fundamental theoretical concepts.
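
    For reference, the vibrational quantities underlying such a detection scheme follow from Euler-Bernoulli beam theory. The hedged sketch below computes the first three natural frequencies of a uniform aluminum cantilever with made-up dimensions (not a NASA wing model or the report's actual test article).

```python
import numpy as np

# Assumed aluminum cantilever properties (illustrative values only).
E = 69e9            # Young's modulus, Pa
rho = 2700.0        # density, kg/m^3
L = 0.50            # length, m
b, h = 0.03, 0.003  # width and thickness, m

A = b * h                 # cross-section area
I = b * h ** 3 / 12.0     # second moment of area

# Roots of the cantilever characteristic equation cos(bL)*cosh(bL) = -1.
beta_L = np.array([1.8751, 4.6941, 7.8548])

# Natural frequencies: f_n = (beta_n L)^2 / (2 pi L^2) * sqrt(E I / (rho A))
f_n = beta_L ** 2 / (2.0 * np.pi * L ** 2) * np.sqrt(E * I / (rho * A))
print("first three natural frequencies [Hz]:", np.round(f_n, 1))
# A crack or other stiffness change shifts these frequencies and distorts the
# mode shapes, which is what a mode-shape-based detector looks for.
```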

  19. Understanding system dynamics of an adaptive enzyme network from globally profiled kinetic parameters.

    PubMed

    Chiang, Austin W T; Liu, Wei-Chung; Charusanti, Pep; Hwang, Ming-Jing

    2014-01-15

    A major challenge in mathematical modeling of biological systems is to determine how model parameters contribute to systems dynamics. As biological processes are often complex in nature, it is desirable to address this issue using a systematic approach. Here, we propose a simple methodology that first performs an enrichment test to find patterns in the values of globally profiled kinetic parameters with which a model can produce the required system dynamics; this is then followed by a statistical test to elucidate the association between individual parameters and different parts of the system's dynamics. We demonstrate our methodology on a prototype biological system of perfect adaptation dynamics, namely the chemotaxis model for Escherichia coli. Our results agreed well with those derived from experimental data and theoretical studies in the literature. Using this model system, we showed that there are motifs in kinetic parameters and that these motifs are governed by constraints of the specified system dynamics. A systematic approach based on enrichment statistical tests has been developed to elucidate the relationships between model parameters and the roles they play in affecting system dynamics of a prototype biological network. The proposed approach is generally applicable and therefore can find wide use in systems biology modeling research.
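
    A hedged sketch of the enrichment step: a generic Fisher's exact test on made-up counts, not the chemotaxis model's actual parameter sets or the paper's exact statistics. It asks whether parameter sets producing perfect adaptation are over-represented among those with a low value of one kinetic parameter.

```python
from scipy.stats import fisher_exact

# Hypothetical counts from a global parameter scan (illustrative numbers only):
# rows: parameter k below / above its median; columns: adaptive / non-adaptive.
table = [[180, 320],   # low k:  180 adaptive, 320 non-adaptive
         [ 60, 440]]   # high k:  60 adaptive, 440 non-adaptive

odds_ratio, p_value = fisher_exact(table, alternative="greater")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2e}")
# A small p-value suggests low values of k are enriched among parameter sets
# that reproduce the required (perfectly adapting) dynamics.
```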

  20. Rethinking Case Study Methodology in Poststructural Research.

    PubMed

    Mohammed, Shan; Peter, Elizabeth; Gastaldo, Denise; Howell, Doris

    2015-03-01

    Little consideration has been given to how case study might be used in poststructural research to explore power relations that constitute a phenomenon. Many case study scholars, most notably Robert Yin, adopt a postpositivist perspective that assumes the "truth" can be accessed through applying prescriptive and rigid research techniques. Using a discussion of Michel Foucault's key theoretical ideas and the insights gained through a Foucauldian case study of people with advanced cancer who continue to receive curative treatment, the authors argue for the expansion of case study in poststructural inquiry. They propose that the use of poststructuralist case study is valuable because of the flexibility and comprehensiveness of the methodology, which allows for the exploration of a deeper understanding of the broader discourses that shape a phenomenon, as well as how power/knowledge relations shape the behaviours and perceptions of people. They also introduce the reflexive implications of poststructural case study research. Copyright© by Ingram School of Nursing, McGill University.

  1. [Strategic health planning based on determinants: case of the municipality of Campo Bom, Rio Grande do Sul State. A methodological proposal for the decentralized management].

    PubMed

    González, Martín Maximino León

    2009-10-01

    With the purpose of analyzing the determinant-based strategic health planning model experienced in the municipality of Campo Bom, Rio Grande do Sul State, an observational, qualitative study was conducted, based on documental analysis as well as an evaluation of new process technologies in local health administration. The study contains an analysis of the methodological coherence and applicability of this model, based on a review of the elaborated plans. The plans presented in the Campo Bom case show the possibility of integrating and applying, at the local level, a strategic health planning model oriented to the new health concepts, considering elements of different theoretical developments that enable a response to the most common local needs and situations. Evolutionary stages of health planning were identified, and the integrative elements of the model and the limitations of its application were analyzed, pointing to the need to support further study and development in this field.

  2. Simulation of Ejecta Production and Mixing Process of Sn Sample under shock loading

    NASA Astrophysics Data System (ADS)

    Wang, Pei; Chen, Dawei; Sun, Haiquan; Ma, Dongjun

    2017-06-01

    Ejection may occur when a strong shock wave releases at the free surface of a metal, forming ejecta of high-speed particulate matter that further mixes with the surrounding gas. Ejecta production and its mixing process remain among the most difficult unresolved problems in shock physics, with many important engineering applications in imploding compression science. The present paper introduces a methodology for the theoretical modeling and numerical simulation of this complex ejection and mixing process. Ejecta production is decoupled from the particle mixing process, and the ejecta state is obtained by direct numerical simulation of the evolution of initial defects on the metal surface. The particle mixing process is then simulated and resolved by a two-phase gas-particle model which uses the aforementioned ejecta state as its initial condition. A preliminary ejecta experiment on a planar Sn metal sample has validated the feasibility of the proposed methodology.

  3. Cost-benefit analysis of water-reuse projects for environmental purposes: a case study for Spanish wastewater treatment plants.

    PubMed

    Molinos-Senante, M; Hernández-Sancho, F; Sala-Garrido, R

    2011-12-01

    Water reuse is an emerging and promising non-conventional water resource. Feasibility studies are essential tools in the decision making process for the implementation of water-reuse projects. However, the methods used to assess economic feasibility tend to focus on internal costs, while external impacts are relegated to unsubstantiated statements about the advantages of water reuse. Using the concept of shadow prices for undesirable outputs of water reclamation, the current study developed a theoretical methodology to assess internal and external economic impacts. The proposed methodological approach is applied to 13 wastewater treatment plants in the Valencia region of Spain that reuse effluent for environmental purposes. Internal benefit analyses indicated that only a proportion of projects were economically viable, while when external benefits are incorporated all projects were economically viable. In conclusion, the economic feasibility assessments of water-reuse projects should quantitatively evaluate economic, environmental and resource availability. Copyright © 2011 Elsevier Ltd. All rights reserved.
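
    A toy numerical sketch of the feasibility logic follows; all figures, pollutants and shadow prices are invented, not the Valencia plant data. It shows how a reuse project that fails on internal benefits alone can become viable once the shadow-priced external benefits of avoided pollutant discharge are added.

```python
# All figures are hypothetical, in million EUR per year.
internal_benefits = 0.8          # e.g. water sales, avoided freshwater abstraction
annualized_costs = 1.1           # capital plus operation and maintenance

# External benefits: avoided discharge (tonnes/yr) valued at assumed shadow prices (EUR/kg).
avoided_discharge_t = {"N": 40.0, "P": 6.0}
shadow_price_eur_kg = {"N": 16.0, "P": 30.0}
external_benefits = sum(avoided_discharge_t[k] * 1000 * shadow_price_eur_kg[k]
                        for k in avoided_discharge_t) / 1e6

for label, benefits in [("internal only", internal_benefits),
                        ("internal + external", internal_benefits + external_benefits)]:
    ratio = benefits / annualized_costs
    print(f"{label}: benefit/cost = {ratio:.2f} ({'viable' if ratio > 1 else 'not viable'})")
```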

  4. Enzymatic cybernetics: an unpublished work by Jacques Monod.

    PubMed

    Gayon, Jean

    2015-06-01

    In 1959, Jacques Monod wrote a manuscript entitled Cybernétique enzymatique [Enzymatic cybernetics]. Never published, this unpublished manuscript presents a synthesis of how Monod interpreted enzymatic adaptation just before the publication of the famous papers of the 1960s on the operon. In addition, Monod offers an example of a philosophy of biology immersed in scientific investigation. Monod's philosophical thoughts are classified into two categories, methodological and ontological. On the methodological side, Monod explicitly hints at his preferences regarding the scientific method in general: hypothetical-deductive method, and use of theoretical models. He also makes heuristic proposals regarding molecular biology: the need to analyse the phenomena in question at the level of individual cells, and the dual aspect of all biological explanation, functional and evolutionary. Ontological issues deal with the notions of information and genetic determinism, "cellular memory", the irrelevance of the notion of "living matter", and the usefulness of a cybernetic comprehension of molecular biology. Copyright © 2015 Académie des sciences. Published by Elsevier SAS. All rights reserved.

  5. QUADRO: A SUPERVISED DIMENSION REDUCTION METHOD VIA RAYLEIGH QUOTIENT OPTIMIZATION.

    PubMed

    Fan, Jianqing; Ke, Zheng Tracy; Liu, Han; Xia, Lucy

    We propose a novel Rayleigh quotient based sparse quadratic dimension reduction method-named QUADRO (Quadratic Dimension Reduction via Rayleigh Optimization)-for analyzing high-dimensional data. Unlike in the linear setting where Rayleigh quotient optimization coincides with classification, these two problems are very different under nonlinear settings. In this paper, we clarify this difference and show that Rayleigh quotient optimization may be of independent scientific interests. One major challenge of Rayleigh quotient optimization is that the variance of quadratic statistics involves all fourth cross-moments of predictors, which are infeasible to compute for high-dimensional applications and may accumulate too many stochastic errors. This issue is resolved by considering a family of elliptical models. Moreover, for heavy-tail distributions, robust estimates of mean vectors and covariance matrices are employed to guarantee uniform convergence in estimating non-polynomially many parameters, even though only the fourth moments are assumed. Methodologically, QUADRO is based on elliptical models which allow us to formulate the Rayleigh quotient maximization as a convex optimization problem. Computationally, we propose an efficient linearized augmented Lagrangian method to solve the constrained optimization problem. Theoretically, we provide explicit rates of convergence in terms of Rayleigh quotient under both Gaussian and general elliptical models. Thorough numerical results on both synthetic and real datasets are also provided to back up our theoretical results.
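
    For intuition about the objective QUADRO optimizes, the hedged sketch below (unregularized, dense, low-dimensional, and kept linear for brevity; the actual method works with quadratic statistics, adds sparsity, and uses robust elliptical-model estimates) maximizes a Rayleigh quotient by solving the associated generalized eigenproblem on synthetic two-class data.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)

# Toy two-class data (invented).
X0 = rng.normal(loc=0.0, size=(200, 5))
X1 = rng.normal(loc=0.5, size=(200, 5))

# Rayleigh quotient R(w) = (w' A w) / (w' B w): between-class vs within-class scatter.
diff = (X1.mean(axis=0) - X0.mean(axis=0))[:, None]
A = diff @ diff.T
B = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)

# The maximizer is the top generalized eigenvector of (A, B).
eigvals, eigvecs = eigh(A, B)
w = eigvecs[:, -1]
print("max Rayleigh quotient:", float(w @ A @ w / (w @ B @ w)))
```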

  6. Reverse Engineering Cellular Networks with Information Theoretic Methods

    PubMed Central

    Villaverde, Alejandro F.; Ross, John; Banga, Julio R.

    2013-01-01

    Building mathematical models of cellular networks lies at the core of systems biology. It involves, among other tasks, the reconstruction of the structure of interactions between molecular components, which is known as network inference or reverse engineering. Information theory can help in the goal of extracting as much information as possible from the available data. A large number of methods founded on these concepts have been proposed in the literature, not only in biology journals, but in a wide range of areas. Their critical comparison is difficult due to the different focuses and the adoption of different terminologies. Here we attempt to review some of the existing information theoretic methodologies for network inference, and clarify their differences. While some of these methods have achieved notable success, many challenges remain, among which we can mention dealing with incomplete measurements, noisy data, counterintuitive behaviour emerging from nonlinear relations or feedback loops, and computational burden of dealing with large data sets. PMID:24709703
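
    A minimal sketch of the information-theoretic idea: a plain histogram-based mutual-information "relevance network" on synthetic data, not any specific published inference algorithm, with invented variables standing in for gene-expression profiles.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram estimate of mutual information between two 1-D samples (in nats)."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return float(np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])))

# Synthetic "expression" data: g2 depends nonlinearly on g1, g3 is independent.
rng = np.random.default_rng(0)
g1 = rng.normal(size=2000)
g2 = np.sin(g1) + 0.1 * rng.normal(size=2000)
g3 = rng.normal(size=2000)

print("MI(g1, g2) =", round(mutual_information(g1, g2), 3))  # high: would infer an edge
print("MI(g1, g3) =", round(mutual_information(g1, g3), 3))  # near zero: no edge
```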

  7. Blue space geographies: Enabling health in place.

    PubMed

    Foley, Ronan; Kistemann, Thomas

    2015-09-01

    Drawing from research on therapeutic landscapes and relationships between environment, health and wellbeing, we propose the idea of 'healthy blue space' as an important new development. Complementing research on healthy green space, blue space is defined as 'health-enabling places and spaces, where water is at the centre of a range of environments with identifiable potential for the promotion of human wellbeing'. Using theoretical ideas from emotional and relational geographies and critical understandings of salutogenesis, the value of blue space to health and wellbeing is recognised and evaluated. Six individual papers from five different countries consider how health can be enabled in mixed blue space settings. Four sub-themes, embodiment, inter-subjectivity, activity and meaning, document multiple experiences within a range of healthy blue spaces. Finally, we suggest a considerable research agenda - theoretical, methodological and applied - for future work within different forms of blue space. All are suggested as having public health policy relevance in social and public space. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Radiation impact on the characteristics of optical glasses test results on a selected set of materials

    NASA Astrophysics Data System (ADS)

    Fruit, Michel; Gussarov, Andrei; Berghmans, Francis; Doyle, Dominic; Ulbrich, Gerd

    2017-11-01

    It is well known within the space optics community that radiation may significantly affect the transmittance of glasses. To overcome this drawback, glass manufacturers have developed cerium-doped counterparts of classical glasses. These doped glasses display much less transmittance sensitivity to radiation. Still, the impact of radiation on refractive index is less well known and may affect classical and cerium-doped glasses indifferently. ESTEC has initiated an R&D program with the aim of establishing a comprehensive database gathering radiation sensitivity data, called dose coefficients, for all the glass optical parameters (transmittance, refractive index, compaction, etc.). The first part of this study, to define the methodology for such a database, is run by ASTRIUM SAS in co-operation with SCK CEN. It covers theoretical studies associated with the testing of a selected set of classical and "radiation hardened" glasses. We propose here to present first the theoretical background of this study and then to give the results obtained so far.

  9. Assessment of Environmental Enteropathy in the MAL-ED Cohort Study: Theoretical and Analytic Framework

    PubMed Central

    Kosek, Margaret; Guerrant, Richard L.; Kang, Gagandeep; Bhutta, Zulfiqar; Yori, Pablo Peñataro; Gratz, Jean; Gottlieb, Michael; Lang, Dennis; Lee, Gwenyth; Haque, Rashidul; Mason, Carl J.; Ahmed, Tahmeed; Lima, Aldo; Petri, William A.; Houpt, Eric; Olortegui, Maribel Paredes; Seidman, Jessica C.; Mduma, Estomih; Samie, Amidou; Babji, Sudhir

    2014-01-01

    Individuals in the developing world live in conditions of intense exposure to enteric pathogens due to suboptimal water and sanitation. These environmental conditions lead to alterations in intestinal structure, function, and local and systemic immune activation that are collectively referred to as environmental enteropathy (EE). This condition, although poorly defined, is likely to be exacerbated by undernutrition as well as being responsible for permanent growth deficits acquired in early childhood, vaccine failure, and loss of human potential. This article addresses the underlying theoretical and analytical frameworks informing the methodology proposed by the Etiology, Risk Factors and Interactions of Enteric Infections and Malnutrition and the Consequences for Child Health and Development (MAL-ED) cohort study to define and quantify the burden of disease caused by EE within a multisite cohort. Additionally, we will discuss efforts to improve, standardize, and harmonize laboratory practices within the MAL-ED Network. These efforts will address current limitations in the understanding of EE and its burden on children in the developing world. PMID:25305293

  10. Establishing a Research Agenda for Understanding the Role and Impact of Mental Health Peer Specialists.

    PubMed

    Chinman, Matthew; McInnes, D Keith; Eisen, Susan; Ellison, Marsha; Farkas, Marianne; Armstrong, Moe; Resnick, Sandra G

    2017-09-01

    Mental health peer specialists are individuals with serious mental illnesses who receive training to use their lived experiences to help others with serious mental illnesses in clinical settings. This Open Forum discusses the state of the research for mental health peer specialists and suggests a research agenda to advance the field. Studies have suggested that peer specialists vary widely in their roles, settings, and theoretical orientations. Theories of action have been proposed, but none have been tested. Outcome studies have shown benefits of peer specialists; however, many studies have methodological shortcomings. Qualitative descriptions of peer specialists are plentiful but lack grounding in implementation science frameworks. A research agenda advancing the field could include empirically testing theoretical mechanisms of peer specialists, developing a measure of peer specialist fidelity, conducting more rigorous outcomes studies, involving peer specialists in executing the research, and assessing various factors that influence implementing peer specialist services and testing strategies that could address those factors.

  11. Theoretical and Methodological Basis of Inclusive Education in the Researches of Russian Scientists in the First Quarter of 20th Century (P. P. Blonsky, L. S. Vygotsky, V. P. Kaschenko, S. T. Shatsky)

    ERIC Educational Resources Information Center

    Akhmetova, Daniya Z.; Chelnokova, Tatyana A.; Morozova, Ilona G.

    2017-01-01

    Article is devoted to the scientific heritage of educators and psychologists of Russia in the first quarter of the twentieth century. The aim of the research is the identification of the most significant ideas of P. P. Blonsky, L. S. Vygotsky, V. P. Kaschenko and S. T. Shatsky which formed the theoretical and methodological basis of inclusive…

  12. Modeling, Analyzing, and Mitigating Dissonance Between Alerting Systems

    NASA Technical Reports Server (NTRS)

    Song, Lixia; Kuchar, James K.

    2003-01-01

    Alerting systems are becoming pervasive in process operations, which may result in the potential for dissonance or conflict in information from different alerting systems that suggests different threat levels and/or actions to resolve hazards. Little is currently available to help in predicting or solving the dissonance problem. This thesis presents a methodology to model and analyze dissonance between alerting systems, providing both a theoretical foundation for understanding dissonance and a practical basis from which specific problems can be addressed. A state-space representation of multiple alerting system operation is generalized that can be tailored across a variety of applications. Based on the representation, two major causes of dissonance are identified: logic differences and sensor error. Additionally, several possible types of dissonance are identified. A mathematical analysis method is developed to identify the conditions for dissonance originating from logic differences. A probabilistic analysis methodology is developed to estimate the probability of dissonance originating from sensor error, and to compare the relative contribution to dissonance of sensor error against the contribution from logic differences. A hybrid model, which describes the dynamic behavior of the process with multiple alerting systems, is developed to identify dangerous dissonance space, from which the process can lead to disaster. Methodologies to avoid or mitigate dissonance are outlined. Two examples are used to demonstrate the application of the methodology. First, a conceptual In-Trail Spacing example is presented. The methodology is applied to identify the conditions for possible dissonance, to identify relative contribution of logic difference and sensor error, and to identify dangerous dissonance space. Several proposed mitigation methods are demonstrated in this example. In the second example, the methodology is applied to address the dissonance problem between two air traffic alert and avoidance systems: the existing Traffic Alert and Collision Avoidance System (TCAS) vs. the proposed Airborne Conflict Management system (ACM). Conditions on ACM resolution maneuvers are identified to avoid dynamic dissonance between TCAS and ACM. Also included in this report is an Appendix written by Lee Winder about recent and continuing work on alerting systems design. The application of Markov Decision Process (MDP) theory to complex alerting problems is discussed and illustrated with an abstract example system.

  13. A movement ecology paradigm for unifying organismal movement research

    PubMed Central

    Nathan, Ran; Getz, Wayne M.; Revilla, Eloy; Holyoak, Marcel; Kadmon, Ronen; Saltz, David; Smouse, Peter E.

    2008-01-01

    Movement of individual organisms is fundamental to life, quilting our planet in a rich tapestry of phenomena with diverse implications for ecosystems and humans. Movement research is both plentiful and insightful, and recent methodological advances facilitate obtaining a detailed view of individual movement. Yet, we lack a general unifying paradigm, derived from first principles, which can place movement studies within a common context and advance the development of a mature scientific discipline. This introductory article to the Movement Ecology Special Feature proposes a paradigm that integrates conceptual, theoretical, methodological, and empirical frameworks for studying movement of all organisms, from microbes to trees to elephants. We introduce a conceptual framework depicting the interplay among four basic mechanistic components of organismal movement: the internal state (why move?), motion (how to move?), and navigation (when and where to move?) capacities of the individual and the external factors affecting movement. We demonstrate how the proposed framework aids the study of various taxa and movement types; promotes the formulation of hypotheses about movement; and complements existing biomechanical, cognitive, random, and optimality paradigms of movement. The proposed framework integrates eclectic research on movement into a structured paradigm and aims at providing a basis for hypothesis generation and a vehicle facilitating the understanding of the causes, mechanisms, and spatiotemporal patterns of movement and their role in various ecological and evolutionary processes. "Now we must consider in general the common reason for moving with any movement whatever." (Aristotle, De Motu Animalium, 4th century B.C.) PMID:19060196

  14. Methodology of Diagnostics of Interethnic Relations and Ethnosocial Processes

    ERIC Educational Resources Information Center

    Maximova, Svetlana G.; Noyanzina, Oksana Ye.; Omelchenko, Daria A.; Maximov, Maxim B.; Avdeeva, Galina C.

    2016-01-01

    The purpose of this study was to research the methodological approaches to the study of interethnic relations and ethno-social processes. The analysis of the literature was conducted in three main areas: 1) the theoretical and methodological issues of organizing the research of inter-ethnic relations, allowing the authors to highlight the current…

  15. Complexity, Representation and Practice: Case Study as Method and Methodology

    ERIC Educational Resources Information Center

    Miles, Rebecca

    2015-01-01

    While case study is considered a common approach to examining specific and particular examples in research disciplines such as law, medicine and psychology, in the social sciences case study is often treated as a lesser, flawed or undemanding methodology which is less valid, reliable or theoretically rigorous than other methodologies. Building on…

  16. Towards Developing a Theoretical Framework for Measuring Public Sector Managers' Career Success

    ERIC Educational Resources Information Center

    Rasdi, Roziah Mohd; Ismail, Maimunah; Uli, Jegak; Noah, Sidek Mohd

    2009-01-01

    Purpose: The purpose of this paper is to develop a theoretical framework for measuring public sector managers' career success. Design/methodology/approach: The theoretical foundation used in this study is social cognitive career theory. To conduct a literature search, several keywords were identified, i.e. career success, objective and subjective…

  17. The Importance of Theoretical Frameworks and Mathematical Constructs in Designing Digital Tools

    ERIC Educational Resources Information Center

    Trinter, Christine

    2016-01-01

    The increase in availability of educational technologies over the past few decades has not only led to new practice in teaching mathematics but also to new perspectives in research, methodologies, and theoretical frameworks within mathematics education. Hence, the amalgamation of theoretical and pragmatic considerations in digital tool design…

  18. Toward Theory-Based Research in Political Communication.

    ERIC Educational Resources Information Center

    Simon, Adam F.; Iyengar, Shanto

    1996-01-01

    Praises the theoretical and methodological potential of the field of political communication. Calls for greater interaction and cross fertilization among the fields of political science, sociology, economics, and psychology. Briefly discusses relevant research methodologies. (MJP)

  19. A philosophy of rivers: Equilibrium states, channel evolution, teleomatic change and least action principle

    NASA Astrophysics Data System (ADS)

    Nanson, Gerald C.; Huang, He Qing

    2018-02-01

    Until recently no universal agreement as to a philosophical or scientific methodological framework has been proposed to guide the study of fluvial geomorphology. An understanding of river form and process requires an understanding of the principles that govern the behaviour and evolution of alluvial rivers at the most fundamental level. To date, the investigations of such principles have followed four approaches: develop qualitative unifying theories that are usually untested; collect and examine data visually and statistically to define semi-quantitative relationships among variables; apply Newtonian theoretical and empirical mechanics in a reductionist manner; resolve the primary flow equations theoretically by assuming maximum or minimum outputs. Here we recommend not a fifth but an overarching philosophy to embrace all four: clarifying and formalising an understanding of the evolution of river channels and iterative directional changes in the context of the least action principle (LAP), the theoretical basis of variational mechanics. LAP is exemplified in rivers in the form of maximum flow efficiency (MFE). A sophisticated understanding of evolution in its broadest sense is essential to understand how rivers adjust towards an optimum state rather than towards some other. Because rivers, as dynamic contemporary systems, flow in valleys that are commonly historical landforms and often tectonically determined, we propose that most of the world's alluvial rivers are over-powered for the work they must do. To remain stable they commonly evolve to expend surplus energy via a variety of dynamic equilibrium forms that will further adjust, where possible, to maximise their stability as much less common MFE forms in stationary equilibrium. This paper: 1. Shows that the theory of evolution is derived from, and applicable to, both the physical and biological sciences; 2. Focusses the development of theory in geomorphology on the development of equilibrium theory; 3. Proposes that river channels, like organisms, evolve teleomatically (progression towards an end-state by following natural laws) and iteratively (one stage forming the basis for the next) towards an optimal end-state; 4. Describes LAP as the methodological basis for understanding the self-adjustment of alluvial channels towards MFE; 5. Acknowledges that whereas river channels that form within their unmodified alluvium evolve into optimal minimum-energy systems, exogenic variables, such as riparian or aquatic vegetation, can cause significant variations in resultant river-styles. We specifically attempt to address Luna Leopold's lament in 1994 that no clearly expressed philosophy explains the remarkable self-adjustment of alluvial channels.
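    For readers unfamiliar with the variational mechanics invoked above, the least action principle in its generic form states that the realised trajectory of a system makes the action stationary; the river-specific maximum-flow-efficiency formulation is the authors' own and is not reproduced here. A minimal statement:

```latex
% Generic least action principle from variational mechanics; q(t) is the system
% trajectory and L its Lagrangian. The MFE formulation used for rivers is the
% authors' own and is not shown here.
\[
  S[q] \;=\; \int_{t_1}^{t_2} L\bigl(q(t), \dot{q}(t), t\bigr)\, dt,
  \qquad \delta S \;=\; 0 .
\]
```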

  20. 76 FR 50993 - Agency Information Collection Activities: Proposed Collection; Comment Request-Generic Clearance...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-17

    ...: Proposed Collection; Comment Request--Generic Clearance to Conduct Methodological Testing, Surveys, Focus... proposed information collection. This information collection will conduct research by methodological... Methodological Testing, Surveys, Focus Groups, and Related Tools to Improve the Management of Federal Nutrition...

  1. Teaching methodologies to promote creativity in the professional skills related to optics knowledge

    NASA Astrophysics Data System (ADS)

    Fernández-Oliveras, Alicia; Fernandez, Paz; Peña-García, Antonio; Oliveras, Maria L.

    2014-07-01

    We present the methodologies proposed and applied in the context of a teaching-innovation project developed at the University of Granada, Spain. The main objective of the project is the implementation of teaching methodologies that promote creativity in the learning process and, subsequently, in the acquisition of professional skills. This project involves two optics-related subjects taught to undergraduate students. The subjects are "Illumination Engineering" (Bachelor's degree in Civil Engineering) and "Optical and Optometric Instrumentation" (Bachelor's degree in Optics and Optometry). For the first subject, the activities of our project were carried out in the theoretical classes. By contrast, in the case of the second subject, such activities were designed for the laboratory sessions. For "Illumination Engineering" we applied the maieutic technique. With this method the students were encouraged to establish relationships between the main applications of the subject and concepts that are apparently unrelated to the subject framework. By means of several examples, the students became aware of the importance of cross-curricular and lateral thinking. We used the technique based on protocols of control and change in "Optical and Optometric Instrumentation". The modus operandi was focused on prompting the students to adopt the role of professionals and to pose questions to themselves concerning the practical content of the subject from that professional role. This mechanism boosted the critical capacity and the independent-learning ability of the students. In this work, we describe in detail both subject proposals and the results of their application in the 2011-2012 academic year.

  2. Praxis educativa ecopacifista de enriquecimiento curricular: Conceptuacion, diseno y divulgacion

    NASA Astrophysics Data System (ADS)

    Osorio, Carlos Agustin Muniz

    A general consensus exists that the present worldwide state of the natural environment is in crisis. Tied to this crisis, the social dimension presents a discouraging picture in aspects like violence and poverty. The predominant neoliberal economic system, ecocidal and genocidal, together with the production system that sustains it, contributes to this crisis. Puerto Rico, in its political and economic relationship with the United States of America, is not exempt from this situation. Education arises as an alternative to transform this reality. Science education has the potential to address these socio-environmental problems in a creative way. From a scientific educational framework, we conceptualized, designed and disseminated diverse approaches and tools that integrate socio-ecological and environmental aspects, as well as issues related to violence, conflict and peace. The central research questions were: At present, what are some of the main characteristics of the social-ecological and environmental global and local (glocal) issues and what relation do they have to formal education?; What is the ethical responsibility of science education when facing social-ecological and environmental situations and issues concerning peace?; What educational foundations justify the "Praxis Educativa Ecopacifista de Enriquecimiento Curricular" as an alternative to the situations and issues considered?; What didactic tools do we propose?; What curricular design and revision processes do we propose? What dissemination processes do we propose? The nature of our methodology is qualitative and is centered around curricular design. It includes a research-theoretical dimension, a practical-research dimension, and a systematization of learning elements. We emphasize the conceptualization of the theoretical-philosophical and methodological dimensions of the ecopacifist approach and its fundamental principles. We highlight the praxis, integrating creativity, intelligence and talent development, critical consciousness and nonviolent civil action. We design several curricular and didactic tools, among these: four ecopacifist activity guides, an ecopacifist curricular model, the educational strategy "TiERRa" (Earth in Spanish), reference articles, audio-visual materials, models of educational strategies and examples of curricular implementation. By means of the design and creation of a web page (http://proyectoecopaz.org), we hope to disseminate the knowledge constructed, the contributions and creations, in a rapid and accessible way.

  3. Solving a methodological challenge in work stress evaluation with the Stress Assessment and Research Toolkit (StART): a study protocol.

    PubMed

    Guglielmi, Dina; Simbula, Silvia; Vignoli, Michela; Bruni, Ilaria; Depolo, Marco; Bonfiglioli, Roberta; Tabanelli, Maria Carla; Violante, Francesco Saverio

    2013-06-22

    Stress evaluation is a field of strong interest, and a challenging one owing to several methodological aspects of the evaluation process. The aim of this study is to propose a study protocol to test a new method (i.e., the Stress Assessment and Research Toolkit) to assess psychosocial risk factors at work. This method addresses several methodological issues (e.g., subjective vs. objective, qualitative vs. quantitative data) by assessing work-related stressors using different kinds of data: i) organisational archival data (organisational indicators sheet); ii) qualitative data (focus group); iii) worker perception (questionnaire); and iv) observational data (observational checklist) using mixed methods research. In addition, it allows positive and negative aspects of work to be considered conjointly, using an approach that considers at the same time job demands and job resources. The integration of these sources of data can reduce the theoretical and methodological bias related to stress research in the work setting, allows researchers and professionals to obtain a reliable description of workers' stress, providing a more articulate vision of psychosocial risks, and allows a large amount of data to be collected. Finally, the implementation of the method ensures in the long term a primary prevention for psychosocial risk management in that it aims to reduce or modify the intensity, frequency or duration of organisational demands.

  4. Solving a methodological challenge in work stress evaluation with the Stress Assessment and Research Toolkit (StART): a study protocol

    PubMed Central

    2013-01-01

    Background Stress evaluation is a field of strong interest, and a challenging one owing to several methodological aspects of the evaluation process. The aim of this study is to propose a study protocol to test a new method (i.e., the Stress Assessment and Research Toolkit) to assess psychosocial risk factors at work. Design This method addresses several methodological issues (e.g., subjective vs. objective, qualitative vs. quantitative data) by assessing work-related stressors using different kinds of data: i) organisational archival data (organisational indicators sheet); ii) qualitative data (focus group); iii) worker perception (questionnaire); and iv) observational data (observational checklist) using mixed methods research. In addition, it allows positive and negative aspects of work to be considered conjointly, using an approach that considers at the same time job demands and job resources. Discussion The integration of these sources of data can reduce the theoretical and methodological bias related to stress research in the work setting, allows researchers and professionals to obtain a reliable description of workers' stress, providing a more articulate vision of psychosocial risks, and allows a large amount of data to be collected. Finally, the implementation of the method ensures in the long term a primary prevention for psychosocial risk management in that it aims to reduce or modify the intensity, frequency or duration of organisational demands. PMID:23799950

  5. Verification of Gyrokinetic codes: theoretical background and applications

    NASA Astrophysics Data System (ADS)

    Tronko, Natalia

    2016-10-01

    In fusion plasmas the strong magnetic field allows the fast gyro motion to be systematically removed from the description of the dynamics, resulting in a considerable model simplification and gain of computational time. Nowadays, the gyrokinetic (GK) codes play a major role in the understanding of the development and the saturation of turbulence and in the prediction of the consequent transport. We present a new and generic theoretical framework and specific numerical applications to test the validity and the domain of applicability of existing GK codes. For a sound verification process, the underlying theoretical GK model and the numerical scheme must be considered at the same time, which makes this approach pioneering. At the analytical level, the main novelty consists in using advanced mathematical tools such as the variational formulation of dynamics for the systematization of the basic GK codes' equations to access the limits of their applicability. The indirect verification of the numerical scheme is proposed via a benchmark process. In this work, specific examples of code verification are presented for two GK codes: the multi-species electromagnetic ORB5 (PIC), and the radially global version of GENE (Eulerian). The proposed methodology can be applied to any existing GK code. We establish a hierarchy of reduced GK Vlasov-Maxwell equations using the generic variational formulation. Then, we derive and include the models implemented in ORB5 and GENE inside this hierarchy. At the computational level, detailed verification of global electromagnetic test cases based on the CYCLONE benchmark case is considered, including a parametric β-scan covering the transition from ITG to KBM and the spectral properties at the nominal β value.

  6. A Novel Consensus-Based Particle Swarm Optimization-Assisted Trust-Tech Methodology for Large-Scale Global Optimization.

    PubMed

    Zhang, Yong-Feng; Chiang, Hsiao-Dong

    2017-09-01

    A novel three-stage methodology, termed the "consensus-based particle swarm optimization (PSO)-assisted Trust-Tech methodology," to find global optimal solutions for nonlinear optimization problems is presented. It is composed of Trust-Tech methods, consensus-based PSO, and local optimization methods that are integrated to compute a set of high-quality local optimal solutions that can contain the global optimal solution. The proposed methodology compares very favorably with several recently developed PSO algorithms based on a set of small-dimension benchmark optimization problems and 20 large-dimension test functions from the CEC 2010 competition. The analytical basis for the proposed methodology is also provided. Experimental results demonstrate that the proposed methodology can rapidly obtain high-quality optimal solutions that can contain the global optimal solution. The scalability of the proposed methodology is promising.
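    The abstract does not give the consensus mechanism or the Trust-Tech stage in detail, so the sketch below shows only a generic particle swarm optimization stage of the kind such a three-stage method builds on; parameter values and the test function are illustrative assumptions.

```python
# Minimal generic PSO stage (a sketch only; the paper's consensus-based variant and
# the Trust-Tech and local-refinement stages are not reproduced here).
import numpy as np

def pso(f, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    dim = len(bounds)
    x = rng.uniform(lo, hi, (n_particles, dim))        # particle positions
    v = np.zeros_like(x)                               # particle velocities
    pbest, pbest_f = x.copy(), np.apply_along_axis(f, 1, x)
    g = pbest[np.argmin(pbest_f)]                      # global best position
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        fx = np.apply_along_axis(f, 1, x)
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        g = pbest[np.argmin(pbest_f)]
    return g, pbest_f.min()

# Example: 10-dimensional sphere function.
best_x, best_f = pso(lambda z: float(np.sum(z ** 2)), [(-5.0, 5.0)] * 10)
print(best_f)
```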

  7. Detrended fluctuation analysis as a regression framework: Estimating dependence at different scales

    NASA Astrophysics Data System (ADS)

    Kristoufek, Ladislav

    2015-02-01

    We propose a framework combining detrended fluctuation analysis with standard regression methodology. The method is built on detrended variances and covariances and it is designed to estimate regression parameters at different scales and under potential nonstationarity and power-law correlations. The former feature allows for distinguishing between effects for a pair of variables from different temporal perspectives. The latter ones make the method a significant improvement over the standard least squares estimation. Theoretical claims are supported by Monte Carlo simulations. The method is then applied on selected examples from physics, finance, environmental science, and epidemiology. For most of the studied cases, the relationship between variables of interest varies strongly across scales.
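    A minimal sketch of the idea, assuming the scale-specific regression coefficient is formed as the ratio of the average detrended covariance to the average detrended variance of the integrated (profile) series; window handling and profile construction are simplified relative to the paper.

```python
# Sketch of a DFA/DCCA-style scale-wise regression estimator: beta(s) is the ratio of
# the average detrended covariance to the average detrended variance at window size s.
import numpy as np

def detrended_moments(x, y, s):
    X = np.cumsum(x - x.mean())                 # integrated profiles
    Y = np.cumsum(y - y.mean())
    n = len(X) // s
    t = np.arange(s)
    cov = var = 0.0
    for k in range(n):
        xs, ys = X[k*s:(k+1)*s], Y[k*s:(k+1)*s]
        rx = xs - np.polyval(np.polyfit(t, xs, 1), t)   # linearly detrended residuals
        ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
        cov += np.mean(rx * ry)
        var += np.mean(rx * rx)
    return cov / n, var / n

def dfa_beta(x, y, scales):
    """Scale-specific regression coefficients of y on x."""
    out = {}
    for s in scales:
        cov, var = detrended_moments(x, y, s)
        out[s] = cov / var
    return out

rng = np.random.default_rng(1)
x = rng.standard_normal(4000)
y = 0.8 * x + 0.5 * rng.standard_normal(4000)
print(dfa_beta(x, y, scales=[16, 64, 256]))     # estimates should be near 0.8
```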

  8. Socioeconomic Status, Family Processes, and Individual Development

    PubMed Central

    Conger, Rand D.; Conger, Katherine J.; Martin, Monica J.

    2010-01-01

    Research during the past decade shows that social class or socioeconomic status (SES) is related to satisfaction and stability in romantic unions, the quality of parent-child relationships, and a range of developmental outcomes for adults and children. This review focuses on evidence regarding potential mechanisms proposed to account for these associations. Research findings reported during the past decade demonstrate support for an interactionist model of the relationship between SES and family life, which incorporates assumptions from both the social causation and social selection perspectives. The review concludes with recommendations for future research on SES, family processes and individual development in terms of important theoretical and methodological issues yet to be addressed. PMID:20676350

  9. Does spatial attention modulate the C1 component? The jury continues to deliberate.

    PubMed

    Baumgartner, Hannah M; Graulty, Christian J; Hillyard, Steven A; Pitts, Michael A

    The thoughtful comments on our study (Baumgartner et al., this issue) that failed to replicate the C1 attention effect reported by a previous study roughly fall into three broad categories. First, the commentators identified specific differences between the two studies that may have contributed to the discrepant results. Second, they highlighted some of the theoretical and methodological problems that are encountered when trying to demonstrate attention effects on the initial evoked response in primary visual cortex. Third, they offered a number of proposals for optimizing experimental designs and analysis methods that may increase the likelihood of observing attention-related modulations of the C1. We consider each of these topics in turn.

  10. Threat to life and risk-taking behaviors: a review of empirical findings and explanatory models.

    PubMed

    Ben-Zur, Hasida; Zeidner, Moshe

    2009-05-01

    This article reviews the literature focusing on the relationship between perceived threat to life and risk-taking behaviors. The review of empirical data, garnered from field studies and controlled experiments, suggests that personal threat to life results in elevated risk-taking behavior. To account for these findings, this review proposes a number of theoretical explanations. These frameworks are grounded in divergent conceptual models: coping with stress, emotion regulation, replenishing of lost resources through self-enhancement, modifications of key parameters of cognitive processing of risky outcomes, and neurocognitive mechanisms. The review concludes with a number of methodological considerations, as well as directions for future work in this promising area of research.

  11. Critical Review of Hamby's (2014) Article Titled "Intimate Partner and Sexual Violence Research, Scientific Progress, Scientific Challenges, and Gender".

    PubMed

    Winstok, Zeev

    2015-07-28

    In a recent article, Hamby advocates the replacement of the "old" Conflict Tactic Scales used to measure physical partner violence (PV) with a new measurement instrument that represents and supports a thesis that gender use of physical PV is asymmetrical rather than symmetrical. This article takes a critical look at the logic, assumptions, arguments, examples, interpretations, and conclusions presented in Hamby's article, and in some cases disagrees with them. Furthermore, this article uses Hamby's proposals as an opportunity to review and examine core issues in the study of perpetration of physical PV, including gender-related theoretical and methodological issues. © The Author(s) 2015.

  12. “Epidemiological Criminology”: Coming Full Circle

    PubMed Central

    Lanier, Mark M.

    2009-01-01

    Members of the public health and criminal justice disciplines often work with marginalized populations: people at high risk of drug use, health problems, incarceration, and other difficulties. As these fields increasingly overlap, distinctions between them are blurred, as numerous research reports and funding trends document. However, explicit theoretical and methodological linkages between the 2 disciplines remain rare. A new paradigm that links methods and statistical models of public health with those of their criminal justice counterparts is needed, as are increased linkages between epidemiological analogies, theories, and models and the corresponding tools of criminology. We outline disciplinary commonalities and distinctions, present policy examples that integrate similarities, and propose “epidemiological criminology” as a bridging framework. PMID:19150901

  13. Conceptual Knowledge Acquisition in Biomedicine: A Methodological Review

    PubMed Central

    Payne, Philip R.O.; Mendonça, Eneida A.; Johnson, Stephen B.; Starren, Justin B.

    2007-01-01

    The use of conceptual knowledge collections or structures within the biomedical domain is pervasive, spanning a variety of applications including controlled terminologies, semantic networks, ontologies, and database schemas. A number of theoretical constructs and practical methods or techniques support the development and evaluation of conceptual knowledge collections. This review will provide an overview of the current state of knowledge concerning conceptual knowledge acquisition, drawing from multiple contributing academic disciplines such as biomedicine, computer science, cognitive science, education, linguistics, semiotics, and psychology. In addition, multiple taxonomic approaches to the description and selection of conceptual knowledge acquisition and evaluation techniques will be proposed in order to partially address the apparent fragmentation of the current literature concerning this domain. PMID:17482521

  14. [Methodological deficits in neuroethics: do we need theoretical neuroethics?].

    PubMed

    Northoff, G

    2013-10-01

    Current neuroethics can be characterized best as empirical neuroethics: it is strongly empirically oriented in that it not only includes empirical findings from neuroscience but also searches for applications within neuroscience. This, however, neglects the social and political contexts, which could be the subject of a future social neuroethics. In addition, methodological issues need to be considered, as in a theoretical neuroethics. The focus in this article is on two such methodological issues: (1) the analysis of the different levels and the inferences among them, exemplified by the inference of consciousness from otherwise purely neuronal data in patients in a vegetative state, and (2) the problem of linking descriptive and normative concepts in a non-reductive and non-inferential way, for which I suggest the mutual contextualization between both concepts. This results in a methodological strategy that can be described as contextual fact-norm iterativity.

  15. Anthropology, Participation, and the Democratization of Knowledge: Participatory Research Using Video with Youth Living in Extreme Poverty

    ERIC Educational Resources Information Center

    Batallan, Graciela; Dente, Liliana; Ritta, Loreley

    2017-01-01

    This article aims to open up a debate on methodological aspects of ethnographic research, arguing for the legitimacy of the information produced in a research "taller" or workshop using a participatory methodology and video production as a methodological tool. Based on the theoretical foundations and analysis of a "taller"…

  16. Teaching and Learning Methodologies Supported by ICT Applied in Computer Science

    ERIC Educational Resources Information Center

    Capacho, Jose

    2016-01-01

    The main objective of this paper is to show a set of new methodologies applied in the teaching of Computer Science using ICT. The methodologies are framed in the conceptual basis of the following sciences: Psychology, Education and Computer Science. The theoretical framework of the research is supported by Behavioral Theory, Gestalt Theory.…

  17. A theoretical and experimental investigation of propeller performance methodologies

    NASA Technical Reports Server (NTRS)

    Korkan, K. D.; Gregorek, G. M.; Mikkelson, D. C.

    1980-01-01

    This paper briefly covers aspects related to propeller performance by means of a review of propeller methodologies; presentation of wind tunnel propeller performance data taken in the NASA Lewis Research Center 10 x 10 wind tunnel; discussion of the predominant limitations of existing propeller performance methodologies; and a brief review of airfoil developments appropriate for propeller applications.

  18. Towards Culturally Relevant Classroom Science: A Theoretical Framework Focusing on Traditional Plant Healing

    ERIC Educational Resources Information Center

    Mpofu, Vongai; Otulaja, Femi S.; Mushayikwa, Emmanuel

    2014-01-01

    A theoretical framework is an important component of a research study. It grounds the study and guides the methodological design. It also forms a reference point for the interpretation of the research findings. This paper conceptually examines the process of constructing a multi-focal theoretical lens for guiding studies that aim to accommodate…

  19. Development Mechanism of an Integrated Model for Training of a Specialist and Conceptual-Theoretical Activity of a Teacher

    ERIC Educational Resources Information Center

    Marasulov, Akhmat; Saipov, Amangeldi; ?rymbayeva, Kulimkhan; Zhiyentayeva, Begaim; Demeuov, Akhan; Konakbaeva, Ulzhamal; Bekbolatova, Akbota

    2016-01-01

    The aim of the study is to examine the methodological and theoretical bases for constructing a development mechanism for an integrated model of specialist training and a teacher's conceptual-theoretical activity. Using the methods of generalization of teaching experience, pedagogical modeling and forecasting, the authors determine the urgent problems…

  20. Theory and Methodology in Researching Emotions in Education

    ERIC Educational Resources Information Center

    Zembylas, Michalinos

    2007-01-01

    Differing theoretical approaches to the study of emotions are presented: emotions as private (psychodynamic approaches); emotions as sociocultural phenomena (social constructionist approaches); and a third perspective (interactionist approaches) transcending these two. These approaches have important methodological implications in studying…

  1. Learning from doing: the case for combining normalisation process theory and participatory learning and action research methodology for primary healthcare implementation research.

    PubMed

    de Brún, Tomas; O'Reilly-de Brún, Mary; O'Donnell, Catherine A; MacFarlane, Anne

    2016-08-03

    The implementation of research findings is not a straightforward matter. There are substantive and recognised gaps in the process of translating research findings into practice and policy. In order to overcome some of these translational difficulties, a number of strategies have been proposed for researchers. These include greater use of theoretical approaches in research focused on implementation, and use of a wider range of research methods appropriate to policy questions and the wider social context in which they are placed. However, questions remain about how to combine theory and method in implementation research. In this paper, we respond to these proposals. Focussing on a contemporary social theory, Normalisation Process Theory, and a participatory research methodology, Participatory Learning and Action, we discuss the potential of their combined use for implementation research. We note ways in which Normalisation Process Theory and Participatory Learning and Action are congruent and may therefore be used as heuristic devices to explore, better understand and support implementation. We also provide examples of their use in our own research programme about community involvement in primary healthcare. Normalisation Process Theory alone has, to date, offered useful explanations for the success or otherwise of implementation projects post-implementation. We argue that Normalisation Process Theory can also be used to prospectively support implementation journeys. Furthermore, Normalisation Process Theory and Participatory Learning and Action can be used together so that interventions to support implementation work are devised and enacted with the expertise of key stakeholders. We propose that the specific combination of this theory and methodology possesses the potential, because of their combined heuristic force, to offer a more effective means of supporting implementation projects than either one might do on its own, and of providing deeper understandings of implementation contexts, rather than merely describing change.

  2. When a new technological product launching fails: A multi-method approach of facial recognition and E-WOM sentiment analysis.

    PubMed

    Hernández-Fernández, Dra Asunción; Mora, Elísabet; Vizcaíno Hernández, María Isabel

    2018-04-17

    The dual aim of this research is, firstly, to analyze the physiological and unconscious emotional response of consumers to a new technological product and, secondly, to link this emotional response to consumers' conscious verbal reports of positive and negative product perceptions. In order to do this, biometrics and self-reported measures of emotional response are combined. On the one hand, a neuromarketing experiment based on the facial recognition of emotions of 10 subjects, when physical attributes and economic information of a technological product are exposed, shows the prevalence of the ambivalent emotion of surprise. On the other hand, a netnographic qualitative approach based on sentiment analysis of 67 user online comments characterises the valence of this emotion as mainly negative in the case and context studied. Theoretical, practical and methodological contributions are anticipated from this paper. From a theoretical point of view this proposal contributes valuable information to the product design process, to an effective development of the marketing mix variables of price and promotion, and to a successful selection of the target market. From a practical point of view, the approach employed in the case study on the product Google Glass provides empirical evidence useful in the decision making process for this and other technological enterprises launching a new product. And from a methodological point of view, the usefulness of integrated neuromarketing-eWOM analysis could contribute to the proliferation of this tandem in marketing research. Copyright © 2018 Elsevier Inc. All rights reserved.
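    The abstract does not name the sentiment-analysis tooling used, so the following is only a toy lexicon-based valence scorer illustrating how E-WOM comments can be classified as mainly positive or negative; the lexicon and comments are hypothetical.

```python
# Illustrative lexicon-based valence scoring of user comments (hypothetical lexicon
# and comments; the study's actual sentiment-analysis tooling is not described above).
POSITIVE = {"amazing", "love", "innovative", "useful", "great"}
NEGATIVE = {"expensive", "creepy", "useless", "ugly", "disappointing"}

def valence(comment: str) -> int:
    """Positive word count minus negative word count; sign gives the valence."""
    words = comment.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

comments = [
    "Innovative idea but way too expensive and a bit creepy",
    "I love the concept, really useful for hands-free work",
]
for c in comments:
    print(valence(c), c)
```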

  3. MASTER: a model to improve and standardize clinical breakpoints for antimicrobial susceptibility testing using forecast probabilities.

    PubMed

    Blöchliger, Nicolas; Keller, Peter M; Böttger, Erik C; Hombach, Michael

    2017-09-01

    The procedure for setting clinical breakpoints (CBPs) for antimicrobial susceptibility has been poorly standardized with respect to population data, pharmacokinetic parameters and clinical outcome. Tools to standardize CBP setting could result in improved antibiogram forecast probabilities. We propose a model to estimate probabilities for methodological categorization errors and defined zones of methodological uncertainty (ZMUs), i.e. ranges of zone diameters that cannot reliably be classified. The impact of ZMUs on methodological error rates was used for CBP optimization. The model distinguishes theoretical true inhibition zone diameters from observed diameters, which suffer from methodological variation. True diameter distributions are described with a normal mixture model. The model was fitted to observed inhibition zone diameters of clinical Escherichia coli strains. Repeated measurements for a quality control strain were used to quantify methodological variation. For 9 of 13 antibiotics analysed, our model predicted error rates of < 0.1% applying current EUCAST CBPs. Error rates were > 0.1% for ampicillin, cefoxitin, cefuroxime and amoxicillin/clavulanic acid. Increasing the susceptible CBP (cefoxitin) and introducing ZMUs (ampicillin, cefuroxime, amoxicillin/clavulanic acid) decreased error rates to < 0.1%. ZMUs contained low numbers of isolates for ampicillin and cefuroxime (3% and 6%), whereas the ZMU for amoxicillin/clavulanic acid contained 41% of all isolates and was considered not practical. We demonstrate that CBPs can be improved and standardized by minimizing methodological categorization error rates. ZMUs may be introduced if an intermediate zone is not appropriate for pharmacokinetic/pharmacodynamic or drug dosing reasons. Optimized CBPs will provide a standardized antibiotic susceptibility testing interpretation at a defined level of probability. © The Author 2017. Published by Oxford University Press on behalf of the British Society for Antimicrobial Chemotherapy. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
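    A rough sketch of the core computation, assuming the true zone-diameter distribution is modelled as a two-component normal mixture and methodological variation is additive Gaussian measurement error around each true diameter; all parameter values are illustrative, not the paper's.

```python
# Sketch of the core idea: fit a normal mixture to inhibition zone diameters and
# estimate how often methodological variation (sd_m) pushes an observation across a
# clinical breakpoint. All numbers are illustrative, not taken from the paper.
import numpy as np
from scipy.stats import norm
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Two hypothetical subpopulations of true zone diameters (mm): resistant, susceptible.
true_d = np.concatenate([rng.normal(12, 2, 400), rng.normal(26, 3, 600)])

gm = GaussianMixture(n_components=2, random_state=0).fit(true_d.reshape(-1, 1))
means = gm.means_.ravel()
sds = np.sqrt(gm.covariances_.ravel())
weights = gm.weights_

cbp = 19.0    # hypothetical susceptible breakpoint (mm)
sd_m = 1.2    # methodological (measurement) standard deviation (mm)

# P(observed diameter falls on the wrong side of the CBP), per component, then pooled.
err = 0.0
for w, m, s in zip(weights, means, sds):
    grid = np.linspace(m - 5 * s, m + 5 * s, 2001)
    dens = norm.pdf(grid, m, s)
    dens /= dens.sum()
    if m < cbp:   # component below the breakpoint: error = observed above it
        p_err = np.sum(dens * (1 - norm.cdf(cbp, grid, sd_m)))
    else:         # component above the breakpoint: error = observed below it
        p_err = np.sum(dens * norm.cdf(cbp, grid, sd_m))
    err += w * p_err
print(f"estimated categorization error rate: {err:.3%}")
```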

  4. Functional mixture regression.

    PubMed

    Yao, Fang; Fu, Yuejiao; Lee, Thomas C M

    2011-04-01

    In functional linear models (FLMs), the relationship between the scalar response and the functional predictor process is often assumed to be identical for all subjects. Motivated by both practical and methodological considerations, we relax this assumption and propose a new class of functional regression models that allow the regression structure to vary for different groups of subjects. By projecting the predictor process onto its eigenspace, the new functional regression model is simplified to a framework that is similar to classical mixture regression models. This leads to the proposed approach named as functional mixture regression (FMR). The estimation of FMR can be readily carried out using existing software implemented for functional principal component analysis and mixture regression. The practical necessity and performance of FMR are illustrated through applications to a longevity analysis of female medflies and a human growth study. Theoretical investigations concerning the consistent estimation and prediction properties of FMR along with simulation experiments illustrating its empirical properties are presented in the supplementary material available at Biostatistics online. Corresponding results demonstrate that the proposed approach could potentially achieve substantial gains over traditional FLMs.
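    A simplified sketch of the FMR workflow under stated assumptions: predictor curves are observed on a common grid, FPCA is approximated by an SVD of the centred curves, and the mixture of regressions on the leading (standardized) scores is fitted with a compact two-component EM rather than existing mixture-regression software.

```python
# Rough sketch of the FMR idea: project discretized predictor curves onto their
# leading principal components, then fit a two-component mixture of linear
# regressions on the scores by EM. A simplification, not the authors' implementation.
import numpy as np

rng = np.random.default_rng(0)
n, t = 300, 50
time = np.linspace(0, 1, t)
# Hypothetical functional predictors: noisy combinations of two smooth basis curves.
scores_true = rng.standard_normal((n, 2))
basis = np.vstack([np.sin(2 * np.pi * time), np.cos(2 * np.pi * time)])
X = scores_true @ basis + 0.1 * rng.standard_normal((n, t))
grp = rng.random(n) < 0.5
y = np.where(grp, 2.0 * scores_true[:, 0], -1.5 * scores_true[:, 0]) + 0.2 * rng.standard_normal(n)

# FPCA approximated by SVD of the centred curves; keep two standardized scores.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Zs = (U * S)[:, :2]
Zs = Zs / Zs.std(axis=0)
Z = np.column_stack([np.ones(n), Zs])            # intercept + two FPC scores

# EM for a two-component mixture of linear regressions y ~ Z.
K = 2
beta = rng.standard_normal((K, Z.shape[1]))
sigma2 = np.full(K, y.var())
pi = np.full(K, 1.0 / K)
for _ in range(200):
    # E-step: responsibilities.
    dens = np.stack([pi[k] / np.sqrt(2 * np.pi * sigma2[k])
                     * np.exp(-(y - Z @ beta[k]) ** 2 / (2 * sigma2[k]))
                     for k in range(K)], axis=1)
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step: weighted least squares per component.
    for k in range(K):
        W = r[:, k]
        ZtW = Z.T * W
        beta[k] = np.linalg.solve(ZtW @ Z, ZtW @ y)
        sigma2[k] = np.sum(W * (y - Z @ beta[k]) ** 2) / W.sum()
    pi = r.mean(axis=0)
print("estimated mixing proportions:", pi)
print("estimated coefficients (intercept, FPC1, FPC2):\n", beta)
```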

  5. An educational laboratory virtual instrumentation suite assisted experiment for studying fundamentals of series resistance-inductance-capacitance circuit

    NASA Astrophysics Data System (ADS)

    Rana, K. P. S.; Kumar, Vineet; Mendiratta, Jatin

    2017-11-01

    One of the most elementary topics in a freshman Electrical Engineering course is the fundamentals of the Resistance-Inductance-Capacitance (RLC) circuit, that is, its time and frequency domain responses. For a beginner it is generally difficult to understand and appreciate the step and frequency responses, particularly the resonance. This paper proposes a student-friendly teaching and learning approach built on the versatile LabVIEW™ software together with the educational laboratory virtual instrumentation suite hardware, for studying the RLC circuit time and frequency domain responses. The proposed approach offers an interactive laboratory experiment in which students can model circuits in simulation, build hardware circuits on a prototyping board, and then compare their performances. The theoretical simulations and the experimental data obtained are found to be in very close agreement, thereby strengthening the students' confidence in the results. Finally, the proposed methodology was also subjected to an assessment of learning outcomes based on student feedback, and an average score of 8.05 out of 10 with a standard deviation of 0.471 was received, indicating the overall satisfaction of the students.
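    The circuit behaviour the students compare can also be reproduced outside LabVIEW; the sketch below computes the step and frequency responses of the capacitor voltage of a series RLC circuit with illustrative component values (not those of the course hardware).

```python
# Quick sketch of the series RLC responses: step response and frequency response of
# the capacitor voltage, V_C(s)/V_in(s) = 1 / (L*C*s^2 + R*C*s + 1).
# Component values are illustrative, not those used in the course.
import numpy as np
from scipy import signal

R, L, C = 100.0, 10e-3, 1e-6                    # ohms, henries, farads
sys = signal.TransferFunction([1.0], [L * C, R * C, 1.0])

t, v_step = signal.step(sys)                    # time-domain unit-step response
w, mag, phase = signal.bode(sys)                # frequency response (rad/s, dB, deg)

f0 = 1.0 / (2 * np.pi * np.sqrt(L * C))         # resonant frequency (~1.6 kHz here)
print(f"resonant frequency: {f0:.0f} Hz, peak of unit-step response: {v_step.max():.3f}")
```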

  6. 76 FR 43360 - Self-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Notice of Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-20

    .... The text of the proposed rule change is set forth below. Proposed new language is italicized; proposed... methodology approved by FINRA as announced in a Regulatory Notice (``approved margin methodology''). The... an Approved Margin Methodology. Members shall require as a minimum for computing customer or broker...

  7. 78 FR 58307 - Statement of Organization, Functions, and Delegations of Authority

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-23

    ... reproduction, natality, and mortality; (10) performs theoretical and experimental investigations into the... dissemination; (15) conducts methodological research on the tools for evaluation, utilization, and presentation... classification to states, local areas, other countries, and private organizations; (12) conducts methodological...

  8. A Reconceptualization of Adolescent Peer Susceptibility.

    ERIC Educational Resources Information Center

    Kosten, Paul A.; Scheier, Lawrence M.

    Conceptual and methodological limitations have hampered researchers' ability to establish valid, substantively meaningful, and theoretically driven self-report assessments of peer susceptibility. As a result, many assessments of peer susceptibility have been conceptualized as unidimensional and void of any theoretical underpinnings. This study…

  9. [Methodological and operational notes for the assessment and management of the risk of work-related stress].

    PubMed

    De Ambrogi, Francesco; Ratti, Elisabetta Ceppi

    2011-01-01

    Today the Italian national debate over the Work-Related Stress Risk Assessment methodology is rather heated. Several methodological proposals and guidelines have been published in recent months, not least those by the "Commissione Consultiva". But despite this wide range of proposals, it appears that there is still a lack of attention to some of the basic methodological issues that must be taken into account in order to correctly implement the above-mentioned guidelines. The aim of this paper is to outline these methodological issues. In order to achieve this, the most authoritative methodological proposals and guidelines have been reviewed. The study focuses in particular on the methodological issues that could lead to important biases if not considered properly. The study leads to some considerations about the methodological validity of a Work-Related Stress Risk Assessment based exclusively on the literal interpretation of the considered proposals. Furthermore, the study provides some hints and working hypotheses on how to overcome these methodological limits. This study should be considered as a starting point for further investigations and debate on the Work-Related Stress Risk Assessment methodology on a national level.

  10. Speed-Accuracy Tradeoffs in Speech Production

    DTIC Science & Technology

    2017-06-01

    imaging data of speech production. A theoretical framework for considering Fitts’ law in the domain of speech production is elucidated. Methodological ...articulatory kinematics conform to Fitts’ law. A second, associated goal is to address the methodological challenges inherent in performing Fitts-style...analysis on rtMRI data of speech production. Methodological challenges include segmenting continuous speech into specific motor tasks, defining key
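    For reference, the Shannon formulation of Fitts' law commonly used in such speed-accuracy analyses is given below; whether the report adopts this exact form is not stated in the snippet above.

```latex
% Shannon formulation of Fitts' law; MT: movement time, D: movement distance
% (amplitude), W: target width, a and b: empirically fitted intercept and slope.
\[
  MT \;=\; a + b \,\log_2\!\left(\frac{D}{W} + 1\right),
  \qquad \text{ID} \;=\; \log_2\!\left(\frac{D}{W} + 1\right)\ \text{bits}.
\]
```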

  11. Methodology of Computer-Aided Design of Variable Guide Vanes of Aircraft Engines

    ERIC Educational Resources Information Center

    Falaleev, Sergei V.; Melentjev, Vladimir S.; Gvozdev, Alexander S.

    2016-01-01

    The paper presents a methodology which helps to avoid a great amount of costly experimental research. This methodology includes thermo-gas dynamic design of an engine and its mounts, the profiling of compressor flow path and cascade design of guide vanes. Employing a method elaborated by Howell, we provide a theoretical solution to the task of…

  12. Bayesian cross-entropy methodology for optimal design of validation experiments

    NASA Astrophysics Data System (ADS)

    Jiang, X.; Mahadevan, S.

    2006-07-01

    An important concern in the design of validation experiments is how to incorporate the mathematical model in the design in order to allow conclusive comparisons of model prediction with experimental output in model assessment. The classical experimental design methods are more suitable for phenomena discovery and may result in a subjective, expensive, time-consuming and ineffective design that may adversely impact these comparisons. In this paper, an integrated Bayesian cross-entropy methodology is proposed to perform the optimal design of validation experiments incorporating the computational model. The expected cross entropy, an information-theoretic distance between the distributions of model prediction and experimental observation, is defined as a utility function to measure the similarity of two distributions. A simulated annealing algorithm is used to find optimal values of input variables through minimizing or maximizing the expected cross entropy. The measured data after testing with the optimum input values are used to update the distribution of the experimental output using Bayes theorem. The procedure is repeated to adaptively design the required number of experiments for model assessment, each time ensuring that the experiment provides effective comparison for validation. The methodology is illustrated for the optimal design of validation experiments for a three-leg bolted joint structure and a composite helicopter rotor hub component.
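    A much-simplified sketch of the inner step of such a design loop, assuming Gaussian forms for both the model prediction and the expected experimental output and using the KL divergence as the information-theoretic distance; the toy model, beliefs, and bounds are illustrative assumptions, and the paper's full Bayesian updating cycle is not reproduced.

```python
# Simplified sketch: choose the experiment input x that maximizes an information-
# theoretic distance between the model-prediction distribution and the currently
# assumed distribution of the experimental output, via simulated annealing.
import numpy as np
from scipy.optimize import dual_annealing

def model_prediction(x):
    """Hypothetical computational model: mean and sd of the predicted response."""
    return np.sin(x) + 0.1 * x, 0.05

def expected_observation(x):
    """Current belief about the experimental output at input x."""
    return np.sin(x), 0.15

def kl_gauss(m0, s0, m1, s1):
    """KL divergence between univariate Gaussians N(m0, s0^2) || N(m1, s1^2)."""
    return np.log(s1 / s0) + (s0**2 + (m0 - m1)**2) / (2 * s1**2) - 0.5

def neg_utility(x):
    x = float(x[0])
    return -kl_gauss(*model_prediction(x), *expected_observation(x))

res = dual_annealing(neg_utility, bounds=[(0.0, 10.0)], seed=0)
print(f"suggested experiment input: x = {res.x[0]:.3f}")
```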

  13. The relationship between symbolic interactionism and interpretive description.

    PubMed

    Oliver, Carolyn

    2012-03-01

    In this article I explore the relationship between symbolic interactionist theory and interpretive description methodology. The two are highly compatible, making symbolic interactionism an excellent theoretical framework for interpretive description studies. The pragmatism underlying interpretive description supports locating the methodology within this cross-disciplinary theory to make it more attractive to nonnursing researchers and expand its potential to address practice problems across the applied disciplines. The theory and method are so compatible that symbolic interactionism appears to be part of interpretive description's epistemological foundations. Interpretive description's theoretical roots have, to date, been identified only very generally in interpretivism and the philosophy of nursing. A more detailed examination of its symbolic interactionist heritage furthers the contextualization or forestructuring of the methodology to meet one of its own requirements for credibility.

  14. Theoretical and methodological approaches in discourse analysis.

    PubMed

    Stevenson, Chris

    2004-01-01

    Discourse analysis (DA) embodies two main approaches: Foucauldian DA and radical social constructionist DA. Both are underpinned by social constructionism to a lesser or greater extent. Social constructionism has contested areas in relation to power, embodiment, and materialism, although Foucauldian DA does focus on the issue of power. Embodiment and materialism may be especially relevant for researchers of nursing where the physical body is prominent. However, the contested nature of social constructionism allows a fusion of theoretical and methodological approaches tailored to a specific research interest. In this paper, Chris Stevenson suggests a framework for working out and declaring the DA approach to be taken in relation to a research area, as well as to aid anticipating methodological critique. Method, validity, reliability and scholarship are discussed from within a discourse analytic frame of reference.

  15. Theoretical and methodological approaches in discourse analysis.

    PubMed

    Stevenson, Chris

    2004-10-01

    Discourse analysis (DA) embodies two main approaches: Foucauldian DA and radical social constructionist DA. Both are underpinned by social constructionism to a lesser or greater extent. Social constructionism has contested areas in relation to power, embodiment, and materialism, although Foucauldian DA does focus on the issue of power. Embodiment and materialism may be especially relevant for researchers of nursing where the physical body is prominent. However, the contested nature of social constructionism allows a fusion of theoretical and methodological approaches tailored to a specific research interest. In this paper, Chris Stevenson suggests a framework for working out and declaring the DA approach to be taken in relation to a research area, as well as to aid anticipating methodological critique. Method, validity, reliability and scholarship are discussed from within a discourse analytic frame of reference.

  16. A Proposed Methodology for the Conceptualization, Operationalization, and Empirical Validation of the Concept of Information Need

    ERIC Educational Resources Information Center

    Afzal, Waseem

    2017-01-01

    Introduction: The purpose of this paper is to propose a methodology to conceptualize, operationalize, and empirically validate the concept of information need. Method: The proposed methodology makes use of both qualitative and quantitative perspectives, and includes a broad array of approaches such as literature reviews, expert opinions, focus…

  17. Theoretical and methodological issues with testing the SCCT and RIASEC models: Comment on Lent, Sheu, and Brown (2010) and Lubinski (2010).

    PubMed

    Armstrong, Patrick Ian; Vogel, David L

    2010-04-01

    The current article replies to comments made by Lent, Sheu, and Brown (2010) and Lubinski (2010) regarding the study "Interpreting the Interest-Efficacy Association From a RIASEC Perspective" (Armstrong & Vogel, 2009). The comments made by Lent et al. and Lubinski highlight a number of important theoretical and methodological issues, including the process of defining and differentiating between constructs, the assumptions underlying Holland's (1959, 1997) RIASEC (Realistic, Investigative, Artistic, Social, Enterprising, and Conventional types) model and interrelations among constructs specified in social cognitive career theory (SCCT), the importance of incremental validity for evaluating constructs, and methodological considerations when quantifying interest-efficacy correlations and for comparing models using multivariate statistical methods. On the basis of these comments and previous research on the SCCT and Holland models, we highlight the importance of considering multiple theoretical perspectives in vocational research and practice. Alternative structural models are outlined for examining the role of interests, self-efficacy, learning experiences, outcome expectations, personality, and cognitive abilities in the career choice and development process. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  18. Pragmatic critical realism: could this methodological approach expand our understanding of employment relations?

    PubMed

    Mearns, Susan Lesley

    2011-01-01

    This paper seeks to highlight the need for employment relations academics and researchers to expand their use of research methodologies in order to advance theoretical debate within their discipline. It focuses on the contribution that pragmatic critical realism has made to the field of perception and argues that it would add value to the subject of employment relations. It is a theoretically centred review of pragmatic critical realism and the possible contribution this methodology would make to the field of employment relations. The paper concludes that the employment relationship does not take place in a vacuum; rather, it is focussed on the interaction between imperfect individuals. Therefore, their interactions are moulded by emotions which cannot be explored thoroughly or even acknowledged through a positivist's rigorous but limited view of what constitutes 'knowledge' and of the development of theory. While not rejecting the contribution that quantitative data or positivism have made to the field, the study concludes that pragmatic critical realism has a lot to offer the development of the area and its theoretical foundations.

  19. Teaching Camera Calibration by a Constructivist Methodology

    ERIC Educational Resources Information Center

    Samper, D.; Santolaria, J.; Pastor, J. J.; Aguilar, J. J.

    2010-01-01

    This article describes the Metrovisionlab simulation software and practical sessions designed to teach the most important machine vision camera calibration aspects in courses for senior undergraduate students. By following a constructivist methodology, having received introductory theoretical classes, students use the Metrovisionlab application to…
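    Metrovisionlab itself is a simulation tool, but the calibration step the sessions revolve around can be illustrated with a standard pinhole-camera calibration from chessboard images; the OpenCV sketch below uses hypothetical image paths and board geometry.

```python
# Illustration of the pinhole-camera calibration step the sessions revolve around,
# using OpenCV on chessboard images (file names and board geometry are hypothetical;
# Metrovisionlab itself is a separate simulation tool).
import glob
import cv2
import numpy as np

pattern = (9, 6)                        # inner corners per chessboard row/column
square = 25.0                           # square size in mm
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_points, img_points, size = [], [], None
for path in glob.glob("calib_images/*.png"):     # hypothetical image set
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        continue
    size = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_points, img_points, size, None, None)
print("reprojection RMS error:", rms)
print("intrinsic matrix:\n", K)
```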

  20. Deliverology

    ERIC Educational Resources Information Center

    Nordstrum, Lee E.; LeMahieu, Paul G.; Dodd, Karen

    2017-01-01

    Purpose: This paper is one of seven in this volume elaborating different approaches to quality improvement in education. This paper aims to delineate a methodology called Deliverology. Design/methodology/approach: The paper presents the origins, theoretical foundations, core principles and a case study showing an application of Deliverology in the…

  1. Contemporary HRD Research: A Triarchy of Theoretical Perspectives and Their Prescriptions for HRD.

    ERIC Educational Resources Information Center

    Garavan, Thomas N.; Gunnigle, Patrick; Morley, Michael

    2000-01-01

    Presents key debates in human resource development. One table outlines the research focus and methodology of articles in this special issue. Another table compares three theoretical perspectives: capability driven, psychological contract, and learning organization. Contains 253 references. (SK)

  2. Theoretical Approaches to Political Communication.

    ERIC Educational Resources Information Center

    Chesebro, James W.

    Political communication appears to be emerging as a theoretical and methodological academic area of research within both speech-communication and political science. Five complementary approaches to political science (Machiavellian, iconic, ritualistic, confirmational, and dramatistic) may be viewed as a series of variations which emphasize the…

  3. Clarifying differences between review designs and methods

    PubMed Central

    2012-01-01

    This paper argues that the current proliferation of types of systematic reviews creates challenges for the terminology for describing such reviews. Terminology is necessary for planning, describing, appraising, and using reviews, building infrastructure to enable the conduct and use of reviews, and for further developing review methodology. There is insufficient consensus on terminology for a typology of reviews to be produced and any such attempt is likely to be limited by the overlapping nature of the dimensions along which reviews vary. It is therefore proposed that the most useful strategy for the field is to develop terminology for the main dimensions of variation. Three such main dimensions are proposed: (1) aims and approaches (including what the review is aiming to achieve, the theoretical and ideological assumptions, and the use of theory and logics of aggregation and configuration in synthesis); (2) structure and components (including the number and type of mapping and synthesis components and how they relate); and (3) breadth and depth and the extent of ‘work done’ in addressing a research issue (including the breadth of review questions, the detail with which they are addressed, and the amount the review progresses a research agenda). This then provides an overarching strategy to encompass more detailed descriptions of methodology and may lead in time to a more overarching system of terminology for systematic reviews. PMID:22681772

  4. QUADRO: A SUPERVISED DIMENSION REDUCTION METHOD VIA RAYLEIGH QUOTIENT OPTIMIZATION

    PubMed Central

    Fan, Jianqing; Ke, Zheng Tracy; Liu, Han; Xia, Lucy

    2016-01-01

    We propose a novel Rayleigh quotient based sparse quadratic dimension reduction method—named QUADRO (Quadratic Dimension Reduction via Rayleigh Optimization)—for analyzing high-dimensional data. Unlike in the linear setting where Rayleigh quotient optimization coincides with classification, these two problems are very different under nonlinear settings. In this paper, we clarify this difference and show that Rayleigh quotient optimization may be of independent scientific interests. One major challenge of Rayleigh quotient optimization is that the variance of quadratic statistics involves all fourth cross-moments of predictors, which are infeasible to compute for high-dimensional applications and may accumulate too many stochastic errors. This issue is resolved by considering a family of elliptical models. Moreover, for heavy-tail distributions, robust estimates of mean vectors and covariance matrices are employed to guarantee uniform convergence in estimating non-polynomially many parameters, even though only the fourth moments are assumed. Methodologically, QUADRO is based on elliptical models which allow us to formulate the Rayleigh quotient maximization as a convex optimization problem. Computationally, we propose an efficient linearized augmented Lagrangian method to solve the constrained optimization problem. Theoretically, we provide explicit rates of convergence in terms of Rayleigh quotient under both Gaussian and general elliptical models. Thorough numerical results on both synthetic and real datasets are also provided to back up our theoretical results. PMID:26778864
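    Stripped of QUADRO's sparsity penalty, robust elliptical-model estimation, and augmented Lagrangian solver, the unpenalized core of Rayleigh quotient optimization reduces to a generalized eigenvalue problem, as in this sketch with randomly generated matrices.

```python
# Core of Rayleigh quotient optimization (without QUADRO's sparsity penalty, robust
# estimates, or augmented Lagrangian solver): maximize R(w) = (w' A w) / (w' B w)
# via the generalized eigenvalue problem A w = lambda B w.
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
p = 20
M = rng.standard_normal((p, p))
B = M @ M.T + p * np.eye(p)              # symmetric positive definite "variance" matrix
a = rng.standard_normal(p)
A = np.outer(a, a) + 0.1 * np.eye(p)     # symmetric "signal" matrix

vals, vecs = eigh(A, B)                  # generalized eigenpairs, ascending order
w = vecs[:, -1]                          # eigenvector with the largest eigenvalue
rq = (w @ A @ w) / (w @ B @ w)
print("maximized Rayleigh quotient:", rq, "=", vals[-1])
```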

  5. Conceptual design of a crewed reusable space transportation system aimed at parabolic flights: stakeholder analysis, mission concept selection, and spacecraft architecture definition

    NASA Astrophysics Data System (ADS)

    Fusaro, Roberta; Viola, Nicole; Fenoglio, Franco; Santoro, Francesco

    2017-03-01

    This paper proposes a methodology to derive architectures and operational concepts for future earth-to-orbit and sub-orbital transportation systems. In particular, at first, it describes the activity flow, methods, and tools leading to the generation of a wide range of alternative solutions to meet the established goal. Subsequently, the methodology allows the selection of a small number of feasible options among which the optimal solution can be found. For the sake of clarity, the first part of the paper describes the methodology from a theoretical point of view, while the second part proposes the selection of mission concepts and of a proper transportation system aimed at sub-orbital parabolic flights. Starting from a detailed analysis of the stakeholders and their needs, the major objectives of the mission have been derived. Then, following a system engineering approach, functional analysis tools as well as concept of operations techniques allowed generating a very high number of possible ways to accomplish the envisaged goals. After a preliminary pruning activity, aimed at assessing the feasibility of these concepts, more detailed analyses have been carried out. As the procedure progresses, the designer should move from qualitative to quantitative evaluations, and for this reason, to support the trade-off analysis, an ad hoc, purpose-built mission simulation software tool has been exploited. This support tool aims at estimating the major mission drivers (mass, heat loads, manoeuvrability, earth visibility, and volumetric efficiency) as well as proving the feasibility of the concepts. Other crucial and multi-domain mission drivers, such as complexity, innovation level, and safety, have been evaluated through other appropriate analyses. Eventually, a single mission concept has been selected and detailed in terms of layout, systems, and sub-systems, also highlighting logistic, safety, and maintainability aspects.
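
    As a generic illustration of the quantitative trade-off step described above (not the authors' built-in simulation tool), candidate mission concepts can be ranked by a weighted score over the mission drivers; every weight and score below is an invented placeholder.

    ```python
    # Hypothetical weighted-sum trade-off over mission drivers; all numbers are placeholders.
    drivers = ["mass", "heat_loads", "manoeuvrability", "earth_visibility", "volumetric_efficiency"]
    weights = {"mass": 0.30, "heat_loads": 0.25, "manoeuvrability": 0.20,
               "earth_visibility": 0.10, "volumetric_efficiency": 0.15}

    # Normalized scores (0 = worst, 1 = best) for three hypothetical concepts.
    concepts = {
        "concept_A": {"mass": 0.6, "heat_loads": 0.7, "manoeuvrability": 0.5,
                      "earth_visibility": 0.8, "volumetric_efficiency": 0.4},
        "concept_B": {"mass": 0.8, "heat_loads": 0.5, "manoeuvrability": 0.6,
                      "earth_visibility": 0.6, "volumetric_efficiency": 0.7},
        "concept_C": {"mass": 0.5, "heat_loads": 0.9, "manoeuvrability": 0.7,
                      "earth_visibility": 0.5, "volumetric_efficiency": 0.6},
    }

    # Weighted sum per concept; the highest total is the preferred candidate.
    totals = {name: sum(weights[d] * score[d] for d in drivers) for name, score in concepts.items()}
    best = max(totals, key=totals.get)
    print(totals, "->", best)
    ```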

  6. Computational Modeling of Mixed Solids for CO2 Capture Sorbents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duan, Yuhua

    2015-01-01

    Since current technologies for capturing CO2 to fight global climate change are still too energy intensive, there is a critical need for development of new materials that can capture CO2 reversibly with acceptable energy costs. Accordingly, solid sorbents have been proposed to be used for CO2 capture applications through a reversible chemical transformation. By combining thermodynamic database mining with first principles density functional theory and phonon lattice dynamics calculations, a theoretical screening methodology to identify the most promising CO2 sorbent candidates from the vast array of possible solid materials has been proposed and validated. The calculated thermodynamic properties of different classes of solid materials versus temperature and pressure changes were further used to evaluate the equilibrium properties for the CO2 adsorption/desorption cycles. According to the requirements imposed by the pre- and post-combustion technologies and based on our calculated thermodynamic properties for the CO2 capture reactions by the solids of interest, we were able to screen only those solid materials for which lower capture energy costs are expected at the desired pressure and temperature conditions. Only those selected CO2 sorbent candidates were further considered for experimental validation. The ab initio thermodynamic technique has the advantage of identifying thermodynamic properties of CO2 capture reactions without any experimental input beyond crystallographic structural information of the solid phases involved. Such a methodology can not only be used to search for good candidates in existing databases of solid materials, but can also provide some guidelines for synthesizing new materials. In this presentation, we apply our screening methodology to mixed solid systems to adjust the turnover temperature, to help in developing CO2 capture technologies.
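
    A minimal sketch of the turnover-temperature idea, assuming a generic capture reaction MO(s) + CO2(g) -> MCO3(s) with temperature-independent ΔH° and ΔS°: the turnover temperature is where ΔG(T, P) = ΔH° - TΔS° - RT ln(P_CO2/P0) crosses zero. The constants below are placeholders of the order of CaO carbonation, not values computed in the work above.

    ```python
    import numpy as np
    from scipy.optimize import brentq

    R = 8.314  # J/(mol K)

    def dG_capture(T, P_co2, dH=-178e3, dS=-160.0, P0=1.0):
        """Free energy (J/mol) of MO + CO2 -> MCO3 at temperature T (K) and CO2 pressure P_co2 (bar).
        dH, dS are placeholder constants of the order of CaO carbonation, not values from the paper."""
        return dH - T * dS - R * T * np.log(P_co2 / P0)

    # Turnover temperature: dG = 0 at the CO2 partial pressure of interest.
    for p in (0.1, 1.0, 10.0):          # bar, roughly spanning post- vs pre-combustion conditions
        T_turn = brentq(dG_capture, 300.0, 2000.0, args=(p,))
        print(f"P_CO2 = {p:5.1f} bar  ->  turnover T ~ {T_turn:6.1f} K")
    ```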

  7. Theoretical investigation of the forced, dynamically coupled torsional-axial-lateral dynamic response of geared rotors

    NASA Technical Reports Server (NTRS)

    David, J. W.; Mitchell, L. D.

    1982-01-01

    Difficulties arise in the solution methodology needed to deal with the highly nonlinear rotor equations that result when dynamic coupling is included. A solution methodology is selected to solve these nonlinear differential equations. The selected method was verified to give good results even at large nonlinearity levels. The transfer matrix methodology is extended to the solution of nonlinear problems.

  8. Physical models of biological information and adaptation.

    PubMed

    Stuart, C I

    1985-04-07

    The bio-informational equivalence asserts that biological processes reduce to processes of information transfer. In this paper, that equivalence is treated as a metaphor with deeply anthropomorphic content of a sort that resists constitutive-analytical definition, including formulation within mathematical theories of information. It is argued that continuance of the metaphor, as a quasi-theoretical perspective in biology, must entail a methodological dislocation between biological and physical science. It is proposed that a general class of functions, drawn from classical physics, can serve to eliminate the anthropomorphism. Further considerations indicate that the concept of biological adaptation is central to the general applicability of the informational idea in biology; a non-anthropomorphic treatment of adaptive phenomena is suggested in terms of variational principles.

  9. [Motivating health education-based change].

    PubMed

    Puerto-Guerrero, Ana H

    2012-06-01

    This work describes the academic experience of training nurses in primary prevention and child nursing within the area of public health. The target population consisted of children. Accumulated experience was systematized to identify the epistemological, theoretical and practical elements developed over five years in basic primary education institutions, in line with the educational proposal "Experience sexuality with dignity". It was found that this type of work managed to develop special motivation in the scholastic community whilst allowing academics to approach the social reality which they must confront in their professional practice. The work emphasised strengthening children's awareness of the need for family, state and civil society participation. Motivating health education-based change did involve the systematisation of experience as a methodological tool.

  10. A multidisciplinary approach of workload assessment in real-job situations: investigation in the field of aerospace activities.

    PubMed

    Mélan, Claudine; Cascino, Nadine

    2014-01-01

    This contribution presents two field studies combining tools and methods from cognitive psychology and from occupational psychology in order to perform a thorough investigation of workload in employees. Cognitive load theory proposes to distinguish different load categories of working memory, in a context of instruction. Intrinsic load is inherent to the task, extraneous load refers to components of a learning environment that may be modified to reduce total load, and germane load enables schema construction and thus efficient learning. We showed previously that this theoretical framework may be successfully extended to working memory tasks in non-instructional designs. Other theoretical models, originating from the field of occupational psychology, account for an individual's perception of work demands or requirements in the context of different psychosocial features of the (work) environment. Combining these approaches is difficult, as workload assessment by job-perception questionnaires explores an individual's overall job perception over a large time period, whereas cognitive load investigations in working memory tasks are typically performed within short time periods. We proposed an original methodology enabling investigation of workload and load factors in a comparable time frame. We report two field studies investigating workload on different shift phases and between work shifts, with two custom-made tools. The first one enabled workload assessment by manipulating intrinsic load (task difficulty) and extraneous load (time pressure) in a working-memory task. The second tool was a questionnaire based on the theoretical concepts of work demands, control, and psychosocial support. Two additional dimensions suspected to contribute to job perception, i.e., work-family conflicts and the availability of human and technical resources, were also explored. Results of the workload assessments were discussed in light of operators' alertness and job performance.

  11. A multidisciplinary approach of workload assessment in real-job situations: investigation in the field of aerospace activities

    PubMed Central

    Mélan, Claudine; Cascino, Nadine

    2014-01-01

    This contribution presents two field studies combining tools and methods from cognitive psychology and from occupational psychology in order to perform a thorough investigation of workload in employees. Cognitive load theory proposes to distinguish different load categories of working memory, in a context of instruction. Intrinsic load is inherent to the task, extraneous load refers to components of a learning environment that may be modified to reduce total load, and germane load enables schema construction and thus efficient learning. We showed previously that this theoretical framework may be successfully extended to working memory tasks in non-instructional designs. Other theoretical models, originating from the field of occupational psychology, account for an individual's perception of work demands or requirements in the context of different psychosocial features of the (work) environment. Combining these approaches is difficult, as workload assessment by job-perception questionnaires explores an individual's overall job perception over a large time period, whereas cognitive load investigations in working memory tasks are typically performed within short time periods. We proposed an original methodology enabling investigation of workload and load factors in a comparable time frame. We report two field studies investigating workload on different shift phases and between work shifts, with two custom-made tools. The first one enabled workload assessment by manipulating intrinsic load (task difficulty) and extraneous load (time pressure) in a working-memory task. The second tool was a questionnaire based on the theoretical concepts of work demands, control, and psychosocial support. Two additional dimensions suspected to contribute to job perception, i.e., work-family conflicts and the availability of human and technical resources, were also explored. Results of the workload assessments were discussed in light of operators' alertness and job performance. PMID:25232346

  12. Verification of Gyrokinetic codes: Theoretical background and applications

    NASA Astrophysics Data System (ADS)

    Tronko, Natalia; Bottino, Alberto; Görler, Tobias; Sonnendrücker, Eric; Told, Daniel; Villard, Laurent

    2017-05-01

    In fusion plasmas, the strong magnetic field allows the fast gyro-motion to be systematically removed from the description of the dynamics, resulting in a considerable model simplification and gain of computational time. Nowadays, the gyrokinetic (GK) codes play a major role in the understanding of the development and the saturation of turbulence and in the prediction of the subsequent transport. Naturally, these codes require thorough verification and validation. Here, we present a new and generic theoretical framework and specific numerical applications to test the faithfulness of the implemented models to theory and to verify the domain of applicability of existing GK codes. For a sound verification process, the underlying theoretical GK model and the numerical scheme must be considered at the same time, which has rarely been done and therefore makes this approach pioneering. At the analytical level, the main novelty consists in using advanced mathematical tools such as variational formulation of dynamics for systematization of basic GK code's equations to access the limits of their applicability. The verification of the numerical scheme is proposed via the benchmark effort. In this work, specific examples of code verification are presented for two GK codes: the multi-species electromagnetic ORB5 (PIC) and the radially global version of GENE (Eulerian). The proposed methodology can be applied to any existing GK code. We establish a hierarchy of reduced GK Vlasov-Maxwell equations implemented in the ORB5 and GENE codes using the Lagrangian variational formulation. At the computational level, detailed verifications of global electromagnetic test cases developed from the CYCLONE Base Case are considered, including a parametric β-scan covering the transition from ITG to KBM and the spectral properties at the nominal β value.

  13. A theoretical framework informing research about the role of stress in the pathophysiology of bipolar disorder.

    PubMed

    Brietzke, Elisa; Mansur, Rodrigo Barbachan; Soczynska, Joanna; Powell, Alissa M; McIntyre, Roger S

    2012-10-01

    The staggering illness burden associated with Bipolar Disorder (BD) invites the need for primary prevention strategies. Before preventative strategies can be considered in individuals during a pre-symptomatic period (i.e., at risk), unraveling the mechanistic steps wherein external stress is transduced and interacts with genetic vulnerability in the early stages of BD will be a critical conceptual necessity. Herein we comprehensively review extant studies reporting on stress and bipolar disorder. The overarching aim is to propose a conceptual framework to inform research about the role of stress in the pathophysiology of BD. Computerized databases (i.e., PubMed, PsycINFO, the Cochrane Library and SciELO) were searched using the following terms: "bipolar disorder" cross-referenced with "stress", "general reaction to stress", "resilience", "resistance", "recovery", "stress-diathesis", "allostasis", and "hormesis". Data from the literature indicate the existence of several theoretical models for understanding the influence of stress in the pathophysiology of BD, including the classical stress-diathesis model and newer models such as allostasis and hormesis. In addition, molecular mechanisms involved in stress adaptation (resistance, resilience and recovery) can also be translated into research strategies to investigate the impact of stress in the pathophysiology of BD. Most studies are retrospective and/or cross-sectional, do not consider the period of development, assess brain function with only one or a few methodologies, and use animal models which are not always similar to human phenotypes. The interaction between stress and brain development is dynamic and complex. In this article we propose a theoretical model for investigating the role of stress in the pathophysiology of BD, based on the different kinds of stress adaptation response and their putative neurobiological underpinnings. Copyright © 2012 Elsevier Inc. All rights reserved.

  14. Outperforming Game Theoretic Play with Opponent Modeling in Two Player Dominoes

    DTIC Science & Technology

    2014-03-27

    This chapter describes the methodology of how a dominoes artificial intelligence agent employs… Applying this concept to a partially observable game means that both players will have to model each other and have some intelligence of the board…

  15. Scalar Implicatures in Child Language: Give Children a Chance

    ERIC Educational Resources Information Center

    Foppolo, Francesca; Guasti, Maria Teresa; Chierchia, Gennaro

    2012-01-01

    Children's pragmatic competence in deriving conversational implicatures (and scalar implicatures in particular) offers an intriguing standpoint to explore how developmental, methodological, and purely theoretical perspectives interact and feed each other. In this paper, we focus mainly on developmental and methodological issues, showing that…

  16. Learning outcomes of "The Oncology Patient" study among nursing students: A comparison of teaching strategies.

    PubMed

    Roca, Judith; Reguant, Mercedes; Canet, Olga

    2016-11-01

    Teaching strategies are essential in order to facilitate meaningful learning and the development of high-level thinking skills in students. To compare three teaching methodologies (problem-based learning, case-based teaching and traditional methods) in terms of the learning outcomes achieved by nursing students. This quasi-experimental research was carried out in the Nursing Degree programme in a group of 74 students who explored the subject of "The Oncology Patient" through the aforementioned strategies. A performance test was applied based on Bloom's Revised Taxonomy. A significant correlation was found between the intragroup theoretical and theoretical-practical dimensions. Likewise, intergroup differences were related to each teaching methodology. Hence, significant differences were estimated between the traditional methodology (x̄ = 9.13), case-based teaching (x̄ = 12.96) and problem-based learning (x̄ = 14.84). Problem-based learning was shown to be the most successful learning method, followed by case-based teaching and the traditional methodology. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Physics of mind: Experimental confirmations of theoretical predictions.

    PubMed

    Schoeller, Félix; Perlovsky, Leonid; Arseniev, Dmitry

    2018-02-02

    What is common among Newtonian mechanics, statistical physics, thermodynamics, quantum physics, the theory of relativity, astrophysics and the theory of superstrings? All these areas of physics have in common a methodology, which is discussed in the first few lines of the review. Is a physics of the mind possible? Is it possible to describe how a mind adapts in real time to changes in the physical world through a theory based on a few basic laws? From perception and elementary cognition to emotions and abstract ideas allowing high-level cognition and executive functioning, at nearly all levels of study, the mind shows variability and uncertainties. Is it possible to turn psychology and neuroscience into so-called "hard" sciences? This review discusses several established first principles for the description of mind and their mathematical formulations. A mathematical model of mind is derived from these principles. This model includes mechanisms of instincts, emotions, behavior, cognition, concepts, language, intuitions, and imagination. We clarify fundamental notions such as the opposition between the conscious and the unconscious, the knowledge instinct and aesthetic emotions, as well as humans' universal abilities for symbols and meaning. In particular, the review discusses at length the evolutionary and cognitive functions of aesthetic emotions and musical emotions. Several theoretical predictions are derived from the model, some of which have been experimentally confirmed. These empirical results are summarized and we introduce new theoretical developments. Several unsolved theoretical problems are proposed, as well as new experimental challenges for future research. Copyright © 2017. Published by Elsevier B.V.

  18. The mothers' experiences in the pediatrics hemodialysis unit.

    PubMed

    Mieto, Fernanda Stella Risseto; Bousso, Regina Szylit

    2014-01-01

    The need for hemodialysis exerts a deep impact on the lives of children and adolescents with end-stage chronic kidney failure and of their mothers, who predominantly assume the care related to treatment. Hemodialysis requires that the mother accompany the child during sessions at least three times a week and, since it is not a curative practice, they also experience the wait for a kidney transplant, attributing different meanings to this experience. To understand what it means for the mothers to accompany the child in a Pediatric Hemodialysis Unit and to construct a theoretical model representing this experience. Symbolic Interactionism was adopted as the theoretical framework and Grounded Theory as the methodological framework. Data were collected through interviews with 11 mothers. The comparative analysis of the data enabled the identification of two phenomena that compose the experience: "Seeing the child's life being sucked by the hemodialysis machine", which expresses the experiences of the mothers that generate new demands for comprehending the new health condition of their children, and "Giving new meaning to the dependence on the hemodialysis machine", which represents the strategies employed to endure the experience. The relationship between these phenomena allowed the identification of the main category, "Having the mother's life imprisoned by the hemodialysis machine", from which we propose a new theoretical model. The results of the study provide a theoretical grounding for planning assistance that meets the real needs of the mothers, identifying aspects that require intervention.

  19. Towards a theoretical model on medicines as a health need.

    PubMed

    Vargas-Peláez, Claudia Marcela; Soares, Luciano; Rover, Marina Raijche Mattozo; Blatt, Carine Raquel; Mantel-Teeuwisse, Aukje; Rossi Buenaventura, Francisco Augusto; Restrepo, Luis Guillermo; Latorre, María Cristina; López, José Julián; Bürgin, María Teresa; Silva, Consuelo; Leite, Silvana Nair; Mareni Rocha, Farias

    2017-04-01

    Medicines are considered one of the main tools of western medicine to resolve health problems. Currently, medicines represent an important share of the countries' healthcare budget. In the Latin America region, access to essential medicines is still a challenge, although countries have established some measures in the last years in order to guarantee equitable access to medicines. A theoretical model is proposed for analysing the social, political, and economic factors that modulate the role of medicines as a health need and their influence on the accessibility and access to medicines. The model was built based on a narrative review about health needs, and followed the conceptual modelling methodology for theory-building. The theoretical model considers elements (stakeholders, policies) that modulate the perception towards medicines as a health need from two perspectives - health and market - at three levels: international, national and local levels. The perception towards medicines as a health need is described according to Bradshaw's categories: felt need, normative need, comparative need and expressed need. When those different categories applied to medicines coincide, the patients get access to the medicines they perceive as a need, but when the categories do not coincide, barriers to access to medicines are created. Our theoretical model, which holds a broader view about the access to medicines, emphasises how power structures, interests, interdependencies, values and principles of the stakeholders could influence the perception towards medicines as a health need and the access to medicines in Latin American countries. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Causal inferences on the effectiveness of complex social programs: Navigating assumptions, sources of complexity and evaluation design challenges.

    PubMed

    Chatterji, Madhabi

    2016-12-01

    This paper explores avenues for navigating evaluation design challenges posed by complex social programs (CSPs) and their environments when conducting studies that call for generalizable, causal inferences on the intervention's effectiveness. A definition is provided of a CSP drawing on examples from different fields, and an evaluation case is analyzed in depth to derive seven (7) major sources of complexity that typify CSPs, threatening assumptions of textbook-recommended experimental designs for performing impact evaluations. Theoretically-supported, alternative methodological strategies are discussed to navigate assumptions and counter the design challenges posed by the complex configurations and ecology of CSPs. Specific recommendations include: sequential refinement of the evaluation design through systems thinking, systems-informed logic modeling; and use of extended term, mixed methods (ETMM) approaches with exploratory and confirmatory phases of the evaluation. In the proposed approach, logic models are refined through direct induction and interactions with stakeholders. To better guide assumption evaluation, question-framing, and selection of appropriate methodological strategies, a multiphase evaluation design is recommended. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Problem based learning - A brief review

    NASA Astrophysics Data System (ADS)

    Nunes, Sandra; Oliveira, Teresa A.; Oliveira, Amílcar

    2017-07-01

    Teaching is a complex mission that requires not only the transmission of theoretical knowledge, but also providing students with the skills needed to solve the real problems of their respective professional activities, where complex issues must frequently be faced. For more than twenty years we have been experiencing an increase in academic failure in the scientific area of mathematics, which means that teaching mathematics and related areas can be an even more complex and hard task. Academic failure is a complex phenomenon that depends on various factors, such as social, school-related, or biophysical factors. After numerous attempts made in order to reduce academic failure, our goal in this paper is to understand the role of "Problem Based Learning" and how this methodology can contribute both to increasing success in mathematics courses and to increasing the skills of future professionals in Portugal. Before designing a proposal for applying this technique in our institutions, we decided to conduct a survey to gather the necessary information about this methodology and its respective advantages and disadvantages; providing that brief review is the aim of this paper.

  2. The New Method of Tsunami Source Reconstruction With r-Solution Inversion Method

    NASA Astrophysics Data System (ADS)

    Voronina, T. A.; Romanenko, A. A.

    2016-12-01

    Application of the r-solution method to reconstructing the initial tsunami waveform is discussed. This methodology is based on the inversion of remote measurements of water-level data. The wave propagation is considered within the scope of linear shallow-water theory. The ill-posed inverse problem in question is regularized by means of a least-squares inversion using the truncated Singular Value Decomposition method. As a result of the numerical process, an r-solution is obtained. The proposed method allows one to control the instability of the numerical solution and to obtain an acceptable result in spite of the ill-posedness of the problem. Application of this methodology to reconstructing the initial waveform of the 2013 Solomon Islands tsunami confirms, for synthetic data and a model tsunami source, the theoretical conclusion that the inversion result strongly depends on the noisiness of the data and on the azimuthal and temporal coverage of the recording stations with respect to the source area. Furthermore, it is possible to make a preliminary selection of the most informative set of the available recording stations used in the inversion process.
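
    A minimal sketch of the truncated-SVD least-squares step on a synthetic, ill-conditioned operator (not the shallow-water forward model used in the study): keeping only the r largest singular values regularizes the inversion, which is the essence of the r-solution.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic ill-conditioned forward operator A and noisy observations d = A @ x_true + noise.
    m, n = 120, 80
    U0, _ = np.linalg.qr(rng.normal(size=(m, m)))
    V0, _ = np.linalg.qr(rng.normal(size=(n, n)))
    s_true = np.logspace(0, -8, n)                 # rapidly decaying spectrum -> ill-posed problem
    A = (U0[:, :n] * s_true) @ V0.T
    x_true = rng.normal(size=n)
    d = A @ x_true + 1e-4 * rng.normal(size=m)

    def r_solution(A, d, r):
        """Least-squares solution restricted to the first r right singular vectors (truncated SVD)."""
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        return Vt[:r, :].T @ ((U[:, :r].T @ d) / s[:r])

    # Larger r fits the data better but amplifies noise; smaller r stabilizes the solution.
    for r in (5, 20, 60):
        xr = r_solution(A, d, r)
        print(r, np.linalg.norm(A @ xr - d), np.linalg.norm(xr - x_true))
    ```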

  3. [Research on psychosomatic disease. Various theoretical and methodologic aspects].

    PubMed

    Barbosa, A; Castanheira, J L; Cordeiro, J C

    1992-07-01

    This article outlines the present main lines of psychosomatic research, whether concerning the elimination of the concept of psychosomatic illness, its etiological understanding, or the peculiar ways of therapeutic approach. We specify some methodological problems resulting from the use of several instruments to collect and measure data. We analyse the theoretical relevance of the constructs of depressive equivalents and, especially, of alexithymia. Starting from the consensual phenomenological description of this construct, we explain its psychodynamic understanding, its neurophysiological basis and its sociocultural determination. We question the relationship between alexithymia and psychosomatic illness. We point out the pertinence of its use as a risk or maintenance factor and the possibility of its modulation by environmental factors. We clarify the main heuristic contributions of this construct to psychosomatic investigation and we analyse, critically and concisely, the validity and reliability of some of the instruments built to measure it. Priority attention needs to be paid to psychosomatic investigation in the health area. We propose lines of investigation to be developed in our country that should have a multidisciplinary perspective.

  4. Enhancement of life cycle assessment (LCA) methodology to include the effect of surface albedo on climate change: Comparing black and white roofs.

    PubMed

    Susca, Tiziana

    2012-04-01

    Traditionally, life cycle assessment (LCA) does not estimate a key property: surface albedo. Here an enhancement of the LCA methodology is proposed through the development and employment of a time-dependent climatological model for including the effect of surface albedo on climate. The theoretical findings derived from the time-dependent model have been applied to the case study of a black and a white roof evaluated over time frames of 50 and 100 years, focusing on the impact on global warming potential. The comparative life cycle impact assessment of the two roofs shows that the high surface albedo plays a crucial role in offsetting radiative forcings. Surface albedo is responsible for a decrease of 110-184 kg CO2eq over the 50-year time horizon and of 131-217 kg CO2eq over 100 years. Furthermore, the white roof, compared to the black roof, decreases annual energy use by about 3.6-4.5 kWh/m2 thanks to its high albedo. Copyright © 2011 Elsevier Ltd. All rights reserved.

  5. Estimation of integral curves from high angular resolution diffusion imaging (HARDI) data.

    PubMed

    Carmichael, Owen; Sakhanenko, Lyudmila

    2015-05-15

    We develop statistical methodology for a popular brain imaging technique HARDI based on the high order tensor model by Özarslan and Mareci [10]. We investigate how uncertainty in the imaging procedure propagates through all levels of the model: signals, tensor fields, vector fields, and fibers. We construct asymptotically normal estimators of the integral curves or fibers which allow us to trace the fibers together with confidence ellipsoids. The procedure is computationally intense as it blends linear algebra concepts from high order tensors with asymptotical statistical analysis. The theoretical results are illustrated on simulated and real datasets. This work generalizes the statistical methodology proposed for low angular resolution diffusion tensor imaging by Carmichael and Sakhanenko [3], to several fibers per voxel. It is also a pioneering statistical work on tractography from HARDI data. It avoids all the typical limitations of the deterministic tractography methods and it delivers the same information as probabilistic tractography methods. Our method is computationally cheap and it provides well-founded mathematical and statistical framework where diverse functionals on fibers, directions and tensors can be studied in a systematic and rigorous way.
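
    A generic sketch of tracing an integral curve through an estimated direction field on a grid, here a synthetic 2-D field integrated with simple Euler steps; it illustrates the notion of an integral curve only and is not the authors' HARDI estimator or its confidence ellipsoids.

    ```python
    import numpy as np
    from scipy.interpolate import RegularGridInterpolator

    # Synthetic 2-D direction field on a grid (placeholder for an estimated fiber-direction field).
    xs = ys = np.linspace(-2.0, 2.0, 41)
    X, Y = np.meshgrid(xs, ys, indexing="ij")
    V = np.stack([-Y, X], axis=-1)                       # circular toy flow
    V /= np.linalg.norm(V, axis=-1, keepdims=True) + 1e-12

    field = RegularGridInterpolator((xs, ys), V)

    def trace_curve(seed, step=0.02, n_steps=400):
        """Integrate dx/dt = v(x) with forward Euler, starting at `seed`."""
        pts = [np.asarray(seed, dtype=float)]
        for _ in range(n_steps):
            v = field(pts[-1])[0]                        # interpolated direction at the current point
            nxt = pts[-1] + step * v
            if np.any(np.abs(nxt) > 2.0):                # stop at the grid boundary
                break
            pts.append(nxt)
        return np.array(pts)

    curve = trace_curve((1.0, 0.0))
    print(curve.shape, curve[-1])
    ```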

  6. Estimation of integral curves from high angular resolution diffusion imaging (HARDI) data

    PubMed Central

    Carmichael, Owen; Sakhanenko, Lyudmila

    2015-01-01

    We develop statistical methodology for a popular brain imaging technique HARDI based on the high order tensor model by Özarslan and Mareci [10]. We investigate how uncertainty in the imaging procedure propagates through all levels of the model: signals, tensor fields, vector fields, and fibers. We construct asymptotically normal estimators of the integral curves or fibers which allow us to trace the fibers together with confidence ellipsoids. The procedure is computationally intense as it blends linear algebra concepts from high order tensors with asymptotical statistical analysis. The theoretical results are illustrated on simulated and real datasets. This work generalizes the statistical methodology proposed for low angular resolution diffusion tensor imaging by Carmichael and Sakhanenko [3], to several fibers per voxel. It is also a pioneering statistical work on tractography from HARDI data. It avoids all the typical limitations of the deterministic tractography methods and it delivers the same information as probabilistic tractography methods. Our method is computationally cheap and it provides well-founded mathematical and statistical framework where diverse functionals on fibers, directions and tensors can be studied in a systematic and rigorous way. PMID:25937674

  7. Fault Diagnosis of Induction Machines in a Transient Regime Using Current Sensors with an Optimized Slepian Window

    PubMed Central

    Burriel-Valencia, Jordi; Martinez-Roman, Javier; Sapena-Bano, Angel

    2018-01-01

    The aim of this paper is to introduce a new methodology for the fault diagnosis of induction machines working in the transient regime, when time-frequency analysis tools are used. The proposed method relies on the use of the optimized Slepian window for performing the short time Fourier transform (STFT) of the stator current signal. It is shown that for a given sequence length of finite duration, the Slepian window has the maximum concentration of energy, greater than can be reached with a gated Gaussian window, which is usually used as the analysis window. In this paper, the use and optimization of the Slepian window for fault diagnosis of induction machines is theoretically introduced and experimentally validated through the test of a 3.15-MW induction motor with broken bars during the start-up transient. The theoretical analysis and the experimental results show that the use of the Slepian window can highlight the fault components in the current’s spectrogram with a significant reduction of the required computational resources. PMID:29316650
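
    A minimal sketch of the windowing step, assuming scipy is available: the first discrete prolate spheroidal (Slepian) sequence is used as the STFT analysis window on a synthetic start-up-like current, with a Gaussian window for comparison; the signal, window length, and time-bandwidth product are illustrative, not the paper's settings.

    ```python
    import numpy as np
    from scipy.signal import stft
    from scipy.signal.windows import dpss, gaussian

    fs = 5000.0                                   # Hz, illustrative sampling rate
    t = np.arange(0, 2.0, 1.0 / fs)
    # Synthetic stator-current-like signal: 50 Hz supply plus a small sweeping sideband (toy start-up).
    x = np.sin(2 * np.pi * 50 * t) + 0.05 * np.sin(2 * np.pi * (50 + 20 * t) * t)

    nperseg = 512
    slepian_win = dpss(nperseg, NW=2.5)           # first DPSS (Slepian) sequence
    gauss_win = gaussian(nperseg, std=nperseg / 8)

    # Same STFT with the two analysis windows; the Slepian window concentrates energy for a given length.
    for name, win in [("slepian", slepian_win), ("gaussian", gauss_win)]:
        f, tt, Z = stft(x, fs=fs, window=win, nperseg=nperseg, noverlap=nperseg // 2)
        print(name, Z.shape, f"peak bin magnitude = {np.abs(Z).max():.3f}")
    ```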

  8. Dynamic Fluid in a Porous Transducer-Based Angular Accelerometer

    PubMed Central

    Cheng, Siyuan; Fu, Mengyin; Wang, Meiling; Ming, Li; Fu, Huijin; Wang, Tonglei

    2017-01-01

    This paper presents a theoretical model of the dynamics of liquid flow in an angular accelerometer comprising a porous transducer in a circular tube of liquid. Wave speed and dynamic permeability of the transducer are considered to describe the relation between angular acceleration and the differential pressure on the transducer. The permeability and streaming potential coupling coefficient of the transducer are determined in the experiments, and special prototypes are utilized to validate the theoretical model in both the frequency and time domains. The model is applied to analyze the influence of structural parameters on the frequency response and the transient response of the fluidic system. It is shown that the radius of the circular tube and the wave speed affect the low frequency gain, as well as the bandwidth of the sensor. The hydrodynamic resistance of the transducer and the cross-section radius of the circular tube can be used to control the transient performance. The proposed model provides the basic techniques to achieve the optimization of the angular accelerometer together with the methodology to control the wave speed and the hydrodynamic resistance of the transducer. PMID:28230793

  9. Fault Diagnosis of Induction Machines in a Transient Regime Using Current Sensors with an Optimized Slepian Window.

    PubMed

    Burriel-Valencia, Jordi; Puche-Panadero, Ruben; Martinez-Roman, Javier; Sapena-Bano, Angel; Pineda-Sanchez, Manuel

    2018-01-06

    The aim of this paper is to introduce a new methodology for the fault diagnosis of induction machines working in the transient regime, when time-frequency analysis tools are used. The proposed method relies on the use of the optimized Slepian window for performing the short time Fourier transform (STFT) of the stator current signal. It is shown that for a given sequence length of finite duration, the Slepian window has the maximum concentration of energy, greater than can be reached with a gated Gaussian window, which is usually used as the analysis window. In this paper, the use and optimization of the Slepian window for fault diagnosis of induction machines is theoretically introduced and experimentally validated through the test of a 3.15-MW induction motor with broken bars during the start-up transient. The theoretical analysis and the experimental results show that the use of the Slepian window can highlight the fault components in the current's spectrogram with a significant reduction of the required computational resources.

  10. [Cognitive neuroscience of aging. Contributions and challenges].

    PubMed

    Díaz, Fernando; Pereiro, Arturo X

    The cognitive neuroscience of aging is a young discipline that has emerged as a result of the combination of: A) the theoretical and explanatory frameworks proposed by the cognitive psychology perspective throughout the second half of the twentieth century; B) the designs and methodological procedures arising from experimental psychology and the need to test the hypotheses proposed from the cognitive psychology perspective; C) the contributions of the computer sciences to the explanation of brain functions; and D) the development and use of neuroimaging techniques that have enabled the recording of brain activity in humans while tasks that test some cognitive process or function are performed. An analysis of the impact of research conducted from this perspective over the last 3 decades has been carried out, including its shortcomings, as well as the potential directions and usefulness that will advantageously continue to drive this discipline in its description and explanation of the processes of cerebral and cognitive aging. Copyright © 2017 SEGG. Published by Elsevier España, S.L.U. All rights reserved.

  11. An investigation into some implications of a Vygotskian perspective on the origins of mind: psychoanalysis and Vygotskian psychology, Part I.

    PubMed

    Wilson, A; Weinstein, L

    1992-01-01

    The Russian psychologist Lev Vygotsky proposed an analysis of language, thought, and internalization that has direct relevance to the current concerns of psychoanalysts. Striking methodological and conceptual similarities and useful complementarities with psychoanalysis are discovered when one peers beneath the surface of Vygotskian psychology. Our adaptation of Vygotsky's views expands upon the role Freud assigned to language in the topographic model. We suggest that the analysand's speech offers several windows into the history of the individual, through prosody, tropes, word meaning, and word sense. We particularly emphasize Vygotsky's views on the genesis and utilization of word meanings. The acquisition of word meanings will contain key elements of the internal climate present when the word meaning was forged. Bearing this in mind, crucial theoretical questions follow, such as how psychoanalysis is to understand the unconscious fantasies, identifications, anxieties, and defenses associated with the psychodynamics of language acquisition and later language usage. We propose that the clinical situation is an ideal place to test these hypotheses.

  12. Suppression of seizures based on the multi-coupled neural mass model.

    PubMed

    Cao, Yuzhen; Ren, Kaili; Su, Fei; Deng, Bin; Wei, Xile; Wang, Jiang

    2015-10-01

    Epilepsy is one of the most common serious neurological disorders, affecting approximately 1% of the population worldwide. In order to effectively control seizures, we propose a novel control methodology, which combines feedback linearization control (FLC) with the underlying mechanism of epilepsy, to achieve the suppression of seizures. A three-coupled neural mass model is constructed to study the properties of electroencephalograms (EEGs). Meanwhile, with the model we study the propagation of epileptiform waves and the synchronization of populations, which are taken as the foundation of our control method. Results show that the proposed approach not only yields excellent performance in clamping the pathological spiking patterns to the reference signals derived under the normal state but also achieves the normalization of the pathological parameter, where the parameters are estimated from EEGs with an Unscented Kalman Filter. The specific contribution of this paper is to treat epilepsy from its pathogenesis with the FLC, which provides a critical theoretical basis for the clinical treatment of neurological disorders.

  13. What is adaptive about adaptive decision making? A parallel constraint satisfaction account.

    PubMed

    Glöckner, Andreas; Hilbig, Benjamin E; Jekel, Marc

    2014-12-01

    There is broad consensus that human cognition is adaptive. However, the vital question of how exactly this adaptivity is achieved has remained largely open. Herein, we contrast two frameworks which account for adaptive decision making, namely broad and general single-mechanism accounts vs. multi-strategy accounts. We propose and fully specify a single-mechanism model for decision making based on parallel constraint satisfaction processes (PCS-DM) and contrast it theoretically and empirically against a multi-strategy account. To achieve sufficiently sensitive tests, we rely on a multiple-measure methodology including choice, reaction time, and confidence data as well as eye-tracking. Results show that manipulating the environmental structure produces clear adaptive shifts in choice patterns - as both frameworks would predict. However, results on the process level (reaction time, confidence), in information acquisition (eye-tracking), and from cross-predicting choice consistently corroborate single-mechanisms accounts in general, and the proposed parallel constraint satisfaction model for decision making in particular. Copyright © 2014 Elsevier B.V. All rights reserved.
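
    A generic sketch of the kind of iterative parallel-constraint-satisfaction (spreading-activation) update such models rely on; the cue-option network, weights, and parameters are invented for illustration and do not reproduce the fitted PCS-DM model.

    ```python
    import numpy as np

    # Toy network: 2 cue nodes and 2 option nodes; weights encode which option each cue favors.
    # All weights and parameters are illustrative placeholders.
    W = np.array([
        [0.0,  0.0,  0.8, -0.2],   # cue 1 -> options A, B
        [0.0,  0.0, -0.1,  0.6],   # cue 2 -> options A, B
        [0.8, -0.1,  0.0, -1.0],   # option A (options inhibit each other)
        [-0.2, 0.6, -1.0,  0.0],   # option B
    ])
    source = np.array([1.0, 1.0, 0.0, 0.0])   # cues receive constant external input
    decay, floor, ceil = 0.1, -1.0, 1.0

    a = np.zeros(4)
    for _ in range(200):                       # iterate until the activations settle
        net = W @ a + 0.1 * source
        grow = np.where(net > 0, ceil - a, a - floor)
        a = np.clip(a * (1.0 - decay) + net * grow, floor, ceil)

    print("option activations:", a[2], a[3])   # the higher activation marks the favored option
    ```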

  14. Systematic errors in the determination of the spectroscopic g-factor in broadband ferromagnetic resonance spectroscopy: A proposed solution

    NASA Astrophysics Data System (ADS)

    Gonzalez-Fuentes, C.; Dumas, R. K.; García, C.

    2018-01-01

    A theoretical and experimental study of the influence of small offsets of the magnetic field (δH) on the measurement accuracy of the spectroscopic g-factor (g) and saturation magnetization (Ms) obtained by broadband ferromagnetic resonance (FMR) measurements is presented. The random nature of δH generates systematic, opposite-sign deviations of the values of g and Ms from their true values. A δH on the order of a few Oe leads to a ~10% error in g and Ms for a typical range of frequencies employed in broadband FMR experiments. We propose a simple experimental methodology to significantly minimize the effect of δH on the fitted values of g and Ms, eliminating their apparent dependence on the range of frequencies employed. Our method was successfully tested using broadband FMR measurements on a 5 nm thick Ni80Fe20 film for frequencies ranging between 3 and 17 GHz.
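
    A minimal sketch of the fitting step, assuming the common in-plane Kittel relation f = (μ0 γ / 2π) √[(H + δH)(H + δH + Ms)] with γ = g μB/ħ and a constant field offset δH as an extra fit parameter; the synthetic data and the "true" parameter values are placeholders, not the Ni80Fe20 results.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    mu0, muB, hbar = 4e-7 * np.pi, 9.274e-24, 1.0546e-34

    def kittel(H, g, Ms, dH):
        """Resonance frequency (Hz) vs applied field H (A/m) for an in-plane film,
        including a constant field offset dH (A/m)."""
        gamma = g * muB / hbar
        Heff = H + dH
        return (mu0 * gamma / (2 * np.pi)) * np.sqrt(Heff * (Heff + Ms))

    # Synthetic "measured" resonance fields (placeholder truth: g = 2.11, Ms = 800 kA/m, dH = 1 kA/m).
    rng = np.random.default_rng(2)
    H = np.linspace(2e4, 4e5, 15)                      # A/m
    f_meas = kittel(H, 2.11, 8.0e5, 1.0e3) * (1 + 1e-3 * rng.normal(size=H.size))

    # Fitting dH together with g and Ms removes the systematic bias a field offset would otherwise cause.
    popt, pcov = curve_fit(kittel, H, f_meas, p0=[2.0, 7.0e5, 0.0])
    print("fitted g, Ms (A/m), dH (A/m):", popt)
    ```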

  15. Plasmonic metasurface cavity for simultaneous enhancement of optical electric and magnetic fields in deep subwavelength volume.

    PubMed

    Hong, Jongwoo; Kim, Sun-Je; Kim, Inki; Yun, Hansik; Mun, Sang-Eun; Rho, Junsuk; Lee, Byoungho

    2018-05-14

    It has been difficult to achieve simultaneous plasmonic enhancement of nanoscale light-matter interactions in both the electric and magnetic senses with an easily reproducible fabrication method and a systematic theoretical design rule. In this paper, a novel concept of a flat nanofocusing device is proposed for simultaneously squeezing both electric and magnetic fields into a deep-subwavelength volume (~λ³/538) over a large area. Based on the funneled unit-cell structures and the surface plasmon-assisted coherent interactions between them, an array of rectangular nanocavities, each connected to a tapered nanoantenna (a plasmonic metasurface cavity), is constructed by periodic arrangement of the unit cell. The average enhancement factors of the electric and magnetic field intensities reach about 60 and 22 in the nanocavities, respectively. The outstanding performance of the proposed device is verified numerically and experimentally. We expect that this work will expand methodologies involving optical near-field manipulation over large areas, with related potential applications including nanophotonic sensors, nonlinear responses, and quantum interactions.

  16. Modified Distribution-Free Goodness-of-Fit Test Statistic.

    PubMed

    Chun, So Yeon; Browne, Michael W; Shapiro, Alexander

    2018-03-01

    Covariance structure analysis and its structural equation modeling extensions have become one of the most widely used methodologies in social sciences such as psychology, education, and economics. An important issue in such analysis is to assess the goodness of fit of a model under analysis. One of the most popular test statistics used in covariance structure analysis is the asymptotically distribution-free (ADF) test statistic introduced by Browne (Br J Math Stat Psychol 37:62-83, 1984). The ADF statistic can be used to test models without any specific distribution assumption (e.g., multivariate normal distribution) of the observed data. Despite its advantage, it has been shown in various empirical studies that unless sample sizes are extremely large, this ADF statistic could perform very poorly in practice. In this paper, we provide a theoretical explanation for this phenomenon and further propose a modified test statistic that improves the performance in samples of realistic size. The proposed statistic deals with the possible ill-conditioning of the involved large-scale covariance matrices.

  17. A Systematic Process for Developing High Quality SaaS Cloud Services

    NASA Astrophysics Data System (ADS)

    La, Hyun Jung; Kim, Soo Dong

    Software-as-a-Service (SaaS) is a type of cloud service which provides software functionality through the Internet. Its benefits are well received in academia and industry. To fully realize these benefits, there should be effective methodologies to support the development of SaaS services with high reusability and applicability. Conventional approaches such as object-oriented methods do not effectively support SaaS-specific engineering activities such as modeling common features, modeling variability, and designing quality services. In this paper, we present a systematic process for developing high quality SaaS and highlight the essential role of commonality and variability (C&V) modeling in maximizing reusability. We first define criteria for designing the process model and provide a theoretical foundation for SaaS: its meta-model and C&V model. We clarify the notion of commonality and variability in SaaS, and propose a SaaS development process which is accompanied by engineering instructions. Using the proposed process, SaaS services of high quality can be developed effectively.

  18. Intervention bioethics: a proposal for peripheral countries in a context of power and injustice.

    PubMed

    Garrafa, Volnei; Porto, Dora

    2003-10-01

    The bioethics of the so-called 'peripheral countries' must preferably be concerned with persistent situations, that is, with those problems that are still happening, but should not happen anymore in the 21st century. Resulting conflicts cannot be exclusively analysed based on ethical (or bioethical) theories derived from 'central countries.' The authors warn of the growing lack of political analysis of moral conflicts and of human indignation. The indiscriminate utilisation of the bioethics justification as a neutral methodological tool softens and even cancels out the seriousness of several problems, even those that might result in the most profound social distortions. The current study takes as a theoretical reference the fact that natural resources (which affect us all) are relevant. Based on these premises, and on the concept that equity means 'treating unevenly the unequal', a proposal of a hard bioethics (or intervention bioethics) is introduced, in defence of the historical insights and rights of economically and socially excluded populations that are separated from the international developmental process.

  19. Development of Innovative Business Model of Modern Manager's Qualities

    ERIC Educational Resources Information Center

    Yashkova, Elena V.; Sineva, Nadezda L.; Shkunova, Angelika A.; Bystrova, Natalia V.; Smirnova, Zhanna V.; Kolosova, Tatyana V.

    2016-01-01

    The paper defines a complex of manager's qualities based on theoretical and methodological analysis and synthesis methods, available national and world literature, research papers and publications. The complex approach methodology was used, which provides an innovative view of the development of modern manager's qualities. The methodological…

  20. The Critical Period Concept: Research, Methodology, and Theoretical Issues.

    ERIC Educational Resources Information Center

    Colombo, John

    1982-01-01

    Considers evidence on the criteria and characteristics of critical period phenomena with respect to endogenous and exogenous influences. Describes and evaluates methodology of critical period research and discusses past attempts at subclassification of the field and "recovery of function" as a refutation of the critical period…

  1. A New Methodology for Systematic Exploitation of Technology Databases.

    ERIC Educational Resources Information Center

    Bedecarrax, Chantal; Huot, Charles

    1994-01-01

    Presents the theoretical aspects of a data analysis methodology that can help transform sequential raw data from a database into useful information, using the statistical analysis of patents as an example. Topics discussed include relational analysis and a technology watch approach. (Contains 17 references.) (LRW)

  2. Educational Policymaking and the Methodology of Positive Economics: A Theoretical Critique

    ERIC Educational Resources Information Center

    Gilead, Tal

    2014-01-01

    By critically interrogating the methodological foundations of orthodox economic theory, Tal Gilead challenges the growing conviction in educational policymaking quarters that, being more scientific than other forms of educational investigation, inquiries grounded in orthodox economics should provide the basis for educational policymaking. He…

  3. Lean for Education

    ERIC Educational Resources Information Center

    LeMahieu, Paul G.; Nordstrum, Lee E.; Greco, Patricia

    2017-01-01

    Purpose: This paper is one of seven in this volume that aims to elaborate different approaches to quality improvement in education. It delineates a methodology called Lean for Education. Design/methodology/approach: The paper presents the origins, theoretical foundations, core concepts and a case study demonstrating an application in US education,…

  4. Developing Idiomatic Competence in the ESOL Classroom: A Pragmatic Account

    ERIC Educational Resources Information Center

    Liontas, John I.

    2015-01-01

    Building on previous theoretical constructs and empirical findings on idioms, this article advances an integrated theoretical and methodological framework for developing idiomatic competence in English for speakers of other languages (ESOL). Beginning with a definition of the term "idiomatic competence," the author then presents a…

  5. Researching Society and Culture.

    ERIC Educational Resources Information Center

    Seale, Clive, Ed.

    This book provides theoretically informed guidance to practicing the key research methods for investigating society and culture. It is a text in both methods and methodology, in which the importance of understanding the historical, theoretical and institutional context in which particular methods have developed is stressed. The contributors of the…

  6. Action Research: Theory and Applications

    ERIC Educational Resources Information Center

    Jefferson, Renée N.

    2014-01-01

    Action research as a methodology is suitable for use within academic library settings. Its theoretical foundations are located in several disciplines and its applications span across many professions. In this article, an overview of the theoretical beginnings and evolution of action research is presented. Approaches generally used in conducting an…

  7. Adjectives That Aren't: An ERP-Theoretical Analysis of Adjectives in Spanish

    ERIC Educational Resources Information Center

    Bartlett, Laura B.

    2013-01-01

    This thesis investigates the syntactic status of adjectives in Spanish through a crossdisciplinary perspective, incorporating methodologies from both theoretical linguistics and neurolinguistics, specifically, event-related potentials (ERPs). It presents conflicting theories about the syntax of adjectives and explores the ways that the processing…

  8. Statistical Anomalies of Bitflips in SRAMs to Discriminate SBUs From MCUs

    NASA Astrophysics Data System (ADS)

    Clemente, Juan Antonio; Franco, Francisco J.; Villa, Francesca; Baylac, Maud; Rey, Solenne; Mecha, Hortensia; Agapito, Juan A.; Puchner, Helmut; Hubert, Guillaume; Velazco, Raoul

    2016-08-01

    Recently, the occurrence of multiple events in static tests has been investigated by checking the statistical distribution of the difference between the addresses of the words containing bitflips. That method has been successfully applied to Field Programmable Gate Arrays (FPGAs) and the original authors indicate that it is also valid for SRAMs. This paper presents a modified methodology that is based on checking the XORed addresses with bitflips, rather than on the difference. Irradiation tests on CMOS 130 & 90 nm SRAMs with 14-MeV neutrons have been performed to validate this methodology. Results in high-altitude environments are also presented and cross-checked with theoretical predictions. In addition, this methodology has also been used to detect modifications in the organization of said memories. Theoretical predictions have been validated with actual data provided by the manufacturer.
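
    A minimal sketch of the address-XOR check on synthetic bitflip addresses (the cluster pattern and threshold are illustrative): XOR values that repeat far more often than expected for independent single-bit upsets flag candidate multiple-cell upsets.

    ```python
    from collections import Counter
    from itertools import combinations
    import random

    random.seed(0)
    ADDR_BITS = 16

    # Synthetic static-test result: 60 independent single-bit upsets plus 5 multiple-cell
    # upsets, each flipping two words whose addresses differ by the same bit (XOR = 0b10).
    sbu_addrs = random.sample(range(1 << ADDR_BITS), 60)
    mcu_addrs = []
    for _ in range(5):
        base = random.randrange(1 << ADDR_BITS)
        mcu_addrs += [base, base ^ 0b10]
    addrs = sbu_addrs + mcu_addrs

    # Histogram of pairwise XORed addresses; values repeating far above chance are the anomaly.
    xor_counts = Counter(a ^ b for a, b in combinations(addrs, 2))
    n_pairs = len(addrs) * (len(addrs) - 1) // 2
    expected = n_pairs / (1 << ADDR_BITS)          # mean count per XOR value if upsets were independent

    suspicious = {v: c for v, c in xor_counts.items() if c >= 4}   # illustrative threshold
    print(f"pairs={n_pairs}, expected count per XOR value ~ {expected:.3f}")
    print("over-represented XOR signatures (candidate MCU patterns):",
          {f"0x{v:04x}": c for v, c in suspicious.items()})
    ```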

  9. Generating or developing grounded theory: methods to understand health and illness.

    PubMed

    Woods, Phillip; Gapp, Rod; King, Michelle A

    2016-06-01

    Grounded theory is a qualitative research methodology that aims to explain social phenomena, e.g. why particular motivations or patterns of behaviour occur, at a conceptual level. Developed in the 1960s by Glaser and Strauss, the methodology has been reinterpreted by Strauss and Corbin in more recent times, resulting in different schools of thought. Differences arise from different philosophical perspectives concerning knowledge (epistemology) and the nature of reality (ontology), demanding that researchers make clear theoretical choices at the commencement of their research when choosing this methodology. Compared to other qualitative methods, it has the ability to achieve an understanding of, rather than simply describe, a social phenomenon. Achieving understanding, however, requires theoretical sampling to choose interviewees that can contribute most to the research and understanding of the phenomenon, and constant comparison of interviews to evaluate the same event or process in different settings or situations. Sampling continues until conceptual saturation is reached, i.e. when no new concepts emerge from the data. Data analysis focusses on categorising data (finding the main elements of what is occurring and why), and describing those categories in terms of properties (conceptual characteristics that define the category and give meaning) and dimensions (the variations within properties which produce specificity and range). Ultimately a core category which theoretically explains how all other categories are linked together is developed from the data. While achieving theoretical abstraction in the core category, it should be logical and capture all of the variation within the data. Theory development requires understanding of the methodology, not just working through a set of procedures. This article provides a basic overview, set in the literature surrounding grounded theory, for those wanting to increase their understanding and the quality of their research output.

  10. 78 FR 4369 - Rates for Interstate Inmate Calling Services

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-22

    .... Marginal Location Methodology. In 2008, ICS providers submitted the ICS Provider Proposal for ICS rates. The ICS Provider Proposal uses the ``marginal location'' methodology, previously adopted by the... ``marginal location'' methodology provides a ``basis for rates that represent `fair compensation' as set...

  11. 78 FR 48720 - Agency Information Collection Activities; Proposed Collection; Comment Request: Methodological...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-09

    ... DEPARTMENT OF JUSTICE Office of Justice Programs [OMB Number 1121-NEW] Agency Information Collection Activities; Proposed Collection; Comment Request: Methodological Research To Support the National... Redesign Research (NCVS-RR) program: Methodological Research to Support the National Crime Victimization...

  12. 78 FR 66954 - Agency Information Collection Activities: Proposed Collection; Comments Requested Methodological...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-07

    ... DEPARTMENT OF JUSTICE Office of Justice Programs [OMB No. 1121-NEW] Agency Information Collection Activities: Proposed Collection; Comments Requested Methodological Research To Support the National Crime... related to the National Crime Victimization Survey Redesign Research (NCVS-RR) program: Methodological...

  13. Situational Methodology as Multifaceted Pedagogical Tool of Influence on the Formation of Socio-Ethical Values of Future Managers-Economists in Higher Schools of Ukraine and Germany

    ERIC Educational Resources Information Center

    Sikaliuk, Anzhela

    2014-01-01

    The role and importance of situational methodology as one of the pedagogical tools of influence on the formation of socio-ethical values of future managers in higher schools of Ukraine and Germany have been theoretically substantiated. The possibilities of situational methodology influence on the formation of socio-ethical values of…

  14. A transformation theory of stochastic evolution in Red Moon methodology to time evolution of chemical reaction process in the full atomistic system.

    PubMed

    Suzuki, Yuichi; Nagaoka, Masataka

    2017-05-28

    Atomistic information on a whole chemical reaction system, e.g., instantaneous microscopic molecular structures and orientations, offers deeper insight into unknown chemical phenomena. In accordance with the progress of a number of simultaneous chemical reactions, the Red Moon method (a hybrid Monte Carlo/molecular dynamics reaction method) is capable of simulating atomistically the chemical reaction process of complex chemical reaction systems from an initial state to the final one. In the present study, we have proposed a transformation theory to interpret the chemical reaction process of the Red Moon methodology as a time evolution process in harmony with chemical kinetics. For the demonstration of the theory, we have chosen the gas-phase reaction system in which the reversible second-order reaction H2 + I2 ⇌ 2HI occurs. First, the chemical reaction process was simulated from an initial configurational arrangement containing a number of H2 and I2 molecules, at each of 300 K, 500 K, and 700 K. To reproduce the chemical equilibrium of the system, the collision frequencies for the reactions were taken into consideration in the theoretical treatment. As a result, the calculated equilibrium concentrations [H2]eq and equilibrium constants Keq at all temperatures were in good agreement with the corresponding experimental values. Further, we applied the theoretical treatment for the time transformation to the system and have shown that the calculated half-lives τ of [H2] reproduce very well the analytical ones at all temperatures. It is, therefore, concluded that the application of the present theoretical treatment with the Red Moon method makes it possible to analyze reasonably the time evolution of complex chemical reaction systems toward chemical equilibrium at the atomistic level.
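
    The reversible second-order kinetics invoked above can be reproduced with a few lines of numerical integration. The rate constants and initial concentrations below are illustrative only, not the Red Moon values; the sketch simply recovers Keq = kf/kr and a half-life of [H2].

      # Hedged sketch: time evolution of the reversible reaction H2 + I2 <-> 2HI.
      import numpy as np
      from scipy.integrate import solve_ivp

      kf, kr = 1.0e-2, 5.0e-4          # forward / reverse rate constants (assumed units)
      a0 = 1.0                         # initial [H2] = [I2]

      def rhs(t, y):
          h2, i2, hi = y
          r = kf * h2 * i2 - kr * hi ** 2
          return [-r, -r, 2.0 * r]

      sol = solve_ivp(rhs, (0.0, 2000.0), [a0, a0, 0.0], dense_output=True, rtol=1e-8)

      # Equilibrium constant from the long-time limit
      h2_eq, i2_eq, hi_eq = sol.y[:, -1]
      print("K_eq =", hi_eq ** 2 / (h2_eq * i2_eq), " (analytic:", kf / kr, ")")

      # Half-life of [H2]: first time [H2] drops to half its initial value
      ts = np.linspace(0.0, 2000.0, 20001)
      h2 = sol.sol(ts)[0]
      print("tau_1/2 =", ts[np.argmax(h2 <= a0 / 2)])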

  15. Influence of defect distribution on the thermoelectric properties of FeNbSb based materials.

    PubMed

    Guo, Shuping; Yang, Kaishuai; Zeng, Zhi; Zhang, Yongsheng

    2018-05-21

    Doping and alloying are important methodologies to improve the thermoelectric performance of FeNbSb based materials. To fully understand the influence of point defects on the thermoelectric properties, we have used density functional calculations in combination with the cluster expansion and Monte Carlo methods to examine the defect distribution behaviors in the mesoscopic FeNb1-xVxSb and FeNb1-xTixSb systems. We find that V and Ti exhibit different distribution behaviors in FeNbSb at low temperature: forming the FeNbSb-FeVSb phase separations in the FeNb1-xVxSb system but two thermodynamically stable phases in FeNb1-xTixSb. Based on the calculated effective mass and band degeneracy, it seems the doping concentration of V or Ti in FeNbSb has little effect on the electrical properties, except for one of the theoretically predicted stable Ti phases (Fe6Nb5Ti1Sb6). Thus, an essential methodology to improve the thermoelectric performance of FeNbSb should rely on phonon scattering to decrease the thermal conductivity. According to the theoretically determined phase diagrams of Fe(Nb,V)Sb and Fe(Nb,Ti)Sb, we propose the (composition, temperature) conditions for the experimental synthesis to improve the thermoelectric performance of FeNbSb based materials: lowering the experimental preparation temperature to around the phase boundary to form a mixture of the solid solution and phase separation. The point defects in the solid solution effectively scatter the short-wavelength phonons and the (coherent or incoherent) interfaces introduced by the phase separation can additionally scatter the middle-wavelength phonons to further decrease the thermal conductivity. Moreover, the induced interfaces could enhance the Seebeck coefficient as well, through the energy filtering effect. Our results give insight into the understanding of the impact of the defect distribution on the thermoelectric performance of materials and strengthen the connection between theoretical predictions and experimental measurements.

  16. Using Counter-Stories to Challenge Stock Stories about Traveller Families

    ERIC Educational Resources Information Center

    D'Arcy, Kate

    2017-01-01

    Critical Race Theory (CRT) is formed from a series of different methodological tools to expose and address racism and discrimination. Counter-stories are one of these tools. This article considers the potential of counter-stories as a methodological, theoretical and practical tool to analyse existing educational inequalities for Traveller…

  17. Allometric scaling theory applied to FIA biomass estimation

    Treesearch

    David C. Chojnacky

    2002-01-01

    Tree biomass estimates in the Forest Inventory and Analysis (FIA) database are derived from numerous methodologies whose abundance and complexity raise questions about consistent results throughout the U.S. A new model based on allometric scaling theory ("WBE") offers simplified methodology and a theoretically sound basis for improving the reliability and...

  18. Network Analysis in Comparative Social Sciences

    ERIC Educational Resources Information Center

    Vera, Eugenia Roldan; Schupp, Thomas

    2006-01-01

    This essay describes the pertinence of Social Network Analysis (SNA) for the social sciences in general, and discusses its methodological and conceptual implications for comparative research in particular. The authors first present a basic summary of the theoretical and methodological assumptions of SNA, followed by a succinct overview of its…

  19. Becoming an Entrepreneur: Researching the Role of Mentors in Identity Construction

    ERIC Educational Resources Information Center

    Rigg, Clare; O'Dwyer, Breda

    2012-01-01

    Purpose: The purpose of this paper is to provide a theoretical discussion of a developing epistemology and methodology for a qualitative study of participants of enterprise education in south-west Ireland, run collaboratively between third level academics, a regional development agency, and entrepreneurs. Design/methodology/approach: The…

  20. Six Sigma in Education

    ERIC Educational Resources Information Center

    LeMahieu, Paul G.; Nordstrum, Lee E.; Cudney, Elizabeth A.

    2017-01-01

    Purpose: This paper is one of seven in this volume that aims to elaborate different approaches to quality improvement in education. It delineates a methodology called Six Sigma. Design/methodology/approach: The paper presents the origins, theoretical foundations, core principles and a case study demonstrating an application of Six Sigma in a…

  1. Positive Deviance: Learning from Positive Anomalies

    ERIC Educational Resources Information Center

    LeMahieu, Paul G.; Nordstrum, Lee E.; Gale, Dick

    2017-01-01

    Purpose: This paper is one of seven in this volume, each elaborating different approaches to quality improvement in education. The purpose of this paper is to delineate a methodology called positive deviance. Design/methodology/approach: The paper presents the origins, theoretical foundations, core principles and a case study demonstrating an…

  2. Development of a Teaching Methodology for Undergraduate Human Development in Psychology

    ERIC Educational Resources Information Center

    Rodriguez, Maria A.; Espinoza, José M.

    2015-01-01

    The development of a teaching methodology for the undergraduate Psychology course Human Development II in a private university in Lima, Peru is described. The theoretical framework consisted of an integration of Citizen Science and Service Learning, with the application of Information and Communications Technology (ICT), specifically Wikipedia and…

  3. Applying Threshold Concepts to Finance Education

    ERIC Educational Resources Information Center

    Hoadley, Susan; Wood, Leigh N.; Tickle, Leonie; Kyng, Tim

    2016-01-01

    Purpose: The purpose of this paper is to investigate and identify threshold concepts that are the essential conceptual content of finance programmes. Design/Methodology/Approach: Conducted in three stages with finance academics and students, the study uses threshold concepts as both a theoretical framework and a research methodology. Findings: The…

  4. Researching Assessment as Social Practice: Implications for Research Methodology

    ERIC Educational Resources Information Center

    Shay, Suellen

    2008-01-01

    Recent educational journals on both sides of the Atlantic have seen a resurgence of debate about the nature of educational research. As a contribution to these debates, this paper draws on theoretical and methodological "thinking tools" of French sociologist Pierre Bourdieu. Specifically, the paper explores what Jenkins [Jenkins, R.…

  5. Consideracoes Extemporaneas acerca das Metodologias Qualitativas (Extemporaneous Considerations about Qualitative Methodology).

    ERIC Educational Resources Information Center

    Pucci, Bruno

    2000-01-01

    Considers the differences between quantitative and qualitative research. Cites some essays by Adorno when he was living in New York which led to the conclusion that empirical data has much to say and discusses the theoretical-methodological contributions in a recent master's thesis in education. (BT)

  6. Capturing Individual Uptake: Toward a Disruptive Research Methodology

    ERIC Educational Resources Information Center

    Bastian, Heather

    2015-01-01

    This article presents and illustrates a qualitative research methodology for studies of uptake. It does so by articulating a theoretical framework for qualitative investigations of uptake and detailing a research study designed to invoke and capture students' uptakes in a first-year writing classroom. The research design sought to make uptake…

  7. Impact Evaluation of Quality Assurance in Higher Education: Methodology and Causal Designs

    ERIC Educational Resources Information Center

    Leiber, Theodor; Stensaker, Bjørn; Harvey, Lee

    2015-01-01

    In this paper, the theoretical perspectives and general methodological elements of impact evaluation of quality assurance in higher education institutions are discussed, which should be a cornerstone of quality development in higher education and contribute to improving the knowledge about the effectiveness (or ineffectiveness) of quality…

  8. 78 FR 50111 - Agency Information Collection Activities; Proposed Collection; Comment Request: Methodological...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-16

    ... DEPARTMENT OF JUSTICE Office of Justice Programs [OMB Number 1121-NEW] Agency Information Collection Activities; Proposed Collection; Comment Request: Methodological Research to Support the National...: Methodological Research to Support the National Crime Victimization Survey: Self-Report Data on Rape and Sexual...

  9. Theoretical Bases for Teacher- and Peer-Delivered Sexual Health Promotion

    ERIC Educational Resources Information Center

    Wight, Daniel

    2008-01-01

    Purpose: This paper seeks to explore the theoretical bases for teacher-delivered and peer-delivered sexual health promotion and education. Design/methodology/approach: The first section briefly outlines the main theories informing sexual health interventions for young people, and the second discusses their implications for modes of delivery.…

  10. Data, Methods, and Theoretical Implications

    ERIC Educational Resources Information Center

    Hannagan, Rebecca J.; Schneider, Monica C.; Greenlee, Jill S.

    2012-01-01

    Within the subfields of political psychology and the study of gender, the introduction of new data collection efforts, methodologies, and theoretical approaches are transforming our understandings of these two fields and the places at which they intersect. In this article we present an overview of the research that was presented at a National…

  11. University Students' Understanding of the Concepts Empirical, Theoretical, Qualitative and Quantitative Research

    ERIC Educational Resources Information Center

    Murtonen, Mari

    2015-01-01

    University research education in many disciplines is frequently confronted by problems with students' weak level of understanding of research concepts. A mind map technique was used to investigate how students understand central methodological concepts of empirical, theoretical, qualitative and quantitative. The main hypothesis was that some…

  12. Military Design Insights for Online Education Program Evaluation: A Revised Theoretical Construct

    ERIC Educational Resources Information Center

    Culkin, David T.

    2017-01-01

    This theoretical development article examines how design methodology currently applied in United States military doctrine can offer insights into the increasingly complex arena of program evaluations of online modes for adult distance education. The article presents key themes that emerge from a multidisciplinary literature review. These themes…

  13. Levels of analysis in neuroscientific studies of emotion: Comment on "The quartet theory of human emotions: an integrative and neurofunctional model" by S. Koelsch et al.

    NASA Astrophysics Data System (ADS)

    Kuiken, Don; Douglas, Shawn

    2015-06-01

    In the conduct of neuroscience research, methodological choices and theoretical claims often reveal underlying metamethodological and ontological commitments. Koelsch et al. [1] accentuate such commitments in their description of four "neuroanatomically distinct systems," each the substrate of "a specific class of affects" (p. 1). Explication of those classes of affect requires theoretical integration across methodologically diverse disciplines, including "psychology, neurobiology, sociology, anthropology, and psycholinguistics" (p. 3). (Philosophy is noticeably missing from this list, but several aspects of the authors' stance indicate that it is not ignored.)

  14. Games and Diabetes: A Review Investigating Theoretical Frameworks, Evaluation Methodologies, and Opportunities for Design Grounded in Learning Theories.

    PubMed

    Lazem, Shaimaa; Webster, Mary; Holmes, Wayne; Wolf, Motje

    2015-09-02

    Here we review 18 articles that describe the design and evaluation of 1 or more games for diabetes from technical, methodological, and theoretical perspectives. We undertook searches covering the period 2010 to May 2015 in the ACM, IEEE, Journal of Medical Internet Research, Studies in Health Technology and Informatics, and Google Scholar online databases using the keywords "children," "computer games," "diabetes," "games," "type 1," and "type 2" in various Boolean combinations. The review sets out to establish, for future research, an understanding of the current landscape of digital games designed for children with diabetes. We briefly explored the use and impact of well-established learning theories in such games. The most frequently mentioned theoretical frameworks were social cognitive theory and social constructivism. Due to the limitations of the reported evaluation methodologies, little evidence was found to support the strong promise of games for diabetes. Furthermore, we could not establish a relation between design features and the game outcomes. We argue that an in-depth discussion about the extent to which learning theories could and should be manifested in the design decisions is required. © 2015 Diabetes Technology Society.

  15. Temperature - Emissivity Separation Assessment in a Sub-Urban Scenario

    NASA Astrophysics Data System (ADS)

    Moscadelli, M.; Diani, M.; Corsini, G.

    2017-10-01

    In this paper, a methodology that aims at evaluating the effectiveness of different TES strategies is presented. The methodology takes into account the specific material of interest in the monitored scenario, the sensor characteristics, and errors in the atmospheric compensation step. It is intended to predict and analyse algorithm performance during the planning of a remote sensing mission aimed at detecting specific materials of interest in the monitored scenario. As a case study, the proposed methodology is applied to a real airborne data set of a suburban scenario. To address the TES problem, three state-of-the-art algorithms and a recently proposed one are investigated: the Temperature-Emissivity Separation '98 (TES-98) algorithm, the Stepwise Refining TES (SRTES) algorithm, the Linear Piecewise TES (LTES) algorithm, and the Optimized Smoothing TES (OSTES) algorithm. Finally, the accuracies obtained with real data and those predicted by means of the proposed methodology are compared and discussed.
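
    For readers unfamiliar with the TES problem, the sketch below shows a normalized-emissivity (NEM-style) first guess of the kind that algorithms such as TES-98 start from; the band set, emissivities and assumed maximum emissivity are illustrative, and atmospheric terms are omitted.

      # Hedged sketch: NEM-style temperature/emissivity first guess on synthetic radiances.
      import numpy as np

      C1 = 1.191042e-16   # 2*h*c**2  [W m^2 sr^-1]
      C2 = 1.438777e-2    # h*c/k_B   [m K]

      def planck(lam, T):
          """Spectral radiance B(lam, T), lam in metres."""
          return C1 / (lam ** 5 * (np.exp(C2 / (lam * T)) - 1.0))

      def brightness_temperature(lam, L):
          """Invert the Planck function for the radiance L at wavelength lam."""
          return C2 / (lam * np.log(1.0 + C1 / (lam ** 5 * L)))

      # Synthetic atmospherically compensated radiances for a 320 K surface
      lam = np.array([8.6, 9.1, 10.4, 11.3, 12.1]) * 1e-6
      eps_true = np.array([0.96, 0.94, 0.97, 0.98, 0.97])
      L = eps_true * planck(lam, 320.0)

      # NEM step: assume a maximum emissivity, take the hottest band temperature,
      # then recover the emissivity spectrum from that temperature estimate.
      eps_max = 0.98
      T_est = np.max(brightness_temperature(lam, L / eps_max))
      eps_est = L / planck(lam, T_est)
      print(T_est, eps_est)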

  16. Shape-programmable magnetic soft matter

    PubMed Central

    Lum, Guo Zhan; Ye, Zhou; Dong, Xiaoguang; Marvi, Hamid; Erin, Onder; Hu, Wenqi; Sitti, Metin

    2016-01-01

    Shape-programmable matter is a class of active materials whose geometry can be controlled to potentially achieve mechanical functionalities beyond those of traditional machines. Among these materials, magnetically actuated matter is particularly promising for achieving complex time-varying shapes at small scale (overall dimensions smaller than 1 cm). However, previous work can only program these materials for limited applications, as they rely solely on human intuition to approximate the required magnetization profile and actuating magnetic fields for their materials. Here, we propose a universal programming methodology that can automatically generate the required magnetization profile and actuating fields for soft matter to achieve new time-varying shapes. The universality of the proposed method can therefore inspire a vast number of miniature soft devices that are critical in robotics, smart engineering surfaces and materials, and biomedical devices. Our proposed method includes theoretical formulations, computational strategies, and fabrication procedures for programming magnetic soft matter. The presented theory and computational method are universal for programming 2D or 3D time-varying shapes, whereas the fabrication technique is generic only for creating planar beams. Based on the proposed programming method, we created a jellyfish-like robot, a spermatozoid-like undulating swimmer, and an artificial cilium that could mimic the complex beating patterns of its biological counterpart. PMID:27671658

  17. Shape-programmable magnetic soft matter.

    PubMed

    Lum, Guo Zhan; Ye, Zhou; Dong, Xiaoguang; Marvi, Hamid; Erin, Onder; Hu, Wenqi; Sitti, Metin

    2016-10-11

    Shape-programmable matter is a class of active materials whose geometry can be controlled to potentially achieve mechanical functionalities beyond those of traditional machines. Among these materials, magnetically actuated matter is particularly promising for achieving complex time-varying shapes at small scale (overall dimensions smaller than 1 cm). However, previous work can only program these materials for limited applications, as they rely solely on human intuition to approximate the required magnetization profile and actuating magnetic fields for their materials. Here, we propose a universal programming methodology that can automatically generate the required magnetization profile and actuating fields for soft matter to achieve new time-varying shapes. The universality of the proposed method can therefore inspire a vast number of miniature soft devices that are critical in robotics, smart engineering surfaces and materials, and biomedical devices. Our proposed method includes theoretical formulations, computational strategies, and fabrication procedures for programming magnetic soft matter. The presented theory and computational method are universal for programming 2D or 3D time-varying shapes, whereas the fabrication technique is generic only for creating planar beams. Based on the proposed programming method, we created a jellyfish-like robot, a spermatozoid-like undulating swimmer, and an artificial cilium that could mimic the complex beating patterns of its biological counterpart.

  18. Shape-programmable magnetic soft matter

    NASA Astrophysics Data System (ADS)

    Zhan Lum, Guo; Ye, Zhou; Dong, Xiaoguang; Marvi, Hamid; Erin, Onder; Hu, Wenqi; Sitti, Metin

    2016-10-01

    Shape-programmable matter is a class of active materials whose geometry can be controlled to potentially achieve mechanical functionalities beyond those of traditional machines. Among these materials, magnetically actuated matter is particularly promising for achieving complex time-varying shapes at small scale (overall dimensions smaller than 1 cm). However, previous work can only program these materials for limited applications, as they rely solely on human intuition to approximate the required magnetization profile and actuating magnetic fields for their materials. Here, we propose a universal programming methodology that can automatically generate the required magnetization profile and actuating fields for soft matter to achieve new time-varying shapes. The universality of the proposed method can therefore inspire a vast number of miniature soft devices that are critical in robotics, smart engineering surfaces and materials, and biomedical devices. Our proposed method includes theoretical formulations, computational strategies, and fabrication procedures for programming magnetic soft matter. The presented theory and computational method are universal for programming 2D or 3D time-varying shapes, whereas the fabrication technique is generic only for creating planar beams. Based on the proposed programming method, we created a jellyfish-like robot, a spermatozoid-like undulating swimmer, and an artificial cilium that could mimic the complex beating patterns of its biological counterpart.

  19. Correntropy-based partial directed coherence for testing multivariate Granger causality in nonlinear processes

    NASA Astrophysics Data System (ADS)

    Kannan, Rohit; Tangirala, Arun K.

    2014-06-01

    Identification of directional influences in multivariate systems is of prime importance in several applications of engineering and sciences such as plant topology reconstruction, fault detection and diagnosis, and neurosciences. A spectrum of related directionality measures, ranging from linear measures such as partial directed coherence (PDC) to nonlinear measures such as transfer entropy, has emerged over the past two decades. The PDC-based technique is simple and effective but, being a linear directionality measure, has limited applicability. On the other hand, transfer entropy, despite being a robust nonlinear measure, is computationally intensive and practically implementable only for bivariate processes. The objective of this work is to develop a nonlinear directionality measure, termed KPDC, that possesses the simplicity of PDC but is still applicable to nonlinear processes. The technique is founded on a nonlinear measure called correntropy, a recently proposed generalized correlation measure. The proposed method is equivalent to constructing PDC in a kernel space, where the PDC is estimated using a vector autoregressive model built on correntropy. A consistent estimator of the KPDC is developed and important theoretical results are established. A permutation scheme combined with the sequential Bonferroni procedure is proposed for testing the hypothesis of absence of causality. It is demonstrated through several case studies that the proposed methodology effectively detects Granger causality in nonlinear processes.
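
    As background for the proposal, the sketch below computes ordinary (linear) PDC from a least-squares VAR fit on a toy bivariate system; the correntropy-based kernel step that turns this into KPDC is not reproduced, and the signal and model order are illustrative.

      # Hedged sketch: linear partial directed coherence from a least-squares VAR(p) fit.
      import numpy as np

      def fit_var(x, p):
          """Least-squares VAR(p) fit; x has shape (n_samples, n_channels)."""
          n, k = x.shape
          Y = x[p:]
          X = np.hstack([x[p - r: n - r] for r in range(1, p + 1)])
          coefs, *_ = np.linalg.lstsq(X, Y, rcond=None)
          # p matrices A_r of shape (k, k), with A_r[i, j] the lag-r effect of j on i
          return [coefs[(r - 1) * k: r * k].T for r in range(1, p + 1)]

      def pdc(A, freqs):
          """|PDC_ij(f)|: normalised directed influence of channel j on channel i."""
          k = A[0].shape[0]
          out = np.empty((len(freqs), k, k))
          for fi, f in enumerate(freqs):
              Abar = np.eye(k, dtype=complex)
              for r, Ar in enumerate(A, start=1):
                  Abar -= Ar * np.exp(-2j * np.pi * f * r)
              out[fi] = np.abs(Abar) / np.sqrt((np.abs(Abar) ** 2).sum(axis=0, keepdims=True))
          return out

      # Toy bivariate system in which channel 0 drives channel 1
      rng = np.random.default_rng(0)
      n = 5000
      x = rng.standard_normal((n, 2))
      for t in range(2, n):
          x[t, 0] += 0.6 * x[t - 1, 0]
          x[t, 1] += 0.5 * x[t - 1, 1] + 0.4 * x[t - 1, 0]

      P = pdc(fit_var(x, p=2), freqs=np.linspace(0.0, 0.5, 6))
      print(P[0])   # entry [1, 0] (0 -> 1) should dominate entry [0, 1]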

  20. Deriving in vivo biotransformation rate constants and metabolite parent concentration factor/stable metabolite factor from bioaccumulation and bioconcentration experiments: An illustration with worm accumulation data.

    PubMed

    Kuo, Dave Ta Fu; Chen, Ciara Chun

    2016-12-01

    Growing concern for the biological fate of organic contaminants and their metabolites and the urge to connect in vitro and in vivo toxicokinetics have prompted researchers to characterize the biotransformation behavior of organic contaminants in biota. The whole body biotransformation rate constant (kM) is currently determined by the difference approach, which has significant methodological limitations. A new approach for determining kM from the kinetic observations of the parent contaminant and its intermediate metabolites is proposed. In this method, kM can be determined by fitting kinetic data of the parent contaminant and the metabolites to analytical equations that depict the bioaccumulation kinetics. The application of the proposed method is illustrated using worm bioaccumulation-biotransformation data collected from the literature. Furthermore, a metabolite parent concentration factor (MPCF) is also proposed to characterize the persistence of the metabolite in biota. Because both the proposed kM method and MPCF build on the existing theoretical framework for bioaccumulation, they can be readily incorporated into standard experimental bioaccumulation protocols or risk assessment procedures or frameworks. Possible limitations, implications, and future directions are elaborated. Environ Toxicol Chem 2016;35:2903-2909. © 2016 SETAC.
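
    A toy version of the proposed fitting idea is sketched below: parent and metabolite time courses are generated from an assumed first-order model and kM is recovered by joint least-squares fitting. The model structure, exposure scenario and all rate values are assumptions for illustration, not the equations of the paper.

      # Hedged sketch: estimating the biotransformation rate constant kM from parent and
      # metabolite kinetics under constant exposure and first-order toxicokinetics.
      import numpy as np
      from scipy.integrate import solve_ivp
      from scipy.optimize import least_squares

      def simulate(ku, ke, kM, kem, cw, t):
          def rhs(_, y):
              P, M = y
              return [ku * cw - (ke + kM) * P, kM * P - kem * M]
          return solve_ivp(rhs, (t[0], t[-1]), [0.0, 0.0], t_eval=t).y

      t = np.linspace(0.0, 10.0, 21)
      true = dict(ku=50.0, ke=0.3, kM=0.5, kem=0.8, cw=1.0)
      rng = np.random.default_rng(1)
      data = simulate(**true, t=t) * rng.normal(1.0, 0.05, (2, t.size))   # noisy observations

      def residuals(theta):
          ke, kM, kem = theta
          return (simulate(true["ku"], ke, kM, kem, true["cw"], t) - data).ravel()

      fit = least_squares(residuals, x0=[0.1, 0.1, 0.1], bounds=(0.0, np.inf))
      print("estimated (ke, kM, kem):", fit.x)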

  1. Efficient digital implementation of a conductance-based globus pallidus neuron and the dynamics analysis

    NASA Astrophysics Data System (ADS)

    Yang, Shuangming; Wei, Xile; Deng, Bin; Liu, Chen; Li, Huiyan; Wang, Jiang

    2018-03-01

    Balancing the biological plausibility of dynamical activities against computational efficiency is one of the challenging problems in computational neuroscience and neural system engineering. This paper proposes a set of efficient methods for the hardware realization of conductance-based neuron models with the relevant dynamics, targeting the reproduction of biological behaviors with a low-cost implementation on a digital programmable platform; the methods can be applied to a wide range of conductance-based neuron models. Modified globus pallidus (GP) neuron models for efficient hardware implementation are presented to reproduce reliable pallidal dynamics, which decode the information of the basal ganglia and regulate voluntary activities related to movement disorders. Implementation results on a field-programmable gate array (FPGA) demonstrate that the proposed techniques and models reduce the resource cost significantly and reproduce the biological dynamics accurately. In addition, biological behaviors under weak network coupling are explored on the proposed platform, and a theoretical analysis is carried out to investigate the biological characteristics of the structured pallidal oscillator and network. The implementation techniques provide an essential step towards large-scale neural networks that can explore dynamical mechanisms in real time. Furthermore, the proposed methodology makes the FPGA-based system a powerful platform for the investigation of neurodegenerative diseases and the real-time control of bio-inspired neuro-robotics.
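
    As a software reference point for the kind of model these hardware techniques target, the sketch below integrates a classic Hodgkin-Huxley conductance-based neuron with forward Euler; the paper's modified globus pallidus model and its FPGA-specific simplifications are not reproduced.

      # Hedged sketch: forward-Euler integration of a standard Hodgkin-Huxley neuron.
      import numpy as np

      C_m, g_Na, g_K, g_L = 1.0, 120.0, 36.0, 0.3      # uF/cm^2, mS/cm^2
      E_Na, E_K, E_L = 50.0, -77.0, -54.4              # mV

      def alpha_m(V): return 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
      def beta_m(V):  return 4.0 * np.exp(-(V + 65.0) / 18.0)
      def alpha_h(V): return 0.07 * np.exp(-(V + 65.0) / 20.0)
      def beta_h(V):  return 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
      def alpha_n(V): return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
      def beta_n(V):  return 0.125 * np.exp(-(V + 65.0) / 80.0)

      dt, T, I_ext = 0.01, 100.0, 10.0                 # ms, ms, uA/cm^2
      V, m, h, n = -65.0, 0.05, 0.6, 0.32
      spikes, prev_V = 0, V
      for _ in range(int(T / dt)):
          I_ion = g_Na * m**3 * h * (V - E_Na) + g_K * n**4 * (V - E_K) + g_L * (V - E_L)
          V_new = V + dt * (I_ext - I_ion) / C_m
          m += dt * (alpha_m(V) * (1.0 - m) - beta_m(V) * m)
          h += dt * (alpha_h(V) * (1.0 - h) - beta_h(V) * h)
          n += dt * (alpha_n(V) * (1.0 - n) - beta_n(V) * n)
          if prev_V < 0.0 <= V_new:                    # crude spike count at zero crossings
              spikes += 1
          prev_V, V = V_new, V_new
      print("spikes in", T, "ms:", spikes)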

  2. 75 FR 14165 - National Institute of Child Health and Human Development; Revision to Proposed Collection...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-24

    ... Information Collection: The purpose of the proposed methodological study is to evaluate the feasibility... the NCS, the multiple methodological studies conducted during the Vanguard phase will inform the... methodological study is identification of recruitment strategies and components of recruitment strategies that...

  3. 48 CFR 1552.215-72 - Instructions for the Preparation of Proposals.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... used. If escalation is included, state the degree (percent) and methodology. The methodology shall.... If so, state the number required, the professional or technical level and the methodology used to... for which the salary is applicable; (C) List of other research Projects or proposals for which...

  4. The use of grounded theory in studies of nurses and midwives' coping processes: a systematic literature search.

    PubMed

    Cheer, Karen; MacLaren, David; Tsey, Komla

    2015-01-01

    Researchers are increasingly using grounded theory methodologies to study the professional experience of nurses and midwives. To review common grounded theory characteristics and research design quality as described in grounded theory studies of coping strategies used by nurses and midwives. A systematic database search for 2005-2015 identified and assessed grounded theory characteristics from 16 studies. Study quality was assessed using a modified Critical Appraisal Skills Programme tool. Grounded theory was considered a methodology or a set of methods, able to be used within different nursing and midwifery contexts. Specific research requirements determined the common grounded theory characteristics used in different studies. Most researchers did not clarify their epistemological and theoretical perspectives. To improve research design and trustworthiness of grounded theory studies in nursing and midwifery, researchers need to state their theoretical stance and clearly articulate their use of grounded theory methodology and characteristics in research reporting.

  5. The role of risk perception in making flood risk management more effective

    NASA Astrophysics Data System (ADS)

    Buchecker, M.; Salvini, G.; Di Baldassarre, G.; Semenzin, E.; Maidl, E.; Marcomini, A.

    2013-11-01

    Over the last few decades, Europe has suffered from a number of severe flood events and, as a result, there has been a growing interest in probing alternative approaches to managing flood risk via prevention measures. A literature review reveals that, although in the last decades risk evaluation has been recognized as key element of risk management, and risk assessment methodologies (including risk analysis and evaluation) have been improved by including social, economic, cultural, historical and political conditions, the theoretical schemes are not yet applied in practice. One main reason for this shortcoming is that risk perception literature is mainly of universal and theoretical nature and cannot provide the necessary details to implement a comprehensive risk evaluation. This paper therefore aims to explore a procedure that allows the inclusion of stakeholders' perceptions of prevention measures in risk assessment. It proposes to adopt methods of risk communication (both one-way and two-way communication) in risk assessment with the final aim of making flood risk management more effective. The proposed procedure not only focuses on the effect of discursive risk communication on risk perception, and on achieving a shared assessment of the prevention alternatives, but also considers the effects of the communication process on perceived uncertainties, accepted risk levels, and trust in the managing institutions. The effectiveness of this combined procedure has been studied and illustrated using the example of the participatory flood prevention assessment process on the Sihl River in Zurich, Switzerland. The main findings of the case study suggest that the proposed procedure performed well, but that it needs some adaptations for it to be applicable in different contexts and to allow a (semi-) quantitative estimation of risk perception to be used as an indicator of adaptive capacity.

  6. Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria

    NASA Astrophysics Data System (ADS)

    Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong

    2017-08-01

    In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for the reliability and sensitivity analysis of complex components with arbitrary distribution parameters is investigated, using the perturbation method, the response surface method, the Edgeworth series, and a sensitivity analysis approach. The RSBDO methodology is then established by incorporating the sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. By comparison with Monte Carlo simulation, the numerical results demonstrate that the proposed methodology provides an accurate, convergent, and computationally efficient method for reliability-analysis-based finite element modeling in engineering practice.
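
    The comparison with Monte Carlo simulation can be illustrated on a toy limit state g = R - S with normal variables, as sketched below; the paper's perturbation, response surface and Edgeworth-series machinery for CMC components is not reproduced, and the distributions are assumptions.

      # Hedged sketch: moment-based reliability index versus crude Monte Carlo for g = R - S.
      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(42)
      mu_R, sd_R = 320.0, 25.0      # resistance, assumed normal
      mu_S, sd_S = 240.0, 30.0      # load effect, assumed normal

      # Moment-based estimate: beta = mu_g / sigma_g, Pf ~ Phi(-beta)
      beta = (mu_R - mu_S) / np.hypot(sd_R, sd_S)
      pf_moment = norm.cdf(-beta)

      # Crude Monte Carlo estimate of the same failure probability
      n = 2_000_000
      g = rng.normal(mu_R, sd_R, n) - rng.normal(mu_S, sd_S, n)
      pf_mc = np.mean(g < 0.0)

      print(f"beta = {beta:.3f}, Pf(moment) = {pf_moment:.2e}, Pf(MC) = {pf_mc:.2e}")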

  7. Applications of information theory, genetic algorithms, and neural models to predict oil flow

    NASA Astrophysics Data System (ADS)

    Ludwig, Oswaldo; Nunes, Urbano; Araújo, Rui; Schnitman, Leizer; Lepikson, Herman Augusto

    2009-07-01

    This work introduces a new information-theoretic methodology for choosing variables and their time lags in a prediction setting, particularly when neural networks are used in non-linear modeling. The first contribution of this work is the Cross Entropy Function (XEF), proposed to select input variables and their lags in order to compose the input vector of black-box prediction models. The proposed XEF method is more appropriate than the usually applied Cross Correlation Function (XCF) when the relationship among the input and output signals comes from a non-linear dynamic system. The second contribution is a method that minimizes the Joint Conditional Entropy (JCE) between the input and output variables by means of a Genetic Algorithm (GA). The aim is to take into account the dependence among the input variables when selecting the most appropriate set of inputs for a prediction problem. In short, these methods can be used to assist the selection of input training data that have the necessary information to predict the target data. The proposed methods are applied to a petroleum engineering problem: predicting oil production. Experimental results obtained with a real-world dataset are presented, demonstrating the feasibility and effectiveness of the method.
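
    The flavour of information-theoretic input selection can be conveyed with a histogram-based mutual information ranking of candidate lags, sketched below; this stand-in is not the paper's XEF or GA-based JCE minimisation, and the toy series is fabricated.

      # Hedged sketch: ranking candidate input lags by a plug-in mutual information estimate.
      import numpy as np

      def mutual_information(x, y, bins=16):
          """Plug-in MI estimate (nats) from a 2-D histogram."""
          pxy, _, _ = np.histogram2d(x, y, bins=bins)
          pxy = pxy / pxy.sum()
          px = pxy.sum(axis=1, keepdims=True)
          py = pxy.sum(axis=0, keepdims=True)
          nz = pxy > 0
          return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

      # Toy series in which the target depends nonlinearly on the input delayed by 3 samples
      rng = np.random.default_rng(7)
      u = rng.standard_normal(5000)
      y = np.tanh(2.0 * np.roll(u, 3)) + 0.1 * rng.standard_normal(5000)

      scores = {lag: mutual_information(u[:-lag], y[lag:]) for lag in range(1, 7)}
      print(max(scores, key=scores.get), scores)   # lag 3 should score highest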

  8. What Is Everyday Ethics? A Review and a Proposal for an Integrative Concept.

    PubMed

    Zizzo, Natalie; Bell, Emily; Racine, Eric

    2016-01-01

    "Everyday ethics" is a term that has been used in the clinical and ethics literature for decades to designate normatively important and pervasive issues in healthcare. In spite of its importance, the term has not been reviewed and analyzed carefully. We undertook a literature review to understand how the term has been employed and defined, finding that it is often contrasted to "dramatic ethics." We identified the core attributes most commonly associated with everyday ethics. We then propose an integrative model of everyday ethics that builds on the contribution of different ethical theories. This model proposes that the function of everyday ethics is to serve as an integrative concept that (1) helps to detect current blind spots in bioethics (that is, shifts the focus from dramatic ethics) and (2) mobilizes moral agents to address these shortcomings of ethical insight. This novel integrative model has theoretical, methodological, practical, and pedagogical implications, which we explore. Because of the pivotal role that moral experience plays in this integrative model, the model could help to bridge empirical ethics research with more conceptual and normative work. Copyright 2016 The Journal of Clinical Ethics. All rights reserved.

  9. Space-based infrared scanning sensor LOS determination and calibration using star observation

    NASA Astrophysics Data System (ADS)

    Chen, Jun; Xu, Zhan; An, Wei; Deng, Xin-Pu; Yang, Jun-Gang

    2015-10-01

    This paper provides a novel methodology for removing sensor bias from a space-based infrared (IR) system (SBIRS) through the use of stars detected in the background field of the sensor. A space-based IR system uses the line of sight (LOS) to a target for target location. LOS determination and calibration is a key precondition for accurate location and tracking of targets in a space-based IR system, and LOS calibration of a scanning sensor is one of the main difficulties. Subsequent changes of the sensor bias are not taken into account in the conventional LOS determination and calibration process. Based on an analysis of the imaging process of the scanning sensor, a theoretical model for the estimation of the bias angles using star observations is proposed. The process model of the bias angles and the observation model of the stars are established, an extended Kalman filter (EKF) is used to estimate the bias angles, and the sensor LOS is then calibrated. Time-domain simulation results indicate that the proposed method achieves high precision and smooth performance for sensor LOS determination and calibration. The timeliness and precision requirements of the target tracking process in a space-based IR tracking system can be met with the proposed algorithm.
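
    The core estimation idea can be sketched with a small-angle Kalman update of two constant bias angles from star residuals, as below; the real scanning-sensor geometry, frame transformations and time-varying attitude are not modelled, and all noise levels are assumptions.

      # Hedged sketch: Kalman-filter estimation of two LOS bias angles from star residuals.
      import numpy as np

      rng = np.random.default_rng(3)
      true_bias = np.array([1.5e-4, -2.0e-4])        # rad, assumed constant
      R = (2.0e-5) ** 2 * np.eye(2)                  # star-centroid measurement noise
      Q = (1.0e-8) ** 2 * np.eye(2)                  # small random-walk process noise

      x = np.zeros(2)                                # bias estimate
      P = (1.0e-3) ** 2 * np.eye(2)                  # initial covariance
      H = np.eye(2)                                  # residual ~ bias (small-angle model)

      for _ in range(200):                           # one star residual per scan
          z = true_bias + rng.multivariate_normal(np.zeros(2), R)
          P = P + Q                                  # predict: bias nearly constant
          K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
          x = x + K @ (z - H @ x)                    # update with the star residual
          P = (np.eye(2) - K @ H) @ P

      print("estimated bias [rad]:", x)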

  10. Requirements and Solutions for Personalized Health Systems.

    PubMed

    Blobel, Bernd; Ruotsalainen, Pekka; Lopez, Diego M; Oemig, Frank

    2017-01-01

    Organizational, methodological and technological paradigm changes enable a precise, personalized, predictive, preventive and participative approach to health and social services, supported by multiple actors from different domains at diverse levels of knowledge and skill. Interoperability has to advance beyond Information and Communication Technology (ICT) concerns to include the real-world business domains and their processes, as well as the individual context of all actors involved. The paper introduces and compares personalized health definitions, summarizes requirements and principles for pHealth systems, and considers intelligent interoperability. It addresses knowledge representation and harmonization, decision intelligence, and usability as crucial issues in pHealth. On this basis, a system-theoretical, ontology-based, policy-driven reference architecture model for open and intelligent pHealth ecosystems and its transformation into an appropriate ICT design and implementation is proposed.

  11. The Euclidean model of measurement in Fechner's psychophysics.

    PubMed

    Zudini, Verena

    2011-01-01

    Historians acknowledge Euclid and Fechner, respectively, as the founders of classical geometry and classical psychophysics. At all times, their ideas have been reference points and have shared the same destiny of being criticized, corrected, and even radically rejected, in their theoretical and methodological aspects and in their epistemological value. According to a model of measurement of magnitudes which goes back to Euclid, Fechner (1860) developed a theory for psychical magnitudes that opened a lively debate among numerous scholars. Fechner's attempt to apply the model proposed by Euclid to subjective sensation magnitudes--and the debate that followed--generated ideas and concepts that were destined to have rich developments in the psychological and (more generally) scientific field of the twentieth century and that still animate current psychophysics. © 2011 Wiley Periodicals, Inc.

  12. Historical antecedents to the philosophy of Paul Feyerabend.

    PubMed

    Munévar, Gonzalo

    2016-06-01

    Paul Feyerabend has been considered a very radical philosopher of science for proposing that we may advance hypotheses contrary to well-confirmed experimental results, that observations make theoretical assumptions, that all methodological rules have exceptions, that ordinary citizens may challenge the judgment of experts, and that human happiness should be a key value for science. As radical as these theses may sound, they all have historical antecedents. In defending the Copernican view, Galileo exemplified the first two; Mill, Aristotle and Machiavelli all argued for pluralism; Aristotle gave commonsense reasons for why ordinary citizens may be able to judge the work of experts; and a combination of Plato's and Aristotle's views can offer strong support for the connection between science and happiness. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. Toward a new approach to the study of personality in culture.

    PubMed

    Cheung, Fanny M; van de Vijver, Fons J R; Leong, Frederick T L

    2011-10-01

    We review recent developments in the study of culture and personality measurement. Three approaches are described: an etic approach that focuses on establishing measurement equivalence in imported measures of personality, an emic (indigenous) approach that studies personality in specific cultures, and a combined emic-etic approach to personality. We propose the latter approach as a way of combining the methodological rigor of the etic approach and the cultural sensitivity of the emic approach. The combined approach is illustrated by two examples: the first with origins in Chinese culture and the second in South Africa. The article ends with a discussion of the theoretical and practical implications of the combined emic-etic approach for the study of culture and personality and for psychology as a science.

  14. A passivity based control methodology for flexible joint robots with application to a simplified shuttle RMS arm

    NASA Technical Reports Server (NTRS)

    Sicard, Pierre; Wen, John T.

    1991-01-01

    The main goal is to develop a general theory for the control of flexible robots, including flexible joint robots, flexible link robots, rigid bodies with flexible appendages, etc. As part of the validation, the theory is applied to the control law development for a test example which consists of a three-link arm modeled after the shoulder yaw joint of the space shuttle remote manipulator system (RMS). The performance of the closed loop control system is then compared with the performance of the existing RMS controller to demonstrate the effectiveness of the proposed approach. The theoretical foundation of this new approach to the control of flexible robots is presented and its efficacy is demonstrated through simulation results on the three-link test arm.

  15. Generating Nonnormal Multivariate Data Using Copulas: Applications to SEM.

    PubMed

    Mair, Patrick; Satorra, Albert; Bentler, Peter M

    2012-07-01

    This article develops a procedure based on copulas to simulate multivariate nonnormal data that satisfy a prespecified variance-covariance matrix. The covariance matrix used can comply with a specific moment structure form (e.g., a factor analysis or a general structural equation model). Thus, the method is particularly useful for Monte Carlo evaluation of structural equation models within the context of nonnormal data. The new procedure for nonnormal data simulation is theoretically described and also implemented in the widely used R environment. The quality of the method is assessed by Monte Carlo simulations. A 1-sample test on the observed covariance matrix based on the copula methodology is proposed. This new test for evaluating the quality of a simulation is defined through a particular structural model specification and is robust against normality violations.
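
    The basic copula construction behind the procedure can be sketched in a few lines: draw latent Gaussian variables with a chosen correlation, map them to uniforms, then apply inverse marginal CDFs. The latent correlation below is not adjusted to reproduce a prespecified covariance exactly after the marginal transforms, which is part of what the article addresses; marginals and values are illustrative.

      # Hedged sketch: Gaussian-copula simulation of correlated nonnormal data.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(11)
      latent_corr = np.array([[1.0, 0.6, 0.3],
                              [0.6, 1.0, 0.5],
                              [0.3, 0.5, 1.0]])

      n = 100_000
      z = rng.multivariate_normal(np.zeros(3), latent_corr, size=n)
      u = stats.norm.cdf(z)                           # copula step: uniform marginals

      # Skewed and heavy-tailed marginals chosen for illustration
      x = np.column_stack([stats.expon(scale=2.0).ppf(u[:, 0]),
                           stats.t(df=5).ppf(u[:, 1]),
                           stats.lognorm(s=0.8).ppf(u[:, 2])])

      print(np.corrcoef(x, rowvar=False).round(3))    # close to, not equal to, latent_corr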

  16. On methods of estimating cosmological bulk flows

    NASA Astrophysics Data System (ADS)

    Nusser, Adi

    2016-01-01

    We explore similarities and differences between several estimators of the cosmological bulk flow, B, from the observed radial peculiar velocities of galaxies. A distinction is made between two theoretical definitions of B as a dipole moment of the velocity field weighted by a radial window function. One definition involves the three-dimensional (3D) peculiar velocity, while the other is based on its radial component alone. Different methods attempt at inferring B for either of these definitions which coincide only for the case of a velocity field which is constant in space. We focus on the Wiener Filtering (WF) and the Constrained Minimum Variance (CMV) methodologies. Both methodologies require a prior expressed in terms of the radial velocity correlation function. Hoffman et al. compute B in Top-Hat windows from a WF realization of the 3D peculiar velocity field. Feldman et al. infer B directly from the observed velocities for the second definition of B. The WF methodology could easily be adapted to the second definition, in which case it will be equivalent to the CMV with the exception of the imposed constraint. For a prior with vanishing correlations or very noisy data, CMV reproduces the standard Maximum Likelihood estimation for B of the entire sample independent of the radial weighting function. Therefore, this estimator is likely more susceptible to observational biases that could be present in measurements of distant galaxies. Finally, two additional estimators are proposed.
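
    The maximum-likelihood estimator mentioned above reduces to a 3 x 3 linear solve, sketched below on mock data; the survey geometry, noise level and true flow are fabricated, and no velocity-correlation prior (as in WF or CMV) is used.

      # Hedged sketch: maximum-likelihood bulk flow from radial peculiar velocities.
      import numpy as np

      rng = np.random.default_rng(5)
      n = 3000
      true_B = np.array([250.0, -100.0, 50.0])                 # km/s

      # Random sky directions and noisy radial velocities u_i = B . rhat_i + eps_i
      rhat = rng.standard_normal((n, 3))
      rhat /= np.linalg.norm(rhat, axis=1, keepdims=True)
      sigma = np.full(n, 300.0)                                # velocity errors, km/s
      u = rhat @ true_B + rng.normal(0.0, sigma)

      # Minimise sum_i (u_i - B . rhat_i)^2 / sigma_i^2 via the 3x3 normal equations
      w = 1.0 / sigma ** 2
      A = (rhat * w[:, None]).T @ rhat
      b = (rhat * w[:, None]).T @ u
      print("estimated bulk flow [km/s]:", np.linalg.solve(A, b))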

  17. Evaluating perceptual integration: uniting response-time- and accuracy-based methodologies.

    PubMed

    Eidels, Ami; Townsend, James T; Hughes, Howard C; Perry, Lacey A

    2015-02-01

    This investigation brings together a response-time system identification methodology (e.g., Townsend & Wenger Psychonomic Bulletin & Review 11, 391-418, 2004a) and an accuracy methodology, intended to assess models of integration across stimulus dimensions (features, modalities, etc.) that were proposed by Shaw and colleagues (e.g., Mulligan & Shaw Perception & Psychophysics 28, 471-478, 1980). The goal was to theoretically examine these separate strategies and to apply them conjointly to the same set of participants. The empirical phases were carried out within an extension of an established experimental design called the double factorial paradigm (e.g., Townsend & Nozawa Journal of Mathematical Psychology 39, 321-359, 1995). That paradigm, based on response times, permits assessments of architecture (parallel vs. serial processing), stopping rule (exhaustive vs. minimum time), and workload capacity, all within the same blocks of trials. The paradigm introduced by Shaw and colleagues uses a statistic formally analogous to that of the double factorial paradigm, but based on accuracy rather than response times. We demonstrate that the accuracy measure cannot discriminate between parallel and serial processing. Nonetheless, the class of models supported by the accuracy data possesses a suitable interpretation within the same set of models supported by the response-time data. The supported model, consistent across individuals, is parallel and has limited capacity, with the participants employing the appropriate stopping rule for the experimental setting.

  18. Bayesian data fusion for spatial prediction of categorical variables in environmental sciences

    NASA Astrophysics Data System (ADS)

    Gengler, Sarah; Bogaert, Patrick

    2014-12-01

    First developed to predict continuous variables, Bayesian Maximum Entropy (BME) has become a complete framework in the context of space-time prediction since it has been extended to predict categorical variables and mixed random fields. This method proposes solutions to combine several sources of data whatever the nature of the information. However, the various attempts that were made to adapt the BME methodology to categorical variables and mixed random fields faced some limitations, such as a high computational burden. The main objective of this paper is to overcome this limitation by generalizing the Bayesian Data Fusion (BDF) theoretical framework to categorical variables, which is in some sense a simplification of the BME method through the convenient conditional independence hypothesis. The BDF methodology for categorical variables is first described and then applied to a practical case study: the estimation of soil drainage classes using a soil map and point observations in the sandy area of Flanders around the city of Mechelen (Belgium). The BDF approach is compared to BME along with more classical approaches, such as Indicator CoKriging (ICK) and logistic regression. Estimators are compared using various indicators, namely the Percentage of Correctly Classified locations (PCC) and the Average Highest Probability (AHP). Although the BDF methodology for categorical variables is in some sense a simplification of the BME approach, both methods lead to similar results and have strong advantages compared to ICK and logistic regression.
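
    The conditional-independence fusion at the heart of BDF can be written in a few lines, as sketched below for a single location; the prior and the per-source class probabilities are illustrative, and the spatial (kriging-like) ingredients of the full method are not reproduced.

      # Hedged sketch: fusing per-source class probabilities under conditional independence.
      import numpy as np

      def bdf_fuse(prior, source_probs):
          """Fuse posteriors p(c|x_i) into p(c|x_1..x_n), assuming the sources x_i are
          conditionally independent given the class c."""
          source_probs = np.asarray(source_probs)     # shape (n_sources, n_classes)
          n = source_probs.shape[0]
          fused = prior ** (1 - n) * np.prod(source_probs, axis=0)
          return fused / fused.sum()

      prior = np.array([0.5, 0.3, 0.2])               # drainage-class prior (assumed)
      p_map = np.array([0.6, 0.3, 0.1])               # evidence from the soil map
      p_pts = np.array([0.2, 0.5, 0.3])               # evidence from nearby point data
      print(bdf_fuse(prior, [p_map, p_pts]))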

  19. Light Bulbs and Change: Systems Thinking and Organisational Learning for New Ventures

    ERIC Educational Resources Information Center

    Hebel, Misha

    2007-01-01

    Purpose: The purpose of the paper is to revisit the practical worth of different systems thinking tools applied to three different business clients, which may be dismissed by academic researchers as theoretically old fashioned. Design/methodology/approach: The methodologies used are systems-based (SSM, VSM and causal loop diagrams), culminating in…

  20. Reflective Pedagogy: The Integration of Methodology and Subject-Matter Content in a Graduate-Level Course

    ERIC Educational Resources Information Center

    Jakeman, Rick C.; Henderson, Markesha M.; Howard, Lionel C.

    2017-01-01

    This article presents a critical reflection on how we, instructors of a graduate-level course in higher education administration, sought to integrate theoretical and subject-matter content and research methodology. Our reflection, guided by autoethnography and teacher reflection, challenged both our assumptions about curriculum design and our…

  1. Trends and Issues in ELT Methods and Methodology

    ERIC Educational Resources Information Center

    Waters, Alan

    2012-01-01

    Trends and issues in ELT methods and methodology can be identified at two main levels. One is in terms of the theoretical pronouncements of the "professional discourse", as manifested by major publications, conference presentations, and so on. This article therefore begins by briefly summarizing some of the main developments of this kind from 1995…

  2. U.S. Comparative and International Graduate Programs: An Overview of Programmatic Size, Relevance, Philosophy, and Methodology

    ERIC Educational Resources Information Center

    Drake, Timothy A.

    2011-01-01

    Previous work has concentrated on the epistemological foundation of comparative and international education (CIE) graduate programs. This study focuses on programmatic size, philosophy, methodology, and pedagogy. It begins by reviewing previous studies. It then provides a theoretical framework and describes the size, relevance, content, and…

  3. Journeys into Inner/Outer Space: Reflections on the Methodological Challenges of Negotiating Insider/Outsider Status in International Educational Research

    ERIC Educational Resources Information Center

    Savvides, Nicola; Al-Youssef, Joanna; Colin, Mindy; Garrido, Cecilia

    2014-01-01

    This article highlights key theoretical and methodological issues and implications of being an insider/outsider when undertaking qualitative research in international educational settings. It first addresses discourses of "self" and "other," noting that identity and belonging emerge from fluid engagement between researchers and…

  4. Towards a Methodology for the Characterization of Teachers' Didactic-Mathematical Knowledge

    ERIC Educational Resources Information Center

    Pino-Fan, Luis R.; Assis, Adriana; Castro, Walter F.

    2015-01-01

    This research study aims at exploring the use of some dimensions and theoretical-methodological tools suggested by the model of Didactic-Mathematical Knowledge (DMK) for the analysis, characterization and development of knowledge that teachers should have in order to efficiently develop within their practice. For this purpose, we analyzed the…

  5. Constraints, Resources, and Interpretative Schema: Explorations of Teachers' Decisions to Utilize, Under-Utilize or Ignore Technology

    ERIC Educational Resources Information Center

    Pereira-Leon, Maura J.

    2010-01-01

    This three-year study examined how participation in a 10-month technology-enhanced professional development program (PDP) influenced K-12 teachers' decisions to utilize or ignore technology into teaching practices. Carspecken's (1996) qualitative research methodology of Critical Ethnography provided the theoretical and methodological framework to…

  6. Contextual Determination of Human Thinking: About Some Conceptual and Methodological Obstacles in Psychology Studies

    ERIC Educational Resources Information Center

    Sorsana, Christine; Trognon, Alain

    2011-01-01

    This theoretical paper discusses some conceptual and methodological obstacles that one encounters when analyzing the contextual determination of thinking in psychology. First, we comment upon the various representations of the "cognitive" individual that have been formed over the years--from the epistemic subject to the psychological subject, and…

  7. Perspectives Do Matter: "Joint Screen", a Promising Methodology for Multimodal Interaction Analysis

    ERIC Educational Resources Information Center

    Arend, Béatrice; Sunnen, Patrick; Fixmer, Pierre; Sujbert, Monika

    2014-01-01

    This paper discusses theoretical and methodological issues arising from a video-based research design and the emergent tool "Joint Screen'"when grasping joint activity. We share our reflections regarding the combined reading of four synchronised camera perspectives combined in one screen. By these means we reconstruct and analyse…

  8. Policy capturing as a method of quantifying the determinants of landscape preference

    Treesearch

    Dennis B. Propst

    1979-01-01

    Policy Capturing, a potential methodology for evaluating landscape preference, was described and tested. This methodology results in a mathematical model that theoretically represents the human decision-making process. Under experimental conditions, judges were asked to express their preferences for scenes of the Blue Ridge Parkway. An equation which "captures,...
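
    In practice a captured policy is usually a regression of one judge's ratings on scene attributes, as in the sketch below; the attributes, weights and ratings are fabricated purely for illustration.

      # Hedged sketch: policy capturing as ordinary least squares on landscape attributes.
      import numpy as np

      rng = np.random.default_rng(9)
      n_scenes = 60
      # Columns: percent tree cover, water present (0/1), visible development (0/1)
      X = np.column_stack([rng.uniform(0, 100, n_scenes),
                           rng.integers(0, 2, n_scenes),
                           rng.integers(0, 2, n_scenes)])
      ratings = (2.0 + 0.05 * X[:, 0] + 1.5 * X[:, 1] - 2.0 * X[:, 2]
                 + rng.normal(0.0, 0.5, n_scenes))

      # "Capture" the judge's policy with an intercept plus attribute weights
      A = np.column_stack([np.ones(n_scenes), X])
      coef, *_ = np.linalg.lstsq(A, ratings, rcond=None)
      print("intercept and attribute weights:", coef.round(2))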

  9. Toward a Reconstruction of Organizational Theory: Androcentric Bias in A. H. Maslow's Theory of Human Motivation and Self-Actualization.

    ERIC Educational Resources Information Center

    Tietze, Irene Nowell; Shakeshaft, Charol

    An exploration in the context of feminist science of one theoretical basis of educational administration--Abraham Maslow's theory of human motivation and self-actualization--finds an androcentric bias in Maslow's methodology, philosophical underpinnings, and theory formulation. Maslow's hypothetico-deductive methodology was based on a…

  10. Scale in Education Research: Towards a Multi-Scale Methodology

    ERIC Educational Resources Information Center

    Noyes, Andrew

    2013-01-01

    This article explores some theoretical and methodological problems concerned with scale in education research through a critique of a recent mixed-method project. The project was framed by scale metaphors drawn from the physical and earth sciences and I consider how recent thinking around scale, for example, in ecosystems and human geography might…

  11. Soft-Systems Methodology. Mendip Papers.

    ERIC Educational Resources Information Center

    Kowszun, J.

    This paper provides an introduction to a particular systems-theoretical approach to problem-solving in the management of education usually referred to as soft-systems methodology (SSM), developed by Peter Checkland in the 1970s. SSM should provide a powerful tool for managers in education at any level who have a strategic role because it can be…

  12. Resisting Coherence: Trans Men's Experiences and the Use of Grounded Theory Methods

    ERIC Educational Resources Information Center

    Catalano, D. Chase J.

    2017-01-01

    In this methodological reflective manuscript, I explore my decision to use a grounded theoretical approach to my dissertation study on trans* men in higher education. Specifically, I question whether grounded theory as a methodology is capable of capturing the complexity and capaciousness of trans*-masculine experiences. Through the lenses of…

  13. Web-Assisted Instruction in Upper Division Communication Studies Curriculum: A Theoretical and Quantitative Analysis

    ERIC Educational Resources Information Center

    Olaniran, Bolanle; Austin, Katherine A.

    2009-01-01

    Purpose: This paper aims to describe the incorporation of technologies into two upper division Communication Studies courses at Texas Tech University. Design/methodology/approach: The article discusses the methodological and pedagogical rationale used to select the appropriate technologies and to effectively incorporate them into the classroom. An…

  14. Sociological Tools in the Study of Knowledge and Practice in Mathematics Teacher Education

    ERIC Educational Resources Information Center

    Parker, Diane; Adler, Jill

    2014-01-01

    In this paper, we put Basil Bernstein's theory of pedagogic discourse to work together with additional theoretical resources to interrogate knowledge and practice in mathematics teacher education. We illustrate this methodology through analysis of an instance of mathematics teacher education pedagogic practice. While the methodology itself is…

  15. From the Analysis of Work-Processes to Designing Competence-Based Occupational Standards and Vocational Curricula

    ERIC Educational Resources Information Center

    Tutlys, Vidmantas; Spöttl, Georg

    2017-01-01

    Purpose: This paper aims to explore methodological and institutional challenges on application of the work-process analysis approach in the design and development of competence-based occupational standards for Lithuania. Design/methodology/approach: The theoretical analysis is based on the review of scientific literature and the analysis of…

  16. Processing Speed and Executive Functions in Cognitive Aging: How to Disentangle Their Mutual Relationship?

    ERIC Educational Resources Information Center

    Albinet, Cedric T.; Boucard, Geoffroy; Bouquet, Cedric; Audiffren, Michel

    2012-01-01

    The processing-speed theory and the prefrontal-executive theory are competing theories of cognitive aging. Here we used a theoretically and methodologically driven framework to investigate the relationships among measures classically used to assess these two theoretical constructs. Twenty-eight young adults (18-32 years) and 39 healthy older adults…

  17. Point and Click: Theoretical and Phenomenological Reflections on the Digitization of Early Childhood Education

    ERIC Educational Resources Information Center

    Mangen, Anne

    2010-01-01

    This article presents some theoretical-methodological reflections on the current state of the art of research on information and communication technology (ICT) in early childhood education. The implementation of ICT in preschool has triggered considerable research activity on the educational potential of digital technologies. Numerous projects and…

  18. School Leadership, Social Justice and Immigration: Examining, Exploring and Extending Two Frameworks

    ERIC Educational Resources Information Center

    Brooks, Jeffrey S.; Normore, Anthony H.; Wilkinson, Jane

    2017-01-01

    Purpose: The purpose of this paper is to explore theoretical connections between educational leadership for social justice and support for immigration. The authors seek to identify strengths, weaknesses and opportunities for further study and improved practice. Design/methodology/approach: This is a theoretical research paper that introduces,…

  19. Early Experience and the Development of Cognitive Competence: Some Theoretical and Methodological Issues.

    ERIC Educational Resources Information Center

    Ulvund, Stein Erik

    1982-01-01

    Argues that in analyzing effects of early experience on the development of cognitive competence, theoretical analyses as well as empirical investigations should be based on a transactional model of development. Shows that the optimal stimulation hypothesis, particularly the enhancement prediction, seems to represent a transactional approach to the study of…

  20. In Search of a Theoretical Basis for Storytelling in Education Research: Story as Method

    ERIC Educational Resources Information Center

    Gallagher, Kathleen Marie

    2011-01-01

    In this article, the author argues that storytelling is centrally important to education research. The proliferation of narrative methodologies, albeit significant and innovative in the evolution of qualitative studies in education, has, nonetheless, not been accompanied by a theoretical body that has captured the complexities--ethical and…

  1. Theoretical Notes on the Sociological Analysis of School Reform Networks

    ERIC Educational Resources Information Center

    Ladwig, James G.

    2014-01-01

    Nearly two decades ago, Ladwig outlined the theoretical and methodological implications of Bourdieu's concept of the social field for sociological analyses of educational policy and school reform. The current analysis extends this work to consider the sociological import of one of the most ubiquitous forms of educational reform found around…

  2. The Soul of Teaching and Professional Learning: An Appreciative Inquiry into the Enneagram of Reflective Practice

    ERIC Educational Resources Information Center

    Luckcock, Tim

    2007-01-01

    This paper makes a contribution to the theory and practice of educational action research by introducing two theoretical and methodological resources as part of a personal review of sustained professional experience: "appreciative inquiry" and the "enneagram". It is more than a theoretical exercise, however, because it also…

  3. A Political Multi-Layered Approach to Researching Children's Digital Literacy Practices

    ERIC Educational Resources Information Center

    Koutsogiannis, Dimitris

    2007-01-01

    This paper attempts to present a theoretical framework for researching the out-of-school digital literacy practices of Greek adolescents. The broader aim, however, is to discuss the theoretical and methodological issues concerning research designs to investigate literacy practices in the globalisation era. Based on data representing local and…

  4. 77 FR 67363 - Sunshine Act Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-09

    ... 20571. Open Agenda Items: Item No. 1: Proposed Economic Impact Procedures and Methodological Guidelines. Documentation including the proposed Economic Impact Procedures and Methodological Guidelines as well as the...

  5. Information-theoretic metamodel of organizational evolution

    NASA Astrophysics Data System (ADS)

    Sepulveda, Alfredo

    2011-12-01

    Social organizations are abstractly modeled by holarchies---self-similar connected networks---and intelligent complex adaptive multiagent systems---large networks of autonomous reasoning agents interacting via scaled processes. However, little is known of how information shapes evolution in such organizations, a gap that can lead to misleading analytics. The research problem addressed in this study was the ineffective manner in which classical model-predict-control methods used in business analytics attempt to define organization evolution. The purpose of the study was to construct an effective metamodel for organization evolution based on a proposed complex adaptive structure---the info-holarchy. Theoretical foundations of this study were holarchies, complex adaptive systems, evolutionary theory, and quantum mechanics, among other recently developed physical and information theories. Research questions addressed how information evolution patterns gleaned from the study's inductive metamodel more aptly explained volatility in organizations. In this study, a hybrid grounded theory based on abstract inductive extensions of information theories was utilized as the research methodology. An overarching heuristic metamodel was framed from the theoretical analysis of the properties of these extension theories and applied to business, neural, and computational entities. This metamodel resulted in the synthesis of a metaphor for, and generalization of, organization evolution, serving as the recommended and appropriate analytical tool to view business dynamics for future applications. This study may manifest positive social change through a fundamental understanding of complexity in business from general information theories, resulting in more effective management.

  6. Developing a theoretical foundation to change road user behavior and improve traffic safety: Driving under the influence of cannabis (DUIC).

    PubMed

    Ward, Nicholas J; Schell, William; Kelley-Baker, Tara; Otto, Jay; Finley, Kari

    2018-05-19

    This study explored a theoretical model to assess the influence of culture on willingness and intention to drive under the influence of cannabis (DUIC). This model is expected to guide the design of strategies to change future DUIC behavior in road users. This study used a survey methodology to obtain a nationally representative sample (n = 941) from the AmeriSpeak Panel. Survey items were designed to measure aspects of a proposed definition of traffic safety culture and a predictive model of its relationship to DUIC. Although the percentage of reported past DUIC behaviors was relatively low (8.5%), this behavior is still a significant public health issue, especially for younger drivers (18-29 years), who reported more DUIC than expected. Findings suggest that specific cultural components (attitudes, norms) reliably predict past DUIC behavior, general DUIC willingness, and future DUIC intention. Most DUIC behavior appears to be deliberate, related significantly to willingness and intention. Intention and willingness both appear to fully moderate the relationship between traffic safety culture and DUIC behavior. This study explored a theoretical model to understand road user behavior involving drug (cannabis)-impaired driving as a significant risk factor for traffic safety. By understanding the cultural factors that increase DUIC behavior, we can create strategies to transform this culture and sustain safer road user behavior.

  7. Highly effective hydrogen isotope separation in nanoporous metal-organic frameworks with open metal sites: direct measurement and theoretical analysis.

    PubMed

    Oh, Hyunchul; Savchenko, Ievgeniia; Mavrandonakis, Andreas; Heine, Thomas; Hirscher, Michael

    2014-01-28

    Separating gaseous mixtures whose components are of very similar size is one of the critical issues in modern separation technology. In particular, the separation of the isotopes hydrogen and deuterium requires special effort, even though these isotopes show a very large mass ratio. Conventionally, H/D separation can be realized through cryogenic distillation of the molecular species or the Girdler-sulfide process, which are among the most energy-intensive separation techniques in the chemical industry. However, costs can be significantly reduced by using highly mass-selective nanoporous sorbents. Here, we describe a hydrogen isotope separation strategy exploiting the strongly attractive open metal sites present in nanoporous metal-organic frameworks of the CPO-27 family (also referred to as MOF-74). A theoretical analysis predicts an outstanding hydrogen isotopologue separation at open metal sites due to isotope effects, which has been directly observed through cryogenic thermal desorption spectroscopy. For H2/D2 separation of an equimolar mixture at 60 K, a selectivity of 12 is the highest value ever measured, and this methodology shows extremely high separation efficiencies even above 77 K. Our theoretical results also imply a high selectivity for HD/H2 separation at similar temperatures, and, together with catalytically active sites, we propose a mechanism to produce D2 from HD/H2 mixtures with natural or enriched deuterium content.
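
    For reference, the selectivity quoted in this record can be read against the conventional adsorption-selectivity definition (not restated in the abstract); for an equimolar feed it reduces to the ratio of adsorbed amounts. A brief LaTeX sketch:

        S_{\mathrm{D_2/H_2}}
          = \frac{x_{\mathrm{D_2}} / x_{\mathrm{H_2}}}{y_{\mathrm{D_2}} / y_{\mathrm{H_2}}}
          \;\longrightarrow\; \frac{x_{\mathrm{D_2}}}{x_{\mathrm{H_2}}} \approx 12
          \qquad (\text{equimolar feed},\; T = 60\,\mathrm{K}),

    where x and y denote adsorbed-phase and gas-phase mole fractions, respectively.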

  8. Stability and performance analysis of a jump linear control system subject to digital upsets

    NASA Astrophysics Data System (ADS)

    Wang, Rui; Sun, Hui; Ma, Zhen-Yang

    2015-04-01

    This paper presents a methodology for analysing the stability and the corresponding tracking performance of a closed-loop digital jump linear control system with a stochastic switching signal. The method is applied to a flight control system. A distributed recoverable platform is implemented on the flight control system and subject to independent digital upsets. The upset processes are used to simulate electromagnetic environments. Specifically, the paper presents scenarios in which the upset process is directly injected into the distributed flight control system, which is modeled by independent Markov upset processes and independent and identically distributed (IID) processes. A theoretical performance analysis and simulation modelling are both presented in detail for a more complete independent digital upset injection. Specific examples are presented to verify the methodology of tracking performance analysis. General analyses for different configurations are also proposed, and comparisons among different configurations are conducted to demonstrate the availability and the characteristics of the design. Project supported by the Young Scientists Fund of the National Natural Science Foundation of China (Grant No. 61403395), the Natural Science Foundation of Tianjin, China (Grant No. 13JCYBJC39000), the Scientific Research Foundation for the Returned Overseas Chinese Scholars, State Education Ministry, China, the Tianjin Key Laboratory of Civil Aircraft Airworthiness and Maintenance in Civil Aviation of China (Grant No. 104003020106), and the Fund for Scholars of Civil Aviation University of China (Grant No. 2012QD21x).
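
    As a hedged illustration of the modelling ingredients named in this record (a jump linear plant driven by an independent Markov upset process), the following minimal Python sketch simulates a discrete-time Markov jump linear system. The two mode matrices and the transition probabilities are invented for illustration and are not the flight-control configuration analysed in the paper.

        import numpy as np

        rng = np.random.default_rng(0)

        # Two modes: nominal dynamics and a degraded "upset" mode (invented values).
        A = [np.array([[0.95, 0.10], [0.00, 0.90]]),   # mode 0: nominal
             np.array([[1.02, 0.10], [0.00, 0.85]])]   # mode 1: upset
        P = np.array([[0.98, 0.02],                    # Markov transition probabilities
                      [0.30, 0.70]])                   # upsets tend to be recovered

        x = np.array([1.0, 0.0])
        mode = 0
        norms = []
        for k in range(200):
            norms.append(float(x @ x))
            x = A[mode] @ x                            # jump linear update x_{k+1} = A_{sigma_k} x_k
            mode = int(rng.choice(2, p=P[mode]))       # Markov switching signal sigma_k

        print("final squared state norm:", norms[-1])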

  9. Accurate theoretical prediction of vibrational frequencies in an inhomogeneous dynamic environment: A case study of a glutamate molecule in water solution and in a protein-bound form

    PubMed Central

    Speranskiy, Kirill; Kurnikova, Maria

    2012-01-01

    We propose a hierarchical approach to model vibrational frequencies of a ligand in a strongly fluctuating inhomogeneous environment such as a liquid solution or when bound to a macromolecule, e.g., a protein. Vibrational frequencies typically measured experimentally are ensemble-averaged quantities which result (in part) from the influence of the strongly fluctuating solvent. Solvent fluctuations can be sampled effectively by a classical molecular simulation, which in our model serves as the first, lower level of the hierarchy. At the second, higher level of the hierarchy, a small subset of system coordinates is used to construct a patch of the potential surface (ab initio) relevant to the vibration in question. This subset of coordinates is under the influence of an instantaneous external force exerted by the environment. The force is calculated at the lower level of the hierarchy. The proposed methodology is applied to model vibrational frequencies of a glutamate in water and when bound to the Glutamate receptor protein and its mutant. Our results are in close agreement with the experimental values and frequency shifts measured by the Jayaraman group using Fourier transform infrared spectroscopy [Q. Cheng et al., Biochem. 41, 1602 (2002)]. Our methodology proved useful in successfully reproducing vibrational frequencies of a ligand in such a soft, flexible, and strongly inhomogeneous protein as the Glutamate receptor. PMID:15260697

  10. A Long Short-Term Memory deep learning network for the prediction of epileptic seizures using EEG signals.

    PubMed

    Tsiouris, Κostas Μ; Pezoulas, Vasileios C; Zervakis, Michalis; Konitsiotis, Spiros; Koutsouris, Dimitrios D; Fotiadis, Dimitrios I

    2018-05-17

    The electroencephalogram (EEG) is the most prominent means to study epilepsy and capture changes in electrical brain activity that could declare an imminent seizure. In this work, Long Short-Term Memory (LSTM) networks are introduced in epileptic seizure prediction using EEG signals, expanding the use of deep learning algorithms with convolutional neural networks (CNN). A pre-analysis is initially performed to find the optimal architecture of the LSTM network by testing several modules and layers of memory units. Based on these results, a two-layer LSTM network is selected to evaluate seizure prediction performance using four different lengths of preictal windows, ranging from 15 min to 2 h. The LSTM model exploits a wide range of features extracted prior to classification, including time and frequency domain features, cross-correlation between EEG channels, and graph-theoretic features. The evaluation, performed using long-term EEG recordings from the open CHB-MIT Scalp EEG database, suggests that the proposed methodology is able to predict all 185 seizures, providing high rates of seizure prediction sensitivity and low false prediction rates (FPR) of 0.11-0.02 false alarms per hour, depending on the duration of the preictal window. The proposed LSTM-based methodology delivers a significant increase in seizure prediction performance compared to both traditional machine learning techniques and convolutional neural networks that have been previously evaluated in the literature. Copyright © 2018 Elsevier Ltd. All rights reserved.
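
    As a hedged sketch of the kind of architecture this record describes (a two-layer LSTM classifying feature sequences extracted from preictal versus interictal EEG windows), the Python/Keras fragment below is illustrative only; the layer sizes, feature dimensions and dummy data are assumptions, not the configuration reported for the CHB-MIT evaluation.

        import numpy as np
        from tensorflow.keras import Input, Sequential
        from tensorflow.keras.layers import LSTM, Dense

        # Illustrative dimensions: 60 feature vectors per window, 128 precomputed
        # features per vector (time/frequency, cross-correlation and graph-based
        # measures would be computed upstream of this classifier).
        n_timesteps, n_features = 60, 128

        model = Sequential([
            Input(shape=(n_timesteps, n_features)),
            LSTM(64, return_sequences=True),    # first recurrent layer
            LSTM(32),                           # second recurrent layer
            Dense(1, activation="sigmoid"),     # P(window is preictal)
        ])
        model.compile(optimizer="adam", loss="binary_crossentropy",
                      metrics=["accuracy"])

        # Random stand-in data in place of features extracted from EEG recordings.
        X = np.random.rand(256, n_timesteps, n_features).astype("float32")
        y = np.random.randint(0, 2, size=(256, 1)).astype("float32")
        model.fit(X, y, epochs=2, batch_size=32, verbose=0)
        print(model.predict(X[:4], verbose=0).ravel())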

  11. Big data and tactical analysis in elite soccer: future challenges and opportunities for sports science.

    PubMed

    Rein, Robert; Memmert, Daniel

    2016-01-01

    Until recently, tactical analyses in elite soccer were based on observational data using variables which discard most contextual information. Analyses of team tactics require, however, detailed data from various sources including technical skill, individual physiological performance, and team formations among others to represent the complex processes underlying team tactical behavior. Accordingly, little is known about how these different factors influence team tactical behavior in elite soccer. In part, this has also been due to the lack of available data. Increasingly, however, detailed game logs obtained through next-generation tracking technologies, in addition to physiological training data collected through novel miniature sensor technologies, have become available for research. This leads, however, to the opposite problem, where the sheer amount of data becomes an obstacle in itself, as methodological guidelines as well as theoretical modelling of tactical decision making in team sports are lacking. The present paper discusses how big data and modern machine learning technologies may help to address these issues and aid in developing a theoretical model for tactical decision making in team sports. As experience from medical applications shows, significant organizational obstacles regarding data governance and access to technologies must be overcome first. The present work discusses these issues with respect to tactical analyses in elite soccer and proposes a technological stack which aims to introduce big data technologies into elite soccer research. The proposed approach could also serve as a guideline for other sports science domains as increasing data size is becoming a widespread phenomenon.

  12. Scoping reviews: time for clarity in definition, methods, and reporting.

    PubMed

    Colquhoun, Heather L; Levac, Danielle; O'Brien, Kelly K; Straus, Sharon; Tricco, Andrea C; Perrier, Laure; Kastner, Monika; Moher, David

    2014-12-01

    The scoping review has become increasingly popular as a form of knowledge synthesis. However, a lack of consensus on scoping review terminology, definition, methodology, and reporting limits the potential of this form of synthesis. In this article, we propose recommendations to further advance the field of scoping review methodology. We summarize current understanding of scoping review publication rates, terms, definitions, and methods. We propose three recommendations for clarity in term, definition and methodology. We recommend adopting the terms "scoping review" or "scoping study" and the use of a proposed definition. Until such time as further guidance is developed, we recommend the use of the methodological steps outlined in the Arksey and O'Malley framework and further enhanced by Levac et al. The development of reporting guidance for the conduct and reporting of scoping reviews is underway. Consistency in the proposed domains and methodologies of scoping reviews, along with the development of reporting guidance, will facilitate methodological advancement, reduce confusion, facilitate collaboration and improve knowledge translation of scoping review findings. Copyright © 2014 Elsevier Inc. All rights reserved.

  13. Interdisciplinary mixed methods research with structurally vulnerable populations: case studies of injection drug users in San Francisco.

    PubMed

    Lopez, Andrea M; Bourgois, Philippe; Wenger, Lynn D; Lorvick, Jennifer; Martinez, Alexis N; Kral, Alex H

    2013-03-01

    Research with injection drug users (IDUs) benefits from interdisciplinary theoretical and methodological innovation because drug use is illegal, socially sanctioned and often hidden. Despite the increasing visibility of interdisciplinary, mixed methods research projects with IDUs, qualitative components are often subordinated to quantitative approaches and page restrictions in top addiction journals limit detailed reports of complex data collection and analysis logistics, thus minimizing the fuller scientific potential of genuine mixed methods. We present the methodological logistics and conceptual approaches of four mixed-methods research projects that our interdisciplinary team conducted in San Francisco with IDUs over the past two decades. These projects include combinations of participant-observation ethnography, in-depth qualitative interviewing, epidemiological surveys, photo-documentation, and geographic mapping. We adapted Greene et al.'s framework for combining methods in a single research project through: data triangulation, methodological complementarity, methodological initiation, and methodological expansion. We argue that: (1) flexible and self-reflexive methodological procedures allowed us to seize strategic opportunities to document unexpected and sometimes contradictory findings as they emerged to generate new research questions, (2) iteratively mixing methods increased the scope, reliability, and generalizability of our data, and (3) interdisciplinary collaboration contributed to a scientific "value added" that allowed for more robust theoretical and practical findings about drug use and risk-taking. Copyright © 2013 Elsevier B.V. All rights reserved.

  14. Interdisciplinary Mixed Methods Research with Structurally Vulnerable Populations: Case Studies of Injection Drug Users in San Francisco

    PubMed Central

    Lopez, Andrea; Bourgois, Philippe; Wenger, Lynn; Lorvick, Jennifer; Martinez, Alexis; Kral, Alex H.

    2013-01-01

    Research with injection drug users (IDUs) benefits from interdisciplinary theoretical and methodological innovation because drug use is illegal, socially sanctioned and often hidden. Despite the increasing visibility of interdisciplinary, mixed methods research projects with IDUs, qualitative components are often subordinated to quantitative approaches and page restrictions in top addiction journals limit detailed reports of complex data collection and analysis logistics, thus minimizing the fuller scientific potential of genuine mixed methods. We present the methodological logistics and conceptual approaches of four mixed-methods research projects that our interdisciplinary team conducted in San Francisco with IDUs over the past two decades. These projects include combinations of participant-observation ethnography, in-depth qualitative interviewing, epidemiological surveys, photo-documentation, and geographic mapping. We adapted Greene et al.’s framework for combining methods in a single research project through: data triangulation, methodological complementarity, methodological initiation, and methodological expansion. We argue that: (1) flexible and self-reflexive methodological procedures allowed us to seize strategic opportunities to document unexpected and sometimes contradictory findings as they emerged to generate new research questions, (2) iteratively mixing methods increased the scope, reliability, and generalizability of our data, and (3) interdisciplinary collaboration contributed to a scientific “value added” that allowed for more robust theoretical and practical findings about drug use and risk-taking. PMID:23312109

  15. Experimental validation of ultrasonic guided modes in electrical cables by optical interferometry.

    PubMed

    Mateo, Carlos; de Espinosa, Francisco Montero; Gómez-Ullate, Yago; Talavera, Juan A

    2008-03-01

    In this work, the dispersion curves of elastic waves propagating in electrical cables and in bare copper wires are obtained theoretically and validated experimentally. The theoretical model, based on Gazis equations formulated according to the global matrix methodology, is resolved numerically. Viscoelasticity and attenuation are modeled theoretically using the Kelvin-Voigt model. Experimental tests are carried out using interferometry. There is good agreement between the simulations and the experiments despite the peculiarities of electrical cables.
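
    For readers unfamiliar with the viscoelastic model named in this record, the standard one-dimensional Kelvin-Voigt relation (not restated in the abstract) is

        \sigma(t) = E\,\varepsilon(t) + \eta\,\dot{\varepsilon}(t),
        \qquad
        E^{*}(\omega) = E + i\,\omega\,\eta,

    where E is the elastic modulus and \eta the viscosity; the imaginary part of the complex modulus carries the attenuation that is modeled theoretically in the paper.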

  16. Comparison of holographic and field theoretic complexities for time dependent thermofield double states

    NASA Astrophysics Data System (ADS)

    Yang, Run-Qiu; Niu, Chao; Zhang, Cheng-Yong; Kim, Keun-Young

    2018-02-01

    We compute the time-dependent complexity of the thermofield double states by four different proposals: two holographic proposals based on the "complexity-action" (CA) conjecture and "complexity-volume" (CV) conjecture, and two quantum field theoretic proposals based on the Fubini-Study metric (FS) and Finsler geometry (FG). We find that the four different proposals yield both similarities and differences, which will be useful to deepen our understanding of complexity and sharpen its definition. In particular, at early times the complexity linearly increases in the CV and FG proposals, linearly decreases in the FS proposal, and does not change in the CA proposal. In the late time limit, the CA, CV and FG proposals all show that the growth rate is 2E/(πℏ), saturating the Lloyd's bound, while the FS proposal shows that the growth rate is zero. It seems that the holographic CV conjecture and the field theoretic FG method are more correlated.
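
    Written out for clarity (standard form of the bound; only the quoted rate comes from this record), the late-time behaviour described above reads

        \left.\frac{dC}{dt}\right|_{t \to \infty} \le \frac{2E}{\pi\hbar},
        \qquad
        \frac{dC_{\mathrm{CA}}}{dt},\ \frac{dC_{\mathrm{CV}}}{dt},\ \frac{dC_{\mathrm{FG}}}{dt} \to \frac{2E}{\pi\hbar},
        \qquad
        \frac{dC_{\mathrm{FS}}}{dt} \to 0,

    so the CA, CV and FG growth rates saturate Lloyd's bound while the FS growth rate vanishes.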

  17. A new methodology to integrate planetary quarantine requirements into mission planning, with application to a Jupiter orbiter

    NASA Technical Reports Server (NTRS)

    Howard, R. A.; North, D. W.; Pezier, J. P.

    1975-01-01

    A new methodology is proposed for integrating planetary quarantine objectives into space exploration planning. This methodology is designed to remedy the major weaknesses inherent in the current formulation of planetary quarantine requirements. Application of the methodology is illustrated by a tutorial analysis of a proposed Jupiter Orbiter mission. The proposed methodology reformulates planetary quarantine planning as a sequential decision problem. Rather than concentrating on a nominal plan, all decision alternatives and possible consequences are laid out in a decision tree. Probabilities and values are associated with the outcomes, including the outcome of contamination. The process of allocating probabilities, which could not be made perfectly unambiguous and systematic, is replaced by decomposition and optimization techniques based on principles of dynamic programming. Thus, the new methodology provides logical integration of all available information and allows selection of the best strategy consistent with quarantine and other space exploration goals.
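
    To make the sequential-decision framing concrete, the following Python sketch performs backward induction (dynamic programming) over a toy two-option decision tree; the alternatives, probabilities and values are invented for illustration and are not taken from the Jupiter Orbiter analysis in the record.

        from dataclasses import dataclass
        from typing import List, Tuple, Union

        Node = Union["Decision", "Chance", float]

        @dataclass
        class Chance:
            branches: List[Tuple[float, "Node"]]     # (probability, subtree or terminal value)

        @dataclass
        class Decision:
            options: List[Tuple[str, "Node"]]        # (alternative label, subtree or terminal value)

        def value(node):
            """Backward induction: expectation at chance nodes, maximum at decision nodes."""
            if isinstance(node, (int, float)):
                return float(node)
            if isinstance(node, Chance):
                return sum(p * value(child) for p, child in node.branches)
            return max(value(child) for _, child in node.options)

        # Toy choice: sterilize the probe (costly, tiny contamination risk) or launch as-is.
        tree = Decision(options=[
            ("sterilize",    Chance([(0.999, 80.0), (0.001, -1000.0)])),
            ("launch as-is", Chance([(0.980, 100.0), (0.020, -1000.0)])),
        ])
        print("best expected value:", value(tree))   # 78.92 vs. 78.0, so sterilize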

  18. From individual coping strategies to illness codification: the reflection of gender in social science research on multiple chemical sensitivities (MCS).

    PubMed

    Nadeau, Geneviève; Lippel, Katherine

    2014-09-10

    Emerging fields such as environmental health have been challenged, in recent years, to answer the growing methodological calls for a finer integration of sex and gender in health-related research and policy-making. Through a descriptive examination of 25 peer-reviewed social science papers published between 1996 and 2011, we explore, by examining methodological designs and theoretical standpoints, how the social sciences have integrated gender sensitivity in empirical work on Multiple Chemical Sensitivities (MCS). MCS is a "diagnosis" associated with sensitivities to chronic and low-dose chemical exposures, which remains contested in both the medical and institutional arenas, and is reported to disproportionately affect women. We highlighted important differences between papers that did integrate a gender lens and those that did not. These included characteristics of the authorship, purposes, theoretical frameworks and methodological designs of the studies. Reviewed papers that integrated gender tended to focus on the gender roles and identity of women suffering from MCS, emphasizing personal strategies of adaptation. More generally, terminological confusions in the use of sex and gender language and concepts, such as a conflation of women and gender, were observed. Although some men were included in most of the study samples reviewed, specific data relating to men were underreported in the results, and only one paper discussed issues specifically experienced by men suffering from MCS. Papers that overlooked gender dimensions generally addressed more systemic social issues such as the dynamics of expertise and the medical codification of MCS, from more consistently outlined theoretical frameworks. Results highlight the place for a critical, systematic and reflexive problematization of gender and for the development of methodological and theoretical tools on how to integrate gender in research designs when looking at both micro and macro social dimensions of environmental health conditions. This paper contributes to a discussion on the methodological and policy implications of taking sex and gender into account appropriately in order to promote better equity in health, especially where the critical social contexts of definition and medico-legal recognition play a major role such as in the case of MCS.

  19. Paul's gospel and the rhetoric of apostolic rejection: A study of Galatians 1:15--17, 1 Corinthians 15:8, F. C. Baur, and the origins of Paul's Gentile mission

    NASA Astrophysics Data System (ADS)

    Mitchell, Matthew Wesley

    This dissertation proposes a new understanding of Paul's Gentile mission and its relationship to his so-called "conversion." This dissertation examines the origins of Paul's mission to the Gentiles, and locates it in his claims to have been personally commissioned to undertake such a mission by Jesus. Specifically, I argue that it is the rejection of Paul's claim to be an apostle, a claim founded upon his "conversion" experience, that precipitates his mission to the Gentiles. In arguing this view, I draw upon Ferdinand Christian Baur's nineteenth century theories concerning both the unreliability of Acts as a historical source, and his proposal of a clear division between Paul and the other apostles. In establishing the methodological and theoretical framework of the dissertation, I discuss the "New Perspective on Paul" that has dominated New Testament scholarship over the past thirty years. My study is also informed methodologically by the growing interest in rhetorical criticism among biblical scholars, although the emphasis of this dissertation bears more of a resemblance to the approach of the New Rhetoric than the categories of classical, Greco-Roman rhetoric. The textual component of this work falls into two stages. The first contains a full examination of Paul's "conversion passages" in Galatians 1:15--17 and 1 Corinthians 15:8, attempting to situate these seemingly unusual self-descriptions in their cultural contexts. The second involves an examination of F. C. Baur's presentation of Paul, and the reception of Baur's views among biblical scholars throughout the years following his scholarly activity. This dissertation makes two claims, each of which can stand on its own as an important contribution to scholarship. My first claim is that components of Baur's work support my proposal concerning Paul's Gentile mission and his experience of apostolic rejection, and that this proposal has much to commend it as an explanation of a perennial scholarly puzzle. My second claim is methodological, as I demonstrate that scholarly writings about Paul and his modern interpreters are themselves exercises in argumentation, and thus are not to be accepted uncritically, or without close attention to the rhetorical practices they utilize.

  20. Diffusion and decay chain of radioisotopes in stagnant water in saturated porous media.

    PubMed

    Guzmán, Juan; Alvarez-Ramirez, Jose; Escarela-Pérez, Rafael; Vargas, Raúl Alejandro

    2014-09-01

    The analysis of the diffusion of radioisotopes in stagnant water in saturated porous media is important to validate the performance of barrier systems used in radioactive repositories. In this work a methodology is developed to determine the radioisotope concentration in a two-reservoir configuration: a saturated porous medium with stagnant water is surrounded by two reservoirs. The concentrations are obtained for all the radioisotopes of the decay chain using the concept of overvalued concentration. A methodology, based on the variable separation method, is proposed for the solution of the transport equation. The novelty of the proposed methodology involves the factorization of the overvalued concentration into two factors: one that describes the diffusion without decay and another that describes the decay without diffusion. With the proposed methodology it is possible to determine the time required to obtain equal injective and diffusive concentrations in the reservoirs. In fact, this time is inversely proportional to the diffusion coefficient. In addition, the proposed methodology allows finding the time required to reach a linear and constant spatial distribution of the concentration in porous media. This time is inversely proportional to the diffusion coefficient. In order to validate the proposed methodology, the distributions of the radioisotope concentrations are compared with other experimental and numerical works. Copyright © 2014 Elsevier Ltd. All rights reserved.
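
    For the first member of the chain, the factorization described in this record corresponds to the standard substitution (stated here as a sketch, with \lambda the decay constant and D the diffusion coefficient):

        \frac{\partial C}{\partial t} = D\,\frac{\partial^{2} C}{\partial x^{2}} - \lambda C
        \quad\Longrightarrow\quad
        C(x,t) = u(x,t)\,e^{-\lambda t},
        \qquad
        \frac{\partial u}{\partial t} = D\,\frac{\partial^{2} u}{\partial x^{2}},

    so that u describes diffusion without decay and the exponential factor describes decay without diffusion; daughter isotopes carry additional ingrowth terms, which the record addresses through the overvalued-concentration concept.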

  1. Applying a contemporary grounded theory methodology.

    PubMed

    Licqurish, Sharon; Seibold, Carmel

    2011-01-01

    The aim of this paper is to discuss the application of a contemporary grounded theory methodology to a research project exploring the experiences of students studying for a degree in midwifery. Grounded theory is a qualitative research approach developed by Glaser and Strauss in the 1960s, but the methodology for this study was modelled on Clarke's (2005) approach and was underpinned by a symbolic interactionist theoretical perspective, post-structuralist theories of Michel Foucault and a constructionist epistemology. The study participants were 19 midwifery students completing their final placement. Data were collected through individual in-depth interviews and participant observation, and analysed using the grounded theory analysis techniques of coding, constant comparative analysis and theoretical sampling, as well as situational maps. The analysis focused on social action and interaction and the operation of power in the students' environment. The social process in which the students were involved, as well as the actors and discourses that affected the students' competency development, were highlighted. The methodology allowed a thorough exploration of the students' experiences of achieving competency. However, some difficulties were encountered. One of the major issues related to the understanding and application of complex sociological theories that challenged positivist notions of truth and power. Furthermore, the mapping processes were complex. Despite these minor challenges, the authors recommend applying this methodology to other similar research projects.

  2. 76 FR 39876 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-07

    ... Survey--Pretest of Proposed Questions and Methodology.'' In accordance with the Paperwork Reduction Act... Health Plan Survey-- Pretest of Proposed Questions and Methodology The Consumer Assessment of Healthcare... year to year. The CAHPS[supreg] program was designed to: Make it possible to compare survey results...

  3. 76 FR 57046 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-15

    ... Survey--Pretest of Proposed Questions and Methodology.'' In accordance with the Paperwork Reduction Act... Health Plan Survey-- Pretest of Proposed Questions and Methodology The Consumer Assessment of Healthcare... often changed from year to year. The CAHPS[reg] program was designed to: Make it possible to compare...

  4. 42 CFR 495.204 - Incentive payments to qualifying MA organizations for MA-EPs and MA-affiliated eligible hospitals.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    .... (iii) Methodological proposals must be submitted to CMS by June of the payment year and must be... the payment year. (4) CMS requires the qualifying MA organization to develop a methodological proposal... MA organization in the payment year. The methodological proposal— (i) Must be approved by CMS; and...

  5. A Proposed Theory Seeded Methodology for Design Based Research into Effective Use of MUVES in Vocational Education Contexts

    ERIC Educational Resources Information Center

    Cochrane, Todd; Davis, Niki; Morrow, Donna

    2013-01-01

    A methodology for design based research (DBR) into effective development and use of Multi-User Virtual Environments (MUVE) in vocational education is proposed. It blends software development with DBR, with two theories selected to inform the methodology. Legitimate peripheral participation (LPP; Lave & Wenger, 1991) provides a filter when…

  6. Examining Approaches to Research on Self-Regulated Learning: Conceptual and Methodological Considerations

    ERIC Educational Resources Information Center

    Karabenick, Stuart A.; Zusho, Akane

    2015-01-01

    We provide a conceptual commentary on the articles in this special issue, first by describing the unique features of each study, focusing on what we consider to be their theoretical and methodological contributions, and then by highlighting significant crosscutting themes and future directions in the study of SRL. Specifically, we define SRL to be…

  7. Prototyping a Microcomputer-Based Online Library Catalog. Occasional Papers Number 177.

    ERIC Educational Resources Information Center

    Lazinger, Susan S.; Shoval, Peretz

    This report examines and evaluates the application of prototyping methodology in the design of a microcomputer-based online library catalog. The methodology for carrying out the research involves a five-part examination of the problem on both the theoretical and applied levels, each of which is discussed in a separate section as follows: (1) a…

  8. Couple Attachment and Relationship Duration in Psychotherapy Patients: Exploring a New Methodology of Assessment

    ERIC Educational Resources Information Center

    Sochos, Antigonos

    2014-01-01

    The couple relationship is an essential source of support for individuals undergoing psychological treatment and the aim of this study was to apply a new methodology in assessing the quality of such support. A theoretically informed thematic analysis of interview transcripts was conducted, triangulated by quantitative data. Twenty-one brief…

  9. Collaborative Action Research in the Context of Developmental Work Research: A Methodological Approach for Science Teachers' Professional Development

    ERIC Educational Resources Information Center

    Piliouras, Panagiotis; Lathouris, Dimitris; Plakitsi, Katerina; Stylianou, Liana

    2015-01-01

    The paper refers to the theoretical establishment and brief presentation of collaborative action research with the characteristics of "developmental work research" as an effective methodological approach so that science teachers develop themselves professionally. A specific case study is presented, in which we aimed to transform the…

  10. An Examination of the State of Imitation Research in Children with Autism: Issues of Definition and Methodology

    ERIC Educational Resources Information Center

    Sevlever, Melina; Gillis, Jennifer M.

    2010-01-01

    Several authors have suggested that children with autism are impaired in their ability to imitate others. However, diverse methodologies, contradictory findings, and varying theoretical explanations continue to exist in the literature despite decades of research. A comprehensive account of imitation in children with autism is hampered by the lack…

  11. Theoretical Significance in Q Methodology: A Qualitative Approach to a Mixed Method

    ERIC Educational Resources Information Center

    Ramlo, Susan

    2015-01-01

    Q methodology (Q) has offered researchers a unique scientific measure of subjectivity since William Stephenson's first article in 1935. Q's focus on subjectivity includes self-referential meaning and interpretation. Q is most often identified with its technique (Q-sort) and its method (factor analysis to group people); yet, it consists of a…

  12. Methodology of Comparative Analysis of Public School Teachers' Continuing Professional Development in Great Britain, Canada and the USA

    ERIC Educational Resources Information Center

    Mukan, Nataliya; Kravets, Svitlana

    2015-01-01

    In the article the methodology of comparative analysis of public school teachers' continuing professional development (CPD) in Great Britain, Canada and the USA has been presented. The main objectives are defined as theoretical analysis of scientific and pedagogical literature, which highlights different aspects of the problem under research;…

  13. Learning Terminology in Order to Become an Active Agent in the Development of Basque Biomedical Registers

    ERIC Educational Resources Information Center

    Zabala Unzalu, Igone; San Martin Egia, Itziar; Lersundi Ayestaran, Mikel

    2016-01-01

    The aim of this article is to describe some theoretical and methodological bases underpinning the design of the course Health Communication in Basque (HCB) at the University of the Basque Country (UPV/EHU). Based on some relevant theoretical tenets of the socioterminologic and communicative approaches to Terminology, the authors assume that…

  14. Cross-Cultural Career Psychology: Comment on Fouad, Harmon, and Borgen (1997) and Tracey, Watanabe, and Schneider (1997).

    ERIC Educational Resources Information Center

    Leong, Frederick T. L.

    1997-01-01

    Uses the theoretical framework of cultural validity and cultural specificity in career psychology to comment on theoretical and methodological issues raised by two articles on cross-cultural career psychology. Discusses the distinction between etic and emic approaches to cross-cultural research and the role of cultural context in understanding…

  15. Diminishing the Divisions among Us: Reading and Writing across Difference in Theory and Method in the Sociology of Education

    ERIC Educational Resources Information Center

    Weis, Lois; Jenkins, Heather; Stich, Amy

    2009-01-01

    Evidenced in several now classic reviews of the field, much has been made of theoretical and methodological "difference" with regard to research in the sociology of education. Although such renditions often constitute important intellectual contributions, the authors suggest that it is increasingly important to read across theoretical and…

  16. A Warranted Domain Theory and Developmental Framework for a Web-Based Treatment in Support of Physician Wellness

    ERIC Educational Resources Information Center

    Donnelly, David S.

    2013-01-01

    This study employed a design-based research methodology to develop a theoretically sound approach for designing instructional treatments. The instruction of interest addressed the broad issue of physician wellness among medical school faculty, with particular emphasis on physician self-diagnosis and self-care. The theoretically sound approach…

  17. Rural Schools, Social Capital and the Big Society: A Theoretical and Empirical Exposition

    ERIC Educational Resources Information Center

    Bagley, Carl; Hillyard, Sam

    2014-01-01

    The paper commences with a theoretical exposition of the current UK government's policy commitment to the idealised notion of the Big Society and the social capital currency underpinning its formation. The paper positions this debate in relation to the rural and adopts an ethnographically-informed methodological approach to provide an in-depth…

  18. A review of modern approaches to the hydrodynamic characterisation of polydisperse macromolecular systems in biotechnology.

    PubMed

    Gillis, Richard B; Rowe, Arthur J; Adams, Gary G; Harding, Stephen E

    2014-10-01

    This short review considers the range of modern techniques for the hydrodynamic characterisation of macromolecules - particularly large glycosylated systems used in the food, biopharma and healthcare industries. The range, or polydispersity, of molecular weights and conformations presents special challenges compared to proteins. The review aims, without going into any great theoretical or methodological depth, to help the Industrial Biotechnologist choose the appropriate methodology or combination of methodologies for providing the detail he/she needs for particular applications.

  19. Bayesian outcome-based strategy classification.

    PubMed

    Lee, Michael D

    2016-03-01

    Hilbig and Moshagen (Psychonomic Bulletin & Review, 21, 1431-1443, 2014) recently developed a method for making inferences about the decision processes people use in multi-attribute forced choice tasks. Their paper makes a number of worthwhile theoretical and methodological contributions. Theoretically, they provide an insightful psychological motivation for a probabilistic extension of the widely used "weighted additive" (WADD) model, and show how this model, as well as other important models like "take-the-best" (TTB), can and should be expressed in terms of meaningful priors. Methodologically, they develop an inference approach based on Minimum Description Length (MDL) principles that balances both the goodness-of-fit and the complexity of the decision models they consider. This paper aims to preserve these useful contributions while providing a complementary Bayesian approach with some theoretical and methodological advantages. We develop a simple graphical model, implemented in JAGS, that allows for fully Bayesian inferences about which models people use to make decisions. To demonstrate the Bayesian approach, we apply it to the models and data considered by Hilbig and Moshagen (Psychonomic Bulletin & Review, 21, 1431-1443, 2014), showing how a prior predictive analysis of the models, and posterior inferences about which models people use and the parameter settings at which they use them, can contribute to our understanding of human decision making.
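
    As a hedged, much-simplified stand-in for the JAGS graphical model described here, the following Python sketch compares two deterministic strategies (TTB-like and WADD-like predictions) for one participant by integrating an execution-error rate out of each strategy's likelihood; the predictions, data and uniform error prior are illustrative assumptions, not the authors' implementation.

        import numpy as np
        from scipy import integrate

        def marginal_likelihood(agreements, n_trials):
            """Likelihood of the data under one strategy, integrating the
            execution-error rate over a Uniform(0, 0.5) prior."""
            def lik(eps):
                return (1 - eps) ** agreements * eps ** (n_trials - agreements)
            value, _ = integrate.quad(lik, 0.0, 0.5)
            return value / 0.5                       # divide by the prior width

        rng = np.random.default_rng(1)
        ttb_pred = rng.integers(0, 2, 30)            # invented strategy predictions
        wadd_pred = rng.integers(0, 2, 30)
        observed = np.where(rng.random(30) < 0.9, ttb_pred, 1 - ttb_pred)

        m_ttb = marginal_likelihood(int(np.sum(observed == ttb_pred)), 30)
        m_wadd = marginal_likelihood(int(np.sum(observed == wadd_pred)), 30)
        print("P(TTB | data) =", m_ttb / (m_ttb + m_wadd))   # equal prior model odds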

  20. Ergonomics and design: traffic sign and street name sign.

    PubMed

    Moroni, Janaina Luisa da Silva; Aymone, José Luís Farinatti

    2012-01-01

    This work proposes a design methodology using ergonomics and anthropometry concepts applied to traffic sign and street name sign projects. Initially, a literature review of cognitive ergonomics and anthropometry is performed. Several authors and their design methodologies are analyzed, the aspects to be considered in projects of traffic and street name signs are selected, and other specific aspects are proposed for the design methodology. A case study of the signs of the "Street of Antiques" in Porto Alegre city is presented. To do that, interviews with the population are conducted to evaluate the current situation of the signs. After that, a new sign proposal with virtual prototyping is developed using the proposed methodology. The results obtained with new interviews about the proposal show user satisfaction and the importance of cognitive ergonomics to the development of this type of urban furniture.

  1. The influence of capture-recapture methodology on the evolution of the North American Bird Banding Program

    USGS Publications Warehouse

    Tautin, J.; Lebreton, J.-D.; North, P.M.

    1993-01-01

    Capture-recapture methodology has advanced greatly in the last twenty years and is now a major factor driving the continuing evolution of the North American bird banding program. Bird banding studies are becoming more scientific with improved study designs and analytical procedures. Researchers and managers are gaining more reliable knowledge which in turn betters the conservation of migratory birds. The advances in capture-recapture methodology have benefited gamebird studies primarily, but nongame bird studies will benefit similarly as they expand greatly in the next decade. Further theoretical development of capture-recapture methodology should be encouraged, and, to maximize benefits of the methodology, work on practical applications should be increased.

  2. Development and evaluation of a high-performance liquid chromatography/isotope ratio mass spectrometry methodology for delta13C analyses of amino sugars in soil.

    PubMed

    Bodé, Samuel; Denef, Karolien; Boeckx, Pascal

    2009-08-30

    Amino sugars have been used as biomarkers to assess the relative contribution of dead microbial biomass of different functional groups of microorganisms to soil carbon pools. However, little is known about the dynamics of these compounds in soil. The isotopic composition of individual amino sugars can be used as a tool to determine the turnover of these compounds. Methods to determine the delta(13)C of amino sugars using gas chromatography/combustion/isotope ratio mass spectrometry (GC/C/IRMS) have been proposed in the literature. However, due to derivatization, the uncertainty in the obtained delta(13)C is too high for natural abundance studies. Therefore, a new high-performance liquid chromatography/isotope ratio mass spectrometry (HPLC/IRMS) methodology, with increased accuracy and precision, has been developed. The repeatability of the obtained delta(13)C values when pure amino sugars were analyzed was not significantly concentration-dependent as long as the injected amount was higher than 1.5 nmol. The delta(13)C value of the same amino sugar spiked to a soil deviated by only 0.3 per thousand from the theoretical value. 2009 John Wiley & Sons, Ltd.
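
    For reference, the delta notation used throughout this record is defined, by convention, relative to an international reference ratio (typically VPDB for carbon; this is not restated in the abstract):

        \delta^{13}\mathrm{C} \;=\;
        \left(
          \frac{\bigl({}^{13}\mathrm{C}/{}^{12}\mathrm{C}\bigr)_{\mathrm{sample}}}
               {\bigl({}^{13}\mathrm{C}/{}^{12}\mathrm{C}\bigr)_{\mathrm{standard}}} - 1
        \right) \times 1000 \quad [\text{per mil}].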

  3. Analysis of the methods for assessing socio-economic development level of urban areas

    NASA Astrophysics Data System (ADS)

    Popova, Olga; Bogacheva, Elena

    2017-01-01

    The present paper provides a targeted analysis of current approaches (ratings) to the assessment of the socio-economic development of urban areas. The survey focuses on identifying standardized methodologies for forming area assessment techniques that can support a system of intelligent monitoring, dispatching, building management, scheduling and effective management of an administrative-territorial unit. Such a system is characterized by a complex hierarchical structure, including tangible and intangible properties (parameters, attributes). Investigating the abovementioned methods should increase an administrative-territorial unit's attractiveness for investors and residents. The research aims at studying methods for evaluating the socio-economic development level of territories of the Russian Federation. Experimental and theoretical territory-estimating methods were identified. A complex analysis of the characteristics of the areas was carried out and evaluation parameters were determined. Integral indicators (resulting rating criteria values) as well as the overall rankings (parameters, characteristics) were analyzed. An inventory of the most widely used partial indicators (parameters, characteristics) of urban areas was compiled. The homogeneity of the resulting rating criteria values was verified and confirmed by determining the root-mean-square deviation, i.e., the divergence of the indices. The principal shortcomings of the assessment methodologies were revealed, and assessment methods with enhanced effectiveness and homogeneity were proposed.

  4. Challenges of implementing digital technology in motion picture distribution and exhibition: testing and evaluation methodology

    NASA Astrophysics Data System (ADS)

    Swartz, Charles S.

    2003-05-01

    The process of distributing and exhibiting a motion picture has changed little since the Lumière brothers presented the first motion picture to an audience in 1895. While this analog photochemical process is capable of producing screen images of great beauty and expressive power, more often the consumer experience is diminished by third generation prints and by the wear and tear of the mechanical process. Furthermore, the film industry globally spends approximately $1B annually manufacturing and shipping prints. Alternatively, distributing digital files would theoretically yield great benefits in terms of image clarity and quality, lower cost, greater security, and more flexibility in the cinema (e.g., multiple language versions). In order to understand the components of the digital cinema chain and evaluate the proposed technical solutions, the Entertainment Technology Center at USC in 2000 established the Digital Cinema Laboratory as a critical viewing environment, with the highest quality film and digital projection equipment. The presentation describes the infrastructure of the Lab, test materials, and testing methodologies developed for compression evaluation, and lessons learned up to the present. In addition to compression, the Digital Cinema Laboratory plans to evaluate other components of the digital cinema process as well.

  5. Policy discourse, people's internal frames, and declared aircraft noise annoyance: an application of Q-methodology.

    PubMed

    Kroesen, Maarten; Bröer, Christian

    2009-07-01

    Aircraft noise annoyance is studied extensively, but often without an explicit theoretical framework. In this article, a social approach for noise annoyance is proposed. The idea that aircraft noise is meaningful to people within a socially produced discourse is assumed and tested. More particularly, it is expected that the noise policy discourse influences people's assessment of aircraft noise. To this end, Q-methodology is used, which, to the best of the authors' knowledge, has not been used for aircraft noise annoyance so far. Through factor analysis five distinct frames are revealed: "Long live aviation!," "aviation: an ecological threat," "aviation and the environment: a solvable problem," "aircraft noise: not a problem," and "aviation: a local problem." It is shown that the former three frames are clearly related to the policy discourse. Based on this observation it is argued that policy making is a possible mechanism through which the sound of aircraft is turned into annoyance. In addition, it is concluded that the experience of aircraft noise and, in particular, noise annoyance is part of coherent frames of mind, which consist of mutually reinforcing positions and include non-acoustical factors.

  6. Mechanical model of suture joints with fibrous connective layer

    NASA Astrophysics Data System (ADS)

    Miroshnichenko, Kateryna; Liu, Lei; Tsukrov, Igor; Li, Yaning

    2018-02-01

    A composite model for suture joints with a connective layer of aligned fibers embedded in soft matrix is proposed. Based on the principle of complementary virtual work, composite cylinder assemblage (CCA) approach and generalized self-consistent micro-mechanical models, a hierarchical homogenization methodology is developed to systematically quantify the synergistic effects of suture morphology and fiber orientation on the overall mechanical properties of sutures. Suture joints with regular triangular wave-form serve as an example material system to apply this methodology. Both theoretical and finite element mechanical models are developed and compared to evaluate the overall normal stiffness of sutures as a function of wavy morphology of sutures, fiber orientation, fiber volume fraction, and the mechanical properties of fibers and matrix in the interfacial layer. It is found that generally due to the anisotropy-induced coupling effects between tensile and shear deformation, the effective normal stiffness of sutures is highly dependent on the fiber orientation in the connective layer. Also, the effective shear modulus of the connective layer and the stiffness ratio between the fiber and matrix significantly influence the effects of fiber orientation. In addition, optimal fiber orientations are found to maximize the stiffness of suture joints.

  7. The Power of Theory, Research Design, and Transdisciplinary Integration in Moving Psychopathology Forward.

    PubMed

    Vaidyanathan, Uma; Vrieze, Scott I; Iacono, William G

    While the past few decades have seen much work in psychopathology research that has yielded provocative insights, relatively little progress has been made in understanding the etiology of mental disorders. We contend that this is due to an overreliance on statistics and technology with insufficient attention to adequacy of experimental design, a lack of integration of data across various domains of research, and testing of theoretical models using relatively weak study designs. We provide a conceptual discussion of these issues and follow with a concrete demonstration of our proposed solution. Using two different disorders - depression and substance use - as examples, we illustrate how we can evaluate competing theories regarding their etiology by integrating information from various domains including latent variable models, neurobiology, and quasi-experimental data such as twin and adoption studies, rather than relying on any single methodology alone. More broadly, we discuss the extent to which such integrative thinking allows for inferences about the etiology of mental disorders, rather than focusing on descriptive correlates alone. Greater scientific insight will require stringent tests of competing theories and a deeper conceptual understanding of the advantages and pitfalls of methodologies and criteria we use in our studies.

  8. Critical thinking and contemporary mental health care: Michel Foucault's "history of the present".

    PubMed

    Roberts, Marc

    2017-04-01

    In order to be able to provide informed, effective and responsive mental health care and to do so in an evidence-based, collaborative and recovery-focused way with those who use mental health services, there is a recognition of the need for mental health professionals to possess sophisticated critical thinking capabilities. This article will therefore propose that such capabilities can be productively situated within the context of the work of the French philosopher Michel Foucault, one of the most challenging, innovative and influential thinkers of the 20th century. However, rather than focusing exclusively upon the content of Foucault's work, it will be suggested that it is possible to discern a general methodological approach across that work, a methodological approach that he refers to as "the history of the present." In doing so, Foucault's history of the present can be understood as a productive, albeit provisional, framework in which to orientate the purpose and process of critical thinking for mental health professionals by emphasizing the need to both historicize and politicize the theoretical perspectives and therapeutic practices that characterize contemporary mental health care. © 2016 John Wiley & Sons Ltd.

  9. Integration of low level and ontology derived features for automatic weapon recognition and identification

    NASA Astrophysics Data System (ADS)

    Sirakov, Nikolay M.; Suh, Sang; Attardo, Salvatore

    2011-06-01

    This paper presents a further step in research toward the development of a quick and accurate weapons identification methodology and system. A basic stage of this methodology is the automatic acquisition and updating of a weapons ontology as a source for deriving high-level weapons information. The present paper outlines the main ideas used to approach this goal. In the next stage, a clustering approach is suggested on the basis of a hierarchy of concepts. An inherent slot of every node of the proposed ontology is a low-level features vector (LLFV), which facilitates the search through the ontology. Part of the LLFV is information about the object's parts. To partition an object, a new approach is presented that is capable of defining the object's concavities, which are used to mark the end points of weapon parts, considered as convexities. Further, an existing matching approach is optimized to determine whether an ontological object matches the objects from an input image. Objects from derived ontological clusters are considered for the matching process. Image resizing is studied and applied to decrease the runtime of the matching approach and to investigate its rotational and scaling invariance. A set of experiments is performed to validate the theoretical concepts.

  10. A New Methodology of Spatial Cross-Correlation Analysis

    PubMed Central

    Chen, Yanguang

    2015-01-01

    Spatial correlation modeling comprises both spatial autocorrelation and spatial cross-correlation processes. The spatial autocorrelation theory has been well-developed. It is necessary to advance the method of spatial cross-correlation analysis to supplement the autocorrelation analysis. This paper presents a set of models and analytical procedures for spatial cross-correlation analysis. By analogy with Moran’s index newly expressed in a spatial quadratic form, a theoretical framework is derived for geographical cross-correlation modeling. First, two sets of spatial cross-correlation coefficients are defined, including a global spatial cross-correlation coefficient and local spatial cross-correlation coefficients. Second, a pair of scatterplots of spatial cross-correlation is proposed, and the plots can be used to visually reveal the causality behind spatial systems. Based on the global cross-correlation coefficient, Pearson’s correlation coefficient can be decomposed into two parts: direct correlation (partial correlation) and indirect correlation (spatial cross-correlation). As an example, the methodology is applied to the relationships between China’s urbanization and economic development to illustrate how to model spatial cross-correlation phenomena. This study is an introduction to developing the theory of spatial cross-correlation, and future geographical spatial analysis might benefit from these models and indexes. PMID:25993120
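
    The abstract expresses Moran's index as a spatial quadratic form and builds a global spatial cross-correlation coefficient from it. A minimal sketch of that idea is given below, under the assumption that both variables are converted to z-scores and the spatial weights are rescaled to sum to n; with y equal to x the statistic reduces to the quadratic-form expression of Moran's I. The toy weights matrix and data are hypothetical and are not those of the China case study.

      import numpy as np

      def global_cross_correlation(x, y, w):
          """Hedged sketch of a global spatial cross-correlation coefficient.

          x, y : 1-D arrays of two attributes observed over the same n spatial units.
          w    : (n, n) spatial weights matrix (e.g. contiguity), zero diagonal.
          The weights are rescaled so that they sum to n; x and y are converted to
          z-scores.  With y == x the statistic reduces to Moran's I written as a
          quadratic form.
          """
          x, y, w = (np.asarray(a, dtype=float) for a in (x, y, w))
          n = x.size
          zx = (x - x.mean()) / x.std()
          zy = (y - y.mean()) / y.std()
          w_scaled = w * (n / w.sum())
          return zx @ w_scaled @ zy / n

      # toy example: 5 units on a line, rook contiguity
      w = np.diag(np.ones(4), 1) + np.diag(np.ones(4), -1)
      x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
      y = np.array([2.0, 2.5, 3.5, 3.9, 5.2])
      print(global_cross_correlation(x, y, w))   # positive: similar spatial patterns
      print(global_cross_correlation(x, x, w))   # Moran's I of x itself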

  11. A new methodology of spatial cross-correlation analysis.

    PubMed

    Chen, Yanguang

    2015-01-01

    Spatial correlation modeling comprises both spatial autocorrelation and spatial cross-correlation processes. The spatial autocorrelation theory has been well-developed. It is necessary to advance the method of spatial cross-correlation analysis to supplement the autocorrelation analysis. This paper presents a set of models and analytical procedures for spatial cross-correlation analysis. By analogy with Moran's index newly expressed in a spatial quadratic form, a theoretical framework is derived for geographical cross-correlation modeling. First, two sets of spatial cross-correlation coefficients are defined, including a global spatial cross-correlation coefficient and local spatial cross-correlation coefficients. Second, a pair of scatterplots of spatial cross-correlation is proposed, and the plots can be used to visually reveal the causality behind spatial systems. Based on the global cross-correlation coefficient, Pearson's correlation coefficient can be decomposed into two parts: direct correlation (partial correlation) and indirect correlation (spatial cross-correlation). As an example, the methodology is applied to the relationships between China's urbanization and economic development to illustrate how to model spatial cross-correlation phenomena. This study is an introduction to developing the theory of spatial cross-correlation, and future geographical spatial analysis might benefit from these models and indexes.

  12. Degradation of ticarcillin by subcritical water oxidation method: Application of response surface methodology and artificial neural network modeling.

    PubMed

    Yabalak, Erdal

    2018-05-18

    This study was performed to investigate the mineralization of ticarcillin in an artificially prepared aqueous solution representing ticarcillin-contaminated waters, which constitute a serious problem for human health. Removal of 81.99% of total organic carbon, 79.65% of chemical oxygen demand, and 94.35% of ticarcillin was achieved using the eco-friendly, time-saving, powerful and easy-to-apply subcritical water oxidation method in the presence of a safe-to-use oxidizing agent, hydrogen peroxide. Central composite design, which belongs to response surface methodology, was applied to design the degradation experiments, to optimize the method, and to evaluate the effects of the system variables, namely temperature, hydrogen peroxide concentration, and treatment time, on the responses. In addition, theoretical equations were proposed for each removal process. ANOVA tests were utilized to evaluate the reliability of the fitted models. F values of 245.79, 88.74, and 48.22 were found for total organic carbon removal, chemical oxygen demand removal, and ticarcillin removal, respectively. Moreover, artificial neural network modeling was applied to estimate the response in each case, and its prediction and optimization performance was statistically examined and compared to that of the central composite design.
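
    The study fits a central composite design with a second-order response surface. The sketch below shows only the generic fitting step: build the quadratic model matrix (intercept, linear, square and interaction terms) for a face-centered composite design in three coded factors and solve it by least squares. The design levels and response values are made up for illustration and are not the paper's data.

      import numpy as np
      from itertools import combinations

      def quadratic_design_matrix(X):
          """Second-order model terms: intercept, linear, pure quadratic, two-way interactions."""
          X = np.asarray(X, dtype=float)
          n, k = X.shape
          cols = [np.ones(n)]
          cols += [X[:, j] for j in range(k)]
          cols += [X[:, j] ** 2 for j in range(k)]
          cols += [X[:, i] * X[:, j] for i, j in combinations(range(k), 2)]
          return np.column_stack(cols)

      # hypothetical face-centered central composite design in coded units:
      # factors = (temperature, H2O2 concentration, treatment time)
      factorial = [[i, j, k] for i in (-1, 1) for j in (-1, 1) for k in (-1, 1)]
      axial = [[-1, 0, 0], [1, 0, 0], [0, -1, 0], [0, 1, 0], [0, 0, -1], [0, 0, 1]]
      center = [[0, 0, 0], [0, 0, 0]]
      X = np.array(factorial + axial + center, dtype=float)

      # made-up responses (e.g. % TOC removal) just to exercise the fit
      y = np.array([45, 58, 52, 66, 55, 70, 62, 80, 50, 68, 54, 72, 57, 74, 69, 70], dtype=float)

      A = quadratic_design_matrix(X)
      coef, *_ = np.linalg.lstsq(A, y, rcond=None)
      y_hat = A @ coef
      r2 = 1 - ((y - y_hat) ** 2).sum() / ((y - y.mean()) ** 2).sum()
      print("coefficients:", np.round(coef, 2))
      print("R^2 =", round(r2, 3))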

  13. The joint search for gravitational wave and low energy neutrino signals from core-collapse supernovae: methodology and status report

    NASA Astrophysics Data System (ADS)

    Gromov, M. B.; Casentini, C.

    2017-09-01

    The detection of gravitational waves opens a new era in physics: it is now possible to observe the Universe in a fundamentally new way. Gravitational waves potentially permit insight into the physics of Core-Collapse Supernovae (CCSNe). However, due to significant uncertainties in the theoretical models of gravitational wave emission associated with CCSNe, benefits may come from multi-messenger observations of CCSNe. Such benefits include increased confidence in detection, an extended astrophysical reach of the detectors, and a deeper understanding of the nature of the phenomenon. Fortunately, CCSNe have a neutrino signature, confirmed by the observation of SN1987A. The gravitational and neutrino signals propagate at essentially the speed of light and without significant interaction with interstellar matter, so they should reach an observer on Earth almost simultaneously. These facts open a way to search for a correlation between the signals. However, this method is limited by the sensitivity of modern neutrino detectors, which allow CCSNe to be observed only in the Local Group of galaxies. The methodology and status of a proposed joint search for the correlated signals are presented here.

  15. Increasing organizational energy conservation behaviors: Comparing the theory of planned behavior and reasons theory for identifying specific motivational factors to target for change

    NASA Astrophysics Data System (ADS)

    Finlinson, Scott Michael

    Social scientists frequently assess factors thought to underlie behavior for the purpose of designing behavioral change interventions. Researchers commonly identify these factors by examining relationships between specific variables and the focal behaviors being investigated. Variables with the strongest relationships to the focal behavior are then assumed to be the most influential determinants of that behavior, and therefore often become the targets for change in a behavioral change intervention. In the current study, multiple methods are used to compare the effectiveness of two theoretical frameworks for identifying influential motivational factors. Assessing the relative influence of all factors and sets of factors in driving behavior should clarify which framework and methodology are the most promising for identifying effective change targets. Results indicated that each methodology adequately predicted the three focal behaviors examined. However, the reasons theory approach was superior to the TpB approach for predicting factor influence ratings. While common method variance contamination had minimal impact on the results or conclusions derived from the present study's findings, there were substantial differences in conclusions depending on the questionnaire design used to collect the data. Examples of applied uses of the present study are discussed.

  16. GeomarCD project; an educational CD-Rom about marine geophysics

    NASA Astrophysics Data System (ADS)

    Diaz, J.; Rubio, E.; Gómez, M.; Gallart, J.

    2009-04-01

    This project aims to introduce the main aspects of marine geophysics experiments to high school students. We have chosen to present the information in the form of an interactive game in which, taking into account the scientific objectives and the technological and logistic resources, the player must find the best strategy to carry out one of the three research projects proposed. Along the game, the player is introduced to the main aspects of plate tectonics theory and crustal structure, as well as to the main methodologies available (seismics, potential fields, cores). Rather than being based on theoretical aspects, largely covered by other outreach projects, this work focuses on how a realistic problem can be solved through a field experiment. The game takes place at the researcher's desk and on an oceanographic vessel such as the BIO Hesperides, and includes the choice of the research project, the design and development of the field work, and the interpretation of the results. At the end, the player must complete a questionnaire to elaborate the final report. The correct choice of the appropriate methodologies and their interpretation is necessary to succeed. CD copies in Spanish are freely available upon request.

  17. Kinetic Monte Carlo simulations for transient thermal fields: Computational methodology and application to the submicrosecond laser processes in implanted silicon.

    PubMed

    Fisicaro, G; Pelaz, L; Lopez, P; La Magna, A

    2012-09-01

    Pulsed laser irradiation of damaged solids promotes ultrafast nonequilibrium kinetics, on the submicrosecond scale, leading to microscopic modifications of the material state. Reliable theoretical predictions of this evolution can be achieved only by simulating particle interactions in the presence of large and transient gradients of the thermal field. We propose a kinetic Monte Carlo (KMC) method for the simulation of damaged systems in the extremely far-from-equilibrium conditions caused by laser irradiation. The reference systems are nonideal crystals containing point defect excesses, an order of magnitude larger than the equilibrium density, due to a preirradiation ion implantation process. The thermal and, where applicable, melting problem is solved within the phase-field methodology, and the numerical solutions for the space- and time-dependent thermal field are then dynamically coupled to the KMC code. The formalism, implementation, and related tests of our computational code are discussed in detail. As an application example we analyze the evolution of the defect system caused by P ion implantation in Si under nanosecond pulsed irradiation. The simulation results suggest significant annihilation of the implantation damage, which can be well controlled by the laser fluence.
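
    The core of the approach is a kinetic Monte Carlo solver whose event rates follow the transient temperature supplied by the thermal (phase-field) solution. As a hedged sketch of that coupling, the snippet below implements a single rejection-free (residence-time) KMC step with Arrhenius rates and a placeholder temperature history; the event list, prefactors and barriers are hypothetical and are not the paper's calibrated parameters.

      import numpy as np

      KB = 8.617e-5  # Boltzmann constant, eV/K

      def arrhenius(prefactor, barrier_eV, temperature_K):
          """Event rate from a simple Arrhenius law."""
          return prefactor * np.exp(-barrier_eV / (KB * temperature_K))

      def kmc_step(events, temperature_K, rng):
          """One rejection-free KMC step (residence-time algorithm).

          events: list of (prefactor_Hz, barrier_eV, label) tuples.
          Returns (chosen label, time increment in seconds).
          """
          rates = np.array([arrhenius(nu, eb, temperature_K) for nu, eb, _ in events])
          total = rates.sum()
          k = rng.choice(len(events), p=rates / total)   # pick event proportional to its rate
          dt = -np.log(rng.random()) / total             # exponential waiting time
          return events[k][2], dt

      # hypothetical defect events during a nanosecond laser pulse
      events = [(1e13, 0.9, "vacancy hop"),
                (1e13, 1.1, "interstitial hop"),
                (5e12, 1.3, "I-V recombination")]

      def temperature(t_s):      # placeholder for the coupled thermal-field solver
          return 300.0 + 1200.0 * np.exp(-t_s / 50e-9)

      rng = np.random.default_rng(1)
      t = 0.0
      for _ in range(5):
          label, dt = kmc_step(events, temperature(t), rng)
          t += dt
          print(f"{label:>18s}  t = {t:.3e} s")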

  18. Multi-scaling allometric analysis for urban and regional development

    NASA Astrophysics Data System (ADS)

    Chen, Yanguang

    2017-01-01

    The concept of allometric growth is based on scaling relations, and it has been applied to urban and regional analysis for a long time. However, most allometric analyses have been devoted to the single proportional relation between two elements of a geographical system; few studies focus on the allometric scaling of multiple elements. In this paper, a process of multiscaling allometric analysis is developed for studies of the spatio-temporal evolution of complex systems. By means of linear algebra and general system theory, and by analogy with the analytical hierarchy process, the concepts of allometric growth can be integrated with ideas from fractal dimension. Thus a new methodology of geo-spatial analysis and the related theoretical models emerge. Based on least squares regression and matrix operations, a simple algorithm is proposed to solve the multiscaling allometric equation. Applying the analytical method of multielement allometry to Chinese cities and regions yields satisfactory results. A conclusion is reached that multiscaling allometric analysis can be employed to make a comprehensive evaluation of the relative levels of urban and regional development and to explain spatial heterogeneity. The notion of multiscaling allometry may enrich the current theory and methodology of spatial analyses of urban and regional evolution.
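
    Each allometric relation underlying the analysis is a power law estimated by log-log least squares. The sketch below shows only this elementary pairwise step, fitting y = a*x^b on logarithmic scales; the full multiscaling procedure in the paper solves a system of such relations with matrix operations, which is not reproduced here, and the city data are hypothetical.

      import numpy as np

      def allometric_exponent(x, y):
          """Estimate the scaling exponent b in y = a * x**b by log-log least squares."""
          b, log_a = np.polyfit(np.log(x), np.log(y), 1)
          return b, np.exp(log_a)

      # hypothetical city measures: population and built-up area for 6 cities
      population = np.array([0.5, 1.2, 2.5, 4.0, 8.0, 15.0]) * 1e6
      area_km2   = np.array([80., 160., 290., 420., 730., 1200.])

      b, a = allometric_exponent(population, area_km2)
      print(f"area ~ {a:.3g} * population^{b:.2f}")   # b < 1 indicates sublinear (allometric) growth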

  19. Implementation of efficient trajectories for an ultrasonic scanner using chaotic maps

    NASA Astrophysics Data System (ADS)

    Almeda, A.; Baltazar, A.; Treesatayapun, C.; Mijarez, R.

    2012-05-01

    Typical ultrasonic methodology for nondestructive scanning evaluation uses systematic scanning paths. In many cases this approach is time inefficient and also consumes substantial energy and computational power. Here, a methodology for the scanning of defects using an ultrasonic pulse-echo technique combined with chaotic trajectory generation is proposed. It is implemented on a Cartesian-coordinate robotic system developed in our lab. To cover the entire search area, a chaotic function and a proposed mirror mapping were incorporated. To improve the detection probability, our proposed scanning methodology is complemented with a probabilistic approach to discontinuity detection. The developed methodology was found to be more efficient than traditional ones for localizing and characterizing hidden flaws.
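
    The scan positions are generated by a chaotic map rather than a raster path. Since the specific chaotic function and the mirror mapping used by the authors are not given in the abstract, the sketch below is illustrative only: two logistic maps (r = 4) drive the x and y probe coordinates, scaled to hypothetical part dimensions.

      import numpy as np

      def chaotic_trajectory(n_points, x0=0.123, y0=0.567, r=4.0,
                             width_mm=100.0, height_mm=100.0):
          """Illustrative chaotic scan path: each axis follows a logistic map
          x_{k+1} = r*x_k*(1 - x_k); with r = 4 the iterates wander over (0, 1),
          so scaling them to the part dimensions gives an aperiodic probe path."""
          xs, ys = np.empty(n_points), np.empty(n_points)
          x, y = x0, y0
          for k in range(n_points):
              x = r * x * (1.0 - x)
              y = r * y * (1.0 - y)
              xs[k], ys[k] = x * width_mm, y * height_mm
          return np.column_stack([xs, ys])

      path = chaotic_trajectory(1000)
      print(path[:5])        # first probe positions (mm) to send to the XY stage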

  20. Reducing social inequities in health through settings-related interventions -- a conceptual framework.

    PubMed

    Shareck, Martine; Frohlich, Katherine L; Poland, Blake

    2013-06-01

    The creation of supportive environments for health is a basic action principle of health promotion, and equity is a core value. A settings approach offers an opportunity to bridge these two, with its focus on the interplay between individual, environmental and social determinants of health. We conducted a scoping review of the literature on theoretical bases and practical applications of the settings approach. Interventions targeting social inequities in health through action on various settings were analyzed to establish what is done in health equity research and action as it relates to settings. Four elements emerged as central to an equity-focused settings approach: a focus on social determinants of health, addressing the needs of marginalized groups, effecting change in a setting's structure, and involving stakeholders. Each came with related challenges. To offer potential solutions to these challenges we developed a conceptual framework that integrates theoretical and methodological approaches, along with six core guiding principles, into a 'settings praxis'. Reducing social inequities in health through the creation of supportive environments requires the application of the settings approach in an innovative way. The proposed conceptual framework can serve as a guide to do so, and help develop, implement and evaluate equity-focused settings-related interventions.

  1. The use of UV, FT-IR and Raman spectra for the identification of the newest penem analogs: solutions based on mathematic procedure and the density functional theory.

    PubMed

    Cielecka-Piontek, J; Lewandowska, K; Barszcz, B; Paczkowska, M

    2013-02-15

    The application of ultraviolet, FT-IR and Raman spectra was proposed for identification studies of the newest penem analogs (doripenem, biapenem and faropenem). Identification of the newest penem analogs, based on their separation from related substances, was achieved after applying the first derivative of the direct ultraviolet spectra, which permitted the elimination of overlapping effects. A combination of experimental and theoretical studies was performed to analyze the structure and vibrational spectra (FT-IR and Raman) of doripenem, biapenem and faropenem. The calculations were conducted using density functional theory with the B3LYP hybrid functional and the 6-31G(d,p) basis set. The confirmation of the applicability of the DFT methodology for the interpretation of the vibrational IR and Raman spectra of the newest penem analogs contributed to the determination of changes of vibrations in the region of the most labile bonds. By employing the theoretical approach it was possible to eliminate the necessity of using reference standards, which - considering the instability of penem analogs - require that correction coefficients be factored in. Copyright © 2012 Elsevier B.V. All rights reserved.

  2. Kardashev's Classification at 50+: A Fine Vehicle With Room for Improvement

    NASA Astrophysics Data System (ADS)

    Ćirković, M. M.

    2015-12-01

    We review the history and status of the famous classification of extraterrestrial civilizations given by the great Russian astrophysicist Nikolai Semenovich Kardashev, roughly half a century after it has been proposed. While Kardashev's classification (or Kardashev's scale) has often been seen as oversimplified, and multiple improvements, refinements, and alternatives to it have been suggested, it is still one of the major tools for serious theoretical investigation of SETI issues. During these 50+ years, several attempts at modifying or reforming the classification have been made; we review some of them here, together with presenting some of the scenarios which present difficulties to the standard version. Recent results in both theoretical and observational SETI studies, especially the Ĝ infrared survey (2014-2015), have persuasively shown that the emphasis on detectability inherent in Kardashev's classification obtains new significance and freshness. Several new movements and conceptual frameworks, such as the Dysonian SETI, tally extremely well with these developments. So, the apparent simplicity of the classification is highly deceptive: Kardashev's work offers a wealth of still insufficiently studied methodological and epistemological ramifications and it remains, in both letter and spirit, perhaps the worthiest legacy of the SETI "founding fathers".

  3. Predicting the ultimate potential of natural gas SOFC power cycles with CO2 capture - Part A: Methodology and reference cases

    NASA Astrophysics Data System (ADS)

    Campanari, Stefano; Mastropasqua, Luca; Gazzani, Matteo; Chiesa, Paolo; Romano, Matteo C.

    2016-08-01

    Driven by the search for the highest theoretical efficiency, several studies in recent years have investigated the integration of high temperature fuel cells in natural gas fired power plants, where fuel cells are integrated with simple or modified Brayton cycles and/or with additional bottoming cycles, and CO2 can be separated via chemical or physical separation, oxy-combustion or cryogenic methods. Focusing on Solid Oxide Fuel Cells (SOFC) and following a comprehensive review and analysis of possible plant configurations, this work investigates their theoretical potential efficiency and proposes two ultra-high efficiency plant configurations based on advanced intermediate-temperature SOFCs integrated with a steam turbine or gas turbine cycle. The SOFC works under atmospheric or pressurized conditions and the resulting power plant exceeds 78% LHV efficiency without CO2 capture (as discussed in Part A of the work) and 70% LHV efficiency with substantial CO2 capture (Part B). The power plants are simulated at the 100 MW scale with a complete set of realistic assumptions about fuel cell (FC) performance, plant components and auxiliaries, presenting detailed energy and material balances together with a second law analysis.

  4. The 3R principle and the use of non-human primates in the study of neurodegenerative diseases: the case of Parkinson's disease.

    PubMed

    Vitale, Augusto; Manciocco, Arianna; Alleva, Enrico

    2009-01-01

    The aim of this paper is to offer an ethical perspective on the use of non-human primates in neurobiological studies, using Parkinson's disease (PD) as an important case study. We refer, as a theoretical framework, to the 3R principle, originally proposed by Russell and Burch [Russell, W.M.S., Burch, R.L., 1959. The Principles of Humane Experimental Technique. Universities Federation for Animal Welfare, Wheathampstead, England (reprinted in 1992)]. The use of non-human primates in the study of PD is then discussed in relation to the concepts of Replacement, Reduction, and Refinement. Replacement and Reduction prove to be the more problematic concepts to apply, whereas Refinement offers relatively more opportunities for improvement. However, although in some cases the 3R principle shows its applicative limits, its value as a conceptual and inspirational tool remains extremely high. It suggests to researchers a series of questions, both theoretical and methodological, which can have the result of improving the quality of life of the experimental models, the quality of the scientific data, and the public perception within the non-scientist community.

  5. Assessment of environmental enteropathy in the MAL-ED cohort study: theoretical and analytic framework.

    PubMed

    Kosek, Margaret; Guerrant, Richard L; Kang, Gagandeep; Bhutta, Zulfiqar; Yori, Pablo Peñataro; Gratz, Jean; Gottlieb, Michael; Lang, Dennis; Lee, Gwenyth; Haque, Rashidul; Mason, Carl J; Ahmed, Tahmeed; Lima, Aldo; Petri, William A; Houpt, Eric; Olortegui, Maribel Paredes; Seidman, Jessica C; Mduma, Estomih; Samie, Amidou; Babji, Sudhir

    2014-11-01

    Individuals in the developing world live in conditions of intense exposure to enteric pathogens due to suboptimal water and sanitation. These environmental conditions lead to alterations in intestinal structure, function, and local and systemic immune activation that are collectively referred to as environmental enteropathy (EE). This condition, although poorly defined, is likely to be exacerbated by undernutrition as well as being responsible for permanent growth deficits acquired in early childhood, vaccine failure, and loss of human potential. This article addresses the underlying theoretical and analytical frameworks informing the methodology proposed by the Etiology, Risk Factors and Interactions of Enteric Infections and Malnutrition and the Consequences for Child Health and Development (MAL-ED) cohort study to define and quantify the burden of disease caused by EE within a multisite cohort. Additionally, we will discuss efforts to improve, standardize, and harmonize laboratory practices within the MAL-ED Network. These efforts will address current limitations in the understanding of EE and its burden on children in the developing world. © The Author 2014. Published by Oxford University Press on behalf of the Infectious Diseases Society of America. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  6. A predictive machine learning approach for microstructure optimization and materials design

    NASA Astrophysics Data System (ADS)

    Liu, Ruoqian; Kumar, Abhishek; Chen, Zhengzhang; Agrawal, Ankit; Sundararaghavan, Veera; Choudhary, Alok

    2015-06-01

    This paper addresses an important materials engineering question: How can one identify the complete space (or as much of it as possible) of microstructures that are theoretically predicted to yield the desired combination of properties demanded by a selected application? We present a problem involving the design of magnetoelastic Fe-Ga alloy microstructures for enhanced elastic, plastic and magnetostrictive properties. While theoretical models for computing properties given the microstructure are known for this alloy, inversion of these relationships to obtain microstructures that lead to desired properties is challenging, primarily due to the high dimensionality of the microstructure space, the multi-objective design requirement and the non-uniqueness of solutions. These challenges render traditional search-based optimization methods inadequate in terms of both search efficiency and result optimality. In this paper, a route to address these challenges using a machine learning methodology is proposed. A systematic framework consisting of random data generation, feature selection and classification algorithms is developed. Experiments with five design problems that involve identification of microstructures satisfying both linear and nonlinear property constraints show that our framework outperforms traditional optimization methods, with the average running time reduced by as much as 80% and with optimality that would not be achieved otherwise.
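
    The framework combines random data generation, feature selection and classification to screen microstructure candidates. A minimal sketch of that pipeline is given below using scikit-learn; the descriptor space, the stand-in property model and all thresholds are hypothetical and do not reproduce the Fe-Ga property models of the paper.

      import numpy as np
      from sklearn.feature_selection import SelectKBest, f_classif
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(0)

      # hypothetical microstructure descriptors (e.g. texture coefficients), 20 per sample
      X = rng.uniform(-1.0, 1.0, size=(2000, 20))

      def meets_property_targets(x):
          """Stand-in for the forward property models: returns 1 if the (made-up)
          stiffness and magnetostriction surrogates both exceed their thresholds."""
          stiffness = x[0] + 0.5 * x[1] ** 2
          magnetostriction = np.sin(x[2]) - 0.3 * x[3]
          return int(stiffness > 0.4 and magnetostriction > 0.2)

      y = np.array([meets_property_targets(x) for x in X])

      model = make_pipeline(SelectKBest(f_classif, k=8),
                            RandomForestClassifier(n_estimators=200, random_state=0))
      model.fit(X, y)

      # screen a fresh batch of random candidates and keep the promising ones
      candidates = rng.uniform(-1.0, 1.0, size=(10000, 20))
      promising = candidates[model.predict(candidates) == 1]
      print(f"{len(promising)} of 10000 candidates predicted to satisfy the targets")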

  7. A novel approach to study effects of asymmetric stiffness on parametric instabilities of multi-rotor-system

    NASA Astrophysics Data System (ADS)

    Jain, Anuj Kumar; Rastogi, Vikas; Agrawal, Atul Kumar

    2018-01-01

    The main focus of this paper is to study the effects of asymmetric stiffness on parametric instabilities of a multi-rotor system through an extended Lagrangian formalism, where symmetries are broken in terms of the rotor stiffness. Complete insight into the dynamic behaviour of the multi-rotor system with asymmetries is obtained through the extension of the Lagrangian equation, illustrated with a case study. In this work, a dynamic mathematical model of a multi-rotor system is developed through a novel approach extending Lagrangian mechanics, where the system has asymmetries due to varying stiffness. The amplitude and the natural frequency of the rotor are obtained analytically through the proposed methodology. The bond graph modeling technique is used for modeling the asymmetric rotor. Symbol-shakti® software is used for the simulation of the model. The effects of the stiffness of the multi-rotor system on amplitudes and frequencies are studied using numerical simulation. Simulation results show considerable agreement with the theoretical results obtained through the extended Lagrangian formalism. It is further shown that the amplitude of the rotor increases inversely with the stiffness of the rotor up to a certain limit, which is also affirmed theoretically.

  8. The basic traumatic situation in the analytical relationship.

    PubMed

    Hartke, Raul

    2005-04-01

    The author attempts to develop a concept of psychic trauma which would comply with the nucleus of this Freudian notion, that is, an excess of excitations that cannot be processed by the mental apparatus, but which would also consider the functions and the crucial role of objects in the constitution of the psychism and in traumatic conditions, as well as taking into account the methodological positioning according to which the analytical relationship is the sole possible locus of observation, inference and intervention by the psychoanalyst. He considers as a basic or minimal traumatic psychoanalytical situation that in which a magnitude or quality of emotions exceeds the capacity for containment of the psychoanalytical pair, to the point of generating a period or area of dementalisation in the psyche of one or both of the participants, of requiring analytical work on the matter and promoting a significant positive or negative change in the relationship. Availing himself of Bion's theory about the alpha function and the metapsychological conceptions of Freud and Green concerning psychic representations, he presents two theoretical formulations relating to this traumatic situation, utilising them according to the 'altered focus' model proposed by Bion. He presents three clinical examples to illustrate the concept and the relevant theoretical formulations.

  9. Age-related differences in associative memory: Empirical evidence and theoretical perspectives.

    PubMed

    Naveh-Benjamin, Moshe; Mayr, Ulrich

    2018-02-01

    Systematic research and anecdotal evidence both indicate declines in episodic memory in older adults in good health without dementia-related disorders. Several hypotheses have been proposed to explain these age-related changes in episodic memory, some of which attribute such declines to a deterioration in associative memory. The current special issue of Psychology and Aging on Age-Related Differences in Associative Memory includes 16 articles by top researchers in the area of memory and aging. Their contributions provide a wealth of empirical work that addresses different aspects of aging and associative memory, including different mediators and predictors of age-related declines in binding and associative memory, cognitive, noncognitive, genetic, and neuro-related ones. The contributions also address the processing phases where these declines manifest themselves and look at ways to ameliorate these age-related declines. Furthermore, the contributions in this issue draw on different theoretical perspectives to explain age-related changes in associative memory and provide a wealth of varying methodologies to assess older and younger adults' performance. Finally, although most of the studies focus on normative/healthy aging, some of them contain insights that are potentially applicable to disorders and pathologies. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  10. Discriminant content validity: a quantitative methodology for assessing content of theory-based measures, with illustrative applications.

    PubMed

    Johnston, Marie; Dixon, Diane; Hart, Jo; Glidewell, Liz; Schröder, Carin; Pollard, Beth

    2014-05-01

    In studies involving theoretical constructs, it is important that measures have good content validity and that there is not contamination of measures by content from other constructs. While reliability and construct validity are routinely reported, to date, there has not been a satisfactory, transparent, and systematic method of assessing and reporting content validity. In this paper, we describe a methodology of discriminant content validity (DCV) and illustrate its application in three studies. Discriminant content validity involves six steps: construct definition, item selection, judge identification, judgement format, single-sample test of content validity, and assessment of discriminant items. In three studies, these steps were applied to a measure of illness perceptions (IPQ-R) and control cognitions. The IPQ-R performed well with most items being purely related to their target construct, although timeline and consequences had small problems. By contrast, the study of control cognitions identified problems in measuring constructs independently. In the final study, direct estimation response formats for theory of planned behaviour constructs were found to have as good DCV as Likert format. The DCV method allowed quantitative assessment of each item and can therefore inform the content validity of the measures assessed. The methods can be applied to assess content validity before or after collecting data to select the appropriate items to measure theoretical constructs. Further, the data reported for each item in Appendix S1 can be used in item or measure selection. Statement of contribution What is already known on this subject? There are agreed methods of assessing and reporting construct validity of measures of theoretical constructs, but not their content validity. Content validity is rarely reported in a systematic and transparent manner. What does this study add? The paper proposes discriminant content validity (DCV), a systematic and transparent method of assessing and reporting whether items assess the intended theoretical construct and only that construct. In three studies, DCV was applied to measures of illness perceptions, control cognitions, and theory of planned behaviour response formats. Appendix S1 gives content validity indices for each item of each questionnaire investigated. Discriminant content validity is ideally applied while the measure is being developed, before using to measure the construct(s), but can also be applied after using a measure. © 2014 The British Psychological Society.

  11. The methodology for modeling queuing systems using Petri nets

    NASA Astrophysics Data System (ADS)

    Kotyrba, Martin; Gaj, Jakub; Tvarůžka, Matouš

    2017-07-01

    This paper deals with the use of Petri nets in the modeling and simulation of queuing systems. The first part is focused on explaining the basic concepts and properties of Petri nets and queuing systems. The proposed methodology for modeling queuing systems using Petri nets is described in the practical part, where it is tested on specific cases.
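
    A queuing system can be executed directly as a Petri net token game. As a hedged illustration (not the specific nets of the paper), the sketch below encodes a single-server queue as places and transitions and fires enabled transitions until the marking is dead.

      import random

      # places and initial marking of a tiny queuing net:
      # customers wait in "queue", a single server alternates idle/busy
      marking = {"queue": 3, "server_idle": 1, "server_busy": 0, "served": 0}

      # transitions: name -> (input places with arc weights, output places with arc weights)
      transitions = {
          "start_service": ({"queue": 1, "server_idle": 1}, {"server_busy": 1}),
          "end_service":   ({"server_busy": 1},             {"server_idle": 1, "served": 1}),
      }

      def enabled(t):
          inputs, _ = transitions[t]
          return all(marking[p] >= n for p, n in inputs.items())

      def fire(t):
          inputs, outputs = transitions[t]
          for p, n in inputs.items():
              marking[p] -= n
          for p, n in outputs.items():
              marking[p] = marking.get(p, 0) + n

      random.seed(0)
      while any(enabled(t) for t in transitions):
          t = random.choice([t for t in transitions if enabled(t)])
          fire(t)
          print(f"fired {t:13s} -> {marking}")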

  12. Calibration of CORSIM models under saturated traffic flow conditions.

    DOT National Transportation Integrated Search

    2013-09-01

    This study proposes a methodology to calibrate microscopic traffic flow simulation models. The proposed methodology has the capability to calibrate simultaneously all the calibration parameters as well as demand patterns for any network topology ...

  13. Free radical propulsion concept

    NASA Technical Reports Server (NTRS)

    Hawkins, C. E.; Nakanishi, S.

    1981-01-01

    The concept of a free radical propulsion system, utilizing the recombination energy of dissociated low-molecular-weight gases to produce thrust, is analyzed. The system, operating with hydrogen at a theoretical specific impulse as high as 2200 seconds at a high thrust-to-power ratio, is hypothesized to bridge the gap between chemical and electrostatic propulsion capabilities. A comparative methodology is outlined by which the characteristics of chemical and electric propulsion for orbit-raising missions can be investigated. It is noted that free radical rockets proposed previously met with difficulty and complexity in terms of storage requirements; the present study proposes to eliminate the storage requirements by using electric energy to achieve a continuous flow of free radicals which are recombined to produce a high-velocity propellant. Microwave energy used to dissociate a continuously flowing gas is transferred to the propellant via three-body recombination for conversion to propellant kinetic energy. In preliminary experiments, microwave plasma discharge efficiency was found to be in excess of 90 percent over a broad range of pressures, and microwave heating, compared to electrothermal heating, yielded much higher temperatures in the gasdynamic equations.

  14. Fractional time-dependent apparent viscosity model for semisolid foodstuffs

    NASA Astrophysics Data System (ADS)

    Yang, Xu; Chen, Wen; Sun, HongGuang

    2017-10-01

    The difficulty in describing thixotropic behaviors in semisolid foodstuffs is the time-dependent nature of the apparent viscosity under constant shear rate. In this study, we propose a novel theoretical model based on the fractional derivative to address this demand from industry. The present model adopts the critical parameter of the fractional derivative order α to describe the corresponding time-dependent thixotropic behavior. More interestingly, the parameter α provides a quantitative means of discriminating foodstuffs. Through the re-examination of three groups of experimental data (tehineh, balangu, and natillas), the proposed methodology is validated and shows good applicability and efficiency. The results show that the present fractional apparent viscosity model performs successfully for the tested foodstuffs in the shear rate range of 50-150 s^-1. The fractional order α decreases with increasing temperature at low temperatures (below 50 °C) but increases with growing shear rate, while the ideal initial viscosity k decreases with increasing temperature, shear rate, and ingredient content.

  15. Study of spectroscopic properties of nanosized particles of core-shell morphology

    NASA Astrophysics Data System (ADS)

    Bzhalava, T. N.; Kervalishvili, P. J.

    2018-03-01

    A method for studying the spectroscopic properties of nanosized particles and estimating the resonance wavelength range, for the purpose of determining specific and unique “spectral” signatures for sensing and identification of nanobioparticles and viruses, is proposed. The goals of the proposed methodology are the elaboration of relevant models of viruses and the estimation of the spectral response to the interaction of an electromagnetic (EM) field with a viral nanoparticle. A core-shell physical model is used as a first approximation of the shape and structure of a virion. The theoretical solution of EM wave scattering by a single spherical virus-like particle (VLP) is applied to determine the EM fields in the core, shell, and surrounding medium of the VLP, as well as the scattering and absorption characteristics. Numerical results obtained by computer simulation for the estimation of the EM “spectra” of bacteriophage T7 demonstrate a strong dependence of the spectroscopic characteristics on the core-shell-related electric and geometric parameters of the VLP in the resonance wavelength range. The expected spectral response is observable in far-field characterizations. The obtained analytical EM field expressions and modelling technique, in complement with experimental spectroscopic methods, should provide the virus spectral signatures that are important in bioparticle characterization.

  16. Methodology for modeling the microbial contamination of air filters.

    PubMed

    Joe, Yun Haeng; Yoon, Ki Young; Hwang, Jungho

    2014-01-01

    In this paper, we propose a theoretical model to simulate microbial growth on contaminated air filters and entrainment of bioaerosols from the filters to an indoor environment. Air filter filtration and antimicrobial efficiencies, and effects of dust particles on these efficiencies, were evaluated. The number of bioaerosols downstream of the filter could be characterized according to three phases: initial, transitional, and stationary. In the initial phase, the number was determined by filtration efficiency, the concentration of dust particles entering the filter, and the flow rate. During the transitional phase, the number of bioaerosols gradually increased up to the stationary phase, at which point no further increase was observed. The antimicrobial efficiency and flow rate were the dominant parameters affecting the number of bioaerosols downstream of the filter in the transitional and stationary phase, respectively. It was found that the nutrient fraction of dust particles entering the filter caused a significant change in the number of bioaerosols in both the transitional and stationary phases. The proposed model would be a solution for predicting the air filter life cycle in terms of microbiological activity by simulating the microbial contamination of the filter.

  17. Shunt resistance and saturation current determination in CdTe and CIGS solar cells. Part 1: a new theoretical procedure and comparison with other methodologies

    NASA Astrophysics Data System (ADS)

    Rangel-Kuoppa, Victor-Tapio; Albor-Aguilera, María-de-Lourdes; Hérnandez-Vásquez, César; Flores-Márquez, José-Manuel; González-Trujillo, Miguel-Ángel; Contreras-Puente, Gerardo-Silverio

    2018-04-01

    A new proposal for the extraction of the shunt resistance (R_sh) and saturation current (I_sat) from a current-voltage (I-V) measurement of a solar cell, within the one-diode model, is given. First, the Cheung method is extended to obtain the series resistance (R_s), the ideality factor (n), and an upper limit for I_sat. In this article, which is Part 1 of two parts, two procedures are proposed to obtain fitting values for R_sh and I_sat within some voltage range. These two procedures are applied to two simulated I-V curves (one in darkness and the other under illumination) to recover the known solar cell parameters R_sh, R_s, n, I_sat, and the light current I_lig, and to test the accuracy of the method. The method is compared with two common parameter extraction methods. These three procedures are used and compared in Part 2 on the I-V curves of CdS-CdTe and CIGS-CdS solar cells.
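
    The starting point is the Cheung relation for the one-diode model: with the shunt current neglected, I = I_sat*[exp(q(V - I*R_s)/(n*kT)) - 1] gives dV/d(ln I) ≈ n*kT/q + R_s*I, so a straight-line fit of dV/d(ln I) against I yields R_s from the slope and n from the intercept. The sketch below applies this standard relation to a synthetic dark I-V curve; it does not reproduce the article's extension to R_sh and I_sat.

      import numpy as np

      Q_OVER_KT = 1.0 / 0.02585   # q/kT at about 300 K, in 1/V

      def cheung_rs_n(voltage, current):
          """Series resistance and ideality factor from the Cheung relation
          dV/d(ln I) = n*kT/q + Rs*I (dark forward branch, shunt current neglected)."""
          v, i = np.asarray(voltage, float), np.asarray(current, float)
          dv_dlni = np.gradient(v, np.log(i))
          rs, nkt_q = np.polyfit(i, dv_dlni, 1)     # slope = Rs, intercept = n*kT/q
          return rs, nkt_q * Q_OVER_KT

      # synthetic dark I-V curve with known Rs = 2 ohm, n = 1.5, Isat = 1e-9 A
      rs_true, n_true, isat = 2.0, 1.5, 1e-9
      i = np.logspace(-6, -2, 200)
      v = n_true / Q_OVER_KT * np.log(i / isat + 1.0) + rs_true * i

      rs_est, n_est = cheung_rs_n(v, i)
      print(f"Rs ~ {rs_est:.2f} ohm, n ~ {n_est:.2f}")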

  18. Full-band quantum simulation of electron devices with the pseudopotential method: Theory, implementation, and applications

    NASA Astrophysics Data System (ADS)

    Pala, M. G.; Esseni, D.

    2018-03-01

    This paper presents the theory, implementation, and application of a quantum transport modeling approach based on the nonequilibrium Green's function formalism and a full-band empirical pseudopotential Hamiltonian. We here propose to employ a hybrid real-space/plane-wave basis that results in a significant reduction of the computational complexity compared to a full plane-wave basis. To this purpose, we provide a theoretical formulation in the hybrid basis of the quantum confinement, the self-energies of the leads, and the coupling between the device and the leads. After discussing the theory and the implementation of the new simulation methodology, we report results for complete, self-consistent simulations of different electron devices, including a silicon Esaki diode, a thin-body silicon field effect transistor (FET), and a germanium tunnel FET. The simulated transistors have technologically relevant geometrical features with a semiconductor film thickness of about 4 nm and a channel length ranging from 10 to 17 nm. We believe that the newly proposed formalism may find applications also in transport models based on ab initio Hamiltonians, as those employed in density functional theory methods.

  19. Phylogenetic Analysis Shows That Neolithic Slate Plaques from the Southwestern Iberian Peninsula Are Not Genealogical Recording Systems

    PubMed Central

    García Rivero, Daniel; O'Brien, Michael J.

    2014-01-01

    Prehistoric material culture proposed to be symbolic in nature has been the object of considerable archaeological work from diverse theoretical perspectives, yet rarely are methodological tools used to test the interpretations. The lack of testing is often justified by invoking the opinion that the slippery nature of past human symbolism cannot easily be tackled by the scientific method. One such case, from the southwestern Iberian Peninsula, involves engraved stone plaques from megalithic funerary monuments dating ca. 3,500–2,750 B.C. (calibrated age). One widely accepted proposal is that the plaques are ancient mnemonic devices that record genealogies. The analysis reported here demonstrates that this is not the case, even when the most supportive data and techniques are used. Rather, we suspect there was a common ideological background to the use of plaques that overlay the southwestern Iberian Peninsula, with little or no geographic patterning. This would entail a cultural system in which plaque design was based on a fundamental core idea, with a number of mutable and variable elements surrounding it. PMID:24558384

  20. On Enactivism and Language: Towards a Methodology for Studying Talk in Mathematics Classrooms

    ERIC Educational Resources Information Center

    Coles, Alf

    2015-01-01

    This article is an early step in the development of a methodological approach to the study of language deriving from an enactivist theoretical stance. Language is seen as a co-ordination of co-ordinations of action. Meaning and intention cannot easily be interpreted from the actions and words of others; instead, careful attention can be placed in…
