Sample records for empirically based theoretical

  1. Generalized Constitutive-Based Theoretical and Empirical Models for Hot Working Behavior of Functionally Graded Steels

    NASA Astrophysics Data System (ADS)

    Vanini, Seyed Ali Sadough; Abolghasemzadeh, Mohammad; Assadi, Abbas

    2013-07-01

    Functionally graded steels with graded ferritic and austenitic regions including bainite and martensite intermediate layers produced by electroslag remelting have attracted much attention in recent years. In this article, an empirical model based on the Zener-Hollomon (Z-H) constitutive equation with generalized material constants is presented to investigate the effects of temperature and strain rate on the hot working behavior of functionally graded steels. Next, a theoretical model, generalized by strain compensation, is developed for the flow stress estimation of functionally graded steels under hot compression based on the phase mixture rule and boundary layer characteristics. The model is used for different strains and grading configurations. Specifically, the results for αβγMγ steels from the empirical and theoretical models showed excellent agreement, within acceptable error, with the experimental results of other references.
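As background on the kind of constitutive relation involved, the Zener-Hollomon parameter folds strain rate and temperature into a single variable, Z = ε̇·exp(Q/(RT)), which is then mapped to flow stress, commonly through a sine-hyperbolic (Sellars-Tegart) law. A minimal sketch; the constants Q, A, n, and α below are illustrative placeholders, not the paper's fitted values:

```python
import math

def zener_hollomon(strain_rate, temp_K, Q=300e3, R=8.314):
    """Zener-Hollomon parameter Z = strain_rate * exp(Q / (R*T)).
    Q is an activation energy in J/mol (hypothetical value here)."""
    return strain_rate * math.exp(Q / (R * temp_K))

def flow_stress(Z, A=1e12, n=5.0, alpha=0.01):
    """Sine-hyperbolic (Sellars-Tegart) law:
    sigma = (1/alpha) * asinh((Z/A)**(1/n)).
    A, n, alpha are material constants (hypothetical values here)."""
    return (1.0 / alpha) * math.asinh((Z / A) ** (1.0 / n))

# At a fixed strain rate, a higher temperature lowers Z and hence
# the predicted flow stress -- the qualitative trend in hot working.
Z_hot  = zener_hollomon(1.0, 1373.0)   # ~1100 deg C
Z_cold = zener_hollomon(1.0, 1173.0)   # ~900 deg C
assert Z_cold > Z_hot
assert flow_stress(Z_cold) > flow_stress(Z_hot)
```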

  2. Designing Educative Curriculum Materials: A Theoretically and Empirically Driven Process

    ERIC Educational Resources Information Center

    Davis, Elizabeth A.; Palincsar, Annemarie Sullivan; Arias, Anna Maria; Bismack, Amber Schultz; Marulis, Loren M.; Iwashyna, Stefanie K.

    2014-01-01

    In this article, the authors argue for a design process in the development of educative curriculum materials that is theoretically and empirically driven. Using a design-based research approach, they describe their design process for incorporating educative features intended to promote teacher learning into existing, high-quality curriculum…

  3. Converging Paradigms: A Reflection on Parallel Theoretical Developments in Psychoanalytic Metapsychology and Empirical Dream Research.

    PubMed

    Schmelowszky, Ágoston

    2016-08-01

    In recent decades, a striking parallelism can be perceived between the shifting perspective of leading representatives of empirical dream research concerning their conceptualization of dreaming and the paradigm shift within clinically based psychoanalytic metapsychology with respect to its theory on the significance of dreaming. In metapsychology, dreaming has increasingly become a central metaphor of mental functioning in general. The theories of Klein, Bion, and Matte-Blanco can be considered milestones of this paradigm shift. In empirical dream research, the competing theories of Hobson and of Solms argued, respectively, for and against the meaningfulness of the dream-work in the functioning of the mind. In the meantime, empirical data from various sources seemed to prove the significance of dream consciousness for the development and maintenance of adaptive waking consciousness. Metapsychological speculations and hypotheses based on empirical research data seem to point in the same direction, promising contemporary psychoanalytic practice a more secure theoretical base. In this paper, the author brings together these diverse theoretical developments and presents conclusions regarding psychoanalytic theory and technique, as well as proposing an outline of an empirical research plan for testing the specificity of psychoanalysis in developing dream formation.

  4. Outcome (competency) based education: an exploration of its origins, theoretical basis, and empirical evidence.

    PubMed

    Morcke, Anne Mette; Dornan, Tim; Eika, Berit

    2013-10-01

    Outcome based or competency based education (OBE) is so firmly established in undergraduate medical education that it might not seem necessary to ask why it was included in recommendations for the future, like the Flexner centenary report. Uncritical acceptance may not, however, deliver its greatest benefits. Our aim was to explore the underpinnings of OBE: its historical origins, theoretical basis, and empirical evidence of its effects in order to answer the question: How can predetermined learning outcomes influence undergraduate medical education? This literature review had three components: A review of historical landmarks in the evolution of OBE; a review of conceptual frameworks and theories; and a systematic review of empirical publications from 1999 to 2010 that reported data concerning the effects of learning outcomes on undergraduate medical education. OBE had its origins in behaviourist theories of learning. It is tightly linked to the assessment and regulation of proficiency, but less clearly linked to teaching and learning activities. Over time, there have been cycles of advocacy for, then criticism of, OBE. A recurring critique concerns the place of complex personal and professional attributes as "competencies". OBE has been adopted by consensus in the face of weak empirical evidence. OBE, which has been advocated for over 50 years, can contribute usefully to defining requisite knowledge and skills, and blueprinting assessments. Its applicability to more complex aspects of clinical performance is not clear. OBE, we conclude, provides a valuable approach to some, but not all, important aspects of undergraduate medical education.

  5. Computer-generated tailored feedback letters for smoking cessation: theoretical and empirical variability of tailoring.

    PubMed

    Schumann, Anja; John, Ulrich; Ulbricht, Sabina; Rüge, Jeannette; Bischof, Gallus; Meyer, Christian

    2008-11-01

    This study examines tailored feedback letters of a smoking cessation intervention that is conceptually based on the transtheoretical model, from a content-based perspective. Data from 2 population-based intervention studies, both randomized controlled trials, with a total N = 1044, were used. The procedure of the intervention, the tailoring principle for the feedback letters, and the content of the intervention materials are described in detail. Theoretical and empirical frequencies of unique feedback letters are presented. The intervention system was able to generate a total of 1040 unique letters with normative feedback only, and almost half a million unique letters with normative and ipsative feedback. Almost every single smoker in contemplation, preparation, action, and maintenance had an empirically unique combination of tailoring variables and received a unique letter. In contrast, many smokers in precontemplation shared a combination of tailoring variables and received identical letters. The transtheoretical model provides enormous theoretical and empirical variability of tailoring. However, tailoring for a major subgroup of smokers, i.e., those who do not intend to quit, needs improvement. Conceptual ideas for additional tailoring variables are discussed.

  6. Theoretical vs. empirical discriminability: the application of ROC methods to eyewitness identification.

    PubMed

    Wixted, John T; Mickes, Laura

    2018-01-01

    Receiver operating characteristic (ROC) analysis was introduced to the field of eyewitness identification 5 years ago. Since that time, it has been both influential and controversial, and the debate has raised an issue about measuring discriminability that is rarely considered. The issue concerns the distinction between empirical discriminability (measured by area under the ROC curve) vs. underlying/theoretical discriminability (measured by d' or variants of it). Under most circumstances, the two measures will agree about a difference between two conditions in terms of discriminability. However, it is possible for them to disagree, and that fact can lead to confusion about which condition actually yields higher discriminability. For example, if the two conditions have implications for real-world practice (e.g., a comparison of competing lineup formats), should a policymaker rely on the area-under-the-curve measure or the theory-based measure? Here, we illustrate the fact that a given empirical ROC yields as many underlying discriminability measures as there are theories that one is willing to take seriously. No matter which theory is correct, for practical purposes, the singular area-under-the-curve measure best identifies the diagnostically superior procedure. For that reason, area under the ROC curve informs policy in a way that underlying theoretical discriminability never can. At the same time, theoretical measures of discriminability are equally important, but for a different reason. Without an adequate theoretical understanding of the relevant task, the field will be in no position to enhance empirical discriminability.
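The distinction the abstract draws can be made concrete: empirical discriminability is the area under the observed ROC, while theoretical discriminability is a model parameter such as d'. Under the equal-variance Gaussian signal-detection model the two are linked by AUC = Φ(d'/√2). A small illustrative sketch, not the authors' analysis code:

```python
import math
from statistics import NormalDist

def auc_from_dprime(d):
    """Equal-variance Gaussian signal-detection model: the theoretical
    discriminability d' implies an ROC area AUC = Phi(d' / sqrt(2))."""
    return NormalDist().cdf(d / math.sqrt(2))

def empirical_auc(hit_rates, fa_rates):
    """Trapezoidal area under an empirical ROC traced by
    (false-alarm, hit) operating points, anchored at (0,0) and (1,1)."""
    xs = [0.0] + list(fa_rates) + [1.0]
    ys = [0.0] + list(hit_rates) + [1.0]
    return sum((xs[i + 1] - xs[i]) * (ys[i + 1] + ys[i]) / 2
               for i in range(len(xs) - 1))

assert abs(auc_from_dprime(0.0) - 0.5) < 1e-12   # no discriminability
assert auc_from_dprime(2.0) > auc_from_dprime(1.0)
# An ROC point on the chance line yields area 0.5:
assert abs(empirical_auc([0.5], [0.5]) - 0.5) < 1e-12
```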

  7. Color and psychological functioning: a review of theoretical and empirical work

    PubMed Central

    Elliot, Andrew J.

    2015-01-01

    In the past decade there has been increased interest in research on color and psychological functioning. Important advances have been made in theoretical work and empirical work, but there are also important weaknesses in both areas that must be addressed for the literature to continue to develop apace. In this article, I provide brief theoretical and empirical reviews of research in this area, in each instance beginning with a historical background and recent advancements, and proceeding to an evaluation focused on weaknesses that provide guidelines for future research. I conclude by reiterating that the literature on color and psychological functioning is at a nascent stage of development, and by recommending patience and prudence regarding conclusions about theory, findings, and real-world application. PMID:25883578

  8. Implementing Geographical Key Concepts: Design of a Symbiotic Teacher Training Course Based on Empirical and Theoretical Evidence

    ERIC Educational Resources Information Center

    Fögele, Janis; Mehren, Rainer

    2015-01-01

    A central desideratum for the professionalization of qualified teachers is an improved practice of further teacher education. The present work constitutes a course of in-service training, which is built upon both a review of empirical findings concerning the efficacy of in-service training courses for teachers and theoretical assumptions about the…

  9. Theoretical and Empirical Descriptions of Thermospheric Density

    NASA Astrophysics Data System (ADS)

    Solomon, S. C.; Qian, L.

    2004-12-01

    The longest-term and most accurate overall description of the density of the upper thermosphere is provided by analysis of changes in the ephemerides of Earth-orbiting satellites. Empirical models of the thermosphere developed in part from these measurements can do a reasonable job of describing thermospheric properties on a climatological basis, but the promise of first-principles global general circulation models of the coupled thermosphere/ionosphere system is that a true high-resolution, predictive capability may ultimately be developed for thermospheric density. However, several issues are encountered when attempting to tune such models so that they accurately represent absolute densities as a function of altitude, and their changes on solar-rotational and solar-cycle time scales. Among these are the crucial ones of getting the heating rates (from both solar and auroral sources) right, getting the cooling rates right, and establishing the appropriate boundary conditions. There are several ancillary issues as well, such as the problem of registering a pressure-coordinate model onto an altitude scale, and dealing with possible departures from hydrostatic equilibrium in empirical models. Thus, tuning a theoretical model to match empirical climatology may be difficult, even in the absence of high temporal or spatial variation of the energy sources. We will discuss some of the challenges involved, and show comparisons of simulations using the NCAR Thermosphere-Ionosphere-Electrodynamics General Circulation Model (TIE-GCM) to empirical model estimates of neutral thermosphere density and temperature. We will also show some recent simulations using measured solar irradiance from the TIMED/SEE instrument as input to the TIE-GCM.
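The pressure-to-altitude registration problem the abstract mentions hinges on hydrostatic equilibrium: pressure and density fall off roughly exponentially with a scale height H = kT/(mg), so the temperature profile determines how pressure levels map onto altitude. A back-of-envelope sketch; the gravity, composition, and temperature values are rough illustrative choices, not model inputs:

```python
import math

K_B = 1.380649e-23       # Boltzmann constant, J/K
G   = 9.5                # gravity near ~400 km altitude, m/s^2 (approximate)
AMU = 1.66053906660e-27  # atomic mass unit, kg

def scale_height(T, mean_mass_amu):
    """Hydrostatic scale height H = k*T / (m*g): the altitude interval
    over which pressure/density drop by a factor of e."""
    return K_B * T / (mean_mass_amu * AMU * G)

def density(rho0, dz, H):
    """Isothermal hydrostatic profile rho(z0 + dz) = rho0 * exp(-dz/H)."""
    return rho0 * math.exp(-dz / H)

# Atomic oxygen (16 amu) at a hot thermosphere (~1200 K, solar max)
# versus a cooler one (~700 K): a hotter thermosphere is more puffed
# up, so density at a fixed satellite altitude is higher.
H_hot, H_cold = scale_height(1200.0, 16.0), scale_height(700.0, 16.0)
assert H_hot > H_cold
assert density(1.0, 100e3, H_hot) > density(1.0, 100e3, H_cold)
```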

  10. Promoting mental wellbeing: developing a theoretically and empirically sound complex intervention.

    PubMed

    Millar, S L; Donnelly, M

    2014-06-01

    This paper describes the development of a complex intervention to promote mental wellbeing, using the revised framework for developing and evaluating complex interventions produced by the UK Medical Research Council (UKMRC). Application of the first two phases of the framework is described: development, and feasibility and piloting. The theoretical case and evidence base were examined analytically to explicate the theoretical and empirical foundations of the intervention. These findings informed the design of a 12-week mental wellbeing promotion programme providing early intervention for people showing signs of mental health difficulties. The programme is based on the theoretical constructs of self-efficacy, self-esteem, purpose in life, resilience and social support, and comprises 10 steps. A mixed methods approach was used to conduct a feasibility study with community and voluntary sector service users and in primary care. A significant increase in mental wellbeing was observed following participation in the intervention. Qualitative data corroborated this finding and suggested that the intervention was feasible to deliver and acceptable to participants, facilitators and health professionals. The revised UKMRC framework can be successfully applied to the development of public health interventions. © The Author 2013. Published by Oxford University Press on behalf of Faculty of Public Health. All rights reserved.

  11. Competence and Drug Use: Theoretical Frameworks, Empirical Evidence and Measurement.

    ERIC Educational Resources Information Center

    Lindenberg, Cathy Strachan; Solorzano, Rosa; Kelley, Maureen; Darrow, Vicki; Gendrop, Sylvia C.; Strickland, Ora

    1998-01-01

    Discusses the Social Stress Model of Substance Abuse. Summarizes theoretical and conceptual formulations for the construct of competence, reviews empirical evidence for the association of competence with drug use, and describes the preliminary development of a multiscale instrument designed to assess drug-protective competence among low-income…

  12. Quantifying heterogeneity attributable to polythetic diagnostic criteria: theoretical framework and empirical application.

    PubMed

    Olbert, Charles M; Gala, Gary J; Tupler, Larry A

    2014-05-01

    Heterogeneity within psychiatric disorders is both theoretically and practically problematic: For many disorders, it is possible for 2 individuals to share very few or even no symptoms in common yet share the same diagnosis. Polythetic diagnostic criteria have long been recognized to contribute to this heterogeneity, yet no unified theoretical understanding of the coherence of symptom criteria sets currently exists. A general framework for analyzing the logical and mathematical structure, coherence, and diversity of Diagnostic and Statistical Manual diagnostic categories (DSM-5 and DSM-IV-TR) is proposed, drawing from combinatorial mathematics, set theory, and information theory. Theoretical application of this framework to 18 diagnostic categories indicates that in most categories, 2 individuals with the same diagnosis may share no symptoms in common, and that any 2 theoretically possible symptom combinations will share on average less than half their symptoms. Application of this framework to 2 large empirical datasets indicates that patients who meet symptom criteria for major depressive disorder and posttraumatic stress disorder tend to share approximately three-fifths of symptoms in common. For both disorders in each of the datasets, pairs of individuals who shared no common symptoms were observed. Any 2 individuals with either diagnosis were unlikely to exhibit identical symptomatology. The theoretical and empirical results stemming from this approach have substantive implications for etiological research into, and measurement of, psychiatric disorders.
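The combinatorial point can be illustrated with a generic m-of-n polythetic rule; the 5-of-9 example below echoes the familiar DSM major-depression format, but the specific numbers are illustrative rather than taken from the paper's analysis:

```python
from math import comb

def n_profiles(n, m):
    """Number of distinct symptom combinations satisfying an
    m-of-n polythetic rule (at least m of n criteria endorsed)."""
    return sum(comb(n, k) for k in range(m, n + 1))

def min_overlap(n, m):
    """Smallest possible number of shared symptoms between two
    individuals who each endorse exactly m of n criteria:
    the pigeonhole bound max(0, 2m - n)."""
    return max(0, 2 * m - n)

# A 5-of-9 rule admits 256 distinct qualifying symptom profiles...
assert n_profiles(9, 5) == 256
# ...and two diagnosed individuals are guaranteed to share only 1 symptom.
assert min_overlap(9, 5) == 1
# When m is small relative to n, the guaranteed overlap drops to zero,
# i.e. two people can share a diagnosis with no symptoms in common.
assert min_overlap(20, 8) == 0
```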

  13. Theoretical and Empirical Analysis of a Spatial EA Parallel Boosting Algorithm.

    PubMed

    Kamath, Uday; Domeniconi, Carlotta; De Jong, Kenneth

    2018-01-01

    Many real-world problems involve massive amounts of data. Under these circumstances learning algorithms often become prohibitively expensive, making scalability a pressing issue to be addressed. A common approach is to perform sampling to reduce the size of the dataset and enable efficient learning. Alternatively, one customizes learning algorithms to achieve scalability. In either case, the key challenge is to obtain algorithmic efficiency without compromising the quality of the results. In this article we discuss a meta-learning algorithm (PSBML) that combines concepts from spatially structured evolutionary algorithms (SSEAs) with concepts from ensemble and boosting methodologies to achieve the desired scalability property. We present both theoretical and empirical analyses which show that PSBML preserves a critical property of boosting, specifically, convergence to a distribution centered around the margin. We then present additional empirical analyses showing that this meta-level algorithm provides a general and effective framework that can be used in combination with a variety of learning classifiers. We perform extensive experiments to investigate the trade-off achieved between scalability and accuracy, and robustness to noise, on both synthetic and real-world data. These empirical results corroborate our theoretical analysis, and demonstrate the potential of PSBML in achieving scalability without sacrificing accuracy.

  14. Emotions and Motivation in Mathematics Education: Theoretical Considerations and Empirical Contributions

    ERIC Educational Resources Information Center

    Schukajlow, Stanislaw; Rakoczy, K.; Pekrun, R.

    2017-01-01

    Emotions and motivation are important prerequisites, mediators, and outcomes of learning and achievement. In this article, we first review major theoretical approaches and empirical findings in research on students' emotions and motivation in mathematics, including a discussion of how classroom instruction can support emotions and motivation.…

  15. University Students' Understanding of the Concepts Empirical, Theoretical, Qualitative and Quantitative Research

    ERIC Educational Resources Information Center

    Murtonen, Mari

    2015-01-01

    University research education in many disciplines is frequently confronted by problems with students' weak level of understanding of research concepts. A mind map technique was used to investigate how students understand central methodological concepts of empirical, theoretical, qualitative and quantitative. The main hypothesis was that some…

  16. Does the U.S. exercise contagion on Italy? A theoretical model and empirical evidence

    NASA Astrophysics Data System (ADS)

    Cerqueti, Roy; Fenga, Livio; Ventura, Marco

    2018-06-01

    This paper deals with the theme of contagion in financial markets. To this end, we develop a model based on Mixed Poisson Processes to describe the abnormal returns of the financial markets of two countries. In so doing, the article defines the theoretical conditions to be satisfied in order to state that one of them - the so-called leader - exercises contagion on the others - the followers. Specifically, we employ a probabilistic invariance result stating that a suitable transformation of a Mixed Poisson Process is still a Mixed Poisson Process. The theoretical claim is validated by implementing an extensive simulation analysis grounded on empirical data. The countries considered are the U.S. (as the leader) and Italy (as the follower), and the period under scrutiny is long, ranging from 1970 to 2014.
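A Mixed Poisson Process is a Poisson process whose intensity is itself a random variable; mixing over a Gamma-distributed intensity, for instance, yields overdispersed (negative-binomial-like) counts, a natural way to model clustered abnormal-return events. A toy simulation of that mixing construction; the parameters are illustrative, not the paper's calibration:

```python
import math
import random

def mixed_poisson_sample(rng, shape=2.0, scale=3.0):
    """Draw one count from a Gamma-mixed Poisson: first draw a random
    intensity Lambda ~ Gamma(shape, scale), then N ~ Poisson(Lambda)."""
    lam = rng.gammavariate(shape, scale)
    # Knuth's Poisson sampler (adequate for moderate lambda).
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

rng = random.Random(42)
draws = [mixed_poisson_sample(rng) for _ in range(20000)]
mean = sum(draws) / len(draws)
var = sum((x - mean) ** 2 for x in draws) / len(draws)
# Overdispersion: a mixed Poisson has variance > mean, whereas a
# plain Poisson would have variance == mean.
assert var > mean
```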

  17. A Unified Model of Knowledge Sharing Behaviours: Theoretical Development and Empirical Test

    ERIC Educational Resources Information Center

    Chennamaneni, Anitha; Teng, James T. C.; Raja, M. K.

    2012-01-01

    Research and practice on knowledge management (KM) have shown that information technology alone cannot guarantee that employees will volunteer and share knowledge. While previous studies have linked motivational factors to knowledge sharing (KS), we took a further step to thoroughly examine this theoretically and empirically. We developed a…

  18. Trophic interaction modifications: an empirical and theoretical framework.

    PubMed

    Terry, J Christopher D; Morris, Rebecca J; Bonsall, Michael B

    2017-10-01

    Consumer-resource interactions are often influenced by other species in the community. At present these 'trophic interaction modifications' are rarely included in ecological models despite demonstrations that they can drive system dynamics. Here, we advocate and extend an approach that has the potential to unite and represent this key group of non-trophic interactions by emphasising the change to trophic interactions induced by modifying species. We highlight the opportunities this approach brings in comparison to frameworks that coerce trophic interaction modifications into pairwise relationships. To establish common frames of reference and explore the value of the approach, we set out a range of metrics for the 'strength' of an interaction modification which incorporate increasing levels of contextual information about the system. Through demonstrations in three-species model systems, we establish that these metrics capture complementary aspects of interaction modifications. We show how the approach can be used in a range of empirical contexts, and we identify, as specific gaps in current understanding, experiments with multiple levels of modifier species and the distributions of modifications in networks. The trophic interaction modification approach we propose can motivate and unite empirical and theoretical studies of system dynamics, providing a route to confront ecological complexity. © 2017 The Authors. Ecology Letters published by CNRS and John Wiley & Sons Ltd.

  19. Cognitive culture: theoretical and empirical insights into social learning strategies.

    PubMed

    Rendell, Luke; Fogarty, Laurel; Hoppitt, William J E; Morgan, Thomas J H; Webster, Mike M; Laland, Kevin N

    2011-02-01

    Research into social learning (learning from others) has expanded significantly in recent years, not least because of productive interactions between theoretical and empirical approaches. This has been coupled with a new emphasis on learning strategies, which places social learning within a cognitive decision-making framework. Understanding when, how and why individuals learn from others is a significant challenge, but one that is critical to numerous fields in multiple academic disciplines, including the study of social cognition. Copyright © 2010 Elsevier Ltd. All rights reserved.

  20. A review of the nurtured heart approach to parenting: evaluation of its theoretical and empirical foundations.

    PubMed

    Hektner, Joel M; Brennan, Alison L; Brotherson, Sean E

    2013-09-01

    The Nurtured Heart Approach to parenting (NHA; Glasser & Easley, 2008) is summarized and evaluated in terms of its alignment with current theoretical perspectives and empirical evidence in family studies and developmental science. Originally conceived and promoted as a behavior management approach for parents of difficult children (i.e., with behavior disorders), NHA is increasingly offered as a valuable strategy for parents of any children, despite a lack of published empirical support. Parents using NHA are trained to minimize attention to undesired behaviors, provide positive attention and praise for compliance with rules, help children be successful by scaffolding and shaping desired behavior, and establish a set of clear rules and consequences. Many elements of the approach have strong support in the theoretical and empirical literature; however, some of the assumptions are more questionable, such as that negative child behavior can always be attributed to unintentional positive reinforcement by parents responding with negative attention. On balance, NHA appears to promote effective and validated parenting practices, but its effectiveness now needs to be tested empirically. © FPI, Inc.

  21. A theoretical and empirical review of the death-thought accessibility concept in terror management research.

    PubMed

    Hayes, Joseph; Schimel, Jeff; Arndt, Jamie; Faucher, Erik H

    2010-09-01

    Terror management theory (TMT) highlights the motivational impact of thoughts of death in various aspects of everyday life. Since its inception in 1986, research on TMT has undergone a slight but significant shift from an almost exclusive focus on the manipulation of thoughts of death to a marked increase in studies that measure the accessibility of death-related cognition. Indeed, the number of death-thought accessibility (DTA) studies in the published literature has grown substantially in recent years. In light of this increasing reliance on the DTA concept, the present article is meant to provide a comprehensive theoretical and empirical review of the literature employing this concept. After discussing the roots of DTA, the authors outline the theoretical refinements to TMT that have accompanied significant research findings associated with the DTA concept. Four distinct categories (mortality salience, death association, anxiety-buffer threat, and dispositional) are derived to organize the reviewed DTA studies, and the theoretical implications of each category are discussed. Finally, a number of lingering empirical and theoretical issues in the DTA literature are discussed with the aim of stimulating and focusing future research on DTA specifically and TMT in general.

  22. A Theoretical and Empirical Integrated Method to Select the Optimal Combined Signals for Geometry-Free and Geometry-Based Three-Carrier Ambiguity Resolution

    PubMed Central

    Zhao, Dongsheng; Roberts, Gethin Wyn; Lau, Lawrence; Hancock, Craig M.; Bai, Ruibin

    2016-01-01

    Twelve GPS Block IIF satellites, out of the current constellation, can transmit three-frequency signals (L1, L2, L5). Taking advantage of these signals, Three-Carrier Ambiguity Resolution (TCAR) is expected to bring much benefit for ambiguity resolution. One research area is to find the optimal combined signals for better ambiguity resolution in geometry-free (GF) and geometry-based (GB) mode. However, existing research selects the signals through either pure theoretical analysis or testing with simulated data, which might be biased because real observation conditions can differ from theoretical predictions or simulations. In this paper, we propose an integrated theoretical and empirical method, which first selects the possible optimal combined signals in theory and then refines these signals with real triple-frequency GPS data observed at eleven baselines of different lengths. An interpolation technique is also adopted in order to show changes in the AR performance with increasing baseline length. The results show that the AR success rate can be improved by 3% in GF mode and 8% in GB mode at certain intervals of baseline length. Therefore, TCAR can perform better by adopting the combined signals proposed in this paper when the baseline meets the length condition. PMID:27854324
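The combined signals at issue are integer linear combinations of the three carrier phases: a combination i·L1 + j·L2 + k·L5 has frequency i·f1 + j·f2 + k·f5 and wavelength c divided by that frequency, and longer combined wavelengths make the integer ambiguity easier to fix. A sketch using the published GPS carrier frequencies; the combinations shown are the standard wide-lane examples, not necessarily the ones the paper selects:

```python
C = 299_792_458.0                              # speed of light, m/s
F1, F2, F5 = 1575.42e6, 1227.60e6, 1176.45e6   # GPS L1/L2/L5 carriers, Hz

def combined_wavelength(i, j, k):
    """Wavelength of the integer phase combination i*L1 + j*L2 + k*L5:
    lambda = c / (i*f1 + j*f2 + k*f5)."""
    f = i * F1 + j * F2 + k * F5
    return C / f

lam_ewl = combined_wavelength(0, 1, -1)   # extra-wide-lane, ~5.86 m
lam_wl  = combined_wavelength(1, -1, 0)   # wide-lane, ~0.86 m
lam_l1  = combined_wavelength(1, 0, 0)    # plain L1, ~0.19 m
# The longer the combined wavelength, the easier the ambiguity fix.
assert lam_ewl > lam_wl > lam_l1
assert abs(lam_ewl - 5.861) < 0.01
```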

  23. A Theoretical and Empirical Integrated Method to Select the Optimal Combined Signals for Geometry-Free and Geometry-Based Three-Carrier Ambiguity Resolution.

    PubMed

    Zhao, Dongsheng; Roberts, Gethin Wyn; Lau, Lawrence; Hancock, Craig M; Bai, Ruibin

    2016-11-16

    Twelve GPS Block IIF satellites, out of the current constellation, can transmit three-frequency signals (L1, L2, L5). Taking advantage of these signals, Three-Carrier Ambiguity Resolution (TCAR) is expected to bring much benefit for ambiguity resolution. One research area is to find the optimal combined signals for better ambiguity resolution in geometry-free (GF) and geometry-based (GB) mode. However, existing research selects the signals through either pure theoretical analysis or testing with simulated data, which might be biased because real observation conditions can differ from theoretical predictions or simulations. In this paper, we propose an integrated theoretical and empirical method, which first selects the possible optimal combined signals in theory and then refines these signals with real triple-frequency GPS data observed at eleven baselines of different lengths. An interpolation technique is also adopted in order to show changes in the AR performance with increasing baseline length. The results show that the AR success rate can be improved by 3% in GF mode and 8% in GB mode at certain intervals of baseline length. Therefore, TCAR can perform better by adopting the combined signals proposed in this paper when the baseline meets the length condition.

  24. Dignity in the care of older people – a review of the theoretical and empirical literature

    PubMed Central

    Gallagher, Ann; Li, Sarah; Wainwright, Paul; Jones, Ian Rees; Lee, Diana

    2008-01-01

    Background Dignity has become a central concern in UK health policy in relation to older and vulnerable people. The empirical and theoretical literature relating to dignity is extensive and as likely to confound and confuse as to clarify the meaning of dignity for nurses in practice. The aim of this paper is to critically examine the literature and to address the following questions: What does dignity mean? What promotes and diminishes dignity? And how might dignity be operationalised in the care of older people? This paper critically reviews the theoretical and empirical literature relating to dignity and clarifies the meaning and implications of dignity in relation to the care of older people. If nurses are to provide dignified care, clarification is an essential first step. Methods This is a review article, critically examining papers reporting theoretical perspectives and empirical studies relating to dignity. The following databases were searched: Assia, BHI, CINAHL, Social Services Abstracts, IBSS, Web of Knowledge Social Sciences Citation Index and Arts & Humanities Citation Index, supplemented by the location of books and chapters in the philosophy literature. An analytical approach was adopted to the publications reviewed, focusing on the objectives of the review. Results and discussion We review a range of theoretical and empirical accounts of dignity and identify key dignity-promoting factors evident in the literature, including staff attitudes and behaviour; environment; culture of care; and the performance of specific care activities. Although there is scope to learn more about cultural aspects of dignity, we know a good deal about dignity in care in general terms. Conclusion We argue that what is required is to provide sufficient support and education to help nurses understand dignity, and adequate resources to operationalise dignity in their everyday practice. Using the themes identified from our review, we offer proposals for the direction of future research. PMID:18620561

  25. Theoretical Foundation of Zisman's Empirical Equation for Wetting of Liquids on Solid Surfaces

    ERIC Educational Resources Information Center

    Zhu, Ruzeng; Cui, Shuwen; Wang, Xiaosong

    2010-01-01

    Theories of wetting of liquids on solid surfaces under the condition that van der Waals force is dominant are briefly reviewed. We show theoretically that Zisman's empirical equation for wetting of liquids on solid surfaces is a linear approximation of the Young-van der Waals equation in the wetting region, and we express the two parameters in…
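For context, Zisman's empirical equation states that the cosine of the contact angle θ of a sessile drop decreases linearly with the liquid's surface tension γ_L; extrapolating the line back to cos θ = 1 yields the solid's critical surface tension γ_c, below which liquids wet completely. A sketch with a hypothetical slope and γ_c, not fitted values:

```python
def zisman_cos_theta(gamma_L, gamma_c=20.0, b=0.03):
    """Zisman's empirical linear relation for low-energy solids:
    cos(theta) = 1 - b * (gamma_L - gamma_c), for gamma_L >= gamma_c.
    gamma_c (mN/m) and slope b are hypothetical illustrative values."""
    return 1.0 - b * (gamma_L - gamma_c)

def critical_surface_tension(points):
    """Extrapolate a Zisman plot -- two (gamma_L, cos theta) pairs
    assumed collinear -- back to cos(theta) = 1 to recover gamma_c."""
    (g1, c1), (g2, c2) = points
    slope = (c2 - c1) / (g2 - g1)
    return g1 + (1.0 - c1) / slope

# Two synthetic "measurements" on the Zisman line recover gamma_c = 20.
pts = [(30.0, zisman_cos_theta(30.0)), (50.0, zisman_cos_theta(50.0))]
assert abs(critical_surface_tension(pts) - 20.0) < 1e-9
```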

  6. On the complex relationship between energy expenditure and longevity: Reconciling the contradictory empirical results with a simple theoretical model.

    PubMed

    Hou, Chen; Amunugama, Kaushalya

    2015-07-01

    The relationship between energy expenditure and longevity has been a central theme in aging studies. Empirical studies have yielded controversial results, which cannot be reconciled by existing theories. In this paper, we present a simple theoretical model based on first principles of energy conservation and allometric scaling laws. The model takes into consideration the energy tradeoffs between life history traits and the efficiency of energy utilization, and offers quantitative and qualitative explanations for a set of seemingly contradictory empirical results. We show that oxidative metabolism can affect cellular damage and longevity in different ways in animals with different life histories and under different experimental conditions. Qualitative data and the linearity between energy expenditure, cellular damage, and lifespan assumed in previous studies are not sufficient to understand the complexity of the relationships. Our model provides a theoretical framework for quantitative analyses and predictions. The model is supported by a variety of empirical studies, including studies on the cellular damage profile during ontogeny; the intra- and inter-specific correlations between body mass, metabolic rate, and lifespan; and the effects on lifespan of (1) diet restriction and genetic modification of growth hormone, (2) cold and exercise stresses, and (3) manipulation of antioxidants. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
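    The allometric ingredients the abstract invokes can be sketched numerically. This is a minimal illustration, not the authors' model: it assumes Kleiber's 3/4-power law for whole-body metabolic rate and a 1/4-power law for lifespan (both standard allometric results), under which lifetime mass-specific energy expenditure comes out mass-invariant. The constants b0 and t0 are arbitrary placeholders.

```python
# Hedged sketch: standard allometric scaling relations, with arbitrary
# illustrative constants (not values from the paper).

def metabolic_rate(mass, b0=1.0):
    """Whole-body metabolic rate, Kleiber's 3/4-power law: B = b0 * M^(3/4)."""
    return b0 * mass ** 0.75

def lifespan(mass, t0=1.0):
    """Lifespan scaling: T = t0 * M^(1/4)."""
    return t0 * mass ** 0.25

def lifetime_specific_energy(mass):
    """Energy spent per unit body mass over a lifetime: B * T / M.

    With the exponents above this is b0 * t0 * M^(3/4 + 1/4 - 1) = b0 * t0,
    i.e. independent of body mass.
    """
    return metabolic_rate(mass) * lifespan(mass) / mass

for m in (0.02, 1.0, 70.0, 5000.0):  # mouse .. elephant, in kg
    print(m, lifetime_specific_energy(m))
```

The mass-invariance is exactly the kind of zeroth-order prediction that the paper's energy-tradeoff terms then perturb to explain the contradictory empirical results.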

  7. Collective behavior in animal groups: theoretical models and empirical studies

    PubMed Central

    Giardina, Irene

    2008-01-01

    Collective phenomena in animal groups have attracted much attention in recent years, becoming one of the hottest topics in ethology. There are various reasons for this. On the one hand, animal grouping provides a paradigmatic example of self-organization, where collective behavior emerges in the absence of centralized control. The mechanism of group formation, where local rules for the individuals lead to a coherent global state, is very general and transcends the detailed nature of its components. In this respect, collective animal behavior is a subject of great interdisciplinary interest. On the other hand, there are several important issues related to the biological function of grouping and its evolutionary success. Research in this field boasts a number of theoretical models, but far fewer empirical results to compare them with. For this reason, even if the general mechanisms through which self-organization is achieved are qualitatively well understood, a quantitative test of the models' assumptions is still lacking. New analyses of large groups, which require sophisticated technological procedures, can provide the necessary empirical data. PMID:19404431

  8. Physical Violence between Siblings: A Theoretical and Empirical Analysis

    ERIC Educational Resources Information Center

    Hoffman, Kristi L.; Kiecolt, K. Jill; Edwards, John N.

    2005-01-01

    This study develops and tests a theoretical model to explain sibling violence based on the feminist, conflict, and social learning theoretical perspectives and research in psychology and sociology. A multivariate analysis of data from 651 young adults generally supports hypotheses from all three theoretical perspectives. Males with brothers have…

  9. Attachment-based family therapy for depressed and suicidal adolescents: theory, clinical model and empirical support.

    PubMed

    Ewing, E Stephanie Krauthamer; Diamond, Guy; Levy, Suzanne

    2015-01-01

    Attachment-Based Family Therapy (ABFT) is a manualized family-based intervention designed for working with depressed adolescents, including those at risk for suicide, and their families. It is an empirically informed and supported treatment. ABFT has its theoretical underpinnings in attachment theory and clinical roots in structural family therapy and emotion focused therapies. ABFT relies on a transactional model that aims to transform the quality of adolescent-parent attachment, as a means of providing the adolescent with a more secure relationship that can support them during challenging times generally, and the crises related to suicidal thinking and behavior, specifically. This article reviews: (1) the theoretical foundations of ABFT (attachment theory, models of emotional development); (2) the ABFT clinical model, including training and supervision factors; and (3) empirical support.

  10. Radiation effects on space-based stellar photometry: theoretical models and empirical results for CoRoT Space Telescope

    NASA Astrophysics Data System (ADS)

    Pinheiro da Silva, L.; Rolland, G.; Lapeyrere, V.; Auvergne, M.

    2008-03-01

    Convection, Rotation and planetary Transits (CoRoT) is a space mission dedicated to stellar seismology and the search for extrasolar planets. Both scientific programs are based on very high precision photometry and require long, uninterrupted observations. The instrument is based on an afocal telescope and a wide-field camera, consisting of four E2V-4280 CCD devices. This set is mounted on a recurrent platform for insertion in low Earth orbit. The CoRoT satellite has been recently launched for a nominal mission duration of three years. In this work, we discuss the impact of space radiation on CoRoT CCDs, in sight of the in-flight characterization results obtained during the satellite's commissioning phase, as well as the very first observational data. We start by describing the population of trapped particles at the satellite altitude, and by presenting a theoretical prediction for the incoming radiation fluxes seen by the CCDs behind shielding. Empirical results regarding particle impact rates and their geographical distribution are then presented and discussed. The effect of particle impacts is also statistically characterized, with respect to the ionizing energy imparted to the CCDs and the size of impact trails. Based on these results, we discuss the effects of space radiation on precise and time-resolved stellar photometry from space. Finally, we present preliminary results concerning permanent radiation damage on CoRoT CCDs, as extrapolated from the data available at the beginning of the satellite's lifetime.

  11. The Role of Trait Emotional Intelligence in Academic Performance: Theoretical Overview and Empirical Update.

    PubMed

    Perera, Harsha N

    2016-01-01

    Considerable debate still exists among scholars over the role of trait emotional intelligence (TEI) in academic performance. The dominant theoretical position is that TEI should be orthogonal or only weakly related to achievement; yet, there are strong theoretical reasons to believe that TEI plays a key role in performance. The purpose of the current article is to provide (a) an overview of the possible theoretical mechanisms linking TEI with achievement and (b) an update on empirical research examining this relationship. To elucidate these theoretical mechanisms, the overview draws on multiple theories of emotion and regulation, including TEI theory, social-functional accounts of emotion, and expectancy-value and psychobiological models of emotion and regulation. Although these theoretical accounts variously emphasize different variables as focal constructs, when taken together, they provide a comprehensive picture of the possible mechanisms linking TEI with achievement. In this regard, the article redresses the problem of vaguely specified theoretical links currently hampering progress in the field. The article closes with a consideration of directions for future research.

  12. Empirical STORM-E Model: I. Theoretical and Observational Basis

    NASA Technical Reports Server (NTRS)

    Mertens, Christopher J.; Xu, Xiaojing; Bilitza, Dieter; Mlynczak, Martin G.; Russell, James M., III

    2013-01-01

    Auroral nighttime infrared emission observed by the Sounding of the Atmosphere using Broadband Emission Radiometry (SABER) instrument onboard the Thermosphere-Ionosphere-Mesosphere Energetics and Dynamics (TIMED) satellite is used to develop an empirical model of geomagnetic storm enhancements to E-region peak electron densities. The empirical model is called STORM-E and will be incorporated into the 2012 release of the International Reference Ionosphere (IRI). The proxy for characterizing the E-region response to geomagnetic forcing is NO+(v) volume emission rates (VER) derived from the TIMED/SABER 4.3 μm channel limb radiance measurements. The storm-time response of the NO+(v) 4.3 μm VER is sensitive to auroral particle precipitation. A statistical database of storm-time to climatological quiet-time ratios of SABER-observed NO+(v) 4.3 μm VER is fit to widely available geomagnetic indices using the theoretical framework of linear impulse-response theory. The STORM-E model provides a dynamic storm-time correction factor to adjust a known quiescent E-region electron density peak concentration for geomagnetic enhancements due to auroral particle precipitation. Part II of this series describes the explicit development of the empirical storm-time correction factor for E-region peak electron densities, and shows comparisons of E-region electron densities between STORM-E predictions and incoherent scatter radar measurements. In this paper, Part I of the series, the efficacy of using SABER-derived NO+(v) VER as a proxy for the E-region response to solar-geomagnetic disturbances is presented. Furthermore, a detailed description of the algorithms and methodologies used to derive NO+(v) VER from SABER 4.3 μm limb emission measurements is given. Finally, an assessment of key uncertainties in retrieving NO+(v) VER is presented.
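    The linear impulse-response framework named in the abstract can be sketched as a convolution of a geomagnetic driver with a causal response kernel. Everything below is an illustrative assumption, not the actual STORM-E formulation: the exponential kernel, its timescale tau, the gain, and the synthetic ap-like index are all invented for the sketch.

```python
import numpy as np

def impulse_response(t_hours, tau=12.0):
    """Causal exponential filter h(t) = exp(-t/tau)/tau for t >= 0 (assumed form)."""
    return np.exp(-t_hours / tau) / tau

def storm_correction(index, dt=1.0, tau=12.0, gain=0.05):
    """Toy storm-time ratio: Ratio(t) = 1 + gain * (h * index)(t).

    The convolution makes the correction build up during a geomagnetic
    disturbance and decay afterwards with timescale tau.
    """
    t = np.arange(0.0, 8.0 * tau, dt)
    h = impulse_response(t, tau)
    conv = np.convolve(index, h)[: len(index)] * dt
    return 1.0 + gain * conv

# Synthetic hourly geomagnetic index: quiet except a 6-hour disturbance.
ap = np.zeros(96)
ap[24:30] = 40.0
ratio = storm_correction(ap)
print(ratio[0], ratio.max(), ratio[90])
```

The fitted quantities in the paper are the kernel coefficients relating observed VER ratios to the indices; here they are simply assumed so the qualitative behavior (enhancement followed by relaxation) is visible.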

  13. Evidence-based Nursing Education - a Systematic Review of Empirical Research

    PubMed Central

    Reiber, Karin

    2011-01-01

    The project "Evidence-based Nursing Education – Preparatory Stage", funded by the Landesstiftung Baden-Württemberg within the programme Impulsfinanzierung Forschung (Funding to Stimulate Research), aims to collect information on current research concerned with nursing education and to process existing data. The results of empirical research which has already been carried out were systematically evaluated with the aim of identifying further topics, fields and matters of interest for empirical research in nursing education. In the course of the project, the available empirical studies on nursing education were scientifically analysed and systematised. The over-arching aim of the evidence-based training approach (which extends beyond the aims of this project) is the conception, organisation and evaluation of vocational training and educational processes in the caring professions on the basis of empirical data. The following contribution first provides a systematic, theoretical link to the over-arching reference framework, as the evidence-based approach is adapted from thematically related specialist fields. The research design of the project is oriented towards criteria introduced from a selection of studies and carries out a two-stage systematic review of the selected studies. As a result, the current status of research in nursing education, as well as its organisation and structure, and questions relating to specialist training and comparative education are introduced and discussed. Finally, the empirical research on nursing training is critically appraised as a complementary element in educational theory/psychology of learning and in the ethical tradition of research. This contribution aims, on the one hand, to derive and describe the methods used, and to introduce the steps followed in gathering and evaluating the data. On the other hand, it is intended to give a systematic overview of empirical research work in nursing education. In order to preserve a

  14. Common liability to addiction and “gateway hypothesis”: Theoretical, empirical and evolutionary perspective

    PubMed Central

    Vanyukov, Michael M.; Tarter, Ralph E.; Kirillova, Galina P.; Kirisci, Levent; Reynolds, Maureen D.; Kreek, Mary Jeanne; Conway, Kevin P.; Maher, Brion S.; Iacono, William G.; Bierut, Laura; Neale, Michael C.; Clark, Duncan B.; Ridenour, Ty A.

    2013-01-01

    Background Two competing concepts address the development of involvement with psychoactive substances: the “gateway hypothesis” (GH) and common liability to addiction (CLA). Method The literature on theoretical foundations and empirical findings related to both concepts is reviewed. Results The data suggest that drug use initiation sequencing, the core GH element, is variable and opportunistic rather than uniform and developmentally deterministic. The association between risks for use of different substances, if any, can be more readily explained by common underpinnings than by specific staging. In contrast, the CLA concept is grounded in genetic theory and supported by data identifying common sources of variation in the risk for specific addictions. This commonality has identifiable neurobiological substrate and plausible evolutionary explanations. Conclusions Whereas the “gateway” hypothesis does not specify mechanistic connections between “stages”, and does not extend to the risks for addictions, the concept of common liability to addictions incorporates sequencing of drug use initiation as well as extends to related addictions and their severity, provides a parsimonious explanation of substance use and addiction co-occurrence, and establishes a theoretical and empirical foundation to research in etiology, quantitative risk and severity measurement, as well as targeted non-drug-specific prevention and early intervention. PMID:22261179

  15. Theoretical geology

    NASA Astrophysics Data System (ADS)

    Mikeš, Daniel

    2010-05-01

    erroneous assumptions and do not solve the very fundamental issue that lies at the base of the problem. This problem is straightforward and obvious: a sedimentary system is inherently four-dimensional (3 spatial dimensions + 1 temporal dimension). Any method using an inferior number of dimensions is bound to fail to describe the evolution of a sedimentary system. It is indicative of the present-day geological world that such fundamental issues are overlooked. The only reason one can point to is the so-called "rationality" of today's society. Simple "common sense" leads us to the conclusion that in this case the empirical method is bound to fail and the only method that can solve the problem is the theoretical approach. This reasoning is completely trivial for the traditional exact sciences like physics and mathematics and applied sciences like engineering. However, it is not for geology, a science that was traditionally descriptive and jumped to empirical science, skipping the stage of theoretical science. I argue that the gap of theoretical geology has been left open and needs to be filled. Every discipline in geology lacks a theoretical base. This base can only be filled by the theoretical/inductive approach and cannot be filled by the empirical/deductive approach. Once a critical mass of geologists realises this flaw in today's geology, we can start solving the fundamental problems in geology.

  16. Absorption line indices in the UV. I. Empirical and theoretical stellar population models

    NASA Astrophysics Data System (ADS)

    Maraston, C.; Nieves Colmenárez, L.; Bender, R.; Thomas, D.

    2009-01-01

    Aims: Stellar absorption lines in the optical (e.g. the Lick system) have been extensively studied and constitute an important stellar population diagnostic for galaxies in the local universe and up to moderate redshifts. Proceeding towards higher look-back times, galaxies are younger and the ultraviolet becomes the relevant spectral region where the dominant stellar populations shine. A comprehensive study of ultraviolet absorption lines of stellar population models is however still lacking. With this in mind, we study absorption line indices in the far and mid-ultraviolet in order to determine age and metallicity indicators for UV-bright stellar populations in the local universe as well as at high redshift. Methods: We explore empirical and theoretical spectral libraries and use evolutionary population synthesis to compute synthetic line indices of stellar population models. From the empirical side, we exploit the IUE low-resolution library of stellar spectra and system of absorption lines, from which we derive analytical functions (fitting functions) describing the strength of stellar line indices as a function of gravity, temperature and metallicity. The fitting functions are entered into an evolutionary population synthesis code in order to compute the integrated line indices of stellar population models. The same line indices are also directly evaluated on theoretical spectral energy distributions of stellar population models based on Kurucz high-resolution synthetic spectra. In order to select indices that can be used as age and/or metallicity indicators for distant galaxies and globular clusters, we compare the models to data of template globular clusters from the Magellanic Clouds with independently known ages and metallicities. Results: We provide synthetic line indices in the wavelength range ~1200 Å to ~3000 Å for stellar populations of various ages and metallicities. This adds several new indices to the already well-studied CIV and SiIV absorptions

  17. Evolution of the empirical and theoretical foundations of eyewitness identification reform.

    PubMed

    Clark, Steven E; Moreland, Molly B; Gronlund, Scott D

    2014-04-01

    Scientists in many disciplines have begun to raise questions about the evolution of research findings over time (Ioannidis in Epidemiology, 19, 640-648, 2008; Jennions & Møller in Proceedings of the Royal Society, Biological Sciences, 269, 43-48, 2002; Mullen, Muellerleile, & Bryan in Personality and Social Psychology Bulletin, 27, 1450-1462, 2001; Schooler in Nature, 470, 437, 2011), since many phenomena exhibit decline effects: reductions in the magnitudes of effect sizes as empirical evidence accumulates. The present article examines empirical and theoretical evolution in eyewitness identification research. For decades, the field has held that there are identification procedures that, if implemented by law enforcement, would increase eyewitness accuracy, either by reducing false identifications, with little or no change in correct identifications, or by increasing correct identifications, with little or no change in false identifications. Despite the durability of this no-cost view, it is unambiguously contradicted by data (Clark in Perspectives on Psychological Science, 7, 238-259, 2012a; Clark & Godfrey in Psychonomic Bulletin & Review, 16, 22-42, 2009; Clark, Moreland, & Rush, 2013; Palmer & Brewer in Law and Human Behavior, 36, 247-255, 2012), raising questions as to how the no-cost view became well-accepted and endured for so long. Our analyses suggest that (1) seminal studies produced, or were interpreted as having produced, the no-cost pattern of results; (2) a compelling theory was developed that appeared to account for the no-cost pattern; (3) empirical results changed over the years, and subsequent studies did not reliably replicate the no-cost pattern; and (4) the no-cost view survived despite the accumulation of contradictory empirical evidence. Theories of memory that were ruled out by early data now appear to be supported by data, and the theory developed to account for early data now appears to be incorrect.

  18. Social-Emotional Well-Being and Resilience of Children in Early Childhood Settings--PERIK: An Empirically Based Observation Scale for Practitioners

    ERIC Educational Resources Information Center

    Mayr, Toni; Ulich, Michaela

    2009-01-01

    Compared with the traditional focus on developmental problems, research on positive development is relatively new. Empirical research in children's well-being has been scarce. The aim of this study was to develop a theoretically and empirically based instrument for practitioners to observe and assess preschool children's well-being in early…

  19. Patient perceptions of patient-centred care: empirical test of a theoretical model.

    PubMed

    Rathert, Cheryl; Williams, Eric S; McCaughey, Deirdre; Ishqaidef, Ghadir

    2015-04-01

    Patient perception measures are gaining increasing interest among scholars and practitioners. The aim of this study was to empirically examine a conceptual model of patient-centred care using patient perception survey data. Patient-centred care is one of the Institute of Medicine's objectives for improving health care in the 21st century. Patient interviews conducted by the Picker Institute/Commonwealth Fund in the 1980s resulted in a theoretical model and survey questions with dimensions and attributes patients defined as patient-centred. The present study used survey data from patients with overnight visits at 142 U.S. hospitals. Regression analysis found significant support for the theoretical model. Perceptions of emotional support had the strongest relationship with overall care ratings. Coordination of care and physical comfort were strongly related as well. Understanding how patients experience their care can help improve understanding of what patients believe is patient-centred, and of how care processes relate to important patient outcomes. © 2012 John Wiley & Sons Ltd.

  20. Theoretical and Empirical Comparison of Big Data Image Processing with Apache Hadoop and Sun Grid Engine.

    PubMed

    Bao, Shunxing; Weitendorf, Frederick D; Plassard, Andrew J; Huo, Yuankai; Gokhale, Aniruddha; Landman, Bennett A

    2017-02-11

    The field of big data is generally concerned with the scale of processing at which traditional computational paradigms break down. In medical imaging, traditional large scale processing uses a cluster computer that combines a group of workstation nodes into a functional unit that is controlled by a job scheduler. Typically, a shared-storage network file system (NFS) is used to host imaging data. However, data transfer from storage to processing nodes can saturate network bandwidth when data is frequently uploaded/retrieved from the NFS, e.g., "short" processing times and/or "large" datasets. Recently, an alternative approach using Hadoop and HBase was presented for medical imaging to enable co-location of data storage and computation while minimizing data transfer. The benefits of using such a framework must be formally evaluated against a traditional approach to characterize the point at which simply "large scale" processing transitions into "big data" and necessitates alternative computational frameworks. The proposed Hadoop system was implemented on a production lab-cluster alongside a standard Sun Grid Engine (SGE). Theoretical models for wall-clock time and resource time for both approaches are introduced and validated. To provide real example data, three T1 image archives were retrieved from a university secure, shared web database and used to empirically assess computational performance under three configurations of cluster hardware (using 72, 109, or 209 CPU cores) with differing job lengths. Empirical results match the theoretical models. Based on these data, a comparative analysis is presented for when the Hadoop framework will be relevant and non-relevant for medical imaging.
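    The crossover the abstract describes, where "large scale" processing becomes "big data", can be caricatured with a toy wall-clock cost model. Everything here is invented for illustration: the function names, bandwidths, framework overhead, and job parameters are hypothetical, not the paper's validated models. The sketch only shows the qualitative tradeoff, with short jobs on large data favoring data-local execution and long jobs on small data not paying back the framework overhead.

```python
# Toy cost model (all constants are illustrative assumptions).

def sge_wall_clock(n_jobs, compute_s, data_mb, nfs_bandwidth_mbps=1000.0):
    """Shared-NFS cluster: every job pulls its data through one network pipe,
    so transfer serializes across jobs; compute then runs in parallel."""
    transfer_s = n_jobs * data_mb * 8.0 / nfs_bandwidth_mbps
    return transfer_s + compute_s

def hadoop_wall_clock(n_jobs, compute_s, data_mb,
                      local_bandwidth_mbps=4000.0, overhead_s=120.0):
    """Data-local execution: each node reads its own data in parallel,
    at the price of a fixed framework scheduling/startup overhead."""
    transfer_s = data_mb * 8.0 / local_bandwidth_mbps
    return overhead_s + transfer_s + compute_s

# Short jobs on large data: NFS transfer dominates, data locality wins.
short = (200, 60.0, 500.0)     # (n_jobs, compute_s, data_mb)
# Long jobs on small data: the fixed overhead is not worth paying.
long_ = (200, 3600.0, 50.0)

print(sge_wall_clock(*short), hadoop_wall_clock(*short))
print(sge_wall_clock(*long_), hadoop_wall_clock(*long_))
```

The paper's actual models additionally account for resource time and were validated against measured runs on 72 to 209 cores; this sketch only reproduces the shape of the argument.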

  1. Theoretical and empirical comparison of big data image processing with Apache Hadoop and Sun Grid Engine

    NASA Astrophysics Data System (ADS)

    Bao, Shunxing; Weitendorf, Frederick D.; Plassard, Andrew J.; Huo, Yuankai; Gokhale, Aniruddha; Landman, Bennett A.

    2017-03-01

    The field of big data is generally concerned with the scale of processing at which traditional computational paradigms break down. In medical imaging, traditional large scale processing uses a cluster computer that combines a group of workstation nodes into a functional unit that is controlled by a job scheduler. Typically, a shared-storage network file system (NFS) is used to host imaging data. However, data transfer from storage to processing nodes can saturate network bandwidth when data is frequently uploaded/retrieved from the NFS, e.g., "short" processing times and/or "large" datasets. Recently, an alternative approach using Hadoop and HBase was presented for medical imaging to enable co-location of data storage and computation while minimizing data transfer. The benefits of using such a framework must be formally evaluated against a traditional approach to characterize the point at which simply "large scale" processing transitions into "big data" and necessitates alternative computational frameworks. The proposed Hadoop system was implemented on a production lab-cluster alongside a standard Sun Grid Engine (SGE). Theoretical models for wall-clock time and resource time for both approaches are introduced and validated. To provide real example data, three T1 image archives were retrieved from a university secure, shared web database and used to empirically assess computational performance under three configurations of cluster hardware (using 72, 109, or 209 CPU cores) with differing job lengths. Empirical results match the theoretical models. Based on these data, a comparative analysis is presented for when the Hadoop framework will be relevant and nonrelevant for medical imaging.

  2. Scientific thinking in young children: theoretical advances, empirical research, and policy implications.

    PubMed

    Gopnik, Alison

    2012-09-28

    New theoretical ideas and empirical research show that very young children's learning and thinking are strikingly similar to much learning and thinking in science. Preschoolers test hypotheses against data and make causal inferences; they learn from statistics and informal experimentation, and from watching and listening to others. The mathematical framework of probabilistic models and Bayesian inference can describe this learning in precise ways. These discoveries have implications for early childhood education and policy. In particular, they suggest both that early childhood experience is extremely important and that the trend toward more structured and academic early childhood programs is misguided.
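    The Bayesian causal inference the abstract attributes to preschoolers can be illustrated with a toy "which block activates the detector?" update. The hypotheses, likelihoods, and scenario are invented for the sketch and are not from Gopnik's studies; the point is only the mechanics of eliminating hypotheses as evidence arrives.

```python
from fractions import Fraction

# Three hypotheses about which blocks cause a detector to activate,
# with a uniform prior.
hypotheses = {
    frozenset("A"): Fraction(1, 3),
    frozenset("B"): Fraction(1, 3),
    frozenset("AB"): Fraction(1, 3),  # both A and B are causes
}

def likelihood(cause_set, blocks_placed, activated):
    """Deterministic toy likelihood: the detector activates iff
    at least one placed block is in the hypothesized cause set."""
    predicted = bool(cause_set & blocks_placed)
    return Fraction(1) if predicted == activated else Fraction(0)

def update(prior, blocks_placed, activated):
    """One step of Bayes' rule: reweight by likelihood, then normalize."""
    post = {h: p * likelihood(h, blocks_placed, activated)
            for h, p in prior.items()}
    z = sum(post.values())
    return {h: p / z for h, p in post.items()}

# The child observes: block A alone activates the detector,
# then block B alone does not.
beliefs = update(hypotheses, frozenset("A"), True)
beliefs = update(beliefs, frozenset("B"), False)
print(beliefs)
```

Two observations suffice to concentrate all belief on "A alone is the cause", which is the informal-experimentation pattern the research describes.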

  3. Misleading Theoretical Assumptions in Hypertext/Hypermedia Research.

    ERIC Educational Resources Information Center

    Tergan, Sigmar-Olaf

    1997-01-01

    Reviews basic theoretical assumptions of research on learning with hypertext/hypermedia. Focuses on whether the results of research on hypertext/hypermedia-based learning support these assumptions. Results of empirical studies and theoretical analysis reveal that many research approaches have been misled by inappropriate theoretical assumptions on…

  4. Theoretical and Empirical Comparison of Big Data Image Processing with Apache Hadoop and Sun Grid Engine

    PubMed Central

    Bao, Shunxing; Weitendorf, Frederick D.; Plassard, Andrew J.; Huo, Yuankai; Gokhale, Aniruddha; Landman, Bennett A.

    2016-01-01

    The field of big data is generally concerned with the scale of processing at which traditional computational paradigms break down. In medical imaging, traditional large scale processing uses a cluster computer that combines a group of workstation nodes into a functional unit that is controlled by a job scheduler. Typically, a shared-storage network file system (NFS) is used to host imaging data. However, data transfer from storage to processing nodes can saturate network bandwidth when data is frequently uploaded/retrieved from the NFS, e.g., “short” processing times and/or “large” datasets. Recently, an alternative approach using Hadoop and HBase was presented for medical imaging to enable co-location of data storage and computation while minimizing data transfer. The benefits of using such a framework must be formally evaluated against a traditional approach to characterize the point at which simply “large scale” processing transitions into “big data” and necessitates alternative computational frameworks. The proposed Hadoop system was implemented on a production lab-cluster alongside a standard Sun Grid Engine (SGE). Theoretical models for wall-clock time and resource time for both approaches are introduced and validated. To provide real example data, three T1 image archives were retrieved from a university secure, shared web database and used to empirically assess computational performance under three configurations of cluster hardware (using 72, 109, or 209 CPU cores) with differing job lengths. Empirical results match the theoretical models. Based on these data, a comparative analysis is presented for when the Hadoop framework will be relevant and non-relevant for medical imaging. PMID:28736473

  5. Empirical population and public health ethics: A review and critical analysis to advance robust empirical-normative inquiry.

    PubMed

    Knight, Rod

    2016-05-01

    The field of population and public health ethics (PPHE) has yet to fully embrace the generation of evidence as an important project. This article reviews the philosophical debates related to the 'empirical turn' in clinical bioethics, and critically analyses how PPHE has and can engage with the philosophical implications of generating empirical data within the task of normative inquiry. A set of five conceptual and theoretical issues pertaining to population health that are unresolved and could potentially benefit from empirical PPHE approaches to normative inquiry are discussed. Each issue differs from traditional empirical bioethical approaches, in that they emphasize (1) concerns related to the population, (2) 'upstream' policy-relevant health interventions - within and outside of the health care system and (3) the prevention of illness and disease. Within each theoretical issue, a conceptual example from population and public health approaches to HIV prevention and health promotion is interrogated. Based on the review and critical analysis, this article concludes that empirical-normative approaches to population and public health ethics would be most usefully pursued as an iterative project (rather than as a linear project), in which the normative informs the empirical questions to be asked and new empirical evidence constantly directs conceptualizations of what constitutes morally robust public health practices. Finally, a conceptualization of an empirical population and public health ethics is advanced in order to open up new interdisciplinary 'spaces', in which empirical and normative approaches to ethical inquiry are transparently (and ethically) integrated. © The Author(s) 2015.

  6. A Theoretical and Empirical Comparison of Three Approaches to Achievement Testing.

    ERIC Educational Resources Information Center

    Haladyna, Tom; Roid, Gale

    Three approaches to the construction of achievement tests are compared: construct, operational, and empirical. The construct approach is based upon classical test theory and measures an abstract representation of the instructional objectives. The operational approach specifies instructional intent through instructional objectives, facet design,…

  7. Rural Employment, Migration, and Economic Development: Theoretical Issues and Empirical Evidence from Africa. Africa Rural Employment Paper No. 1.

    ERIC Educational Resources Information Center

    Byerlee, Derek; Eicher, Carl K.

    Employment problems in Africa were examined with special emphasis on rural employment and migration within the context of overall economic development. A framework was provided for analyzing rural employment in development; that framework was used to analyze empirical information from Africa; and theoretical issues were raised in analyzing rural…

  8. Five ways of being "theoretical": applications to provider-patient communication research.

    PubMed

    Hall, Judith A; Schmid Mast, Marianne

    2009-03-01

    Analyzes the term "theoretical" as it applies to the area of provider-patient communication research, in order to understand better at a conceptual level what the term may mean for authors and critics. Based on literature on provider-patient communication. Offers, and discusses, five definitions of the term "theoretical" as it applies to empirical research and its exposition: (1) grounding, (2) referencing, (3) design and analysis, (4) interpretation, and (5) impact. Each of these definitions embodies a different standard for evaluating the theoretical aspects of research. Although it is often said that research on provider-patient communication is not "theoretical" enough, the term is ambiguous and often applied vaguely. A multidimensional analysis reveals that there are several distinct ways in which empirical research can be strong or weak theoretically. Researchers, educators, editors, and reviewers could use the "Five Ways" framework to appraise the theory-relevant strengths and weaknesses of empirical research and its exposition.

  9. Binocular disparities, motion parallax, and geometric perspective in Patrick Hughes's 'reverspectives': theoretical analysis and empirical findings.

    PubMed

    Rogers, Brian; Gyani, Alex

    2010-01-01

    Patrick Hughes's 'reverspective' artworks provide a novel way of investigating the effectiveness of different sources of 3-D information for the human visual system. Our empirical findings show that the converging lines of simple linear perspective can be as effective as the rich array of 3-D cues present in natural scenes in determining what we see, even when these cues are in conflict with binocular disparities. Theoretical considerations reveal that, once the information provided by motion parallax transformations is correctly understood, there is no need to invoke higher-level processes or an interpretation based on familiarity or past experience in order to explain either the 'reversed' depth or the apparent, concomitant rotation of a reverspective artwork as the observer moves from side to side. What we see in reverspectives is the most likely real-world scenario (distal stimulus) that could have created the perspective and parallax transformations (proximal stimulus) that stimulate our visual systems.

  10. An empirical/theoretical model with dimensionless numbers to predict the performance of electrodialysis systems on the basis of operating conditions.

    PubMed

    Karimi, Leila; Ghassemi, Abbas

    2016-07-01

    Among the different technologies developed for desalination, the electrodialysis/electrodialysis reversal (ED/EDR) process is one of the most promising for treating brackish water with low salinity when there is high risk of scaling. Multiple researchers have investigated ED/EDR to optimize the process, determine the effects of operating parameters, and develop theoretical/empirical models. Previously published empirical/theoretical models have evaluated the effect of the hydraulic conditions of the ED/EDR on the limiting current density using dimensionless numbers. The reason for previous studies' emphasis on limiting current density is twofold: 1) to maximize ion removal, most ED/EDR systems are operated close to limiting current conditions if there is not a scaling potential in the concentrate chamber due to a high concentration of less-soluble salts; and 2) for modeling the ED/EDR system with dimensionless numbers, it is more accurate and convenient to use limiting current density, where the boundary layer's characteristics are known at constant electrical conditions. To improve knowledge of ED/EDR systems, ED/EDR models should also be developed for the Ohmic region, where operation reduces energy consumption, facilitates targeted ion removal, and prolongs membrane life compared to limiting current conditions. In this paper, theoretical/empirical models were developed for ED/EDR performance in a wide range of operating conditions. The presented ion removal and selectivity models were developed for the removal of monovalent ions and divalent ions utilizing the dominant dimensionless numbers obtained from laboratory scale electrodialysis experiments. At any system scale, these models can predict ED/EDR performance in terms of monovalent and divalent ion removal.

  11. Theoretical and empirical bases for dialect-neutral language assessment: contributions from theoretical and applied linguistics to communication disorders.

    PubMed

    Pearson, Barbara Zurer

    2004-02-01

    Three avenues of theoretical research provide insights for discovering abstract properties of language that are subject to disorder and amenable to assessment: (1) the study of universal grammar and its acquisition; (2) descriptions of African American English (AAE) Syntax, Semantics, and Phonology within theoretical linguistics; and (3) the study of specific language impairment (SLI) cross-linguistically. Abstract linguistic concepts were translated into a set of assessment protocols that were used to establish normative data on language acquisition (developmental milestones) in typically developing AAE children ages 4 to 9 years. Testing AAE-speaking language impaired (LI) children and both typically developing (TD) and LI Mainstream American English (MAE)-learning children on these same measures provided the data to select assessments for which (1) TD MAE and AAE children performed the same, and (2) TD performance was reliably different from LI performance in both dialect groups.

  12. The Theoretical and Empirical Basis for Meditation as an Intervention for PTSD

    ERIC Educational Resources Information Center

    Lang, Ariel J.; Strauss, Jennifer L.; Bomyea, Jessica; Bormann, Jill E.; Hickman, Steven D.; Good, Raquel C.; Essex, Michael

    2012-01-01

    In spite of the existence of good empirically supported treatments for posttraumatic stress disorder (PTSD), consumers and providers continue to ask for more options for managing this common and often chronic condition. Meditation-based approaches are being widely implemented, but there is minimal research rigorously assessing their effectiveness.…

  13. Empirical likelihood-based tests for stochastic ordering

    PubMed Central

    BARMI, HAMMOU EL; MCKEAGUE, IAN W.

    2013-01-01

    This paper develops an empirical likelihood approach to testing for the presence of stochastic ordering among univariate distributions based on independent random samples from each distribution. The proposed test statistic is formed by integrating a localized empirical likelihood statistic with respect to the empirical distribution of the pooled sample. The asymptotic null distribution of this test statistic is found to have a simple distribution-free representation in terms of standard Brownian bridge processes. The approach is used to compare the lengths of rule of Roman Emperors over various historical periods, including the “decline and fall” phase of the empire. In a simulation study, the power of the proposed test is found to improve substantially upon that of a competing test due to El Barmi and Mukerjee. PMID:23874142
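The paper's test statistic (a localized empirical likelihood statistic integrated over the pooled sample) is beyond a short sketch, but the underlying notion of stochastic ordering between empirical distributions can be illustrated directly. The sketch below is an assumption-laden toy, not the authors' method: it measures the largest violation of the ordering F_a <= F_b on the pooled sample; the function names and the synthetic "reign length" data are hypothetical.

```python
import numpy as np

def ecdf(sample):
    """Return a step-function empirical CDF for a 1-D sample."""
    xs = np.sort(np.asarray(sample, dtype=float))
    n = xs.size
    return lambda t: np.searchsorted(xs, t, side="right") / n

def max_ordering_violation(sample_a, sample_b):
    """Largest amount by which F_a exceeds F_b on the pooled sample.

    If sample_a is stochastically larger than sample_b, then
    F_a <= F_b everywhere and the returned value is ~0.
    """
    fa, fb = ecdf(sample_a), ecdf(sample_b)
    grid = np.concatenate([sample_a, sample_b])
    return float(np.max(fa(grid) - fb(grid)))

rng = np.random.default_rng(0)
short = rng.exponential(scale=5.0, size=400)  # hypothetical reign lengths
long_ = short + 2.0                           # stochastically larger by construction
print(max_ordering_violation(long_, short))   # ~0: ordering holds
print(max_ordering_violation(short, long_))   # clearly positive: ordering violated
```

A formal test such as the one in the paper additionally supplies a null distribution for a statistic of this kind, which a raw ECDF comparison does not.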

  14. Discovering the Neural Nature of Moral Cognition? Empirical, Theoretical, and Practical Challenges in Bioethical Research with Electroencephalography (EEG).

    PubMed

    Wagner, Nils-Frederic; Chaves, Pedro; Wolff, Annemarie

    2017-06-01

    In this article we critically review the neural mechanisms of moral cognition that have recently been studied via electroencephalography (EEG). Such studies promise to shed new light on traditional moral questions by helping us to understand how effective moral cognition is embodied in the brain. It has been argued that conflicting normative ethical theories require different cognitive features and can, accordingly, in a broadly conceived naturalistic attempt, be associated with different brain processes that are rooted in different brain networks and regions. This potentially morally relevant brain activity has been empirically investigated through EEG-based studies on moral cognition. From neuroscientific evidence gathered in these studies, a variety of normative conclusions have been drawn and bioethical applications have been suggested. We discuss methodological and theoretical merits and demerits of the attempt to use EEG techniques in a morally significant way, point to legal challenges and policy implications, indicate the potential to reveal biomarkers of psychopathological conditions, and consider issues that might inform future bioethical work.

  15. Public Disaster Communication and Child and Family Disaster Mental Health: a Review of Theoretical Frameworks and Empirical Evidence.

    PubMed

    Houston, J Brian; First, Jennifer; Spialek, Matthew L; Sorenson, Mary E; Koch, Megan

    2016-06-01

    Children have been identified as particularly vulnerable to psychological and behavioral difficulties following disaster. Public child and family disaster communication is one public health tool that can be utilized to promote coping/resilience and ameliorate maladaptive child reactions following an event. We conducted a review of the public disaster communication literature and identified three main functions of child and family disaster communication: fostering preparedness, providing psychoeducation, and conducting outreach. Our review also indicates that schools are a promising system for child and family disaster communication. We complete our review with three conclusions. First, theoretically, there appears to be a great opportunity for public disaster communication focused on child disaster reactions. Second, empirical research assessing the effects of public child and family disaster communication is essentially nonexistent. Third, despite the lack of empirical evidence in this area, there is opportunity for public child and family disaster communication efforts that address new domains.

  16. Empirical and Theoretical Aspects of Generation and Transfer of Information in a Neuromagnetic Source Network

    PubMed Central

    Vakorin, Vasily A.; Mišić, Bratislav; Krakovska, Olga; McIntosh, Anthony Randal

    2011-01-01

    Variability in source dynamics across the sources in an activated network may be indicative of how the information is processed within a network. Information-theoretic tools allow one not only to characterize local brain dynamics but also to describe interactions between distributed brain activity. This study follows such a framework and explores the relations between signal variability and asymmetry in mutual interdependencies in a data-driven pipeline of non-linear analysis of neuromagnetic sources reconstructed from human magnetoencephalographic (MEG) data collected as a reaction to a face recognition task. Asymmetry in non-linear interdependencies in the network was analyzed using transfer entropy, which quantifies predictive information transfer between the sources. Variability of the source activity was estimated using multi-scale entropy, quantifying the rate at which information is generated. The empirical results are supported by an analysis of synthetic data based on the dynamics of coupled systems with time delay in coupling. We found that the amount of information transferred from one source to another was correlated with the difference in variability between the dynamics of these two sources, with the directionality of net information transfer depending on the time scale at which the sample entropy was computed. The results based on synthetic data suggest that both time delay and strength of coupling can contribute to the relations between variability of brain signals and information transfer between them. Our findings support the previous attempts to characterize functional organization of the activated brain, based on a combination of non-linear dynamics and temporal features of brain connectivity, such as time delay. PMID:22131968
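Transfer entropy, the directed measure used in the record above, quantifies how much knowing the past of one signal improves prediction of another beyond that signal's own past. The sketch below is a minimal plug-in estimator for discrete (here binary) sequences only; it is not the authors' MEG pipeline, which works on continuous source time series, and the synthetic copy-with-delay example is purely illustrative.

```python
import math
import random
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of TE_{X->Y} in bits for discrete sequences:
    how much x_t improves prediction of y_{t+1} beyond y_t alone."""
    triples = list(zip(y[1:], y[:-1], x[:-1]))   # (y_next, y_past, x_past)
    n = len(triples)
    c_xyz = Counter(triples)
    c_yz = Counter((yn, yp) for yn, yp, _ in triples)
    c_zx = Counter((yp, xp) for _, yp, xp in triples)
    c_z = Counter(yp for _, yp, _ in triples)
    te = 0.0
    for (yn, yp, xp), c in c_xyz.items():
        # p(yn|yp,xp) / p(yn|yp) expressed purely in counts
        te += (c / n) * math.log2((c * c_z[yp]) / (c_yz[(yn, yp)] * c_zx[(yp, xp)]))
    return te

random.seed(1)
x = [random.randint(0, 1) for _ in range(5000)]
y = [0] + x[:-1]               # y copies x with a one-step delay
print(transfer_entropy(x, y))  # close to 1 bit: x drives y
print(transfer_entropy(y, x))  # close to 0: nothing flows back
```

The asymmetry of the two calls is the point: a symmetric measure such as correlation cannot distinguish driver from follower, whereas transfer entropy can.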

  17. Age-related differences in associative memory: Empirical evidence and theoretical perspectives.

    PubMed

    Naveh-Benjamin, Moshe; Mayr, Ulrich

    2018-02-01

    Systematic research and anecdotal evidence both indicate declines in episodic memory in older adults in good health without dementia-related disorders. Several hypotheses have been proposed to explain these age-related changes in episodic memory, some of which attribute such declines to a deterioration in associative memory. The current special issue of Psychology and Aging on Age-Related Differences in Associative Memory includes 16 articles by top researchers in the area of memory and aging. Their contributions provide a wealth of empirical work that addresses different aspects of aging and associative memory, including different mediators and predictors of age-related declines in binding and associative memory, cognitive, noncognitive, genetic, and neuro-related ones. The contributions also address the processing phases where these declines manifest themselves and look at ways to ameliorate these age-related declines. Furthermore, the contributions in this issue draw on different theoretical perspectives to explain age-related changes in associative memory and provide a wealth of varying methodologies to assess older and younger adults' performance. Finally, although most of the studies focus on normative/healthy aging, some of them contain insights that are potentially applicable to disorders and pathologies.

  18. Empirical evidence for site coefficients in building code provisions

    USGS Publications Warehouse

    Borcherdt, R.D.

    2002-01-01

    Site-response coefficients, Fa and Fv, used in U.S. building code provisions are based on empirical data for motions up to 0.1 g. For larger motions they are based on theoretical and laboratory results. The Northridge earthquake of 17 January 1994 provided a significant new set of empirical data up to 0.5 g. These data together with recent site characterizations based on shear-wave velocity measurements provide empirical estimates of the site coefficients at base accelerations up to 0.5 g for Site Classes C and D. These empirical estimates of Fa and Fv, as well as their decrease with increasing base acceleration level, are consistent at the 95 percent confidence level with those in present building code provisions, with the exception of estimates for Fa at levels of 0.1 and 0.2 g, which are less than the lower confidence bound by amounts up to 13 percent. The site-coefficient estimates are consistent at the 95 percent confidence level with those of several other investigators for base accelerations greater than 0.3 g. These consistencies and present code procedures indicate that changes in the site coefficients are not warranted. Empirical results for base accelerations greater than 0.2 g confirm the need for both a short- and a mid- or long-period site coefficient to characterize site response for purposes of estimating site-specific design spectra.

  19. Theoretical models of parental HIV disclosure: a critical review.

    PubMed

    Qiao, Shan; Li, Xiaoming; Stanton, Bonita

    2013-01-01

    This study critically examined three major theoretical models related to parental HIV disclosure (i.e., the Four-Phase Model [FPM], the Disclosure Decision Making Model [DDMM], and the Disclosure Process Model [DPM]), and the existing studies that could provide empirical support to these models or their components. For each model, we briefly reviewed its theoretical background, described its components and/or mechanisms, and discussed its strengths and limitations. The existing empirical studies supported most theoretical components in these models. However, hypotheses related to the mechanisms proposed in the models have not yet been tested due to a lack of empirical evidence. This study also synthesized alternative theoretical perspectives and new issues in disclosure research and clinical practice that may challenge the existing models. The current study underscores the importance of including components related to social and cultural contexts in theoretical frameworks, and calls for more adequately designed empirical studies in order to test and refine existing theories and to develop new ones.

  20. The Safety Culture Enactment Questionnaire (SCEQ): Theoretical model and empirical validation.

    PubMed

    de Castro, Borja López; Gracia, Francisco J; Tomás, Inés; Peiró, José M

    2017-06-01

    This paper presents the Safety Culture Enactment Questionnaire (SCEQ), designed to assess the degree to which safety is an enacted value in the day-to-day running of nuclear power plants (NPPs). The SCEQ is based on a theoretical safety culture model that is manifested in three fundamental components of the functioning and operation of any organization: strategic decisions, human resources practices, and daily activities and behaviors. The extent to which the importance of safety is enacted in each of these three components provides information about the pervasiveness of the safety culture in the NPP. To validate the SCEQ and the model on which it is based, two separate studies were carried out with data collection in 2008 and 2014, respectively. In Study 1, the SCEQ was administered to the employees of two Spanish NPPs (N=533) belonging to the same company. Participants in Study 2 included 598 employees from the same NPPs, who completed the SCEQ and other questionnaires measuring different safety outcomes (safety climate, safety satisfaction, job satisfaction and risky behaviors). Study 1 comprised item formulation and examination of the factorial structure and reliability of the SCEQ. Study 2 tested internal consistency and provided evidence of factorial validity, validity based on relationships with other variables, and discriminant validity between the SCEQ and safety climate. Exploratory Factor Analysis (EFA) carried out in Study 1 revealed a three-factor solution corresponding to the three components of the theoretical model. Reliability analyses showed strong internal consistency for the three scales of the SCEQ, and each of the 21 items on the questionnaire contributed to the homogeneity of its theoretically developed scale. Confirmatory Factor Analysis (CFA) carried out in Study 2 supported the internal structure of the SCEQ; internal consistency of the scales was also supported. Furthermore, the three scales of the SCEQ showed the expected correlation

  1. An empirical investigation of theoretical loss and gambling intensity.

    PubMed

    Auer, Michael; Griffiths, Mark D

    2014-12-01

    Many recent studies of internet gambling-particularly those that have analysed behavioural tracking data-have used variables such as 'bet size' and 'number of games played' as proxy measures for 'gambling intensity'. In this paper it is argued that the most stable and reliable measure for 'gambling intensity' is the 'theoretical loss' (a product of total bet size and house advantage). In the long run, the theoretical loss corresponds with the Gross Gaming Revenue generated by commercial gaming operators. For shorter periods of time, theoretical loss is the most stable measure of gambling intensity as it is not distorted by gamblers' occasional wins. Even for single bets, the theoretical loss reflects the amount a player is willing to risk. Using behavioural tracking data of 100,000 players who played online casino, lottery and/or poker games, this paper also demonstrates that bet size does not equate to or explain theoretical loss as it does not take into account the house advantage. This lack of accuracy is shown to be even more pronounced for gamblers who play a variety of games.
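The central quantity here is a simple product, which a short sketch makes concrete. The house-advantage figures below are hypothetical round numbers chosen for illustration, not values from the paper:

```python
def theoretical_loss(total_bet, house_advantage):
    """Expected amount lost to the operator: total bet size x house edge."""
    return total_bet * house_advantage

# Hypothetical sessions with illustrative house advantages.
sessions = [
    {"game": "lottery",   "total_bet": 100.0,  "house_advantage": 0.50},
    {"game": "blackjack", "total_bet": 1000.0, "house_advantage": 0.005},
]
for s in sessions:
    print(s["game"], theoretical_loss(s["total_bet"], s["house_advantage"]))
# prints: lottery 50.0, blackjack 5.0
```

The lottery player stakes ten times less than the blackjack player yet has a ten times larger theoretical loss, which is exactly why the paper argues that bet size alone is a poor proxy for gambling intensity.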

  2. Evaluation of theoretical and empirical water vapor sorption isotherm models for soils

    NASA Astrophysics Data System (ADS)

    Arthur, Emmanuel; Tuller, Markus; Moldrup, Per; de Jonge, Lis W.

    2016-01-01

    The mathematical characterization of water vapor sorption isotherms of soils is crucial for modeling processes such as volatilization of pesticides and diffusive and convective water vapor transport. Although numerous physically based and empirical models were previously proposed to describe sorption isotherms of building materials, food, and other industrial products, knowledge about the applicability of these functions for soils is noticeably lacking. We present an evaluation of nine models for characterizing adsorption/desorption isotherms for a water activity range from 0.03 to 0.93 based on measured data of 207 soils with widely varying textures, organic carbon contents, and clay mineralogy. In addition, the potential applicability of the models for prediction of sorption isotherms from known clay content was investigated. While in general all investigated models described measured adsorption and desorption isotherms reasonably well, distinct differences were observed between physical and empirical models, partly due to the different degrees of freedom of the model equations. There were also considerable differences in model performance for adsorption and desorption data. While regression analysis relating model parameters and clay content and subsequent model application for prediction of measured isotherms showed promise for the majority of investigated soils, for soils with distinct kaolinitic and smectitic clay mineralogy predicted isotherms did not closely match the measurements.
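The abstract does not name the nine models evaluated, but the Guggenheim-Anderson-de Boer (GAB) equation is one widely used sorption-isotherm form and serves here as an illustration of what such a model looks like. The parameter values below are hypothetical, chosen only to produce a plausible-looking curve over the paper's 0.03-0.93 water activity range:

```python
def gab_water_content(aw, w_m, C, K):
    """Guggenheim-Anderson-de Boer (GAB) isotherm: equilibrium water
    content as a function of water activity aw (0 < K*aw < 1).
    w_m: monolayer capacity; C, K: energy-related fit constants."""
    return w_m * C * K * aw / ((1 - K * aw) * (1 - K * aw + C * K * aw))

# Hypothetical parameters for a clayey soil (illustration only).
w_m, C, K = 0.025, 10.0, 0.75   # kg water / kg soil; dimensionless; dimensionless
for aw in (0.03, 0.3, 0.6, 0.93):
    print(f"a_w = {aw:.2f}  w = {gab_water_content(aw, w_m, C, K):.4f}")
```

Fitting a model of this form to measured isotherms means estimating w_m, C, and K per soil; the regression-to-clay-content idea in the abstract then amounts to predicting those fitted parameters from clay fraction alone.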

  3. Empirical State Error Covariance Matrix for Batch Estimation

    NASA Technical Reports Server (NTRS)

    Frisbee, Joe

    2015-01-01

    State estimation techniques effectively provide mean state estimates. However, the theoretical state error covariance matrices provided as part of these techniques often suffer from a lack of confidence in their ability to describe the uncertainty in the estimated states. By a reinterpretation of the equations involved in the weighted batch least squares algorithm, it is possible to directly arrive at an empirical state error covariance matrix. The proposed empirical state error covariance matrix will contain the effect of all error sources, known or not. This empirical error covariance matrix may be calculated as a side computation for each unique batch solution. Results based on the proposed technique will be presented for a simple two-observer, measurement-error-only problem.
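The paper derives its empirical matrix from the batch equations themselves; the Monte Carlo toy below is not that derivation, only an illustration of the underlying problem it addresses: the formal covariance (H^T W H)^{-1} reflects just the assumed noise model, while a covariance computed from actual estimate errors captures every error source. All model dimensions and noise levels here are made up.

```python
import numpy as np

rng = np.random.default_rng(42)

# Linear measurement model z = H x + noise, solved by weighted batch least squares.
n_meas, n_state = 50, 2
H = np.column_stack([np.ones(n_meas), np.linspace(0.0, 1.0, n_meas)])
sigma_assumed = 0.1                      # noise level the estimator *assumes*
W = np.eye(n_meas) / sigma_assumed**2
formal_cov = np.linalg.inv(H.T @ W @ H)  # theoretical state error covariance

x_true = np.array([1.0, -0.5])
errors = []
for _ in range(2000):
    # Actual noise is 3x larger than modeled -> formal covariance is optimistic.
    z = H @ x_true + rng.normal(0.0, 0.3, size=n_meas)
    x_hat = np.linalg.solve(H.T @ W @ H, H.T @ W @ z)
    errors.append(x_hat - x_true)

empirical_cov = np.cov(np.array(errors).T)  # reflects the errors actually made
print(np.diag(formal_cov))     # optimistic variances
print(np.diag(empirical_cov))  # roughly 9x larger (noise std was 3x the model)
```

The paper's contribution is to obtain an empirical covariance of this character as a side computation of a single batch solution, without requiring the Monte Carlo replication used in this sketch.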

  4. Viscoelastic shear properties of human vocal fold mucosa: theoretical characterization based on constitutive modeling.

    PubMed

    Chan, R W; Titze, I R

    2000-01-01

    The viscoelastic shear properties of human vocal fold mucosa (cover) were previously measured as a function of frequency [Chan and Titze, J. Acoust. Soc. Am. 106, 2008-2021 (1999)], but data were obtained only in a frequency range of 0.01-15 Hz, an order of magnitude below typical frequencies of vocal fold oscillation (on the order of 100 Hz). This study represents an attempt to extrapolate the data to higher frequencies based on two viscoelastic theories, (1) a quasilinear viscoelastic theory widely used for the constitutive modeling of the viscoelastic properties of biological tissues [Fung, Biomechanics (Springer-Verlag, New York, 1993), pp. 277-292], and (2) a molecular (statistical network) theory commonly used for the rheological modeling of polymeric materials [Zhu et al., J. Biomech. 24, 1007-1018 (1991)]. Analytical expressions of elastic and viscous shear moduli, dynamic viscosity, and damping ratio based on the two theories with specific model parameters were applied to curve-fit the empirical data. Results showed that the theoretical predictions matched the empirical data reasonably well, allowing for parametric descriptions of the data and their extrapolations to frequencies of phonation.

  5. Strengthening Theoretical Testing in Criminology Using Agent-based Modeling.

    PubMed

    Johnson, Shane D; Groff, Elizabeth R

    2014-07-01

    The Journal of Research in Crime and Delinquency (JRCD) has published important contributions to both criminological theory and associated empirical tests. In this article, we consider some of the challenges associated with traditional approaches to social science research, and discuss a complementary approach that is gaining popularity, agent-based computational modeling, which may offer new opportunities to strengthen theories of crime and develop insights into phenomena of interest. Two literature reviews are completed. The aim of the first is to identify those articles published in JRCD that have been the most influential and to classify the theoretical perspectives taken. The second is intended to identify those studies that have used an agent-based model (ABM) to examine criminological theories and to identify which theories have been explored. Ecological theories of crime pattern formation have received the most attention from researchers using ABMs, but many other criminological theories are amenable to testing using such methods. Traditional methods of theory development and testing suffer from a number of potential issues that a more systematic use of ABMs, not without its own issues, may help to overcome. ABMs should become another method in the criminologist's toolbox to aid theory testing and falsification.

  6. Mindfulness-based treatment to prevent addictive behavior relapse: theoretical models and hypothesized mechanisms of change.

    PubMed

    Witkiewitz, Katie; Bowen, Sarah; Harrop, Erin N; Douglas, Haley; Enkema, Matthew; Sedgwick, Carly

    2014-04-01

    Mindfulness-based treatments are growing in popularity among addiction treatment providers, and several studies suggest the efficacy of incorporating mindfulness practices into the treatment of addiction, including the treatment of substance use disorders and behavioral addictions (i.e., gambling). The current paper provides a review of theoretical models of mindfulness in the treatment of addiction and several hypothesized mechanisms of change. We provide an overview of mindfulness-based relapse prevention (MBRP), including session content, treatment targets, and client feedback from participants who have received MBRP in the context of empirical studies. Future research directions regarding operationalization and measurement, identifying factors that moderate treatment effects, and protocol adaptations for specific populations are discussed.

  8. Gender differences in the perception and utilization of social support: theoretical perspectives and an empirical test.

    PubMed

    Flaherty, J; Richman, J

    1989-01-01

    The authors contend that women are the more supportive, nurturing and affectively-connected sex. They argue that these gender differences result from socialization experiences which may be modified by social and occupational roles. Theoretical perspectives and research addressing this proposition are reviewed. Empirical data on support-eliciting and support-providing behaviors in a cohort of medical students are then provided to test their thesis. The data suggest that women have developed a greater sensitivity to the needs of themselves and others, leading to a greater capacity to provide support and a greater dependence upon social support for psychological well-being. Personality and developmental factors that may account for these differences are examined. The implications of these findings for gender differences in mental health are discussed.

  9. Empirical projection-based basis-component decomposition method

    NASA Astrophysics Data System (ADS)

    Brendel, Bernhard; Roessl, Ewald; Schlomka, Jens-Peter; Proksa, Roland

    2009-02-01

    Advances in the development of semiconductor based, photon-counting x-ray detectors stimulate research in the domain of energy-resolving pre-clinical and clinical computed tomography (CT). For counting detectors acquiring x-ray attenuation in at least three different energy windows, an extended basis component decomposition can be performed in which in addition to the conventional approach of Alvarez and Macovski a third basis component is introduced, e.g., a gadolinium based CT contrast material. After the decomposition of the measured projection data into the basis component projections, conventional filtered-backprojection reconstruction is performed to obtain the basis-component images. In recent work, this basis component decomposition was obtained by maximizing the likelihood-function of the measurements. This procedure is time consuming and often unstable for excessively noisy data or low intrinsic energy resolution of the detector. Therefore, alternative procedures are of interest. Here, we introduce a generalization of the idea of empirical dual-energy processing published by Stenner et al. to multi-energy, photon-counting CT raw data. Instead of working in the image-domain, we use prior spectral knowledge about the acquisition system (tube spectra, bin sensitivities) to parameterize the line-integrals of the basis component decomposition directly in the projection domain. We compare this empirical approach with the maximum-likelihood (ML) approach considering image noise and image bias (artifacts) and see that only moderate noise increase is to be expected for small bias in the empirical approach. Given the drastic reduction of pre-processing time, the empirical approach is considered a viable alternative to the ML approach.

  10. Empirically Based Play Interventions for Children

    ERIC Educational Resources Information Center

    Reddy, Linda A., Ed.; Files-Hall, Tara M., Ed.; Schaefer, Charles E., Ed.

    2005-01-01

    "Empirically Based Play Interventions for Children" is a compilation of innovative, well-designed play interventions, presented for the first time in one text. Play therapy is the oldest and most popular form of child therapy in clinical practice and is widely considered by practitioners to be uniquely responsive to children's developmental needs.…

  11. Uncovering curvilinear relationships between conscientiousness and job performance: how theoretically appropriate measurement makes an empirical difference.

    PubMed

    Carter, Nathan T; Dalal, Dev K; Boyce, Anthony S; O'Connell, Matthew S; Kung, Mei-Chuan; Delgado, Kristin M

    2014-07-01

    The personality trait of conscientiousness has seen considerable attention from applied psychologists due to its efficacy for predicting job performance across performance dimensions and occupations. However, recent theoretical and empirical developments have questioned the assumption that more conscientiousness always results in better job performance, suggesting a curvilinear link between the two. Despite these developments, the results of studies directly testing the idea have been mixed. Here, we propose this link has been obscured by another pervasive assumption known as the dominance model of measurement: that higher scores on traditional personality measures always indicate higher levels of conscientiousness. Recent research suggests dominance models show inferior fit to personality test scores as compared to ideal point models that allow for curvilinear relationships between traits and scores. Using data from 2 different samples of job incumbents, we show the rank-order changes that result from using an ideal point model expose a curvilinear link between conscientiousness and job performance 100% of the time, whereas results using dominance models show mixed results, similar to the current state of the literature. Finally, with an independent cross-validation sample, we show that selection based on predicted performance using ideal point scores results in more favorable objective hiring outcomes. Implications for practice and future research are discussed.
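The dominance-versus-ideal-point contrast at the heart of this record can be sketched with two toy item response curves. These simple functional forms (a logistic curve and a Gaussian-shaped peak) stand in for the families the paper discusses; all parameters are hypothetical and the specific models the authors fit may differ:

```python
import math

def dominance_response(theta, difficulty=0.0):
    """Dominance (e.g., 2PL-style) model: endorsement probability rises
    monotonically with the trait level theta."""
    return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

def ideal_point_response(theta, item_location=0.0, width=1.0):
    """Ideal point model: endorsement peaks where the respondent's trait
    level matches the item's location, then falls off on both sides."""
    return math.exp(-((theta - item_location) / width) ** 2)

# An extremely conscientious respondent (theta = 3) may *disagree* with a
# moderate item like "I am fairly organized"; the ideal point curve allows
# for that, while the dominance curve cannot.
for theta in (-2.0, 0.0, 3.0):
    print(theta, round(dominance_response(theta), 2),
          round(ideal_point_response(theta), 2))
```

Under the dominance curve, a low endorsement can only mean a low trait level; under the ideal point curve it can also mean a very high one, which is how rank orders (and hence curvilinear trait-performance links) change between the two scoring models.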

  12. How beauty works. Theoretical mechanisms and two empirical applications on students' evaluation of teaching.

    PubMed

    Wolbring, Tobias; Riordan, Patrick

    2016-05-01

    Plenty of studies show that the physical appearance of a person affects a variety of outcomes in everyday life. However, due to an incomplete theoretical explication and empirical problems in disentangling different beauty effects, it is unclear which mechanisms are at work. To clarify how beauty works we present explanations from evolutionary theory and expectation states theory and show where both perspectives differ and where interlinkage appears promising. Using students' evaluations of teaching we find observational and experimental evidence for the different causal pathways of physical attractiveness. First, independent raters strongly agree over the physical attractiveness of a person. Second, attractive instructors receive better student ratings. Third, students attend classes of attractive instructors more frequently - even after controlling for teaching quality. Fourth, we find no evidence that attractiveness effects become stronger if rater and ratee are of the opposite sex. Finally, the beauty premium turns into a penalty if an attractive instructor falls short of students' expectations. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Energy risk in the arbitrage pricing model: an empirical and theoretical study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bremer, M.A.

    1986-01-01

    This dissertation empirically explores the Arbitrage Pricing Theory in the context of energy risk for securities over the 1960s, 1970s, and early 1980s. Starting from a general multifactor pricing model, the paper develops a two-factor model based on a market-like factor and an energy factor. This model is then tested on portfolios of securities grouped according to industrial classification, using several econometric techniques designed to overcome some of the more serious estimation problems common to these models. The paper concludes that energy risk is priced in the 1970s and possibly even in the 1960s. Energy risk is found to be priced in the sense that investors who hold assets subject to energy risk are compensated for this risk. The classic version of the Capital Asset Pricing Model, which posits the market as the single priced factor, is rejected in favor of the Arbitrage Pricing Theory or multi-beta versions of the Capital Asset Pricing Model. The study introduces some original econometric methodology to carry out empirical tests.
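
    The two-factor structure described above can be sketched as a time-series regression of portfolio returns on a market-like factor and an energy factor (all series simulated here; the dissertation's actual econometric techniques for the pricing tests are more involved):

```python
import numpy as np

rng = np.random.default_rng(1)
T = 240                                    # 20 years of monthly returns
mkt = rng.normal(0.006, 0.04, T)           # market-like factor (simulated)
energy = rng.normal(0.0, 0.03, T)          # energy factor (simulated)

# one portfolio with assumed loadings: alpha, market beta, energy beta
true_b = np.array([0.002, 1.1, 0.4])
X = np.column_stack([np.ones(T), mkt, energy])
r = X @ true_b + rng.normal(0, 0.01, T)    # portfolio return series

# OLS estimates of the two factor betas recover the loadings
b_hat, *_ = np.linalg.lstsq(X, r, rcond=None)
print(np.round(b_hat[1:], 1))              # approximately [1.1, 0.4]
```

A pricing test would then ask, in a second cross-sectional stage, whether expected returns rise with the estimated energy beta; a nonzero premium on that beta is what "energy risk is priced" means.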

  14. Subgrade evaluation based on theoretical concepts.

    DOT National Transportation Integrated Search

    1971-01-01

    Evaluations of pavement soil subgrades for the purpose of design are mostly based on empirical methods such as the CBR, California soil resistance method, etc. The need for the application of theory and the evaluation of subgrade strength in terms of...

  15. Bridging process-based and empirical approaches to modeling tree growth

    Treesearch

    Harry T. Valentine; Annikki Makela

    2005-01-01

    The gulf between process-based and empirical approaches to modeling tree growth may be bridged, in part, by the use of a common model. To this end, we have formulated a process-based model of tree growth that can be fitted and applied in an empirical mode. The growth model is grounded in pipe model theory and an optimal control model of crown development. Together, the...

  16. The Equivalence of Information-Theoretic and Likelihood-Based Methods for Neural Dimensionality Reduction

    PubMed Central

    Williamson, Ross S.; Sahani, Maneesh; Pillow, Jonathan W.

    2015-01-01

    Stimulus dimensionality-reduction methods in neuroscience seek to identify a low-dimensional space of stimulus features that affect a neuron’s probability of spiking. One popular method, known as maximally informative dimensions (MID), uses an information-theoretic quantity known as “single-spike information” to identify this space. Here we examine MID from a model-based perspective. We show that MID is a maximum-likelihood estimator for the parameters of a linear-nonlinear-Poisson (LNP) model, and that the empirical single-spike information corresponds to the normalized log-likelihood under a Poisson model. This equivalence implies that MID does not necessarily find maximally informative stimulus dimensions when spiking is not well described as Poisson. We provide several examples to illustrate this shortcoming, and derive a lower bound on the information lost when spiking is Bernoulli in discrete time bins. To overcome this limitation, we introduce model-based dimensionality reduction methods for neurons with non-Poisson firing statistics, and show that they can be framed equivalently in likelihood-based or information-theoretic terms. Finally, we show how to overcome practical limitations on the number of stimulus dimensions that MID can estimate by constraining the form of the non-parametric nonlinearity in an LNP model. We illustrate these methods with simulations and data from primate visual cortex. PMID:25831448

  17. Guiding Empirical and Theoretical Explorations of Organic Matter Decay By Synthesizing Temperature Responses of Enzyme Kinetics, Microbes, and Isotope Fluxes

    NASA Astrophysics Data System (ADS)

    Billings, S. A.; Ballantyne, F.; Lehmeier, C.; Min, K.

    2014-12-01

    Soil organic matter (SOM) transformation rates generally increase with temperature, but whether this is realized depends on soil-specific features. To develop predictive models applicable to all soils, we must understand two key, ubiquitous features of SOM transformation: the temperature sensitivity of myriad enzyme-substrate combinations and temperature responses of microbial physiology and metabolism, in isolation from soil-specific conditions. Predicting temperature responses of production of CO2 vs. biomass is also difficult due to soil-specific features: we cannot know the identity of active microbes nor the substrates they employ. We highlight how recent empirical advances describing SOM decay can help develop theoretical tools relevant across diverse spatial and temporal scales. At a molecular level, temperature effects on purified enzyme kinetics reveal distinct temperature sensitivities of decay of diverse SOM substrates. Such data help quantify the influence of microbial adaptations and edaphic conditions on decay, have permitted computation of the relative availability of carbon (C) and nitrogen (N) liberated upon decay, and can be used with recent theoretical advances to predict changes in mass specific respiration rates as microbes maintain biomass C:N with changing temperature. Enhancing system complexity, we can subject microbes to temperature changes while controlling growth rate and without altering substrate availability or identity of the active population, permitting calculation of variables typically inferred in soils: microbial C use efficiency (CUE) and isotopic discrimination during C transformations. Quantified declines in CUE with rising temperature are critical for constraining model CUE estimates, and known changes in δ13C of respired CO2 with temperature is useful for interpreting δ13C-CO2 at diverse scales. We suggest empirical studies important for advancing knowledge of how microbes respond to temperature, and ideas for theoretical

  18. Whole-body cryotherapy: empirical evidence and theoretical perspectives.

    PubMed

    Bleakley, Chris M; Bieuzen, François; Davison, Gareth W; Costello, Joseph T

    2014-01-01

    Whole-body cryotherapy (WBC) involves short exposures to air temperatures below -100°C. WBC is increasingly accessible to athletes, and is purported to enhance recovery after exercise and facilitate rehabilitation postinjury. Our objective was to review the efficacy and effectiveness of WBC using empirical evidence from controlled trials. We found ten relevant reports; the majority were based on small numbers of active athletes aged less than 35 years. Although WBC produces a large temperature gradient for tissue cooling, the relatively poor thermal conductivity of air prevents significant subcutaneous and core body cooling. There is weak evidence from controlled studies that WBC enhances antioxidant capacity and parasympathetic reactivation, and alters inflammatory pathways relevant to sports recovery. A series of small randomized studies found WBC offers improvements in subjective recovery and muscle soreness following metabolic or mechanical overload, but little benefit towards functional recovery. There is evidence from one study only that WBC may assist rehabilitation for adhesive capsulitis of the shoulder. There were no adverse events associated with WBC; however, studies did not seem to undertake active surveillance of predefined adverse events. Until further research is available, athletes should remain cognizant that less expensive modes of cryotherapy, such as local ice-pack application or cold-water immersion, offer comparable physiological and clinical effects to WBC.

  19. Counselor Training: Empirical Findings and Current Approaches

    ERIC Educational Resources Information Center

    Buser, Trevor J.

    2008-01-01

    The literature on counselor training has included attention to cognitive and interpersonal skill development and has reported on empirical findings regarding the relationship of training with client outcomes. This article reviews the literature on each of these topics and discusses empirical and theoretical underpinnings of recently developed…

  20. Semi-empirical and empirical L X-ray production cross sections for elements with 50 ⩽ Z ⩽ 92 for protons of 0.5–3.0 MeV

    NASA Astrophysics Data System (ADS)

    Nekab, M.; Kahoul, A.

    2006-04-01

    We present in this contribution semi-empirical production cross sections of the main X-ray lines Lα, Lβ and Lγ for elements from Sn to U and for protons with energies varying from 0.5 to 3.0 MeV. The theoretical X-ray production cross sections are first calculated from the theoretical ionization cross sections of the Li (i = 1, 2, 3) subshells within the ECPSSR theory. The semi-empirical Lα, Lβ and Lγ cross sections are then deduced by fitting the available experimental data normalized to their corresponding theoretical values, and give a better representation of the experimental data in some cases. On the other hand, the experimental data are directly fitted to deduce the empirical L X-ray production cross sections. A comparison is made between the semi-empirical cross sections and the empirical cross sections reported in this work, the empirical ones reported by Reis and Jesus [M.A. Reis, A.P. Jesus, Atom. Data Nucl. Data Tables 63 (1996) 1], and those of Strivay and Weber [Strivay, G. Weber, Nucl. Instr. and Meth. B 190 (2002) 112].

  1. On the Hilbert-Huang Transform Theoretical Foundation

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Blank, Karin; Huang, Norden E.

    2004-01-01

    The Hilbert-Huang Transform (HHT) is a novel empirical method for spectrum analysis of non-linear and non-stationary signals. The HHT is a recent development, and much remains to be done to establish the theoretical foundation of the HHT algorithms. This paper develops the theoretical foundation for the convergence of the HHT sifting algorithm and proves that the finest spectrum scale will always be the first generated by the HHT Empirical Mode Decomposition (EMD) algorithm. The theoretical foundation for splitting a set of extrema data points into two parts is also developed. This in turn allows parallel signal processing for the computationally complex HHT sifting algorithm and its optimization in hardware.
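
    A minimal sketch of one EMD sifting iteration, the step whose convergence the paper analyzes (simulated signal; real EMD iterates this step to a stopping criterion and uses more careful end-point handling than the simple endpoint padding assumed here):

```python
import numpy as np
from scipy.signal import argrelextrema
from scipy.interpolate import CubicSpline

def sift_once(t, x):
    """One sifting step: subtract the mean of the upper and lower
    cubic-spline envelopes drawn through the local extrema."""
    maxima = argrelextrema(x, np.greater)[0]
    minima = argrelextrema(x, np.less)[0]
    # include the endpoints so the splines span the whole record
    up_idx = np.r_[0, maxima, len(x) - 1]
    lo_idx = np.r_[0, minima, len(x) - 1]
    upper = CubicSpline(t[up_idx], x[up_idx])(t)
    lower = CubicSpline(t[lo_idx], x[lo_idx])(t)
    return x - (upper + lower) / 2.0

t = np.linspace(0.0, 2.0, 2000)
fast = np.sin(2 * np.pi * 10 * t)        # finest scale, extracted first
slow = 0.5 * np.sin(2 * np.pi * 1 * t)   # coarser underlying oscillation
h = sift_once(t, fast + slow)

# the first sift already isolates mostly the finest-scale component,
# consistent with the paper's finest-scale-first result
print(np.corrcoef(h, fast)[0, 1] > 0.9)
```

Because each sift works only on local extrema, splitting the extrema set into two parts (as the paper proposes) lets the spline fits run in parallel on each half.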

  2. Coaching and guidance with patient decision aids: A review of theoretical and empirical evidence

    PubMed Central

    2013-01-01

    Background Coaching and guidance are structured approaches that can be used within or alongside patient decision aids (PtDAs) to facilitate the process of decision making. Coaching is provided by an individual, and guidance is embedded within the decision support materials. The purpose of this paper is to: a) present updated definitions of the concepts “coaching” and “guidance”; b) present an updated summary of current theoretical and empirical insights into the roles played by coaching/guidance in the context of PtDAs; and c) highlight emerging issues and research opportunities in this aspect of PtDA design. Methods We identified literature published since 2003 on shared decision making theoretical frameworks inclusive of coaching or guidance. We also conducted a sub-analysis of randomized controlled trials included in the 2011 Cochrane Collaboration Review of PtDAs with search results updated to December 2010. The sub-analysis was conducted on the characteristics of coaching and/or guidance included in any trial of PtDAs and trials that allowed the impact of coaching and/or guidance with PtDA to be compared to another intervention or usual care. Results Theoretical evidence continues to justify the use of coaching and/or guidance to better support patients in the process of thinking about a decision and in communicating their values/preferences with others. In 98 randomized controlled trials of PtDAs, 11 trials (11.2%) included coaching and 63 trials (64.3%) provided guidance. Compared to usual care, coaching provided alongside a PtDA improved knowledge and decreased mean costs. The impact on some other outcomes (e.g., participation in decision making, satisfaction, option chosen) was more variable, with some trials showing positive effects and other trials reporting no differences. For values-choice agreement, decisional conflict, adherence, and anxiety there were no differences between groups. None of these outcomes were worse when patients were exposed

  3. Empirical approaches to metacommunities: a review and comparison with theory.

    PubMed

    Logue, Jürg B; Mouquet, Nicolas; Peter, Hannes; Hillebrand, Helmut

    2011-09-01

    Metacommunity theory has advanced understanding of how spatial dynamics and local interactions shape community structure and biodiversity. Here, we review empirical approaches to metacommunities, both observational and experimental, pertaining to how well they relate to and test theoretical metacommunity paradigms and how well they capture the realities of natural ecosystems. First, we show that the species-sorting and mass-effects paradigms are the most commonly tested and supported paradigms. Second, the dynamics observed can often be ascribed to two or more of the four non-exclusive paradigms. Third, empirical approaches relate only weakly to the concise assumptions and predictions made by the paradigms. Consequently, we suggest major avenues of improvement for empirical metacommunity approaches, including the integration across theoretical approaches and the incorporation of evolutionary and meta-ecosystem dynamics. We hope for metacommunity ecology to thereby bridge existing gaps between empirical and theoretical work, thus becoming a more powerful framework to understand dynamics across ecosystems. Copyright © 2011 Elsevier Ltd. All rights reserved.

  4. What is the danger of the anomaly zone for empirical phylogenetics?

    PubMed

    Huang, Huateng; Knowles, L Lacey

    2009-10-01

    The increasing number of observations of gene trees with discordant topologies in phylogenetic studies has raised awareness about the problems of incongruence between species trees and gene trees. Moreover, theoretical treatments focusing on the impact of coalescent variance on phylogenetic study have also identified situations where the most probable gene trees are ones that do not match the underlying species tree (i.e., anomalous gene trees [AGTs]). However, although the theoretical proof of the existence of AGTs is alarming, the actual risk that AGTs pose to empirical phylogenetic study is far from clear. Establishing the conditions (i.e., the branch lengths in a species tree) for which AGTs are possible does not address the critical issue of how prevalent they might be. Furthermore, theoretical characterization of the species trees for which AGTs may pose a problem (i.e., the anomaly zone or the species histories for which AGTs are theoretically possible) is based on consideration of just one source of variance that contributes to species tree and gene tree discord-gene lineage coalescence. Yet, empirical data contain another important stochastic component-mutational variance. Estimated gene trees will differ from the underlying gene trees (i.e., the actual genealogy) because of the random process of mutation. Here, we take a simulation approach to investigate the prevalence of AGTs, among estimated gene trees, thereby characterizing the boundaries of the anomaly zone taking into account both coalescent and mutational variances. We also determine the frequency of realized AGTs, which is critical to putting the theoretical work on AGTs into a realistic biological context. Two salient results emerge from this investigation. First, our results show that mutational variance can indeed expand the parameter space (i.e., the relative branch lengths in a species tree) where AGTs might be observed in empirical data. By exploring the underlying cause for the expanded

  5. On precipitation monitoring with theoretical statistical distributions

    NASA Astrophysics Data System (ADS)

    Cindrić, Ksenija; Juras, Josip; Pasarić, Zoran

    2018-04-01

    A common practice in meteorological drought monitoring is to transform the observed precipitation amounts to the standardised precipitation index (SPI). Though the gamma distribution is usually employed for this purpose, other distributions may be used, particularly in regions where zero precipitation amounts are recorded frequently. In this study, two distributions are considered alongside the gamma distribution: the compound Poisson exponential distribution (CPE) and the square root normal distribution (SRN). They are fitted to monthly precipitation amounts measured at 24 stations in Croatia over the 55-year period 1961-2015. At five stations, long-term series (1901-2015) are available and have been used for a more detailed investigation. How well the theoretical distributions fit the empirical ones is tested by comparing the corresponding empirical and theoretical ratios of the skewness and the coefficient of variation. Furthermore, following the common approach to precipitation monitoring (CLIMAT reports), the empirical and theoretical quintiles in the two periods (1961-1990 and 1991-2015) are compared. The results of the present study reveal that it would be more appropriate to implement theoretical distributions in such climate reports, since they provide a better evaluation for monitoring purposes than the current empirical distribution. Nevertheless, deciding on an optimal theoretical distribution for different climate regimes and different time periods is not easy to accomplish. With regard to Croatian stations (covering different climate regimes), the CPE or SRN distribution could also be the right choice in climatological practice, in addition to the gamma distribution.
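
    The standard gamma-based SPI transformation mentioned above can be sketched as follows (simulated precipitation series; real station data, zero-precipitation handling, and the CPE and SRN alternatives studied in the paper are beyond this sketch):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# 55 years of monthly precipitation totals for one calendar month (mm),
# drawn here from a gamma distribution for illustration
precip = rng.gamma(shape=2.0, scale=40.0, size=55)

# fit a gamma distribution with location fixed at zero
a, loc, scale = stats.gamma.fit(precip, floc=0)

# SPI: map each amount's fitted cumulative probability onto the
# standard normal quantile scale
cdf = stats.gamma.cdf(precip, a, loc=loc, scale=scale)
spi = stats.norm.ppf(cdf)

# the transformed index is approximately standard normal, so SPI < -1
# flags dry months comparably across stations and seasons
print(round(float(spi.mean()), 2), round(float(spi.std()), 2))
```

Swapping the fitted distribution (e.g. for a compound Poisson exponential at stations with many zero months) changes only the `cdf` step; the normal-quantile mapping stays the same.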

  6. Empirical analysis of storm-time energetic electron enhancements

    NASA Astrophysics Data System (ADS)

    O'Brien, Thomas Paul, III

    This Ph.D. thesis documents a program for studying the appearance of energetic electrons in the Earth's outer radiation belts that is associated with many geomagnetic storms. The dynamic evolution of the electron radiation belts is an outstanding empirical problem in both theoretical space physics and its applied sibling, space weather. The project emphasizes the development of empirical tools and their use in testing several theoretical models of the energization of the electron belts. First, I develop the Statistical Asynchronous Regression technique to provide proxy electron fluxes throughout the parts of the radiation belts explored by geosynchronous and GPS spacecraft. Next, I show that a theoretical adiabatic model can relate the local time asymmetry of the proxy geosynchronous fluxes to the asymmetry of the geomagnetic field. Then, I perform a superposed epoch analysis on the proxy fluxes at local noon to identify magnetospheric and interplanetary precursors of relativistic electron enhancements. Finally, I use statistical and neural network phase space analyses to determine the hourly evolution of flux at a virtual stationary monitor. The dynamic equation quantitatively identifies the importance of different drivers of the electron belts. This project provides empirical constraints on theoretical models of electron acceleration.

  7. Whole-body cryotherapy: empirical evidence and theoretical perspectives

    PubMed Central

    Bleakley, Chris M; Bieuzen, François; Davison, Gareth W; Costello, Joseph T

    2014-01-01

    Whole-body cryotherapy (WBC) involves short exposures to air temperatures below −100°C. WBC is increasingly accessible to athletes, and is purported to enhance recovery after exercise and facilitate rehabilitation postinjury. Our objective was to review the efficacy and effectiveness of WBC using empirical evidence from controlled trials. We found ten relevant reports; the majority were based on small numbers of active athletes aged less than 35 years. Although WBC produces a large temperature gradient for tissue cooling, the relatively poor thermal conductivity of air prevents significant subcutaneous and core body cooling. There is weak evidence from controlled studies that WBC enhances antioxidant capacity and parasympathetic reactivation, and alters inflammatory pathways relevant to sports recovery. A series of small randomized studies found WBC offers improvements in subjective recovery and muscle soreness following metabolic or mechanical overload, but little benefit towards functional recovery. There is evidence from one study only that WBC may assist rehabilitation for adhesive capsulitis of the shoulder. There were no adverse events associated with WBC; however, studies did not seem to undertake active surveillance of predefined adverse events. Until further research is available, athletes should remain cognizant that less expensive modes of cryotherapy, such as local ice-pack application or cold-water immersion, offer comparable physiological and clinical effects to WBC. PMID:24648779

  8. Agent-Based Models in Empirical Social Research

    ERIC Educational Resources Information Center

    Bruch, Elizabeth; Atwell, Jon

    2015-01-01

    Agent-based modeling has become increasingly popular in recent years, but there is still no codified set of recommendations or practices for how to use these models within a program of empirical research. This article provides ideas and practical guidelines drawn from sociology, biology, computer science, epidemiology, and statistics. We first…

  9. Evidence-based ethics? On evidence-based practice and the "empirical turn" from normative bioethics

    PubMed Central

    Goldenberg, Maya J

    2005-01-01

    Background The increase in empirical methods of research in bioethics over the last two decades is typically perceived as a welcomed broadening of the discipline, with increased integration of social and life scientists into the field and ethics consultants into the clinical setting, however it also represents a loss of confidence in the typical normative and analytic methods of bioethics. Discussion The recent incipiency of "Evidence-Based Ethics" attests to this phenomenon and should be rejected as a solution to the current ambivalence toward the normative resolution of moral problems in a pluralistic society. While "evidence-based" is typically read in medicine and other life and social sciences as the empirically-adequate standard of reasonable practice and a means for increasing certainty, I propose that the evidence-based movement in fact gains consensus by displacing normative discourse with aggregate or statistically-derived empirical evidence as the "bottom line". Therefore, along with wavering on the fact/value distinction, evidence-based ethics threatens bioethics' normative mandate. The appeal of the evidence-based approach is that it offers a means of negotiating the demands of moral pluralism. Rather than appealing to explicit values that are likely not shared by all, "the evidence" is proposed to adjudicate between competing claims. Quantified measures are notably more "neutral" and democratic than liberal markers like "species normal functioning". Yet the positivist notion that claims stand or fall in light of the evidence is untenable; furthermore, the legacy of positivism entails the quieting of empirically non-verifiable (or at least non-falsifiable) considerations like moral claims and judgments. As a result, evidence-based ethics proposes to operate with the implicit normativity that accompanies the production and presentation of all biomedical and scientific facts unchecked. Summary The "empirical turn" in bioethics signals a need for

  10. Coping, acculturation, and psychological adaptation among migrants: a theoretical and empirical review and synthesis of the literature

    PubMed Central

    Kuo, Ben C.H.

    2014-01-01

    Given the continuous, dynamic demographic changes internationally due to intensive worldwide migration and globalization, the need to more fully understand how migrants adapt and cope with acculturation experiences in their new host cultural environment is imperative and timely. However, a comprehensive review of what we currently know about the relationship between coping behavior and acculturation experience for individuals undergoing cultural changes has not yet been undertaken. Hence, the current article aims to compile, review, and examine cumulative cross-cultural psychological research that sheds light on the relationships among coping, acculturation, and psychological and mental health outcomes for migrants. To this end, this present article reviews prevailing literature pertaining to: (a) the stress and coping conceptual perspective of acculturation; (b) four theoretical models of coping, acculturation and cultural adaptation; (c) differential coping pattern among diverse acculturating migrant groups; and (d) the relationship between coping variabilities and acculturation levels among migrants. In terms of theoretical understanding, this review points to the relative strengths and limitations associated with each of the four theoretical models on coping-acculturation-adaptation. These theories and the empirical studies reviewed in this article further highlight the central role of coping behaviors/strategies in the acculturation process and outcome for migrants and ethnic populations, both conceptually and functionally. Moreover, the review shows that across studies culturally preferred coping patterns exist among acculturating migrants and migrant groups and vary with migrants' acculturation levels. Implications and limitations of the existing literature for coping, acculturation, and psychological adaptation research are discussed and recommendations for future research are put forth. PMID:25750766

  11. Innovation in Information Technology: Theoretical and Empirical Study in SMQR Section of Export Import in Automotive Industry

    NASA Astrophysics Data System (ADS)

    Edi Nugroho Soebandrija, Khristian; Pratama, Yogi

    2014-03-01

    This paper aims to provide both a theoretical and an empirical study of innovation in information technology. Specifically, both aspects relate to Shortage Mispacking Quality Report (SMQR) claims in export and import in the automotive industry. The paper discusses the major aspects of innovation, information technology, performance, and competitive advantage. The empirical study of PT. Astra Honda Motor (AHM) covers SMQR claims, communication systems, and systems analysis and design. Both the major aspects and the empirical study are introduced briefly in the Introduction and discussed in more detail in later sections, in particular in the Literature Review, in terms of classical and current references. The increase in SMQR claims and communication problems at PT. Astra Daihatsu Motor (PT. ADM), which still relied on email, lengthened claim settlement times and ultimately led suppliers to reject SMQR claims. To address this problem, an integrated communication system was designed to manage the SMQR claim communication process between PT. ADM and its suppliers. The system that was analyzed and designed is expected to streamline the claim communication process so that it runs according to procedure, meets claim settlement time targets, and eliminates the difficulties and problems of the previous manual, email-based communication. The system was designed using the systems development life cycle approach of Kendall & Kendall (2006), covering the SMQR problem communication process, the supplier judgment process, the claim process, the claim payment process, and the claim monitoring process. After an appropriate system design for managing SMQR claims was obtained, the system was implemented and the resulting improvement in claim communication

  12. Why Psychology Cannot be an Empirical Science.

    PubMed

    Smedslund, Jan

    2016-06-01

    The current empirical paradigm for psychological research is criticized because it ignores the irreversibility of psychological processes, the infinite number of influential factors, the pseudo-empirical nature of many hypotheses, and the methodological implications of social interactivity. An additional point is that the differences and correlations usually found are much too small to be useful in psychological practice and in daily life. Together, these criticisms imply that an objective, accumulative, empirical and theoretical science of psychology is an impossible project.

  13. Chronic Fatigue Syndrome and Myalgic Encephalomyelitis: Toward An Empirical Case Definition

    PubMed Central

    Jason, Leonard A.; Kot, Bobby; Sunnquist, Madison; Brown, Abigail; Evans, Meredyth; Jantke, Rachel; Williams, Yolonda; Furst, Jacob; Vernon, Suzanne D.

    2015-01-01

    Current case definitions of Myalgic Encephalomyelitis (ME) and chronic fatigue syndrome (CFS) have been based on consensus methods, but empirical methods could be used to identify core symptoms and thereby improve the reliability. In the present study, several methods (i.e., continuous scores of symptoms, theoretically and empirically derived cut off scores of symptoms) were used to identify core symptoms best differentiating patients from controls. In addition, data mining with decision trees was conducted. Our study found a small number of core symptoms that have good sensitivity and specificity, and these included fatigue, post-exertional malaise, a neurocognitive symptom, and unrefreshing sleep. Outcomes from these analyses suggest that using empirically selected symptoms can help guide the creation of a more reliable case definition. PMID:26029488

  14. Why it is hard to find genes associated with social science traits: theoretical and empirical considerations.

    PubMed

    Chabris, Christopher F; Lee, James J; Benjamin, Daniel J; Beauchamp, Jonathan P; Glaeser, Edward L; Borst, Gregoire; Pinker, Steven; Laibson, David I

    2013-10-01

    We explain why traits of interest to behavioral scientists may have a genetic architecture featuring hundreds or thousands of loci with tiny individual effects rather than a few with large effects and why such an architecture makes it difficult to find robust associations between traits and genes. We conducted a genome-wide association study at 2 sites, Harvard University and Union College, measuring more than 100 physical and behavioral traits with a sample size typical of candidate gene studies. We evaluated predictions that alleles with large effect sizes would be rare and most traits of interest to social science are likely characterized by a lack of strong directional selection. We also carried out a theoretical analysis of the genetic architecture of traits based on R.A. Fisher's geometric model of natural selection and empirical analyses of the effects of selection bias and phenotype measurement stability on the results of genetic association studies. Although we replicated several known genetic associations with physical traits, we found only 2 associations with behavioral traits that met the nominal genome-wide significance threshold, indicating that physical and behavioral traits are mainly affected by numerous genes with small effects. The challenge for social science genomics is the likelihood that genes are connected to behavioral variation by lengthy, nonlinear, interactive causal chains, and unraveling these chains requires allying with personal genomics to take advantage of the potential for large sample sizes as well as continuing with traditional epidemiological studies.

  15. AGENT-BASED MODELS IN EMPIRICAL SOCIAL RESEARCH*

    PubMed Central

    Bruch, Elizabeth; Atwell, Jon

    2014-01-01

    Agent-based modeling has become increasingly popular in recent years, but there is still no codified set of recommendations or practices for how to use these models within a program of empirical research. This article provides ideas and practical guidelines drawn from sociology, biology, computer science, epidemiology, and statistics. We first discuss the motivations for using agent-based models in both basic science and policy-oriented social research. Next, we provide an overview of methods and strategies for incorporating data on behavior and populations into agent-based models, and review techniques for validating and testing the sensitivity of agent-based models. We close with suggested directions for future research. PMID:25983351

  16. Nursing management of sensory overload in psychiatry – Theoretical densification and modification of the framework model

    PubMed

    Scheydt, Stefan; Needham, Ian; Behrens, Johann

    2017-01-01

    Background: Within the scope of a research project on sensory overload and stimulus regulation, a theoretical framework model of the nursing care of patients with sensory overload in psychiatry was developed. In a second step, this model was to be theoretically condensed and, if necessary, modified. Aim: Empirical verification as well as modification, extension, and theoretical condensation of the framework model of the nursing care of patients with sensory overload in psychiatry. Method: Analysis of 8 expert interviews using summarizing and structuring content-analysis methods based on Meuser and Nagel (2009) and Mayring (2010). Results: The framework model (Scheydt et al., 2016b) was empirically verified, theoretically condensed, and extended by one category (perception modulation). Four categories of the nursing care of patients with sensory overload in inpatient psychiatry can thus be described: removal from stimuli, modulation of environmental factors, perception modulation, and support for self-help and coping. Conclusions: With this methodological approach, a relatively well-saturated, credible conceptualization of a theoretical model describing the nursing care of patients with sensory overload in inpatient psychiatry could be worked out. In further steps, these measures have to be developed further, implemented, and evaluated with regard to their efficacy.

  17. The Influence of Education and Socialization on Radicalization: An Exploration of Theoretical Presumptions and Empirical Research.

    PubMed

    Pels, Trees; de Ruyter, Doret J

    2012-06-01

    BACKGROUND AND OBJECTIVE: Research into radicalization does not pay much attention to education. This is remarkable and possibly misses an important influence on the process of radicalization. Therefore this article sets out to explore the relation between education on the one hand and the onset or prevention of radicalization on the other hand. METHOD: This article is a theoretical literature review. It has analyzed empirical studies-mainly from European countries-about the educational aims, content and style of Muslim parents and parents with (extreme) right-wing sympathies. RESULTS: Research examining similarity in right-wing sympathies between parents and children yields mixed results, but studies among adolescents point to a significant concordance. Research also showed that authoritarian parenting may play a significant role. Similar research among Muslim families was not found. While raising children with distrust and an authoritarian style are prevalent, the impact on adolescents has not been investigated. The empirical literature we reviewed does not give sufficient evidence to conclude that a democratic ideal and an authoritative style of education are conducive to the development of a democratic attitude. CONCLUSION: There is a knowledge gap with regard to the influence of education on the onset or the prevention of radicalization. Schools and families are underappreciated sources of informal social control and social capital and therefore the gap should be closed. If there is a better understanding of the effect of education, policy as well as interventions can be developed to assist parents and teachers in preventing radicalization.

  18. [Settings-based prevention of overweight in childhood and adolescents : Theoretical foundation, determinants and intervention planning].

    PubMed

    Quilling, Eike; Dadaczynski, Kevin; Müller, Merle

    2016-11-01

    Childhood and adolescent overweight can still be seen as a global public health problem. Based on a socio-ecological understanding, overweight is the result of a complex interplay of a diverse array of factors acting on different levels. Hence, in addition to individual-level determinants, overweight prevention should also address environment-related factors as part of a holistic and integrated setting approach. This paper aims to discuss the setting approach with regard to overweight prevention in childhood and adolescence. In addition to a summary of environmental factors and their empirical influence on the determinants of overweight, theoretical approaches and planning models of settings-based overweight prevention are discussed. While settings can be characterized as specific social-spatial subsystems (e. g. kindergartens, schools), living environments relate to complex subject-oriented environments that may include various subsystems. Direct social contexts, educational contexts, and community contexts, as the systems relevant for young people, contain different evidence-based influences that need to be taken into account in settings-based overweight prevention. To support theory-driven intervention, numerous planning models exist, which are presented here. Given the strengthening of environments for health within the prevention law, the underlying settings approach also needs further development with regard to overweight prevention. This includes improving the theoretical foundation by aligning intervention practice with planning models, which would also improve the ability to measure success.

  19. Establishing evidence-based training in cognitive behavioral therapy: A review of current empirical findings and theoretical guidance.

    PubMed

    Rakovshik, Sarah G; McManus, Freda

    2010-07-01

    Cognitive behavior therapy's (CBT) demonstrated efficacy has prompted calls for its increased dissemination to routine clinical practice settings. For the widespread dissemination of CBT to be successful in achieving effects similar to the original efficacy trials, there must also be effective dissemination of CBT training practices. However, as yet, CBT training is not evidence-based. This review examines what can be learned from existing research into the efficacy and effectiveness of CBT training. Due to the paucity of research specifically investigating CBT training, CBT effectiveness and dissemination studies are also examined to glean information about potentially effective training practices. In order to draw conclusions about effective training practices, comparisons are drawn between studies according to the clinical outcomes that they achieved. Training approaches are compared according to dose and active training elements, and theoretical models of learning are applied to interpret the findings. The limitations of the existing literature are discussed, as well as recommendations for improving training research to meet the standards evident in treatment trials (e.g., random allocation, control conditions, self-report and blind assessment, and adherence monitoring). Finally, the process of developing efficacious CBT treatment protocols is offered as a template for developing evidence-based CBT training protocols. 2010 Elsevier Ltd. All rights reserved.

  20. Quantifying multi-dimensional functional trait spaces of trees: empirical versus theoretical approaches

    NASA Astrophysics Data System (ADS)

    Ogle, K.; Fell, M.; Barber, J. J.

    2016-12-01

    Empirical, field studies of plant functional traits have revealed important trade-offs among pairs or triplets of traits, such as the leaf (LES) and wood (WES) economics spectra. Trade-offs include correlations between leaf longevity (LL) vs specific leaf area (SLA), LL vs mass-specific leaf respiration rate (RmL), SLA vs RmL, and resistance to breakage vs wood density. Ordination analyses (e.g., PCA) show groupings of traits that tend to align with different life-history strategies or taxonomic groups. It is unclear, however, what underlies such trade-offs and emergent spectra. Do they arise from inherent physiological constraints on growth, or are they more reflective of environmental filtering? The relative importance of these mechanisms has implications for predicting biogeochemical cycling, which is influenced by trait distributions of the plant community. We address this question using an individual-based model of tree growth (ACGCA) to quantify the theoretical trait space of trees that emerges from physiological constraints. ACGCA's inputs include 32 physiological, anatomical, and allometric traits, many of which are related to the LES and WES. We fit ACGCA to 1.6 million USFS FIA observations of tree diameters and heights to obtain vectors of trait values that produce realistic growth, and we explored the structure of this trait space. No notable correlations emerged among the 496 trait pairs, but stepwise regressions revealed complicated multi-variate structure: e.g., relationships between pairs of traits (e.g., RmL and SLA) are governed by other traits (e.g., LL, radiation-use efficiency [RUE]). We also simulated growth under various canopy gap scenarios that impose varying degrees of environmental filtering to explore the multi-dimensional trait space (hypervolume) of trees that died vs survived. The centroid and volume of the hypervolumes differed among dead and live trees, especially under gap conditions leading to low mortality. Traits most predictive

  1. Attachment-Based Family Therapy: A Review of the Empirical Support.

    PubMed

    Diamond, Guy; Russon, Jody; Levy, Suzanne

    2016-09-01

    Attachment-based family therapy (ABFT) is an empirically supported treatment designed to capitalize on the innate, biological desire for meaningful and secure relationships. The therapy is grounded in attachment theory and provides an interpersonal, process-oriented, trauma-focused approach to treating adolescent depression, suicidality, and trauma. Although a process-oriented therapy, ABFT offers a clear structure and road map to help therapists quickly address attachment ruptures that lie at the core of family conflict. Several clinical trials and process studies have demonstrated empirical support for the model and its proposed mechanism of change. This article provides an overview of the clinical model and the existing empirical support for ABFT. © 2016 Family Process Institute.

  2. Diversifying Theory and Science: Expanding the Boundaries of Empirically Supported Interventions in School Psychology.

    ERIC Educational Resources Information Center

    Kratochwill, Thomas R.; Stoiber, Karen Callan

    2000-01-01

    The developmental psychopathology principles advanced in Hughes's target article can be useful for promoting the development, evaluation, and application of empirically supported interventions (ESIs), but embracing a pathology-oriented framework is limiting given the diversity of theoretical approaches relevant to school-based ESIs. Argues that in order…

  3. The Impact of Training for Empirically Based Practice.

    ERIC Educational Resources Information Center

    Simons, Ronald

    1987-01-01

    Consistent with previous research, a course to teach empirically based practice produced a short-term effect on students which dissipated in the months following graduation. Long-term impact was related to the attitudes of co-workers and the type of agency. Strategies are suggested for countering those negative influences. (Author/MH)

  4. GIS Teacher Training: Empirically-Based Indicators of Effectiveness

    ERIC Educational Resources Information Center

    Höhnle, Steffen; Fögele, Janis; Mehren, Rainer; Schubert, Jan Christoph

    2016-01-01

    In spite of various actions, the implementation of GIS (geographic information systems) in German schools is still very low. In the presented research, teaching experts as well as teaching novices were presented with empirically based constraints for implementation stemming from an earlier survey. In the process of various group discussions, the…

  5. Assessing differential expression in two-color microarrays: a resampling-based empirical Bayes approach.

    PubMed

    Li, Dongmei; Le Pape, Marc A; Parikh, Nisha I; Chen, Will X; Dye, Timothy D

    2013-01-01

    Microarrays are widely used for examining differential gene expression, identifying single nucleotide polymorphisms, and detecting methylation loci. Multiple testing methods in microarray data analysis aim at controlling both Type I and Type II error rates; however, real microarray data do not always fit their distribution assumptions. Smyth's ubiquitous parametric method, for example, inadequately accommodates violations of normality assumptions, resulting in inflated Type I error rates. The Significance Analysis of Microarrays, another widely used microarray data analysis method, is based on a permutation test and is robust to non-normally distributed data; however, its fold-change criterion is problematic and can critically alter the conclusion of a study as a result of compositional changes of the control data set in the analysis. We propose a novel approach, combining resampling with empirical Bayes methods: the Resampling-based empirical Bayes Methods. This approach not only reduces false discovery rates for non-normally distributed microarray data, but it is also impervious to the fold-change threshold, since no control data set selection is needed. Through simulation studies, sensitivities, specificities, total rejections, and false discovery rates are compared across Smyth's parametric method, the Significance Analysis of Microarrays, and the Resampling-based empirical Bayes Methods. Differences in false discovery rate control among the approaches are illustrated through a preterm delivery methylation study. The results show that the Resampling-based empirical Bayes Methods offer significantly higher specificity and lower false discovery rates compared to Smyth's parametric method when data are not normally distributed. The Resampling-based empirical Bayes Methods also offer higher statistical power than the Significance Analysis of Microarrays method when the proportion of significantly differentially
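
    The permutation ingredient that makes the Significance Analysis of Microarrays robust to non-normality, combined with a standard Benjamini-Hochberg false-discovery-rate adjustment, can be sketched in a few lines. This is a generic illustration, not the authors' Resampling-based empirical Bayes Methods; the data values and function names are hypothetical.

    ```python
    import random
    from statistics import mean

    def perm_pvalue(group_a, group_b, n_perm=2000, rng=None):
        """Two-sided permutation p-value for a difference in group means."""
        rng = rng or random.Random(0)
        observed = abs(mean(group_a) - mean(group_b))
        pooled, n_a = group_a + group_b, len(group_a)
        hits = 0
        for _ in range(n_perm):
            rng.shuffle(pooled)  # permute group labels
            if abs(mean(pooled[:n_a]) - mean(pooled[n_a:])) >= observed:
                hits += 1
        return (hits + 1) / (n_perm + 1)   # add-one to avoid p = 0

    def benjamini_hochberg(pvals):
        """BH-adjusted p-values (monotone step-up procedure)."""
        m = len(pvals)
        order = sorted(range(m), key=lambda i: pvals[i])
        adjusted = [0.0] * m
        prev = 1.0
        for rank, i in reversed(list(enumerate(order, start=1))):
            prev = min(prev, pvals[i] * m / rank)
            adjusted[i] = prev
        return adjusted

    # Toy "probe": two clearly separated expression groups.
    p = perm_pvalue([5.1, 5.3, 4.9, 5.2], [3.0, 3.2, 2.9, 3.1])
    print(p, benjamini_hochberg([p, 0.4, 0.9]))
    ```

    Because the p-value is built from label permutations rather than a normality assumption, it remains valid for skewed expression data, which is the property the abstract contrasts with Smyth's parametric method.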

  6. TheoReTS - An information system for theoretical spectra based on variational predictions from molecular potential energy and dipole moment surfaces

    NASA Astrophysics Data System (ADS)

    Rey, Michaël; Nikitin, Andrei V.; Babikov, Yurii L.; Tyuterev, Vladimir G.

    2016-09-01

    Knowledge of the intensities of rovibrational transitions of various molecules and their isotopic species over wide spectral and temperature ranges is essential for modeling the optical properties of planetary atmospheres and brown dwarfs, and for other astrophysical applications. TheoReTS ("Theoretical Reims-Tomsk Spectral data") is an Internet-accessible information system devoted to ab initio based, rotationally resolved spectra predictions for some relevant molecular species. All data were generated from potential energy and dipole moment surfaces computed via high-level electronic structure calculations, using variational methods for vibration-rotation energy levels and transitions. When available, empirical corrections to band centers were applied, with all line intensities remaining purely ab initio. The current TheoReTS implementation contains information on four-to-six-atomic molecules, including phosphine, methane, ethylene, silane, methyl fluoride, and their isotopic species 13CH4, 12CH3D, 12CH2D2, 12CD4, 13C2H4, … . Predicted hot methane line lists up to T = 2000 K are included. The information system provides the associated software for spectra simulation, including absorption coefficient, absorption and emission cross-sections, transmittance, and radiance. The simulations allow Lorentz, Gauss, and Voigt line shapes. Rectangular, triangular, Lorentzian, Gaussian, sinc, and sinc-squared apparatus functions can be used, with user-defined specifications for broadening parameters and spectral resolution. All information is organized as a relational database with a user-friendly graphical interface following Model-View-Controller architectural principles. The full-featured web application is written in PHP using the Yii framework and C++ software modules. For very large high-temperature line lists, data compression is implemented for fast interactive simulation of the quasi-continual absorption due to the high line density. Applications for the TheoReTS may
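
    The kind of simulation such a system performs, convolving a line list with a chosen line shape and then applying Beer-Lambert transmittance, can be sketched with a Gaussian profile (one of the shapes named in the abstract). The two-line list and all numbers below are hypothetical, not TheoReTS data.

    ```python
    import math

    def gaussian_profile(nu, nu0, hwhm):
        """Area-normalized Gaussian line shape evaluated at wavenumber nu."""
        sigma = hwhm / math.sqrt(2 * math.log(2))
        return math.exp(-0.5 * ((nu - nu0) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

    def absorption_coefficient(nu, line_list, hwhm=0.05):
        """Sum of line intensities weighted by the line shape at nu."""
        return sum(s * gaussian_profile(nu, nu0, hwhm) for nu0, s in line_list)

    def transmittance(nu, line_list, column=1.0):
        """Beer-Lambert transmittance for a given absorber column."""
        return math.exp(-column * absorption_coefficient(nu, line_list))

    # Toy two-line "list": (line center in cm^-1, intensity) -- hypothetical values.
    lines = [(3018.0, 1.0), (3019.5, 0.5)]
    print(transmittance(3018.0, lines), transmittance(3100.0, lines))
    ```

    Swapping `gaussian_profile` for a Lorentzian or Voigt function, and convolving with an apparatus function, gives the other simulation modes the abstract lists.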

  7. An empirical model for polarized and cross-polarized scattering from a vegetation layer

    NASA Technical Reports Server (NTRS)

    Liu, H. L.; Fung, A. K.

    1988-01-01

    An empirical model for scattering from a vegetation layer above an irregular ground surface is developed in terms of the first-order solution for like-polarized scattering and the second-order solution for cross-polarized scattering. The effects of multiple scattering within the layer and at the surface-volume boundary are compensated for by a correction factor based on the matrix doubling method. The major feature of this model is that all parameters in the model are physical parameters of the vegetation medium; there are no regression parameters. Comparisons of this empirical model with the theoretical matrix-doubling method and radar measurements indicate good agreement in polarization and angular trends for ka up to 4, where k is the wave number and a is the disk radius. The computational time is shortened by a factor of 8 relative to the theoretical model calculation.

  8. Empirical fitness landscapes and the predictability of evolution.

    PubMed

    de Visser, J Arjan G M; Krug, Joachim

    2014-07-01

    The genotype-fitness map (that is, the fitness landscape) is a key determinant of evolution, yet it has mostly been used as a superficial metaphor because we know little about its structure. This is now changing, as real fitness landscapes are being analysed by constructing genotypes with all possible combinations of small sets of mutations observed in phylogenies or in evolution experiments. In turn, these first glimpses of empirical fitness landscapes inspire theoretical analyses of the predictability of evolution. Here, we review these recent empirical and theoretical developments, identify methodological issues and organizing principles, and discuss possibilities to develop more realistic fitness landscape models.

  9. Sharing methodology: a worked example of theoretical integration with qualitative data to clarify practical understanding of learning and generate new theoretical development.

    PubMed

    Yardley, Sarah; Brosnan, Caragh; Richardson, Jane

    2013-01-01

    Theoretical integration is a necessary element of study design if clarification of experiential learning is to be achieved. There are few published examples demonstrating how this can be achieved. This methodological article provides a worked example of research methodology that achieved clarification of authentic early experiences (AEEs) through a bi-directional approach to theory and data. Bi-directional refers to our simultaneous use of theory to guide and interrogate empirical data and the use of empirical data to refine theory. We explain the five steps of our methodological approach: (1) understanding the context; (2) critique of existing applications of socio-cultural models to inform study design; (3) data generation; (4) analysis and interpretation and (5) theoretical development through a novel application of Metis. These steps resulted in understanding of how and why different outcomes arose from students participating in AEE. Our approach offers a mechanism for clarification without which evidence-based effective ways to maximise constructive learning cannot be developed. In our example it also contributed to greater theoretical understanding of the influence of social interactions. By sharing this example of research undertaken to develop both theory and educational practice we hope to assist others seeking to conduct similar research.

  10. Against the empirical viability of the Deutsch-Wallace-Everett approach to quantum mechanics

    NASA Astrophysics Data System (ADS)

    Dawid, Richard; Thébault, Karim P. Y.

    2014-08-01

    The subjective Everettian approach to quantum mechanics presented by Deutsch and Wallace fails to constitute an empirically viable theory of quantum phenomena. The decision theoretic implementation of the Born rule realized in this approach provides no basis for rejecting Everettian quantum mechanics in the face of empirical data that contradicts the Born rule. The approach of Greaves and Myrvold, which provides a subjective implementation of the Born rule as well but derives it from empirical data rather than decision theoretic arguments, avoids the problem faced by Deutsch and Wallace and is empirically viable. However, there is good reason to cast doubt on its scientific value.

  11. Early Experience and the Development of Cognitive Competence: Some Theoretical and Methodological Issues.

    ERIC Educational Resources Information Center

    Ulvund, Stein Erik

    1982-01-01

    Argues that in analyzing effects of early experience on development of cognitive competence, theoretical analyses as well as empirical investigations should be based on a transactional model of development. Shows optimal stimulation hypothesis, particularly the enhancement prediction, seems to represent a transactional approach to the study of…

  12. Empirical Likelihood-Based Estimation of the Treatment Effect in a Pretest-Posttest Study.

    PubMed

    Huang, Chiung-Yu; Qin, Jing; Follmann, Dean A

    2008-09-01

    The pretest-posttest study design is commonly used in medical and social science research to assess the effect of a treatment or an intervention. Recently, interest has been rising in developing inference procedures that improve efficiency while relaxing assumptions used in the pretest-posttest data analysis, especially when the posttest measurement might be missing. In this article we propose a semiparametric estimation procedure based on empirical likelihood (EL) that incorporates the common baseline covariate information to improve efficiency. The proposed method also yields an asymptotically unbiased estimate of the response distribution. Thus functions of the response distribution, such as the median, can be estimated straightforwardly, and the EL method can provide a more appealing estimate of the treatment effect for skewed data. We show that, compared with existing methods, the proposed EL estimator has appealing theoretical properties, especially when the working model for the underlying relationship between the pretest and posttest measurements is misspecified. A series of simulation studies demonstrates that the EL-based estimator outperforms its competitors when the working model is misspecified and the data are missing at random. We illustrate the methods by analyzing data from an AIDS clinical trial (ACTG 175).
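
    The efficiency gain from exploiting baseline (pretest) information can be illustrated with a toy simulation. This uses a simple change-score estimator as a stand-in; it is not the empirical-likelihood procedure of the article, and the sample sizes, effect, and noise levels are hypothetical.

    ```python
    import random
    from statistics import mean, variance

    def simulate_trial(rng, n=50, tau=1.0):
        """One pretest-posttest trial: posttest = pretest + effect + noise."""
        pre_t = [rng.gauss(0, 1) for _ in range(n)]
        pre_c = [rng.gauss(0, 1) for _ in range(n)]
        post_t = [x + tau + rng.gauss(0, 0.5) for x in pre_t]
        post_c = [x + rng.gauss(0, 0.5) for x in pre_c]
        return pre_t, post_t, pre_c, post_c

    def unadjusted(pre_t, post_t, pre_c, post_c):
        """Ignore the baseline: difference in posttest means."""
        return mean(post_t) - mean(post_c)

    def baseline_adjusted(pre_t, post_t, pre_c, post_c):
        """Change-score estimator: difference in (post - pre) means."""
        return (mean(p - b for p, b in zip(post_t, pre_t))
                - mean(p - b for p, b in zip(post_c, pre_c)))

    rng = random.Random(42)
    trials = [simulate_trial(rng) for _ in range(400)]
    var_unadj = variance([unadjusted(*t) for t in trials])
    var_adj = variance([baseline_adjusted(*t) for t in trials])
    print(var_adj < var_unadj)  # baseline information sharpens the estimate
    ```

    Both estimators are unbiased for the treatment effect here, but the baseline-adjusted one has much smaller sampling variance, the same efficiency motivation that drives the semiparametric EL estimator in the article.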

  13. Empirical Likelihood-Based Estimation of the Treatment Effect in a Pretest–Posttest Study

    PubMed Central

    Huang, Chiung-Yu; Qin, Jing; Follmann, Dean A.

    2013-01-01

    The pretest–posttest study design is commonly used in medical and social science research to assess the effect of a treatment or an intervention. Recently, interest has been rising in developing inference procedures that improve efficiency while relaxing assumptions used in the pretest–posttest data analysis, especially when the posttest measurement might be missing. In this article we propose a semiparametric estimation procedure based on empirical likelihood (EL) that incorporates the common baseline covariate information to improve efficiency. The proposed method also yields an asymptotically unbiased estimate of the response distribution. Thus functions of the response distribution, such as the median, can be estimated straightforwardly, and the EL method can provide a more appealing estimate of the treatment effect for skewed data. We show that, compared with existing methods, the proposed EL estimator has appealing theoretical properties, especially when the working model for the underlying relationship between the pretest and posttest measurements is misspecified. A series of simulation studies demonstrates that the EL-based estimator outperforms its competitors when the working model is misspecified and the data are missing at random. We illustrate the methods by analyzing data from an AIDS clinical trial (ACTG 175). PMID:23729942

  14. When complexity science meets implementation science: a theoretical and empirical analysis of systems change.

    PubMed

    Braithwaite, Jeffrey; Churruca, Kate; Long, Janet C; Ellis, Louise A; Herkes, Jessica

    2018-04-30

    Implementation science has a core aim - to get evidence into practice. Early in the evidence-based medicine movement, this task was construed in linear terms, wherein the knowledge pipeline moved from evidence created in the laboratory through to clinical trials and, finally, via new tests, drugs, equipment, or procedures, into clinical practice. We now know that this straight-line thinking was naïve at best, and little more than an idealization, with multiple fractures appearing in the pipeline. The knowledge pipeline derives from a mechanistic and linear approach to science, which, while delivering huge advances in medicine over the last two centuries, is limited in its application to complex social systems such as healthcare. Instead, complexity science, a theoretical approach to understanding interconnections among agents and how they give rise to emergent, dynamic, systems-level behaviors, represents an increasingly useful conceptual framework for change. Herein, we discuss what implementation science can learn from complexity science, and tease out some of the properties of healthcare systems that enable or constrain the goals we have for better, more effective, more evidence-based care. Two Australian examples, one largely top-down, predicated on applying new standards across the country, and the other largely bottom-up, adopting medical emergency teams in over 200 hospitals, provide empirical support for a complexity-informed approach to implementation. The key lessons are that change can be stimulated in many ways, but a triggering mechanism is needed, such as legislation or widespread stakeholder agreement; that feedback loops are crucial to continue change momentum; that extended sweeps of time are involved, typically much longer than believed at the outset; and that taking a systems-informed, complexity approach, having regard for existing networks and socio-technical characteristics, is beneficial. Construing healthcare as a complex adaptive system

  15. A Unifying Framework for Causal Analysis in Set-Theoretic Multimethod Research

    ERIC Educational Resources Information Center

    Rohlfing, Ingo; Schneider, Carsten Q.

    2018-01-01

    The combination of Qualitative Comparative Analysis (QCA) with process tracing, which we call set-theoretic multimethod research (MMR), is steadily becoming more popular in empirical research. Despite the fact that both methods have an elective affinity based on set theory, it is not obvious how a within-case method operating in a single case and a…

  16. Solubility of caffeine from green tea in supercritical CO2: a theoretical and empirical approach.

    PubMed

    Gadkari, Pravin Vasantrao; Balaraman, Manohar

    2015-12-01

    Decaffeination of fresh green tea was carried out with supercritical CO2 in the presence of ethanol as a co-solvent. The solubility of caffeine in supercritical CO2 varied from 44.19 × 10(-6) to 149.55 × 10(-6) (mole fraction) over a pressure range of 15 to 35 MPa and a temperature range of 313 to 333 K. The maximum solubility of caffeine was obtained at 25 MPa and 323 K. Experimental solubility data were correlated with the theoretical equation-of-state models Peng-Robinson (PR), Soave-Redlich-Kwong (SRK), and Redlich-Kwong (RK). The RK model regressed the experimental data with 15.52 % average absolute relative deviation (AARD). In contrast, the Gordillo empirical model regressed the experimental data best, with only 0.96 % AARD. Under supercritical conditions, the solubility of caffeine in the tea matrix was lower than the solubility of pure caffeine. Further, the solubility of caffeine in supercritical CO2 was compared with the solubility of pure caffeine in conventional solvents; a maximum solubility of 90 × 10(-3) (mole fraction) was obtained with chloroform.
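
    The average absolute relative deviation used above to rank the models is straightforward to compute. A minimal sketch with hypothetical solubility values (not the paper's data):

    ```python
    def aard_percent(y_exp, y_calc):
        """Average absolute relative deviation between experiment and model, in percent."""
        if len(y_exp) != len(y_calc):
            raise ValueError("series must be the same length")
        return 100.0 * sum(abs(e - c) / e for e, c in zip(y_exp, y_calc)) / len(y_exp)

    # Hypothetical solubilities (mole fraction * 1e6): experiment vs. two models.
    exp_vals = [44.19, 80.0, 149.55]
    model_a  = [50.0, 70.0, 160.0]   # poorer fit
    model_b  = [44.5, 79.0, 150.5]   # closer fit
    print(aard_percent(exp_vals, model_a), aard_percent(exp_vals, model_b))
    ```

    A lower AARD indicates a closer fit, which is how the Gordillo model (0.96 %) outperforms the RK equation of state (15.52 %) in the abstract.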

  17. Empirically based Suggested Insights into the Concept of False-Self Defense: Contributions From a Study on Normalization of Children With Disabilities.

    PubMed

    Eichengreen, Adva; Hoofien, Dan; Bachar, Eytan

    2016-02-01

    The concept of the false self has been used widely in psychoanalytic theory and practice but seldom in empirical research. In this empirically based study, elevated features of false-self defense were hypothetically associated with risk factors attendant on processes of rehabilitation and integration of children with disabilities, processes that encourage adaptation of the child to the able-bodied environment. Self-report questionnaires and in-depth interviews were conducted with 88 deaf and hard-of-hearing students and a comparison group of 88 hearing counterparts. Results demonstrate that despite the important contribution of rehabilitation and integration to the well-being of these children, these efforts may put the child at risk of increased use of the false-self defense. The empirical findings suggest two general theoretical conclusions: (1) The Winnicottian concept of the environment, usually confined to the parent-child relationship, can be understood more broadly as including cultural, social, and rehabilitational variables that both influence the parent-child relationship and operate independently of it. (2) The monolithic conceptualization of the false self may be more accurately unpacked to reveal two distinct subtypes: the compliant and the split false self. © 2016 by the American Psychoanalytic Association.

  18. An empirical evaluation of two theoretically-based hypotheses on the directional association between self-worth and hope.

    PubMed

    McDavid, Lindley; McDonough, Meghan H; Smith, Alan L

    2015-06-01

    Fostering self-worth and hope are important goals of positive youth development (PYD) efforts, yet intervention design is complicated by contrasting theoretical hypotheses regarding the directional association between these constructs. Therefore, within a longitudinal design we tested: (1) that self-worth predicts changes in hope (self theory; Harter, 1999), and (2) that hope predicts changes in self-worth (hope theory; Snyder, 2002) over time. Youth (N = 321; mean age = 10.33 years) in a physical activity-based PYD program completed surveys 37-45 days prior to and on the second day and third-to-last day of the program. A latent variable panel model that included autoregressive and cross-lagged paths indicated that self-worth was a significant predictor of change in hope, but hope did not predict change in self-worth. Therefore, the directional association between self-worth and hope is better explained by self-theory and PYD programs should aim to enhance perceptions of self-worth to build perceptions of hope. Copyright © 2015 The Foundation for Professionals in Services for Adolescents. Published by Elsevier Ltd. All rights reserved.

  19. Masculinities in Higher Education: Theoretical and Practical Considerations

    ERIC Educational Resources Information Center

    Laker, Jason A., Ed.; Davis, Tracy, Ed.

    2011-01-01

    "Masculinities in Higher Education" provides empirical evidence, theoretical support, and developmental interventions for educators working with college men both in and out of the classroom. The critical philosophical perspective of the text challenges the status-quo and offers theoretically sound educational strategies to successfully promote…

  20. Theoretical and Empirical Comparisons between Two Models for Continuous Item Responses.

    ERIC Educational Resources Information Center

    Ferrando, Pere J.

    2002-01-01

    Analyzed the relations between two continuous response models intended for typical response items: the linear congeneric model and Samejima's continuous response model (CRM). Illustrated the relations described using an empirical example and assessed the relations through a simulation study. (SLD)

  1. On the relative ages of galactic globular clusters. A new observable, a semi-empirical calibration and problems with the theoretical isochrones

    NASA Astrophysics Data System (ADS)

    Buonanno, R.; Corsi, C. E.; Pulone, L.; Fusi Pecci, F.; Bellazzini, M.

    1998-05-01

    A new procedure is described to derive homogeneous relative ages from the Color-Magnitude Diagrams (CMDs) of Galactic globular clusters (GGCs). It is based on the use of a new observable, Delta V(0.05), namely the difference in magnitude between an arbitrary point on the upper main sequence (V_{+0.05}, the V magnitude of the MS-ridge, 0.05 mag redder than the Main Sequence (MS) Turn-off (TO)) and the horizontal branch (HB). The observational error associated with Delta V(0.05) is substantially smaller than that of previous age-indicators, while retaining the property of being strictly independent of distance and reddening and of being based on theoretical luminosities rather than on still uncertain theoretical temperatures. As an additional bonus, the theoretical models show that Delta V(0.05) has a low dependence on metallicity. Moreover, the estimates of the relative age so obtained are also sufficiently invariant (to within ~ +/- 1 Gyr) with varying adopted models and transformations. Since the color difference Delta (B-V)_{TO,RGB} (VandenBerg, Bolte, and Stetson 1990, VBS; Sarajedini and Demarque 1990, SD) remains the most reliable technique to estimate relative cluster ages for clusters where the horizontal part of the HB is not adequately populated, we have used the differential ages obtained via the "vertical" Delta V(0.05) parameter for a selected sample of clusters (with high quality CMDs, well populated HBs, trustworthy calibrations) to perform an empirical calibration of the "horizontal" observable in terms of [Fe/H] and age. A direct comparison with the corresponding calibration derived from the theoretical models reveals the existence of clear-cut discrepancies, which call into question the model scaling with metallicity in the observational planes. Starting from the global sample of considered clusters, we have thus evaluated, within a homogeneous procedure, relative ages for 33 GGCs having different metallicity, HB-morphologies, and

  2. A Detection-Theoretic Model of Echo Inhibition

    ERIC Educational Resources Information Center

    Saberi, Kourosh; Petrosyan, Agavni

    2004-01-01

    A detection-theoretic analysis of the auditory localization of dual-impulse stimuli is described, and a model for the processing of spatial cues in the echo pulse is developed. Although for over 50 years "echo suppression" has been the topic of intense theoretical and empirical study within the hearing sciences, only a rudimentary understanding of…

  3. Empirical Testing of a Theoretical Extension of the Technology Acceptance Model: An Exploratory Study of Educational Wikis

    ERIC Educational Resources Information Center

    Liu, Xun

    2010-01-01

    This study extended the technology acceptance model and empirically tested the new model with wikis, a new type of educational technology. Based on social cognitive theory and the theory of planned behavior, three new variables, wiki self-efficacy, online posting anxiety, and perceived behavioral control, were added to the original technology…

  4. An Empirical State Error Covariance Matrix for the Weighted Least Squares Estimation Method

    NASA Technical Reports Server (NTRS)

    Frisbee, Joseph H., Jr.

    2011-01-01

    State estimation techniques effectively provide mean state estimates. However, the theoretical state error covariance matrices provided as part of these techniques often suffer from a lack of confidence in their ability to describe the uncertainty in the estimated states. By a reinterpretation of the equations involved in the weighted least squares algorithm, it is possible to directly arrive at an empirical state error covariance matrix. This proposed empirical state error covariance matrix will contain the effect of all error sources, known or not. Results based on the proposed technique will be presented for a simple, two-observer, measurement-error-only problem.
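The weighted least squares quantities involved can be sketched on synthetic data. The residual-based empirical covariance shown here is one simple construction for illustration only; it is not the specific formula derived in the paper.

```python
import numpy as np

# Synthetic linear estimation problem: y = H x + noise.
rng = np.random.default_rng(0)
H = rng.normal(size=(50, 2))              # observation (design) matrix
x_true = np.array([1.0, -2.0])
sigma = 0.1
y = H @ x_true + rng.normal(scale=sigma, size=50)

W = np.eye(50) / sigma**2                 # weights = inverse noise variance
P_theory = np.linalg.inv(H.T @ W @ H)     # theoretical state error covariance
x_hat = P_theory @ H.T @ W @ y            # weighted least squares estimate

S = P_theory @ H.T @ W                    # linear map from measurements to estimate
r = y - H @ x_hat                         # measurement residuals
P_emp = S @ np.diag(r**2) @ S.T           # residual-driven empirical covariance
```

Because `P_emp` is built from the actual residuals rather than the assumed noise statistics, it reflects whatever error sources are present in the data, which is the motivation the abstract describes.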

  5. Development of Mathematical Literacy: Results of an Empirical Study

    ERIC Educational Resources Information Center

    Kaiser, Gabriele; Willander, Torben

    2005-01-01

    In the paper the results of an empirical study, which has evaluated the development of mathematical literacy in an innovative teaching programme, are presented. The theoretical approach of mathematical literacy relies strongly on applications and modelling and the study follows the approach of R. Bybee, who develops a theoretical concept of…

  6. Inter-firm Networks, Organizational Learning and Knowledge Updating: An Empirical Study

    NASA Astrophysics Data System (ADS)

    Zhang, Su-rong; Wang, Wen-ping

    In the era of the knowledge-based economy, in which information technology develops rapidly, the rate of knowledge updating has become a critical factor for enterprises in gaining competitive advantage. We build an interactional theoretical model among inter-firm networks, organizational learning, and knowledge updating, and then test it empirically. The results show that inter-firm networks and organizational learning are sources of knowledge updating.

  7. Partial differential equation-based approach for empirical mode decomposition: application on image analysis.

    PubMed

    Niang, Oumar; Thioune, Abdoulaye; El Gueirea, Mouhamed Cheikh; Deléchelle, Eric; Lemoine, Jacques

    2012-09-01

    The major problem with the empirical mode decomposition (EMD) algorithm is its lack of a theoretical framework, which makes the approach difficult to characterize and evaluate. In this paper, we propose, in the 2-D case, an alternative implementation to the algorithmic definition of the so-called "sifting process" used in the original Huang EMD method. This approach, based on partial differential equations (PDEs), was presented by Niang in previous works in 2005 and 2007, and relies on a nonlinear diffusion-based filtering process to solve the mean envelope estimation problem. In the 1-D case, the efficiency of the PDE-based method compared to the original EMD algorithmic version was illustrated in a recent paper. Several 2-D extensions of the EMD method have been proposed recently; despite some effort, they perform poorly and are very time-consuming. In this paper, therefore, an extension of the PDE-based approach to 2-D space is extensively described. The approach has been applied to both signal and image decomposition, and the results confirm the usefulness of the new PDE-based sifting process for the decomposition of various kinds of data. Results are provided for image decomposition. The effectiveness of the approach encourages its use in a number of signal and image applications such as denoising, detrending, or texture analysis.
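For orientation, the 1-D algorithmic sifting step that the PDE formulation replaces can be sketched in a few lines. This is the original algorithmic EMD, not the paper's PDE-based method, and it uses linear rather than the usual cubic-spline envelopes to keep the sketch dependency-free.

```python
import numpy as np

def sift_once(t, x):
    """One classical sifting iteration: subtract the mean of the upper and
    lower envelopes interpolated through the local extrema (linear envelopes
    here; standard EMD uses cubic splines)."""
    maxima = np.where((x[1:-1] > x[:-2]) & (x[1:-1] > x[2:]))[0] + 1
    minima = np.where((x[1:-1] < x[:-2]) & (x[1:-1] < x[2:]))[0] + 1
    if len(maxima) < 2 or len(minima) < 2:
        return x                                 # too few extrema for envelopes
    upper = np.interp(t, t[maxima], x[maxima])   # upper envelope
    lower = np.interp(t, t[minima], x[minima])   # lower envelope
    return x - (upper + lower) / 2.0             # candidate intrinsic mode function

t = np.linspace(0.0, 1.0, 1000)
fast = np.sin(2 * np.pi * 30 * t)       # the component a few sifts should isolate
x = fast + 0.5 * np.sin(2 * np.pi * 3 * t)
h = sift_once(t, x)
```

The mean-envelope estimation inside `sift_once` is exactly the step the paper recasts as a nonlinear diffusion (PDE) filtering problem.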

  8. Empirical and Theoretical Bases of Zipf's Law.

    ERIC Educational Resources Information Center

    Wyllys, Ronald E.

    1981-01-01

    Explains Zipf's Law of Vocabulary Distribution (i.e., relationship between frequency of a word in a corpus and its rank), noting the discovery of the law, alternative forms, and literature relating to the search for a rationale for Zipf's Law. Thirty-eight references are cited. (EJS)
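The rank-frequency form of the law can be illustrated with a few lines of code on a synthetic corpus; under Zipf's law the product rank × frequency is roughly constant.

```python
from collections import Counter

def zipf_table(corpus_words):
    """Rank words by frequency; return (rank, frequency, rank*frequency) rows."""
    counts = Counter(corpus_words)
    ranked = sorted(counts.values(), reverse=True)
    return [(r, f, r * f) for r, f in enumerate(ranked, start=1)]

# Synthetic corpus in which word i occurs 1000 // i times, an idealized
# Zipfian profile (illustrative, not real text).
corpus = [f"w{i}" for i in range(1, 51) for _ in range(1000 // i)]
table = zipf_table(corpus)
# For this corpus, rank * frequency stays near 1000 at every rank.
```

Real corpora only follow the law approximately, which is part of why the search for a rationale that the article surveys has been so persistent.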

  9. Empirical Scientific Research and Legal Studies Research--A Missing Link

    ERIC Educational Resources Information Center

    Landry, Robert J., III

    2016-01-01

    This article begins with an overview of what is meant by empirical scientific research in the context of legal studies. With that backdrop, the argument is presented that without engaging in normative, theoretical, and doctrinal research in tandem with empirical scientific research, the role of legal studies scholarship in making meaningful…

  10. Why Do People Need Self-Esteem? A Theoretical and Empirical Review

    ERIC Educational Resources Information Center

    Pyszczynsi, Tom; Greenberg, Jeff; Solomon, Sheldon; Arndt, Jamie; Schimel, Jeff

    2004-01-01

    Terror management theory (TMT; J. Greenberg, T. Pyszczynski, & S. Solomon, 1986) posits that people are motivated to pursue positive self-evaluations because self-esteem provides a buffer against the omnipresent potential for anxiety engendered by the uniquely human awareness of mortality. Empirical evidence relevant to the theory is reviewed…

  11. [Memorandum IV: Theoretical and Normative Grounding of Health Services Research].

    PubMed

    Baumann, W; Farin, E; Menzel-Begemann, A; Meyer, T

    2016-05-01

    With its memoranda and other initiatives, the German Network for Health Services Research [Deutsches Netzwerk Versorgungsforschung e.V. (DNVF)] has been fostering the methodological quality of health services research studies for years. Compared to the standards of empirical research, however, questions concerning the role and function of theories, theoretical approaches, and scientific principles have not received comparable attention. The DNVF therefore set up a working group in 2013, commissioned to prepare a memorandum on "theories in health care research". The memorandum presented here primarily challenges scholars in health services research to pay more attention to the theoretical arsenal and the background assumptions in the research process. The foundation in the philosophy of science, the reference to normative principles, and the theory base of the research process are addressed. Moreover, the memorandum calls for advancing theorizing in health services research, for strengthening non-empirical approaches, research on basic principles, and studies grounded in the normative sciences, and for incorporating these relevant disciplines into health services research. Research structures and the funding of health services research need more open space for theoretical reflection and for self-observation of their own, multidisciplinary research processes. © Georg Thieme Verlag KG Stuttgart · New York.

  12. Multi-focus image fusion based on window empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Qin, Xinqiang; Zheng, Jiaoyue; Hu, Gang; Wang, Jiao

    2017-09-01

    In order to improve multi-focus image fusion quality, a novel fusion algorithm based on window empirical mode decomposition (WEMD) is proposed. WEMD is an improved form of bidimensional empirical mode decomposition (BEMD): because its decomposition process uses an adding-window principle, it effectively resolves the signal concealment problem. We used WEMD for multi-focus image fusion and formulated different fusion rules for the bidimensional intrinsic mode function (BIMF) components and the residue component. For fusion of the BIMF components, the sum-modified-Laplacian was used and a scheme based on visual feature contrast was adopted; for the residue coefficients, pixel values were selected based on local visibility. We carried out four groups of multi-focus image fusion experiments and compared objective evaluation criteria with those of three other fusion methods. The experimental results show that the proposed approach is effective and fuses multi-focus images better than some traditional methods.

  13. Firm productivity, pollution, and output: theory and empirical evidence from China.

    PubMed

    Tang, Erzi; Zhang, Jingjing; Haider, Zulfiqar

    2015-11-01

    Using a theoretical model, this paper argues that as firm productivity increases, there is a decrease in firm-level pollution intensity. However, as productivity increases, firms tend to increase their aggregate output, which requires the use of additional resources that increase pollution. Hence, an increase in productivity results in two opposing effects where increased productivity may in fact increase pollution created by a firm. We describe the joint effect of these two mechanisms on pollution emissions as the "productivity dilemma" of pollution emission. Based on firm-level data from China, we also empirically test this productivity dilemma hypothesis. Our empirical results suggest that, in general, firm productivity has a positive and statistically significant impact on pollution emission in China. However, the impact of productivity on pollution becomes negative when we control for increases in firm output. The empirical evidence also confirms the positive influence of productivity on output, which suggests that the main determinant of pollution is the firm's output. The empirical results provide evidence of the existence of, what we describe as, the productivity dilemma of pollution emission.

  14. Collective animal navigation and migratory culture: from theoretical models to empirical evidence

    PubMed Central

    Dell, Anthony I.

    2018-01-01

    Animals often travel in groups, and their navigational decisions can be influenced by social interactions. Both theory and empirical observations suggest that such collective navigation can result in individuals improving their ability to find their way and could be one of the key benefits of sociality for these species. Here, we provide an overview of the potential mechanisms underlying collective navigation, review the known, and supposed, empirical evidence for such behaviour and highlight interesting directions for future research. We further explore how both social and collective learning during group navigation could lead to the accumulation of knowledge at the population level, resulting in the emergence of migratory culture. This article is part of the theme issue ‘Collective movement ecology’. PMID:29581394

  15. Data-Driven Approaches to Empirical Discovery

    DTIC Science & Technology

    1988-10-31

    Keywords: empirical discovery; history of science; data-driven heuristics; numeric laws; theoretical terms; scope of laws. ...to the normative side. Machine Discovery and the History of Science: The history of science studies the actual path followed by scientists over the

  16. The Nature of Procrastination: A Meta-Analytic and Theoretical Review of Quintessential Self-Regulatory Failure

    ERIC Educational Resources Information Center

    Steel, Piers

    2007-01-01

    Procrastination is a prevalent and pernicious form of self-regulatory failure that is not entirely understood. Hence, the relevant conceptual, theoretical, and empirical work is reviewed, drawing upon correlational, experimental, and qualitative findings. A meta-analysis of procrastination's possible causes and effects, based on 691 correlations,…

  17. Augmented Reality-Based Simulators as Discovery Learning Tools: An Empirical Study

    ERIC Educational Resources Information Center

    Ibáñez, María-Blanca; Di-Serio, Ángela; Villarán-Molina, Diego; Delgado-Kloos, Carlos

    2015-01-01

    This paper reports empirical evidence on having students use AR-SaBEr, a simulation tool based on augmented reality (AR), to discover the basic principles of electricity through a series of experiments. AR-SaBEr was enhanced with knowledge-based support and inquiry-based scaffolding mechanisms, which proved useful for discovery learning in…

  18. Bacterial clonal diagnostics as a tool for evidence-based empiric antibiotic selection.

    PubMed

    Tchesnokova, Veronika; Avagyan, Hovhannes; Rechkina, Elena; Chan, Diana; Muradova, Mariya; Haile, Helen Ghirmai; Radey, Matthew; Weissman, Scott; Riddell, Kim; Scholes, Delia; Johnson, James R; Sokurenko, Evgeni V

    2017-01-01

    Despite the known clonal distribution of antibiotic resistance in many bacteria, empiric (pre-culture) antibiotic selection still relies heavily on species-level cumulative antibiograms, resulting in overuse of broad-spectrum agents and excessive antibiotic/pathogen mismatch. Urinary tract infections (UTIs), which account for a large share of antibiotic use, are caused predominantly by Escherichia coli, a highly clonal pathogen. In an observational clinical cohort study of urgent care patients with suspected UTI, we assessed the potential for E. coli clonal-level antibiograms to improve empiric antibiotic selection. A novel PCR-based clonotyping assay was applied to fresh urine samples to rapidly detect E. coli and the urine strain's clonotype. Based on a database of clonotype-specific antibiograms, the acceptability of various antibiotics for empiric therapy was inferred using 20%, 10%, and 30% allowed resistance thresholds. The test's performance characteristics and possible effects on prescribing were assessed. The rapid test identified E. coli clonotypes directly in patients' urine within 25-35 minutes, with high specificity and sensitivity compared to culture. Antibiotic selection based on a clonotype-specific antibiogram could reduce the relative likelihood of antibiotic/pathogen mismatch by ≥ 60%. Compared to observed prescribing patterns, clonal diagnostics-guided antibiotic selection could safely double the use of trimethoprim/sulfamethoxazole and minimize fluoroquinolone use. In summary, a rapid clonotyping test showed promise for improving empiric antibiotic prescribing for E. coli UTI, including reversing preferential use of fluoroquinolones over trimethoprim/sulfamethoxazole. The clonal diagnostics approach merges epidemiologic surveillance, antimicrobial stewardship, and molecular diagnostics to bring evidence-based medicine directly to the point of care.
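The allowed-resistance-threshold rule described above reduces to a simple lookup. The sketch below uses invented clonotype names and resistance rates, not the study's antibiogram database.

```python
# Hypothetical clonotype-specific antibiogram: fraction of isolates of each
# clonotype resistant to each drug (illustrative numbers only).
antibiogram = {
    "ST131": {"ciprofloxacin": 0.55,
              "trimethoprim/sulfamethoxazole": 0.25,
              "nitrofurantoin": 0.05},
    "ST95":  {"ciprofloxacin": 0.05,
              "trimethoprim/sulfamethoxazole": 0.10,
              "nitrofurantoin": 0.02},
}

def acceptable_empiric_drugs(clonotype, threshold=0.20):
    """Drugs whose clonotype-level resistance is below the allowed threshold."""
    return sorted(d for d, r in antibiogram[clonotype].items() if r < threshold)
```

Varying `threshold` (the study considered 10%, 20%, and 30%) trades off the risk of antibiotic/pathogen mismatch against how often narrow-spectrum agents remain acceptable.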

  19. The Theoretical Basis of Experience-Based Career Education.

    ERIC Educational Resources Information Center

    Jenks, C. Lynn

    This study analyzes the extent to which the assumptions and procedures of the Experience-Based Career Education model (EBCE) as developed by the Far West Laboratory (FWL) are supported by empirical data and by recognized scholars in educational theory. The analysis is presented as relevant to the more general problem: the limited availability of…

  20. Volatility in financial markets: stochastic models and empirical results

    NASA Astrophysics Data System (ADS)

    Miccichè, Salvatore; Bonanno, Giovanni; Lillo, Fabrizio; Mantegna, Rosario N.

    2002-11-01

    We investigate the historical volatility of the 100 most capitalized stocks traded in US equity markets. An empirical probability density function (pdf) of volatility is obtained and compared with the theoretical predictions of a lognormal model and of the Hull and White model. The lognormal model describes the pdf well in the region of low volatility values, whereas the Hull and White model better approximates the empirical pdf for large volatility values. Both models fail to describe the empirical pdf over a moderately large volatility range.
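The kind of comparison the abstract describes can be sketched by fitting a lognormal to a volatility sample. The data below are synthetic draws, not the equity data of the paper, so here the fit is good everywhere by construction; on real data the paper finds it fails in the large-volatility tail.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic "volatility" sample (illustrative only; invented parameters).
vol = rng.lognormal(mean=-4.0, sigma=0.4, size=10_000)

# Fit the lognormal by matching the moments of log-volatility.
mu, s = np.log(vol).mean(), np.log(vol).std()

def lognormal_pdf(x, mu, s):
    """Density of a lognormal with log-mean mu and log-std s."""
    return np.exp(-(np.log(x) - mu) ** 2 / (2 * s**2)) / (x * s * np.sqrt(2 * np.pi))

# Empirical pdf via a normalized histogram, as in the paper's comparison.
hist, edges = np.histogram(vol, bins=50, density=True)
centers = (edges[:-1] + edges[1:]) / 2
# Compare hist against lognormal_pdf(centers, mu, s) region by region.
```

The same histogram can then be compared against the Hull and White prediction in the large-volatility region, which is where the two models differ.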

  1. Palm vein recognition based on directional empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Lee, Jen-Chun; Chang, Chien-Ping; Chen, Wei-Kuei

    2014-04-01

    Directional empirical mode decomposition (DEMD) has recently been proposed to make empirical mode decomposition suitable for texture analysis. Using DEMD, samples are decomposed into a series of images, referred to as two-dimensional intrinsic mode functions (2-D IMFs), from fine to coarse scale. A DEMD-based two-dimensional linear discriminant analysis (2DLDA) for palm vein recognition is proposed. The proposed method progresses through three steps: (i) a set of 2-D IMF features of various scales and orientations is extracted using DEMD, (ii) the 2DLDA method is then applied to reduce the dimensionality of the feature space in both the row and column directions, and (iii) the nearest neighbor classifier is used for classification. We also propose two strategies for using the set of 2-D IMF features: ensemble DEMD vein representation (EDVR) and multichannel DEMD vein representation (MDVR). In experiments using palm vein databases, the proposed MDVR-based 2DLDA method achieved a recognition accuracy of 99.73%, demonstrating its feasibility for palm vein recognition.

  2. Developing a theoretical framework for complex community-based interventions.

    PubMed

    Angeles, Ricardo N; Dolovich, Lisa; Kaczorowski, Janusz; Thabane, Lehana

    2014-01-01

    Applying existing theories to research, in the form of a theoretical framework, is necessary to advance knowledge from what is already known toward the next steps to be taken. This article proposes a guide on how to develop a theoretical framework for complex community-based interventions using the Cardiovascular Health Awareness Program as an example. Developing a theoretical framework starts with identifying the intervention's essential elements. Subsequent steps include the following: (a) identifying and defining the different variables (independent, dependent, mediating/intervening, moderating, and control); (b) postulating mechanisms how the independent variables will lead to the dependent variables; (c) identifying existing theoretical models supporting the theoretical framework under development; (d) scripting the theoretical framework into a figure or sets of statements as a series of hypotheses, if/then logic statements, or a visual model; (e) content and face validation of the theoretical framework; and (f) revising the theoretical framework. In our example, we combined the "diffusion of innovation theory" and the "health belief model" to develop our framework. Using the Cardiovascular Health Awareness Program as the model, we demonstrated a stepwise process of developing a theoretical framework. The challenges encountered are described, and an overview of the strategies employed to overcome these challenges is presented.

  3. Theoretical and Empirical Base for Implementation Components of Health-Promoting Schools

    ERIC Educational Resources Information Center

    Samdal, Oddrun; Rowling, Louise

    2011-01-01

    Purpose: Efforts to create a scientific base for the health-promoting school approach have so far not articulated a clear "Science of Delivery". There is thus a need for systematic identification of clearly operationalised implementation components. To address a next step in the refinement of the health-promoting schools' work, this paper sets out…

  4. Integrative Behavioral Couple Therapy: Theoretical Background, Empirical Research, and Dissemination.

    PubMed

    Roddy, McKenzie K; Nowlan, Kathryn M; Doss, Brian D; Christensen, Andrew

    2016-09-01

    Integrative Behavioral Couple Therapy (IBCT), developed by Drs. Andrew Christensen and Neil Jacobson, builds off the tradition of behavioral couple therapy by including acceptance strategies as key components of treatment. Results from a large randomized clinical trial of IBCT indicate that it yields large and significant gains in relationship satisfaction. Furthermore, these benefits have been shown to persist for at least 5 years after treatment for the average couple. Not only does IBCT positively impact relationship constructs such as satisfaction and communication, but the benefits of therapy extend to individual, co-parenting, and child functioning. Moreover, IBCT has been shown to operate through the putative mechanisms of improvements in emotional acceptance, behavior change, and communication. IBCT was chosen for nationwide training and dissemination through the Veteran Affairs Medical Centers. Furthermore, the principles of IBCT have been translated into a web-based intervention for distressed couples, OurRelationship.com. IBCT is continuing to evolve and grow as research and technologies allow for continued evaluation and dissemination of this well-supported theoretical model. © 2016 Family Process Institute.

  5. Empirical study of fuzzy compatibility measures and aggregation operators

    NASA Astrophysics Data System (ADS)

    Cross, Valerie V.; Sudkamp, Thomas A.

    1992-02-01

    Two fundamental requirements for the generation of support using incomplete and imprecise information are the ability to measure the compatibility of discriminatory information with domain knowledge and the ability to fuse information obtained from disparate sources. A generic architecture utilizing the generalized fuzzy relational database model has been developed to empirically investigate the support generation capabilities of various compatibility measures and aggregation operators. This paper examines the effectiveness of combinations of compatibility measures from the set-theoretic, geometric distance, and logic-based classes paired with t-norm and generalized mean families of aggregation operators.

  6. Bacterial clonal diagnostics as a tool for evidence-based empiric antibiotic selection

    PubMed Central

    Tchesnokova, Veronika; Avagyan, Hovhannes; Rechkina, Elena; Chan, Diana; Muradova, Mariya; Haile, Helen Ghirmai; Radey, Matthew; Weissman, Scott; Riddell, Kim; Scholes, Delia; Johnson, James R.

    2017-01-01

    Despite the known clonal distribution of antibiotic resistance in many bacteria, empiric (pre-culture) antibiotic selection still relies heavily on species-level cumulative antibiograms, resulting in overuse of broad-spectrum agents and excessive antibiotic/pathogen mismatch. Urinary tract infections (UTIs), which account for a large share of antibiotic use, are caused predominantly by Escherichia coli, a highly clonal pathogen. In an observational clinical cohort study of urgent care patients with suspected UTI, we assessed the potential for E. coli clonal-level antibiograms to improve empiric antibiotic selection. A novel PCR-based clonotyping assay was applied to fresh urine samples to rapidly detect E. coli and the urine strain's clonotype. Based on a database of clonotype-specific antibiograms, the acceptability of various antibiotics for empiric therapy was inferred using 20%, 10%, and 30% allowed resistance thresholds. The test's performance characteristics and possible effects on prescribing were assessed. The rapid test identified E. coli clonotypes directly in patients’ urine within 25–35 minutes, with high specificity and sensitivity compared to culture. Antibiotic selection based on a clonotype-specific antibiogram could reduce the relative likelihood of antibiotic/pathogen mismatch by ≥ 60%. Compared to observed prescribing patterns, clonal diagnostics-guided antibiotic selection could safely double the use of trimethoprim/sulfamethoxazole and minimize fluoroquinolone use. In summary, a rapid clonotyping test showed promise for improving empiric antibiotic prescribing for E. coli UTI, including reversing preferential use of fluoroquinolones over trimethoprim/sulfamethoxazole. The clonal diagnostics approach merges epidemiologic surveillance, antimicrobial stewardship, and molecular diagnostics to bring evidence-based medicine directly to the point of care. PMID:28350870

  7. Performance-Based Service Quality Model: An Empirical Study on Japanese Universities

    ERIC Educational Resources Information Center

    Sultan, Parves; Wong, Ho

    2010-01-01

    Purpose: This paper aims to develop and empirically test a performance-based higher education service quality model. Design/methodology/approach: The study develops a 67-item instrument for measuring performance-based service quality with a particular focus on the higher education sector. Scale reliability is confirmed using Cronbach's alpha.…

  8. Empirical Equation Based Chirality (n, m) Assignment of Semiconducting Single Wall Carbon Nanotubes from Resonant Raman Scattering Data

    PubMed Central

    Arefin, Md Shamsul

    2012-01-01

    This work presents a technique for the chirality (n, m) assignment of semiconducting single wall carbon nanotubes by solving a set of empirical equations for the tight binding model parameters. Empirical equations for the nearest neighbor hopping parameters, relating the term (2n − m) to the first and second optical transition energies of semiconducting single wall carbon nanotubes, are also proposed. They provide almost the same level of accuracy for lower- and higher-diameter nanotubes. An algorithm is presented to determine the chiral index (n, m) of any unknown semiconducting tube by solving these empirical equations using the radial breathing mode frequency and the first or second optical transition energy from resonant Raman spectroscopy. In this paper, the chirality of 55 semiconducting nanotubes is assigned using the first and second optical transition energies. Unlike existing methods of chirality assignment, this technique does not require graphical comparison or pattern recognition between experimental and theoretical Kataura plots. PMID:28348319

  9. Strength design of Zr(x)Ti(x)Hf(x)Nb(x)Mo(x) alloys based on empirical electron theory of solids and molecules

    NASA Astrophysics Data System (ADS)

    Li, Y. K.; Chen, Y. W.; Cheng, X. W.; Wu, C.; Cheng, B.

    2018-05-01

    In this paper, the valence electron structure parameters of Zr(x)Ti(x)Hf(x)Nb(x)Mo(x) alloys were calculated based on the empirical electron theory of solids and molecules (EET), and their performance was predicted from these parameters. Subsequently, alloys with particular valence electron structure parameters were prepared by arc melting. The hardness and high-temperature mechanical properties were analyzed to verify the prediction. The research shows that the shared electron number nA of the strongest bond determines the strength of these alloys, and the experiments are consistent with the theoretical prediction.

  10. An empirically based model for knowledge management in health care organizations.

    PubMed

    Sibbald, Shannon L; Wathen, C Nadine; Kothari, Anita

    2016-01-01

    Knowledge management (KM) encompasses strategies, processes, and practices that allow an organization to capture, share, store, access, and use knowledge. Ideal KM combines different sources of knowledge to support innovation and improve performance. Despite the importance of KM in health care organizations (HCOs), there has been very little empirical research to describe KM in this context. This study explores KM in HCOs, focusing on the status of current intraorganizational KM. The intention is to provide insight for future studies and model development for effective KM implementation in HCOs. A qualitative methods approach was used to create an empirically based model of KM in HCOs. Methods included (a) qualitative interviews (n = 24) with senior leadership to identify types of knowledge important in these roles plus current information-seeking behaviors/needs and (b) in-depth case study with leaders in new executive positions (n = 2). The data were collected from 10 HCOs. Our empirically based model for KM was assessed for face and content validity. The findings highlight the paucity of formal KM in our sample HCOs. Organizational culture, leadership, and resources are instrumental in supporting KM processes. An executive's knowledge needs are extensive, but knowledge assets are often limited or difficult to acquire as much of the available information is not in a usable format. We propose an empirically based model for KM to highlight the importance of context (internal and external), and knowledge seeking, synthesis, sharing, and organization. Participants who reviewed the model supported its basic components and processes, and potential for incorporating KM into organizational processes. Our results articulate ways to improve KM, increase organizational learning, and support evidence-informed decision-making. This research has implications for how to better integrate evidence and knowledge into organizations while considering context and the role of

  11. Time Domain Strain/Stress Reconstruction Based on Empirical Mode Decomposition: Numerical Study and Experimental Validation.

    PubMed

    He, Jingjing; Zhou, Yibin; Guan, Xuefei; Zhang, Wei; Zhang, Weifang; Liu, Yongming

    2016-08-16

    Structural health monitoring has been studied by a number of researchers as well as various industries to keep up with the increasing demand for preventive maintenance routines. This work presents a novel method for reconstructing prompt, informed strain/stress responses at the hot spots of a structure based on strain measurements at remote locations. The structural responses measured by a usage monitoring system at available locations are decomposed into modal responses using empirical mode decomposition. Transformation equations based on finite element modeling are derived to extrapolate the modal responses from the measured locations to critical locations where direct sensor measurements are not available. Two numerical examples (a two-span beam and a 19,956-degree-of-freedom simplified airfoil) are then used to demonstrate the overall reconstruction method. Finally, the present work investigates the effectiveness and accuracy of the method through a set of experiments conducted on an aluminium alloy cantilever beam of a type commonly used in air vehicles and spacecraft. The experiments collected the vibration strain signals of the beam via optical fiber sensors. Reconstruction results are compared with theoretical solutions, and a detailed error analysis is also provided.
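    The core reconstruction step, extrapolating decomposed modal responses to an unmeasured hot spot via transformation factors, can be sketched as follows. The two-mode signal and the per-mode factors below are invented for illustration, and the empirical mode decomposition itself is assumed rather than implemented.

```python
import math

# Synthetic two-mode strain signal at the measured (remote) location.
t = [i * 0.01 for i in range(1000)]
mode1 = [math.sin(2 * math.pi * 1.0 * x) for x in t]        # 1 Hz component
mode2 = [0.3 * math.sin(2 * math.pi * 8.0 * x) for x in t]  # 8 Hz component
measured = [a + b for a, b in zip(mode1, mode2)]

# In practice, empirical mode decomposition would extract mode1/mode2 from
# `measured`; here the decomposition is assumed known. A finite element
# model supplies per-mode transformation factors (illustrative values):
T = [2.0, 0.5]

# Reconstructed response at the unmeasured hot spot: scale each modal
# response by its transformation factor and sum.
hot_spot = [T[0] * a + T[1] * b for a, b in zip(mode1, mode2)]
```

The decomposition-then-transform structure is what lets a single remote sensor inform several critical locations, since only the factors `T` change per location.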

  12. Empathy and child neglect: a theoretical model.

    PubMed

    De Paul, Joaquín; Guibert, María

    2008-11-01

    To present an explanatory, theory-based model of child neglect. This model does not address the neglectful behaviors of parents with mental retardation, alcohol or drug abuse, or severe mental health problems. In this model, parental behavior aimed at satisfying a child's need is considered a helping behavior and, as a consequence, child neglect is considered a specific type of non-helping behavior. The central hypothesis of the theoretical model presented here suggests that neglectful parents cannot develop the helping response set to care for their children either because the observation of a child's signal of need does not lead to the experience of emotions that motivate helping, or because the parents experience these emotions but specific cognitions modify the motivation to help. The present theoretical model suggests that different typologies of neglectful parents could be developed based on the different reasons that parents might fail to experience the emotions that motivate helping behaviors. The model can be helpful in promoting new empirical studies on the etiology of different groups of neglectful families.

  13. Ignorance, Vulnerability and the Occurrence of "Radical Surprises": Theoretical Reflections and Empirical Findings

    NASA Astrophysics Data System (ADS)

    Kuhlicke, C.

    2009-04-01

    , that the flood was far beyond people's power of imagination (nescience). The reason is that, prior to the flood, an institutionalized space of experience and horizon of expectation existed which did not consider the possibility that the "stability" of the river is artificially created by engineering achievements to reduce its naturally given variability. Based on the empirical findings and the theoretical reasoning, overall conclusions are drawn and implications for flood risk management under conditions of global environmental change are outlined.

  14. Adaptation of the concept of varying time of concentration within flood modelling: Theoretical and empirical investigations across the Mediterranean

    NASA Astrophysics Data System (ADS)

    Michailidi, Eleni Maria; Antoniadi, Sylvia; Koukouvinos, Antonis; Bacchi, Baldassare; Efstratiadis, Andreas

    2017-04-01

    The time of concentration, tc, is a key hydrological concept and often an essential parameter of rainfall-runoff modelling; it has traditionally been treated as a characteristic property of the river basin. However, both theoretical proof and empirical evidence imply that tc is a hydraulic quantity that depends on flow, and thus it should be considered a variable rather than a constant parameter. Using a kinematic method approach, easily implemented in a GIS environment, we first illustrate that the relationship between tc and the effective rainfall produced over the catchment is well approximated by a power-type law, the exponent of which is associated with the slope of the longest flow path of the river basin. Next, we take advantage of this relationship to adapt the concept of a varying time of concentration within flood modelling, and particularly the well-known SCS-CN approach. In this context, the initial abstraction ratio is also considered varying, while the propagation of the effective rainfall is employed through a parametric unit hydrograph, the shape of which is dynamically adjusted according to the runoff produced during the flood event. The above framework is tested in a number of Mediterranean river basins in Greece, Italy and Cyprus, ensuring faithful representation of most of the observed flood events. Based on the outcomes of this extended analysis, we provide guidance for employing this methodology in flood design studies for ungauged basins.
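    The power-type law described in this record can be sketched in a few lines; the constants below are illustrative assumptions, not values calibrated in the study.

```python
def time_of_concentration(i_e, t0=5.0, beta=0.25):
    """Return tc (hours) for effective rainfall intensity i_e (mm/h),
    assuming a power-type law tc = t0 * i_e**(-beta).

    t0 and beta are hypothetical; in the study's framework the exponent
    is tied to the slope of the basin's longest flow path.
    """
    return t0 * i_e ** (-beta)

# tc shortens as effective rainfall (and hence flow) grows:
for i_e in (1.0, 4.0, 16.0):
    print(f"i_e = {i_e:5.1f} mm/h -> tc = {time_of_concentration(i_e):.2f} h")
```

Treating tc this way turns the unit hydrograph's time base into a function of the event's runoff rather than a fixed basin constant.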

  15. An update on the "empirical turn" in bioethics: analysis of empirical research in nine bioethics journals.

    PubMed

    Wangmo, Tenzin; Hauri, Sirin; Gennet, Eloise; Anane-Sarpong, Evelyn; Provoost, Veerle; Elger, Bernice S

    2018-02-07

    A review of literature published a decade ago noted a significant increase in empirical papers across nine bioethics journals. This study provides an update on the presence of empirical papers in the same nine journals. It first evaluates whether the empirical trend noted in the previous study is continuing and, second, how it is changing, that is, what the characteristics are of the empirical works published in these nine bioethics journals. A review of the same nine journals (Bioethics; Journal of Medical Ethics; Journal of Clinical Ethics; Nursing Ethics; Cambridge Quarterly of Healthcare Ethics; Hastings Center Report; Theoretical Medicine and Bioethics; Christian Bioethics; and Kennedy Institute of Ethics Journal) was conducted for a 12-year period from 2004 to 2015. The data obtained were analysed descriptively and using a non-parametric chi-square test. Of the total number of original papers (N = 5567) published in the nine bioethics journals, 18.1% (n = 1007) collected and analysed empirical data. Journal of Medical Ethics and Nursing Ethics led the empirical publications, accounting for 89.4% of all empirical papers. The former published significantly more quantitative papers than qualitative, whereas the latter published more qualitative papers. Our analysis reveals no significant difference (χ2 = 2.857; p = 0.091) between the proportions of empirical papers published in 2004-2009 and 2010-2015. However, the increasing empirical trend has continued in these journals, with the proportion of empirical papers increasing from 14.9% in 2004 to 17.8% in 2015. This study presents the current state of affairs regarding empirical research published in nine bioethics journals. In the quarter century of data available about the nine bioethics journals studied in the two reviews, the proportion of empirical publications continues to increase, signifying a trend towards empirical research in bioethics. The growing volume is mainly attributable to two
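    The period-to-period comparison of proportions reported above is a standard 2x2 chi-square test, which can be illustrated with a hand-rolled Pearson statistic. The counts below are invented for the sketch (chosen only to sum to N = 5567 and n = 1007); they are not the study's data.

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (no continuity correction) for the
    2x2 contingency table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts: rows = periods, columns = (empirical, non-empirical)
# period 1: 470 empirical of 2700 papers; period 2: 537 empirical of 2867
stat = chi_square_2x2(470, 2230, 537, 2330)
print(f"chi-square = {stat:.3f}")
```

A statistic this small (well under the 3.84 critical value at p = 0.05, df = 1) would, like the study's result, indicate no significant difference between the two periods' proportions.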

  16. Base course resilient modulus for the mechanistic-empirical pavement design guide.

    DOT National Transportation Integrated Search

    2011-11-01

    The Mechanistic-Empirical Pavement Design Guidelines (MEPDG) recommend use of modulus in lieu of structural number for base layer thickness design. Modulus is nonlinear with respect to effective confinement stress, loading strain, and moisture. For d...

  17. Theoretical Principles to Guide the Teaching of Adjectives to Children Who Struggle With Word Learning: Synthesis of Experimental and Naturalistic Research With Principles of Learning Theory.

    PubMed

    Ricks, Samantha L; Alt, Mary

    2016-07-01

    The purpose of this tutorial is to provide clinicians with a theoretically motivated and evidence-based approach to teaching adjectives to children who struggle with word learning. Given that there are almost no treatment studies to guide this topic, we have synthesized findings from experimental and theoretical literature to come up with a principles-based approach to treatment. We provide a sample lesson plan, incorporating our 3 theoretical principles, and describe the materials chosen and methods used during treatment and assessment. This approach is theoretically motivated, but it needs to be empirically tested.

  18. Empirical corroboration of an earlier theoretical resolution to the UV paradox of insect polarized skylight orientation.

    PubMed

    Wang, Xin; Gao, Jun; Fan, Zhiguo

    2014-02-01

    It is surprising that many insect species use only the ultraviolet (UV) component of the polarized skylight for orientation and navigation purposes, while both the intensity and the degree of polarization of light from the clear sky are lower in the UV than at longer (blue, green, red) wavelengths. Why have these insects chosen the UV part of the polarized skylight? This strange phenomenon is called the "UV-sky-pol paradox". Although several earlier speculations attempted to resolve this paradox, they did so without any quantitative data. A theoretical and computational model has convincingly explained why it is advantageous for certain animals to detect celestial polarization in the UV. We took a sky-polarimetric approach and built a polarized skylight sensor that models the processing of polarization signals by insect photoreceptors. Using this model sensor, we carried out measurements under clear and cloudy sky conditions. Our results showed that light from the cloudy sky has its maximal degree of polarization in the UV. Furthermore, under both clear and cloudy skies the angle of polarization of skylight can be detected with higher accuracy in the UV. By this, we corroborated empirically the soundness of the earlier computational resolution of the UV-sky-pol paradox.
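    For background, the degree of skylight polarization discussed here is usually modeled, to first order, by the single-scattering Rayleigh law as a function of the angular distance from the sun. A minimal sketch of that standard model follows; the ideal maximum d_max = 1.0 is an assumption, and real skies (especially in the UV) have lower maxima.

```python
import math

def degree_of_polarization(gamma_deg, d_max=1.0):
    """Single-scattering Rayleigh degree of polarization:
    d = d_max * sin^2(gamma) / (1 + cos^2(gamma)),
    where gamma is the angular distance from the sun (degrees)."""
    g = math.radians(gamma_deg)
    return d_max * math.sin(g) ** 2 / (1.0 + math.cos(g) ** 2)

# Polarization vanishes toward the sun and peaks 90 degrees away from it.
for gamma in (0.0, 45.0, 90.0):
    print(gamma, degree_of_polarization(gamma))
```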

  19. Empirical corroboration of an earlier theoretical resolution to the UV paradox of insect polarized skylight orientation

    NASA Astrophysics Data System (ADS)

    Wang, Xin; Gao, Jun; Fan, Zhiguo

    2014-02-01

    It is surprising that many insect species use only the ultraviolet (UV) component of the polarized skylight for orientation and navigation purposes, while both the intensity and the degree of polarization of light from the clear sky are lower in the UV than at longer (blue, green, red) wavelengths. Why have these insects chosen the UV part of the polarized skylight? This strange phenomenon is called the "UV-sky-pol paradox". Although several earlier speculations attempted to resolve this paradox, they did so without any quantitative data. A theoretical and computational model has convincingly explained why it is advantageous for certain animals to detect celestial polarization in the UV. We took a sky-polarimetric approach and built a polarized skylight sensor that models the processing of polarization signals by insect photoreceptors. Using this model sensor, we carried out measurements under clear and cloudy sky conditions. Our results showed that light from the cloudy sky has its maximal degree of polarization in the UV. Furthermore, under both clear and cloudy skies the angle of polarization of skylight can be detected with higher accuracy in the UV. By this, we corroborated empirically the soundness of the earlier computational resolution of the UV-sky-pol paradox.

  20. A theoretical method for the analysis and design of axisymmetric bodies. [flow distribution and incompressible fluids

    NASA Technical Reports Server (NTRS)

    Beatty, T. D.

    1975-01-01

    A theoretical method is presented for the computation of the flow field about an axisymmetric body operating in a viscous, incompressible fluid. A potential flow method was used to determine the inviscid flow field and to yield the boundary conditions for the boundary layer solutions. Boundary layer effects, in the form of displacement thickness and empirically modeled separation streamlines, are accounted for in subsequent potential flow solutions. This procedure is repeated until the solutions converge. An empirical method was used to determine base drag, allowing configuration drag to be computed.

  1. Unifying Different Theories of Learning: Theoretical Framework and Empirical Evidence

    ERIC Educational Resources Information Center

    Phan, Huy Phuong

    2008-01-01

    The main aim of this research study was to test out a conceptual model encompassing the theoretical frameworks of achievement goals, study processing strategies, effort, and reflective thinking practice. In particular, it was postulated that the causal influences of achievement goals on academic performance are direct and indirect through study…

  2. Statistical learning as an individual ability: Theoretical perspectives and empirical evidence

    PubMed Central

    Siegelman, Noam; Frost, Ram

    2015-01-01

    Although the power of statistical learning (SL) in explaining a wide range of linguistic functions is gaining increasing support, relatively little research has focused on this theoretical construct from the perspective of individual differences. However, to be able to reliably link individual differences in a given ability such as language learning to individual differences in SL, three critical theoretical questions should be posed: Is SL a componential or unified ability? Is it nested within other general cognitive abilities? Is it a stable capacity of an individual? Following an initial mapping sentence outlining the possible dimensions of SL, we employed a battery of SL tasks in the visual and auditory modalities, using verbal and non-verbal stimuli, with adjacent and non-adjacent contingencies. SL tasks were administered along with general cognitive tasks in a within-subject design at two time points to explore our theoretical questions. We found that SL, as measured by some tasks, is a stable and reliable capacity of an individual. Moreover, we found SL to be independent of general cognitive abilities such as intelligence or working memory. However, SL is not a unified capacity, so that individual sensitivity to conditional probabilities is not uniform across modalities and stimuli. PMID:25821343

  3. The growth of business firms: theoretical framework and empirical evidence.

    PubMed

    Fu, Dongfeng; Pammolli, Fabio; Buldyrev, S V; Riccaboni, Massimo; Matia, Kaushik; Yamasaki, Kazuko; Stanley, H Eugene

    2005-12-27

    We introduce a model of proportional growth to explain the distribution P_g(g) of business-firm growth rates. The model predicts that P_g(g) is exponential in the central part and exhibits asymptotic power-law behavior in the tails with an exponent ζ = 3. Because of data limitations, previous studies in this field have focused exclusively on the Laplace shape of the body of the distribution. In this article, we test the model at different levels of aggregation in the economy, from products to firms to countries, and we find that the predictions of the model agree with empirical growth distributions and size-variance relationships.
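    A minimal simulation of the proportional-growth idea, with invented unit counts and shock sizes, reproduces the qualitative size-variance relationship mentioned at the end of the abstract: firms composed of more independent units show smaller growth-rate fluctuations.

```python
import math
import random

random.seed(0)
SIGMA = 0.2  # standard deviation of log unit shocks (illustrative)

def log_growth(n_units):
    """Log growth rate of a firm made of n_units independent units,
    each starting at size 1 and hit by a multiplicative lognormal shock."""
    before = n_units * 1.0
    after = sum(math.exp(random.gauss(0.0, SIGMA)) for _ in range(n_units))
    return math.log(after / before)

small = [log_growth(2) for _ in range(2000)]   # few-unit firms
large = [log_growth(50) for _ in range(2000)]  # many-unit firms

def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

print(var(small), var(large))  # large firms fluctuate less
```

This sketch only captures the variance scaling; reproducing the Laplace body and ζ = 3 power-law tails requires the full model's distribution of unit counts.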

  4. Implementing community-based provider participation in research: an empirical study.

    PubMed

    Teal, Randall; Bergmire, Dawn M; Johnston, Matthew; Weiner, Bryan J

    2012-05-08

    Since 2003, the United States National Institutes of Health (NIH) has sought to restructure the clinical research enterprise in the United States by promoting collaborative research partnerships between academically-based investigators and community-based physicians. By increasing community-based provider participation in research (CBPPR), the NIH seeks to advance the science of discovery by conducting research in clinical settings where most people get their care, and accelerate the translation of research results into everyday clinical practice. Although CBPPR is seen as a promising strategy for promoting the use of evidence-based clinical services in community practice settings, few empirical studies have examined the organizational factors that facilitate or hinder the implementation of CBPPR. The purpose of this study is to explore the organizational start-up and early implementation of CBPPR in community-based practice. We used longitudinal, case study research methods and an organizational model of innovation implementation to theoretically guide our study. Our sample consisted of three community practice settings that recently joined the National Cancer Institute's (NCI) Community Clinical Oncology Program (CCOP) in the United States. Data were gathered through site visits, telephone interviews, and archival documents from January 2008 to May 2011. The organizational model for innovation implementation was useful in identifying and investigating the organizational factors influencing start-up and early implementation of CBPPR in CCOP organizations. In general, the three CCOP organizations varied in the extent to which they achieved consistency in CBPPR over time and across physicians. All three CCOP organizations demonstrated mixed levels of organizational readiness for change. Hospital management support and resource availability were limited across CCOP organizations early on, although they improved in one CCOP organization. 
As a result of weak IPPs, all

  5. Implementing community-based provider participation in research: an empirical study

    PubMed Central

    2012-01-01

    Background Since 2003, the United States National Institutes of Health (NIH) has sought to restructure the clinical research enterprise in the United States by promoting collaborative research partnerships between academically-based investigators and community-based physicians. By increasing community-based provider participation in research (CBPPR), the NIH seeks to advance the science of discovery by conducting research in clinical settings where most people get their care, and accelerate the translation of research results into everyday clinical practice. Although CBPPR is seen as a promising strategy for promoting the use of evidence-based clinical services in community practice settings, few empirical studies have examined the organizational factors that facilitate or hinder the implementation of CBPPR. The purpose of this study is to explore the organizational start-up and early implementation of CBPPR in community-based practice. Methods We used longitudinal, case study research methods and an organizational model of innovation implementation to theoretically guide our study. Our sample consisted of three community practice settings that recently joined the National Cancer Institute’s (NCI) Community Clinical Oncology Program (CCOP) in the United States. Data were gathered through site visits, telephone interviews, and archival documents from January 2008 to May 2011. Results The organizational model for innovation implementation was useful in identifying and investigating the organizational factors influencing start-up and early implementation of CBPPR in CCOP organizations. In general, the three CCOP organizations varied in the extent to which they achieved consistency in CBPPR over time and across physicians. All three CCOP organizations demonstrated mixed levels of organizational readiness for change. Hospital management support and resource availability were limited across CCOP organizations early on, although they improved in one CCOP organization

  6. Body surface assessment with 3D laser-based anthropometry: reliability, validation, and improvement of empirical surface formulae.

    PubMed

    Kuehnapfel, Andreas; Ahnert, Peter; Loeffler, Markus; Scholz, Markus

    2017-02-01

    Body surface area is a physiological quantity relevant for many medical applications. In clinical practice, it is determined by empirical formulae. 3D laser-based anthropometry provides an easy and effective way to measure body surface area but is not ubiquitously available. We used data from laser-based anthropometry from a population-based study to assess the validity of published and commonly used empirical formulae. We performed a large population-based study on adults, collecting classical anthropometric measurements and 3D body surface assessments (N = 1435). We determined the reliability of the 3D body surface assessment and the validity of 18 different empirical formulae proposed in the literature. The performance of these formulae is studied in subsets of sex and BMI. Finally, improvements of the parameter settings of the formulae and adjustments for sex and BMI were considered. 3D body surface measurements show excellent intra- and inter-rater reliability of 0.998 (the overall concordance correlation coefficient, OCCC, was used as the measure of agreement). The empirical formulae of Fujimoto and Watanabe, Shuter and Aslani, and Sendroy and Cecchini performed best, with excellent concordance (OCCC > 0.949) even in subgroups of sex and BMI. Re-parametrization of the formulae and adjustment for sex and BMI slightly improved the results. In adults, 3D laser-based body surface assessment is a reliable alternative to estimation by empirical formulae. However, there are empirical formulae showing excellent results even in subgroups of sex and BMI, with little room for improvement.
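    The specific formulae compared in the study are not reproduced here; as an illustration of the height-weight power-law form such empirical formulae share, the classic Du Bois and Du Bois (1916) formula is:

```python
def bsa_du_bois(weight_kg, height_cm):
    """Estimate body surface area (m^2) with the Du Bois & Du Bois (1916)
    empirical formula: BSA = 0.007184 * W^0.425 * H^0.725."""
    return 0.007184 * weight_kg ** 0.425 * height_cm ** 0.725

# Example: a 70 kg, 180 cm adult
print(round(bsa_du_bois(70, 180), 3))
```

Formulae of this type are fitted regressions, which is why studies like the one above can re-parametrize the coefficients or add sex/BMI adjustments.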

  7. Empirical Bayes methods for smoothing data and for simultaneous estimation of many parameters.

    PubMed Central

    Yanagimoto, T; Kashiwagi, N

    1990-01-01

    A recent successful development is a series of innovative new statistical methods for smoothing data based on the empirical Bayes method. This paper emphasizes their practical usefulness in the medical sciences and their theoretically close relationship with the problem of simultaneously estimating many parameters across strata. The paper also presents two examples of analyzing epidemiological data obtained in Japan using the smoothing methods, illustrating their favorable performance. PMID:2148512
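    The shrinkage idea underlying such empirical Bayes smoothing can be sketched as follows: stratum-level rates are pulled toward the pooled rate, stabilizing estimates from small strata. The prior weight m is a fixed assumption here; a full empirical Bayes method would estimate it from the data.

```python
def eb_shrink(events, totals, m=10.0):
    """Shrink per-stratum event rates toward the pooled rate.

    Each stratum's estimate is (x + m * pooled) / (n + m): a weighted
    compromise between its raw rate x/n and the pooled rate, with the
    pooled rate counting as m pseudo-observations (m is illustrative).
    """
    pooled = sum(events) / sum(totals)
    return [(x + m * pooled) / (n + m) for x, n in zip(events, totals)]

raw = [0 / 2, 5 / 100]              # a tiny stratum and a large one
smooth = eb_shrink([0, 5], [2, 100])
print(raw, smooth)
```

Note how the tiny stratum's implausible raw rate of 0 is pulled toward the pooled rate, while the well-observed stratum barely moves.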

  8. Potential benefits of remote sensing: Theoretical framework and empirical estimate

    NASA Technical Reports Server (NTRS)

    Eisgruber, L. M.

    1972-01-01

    A theoretical framework is outlined for estimating the social returns from research on and application of remote sensing. The approximate dollar magnitude of a particular application of remote sensing, namely estimates of corn, soybean, and wheat production, is given. Finally, some comments are made on the limitations of this procedure and on the implications of the results.

  9. Validation of the theoretical domains framework for use in behaviour change and implementation research.

    PubMed

    Cane, James; O'Connor, Denise; Michie, Susan

    2012-04-24

    An integrative theoretical framework, developed for cross-disciplinary implementation and other behaviour change research, has been applied across a wide range of clinical situations. This study tests the validity of this framework. Validity was investigated by behavioural experts sorting 112 unique theoretical constructs using closed and open sort tasks. The extent of replication was tested by Discriminant Content Validation and Fuzzy Cluster Analysis. There was good support for a refinement of the framework comprising 14 domains of theoretical constructs (average silhouette value 0.29): 'Knowledge', 'Skills', 'Social/Professional Role and Identity', 'Beliefs about Capabilities', 'Optimism', 'Beliefs about Consequences', 'Reinforcement', 'Intentions', 'Goals', 'Memory, Attention and Decision Processes', 'Environmental Context and Resources', 'Social Influences', 'Emotions', and 'Behavioural Regulation'. The refined Theoretical Domains Framework has a strengthened empirical base and provides a method for theoretically assessing implementation problems, as well as professional and other health-related behaviours as a basis for intervention development.

  10. PATENTS AND RESEARCH INVESTMENTS: ASSESSING THE EMPIRICAL EVIDENCE.

    PubMed

    Budish, Eric; Roin, Benjamin N; Williams, Heidi L

    2016-05-01

    A well-developed theoretical literature, dating back at least to Nordhaus (1969), has analyzed optimal patent policy design. We re-present the core trade-off of the Nordhaus model and highlight an empirical question which emerges from the Nordhaus framework as a key input into optimal patent policy design: namely, what is the elasticity of R&D investment with respect to the patent term? We then review the surprisingly small body of empirical evidence that has been developed on this question over the nearly half century since the publication of Nordhaus's book.

  11. What should we mean by empirical validation in hypnotherapy: evidence-based practice in clinical hypnosis.

    PubMed

    Alladin, Assen; Sabatini, Linda; Amundson, Jon K

    2007-04-01

    This paper briefly surveys the trend of and controversy surrounding empirical validation in psychotherapy. Empirical validation of hypnotherapy has paralleled the practice of validation in psychotherapy and the professionalization of clinical psychology, in general. This evolution in determining what counts as evidence for bona fide clinical practice has gone from theory-driven clinical approaches in the 1960s and 1970s through critical attempts at categorization of empirically supported therapies in the 1990s on to the concept of evidence-based practice in 2006. Implications of this progression in professional psychology are discussed in the light of hypnosis's current quest for validation and empirical accreditation.

  12. Theoretical Implementations of Various Mobile Applications Used in English Language Learning

    ERIC Educational Resources Information Center

    Small, Melissa

    2014-01-01

    This review of the theoretical framework for Mastery Learning Theory and Sense of Community theories is provided in conjunction with a review of the literature for mobile technology in relation to language learning. Although empirical research is minimal for mobile phone technology as an aid for language learning, the empirical research that…

  13. The normative background of empirical-ethical research: first steps towards a transparent and reasoned approach in the selection of an ethical theory.

    PubMed

    Salloch, Sabine; Wäscher, Sebastian; Vollmann, Jochen; Schildmann, Jan

    2015-04-04

    Empirical-ethical research constitutes a relatively new field which integrates socio-empirical research and normative analysis. As direct inferences from descriptive data to normative conclusions are problematic, an ethical framework is needed to determine the relevance of the empirical data for normative argument. While issues of normative-empirical collaboration and questions of empirical methodology have been widely discussed in the literature, the normative methodology of empirical-ethical research has seldom been addressed. Based on our own research experience, we discuss one aspect of this normative methodology, namely the selection of an ethical theory serving as a background for empirical-ethical research. Whereas criteria for a good ethical theory in philosophical ethics are usually related to inherent aspects, such as the theory's clarity or coherence, additional points have to be considered in the field of empirical-ethical research. Three of these additional criteria will be discussed in the article: (a) the adequacy of the ethical theory for the issue at stake, (b) the theory's suitability for the purposes and design of the empirical-ethical research project, and (c) the interrelation between the ethical theory selected and the theoretical backgrounds of the socio-empirical research. Using the example of our own study on the development of interventions which support clinical decision-making in oncology, we will show how the selection of an ethical theory as a normative background for empirical-ethical research can proceed. We will also discuss the limitations of the procedures chosen in our project. The article stresses that a systematic and reasoned approach towards theory selection in empirical-ethical research should be given priority rather than an accidental or implicit way of choosing the normative framework for one's own research. 
It furthermore shows that the overall design of an empirical-ethical study is a multi-faceted endeavor which has to

  14. Systematic Review of Empirically Evaluated School-Based Gambling Education Programs.

    PubMed

    Keen, Brittany; Blaszczynski, Alex; Anjoul, Fadi

    2017-03-01

    Adolescent problem gambling prevalence rates are reportedly five times higher than in the adult population. Several school-based gambling education programs have been developed in an attempt to reduce problem gambling among adolescents; however, few have been empirically evaluated. The aim of this review was to report the outcomes of studies empirically evaluating gambling education programs across international jurisdictions. A systematic review was conducted following the guidelines outlined in the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) Statement, searching five academic databases: PubMed, Scopus, Medline, PsycINFO, and ERIC. A total of 20 papers and 19 studies were included after screening and exclusion criteria were applied. All studies reported intervention effects on cognitive outcomes such as knowledge, perceptions, and beliefs. Only nine of the studies attempted to measure intervention effects on behavioural outcomes, and only five of those reported significant changes in gambling behaviour. Of these five, methodological inadequacies were commonly found, including brief follow-up periods, lack of a control comparison in post hoc analyses, and inconsistencies and misclassifications in the measurement of gambling behaviour, including problem gambling. Based on this review, recommendations are offered for the future development and evaluation of school-based gambling education programs relating to both methodological and content design and delivery considerations.

  15. The Empirical Distribution of Singletons for Geographic Samples of DNA Sequences.

    PubMed

    Cubry, Philippe; Vigouroux, Yves; François, Olivier

    2017-01-01

    Rare variants are important for drawing inference about past demographic events in a species' history. A singleton is a rare variant for which genetic variation is carried by a unique chromosome in a sample. How singletons are distributed across geographic space provides a local measure of genetic diversity that can be measured at the individual level. Here, we define the empirical distribution of singletons in a sample of chromosomes as the proportion of the total number of singletons that each chromosome carries, and we present a theoretical background for studying this distribution. Next, we use computer simulations to evaluate the potential for the empirical distribution of singletons to provide a description of genetic diversity across geographic space. In a Bayesian framework, we show that the empirical distribution of singletons leads to accurate estimates of the geographic origin of range expansions. We apply the Bayesian approach to estimating the origin of the cultivated plant species Pennisetum glaucum [L.] R. Br. (pearl millet) in Africa, and find support for range expansion having started from Northern Mali. Overall, we report that the empirical distribution of singletons is a useful measure for analyzing the results of sequencing projects based on large-scale sampling of individuals across geographic space.
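    The definition given above, the proportion of all singletons that each chromosome carries, can be made concrete with a toy haplotype matrix (invented for illustration):

```python
# Rows = sampled chromosomes, columns = variant sites (1 = derived allele).
haplotypes = [
    [1, 0, 0, 1],
    [0, 1, 0, 1],
    [0, 0, 0, 1],
]

n_sites = len(haplotypes[0])
site_counts = [sum(h[j] for h in haplotypes) for j in range(n_sites)]

# A singleton site is carried by exactly one chromosome in the sample.
singleton_sites = [j for j, c in enumerate(site_counts) if c == 1]

# Empirical distribution of singletons: each chromosome's share of the
# total singleton count.
per_chrom = [sum(h[j] for j in singleton_sites) for h in haplotypes]
total = sum(per_chrom)
empirical_distribution = [c / total for c in per_chrom]
print(empirical_distribution)
```

Here the last site (carried by all three chromosomes) is excluded, so the two singletons split evenly between the first two chromosomes and the third carries none.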

  16. Segmented crystalline scintillators: empirical and theoretical investigation of a high quantum efficiency EPID based on an initial engineering prototype CsI(Tl) detector.

    PubMed

    Sawant, Amit; Antonuk, Larry E; El-Mohri, Youcef; Zhao, Qihua; Wang, Yi; Li, Yixin; Du, Hong; Perna, Louis

    2006-04-01

    Modern-day radiotherapy relies on highly sophisticated forms of image guidance in order to implement increasingly conformal treatment plans and achieve precise dose delivery. One of the most important goals of such image guidance is to delineate the clinical target volume from surrounding normal tissue during patient setup and dose delivery, thereby avoiding dependence on surrogates such as bony landmarks. In order to achieve this goal, it is necessary to integrate highly efficient imaging technology, capable of resolving soft-tissue contrast at very low doses, within the treatment setup. In this paper we report on the development of one such modality, which comprises a nonoptimized, prototype electronic portal imaging device (EPID) based on a 40 mm thick, segmented crystalline CsI(Tl) detector incorporated into an indirect-detection active matrix flat panel imager (AMFPI). The segmented detector consists of a matrix of 160 x 160 optically isolated, crystalline CsI(Tl) elements spaced at 1016 µm pitch. The detector was coupled to an indirect detection-based active matrix array having a pixel pitch of 508 µm, with each detector element registered to 2 x 2 array pixels. The performance of the prototype imager was evaluated under very low-dose radiotherapy conditions and compared to that of a conventional megavoltage AMFPI based on a Lanex Fast-B phosphor screen. Detailed quantitative measurements were performed in order to determine the x-ray sensitivity, modulation transfer function, noise power spectrum, and detective quantum efficiency (DQE). In addition, images of a contrast-detail phantom and an anthropomorphic head phantom were also acquired. The prototype imager exhibited approximately 22 times higher zero-frequency DQE (approximately 22%) compared to that of the conventional AMFPI (approximately 1%). The measured zero-frequency DQE was found to be lower than theoretical upper limits (approximately 27%) calculated from Monte Carlo simulations, which

  17. EMPIRE and pyenda: Two ensemble-based data assimilation systems written in Fortran and Python

    NASA Astrophysics Data System (ADS)

    Geppert, Gernot; Browne, Phil; van Leeuwen, Peter Jan; Merker, Claire

    2017-04-01

    We present and compare the features of two ensemble-based data assimilation frameworks, EMPIRE and pyenda. Both frameworks allow models to be coupled to the assimilation codes using the Message Passing Interface (MPI), leading to extremely efficient and fast coupling between the models and the data-assimilation codes. The Fortran-based system EMPIRE (Employing Message Passing Interface for Researching Ensembles) is optimized for parallel, high-performance computing. It currently includes a suite of data assimilation algorithms, including variants of the ensemble Kalman filter and several particle filters. EMPIRE is targeted at models of all levels of complexity and has been coupled to several geoscience models, e.g., the Lorenz-63 model, a barotropic vorticity model, the general circulation model HadCM3, the ocean model NEMO, and the land-surface model JULES. The Python-based system pyenda (Python Ensemble Data Assimilation) allows Fortran- and Python-based models to be used for data assimilation. Models can be coupled either using MPI or through a Python interface. Using Python allows quick prototyping, and pyenda is aimed at small- to medium-scale models. pyenda currently includes variants of the ensemble Kalman filter and has been coupled to the Lorenz-63 model, an advection-based precipitation nowcasting scheme, and the dynamic global vegetation model JSBACH.
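    For readers unfamiliar with the algorithm family both frameworks implement, the following is a generic sketch of one stochastic ensemble Kalman filter analysis step; the function and variable names are generic illustrations, not the EMPIRE or pyenda API.

```python
# Minimal sketch of a stochastic ensemble Kalman filter update step.
import numpy as np

def enkf_update(ensemble, obs, H, R, rng):
    """One EnKF analysis step.

    ensemble : (n_members, n_state) forecast ensemble
    obs      : (n_obs,) observation vector
    H        : (n_obs, n_state) linear observation operator
    R        : (n_obs, n_obs) observation-error covariance
    """
    n_members = ensemble.shape[0]
    X = ensemble - ensemble.mean(axis=0)           # state anomalies
    Y = X @ H.T                                    # observation-space anomalies
    Pyy = Y.T @ Y / (n_members - 1) + R
    Pxy = X.T @ Y / (n_members - 1)
    K = Pxy @ np.linalg.inv(Pyy)                   # Kalman gain
    # Perturb the observations so the analysis spread is statistically correct.
    perturbed = obs + rng.multivariate_normal(np.zeros(len(obs)), R, n_members)
    innovations = perturbed - ensemble @ H.T
    return ensemble + innovations @ K.T

rng = np.random.default_rng(0)
prior = rng.normal(0.0, 1.0, size=(50, 2))         # 50 members, 2-variable state
H = np.array([[1.0, 0.0]])                         # observe the first variable only
R = np.array([[0.1]])
posterior = enkf_update(prior, np.array([0.5]), H, R, rng)
```

    With a small observation error relative to the prior spread, the analysis ensemble mean for the observed variable is drawn toward the observation.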

  18. Personality traits and achievement motives: theoretical and empirical relations between the NEO Personality Inventory-Revised and the Achievement Motives Scale.

    PubMed

    Diseth, Age; Martinsen, Øyvind

    2009-04-01

    Theoretical and empirical relations between personality traits and motive dispositions were investigated by comparing scores of 315 undergraduate psychology students on the NEO Personality Inventory-Revised and the Achievement Motives Scale. Analyses showed all NEO Personality Inventory-Revised factors except Agreeableness were significantly correlated with the motive for success and the motive to avoid failure. A structural equation model showed that motive for success was predicted by Extraversion, Openness, Conscientiousness, and Neuroticism (negative relation), and motive to avoid failure was predicted by Neuroticism and Openness (negative relation). Although both achievement motives were predicted by several personality factors, motive for success was most strongly predicted by Openness, and motive to avoid failure was most strongly predicted by Neuroticism. These findings extended previous research on the relations of personality traits and achievement motives and provided a basis for the discussion of motive dispositions in personality. The results also added to the construct validity of the Achievement Motives Scale.

  19. Empirical factors and structure transference: Returning to the London account

    NASA Astrophysics Data System (ADS)

    Bueno, Otávio; French, Steven; Ladyman, James

    2012-05-01

    We offer a framework to represent the roles of empirical and theoretical factors in theory construction, and examine a case study to illustrate how the framework can be used to illuminate central features of scientific reasoning. The case study provides an extension of French and Ladyman's (1997) analysis of Fritz and Heinz London's model of superconductivity to accommodate the role of the analogy between superconductivity and diamagnetic phenomena in the development of the model between 1935 and 1937. We focus on this case since it allows us to separate the roles of empirical and theoretical factors, and so provides an example of the utility of the approach that we have adopted. We conclude the paper by drawing on the particular framework here developed to address a range of concerns.

  20. Peers and Obesity during Childhood and Adolescence: A Review of the Empirical Research on Peers, Eating, and Physical Activity

    PubMed Central

    Salvy, Sarah-Jeanne; Bowker, Julie C.

    2015-01-01

    Obesity during childhood and adolescence is a growing problem in the United States, Canada, and around the world that leads to significant physical, psychological, and social impairment. In recent years, empirical research on factors that contribute to the development and maintenance of obesity has begun to consider peer experiences, such as peer rejection, peer victimization, and friendship. Peer experiences have been theoretically and empirically related to the “Big Two” contributors to the obesity epidemic, eating and physical activity, but there has not been a comprehensive review of the extant empirical literature. In this article, we review and synthesize the emerging theoretical and empirical literatures on peer experiences in relation to: (a) eating (food consumption and food selection); and (b) physical activity, during childhood and adolescence. A number of limitations and issues in the theoretical and empirical literatures are also discussed, along with future research directions. In conclusion, we argue that the involvement of children and adolescents’ peer networks in prevention and intervention efforts may be critical for promoting and maintaining positive behavioral health trajectories. PMID:28090396

  1. Music Therapy for Posttraumatic Stress in Adults: A Theoretical Review

    PubMed Central

    Landis-Shack, Nora; Heinz, Adrienne J.; Bonn-Miller, Marcel O.

    2017-01-01

    Music therapy has been employed as a therapeutic intervention to facilitate healing across a variety of clinical populations. There is theoretical and empirical evidence to suggest that individuals with trauma exposure and Posttraumatic Stress Disorder (PTSD), a condition characterized by enduring symptoms of distressing memory intrusions, avoidance, emotional disturbance, and hyperarousal, may derive benefits from music therapy. The current narrative review describes the practice of music therapy and presents a theoretically-informed assessment and model of music therapy as a tool for addressing symptoms of PTSD. The review also presents key empirical studies that support the theoretical assessment. Social, cognitive, and neurobiological mechanisms (e.g., community building, emotion regulation, increased pleasure, anxiety reduction) that promote music therapy’s efficacy as an adjunctive treatment for individuals with posttraumatic stress are discussed. It is concluded that music therapy may be a useful therapeutic tool to reduce symptoms and improve functioning among individuals with trauma exposure and PTSD, though more rigorous empirical study is required. In addition, music therapy may help foster resilience and engage individuals who struggle with stigma associated with seeking professional help. Practical recommendations for incorporating music therapy into clinical practice are offered along with several suggestions for future research. PMID:29290641

  2. Theoretical Models, Assessment Frameworks and Test Construction.

    ERIC Educational Resources Information Center

    Chalhoub-Deville, Micheline

    1997-01-01

    Reviews the usefulness of proficiency models influencing second language testing. Findings indicate that several factors contribute to the lack of congruence between models and test construction and make a case for distinguishing between theoretical models. Underscores the significance of an empirical, contextualized and structured approach to the…

  3. Empirical dual energy calibration (EDEC) for cone-beam computed tomography.

    PubMed

    Stenner, Philip; Berkus, Timo; Kachelriess, Marc

    2007-09-01

    Material-selective imaging using dual energy CT (DECT) relies heavily on well-calibrated material decomposition functions. These require precise knowledge of the detected x-ray spectra, and even if these are exactly known, the reliability of DECT will suffer from scattered radiation. We propose an empirical method to determine the proper decomposition function. In contrast to other decomposition algorithms, our empirical dual energy calibration (EDEC) technique requires knowledge of neither the spectra nor the attenuation coefficients. The desired material-selective raw data p1 and p2 are obtained as functions of the measured attenuation data q1 and q2 (one DECT scan = two raw data sets) by passing them through a polynomial function. The polynomial's coefficients are determined using a general least-squares fit based on thresholded images of a calibration phantom. The calibration phantom's dimensions should be of the same order of magnitude as the test object, but other than that no assumptions are made about its exact size or positioning. Once the decomposition coefficients are determined, DECT raw data can be decomposed by simply passing them through the polynomial. To demonstrate EDEC, simulations of an oval CTDI phantom, a lung phantom, a thorax phantom, and a mouse phantom were carried out. The method was further verified by measuring a physical mouse phantom, a half-and-half-cylinder phantom, and a Yin-Yang phantom with a dedicated in vivo dual source micro-CT scanner. The raw data were decomposed into their components, reconstructed, and the pixel values obtained were compared to the theoretical values. The determination of the calibration coefficients with EDEC is very robust and depends only slightly on the type of calibration phantom used. The images of the test phantoms (simulations and measurements) show nearly perfect agreement with the theoretical µ-values and density values. Since EDEC is an empirical technique, it inherently compensates for scatter.
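    The calibration idea described above can be sketched in a few lines: fit a bivariate polynomial mapping the measured attenuation pair (q1, q2) to the known material-selective value by least squares. The quadratic basis and the synthetic "phantom" data below are illustrative assumptions, not the paper's exact procedure.

```python
# Hedged sketch of the EDEC idea: learn a polynomial mapping dual-energy
# attenuation data (q1, q2) to material-selective data p1 by least squares
# against known values from a calibration phantom.
import numpy as np

def basis(q1, q2):
    """Bivariate quadratic polynomial basis evaluated at each (q1, q2) pair."""
    return np.column_stack(
        [np.ones_like(q1), q1, q2, q1 * q2, q1**2, q2**2]
    )

# Synthetic calibration data: pretend the true decomposition is known for
# the phantom (in practice it comes from thresholded phantom images).
rng = np.random.default_rng(1)
q1 = rng.uniform(0.0, 2.0, 200)
q2 = rng.uniform(0.0, 2.0, 200)
p1_true = 0.7 * q1 - 0.3 * q2 + 0.1 * q1 * q2   # made-up ground truth

A = basis(q1, q2)
coeffs, *_ = np.linalg.lstsq(A, p1_true, rcond=None)

# Once calibrated, decomposition is just a polynomial evaluation:
p1_est = basis(q1, q2) @ coeffs
```

    Because the synthetic ground truth lies in the span of the basis, the fit recovers it essentially exactly; real data would also carry noise and scatter, which the fit absorbs empirically.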

  4. Social Health Inequalities and eHealth: A Literature Review With Qualitative Synthesis of Theoretical and Empirical Studies.

    PubMed

    Latulippe, Karine; Hamel, Christine; Giroux, Dominique

    2017-04-27

    eHealth is developing rapidly and brings with it a promise to reduce social health inequalities (SHIs). Yet, it appears that it also has the potential to increase them. The general objective of this review was to set out how to ensure that eHealth contributes to reducing SHIs rather than exacerbating them. This review has three objectives: (1) identifying characteristics of people at risk of experiencing social inequality in health; (2) determining the possibilities of developing eHealth tools that avoid increasing SHI; and (3) modeling the process of using an eHealth tool by people vulnerable to SHI. Following the EPPI approach (Evidence for Policy and Practice of Information of the Institute of Education at the University of London), two databases were searched for the terms SHIs and eHealth and their derivatives in titles and abstracts. Qualitative, quantitative, and mixed articles were included and evaluated. The software NVivo (QSR International) was employed to extract the data and allow for a metasynthesis of the data. Of the 73 articles retained, 10 were theoretical, 7 were from reviews, and 56 were based on empirical studies. Of the latter, 40 used a quantitative approach, 8 used a qualitative approach, 4 used a mixed-methods approach, and only 4 were based on a participatory research-action approach. The digital divide in eHealth is a serious barrier and contributes greatly to SHI. Ethnicity and low income are the most commonly used characteristics to identify people at risk of SHI. The most promising actions for reducing SHI via eHealth are to aim for universal access to the tool of eHealth, become aware of users' literacy level, create eHealth tools that respect the cultural attributes of future users, and encourage the participation of people at risk of SHI. eHealth has the potential to widen the gulf between those at risk of SHI and the rest of the population. The widespread expansion of eHealth technologies calls for rigorous consideration of

  5. Social Health Inequalities and eHealth: A Literature Review With Qualitative Synthesis of Theoretical and Empirical Studies

    PubMed Central

    Hamel, Christine; Giroux, Dominique

    2017-01-01

    Background eHealth is developing rapidly and brings with it a promise to reduce social health inequalities (SHIs). Yet, it appears that it also has the potential to increase them. Objectives The general objective of this review was to set out how to ensure that eHealth contributes to reducing SHIs rather than exacerbating them. This review has three objectives: (1) identifying characteristics of people at risk of experiencing social inequality in health; (2) determining the possibilities of developing eHealth tools that avoid increasing SHI; and (3) modeling the process of using an eHealth tool by people vulnerable to SHI. Methods Following the EPPI approach (Evidence for Policy and Practice of Information of the Institute of Education at the University of London), two databases were searched for the terms SHIs and eHealth and their derivatives in titles and abstracts. Qualitative, quantitative, and mixed articles were included and evaluated. The software NVivo (QSR International) was employed to extract the data and allow for a metasynthesis of the data. Results Of the 73 articles retained, 10 were theoretical, 7 were from reviews, and 56 were based on empirical studies. Of the latter, 40 used a quantitative approach, 8 used a qualitative approach, 4 used a mixed-methods approach, and only 4 were based on a participatory research-action approach. The digital divide in eHealth is a serious barrier and contributes greatly to SHI. Ethnicity and low income are the most commonly used characteristics to identify people at risk of SHI. The most promising actions for reducing SHI via eHealth are to aim for universal access to the tool of eHealth, become aware of users’ literacy level, create eHealth tools that respect the cultural attributes of future users, and encourage the participation of people at risk of SHI. Conclusions eHealth has the potential to widen the gulf between those at risk of SHI and the rest of the population. The widespread expansion of e

  6. An empirical study using permutation-based resampling in meta-regression

    PubMed Central

    2012-01-01

    Background In meta-regression, as the number of trials in the analyses decreases, the risk of false positives or false negatives increases. This is partly due to the assumption of normality that may not hold in small samples. Creation of a distribution from the observed trials using permutation methods to calculate P values may allow for less spurious findings. Permutation has not been empirically tested in meta-regression. The objective of this study was to perform an empirical investigation to explore the differences in results for meta-analyses on a small number of trials using standard large sample approaches versus permutation-based methods for meta-regression. Methods We isolated a sample of randomized controlled clinical trials (RCTs) for interventions that have a small number of trials (herbal medicine trials). Trials were then grouped by herbal species and condition and assessed for methodological quality using the Jadad scale, and data were extracted for each outcome. Finally, we performed meta-analyses on the primary outcome of each group of trials and meta-regression for methodological quality subgroups within each meta-analysis. We used large sample methods and permutation methods in our meta-regression modeling. We then compared final models and final P values between methods. Results We collected 110 trials across 5 intervention/outcome pairings and 5 to 10 trials per covariate. When applying large sample methods and permutation-based methods in our backwards stepwise regression, the covariates in the final models were identical in all cases. The P values for the covariates in the final model were larger in 78% (7/9) of the cases for permutation and identical for 22% (2/9) of the cases. Conclusions We present empirical evidence that permutation-based resampling may not change final models when using backwards stepwise regression, but may increase P values in meta-regression of multiple covariates for a relatively small number of trials. PMID:22587815
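    The permutation idea the study tests can be sketched simply: permute the covariate across trials, refit the regression each time, and compare the observed coefficient against the permutation distribution. The code below is a minimal unweighted illustration with synthetic data, not the study's actual meta-regression model.

```python
# Sketch of a permutation-based P value for a single regression covariate.
import random

def slope(x, y):
    """Ordinary least-squares slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

def permutation_p_value(x, y, n_perm=999, seed=0):
    rng = random.Random(seed)
    observed = abs(slope(x, y))
    hits = 0
    for _ in range(n_perm):
        shuffled = x[:]
        rng.shuffle(shuffled)            # break any x-y association
        if abs(slope(shuffled, y)) >= observed:
            hits += 1
    # The +1 correction counts the observed arrangement itself.
    return (hits + 1) / (n_perm + 1)

# Synthetic example: quality scores (e.g. Jadad-like) vs. trial effect sizes.
quality = [1, 2, 2, 3, 3, 4, 4, 5]
effect = [0.9, 0.8, 0.85, 0.6, 0.65, 0.4, 0.45, 0.2]
p = permutation_p_value(quality, effect)
```

    With strongly associated data like this, very few permutations reproduce the observed slope, so the permutation P value is small; with weak associations and few trials, it tends to be larger than the normal-theory P value, consistent with the study's findings.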

  7. Satellite-based empirical models linking river plume dynamics with hypoxic area and volume

    EPA Science Inventory

    Satellite-based empirical models explaining hypoxic area and volume variation were developed for the seasonally hypoxic (O2 < 2 mg L−1) northern Gulf of Mexico adjacent to the Mississippi River. Annual variations in midsummer hypoxic area and ...

  8. Navigating Instructional Dialectics: Empirical Exploration of Paradox in Teaching

    ERIC Educational Resources Information Center

    Thompson, Blair; Rudick, C. Kyle; Kerssen-Griep, Jeff; Golsan, Kathryn

    2018-01-01

    Navigating contradiction represents an integral part of the teaching process. While educational literature has discussed the paradoxes that teachers experience in the classroom, minimal empirical research has analyzed the strategies teachers employ to address these paradoxes. Using relational dialectics as a theoretical framework for understanding…

  9. Implementing Evidence-Based Practice: A Review of the Empirical Research Literature

    ERIC Educational Resources Information Center

    Gray, Mel; Joy, Elyssa; Plath, Debbie; Webb, Stephen A.

    2013-01-01

    The article reports on the findings of a review of empirical studies examining the implementation of evidence-based practice (EBP) in the human services. Eleven studies were located that defined EBP as a research-informed, clinical decision-making process and identified barriers and facilitators to EBP implementation. A thematic analysis of the…

  10. Differentially Private Empirical Risk Minimization

    PubMed Central

    Chaudhuri, Kamalika; Monteleoni, Claire; Sarwate, Anand D.

    2011-01-01

    Privacy-preserving machine learning algorithms are crucial for the increasingly common setting in which personal data, such as medical or financial records, are analyzed. We provide general techniques to produce privacy-preserving approximations of classifiers learned via (regularized) empirical risk minimization (ERM). These algorithms are private under the ε-differential privacy definition due to Dwork et al. (2006). First we apply the output perturbation ideas of Dwork et al. (2006) to ERM classification. Then we propose a new method, objective perturbation, for privacy-preserving machine learning algorithm design. This method entails perturbing the objective function before optimizing over classifiers. If the loss and regularizer satisfy certain convexity and differentiability criteria, we prove theoretical results showing that our algorithms preserve privacy, and provide generalization bounds for linear and nonlinear kernels. We further present a privacy-preserving technique for tuning the parameters in general machine learning algorithms, thereby providing end-to-end privacy guarantees for the training process. We apply these results to produce privacy-preserving analogues of regularized logistic regression and support vector machines. We obtain encouraging results from evaluating their performance on real demographic and benchmark data sets. Our results show that both theoretically and empirically, objective perturbation is superior to the previous state-of-the-art, output perturbation, in managing the inherent tradeoff between privacy and learning performance. PMID:21892342
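    The simpler of the two mechanisms discussed above, output perturbation, can be sketched as: solve the regularized ERM problem, then add noise calibrated to the solution's sensitivity. The noise scale below follows the general 1/(n·λ·ε) recipe; the constant and the ridge-regression setting are illustrative assumptions, not the paper's exact bound or loss.

```python
# Hedged sketch of output perturbation for differentially private ERM.
import numpy as np

def private_linear_erm(X, y, lam, eps, rng):
    """Ridge-regularized least squares, then noise added to the weights."""
    n, d = X.shape
    # Non-private ERM solution: argmin ||Xw - y||^2 / n + lam * ||w||^2.
    w = np.linalg.solve(X.T @ X / n + lam * np.eye(d), X.T @ y / n)
    # Noise scale shrinks with more data (n), stronger regularization (lam),
    # and a looser privacy budget (eps); the constant 2 is illustrative.
    scale = 2.0 / (n * lam * eps)
    noise = rng.laplace(0.0, scale, size=d)
    return w + noise

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=500)
w_priv = private_linear_erm(X, y, lam=0.1, eps=1.0, rng=rng)
```

    With 500 samples the noise scale is small, so the private weights stay close to the non-private solution; the paper's objective perturbation achieves a better privacy-utility tradeoff than this baseline.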

  11. Empirical conversion of the vertical profile of reflectivity from Ku-band to S-band frequency

    NASA Astrophysics Data System (ADS)

    Cao, Qing; Hong, Yang; Qi, Youcun; Wen, Yixin; Zhang, Jian; Gourley, Jonathan J.; Liao, Liang

    2013-02-01

    This paper presents an empirical method for converting reflectivity from Ku-band (13.8 GHz) to S-band (2.8 GHz) for several hydrometeor species, which facilitates the incorporation of Tropical Rainfall Measuring Mission (TRMM) Precipitation Radar (PR) measurements into quantitative precipitation estimation (QPE) products from the U.S. Next-Generation Radar (NEXRAD). The development of empirical dual-frequency relations is based on theoretical simulations, which have assumed appropriate scattering and microphysical models for liquid and solid hydrometeors (raindrops, snow, and ice/hail). Particle phase, shape, orientation, and density (especially for snow particles) have been considered in applying the T-matrix method to compute the scattering amplitudes. Gamma particle size distribution (PSD) is utilized to model the microphysical properties in the ice region, melting layer, and raining region of precipitating clouds. The variability of PSD parameters is considered to study the characteristics of dual-frequency reflectivity, especially the variations in radar dual-frequency ratio (DFR). The empirical relations between DFR and Ku-band reflectivity have been derived for particles in different regions within the vertical structure of precipitating clouds. The reflectivity conversion using the proposed empirical relations has been tested using real data collected by TRMM-PR and a prototype polarimetric WSR-88D (Weather Surveillance Radar 88 Doppler) radar, KOUN. The processing and analysis of collocated data demonstrate the validity of the proposed empirical relations and substantiate their practical significance for reflectivity conversion, which is essential to the TRMM-based vertical profile of reflectivity correction approach in improving NEXRAD-based QPE.
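    Structurally, a conversion of this kind amounts to adding an empirical dual-frequency ratio, expressed as a polynomial in Ku-band reflectivity, to the measured Ku-band value. The sketch below shows that structure only; the quadratic coefficients are placeholders, not the fitted relations from the paper, which differ by hydrometeor region.

```python
# Illustrative structure of the conversion: Z_S = Z_Ku + DFR(Z_Ku),
# where DFR is an empirical polynomial fitted per hydrometeor region.

def dfr_rain(z_ku_dbz, coeffs=(0.5, -0.01, 0.001)):
    """Hypothetical quadratic DFR (dB) as a function of Ku-band dBZ."""
    a0, a1, a2 = coeffs
    return a0 + a1 * z_ku_dbz + a2 * z_ku_dbz**2

def ku_to_s(z_ku_dbz, dfr=dfr_rain):
    """Convert a Ku-band reflectivity (dBZ) to an S-band estimate."""
    return z_ku_dbz + dfr(z_ku_dbz)

z_s = ku_to_s(30.0)  # 30 dBZ at Ku-band, converted with the placeholder DFR
```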

  12. [Theoretical and methodological uses of research in Social and Human Sciences in Health].

    PubMed

    Deslandes, Suely Ferreira; Iriart, Jorge Alberto Bernstein

    2012-12-01

    The current article aims to map and critically reflect on the current theoretical and methodological uses of research in the subfield of social and human sciences in health. A convenience sample was used to select three Brazilian public health journals. Based on a reading of 1,128 abstracts published from 2009 to 2010, 266 articles were selected that presented the empirical base of research stemming from social and human sciences in health. The sample was classified thematically as "theoretical/methodological reference", "study type/methodological design", "analytical categories", "data production techniques", and "analytical procedures". We analyze the sample's emic categories, drawing on the authors' literal statements. All the classifications and respective variables were tabulated in Excel. Most of the articles were self-described as qualitative and used more than one data production technique. There was a wide variety of theoretical references, in contrast with the almost total predominance of a single type of data analysis (content analysis). In several cases, important gaps were identified in expounding the study methodology and instrumental use of the qualitative research techniques and methods. However, the review did highlight some new objects of study and innovations in theoretical and methodological approaches.

  13. A system of safety management practices and worker engagement for reducing and preventing accidents: an empirical and theoretical investigation.

    PubMed

    Wachter, Jan K; Yorio, Patrick L

    2014-07-01

    The overall research objective was to theoretically and empirically develop the ideas around a system of safety management practices (ten practices were elaborated), to test their relationship with objective safety statistics (such as accident rates), and to explore how these practices work to achieve positive safety results (accident prevention) through worker engagement. Data were collected using safety manager, supervisor and employee surveys designed to assess and link safety management system practices, employee perceptions resulting from existing practices, and safety performance outcomes. Results indicate the following: there is a significant negative relationship between the presence of ten individual safety management practices, as well as the composite of these practices, with accident rates; there is a significant negative relationship between the level of safety-focused worker emotional and cognitive engagement with accident rates; safety management systems and worker engagement levels can be used individually to predict accident rates; safety management systems can be used to predict worker engagement levels; and worker engagement levels act as mediators between the safety management system and safety performance outcomes (such as accident rates). Even though the presence of safety management system practices is linked with incident reduction and may represent a necessary first-step in accident prevention, safety performance may also depend on mediation by safety-focused cognitive and emotional engagement by workers. Thus, when organizations invest in a safety management system approach to reducing/preventing accidents and improving safety performance, they should also be concerned about winning over the minds and hearts of their workers through human performance-based safety management systems designed to promote and enhance worker engagement. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.

  14. Time to Guideline-Based Empiric Antibiotic Therapy in the Treatment of Pneumonia in a Community Hospital: A Retrospective Review.

    PubMed

    Erwin, Beth L; Kyle, Jeffrey A; Allen, Leland N

    2016-08-01

    The 2005 American Thoracic Society/Infectious Diseases Society of America (ATS/IDSA) guidelines for hospital-acquired pneumonia (HAP), ventilator-associated pneumonia (VAP), and health care-associated pneumonia (HCAP) stress the importance of initiating prompt appropriate empiric antibiotic therapy. This study's purpose was to determine the percentage of patients with HAP, VAP, and HCAP who received guideline-based empiric antibiotic therapy and to determine the average time to receipt of an appropriate empiric regimen. A retrospective chart review of adults with HAP, VAP, or HCAP was conducted at a community hospital in suburban Birmingham, Alabama. The hospital's electronic medical record system utilized International Classification of Diseases, Ninth Revision (ICD-9) codes to identify patients diagnosed with pneumonia. The percentage of patients who received guideline-based empiric antibiotic therapy was calculated. The mean time from suspected diagnosis of pneumonia to initial administration of the final antibiotic within the empiric regimen was calculated for patients who received guideline-based therapy. Ninety-three patients met the inclusion criteria. The overall guideline adherence rate for empiric antibiotic therapy was 31.2%. The mean time to guideline-based therapy in hours:minutes was 7:47 for HAP and 28:16 for HCAP. For HAP and HCAP combined, the mean time to appropriate therapy was 21:55. Guideline adherence rates were lower and time to appropriate empiric therapy was greater for patients with HCAP compared to patients with HAP. © The Author(s) 2015.

  15. Stomatal regulation based on competition for water, stochastic rainfall, and xylem hydraulic vulnerability - a new theoretical model

    NASA Astrophysics Data System (ADS)

    Lu, Y.; Duursma, R.; Farrior, C.; Medlyn, B. E.

    2016-12-01

    Stomata control the exchange of soil water for atmospheric CO2, which is one of the most important resource trade-offs for plants. This trade-off has been studied extensively, but not in the context of competition. Based on the theory of evolutionarily stable strategies, we search for the uninvadable (ESS) response of stomatal conductance to soil water content under stochastic rainfall, with which the dominant plant population should never be invaded by any rare mutant in the competition for water, owing to its higher fitness. In this study, we define fitness as the difference between the long-term average photosynthetic carbon gain and a carbon cost of stomatal opening. This cost has traditionally been considered an unknown constant. Here we extend this framework by treating it as the energy required for refilling embolized xylem. With regard to the refilling process, we explore two questions: (1) to what extent can embolized xylem vessels be repaired via refilling; and (2) is this refilling immediate, or does it follow the formation of xylem embolism with a time delay? We compare various assumptions in a total of five scenarios and find that the ESS exists only if the xylem damage can be repaired completely. Then, with this ESS, we estimate annual vegetation photosynthesis and water consumption and compare them with empirical results. In conclusion, this study provides a different insight from existing empirical and mechanistic models, as well as from theoretical models based on optimization theory. In addition, because the model result is a simple quantitative relation between stomatal conductance and soil water content, it can be easily incorporated into other vegetation function models.

  16. Empirical Prediction of Aircraft Landing Gear Noise

    NASA Technical Reports Server (NTRS)

    Golub, Robert A. (Technical Monitor); Guo, Yue-Ping

    2005-01-01

    This report documents a semi-empirical/semi-analytical method for landing gear noise prediction. The method is based on scaling laws of the theory of aerodynamic noise generation and correlation of these scaling laws with current available test data. The former gives the method a sound theoretical foundation and the latter quantitatively determines the relations between the parameters of the landing gear assembly and the far field noise, enabling practical predictions of aircraft landing gear noise, both for parametric trends and for absolute noise levels. The prediction model is validated by wind tunnel test data for an isolated Boeing 737 landing gear and by flight data for the Boeing 777 airplane. In both cases, the predictions agree well with data, both in parametric trends and in absolute noise levels.

  17. Genetic constraints on adaptation: a theoretical primer for the genomics era.

    PubMed

    Connallon, Tim; Hall, Matthew D

    2018-06-01

    Genetic constraints are features of inheritance systems that slow or prohibit adaptation. Several population genetic mechanisms of constraint have received sustained attention within the field since they were first articulated in the early 20th century. This attention is now reflected in a rich, and still growing, theoretical literature on the genetic limits to adaptive change. In turn, empirical research on constraints has seen a rapid expansion over the last two decades in response to changing interests of evolutionary biologists, along with new technologies, expanding data sets, and creative analytical approaches that blend mathematical modeling with genomics. Indeed, one of the most notable and exciting features of recent progress in genetic constraints is the close connection between theoretical and empirical research. In this review, we discuss five major population genetic contexts of genetic constraint: genetic dominance, pleiotropy, fitness trade-offs between types of individuals of a population, sign epistasis, and genetic linkage between loci. For each, we outline historical antecedents of the theory, specific contexts where constraints manifest, and their quantitative consequences for adaptation. From each of these theoretical foundations, we discuss recent empirical approaches for identifying and characterizing genetic constraints, each grounded and motivated by this theory, and outline promising areas for future work. © 2018 New York Academy of Sciences.

  18. Predicting Child Abuse Potential: An Empirical Investigation of Two Theoretical Frameworks

    ERIC Educational Resources Information Center

    Begle, Angela Moreland; Dumas, Jean E.; Hanson, Rochelle F.

    2010-01-01

    This study investigated two theoretical risk models predicting child maltreatment potential: (a) Belsky's (1993) developmental-ecological model and (b) the cumulative risk model in a sample of 610 caregivers (49% African American, 46% European American; 53% single) with a child between 3 and 6 years old. Results extend the literature by using a…

  19. Empirical likelihood method for non-ignorable missing data problems.

    PubMed

    Guan, Zhong; Qin, Jing

    2017-01-01

    The missing response problem is ubiquitous in survey sampling, medical, social science, and epidemiology studies. It is well known that non-ignorable missingness is the most difficult missing data problem, in which the missingness of a response depends on its own value. In the statistical literature, unlike for the ignorable missing data problem, few papers on non-ignorable missing data are available beyond fully parametric model-based approaches. In this paper we study a semiparametric model for non-ignorable missing data in which the missing probability is known up to some parameters but the underlying distributions are not specified. By employing Owen's (1988) empirical likelihood method, we obtain constrained maximum empirical likelihood estimators of the parameters in the missing probability and of the mean response, which are shown to be asymptotically normal. Moreover, the likelihood ratio statistic can be used to test whether the missingness of the responses is non-ignorable or completely at random. The theoretical results are confirmed by a simulation study. As an illustration, the analysis of data from a real AIDS trial shows that the missingness of CD4 counts around two years is non-ignorable and that the sample mean based on the observed data only is biased.

  20. A Model of Resource Allocation in Public School Districts: A Theoretical and Empirical Analysis.

    ERIC Educational Resources Information Center

    Chambers, Jay G.

    This paper formulates a comprehensive model of resource allocation in a local public school district. The theoretical framework specified could be applied equally well to any number of local public social service agencies. Section 1 develops the theoretical model describing the process of resource allocation. This involves the determination of the…

  1. Theoretical foundations of learning through simulation.

    PubMed

    Zigmont, Jason J; Kappus, Liana J; Sudikoff, Stephanie N

    2011-04-01

    Health care simulation is a powerful educational tool to help facilitate learning for clinicians and change their practice to improve patient outcomes and safety. To promote effective life-long learning through simulation, the educator needs to consider individuals, their experiences, and their environments. Effective education of adults through simulation requires a sound understanding of both adult learning theory and experiential learning. This review article provides a framework for developing and facilitating simulation courses, founded upon empirical and theoretical research in adult and experiential learning. Specifically, this article provides a theoretical foundation for using simulation to change practice to improve patient outcomes and safety. Copyright © 2011 Elsevier Inc. All rights reserved.

  2. Theoretical and experimental physical methods of neutron-capture therapy

    NASA Astrophysics Data System (ADS)

    Borisov, G. I.

    2011-09-01

    This review is based to a substantial degree on our priority developments and research at the IR-8 reactor of the Russian Research Centre Kurchatov Institute. New theoretical and experimental methods of neutron-capture therapy have been developed and applied in practice, namely: a general analytical and semi-empirical theory of neutron-capture therapy (NCT) based on classical neutron physics and its main sections (elementary theories of the moderation, diffusion, reflection, and absorption of neutrons), rather than on methods of mathematical simulation. The theory is, first of all, intended for practical application by physicists, engineers, biologists, and physicians. It can be mastered by anyone with a higher education of almost any kind and minimal experience in operating a personal computer.

  3. Reading Comprehension to 1970: Its Theoretical and Empirical Bases, and Its Implementation in Secondary Professional Textbooks, Instructional Materials, and Tests.

    ERIC Educational Resources Information Center

    Harker, William John

    This study was designed: (1) to determine current concepts of reading comprehension deriving from experimental investigations and theoretical statements, and (2) to establish whether these concepts are represented consistently in current secondary professional reading textbooks, instructional materials, and published tests. Current knowledge of…

  4. Ontology-Based Empirical Knowledge Verification for Professional Virtual Community

    ERIC Educational Resources Information Center

    Chen, Yuh-Jen

    2011-01-01

    A professional virtual community provides an interactive platform for enterprise experts to create and share their empirical knowledge cooperatively, and the platform contains a tremendous amount of hidden empirical knowledge that knowledge experts have preserved in the discussion process. Therefore, enterprise knowledge management highly…

  5. Assessing Two Theoretical Frameworks of Civic Engagement

    ERIC Educational Resources Information Center

    García-Cabrero, Benilde; Pérez-Martínez, María Guadalupe; Sandoval-Hernández, Andrés; Caso-Niebla, Joaquín; Díaz-López, Carlos David

    2016-01-01

    The purpose of this study was to empirically test two major theoretical models: a modified version of the social capital model (Pattie, Seyd and Whiteley, 2003), and the Informed Social Engagement Model (Barr and Selman, 2014; Selman and Kwok, 2010), to explain civic participation and civic knowledge of adolescents from Chile, Colombia and Mexico,…

  6. Empirical analysis of web-based user-object bipartite networks

    NASA Astrophysics Data System (ADS)

    Shang, Ming-Sheng; Lü, Linyuan; Zhang, Yi-Cheng; Zhou, Tao

    2010-05-01

    Understanding the structure and evolution of web-based user-object networks is a significant task since they play a crucial role in e-commerce nowadays. This letter reports the empirical analysis on two large-scale web sites, audioscrobbler.com and del.icio.us, where users are connected with music groups and bookmarks, respectively. The degree distributions and degree-degree correlations for both users and objects are reported. We propose a new index, named collaborative similarity, to quantify the diversity of tastes based on the collaborative selection. Accordingly, the correlation between degree and selection diversity is investigated. We report some novel phenomena well characterizing the selection mechanism of web users and outline the relevance of these phenomena to the information recommendation problem.

  7. Theoretical and experimental characterization of the DUal-BAse transistor (DUBAT)

    NASA Astrophysics Data System (ADS)

    Wu, Chung-Yu; Wu, Ching-Yuan

    1980-11-01

    A new A-type integrated voltage-controlled differential negative resistance device, which uses an extra effective base region to form a lateral pnp (npn) bipolar transistor beside the original base region of a vertical npn (pnp) bipolar junction transistor, and is therefore called the DUal-BAse Transistor (DUBAT), is studied both experimentally and theoretically. The DUBAT has three terminals and is fully compatible with existing bipolar integrated-circuit technologies. Based upon the equivalent circuit of the DUBAT, a simple first-order analytical theory is developed, and important device parameters, such as the I-V characteristic, the differential negative resistance, and the peak and valley points, are characterized. One of the proposed integrated structures of the DUBAT, which is similar in structure to I²L but with comparably high density and a normally operated vertical npn transistor, has been successfully fabricated and studied. Comparisons between the experimental data and the theoretical analyses show satisfactory agreement.

  8. Waiting time distribution in public health care: empirics and theory.

    PubMed

    Dimakou, Sofia; Dimakou, Ourania; Basso, Henrique S

    2015-12-01

    Excessive waiting times for elective surgery have been a long-standing concern in many national healthcare systems in the OECD. How do the hospital admission patterns that generate waiting lists affect different patients? What hospital characteristics determine waiting times? By developing a model of healthcare provision and empirically analysing the entire waiting time distribution, we attempt to shed some light on these issues. We first build a theoretical model that describes the optimal waiting time distribution for capacity-constrained hospitals. Secondly, employing duration analysis, we obtain empirical representations of that distribution across hospitals in the UK from 1997 to 2005. We observe important differences in the 'scale' and 'shape' of admission rates. Scale refers to how quickly patients are treated, and shape represents trade-offs across duration-treatment profiles. By fitting the theoretical to the empirical distributions, we estimate the main structural parameters of the model and are able to closely identify the main drivers of these empirical differences. We find that the level of resources allocated to elective surgery (budget and physical capacity), which determines how constrained the hospital is, explains differences in scale. Changes in the benefit and cost structures of healthcare provision, which relate, respectively, to the desire to prioritise patients by duration and the reduction in costs due to delayed treatment, determine the shape, affecting short- and long-duration patients differently. JEL Classification: I11; I18; H51.

  9. Validation of the theoretical domains framework for use in behaviour change and implementation research

    PubMed Central

    2012-01-01

    Background An integrative theoretical framework, developed for cross-disciplinary implementation and other behaviour change research, has been applied across a wide range of clinical situations. This study tests the validity of this framework. Methods Validity was investigated by behavioural experts sorting 112 unique theoretical constructs using closed and open sort tasks. The extent of replication was tested by Discriminant Content Validation and Fuzzy Cluster Analysis. Results There was good support for a refinement of the framework comprising 14 domains of theoretical constructs (average silhouette value 0.29): ‘Knowledge’, ‘Skills’, ‘Social/Professional Role and Identity’, ‘Beliefs about Capabilities’, ‘Optimism’, ‘Beliefs about Consequences’, ‘Reinforcement’, ‘Intentions’, ‘Goals’, ‘Memory, Attention and Decision Processes’, ‘Environmental Context and Resources’, ‘Social Influences’, ‘Emotions’, and ‘Behavioural Regulation’. Conclusions The refined Theoretical Domains Framework has a strengthened empirical base and provides a method for theoretically assessing implementation problems, as well as professional and other health-related behaviours as a basis for intervention development. PMID:22530986

  10. Effectiveness of a theoretically-based judgment and decision making intervention for adolescents.

    PubMed

    Knight, Danica K; Dansereau, Donald F; Becan, Jennifer E; Rowan, Grace A; Flynn, Patrick M

    2015-05-01

    Although adolescents demonstrate capacity for rational decision making, their tendency to be impulsive, place emphasis on peers, and ignore potential consequences of their actions often translates into higher risk-taking including drug use, illegal activity, and physical harm. Problems with judgment and decision making contribute to risky behavior and are core issues for youth in treatment. Based on theoretical and empirical advances in cognitive science, the Treatment Readiness and Induction Program (TRIP) represents a curriculum-based decision making intervention that can be easily inserted into a variety of content-oriented modalities as well as administered as a separate therapeutic course. The current study examined the effectiveness of TRIP for promoting better judgment among 519 adolescents (37 % female; primarily Hispanic and Caucasian) in residential substance abuse treatment. Change over time in decision making and premeditation (i.e., thinking before acting) was compared among youth receiving standard operating practice (n = 281) versus those receiving standard practice plus TRIP (n = 238). Change in TRIP-specific content knowledge was examined among clients receiving TRIP. Premeditation improved among youth in both groups; TRIP clients showed greater improvement in decision making. TRIP clients also reported significant increases over time in self-awareness, positive-focused thinking (e.g., positive self-talk, goal setting), and recognition of the negative effects of drug use. While both genders showed significant improvement, males showed greater gains in metacognitive strategies (i.e., awareness of one's own cognitive process) and recognition of the negative effects of drug use. These results suggest that efforts to teach core thinking strategies and apply/practice them through independent intervention modules may benefit adolescents when used in conjunction with content-based programs designed to change problematic behaviors.

  11. Effectiveness of a Theoretically-Based Judgment and Decision Making Intervention for Adolescents

    PubMed Central

    Knight, Danica K.; Dansereau, Donald F.; Becan, Jennifer E.; Rowan, Grace A.; Flynn, Patrick M.

    2014-01-01

    Although adolescents demonstrate capacity for rational decision making, their tendency to be impulsive, place emphasis on peers, and ignore potential consequences of their actions often translates into higher risk-taking including drug use, illegal activity, and physical harm. Problems with judgment and decision making contribute to risky behavior and are core issues for youth in treatment. Based on theoretical and empirical advances in cognitive science, the Treatment Readiness and Induction Program (TRIP) represents a curriculum-based decision making intervention that can be easily inserted into a variety of content-oriented modalities as well as administered as a separate therapeutic course. The current study examined the effectiveness of TRIP for promoting better judgment among 519 adolescents (37% female; primarily Hispanic and Caucasian) in residential substance abuse treatment. Change over time in decision making and premeditation (i.e., thinking before acting) was compared among youth receiving standard operating practice (n = 281) versus those receiving standard practice plus TRIP (n = 238). Change in TRIP-specific content knowledge was examined among clients receiving TRIP. Premeditation improved among youth in both groups; TRIP clients showed greater improvement in decision making. TRIP clients also reported significant increases over time in self-awareness, positive-focused thinking (e.g., positive self-talk, goal setting), and recognition of the negative effects of drug use. While both genders showed significant improvement, males showed greater gains in metacognitive strategies (i.e., awareness of one’s own cognitive process) and recognition of the negative effects of drug use. These results suggest that efforts to teach core thinking strategies and apply/practice them through independent intervention modules may benefit adolescents when used in conjunction with content-based programs designed to change problematic behaviors. PMID:24760288

  12. Assessing the Effectiveness of Two Theoretically Motivated Computer-Assisted Reading Interventions in the United Kingdom: GG Rime and GG Phoneme

    ERIC Educational Resources Information Center

    Kyle, Fiona; Kujala, Janne; Richardson, Ulla; Lyytinen, Heikki; Goswami, Usha

    2013-01-01

    We report an empirical comparison of the effectiveness of two theoretically motivated computer-assisted reading interventions (CARI) based on the Finnish GraphoGame CARI: English GraphoGame Rime (GG Rime) and English GraphoGame Phoneme (GG Phoneme). Participants were 6-7-year-old students who had been identified by their teachers as being…

  13. Traditional Arabic & Islamic medicine: validation and empirical assessment of a conceptual model in Qatar.

    PubMed

    AlRawi, Sara N; Khidir, Amal; Elnashar, Maha S; Abdelrahim, Huda A; Killawi, Amal K; Hammoud, Maya M; Fetters, Michael D

    2017-03-14

    Evidence indicates that traditional medicine is no longer used only for the healthcare of the poor; its prevalence is also increasing in countries where allopathic medicine is predominant in the healthcare system. While these healing practices have been utilized for thousands of years in the Arabian Gulf, only recently has a theoretical model been developed illustrating the linkages and components of such practices, articulated as Traditional Arabic & Islamic Medicine (TAIM). Despite previous theoretical work presenting the development of the TAIM model, empirical support has been lacking. The objective of this research is to provide empirical support for the TAIM model and illustrate its real-world applicability. Using an ethnographic approach, we recruited 84 individuals (43 women and 41 men) who were speakers of one of four common languages in Qatar: Arabic, English, Hindi, and Urdu. Through in-depth interviews, we sought confirming and disconfirming evidence of the model components, namely health practices, beliefs, and philosophy to treat, diagnose, and prevent illnesses and/or maintain well-being, as well as patterns of communication about TAIM practices with allopathic providers. Based on our analysis, we find empirical support for all elements of the TAIM model. Participants in this research, visitors to major healthcare centers, mentioned using all elements of the TAIM model: herbal medicines, spiritual therapies, dietary practices, mind-body methods, and manual techniques, applied singly or in combination. Participants had varying levels of comfort sharing information about TAIM practices with allopathic practitioners. These findings confirm an empirical basis for the elements of the TAIM model. Three elements, namely spiritual healing, herbal medicine, and dietary practices, were most commonly found. Future research should examine the prevalence of TAIM element use, how it differs among various populations, and its impact on health.

  14. Theoretical Accuracy of Along-Track Displacement Measurements from Multiple-Aperture Interferometry (MAI)

    PubMed Central

    Jung, Hyung-Sup; Lee, Won-Jin; Zhang, Lei

    2014-01-01

    Precise along-track displacements have been measured with multiple-aperture interferometry (MAI). The empirical accuracies of MAI measurements are about 6.3 cm and 3.57 cm for ERS and ALOS data, respectively. However, these empirical accuracies cannot be generalized to an arbitrary interferometric pair because they depend largely on the processing parameters and the coherence of the SAR data used. A theoretical formula is available to calculate the expected MAI measurement accuracy from the system and processing parameters and the interferometric coherence. In this paper, we investigate the expected MAI measurement accuracy on the basis of this theoretical formula for the existing X-, C-, and L-band satellite SAR systems. The similarity between the expected and empirical MAI measurement accuracies is tested as well. Expected accuracies of about 2–3 cm and 3–4 cm (γ = 0.8) are calculated for the X- and L-band SAR systems, respectively. For the C-band systems, the expected accuracy of Radarsat-2 ultra-fine mode is about 3–4 cm and that of Sentinel-1 IW is about 27 cm (γ = 0.8). The results indicate that the expected MAI measurement accuracy of a given interferometric pair can be easily calculated using the theoretical formula. PMID:25251408

  15. Development of theoretical oxygen saturation calibration curve based on optical density ratio and optical simulation approach

    NASA Astrophysics Data System (ADS)

    Jumadi, Nur Anida; Beng, Gan Kok; Ali, Mohd Alauddin Mohd; Zahedi, Edmond; Morsin, Marlia

    2017-09-01

    The implementation of a surface-based Monte Carlo simulation technique for estimating the oxygen saturation (SaO2) calibration curve is demonstrated in this paper. Generally, the calibration curve is estimated either empirically, using animals as experimental subjects, or derived from mathematical equations. However, determining the calibration curve using animals is time consuming and requires expertise to conduct the experiment. Alternatively, optical simulation techniques have been used widely in the biomedical optics field due to their capability to reproduce real tissue behavior. The mathematical relationship between optical density (OD) and the optical density ratio (ODR) associated with SaO2 during systole and diastole is used as the basis for obtaining the theoretical calibration curve. Optical properties corresponding to systolic and diastolic behavior were applied to the tissue model to mimic the optical properties of the tissues. Based on the ray flux absorbed at the detectors, the OD and ODR were successfully calculated. The simulated optical density ratios at every 20% interval of SaO2 are presented, with a maximum error of 2.17% when compared with a previous numerical simulation technique (MC model). The findings reveal the potential of the proposed method for extended calibration curve studies using other wavelength pairs.
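    The OD/ODR relationship described above can be sketched as follows. The pulsatile intensities and the linear calibration coefficients a and b are illustrative assumptions in the common empirical form SaO2 ≈ a − b·R; they are not values from this study:

```python
import math

def optical_density(i_incident, i_detected):
    # OD = log10(I_in / I_out), computed from the detected ray flux.
    return math.log10(i_incident / i_detected)

def od_ratio(red_sys, red_dia, ir_sys, ir_dia, i0=1.0):
    # Pulsatile OD change (systole vs. diastole) at each wavelength,
    # then the red/infrared ratio R used for calibration.
    d_red = optical_density(i0, red_sys) - optical_density(i0, red_dia)
    d_ir = optical_density(i0, ir_sys) - optical_density(i0, ir_dia)
    return d_red / d_ir

def spo2_from_odr(r, a=110.0, b=25.0):
    # Hypothetical linear calibration SaO2 = a - b*R; a and b are
    # illustrative placeholders, not fitted coefficients.
    return a - b * r

r = od_ratio(red_sys=0.40, red_dia=0.45, ir_sys=0.30, ir_dia=0.38)
print(round(spo2_from_odr(r), 1))
```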

  16. Entrepreneurship Education in Schools: Empirical Evidence on the Teacher's Role

    ERIC Educational Resources Information Center

    Ruskovaara, Elena; Pihkala, Timo

    2015-01-01

    Different approaches and methodologies for entrepreneurship education have been introduced for schools. However, a better theoretical and empirical understanding of the antecedents of entrepreneurship education is needed. The authors analyze what entrepreneurship education practices are used in schools and what role the school and the teacher are…

  17. Outcome (Competency) Based Education: An Exploration of Its Origins, Theoretical Basis, and Empirical Evidence

    ERIC Educational Resources Information Center

    Morcke, Anne Mette; Dornan, Tim; Eika, Berit

    2013-01-01

    Outcome based or competency based education (OBE) is so firmly established in undergraduate medical education that it might not seem necessary to ask why it was included in recommendations for the future, like the Flexner centenary report. Uncritical acceptance may not, however, deliver its greatest benefits. Our aim was to explore the…

  18. Empirically based device modeling of bulk heterojunction organic photovoltaics

    NASA Astrophysics Data System (ADS)

    Pierre, Adrien; Lu, Shaofeng; Howard, Ian A.; Facchetti, Antonio; Arias, Ana Claudia

    2013-10-01

    An empirically based, open source, optoelectronic model is constructed to accurately simulate organic photovoltaic (OPV) devices. Bulk heterojunction OPV devices based on a new low band gap dithienothiophene-diketopyrrolopyrrole donor polymer (P(TBT-DPP)) are blended with PC70BM and processed under various conditions, with efficiencies up to 4.7%. The mobilities of electrons and holes, bimolecular recombination coefficients, exciton quenching efficiencies in donor and acceptor domains and optical constants of these devices are measured and input into the simulator to yield photocurrent with less than 7% error. The results from this model not only show carrier activity in the active layer but also elucidate new routes of device optimization by varying donor-acceptor composition as a function of position. Sets of high and low performance devices are investigated and compared side-by-side.

  19. Three essays on energy and environmental economics: Empirical, applied, and theoretical

    NASA Astrophysics Data System (ADS)

    Karney, Daniel Houghton

    Energy and environmental economics are closely related fields, as nearly all forms of energy production generate pollution and thus nearly all forms of environmental policy affect energy production and consumption. The three essays in this dissertation are related by their common themes of energy and environmental economics, but they differ in their methodologies. The first chapter is an empirical exercise that looks at the relationship between electricity price deregulation and maintenance outages at nuclear power plants. The second chapter is an applied theory paper that investigates environmental regulation in a multiple-pollutant setting. The third chapter develops a new methodology for constructing analytical general equilibrium models that can be used to study topics in energy and environmental economics.

  20. Empirical and theoretical analysis of complex systems

    NASA Astrophysics Data System (ADS)

    Zhao, Guannan

    structures evolve on a similar timescale to individual level transmission, we investigated the process of transmission through a model population comprising of social groups which follow simple dynamical rules for growth and break-up, and the profiles produced bear a striking resemblance to empirical data obtained from social, financial and biological systems. Finally, for better implementation of a widely accepted power law test algorithm, we have developed a fast testing procedure using parallel computation.

  1. Empirical resistive-force theory for slender biological filaments in shear-thinning fluids

    NASA Astrophysics Data System (ADS)

    Riley, Emily E.; Lauga, Eric

    2017-06-01

    Many cells exploit the bending or rotation of flagellar filaments in order to self-propel in viscous fluids. While appropriate theoretical modeling is available to capture flagellar locomotion in simple, Newtonian fluids, formidable computations are required to address it theoretically in complex, nonlinear fluids, e.g., mucus. Based on experimental measurements of the motion of rigid rods in non-Newtonian fluids and on the classical Carreau fluid model, we propose empirical extensions of the classical Newtonian resistive-force theory to model the waving of slender filaments in non-Newtonian fluids. By assuming the flow near the flagellum to be locally Newtonian, we propose a self-consistent way to estimate the typical shear rate in the fluid, which we then use to construct correction factors to the Newtonian local drag coefficients. The resulting non-Newtonian resistive-force theory, while empirical, is consistent with the Newtonian limit and with the experiments. We then use our models to address waving locomotion in non-Newtonian fluids and show that the resulting swimming speeds are systematically lowered, a result which we are able to capture asymptotically and to interpret physically. An application of the models to recent experimental results on the locomotion of Caenorhabditis elegans in polymeric solutions shows reasonable agreement and thus captures the main physics of swimming in shear-thinning fluids.
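    A minimal sketch of the ingredients described above, assuming the standard Carreau viscosity law and one common variant of the Newtonian slender-body drag coefficients. The correction factor shown (ratio of the Carreau viscosity at the estimated local shear rate to the zero-shear viscosity) is an illustrative stand-in, not the paper's fitted correction:

```python
import math

def carreau_viscosity(gamma, mu0=1.0, mu_inf=0.01, lam=1.0, n=0.5):
    # Carreau model: mu(gamma) = mu_inf + (mu0 - mu_inf) *
    #   (1 + (lam*gamma)^2)^((n-1)/2), shear-thinning for n < 1.
    return mu_inf + (mu0 - mu_inf) * (1.0 + (lam * gamma) ** 2) ** ((n - 1) / 2)

def newtonian_drag_coefficients(mu, length, radius):
    # One common variant of the slender-body resistive-force
    # coefficients per unit length (illustrative choice of constants).
    log_term = math.log(length / radius)
    c_par = 2.0 * math.pi * mu / (log_term - 0.5)
    c_perp = 4.0 * math.pi * mu / (log_term + 0.5)
    return c_par, c_perp

def shear_thinning_correction(gamma, mu0=1.0, **kw):
    # Illustrative correction factor: effective viscosity at the
    # estimated local shear rate over the zero-shear viscosity.
    return carreau_viscosity(gamma, mu0=mu0, **kw) / mu0

# A filament beating at a typical local shear rate of 10 s^-1:
factor = shear_thinning_correction(10.0)
c_par, c_perp = newtonian_drag_coefficients(mu=1.0, length=10e-6, radius=0.1e-6)
print(factor, factor * c_par, factor * c_perp)
```

    Shear thinning gives a factor below one, so both corrected drag coefficients drop relative to the Newtonian case, consistent with the qualitative picture of lowered forces in a thinned fluid.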

  2. Culminating Experience Empirical and Theoretical Research Projects, University of Tennessee at Chattanooga, Spring, 2005

    ERIC Educational Resources Information Center

    Watson, Sandy White, Ed.

    2005-01-01

    This document represents a sample collection of master's theses from the University of Tennessee at Chattanooga's Teacher Education Program, spring semester, 2005. The majority of these student researchers were simultaneously student teaching while writing their theses. Studies were empirical and conceptual in nature and demonstrate some ways in…

  3. Empirical links between natural mortality and recovery in marine fishes.

    PubMed

    Hutchings, Jeffrey A; Kuparinen, Anna

    2017-06-14

    Probability of species recovery is thought to be correlated with specific aspects of organismal life history, such as age at maturity and longevity, and how these affect rates of natural mortality (M) and maximum per capita population growth (r_max). Despite strong theoretical underpinnings, these correlates have been based on predicted rather than realized population trajectories following threat mitigation. Here, we examine the level of empirical support for postulated links between a suite of life-history traits (related to maturity, age, size, and growth) and recovery in marine fishes. Following threat mitigation (median time since cessation of overfishing = 20 years), 71% of 55 temperate populations had fully recovered, the remainder exhibiting, on average, negligible change (impaired recovery). Singly, life-history traits did not influence recovery status. In combination, however, those that jointly reflect length-based mortality at maturity, M_α, revealed that recovered populations have higher M_α, which we hypothesize to reflect local adaptations associated with greater r_max. But, within populations, the smaller sizes at maturity generated by overfishing are predicted to increase M_α, slowing recovery and increasing its uncertainty. We conclude that recovery potential is greater for populations adapted to high M, but that temporal increases in M concomitant with smaller size at maturity will have the opposite effect. The recovery metric documented here (M_α) has a sound theoretical basis, is significantly correlated with direct estimates of M that directly reflect r_max, is not reliant on data-intensive time series, can be readily estimated, and offers an empirically defensible correlate of recovery, given its clear links to the positive and impaired responses to threat mitigation that have been observed in fish populations over the past three decades. © 2017 The Author(s).

  4. Image fusion method based on regional feature and improved bidimensional empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Qin, Xinqiang; Hu, Gang; Hu, Kai

    2018-01-01

    The decomposition of multiple source images using bidimensional empirical mode decomposition (BEMD) often produces mismatched bidimensional intrinsic mode functions, either by their number or their frequency, making image fusion difficult. A solution to this problem is proposed using a fixed number of iterations and a union operation in the sifting process. By combining the local regional features of the images, an image fusion method has been developed. First, the source images are decomposed using the proposed BEMD to produce the first intrinsic mode function (IMF) and residue component. Second, for the IMF component, a selection and weighted average strategy based on local area energy is used to obtain a high-frequency fusion component. Third, for the residue component, a selection and weighted average strategy based on local average gray difference is used to obtain a low-frequency fusion component. Finally, the fused image is obtained by applying the inverse BEMD transform. Experimental results show that the proposed algorithm provides superior performance over methods based on wavelet transform, line and column-based EMD, and complex empirical mode decomposition, both in terms of visual quality and objective evaluation criteria.
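
    The IMF fusion rule described above (selection with a weighted average based on local area energy) can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the authors' code: the 3 x 3 energy window, the `threshold` at which selection switches to averaging, and the function names are assumptions.

```python
import numpy as np

def local_energy(img, radius=1):
    """Sum of squared values in a (2r+1)x(2r+1) window around each pixel."""
    sq = np.pad(img ** 2, radius, mode="edge")
    h, w = img.shape
    out = np.zeros_like(img, dtype=float)
    for dy in range(2 * radius + 1):
        for dx in range(2 * radius + 1):
            out += sq[dy:dy + h, dx:dx + w]
    return out

def fuse_imf(imf_a, imf_b, threshold=0.8):
    """Select-or-average rule: take the coefficient with the larger local
    energy; average when the two energies are close (ratio > threshold)."""
    ea, eb = local_energy(imf_a), local_energy(imf_b)
    ratio = np.minimum(ea, eb) / (np.maximum(ea, eb) + 1e-12)
    fused = np.where(ea >= eb, imf_a, imf_b)
    close = ratio > threshold
    fused[close] = 0.5 * (imf_a[close] + imf_b[close])
    return fused
```

    The residue component would be fused analogously, with local average gray difference in place of local energy.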

  5. Semi-empirical master curve concept describing the rate capability of lithium insertion electrodes

    NASA Astrophysics Data System (ADS)

    Heubner, C.; Seeba, J.; Liebmann, T.; Nickol, A.; Börner, S.; Fritsch, M.; Nikolowski, K.; Wolter, M.; Schneider, M.; Michaelis, A.

    2018-03-01

    A simple semi-empirical master curve concept, describing the rate capability of porous insertion electrodes for lithium-ion batteries, is proposed. The model is based on the evaluation of the time constants of lithium diffusion in the liquid electrolyte and the solid active material. This theoretical approach is successfully verified by comprehensive experimental investigations of the rate capability of a large number of porous insertion electrodes with various active materials and design parameters. It turns out that the rate capability of all investigated electrodes follows a simple master curve governed by the time constant of the rate-limiting process. We demonstrate that the master curve concept can be used to determine optimum design criteria meeting specific requirements in terms of maximum gravimetric capacity for a desired rate capability. The model further reveals practical limits of the electrode design, confirming the empirically well-known and inevitable tradeoff between energy and power density.
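
    The comparison of diffusion time constants that underlies the model can be illustrated numerically. The particle radius, electrode thickness, and diffusivities below are order-of-magnitude placeholders, not values from the paper, and the crude maximum-rate estimate at the end is not the paper's master curve.

```python
def diffusion_time_constant(length_m, diffusivity_m2s):
    """Characteristic diffusion time tau = L^2 / D."""
    return length_m ** 2 / diffusivity_m2s

# Illustrative (not measured) parameters:
tau_solid = diffusion_time_constant(5e-6, 1e-14)    # active-material particle
tau_liquid = diffusion_time_constant(70e-6, 3e-10)  # electrolyte in the pores

tau_limiting = max(tau_solid, tau_liquid)
# A full discharge at C-rate C lasts 3600/C seconds; diffusion roughly stops
# limiting the capacity once that time exceeds tau, giving a crude rate limit:
c_max = 3600.0 / tau_limiting
```

    With these placeholder numbers the solid-state time constant dominates, so particle size, not electrode thickness, would set the rate limit.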

  6. Theories of behaviour change synthesised into a set of theoretical groupings: introducing a thematic series on the theoretical domains framework.

    PubMed

    Francis, Jill J; O'Connor, Denise; Curran, Janet

    2012-04-24

    Behaviour change is key to increasing the uptake of evidence into healthcare practice. Designing behaviour-change interventions first requires problem analysis, ideally informed by theory. Yet the large number of partly overlapping theories of behaviour makes it difficult to select the most appropriate theory. The need for an overarching theoretical framework of behaviour change was addressed in research in which 128 explanatory constructs from 33 theories of behaviour were identified and grouped. The resulting Theoretical Domains Framework (TDF) appears to be a helpful basis for investigating implementation problems. Research groups in several countries have conducted TDF-based studies. It seems timely to bring together the experience of these teams in a thematic series to demonstrate further applications and to report key developments. This overview article describes the TDF, provides a brief critique of the framework, and introduces this thematic series. In a brief review to assess the extent of TDF-based research, we identified 133 papers that cite the framework. Of these, 17 used the TDF as the basis for empirical studies to explore health professionals' behaviour. The identified papers provide evidence of the impact of the TDF on implementation research. Two major strengths of the framework are its theoretical coverage and its capacity to elicit beliefs that could signify key mediators of behaviour change. The TDF provides a useful conceptual basis for assessing implementation problems, designing interventions to enhance healthcare practice, and understanding behaviour-change processes. We discuss limitations and research challenges and introduce papers in this series.

  7. Theories of behaviour change synthesised into a set of theoretical groupings: introducing a thematic series on the theoretical domains framework

    PubMed Central

    2012-01-01

    Behaviour change is key to increasing the uptake of evidence into healthcare practice. Designing behaviour-change interventions first requires problem analysis, ideally informed by theory. Yet the large number of partly overlapping theories of behaviour makes it difficult to select the most appropriate theory. The need for an overarching theoretical framework of behaviour change was addressed in research in which 128 explanatory constructs from 33 theories of behaviour were identified and grouped. The resulting Theoretical Domains Framework (TDF) appears to be a helpful basis for investigating implementation problems. Research groups in several countries have conducted TDF-based studies. It seems timely to bring together the experience of these teams in a thematic series to demonstrate further applications and to report key developments. This overview article describes the TDF, provides a brief critique of the framework, and introduces this thematic series. In a brief review to assess the extent of TDF-based research, we identified 133 papers that cite the framework. Of these, 17 used the TDF as the basis for empirical studies to explore health professionals’ behaviour. The identified papers provide evidence of the impact of the TDF on implementation research. Two major strengths of the framework are its theoretical coverage and its capacity to elicit beliefs that could signify key mediators of behaviour change. The TDF provides a useful conceptual basis for assessing implementation problems, designing interventions to enhance healthcare practice, and understanding behaviour-change processes. We discuss limitations and research challenges and introduce papers in this series. PMID:22531601

  8. Application of empirical and mechanistic-empirical pavement design procedures to Mn/ROAD concrete pavement test sections

    DOT National Transportation Integrated Search

    1997-05-01

    Current pavement design procedures are based principally on empirical approaches. The current trend toward developing more mechanistic-empirical type pavement design methods led Minnesota to develop the Minnesota Road Research Project (Mn/ROAD), a lo...

  9. Absorption coefficients of silicon: A theoretical treatment

    NASA Astrophysics Data System (ADS)

    Tsai, Chin-Yi

    2018-05-01

    A theoretical model with explicit formulas for calculating the optical absorption and gain coefficients of silicon is presented. It incorporates direct and indirect interband transitions and considers the effects of occupied/unoccupied carrier states. The indirect interband transition is calculated from the second-order time-independent perturbation theory of quantum mechanics by incorporating all eight possible routes of absorption or emission of photons and phonons. Absorption coefficients of silicon are calculated from these formulas. The agreements and discrepancies among the calculated results, the Rajkanan-Singh-Shewchun (RSS) formula, and Green's data are investigated and discussed. For example, the RSS formula tends to overestimate the contributions of indirect transitions for cases with high photon energy. The results show that the state occupied/unoccupied effect is almost negligible for silicon absorption coefficients up to the onset of the optical gain condition where the energy separation of quasi-Fermi levels between electrons and holes is larger than the band-gap energy. The usefulness of using the physics-based formulas, rather than semi-empirical fitting ones, for absorption coefficients in theoretical studies of photovoltaic devices is also discussed.
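
    For orientation, the classic single-phonon (Macfarlane-type) expression for indirect absorption, with a phonon-absorption and a phonon-emission branch, can be written as a short function. This is the textbook form, not the eight-route second-order model of the paper; the band gap `Eg`, phonon energy `Ep`, and scale `A` are illustrative values.

```python
import math

K_B = 8.617333e-5  # Boltzmann constant, eV/K

def indirect_alpha(E, T, Eg=1.12, Ep=0.058, A=1.0):
    """Textbook single-phonon indirect absorption coefficient (arbitrary
    scale A; E, Eg, Ep in eV). Each branch is allowed only above its
    threshold: Eg - Ep with phonon absorption, Eg + Ep with emission."""
    n = 1.0 / (math.exp(Ep / (K_B * T)) - 1.0)  # Bose-Einstein occupation
    alpha = 0.0
    if E > Eg - Ep:                  # photon absorbed + phonon absorbed
        alpha += A * (E - Eg + Ep) ** 2 * n
    if E > Eg + Ep:                  # photon absorbed + phonon emitted
        alpha += A * (E - Eg - Ep) ** 2 * (n + 1.0)
    return alpha
```

    The phonon-absorption branch grows with temperature through the occupation factor, which is why indirect absorption near the band edge is strongly temperature dependent.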

  10. Deriving Empirically-Based Design Guidelines for Advanced Learning Technologies that Foster Disciplinary Comprehension

    ERIC Educational Resources Information Center

    Poitras, Eric; Trevors, Gregory

    2012-01-01

    Planning, conducting, and reporting leading-edge research requires professionals who are capable of highly skilled reading. This study reports the development of an empirically informed computer-based learning environment designed to foster the acquisition of reading comprehension strategies that mediate expertise in the social sciences. Empirical…

  11. How much does participatory flood management contribute to stakeholders' social capacity building? Empirical findings based on a triangulation of three evaluation approaches

    NASA Astrophysics Data System (ADS)

    Buchecker, M.; Menzel, S.; Home, R.

    2013-06-01

    Recent literature suggests that dialogic forms of risk communication are more effective to build stakeholders' hazard-related social capacities. In spite of the high theoretical expectations, there is a lack of univocal empirical evidence on the relevance of these effects. This is mainly due to the methodological limitations of the existing evaluation approaches. In our paper we aim at eliciting the contribution of participatory river revitalisation projects on stakeholders' social capacity building by triangulating the findings of three evaluation studies that were based on different approaches: a field-experimental, a qualitative long-term ex-post and a cross-sectional household survey approach. The results revealed that social learning and avoiding the loss of trust were more relevant benefits of participatory flood management than acceptance building. The results suggest that stakeholder involvements should be more explicitly designed as tools for long-term social learning.

  12. Mechanistic-empirical Pavement Design Guide Implementation

    DOT National Transportation Integrated Search

    2010-06-01

    The recently introduced Mechanistic-Empirical Pavement Design Guide (MEPDG) and associated computer software provides a state-of-practice mechanistic-empirical highway pavement design methodology. The MEPDG methodology is based on pavement responses ...

  13. Empirical research in bioethical journals. A quantitative analysis

    PubMed Central

    Borry, P; Schotsmans, P; Dierickx, K

    2006-01-01

    Objectives The objective of this research is to analyse the evolution and nature of published empirical research in the fields of medical ethics and bioethics. Design Retrospective quantitative study of nine peer reviewed journals in the field of bioethics and medical ethics (Bioethics, Cambridge Quarterly of Healthcare Ethics, Hastings Center Report, Journal of Clinical Ethics, Journal of Medical Ethics, Kennedy Institute of Ethics Journal, Nursing Ethics, Christian Bioethics, and Theoretical Medicine and Bioethics). Results In total, 4029 articles published between 1990 and 2003 were retrieved from the journals studied. Over this period, 435 (10.8%) studies used an empirical design. The highest percentage of empirical research articles appeared in Nursing Ethics (n = 145, 39.5%), followed by the Journal of Medical Ethics (n = 128, 16.8%) and the Journal of Clinical Ethics (n = 93, 15.4%). These three journals account for 84.1% of all empirical research in bioethics published in this period. The results of the χ2 test for two independent samples for the entire dataset indicate that the period 1997–2003 presented a higher number of empirical studies (n = 309) than did the period 1990–1996 (n = 126). This increase is statistically significant (χ2 = 49.0264, p<.0001). Most empirical studies employed a quantitative paradigm (64.6%, n = 281). The main topic of research was prolongation of life and euthanasia (n = 68). Conclusions We conclude that the proportion of empirical research in the nine journals increased steadily from 5.4% in 1990 to 15.4% in 2003. It is likely that the importance of empirical methods in medical ethics and bioethics will continue to increase. PMID:16574880

  14. Empirical research in medical ethics: how conceptual accounts on normative-empirical collaboration may improve research practice.

    PubMed

    Salloch, Sabine; Schildmann, Jan; Vollmann, Jochen

    2012-04-13

    The methodology of medical ethics during the last few decades has shifted from a predominant use of normative-philosophical analyses to an increasing involvement of empirical methods. The articles which have been published in the course of this so-called 'empirical turn' can be divided into conceptual accounts of empirical-normative collaboration and studies which use socio-empirical methods to investigate ethically relevant issues in concrete social contexts. A considered reference to normative research questions can be expected from good quality empirical research in medical ethics. However, a significant proportion of empirical studies currently published in medical ethics lacks such linkage between the empirical research and the normative analysis. In the first part of this paper, we will outline two typical shortcomings of empirical studies in medical ethics with regard to a link between normative questions and empirical data: (1) the complete lack of normative analysis, and (2) cryptonormativity and a missing account of the relationship between 'is' and 'ought' statements. Subsequently, two selected concepts of empirical-normative collaboration will be presented, and we will demonstrate how they may help improve the linkage between the normative and empirical aspects of empirical research in medical ethics. Based on our analysis, as well as our own practical experience with empirical research in medical ethics, we conclude with a sketch of concrete suggestions for the conduct of empirical research in medical ethics. High quality empirical research in medical ethics is in need of a considered reference to normative analysis. In this paper, we demonstrate how conceptual approaches of empirical-normative collaboration can enhance empirical research in medical ethics with regard to the link between empirical research and normative analysis.

  15. Implications of Project-Based Funding of Research on Budgeting and Financial Management in Public Universities

    ERIC Educational Resources Information Center

    Raudla, Ringa; Karo, Erkki; Valdmaa, Kaija; Kattel, Rainer

    2015-01-01

    The main goal of the paper is to explore--both theoretically and empirically--the implications of project-based research funding for budgeting and financial management at public universities. The theoretical contribution of the paper is to provide a synthesized discussion of the possible impacts of project-based funding on university financial…

  16. An Empirical State Error Covariance Matrix Orbit Determination Example

    NASA Technical Reports Server (NTRS)

    Frisbee, Joseph H., Jr.

    2015-01-01

    State estimation techniques serve effectively to provide mean state estimates. However, the state error covariance matrices provided as part of these techniques inspire limited confidence in their ability to adequately describe the uncertainty in the estimated states. A specific problem with the traditional form of state error covariance matrices is that they represent only a mapping of the assumed observation error characteristics into the state space. Any errors that arise from other sources (environment modeling, precision, etc.) are not directly represented in a traditional, theoretical state error covariance matrix. First, consider that an actual observation contains only measurement error and that an estimated observation contains all other errors, known and unknown. Then it follows that a measurement residual (the difference between expected and observed measurements) contains all errors for that measurement. Therefore, a direct and appropriate inclusion of the actual measurement residuals in the state error covariance matrix of the estimate will result in an empirical state error covariance matrix. This empirical state error covariance matrix will fully include all of the errors in the state estimate. The empirical error covariance matrix is determined from a literal reinterpretation of the equations involved in the weighted least squares estimation algorithm. It is a formally correct, empirical state error covariance matrix obtained through use of the average form of the weighted measurement residual variance performance index rather than the usual total weighted residual form. Based on its formulation, this matrix will contain the total uncertainty in the state estimate, regardless of the source of the uncertainty and whether the source is anticipated or not.
It is expected that the empirical error covariance matrix will give a better statistical representation of the state error in poorly modeled systems or when sensor performance
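
    A minimal sketch of the idea, assuming a batch weighted-least-squares setting: scale the theoretical covariance (HᵀWH)⁻¹ by the average weighted residual variance, so that the actual residuals, whatever their source, enter the covariance. The normalization by m - n is one common convention and is an assumption here, not necessarily the paper's exact performance index.

```python
import numpy as np

def wls_with_empirical_covariance(H, y, W):
    """Weighted least squares with a residual-scaled ('empirical')
    state error covariance. P_theory reflects only the assumed
    measurement-noise weighting W; scaling it by the average weighted
    residual variance folds the actual residuals back in. Requires
    more measurements than states (m > n)."""
    N = H.T @ W @ H
    x_hat = np.linalg.solve(N, H.T @ W @ y)
    r = y - H @ x_hat                  # measurement residuals
    m, n = H.shape
    s = (r @ W @ r) / (m - n)          # average weighted residual variance
    P_theory = np.linalg.inv(N)
    P_emp = s * P_theory
    return x_hat, P_theory, P_emp
```

    When the model is consistent with the assumed noise, s is near 1 and the two covariances agree; unmodeled errors inflate the residuals and hence P_emp.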

  17. Knowledge-Based Information Retrieval.

    ERIC Educational Resources Information Center

    Ford, Nigel

    1991-01-01

    Discussion of information retrieval focuses on theoretical and empirical advances in knowledge-based information retrieval. Topics discussed include the use of natural language for queries; the use of expert systems; intelligent tutoring systems; user modeling; the need for evaluation of system effectiveness; and examples of systems, including…

  18. Empirical likelihood-based confidence intervals for mean medical cost with censored data.

    PubMed

    Jeyarajah, Jenny; Qin, Gengsheng

    2017-11-10

    In this paper, we propose empirical likelihood methods based on influence function and jackknife techniques for constructing confidence intervals for mean medical cost with censored data. We conduct a simulation study to compare the coverage probabilities and interval lengths of our proposed confidence intervals with those of the existing normal approximation-based confidence intervals and bootstrap confidence intervals. The proposed methods have better finite-sample performances than existing methods. Finally, we illustrate our proposed methods with a relevant example. Copyright © 2017 John Wiley & Sons, Ltd.
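
    As a much-simplified illustration of the jackknife ingredient (ignoring censoring and the influence-function and empirical-likelihood machinery of the paper), a leave-one-out jackknife normal-approximation interval for a plain mean looks like this:

```python
import math

def jackknife_ci_mean(x, z=1.96):
    """Leave-one-out jackknife standard error for the sample mean,
    with a normal-approximation confidence interval."""
    n = len(x)
    total = sum(x)
    loo_means = [(total - xi) / (n - 1) for xi in x]  # leave-one-out means
    jack_mean = sum(loo_means) / n
    var = (n - 1) / n * sum((m - jack_mean) ** 2 for m in loo_means)
    se = math.sqrt(var)
    mean = total / n
    return mean - z * se, mean + z * se
```

    For the sample mean the jackknife SE reduces to the familiar s/√n; the value of the jackknife (and of empirical likelihood) shows up for less tractable statistics such as censored cost estimators.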

  19. The neural mediators of kindness-based meditation: a theoretical model

    PubMed Central

    Mascaro, Jennifer S.; Darcher, Alana; Negi, Lobsang T.; Raison, Charles L.

    2015-01-01

    Although kindness-based contemplative practices are increasingly employed by clinicians and cognitive researchers to enhance prosocial emotions, social cognitive skills, and well-being, and as a tool to understand the basic workings of the social mind, we lack a coherent theoretical model with which to test the mechanisms by which kindness-based meditation may alter the brain and body. Here, we link contemplative accounts of compassion and loving-kindness practices with research from social cognitive neuroscience and social psychology to generate predictions about how diverse practices may alter brain structure and function and related aspects of social cognition. Contingent on the nuances of the practice, kindness-based meditation may enhance the neural systems related to faster and more basic perceptual or motor simulation processes, simulation of another’s affective body state, slower and higher-level perspective-taking, modulatory processes such as emotion regulation and self/other discrimination, and combinations thereof. This theoretical model will be discussed alongside best practices for testing such a model and potential implications and applications of future work. PMID:25729374

  20. Empirical research in medical ethics: How conceptual accounts on normative-empirical collaboration may improve research practice

    PubMed Central

    2012-01-01

    Background The methodology of medical ethics during the last few decades has shifted from a predominant use of normative-philosophical analyses to an increasing involvement of empirical methods. The articles which have been published in the course of this so-called 'empirical turn' can be divided into conceptual accounts of empirical-normative collaboration and studies which use socio-empirical methods to investigate ethically relevant issues in concrete social contexts. Discussion A considered reference to normative research questions can be expected from good quality empirical research in medical ethics. However, a significant proportion of empirical studies currently published in medical ethics lacks such linkage between the empirical research and the normative analysis. In the first part of this paper, we will outline two typical shortcomings of empirical studies in medical ethics with regard to a link between normative questions and empirical data: (1) the complete lack of normative analysis, and (2) cryptonormativity and a missing account of the relationship between 'is' and 'ought' statements. Subsequently, two selected concepts of empirical-normative collaboration will be presented, and we will demonstrate how they may help improve the linkage between the normative and empirical aspects of empirical research in medical ethics. Based on our analysis, as well as our own practical experience with empirical research in medical ethics, we conclude with a sketch of concrete suggestions for the conduct of empirical research in medical ethics. Summary High quality empirical research in medical ethics is in need of a considered reference to normative analysis. In this paper, we demonstrate how conceptual approaches of empirical-normative collaboration can enhance empirical research in medical ethics with regard to the link between empirical research and normative analysis. PMID:22500496

  1. Psychosocial functioning in the context of diagnosis: assessment and theoretical issues.

    PubMed

    Ro, Eunyoe; Clark, Lee Anna

    2009-09-01

    Psychosocial functioning is an important focus of attention in the revision of the Diagnostic and Statistical Manual of Mental Disorders. Researchers and clinicians are converging upon the opinion that psychometrically strong, comprehensive assessment of individuals' functioning is needed to characterize disorder fully. Also shared is the realization that existing theory and research in this domain have critical shortcomings. The authors urge that the field reexamine the empirical evidence and address theoretical issues to guide future development of the construct and its measurement. The authors first discuss several theoretical issues relevant to the conceptualization and assessment of functioning: (a) definitions of functioning, (b) the role of functioning in defining disorder, and (c) understanding functioning within environmental contexts. The authors then present data regarding empirical domains of psychosocial functioning and their interrelations. Self-reported data on multiple domains of psychosocial functioning were collected from 429 participants. Factor-analytic results (promax rotation) suggest a 4-factor structure of psychosocial functioning: Well-Being, Basic Functioning, Self-Mastery, and Interpersonal and Social Relationships. Finally, the authors propose an integration of theory and empirical findings, which they believe will better incorporate psychosocial functioning into future diagnostic systems. Copyright 2009 APA, all rights reserved.

  2. Patients’ Acceptance of Smartphone Health Technology for Chronic Disease Management: A Theoretical Model and Empirical Test

    PubMed Central

    Dou, Kaili; Yu, Ping; Liu, Fang; Guan, YingPing; Li, Zhenye; Ji, Yumeng; Du, Ningkai; Lu, Xudong; Duan, Huilong

    2017-01-01

    Background Chronic disease patients often face multiple challenges from difficult comorbidities. Smartphone health technology can be used to help them manage their conditions only if they accept and use the technology. Objective The aim of this study was to develop and test a theoretical model to predict and explain the factors influencing patients’ acceptance of smartphone health technology for chronic disease management. Methods Multiple theories and factors that may influence patients’ acceptance of smartphone health technology have been reviewed. A hybrid theoretical model was built based on the technology acceptance model, dual-factor model, health belief model, and the factors identified from interviews that might influence patients’ acceptance of smartphone health technology for chronic disease management. Data were collected from patient questionnaire surveys and computer log records about 157 hypertensive patients’ actual use of a smartphone health app. The partial least square method was used to test the theoretical model. Results The model accounted for .412 of the variance in patients’ intention to adopt the smartphone health technology. Intention to use accounted for .111 of the variance in actual use and had a significant weak relationship with the latter. Perceived ease of use was affected by patients’ smartphone usage experience, relationship with doctor, and self-efficacy. Although without a significant effect on intention to use, perceived ease of use had a significant positive influence on perceived usefulness. Relationship with doctor and perceived health threat had significant positive effects on perceived usefulness, countering the negative influence of resistance to change. Perceived usefulness, perceived health threat, and resistance to change significantly predicted patients’ intentions to use the technology. Age and gender had no significant influence on patients’ acceptance of smartphone technology. The study also

  3. Supervision for School Psychologists in Training: Developing a Framework from Empirical Findings

    ERIC Educational Resources Information Center

    Gibbs, Simon; Atkinson, Cathy; Woods, Kevin; Bond, Caroline; Hill, Vivian; Howe, Julia; Morris, Sue

    2016-01-01

    Similar to other professional disciplines, the importance of supervision within school psychology has attracted considerable attention within recent years. Despite this, systematic review of current literature reveals a dearth of empirical literature proposing underlying theoretical structures. This study extends recent qualitative research by…

  4. Imitative Modeling as a Theoretical Base for Instructing Language-Disordered Children

    ERIC Educational Resources Information Center

    Courtright, John A.; Courtright, Illene C.

    1976-01-01

    A modification of A. Bandura's social learning theory (imitative modeling) was employed as a theoretical base for language instruction with eight language disordered children (5 to 10 years old). (Author/SBH)

  5. Biomarker-based strategy for early discontinuation of empirical antifungal treatment in critically ill patients: a randomized controlled trial.

    PubMed

    Rouzé, Anahita; Loridant, Séverine; Poissy, Julien; Dervaux, Benoit; Sendid, Boualem; Cornu, Marjorie; Nseir, Saad

    2017-11-01

    The aim of this study was to determine the impact of a biomarker-based strategy on early discontinuation of empirical antifungal treatment. Prospective randomized controlled single-center unblinded study, performed in a mixed ICU. A total of 110 patients were randomly assigned to a strategy in which empirical antifungal treatment duration was determined by (1,3)-β-D-glucan, mannan, and anti-mannan serum assays, performed on day 0 and day 4; or to a routine care strategy, based on international guidelines, which recommend 14 days of treatment. In the biomarker group, early stop recommendation was determined using an algorithm based on the results of biomarkers. The primary outcome was the percentage of survivors discontinuing empirical antifungal treatment early, defined as a discontinuation strictly before day 7. A total of 109 patients were analyzed (one patient withdrew consent). Empirical antifungal treatment was discontinued early in 29 out of 54 patients in the biomarker strategy group, compared with one patient out of 55 in the routine strategy group [54% vs 2%, p < 0.001, OR (95% CI) 62.6 (8.1-486)]. Total duration of antifungal treatment was significantly shorter in the biomarker strategy compared with routine strategy [median (IQR) 6 (4-13) vs 13 (12-14) days, p < 0.0001]. No significant difference was found in the percentage of patients with subsequent proven invasive Candida infection, mechanical ventilation-free days, length of ICU stay, cost, and ICU mortality between the two study groups. The use of a biomarker-based strategy increased the percentage of early discontinuation of empirical antifungal treatment among critically ill patients with suspected invasive Candida infection. These results confirm previous findings suggesting that early discontinuation of empirical antifungal treatment had no negative impact on outcome. However, further studies are needed to confirm the safety of this strategy. This trial was registered at Clinical
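
    The stopping algorithm itself is not given in the abstract; the sketch below only illustrates its general shape. The cutoff values and the rule that every biomarker must stay below its positivity cutoff on both day 0 and day 4 are placeholders, not the trial's actual algorithm.

```python
def early_stop_recommended(day0, day4, cutoffs):
    """Hypothetical stopping rule in the spirit of the trial's design:
    recommend stopping empirical antifungals before day 7 only when
    every biomarker is below its positivity cutoff on day 0 and day 4.
    day0/day4 map biomarker name -> measured value."""
    for marker, cutoff in cutoffs.items():
        if day0[marker] >= cutoff or day4[marker] >= cutoff:
            return False
    return True

# Placeholder positivity cutoffs (illustrative units, not the trial's):
CUTOFFS = {"beta_d_glucan": 80.0, "mannan": 125.0, "anti_mannan": 10.0}
```
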

  6. An Empirical Calibration of the Mixing-Length Parameter α

    NASA Astrophysics Data System (ADS)

    Ferraro, Francesco R.; Valenti, Elena; Straniero, Oscar; Origlia, Livia

    2006-05-01

    We present an empirical calibration of the mixing-length free parameter α based on a homogeneous infrared database of 28 Galactic globular clusters spanning a wide metallicity range (-2.15<[Fe/H]<-0.2). Empirical estimates of the red giant effective temperatures have been obtained from infrared colors. Suitable relations linking these temperatures to the cluster metallicity have been obtained and compared to theoretical predictions. An appropriate set of models for the Sun and Population II giants has been computed by using both the standard solar metallicity (Z/X)solar=0.0275 and the most recently proposed value (Z/X)solar=0.0177. We find that when the standard solar metallicity is adopted, a unique value of α=2.17 can be used to reproduce both the solar radius and the Population II red giant temperature. Conversely, when the new solar metallicity is adopted, two different values of α are required: α=1.86 to fit the solar radius and α~2.0 to fit the red giant temperatures. However, it must be noted that regardless of the adopted solar reference, the α-parameter does not show any significant dependence on metallicity. Based on observations collected at the European Southern Observatory (ESO), La Silla, Chile. Also based on observations made with the Italian Telescopio Nazionale Galileo (TNG) operated on the island of La Palma by the Fundacion Galileo Galilei of the INAF (Istituto Nazionale di Astrofisica) at the Spanish Observatorio del Roque de los Muchachos of the Instituto de Astrofisica de Canarias.

  7. Schema therapy for borderline personality disorder: a comprehensive review of its empirical foundations, effectiveness and implementation possibilities.

    PubMed

    Sempértegui, Gabriela A; Karreman, Annemiek; Arntz, Arnoud; Bekker, Marrie H J

    2013-04-01

    Borderline personality disorder is a serious psychiatric disorder for which the effectiveness of the current pharmacotherapeutical and psychotherapeutic approaches has been shown to be limited. In the last decades, schema therapy has increased in popularity as a treatment of borderline personality disorder; however, systematic evaluation of both effectiveness and empirical evidence for the theoretical background of the therapy is limited. This literature review comprehensively evaluates the current empirical status of schema therapy for borderline personality disorder. We first described the theoretical framework and reviewed its empirical foundations. Next, we examined the evidence regarding effectiveness and implementability. We found evidence for a considerable number of elements of Young's schema model; however, the strength of the results varies and there are also mixed results and some empirical blanks in the theory. The number of studies on effectiveness is small, but reviewed findings suggest that schema therapy is a promising treatment. In Western-European societies, the therapy could be readily implemented as a cost-effective strategy with positive economic consequences. Copyright © 2012 Elsevier Ltd. All rights reserved.

  8. Impact of Inadequate Empirical Therapy on the Mortality of Patients with Bloodstream Infections: a Propensity Score-Based Analysis

    PubMed Central

    Retamar, Pilar; Portillo, María M.; López-Prieto, María Dolores; Rodríguez-López, Fernando; de Cueto, Marina; García, María V.; Gómez, María J.; del Arco, Alfonso; Muñoz, Angel; Sánchez-Porto, Antonio; Torres-Tortosa, Manuel; Martín-Aspas, Andrés; Arroyo, Ascensión; García-Figueras, Carolina; Acosta, Federico; Corzo, Juan E.; León-Ruiz, Laura; Escobar-Lara, Trinidad

    2012-01-01

    The impact of the adequacy of empirical therapy on outcome for patients with bloodstream infections (BSI) is key for determining whether adequate empirical coverage should be prioritized over other, more conservative approaches. Recent systematic reviews outlined the need for new studies in the field, using improved methodologies. We assessed the impact of inadequate empirical treatment on the mortality of patients with BSI in the present-day context, incorporating recent methodological recommendations. A prospective multicenter cohort including all BSI episodes in adult patients was performed in 15 hospitals in Andalucía, Spain, over a 2-month period in 2006 to 2007. The main outcome variables were 14- and 30-day mortality. Adjusted analyses were performed by multivariate analysis and propensity score-based matching. Eight hundred one episodes were included. Inadequate empirical therapy was administered in 199 (24.8%) episodes; mortality at days 14 and 30 was 18.55% and 22.6%, respectively. After controlling for age, Charlson index, Pitt score, neutropenia, source, etiology, and presentation with severe sepsis or shock, inadequate empirical treatment was associated with increased mortality at days 14 and 30 (odds ratios [ORs], 2.12 and 1.56; 95% confidence intervals [95% CI], 1.34 to 3.34 and 1.01 to 2.40, respectively). The adjusted ORs after a propensity score-based matched analysis were 3.03 and 1.70 (95% CI, 1.60 to 5.74 and 0.98 to 2.98, respectively). In conclusion, inadequate empirical therapy is independently associated with increased mortality in patients with BSI. Programs to improve the quality of empirical therapy in patients with suspicion of BSI and optimization of definitive therapy should be implemented. PMID:22005999
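
    The adjusted odds ratios above come from multivariate models, but the underlying quantity is simple; as a minimal illustration (plain Python, with hypothetical counts rather than the study's data), a crude odds ratio and its Wald 95% confidence interval can be computed from a 2x2 table:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio for a 2x2 table with a Wald confidence interval.
    a: exposed, outcome present    b: exposed, outcome absent
    c: unexposed, outcome present  d: unexposed, outcome absent"""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical 2x2 table: mortality by adequacy of empirical therapy
or_, lo, hi = odds_ratio_ci(40, 159, 90, 512)
```

    Adjusted estimates such as those reported here additionally control for confounders (via regression or propensity-score matching), which a crude table cannot do.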

  9. Empirical Wavelet Transform Based Features for Classification of Parkinson's Disease Severity.

    PubMed

    Oung, Qi Wei; Muthusamy, Hariharan; Basah, Shafriza Nisha; Lee, Hoileong; Vijean, Vikneswaran

    2017-12-29

    Parkinson's disease (PD) is a progressive neurodegenerative disorder that affects a large part of the population. Symptoms of PD include tremor, rigidity, slowness of movement, and vocal impairment. To develop an effective diagnostic system, a number of algorithms have been proposed, mainly to distinguish healthy individuals from those with PD. However, most previous work was based on binary classification, with the early and advanced PD stages treated equally. Therefore, in this work, we propose a multiclass classification with three classes of PD severity level (mild, moderate, severe) and a healthy control class. The focus is to detect and classify PD using signals from wearable motion and audio sensors based on the empirical wavelet transform (EWT) and empirical wavelet packet transform (EWPT), respectively. The EWT/EWPT was applied to decompose both speech and motion data signals up to five levels. Next, several features were extracted after obtaining the instantaneous amplitudes and frequencies from the coefficients of the decomposed signals by applying the Hilbert transform. The performance of the algorithm was analysed using three classifiers: K-nearest neighbour (KNN), probabilistic neural network (PNN), and extreme learning machine (ELM). Experimental results demonstrated that the proposed approach can differentiate PD from non-PD subjects, including their severity level, with classification accuracies of more than 90% using EWT/EWPT-ELM on signals from motion and audio sensors, respectively. Additionally, classification accuracy of more than 95% was achieved when EWT/EWPT-ELM was applied to information from both signal types combined.
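
    The instantaneous amplitude/frequency step described above is a standard demodulation stage; as a rough sketch (plain NumPy, not the authors' implementation), the analytic signal can be built via the FFT and both quantities derived for a single decomposed component:

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT (equivalent to adding i * Hilbert transform)."""
    N = len(x)
    X = np.fft.fft(x)
    h = np.zeros(N)
    h[0] = 1.0
    if N % 2 == 0:
        h[N // 2] = 1.0
        h[1:N // 2] = 2.0   # double positive frequencies, zero negative ones
    else:
        h[1:(N + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

def inst_amp_freq(x, fs):
    """Instantaneous amplitude and frequency (Hz) of a real signal."""
    z = analytic_signal(x)
    amp = np.abs(z)
    phase = np.unwrap(np.angle(z))
    freq = np.diff(phase) * fs / (2 * np.pi)
    return amp, freq

# 5 Hz tone sampled at 100 Hz: amplitude ~1 and frequency ~5 away from the edges
fs = 100.0
t = np.arange(0, 10, 1 / fs)
amp, freq = inst_amp_freq(np.sin(2 * np.pi * 5 * t), fs)
```

    In the pipeline described above, this demodulation would be applied to each EWT/EWPT component before summary features are computed for the classifiers.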

  10. An Empirical Evaluation of Puzzle-Based Learning as an Interest Approach for Teaching Introductory Computer Science

    ERIC Educational Resources Information Center

    Merrick, K. E.

    2010-01-01

    This correspondence describes an adaptation of puzzle-based learning to teaching an introductory computer programming course. Students from two offerings of the course--with and without the puzzle-based learning--were surveyed over a two-year period. Empirical results show that the synthesis of puzzle-based learning concepts with existing course…

  11. Facts and values in psychotherapy-A critique of the empirical reduction of psychotherapy within evidence-based practice.

    PubMed

    Berg, Henrik; Slaattelid, Rasmus

    2017-10-01

    This paper addresses an implicit presupposition in research-supported psychological treatments and evidence-based practice in psychology. It argues that the notion of research-supported psychological treatments is based on a reductive conceptualisation of psychotherapy. Research-supported psychological treatments hinge upon an empirical reduction where psychotherapy schools become conceptualized as mere collections of empirical propositions. However, this paper argues that the different psychotherapy schools have distinct ethoses that are constituted by normative claims. Consequently, the evaluation of the different psychotherapy schools and the practice of psychotherapy should include the underlying normative claims of these ethoses. © 2017 John Wiley & Sons, Ltd.

  12. A thermodynamic and theoretical view for enzyme regulation.

    PubMed

    Zhao, Qinyi

    2015-01-01

    Precise regulation is fundamental to the proper functioning of enzymes in a cell. Current accounts of this, such as allosteric regulation and dynamic contributions to enzyme regulation, are experimental models and substantially empirical. Here we propose a theoretical and thermodynamic model of enzyme regulation. The main idea is that enzyme regulation proceeds via regulation of the abundance of the active conformation in the reaction buffer. The theoretical foundation, experimental evidence, and experimental criteria to test our model are discussed and reviewed. We conclude that the basic principles of enzyme regulation are laws of protein thermodynamics, and that regulation can be analyzed using the concept of the distribution curve of active conformations of enzymes.

  13. Foundations of quantum gravity: The role of principles grounded in empirical reality

    NASA Astrophysics Data System (ADS)

    Holman, Marc

    2014-05-01

    When attempting to assess the strengths and weaknesses of various principles in their potential role of guiding the formulation of a theory of quantum gravity, it is crucial to distinguish between principles which are strongly supported by empirical data - either directly or indirectly - and principles which instead (merely) rely heavily on theoretical arguments for their justification. Principles in the latter category are not necessarily invalid, but their a priori foundational significance should be regarded with due caution. These remarks are illustrated in terms of the current standard models of cosmology and particle physics, as well as their respective underlying theories, i.e., essentially general relativity and quantum (field) theory. For instance, it is clear that both standard models are severely constrained by symmetry principles: an effective homogeneity and isotropy of the known universe on the largest scales in the case of cosmology and an underlying exact gauge symmetry of nuclear and electromagnetic interactions in the case of particle physics. However, in sharp contrast to the cosmological situation, where the relevant symmetry structure is more or less established directly on observational grounds, all known, nontrivial arguments for the "gauge principle" are purely theoretical (and far less conclusive than usually advocated). Similar remarks apply to the larger theoretical structures represented by general relativity and quantum (field) theory, where - actual or potential - empirical principles, such as the (Einstein) equivalence principle or EPR-type nonlocality, should be clearly differentiated from theoretical ones, such as general covariance or renormalizability. It is argued that if history is to be of any guidance, the best chance to obtain the key structural features of a putative quantum gravity theory is by deducing them, in some form, from the appropriate empirical principles (analogous to the manner in which, say, the idea that

  14. Empirically Based Myths: Astrology, Biorhythms, and ATIs.

    ERIC Educational Resources Information Center

    Ragsdale, Ronald G.

    1980-01-01

    A myth may have an empirical basis through chance occurrence; perhaps Aptitude Treatment Interactions (ATIs) are in this category. While ATIs have great utility in describing, planning, and implementing instruction, few disordinal interactions have been found. Article suggests narrowing of ATI research with replications and estimates of effect…

  15. Experimental and Theoretical Study of Propeller Spinner/Shank Interference. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Cornell, C. C.

    1986-01-01

    A fundamental experimental and theoretical investigation into the aerodynamic interference associated with propeller spinner and shank regions was conducted. The research program involved a theoretical assessment of solutions previously proposed, followed by a systematic experimental study to supplement the existing data base. As a result, a refined computational procedure was established for prediction of interference effects in terms of interference drag and resolved into propeller thrust and torque components. These quantities were examined with attention to engineering parameters such as two spinner fineness ratios, three blade shank forms, and two/three/four/six/eight blades. Consideration of the physics of the phenomena aided in the logical deduction of two individual interference quantities (cascade effects and spinner/shank juncture interference). These interference effects were semi-empirically modeled using existing theories and placed into a compatible form with an existing propeller performance scheme which provided the basis for examples of application.

  16. A Sociocultural Perspective of Learning: Developing a New Theoretical Tenet

    ERIC Educational Resources Information Center

    Phan, Huy P.

    2012-01-01

    Explanation pertaining to individuals' cognitive development and learning approaches is a recurring theme in the areas of education and psychology. The work of Okagaki (e.g., Okagaki, 2001; Okagaki & Frensch, 1998), for example, has provided both theoretical and empirical insights into the structuring and situational positioning of individuals…

  17. Systems View of School Climate: A Theoretical Framework for Research

    ERIC Educational Resources Information Center

    Rudasill, Kathleen Moritz; Snyder, Kate E.; Levinson, Heather; Adelson, Jill L.

    2018-01-01

    School climate has been widely examined through both empirical and theoretical means. However, there is little conceptual consensus underlying the landscape of this literature, offering inconsistent guidance for research examining this important construct. In order to best assist the efforts of developing causal models that describe how school…

  18. The Theoretical Basis of the Effective School Improvement Model (ESI)

    ERIC Educational Resources Information Center

    Scheerens, Jaap; Demeuse, Marc

    2005-01-01

    This article describes the process of theoretical reflection that preceded the development and empirical verification of a model of "effective school improvement". The focus is on basic mechanisms that could be seen as underlying "getting things in motion" and change in education systems. Four mechanisms are distinguished:…

  19. Why do people need self-esteem? A theoretical and empirical review.

    PubMed

    Pyszczynski, Tom; Greenberg, Jeff; Solomon, Sheldon; Arndt, Jamie; Schimel, Jeff

    2004-05-01

    Terror management theory (TMT; J. Greenberg, T. Pyszczynski, & S. Solomon, 1986) posits that people are motivated to pursue positive self-evaluations because self-esteem provides a buffer against the omnipresent potential for anxiety engendered by the uniquely human awareness of mortality. Empirical evidence relevant to the theory is reviewed showing that high levels of self-esteem reduce anxiety and anxiety-related defensive behavior, reminders of one's mortality increase self-esteem striving and defense of self-esteem against threats in a variety of domains, high levels of self-esteem eliminate the effect of reminders of mortality on both self-esteem striving and the accessibility of death-related thoughts, and convincing people of the existence of an afterlife eliminates the effect of mortality salience on self-esteem striving. TMT is compared with other explanations for why people need self-esteem, and a critique of the most prominent of these, sociometer theory, is provided. ((c) 2004 APA, all rights reserved)

  20. Unsupervised active learning based on hierarchical graph-theoretic clustering.

    PubMed

    Hu, Weiming; Hu, Wei; Xie, Nianhua; Maybank, Steve

    2009-10-01

    Most existing active learning approaches are supervised. Supervised active learning has the following problems: inefficiency in dealing with the semantic gap between the distribution of samples in the feature space and their labels, lack of ability in selecting new samples that belong to new categories that have not yet appeared in the training samples, and lack of adaptability to changes in the semantic interpretation of sample categories. To tackle these problems, we propose an unsupervised active learning framework based on hierarchical graph-theoretic clustering. In the framework, two promising graph-theoretic clustering algorithms, namely, dominant-set clustering and spectral clustering, are combined in a hierarchical fashion. Our framework has some advantages, such as ease of implementation, flexibility in architecture, and adaptability to changes in the labeling. Evaluations on data sets for network intrusion detection, image classification, and video classification have demonstrated that our active learning framework can effectively reduce the workload of manual classification while maintaining a high accuracy of automatic classification. It is shown that, overall, our framework outperforms the support-vector-machine-based supervised active learning, particularly in terms of dealing much more efficiently with new samples whose categories have not yet appeared in the training samples.

  1. Theoretical Bases for Teacher- and Peer-Delivered Sexual Health Promotion

    ERIC Educational Resources Information Center

    Wight, Daniel

    2008-01-01

    Purpose: This paper seeks to explore the theoretical bases for teacher-delivered and peer-delivered sexual health promotion and education. Design/methodology/approach: The first section briefly outlines the main theories informing sexual health interventions for young people, and the second discusses their implications for modes of delivery.…

  2. Study of network resource allocation based on market and game theoretic mechanism

    NASA Astrophysics Data System (ADS)

    Liu, Yingmei; Wang, Hongwei; Wang, Gang

    2004-04-01

    We address the problem of network resource allocation as a network management system function based on a market-oriented mechanism. The scheme models telecommunication network resources as trading goods, in which the various network components could be owned by different competitive, real-world entities. This is a multidisciplinary framework concentrating on the similarity between resource allocation in a network environment and the market mechanism in economic theory. By taking an economic (market-based and game-theoretic) approach to routing in communication networks, we study the dynamic behavior of network resource allocation under a game-theoretic framework. Based on the prior work of Gibney and Jennings, we apply the concepts of utility and fitness to the market mechanism, with the intention of closing the gap between the experimental environment and real-world situations.

  3. Like grandparents, like parents: Empirical evidence and psychoanalytic thinking on the transmission of parenting styles.

    PubMed

    De Carli, Pietro; Tagini, Angela; Sarracino, Diego; Santona, Alessandra; Bonalda, Valentina; Cesari, Paola Elena; Parolin, Laura

    2018-01-01

    The authors discuss the issue of intergenerational transmission of parenting from an empirical and psychoanalytic perspective. After presenting a framework to explain their conception of parenting, they describe intergenerational transmission of parenting as a key to interpreting and eventually changing parenting behaviors. Then they present (1) the empirical approach aimed at determining whether there is actually stability across generations that contributes to harsh parenting and eventually maltreatment and (2) the psychoanalytic thinking that seeks to explain the continuity in terms of representations and clinical phenomena. The authors also discuss the relationship between the attachment and the caregiving systems and hypothesize a common base for the two systems in childhood experience. Finally, they propose the psychoanalytic perspective as a fruitful theoretical framework to integrate the evidence for the neurophysiological mediators and moderators of intergenerational transmission. Psychoanalytically informed research can provide clinically relevant insights and hypotheses to be tested.

  4. Development of an empirically based dynamic biomechanical strength model

    NASA Technical Reports Server (NTRS)

    Pandya, A.; Maida, J.; Aldridge, A.; Hasson, S.; Woolford, B.

    1992-01-01

    The focus here is on the development of a dynamic strength model for humans. Our model is based on empirical data. The shoulder, elbow, and wrist joints are characterized in terms of maximum isolated torque, position, and velocity in all rotational planes. This information is reduced by a least squares regression technique into a table of single variable second degree polynomial equations determining the torque as a function of position and velocity. The isolated joint torque equations are then used to compute forces resulting from a composite motion, which in this case is a ratchet wrench push and pull operation. What is presented here is a comparison of the computed or predicted results of the model with the actual measured values for the composite motion.
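
    The reduction step described above, fitting torque as a second-degree polynomial in position and velocity by least squares, can be sketched as follows (synthetic data and one plausible polynomial form; the paper's exact equations and coefficients are not reproduced here):

```python
import numpy as np

def fit_torque(pos, vel, torque):
    """Least-squares fit of torque = c0 + c1*p + c2*p^2 + c3*v + c4*v^2.
    (One plausible second-degree form; the original model's terms may differ.)"""
    A = np.column_stack([np.ones_like(pos), pos, pos**2, vel, vel**2])
    coeffs, *_ = np.linalg.lstsq(A, torque, rcond=None)
    return coeffs

# Synthetic joint data generated from known coefficients
rng = np.random.default_rng(1)
pos = rng.uniform(-1.5, 1.5, 200)   # joint angle, rad
vel = rng.uniform(-2.0, 2.0, 200)   # angular velocity, rad/s
true = np.array([50.0, -8.0, -3.0, -12.0, 1.5])
torque = true @ np.array([np.ones_like(pos), pos, pos**2, vel, vel**2])
coeffs = fit_torque(pos, vel, torque)
```

    A table of such per-joint fits can then be evaluated along any composite motion trajectory, which is how the model above predicts forces for the ratchet wrench task.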

  5. Measuring metacognitive ability based on science literacy in dynamic electricity topic

    NASA Astrophysics Data System (ADS)

    Warni; Sunyono; Rosidin

    2018-01-01

    This study aims to produce an instrument for assessing metacognition ability based on science literacy that is theoretically and empirically feasible for the topic of dynamic electricity. The feasibility of the assessment instrument covers theoretical validity in the material, construction, and language aspects, as well as empirical validity, reliability, difficulty index, discriminating power, and distractor function. The development of the assessment instrument follows the Dick and Carey model, which includes a preliminary study stage, initial product development, validation and revision, and piloting. The instrument was tested on 32 students of class IX in SMP Negeri 20 Bandar Lampung, using a One Group Pretest-Posttest design. The results show that the instrument is theoretically feasible, with a theoretical validity of 95.44%; empirically, 43.75% of the items fall in the high-validity category, 43.75% in the medium category, and 12.50% in the low category. The reliability of the instrument is 0.83 (high category). The difficulty level is difficult for 31.25% of the items and medium for 68.75%. Discriminating power is very good for 12.50% of the items, good for 62.50%, and medium for 25.00%. The distractor function of the multiple-choice items is good for 80.00% and medium for 20.00%.
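
    The classical item statistics reported here, difficulty index and discriminating power, are simple proportions over a scored response matrix. A minimal sketch (hypothetical 0/1 responses; the upper/lower 27% split is one common convention, not necessarily the one used in this study):

```python
import numpy as np

def item_analysis(scores):
    """scores: (students x items) 0/1 matrix.
    Difficulty = proportion correct per item; discrimination = P(upper) - P(lower),
    using the top and bottom 27% of students ranked by total score."""
    totals = scores.sum(1)
    order = np.argsort(totals)
    k = max(1, int(round(0.27 * len(scores))))
    lower, upper = scores[order[:k]], scores[order[-k:]]
    difficulty = scores.mean(0)
    discrimination = upper.mean(0) - lower.mean(0)
    return difficulty, discrimination

# Hypothetical responses: 10 students x 4 items
scores = np.array([
    [1, 1, 1, 0], [1, 1, 0, 0], [1, 1, 1, 1], [1, 0, 0, 0], [1, 1, 1, 0],
    [0, 1, 0, 0], [1, 1, 1, 1], [0, 0, 0, 0], [1, 1, 1, 0], [1, 1, 1, 0],
])
difficulty, discrimination = item_analysis(scores)
```

    Items with difficulty near 0.5 and high discrimination are usually retained; the percentages in the abstract summarize exactly these per-item statistics over the whole instrument.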

  6. Patients' Acceptance of Smartphone Health Technology for Chronic Disease Management: A Theoretical Model and Empirical Test.

    PubMed

    Dou, Kaili; Yu, Ping; Deng, Ning; Liu, Fang; Guan, YingPing; Li, Zhenye; Ji, Yumeng; Du, Ningkai; Lu, Xudong; Duan, Huilong

    2017-12-06

    Chronic disease patients often face multiple challenges from difficult comorbidities. Smartphone health technology can be used to help them manage their conditions only if they accept and use the technology. The aim of this study was to develop and test a theoretical model to predict and explain the factors influencing patients' acceptance of smartphone health technology for chronic disease management. Multiple theories and factors that may influence patients' acceptance of smartphone health technology have been reviewed. A hybrid theoretical model was built based on the technology acceptance model, dual-factor model, health belief model, and the factors identified from interviews that might influence patients' acceptance of smartphone health technology for chronic disease management. Data were collected from patient questionnaire surveys and computer log records about 157 hypertensive patients' actual use of a smartphone health app. The partial least square method was used to test the theoretical model. The model accounted for .412 of the variance in patients' intention to adopt the smartphone health technology. Intention to use accounted for .111 of the variance in actual use and had a significant weak relationship with the latter. Perceived ease of use was affected by patients' smartphone usage experience, relationship with doctor, and self-efficacy. Although without a significant effect on intention to use, perceived ease of use had a significant positive influence on perceived usefulness. Relationship with doctor and perceived health threat had significant positive effects on perceived usefulness, countering the negative influence of resistance to change. Perceived usefulness, perceived health threat, and resistance to change significantly predicted patients' intentions to use the technology. Age and gender had no significant influence on patients' acceptance of smartphone technology. The study also confirmed the positive relationship between intention to use

  7. Group Theoretical Characterization of Wave Equations

    NASA Astrophysics Data System (ADS)

    Nisticò, Giuseppe

    2017-12-01

    Group theoretical methods, worked out in particular by Mackey and Wigner, allow one to attain the explicit Quantum Theory of a free particle through a purely deductive development based on symmetry principles. The extension of these methods to the case of an interacting particle finds a serious obstacle in the loss of the symmetry condition for the transformations of Galilei's group. The known attempts towards such an extension introduce restrictions which lead to theories empirically too limited. In the present article we show how the difficulties raised by the loss of symmetry can be overcome without the restrictions that affect the past attempts. According to our results, the different specific forms of the wave equation of an interacting particle are implied by particular first order invariance properties that characterize the interaction with respect to specific sub-groups of Galilean transformations. Moreover, the possibility of yet unknown forms of the wave equation is left open.

  8. Untangling the Evidence: Introducing an Empirical Model for Evidence-Based Library and Information Practice

    ERIC Educational Resources Information Center

    Gillespie, Ann

    2014-01-01

    Introduction: This research is the first to investigate the experiences of teacher-librarians as evidence-based practice. An empirically derived model is presented in this paper. Method: This qualitative study utilised the expanded critical incident approach, and investigated the real-life experiences of fifteen Australian teacher-librarians,…

  9. A new prescription for empirical ethics research in pharmacy: a critical review of the literature

    PubMed Central

    Cooper, R J; Bissell, P; Wingfield, J

    2007-01-01

    Empirical ethics research is increasingly valued in bioethics and healthcare more generally, but there remain as yet under‐researched areas such as pharmacy, despite the increasingly visible attempts by the profession to embrace additional roles beyond the supply of medicines. A descriptive and critical review of the extant empirical pharmacy ethics literature is provided here. A chronological change from quantitative to qualitative approaches is highlighted in this review, as well as differing theoretical approaches such as cognitive moral development and the four principles of biomedical ethics. Research with pharmacy student cohorts is common, as is representation from American pharmacists. Many examples of ethical problems are identified, as well as commercial and legal influences on ethical understanding and decision making. In this paper, it is argued that as pharmacy seeks to develop additional roles with concomitant ethical responsibilities, a new prescription is needed for empirical ethics research in pharmacy—one that embraces an agenda of systematic research using a plurality of methodological and theoretical approaches to better explore this under‐researched discipline. PMID:17264193

  10. [Methodological deficits in neuroethics: do we need theoretical neuroethics?].

    PubMed

    Northoff, G

    2013-10-01

    Current neuroethics can be characterized best as empirical neuroethics: it is strongly empirically oriented in that it not only includes empirical findings from neuroscience but also searches for applications within neuroscience. This, however, neglects the social and political contexts which could be subject to a future social neuroethics. In addition, methodological issues need to be considered as in theoretical neuroethics. The focus in this article is on two such methodological issues: (1) the analysis of the different levels and their inferences among each other which is exemplified by the inference of consciousness from the otherwise purely neuronal data in patients with vegetative state and (2) the problem of linking descriptive and normative concepts in a non-reductive and non-inferential way for which I suggest the mutual contextualization between both concepts. This results in a methodological strategy that can be described as contextual fact-norm iterativity.

  11. A test-based strategy is more cost effective than empiric dose escalation for patients with Crohn's disease who lose responsiveness to infliximab.

    PubMed

    Velayos, Fernando S; Kahn, James G; Sandborn, William J; Feagan, Brian G

    2013-06-01

    Patients with Crohn's disease who become unresponsive to therapy with tumor necrosis factor antagonists are managed initially with either empiric dose escalation or testing-based strategies. The comparative cost effectiveness of these 2 strategies is unknown. We investigated whether a testing-based strategy is more cost effective than an empiric dose-escalation strategy. A decision analytic model that simulated 2 cohorts of patients with Crohn's disease compared outcomes for the 2 strategies over a 1-year time period. The incremental cost-effectiveness ratio of the empiric strategy was expressed as cost per quality-adjusted life-year (QALY) gained, compared with the testing-based strategy. We performed 1-way, probabilistic, and prespecified secondary analyses. The testing strategy yielded similar QALYs compared with the empiric strategy (0.801 vs 0.800, respectively) but was less expensive ($31,870 vs $37,266, respectively). In sensitivity analyses, the incremental cost-effectiveness ratio of the empiric strategy ranged from $500,000 to more than $5 million per QALY gained. Similar rates of remission (63% vs 66%) and response (28% vs 26%) were achieved through differential use of available interventions. The testing-based strategy resulted in a higher percentage of surgeries (48% vs 34%) and lower percentage use of high-dose biological therapy (41% vs 54%). A testing-based strategy is a cost-effective alternative to the current strategy of empiric dose escalation for managing patients with Crohn's disease who have lost responsiveness to infliximab. The basis for this difference is lower cost at similar outcomes. Copyright © 2013 AGA Institute. Published by Elsevier Inc. All rights reserved.
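
    The comparison reduces to an incremental cost-effectiveness ratio, ICER = (difference in cost) / (difference in QALYs). A minimal sketch using the base-case figures quoted above shows the empiric strategy is dominated (more expensive with no QALY gain):

```python
def icer(cost_new, qaly_new, cost_ref, qaly_ref):
    """Incremental cost-effectiveness ratio of 'new' vs 'ref'.
    Returns (delta_cost, delta_qaly, icer); icer is None when the new
    strategy is dominated (costs more while yielding no extra QALYs)."""
    d_cost = cost_new - cost_ref
    d_qaly = qaly_new - qaly_ref
    if d_cost > 0 and d_qaly <= 0:
        return d_cost, d_qaly, None  # dominated: never cost effective
    return d_cost, d_qaly, d_cost / d_qaly

# Base-case values from the abstract: empiric vs testing-based strategy
d_cost, d_qaly, ratio = icer(37266, 0.800, 31870, 0.801)
```

    The sensitivity analyses quoted above correspond to parameter settings in which the empiric strategy does gain QALYs, but at $500,000 or more per QALY gained.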

  12. [Mes differ by positioning: empirical testing of decentralized dynamics of the self].

    PubMed

    Mizokami, Shinichi

    2013-10-01

    The present study empirically tested the conceptualization of the decentralized dynamics of the self proposed by Hermans & Kempen (1993), which they developed theoretically and from clinical cases, not from large samples of empirical data. They posited that worldviews and images of the self could vary by positioning even in the same individual, and denied that the ego was an omniscient entity that knew and controlled all aspects of the self (centralized ego). Study 1 tested their conceptualization empirically with 47 university students in an experimental group and 17 as a control group. The results showed that the scores on the Rosenberg's self-esteem scale and images of the Mes in the experimental group significantly varied by positioning, but those in the control group did not. Similar results were found in Study 2 with a sample of 120 university students. These results empirically supported the conceptualization of the decentralized dynamics of the self.

  13. Empirical Research on Spatial Diffusion Process of Knowledge Spillovers

    NASA Astrophysics Data System (ADS)

    Jin, Xuehui

    2018-02-01

    Firstly, this paper gives a brief review of the core issues in previous studies of the spatial distribution of knowledge spillovers, laying the theoretical foundation for further research. Secondly, it describes the diffusion process of solar patents in the Beijing-Tianjin-Hebei and Pearl River Delta regions by means of correlation analysis based on patent application dates and patentee addresses. It then introduces the variables of spatial distance, knowledge absorptive capacity, knowledge gap, and pollution control, builds an empirical patent model, and collects data to test it. The results show that knowledge absorptive capacity is the most significant of the four factors, followed by the knowledge gap; the influence of spatial distance on knowledge spillovers is limited, and pollution control is the weakest factor.

  14. Feasibility of an Empirically Based Program for Parents of Preschoolers with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Dababnah, Sarah; Parish, Susan L.

    2016-01-01

    This article reports on the feasibility of implementing an existing empirically based program, "The Incredible Years," tailored to parents of young children with autism spectrum disorder. Parents raising preschool-aged children (aged 3-6 years) with autism spectrum disorder (N = 17) participated in a 15-week pilot trial of the…

  15. Empirical Determination of Effectiveness of a Competency Based Program in Distributive Education. Final Report.

    ERIC Educational Resources Information Center

    Charters, Margaret; And Others

    The primary objective of the Syracuse project was to make an empirical determination of the effectiveness of a competency-based (CB) distributive education program by comparing student achievement in three of its major components with similar traditionally organized courses at Syracuse, Buffalo, and Baruch. The three components were retailing,…

  16. Towards a Theoretical Framework for the Comparative Understanding of Globalisation, Higher Education, the Labour Market and Inequality

    ERIC Educational Resources Information Center

    Kupfer, Antonia

    2011-01-01

    This paper is a theoretical examination of three major empirical trends that affect many people: globalisation, increasingly close relations between higher education (HE) and labour markets, and increasing social inequality. Its aim is to identify key theoretical resources and their contribution to the development of a comparative theoretical…

  17. Is Project Based Learning More Effective than Direct Instruction in School Science Classrooms? An Analysis of the Empirical Research Evidence

    NASA Astrophysics Data System (ADS)

    Dann, Clifford

An increasingly loud call by parents, school administrators, teachers, and even business leaders for "authentic learning", emphasizing both group-work and problem solving, has led to growing enthusiasm for inquiry-based learning over the past decade. Although "inquiry" can be defined in many ways, a curriculum called "project-based learning" has recently emerged as the inquiry practice-of-choice with roots in the educational constructivism that emerged in the mid-twentieth century. Often, project-based learning is framed as an alternative instructional strategy to direct instruction for maximizing student content knowledge. This study investigates the empirical evidence for such a comparison while also evaluating the overall quality of the available studies in the light of accepted standards for educational research. Specifically, this thesis investigates what the body of quantitative research says about the efficacy of project-based learning vs. direct instruction when considering student acquisition of content knowledge in science classrooms. Further, existing limitations of the research pertaining to project-based learning and secondary school education are explored. The thesis concludes with a discussion of where and how we should focus our empirical efforts in the future. The review revealed that the available empirical research contains flaws in both design and instrumentation. In particular, randomization is poor amongst all the studies considered. The empirical evidence indicates that project-based learning curricula improved student content knowledge but that, while the results were statistically significant, increases in raw test scores were marginal.

  18. Deriving Multidimensional Poverty Indicators: Methodological Issues and an Empirical Analysis for Italy

    ERIC Educational Resources Information Center

    Coromaldi, Manuela; Zoli, Mariangela

    2012-01-01

    Theoretical and empirical studies have recently adopted a multidimensional concept of poverty. There is considerable debate about the most appropriate degree of multidimensionality to retain in the analysis. In this work we add to the received literature in two ways. First, we derive indicators of multiple deprivation by applying a particular…

  19. A Theoretical Perspective of Learning in the Pacific Context: A Sociocultural Perspective

    ERIC Educational Resources Information Center

    Phan, Huy P.

    2010-01-01

    This theoretical article discusses the importance of learning approaches in sociocultural contexts. Our discussion synthesizes previous empirical research studies, taking into consideration the importance of individuals' cultural background and environmental settings. Research studies by Marton and Saljo (1976) and others (Biggs, 1987; Watkins…

  20. Rural Schools, Social Capital and the Big Society: A Theoretical and Empirical Exposition

    ERIC Educational Resources Information Center

    Bagley, Carl; Hillyard, Sam

    2014-01-01

    The paper commences with a theoretical exposition of the current UK government's policy commitment to the idealised notion of the Big Society and the social capital currency underpinning its formation. The paper positions this debate in relation to the rural and adopts an ethnographically-informed methodological approach to provide an in-depth…

  1. How "Does" the Comforting Process Work? An Empirical Test of an Appraisal-Based Model of Comforting

    ERIC Educational Resources Information Center

    Jones, Susanne M.; Wirtz, John G.

    2006-01-01

    Burleson and Goldsmith's (1998) comforting model suggests an appraisal-based mechanism through which comforting messages can bring about a positive change in emotional states. This study is a first empirical test of three causal linkages implied by the appraisal-based comforting model. Participants (N=258) talked about an upsetting event with a…

  2. Dynamics of bloggers’ communities: Bipartite networks from empirical data and agent-based modeling

    NASA Astrophysics Data System (ADS)

    Mitrović, Marija; Tadić, Bosiljka

    2012-11-01

We present an analysis of the empirical data and the agent-based modeling of the emotional behavior of users on the Web portals where the user interaction is mediated by posted comments, like Blogs and Diggs. We consider the dataset of discussion-driven popular Diggs, in which all comments are screened by machine-learning emotion detection in the text, to determine positive and negative valence (attractiveness and aversiveness) of each comment. By mapping the data onto a suitable bipartite network, we perform an analysis of the network topology and the related time-series of the emotional comments. The agent-based model is then introduced to simulate the dynamics and to capture the emergence of the emotional behaviors and communities. The agents are linked to posts on a bipartite network, whose structure evolves through their actions on the posts. The emotional states (arousal and valence) of each agent fluctuate in time, subject to the current contents of the posts to which the agent is exposed. By an agent's action on a post its current emotions are transferred to the post. The model rules and the key parameters are inferred from the considered empirical data to ensure their realistic values and mutual consistency. The model assumes that the emotional arousal over posts drives the agent's action. The simulations are performed for the case of constant flux of agents and the results are analyzed in full analogy with the empirical data. The main conclusions are that the emotion-driven dynamics leads to long-range temporal correlations and emergent networks with community structure, that are comparable with the ones in the empirical system of popular posts. Given purely emotion-driven agent actions, this type of comparison provides a quantitative measure for the role of emotions in the dynamics on real blogs. Furthermore, the model reveals the underlying mechanisms which relate the post popularity with the emotion dynamics and the prevalence of negative…
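The core rule of the model described above, that an agent's arousal drives its action and its valence is transferred to the post it acts on, can be sketched minimally as follows. The threshold and relaxation factor here are illustrative assumptions, not the parameters inferred from the Digg data.

```python
def act(agent, post, threshold=0.5, relaxation=0.5):
    """Apply one agent-post interaction; return True if the agent acted.

    The agent acts only when its arousal exceeds the threshold; acting
    transfers its current valence to the post and relaxes its arousal.
    """
    if agent["arousal"] >= threshold:
        post["valence"] += agent["valence"]   # emotion transferred to post
        post["comments"] += 1
        agent["arousal"] *= relaxation        # arousal relaxes after acting
        return True
    return False

agent = {"arousal": 0.8, "valence": -0.3}
post = {"valence": 0.0, "comments": 0}
act(agent, post)   # acts: post picks up the agent's negative valence
act(agent, post)   # arousal has relaxed below threshold: no action
```

In the full model this rule would be applied across many agents on an evolving bipartite agent-post network; the sketch shows only a single interaction.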

  3. The Contribution of Environmental Assessment to Sustainable Development: Toward a Richer Empirical Understanding

    NASA Astrophysics Data System (ADS)

    Cashmore, Matthew; Bond, Alan; Cobb, Dick

    2007-09-01

    It has long been suggested that environmental assessment has the potential to contribute to sustainable development through mechanisms above and beyond informing design and consent decisions, and while theories have been proposed to explain how this might occur, few have been subjected to rigorous empirical validation. This research advances the theoretical debate by building a rich empirical understanding of environmental assessment’s practical outcomes, from which its potential to contribute to sustainable development can be gauged. Three case study environmental assessment processes in England were investigated using a combination of data generated from content analysis, in-depth interviews, and a questionnaire survey. Four categories of outcomes are delineated based on the research data: learning outcomes; governance outcomes; attitudinal and value changes; and developmental outcomes. The data provide a robust critique of mainstream theory, with its focus on design and consent decisions. The article concludes with an examination of the consequences of the context-specific nature of environmental assessment practices in terms of developing theory and focusing future research.

  4. Corrective Feedback in L2 Writing: Theoretical Perspectives, Empirical Insights, and Future Directions

    ERIC Educational Resources Information Center

    Van Beuningen, Catherine

    2010-01-01

    The role of (written) corrective feedback (CF) in the process of acquiring a second language (L2) has been an issue of considerable controversy among theorists and researchers alike. Although CF is a widely applied pedagogical tool and its use finds support in SLA theory, practical and theoretical objections to its usefulness have been raised…

  5. 'Nobody tosses a dwarf!' The relation between the empirical and the normative reexamined.

    PubMed

    Leget, Carlo; Borry, Pascal; de Vries, Raymond

    2009-05-01

    This article discusses the relation between empirical and normative approaches in bioethics. The issue of dwarf tossing, while admittedly unusual, is chosen as a point of departure because it challenges the reader to look with fresh eyes upon several central bioethical themes, including human dignity, autonomy, and the protection of vulnerable people. After an overview of current approaches to the integration of empirical and normative ethics, we consider five ways that the empirical and normative can be brought together to speak to the problem of dwarf tossing: prescriptive applied ethics, theoretical ethics, critical applied ethics, particularist ethics and integrated empirical ethics. We defend a position of critical applied ethics that allows for a two-way relation between empirical and normative theories. Against efforts fully to integrate the normative and the empirical into one synthesis, we propose that the two should stand in tension and relation to one another. The approach we endorse acknowledges that a social practice can and should be judged both by the gathering of empirical data and by normative ethics. Critical applied ethics uses a five stage process that includes: (a) determination of the problem, (b) description of the problem, (c) empirical study of effects and alternatives, (d) normative weighing and (e) evaluation of the effects of a decision. In each stage, we explore the perspective from both the empirical (sociological) and the normative ethical point of view. We conclude by applying our five-stage critical applied ethics to the example of dwarf tossing.

  6. Reviews of theoretical frameworks: Challenges and judging the quality of theory application.

    PubMed

    Hean, Sarah; Anderson, Liz; Green, Chris; John, Carol; Pitt, Richard; O'Halloran, Cath

    2016-06-01

Rigorous reviews of available information, from a range of resources, are required to support medical and health educators in their decision making. The aim of this article is to highlight the importance of a review of theoretical frameworks specifically as a supplement to reviews that focus on a synthesis of the empirical evidence alone. Establishing a shared understanding of theory as a concept is highlighted as a challenge and some practical strategies to achieving this are presented. This article also introduces the concept of theoretical quality, arguing that a critique of how theory is applied should complement the methodological appraisal of the literature in a review. We illustrate the challenge of establishing a shared meaning of theory through reference to experiences of an ongoing review of this kind conducted in the field of interprofessional education (IPE) and use a high-scoring paper selected in this review to illustrate how theoretical quality can be assessed. In reaching a shared understanding of theory as a concept, practical strategies that promote experiential and practical ways of knowing are required in addition to more propositional ways of sharing knowledge. Concepts of parsimony, testability, operational adequacy and empirical adequacy are explored as concepts that establish theoretical quality. Reviews of theoretical frameworks used in medical education are required to inform educational practice. Review teams should make time and effort to reach a shared understanding of the term theory. Theory reviews, and reviews more widely, should add an assessment of theory application to the protocol of their review method.

  7. An empirically-based model for the lift coefficients of twisted airfoils with leading-edge tubercles

    NASA Astrophysics Data System (ADS)

    Ni, Zao; Su, Tsung-chow; Dhanak, Manhar

    2018-04-01

    Experimental data for untwisted airfoils are utilized to propose a model for predicting the lift coefficients of twisted airfoils with leading-edge tubercles. The effectiveness of the empirical model is verified through comparison with results of a corresponding computational fluid-dynamic (CFD) study. The CFD study is carried out for both twisted and untwisted airfoils with tubercles, the latter shown to compare well with available experimental data. Lift coefficients of twisted airfoils predicted from the proposed empirically-based model match well with the corresponding coefficients determined using the verified CFD study. Flow details obtained from the latter provide better insight into the underlying mechanism and behavior at stall of twisted airfoils with leading edge tubercles.
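The idea of predicting a twisted airfoil's lift from untwisted-section data can be sketched in strip-theory style: evaluate the untwisted lift curve at each spanwise station's local angle of attack and average over the span. The linear root-to-tip twist, the strip averaging, and the thin-airfoil placeholder lift curve are illustrative assumptions, not the paper's actual model.

```python
import math

def twisted_airfoil_cl(cl_untwisted, root_alpha_deg, twist_deg, n_strips=50):
    """Average the untwisted-section lift coefficient over the span,
    with twist varying linearly from root (0) to tip (twist_deg)."""
    total = 0.0
    for i in range(n_strips):
        frac = (i + 0.5) / n_strips              # spanwise station: 0 root, 1 tip
        total += cl_untwisted(root_alpha_deg + frac * twist_deg)
    return total / n_strips

# Placeholder pre-stall section lift curve: thin-airfoil Cl = 2*pi*alpha.
cl_section = lambda a_deg: 2.0 * math.pi * math.radians(a_deg)
cl = twisted_airfoil_cl(cl_section, 4.0, 6.0)
```

With a linear section lift curve this reduces to the untwisted value at the mid-span angle of attack (7 degrees here); a nonlinear, stalling lift curve, such as measured data for tubercled sections, would not, which is where such a model earns its keep.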

  8. Development of an Empirically Based Questionnaire to Investigate Young Students' Ideas about Nature of Science

    ERIC Educational Resources Information Center

    Chen, Sufen; Chang, Wen-Hua; Lieu, Sang-Chong; Kao, Huey-Lien; Huang, Mao-Tsai; Lin, Shu-Fen

    2013-01-01

    This study developed an empirically based questionnaire to monitor young learners' conceptions of nature of science (NOS). The questionnaire, entitled Students' Ideas about Nature of Science (SINOS), measured views on theory-ladenness, use of creativity and imagination, tentativeness of scientific knowledge, durability of scientific knowledge,…

  9. From the Cover: The growth of business firms: Theoretical framework and empirical evidence

    NASA Astrophysics Data System (ADS)

    Fu, Dongfeng; Pammolli, Fabio; Buldyrev, S. V.; Riccaboni, Massimo; Matia, Kaushik; Yamasaki, Kazuko; Stanley, H. Eugene

    2005-12-01

We introduce a model of proportional growth to explain the distribution Pg(g) of business-firm growth rates. The model predicts that Pg(g) is exponential in the central part and exhibits asymptotic power-law behavior in the tails with an exponent of 3. Because of data limitations, previous studies in this field have focused exclusively on the Laplace shape of the body of the distribution. In this article, we test the model at different levels of aggregation in the economy, from products to firms to countries, and we find that the predictions of the model agree with empirical growth distributions and size-variance relationships. Keywords: proportional growth | preferential attachment | Laplace distribution
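The predicted shape, an exponential (Laplace) body crossing over to power-law tails with exponent 3, can be written down as a toy density. The scale and crossover values below are illustrative assumptions, not parameters fitted in the paper, and the density is left unnormalized for clarity.

```python
import math

def growth_pdf(g, b=0.05, gc=0.15):
    """Toy (unnormalized) growth-rate density: exponential body for
    |g| <= gc, matched continuously to a |g|**-3 power-law tail."""
    if abs(g) <= gc:
        return math.exp(-abs(g) / b)
    # tail matched to the body's value at the crossover point gc
    return math.exp(-gc / b) * (gc / abs(g)) ** 3

# Exponent-3 signature: doubling g deep in the tail cuts the density by 2**3 = 8.
ratio = growth_pdf(0.3) / growth_pdf(0.6)
```

The factor-of-8 drop per doubling is the practical fingerprint of the exponent-3 tail that distinguishes this shape from a pure Laplace distribution, whose tails fall off exponentially.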

  10. Developing an Empirical Model for Jet-Surface Interaction Noise

    NASA Technical Reports Server (NTRS)

    Brown, Clifford A.

    2014-01-01

The process of developing an empirical model for jet-surface interaction noise is described and the resulting model evaluated. Jet-surface interaction noise is generated when the high-speed engine exhaust from modern tightly integrated or conventional high-bypass ratio engine aircraft strikes or flows over the airframe surfaces. An empirical model based on an existing experimental database is developed for use in preliminary design system level studies where computation speed and range of configurations are valued over absolute accuracy to select the most promising (or eliminate the worst) possible designs. The model developed assumes that the jet-surface interaction noise spectra can be separated from the jet mixing noise and described as a parabolic function with three coefficients: peak amplitude, spectral width, and peak frequency. These coefficients are fit to functions of surface length and distance from the jet lipline to form a characteristic spectrum, which is then adjusted for changes in jet velocity and/or observer angle using scaling laws from published theoretical and experimental work. The resulting model is then evaluated for its ability to reproduce the characteristic spectrum and then for reproducing spectra measured at other jet velocities and observer angles; successes and limitations are discussed considering the complexity of the jet-surface interaction noise versus the desire for a model that is simple to implement and quick to execute.
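A parabolic spectrum with three coefficients, as described above, can be sketched as follows. The choice of a parabola in log-frequency and all numerical values here are illustrative assumptions, not the NASA model's actual functional form or fitted coefficients.

```python
import math

def jsi_spectrum_db(freq_hz, peak_db, width, peak_freq_hz):
    """Parabolic-in-log-frequency spectrum with three coefficients:
    peak amplitude (dB), spectral width, and peak frequency (Hz)."""
    x = math.log10(freq_hz / peak_freq_hz)
    return peak_db - width * x * x

# The level peaks at peak_freq_hz and falls off symmetrically in log-frequency.
levels = [jsi_spectrum_db(f, 120.0, 30.0, 400.0) for f in (100.0, 400.0, 1600.0)]
```

Fitting the three coefficients against surface length and lipline distance, then rescaling for jet velocity and observer angle, would sit on top of a shape function like this one.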

  11. Developing an Empirical Model for Jet-Surface Interaction Noise

    NASA Technical Reports Server (NTRS)

    Brown, Clif

    2014-01-01

The process of developing an empirical model for jet-surface interaction noise is described and the resulting model evaluated. Jet-surface interaction noise is generated when the high-speed engine exhaust from modern tightly integrated or conventional high-bypass ratio engine aircraft strikes or flows over the airframe surfaces. An empirical model based on an existing experimental database is developed for use in preliminary design system level studies where computation speed and range of configurations are valued over absolute accuracy to select the most promising (or eliminate the worst) possible designs. The model developed assumes that the jet-surface interaction noise spectra can be separated from the jet mixing noise and described as a parabolic function with three coefficients: peak amplitude, spectral width, and peak frequency. These coefficients are fit to functions of surface length and distance from the jet lipline to form a characteristic spectrum, which is then adjusted for changes in jet velocity and/or observer angle using scaling laws from published theoretical and experimental work. The resulting model is then evaluated for its ability to reproduce the characteristic spectrum and then for reproducing spectra measured at other jet velocities and observer angles; successes and limitations are discussed considering the complexity of the jet-surface interaction noise versus the desire for a model that is simple to implement and quick to execute.

  12. Pharmaceuticals, political money, and public policy: a theoretical and empirical agenda.

    PubMed

    Jorgensen, Paul D

    2013-01-01

    Why, when confronted with policy alternatives that could improve patient care, public health, and the economy, does Congress neglect those goals and tailor legislation to suit the interests of pharmaceutical corporations? In brief, for generations, the pharmaceutical industry has convinced legislators to define policy problems in ways that protect its profit margin. It reinforces this framework by selectively providing information and by targeting campaign contributions to influential legislators and allies. In this way, the industry displaces the public's voice in developing pharmaceutical policy. Unless citizens mobilize to confront the political power of pharmaceutical firms, objectionable industry practices and public policy will not change. Yet we need to refine this analysis. I propose a research agenda to uncover pharmaceutical influence. It develops the theory of dependence corruption to explain how the pharmaceutical industry is able to deflect the broader interests of the general public. It includes empirical studies of lobbying and campaign finance to uncover the means drug firms use to: (1) shape the policy framework adopted and information used to analyze policy; (2) subsidize the work of political allies; and (3) influence congressional voting. © 2013 American Society of Law, Medicine & Ethics, Inc.

  13. The Role of Empirical Research in Bioethics

    PubMed Central

    Kon, Alexander A.

    2010-01-01

    There has long been tension between bioethicists whose work focuses on classical philosophical inquiry and those who perform empirical studies on bioethical issues. While many have argued that empirical research merely illuminates current practices and cannot inform normative ethics, others assert that research-based work has significant implications for refining our ethical norms. In this essay, I present a novel construct for classifying empirical research in bioethics into four hierarchical categories: Lay of the Land, Ideal Versus Reality, Improving Care, and Changing Ethical Norms. Through explaining these four categories and providing examples of publications in each stratum, I define how empirical research informs normative ethics. I conclude by demonstrating how philosophical inquiry and empirical research can work cooperatively to further normative ethics. PMID:19998120

  14. The role of empirical research in bioethics.

    PubMed

    Kon, Alexander A

    2009-01-01

    There has long been tension between bioethicists whose work focuses on classical philosophical inquiry and those who perform empirical studies on bioethical issues. While many have argued that empirical research merely illuminates current practices and cannot inform normative ethics, others assert that research-based work has significant implications for refining our ethical norms. In this essay, I present a novel construct for classifying empirical research in bioethics into four hierarchical categories: Lay of the Land, Ideal Versus Reality, Improving Care, and Changing Ethical Norms. Through explaining these four categories and providing examples of publications in each stratum, I define how empirical research informs normative ethics. I conclude by demonstrating how philosophical inquiry and empirical research can work cooperatively to further normative ethics.

  15. Multi-scale pixel-based image fusion using multivariate empirical mode decomposition.

    PubMed

    Rehman, Naveed ur; Ehsan, Shoaib; Abdullah, Syed Muhammad Umer; Akhtar, Muhammad Jehanzaib; Mandic, Danilo P; McDonald-Maier, Klaus D

    2015-05-08

    A novel scheme to perform the fusion of multiple images using the multivariate empirical mode decomposition (MEMD) algorithm is proposed. Standard multi-scale fusion techniques make a priori assumptions regarding input data, whereas standard univariate empirical mode decomposition (EMD)-based fusion techniques suffer from inherent mode mixing and mode misalignment issues, characterized respectively by either a single intrinsic mode function (IMF) containing multiple scales or the same indexed IMFs corresponding to multiple input images carrying different frequency information. We show that MEMD overcomes these problems by being fully data adaptive and by aligning common frequency scales from multiple channels, thus enabling their comparison at a pixel level and subsequent fusion at multiple data scales. We then demonstrate the potential of the proposed scheme on a large dataset of real-world multi-exposure and multi-focus images and compare the results against those obtained from standard fusion algorithms, including the principal component analysis (PCA), discrete wavelet transform (DWT) and non-subsampled contourlet transform (NCT). A variety of image fusion quality measures are employed for the objective evaluation of the proposed method. We also report the results of a hypothesis testing approach on our large image dataset to identify statistically-significant performance differences.
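The fusion step described above, comparing aligned scales across images at pixel level, can be sketched once the decomposition is done. The sketch below assumes MEMD (or any decomposition) has already produced IMFs whose scale indices are aligned across the input images, which is exactly the property the abstract credits to MEMD; the max-magnitude selection rule is one common, illustrative choice, not necessarily the paper's.

```python
def fuse_aligned_scales(imfs_per_image):
    """Fuse images from their aligned per-scale decompositions.

    imfs_per_image[k][s] is the s-th scale (IMF) of image k as a 2-D
    list of floats. At each scale and pixel, keep the coefficient with
    the largest magnitude across images, then sum the fused scales to
    reconstruct the fused image.
    """
    n_scales = len(imfs_per_image[0])
    rows = len(imfs_per_image[0][0])
    cols = len(imfs_per_image[0][0][0])
    fused = [[0.0] * cols for _ in range(rows)]
    for s in range(n_scales):
        for r in range(rows):
            for c in range(cols):
                best = max((img[s][r][c] for img in imfs_per_image), key=abs)
                fused[r][c] += best
    return fused

# Two single-scale 2x2 "decompositions": the stronger coefficient wins per pixel.
a = [[[1.0, -3.0], [0.5, 2.0]]]
b = [[[-2.0, 1.0], [0.1, -4.0]]]
fused = fuse_aligned_scales([a, b])
```

The point of the alignment guarantee is that this per-scale comparison is meaningful; with univariate EMD, mode mixing could put different frequency content at the same IMF index, making the same rule compare unlike scales.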

  16. Multi-Scale Pixel-Based Image Fusion Using Multivariate Empirical Mode Decomposition

    PubMed Central

    Rehman, Naveed ur; Ehsan, Shoaib; Abdullah, Syed Muhammad Umer; Akhtar, Muhammad Jehanzaib; Mandic, Danilo P.; McDonald-Maier, Klaus D.

    2015-01-01

    A novel scheme to perform the fusion of multiple images using the multivariate empirical mode decomposition (MEMD) algorithm is proposed. Standard multi-scale fusion techniques make a priori assumptions regarding input data, whereas standard univariate empirical mode decomposition (EMD)-based fusion techniques suffer from inherent mode mixing and mode misalignment issues, characterized respectively by either a single intrinsic mode function (IMF) containing multiple scales or the same indexed IMFs corresponding to multiple input images carrying different frequency information. We show that MEMD overcomes these problems by being fully data adaptive and by aligning common frequency scales from multiple channels, thus enabling their comparison at a pixel level and subsequent fusion at multiple data scales. We then demonstrate the potential of the proposed scheme on a large dataset of real-world multi-exposure and multi-focus images and compare the results against those obtained from standard fusion algorithms, including the principal component analysis (PCA), discrete wavelet transform (DWT) and non-subsampled contourlet transform (NCT). A variety of image fusion quality measures are employed for the objective evaluation of the proposed method. We also report the results of a hypothesis testing approach on our large image dataset to identify statistically-significant performance differences. PMID:26007714

  17. A theoretical approach to medication adherence for children and youth with psychiatric disorders.

    PubMed

    Charach, Alice; Volpe, Tiziana; Boydell, Katherine M; Gearing, Robin E

    2008-01-01

    This article provides a theoretical review of treatment adherence for children and youth with psychiatric disorders where pharmacological agents are first-line interventions. Four empirically based models of health behavior are reviewed and applied to the sparse literature about medication adherence for children with attention-deficit/hyperactivity disorder and young people with first-episode psychosis. Three qualitative studies of medication use are summarized, and details from the first-person narratives are used to illustrate the theoretical models. These studies indicate, when taken together, that the clinical approach to addressing poor medication adherence in children and youth with psychiatric disorders should be guided by more than one theoretical model. Mental health experts should clarify beliefs, address misconceptions, and support exploration of alternative treatment options unless contraindicated. Recognizing the larger context of the family, allowing time for parents and children to change their attitudes, and offering opportunities for easy access to medication in the future are important ways of respecting patient preferences, while steering them toward best-evidence interventions. Future research using qualitative methods of inquiry to investigate parent, child, and youth experiences of mental health interventions should identify effective ways to improve treatment adherence.

  18. Theoretical bases for conducting certain technological processes in space

    NASA Technical Reports Server (NTRS)

    Okhotin, A. S.

    1979-01-01

Dimensionless conservation equations are presented, along with the theoretical bases of fluid behavior aboard orbiting satellites, with application to the manufacture of crystals in weightlessness. The small residual gravitational acceleration is shown to increase the separation of bands of varying concentration. Natural convection is shown to have no practical effect on crystallization from a liquid melt. Barodiffusion is also negligibly small under realistic conditions of weightlessness. The effects of surface tension become increasingly important, and suggestions are made for further research.

  19. Methodological support for the further abstraction of and philosophical examination of empirical findings in the context of caring science.

    PubMed

    Lindberg, Elisabeth; Österberg, Sofia A; Hörberg, Ulrica

    2016-01-01

    Phenomena in caring science are often complex and laden with meanings. Empirical research with the aim of capturing lived experiences is one way of revealing the complexity. Sometimes, however, results from empirical research need to be further discussed. One way is to further abstract the result and/or philosophically examine it. This has previously been performed and presented in scientific journals and doctoral theses, contributing to a greater understanding of phenomena in caring science. Although the intentions in many of these publications are laudable, the lack of methodological descriptions as well as a theoretical and systematic foundation can contribute to an ambiguity concerning how the results have emerged during the analysis. The aim of this paper is to describe the methodological support for the further abstraction of and/or philosophical examination of empirical findings. When trying to systematize the support procedures, we have used a reflective lifeworld research (RLR) approach. Based on the assumptions in RLR, this article will present methodological support for a theoretical examination that can include two stages. In the first stage, data from several (two or more) empirical results on an essential level are synthesized into a general structure. Sometimes the analysis ends with the general structure, but sometimes there is a need to proceed further. The second stage can then be a philosophical examination, in which the general structure is discussed in relation to a philosophical text, theory, or concept. It is important that the theories are brought in as the final stage after the completion of the analysis. Core dimensions of the described methodological support are, in accordance with RLR, openness, bridling, and reflection. The methodological support cannot be understood as fixed stages, but rather as a guiding light in the search for further meanings.

  20. Methodological support for the further abstraction of and philosophical examination of empirical findings in the context of caring science

    PubMed Central

    Lindberg, Elisabeth; Österberg, Sofia A.; Hörberg, Ulrica

    2016-01-01

    Phenomena in caring science are often complex and laden with meanings. Empirical research with the aim of capturing lived experiences is one way of revealing the complexity. Sometimes, however, results from empirical research need to be further discussed. One way is to further abstract the result and/or philosophically examine it. This has previously been performed and presented in scientific journals and doctoral theses, contributing to a greater understanding of phenomena in caring science. Although the intentions in many of these publications are laudable, the lack of methodological descriptions as well as a theoretical and systematic foundation can contribute to an ambiguity concerning how the results have emerged during the analysis. The aim of this paper is to describe the methodological support for the further abstraction of and/or philosophical examination of empirical findings. When trying to systematize the support procedures, we have used a reflective lifeworld research (RLR) approach. Based on the assumptions in RLR, this article will present methodological support for a theoretical examination that can include two stages. In the first stage, data from several (two or more) empirical results on an essential level are synthesized into a general structure. Sometimes the analysis ends with the general structure, but sometimes there is a need to proceed further. The second stage can then be a philosophical examination, in which the general structure is discussed in relation to a philosophical text, theory, or concept. It is important that the theories are brought in as the final stage after the completion of the analysis. Core dimensions of the described methodological support are, in accordance with RLR, openness, bridling, and reflection. The methodological support cannot be understood as fixed stages, but rather as a guiding light in the search for further meanings. PMID:26925926

  1. A conceptual and empirical analysis of the cognitive ability-voluntary turnover relationship.

    PubMed

    Maltarich, Mark A; Nyberg, Anthony J; Reilly, Greg

    2010-11-01

    Despite much research into cognitive ability as a selection tool and a separate large literature on the causes of voluntary turnover, little theoretical or empirical work connects the two. We propose that voluntary turnover is also a potentially key outcome of cognitive ability. Incorporating ideas from the person-environment fit literature and those regarding push and pull influences on turnover, we posit a theoretical connection between cognitive ability and voluntary turnover that addresses both why and how voluntary turnover is related to cognitive ability. Integrating data from 3 different sources, our empirical analyses support the theoretical perspective that the relationship between cognitive ability and voluntary turnover depends on the cognitive demands of the job. When the cognitive demands of a job are high, our findings support the hypothesized curvilinear relationship between cognitive ability and voluntary turnover, such that employees of higher and lower cognitive ability are more likely than medium cognitive ability employees to leave voluntarily. With regard to jobs with low cognitive demands, our data are more consistent with a negative linear relationship between cognitive ability and voluntary turnover, such that higher cognitive ability employees are less likely to leave voluntarily. We also examine the role of job satisfaction, finding that job satisfaction is more strongly linked to voluntary turnover in jobs with high cognitive demands. (c) 2010 APA, all rights reserved.

  2. Language Interdependence between American Sign Language and English: A Review of Empirical Studies

    ERIC Educational Resources Information Center

    Rusher, Melissa Ausbrooks

    2012-01-01

    This study provides a contemporary definition of American Sign Language/English bilingual education (AEBE) and outlines an essential theoretical framework. Included is a history and evolution of the methodology. The author also summarizes the general findings of twenty-six (26) empirical studies conducted in the United States that directly or…

  3. Density-based empirical likelihood procedures for testing symmetry of data distributions and K-sample comparisons.

    PubMed

    Vexler, Albert; Tanajian, Hovig; Hutson, Alan D

    In practice, parametric likelihood-ratio techniques are powerful statistical tools. In this article, we propose and examine novel and simple distribution-free test statistics that efficiently approximate parametric likelihood ratios to analyze and compare distributions of K groups of observations. Using the density-based empirical likelihood methodology, we develop a Stata package that applies to a test for symmetry of data distributions and compares K -sample distributions. Recognizing that recent statistical software packages do not sufficiently address K -sample nonparametric comparisons of data distributions, we propose a new Stata command, vxdbel, to execute exact density-based empirical likelihood-ratio tests using K samples. To calculate p -values of the proposed tests, we use the following methods: 1) a classical technique based on Monte Carlo p -value evaluations; 2) an interpolation technique based on tabulated critical values; and 3) a new hybrid technique that combines methods 1 and 2. The third, cutting-edge method is shown to be very efficient in the context of exact-test p -value computations. This Bayesian-type method considers tabulated critical values as prior information and Monte Carlo generations of test statistic values as data used to depict the likelihood function. In this case, a nonparametric Bayesian method is proposed to compute critical values of exact tests.
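The Monte Carlo p-value evaluation (method 1 above) can be sketched in a few lines. This is a generic illustration rather than the vxdbel implementation: the null statistic below is an invented placeholder standing in for a density-based empirical likelihood ratio, and the add-one correction is a common convention for exact tests.

```python
import random

def monte_carlo_p_value(observed_stat, simulate_stat, n_sim=10_000):
    """Estimate an exact-test p-value as the share of null-simulated
    statistics at least as extreme as the observed one."""
    rng = random.Random(0)  # fixed seed for reproducibility
    exceed = sum(1 for _ in range(n_sim) if simulate_stat(rng) >= observed_stat)
    # Add-one correction keeps the estimate strictly positive.
    return (exceed + 1) / (n_sim + 1)

def null_stat(rng, n=30):
    """Placeholder null statistic: |sample mean| of n standard normals."""
    xs = [rng.gauss(0.0, 1.0) for _ in range(n)]
    return abs(sum(xs) / n)

p = monte_carlo_p_value(0.5, null_stat)  # 0.5 is ~2.7 SDs of the mean, so p is small
```

Tabulated critical values (method 2) would replace the simulation loop with a lookup; the hybrid method 3 combines the two.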

  4. Construct Definition Using Cognitively Based Evidence: A Framework for Practice

    ERIC Educational Resources Information Center

    Ketterlin-Geller, Leanne R.; Yovanoff, Paul; Jung, EunJu; Liu, Kimy; Geller, Josh

    2013-01-01

    In this article, we highlight the need for a precisely defined construct in score-based validation and discuss the contribution of cognitive theories to accurately and comprehensively defining the construct. We propose a framework for integrating cognitively based theoretical and empirical evidence to specify and evaluate the construct. We apply…

  5. Holding-based network of nations based on listed energy companies: An empirical study on two-mode affiliation network of two sets of actors

    NASA Astrophysics Data System (ADS)

    Li, Huajiao; Fang, Wei; An, Haizhong; Gao, Xiangyun; Yan, Lili

    2016-05-01

    Economic networks in the real world are not homogeneous; therefore, it is important to study economic networks with heterogeneous nodes and edges to simulate a real network more precisely. In this paper, we present an empirical study of the one-mode derivative holding-based network constructed by the two-mode affiliation network of two sets of actors using the data of worldwide listed energy companies and their shareholders. First, we identify the primitive relationship in the two-mode affiliation network of the two sets of actors. Then, we present the method used to construct the derivative network based on the shareholding relationship between two sets of actors and the affiliation relationship between actors and events. After constructing the derivative network, we analyze different topological features on the node level, edge level and entire network level and explain the meanings of the different values of the topological features combining the empirical data. This study is helpful for expanding the usage of complex networks to heterogeneous economic networks. For empirical research on the worldwide listed energy stock market, this study is useful for discovering the inner relationships between the nations and regions from a new perspective.

  6. Empirical agreement in model validation.

    PubMed

    Jebeile, Julie; Barberousse, Anouk

    2016-04-01

    Empirical agreement is often used as an important criterion when assessing the validity of scientific models. However, it is by no means a sufficient criterion, as a model can be so adjusted as to fit available data even though it is based on hypotheses whose plausibility is known to be questionable. Our aim in this paper is to investigate the uses of empirical agreement within the process of model validation. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Education, Labour Market and Human Capital Models: Swedish Experiences and Theoretical Analyses.

    ERIC Educational Resources Information Center

    Sohlman, Asa

    An empirical study concerning development of the Swedish educational system from a labor market point of view, and a theoretical study on human capital models are discussed. In "Education and Labour Market; The Swedish Experience 1900-1975," attention is directed to the following concerns: the official educational policy regarding…

  8. Semi-empirical studies of atomic structure. Progress report, 1 July 1982-1 February 1983

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis, L.J.

    1983-01-01

    A program of studies of the properties of the heavy and highly ionized atomic systems which often occur as contaminants in controlled fusion devices is continuing. The project combines experimental measurements by fast-ion-beam excitation with semi-empirical data parametrizations to identify and exploit regularities in the properties of these very heavy and very highly ionized systems. The increasing use of spectroscopic line intensities as diagnostics for determining thermonuclear plasma temperatures and densities requires laboratory observation and analysis of such spectra, often to accuracies that exceed the capabilities of ab initio theoretical methods for these highly relativistic many electron systems. Through the acquisition and systematization of empirical data, remarkably precise methods for predicting excitation energies, transition wavelengths, transition probabilities, level lifetimes, ionization potentials, core polarizabilities, and core penetrabilities are being developed and applied. Although the data base for heavy, highly ionized atoms is still sparse, parametrized extrapolations and interpolations along isoelectronic, homologous, and Rydberg sequences are providing predictions for large classes of quantities, with a precision that is sharpened by subsequent measurements.

  9. Physics of mind: Experimental confirmations of theoretical predictions.

    PubMed

    Schoeller, Félix; Perlovsky, Leonid; Arseniev, Dmitry

    2018-02-02

    What is common among Newtonian mechanics, statistical physics, thermodynamics, quantum physics, the theory of relativity, astrophysics and the theory of superstrings? All these areas of physics have in common a methodology, which is discussed in the first few lines of the review. Is a physics of the mind possible? Is it possible to describe how a mind adapts in real time to changes in the physical world through a theory based on a few basic laws? From perception and elementary cognition to emotions and abstract ideas allowing high-level cognition and executive functioning, at nearly all levels of study, the mind shows variability and uncertainties. Is it possible to turn psychology and neuroscience into so-called "hard" sciences? This review discusses several established first principles for the description of mind and their mathematical formulations. A mathematical model of mind is derived from these principles. This model includes mechanisms of instincts, emotions, behavior, cognition, concepts, language, intuitions, and imagination. We clarify fundamental notions such as the opposition between the conscious and the unconscious, the knowledge instinct and aesthetic emotions, as well as humans' universal abilities for symbols and meaning. In particular, the review discusses at length the evolutionary and cognitive functions of aesthetic emotions and musical emotions. Several theoretical predictions are derived from the model, some of which have been experimentally confirmed. These empirical results are summarized and we introduce new theoretical developments. Several unsolved theoretical problems are proposed, as well as new experimental challenges for future research. Copyright © 2017. Published by Elsevier B.V.

  10. Evaluating Fast Maximum Likelihood-Based Phylogenetic Programs Using Empirical Phylogenomic Data Sets

    PubMed Central

    Zhou, Xiaofan; Shen, Xing-Xing; Hittinger, Chris Todd

    2018-01-01

    The sizes of the data matrices assembled to resolve branches of the tree of life have increased dramatically, motivating the development of programs for fast, yet accurate, inference. For example, several different fast programs have been developed in the very popular maximum likelihood framework, including RAxML/ExaML, PhyML, IQ-TREE, and FastTree. Although these programs are widely used, a systematic evaluation and comparison of their performance using empirical genome-scale data matrices has so far been lacking. To address this question, we evaluated these four programs on 19 empirical phylogenomic data sets with hundreds to thousands of genes and up to 200 taxa with respect to likelihood maximization, tree topology, and computational speed. For single-gene tree inference, we found that the more exhaustive and slower strategies (ten searches per alignment) outperformed faster strategies (one tree search per alignment) using RAxML, PhyML, or IQ-TREE. Interestingly, single-gene trees inferred by the three programs yielded comparable coalescent-based species tree estimations. For concatenation-based species tree inference, IQ-TREE consistently achieved the best-observed likelihoods for all data sets, and RAxML/ExaML was a close second. In contrast, PhyML often failed to complete concatenation-based analyses, whereas FastTree was the fastest but generated lower likelihood values and more dissimilar tree topologies in both types of analyses. Finally, data matrix properties, such as the number of taxa and the strength of phylogenetic signal, sometimes substantially influenced the programs’ relative performance. Our results provide real-world gene and species tree phylogenetic inference benchmarks to inform the design and execution of large-scale phylogenomic data analyses. PMID:29177474

  11. On the Development of a Theory of Traveler Attitude-Behavior Interrelationships : Volume 2. Theoretical and Empirical Findings.

    DOT National Transportation Integrated Search

    1978-08-01

    The second volume of this final report presents conceptual and empirical findings which support the development of a theory of traveler attitude-behavior interrelationships. Such a theory will be useful in the design of transport systems and operatin...

  12. Towards high performing hospital enterprise systems: an empirical and literature based design framework

    NASA Astrophysics Data System (ADS)

    dos Santos Fradinho, Jorge Miguel

    2014-05-01

    Our understanding of enterprise systems (ES) is gradually evolving towards a sense of design which leverages multidisciplinary bodies of knowledge that may bolster hybrid research designs and together further the characterisation of ES operation and performance. This article aims to contribute towards ES design theory with its hospital enterprise systems design (HESD) framework, which reflects a rich multidisciplinary literature and two in-depth hospital empirical cases from the US and UK. In doing so it leverages systems thinking principles and traditionally disparate bodies of knowledge to bolster the theoretical evolution and foundation of ES. A total of seven core ES design elements are identified and characterised with 24 main categories and 53 subcategories. In addition, it builds on recent work which suggests that hospital enterprises are comprised of multiple internal ES configurations which may generate different levels of performance. Multiple sources of evidence were collected including electronic medical records, 54 recorded interviews, observation, and internal documents. Both in-depth cases compare and contrast higher and lower performing ES configurations. Following literal replication across in-depth cases, this article concludes that hospital performance can be improved through an enriched understanding of hospital ES design.

  13. 78 FR 6316 - Empire Pipeline, Inc. (Empire); Notice of Filing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-30

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. AC13-22-000] Empire Pipeline, Inc. (Empire); Notice of Filing Take notice that on November 29, 2012 Empire Pipeline Company (Empire) submitted a request for a waiver of the reporting requirement to file the FERC Form 2 CPA Certification for...

  14. Integrating Motivational Interviewing and Brief Behavioral Activation Therapy: Theoretical and Practical Considerations

    PubMed Central

    Balán, Iván C.; Lejuez, C. W.; Hoffer, Marcela; Blanco, Carlos

    2017-01-01

    Behavioral Activation, and specifically the Brief Behavioral Activation Therapy for Depression (BATD), has a strong record of empirical support, but its focus on practical out-of-session activation-based assignments can lead to poor levels of adherence if efforts to enhance motivation are not prioritized. Towards this end, this manuscript describes the assimilative integration of Motivational Interviewing (MI) and BATD to improve clinical outcomes by integrating MI's focus on building and maintaining motivation to change into BATD. The manuscript provides an overview of MI and BATD, the theoretical issues raised in integrating the two approaches, and examples of how this integration results in a nondirective and motivation-focused approach to conducting BATD. PMID:29151779

  15. Tissue artifact removal from respiratory signals based on empirical mode decomposition.

    PubMed

    Liu, Shaopeng; Gao, Robert X; John, Dinesh; Staudenmayer, John; Freedson, Patty

    2013-05-01

    On-line measurement of respiration plays an important role in monitoring human physical activities. Such measurement commonly employs sensing belts secured around the rib cage and abdomen of the test subject. Affected by the movement of body tissues, respiratory signals typically have a low signal-to-noise ratio. Removing tissue artifacts is therefore critical to ensuring effective respiration analysis. This paper presents a signal decomposition technique for tissue artifact removal from respiratory signals, based on the empirical mode decomposition (EMD). An algorithm based on mutual information and power criteria was devised to automatically select appropriate intrinsic mode functions for tissue artifact removal and respiratory signal reconstruction. Performance of the EMD algorithm was evaluated through simulations and real-life experiments (N = 105). Comparison with the conventionally applied low-pass filtering confirmed the effectiveness of the technique in tissue artifact removal.
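The selection-and-reconstruction step can be sketched as follows, assuming the intrinsic mode functions (IMFs) have already been extracted by an external EMD routine. The zero-crossing rate is a crude frequency proxy standing in for the paper's mutual-information and power criteria, and the cutoff value is illustrative.

```python
import math

def zero_crossing_rate(imf):
    """Crude dominant-frequency proxy: fraction of adjacent sample pairs
    whose product is negative (i.e., a sign change)."""
    crossings = sum(1 for a, b in zip(imf, imf[1:]) if a * b < 0)
    return crossings / (len(imf) - 1)

def reconstruct_respiration(imfs, max_zcr=0.1):
    """Keep only slowly oscillating IMFs; fast modes are treated as
    tissue/motion artifact and discarded."""
    kept = [imf for imf in imfs if zero_crossing_rate(imf) <= max_zcr]
    n = len(imfs[0])
    return [sum(imf[i] for imf in kept) for i in range(n)]

# Synthetic check: a slow "respiratory" mode plus a fast "artifact" mode.
slow = [math.sin(2 * math.pi * i / 100) for i in range(400)]
fast = [0.3 * math.sin(2 * math.pi * i / 4) for i in range(400)]
recovered = reconstruct_respiration([slow, fast])  # keeps only the slow mode
```

A real pipeline would compute the IMFs with a sifting algorithm and tune the selection criterion against reference respiration data, as the paper does.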

  16. A theoretical model for smoking prevention studies in preteen children.

    PubMed

    McGahee, T W; Kemp, V; Tingen, M

    2000-01-01

    The age of the onset of smoking is on a continual decline, with the prime age of tobacco use initiation being 12-14 years. A weakness of the limited research conducted on smoking prevention programs designed for preteen children (ages 10-12) is the lack of a well-defined theoretical basis. A theoretical perspective is needed in order to make a meaningful transition from empirical analysis to application of knowledge. Bandura's Social Cognitive Theory (1977, 1986), the Theory of Reasoned Action (Ajzen & Fishbein, 1980), and other literature linking various concepts to smoking behaviors in preteens were used to develop a model that may be useful for smoking prevention studies in preteen children.

  17. Empirical source strength correlations for rans-based acoustic analogy methods

    NASA Astrophysics Data System (ADS)

    Kube-McDowell, Matthew Tyndall

    JeNo is a jet noise prediction code based on an acoustic analogy method developed by Mani, Gliebe, Balsa, and Khavaran. Using the flow predictions from a standard Reynolds-averaged Navier-Stokes computational fluid dynamics solver, JeNo predicts the overall sound pressure level and angular spectra for high-speed hot jets over a range of observer angles, with a processing time suitable for rapid design purposes. JeNo models the noise from hot jets as a combination of two types of noise sources: quadrupole sources dependent on velocity fluctuations, which represent the major noise of turbulent mixing, and dipole sources dependent on enthalpy fluctuations, which represent the effects of thermal variation. These two sources are modeled by JeNo as propagating independently into the far-field, with no cross-correlation at the observer location. However, high-fidelity computational fluid dynamics solutions demonstrate that this assumption is false. In this thesis, the theory, assumptions, and limitations of the JeNo code are briefly discussed, and a modification to the acoustic analogy method is proposed in which the cross-correlation of the two primary noise sources is allowed to vary with the speed of the jet and the observer location. As a proof-of-concept implementation, an empirical correlation correction function is derived from comparisons between JeNo's noise predictions and a set of experimental measurements taken for the Air Force Aero-Propulsion Laboratory. The empirical correlation correction is then applied to JeNo's predictions of a separate data set of hot jets tested at NASA's Glenn Research Center. Metrics are derived to measure the qualitative and quantitative performance of JeNo's acoustic predictions, and the empirical correction is shown to provide a quantitative improvement in the noise prediction at low observer angles with no freestream flow, and a qualitative improvement in the presence of freestream flow. However, the results also demonstrate

  18. Mass media and environmental issues: a theoretical analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parlour, J.W.

    1980-01-01

    A critique of the weak empirical and theoretical foundations of commentaries on the mass media in the environmental literature argues that they stem from the incidental rather than fundamental concern for the social dimensions of environmental problems. The contributions of information theory, cybernetics, sociology, and political science to micro and macro theories of mass communications are reviewed. Information from empirical analyses of the mass media's portrayal of social issues, including the environment, is related to Hall's dominant ideology thesis of the mass media and the elitist-conflict model of society. It is argued that the media's portrayal of environmental issues is structured by dominant power-holding groups in society with the result that the media effectively function to maintain and reinforce the status quo to the advantage of these dominant groups. 78 references.

  19. Organizational culture and organizational effectiveness: a meta-analytic investigation of the competing values framework's theoretical suppositions.

    PubMed

    Hartnell, Chad A; Ou, Amy Yi; Kinicki, Angelo

    2011-07-01

    We apply Quinn and Rohrbaugh's (1983) competing values framework (CVF) as an organizing taxonomy to meta-analytically test hypotheses about the relationship between 3 culture types and 3 major indices of organizational effectiveness (employee attitudes, operational performance [i.e., innovation and product and service quality], and financial performance). The paper also tests theoretical suppositions undergirding the CVF by investigating the framework's nomological validity and proposed internal structure (i.e., interrelationships among culture types). Results based on data from 84 empirical studies with 94 independent samples indicate that clan, adhocracy, and market cultures are differentially and positively associated with the effectiveness criteria, though not always as hypothesized. The findings provide mixed support for the CVF's nomological validity and fail to support aspects of the CVF's proposed internal structure. We propose an alternative theoretical approach to the CVF and delineate directions for future research.

  20. An Empirical Study on Washback Effects of the Internet-Based College English Test Band 4 in China

    ERIC Educational Resources Information Center

    Wang, Chao; Yan, Jiaolan; Liu, Bao

    2014-01-01

    Based on Bailey's washback model, in respect of participants, process and products, the present empirical study was conducted to find the actual washback effects of the internet-based College English Test Band 4 (IB CET-4). The methods adopted are questionnaires, class observation, interview and the analysis of both the CET-4 teaching and testing…

  1. Semivolatile Organic Compounds in Homes: Strategies for Efficient and Systematic Exposure Measurement Based on Empirical and Theoretical Factors

    PubMed Central

    2014-01-01

    Residential exposure can dominate total exposure for commercial chemicals of health concern; however, despite the importance of consumer exposures, methods for estimating household exposures remain limited. We collected house dust and indoor air samples in 49 California homes and analyzed for 76 semivolatile organic compounds (SVOCs)—phthalates, polybrominated diphenyl ethers (PBDEs), polychlorinated biphenyls (PCBs), polycyclic aromatic hydrocarbons (PAHs), and pesticides. Sixty chemicals were detected in either dust or air and here we report 58 SVOCs detected in dust for the first time. In dust, phthalates (bis(2-ethylhexyl) phthalate, benzyl butyl phthalate, di-n-butyl phthalate) and flame retardants (PBDE 99, PBDE 47) were detected at the highest concentrations relative to other chemicals at the 95th percentile, while phthalates were highest at the median. Because SVOCs are found in both gas and condensed phases and redistribute from their original source over time, partitioning models can clarify their fate indoors. We use empirical data to validate air-dust partitioning models and use these results, combined with experience in SVOC exposure assessment, to recommend residential exposure measurement strategies. We can predict dust concentrations reasonably well from measured air concentrations (R2 = 0.80). Partitioning models and knowledge of chemical Koa elucidate exposure pathways and suggest priorities for chemical regulation. These findings also inform study design by allowing researchers to select sampling approaches optimized for their chemicals of interest and study goals. While surface wipes are commonly used in epidemiology studies because of ease of implementation, passive air sampling may be more standardized between homes and also relatively simple to deploy. Validation of passive air sampling methods for SVOCs is a priority. PMID:25488487
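Predicting dust concentrations from measured air concentrations, as in the R2 = 0.80 result above, is typically done with a least-squares fit in log space. The sketch below is generic, not the paper's fitted model; the function names and example numbers are invented for illustration.

```python
import math

def fit_log_log(xs, ys):
    """Ordinary least squares of log10(y) on log10(x); returns (slope, intercept)."""
    lx = [math.log10(x) for x in xs]
    ly = [math.log10(y) for y in ys]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    sxx = sum((v - mx) ** 2 for v in lx)
    sxy = sum((a - mx) * (b - my) for a, b in zip(lx, ly))
    slope = sxy / sxx
    return slope, my - slope * mx

def predict_dust(air_conc, slope, intercept):
    """Back-transform the log-space prediction to a dust concentration."""
    return 10 ** (intercept + slope * math.log10(air_conc))

# Invented example data: dust tracks air with a constant partitioning factor of 2.
air = [1.0, 10.0, 100.0, 1000.0]
dust = [2.0, 20.0, 200.0, 2000.0]
slope, intercept = fit_log_log(air, dust)
```

A slope near 1 in log space corresponds to a roughly constant dust-to-air partitioning ratio; chemical-specific properties such as Koa would shift the intercept.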

  2. Evidence-Based Administration for Decision Making in the Framework of Knowledge Strategic Management

    ERIC Educational Resources Information Center

    Del Junco, Julio Garcia; Zaballa, Rafael De Reyna; de Perea, Juan Garcia Alvarez

    2010-01-01

    Purpose: This paper seeks to present a model based on evidence-based administration (EBA), which aims to facilitate the creation, transformation and diffusion of knowledge in learning organizations. Design/methodology/approach: A theoretical framework is proposed based on EBA and the case method. Accordingly, an empirical study was carried out in…

  3. Empirical evaluation of H.265/HEVC-based dynamic adaptive video streaming over HTTP (HEVC-DASH)

    NASA Astrophysics Data System (ADS)

    Irondi, Iheanyi; Wang, Qi; Grecos, Christos

    2014-05-01

    Real-time HTTP streaming has gained global popularity for delivering video content over the Internet. In particular, the recent MPEG-DASH (Dynamic Adaptive Streaming over HTTP) standard enables on-demand, live, and adaptive Internet streaming in response to network bandwidth fluctuations. Meanwhile, the emerging new-generation video coding standard, H.265/HEVC (High Efficiency Video Coding), promises to reduce the bandwidth requirement by 50% at the same video quality when compared with the current H.264/AVC standard. However, little existing work has addressed the integration of the DASH and HEVC standards, let alone empirical performance evaluation of such systems. This paper presents an experimental HEVC-DASH system, a pull-based adaptive streaming solution that delivers HEVC-coded video content through conventional HTTP servers, where the client switches to its desired quality, resolution or bitrate based on the available network bandwidth. Previous studies of DASH have focused on H.264/AVC, whereas we present an empirical evaluation of the HEVC-DASH system by implementing a real-world test bed, which consists of an Apache HTTP Server with GPAC, an MP4Client (GPAC) with an open HEVC-based DASH client, and a NETEM box in the middle emulating different network conditions. We investigate and analyze the performance of HEVC-DASH by exploring the impact of various network conditions such as packet loss, bandwidth and delay on video quality. Furthermore, we compare the Intra and Random Access profiles of HEVC coding with the Intra profile of H.264/AVC when the correspondingly encoded video is streamed with DASH. Finally, we explore the correlation among the quality metrics and network conditions, and empirically establish under which conditions the different codecs can provide satisfactory performance.

  4. Theoretical investigation on multilayer nanocomposite-based fiber optic SPR sensor

    NASA Astrophysics Data System (ADS)

    Shojaie, Ehsan; Madanipour, Khosro; Gharibzadeh, Azadeh; Abbasi, Shabnam

    2017-06-01

    In this work, a multilayer nanocomposite-based fiber optic SPR sensor is considered and specifically designed for CO2 gas detection. The proposed fiber sensor consists of the fiber core, a gold-silver alloy layer and an absorber layer. The investigation is based on evaluation of the transmitted power derived under the transfer matrix method and the multiple reflections in the sensing area. In terms of sensitivity, the sensor performance is studied theoretically under various conditions related to the metal layer and the gold and silver nanoparticles that form a single alloy film. The effects of additional parameters, such as the ratio of the alloy composition and the thickness of the alloy film, on the performance of the SPR sensor are studied as well. Finally, a four-layer structure is introduced to detect carbon dioxide gas. It comprises the fiber core, a gold-silver alloy layer, an absorbent layer for carbon dioxide gas (KOH) and the measurement environment. Lower price and smaller size are the main advantages of such a sensor compared with commercial non-dispersive infrared (NDIR) gas sensors. Theoretical results show that increasing the metal layer thickness increases the sensitivity of the sensor, while increasing the ratio of gold in the alloy decreases it.
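The transfer matrix method mentioned above can be sketched at normal incidence: each layer contributes a 2x2 characteristic matrix, and the product of those matrices gives the stack's reflectance. The full SPR sensor model uses oblique incidence and complex metal indices, but the matrix algebra is the same; all numbers below are illustrative.

```python
import cmath

def layer_matrix(n, d, wavelength):
    """Characteristic matrix of a homogeneous layer at normal incidence
    (refractive index n, thickness d)."""
    delta = 2 * cmath.pi * n * d / wavelength  # phase thickness
    return [[cmath.cos(delta), 1j * cmath.sin(delta) / n],
            [1j * n * cmath.sin(delta), cmath.cos(delta)]]

def matmul(a, b):
    """2x2 matrix product."""
    return [[a[0][0]*b[0][0] + a[0][1]*b[1][0], a[0][0]*b[0][1] + a[0][1]*b[1][1]],
            [a[1][0]*b[0][0] + a[1][1]*b[1][0], a[1][0]*b[0][1] + a[1][1]*b[1][1]]]

def reflectance(n_in, layers, n_out, wavelength):
    """Reflectance of a stack of (index, thickness) layers between two
    semi-infinite media, via the product of layer matrices."""
    m = [[1, 0], [0, 1]]
    for n, d in layers:
        m = matmul(m, layer_matrix(n, d, wavelength))
    (m11, m12), (m21, m22) = m
    num = n_in * m11 + n_in * n_out * m12 - m21 - n_out * m22
    den = n_in * m11 + n_in * n_out * m12 + m21 + n_out * m22
    return abs(num / den) ** 2
```

With no layers the formula reduces to the Fresnel reflectance of a bare interface, and a quarter-wave layer of index sqrt(n_in * n_out) drives the reflectance to zero, two convenient sanity checks.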

  5. Searching for a Common Ground--A Literature Review of Empirical Research on Scientific Inquiry Activities

    ERIC Educational Resources Information Center

    Rönnebeck, Silke; Bernholt, Sascha; Ropohl, Mathias

    2016-01-01

    Despite the importance of scientific inquiry in science education, researchers and educators disagree considerably regarding what features define this instructional approach. While a large body of literature addresses theoretical considerations, numerous empirical studies investigate scientific inquiry on quite different levels of detail and also…

  6. An empirical test of Maslow's theory of need hierarchy using hologeistic comparison by statistical sampling.

    PubMed

    Davis-Sharts, J

    1986-10-01

    Maslow's hierarchy of basic human needs provides a major theoretical framework in nursing science. The purpose of this study was to empirically test Maslow's need theory, specifically at the levels of physiological and security needs, using a hologeistic comparative method. Thirty cultures taken from the 60 cultural units in the Health Relations Area Files (HRAF) Probability Sample were found to have data available for examining hypotheses about thermoregulatory (physiological) and protective (security) behaviors practiced prior to sleep onset. The findings demonstrate there is initial worldwide empirical evidence to support Maslow's need hierarchy.

  7. Transition mixing study empirical model report

    NASA Technical Reports Server (NTRS)

    Srinivasan, R.; White, C.

    1988-01-01

    The empirical model developed in the NASA Dilution Jet Mixing Program has been extended to include the curvature effects of transition liners. This extension is based on the results of a 3-D numerical model generated under this contract. The empirical model results agree well with the numerical model results for all test cases evaluated. The empirical model shows faster mixing rates compared to the numerical model. Both models show drift of the jets toward the inner wall of a turning duct. The structure of the jets from the inner wall does not exhibit the familiar kidney-shaped structures observed for the outer-wall jets or for jets injected in rectangular ducts.

  8. A theoretical framework for whole-plant carbon assimilation efficiency based on metabolic scaling theory: a test case using Picea seedlings.

    PubMed

    Wang, Zhiqiang; Ji, Mingfei; Deng, Jianming; Milne, Richard I; Ran, Jinzhi; Zhang, Qiang; Fan, Zhexuan; Zhang, Xiaowei; Li, Jiangtao; Huang, Heng; Cheng, Dongliang; Niklas, Karl J

    2015-06-01

    Simultaneous and accurate measurements of whole-plant instantaneous carbon-use efficiency (ICUE) and annual total carbon-use efficiency (TCUE) are difficult to make, especially for trees. One usually estimates ICUE based on the net photosynthetic rate or the assumed proportional relationship between growth efficiency and ICUE. However, thus far, protocols for easily estimating annual TCUE remain problematic. Here, we present a theoretical framework (based on the metabolic scaling theory) to predict whole-plant annual TCUE by directly measuring instantaneous net photosynthetic and respiratory rates. This framework makes four predictions, which were evaluated empirically using seedlings of nine Picea taxa: (i) the flux rates of CO(2) and energy will scale isometrically as a function of plant size, (ii) whole-plant net and gross photosynthetic rates and the net primary productivity will scale isometrically with respect to total leaf mass, (iii) these scaling relationships will be independent of ambient temperature and humidity fluctuations (as measured within an experimental chamber) regardless of the instantaneous net photosynthetic rate or dark respiratory rate, or overall growth rate and (iv) TCUE will scale isometrically with respect to instantaneous efficiency of carbon use (i.e., the latter can be used to predict the former) across diverse species. These predictions were experimentally verified. We also found that the ranking of the nine taxa based on net photosynthetic rates differed from the ranking based on either ICUE or TCUE. In addition, the absolute values of ICUE and TCUE significantly differed among the nine taxa, with both ICUE and temperature-corrected ICUE being highest for Picea abies and lowest for Picea schrenkiana. Nevertheless, the data are consistent with the predictions of our general theoretical framework, which can be used to assess annual carbon-use efficiency of different species at the level of an individual plant based on simple, direct
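    The isometric-scaling predictions (i)-(ii) can be checked with an ordinary log-log regression: if a flux Y scales as Y ∝ M^b with total leaf mass M, isometry means b ≈ 1. Below is a minimal sketch with synthetic data; the 0.8 proportionality constant and the sample values are invented for illustration and are not taken from the study.

    ```python
    import numpy as np

    def scaling_exponent(mass, flux):
        """Fit flux ~ a * mass**b on log-log axes; return (b, log10_a).

        Isometric scaling corresponds to b close to 1."""
        b, log_a = np.polyfit(np.log10(mass), np.log10(flux), 1)
        return b, log_a

    # Synthetic seedlings: net photosynthetic rate roughly proportional to leaf mass
    rng = np.random.default_rng(0)
    leaf_mass = rng.uniform(1.0, 50.0, 200)                    # g, hypothetical
    net_photo = 0.8 * leaf_mass * rng.lognormal(0, 0.05, 200)  # flux, hypothetical units

    slope, _ = scaling_exponent(leaf_mass, net_photo)
    print(round(slope, 2))  # close to 1.0 -> consistent with isometry
    ```

    A fitted exponent significantly different from 1 (e.g. the 3/4 of allometric metabolic scaling) would instead reject isometry for that trait.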

  9. Time distortions in Alzheimer’s disease: a systematic review and theoretical integration

    PubMed Central

    El Haj, Mohamad; Kapogiannis, Dimitrios

    2016-01-01

    Time perception is an essential function of the human brain, which is compromised in Alzheimer’s disease (AD). Here, we review empirical findings on time distortions in AD and provide a theoretical framework that integrates time and memory distortions in AD and explains their bidirectional modulation. The review was based on a literature survey performed on the PubMed and PsycInfo databases. According to our theoretical framework, time distortions may induce decline in the ability to mentally project oneself in time (i.e., mental time travel), and consequently may contribute to an episodic memory compromise in AD. Conversely, episodic memory compromise in AD may result in a loss of the ability to retrieve information about time and/or the ability to project oneself in subjective time. The relationship between time distortions and memory decline in AD can be jointly attributed to hippocampus involvement, as this brain area supports both time perception and memory and is preferentially targeted by the neuropathological processes of AD. Clinical implications of time distortions are discussed and directions for future research are suggested. PMID:28721270

  10. North Dakota implementation of mechanistic-empirical pavement design guide (MEPDG).

    DOT National Transportation Integrated Search

    2014-12-01

    North Dakota currently designs roads based on the AASHTO Design Guide procedure, which is based on : the empirical findings of the AASHTO Road Test of the late 1950s. However, limitations of the current : empirical approach have prompted AASHTO to mo...

  11. Moving Beyond Pioneering: Empirical and Theoretical Perspectives on Lesbian, Gay, and Bisexual Affirmative Training.

    ERIC Educational Resources Information Center

    Croteau, James M.; Bieschke, Kathleen J.; Phillips, Julia C.; Lark, Julianne S.

    1998-01-01

    States that the literature to date has broken the silence on lesbian, gay, and bisexual (LGB) issues and has affirmed the field of psychology as being affirmative toward these issues. Proposes that research should move toward a greater understanding of LGB affirmative professional training by focusing on training from theoretical and empirical…

  12. ['Walkability' and physical activity - results of empirical studies based on the 'Neighbourhood Environment Walkability Scale (NEWS)'].

    PubMed

    Rottmann, M; Mielck, A

    2014-02-01

    'Walkability' is mainly assessed by the NEWS questionnaire (Neighbourhood Environment Walkability Scale); in Germany, however, this questionnaire is largely unknown. We now try to fill this gap by providing a systematic overview of empirical studies based on the NEWS. A systematic review was conducted of original papers containing empirical analyses based on the NEWS. The results are summarised and presented in tables. Altogether 31 publications could be identified. Most of them focus on associations with the variable 'physical activity', and they often report significant associations with at least some of the scales included in the NEWS. Due to methodological differences between the studies it is difficult to compare the results. The concept of 'walkability' should also be established in the German public health discussion. A number of methodological challenges remain to be solved, such as the identification of those scales and items in the NEWS that show the strongest associations with individual health behaviours. © Georg Thieme Verlag KG Stuttgart · New York.

  13. Reframing Serial Murder Within Empirical Research.

    PubMed

    Gurian, Elizabeth A

    2017-04-01

    Empirical research on serial murder is limited due to the lack of consensus on a definition, the continued use of primarily descriptive statistics, and linkage to popular culture depictions. These limitations also inhibit our understanding of these offenders and affect credibility in the field of research. Therefore, this comprehensive overview of a sample of 508 cases (738 total offenders, including partnered groups of two or more offenders) provides analyses of solo male, solo female, and partnered serial killers to elucidate statistical differences and similarities in offending and adjudication patterns among the three groups. This analysis of serial homicide offenders not only supports previous research on offending patterns present in the serial homicide literature but also reveals that empirically based analyses can enhance our understanding beyond traditional case studies and descriptive statistics. Further research based on these empirical analyses can aid in the development of more accurate classifications and definitions of serial murderers.

  14. Empirical data and moral theory. A plea for integrated empirical ethics.

    PubMed

    Molewijk, Bert; Stiggelbout, Anne M; Otten, Wilma; Dupuis, Heleen M; Kievit, Job

    2004-01-01

    Ethicists differ considerably in their reasons for using empirical data. This paper presents a brief overview of four traditional approaches to the use of empirical data: "the prescriptive applied ethicists," "the theorists," "the critical applied ethicists," and "the particularists." The main aim of this paper is to introduce a fifth approach of more recent date (i.e. "integrated empirical ethics") and to offer some methodological directives for research in integrated empirical ethics. All five approaches are presented in a table for heuristic purposes. The table consists of eight columns: "view on distinction descriptive-prescriptive sciences," "location of moral authority," "central goal(s)," "types of normativity," "use of empirical data," "method," "interaction empirical data and moral theory," and "cooperation with descriptive sciences." Ethicists can use the table in order to identify their own approach. Reflection on these issues prior to starting research in empirical ethics should lead to harmonization of the different scientific disciplines and effective planning of the final research design. Integrated empirical ethics (IEE) refers to studies in which ethicists and descriptive scientists cooperate continuously and intensively. Both disciplines try to integrate moral theory and empirical data in order to reach a normative conclusion with respect to a specific social practice. IEE is not wholly prescriptive or wholly descriptive since IEE assumes an interdependence between facts and values and between the empirical and the normative. The paper ends with three suggestions concerning future challenges of integrated empirical ethics.

  15. Empirical Evaluation Indicators in Thai Higher Education: Theory-Based Multidimensional Learners' Assessment

    ERIC Educational Resources Information Center

    Sritanyarat, Dawisa; Russ-Eft, Darlene

    2016-01-01

    This study proposed empirical indicators that can be validated and adopted in higher education institutions to evaluate the quality of teaching and learning, and to serve as evaluation criteria for human resource management and development in Thai higher education institutions. The main purpose of this study was to develop empirical indicators of a…

  16. Health Status and Health Dynamics in an Empirical Model of Expected Longevity*

    PubMed Central

    Benítez-Silva, Hugo; Ni, Huan

    2010-01-01

    Expected longevity is an important factor influencing older individuals’ decisions such as consumption, savings, purchase of life insurance and annuities, claiming of Social Security benefits, and labor supply. It has also been shown to be a good predictor of actual longevity, which in turn is highly correlated with health status. A relatively new literature on health investments under uncertainty, which builds upon the seminal work by Grossman (1972), has directly linked longevity with characteristics, behaviors, and decisions by utility maximizing agents. Our empirical model can be understood within that theoretical framework as estimating a production function of longevity. Using longitudinal data from the Health and Retirement Study, we directly incorporate health dynamics in explaining the variation in expected longevities, and compare two alternative measures of health dynamics: the self-reported health change, and the computed health change based on self-reports of health status. In 38% of the reports in our sample, computed health changes are inconsistent with the direct report on health changes over time. Another 15% of the sample may suffer information losses if computed changes are used to assess changes in actual health. These potentially serious problems raise doubts regarding the use and interpretation of the computed health changes and even the lagged measures of self-reported health as controls for health dynamics in a variety of empirical settings. Our empirical results, controlling for both subjective and objective measures of health status and unobserved heterogeneity in reporting, suggest that self-reported health changes are a preferred measure of health dynamics. PMID:18187217
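    The kind of inconsistency the authors report between computed and directly reported health changes can be illustrated with a toy calculation: take the sign of the change in self-rated health level between two waves and compare it with the direct change report. All values below are invented for illustration, not drawn from the Health and Retirement Study.

    ```python
    import numpy as np

    # Hypothetical panel: self-rated health (1=poor ... 5=excellent) at two waves,
    # plus a direct self-report of change (-1=worse, 0=same, +1=better).
    h_prev   = np.array([3, 4, 2, 5, 3, 1, 4, 2])
    h_curr   = np.array([3, 3, 4, 5, 2, 1, 5, 2])
    reported = np.array([0, -1, 1, 1, -1, 0, 1, 0])

    computed = np.sign(h_curr - h_prev)       # change computed from levels
    inconsistent = computed != reported       # disagreement between the two measures
    print(f"{inconsistent.mean():.0%} of reports inconsistent")
    ```

    Respondent 4 in this toy sample reports feeling "better" while reporting the same level at both waves, the pattern that a computed-change measure cannot capture.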

  17. [Medical education under the Revolution and the Empire].

    PubMed

    Legaye, Jean

    2014-01-01

    After the suppression of medical education during the French Revolution in 1793, the lack of caregivers was dramatic, especially in the army. Medical education was therefore re-established in 1794 in 3 (later 6) Health Schools, which became Schools of Medicine and then Faculties of Medicine, incorporated in 1808 into the Imperial University. Over 3 years, the courses were theoretical but also based on practical teaching at the patient's bedside. The defense of a thesis gave access to the title of doctor of medicine or surgery and allowed practice for all pathologies throughout the territory of the Empire. Meanwhile, medical courses were given in military hospitals to train officers of health, who were dedicated to the service of the army and to minor diseases in rural areas, and who were authorized to practice only in the department in which they had been received. Inspectors general provided medical education directly in the military medical structures and conducted examinations on medical care. This type of career is illustrated by the biography of Surgeon Major François Augustin Legaÿ.

  18. Theoretical and experimental research on laser-beam homogenization based on metal gauze

    NASA Astrophysics Data System (ADS)

    Liu, Libao; Zhang, Shanshan; Wang, Ling; Zhang, Yanchao; Tian, Zhaoshuo

    2018-03-01

    A method for homogenizing CO2 laser heating by means of a metal gauze is investigated theoretically and experimentally. The light-field distribution of an expanded beam passing through the metal gauze was numerically calculated using diffractive optical theory; comparison with the case without the gauze shows that the method is effective. Experimentally, a 30 W DC-discharge laser was used as the source and the beam was expanded by a concave lens; beam intensity distributions recorded on thermal paper, with and without the metal gauze, were compared, and complementary measurements were made with a thermal imager. The experimental results agree with the theoretical calculation, and both show that the homogeneity of CO2 laser heating can be enhanced by a metal gauze.

  19. Denial of Chronic Illness and Disability: Part I. Theoretical, Functional, and Dynamic Perspectives

    ERIC Educational Resources Information Center

    Livneh, Hanoch

    2009-01-01

    Denial has been an integral part of the psychological and disability literature for more than 100 years. Yet, denial is an elusive concept and has been associated with mixed, indeed conflicting, theoretical perspectives, clinical strategies, and empirical findings. In this two-part article, the author provides an overview of the existing…

  20. An Enduring Dialogue between Computational and Empirical Vision.

    PubMed

    Martinez-Conde, Susana; Macknik, Stephen L; Heeger, David J

    2018-04-01

    In the late 1970s, key discoveries in neurophysiology, psychophysics, computer vision, and image processing had reached a tipping point that would shape visual science for decades to come. David Marr and Ellen Hildreth's 'Theory of edge detection', published in 1980, set out to integrate the newly available wealth of data from behavioral, physiological, and computational approaches in a unifying theory. Although their work had wide and enduring ramifications, their most important contribution may have been to consolidate the foundations of the ongoing dialogue between theoretical and empirical vision science. Copyright © 2018 Elsevier Ltd. All rights reserved.

  1. Replacing Old Spatial Empires of the Mind: Rethinking Space and Place through Network Spatiality

    ERIC Educational Resources Information Center

    Beech, Jason; Larsen, Marianne A.

    2014-01-01

    In this article we argue for the spatialization of research on educational transfer in the field of comparative education within a theoretical framework that focuses on networks, connections, and flows. We present what we call a "spatial empire of the mind," which is comprised of a set of taken-for-granted "truths" about space…

  2. Objectives Stated for the Use of Literature at School: An Empirical Analysis, Part I.

    ERIC Educational Resources Information Center

    Klingberg, Gote; Agren, Bengt

    This report presents a theoretical basis for literary education through goal analyses. The object of the analyses is to obtain clearer formulations of the subgoals of instruction with the help of literature, and to arrange them in logical sequence. Using 79 sources from 12 countries, an empirical study was made, and goal descriptions were…

  3. Group-Based Hatred in Intractable Conflict in Israel

    ERIC Educational Resources Information Center

    Halperin, Eran

    2008-01-01

    Countless theoretical texts have been written regarding the centrality of hatred as a force that motivates intergroup conflicts. However, surprisingly, at present, almost no empirical study has been conducted either on the nature and character of group-based hatred or on its implications for conflicts. Therefore, the goal of the current work has…

  4. Empirical psychology, common sense, and Kant's empirical markers for moral responsibility.

    PubMed

    Frierson, Patrick

    2008-12-01

    This paper explains the empirical markers by which Kant thinks that one can identify moral responsibility. After explaining the problem of discerning such markers within a Kantian framework I briefly explain Kant's empirical psychology. I then argue that Kant's empirical markers for moral responsibility--linked to higher faculties of cognition--are not sufficient conditions for moral responsibility, primarily because they are empirical characteristics subject to natural laws. Next, I argue that these markers are not necessary conditions of moral responsibility. Given Kant's transcendental idealism, even an entity that lacks these markers could be free and morally responsible, although as a matter of fact Kant thinks that none are. Given that they are neither necessary nor sufficient conditions, I discuss the status of Kant's claim that higher faculties are empirical markers of moral responsibility. Drawing on connections between Kant's ethical theory and 'common rational cognition' (4:393), I suggest that Kant's theory of empirical markers can be traced to ordinary common sense beliefs about responsibility. This suggestion helps explain both why empirical markers are important and what the limits of empirical psychology are within Kant's account of moral responsibility.

  5. VizieR Online Data Catalog: A framework for empirical galaxy phenomenology (Munoz+, 2015)

    NASA Astrophysics Data System (ADS)

    Munoz, J. A.; Peeples, M. S.

    2017-11-01

    In this study, we develop a cohesive theoretical formalism for translating empirical relations into an understanding of the variations in galactic star formation histories. We achieve this goal by incorporating into the Main Sequence Integration (MSI) method the scatter suggested by the evolving fraction of quiescent galaxies and the spread in the observed stellar mass-star formation rate relation. (2 data files).

  6. Theoretical Bound of CRLB for Energy Efficient Technique of RSS-Based Factor Graph Geolocation

    NASA Astrophysics Data System (ADS)

    Kahar Aziz, Muhammad Reza; Heriansyah; Saputra, EfaMaydhona; Musa, Ardiansyah

    2018-03-01

    To support the growing development of wireless geolocation as a key future technology, this paper derives a theoretical bound, i.e., the Cramer-Rao lower bound (CRLB), for the energy-efficient received signal strength (RSS)-based factor graph wireless geolocation technique. Deriving this theoretical bound is crucially important both for evaluating whether the energy-efficient RSS-based factor graph technique is effective and for opening opportunities for further innovation of the technique. The CRLB is derived using the Fisher information matrix (FIM) of the main formula of the RSS-based factor graph geolocation technique, which relies on the Jacobian matrix. The simulation results show that the derived CRLB is the tightest bound, exhibiting the lowest root mean squared error (RMSE) curve compared with the RMSE curve of the RSS-based factor graph geolocation technique itself. Hence, the derived CRLB serves as the lower bound for the energy-efficient RSS-based factor graph wireless geolocation technique.
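    For a standard log-distance RSS path-loss model (a common choice, not necessarily the exact model of this paper), the CRLB on position follows directly from the Fisher information matrix built from the Jacobian of the RSS measurements with respect to the unknown coordinates. A hedged numerical sketch; the anchor layout, path-loss exponent and shadowing standard deviation below are illustrative values:

    ```python
    import numpy as np

    def rss_crlb(anchors, target, eta=3.0, sigma_db=4.0):
        """CRLB on position RMSE for RSS measurements under a log-distance
        path-loss model P_i = P0 - 10*eta*log10(d_i) + noise(sigma_db).

        anchors: (N, 2) anchor coordinates; target: (2,) true position."""
        diff = target - anchors                      # (N, 2) displacement vectors
        d2 = np.sum(diff**2, axis=1)                 # squared anchor distances
        # Jacobian rows of the RSS model w.r.t. (x, y)
        J = (-10.0 * eta / np.log(10.0)) * diff / d2[:, None]
        fim = J.T @ J / sigma_db**2                  # Fisher information matrix
        return float(np.sqrt(np.trace(np.linalg.inv(fim))))

    anchors = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
    print(rss_crlb(anchors, np.array([40.0, 60.0])))  # lower bound on RMSE (m)
    ```

    Any unbiased RSS-based estimator (factor graph or otherwise) has an RMSE no smaller than this value, which is why the CRLB curve sits below the estimator's RMSE curve in simulations.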

  7. Rationale for hedging initiatives: Empirical evidence from the energy industry

    NASA Astrophysics Data System (ADS)

    Dhanarajata, Srirajata

    Theory offers several rationales for hedging, including (i) financial distress and bankruptcy costs, (ii) the capacity to capture attractive investment opportunities, (iii) information asymmetry, (iv) economies of scale, (v) substitutes for hedging, (vi) managerial risk aversion, and (vii) convexity of the tax schedule. The purpose of this dissertation is to empirically test the explanatory power of the first five theoretical rationales for hedging by oil and gas exploration and production (E&P) companies. The level of hedging is measured by the percentage of production effectively hedged, calculated based on the concepts of delta and delta-gamma hedging. I employ Tobit regression, principal component analysis, and panel data analysis on the dependent and raw independent variables. Tobit regression is applied because the dependent variable is non-negative; principal component analysis reduces the dimension of the explanatory variables; and panel data analysis pools data that combine time-series and cross-sectional observations. Based on the empirical results, leverage is consistently found to be a significant factor in hedging activities, whether due to firms' attempts to avoid financial distress, debtholders' attempts to control agency costs, or both. The effects of capital expenditures and discretionary cash flows are both indeterminate, possibly owing to a mismatch between the timing of realized cash flow items and the hedging decision. Firm size is found to be positively related to hedging, supporting the economies-of-scale hypothesis introduced in past literature, as well as the argument that large firms are usually more sophisticated and should be more willing and comfortable to use hedging instruments than smaller firms.

  8. Empirical risk factors for delinquency and best treatments: where do we go from here?

    PubMed

    Zagar, Robert John; Busch, Kenneth G; Hughes, John Russell

    2009-02-01

    Youth development and prevention of violence are two sides of the same public policy issue. A great deal of theoretical and empirical effort has focused on identification of risk factors for delinquency and development of interventions for general risks. Recent calls for changes in public policy are evaluated here--and challenged--in light of new comprehensive, longitudinal empirical data on urban violent delinquency. Treatments such as prenatal care, home visitation, prevention of bullying, prevention of alcohol and/or drug abuse, promotion of alternative thinking, mentoring, life skills training, rewards for graduation and employment, functional family therapy, and multidimensional foster care are effective because they prevent or ameliorate risks for delinquency occurring during development. At present, the best treatments yield 10 to 40% reductions in delinquent recidivism. Better controlled application of developmentally appropriate treatments in higher doses, with narrow targeting of the highest-risk youth based on actuarial testing--rather than less accurate clinical judgment--should result in higher effectiveness. Such a focused approach in a geographical area with high homicide rates should be cost-effective. A prediction of cost-benefit outcomes for a carefully constructed example of a large-scale program is presented.

  9. Theoretical Bases of Science Education Research.

    ERIC Educational Resources Information Center

    Good, Ronald; And Others

    This symposium examines the science education research enterprise from multiple theoretical perspectives. The first paper, "Contextual Constructivism: The Impact of Culture on the Learning and Teaching of Science" (William Cobern), focuses on broad issues of culture and how constructivism is affected by the context of culture. Culturally based…

  10. An Attitudinal Explanation of Biases in the Criminal Justice System: An Empirical Testing of Defensive Attribution Theory

    ERIC Educational Resources Information Center

    Herzog, Sergio

    2008-01-01

    Theoretical perspectives, supported by empirical evidence, have consistently argued that the judicial treatment of offenders by criminal justice agents is sometimes biased by extralegal factors, such as offenders' sociodemographic characteristics. According to defensive attribution theory, individuals tend to protect themselves against unfortunate…

  11. Marx and Dahrendorf on Income Inequality, Class Consciousness and Class Conflict: An Empirical Test.

    ERIC Educational Resources Information Center

    Robinson, Robert V.; Kelley, Jonathan

    The issue addressed by this paper is the lack of empirical research on the class theories of Karl Marx and Ralf Dahrendorf. In order to bridge this gap, data are analyzed on the theoretical and statistical implications of Marx's theory (which focuses on ownership of the means of production) and Dahrendorf's theory (which focuses on authority in…

  12. Empirical Bases for a Prekindergarten Curriculum for Disadvantaged Children.

    ERIC Educational Resources Information Center

    Di Lorenzo, Louis T.; And Others

    This project was undertaken to establish a basis for a compensatory curriculum for disadvantaged preschool children by using existing empirical data to identify factors that predict success in reading comprehension and that differentiate the disadvantaged from the nondisadvantaged. The project focused on factors related to success in learning to…

  13. Proposing a Theoretical Framework for Digital Age Youth Information Behavior Building upon Radical Change Theory

    ERIC Educational Resources Information Center

    Koh, Kyungwon

    2011-01-01

    Contemporary young people are engaged in a variety of information behaviors, such as information seeking, using, sharing, and creating. The ways youth interact with information have transformed in the shifting digital information environment; however, relatively little empirical research exists and no theoretical framework adequately explains…

  14. Theoretical study of diaquamalonatozinc(II) single crystal for applications in non-linear optical devices

    NASA Astrophysics Data System (ADS)

    Chakraborty, Mitesh; Rai, Vineet Kumar

    2017-12-01

    The aim of the present paper is to employ theoretical methods to investigate the zero-field splitting (ZFS) parameter and the position of the dopant in the host; these theoretical calculations are compared with the empirical results. The superposition model (SPM) with microscopic spin-Hamiltonian (MSH) theory and the coefficients of fractional parentage have been employed to investigate substitution of the manganese(II) dopant ion in the diaquamalonatozinc(II) (DAMZ) single crystal. The magnetic parameters, viz. the g-tensor and the D-tensor, have been determined using the ORCA program package developed by F. Neese et al. The unrestricted Kohn-Sham-orbital-based Pederson-Khanna (PK) approach, used as the unperturbed wave function, is observed to be the most suitable for the computational calculation of the spin-orbit contribution (D^{SO}) to the axial ZFS parameter D. The effects of spin-spin dipolar couplings are taken into account, and the unrestricted natural orbital (UNO) is used for the calculation of the spin-spin dipolar contributions to the ZFS tensor. A comparative study of the Pederson-Khanna (PK) quantum-mechanical treatment and the coupled-perturbation (CP) treatment is reported in the present study. The unrestricted Kohn-Sham-based natural orbital with a Pederson-Khanna-type perturbation approach validates the experimental results in the evaluation of the ZFS parameters. The theoretical results agree with the experimental ones and indicate interstitial occupancy of the Mn^{2+} ion in the host matrix.

  15. How fast is fisheries-induced evolution? Quantitative analysis of modelling and empirical studies

    PubMed Central

    Audzijonyte, Asta; Kuparinen, Anna; Fulton, Elizabeth A

    2013-01-01

    A number of theoretical models, experimental studies and time-series studies of wild fish have explored the presence and magnitude of fisheries-induced evolution (FIE). While most studies agree that FIE is likely to be happening in many fished stocks, there are disagreements about its rates and implications for stock viability. To address these disagreements in a quantitative manner, we conducted a meta-analysis of FIE rates reported in theoretical and empirical studies. We discovered that rates of phenotypic change observed in wild fish are about four times higher than the evolutionary rates reported in modelling studies, but correlation between the rate of change and instantaneous fishing mortality (F) was very similar in the two types of studies. Mixed-model analyses showed that in the modelling studies traits associated with reproductive investment and growth evolved slower than rates related to maturation. In empirical observations age-at-maturation was changing faster than other life-history traits. We also found that, despite different assumption and modelling approaches, rates of evolution for a given F value reported in 10 of 13 modelling studies were not significantly different. PMID:23789026

  16. Learning Science Content through Socio-Scientific Issues-Based Instruction: A Multi-Level Assessment Study

    ERIC Educational Resources Information Center

    Sadler, Troy D.; Romine, William L.; Topçu, Mustafa Sami

    2016-01-01

    Science educators have presented numerous conceptual and theoretical arguments in favor of teaching science through the exploration of socio-scientific issues (SSI). However, the empirical knowledge base regarding the extent to which SSI-based instruction supports student learning of science content is limited both in terms of the number of…

  17. Using an empirical and rule-based modeling approach to map cause of disturbance in U.S

    Treesearch

    Todd A. Schroeder; Gretchen G. Moisen; Karen Schleeweis; Chris Toney; Warren B. Cohen; Zhiqiang Yang; Elizabeth A. Freeman

    2015-01-01

    Recently completing over a decade of research, the NASA/NACP funded North American Forest Dynamics (NAFD) project has led to several important advancements in the way U.S. forest disturbance dynamics are mapped at regional and continental scales. One major contribution has been the development of an empirical and rule-based modeling approach which addresses two of the...

  18. Artifact removal from EEG data with empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Grubov, Vadim V.; Runnova, Anastasiya E.; Efremova, Tatyana Yu.; Hramov, Alexander E.

    2017-03-01

    In this paper we propose a novel method for removing the physiological artifacts caused by intensive activity of facial and neck muscles and by other movements in experimental human EEG recordings. The method is based on analysis of EEG signals with empirical mode decomposition (the Hilbert-Huang transform). We introduce the mathematical algorithm of the method, with the following steps: empirical mode decomposition of the EEG signal, identification of the empirical modes containing artifacts, removal of those modes, and reconstruction of the initial EEG signal. We test the method by filtering movement artifacts from experimental human EEG signals and show its high efficiency.
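    The four steps above can be sketched as follows. Computing the decomposition itself requires a full EMD sifting implementation or a library, so this toy example starts from already-known "modes" (simulated components whose sum is the recorded signal) and demonstrates only the selection and reconstruction steps; the variance threshold is an invented artifact criterion, not the authors' actual selection rule.

    ```python
    import numpy as np

    # Toy "EEG": a 10 Hz rhythm plus a slow drift, contaminated by a large
    # broadband muscle artifact. EMD would order modes from high to low frequency.
    t = np.linspace(0, 2, 1000)
    alpha = np.sin(2 * np.pi * 10 * t)                              # brain rhythm
    drift = 0.3 * t                                                 # slow trend
    emg = 2.0 * np.random.default_rng(1).standard_normal(1000)      # artifact
    modes = [emg, alpha, drift]                                     # stand-in IMFs
    eeg = sum(modes)

    def remove_artifact_modes(modes, threshold):
        """Drop modes whose variance exceeds `threshold` (a crude artifact
        criterion chosen for this sketch), then reconstruct the signal."""
        kept = [m for m in modes if np.var(m) <= threshold]
        return sum(kept)

    clean = remove_artifact_modes(modes, threshold=1.0)  # keeps alpha + drift
    ```

    In practice the artifact modes would be identified from the actual IMFs of the recording (by amplitude, frequency content, or correlation with reference channels) rather than by a fixed variance cutoff.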

  19. The birth of the empirical turn in bioethics.

    PubMed

    Borry, Pascal; Schotsmans, Paul; Dierickx, Kris

    2005-02-01

    Since its origin, bioethics has attracted the collaboration of few social scientists, and social scientific methods of gathering empirical data have remained unfamiliar to ethicists. Recently, however, the clouded relations between the empirical and normative perspectives on bioethics appear to be changing. Three reasons explain why there was no easy and consistent input of empirical evidence in bioethics. Firstly, interdisciplinary dialogue runs the risk of communication problems and divergent objectives. Secondly, the social sciences were absent partners since the beginning of bioethics. Thirdly, the meta-ethical distinction between 'is' and 'ought' created a 'natural' border between the disciplines. Now, bioethics tends to accommodate more empirical research. Three hypotheses explain this emergence. Firstly, dissatisfaction with a foundationalist interpretation of applied ethics created a stimulus to incorporate empirical research in bioethics. Secondly, clinical ethicists became engaged in empirical research due to their strong integration in the medical setting. Thirdly, the rise of the evidence-based paradigm had an influence on the practice of bioethics. However, a problematic relationship cannot simply and easily evolve into a perfect interaction. A new and positive climate for empirical approaches has arisen, but the original difficulties have not disappeared.

  20. An Empirical Introduction to the Concept of Chemical Element Based on Van Hiele's Theory of Level Transitions

    ERIC Educational Resources Information Center

    Vogelezang, Michiel; Van Berkel, Berry; Verdonk, Adri

    2015-01-01

    Between 1970 and 1990, the Dutch working group "Empirical Introduction to Chemistry" developed a secondary school chemistry education curriculum based on the educational vision of the mathematicians van Hiele and van Hiele-Geldof. This approach viewed learning as a process in which students must go through discontinuous level transitions…

  1. Argumentation and Participation in the Primary Mathematics Classroom: Two Episodes and Related Theoretical Abductions

    ERIC Educational Resources Information Center

    Krummheuer, Gotz

    2007-01-01

    The main assumption of this article is that learning mathematics depends on the student's participation in processes of collective argumentation. On the empirical level, such processes will be analyzed with Toulmin's theory of argumentation and Goffman's idea of decomposition of the speaker's role. On the theoretical level, different statuses of…

  2. Discovery of Empirical Components by Information Theory

    DTIC Science & Technology

    2016-08-10

    Report AFRL-AFOSR-VA-TR-2016-0289, "Discovery of Empirical Components by Information Theory," Amit Singer, Trustees of Princeton University, 1 Nassau Hall. Dates covered: 15 Feb 2013 to 14 Feb 2016. … they draw not only from traditional linear-algebra-based numerical analysis and approximation theory, but also from information theory and graph theory

  3. Integrated empirical ethics: loss of normativity?

    PubMed

    van der Scheer, Lieke; Widdershoven, Guy

    2004-01-01

    An important discussion in contemporary ethics concerns the relevance of empirical research for ethics. Specifically, two crucial questions pertain, respectively, to the possibility of inferring normative statements from descriptive statements, and to the danger of a loss of normativity if normative statements should be based on empirical research. Here we take part in the debate and defend integrated empirical ethical research: research in which normative guidelines are established on the basis of empirical research and in which the guidelines are empirically evaluated by focusing on observable consequences. We argue that in our concrete example normative statements are not derived from descriptive statements, but are developed within a process of reflection and dialogue that goes on within a specific praxis. Moreover, we show that the distinction in experience between the desirable and the undesirable precludes relativism. The normative guidelines so developed are both critical and normative: they help in choosing the right action and in evaluating that action. Finally, following Aristotle, we plead for a return to the view that morality and ethics are inherently related to one another, and for an acknowledgment of the fact that moral judgments have their origin in experience which is always related to historical and cultural circumstances.

  4. Results of Chilean water markets: Empirical research since 1990

    NASA Astrophysics Data System (ADS)

    Bauer, Carl J.

    2004-09-01

    Chile's free-market Water Code turned 20 years old in October 2001. This anniversary was an important milestone for both Chilean and international debates about water policy because Chile has become the world's leading example of the free-market approach to water law and water resources management, the textbook case of treating water rights not merely as private property but also as a fully marketable commodity. The predominant view outside of Chile is that Chilean water markets and the Chilean model of water management have been a success, and this perception has encouraged other countries to follow Chile's lead in water law reform. Much of the debate about Chilean water markets, however, has been based more on theoretical or political beliefs than on empirical study. This paper reverses that emphasis by reviewing the evolution of empirical research about these markets since 1990, when Chile returned to democratic government after 16 years of military rule. During the period since 1990, understanding of how Chilean water markets have worked in practice has gradually improved. There have been two major trends in this research: first, a gradual shift from exaggerated claims of the markets' success toward more balanced assessments of mixed results and, second, a heavy emphasis on the economics of water rights trading with very little attention given to the Water Code's impacts on social equity, river basin management, environmental protection, or resolution of water conflicts. The analysis in this study is qualitative and interdisciplinary, combining law, economics, and institutions.

  5. Gender in medical ethics: re-examining the conceptual basis of empirical research.

    PubMed

    Conradi, Elisabeth; Biller-Andorno, Nikola; Boos, Margarete; Sommer, Christina; Wiesemann, Claudia

    2003-01-01

    Conducting empirical research on gender in medical ethics is a challenge from a theoretical as well as a practical point of view. It still has to be clarified how gender aspects can be integrated without sustaining gender stereotypes. The developmental psychologist Carol Gilligan was among the first to question ethics from a gendered point of view. The notion of care introduced by her challenged conventional developmental psychology as well as moral philosophy. Gilligan was criticised, however, because her concept of 'two different voices' may reinforce gender stereotypes. Moreover, although Gilligan stressed relatedness, this is not reflected in her own empirical approach, which still focuses on individual moral reflection. Concepts from social psychology can help overcome both problems. Social categories like gender shape moral identity and moral decisions. If morality is understood as being lived through actions of persons in social relationships, gender becomes a helpful category of moral analysis. Our findings will provide a conceptual basis for the question of how empirical research in medical ethics can successfully embrace a gendered perspective.

  6. Theoretical kinetics of O + C2H4

    DOE PAGES

    Li, Xiaohu; Jasper, Ahren W.; Zádor, Judit; ...

    2016-06-01

    The reaction of atomic oxygen with ethylene is a fundamental oxidation step in combustion and is prototypical of reactions in which oxygen adds to double bonds. For 3O + C2H4, and for this class of reactions generally, decomposition of the initial adduct via spin-allowed reaction channels on the triplet surface competes with intersystem crossing (ISC) and a set of spin-forbidden reaction channels on the ground-state singlet surface. The two surfaces share some bimolecular products but feature different intermediates, pathways, and transition states; the overall product branching is therefore a sensitive function of the ISC rate. The 3O + C2H4 reaction has been extensively studied, but previous experimental work has not provided detailed branching information at elevated temperatures, while previous theoretical studies have employed empirical treatments of ISC. Here we predict the kinetics of 3O + C2H4 using an ab initio transition state theory based master equation (AITSTME) approach that includes an a priori description of ISC. Specifically, the ISC rate is calculated using Landau-Zener statistical theory, consideration of the four lowest-energy electronic states, and a direct classical trajectory study of the product branching immediately after ISC. The present theoretical results are largely in good agreement with existing low-temperature experimental kinetics and molecular beam studies. Good agreement is also found with past theoretical work, with the notable exception of the predicted product branching at elevated temperatures. Above ~1000 K, we predict CH2CHO + H and CH2 + CH2O as the major products, which differs from the room-temperature preference for CH3 + HCO (which is assumed to remain at higher temperatures in some models) and from the prediction of a previous detailed master equation study.

  7. Health status and health dynamics in an empirical model of expected longevity.

    PubMed

    Benítez-Silva, Hugo; Ni, Huan

    2008-05-01

    Expected longevity is an important factor influencing older individuals' decisions such as consumption, savings, purchase of life insurance and annuities, claiming of Social Security benefits, and labor supply. It has also been shown to be a good predictor of actual longevity, which in turn is highly correlated with health status. A relatively new literature on health investments under uncertainty, which builds upon the seminal work by Grossman [Grossman, M., 1972. On the concept of health capital and demand for health. Journal of Political Economy 80, 223-255] has directly linked longevity with characteristics, behaviors, and decisions by utility maximizing agents. Our empirical model can be understood within that theoretical framework as estimating a production function of longevity. Using longitudinal data from the Health and Retirement Study, we directly incorporate health dynamics in explaining the variation in expected longevities, and compare two alternative measures of health dynamics: the self-reported health change, and the computed health change based on self-reports of health status. In 38% of the reports in our sample, computed health changes are inconsistent with the direct report on health changes over time. And another 15% of the sample can suffer from information losses if computed changes are used to assess changes in actual health. These potentially serious problems raise doubts regarding the use and interpretation of the computed health changes and even the lagged measures of self-reported health as controls for health dynamics in a variety of empirical settings. Our empirical results, controlling for both subjective and objective measures of health status and unobserved heterogeneity in reporting, suggest that self-reported health changes are a preferred measure of health dynamics.

  8. An Empirical Investigation of a Theoretically Based Measure of Perceived Wellness

    ERIC Educational Resources Information Center

    Harari, Marc J.; Waehler, Charles A.; Rogers, James R.

    2005-01-01

    The Perceived Wellness Survey (PWS; T. Adams, 1995; T. Adams, J. Bezner, & M. Steinhardt, 1997) is a recently developed instrument intended to operationalize the comprehensive Perceived Wellness Model (T. Adams, J. Bezner, & M. Steinhardt, 1997), an innovative model that attempts to include the balance of multiple life activities in its evaluation…

  9. Electrocardiogram signal denoising based on empirical mode decomposition technique: an overview

    NASA Astrophysics Data System (ADS)

    Han, G.; Lin, B.; Xu, Z.

    2017-03-01

    The electrocardiogram (ECG) signal is a nonlinear, non-stationary, weak signal that reflects whether the heart is functioning normally or abnormally. The ECG signal is susceptible to various kinds of noise such as high/low-frequency noise, powerline interference, and baseline wander. Hence, the removal of noise from ECG signals is a vital link in ECG signal processing and plays a significant role in the detection and diagnosis of heart disease. This review describes recent developments in ECG signal denoising based on the Empirical Mode Decomposition (EMD) technique, including high-frequency noise removal, powerline interference separation, baseline wander correction, the combination of EMD with other methods, and the EEMD technique. EMD is a promising, though not perfect, method for processing nonlinear and non-stationary signals such as the ECG. Combining EMD with other algorithms is a good solution to improve the performance of noise cancellation. The pros and cons of the EMD technique in ECG signal denoising are discussed in detail. Finally, future work and challenges in EMD-based ECG signal denoising are clarified.

  10. Bivariate empirical mode decomposition for ECG-based biometric identification with emotional data.

    PubMed

    Ferdinando, Hany; Seppanen, Tapio; Alasaarela, Esko

    2017-07-01

    Emotions modulate ECG signals and might therefore affect ECG-based biometric identification in real-life applications. This motivates the search for feature extraction methods on which the emotional state of the subject has minimal impact. This paper evaluates feature extraction based on bivariate empirical mode decomposition (BEMD) for biometric identification when emotion is considered. Using ECG signals from the Mahnob-HCI database for affect recognition, the features were statistical distributions of the dominant frequency after applying BEMD analysis to the ECG signals. The achieved accuracy was 99.5%, with high consistency, using a kNN classifier in 10-fold cross-validation to identify 26 subjects when the emotional states of the subjects were ignored. When the emotional states of the subjects were considered, the proposed method also delivered high accuracy, around 99.4%. We conclude that the proposed method offers emotion-independent features for ECG-based biometric identification. The proposed method needs further evaluation with other classifiers and with variations in ECG signals, e.g. normal ECG vs. ECG with arrhythmias, ECG from various ages, and ECG from other affective databases.
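    The evaluation loop described above (kNN with 10-fold cross-validation over 26 subjects) can be sketched as follows. The feature vectors here are synthetic stand-ins, not the BEMD dominant-frequency statistics from the paper:

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(0)
    n_subjects, beats_per_subject, n_features = 26, 40, 8

    # Synthetic features: each subject gets a distinct mean vector plus noise,
    # mimicking subject-specific per-heartbeat feature statistics.
    X = np.vstack([rng.normal(loc=rng.normal(0, 3, n_features), scale=0.5,
                              size=(beats_per_subject, n_features))
                   for _ in range(n_subjects)])
    y = np.repeat(np.arange(n_subjects), beats_per_subject)

    scores = cross_val_score(KNeighborsClassifier(n_neighbors=3), X, y, cv=10)
    print(f"mean 10-fold accuracy: {scores.mean():.3f}")
    ```

    With real data, the same loop would simply be run twice: once pooling all emotional states, and once restricted to each emotional condition, to check that accuracy stays stable.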

  11. Intrinsic fluorescence of protein in turbid media using empirical relation based on Monte Carlo lookup table

    NASA Astrophysics Data System (ADS)

    Einstein, Gnanatheepam; Udayakumar, Kanniyappan; Aruna, Prakasarao; Ganesan, Singaravelu

    2017-03-01

    Fluorescence of Protein has been widely used in diagnostic oncology for characterizing cellular metabolism. However, the intensity of fluorescence emission is affected due to the absorbers and scatterers in tissue, which may lead to error in estimating exact protein content in tissue. Extraction of intrinsic fluorescence from measured fluorescence has been achieved by different methods. Among them, Monte Carlo based method yields the highest accuracy for extracting intrinsic fluorescence. In this work, we have attempted to generate a lookup table for Monte Carlo simulation of fluorescence emission by protein. Furthermore, we fitted the generated lookup table using an empirical relation. The empirical relation between measured and intrinsic fluorescence is validated using tissue phantom experiments. The proposed relation can be used for estimating intrinsic fluorescence of protein for real-time diagnostic applications and thereby improving the clinical interpretation of fluorescence spectroscopic data.
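    The fitting step can be illustrated with a small sketch. Both the "lookup table" values and the functional form below are invented stand-ins (the paper's empirical relation is not reproduced here); the point is only the workflow of fitting a closed-form relation to Monte Carlo output so it can be evaluated in real time:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Stand-in "lookup table": intrinsic vs measured fluorescence pairs that a
    # Monte Carlo simulation would produce for given absorption/scattering.
    intrinsic = np.linspace(0.1, 1.0, 50)
    measured = 0.8 * intrinsic / (1.0 + 0.5 * intrinsic)   # synthetic distortion

    def relation(m, a, b):
        """Assumed empirical form mapping measured back to intrinsic fluorescence."""
        return a * m / (1.0 - b * m)

    params, _ = curve_fit(relation, measured, intrinsic, p0=(1.2, 0.6))
    recovered = relation(measured, *params)
    ```

    Once the parameters are fitted, evaluating `relation` is a single arithmetic expression, which is what makes this faster than re-running Monte Carlo per measurement.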

  12. Law of Empires.

    ERIC Educational Resources Information Center

    Martz, Carlton

    2001-01-01

    This issue of "Bill of Rights in Action" explores issues raised by empires and imperial law. The first article, "Clash of Empires: The Fight for North America," looks at the clash of empires and the fight for North America during the 18th century. The second article, "When Roman Law Ruled the Western World," examines…

  13. Richness-Productivity Relationships Between Trophic Levels in a Detritus-Based System: Significance of Abundance and Trophic Linkage.

    EPA Science Inventory

    Most theoretical and empirical studies of productivity–species richness relationships fail to consider linkages among trophic levels. We quantified productivity–richness relationships in detritus-based, water-filled tree-hole communities for two trophic levels: invertebrate consu...

  14. Research into Practice: The Task-Based Approach to Instructed Second Language Acquisition

    ERIC Educational Resources Information Center

    East, Martin

    2017-01-01

    This article discusses the phenomenon of task-based language teaching (TBLT) in instructed additional language settings. It begins from the premise that, despite considerable theoretical and empirical support, TBLT remains a contested endeavour. Critics of TBLT argue that, particularly with regard to time-limited foreign language instructional…

  15. An Empirical Typology of Residential Care/Assisted Living Based on a Four-State Study

    ERIC Educational Resources Information Center

    Park, Nan Sook; Zimmerman, Sheryl; Sloane, Philip D.; Gruber-Baldini, Ann L.; Eckert, J. Kevin

    2006-01-01

    Purpose: Residential care/assisted living describes diverse facilities providing non-nursing home care to a heterogeneous group of primarily elderly residents. This article derives typologies of assisted living based on theoretically and practically grounded evidence. Design and Methods: We obtained data from the Collaborative Studies of Long-Term…

  16. Activity Theory as a Theoretical Framework for Health Self-Quantification: A Systematic Review of Empirical Studies

    PubMed Central

    2016-01-01

    Background Self-quantification (SQ) is a way of working in which, by using tracking tools, people aim to collect, manage, and reflect on personal health data to gain a better understanding of their own body, health behavior, and interaction with the world around them. However, health SQ lacks a formal framework for describing the self-quantifiers’ activities and their contextual components or constructs to pursue these health related goals. Establishing such framework is important because it is the first step to operationalize health SQ fully. This may in turn help to achieve the aims of health professionals and researchers who seek to make or study changes in the self-quantifiers’ health systematically. Objective The aim of this study was to review studies on health SQ in order to answer the following questions: What are the general features of the work and the particular activities that self-quantifiers perform to achieve their health objectives? What constructs of health SQ have been identified in the scientific literature? How have these studies described such constructs? How would it be possible to model these constructs theoretically to characterize the work of health SQ? Methods A systematic review of peer-reviewed literature was conducted. A total of 26 empirical studies were included. The content of these studies was thematically analyzed using Activity Theory as an organizing framework. Results The literature provided varying descriptions of health SQ as data-driven and objective-oriented work mediated by SQ tools. From the literature, we identified two types of SQ work: work on data (ie, data management activities) and work with data (ie, health management activities). Using Activity Theory, these activities could be characterized into 6 constructs: users, tracking tools, health objectives, division of work, community or group setting, and SQ plan and rules. We could not find a reference to any single study that accounted for all these activities and

  17. Activity Theory as a Theoretical Framework for Health Self-Quantification: A Systematic Review of Empirical Studies.

    PubMed

    Almalki, Manal; Gray, Kathleen; Martin-Sanchez, Fernando

    2016-05-27

    Self-quantification (SQ) is a way of working in which, by using tracking tools, people aim to collect, manage, and reflect on personal health data to gain a better understanding of their own body, health behavior, and interaction with the world around them. However, health SQ lacks a formal framework for describing the self-quantifiers' activities and their contextual components or constructs to pursue these health related goals. Establishing such framework is important because it is the first step to operationalize health SQ fully. This may in turn help to achieve the aims of health professionals and researchers who seek to make or study changes in the self-quantifiers' health systematically. The aim of this study was to review studies on health SQ in order to answer the following questions: What are the general features of the work and the particular activities that self-quantifiers perform to achieve their health objectives? What constructs of health SQ have been identified in the scientific literature? How have these studies described such constructs? How would it be possible to model these constructs theoretically to characterize the work of health SQ? A systematic review of peer-reviewed literature was conducted. A total of 26 empirical studies were included. The content of these studies was thematically analyzed using Activity Theory as an organizing framework. The literature provided varying descriptions of health SQ as data-driven and objective-oriented work mediated by SQ tools. From the literature, we identified two types of SQ work: work on data (ie, data management activities) and work with data (ie, health management activities). Using Activity Theory, these activities could be characterized into 6 constructs: users, tracking tools, health objectives, division of work, community or group setting, and SQ plan and rules. We could not find a reference to any single study that accounted for all these activities and constructs of health SQ activity. A

  18. Understanding Skill in EVA Mass Handling. Volume 2; Empirical Investigation

    NASA Technical Reports Server (NTRS)

    Riccio, Gary; McDonald, Vernon; Peters, Brian; Layne, Charles; Bloomberg, Jacob

    1997-01-01

    In this report we describe the details of our empirical protocol for investigating skill in extravehicular mass handling using NASA's principal mass handling simulator, the precision air bearing floor. Contents of this report include a description of the necessary modifications to the mass handling simulator, the choice of task, and the description of an operationally relevant protocol. Our independent variables are presented in the context of the specific operational issues they were designed to simulate. The explanation of our dependent variables focuses on the specific data processing procedures used to transform data from common laboratory instruments into measures that are relevant to a special class of nested control systems (discussed in Volume 1): manual interactions between an individual and the substantial environment. The data reduction is explained in the context of the theoretical foundation described in Volume 1. Finally, as a preface to the presentation of the empirical data in Volume 3 of this report series, a set of detailed hypotheses is presented.

  19. Fringe-projection profilometry based on two-dimensional empirical mode decomposition.

    PubMed

    Zheng, Suzhen; Cao, Yiping

    2013-11-01

    In 3D shape measurement, because deformed fringes often contain low-frequency information degraded by random noise and background intensity, a new fringe-projection profilometry is proposed based on 2D empirical mode decomposition (2D-EMD). The fringe pattern is first decomposed into a number of intrinsic mode functions by 2D-EMD. Because the method provides partial noise reduction, the background components can be removed to obtain the fundamental components needed to perform the Hilbert transformation that retrieves the phase information. The 2D-EMD can effectively extract the modulation phase of a single-direction fringe and an inclined fringe pattern because it is a fully 2D analysis method and considers the relationship between adjacent lines of a fringe pattern. In addition, as the method does not add noise repeatedly, as ensemble EMD does, the data processing time is shortened. Computer simulations and experiments prove the feasibility of this method.
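    The final step only (not the 2D-EMD pipeline itself) can be sketched in one dimension: once background and noise are removed, the Hilbert transform of a fringe line yields its analytic signal, whose angle is the modulation phase. The carrier and modulation below are synthetic:

    ```python
    import numpy as np
    from scipy.signal import hilbert

    n = 512
    x = np.arange(n) / n                                   # one spatial period
    true_phase = 2 * np.pi * 20 * x + 0.8 * np.sin(2 * np.pi * x)  # carrier + modulation
    fringe = np.cos(true_phase)                            # zero-mean fundamental component

    analytic = hilbert(fringe)                             # FFT-based analytic signal
    unwrapped = np.unwrap(np.angle(analytic))              # continuous retrieved phase
    ```

    This works cleanly because the fringe here is zero-mean and narrowband; in practice, that is exactly the condition the preceding 2D-EMD step is meant to establish.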

  20. Comparison of binding energies of SrcSH2-phosphotyrosyl peptides with structure-based prediction using surface area based empirical parameterization.

    PubMed Central

    Henriques, D. A.; Ladbury, J. E.; Jackson, R. M.

    2000-01-01

    The prediction of binding energies from the three-dimensional (3D) structure of a protein-ligand complex is an important goal of biophysics and structural biology. Here, we critically assess the use of empirical, solvent-accessible surface area-based calculations for the prediction of the binding of Src-SH2 domain with a series of tyrosyl phosphopeptides based on the high-affinity ligand from the hamster middle T antigen (hmT), where the residue in the pY+3 position has been changed. Two other peptides based on the C-terminal regulatory site of the Src protein and the platelet-derived growth factor receptor (PDGFR) are also investigated. Here, we take into account the effects of proton linkage on binding, and test five different surface area-based models that include different treatments for the contributions to conformational change and protein solvation. These differences relate to the treatment of conformational flexibility in the peptide ligand and the inclusion of proximal ordered solvent molecules in the surface area calculations. This allowed the calculation of a range of thermodynamic state functions (deltaCp, deltaS, deltaH, and deltaG) directly from structure. Comparison with the experimentally derived data shows little agreement for the interaction of SrcSH2 domain and the range of tyrosyl phosphopeptides. Furthermore, the adoption of the different models to treat conformational change and solvation has a dramatic effect on the calculated thermodynamic functions, making the predicted binding energies highly model dependent. While empirical, solvent-accessible surface area based calculations are becoming widely adopted to interpret thermodynamic data, this study highlights potential problems with application and interpretation of this type of approach. There is undoubtedly some agreement between predicted and experimentally determined thermodynamic parameters: however, the tolerance of this approach is not sufficient to make it ubiquitously applicable

  1. Modeling and empirical characterization of the polarization response of off-plane reflection gratings.

    PubMed

    Marlowe, Hannah; McEntaffer, Randall L; Tutt, James H; DeRoo, Casey T; Miles, Drew M; Goray, Leonid I; Soltwisch, Victor; Scholze, Frank; Herrero, Analia Fernandez; Laubis, Christian

    2016-07-20

    Off-plane reflection gratings were previously predicted to have different efficiencies when the incident light is polarized in the transverse-magnetic (TM) versus transverse-electric (TE) orientations with respect to the grating grooves. However, more recent theoretical calculations which rigorously account for finitely conducting, rather than perfectly conducting, grating materials no longer predict significant polarization sensitivity. We present the first empirical results for radially ruled, laminar groove profile gratings in the off-plane mount, which demonstrate no difference in TM versus TE efficiency across our entire 300-1500 eV bandpass. These measurements together with the recent theoretical results confirm that grazing incidence off-plane reflection gratings using real, not perfectly conducting, materials are not polarization sensitive.

  2. Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions

    NASA Astrophysics Data System (ADS)

    De Risi, Raffaele; Goda, Katsuichiro

    2017-08-01

    Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast, or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area, for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve that is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve, which is based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, a reduction of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.
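    The conventional empirical hazard-curve step can be sketched as follows. The event rate and the simulated inundation depths are synthetic stand-ins for actual PTHA simulation output:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    event_rate = 0.1                 # assumed mean annual rate of tsunamigenic events
    # Stand-in for simulation output: inundation depth (m) per simulated event.
    depths = rng.lognormal(mean=0.0, sigma=0.8, size=5000)

    levels = np.linspace(0.1, 5.0, 50)
    # Mean annual rate at which inundation depth exceeds each level:
    # P(depth > level) from the simulations, scaled by the event rate.
    exceedance = np.array([(depths > h).mean() for h in levels]) * event_rate
    ```

    The curve `(levels, exceedance)` is the empirical hazard curve; the robust Bayesian variant described in the paper would instead fit a parametric model to the simulated depths, reducing the number of simulations needed for stable tail estimates.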

  3. Development of a Problem-Based Learning Matrix for Data Collection

    ERIC Educational Resources Information Center

    Sipes, Shannon M.

    2017-01-01

    Few of the papers published in journals and conference proceedings on problem-based learning (PBL) are empirical studies, and most of these use self-report as the measure of PBL (Beddoes, Jesiek, & Borrego, 2010). The current study provides a theoretically derived matrix for coding and classifying PBL that was objectively applied to official…

  4. Determination of knock characteristics in spark ignition engines: an approach based on ensemble empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Li, Ning; Yang, Jianguo; Zhou, Rui; Liang, Caiping

    2016-04-01

    Knock is one of the major constraints to improve the performance and thermal efficiency of spark ignition (SI) engines. It can also result in severe permanent engine damage under certain operating conditions. Based on the ensemble empirical mode decomposition (EEMD), this paper proposes a new approach to determine the knock characteristics in SI engines. By adding a uniformly distributed and finite white Gaussian noise, the EEMD can preserve signal continuity in different scales and therefore alleviates the mode-mixing problem occurring in the classic empirical mode decomposition (EMD). The feasibilities of applying the EEMD to detect the knock signatures of a test SI engine via the pressure signal measured from combustion chamber and the vibration signal measured from cylinder head are investigated. Experimental results show that the EEMD-based method is able to detect the knock signatures from both the pressure signal and vibration signal, even in initial stage of knock. Finally, by comparing the application results with those obtained by short-time Fourier transform (STFT), Wigner-Ville distribution (WVD) and discrete wavelet transform (DWT), the superiority of the EEMD method in determining knock characteristics is demonstrated.

  5. A new simple local muscle recovery model and its theoretical and experimental validation.

    PubMed

    Ma, Liang; Zhang, Wei; Wu, Su; Zhang, Zhanwu

    2015-01-01

    This study was conducted to provide theoretical and experimental validation of a local muscle recovery model. Muscle recovery has been modeled in different empirical and theoretical approaches to determine work-rest allowance for musculoskeletal disorder (MSD) prevention. However, time-related parameters and individual attributes have not been sufficiently considered in conventional approaches. A new muscle recovery model was proposed by integrating time-related task parameters and individual attributes. Theoretically, this muscle recovery model was compared to other theoretical models mathematically. Experimentally, a total of 20 subjects participated in the experimental validation. Hand grip force recovery and shoulder joint strength recovery were measured after a fatiguing operation. The recovery profile was fitted by using the recovery model, and individual recovery rates were calculated as well after fitting. Good fitting values (r² > .8) were found for all the subjects. Significant differences in recovery rates were found among different muscle groups (p < .05). The theoretical muscle recovery model was primarily validated by characterization of the recovery process after fatiguing operation. The determined recovery rate may be useful to represent individual recovery attribute.
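    The fitting procedure can be illustrated with a sketch under assumptions: the paper's model is not reproduced here, so a hypothetical exponential recovery profile F(t) = Fmax - (Fmax - F0)·exp(-R·t) stands in for it, with R playing the role of the individual recovery rate extracted by fitting:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def recovery(t, f_max, f0, rate):
        """Hypothetical exponential recovery from f0 toward full strength f_max."""
        return f_max - (f_max - f0) * np.exp(-rate * t)

    t = np.linspace(0, 10, 30)                    # minutes after the fatiguing task
    rng = np.random.default_rng(1)
    # Synthetic strength measurements (% of baseline) with measurement noise.
    observed = recovery(t, 100.0, 55.0, 0.6) + rng.normal(0.0, 1.0, t.size)

    params, _ = curve_fit(recovery, t, observed, p0=(90.0, 50.0, 0.5))
    f_max, f0, rate = params
    print(f"estimated individual recovery rate R = {rate:.2f} /min")
    ```

    Per-subject, per-muscle-group fits of this kind yield the individual recovery rates that the study then compares across muscle groups.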

  6. Meta-Theoretical Contributions to the Constitution of a Model-Based Didactics of Science

    NASA Astrophysics Data System (ADS)

    Ariza, Yefrin; Lorenzano, Pablo; Adúriz-Bravo, Agustín

    2016-10-01

    There is nowadays consensus in the community of didactics of science (i.e. science education understood as an academic discipline) regarding the need to include the philosophy of science in didactical research, science teacher education, curriculum design, and the practice of science education in all educational levels. Some authors have identified an ever-increasing use of the concept of `theoretical model', stemming from the so-called semantic view of scientific theories. However, it can be recognised that, in didactics of science, there are over-simplified transpositions of the idea of model (and of other meta-theoretical ideas). In this sense, contemporary philosophy of science is often blurred or distorted in the science education literature. In this paper, we address the discussion around some meta-theoretical concepts that are introduced into didactics of science due to their perceived educational value. We argue for the existence of a `semantic family', and we characterise four different versions of semantic views existing within the family. In particular, we seek to contribute to establishing a model-based didactics of science mainly supported in this semantic family.

  7. Stalking: developing an empirical typology to classify stalkers.

    PubMed

    Del Ben, Kevin; Fremouw, W

    2002-01-01

    Stalking has received a great deal of attention from the media and its harmful effects on victims have been well documented. Stalking is also more common than previously thought, leading researchers to classify stalkers into groups in an attempt to predict future behavior. Previous research has grouped stalkers based on theoretical models rather than trying to empirically examine stalking behaviors along with other factors such as motivation, type of relationship, and attachment style in determining a typology of stalkers. Female college students (N = 108) who had experienced stalking behaviors responded to questions regarding their perceptions of those behaviors. First, these victim perceptions were factor analyzed. Then, cluster analysis grouped those factors to produce a four-cluster typology of stalkers. Cluster 1 (Harmless) appeared to reflect a more casual, less jealous pattern of behavior. Cluster 2 (Low Threat) appeared the least likely to become physically violent or threatening, or to engage in illegal behaviors. Cluster 3 (Violent Criminal) appeared to be the most likely to engage in physically threatening and illegal behaviors. Cluster 4 (High Threat) was characterized by a more serious type of relationship, and these stalkers may attempt to restrict their partners from the time they first meet.
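The two-stage analysis described above, factor scores followed by cluster analysis, can be sketched with SciPy's k-means. The "factor scores" below are synthetic placeholders with four planted profiles; the cluster count of four matches the typology, but nothing else about the data reflects the study.

```python
import numpy as np
from scipy.cluster.vq import kmeans2, whiten

# Synthetic "factor scores" for 108 respondents: four planted profiles
rng = np.random.default_rng(2)
centers = np.array([[-2, -2], [-2, 2], [2, -2], [2, 2]], dtype=float)
scores = np.vstack([c + rng.normal(0, 0.4, (27, 2)) for c in centers])

np.random.seed(2)  # kmeans2 draws its '++' initialization from the global RNG
centroids, labels = kmeans2(whiten(scores), 4, minit="++")
n_clusters = len(np.unique(labels))
```

With well-separated profiles, k-means recovers the four planted groups; on real victim-perception data the cluster count would be chosen by fit criteria rather than fixed in advance.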

  8. The successful merger of theoretical thermochemistry with fragment-based methods in quantum chemistry.

    PubMed

    Ramabhadran, Raghunath O; Raghavachari, Krishnan

    2014-12-16

    CONSPECTUS: Quantum chemistry and electronic structure theory have proven to be essential tools to the experimental chemist, in terms of both a priori predictions that pave the way for designing new experiments and rationalizing experimental observations a posteriori. Translating the well-established success of electronic structure theory in obtaining the structures and energies of small chemical systems to increasingly larger molecules is an exciting and ongoing central theme of research in quantum chemistry. However, the prohibitive computational scaling of highly accurate ab initio electronic structure methods poses a fundamental challenge to this research endeavor. This scenario necessitates an indirect fragment-based approach wherein a large molecule is divided into small fragments and is subsequently reassembled to compute its energy accurately. In our quest to further reduce the computational expense associated with the fragment-based methods and overall enhance the applicability of electronic structure methods to large molecules, we realized that the broad ideas involved in a different area, theoretical thermochemistry, are transferable to the area of fragment-based methods. This Account focuses on the effective merger of these two disparate frontiers in quantum chemistry and how new concepts inspired by theoretical thermochemistry significantly reduce the total number of electronic structure calculations needed to be performed as part of a fragment-based method without any appreciable loss of accuracy. Throughout, the generalized connectivity based hierarchy (CBH), which we developed to solve a long-standing problem in theoretical thermochemistry, serves as the linchpin in this merger. The accuracy of our method is based on two strong foundations: (a) the apt utilization of systematic and sophisticated error-canceling schemes via CBH that result in an optimal cutting scheme at any given level of fragmentation and (b) the use of a less expensive second

  9. Empirically Guided Coordination of Multiple Evidence-Based Treatments: An Illustration of Relevance Mapping in Children's Mental Health Services

    ERIC Educational Resources Information Center

    Chorpita, Bruce F.; Bernstein, Adam; Daleiden, Eric L.

    2011-01-01

    Objective: Despite substantial progress in the development and identification of psychosocial evidence-based treatments (EBTs) in mental health, there is minimal empirical guidance for selecting an optimal "set" of EBTs maximally applicable and generalizable to a chosen service sample. Relevance mapping is a proposed methodology that…

  10. The Theoretical Research Article as a Reflection of Disciplinary Practices: The Case of Pure Mathematics

    ERIC Educational Resources Information Center

    Kuteeva, Maria; McGrath, Lisa

    2015-01-01

    Recent years have seen an interest in the generic structure of empirical research articles across a variety of disciplines. However, significantly less attention has been given to theoretical articles. This study aims to begin to address this imbalance by presenting the results of an investigation into the organizational and rhetorical structure…

  11. Empirical Data Collection and Analysis Using Camtasia and Transana

    ERIC Educational Resources Information Center

    Thorsteinsson, Gisli; Page, Tom

    2009-01-01

    One of the possible techniques for collecting empirical data is video recordings of a computer screen with specific screen capture software. This method for collecting empirical data shows how students use the BSCWII (Be Smart Cooperate Worldwide--a web based collaboration/groupware environment) to coordinate their work and collaborate in…

  12. Language acquisition is model-based rather than model-free.

    PubMed

    Wang, Felix Hao; Mintz, Toben H

    2016-01-01

    Christiansen & Chater (C&C) propose that learning language is learning to process language. However, we believe that the general-purpose prediction mechanism they propose is insufficient to account for many phenomena in language acquisition. We argue from theoretical considerations and empirical evidence that many acquisition tasks are model-based, and that different acquisition tasks require different, specialized models.

  13. Theoretical Conversions of Different Hardness and Tensile Strength for Ductile Materials Based on Stress-Strain Curves

    NASA Astrophysics Data System (ADS)

    Chen, Hui; Cai, Li-Xun

    2018-04-01

    Based on the power-law stress-strain relation and the equivalent energy principle, theoretical equations for converting between Brinell hardness (HB), Rockwell hardness (HR), and Vickers hardness (HV) were established. Combining the pre-existing relation between the tensile strength (σb) and the Hollomon parameters (K, N), theoretical conversions between hardness (HB/HR/HV) and tensile strength (σb) were obtained as well. In addition, to confirm the pre-existing σb-(K, N) relation, a large number of uniaxial tensile tests were conducted on various ductile materials. Finally, to verify the theoretical conversions, the extensive statistical data listed in ASTM and ISO standards were adopted to test the robustness of the converting equations across various hardness and tensile strength values. The results show that both the hardness conversions and the hardness-strength conversions calculated from the theoretical equations accord well with the standard data.
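One classical σb-(K, N) relation of the kind the paper builds on follows from the Considère criterion: for Hollomon hardening σ = K ε^N, necking begins at true strain ε = N, so the engineering tensile strength is σb = K N^N e^(-N). A small sketch; the K and N values are assumed for illustration and are not from the paper, whose exact σb-(K, N) relation may differ.

```python
import math

def hollomon_uts(k_mpa, n):
    """Engineering ultimate tensile strength implied by Hollomon
    hardening (sigma = K * eps**n) via the Considere criterion:
    necking starts at true strain eps = n, so the true stress there
    is K * n**n, and dividing by exp(n) converts to engineering
    stress: sigma_b = K * n**n * exp(-n)."""
    return k_mpa * n ** n * math.exp(-n)

# Illustrative parameters for a mild-steel-like material (assumed values)
sigma_b = hollomon_uts(k_mpa=700.0, n=0.22)   # ~403 MPa
```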

  14. Beyond the theoretical rhetoric: a proposal to study the consequences of drug legalization.

    PubMed

    Yacoubian, G S

    2001-01-01

    Drug legalization is a frequently debated drug control policy alternative. It should come as little surprise, therefore, that the arguments in favor of both legalization and prohibition have resulted in a conceptual stalemate. While theoretical deliberations are unquestionably valuable, they seem to have propelled this particular issue to its limit. To date, no works have suggested any empirical studies that might test the framework and potential consequences of drug legalization. In the current study, the arguments surrounding the drug legalization debate are synthesized into a proposal for future research. Such a proposal illustrates that the core elements surrounding drug legalization are not only testable, but that the time may be right to consider such an empirical effort.

  15. Adult Coping with Childhood Sexual Abuse: A Theoretical and Empirical Review

    PubMed Central

    Walsh, Kate; Fortier, Michelle A.; DiLillo, David

    2009-01-01

    Coping has been suggested as an important element in understanding the long-term functioning of individuals with a history of child sexual abuse (CSA). The present review synthesizes the literature on coping with CSA, first by examining theories of coping with trauma, and, second by examining how these theories have been applied to studies of coping in samples of CSA victims. Thirty-nine studies were reviewed, including eleven descriptive studies of the coping strategies employed by individuals with a history of CSA, eighteen correlational studies of the relationship between coping strategies and long-term functioning of CSA victims, and ten investigations in which coping was examined as a mediational factor in relation to long-term outcomes. These studies provide initial information regarding early sexual abuse and subsequent coping processes. However, this literature is limited by several theoretical and methodological issues, including a failure to specify the process of coping as it occurs, a disparity between theory and research, and limited applicability to clinical practice. Future directions of research are discussed and include the need to understand coping as a process, identification of coping in relation to adaptive outcomes, and considerations of more complex mediational and moderational processes in the study of coping with CSA. PMID:20161502

  16. Optical dosimetry probes to validate Monte Carlo and empirical-method-based NIR dose planning in the brain.

    PubMed

    Verleker, Akshay Prabhu; Shaffer, Michael; Fang, Qianqian; Choi, Mi-Ran; Clare, Susan; Stantz, Keith M

    2016-12-01

    Three-dimensional photon dosimetry in tissue is critical in designing optical therapeutic protocols to trigger light-activated drug release. The objective of this study is to investigate the feasibility of a Monte Carlo-based optical therapy planning software by developing dosimetry tools to characterize and cross-validate the local photon fluence in brain tissue, as part of a long-term strategy to quantify the effects of photoactivated drug release in brain tumors. An existing GPU-based 3D Monte Carlo (MC) code was modified to simulate near-infrared photon transport with differing laser beam profiles within phantoms of skull bone (B), white matter (WM), and gray matter (GM). A novel titanium-based optical dosimetry probe with isotropic acceptance was used to validate the local photon fluence, and an empirical model of photon transport was developed to significantly decrease execution time for clinical application. Differences between the MC and the dosimetry probe measurements were on average 11.27%, 13.25%, and 11.81% along the illumination beam axis, and 9.4%, 12.06%, and 8.91% perpendicular to the beam axis for the WM, GM, and B phantoms, respectively. For a heterogeneous head phantom, the measured % errors were 17.71% and 18.04% along and perpendicular to the beam axis. The empirical algorithm was validated by probe measurements and matched the MC results (R² ≈ 0.99), with average % errors of 10.1%, 45.2%, and 22.1% relative to probe measurements, and 22.6%, 35.8%, and 21.9% relative to the MC, for the WM, GM, and B phantoms, respectively. The simulation time for the empirical model was 6 s, versus 8 h for the GPU-based Monte Carlo, for a head phantom simulation. These tools provide the capability to develop and optimize treatment plans for optimal release of pharmaceuticals in the treatment of cancer. Future work will test and validate these novel delivery and release mechanisms in vivo.
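The trade-off the authors exploit, a slow stochastic Monte Carlo versus a fast closed-form model that reproduces it, can be shown in one dimension: sample exponential photon free paths and compare the surviving fraction with the Beer-Lambert prediction. The attenuation coefficient below is an assumed illustrative value, not a tissue measurement from the paper.

```python
import numpy as np

def mc_transmission(mu, depth, n_photons=200_000, seed=3):
    """Toy 1D Monte Carlo: photon free paths are exponentially
    distributed with attenuation coefficient mu (1/cm); count the
    fraction that pass the given depth without interacting."""
    rng = np.random.default_rng(seed)
    free_paths = rng.exponential(1.0 / mu, n_photons)
    return float(np.mean(free_paths > depth))

mu = 1.2       # assumed total attenuation coefficient, 1/cm
depth = 1.0    # cm
mc = mc_transmission(mu, depth)
analytic = float(np.exp(-mu * depth))  # Beer-Lambert closed form
rel_err = abs(mc - analytic) / analytic
```

The closed form evaluates instantly while the sampled estimate needs many photons to converge, a one-line analogue of the 6 s versus 8 h gap reported above.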

  17. Forecasting stochastic neural network based on financial empirical mode decomposition.

    PubMed

    Wang, Jie; Wang, Jun

    2017-06-01

    In an attempt to improve the forecasting accuracy of stock price fluctuations, a new one-step-ahead model is developed in this paper which combines empirical mode decomposition (EMD) with a stochastic time strength neural network (STNN). The EMD is a processing technique introduced to extract all the oscillatory modes embedded in a series, and the STNN model is established to account for the weight of the occurrence time of the historical data. Linear regression confirms the predictive ability of the proposed model, and the effectiveness of EMD-STNN is clearly revealed by comparing its predictions with those of traditional models. Moreover, a new evaluation method (q-order multiscale complexity-invariant distance) is applied to measure the predicted results for real stock index series, and the empirical results show that the proposed model indeed displays good performance in forecasting stock market fluctuations. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. An empirically based conceptual framework for fostering meaningful patient engagement in research.

    PubMed

    Hamilton, Clayon B; Hoens, Alison M; Backman, Catherine L; McKinnon, Annette M; McQuitty, Shanon; English, Kelly; Li, Linda C

    2018-02-01

    Patient engagement in research (PEIR) is promoted to improve the relevance and quality of health research, but has little conceptualization derived from empirical data. To address this issue, we sought to develop an empirically based conceptual framework for meaningful PEIR founded on a patient perspective. We conducted a qualitative secondary analysis of in-depth interviews with 18 patient research partners from a research centre-affiliated patient advisory board. Data analysis involved three phases: identifying the themes, developing a framework and confirming the framework. We coded and organized the data, and abstracted, illustrated, described and explored the emergent themes using thematic analysis. Directed content analysis was conducted to derive concepts from 18 publications related to PEIR to supplement, confirm or refute, and extend the emergent conceptual framework. The framework was reviewed by four patient research partners on our research team. Participants' experiences of working with researchers were generally positive. Eight themes emerged: procedural requirements, convenience, contributions, support, team interaction, research environment, feel valued and benefits. These themes were interconnected and formed a conceptual framework to explain the phenomenon of meaningful PEIR from a patient perspective. This framework, the PEIR Framework, was endorsed by the patient research partners on our team. The PEIR Framework provides guidance on aspects of PEIR to address for meaningful PEIR. It could be particularly useful when patient-researcher partnerships are led by researchers with little experience of engaging patients in research. © 2017 The Authors Health Expectations Published by John Wiley & Sons Ltd.

  19. Differentiating and defusing theoretical Ecology's criticisms: A rejoinder to Sagoff's reply to Donhauser (2016).

    PubMed

    Donhauser, Justin

    2017-06-01

    In a (2016) paper in this journal, I defuse allegations that theoretical ecological research is problematic because it relies on teleological metaphysical assumptions. Mark Sagoff offers a formal reply. In it, he concedes that I succeeded in establishing that ecologists abandoned robust teleological views long ago and that they use teleological characterizations as metaphors that aid in developing mechanistic explanations of ecological phenomena. Yet, he contends that I did not give enduring criticisms of theoretical ecology a fair shake in my paper. He says this is because enduring criticisms center on concerns about the nature of ecological networks and forces, the instrumentality of ecological laws and theoretical models, and the relation between theoretical and empirical methods in ecology, which that paper does not broach. Below I set apart the distinct criticisms Sagoff presents in his commentary and respond to each in turn. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. ISO standardization of a theoretical activity evaluation method for low- and intermediate-level activated waste generated at nuclear power plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makoto Kashiwagi; Garamszeghy, Mike; Lantes, Bertrand

    Disposal of low- and intermediate-level activated waste generated at nuclear power plants is being planned or carried out in many countries. The radioactivity concentrations and/or total quantities of long-lived, difficult-to-measure nuclides (DTM nuclides), such as C-14, Ni-63, Nb-94, α emitting nuclides etc., are often restricted by the safety case for a final repository as determined by each country's safety regulations, and these concentrations or amounts are required to be known and declared. With respect to waste contaminated by contact with process water, the Scaling Factor method (SF method), which is empirically based on sampling and analysis data, has been applied as an important method for determining concentrations of DTM nuclides. This method was standardized by the International Organization for Standardization (ISO) and published in 2007 as ISO21238 'Scaling factor method to determine the radioactivity of low and intermediate-level radioactive waste packages generated at nuclear power plants' [1]. However, for activated metal waste with comparatively high concentrations of radioactivity, such as may be found in reactor control rods and internal structures, direct sampling and radiochemical analysis methods to evaluate the DTM nuclides are limited by access to the material and potentially high personnel radiation exposure. In this case, theoretical calculation methods in combination with empirical methods based on remote radiation surveys need to be used to best advantage for determining the disposal inventory of DTM nuclides while minimizing exposure to radiation workers. Pursuant to this objective, a standard for the theoretical evaluation of the radioactivity concentration of DTM nuclides in activated waste is in process through ISO TC85/SC5 (ISO Technical Committee 85: Nuclear energy, nuclear technologies, and radiological protection; Subcommittee 5: Nuclear fuel cycle). The project team for this ISO standard was formed in 2011 and is

  1. Smartphone-Based, Self-Administered Intervention System for Alcohol Use Disorders: Theory and Empirical Evidence Basis

    PubMed Central

    Dulin, Patrick L.; Gonzalez, Vivian M.; King, Diane K.; Giroux, Danielle; Bacon, Samantha

    2013-01-01

    Advances in mobile technology provide an opportunity to deliver in-the-moment interventions to individuals with alcohol use disorders, yet availability of effective “apps” that deliver evidence-based interventions is scarce. We developed an immediately available, portable, smartphone-based intervention system whose purpose is to provide stand-alone, self-administered assessment and intervention. In this paper, we describe how theory and empirical evidence, combined with smartphone functionality contributed to the construction of a user-friendly, engaging alcohol intervention. With translation in mind, we discuss how we selected appropriate intervention components including assessments, feedback and tools, that work together to produce the hypothesized outcomes. PMID:24347811

  2. Empirically Based Phenotypic Profiles of Children with Pervasive Developmental Disorders: Interpretation in the Light of the DSM-5

    ERIC Educational Resources Information Center

    Greaves-Lord, Kirstin; Eussen, Mart L. J. M.; Verhulst, Frank C.; Minderaa, Ruud B.; Mandy, William; Hudziak, James J.; Steenhuis, Mark Peter; de Nijs, Pieter F.; Hartman, Catharina A.

    2013-01-01

    This study aimed to contribute to the Diagnostic and Statistical Manual (DSM) debates on the conceptualization of autism by investigating (1) whether empirically based distinct phenotypic profiles could be distinguished within a sample of mainly cognitively able children with pervasive developmental disorder (PDD), and (2) how profiles related to…

  3. Theoretical NMR correlations based Structure Discussion.

    PubMed

    Junker, Jochen

    2011-07-28

    The constitutional assignment of natural products by NMR spectroscopy is usually based on 2D NMR experiments like COSY, HSQC, and HMBC. The actual difficulty of the structure elucidation problem depends more on the type of the investigated molecule than on its size. The moment HMBC data is involved in the process or a large number of heteroatoms is present, a possibility of multiple solutions fitting the same data set exists. A structure elucidation software can be used to find such alternative constitutional assignments and help in the discussion in order to find the correct solution. But this is rarely done. This article describes the use of theoretical NMR correlation data in the structure elucidation process with WEBCOCON, not for the initial constitutional assignments, but to define how well a suggested molecule could have been described by NMR correlation data. The results of this analysis can be used to decide on further steps needed to assure the correctness of the structural assignment. As first step the analysis of the deviation of carbon chemical shifts is performed, comparing chemical shifts predicted for each possible solution with the experimental data. The application of this technique to three well known compounds is shown. Using NMR correlation data alone for the description of the constitutions is not always enough, even when including 13C chemical shift prediction.

  4. Ensemble empirical mode decomposition based fluorescence spectral noise reduction for low concentration PAHs

    NASA Astrophysics Data System (ADS)

    Wang, Shu-tao; Yang, Xue-ying; Kong, De-ming; Wang, Yu-tian

    2017-11-01

    A new noise reduction method based on ensemble empirical mode decomposition (EEMD) is proposed to improve the detection of fluorescence spectra. Polycyclic aromatic hydrocarbons (PAHs), an important class of current environmental pollutants, are highly oncogenic. PAH pollutants can be detected by fluorescence spectroscopy. However, the instrument introduces noise that can obscure weak fluorescence signals, so we propose a denoising method to improve detection. First, we use a fluorescence spectrometer to detect PAHs and obtain fluorescence spectra. Subsequently, noise is reduced by the EEMD algorithm. Finally, the experimental results show that the proposed method is feasible.

  5. Theoretical and empirical investigations of KCl:Eu2+ for nearly water-equivalent radiotherapy dosimetry

    PubMed Central

    Zheng, Yuanshui; Han, Zhaohui; Driewer, Joseph P.; Low, Daniel A.; Li, H. Harold

    2010-01-01

    Purpose: The low effective atomic number, reusability, and other computed radiography-related advantages make europium doped potassium chloride (KCl:Eu2+) a promising dosimetry material. The purpose of this study is to model KCl:Eu2+ point dosimeters with a Monte Carlo (MC) method and, using this model, to investigate the dose responses of two-dimensional (2D) KCl:Eu2+ storage phosphor films (SPFs). Methods: KCl:Eu2+ point dosimeters were irradiated using a 6 MV beam at four depths (5–20 cm) for each of five square field sizes (5×5–25×25 cm2). The dose measured by KCl:Eu2+ was compared to that measured by an ionization chamber to obtain the magnitude of energy dependent dose measurement artifact. The measurements were simulated using DOSXYZnrc with phase space files generated by BEAMnrcMP. Simulations were also performed for KCl:Eu2+ films with thicknesses ranging from 1 μm to 1 mm. The work function of the prototype KCl:Eu2+ material was determined by comparing the sensitivity of a 150 μm thick KCl:Eu2+ film to a commercial BaFBr0.85I0.15:Eu2+-based SPF with a known work function. The work function was then used to estimate the sensitivity of a 1 μm thick KCl:Eu2+ film. Results: The simulated dose responses of prototype KCl:Eu2+ point dosimeters agree well with measurement data acquired by irradiating the dosimeters in the 6 MV beam with varying field size and depth. Furthermore, simulations with films demonstrate that an ultrathin KCl:Eu2+ film with thickness of the order of 1 μm would have nearly water-equivalent dose response. The simulation results can be understood using classic cavity theories. Finally, preliminary experiments and theoretical calculations show that ultrathin KCl:Eu2+ film could provide excellent signal in a 1 cGy dose-to-water irradiation. Conclusions: In conclusion, the authors demonstrate that KCl:Eu2+-based dosimeters can be accurately modeled by a MC method and that 2D KCl:Eu2+ films of the order of 1 μm thick would have

  6. Development and empirical validation of symmetric component measures of multidimensional constructs: customer and competitor orientation.

    PubMed

    Sørensen, Hans Eibe; Slater, Stanley F

    2008-08-01

    Atheoretical measure purification may lead to construct-deficient measures. The purpose of this paper is to provide a theoretically driven procedure for the development and empirical validation of symmetric component measures of multidimensional constructs. Particular emphasis is placed on establishing a formalized three-step procedure for achieving a posteriori content validity. The procedure is then applied to the development and empirical validation of two symmetric component measures of market orientation: customer orientation and competitor orientation. The analysis suggests that average variance extracted is particularly critical to reliability in the respecification of multi-indicator measures. Relatedly, the results also identify possible deficiencies in using Cronbach's alpha for establishing reliable and valid measures.
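Average variance extracted (AVE), which the authors flag as critical, is simply the mean squared standardized loading of a construct's indicators; AVE ≥ 0.5 is the conventional convergent-validity threshold. A minimal sketch with hypothetical loadings (not values from the paper):

```python
def average_variance_extracted(loadings):
    """AVE for a reflective construct: the mean squared standardized
    loading. Values >= 0.5 are conventionally taken to indicate
    adequate convergent validity."""
    return sum(l ** 2 for l in loadings) / len(loadings)

# Hypothetical standardized loadings for a customer-orientation scale
ave = average_variance_extracted([0.78, 0.81, 0.69, 0.74])
```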

  7. Non-suicidal self-injury and life stress: A systematic meta-analysis and theoretical elaboration

    PubMed Central

    Liu, Richard T.; Cheek, Shayna M.; Nestor, Bridget A.

    2016-01-01

    Recent years have seen a considerable growth of interest in the study of life stress and non-suicidal self-injury (NSSI). The current article presents a systematic review of the empirical literature on this association. In addition to providing a comprehensive meta-analysis, the current article includes a qualitative review of the findings for which there were too few cases (i.e., < 3) for reliable approximations of effect sizes. Across the studies included in the meta-analysis, a significant but modest relation between life stress and NSSI was found (pooled OR = 1.81 [95% CI = 1.49–2.21]). After an adjustment was made for publication bias, the estimated effect size was smaller but still significant (pooled OR = 1.33 [95% CI = 1.08–1.63]). This relation was moderated by sample type, NSSI measure type, and length of period covered by the NSSI measure. The empirical literature is characterized by several methodological limitations, particularly the frequent use of cross-sectional analyses involving temporal overlap between assessments of life stress and NSSI, leaving unclear the precise nature of the relation between these two phenomena (e.g., whether life stress may be a cause, concomitant, or consequence of NSSI). Theoretically informed research utilizing multi-wave designs, assessing life stress and NSSI over relatively brief intervals, and featuring interview-based assessments of these constructs holds promise for advancing our understanding of their relation. The current review concludes with a theoretical elaboration of the association between NSSI and life stress, with the aim of providing a conceptual framework to guide future study in this area. PMID:27267345
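Pooling odds ratios of the kind reported above is typically done by inverse-variance weighting on the log-OR scale, recovering each study's standard error from its 95% CI. The sketch below implements the fixed-effect version with hypothetical study data; the review itself pooled with moderator analyses and these numbers are not its data.

```python
import math

def pool_odds_ratios(studies):
    """Inverse-variance (fixed-effect) pooling on the log-OR scale.
    Each study is (OR, ci_low, ci_high) with a 95% CI; the CI width
    recovers the standard error: se = (ln(hi) - ln(lo)) / (2 * 1.96)."""
    num = den = 0.0
    for or_, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2          # inverse-variance weight
        num += w * math.log(or_)
        den += w
    mean_log = num / den
    se_pooled = math.sqrt(1.0 / den)
    ci = (math.exp(mean_log - 1.96 * se_pooled),
          math.exp(mean_log + 1.96 * se_pooled))
    return math.exp(mean_log), ci

# Hypothetical study-level ORs (not the review's actual data)
pooled_or, ci95 = pool_odds_ratios([(1.9, 1.2, 3.0),
                                    (1.5, 1.0, 2.3),
                                    (2.2, 1.3, 3.7)])
```

A random-effects version would add a between-study variance term to each weight; the fixed-effect form above is the simpler building block.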

  8. Dewey's Concept of Experience for Inquiry-Based Landscape Drawing during Field Studies

    ERIC Educational Resources Information Center

    Tillmann, Alexander; Albrecht, Volker; Wunderlich, Jürgen

    2017-01-01

    The epistemological and educational philosophy of John Dewey is used as a theoretical basis to analyze processes of knowledge construction during geographical field studies. The experience of landscape drawing as a method of inquiry and a starting point for research-based learning is empirically evaluated. The basic drawing skills are acquired…

  9. A practical application of practice-based learning: development of an algorithm for empiric antibiotic coverage in ventilator-associated pneumonia.

    PubMed

    Miller, Preston R; Partrick, Matthew S; Hoth, J Jason; Meredith, J Wayne; Chang, Michael C

    2006-04-01

    Development of practice-based learning (PBL) is one of the core competencies required for resident education by the Accreditation Council for Graduate Medical Education, and specialty organizations including the American College of Surgeons have formed task forces to understand and disseminate information on this important concept. However, translating this concept into daily practice may be difficult. Our goal was to describe the successful application of PBL to patient care improvement through development of an algorithm for the empiric therapy of ventilator-associated pneumonia (VAP). The algorithm development occurred in two phases. In phase 1, the microbiology and timing of VAP as diagnosed by bronchoalveolar lavage were reviewed over a 2-year period to allow recognition of patterns of infection. In phase 2, based on these data, an algorithm for empiric antibiotic coverage that would ensure that the large majority of patients with VAP received adequate initial empiric therapy was developed and put into practice. The period of algorithm use was then examined to determine the rate of adequate coverage and outcomes. In phase 1, from January 1, 2000 to December 31, 2001, 110 patients were diagnosed with VAP. Analysis of the microbiology revealed a sharp increase in the recovery of nosocomial pathogens on postinjury day 7 (19% < day 7 versus 47% ≥ day 7, p = 0.003). Adequate initial antibiotic coverage was seen in 74%. In phase 2, an algorithm employing ampicillin-sulbactam for coverage of community-acquired pathogens before day 7 and cefepime for nosocomial coverage ≥ day 7 was employed from January 1, 2002 to December 31, 2003. Evaluation of 186 VAP cases during this interval revealed a similar distribution of nosocomial cases (13% < day 7 versus 64% ≥ day 7, p < 0.0001). Empiric antibiotic therapy was adequate in 82% of cases, and overall accuracy improved to 83% (p = 0.05). Mortality from phase 1 to phase 2 trended
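The phase-2 rule reduces to a single day-7 threshold and can be written as a trivial lookup. The sketch below restates the abstract's algorithm (note that "cefipime" in the abstract is presumably the cephalosporin cefepime); it illustrates the decision logic only and is not clinical guidance.

```python
def empiric_vap_coverage(postinjury_day):
    """Day-based empiric choice from the abstract's algorithm:
    community-acquired coverage before post-injury day 7,
    nosocomial coverage from day 7 onward. Illustration only."""
    if postinjury_day < 7:
        return "ampicillin-sulbactam"   # community-acquired pathogens
    return "cefepime"                   # nosocomial pathogens

choice_early = empiric_vap_coverage(4)
choice_late = empiric_vap_coverage(9)
```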

  10. Proposed Core Competencies and Empirical Validation Procedure in Competency Modeling: Confirmation and Classification.

    PubMed

    Baczyńska, Anna K; Rowiński, Tomasz; Cybis, Natalia

    2016-01-01

    Competency models provide insight into key skills which are common to many positions in an organization. Moreover, there is a range of competencies that is used by many companies. Researchers have developed core competency terminology to underline their cross-organizational value. The article presents a theoretical model of core competencies consisting of two main higher-order competencies, called performance and entrepreneurship. Each consists of three elements: the performance competency includes cooperation, organization of work and goal orientation, while entrepreneurship includes innovativeness, calculated risk-taking and pro-activeness. However, there is a lack of empirical validation of competency concepts in organizations, which would seem crucial for obtaining reliable results from organizational research. We propose a two-step empirical validation procedure: (1) confirmatory factor analysis, and (2) classification of employees. The sample consisted of 636 respondents (M = 44.5; SD = 15.1). Participants were administered a questionnaire developed for the study purpose. The reliability, measured by Cronbach's alpha, ranged from 0.60 to 0.83 for the six scales. Next, we tested the model using a confirmatory factor analysis. The two separate, single models of performance and entrepreneurial orientations fit the data quite well, while a complex model based on the two single concepts needs further research. In the classification of employees based on the two higher-order competencies we obtained four main groups of employees. Their profiles relate to those found in the literature, including so-called niche finders and top performers. Some proposals for organizations are discussed.
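
The scale reliabilities reported here (Cronbach's alpha of 0.60 to 0.83) follow a standard formula that is easy to reproduce; the code below is a generic sketch on simulated item scores, not the study's data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(scale totals))
    """
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of summed scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Simulated 4-item scale: items share one latent factor plus noise.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
scores = latent + rng.normal(scale=1.0, size=(200, 4))
print(round(cronbach_alpha(scores), 2))
```

With equally reliable items like these, the estimate lands near the theoretical value of 0.8 for this setup; weaker inter-item correlations would push it toward the lower end of the range the study reports.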

  11. Proposed Core Competencies and Empirical Validation Procedure in Competency Modeling: Confirmation and Classification

    PubMed Central

    Baczyńska, Anna K.; Rowiński, Tomasz; Cybis, Natalia

    2016-01-01

    Competency models provide insight into key skills which are common to many positions in an organization. Moreover, there is a range of competencies that is used by many companies. Researchers have developed core competency terminology to underline their cross-organizational value. The article presents a theoretical model of core competencies consisting of two main higher-order competencies, called performance and entrepreneurship. Each consists of three elements: the performance competency includes cooperation, organization of work and goal orientation, while entrepreneurship includes innovativeness, calculated risk-taking and pro-activeness. However, there is a lack of empirical validation of competency concepts in organizations, which would seem crucial for obtaining reliable results from organizational research. We propose a two-step empirical validation procedure: (1) confirmatory factor analysis, and (2) classification of employees. The sample consisted of 636 respondents (M = 44.5; SD = 15.1). Participants were administered a questionnaire developed for the study purpose. The reliability, measured by Cronbach’s alpha, ranged from 0.60 to 0.83 for the six scales. Next, we tested the model using a confirmatory factor analysis. The two separate, single models of performance and entrepreneurial orientations fit the data quite well, while a complex model based on the two single concepts needs further research. In the classification of employees based on the two higher-order competencies we obtained four main groups of employees. Their profiles relate to those found in the literature, including so-called niche finders and top performers. Some proposals for organizations are discussed. PMID:27014111

  12. The Role of Identity in Acculturation among Immigrant People: Theoretical Propositions, Empirical Questions, and Applied Recommendations

    ERIC Educational Resources Information Center

    Schwartz, Seth J.; Montgomery, Marilyn J.; Briones, Ervin

    2006-01-01

    The present paper advances theoretical propositions regarding the relationship between acculturation and identity. The most central thesis argued is that acculturation represents changes in cultural identity and that personal identity has the potential to "anchor" immigrant people during their transition to a new society. The article emphasizes…

  13. Theoretical results on the tandem junction solar cell based on its Ebers-Moll transistor model

    NASA Technical Reports Server (NTRS)

    Goradia, C.; Vaughn, J.; Baraona, C. R.

    1980-01-01

    A one-dimensional theoretical model of the tandem junction solar cell (TJC) with base resistivity greater than about 1 ohm-cm and under low-level injection has been derived. This model extends a previously published conceptual model which treats the TJC as an npn transistor. The model gives theoretical expressions for each of the Ebers-Moll type currents of the illuminated TJC and allows for the calculation of the spectral response, I(sc), V(oc), FF and eta under variation of one or more of the geometrical and material parameters and 1-MeV electron fluence. Results of computer calculations based on this model are presented and discussed. These results indicate that for space applications, both a high beginning-of-life efficiency, greater than 15% AM0, and a high radiation tolerance can be achieved only with thin (less than 50 microns) TJCs with high base resistivity (greater than 10 ohm-cm).
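
For readers unfamiliar with the Ebers-Moll picture the abstract builds on, the classic two-diode npn form is sketched below; this is the generic textbook model, not the paper's TJC-specific expressions, and the parameter values are illustrative:

```python
import math

def ebers_moll_npn(v_be, v_bc, i_es=1e-14, i_cs=1e-14,
                   alpha_f=0.99, alpha_r=0.5, vt=0.02585):
    """Classic Ebers-Moll terminal currents for an npn transistor.

    v_be, v_bc: junction voltages (V); vt: thermal voltage at ~300 K.
    Returns (emitter, collector, base) currents in amperes.
    """
    i_f = i_es * (math.exp(v_be / vt) - 1.0)  # forward diode current
    i_r = i_cs * (math.exp(v_bc / vt) - 1.0)  # reverse diode current
    i_c = alpha_f * i_f - i_r                 # collector current
    i_e = i_f - alpha_r * i_r                 # emitter current
    return i_e, i_c, i_e - i_c                # base current = I_E - I_C

# Forward-active bias: B-E forward, B-C reverse biased.
i_e, i_c, i_b = ebers_moll_npn(v_be=0.65, v_bc=-5.0)
print(i_c / i_b)  # current gain, roughly alpha_f / (1 - alpha_f)
```

The paper's contribution is to replace these dark-current terms with illuminated, geometry-dependent expressions for the TJC, but the transistor bookkeeping is the same.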

  14. Nursing management of sensory overload in psychiatry – development of a theoretical framework model

    PubMed

    Scheydt, Stefan; Needham, Ian; Nielsen, Gunnar H; Behrens, Johann

    2016-09-01

    Background: The concept of “removal from stimuli” has already been examined by a Delphi study. However, some knowledge gaps remained open, which have now been further investigated. Aim: Examination of the concept “management of sensory overload in inpatient psychiatry” including its sub-concepts and specific measures. Method: Analysis of qualitative data about “removal from stimuli” by content analysis according to Mayring. Results: A theoretical description and definition of the concept could be achieved. In addition, sub-concepts (removal from stimuli, modulation of environmental factors, help somebody to help him-/herself) could be identified, theoretically defined and complemented by possible specific measures. Conclusions: The conceptual descriptions provide a further step toward raising professionals’ awareness of the subject area. Furthermore, we created a theoretical basis for further empirical studies.

  15. Theoretical Perspectives Guiding QOL Indicator Projects

    ERIC Educational Resources Information Center

    Sirgy, M. Joseph

    2011-01-01

    Most of the theoretically based QOL indicators projects can be classified in terms of six major theoretical concepts: (a) socio-economic development, (b) personal utility, (c) just society, (d) human development, (e) sustainability, and (f) functioning. I explain the core aspects of these six theoretical paradigms and show how they help guide QOL…

  16. Defining epitope coverage requirements for T cell-based HIV vaccines: Theoretical considerations and practical applications

    PubMed Central

    2011-01-01

    metrics such as the minimum epitope count required to reach a desired level of coverage can be easily calculated. We propose that such analyses can be applied early in the planning stages and during the execution phase of a vaccine trial to explore theoretical and empirical suitability of a vaccine product to a particular epidemic setting. PMID:22152192

  17. Imaging the Material Properties of Bone Specimens using Reflection-Based Infrared Microspectroscopy

    PubMed Central

    Acerbo, Alvin S.; Carr, G. Lawrence; Judex, Stefan; Miller, Lisa M.

    2012-01-01

    Fourier Transform InfraRed Microspectroscopy (FTIRM) is a widely used method for mapping the material properties of bone and other mineralized tissues, including mineralization, crystallinity, carbonate substitution, and collagen cross-linking. This technique is traditionally performed in a transmission-based geometry, which requires the preparation of plastic-embedded thin sections, limiting its functionality. Here, we theoretically and empirically demonstrate the development of reflection-based FTIRM as an alternative to the widely adopted transmission-based FTIRM, which reduces specimen preparation time and broadens the range of specimens that can be imaged. In this study, mature mouse femurs were plastic-embedded and longitudinal sections were cut at a thickness of 4 μm for transmission-based FTIRM measurements. The remaining bone blocks were polished for specular reflectance-based FTIRM measurements on regions immediately adjacent to the transmission sections. Kramers-Kronig analysis of the reflectance data yielded the dielectric response from which the absorption coefficients were directly determined. The reflectance-derived absorbance was validated empirically using the transmission spectra from the thin sections. The spectral assignments for mineralization, carbonate substitution, and collagen cross-linking were indistinguishable in transmission and reflection geometries, while the stoichiometric/non-stoichiometric apatite crystallinity parameter shifted from 1032 / 1021 cm−1 in transmission-based to 1035 / 1025 cm−1 in reflection-based data. This theoretical demonstration and empirical validation of reflection-based FTIRM eliminates the need for thin sections of bone and more readily facilitates direct correlations with other methods such as nanoindentation and quantitative backscatter electron imaging (qBSE) from the same specimen. It provides a unique framework for correlating bone’s material and mechanical properties. PMID:22455306

  18. Theoretical investigation of the molecular structure of the isoquercitrin molecule

    NASA Astrophysics Data System (ADS)

    Cornard, J. P.; Boudet, A. C.; Merlin, J. C.

    1999-09-01

    Isoquercitrin is a glycosylated flavonoid that has received a great deal of attention because of its numerous biological effects. We present a theoretical study on isoquercitrin using both empirical (Molecular Mechanics (MM), with the MMX force field) and quantum chemical (AM1 semiempirical method) techniques. The most stable structures of the molecule obtained by MM calculations have been used as input data for the semiempirical treatment. The position and orientation of the glucose moiety with regard to the remainder of the molecule have been investigated. The flexibility of isoquercitrin principally lies in rotations around the inter-ring bond and the sugar link. In order to determine the structural modifications generated by substitution with a sugar, geometrical parameters of quercetin (the aglycon) and isoquercitrin have been compared. The good agreement between theoretical and experimental electronic spectra confirms the reliability of the structural model.

  19. Reflective equilibrium and empirical data: third person moral experiences in empirical medical ethics.

    PubMed

    De Vries, Martine; Van Leeuwen, Evert

    2010-11-01

    In ethics, the use of empirical data has become more and more popular, leading to a distinct form of applied ethics, namely empirical ethics. This 'empirical turn' is especially visible in bioethics. There are various ways of combining empirical research and ethical reflection. In this paper we discuss the use of empirical data in a special form of Reflective Equilibrium (RE), namely the Network Model with Third Person Moral Experiences. In this model, the empirical data consist of the moral experiences of people in a practice. Although inclusion of these moral experiences in this specific model of RE can be well defended, their use in the application of the model still raises important questions. What precisely are moral experiences? How to determine relevance of experiences, in other words: should there be a selection of the moral experiences that are eventually used in the RE? How much weight should the empirical data have in the RE? And the key question: can the use of RE by empirical ethicists really produce answers to practical moral questions? In this paper we start to answer the above questions by giving examples taken from our research project on understanding the norm of informed consent in the field of pediatric oncology. We especially emphasize that incorporation of empirical data in a network model can reduce the risk of self-justification and bias and can increase the credibility of the RE reached. © 2009 Blackwell Publishing Ltd.

  20. A Rigorous Test of the Fit of the Circumplex Model to Big Five Personality Data: Theoretical and Methodological Issues and Two Large Sample Empirical Tests.

    PubMed

    DeGeest, David Scott; Schmidt, Frank

    2015-01-01

    Our objective was to apply the rigorous test developed by Browne (1992) to determine whether the circumplex model fits Big Five personality data. This test has yet to be applied to personality data. Another objective was to determine whether blended items explained correlations among the Big Five traits. We used two working adult samples, the Eugene-Springfield Community Sample and the Professional Worker Career Experience Survey. Fit to the circumplex was tested via Browne's (1992) procedure. Circumplexes were graphed to identify items with loadings on multiple traits (blended items), and to determine whether removing these items changed five-factor model (FFM) trait intercorrelations. In both samples, the circumplex structure fit the FFM traits well. Each sample had items with dual-factor loadings (8 items in the first sample, 21 in the second). Removing blended items had little effect on construct-level intercorrelations among FFM traits. We conclude that rigorous tests show that the fit of personality data to the circumplex model is good. This finding means the circumplex model is competitive with the factor model in understanding the organization of personality traits. The circumplex structure also provides a theoretically and empirically sound rationale for evaluating intercorrelations among FFM traits. Even after eliminating blended items, FFM personality traits remained correlated.

  1. Research across the disciplines: a road map for quality criteria in empirical ethics research.

    PubMed

    Mertz, Marcel; Inthorn, Julia; Renz, Günter; Rothenberger, Lillian Geza; Salloch, Sabine; Schildmann, Jan; Wöhlke, Sabine; Schicktanz, Silke

    2014-03-01

    Research in the field of Empirical Ethics (EE) uses a broad variety of empirical methodologies, such as surveys, interviews and observation, developed in disciplines such as sociology, anthropology, and psychology. Whereas these empirical disciplines see themselves as purely descriptive, EE also aims at normative reflection. Currently there is literature about the quality of empirical research in ethics, but little or no reflection on specific methodological aspects that must be considered when conducting interdisciplinary empirical ethics. Furthermore, poor methodology in an EE study results in misleading ethical analyses, evaluations or recommendations. This not only deprives the study of scientific and social value, but also risks ethical misjudgement. While empirical and normative-ethical research projects have quality criteria in their own right, we focus on the specific quality criteria for EE research. We develop a tentative list of quality criteria--a "road map"--tailored to interdisciplinary research in EE, to guide assessments of research quality. These quality criteria fall into the categories of primary research question, theoretical framework and methods, relevance, interdisciplinary research practice and research ethics and scientific ethos. EE research is an important and innovative development in bioethics. However, a lack of standards has led to concerns about and even rejection of EE by various scholars. Our suggested orientation list of criteria, presented in the form of reflective questions, cannot be considered definitive, but serves as a tool to provoke systematic reflection during the planning and composition of an EE research study. These criteria need to be tested in different EE research settings and further refined.

  2. Thinking meta-theoretically about the role of internalization in the development of body dissatisfaction and body change behaviors.

    PubMed

    Karazsia, Bryan T; van Dulmen, Manfred H M; Wong, Kendal; Crowther, Janis H

    2013-09-01

    Internalization of societal standards of physical attractiveness (i.e., internalization of the thin ideal for women and internalization of the mesomorphic ideal for men) is a widely studied and robust risk factor for body dissatisfaction and maladaptive body change behaviors. Substantial empirical research supports internalization as both a mediator and a moderator of the relation between societal influences and body dissatisfaction. In this paper, a primer on mediation and moderation is followed by a review of literature and discussion of the extent to which internalization can theoretically fulfill the roles of both mediation and moderation. The literature review revealed a stark contrast in research design (experimental versus non-experimental design) when alternate conceptualizations of internalization are adopted. A meta-theoretical, moderated mediation model is presented. This model integrates previous research and can inform future empirical and clinical endeavors. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Symbiotic empirical ethics: a practical methodology.

    PubMed

    Frith, Lucy

    2012-05-01

    Like any discipline, bioethics is a developing field of academic inquiry; and recent trends in scholarship have been towards more engagement with empirical research. This 'empirical turn' has provoked extensive debate over how such 'descriptive' research carried out in the social sciences contributes to the distinctively normative aspect of bioethics. This paper will address this issue by developing a practical research methodology for the inclusion of data from social science studies into ethical deliberation. This methodology will be based on a naturalistic conception of ethical theory that sees practice as informing theory just as theory informs practice - the two are symbiotically related. From this engagement with practice, the ways that such theories need to be extended and developed can be determined. This is a practical methodology for integrating theory and practice that can be used in empirical studies, one that uses ethical theory both to explore the data and to draw normative conclusions. © 2010 Blackwell Publishing Ltd.

  4. Empirically Founded Teaching in Psychology--An Example for the Combination of Evidence-Based Teaching and the Scholarship of Teaching and Learning

    ERIC Educational Resources Information Center

    Boser, Julia; Scherer, Sonja; Kuchta, Kathrin; Wenzel, S. Franziska C.; Horz, Holger

    2017-01-01

    To improve teaching in higher education, teachers in psychology are encouraged to use evidence-based teaching, that is, to apply empirical findings regarding learning and teaching, when designing learning opportunities. This report illustrates the combination of evidence-based teaching and the Scholarship of Teaching and Learning in teaching…

  5. Written institutional ethics policies on euthanasia: an empirical-based organizational-ethical framework.

    PubMed

    Lemiengre, Joke; Dierckx de Casterlé, Bernadette; Schotsmans, Paul; Gastmans, Chris

    2014-05-01

    As euthanasia has become a widely debated issue in many Western countries, hospitals and nursing homes especially are increasingly being confronted with this ethically sensitive societal issue. The focus of this paper is how healthcare institutions can deal with euthanasia requests on an organizational level by means of a written institutional ethics policy. The general aim is to make a critical analysis of whether these policies can be considered as organizational-ethical instruments that support healthcare institutions to take their institutional responsibility for dealing with euthanasia requests. By means of an interpretative analysis, we conducted a process of reinterpretation of results of former Belgian empirical studies on written institutional ethics policies on euthanasia in dialogue with the existing international literature. The study findings revealed that legal regulations, ethical and care-oriented aspects strongly affected the development, the content, and the impact of written institutional ethics policies on euthanasia. Hence, these three cornerstones (law, care and ethics) constituted the basis for the empirical-based organizational-ethical framework for written institutional ethics policies on euthanasia that is presented in this paper. However, having a euthanasia policy does not automatically lead to more legal transparency, or to a more professional and ethical care practice. The study findings suggest that the development and implementation of an ethics policy on euthanasia as an organizational-ethical instrument should be considered as a dynamic process. Administrators and ethics committees must take responsibility to actively create an ethical climate supporting care providers who have to deal with ethical dilemmas in their practice.

  6. The implementation of mindfulness in healthcare systems: a theoretical analysis.

    PubMed

    Demarzo, M M P; Cebolla, A; Garcia-Campayo, J

    2015-01-01

    Evidence regarding the efficacy of mindfulness-based interventions (MBIs) is increasing exponentially; however, there are still challenges to their integration in healthcare systems. Our goal is to provide a conceptual framework that addresses these challenges in order to bring about scholarly dialog and support health managers and practitioners with the implementation of MBIs in healthcare. This is an opinion-based narrative review drawing on theoretical and empirical data that address key issues in the implementation of mindfulness in healthcare systems, such as the training of professionals, funding and costs of interventions, cost effectiveness and innovative delivery models. We show that even in the United Kingdom, where mindfulness has a high level of implementation, there is high variability in access to MBIs. In addition, we discuss innovative approaches based on "complex interventions," "stepped-care" and "low intensity-high volume" concepts that may prove fruitful in the development and implementation of MBIs in national healthcare systems, particularly in Primary Care. In order to better understand barriers and opportunities for mindfulness implementation in healthcare systems, it is necessary to be aware that MBIs are "complex interventions," which require innovative approaches and delivery models to implement these interventions in a cost-effective and accessible way. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. Framing curriculum discursively: theoretical perspectives on the experience of VCE physics

    NASA Astrophysics Data System (ADS)

    Hart, Christina

    2002-10-01

    The process of developing prescribed curricula has been subject to little empirical investigation, and there have been few attempts to develop theoretical frameworks for understanding the shape and content of particular subjects. This paper presents an account of the author's experience of developing a new course for school physics in the State of Victoria, Australia, at the end of the 1980s. The course was to represent a significant departure from traditional physics courses, and was intended to broaden participation and improve the quality of student learning. In the event the new course turned out to be very similar to traditional courses in Physics. The paper explores the reasons for this outcome. Some powerful discursive mechanisms are identified and some implications of post-structuralism for the theoretical understanding of curriculum are discussed.

  8. Entering the Historical Problem Space: Whole-Class Text-Based Discussion in History Class

    ERIC Educational Resources Information Center

    Reisman, Abby

    2015-01-01

    Background/Context: The Common Core State Standards Initiative reveals how little we understand about the components of effective discussion-based instruction in disciplinary history. Although the case for classroom discussion as a core method for subject matter learning stands on stable theoretical and empirical ground, to date, none of the…

  9. Empirically-Derived, Person-Oriented Patterns of School Readiness in Typically-Developing Children: Description and Prediction to First-Grade Achievement

    ERIC Educational Resources Information Center

    Konold, Timothy R.; Pianta, Robert C.

    2005-01-01

    School readiness assessment is a prominent feature of early childhood education. Because the construct of readiness is multifaceted, we examined children's patterns on multiple indicators previously found to be both theoretically and empirically linked to school readiness: social skill, interactions with parents, problem behavior, and performance…

  10. Rock Fracture Toughness Under Mode II Loading: A Theoretical Model Based on Local Strain Energy Density

    NASA Astrophysics Data System (ADS)

    Rashidi Moghaddam, M.; Ayatollahi, M. R.; Berto, F.

    2018-01-01

    The values of mode II fracture toughness reported in the literature for several rocks are studied theoretically using a modified criterion based on the strain energy density averaged over a control volume around the crack tip. The modified criterion takes into account the effect of T-stress in addition to the singular terms of stresses/strains. The experimental results are related to mode II fracture tests performed on the semicircular bend and Brazilian disk specimens. There is good agreement between theoretical predictions using the generalized averaged strain energy density criterion and the experimental results. The theoretical results reveal that the value of mode II fracture toughness is affected by the size of the control volume around the crack tip and also by the magnitude and sign of the T-stress.
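
The averaged strain energy density (ASED) approach the abstract builds on is commonly written as follows; this is the standard form from the ASED literature, not the paper's T-stress-modified version:

```latex
\bar{W} = \frac{1}{V(R_c)} \int_{V(R_c)} W(\varepsilon_{ij})\,\mathrm{d}V,
\qquad \text{failure when } \bar{W} \ge W_c = \frac{\sigma_t^2}{2E},
```

where $V(R_c)$ is a control volume of radius $R_c$ centered at the crack tip, $\sigma_t$ is the material tensile strength and $E$ is Young's modulus. The modification described in the abstract amounts to including the $T$-stress contribution in the stress field used to evaluate $W$ inside the control volume, which is why both $R_c$ and the sign of $T$ influence the predicted mode II toughness.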

  11. Empirical validation of an agent-based model of wood markets in Switzerland

    PubMed Central

    Hilty, Lorenz M.; Lemm, Renato; Thees, Oliver

    2018-01-01

    We present an agent-based model of wood markets and show our efforts to validate this model using empirical data from different sources, including interviews, workshops, experiments, and official statistics. Our own surveys closed gaps where data were not available. Our approach to model validation used a variety of techniques, including the replication of historical production amounts, prices, and survey results, as well as a historical case study of a large sawmill entering the market and becoming insolvent only a few years later. Validating the model using this case provided additional insights, showing how the model can be used to simulate scenarios of resource availability and resource allocation. We conclude that the outcome of the rigorous validation qualifies the model to simulate scenarios concerning resource availability and allocation in our study region. PMID:29351300

  12. Decoding the "CoDe": A Framework for Conceptualizing and Designing Help Options in Computer-Based Second Language Listening

    ERIC Educational Resources Information Center

    Cardenas-Claros, Monica Stella; Gruba, Paul A.

    2013-01-01

    This paper proposes a theoretical framework for the conceptualization and design of help options in computer-based second language (L2) listening. Based on four empirical studies, it aims at clarifying both conceptualization and design (CoDe) components. The elements of conceptualization consist of a novel four-part classification of help options:…

  13. Congested traffic states in empirical observations and microscopic simulations

    NASA Astrophysics Data System (ADS)

    Treiber, Martin; Hennecke, Ansgar; Helbing, Dirk

    2000-08-01

    We present data from several German freeways showing different kinds of congested traffic forming near road inhomogeneities, specifically lane closings, intersections, or uphill gradients. The states are localized or extended, homogeneous or oscillating. Combined states are observed as well, like the coexistence of moving localized clusters and clusters pinned at road inhomogeneities, or regions of oscillating congested traffic upstream of nearly homogeneous congested traffic. The experimental findings are consistent with a recently proposed theoretical phase diagram for traffic near on-ramps [D. Helbing, A. Hennecke, and M. Treiber, Phys. Rev. Lett. 82, 4360 (1999)]. We simulate these situations with a continuous microscopic single-lane model, the "intelligent driver model," using empirical boundary conditions. All observations, including the coexistence of states, are qualitatively reproduced by describing inhomogeneities with local variations of one model parameter. We show that the results of the microscopic model can be understood by formulating the theoretical phase diagram for bottlenecks in a more general way. In particular, a local drop of the road capacity induced by parameter variations has essentially the same effect as an on-ramp.
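
The intelligent driver model named in this abstract has a compact closed form; the sketch below uses the commonly quoted highway parameter set (with the jam-distance term s1 set to zero, as in the typical calibration) for illustration:

```python
import math

def idm_acceleration(v, delta_v, gap,
                     v0=33.3, T=1.6, a=0.73, b=1.67, s0=2.0, delta=4):
    """Intelligent driver model (IDM) acceleration (m/s^2).

    v: own speed (m/s); delta_v: approach rate to the leader (m/s);
    gap: bumper-to-bumper distance to the leader (m).
    a(v, s, dv) = a_max * [1 - (v/v0)^delta - (s*/s)^2]
    """
    # Desired dynamical gap s*: jam distance + time-headway term
    # + intelligent braking term.
    s_star = s0 + v * T + v * delta_v / (2 * math.sqrt(a * b))
    return a * (1 - (v / v0) ** delta - (s_star / gap) ** 2)

# A fast car closing in on a slower leader brakes hard:
print(idm_acceleration(v=30.0, delta_v=5.0, gap=40.0))
# On a nearly free road at low speed, it accelerates:
print(idm_acceleration(v=10.0, delta_v=0.0, gap=1000.0))
```

The single-parameter description of bottlenecks mentioned in the abstract corresponds to letting one of these parameters (e.g. the desired speed v0) vary locally along the road.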

  14. Achilles tendons from decorin- and biglycan-null mouse models have inferior mechanical and structural properties predicted by an image-based empirical damage model

    PubMed Central

    Gordon, J.A.; Freedman, B.R.; Zuskov, A.; Iozzo, R.V.; Birk, D.E.; Soslowsky, L.J.

    2015-01-01

    Achilles tendons are a common source of pain and injury, and their pathology may originate from aberrant structure-function relationships. Small leucine rich proteoglycans (SLRPs) influence mechanical and structural properties in a tendon-specific manner. However, their roles in the Achilles tendon have not been defined. The objective of this study was to evaluate the mechanical and structural differences observed in mouse Achilles tendons lacking class I SLRPs; either decorin or biglycan. In addition, empirical modeling techniques based on mechanical and image-based measures were employed. Achilles tendons from decorin-null (Dcn−/−) and biglycan-null (Bgn−/−) C57BL/6 female mice (N=102) were used. Each tendon underwent a dynamic mechanical testing protocol including simultaneous polarized light image capture to evaluate both structural and mechanical properties of each Achilles tendon. An empirical damage model was adapted for application to genetic variation and for use with image-based structural properties to predict tendon dynamic mechanical properties. We found that Achilles tendons lacking decorin and biglycan had inferior mechanical and structural properties that were age dependent; and that simple empirical models, based on previously described damage models, were predictive of Achilles tendon dynamic modulus in both decorin- and biglycan-null mice. PMID:25888014

  15. Achilles tendons from decorin- and biglycan-null mouse models have inferior mechanical and structural properties predicted by an image-based empirical damage model.

    PubMed

    Gordon, J A; Freedman, B R; Zuskov, A; Iozzo, R V; Birk, D E; Soslowsky, L J

    2015-07-16

    Achilles tendons are a common source of pain and injury, and their pathology may originate from aberrant structure-function relationships. Small leucine rich proteoglycans (SLRPs) influence mechanical and structural properties in a tendon-specific manner. However, their roles in the Achilles tendon have not been defined. The objective of this study was to evaluate the mechanical and structural differences observed in mouse Achilles tendons lacking class I SLRPs; either decorin or biglycan. In addition, empirical modeling techniques based on mechanical and image-based measures were employed. Achilles tendons from decorin-null (Dcn(-/-)) and biglycan-null (Bgn(-/-)) C57BL/6 female mice (N=102) were used. Each tendon underwent a dynamic mechanical testing protocol including simultaneous polarized light image capture to evaluate both structural and mechanical properties of each Achilles tendon. An empirical damage model was adapted for application to genetic variation and for use with image-based structural properties to predict tendon dynamic mechanical properties. We found that Achilles tendons lacking decorin and biglycan had inferior mechanical and structural properties that were age dependent; and that simple empirical models, based on previously described damage models, were predictive of Achilles tendon dynamic modulus in both decorin- and biglycan-null mice. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Signal enhancement based on complex curvelet transform and complementary ensemble empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Dong, Lieqian; Wang, Deying; Zhang, Yimeng; Zhou, Datong

    2017-09-01

    Signal enhancement is a necessary step in seismic data processing. In this paper we utilize the complementary ensemble empirical mode decomposition (CEEMD) and complex curvelet transform (CCT) methods to separate signal from random noise and thereby improve the signal-to-noise (S/N) ratio. First, the original noisy data are decomposed into a series of intrinsic mode function (IMF) profiles with the aid of CEEMD. The noisy IMFs are then transformed into the CCT domain. By choosing different thresholds, based on the noise level of each IMF profile, the noise in the original data can be suppressed. Finally, we illustrate the effectiveness of the approach on simulated and field datasets.
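    The workflow in this abstract (decompose, threshold each component by its own noise level, reconstruct) can be sketched with stand-ins: a fixed FFT band split plays the role of the CEEMD modes and of the CCT domain, and the per-band threshold is a simple median-based rule. All signals and parameters here are invented for illustration; this is not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 1 s trace sampled at 1024 Hz: two band-limited components
# standing in for signal, plus white noise.
fs = 1024
t = np.arange(fs) / fs
clean = np.sin(2 * np.pi * 12 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)
noisy = clean + rng.normal(0, 0.5, t.size)

# Stand-in decomposition: fixed spectral bands (high to low frequency),
# playing the role of the CEEMD intrinsic mode functions.
bands = [(256, 512), (64, 256), (16, 64), (4, 16), (0, 4)]

X = np.fft.rfft(noisy)
freqs = np.fft.rfftfreq(noisy.size, d=1 / fs)
X_denoised = np.zeros_like(X)
for lo, hi in bands:
    sel = (freqs >= lo) & (freqs < hi)
    coeffs = X[sel]
    # Band-dependent threshold from the median coefficient magnitude,
    # echoing the per-IMF noise-level thresholds in the abstract.
    thr = 3.0 * np.median(np.abs(coeffs))
    X_denoised[sel] = np.where(np.abs(coeffs) > thr, coeffs, 0)

denoised = np.fft.irfft(X_denoised, n=noisy.size)
err_noisy = np.mean((noisy - clean) ** 2)
err_denoised = np.mean((denoised - clean) ** 2)
```

    The band-wise thresholds keep the few large signal-bearing coefficients while zeroing the many small noise coefficients, which is the same intuition behind thresholding curvelet coefficients per IMF.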

  17. An empirical and model study on automobile market in Taiwan

    NASA Astrophysics Data System (ADS)

    Tang, Ji-Ying; Qiu, Rong; Zhou, Yueping; He, Da-Ren

    2006-03-01

    We have carried out an empirical investigation of the automobile market in Taiwan, including the development of the possession rate of the companies in the market from 1979 to 2003, the development of the largest possession rate, and so on. Based on the empirical study, a dynamic model for describing the competition between the companies is suggested. In the model each company is given a long-term competition factor (such as technology, capital and scale) and a short-term competition factor (such as management, service and advertisement). The companies then play games in order to obtain a larger possession rate in the market under certain rules. Numerical simulations based on the model display a developing competition process that agrees qualitatively and quantitatively with our empirical investigation results.
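    A minimal simulation in the spirit of the described model can be sketched as follows. The factor ranges, mixing rate, and replicator-style update rule are our assumptions for illustration, not the authors' actual equations:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical parameters: each company gets a fixed long-term factor
# (technology/capital/scale) and a fluctuating short-term factor
# (management/service/advertising), as described in the abstract.
n_companies, n_years = 8, 25
long_term = rng.uniform(0.5, 1.5, n_companies)
share = np.full(n_companies, 1.0 / n_companies)  # equal initial shares

history = [share.copy()]
for _ in range(n_years):
    short_term = rng.uniform(0.8, 1.2, n_companies)
    # Attractiveness combines both factors; shares relax toward the
    # attractiveness-weighted distribution (a simple replicator-style rule).
    attract = long_term * short_term
    target = share * attract
    target /= target.sum()
    share = 0.7 * share + 0.3 * target
    history.append(share.copy())

largest_share = share[np.argmax(long_term)]
```

    Under this rule the company with the strongest long-term factor tends to accumulate possession rate over the years, while short-term noise produces fluctuations around that trend.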

  18. Empirical and targeted therapy of candidemia with fluconazole versus echinocandins: a propensity score-derived analysis of a population-based, multicentre prospective cohort.

    PubMed

    López-Cortés, L E; Almirante, B; Cuenca-Estrella, M; Garnacho-Montero, J; Padilla, B; Puig-Asensio, M; Ruiz-Camps, I; Rodríguez-Baño, J

    2016-08-01

    We compared the clinical efficacy of fluconazole and echinocandins in the treatment of candidemia in real practice. The CANDIPOP study is a prospective, population-based cohort study on candidemia carried out between May 2010 and April 2011 in 29 Spanish hospitals. Using strict inclusion criteria, we separately compared the impact of empirical and targeted therapy with fluconazole or echinocandins on 30-day mortality. Cox regression, including a propensity score (PS) for receiving echinocandins, stratified analysis on the PS quartiles and PS-based matched analyses, were performed. The empirical and targeted therapy cohorts comprised 316 and 421 cases, respectively; 30-day mortality was 18.7% with fluconazole and 33.9% with echinocandins (p 0.02) in the empirical therapy group and 19.8% with fluconazole and 27.7% with echinocandins (p 0.06) in the targeted therapy group. Multivariate Cox regression analysis including PS showed that empirical therapy with fluconazole was associated with better prognosis (adjusted hazard ratio 0.38; 95% confidence interval 0.17-0.81; p 0.01); no differences were found within each PS quartile or in cases matched according to PS. Targeted therapy with fluconazole did not show a significant association with mortality in the Cox regression analysis (adjusted hazard ratio 0.77; 95% confidence interval 0.41-1.46; p 0.63), in the PS quartiles or in PS-matched cases. The results were similar among patients with severe sepsis and septic shock. Empirical or targeted treatment with fluconazole was not associated with increased 30-day mortality compared to echinocandins among adults with candidemia. Copyright © 2016 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.
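    The propensity-score machinery referenced in this record can be sketched on simulated data. The covariates, coefficients, and strata below are illustrative only; this is not the CANDIPOP analysis:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated cohort: two covariates influence the probability of
# receiving the treatment of interest (e.g. echinocandins).
n = 2000
severity = rng.normal(0, 1, n)
age = rng.normal(0, 1, n)
logit_treat = 0.8 * severity + 0.3 * age
treated = rng.random(n) < 1 / (1 + np.exp(-logit_treat))

# Fit a logistic regression for the propensity score by Newton-Raphson.
X = np.column_stack([np.ones(n), severity, age])
beta = np.zeros(3)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    grad = X.T @ (treated - p)
    hess = X.T @ (X * (p * (1 - p))[:, None])
    beta += np.linalg.solve(hess, grad)

ps = 1 / (1 + np.exp(-X @ beta))

# Stratify on PS quartiles, as in the abstract's stratified analysis,
# and check covariate balance within strata.
quartiles = np.quantile(ps, [0.25, 0.5, 0.75])
strata = np.digitize(ps, quartiles)
overall_gap = severity[treated].mean() - severity[~treated].mean()
within_gaps = [severity[treated & (strata == s)].mean()
               - severity[~treated & (strata == s)].mean() for s in range(4)]
```

    Conditioning on PS quartiles shrinks the treated-vs-untreated covariate gap, which is the balance property the stratified and matched analyses in the study rely on.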

  19. Kindness in Australia: an empirical critique of moral decline sociology.

    PubMed

    Habibis, Daphne; Hookway, Nicholas; Vreugdenhil, Anthea

    2016-09-01

    A new sociological agenda is emerging that interrogates how morality can be established in the absence of the moral certainties of the past but there is a shortage of empirical work on this topic. This article establishes a theoretical framework for the empirical analysis of everyday morality drawing on the work of theorists including Ahmed, Bauman and Taylor. It uses the Australian Survey of Social Attitudes to assess the state and shape of contemporary moralities by asking how kind are Australians, how is its expression socially distributed, and what are the motivations for kindness. The findings demonstrate that Australians exhibit a strong attachment and commitment to kindness as a moral value that is primarily motivated by interiorized sources of moral authority. We argue these findings support the work of theorists such as Ahmed and Taylor who argue authenticity and embodied emotion are legitimate sources of morality in today's secular societies. The research also provides new evidence that generational changes are shaping understandings and practices of kindness in unexpected ways. © London School of Economics and Political Science 2016.

  20. Bayesian model reduction and empirical Bayes for group (DCM) studies

    PubMed Central

    Friston, Karl J.; Litvak, Vladimir; Oswal, Ashwini; Razi, Adeel; Stephan, Klaas E.; van Wijk, Bernadette C.M.; Ziegler, Gabriel; Zeidman, Peter

    2016-01-01

    This technical note describes some Bayesian procedures for the analysis of group studies that use nonlinear models at the first (within-subject) level – e.g., dynamic causal models – and linear models at subsequent (between-subject) levels. Its focus is on using Bayesian model reduction to finesse the inversion of multiple models of a single dataset or a single (hierarchical or empirical Bayes) model of multiple datasets. These applications of Bayesian model reduction allow one to consider parametric random effects and make inferences about group effects very efficiently (in a few seconds). We provide the relatively straightforward theoretical background to these procedures and illustrate their application using a worked example. This example uses a simulated mismatch negativity study of schizophrenia. We illustrate the robustness of Bayesian model reduction to violations of the (commonly used) Laplace assumption in dynamic causal modelling and show how its recursive application can facilitate both classical and Bayesian inference about group differences. Finally, we consider the application of these empirical Bayesian procedures to classification and prediction. PMID:26569570

  1. Bayesian model reduction and empirical Bayes for group (DCM) studies.

    PubMed

    Friston, Karl J; Litvak, Vladimir; Oswal, Ashwini; Razi, Adeel; Stephan, Klaas E; van Wijk, Bernadette C M; Ziegler, Gabriel; Zeidman, Peter

    2016-03-01

    This technical note describes some Bayesian procedures for the analysis of group studies that use nonlinear models at the first (within-subject) level - e.g., dynamic causal models - and linear models at subsequent (between-subject) levels. Its focus is on using Bayesian model reduction to finesse the inversion of multiple models of a single dataset or a single (hierarchical or empirical Bayes) model of multiple datasets. These applications of Bayesian model reduction allow one to consider parametric random effects and make inferences about group effects very efficiently (in a few seconds). We provide the relatively straightforward theoretical background to these procedures and illustrate their application using a worked example. This example uses a simulated mismatch negativity study of schizophrenia. We illustrate the robustness of Bayesian model reduction to violations of the (commonly used) Laplace assumption in dynamic causal modelling and show how its recursive application can facilitate both classical and Bayesian inference about group differences. Finally, we consider the application of these empirical Bayesian procedures to classification and prediction. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
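    For the Gaussian case, the Bayesian model reduction step described here has a closed form: the evidence ratio is the posterior expectation of the reduced-to-full prior ratio. The sketch below uses our own notation and invented example values (it is not SPM code):

```python
import numpy as np

def log_evidence_ratio(mu, C, eta, C0, eta_r, C0_r):
    """Change in log model evidence when a full Gaussian prior N(eta, C0)
    is replaced by a reduced prior N(eta_r, C0_r), given only the full
    posterior N(mu, C) -- the Gaussian identity behind Bayesian model
    reduction (our derivation; a sketch, not a reference implementation)."""
    P, P0, P0r = np.linalg.inv(C), np.linalg.inv(C0), np.linalg.inv(C0_r)
    Pr = P + P0r - P0                               # reduced posterior precision
    mu_r = np.linalg.solve(Pr, P @ mu + P0r @ eta_r - P0 @ eta)

    def logdet(A):
        return np.linalg.slogdet(A)[1]

    quad = (mu_r @ Pr @ mu_r - mu @ P @ mu
            - eta_r @ P0r @ eta_r + eta @ P0 @ eta)
    return 0.5 * (logdet(P) + logdet(P0r) - logdet(P0) - logdet(Pr) + quad)

# Example: prune the second of two parameters by giving it a very tight
# reduced prior centred on zero (parameter "switched off").
mu = np.array([0.8, 0.05])
C = np.diag([0.1, 0.1])
eta, C0 = np.zeros(2), np.eye(2)
eta_r, C0_r = np.zeros(2), np.diag([1.0, 1e-4])
delta_f = log_evidence_ratio(mu, C, eta, C0, eta_r, C0_r)
```

    A positive `delta_f` favors the reduced model; because only matrix algebra on the posterior is needed, scoring many reduced models is very fast, which is what makes the "few seconds" group analyses in the note possible.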

  2. Satellite, climatological, and theoretical inputs for modeling of the diurnal cycle of fire emissions

    NASA Astrophysics Data System (ADS)

    Hyer, E. J.; Reid, J. S.; Schmidt, C. C.; Giglio, L.; Prins, E.

    2009-12-01

    The diurnal cycle of fire activity is crucial for accurate simulation of atmospheric effects of fire emissions, especially at finer spatial and temporal scales. Estimating diurnal variability in emissions is also a critical problem for construction of emissions estimates from multiple sensors with variable coverage patterns. An optimal diurnal emissions estimate will use as much information as possible from satellite fire observations, compensate for known biases in those observations, and use detailed theoretical models of the diurnal cycle to fill in missing information. As part of ongoing improvements to the Fire Location and Monitoring of Burning Emissions (FLAMBE) fire monitoring system, we evaluated several different methods of integrating observations with different temporal sampling. We used geostationary fire detections from WF_ABBA, fire detection data from MODIS, empirical diurnal cycles from TRMM, and simple theoretical diurnal curves based on surface heating. Our experiments integrated these data in different combinations to estimate the diurnal cycles of emissions for each location and time. Hourly emissions estimates derived using these methods were tested using an aerosol transport model. We present results of this comparison, and discuss the implications of our results for the broader problem of multi-sensor data fusion in fire emissions modeling.
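    A "simple theoretical diurnal curve based on surface heating" of the kind mentioned can be sketched as a normalized set of hourly weights that redistribute a daily emission total. The peak hour and width below are assumptions for illustration, not the FLAMBE system's actual curve:

```python
import numpy as np

hours = np.arange(24)
# Surface-heating proxy (an assumption): fire activity peaks in the
# early afternoon local time, with a broad Gaussian shoulder.
peak_hour, width = 14.0, 3.0
weights = np.exp(-0.5 * ((hours - peak_hour) / width) ** 2)
weights /= weights.sum()          # hourly fractions of the daily total

daily_emission = 120.0            # arbitrary units of a daily total
hourly_emission = daily_emission * weights
```

    Normalizing the weights guarantees the hourly series conserves the daily total, so a curve like this can fill gaps between satellite overpasses without changing the emission budget.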

  3. A comparison of measured and theoretical predictions for STS ascent and entry sonic booms

    NASA Technical Reports Server (NTRS)

    Garcia, F., Jr.; Jones, J. H.; Henderson, H. R.

    1983-01-01

    Sonic boom measurements have been obtained during the flights of STS-1 through 5. During STS-1, 2, and 4, entry sonic boom measurements were obtained, and ascent measurements were made on STS-5. The objectives of this measurement program were (1) to define the sonic boom characteristics of the Space Transportation System (STS), (2) to provide a realistic assessment of the validity of existing theoretical prediction techniques, and (3) to establish a level of confidence for predicting future STS configuration sonic boom environments. Detailed evaluation and reporting of the results of this program are in progress. This paper addresses only the significant results, mainly those data obtained during the entry of STS-1 at Edwards Air Force Base (EAFB) and the ascent of STS-5 from Kennedy Space Center (KSC). The theoretical prediction technique employed in this analysis is the so-called Thomas program, a semi-empirical method that requires definition of the near-field signatures, detailed trajectory characteristics, and the prevailing meteorological conditions as inputs. This analytical procedure then extrapolates the near-field signatures from the flight altitude to an altitude consistent with each measurement location.

  4. Theoretical resources for a globalised bioethics.

    PubMed

    Verkerk, Marian A; Lindemann, Hilde

    2011-02-01

    In an age of global capitalism, pandemics, far-flung biobanks, multinational drug trials and telemedicine it is impossible for bioethicists to ignore the global dimensions of their field. However, if they are to do good work on the issues that globalisation requires of them, they need theoretical resources that are up to the task. This paper identifies four distinct understandings of 'globalised' in the bioethics literature: (1) a focus on global issues; (2) an attempt to develop a universal ethical theory that can transcend cultural differences; (3) an awareness of how bioethics itself has expanded, with new centres and journals emerging in nearly every corner of the globe; (4) a concern to avoid cultural imperialism in encounters with other societies. Each of these approaches to globalisation has some merit, as will be shown. The difficulty with them is that the standard theoretical tools on which they rely are not designed for cross-cultural ethical reflection. As a result, they leave important considerations hidden. A set of theoretical resources is proposed to deal with the moral puzzles of globalisation. Abandoning idealised moral theory, a normative framework is developed that is sensitive enough to account for differences without losing the broader context in which ethical issues arise. An empirically nourished, self-reflexive, socially inquisitive, politically critical and inclusive ethics allows bioethicists the flexibility they need to pick up on the morally relevant particulars of this situation here without losing sight of the broader cultural contexts in which it all takes place.

  5. Secondary use of empirical research data in medical ethics papers on gamete donation: forms of use and pitfalls.

    PubMed

    Provoost, Veerle

    2015-03-01

    This paper aims to provide a description of how authors publishing in medical ethics journals have made use of empirical research data in papers on the topic of gamete or embryo donation by means of references to studies conducted by others (secondary use). Rather than making a direct contribution to the theoretical methodological literature about the role empirical research data could play or should play in ethics studies, the focus is on the particular uses of these data and the problems that can be encountered with this use. In the selection of papers examined, apart from being used to describe the context, empirical evidence was mainly used to recount problems that needed solving. Few of the authors looked critically at the quality of the studies they quoted, and several instances were found of empirical data being used poorly or inappropriately. This study provides some initial baseline evidence that shows empirical data, in the form of references to studies, are sometimes being used in inappropriate ways. This suggests that medical ethicists should be more concerned about the quality of the empirical data selected, the appropriateness of the choice for a particular type of data (from a particular type of study) and the correct integration of this evidence in sound argumentation. Given that empirical data can be misused also when merely cited instead of reported, it may be worthwhile to explore good practice requirements for this type of use of empirical data in medical ethics.

  6. Development and evaluation of consensus-based sediment effect concentrations for polychlorinated biphenyls

    USGS Publications Warehouse

    MacDonald, Donald D.; Dipinto, Lisa M.; Field, Jay; Ingersoll, Christopher G.; Long, Edward R.; Swartz, Richard C.

    2000-01-01

    Sediment-quality guidelines (SQGs) have been published for polychlorinated biphenyls (PCBs) using both empirical and theoretical approaches. Empirically based guidelines have been developed using the screening-level concentration, effects range, effects level, and apparent effects threshold approaches. Theoretically based guidelines have been developed using the equilibrium-partitioning approach. Empirically-based guidelines were classified into three general categories, in accordance with their original narrative intents, and used to develop three consensus-based sediment effect concentrations (SECs) for total PCBs (tPCBs), including a threshold effect concentration, a midrange effect concentration, and an extreme effect concentration. Consensus-based SECs were derived because they estimate the central tendency of the published SQGs and, thus, reconcile the guidance values that have been derived using various approaches. Initially, consensus-based SECs for tPCBs were developed separately for freshwater sediments and for marine and estuarine sediments. Because the respective SECs were statistically similar, the underlying SQGs were subsequently merged and used to formulate more generally applicable SECs. The three consensus-based SECs were then evaluated for reliability using matching sediment chemistry and toxicity data from field studies, dose-response data from spiked-sediment toxicity tests, and SQGs derived from the equilibrium-partitioning approach. The results of this evaluation demonstrated that the consensus-based SECs can accurately predict both the presence and absence of toxicity in field-collected sediments. Importantly, the incidence of toxicity increases incrementally with increasing concentrations of tPCBs. Moreover, the consensus-based SECs are comparable to the chronic toxicity thresholds that have been estimated from dose-response data and equilibrium-partitioning models. Therefore, consensus-based SECs provide a unifying synthesis of existing
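    Estimating "the central tendency of the published SQGs" within a narrative category is commonly done with a geometric mean. A hedged sketch with invented guideline values (the actual tPCB values are in MacDonald et al.'s paper, not reproduced here):

```python
import numpy as np

# Hypothetical published tPCB guideline values (e.g. ug/kg dry weight)
# grouped by narrative intent; purely illustrative numbers.
threshold_like = np.array([23, 32, 34, 60])     # TEC-type guidelines
midrange_like = np.array([240, 277, 370, 400])  # MEC-type guidelines

def consensus_sec(values):
    """Consensus sediment effect concentration as the geometric mean of
    guidelines sharing a narrative intent -- one standard way to estimate
    their central tendency."""
    return float(np.exp(np.mean(np.log(values))))

tec = consensus_sec(threshold_like)
mec = consensus_sec(midrange_like)
```

    The geometric mean damps the influence of any single outlying guideline, which is why it is a natural choice for reconciling SQGs derived by different approaches.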

  7. The Problem of Empirical Redundancy of Constructs in Organizational Research: An Empirical Investigation

    ERIC Educational Resources Information Center

    Le, Huy; Schmidt, Frank L.; Harter, James K.; Lauver, Kristy J.

    2010-01-01

    Construct empirical redundancy may be a major problem in organizational research today. In this paper, we explain and empirically illustrate a method for investigating this potential problem. We applied the method to examine the empirical redundancy of job satisfaction (JS) and organizational commitment (OC), two well-established organizational…
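    Investigations of construct empirical redundancy typically rest on the disattenuation formula: correct the observed correlation between two scale scores for measurement error using their reliabilities. The values below are illustrative only, not the paper's estimates:

```python
# Disattenuated ("true-score") correlation between two constructs,
# e.g. job satisfaction (JS) and organizational commitment (OC).
r_observed = 0.65   # observed JS-OC correlation (hypothetical)
rel_js = 0.85       # reliability of the JS measure (hypothetical)
rel_oc = 0.80       # reliability of the OC measure (hypothetical)

r_true = r_observed / (rel_js * rel_oc) ** 0.5
# If r_true approaches 1.0, the two constructs are empirically redundant.
```

    Since reliabilities are below 1, the corrected correlation is always larger than the observed one; redundancy is suggested when the corrected value is close to unity.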

  8. An Acceptance and Mindfulness-Based Approach to Social Phobia: A Case Study

    ERIC Educational Resources Information Center

    Brady, Victoria Popick; Whitman, Sarah M.

    2012-01-01

    Over the past few years, there has been a proliferation of theoretical discussions and empirical research on the use of acceptance and mindfulness-based therapies to treat anxiety disorders. Because these treatment approaches are in their infancy, many clinicians may still be uncertain about how to apply such treatments in their work with clients.…

  9. An all-atom structure-based potential for proteins: bridging minimal models with all-atom empirical forcefields.

    PubMed

    Whitford, Paul C; Noel, Jeffrey K; Gosavi, Shachi; Schug, Alexander; Sanbonmatsu, Kevin Y; Onuchic, José N

    2009-05-01

    Protein dynamics take place on many time and length scales. Coarse-grained structure-based (Go) models utilize the funneled energy landscape theory of protein folding to provide an understanding of both long time and long length scale dynamics. All-atom empirical forcefields with explicit solvent can elucidate our understanding of short time dynamics with high energetic and structural resolution. Thus, structure-based models with atomic details included can be used to bridge our understanding between these two approaches. We report on the robustness of folding mechanisms in one such all-atom model. Results for the B domain of Protein A, the SH3 domain of C-Src Kinase, and Chymotrypsin Inhibitor 2 are reported. The interplay between side chain packing and backbone folding is explored. We also compare this model to a C(alpha) structure-based model and an all-atom empirical forcefield. Key findings include: (1) backbone collapse is accompanied by partial side chain packing in a cooperative transition and residual side chain packing occurs gradually with decreasing temperature, (2) folding mechanisms are robust to variations of the energetic parameters, (3) protein folding free-energy barriers can be manipulated through parametric modifications, (4) the global folding mechanisms in a C(alpha) model and the all-atom model agree, although differences can be attributed to energetic heterogeneity in the all-atom model, and (5) proline residues have significant effects on folding mechanisms, independent of isomerization effects. Because this structure-based model has atomic resolution, this work lays the foundation for future studies to probe the contributions of specific energetic factors on protein folding and function.
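    Structure-based (Gō) potentials of the kind discussed here commonly use a 12-10 Lennard-Jones form for native contacts. A sketch of that conventional form (the authors' all-atom model may use a different parameterization):

```python
def go_contact_energy(r, r0, eps=1.0):
    """12-10 Lennard-Jones-style native-contact potential, a common choice
    in structure-based (Go) models: minimum of depth -eps at r = r0,
    repulsive at shorter range, and decaying to zero at long range."""
    x = r0 / r
    return eps * (5 * x**12 - 6 * x**10)

# Energies around a native contact distance of 1.2 (arbitrary units):
at_native = go_contact_energy(1.2, 1.2)    # exactly -eps
compressed = go_contact_energy(1.0, 1.2)   # repulsive (positive)
stretched = go_contact_energy(2.4, 1.2)    # weakly attractive tail
```

    Because each native contact contributes a funnel-shaped term like this, the total energy is minimized at the experimental structure, which is the defining feature of Gō models.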

  10. An All-atom Structure-Based Potential for Proteins: Bridging Minimal Models with All-atom Empirical Forcefields

    PubMed Central

    Whitford, Paul C.; Noel, Jeffrey K.; Gosavi, Shachi; Schug, Alexander; Sanbonmatsu, Kevin Y.; Onuchic, José N.

    2012-01-01

    Protein dynamics take place on many time and length scales. Coarse-grained structure-based (Gō) models utilize the funneled energy landscape theory of protein folding to provide an understanding of both long time and long length scale dynamics. All-atom empirical forcefields with explicit solvent can elucidate our understanding of short time dynamics with high energetic and structural resolution. Thus, structure-based models with atomic details included can be used to bridge our understanding between these two approaches. We report on the robustness of folding mechanisms in one such all-atom model. Results for the B domain of Protein A, the SH3 domain of C-Src Kinase and Chymotrypsin Inhibitor 2 are reported. The interplay between side chain packing and backbone folding is explored. We also compare this model to a Cα structure-based model and an all-atom empirical forcefield. Key findings include: (1) backbone collapse is accompanied by partial side chain packing in a cooperative transition and residual side chain packing occurs gradually with decreasing temperature; (2) folding mechanisms are robust to variations of the energetic parameters; (3) protein folding free energy barriers can be manipulated through parametric modifications; (4) the global folding mechanisms in a Cα model and the all-atom model agree, although differences can be attributed to energetic heterogeneity in the all-atom model; and (5) proline residues have significant effects on folding mechanisms, independent of isomerization effects. Since this structure-based model has atomic resolution, this work lays the foundation for future studies to probe the contributions of specific energetic factors on protein folding and function. PMID:18837035

  11. Comparison of ensemble post-processing approaches, based on empirical and dynamical error modelisation of rainfall-runoff model forecasts

    NASA Astrophysics Data System (ADS)

    Chardon, J.; Mathevet, T.; Le Lay, M.; Gailhard, J.

    2012-04-01

    In the context of a national energy company (EDF: Electricité de France), hydro-meteorological forecasts are necessary to ensure the safety and security of installations, meet environmental standards and improve water resources management and decision making. Hydrological ensemble forecasts allow a better representation of meteorological and hydrological forecast uncertainties and improve human expertise of hydrological forecasts, which is essential to synthesize available information coming from different meteorological and hydrological models and human experience. An operational hydrological ensemble forecasting chain has been developed at EDF since 2008 and has been used since 2010 on more than 30 watersheds in France. This ensemble forecasting chain is characterized by ensemble pre-processing (rainfall and temperature) and post-processing (streamflow), where a large human expertise is solicited. The aim of this paper is to compare two hydrological ensemble post-processing methods developed at EDF in order to improve ensemble forecast reliability (similar to Montanari & Brath, 2004; Schaefli et al., 2007). The aim of the post-processing methods is to dress hydrological ensemble forecasts with hydrological model uncertainties, based on perfect forecasts. The first method (called the empirical approach) is based on a statistical modeling of the empirical error of perfect forecasts, by streamflow sub-samples of quantile class and lead time. The second method (called the dynamical approach) is based on streamflow sub-samples of quantile class, streamflow variation, and lead time. On a set of 20 watersheds used for operational forecasts, results show that both approaches are necessary to ensure a good post-processing of hydrological ensembles, allowing a good improvement of reliability, skill and sharpness of ensemble forecasts.
The comparison of the empirical and dynamical approaches shows the limits of the empirical approach which is not able to take into account hydrological
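    The "empirical approach" described above (pool past forecast errors by streamflow quantile class, then dress a new forecast with them) can be sketched on invented data. The archive, error structure, and class count below are assumptions for illustration, not EDF's operational configuration:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical archive of past model forecasts and observations, with
# multiplicative (flow-dependent) errors standing in for real data.
n = 3000
forecast = rng.gamma(2.0, 50.0, n)                 # past forecasts
obs = forecast * np.exp(rng.normal(0, 0.15, n))    # matching observations

# Pool relative errors by forecast quantile class.
n_classes = 5
edges = np.quantile(forecast, np.linspace(0, 1, n_classes + 1))
classes = np.clip(np.digitize(forecast, edges[1:-1]), 0, n_classes - 1)
errors_by_class = [obs[classes == k] / forecast[classes == k]
                   for k in range(n_classes)]

def dress(point_forecast, n_members=50):
    """Dress a deterministic forecast with resampled errors drawn from
    its own quantile class, yielding an ensemble of members."""
    k = int(np.clip(np.digitize(point_forecast, edges[1:-1]),
                    0, n_classes - 1))
    return point_forecast * rng.choice(errors_by_class[k], n_members)

members = dress(180.0)
```

    Conditioning the error sample on the quantile class is what lets the dressed ensemble reproduce heteroscedastic (flow-dependent) model uncertainty; the dynamical approach additionally conditions on streamflow variation.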

  12. Comparisons of ground motions from the 1999 Chi-Chi, earthquake with empirical predictions largely based on data from California

    USGS Publications Warehouse

    Boore, D.M.

    2001-01-01

    This article has the modest goal of comparing the ground motions recorded during the 1999 Chi-Chi, Taiwan, mainshock with predictions from four empirically based equations commonly used for western North America; these empirical predictions are largely based on data from California. Comparisons are made for peak acceleration and 5%-damped response spectra at periods between 0.1 and 4 sec. The general finding is that the Chi-Chi ground motions are smaller than those predicted from the empirically based equations for periods less than about 1 sec by factors averaging about 0.4 but as small as 0.26 (depending on period, on which equation is used, and on whether the sites are assumed to be rock or soil). There is a trend for the observed motions to approach or even exceed the predicted motions for longer periods. Motions at similar distances (30-60 km) to the east and to the west of the fault differ dramatically at periods between about 2 and 20 sec: Long-duration wave trains are present on the motions to the west, and when normalized to similar amplitudes at short periods, the response spectra of the motions at the western stations are as much as five times larger than those of motions from eastern stations. The explanation for the difference is probably related to site and propagation effects; the western stations are on the Coastal Plain, whereas the eastern stations are at the foot of young and steep mountains, either in the relatively narrow Longitudinal Valley or along the eastern coast; the sediments underlying the eastern stations are probably shallower and have higher velocity than those under the western stations.

  13. Practical guidelines for development of web-based interventions.

    PubMed

    Chee, Wonshik; Lee, Yaelim; Chee, Eunice; Im, Eun-Ok

    2014-10-01

    Despite a recent high funding priority on technological aspects of research and a high potential impact of Web-based interventions on health, few guidelines for the development of Web-based interventions are currently available. In this article, we propose practical guidelines for development of Web-based interventions based on an empirical study and an integrative literature review. The empirical study aimed at development of a Web-based physical activity promotion program that was specifically tailored to Korean American midlife women. The literature review included a total of 202 articles that were retrieved through multiple databases. On the basis of the findings of the study and the literature review, we propose directions for development of Web-based interventions in the following steps: (1) meaningfulness and effectiveness, (2) target population, (3) theoretical basis/program theory, (4) focus and objectives, (5) components, (6) technological aspects, and (7) logistics for users. The guidelines could help promote further development of Web-based interventions at this early stage of Web-based interventions in nursing.

  14. Empirically based device modeling of bulk heterojunction organic photovoltaics

    NASA Astrophysics Data System (ADS)

    Pierre, Adrien; Lu, Shaofeng; Howard, Ian A.; Facchetti, Antonio; Arias, Ana Claudia

    2013-04-01

    We develop an empirically based optoelectronic model to accurately simulate the photocurrent in organic photovoltaic (OPV) devices with novel materials, including bulk heterojunction OPV devices based on a new low-band-gap dithienothiophene-DPP donor polymer, P(TBT-DPP), blended with PC70BM at various donor-acceptor weight ratios and solvent compositions. Our devices exhibit power conversion efficiencies ranging from 1.8% to 4.7% at AM 1.5G. Electron and hole mobilities are determined using space-charge limited current measurements. Bimolecular recombination coefficients are both calculated analytically, using slowest-carrier-limited Langevin recombination, and measured using an electro-optical pump-probe technique. Exciton quenching efficiencies in the donor and acceptor domains are determined from photoluminescence spectroscopy. In addition, dielectric and optical constants are experimentally determined. The photocurrent and its bias dependence, simulated with the optoelectronic model we develop from these physically measured parameters, show less than 7% error with respect to the experimental photocurrent (whether the experimentally or the semi-analytically determined recombination coefficient is used). Free carrier generation and recombination rates of the photocurrent are modeled as a function of the position in the active layer at various applied biases. These results show that while free carrier generation is maximized in the center of the device, free carrier recombination is most dominant near the electrodes even in high performance devices. Such knowledge of carrier activity is essential for the optimization of the active layer by enhancing light trapping and minimizing recombination. Our simulation program is intended to be freely distributed for use in laboratories fabricating OPV devices.
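    The slowest-carrier-limited Langevin recombination coefficient mentioned in the abstract has a simple closed form, k = q·min(μe, μh)/ε. With illustrative parameter values (not the paper's measured ones):

```python
# Slowest-carrier-limited Langevin bimolecular recombination coefficient.
q = 1.602e-19            # elementary charge, C
eps0 = 8.854e-12         # vacuum permittivity, F/m
eps_r = 3.5              # relative permittivity of an organic blend (typical)

mu_e = 1e-7              # electron mobility, m^2/(V s)  (illustrative)
mu_h = 5e-9              # hole mobility, m^2/(V s)      (illustrative)

# Using the slower carrier caps recombination at the rate the slower
# species can reach the interface, a common refinement of pure Langevin.
k_rec = q * min(mu_e, mu_h) / (eps0 * eps_r)   # m^3/s
```

    With these numbers k_rec is of order 1e-17 m^3/s; comparing such a calculated value against the pump-probe measurement is exactly the cross-check the abstract describes.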

  15. A patient-centered pharmacy services model of HIV patient care in community pharmacy settings: a theoretical and empirical framework.

    PubMed

    Kibicho, Jennifer; Owczarzak, Jill

    2012-01-01

    Reflecting trends in health care delivery, pharmacy practice has shifted from a drug-specific to a patient-centered model of care, aimed at improving the quality of patient care and reducing health care costs. In this article, we outline a theoretical model of patient-centered pharmacy services (PCPS), based on in-depth, qualitative interviews with a purposive sample of 28 pharmacists providing care to HIV-infected patients in specialty, semispecialty, and nonspecialty pharmacy settings. Data analysis was an interactive process informed by pharmacists' interviews and a review of the general literature on patient-centered care, including Medication Therapy Management (MTM) services. Our main finding was that the current models of pharmacy services, including MTM, do not capture the range of pharmacy services in excess of mandated drug dispensing services. In this article, we propose a theoretical PCPS model that reflects the actual services pharmacists provide. The model includes five elements: (1) addressing patients as whole, contextualized persons; (2) customizing interventions to unique patient circumstances; (3) empowering patients to take responsibility for their own health care; (4) collaborating with clinical and nonclinical providers to address patient needs; and (5) developing sustained relationships with patients. The overarching goal of PCPS is to empower patients to take responsibility for their own health care and to self-manage their HIV infection. Our findings provide the foundation for future studies regarding how widespread these practices are in diverse community settings, the validity of the proposed PCPS model, the potential for standardizing pharmacist practices, and the feasibility of a PCPS framework to reimburse pharmacists' services.

  16. Theoretical integration in motivational science: System justification as one of many "autonomous motivational structures".

    PubMed

    Kay, Aaron C; Jost, John T

    2014-04-01

    Recognizing that there is a multiplicity of motives - and that the accessibility and strength of each one varies chronically and temporarily - is essential if motivational scientists are to achieve genuine theoretical and empirical integration. We agree that system justification is a case of nonconscious goal pursuit and discuss implications of the fact that it conflicts with many other psychological goals.

  17. Improper ferroelectricity: A theoretical and experimental investigation

    NASA Astrophysics Data System (ADS)

    Hardy, J. R.; Ullman, F. G.

    1984-02-01

    A combined theoretical and experimental study has been made of the origins and properties of the improper ferroelectricity associated with structural modulations of non-zero wavelengths. Two classes of materials have been studied: rare earth molybdates (specifically, gadolinium molybdate: GMO), and potassium selenate and its isomorphs. In the former, the modulation is produced by a zone boundary phonon instability, and in the latter by the instability of a phonon of wave vector approximately two-thirds of the way to the zone boundary. In the second case the initial result is a modulated structure whose repeat distance is not a rational multiple of the basic lattice repeat distance. The consequence is a modulated polarization which, when the basic modulation locks in to a rational multiple of the lattice spacing, becomes uniform, and improper ferroelectricity results. The origins of these effects have been elucidated by theoretical studies, initially semi-empirical, but subsequently from first principles. These complemented the experimental work, which primarily used inelastic light scattering, uniaxial stress, and hydrostatic pressure to probe the balance between the interionic forces through their effects on the phonons and dielectric properties.

  18. STEAM: a software tool based on empirical analysis for micro electro mechanical systems

    NASA Astrophysics Data System (ADS)

    Devasia, Archana; Pasupuleti, Ajay; Sahin, Ferat

    2006-03-01

    In this research, a generalized software framework that enables accurate computer-aided design of MEMS devices is developed. The proposed simulation engine utilizes a novel material property estimation technique that generates effective material properties at the microscopic level. The material property models were developed based on empirical analysis and the behavior extraction of standard test structures. A literature review is provided on the physical phenomena that govern the mechanical behavior of thin-film materials. This survey indicates that present-day models operate under a wide range of assumptions that may not be applicable to the micro-world. Thus, this methodology is foreseen to be an essential tool for MEMS designers, as it develops empirical models that relate the loading parameters, material properties, and geometry of the microstructures to their performance characteristics. This process involves learning the relationship between the above parameters using non-parametric learning algorithms such as radial basis function networks and genetic algorithms. The proposed simulation engine has a graphical user interface (GUI) which is very adaptable, flexible, and transparent. The GUI is able to encompass all parameters associated with the determination of the desired material property so as to create models that provide an accurate estimation of that property. This technique was verified by fabricating and simulating bilayer cantilevers consisting of aluminum and glass (TEOS oxide) in our previous work. The results obtained were found to be very encouraging.
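    The non-parametric learning step named in the abstract can be illustrated with a minimal Gaussian radial-basis-function interpolator. This is only a sketch of the general technique, not the STEAM implementation; the (length, thickness) → deflection mapping and all numbers are hypothetical stand-ins for measured test-structure data.

    ```python
    import numpy as np

    def rbf_fit(X, y, sigma):
        # Interpolate y with Gaussian radial basis functions centred on the samples.
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        return np.linalg.solve(np.exp(-d2 / (2 * sigma**2)), y)

    def rbf_predict(X_train, w, X_new, sigma):
        d2 = ((X_new[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma**2)) @ w

    # Hypothetical training set: (length, thickness) -> tip deflection of a cantilever.
    L, t = np.meshgrid(np.linspace(0.5, 2.0, 5), np.linspace(0.5, 2.0, 5))
    X = np.column_stack([L.ravel(), t.ravel()])
    y = X[:, 0] ** 3 / X[:, 1] ** 3              # deflection scales as L^3 / t^3
    w = rbf_fit(X, y, sigma=0.4)

    fit_err = np.max(np.abs(rbf_predict(X, w, X, sigma=0.4) - y))
    print(fit_err)                               # near zero: exact at training points
    print(rbf_predict(X, w, np.array([[1.1, 0.9]]), sigma=0.4))
    ```

    A genetic-algorithm variant would instead search kernel widths or centres; the interpolation idea is the same.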

  19. The Convergence of Institutional Logics on the Community College Sector and the Normalization of Emotional Labor: A New Theoretical Approach for Considering the Community College Faculty Labor Expectations

    ERIC Educational Resources Information Center

    Gonzales, Leslie D.; Ayers, David F.

    2018-01-01

    Little empirical research has systematically focused on, or interrogated, the labor expectations set forth for community college faculty. Thus, in this paper, we present a theoretical argument, which we formed by (re) reading several community college focused studies through various theoretical lenses. Ultimately, we merged two…

  20. Security-aware Virtual Machine Allocation in the Cloud: A Game Theoretic Approach

    DTIC Science & Technology

    2015-01-13

    predecessor, however, this paper used empirical evidence and actual data from running experiments on the Amazon EC2 cloud. They began by running all 5...is through effective VM allocation management of the cloud provider to ensure delivery of maximum security for all cloud users. The negative...

  1. Time-frequency analysis of neuronal populations with instantaneous resolution based on noise-assisted multivariate empirical mode decomposition.

    PubMed

    Alegre-Cortés, J; Soto-Sánchez, C; Pizá, Á G; Albarracín, A L; Farfán, F D; Felice, C J; Fernández, E

    2016-07-15

    Linear analysis has classically provided powerful tools for understanding the behavior of neural populations, but neuronal responses to real-world stimulation are nonlinear under some conditions, and many neuronal components demonstrate strong nonlinear behavior. In spite of this, the temporal and frequency dynamics of neural populations under sensory stimulation have usually been analyzed with linear approaches. In this paper, we propose the use of Noise-Assisted Multivariate Empirical Mode Decomposition (NA-MEMD), a data-driven, template-free algorithm, plus the Hilbert transform as a suitable tool for analyzing population oscillatory dynamics in a multi-dimensional space with instantaneous frequency (IF) resolution. The proposed approach was able to extract oscillatory information from neurophysiological data of deep vibrissal nerve and visual cortex multiunit recordings that was not evidenced using linear approaches with fixed bases such as Fourier analysis. Texture discrimination analysis performance increased when NA-MEMD plus the Hilbert transform was implemented, compared to linear techniques, and cortical oscillatory population activity was analyzed with precise time-frequency resolution. Noise-Assisted Multivariate Empirical Mode Decomposition plus the Hilbert transform is an improved method for analyzing neuronal population oscillatory dynamics that overcomes the linearity and stationarity assumptions of classical methods. Copyright © 2016 Elsevier B.V. All rights reserved.
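    A full NA-MEMD implementation needs a dedicated multivariate EMD library, but the second stage of the pipeline described above, Hilbert transform → instantaneous frequency, can be sketched on a single oscillatory mode. The 40 Hz test tone below is a hypothetical stand-in for one decomposed mode; this illustrates the general method, not the authors' code.

    ```python
    import numpy as np
    from scipy.signal import hilbert

    fs = 1000.0                              # sampling rate, Hz
    t = np.arange(0, 1.0, 1 / fs)
    x = np.sin(2 * np.pi * 40 * t)           # stand-in for one NA-MEMD mode (IMF)

    analytic = hilbert(x)                    # analytic signal: x + i * H[x]
    phase = np.unwrap(np.angle(analytic))
    inst_freq = np.diff(phase) * fs / (2 * np.pi)   # instantaneous frequency, Hz

    # away from the edges, the IF of a pure 40 Hz tone sits at ~40 Hz
    print(inst_freq[100:900].mean())
    ```

    On real recordings, each IMF returned by the decomposition would be passed through the same two lines to obtain sample-by-sample frequency estimates.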

  2. Familiarizing Students with the Empirically Supported Treatment Approaches for Childhood Problems.

    ERIC Educational Resources Information Center

    Wilkins, Victoria; Chambliss, Catherine

    The clinical research literature exploring the efficacy of particular treatment approaches is reviewed with the intent of facilitating the training of counseling students. Empirically supported treatments (ESTs) are defined operationally as evidence-based treatments following the listing of empirically validated psychological treatments reported by…

  3. Impact of implant support on mandibular free-end base removable partial denture: theoretical study.

    PubMed

    Oh, Won-suk; Oh, Tae-Ju; Park, Ju-mi

    2016-02-01

    This study investigated the impact of implant support on the development of shear force and bending moment in mandibular free-end base removable partial dentures (RPDs). Three theoretical test models of unilateral mandibular free-end base RPDs were constructed to represent the base of tooth replacement, as follows: Model 1: first and second molars (M1 and M2); Model 2: second premolar (P2), M1, and M2; and Model 3: first premolar (P1), P2, M1, and M2. The implant support was located at either the M1 or M2 site. The occlusal loading was concentrated at each replacement tooth to calculate the stress resultants developed in the RPD models using free-body diagrams of shear force and bending moment. There was a trend of reduction in the peak shear force and bending moment when the base was supported by an implant. However, the degree of reduction varied with the location of the implant support. The moment was reduced by 76% in Model 1, 58% in Model 2, and 42% in Model 3 when the implant location shifted from the M1 to the M2 site. The shear forces and bending moments acting on mandibular free-end base RPDs were found to decrease with the addition of implant support. However, the impact of implant support varied with the location of the implant in this theoretical study. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
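    The free-body-diagram reasoning behind results of this kind can be illustrated with elementary beam statics. This is a generic sketch, not the study's actual RPD models; the load magnitude and distances are hypothetical.

    ```python
    def cantilever_moment(P, a):
        # No implant: a load P at distance a from the abutment acts as a cantilever,
        # so the peak bending moment at the abutment is simply P * a.
        return P * a

    def supported_moment(P, a, b):
        # Implant prop at distance b (a <= b): the span behaves like a simply
        # supported beam, and the peak moment occurs under the load.
        return P * a * (1 - a / b)

    P, a = 100.0, 20.0                        # N, mm (hypothetical occlusal load)
    print(cantilever_moment(P, a))            # peak moment without implant support
    print(supported_moment(P, a, b=30.0))     # peak moment with an implant prop
    ```

    With these toy numbers the implant prop cuts the peak moment by about two-thirds, echoing the trend of reduction reported in the abstract.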

  4. Research on Sustainable Development Level Evaluation of Resource-based Cities Based on Shapley Entropy and Choquet Integral

    NASA Astrophysics Data System (ADS)

    Zhao, Hui; Qu, Weilu; Qiu, Weiting

    2018-03-01

    In order to evaluate the sustainable development level of resource-based cities, an evaluation method based on Shapley entropy and the Choquet integral is proposed. First, a systematic index system is constructed; the importance of each attribute is calculated based on the maximum Shapley entropy principle; and the Choquet integral is then introduced to calculate the comprehensive evaluation value of each city from the bottom up. Finally, the method is applied to 10 typical resource-based cities in China. The empirical results show that the evaluation method is scientific and reasonable, which provides theoretical support for the sustainable development path and reform direction of resource-based cities.
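    The Choquet aggregation step can be sketched directly: sort the criteria by score and weight each increment by the capacity of the set of criteria at or above it. The capacity values and city scores below are hypothetical illustrations, not the study's data.

    ```python
    def choquet(values, capacity):
        """Discrete Choquet integral of a score vector with respect to a capacity
        (a set function mapping each subset of criteria to a weight in [0, 1])."""
        items = sorted(values, key=values.get)        # criteria, ascending by score
        total, prev = 0.0, 0.0
        remaining = set(values)
        for a in items:
            total += (values[a] - prev) * capacity[frozenset(remaining)]
            prev = values[a]
            remaining.remove(a)
        return total

    # hypothetical capacity over three criteria; the super-additive pair
    # {econ, soc} rewards cities that score well on both at once
    cap = {
        frozenset(): 0.0,
        frozenset({"econ"}): 0.4, frozenset({"env"}): 0.4, frozenset({"soc"}): 0.3,
        frozenset({"econ", "env"}): 0.6, frozenset({"econ", "soc"}): 0.7,
        frozenset({"env", "soc"}): 0.6,
        frozenset({"econ", "env", "soc"}): 1.0,
    }
    scores = {"econ": 0.8, "env": 0.5, "soc": 0.6}
    print(choquet(scores, cap))   # 0.5*1.0 + 0.1*0.7 + 0.2*0.4 = 0.65
    ```

    With an additive capacity this reduces to an ordinary weighted mean; the non-additive weights are what let the integral model interactions between criteria.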

  5. MDMA is certainly damaging after 25 years of empirical research: a reply and refutation of Doblin et al. (2014).

    PubMed

    Parrott, Andrew C

    2014-03-01

    Human Psychopharmacology recently published my review into the increase in empirical knowledge about the human psychobiology of MDMA over the past 25 years (Parrott, 2013a). Deficits have been demonstrated in retrospective memory, prospective memory, higher cognition, complex visual processing, sleep architecture, sleep apnoea, pain, neurohormonal activity, and psychiatric status. Neuroimaging studies have shown serotonergic deficits, which are associated with lifetime Ecstasy/MDMA usage, and degree of neurocognitive impairment. Basic psychological skills remain intact. Ecstasy/MDMA use by pregnant mothers leads to psychomotor impairments in the children. Hence, the damaging effects of Ecstasy/MDMA were far more widespread than was realized a few years ago. In their critique of my review, Doblin et al. (2014) argued that my review contained misstatements, omitted contrary findings, and recited dated misconceptions. In this reply, I have answered all the points they raised. I have been able to refute each of their criticisms by citing the relevant empirical data, since many of their points were based on inaccurate summaries of the actual research findings. Doblin and colleagues are proponents of the use of MDMA for drug-assisted psychotherapy, and their strongest criticisms were focused on my concerns about this proposal. However, again all the issues I raised were based on sound empirical evidence or theoretical understanding. Indeed I would recommend potentially far safer co-drugs such as D-cycloserine or oxytocin. In summary, MDMA can induce a wide range of neuropsychobiological changes, many of which are damaging to humans. Copyright © 2014 John Wiley & Sons, Ltd.

  6. Development of An Empirical Water Quality Model for Stormwater Based on Watershed Land Use in Puget Sound

    DTIC Science & Technology

    2007-03-29

    Development of An Empirical Water Quality Model for Stormwater Based on Watershed Land Use in Puget Sound Valerie I. Cullinan, Christopher W. May...Systems Center, Bremerton, WA) Introduction The Sinclair and Dyes Inlet watershed is located on the west side of Puget Sound in Kitsap County...Washington, U.S.A. (Figure 1). The Puget Sound Naval Shipyard (PSNS), U.S Environmental Protection Agency (USEPA), the Washington State Department of

  7. Research across the disciplines: a road map for quality criteria in empirical ethics research

    PubMed Central

    2014-01-01

    Background Research in the field of Empirical Ethics (EE) uses a broad variety of empirical methodologies, such as surveys, interviews and observation, developed in disciplines such as sociology, anthropology, and psychology. Whereas these empirical disciplines see themselves as purely descriptive, EE also aims at normative reflection. Currently there is literature about the quality of empirical research in ethics, but little or no reflection on specific methodological aspects that must be considered when conducting interdisciplinary empirical ethics. Furthermore, poor methodology in an EE study results in misleading ethical analyses, evaluations or recommendations. This not only deprives the study of scientific and social value, but also risks ethical misjudgement. Discussion While empirical and normative-ethical research projects have quality criteria in their own right, we focus on the specific quality criteria for EE research. We develop a tentative list of quality criteria – a “road map” – tailored to interdisciplinary research in EE, to guide assessments of research quality. These quality criteria fall into the categories of primary research question, theoretical framework and methods, relevance, interdisciplinary research practice and research ethics and scientific ethos. Summary EE research is an important and innovative development in bioethics. However, a lack of standards has led to concerns about and even rejection of EE by various scholars. Our suggested orientation list of criteria, presented in the form of reflective questions, cannot be considered definitive, but serves as a tool to provoke systematic reflection during the planning and composition of an EE research study. These criteria need to be tested in different EE research settings and further refined. PMID:24580847

  8. Adolescent preventive health and team-games-tournaments: five decades of evidence for an empirically based paradigm.

    PubMed

    Wodarski, John S; Feit, Marvin D

    2011-01-01

    The problematic behaviors of teenagers and the subsequent negative consequences are extensive and well documented: unwanted pregnancy, substance abuse, violent behavior, depression, and social and psychological consequences of unemployment. In this article, the authors review an approach that uses a cooperative learning, empirically based intervention that employs peers as teachers. This intervention of choice is Teams-Games-Tournaments (TGT), a paradigm backed by five decades of empirical support. The application of TGT in preventive health programs incorporates elements in common with other prevention programs that are based on a public health orientation and constitute the essential components of health education, that is, skills training and practice in applying skills. The TGT intervention supports the idea that children and adolescents from various socioeconomic classes, between the ages of 8 and 18 and in classrooms or groups ranging in size from 4 to 17 members, can work together for one another. TGT has been applied successfully in such diverse areas as adolescent development, sexuality education, psychoactive substance abuse education, anger control, coping with depression and suicide, nutrition, comprehensive employment preparation, and family intervention. This article reviews the extensive research on TGT using examples of successful projects in substance abuse, violence, and nutrition. Issues are raised that relate to the implementation of preventive health strategies for adolescents, including cognitive aspects, social and family networks, and intervention components.

  9. Debris flow susceptibility assessment based on an empirical approach in the central region of South Korea

    NASA Astrophysics Data System (ADS)

    Kang, Sinhang; Lee, Seung-Rae

    2018-05-01

    Many debris flow spreading analyses have been conducted during recent decades to prevent damage from debris flows. The empirical approach that has been used in various studies on debris flow spreading has advantages such as simple data acquisition and good applicability over large areas. In this study, a GIS-based empirical model that was developed at the University of Lausanne (Switzerland) is used to assess debris flow susceptibility. Study sites are classified based on soil texture or geological conditions, which indirectly reflect geotechnical and rheological properties, to supplement the weaknesses of Flow-R, which neglects local controlling factors. The mean travel angle for each class is calculated from a debris flow inventory map. The debris flow susceptibility is assessed based on changes in the flow-direction algorithm and an inertial function, at a 5-m DEM resolution. A simplified friction-limited model was applied to the runout distance analysis, using the travel angle appropriate to the corresponding class and a velocity limit of 28 m/s. The most appropriate algorithm combinations, those yielding the highest average of efficiency and sensitivity for each class, are finally determined by applying a confusion matrix with the efficiency and the sensitivity to the results of the susceptibility assessment. The proposed schemes can be useful for debris flow susceptibility assessment both in the study area and in the central region of Korea, which has environmental factors such as geological conditions, topography, and rainfall characteristics similar to those of the study area.
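    A simplified friction-limited runout model can be sketched as an energy-line march along a 1-D elevation profile: kinetic energy grows where the slope exceeds the travel angle and is consumed where it does not, with velocity capped at 28 m/s as in the abstract. The profile and travel angle below are hypothetical, and Flow-R itself works on 2-D rasters with spreading algorithms; this is only the core energy bookkeeping.

    ```python
    import math

    G = 9.81          # m/s^2
    V_MAX = 28.0      # m/s, velocity limit used in the study

    def runout(profile, travel_angle_deg, dx=5.0):
        """March down an elevation profile sampled every dx metres; the flow
        stops where the energy line (slope = tan(travel angle)) has consumed
        its kinetic energy."""
        mu = math.tan(math.radians(travel_angle_deg))
        v2, dist = 0.0, 0.0
        for i in range(1, len(profile)):
            drop = profile[i - 1] - profile[i]        # elevation loss over the cell
            v2 = min(v2 + 2 * G * (drop - mu * dx), V_MAX ** 2)
            if v2 <= 0:
                break
            dist += dx
        return dist

    # hypothetical profile: 100 m of 30-degree slope, then flat ground
    steep = [100.0 - i * 5.0 * math.tan(math.radians(30.0)) for i in range(21)]
    profile = steep + [steep[-1]] * 39
    print(runout(profile, travel_angle_deg=22.0))     # runout distance in metres
    ```

    A smaller travel angle (lower friction) lets the flow run farther onto the flat section, which is how the class-specific mean travel angles change the susceptibility footprint.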

  10. Randomized Trial of ConquerFear: A Novel, Theoretically Based Psychosocial Intervention for Fear of Cancer Recurrence.

    PubMed

    Butow, Phyllis N; Turner, Jane; Gilchrist, Jemma; Sharpe, Louise; Smith, Allan Ben; Fardell, Joanna E; Tesson, Stephanie; O'Connell, Rachel; Girgis, Afaf; Gebski, Val J; Asher, Rebecca; Mihalopoulos, Cathrine; Bell, Melanie L; Zola, Karina Grunewald; Beith, Jane; Thewes, Belinda

    2017-12-20

    Purpose Fear of cancer recurrence (FCR) is prevalent, distressing, and long lasting. This study evaluated the impact of a theoretically/empirically based intervention (ConquerFear) on FCR. Methods Eligible survivors had curable breast or colorectal cancer or melanoma, had completed treatment (not including endocrine therapy) 2 months to 5 years previously, were age > 18 years, and had scores above the clinical cutoff on the FCR Inventory (FCRI) severity subscale at screening. Participants were randomly assigned at a one-to-one ratio to either five face-to-face sessions of ConquerFear (attention training, metacognitions, acceptance/mindfulness, screening behavior, and values-based goal setting) or an attention control (Taking-it-Easy relaxation therapy). Participants completed questionnaires at baseline (T0), immediately post-therapy (T1), and 3 (T2) and 6 months (T3) later. The primary outcome was FCRI total score. Results Of 704 potentially eligible survivors from 17 sites and two online databases, 533 were contactable, of whom 222 (42%) consented; 121 were randomly assigned to intervention and 101 to control. Study arms were equivalent at baseline on all measured characteristics. ConquerFear participants had clinically and statistically greater improvements than control participants from T0 to T1 on FCRI total (P < .001) and severity subscale scores (P = .001), which were maintained at T2 (P = .017 and P = .023, respectively) and, for FCRI total only, at T3 (P = .018), and from T0 to T1 on three FCRI subscales (coping, psychological distress, and triggers) as well as in general anxiety, cancer-specific distress (total), and mental quality of life and metacognitions (total). Differences in FCRI psychological distress and cancer-specific distress (total) remained significantly different at T3. Conclusion This randomized trial demonstrated efficacy of ConquerFear compared with attention control (Taking-it-Easy) in reduction of FCRI total scores immediately post

  11. Theoretical Analysis on Mechanical Deformation of Membrane-Based Photomask Blanks

    NASA Astrophysics Data System (ADS)

    Marumoto, Kenji; Aya, Sunao; Yabe, Hedeki; Okada, Tatsunori; Sumitani, Hiroaki

    2012-04-01

    Membrane-based photomasks are used in proximity X-ray lithography, including the LIGA (Lithographie, Galvanoformung und Abformung) process, and in near-field photolithography. In this article, the out-of-plane deformation (OPD) and in-plane displacement (IPD) of membrane-based photomask blanks are theoretically analyzed to obtain mask blanks with a flat front surface and a low-stress absorber film. First, we derived equations for the OPD and IPD arising in the processing steps of a membrane-based photomask, such as film deposition, back-etching, and bonding, using the theory of symmetrical bending of circular plates with a coaxial circular hole and that of the deformation of a cylinder under hydrostatic pressure. The validity of the equations was proved by comparing the calculated results with experimental ones. Using these equations, we investigated the general relation between the geometry of the mask blanks and the distortions, and gave a criterion for attaining a flat front surface. Moreover, the absorber stress-bias required to obtain zero stress on finished mask blanks was also calculated, and it was found that only a small stress-bias was required for an adequate hole size of the support plate.

  12. Theoretical study of carbon-based tips for scanning tunnelling microscopy.

    PubMed

    González, C; Abad, E; Dappe, Y J; Cuevas, J C

    2016-03-11

    Motivated by recent experiments, we present here a detailed theoretical analysis of the use of carbon-based conductive tips in scanning tunnelling microscopy. In particular, we employ ab initio methods based on density functional theory to explore a graphitic, an amorphous carbon and two diamond-like tips for imaging with a scanning tunnelling microscope (STM), and we compare them with standard metallic tips made of gold and tungsten. We investigate the performance of these tips in terms of the corrugation of the STM images acquired when scanning a single graphene sheet. Moreover, we analyse the impact of the tip-sample distance and show that it plays a fundamental role in the resolution and symmetry of the STM images. We also explore in depth how the adsorption of single atoms and molecules in the tip apexes modifies the STM images and demonstrate that, in general, it leads to an improved image resolution. The ensemble of our results provides strong evidence that carbon-based tips can significantly improve the resolution of STM images, as compared to more standard metallic tips, which may open a new line of research in scanning tunnelling microscopy.

  13. [Empirical study of the market orientation of veterinarians in The Netherlands].

    PubMed

    Schuurmans, A J; Smidts, A

    1990-04-01

    Building on the theoretical framework of marketing in the veterinary practice, as explained in Schuurmans and Smidts (1), pp. 1-10, and Schuurmans and Smidts (2), an empirical research project was undertaken. This research gives insight into the extent to which practices base their services on a marketing orientation, investigated by means of telephone inquiries among a sample of 166 veterinarians. The research shows that veterinarians think in a product-oriented rather than a market-oriented way and do not use all the opportunities a marketing orientation could bring to their services. This expresses itself, among other things, in the absence of market segmentation, in the inadequate use of the marketing-mix elements communication and distribution, and in the fact that the opportunities of a marketing information system are hardly exploited. Further research in individual practices might make it possible to give concrete advice suited to each practice. Research among the clients of veterinarians might, besides many other kinds of research, also give valuable insights.

  14. Definitions of homework, types of homework, and ratings of the importance of homework among psychologists with cognitive behavior therapy and psychoanalytic theoretical orientations.

    PubMed

    Kazantzis, Nikolaos; Dattilio, Frank M

    2010-07-01

    A random sample of 827 psychologists were surveyed to assess their definitions of homework, use of homework tasks, and perceived importance of homework. Theoretical orientation distinguished practitioners' responses. Cognitive-behavioral therapists defined homework as being closer to empirically supported therapy, whereas psychodynamic therapists rated homework as less characteristic of a process that embraces client responsibility and adaptive skills. Cognitive-behavior therapists did not limit their choices to activity-based tasks, and psychodynamic therapists reported using behavioral tasks "sometimes." Monitoring dreams and conscious thought were also used among the entire sample surveyed. Psychodynamic therapists rated homework as "somewhat" or "moderately" important, whereas cognitive-behavior therapists more often rated homework as "very important." Data suggest some homework may be common to different psychotherapeutic approaches. Findings are discussed in the context of recent theoretical work on homework in psychotherapy and recommendations for future research.

  15. Empirical modeling of an alcohol expectancy memory network using multidimensional scaling.

    PubMed

    Rather, B C; Goldman, M S; Roehrich, L; Brannick, M

    1992-02-01

    Risk-related antecedent variables can be linked to later alcohol consumption by memory processes, and alcohol expectancies may be one relevant form of memory content. To advance research in this area, it would be useful to apply current memory models such as semantic network theory to explain drinking decision processes. We used multidimensional scaling (MDS) to empirically model a preliminary alcohol expectancy semantic network, from which a theoretical account of drinking decision making was generated. Subanalyses (PREFMAP) showed how individuals with differing alcohol consumption histories may have had different association pathways within the expectancy network. These pathways may have, in turn, influenced future drinking levels and behaviors while the person was under the influence of alcohol. All individuals associated positive/prosocial effects with drinking, but heavier drinkers indicated arousing effects as their highest probability associates, whereas light drinkers expected sedation. An important early step in this MDS modeling process is the determination of iso-meaning expectancy adjective groups, which correspond to theoretical network nodes.
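    The MDS step can be illustrated with classical (Torgerson) scaling, which embeds a dissimilarity matrix via double-centering and eigendecomposition. This is a generic sketch rather than the authors' procedure (which also used PREFMAP subanalyses); the four-point configuration is a toy stand-in for adjective dissimilarity ratings.

    ```python
    import numpy as np

    def classical_mds(D, k=2):
        # D: (n, n) symmetric matrix of pairwise dissimilarities
        n = D.shape[0]
        J = np.eye(n) - np.ones((n, n)) / n
        B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
        vals, vecs = np.linalg.eigh(B)
        order = np.argsort(vals)[::-1][:k]       # top-k eigenpairs
        return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0))

    # toy "adjective" configuration; its distance matrix plays the role of the
    # empirical dissimilarity ratings
    pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.5]])
    D = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
    X = classical_mds(D, k=2)
    D_hat = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    print(np.max(np.abs(D - D_hat)))   # ~0: a 2-D embedding reproduces these distances
    ```

    With real rating data the dissimilarities are not exactly Euclidean, so the residual measures how well a low-dimensional "semantic map" captures the network.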

  16. Landscape influences on dispersal behaviour: a theoretical model and empirical test using the fire salamander, Salamandra infraimmaculata.

    PubMed

    Kershenbaum, Arik; Blank, Lior; Sinai, Iftach; Merilä, Juha; Blaustein, Leon; Templeton, Alan R

    2014-06-01

    When populations reside within a heterogeneous landscape, isolation by distance may not be a good predictor of genetic divergence if dispersal behaviour and therefore gene flow depend on landscape features. Commonly used approaches linking landscape features to gene flow include the least cost path (LCP), random walk (RW), and isolation by resistance (IBR) models. However, none of these models is likely to be the most appropriate for all species and in all environments. We compared the performance of LCP, RW and IBR models of dispersal with the aid of simulations conducted on artificially generated landscapes. We also applied each model to empirical data on the landscape genetics of the endangered fire salamander, Salamandra infraimmaculata, in northern Israel, where conservation planning requires an understanding of the dispersal corridors. Our simulations demonstrate that wide dispersal corridors of the low-cost environment facilitate dispersal in the IBR model, but inhibit dispersal in the RW model. In our empirical study, IBR explained the genetic divergence better than the LCP and RW models (partial Mantel correlation 0.413 for IBR, compared to 0.212 for LCP, and 0.340 for RW). Overall dispersal cost in salamanders was also well predicted by landscape feature slope steepness (76%), and elevation (24%). We conclude that fire salamander dispersal is well characterised by IBR predictions. Together with our simulation findings, these results indicate that wide dispersal corridors facilitate, rather than hinder, salamander dispersal. Comparison of genetic data to dispersal model outputs can be a useful technique in inferring dispersal behaviour from population genetic data.
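    Of the three dispersal models compared, the least cost path is the simplest to sketch: Dijkstra's algorithm over a raster of per-cell resistance values. The tiny grid below is a hypothetical illustration, not the study's landscape data, and the step cost (mean resistance of the two cells joined) is one common convention among several.

    ```python
    import heapq

    def least_cost_path_cost(grid, start, goal):
        """Dijkstra over a raster of per-cell resistances (4-neighbour moves);
        each step costs the mean resistance of the two cells it connects."""
        rows, cols = len(grid), len(grid[0])
        dist = {start: 0.0}
        pq = [(0.0, start)]
        while pq:
            d, (r, c) = heapq.heappop(pq)
            if (r, c) == goal:
                return d
            if d > dist[(r, c)]:
                continue                      # stale queue entry
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    nd = d + 0.5 * (grid[r][c] + grid[nr][nc])
                    if nd < dist.get((nr, nc), float("inf")):
                        dist[(nr, nc)] = nd
                        heapq.heappush(pq, (nd, (nr, nc)))
        return float("inf")

    # hypothetical resistance raster with a cheap corridor along the top row
    grid = [
        [1, 1, 1, 1],
        [9, 9, 9, 1],
        [9, 9, 9, 1],
    ]
    print(least_cost_path_cost(grid, (2, 3), (0, 0)))
    ```

    IBR differs in treating the raster as a resistor network (all paths contribute), which is why wide corridors lower IBR cost but leave the single LCP unchanged.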

  17. Distress Tolerance and Psychopathological Symptoms and Disorders: A Review of the Empirical Literature among Adults

    PubMed Central

    Leyro, Teresa M.; Zvolensky, Michael J.; Bernstein, Amit

    2010-01-01

    In the present paper, we review theory and empirical study of distress tolerance, an emerging risk factor candidate for various forms of psychopathology. Despite the long-standing interest in, and promise of work on, distress tolerance for understanding adult psychopathology, there has not been a comprehensive review of the extant empirical literature focused on the construct. As a result, a comprehensive synthesis of theoretical and empirical scholarship on distress tolerance, including integration of extant research on the relations between distress tolerance and psychopathology, is lacking. Inspection of the scientific literature indicates that there are a number of promising ways to conceptualize and measure distress tolerance, as well as documented relations between distress tolerance factor(s) and psychopathological symptoms and disorders. Although promising, there also is notable conceptual and operational heterogeneity across the distress tolerance literature(s). Moreover, a number of basic questions remain unanswered regarding the associations between distress tolerance and other risk and protective factors and processes, as well as its putative role(s) in vulnerability for, and resilience to, psychopathology. Thus, the current paper provides a comprehensive review of past and contemporary theory and research and proposes key areas for future empirical study of this construct. PMID:20565169

  18. Linking agent-based models and stochastic models of financial markets

    PubMed Central

    Feng, Ling; Li, Baowen; Podobnik, Boris; Preis, Tobias; Stanley, H. Eugene

    2012-01-01

    It is well-known that financial asset returns exhibit fat-tailed distributions and long-term memory. These empirical features are the main objectives of modeling efforts using (i) stochastic processes to quantitatively reproduce these features and (ii) agent-based simulations to understand the underlying microscopic interactions. After reviewing selected empirical and theoretical evidence documenting the behavior of traders, we construct an agent-based model to quantitatively demonstrate that “fat” tails in return distributions arise when traders share similar technical trading strategies and decisions. Extending our behavioral model to a stochastic model, we derive and explain a set of quantitative scaling relations of long-term memory from the empirical behavior of individual market participants. Our analysis provides a behavioral interpretation of the long-term memory of absolute and squared price returns: They are directly linked to the way investors evaluate their investments by applying technical strategies at different investment horizons, and this quantitative relationship is in agreement with empirical findings. Our approach provides a possible behavioral explanation for stochastic models for financial systems in general and provides a method to parameterize such models from market data rather than from statistical fitting. PMID:22586086
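    The link from herding to fat tails can be illustrated with a deliberately tiny toy market, not the paper's agent-based model: cluster sizes are drawn from a power law (traders sharing a strategy act as a block), so occasional large clusters dominate single-step returns. All parameters are hypothetical.

    ```python
    import numpy as np

    def herding_returns(steps=5000, alpha=1.5, n_agents=10_000, seed=7):
        """Toy herding market: each step one cluster of like-minded traders
        forms, with power-law size, and buys or sells as a block."""
        rng = np.random.default_rng(seed)
        sizes = np.minimum(rng.zipf(alpha, size=steps), n_agents)  # cluster sizes
        signs = rng.choice([-1.0, 1.0], size=steps)                # buy or sell
        return signs * sizes / n_agents                            # per-step return

    r = herding_returns()
    excess_kurtosis = float(((r - r.mean()) ** 4).mean() / r.var() ** 2 - 3.0)
    print(round(excess_kurtosis, 2))   # excess kurtosis of the simulated returns
    ```

    Reproducing the paper's long-memory scaling relations would additionally require agents acting on different investment horizons; this sketch isolates only the fat-tail mechanism.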

  19. Linking agent-based models and stochastic models of financial markets.

    PubMed

    Feng, Ling; Li, Baowen; Podobnik, Boris; Preis, Tobias; Stanley, H Eugene

    2012-05-29

    It is well-known that financial asset returns exhibit fat-tailed distributions and long-term memory. These empirical features are the main objectives of modeling efforts using (i) stochastic processes to quantitatively reproduce these features and (ii) agent-based simulations to understand the underlying microscopic interactions. After reviewing selected empirical and theoretical evidence documenting the behavior of traders, we construct an agent-based model to quantitatively demonstrate that "fat" tails in return distributions arise when traders share similar technical trading strategies and decisions. Extending our behavioral model to a stochastic model, we derive and explain a set of quantitative scaling relations of long-term memory from the empirical behavior of individual market participants. Our analysis provides a behavioral interpretation of the long-term memory of absolute and squared price returns: They are directly linked to the way investors evaluate their investments by applying technical strategies at different investment horizons, and this quantitative relationship is in agreement with empirical findings. Our approach provides a possible behavioral explanation for stochastic models for financial systems in general and provides a method to parameterize such models from market data rather than from statistical fitting.
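As a hypothetical toy sketch of the mechanism this abstract describes (not the authors' actual model), fat tails in aggregate returns already emerge when clusters of traders share a strategy and act in unison, so that one decision moves a whole cluster. All parameters and the cluster-size rule below are invented for illustration:

```python
import random
import statistics

def herding_returns(n_steps, seed=0):
    """Toy herding model: each step, a random cluster of traders acts in
    unison; the step return is proportional to the signed cluster size.
    Heavy-tailed cluster sizes yield heavy-tailed returns."""
    rng = random.Random(seed)
    returns = []
    for _ in range(n_steps):
        u = rng.random()
        cluster = int(1.0 / max(u, 1e-3))        # power-law-ish cluster size, capped at 1000
        sign = 1 if rng.random() < 0.5 else -1   # cluster buys or sells
        returns.append(sign * cluster / 1000.0)
    return returns

rets = herding_returns(5000)
# Sample kurtosis: ~3 for a Gaussian, far larger for fat-tailed returns
mu = statistics.fmean(rets)
var = statistics.fmean((r - mu) ** 2 for r in rets)
kurt = statistics.fmean((r - mu) ** 4 for r in rets) / var ** 2
print(kurt)   # well above the Gaussian value of 3
```

Because every trader in a cluster makes the same decision, the return distribution inherits the heavy tail of the cluster-size distribution, which is the qualitative point of the agent-based model.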

  20. An empirical polytrope law for solar wind thermal electrons between 0.45 and 4.76 AU - Voyager 2 and Mariner 10

    NASA Technical Reports Server (NTRS)

    Sittler, E. C., Jr.; Scudder, J. D.

    1980-01-01

    In this paper empirical evidence is presented that between 0.4 and 5 AU the thermal portion (but not all) of the solar wind electron population obeys a polytrope relation. It is also shown that this functional relationship is a member of a broader class of possible laws required of a steady state, fully ionized plasma whose proper frame electric field is dominated by the polarization electric field. The empirically determined, thermodynamically interesting value of the polytrope index (1.175) is virtually that predicted (1.16) by the theoretical considerations of Scudder and Olbert (1979). Strong, direct, empirical evidence for the nearly isothermal behavior of solar wind electrons as has been indirectly argued in the literature for some time is provided.
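A polytrope relation of the kind described here, T ∝ n^(γ−1), can be recovered from density-temperature pairs by a least-squares fit in log-log space. The sketch below uses hypothetical synthetic data built to obey the empirically determined index γ = 1.175; the reference values are invented for illustration:

```python
import math

def fit_polytrope_index(densities, temperatures):
    """Estimate the polytrope index gamma from T ~ n**(gamma - 1)
    via the least-squares slope in log-log space."""
    xs = [math.log(n) for n in densities]
    ys = [math.log(t) for t in temperatures]
    xbar = sum(xs) / len(xs)
    ybar = sum(ys) / len(ys)
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
            sum((x - xbar) ** 2 for x in xs)
    return 1.0 + slope   # slope = gamma - 1

# Synthetic electron data obeying T = T0 * (n / n0)**(gamma - 1) with gamma = 1.175
gamma_true = 1.175
n0, T0 = 10.0, 1.5e5   # hypothetical reference density (cm^-3) and temperature (K)
densities = [0.5, 1.0, 2.0, 5.0, 10.0, 20.0]
temperatures = [T0 * (n / n0) ** (gamma_true - 1.0) for n in densities]

print(fit_polytrope_index(densities, temperatures))   # ≈ 1.175
```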

  1. Understanding Skill in EVA Mass Handling. Volume 1; Theoretical and Operational Foundations

    NASA Technical Reports Server (NTRS)

    Riccio, Gary; McDonald, Vernon; Peters, Brian; Layne, Charles; Bloomberg, Jacob

    1997-01-01

    This report describes the theoretical and operational foundations for our analysis of skill in extravehicular mass handling. A review of our research on postural control, human-environment interactions, and exploratory behavior in skill acquisition is used to motivate our analysis. This scientific material is presented within the context of operationally valid issues concerning extravehicular mass handling. We describe the development of meaningful empirical measures that are relevant to a special class of nested control systems: manual interactions between an individual and the substantial environment. These measures are incorporated into a unique empirical protocol implemented on NASA's principal mass handling simulator, the precision air-bearing floor, in order to evaluate skill in extravehicular mass handling. We discuss the components of such skill with reference to the relationship between postural configuration and controllability of an orbital replacement unit, the relationship between orbital replacement unit control and postural stability, the relationship between antecedent and consequent movements of an orbital replacement unit, and the relationship between antecedent and consequent postural movements. Finally, we describe our expectations regarding the operational relevance of the empirical results as it pertains to extravehicular activity tools, training, monitoring, and planning.

  2. Theoretical evaluation of the reaction rates for ²⁶Al(n,p)²⁶Mg and ²⁶Al(n,α)²³Na

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oginni, B. M.; Iliadis, C.; Champagne, A. E.

    2011-02-15

    The reactions that destroy ²⁶Al in massive stars have significance in a number of astrophysical contexts. We evaluate the reaction rates of ²⁶Al(n,p)²⁶Mg and ²⁶Al(n,α)²³Na using cross sections obtained from the codes EMPIRE and TALYS. These have been compared to the published rates obtained from the NON-SMOKER code and to some experimental data. We show that the results obtained from EMPIRE and TALYS are comparable to those from NON-SMOKER. We also show how the theoretical results vary with respect to changes in the input parameters. Finally, we present recommended rates for these reactions using the available experimental data and our new theoretical results.

  3. A Compound Fault Diagnosis for Rolling Bearings Method Based on Blind Source Separation and Ensemble Empirical Mode Decomposition

    PubMed Central

    Wang, Huaqing; Li, Ruitong; Tang, Gang; Yuan, Hongfang; Zhao, Qingliang; Cao, Xi

    2014-01-01

    A compound fault signal usually contains multiple characteristic signals and strong confounding noise, which makes it difficult to separate weak fault signals by conventional methods such as FFT-based envelope detection, wavelet transform, or empirical mode decomposition alone. To improve compound fault diagnosis of rolling bearings via signal separation, the present paper proposes a new method to identify compound faults from measured mixed signals, based on the ensemble empirical mode decomposition (EEMD) method and the independent component analysis (ICA) technique. In this approach, a vibration signal is first decomposed into intrinsic mode functions (IMFs) by the EEMD method to obtain multichannel signals. Then, according to a cross-correlation criterion, the corresponding IMFs are selected as the input matrix for ICA. Finally, the compound faults can be separated effectively by executing the ICA method, which makes the fault features more easily extracted and more clearly identified. Experimental results validate the effectiveness of the proposed method in separating compound faults, which works not only for the outer-race defect, but also for the roller defect and the unbalance fault of the experimental system. PMID:25289644
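The EEMD decomposition itself needs a dedicated library, but the cross-correlation criterion for choosing which IMFs feed into ICA can be sketched in plain Python. The tones below are hypothetical stand-ins for EEMD output, and the threshold value is an assumption for illustration:

```python
import math

def norm_xcorr(a, b):
    """Normalized zero-lag cross-correlation of two equal-length signals."""
    ma = sum(a) / len(a)
    mb = sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def select_imfs(imfs, signal, threshold=0.3):
    """Keep the IMFs whose correlation with the raw signal exceeds the
    threshold; these would form the input matrix for ICA."""
    return [imf for imf in imfs if abs(norm_xcorr(imf, signal)) >= threshold]

# Toy example: a "signal" made of two tones, with candidate "IMFs" being the
# two genuine tones plus one unrelated component
t = [i / 200.0 for i in range(400)]
tone1 = [math.sin(2 * math.pi * 5 * x) for x in t]
tone2 = [0.5 * math.sin(2 * math.pi * 20 * x) for x in t]
unrelated = [math.sin(2 * math.pi * 91 * x + 0.7) for x in t]
signal = [a + b for a, b in zip(tone1, tone2)]

selected = select_imfs([tone1, tone2, unrelated], signal)
print(len(selected))   # → 2: only the two genuine components pass
```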

  4. Theoretical molecular studies of astrophysical interest

    NASA Technical Reports Server (NTRS)

    Flynn, George

    1991-01-01

    When work under this grant began in 1974 there was a great need for state-to-state collisional excitation rates for interstellar molecules observed by radio astronomers. These were required to interpret observed line intensities in terms of local temperatures and densities, but, owing to lack of experimental or theoretical values, estimates then being used for this purpose ranged over several orders of magnitude. A problem of particular interest was collisional excitation of formaldehyde; Townes and Cheung had suggested that the relative size of different state-to-state rates (propensity rules) was responsible for the anomalous absorption observed for this species. We believed that numerical molecular scattering techniques (in particular the close coupling or coupled channel method) could be used to obtain accurate results, and that these would be computationally feasible since only a few molecular rotational levels are populated at the low temperatures thought to prevail in the observed regions. Such calculations also require detailed knowledge of the intermolecular forces, but we thought that those could also be obtained with sufficient accuracy by theoretical (quantum chemical) techniques. Others, notably Roy Gordon at Harvard, had made progress in solving the molecular scattering equations, generally using semi-empirical intermolecular potentials. Work done under this grant generalized Gordon's scattering code, and introduced the use of theoretical interaction potentials obtained by solving the molecular Schroedinger equation. Earlier work had considered only the excitation of a diatomic molecule by collisions with an atom, and we extended the formalism to include excitation of more general molecular rotors (e.g., H2CO, NH2, and H2O) and also collisions of two rotors (e.g., H2-H2).

  5. Theoretical Loss and Gambling Intensity (Revisited): A Response to Braverman et al. (2013).

    PubMed

    Auer, Michael; Griffiths, Mark D

    2015-09-01

    In this paper, we provide a brief response to the critique by Braverman et al. (J Gambl Stud, doi: 10.1007/s10899-013-9428-z, 2013b) of our 'Theoretical Loss' metric as a measure of monetary gambling intensity (Auer and Griffiths in J Gambl Stud, doi: 10.1007/s10899-013-9376-7, 2013a; Auer et al. in Gaming Law Rev Econ 16:269-273, 2012). We argue that 'gambling intensity' and 'gambling involvement' are essentially the same construct as descriptors of monetary gambling activity. Additionally, we acknowledge that playing duration (i.e., the amount of time, as opposed to money, actually spent gambling) is clearly another important indicator of gambling involvement, something that we have consistently noted in our previous studies, including our empirical studies on gambling using behavioural tracking data. Braverman and colleagues claim that the concept of Theoretical Loss is nullified when statistical analysis focuses solely on one game type, because the house edge is then constant across all games; in fact, they state, the correlation between total amount wagered and Theoretical Loss is then perfect. This is incorrect. To disprove the claim, we demonstrate that in sports betting (i.e., a single game type), the amount wagered does not reflect monetary gambling involvement, using actual payout percentage data (based on 52,500 independent bets provided to us by an online European bookmaker). After reviewing the arguments presented by Braverman and colleagues, we remain of the view that when it comes to purely monetary measures of 'gambling intensity', the Theoretical Loss metric is a more robust and accurate measure of what players are prepared to financially risk while gambling than other financial proxy measures such as 'amount wagered' (i.e., bet size).
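The metric at issue is simple arithmetic: the theoretical loss of a bet is the stake multiplied by the house edge, summed over all bets. A minimal sketch (all stakes and edges below are hypothetical) of why total amount wagered need not reflect monetary involvement when the effective edge varies from bet to bet, as it does in sports betting:

```python
def theoretical_loss(bets):
    """bets: list of (stake, house_edge) pairs.
    Theoretical loss = sum of stake_i * edge_i."""
    return sum(stake * edge for stake, edge in bets)

# Hypothetical bettors with identical total amounts wagered (300)
bettor_a = [(100, 0.02), (100, 0.02), (100, 0.02)]   # uniformly low-edge bets
bettor_b = [(100, 0.10), (100, 0.05), (100, 0.02)]   # mixed-edge bets

total_a = sum(stake for stake, _ in bettor_a)
total_b = sum(stake for stake, _ in bettor_b)
print(total_a == total_b)                         # → True: same amount wagered
print(round(theoretical_loss(bettor_a), 2))       # → 6.0
print(round(theoretical_loss(bettor_b), 2))       # → 17.0: nearly triple the exposure
```

With identical wagers, bettor B is prepared to lose almost three times as much, which is the sense in which amount wagered fails as a proxy for monetary gambling intensity.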

  6. EMPIRICAL DETERMINATION OF EINSTEIN A-COEFFICIENT RATIOS OF BRIGHT [Fe II] LINES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giannini, T.; Antoniucci, S.; Nisini, B.

    The Einstein spontaneous rates (A-coefficients) of Fe⁺ lines have been computed by several authors, with results that differ from each other by up to 40%. Consequently, models for line emissivities suffer from uncertainties that in turn affect the determination of the physical conditions at the base of line excitation. We provide an empirical determination of the A-coefficient ratios of bright [Fe II] lines, which would represent both a valid benchmark for theoretical computations and a reference for the physical interpretation of the observed lines. With the ESO Very Large Telescope X-shooter instrument, between 3000 Å and 24700 Å, we obtained a spectrum of the bright Herbig-Haro object HH 1. We detect around 100 [Fe II] lines, some with signal-to-noise ratios ≥100. Among these latter lines, we selected those emitted by the same level, whose dereddened intensity ratios are direct functions of the Einstein A-coefficient ratios. From the same X-shooter spectrum, we obtained an accurate estimate of the extinction toward HH 1 through intensity ratios of atomic species, H I recombination lines, and H₂ ro-vibrational transitions. We provide seven reliable A-coefficient ratios between bright [Fe II] lines, which are compared with the literature determinations. In particular, the A-coefficient ratios involving the brightest near-infrared lines (λ12570/λ16440 and λ13209/λ16440) are in better agreement with the predictions of the relativistic Hartree-Fock model of Quinet et al. However, none of the theoretical models predict A-coefficient ratios in agreement with all of our determinations. We also show that literature data for near-infrared intensity ratios agree better with our determinations than with theoretical expectations.

  7. Towards a Conceptual Framework of GBL Design for Engagement and Learning of Curriculum-Based Content

    ERIC Educational Resources Information Center

    Jabbar, Azita Iliya Abdul; Felicia, Patrick

    2016-01-01

    This paper aims to show best practices of GBL design for engagement. It intends to show how teachers can implement GBL in a collaborative, comprehensive and systematic way, in the classrooms, and probably outside the classrooms, based on empirical evidence and theoretical framework designed accordingly. This paper presents the components needed to…

  8. Endogenously determined cycles: empirical evidence from livestock industries.

    PubMed

    McCullough, Michael P; Huffaker, Ray; Marsh, Thomas L

    2012-04-01

    This paper applies the techniques of phase space reconstruction and recurrence quantification analysis to investigate U.S. livestock cycles in relation to recent literature on the business cycle. Results are presented for pork and cattle cycles, providing empirical evidence that the cycles themselves have slowly diminished. By comparing the evolution of production processes for the two livestock cycles we argue that the major cause for this moderation is largely endogenous. The analysis suggests that previous theoretical models relying solely on exogenous shocks to create cyclical patterns do not fully capture changes in system dynamics. Specifically, the biological constraint in livestock dynamics has become less significant while technology and information are relatively more significant. Concurrently, vertical integration of the supply chain may have improved inventory management, all resulting in a small, less deterministic, cyclical effect.
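Phase-space reconstruction, the first technique named in this abstract, is typically done via Takens time-delay embedding: a scalar series is mapped into state vectors of lagged copies of itself. A minimal sketch with hypothetical illustrative numbers (not the paper's livestock data):

```python
def delay_embed(series, dim, tau):
    """Takens time-delay embedding: map a scalar series into
    dim-dimensional state vectors (x[t], x[t+tau], ..., x[t+(dim-1)*tau])."""
    n = len(series) - (dim - 1) * tau
    return [tuple(series[t + k * tau] for k in range(dim)) for t in range(n)]

# Hypothetical quarterly production series (illustrative numbers only)
series = [3.0, 3.4, 3.9, 3.5, 3.1, 3.3, 3.8, 3.6, 3.2]
states = delay_embed(series, dim=3, tau=2)
print(states[0])   # → (3.0, 3.9, 3.1)
```

Recurrence quantification analysis then works on distances between these reconstructed state vectors rather than on the raw series.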

  9. Understanding Transactional Distance in Web-Based Learning Environments: An Empirical Study

    ERIC Educational Resources Information Center

    Huang, Xiaoxia; Chandra, Aruna; DePaolo, Concetta A.; Simmons, Lakisha L.

    2016-01-01

    Transactional distance is an important pedagogical theory in distance education that calls for more empirical support. The purpose of this study was to verify the theory by operationalizing and examining the relationship of (1) dialogue, structure and learner autonomy to transactional distance, and (2) environmental factors and learner demographic…

  10. Social Capital in Schools: A Conceptual and Empirical Analysis of the Equity of Its Distribution and Relation to Academic Achievement

    ERIC Educational Resources Information Center

    Salloum, Serena J.; Goddard, Roger D.; Larsen, Ross

    2017-01-01

    Background: Schools face pressure to promote equitable student outcomes as the achievement gap continues to persist. The authors examine different ways in which social capital has been conceptualized as well as prior theory and research on its formation and consequences. While some theoretical and empirical work conceptualizes social capital as a…

  11. Toward a research-based assessment of dyslexia: using cognitive measures to identify reading disabilities.

    PubMed

    Bell, Sherry Mee; McCallum, R Steve; Cox, Elizabeth A

    2003-01-01

    One hundred five participants from a random sample of elementary and middle school children completed measures of reading achievement and cognitive abilities presumed, based on a synthesis of current dyslexia research, to underlie reading. Factor analyses of these cognitive variables (including auditory processing, phonological awareness, short-term auditory memory, visual memory, rapid automatized naming, and visual processing speed) produced three empirically and theoretically derived factors (auditory processing, visual processing/speed, and memory), each of which contributed to the prediction of reading and spelling skills. Factor scores from the three factors combined predicted 85% of the variance associated with letter/sight word naming, 70% of the variance associated with reading comprehension, 73% for spelling, and 61% for phonetic decoding. The auditory processing factor was the strongest predictor, accounting for 27% to 43% of the variance across the different achievement areas. The results provide practitioner and researcher with theoretical and empirical support for the inclusion of measures of the three factors, in addition to specific measures of reading achievement, in a standardized assessment of dyslexia. Guidelines for a thorough, research-based assessment are provided.

  12. Development of theory-based health messages: three-phase programme of formative research

    PubMed Central

    Epton, Tracy; Norman, Paul; Harris, Peter; Webb, Thomas; Snowsill, F. Alexandra; Sheeran, Paschal

    2015-01-01

    Online health behaviour interventions have great potential but their effectiveness may be hindered by a lack of formative and theoretical work. This paper describes the process of formative research to develop theoretically and empirically based health messages that are culturally relevant and can be used in an online intervention to promote healthy lifestyle behaviours among new university students. Drawing on the Theory of Planned Behaviour, a three-phase programme of formative research was conducted with prospective and current undergraduate students to identify (i) modal salient beliefs (the most commonly held beliefs) about fruit and vegetable intake, physical activity, binge drinking and smoking, (ii) which beliefs predicted intentions/behaviour and (iii) reasons underlying each of the beliefs that could be targeted in health messages. Phase 1, conducted with 96 pre-university college students, elicited 56 beliefs about the behaviours. Phase 2, conducted with 3026 incoming university students, identified 32 of these beliefs that predicted intentions/behaviour. Phase 3, conducted with 627 current university students, elicited 102 reasons underlying the 32 beliefs to be used to construct health messages to bolster or challenge these beliefs. The three-phase programme of formative research provides researchers with an example of how to develop health messages with a strong theoretical and empirical base for use in health behaviour change interventions. PMID:24504361

  13. Accuracy Analysis of a Box-wing Theoretical SRP Model

    NASA Astrophysics Data System (ADS)

    Wang, Xiaoya; Hu, Xiaogong; Zhao, Qunhe; Guo, Rui

    2016-07-01

    For the BeiDou navigation satellite system (BDS), a high-accuracy solar radiation pressure (SRP) model is necessary for precise applications, especially as the global BDS constellation is established, and the accuracy of the BDS broadcast ephemeris needs to be improved. We therefore established a box-wing theoretical SRP model with fine structural detail, including a conical shadow factor for the Earth and Moon. We verified this SRP model with the GPS Block IIF satellites, using data from the PRN 1, 24, 25, and 27 satellites. The results show that the physical SRP model yields higher accuracy for precise orbit determination (POD) and prediction of GPS IIF satellites than the Bern empirical model: the 3D RMS of the orbit is about 20 cm. The POD accuracy of the two models is similar, but the prediction accuracy with the physical SRP model is more than doubled. We tested 1-day, 3-day, and 7-day orbit predictions; the longer the prediction arc, the more significant the improvement. The orbit prediction accuracies with the physical SRP model for 1-day, 3-day, and 7-day arcs are 0.4 m, 2.0 m, and 10.0 m respectively, versus 0.9 m, 5.5 m, and 30 m with the Bern empirical model. We then applied this approach to BDS, deriving an SRP model for the BeiDou satellites, and tested the model with one month of BeiDou data. Initial results show that the model is good but needs more data for verification and improvement. The orbit residual RMS is similar to that obtained with our empirical force model, which estimates forces only in the along-track and cross-track directions plus a y-bias, but the orbit overlap and SLR observation evaluations show some improvement, and the remaining empirical force is reduced significantly for the present BeiDou constellation.
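A box-wing model sums the SRP acceleration of flat plates (bus faces and solar wings). The sketch below uses a standard textbook flat-plate formula, not necessarily the exact formulation of this paper, and the panel geometry and optical coefficients are hypothetical:

```python
# Flat-plate SRP acceleration, a common building block of box-wing models
SOLAR_FLUX = 1361.0       # W/m^2 at 1 AU
C_LIGHT = 299792458.0     # m/s

def plate_srp(area, mass, normal, sun_dir, alpha, rho, delta):
    """Acceleration (m/s^2, 3-vector) on a flat plate.
    alpha, rho, delta: absorbed, specularly and diffusely reflected
    fractions (alpha + rho + delta = 1).  normal and sun_dir are unit
    3-vectors; sun_dir points from the plate toward the Sun."""
    cos_t = sum(n * s for n, s in zip(normal, sun_dir))
    if cos_t <= 0.0:                 # plate not illuminated
        return (0.0, 0.0, 0.0)
    p = SOLAR_FLUX / C_LIGHT * area / mass * cos_t
    cs = alpha + delta                            # component along -sun_dir
    cn = 2.0 * rho * cos_t + 2.0 * delta / 3.0    # component along -normal
    return tuple(-p * (cs * s + cn * n) for s, n in zip(sun_dir, normal))

# Hypothetical panel: 10 m^2 on a 1000 kg spacecraft, Sun and normal along +x
a = plate_srp(10.0, 1000.0, (1, 0, 0), (1, 0, 0), alpha=0.2, rho=0.5, delta=0.3)
print(a)   # small acceleration of order 1e-7 m/s^2, directed along -x
```

A full box-wing model would sum this term over every face, weight it by the Earth/Moon shadow factor, and fit the optical coefficients to tracking data.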

  14. Security Analysis of Selected AMI Failure Scenarios Using Agent Based Game Theoretic Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abercrombie, Robert K; Schlicher, Bob G; Sheldon, Frederick T

    Information security analysis can be performed using game theory implemented in dynamic Agent Based Game Theoretic (ABGT) simulations. Such simulations can be verified against the results of game-theoretic analysis and then used to explore larger-scale, real-world scenarios involving multiple attackers, defenders, and information assets. We concentrated our analysis on the Advanced Metering Infrastructure (AMI) functional domain, for which the National Electric Sector Cybersecurity Organization Resource (NESCOR) working group has currently documented 29 failure scenarios. The strategy for the game was developed by analyzing five representative electric-sector failure scenarios contained in the AMI functional domain. From these five selected scenarios, we group the scenarios into three specific threat categories affecting confidentiality, integrity, and availability (CIA). The analysis using our ABGT simulation demonstrates how to model the AMI functional domain using a set of rationalized game-theoretic rules decomposed from the failure scenarios, in terms of how those scenarios might impact the AMI network with respect to CIA.
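The attacker-defender reasoning underlying an ABGT simulation can be illustrated by a minimal matrix game over the three CIA threat categories. The payoff values and defense labels below are hypothetical, and a pure-strategy minimax stands in for the richer dynamics of the actual simulation:

```python
# Minimal attacker/defender matrix game over the three AMI threat
# categories.  PAYOFF[i][j] is the attacker's (hypothetical) gain when
# attack i meets defense j; the defender picks the column whose
# worst-case gain is smallest (pure-strategy minimax).
ATTACKS = ["confidentiality", "integrity", "availability"]
DEFENSES = ["encrypt", "sign", "redundancy"]

PAYOFF = [
    [1, 4, 2],   # confidentiality attack
    [3, 2, 4],   # integrity attack
    [2, 3, 3],   # availability attack
]

def defender_minimax(payoff):
    """Return (column index, value) of the defense minimizing the
    attacker's worst-case gain."""
    worst = [max(row[j] for row in payoff) for j in range(len(payoff[0]))]
    j = min(range(len(worst)), key=worst.__getitem__)
    return j, worst[j]

j, loss = defender_minimax(PAYOFF)
print(DEFENSES[j], loss)   # → encrypt 3
```

A full ABGT run would replace this one-shot matrix with agents repeatedly choosing moves under the rationalized rules decomposed from the failure scenarios.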

  15. A rights-based proposal for managing faith-based values and expectations of migrants at end-of-life illustrated by an empirical study involving South Asians in the UK.

    PubMed

    Samanta, Jo; Samanta, Ash; Madhloom, Omar

    2018-06-08

    International migration is an important issue for many high-income countries and is accompanied by opportunities as well as challenges. South Asians are the largest minority ethnic group in the United Kingdom, and this diaspora is reflective of the growing diversity of British society. An empirical study was performed to ascertain the faith-based values, beliefs, views and attitudes of participants in relation to their perception of issues pertaining to end-of-life care. Empirical observations from this study, as well as the extant knowledge-base from the literature, are used to support and contextualise our reflections against a socio-legal backdrop. We argue for accommodation of faith-based values of migrants at end-of-life within normative structures of receiving countries. We posit the ethically relevant principles of inclusiveness, integration and embedment, for an innovative bioethical framework as a vehicle for accommodating faith-based values and needs of migrants at end-of-life. These tenets work conjunctively, as well as individually, in respect of individual care, enabling processes and procedures, and ultimately for formulating policy and strategy. © 2018 John Wiley & Sons Ltd.

  16. Complex dynamics and empirical evidence (Invited Paper)

    NASA Astrophysics Data System (ADS)

    Delli Gatti, Domenico; Gaffeo, Edoardo; Giulioni, Gianfranco; Gallegati, Mauro; Kirman, Alan; Palestrini, Antonio; Russo, Alberto

    2005-05-01

    Standard macroeconomics, based on a reductionist approach centered on the representative agent, is poorly equipped to explain the empirical evidence, where heterogeneity and industrial dynamics are the rule. In this paper we show that a simple agent-based model of heterogeneous, financially fragile agents is able to replicate a large number of scaling-type stylized facts with a remarkable degree of statistical precision.

  17. Base course resilient modulus for the mechanistic-empirical pavement design guide : [summary].

    DOT National Transportation Integrated Search

    2011-01-01

    Elastic modulus determination is often used in designing pavements and evaluating pavement performance. The Mechanistic-Empirical Pavement Design Guide (MEPDG) has become an important source of guidance for pavement design and rehabilitation. MEPDG r...

  18. The evolution of genomic imprinting: theories, predictions and empirical tests

    PubMed Central

    Patten, M M; Ross, L; Curley, J P; Queller, D C; Bonduriansky, R; Wolf, J B

    2014-01-01

    The epigenetic phenomenon of genomic imprinting has motivated the development of numerous theories for its evolutionary origins and genomic distribution. In this review, we examine the three theories that have best withstood theoretical and empirical scrutiny. These are: Haig and colleagues' kinship theory; Day and Bonduriansky's sexual antagonism theory; and Wolf and Hager's maternal–offspring coadaptation theory. These theories have fundamentally different perspectives on the adaptive significance of imprinting. The kinship theory views imprinting as a mechanism to change gene dosage, with imprinting evolving because of the differential effect that gene dosage has on the fitness of matrilineal and patrilineal relatives. The sexual antagonism and maternal–offspring coadaptation theories view genomic imprinting as a mechanism to modify the resemblance of an individual to its two parents, with imprinting evolving to increase the probability of expressing the fitter of the two alleles at a locus. In an effort to stimulate further empirical work on the topic, we carefully detail the logic and assumptions of all three theories, clarify the specific predictions of each and suggest tests to discriminate between these alternative theories for why particular genes are imprinted. PMID:24755983

  19. Gain-clamped semiconductor optical amplifiers based on compensating light: Theoretical model and performance analysis

    NASA Astrophysics Data System (ADS)

    Jia, Xin-Hong; Wu, Zheng-Mao; Xia, Guang-Qiong

    2006-12-01

    It is well known that the gain-clamped semiconductor optical amplifier (GC-SOA) based on the lasing effect is subject to transmission-rate restrictions because of relaxation oscillation. A GC-SOA based on the compensating effect between the signal light and amplified spontaneous emission, realized by combining an SOA with a fiber Bragg grating (FBG), can be used to overcome this problem. In this paper, a theoretical model of the GC-SOA based on compensating light is constructed. Numerical simulations demonstrate that good gain and noise-figure characteristics can be realized by appropriate selection of the FBG insertion position, the FBG peak reflectivity, and the GC-SOA bias current.

  20. Contingency theoretic methodology for agent-based web-oriented manufacturing systems

    NASA Astrophysics Data System (ADS)

    Durrett, John R.; Burnell, Lisa J.; Priest, John W.

    2000-12-01

    The development of distributed, agent-based, web-oriented, N-tier Information Systems (IS) must be supported by a design methodology capable of responding to the convergence of shifts in business process design, organizational structure, and computing and telecommunications infrastructures. We introduce a contingency-theoretic model for the use of open, ubiquitous software infrastructure in the design of flexible organizational IS. Our basic premise is that developers should change the way they view the software design process: from solving a problem to dynamically creating teams of software components. We postulate that developing effective, efficient, flexible, component-based distributed software requires reconceptualizing the current development model. The basic concepts of distributed software design are merged with the environment-causes-structure relationship from contingency theory, the task-uncertainty of organizational information-processing relationships from information processing theory, and the concept of inter-process dependencies from coordination theory. Software processes are considered as employees, groups of processes as software teams, and distributed systems as software organizations. Design techniques already used in the design of flexible business processes and well researched in the organizational sciences are presented. Guidelines that can be used in the creation of component-based distributed software are discussed.